A More Efficient Contextuality Distillation Protocol
Authors: Hui-xian Meng, Huai-xin Cao, Wen-hua Wang, Ya-jing Fan, Liang Chen
Institutions: 1. School of Mathematics and Information Science, Shaanxi Normal University, Xi'an, China; 2. School of Ethnic Education, Shaanxi Normal University, Xi'an, China
Abstract: Based on the fact that both nonlocality and contextuality are resource theories, it is natural to ask how to amplify them more efficiently. In this paper, we present a contextuality distillation protocol that produces an n-cycle box B ⊗ B′ from two given n-cycle boxes B and B′. It works efficiently for a class of contextual n-cycle (n ≥ 4) boxes which we term "generalized correlated contextual n-cycle boxes". For any two generalized correlated contextual n-cycle boxes B and B′, the distilled box B ⊗ B′ is more contextual than both B and B′. Moreover, such boxes can be distilled toward the maximally contextual box CH_n as the number of iterations goes to infinity. Among the known protocols, ours has the strongest approximation ability and is optimal in terms of its distillation rate. Notably, our protocol can witness a larger set of nonlocal boxes that make communication complexity trivial than the protocol of Brunner and Skrzypczyk (Phys. Rev. Lett. 102, 160403, 2009); this might be helpful for exploring why quantum nonlocality is limited.
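
For orientation, the following is a minimal sketch, assuming the standard n-cycle contextuality scenario: a box B is described by its n nearest-neighbour correlators E_i = <A_i A_{i+1}>, and its contextuality is quantified by the n-cycle expression Omega(B), the maximum of sum_i gamma_i E_i over sign patterns gamma_i in {+1, -1} containing an odd number of -1, with the non-contextual bound n - 2 and the maximal value n attained by the maximally contextual box. The distill stub and the sample correlator values are hypothetical placeholders; the explicit map of the paper's protocol is not given in the abstract.

# Python sketch: representing n-cycle boxes by correlators and computing Omega(B).
from itertools import product
from typing import List

def omega(correlators: List[float]) -> float:
    """n-cycle contextuality value Omega(B) of a box given E_0, ..., E_{n-1}."""
    n = len(correlators)
    best = float("-inf")
    for signs in product((1, -1), repeat=n):
        if signs.count(-1) % 2 == 1:  # only sign patterns with an odd number of -1
            best = max(best, sum(s * e for s, e in zip(signs, correlators)))
    return best

def distill(box1: List[float], box2: List[float]) -> List[float]:
    """Placeholder for the paper's distillation map (B, B') -> B ⊗ B'; not specified in the abstract."""
    raise NotImplementedError("the explicit wiring is defined in the paper, not reproduced here")

if __name__ == "__main__":
    n = 4                                   # the smallest n-cycle scenario (CHSH-like)
    noisy_box = [0.9, 0.9, 0.9, -0.9]       # a contextual but non-maximal 4-cycle box (illustrative values)
    maximal_box = [1.0, 1.0, 1.0, -1.0]     # correlators of a maximally contextual 4-cycle box
    print("Omega(noisy)   =", omega(noisy_box),   "| non-contextual bound =", n - 2)
    print("Omega(maximal) =", omega(maximal_box), "| maximal value        =", n)

Under these assumptions, a distillation protocol iterates the composition on copies of a contextual box and pushes Omega toward the maximal value n, which is the sense in which the boxes above would be "distilled toward the maximally contextual box".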
This article is indexed in SpringerLink and other databases.