Abstract: In the first half of this talk I will discuss the quantification of entanglement in Gaussian systems and how this relates to channel simulation and error correction. We conclude that entanglement of formation is a more faithful measure of entanglement in Gaussian systems than negativity, both qualitatively and quantitatively, and illustrate this with examples. In the second half of this talk I will discuss the macroscopicity problem in quantum mechanics. If quantum theory is universal, we would expect to be able to observe quantum effects such as superpositions on a macroscopic scale. The current inability to observe such effects is commonly attributed to decoherence, leading to a so-called "macro-scale" beyond which physical systems can be analysed without any reference to the quantum formalism. We challenge this view by showing that, with the assistance of a second system, a macroscopic system can be proved to be entangled even after arbitrary decoherence. We show this by introducing a modified Wigner's friend gedankenexperiment where the observer is not assumed to preserve quantum coherence.
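Negativity for Gaussian states is computed directly from the covariance matrix. As a concrete illustration (a minimal sketch, not material from the talk), the following NumPy snippet evaluates the logarithmic negativity of a two-mode squeezed vacuum from the symplectic eigenvalues of the partially transposed covariance matrix:

```python
import numpy as np

def log_negativity_tmsv(r):
    """Logarithmic negativity of a two-mode squeezed vacuum with squeezing r.

    Covariance matrix in (x1, p1, x2, p2) ordering, vacuum variance = 1.
    """
    c, s = np.cosh(2 * r), np.sinh(2 * r)
    sigma = np.array([
        [c, 0, s, 0],
        [0, c, 0, -s],
        [s, 0, c, 0],
        [0, -s, 0, c],
    ])
    # Partial transposition on mode 2 flips the sign of p2.
    P = np.diag([1.0, 1.0, 1.0, -1.0])
    sigma_pt = P @ sigma @ P
    # Symplectic form for two modes: block-diagonal [[0,1],[-1,0]].
    omega1 = np.array([[0.0, 1.0], [-1.0, 0.0]])
    Omega = np.kron(np.eye(2), omega1)
    # Symplectic eigenvalues are the moduli of the eigenvalues of i*Omega*sigma_pt.
    nus = np.abs(np.linalg.eigvals(1j * Omega @ sigma_pt).real)
    nu_min = np.min(nus)
    return max(0.0, -np.log2(nu_min))
```

For squeezing r the smallest partial-transpose symplectic eigenvalue is e^(-2r), giving E_N = 2r/ln 2, which grows without bound as r increases.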
CQT Talk by Guillermo Romero, The University of Santiago, Chile
Title: Nucleation of superfluid-light domains in a quenched dynamics Date/Time: 17-Oct, 11:00AM Venue: CQT Level 3 Seminar Room, S15-03-15
Abstract: Strong correlation effects emerge from light-matter interactions in coupled resonator arrays, such as the Mott-insulator to superfluid phase transition of atom-photon excitations. We demonstrate that the quenched dynamics of a finite-sized complex array of coupled resonators induces a first-order-like phase transition. The latter is accompanied by domain nucleation that can be used to manipulate the photonic transport properties of the simulated superfluid phase; this in turn leads to an empirical scaling law. This universal behavior emerges from the light-matter interaction and the topology of the array. The validity of our results over a wide range of complex architectures might lead to a promising device for use in scaled quantum simulations.
CQT Industry Talk by Ben Miles and Andy Collins, QTEC, UK
Title: Developing Leaders in Quantum Technologies: The QTEC Fellowship Program Date/Time: 02-Oct, 04:00PM Venue: CQT Level 3 Seminar Room, S15-03-15
The Quantum Technology Enterprise Centre (QTEC) is a world-first incubator for quantum-based technology innovators. Its vision is to educate and create the quantum entrepreneurs of the future, who will form the foundation and drive the growth of the UK's and the global quantum industry. Its mission is to develop the thought leaders and entrepreneurs who will take quantum technologies out of the lab and into the real world.
QTEC is a collaboration between the University of Bristol and The Bettany Centre for Entrepreneurship at Cranfield University. We are part of the £270M UK National Quantum Technologies Programme, which has a mission to support the translation of revolutionary quantum technology from the lab into the commercial market. Our aim is to produce the pioneers and businesses that will make the UK a global leader in the sectors where the emergence of quantum engineering will have a transformative impact.
CQT Colloquium by Mikhail Baranov, Institute for Quantum Optics and Quantum Information, Austrian Academy of Sciences
Abstract: The generation of subwavelength optical barriers on the scale of tens of nanometers, as conservative optical potentials for cold atoms, is discussed both theoretically and experimentally. In the proposed scheme they originate from nonadiabatic corrections to Born-Oppenheimer potentials for position-dependent "dark states" in atomic Λ configurations. The subwavelength optical barriers represent a "Kronig-Penney" potential, and I discuss the corresponding band structure, including the effects of spontaneous emission and atom loss due to "bright" channels. Inclusion of an interparticle dipole-dipole interaction leads to the formation of "domain wall molecules" and to unconventional Hubbard models with spatially modulated interparticle interactions.
As a brief discussion of potential applications, the subwavelength barrier can be used as a "splitter" to create a double-wire (or double-layer) with subwavelength spacing and, therefore, with substantially increased couplings as compared to ordinary optical lattices. As another application, specially designed subwavelength atomic internal-state spatial structures can be used for building an atomic scanning microscope with subwavelength resolution.
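The "Kronig-Penney" band structure mentioned above can be illustrated with the textbook delta-comb model (a simplified stand-in, not the talk's specific dark-state potential): in units where ħ²/2m = a = 1, allowed energies E satisfy |cos q + P sin(q)/q| ≤ 1 with q = √E, where P sets the barrier strength. A short sketch that extracts the allowed bands:

```python
import numpy as np

def kronig_penney_bands(P=3.0, n_e=20000, e_max=80.0):
    """Allowed energy bands of the delta-comb Kronig-Penney model.

    Uses the standard dispersion relation
        cos(k a) = cos(q a) + P sin(q a) / (q a),   q = sqrt(E),
    in units hbar^2/(2m) = a = 1. A Bloch momentum k exists only where
    the right-hand side lies in [-1, 1]; those energies form the bands.
    Returns a list of (E_min, E_max) intervals.
    """
    E = np.linspace(1e-6, e_max, n_e)
    q = np.sqrt(E)
    rhs = np.cos(q) + P * np.sin(q) / q
    allowed = np.abs(rhs) <= 1.0
    bands, start = [], None
    for i, ok in enumerate(allowed):
        if ok and start is None:
            start = E[i]          # a band opens
        elif not ok and start is not None:
            bands.append((start, E[i - 1]))  # a band closes
            start = None
    if start is not None:
        bands.append((start, E[-1]))
    return bands
```

For P > 0 the lowest band starts at a strictly positive energy and successive bands are separated by gaps, the hallmark of the periodic-barrier spectrum.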
Abstract: The k-Even Set problem is a parameterized variant of the Minimum Distance Problem for binary linear codes, which can be stated as follows: given a generator matrix A and an integer k, determine whether the code generated by A has distance at most k. Here, k is the parameter of the problem. The question of whether k-Even Set is fixed-parameter tractable (FPT) has been repeatedly raised in the literature and has earned its place in Downey and Fellows' book (2013) as one of the "most infamous" open problems in the field of Parameterized Complexity. In this talk, I will present our work showing that k-Even Set does not admit FPT algorithms under the (randomized) Gap Exponential Time Hypothesis (Gap-ETH). In fact, our result rules out not only exact FPT algorithms, but also any constant-factor FPT approximation algorithms for the problem. Furthermore, our result holds even under the following weaker assumption, which is also known as the Parameterized Inapproximability Hypothesis (PIH): no (randomized) FPT algorithm can distinguish a satisfiable 2CSP instance from one which is only 0.99-satisfiable (where the parameter is the number of variables). In a subsequent work, Bonnet, Egri, Lin and Marx showed inapproximability of the parameterized Nearest Codeword problem, which together with our result, implies unconditional W-hardness of Even Set. If there is time, I will also sketch this result.
Joint work with Karthik C.S., Suprovat Ghoshal and Pasin Manurangsi.
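To make the decision problem concrete (an illustration only, not from the talk), the naive procedure enumerates all 2^m messages of an m-row generator matrix and checks whether any nonzero codeword has Hamming weight at most k; the FPT question asks whether the exponential cost can be confined to the parameter k alone:

```python
from itertools import product

import numpy as np

def has_codeword_of_weight_at_most(A, k):
    """Brute-force decision version of k-Even Set / Minimum Distance.

    A is a binary generator matrix whose rows generate the code. Returns
    True iff some nonzero codeword has Hamming weight <= k. Runs in time
    exponential in the number of rows m, not in the parameter k.
    """
    A = np.asarray(A) % 2
    m = A.shape[0]
    for msg in product([0, 1], repeat=m):
        if not any(msg):
            continue  # skip the all-zero message (zero codeword)
        codeword = (np.array(msg) @ A) % 2
        if codeword.sum() <= k:
            return True
    return False
```

For example, the length-3 repetition code (generator [[1,1,1]]) has minimum distance 3, so the answer is False for k = 2 and True for k = 3; the [7,4] Hamming code likewise has minimum distance 3.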
CQT Talk by Le Phuc Thinh, CQT
Title: Quantum Machine Learning Journal Club Talk Date/Time: 28-Sep, 03:00PM Venue: CQT Level 3 Seminar Room, S15-03-15
Abstract: Suppose we have many copies of an unknown n-qubit state ρ. We measure some copies of ρ using a known two-outcome measurement E_1, then other copies using a measurement E_2, and so on. At each stage t, we generate a current hypothesis σ_t about the state ρ, using the outcomes of the previous measurements. We show that it is possible to do this in a way that guarantees that |Tr(E_i σ_t) − Tr(E_i ρ)|, the error in our prediction for the next measurement, is at least ε at most O(n/ε²) times. Even in the "non-realizable" setting---where there could be arbitrary noise in the measurement outcomes---we show how to output hypothesis states that do significantly worse than the best possible states at most O(√(Tn)) times on the first T measurements. These results generalize a 2007 theorem by Aaronson on the PAC-learnability of quantum states, to the online and regret-minimization settings. We give three different ways to prove our results---using convex optimization, quantum postselection, and sequential fat-shattering dimension---which have different advantages in terms of parameters and portability.
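As a toy illustration of the online setting (a hypothetical sketch, not the paper's actual algorithm or its regret analysis), the learner below maintains a density matrix via a matrix-exponentiated-gradient step on the squared loss, updating only when its prediction misses the true expectation by more than ε, and counts how often that happens:

```python
import numpy as np

def expm_herm(H):
    """Matrix exponential of a Hermitian matrix via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return (V * np.exp(w)) @ V.conj().T

def online_learn_state(rho, measurements, eps=0.1, eta=0.5):
    """Count epsilon-mistakes of a simple exponentiated-gradient learner.

    At each step: predict Tr(E sigma_t), observe the true value Tr(E rho),
    and if the prediction is off by more than eps, record a mistake and
    fold the gradient of (Tr(E sigma) - truth)^2 into the accumulator G.
    The hypothesis sigma_t = exp(-eta G) / Tr exp(-eta G) stays a valid
    density matrix (positive, unit trace) by construction.
    """
    d = rho.shape[0]
    G = np.zeros((d, d), dtype=complex)  # accumulated loss gradients
    mistakes = 0
    for E in measurements:
        sigma = expm_herm(-eta * G)
        sigma /= np.trace(sigma).real
        pred = np.trace(E @ sigma).real
        truth = np.trace(E @ rho).real
        if abs(pred - truth) > eps:
            mistakes += 1
            G += 2 * (pred - truth) * E  # gradient of the squared loss
    return mistakes
```

Starting from the maximally mixed state, the learner predicts 0.5 for every rank-1 projector, so if ρ is itself maximally mixed it never errs; for a fixed informative measurement, the mistakes stop after a handful of updates.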