

Monday, 7 December 2015, from 4.30pm.

This event is open to the public. It is free, but registration is required via the CQT Annual Symposium 2015 registration page.

University Hall Auditorium
Level 2, Lee Kong Chian Wing
University Hall, NUS
The location of the auditorium can be found on Google Maps.



"Certified Quantum Randomness"
Serge Massar
Université Libre de Bruxelles, Belgium

Randomness is a physical phenomenon we are confronted with all the time. Will it rain today? At what time? Will the train be on time? But are such phenomena truly random? Good randomness is essential for many applications. Cryptography, the art of hiding information from malicious users, is only as good as the source of randomness that underlies it. Quantum mechanics, the theory of microscopic phenomena, can only predict the probability of events: for instance, quantum theory can only predict the probability that a radioactive nucleus will decay, not whether it will decay. Does this mean that microscopic phenomena are truly random? By studying systems of two entangled particles, it can be shown, both theoretically and experimentally, that events at the microscopic scale are truly random, truly unpredictable. Beyond their philosophical implications, these results also have important potential applications. They imply that one can build random number generators that certify that they work correctly: if the random number generator malfunctions in some way, if the numbers it produces cease to be random, this will automatically be detected. By extending this idea, one could also build quantum cryptographic systems and quantum computers that certify their own correct operation. We discuss the prospects for practical implementations.
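The certification described in the abstract rests on a Bell test: if the measured correlations between the two entangled particles exceed what any classical (local) model allows, the outcomes cannot have been predetermined and are therefore genuinely random. The following is a minimal numerical sketch of this idea, assuming the ideal singlet-state correlation E(a, b) = -cos(a - b) and the standard CHSH combination; the angles and function names are illustrative, not from the talk itself.

```python
import math

def correlation(a: float, b: float) -> float:
    """Ideal quantum correlation for spin measurements at angles a and b
    on a two-particle singlet state: E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

def chsh_value(a: float, a2: float, b: float, b2: float) -> float:
    """CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b').
    Every local (classical) model satisfies |S| <= 2; quantum mechanics
    can reach 2*sqrt(2). Observing |S| > 2 is what certifies that the
    measurement outcomes are genuinely unpredictable."""
    return (correlation(a, b) + correlation(a, b2)
            + correlation(a2, b) - correlation(a2, b2))

# Measurement angles that maximise the quantum violation for the singlet
S = chsh_value(0.0, math.pi / 2, math.pi / 4, -math.pi / 4)
print(abs(S))  # ~2.828, i.e. 2*sqrt(2), above the classical bound of 2
```

In a real certified random number generator the correlations are estimated from the observed measurement statistics rather than computed from theory, so any malfunction that destroys the entanglement pulls |S| back below 2 and is detected automatically.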

5.30pm: break


"Computing beyond the Age of Moore's Law"
Stanley Williams
Hewlett Packard Labs, USA

With the end of Moore's Law within sight (really!), what opportunities exist to continue the exponential scaling of computer performance and efficiency into the future? The primary consumers of energy and time in computers today are not the processors, but rather the communication and storage systems in a data center. Even if it were possible to perform computation infinitely fast with zero energy consumption, there would be little change in the power and time required to perform a calculation on the types of 'big data' sets that are beginning to dominate information technology. I will describe a path forward with two stages. The first is to change the fundamental structure of a computer from the processor-centric von Neumann paradigm that has dominated for the past seven decades to a memory-driven architecture based on a flat nonvolatile memory space, high-bandwidth photonic interconnects, and dispersed system-on-chip processors. This is the vision behind The Machine, a major research and development effort currently in progress at Hewlett Packard Enterprise. Once this transformation has been completed, a new era of hybrid computation can begin, which is the motivation behind a new Nanotechnology-Inspired Grand Challenge for Future Computing recently announced by the US White House. One of the drivers for this initiative is the thesis that although our present understanding of the brain is limited, we know enough now to design and build circuits that can accelerate certain computational tasks; and as we learn more about how brains communicate and process information, we will be able to harness that understanding to create a new exponential growth path for computing technology.

7.00pm: End of Symposium
