Quantum Computing System Lecture Series
Please sign up here to get email notifications about upcoming lectures!
Research Session
1. Research Session Speaker: Prineha Narang
Assistant Professor@Harvard University
Title: Building Blocks of Scalable Quantum Information Science
Abstract: Quantum information technologies are expected to enable transformative applications with wide-ranging global impact. Towards realizing this tremendous promise, efforts have emerged to pursue quantum architectures capable of supporting distributed quantum computing, networks, and quantum sensors. Quantum architecture at scale would consist of interconnected physical systems, many operating at their individual classical or quantum limit. Such scalable quantum architecture requires modeling that accurately describes these mesoscopic hybrid phenomena. By creating predictive theoretical and computational approaches to study dynamics, decoherence, and correlations in quantum matter, our work could enable such hybrid quantum technologies [1,2]. Capturing these dynamics poses unique theoretical and computational challenges. The simultaneous contribution of processes that occur on many time and length scales has remained elusive for state-of-the-art calculations and model Hamiltonian approaches alike, necessitating the development of new methods in computational physics [3–5]. I will show selected examples of our approach to the ab initio design of active defects in quantum materials [6–8], and to the control of collective phenomena to link these active defects [9,10]. Building on this, in the second part of my seminar, I will present promising physical mechanisms and device architectures for coupling (transduction) to other qubit platforms via dipole-, phonon-, and magnon-mediated interactions [9–12]. In a molecular context, I will discuss approaches to entangling molecules in the strong coupling regime. Being able to control molecules at a quantum level gives us access to degrees of freedom, such as vibrational or rotational degrees, in addition to the internal state structure. Entangling those degrees of freedom offers unique opportunities in quantum information processing, especially in the construction of quantum memories. In particular, we look at two identical molecules spatially separated by a variable distance within a photonic environment such as a high-Q optical cavity. By resonantly coupling the effective cavity mode to a specific vibrational frequency of both molecules, we theoretically investigate how strong light-matter coupling can be used to control the entanglement between the vibrational quantum states of both molecules. Linking this with the detection of entanglement and quantifying it with an appropriate entanglement measure, we use quantum tomographic techniques to reconstruct the density matrix of the underlying quantum state. Taking this further, I will present some of our recent work on capturing non-Markovian dynamics in open quantum systems (OQSs), built on the ensemble of Lindblad's trajectories approach [13–16]. Finally, I will present ideas for directly emulating quantum systems, particularly addressing the issues of model abstraction and scalability, and connect with the various quantum algorithm efforts underway.
Bio: Prineha Narang is an Assistant Professor at the John A. Paulson School of Engineering and Applied Sciences at Harvard University. Prior to joining the faculty, Prineha came to Harvard as a Ziff Environmental Fellow at the Harvard University Center for the Environment. She was also a Research Scholar in Condensed Matter Theory at the MIT Dept. of Physics, working on new theoretical methods to describe quantum interactions. Prineha's work has been recognized with numerous honors, including the Mildred Dresselhaus Prize, a Friedrich Wilhelm Bessel Research Award (Bessel Prize) from the Alexander von Humboldt Foundation, a Max Planck Sabbatical Award from the Max Planck Society, the IUPAP Young Scientist Prize in Computational Physics in 2021, a National Science Foundation CAREER Award in 2020, being named a Moore Inventor Fellow by the Gordon and Betty Moore Foundation for innovations in quantum science and technology, a CIFAR Azrieli Global Scholar by the Canadian Institute for Advanced Research, a Top Innovator by MIT Tech Review (MIT TR35), and a Young Scientist by the World Economic Forum in 2018. In 2017, Forbes Magazine named her to its "30 Under 30" list for her work in atom-by-atom quantum engineering.
Time: Sep 15th (Rescheduled to Oct 6th), Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
2. Research Session Speaker: Jakub Szefer
Associate Professor@Yale University
Title: Quantum Computer Hardware Cybersecurity
Abstract: As Quantum Computer device research continues to advance rapidly, there are also advances at the other levels of the computer system stack that involve these devices. In particular, more and more Quantum Computer devices are becoming available as cloud-based services through IBM Quantum, Amazon Braket, Microsoft Azure, and others. In parallel, researchers have put forward ideas about multi-programming of Quantum Computer devices, where a single device can be shared by multiple programs, or even multiple users. While all of these advances make Quantum Computer devices more easily accessible and increase utilization, they also open up the devices to various security threats. In particular, with cloud-based access and multi-tenancy, different, remote, and untrusted users could abuse Quantum Computer devices to leak information from other users of the shared devices, or to map and learn about the Quantum Computer infrastructure itself. Malicious users could also try to reverse engineer Quantum Computer architectures to learn about the design of the hardware devices. On the other hand, users are not immune today from malicious or compromised cloud operators who may want to spy on users' circuits executing on the Quantum Computers hosted within the operator's data centers. Considering these different security threats, and lessons learned from the security of classical computers, this talk will introduce the new research field of Quantum Computer Hardware Security and present recent research results on attacks and defenses for Quantum Computers. The goal of the presentation is to motivate discussion about Quantum Computer Hardware Cybersecurity and make connections between the Quantum Computer research community and the Hardware Security research community, to help develop secure Quantum Computer architectures and protect the devices before they are widely deployed.
Bio: Jakub Szefer's research focuses on computer architecture and hardware security. His research encompasses secure processor architectures, cloud security, FPGA attacks and defenses, hardware FPGA implementation of cryptographic algorithms, and most recently quantum computer cybersecurity. His research is supported through National Science Foundation and industry grants and donations. He is currently an Associate Professor of Electrical Engineering at Yale University, where he leads the Computer Architecture and Security Laboratory (CASLAB). Prior to joining Yale, he received Ph.D. and M.A. degrees in Electrical Engineering from Princeton University, and a B.S. degree with highest honors in Electrical and Computer Engineering from the University of Illinois at Urbana-Champaign. He received the NSF CAREER award in 2017. Jakub is the author of the first book focusing on processor architecture security, "Principles of Secure Processor Architecture Design", published in 2018. He was promoted to the IEEE Senior Member rank in 2019 and is a recipient of the 2021 Ackerman Award for Teaching and Mentoring. Details of Jakub's research and projects can be found at: https://caslab.csl.yale.edu/~jakub
Time: Sep 22, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
3. Research Session Speaker: Qiang Guan
Assistant Professor@Kent State University
Title: Enabling robust quantum computer system by understanding errors from NISQ machines
Abstract: The growing need for quantum computers in many domains, such as machine learning, numerical scientific simulation, and finance, has pushed quantum computers to produce more stable and less error-prone results. However, mitigating the impact of the noise inside each quantum device remains a pressing challenge. In this project, we utilize the system calibration data collected from existing IBMQ machines and apply fidelity degradation detection to generate a fidelity degradation matrix. Based on this matrix, we define multiple new evaluation metrics to compare fidelity across the qubit topologies of different quantum machines and among qubits on the same topology, to search for the most error-robust machine so that users can expect the most accurate results, and to study correlations between qubits that may further motivate quantum compiler design for qubit mapping. In addition, we build a visualization system, VACSEN, to illustrate the errors and reliability of quantum computing backends.
Bio: Dr. Qiang Guan is an assistant professor in the Department of Computer Science at Kent State University, Kent, Ohio. Dr. Guan is the director of the Green Ubiquitous Autonomous Networking System lab (GUANS). He is also a member of the Brain Health Research Institute (BHRI) at Kent State University. He was a computer scientist at Los Alamos National Laboratory before joining KSU. His current research interests include: fault tolerance design for HPC applications; HPC-Cloud hybrid systems; virtual reality; quantum computing systems and applications.
Time: Sep 29, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
4. Research Session Speaker: Bochen Tan
PhD student@UCLA
Title: Compilation for Near-Term Quantum Computing: Gap Analysis and Optimal Solution
Abstract: The most challenging stage in compilation for near-term quantum computing is qubit mapping, also called layout synthesis, where qubits in quantum programs are mapped to physical qubits. In order to understand the quality of existing solutions, we apply the measure-improve methodology, which has been successful in classical circuit placement, to this problem. We construct quantum mapping examples with known optimal solutions, QUEKO, to measure the optimality gaps of leading heuristic compilers. Having revealed large gaps, we set out to close them with optimal layout synthesis for quantum computing, OLSQ, a more efficient formulation of the qubit mapping problem as mathematical programming. We accelerate OLSQ with a transition mode and expand its solution space with domain-specific knowledge on applications like the quantum approximate optimization algorithm, QAOA.
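As a rough, hands-on illustration of what a mapping overhead looks like (using Qiskit's heuristic transpiler as a stand-in for the compilers measured in the talk, not the QUEKO/OLSQ methodology itself), the hypothetical sketch below maps a small star-connectivity circuit onto a line of physical qubits and reports the depth growth and inserted SWAPs; the circuit and coupling map are arbitrary choices.

```python
# Hypothetical sketch: quantify mapping overhead (inserted SWAPs, depth growth) with Qiskit.
# This illustrates what a layout-synthesis gap measures; it is not the QUEKO/OLSQ method.
from qiskit import QuantumCircuit, transpile
from qiskit.transpiler import CouplingMap

# A GHZ-style circuit whose CX gates all touch qubit 0; a line topology forces SWAPs.
qc = QuantumCircuit(5)
qc.h(0)
for i in range(1, 5):
    qc.cx(0, i)

line = CouplingMap.from_line(5)  # physical qubits connected 0-1-2-3-4
mapped = transpile(qc, coupling_map=line, optimization_level=1, seed_transpiler=1)

print("logical depth :", qc.depth())
print("mapped depth  :", mapped.depth())
print("inserted swaps:", mapped.count_ops().get("swap", 0))
```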
Bio: Bochen Tan received the B.S. degree in electrical engineering from Peking University in 2019, and the M.S. degree in computer science from University of California, Los Angeles in 2022. He is currently a graduate student researcher at UCLA focusing on design automation for quantum computing.
Time: Oct 13, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
5. Research Session Speaker: Zeyuan Zhou
PhD student@JHU
Title: Quantum Crosstalk Robust Quantum Control
Abstract: The prevalence of quantum crosstalk in current quantum devices poses challenges to achieving high-fidelity quantum logic operations and reliable quantum processing. Through quantum control theory, we develop an analytical condition for achieving crosstalk-robust single-qubit control of multi-qubit systems. We examine the effects of quantum crosstalk via a cumulant expansion approach and develop a condition to suppress the leading-order contributions to the dynamics. The efficacy of the condition is illustrated in the domains of quantum state preservation and noise characterization through the development of crosstalk-robust dynamical decoupling (DD) and quantum noise spectroscopy (QNS) protocols. Using IBM Quantum Experience superconducting qubits, crosstalk-robust state preservation is demonstrated on 27 qubits, where a 3× improvement in coherence decay is observed for single-qubit product and multipartite entangled states. Through the use of noise injection, we experimentally demonstrate the first known parallel crosstalk-robust dephasing QNS on a seven-qubit processor, where a 10^4 improvement in reconstruction accuracy over "cross-susceptible" alternatives is found. Together, these experiments highlight the significant impact the crosstalk mitigation condition can have on improving multi-qubit characterization and control on current quantum devices. In this talk, I will go through the theoretical framework we leveraged, which enables the co-suppression of quantum crosstalk and system-environment noise. In the second part, I will discuss a wide range of applications on near-term devices, from physical-layer control and characterization to robust algorithm design and logical encoding.
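For readers unfamiliar with dynamical decoupling, the hypothetical Qiskit sketch below pads an idle qubit with a simple X–X echo while a neighboring qubit is busy; it illustrates the basic idle-time sequences that the crosstalk-robust protocols in this talk generalize, not the crosstalk-suppression condition itself, and all durations are arbitrary placeholders.

```python
# Hypothetical illustration: pad an idle qubit with a simple X-X dynamical decoupling
# sequence while its neighbor is driven. Durations are placeholders; this is the basic
# DD idea the talk builds on, not the crosstalk-robust condition it derives.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(1)                      # "work" happening on the neighboring qubit
qc.delay(160, 1, unit="dt")

# Free evolution on qubit 0 split by two X pulses (echoes out quasi-static dephasing).
qc.delay(80, 0, unit="dt")
qc.x(0)
qc.delay(160, 0, unit="dt")
qc.x(0)
qc.delay(80, 0, unit="dt")

qc.measure([0, 1], [0, 1])
print(qc.draw(output="text"))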
Bio: Zeyuan (Victor) Zhou is a graduate student and research assistant in Dr. Gregory Quiroz's group at Johns Hopkins University. He is the recipient of the Dean's fellowship at the G.W.C. Whiting School of Engineering. His primary research interests include theoretical quantum control, quantum error mitigation, and robust quantum algorithms. Victor has been working on devising general control criteria to suppress the quantum crosstalk noise prevailing in current quantum technologies. The technique applies broadly across the layers of the quantum software stack and enables robust and scalable quantum information processing. He received his B.S. in Physics and B.S. in Applied Mathematics and Statistics, also from Johns Hopkins University.
Time: Oct 20, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
6. Research Session Speaker: Wei Tang
PhD student@Princeton
Title: Distributed Quantum Computing
Abstract: Quantum processing units (QPUs) have to satisfy highly demanding quantity and quality requirements on their qubits to produce accurate results for problems at useful scales. Furthermore, classical simulations of quantum circuits generally do not scale. Instead, quantum circuit cutting techniques cut and distribute a large quantum circuit into multiple smaller subcircuits feasible for less powerful QPUs. However, the classical post-processing incurred by the cutting introduces runtime and memory bottlenecks. We present TensorQC, which addresses these bottlenecks via novel algorithmic techniques including (1) a State Merging framework that locates the solution states of large quantum circuits using a linear number of recursions; (2) an automatic solver that finds high-quality cuts for complex quantum circuits 2× larger than prior works; and (3) a tensor network based post-processing that minimizes the classical overhead by orders of magnitude over prior parallelization techniques. Our experiments reduce the quantum area requirement by at least 60% over purely quantum platforms. We also demonstrate benchmarks of up to 200 qubits on a single GPU, far beyond the reach of strictly classical platforms.
Bio: Wei Tang is a fourth-year Computer Science Ph.D. student at Princeton University in Professor Margaret Martonosi's group. His research interests include, but are not limited to, quantum computing architecture and the intersection of machine learning and quantum computing. Previously, he worked with Professor Jungsang Kim at Duke University on ion trapping experiments, and with James B. Duke Professor Alfred Goshaw at Duke University in the field of high energy physics.
Time: Oct 27, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
7. Research Session Speaker: Prof. Tirthak Patel
Incoming Assistant Professor at Rice University
Title: Developing Robust System Software Support for Quantum Computers
Abstract: The field of quantum computing has seen extraordinary advances in the last decade, including the design and engineering of quantum computers with more than a hundred qubits. While these engineering advances have been celebrated widely, computational scientists continue to struggle to make meaningful use of existing quantum computers. This is primarily because quantum computers suffer from prohibitively high noise levels, which lead to erroneous program outputs and limit the practical usability of quantum computers. Researchers and practitioners are actively devising theoretical and quantum hardware-based error mitigation techniques for quantum computers; while these efforts are useful, they do not help us realize the full potential of quantum computers. In this talk, I will discuss a unique opportunity space for improving the performance and fidelity of quantum programs from a system software perspective. In particular, I will demonstrate how to carefully design novel system software solutions that can further the reach of hardware-only solutions and improve the usability of quantum computers.
Bio: Tirthak Patel is an incoming Assistant Professor at the Rice University Department of Computer Science as part of the Rice Quantum Initiative. He is currently a Computer Engineering Ph.D. candidate at Northeastern University conducting systems-level research at the intersection of quantum computing and high-performance computing (HPC). His research explores the trade-offs among factors affecting reliability, performance, and efficiency, in recognition of which he has received the ACM-IEEE CS George Michael Memorial HPC Fellowship and the NSERC Alexander Graham Bell Canada Graduate Scholarship (CGS D-3).
Time: Nov 03, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
8. Research Session Speaker: Prof. Gushu Li
Incoming Assistant Professor at the University of Pennsylvania
Title: Enabling Deeper Quantum Compiler Optimization at High Level
Abstract: A quantum compiler is an essential and critical component in a quantum computing system, deploying and optimizing quantum programs onto the underlying physical quantum hardware platforms. Yet, today's quantum compilers are still far from optimal. One reason is that most optimizations in today's quantum compilers are local program transformations over very few qubits and gates. In general, it is highly non-trivial for a compiler that runs on a classical computer to automatically derive large-scale program optimizations at the gate level. In this talk, we will discuss how we can systematically enhance quantum compilers by introducing high-level program optimizations in the quantum software/compiler infrastructure. Instead of optimizing quantum programs at the gate level, we design new quantum programming language primitives and intermediate representations that can maintain high-level properties of the programs. These high-level properties can then be leveraged to derive new large-scale quantum compiler optimizations beyond the capabilities of gate-level optimizations. In particular, we will introduce optimizing quantum simulation programs over a Pauli string based intermediate representation, mapping surface code onto superconducting architectures, and quantum program testing/error mitigation through projection-based quantum assertions. We believe that the high-level optimization approach can also be applicable to other quantum application domains and algorithmic properties.
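To give a concrete, if simplified, picture of what a Pauli-string level representation looks like, the sketch below uses Qiskit's SparsePauliOp to describe a small transverse-field Ising Hamiltonian and PauliEvolutionGate to lower it to gates; the Hamiltonian and Trotter settings are arbitrary examples, and the high-level optimizations discussed in the talk operate on representations of this kind rather than on the resulting gate list.

```python
# Hedged sketch: a Pauli-string level description of a quantum simulation program,
# lowered to gates with Qiskit. The Hamiltonian below is an arbitrary illustrative choice.
from qiskit import QuantumCircuit
from qiskit.quantum_info import SparsePauliOp
from qiskit.circuit.library import PauliEvolutionGate
from qiskit.synthesis import LieTrotter

# H = ZZ(0,1) + ZZ(1,2) + 0.5 * X on each qubit (a 3-qubit transverse-field Ising model)
hamiltonian = SparsePauliOp.from_list([
    ("ZZI", 1.0), ("IZZ", 1.0),
    ("XII", 0.5), ("IXI", 0.5), ("IIX", 0.5),
])

# The Pauli-string "IR": a list of (string, coefficient) terms, evolved for time t.
evo = PauliEvolutionGate(hamiltonian, time=0.3, synthesis=LieTrotter(reps=2))

qc = QuantumCircuit(3)
qc.append(evo, range(3))
print(qc.decompose().count_ops())   # gate-level circuit produced from the Pauli-string description
```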
Bio: Gushu Li is an incoming Assistant Professor at the Department of Computer and Information Science, University of Pennsylvania. He is currently a Ph.D. candidate at the University of California, Santa Barbara, advised by Prof. Yuan Xie and Prof. Yufei Ding. His research centers on emerging quantum computer systems and spans quantum programming languages, quantum compilers, and quantum computer architecture. His research has been recognized by the ACM SIGPLAN Distinguished Paper Award at OOPSLA 2020 and an NSF Quantum Information Science and Engineering Network Fellow Grant Award. His research outputs have been adopted by several industry/academia quantum software frameworks, including IBM's Qiskit, Amazon's Braket, Quantinuum's t|ket>, and Oak Ridge National Lab's qcor.
Time: Nov 10, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
9. Research Session Speaker: Prof. Nai-Hui Chia
Assistant Professor at Rice University
Title: Classical Verification of Quantum Depth
Abstract: Verifying whether a remote server has sufficient quantum resources to demonstrate quantum advantage is a fascinating question in complexity theory as well as a practical challenge. One approach is asking the server to solve some classically intractable problem, such as factoring. Another approach is proof-of-quantumness protocols. These protocols enable a classical client to check whether a remote server can complete some classically intractable task and thus can be used to distinguish quantum from classical computers. However, these two approaches mainly focus on distinguishing quantum computers from classical ones. They do not directly translate into protocols that separate quantum computers with different quantum resources. In this talk, we want to go one step further by showing protocols that can distinguish machines with different quantum depths. We call such protocols Classical Verification of Quantum Depth (CVQD). Roughly speaking, if a server has quantum circuit depth at most d, the classical client will reject it; otherwise, the classical client will accept it. Note that a malicious server, in general, can use classical computers to cheat. Thus, CVQD protocols must be able to distinguish hybrid quantum-classical computers with different quantum depths. We will see two CVQD protocols: the first protocol can separate hybrid quantum-classical computers with quantum depth d and d+c (for some fixed constant c) assuming the quantum hardness of LWE, and the second protocol is a two-prover protocol that achieves a sharper separation (d versus d+3).
Bio: Nai-Hui Chia is an Assistant Professor in the Department of Computer Science at Rice University. Before that, he was an Assistant Professor in the Luddy School of Informatics, Computing, and Engineering at Indiana University Bloomington from 2021 to 2022, a Hartree Postdoctoral Fellow in the Joint Center for Quantum Information and Computer Science (QuICS) at the University of Maryland from 2020 to 2021, supervised by Dr. Andrew Childs, and a Postdoctoral Fellow at UT Austin from 2018 to 2020, working under the supervision of Dr. Scott Aaronson. He received his Ph.D. in Computer Science and Engineering at Penn State University, where he was fortunate to have Dr. Sean Hallgren as his advisor.
Time: Nov 17, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
10. Research Session Speaker: Prof. Mohsen Heidari
Assistant Professor at Indiana University, Bloomington
Title: Learning and Training in Quantum Environments
Abstract: Quantum computing presents fascinating new opportunities for various applications, including machine learning, simulation, and optimization. Quantum computers (QCs) are expected to push beyond the limits established by the classical laws of physics and surpass the capabilities of classical supercomputers. They leverage quantum-mechanical principles such as superposition and entanglement for computation, information processing, and pattern recognition. Superposition allows a system to exist in multiple states (until measurement). Entanglement facilitates non-local statistical correlations that classical models cannot produce since they violate Bell Inequalities. With such unique features, not only is quantum advantage on the horizon, but so is a far greater capability to learn patterns from inherently quantum data by directly operating on quantum states of physical systems (e.g., photons or states of matter). Utilizing quantum data provides the ability to better comprehend, predict, and control quantum processes, and opens doors to a wide range of applications in drug discovery, communications, security, and even human cognition. The first part of this talk covers an introduction to foundational concepts in quantum computing. The second part focuses on learning using near-term quantum computers for classical and quantum data. Mainly, I discuss the training of quantum neural networks (QNNs) using quantum-classical hybrid loops. I present some of the unique challenges in quantum learning due to effects such as the no-cloning principle, measurement incompatibility, and the stochasticity of quantum measurements. Then, I introduce a few solutions to address such challenges, particularly one-shot gradient-based training of QNNs suitable for near-term quantum computers with minimal qubit processing power. Lastly, I discuss applications of QNNs in the classification of quantum states, e.g., entanglement versus separability of qubits.
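To make the quantum-classical hybrid loop concrete, here is a minimal, hypothetical parameter-shift gradient for a one-parameter "QNN" (a single RY rotation measured in Z), computed with exact Qiskit statevectors; real QNN training replaces these exact expectation values with finite-shot estimates, which is exactly where the one-shot and stochasticity issues discussed in the talk arise.

```python
# Minimal sketch (assumed circuit and observable): parameter-shift gradient of <Z>
# for a single-qubit "QNN" consisting of one RY(theta) rotation.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, SparsePauliOp

def expectation(theta: float) -> float:
    qc = QuantumCircuit(1)
    qc.ry(theta, 0)
    state = Statevector.from_instruction(qc)
    return float(np.real(state.expectation_value(SparsePauliOp("Z"))))

def parameter_shift_grad(theta: float) -> float:
    # Exact for Pauli-generated gates: d<Z>/dtheta = [f(theta + pi/2) - f(theta - pi/2)] / 2
    return 0.5 * (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2))

theta = 0.7
print("analytic  d<Z>/dtheta:", -np.sin(theta))    # <Z> = cos(theta) for this circuit
print("param-shift gradient :", parameter_shift_grad(theta))
```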
Bio: Mohsen Heidari is an Assistant Professor in the Department of Computer Science at the Luddy School of Informatics, Computing, and Engineering at Indiana University, Bloomington. He is a member of the NSF Center for Science of Information and Indiana University Quantum Science and Engineering Center (QSEc). He obtained his Ph.D. in Electrical Engineering in 2019 and his M.Sc. in Mathematics in 2017, both from the University of Michigan, Ann Arbor. Mohsen’s research interests lie in theoretical machine learning, quantum computing and algorithms, and classical and quantum information theory.
Time: Dec 01, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
11. Research Session Speaker: Prof. Jun Qi
Assistant Professor at Fudan University
Title: Quantum Machine Learning: Theoretical Foundations and Applications on NISQ Devices
Abstract: Quantum machine learning (QML) is a trailblazing research subject that integrates quantum computing and machine learning. With recent advances in quantum computing, we have entered the NISQ era, which provides as many as a few hundred qubits for QML applications, particularly those based on variational quantum circuits (VQC). This talk first reviews our pioneering research on VQC-based QML approaches in reinforcement learning, speech recognition, and natural language processing. Then, we characterize the theoretical foundations of VQC and improve the representation and generalization powers of VQC by proposing an end-to-end TTN-VQC model. Moreover, we further characterize the hybrid quantum-classical neural network in the context of meta-learning.
Bio: Dr. Jun Qi is an Assistant Professor in the Department of Electronic Engineering of the School of Information Science and Engineering at Fudan University. He received his Ph.D. from the School of Electrical and Computer Engineering at the Georgia Institute of Technology, Atlanta, GA, in 2022, advised by Prof. Chin-Hui Lee and Prof. Xiaoli Ma. Previously, he obtained two Master's degrees in Electrical Engineering, from the University of Washington, Seattle, and Tsinghua University, Beijing, in 2013 and 2017, respectively. He was also a research intern at the Deep Learning Technology Center at Microsoft Research, Redmond, WA; Tencent AI Lab, WA; and MERL, MA, USA. Dr. Qi was the recipient of the 1st prize in the Xanadu AI Quantum Machine Learning Competition 2019, and his ICASSP paper on quantum speech recognition was nominated as a best paper candidate in 2022. He has also given two tutorials on Quantum Neural Networks for Speech and Language Processing at IJCAI 2021 and ICASSP 2022.
Time: Dec 08, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
12. Research Session Speaker: Yasuo Oda
PhD student at Johns Hopkins University
Title: Noise Modeling of the IBM Quantum Experience
Abstract: The influence of noise in quantum dynamics is one of the main factors preventing Noisy Intermediate-Scale Quantum (NISQ) devices from performing useful quantum computations. Errors must be suppressed in order to achieve the desired levels of accuracy, which requires a thorough understanding of the nature and interplay of the different sources of noise. In this work, we propose an effective error model of single-qubit operations on the IBM Quantum Experience that possesses considerable predictive power and takes into account spatio-temporally correlated noise. Additionally, we showcase how Quantum Noise Spectroscopy (QNS) can be used alongside other error characterization techniques, such as T1 experiments, to obtain a more complete error model of the system. We focus on a Hamiltonian description of the noise, with parameters obtained from a small set of characterization experiments. We show that simulations using this error model are capable of recovering the characterization experiments' results to a high degree of accuracy. We also successfully compare the simulations against test data consisting of experimental results for varying circuit lengths and types of implemented operations.
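As a toy version of this Hamiltonian noise picture, the numpy sketch below (with made-up parameters) averages single-qubit free evolution over random quasi-static frequency offsets and recovers the expected Gaussian Ramsey decay; the model in the talk is far richer, with spatio-temporally correlated noise and QNS-informed spectra, but the Monte Carlo averaging principle is the same.

```python
# Toy Monte Carlo noise model (assumed parameters): quasi-static detuning delta ~ N(0, sigma)
# dephases a superposition; averaging over realizations gives <X>(t) = exp(-(sigma*t)^2 / 2).
import numpy as np

rng = np.random.default_rng(0)
sigma = 2 * np.pi * 0.05           # std. dev. of the detuning (rad / time unit), arbitrary
times = np.linspace(0.0, 20.0, 50)
n_traj = 5000

deltas = rng.normal(0.0, sigma, size=n_traj)
# For |+> evolving under H = (delta/2) Z, <X>(t) = cos(delta * t); average over trajectories.
avg_x = np.cos(np.outer(deltas, times)).mean(axis=0)

analytic = np.exp(-0.5 * (sigma * times) ** 2)
print("max deviation from analytic Gaussian decay:", float(np.max(np.abs(avg_x - analytic))))
```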
Bio: Yasuo Oda received his undergraduate and master’s degrees from Balseiro Institute, Argentina, in 2016 and 2017. Yasuo is currently a PhD candidate at Johns Hopkins University, under the supervision of Dr. Gregory Quiroz, where he focuses on understanding and mitigating correlated noise in quantum devices. His research interests include Quantum Control, Variational Quantum Algorithms, and Quantum Error Mitigation.
Time: Dec 15, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
13. Research Session Speaker: Yuxiang Peng
PhD student at University of Maryland, College Park
Title: Software Tools for Analog Quantum Computing
Abstract: Recent experimental results suggest that continuous-time analog quantum simulation would be advantageous over gate-based digital quantum simulation in the Noisy Intermediate-Scale Quantum (NISQ) machine era. However, programming such analog quantum simulators is much more challenging due to the lack of a unified interface between hardware and software, and the few known examples are all hardware-specific. In this talk, we will introduce two tools designed for better programming and control of analog quantum simulators. First, we design and implement SimuQ, the first domain-specific language for Hamiltonian simulation that supports pulse-level compilation to heterogeneous analog quantum simulators. Specifically, in SimuQ, front-end users specify the target Hamiltonian evolution with the Hamiltonian Modeling Language, and the programmability of analog simulators is specified by hardware providers through a new abstraction called the abstract analog instruction set. Through a solver-based compilation, SimuQ generates the pulse-level instruction schedule on the target analog simulator for the desired Hamiltonian evolution, which has been demonstrated on pulse-controlled superconducting (Qiskit Pulse) and neutral-atom (QuEra Bloqade) quantum systems, as well as on normal circuit-based digital quantum machines. Second, we formulate the first differentiable analog quantum computing framework with a specific parameterization design at the analog signal (pulse) level to better exploit near-term quantum devices via variational methods. We further propose a scalable approach to estimate the gradients of quantum dynamics using a forward pass with Monte Carlo sampling, which leads to a quantum stochastic gradient descent algorithm for scalable gradient-based training in our framework. Our method significantly boosts the performance of quantum optimization and quantum control and can be integrated into SimuQ seamlessly.
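For readers new to the pulse-level view of Hamiltonian evolution, the self-contained sketch below (numpy/scipy, not the SimuQ API) integrates a single qubit under a piecewise-constant drive by multiplying short-time matrix exponentials; the drive shape and duration are arbitrary placeholders chosen so the pulse area is approximately pi.

```python
# Hedged sketch (not the SimuQ interface): piecewise-constant "pulse-level" evolution of one
# qubit under H(t) = (Delta/2) Z + (Omega(t)/2) X, integrated with short matrix exponentials.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

delta = 0.0                                   # on-resonance drive (arbitrary choice)
dt = 0.01
omega = np.full(314, 1.0)                     # flat-top pulse; total area ~ pi -> approximate X gate

psi = np.array([1.0, 0.0], dtype=complex)     # start in |0>
for amp in omega:
    H = 0.5 * delta * Z + 0.5 * amp * X
    psi = expm(-1j * H * dt) @ psi

print("population in |1> after the pulse:", abs(psi[1]) ** 2)   # close to 1 for a pi pulse
```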
Bio: Yuxiang Peng is a PhD student at the Department of Computer Science, University of Maryland, College Park. He is also affiliated with the Joint Center for Quantum Information and Computer Science. He is currently advised by Prof. Xiaodi Wu. His research interests are broadly in programming languages, quantum computing, and physics. He received his bachelor's degrees from the Institute for Interdisciplinary Information Science and the Department of Mathematical Science, Tsinghua University, and his master's degree from the Department of Computer Science, University of Maryland, College Park.
Time: Dec 22, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
14. Research Session Speaker: Runzhou Tao
PhD student at Columbia University
Title: Automatic Formal Verification of the Qiskit Compiler
Abstract: Quantum compilers are essential in the quantum software stack but are error-prone. Eliminating bugs in quantum compilers is crucial for the success of near-term quantum computation. In this talk, we present Giallar, a fully-automated verification toolkit for quantum compilers that formally proves a compiler is bug-free. Giallar requires no manual specifications, invariants, or proofs, and can automatically verify that a compiler pass preserves the semantics of quantum circuits. To deal with unbounded loops in quantum compilers, Giallar abstracts three loop templates, whose loop invariants can be automatically inferred. To efficiently check the equivalence of arbitrary input and output circuits that have complicated matrix semantics representations, Giallar introduces a symbolic representation for quantum circuits and a set of rewrite rules for showing the equivalence of symbolic quantum circuits. With Giallar, we implemented and verified 44 (out of 56) compiler passes in 13 versions of the Qiskit compiler, the open-source quantum compiler standard, during which three bugs were detected in and confirmed by Qiskit. Our evaluation shows that most Qiskit compiler passes can be automatically verified in seconds and that verification imposes only a modest overhead on compilation performance.
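Giallar proves semantics preservation symbolically for all inputs; for a single small circuit, the same property can be spot-checked numerically, which the hypothetical sketch below does by comparing unitaries (up to global phase) before and after a Qiskit compilation run. This is the concrete analogue of the equivalence Giallar establishes, not Giallar itself, and the circuit and basis gates are arbitrary choices.

```python
# Hypothetical numeric sanity check (not Giallar's symbolic proof): verify that a compilation
# run preserves the circuit's unitary, up to global phase, using Qiskit's Operator class.
from qiskit import QuantumCircuit, transpile
from qiskit.quantum_info import Operator

qc = QuantumCircuit(2)
qc.h(0)
qc.rz(0.3, 0)
qc.rz(0.4, 0)          # two rotations the optimizer can merge
qc.cx(0, 1)

optimized = transpile(qc, basis_gates=["rz", "sx", "x", "cx"], optimization_level=3)

print("compilation preserved semantics:", Operator(qc).equiv(Operator(optimized)))
```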
Bio: Runzhou Tao is a fourth-year Ph.D. student in the Department of Computer Science, Columbia University. He is advised by Prof. Ronghui Gu. His research focuses on programming language and operating system support for quantum computing. He won best paper awards from conferences such as FOCS and OSDI. He received his Bachelor's degree from the Yao Class at Tsinghua University in 2019.
Time: Jan 05, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
15. Research Session Speaker: Jiyuan Wang
PhD Candidate at University of California, Los Angeles
Title: QDiff: Differential Testing for Quantum Software Stacks
Abstract: Several quantum software stacks (QSS) have been developed in response to rapid hardware advances in quantum computing. A QSS includes a quantum programming language, an optimizing compiler that translates a quantum algorithm written in a high-level language into quantum gate instructions, a quantum simulator that emulates these instructions on a classical device, and a software controller that sends analog signals to very expensive quantum hardware based on quantum circuits. In comparison to traditional compilers and architecture simulators, QSSes are difficult to test due to the probabilistic nature of results, the lack of clear hardware specifications, and quantum programming complexity. QDiff devises a novel differential testing approach for QSSes, with three major innovations: (1) We generate input programs to be tested via semantics-preserving, source-to-source transformations to explore program variants. (2) We speed up differential testing by filtering out quantum circuits that are not worthwhile to execute on quantum hardware by analyzing static characteristics such as circuit depth, two-qubit gate operations, gate error rates, and T1 relaxation time. (3) We design an extensible equivalence checking mechanism via distribution comparison functions such as the Kolmogorov–Smirnov test and cross entropy.
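The equivalence-checking step can be illustrated with a tiny, self-contained example: two "backends" (here just two simulated samplers invented for the illustration) are run on nominally equivalent circuits, and a Kolmogorov–Smirnov test decides whether their output distributions are consistent; the backend names and noise levels below are made up and stand in for real QSS components.

```python
# Toy illustration of QDiff-style distribution comparison (invented "backends"):
# sample measurement outcomes from two sources and compare them with a KS test.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

def sample_backend(p_one: float, shots: int) -> np.ndarray:
    """Pretend backend: returns 0/1 measurement outcomes with P(1) = p_one."""
    return (rng.random(shots) < p_one).astype(int)

# The ideal circuit gives P(1) = 0.5; "backend B" has a small bias standing in for a bug or noise.
outcomes_a = sample_backend(0.50, shots=4000)
outcomes_b = sample_backend(0.52, shots=4000)

stat, p_value = ks_2samp(outcomes_a, outcomes_b)
print(f"KS statistic = {stat:.4f}, p-value = {p_value:.4f}")
print("distributions flagged as different" if p_value < 0.01 else "distributions consistent")
```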
Bio: Jiyuan Wang is a fourth-year Ph.D. candidate in the Computer Science Department at the University of California, Los Angeles. He designs testing and program synthesis methods for heterogeneous computing, including quantum computers and FPGAs. He is a member of the SOLAR group and is co-advised by Professor Miryung Kim and Professor Harry Xu. His paper QDiff was selected as a SIGSOFT research highlight in 2022.
Time: Jan 12, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
16. Research Session Speaker: Charles Yuan
PhD student at Massachusetts Institute of Technology
Title: Abstractions Are Bridges Toward Quantum Programming
Abstract: In this talk, I present abstractions that help classical developers reason about the quantum world, with the goal of designing expressive and sound tools for quantum programming.
First, I present Tower, a language to enable quantum programming with data structures, on which emerging quantum algorithms rely to demonstrate computational advantage. To correctly operate in superposition, a data structure must satisfy three properties — reversibility, history independence, and bounded-time execution. Standard implementations, such as the representation of a set as a hash table, fail these properties, calling for tools to develop specialized implementations. We present Tower, a language that enables the developer to implement data structures as pointer-based, linked data, and Boson, the first memory allocator that supports reversible, history-independent, and constant-time dynamic memory allocation in quantum superposition. Using them, we implement Ground, the first quantum library of data structures, featuring an executable and efficient implementation of sets.
Next, I present Twist, the first language that features a type system for sound reasoning about entanglement. Entanglement, the phenomenon of correlated measurement outcomes, can determine the correctness of algorithms and suitability of programming patterns. Twist leverages purity, a property of an expression that implies freedom from entanglement from the rest of the computation. It features a type system that enables the developer to identify pure expressions using type annotations and purity assertion operators that state the absence of entanglement in the output of quantum gates. To soundly check these assertions, Twist uses a combination of static analysis and runtime verification. We evaluate Twist’s type system and analyses on a benchmark suite of quantum programs in simulation, demonstrating that Twist can express quantum algorithms, catch programming errors in them, and support programs that existing languages disallow, while incurring runtime verification overhead of less than 3.5%.
Bio: Charles Yuan is a Ph.D. student at MIT working with Michael Carbin. His research interests lie in programming languages for the computer systems of tomorrow, and he is currently investigating programming abstractions for quantum systems.
Time: Jan 26, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
17. Research Session Speaker: Minzhao Liu
PhD student at University of Chicago
Title: Understanding Quantum Supremacy Conditions for Gaussian Boson Sampling with High Performance Computing
Abstract: Recent quantum supremacy experiments demonstrated with boson sampling garnered significant attention, while efforts to perfect approximate classical simulation techniques challenge supremacy claims on different fronts. Single-photon boson sampling has been proven to be efficiently simulable due to the limited growth of entanglement entropy, under the condition that the loss rate scales rapidly with the input photon number. However, similar studies for Gaussian boson sampling have remained difficult due to the increased Hilbert space dimensionality. We develop a graphics processing unit (GPU)-accelerated algorithm and increase the algorithm's parallelism to exploit high-performance computing resources, reducing the time-to-solution significantly. With this new capability, we numerically observe similar entanglement entropy plateaus and reductions as input mode numbers increase under certain loss scalings. Additionally, we observe non-trivial effects of squeezing parameters on entanglement entropy scaling. These new findings shed light on the conditions under which Gaussian boson sampling is classically intractable.
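The entanglement-entropy diagnostic at the heart of this analysis can be shown in a few lines of numpy: bipartition a pure state's amplitude vector, take an SVD, and compute the von Neumann entropy of the squared singular values. The sketch below uses qubits and a random state purely for illustration; the actual study works in the much larger Gaussian boson sampling Fock space.

```python
# Illustration only: von Neumann entanglement entropy of a bipartition of a pure state,
# from the singular values of the reshaped amplitude vector (numpy; qubits, not bosonic modes).
import numpy as np

rng = np.random.default_rng(7)
n, cut = 10, 5                                    # 10 qubits, cut into 5 + 5

psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)                        # random pure state (illustrative only)

schmidt = np.linalg.svd(psi.reshape(2**cut, 2**(n - cut)), compute_uv=False)
probs = schmidt**2
entropy = -np.sum(probs * np.log2(probs + 1e-16))

print(f"entanglement entropy across the cut: {entropy:.3f} bits "
      f"(maximum possible here: {min(cut, n - cut)} bits)")
```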
Bio: Minzhao Liu is a 3rd-year PhD student at the University of Chicago working with Dr. Yuri Alexeev and Prof. Liang Jiang. He is interested in answering theoretical questions in quantum computing and quantum physics through the use of advanced computational tools, including machine learning, tensor networks, and hardware accelerators. Specifically, he is interested in approaches to demonstrating quantum supremacy with near-term devices, and in understanding the limitations of quantum supremacy claims through the use of classical simulations.
Time: Feb 02, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
18. Research Session Speaker: Dr. Naoki Kanazawa
Research Scientist at IBM Quantum
Title: Pulse Control for Superconducting Quantum Computers
Abstract: In superconducting quantum computers, quantum gates are compiled down to a sequence of microwave pulses. A better understanding of pulse control techniques enables us to improve gate fidelity and makes quantum circuit execution more tolerant of noise. In this seminar, we begin with a high-level overview of quantum electrodynamics theory to understand pulse control techniques, using the experiment tools provided by the open-source software Qiskit. Several interesting research topics are also briefly introduced.
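As a taste of the pulse-level interface the seminar uses, the sketch below builds a schedule that plays a DRAG-shaped microwave pulse on a drive channel with Qiskit Pulse; the duration, amplitude, sigma, and beta are placeholder values that would normally come from device calibration, and the pulse module is only available in Qiskit versions that still ship it.

```python
# Hedged sketch of Qiskit Pulse (parameters are placeholders, not calibrated values):
# play a DRAG-shaped microwave pulse on qubit 0's drive channel.
from qiskit import pulse

with pulse.build(name="x_pulse_sketch") as sched:
    drive = pulse.DriveChannel(0)
    pulse.play(pulse.Drag(duration=160, amp=0.1, sigma=40, beta=1.5), drive)

print(sched)
```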
Bio: Naoki Kanazawa is a research scientist at IBM Quantum and a Qiskit developer. He specializes in pulse control techniques for superconducting quantum computers and leads the software development of the pulse module in Qiskit.
Time: Feb 09, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
19. Research Session Speaker: Prof. He Li
Associate Professor at Southeast University (China)
Title: Rethinking Most-significant Digit-first Arithmetic for Quantum Computing in NISQ Era
Abstract: In recent years, quantum computers have attracted extensive research interest due to their potential capability of solving problems that are not easily solvable using classical computers. In parallel with the ongoing research on the physical implementation of quantum processors, quantum algorithms, such as Shor's algorithm, quantum linear algebra, and quantum machine learning algorithms, have been actively developed for real-life applications to show quantum advantages, many of which benefit from quantum arithmetic algorithms and their efficient implementations. Although various least-significant digit-first quantum arithmetic operators have been introduced in the literature, interest in investigating the efficient implementation of most-significant digit-first (MSDF) arithmetic is growing. In this talk, we first review quantum arithmetic circuit design, and then present a novel design space exploration method to implement quantum most-significant digit-first arithmetic operators, taking quantum MSDF addition as a case study to demonstrate architectures with low qubit count, low quantum gate usage, and low quantum circuit depth. Scalability and quantitative comparisons for different quantum arithmetic circuits will also be discussed.
Bio: Dr. He Li is an Associate Professor at Southeast University, Nanjing, China. Before joining SEU, Dr. Li was a Research Associate for Quantum Information at the University of Cambridge and a Teaching Fellow at Trinity College Cambridge. Before joining Cambridge, he received his PhD degree at Imperial College London, UK. His research interests include FPGA circuits and systems design, quantum computing, quantum communication, and hardware security, in which he has published over 40 peer-reviewed articles and papers. He serves on the technical programme committees of top-tier EDA and reconfigurable computing conferences (DAC, ICCAD, ICCD, FCCM, FPL, FPT, ASP-DAC, ASAP, SOCC, etc.), on the editorial board of Frontiers in Electronics, and as a guest editor for Electronics. Dr. Li has served as a reviewer for many IEEE/ACM Transactions, including TCAD, TVLSI, TETC, TECS, TRETS, TVT, and TCAS-I & -II. He serves as publicity co-chair of IEEE FPT 2020-2022. Dr. Li is the recipient of the FPT'17 best paper presentation award.
Time: Feb 16, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
20. Research Session Speaker: Dr. Thinh Dinh
Researcher at Vietnam National University
Title: Efficient Hamiltonian Reduction for Scalable Quantum Computing on Clique Cover/Graph Coloring Problems in SatCom
Abstract: Clique cover and graph coloring are complementary problems with many applications in wireless communications, especially in satellite communications (SatCom). They are NP-hard problems that are intractable for classical computers. Recently, quantum computing has emerged as a novel technology that changes how challenging optimization problems are solved: problems are formulated as Quadratic Unconstrained Binary Optimization (QUBO), and the corresponding Hamiltonians are prepared as inputs for quantum computers. However, due to limited hardware resources, existing quantum computers are unable to tackle large optimization spaces. In this work, we studied how to apply quantum computing to solve Beam Placement (BP) problems, a variant of the clique cover problem in Low-Earth Orbit (LEO) SatCom systems. We propose an efficient Hamiltonian Reduction method that allows quantum processors to solve the large BP instances encountered in LEO systems. We conduct our simulations on real quantum computers (D-Wave Advantage) using a real dataset of vessel locations in the US. Numerical results show that our algorithm outperforms D-Wave's commercial solutions by allowing existing quantum annealers to solve 17.5 times larger BP instances while maintaining high solution quality. Although quantum computing cannot theoretically overcome the hardness of BP problems, this work contributes early efforts to applying quantum computing to satellite optimization problems, especially applications formulated as clique cover/graph coloring problems.
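To ground the QUBO step, here is a self-contained toy (not the paper's Hamiltonian Reduction): a 3-node path graph is 2-colored by building the standard one-hot-plus-conflict-penalty QUBO and brute-forcing it with numpy; real beam-placement instances are far too large to enumerate, which is exactly why reducing the Hamiltonian matters.

```python
# Toy QUBO for 2-coloring a 3-node path graph (illustration only, not the paper's model).
# Variable x[v,c] = 1 if node v gets color c. Penalty A enforces one color per node;
# penalty B punishes adjacent nodes that share a color. Brute force over all assignments.
import itertools
import numpy as np

nodes, colors = 3, 2
edges = [(0, 1), (1, 2)]                 # a path: 0 - 1 - 2
A, B = 4.0, 1.0                          # constraint weights (arbitrary but A > B)
idx = lambda v, c: v * colors + c        # flatten (node, color) -> QUBO variable index
n_vars = nodes * colors

Q = np.zeros((n_vars, n_vars))
for v in range(nodes):                   # A * (sum_c x[v,c] - 1)^2, dropping the constant
    for c in range(colors):
        Q[idx(v, c), idx(v, c)] += -A
        for c2 in range(c + 1, colors):
            Q[idx(v, c), idx(v, c2)] += 2 * A
for (u, v) in edges:                     # B * x[u,c] * x[v,c] for every shared color
    for c in range(colors):
        Q[idx(u, c), idx(v, c)] += B

best = min(itertools.product([0, 1], repeat=n_vars),
           key=lambda x: np.array(x) @ Q @ np.array(x))
coloring = {v: [c for c in range(colors) if best[idx(v, c)]] for v in range(nodes)}
print("optimal assignment:", coloring)   # expect alternating colors along the path
```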
Bio: Dr. Thinh Dinh received his PhD degree in Information System Technology and Design at the Singapore University of Technology and Design in 2019, supervised by Prof. Tony Quek. His PhD focused on resource allocation problems in Edge Computing. He was the recipient of IEEE Stephen O. Rice prize in 2020 (Best Paper Award of IEEE Transactions on Communications over the previous 3 years), Best Paper Award of IEEE ATC in 2021 for selected works in that topic. After his PhD, he joined Trusting Social, and Fossil’s Connected Devices Group working on telco and wearable AI applications. Then, he joined University of Luxembourg as a research associate. Currently, he is a researcher at University of Information Technology, a member of Vietnam National University. His research interests focus on applications of quantum computing in wireless communications.
Time: Feb 23, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
21. Research Session Speaker: Dr. Marco Pistoia
Managing Director, Distinguished Engineer, and Head of JPMorgan Chase’s Global Technology Applied Research Center
Title: Quantum Computing and Quantum Communication in the Financial World
Abstract: Finance has been identified as the first industry sector to benefit from quantum computing, due to its abundance of use cases with exponential complexity and the fact that, in finance, time is of the essence, which makes the case for solutions to be computed with high accuracy in real time. Typical use cases in finance that lend themselves to quantum computing are portfolio optimization, derivative pricing, risk analysis, and several problems in the realm of machine learning, such as fraud detection and extractive text summarization. This talk describes the state of the art of quantum computing for finance, focusing on the research work conducted by the quantum computing team at JPMorgan Chase in the area of quantum algorithms and applications for finance. It is also well known that quantum computing has the potential to break public-key cryptography. A quantum-enabled attacker with a sufficiently powerful quantum computer will have the ability to factor large numbers into prime factors in a short amount of time, and consequently compute anyone's private key from the corresponding public key, decrypt confidential data, and impersonate other entities. Numerous cryptographers worldwide have been working on new Post-Quantum Cryptography (PQC) algorithms, but there is no mathematical proof that these algorithms are resistant to quantum computing attacks. This presentation will also describe JPMorgan Chase's research effort consisting of coupling PQC with Quantum Key Distribution (QKD), which is mathematically proven to be unconditionally secure.
Bio: Marco Pistoia, Ph.D. is Managing Director, Distinguished Engineer, and Head of JPMorgan Chase’s Global Technology Applied Research Center, where he leads the Quantum Computing and Quantum Communication areas of research. He joined JPMorgan Chase in January 2020. Formerly, he was a Senior Manager, Distinguished Research Staff Member and Master Inventor at the IBM Thomas J. Watson Research Center in New York, where he managed an international team of researchers responsible for Quantum Computing Algorithms and Applications. He is the inventor of over 250 patents, granted by the U.S. Patent and Trademark Office, and over 300 patent-pending applications. He is also the author of three books and over 400 scholarly papers published in international journals and conferences. In the course of his career, he received five distinguished paper awards from the IEEE and ACM.
Time: Mar 02, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
22. Research Session Speaker: Prof. Yuan Feng
Professor at the Centre for Quantum Software and Information, University of Technology Sydney
Title: Hoare logic for verification of quantum programs
Abstract: Quantum computing and quantum communication provide potential speed-up and enhanced security compared with their classical counterparts. However, the analysis of quantum algorithms and protocols is notoriously difficult. Furthermore, due to the lack of reliable and scalable quantum hardware, traditional techniques such as testing and debugging in classical software engineering will not be readily available in the near future, while static analysis of quantum programs based on formal methods seems indispensable. In this talk, I will introduce Hoare logic, a syntax-oriented method for reasoning about program correctness which has been shown to be effective in the verification of classical and probabilistic programs. An extension of Hoare logic to a simple quantum while-language will be discussed, and some examples will be given to illustrate its utility.
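In quantum Hoare logic, predicates are Hermitian operators (often projectors), and the degree to which a state satisfies a predicate is an expectation value. The hypothetical sketch below is a purely numeric spot-check of one such postcondition for a tiny program, not the syntax-oriented proof system the talk presents: after H on qubit 0 and CX(0,1), the output state should satisfy the predicate "is the Bell state".

```python
# Numeric spot-check of a quantum "postcondition" (illustration, not the proof system itself):
# after the program H(0); CX(0,1), the state should satisfy the predicate encoded by the
# projector onto |Phi+> = (|00> + |11>)/sqrt(2). Satisfaction degree = <psi| P |psi>.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

prog = QuantumCircuit(2)
prog.h(0)
prog.cx(0, 1)
psi = Statevector.from_instruction(prog).data

bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)          # amplitudes of |00> and |11>
P = np.outer(bell, bell.conj())             # postcondition as a projector

satisfaction = np.real(psi.conj() @ P @ psi)
print("degree to which the postcondition holds:", round(satisfaction, 6))   # expect 1.0
```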
Bio: Prof. Yuan Feng received his Bachelor of Science and Ph.D. in Computer Science from Tsinghua University, China in 1999 and 2004, respectively. He is currently a professor at the Centre for Quantum Software and Information, University of Technology Sydney (UTS), Australia. Before joining UTS in 2009, he was an associate professor at Tsinghua University. His research interests include quantum information and quantum computation, quantum programming theory, and probabilistic systems. He was awarded an ARC (Australian Research Council) Future Fellowship in 2010.
Time: Mar 16, Thursday, 20:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
23. Research Session Speaker: Lia Yeh
PhD student at University of Oxford
Title: Quantum Graphical Calculi: Tutorial and Applications
Bio: Lia Yeh is a computer science PhD student in the Quantum Group at the University of Oxford, where her primary research focus is on applying and developing the ZX-calculus and related quantum graphical calculi as a language for qudit circuit synthesis and quantum error correction. She has bachelor's degrees in physics and computing from the College of Creative Studies at the University of California, Santa Barbara, where she designed microwave spectroscopy algorithms to determine molecular structure. She is currently a part-time Research Engineer at Quantinuum, volunteers for IEEE Quantum Education as a steering committee member of the IEEE Quantum Initiative, and volunteers for the Quantum Universal Education not-for-profit community.
Time: Mar 23, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
24. Research Session Speaker: Zhirui Hu
PhD student at George Mason University
Title: Optimize Quantum Learning on Near-Term Noisy Quantum Computers
Abstract: In recent years, there has been a significant breakthrough in the development of superconducting quantum computers, with IBM's 433-qubit quantum computer being a prime example of the progress made in addressing scalability issues. However, the noise generated by quantum bits, or qubits, remains a significant obstacle to realizing the full potential of quantum computing in real-world applications. Despite extensive efforts by researchers to suppress noise, build noise models that describe its effects in simulators, create more accurate qubits, and design robust circuits, the inherent fluctuation of noise (instability) can still undermine the performance of error-aware designs. Worse still, users may not even be aware of the performance degradation caused by changes in noise. In this talk, we will discuss these challenges and solutions for near-term noisy quantum computers.
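The simulator noise models mentioned above look roughly like the hedged sketch below, which attaches a depolarizing error to two-qubit gates in Qiskit Aer and samples a Bell circuit; the error rate is an arbitrary placeholder, and on real hardware it drifts over time, which is precisely the instability problem this talk addresses.

```python
# Hedged sketch: a static depolarizing noise model in Qiskit Aer (the error rate is a placeholder).
# Real device noise drifts over time, which is the instability this talk is concerned with.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator
from qiskit_aer.noise import NoiseModel, depolarizing_error

noise_model = NoiseModel()
noise_model.add_all_qubit_quantum_error(depolarizing_error(0.02, 2), ["cx"])

qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

backend = AerSimulator(noise_model=noise_model)
counts = backend.run(transpile(qc, backend), shots=2000).result().get_counts()
print(counts)    # mostly '00'/'11', with a small error-induced '01'/'10' population
```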
Bio: Zhirui Hu is a first-year Ph.D. student at George Mason University. Her academic advisor is Weiwen Jiang. Her research interest during her Ph.D. is circuit and algorithm optimization on near-term noisy quantum computers. So far, she has three top conference papers (ICCAD and ICCD 2022, DAC 2023) as first author in this area. She has a bachelor's degree in automation from Huazhong University of Science and Technology, so she also has a background in control and machine learning. She is willing to learn new skills and collaborate with others. She will participate in a quantum summer school at LANL this summer.
Time: April 06, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
25. Research Session Speaker: Prof. Zhu Han
John and Rebecca Moores Professor at the University of Houston
Title: Hybrid Quantum-Classical Computing for Future Network Optimization
Abstract: Benefiting from progress in controlling quantum particles and constructing quantum hardware, quantum computation has attracted more and more attention in recent years. This talk will give an introduction to quantum computing and its applications in network optimization. We first introduce the basics of quantum computing and what quantum parallelism is. Second, we discuss the adiabatic quantum computing math model and one real implementation, Quadratic Unconstrained Binary Optimization (QUBO) on the D-Wave quantum annealer. Then we propose a hybrid quantum Benders' decomposition algorithm for joint quantum and classical CPU computing. Finally, we discuss how our proposed framework can be employed in network optimization, smart grid, and machine learning.
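A minimal view of the QUBO-to-annealer hand-off, using D-Wave's open-source dimod package and its exhaustive reference solver rather than real annealing hardware, might look like the sketch below; the two-variable QUBO is a toy, not a network-optimization instance, and on a QPU the ExactSolver would be replaced by a hardware sampler.

```python
# Hedged sketch of the QUBO hand-off using D-Wave's open-source dimod package.
# ExactSolver enumerates all assignments; on hardware one would use a QPU sampler instead.
import dimod

# Toy QUBO: minimize  -x0 - x1 + 2*x0*x1  (optimum: exactly one of x0, x1 set to 1)
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}
bqm = dimod.BinaryQuadraticModel.from_qubo(Q)

best = dimod.ExactSolver().sample(bqm).first
print("best assignment:", dict(best.sample), "energy:", best.energy)
```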
Bio: Zhu Han received the B.S. degree in electronic engineering from Tsinghua University in 1997, and the M.S. and Ph.D. degrees in electrical and computer engineering from the University of Maryland, College Park, in 1999 and 2003, respectively. From 2000 to 2002, he was an R&D Engineer at JDSU, Germantown, Maryland. From 2003 to 2006, he was a Research Associate at the University of Maryland. From 2006 to 2008, he was an assistant professor at Boise State University, Idaho. Currently, he is a John and Rebecca Moores Professor in the Electrical and Computer Engineering Department as well as the Computer Science Department at the University of Houston, Texas. Dr. Han is an NSF CAREER award recipient (2010) and the winner of the 2021 IEEE Kiyo Tomiyasu Award. He has been an IEEE Fellow since 2014, an AAAS Fellow since 2020, an IEEE Distinguished Lecturer from 2015 to 2018, and an ACM Distinguished Speaker from 2022 to 2025. Dr. Han has also been a top 1% highly cited researcher since 2017.
Time: April 13, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
26. Research Session Speaker: Dr. Samuel Yen-Chi Chen
Senior Software Engineer at Wells Fargo
Title: Hybrid Quantum-Classical Machine Learning with Applications
Abstract: The development of machine learning (ML) and quantum computing (QC) hardware has generated a lot of interest in creating quantum machine learning (QML) applications. This presentation will provide a broad overview of the hybrid quantum-classical machine learning approach, including key concepts such as quantum gradient calculation. Additionally, recent advancements in QML across multiple fields, including distributed or federated learning, natural language processing, reinforcement learning, and classification, will be discussed. Potential benefits, scalability, and use cases of QML in the NISQ era will also be covered.
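As a concrete illustration of the quantum gradient calculation mentioned above, the sketch below (a NumPy-only toy, not code from the presentation) applies the parameter-shift rule to a single RY rotation and compares the result with the analytic derivative; hybrid quantum-classical training loops apply the same two-evaluation rule on hardware.

```python
# Parameter-shift rule on a 1-qubit toy model: E(theta) = <psi|Z|psi>
# with |psi> = RY(theta)|0>, so E(theta) = cos(theta).
# For gates generated by Pauli operators the rule
#   dE/dtheta = [E(theta + pi/2) - E(theta - pi/2)] / 2
# is exact.
import numpy as np

Z = np.diag([1.0, -1.0])

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation(theta):
    psi = ry(theta) @ np.array([1.0, 0.0])   # RY(theta)|0>
    return float(psi.conj() @ Z @ psi)

theta = 0.7
shift_grad = 0.5 * (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2))
print(shift_grad, -np.sin(theta))            # the two values agree
```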
Bio: Dr. Samuel Yen-Chi Chen received the Ph.D. and B.S. degrees in physics and the M.D. degree in medicine from National Taiwan University, Taipei City, Taiwan. He is now a senior software engineer at Wells Fargo Bank. Prior to that, he was an assistant computational scientist in the Computational Science Initiative at Brookhaven National Laboratory. His research interests include building quantum machine learning algorithms as well as applying classical machine learning techniques to solve quantum computing problems. He won First Prize in the Software Competition (Research Category) from Xanadu Quantum Technologies in 2019.
Time: April 20, Thursday, 10:00 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
27. Research Session Speaker: Dr. Ji Liu
Postdoctoral Researcher at Argonne National Laboratory
Title: Elevating Quantum Compiler Performance through Enhanced Awareness in the Compilation Stages
Abstract: The quantum compiler plays a critical role in practical quantum computing, particularly in the Noisy Intermediate-Scale Quantum (NISQ) era. With a limited number of qubits, restricted connectivity, and noisy quantum operations, it is crucial to leverage the compiler to optimize circuits and map them onto the target hardware. The talk will first provide an overview of the quantum compilation flow and highlight its important stages. The second and third parts of the talk will delve deeper into these stages, focusing on optimizing the gate decomposition and qubit routing stages by bringing more cross-stage awareness into the compiler. Finally, the talk will cover our recent work on Permutation-Aware Mapping (PAM). PAM exploits the optimization and mapping potential of synthesis with block-level routing heuristics. We show that the PAM framework significantly reduces the routing overhead and may outperform optimal routing algorithms that minimize the SWAP gate overhead. Additionally, PAM generates the best-known implementations for important algorithms such as the Quantum Fourier Transform (QFT) and the Transverse Field Ising Model (TFIM).
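For readers who want to see the routing stage in action, the following sketch uses standard Qiskit transpilation on a hypothetical 4-qubit line topology. It only illustrates the SWAP overhead that cross-stage-aware approaches such as PAM aim to reduce; it is not an implementation of PAM.

```python
# Mapping a circuit onto a 4-qubit line topology with Qiskit's transpiler.
# Routing inserts SWAPs (decomposed into CXs) whenever a two-qubit gate
# acts on non-adjacent qubits.
from qiskit import QuantumCircuit, transpile
from qiskit.transpiler import CouplingMap

qc = QuantumCircuit(4)
qc.h(0)
qc.cx(0, 3)            # qubits 0 and 3 are not adjacent on the device
qc.cx(1, 2)

line = CouplingMap([[0, 1], [1, 2], [2, 3]])   # linear connectivity
mapped = transpile(qc, coupling_map=line,
                   basis_gates=["cx", "rz", "sx", "x"],
                   optimization_level=3)

print("before:", qc.count_ops(), "depth", qc.depth())
print("after: ", mapped.count_ops(), "depth", mapped.depth())
```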
Bio: Dr. Ji Liu is a postdoctoral researcher at Argonne National Laboratory. His research focuses on improving the programmability, debuggability, and reliability of quantum computers. He currently works on quantum compiler optimization and noise mitigation techniques for NISQ computers. Prior to joining Argonne, he received his Ph.D. in Computer Engineering from North Carolina State University and B.S. in Applied Physics from the University of Science and Technology of China. Dr. Liu was the recipient of the Special Oracle Award in the IBM Quantum Challenge 2019. His work on qubit routing was recognized with the Distinguished Artifact Award at HPCA 2022.
Time: April 27, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
28. Research Session Speaker: Dr. Daniel Egger
Senior Research Scientist at IBM Quantum, IBM Zurich
Title: Pulse-based Variational Quantum Eigensolver and Pulse-Efficient Transpilation
Abstract: State-of-the-art noisy digital quantum computers can only execute short-depth quantum circuits. Variational algorithms are a promising route to unlock the potential of noisy quantum computers since the depth of the corresponding circuits can be kept well below hardware-imposed limits. Typically, the variational parameters correspond to virtual RZ gate angles, implemented by phase changes of calibrated pulses. By encoding the variational parameters directly as hardware pulse amplitudes and durations we succeed in further shortening the pulse schedule and overall circuit duration. This decreases the impact of qubit decoherence and gate noise. As a demonstration, we apply our pulse-based variational algorithm to the calculation of the ground state of different hydrogen-based molecules (H2, H3 and H4) using IBM cross-resonance-based hardware. We observe a reduction in schedule duration of up to 5× compared to CNOT-based Ansätze, while also reducing the measured energy. In particular, we observe a sizable improvement of the minimal energy configuration of H3 compared to a CNOT-based variational form. Finally, we discuss possible future developments including error mitigation schemes and schedule optimizations, which will enable further improvements of our approach paving the way towards the simulation of larger systems on noisy quantum devices.
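For orientation, a minimal gate-level VQE loop on a toy one-qubit Hamiltonian is sketched below (illustrative only; the Hamiltonian and ansatz are made up for the example). The talk's pulse-based variant instead exposes pulse amplitudes and durations as the variational parameters, but the classical optimization loop is analogous.

```python
# A minimal gate-level VQE loop for the toy Hamiltonian H = Z + 0.5 X.
# The ansatz is a single RY rotation; a classical optimizer minimizes
# the measured energy <psi(theta)|H|psi(theta)>.
import numpy as np
from scipy.optimize import minimize_scalar

X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.diag([1.0, -1.0])
H = Z + 0.5 * X

def ansatz(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([c, s])            # RY(theta)|0>

def energy(theta):
    psi = ansatz(theta)
    return float(psi @ H @ psi)

res = minimize_scalar(energy)
print("VQE energy:", res.fun, " exact ground energy:", np.min(np.linalg.eigvalsh(H)))
```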
Bio: Dr. Daniel J. Egger is a Senior Research Scientist working at IBM Quantum, IBM Research Europe – Zurich. His research focuses on the control of quantum computers, integrating them in modern software stacks, and on the practical applications of quantum algorithms in finance, optimization, and natural sciences. Dr. Egger joined IBM in 2016. From 2014 to 2016 he worked in the asset management industry as a risk manager. He earned a PhD in theoretical physics in 2014 for his work on quantum simulations and optimal control of quantum computers based on superconducting qubits.
Time: May 11, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
29. Research Session Speaker: Thomas Alexander
Software Developer at IBM Quantum, Market Leader in Quantum Systems and Services
Title: Control Systems & Systems Software @ IBM Quantum
Abstract: The field of quantum computing has rapidly developed around a cloud model, in which users receive a remote handle to a quantum system, compile (transpile) quantum programs (circuits) for the target system, remotely invoke an execution on the system, and then fetch the results. Because of the cloud-native architecture most vendors provide, there is little insight into the software and control systems that orchestrate the QPU to expose a quantum computer to the end user. We will provide an overview of the systems-level quantum computing stack at IBM Quantum. We will introduce, at a high level, our control-system computer architecture and how we use it to orchestrate a real-time dynamic-circuit quantum program. We will then walk through compiling and executing quantum programs in a production environment. Finally, we will discuss the challenges we see in the next 3-5 years in this domain as we work towards delivering on IBM Quantum’s roadmap.
Bio: Thomas Alexander is a software developer at IBM Quantum, a market leader in quantum systems and services. At IBM, Thomas helps design and build the software toolchain for the control electronics that power a quantum computer – such as modeling and compiling quantum programs, generating code, and helping to architect the quantum control systems. Currently, Thomas is leading an interdisciplinary team to deliver software-defined infrastructure for IBM’s road-mapped quantum systems. Previously, Thomas led the effort to deliver dynamic circuit capabilities and Qiskit Pulse to IBM Quantum clients. Thomas enjoys contributing to the quantum computing community and has been a core contributor to Qiskit Terra & OpenQASM. Prior to joining IBM Quantum, Thomas studied quantum computing at the Institute for Quantum Computing at the University of Waterloo where he performed experiments in solid-state NMR, NV centers, and developed software for experiment design systems.
Time: May 18, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
30. Research Session Speaker: Dr. Ruslan Shaydulin
Applied Research Lead at the Global Technology Applied Research Center at JPMorgan Chase & Co
Title: Parameter Setting in Quantum Approximate Optimization of Weighted Problems
Abstract: The Quantum Approximate Optimization Algorithm (QAOA) is a leading candidate algorithm for solving combinatorial optimization problems on quantum computers. However, in many cases QAOA requires computationally intensive parameter optimization. The challenge of parameter optimization is particularly acute for weighted problems, for which the eigenvalues of the phase operator are non-integer and the QAOA energy landscape is not periodic. In this work, we develop parameter-setting heuristics for QAOA applied to a general class of weighted problems. First, we derive optimal parameters for QAOA with depth p = 1 applied to the weighted MaxCut problem under different assumptions on the weights. In particular, we rigorously prove the conventional wisdom that in the average case the first local optimum near zero gives globally optimal QAOA parameters. Second, for p ≥ 1 we prove that the QAOA energy landscape for weighted MaxCut approaches that of the unweighted case under a simple rescaling of parameters; therefore, parameters previously obtained for unweighted MaxCut can be reused for weighted problems. We also prove that for p = 1 the QAOA objective sharply concentrates around its expectation, which means that our parameter-setting rules hold with high probability for a random weighted instance. We numerically validate this approach on a dataset of 34,701 weighted graphs with up to 20 nodes and show that the QAOA energy with the proposed fixed parameters is only 1.1 percentage points (p.p.) away from that with optimized parameters. Finally, we propose a general heuristic rescaling scheme inspired by the analytical results for weighted MaxCut and demonstrate its effectiveness using QAOA with the XY Hamming-weight-preserving mixer applied to the portfolio optimization problem as an example. We show that our simple rule improves the convergence of local optimizers, reducing the number of iterations required to reach a fixed local minimum by 7.2x on average.
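To make the notion of a QAOA energy landscape E(γ, β) concrete, the sketch below evaluates the depth p = 1 QAOA expectation for a small, made-up weighted MaxCut instance by exact statevector simulation. The analytic parameter-setting and rescaling rules from the work itself are not reproduced here.

```python
# Exact p = 1 QAOA energy for a small weighted MaxCut instance.
# E(gamma, beta) is the expected cut value of the QAOA state
# exp(-i beta B) exp(-i gamma C) |+...+>, computed by brute force.
import numpy as np
from functools import reduce

edges = [(0, 1, 0.5), (1, 2, 1.5), (2, 3, 1.0), (0, 3, 2.0)]   # made-up weights
n = 4

# Diagonal cost operator: weighted cut value of every bitstring.
cost = np.zeros(2 ** n)
for z in range(2 ** n):
    bits = [(z >> q) & 1 for q in range(n)]
    cost[z] = sum(w for i, j, w in edges if bits[i] != bits[j])

def qaoa_energy(gamma, beta):
    psi = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)      # |+...+>
    psi *= np.exp(-1j * gamma * cost)                         # phase operator
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])       # e^{-i beta X}
    mixer = reduce(np.kron, [rx] * n)                         # X mixer on all qubits
    psi = mixer @ psi
    return float(np.abs(psi) ** 2 @ cost)                     # expected cut value

print(qaoa_energy(0.3, 0.2))
```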
Bio: Ruslan Shaydulin is an Applied Research Lead at the Global Technology Applied Research center at JPMorgan Chase. Ruslan’s research centers on applying quantum algorithms to classical problems, with a focus on optimization and machine learning. Prior to joining JPMorgan Chase, Ruslan was a Maria Goeppert Mayer fellow at Argonne National Laboratory.
Time: June 08, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
31. Research Session Speaker: Prof. Giulio Chiribella
Full Professor and the Director of QICI Quantum Information and Computation Initiative at the Department of Computer Science of The University of Hong Kong
Title: The Nonequilibrium Cost of Accuracy
Abstract: Accurate information processing is crucial both in technology and in nature. To achieve it, any information processing system needs an initial supply of resources away from thermal equilibrium. In this talk, I will discuss the in-principle limits on the accuracy achievable with a given amount of nonequilibrium resources. Specifically, I will present a limit based on an entropic quantity, named the reverse entropy, associated with a time reversal of the information processing task under consideration. The limit is achievable for all deterministic classical computations and for all their quantum extensions. As an application, I will show the optimal tradeoffs between nonequilibrium and accuracy for the tasks of storing, transmitting, cloning, and erasing information. These results set a target for the design of new devices approaching the ultimate efficiency limit, and provide a framework for demonstrating thermodynamic advantages of genuine quantum information processing.
Bio: Giulio Chiribella is a full professor and the director of QICI Quantum Information and Computation Initiative at the Department of Computer Science of The University of Hong Kong. He has done research on quantum causal networks, on the information-theoretic foundations of quantum theory, and on the ultimate precision limits of quantum measurements, for which he was awarded the Hermann Weyl Prize 2010. In 2020 and 2018 he received Senior Research Fellowships from the Hong Kong Research Grant Council (RGC) and from the Croucher Foundation, respectively. He currently serves as an elected member of the Hong Kong Young Academy of Sciences, as a visiting professor at the University of Oxford, and as an editorial board member of the journal Communications in Mathematical Physics. Before joining the University of Hong Kong, he held faculty positions at Oxford University and Tsinghua University, Beijing.
Time: June 15, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
32. Research Session Speaker: Zichang He
Senior Research Associate at the Global Technology Applied Research Center at JPMorgan Chase
Title: Align or Not Align? Design Quantum Approximate Operator Ansatz (QAOA) with Applications in Constrained Optimization
Abstract: Combinatorial optimization has been one of the most promising use cases for near-term quantum computers. To facilitate its practical utility, encoding problem constraints is an important task. The quantum approximate operator ansatz (QAOA), an extension of the well-known quantum approximate optimization algorithm, is one of the leading quantum algorithms for constrained optimization. For high-depth QAOA, the quantum adiabatic theorem motivates an alignment between the initial state and the mixer operator. However, the low-depth QAOA mechanism and ansatz design remain less explored. In this talk, we will validate that the alignment effect continues to improve QAOA performance even in the low-depth regime. We take portfolio optimization as a comprehensive case study, where the Hamming-weight constraint is encoded by XY mixers. Furthermore, we demonstrate these findings on a 32-asset problem using Quantinuum’s System Model H2 device. To the best of our knowledge, this is the largest-scale QAOA demonstration on a universal quantum computer to date.
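The Hamming-weight constraint mentioned above is preserved because the XY interaction commutes with the total excitation number, so an XY-mixer layer never moves amplitude out of the feasible subspace. A two-qubit numerical check (illustrative only, not from the talk) is sketched below.

```python
# XY mixers preserve Hamming weight: exp(-i beta (XX+YY)/2) only moves
# amplitude within the fixed-weight subspace.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
H_xy = (np.kron(X, X) + np.kron(Y, Y)) / 2

U = expm(-1j * 0.7 * H_xy)           # one XY-mixer layer, beta = 0.7
psi = np.zeros(4, dtype=complex)
psi[1] = 1.0                          # |01>, Hamming weight 1
out = U @ psi

# All probability stays on |01> and |10>; |00> and |11> remain empty.
print(np.round(np.abs(out) ** 2, 3))
```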
Bio: Zichang He is a Senior Research Associate at the Global Technology Applied Research Center at JPMorgan Chase and a PhD candidate in Electrical and Computer Engineering at UC Santa Barbara. Zichang’s research primarily focuses on quantum computing and its design automation. Zichang is the recipient of the IEE Excellence in Research Fellowship at UCSB in 2021 and two best student paper awards, at IEEE EPEPS 2020 and IEEE HPEC 2022.
Time: June 22, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
33. Research Session Speaker: Prof. Zheng (Eddy) Zhang
Associate Professor at Rutgers University
Title: A Structured Method for Compilation of QAOA Circuits in Quantum Computing
Abstract: A critical feature of today’s quantum circuits is that they contain permutable two-qubit operators. The flexibility in ordering these permutable two-qubit gates creates more compiler optimization opportunities, but it also imposes significant challenges due to the additional degree of freedom. Our contributions are twofold. We first propose a general method that can find structured solutions for scalable quantum hardware; it breaks the complex compilation problem down into two sub-problems that can be solved at a small scale. Second, we show how such a structured method can be adapted to practical cases that handle the sparsity of the input problem graphs and the noise variability of real hardware. We evaluate our method on IBM and Google architecture coupling graphs for up to 1,024 qubits and demonstrate better results in both depth and gate count, with up to a 72% reduction in depth and a 66% reduction in gate count. Our real experiments on IBM Mumbai show that we can find a better expected minimal energy than the state-of-the-art baseline(s).
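As a small illustration of the gate permutability discussed above (not code from the paper), the ZZ-phase gates that make up a QAOA cost layer are all diagonal, so any ordering produces the same unitary:

```python
# Permutable two-qubit operators: ZZ-phase gates are diagonal, hence they
# commute and the compiler may reorder them freely.  A 3-qubit check:
import numpy as np
from scipy.linalg import expm

Z = np.diag([1.0, -1.0])
I = np.eye(2)

def zz(i, j, theta, n=3):
    ops = [Z if q in (i, j) else I for q in range(n)]
    term = ops[0]
    for op in ops[1:]:
        term = np.kron(term, op)
    return expm(-1j * theta * term)

A = zz(0, 1, 0.4) @ zz(1, 2, 0.9)    # one gate ordering
B = zz(1, 2, 0.9) @ zz(0, 1, 0.4)    # the permuted ordering
print(np.allclose(A, B))              # True
```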
Bio: Zheng (Eddy) Zhang is an Associate Professor at Rutgers University. Her research is in compilers, systems, and quantum computing. Her recent work deals with the synergistic interaction between quantum applications, programming languages, intermediate representation, compilers, and micro-architecture for NISQ computing devices. She will be talking about a paper that will appear in ASPLOS 2024.
Time: June 29, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
34. Research Session Speaker: Tianyi Hao
PhD student at the University of Wisconsin-Madison
Title: Enabling High Performance Debugging for Variational Quantum Algorithms using Compressed Sensing
Abstract: Variational quantum algorithms (VQAs) are promising for solving practical problems on Noisy Intermediate-Scale Quantum (NISQ) devices. However, developing VQAs is challenging due to the limited availability of quantum hardware, its high error rates, and the significant overhead of classical simulations. In addition, for a VQA to work, researchers must choose proper configurations for its various components empirically, as there are few techniques or software tools for configuring and tuning VQA hyperparameters. In this talk, Tianyi will present OSCAR, a tool that helps VQA researchers pick good initial points, choose suitable optimizer configurations, and deploy appropriate error mitigation methods. OSCAR enables efficient debugging and performance tuning by providing users with the loss-function landscape without running the thousands of quantum circuits a grid search would require. Furthermore, OSCAR can answer an optimizer’s function query in an instant by interpolating a precomputed landscape, significantly reducing the overhead of optimizer hyperparameter tuning.
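The landscape-query idea can be illustrated with a simple stand-in (this is not OSCAR itself, and its compressed-sensing reconstruction is omitted): evaluate a toy two-parameter cost on a coarse grid once, then let the optimizer query an interpolant instead of new circuit executions.

```python
# Landscape queries by interpolation: evaluate a toy two-parameter VQA cost
# on a coarse grid once, then answer arbitrary queries from the interpolant.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def cost(gamma, beta):                        # stand-in for a circuit execution
    return np.cos(gamma) * np.sin(2 * beta) + 0.1 * gamma

gammas = np.linspace(0, np.pi, 25)
betas = np.linspace(0, np.pi, 25)
grid = np.array([[cost(g, b) for b in betas] for g in gammas])

landscape = RegularGridInterpolator((gammas, betas), grid)

query = np.array([[1.234, 0.777]])            # an optimizer's next query point
print("interpolated:", float(landscape(query)[0]), " true:", cost(1.234, 0.777))
```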
Bio: Tianyi Hao is a CS Ph.D. student at the University of Wisconsin-Madison in Prof. Swamit Tannu’s group. Prior to joining UW-Madison, he received his Bachelor’s degrees in CS and Physics at the University of Illinois at Urbana-Champaign and Master’s degree in CS at Stanford. His research interests include quantum algorithms, quantum optimization, and quantum circuit simulation.
Time: Aug 10, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
35. Research Session Speaker: Dr. Michael Goerz
Senior Post-doctoral Fellow at the U.S. Army Research Lab
Title: Numerical Methods of Optimal Quantum Control
Abstract: At the heart of next-level quantum technologies like quantum computing lies the problem of actively controlling the dynamics of a quantum system. I will give an overview of the numerical methods of open-loop quantum control theory. These methods are based on simulating the dynamics of the system and then iteratively minimizing the value of an optimization functional, e.g., a gate error. I will discuss the efficient simulation of quantum dynamics and describe the widely used gradient ascent (GRAPE) and Krotov’s methods for unconstrained quantum control. Using the QuantumControl.jl software package, I will show an example of using semi-automatic differentiation to optimize gate entanglement. Finally, I will give an outlook on the wider ecosystem of control methods, the possibility of adapting to experimental constraints, and the general design of quantum control software.
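A minimal GRAPE-flavored example is sketched below (illustrative only; it uses SciPy's finite-difference gradients rather than the analytic GRAPE gradient or the QuantumControl.jl package mentioned in the talk): piecewise-constant control amplitudes are optimized to drive a qubit from |0> to |1>.

```python
# GRAPE-flavored state-to-state optimization: piecewise-constant amplitudes
# u_k drive H(t) = u_k * X/2, and we minimize the infidelity of reaching |1>.
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

X = np.array([[0, 1], [1, 0]], dtype=complex)
psi0 = np.array([1, 0], dtype=complex)
target = np.array([0, 1], dtype=complex)
n_slices, dt = 20, 0.1

def infidelity(u):
    psi = psi0
    for uk in u:                                   # propagate slice by slice
        psi = expm(-1j * dt * uk * X / 2) @ psi
    return 1.0 - abs(target.conj() @ psi) ** 2

u0 = 0.5 * np.ones(n_slices)                       # initial guess for the pulse
res = minimize(infidelity, u0, method="L-BFGS-B")  # finite-difference gradients
print("final infidelity:", res.fun)
```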
Bio: Michael Goerz is a senior post-doctoral fellow at the U.S. Army Research Lab in Adelphi, MD. He received his PhD on “Optimizing Robust Quantum Gates in Open Quantum Systems” in the group of Christiane Koch in Kassel, Germany. Before joining the Army Research Lab, he was a postdoc in the group of Hideo Mabuchi at Stanford. His research focuses on methods of quantum optimal control in a wide range of applications, currently for the design of pulse schemes for quantum metrology with trapped atoms. He is the lead developer of the QuantumControl Julia package.
Time: Aug 24, Thursday, 22:00 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
36. Research Session Speaker: Daniel Silver
PhD Candidate at Northeastern University
Title: Quantum Machine Learning on Current Quantum Computers
Abstract: Quantum computing has the potential to speed up many tasks in machine learning. Properties of quantum computing such as superposition, entanglement, and reversibility set it apart from classical computing. However, there are many challenges to the adoption of this technology in the current noisy intermediate-scale quantum (NISQ) era, which is characterized by high levels of hardware noise and relatively small quantum computers. Nonetheless, the current technology can still be used in innovative ways to execute machine-learning tasks. This talk presents quantum machine learning solutions for problems in classification, similarity detection, and image generation, focusing on recent research on how quantum computers can be used to execute machine learning tasks today.
Bio: Daniel Silver is a Ph.D. Candidate at Northeastern University with a focus on integrating the principles of machine learning with the emerging field of quantum computing. Before starting his Ph.D. journey, Daniel also received his B.Sc. in Computer Engineering and M.Sc. in Machine Learning from Northeastern University. He has published his research at top-tier conference venues such as AAAI, ICCV, SC, ISCA, and DATE.
Time: Aug 31, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
38. Research Session Speaker: Evan McKinney
PhD student at University of Pittsburgh
Title: Quantum Circuit Decomposition and Routing Collaborative Design
Abstract: In this talk, I will highlight our work on the co-design of superconducting quantum computers, emphasizing our focus on developing specialized transpilation tools. Transpilation tackles key bottlenecks, notably the computational overhead arising from the frequent use of SWAP gates due to limited qubit connectivity in NISQ systems. Our most recent contribution is an optimization technique termed MIRAGE. This approach enhances existing circuit routing algorithms by incorporating ‘mirror gates,’ which inherently include data movement in their decompositions. Mirror gates execute alternative quantum gates simply by reversing the order of qubit outputs. MIRAGE effectively reduces circuit depth by minimizing the reliance on SWAP gates, making it highly compatible with systems employing the √iSWAP basis gate.
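The arithmetic behind the mirror-gate idea can be checked directly (an illustrative example, not the MIRAGE algorithm itself): a routed SWAP costs three CNOTs, whereas a CNOT followed by a SWAP, i.e. its mirror, collapses to two.

```python
# Why routing SWAPs are expensive, and what a mirror gate buys:
# SWAP = 3 CNOTs, but CNOT followed by SWAP needs only 2 CNOTs.
import numpy as np

CX01 = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])  # control q0
CX10 = np.array([[1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0]])  # control q1
SWAP = np.array([[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1]])

print(np.allclose(SWAP, CX01 @ CX10 @ CX01))        # SWAP = 3 CNOTs
print(np.allclose(SWAP @ CX01, CX01 @ CX10))        # mirrored CNOT = 2 CNOTs
```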
Bio: Evan McKinney is a third-year PhD student, co-advised by Dr. Alex K. Jones and Dr. Michael Hatridge at the University of Pittsburgh. He received his bachelor’s degree in computer engineering and physics from Iowa State University. His research is on quantum computing architecture and optimizations for near-term QC applications.
Time: Sep 07, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
39. Research Session Speaker: Prof. Archana Kamal
Associate Professor in the Department of Physics and Applied Physics at the University of Massachusetts Lowell
Title: Quantum Reservoir Engineering for Fast and Scalable Entanglement Stabilization
Abstract: High-fidelity entanglement is a prerequisite for almost any quantum information processing task. A powerful approach to generate robust entanglement is quantum reservoir engineering that employs controlled dissipation to act on a quantum system, such that the resultant dynamics naturally relax the system to an entangled state (or state space) of interest. However, in conventional reservoir engineering protocols based on resonant driving, high fidelity necessarily comes at the cost of speed of state preparation. In this talk, I will describe a new class of “exact” quantum reservoir engineering protocols that exhibit concurrent scaling of steady-state fidelity and preparation speed. I will then discuss how modular dissipation allows extensions of such ideas for scalable entanglement generation in NISQ-era platforms.
Bio: Archana Kamal is an Associate Professor in the Department of Physics and Applied Physics at the University of Massachusetts Lowell (UML) and directs the QUantum Engineering Science and Technology (QUEST) Group at UML. After completing her pre-doctorate education in India, she pursued her doctoral research at Yale University, followed by a postdoctoral stint at MIT. Her research spans both fundamental and applied aspects of quantum information processing, with a focus on engineered quantum systems that are “controllable” like classical machines, while intrinsically behaving quantum-mechanically like atoms. Her research led to new designs of noise-resilient artificial atoms (or “qubits”) and new protocols for noiseless information routing and nonreciprocal amplification, which are now routinely employed in many laboratories around the world. Some of the current themes of her research include generation and control of large-scale entanglement, high-fidelity quantum measurement and readout, and applications of quantum information concepts to tackle questions at the interface of condensed matter, cosmology and thermodynamics. Her contributions to nonreciprocal quantum signal processing were recognized by MIT Technology Review with a TR35 – Global Innovator Award in 2018. She is also the recipient of 2021 AFOSR Young Investigator Award and 2021 NSF CAREER Award.
Time: Sep 14, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
40. Research Session Speaker: Prof. Mitch Thornton
Cecil H. Green Chair of Engineering and Professor in the Department of Electrical and Computer Engineering at Southern Methodist University
Title: Quantum Oracle Synthesis with an Application to QRNG
Abstract: Several prominent quantum computing algorithms—including Grover’s search algorithm and Shor’s algorithm for finding the prime factorization of an integer—employ subcircuits termed ‘oracles’ that embed a specific instance of a mathematical function into a corresponding bijective function that is then realized as a quantum circuit representation. Designing oracles, and particularly, designing them to be optimized for a particular use case, can be a non-trivial task. For example, the challenge of implementing quantum circuits in the current era of NISQ-based quantum computers generally dictates that they should be designed with a minimal number of qubits, as larger qubit counts increase the likelihood that computations will fail due to one or more of the qubits decohering. However, some quantum circuits require that function domain values be preserved, which can preclude using the minimal number of qubits in the oracle circuit. Thus, quantum oracles must be designed with a particular application in mind. In this work, we present methods for automatic quantum oracle synthesis. We then present an application of the method for synthesizing programmable quantum random number generators (QRNG) that obey arbitrary user-defined probability distribution functions.
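A toy version of the distribution-programmable QRNG idea (illustrative only; the talk's oracle-synthesis method for building the circuits is not shown) encodes a user-defined probability mass function into state amplitudes sqrt(p), so that computational-basis measurement reproduces the target distribution:

```python
# Distribution-programmable randomness in miniature: amplitudes sqrt(p_x)
# make the measurement statistics follow the user-defined pmf p.
import numpy as np

p = np.array([0.1, 0.2, 0.3, 0.4])        # target pmf over 2-bit outcomes
amplitudes = np.sqrt(p)                    # |psi> = sum_x sqrt(p_x) |x>

rng = np.random.default_rng(0)
samples = rng.choice(4, size=100_000, p=np.abs(amplitudes) ** 2)  # simulated measurements
counts = np.bincount(samples, minlength=4) / len(samples)
print("target:   ", p)
print("empirical:", np.round(counts, 3))
```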
Bio: Mitchell A. (Mitch) Thornton is currently the Cecil H. Green Chair of Engineering and Professor in the Department of Electrical and Computer Engineering at Southern Methodist University in Dallas, Texas. He also serves as the Executive Director of the Darwin Deason Institute for Cyber Security, a research-only unit, and as Program Director for the interdisciplinary M.S. in Data Engineering degree program within the Lyle School of Engineering at SMU. He is an author or co-author of five books and more than 300 technical articles. He is a named inventor on over 20 US/PCT/WIPO patents and patents pending. During his career as an academic researcher, he has performed sponsored research for numerous federal government agencies and industrial organizations that, in total, exceeds $10M in combined research support. He received the PhD in computer engineering from SMU in 1995, MS in computer science from SMU in 1993, MS in electrical engineering from the University of Texas at Arlington in 1990, and BS in electrical engineering from Oklahoma State University in 1985.
Time: Sep 28, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
41. Research Session Speaker: Prof. Prabhat Mishra
Professor in the Department of Computer and Information Science and Engineering and a UF Research Foundation Professor at the University of Florida, IEEE Fellow, AAAS Fellow, ACM Distinguished Scientist
Title: Design Automation for Quantum Computing
Abstract: This talk will provide the full picture of design automation for quantum computing – from designing quantum algorithms and quantum circuits to quantum computing using physical quantum hardware. In this context, I will describe various design automation concepts and associated tools, including quantum noise, quantum compilation, quantum simulation, quantum measurement, quantum error correction, quantum state preparation, and quantum security.
Bio: Prabhat Mishra is a Professor in the Department of Computer and Information Science and Engineering and a UF Research Foundation Professor at the University of Florida. His research interests include quantum computing, embedded systems, hardware security, energy-aware computing, formal verification, and explainable AI. He has published 9 books, 37 book chapters, and more than 200 research articles in premier international journals and conferences. His research has been recognized by several awards including the NSF CAREER Award, IBM Faculty Award, three best paper awards, eleven best paper nominations, and EDAA Outstanding Dissertation Award. He currently serves as an Associate Editor of ACM Transactions on Embedded Computing Systems. He is an IEEE Fellow, a Fellow of the American Association for the Advancement of Science, and an ACM Distinguished Scientist.
Time: Oct 12, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
42. Research Session Speaker: Prof. Xiantao Li
Professor in the Department of Mathematics at the Pennsylvania State University
Title: Open Quantum Systems in Quantum Computing
Abstract: Quantum dynamics in real-world scenarios seldom stand alone; they occur under the continuous influence of surrounding environments. One important example is quantum computers’ gate operations, which are subject to environmental noise. This noise can cause outcomes to deviate from those intended by the quantum algorithm. Consequently, the dynamics of open quantum systems describe a broader range of problems than the conventionally known time-dependent Schrödinger equation. In this presentation, we will show how models for open quantum systems can be derived, and how they can be simulated on quantum computers.
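As a concrete example of such dynamics, the sketch below classically integrates a single-qubit Lindblad equation with amplitude damping (illustrative only; simulating these dynamics on a quantum computer, the subject of the talk, requires different techniques).

```python
# Euler integration of a single-qubit Lindblad equation,
#   d(rho)/dt = -i[H, rho] + gamma (L rho L† - 1/2 {L†L, rho}),
# with an amplitude-damping jump operator L = sigma_minus.
import numpy as np

H = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)     # (omega/2) Z with omega = 1
L = np.array([[0, 1], [0, 0]], dtype=complex)             # sigma_minus
gamma, dt, steps = 0.2, 0.01, 1000

rho = np.array([[0, 0], [0, 1]], dtype=complex)           # start in the excited state |1>
for _ in range(steps):
    comm = -1j * (H @ rho - rho @ H)
    diss = gamma * (L @ rho @ L.conj().T
                    - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    rho = rho + dt * (comm + diss)

# Excited-state population decays toward exp(-gamma * t).
print(np.real(rho[1, 1]), np.exp(-gamma * dt * steps))
```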
Bio: Xiantao Li received his PhD from the University of Wisconsin in 2002. He is currently a professor in the Department of Mathematics at the Pennsylvania State University. His current research interests lie in quantum computing algorithms, electronic structure calculations, machine learning, and model reduction.
Time: Oct 19, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09
43. Research Session Speaker: Dr. Weitang Li
Senior Researcher at Tencent Quantum Lab
Title: Quantum Computational Chemistry: Variational Quantum Eigensolver and the Design of Ansatz
Abstract: Over the past decade, quantum computing has experienced remarkable growth and development, opening up new possibilities for simulating molecular properties. In this talk, I will provide an overview of the fundamental concepts of quantum chemistry and the variational quantum eigensolver (VQE) algorithm. I will place particular emphasis on recent advancements in parameterized quantum circuits for VQE, including the unitary coupled-cluster (UCC) ansatz and the hardware-efficient ansatz (HEA). The talk will conclude with an optimistic estimation of the resources required to achieve quantum advantage.
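For readers unfamiliar with the hardware-efficient ansatz, a typical construction in Qiskit is sketched below (a generic illustration, not the specific circuits from the talk; the layer structure and gate choices are assumptions made for the example).

```python
# A hardware-efficient ansatz (HEA): layers of single-qubit RY rotations
# interleaved with a fixed CX entangling pattern.  In a VQE loop, a classical
# optimizer would tune the circuit's free parameters.
from qiskit.circuit.library import TwoLocal

ansatz = TwoLocal(4, rotation_blocks="ry", entanglement_blocks="cx",
                  entanglement="linear", reps=2)
print(ansatz.num_parameters)           # number of variational parameters
print(ansatz.decompose().draw())       # layered RY + CX structure
```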
Bio: Weitang Li is a senior researcher at Tencent Quantum Lab, Shenzhen, China. He received his Ph.D. from the Department of Chemistry, Tsinghua University, where he conducted research on the application of tensor networks and quantum computing to chemical systems. At Tencent Quantum Lab he continues his research on quantum computational chemistry and develops TenCirChem, an efficient and versatile quantum computation package for molecular properties.
Time: Oct 26, Thursday, 10:30 ET
Zoom: https://notredame.zoom.us/j/99013211531?pwd=UW5Wa1ltTEVqY25CczRvUzNhTmQ0dz09