Highlights of Data Science (GDS) Talks @ APS 2021 March Meeting
The American Physical Society (APS) March Meeting is one of the largest physics meetings in the world. In 2021, the meeting will be held online due to COVID-19.
To help the community quickly catch up on the work to be presented at this meeting, the Paper Digest Team processed all talk abstracts and generated one highlight sentence (typically the main topic) for each. Readers are encouraged to read these machine-generated highlights / summaries to quickly get the main idea of each talk. This article covers the talks related to Data Science (GDS).
If you do not want to miss any interesting academic paper, you are welcome to sign up for our free daily paper digest service to get updates on new papers published in your area every day. You are also welcome to follow us on Twitter and LinkedIn to stay updated with new conference digests.
Paper Digest Team
team@paperdigest.org
TABLE: Data Science (GDS)
No. | Title | Authors | Highlight | Session |
---|---|---|---|---|
1 | Network Theory Meets Materials Science | Wolverton, Christopher | Here we consider a complementary approach, a top-down study of the organizational structure of networks of materials, based on the interaction between materials themselves. | Session 1: AI Materials Design and Discovery |
2 | Neural network – assisted search for active site ensembles in dilute bimetallic nanoparticle catalysts | Marcella, Nicholas; Torrisi, Steven; Lim, Jin Soo; Kozinsky, Boris; Frenkel, Anatoly | Here we present a method that combines the in situ measurements of X-ray absorption fine structure spectroscopy (XAFS) and catalytic activity, by way of neural network modeling, to create starting configurations for theoretical reaction modeling. | Session 1: AI Materials Design and Discovery |
3 | Accelerating Finite-Temperature Kohn-Sham Density Functional Theory with Deep Neural Networks | Cangi, Attila; Ellis, J. A.; Modine, Normand; Stephens, J. Adam; Thompson, Aidan; Rajamanickam, Sivasankaran | We present a numerical modeling workflow based on deep neural networks that reproduce spatially-resolved, energy-resolved, and integrated quantities of Kohn-Sham density functional theory at finite electronic temperature to within chemical accuracy. | Session 1: AI Materials Design and Discovery |
4 | Graph Neural Network for Metal Organic Framework Potential Energy Approximation: Energy Landscape Database and Rigidity | Owen, Christopher; Zaman, Shehtab | We conclude with implications of our work for the design of MOFs for application purposes. | Session 1: AI Materials Design and Discovery |
5 | Symmetry incorporated graph convolutional neural networks for solid-state materials | Gong, Weiyi; Bai, Hexin; Chu, Peng; Ling, Haibin; Yan, Qimin | In this talk, we will demonstrate the development of a graph convolutional neural network with global and local symmetries in both real and reciprocal spaces incorporated. | Session 1: AI Materials Design and Discovery |
6 | CCDCGAN: Inverse design of crystal structures | Long, Teng; Fortunato, Nuno; Zhang, Yixuan; Shen, Chen; Gutfleisch, Oliver; Zhang, Hongbin | We have developed constrained crystal deep convolutional generative adversarial networks (CCDCGAN), which can be used to design unreported (meta-)stable crystal structures using encoded 2D latent space. | Session 1: AI Materials Design and Discovery |
7 | Network-based representation and analysis of materials space | Veremyev, Alexander; Liyanage, Laalitha; Fornari, Marco; Boginski, Vladimir; Curtarolo, Stefano; Butenko, Sergiy; Buongiorno Nardelli, Marco | In this talk, we consider the problem of mapping and exploring the materials universe using network science tools and concepts. | Session 1: AI Materials Design and Discovery |
8 | Uncovering the Relationship Between Thermal Conductivity and Anharmonicity with Symbolic Regression | Purcell, Thomas; Scheffler, Matthias; Ghiringhelli, Luca; Carbogno, Christian | Here we present descriptors of κ based on our new measure of anharmonicity, σ_A [1]. | Session 1: AI Materials Design and Discovery |
9 | Enhanced Machine Learning Models for Structure-Property Mapping with Principal Covariates Regression | Cersonsky, Rose; Helfrecht, Benjamin; Fraux, Guillaume; Engel, Edgar; Ceriotti, Michele | Here we introduce a kernelized version of PCovR and demonstrate the performance of this approach in revealing and predicting structure-property relations in chemistry and materials science. | Session 1: AI Materials Design and Discovery |
10 | Graph Neural Network for Metal-Organic Framework Potential Energy Approximation | Zaman, Shehtab; Owen, Christopher; Chiu, Kenneth; Lawler, Michael | We propose a machine learning approach for estimating the potential energy of candidate MOFs, decomposing it into separate pair-wise atomic interactions using a graph neural network. | Session 1: AI Materials Design and Discovery |
11 | Towards Inverse Design of Metal-Organic Frameworks to Maximize Hydrogen Storage using Deep Learning | Phillips, Kevin; Zaman, Shehtab; Chiu, Kenneth; Lawler, Michael | We implement Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs) that utilize scaled-down voxel representations of real MOFs for the inverse design of new MOFs with maximal hydrogen adsorption. | Session 1: AI Materials Design and Discovery |
12 | Predicting geometric properties of metal-organic frameworks by fusing 3D and graph convolutional neural networks | Barkovitch, Jacob; Zhou, Musen; Zaman, Shehtab; Chiu, Kenneth; Lawler, Michael; Wu, Jianzhong | We propose a fusion model that combines a 3D convolutional neural network and a graph convolutional neural network to predict geometric properties of MOFs such as Henry’s constant, surface area, pore limiting diameter, and largest cavity diameter. | Session 1: AI Materials Design and Discovery |
13 | Generating Multiscale Amorphous Molecular Structures Using Deep Learning: A Study in 2D | Simine, Lena; Kilgour, Michael; Gastellu, Nicolas; Hui, David; Bengio, Yoshua | We present a method based on deep learning that leverages the finite range of structural correlations for an autoregressive generation of disordered molecular aggregates up to arbitrary size from small-scale computational or experimental samples. | Session 1: AI Materials Design and Discovery |
14 | Machine Learning and Evolutionary Prediction of Superhard B-C-N Compounds | Chen, Cheng-Chien; Chen, Wei-Chih; Vohra, Yogesh | We build random forests models to predict mechanical properties of a compound, using only its chemical formula as input. | Session 1: AI Materials Design and Discovery |
15 | Machine Learning Prediction of the Quasiparticle and Optical Gaps | Olson, Sydney; Biswas, Tathagata | Machine Learning Prediction of the Quasiparticle and Optical Gaps | Session 1: AI Materials Design and Discovery |
16 | Using machine learning to optimize optical response of all-dielectric core-shell nanoparticle | Hoxie, David; Bangalore, Purushotham; Appavoo, Kannatassen | Here we train a neural network and subsequently use it to compute Mie optical responses for multi-layer nanoparticles, consisting of amorphous silicon as the core and silicon dioxide as the coating. | Session 1: AI Materials Design and Discovery |
17 | A Novel Artificial Intelligence Platform Applied to the Generative Design of Polymer Dielectrics | Gurnani, Rishi; Kamal, Deepak; Tran, Huan; Ramprasad, Rampi | Here, we present a novel AI platform for the generative design of polymers and use it to discover promising dielectric materials. | Session 1: AI Materials Design and Discovery |
18 | Machine learning the molecular dipole moment with atomic partial charges and atomic dipoles | Veit, Max; Wilkins, David; Yang, Yang; Distasio, Robert; Ceriotti, Michele | We represent the dipole with a physically-inspired machine learning model that captures the two distinct physical effects contributing to molecular polarization: Local atomic polarization is captured within the symmetry-adapted Gaussian process regression (SA-GPR) framework, while long-range movement of charge is captured by assigning a scalar charge to each atom. | Session 1: AI Materials Design and Discovery |
19 | Machine learning as a solution to the electronic structure problem | Gonzalez, Beatriz; Ramprasad, Rampi | Here, we explore the applicability of this latter methodology using deep learning neural networks to learn and predict the electronic structure of carbon, for a large variety of allotropes [1], and its extension to hydrocarbon molecules and polymers. | Session 1: AI Materials Design and Discovery |
20 | Machine-learning-assisted prediction of the power conversion efficiencies of non-fullerene organic solar cells | Yoshimoto, Yuta; Kamijima, Chihiro; Takagi, Shu; Kinefuchi, Ikuya | We create a new dataset composed of over 1500 non-fullerene organic solar cells (NF-OSCs) by curating experimental data from recently published literature. | Session 1: AI Materials Design and Discovery |
21 | Predicting the Absorption Spectra of Azobenzene Dyes | Stanev, Valentin; Maehashi, Ryota; OHTA, YOSHIMI; Takeuchi, Ichiro | With the reduced set of predictors, we trained separate regression models to predict the absorption at different wavelengths in the UV – visible light range. | Session 1: AI Materials Design and Discovery |
22 | A Machine Learned Model for Solid Form Volume Estimation Based on Packing-Accessible Surface and Molecular Topological Fragments | Bier, Imanuel; Marom, Noa | We present a machine learned model for predicting the volume of a homomolecular crystal from the single molecule structure. | Session 1: AI Materials Design and Discovery |
23 | Predicting outcomes of catalytic reactions using machine learning | Rhone, Trevor; Hoyt, Robert; O’Connor, Christopher; Montemore, Matthew; Kumar, Challa; Friend, Cynthia; Kaxiras, Efthimios | In this talk we show that machine learning can be used to accurately predict the outcomes of catalytic reactions on the surface of oxygen-covered and bare gold in a database. | Session 1: AI Materials Design and Discovery |
24 | Optical engineering of carbon-based nanowires using machine learning | Shapera, Ethan; Heil, Christoph; Braeuninger-Weimer, Philipp | We demonstrate engineering of the optical response of graphene nanoribbons using density functional theory to compute bandgaps and dielectric functions and machine learning. | Session 1: AI Materials Design and Discovery |
25 | Machine Learning the Long-Time Dynamics of Spin Ice | Sherman, Kyle; Chatterjee, Snigdhansu; Karim, Rejaul; Mcilhany, Kevin; Pauluis, Olivier; Trinkle, Dallas; Lawler, Michael | In the interest of increasing the scale of simulations, we’ve implemented a convolutional neural network which contains no dense layers. | Session 1: AI Materials Design and Discovery |
26 | Machine-Learning Thermal Properties | Gaines II, Dale; Xia, Yi; Wolverton, Christopher | Here, we train a simple machine-learning model to efficiently predict the vibrational entropy and free energy of materials from composition alone. | Session 1: AI Materials Design and Discovery |
27 | Capturing and Leveraging Computational and Experimental Data in Materials Physics | Chan, Maria | In this talk, we will discuss efforts in generating, capturing, and leveraging computational and experimental data, with examples in generation of computational defect properties datasets, capturing microscopy data, and combining streams of computational and experimental data. | Session 1: AI Materials Design and Discovery |
28 | Physics-Informed Data-Driven Approach for Optimizing Electrocaloric Cooling | Gong, Jie; Mehta, Rohan; McGaughey, Alan | We build a random forest regression model on the data set. | Session 1: AI Materials Design and Discovery |
29 | First-Principles Prediction of Substrate Induced Changes in Layered Nanomaterials via Physics-Based Machine Learning | Neogi, Sanghamitra; Pimachev, Artem | It is highly desirable to formulate a method that is capable of learning the information from high-cost calculations and predicting the properties of a wide range of configurations. | Session 1: AI Materials Design and Discovery |
30 | Featureless adaptive optimization accelerates functional electronic materials design | Wang, Yiqun; Rondinelli, James | Electronic materials exhibiting multiple phase transitions between metastable states with distinct physical properties are challenging to decode using conventional machine learning methods owing to data scarcity and the absence of physically meaningful materials descriptors. | Session 1: AI Materials Design and Discovery |
31 | Benchmarking Coordination Number Prediction Algorithms on Inorganic Crystal Structures | Pan, Hillary; Ganose, Alex; Horton, Matthew; Aykol, Muratahan; Persson, Kristin; Zimmermann, Nils; Jain, Anubhav | Apart from performance on the benchmark, we provide other analyses that may be important for implementation of these algorithms such as computational demand and sensitivity towards small perturbations that mimic thermal motion. | Session 1: AI Materials Design and Discovery |
32 | Prediction of atomization energies using entropic data representation and machine learning | De La Rosa, Michael; Munoz, Jorge | We introduce a new method of data representation using a novel information entropy metric that is unaffected by the size or order of the Coulomb matrix. | Session 1: AI Materials Design and Discovery |
33 | Highly Accurate Machine Learning Point Group Classifier for Crystals | Alsaui, Abdulmohsen; Alqahtani, Saad; Mumtaz, Faisal; Alsayoud, Ibrahim; Al Ghadeer, Mohammed; Muqaibel, Ali; Rashkeev, Sergey; Baloch, Ahmer; Alharbi, Fahhad | In this work, the first step is to generate a space of all possible ternary compounds based on the common and uncommon oxidation states of 77 elements. | Session 1: AI Materials Design and Discovery |
34 | CRYSPNet: Machine Learning Tool for Crystal Structure Predictions | liang, haotong; Stanev, Valentin; Kusne, Aaron; Takeuchi, Ichiro | As an alternative, we developed a tool, CRYSPNet, that can predict the Bravais lattice, space group, and lattice parameters of a material based on its chemical formula. | Session 1: AI Materials Design and Discovery |
35 | Machine learning materials properties for small datasets | De Breuck, Pierre-Paul; Hautier, Geoffroy; Rignanese, Gian-Marco | In this work, a novel all-round framework is presented which relies on a feedforward neural network and the selection of physically-meaningful features. | Session 1: AI Materials Design and Discovery |
36 | Identifying "materials genes" by symbolic regression: The hierarchical SISSO approach | Foppa, Lucas; Purcell, Thomas; Levchenko, Sergey V.; Scheffler, Matthias; Ghiringhelli, Luca | In particular, we discuss a new strategy for discovering more complicated relationships between the features and properties by exploiting the learning of simpler, but related properties: the hierarchical sure-independence screening and sparsifying operator (hiSISSO) approach. | Session 1: AI Materials Design and Discovery |
37 | A massive dataset of synthesis-friendly hypothetical polymers | Rajan, Arunkumar; Kim, Chiho; Kuenneth, Christopher; Kamal, Deepak; Gurnani, Rishi; Batra, Rohit; Ramprasad, Rampi | It aims to build data-driven models to instantaneously predict the properties of polymers, and use this capability to screen a large candidate set of polymers to identify promising ones based on their predicted properties. | Session 1: AI Materials Design and Discovery |
38 | Bayesian Optimization Approach for Discovery of High-Capacity Small-Molecule Adsorption in Metal-Organic Frameworks | Taw, Eric; Neaton, Jeffrey | Using ~51,000 hypothetical MOF structures and data calculated from [1] for CH4, we show it is possible to identify candidates for high-performance CH4 adsorbents by calculating uptake capacities for <1% of the database using Bayesian optimization. | Session 1: AI Materials Design and Discovery |
39 | Data-driven studies of the magnetic anisotropy of two-dimensional magnetic materials | Xie, Yiqi; Rhone, Trevor; Tritsaris, Georgios; Grånäs, Oscar; Kaxiras, Efthimios | Our data-driven study aims to uncover physical insights into the microscopic origins of magnetism in reduced dimensions and to demonstrate the success of a high-throughput computational approach for the targeted design of quantum materials with potential applications from sensing to data storage. | Session 1: AI Materials Design and Discovery |
40 | Learning about learning by many-body systems | Yunger Halpern, Nicole | Learning about learning by many-body systems | Session 2: AI and Statistical/Thermal Physics |
41 | Can artificial intelligence learn and predict molecular dynamics? | Tiwary, Pratyush | In this talk we draw parallels between such tasks, and the efficient sampling of complex molecules with hundreds of thousands of atoms. | Session 2: AI and Statistical/Thermal Physics |
42 | Optimal machine intelligence near the edge of chaos | Feng, Ling; Zhang, Lin; Lai, Choy Heng | We develop a general theory that reveals the exact edge of chaos for generic non-linear systems is the boundary between the chaotic phase and the (pseudo)periodic phase arising from Neimark-Sacker bifurcation. | Session 2: AI and Statistical/Thermal Physics |
43 | Using learning by confusion to identify the order of a phase transition | Richter-Laskowska, Monika; Maska, Maciej | For a few selected models we demonstrate how this method can be used to distinguish between first and second order phase transitions. | Session 2: AI and Statistical/Thermal Physics |
44 | Asymptotic stability of the neural network and its generalization power | Zhang, Lin; Feng, Ling; Chen, Kan; Lai, Choy Heng | Based on this, we propose a method to calculate a lower bound for the regularization strength which could maintain the model at the boundary of stability. | Session 2: AI and Statistical/Thermal Physics |
45 | Renormalized Mutual Information for Artificial Scientific Discovery | Sarra, Leopoldo; Aiello, Andrea; Marquardt, Florian | We develop a new “renormalized” version, with the same physical meaning but finite. | Session 2: AI and Statistical/Thermal Physics |
46 | How neural nets compress invariant manifolds | Paccolat, Jonas; Petrini, Leonardo; Geiger, Mario; Tyloo, Kevin; Wyart, Matthieu | We study how neural networks compress uninformative input space in models where data lie in d dimensions, but whose labels vary only within a linear manifold of dimension d_p < d. | Session 2: AI and Statistical/Thermal Physics |
47 | Perturbation Theory for the Information Bottleneck | Ngampruetikorn, Vudtiwat; Schwab, David | Here we derive a perturbation theory for the IB method and report new analytical results for the learning onset – the limit of maximum relevant information per bit extracted from data. | Session 2: AI and Statistical/Thermal Physics |
48 | Real-space mutual information neural estimation algorithm for single-step extraction of renormalisation group-relevant degrees of freedom | Gokmen, Doruk Efe; Ringel, Zohar; Huber, Sebastian; Koch-Janusz, Maciej | We develop an efficient numerical algorithm based on recent rigorous results on mutual information estimation with neural networks. | Session 2: AI and Statistical/Thermal Physics |
49 | Deep learning in phase transition prediction of disordered materials | Kamrava, Serveh; Sahimi, Muhammad | We present a deep neural network (DNN) for predicting such properties of two- and three-dimensional systems and in particular their percolation probability, the threshold p_c. All the predictions are in excellent agreement with the data. | Session 2: AI and Statistical/Thermal Physics |
50 | Integrating machine learning and multiscale modeling: perspectives, challenges, and opportunities in the biological, biomedical, and behavioral sciences | Kuhl, Ellen | Prof. Ellen Kuhl has been the champion in integrating machine learning techniques with multiscale physical modeling, particularly in the area of biomedicine. | Session 3: Artificial Intelligence, Machine Learning, and Data Science in Medicine and Biomedicine |
51 | Comparison of statistical parametric mapping method and scaled subprofile model for functional neuroimage analysis | hocurscak, Lara; Tomanic, Tadej; Trost, Maja; Simoncic, Urban | We developed an analytical relation between the SPM and SSM/PCA results. | Session 3: Artificial Intelligence, Machine Learning, and Data Science in Medicine and Biomedicine |
52 | A Deep Learning Network for Disease Classification with Longitudinal Data | Deatsch, Alison; Jeraj, Robert | This work aims to develop a deep learning model to predict disease status and investigate the influence of longitudinal data on model performance. | Session 3: Artificial Intelligence, Machine Learning, and Data Science in Medicine and Biomedicine |
53 | Developing Dynamical Models to Characterize Stroke Gait Impairments | Winner, Taniel; Kesar, Trisha; Berman, Gordon; Ting, Lena | Here, we present a Recurrent Neural Network (RNN)-based model that produces a robust kinematic gait signature for visualizing and comparing gait kinematics between able-bodied individuals and stroke-survivors. | Session 3: Artificial Intelligence, Machine Learning, and Data Science in Medicine and Biomedicine |
54 | The Predictive Value of Deep Learning and Radiomics in Medical Imaging | Kalpathy-Cramer, Jayashree | The Predictive Value of Deep Learning and Radiomics in Medical Imaging | Session 3: Artificial Intelligence, Machine Learning, and Data Science in Medicine and Biomedicine |
55 | Estimation of Radiobiological Indices in Radiotherapy of Lung Cancer using an Artificial Neural Network | Pudasaini, Mukunda; Leventouri, Theodora; Pella, Silvia; Muhammad, Dr. Wazir | The purpose of this study is to develop an artificial neural network (ANN) to predict radiobiological indices in radiotherapy of lung cancer. | Session 3: Artificial Intelligence, Machine Learning, and Data Science in Medicine and Biomedicine |
56 | Superspreading k-cores at the center of COVID-19 pandemic persistence | serafino, matteo; S. Monteiro, Higor; Luo, Shaojun; Reis, Saulo D. S.; Igual, Carles; S. Lima Neto, Antonio; Travizan, Matias; Soares De Andrade Jr, Jose; Makse, Hernan | Here, we implement a comprehensive contact tracing network analysis to find the optimal quarantine protocol to dismantle the chain of transmission of coronavirus with minimal disruptions to society. | Session 3: Artificial Intelligence, Machine Learning, and Data Science in Medicine and Biomedicine |
57 | Framework for Assessing the Impact of CNN-based Image Segmentation on Multi-step Biomarker Extraction | Huff, Daniel; Klanecek, Zan; Studen, Andrej; Jeraj, Robert | Here, we describe a framework for assessing CNN impact on final biomarker extraction in the clinical application of detecting colitis on 18F-FDG PET/CT. | Session 3: Artificial Intelligence, Machine Learning, and Data Science in Medicine and Biomedicine |
58 | Autonomous Materials Research and Discovery at the Beamline | Kusne, Aaron; McDannald, Austin; DeCost, Brian; Mehta, Apurva; Takeuchi, Ichiro | In this talk we will discuss autonomous systems being developed at NIST with a particular focus on autonomous control of X-ray diffraction and neutron scattering for materials characterization, exploration and discovery. | Session 4: Autonomous Systems and Control |
59 | Autonomous Nanocars based on Reinforcement Learning | Ramsauer, Bernhard R.; Hofmann, Oliver T.; Simpson, Grant J.; Grill, Leonhard | Here, we show how an artificial intelligence (AI) based on reinforcement learning (RL) can be implemented to manipulate single molecules. | Session 4: Autonomous Systems and Control |
60 | Machine Learning and Reinforcement Learning for Automated Experimentation and Materials Synthesis | Vasudevan, Rama; Jesse, Stephen; Ziatdinov, Maxim; Kelley, Kyle; Funakubo, Hiroshi; Ghosh, Ayana; Kalinin, Sergei | Here, we will focus on the applications and developments of these methods as pertaining to physical sciences with several textbook cases. | Session 4: Autonomous Systems and Control |
61 | SciAI for Grain Mapping with Electron Backscatter Diffraction: Leveraging Physics-Based Constraints and Uncertainty Propagation | McDannald, Austin; Rohrer, Gregory; Verma, Amit; Lee, Sukbin; Kusne, Aaron | Here we present how material science knowledge can be encoded into active learning frameworks to efficiently navigate such search spaces. | Session 4: Autonomous Systems and Control |
62 | Using Reinforcement Learning to Optimize Crystal Structure Determination | Ratcliff, William; Kienzle, Paul; Meuse, Kate; Opsahl-Ong, Jessica; Cho, Ryan; Rath, Joseph; Wilson, Abigail; Yan, Telon | We compare several approaches within this framework including epsilon-greedy, Q-learning, and actor-critic. | Session 4: Autonomous Systems and Control |
63 | Active learning of Bayesian force fields at quantum accuracy for fast molecular dynamics simulations of rare events. | Kozinsky, Boris | We develop ML interatomic potential models that are interpretable and uncertainty-aware, and orders of magnitude faster than reference quantum methods. | Session 4: Autonomous Systems and Control |
64 | Towards Secure and Interpretable AI: Scalable Methods, Interactive Visualizations, and Practical Tools | Chau, Polo | We present our joint works with Intel which include the first targeted physical adversarial attack (ShapeShifter) that fools state-of-the-art object detectors; a fast defense (SHIELD) that removes digital adversarial noise by stochastic data compression; and interactive systems (ADAGIO and MLsploit) that further democratize the study of adversarial machine learning and facilitate real-time experimentation for deep learning practitioners. | Session 5: Data Science Platforms: Algorithms and Visualization |
65 | Visualizing multiparameter probabilistic models in Minkowski space | Kheng, Han; Griniasty, Itay; Quinn, Katherine; Kent-Dobias, Jaron; Clement, Colin; Xu, Qingyang; Zheng, Jingyang; Roeser, Andrea; Sethna, James; Cohen, Itai; Goldberg, Jesse H. | In this talk, we will showcase how this technique can be combined with a probabilistic neural network to study cartilage tissue and bird song data. | Session 5: Data Science Platforms: Algorithms and Visualization |
66 | High dimensional model representation with machine-learned component functions: a powerful tool to learn multivariate functions from sparse data | Boussaidi, Mohamed Ali; Ren, Owen; Voytsekhovsky, Dmitry; Manzhos, Sergei | Specifically, here we present an HDMR-GPR combination where the use of GPR to represent component functions allows a nonparametric (unbiased) representation and the possibility to work only with functions of the desired dimensionality, obviating the need to build an expansion over orders of coupling. | Session 5: Data Science Platforms: Algorithms and Visualization |
67 | Enhancing searches for resonances with robust classifiers using moment decomposition | Kitouni, Ouail; Nachman, Benjamin; Weisser, Constantin; Williams, Mike | We develop a new set of tools using a novel moment loss function (Moment Decomposition or MoDe) which relax the assumption of independence without creating structures in the background. | Session 5: Data Science Platforms: Algorithms and Visualization |
68 | Matplotlib and Scientific Visualization | Caswell, Thomas | This talk will highlight some of the key features of the library, focusing on examples of interactive multi-scale visualizations and tuning figures in preparation for publication. | Session 5: Data Science Platforms: Algorithms and Visualization |
69 | Mode-Assisted Joint Training of Deep Boltzmann Machines | Manukian, Haik; Di Ventra, Massimiliano | A recent technique we have proposed [1], called mode-assisted training, has shown success in improving the unsupervised training of RBMs. | Session 5: Data Science Platforms: Algorithms and Visualization |
70 | The Fully-Automated Nanoscale To Atomistic Structures from Theory and eXperiment (FANTASTX) code | Kolluru, Venkata Surya Chaitanya; Schwenker, Eric; Chan, Maria | In this talk, we will discuss the modular FANTASTX (Fully Automated Nanoscale To Atomistic Structures from Theory and eXperiments) toolkit which explores the potential energy landscape for low energy structures that also match with experimental data. | Session 5: Data Science Platforms: Algorithms and Visualization |
71 | Python Software for Multimodal Optimization of X-ray Reflectivity Data using First Principles Theory | Cheung, Nicholas; Chan, Maria; Fong, Dillon; Letchworth-Weaver, Kendra | Diffraction-based experimental techniques like X-ray Reflectivity (XRR) determine the distribution of electrons at the surface of a crystalline solid but inverting this data to obtain the atomic structure of the surface is a challenge. | Session 5: Data Science Platforms: Algorithms and Visualization |
72 | ParaMonte – A cross-platform parallel scalable high-performance Monte Carlo optimization, sampling, and integration library in C, C++, Fortran, MATLAB, Python, and R | Kumbhare, Shashank; Bagheri, Fatemeh; Osborn, Joshua; Shahmoradi, Amir | Here, we present the ParaMonte software, a suite of parallel Monte Carlo optimization, sampling, & integration algorithms for Bayesian inference problems. | Session 5: Data Science Platforms: Algorithms and Visualization |
73 | Deep Learning for Dynamical Systems | Brunton, Steven | In this talk, we will discuss several deep learning approaches to simultaneously discover coordinate transformations and parsimonious models of the dynamics. | Session 6: Data Science for Dynamical Systems and Real World Networks |
74 | The spectra of small-world random networks | Larson, Elizabeth; Kirst, Christoph; Vucelja, Marija | Here we present a practical tool to verify whether a network can be considered a small-world based on its eigenvalue spectrum properties. | Session 6: Data Science for Dynamical Systems and Real World Networks |
75 | Machine Learning for Partial Differential Equations | Brenner, Michael | I will discuss several ways in which machine learning can be used for solving and understanding the solutions of nonlinear partial differential equations. | Session 6: Data Science for Dynamical Systems and Real World Networks |
76 | Digital Twin: A Theorist’s Playground for APXPS and Surface Science | Qian, Jin; Crumlin, Ethan | As daunting as it sounds, I will explain the challenges along with the milestones: 1) developing physically accurate quantum chemistry methods that improve the numerical accuracy of XPS binding energy (BE) calculations; 2) realizing that a central piece of the chemical reaction network (CRN) is universal in the chemical systems of interest, such as reactors and heterogeneous catalysis; 3) sharing a user-friendly, natural chemical language syntax Digital Twin v.01 software package, for which we welcome collaboration and feedback in any form. | Session 6: Data Science for Dynamical Systems and Real World Networks |
77 | TBA | Gonzalez, Marta | TBA | Session 6: Data Science for Dynamical Systems and Real World Networks |
78 | Cascading Failure From Targeted Road Network Disruptions | Vivek, Skanda | Guided by microscopic traffic simulations, we develop a theoretical framework for predicting the growth in cascading traffic jams around disruptions. | Session 6: Data Science for Dynamical Systems and Real World Networks |
79 | Experimental Realization of Reservoir Computing with Wave Chaotic Systems | Ma, Shukai; Antonsen, Thomas; Ott, Edward; Anlage, Steven | We propose unique techniques to create virtual RC nodes by both spectral and spatial perturbation. | Session 6: Data Science for Dynamical Systems and Real World Networks |
80 | Investment vs. reward in a competitive knapsack problem | Neumann, Oren; Gros, Claudius | Our goal is to investigate the balance between the metabolic costs of larger brains compared to the advantage they provide in solving general and combinatorial problems. | Session 6: Data Science for Dynamical Systems and Real World Networks |
81 | Can (Almost) Unsupervised Artificial Intelligence Learn Chemistry and Physics from Microscopic Observations? | Kalinin, Sergei | In this presentation, I will discuss several applications of autoencoders and variational autoencoders for the analysis of image and spectral data in STEM and SPM. | Session 7: Deep Learning and Computer Vision |
82 | RG-Flow: A hierarchical and explainable flow model based on renormalization group and sparse prior | Hu, Hong-Ye; Wu, Dian; You, Yizhuang; Olshausen, Bruno; Chen, Yubei | In this work, we incorporate the key ideas of the renormalization group (RG) and a sparse prior distribution to design a hierarchical flow-based generative model, called RG-Flow, which can separate information at different scales of an image with disentangled representations at each scale. | Session 7: Deep Learning and Computer Vision |
83 | High throughput detection and quantification of Giardia lamblia cysts using holographic imaging flow-cytometry and deep learning | Gorocs, Zoltan; Baum, David; Song, Fang; de Haan, Kevin; Ceylan Koydemir, Hatice; Qui, Yunzhe; Cai, Zilin; Skandakumar, Thamira; Peterman, Spencer; Tamamitsu, Miu; Ozcan, Aydogan | To provide a cost-effective water screening tool, we created a field-portable holographic imaging flow-cytometer that can acquire in-focus phase and amplitude images of microscopic objects in water samples with a half-pitch resolution of <2µm and a liquid throughput of 100 mL/h. | Session 7: Deep Learning and Computer Vision |
84 | Machine learning the Biot-Savart law from quantum sensor data | Ku, Mark; Turner, Matthew; Bhutto, Danyal; Zhu, Bo; Rosen, Matthew; Walsworth, Ronald | We use a supervised neural network to reconstruct current distributions from magnetic field maps provided by a quantum diamond microscope (QDM). | Session 7: Deep Learning and Computer Vision |
85 | Trainable Diffractive Surfaces for Spectral Encoding of Spatial Information | Li, Jingxi; Mengu, Deniz; Yardimci, Nezih; Luo, Yi; Li, Xurong; Veli, Muhammed; Rivenson, Yair; Jarrahi, Mona; Ozcan, Aydogan | We demonstrate a deep-learning based single-pixel optical machine vision framework, where multiple diffractive surfaces are used to transform and encode the spatial information of objects into the power spectrum of the diffracted light. | Session 7: Deep Learning and Computer Vision |
86 | Ensemble learning enhances the inference accuracy of diffractive deep neural networks | Rahman, Md Sadman Sakib; Li, Jingxi; Mengu, Deniz; Rivenson, Yair; Ozcan, Aydogan | Here, we demonstrate the use of ensemble learning and feature engineering to significantly improve the inference performance of diffractive optical systems for object recognition. | Session 7: Deep Learning and Computer Vision |
87 | Misalignment Insensitive Diffractive Optical Networks | Mengu, Deniz; Zhao, Yifan; Yardimci, Nezih; Rivenson, Yair; Jarrahi, Mona; Ozcan, Aydogan | Here, we demonstrate a new training scheme that formulates the layer-to-layer misalignments and fabrication artefacts through continuous random variables embedded into the forward training model enabling accurate optical inference over a large range of physical misalignments. | Session 7: Deep Learning and Computer Vision |
88 | Terahertz Pulse Engineering Using Diffractive Optical Networks | Veli, Muhammed; Mengu, Deniz; Yardimci, Nezih; Luo, Yi; Li, Jingxi; Rivenson, Yair; Jarrahi, Mona; Ozcan, Aydogan | Deep learning is driving a new transformation in optics by providing non-intuitive solutions to a diverse set of problems. | Session 7: Deep Learning and Computer Vision |
89 | Increased Computation Speed of Neural Network-Aided Computer Vision Via Coded Diffraction of Off-Axis Optical Vortices | Perry, Altai; Muminov, Baurzhan; Vuong, Luat | We establish a preprocessing technique that uses coded diffraction of off-axis optical vortices of varying topologies in the Fourier domain to improve the performance of quick Dense Neural Networks. | Session 7: Deep Learning and Computer Vision |
90 | Early detection and classification of live bacteria using holography and deep learning | Wang, Hongda; Ceylan Koydemir, Hatice; Qiu, Yunzhe; Bai, Bijie; Zhang, Yibo; Jin, Yiyin; Tok, Sabiha; Yilmaz, Enis; Gumustekin, Esin; Rivenson, Yair; Ozcan, Aydogan | Here we present a live bacteria detection system that captures time-lapse holographic images of a 60 mm-diameter agar plate followed by differential image analysis and deep neural network-based processing for specific and sensitive detection of bacterial growth and classification of the growing species. | Session 7: Deep Learning and Computer Vision |
91 | Paraphrasing Francis Crick: If you want to understand structure, study spectrum | Frenkel, Anatoly | Here we report on the use of X-ray absorption near edge structure (XANES) spectroscopy and supervised machine learning for investigating the information content “hidden” in the spectra. | Session 8: Deep Learning for Spectroscopy |
92 | Latent space interpretation of X-ray absorption fine structure spectra by an autoencoder approach | Liu, Yang; Routh, Prahlad; Marcella, Nicholas; Frenkel, Anatoly | In this work, we applied supervised and unsupervised machine learning approaches to perform quantitative analysis of structural descriptors and to explore which XANES features are embedded in a “bottleneck” representation. | Session 8: Deep Learning for Spectroscopy |
93 | Probabilistic generative models for latent representation learning of X-ray absorption fine structure (XAFS) spectra | Routh, Prahlad K.; Liu, Yang; Marcella, Nicholas; Frenkel, Anatoly | In this work, we will underscore the importance of applying probabilistic generative models to analyze X-ray absorption fine structure (XAFS) data and develop pathways for good latent representations. | Session 8: Deep Learning for Spectroscopy |
94 | Mapping Atomic Structures and X-ray Absorption Spectra using First Principles Computations and Machine Learning | Mannodi Kanakkithodi, Arun Kumar; Pothoof, Justin; Stegmann, Amy; Wang, Xinyue; Hsiao, Yu-Hsuan; Rojsatien, Srisuda; Chen, Yiming; Bertoni, Mariana; Chan, Maria | In this work, we used first principles computations to generate Cu and As K-edge XANES data for point defects and defect complexes in bulk and grain boundary structures of CdTe, as well as various relevant compounds of the impurity atoms, with the idea of capturing the structural diversity likely to be found in the solar cell material. | Session 8: Deep Learning for Spectroscopy |
95 | Revealing the correlated phonon properties in Raman spectra of graphene using machine learning | Chen, Zhuofa; Swan, Anna | Here we use machine learning techniques to find and study the relevant correlation of Raman spectral parameters, and reveal the properties of graphene in different dielectric environments. | Session 8: Deep Learning for Spectroscopy |
96 | Generation of Synthetic XPS spectra for Neural Network Quantification of RHEED Data of Complex Oxides | Demos, Michael; Provence, Sydney; Paudel, Rajendra; Comes, Ryan; Drera, Giovanni | Generation of Synthetic XPS spectra for Neural Network Quantification of RHEED Data of Complex Oxides | Session 8: Deep Learning for Spectroscopy |
97 | Big data spectromicroscopy: achieving new observables in ARPES from 2D surface maps | Kotta, Erica; Miao, Lin; Xu, Yishuai; Breitweiser, Stanley; Jozwiak, Chris; Bostwick, Aaron; Rotenberg, Eli; Zhang, Wenhan; Wu, Weida; Suzuki, Takehito; Checkelsky, Joseph; Wray, Lewis | In this talk, I will describe a data acquisition and analysis framework termed sparse big data (SBD) spectroscopy, in which one rapidly maps a sample surface to resolve the variation of electronic structure as a function of local environment. | Session 8: Deep Learning for Spectroscopy |
98 | AI assisted analysis of x-ray spectra | Suram, Santosh; Torrisi, Steven; Hung, Linda; Carbone, Matthew; Gregoire, John; Gomes, Carla; Yano, Junko | By combining random-forest models and physically meaningful featurizations, we show that we can automatically capture coordination number, Bader charge, and nearest-neighbor distances. | Session 8: Deep Learning for Spectroscopy |
99 | Machine-learning assisted identification of atomic properties from X-ray spectroscopy | Chen, Yiming; Chen, Chi; Sun, Chengjun; Heald, Steve; Chan, Maria; Ong, Shyue Ping | We will discuss how machine learning models are used to extract those properties from X-ray spectra. | Session 8: Deep Learning for Spectroscopy |
100 | Machine-Learning X-Ray Absorption Spectra to Quantitative Accuracy | Lu, Deyu; Carbone, Matthew; Topsakal, Mehmet; Yoo, Shinjae | As a proof of principle, we demonstrate that graph-based neural networks can be used to predict the x-ray absorption near-edge structure spectra of molecules to quantitative accuracy. | Session 8: Deep Learning for Spectroscopy |
101 | Predicting Density Functional Theory-Quality Nuclear Magnetic Resonance Chemical Shifts via Δ-Machine Learning | Unzueta, Pablo; Greenwell, Chandler; Beran, Gregory | The present study demonstrates how much higher accuracy chemical shieldings can be obtained via a Δ-ML approach. | Session 8: Deep Learning for Spectroscopy |
102 | Neural Network Ab-initio Molecular Dynamics (NNAIMD) for Water and Covalent Glasses | Nomura, Ken-ichi; Baradwaj, Nitish; Fukushima, Shogo; Kalia, Rajiv; Krishnamoorthy, Aravind; Mishra, Ankit; Nakano, Aiichiro; Rajak, Pankaj; Shimamura, Kohei; Shimojo, Fuyuki; Vashishta, Priya | In this talk, I will discuss our recent progress and applications to water and medium range order in covalent glasses systems. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
103 | Tensor-Field Molecular Dynamics – A Highly Accurate and Data-Efficient Interatomic Potential from SE(3)-equivariant Graph Neural Networks | Batzner, Simon; Smidt, Tess; Sun, Lixin; Mailoa, Jonathan; Kornbluth, Mordechai; Kozinsky, Boris | We present Tensor-Field Molecular Dynamics (TFMD), a novel Deep Learning Interatomic Potential for accelerating Molecular Dynamics simulations. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
104 | Thermodynamic properties by on-the-fly machine-learned interatomic potentials: thermal transport and phase transitions of zirconia | Verdi, Carla; Karsai, Ferenc; Jinnouchi, Ryosuke; Kresse, Georg | Here we employ a recently developed on-the-fly learning technique based on molecular dynamics and Bayesian regression [1] in order to generate an interatomic potential capable of describing the thermodynamic properties of the prototypical transition metal oxide ZrO2. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
105 | Neural network molecular dynamics of ferroelectric domain boundary | Aditya, Anikeya; Nomura, Ken-ichi; Linker, Thomas; Kalia, Rajiv; Krishnamoorthy, Aravind; Nakano, Aiichiro; Shimamura, Kohei; Shimojo, Fuyuki; Tiwari, Subodh; Vashishta, Priya | In this study, I will discuss the development of NNAIMD force field to study PTO crystal, along with simulation results on complex DW dynamics using it. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
106 | Accurate and Efficient ML Force Fields for Hundreds of Atoms | Chmiela, Stefan; Vassilev Galindo, Valentin; Sauceda, Huziel; Muller, Klaus-Robert; Tkatchenko, Alexandre | To overcome this limitation, we develop an efficient iterative, parameter-free solver to train symmetric gradient domain machine learning (sGDML) [Chmiela et al., 2018] potentials for systems with several hundred atoms. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
107 | A Fourth-Generation High-Dimensional Neural Network Potential with Accurate Electrostatics Including Non-local Charge Transfer | Finkler, Jonas; Ko, Tsz; Goedecker, Stefan A; Behler, Jorg | The method’s significantly improved description of the potential energy surface substantially extends the applicability of modern machine learning potentials. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
108 | BIGDML: Efficient Gradient-Domain Machine Learning Force Fields for Materials | Sauceda, Huziel; Gálvez-González, Luis; Chmiela, Stefan; Paz-Borbón, Lauro; Muller, Klaus-Robert; Tkatchenko, Alexandre | Here we introduce Bravais-Inspired GDML[1,2] (BIGDML) model, with which we are able to construct meV-accurate force fields for materials using a training set with just 10-100 geometries. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
109 | The effects of different exchange and correlation functionals on Neural Networks for water | Pedroza, Luana; Torres, Alberto; Rocha, Alexandre | In this work we obtain neural-network-trained force fields that are accurate at the level of Density Functional Theory (DFT). | Session 9: Emerging Trends in MD Simulations and Machine Learning |
110 | Quantum parallel algorithm for the thermal canonical ensemble | Iitaka, Toshiaki | In this talk, I propose an algorithm ( https://arxiv.org/abs/2006.14459 ) that is embarrassingly parallel and expected to work extremely efficiently on massively parallel classical supercomputers such as Fugaku, where tensor network representation [1] is introduced as the solution of the first difficulty, and quantum parallelization of the METTS algorithm [2] using the random state method [3] is introduced as the solution of the second difficulty. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
111 | An adaptive-mesh, GPU-accelerated, and optimally error-controlled special relativistic hydrodynamics code | Tseng, Po-Hsun; Schive, Hsi-Yu; Chiueh, Tzihong | An adaptive-mesh, GPU-accelerated, and optimally error-controlled special relativistic hydrodynamics code | Session 9: Emerging Trends in MD Simulations and Machine Learning |
112 | A Numerical Code for Automated Calculation of Coarse-Grained Potentials using the Iterative Boltzmann Inversion (IBI) Method | Johnson, Lilian; Hoang, T.; Phelan, Frederick | We report here on a software code which automates the development of coarse-grained potentials using IBI. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
113 | Differentiable Molecular Simulations | Wang, Wujie; Axelrod, Simon; Gomez-Bombarelli, Rafael | The applications we present include solving the inverse structure elucidation problem from experimental observations and parameterizing control protocols for non-equilibrium chemical dynamics. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
114 | Machine learning dielectric constant of water in a large pressure-temperature range | RUI, HOU; YUHUI, QUAN; Pan, Ding | Here, we built a neural network dipole model, which can be combined with molecular dynamics to compute P-T dependent dielectric properties of water as accurately as first-principles methods but much more efficiently. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
115 | Backmapping of Equilibrated Condensed-Phase Molecular Structures with Generative Adversarial Networks | Stieffenhofer, Marc; Wand, Michael; Bereau, Tristan | In this study we introduce DeepBackmap: A deep neural network based approach to directly predict equilibrated molecular structures for condensed-phase systems. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
116 | Frequency dependence of W made simple using a multi-pole approximation | Leon Valido, Dario; Cardoso, Claudia; Varsano, Daniele; Molinari, Elisa; Ferretti, Andrea | In this work we explore a multi-pole approach for getting an effective representation of W. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
117 | Learning electron densities in condensed-phase space | Lewis, Alan; Grisafi, Andrea; Ceriotti, Michele; Rossi, Mariana | In this work, we present a model that is able to learn and predict the electronic density of diverse materials, ranging from liquids to solid semiconductors and metals. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
118 | Biased, efficient sampling of polymer conformations using Brownian bridges | Narsimhan, Vivek; WANG, Shiyan; Ramkrishna, Doraiswami | In this talk, we introduce a mathematical concept known as a Brownian bridge, and describe how it can be utilized in many areas of polymer physics. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
119 | Flexible Molecules Need More Flexible Machine Learning Force Fields | Vassilev Galindo, Valentin; Cordeiro Fonseca, Grégory; Poltavskyi, Igor; Tkatchenko, Alexandre | To resolve this, we propose moving from learning the entire PES within a single ML model to the employment of local models that are combined into a global force field. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
120 | Network structure of non-equilibrium quantum transport models. | Poteshman, Abigail; Bassett, Lee; Bassett, Danielle | In this work, we construct networks representing the energy landscape of non-equilibrium transport through quantum antidots—an example of an open, many-body quantum system—corresponding to two distinct models of internal quantum states: a single-particle, non-interacting model and a mean-field model including interactions. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
121 | Two-tier machine learning acceleration of molecular dynamics with enhanced sampling: surface reactions and restructuring on metal catalysts | Sun, Lixin; Batzner, Simon; Che, Wei; Lim, Jin Soo; Xie, Yu; Torrisi, Steven; Vandermause, Jonathan; Kozinsky, Boris | To solve this problem, we introduce a two-tier machine learning approach to accelerate MD simulations. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
122 | Generalization of SNAP to arbitrary machine-learning interatomic potentials in LAMMPS | Thompson, Aidan | I will discuss the underlying algorithms and describe some interesting applications. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
123 | Quantum Dynamics Made Fast: Achieving Linear Time-Scaling for Nonequilibrium Green Functions | Schlünzen, Niclas; Joost, Jan-Philip; Bonitz, Michael | Among others, the nonequilibrium Green functions (NEGF) method has proven to be a powerful tool to reliably predict the quantum dynamics, without being limited to 1D or spatially homogeneous systems. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
124 | Modeling the Dynamics of Complex Energy Materials with Machine Learning | Artrith, Nongnuch | Here, I will give an overview of recent methodological advancements of ML potentials based on artificial neural networks (ANNs) [1-5] and applications of the method to challenging materials classes including metal and oxide nanoparticles and amorphous phases. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
125 | Accurate many-body repulsive potentials for density-functional tight binding from deep tensor neural networks | Medrano Sandonas, Leonardo; Stoehr, Martin; Tkatchenko, Alexandre | Hence, we combine DFTB with deep tensor neural networks (DTNN) to maximize the strengths of both approaches. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
126 | Improving Molecular Force Fields Across Configurational Space by Combining Supervised and Unsupervised Machine Learning | Cordeiro Fonseca, Grégory; Poltavskyi, Igor; Vassilev Galindo, Valentin; Tkatchenko, Alexandre | To bypass this issue, we combine unsupervised and supervised ML methods: (I) we cluster CS into subregions similar in terms of geometry and energetics, (II) we iteratively test an MLFF model on each subregion and expand the training set to flatten the prediction accuracy across CS. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
127 | Central Moment Lattice Boltzmann Schemes with Fokker-Planck Guided Collision for Simulation of Multiphase Flows with Surfactant Effects and Turbulence | Schupbach, William; Premnath, Kannan | We present central moment lattice Boltzmann (LB) schemes, whose collision steps are represented by a novel Fokker-Planck (FP) kinetic model, for computations of multiphase hydrodynamics, interface tracking and surfactant evolution. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
128 | Machine learning assisted interatomic and electronic structure models for molecular simulation | Zhang, Linfeng | We introduce a machine learning (ML)-based framework for building interatomic and electronic structure models following two general principles: 1) ML-based models should respect important physical constraints in a faithful and adaptive way; 2) to build truly reliable models, efficient algorithms are needed to explore relevant physical space and construct optimal training data sets. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
129 | WannierBerri code: High performance Wannier interpolation of Berry curvature and related quantities. | Liu, Xiaoxiong; Ghim, Minsu; Lenggenhager, Patrick; Jiménez Herrera, Miguel Ángel; Robredo, Iñigo; Ryoo, Ji Hoon; Lihm, Jae-Mo; Park, Cheol-Hwan; Souza, Ivo; Tsirkin, Stepan | We present WannierBerri (WB) [1] – a new Python code for Wannier interpolation. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
130 | Systematic Comparison and Cross-validation of Fixed-Node Diffusion Monte Carlo and Phaseless Auxiliary-Field Quantum Monte Carlo in Solids | Benali, Anouar; Malone, Fionn; Morales, Miguel; Caffarel, Michel; Kent, Paul; Shulenburger, Luke | In this work we assess the feasibility of determining exact total energies for solid state Hamiltonians by studying primitive cells of four representative materials, Al, LiF and Carbon-diamond. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
131 | Prospects and Scaling Properties of Quantum Monte Carlo Forces for Heavier Ions | Tiihonen, Juha; Clay, Raymond; Krogel, Jaron | Here we give an outlook of continuum variational Monte Carlo and diffusion Monte Carlo forces in applications including heavier than usual elements, such as third row transition metals which require the use of pseudopotentials. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
132 | Entanglement transitions as a probe of quasiparticles and quantum thermalization | Lu, Tsung-Cheng; Grover, Tarun | We introduce a diagnostic for quantum thermalization based on mixed-state entanglement. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
133 | MuST: A high performance ab initio framework for the study of disordered structures | Wang, Yang; Eisenbach, Markus; Liu, Xianglin; Karabin, Mariia; Ghosh, Swarnava; Terletska, Hanna; Mondal, Wasim; Tam, Ka-Ming; Zhang, Yi; Chioncel, Liviu; Raghuraman, Vishnu; Widom, Michael; Tian, Fuyang | In this presentation, I will introduce MuST, an open source package designed for enabling first principles investigation of disordered materials. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
134 | A first-principles Quantum Monte Carlo study of two-dimensional (2D) GaSe and GaSe1-xSx alloys | Wines, Daniel; Saritas, Kayahan; Ataca, Can | We aim to present a terminal theoretical benchmark for pristine monolayer GaSe and alloys, which will aid in the further study of 2D PTMCs and alloys using DMC methods. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
135 | Ensemble Green’s function theory for interacting electrons with degenerate ground states | Linnér, Erik; Aryasetiawan, Ferdi | Building on [Phys. Rev. B 100, 235106 (2019)], we propose an ensemble Green’s function formalism, based on the von Neumann density matrix approach, for treating the one-electron excitation spectra of a degenerate electronic system. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
136 | Boost all-electron full-potential DFT calculation with the domain specific SIRIUS library. | Zhang, Long; Trickey, Samuel; Cheng, H-P. | We introduce the EXCITING-PLUS (EP) full-potential linearized augmented plane wave (FP-LAPW) code interfaced with the domain-specific SIRIUS library. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
137 | Simulation of Quantum Spin-Liquid Phases with Spectral Methods | Brito, Francisco; Ferreira, Aires | In this work, we combine accurate Chebyshev polynomial expansions [1-3] and thermal pure quantum states (TPQ) [4] to simulate quantum spin models with highly entangled ground states. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
138 | Excited states in variational Monte Carlo using a penalty method | Pathak, Shivesh; Busemeyer, Brian; Rodrigues, João; Wagner, Lucas | We present an algorithm based on orthogonalization to the ground state that resolves these difficulties in the limit as the wave function parameterization becomes complete. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
139 | Quantum Many-body Eigensolvers with Entanglement Renormalization | Khan, Abid; Yu, Xiongjie; Clark, Bryan; Pekker, David | We provide an approximate algorithm for computing eigenstates of Hamiltonians with hundreds of sites. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
140 | Permutation Matrix Representation Quantum Monte Carlo | Gupta, Lalit; Albash, Tameem; Hen, Itay | We present a quantum Monte Carlo algorithm for the simulation of general quantum and classical many-body models within a single unifying framework. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
141 | Multi-Task Reinforcement Learning for Autonomous Material Design | Rajak, Pankaj | In this talk, I will discuss our recent work on multi-task reinforcement learning (RL) for automated material-discovery with target properties and predictive synthesis of quantum materials. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
142 | Fast Bayesian Force Fields from Active Learning: Application to 2D Material and Substrates | Xie, Yu; Vandermause, Jonathan; Sun, Lixin; Cepellotti, Andrea; Kozinsky, Boris | We present a way to dramatically accelerate Gaussian process models for interatomic force fields based on many-body kernels by mapping both forces and uncertainties onto functions of low-dimensional features. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
143 | Efficient construction of training datasets based on random sampling and structural optimization | Choi, Youngjae; Jhi, Seung-Hoon | We develop a scheme named randomized atomic-system generator (RAG) to produce training sets that widely cover the potential energy surface by combining random sampling and structural optimization. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
144 | Data-Driven Interatomic Potentials for Molten Salts | Tovey, Samuel; Holm, Christian | In our work, we have developed an interatomic potential for molten NaCl using Gaussian process regression. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
145 | Design of novel polymer-metal interfaces using first principles-informed artificial intelligence techniques | Ma, Ruru; Linker, Thomas; Yang, Liqiu; Mishra, Ankit; Kamal, Deepak; Wang, Yifei; Nomura, Ken-ichi; Shimojo, Fuyuki; Nakano, Aiichiro; Kalia, Rajiv; Vashishta, Priya | In this work, we investigate how aluminum and boron nitride coatings affect the charge injection barrier and hot carrier dynamics in various polymer systems. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
146 | Electronic density and atomic forces in solids by plane-wave auxiliary-field quantum Monte Carlo | Chen, Siyuan; Motta, Mario; Ma, Fengjie; Zhang, Shiwei | We present accurate electronic densities and ionic forces in solids. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
147 | A Potential Improvement for Electrostatic Interactions: Constructing A Fluctuating Charge Model for Nucleic Acids | Myers, Christopher; Chen, Alan | As such, we will present our approach to augmenting AMBER-based force fields for nucleic acids with the ability to account for polarization. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
148 | Transfer learning of neural network potentials for reactive chemistry | Hu, Quin; Goodpaster, Jason | In this study, we are developing a method to train a neural network potential with high-level wavefunction theory on targeted systems of interest so that it is able to describe bond breaking. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
149 | Multi-task and Uncertainty Prediction of Polymer Properties with Graph Network | Mishra, Ankit; Rajak, Pankaj; Ramprasad, Rampi; Nakano, Aiichiro; Kalia, Rajiv; Vashishta, Priya | Here, we propose a graph-based Bayesian multi-task learning model to inherently capture the relation between multiple properties for a given polymer candidate. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
150 | Efficient construction of linear models in materials modeling and applications to force constant expansions | Fransson, Erik; Eriksson, Fredrik; Erhart, Paul | In this presentation, we analyze the efficacy and efficiency of several state-of-the-art regression and feature selection methods in the context of FC extraction and the prediction of different thermodynamic properties. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
151 | A Molecular-Dynamicist Walks into an Error Bar: Rigorously Quantifying Uncertainties in Simulations of Transport under Confinement | Wang, Gerald; Li, Yuanhao | In this talk, we highlight this principle in the context of a deceptively straightforward problem, namely, computing the diffusion coefficient of a fluid under nanoscale confinement (an illustrative sketch of such a workflow follows the table). | Session 9: Emerging Trends in MD Simulations and Machine Learning |
152 | Applying Neural Networks and Gaussian Process Regression to the Transition Structure Factor | Weiler, Laura; Mihm, Tina; Shepherd, James | We explore two machine learning algorithms for analyzing the transition structure factor based on coupled cluster doubles calculations on the uniform electron gas. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
153 | Maxwell + Polarizable MD multi-scale simulation for vibrational spectroscopy | Yamada, Atsushi | We present a novel computational scheme of classical molecular simulation that is unified with Maxwell’s equations based on a multi-scale model to describe the coupled dynamics of light electromagnetic waves and molecules in crystalline solids [1]. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
154 | The Self Learning Kinetic Monte Carlo (SLKMC) method augmented with data analytics for adatom-island diffusion on surfaces | Rahman, Talat | In this talk, I will present results for the diffusion kinetics of two-dimensional adatom islands in two types of systems: homoepitaxial and heteroepitaxial (a minimal kinetic Monte Carlo sketch follows the table). | Session 9: Emerging Trends in MD Simulations and Machine Learning |
155 | Multiscale reweighted stochastic embedding (MRSE): Deep learning of collective variables for enhanced sampling | Rydzewski, Jakub; Valsson, Omar | We present a new machine learning method called multiscale reweighted stochastic embedding (MRSE) [1] for automatically constructing collective variables (CVs) to represent and drive the sampling of free energy landscapes in enhanced sampling simulations. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
156 | Finite Electron Temperature Density Functional Theory and Neural Network Molecular Dynamics study of Sub Pico-Second Optical Control of Ferroelectric Domains in PbTiO3 based Nanostructures | Linker, Thomas; Kalia, Rajiv; Nakano, Aiichiro; Nomura, Ken-ichi; Shimojo, Fuyuki; Vashishta, Priya | To study large polar domains common in PbTiO3-based nanostructures, we developed a neural-network force-field model based on ground state DFT and FT-DFT training data. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
157 | A unified Bayesian approach to learning many-body potentials | Vandermause, Jonathan; Kozinsky, Boris | In this talk, we present Bayesian force fields that unite three frameworks—the Atomic Cluster Expansion (ACE), Gaussian Approximation Potentials (GAP), and Spectral Neighbor Analysis Potentials (SNAP)—opening the door to scalable, uncertainty-aware molecular dynamics simulations of complex materials. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
158 | Optimizing Free Energy Estimation with Machine Learning | Wirnsberger, Peter; Ballard, Andrew; Papamakarios, George; Abercrombie, Stuart; Racanière, Sébastien; Pritzel, Alexander; Jimenez Rezende, Danilo; Blundell, Charles | Optimizing Free Energy Estimation with Machine Learning | Session 9: Emerging Trends in MD Simulations and Machine Learning |
159 | Insights on Bimetallic Surface Dynamics via Automatically Trained Gaussian Process Machine Learning Potentials | Torrisi, Steven; Lim, Jin Soo; Sun, Lixin; Xie, Yu; Vandermause, Jonathan; Kozinsky, Boris | Machine learning force field models enable the study of long time scale molecular dynamics for large systems, but generating and selecting training data used to fit these models is a tedious and challenging task. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
160 | Quantum Monte Carlo of cohesion and excitations in diamond Si: benchmarks | Annaberdiyev, Abdulgani; Wang, Guangming; Mitas, Lubos | We present a fixed-node (FN) QMC study of bulk Si in the diamond structure, since systems containing Si tend to exhibit some of the smallest FN errors. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
161 | Dynamically consistent coarse-grained models of chemically specific polymer melts via friction parameterization | Johnson, Lilian; Phelan, Frederick | Here, we aim to develop a chemically specific, thermodynamically consistent, and dynamically correct model by combining a conservative potential and a dissipative potential. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
162 | Exploring, fitting, and characterizing the configuration space of materials with multiscale universal descriptors | Bernstein, Noam; Stenczel, Tamas; Csanyi, Gabor | We present heuristics for a universal set of multiscale Smooth Overlap of Atomic Positions (SOAP) descriptors. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
163 | Investigation of global charge distributions for constructing non-local machine learning potentials | Ko, Tsz Wai; Finkler, Jonas; Goedecker, Stefan A; Behler, Jorg | Here we use fourth-generation high-dimensional neural network potentials [3] to illustrate the role of non-local effects and suggest possible improvements for current state-of-the-art MLPs. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
164 | Optimization of the diffusion Monte Carlo nodal surface | McFarland, John; Manousakis, Efstratios | The work presented here explores a novel method that optimizes the location of the nodal surface from the walker distribution of DMC. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
165 | Orbital optimization in quantum Monte Carlo applied on solids | Luo, Ye | Thus, we enable orbital optimization schemes such as rotating an extended set of orbitals and, further, directly optimizing orbital shapes, which involves a much more challenging number of parameters. | Session 9: Emerging Trends in MD Simulations and Machine Learning |
166 | GDS Business Meeting (6:00pm-7:00pm) | Morrison, Terrance | GDS Business Meeting (6:00pm-7:00pm) | Session 10: GDS Business Meeting (6:00pm-7:00pm) |
167 | Dynamics in correlated quantum matter with neural networks | Schmitt, Markus; Heyl, Markus | In this talk I will present a versatile and efficient machine learning inspired approach based on a recently introduced artificial neural network encoding of quantum many-body wave functions. | Session 11: Machine Learning for Quantum Matter |
168 | Neural network enhanced hybrid quantum many-body dynamics | Koch, Rouven; Lado, Jose | We demonstrate that combining kernel polynomial techniques [2] and real-time evolution, together with deep neural networks, makes it possible to compute dynamical quantities faithfully. | Session 11: Machine Learning for Quantum Matter |
169 | Autoregressive Neural Network for Simulating Open Quantum Systems via a Probabilistic Formulation | Luo, Di; Chen, Zhuo; Carrasquilla, Juan; Clark, Bryan | We propose an efficient machine learning approach to simulate such dynamics using a probabilistic formulation of quantum mechanics based on the positive operator-valued measure, parameterizing the quantum states with autoregressive neural networks for exact sampling. | Session 11: Machine Learning for Quantum Matter |
170 | Customizable neural-network states for topological phases | Valenti, Agnes; Greplova, Eliska; Lindner, Netanel; Huber, Sebastian | Here, we introduce an interpretable physically motivated variational neural network ansatz based on a tunable extension of the Restricted Boltzmann Machine architecture. | Session 11: Machine Learning for Quantum Matter |
171 | Variational Neural Annealing | Hibat-Allah, Mohamed; Inack, Estelle; Wiersema, Roeland; Melko, Roger; Carrasquilla, Juan | In this talk, we present a combination of the variational principle in classical and quantum physics with recurrent neural networks (RNNs), whose dynamics are naturally devoid of slow Markov chains, to accurately emulate annealing in its classical and quantum formulations, for the purpose of solving optimization problems. | Session 11: Machine Learning for Quantum Matter |
172 | A Neural-Network approach to the simulation of Open Quantum Dynamics using POVMs | Reh, Moritz; Gaerttner, Martin; Schmitt, Markus | Among many other achievements, they present a competitive approach to the solution of the quantum many-body problem, utilizing state-of-the-art network designs that represent inherent physical properties of the system under scrutiny, e.g. translational symmetry in convolutional networks. | Session 11: Machine Learning for Quantum Matter |
173 | Hamiltonian reconstruction as metric for a variational study of the spin-1/2 J1-J2 Heisenberg model | Zhang, Kevin; Lederer, Samuel; Choo, Kenny; Neupert, Titus; Carleo, Giuseppe; Kim, Eun-Ah | We propose using a recently developed Hamiltonian reconstruction method for a multi-faceted approach to evaluating wavefunctions. | Session 11: Machine Learning for Quantum Matter |
174 | Convolutional Neural Network Wave Functions: learning quantum many-body physics | Hendry, Douglas; Feiguin, Adrian | Here, we propose, discuss and benchmark novel strategies to improve and train CNNs as variational wave functions (VWFs) for the frustrated 2D J1-J2 Heisenberg model on the square lattice, with a focus on the use of real or complex weights, the choice of activation functions, the overall architecture, and the enforcement of symmetries. | Session 11: Machine Learning for Quantum Matter |
175 | Challenges for simulating quantum spin dynamics in two dimensions by neural network quantum states | Hofmann, Damian; Fabiani, Giammarco; Mentink, Johan; Carleo, Giuseppe; Sentef, Michael | In this work, we employ both t-VMC and deterministic TDVP-based propagation to spin-1/2 Heisenberg systems and take a closer look at various sources of error which can affect the stability and accuracy of the resulting dynamics. | Session 11: Machine Learning for Quantum Matter |
176 | Gauge equivariant neural networks for quantum lattice gauge theories | Luo, Di; Carleo, Giuseppe; Clark, Bryan; Stokes, James | We propose a family of neural-network quantum states with gauge equivariant architecture which exactly satisfy the local Hilbert-space constraints of quantum lattice gauge theories. | Session 11: Machine Learning for Quantum Matter |
177 | Quantum Ground States from Reinforcement Learning | Barr, Ariel; Gispen, Willem; Lamacraft, Austen | Our work provides a novel neural approach to many-body quantum mechanics that leverages optimal control to approximate the Feynman–Kac path measure. | Session 11: Machine Learning for Quantum Matter |
178 | Neural network wave functions and the sign problem | Szabo, Attila; Castelnovo, Claudio | In this talk, I present a neural network architecture with a simple, explicit, and interpretable phase ansatz, which can robustly represent such states and achieve state-of-the-art variational energies for both conventional and frustrated antiferromagnets. | Session 11: Machine Learning for Quantum Matter |
179 | Neural Networks for Analytic Continuation of Response Functions | Verret, Simon; Nourafkan, Reza; Weyrich, Quinton; Desrosiers, Samuel; Tremblay, A.-M. | In this work, we extend the use of deep neural networks to the case of the longitudinal conductivity, in particular the DC conductivity. | Session 11: Machine Learning for Quantum Matter |
180 | Spiking Neuromorphic Chip Encodes Quantum Entanglement Correlations | Czischek, Stefanie; Baumbach, Andreas; Billaudelle, Sebastian; Cramer, Benjamin; Kades, Lukas; Pawlowski, Jan; Schemmel, Johannes; Oberthaler, Markus; Petrovici, Mihai; Gasenzer, Thomas; Gaerttner, Martin | Here we report on the realization of a prototype using the spike-based BrainScaleS hardware developed in the context of the European Human Brain Project (HBP). | Session 11: Machine Learning for Quantum Matter |
181 | Reinforcement Learning for Many-Body Ground State Preparation based on Counter-Diabatic Driving | Yao, Jiahao; Lin, Lin; Bukov, Marin | We propose a generalized QAOA ansatz called CD-QAOA, which is inspired by the counter-diabatic (CD) driving procedure, designed for quantum many-body systems, and optimized using a reinforcement learning (RL) approach. | Session 11: Machine Learning for Quantum Matter |
182 | Entanglement and Tensor Networks for Supervised Image Classification | Martyn, John; Vidal, Guifre; Roberts, Chase; Leichenauer, Stefan | We revisit the use of tensor networks for supervised image classification, as pioneered by Stoudenmire and Schwab. | Session 11: Machine Learning for Quantum Matter |
183 | Continuous monitoring and feedback control of qubit dynamics using differentiable programming | Schäfer, Frank; Sekatski, Pavel; Koppenhoefer, Martin; Loerch, Niels; Bruder, Christoph; Kloc, Michal | Starting from a distribution of initial states, we aim to find the optimal control scheme to fulfill the control task over a certain time interval. | Session 11: Machine Learning for Quantum Matter |
184 | Machine Learned Predictions of Complex Quantities from Differentiable Networks | Malenfant-Thuot, Olivier; Ryczko, Kevin; Tamblyn, Isaac; Cote, Michel | We are working on a method to optimize the data generation of these structures and the training of models in a single fully machine learned workflow, aiming to reduce the number of data points needed and the biases they carry. | Session 11: Machine Learning for Quantum Matter |
185 | Mitigating sign problem by automatic differentiation | Wan, Zhouquan; Zhang, Shixin; Yao, Hong | Here, we propose a general framework using automatic differentiation (AD) to automatically search for the best continuously-parameterized QMC scheme, which we call “automatic differentiable sign mitigation” (ADSM). | Session 11: Machine Learning for Quantum Matter |
186 | Self-learning projective quantum Monte Carlo simulations guided by restricted Boltzmann machines | Inack, Estelle | In this work, we present a novel method that uses unsupervised machine learning techniques to combine the two steps above. | Session 11: Machine Learning for Quantum Matter |
187 | Improving training schemes for encoding quantum states on neuromorphic hardware | Klassert, Robert; Czischek, Stefanie; Baumbach, Andreas; Gärttner, Martin; Gasenzer, Thomas | Here we aim to improve the training scheme used in this work and explore applications to larger classes of states. | Session 11: Machine Learning for Quantum Matter |
188 | Neural networks for atomistic modelling – are we there yet? | Kucukbenli, Emine | In this talk, we will give an overview of the current state of affairs of the field, in particular for approaches that attempt to bypass the quantum mechanical simulations. | Session 11: Machine Learning for Quantum Matter |
189 | Finding Symmetry Breaking Order Parameters with Euclidean Neural Networks | Smidt, Tess; Geiger, Mario; Miller, Benjamin | We demonstrate that symmetry equivariant neural networks uphold Curie’s principle and can be used to articulate many symmetry-relevant scientific questions into simple optimization problems. | Session 11: Machine Learning for Quantum Matter |
190 | Machine learning dielectric screening for the simulation of excited state properties of molecules and materials | Dong, Sijia; Govoni, Marco; Galli, Giulia | We present an approach to improve the efficiency of first principles calculations of absorption spectra of complex materials at finite temperature, based on the solution of the Bethe-Salpeter Equation in finite-field (FF) [1]. | Session 11: Machine Learning for Quantum Matter |
191 | Generative Model Learning For Molecular Electronics | Mitchell, Andrew; Rigo, Jonas; Sen, Sudeshna | No single theoretical method can treat the low-temperature physics of such systems exactly. | Session 11: Machine Learning for Quantum Matter |
192 | An assessment of the structural resolution of various fingerprints commonly used in machine learning | Parsaeifard, Behnam; De, Deb; Christensen, Anders; Faber, Felix; Kocer, Emir; De, Sandip; Behler, Jorg; Von Lilienfeld, O.; Goedecker, Stefan A | In this work, we compare the performance of fingerprints based on the Overlap Matrix (OM), the Smooth Overlap of Atomic Positions (SOAP), Behler-Parrinello atom-centered symmetry functions (ACSF), modified Behler-Parrinello symmetry functions (MBSF) used in the ANI-1ccx potential and the Faber-Christensen-Huang-Lilienfeld (FCHL) fingerprint under various aspects. | Session 11: Machine Learning for Quantum Matter |
193 | Vestigial nematic order in Pd-RTe3 studied using X-ray diffraction TEmperature Clustering (X-TEC) | Mallayya, Krishnanand; Matty, Michael; Straquadine, Joshua; Krogstad, Matthew; Osborn, Raymond; Rosenkranz, Stephan; Fisher, Ian; Kim, Eun-Ah | Here, we use diffuse x-ray scattering to study the effects of Pd-intercalation, which introduces controlled disorder, on CDW formation in ErTe3, a weakly orthorhombic material for which CDW fluctuations are present in both in-plane directions. | Session 11: Machine Learning for Quantum Matter |
194 | Reactive Machine Learning Potential Models for the NO Formation Reaction | Johannesen, Andrew; Goodpaster, Jason | In this work, we produce machine learning potentials for CO2, O2, N2, and NO, with the goal of modeling reactive equilibria between the latter three species. | Session 11: Machine Learning for Quantum Matter |
195 | Achieving Smaller Effective Spot Sizes in nano-ARPES with Machine Learning | Stansbury, Conrad; Lanzara, Alessandra | We demonstrate that convex optimization and machine learning enhanced nano-ARPES allows for resolving the individual contributions of sub-beam domains to the ARPES spectra of discrete domain patterned materials. | Session 11: Machine Learning for Quantum Matter |
196 | Investigating Band Gap Directness Using Machine Learning | Ogoshi de Melo, Elton; Popolin Neto, Mário; Mera Acosta, Carlos; M. Nascimento, Gabriel; Rodrigues, João; N. Oliveira Jr., Osvaldo; V. Paulovich, Fernando; M. Dalpian, Gustavo | Using the band structures of all semiconductors in the Materials Project, a total of 18,372 materials, we applied classification machine learning methods for general prediction of band gap directness and, more importantly, for extracting interpretable knowledge about direct-indirect transitions (an illustrative classification sketch follows the table). | Session 11: Machine Learning for Quantum Matter |
197 | Unsupervised machine learning of quantum phase transitions using diffusion maps | Lidiak, Alex | Unsupervised machine learning methods are particularly promising in overcoming this challenge. | Session 11: Machine Learning for Quantum Matter |
198 | Learning Algorithms for Control and Characterization of Quantum Matter | Greplova, Eliska; Jin, Guliuxin; Valenti, Agnes; Bucko, Jozef; Romero, Imelda; Schäfer, Frank; Huber, Sebastian | In this presentation, I will discuss how learning algorithms can be used efficiently for this task. | Session 11: Machine Learning for Quantum Matter |
199 | Deep neural networks for quantum state characterization, part 1: classification | Ahmed, Shahnawaz; Muñoz, Carlos Sánchez; Nori, Franco; Frisk Kockum, Anton | We discuss the problem of quantum state characterization in the context of discriminative modelling. | Session 11: Machine Learning for Quantum Matter |
200 | Deep neural networks for quantum state characterization, part 2: reconstruction | Ahmed, Shahnawaz; Muñoz, Carlos Sánchez; Nori, Franco; Frisk Kockum, Anton | Generative models based on deep neural networks attempt to learn an underlying distribution for observed data. | Session 11: Machine Learning for Quantum Matter |
201 | Chebyshev expansion of spectral functions using restricted Boltzmann machines | Chen, Hongwei; Hendry, Douglas; Weinberg, Phillip; Feiguin, Adrian | We hereby present a variational approach based on a Chebyshev expansion of the spectral function and a neural network representation for the wave functions. | Session 11: Machine Learning for Quantum Matter |
202 | Correlator Convolutional Neural Networks: An Interpretable Architecture for Image-like Quantum Matter Data | Miles, Cole; Bohrdt, Annabelle; Wu, Ruihan; Chiu, Christie; Xu, Muqing; Ji, Geoffrey; Greiner, Markus; Weinberger, Kilian; Demler, Eugene; Kim, Eun-Ah | Our approach is applicable to arbitrary lattice data, paving the way for new physical insights from machine learning studies of experimental and numerical data. | Session 11: Machine Learning for Quantum Matter |
203 | Variational optimization in the AI era | Clark, Bryan | We will describe these advancements and our effort to push forward, in the age of AI, the variational approach to the quantum many body problem. | Session 11: Machine Learning for Quantum Matter |
204 | Ab-Initio Solution of the Many-Electron Schrödinger Equation with Deep Neural Networks | Spencer, James; Pfau, David; Botev, Aleksander; G. de G. Matthews, Alexander; Foulkes, W Matthew | We show that deep neural networks can learn the ground state wavefunction of chemical systems given only the positions and charges of the nuclei using variational Monte Carlo [4]. | Session 11: Machine Learning for Quantum Matter |
205 | Fermionic lattice models with first-quantized deep neural-network quantum states | Robledo Moreno, Javier; Stokes, James; Pnevmatikakis, Eftychios; Carleo, Giuseppe | In this talk I will describe first-quantized deep Neural-Network techniques for analyzing strongly coupled fermionic systems on the lattice. | Session 11: Machine Learning for Quantum Matter |
206 | Approaching exact solutions of the electronic Schrödinger equation with deep quantum Monte Carlo | Hermann, Jan; Schätzle, Zeno; Noe, Frank | Here, we present PauliNet, a deep-neural-network architecture that includes the Hartree–Fock solution and exact cusp conditions as a baseline, and uses the Jastrow factor and backflow transformation as entry points for a graph neural network which ensures permutational antisymmetry. | Session 11: Machine Learning for Quantum Matter |
207 | Unitary quantum process tomography by time-delayed measurements | Dietrich, Felix; Lopez Gutierrez, Irene; Mendl, Christian | In this work, we investigate an approach based on the Takens and Ruelle time-delay embedding to learn the Hamiltonian from quantum measurements. | Session 11: Machine Learning for Quantum Matter |
208 | Closed-loop discovery of optimal materials using artificial intelligence | Aykol, Muratahan | In this talk, I will present how closed-loop research systems that build on past data and knowledge, automated experiments/computations and automated decision-making can be designed, tested and deployed to solve certain discovery problems in materials science through iterative, sequential optimization. | Session 11: Machine Learning for Quantum Matter |
209 | Interpretable and unsupervised phase classification based on averaged input features | Arnold, Julian; Schäfer, Frank; Zonda, Martin; Lode, Axel | Here, we present a physically motivated, computationally cheap, unsupervised, and interpretable method to infer phase boundaries from data [1]. | Session 11: Machine Learning for Quantum Matter |
210 | Exploration of Topological Metamaterial Band Structures and Chern numbers using Deep Learning | Peano, Vittorio; Sapper, Florian; Marquardt, Florian | Recently, we have introduced a numerical method for band structure calculations based on deep neural networks (NNs). | Session 11: Machine Learning for Quantum Matter |
211 | Unsupervised learning of topological order | Dagnew, Gebremedhin; Myers, Owen; Herdman, Chris; Haywards, Lauren | In particular, we apply dimensional reduction algorithms such as principal component analysis, clustering algorithms such as k-means, and the internal cluster performance metric known as silhouette analysis directly to raw spin configurations sampled from Monte Carlo simulations. | Session 11: Machine Learning for Quantum Matter |
212 | Machine learning augmented neutron and x-ray scattering for quantum materials | Li, Mingda | In this presentation, I will offer three examples from our works to introduce how ML can augment the spectroscopy analysis, including elastic scattering (momentum-space), inelastic scattering (energy-space) and absorption spectroscopy (intensity-space). | Session 11: Machine Learning for Quantum Matter |
213 | Topological quantum phase transitions retrieved through unsupervised machine learning | Che, Yanming; Gneiting, Clemens; Liu, Tao; Nori, Franco | Here we show with several prototypical and relevant models that topological quantum phase transitions can indeed be automatically retrieved, with unsupervised machine learning, and requiring only a very limited number of hyperparameters. | Session 11: Machine Learning for Quantum Matter |
214 | Machine learning dynamics of phase separation in correlated electron magnets | Zhang, Puhan; Saha, Preetha; Chern, Gia-Wei | Here we show that linear-scaling exchange field computation can be achieved using neural networks trained by datasets from exact calculation on small lattices. | Session 11: Machine Learning for Quantum Matter |
215 | Machine learning spectral indicators of topology | Andrejevic, Nina; Andrejevic, Jovana; Rycroft, Christopher; Li, Mingda | Here, we study the effectiveness of XAS as a predictor of topology using machine learning methods to disentangle key structural information from the complex spectral features. | Session 11: Machine Learning for Quantum Matter |
216 | AI-guided engineering of nanoscale topological materials | Srinivasan, Srilok; Cherukara, Mathew; Eckstein, David; Avarca, Anthony; Sankaranarayanan, Subramanian; Darancet, Pierre | As of today, we have identified 224,071 new topological nanoribbons using our framework [6]. | Session 11: Machine Learning for Quantum Matter |
217 | Automatic Learning of Topological Phase Boundaries | Kerr, Alexander; Jose, Geo; Riggert, Colin; Mullen, Kieran | In this article we introduce a heuristic that requires no such tuning. | Session 11: Machine Learning for Quantum Matter |
218 | Data Science in Leadership Consulting and People Analytics | Anzelc, Meghan | In this session, I will provide an overview of my own career path from particle physics PhD to data science leader, sharing how skills from a physics degree translate to data science careers. | Session 12: New Ways of Seeing with Data Science |
219 | Machine Learning enables a new view in the Agriculture industry | Hobbs, Jennifer | We compare two-stage, single-stage, and density-estimation approaches for speed and accuracy. | Session 12: New Ways of Seeing with Data Science |
220 | Building a safe and professional community at LinkedIn using Data Science | Wu, Pan | In this talk, we would like to provide a taste of the Data Science work on LinkedIn’s Trust team: experiments are leveraged to understand the impact of product changes and optimize members’ trust experience, advanced analytics are used to identify bad actors’ behavior patterns for better prevention, and prevalence studies are carried out to understand the gaps in our current detection mechanisms. | Session 12: New Ways of Seeing with Data Science |
221 | Fluctuations, control and suppression of viral outbreaks | Schwartz, Ira; Hindes, Jason; Kaufman, James; Bianco, Simone; Shaw, Leah | We develop an analytical theory for COVID-19-like diseases propagating through networks using an SEIR-like model, showing how characteristic closure periods emerge that minimize the total disease outbreak, and increase predictably with the reproductive number and incubation periods of a disease as long as both are within predictable limits. | Session 12: New Ways of Seeing with Data Science |
222 | Applying Deep Learning to Natural Language and Conversational AI | Zheng, Huaixiu | In this talk, I will discuss several case studies in which successful applied research on deep learning in natural language and conversational AI brings enormous real-world impact by transforming the way people interact with natural language interfaces. | Session 12: New Ways of Seeing with Data Science |
223 | FAIR and Reproducible High-Throughput Workflows with AiiDA | Huber, Sebastiaan; Marzari, Nicola; Pizzi, Giovanni | We have thus developed AiiDA (http://aiida.net), a comprehensive, robust, open-source, high-throughput infrastructure dedicated to addressing the challenges of automated workflow management and data provenance storage. | Session 13: Open Science and Open Data |
224 | Ontologies in Computational Materials Science | Lenz-Himmer, Maja-Olivia; Ghiringhelli, Luca; Baldauf, Carsten; Scheffler, Matthias | We advanced it to an ontology and extended it to increase semantics based on the European Materials and Modeling Ontology (EMMO). | Session 13: Open Science and Open Data |
225 | The Organic Superconductor Database | Ganter, Owen; Agosta, Charles | The band structure, density of states, and Fermi surface are available from calculations made by applying a tight-binding model to charge transfer integrals obtained using Gaussian09. | Session 13: Open Science and Open Data |
226 | The OpenAIRE Research Graph: Science as a public good | Manghi, Paolo | The OpenAIRE Research Graph: Science as a public good | Session 13: Open Science and Open Data |
227 | The NOMAD Artificial-Intelligence Toolkit: Web-Based FAIR-Data-Driven Materials Science | Sbailò, Luigi; Scheffler, Matthias; Ghiringhelli, Luca | Here, we present the NOMAD Artificial-Intelligence (AI) Toolkit, a web-based infrastructure for the interactive analysis of the material-science Findable, Accessible, Interoperable, and Recyclable (FAIR) data stored in the NOMAD Archive. | Session 13: Open Science and Open Data |
228 | Discovery of rare-earth-free magnetic materials through databases | Sakurai, Masahiro; Wang, Renhai; Liao, Timothy; Zhang, Chao; Sun, Huaijun; Sun, Yang; Wang, Haidi; Zhao, Xin; Wang, Songyou; Balasubramanian, Balamurugan; Xu, Xiaoshan; Sellmyer, David; Antropov, Vladimir; Zhang, Jianhua; Wang, Cai-Zhuang; Ho, Kai; Chelikowsky, James | In particular, we use an adaptive genetic algorithm (AGA) to efficiently explore a broad range of compositional and structural space. | Session 13: Open Science and Open Data |
229 | The power of quantum neural networks | Abbas, Amira; Sutter, David; Zoufal, Christa; Lucchi, Aurelien; Figalli, Alessio; Woerner, Stefan | In this work, we use tools from information geometry to define a notion of expressibility for quantum and classical models. | Session 14: Quantum Machine Learning |
230 | Variational Quantum Boltzmann Machines | Zoufal, Christa; Lucchi, Aurélien; Woerner, Stefan | This work presents a novel realization approach to Quantum Boltzmann Machines (QBMs). | Session 14: Quantum Machine Learning |
231 | Data re-uploading for a universal quantum classifier | Pérez-Salinas, Adrián; Cervera-Lierta, Alba; Gil-Fuster, Elies; Latorre, José I. | The extension of this idea to several qubits enhances the efficiency of the strategy as entanglement expands the superpositions carried along with the classification. | Session 14: Quantum Machine Learning |
232 | Learnability and Complexity of Quantum Sample | Niu, Murphy Yuezhen; Dai, Andrew; Li, Li; Smelyanskiy, Vadim; Neven, Hartmut; Boixo, Sergio | A similar exponential separation has yet to be established in generative models through quantum sample learning: given samples from an n-qubit computation, can we learn the underlying quantum distribution using models with training parameters that scale polynomially in n under a fixed training time? | Session 14: Quantum Machine Learning |
233 | Operational Natural Gradients For Variational Quantum Algorithms | McMahon, Nathan | Here I will present an alternate natural gradient technique for ground state minimisation that abstracts away the wave function and focuses on the output distribution used to compute the energy. | Session 14: Quantum Machine Learning |
234 | Generation of High Resolution Handwritten Digits with Samples from a Quantum Device | Rudolph, Manuel; Toussaint Bashige, Ntwali; Katabarwa, Amara; Peropadre, Borja; Perdomo-Ortiz, Alejandro | To maximize the potential of this algorithm on NISQ devices, we propose a novel technique that leverages the uniquely quantum possibility of measuring in bases other than the computational basis, enhancing the expressibility of the prior distribution of our quantum-classical approach (a small measurement-basis sketch follows the table). | Session 14: Quantum Machine Learning |
235 | TensorFlow Quantum: An open source software framework for hybrid quantum-classical machine learning | Mohseni, Masoud | We present several new techniques for quantum circuit learning on Noisy Intermediate-Scale Quantum (NISQ) processors. | Session 14: Quantum Machine Learning |
236 | Barren Plateaus in Quantum Neural Networks | Cerezo de la Roca, Marco; Sone, Akira; Sharma, Kunal; Volkoff, Tyler; Cincio, Lukasz; Coles, Patrick | For both of these we provide conditions under which the parameter trainability can be guaranteed, and we connect the notion of locality of the cost with its trainability. | Session 14: Quantum Machine Learning |
237 | Power of data in quantum machine learning | Huang, Hsin-Yuan; Broughton, Michael; Mohseni, Masoud; Babbush, Ryan; McClean, Jarrod | In this work, we show that some problems that are classically hard to compute can, in fact, be predicted easily with classical machines that learn from data. | Session 14: Quantum Machine Learning |
238 | Quantum Machine Learning with Quantum-Probabilistic Generative Models | Martinez, Antonio; Roeder, Geoffrey; Verdon-Akzam, Guillaume | In this work we explore the task of generatively modelling mixed quantum states using hybridizations of classical probabilistic machine learning models and quantum neural networks (QNNs). | Session 14: Quantum Machine Learning |
239 | A divide-and-conquer algorithm for quantum state preparation | Araujo, Israel; Park, Kyungdeock; Petruccione, Francesco; da Silva, Adenilton | Here, we show that it is possible to load an N-dimensional vector with exponential time advantage using a quantum circuit with polylogarithmic depth and entangled information in ancillary qubits. | Session 14: Quantum Machine Learning |
240 | Enhancing Combinatorial Optimization with Quantum Generative Models | Fernandez Alcazar, Francisco; Perdomo, Alejandro | In this work we introduce a new family of quantum-enhanced optimizers and demonstrate how quantum machine learning models, known as quantum generative models, can enhance performance over results based only on state-of-the-art classical solvers. | Session 14: Quantum Machine Learning |
241 | Adversarial Robustness of Quantum Machine Learning Models | Liao, Haoran; Convy, Ian; Huggins, William; Whaley, Birgitta | In this paper, we focus on the adversarial robustness in classifying a subset of encoded states that are smoothly generated from a Gaussian latent space. | Session 14: Quantum Machine Learning |
242 | A Quantum Reservoir Computing Approach to Image Classification | Hu, Fangjun; Khan, Saeed; Angelatos, Gerasimos; Tureci, Hakan | In this work, we consider and analyze the efficacy of a reservoir computing approach to address these issues. | Session 14: Quantum Machine Learning |
243 | Neuromorphic computing with single-element quantum reservoirs | Govia, Luke; Kalfus, William; Ribeill, Guilhem; Rowlands, Graham; Krovi, Hari; Ohki, Thomas | We study the noise-resilient neuromorphic computing scheme of reservoir computing with a quantum system as a reservoir. | Session 14: Quantum Machine Learning |
244 | Quantum Learning at High Temperatures in a Dissipative Electronic System | Miller, John; Villagran, Martha; Wosik, Jarek; Kolapo, Ayo | We discuss proposed concepts that exploit such phenomena, including a CDW quantum reservoir computing concept and quantum devices based on patterned ion implantation of CDW materials. | Session 14: Quantum Machine Learning |
245 | Storage properties of a quantum perceptron | Gratsea, Aikaterini; Kasper, Valentin; Lewenstein, Maciej | In this work, we explore the storage capacity of a specific quantum perceptron architecture. | Session 14: Quantum Machine Learning |
246 | Kerr Network Reservoir Computing for Quantum State Measurement | Angelatos, Gerasimos; Khan, Saeed; Tureci, Hakan | Here we propose reservoir processing as a hardware-based solution to superconducting qubit readout. | Session 14: Quantum Machine Learning |
247 | Quantum Thermodynamics of Quantum Boltzmann Machines | Oh, Sangchul; Kais, Sabre | The entropy, free energy, work, and Jarzynski equality were investigated. | Session 14: Quantum Machine Learning |
248 | Quantum-assisted GAN networks for particle shower simulation | Delgado, Andrea | For this reason, we consider an approach for incorporating near-term quantum hardware into deep learning models in which a quantum model is trained and deployed on quantum hardware and used to implement a portion (e.g., a layer of a deep neural network) of the overall deep learning model. | Session 14: Quantum Machine Learning |
249 | Quantum generative adversarial networks with provable convergence | Niu, Murphy Yuezhen; Broughton, Michael; Zlokapa, Alexander; Mohseni, Masoud; Smelyanskiy, Vadim; Neven, Hartmut | In this work, we prove that the iterative training of a discriminator circuit against a generator circuit of previously proposed quantum GANs does not converge for certain initializations, but instead exhibits periodic oscillation between two configurations. | Session 14: Quantum Machine Learning |
250 | Quantum Long Short-Term Memory | Chen, Samuel Yen-Chi; Yoo, Shinjae; Fang, Yao-Lung L. | In this talk, we propose a model of LSTM based on the hybrid quantum-classical paradigm, which we call QLSTM. | Session 14: Quantum Machine Learning |
251 | Implementation of quantum machine learning for electronic structure calculations of periodic systems on NISQ devices | Sureshbabu, Shree Hari; Xia, Rongxin; Kais, Sabre | We present the modified approach that can be implemented on Noisy Intermediate-Scale Quantum (NISQ) devices along with the results of implementing this method on IBM-Q for the computation of the electronic structure of graphene. | Session 14: Quantum Machine Learning |
252 | Analysis of a Quantum Kernel-Based Classifier Using a Tunable Trapped Ion Noisy Simulator | Kenemer, Keith; Cubeddu, Michael; MacCormack, Ian; Delaney, Conor; Aggarwal, Nidhi; Narang, Prineha | In this work, we develop a tunable trapped-ion noisy simulator to analyze the noise-sensitivity of a relevant quantum machine learning (QML) algorithm with respect to various noise metrics specific to existing and near-term trapped-ion hardware. | Session 14: Quantum Machine Learning |
253 | RL-QAOA: A Reinforcement Learning Approach to Many-Body Ground State Preparation | Yao, Jiahao; Lin, Lin; Bukov, Marin | We proposed a reinforcement learning (RL) approach to preparing the ground state of many-body quantum systems. | Session 14: Quantum Machine Learning |
254 | Reinforcement learning for semi-autonomous approximate quantum eigensolver | Albarran-Arriagada, Francisco; Retamal, Juan Carlos; Lamata, Lucas; Solano, Enrique | Here, we propose a protocol to obtain an approximation of the eigenvectors of an arbitrary Hermitian quantum operator. | Session 14: Quantum Machine Learning |
255 | Differentiable Quantum Architecture Search | Zhang, Shixin; Hsieh, Chang-Yu; Zhang, Shengyu; Yao, Hong | Hereby, we propose a general framework of differentiable quantum architecture search (DQAS), which enables automated designs of quantum circuits in an end-to-end differentiable fashion. | Session 14: Quantum Machine Learning |
256 | Quantum computation on defective circuits | Ansari, Mohammad | I present how to update ideal gate operations to obtain the expected output state from such a faulty circuit, with examples performed on real quantum processors as well as simulators. | Session 14: Quantum Machine Learning |
257 | Branching Quantum Convolutional Neural Networks: A Variational Ansatz with Mid-Circuit Measurements | MacCormack, Ian; Delaney, Conor; Galda, Alexey; Narang, Prineha | We introduce the bQCNN, a variation of the quantum convolutional neural network (QCNN) in which outcomes from mid-circuit measurements of subsets of qubits inform subsequent quantum gate operations. | Session 14: Quantum Machine Learning |
258 | Learning local and nonlocal quantum data via generative model over tensor network architecture | Najafi, Khadijeh; Azizi, Ahmadreza; Stoudenmire, Miles; Gao, Xun; Lukin, Mikhail; Yelin, Susanne; Mohseni, Masoud | To this end, we investigate the training of the Born Machine for learning both local and nonlocal data encoded in GHZ and Cluster states over various tensor network architectures. | Session 14: Quantum Machine Learning |
259 | A few examples of Machine Learning and Artificial Neural Networks applied to Quantum Physics | Nori, Franco | Machine learning provides effective methods for identifying topological features [1]. | Session 14: Quantum Machine Learning |
260 | Quantum-enhanced data classification with a variational entangled sensor network | Xia, Yi; Li, Wei; Zhuang, Quntao; Zhang, Zheshen | Supervised learning assisted by an entangled sensor network (SLAEN) is a distinct paradigm that harnesses VQCs trained by classical machine learning algorithms to tailor multipartite entanglement shared by the sensors for solving practically useful data processing problems. | Session 14: Quantum Machine Learning |
261 | Classical variational simulation of the Quantum Approximate Optimization Algorithm | Medvidović, Matija; Carleo, Giuseppe | We introduce a method to classically simulate quantum circuits made of several layers of parameterized gates, a key component of variational algorithms suitable for near-term quantum computers. | Session 14: Quantum Machine Learning |
262 | Using Reinforcement Learning for Quantum Control in Magnetic Resonance | Kaufman, Will; Alford, Benjamin; Peng, Pai; Huang, Xiaoyang; Cappellaro, Paola; Ramanathan, Chandrasekhar | We compare RL algorithms to gradient ascent pulse engineering (GRAPE) for both state-to-state transfer operations as well as the design of desired unitary operations on single- and two-qubit systems. | Session 14: Quantum Machine Learning |
263 | Machine Learning-Derived Entanglement Witnesses | Zhu, Eric; Wu, Larry; Qian, Li | Recent studies of the classification of entangled states have utilized aspects of machine learning such as neural networks. | Session 14: Quantum Machine Learning |
264 | Unsupervised machine learning quantum dynamics | Choi, Matthew; Flam-Shepherd, Daniel; Kyaw, Thi Ha; Aspuru-Guzik, Alan | In this talk, we describe the application of generative models using neural ODEs to quantum dynamics, which we show, can learn the underlying quantum dynamics and can extrapolate well beyond the training regime when performing reconstructions. | Session 14: Quantum Machine Learning |
265 | Quantum adiabatic machine learning with zooming | Zlokapa, Alexander; Mott, Alex; Job, Joshua; Vlimant, Jean-Roch; Lidar, Daniel; Spiropulu, Maria | We propose QAML-Z, a novel algorithm that iteratively zooms in on a region of the energy surface by mapping the problem to a continuous space and sequentially applying quantum annealing to an augmented set of weak classifiers. | Session 14: Quantum Machine Learning |
266 | Convolutional Neural Networks and Symmetries of Quantum 1D Spin Chains | Alam, Shah Saad; Ju, Yilong; Minoff, Jonathan; Anselmi, Fabio; Patel, Ankit; Pu, Han | Using neural network architectures that employ quantum variational Monte Carlo methods has opened up a new way of studying quantum many-body systems. | Session 14: Quantum Machine Learning |
267 | Deep Quantum Control: End-to-end quantum control using deep learning algorithms | Khosravani, Omid | Here we propose an "end-to-end" framework, which instead starts from direct experimental observations to obtain optimal quantum control trajectories that are sufficiently resilient to all sources of errors. | Session 14: Quantum Machine Learning |
268 | Unsupervised Learning of Physical Systems with Two-dimensional Tensor Network Structures | Azizi, Ahmadreza; Najafi, Khadijeh; Mohseni, Masoud | Leveraging the expressibility and training power of Projected Entangled Pair State (PEPS) networks, we study the capability of our Born Machine with a PEPS structure to learn the underlying patterns in the classical Ising model and two-dimensional Rydberg atom systems. | Session 14: Quantum Machine Learning |
269 | Pattern-Recognition Training of a Quantum Neuron on a Quantum Computer | Cavaletto, London; Candelori, Luca; Matos Abiague, Alex | We propose an alternative quantum perceptron (QP) model that uses a reduced number of multi-qubit gates and is less susceptible to quantum errors than other existing models. | Session 14: Quantum Machine Learning |
270 | Explainable Natural Language Processing with Matrix Product States | Bhadola, Pradeep; Tangpanitanon, Jirawat; Mangkang, Chanatip; Minato, Yuichiro; Angelakis, Dimitris; Chotibut, Thiparat | Here, we attempt to provide systematic answers through the mapping between DL and its matrix product state (MPS) counterpart [2]. | Session 14: Quantum Machine Learning |
271 | Machine-learning tools for rapid control, calibration and characterization of QPUs and other quantum devices | Wittler, Nicolas; Roy, Federico; Pack, Kevin; Werninghaus, Max; Roy, Anurag Saha; Egger, Daniel; Filipp, Stefan; Wilhelm, Frank; Machnes, Shai | In this talk I shall present the overall concept, and insights into future directions. | Session 14: Quantum Machine Learning |
272 | Teaching computation for large student class sizes | Henkes, Silke | I will present our 2nd year course ‘Mathematical Programming’ in the School of Mathematics at the University of Bristol as a case study. | Session 15: Teaching Computation and Data Science within the Physics Curriculum |
273 | An Introduction to Cloud-Based Data Science Tools | Soltanieh-Ha, Mohammad | In this talk, I will provide an overview of tools and techniques that can improve both the learning experience of the students and the instructor’s ability to manage the class and materials. | Session 15: Teaching Computation and Data Science within the Physics Curriculum |
274 | Data science competencies for physics education | Shahmoradi, Amir | Here I will describe our continued efforts at The University of Texas at Arlington (UTA) to bridge the existing gaps between the training of undergraduate students in the Data Science program of UTA and the data-science technical and soft skill competencies that are desired by the job market. | Session 15: Teaching Computation and Data Science within the Physics Curriculum |
275 | Teaching data science and Bayesian statistics for physical sciences | Seljak, Uros | I will describe the development and teaching experiences of a new course in the UC Berkeley physics department covering Bayesian statistics and machine learning applications in the physical sciences. | Session 15: Teaching Computation and Data Science within the Physics Curriculum |
276 | PICUP resources for integrating computation in the online and in-person classroom | Lopez del Puerto, Marie | In this talk, I will share my experience integrating computation into the Introductory Physics sequence and an upper-level Thermodynamics and Statistical Mechanics course. | Session 15: Teaching Computation and Data Science within the Physics Curriculum |
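The sketches below are generic, illustrative examples added for context; they are not the presenters’ codes, data, or results.

Talk 151 concerns rigorous uncertainty quantification for transport coefficients computed from molecular dynamics. The following sketch shows one common, generic workflow under assumed inputs: estimate the diffusion coefficient D from the slope of the mean-squared displacement via the Einstein relation, and attach a rough error bar by block averaging over groups of particles. The random-walk trajectories, fit window, and block count are invented for the example and are not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 3D random-walk "trajectories": shape (n_particles, n_steps, 3).
n_particles, n_steps, dt = 64, 2000, 0.001   # dt in arbitrary time units
true_D = 0.5
steps = rng.normal(scale=np.sqrt(2 * true_D * dt), size=(n_particles, n_steps, 3))
traj = np.cumsum(steps, axis=1)

def diffusion_coefficient(positions, dt):
    """Einstein relation: MSD(t) ~ 6*D*t in 3D; D from a linear fit of MSD vs t."""
    disp = positions - positions[:, :1, :]
    msd = np.mean(np.sum(disp ** 2, axis=2), axis=0)   # average over particles
    t = np.arange(positions.shape[1]) * dt
    # Fit only the latter half of the curve (an arbitrary choice for this sketch).
    half = len(t) // 2
    slope, _ = np.polyfit(t[half:], msd[half:], 1)
    return slope / 6.0

# Block averaging: split particles into blocks and use the spread of the
# block estimates as a rough standard error of the mean.
n_blocks = 8
blocks = np.array_split(np.arange(n_particles), n_blocks)
estimates = np.array([diffusion_coefficient(traj[idx], dt) for idx in blocks])

D_mean = estimates.mean()
D_err = estimates.std(ddof=1) / np.sqrt(n_blocks)
print(f"D = {D_mean:.3f} +/- {D_err:.3f}  (true value used to generate the data: {true_D})")
```

In practice the resulting error bar also depends on correlations between time origins and on the chosen fit window, which illustrates why even this seemingly simple calculation deserves careful uncertainty quantification.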
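Talk 154 describes the Self-Learning Kinetic Monte Carlo (SLKMC) method for adatom-island diffusion. For orientation only, here is a minimal rejection-free (BKL/Gillespie-style) KMC step, the basic building block that SLKMC augments with an automatically grown event database; the two hop processes, barriers, and attempt frequency below are made-up placeholders rather than anything from the talk.

```python
import numpy as np

def kmc_step(rates, rng):
    """One rejection-free KMC step.

    rates : 1D array of non-negative event rates (1/s).
    Returns (chosen_event_index, time_increment).
    """
    total = rates.sum()
    # Pick an event with probability proportional to its rate.
    cumulative = np.cumsum(rates)
    event = np.searchsorted(cumulative, rng.random() * total)
    # Advance the clock by an exponentially distributed waiting time.
    dt = -np.log(1.0 - rng.random()) / total
    return event, dt

# Toy example: two hypothetical hop processes with Arrhenius rates.
kB_T = 0.025                        # eV, roughly room temperature
attempt_frequency = 1e12            # 1/s, assumed prefactor
barriers = np.array([0.45, 0.60])   # eV, invented migration barriers
rates = attempt_frequency * np.exp(-barriers / kB_T)

rng = np.random.default_rng(0)
t = 0.0
counts = np.zeros(len(rates), dtype=int)
for _ in range(10_000):
    event, dt = kmc_step(rates, rng)
    counts[event] += 1
    t += dt

print("simulated time (s):", t)
print("relative event frequencies:", counts / counts.sum())
```

The self-learning aspect of SLKMC lies in discovering and cataloguing the available moves and their barriers on the fly; the fixed two-event list above deliberately does not attempt that.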
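Talk 196 trains classifiers to predict whether a semiconductor’s band gap is direct or indirect and to extract interpretable rules. The snippet below is only a schematic of that kind of supervised classification task: it fits a random forest to synthetic tabular data with invented feature names and labels, and is not the descriptor set, model, or dataset used by the authors.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000

# Hypothetical per-material descriptors (stand-ins, not the talk's features).
X = np.column_stack([
    rng.uniform(0.1, 6.0, n),    # "band gap (eV)"
    rng.uniform(3.0, 12.0, n),   # "lattice constant (Angstrom)"
    rng.integers(1, 231, n),     # "space group number"
    rng.uniform(1.0, 4.0, n),    # "mean electronegativity"
])

# Synthetic direct (1) vs. indirect (0) labels from an arbitrary rule plus noise,
# used only so the example runs end to end.
y = ((X[:, 0] > 2.0) & (X[:, 3] > 2.2)).astype(int)
flip = rng.random(n) < 0.1
y[flip] = 1 - y[flip]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
print("feature importances:", np.round(clf.feature_importances_, 3))
```

Feature importances (or more sophisticated explanation tools) are one route toward the kind of interpretable direct-versus-indirect knowledge the talk targets.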
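Talk 234 mentions enhancing a quantum-classical generative model by measuring in bases other than the computational basis. The toy numpy calculation below illustrates just that elementary ingredient, not the authors’ algorithm: the outcome probabilities of a single-qubit state differ between a computational (Z) basis readout and an X-basis readout, the latter implemented by applying a Hadamard rotation before a standard Z measurement.

```python
import numpy as np

# Single-qubit state |psi> = cos(theta/2)|0> + sin(theta/2)|1>; theta is arbitrary here.
theta = np.pi / 3
psi = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

# Computational (Z) basis probabilities: |<0|psi>|^2 and |<1|psi>|^2.
p_z = np.abs(psi) ** 2

# X-basis measurement: rotate with a Hadamard, then read out in the Z basis.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
p_x = np.abs(H @ psi) ** 2   # probabilities of the |+> and |-> outcomes

print("Z-basis probabilities (|0>, |1>):", np.round(p_z, 4))
print("X-basis probabilities (|+>, |->):", np.round(p_x, 4))
```

Sampling the same state in several bases can reveal information (for example, relative phases) that computational-basis samples alone do not, which is the intuition behind using alternative measurement bases to enrich a model’s prior distribution.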