Dr. Arthur B. McDonald, CC, FRSC
Professor Emeritus, Department of Physics, Queen’s University
Art McDonald, O.Ont., O.N.S., FRS, P.Eng., is a native of Sydney, N.S., Canada. He holds degrees in physics from Dalhousie University (BSc, MSc) and Caltech (PhD), as well as nine honorary degrees. From 1969 to 1982 he was a Research Officer at AECL Chalk River Laboratories; from 1982 to 1989, a Professor at Princeton University; and from 1989 to 2013, a Professor at Queen’s University, Kingston, Canada, where he held the Gordon and Patricia Gray Chair in Particle Astrophysics from 2006 to 2013 and became Professor Emeritus in 2013. Since 1989 he has been Director of the Sudbury Neutrino Observatory (SNO) Scientific Collaboration. Among many awards, he is a Companion of the Order of Canada, co-recipient of the 2015 Nobel Prize in Physics, and co-recipient of the 2016 Breakthrough Prize in Fundamental Physics with the SNO Collaboration. He continues to be active in basic research on neutrinos and dark matter at the SNOLAB underground laboratory. He is a member of the Board of Directors of the Perimeter Institute and was a member of the 2016/2017 Federal Panel on Fundamental Science.
Title: SNO, SNOLAB and High Performance Computing
Abstract: High Performance Computing has been an important part of the experimental work carried out 2 km underground at the Sudbury Neutrino Observatory and the SNOLAB Underground Research Laboratory. Large-scale experiments have used the ultra-low-radioactivity environment to detect neutrinos from the sun and to pursue observations of Weakly Interacting Massive Particles (WIMPs), thought to be a candidate for the Dark Matter making up about 26% of our Universe. Descriptions will be given of some of these projects and of the roles that High Performance Computing has played in detector simulation and data analysis.
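As a purely illustrative sketch (not SNO or SNOLAB code; the geometry, absorption length, and photon counts below are hypothetical placeholders), the kind of detector-simulation workload referred to above can be caricatured by a Monte Carlo photon-transport loop, which is naturally parallel across events and therefore well suited to HPC clusters:

# Toy Monte Carlo: isotropic photons emitted at the centre of a spherical
# detector, counting how many reach the wall before being absorbed.
# All parameters are illustrative, not SNO values.
import numpy as np

rng = np.random.default_rng(42)

def simulate_photons(n_photons, detector_radius_m=6.0, absorption_length_m=30.0):
    # For centre-emitted photons, the path length to the wall is the radius.
    path = np.full(n_photons, detector_radius_m)
    # Sample an exponential absorption distance for each photon.
    absorbed_at = rng.exponential(absorption_length_m, size=n_photons)
    return np.count_nonzero(absorbed_at > path)

n = 10_000_000
hits = simulate_photons(n)
print(f"Detected fraction: {hits / n:.4f}")

Production simulations track full detector geometry, wavelength-dependent optics, and the underlying particle physics for many millions of events, which is why such workloads are routinely distributed across many compute nodes.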
Title: Using Big Data for Cognitive Computing approaches to Personalised Medicine for Preterm Birth, First Responder Resilience and Astronaut Health
Abstract: The effective use of Big Data within the context of health and wellness has the potential to improve health outcomes and increase efficiencies in healthcare; many believe it could be the most disruptive influence on healthcare since genomics. To enable that disruptive impact, systemic approaches to creating effective change in healthcare using Big Data platforms and techniques are needed. The Artemis project, an international award-winning research project, is a Big Data platform for critical care, and specifically neonatal intensive care, that has been used to support clinical research studies in Canada, the USA and China, resulting in new medical discoveries for several conditions within that population. These new discoveries provide a knowledge base that opens new pathways to cognitive computing approaches for personalised medicine for preterm birth and neonatal care in general. In addition, research that uses Artemis to propose new approaches for real-time, on-board health monitoring during long-range space flight, moving beyond the current paradigm of Earth-based telehealth, which becomes less feasible on long-range missions, will be presented. Big Data analytics also has great potential to revolutionise and personalise resilience training for first responders. The Athena platform, named after the Greek goddess of intelligence, warfare and wisdom, extends the Artemis platform by providing not only physiological input from participants but also information from the virtual reality game that they are playing. Initial results from the use of that environment, together with a new haptic garment, ARAIG, for new approaches to personalising resilience training to potentially help reduce the incidence of post-traumatic stress disorder, will be provided.
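As an illustrative sketch only (not the Artemis or Athena implementation; the signal, window length, and threshold below are hypothetical), a real-time physiological monitoring pipeline of the general kind described above can be reduced to a sliding-window analytic over a streaming vital sign:

# Minimal streaming analytic: keep a rolling window of heart-rate samples
# and flag sustained deviations from a participant's baseline.
from collections import deque
from dataclasses import dataclass
from statistics import mean

@dataclass
class Sample:
    t: float          # time in seconds
    heart_rate: float # beats per minute

class SlidingWindowMonitor:
    def __init__(self, window_s=30.0, threshold_bpm=15.0):
        self.window_s = window_s
        self.threshold_bpm = threshold_bpm
        self.samples = deque()

    def update(self, sample: Sample, baseline_bpm: float) -> bool:
        self.samples.append(sample)
        # Drop samples that have fallen out of the time window.
        while self.samples and sample.t - self.samples[0].t > self.window_s:
            self.samples.popleft()
        window_mean = mean(s.heart_rate for s in self.samples)
        return abs(window_mean - baseline_bpm) > self.threshold_bpm

monitor = SlidingWindowMonitor()
for t in range(60):
    alert = monitor.update(Sample(t=float(t), heart_rate=70.0 + 0.5 * t), baseline_bpm=70.0)
    if alert:
        print(f"t={t}s: sustained deviation from baseline")

A production platform would ingest many such streams at device sampling rates, fuse them with contextual inputs (in Athena's case, events from the virtual reality game), and run far richer analytics, but the window-over-stream pattern is the core structural idea.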
Dr. William Peltier, FRSC
University Professor and Professor of Physics
Director of the Centre for Global Change Science
PI of the Polar Climate Stability Network
Scientific Director of SciNet
Department of Physics, University of Toronto
William Richard Peltier, Ph.D., D.Sc. is University Professor of Physics at the University of Toronto, where he is Director of the Centre for Global Change Science and Scientific Director of Canada’s largest supercomputer centre, SciNet. He is a Fellow of the Royal Society of Canada and recipient of the Vetlesen Prize (2004), the Bower Award and Prize of the Franklin Institute of Philadelphia (2010), the Gerhard Herzberg Gold Medal of NSERC (2011) and the Killam Prize in natural science of the Canada Council for the Arts (2012). His work is focused upon problems in planetary evolution, including the nonlinear dynamics of the climate system.
Title: The Dansgaard-Oeschger Oscillation of Glacial Climate: A Stringent Test of Global Warming Model Skill
Abstract: Models of the coupled climate system that are employed to simulate the global warming process are heavily tuned structures designed to accurately fit the constraints provided by instrumental-era data on the evolution of mean surface temperature. Even though these models are successful in fitting this observational constraint, when they are used to project warming into the future under a specific greenhouse gas emission scenario, their projections diverge significantly by the end of the 21st century. This is because the models in the large suite being exercised in the context of the work of the Intergovernmental Panel on Climate Change (IPCC) differ significantly in their “climate sensitivity” to increasing carbon dioxide concentration. This raises the important question of whether such models can be expected to have significant skill outside of the region of parameter space in which they have been tuned.
A probe of this issue involves the application of a leading global climate model, the Community Earth System Model of the US National Center for Atmospheric Research, to the understanding of the so-called Dansgaard-Oeschger oscillation of the cold glacial climate conditions that existed approximately 20,000 years ago. At this time the Canadian land mass was covered by a thick veneer of land ice whose maximum thickness exceeded 4 km. The D-O oscillation process consists of individual surface temperature pulses of millennial timescale which are of relaxation oscillation form, characterized by extremely rapid transitions from cold glacial conditions to warm “interstadial” conditions followed by a slow relaxation back to the cold state. These oscillations were discovered more than 30 years ago by the scientists for whom they are named, on the basis of analyses of oxygen isotopic time series obtained from Greenland ice cores, but until recently they remained unexplained. The underlying physics has been traced to behavior that I have described as a “kicked” salt oscillator in the Atlantic Ocean (Peltier and Vettoretti, GRL, 2014). The characteristic fast-timescale aspect of the D-O process has recently been shown to involve the opening of a “super-polynya” in the extensive sea ice cover that forms over the North Atlantic Ocean under cold “stadial” conditions (Vettoretti and Peltier, GRL, 2016). The model has also been successful in explaining the “bipolar seesaw” aspect of the phenomenon, through which northern hemisphere influence is transmitted into the southern hemisphere, where it appears as oxygen isotopic variability in Antarctic ice cores (Peltier and Vettoretti, GRL, 2014). These analyses have been performed using the highest resolution version of the CESM1 model, which is 1 degree x 1 degree in the horizontal in both atmosphere and ocean, and required individual run wall-clock times of 8 months on the Power 6 cluster operated by the SciNet facility at the University of Toronto. This cluster was funded by the original CFI grant that led to the creation of the National HPC Platform.
This analysis clearly demonstrates that modern global warming models possess significant skill well outside the region of parameter space in which they are tuned. Since the D-O process involves full coupling between the atmospheric, oceanographic and sea ice components of the global warming model, in the absence of greenhouse gas forcing, it seems clear that errors in the representation of these components can play no role in contributing to the inter-model differences in climate sensitivity to increasing greenhouse gas concentrations.
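For readers unfamiliar with relaxation oscillations, the following generic toy (the van der Pol equation, not the CESM1 simulation or salt-oscillator physics discussed above) reproduces their qualitative signature of slow drifts punctuated by abrupt transitions, loosely analogous to the stadial/interstadial structure of the D-O signal:

# Toy relaxation oscillator: the van der Pol equation at large damping
# parameter mu, integrated with a simple explicit Euler scheme.
import numpy as np

def van_der_pol(mu=10.0, dt=1e-3, t_max=60.0):
    n = int(t_max / dt)
    x, y = 2.0, 0.0
    xs = np.empty(n)
    for i in range(n):
        # Euler step of  x'' - mu (1 - x^2) x' + x = 0
        dx = y
        dy = mu * (1.0 - x * x) * y - x
        x, y = x + dt * dx, y + dt * dy
        xs[i] = x
    return xs

trace = van_der_pol()
# Count sign changes of x as a crude proxy for the rapid transitions.
transitions = np.count_nonzero(np.diff(np.sign(trace)) != 0)
print(f"{transitions} rapid transitions over the simulated interval")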
Dr. Erin R. Johnson
Associate Professor, Department of Chemistry, Dalhousie University
Herzberg-Becke Chair in Theoretical Chemistry
B.Sc. (Hons.) in Integrated Science (Mathematics and Chemistry), Carleton University
Ph.D. in Chemistry, Queen’s University
Erin R. Johnson is an Associate Professor at Dalhousie University and holds the Herzberg-Becke Chair in Theoretical Chemistry. Her research focuses on development and application of density-functional theory (DFT), with an emphasis on intermolecular interactions. Notable methods co-developed by Johnson include the exchange-hole dipole moment (XDM) dispersion model, the non-covalent interactions (NCI) index, and the Becke-Johnson exchange potential.
Title: Dispersion-Corrected Density-Functional Theory, Molecular Crystals, and Polymorphism
Abstract: Density-functional theory (DFT) methods are the workhorse of modern computational chemistry. However, conventional functionals do not include the physics of London dispersion, which is necessary to model intermolecular interactions. One approach to modeling dispersion is the exchange-hole dipole moment (XDM) model. This is a non-empirical, density-functional approach, based on second-order perturbation theory. XDM can be used in conjunction with popular density functionals to provide highly accurate results for both van der Waals complexes and molecular crystals. In this talk, some applications of XDM are presented, focusing on predicting the enantiomeric excess of chiral crystals and energetic ranking of crystal polymorphs.
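Schematically, and with the precise definitions of the dispersion coefficients and damping radii left to the XDM literature, a pairwise dispersion correction of this type is added on top of the base functional:

\[
E_{\text{total}} = E_{\text{DFT}} + E_{\text{disp}}, \qquad
E_{\text{disp}} = -\sum_{i<j}\;\sum_{n=6,8,10}
\frac{C_{n,ij}}{R_{ij}^{\,n} + R_{\mathrm{vdW},ij}^{\,n}},
\]

where, in the XDM approach, the C_{n,ij} are obtained non-empirically from exchange-hole multipole moments and atomic polarizabilities rather than fitted to reference data.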
Dr. Michael Jones
W.K. Estes Chair of Cognitive Modeling
Professor of Cognitive Science
Professor of Psychological and Brain Sciences
Adjunct Professor of Informatics and Computing
Indiana University, Bloomington
Michael Jones is Professor of Cognitive Science, Psychology, and Informatics at Indiana University where he holds the William and Katherine Estes Endowed Chair in Cognitive Modeling, and directs the Cognitive Computing Laboratory. A native of Toronto, he completed degrees at Nipissing University (B.A.) and Queen’s University (M.A., Ph.D.), and was subsequently a postdoctoral fellow at the Institute of Cognitive Science, University of Colorado.
Dr. Jones’ research is broadly in the emerging field of cognitive computing, with a particular focus on understanding the computational mechanisms used by the brain to learn and represent knowledge from linguistic and sensory experience. The second prong of his research is to transfer these models of human information processing to solve practical applications in the information sciences that are currently a challenge for purely data-driven machine learning models. He has been awarded Outstanding Career Awards from the National Science Foundation, the Federation of Behavioral and Brain Sciences, and the Psychonomic Society. He is currently Editor-in-Chief of Behavior Research Methods, and author of the recent book Big Data in Cognitive Science. His research is funded by the National Science Foundation, National Institutes of Health, Institute of Education Sciences, Clinical and Translational Sciences Institute, and Google Research.
Title: Modeling Human Learning to Inform Machine Learning
Abstract: The human brain has been optimized by evolution to handle massive information processing tasks in an efficient and flexible manner. Despite recent trends in computing science toward “deep learning” models, there remain significant gaps between humans and machines in how they learn, represent, and use information. But these gaps are also an opportunity for interdisciplinary collaboration to produce novel insights in machine intelligence by studying human intelligence. In this talk, I will survey recent progress in my lab toward developing large-scale computational models of human cognition, simulating the mechanisms that humans use to learn deep knowledge representations from first-order statistical experience. These models can be “raised” on linguistic and perceptual information representative of our environments, and their parameters tuned to constrain the choice between theoretical accounts of learning. They can then be transferred and applied to a variety of practical information retrieval and data mining tasks. Because humans are both the producers and consumers of such a large amount of the data we wish to mine for knowledge, models of human cognition can offer unique insights not captured by purely data-driven machine learning techniques. I will conclude with a few applied examples from my lab using hybrid cognitive models in clinical data mining and automated tutoring systems in elementary school classrooms.
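As a purely illustrative sketch (not a model from the speaker’s lab; the corpus, window size, and similarity measure are toy choices), the idea of learning semantic representations from first-order statistical experience can be caricatured by a small co-occurrence model:

# Toy semantic space: represent each word by its co-occurrence counts within
# a fixed window, then compare words by cosine similarity.
from collections import Counter, defaultdict
import math

corpus = [
    "the doctor treated the patient in the clinic",
    "the nurse treated the patient in the hospital",
    "the dog chased the cat in the yard",
]

window = 2
cooc = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if i != j:
                cooc[w][words[j]] += 1

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[k] * b[k] for k in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

print("doctor ~ nurse:", round(cosine(cooc["doctor"], cooc["nurse"]), 3))
print("doctor ~ dog:  ", round(cosine(cooc["doctor"], cooc["dog"]), 3))

Even this crude count-based model places “doctor” closer to “nurse” than to “dog”; the cognitive models discussed in the talk learn far richer representations from linguistic and perceptual experience, but from the same kind of distributional evidence.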