Explore research on data and AI

The development of graphics processing units, which dramatically increased computing power, and the exponential growth of data have driven major strides in artificial intelligence over the past ten years. UL researchers in actuarial science, mathematics, computer science, software engineering, and electronics are working to make sense of this data and produce accurate interpretations that inform decision making and help solve complex problems. Some are developing new ways of transmitting data and new tools for analyzing it, while others are refining financial risk assessment tools or teaching software to learn complex tasks on its own, tasks no single human could perform unaided (machine learning).

The faces of data and AI research

Here’s a look at the topics our researchers are addressing in actuarial science, computer science, software engineering, electrical engineering, mathematics, and statistics with respect to data and AI, along with a brief profile of the work being done by one of our faculty members in each area.

Big data

Artificial intelligence makes it possible to extract meaning from the big data sets that technological advances have made it possible to collect. The challenge for researchers is to find innovative ways of using computers to aggregate, analyze, and cross-reference data to reach new and reliable conclusions. They are using computational neurobiology, mathematical logic, and computer science to solve logically or algorithmically complex problems. Big data analysis, processing, and mining have applications in many fields, including diagnostic assistance in medicine, decision support and task automation in industry and transportation, and machine translation in communications.
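To make the aggregate-analyze-cross-reference workflow concrete, here is a minimal Python sketch using pandas. The data sets, identifiers, and column names are invented for illustration.

```python
# A minimal sketch of the aggregate/cross-reference workflow described above,
# using pandas. The tables and column names are hypothetical.
import pandas as pd

# Two independently collected data sets sharing a patient identifier.
visits = pd.DataFrame({
    "patient_id": [1, 1, 2, 3],
    "symptom_score": [4.2, 5.1, 3.0, 7.8],
})
labs = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "biomarker": [0.9, 1.4, 2.7],
})

# Cross-reference (join) the two sources, then aggregate per patient.
merged = visits.merge(labs, on="patient_id")
summary = merged.groupby("patient_id").agg(
    mean_symptom=("symptom_score", "mean"),
    biomarker=("biomarker", "first"),
)
print(summary)
```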

Data sampling and analysis

Louis-Paul Rivest, Full Professor, Department of Mathematics and Statistics

Canada Research Chair in Statistical Sampling and Data Analysis

Professor Rivest specializes in sampling and survey methodology, directional data analysis, and capture-recapture and multidimensional statistical models, such as copulas (mathematical objects used in probability theory) and frequency tables. Through his chair, he develops data acquisition and analysis tools used to analyze complex phenomena affecting human and animal populations. For example, he combines data from social surveys with data from social networks to improve estimates for small population subgroups.
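As an illustration of capture-recapture modelling, here is a hedged Python sketch of the classical Lincoln-Petersen estimator in Chapman's bias-corrected form. The counts are invented, and this is a textbook baseline, not Professor Rivest's own methodology.

```python
# A minimal capture-recapture example: estimate a population size N
# from two capture occasions (Lincoln-Petersen, Chapman's correction).

def lincoln_petersen(n1: int, n2: int, m: int) -> float:
    """Estimate population size N from two capture occasions.

    n1: animals caught and marked on occasion 1
    n2: animals caught on occasion 2
    m:  marked animals recaptured on occasion 2
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Example counts: 120 marked, 150 caught later, 30 recaptured.
# Chapman's form gives about 588; the classical n1*n2/m form gives 600.
print(lincoln_petersen(120, 150, 30))
```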

Infrared vision

Infrared is a form of electromagnetic radiation whose waves have a lower frequency than visible light (just below red). It is directly linked to heat because objects emit thermal radiation in the infrared spectrum. Multispectral remote sensing systems capture this emitted (thermal) infrared radiation along with reflected infrared. New infrared devices and increased computing power have opened up new possibilities for infrared technology, including detecting weak points in building insulation, locating hard-to-access victims during rescue operations, studying nocturnal animal species, and diagnosing pathologies.
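The link between heat and infrared can be made concrete with Wien's displacement law, which locates the peak wavelength of a body's thermal emission. A short Python sketch using the standard physical constant; the temperatures are examples.

```python
# Why "heat waves are emitted in the infrared": Wien's displacement law
# gives the wavelength at which a body's thermal emission peaks.

WIEN_B = 2.897771955e-3  # Wien's displacement constant, m*K

def peak_wavelength_um(temperature_k: float) -> float:
    """Peak emission wavelength in micrometres for a blackbody at T kelvin."""
    return WIEN_B / temperature_k * 1e6

for label, t in [("room temperature", 293.0), ("human skin", 305.0)]:
    print(f"{label} ({t} K): peak at {peak_wavelength_um(t):.1f} um")
# Both peaks fall near 10 um, deep in the thermal infrared band.
```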

Xavier Maldague, Full Professor, Department of Electrical and Computer Engineering

Canada Research Chair in Multipolar Infrared Vision

Xavier Maldague, a specialist in non-destructive material analysis using infrared thermography, is working to solve problems that limit the use of infrared vision in a number of sectors. For example, he is improving the detection of the subtle surface temperature changes that reveal the presence of hidden defects. He is also using numerical simulations of thermal phenomena to explore the usefulness of infrared vision in a number of fields, and he uses thermographic cameras to observe thermal anomalies in Egyptian pyramids, such as the pyramid of Cheops at Giza, in order to examine these unique structures and understand how they were built.
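A toy version of defect detection by thermography, in Python: subtract a smoothed baseline from a thermal image so that subtle local temperature deviations stand out. The synthetic frame and threshold are invented for illustration and do not represent Professor Maldague's actual methods.

```python
# Detect a subtle warm patch in a simulated thermal image by
# background subtraction and thresholding.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
frame = 20.0 + 0.05 * rng.standard_normal((64, 64))  # surface near 20 C
frame[30:34, 30:34] += 0.3  # a hidden defect warms a small patch

baseline = gaussian_filter(frame, sigma=8)   # slowly varying background
residual = frame - baseline                  # local anomalies remain
defect_mask = residual > 0.15                # simple threshold

print("anomalous pixels:", int(defect_mask.sum()))
```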

Machine learning

Machine learning is a subfield of AI that focuses on designing, analyzing, developing, and implementing methods that let computers acquire new knowledge and skills by processing data so they can perform complex tasks. The ability of machines to make inferences from large amounts of data using algorithms is applied in many areas, including voice, facial, and object recognition and machine translation.
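A minimal example of a machine acquiring a skill from data, in Python with scikit-learn: a classifier learns to recognize handwritten digits from labelled examples. The data set and model choice are illustrative and not tied to any researcher profiled here.

```python
# A standard machine-learning example: learn digit shapes from pixels,
# then measure accuracy on data the model has never seen.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=5000)  # fit a linear classifier
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```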

Richard Khoury, Full Professor, Department of Computer Science and Software Engineering

Affiliated researcher at the Big Data Research Centre (BDRC)

Richard Khoury specializes in AI, machine learning, big data processing, and natural language processing. He leads research on improving computers' ability to solve problems with learning algorithms. One of his projects aims to improve the effectiveness of software that helps community moderators identify hate speech in online conversations. He is also working on advanced learning algorithms that will allow companies to combine customer information from multiple databases in order to personalize product offerings for their customers.
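A hedged sketch of the text-classification core such a moderation tool might rely on, in Python with scikit-learn: TF-IDF features feeding a linear classifier. The toy training sentences and labels are invented, and real research systems are far more sophisticated.

```python
# Flag hostile text for human review with a simple trained classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "I completely disagree with your argument",
    "people like you should disappear",
    "thanks for sharing this perspective",
    "get out of here, you worthless group",
]
labels = [0, 1, 0, 1]  # 0 = acceptable, 1 = flag for a human moderator

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["you people are worthless"]))  # likely flags: [1]
```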

I chose to do a PhD in actuarial science to develop my autonomy and creativity in a research field that involves forecasting and risk management. I am specifically interested in dependency modelling, risk measures, and capital allocation.

Ishan Chaoubi, PhD student in actuarial science supervised by Professor Hélène Cossette

Multiphysical mathematical modelling

Multiphysical mathematical modelling makes it possible to create mathematical models that take into account all the complex physical phenomena (e.g., elasticity, plasticity, thermal and fluid dynamics, and electromagnetism) that occur during product manufacturing and use. These models are used to solve applied problems with a view to designing better quality products. Researchers use partial differential equations (Fourier, Navier-Stokes, Maxwell, etc.) to study the theoretical solvability of complex problems and to develop numerical tools for solving them.
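As a small illustration, here is an explicit finite-difference solution of the one-dimensional heat (Fourier) equation in Python. The grid, diffusivity, and boundary conditions are invented for the example.

```python
# Solve u_t = alpha * u_xx on [0, 1] with fixed boundary temperatures,
# by marching an explicit finite-difference scheme forward in time.
import numpy as np

nx, alpha = 50, 1.0
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2 / alpha   # respects the stability limit dt <= dx^2 / (2*alpha)

u = np.zeros(nx)
u[0] = 1.0                 # hot left boundary, cold right boundary

for _ in range(2000):      # time stepping
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

print(u[::10])             # temperature approaches a linear steady profile
```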

Risk theory

For insurance companies, risk theory involves assessing the overall risk of portfolios of insurance contracts. The distribution of total claims for a portfolio is used to measure the overall risk and set the premium, which offsets the compensation paid to insured persons in the event of a claim. Claim frequency and claim severity are the two quantities used to measure these risks.
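A hedged Monte Carlo sketch of the frequency/severity model in Python: total claims S = X_1 + ... + X_N with a Poisson claim count N and lognormal severities X_i. All parameters are invented for illustration.

```python
# Simulate the distribution of total claims, then read off a pure
# premium (the mean) and a simple tail risk measure.
import numpy as np

rng = np.random.default_rng(1)
n_sims, lam = 100_000, 3.0   # e.g., 3 expected claims per year

counts = rng.poisson(lam, size=n_sims)                 # claim frequency N
totals = np.array([
    rng.lognormal(mean=8.0, sigma=1.0, size=n).sum()   # severities X_i
    for n in counts
])

print("pure premium (mean total claims):", totals.mean())
print("99% Value-at-Risk:", np.quantile(totals, 0.99))
```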

Étienne Marceau, Full Professor, School of Actuarial Science

Codirector of Laboratoire ACT & Risk

Étienne Marceau specializes in actuarial risk modelling and assessment and quantitative risk management, and has written a book on the subject: Modélisation et évaluation quantitative des risques en actuariat (Springer, 2013). His interests include modelling the relationships between different actuarial risks and developing more accurate numerical methods for quantitatively assessing overall risk and setting premiums accordingly. He is also interested in the actuarial modelling of financial and mortality risks.

Signal processing

Signal processing involves techniques for processing, analyzing, and interpreting signals, including control, filtering, data transmission, noise reduction, and identification. It draws on electronics and automation, as well as a number of other fields, including mathematics (linear algebra and stochastic processes, i.e., families of random variables evolving over time), information theory, and numerical analysis. Analog signals, which are produced by sensors and amplifiers, are distinguished from digital signals, which are produced by computers and terminals.
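A small Python illustration of two operations named above, filtering and noise reduction: a noisy sine wave is cleaned with a low-pass Butterworth filter. The frequencies, filter order, and noise level are arbitrary choices for the example.

```python
# Recover a 5 Hz signal buried in noise with a low-pass filter.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(2)
signal = np.sin(2 * np.pi * 5 * t)            # 5 Hz signal of interest
noisy = signal + 0.5 * rng.standard_normal(t.size)

b, a = butter(4, 20 / (fs / 2))               # 4th-order low-pass, 20 Hz cutoff
clean = filtfilt(b, a, noisy)                 # zero-phase filtering

print("noise power before:", np.mean((noisy - signal) ** 2).round(3))
print("noise power after: ", np.mean((clean - signal) ** 2).round(3))
```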

Paul Fortier, Full Professor, Department of Electrical and Computer Engineering

Director of the Institute for Information Technologies and Societies

Professor Fortier specializes in the application of digital signal processing to communications. He is interested in systems that combine orthogonal frequency-division multiplexing (OFDM), a technique for modulating radio signals that distributes the data over multiple neighbouring carrier frequencies, with multiple-input multiple-output (MIMO), a multiplexing technology used in wireless and mobile networks for long-range, high-rate data transfer over multiple antennas, in wireless, mobile, and fixed-link environments. One of his projects involves building new architectures for the wireless systems of the future.
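To give a flavour of the OFDM idea, here is a hedged Python sketch in which data symbols are spread across parallel subcarriers with an inverse FFT and recovered with an FFT over an ideal channel. The subcarrier count and QPSK mapping are illustrative; real systems add cyclic prefixes, channel estimation, coding, and MIMO processing.

```python
# Map random bits to QPSK symbols, one per subcarrier, then round-trip
# them through the IFFT/FFT pair at the heart of OFDM.
import numpy as np

rng = np.random.default_rng(3)
n_subcarriers = 64

# Random QPSK symbols, one per subcarrier (frequency domain).
bits = rng.integers(0, 2, size=(n_subcarriers, 2))
symbols = (2 * bits[:, 0] - 1 + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

tx = np.fft.ifft(symbols)   # transmit: spread symbols across time
rx = np.fft.fft(tx)         # receive over an ideal, noiseless channel

print("symbols recovered:", np.allclose(rx, symbols))  # True
```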