This DokuWiki uses a theme created by Anymorphic Webdesign.

MSIAM course list

Students must choose courses from the list below, which covers a very wide range of mathematical topics. Faculty members will be pleased to advise students on how to construct their curriculum. Meetings are scheduled at the start of the semester to inform students about the scope of the courses.

The list below is likely to change slightly.

“Labs” usually means numerical experiments on computers.



Refresher courses (0 ECTS)

Introduction to matrix numerical analysis and numerical optimisation

6h courses + 6h seminar + 6h Labs (L.), Franck Iutzeler and Jérôme Malick

  • matrix numerical analysis
    • matrix analysis (matrices, eigen/singular values, condition number, functions)
    • numerical methods (factorizations, linear equation solving, eigenvalue/vectors computation)
    • Practical work: evaluation of the cost of basic operations, application to regression models…
  • numerical optimisation
    • introduction to optimization (definitions, examples, convexity)
    • algorithms in unconstrained optimisation (gradient, Newton, quasi-Newton)
    • Practical work: comparison of optimization algorithms, application to logistic regression
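The fixed-step gradient method from the list above can be sketched in a few lines; the quadratic objective, step size and stopping rule below are illustrative choices, not course material:

```python
# Minimal gradient-descent sketch on f(x, y) = (x - 1)^2 + 10 * (y + 2)^2,
# a strongly convex quadratic with unique minimizer (1, -2).

def grad_descent(grad, x0, step=0.05, tol=1e-8, max_iter=10_000):
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) < tol ** 2:   # stop when the gradient is small
            break
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

grad = lambda v: [2 * (v[0] - 1), 20 * (v[1] + 2)]
x_star = grad_descent(grad, [0.0, 0.0])
# x_star is close to [1, -2]
```

The same loop with a Hessian solve in place of the fixed step gives the Newton method compared in the practical work.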

Links: See [Fall 2016] Basics on Matrix Analysis and Optimization


Advanced Algorithms for Machine Learning and Data Mining

3 ECTS 18h, Eric Gaussier and Ahlame Douzal

  • Apriori algorithm (frequent itemsets) & PageRank
  • Monte Carlo and MCMC methods: Metropolis–Hastings and Gibbs sampling
  • Matrix Factorization (Stochastic Gradient Descent, SVD)
  • Generalized k-means and its variants (batch, online, large-scale), kernel clustering (support vector clustering), spectral clustering
  • Classification and Regression Trees, Support Vector regression
  • Alignment and matching algorithms (local/global, pairwise/multiple), dynamic programming, Hungarian algorithm,…
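As an illustration of the PageRank item above, here is a minimal power-iteration sketch on a toy link graph; the graph, damping factor and iteration count are illustrative choices:

```python
# PageRank by power iteration on a tiny 4-page link graph.
# Dangling nodes are not handled here; every page has outgoing links.

def pagerank(links, d=0.85, iters=100):
    pages = sorted(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}       # teleportation mass
        for p, outs in links.items():
            share = rank[p] / len(outs)             # split rank over out-links
            for q in outs:
                new[q] += d * share
        rank = new
    return rank

links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(links)
# "C" collects links from A, B and D, so it ends up with the top score
```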

Evaluation: final exam.


Advanced Imaging

3 ECTS, C. 18h, Sylvain Meignen

In this course, we will first focus on linear methods for image denoising. In this regard, we will investigate some properties of the heat equation and of the Wiener filter. We will then introduce nonlinear partial differential equations, such as the Perona–Malik model, for noise removal, along with some other similar models. The last part of the course will be devoted to edge detection, for which we will consider the Canny approach and, more precisely, deal in detail with active contours and level-set methods.


An introduction to shape and topology optimization

3 ECTS, C. 18h, Eric Bonnetier and Charles Dapogny

In a very broad sense, shape and topology optimization is about finding the best domain (which may represent, depending on the application, a mechanical structure, a fluid channel,…) with respect to a given performance criterion (e.g. robustness, weight, etc.), under some constraints (e.g. of a geometric nature). Fostered by its impressive technological and industrial achievements, this discipline has aroused growing enthusiasm among mathematicians, physicists and engineers since the seventies. Nowadays, problems from fields as diverse as mechanical engineering, fluid mechanics and biology, to name a few, are tackled with optimal design techniques, and constantly raise new, challenging issues.

The purpose of this course is to discuss the main aspects of the numerical resolution and practical implementation of shape and topology optimization problems, and to present state-of-the-art elements of response. It covers both the necessary theoretical ingredients and the related numerical considerations. More specifically, the following issues will be addressed:

  • How to define a 'good' notion of derivative for a "cost" function depending on the domain;
  • How to calculate the shape derivative of a function which depends on the domain via the solution of a partial differential equation posed on it;
  • How to devise efficient first-order algorithms (e.g. steepest-descent algorithms) based on the notion of shape derivative;
  • How to numerically represent shapes so that it is convenient both to perform Finite Element computations on them and to deal with their evolution in the course of the optimization process.

Prerequisites: Only a basic knowledge of functional analysis and scientific computing will be assumed: differential calculus, Finite Element method, etc.


Congestion Phenomena and Compressibility for Granular Media

3 ECTS, C. 18h, Didier Bresch

Granular flows are at the heart of phenomena such as erosion, landslides and volcanology. The mathematical study of these complex events is an important numerical and physical challenge. We will show how it requires a general view related to nonlinear PDEs. The objective of this course will be two-fold:

  1. Show how the compressibility and the viscoplasticity of the phenomenon can play an important role
  2. Discuss congestion phenomena in granular media (maximum packing) that can be compared mathematically to floating structure phenomena in the presence of a free boundary.

The main idea of this lecture is to motivate by examples interdisciplinary collaborations needed to deal with complex situations.


Computational biology

3 ECTS, C. 18h, Antoine Frenoy and Clovis Galliez

This interdisciplinary MSc course is designed for applicants with a biomedical, computational or mathematical background. It provides students with the necessary skills to produce effective research in bioinformatics and computational biology.

The objective is to provide a short introduction to bioinformatics modelling and advanced tools for the analysis of sequence data. The first part of the course focuses on applications in molecular biology and evolution, including hierarchical clustering and the analysis of phylogenetic and population genetic data.

The second part of the course focuses on machine learning for biological data, and includes change point detection in sequences and unsupervised clustering of massive genetic data. The course is evaluated with two lab-works, one for each part of the course.

No specific prerequisites.

Evaluation: project (1/2) + final exam (1/2).


Data science seminar

3 ECTS, Seminars, Pierre Etoré, Ronald Phlypo and Hacheme Ayasso

Our master programs now include a series of 5 seminars given by active researchers in the field of data processing methods and analysis.

These seminars are intended to give students some insights on modern problems and solutions developed in a data science framework, with applications in a variety of fields.

In order to make these seminars a most valuable experience for all students, a scientific paper dealing with the topic of the seminar will be selected by the speaker and dispatched to all students about 2 weeks before the seminar. Students are expected to read and study this paper, and to prepare questions, before attending the seminar. Presence at the seminars is compulsory for master students.

Previous editions: announcements for 2019-2020 on https://data-institute.univ-grenoble-alpes.fr/education/data-science-seminar-series/ (regularly updated)

In addition, please follow information about this course on the Chamilo page: https://chamilo.grenoble-inp.fr/courses/PHELMAWPMTSSP7/index.php?id_session=0

This module is common with the M2 programmes MSIAM Data Science, MoSIG Data Science and SIGMA.

Evaluation: final written report and oral presentation.


Efficient methods in optimization

3 ECTS, C. 18h, Roland Hildebrand

The subject of this half-semester course is more advanced methods in convex optimization. It consists of 6 lectures of 2 × 1.5 hours each, and can be seen as a continuation of the course “Non-smooth convex optimization methods”.

This course deals with:

  1. Linear programs
    • Representations of linear programs
    • Simplex method
    • Duality
    • Liftings / Complexity
  2. Conic programs
    • Duality
    • Symmetric cones
    • Second order conic / semi-definite programming
  3. Robust optimization
  4. Robust counterparts of conic programs
    • Robust Linear / Second order conic / Semi-definite programs
  5. Interior-point methods
    • Self-concordant barriers
    • Path-following methods
  6. Relaxations
    • MaxCut (Goemans / Williamson)
    • Stable set (Lovasz / Schrijver)
    • Copositive programming relaxations
  7. Polynomial optimization
    • Sums of squares relaxations
    • Moment relaxations

Evaluation: a two-hour written exam (E1) in December. For those who do not pass, there will be another two-hour exam (E2) in session 2 in spring.


Fundamentals of probabilistic data mining

3 ECTS, C. 13.5h, L. 4.5h, Xavier Alameda-Pineda

Content: This course introduces probabilistic models with latent variables, and the associated algorithms to estimate the parameters and perform inference over the latent variables. Such models are used for unsupervised tasks such as clustering and source modeling, as well as for supervised tasks such as classification and regression. You will discover the basic probabilistic models as well as more advanced techniques.

The following topics are addressed:

  • Principles of probabilistic data mining and generative models
  • Latent variables and probabilistic graphical models
  • Mixture models
  • The linear-Gaussian model and probabilistic PCA
  • Markov models for time series with continuous and discrete latent variables
  • Variational inference and variational auto-encoders

At the end of the course, the student will have basic knowledge of the most common probabilistic models with latent variables. The student will therefore be able to perform model-based clustering, analyse and segment time series with hidden Markov models, build a graphical model associated with a given distribution, represent numerical multivariate data with missing coordinates in planes, and work with state-of-the-art nonlinear regression models based on variational autoencoders.
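As a taste of the model-based clustering mentioned above, here is a minimal EM sketch for a two-component 1-D Gaussian mixture; the data, initialization and iteration count are illustrative choices:

```python
import math, random

# EM for a two-component 1-D Gaussian mixture, the simplest latent-variable model.
random.seed(0)
data = [random.gauss(-3, 1) for _ in range(200)] + [random.gauss(3, 1) for _ in range(200)]

def em_gmm(xs, iters=50):
    mu = [-1.0, 1.0]          # initial means
    sigma = [1.0, 1.0]        # initial standard deviations
    pi = [0.5, 0.5]           # mixing weights
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in xs:
            w = [pi[k] * math.exp(-((x - mu[k]) ** 2) / (2 * sigma[k] ** 2)) / sigma[k]
                 for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: re-estimate parameters from responsibility-weighted data
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            sigma[k] = math.sqrt(sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk)
            pi[k] = nk / len(xs)
    return mu, sigma, pi

mu, sigma, pi = em_gmm(data)
# the recovered means should be close to the true values -3 and 3
```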

Prerequisite: Fundamental principles in probability theory (conditioning) and statistics (maximum likelihood estimator and its usual asymptotic properties).

Evaluation: The first session combines a written exam (E1) and the reports of the three practical sessions (P). The final mark of the first session is obtained as (E1+P)/2. The second session consists of only a written exam (E2) which constitutes the final grade.


Geophysical imaging

3 ECTS, C. 18h, Ludovic Métivier

Understanding the Earth's interior mechanisms, assessing seismic hazard due to earthquakes and volcanoes, securing our access to hydrocarbon resources and monitoring CO2 storage sites all represent crucial issues for modern societies. They all share the same need for a deep and accurate knowledge of the Earth's interior, particularly the crust. We present in this course a seismic imaging method yielding unprecedented high-resolution information on the subsurface structure. This seismic imaging method is formulated as an optimization problem, constrained by partial differential equations representing seismic wave propagation. We study in this course all of the constitutive elements of this method, insisting on its mathematical and numerical aspects: wave modeling in heterogeneous media, large-scale optimization methods, and implementation on high-performance parallel computing architectures.


GPU Computing

6 ECTS, C 18h, Christophe Picard

In this course, we will introduce parallel programming paradigms to the students in the context of applied mathematics. Students will learn to identify parallel patterns in numerical algorithms. The key components the course will focus on are: efficiency, scalability, parallel patterns, comparison of parallel algorithms, operational intensity and emerging programming paradigms. Through different lab assignments, the students will apply the concepts of efficient parallel programming using Graphics Processing Units. In the final project, the students will have the opportunity to parallelize a numerical application of their own, developed in a previous course.

Syllabus:

  • Introduction to parallelism
  • Introduction to general context of parallelism
  • Models of parallel programming
  • Description of various models of parallelism
  • Paradigm of parallelism
  • Templates of parallelism
  • Parallel architectures
  • Programming tools: Cuda

Prerequisite: C or C++, Compiling, Data structures, Architecture, Concurrency

Evaluation: project.


Information access and retrieval

3 ECTS, C. 18h. Georges Quenot, Philippe Mulhem and Jean-Pierre Chevallet

This course addresses advanced aspects of information access and retrieval, focusing on several points: models (probabilistic, vector-space and logical), multimedia indexing, web information retrieval, and their links with machine learning. These last parts provide opportunities to present the processing of large amounts of partially structured data. Each part is illustrated with examples drawn from different applications.
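As a small illustration of the vector-space model listed above, here is a tf-idf and cosine-similarity sketch on a toy corpus; the corpus, query and idf smoothing are illustrative choices:

```python
import math

# tf-idf weighting and cosine similarity for vector-space retrieval.
docs = ["the cat sat on the mat", "the dog chased the cat", "dogs and cats"]

def tfidf_vectors(corpus):
    tokenized = [d.split() for d in corpus]
    vocab = sorted({w for d in tokenized for w in d})
    n = len(corpus)
    df = {w: sum(w in d for d in tokenized) for w in vocab}   # document frequency
    idf = {w: math.log(n / df[w]) + 1.0 for w in vocab}       # smoothed idf
    return vocab, [[d.count(w) * idf[w] for w in vocab] for d in tokenized]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

vocab, vecs = tfidf_vectors(docs)
query_vec = ["cat sat".split().count(w) for w in vocab]   # raw term counts for the query
scores = [cosine(query_vec, v) for v in vecs]
# document 0 ("the cat sat on the mat") scores highest for the query "cat sat"
```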

Course contents:

Part I. Foundations of Information Retrieval

Course 1: Information retrieval basics. Course 2: Classical models for information retrieval. Course 3: Natural language processing for information retrieval. Course 4: Theoretical models for information retrieval.

Part II: Web and social networks

Course 5: Web information retrieval and evaluation. Course 6: Social networks and information retrieval. Course 7: Personalized and mobile information retrieval. Course 8: Recommender systems.

Part III: Multimedia indexing and retrieval

Course 9: Visual content representation and retrieval. Course 10: Classical machine Learning for multimedia indexing. Course 11: Deep learning for information retrieval. Course 12: Deep learning for multimedia indexing and retrieval.


Evaluation: Final exam.

Introduction to extreme-value analysis

3 ECTS, C. 18h, Stéphane Girard

Taking extreme events (heavy rainfall, floods, etc.) into account is often crucial in the statistical approach to risk modeling. In this context, the behavior of the distribution tail is more important than the shape of the central part of the distribution. Extreme-value theory offers a wide range of tools for modeling and estimating the probability of extreme events. In particular, the following points will be addressed in the course:

1) Asymptotic behavior of the largest value of a sample. Extreme-value Distribution (EVD). Maximum domains of attraction (Fréchet, Weibull and Gumbel). Asymptotic behavior of excesses over a threshold. Generalized Pareto Distribution (GPD). Regularly varying functions.

2) Estimation of the parameters of the EVD and GPD. Hill estimator. Application to the estimation of extreme quantiles. Illustration on simulated and real data.
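The Hill estimator mentioned in point 2) can be sketched on a simulated Pareto sample, whose extreme-value index 1/alpha is known; the sample size, alpha and the number k of top order statistics are illustrative choices:

```python
import math, random

# Hill estimator of the extreme-value index from the k largest order statistics.
random.seed(1)
alpha = 2.0
# inverse-CDF sampling of a standard Pareto(alpha): P(X > x) = x^(-alpha)
xs = [random.random() ** (-1.0 / alpha) for _ in range(10_000)]

def hill(sample, k):
    ys = sorted(sample, reverse=True)
    return sum(math.log(ys[i] / ys[k]) for i in range(k)) / k

gamma_hat = hill(xs, k=500)
# gamma_hat estimates the extreme-value index 1 / alpha = 0.5
```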

Prerequisites: knowledge of statistics and probability will be assumed.

Evaluation: written exam.


Inverse problems and data assimilation: variational and Bayesian approaches

3 ECTS, C. 18h, Elise Arnaud

This course is about inverse problems and data assimilation.

Inverse methods make it possible to combine optimally all the sources of information available about a given (physical, biological, chemical, …) system:

  • mathematical equations (physical laws, biological processes, …);
  • observations (measurements from real experiments);
  • error statistics (observation errors, model errors, …).

These sources of information are usually heterogeneous: they differ in nature, quality and quantity. In geosciences, inverse methods are often called data assimilation. Historically, the idea was to estimate the initial state of the atmosphere in order to produce weather forecasts. Today, data assimilation has many applications beyond initial state estimation (parameter estimation, physical law parameterisation, numerical parameter estimation, estimation of unknown forcing sources…). It is also used in many application domains other than weather forecasting (oceanography, oil drilling, seismology, energy, medicine, biology, glaciology, agronomy, the construction industry, …).

The purpose of this course is to give an overview of the existing methods, from variational approaches, which describe the problem in terms of function optimisation, to Bayesian techniques, which rely on sampling theory. Labs are also part of the course.

Key words and phrases: parameter estimation, uncertainty quantification and reduction, numerical modeling, optimisation, Monte Carlo filter

Evaluation is based on a final written report and oral presentation (one final note).


Kernel methods for machine learning

3 ECTS, C. 18h, Julien Mairal

Statistical learning is about the construction and study of systems that can automatically learn from data. With the emergence of massive datasets commonly encountered today, the need for powerful machine learning is of acute importance. Examples of successful applications include effective web search, anti-spam software, computer vision, robotics, practical speech recognition, and a deeper understanding of the human genome. This course gives an introduction to this exciting field, with a strong focus on kernels as a versatile tool to represent data, in combination with (un)supervised learning techniques that are agnostic to the type of data that is learned from. The learning techniques that will be covered include regression, classification, clustering and dimension reduction. We will cover both the theoretical underpinnings of kernels, as well as a series of kernels that are important in practical applications. Finally we will touch upon topics of active research, such as large-scale kernel methods and the use of kernel methods to develop theoretical foundations of deep learning models.

Evaluation: project (1/2) + final exam (1/2)

Link: Julien Mairal's webpage for the course


Level set methods and optimization algorithms with applications in imaging

3 ECTS, C. 18h, Emmanuel Maître and Charles Dapogny

This lecture will link level-set modeling of biomechanical systems (e.g. the mechanics of immersed elastic membranes) with optimal transportation theory. Interpolation algorithms based on physical knowledge of image content will be studied. Theoretical as well as practical implementation aspects will be considered.


Machine learning fundamentals

3 ECTS 18h, Massih-Reza Amini and Emilie Devijver

  • Consistency of the Empirical Risk Minimization
  • Uniform Generalization Bounds and Structural Risk Minimization
  • Unconstrained Convex Optimization
  • Binary Classification algorithms (Perceptron, Adaboost, Logistic Regression, SVM) and their link with the ERM and the SRM principles
  • Multiclass classification
  • Applications and experiments
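The perceptron from the list above admits a compact sketch on linearly separable toy data; the data and learning rate below are illustrative choices:

```python
# Perceptron training on linearly separable 2-D data with labels in {-1, +1}.

def perceptron(samples, epochs=20, lr=1.0):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in samples:
            # update only on misclassified (or boundary) points
            if y * (w[0] * x[0] + w[1] * x[1] + b) <= 0:
                w = [w[0] + lr * y * x[0], w[1] + lr * y * x[1]]
                b += lr * y
    return w, b

samples = [((2.0, 1.0), 1), ((3.0, 2.0), 1), ((-1.0, -2.0), -1), ((-2.0, -1.0), -1)]
w, b = perceptron(samples)
predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1
# all four training points are classified correctly after convergence
```

On separable data this loop stops updating once a separating hyperplane is found, which is the starting point for the ERM view of the other classifiers in the list.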

Model exploration for approximation of complex, high-dimensional problems

3 ECTS, 18h, Olivier Zahm.

Many industrial applications involve expensive computational codes that can take weeks or months to run. This is typical of weather prediction, the aerospace sector and civil engineering. There is an important (economic) incentive to reduce the computational cost by constructing a surrogate for the input-to-output relationship. Since only a small number of model runs is affordable, dedicated tools have been developed to exploit this type of “not-so-big” data set. This lecture focuses on some of the most recent advances in that direction.

Prerequisites: Basic knowledge in probability and statistics

Target skills: The goal of this lecture is to address the difficult problem of approximating high-dimensional functions, meaning functions of a large number of parameters. The first part of the lecture is devoted to interpolation techniques via polynomial functions or via Gaussian processes. In the second part, we present two methods for reducing the dimension of the input parameters space, namely the Sliced Inverse Regression and the Ridge Function Recovery.

References: Springer Handbook on UQ, R. Ghanem, D. Higdon and H. Owhadi (Eds.)

Evaluation: mid-term exam (1/2) + final practical work (1/2)


Model selection for large-scale learning

3 ECTS, C. 18h, Emilie Devijver

When estimating parameters in a statistical model, sharp calibration is important to achieve optimal performance. In this course, we will focus on the selection of estimators with respect to the data. In particular, we will consider the calibration of parameters (e.g., the regularization parameter in regularized empirical risk minimization, as in the Lasso or Ridge estimators) and model selection (where each estimator minimizes the empirical risk on a specified model, as in mixture models with varying numbers of clusters).

We will focus on the penalized empirical risk, where the penalty may be deterministic (as BIC or ICL) or estimated with data (as the slope heuristic).

Prerequisites:
Basic knowledge of probability and statistics

Target skills: Learn

  • When model selection is needed.
  • What can be proved theoretically for existing methods.
  • How those results can help in practice to choose a criterion for a specific statistical problem.
  • How the theory can serve to define new procedures of selection.

References
T. Hastie, R. Tibshirani and J. Friedman, The Elements of Statistical Learning. Data Mining, Inference, and Prediction
P. Buhlmann and S. van de Geer, Statistics for High-Dimensional Data. Methods, Theory and Applications
P. Massart, Concentration Inequalities and Model Selection


Modelling Seminar and Projects

6 ECTS, Tut. 36h, Labs 36h. Emmanuel Maître and Cécile Lalande

This lecture proposes modelling problems, which can be industrial or academic (research-oriented). Students are in charge of the project; a teacher/tutor may guide them in finding solutions to the problem. For an industrial project, they have to understand the user's needs, analyze and model the problem, derive specifications, implement a solution, and develop the communication and presentation of the proposed solution. More academic projects are linked to the courses and are constructed so that students can go deeper into a subject.

This lecture also introduces basic communication methods used in industry. This part is in French and optional.

Rules: students have to choose TWO subjects (either academic or industrial). They work in small groups on both projects with a tutor (analysis of the problem, bibliography, construction of a solution, numerical simulations, etc.). At the end, they defend their results in front of a jury and provide a short report.

See also: http://chamilo.grenoble-inp.fr/main/document/document.php?cidReq=ENSIMAGWMM9AM10 (intranet: for registered students only).

MSIAM Course list ; Semester 3 (MSIAM tracks)


Non-smooth Convex Optimization Methods

3 ECTS, C. 18h, Roland Hildebrand / Franck Iutzeler

The subject of this half-semester course is the basic mathematical tools necessary to understand convex optimization problems, as well as a variety of methods for their solution. The course is divided into two parts: a basic mathematical part (3 weeks with 2 × 1.5 hours each, presented by R. Hildebrand) and a more applied part (3 weeks with 2 × 1.5 hours each, presented by F. Iutzeler).

This course deals with:

  1. Optimization problems
    • Structure of optimization problems
    • Convex / Non-convex problems
    • Simple methods (line search, ellipsoid method)
  2. Mathematical basics
    • Topology
    • Norms
    • Affine spaces
  3. Convexity
    • Convex sets, hulls, combinations
    • Separation theorems
    • Facial structure
    • Duality
    • Cones
  4. Convex functions
    • subdifferential, optimality, etc
    • Lipschitz gradient & gradient method
    • Non-Euclidean geometry & Mirror Descent/Dual Averaging
  5. Minimizing unstructured non-smooth functions: Bundle methods
    • Lower models
    • Proximal Bundle
  6. Minimizing structured functions: splitting methods
    • Methods of Multiplier / Augmented Lagrangian
    • ADMM
    • Proximal gradient
    • Generic construction of splitting methods

Evaluation: a two-hour written exam (E1) in December. For those who do not pass, there will be another two-hour exam (E2) in session 2 in spring.


Numerical optimal transport and geometry

3 ECTS, C. 18h, Boris Thibert

Optimal transport is an important field of mathematics that was originally introduced in the 1700s by the French mathematician and engineer Gaspard Monge to solve the following very applied problem: what is the cheapest way of moving a pile of sand into a hole, knowing the cost of transporting each sand grain of the pile to each possible target location? This very applied problem gave birth to the theory of optimal transport. This theory has connections with PDEs, geometry and probability, and has been used in many fields such as computer vision, economics, non-imaging optics… In the last 15 years, this problem has been extensively studied from a computational point of view, and several efficient algorithms have been proposed.

The goal of this course is to introduce the basics of optimal transport theory and to present recent algorithms that have been shown to be very efficient. We will first focus on the discrete setting, which corresponds to transportation between discrete measures, with the entropic relaxation and the Sinkhorn algorithm. We will then study the semi-discrete setting, which corresponds to transporting a continuous measure to a discrete one. This has connections with computational geometry and can be solved efficiently with Newton algorithms. We will also present applications in different fields such as image processing and geometric optics.
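The Sinkhorn algorithm mentioned above can be sketched on a tiny discrete problem; the cost matrix, regularization strength and iteration count are illustrative choices:

```python
import math

# Sinkhorn iterations for entropically regularized optimal transport
# between two discrete measures a and b with cost matrix `cost`.

def sinkhorn(a, b, cost, eps=0.1, iters=500):
    K = [[math.exp(-c / eps) for c in row] for row in cost]   # Gibbs kernel
    u = [1.0] * len(a)
    v = [1.0] * len(b)
    for _ in range(iters):
        # alternately rescale to match the row and column marginals
        u = [a[i] / sum(K[i][j] * v[j] for j in range(len(b))) for i in range(len(a))]
        v = [b[j] / sum(K[i][j] * u[i] for i in range(len(a))) for j in range(len(b))]
    # transport plan P = diag(u) K diag(v)
    return [[u[i] * K[i][j] * v[j] for j in range(len(b))] for i in range(len(a))]

a = [0.5, 0.5]                   # source weights
b = [0.5, 0.5]                   # target weights
cost = [[0.0, 1.0], [1.0, 0.0]]  # cheap diagonal, expensive off-diagonal
P = sinkhorn(a, b, cost)
# P has marginals close to a and b, and concentrates mass on the cheap diagonal
```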

Prerequisites: none in particular.

Evaluation: written test / presentation of a research paper.


Reinforcement learning

3 ECTS, C. 18h, Nicolas Gast

Reinforcement learning is an area of machine learning in which an agent interacts repeatedly with an environment in order to maximize its cumulative reward. Compared to the classical supervised or unsupervised learning frameworks, here we are typically interested in problems in which an agent takes decisions and learns at the same time, a paradigm also known as online learning (where a typical tradeoff is the exploration-versus-exploitation dilemma). The applications of reinforcement learning span many areas of artificial intelligence. For instance, driving a car or designing a computer that plays the game of Go can be achieved by reinforcement learning techniques.

The goal of this course is to provide an overview of the main tools used to tackle these problems. The course has a strong theoretical component. We will cover the basics of online optimization (multi-armed bandit algorithms, regret minimization) and of Markov decision processes, Bellman's optimality principle, and basic learning algorithms for Markov processes. Throughout the course, we will focus on the mathematical and algorithmic aspects of the theory. We will present implementation tutorials for the course's algorithmic content.

Program:

Part I : Online Optimization

In this part, we will introduce the concept of online learning algorithms. We will define the notion of regret – which is central to online learning theory – and explain how to construct low-regret algorithms. Notions:

  • The multi-armed bandit framework.
  • Regret minimization.
  • Upper and lower regret bounds and how to achieve them.
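A minimal sketch of a regret-minimizing strategy is the classical UCB1 index on a two-armed Bernoulli bandit; the arm means, horizon and seed below are illustrative choices:

```python
import math, random

# UCB1 on a two-armed Bernoulli bandit: pull the arm with the highest
# optimism-adjusted empirical mean.
random.seed(2)
means = [0.3, 0.7]            # true success probabilities, unknown to the learner

def ucb1(horizon):
    counts = [0, 0]           # number of pulls per arm
    sums = [0.0, 0.0]         # cumulative reward per arm
    for t in range(horizon):
        if 0 in counts:       # pull each arm once first
            arm = counts.index(0)
        else:
            arm = max(range(2), key=lambda k:
                      sums[k] / counts[k] + math.sqrt(2 * math.log(t + 1) / counts[k]))
        reward = 1.0 if random.random() < means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
    return counts

counts = ucb1(5_000)
# the better arm (index 1) is pulled the vast majority of the time
```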

Part II : Markov decision processes and Reinforcement Learning

Reinforcement learning is classically framed in the context of Markov decision processes. In this part, we will define what a Markov decision process is and how it can be used to construct powerful control algorithms. Notions:

  • Markov decision processes and Bellman's optimality principle.
  • Model-free reinforcement learning algorithms (Q-learning, TD-learning, deep Q-learning).
  • Model-based reinforcement learning (UCRL2, PSRL).
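The tabular Q-learning listed above can be sketched on a toy chain environment; the environment, learning rate, discount and exploration schedule are all illustrative choices:

```python
import random

# Tabular Q-learning on a 1-D chain: states 0..4, actions 0=left, 1=right,
# reward 1 on reaching the terminal state 4.
random.seed(3)
N, GOAL = 5, 4
Q = [[0.0, 0.0] for _ in range(N)]   # Q[state][action]

def step(s, a):
    s2 = max(0, s - 1) if a == 0 else min(N - 1, s + 1)
    return s2, (1.0 if s2 == GOAL else 0.0), s2 == GOAL

for episode in range(1000):
    s = 0
    while True:
        # epsilon-greedy action selection
        if random.random() < 0.2:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda k: Q[s][k])
        s2, r, done = step(s, a)
        target = r if done else r + 0.9 * max(Q[s2])
        Q[s][a] += 0.1 * (target - Q[s][a])   # TD update
        s = s2
        if done:
            break

# the greedy policy points right in every non-terminal state
```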

Evaluation: Homework 30%, Exam 70%


Software Development Tools and Methods

3 ECTS, C. 9h, Labs 30h, Mourad Ismail

This lecture presents various useful applications, libraries and methods for software engineering related to applied mathematics. These include:

  • C++ project management, development and profiling (cmake, subversion, qtcreator, gdb, gprof, valgrind)
  • Linear algebra (Eigen)
  • User interface (Qt)
  • Data processing (XML)
  • Prototyping and interfacing using Python

Statistical methods for forecasting

3 ECTS, C. 18h, Sana Louhichi

This course deals with mathematical and statistical methods for forecasting in supervised learning. We will present several tools and ingredients to predict the future value of a variable. We shall focus on methods for regression and for classification, for independent or correlated training datasets. The course will be followed by four practical sessions with the R software.

Course outline: Introduction. Linear methods for regression. Non linear methods for regression. Supervised classification.

Key words and phrases: parametric regression, Lasso, Ridge, nonparametric trend estimation, kernel nonparametric models, smoothing parameter selection, average squared error, mean average squared error, Mallows criterion, cross validation, generalized cross validation, dependent random variables, martingale difference sequences, stochastic volatility, moment inequalities, maximal inequalities, supervised classification.

Evaluation: (1/2) project + (1/2) written exam.


Temporal and spatial point processes

3 ECTS, C. 18h, Julien Chevallier and Jean-François Coeurjolly

Point processes are a class of stochastic processes modelling random events in interaction. By event, one can think of the time a neuron activates, the time a tweet is retweeted, etc., or the location of a tree in a forest, the impact of a lightning strike, etc. This course intends to provide an introduction to stochastic models covering such applications, to discuss the main characteristics of such processes, and to present standard models (properties, simulation) and statistical procedures to infer them.

Part I (6 hours) - Temporal point processes

Definition and simulation of one-dimensional point processes (conditional/stochastic intensity); Likelihood and goodness-of-fit tests (illustration on the Poisson point process); Hawkes processes (estimation, goodness-of-fit, stationarity, ergodicity).
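As a starting point for the simulation topics above, here is a sketch of simulating a homogeneous Poisson process on [0, T] via exponential inter-arrival times; the rate and horizon are illustrative choices:

```python
import random

# Homogeneous Poisson process on [0, T]: successive waiting times are
# independent Exp(rate) random variables.
random.seed(4)

def poisson_process(rate, T):
    times, t = [], 0.0
    while True:
        t += random.expovariate(rate)   # Exp(rate) inter-arrival time
        if t > T:
            return times
        times.append(t)

events = poisson_process(rate=2.0, T=1000.0)
# the empirical rate len(events) / T is close to 2
```

Inhomogeneous and Hawkes processes are typically simulated by thinning this construction against a (conditional) intensity.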

Part II (12 hours) - Spatial point processes

Definition and characterization of a spatial point process, intensity functions and conditional intensity functions; Poisson point process; Intensity estimation and summary statistics; Models for spatial point processes (Cox, determinantal and Gibbs point processes): characterization, simulation, statistical inference and validation. Keywords: stochastic processes; modelling of dependence; simulation and statistical inference; Poisson point process.

Evaluation: One or two homework(s) (theoretical and practical) and one final exam.


Wavelets and applications

3 ECTS, C. 18h, Kevin Posilano

Wavelets are basis functions widely used in a large variety of fields: signal and image processing, numerical schemes for partial differential equations, scientific visualization. This course will present the construction and practical use of the wavelet transform, and its applications to image processing: continuous wavelet transform, fast wavelet transform (FWT), compression (the JPEG2000 format), denoising, inverse problems. The theory will be illustrated by several applications in medical imaging (segmentation, local tomography, …).
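One level of the Haar transform, the simplest instance of the fast wavelet transform (FWT) mentioned above, can be sketched as follows; the signal is illustrative and the 1/sqrt(2) normalization is the standard orthonormal convention:

```python
import math

# One level of the Haar fast wavelet transform and its exact inverse:
# pairwise averages (approximation) and differences (detail), orthonormally scaled.

def haar_step(signal):
    s = 1 / math.sqrt(2)
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out += [s * (a + d), s * (a - d)]
    return out

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
a, d = haar_step(x)
x_rec = haar_inverse(a, d)
# x_rec reconstructs x exactly (up to floating-point rounding)
```

Recursing on the approximation coefficients gives the full multilevel FWT; thresholding the detail coefficients is the basis of wavelet denoising and compression.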

Evaluation: during the course (“contrôle continu”).

lectures.txt · Last modified: 2021/06/25 16:10 by etore