$$ \newcommand{\dint}{\mathrm{d}} \newcommand{\vphi}{\boldsymbol{\phi}} \newcommand{\vpi}{\boldsymbol{\pi}} \newcommand{\vpsi}{\boldsymbol{\psi}} \newcommand{\vomg}{\boldsymbol{\omega}} \newcommand{\vsigma}{\boldsymbol{\sigma}} \newcommand{\vzeta}{\boldsymbol{\zeta}} \renewcommand{\vx}{\mathbf{x}} \renewcommand{\vy}{\mathbf{y}} \renewcommand{\vz}{\mathbf{z}} \renewcommand{\vh}{\mathbf{h}} \renewcommand{\b}{\mathbf} \renewcommand{\vec}{\mathrm{vec}} \newcommand{\vecemph}{\mathrm{vec}} \newcommand{\mvn}{\mathcal{MN}} \newcommand{\G}{\mathcal{G}} \newcommand{\M}{\mathcal{M}} \newcommand{\N}{\mathcal{N}} \newcommand{\S}{\mathcal{S}} \newcommand{\diag}[1]{\mathrm{diag}(#1)} \newcommand{\diagemph}[1]{\mathrm{diag}(#1)} \newcommand{\tr}[1]{\text{tr}(#1)} \renewcommand{\C}{\mathbb{C}} \renewcommand{\R}{\mathbb{R}} \renewcommand{\E}{\mathbb{E}} \newcommand{\D}{\mathcal{D}} \newcommand{\inner}[1]{\langle #1 \rangle} \newcommand{\innerbig}[1]{\left \langle #1 \right \rangle} \newcommand{\abs}[1]{\lvert #1 \rvert} \newcommand{\norm}[1]{\lVert #1 \rVert} \newcommand{\two}{\mathrm{II}} \newcommand{\GL}{\mathrm{GL}} \newcommand{\Id}{\mathrm{Id}} \newcommand{\grad}[1]{\mathrm{grad} \, #1} \newcommand{\gradat}[2]{\mathrm{grad} \, #1 \, \vert_{#2}} \newcommand{\Hess}[1]{\mathrm{Hess} \, #1} \newcommand{\T}{\text{T}} \newcommand{\dim}[1]{\mathrm{dim} \, #1} \newcommand{\partder}[2]{\frac{\partial #1}{\partial #2}} \newcommand{\rank}[1]{\mathrm{rank} \, #1} $$

Machine Learning with Python



Neural Networks I

Background concerning the definition of Neural Networks.

Neural Networks II

In this part, I propose to interactively test different types of neural networks on classical datasets.

Unsupervised Classification

Background for unsupervised classification of data.

Neural Network with Keras

We give some background on Keras, a well-known library for building neural networks.

Reinforcement Learning - Part 1

The idea that we learn by interacting with our environment is probably the first to occur to us when we think about the nature of learning. Whether we are learning to drive a car or to hold a conversation, we are acutely aware of how our environment responds to what we do, and we seek to influence what happens through our behavior. Learning from interaction is a foundational idea underlying nearly all theories of learning and intelligence. In this first part we address the finite Markov decision process and the Bellman equation.
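For reference, the Bellman equation mentioned above can be stated for the state-value function $v_\pi$ of a policy $\pi$ in the standard finite-MDP formulation, where $p(s', r \mid s, a)$ is the environment's transition dynamics and $\gamma$ the discount factor:

$$ v_\pi(s) = \sum_a \pi(a \mid s) \sum_{s', r} p(s', r \mid s, a) \left[ r + \gamma \, v_\pi(s') \right] $$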

Reinforcement Learning - Part 2

In this part we assume that the agent has no information about its environment. Consequently, the agent knows nothing about the properties of the finite Markov decision process modeling that environment. To handle this, we introduce the Q-learning methodology. We will use Keras to build a neural network that approximates the Q-values.
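As a sketch of the tabular form of this idea (before any neural network is involved), the Q-learning update $Q(s,a) \leftarrow Q(s,a) + \alpha \left[ r + \gamma \max_{a'} Q(s',a') - Q(s,a) \right]$ can be written out on a toy environment. The chain environment and hyperparameters below are hypothetical, chosen only for illustration:

```python
import numpy as np

# Tabular Q-learning on a toy 1-D chain: states 0..4, action 0 moves left,
# action 1 moves right; reaching state 4 yields reward 1 and ends the episode.
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.5, 0.9, 0.1
rng = np.random.default_rng(0)

def step(s, a):
    s_next = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
    done = s_next == n_states - 1
    return s_next, (1.0 if done else 0.0), done

for episode in range(500):
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection
        a = rng.integers(n_actions) if rng.random() < epsilon else int(np.argmax(Q[s]))
        s_next, r, done = step(s, a)
        # Q-learning update: move Q[s, a] toward the bootstrapped target
        target = r + gamma * np.max(Q[s_next]) * (not done)
        Q[s, a] += alpha * (target - Q[s, a])
        s = s_next

# The greedy policy should move right in every non-terminal state.
print(np.argmax(Q[:4], axis=1))  # → [1 1 1 1]
```

A neural network replaces the table `Q` when the state space is too large to enumerate, which is where Keras comes in.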

Variational Autoencoder: Implementation with Keras

The Variational Autoencoder is viewed as a Bayesian inference problem: it models the underlying probability distribution of the data.
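Two ingredients of that inference problem can be sketched in plain NumPy (the post itself uses Keras; the function names here are hypothetical): the reparameterization trick, and the closed-form KL divergence between the diagonal-Gaussian posterior and a standard normal prior:

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """Sample z ~ N(mu, diag(exp(log_var))) via z = mu + sigma * eps."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    """KL( N(mu, diag(sigma^2)) || N(0, I) ), closed form, per sample."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

mu = np.zeros((1, 2))
log_var = np.zeros((1, 2))
# When mu = 0 and sigma = 1 the posterior equals the prior, so KL = 0.
print(kl_divergence(mu, log_var))  # → [0.]
```

In a Keras implementation, `reparameterize` becomes a sampling layer and `kl_divergence` is added to the reconstruction loss to form the negative ELBO.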

Homework: Audio Classification

Starting from an audio dataset of different speakers, we propose to create two neural networks: one to recognize the speaker, and one to recognize the digit that the speaker says.

Homework: Grasping Regression

We propose to create a neural network that will help a robot grasp an object. For this, we will use the Cornell grasping dataset.

Optimization and Gradient Descent on Riemannian Manifolds

One of the most ubiquitous applications of geometry is optimization. In this article we discuss the familiar optimization problem on Euclidean spaces, focusing on the gradient descent method, and then generalize it to Riemannian manifolds.
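The generalization can be sketched on the simplest non-Euclidean example, the unit sphere: the Euclidean gradient is projected onto the tangent space at the current point (giving the Riemannian gradient), a step is taken in that direction, and a retraction (here, renormalization) maps the result back onto the manifold. The objective below, the quadratic form $f(x) = x^\T A x$ restricted to the sphere, is a hypothetical example whose minimizer is an eigenvector for the smallest eigenvalue of $A$:

```python
import numpy as np

def riemannian_gd_sphere(A, x0, lr=0.1, n_iters=500):
    """Riemannian gradient descent for f(x) = x^T A x on the unit sphere."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(n_iters):
        euclid_grad = 2.0 * A @ x
        # Riemannian gradient: project onto the tangent space at x
        riem_grad = euclid_grad - (x @ euclid_grad) * x
        # Retraction: step in the tangent direction, then map back to the sphere
        x = x - lr * riem_grad
        x = x / np.linalg.norm(x)
    return x

A = np.diag([3.0, 2.0, 1.0])
x_min = riemannian_gd_sphere(A, np.array([1.0, 1.0, 1.0]))
# Converges (up to sign) to the eigenvector for the smallest eigenvalue.
print(np.round(np.abs(x_min), 3))  # → [0. 0. 1.]
```

On a general Riemannian manifold the renormalization step is replaced by the exponential map or another retraction, but the structure of the iteration is the same.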


Riemannian Geometry

Recently I have been studying differential geometry, including Riemannian geometry. While studying this subject, I found the arguments from this point of view to be very elegant, which motivated me to study geometry in more depth. This writing is a collection of small notes (largely from Lee's Introduction to Smooth Manifolds and Introduction to Riemannian Manifolds) that I find useful as a reference on the subject.