Ali Salman

Machine Learning Engineer

Hi! I am a Machine Learning Engineer at Bird.i (a subsidiary of Zonda), where we train and deploy deep learning models to track the different construction cycles of large residential projects using satellite images. I graduated from Grenoble INP - ENSIMAG with a Master's degree in computer science, specializing in computer vision and robotics. I am broadly interested in machine learning, especially when applied to computer vision.

In 2019, I interned at Uber ATG in San Francisco, where I worked as a software engineer at the intersection of machine learning and prediction for autonomous driving. In 2017, I spent the summer as an intern in the Biorobotics Lab at Carnegie Mellon University, where I worked with Professor Howie Choset on deep reinforcement learning for robot navigation and multi-agent area coverage.

Interests

  • Machine Learning
  • Deep Learning
  • Computer Vision
  • Robotics

Education

  • MSc in Computer Science, 2020

    Grenoble INP - ENSIMAG

  • BE in Mechanical Engineering, 2018

    Lebanese University

Experience


Machine Learning Engineer

Bird.i | Zonda

Jan 2021 – Present | Glasgow, UK

Graduate Research Intern

Inria

Feb 2020 – Jul 2020 | Grenoble, France

Software Engineering Intern

Uber ATG - Self-driving Cars

Jun 2019 – Sep 2019 | San Francisco, USA

Graduate Research Intern

Grenoble Informatics Lab

Feb 2019 – Jun 2019 | Grenoble, France

Summer Intern

The Robotics Institute - Carnegie Mellon University

Jul 2017 – Sep 2017 | Pittsburgh, USA

Projects

3D Graphics

Created a space-themed scene using OpenGL.

Autonomous Mobile Robot

Built a TurtleBot-like robot from scratch (using an Arduino) that can autonomously navigate a predefined map and follow a person based on the color of their shirt.

Deep Reinforcement Learning for Robot Navigation

Applied reinforcement learning to LiDAR-based robot navigation in a simulated maze.

Hand Gesture Recognition

Integrated hand gesture recognition on a hexapod robot using the Intel Euclid sensor.

Nachos

Implemented the main OS components and mechanisms on top of Nachos.

Object Detection and Human Posture Recognition with Pepper Robot

Integrated deep learning techniques for object detection and human posture recognition on the Pepper robot.