AI Developer Technology Engineer Intern - 2025

NVIDIA


Beijing, China

#JR1989499

Position summary

As human creativity and intelligence are linked, computer graphics and artificial intelligence come together in our architecture: two modes of the human brain, two modes of the GPU. This may explain why NVIDIA GPUs are used so broadly for deep learning, and why NVIDIA is increasingly known as "the AI computing company." Come join a team of world-class computer scientists in our Compute Developer Technology team as an AI Developer Technology Engineer.

What you will be doing:

  • You will develop state-of-the-art techniques in deep learning, graph analytics, machine learning, and data analytics, and perform in-depth analysis and optimization to ensure the best possible performance on current- and next-generation GPU architectures

  • You will work directly with key customers to deliver the best AI solutions on GPUs

  • You will collaborate closely with the architecture, research, libraries, tools, and system software teams to influence the design of next-generation architectures, software platforms, and programming models

What we need to see:

  • Pursuing an MS or PhD at a leading university in an engineering or computer science-related discipline

  • Strong knowledge of C/C++, software design, programming techniques, and AI algorithms

  • Experience with parallel programming, ideally CUDA C/C++

  • Good communication and organization skills, with a logical approach to problem solving, time management, and task prioritization

Preferred internship duration: 6+ months

NVIDIA is committed to fostering a diverse work environment and proud to be an equal opportunity employer. As we highly value diversity in our current and future employees, we do not discriminate (including in our hiring and promotion practices) on the basis of race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, disability status or any other characteristic protected by law.

#deeplearning