
30,000 NVIDIA Engineers Use Generative AI for 3x Higher Code Output
The company that started the entire wave of AI infrastructure and development is now enjoying the fruits of its work. NVIDIA has deployed generative AI tools across the company to an astonishing 30,000 engineers. Through a partnership with San Francisco-based Anysphere Inc., NVIDIA is getting a customized version of Cursor, Anysphere's AI-focused integrated development environment. This matters because NVIDIA's engineers are now reportedly producing as much as three times as much code as they did under the previous development pipeline, meaning many of us are probably already using NVIDIA products or services designed by AI under human guidance.
NVIDIA offers a range of mission-critical products that cannot afford to be as error-prone as AI-generated code tends to be. These include GPU drivers that support everything from basic gaming to large-scale AI training and inference operations. The company is likely enforcing strict guidelines for its newly generated code, with an extensive battery of tests required before any of it reaches production. This isn't the first time NVIDIA has used AI-assisted workflows in its products: a dedicated supercomputer has been continuously refining DLSS (Deep Learning Super Sampling) for several years, and some chip designs have been optimized with the company's internal AI tools.
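The article does not describe NVIDIA's actual pipeline, but the general idea of gating generated code behind automated tests can be sketched as follows. This is a minimal, hypothetical illustration: the function names, the toy `clamp` module, and its test file are all invented for the example, and a real pipeline would involve far more than a single unit-test run.

```python
import subprocess
import sys
import tempfile
import textwrap
from pathlib import Path

def gate_generated_code(module_src: str, test_src: str) -> bool:
    """Accept a candidate (e.g. AI-generated) module only if its tests pass."""
    with tempfile.TemporaryDirectory() as tmp:
        tmp_path = Path(tmp)
        (tmp_path / "candidate.py").write_text(module_src)
        (tmp_path / "test_candidate.py").write_text(test_src)
        # Run the tests in a subprocess; a non-zero exit code rejects the patch.
        result = subprocess.run(
            [sys.executable, "-m", "unittest", "test_candidate"],
            cwd=tmp_path, capture_output=True,
        )
        return result.returncode == 0

# Hypothetical generated snippet plus a human-written test for it.
module = "def clamp(x, lo, hi):\n    return max(lo, min(x, hi))\n"
tests = textwrap.dedent("""
    import unittest
    from candidate import clamp

    class TestClamp(unittest.TestCase):
        def test_bounds(self):
            self.assertEqual(clamp(5, 0, 3), 3)
            self.assertEqual(clamp(-1, 0, 3), 0)
            self.assertEqual(clamp(2, 0, 3), 2)

    if __name__ == "__main__":
        unittest.main()
""")

print(gate_generated_code(module, tests))  # True: tests pass, code is accepted
```

The key property of such a gate is that the generated code never has to be trusted directly: it only ships if independently written checks succeed.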
Not every use of AI has to be a negative. NVIDIA and Cursor report that the bug rate stayed flat even as NVIDIA produced three times more code, which suggests the company-specific safeguards are working. AI-assisted development has already brought us DLSS 4 and GPU dies 25% smaller than comparable industry-standard designs. Applied to interesting problems like the ones NVIDIA has demonstrated, the technology can benefit both gamers and the company. We will have to wait and see what new technologies (or even bugs) come out of the new software development pipeline.
