Founded by three visionaries in a West Coast fast-food restaurant, Nvidia has risen to the top of the world's stock market rankings, at times surpassing giants such as Amazon and Alphabet (Google's parent company). Its market capitalization of $1.82 trillion, just behind Microsoft ($3.11 trillion) and Apple ($2.9 trillion), testifies to its dazzling momentum.
Originally a specialist in graphics chips (GPUs) for computers and game consoles, Nvidia has rapidly broadened its scope to include artificial intelligence. While its name is familiar to gamers and technology enthusiasts, its contribution to AI remains less well known to the general public.
Nvidia's chips don't just improve video game graphics. Their computing power is now used for complex generative AI tasks, enabling the creation of new content such as text, images, audio, and video.
These chips have become essential components of the data centers run by GAFAM (Google, Amazon, Facebook, Apple, and Microsoft), and Nvidia is making a lot of money from them: in 2023, the majority of its $26.97 billion in revenue came from this activity.
Nvidia's expansion in the field of AI is in line with the global trend towards an increasingly digitized and automated world. Companies everywhere are now looking for AI-based solutions to optimize their operations, increase efficiency, and improve their competitiveness in the global marketplace.
Another decisive factor for Nvidia is the lack of competition in the market. Nvidia is not the only US company producing powerful GPUs; competitors such as AMD, Intel, and Apple offer their own product ranges, but no major European rival stands in its way. The figure speaks for itself: Nvidia currently holds 83% of the global graphics chip market.
It is an enviable position, with real influence on the tech market, and the company's enthusiastic investors were not mistaken: since the end of 2022, when ChatGPT launched and the AI craze began, Nvidia's share price has roughly quadrupled.
One thing is certain: there is no end in sight to the trend towards supercomputing.