First-Mover Advantage and Switching Costs: The AI Case

Oct 16, 2022

From a marketing perspective, Nvidia's AI business is another example of how a first mover can capitalize on its position to create switching costs that keep customers from migrating to competitors.

Jensen Huang, Nvidia's CEO, who bills the company as the inventor of the GPU, introduced "accelerated computing" to the world: a computing model in which the CPU takes the complex (but few) instructions while the GPU takes the simple (but far more numerous) ones. It changed how we think about executing a task. We used to treat the CPU as the sole processor for all instructions; Nvidia opened our eyes.
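To make that division of labor concrete, here is a toy Python sketch of the idea (this is an illustration of the concept only, not Nvidia's actual stack): the "CPU" part makes one branchy decision, while a pool of workers stands in for GPU cores applying one trivial operation to every element.

```python
from concurrent.futures import ThreadPoolExecutor

def host_side_decision(data):
    # "CPU" role: few but complex instructions (branchy control logic).
    return 2 if len(data) > 100 else 3

def simple_op(x, k):
    # "GPU" role: one trivially simple instruction, repeated per element.
    return x * k

def accelerated_compute(data):
    k = host_side_decision(data)        # CPU decides once
    with ThreadPoolExecutor() as pool:  # worker pool stands in for GPU cores
        return list(pool.map(lambda x: simple_op(x, k), data))
```

The point of the sketch: the expensive part is not any single operation but the sheer count of identical simple ones, which is exactly the workload a GPU's many cores are built for.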

For cloud computing and AI, the GPU plays a more vital role than the CPU because the work is simple but must be repeated millions of times. In short, with its GPUs, Nvidia dominates the data center, the facility that brings cloud computing into the real world. It is not surprising, then, that Nvidia became the GPU supplier for cloud service providers like Microsoft (Azure) and Amazon (AWS).

The cloud provider (Microsoft, Amazon) writes its programs against Nvidia's software platform (CUDA), and this is where the switching cost begins. Migrating to a supplier other than Nvidia becomes almost impossible, since all of that code would have to be rewritten from scratch. Painful.

Nvidia's circumstances resemble Microsoft's in the 90s. Microsoft created an OS (Windows) and Office, gained popularity through Windows, and then built enormous inertia thanks to a customer base of millions. If your work was in docx, pptx, or xlsx format, it was almost impossible to use software other than Office at that time, to say nothing of the features it offered.

If your cloud service has been written in Nvidia's language, rewriting it for another platform is a daunting task. This is where Nvidia leads the way Microsoft once did.

This is part of the Nvidia stock analysis, which you can access at: