AI Switching Costs: A Nuanced Landscape Compared to Web Platforms

Bojan Tunguz, a prominent voice in the technology sphere, recently posited that switching costs for artificial intelligence (AI) might be "even lower for the AI than they were for search engines and browsers," a claim that invites scrutiny of how user lock-in is evolving in the digital age. The assertion implies a potential paradigm shift in which users can move between AI services more easily than they ever could between established web platforms.

The landscape of AI switching costs, however, presents a complex picture. While some aspects of AI adoption might indeed offer lower barriers to change, significant lock-in mechanisms are emerging, particularly within the foundational layers of the AI supply chain. According to a SUERF policy brief, companies like Nvidia maintain dominance in hardware due to proprietary platforms like CUDA, creating high switching costs for users. Similarly, the cloud computing layer, dominated by a few major providers, also exhibits substantial barriers to switching, including egress fees and proprietary software restrictions, which reinforce customer lock-in.

The OECD highlights that control over essential inputs such as compute infrastructure, proprietary data, and model interfaces (APIs) can lead to "ecosystem lock-in," making users dependent on specific platforms. This is particularly true for vertically integrated firms that leverage their market power in one layer of the AI stack to strengthen their position in others. Such practices can create systemic lock-in, reducing the ability to multi-home and raising switching costs for both users and developers.

Conversely, the user-facing AI applications layer appears more dynamic, with a wide range of tools and services built on foundation models. The sheer number of foundation models and the rapid evolution of the market suggest room for competition and lower switching costs at the application level. Even here, however, "winner-takes-all" dynamics can emerge, as seen with ChatGPT capturing a significant share of the market and underscoring the advantage of early entry. At the same time, the ability of firms to fine-tune semi-open or open-weight models with proprietary data suggests that customization and differentiation could support innovation and potentially reduce switching costs for specialized applications.

The comparison to search engines and browsers is instructive. Historically, users have faced moderate switching costs with these platforms, stemming from ingrained habits, data portability challenges, and ecosystem integration. For AI, the open question is whether the technological architecture and market structure will ultimately foster greater interoperability and user choice, or whether the concentration of power in foundational AI components will produce new, potentially higher, forms of lock-in.