Recent court documents from the remedies ruling in Google's ongoing monopoly case have shed light on a previously little-known internal technology called "FastSearch." This proprietary search variant, distinct from Google's main search engine, is used primarily to ground the company's Gemini AI models. The detail came to public attention after digital marketing expert Marie Haynes highlighted it from the judge's decision on X (formerly Twitter).
FastSearch operates on "RankEmbed signals" and generates abbreviated, ranked web results. It prioritizes speed by retrieving far fewer documents than standard Google Search; this makes it significantly faster, but the output is, as the court document puts it, of "lower quality than Search's fully ranked web results."
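The ruling does not go into more detail, but the trade-off it describes resembles a familiar two-stage retrieval pattern: a cheap first-pass ranking over a small candidate pool versus a full ranking over a much larger one. The sketch below is purely illustrative and not Google's implementation; every name (Doc, embed_score, fast_search, full_search) and the scoring logic are invented stand-ins.

```python
from dataclasses import dataclass

@dataclass
class Doc:
    url: str
    embed_score: float      # cheap similarity signal (stand-in for "RankEmbed signals")
    full_rank_score: float  # expensive, fully ranked quality signal

def fast_search(index: list[Doc], depth: int = 20) -> list[Doc]:
    """Illustrative fast path: rank a small pool on the cheap signal only,
    skipping the expensive full ranking stage. Faster, lower quality."""
    return sorted(index, key=lambda d: d.embed_score, reverse=True)[:depth]

def full_search(index: list[Doc], depth: int = 1000) -> list[Doc]:
    """Illustrative full path: pull a much larger pool, then re-rank it
    with the complete scoring stack. Slower, higher quality."""
    pool = sorted(index, key=lambda d: d.embed_score, reverse=True)[:depth]
    return sorted(pool, key=lambda d: d.full_rank_score, reverse=True)
```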
The primary application of FastSearch is to ground responses generated by Google's Gemini models, checking them against web results for factual accuracy and relevance. Although its results are lower quality than those of general web search, they are deemed "good enough for grounding" purposes, making this internal tool a key piece of the infrastructure behind the reliability of Google's AI products.
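How exactly Gemini consumes those results is not spelled out in the ruling. As a rough illustration of what grounding typically means in practice, the hypothetical helper below (reusing the fast_search sketch above) attaches the abbreviated results to a draft answer so the model can check its claims against them; the function name and prompt wording are invented.

```python
def ground_gemini_answer(query: str, draft_answer: str, index: list[Doc]) -> str:
    """Hypothetical grounding step: fetch a handful of fast, abbreviated
    results and hand them back to the model as evidence for revision."""
    evidence = fast_search(index, depth=10)
    sources = "\n".join(f"- {doc.url}" for doc in evidence)
    return (
        f"Question: {query}\n"
        f"Draft answer: {draft_answer}\n"
        f"Web evidence:\n{sources}\n"
        "Revise the draft so every claim is consistent with the evidence, "
        "and cite the supporting URLs."
    )
```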
Access to FastSearch is tightly controlled. Google does not expose the technology directly to third parties through an API. Instead, it is integrated into Vertex AI, a Google Cloud offering, where external customers can use it to ground their own models on Google Search results or other data sources. Even there, Vertex customers receive only the processed information derived from FastSearch results, not the raw ranked web results themselves, a restriction Google imposes to protect its intellectual property.
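For outside developers, the public-facing equivalent is Vertex AI's "Grounding with Google Search" feature. The snippet below follows the pattern documented for the Vertex AI Python SDK; the project ID, region, and model name are placeholders, and the exact SDK surface may vary by version. Note that the response exposes a generated answer plus grounding metadata (cited sources, search queries), not the underlying ranked results.

```python
import vertexai
from vertexai.generative_models import GenerativeModel, Tool, grounding

# Placeholders: replace with your own project and region.
vertexai.init(project="your-project-id", location="us-central1")

model = GenerativeModel("gemini-1.5-flash")

# Grounding with Google Search: Vertex returns a processed, cited answer,
# not the raw ranked web results.
search_tool = Tool.from_google_search_retrieval(grounding.GoogleSearchRetrieval())

response = model.generate_content(
    "What did the court say about FastSearch?",
    tools=[search_tool],
)

print(response.text)
# Cited sources and search queries are attached as metadata on the candidate.
print(response.candidates[0].grounding_metadata)
```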