Gemini 2.5 Pro is INSANE Good! - MCP Server Build Tutorial

In the video, the creator showcases the capabilities of Google’s Gemini 2.5 model by guiding viewers through the process of building a Cursor MCP server, integrating an OpenAI vector store for documentation storage. They demonstrate setting up the vector database, generating server prompts, and successfully querying the server, while expressing enthusiasm for Gemini 2.5’s performance and teasing future content.

In the video, the creator discusses the release of Google’s Gemini 2.5 model and expresses excitement about using it to build a Cursor MCP server. The tutorial aims to guide viewers through the process of setting up an MCP server tailored to their specific needs, particularly focusing on integrating an OpenAI vector store to hold documentation. The creator uses a Three.js documentation vector store as the running example and shares their typical workflow for building these vector stores and connecting them to an MCP server.

The video begins with the creator gathering the documentation needed for the project, including material from Cursor, the MCP TypeScript SDK, and OpenAI. They emphasize the importance of giving Gemini the right context to effectively build the MCP server. The creator also promotes Brilliant.org, a platform offering interactive courses on AI, as a resource for viewers interested in understanding how AI models work.

After preparing the documentation, the creator moves on to setting up the vector database. They demonstrate how to create a vector store on OpenAI’s platform, specifically for the Three.js documentation. The creator uploads the relevant files and configures the chunk size and overlap settings for the vector store, ensuring it is optimized for use in the MCP server.
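The video does this through OpenAI’s web dashboard, but the same setup can be scripted. Below is a minimal sketch using the OpenAI Node SDK; the store name, file path, and chunk settings are illustrative (the token sizes mirror OpenAI’s documented defaults), and on older SDK versions the vector store methods live under `openai.beta.vectorStores` instead.

```typescript
import fs from "node:fs";
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function buildVectorStore(filePaths: string[]): Promise<string> {
  // Create an empty vector store for the Three.js docs (name is illustrative).
  const store = await openai.vectorStores.create({ name: "threejs-docs" });

  for (const path of filePaths) {
    // Upload the raw file, then attach it to the store with an explicit
    // static chunking strategy (chunk size and overlap are in tokens).
    const file = await openai.files.create({
      file: fs.createReadStream(path),
      purpose: "assistants",
    });
    await openai.vectorStores.files.create(store.id, {
      file_id: file.id,
      chunking_strategy: {
        type: "static",
        static: { max_chunk_size_tokens: 800, chunk_overlap_tokens: 400 },
      },
    });
  }
  return store.id;
}

buildVectorStore(["./docs/three/fog.md"]).then((id) =>
  console.log(`Vector store ready: ${id}`),
);
```

Overlap must stay at or below half the chunk size; larger overlap tends to help retrieval on reference-style documentation at the cost of more stored tokens.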

Once the vector store is established, the creator returns to Google AI Studio to upload the prepared files and generate a prompt for creating the MCP server. They instruct Gemini 2.5 to create a Cursor MCP server in TypeScript that uses OpenAI’s file search vector store feature. The model quickly produces a step-by-step guide, which the creator follows to set up the server, including creating the necessary files and installing dependencies.
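The video doesn’t show the generated code in full, but a server of this shape might look like the sketch below, built on the official `@modelcontextprotocol/sdk` package. The tool name, description, and the `THREEJS_VECTOR_STORE_ID` environment variable are assumptions, and the direct `vectorStores.search` call requires a recent OpenAI SDK (older versions only expose file search through the Assistants API).

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import OpenAI from "openai";
import { z } from "zod";

const openai = new OpenAI();
const server = new McpServer({ name: "threejs-docs", version: "1.0.0" });

// Expose one tool: semantic search over the Three.js docs vector store.
server.tool(
  "search_threejs_docs",
  "Search the Three.js documentation vector store",
  { query: z.string().describe("Natural-language question about Three.js") },
  async ({ query }) => {
    const results = await openai.vectorStores.search(
      process.env.THREEJS_VECTOR_STORE_ID!, // illustrative env var
      { query },
    );
    // Flatten the matched chunks into one text payload for the client.
    const text = results.data
      .map((hit) => hit.content.map((part) => part.text).join("\n"))
      .join("\n---\n");
    return { content: [{ type: "text" as const, text }] };
  },
);

// Cursor launches and talks to the server over stdio.
await server.connect(new StdioServerTransport());
```

Stdio transport is the natural choice here because Cursor spawns local MCP servers as child processes rather than connecting over HTTP.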

Finally, the creator tests the MCP server by querying the vector store for information on creating fog in Three.js. The results come back successfully, demonstrating that the setup works. The creator expresses satisfaction with Gemini 2.5’s performance, highlighting its speed and capabilities compared to previous models. They conclude by encouraging viewers to explore MCP servers further and tease upcoming content related to OpenAI’s adoption of MCP. Additionally, they mention an upcoming GPU giveaway on their channel.
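For readers who want to reproduce that final test, a local MCP server is registered with Cursor through an `mcp.json` entry along these lines; the server name, path, and placeholder values are illustrative.

```json
{
  "mcpServers": {
    "threejs-docs": {
      "command": "node",
      "args": ["/absolute/path/to/threejs-docs-server/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "THREEJS_VECTOR_STORE_ID": "vs_..."
      }
    }
  }
}
```

Once the entry is saved, Cursor lists the server’s tools and the agent can call them, which is how a query like “how do I create fog in Three.js?” ends up hitting the vector store.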