What is a long context window? Google DeepMind engineers explain

Gemini 1.5 is Google DeepMind's next-generation model, bringing significant improvements in speed and efficiency. It also features a long context window that lets the model process up to 1 million tokens at once. This is a major step forward for AI models, because it allows them to recall and reason over information from earlier in an interaction, much as people remember earlier parts of a conversation. In research, the model has been tested successfully with up to 10 million tokens.

The long context window opens up new ways of working with the model, such as summarizing longer documents, analyzing larger volumes of code, and even translating rare languages. Gemini 1.5 Pro ships with a standard 128K-token context window, while a limited group of developers and enterprise customers can try a context window of up to 1 million tokens through a private preview.

The team is also working to expand the context window further and to improve the underlying architectures and hardware, and they are excited to see how developers and the broader community will use these new capabilities in creative ways.
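
As a rough illustration of what these token counts mean in practice, here is a minimal Python sketch that estimates whether a document fits within a given context window. The roughly-four-characters-per-token heuristic and the estimate_tokens and fits_in_context helpers are illustrative assumptions, not part of any Gemini API; a real application would use the model's own tokenizer to count tokens exactly.

```python
# Illustrative sketch only: estimates whether a piece of text fits in a given
# context window, using the common rule of thumb of roughly 4 characters per
# token for English text. Constants and helpers here are assumptions for
# illustration, not part of Gemini's API.

STANDARD_WINDOW = 128_000    # Gemini 1.5 Pro standard context window (tokens)
PREVIEW_WINDOW = 1_000_000   # private-preview context window (tokens)


def estimate_tokens(text: str) -> int:
    """Rough token estimate: about 4 characters per token."""
    return max(1, len(text) // 4)


def fits_in_context(text: str, context_window: int) -> bool:
    """Check whether the estimated token count fits inside the window."""
    return estimate_tokens(text) <= context_window


if __name__ == "__main__":
    # A document of 300,000 five-character words is about 1.5 million
    # characters, or roughly 375,000 estimated tokens.
    document = "word " * 300_000
    print(f"Estimated tokens: {estimate_tokens(document):,}")
    print(f"Fits in the 128K window: {fits_in_context(document, STANDARD_WINDOW)}")
    print(f"Fits in the 1M window:   {fits_in_context(document, PREVIEW_WINDOW)}")
```

Under these assumptions, a document of roughly 375,000 tokens exceeds the standard 128K window but fits comfortably within the 1-million-token window available in private preview.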
