Google launches 2 million token context window for Gemini 1.5 Pro


Google has announced that developers now have access to a 2 million token context window for Gemini 1.5 Pro. For comparison, GPT-4o has a 128k context window.

This context window size was first announced at Google I/O and was available only through a waitlist, but now everyone has access.

Longer context windows can lead to higher costs, so Google also announced support for context caching in the Gemini API for Gemini 1.5 Pro and 1.5 Flash. This allows context to be stored for use in later queries, which reduces costs for tasks that reuse tokens across prompts.
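As a rough illustration, here is a minimal sketch of context caching with the Python google-generativeai SDK. The class and parameter names (caching.CachedContent.create, from_cached_content, ttl) reflect the SDK at the time and may differ by version, and the API key and long_report.txt file are placeholders.

```python
import datetime

import google.generativeai as genai
from google.generativeai import caching

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

# Cache a large document once so later queries can reuse it
# without re-sending (and re-billing) the same input tokens.
# Note: cached content must meet a minimum token count.
with open("long_report.txt") as f:  # hypothetical large document
    document = f.read()

cache = caching.CachedContent.create(
    model="models/gemini-1.5-pro-001",
    system_instruction="Answer questions using the cached report.",
    contents=[document],
    ttl=datetime.timedelta(minutes=30),  # how long the cache is kept
)

# Build a model instance that reads from the cached context on every call.
model = genai.GenerativeModel.from_cached_content(cached_content=cache)

resp = model.generate_content("Summarize the key findings of the report.")
print(resp.text)
```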

Additionally, Google has announced that code execution is now enabled for both Gemini 1.5 Pro and 1.5 Flash. This feature allows the model to generate and run Python code and then iterate on it until the desired result is achieved.
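A minimal sketch of how code execution can be switched on through the same SDK; the tools="code_execution" shorthand and the model name are assumptions based on the Gemini API documentation at the time and may vary by version.

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

# Enabling the code execution tool lets the model write and run Python
# in Google's sandbox and iterate on the output before answering.
model = genai.GenerativeModel(
    model_name="gemini-1.5-flash",
    tools="code_execution",  # assumed shorthand for the code execution tool
)

resp = model.generate_content(
    "Write and run Python code to compute the sum of the first 50 primes."
)
print(resp.text)  # includes the generated code and its execution output
```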

According to Google, the execution sandbox isn't connected to the internet, comes with a few numerical libraries pre-installed, and bills developers based on the output tokens from the model.

And finally, Gemma 2 is now available in Google AI Studio, and Gemini 1.5 Flash tuning will be available through the Gemini API or Google AI Studio sometime next month.


You may also like…

Anthropic’s new Claude 3.5 Sonnet model already competitive with GPT-4o and Gemini 1.5 Pro on several benchmarks

Gemini improvements unveiled at Google Cloud Next

