Krux

Claude Opus 4.7 Hits AWS with 1M Context Window
Published: April 22, 2026 at 12:30 AM
Updated: April 22, 2026 at 12:30 AM
100-word summary
Anthropic's Claude Opus 4.7 is now available on Amazon Bedrock with a 1 million token context window, enough for roughly 750,000 words, or about ten novels, in a single prompt. The model handles ambiguous instructions better than prior versions, processes higher-resolution images, and sustains long-running tasks that chain multiple tools without losing the thread. AWS routes requests through a new inference engine that dynamically shifts traffic between regions. For users, that means coding assistants can hold entire repositories in context and analysts can feed in full quarterly reports without chunking, backed by the same zero-access data guarantees Anthropic promises elsewhere.
What happened
Anthropic's most capable model is now available through Amazon Bedrock, bringing a 1 million token context window to AWS customers. That's enough to fit about 750,000 words or roughly ten novels in a single prompt. Opus 4.7 can handle ambiguous instructions better than prior versions and processes higher-resolution images. AWS is routing it through a new inference engine that dynamically shifts traffic between regions.
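For readers who want to try it, a long-context request on Bedrock goes through the Converse API. The sketch below builds the request payload in plain Python; the model ID is a placeholder (check the Bedrock model catalog for the actual Opus 4.7 identifier in your region), and the boto3 call itself is shown only in a comment since it requires AWS credentials.

```python
# Hypothetical model ID -- verify against the Bedrock model catalog.
MODEL_ID = "anthropic.claude-opus-4-7-v1:0"

def build_converse_request(document_text: str, question: str) -> dict:
    """Assemble keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": MODEL_ID,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"text": document_text},  # may now run to ~1M tokens
                    {"text": question},
                ],
            }
        ],
        "inferenceConfig": {"maxTokens": 2048, "temperature": 0.2},
    }

request = build_converse_request(
    "<entire repository or report dump here>",
    "Summarize the key findings.",
)

# With boto3 installed and AWS credentials configured, you would send it with:
#   client = boto3.client("bedrock-runtime")
#   response = client.converse(**request)
print(request["modelId"])
```

The Converse API accepts the same message shape across Bedrock models, so swapping in a different model is a one-line change to `modelId`.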
Why it matters
The practical shift: coding assistants can now maintain context across entire repositories, and financial analysts can feed in quarterly reports without chopping them into chunks. Bedrock customers get the same zero-access data guarantees Anthropic promises elsewhere. The model handles long-running tasks that require multiple tools working together without losing the thread.
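To gauge whether a whole repository actually fits, a common back-of-the-envelope heuristic is about four characters per token for English-like text; real tokenizers vary by language and content, so treat this as a rough sketch, not an exact budget.

```python
CHARS_PER_TOKEN = 4        # rough heuristic; actual tokenization varies
CONTEXT_WINDOW = 1_000_000  # the 1M-token window discussed above

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token."""
    return len(text) // CHARS_PER_TOKEN

def repo_fits_in_window(file_sizes_bytes: list[int]) -> bool:
    """Check whether a repo's source files would fit in one 1M-token prompt.

    Assumes ~1 byte per character, which holds for typical ASCII source code.
    """
    total_chars = sum(file_sizes_bytes)
    return total_chars // CHARS_PER_TOKEN <= CONTEXT_WINDOW

# A 3 MB repository is ~750k tokens -- comfortably inside the window.
print(repo_fits_in_window([3_000_000]))  # True
```

By this estimate the window tops out around 4 MB of source text, which is why mid-sized repositories and full quarterly reports now fit in a single prompt.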