Anthropic’s Claude Instant: A Smaller, Faster and Cheaper Language Model
Anthropic released Claude Instant 1.2, a smaller, faster and cheaper version of its Claude 2 large language model.
This article was originally published on AI Business.
Anthropic, an AI startup founded by former OpenAI engineers, has released the latest version of its Claude Instant language model.
Claude Instant 1.2, a smaller and cheaper version of the company's Claude 2 large language model, can be accessed by businesses through Anthropic's API, the company said in a blog post.
Anthropic claims that Claude Instant is a “faster, lower-priced and yet still very capable model” that can handle tasks including dialogue, text analysis, summarization and document comprehension.
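For illustration, here is a minimal sketch of what a summarization request to Claude Instant 1.2 might look like, assuming the legacy text-completions interface of Anthropic's Python SDK from that period (the API key, prompt text and token limit are placeholder values, not from the article):

```python
import anthropic

# Hypothetical client setup; the key placeholder must be replaced with a real one.
client = anthropic.Anthropic(api_key="YOUR_API_KEY")

response = client.completions.create(
    model="claude-instant-1.2",
    max_tokens_to_sample=300,  # illustrative cap on the generated completion
    # HUMAN_PROMPT / AI_PROMPT are the "\n\nHuman:" / "\n\nAssistant:" turn markers
    # used by the legacy completions endpoint.
    prompt=f"{anthropic.HUMAN_PROMPT} Summarize the following document: ... {anthropic.AI_PROMPT}",
)

print(response.completion)
```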
Claude Instant, which Anthropic said is best suited to low-latency, high-throughput use cases, costs $1.63 per million tokens for prompts and $5.51 per million tokens for completions. The context window is 100,000 tokens, translating to about 75,000 words per prompt.
In comparison, Claude 2 costs $11.02 per million tokens for prompts and $32.68 per million tokens for completions, with the same context window. Anthropic said Claude 2 is best for tasks that require complex reasoning and “superior performance.”
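Those per-token rates make the price gap concrete. The following back-of-the-envelope sketch applies the prices quoted above to a single large request; the 90,000-token prompt and 1,000-token completion are hypothetical figures chosen to fit within the 100,000-token context window:

```python
# USD per million tokens, as quoted in the article.
RATES = {
    "claude-instant-1.2": {"prompt": 1.63, "completion": 5.51},
    "claude-2":           {"prompt": 11.02, "completion": 32.68},
}

def request_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate the cost in USD of one request at the quoted rates."""
    r = RATES[model]
    return (prompt_tokens * r["prompt"] + completion_tokens * r["completion"]) / 1_000_000

# A near-maximal prompt with a short completion (illustrative numbers only).
for model in RATES:
    print(f"{model}: ${request_cost(model, 90_000, 1_000):.2f}")
# claude-instant-1.2: ~$0.15 per request; claude-2: ~$1.02 per request.
```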
Claude Instant: Performance Metrics
The startup said Claude Instant 1.2 shows “significant gains” in math, coding, reasoning and safety compared to the earlier version, and that it “generates longer, more structured responses and follows formatting instructions better.”
The latest version also shows improvement in quote extraction, multilingual capabilities and question answering, the company said.
Here is a comparison between Claude Instant versions 1.1 and 1.2:
[Image: Benchmark comparison of Claude Instant 1.1 and 1.2]
Anthropic also said Claude Instant 1.2 hallucinates less than the earlier version and is more resistant to jailbreaks, or attempts to get the chatbot to bypass its guardrails.