
Anthropic Launches Claude 3.5 Haiku AI Model, Enhancing Speed and Performance for All Users

  • 18 Dec 2024

Anthropic has rolled out the Claude 3.5 Haiku artificial intelligence model, now accessible to all users through its web and mobile platforms. On Thursday, users across various networks began sharing their experiences with the newly available model in Claude's online interface and apps. According to the company, this latest version of Haiku is the fastest large language model it has created to date, and in several benchmark tests it surpasses the larger previous-generation model, Claude 3 Opus. Notably, every Claude user has access to the 3.5 Haiku model, regardless of subscription level.

The release of the Haiku model was not formally announced by the company; instead, numerous users on X reported spotting the new model on both the website and the mobile apps. Independent checks confirmed that Claude 3.5 Haiku is now the default model in the chatbot and the only option offered to users on the free tier.

Claude 3.5 Haiku was first unveiled in October, alongside the upgraded Claude 3.5 Sonnet. In that announcement, the company emphasized that the Haiku version is its fastest model to date. Improvements in this generation include lower latency for faster responses, better instruction following, and improved tool use capabilities.

For business applications, the company noted that Claude 3.5 Haiku is particularly effective for customer-facing products, specialized sub-agent tasks, and generating personalized experiences from large datasets. On performance metrics, the new Haiku model scored 40.6 percent on the SWE-bench Verified software engineering benchmark, surpassing both the earlier Claude 3.5 Sonnet and OpenAI's GPT-4o. It also outperforms GPT-4o mini on the HumanEval and Graduate-Level Google-Proof Q&A (GPQA) benchmarks.

Recently, the company optimized Claude 3.5 Haiku for the AWS Trainium2 AI chipset and added support for latency-optimized inference in Amazon Bedrock; equivalent support on Google Cloud's Vertex AI has not yet rolled out. The model generates text output and accepts both text and image inputs.
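
For developers who want to try the model's text-and-image input themselves, a minimal sketch of a request via the Anthropic Python SDK might look like the following. The model identifier claude-3-5-haiku-20241022 and the sample image file are illustrative assumptions, not details taken from the article:

```python
# Minimal sketch: sending a text + image prompt to Claude 3.5 Haiku
# via the Anthropic Python SDK (pip install anthropic).
# The model ID and image file below are assumptions for illustration.
import base64
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Encode a local image so it can be sent alongside the text prompt.
with open("screenshot.png", "rb") as f:
    image_b64 = base64.standard_b64encode(f.read()).decode("utf-8")

response = client.messages.create(
    model="claude-3-5-haiku-20241022",  # assumed model identifier
    max_tokens=512,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "image",
                    "source": {
                        "type": "base64",
                        "media_type": "image/png",
                        "data": image_b64,
                    },
                },
                {"type": "text", "text": "Describe what is shown in this image."},
            ],
        }
    ],
)

# The generated text comes back in the response content blocks.
print(response.content[0].text)
```

The same prompt could also be sent through Amazon Bedrock for teams already on AWS; the SDK call above is simply the most direct way to exercise the model's text and image inputs.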