Falcon 180B Chat: OpenAI & API Compatible

By: Meetrix.io Latest Version: 1.0.0
Linux/Unix

Product Overview

Falcon 180B is the largest openly available language model, with 180 billion parameters. It was trained on 3.5 trillion tokens using TII's RefinedWeb dataset. This represents the longest single-epoch pre-training for an open model.

Falcon 180B is a scaled-up version of its predecessor, Falcon 40B, and builds on innovations such as multi-query attention for improved scalability. It was trained on 3.5 trillion tokens using up to 4,096 GPUs on Amazon SageMaker, for a total of roughly 7,000,000 GPU hours. This makes Falcon 180B about 2.5x larger than LLMs such as Llama 2, and it was trained with 4x more compute.

Key Features:
Unmatched Pretrained Depth: With 180 billion parameters, this AMI delivers results with exceptional depth, precision, and richness.

Quick Deployment: Skip intricate setup. The AMI provides a clean, ready-to-use environment that hides the complexity of working with the raw model.

Effortless API Integration: Connect with a wide range of applications through the built-in, OpenAI-compatible API (see the sketch after this list).
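
As a rough illustration of what OpenAI-compatible integration can look like, the sketch below sends a chat request to the deployed instance using the official openai Python client. The base URL, port, API key, and model name are assumptions for illustration only; the actual values come from your running instance and the Meetrix documentation.

```python
# Minimal sketch of calling an OpenAI-compatible chat endpoint on the deployed instance.
# The base_url, api_key, and model name below are assumptions, not documented values.
from openai import OpenAI

client = OpenAI(
    base_url="http://<instance-public-ip>:8000/v1",  # assumed host/port of the AMI's API server
    api_key="not-needed",                            # placeholder; use whatever key your deployment requires
)

response = client.chat.completions.create(
    model="falcon-180b-chat",                        # assumed model identifier
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what Falcon 180B is in one sentence."},
    ],
    max_tokens=128,
)

print(response.choices[0].message.content)
```

Because the endpoint follows the OpenAI API shape, existing OpenAI-based tooling can generally be pointed at the instance by changing only the base URL and credentials.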

Version

1.0.0

Operating System

Linux/Unix, Ubuntu 22.04

Delivery Methods

  • CloudFormation Template
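
Since the product is delivered as a CloudFormation template, deployment can also be scripted. The sketch below launches a stack with boto3; the stack name, template URL, instance type, and parameter names are assumptions for illustration and should be replaced with the values from the Marketplace listing.

```python
# Sketch of launching the product's CloudFormation stack with boto3.
# StackName, TemplateURL, and parameter names/values are assumptions, not documented values.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

cfn.create_stack(
    StackName="falcon-180b-chat",  # assumed stack name
    TemplateURL="https://s3.amazonaws.com/<bucket>/falcon-180b-chat.template.yaml",  # assumed template location
    Parameters=[
        {"ParameterKey": "InstanceType", "ParameterValue": "p4de.24xlarge"},  # assumed parameter
        {"ParameterKey": "KeyName", "ParameterValue": "my-keypair"},          # assumed parameter
    ],
    Capabilities=["CAPABILITY_IAM"],  # required if the template creates IAM resources
)

# Block until the stack finishes creating before calling the API endpoint.
cfn.get_waiter("stack_create_complete").wait(StackName="falcon-180b-chat")
print("Stack created")
```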
