



const { OpenAI } = require('openai');

const api = new OpenAI({
  baseURL: 'https://api.ai.cc/v1',
  apiKey: '',
});

const main = async () => {
  const result = await api.chat.completions.create({
    model: 'togethercomputer/guanaco-7b',
    messages: [
      {
        role: 'system',
        content: 'You are an AI assistant who knows everything.',
      },
      {
        role: 'user',
        content: 'Tell me, why is the sky blue?',
      },
    ],
  });

  const message = result.choices[0].message.content;
  console.log(`Assistant: ${message}`);
};

main();
from openai import OpenAI

client = OpenAI(
    base_url="https://api.ai.cc/v1",
    api_key="",
)

response = client.chat.completions.create(
    model="togethercomputer/guanaco-7b",
    messages=[
        {
            "role": "system",
            "content": "You are an AI assistant who knows everything.",
        },
        {
            "role": "user",
            "content": "Tell me, why is the sky blue?",
        },
    ],
)

message = response.choices[0].message.content
print(f"Assistant: {message}")
AI Playground

Test all API models in the sandbox environment before you integrate.
We provide more than 300 models for you to integrate into your app.


Product Detail
Discover Guanaco-7B: A Powerful Open-Source LLM
The Guanaco-7B is an innovative 7-billion parameter open-source chatbot model. Built upon Meta's robust LLaMA architecture, it delivers performance comparable to ChatGPT on the Vicuna benchmark, yet requires significantly fewer computational resources for both fine-tuning and inference. Released in May 2023 by Tim Dettmers, Guanaco-7B is a text-based Large Language Model (LLM) designed for efficiency and accessibility.
🌟 Key Features & Advantages
- ✅ Efficient 4-bit QLoRA Fine-tuning: Leverage QLoRA (Quantized Low-Rank Adaptation) to fine-tune Guanaco-7B in 4-bit precision. This method reduces memory requirements by roughly 75% compared to full-precision training, making advanced customization highly accessible.
- 🌐 Multilingual Support: Trained on a diverse multilingual dataset, Guanaco-7B engages in conversations across a broad spectrum of languages, breaking down communication barriers.
- 🔓 Open-Source & Apache 2.0 Licensed: Completely free for both research and commercial applications under the permissive Apache 2.0 license, fostering broad adoption and innovation.
- 💻 Supports Local Experimentation: Efficient fine-tuning and inference enable cost-effective local experimentation and rapid development of custom chatbot solutions.
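The 75% figure follows from simple storage arithmetic: a 16-bit weight occupies 2 bytes, a 4-bit weight half a byte. A quick back-of-the-envelope check (this sketch counts weight storage only and ignores optimizer states, activations, and the small LoRA adapter overhead that QLoRA adds):

```python
# Back-of-the-envelope weight-memory estimate for a 7B-parameter model.
# Counts weights only; optimizer states, activations, and LoRA adapters
# are ignored (QLoRA keeps the adapter overhead small).
PARAMS = 7_000_000_000

def weight_memory_gb(bits_per_param: int) -> float:
    """Memory needed to hold the weights alone, in gigabytes."""
    return PARAMS * bits_per_param / 8 / 1e9

fp16_gb = weight_memory_gb(16)   # full-precision baseline
int4_gb = weight_memory_gb(4)    # 4-bit quantized weights
savings = 1 - int4_gb / fp16_gb  # fraction of weight memory saved

print(f"fp16: {fp16_gb:.1f} GB, 4-bit: {int4_gb:.1f} GB, saved: {savings:.0%}")
```

Holding the 4-bit weights takes about 3.5 GB instead of 14 GB, which is where the quoted 75% reduction in weight memory comes from.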
🎯 Intended Use & Applications
Guanaco-7B is specifically designed for versatile applications in conversational AI. Its primary use cases include:
- Open-domain chatbots: For general conversational AI.
- Question-answering systems: Providing informative responses efficiently.
- Other conversational AI applications: Its efficient architecture makes it ideal for deployment on resource-constrained devices and edge computing environments.
Language Support Details: The model offers support for a wide array of languages, including English, French, Spanish, German, Italian, Portuguese, Dutch, Russian, Chinese, Japanese, and Korean. While the exact total number of supported languages is not specified, its training on a multilingual dataset ensures broad applicability.
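For the chatbot use cases above, the only state a chat-completions-style API needs from the caller is the running message list. Here is a minimal sketch of multi-turn history management that keeps the system prompt and only the most recent exchanges, so long conversations stay within the model's context window (the `trim_history` helper and `MAX_TURNS` limit are illustrative choices, not part of any official SDK):

```python
# Minimal multi-turn history management for a chat-completions-style API.
# trim_history and MAX_TURNS are illustrative, not part of any SDK.
SYSTEM = {"role": "system", "content": "You are an AI assistant who knows everything."}
MAX_TURNS = 4  # keep at most 4 user/assistant pairs

def trim_history(history: list[dict]) -> list[dict]:
    """Keep the system message plus the last MAX_TURNS*2 chat messages."""
    system = [m for m in history if m["role"] == "system"]
    chat = [m for m in history if m["role"] != "system"]
    return system + chat[-MAX_TURNS * 2:]

history = [SYSTEM]
for i in range(6):  # simulate six user/assistant exchanges
    history.append({"role": "user", "content": f"question {i}"})
    history.append({"role": "assistant", "content": f"answer {i}"})
    history = trim_history(history)

# history now holds the system prompt plus the last 4 exchanges
print(len(history))  # 9
```

In a real loop you would pass `messages=trim_history(history)` to `client.chat.completions.create(...)` after appending each user turn, then append the assistant's reply before the next iteration.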
⚙️ Technical Specifications
- Architecture: Guanaco-7B is built upon Meta's LLaMA architecture. It employs a standard Transformer-based language model with 7 billion parameters across 32 transformer layers, ensuring robust performance.
- Training Data: The model was trained using a substantial multilingual dataset. While specific details on data sources and exact size are not publicly documented, it's understood to encompass web pages, books, articles, and various other text data, forming a subset of the data utilized for larger Guanaco models.
- Knowledge Cutoff: Although not explicitly stated, the knowledge cutoff date for Guanaco-7B is estimated to be early 2023, aligning with its release timeframe.
- Diversity and Bias: Given its multilingual training, Guanaco-7B likely benefits from a diverse dataset across multiple languages and domains. However, detailed information regarding the diversity of the training data or any identified biases in its outputs is not provided. Responsible deployment is encouraged.
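As a sanity check on the "7 billion parameters" figure, the publicly documented LLaMA-7B shape (hidden size 4096, 32 layers, SwiGLU feed-forward size 11008, 32000-token vocabulary) can be tallied directly; the exact count lands just under 7B, which is what the "7B" label rounds to:

```python
# Tally LLaMA-7B parameters from its published architecture dimensions.
# Layer norms are omitted (a few thousand parameters each, negligible).
d_model, n_layers, d_ffn, vocab = 4096, 32, 11008, 32000

embed = vocab * d_model            # token embedding table
attn = 4 * d_model * d_model       # Q, K, V, O projections per layer
ffn = 3 * d_model * d_ffn          # gate, up, down projections (SwiGLU)
per_layer = attn + ffn
lm_head = vocab * d_model          # output projection (untied from embeddings)

total = embed + n_layers * per_layer + lm_head
print(f"{total / 1e9:.2f}B parameters")  # ≈ 6.74B
```

The commonly cited exact count for LLaMA-7B is about 6.74 billion parameters, matching this tally.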
🤝 Usage & Licensing
License Type: Guanaco-7B operates under the highly flexible Apache 2.0 license. This license grants users the freedom for commercial and non-commercial use, modification, and distribution of the model, provided the original copyright notice and disclaimer are retained.
Ethical Guidelines: The official documentation for Guanaco-7B does not explicitly detail specific ethical guidelines or considerations. As an open-source model, the onus is on individual users and organizations to ensure its responsible development and deployment, adhering to ethical AI principles.
🚀 API Usage Example
Here’s a practical example demonstrating API usage:
<snippet data-name="open-ai.chat-completion" data-model="togethercomputer/guanaco-7b"></snippet>
❓ Frequently Asked Questions (FAQs)
Q: What is Guanaco-7B?
A: Guanaco-7B is an open-source, 7-billion parameter chatbot model based on Meta's LLaMA architecture, known for its efficiency in fine-tuning and inference.
Q: How does Guanaco-7B keep fine-tuning costs low?
A: It utilizes 4-bit QLoRA fine-tuning, which reduces memory requirements by 75% compared to full-precision training.
Q: Can Guanaco-7B be used commercially?
A: Yes, it is released under the Apache 2.0 license, allowing for both commercial and non-commercial use, modification, and distribution.
Q: Which languages does Guanaco-7B support?
A: It supports multiple languages including English, French, Spanish, German, Italian, Portuguese, Dutch, Russian, Chinese, Japanese, and Korean, thanks to its multilingual training data.
Q: What is the model's knowledge cutoff?
A: While not explicitly stated, it is estimated to be around early 2023, based on its release date.
Learn how you can transform your company with AICC APIs


