What is "High Context Notice" for?

This article explains why you might be getting a "High Context" notice inside your chat.

Written by Dustin W. Stout
Updated over 3 months ago

If you see a "High Context Notice" message in your chat, it means the conversation has reached a point where additional usage costs apply. This happens once the conversation exceeds roughly 10,000 tokens, with additional charges at each further 10,000-token increment.
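As a rough illustration of how such a threshold could work, here is a minimal sketch. The exact billing steps are Magai's internal details; the 10,000-token step size comes from the article, but the step-counting function and its name are assumptions made for this example.

```python
# Hypothetical sketch: no surcharge below the threshold, and one
# surcharge "step" for each 10,000 tokens of context beyond it.
# (Illustrative only -- actual billing details are Magai's.)
THRESHOLD = 10_000

def surcharge_steps(context_tokens):
    """Return how many 10,000-token surcharge steps a conversation has crossed."""
    return context_tokens // THRESHOLD

print(surcharge_steps(8_000))   # 0 -- below the high-context threshold
print(surcharge_steps(25_000))  # 2 -- two 10,000-token increments crossed
```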

What is "context" for AI?

"Context" is the conversation history sent with each new message to help the AI understand the conversation and provide accurate responses.
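To make this concrete, here is a minimal sketch of why context grows: the entire conversation history is resent with every new message. The function and message format below are simplified assumptions for illustration, not Magai's actual implementation.

```python
# Hypothetical sketch: each new message is sent along with the FULL
# conversation history, so the input grows as the chat gets longer.
history = []

def send_message(user_text):
    """Build the input actually sent to the AI model for one turn."""
    history.append({"role": "user", "content": user_text})
    # The model receives the entire history, not just the new message.
    model_input = list(history)
    reply = {"role": "assistant", "content": f"(reply to: {user_text})"}
    history.append(reply)
    return model_input

first = send_message("Hello")
second = send_message("Tell me more")
print(len(first))   # 1 message sent as input on the first turn
print(len(second))  # 3 messages: both earlier turns plus the new one
```

Every turn adds to the history, so by the second message the input already contains three entries, and a long chat can carry tens of thousands of tokens of context.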

Understanding "input" and "output"

When you interact with an AI language model, there are two main components:

- Input: The text you send to the AI, including the current message and the conversation history (context).

- Output: The AI's response to your input.

AI language models charge based on both input and output, but the input cost is usually much lower than the output cost.
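The effect of this pricing split can be sketched with a toy cost estimate. The per-token prices below are made-up placeholders (real prices vary by model and provider); only the structure of the calculation is the point.

```python
# Illustrative only: hypothetical per-token prices, NOT real rates.
INPUT_PRICE_PER_1K = 0.001   # assumed input rate, USD per 1,000 tokens
OUTPUT_PRICE_PER_1K = 0.003  # assumed output rate, USD per 1,000 tokens

def message_cost(input_tokens, output_tokens):
    """Estimate the cost of one exchange from its token counts."""
    return (input_tokens / 1000) * INPUT_PRICE_PER_1K \
         + (output_tokens / 1000) * OUTPUT_PRICE_PER_1K

# Early in a chat the context (input) is small...
print(round(message_cost(500, 400), 5))    # 0.0017
# ...but late in a long chat the input side dwarfs the output side.
print(round(message_cost(20_000, 400), 5))  # 0.0212
```

Even though each input token is cheaper than each output token, a long conversation resends so much context that input becomes the dominant cost per message.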

Why the extra cost?

As conversations get longer, the input cost increases because more context is sent with each message. To keep our business sustainable and maintain service quality, we charge extra for conversations that exceed a certain context threshold.

Magai normally counts only output against your monthly usage quota, which means a long conversation's input costs could exceed what your subscription covers. To manage this risk and keep Magai sustainable, we begin charging for input once a conversation passes a certain threshold.

Managing your usage

To avoid unexpected charges:

- Keep conversations focused and start new ones when topics are fully discussed.

- Use the token counter in the chat to monitor usage and know when you're nearing the high context threshold.

- Check your usage regularly in your account dashboard.

We appreciate your understanding as we strive to provide the best AI experience while ensuring our service's long-term viability.
