The tool offers real-time insights into regulatory changes relevant to a business, mitigating compliance risks.
AI Copilot for Sales
The tool generates executive summaries of deals, identifies issues, suggests the next best actions, and more.
AI Research Solution for Due Diligence
The solution enhances due diligence assessments, allowing users to make data-driven decisions.
AI Customer Support Agent
The agent streamlines your customer support processes and provides accurate, multilingual assistance across multiple channels, reducing support ticket volume.
Tour ZBrain to see how it enhances legal practice, from document management to complex workflow automation, and how ZBrain solutions such as legal AI agents boost productivity.
Discover Workflow Integrations
Explore how ZBrain seamlessly integrates into your workflows to automate complex tasks and provide strategic insights, ensuring streamlined operations and enhanced efficiency.
Engage in Q&A Session
Receive real-time answers to your questions so you understand how ZBrain operates, how it meets the specific needs of your legal practice, and how setup and integration work.
ZBrain credits and their usage
A credit is a unit of usage on ZBrain. Credits are consumed whenever you perform actions such as embedding documents or querying an app. Here’s how credits are utilized:
Document Embedding:
When you upload a document, it is processed and embedded to create a searchable vector representation. This step consumes credits based on the size of the document and the embedding model used.
Input Processing:
When you ask a question, the relevant context is retrieved from the embeddings and processed by the Large Language Model (LLM). This step incurs a credit cost based on the model and the number of input tokens.
Generating Output:
The LLM then generates a response based on the processed input, consuming additional credits based on the number of output tokens.
The cost in credits for various models and processes is detailed below:
Models and Credit Consumption:

| Model | Input Cost | Output Cost |
|---|---|---|
| GPT-4o | 2,500 credits / 1M tokens | 7,500 credits / 1M tokens |
| GPT-4 | 15,000 credits / 1M tokens | 30,000 credits / 1M tokens |
| GPT-4-32k | 30,000 credits / 1M tokens | 60,000 credits / 1M tokens |
| GPT-3.5 Turbo | 250 credits / 1M tokens | 750 credits / 1M tokens |
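To illustrate, a GPT-4o call that processes 2,000 input tokens and generates 500 output tokens would consume roughly (2,000 / 1,000,000) × 2,500 = 5 credits for input and (500 / 1,000,000) × 7,500 = 3.75 credits for output, about 8.75 credits in total.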
Embedding Models:

| Model | Input Cost |
|---|---|
| text-embedding-3-small | 20 credits / 1M tokens |
| text-embedding-3-large | 130 credits / 1M tokens |
| ada v2 | 100 credits / 1M tokens |
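To illustrate, embedding a 50,000-token document with text-embedding-3-small would consume roughly (50,000 / 1,000,000) × 20 = 1 credit, while embedding the same document with text-embedding-3-large would consume roughly 6.5 credits.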
When you create a knowledge base or query an app, credits are deducted based on the embedding, input, and output token usage. This ensures you only pay for what you use while leveraging the full power of advanced AI models.
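To make the arithmetic concrete, here is a minimal sketch of a credit estimator based on the rates listed above. It is an illustration only, not an official ZBrain API or SDK; the model keys, helper functions, and token counts are assumptions, and actual deductions are calculated by the platform.

```python
# Illustrative credit estimator based on the rate tables above.
# Not an official ZBrain SDK; rates and rounding may differ in practice.

# Credits per 1M tokens as (input, output) for language models
LLM_RATES = {
    "gpt-4o":        (2_500, 7_500),
    "gpt-4":         (15_000, 30_000),
    "gpt-4-32k":     (30_000, 60_000),
    "gpt-3.5-turbo": (250, 750),
}

# Credits per 1M tokens for embedding models
EMBEDDING_RATES = {
    "text-embedding-3-small": 20,
    "text-embedding-3-large": 130,
    "ada-v2": 100,
}

def embedding_credits(model: str, tokens: int) -> float:
    """Credits consumed to embed `tokens` tokens with the given embedding model."""
    return tokens / 1_000_000 * EMBEDDING_RATES[model]

def query_credits(model: str, input_tokens: int, output_tokens: int) -> float:
    """Credits consumed by one LLM call: input processing plus output generation."""
    input_rate, output_rate = LLM_RATES[model]
    return (input_tokens / 1_000_000 * input_rate
            + output_tokens / 1_000_000 * output_rate)

if __name__ == "__main__":
    # Embed a 50,000-token document, then answer a question with GPT-4o
    # using 2,000 input tokens of context and a 500-token answer.
    total = (embedding_credits("text-embedding-3-small", 50_000)
             + query_credits("gpt-4o", 2_000, 500))
    print(f"Estimated credits: {total:.2f}")
```

Running this example estimates about 9.75 credits in total: 1 credit to embed the document, 5 credits for the GPT-4o input tokens, and 3.75 credits for the output tokens.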