The tool offers real-time insights into regulatory changes relevant to a business, mitigating compliance risks.
AI Copilot for Sales
The tool generates executive summaries of deals, identifies issues, suggests the next best actions, and more.
AI Research Solution for Due Diligence
The solution enhances due diligence assessments, allowing users to make data-driven decisions.
AI Customer Support Agent
The agent streamlines your customer support processes and provides accurate, multilingual assistance across multiple channels, reducing support ticket volume.
Insights
Generative AI in manufacturing: Capabilities, integration approaches, use cases, challenges and future outlook
Generative AI in customer service: Scope, adoption strategies, use cases, challenges and best practices
Generative AI for regulatory compliance: Scope, integration approaches, use cases, challenges and best practices
Generative AI in due diligence: Scope, adoption strategies, use cases, challenges, considerations and future outlook
Generative AI in logistics: Use cases, integration approaches, development and future
Generative AI for financial reporting: Development, integration, use cases, benefits and future outlook
Tour ZBrain to see how it enhances legal practice, from document management to complex workflow automation, and how ZBrain solutions such as legal AI agents boost productivity.
Discover Workflow Integrations
Explore how ZBrain seamlessly integrates into your workflows to automate complex tasks and provide strategic insights, ensuring streamlined operations and enhanced efficiency.
Engage in Q&A Session
Get real-time answers to your questions so you understand how ZBrain operates, how it meets the specific needs of your legal practice, and what setup and integration involve.
ZBrain credits and their usage
A credit is a unit of usage on ZBrain. Credits are consumed whenever you perform actions such as embedding documents or querying an app. Here’s how credits are utilized:
Document Embedding:
When you upload a document, it is processed and embedded to create a searchable vector representation. This step consumes credits based on the size of the document and the embedding model used.
Input Processing:
When you ask a question, the relevant context is retrieved from the embeddings and processed by the Large Language Model (LLM). This step incurs a credit cost based on the model and the number of input tokens.
Generating Output:
The LLM then generates a response based on the processed input, consuming additional credits based on the number of output tokens.
The cost in credits for various models and processes is detailed below:
Models and Credit Consumption:
Model | Input Cost | Output Cost
GPT-4o | 2,500 credits / 1M tokens | 7,500 credits / 1M tokens
GPT-4 | 15,000 credits / 1M tokens | 30,000 credits / 1M tokens
GPT-4-32k | 30,000 credits / 1M tokens | 60,000 credits / 1M tokens
GPT-3.5 Turbo | 250 credits / 1M tokens | 750 credits / 1M tokens
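At these rates, for example, a GPT-4o call that processes a 2,000-token prompt and returns a 500-token answer would consume 2,000 × 2,500 / 1,000,000 = 5 credits for input and 500 × 7,500 / 1,000,000 = 3.75 credits for output, roughly 8.75 credits in total (the token counts here are illustrative).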
Embedding Models:
Model | Input Cost
text-embedding-3-small | 20 credits / 1M tokens
text-embedding-3-large | 130 credits / 1M tokens
ada v2 | 100 credits / 1M tokens
When you create a knowledge base or query an app, credits are deducted based on the embedding, input, and output token usage. This ensures you only pay for what you use while leveraging the full power of advanced AI models.
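As a rough illustration, the sketch below estimates the credits consumed by a typical flow, embedding a document and then asking one question, using the rates listed in the tables above. The function names and token counts are illustrative assumptions, not part of any ZBrain API.

```python
# Minimal sketch of credit estimation based on the rate tables above.
# Function and variable names are illustrative, not an official ZBrain API.

# Credits per 1M tokens: (input_rate, output_rate)
CHAT_MODEL_RATES = {
    "GPT-4o": (2_500, 7_500),
    "GPT-4": (15_000, 30_000),
    "GPT-4-32k": (30_000, 60_000),
    "GPT-3.5 Turbo": (250, 750),
}

# Credits per 1M tokens embedded
EMBEDDING_MODEL_RATES = {
    "text-embedding-3-small": 20,
    "text-embedding-3-large": 130,
    "ada v2": 100,
}

TOKENS_PER_UNIT = 1_000_000


def embedding_credits(model: str, tokens: int) -> float:
    """Credits consumed when embedding `tokens` tokens with `model`."""
    return tokens * EMBEDDING_MODEL_RATES[model] / TOKENS_PER_UNIT


def query_credits(model: str, input_tokens: int, output_tokens: int) -> float:
    """Credits consumed by one LLM call: input processing plus output generation."""
    input_rate, output_rate = CHAT_MODEL_RATES[model]
    return (input_tokens * input_rate + output_tokens * output_rate) / TOKENS_PER_UNIT


if __name__ == "__main__":
    # Example: embed a ~50,000-token document, then ask one question against it.
    embed = embedding_credits("text-embedding-3-small", 50_000)             # 1.0 credit
    query = query_credits("GPT-4o", input_tokens=2_000, output_tokens=500)  # 8.75 credits
    print(f"Embedding: {embed:.2f}, query: {query:.2f}, total: {embed + query:.2f} credits")
```

Running this prints about 1 credit for embedding the 50,000-token document with text-embedding-3-small and 8.75 credits for the GPT-4o query, matching the worked example above.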