AI Transparency Statement
This statement explains how we use artificial intelligence (AI) in our Civio Assist platform.
Civio Assist is a Retrieval-Augmented Generation (RAG) chatbot. This architecture is designed to ground its answers in a specific, controlled set of information supplied by our client organisations.
Here is how it functions:
Retrieval: When a user asks a question, the system first searches a dedicated knowledge base. This knowledge base is made up of approved information provided by our client organisation (for example, the content of a specific government agency’s website).
Generation: The system then uses a foundational large language model (LLM) to generate a conversational answer based only on the relevant information it retrieved from the knowledge base.
This RAG approach grounds the chatbot's responses in the client organisation's own approved content rather than in the LLM's general knowledge from its training data. To generate these conversational responses, Civio Assist uses foundation LLMs provided through secure, enterprise-grade platforms, primarily Amazon Bedrock on Amazon Web Services (AWS) and Microsoft Azure AI Foundry. This gives us access to a range of models, including the Claude series (from Anthropic) and the GPT series (from OpenAI).
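For readers who want a concrete picture of the retrieve-then-generate flow described above, here is a minimal sketch in Python. Everything in it is a hypothetical stand-in: the tiny in-memory knowledge base, the keyword-overlap retrieval, and the `generate` stub all simplify what a production system would do with a vector index and a hosted LLM (e.g. via Amazon Bedrock or Azure AI Foundry).

```python
# Hypothetical sketch of a RAG pipeline: retrieve approved passages first,
# then generate an answer only from what was retrieved.

# Toy stand-in for a client organisation's approved knowledge base.
KNOWLEDGE_BASE = [
    "Office hours are Monday to Friday, 9am to 5pm.",
    "Permit applications can be lodged online via the agency website.",
    "Contact the service desk for account enquiries.",
]

def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Rank approved passages by naive keyword overlap with the question.

    A real system would use embeddings and a vector index instead.
    """
    q_words = set(question.lower().split())
    scored = [(len(q_words & set(p.lower().split())), p) for p in KNOWLEDGE_BASE]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [p for score, p in scored[:top_k] if score > 0]

def generate(question: str, passages: list[str]) -> str:
    """Stand-in for the LLM call: compose an answer from retrieved text only.

    If nothing relevant was retrieved, refuse rather than fall back to
    the model's general knowledge.
    """
    if not passages:
        return "I don't have approved information on that topic."
    return " ".join(passages)

def answer(question: str) -> str:
    return generate(question, retrieve(question))
```

The key property the sketch illustrates is the refusal path: when retrieval finds nothing in the approved content, the system declines to answer instead of improvising from the model's training data.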
For the purposes of our client organisations’ reporting, we classify the use of AI in the Civio Assist platform as follows:
The accuracy and currency of the information within the knowledge base is the responsibility of the client organisation. Civio provides the platform, but the client organisation owns and governs the content the chatbot uses to form its answers.
Our data handling practices are designed to enable our client organisations to comply with their obligations under the Privacy Act 1988 (Cth) and the Australian Privacy Principles (APPs). This section explains what we process and how we handle it.
What we process:
How we handle the data:
Civio Assist is a tool for information retrieval, not a decision-making entity. It is designed to augment, not replace, human support.
We continuously monitor the performance and effectiveness of the Civio Assist platform, including metrics related to response accuracy, user satisfaction, speed and system uptime. Client organisations determine their own processes for human review of chatbot interactions or for providing users with pathways to contact a human staff member.
We are committed to providing a platform that enables our client organisations to deliver fair and inclusive services. We use a multi-layered approach to mitigate the risk of bias:
Bias can still exist in the underlying data or language models. We encourage users to report any instances of biased or unfair responses directly within the chatbot interface using the provided feedback tools. This feedback is a critical part of our continuous improvement process.
Civio Assist is designed to provide general information. It is not a substitute for professional advice. The AI models powering the platform are probabilistic, meaning they may sometimes produce responses that are inaccurate, incomplete, or unreliable. For this reason, information provided by Civio Assist must not be used for making legal, financial, or any other official decisions.
For any complex, sensitive, or personal enquiries, or for any matter of importance, users must contact the client organisation directly through their established channels to speak with a qualified person.
If you have any questions about this statement or how Civio uses AI, please get in touch with us.
If your question relates to the specific information provided by a chatbot on one of our client organisations’ websites, please contact that organisation directly.
This statement is updated when there are significant changes to our use of AI.
Civio Assist is an Australian-owned AI chatbot service designed specifically for government and public sector organisations.
If you’re interested in implementing Civio Assist for your organisation, we’d love to hear from you.