Use an LLM to Generate AI Context for You

You’ve read about how providing AI context in Omni’s ai_context parameter improves accuracy, consistency, and most importantly, trust in the analytics you build with AI. But did you know that you can use an LLM of your choice to help you generate that AI Context?

Here I’ll share two prompts you can use to generate ai_context at the model level and at the topic level.

Model Level Context:

Paste the following prompt into your LLM of choice and answer the 5-10 questions it asks about your business, metrics, and brand guidelines. Then paste the AI context it generates into the ai_context parameter in Omni as a single ai_context block. That way it will be applied consistently anywhere AI is used.

“I’m a modeler in the Omni Analytics platform and I want to add AI context to my model. Please ask me a set of questions (at least 5, but not more than 10) so you can deeply understand my business mission, metrics of success, and the dos and don’ts of my brand. This should enable users to answer questions about key business metrics and define how we want the AI to respond to user questions. After I answer your questions, please give me markdown files that I can copy and paste into the Omni IDE. ai_context must be a single string parameter in Omni model files (per the docs); everything else should be plain Markdown/structured text inside that parameter rather than separate YAML properties. The key is: only one ai_context: parameter should exist, and its value is the full multi-line context you want the AI to see.

All documentation here: https://docs.omni.co/
Optimize model for ai documentation here: https://docs.omni.co/modeling/develop/ai-optimization
Key blog post here: https://omni.co/blog/getting-started-with-ai-analytics”
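To give a feel for the end result, here is a hypothetical sketch of what the generated model-level context might look like once pasted into a model file. The business details below are placeholders, and you should verify the exact YAML shape against the Omni docs:

```yaml
# Model file (sketch) -- ai_context is one multi-line string; the
# markdown structure lives inside the string, not as separate YAML keys.
ai_context: |
  # Business context
  Acme Retail (placeholder name) sells outdoor gear direct-to-consumer.

  ## Key metrics
  - Net revenue = gross sales minus refunds and discounts.
  - "Active customer" = any customer with an order in the last 90 days.

  ## Tone and brand
  - Answer concisely and state which metric definition was used.
  - Never speculate about unreleased products or unannounced pricing.
```

The important part is the shape: a single ai_context: key holding one multi-line string, with all the structure expressed as Markdown inside it.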

Topic Level Context:

Same process as the model-level context above: paste the following prompt into your LLM of choice and answer the 5-10 questions it asks about what the dataset is used for, which questions it can answer, and which it shouldn’t. Once finalized, that context will live alongside the topic in Omni, where it will guide topic selection and interpretation at query time.

“I’m a modeler in the Omni Analytics platform, and I want to add AI context to my model. Please ask a set of questions (at least 5, but not more than 10) so you can deeply understand a single topic in my model. Topic-level context should answer questions like “what is this data used for”, “what questions can this data answer”, “what questions can this data not answer”, “how do users get value from the data in this topic”, etc. This should prevent the user from selecting the wrong topic to answer their question. After I answer your questions, please give me markdown files that I can copy and paste into the Omni IDE. ai_context must be a single string parameter in Omni topic files (per the docs); everything else should be plain Markdown/structured text inside that parameter rather than separate YAML properties. The key is: only one ai_context: parameter should exist, and its value is the full multi-line context you want the AI to see.

All documentation here: https://docs.omni.co/
Optimize model for ai documentation here: https://docs.omni.co/modeling/develop/ai-optimization
Key blog post here: https://omni.co/blog/getting-started-with-ai-analytics”
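At the topic level the same single-string rule applies. Here is a hypothetical sketch of a finished topic-level entry; the topic name and guidance are invented examples, so check the Omni docs for the exact topic-file syntax:

```yaml
# Topic file (sketch) -- only one ai_context parameter, holding the
# full guidance as a single multi-line string.
ai_context: |
  ## What this topic is for
  Order-level sales analysis: revenue, order counts, and discounts.

  ## Questions it can answer
  - "What was net revenue last quarter by region?" (example question)

  ## Questions it cannot answer
  - Web traffic or marketing attribution -- those belong to a
    different topic, so the AI should not pick this one for them.
```

Spelling out what the topic cannot answer is just as valuable as what it can, since that is what steers the AI away from picking the wrong topic.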

As always, give the generated AI Context a good proofread and edit to make sure it reflects your specific business needs.

Happy querying!