
Custom AI Agents for Business: A Secure Approach with Oracle Private Agent Factory

Written by Chanaka Yapa | Apr 28, 2026 3:00:01 PM

Introduction

AI technologies improve every day. I used to think the hallucination rate on these models was high, but after using Gemini, ChatGPT, and Claude, I changed my mind. Each model looks at data in its own way. The challenge now is embedding business-sensitive data. At a time when data is worth more than gold, getting the most out of it with LLM models and MCP will give you a real edge over your competitors.

Security is critical. Secure connections and agent-based tooling must be in place, because exposing sensitive organizational data to public models can have serious consequences. That’s why security has been my focus while experimenting with different models and LLM architectures.

Oracle Private Agent Factory (OPAF) stands out as a key solution, providing a secure way to connect to LLMs without exposing your data. Tools like this make it possible to innovate with AI while maintaining strong data governance and protection.

The next step is to build your own agents: solutions tailored to your business that make better use of your data and grow more accurate over time. Remember, you are the one who knows your data best. Data scientists and AI and BI engineers are essential for extracting value from data with AI.

Oracle Private Agent Factory lets you make your own custom agents, which is exactly what I’ve been wanting from Oracle.

That’s why I say Oracle was a little late to the cloud, but it’s now making great strides in AI. With dedicated AI data centres, it gives you peace of mind that your data will stay safe within Oracle’s secure boundaries.

In this article, I’ll walk you through how to create a simple agent to analyze Toronto housing data.

The Toronto housing market is constantly fluctuating and can be quite volatile. I’ve used a sample dataset to demonstrate how we can build a data analysis agent and generate meaningful insights.

 

Step 1: Create a Data Source

Start by creating a database data source and granting the necessary permissions to the user who owns the Toronto housing data table. This ensures the agent can securely access and query the data.

Figure 1: Create data source

Figure 2 illustrates how to directly add a data source to the database.


Figure 2: Create data source – 01

Figure 3 shows how to validate the connection and, once validated, save the connection details.

Figure 3: Create data source – 02

The database source should now be displayed and marked as connected.

Figure 4: Data Source

 

Let’s Build a Private Agent

Overview

In this example, the agent is designed to answer questions about Toronto housing data by combining user input, database queries, prompt engineering, and an LLM, all connected through a simple drag-and-drop interface.

 

Step-by-Step Flow

1. Chat Input (User Entry Point)

  • This is where the user asks a question.

  • Example: “What is the average house price by city?”

 

2. SQL Query Block (Data Retrieval)

  • Connected to the TorontoHousingData database.

  • Executes a controlled SQL query (read-only).

  • Pulls relevant structured data needed to answer the question.

  • Ensures only safe, approved queries are run.
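To make the idea of a controlled, read-only query concrete, here is a minimal sketch in Python using an in-memory SQLite database. The table name follows the blog's `TorontoHousingData` source, but the column names and sample rows are my assumptions for illustration; the real data lives in an Oracle database behind OPAF.

```python
import sqlite3

# Hypothetical schema: column names and sample rows are assumptions for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE TorontoHousingData (
        city TEXT, province TEXT, year INTEGER, month INTEGER, price REAL
    )
""")
conn.executemany(
    "INSERT INTO TorontoHousingData VALUES (?, ?, ?, ?, ?)",
    [
        ("Toronto", "ON", 2025, 1, 1_050_000.0),
        ("Toronto", "ON", 2025, 2, 1_100_000.0),
        ("Mississauga", "ON", 2025, 1, 900_000.0),
    ],
)

# Pre-approved, read-only query: the agent never runs free-form SQL from the user.
APPROVED_QUERY = """
    SELECT city, AVG(price) AS avg_price
    FROM TorontoHousingData
    GROUP BY city
    ORDER BY city
"""
rows = conn.execute(APPROVED_QUERY).fetchall()
print(rows)
```

The key design point is that the query text is fixed and read-only; user input never becomes SQL directly, which is what keeps this retrieval step safe.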

 

3. Prompt Block (Brain Setup)

  • Combines:

    • User question (from Chat Input).

    • Data (from SQL Query).

  • Uses a template to guide the LLM:

    • Clearly answer the question.

    • Handle missing data gracefully.

  • This is where you define how the AI should think and respond.

 

4. Agent Block (LLM Processing)

  • Uses a selected LLM (e.g., GPT model).

  • Receives the structured prompt.

  • Applies reasoning to generate a meaningful answer.

  • Settings like temperature (0.01) ensure consistent, factual responses.

  • Can be extended with tools or sub-agents if needed.

 

5. Chat Output (Final Response)

  • Sends the generated answer back to the user.

  • Clean, human-readable output.

 

Prompt + Agent Logic (The Intelligence Layer)

At the core of this setup is how the prompt and agent work together to generate accurate and safe responses.

Prompt Template:

User Question:
{{query}}

Database Result:
{{data}}

Instructions:
- Answer the user question clearly
- Do NOT make up data
- If no results are found, say "No data found"

 

How This Works

1. Dynamic Input Injection

  • {{query}} → comes from the user’s question

  • {{data}} → comes from the SQL query result

This ensures the model only works with real, retrieved data, not assumptions.
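The injection step can be sketched in a few lines of Python. This is an illustrative mock of the mechanics, not OPAF internals, and it assumes the template's two placeholders are named {{query}} and {{data}} ({{data}} appears in the text above; {{query}} is my assumption).

```python
# Assumed placeholder names: {{query}} and {{data}}.
PROMPT_TEMPLATE = """User Question:
{{query}}

Database Result:
{{data}}

Instructions:
- Answer the user question clearly
- Do NOT make up data
- If no results are found, say "No data found"
"""

def build_prompt(query: str, data: list) -> str:
    # Inject only retrieved rows: the model never sees unverified content.
    rendered = "\n".join(str(row) for row in data) if data else "(no rows)"
    return (PROMPT_TEMPLATE
            .replace("{{query}}", query)
            .replace("{{data}}", rendered))

prompt = build_prompt(
    "What is the average house price by city?",
    [("Toronto", 1_075_000.0), ("Mississauga", 900_000.0)],
)
print(prompt)
```

Because the data slot is filled exclusively from the SQL step's result set, the model's grounding is enforced by construction, not just by the "Do NOT make up data" instruction.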

 

2. Prompt Block (Control Layer)

The prompt acts as a governor for the LLM:

  • Forces the model to stay grounded in actual data.

  • Prevents hallucinations (“Do NOT make up data”).

  • Defines expected behavior (clear answer, fallback response).

This is critical when working with sensitive enterprise data.

 

3. Agent Block (Reasoning Engine)

  • The agent receives the fully structured prompt.

  • Uses the selected LLM to:

    • Interpret the question.

    • Analyze the database results.

    • Generate a clean, human-readable answer.

With a low temperature (0.01):

  • Responses are deterministic.

  • Focus is on accuracy over creativity.

Now, let’s discuss how we can create a prompt.

After you drag and drop the prompt, edit the text and add the context below.

Figure 6: Prompt agent

Once you save, you will receive two endpoints for connection:

  • Query

  • Data

Figure 7: Prompt agent – 02

This is how it appears after saving, showing two endpoint connectors.

Workflow Summary

This diagram shows how to build a custom AI agent using Oracle Private Agent Factory. A user question is captured through chat input, combined with data retrieved from the database via an SQL query, and formatted using a prompt. The agent then uses an LLM to analyze the data and derive insights, which are sent back to the user through the chat output, all within a secure environment.

Diagram:

User asks a question

[Chat Input] ──────────────────────────────┐
[SQL Query → TorontoHousingData] ──────────┤

[Prompt Template]
(combines Q + Data)

[Agent / LLM]
(GPT-120b, temp=0.01)

[Chat Output → User]
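The five stages in the diagram can be sketched end-to-end in plain Python. The LLM and SQL stages are stubbed with hypothetical placeholder functions (the real model and database sit behind OPAF), so this only illustrates how the pieces chain together.

```python
def chat_input() -> str:
    # 1. User entry point (hard-coded here for the sketch)
    return "What is the average house price by city?"

def run_sql_query() -> list:
    # 2. Controlled, read-only retrieval (stubbed result rows)
    return [("Toronto", 1_075_000.0), ("Mississauga", 900_000.0)]

def build_prompt(question: str, rows: list) -> str:
    # 3. Combine question + data under the template's rules
    data = "\n".join(f"{city}: {price:,.0f}" for city, price in rows)
    return f"User Question:\n{question}\n\nDatabase Result:\n{data}"

def agent(prompt: str, temperature: float = 0.01) -> str:
    # 4. Hypothetical stand-in for the OPAF agent / LLM call
    return "Average price -- Toronto: 1,075,000; Mississauga: 900,000"

def chat_output(answer: str) -> None:
    # 5. Final response back to the user
    print(answer)

chat_output(agent(build_prompt(chat_input(), run_sql_query())))
```

In OPAF the same chain is assembled visually by wiring the block connectors rather than writing code, but the data flow is the same.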

 

Workflow:

One thing to keep in mind is that the tool lets you connect to different LLMs depending on your use case. The pricing model also differs from traditional VM-based ones: it is token-based, so you pay for the amount of data you process and generate. This is a very different cost model that needs to be understood and optimized. You can read the article “Building a GenAI TCO: The Math of GenAI” to learn more about how GenAI costs work.

 

Analyzing the Output from Our AI Agent

The agent takes a simple natural-language question like “What is the average house price by city?”, turns it into an SQL query, and retrieves the most recent data from the database. The result is presented in a clear, organized way, showing average house prices by city along with key attributes such as province, year, and month.

This demonstrates how well the agent connects natural language queries with structured data, giving users accurate, real-time insights without them having to write any SQL.

For the second question, I asked the agent a more complicated analytical question to see how far I could push it:

Using a three-month rolling window, what are the sales volume trends for Toronto, and when does the rolling average reach its highest point?

This kind of question goes beyond simple aggregation. It requires the agent, built with Oracle Private Agent Factory, to perform time-based analysis, apply a rolling-average calculation, and identify trends over a moving window.
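Under the hood, a rolling-average question like this maps naturally to an SQL window function. Here is a minimal sketch using SQLite for illustration (the real source is an Oracle database); the table name, columns, and sample volumes are all assumptions.

```python
import sqlite3

# Hypothetical monthly sales-volume table; names and values are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (city TEXT, month TEXT, volume INTEGER)")
conn.executemany("INSERT INTO sales VALUES ('Toronto', ?, ?)", [
    ("2025-01", 400), ("2025-02", 500), ("2025-03", 600),
    ("2025-04", 900), ("2025-05", 300), ("2025-06", 300),
])

# Three-month rolling average via a window frame, then find its peak month.
rows = conn.execute("""
    SELECT month,
           AVG(volume) OVER (
               ORDER BY month
               ROWS BETWEEN 2 PRECEDING AND CURRENT ROW
           ) AS rolling_avg
    FROM sales
    WHERE city = 'Toronto'
""").fetchall()
peak = max(rows, key=lambda r: r[1])
print(peak)  # month where the 3-month rolling average peaks
```

Note that the first two rows average fewer than three months; a stricter analysis might exclude them, which is exactly the kind of judgment the agent has to encode when translating the question into SQL.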

 

Conclusion

I really mean it when I say that Oracle Private Agent Factory is a game-changer. Every business has a lot of data in its data warehouse environments, but raw data alone isn’t enough. The real opportunity lies in having quality data and then letting LLMs surface insights faster than any traditional BI method. When your data is clean and well organized, AI doesn’t just help; it accelerates everything.

The real magic happens when you build custom agents. But don’t underestimate the investment: you need to spend real time testing your data, trying out different LLM and GenAI models, and making small changes until the output quality meets your standards. There are no shortcuts, and that’s the point.

We often say we understand our data, but the key is being clear about what we want to achieve with it. Once that’s defined, you can design the right agent to guide you there. OPAF agents enable you to visualize and predict trends effectively, and when paired with the right LLM model, they can unlock powerful outcomes.

Oracle Private Agent Factory makes that possible without compromising on security. By keeping everything within Oracle’s dedicated AI data centres, organizations can innovate boldly while ensuring proprietary data never leaves a trusted, governed environment. For businesses that have been waiting for an enterprise-grade, secure AI agent platform, the wait is over.

The future belongs to those who know their data and are smart enough to use it. Contact us today to find out more.