Atlassian-hosted LLMs

Who can do this?
Role: Organization admin
Atlassian Cloud: Enterprise plan
Atlassian Government Cloud: Not available

While we take great care in designing Rovo AI features with responsible security and trust standards, some enterprise customers have heightened data requirements and need to block access to external LLM providers. These are typically organizations in regulated industries like finance or government, or those with only one pre-approved LLM provider. (We optimize for dynamic routing and can’t limit LLM processing to that provider alone.)

These customers can opt into Atlassian-hosted LLMs, allowing organizations with Cloud Enterprise to confidently adopt Rovo AI features while meeting their security and compliance standards. When this setting is enabled, no customer data leaves Atlassian’s Cloud boundary for LLM processing.

How Atlassian‑hosted LLMs work

Rovo normally uses dynamic routing between trusted third‑party LLMs and Atlassian‑hosted LLMs to select the best model for each request. Rovo runs on infrastructure managed by Atlassian and our cloud hosting providers.

When an organization enables Atlassian‑hosted LLMs, Rovo’s AI‑powered features rely only on LLMs that are hosted within the Atlassian Cloud boundary.

We still use a diverse range of open-weight, Atlassian-hosted LLMs, including models from OpenAI, Google, Meta, and Mistral AI.

Features still use Teamwork Graph, built from your teams' project and service work, to deliver results specific to your organization's context.

However, prompts and context used for LLM processing are not sent to external LLM providers (such as OpenAI).

Whether you opt into Atlassian-hosted LLMs or not, your inputs and outputs are never used to train or fine-tune external models.

For more detail on how we protect your data when using AI, see the Trust Center and our security practices.

Availability and scope

Atlassian‑hosted LLMs are available by request to Cloud Enterprise customers that activate Rovo AI.

  • Org‑level setting

    • The Atlassian‑hosted LLMs option is applied at the organization level, not per site.

  • Data location

    • Atlassian‑hosted LLMs are hosted in a US data center.

  • Feature coverage

    • Most Rovo features in Jira and Confluence (such as Search, Chat, and agents) work with Atlassian‑hosted LLMs.

      • Some multimodal features (for example, audio) may not be available.

    • Rovo features in Jira Service Management and other collections will be verified in the coming quarters.

Atlassian-hosted LLMs meet our commitments to enterprise-grade quality and latency. However, because the underlying set of models differs, you may notice variations in performance, latency, or response quality compared with the default configuration that uses both Atlassian‑hosted and third‑party LLMs.

You should review this option with your security, privacy, and compliance stakeholders to confirm that it meets your organization’s requirements.

Important considerations

  • If your team currently uses Rovo AI features and switches to Atlassian-hosted LLMs, your existing AI agents, as well as experiences like Search and Chat, won't have access to the same models and may return different responses.

  • When we enforce Rovo credit usage quotas in the future, customers with Atlassian-hosted LLMs will be subject to higher credit consumption rates.

How to opt in to Atlassian‑hosted LLMs

Atlassian‑hosted LLMs don’t have self‑serve controls in Atlassian Administration yet.

Org admins with Cloud Enterprise who would like to opt in should raise a request with:

 
