Who This Guide Is For
This guide is for engineers and observability teams who rely on Datadog and want to get more out of their existing setup — without adding operational overhead, complexity, or risk. You might be here because:
- You’re happy with Datadog’s real-time workflows, but want a more cost-effective way to retain logs for weeks, months, or even years
- Your team needs long-term access to business-critical log data without dealing with cold storage or rehydration delays
- You’re forced to drop or exclude valuable logs — not because they’re low priority, but because you’re working within budget constraints that limit visibility and slow down troubleshooting
- You’re looking to reduce redundant tooling by complementing Datadog with a purpose-built system for deep search and long-term log access
1. Getting Logs into Bronto
Bronto is designed to complement your existing Datadog setup, allowing you to forward logs to both platforms seamlessly. By configuring the Datadog Agent to send a copy of your logs to Bronto, you can maintain your current workflows while extending your log retention and search capabilities.
Step 1: Generate a Bronto API Key
- Log in to your Bronto account.
- Navigate to Settings > API Keys.
- Click on Create API Key, provide a name for the key, and copy it to a secure location.
Step 2: Configure the Datadog Agent
In your Datadog Agent configuration file, add an additional log endpoint for Bronto, replacing YOUR-BRONTO-API-KEY with your key and <REGION> with us or eu.
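Here is a minimal sketch of what that can look like in datadog.yaml, using the Agent’s additional log endpoints (dual shipping). It assumes Bronto accepts the Agent’s HTTP log intake; the hostname below is an illustrative placeholder, so use the intake endpoint shown in your Bronto workspace.
```yaml
# datadog.yaml — sketch: ship a copy of all logs to a second endpoint.
# The Bronto hostname is an assumed placeholder; substitute the intake
# endpoint from your Bronto workspace, your API key, and your region.
logs_config:
  use_http: true
  additional_endpoints:
    - api_key: "YOUR-BRONTO-API-KEY"
      Host: "ingestion.<REGION>.bronto.io"   # placeholder host; <REGION> is us or eu
      Port: 443
      is_reliable: true                      # retry delivery instead of best effort
```
Leaving the primary Datadog endpoint untouched means your existing pipelines and monitors keep working exactly as before.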
Step 3: Restart the Agent and Confirm Logs Are Arriving
Visit your Bronto workspace and go to the Search page to verify that logs are being ingested.
Other Integration Options
If you’re not using the Datadog Agent — or you’re planning to send data directly to Bronto — we support a number of other ingestion methods. A common alternative is Fluent Bit, which offers flexible configuration and wide plugin support. For a full list of supported agents and integrations, visit our Agent Setup documentation.
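For illustration, a Fluent Bit HTTP output along these lines can forward every record to Bronto. The host and header name are assumptions for the sketch; take the exact values from the Agent Setup documentation.
```
# fluent-bit.conf — sketch: forward all records to Bronto over HTTPS as JSON.
# Host and Header values are illustrative placeholders; use the real intake
# endpoint and API-key header documented in Bronto's Agent Setup guide.
[OUTPUT]
    Name    http
    Match   *
    Host    ingestion.<REGION>.bronto.io
    Port    443
    URI     /
    Format  json
    TLS     On
    Header  x-bronto-api-key YOUR-BRONTO-API-KEY
```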
2. Bronto’s Data Model: Collections, Datasets, and Metadata
Bronto uses a simple and flexible model to organize your logs:
- Datasets represent individual log streams, typically aligned to a service or log type (e.g., payments-api, nginx)
- Collections group related datasets together — such as by environment, team, or function (e.g., production, web-tier)
Standard metadata maps onto this model as follows:
- service.name → Dataset
- service.namespace → Collection
- environment, version → searchable context
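As a concrete (and purely illustrative) example, a log record carrying these attributes would land as follows:
```
service.name: payments-api      # -> Dataset
service.namespace: web-tier     # -> Collection
environment: production         # -> searchable context
version: 2.3.1                  # -> searchable context
```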
3. Search That Just Works
Bronto’s search is built to feel familiar. If you’ve used Datadog, Kibana, or Splunk, you’ll be right at home. Just type a keyword or key-value pair. Bronto autocompletes based on your existing metadata like service.name, env, and version. There’s no need to preselect a dataset or learn a new syntax. It just works.
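For instance, a query along these lines narrows results to one service and environment while matching a keyword. The field names depend on the metadata you ship, and the precise operators are covered in the search docs linked below.
```
service.name:payments-api env:production "connection timeout"
```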
One thing that might feel unfamiliar is that you don’t have to choose which logs to index. In Bronto, all logs are indexed and searchable by default.
For advanced use cases, Bronto supports SQL-like queries, but most users won’t need them to get started.
See full search syntax and tips in our docs.
4. Tagging and Metadata in Bronto
Bronto automatically picks up the tags and metadata you’re already using in Datadog. If your logs include fields like service, env, or version, they’re available in Bronto out of the box and ready to use in search.
You can also add or adjust tags via the UI or API as needed.
5. Which Workflows to Mirror — and Which to Leave in Datadog
You don’t need to move everything to Bronto all at once. Most teams start gradually by focusing on the logs that benefit most from longer retention and deep search — and expand as they see value.
Mirror in Bronto:
- Long-term investigations
- Security and compliance logs
- Logs you currently drop for cost reasons
- Anything useful beyond Datadog’s retention window
Keep in Datadog:
- Real-time alerts and monitors
- SLO dashboards
- Live tail and active triage tools
6. Correlating Metrics in Datadog with Logs in Bronto
If your teams rely on Datadog dashboards to monitor metrics or traces, Bronto can serve as the next step in your investigation workflow — giving you fast, searchable access to logs even after they’ve aged out of Datadog’s retention window.
The easiest way to connect Datadog to Bronto is with context links:
These allow you to add a custom link to Datadog graphs, monitors, or dashboards that opens a Bronto search page filtered to the relevant service, environment, and time range.
How to set it up:
- In Datadog, go to Dashboards > Context Links and create a new link.
- Set the display name to something like “View logs in Bronto.”
- Use the following URL format (adjusting for your Bronto region — us, eu, etc.):
  https://app.<REGION>.bronto.io/search?query=[...]
- Map Datadog template variables:
  - $service to your service tag
  - $env to your environment tag
  - $fromTs and $toTs to timestamps provided by the Datadog time window
- Save and apply the link to relevant dashboards or monitors.
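As a purely hypothetical illustration, a mapped link could end up looking something like the line below. The query-string parameter names here (query, from, to) are placeholders invented for this sketch; the real format is covered in the search syntax reference linked below.
```
https://app.us.bronto.io/search?query=service.name:$service%20env:$env&from=$fromTs&to=$toTs
```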
For details on building the query, see the Bronto search syntax documentation.
7. Retention Strategy: Keep 3 Days in Datadog, 1 Year in Bronto
The most popular Bronto usage pattern is splitting log retention across tools:
- Datadog: 3–5 days of logs for alerts and real-time workflows
- Bronto: 90 days, 1 year, or more for deep investigations, audits, and searchability
This split lets you:
- Retain more logs overall
- Reduce gaps and blind spots
- Avoid expensive cold storage and rehydration delays
Longer retention is especially valuable when you’re:
- Reconstructing user behavior
- Debugging edge-case failures
- Verifying the impact of a regional outage
If you’re not sure where to start, begin by routing one high-volume log type — such as CDN access logs — into Bronto and see what it unlocks.
Appendix: Quick Reference Mappings
| Concept | In Datadog | In Bronto |
|---|---|---|
| Log grouping | Index | Collection + Dataset |
| Log source | service | service.name |
| Environment | env | env |
| Version | version | version |
| Search/filter syntax | key:value | key:value |
| Autocomplete | Tag-based | Attribute and tag-based |
| Retention extension | Cold storage or rehydration | Native, hot, searchable |

