Prometheus Integration

Create alerts in ilert from Prometheus Alertmanager alerts

Prometheus is an open-source systems monitoring and alerting toolkit that uses a pull-based approach to collect metrics and stores them in a time series database.

With ilert's Prometheus integration, you can automatically create alerts in ilert from Prometheus Alertmanager. That way, you will never miss a critical alert and always alert the right person using ilert's on-call schedules, automatic escalations, and multiple alerting channels. When the Alertmanager triggers an alert, ilert notifies the on-call person through their preferred channel, including SMS, phone calls, push notifications, and Slack. If the alert is not acknowledged, ilert automatically escalates it to the next person. ilert also lets you define alerting rules based on support hours and delay alerts until your support hours start.

System Requirements

In ilert: Create a Prometheus alert source

  1. Go to Alert sources --> Alert sources and click on Create new alert source.

  2. Search for Prometheus in the search field, click on the Prometheus tile and click on Next.

  3. Give your alert source a name, optionally assign teams and click Next.

  4. Select an escalation policy by creating a new one or assigning an existing one.

  5. Select your Alert grouping preference and click Continue setup. You may click Do not group alerts for now and change it later.

  6. The next page shows additional settings such as custom alert templates or notification priority. Click on Finish setup for now.

  7. On the final page, an API key and / or webhook URL will be generated that you will need later in this guide.

In Prometheus Alertmanager: add a webhook receiver

1. Add a webhook configuration to the Alertmanager configuration file. Use the URL generated in ilert as the webhook URL:

receivers:
- name: 'ilert.web.hook'
  webhook_configs:
  - url: 'https://api.ilert.com/api/v1/events/prometheus/e6bcfcbf-a38f-462a-af9d-1687809b7594'

2. You can now configure any route in the Alertmanager. In the following example, all alerts that do not match another route are sent to ilert (a combined minimal configuration is sketched after these steps):

route:
  group_by: ['alertname']
  group_wait: 10s
  group_interval: 10s
  repeat_interval: 1h
  receiver: 'ilert.web.hook'

3. Restart the Alertmanager.

4. Optional: Send a test alert through the Alertmanager API.

curl -d '[{"labels":{"alertname":"ilert Test"},"annotations":{"summary":"ilert Test"}}]' http://localhost:9093/api/v1/alerts
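For orientation, here is a minimal combined alertmanager.yml that puts the route and receiver from the steps above together. It is a sketch only; the grouping and timing values are illustrative and should be adapted to your setup, and the key at the end of the URL must be replaced with the one from your own ilert alert source:

route:
  group_by: ['alertname']
  group_wait: 10s
  group_interval: 10s
  repeat_interval: 1h
  receiver: 'ilert.web.hook'

receivers:
- name: 'ilert.web.hook'
  webhook_configs:
  # Replace the key at the end of the URL with the key from your ilert alert source
  - url: 'https://api.ilert.com/api/v1/events/prometheus/e6bcfcbf-a38f-462a-af9d-1687809b7594'

If amtool is installed alongside the Alertmanager, you can validate the file before restarting with amtool check-config alertmanager.yml.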

Dynamic policy routing

ilert's Prometheus integration supports dynamic escalation policy routing with the help of routing keys.

In ilert, navigate to the escalation policies that you want to route to and enter a unique routing key for each policy.

In your Prometheus alert rule YAML, add a label called ilert_routingkey and set its value to the routing key of the policy that you want to assign to the alert, e.g. ilert_routingkey: policy1. A rule sketch follows below.
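As a minimal sketch, a Prometheus alerting rule carrying the routing key label could look as follows. The alert name, expression, severity, and the routing key value policy1 are placeholders for illustration:

groups:
- name: ilert-routing-example
  rules:
  - alert: HighRequestLatency          # placeholder alert name
    expr: job:request_latency_seconds:mean5m{job="myjob"} > 0.5
    for: 10m
    labels:
      severity: critical
      ilert_routingkey: policy1        # routing key defined on the ilert escalation policy
    annotations:
      summary: "High request latency on {{ $labels.instance }}"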

When ilert receives Prometheus alert events, it looks for the first alert with this label and decides the routing based on it. If the label is not present, the escalation policy assigned to the alert source is used instead.

Supported custom labels

  • gcp_project will be automatically added to the alert summary

  • url may be used to add a custom link to the alert

  • urlLabel may be used to set a defined label for the custom link (url)

Note: for custom labels to be accepted, they must be part of the alert labels of an alert in status firing.
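For illustration, the following labels block, placed inside the labels section of a firing alerting rule such as the one sketched above, adds a custom link to the resulting ilert alert. The URL and the link text are placeholders:

    labels:
      severity: critical
      url: 'https://wiki.example.com/runbooks/high-latency'   # placeholder runbook link
      urlLabel: 'Runbook'                                      # text shown for the custom link (url)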

FAQ


Will alerts in ilert be resolved automatically?

Yes, Prometheus also sends resolved events by default, as long as the send_resolved: false option is NOT set in the webhook configuration of the Alertmanager. Furthermore, resolved events, just like firing events, are only sent once the next group_interval in the Alertmanager has elapsed.
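If you want to make this behavior explicit, you can set send_resolved directly on the webhook receiver. This is only a sketch, since true is already the default:

receivers:
- name: 'ilert.web.hook'
  webhook_configs:
  - url: 'https://api.ilert.com/api/v1/events/prometheus/e6bcfcbf-a38f-462a-af9d-1687809b7594'
    send_resolved: true   # default; set to false to suppress resolve events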

Can I link Prometheus to multiple alert sources in ilert?

Yes, create several webhook receivers in the Alertmanager and enter the URL of the corresponding ilert alert source as the webhook URL in each receiver.
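A sketch of such a setup with two alert sources could look as follows. The team label used in the matcher, the receiver names, and both integration keys are placeholders; the matchers syntax requires a recent Alertmanager version (older versions use match instead):

route:
  receiver: 'ilert.web.hook'
  routes:
  - matchers:
    - team = "database"              # placeholder label matcher
    receiver: 'ilert.database.hook'

receivers:
- name: 'ilert.web.hook'
  webhook_configs:
  - url: 'https://api.ilert.com/api/v1/events/prometheus/<key-of-first-alert-source>'
- name: 'ilert.database.hook'
  webhook_configs:
  - url: 'https://api.ilert.com/api/v1/events/prometheus/<key-of-second-alert-source>'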

What if my internet connection is interrupted? Are the alerts generated in Prometheus lost?

No, alerts are not lost. The Alertmanager has a retry mechanism. In addition, we recommend that you monitor your internet connection with an external monitoring service, e.g. using ilert's heartbeat feature or uptime monitoring. See the Prometheus heartbeat example for details.

Not all Prometheus alerts are created in ilert. Why?

Alerts from Prometheus are grouped by the Alertmanager and bundled into a single ilert alert. Grouping is controlled by the group_by configuration in the Alertmanager route.

Example:

route:
  # The labels by which incoming alerts are grouped together. For example,
  # multiple alerts coming in for cluster=A and alertname=LatencyHigh would
  # be batched into a single group.
  #
  # To aggregate by all possible labels use '...' as the sole label name.
  # This effectively disables aggregation entirely, passing through all
  # alerts as-is. This is unlikely to be what you want, unless you have
  # a very low alert volume or your upstream notification system performs
  # its own grouping. Example: group_by: [...]
  group_by: ['alertname', 'cluster', 'service']

The integration does not work. How do I find the issue?

First, look in the log file of the Alertmanager. If you cannot find the issue, please contact our support at support@ilert.com.
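How you access the log depends on how the Alertmanager is run; the service and container names below are assumptions for a typical setup:

# If the Alertmanager runs as a systemd service (assumed service name: alertmanager)
journalctl -u alertmanager -f

# If it runs in Docker (assumed container name: alertmanager)
docker logs -f alertmanager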

Further References

The following guide describes how to import metrics from Prometheus and display them on your ilert status page:

Import metrics from Prometheus
