
Integrate with Splunk IT Service Intelligence

The feature described in this article is currently rolling out to some Jira Service Management Cloud customers. It may not yet be visible or available on your site.


Splunk ITSI's notable event functionality can create alerts or trigger scripts. Use Splunk ITSI alerts to monitor for and respond to specific events. Alerts use a saved search to look for events in real time or on a schedule, and are triggered when search results meet specific conditions. Alert actions can then be used to respond when alerts trigger.

What does the integration offer?

Splunk IT Service Intelligence easily sifts through vast amounts of events by filtering and sorting them based on priority. Additionally, Splunk ITSI triggers alerts, initiates remediation, and automates incident workflows, while real-time anomaly detection reduces alert fatigue. Through the Jira Service Management for Splunk ITSI app, you can forward Splunk ITSI alerts to Jira Service Management. With the Splunk ITSI integration, Jira Service Management acts as a dispatcher for these alerts: it determines the right people to notify based on on-call schedules, notifies them via email, text message (SMS), phone call, and iPhone and Android push notifications, and escalates the alert until it is acknowledged or closed.

How does the integration work?

Jira Service Management has a Splunk ITSI-specific alert app and a dedicated API for the Splunk ITSI integration. Splunk ITSI sends alerts through the Jira Service Management for Splunk ITSI app, and Jira Service Management handles the automatic creation of alerts.
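Conceptually, the app issues an HTTP POST of the alert payload to the integration's API URL. The following is a minimal sketch of that flow, not the app's actual implementation; the helper names and payload fields are illustrative (the fields mirror the sample payload shown later in this article):

```python
import json
import urllib.request


def build_alert_payload(title, description, priority="P3"):
    """Assemble a minimal alert payload. Field names mirror the sample
    Splunk ITSI payload shown later in this article (illustrative only)."""
    return {
        "title": title,
        "description": description,
        "priority": priority,
    }


def send_alert(api_url, payload):
    """POST the payload to the API URL copied from Jira Service Management.
    Hypothetical helper -- the Splunk ITSI app handles this internally."""
    req = urllib.request.Request(
        api_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

In practice you never call such a helper yourself; the app sends the payload whenever the configured alert action fires.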

Set up the integration

The Splunk ITSI integration is bidirectional. Setting it up involves the following steps:

  • Add a Splunk ITSI integration in Jira Service Management

  • Configure the integration in Splunk ITSI

Add a Splunk ITSI integration

If you're using the Free or Standard plan in Jira Service Management, you can only add this integration from your team’s operations page. To access the feature through Settings (gear icon) > Products (under JIRA SETTINGS) > OPERATIONS, you need to be on a Premium or Enterprise plan.

Adding an integration from your team’s operations page makes your team the owner of the integration. This means Jira Service Management only assigns the alerts received through this integration to your team.

To add a Splunk ITSI integration in Jira Service Management, complete the following steps:

  1. Go to your team’s operations page.

  2. On the left navigation panel, select Integrations and then Add integration.

  3. Run a search and select “Splunk ITSI”.

  4. On the next screen, enter a name for the integration.

  5. Optional: Select a team in Assignee team if you want a specific team to receive alerts from the integration.

  6. Select Continue.
    The integration is saved at this point.

  7. Expand the Steps to configure the integration section and copy the API URL.
    You will use this URL while configuring the integration in Splunk ITSI later.

  8. Select Turn on integration.
    The rules you create for the integration will work only if you turn on the integration.

Configure the integration in Splunk ITSI

To configure the integration of Splunk ITSI with Jira Service Management, complete the following steps:

  1. In Splunk, install the Jira Service Management for Splunk ITSI app from Splunkbase.

  2. Go to your apps and select Set Up to configure the Jira Service Management app.

  3. Select the notable event for which you want to create an alert.

  4. Select Manage Event with Jira Service Management from Actions.

  5. Paste the API URL previously copied from Jira Service Management into JSM API URL.

  6. Set the priority.

  7. Select Done.

Update the API URL

To change the defaults, update the API URL and priority in alert_actions.conf under $SPLUNK_HOME/etc/apps/jsm_itsi/default.
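As a sketch, the override might look like the following. The stanza and parameter names here are illustrative, not the app's documented keys; copy the actual names from the shipped default file before overriding them:

```
# Illustrative stanza -- actual stanza and parameter names may differ;
# check $SPLUNK_HOME/etc/apps/jsm_itsi/default/alert_actions.conf first.
[jsm_itsi_alert]
param.api_url = <your JSM API URL>
param.priority = P3
```

As with any Splunk app, place local overrides in a local/ copy of the file rather than editing the default/ version directly.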

Review the app logs

If any problems occur while creating an alert in Jira Service Management, check the app logs in $SPLUNK_HOME/var/log/splunk/jsm_itsi.log.

Create alerts in Notable Events Review

  1. In IT Service Intelligence, go to the Notable Events Review page.

  2. Select a Notable Event to create an alert for.

  3. Select Actions > Manage Event with Jira Service Management.

Create automated alerts using Notable Event Aggregation Policies

  1. Select Configure > Notable Event Aggregation Policies.

  2. Create a New Notable Event Aggregation Policy or edit an existing one.

  3. In the Action Rules tab, add a rule and set its action to "Manage Event with Jira Service Management".

  4. Select Configure, paste the API URL copied previously, and select the priority.

  5. Select Save.

Map alert actions

Jira Service Management allows you to map alert actions between Jira Service Management actions and Splunk ITSI.

For alerts created by Splunk ITSI

  • Use this section to map Jira Service Management actions to Splunk ITSI actions when the source of the alert is Splunk ITSI (that is, when the alert is created by the Splunk ITSI integration).

  • Map different Jira Service Management actions to different Splunk ITSI actions. For example, to set the Notable Event status to "In Progress" in Splunk ITSI when the alert is acknowledged, define the mapping "If alert is acknowledged in Jira Service Management, update Notable Event status to In Progress" in the "Send Alert Updates Back to Splunk ITSI" section.

After the update action is triggered in Jira Service Management, updates are sent back to the Notable Event and can be viewed in the "Activity" tab.
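The acknowledge-to-In-Progress rule above can be pictured as a simple lookup. This is a sketch only: the real mapping is configured in the integration UI, not in code, and the "closed" entry is a hypothetical example, not a documented default:

```python
# Hypothetical mapping of Jira Service Management alert actions to
# Splunk ITSI Notable Event statuses, mirroring the UI configuration.
ACTION_TO_NOTABLE_STATUS = {
    "acknowledged": "In Progress",
    "closed": "Closed",  # illustrative entry, not a documented default
}


def notable_status_for(jsm_action):
    """Return the Notable Event status mapped to a JSM alert action,
    or None when no mapping is defined for that action."""
    return ACTION_TO_NOTABLE_STATUS.get(jsm_action)
```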

Jira Service Management Priority to Splunk ITSI Severity Mapping

After you update an alert's priority, the action is sent to Splunk ITSI and updates the severity of the Notable Event:

P1 Alert > Critical
P2 Alert > High
P3 Alert > Medium
P4 Alert > Low
P5 Alert > Info
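The mapping above translates directly into a lookup table. This sketch restates the documented priority-to-severity pairs; the function name is illustrative:

```python
# Jira Service Management alert priority -> Splunk ITSI Notable Event
# severity, exactly as listed in the mapping above.
PRIORITY_TO_SEVERITY = {
    "P1": "Critical",
    "P2": "High",
    "P3": "Medium",
    "P4": "Low",
    "P5": "Info",
}


def itsi_severity(priority):
    """Look up the ITSI severity for a JSM priority label such as "P1"."""
    return PRIORITY_TO_SEVERITY[priority]
```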

Sample payload sent from Splunk and dynamic properties in Jira Service Management

Create Alert payload

JSON

{
  "title": "Tutorial",
  "severity_label": "test+label",
  "timeDiff": "900.000",
  "search_now": "1527605880.000",
  "service_kpi_ids": "92c7c8cc-70d4-4976-8c4d-f87d6234ca28:b5d1e7b2e8d1bcc64f",
  "is_service_max_severity_event": "1",
  "priority": "P3",
  "entity_title": "",
  "gs_service_id": "92c7c8cc-70d4-4976-8c4d-f874ca28",
  "composite_kpi_id": "test_kpi_id",
  "orig_index": "itsi_summary",
  "percentage": "100",
  "search_type": "composite_kpi_percentage_type",
  "health_score": "5",
  "search_name": "Buttercup Correlation Search",
  "linecount": "1",
  "color": "",
  "kpi": "4xx Errors Count",
  "scoretype": "2",
  "enabled": "1",
  "alert_severity": "urgent",
  "is_service_in_maintenance": "0",
  "owner": "unassigned",
  "event_identifier_hash": "f90c181ba0b3291864f50e6d5313f4a5ceb701c19b307bef72b8",
  "indexed_is_service_max_severity_event": "1",
  "alert_period": "5",
  "is_service_aggregate": "1",
  "source": "Buttercup Correlation Search",
  "composite_kpi_name": "test_kpi_name",
  "occurances": "1",
  "drilldown_uri": "null",
  "indexed_is_service_aggregate": "1",
  "splunk_server_group": "14",
  "indexed_itsi_kpi_id": "b5d1e7b2e8d1b5cc64f",
  "orig_severity": "normal",
  "kpi_name": "",
  "timeDiffInMin": "15.0000",
  "severity": "4",
  "host": "ip-172-31-38-242",
  "indexed_itsi_service_id": "92c7c8cc-70d4-4976-8c4d-f87ca28",
  "service_name": "",
  "change_type": "21",
  "index": "itsi_tracked_alerts",
  "total_occurrences": "3",
  "service_ids": "92c7c8cc-70d4-4976-8c4d-f87da28",
  "kpibasesearch": "DA-ITSI-APPSERVER_Performance_Web_Transaction",
  "all_info": "4xx Errors Count had severity value normal 1 times in Last 15 minutes",
  "gs_kpi_id": "b5d1e794555cc64f",
  "orig_sourcetype": "stash",
  "tag": "test",
  "statement": "4xx Errors Count had severity value normal 1 times in Last 15 minutes",
  "latest_alert_level": "test+level",
  "drilldown_search_search": "null",
  "splunk_server": "ip-172-24-94-292",
  "kpi_urgency": "",
  "urgency": "0",
  "kpiid": "b5d1e7b2e8d1b55cc64f",
  "alert_color": "#99D18B",
  "alert_value": "0",
  "time": "test_time",
  "description": "4xx Errors Count had severity value normal 1 times in Last 15 minutes",
  "alert_level": "2",
  "event_id": "233aa640-6337-11e8-8ffe-06f182",
  "severity_value": "4",
  "event_description": "4xx Errors Count had severity value normal 1 times in Last 15 minutes"
}
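Jira Service Management reads dynamic properties out of payloads like this one. As a sketch, extracting a few of the fields an alert-creation step might use (the field choice and helper name are illustrative, not the integration's actual logic):

```python
import json

# A trimmed-down version of the sample payload above, for illustration.
SAMPLE = json.loads("""
{
  "title": "Tutorial",
  "priority": "P3",
  "description": "4xx Errors Count had severity value normal 1 times in Last 15 minutes",
  "event_id": "233aa640-6337-11e8-8ffe-06f182"
}
""")


def summarize(payload):
    """Pull out fields an alert-creation step might use, with fallbacks
    for payloads that omit them."""
    return {
        "title": payload.get("title", "Untitled"),
        "priority": payload.get("priority", "P3"),
        "description": payload.get("description", ""),
    }
```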

 
