Bulk Move Worklogs Efficiently in Jira Cloud

Platform Notice: Cloud Only - This article only applies to Atlassian apps on the cloud platform.

Summary

This page shares a script for moving worklogs between issues. This allows new worklogs to be created on an issue once the 10,000-worklog limit is enforced.

Important Documentation

Get Issue Worklogs API

Bulk Move Issue Worklogs API

Solution

Get Data

Fetch the worklog IDs you plan to move using the Get Issue Worklogs API. Use the parameters startedAfter and startedBefore to narrow the timeframe (timestamps in milliseconds).

def get_worklog_ids(issue_key, after=None, before=None):
    url = f"{SITENAME}/rest/api/3/issue/{issue_key}/worklog?" \
          f"maxResults=5000&" \
          f"startedAfter={get_timestamp_in_milliseconds(after)}&" \
          f"startedBefore={get_timestamp_in_milliseconds(before)}"
    response = requests.get(url, auth=(USER, PWD), timeout=30)
    return [worklog['id'] for worklog in response.json()['worklogs']]
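The snippet above relies on a helper, get_timestamp_in_milliseconds, which is defined in the full script at the end of this page. As a standalone sketch, it converts a date string (in the script's DATE_TIME_FORMAT) into the millisecond epoch timestamp the startedAfter/startedBefore parameters expect:

```python
from datetime import datetime

# Adjust if you'd prefer to use a different format
DATE_TIME_FORMAT = '%Y-%m-%dT%H:%M:%S%z'

def get_timestamp_in_milliseconds(date):
    # An empty string leaves the query parameter effectively unset
    if not date:
        return ''
    # Parse the date string, convert to seconds since epoch, then to milliseconds
    return 1000 * int(datetime.strptime(date, DATE_TIME_FORMAT).timestamp())

print(get_timestamp_in_milliseconds('2023-01-01T11:00:00+1100'))  # 1672531200000
```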

Get Data by author

To move worklogs authored by a specific user, you must filter the results yourself, as the Get Issue Worklogs API does not accept a user parameter.

Example script in Python 3 (by email address)

def get_worklog_ids_by_emails(issue_key, emails, after=None, before=None):
    url = f"{SITENAME}/rest/api/3/issue/{issue_key}/worklog?" \
          f"maxResults=5000&" \
          f"startedAfter={get_timestamp_in_milliseconds(after)}&" \
          f"startedBefore={get_timestamp_in_milliseconds(before)}"
    response = requests.get(url, auth=(USER, PWD), timeout=30)
    worklog_ids = []
    for worklog in response.json()['worklogs']:
        if 'emailAddress' in worklog['author'] and worklog['author']['emailAddress'] in emails:
            worklog_ids.append(worklog['id'])
    return worklog_ids

Move Worklogs

Move the worklogs by calling the Bulk Move Issue Worklogs API with the IDs returned by the previous call.

Example script in Python 3

def bulk_move_worklogs(issue_key, worklog_ids, destination_issue):
    url = f"{SITENAME}/rest/api/3/issue/{issue_key}/worklog/move"
    body = {
        "issueIdOrKey": destination_issue,
        "ids": worklog_ids
    }
    response = requests.post(url, json=body, auth=(USER, PWD), timeout=30)
    return response.status_code, response.text

Full script

The following script moves all worklogs started after STARTED_AFTER and before STARTED_BEFORE from work item SOURCE_ISSUE_KEY to work item DESTINATION_ISSUE_KEY.

from datetime import datetime

import requests

USER = 'admin@yahoo.com'
PWD = 'prod_token'
SITENAME = "the instance address"
SOURCE_ISSUE_KEY = 'TEST-1'
DESTINATION_ISSUE_KEY = 'TEST-2'
# Adjust if you'd prefer to use a different format
DATE_TIME_FORMAT = '%Y-%m-%dT%H:%M:%S%z'
# Can be set to None
STARTED_AFTER = '2023-01-01T11:00:00+1100'
STARTED_BEFORE = '2024-01-01T11:00:00+1100'


def get_worklog_ids(issue_key, after=None, before=None):
    url = f"{SITENAME}/rest/api/3/issue/{issue_key}/worklog?" \
          f"maxResults=5000&" \
          f"startedAfter={get_timestamp_in_milliseconds(after)}&" \
          f"startedBefore={get_timestamp_in_milliseconds(before)}"
    response = requests.get(url, auth=(USER, PWD), timeout=30)
    return [worklog['id'] for worklog in response.json()['worklogs']]


def get_worklog_ids_by_emails(issue_key, emails, after=None, before=None):
    url = f"{SITENAME}/rest/api/3/issue/{issue_key}/worklog?" \
          f"maxResults=5000&" \
          f"startedAfter={get_timestamp_in_milliseconds(after)}&" \
          f"startedBefore={get_timestamp_in_milliseconds(before)}"
    response = requests.get(url, auth=(USER, PWD), timeout=30)
    worklog_ids = []
    for worklog in response.json()['worklogs']:
        if 'emailAddress' in worklog['author'] and worklog['author']['emailAddress'] in emails:
            worklog_ids.append(worklog['id'])
    return worklog_ids


def get_timestamp_in_milliseconds(date):
    if not date:
        return ''
    return 1000 * int(datetime.strptime(date, DATE_TIME_FORMAT).timestamp())


def bulk_move_worklogs(issue_key, worklog_ids, destination_issue):
    url = f"{SITENAME}/rest/api/3/issue/{issue_key}/worklog/move"
    body = {
        "issueIdOrKey": destination_issue,
        "ids": worklog_ids
    }
    response = requests.post(url, json=body, auth=(USER, PWD), timeout=30)
    return response.status_code, response.text


while True:
    worklog_ids = get_worklog_ids(SOURCE_ISSUE_KEY, STARTED_AFTER, STARTED_BEFORE)
    if len(worklog_ids) == 0:
        print("No more worklogs to move")
        break
    status, body = bulk_move_worklogs(SOURCE_ISSUE_KEY, worklog_ids, DESTINATION_ISSUE_KEY)
    # partial success
    if status == 200:
        print(f"Successfully moved *some* worklogs, response \n\"{body}\"")
    elif status == 204:
        print(f"Successfully moved {len(worklog_ids)} worklogs from issue {SOURCE_ISSUE_KEY} to issue {DESTINATION_ISSUE_KEY}")
    else:
        print(f"Received different status than HTTP 200/204 from the Bulk Move endpoint, status {status}, response body \n\"{body}\"")
        break

FAQ

How many worklogs, at max, can be moved at a time?

The limit, as described in the API docs, is 5,000 worklogs.
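If more than 5,000 worklogs need to move, one approach is to split the fetched IDs into batches before calling the Bulk Move API. The helper below is a hypothetical sketch (chunk_worklog_ids is not part of the article's script):

```python
def chunk_worklog_ids(worklog_ids, batch_size=5000):
    # Split a flat list of worklog IDs into batches no larger than the API limit
    return [worklog_ids[i:i + batch_size]
            for i in range(0, len(worklog_ids), batch_size)]

# 12,000 IDs would be moved in three calls: 5000 + 5000 + 2000
batches = chunk_worklog_ids(list(range(12000)))
print([len(batch) for batch in batches])  # [5000, 5000, 2000]
```

Note that the full script above achieves the same effect implicitly: it re-fetches IDs in a loop until none remain.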

What are the various criteria that can be used to identify the worklogs to move?

The Bulk Move API only accepts worklog IDs. Those IDs must be fetched using one of the available APIs, for example, the Get Issue Worklogs API.

What is the maximum time range to get worklog IDs?

The Get Issue Worklogs API does not limit the time range, as long as the timestamps are valid.

Are worklogs moved from a single work item at a time?

The worklogs must be from a single work item, specified in the {issueIdOrKey} API parameter.

Are worklogs moved to a single work item at a time?

The worklogs can't be moved to more than one work item with one API call.

What if I move worklogs by mistake? Can I restore them back?

The worklogs can be restored by calling the same API, but with the work items reversed.
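In terms of the article's helpers, undoing a move means fetching the IDs now sitting on the destination and moving them back. The wrapper below is a hypothetical sketch that takes the two functions as parameters so it can be shown (and tested) without a live instance:

```python
def undo_bulk_move(get_ids, move, source_issue, destination_issue,
                   after=None, before=None):
    # Fetch the worklog IDs currently on the destination work item,
    # then move them back to the original source work item
    ids = get_ids(destination_issue, after, before)
    return move(destination_issue, ids, source_issue)

# Usage with the article's functions would look like:
# undo_bulk_move(get_worklog_ids, bulk_move_worklogs, 'TEST-1', 'TEST-2')
```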

Does the Bulk Move API update the time-tracking field?

No, time-tracking values are not updated on either work item.

Does the Bulk Move API update work item change history?

No, work item change history is not updated on either work item.

Updated on September 25, 2025