Import historical data
Prerequisites:
- Workforce Management > Historical Data > Upload permission
- Queues, skills, and languages configured in Genesys Cloud
To upload historical data, perform the following steps:
- Click Admin.
- Under Workforce Management, click Historical Data Import.
- Click the calendar under Historical Data End Date (UTC) and choose the latest date and time to use in the import file.
- Under File to Upload, click Browse.
- Navigate to the location that contains the file you want to import, and select it.
- Click Import.
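If you prefer to script the upload rather than use the UI, the Genesys Cloud Platform API exposes workforce management endpoints. The following is a minimal Python sketch using `requests`; the endpoint path and payload shape here are assumptions for illustration only, so check the Platform API reference for the actual route and schema:

```python
import requests

API_HOST = "https://api.mypurecloud.com"  # region-specific API host
TOKEN = "YOUR_OAUTH_TOKEN"                # OAuth bearer token for your client

def upload_historical_data(csv_path: str) -> dict:
    """Upload a historical data CSV for import (sketch, not a verified route)."""
    with open(csv_path, "rb") as f:
        resp = requests.post(
            # Hypothetical route; consult the Platform API docs for the
            # real workforce management historical data import endpoint.
            f"{API_HOST}/api/v2/workforcemanagement/historicaldata/csv",
            headers={"Authorization": f"Bearer {TOKEN}"},
            files={"file": f},
        )
    resp.raise_for_status()
    return resp.json()
```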
After the file imports, the validation process begins. The Import History table provides status and validation details about the imported data.
Validation details
Status | Description |
---|---|
Import Initiated | The system accepts the request and begins to upload the data. |
Import in progress | The system begins to validate the file. |
Import pending | Validation passed successfully and the batch process begins. The import process completes at approximately 3:00 a.m. local time in each region. |
Import Successful (Active) | The batch process successfully processed the file and the data is ready to use. |
Import Successful | The batch process successfully processed the file, but a newer successful upload replaced it. |
Import Failed | The upload was not successful and the file contains errors. Click the link in the Validation Results column for more information. |
Import Canceled | A new upload was initiated and validated successfully, which canceled the previous request. |
Purge Pending | A historical data purge request was initiated successfully and is waiting for a batch process to purge the data from Genesys Cloud. The purge process completes at approximately 3:00 a.m. local time in each region. Note: This entry only appears if another table entry contains a Successful (Active) status. |
Purge Successful | A data purge process completed successfully. This entry only appears if another table entry contains a Successful (Active) status. |
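If you script around the import (for example, polling until the status settles), the statuses in this table split cleanly into in-flight and terminal states. A minimal Python sketch; the grouping and function name are ours, derived only from the table above:

```python
# Statuses from the table above, grouped by whether work is still in flight.
IN_FLIGHT = {"Import Initiated", "Import in progress", "Import pending",
             "Purge Pending"}
TERMINAL = {"Import Successful (Active)", "Import Successful",
            "Import Failed", "Import Canceled", "Purge Successful"}

def is_terminal(status: str) -> bool:
    """Return True once an import or purge has finished, successfully or not."""
    if status in TERMINAL:
        return True
    if status in IN_FLIGHT:
        return False
    raise ValueError(f"Unknown status: {status!r}")
```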
File considerations
The file must include data in exactly 15-minute intervals. In the example below, the data is configured at a 30-minute granularity, which is not supported:
Interval Start UTC Date | Queue | Media Type | Skill Set | Language | Offered | Interactions Handled | Total Handle Time |
---|---|---|---|---|---|---|---|
2020-02-26T00:00:00Z | Example queue 1 | Voice | recording\|analytics | English | 16 | 6 | 4 |
2020-02-26T00:30:00Z | Example queue 1 | Voice | recording\|analytics | English | 14 | 7 | 6 |
If you upload the data in this form, the back-end process treats every missing 15-minute interval as an interval with no data and creates an incorrect forecast. The values the system considers in this case are shown below, followed by a short sketch that reproduces this behavior:
Interval Start UTC Date | Queue | Media Type | Skill Set | Language | Offered | Interactions Handled | Total Handle Time |
---|---|---|---|---|---|---|---|
2020-02-26T00:00:00Z | Example queue 1 | Voice | recording\|analytics | English | 16 | 6 | 4 |
2020-02-26T00:15:00Z | Example queue 1 | Voice | recording\|analytics | English | Null | Null | Null |
2020-02-26T00:30:00Z | Example queue 1 | Voice | recording\|analytics | English | 14 | 7 | 6 |
2020-02-26T00:45:00Z | Example queue 1 | Voice | recording\|analytics | English | Null | Null | Null |
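You can reproduce the back end's view by reindexing the uploaded rows onto a strict 15-minute grid. A minimal sketch using pandas; the column names are shortened for illustration:

```python
import pandas as pd

# The two 30-minute rows from the incorrect file above.
rows = [
    ("2020-02-26T00:00:00Z", 16, 6, 4),
    ("2020-02-26T00:30:00Z", 14, 7, 6),
]
df = pd.DataFrame(rows, columns=["interval_start", "offered",
                                 "handled", "handle_time"])
df["interval_start"] = pd.to_datetime(df["interval_start"])
df = df.set_index("interval_start")

# Reindex onto the 15-minute grid the importer expects; the missing
# quarter hours become NaN, mirroring the Null rows in the table above.
grid = pd.date_range(df.index.min(),
                     df.index.max() + pd.Timedelta(minutes=15),
                     freq="15min")
print(df.reindex(grid))
```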
To correct the issue, split each 30-minute interval into two 15-minute intervals, one starting at 00 minutes and one at 15 minutes, as shown below:
Interval Start UTC Date | Queue | Media Type | Skill Set | Language | Offered | Interactions Handled | Total Handle Time |
---|---|---|---|---|---|---|---|
2020-02-26T00:00:00Z | Example queue 1 | Voice | recording\|analytics | English | 8 | 3 | 2 |
2020-02-26T00:15:00Z | Example queue 1 | Voice | recording\|analytics | English | 8 | 3 | 2 |
2020-02-26T00:30:00Z | Example queue 1 | Voice | recording\|analytics | English | 7 | 4 | 3 |
2020-02-26T00:45:00Z | Example queue 1 | Voice | recording\|analytics | English | 7 | 3 | 3 |
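A minimal sketch of the split itself, assuming the CSV columns shown above: each 30-minute row becomes two 15-minute rows whose metrics sum back to the original, with odd values rounding toward the earlier interval, as in the corrected table:

```python
from datetime import datetime, timedelta

def split_30min_row(row: dict) -> list[dict]:
    """Split one 30-minute interval row into two 15-minute rows.

    Integer metrics are halved; when a value is odd, the extra unit
    goes to the first 15-minute interval so the totals still add up.
    """
    start = datetime.strptime(row["Interval Start UTC Date"],
                              "%Y-%m-%dT%H:%M:%SZ")
    halves = []
    for i in range(2):
        half = dict(row)  # copy non-metric columns (Queue, Skill Set, ...)
        half["Interval Start UTC Date"] = (
            start + timedelta(minutes=15 * i)
        ).strftime("%Y-%m-%dT%H:%M:%SZ")
        for col in ("Offered", "Interactions Handled", "Total Handle Time"):
            v = int(row[col])
            half[col] = v // 2 + (v % 2 if i == 0 else 0)
        halves.append(half)
    return halves
```

For example, the 00:30 row (Offered 14, Handled 7, Total Handle Time 6) splits into 7/4/3 and 7/3/3, matching the corrected table above.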