Schedule Cloud Upload
APIs to schedule data uploads
Before calling the APIs below to schedule uploads, grant read access to the Google Cloud Storage bucket or Google BigQuery table (if you are using either).
The tenant ID can be retrieved from the UI.
The source ID can be retrieved from the UI or via the APIs.
get
https://demo.api.telm.ai
/{tenant}/configuration/sources/{source}/scheduled_uploads
Get all scheduled uploads for this tenant and source
post
https://demo.api.telm.ai
/{tenant}/configuration/sources/{source}/scheduled_uploads
Schedule an upload
put
https://demo.api.telm.ai
/{tenant}/configuration/sources/{source}/scheduled_uploads
Update a scheduled upload
delete
https://demo.api.telm.ai
/{tenant}/configuration/sources/{source}/scheduled_uploads
Delete a scheduled upload
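All four operations share the same endpoint and differ only in the HTTP method. The sketch below builds such a request with the standard library; the bearer-token authentication header is an assumption, so substitute your API's actual authentication scheme.

```python
import json
import urllib.request

BASE_URL = "https://demo.api.telm.ai"

def scheduled_uploads_url(tenant: str, source: str) -> str:
    """Endpoint shared by GET, POST, PUT, and DELETE."""
    return f"{BASE_URL}/{tenant}/configuration/sources/{source}/scheduled_uploads"

def scheduled_uploads_request(method, tenant, source, token, body=None):
    """Build (but do not send) a request against the scheduled-uploads API.

    `token` and the Authorization header format are assumptions;
    send the result with urllib.request.urlopen(req).
    """
    data = json.dumps(body).encode() if body is not None else None
    return urllib.request.Request(
        scheduled_uploads_url(tenant, source),
        data=data,
        method=method,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
```

For example, `scheduled_uploads_request("POST", tenant, source, token, body)` schedules a new upload, while the same call with `"PUT"` updates an existing one.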

Request Body for POST and PUT

Some examples of the request body parameters are described below.
Depending on the type of input source, the "upload" section will be one of the request bodies described in the Upload Data APIs section.

Example 1: If the input file is located in Azure:

{
  "delta_only": true,
  "enabled": true,
  "sample_fraction": 0,
  "schedule": "string",
  "skip_investigator": true,
  "source": "string",
  "timestamp_attribute": "string",
  "idAttribute": "string",
  "upload": {
    "type": "AZURE",
    "payload": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      },
      "security": {
        "sas_key": "string",
        "storage_account": "string"
      }
    }
  }
}

Example 2: If the input file is located in BigQuery:

{
  "delta_only": true,
  "enabled": true,
  "sample_fraction": 0,
  "schedule": "string",
  "skip_investigator": true,
  "source": "string",
  "timestamp_attribute": "string",
  "idAttribute": "string",
  "upload": {
    "type": "BIGQUERY",
    "payload": {
      "dataset": "string",
      "method": "EXPORT",
      "project": "string",
      "table": "string"
    }
  }
}

Example 3: If the input file is located in S3:

{
  "delta_only": true,
  "enabled": true,
  "sample_fraction": 0,
  "schedule": "string",
  "skip_investigator": true,
  "source": "string",
  "timestamp_attribute": "string",
  "idAttribute": "string",
  "upload": {
    "type": "S3",
    "payload": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      },
      "security": {
        "aws_key": "string",
        "aws_secret": "string",
        "region": "string"
      }
    }
  }
}
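The three example bodies share the same top-level fields; only the "upload" section varies by source type. The sketch below assembles a body programmatically under assumptions: `scheduled_upload_body` is a hypothetical helper (not part of the API), and every concrete value (the schedule string, column names, bucket, path, credentials) is a placeholder.

```python
def scheduled_upload_body(source, schedule, upload,
                          delta_only=True, sample_fraction=0):
    """Combine the shared scheduling fields with a source-specific
    upload section (AZURE, BIGQUERY, or S3)."""
    return {
        "delta_only": delta_only,
        "enabled": True,
        "sample_fraction": sample_fraction,
        "schedule": schedule,
        "skip_investigator": True,
        "source": source,
        "timestamp_attribute": "created_at",  # assumed column name
        "idAttribute": "id",                  # assumed column name
        "upload": upload,
    }

# S3 upload section mirroring Example 3, with placeholder values.
s3_upload = {
    "type": "S3",
    "payload": {
        "bucket": "my-bucket",
        "path": "data/input.csv",
        "read_options": {"separator": ","},
        "security": {
            "aws_key": "<aws-access-key-id>",
            "aws_secret": "<aws-secret-access-key>",
            "region": "us-east-1",
        },
    },
}

body = scheduled_upload_body("my_source", "0 2 * * *", s3_upload)
```

The resulting dictionary can be serialized as JSON and sent as the request body of the POST (schedule) or PUT (update) calls above.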