Schedule Cloud Upload

APIs to schedule data uploads

Before calling the APIs to schedule uploads, grant read access to the Google Cloud Storage bucket or Google BigQuery table (if using either).

The Tenant Id can be retrieved from the UI as described here.

The Source Id can be retrieved from the UI or via the APIs.

Get all scheduled uploads for this tenant and source

GET https://demo.api.telm.ai/{tenant}/configuration/sources/{source}/scheduled_uploads

Path Parameters

source_id* (string): Id of the source to upload data to
tenant* (string): Name of the tenant

Headers

Content-type (string): application/json
Authentication* (string): Bearer <access_token>. Access token from the Authentication API

Response body example:

{
  "delta_only": true,
  "enabled": true,
  "sample_fraction": 0,
  "schedule": "string",
  "skip_investigator": true,
  "source": "string",
  "timestamp_attribute": "string",
  "upload": {
    "azureLocation": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      },
      "security": {
        "sas_key": "string",
        "storage_account": "string"
      }
    },
    "bigQueryLocation": {
      "dataset": "string",
      "method": "EXPORT",
      "project": "string",
      "table": "string"
    },
    "databaseTableLocation": {
      "table": "string"
    },
    "fileStorageLocation": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      }
    },
    "gcsLocation": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      }
    },
    "idAttribute": "string",
    "s3Location": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      },
      "security": {
        "aws_key": "string",
        "aws_secret": "string",
        "region": "string"
      }
    },
    "snowflakeLocation": {
      "account": "string",
      "database": "string",
      "schema": "string",
      "security": {
        "private_key": "string",
        "role": "string",
        "user": "string"
      },
      "table": "string",
      "warehouse": "string"
    },
    "type": "AZURE"
  }
}
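As a minimal sketch, the GET call above can be issued with Python's standard library. The tenant name, source id, and access token below are placeholder assumptions; substitute your own values.

```python
import json
import urllib.request

# Placeholder values -- not real credentials.
TENANT = "acme"
SOURCE_ID = "orders_source"
ACCESS_TOKEN = "eyJ..."  # obtained from the Authentication API

url = f"https://demo.api.telm.ai/{TENANT}/configuration/sources/{SOURCE_ID}/scheduled_uploads"
req = urllib.request.Request(
    url,
    method="GET",
    headers={
        "Content-type": "application/json",
        "Authentication": f"Bearer {ACCESS_TOKEN}",
    },
)

# Sending the request returns the scheduled-upload configuration shown above:
# with urllib.request.urlopen(req) as resp:
#     config = json.load(resp)
```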

Schedule an upload

POST https://demo.api.telm.ai/{tenant}/configuration/sources/{source}/scheduled_uploads

Examples of the request body are listed below.

Path Parameters

source_id* (string): Id of the source to upload data to
tenant* (string): Name of the tenant

Headers

Content-type (string): application/json
Authentication* (string): Bearer <access_token>. Access token from the Authentication API

Request Body

enabled* (Boolean): true or false
sample_fraction (Boolean): Default false. If true, 10% of the data will be sampled for processing
schedule* (String): Cron schedule expression. Refer to the Cron wiki and Crontab for the format
skip_investigator (Boolean): If true, the investigator feature will be unavailable for this upload; only trends will be available
timestamp_attribute (String): Name of the attribute (column) that holds the timestamp information. Used by the delta_only field
delta_only (Boolean): If true, only delta rows are processed, based on the defined timestamp column. For Google BigQuery and Snowflake, if timestamp_attribute is blank, delta_only is ignored
upload* (Json): Inner JSON object that defines the location of the input file
type* (String): Type of the input file location. Acceptable values are listed here
payload* (Json): Depending on the location type, this section matches the respective request body parameters listed here

Response body example:

{
  "delta_only": true,
  "enabled": true,
  "sample_fraction": 0,
  "schedule": "string",
  "skip_investigator": true,
  "source": "string",
  "timestamp_attribute": "string",
  "upload": {
    "azureLocation": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      },
      "security": {
        "sas_key": "string",
        "storage_account": "string"
      }
    },
    "bigQueryLocation": {
      "dataset": "string",
      "method": "EXPORT",
      "project": "string",
      "table": "string"
    },
    "databaseTableLocation": {
      "table": "string"
    },
    "fileStorageLocation": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      }
    },
    "gcsLocation": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      }
    },
    "idAttribute": "string",
    "s3Location": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      },
      "security": {
        "aws_key": "string",
        "aws_secret": "string",
        "region": "string"
      }
    },
    "snowflakeLocation": {
      "account": "string",
      "database": "string",
      "schema": "string",
      "security": {
        "private_key": "string",
        "role": "string",
        "user": "string"
      },
      "table": "string",
      "warehouse": "string"
    },
    "type": "AZURE"
  }
}
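A sketch of scheduling a daily upload from S3 via the POST endpoint, again using Python's standard library. All tenant, bucket, and credential values are placeholder assumptions; the cron expression "0 2 * * *" means every day at 02:00.

```python
import json
import urllib.request

# Placeholder values -- not real credentials.
TENANT = "acme"
SOURCE_ID = "orders_source"
ACCESS_TOKEN = "eyJ..."

body = {
    "enabled": True,
    "schedule": "0 2 * * *",   # cron: every day at 02:00
    "delta_only": False,
    "skip_investigator": False,
    "upload": {
        "type": "S3",
        "payload": {
            "bucket": "my-bucket",
            "path": "incoming/orders.csv",
            "read_options": {"separator": ","},
            "security": {
                "aws_key": "AKIA...",
                "aws_secret": "...",
                "region": "us-east-1",
            },
        },
    },
}

req = urllib.request.Request(
    f"https://demo.api.telm.ai/{TENANT}/configuration/sources/{SOURCE_ID}/scheduled_uploads",
    method="POST",
    data=json.dumps(body).encode(),
    headers={
        "Content-type": "application/json",
        "Authentication": f"Bearer {ACCESS_TOKEN}",
    },
)
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```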

Update a scheduled upload

PUT https://demo.api.telm.ai/{tenant}/configuration/sources/{source}/scheduled_uploads

Examples of the request body are listed below.

Path Parameters

source_id* (string): Id of the source to upload data to
tenant* (string): Name of the tenant

Headers

Content-type (string): application/json
Authentication* (string): Bearer <access_token>. Access token from the Authentication API

Request Body

enabled* (Boolean): true or false
sample_fraction (Boolean): Default false. If true, 10% of the data will be sampled for processing
schedule* (String): Cron schedule expression. Refer to the Cron wiki and Crontab for the format
skip_investigator (Boolean): If true, the investigator feature will be unavailable for this upload; only trends will be available
timestamp_attribute (String): Name of the attribute (column) that holds the timestamp information. Used by the delta_only field
delta_only (Boolean): If true, only delta rows are processed, based on the defined timestamp column. For Google BigQuery and Snowflake, if timestamp_attribute is blank, delta_only is ignored
upload* (Json): Inner JSON object that defines the location of the input file
type* (String): Type of the input file location. Acceptable values are listed here
payload* (Json): Depending on the location type, this section matches the respective request body parameters listed here

Response body example:

{
  "delta_only": true,
  "enabled": true,
  "sample_fraction": 0,
  "schedule": "string",
  "skip_investigator": true,
  "source": "string",
  "timestamp_attribute": "string",
  "upload": {
    "azureLocation": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      },
      "security": {
        "sas_key": "string",
        "storage_account": "string"
      }
    },
    "bigQueryLocation": {
      "dataset": "string",
      "method": "EXPORT",
      "project": "string",
      "table": "string"
    },
    "databaseTableLocation": {
      "table": "string"
    },
    "fileStorageLocation": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      }
    },
    "gcsLocation": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      }
    },
    "idAttribute": "string",
    "s3Location": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      },
      "security": {
        "aws_key": "string",
        "aws_secret": "string",
        "region": "string"
      }
    },
    "snowflakeLocation": {
      "account": "string",
      "database": "string",
      "schema": "string",
      "security": {
        "private_key": "string",
        "role": "string",
        "user": "string"
      },
      "table": "string",
      "warehouse": "string"
    },
    "type": "AZURE"
  }
}
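A sketch of updating an existing schedule via the PUT endpoint, here to disable it. The documentation above does not state whether partial updates are accepted, so this sketch re-sends the starred (required) fields; tenant, source, and token values are placeholders.

```python
import json
import urllib.request

# Placeholder values -- not real credentials.
TENANT = "acme"
SOURCE_ID = "orders_source"
ACCESS_TOKEN = "eyJ..."

# Disable the schedule without altering the upload location.
body = {
    "enabled": False,
    "schedule": "0 2 * * *",
    "upload": {
        "type": "S3",
        "payload": {"bucket": "my-bucket", "path": "incoming/orders.csv"},
    },
}

req = urllib.request.Request(
    f"https://demo.api.telm.ai/{TENANT}/configuration/sources/{SOURCE_ID}/scheduled_uploads",
    method="PUT",
    data=json.dumps(body).encode(),
    headers={
        "Content-type": "application/json",
        "Authentication": f"Bearer {ACCESS_TOKEN}",
    },
)
```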

Delete a scheduled upload

DELETE https://demo.api.telm.ai/{tenant}/configuration/sources/{source}/scheduled_uploads

Path Parameters

source_id* (string): Id of the source to upload data to
tenant* (string): Name of the tenant

Headers

Content-type (string): application/json
Authentication* (string): Bearer <access_token>. Access token from the Authentication API

Response body example:

{
  "message": "string"
}
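A sketch of removing the scheduled upload via the DELETE endpoint, with placeholder tenant, source, and token values.

```python
import urllib.request

# Placeholder values -- not real credentials.
TENANT = "acme"
SOURCE_ID = "orders_source"
ACCESS_TOKEN = "eyJ..."

req = urllib.request.Request(
    f"https://demo.api.telm.ai/{TENANT}/configuration/sources/{SOURCE_ID}/scheduled_uploads",
    method="DELETE",
    headers={
        "Content-type": "application/json",
        "Authentication": f"Bearer {ACCESS_TOKEN}",
    },
)
# The response is a JSON object with a "message" field, as shown above.
```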

Request Body for POST and PUT

Some examples of the request body parameters are described below.

Depending on the type of input source, the "upload" section will be one of the request bodies described in the Upload Data APIs section.

Example 1: If the input file is located in Azure:

{
  "delta_only": true,
  "enabled": true,
  "sample_fraction": 0,
  "schedule": "string",
  "skip_investigator": true,
  "source": "string",
  "timestamp_attribute": "string",
  "idAttribute": "string",
  "upload": {
    "type": "AZURE"
    "payload": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      },
      "security": {
        "sas_key": "string",
        "storage_account": "string"
      }
    }
  }
}

Example 2: If the input file is located in BigQuery:

{
  "delta_only": true,
  "enabled": true,
  "sample_fraction": 0,
  "schedule": "string",
  "skip_investigator": true,
  "source": "string",
  "timestamp_attribute": "string",
  "idAttribute": "string",
  "upload": {
    "type": "BIGQUERY"
    "payload": {
      "dataset": "string",
      "method": "EXPORT",
      "project": "string",
      "table": "string"
    }
  }
}

Example 3: If the input file is located in S3:

{
  "delta_only": true,
  "enabled": true,
  "sample_fraction": 0,
  "schedule": "string",
  "skip_investigator": true,
  "source": "string",
  "timestamp_attribute": "string",
  "idAttribute": "string",
  "upload": {
    "type": "S3"
    "payload": {
      "bucket": "string",
      "path": "string",
      "read_options": {
        "separator": "string"
      },
      "security": {
        "aws_key": "string",
        "aws_secret": "string",
        "region": "string"
      }
    }
  }
}
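The required structure shared by the examples above can be checked before sending. The helper below is hypothetical (not part of the Telmai API); it simply verifies that the starred fields from the request body tables are present.

```python
# Hypothetical pre-flight check: verifies that a request body for the
# POST/PUT endpoints carries the required (starred) fields before it is sent.
def validate_scheduled_upload(body: dict) -> list[str]:
    """Return a list of missing required fields (empty if the body looks complete)."""
    missing = [f for f in ("enabled", "schedule", "upload") if f not in body]
    upload = body.get("upload", {})
    for f in ("type", "payload"):
        if f not in upload:
            missing.append(f"upload.{f}")
    return missing

# A body shaped like Example 1 above passes the check.
example = {
    "enabled": True,
    "schedule": "0 2 * * *",
    "upload": {"type": "AZURE", "payload": {"bucket": "b", "path": "p"}},
}
```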
