Azure Request data

Request Body for Azure

{
  "type": string,
  "id_attribute": string,
  "payload": {
    "bucket": string,
    "path": string,
    "read_options": {
      "separator": string
    },
    "security": {
      "sas_key": string,
      "storage_account": string
    }
  }
}
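
For illustration, a filled-in request body for a CSV file might look like the following sketch; the bucket, path, id_attribute, and credential values are placeholders, not real resources.

{
  "type": "azure",
  "id_attribute": "order_id",
  "payload": {
    "bucket": "sales-data",
    "path": "exports/2024/orders.csv",
    "read_options": {
      "separator": ","
    },
    "security": {
      "sas_key": "<your-sas-token>",
      "storage_account": "<your-storage-account>"
    }
  }
}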

| Field | Type | Description |
| --- | --- | --- |
| type | string | Required. Must be "azure". |
| id_attribute | string | Optional. Name of the column in the data that represents the identifier of the row. |
| bucket | string | Required. Name of the bucket. |
| path | string | Required. Full path of the file inside the bucket. |
| read_options | json | Provide only for CSV input when a separator such as "," or "\t" is used. For Parquet and JSON input, omit read_options and separator. |
| separator | string | For CSV only, e.g. "," for comma-separated files. |
| sas_key | string | Required. Shared Access Signature (SAS) key. |
| storage_account | string | Required. Name of the storage account. |
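
For Parquet or JSON input, the read_options object is omitted entirely, as noted above. A hypothetical Parquet request body (again with placeholder values) might look like this:

{
  "type": "azure",
  "id_attribute": "order_id",
  "payload": {
    "bucket": "sales-data",
    "path": "exports/2024/orders.parquet",
    "security": {
      "sas_key": "<your-sas-token>",
      "storage_account": "<your-storage-account>"
    }
  }
}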