
Iceberg

Tables hosted in Hive Metastore

  1. Navigate to the Iceberg Configuration Page

    • Go to your project's dashboard.

    • Under Source Systems, select Iceberg.

  2. Configure Connection Details

    • Fill in the required fields as follows:

      • Thrift URI: Enter the Thrift URI used to connect to the Hive Metastore, e.g. thrift://<host>:9083 (9083 is the Metastore's default Thrift port).

      • Database: Specify the database in which the Iceberg table resides.

      • Location Type: Choose the storage type where the physical data files reside. (Currently only GCS is supported.)

      • For GCS:

        • Project ID: Provide your Google Cloud project identifier.

        • Client ID: Enter the service account's client identifier.

        • Email: Provide the service account's email address, used for authentication.

        • Private Key ID: Enter the service account's private key ID.

        • Private Key: Enter the service account's private key.

      • For Azure:

        • Not Supported

      • For S3:

        • Not Supported

    The specified GCS service account must have read access to the Hive Metastore warehouse. Additionally, for private cloud deployments, the same service account must be granted write access to Telmai's internal storage bucket.
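The five GCS fields above correspond to entries in a standard Google Cloud service-account JSON key file, so their values can be copied from the key file downloaded from the Google Cloud console. A minimal sketch of that correspondence (all values below are hypothetical placeholders, not working credentials):

```python
import json

# Hypothetical service-account key. The dictionary keys are the field names
# used in a real Google service-account key file; the trailing comments map
# each one to the Telmai form field it fills.
sa_key = {
    "type": "service_account",
    "project_id": "my-gcp-project",                       # Project ID
    "private_key_id": "0123456789abcdef",                 # Private Key ID
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",    # Private Key
    "client_email": "telmai-reader@my-gcp-project"
                    ".iam.gserviceaccount.com",           # Email
    "client_id": "123456789012345678901",                 # Client ID
}

# Round-trip through JSON to confirm the structure is well-formed.
print(json.dumps(sa_key, indent=2))
```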

  3. Validate Connection

    • After filling in all the fields, click Next.

    • The system will validate your connection parameters.

    • If successful, you will proceed to Table selection.
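Before clicking Next, you can sanity-check the Thrift URI yourself from any machine with network access to the Metastore. A minimal sketch using only the Python standard library (the hostname is hypothetical; 9083 is the Hive Metastore's default Thrift port):

```python
import socket
from urllib.parse import urlparse

def parse_thrift_uri(uri: str) -> tuple[str, int]:
    """Split a thrift://host:port URI into (host, port), defaulting to 9083."""
    parsed = urlparse(uri)
    if parsed.scheme != "thrift":
        raise ValueError(f"expected a thrift:// URI, got {uri!r}")
    return parsed.hostname, parsed.port or 9083

def metastore_reachable(uri: str, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the Metastore host succeeds.

    This only proves network reachability; Telmai's own validation step
    still checks the credentials and catalog contents.
    """
    host, port = parse_thrift_uri(uri)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(parse_thrift_uri("thrift://metastore.internal:9083"))  # ('metastore.internal', 9083)
```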

  4. Create Source

    • After table selection, click Confirm to create the data source.

