Sample Source Configurations
Below are examples for configuring the agent to connect to various data sources.
BigQuery
##########
# BigQuery
##########
# sources:
#   - identifier: Example BigQuery source
#     connectionFactory:
#       type: bigquery
#       host: https://www.googleapis.com/bigquery/v2
#       port: 443
#       # Add the BQ key file to the agent_config directory and reference it below for the agent
#       keyFile: /app/config/bigquery_credentials.json
Databricks
Server hostname, port, and HTTP path should be pulled from Databricks Advanced Options (JDBC/ODBC tab).
Personal access tokens can be generated in your Databricks UI. You can review the Databricks token management documentation here.
Additional connection details can be found in the Databricks documentation here.
##########
# Databricks
##########
#   - identifier: Example Databricks source
#     connectionFactory:
#       type: spark
#       host: example.cloud.databricks.com
#       port: 443
#       user: HTTP Path (e.g. sql/protocolv1/o/5542344243/54935-34534-anon123)
#       password: Personal Access Token (e.g. 4f9aaac66235d46a4b12ebf1f058a62c101)
#
#   - identifier: Example Databricks OAuth source
#     connectionFactory:
#       type: spark
#       host: example.cloud.databricks.com
#       port: 443
#       databaseName: HTTP Path (e.g. sql/protocolv1/o/554234234713/5945-34534-anon123)
#       user: OAuth App Name (e.g. d58c5cad-47fb-4d3b-9de5-008d3327bb33)
#       password: OAuth Secret (e.g. dose984735ab8ab7aa9a47)
#       authType: AUTH_TYPE_OAUTH2_M2M
Netezza
The Netezza JDBC driver isn't included with the Bigeye Data Source agent due to licensing restrictions, so you must download it onto the host where the agent runs. Do this after completing the agent setup steps. You can learn how to set a custom path to a JDBC driver here.
##########
# Netezza
##########
#   - identifier: Example Netezza source
#     connectionFactory:
#       type: netezza
#       host: 10.123.123.123
#       port: 5480
#       user: user
#       password: "password"
#       databaseName: somedb
Oracle
##########
# Oracle
##########
#   - identifier: Example Oracle source
#     connectionFactory:
#       type: oracle
#       host: oracle.example.com
#       port: 1521
#       user: user
#       password: "password"
#       databaseName: somedb
If you need to provide custom configuration to the Oracle JDBC driver, you can use the jdbcUrl option instead of the host, port, and databaseName combination. To configure an SSL connection to your Oracle database, first mount a copy of a Java Key Store containing the necessary certificates somewhere the agent can access it while running. The example JDBC URL below includes the options needed to connect using SSL. The full list of options that can be specified on an Oracle JDBC URL can be found in the Oracle documentation here.
##########
# Oracle
##########
#   - identifier: Example Oracle Custom JDBC URL SSL source
#     connectionFactory:
#       type: oracle
#       user: user
#       password: "password"
#       jdbcUrl: jdbc:oracle:thin:@tcps://<host>:2484/<databaseName>?javax.net.ssl.trustStore=<path/to/JKS>&javax.net.ssl.trustStoreType=JKS&javax.net.ssl.trustStorePassword=<JKS password>
#
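Building the Java Key Store referenced by the javax.net.ssl.trustStore option is a standard keytool operation. The sketch below assumes you have the database server's CA certificate; here a self-signed stand-in is generated so the example runs end to end, and the alias, file names, and changeit password are placeholders:

```shell
# Stand-in for the real server CA certificate; in practice, use the
# certificate your database administrator provides instead of generating one.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=oracle.example.com" -keyout ca_key.pem -out server_ca.crt

# Import the certificate into a new JKS truststore. Point the jdbcUrl's
# trustStore option at truststore.jks and trustStorePassword at changeit.
keytool -importcert -noprompt -alias oracle-db-ca \
  -file server_ca.crt -keystore truststore.jks \
  -storetype JKS -storepass changeit
```

Mount the resulting truststore.jks where the agent can read it, and substitute its path and password into the jdbcUrl above.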
Redshift
##########
# Redshift
##########
#   - identifier: Example Redshift source
#     connectionFactory:
#       type: redshift
#       host: example.redshift.amazonaws.com
#       port: 5439
#       user: user
#       password: "password"
#       databaseName: somedb
SAP HANA
##########
# SAP HANA
##########
#   - identifier: Example SAP HANA source
#     connectionFactory:
#       type: sap
#       host: hana.example.com
#       port: 39017
#       user: user
#       password: "password"
#       databaseName: somedb
#
Snowflake
Basic Authentication
Note that Snowflake is deprecating basic authentication for Snowflake users in November 2025.
##########
# Snowflake
##########
#   - identifier: Example Snowflake username/password source
#     connectionFactory:
#       type: snowflake
#       host: example.snowflakecomputing.com
#       port: 443
#       user: user
#       password: "password"
#       # NOTE - for Snowflake, put the warehouse name as the databaseName
#       databaseName: somewarehousename
Key Pair Authentication
##########
# Snowflake
##########
# Example using key pair authentication
#   - identifier: Example Snowflake keypair source
#     connectionFactory:
#       type: snowflake
#       host: example.snowflakecomputing.com
#       port: 443
#       user: user
#       authType: AUTH_TYPE_KEYPAIR
#       keyFile: "/path/to/rsa_key.p8"
#       password: "privateKeyPassword"
#       # NOTE - for Snowflake, put the warehouse name as the databaseName
#       databaseName: somewarehousename
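If you do not already have a key pair, the encrypted PKCS#8 private key referenced by keyFile can be generated with standard openssl commands. A minimal sketch, in which the file names and the privateKeyPassword passphrase are placeholders matching the example above:

```shell
# Generate an encrypted PKCS#8 private key. The passphrase supplied here
# is what goes in the `password` field of the agent configuration.
openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 aes-256-cbc \
  -inform PEM -out rsa_key.p8 -passout pass:privateKeyPassword

# Derive the matching public key, whose contents (minus the PEM
# header/footer) are registered on the Snowflake user.
openssl rsa -in rsa_key.p8 -passin pass:privateKeyPassword \
  -pubout -out rsa_key.pub
```

See Snowflake's key-pair authentication documentation for how to attach the public key to the user (ALTER USER ... SET RSA_PUBLIC_KEY).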
Metadata Table Overrides
Freshness and Volume metrics query Snowflake metadata tables. Use the parameters below to override which metadata tables are used for those metrics.
##########
# Snowflake
##########
#   - identifier: Example Snowflake keypair source
#     connectionFactory:
#       type: snowflake
#       host: example.snowflakecomputing.com
#       port: 443
#       user: user
#       authType: AUTH_TYPE_KEYPAIR
#       keyFile: "/path/to/rsa_key.p8"
#       password: "privateKeyPassword"
#       # NOTE - for Snowflake, put the warehouse name as the databaseName
#       databaseName: somewarehousename
#     sourceMetadataOverrides:
#       accessHistoryTableName: 'SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY'
#       copyHistoryTableName: 'SNOWFLAKE.ACCOUNT_USAGE.COPY_HISTORY'
#       queryHistoryTableName: 'SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY'
#       tablesTableName: 'SNOWFLAKE.ACCOUNT_USAGE.TABLES'
Synapse
##########
# Synapse
##########
#   - identifier: Example Synapse source
#     connectionFactory:
#       type: synapse
#       host: example.sql.azuresynapse.net
#       port: 1433
#       user: user
#       password: "password"
#       databaseName: somedb
#
Teradata
The Teradata JDBC driver isn't included with the Bigeye Data Source agent due to licensing restrictions, so you must download it onto the host where the agent runs. Do this after completing the agent setup steps. You can learn how to set a custom path to a JDBC driver here.
##########
# Teradata
##########
#   - identifier: Example Teradata source
#     connectionFactory:
#       type: teradata
#       host: 10.123.123.123
#       port: 1025
#       user: user
#       password: "password"
Vertica
##########
# Vertica
##########
#   - identifier: Example Vertica source
#     connectionFactory:
#       type: vertica
#       host: 10.123.123.123
#       port: 5433
#       user: user
#       password: "password"
#       databaseName: somedb