# Database Connectors
Actyze connects to multiple data sources through Trino connectors, enabling you to query across relational databases, cloud warehouses, NoSQL stores, and data lakes from a single interface.
## Overview
Trino connectors allow Actyze to:
- Query data across multiple databases simultaneously
- Join data from different sources
- Apply consistent SQL syntax regardless of underlying database
- Leverage database-specific optimizations
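The bullets above can be made concrete with a single federated query; the catalog, schema, table, and column names below are illustrative, not part of the default setup:

```sql
-- Join operational data in a PostgreSQL catalog with warehouse data
-- in a Snowflake catalog. All names below are illustrative.
SELECT c.customer_id, c.email, s.total_spend
FROM postgres.public.customers AS c
JOIN snowflake.analytics.sales AS s
  ON c.customer_id = s.customer_id
LIMIT 10;
```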
## Supported Connectors

### Relational Databases
| Database | Connector | Use Case |
|---|---|---|
| PostgreSQL | postgresql | Operational databases, analytics |
| MySQL | mysql | Web applications, transactional data |
| Oracle | oracle | Enterprise applications, ERP systems |
| SQL Server | sqlserver | Microsoft stack, business applications |
| MariaDB | mariadb | MySQL alternative, cloud deployments |
### Cloud Data Warehouses
| Platform | Connector | Use Case |
|---|---|---|
| Snowflake | snowflake | Cloud data warehouse, analytics |
| Databricks | delta_lake | Lakehouse architecture, ML workloads |
| BigQuery | bigquery | Google Cloud analytics, petabyte-scale |
| Redshift | redshift | AWS data warehouse, BI workloads |
| Synapse | sqlserver | Azure analytics, enterprise DW |
### NoSQL Databases
| Database | Connector | Use Case |
|---|---|---|
| MongoDB | mongodb | Document store, JSON data |
| Cassandra | cassandra | Distributed NoSQL, high availability |
| Elasticsearch | elasticsearch | Search, log analytics, full-text |
| Redis | redis | Caching, session store, real-time |
### Data Lakes
| Technology | Connector | Use Case |
|---|---|---|
| Iceberg | iceberg | Modern data lake format, ACID |
| Delta Lake | delta_lake | Lakehouse, time travel, streaming |
| Hudi | hudi | Incremental processing, CDC |
| Hive | hive | Legacy Hadoop, S3 access |
## Configuration

### Docker Compose

Trino connectors are configured through template files in `docker/trino/catalog-templates/`.

#### Active Connectors (Default)

**PostgreSQL** (`postgres.properties.tpl`):

```properties
connector.name=postgresql
connection-url=jdbc:postgresql://${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
connection-user=${POSTGRES_USER}
connection-password=${POSTGRES_PASSWORD}
```

**Memory** (`memory.properties.tpl`):

```properties
connector.name=memory
```

**TPCH** (`tpch.properties.tpl`):

```properties
connector.name=tpch
tpch.splits-per-node=4
```
### Adding New Connectors

#### Step 1: Create Template File

Example for MySQL (`mysql.properties.tpl`):

```properties
connector.name=mysql
connection-url=jdbc:mysql://${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DB}
connection-user=${MYSQL_USER}
connection-password=${MYSQL_PASSWORD}
```
#### Step 2: Add Environment Variables

Edit `docker/.env`:

```bash
MYSQL_HOST=mysql-server.example.com
MYSQL_PORT=3306
MYSQL_DB=production
MYSQL_USER=actyze_user
MYSQL_PASSWORD=your-secure-password
```
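At container startup, the template's `${VAR}` placeholders are filled in from the environment to produce the final catalog file. The sketch below mimics that substitution with `sed`; the actual entrypoint mechanism may differ, and the values are the illustrative ones from `.env` above:

```bash
# Sketch: render a catalog template by substituting ${VAR} placeholders
# from the environment. The real entrypoint logic may differ; sed stands
# in for whatever substitution tool the image actually uses.
MYSQL_HOST=mysql-server.example.com
MYSQL_PORT=3306
MYSQL_DB=production
MYSQL_USER=actyze_user

tpl='connector.name=mysql
connection-url=jdbc:mysql://${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DB}
connection-user=${MYSQL_USER}'

rendered=$(printf '%s\n' "$tpl" | sed \
  -e "s|\${MYSQL_HOST}|$MYSQL_HOST|g" \
  -e "s|\${MYSQL_PORT}|$MYSQL_PORT|g" \
  -e "s|\${MYSQL_DB}|$MYSQL_DB|g" \
  -e "s|\${MYSQL_USER}|$MYSQL_USER|g")

printf '%s\n' "$rendered"
```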
#### Step 3: Rebuild Trino

```bash
cd docker
docker-compose build trino
docker-compose up -d trino
```
### Helm/Kubernetes

Connectors are configured in `values.yaml` under the `catalogs` section.

#### Example: Add Snowflake

Edit `values.yaml`:

```yaml
services:
  trino:
    enabled: true
    catalogs:
      postgres:
        connector.name: "postgresql"
        connection-url: "jdbc:postgresql://dashboard-postgres:5432/dashboard"
        connection-user: "nexus_service"
        connection-password: "${ENV:POSTGRES_PASSWORD}"
      snowflake:
        connector.name: "snowflake"
        snowflake.account: "your-account.us-east-1"
        snowflake.user: "actyze_user"
        snowflake.password: "${ENV:SNOWFLAKE_PASSWORD}"
        snowflake.database: "ANALYTICS"
        snowflake.warehouse: "COMPUTE_WH"
        snowflake.role: "ANALYST"
```

Edit `values-secrets.yaml`:

```yaml
secrets:
  trino:
    snowflakePassword: "your-snowflake-password"
```
## Connector Examples

### MySQL

Docker (`mysql.properties.tpl`):

```properties
connector.name=mysql
connection-url=jdbc:mysql://${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DB}
connection-user=${MYSQL_USER}
connection-password=${MYSQL_PASSWORD}
jdbc-url-parameters=useSSL=true&requireSSL=true
```

Helm (`values.yaml`):

```yaml
catalogs:
  mysql:
    connector.name: "mysql"
    connection-url: "jdbc:mysql://mysql.example.com:3306/production"
    connection-user: "actyze_user"
    connection-password: "${ENV:MYSQL_PASSWORD}"
```
### MongoDB

Docker (`mongodb.properties.tpl`):

```properties
connector.name=mongodb
mongodb.connection-url=mongodb://${MONGO_HOST}:${MONGO_PORT}
mongodb.credentials=${MONGO_USER}:${MONGO_PASSWORD}@admin
```

Helm (`values.yaml`):

```yaml
catalogs:
  mongodb:
    connector.name: "mongodb"
    mongodb.connection-url: "mongodb+srv://cluster.mongodb.net"
    mongodb.credentials: "${ENV:MONGO_USER}:${ENV:MONGO_PASSWORD}@admin"
```
### Databricks

Docker (`databricks.properties.tpl`):

```properties
connector.name=delta_lake
delta.databricks.server-hostname=${DATABRICKS_HOST}
delta.databricks.http-path=${DATABRICKS_HTTP_PATH}
delta.databricks.catalog=main
delta.databricks.auth-type=pat
delta.databricks.pat-token=${DATABRICKS_TOKEN}
```

Helm (`values.yaml`):

```yaml
catalogs:
  databricks:
    connector.name: "delta_lake"
    delta.databricks.server-hostname: "your-workspace.cloud.databricks.com"
    delta.databricks.http-path: "/sql/1.0/warehouses/your-warehouse-id"
    delta.databricks.catalog: "main"
    delta.databricks.auth-type: "pat"
    delta.databricks.pat-token: "${ENV:DATABRICKS_TOKEN}"
```
### Snowflake

Docker (`snowflake.properties.tpl`):

```properties
connector.name=snowflake
snowflake.account=${SNOWFLAKE_ACCOUNT}
snowflake.user=${SNOWFLAKE_USER}
snowflake.password=${SNOWFLAKE_PASSWORD}
snowflake.database=${SNOWFLAKE_DATABASE}
snowflake.warehouse=${SNOWFLAKE_WAREHOUSE}
snowflake.role=${SNOWFLAKE_ROLE}
```

Helm (`values.yaml`):

```yaml
catalogs:
  snowflake:
    connector.name: "snowflake"
    snowflake.account: "your-account.us-east-1"
    snowflake.user: "actyze_user"
    snowflake.password: "${ENV:SNOWFLAKE_PASSWORD}"
    snowflake.database: "ANALYTICS"
    snowflake.warehouse: "COMPUTE_WH"
    snowflake.role: "ANALYST"
```
### BigQuery

Docker (`bigquery.properties.tpl`):

```properties
connector.name=bigquery
bigquery.project-id=${BIGQUERY_PROJECT_ID}
bigquery.credentials-file=${BIGQUERY_CREDENTIALS_FILE}
```

Helm (`values.yaml`):

```yaml
catalogs:
  bigquery:
    connector.name: "bigquery"
    bigquery.project-id: "your-project-id"
    bigquery.credentials-file: "/etc/bigquery/credentials.json"
```
## Testing Connectors

### Verify Catalog

Docker:

```bash
docker exec dashboard-trino trino --execute "SHOW CATALOGS"
```

Kubernetes:

```bash
kubectl exec -n actyze deployment/dashboard-trino -- trino --execute "SHOW CATALOGS"
```

### Test Query

```bash
# Docker
docker exec dashboard-trino trino --execute "SELECT * FROM mysql.production.customers LIMIT 5"

# Kubernetes
kubectl exec -n actyze deployment/dashboard-trino -- \
  trino --execute "SELECT * FROM snowflake.analytics.sales LIMIT 5"
```
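Before querying specific tables, it can help to confirm what a connector actually exposes; the catalog and schema names below are illustrative:

```sql
-- Confirm what a connector exposes before querying data.
-- Catalog and schema names are illustrative.
SHOW SCHEMAS FROM mysql;
SHOW TABLES FROM mysql.production;
SELECT table_name
FROM mysql.information_schema.tables
WHERE table_schema = 'production';
```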
## Troubleshooting

### Connector Not Appearing

Check catalog files:

```bash
# Docker
docker exec dashboard-trino ls -la /etc/trino/catalog/

# Kubernetes
kubectl exec -n actyze deployment/dashboard-trino -- ls -la /etc/trino/catalog/
```

Check Trino logs:

```bash
# Docker
docker logs dashboard-trino

# Kubernetes
kubectl logs -n actyze deployment/dashboard-trino
```
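A common cause is a rendered catalog file that is missing the mandatory `connector.name` property. A quick check over the rendered files can be sketched as follows (the directory path in the example invocation is the container's catalog path):

```bash
# Sketch: flag catalog .properties files missing the mandatory
# connector.name key.
check_catalogs() {
  local dir=$1 bad=0
  for f in "$dir"/*.properties; do
    [ -e "$f" ] || continue
    if ! grep -q '^connector\.name=' "$f"; then
      echo "missing connector.name: $f" >&2
      bad=1
    fi
  done
  return "$bad"
}

# Example (run inside the container):
# check_catalogs /etc/trino/catalog
```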
### Connection Errors

Verify network connectivity:

```bash
# Docker
docker exec dashboard-trino ping mysql-server

# Kubernetes
kubectl exec -n actyze deployment/dashboard-trino -- ping mysql-server
```

Check credentials:

```bash
# Docker
docker exec dashboard-trino env | grep MYSQL

# Kubernetes
kubectl get secret dashboard-secrets -n actyze -o yaml
```
### Permission Errors

Ensure the database user has the required permissions:

```sql
-- MySQL
GRANT SELECT ON database.* TO 'actyze_user'@'%';

-- PostgreSQL
GRANT SELECT ON ALL TABLES IN SCHEMA public TO actyze_user;

-- Snowflake
GRANT USAGE ON WAREHOUSE COMPUTE_WH TO ROLE analyst;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public TO ROLE analyst;
```
## Best Practices

### Security
- Use read-only database accounts for Actyze
- Store credentials in secrets (Kubernetes secrets or environment variables)
- Enable SSL/TLS for database connections
- Use service accounts with minimal permissions
### Performance
- Configure connection pooling for high-traffic databases
- Use query result caching where appropriate
- Limit concurrent queries per connector
- Monitor query performance and optimize indexes
### High Availability
- Configure multiple connection URLs for failover
- Use connection retry logic
- Monitor connector health
- Set appropriate connection timeouts
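The retry bullet can be applied to scripted health checks with a small wrapper; the function below is a generic sketch, and the commented invocation reuses the container name from the testing section:

```bash
# Sketch: retry a command with a fixed delay between attempts, useful
# for scripted connector health checks. Attempt count and delay values
# are illustrative.
retry() {
  local attempts=$1 delay=$2; shift 2
  local n=1
  until "$@"; do
    if [ "$n" -ge "$attempts" ]; then
      echo "failed after $attempts attempts: $*" >&2
      return 1
    fi
    n=$((n + 1))
    sleep "$delay"
  done
}

# Example:
# retry 5 2 docker exec dashboard-trino trino --execute "SHOW CATALOGS"
```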
## Additional Resources
- Trino Connector Documentation - Complete connector reference
- Environment Variables - All configuration options
- Docker Deployment - Docker-specific setup
- Helm Deployment - Kubernetes-specific setup
## Support
For connector-specific issues:
- Check Trino connector documentation
- Verify database connectivity and permissions
- Review Trino logs for detailed error messages
- Test connection outside of Actyze using Trino CLI