snowflake-connections
Configuring Snowflake connections using connections.toml (for Snowflake CLI, Streamlit, Snowpark) or profiles.yml (for dbt) with multiple authentication methods (SSO, key pair, username/password, OAuth), managing multiple environments, and overriding settings with environment variables. Use this skill when setting up Snowflake CLI, Streamlit apps, dbt, or any tool requiring Snowflake authentication and connection management.
Install
```bash
mkdir -p .claude/skills/snowflake-connections && curl -L -o skill.zip "https://mcp.directory/api/skills/download/231" && unzip -o skill.zip -d .claude/skills/snowflake-connections && rm skill.zip
```
Installs to `.claude/skills/snowflake-connections`.
About this skill
Snowflake Connections
Configure and manage Snowflake connections for CLI tools, Streamlit apps, dbt, and Snowpark applications.
Configuration Files:
- `connections.toml` - Used by Snowflake CLI, Streamlit, and Snowpark
- `profiles.yml` - Used by dbt (different format, covered in the dbt-core skill)
When to Use This Skill
Activate this skill when users ask about:
- Setting up Snowflake connections for CLI, Streamlit, or Snowpark
- Configuring the `connections.toml` file
- Authentication methods (SSO, key pair, username/password, OAuth)
- Managing multiple environments (dev, staging, prod)
- Overriding connection settings with environment variables
- Troubleshooting authentication or connection issues
- Rotating credentials or keys
- Setting up CI/CD authentication
Note: For dbt-specific connection setup using profiles.yml, see the dbt-core skill. The concepts and authentication methods in this skill still apply, but dbt uses a different configuration file format.
Configuration File
This skill covers connections.toml used by Snowflake CLI, Streamlit, and Snowpark.
For dbt: Use ~/.dbt/profiles.yml instead. See the dbt-core skill for dbt configuration. The authentication methods described here apply to both files.
Location
| OS | Path |
|---|---|
| Unix/Mac | ~/.snowflake/connections.toml |
| Windows | %USERPROFILE%\.snowflake\connections.toml |
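The two rows above can be resolved with one helper, since `Path.home()` maps to `~` on Unix/Mac and `%USERPROFILE%` on Windows. This is a minimal sketch; the helper name is illustrative, and the `SNOWFLAKE_HOME` override (supported by Snowflake CLI to relocate the config directory) is included as an assumption about your tooling.

```python
import os
from pathlib import Path

def connections_toml_path() -> Path:
    """Resolve the connections.toml location cross-platform.

    Path.home() covers both ~ (Unix/Mac) and %USERPROFILE% (Windows).
    When SNOWFLAKE_HOME is set, it replaces the default ~/.snowflake
    directory, mirroring Snowflake CLI behavior.
    """
    base = os.environ.get("SNOWFLAKE_HOME")
    root = Path(base).expanduser() if base else Path.home() / ".snowflake"
    return root / "connections.toml"

print(connections_toml_path())
```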
Basic Structure
```toml
[default]
account = "your_account"
user = "your_username"
warehouse = "COMPUTE_WH"
database = "MY_DB"
schema = "PUBLIC"
role = "MY_ROLE"
# Add authentication method (see below)
```
Key Fields:
- `account` - Snowflake account identifier (e.g., `xy12345.us-east-1`)
- `user` - Snowflake username
- `warehouse` - Default warehouse for queries
- `database` - Default database context
- `schema` - Default schema context
- `role` - Default role to use
Authentication Methods
Option 1: SSO/External Browser (Recommended for Development)
Best for: Organizations with SSO, interactive development
```toml
[default]
account = "your_account"
user = "your_username"
authenticator = "externalbrowser"
```
How it works: Opens browser for SSO authentication
Pros:
- ✅ Most secure for development
- ✅ Leverages existing SSO infrastructure
- ✅ No password storage required
- ✅ MFA support built-in
Cons:
- ❌ Requires browser access
- ❌ Not suitable for headless/CI environments
Usage:
```bash
# Browser opens automatically for authentication
streamlit run app.py
snow sql -c default -q "SELECT CURRENT_USER()"
```
Option 2: Key Pair Authentication (Recommended for Production)
Best for: Production deployments, CI/CD pipelines, automation
```toml
[default]
account = "your_account"
user = "your_username"
authenticator = "snowflake_jwt"
private_key_path = "~/.ssh/snowflake_key.p8"
private_key_passphrase = "your_passphrase"  # Required only if the key is encrypted
```
Setup Steps:
1. Generate Key Pair:
```bash
# Generate encrypted private key (recommended)
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out snowflake_key.p8

# Or unencrypted (less secure, but no passphrase needed)
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out snowflake_key.p8 -nocrypt

# Generate public key
openssl rsa -in snowflake_key.p8 -pubout -out snowflake_key.pub
```
2. Extract Public Key (remove header/footer/newlines):
```bash
# Remove header, footer, and newlines
grep -v "BEGIN PUBLIC" snowflake_key.pub | grep -v "END PUBLIC" | tr -d '\n'
```
3. Add Public Key to Snowflake:
```sql
-- Set public key for user
ALTER USER your_username SET RSA_PUBLIC_KEY='MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8A...';

-- Verify: check that the RSA_PUBLIC_KEY_FP field is populated
DESC USER your_username;
```
4. Test Connection:
```bash
snow sql -c default -q "SELECT CURRENT_USER()"
```
Pros:
- ✅ Very secure for production
- ✅ No password storage
- ✅ Ideal for CI/CD and automation
- ✅ Works in headless environments
- ✅ No interactive prompts
Cons:
- ❌ More complex initial setup
- ❌ Requires key management and rotation
Security Best Practices:
- Store private keys outside project directory
- Use encrypted keys with passphrases
- Rotate keys every 90 days
- Use different keys for different environments
- Never commit keys to version control
Option 3: Username/Password (Development Only)
Best for: Quick testing, local development
```toml
[default]
account = "your_account"
user = "your_username"
password = "your_password"
```
Pros:
- ✅ Simple setup
- ✅ Works everywhere
Cons:
- ❌ Less secure (password in plain text)
- ❌ Not recommended for production
- ❌ MFA requires separate handling
⚠️ WARNING: Never use for production or commit connections.toml with passwords to git!
Option 4: OAuth Token
Best for: OAuth-based integrations, programmatic access
```toml
[default]
account = "your_account"
authenticator = "oauth"
token = "your_oauth_token"
```
Pros:
- ✅ Supports OAuth workflows
- ✅ Token-based security
Cons:
- ❌ Requires token refresh logic
- ❌ Token expiration management
Usage Pattern:
```python
# Token needs to be refreshed before expiration
import os
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "your_account",
    "authenticator": "oauth",
    "token": os.getenv("OAUTH_TOKEN"),
}).create()
```
Multiple Connections (Multi-Environment)
Define multiple connection profiles for different environments:
```toml
[default]
account = "dev_account"
user = "dev_user"
authenticator = "externalbrowser"
warehouse = "DEV_WH"
database = "DEV_DB"
schema = "PUBLIC"

[staging]
account = "staging_account"
user = "staging_user"
authenticator = "externalbrowser"
warehouse = "STAGING_WH"
database = "STAGING_DB"
schema = "PUBLIC"

[prod]
account = "prod_account"
user = "prod_user"
authenticator = "snowflake_jwt"
private_key_path = "~/.ssh/prod_key.p8"
warehouse = "PROD_WH"
database = "PROD_DB"
schema = "PUBLIC"
```
Using Connection Profiles
Snowflake CLI:
```bash
# Use a specific connection
snow sql -c default -q "SELECT CURRENT_DATABASE()"
snow sql -c staging -q "SELECT CURRENT_DATABASE()"
snow sql -c prod -q "SELECT CURRENT_DATABASE()"

# Deploy with a specific connection
snow streamlit deploy -c prod
```
Streamlit Apps:
```python
import streamlit as st
from snowflake.snowpark import Session

# Allow the user to select an environment
env = st.selectbox("Environment", ["default", "staging", "prod"])
session = Session.builder.config("connection_name", env).create()
```
dbt:
```yaml
# profiles.yml
snowflake_demo:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
      # dbt does not read connections.toml; all settings live in profiles.yml
    prod:
      type: snowflake
      account: "{{ env_var('SNOWFLAKE_PROD_ACCOUNT') }}"
```
Environment Variable Overrides
Override connection settings without modifying connections.toml:
Supported Variables
| Variable | Purpose | Example |
|---|---|---|
| `SNOWFLAKE_ACCOUNT` | Override account | `xy12345.us-east-1` |
| `SNOWFLAKE_USER` | Override user | `john_doe` |
| `SNOWFLAKE_PASSWORD` | Override password | `secret123` |
| `SNOWFLAKE_DATABASE` | Override database | `ANALYTICS_DB` |
| `SNOWFLAKE_SCHEMA` | Override schema | `REPORTING` |
| `SNOWFLAKE_WAREHOUSE` | Override warehouse | `LARGE_WH` |
| `SNOWFLAKE_ROLE` | Override role | `ANALYST` |
Usage Examples
Command-Line Overrides:
```bash
# Override database/schema
export SNOWFLAKE_DATABASE=ANALYTICS_DB
export SNOWFLAKE_SCHEMA=REPORTING
streamlit run app.py

# Override warehouse for a heavy query
export SNOWFLAKE_WAREHOUSE=XLARGE_WH
snow sql -c default -f heavy_query.sql

# Multiple overrides
export SNOWFLAKE_DATABASE=PROD_DB
export SNOWFLAKE_SCHEMA=PUBLIC
export SNOWFLAKE_WAREHOUSE=COMPUTE_WH
dbt run
```
Startup Script Pattern:
```bash
#!/bin/bash
# run_dev.sh

# Set environment-specific variables
export SNOWFLAKE_DATABASE=DEV_DB
export SNOWFLAKE_SCHEMA=DEV_SCHEMA
export SNOWFLAKE_WAREHOUSE=DEV_WH

# Start application
streamlit run app.py
```
Multi-Environment Scripts:
```bash
#!/bin/bash
# run.sh
ENV="${1:-dev}"

case $ENV in
  dev)
    export SNOWFLAKE_DATABASE=DEV_DB
    export SNOWFLAKE_WAREHOUSE=DEV_WH
    ;;
  staging)
    export SNOWFLAKE_DATABASE=STAGING_DB
    export SNOWFLAKE_WAREHOUSE=STAGING_WH
    ;;
  prod)
    export SNOWFLAKE_DATABASE=PROD_DB
    export SNOWFLAKE_WAREHOUSE=PROD_WH
    ;;
esac

streamlit run app.py
```
Usage: `./run.sh prod`
Connection Patterns for Different Tools
Streamlit Apps
Required pattern for local/Snowflake compatibility:
```python
import streamlit as st
from snowflake.snowpark.context import get_active_session
from snowflake.snowpark import Session

@st.cache_resource
def get_snowpark_session():
    """Get or create a Snowpark session (cached)."""
    try:
        # When running inside Snowflake (deployed)
        return get_active_session()
    except Exception:
        # When running locally - uses connections.toml
        return Session.builder.config("connection_name", "default").create()

session = get_snowpark_session()
```
With environment selection:
```python
@st.cache_resource
def get_snowpark_session(connection_name="default"):
    try:
        return get_active_session()
    except Exception:
        return Session.builder.config("connection_name", connection_name).create()

# Allow the user to select an environment
env = st.selectbox("Environment", ["default", "staging", "prod"])
session = get_snowpark_session(env)
```
Snowflake CLI
```bash
# Use the default connection
snow sql -c default -q "SELECT CURRENT_USER()"

# Use a specific connection profile
snow sql -c prod -q "SELECT CURRENT_DATABASE()"

# Test a connection
snow connection test -c default
```
dbt
Important: dbt uses `~/.dbt/profiles.yml` instead of `connections.toml`. See the dbt-core skill for dbt-specific connection configuration; the authentication methods above apply to both formats.