vastai-data-handling
Implement Vast.ai PII handling, data retention, and GDPR/CCPA compliance patterns. Use when handling sensitive data, implementing data redaction, configuring retention policies, or ensuring compliance with privacy regulations for Vast.ai integrations. Trigger with phrases like "vastai data", "vastai PII", "vastai GDPR", "vastai data retention", "vastai privacy", "vastai CCPA".
Install
mkdir -p .claude/skills/vastai-data-handling && curl -L -o skill.zip "https://mcp.directory/api/skills/download/7835" && unzip -o skill.zip -d .claude/skills/vastai-data-handling && rm skill.zip
Installs to .claude/skills/vastai-data-handling
About this skill
Vast.ai Data Handling
Overview
Manage training data and model artifacts securely on Vast.ai GPU instances. Covers data transfer, encryption, checkpoint management, and cleanup. Critical consideration: Vast.ai instances run on shared hardware operated by third-party hosts.
Prerequisites
- Vast.ai instance with SSH access
- Cloud storage (S3, GCS) for persistent artifacts
- Understanding of data sensitivity classification
Instructions
Step 1: Data Transfer Patterns
# Small datasets (<5GB): Direct SCP
scp -P $PORT -r ./data/ root@$HOST:/workspace/data/
# Large datasets (5-50GB): Compressed transfer
tar czf - ./data/ | ssh -p $PORT root@$HOST "tar xzf - -C /workspace/"
# Very large datasets (>50GB): Cloud storage staging
# Upload to S3/GCS first, then download on instance
ssh -p $PORT root@$HOST "aws s3 sync s3://bucket/dataset/ /workspace/data/"
Step 2: Encrypted Data Transfer
import subprocess, os

def encrypt_and_upload(local_path, host, port, remote_path, passphrase):
    """Encrypt data before transferring to Vast.ai instance."""
    encrypted = f"{local_path}.enc"
    # Encrypt with AES-256
    subprocess.run([
        "openssl", "enc", "-aes-256-cbc", "-salt", "-pbkdf2",
        "-in", local_path, "-out", encrypted,
        "-pass", f"pass:{passphrase}",
    ], check=True)
    # Transfer encrypted file
    subprocess.run([
        "scp", "-P", str(port), encrypted,
        f"root@{host}:{remote_path}.enc",
    ], check=True)
    # Decrypt on instance
    subprocess.run([
        "ssh", "-p", str(port), f"root@{host}",
        f"openssl enc -aes-256-cbc -d -pbkdf2 "
        f"-in {remote_path}.enc -out {remote_path} "
        f"-pass pass:{passphrase} && rm {remote_path}.enc",
    ], check=True)
    os.remove(encrypted)
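A hypothetical invocation, with host and port standing in for your instance's SSH details. Note that openssl pass: arguments are visible in process listings on both machines, so read the passphrase from an environment variable rather than hard-coding it:
import os

encrypt_and_upload(
    "train.csv",
    host="ssh4.vast.ai",   # placeholder host
    port=12345,            # placeholder port
    remote_path="/workspace/train.csv",
    passphrase=os.environ["DATA_PASSPHRASE"],
)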
Step 3: Checkpoint to Cloud Storage
import torch, boto3, os

class CloudCheckpointManager:
    def __init__(self, s3_bucket, prefix, save_every=500):
        self.s3 = boto3.client("s3")
        self.bucket = s3_bucket
        self.prefix = prefix
        self.save_every = save_every

    def save(self, model, optimizer, step, loss):
        if step % self.save_every != 0:
            return
        local_path = f"/tmp/ckpt-{step}.pt"
        torch.save({
            "step": step, "loss": loss,
            "model": model.state_dict(),
            "optimizer": optimizer.state_dict(),
        }, local_path)
        self.s3.upload_file(local_path, self.bucket,
                            f"{self.prefix}/ckpt-{step}.pt")
        os.remove(local_path)
        print(f"Checkpoint saved: step {step}, loss {loss:.4f}")

    def load_latest(self):
        resp = self.s3.list_objects_v2(Bucket=self.bucket, Prefix=self.prefix)
        if not resp.get("Contents"):
            return None
        # Pick by upload time: lexicographic key order would rank
        # ckpt-1000.pt before ckpt-500.pt and return the wrong checkpoint
        latest = max(resp["Contents"], key=lambda o: o["LastModified"])
        self.s3.download_file(self.bucket, latest["Key"], "/tmp/latest.pt")
        return torch.load("/tmp/latest.pt")
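Wired into a training loop, it might look like this; model, optimizer, loader, total_steps, and train_step are assumed to come from your own training code:
mgr = CloudCheckpointManager("my-training-bucket", "runs/exp1")  # placeholder names
for step in range(total_steps):
    loss = train_step(model, optimizer, batch=next(loader))
    mgr.save(model, optimizer, step, loss)  # no-op except every save_every steps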
Step 4: Secure Cleanup Before Destroy
# ALWAYS clean sensitive data before destroying an instance
ssh -p $PORT root@$HOST << 'CLEANUP'
# Remove training data and checkpoints
rm -rf /workspace/data /workspace/checkpoints /workspace/*.pt
# Clear command history
history -c && rm -f ~/.bash_history
# Overwrite sensitive files (optional, for high-security)
find /workspace -name "*.env" -exec shred -u {} \;
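# Note: shred cannot guarantee overwrites on journaling or copy-on-write filesystems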
echo "Cleanup complete"
CLEANUP
# Then destroy
vastai destroy instance $INSTANCE_ID
Step 5: Data Lifecycle Policy
| Data Type | On Instance | After Job | Retention |
|---|---|---|---|
| Training data | Decrypt on use | Delete before destroy | Source system only |
| Checkpoints | Local + cloud sync | Keep in cloud storage | 30 days |
| Final model | Local | Upload to model registry | Permanent |
| Logs | Local | Upload to logging service | 90 days |
| Temp files | /tmp | Auto-deleted on destroy | None |
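The 30-day checkpoint retention above can be enforced automatically rather than by hand. A minimal sketch using an S3 lifecycle rule; the bucket name and prefix are placeholders:
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="my-training-bucket",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "expire-checkpoints",
            "Filter": {"Prefix": "checkpoints/"},
            "Status": "Enabled",
            "Expiration": {"Days": 30},  # matches the retention table above
        }]
    },
)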
Output
- Data transfer patterns (SCP, compressed, cloud-staged)
- Encrypted transfer for sensitive datasets
- Cloud checkpoint manager with S3 integration
- Secure cleanup script before instance destruction
- Data lifecycle policy
Error Handling
| Error | Cause | Solution |
|---|---|---|
| SCP timeout | Large file or slow network | Use compressed transfer or cloud staging |
| Checkpoint upload fails | S3 credentials not on instance | Pass AWS creds via env vars at instance creation |
| Disk full during training | Insufficient disk allocation | Increase --disk or clean old checkpoints |
| Data left after destroy | Skipped cleanup | Always run cleanup script before vastai destroy |
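For the credentials failure above, one option is to inject S3 credentials as container environment variables at creation time. A hedged sketch, assuming the vastai CLI's --env option for Docker-style environment strings; the offer ID and image are placeholders:
import os, subprocess

env_str = (
    f"-e AWS_ACCESS_KEY_ID={os.environ['AWS_ACCESS_KEY_ID']} "
    f"-e AWS_SECRET_ACCESS_KEY={os.environ['AWS_SECRET_ACCESS_KEY']}"
)
subprocess.run(
    ["vastai", "create", "instance", "1234567",  # placeholder offer ID
     "--image", "pytorch/pytorch:latest", "--env", env_str],
    check=True,
)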
Next Steps
For enterprise access control, see vastai-enterprise-rbac.
Examples
Sensitive data workflow: Encrypt dataset locally, SCP encrypted file to instance, decrypt on-instance, train, save checkpoints to S3, clean and destroy.
Resume after preemption: Load latest checkpoint from S3 on new instance, continue training from last saved step.
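A sketch of that recovery flow, reusing the Step 3 manager; model, optimizer, total_steps, and train_step again come from your own training code:
mgr = CloudCheckpointManager("my-training-bucket", "runs/exp1")
start_step = 0
ckpt = mgr.load_latest()
if ckpt is not None:
    model.load_state_dict(ckpt["model"])
    optimizer.load_state_dict(ckpt["optimizer"])
    start_step = ckpt["step"] + 1  # resume just after the saved step
for step in range(start_step, total_steps):
    loss = train_step(model, optimizer)
    mgr.save(model, optimizer, step, loss)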