Resident Solution Architect – Databricks & Spark

🌍 Remote, USA 🎯 Full-time 🕐 Posted Recently

Job Description

Job Title: Databricks Resident Solutions Architect (RSA)

Experience: 12+ years (minimum 5 years of deep specialization in Databricks/Spark)

Location: Remote

Role Overview

We are looking for an experienced Resident Solution Architect with strong expertise in the Databricks Lakehouse Platform to design, architect, and deliver scalable, secure, and production-grade Data & AI solutions.

The ideal candidate has deep hands-on technical expertise, strong consulting experience, and a proven track record of leading enterprise-level data platform implementations and migrations across cloud environments.

Key Responsibilities

    1. Solution Architecture & Delivery
  • Design and build production-ready reference architectures using Lakehouse and Delta Lake best practices
  • Architect scalable big data and AI solutions on Databricks
  • Lead migrations (ETL/ELT, data warehouses, legacy systems) to modern Lakehouse architecture
  • Provide architecture consulting, cluster optimization, and performance tuning
  • Implement data governance and security using Unity Catalog
    2. Customer Engagement & Delivery Management
  • Scope, plan, and manage technical engagements
  • Drive end-to-end project delivery (Design → Development → Deployment → Optimization)
  • Manage timelines, risks, and deliverables
  • Provide support for complex production issues
    3. Platform Engineering & DevOps
  • Implement CI/CD pipelines for code and infrastructure
  • Deploy infrastructure using Terraform and Databricks Asset Bundles (DAB)
  • Set up job scheduling, monitoring, and production management
  • Establish best practices for version control and automation
    4. Optimization & Continuous Improvement
  • Monitor and optimize data pipelines and ML models
  • Improve system performance and cost efficiency
  • Contribute reusable assets and documentation
  • Enable customer teams through knowledge transfer
    5. Leadership & Enablement
  • Mentor and train customer teams
  • Provide technical leadership across engagements
  • Support pre-sales activities and architecture discussions

Required Skills & Qualifications

Experience (Mandatory)

    12+ years of hands-on experience in:
  • Data Engineering
  • Data Platforms
  • Data Analytics
  • Data Warehousing
  • Big Data technologies (Kafka, Data Lakes, Cloud-native solutions)

Cloud Expertise

    Hands-on experience with at least one of:
  • AWS
  • Microsoft Azure
  • Google Cloud Platform (GCP)

Programming Skills

  • Python
  • SQL
  • Scala

Databricks Expertise

    Strong hands-on experience with:
  • Databricks SQL
  • Apache Spark
  • Delta Lake
  • MLflow
  • Unity Catalog
  • Delta Live Tables (DLT)

Migration & Architecture

  • Experience leading enterprise workload migrations
  • Strong knowledge of ETL/ELT design patterns
  • Experience modernizing legacy data systems

Deployment & Automation

  • Databricks Asset Bundles (DAB)
  • Terraform
  • CI/CD pipelines
  • Infrastructure as Code (IaC)

Certifications

  • Databricks Certified Data Engineer Associate
  • Databricks Certified Data Engineer Professional (Preferred)

Soft Skills

  • Excellent communication
  • Strong stakeholder management
  • Leadership & mentoring ability
  • Consulting and problem-solving skills


Ready to Apply?

Don't miss out on this amazing opportunity!

🚀 Apply Now
