Senior Data Architect — Lakehouse & Analytics

🌍 Remote 🎯 Full-time 🕐 Posted Recently

Job Description

Important Information

Location: Brazil
Job Mode: Full-time
Work Mode: Work from home

Job Summary

We are looking for a Senior Data Architect to design and govern our enterprise lakehouse platform built on Medallion Architecture. You will own the end-to-end analytical data architecture — from ingestion patterns and pipeline quality gates through dimensional modeling, semantic layers, and BI consumption. This role blends hands-on pipeline development in SQL and Python with strategic platform decisions around governance (Unity Catalog), security, data quality contracts, and cost-conscious design choices such as materialized views vs. streaming tables. You will partner closely with data engineers, analysts, and business leaders to deliver a modern, performant, and well-governed analytics ecosystem.

Responsibilities and Duties

  • Architect and maintain our lakehouse using Medallion Architecture (bronze / silver / gold), defining clear contracts between layers;
  • Design dimensional models and semantic layers that power self-service analytics across Power BI, Tableau, or similar platforms;
  • Define and enforce data governance through Unity Catalog, including RBAC, row-level and column-level security, and PII masking strategies;
  • Make platform-level trade-off decisions — materialized views vs. streaming tables, batch vs. incremental loads, cost vs. freshness;
  • Develop declarative pipelines in SQL and Python with built-in quality gates, observability, and data contract enforcement;
  • Own CI/CD and deployment workflows using Asset Bundles, GitHub-based deployments, and infrastructure-as-code practices;
  • Drive performance tuning for Delta tables, including partition strategies, Z-ordering, and incremental load optimization;
  • Establish data quality standards, lineage tracking, metadata management, and business glossary design;
  • Partner with business stakeholders and leadership to translate analytical needs into scalable architectural decisions;
  • Lead or support enterprise-scale migration and modernization efforts as the platform evolves.
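To make the "quality gates and data contract enforcement" responsibility concrete, here is a minimal, framework-agnostic sketch of the pattern. All names (`check_contract`, the sample schema) are illustrative assumptions, not part of any specific platform API; in practice this logic would live in a declarative pipeline framework such as Delta Live Tables expectations.

```python
def check_contract(rows, required_fields, not_null=()):
    """Validate a batch of records against a simple data contract.

    Returns (passed, failed) so the caller can route bad records to a
    quarantine table instead of silently dropping them.
    """
    passed, failed = [], []
    for row in rows:
        missing = [f for f in required_fields if f not in row]
        nulls = [f for f in not_null if row.get(f) is None]
        (failed if missing or nulls else passed).append(row)
    return passed, failed

# Hypothetical example: enforce a bronze -> silver contract on a small batch.
batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": None, "amount": 5.0},  # violates NOT NULL
    {"amount": 3.0},                    # missing order_id
]
good, bad = check_contract(
    batch,
    required_fields=["order_id", "amount"],
    not_null=["order_id"],
)
print(len(good), len(bad))  # 1 passing record, 2 quarantined
```

The key design choice sketched here is that a quality gate separates failing records rather than silently filtering them, which keeps the bronze-to-silver contract auditable.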
Essential Skills

  • Deep experience designing lakehouse or data warehouse architectures at enterprise scale;
  • Strong hands-on skills with Delta Lake, dimensional modeling, and columnar data optimization;
  • Proven governance experience with Unity Catalog or equivalent (access control, lineage, cataloging);
  • Fluency in SQL and Python for pipeline development, with comfort in both notebook and production CI/CD workflows;
  • A track record of defining data quality standards, data contracts, and observability patterns;
  • Familiarity with Autoloader, streaming tables, and real-time ingestion patterns is a plus;
  • Experience with Iceberg, Parquet internals, or multi-engine open table formats is a plus.

About Encora

Encora is the preferred digital engineering and modernization partner of some of the world’s leading enterprises and digital native companies. With over 9,000 experts in 47+ offices and innovation labs worldwide, Encora’s technology practices include Product Engineering & Development, Cloud Services, Quality Engineering, DevSecOps, Data & Analytics, Digital Experience, Cybersecurity, and AI & LLM Engineering.

At Encora, we hire professionals based solely on their skills and qualifications, and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.


Ready to Apply?

Don't miss out on this amazing opportunity!

🚀 Apply Now
