Head of Data Science Engineering & Analytics

🌍 Remote, USA 🎯 Full-time 🕐 Posted Recently

Job Description

About the position

What you can expect

You will build and scale the data foundation that powers Zoom's AI capabilities and business operations. By defining enterprise data strategy and leading a global team, you will enable trusted insights and innovation. This role offers the opportunity to shape how a $10B+ company leverages data to drive growth.

About the Team

We deliver enterprise data platforms, analytics, and AI capabilities across Zoom. Our team partners with Product, Engineering, Finance, and Sales to enable data-driven decisions. We exist to transform data into actionable insights that fuel business growth.

    Responsibilities
  • Defining and executing enterprise data strategy, platform architecture, and AI enablement roadmaps aligned with business goals
  • Leading a global organization of 60+ data engineers, machine learning engineers, scientists, analysts, and architects across the US and India
  • Building scalable data pipelines, governance frameworks, and analytics solutions that serve as the single source of truth
  • Partnering with Finance, Marketing, Sales, Product, and Engineering leaders to deliver insights, dashboards, and data products
  • Enabling self-service analytics and AI/ML use cases through accessible, trusted, and secure data platforms
    Requirements
  • Demonstrate 15+ years of experience in data engineering, analytics, and data science, or equivalent practical experience
  • Bring experience leading teams of 50+ people in senior leadership roles, building cultures of collaboration and continuous learning
  • Architect modern cloud-based data platforms using Snowflake, AWS, and data pipeline technologies
  • Build data foundations that support AI and machine learning initiatives at enterprise scale
  • Translate business needs into technical solutions and influence both technical and non-technical stakeholders
  • Apply expertise in data governance, master data management, security, and compliance frameworks
  • Use tools such as dbt, Airflow, Kafka, Databricks, Fivetran, and data quality platforms
  • Work with AI/ML platforms including LLMs, vector databases, and MLOps tools
  • Apply deep, hands-on experience with Snowflake and modern data pipeline technologies (required), including cloud platforms (AWS), CI/CD, DevOps/DataOps, and enterprise integrations
  • Apply expertise across the modern data stack: cloud databases (Snowflake, Databricks, MongoDB), transformation tools (dbt, Airflow), ingestion platforms (Fivetran, Stitch, Snowpipe, Kafka, Apache Pulsar, AWS Kinesis/Firehose), orchestration (Airflow, Dagster), observability (Great Expectations, Bigeye), and governance (Alation, Monte Carlo, Atlan)
  • Demonstrate experience building data foundations to support AI/ML initiatives, including platforms like Databricks, Snowflake AI/ML (Cortex), MLflow, LLM-based protocols and patterns (RAG/Agentic/MCP/A2A), and vector databases (Pinecone, FAISS)
  • Demonstrate ability to establish operational excellence frameworks including SLAs/SLIs, end-to-end observability, cost optimization, and key operational metrics tracking


🚀 Apply Now
