Job Description
- Principal GCP Data Architect | Location: Remote
- 16–20 years of overall IT experience.
- 5–6 years of recent, hands-on experience with GCP.
- Proven experience working on Google Data projects.
- Valid and recent GCP certification.
- Prior involvement in Google-funded projects.
- Must possess a Data Readiness Placement (DRP) ID with a verified score of 50 or above.
We are seeking a hands-on **Principal GCP Data Architect** to lead the design and implementation of large-scale data ecosystems. The ideal candidate is an architect who has led massive cloud migrations, built self-serve data platforms from the ground up, and maintains a rigorous focus on data governance and security.
Technical Requirements
- **Cloud Expertise:** Expert-level mastery of the GCP data stack:
  - **Storage & Warehouse:** BigQuery (including BigLake and Omni), Google Cloud Storage.
  - **Processing:** Dataflow (Apache Beam), Dataproc (Spark/Hadoop), Cloud Composer (Airflow).
  - **Messaging:** Pub/Sub and Confluent/Kafka integration.
  - **Analytics & AI:** Looker, Vertex AI, and BigQuery ML.
- **Certifications:** Must hold an active **GCP Professional Data Engineer** certification.
- **DRP ID Performance:** A **Data Readiness Placement (DRP) ID** with a verified score of **50+** is highly preferred, demonstrating a high level of technical proficiency and architectural maturity.
- **Modern Data Stack:** Deep experience with dbt, Airflow, and containerization (GKE/Kubernetes).
Experience & Qualifications
- **Years of Experience:** 18–20 years in Data Engineering, Data Warehousing, and Business Intelligence, with at least 6 years focused specifically on GCP.
- **Migration Track Record:** Proven experience leading at least two enterprise-scale (petabyte-scale) migrations to the cloud.
- **Leadership:** Demonstrated experience leading large, cross-functional engineering teams in an Agile/DevOps environment.
- **Education:** Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related technical field.
Roles & Responsibilities
- **Strategic Architecture:** Define the long-term roadmap for data platforms on GCP, ensuring alignment with global business goals and "North Star" data strategies.
- **Self-Serve Platform Design:** Architect and oversee the build-out of modern, self-serve data architectures that empower decentralized teams while maintaining central standards.
- **Large-Scale Migrations:** Lead complex, end-to-end migrations from AWS to GCP.
- **Data Governance & Security:** Design and enforce robust data security frameworks, including IAM, VPC Service Controls, data masking/encryption, and automated governance using **Dataplex**.
- **Infrastructure as Code (IaC):** Drive automation using Terraform or Pulumi to ensure repeatable, scalable, and version-controlled infrastructure.
- **Executive Stakeholder Management:** Act as a trusted advisor to the VP of Data, articulating the ROI of data initiatives and managing technical risk.
Apply to this job