Data Ops Engineer - Job Description
Qualifications:
- 5+ years of overall software engineering experience, including hands-on software development and data engineering
- 3+ years of hands-on coding experience in SQL, Python, and PySpark
- 3+ years of experience with advanced orchestration tools such as Apache Airflow
- 2+ years of experience with at least one cloud platform (Azure, AWS, GCP), preferably GCP
- Experience building CI/CD processes and pipelines

Roles and Responsibilities:
- Work in an agile environment
- Proactively identify and help resolve recurring data quality or data availability issues
- Monitor, support, and triage data pipelines that ingest, move, transform, and integrate data as it moves from the acquisition layer to the consumption layer
- Apply exceptional problem-solving and troubleshooting skills; analyze data to identify issues and patterns
- Communicate effectively with technical and business teams
- Aspire to be efficient, thorough, and proactive
- Develop queries and metrics for data-platform ad hoc reporting and/or ETL batch triage
- Maintain knowledge-base and FAQ documentation with instructions for resolving problems that jobs commonly run into