Python Engineer – ML/Data Pipelines

🌍 Remote, USA 🎯 Full-time 🕐 Posted Recently

Job Description

This is a fully remote job; the offer is available from: Europe

About the Project

We are currently looking for a Python Engineer – Data Pipelines for our key client, an innovative company specializing in data-driven solutions. Focused on leveraging technology, people, and processes, the company develops and optimizes software, data infrastructure, and AI-powered tools for businesses across industries. With a collaborative, agile environment, it emphasizes continuous learning, career growth, and remote work flexibility.

This role focuses on building and maintaining data pipelines that operationalize PoC solutions developed by a team of data scientists. These PoCs—often related to marketing segmentation or attribute enrichment—are transformed into scalable, production-ready pipeline components. You will work in a team of five: a project lead, three data/software engineers, and one QA engineer.

Your Duties:

As a Python Engineer – ML/Data Pipelines, you will be responsible for:

  • Developing and maintaining scalable data pipelines to support production deployment
  • Translating Python-based scripts from data scientists into robust, maintainable pipeline components
  • Implementing workflow orchestration using Kubeflow or similar tools (Airflow, Argo, MLflow, etc.)
  • Collaborating with data scientists to understand PoC requirements and convert them into production workflows
  • Optimizing pipelines for performance, reliability, and efficient resource usage
  • Debugging, monitoring, and resolving pipeline issues in production environments
  • Contributing to continuous improvement of tooling, workflows, and best practices
  • Supporting cloud-based deployments and infrastructure (any major cloud platform)

Requirements
  • 4+ years of Python development experience (APIs, automation, backend systems, or pipeline work).
  • Strong understanding of data workflows, data processing, or data management.
  • Experience with workflow orchestration tools such as Kubeflow, Airflow, Argo, Prefect, or MLflow.
  • Knowledge of containerization best practices (e.g., Docker); Kubernetes is a plus but not required.
  • Familiarity with cloud environments (GCP, AWS, or Azure all acceptable).
  • Strong debugging, problem-solving, and communication skills.
  • Experience collaborating in cross-functional engineering teams.
  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
  • English language proficiency at B2 level or higher.
Nice to have:

  • Familiarity with ML frameworks (TensorFlow, PyTorch, Scikit-learn).
  • Experience with CI/CD pipelines and DevOps tools.
  • Exposure to Kubernetes.
  • Experience working with databases (especially Snowflake) and integrating them into data workflows.

Salary: EUR 3000

This offer from "Gegidze" has been enriched by Jobgether.com and received a 78% flex score.

Apply to this job

Ready to Apply?

Don't miss out on this amazing opportunity!

🚀 Apply Now
