Remote Data Entry Engineer – Full‑Time (Day/Night Shifts) – Build Scalable Data Pipelines & Cloud Solutions at Talvora

🌍 Remote, USA 🎯 Full-time 🕐 Posted Recently

Job Description

Why Talvora?

At Talvora, we are redefining the future of retail technology. Our mission is to make everyday life simpler for millions of customers by turning massive data streams into actionable insights. As a globally recognized, tech-driven retailer, Talvora blends cutting-edge engineering with a customer-first mindset. Join a team where a single line of code can influence purchasing decisions for an entire nation, and where innovation is celebrated every day.

Position Overview

We are looking for a meticulous, technically savvy Remote Data Entry Engineer to join our data engineering squad. This full-time role offers flexible day or night shifts and is 100% remote, allowing you to work from anywhere in the USA while contributing to the large-scale data pipelines that fuel the Talvora e-commerce platform and its subsidiary services.

Key Responsibilities

- Design, build, and maintain robust, scalable data pipelines to ingest, transform, and store petabytes of structured and unstructured data.
- Develop ETL workflows using modern frameworks and ensure they meet performance, reliability, and cost-efficiency targets.
- Collaborate with cross-functional scrum teams, including Product, QA, Platform Engineering, and Partner Operations, to align data solutions with business objectives.
- Operate and optimise cloud environments on Google Cloud and Microsoft Azure for data storage, processing, and analytics.
- Create, containerise, and deploy high-throughput applications using Docker and Kubernetes, orchestrating them on cloud infrastructure.
- Monitor pipeline health, respond to real-time alerts, and perform root-cause analysis for production incidents.
- Implement data quality checks, validation rules, and governance standards to guarantee trustworthy data for downstream analytics.
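To give candidates a concrete flavour of the data-quality work described above, here is a minimal, hypothetical sketch of validation rules applied during ingestion. All record fields and function names are invented for illustration and do not reflect an actual Talvora codebase:

```python
# Hypothetical data-quality checks for an ingestion step.
# The "order" record shape is an illustrative assumption.

def validate_order(record: dict) -> list[str]:
    """Return a list of data-quality violations for one order record."""
    errors = []
    for field in ("order_id", "customer_id", "amount"):
        if field not in record or record[field] in (None, ""):
            errors.append(f"missing field: {field}")
    amount = record.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("amount must be non-negative")
    return errors

def partition_by_quality(records):
    """Split records into (clean, quarantined) before loading downstream."""
    clean, quarantined = [], []
    for rec in records:
        (clean if not validate_order(rec) else quarantined).append(rec)
    return clean, quarantined
```

In a production pipeline the quarantined records would typically be routed to a dead-letter store for review rather than silently dropped.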
Additional Responsibilities

- Contribute to the evolution of data models, schema designs, and storage strategies for both relational (e.g., PostgreSQL) and NoSQL (e.g., Cassandra) databases.
- Document architectural decisions, data-flow diagrams, and operational runbooks for internal knowledge sharing.
- Mentor junior engineers and foster a culture of continuous learning and improvement.

Essential Qualifications

- Proven expertise in data engineering concepts: data pipelines, dataset design, ETL processes, and data mining.
- Hands-on experience with data technologies such as SQL, Python, Scala, Hadoop, and related tools.
- Professional experience with ETL engines, specifically Apache Flink for stream processing.
- Solid understanding of relational databases (e.g., PostgreSQL) and NoSQL stores (e.g., Cassandra).
- Familiarity with real-time messaging platforms; experience with Apache Kafka is highly desirable.
- Demonstrated ability to design and implement scalable data models with a focus on performance and fault tolerance.
- Experience with containerisation (Docker) and orchestration (Kubernetes) in production environments.
- Working knowledge of distributed computing services on Google Cloud and Microsoft Azure, including data-warehousing tools like Hive and analytics platforms such as Spark.
- Excellent verbal and written communication skills, capable of presenting complex technical concepts to both technical and non-technical audiences.
- Strong sense of ownership, self-discipline, and a commitment to delivering high-quality results.

Preferred Qualifications & Add-Ons

- Certification in cloud platforms (e.g., Google Cloud Professional Data Engineer, Azure Data Engineer Associate).
- Experience with big-data processing frameworks such as Spark or Flink beyond the basics.
- Knowledge of data-governance frameworks, GDPR compliance, and data-privacy best practices.
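As a toy illustration of the stream-processing skills listed above, here is a pure-Python sketch of a tumbling-window count, the kind of aggregation typically expressed in Apache Flink over events consumed from a Kafka topic. The event shape and window size are assumptions for illustration only:

```python
from collections import defaultdict

# Toy tumbling-window aggregation. In production this logic would run in
# Apache Flink against a Kafka topic, not over an in-memory list.

WINDOW_SECONDS = 60  # assumed window size, for illustration

def tumbling_counts(events):
    """Count events per (window_start, key) from (timestamp, key) pairs."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[(window_start, key)] += 1
    return dict(counts)
```

A real streaming job would additionally handle out-of-order events, watermarks, and state checkpointing, which frameworks like Flink provide out of the box.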
- Familiarity with CI/CD pipelines (Jenkins, GitLab CI) for automated testing and deployment.
- Exposure to machine-learning data pipelines and feature-store concepts.

Core Skills & Competencies

- Analytical mindset: ability to dissect complex data flows and optimise them for speed and cost.
- Problem-solving orientation: quick identification of bottlenecks and implementation of effective fixes.
- Collaboration: proven track record of working in agile, cross-functional teams.
- Adaptability: comfortable shifting between day and night shift schedules while maintaining productivity.
- Continuous learning: eagerness to stay current with emerging data technologies and industry trends.

Career Growth & Learning Opportunities

At Talvora, your career trajectory is as limitless as the data you manage. You will have access to:

- Mentorship programs pairing you with senior architects and data scientists.
- Sponsored certifications for cloud platforms, data engineering, and security.
- Internal tech talks, hackathons, and innovation days to showcase new ideas.
- Rotational assignments across different business units (e-commerce, supply chain, finance) to broaden your domain expertise.
- Clear promotion paths.
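For candidates wondering what the ETL work in this role looks like day to day, here is a minimal, self-contained extract-transform-load sketch in Python. The source data, schema, and function names are invented for illustration and do not reflect Talvora's actual stack:

```python
import csv
import io
import sqlite3

# Minimal ETL sketch: extract CSV rows, clean them, load into a SQL table.
# Raw data and schema are illustrative assumptions.

RAW_CSV = """sku,price,qty
ABC-1,9.99,3
XYZ-2,,5
DEF-3,4.50,0
"""

def extract(text):
    """Extract: parse raw CSV text into row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with a missing price and compute line totals."""
    out = []
    for row in rows:
        if not row["price"]:
            continue  # a real pipeline would quarantine incomplete records
        out.append((row["sku"], float(row["price"]) * int(row["qty"])))
    return out

def load(rows):
    """Load: write transformed rows into an (in-memory) SQL table."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE line_totals (sku TEXT, total REAL)")
    con.executemany("INSERT INTO line_totals VALUES (?, ?)", rows)
    return con

con = load(transform(extract(RAW_CSV)))
```

At scale the same extract/transform/load stages would run on distributed engines such as Spark or Flink, with PostgreSQL or a cloud warehouse as the destination.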


Ready to Apply?

Don't miss out on this amazing opportunity!

🚀 Apply Now
