Remote Entry-Level Data Engineer – Big Data & Cloud Solutions at Talensiq – $70‑80K Annual Salary, Full‑Time, California

🌍 Remote, USA 🎯 Full-time 🕐 Posted Recently

Job Description

Welcome to Talensiq – Shaping the Future of Flight Through Data

Talensiq is a global leader in the aviation industry, delivering world‑class passenger experiences and pioneering innovations that keep the skies safe, efficient, and sustainable. Our mission, “Connecting People. Uniting the World.”, drives every decision we make, from the design of next‑generation aircraft to the digital platforms that power our operations. Within Talensiq, the technology‑focused Digital Innovation Team blends cutting‑edge data science, cloud engineering, and software development to turn massive streams of flight‑related data into actionable insights that improve safety, reduce costs, and delight our customers.

We are expanding rapidly and are looking for bright, motivated individuals who are eager to start their careers in data engineering. If you love solving complex problems, building robust data pipelines, and learning from seasoned professionals, Talensiq offers the ideal launchpad for your professional journey.

Why Join Talensiq?

• Industry Impact: Work on data solutions that directly influence global airline operations, passenger experience, and sustainability initiatives.
• Remote Flexibility: This is a fully remote, full‑time role based in California, allowing you to work from anywhere within the state while staying connected to a worldwide team.
• Competitive Compensation: Salary range of $70,000–$80,000 per year, complemented by performance bonuses.
• Comprehensive Benefits: Medical, dental, vision, 401(k) matching, generous parental leave, travel perks, and continuous learning allowances.
• Growth Opportunities: Structured mentorship, access to industry‑leading tools (AWS, Azure, Google Cloud), and clear career pathways from Associate Engineer to Senior Data Architect.
• Culture of Inclusion: Talensiq celebrates diversity, fostering an environment where every voice is heard and every employee can thrive.

Position Overview – Data Engineer (Entry Level)

As a Remote Entry‑Level Data Engineer at Talensiq, you will be part of the Data Architecture & Analytics group. This team designs, builds, and maintains high‑throughput data pipelines that ingest, transform, and deliver real‑time and batch data to downstream analytics, reporting, and machine‑learning platforms. Your work will enable critical business decisions such as flight scheduling optimization, predictive maintenance, and personalized passenger services.

Key Responsibilities

• Design & Build Data Pipelines: Create, test, and deploy scalable ETL/ELT workflows using technologies like Apache Spark, PySpark, Hadoop, and Kafka (see the sketch after this list).
• Maintain Data Quality & Reliability: Implement unit tests, data validation checks, and monitoring alerts to ensure data accuracy, integrity, and timeliness.
• Collaborate Across Teams: Work closely with data scientists, business analysts, and product owners to translate business requirements into technical specifications.
• Document Solutions: Produce clear, concise documentation, data dictionaries, and runbooks for all pipelines and architectural components.
• Support Data Lake & Warehouse Environments: Manage data ingestion into Talensiq’s data lake on AWS S3 (or Azure Data Lake) and orchestrate data movement into Snowflake, Redshift, or BigQuery.
• Optimize Performance: Tune Spark jobs, design partitioning strategies, and leverage caching to improve processing speed and cost efficiency.
• Mentorship & Knowledge Sharing: Assist junior teammates, conduct code reviews, and contribute to internal best‑practice libraries.
• Innovation & Automation: Identify repetitive tasks and develop automated solutions, leveraging Infrastructure‑as‑Code (IaC) tools such as Terraform or CloudFormation.
• Cross‑Functional Project Leadership: Guide end‑to‑end data initiatives, coordinating stakeholders from product, operations, and external partners.
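For candidates curious about the day‑to‑day work, the sketch below shows the flavor of a small PySpark batch job like those described above. It is a minimal, hypothetical example: the bucket paths, column names, and validation thresholds are illustrative assumptions, not Talensiq systems.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical batch ETL: ingest raw flight events from a landing zone,
# apply a basic data-quality gate, and write partitioned Parquet to a
# curated zone. All paths and column names are illustrative only.
spark = SparkSession.builder.appName("flight-events-etl").getOrCreate()

raw = spark.read.json("s3a://example-raw-zone/flight_events/")

# Data-quality gate: drop records missing key fields, bound implausible
# values, and derive a date column for partitioning.
clean = (
    raw.dropna(subset=["flight_id", "event_ts"])
       .filter(F.col("altitude_ft").between(0, 60000))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Partitioning by date keeps downstream scans cheap, the kind of choice
# covered under "Optimize Performance" above.
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://example-curated-zone/flight_events/"))

spark.stop()
```

A production job would add orchestration, monitoring alerts, and far more thorough validation; the sketch is only meant to convey the shape of the work.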
Essential Qualifications

• Bachelor’s degree in Computer Science, Software Engineering, Information Systems, or a related STEM field.
• At least 3 years of professional experience in software development, with a strong focus on data‑centric programming.
• Minimum 2 years of hands‑on experience with Java, Python, or Scala for building data applications.
• Proficiency in big‑data technologies: Apache Spark/PySpark, Hadoop, Hive, HBase, Kafka, or NiFi.
• Experience with relational and analytical databases such as Microsoft SQL Server, Teradata, or PostgreSQL.
• Demonstrated ability to work independently in a remote environment, meeting deadlines and maintaining high quality.
• Legal authorization to work in the United States without sponsorship.
• Strong written and verbal communication skills in English.

Preferred (Nice‑to‑Have) Skills & Experience

• Exposure to cloud platforms (AWS, Azure, Google Cloud) and cloud‑native services (Glue, Dataflow, Databricks).
• Experience with CI/CD pipelines using Jenkins, GitHub Actions, or similar tooling (a test sketch follows this list).
• Familiarity with containerization (Docker, Kubernetes) for scalable deployment.
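Because the role emphasizes unit tests and CI/CD, here is an equally hypothetical sketch of how a pipeline transform might be exercised with pytest against a local Spark session, the kind of check that runs on every commit in Jenkins or GitHub Actions. The function, row values, and column names are our own illustration:

```python
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def clean_events(df):
    """Hypothetical transform under test: drop null keys, bound altitude."""
    return (df.dropna(subset=["flight_id"])
              .filter(F.col("altitude_ft").between(0, 60000)))

@pytest.fixture(scope="module")
def spark():
    # A single-threaded local session: no cluster required, so this test
    # runs unchanged inside a CI container.
    session = (SparkSession.builder
               .master("local[1]")
               .appName("pipeline-tests")
               .getOrCreate())
    yield session
    session.stop()

def test_clean_events_filters_bad_rows(spark):
    rows = [("AA100", 35000), (None, 12000), ("AA200", 99999)]
    df = spark.createDataFrame(rows, ["flight_id", "altitude_ft"])
    result = clean_events(df)
    # Only the well-formed, in-range row should survive the gate.
    assert [r.flight_id for r in result.collect()] == ["AA100"]
```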


Ready to Apply?

Don't miss out on this amazing opportunity!

🚀 Apply Now
