Senior Data Engineer

  • Jobot
  • New York, New York
  • 01/16/2026
Full time · Information Technology · Telecommunications · SQL · Python · Data Scientist · Cisco · Software Engineer

Job Description

Join a World-Class Team Operating at the Intersection of Technology, Data, and Digital Products

This Jobot Job is hosted by: Amanda Preston
Are you a fit? Easy Apply now by clicking the "Apply Now" button and sending us your resume.
Salary: $155,000 - $175,000 per year

A bit about us:

Competitive salary and comprehensive benefits

Long-term stability with continued investment in technology and engineering

High-visibility work with real-world impact

A collaborative, engineering-driven culture focused on quality and continuous improvement

Why join us?

Work on data and machine learning platforms operating at significant scale

Own and influence core systems that power critical business capabilities

Collaborate with experienced engineers and data scientists in a highly technical environment

Tackle complex engineering challenges with modern cloud and MLOps tooling

Enjoy the stability of a mature organization combined with the opportunity to modernize and innovate

Job Details

Senior Data Engineer
Role Summary

Join a high-performing engineering team at a large, global organization operating at the intersection of technology, data, and digital products. We are seeking a highly motivated Senior Data Engineer to lead the architecture, deployment, and operation of next-generation, data-driven platforms.

In this role, you will bridge the gap between Data Science and Production Engineering, ensuring that datasets, machine learning models, and core services are deployed reliably, securely, and at scale in the cloud. This is a high-impact position requiring deep expertise in data architecture, backend engineering, and the full machine learning lifecycle in production environments.

Key Responsibilities

Data Pipeline Design & Orchestration

Design, build, and maintain robust data ingestion and transformation pipelines

Leverage modern orchestration tools to ensure reliable, observable data flows supporting machine learning workloads

Core Development

Write clean, efficient, and well-tested Python code for automation, infrastructure tooling, and service integration

Develop shared libraries and glue services connecting cloud-native components

API & Service Deployment

Design, develop, and deploy high-performance Python APIs (FastAPI / Flask) to serve machine learning predictions and core application logic

MLOps Pipeline Ownership

Own end-to-end pipelines for continuous training, deployment, versioning, and monitoring of production ML models (e.g., recommendation or personalization systems)

Infrastructure Management

Architect and maintain scalable, fault-tolerant infrastructure using Kubernetes (GKE) within Google Cloud Platform

Ensure reliability, performance, and cost efficiency across environments

Collaboration & Mentorship

Partner closely with data scientists, software engineers, and platform teams

Provide technical leadership and mentorship to junior engineers

Qualifications
Must-Have (Engineering Excellence)

5+ years of professional experience in Data Engineering, Software Engineering, or Cloud Engineering

Deep expertise in Python for application development, data processing, and automation

Proven experience building and deploying production-grade backend services and APIs (FastAPI, Flask, or Django)

Strong SQL skills with experience designing and optimizing schemas for relational and analytical data stores (e.g., BigQuery, Cloud SQL)

Hands-on experience with data orchestration tools such as Dagster or Airflow

Extensive experience designing and operating services within Google Cloud Platform (BigQuery, Pub/Sub, Vertex AI, Compute Engine)

Expert-level knowledge of Docker and Kubernetes, including Helm-based deployments

Nice-to-Have (DevOps & MLOps)

Experience with Infrastructure as Code tools such as Terraform or Crossplane

CI/CD experience using GitHub Actions or similar tooling

Familiarity with observability stacks (Prometheus, Grafana, Cloud Logging)

Understanding of cloud security principles and enterprise compliance requirements

Direct experience supporting production MLOps workflows (model monitoring, drift detection, automated retraining)

Interested in hearing more? Easy Apply now by clicking the "Apply Now" button.

Jobot is an Equal Opportunity Employer. We provide an inclusive work environment that celebrates diversity, and all qualified candidates receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, age (40 and over), disability, military status, genetic information, or any other basis protected by applicable federal, state, or local laws. Jobot also prohibits harassment of applicants or employees based on any of these protected categories. It is Jobot's policy to comply with all applicable federal, state, and local laws respecting consideration of unemployment status in making hiring decisions.

Sometimes Jobot is required to perform background checks with your authorization. Jobot will consider qualified candidates with criminal histories in a manner consistent with any applicable federal, state, or local law regarding criminal backgrounds, including but not limited to the Los Angeles Fair Chance Initiative for Hiring and the San Francisco Fair Chance Ordinance.

Information collected and processed as part of your Jobot candidate profile, and any job applications, resumes, or other information you choose to submit is subject to Jobot's Privacy Policy, as well as the Jobot California Worker Privacy Notice and Jobot Notice Regarding Automated Employment Decision Tools which are available at

By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from Jobot, and/or its agents and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here: