City/State: Virginia Beach, VA
Work Shift: First (Days)

Overview: Sentara is hiring a Senior Data Scientist! This position is fully remote.

We are seeking a highly skilled and experienced Data Science ML Operations and Gen AI Engineer (or Senior) to join us and help advance our current and future work applying machine learning, deep learning, and NLP to deliver better healthcare. The Senior Data Scientist will leverage data to improve healthcare outcomes and drive data-driven decision-making. Drawing on expertise in statistical analysis and machine learning, this role will collaborate with cross-functional teams to solve complex healthcare challenges and enhance patient care. This role will directly contribute to advancing medical research, optimizing healthcare processes, and delivering innovative solutions in the healthcare industry.

As a Senior ML Engineer on our team, you will play a crucial role in identifying gaps in our existing ML platform and architecting and building solutions to address those gaps. You will also collaborate with the AI team's ML Scientists and our partner data engineering and software development teams to bring ML and Gen AI models to production and maintain their health and integrity while in production. Your expertise in machine learning and Gen AI, coupled with a strong background in software development, will be instrumental in driving the success of Sentara's AI/ML initiatives.

Qualifications:
• 5+ years building production software/ML systems, including 1+ years of experience with LLMs/GenAI.
• Proficient in Python and one major DL/LLM stack (e.g., PyTorch/Transformers); experience with LangChain/LlamaIndex, vector DBs, and cloud (AWS/Azure/GCP).
• Demonstrated delivery of RAG, prompt engineering, evaluation frameworks, and guardrails in production.
• Strength in APIs, distributed systems, and ML Ops (K8s, CI/CD, monitoring).
• Experience with the Epic health platform is highly preferred.
• Experience with ML platforms and ML Ops: Demonstrated experience in assessing and improving ML platforms, identifying gaps, and architecting solutions to address them. Strong familiarity with ML platform components such as data ingestion, preprocessing, feature stores, model training, deployment, and monitoring.
• Experience with SQL and big data platforms such as Postgres, Redshift, and Snowflake.
• Experience with Agile/Scrum methodology and best practices.

Preferred:
• Previous work experience with Generative AI and ML Ops in a healthcare Epic environment
• Understanding of the use and implementation of vector databases
• Kubernetes container orchestration experience

Responsibilities
• Design and develop production-grade Machine Learning Ops and Gen AI solutions.
• Lead hands-on delivery of scalable GenAI solutions, from problem framing and prototyping through evaluation and production monitoring.
• Build internal copilots/assistants (knowledge search, code/content generation) and client-facing products (conversational analytics, summarization, recommendations, workflow automation).
• Design RAG pipelines, embedding strategies, vector search, and model orchestration; evaluate fine-tuning vs. prompt engineering.
• Implement guardrails, safety filters, prompt/version management, latency/throughput optimizations, and cost controls.
• ML platform and ML Ops: Identify areas that require improvements or additional functionality, and use your expertise in machine learning and software engineering to architect and develop solutions that fill gaps in our ML platform and development ecosystem. Analyze system performance, scalability, and reliability to pinpoint opportunities for enhancement. Develop tools and solutions that help the team build, deploy, and monitor AI/ML solutions efficiently.
• System scalability and reliability: Optimize the scalability, performance, and reliability of AI Team solutions by implementing best practices and leveraging industry-standard technologies. Collaborate with infrastructure teams to ensure smooth integration and deployment of ML solutions. Design scalable and efficient systems that leverage the power of machine learning for enhanced performance and capabilities.
• Data processing and workflow pipelines: Streamline data ingestion, preprocessing, feature engineering, and model training workflows to improve efficiency and reduce latency. Work with data engineering and data platform teams to design and implement robust data pipelines that support the AI team's needs.
• Model deployment and monitoring: Evaluate and optimize model prototypes for real-world performance. Work with infrastructure and development teams to integrate ML models into production systems. Work closely with partner teams to communicate and understand technical requirements and challenges.
• As part of Sentara's Data Science team, you will be responsible for the implementation and operationalization of AI/ML models. You will work with other machine learning engineers, data scientists, software engineers, and platform engineers to ensure the success of AI/ML implementations. Specific responsibilities will include:
• Apply software engineering rigor and best practices to machine learning, including AI/MLOps, CI/CD, automation, etc.
• Turn the offline models data scientists build into real machine learning production systems.

Education: Bachelor's Degree (Required)
Certification/Licensure: No specific certification or licensure requirements

Experience
Required to have 5+ years of experience as a Data Scientist with a strong focus on Azure and Microsoft Data Science, AI, and machine learning toolsets.
Required to have strong problem-solving skills and the ability to tackle complex healthcare challenges using data-driven approaches.
Will help build up the Data Science infrastructure, working with the ML Ops team on model implementation and mentoring and developing junior staff.
Required to have strong proficiency in data analysis, data manipulation, and data visualization using Python.
Required to have familiarity with healthcare-related datasets, medical terminologies, and electronic health records (EHR) data.
Required to have knowledge of statistical techniques, hypothesis testing, and experimental design for healthcare research.
Required to have strong machine learning expertise: proficient in machine learning algorithms, statistical modeling, and data analysis, with hands-on experience with standard ML frameworks (e.g., TensorFlow, PyTorch) and libraries (e.g., scikit-learn, XGBoost, or Keras).
Required to have a solid understanding of data engineering principles, data structures, and algorithms, and proficiency in Python and/or other programming languages commonly used in ML development.
Experience with technologies, frameworks, and architectures such as Java or Python, Angular, React, JSON, application servers, and CI/CD is preferred.
Required to have experience with one or more AI automation platforms such as Kubeflow Pipelines, MLflow, Azure Pipelines, AWS SageMaker Pipelines, Airflow, Jenkins, Spark, Hadoop, Kafka, Jira, and Git.

We provide market-competitive compensation packages, inclusive of base pay, incentives, and benefits. The base pay rate for full-time employment is $91,416.00 - $152,380.80. Additional compensation may be available for this role, such as shift differentials, standby/on-call, overtime, premiums, extra shift incentives, or bonus opportunities.
Benefits: Caring For Your Family and Your Career
• Medical, Dental, Vision plans
• Adoption, Fertility and Surrogacy Reimbursement up to $10,000
• Paid Time Off and Sick Leave
• Paid Parental & Family Caregiver Leave
• Emergency Backup Care
• Long-Term, Short-Term Disability, and Critical Illness plans
• Life Insurance
• 401k/403B with Employer Match
• Tuition Assistance - $5,250/year and discounted educational opportunities through Guild Education
• Student Debt Pay Down - $10,000
• Reimbursement for certifications and free access to complete CEUs and professional development
• Pet Insurance
• Legal Resources Plan
• Colleagues have the opportunity to earn an annual discretionary bonus if established system and employee eligibility criteria are met.

Sentara Health is an equal opportunity employer and prides itself on the diversity and inclusiveness of its nearly 30,000-member workforce. Diversity, inclusion, and belonging are guiding principles of the organization to ensure its workforce reflects the communities it serves. In support of our mission "to improve health every day," this is a tobacco-free environment. For positions that are available as remote work, Sentara Health employs associates in the following states: Alabama, Delaware, Florida, Georgia, Idaho, Indiana, Kansas, Louisiana, Maine, Maryland, Minnesota, Nebraska, Nevada, New Hampshire, North Carolina, North Dakota, Ohio, Oklahoma, Pennsylvania, South Carolina, South Dakota, Tennessee, Texas, Utah, Virginia, Washington, West Virginia, Wisconsin, and Wyoming.
04/06/2026
Full time
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Principal BizOps Engineer
We are seeking a Lead Data Engineer to join Mastercard Architecture & Analytics. You will help shape our innovation roadmap by exploring new technologies and building scalable, data-driven prototypes and products. The ideal candidate is hands-on, curious, adaptable, and motivated to experiment and learn.

What You'll Do
Drive Data Architecture: Own the data architecture and modeling strategy for AI projects. Define how data is stored, organized, and accessed. Select technologies, design schemas/formats, and ensure systems support scalable AI and analytics workloads.
Build Scalable Data Pipelines: Lead development of robust ETL/ELT workflows and data models. Build pipelines that move large datasets with high reliability and low latency to support training and inference for AI and generative AI systems.
Ensure Data Quality & Governance: Oversee data governance and compliance with internal standards and regulations. Implement data anonymization, quality checks, lineage, and controls for handling sensitive information.
Provide Technical Leadership: Offer hands-on leadership across data engineering projects. Conduct code reviews, enforce best practices, and promote clean, well-tested code. Introduce improvements in development processes and tooling.
Cross-Functional Collaboration: Work closely with engineers, scientists, and product stakeholders.
Scope work, manage data deliverables in agile sprints, and ensure timely delivery of data components aligned with project milestones.

What You'll Bring
Extensive Data Engineering Experience: 8-12+ years in data engineering or backend engineering, including senior/lead roles. Experience designing end-to-end data systems, solving scale/performance challenges, integrating diverse sources, and operating pipelines in production.
Big Data & Cloud Expertise: Strong skills in Python and/or Java/Scala. Deep experience with Spark, Hadoop, Hive/Impala, and Airflow. Hands-on work with AWS, Azure, or GCP using cloud-native processing and storage services (e.g., S3, Glue, EMR, Data Factory). Ability to design scalable, cost-efficient workloads for experimental and variable R&D environments.
AI/ML Data Lifecycle Knowledge: Understanding of data needs for machine learning: dataset preparation, feature/label management, and supporting real-time or batch training pipelines. Experience with feature stores or streaming data is useful.
Leadership & Mentorship: Ability to translate ambiguous goals into clear plans, guide engineers, and lead technical execution.
Problem-Solving Mindset: Approach issues systematically, using analysis and data to select scalable, maintainable solutions.

Required Skills
Education & Background: Bachelor's degree in Computer Science, Engineering, or a related field. 8-12+ years of proven experience architecting and operating production-grade data systems, especially those supporting analytics or ML workloads.
Pipeline Development: Expert in ETL/ELT design and implementation, working with diverse data sources, transformations, and targets. Strong experience scheduling and orchestrating pipelines using Airflow or similar tools.
Programming & Databases: Advanced Python and/or Scala/Java skills and strong software engineering fundamentals (version control, CI, code reviews). Excellent SQL abilities, including performance tuning on large datasets.
Big Data Technologies: Hands-on Spark experience (RDDs/DataFrames, optimization). Familiar with Hadoop components (HDFS, YARN), Hive/Impala, and streaming systems like Kafka or Kinesis.
Cloud Infrastructure: Experience deploying data systems on AWS/Azure/GCP. Familiar with cloud data lakes, warehouses (Redshift, BigQuery, Snowflake), and cloud-based processing engines (EMR, Dataproc, Glue, Synapse). Comfortable with Linux and shell scripting.
Data Governance & Security: Knowledge of data privacy regulations, PII handling, access controls, encryption/masking, and data quality validation. Experience with metadata management or data cataloging tools is a plus.
Collaboration & Agile Delivery: Strong communication skills and experience working with cross-functional teams. Ability to document designs clearly and deliver iteratively using agile practices.

Preferred Skills
Advanced Cloud & Data Platform Expertise: Experience with AWS data engineering services, Databricks, and Lakehouse/Delta Lake architectures (including bronze/silver/gold layers).
Modern Data Stack: Familiarity with dbt, Great Expectations, containerization (Docker/Kubernetes), and monitoring tools like Grafana or cloud-native monitoring.
DevOps & CI/CD for Data: Experience implementing CI/CD pipelines for data workflows and using IaC tools like Terraform or CloudFormation. Knowledge of data versioning (e.g., Delta Lake time travel) and supporting continuous delivery for ML systems.
Continuous Learning: Motivation to explore emerging technologies, especially in AI and generative AI data workflows.

Mastercard is a merit-based, inclusive, equal opportunity employer that considers applicants without regard to gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law. We hire the most qualified candidate for the role.
In the US or Canada, if you require accommodations or assistance to complete the online application process or during the recruitment process, please contact and identify the type of accommodation or assistance you are requesting. Do not include any medical or health information in this email. The Reasonable Accommodations team will respond to your email promptly.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard's security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

In line with Mastercard's total compensation philosophy and assuming that the job will be performed in the US, the successful candidate will be offered a competitive base salary and may be eligible for an annual bonus or commissions depending on the role. The base salary offered may vary depending on multiple factors, including but not limited to location, job-related knowledge, skills, and experience. Mastercard benefits for full-time (and certain part-time) employees generally include: insurance (including medical, prescription drug, dental, vision, disability, life insurance); flexible spending account and health savings account; paid leaves (including 16 weeks of new parent leave and up to 20 days of bereavement leave); 80 hours of Paid Sick and Safe Time; 25 days of vacation time and 5 personal days, pro-rated based on date of hire; 10 annual paid U.S.
observed holidays; 401k with a best-in-class company match; deferred compensation for eligible roles; fitness reimbursement or on-site fitness facilities; eligibility for tuition reimbursement; and many more. Mastercard benefits for interns generally include: 56 hours of Paid Sick and Safe Time; jury duty leave; and on-site fitness facilities in some locations.

Pay Ranges
O'Fallon, Missouri: $152,000 - $258,000 USD
04/05/2026
Full time
V2Soft is a global leader in IT services and business solutions, delivering innovative and cost-effective technology solutions worldwide since 1998. We are headquartered in Bloomfield Hills, MI, and have 16 offices spread across six countries. We partner with Fortune 500 companies to address complex business challenges. Our services span AI, IT staffing, cloud computing, engineering, mobility, testing, and more. Certified with CMMI Level 3 and ISO standards, V2Soft is committed to quality and security. Beyond our work, we actively support local communities and non-profits, reflecting our core values. Join us to be part of a dynamic and impactful global company! Please visit us at to learn more. Hybrid Position 4 days a week onsite at Dearborn, MI Skills Required: Coding, Programming, GCP, Software Development, Data Architecture, Data/Analytics, Big Query, Application Development, Application Architect, Data Modeling, Application Design Experience Required: Senior Associate Exp: 6-10 yrs in IT; 4+ yrs in concentration Education Required: Bachelor's Degree Additional Information: Hybrid Position 4 days a week onsite 1. Build and maintain data pipelines on Google Cloud Platform (GCP) using Dataflow for batch and/or streaming processing workflows 2. Develop and maintain robust data transformation layers using Dataform and/or dbt, following best practices in modeling, testing, documentation, and deployment patterns 3. Design end-to-end enterprise data architectures for large-scale analytics and operational use cases, ensuring scalability, reliability, and governance 4. Translate complex business requirements into conceptual, logical, and physical data models that align with organizational goals and technical constraints 5. Apply deep BigQuery expertise including schema design, partitioning and clustering strategies, and continuous cost and performance optimization 6. 
Write complex SQL transformations and analytics queries across large-scale datasets with a high degree of accuracy and performance awareness 7. Leverage programming skills (Python, Java, Scala, or equivalent) to support automation, pipeline logic, orchestration, and data utility development 8. Utilize enterprise data modeling tools such as SAP PowerDesigner and/or ERwin to produce well-documented, standards-compliant data models 9. Collaborate within CI/CD and Git-based workflows, including branching strategies, peer code reviews, automated testing, and managed deployments for data and analytics engineering V2Soft is an Equal Opportunity Employer ( EOE). We welcome applicants from all backgrounds, including individuals with disabilities and veterans. - to view all of our open opportunities and to learn more about our benefits.
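The batch transformation duties above can be illustrated, in spirit, with a plain-Python staged pipeline; Dataflow/Beam and dbt specifics are omitted, and the stage and field names are assumptions for illustration:

```python
def extract(rows):
    """Source stage: yield raw records (stands in for a Dataflow/BigQuery read)."""
    yield from rows

def transform(rows):
    """Transformation stage: normalize and filter, as a dbt model might."""
    for r in rows:
        if r.get("amount") is not None:
            yield {"customer": r["customer"].strip().lower(),
                   "amount": float(r["amount"])}

def load(rows):
    """Sink stage: collect results (stands in for a warehouse write)."""
    return list(rows)

raw = [{"customer": " Acme ", "amount": "10.5"},
       {"customer": "Globex", "amount": None}]
result = load(transform(extract(raw)))
print(result)  # [{'customer': 'acme', 'amount': 10.5}]
```

The generator chaining mirrors how Dataflow composes PTransforms and dbt chains models: each stage consumes the previous stage's output and stays independently testable.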
04/04/2026
Full time
Senior Scrum Master Career Opportunity Our client, a global leader in SaaS and AI-based CX platforms, is looking for a motivated and talented Senior Scrum Master who will be responsible for supporting multiple Agile teams in delivering high-quality products through effective Scrum practices, coaching, and continuous improvement. This role partners closely with Product Owners, Engineering Managers, and cross-functional stakeholders to remove impediments, optimize team performance, and mature Agile practices across the organization. The ideal candidate is an experienced servant-leader who excels at facilitation, communication, and driving measurable outcomes. Senior Scrum Master Role and Responsibilities Guide 2 to 3 Scrum teams in applying Agile and Scrum principles effectively. Coach team members on self-management, cross-functionality, and accountability. Foster an environment of continuous improvement, learning, and psychological safety. Mentor junior Scrum Masters and contribute to scaling Agile practices across the enterprise. Facilitate Sprint Planning, Daily Standups, Backlog Refinements, Sprint Reviews, and Retrospectives. Ensure ceremonies are productive, focused, and continuously improving. Work with Product Owners to build healthy, transparent, and well-prioritized backlogs. Track and ensure visibility into team metrics (velocity, throughput, cycle time, predictability). Remove impediments and escalate issues that impact delivery timelines. Monitor dependencies, risks, and cross-team coordination. Drive adoption of Agile best practices and experimentation with new frameworks (Scrum, Kanban, XP, SAFe, etc.). Identify process bottlenecks and implement improvements using data-driven insights. Facilitate cross-team collaboration and alignment for multi-team initiatives. Partner with leadership, engineering, and product stakeholders to support strategic planning and delivery. Provide clear, timely communication on progress, risks, and impacts. 
Promote transparency through dashboards, reporting, and information radiators. Senior Scrum Master Required Skills and Qualifications 5+ years of experience as a Scrum Master or Agile Coach. Proven success supporting multiple Scrum teams in a scaled environment. Strong understanding of Agile frameworks (Scrum, Kanban, SAFe, LeSS, etc.) Experience using Agile tools such as Jira, Azure DevOps, Rally, or VersionOne. Excellent facilitation, communication, and conflict-resolution skills. Demonstrated ability to remove impediments and unblock teams quickly. Ability to influence without authority. Strong problem-solving and adaptability. Excellent stakeholder engagement. Senior Scrum Master Preferred Skills and Qualifications Technical & development skills. Certified Scrum Master. Relevant certifications: CSM, A-CSM, CSP-SM, PSM II/III, SAFe SSM/SA/POPM. Experience in DevOps, CI/CD pipelines, or cloud-native engineering environments. Background in coaching leaders and influencing organizational Agile maturity. Experience with metrics dashboards and Agile analytics.
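The team metrics this role tracks (cycle time, throughput) reduce to simple arithmetic over ticket timestamps; a minimal sketch, assuming tickets are recorded as (started, finished) date pairs:

```python
from datetime import date

def cycle_times(tickets):
    """Cycle time in days for each completed ticket (finish minus start)."""
    return [(done - start).days for start, done in tickets]

def throughput(tickets):
    """Throughput: number of tickets completed in the period."""
    return len(tickets)

sprint = [(date(2025, 1, 6), date(2025, 1, 9)),
          (date(2025, 1, 7), date(2025, 1, 13))]
print(cycle_times(sprint))                            # [3, 6]
print(sum(cycle_times(sprint)) / throughput(sprint))  # 4.5
```

Tools like Jira and Azure DevOps compute these dashboards automatically; the value for a Scrum Master is understanding what the numbers mean, not reimplementing them.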
04/01/2026
Full time
Staff Data Scientist Location: Beachwood, OH Shift: Monday - Friday 8am - 5pm (Onsite 4 days a week) (Possible remote for the right candidate) Position Summary: The Staff Data Scientist will play a key role on the Data Science and Analytics team, providing technical leadership for the establishment of enterprise-wide capabilities in data science, AI and predictive analytics. The Staff Data Scientist will typically work on 3-5 large projects concurrently that have organization-wide impact. In addition to these projects, the Staff Data Scientist will provide technical consultation, advice and training on all major ongoing Data Science and Analytics projects. When required, the Staff Data Scientist will also act as a project manager where vendors, suppliers and consultants are engaged on key strategic and emerging technology initiatives. Major Responsibilities: Identifying High Value Analytics & AI Opportunities Partner with business leaders to identify opportunities where predictive analytics, machine learning, or generative AI can improve productivity, reduce cost, or unlock new capabilities. Develop clear business cases and ROI models to prioritize initiatives and communicate value to senior leadership. Lead Data Science Projects Translate complex business requirements into robust, scalable technical solutions. Select and implement appropriate modeling techniques, including classical ML, deep learning, generative AI, and reinforcement learning where applicable. Oversee the full model lifecycle: data exploration, feature engineering, model development, evaluation, deployment, monitoring, and continuous improvement. Ensure solutions are production-ready, maintainable, and aligned with MLOps best practices. Drive organization-wide adoption of models and AI systems through clear communication, documentation, and stakeholder engagement. 
Technical Guidance & Thought Leadership Provide expert consultation on ML algorithms, model tuning, experimentation frameworks, and cloud-native data engineering patterns. Mentor data scientists, ML engineers and AI engineers; support skill development in areas such as forecasting, ML modeling, generative AI, vector databases, and modern ETL/ELT workflows. Contribute to the development of internal standards, reusable components, and best practice guidelines. Project Management Develop and maintain project plans, milestones, and communication strategies for strategic initiatives. Facilitate regular updates with stakeholders, executives, and cross-functional partners. Coordinate with vendors, consultants, and technology partners when external expertise is required. Lead technology change in Data Science, Analytics and AI Evaluate emerging technologies including generative AI platforms, MLOps tools, cloud services, and data engineering frameworks to determine applicability and business value. Recommend and influence adoption of modern, flexible, and scalable technologies that support a unified enterprise data and AI platform. Drive experimentation and prototyping to accelerate innovation and reduce time to value. Qualifications: Master's Degree required; preferred concentrations in Engineering, Operations Research, Statistics, Applied Math, Computer Science, Data Science or related quantitative field. PhD preferred in Engineering, Operations Research, Statistics, Applied Math, Computer Science, Data Science or related quantitative field. 7+ years of experience along with a PhD in a related field OR 10+ years of experience along with a Master's degree in a related field required. Advanced experience developing and deploying machine learning models using Python and modern ML frameworks (e.g., scikit-learn, PyTorch, TensorFlow). Strong applied expertise across core ML techniques, including regression, tree-based models, clustering, deep learning, and NLP. 
Familiarity with generative AI and LLMs, including prompt engineering, fine-tuning, embeddings, and vector databases. Solid understanding of MLOps practices, including CI/CD for ML, automated training pipelines, model versioning, monitoring, and model governance. Hands-on experience with cloud-based ML platforms (AWS, Azure, or GCP) and containerization/orchestration tools such as Docker and Kubernetes. Working knowledge of modern data ecosystems (Snowflake, Redshift) and the ability to collaborate effectively with data engineering teams when needed. Advanced skill in statistical modeling, SQL, and database concepts required. Demonstrated experience leading small technical teams or pods, providing mentorship and technical direction. Familiarity with the logistics industry is preferred. Regular, predictable, full attendance is an essential function of the job. Willingness to travel as necessary, work the required schedule, work at the specific location required, complete Penske employment application, submit to a background investigation (to include past employment, education, and criminal history) and drug screening are required. Physical Requirements: -The physical and mental demands described here are representative of those that must be met by an associate to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. -The associate will be required to: read; communicate verbally and/or in written form; remember and analyze certain information; and remember and understand certain instructions or guidelines. -While performing the duties of this job, the associate may be required to stand, walk, and sit. The associate is frequently required to use hands to touch, handle, and feel, and to reach with hands and arms. The associate must be able to occasionally lift and/or move up to 25lbs/12kg. 
-Specific vision abilities required by this job include close vision, distance vision, peripheral vision, depth perception and the ability to adjust focus. Penske is an Equal Opportunity Employer. About Penske Logistics Penske Logistics engineers state-of-the-art transportation, warehousing and freight management solutions that deliver powerful business results for market-leading companies. With operations in North America, South America, Europe and Asia, Penske and its associates help businesses move forward by increasing visibility and driving down supply-chain costs. Visit Penske Logistics to learn more. Job Category: Information Technology Job Family: Analytics & Intelligence Address: 3000 Auburn Dr Primary Location: US-OH-Beachwood Employer: Penske Logistics LLC Req ID:
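The embeddings and vector-database familiarity listed in the qualifications comes down to nearest-neighbor search over embedding vectors; a minimal brute-force sketch (production systems use approximate indexes such as HNSW, and the document IDs and vectors here are invented):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: dot product over norms."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k(query, index, k=1):
    """Brute-force nearest-neighbor search over (doc_id, vector) pairs."""
    scored = sorted(index, key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

index = [("doc_a", [1.0, 0.0, 0.0]),
         ("doc_b", [0.0, 1.0, 0.0]),
         ("doc_c", [0.7, 0.7, 0.0])]
print(top_k([1.0, 0.1, 0.0], index, k=2))  # ['doc_a', 'doc_c']
```

Vector databases wrap exactly this operation behind an index that scales to millions of embeddings, which is why the concept matters for RAG-style retrieval.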
04/01/2026
Full time
Resolution Technologies, Inc.
Nashville, Tennessee
Senior ETL Developer Job Description: This person leverages the latest ETL/ELT technologies, including data pipeline design principles and integration patterns, as a basis for supporting analytics and an enterprise data warehouse. Performs complex and senior-level data integration pattern design and development activities in support of maintaining both legacy and modern data sourcing. Performs all aspects of the development life cycle. Delivers results through an impeccable work ethic and a drive for growth. Senior ETL Developer Minimum Qualifications: Education: Bachelor's or Master's degree in Computer Science, Computer Engineering, IT, or a closely related technical field. 6 years of ETL, development, and/or database architectural design experience. 2 years of experience writing and optimizing complex SQL queries. 2 years of database application development experience. 3 years of professional experience with database technologies and ETL/ELT tools and scripting. Senior ETL Developer Preferred Skills: Knowledge of modernizing or consolidating legacy siloed databases. Strong knowledge across multiple database technologies, including both columnar and row-indexed relational DBs such as Vertica and SQL Server. Knowledge of schema-driven design, development, and administration. Advanced knowledge of SQL, relational databases, and query authoring. Ability to execute SQL, T-SQL, ETL/ELT scripting, and automation techniques. 
Knowledge of common tools for Linux (logs, piping, redirections, grep, sed, yum). Knowledge of Linux scripting (Python, Perl, shell scripts). Experience architecting data models and meeting requirements for data visualization or reporting tools. Knowledge of data concepts around change data capture and slowly changing dimensions. Senior ETL Developer Key Responsibilities: Creates, develops, modifies, and maintains data pipelines for internal and external facing data applications as part of an Agile/Scrum engineering team. Collaborates heavily with data modeling and subject matter experts based on data-driven industry best practices. Delivers solutions to provide full product life cycle support, collaborating within and across teams on database development and deployment. Explores new technologies and development techniques to foster innovation. Coordinates and communicates with users, developers, and product owners to design and implement solutions that serve the end user. Mentors and develops junior engineers through code reviews, co-development and best practices. Assembles large and complex data sets that meet business requirements. Develops design patterns, standards, documentation, etc. and works with other developers on implementation. Provides data integration oversight for third-party integrations and database core components.
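The slowly-changing-dimension concept listed in the preferred skills can be illustrated with a minimal Type 2 dimension merge in plain Python; the column names and the single tracked attribute are illustrative assumptions:

```python
from datetime import date

def scd2_merge(dimension, incoming, today):
    """Type 2 merge: when a tracked attribute changes, expire the current
    row (set end_date, clear is_current) and append a new current row."""
    current = {r["key"]: r for r in dimension if r["is_current"]}
    out = list(dimension)
    for row in incoming:
        live = current.get(row["key"])
        if live and live["city"] != row["city"]:
            live["is_current"] = False
            live["end_date"] = today
            out.append({"key": row["key"], "city": row["city"],
                        "start_date": today, "end_date": None,
                        "is_current": True})
    return out

dim = [{"key": 1, "city": "Nashville", "start_date": date(2024, 1, 1),
        "end_date": None, "is_current": True}]
updated = scd2_merge(dim, [{"key": 1, "city": "Memphis"}], date(2025, 6, 1))
print([r["city"] for r in updated if r["is_current"]])  # ['Memphis']
```

In a warehouse this same logic is usually expressed as a `MERGE` statement or a dbt snapshot; the point is that history is preserved rather than overwritten.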
04/01/2026
Full time
Job Description At Boeing, we innovate and collaborate to make the world a better place. From the seabed to outer space, you can contribute to work that matters with a company built on shared values. We're committed to fostering an environment for every teammate that's welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us. The Boeing Company is currently seeking a Back-End Software Engineer Lead (Associate, Mid-Level or Senior) to support our Model-Based Engineering Software team located in Berkeley, Missouri. This position will focus on supporting the Boeing Defense Services (BDS) business organization. This position will focus on developing prototypes and new capabilities for web applications and services. You will design APIs, create data processing and analytics pipelines, optimize SQL queries, and build up production-ready infrastructure. In this backend developer position, you'll help shape the architecture and infrastructure needed to make our prototypes and minimum-viable products production ready. Some examples of the work involved may include, but are not limited to, researching and selecting technologies and programming languages to use, building reliable and performant server applications using those technologies and languages, and ensuring our applications can scale to support hundreds to thousands of users. If the applicant desires, this position can evolve into a full-stack role that also does frontend development. Our teams are currently hiring for a broad range of experience levels including Associate, Experienced and Senior Level Software Engineer. 
Position Responsibilities: Mastering your craft and learning continuously Work with a team to shape the architecture of several web applications Build new and expand on existing infrastructure for multi-application communications, logging, alerts, authentication, configuration, and more Create test suites for server applications to verify reliability, performance, and behavior Work collaboratively to make decisions that affect the entire system Interface with users and developers to define and implement solutions to meet requirements Work independently and with a team to prototype features and applications Communicate effectively to push the team's processes to best fit our needs This position is expected to be 100% onsite. The selected candidate will be required to work onsite at one of the listed location options. Basic Qualifications (Required Skills/Experience): 2+ years of experience with backend development (Java, Spring, Oracle SQL, SQL Server) 2+ years of experience in either GoLang or Java Preferred Qualifications (Desired Skills/Experience): Bachelor of Science degree from an accredited course of study in engineering, engineering technology (includes manufacturing engineering technology), chemistry, physics, mathematics, data science, or computer science Level 3: 3 or more years' related work experience or an equivalent combination of education and experience Level 4: 5 or more years' related work experience or an equivalent combination of education and experience Ability to obtain and maintain a Top-Secret U.S. 
Security Clearance (post start) 2+ years of strong skills in software engineering and architecture, including object-oriented design and concurrent programming 2+ years of working knowledge of containerization (Docker, Kubernetes) Strong experience with JavaScript frameworks: NodeJS, ReactJS Experience with Kafka, NiFi, Redis, or RabbitMQ Experience with GitLab, GitLab Runner, CI/CD Experience developing RESTful APIs for complex datasets Exposure to Agile processes Experience leading projects AWS cloud experience Understanding of microservices, client-server architecture, event-driven architecture, domain-driven design (DDD), and other modern architectural styles to build robust and adaptable systems Experience in large-scale software system architecture, design, and development of distributed/on-premises end-to-end scalable software Travel: Occasional travel may be required (less than 10%) Drug Free Workplace: Boeing is a Drug Free Workplace (DFW) where post offer applicants and employees are subject to testing for marijuana, cocaine, opioids, amphetamines, PCP, and alcohol when criteria are met as outlined in our policies. CodeVue Coding Challenge: To be considered for this position you will be required to complete a technical assessment as part of the selection process. Failure to complete the assessment will remove you from consideration. Pay & Benefits: At Boeing, we strive to deliver a Total Rewards package that will attract, engage and retain the top talent. Elements of the Total Rewards package include competitive base pay and variable compensation opportunities. The Boeing Company also provides eligible employees with an opportunity to enroll in a variety of benefit programs, generally including health insurance, flexible spending accounts, health savings accounts, retirement savings plans, life and disability insurance programs, and a number of programs that provide for both paid and unpaid time away from work. 
The specific programs and options available to any given employee may vary depending on eligibility factors such as geographic location, date of hire, and the applicability of collective bargaining agreements. Pay is based upon candidate experience and qualifications, as well as market and business considerations. Summary pay range for Associate Level: $96,050 - $129,950 Summary pay range for Experienced Level: $119,000 - $161,000 Summary pay range for Senior Level: $146,200 - $197,800 Applications for this position will be accepted until Jan. 29, 2026 Export Control Requirements: This position must meet U.S. export control compliance requirements. To meet U.S. export control compliance requirements, a "U.S. Person" as defined by 22 C.F.R. 120.62 is required. "U.S. Person" includes U.S. Citizen, U.S. National, lawful permanent resident, refugee, or asylee. Export Control Details: US based job, US Person required Relocation This position offers relocation based on candidate eligibility. Security Clearance This position requires the ability to obtain a U.S. Security Clearance for which the U.S. Government requires U.S. Citizenship. An interim and/or final U.S. Secret Clearance Post-Start is required. Visa Sponsorship Employer will not sponsor applicants for employment visa status. Shift This position is for 1st shift Equal Opportunity Employer: Boeing is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status or other characteristics protected by law.
01/16/2026
Full time
Job Title: Senior Data Engineer Department: Health System Shared Services Analytics Center of Excellence Position Summary The Senior Data Engineer serves as a top-level technical contributor and lead for complex data and analytics initiatives that align technology solutions with clinical, operational, and strategic priorities. This role designs, builds, and operates secure, scalable data pipelines and curated datasets that power analytics, reporting, and advanced AI/ML use cases supporting patient care, administrative decision-making, and improved outcomes. This position partners closely with clinical and operational leaders, analytics teams, vendors, and IT stakeholders to translate business needs into reliable data products. The Senior Data Engineer leads the end-to-end development lifecycle - data ingestion, transformation, modeling, testing, deployment, and monitoring - while championing modern engineering practices (CI/CD, data quality automation, observability, documentation, and governance). 
Recognized across the organization for expertise in data architecture, engineering standards, and platform modernization. Key Responsibilities Lead design and delivery of enterprise-grade data pipelines (ETL/ELT) using SQL/Python, supporting high-volume, high-complexity healthcare data. Build and optimize a modern lakehouse architecture using Azure services and Databricks, including Delta Lake patterns and performance tuning. Implement and maintain medallion architecture (bronze/silver/gold): ingestion, standardization, and curated semantic datasets for analytics and downstream consumption. Develop scalable transformation and modeling layers using dbt (or equivalent) and data modeling best practices (Kimball, dimensional modeling, star schemas, conformed dimensions). Establish and enforce data quality and reliability standards (tests, reconciliation, SLA monitoring, anomaly detection, lineage/metadata). Implement CI/CD for data pipelines and dbt projects (Git-based workflows, automated testing, release pipelines via Azure DevOps/GitHub Actions). Collaborate with reporting & analytics, data science, data governance and platform systems & architecture teams to enable self-service access to trusted datasets and accelerate insight delivery. Provide technical leadership on architecture decisions, security/privacy considerations, performance optimization, and cost management. Create and maintain technical documentation, data contracts, and operational runbooks; contribute to engineering standards and patterns. Coordinate across medical center and university entities and evaluate external tools/partners to adopt innovative methods and improve delivery. Minimum Qualifications Bachelor's degree in Computer Science, Information Systems, Data Analytics, Engineering, or related field (or equivalent practical experience). 4+ years of progressive experience in data engineering, data warehousing, or analytics engineering (healthcare strongly preferred). 
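The medallion (bronze/silver/gold) pattern named above can be illustrated with a minimal, hypothetical sketch in plain Python: raw "bronze" records are standardized into "silver", then aggregated into a curated "gold" set. The record fields (`patient_id`, `dept`, `charge`) and function names are illustrative assumptions; a real implementation would use Delta Lake tables on Databricks rather than in-memory lists.

```python
from collections import defaultdict

def to_silver(bronze):
    """Standardize: drop malformed rows, normalize types and casing."""
    silver = []
    for row in bronze:
        if not row.get("patient_id") or row.get("charge") is None:
            continue  # a real pipeline would quarantine these rows
        silver.append({
            "patient_id": str(row["patient_id"]).strip(),
            "dept": str(row.get("dept", "UNKNOWN")).upper(),
            "charge": float(row["charge"]),
        })
    return silver

def to_gold(silver):
    """Curate: total charges per department, ready for reporting."""
    totals = defaultdict(float)
    for row in silver:
        totals[row["dept"]] += row["charge"]
    return dict(totals)

# Hypothetical raw ingest with mixed types and one malformed row
bronze = [
    {"patient_id": " 101", "dept": "cardio", "charge": "250.0"},
    {"patient_id": "102", "dept": "cardio", "charge": 99.5},
    {"patient_id": None, "dept": "ortho", "charge": 10},  # dropped in silver
]
print(to_gold(to_silver(bronze)))  # {'CARDIO': 349.5}
```

The design point is separation of concerns: bronze keeps raw fidelity, silver enforces schema and quality, and gold exposes conformed, analytics-ready datasets to downstream consumers.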
Advanced proficiency with SQL and strong experience with Python for data engineering. Hands-on experience designing and operating ETL/ELT pipelines and building curated analytics datasets. Strong understanding of data modeling, warehousing/lakehouse concepts, and modern data management practices. Demonstrated ability to lead large, complex initiatives and deliver outcomes in high-impact environments. Additional Information: Our Comprehensive Employee Benefits Include: An array of retirement plan options, each with a generous employer contribution. Affordable health insurance options, including dental, vision and prescription coverage that begin on day one. Paid vacation and sick leave, including short and long-term disability and paid parental leave. Get the most out of the Public Service Loan Forgiveness program. And much more! Location: Ackerman Rd, 660 (0242) Position Type: Regular Scheduled Hours: 40 Shift: First Shift Final candidates are subject to successful completion of a background check. A drug screen or physical may be required during the post offer process. Thank you for your interest in positions at The Ohio State University and Wexner Medical Center. Once you have applied, the most updated information on the status of your application can be found by visiting the Candidate Home section of this site. Please view your submitted applications by logging in and reviewing your status. For answers to additional questions please review the frequently asked questions. The university is an equal opportunity employer, including veterans and disability. 
As required by Ohio Revised Code section 3345.0216, Ohio State will: educate students by means of free, open and rigorous intellectual inquiry to seek the truth; equip students with the opportunity to develop intellectual skills to reach their own, informed conclusions; not require, favor, disfavor or prohibit speech or lawful assembly; create a community dedicated to an ethic of civil and free inquiry, which respects the autonomy of each member, supports individual capacities for growth and tolerates differences in opinion; treat all faculty, staff and students as individuals, hold them to equal standards and provide equality of opportunity with regard to race, ethnicity, religion, sex, sexual orientation, gender identity or gender expression.
01/15/2026
Full time
Job Duties: Design and build a data warehouse using a cloud-based data platform (Snowflake) for machine learning, data analysis, self-serve analytics, and reporting needs. Develop Extract, Transform, Load (ETL) pipelines to orchestrate execution of scripts, automate data transformation, and load data into data warehouses. Perform data cleansing, validation, testing, and schema design to ensure accuracy and reliability of data insights. Manage and build automated workflows and monitor these pipelines and DAGs (Directed Acyclic Graphs). Identify and work on improvements by automating manual processes to optimize data delivery and re-design infrastructure for greater scalability using Python and SQL. Collaborate with cross-functional teams including data engineers, analysts, and data scientists to understand the business logic and needs and create an effective and efficient pipeline and data model. Create ad-hoc reports/data visualizations based on user requirements. Participate in ETL flow design reviews and recommend solutions to improve processes. Perform data quality checks and alerting to identify issues and resolve bugs in a timely manner. Participate in source code and design reviews in a software development lifecycle (SDLC) driven environment using technical, functional, and domain knowledge. Identify issues or gaps and suggest improvements to the team processes. Provide support for customer requests for all the products handled by the team. Full-time telecommuting is an option. Minimum Requirements: Master's degree (or its foreign degree equivalent) in Analytics, Data Science, Engineering (any field), or a related quantitative discipline, and six (6) months of experience in the job offered or in any occupation in a related field. Special Skill Requirements: 1. Data warehouse design and dimensional modeling 2. Extract, Transform, Load (ETL) 3. SQL 4. Python 5. Data Visualization (Tableau or Power BI) 6. GitLab 7. Azure 8. PySpark 9. 
OLAP and OLTP system experience (MySQL Database, Postgres DB, or Snowflake) 10. Machine Learning 11. Data Pipeline - Performance & Scalability Techniques. Any suitable combination of education, training and/or experience is acceptable. Full-time telecommuting is an option. Salary: $177,813.00 - $195,594.00 per annum. Submit resume with references to: Req.# 53.2 at: ATTN: HR,
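The DAG orchestration mentioned in the duties above boils down to running pipeline steps in dependency order. A minimal sketch using the standard-library `graphlib` module is below; the step names are hypothetical, and real orchestrators (e.g., Airflow) add scheduling, retries, and monitoring on top of this ordering idea.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline steps mapped to their upstream dependencies
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
    "refresh_dashboard": {"load_warehouse"},
}

# static_order() yields every step only after all of its dependencies
order = list(TopologicalSorter(dag).static_order())
print(order)
# e.g. ['extract_orders', 'extract_customers', 'transform_join',
#       'load_warehouse', 'refresh_dashboard']
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, which is a cheap way to validate a pipeline definition before any task runs.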
01/15/2026
Job Title: Data Infrastructure Engineer Location: Kennesaw, Georgia Regular/Temporary: Regular Full/Part Time: Full-Time Job ID: 292336 About Us Are you ready to transform lives through academic excellence, innovative research, strong community partnerships and economic opportunity? Kennesaw State University is one of the 50 largest public institutions in the country. With growing enrollment and global reach, we continue to expand our institutional influence and prominence beyond the state of Georgia. We offer more than 190 undergraduate, graduate, and doctoral degrees to empower our 47,000 students to become thought leaders, lifelong learners, and informed global citizens. Our entrepreneurial spirit, high-impact research, and Division I athletics draw students from throughout the region and from more than 100 countries across the globe. Our university's vibrant culture, career opportunities, rich benefits, and values of respect, integrity, collaboration, inclusivity, and accountability make us an employer of choice. We are part of the University System of Georgia. We are searching for talented people to join Kennesaw State University in our vision. Come Take Flight at KSU! Location (Primary Location for Job Responsibilities) Our Kennesaw campus is located at 1000 Chastain Road NW, Kennesaw, GA 30144. Our Marietta campus is located at 1100 South Marietta Parkway, Marietta, GA 30060. Job Summary Focuses on building and maintaining the data infrastructure including the extraction, loading, and staging of data from various data sources, both on-premises or in the cloud. Ensures the seamless and secure transfer of data, optimizing for performance, integration and reliability to enable subsequent data transformation and modeling processes for enterprise reporting and analytics. Key Responsibilities: 1. 
Develops, tests, and implements data pipelines to efficiently, securely, and reliably ingest data from various data sources into the enterprise data lake or data warehouse 2. Collaborates with data architects and data source SMEs to understand requirements and designs data pipelines accordingly 3. Develops and maintains robust data integrity checks, ensuring data accuracy, timeliness, and consistency 4. Utilizes cloud tools and custom scripts for validation, set up automated anomaly detection, and collaborates with stakeholders to align quality checks with data requirements 5. Enables integration of data from various data sources, such as databases, cloud services and APIs, to facilitate seamless data flow for enterprise and self-service reporting and analytics 6. Monitors and optimizes data pipelines, identifies and troubleshoots issues using tools such as Azure Monitor or similar system 7. Implements automated alerts and collaborates with architects to adopt best practices in pipeline design for enhanced performance 8. Maintains compliance with data security policies and regulations in the data infrastructure 9. Implements encryption, manage access controls, conduct security audits, and stay updated with security best practices to safeguard data integrity and privacy 10. Manages the storage structure within the data lake or data warehouse, optimizing resource utilization and ensuring efficient integration with data transformation and modeling processes 11. Supports senior technical staff in project planning and the development of standard operating procedures 12. Contributes to the establishment of best practices, ensuring project alignment with technical standards and organizational goals 13. Creates and regularly updates technical documentation for data pipeline processes 14. 
Ensures clear, comprehensive, and accessible documentation is available, covering all aspects of pipeline design, operation, and maintenance Required Qualifications Educational Requirements High school diploma or equivalent Required Experience Five (5) years of related IT experience. Preferred Qualifications Preferred Educational Qualifications An undergraduate or advanced degree from an accredited institution of higher education in Computer Science, Information Systems, Business Administration, or a related field Preferred Experience Experience working with reporting tools such as Power BI, Tableau, etc. Previous work experience in Higher Education Knowledge, Skills, & Abilities ABILITIES Commitment to continuous learning and staying updated with the latest trends and best practices in data engineering Able to handle multiple tasks or projects at one time while meeting assigned deadlines KNOWLEDGE Familiarity with data protection and privacy laws and regulations (e.g., FERPA, HIPAA) Knowledge of various file formats used in data storage, such as Parquet, Avro, and CSV, and their implications for performance and storage Understanding of cloud storage services, such as blob storage, data lakes, data lakehouses, and data warehouses Understanding of data warehouse architecture patterns, such as star schema, One Big Table (OBT), Medallion Architecture, and materialized views Knowledge of data warehousing principles, including data quality, data enrichment and standardization, and data modeling 
Knowledge of best practices in data pipeline orchestration Knowledge of data security practices, including encryption/decryption and compliance with data governance policies and guidelines SKILLS Excellent interpersonal, initiative, teamwork, problem solving, independent judgment, organization, communication (verbal and written), time management, project management, and presentation skills Demonstrated skills in relational databases (e.g., SQL Server, Oracle, MySQL) Demonstrated skills in data engineering tools (e.g., Azure Data Factory, SSIS, Informatica, Pentaho, Oracle Data Integrator) Skills in identifying bottlenecks in data pipelines and optimizing for efficiency and scalability Skills in developing efficient and scalable data ingestion pipelines between cloud-based and on-premises data sources and destinations Proficient with computer applications and programs associated with the position (e.g., the Microsoft Office suite and other collaboration tools) Proficiency with SQL and its variants (e.g., PL/SQL, T-SQL) Proficiency in programming/scripting languages (e.g., Python, Java, PowerShell) Proficiency in data engineering technologies and tools (e.g., Azure Data Factory, Apache Spark, Azure Synapse Analytics, Python, Airflow) Strong attention to detail and organization skills Strong customer service skills and phone and email etiquette USG Core Values The University System of Georgia comprises our 26 institutions of higher education and learning as well as the System Office. Our USG Statement of Core Values is Integrity, Excellence, Accountability, and Respect. These values serve as the foundation for all that we do as an organization, and each USG community member is responsible for demonstrating and upholding these standards. More details on the USG Statement of Core Values and Code of Conduct are available in USG Board Policy 8.2.18.1.2 and can be found on-line at . 
Additionally, USG supports Freedom of Expression as stated in Board Policy 6.5 Freedom of Expression and Academic Freedom found on-line at . Equal Employment Opportunity Kennesaw State University is an Equal Employment Opportunity Employer. The University is committed to maintaining a fair and respectful environment for living, work, and study. To that end, and in accordance with federal and state law, Board of Regents policy, and University policy, the University prohibits harassment of or discrimination against any person because of race, color, sex (including sexual harassment, pregnancy, and medical conditions related to pregnancy), sexual orientation, gender identity, gender expression, ethnicity or national origin, religion, age, genetic information, disability, or veteran or military status by any member of the KSU Community on campus, in connection with a University program or activity, or in a manner that creates a hostile environment for members of the KSU community. For additional information on this policy, or to file a complaint under the provisions of this policy, students, employees, applicants for employment or admission, or other third parties should contact the Office of Institutional Equity at English Building, Suite 225, . Other Information This is not a supervisory position. This position does not have any financial responsibilities. This position will not be required to drive. This role is considered a position of trust. This position does not require a purchasing card (P-Card). This position may travel 1% - 24% of the time. This position does not require security clearance. Background Check Credit Report Standard Enhanced Per the University System of Georgia background check policy, all final candidates will be required to consent to a criminal background investigation. Final candidates may be asked to disclose criminal record history during the initial screening process and prior to a conditional offer of employment. 
Applicants for positions of trust with screening . click apply for full job details
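Key responsibilities 3, 4, and 6 in the posting above center on automated data-integrity checks and anomaly detection over pipeline loads. As a minimal illustration only (the row counts and threshold below are invented, and this is one common standard-library approach, not KSU's actual implementation), a robust modified z-score on per-load row counts can flag a load whose volume deviates sharply from the norm:

```python
import statistics

def detect_anomalies(daily_row_counts, threshold=3.5):
    """Return indices of loads whose row count deviates sharply from the median,
    using the modified z-score (median + MAD). Unlike a plain mean/stdev z-score,
    this robust statistic is not inflated by the very outliers it should detect."""
    median = statistics.median(daily_row_counts)
    mad = statistics.median(abs(c - median) for c in daily_row_counts)
    if mad == 0:
        return []  # all loads are (near-)identical; nothing to flag
    flagged = []
    for day, count in enumerate(daily_row_counts):
        z = 0.6745 * (count - median) / mad
        if abs(z) > threshold:
            flagged.append(day)
    return flagged

# A normal week of loads, plus one day where the upstream feed silently shrank.
counts = [10_120, 10_340, 9_980, 10_210, 10_050, 310, 10_180]
print(detect_anomalies(counts))  # → [5]
```

The median/MAD form matters here: a single extreme load would inflate a mean/stdev baseline enough to mask itself. In production, such a check would feed an automated alert (for example, via Azure Monitor) rather than a print.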
01/14/2026
Full time
Senior Data Scientist, Wharton Graduate Division University Overview The University of Pennsylvania, the largest private employer in Philadelphia, is a world-renowned leader in education, research, and innovation. This historic, Ivy League school consistently ranks among the top 10 universities in the annual U.S. News & World Report survey. Penn has 12 highly-regarded schools that provide opportunities for undergraduate, graduate and continuing education, all influenced by Penn's distinctive interdisciplinary approach to scholarship and learning. As an employer, Penn has been ranked nationally on many occasions, most recently by Forbes, which named Penn one of America's Best Large Employers in 2023. Penn offers a unique working environment within the city of Philadelphia. The University is situated on a beautiful urban campus, with easy access to a range of educational, cultural, and recreational activities. With its historical significance and landmarks, lively cultural offerings, and wide variety of atmospheres, Philadelphia is the perfect place to call home for work and play. The University offers a competitive benefits package that includes excellent healthcare and tuition benefits for employees and their families, generous retirement benefits, a wide variety of professional development opportunities, supportive work and family benefits, a wealth of health and wellness programs and resources, and much more. Posted Job Title Senior Data Scientist, Wharton Graduate Division Job Profile Title Job Description Summary The Senior Data Scientist - Algorithmic Systems is a key member of the Data Science team supporting the full-time, Executive, and Global Wharton MBA programs. Reporting to the Director of Data Science, this role is instrumental in advancing analytics-driven solutions that enhance the student experience and improve office productivity. 
This position will lead the enhancement, maintenance, and expansion of Course Match, Wharton's MBA registration system, which allocates course schedules. As a technically advanced implementation of modern mechanism design, Course Match is central to the MBA academic experience and requires continued refinement and innovation. The Senior Data Scientist will contribute directly to its algorithm and infrastructure integration, with responsibilities spanning optimization tuning, heuristic development (e.g., Tabu Search), performance diagnostics, and production-level feature development. The role demands expertise in mixed-integer programming (MIP), fluency in Python, and the ability to deliver clean, modular, and maintainable production code. The position also plays a strategic role in evaluating and integrating a new optimization engine into Course Match, collaborating with internal stakeholders such as the MBA Program Office, Wharton Computing, and the University Registrar, and engaging with external vendors to ensure system robustness, scalability, and alignment with institutional practices, policies, and priorities. Beyond Course Match, the Senior Data Scientist will contribute to a broader portfolio of academic affairs and data science initiatives, including AI integration, predictive modeling, statistical analysis, machine learning, and analytics infrastructure development. A major focus will be building reusable, scalable, and automated Python-based solutions that enable long-term innovation within the MBA Program Office and Wharton Computing. The ideal candidate brings a strong background in algorithmic systems and applied data science, with a demonstrated ability to manage complex technical projects from ideation through deployment. They will be adept at translating technical insights for diverse audiences, fostering cross-functional collaboration, and providing mentorship and thought leadership to colleagues. 
Outstanding communication skills and a highly collaborative mindset are essential. This role is a two-year appointment, with the potential for continuation. Job Description Duties Lead development and ownership of Course Match Serve as the primary developer and systems owner for the Course Match registration system, including algorithm and infrastructure integration. Algorithmic and performance refinement Design and implement enhancements to Course Match's algorithmic infrastructure, including optimization tuning, performance analysis, heuristic development (e.g., Tabu Search), mixed-integer programming (MIP), and system diagnostics. Internal and external collaboration and system integration Partner with internal stakeholders and external vendors to support the integration of a new optimization engine. Participate in vendor evaluation, architecture planning, and system design discussions to ensure successful deployment and infrastructure integration. Data science project contributions beyond Course Match Contribute to high-impact academic affairs and data science initiatives beyond Course Match, including AI integration, process automation, and machine learning. Build robust Python-based solutions that promote reusability, automation, and scalability. Infrastructure development and automation Develop and maintain analytics infrastructure and automation pipelines to support data workflows across the MBA Program Office and Wharton Computing. Apply best practices in software development, testing, and version control. Qualifications A Master's degree and 1-2 years of experience in a quantitative field such as Engineering, Data Science, Computer Science, Operations Research, Mathematics, or a related discipline are required; however, a Master's degree and at least 5 years of relevant experience are strongly preferred. Strong programming proficiency in Python is required, including experience developing modular, well-documented, and production-grade code. 
Familiarity with version control systems (e.g., Git) and test-driven development is strongly preferred. Demonstrated expertise in algorithmic systems or large-scale computational frameworks, with hands-on experience in one or more of the following: heuristic search methods (e.g., Tabu Search), combinatorial optimization, mixed-integer programming (MIP), or algorithmic market design. Experience working within multi-component software environments and integrating external libraries, APIs, or platforms. Familiarity with systems integration and data engineering practices is preferred. Experience supporting or participating in vendor evaluations or third-party system integrations is strongly preferred. Ability to engage with external vendors on technical system design, performance review, and implementation. Applied experience with core data science methods, including statistical modeling, predictive analytics, and machine learning. Proficiency with tools such as R, SQL, and Power BI is a plus. Job Location - City, State Philadelphia, Pennsylvania Department / School Wharton School Pay Range $100,000.00 - $140,000.00 Annual Rate Salary offers are made based on the candidate's qualifications, experience, skills, and education as they directly relate to the requirements of the position, and in alignment with salary ranges based on external market data for the job's level. Internal organization and peer data at Penn are also considered. Equal Opportunity Statement The University of Pennsylvania is an equal opportunity employer. Candidates are considered for employment without regard to race, color, sex, sexual orientation, religion, creed, national origin (including shared ancestry or ethnic characteristics), citizenship status, age, disability, veteran status or any class protected under applicable federal, state or local law. Special Requirements Background checks may be required after a conditional job offer is made. 
Consideration of the background check will be tailored to the requirements of the job. University Benefits Health, Life, and Flexible Spending Accounts: Penn offers comprehensive medical, prescription, behavioral health, dental, vision, and life insurance benefits to protect you and your family's health and welfare. You can also use flexible spending accounts to pay for eligible health care and dependent care expenses with pre-tax dollars. Tuition: Take advantage of Penn's exceptional tuition benefits. You, your spouse, and your dependent children can get tuition assistance here at Penn. Your dependent children are also eligible for tuition assistance at other institutions. Retirement: Penn offers generous retirement plans to help you save for your future. Penn's Basic, Matching, and Supplemental retirement plans allow you to save for retirement on a pre-tax or Roth basis. Choose from a wide variety of investment options through TIAA and Vanguard. Time Away from Work: Penn provides you with a substantial amount of time away from work during the course of the year. This allows you to relax, take vacations, attend to personal affairs, recover from illness or injury, spend time with family-whatever your personal needs may be. Long-Term Care Insurance: In partnership with Genworth Financial, Penn offers faculty and staff (and your eligible family members) long-term care insurance to help you cover some of the costs of long-term care services received at home, in the community or in a nursing facility. If you apply when you're newly hired . click apply for full job details
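The Wharton posting's core technical asks, heuristic search (Tabu Search) and mixed-integer programming for course allocation, can be illustrated on a toy problem. The sketch below is a deliberately tiny, pure-Python Tabu Search over an invented course-selection instance (the utilities, credits, and credit cap are made up); Course Match itself solves a far larger allocation as a MIP and is not implemented this way:

```python
import random

# Toy course-selection instance: (utility, credits) per course.
# All numbers are invented for illustration.
COURSES = [(8, 3), (5, 2), (7, 4), (3, 1), (9, 5), (4, 2)]
CREDIT_CAP = 9

def value(selection):
    """Total utility of a set of course indices, or -1 if over the credit cap."""
    if sum(COURSES[i][1] for i in selection) > CREDIT_CAP:
        return -1
    return sum(COURSES[i][0] for i in selection)

def tabu_search(iterations=200, tenure=3, seed=0):
    rng = random.Random(seed)
    current = set()
    best, best_val = set(), value(current)
    tabu = {}  # course index -> last iteration at which flipping it is still forbidden
    for it in range(iterations):
        candidates = []
        for i in range(len(COURSES)):
            neighbor = current ^ {i}  # flip course i in or out of the schedule
            v = value(neighbor)
            # Skip tabu moves unless they beat the best known (aspiration criterion).
            if tabu.get(i, -1) >= it and v <= best_val:
                continue
            candidates.append((v, i, neighbor))
        if not candidates:
            continue
        v, i, current = max(candidates, key=lambda c: (c[0], rng.random()))
        tabu[i] = it + tenure
        if v > best_val:
            best, best_val = set(current), v
    return best_val, sorted(best)

print(tabu_search())  # → (20, [0, 3, 4])
```

The tabu dict forbids re-flipping a recently changed course for `tenure` iterations, which is what lets the search move away from local optima it has just visited; the aspiration test still permits a tabu move whenever it improves on the best schedule found so far.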
01/14/2026
Full time
Senior Data Scientist, Wharton Graduate Division University Overview The University of Pennsylvania, the largest private employer in Philadelphia, is a world-renowned leader in education, research, and innovation. This historic, Ivy League school consistently ranks among the top 10 universities in the annual U.S. News & World Report survey. Penn has 12 highly-regarded schools that provide opportunities for undergraduate, graduate and continuing education, all influenced by Penn's distinctive interdisciplinary approach to scholarship and learning. As an employer Penn has been ranked nationally on many occasions with the most recent award from Forbes who named Penn one of America's Best Large Employers in 2023. Penn offers a unique working environment within the city of Philadelphia. The University is situated on a beautiful urban campus, with easy access to a range of educational, cultural, and recreational activities. With its historical significance and landmarks, lively cultural offerings, and wide variety of atmospheres, Philadelphia is the perfect place to call home for work and play. The University offers a competitive benefits package that includes excellent healthcare and tuition benefits for employees and their families, generous retirement benefits, a wide variety of professional development opportunities, supportive work and family benefits, a wealth of health and wellness programs and resources, and much more. Posted Job Title Senior Data Scientist, Wharton Graduate Division Job Profile Title Job Description Summary The Senior Data Scientist - Algorithmic Systems is a key member of the Data Science team supporting the full-time, Executive, and Global Wharton MBA programs. Reporting to the Director of Data Science, this role is instrumental in advancing analytics-driven solutions that enhance the student experience and improve office productivity. 
This position will lead the enhancement, maintenance, and expansion of Wharton's Course Match registration system-Wharton's MBA registration system that allocates course schedule. As a technically advanced implementation of modern mechanism design, Course Match is central to the MBA academic experience and requires continued refinement and innovation. The Senior Data Scientist will contribute directly to its algorithm and infrastructure integration, with responsibilities spanning optimization tuning, heuristic development (e.g., Tabu Search), performance diagnostics, and production-level feature development. The role demands expertise in mixed-integer programming (MIP), fluency in Python, and the ability to deliver clean, modular, and maintainable production code. The position also plays a strategic role in evaluating and integrating a new optimization engine into Course Match, collaborating with internal stakeholders such as the MBA Program Office, Wharton Computing, and the University Registrar, and engaging with external vendors to ensure system robustness, scalability, and alignment with institutional practices, policies, and priorities. Beyond Course Match, the Senior Data Scientist will contribute to a broader portfolio of academic affairs and data science initiatives, including AI integration, predictive modeling, statistical analysis, machine learning, and analytics infrastructure development. A major focus will be building reusable, scalable, and automated Python-based solutions that enable long-term innovation within the MBA Program Office and Wharton Computing. The ideal candidate brings a strong background in algorithmic systems and applied data science, with a demonstrated ability to manage complex technical projects from ideation through deployment. They will be adept at translating technical insights for diverse audiences, fostering cross-functional collaboration, and providing mentorship and thought leadership to colleagues. 
Outstanding communication skills and a highly collaborative mindset are essential. This role is a two-year appointment, with the potential for continuation. Job Description Duties Lead development and ownership of Course Match Serve as the primary developer and systems owner for Course Match registration system, including algorithm and infrastructure integration. Algorithmic and performance refinement Design and implement enhancements to Course Match's algorithmic infrastructure, including optimization tuning, performance analysis, heuristic development (e.g., Tabu Search), mixed-integer programming (MIP), and system diagnostics. Internal and external collaboration and system integration Partner with internal stakeholders and external vendors to support the integration of a new optimization engine. Participate in vendor evaluation, architecture planning, and system design discussions to ensure successful deployment and infrastructure integration. Data science project contributions beyond Course Match Contribute to high-impact academic affairs and data science initiatives beyond Course Match, including AI integration, process automation, and machine learning. Build robust Python-based solutions that promote reusability, automation, and scalability. Infrastructure development and automation Develop and maintain analytics infrastructure and automation pipelines to support data workflows across the MBA Program Office and Wharton Computing. Apply best practices in software development, testing, and version control. Qualifications Master's degree and 1-2 years of experience in a quantitative field such as Engineering, Data Science, Computer Science, Operations Research, Mathematics, or a related discipline is required; however, a Master's degree and at least 5 years of relevant experience is strongly preferred. Strong programming proficiency in Python is required, including experience developing modular, well-documented, and production-grade code. 
Familiarity with version control systems (e.g., Git) and test-driven development is strongly preferred. Demonstrated expertise in algorithmic systems or large-scale computational frameworks, with hands-on experience in one or more of the following: heuristic search methods (e.g., Tabu Search), combinatorial optimization, mixed-integer programming (MIP), or algorithmic market design. Experience working within multi-component software environments and integrating external libraries, APIs, or platforms. Familiarity with systems integration and data engineering practices is preferred. Experience supporting or participating in vendor evaluations or third-party system integrations is strongly preferred. Ability to engage with external vendors on technical system design, performance review, and implementation. Applied experience with core data science methods, including statistical modeling, predictive analytics, and machine learning. Proficiency with tools such as R, SQL, and Power BI is a plus. Job Location - City, State Philadelphia, Pennsylvania Department / School Wharton School Pay Range $100,000.00 - $140,000.00 Annual Rate Salary offers are made based on the candidate's qualifications, experience, skills, and education as they directly relate to the requirements of the position, and in alignment with salary ranges based on external market data for the job's level. Internal organization and peer data at Penn are also considered. Equal Opportunity Statement The University of Pennsylvania is an equal opportunity employer. Candidates are considered for employment without regard to race, color, sex, sexual orientation, religion, creed, national origin (including shared ancestry or ethnic characteristics), citizenship status, age, disability, veteran status or any class protected under applicable federal, state or local law. Special Requirements Background checks may be required after a conditional job offer is made. 
Consideration of the background check will be tailored to the requirements of the job.

University Benefits

Health, Life, and Flexible Spending Accounts: Penn offers comprehensive medical, prescription, behavioral health, dental, vision, and life insurance benefits to protect you and your family's health and welfare. You can also use flexible spending accounts to pay for eligible health care and dependent care expenses with pre-tax dollars.

Tuition: Take advantage of Penn's exceptional tuition benefits. You, your spouse, and your dependent children can get tuition assistance here at Penn. Your dependent children are also eligible for tuition assistance at other institutions.

Retirement: Penn offers generous retirement plans to help you save for your future. Penn's Basic, Matching, and Supplemental retirement plans allow you to save for retirement on a pre-tax or Roth basis. Choose from a wide variety of investment options through TIAA and Vanguard.

Time Away from Work: Penn provides you with a substantial amount of time away from work during the course of the year. This allows you to relax, take vacations, attend to personal affairs, recover from illness or injury, or spend time with family, whatever your personal needs may be.

Long-Term Care Insurance: In partnership with Genworth Financial, Penn offers faculty and staff (and your eligible family members) long-term care insurance to help you cover some of the costs of long-term care services received at home, in the community, or in a nursing facility.
SENIOR DATA ANALYTICS DEVELOPER, Institutional Research

Job Description

Category: Charles River Campus > Professional
Job Location: BOSTON, MA, United States
Posted Date: 11/10/2025
Salary Grade: Grade 51
Expected Hiring Range: $120,375.00 (minimum) - $168,525.00 (maximum)
Position Type: Full-Time/Regular

The salary of the finalist selected for this role will be set based on a variety of factors, including but not limited to departmental budgets, qualifications, experience, education, licenses, specialty, training, and internal pay comparison. The above hiring range represents the University's good faith and reasonable estimate of the range of possible compensation at the time of posting.

Boston University seeks a Senior Data Analytics Developer to join our growing Analytical Services & Institutional Research team. This is a high-impact role at a leading R1 university, where advanced analytics directly shape institutional strategy, resource planning, and student success. The Senior Developer will design and deliver sophisticated data solutions, including dashboards, visualizations, and analytic models, that transform complex datasets into actionable insights for senior leadership and academic/administrative stakeholders. This position offers the opportunity to apply advanced technical skills in Python, R, SQL, and modern BI platforms while working in a collaborative, mission-driven environment. If you thrive at the intersection of technical innovation, data storytelling, and strategic decision support, this is an outstanding opportunity to make a measurable difference at a world-class research institution.

Key Responsibilities

Successful candidates will bring a strong technical foundation combined with the ability to communicate insights that drive institutional priorities.

Analytics Development: Build advanced dashboards, reports, and visualizations (Power BI, Tableau, etc.) 
to deliver clear insights to senior leaders and decision-makers. Design and implement reproducible data models and statistical analyses (e.g., regression, clustering, predictive modeling). Contribute to scenario planning, enrollment forecasting, and operational optimization projects.

Data Engineering & Infrastructure: Develop and optimize queries and pipelines using SQL and/or cloud-based tools. Support the integration of data across institutional systems (SIS, HRIS, LMS) and external sources (IPEDS, NSF HERD, CDS). Contribute to a scalable and modern data environment, leveraging cloud platforms (AWS, Azure) and data stack tools (dbt, Airflow).

Collaboration & Communication: Partner with administrators and technical staff to clarify questions, design solutions, and translate complex data into actionable insights. Create executive-ready reports and presentations tailored to non-technical stakeholders. Support a culture of data literacy and evidence-based decision-making across the institution.

Continuous Improvement: Document workflows, models, and reporting resources for transparency and reproducibility. Mentor junior staff and contribute to knowledge-sharing within the team. Stay current on emerging analytics technologies and best practices.

Required Skills

Qualifications: Bachelor's degree in data science, statistics, computer science, applied mathematics, or a related field. 8+ years of applicable experience, with 3+ years of progressively responsible experience in data analytics, data science, or institutional research, preferably in higher education. Proficiency in Python, R, and SQL for data cleaning, modeling, and analysis. Strong experience with data visualization platforms (Tableau, Power BI, or similar). Excellent communication skills with the ability to translate technical analyses into leadership-ready insights.

Bonus Qualifications

If you do not meet these, you are still encouraged to apply; we value employees with a willingness to learn. 
Experience in an R1 university or similarly complex academic institution. Familiarity with higher education data systems (SIS, HRIS, LMS) and national datasets (IPEDS, NSF HERD, CDS). Knowledge of higher education policy, enrollment management, or institutional effectiveness. Experience with cloud platforms (AWS, Azure) and modern data stack tools (dbt, Airflow). Understanding of data governance, metadata management, or semantic modeling.

Why Join Us?

Be part of an innovative analytics team at one of the nation's leading research universities. Work on projects that directly impact student success, institutional planning, and financial sustainability. Collaborate with highly engaged administrators, faculty, and data professionals across a vibrant campus. Access professional development opportunities to expand technical and leadership skills in a rapidly evolving field.

Boston University offers an excellent benefits package, including: Time Off: In addition to PTO and leave policy, BU employees have a paid intersession break and 13 paid holidays. Retirement: University-funded retirement plan with full vesting after 2 years of eligible service. Tuition Assistance Program: Competitive tuition assistance program for yourself and family members.

Boston University IS&T invests in our staff and their personal and professional growth. We promote staff learning, including lunch-and-learn sessions and an extensive library of online courses; the Fun Advisory Board (FAB) arranges a number of events throughout the year; and there are opportunities to engage with peers at NERCOMP and EDUCAUSE events. If you require a reasonable accommodation in order to complete the employment application process, please contact the Equal Opportunity Office. 
We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, or veteran status. We, at IS&T, appreciate each individual's knowledge, experiences and insights which enhance who we are, and as our DEIA knowledge and practice grows, we will ensure that our Mission, Vision, & Practices remain equitable and welcoming to all.
01/14/2026
Full time
Job Overview:

Architecture & Design Support: Support the definition of system and data integration architecture for eCommerce platforms and enterprise systems. Contribute to solution design documents, data flow diagrams, interface specifications, and integration standards. Work with senior architects to evaluate design options and ensure alignment with overall platform strategy.

Data Engineering & Integrations: Help design and implement data pipelines supporting analytics, tagging, customer data, product information, inventory, orders, and other commerce workflows. Configure and maintain integration patterns such as APIs, streaming, webhooks, batch jobs, and middleware workflows. Participate in building data models, schemas, and event structures that support digital commerce operations. Assist in maintaining data quality, monitoring, governance, and synchronization across systems. Help design and implement data visualization using BI tools such as Microsoft Power BI, Tableau, etc.

Implementation & Delivery: Collaborate with engineering teams to translate integration requirements into technical tasks. Support testing, debugging, and validation of integrations across upstream and downstream systems. Contribute to documentation and knowledge sharing within the team. Ensure solutions meet requirements for security, reliability, observability, and performance.

Cross-Functional Collaboration: Work closely with product managers, analytics teams, marketing, operations, and customer experience teams to understand data and integration needs. Help clarify data requirements and ensure correct flow of customer, product, and transaction information. Communicate technical concepts clearly to both technical and business partners.

It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability. 
Total Rewards:

Salary Range: $71,000 - $115,500. Actual placement within the compensation range may vary depending on experience, skills, and other factors.

Benefits, subject to election and eligibility: Medical, Dental, Vision, Disability, Paid Time Off (including paid parental leave, vacation, and sick time), 401k with company match, Tuition Reimbursement, and Mileage Reimbursement. Annual bonus based on performance and eligibility.

Requirements:

4-7 years of experience in solution architecture, data engineering, systems integration, or similar roles. Strong understanding of eCommerce ecosystems and how data flows between CRM, CDP, ERP, OMS, PIM, tagging/analytics, and marketing platforms. Experience building or supporting API integrations and event-driven workflows. Working knowledge of data modeling, ETL/ELT pipelines, and cloud-based data infrastructure. Familiarity with digital tagging, analytics measurement, and customer data capture. Strong technical documentation, communication, and problem-solving skills. Ability to work collaboratively within a cross-functional team environment.

Preferred Qualifications

Experience contributing to large-scale digital commerce or replatforming initiatives. Exposure to headless or composable commerce concepts. Some familiarity with DevOps practices, CI/CD, and cloud-based integration tooling. Understanding of data privacy and compliance considerations (GDPR, CCPA). 
Relevant industry certifications such as Microsoft Azure, AWS, Google Cloud, or equivalent.

Company Overview:

Keurig Dr Pepper (NASDAQ: KDP) is a leading beverage company in North America, with a portfolio of more than 125 owned, licensed, and partner brands and powerful distribution capabilities to provide a beverage for every need, anytime, anywhere. We operate with a differentiated business model and world-class brand portfolio, powered by a talented and engaged team that is anchored in our values. We work with big, exciting beverage brands and the single-serve coffee brewing system in North America at KDP, and we have fun doing it! Together, we have built a leading beverage company in North America offering hot and cold beverages together at scale. Whatever your area of expertise, at KDP you can be a part of a team that's proud of its brands, partnerships, innovation, and growth. Will you join us?

We strive to be an employer of choice, providing a culture and opportunities that empower our team of 29,000 employees to grow and develop. We offer robust benefits to support your health and wellness as well as your personal and financial well-being. We also provide employee programs designed to enhance your professional growth and development, while ensuring you feel valued, inspired, and appreciated at work.

Keurig Dr Pepper is an equal opportunity employer and recruits qualified applicants and advances in employment its employees without regard to race, color, religion, gender, sexual orientation, gender identity, gender expression, age, disability or association with a person with a disability, medical condition, genetic information, ethnic or national origin, marital status, veteran status, or any other status protected by law.

A.I. Disclosure: KDP uses artificial intelligence to assist with initial resume screening and candidate matching. This technology helps us efficiently identify candidates whose qualifications align with our open roles. 
If you prefer not to have your application processed using artificial intelligence, you may opt out by emailing your resume and qualifications directly to in lieu of clicking Apply. Please include the job title and location or Job ID # in the email subject line.
01/14/2026
Full time
Job ID: 709262BR
Date posted: Oct. 29, 2025

Description:

THE WORK

This senior role fosters collaboration with other senior engineers on the development of advanced data analytics solutions and agile development projects in support of a high-visibility mission. This position involves providing technical leadership and guidance on data analytics and agile development projects, as well as collaborating with cross-functional teams to drive mission objectives.

WHO WE ARE

At Lockheed Martin, we're a leading aerospace and defense company that's shaping the future of cyber and intelligence. We're committed to innovating at the Edge: harnessing the latest advancements in cyber, artificial intelligence, and machine learning to stay ahead of emerging threats and opportunities. This Program is seeking a software engineer to parse data products that make information available for Analytic missions. This Program has a robust process to ensure product quality, which includes DAT, UAT/SME Validation, Smoke Test, and System Level Test (SLT) prior to deploying products to production.

WHO YOU ARE

Experience as a software engineer assisting in the design, development, testing, and debugging of software solutions, with a focus on Linux operating systems. Strong familiarity with programming languages such as Java.

WHY JOIN US

Providing ongoing training, mentorship, and development opportunities to help our cyber and intelligence professionals stay at the forefront of their field and achieve their career goals. Competitive and comprehensive benefits package. Rewards and recognition for your hard work. Medical and dental coverage. 401k retirement savings plan. Paid time off for work/life balance. And more.

Basic Qualifications:

Bachelor's degree from an accredited college in a related discipline, or equivalent experience/combined education, with 20 years of professional experience. 
Twenty (20) years' experience as a SWE on programs and contracts of similar scope, type, and complexity is required. Bachelor's degree in Computer Science or a related discipline from an accredited college or university is required. Four (4) years of additional SWE experience on projects with similar software processes may be substituted for a bachelor's degree. Demonstrated experience with distributed-scale big data stores (e.g., Accumulo), the Map/Reduce programming model (e.g., Hadoop), the Hadoop Distributed File System, and data serialization (e.g., JSON). Experience with programming languages such as Java and scripting languages such as Perl, Python, and Bash. TS/SCI with Poly.

Desired Skills:
• Full-stack development (front-end and back-end)
• Implement and maintain high availability (HA) and replication strategies
• Implement and maintain PostgreSQL and PostGIS databases
• AWS Managed Services (e.g., EC2, S3, VPC)
• Understanding of CI/CD pipelines (using GitLab) and infrastructure as code (Terraform, AWS CloudFormation)
• Experience with containerized environments (Docker, Kubernetes, EKS)
• Prometheus, Grafana, or other logging/monitoring tools

Clearance Level: TS/SCI w/Poly SP

Other Important Information You Should Know

Expression of Interest: By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified, you may be contacted for this and future openings.

Ability to Work Remotely: Onsite Full-time: The work associated with this position will be performed onsite at a designated Lockheed Martin facility.

Work Schedules: Lockheed Martin supports a variety of alternate work schedules that provide additional flexibility to our employees. Schedules range from a standard 40 hours over a five-day work week, while others may be condensed. 
These condensed schedules provide employees with additional time away from the office and are in addition to our Paid Time Off benefits.

Schedule for this Position: 9x80, every other Friday off.

Pay Rate: The annual base salary range for this position in California, Massachusetts, and New York (excluding most major metropolitan areas), Colorado, Hawaii, Illinois, Maryland, Minnesota, New Jersey, Vermont, Washington, or Washington DC is $150,800 - $265,880. For states not referenced above, the salary range for this position will reflect the candidate's final work location. Please note that the salary information is a general guideline only. Lockheed Martin considers factors such as (but not limited to) the scope and responsibilities of the position, the candidate's work experience, education/training, and key skills, as well as market and business considerations, when extending an offer.

Benefits offered: Medical, Dental, Vision, Life Insurance, Short-Term Disability, Long-Term Disability, 401(k) match, Flexible Spending Accounts, EAP, Education Assistance, Parental Leave, Paid Time Off, and Holidays.

(Washington state applicants only) Non-represented full-time employees accrue at least 10 hours per month of Paid Time Off (PTO) to be used for incidental absences and other reasons, and receive at least 90 hours for holidays. Represented full-time employees accrue 6.67 hours of vacation per month, accrue up to 52 hours of sick leave annually, and receive at least 96 hours for holidays. PTO, vacation, sick leave, and holiday hours are prorated based on start date during the calendar year. This position is incentive plan eligible.

Lockheed Martin is an equal opportunity employer. Qualified candidates will be considered without regard to legally protected characteristics. The application window will close in 90 days; applicants are encouraged to apply within 5-30 days of the requisition posting date in order to receive optimal consideration. 
At Lockheed Martin, we use our passion for purposeful innovation to help keep people safe and solve the world's most complex challenges. Our people are some of the greatest minds in the industry and truly make Lockheed Martin a great place to work. With our employees as our priority, we provide diverse career opportunities designed to propel, develop, and boost agility. Our flexible schedules, competitive pay, and comprehensive benefits enable our employees to live a healthy, fulfilling life at and outside of work. We place an emphasis on empowering our employees by fostering an inclusive environment built upon integrity and corporate responsibility. If this sounds like a culture you connect with, you're invited to apply for this role. Or, if you are unsure whether your experience aligns with the requirements of this position, we encourage you to search on Lockheed Martin Jobs and apply for roles that align with your qualifications.
Experience Level: Experienced Professional
Business Unit: RMS
Relocation Available: No
Career Area: Software Engineering
Type: Task Order/IDIQ
Shift: First
01/14/2026
Full time
Job ID: 709262BR Date posted: Oct. 29, 2025 Description:
THE WORK
This senior role fosters collaboration with other senior engineers on the development of advanced data analytics solutions and agile development projects in support of a high-visibility mission. This position involves providing technical leadership and guidance on data analytics and agile development projects, as well as collaborating with cross-functional teams to drive mission objectives.
WHO WE ARE
At Lockheed Martin, we're a leading aerospace and defense company that's shaping the future of cyber and intelligence. We're committed to innovating at the edge: harnessing the latest advancements in cyber, artificial intelligence, and machine learning to stay ahead of emerging threats and opportunities. This program is seeking a software engineer to parse data products that make information available for analytic missions. This program has a robust process to ensure product quality, including DAT, UAT/SME validation, smoke testing, and System Level Test (SLT) prior to deploying products to production.
WHO YOU ARE
Experience as a software engineer assisting in the design, development, testing, and debugging of software solutions, with a focus on Linux operating systems. Strong familiarity with programming languages such as Java.
WHY JOIN US
Ongoing training, mentorship, and development opportunities to help our cyber and intelligence professionals stay at the forefront of their field and achieve their career goals. A competitive and comprehensive benefits package. Rewards and recognition for your hard work. Medical and dental coverage. 401(k) retirement savings plan. Paid time off for work/life balance. And more.
Basic Qualifications: Bachelor's degree from an accredited college in a related discipline, or equivalent experience/combined education, with 20 years of professional experience. 
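The Lockheed Martin posting asks for familiarity with the Map/Reduce programming model and JSON serialization. As a rough, framework-free sketch of what that model means (the function names and sample input here are purely illustrative, not from the posting): a mapper emits key/value pairs, the framework shuffles them by key, and a reducer aggregates each key's values.

```python
import json
from collections import defaultdict

def map_phase(records):
    """Mapper: emit (word, 1) pairs from each input record."""
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reducer: sum the counts for each key, as the framework
    would after the shuffle/sort step."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

# Tiny in-memory stand-in for records read from HDFS.
lines = ["big data stores", "data serialization"]
counts = reduce_phase(map_phase(lines))
print(json.dumps(counts, sort_keys=True))
```

In a real Hadoop deployment the mapper and reducer run as separate distributed tasks over HDFS splits; this sketch only shows the contract between the two phases.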
Job Overview:
Architecture & Design Support
Support the definition of system and data integration architecture for eCommerce platforms and enterprise systems. Contribute to solution design documents, data flow diagrams, interface specifications, and integration standards. Work with senior architects to evaluate design options and ensure alignment with the overall platform strategy.
Data Engineering & Integrations
Help design and implement data pipelines supporting analytics, tagging, customer data, product information, inventory, orders, and other commerce workflows. Configure and maintain integration patterns such as APIs, streaming, webhooks, batch jobs, and middleware workflows. Participate in building data models, schemas, and event structures that support digital commerce operations. Assist in maintaining data quality, monitoring, governance, and synchronization across systems. Help design and implement data visualizations using BI tools such as Microsoft Power BI and Tableau.
Implementation & Delivery
Collaborate with engineering teams to translate integration requirements into technical tasks. Support testing, debugging, and validation of integrations across upstream and downstream systems. Contribute to documentation and knowledge sharing within the team. Ensure solutions meet requirements for security, reliability, observability, and performance.
Cross-Functional Collaboration
Work closely with product managers, analytics teams, marketing, operations, and customer experience teams to understand data and integration needs. Help clarify data requirements and ensure the correct flow of customer, product, and transaction information. Communicate technical concepts clearly to both technical and business partners.
It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability. 
Total Rewards:
Salary Range: $71,000 - $115,500. Actual placement within the compensation range may vary depending on experience, skills, and other factors.
Benefits, subject to election and eligibility: Medical, Dental, Vision, Disability, Paid Time Off (including paid parental leave, vacation, and sick time), 401k with company match, Tuition Reimbursement, and Mileage Reimbursement. Annual bonus based on performance and eligibility.
Requirements:
4-7 years of experience in solution architecture, data engineering, systems integration, or similar roles. Strong understanding of eCommerce ecosystems and how data flows between CRM, CDP, ERP, OMS, PIM, tagging/analytics, and marketing platforms. Experience building or supporting API integrations and event-driven workflows. Working knowledge of data modeling, ETL/ELT pipelines, and cloud-based data infrastructure. Familiarity with digital tagging, analytics measurement, and customer data capture. Strong technical documentation, communication, and problem-solving skills. Ability to work collaboratively within a cross-functional team environment.
Preferred Qualifications:
Experience contributing to large-scale digital commerce or replatforming initiatives. Exposure to headless or composable commerce concepts. Some familiarity with DevOps practices, CI/CD, and cloud-based integration tooling. Understanding of data privacy and compliance considerations (GDPR, CCPA). 
Relevant industry certifications such as Microsoft Azure, AWS, Google Cloud, or equivalent.
Company Overview: Keurig Dr Pepper (NASDAQ: KDP) is a leading beverage company in North America, with a portfolio of more than 125 owned, licensed and partner brands and powerful distribution capabilities to provide a beverage for every need, anytime, anywhere. We operate with a differentiated business model and a world-class brand portfolio, powered by a talented and engaged team that is anchored in our values. We work with big, exciting beverage brands and the single-serve coffee brewing system in North America at KDP, and we have fun doing it! Together, we have built a leading beverage company in North America offering hot and cold beverages together at scale. Whatever your area of expertise, at KDP you can be a part of a team that's proud of its brands, partnerships, innovation, and growth. Will you join us? We strive to be an employer of choice, providing a culture and opportunities that empower our team of 29,000 employees to grow and develop. We offer robust benefits to support your health and wellness as well as your personal and financial well-being. We also provide employee programs designed to enhance your professional growth and development, while ensuring you feel valued, inspired and appreciated at work. Keurig Dr Pepper is an equal opportunity employer and recruits qualified applicants and advances its employees in employment without regard to race, color, religion, gender, sexual orientation, gender identity, gender expression, age, disability or association with a person with a disability, medical condition, genetic information, ethnic or national origin, marital status, veteran status, or any other status protected by law.
A.I. Disclosure: KDP uses artificial intelligence to assist with initial resume screening and candidate matching. This technology helps us efficiently identify candidates whose qualifications align with our open roles. 
If you prefer not to have your application processed using artificial intelligence, you may opt out by emailing your resume and qualifications directly to in lieu of clicking Apply. Please include the job title and location or Job ID # in the email subject line.
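The KDP role centers on event-driven integrations: validating an inbound event (e.g. from a webhook) and reshaping it for a downstream system. A minimal sketch of that validate-then-transform pattern, with the caveat that every field name here is hypothetical — a real order schema would come from the OMS or CDP, not this example:

```python
from datetime import datetime, timezone

# Hypothetical critical fields for an inbound order event.
REQUIRED_FIELDS = {"order_id", "customer_id", "items"}

def validate_event(event: dict) -> list:
    """Return a list of problems found in an incoming order event."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - event.keys())]
    if not event.get("items"):
        problems.append("order has no line items")
    return problems

def transform_for_downstream(event: dict) -> dict:
    """Map the inbound event into the (assumed) shape a downstream
    system expects, stamping the processing time."""
    return {
        "id": event["order_id"],
        "customer": event["customer_id"],
        "line_count": len(event["items"]),
        "processed_at": datetime.now(timezone.utc).isoformat(),
    }

event = {"order_id": "A-1001", "customer_id": "C-7", "items": [{"sku": "X", "qty": 2}]}
assert validate_event(event) == []
out = transform_for_downstream(event)
```

In production this logic would sit behind a webhook endpoint or message consumer, with failed validations routed to a dead-letter queue rather than raised inline.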
01/12/2026
Full time
Company Description
Adtalem Global Education is a national leader in post-secondary education and a leading provider of professional talent to the healthcare industry. Adtalem educates and empowers students with the knowledge and skills to become leaders in their communities and make a lasting impact on public health, well-being and beyond. Through equitable access to education, environments that nurture student success, and a focus on expanding and diversifying the talent pipeline in healthcare, Adtalem is building a brighter future for communities and the world. Adtalem is the parent organization of American University of the Caribbean School of Medicine, Chamberlain University, Ross University School of Medicine, Ross University School of Veterinary Medicine and Walden University. We operate on a hybrid schedule with four in-office days per week (Monday-Thursday). This approach enhances creativity, innovation, communication, and relationship-building, fostering a dynamic and collaborative work environment. Visit for more information and follow us on LinkedIn and Instagram.
Job Description
Adtalem is a data-driven organization. The Data Engineering team builds data solutions that power strategic and tactical business decisions and support Analytics and Artificial Intelligence operations. By implementing the data platform, data pipelines, and data governance policies, this team provides the basis for decision-making at Adtalem. Adtalem is looking for a Senior Data Engineer who will design, build, and maintain robust data engineering solutions that support our company's innovation initiatives and growth objectives. Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets. Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth. 
Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound. Model the data platform by applying business logic and building objects in the semantic layer. Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics. Optimize data pipelines for performance, scalability, and reliability. Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products. Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root. Document the design and support strategy of the data pipelines. Capture, store, and socialize data lineage and operational metadata. Troubleshoot and resolve data engineering issues as they arise. Develop REST APIs to expose data to other teams within the company. Stay current with emerging technologies and industry trends related to big data, streaming data, and synthetic data generation. Mentor and guide junior data engineers.
Qualifications
Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field. Master's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field. Two (2) plus years' experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow, BQML, and Vertex AI. Six (6) plus years' experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics. Hands-on experience working with real-time, unstructured, and synthetic data; this role will be instrumental in advancing our data platform capabilities. Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar. Expert knowledge of Python programming and SQL. 
Experience with cloud platforms (AWS, GCP, Azure) and their data services. Experience working with Airflow as a workflow management tool, including building operators to connect, extract, and ingest data as needed. Familiarity with synthetic data generation and unstructured data processing. Experience in AI/ML data pipelines and frameworks. Excellent organizational, prioritization, and analytical abilities. Proven experience working in incremental execution through successful launches. Excellent problem-solving and critical-thinking skills to recognize and comprehend complex data issues affecting the business environment. Experience working in an agile environment.
Additional Information
In support of the pay transparency laws enacted across the country, the expected salary range for this position is between $84,835.61 and $149,076.17. Actual pay will be adjusted based on job-related factors permitted by law, such as experience and training; geographic location; licensure and certifications; market factors; departmental budgets; and responsibility. Our Talent Acquisition Team will be happy to answer any questions you may have, and we look forward to learning more about your salary requirements. The position qualifies for the benefits below. Adtalem offers a robust suite of benefits including: health, dental, vision, life, and disability insurance; 401k Retirement Program + 6% employer match; participation in Adtalem's Flexible Time Off (FTO) Policy; and 12 paid holidays. For more information related to our benefits please visit: You are also eligible to participate in an annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance. Equal Opportunity - Minority / Female / Disability / V / Gender Identity / Sexual Orientation
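The Adtalem posting calls for ensuring the quality of critical data elements. A minimal sketch of what a batch data-quality check can look like in plain Python (the column names and rules here are invented for illustration; real critical-data-element rules would come from the governance policy):

```python
# Hypothetical rules: one validity check per critical column.
RULES = {
    "student_id": lambda v: isinstance(v, str) and v != "",
    "enrollment_date": lambda v: isinstance(v, str) and len(v) == 10,  # expects YYYY-MM-DD
}

def quality_report(rows):
    """Count rule failures per column across a batch of records."""
    failures = {col: 0 for col in RULES}
    for row in rows:
        for col, check in RULES.items():
            if not check(row.get(col)):
                failures[col] += 1
    return failures

rows = [
    {"student_id": "S001", "enrollment_date": "2025-01-15"},
    {"student_id": "", "enrollment_date": "2025-1-5"},  # both fields fail their rules
]
report = quality_report(rows)
```

In a pipeline such a check would typically run as a task after ingestion (e.g. an Airflow task), with failure counts feeding monitoring and the remediation plans the posting mentions.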
01/07/2026
Full time
Company Description Adtalem Global Education is a national leader in post-secondary education and leading provider of professional talent to the healthcare industry. Adtalem educates and empowers students with the knowledge and skills to become leaders in their communities and make a lasting impact on public health, well-being and beyond. Through equitable access to education, environments that nurture student success, and a focus on expanding and diversifying the talent pipeline in healthcare, Adtalem is building a brighter future for communities and the world. Adtalem is the parent organization of American University of the Caribbean School of Medicine, Chamberlain University, Ross University School of Medicine, Ross University School of Veterinary Medicine and Walden University. We operate on a hybrid schedule with four in-office days per week (Monday-Thursday). This approach enhances creativity, innovation, communication, and relationship-building, fostering a dynamic and collaborative work environment. Visit for more information and follow us on LinkedIn and Instagram . Job Description Adtalem is a data driven organization. The Data Engineering team builds data solutions that powers strategic and tactical business decisions and supports the Analytics and Artificial Intelligence operations. By implementing the data platform, data pipelines and data governance policies this team provides the basis for decision-making in Adtalem. Adtalem is looking for a Senior Data Engineer who design, build, and maintain robust data engineering solutions that support our company's innovation initiatives and growth objectives. Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth. 
Design, develop, deploy and support high performance data pipelines both inbound and outbound. Model data platform by applying the business logic and building objects in the semantic layer of the data platform. Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics Optimize data pipelines for performance, scalability, and reliability. Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products. Ensure quality of critical data elements, prepare data quality remediation plans and collaborate with business and system owners to fix the quality issues at its root. Document the design and support strategy of the data pipelines Capture, store and socialize data lineage and operational metadata Troubleshoot and resolve data engineering issues as they arise. Develop REST APIs to expose data to other teams within the company. Stay current with emerging technologies and industry trends related to big data, streaming data, and synthetic data generation Mentor and guide junior data engineers. Qualifications Bachelor's Degree Computer Science, Computer Engineering, Software Engineering, or other related technical field. Master's Degree Computer Science, Computer Engineering, Software Engineering, or other related technical field. Two (2) plus years experience in Google cloud with services like BigQuery, Composer, GCS, DataStream, Dataflows,BQML, Vertex AI. Six (6) plus years experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics. Hands-on experience working with real-time, unstructured, and synthetic data, and will be instrumental in advancing our data platform capabilities. Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar. Expert knowledge on Python programming and SQL. 
Experience with cloud platforms (AWS, GCP, Azure) and their data services Experience working with Airflow as workflow management tools and build operators to connect, extract and ingest data as needed. Familiarity with synthetic data generation and unstructured data processing Experience in AI/ML data pipelines and frameworks Excellent organizational, prioritization and analytical abilities. Have proven experience working in incremental execution through successful launches. Excellent problem-solving and critical-thinking skills to recognize and comprehend complex data issues affecting the business environment. Experience working in agile environment. Additional Information In support of the pay transparency laws enacted across the country, the expected salary range for this position is between $84,835.61 and $149,076.17. Actual pay will be adjusted based on job-related factors permitted by law, such as experience and training; geographic location; licensure and certifications; market factors; departmental budgets; and responsibility. Our Talent Acquisition Team will be happy to answer any questions you may have, and we look forward to learning more about your salary requirements. The position qualifies for the below benefits. Adtalem offers a robust suite of benefits including: Health, dental, vision, life and disability insurance 401k Retirement Program + 6% employer match Participation in Adtalem's Flexible Time Off (FTO) Policy 12 Paid Holidays For more information related to our benefits please visit: You are also eligible to participate in an annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance. Equal Opportunity - Minority / Female / Disability / V / Gender Identity / Sexual Orientation
Company Description Adtalem Global Education is a national leader in post-secondary education and leading provider of professional talent to the healthcare industry. Adtalem educates and empowers students with the knowledge and skills to become leaders in their communities and make a lasting impact on public health, well-being and beyond. Through equitable access to education, environments that nurture student success, and a focus on expanding and diversifying the talent pipeline in healthcare, Adtalem is building a brighter future for communities and the world. Adtalem is the parent organization of American University of the Caribbean School of Medicine, Chamberlain University, Ross University School of Medicine, Ross University School of Veterinary Medicine and Walden University. We operate on a hybrid schedule with four in-office days per week (Monday-Thursday). This approach enhances creativity, innovation, communication, and relationship-building, fostering a dynamic and collaborative work environment. Visit for more information and follow us on LinkedIn and Instagram . Job Description Adtalem is a data driven organization. The Data Engineering team builds data solutions that powers strategic and tactical business decisions and supports the Analytics and Artificial Intelligence operations. By implementing the data platform, data pipelines and data governance policies this team provides the basis for decision-making in Adtalem. Adtalem is looking for a Senior Data Engineer who design, build, and maintain robust data engineering solutions that support our company's innovation initiatives and growth objectives. Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth. 
• Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound. • Model the data platform by applying business logic and building objects in the platform's semantic layer. • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics. • Optimize data pipelines for performance, scalability, and reliability. • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products. • Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root. • Document the design and support strategy of the data pipelines. • Capture, store, and socialize data lineage and operational metadata. • Troubleshoot and resolve data engineering issues as they arise. • Develop REST APIs to expose data to other teams within the company. • Stay current with emerging technologies and industry trends related to big data, streaming data, and synthetic data generation. • Mentor and guide junior data engineers. Qualifications • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field. • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field. • Two (2)+ years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, Dataflow, BQML, and Vertex AI. • Six (6)+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics. • Hands-on experience working with real-time, unstructured, and synthetic data; this role will be instrumental in advancing our data platform capabilities. • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar. • Expert knowledge of Python programming and SQL.
• Experience with cloud platforms (AWS, GCP, Azure) and their data services. • Experience with Airflow as a workflow management tool, including building operators to connect to, extract, and ingest data as needed. • Familiarity with synthetic data generation and unstructured data processing. • Experience with AI/ML data pipelines and frameworks. • Excellent organizational, prioritization, and analytical abilities. • Proven experience with incremental execution through successful launches. • Excellent problem-solving and critical-thinking skills to recognize and comprehend complex data issues affecting the business environment. • Experience working in an agile environment. Additional Information In support of the pay transparency laws enacted across the country, the expected salary range for this position is between $84,835.61 and $149,076.17. Actual pay will be adjusted based on job-related factors permitted by law, such as experience and training; geographic location; licensure and certifications; market factors; departmental budgets; and responsibility. Our Talent Acquisition Team will be happy to answer any questions you may have, and we look forward to learning more about your salary requirements. The position qualifies for the benefits below. Adtalem offers a robust suite of benefits, including: • Health, dental, vision, life, and disability insurance • 401(k) retirement program + 6% employer match • Participation in Adtalem's Flexible Time Off (FTO) policy • 12 paid holidays For more information related to our benefits, please visit: You are also eligible to participate in an annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance. Equal Opportunity - Minority / Female / Disability / V / Gender Identity / Sexual Orientation
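The Airflow qualification above centers on building custom operators that extract and ingest data. A minimal sketch of that pattern is below; to keep it self-contained, `BaseOperator` is a stand-in for Airflow's `airflow.models.BaseOperator` (whose `execute(context)` method Airflow calls when a task runs), and the task name, rows, and ingest callable are all illustrative, not from the posting.

```python
# Sketch of the custom-operator pattern: subclass a base operator and
# implement execute(). BaseOperator here is a stand-in for
# airflow.models.BaseOperator so the example runs without Airflow installed.

class BaseOperator:
    def __init__(self, task_id):
        self.task_id = task_id

    def execute(self, context):  # Airflow invokes this when the task runs
        raise NotImplementedError


class ExtractToStagingOperator(BaseOperator):
    """Pulls rows from a source and hands them to an ingest callable."""

    def __init__(self, task_id, source_rows, ingest):
        super().__init__(task_id)
        self.source_rows = source_rows   # in real use: a connection or hook
        self.ingest = ingest             # in real use: write to GCS/BigQuery

    def execute(self, context):
        extracted = [row for row in self.source_rows if row is not None]
        self.ingest(extracted)
        return len(extracted)


staged = []
op = ExtractToStagingOperator("extract_orders",
                              [{"id": 1}, None, {"id": 2}],
                              staged.extend)
count = op.execute(context={})
# count == 2; the two non-null rows land in `staged`
```

In a real deployment the operator would use an Airflow hook for the source connection and be wired into a DAG; the structure (constructor arguments plus an `execute` override) is the part the qualification is asking for.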
01/07/2026
Full time
Real people. Real service. At SupplyHouse, we value every individual team member and cultivate a community where people come first. Led by our core values of Generosity, Respect, Innovation, Teamwork, and GRIT, we're dedicated to maintaining a supportive work environment that celebrates diversity and empowers everyone to reach their full potential. As an industry-leading e-commerce company specializing in HVAC, plumbing, heating, and electrical supplies since 2004, we strive to foster growth while providing the best possible experience for our customers. We are looking for a new Principal Backend Engineer to join our growing IT Team. This individual will report to our Sr. Director of IT and serve as a technical leader and system architect, guiding the design and delivery of scalable, reliable, and high-performance solutions across our e-commerce and internal platforms. You'll partner closely with senior leaders to define the long-term engineering vision and lead initiatives that strengthen system performance, scalability, and reliability. If you're an experienced technical leader who thrives on collaboration, mentorship, and building solutions that power business growth, we'd love to hear from you! This remote position is open to individuals who live in, or are open to relocating to, the following states: Arizona, Delaware, Florida, Georgia, Nevada, New Jersey, New York, North Carolina, Ohio, Rhode Island, South Carolina, Tennessee, Texas, Virginia, and Washington. This position requires travel to our headquarters in Melville, NY three times per year for internal meetings and team-building activities. We reimburse reasonable and necessary travel expenses, and you're also welcome to work on-site anytime beyond these visits - our doors are always open! Role Type: Full-Time Location: Remote Schedule: Monday through Friday, 8:00 a.m. to 5:00 p.m. 
with time zone flexibility
Base Salary: $140,000 - $175,000 per year
Responsibilities:
Technical Strategy & Architecture
• Architect end-to-end software solutions using modern frameworks and design patterns aligned with scalability, performance, and maintainability goals.
• Lead system design discussions for high-traffic applications, ensuring robust architecture for business-critical services.
• Evaluate, recommend, and implement architectural improvements to enhance scalability, observability, and resilience.
• Define and uphold best practices for code quality, security, accessibility, and data privacy compliance.
Project Leadership
• Lead complex, cross-functional projects from concept to delivery, aligning engineering solutions with business needs.
• Collaborate with product and business teams to define technical vision and ensure cohesive project execution.
• Oversee the technical implementation of new features and services, ensuring efficient use of system resources and infrastructure.
System Engineering & Optimization
• Build and maintain distributed systems using Spring Boot microservices, Docker, and Kubernetes.
• Design and optimize high-performance databases using MySQL and Oracle, leveraging indexing and tuning for efficiency.
• Implement and manage Redis for caching, Eureka Server for service registration, and the ELK Stack for monitoring and analytics.
• Configure Nginx and Apache for load balancing and high availability across production systems.
DevOps & Reliability Engineering
• Own and maintain CI/CD pipelines using Jenkins for automated builds, testing, and deployments.
• Ensure system health and stability through proactive monitoring, logging, and alerting strategies.
• Drive improvements in deployment automation, infrastructure as code, and site reliability practices.
Leadership & Collaboration
• Act as a technical advisor and mentor, helping develop engineers' skills in architecture, design, and problem-solving.
• Conduct code and design reviews to maintain engineering excellence.
• Collaborate effectively with senior leadership and cross-departmental stakeholders to influence company-wide technical initiatives.
• Foster a culture of innovation, ownership, and accountability across the engineering organization.
Requirements:
• Bachelor's degree or foreign equivalent in Computer Science, Engineering, Information Technology, or a related field.
• 10+ years of experience in enterprise-level software development.
• Advanced proficiency in: Java, Spring Boot, and Microservices Architecture; MySQL, Oracle, and Redis; React and modern front-end frameworks; Docker, Kubernetes, and Jenkins CI/CD; Linux/UNIX, Nginx, Apache, and the ELK Stack.
• Deep understanding of data structures, algorithms, and system design principles.
• Proven ability to architect and deliver complex backend systems and mentor technical teams.
• Strong collaboration skills with the ability to influence senior leaders and align multiple teams toward shared goals.
Why work with us:
We have awesome benefits - We offer a wide variety of benefits to help support you and your loved ones. These include:
• Comprehensive and affordable medical, dental, vision, and voluntary life insurance options
• 401(k) with up to 4% company match
• Paid vacation, sick time, and holidays
• Company-paid basic life insurance and long-term disability
• Discounted auto, home, and pet insurance programs
• Flexible Spending Account (FSA)
• Confidential mental health, financial planning, and legal support through our Employee Assistance Program (EAP)
• Company-provided equipment and a one-time $250 work-from-home stipend
• $750 annual professional development budget
• $25 monthly Grubhub credit
• Company rewards and recognition program
• And more!
We empower ownership - We all contribute to our success and we all share in it. Our Ownership for All program ensures each SupplyHouse team member will benefit financially from the company's growth and accomplishments. 
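As a rough illustration of the Nginx load-balancing work named in the responsibilities above, a minimal configuration might look like the sketch below. The upstream name, hosts, and ports are hypothetical assumptions, not SupplyHouse's actual topology:

```nginx
# Hypothetical load-balancing sketch: one upstream pool of app instances
# behind a single public listener. Hosts and ports are illustrative only.
upstream app_backend {
    least_conn;                                        # prefer the least-busy instance
    server 10.0.0.11:8080 max_fails=3 fail_timeout=30s;
    server 10.0.0.12:8080 max_fails=3 fail_timeout=30s;
    server 10.0.0.13:8080 backup;                      # used only if the others fail
}

server {
    listen 80;
    location / {
        proxy_pass http://app_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

The `backup` and `max_fails`/`fail_timeout` directives are what give a setup like this its high-availability behavior: unhealthy instances are temporarily removed from rotation and the backup server only receives traffic when the primaries are down.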
We promote work-life balance - We value your time and encourage a healthy separation between your professional and personal life so you can feel refreshed and recharged. Look out for our wellness initiatives and ask about our Flex-Time Policy! We support growth - We encourage you to embrace continuous learning and take on new challenges. In an exciting and evolving industry, we provide opportunities for career growth through our annual merit and bonus opportunities, hands-on training, diversity and inclusion initiatives, internal mobility options, and professional development budget. We give back - We live and breathe our core value, Generosity, by giving back to the trades and organizations around the world. We make a difference through donation drives, employee-nominated contributions, support for non-profit organizations, Volunteer Paid Time Off, and more. We listen - We value hearing from our employees. Everyone has a voice, and we encourage you to use it! We actively elicit feedback through our monthly town halls, regular 1:1 check-ins, employee listening initiatives, and company-wide ideas form to incorporate suggestions and ensure our team enjoys coming to work every day. Check us out and learn more at ! Additional Details: Remote employees are expected to work in a distraction-free environment. Personal devices, background noise, and other distractions should be kept to a minimum to avoid disrupting virtual meetings or business operations. Applicants must be currently authorized to work in the U.S. on a full-time basis. SupplyHouse may sponsor applicants for work visas in limited situations. SupplyHouse is an Equal Opportunity Employer, strongly values inclusion, and encourages individuals of all backgrounds and experiences to apply for this position. To ensure fairness and trust in our hiring process, we ask that all application materials, assessments, and interview responses reflect your own thinking and perspective. 
You may use AI tools to assist in preparing your responses, as long as this use is clearly disclosed and you can speak authentically to your ideas and work. Our focus is on honesty, judgment, and how you approach problem-solving. We appreciate your transparency and look forward to learning more about your skills. We are committed to providing a safe and secure work environment and conduct thorough background checks on all potential employees in accordance with applicable laws and regulations. All emails from the SupplyHouse team will only be sent from email address. Please exercise caution if you receive an email from an alternate domain.
01/07/2026
Full time