Keurig Dr Pepper Careers
Job Overview:
Are you passionate about harnessing data to unlock business insights and drive strategic growth? At Keurig Dr Pepper, we're seeking a Principal Data Engineer to lead the development and optimization of our modern data ecosystem. You'll play a pivotal role in designing scalable data solutions, enabling advanced analytics, and mentoring teams across disciplines to maximize the value of data. Join our innovative team and help shape the future of enterprise data at one of North America's leading beverage companies.

As a Principal Data Engineer, you will:

Data Engineering & Architecture:
• Architect and implement scalable data pipelines and transformation workflows using dbt, SnowSQL, and cloud-native technologies.
• Design, build, and maintain enterprise-grade data lakes, data warehouses, and transformation layers, leveraging modern architectural patterns such as the Medallion (Bronze, Silver, Gold) architecture to support analytics and machine learning.
• Implement best practices for data ingestion, modeling, quality assurance, and lineage to ensure trusted and consistent data delivery.
• Create reusable frameworks and tools for automated deployment, monitoring, and lifecycle management of data assets.
• Optimize data platforms to deliver reliable, performant, and cost-effective analytics infrastructure.

AI-Ready Data Ecosystems:
• Collaborate with Data Scientists to support AI/ML pipelines, enabling efficient feature engineering, model training, and real-time inferencing.
• Integrate AI-driven capabilities such as anomaly detection, intelligent alerting, and natural language enrichment into data workflows.

Leadership & Collaboration:
• Collaborate with cross-functional teams, including Solution Architects, Product Managers, Data Scientists, and Analysts, to align on data strategy and business outcomes.
• Technically lead product teams involving external partners, ensuring timely delivery of high-quality data solutions.
• Provide technical thought leadership, guiding project teams and business units through architectural decisions and best practices.
• Facilitate cross-functional alignment to ensure scalable and sustainable data engineering practices.

Governance & Optimization:
• Define and enforce engineering standards and best practices across the analytics ecosystem.
• Lead architectural design reviews, ensuring technical rigor and adherence to change control processes.
• Continuously assess and optimize the performance, reliability, and scalability of the data platform.
• Contribute to roadmap development and long-term strategy for the enterprise analytics platform.

Who you are:
You're a strategic, hands-on engineering leader who combines deep technical expertise with strong business acumen. You're passionate about solving complex data challenges, enabling AI, and mentoring teams to deliver enterprise-grade data solutions.

Key Skills & Expertise:

Technical Mastery (deep, hands-on expertise expected):
• Expert-level knowledge of Snowflake architecture, SnowSQL, and data transformation workflows.
• Advanced proficiency with dbt for modeling, testing, versioning, and orchestrating ELT pipelines.
• Hands-on experience with Databricks, including Spark-based data processing, Delta Lake, and integration with cloud-native data platforms.
• Strong command of SQL, Python, and scalable data pipeline development.
• Proven experience designing and managing enterprise data warehouses and cloud-native data platforms.
• Deep understanding of modern data modeling techniques (e.g., dimensional, data vault, star/snowflake schemas).
• Experience delivering platforms that support advanced analytics and machine learning solutions.
• Solid grasp of data architecture frameworks such as data warehouses, data lakes, and data hubs.
• Experience with commercial data science tools such as KNIME, Alteryx, or similar platforms.
Some Experience / Familiarity With (preferred, but not required at expert level):
• AI/ML platforms such as SageMaker, AutoML, or TensorFlow.
• Data integration and ELT tools such as Informatica Cloud, Fivetran, or Azure Data Factory.
• BI tools: Power BI, Tableau, or MicroStrategy.
• Working with SAP ERP as a data source.
• Cloud platforms such as Azure, AWS, or GCP, especially storage, compute, and orchestration services.
• DevOps practices including Git workflows, CI/CD, and automation.
• UNIX/Linux environments and shell scripting.

Leadership & Communication:
• Ability to lead cross-functional teams and manage complex data initiatives from design through delivery.
• Excellent communicator, able to convey complex technical concepts to both technical and non-technical audiences.
• Proven track record of influencing architectural direction, mentoring team members, and driving business-aligned outcomes.
• Skilled at deriving insights from data and translating them into clear, actionable recommendations.

Problem Solving & Optimization:
• Demonstrated success in building and optimizing data pipelines to maximize performance and maintainability.
• Strong analytical and troubleshooting skills with a focus on root cause analysis and long-term solutions.

Total Rewards:
• Salary Range: $116,100 - $185,000. Actual placement within the compensation range may vary depending on experience, skills, and other factors.
• Benefits, subject to election and eligibility: Medical, Dental, Vision, Disability, Paid Time Off (including paid parental leave, vacation, and sick time), 401(k) with company match, Tuition Reimbursement, and Mileage Reimbursement.
• Annual bonus based on performance and eligibility.

Requirements:

Education:
• Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.

Experience:
• 10+ years in data management, including integration, modeling, and optimization.
• Hands-on experience with analytics tools and platforms such as Snowflake or Informatica Cloud.
• Expert in SQL, SnowSQL coding, ETL, and data warehousing.

Core Skills:
• Expertise in conceptual architecture, data integration design, and wireframing.
• Familiarity with DevOps and Agile technology environments (preferred).
• Strong presentation and change management skills.

Nice to Have / Cross-Trained In:
• Exposure to SAP ERP and SAP-based data integration.
• Experience with DevOps, CI/CD pipelines, and Git-based workflows.
• Familiarity with AI/ML use cases in data operations (e.g., intelligent data quality, anomaly detection).
• Snowflake, dbt, or cloud platform certifications.

Company Overview:
Keurig Dr Pepper (NASDAQ: KDP) is a leading beverage company in North America, with a portfolio of more than 125 owned, licensed, and partner brands and powerful distribution capabilities to provide a beverage for every need, anytime, anywhere. We operate with a differentiated business model and world-class brand portfolio, powered by a talented and engaged team anchored in our values. At KDP, we work with big, exciting beverage brands and the single-serve coffee brewing system in North America, and we have fun doing it! Together, we have built a leading beverage company offering hot and cold beverages at scale. Whatever your area of expertise, at KDP you can be part of a team that's proud of its brands, partnerships, innovation, and growth. Will you join us?

We strive to be an employer of choice, providing a culture and opportunities that empower our team of 29,000 employees to grow and develop. We offer robust benefits to support your health and wellness as well as your personal and financial well-being. We also provide employee programs designed to enhance your professional growth and development, while ensuring you feel valued, inspired, and appreciated at work.
Keurig Dr Pepper is an equal opportunity employer that recruits qualified applicants and advances its employees in employment without regard to race, color, religion, gender, sexual orientation, gender identity, gender expression, age, disability or association with a person with a disability, medical condition, genetic information, ethnic or national origin, marital status, veteran status, or any other status protected by law.

A.I. Disclosure:
KDP uses artificial intelligence to assist with initial resume screening and candidate matching. This technology helps us efficiently identify candidates whose qualifications align with our open roles. If you prefer not to have your application processed using artificial intelligence, you may opt out by emailing your resume and qualifications directly to .
JPS Tech Solutions LLC
Orlando, Florida
Job Title: BI Solutions Architect Lead Specialist Engineer
Location: Orlando, FL
Experience: 12+ Years
Employment Type: Contract
Interview Type: In-Person or Webcam

Job Overview
We are seeking an experienced BI Solutions Architect Lead Specialist Engineer to lead the design, development, and implementation of Business Intelligence solutions. The ideal candidate will have deep expertise in BI architecture, data warehousing, data modeling, analytics platforms, and enterprise reporting capabilities. This role requires strong leadership skills, hands-on technical ability, and experience guiding teams to deliver scalable BI and analytics solutions that support business strategy and informed decision-making.

Key Responsibilities
• Lead the architecture, design, and delivery of enterprise-level Business Intelligence and data analytics solutions.
• Work closely with business stakeholders to define BI strategy, reporting needs, and data integration requirements.
• Architect, design, and manage data warehouse solutions including ETL pipelines, data models, and metadata frameworks.
• Evaluate and recommend BI tools, data platforms, modeling approaches, and architectural best practices.
• Oversee end-to-end technical solution delivery including planning, execution, and optimization.
• Drive standardization and governance around data quality, consistency, performance, and security.
• Collaborate with cross-functional teams including data engineers, developers, analysts, and enterprise architects.
• Lead performance tuning, testing, migration, and infrastructure optimization for BI platforms.
• Provide leadership, mentorship, and technical guidance to engineering and analytics teams.
• Ensure alignment of BI solutions with business goals, scalability needs, and future-state architecture.

Required Qualifications
• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or an equivalent field.
• 12+ years of professional experience in Business Intelligence, Data Engineering, or Analytics roles.
• Strong experience designing enterprise BI architectures and data warehouse solutions.
• Hands-on experience with BI tools such as Power BI, Tableau, Qlik, Looker, or MicroStrategy.
• Strong understanding of ETL frameworks and tools such as Informatica, Talend, DataStage, SSIS, or ADF.
• Expertise in SQL and data modeling, including star schema, dimensional modeling, and OLAP concepts.
• Practical experience with cloud platforms such as AWS, Azure, or Google Cloud data services.
• Good knowledge of data governance, data security, and master data management practices.
• Proven track record of leading technical teams and large-scale BI transformation projects.
• Excellent analytical, communication, and stakeholder management skills.

Preferred Skills
• Experience with modern data platforms such as Snowflake, Redshift, Databricks, or BigQuery.
• Knowledge of Python or other scripting languages for automation and data processing.
• Experience with real-time data integration technologies, including Kafka or streaming pipelines.
• Exposure to machine learning, predictive analytics, and advanced analytics ecosystems.
• Prior consulting or enterprise-level solution delivery experience.