Description
We are looking for a Data Engineer to help us build and maintain scalable and resilient pipelines that will ingest, process, and deliver the data needed for predictive and descriptive analytics. These data pipelines will further connect to machine learning pipelines to facilitate automatic retraining of our models.
We are a diverse group of data scientists, data engineers, software engineers, and machine learning engineers from over 30 different countries. We are smart and fast-moving, operating in small teams with freedom for independent work and fast decision-making.
To empower scientists and radically improve how science is published, evaluated and disseminated to researchers, innovators and the public, we have built our own state-of-the-art Artificial Intelligence Review Assistant (AIRA), backed by cutting-edge machine learning algorithms.
Key Responsibilities
Work in a team of machine learning engineers responsible for the productization of prototypes developed by data scientists.
Collaborate with data scientists, machine learning engineers, and other data engineers to design scalable, reliable, and maintainable ETL processes that ensure data scientists and automated ML processes have the necessary data available.
Research and adopt the best DataOps & MLOps standards to design and develop scalable end-to-end data pipelines.
Identify opportunities for data process automation.
Establish and enforce best practices (e.g. in development, quality assurance, optimization, release, and monitoring).
Requirements
Degree in Computer Science or similar
Proven experience as a Data Engineer
Proficiency in Python
Experience with a Cloud Platform (e.g. Azure, AWS, GCP)
Experience with a workflow engine (e.g. Data Factory, Airflow)
Experience with SQL and NoSQL (e.g. MongoDB) databases
Experience with Hadoop & Spark
Great communication, teamwork, problem-solving, and organizational skills.
Nice To Have
Understanding of supervised and unsupervised machine learning algorithms
Stream-processing frameworks (e.g. Kafka)
Benefits
Competitive salary.
Participation in Frontiers' annual bonus scheme
25 leave days + 4 well-being days (pro rata and expiring each year on 31st of December)
Great work-life balance.
Opportunity to work remotely
Fresh fruit, snacks and coffee.
English classes.
Team building/sport activities and monthly social events.
Lots of opportunities to work with exciting technologies and solve challenging problems
Who we are
Frontiers is an award-winning open science platform and leading open access scholarly publisher. We are one of the largest and most cited publishers globally. Our journals span science, health, humanities and social sciences, engineering, and sustainability and we continue to expand into new academic disciplines so more researchers can publish open access.
Dec 23, 2021
Full time
Senior Data Engineer - Onsite - London
I've recently partnered with a company in the public safety industry who are seeking a Senior Data Engineer to join their IT team. The role will help bring their data together, developing and maintaining advanced PostgreSQL queries alongside executing data-related activities such as cleansing, migration, ETL, modelling, and mapping. This role would suit either a Senior Data Engineer or a Power BI Developer.
What's our client ideally looking for?
5 years of experience as a Data Engineer or BI Developer
Experience in data visualization and BI development using Power BI
Proficiency in PostgreSQL, SQL, DAX, and Power Query
Capability to generate Salesforce CRM reports
If you'd like to review the full job description, please apply today and one of Connexa's consultants will reach out to you. Established in Didsbury, Connexa Technology Ltd is becoming one of the UK's fastest growing IT and Technology recruitment companies. People. Technology. Connected. Connexa Technology is acting as an Employment Agency in relation to this vacancy.
Apr 19, 2024
Full time
We are hiring a Senior Staff Software Engineer/Principal Software Engineer for Databricks' Engineering team, reporting to an Engineering Leader. You will be part of the Databricks engineering organization, working on one of the most important products at Databricks, alongside teams that develop Databricks products and features for thousands of enterprises worldwide. As a software engineer, you will join as a founding member of our Berlin site (fully remote) and of the founding team for our multi-year journey to achieve our Lakehouse vision. You will be involved in the entire development cycle and exemplify all core Databricks values (own-it, data decide, teamwork, customer-obsessed).
Key Characteristics
As an engineer at Databricks, you will build the next generation of distributed data storage and processing systems that can outperform specialized SQL query engines in relational query performance, yet provide the expressiveness and programming abstractions to support diverse workloads ranging from ETL to data science.
Job Description
At Databricks, we are obsessed with enabling data teams to solve the world's toughest problems, from security threat detection to cancer drug development. We do this by building and running the world's best data and AI infrastructure platform, so our customers can focus on the high-value challenges that are central to their own missions. Our engineering teams build highly technical products that fulfill real, important needs in the world. We develop and operate one of the largest-scale software platforms. The fleet consists of millions of virtual machines, generating terabytes of logs and processing exabytes of data per day. At our scale, we regularly observe cloud hardware, network, and operating system faults, and our software must gracefully shield our customers from all of the above.
The Impact you will have:
Solve real business needs at large scale by applying your software engineering skills.
Deliver a highly scalable, available, and fault-tolerant engine processing hundreds of TB of data daily across thousands of customers.
Perform low-level systems debugging, performance measurement, and optimization on large production clusters.
Build architecture designs, influence the product roadmap, and take ownership of and responsibility for new projects.
Introduce tools that allow greater automation and operability of services.
Use your deep experience to help prevent and investigate production issues.
Plan and lead complicated technical projects that involve several teams within the company.
Contribute as a Technical Team Lead by mentoring others, leading sprint planning, delegating work and assignments to team members, and participating in project planning.
What we look for:
15+ years of industry experience building and supporting large-scale distributed systems.
Comfortable working towards a multi-year vision with incremental deliverables.
Motivated by delivering customer value and impact.
Strong foundation in algorithms and data structures and their real-world use cases.
Experience driving company initiatives towards customer satisfaction.
BS/MS/PhD in Computer Science or related majors, or equivalent experience.
Benefits
Comprehensive health coverage including medical, dental, and vision
Equity awards
Flexible time off
Paid parental leave
Family planning
Gym reimbursement
Employee Assistance Program (EAP)
Mental wellness resources
About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.
Apr 19, 2024
Full time
Data Engineer - Dynamics 365 CE / CRM and Power Platform experience required
A well-established MS partner experiencing growth through a very strong project pipeline is looking to add a Data Engineer and a Data Analytics Consultant to their team. They pride themselves on their innovative approach and their commitment to delivering outstanding results for their clients.
Job Purpose
Developing accurate, efficient data transformations that meet customer needs within agreed deadlines.
Ensuring the reliability, robustness, and resilience of the projects you design and build, while working independently within agreed standards.
Creating and maintaining data pipelines, data storage solutions, data processing, and data integration.
Troubleshooting issues and implementing necessary fixes.
Key Responsibilities
Designing and building reliable, robust, and accurate data pipelines based on agreed best practices.
Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL and other technologies.
Designing ETL processes, developing integration workflows, and managing data load processes to support both regular and ad-hoc activities.
Creating and maintaining optimal data pipeline architecture, which might include integrating with Dataverse via cloud flows or external data sources.
Designing, developing, and maintaining data warehouse environments.
Identifying, designing, and implementing internal process improvements, including automating manual processes, optimising data delivery, and re-designing infrastructure for greater scalability.
Technical Skills
3-5 years' established experience as a Data Engineer
Experience with programming languages and tools: C#, Python, Visual Studio Code, and JSON
Dynamics 365 CRM experience
Power Platform: Power Automate, Dataverse
Data management and warehousing
ETL and ELT
Data architecture and experience building complex database systems for businesses
SQL Server, including query optimisation
Coding experience, including low-code/no-code solutions
Comfortable working with a range of data sources and formats, e.g. JSON, XML, flat files, API integration
Machine learning and AI
Normalisation and de-normalisation
Understanding of Snowflake architecture, data modelling, and administration
50,000 - 60,000 based on experience - Great benefits
They would consider contract options for these roles, so please do reach out if you are looking for a contract opportunity - Outside IR35. This role will be fully remote with occasional travel to client sites. You must have the right to work in the UK as sponsorship is not provided. Please reach out to me on (phone number removed) or (url removed) to find out more information and get your application moving!
Apr 18, 2024
Full time
Head of Data Engineering
The Head of Data will be a strategic leader responsible for overseeing all aspects of data management, analytics, and governance within the organisation. This individual will play a critical role in driving data-driven decision-making processes, optimising data infrastructure, and ensuring the integrity, security, and accessibility of data assets. The ideal candidate will possess strong leadership skills, deep technical expertise in data management and analytics, and a proven track record of implementing innovative data strategies to support business objectives.
Key Responsibilities:
Strategic Leadership: Lead the development and execution of the organisation's data strategy, aligning it with business goals and objectives. Provide strategic direction for the use of data to drive decision-making and improve operational efficiency.
Data Management: Oversee the design, implementation, and maintenance of robust data management systems and processes, including data acquisition, storage, integration, quality assurance, and lifecycle management.
Data Analytics: Drive the development and implementation of advanced analytics initiatives to extract insights from data, identify trends, and support predictive modelling and forecasting. Collaborate with business stakeholders to understand their analytical needs and develop solutions to address them.
Data Governance: Establish and enforce data governance policies, standards, and best practices to ensure the accuracy, consistency, security, and privacy of data across the organisation. Develop data quality metrics and monitor compliance with regulatory requirements.
Data Architecture: Define and maintain the organisation's data architecture, including data models, schemas, and taxonomies. Evaluate and select appropriate technologies and tools to support data management, analytics, and visualisation requirements.
Team Leadership: Build and lead a high-performing team of data professionals, including data engineers, analysts, scientists, and governance specialists. Provide mentorship, coaching, and professional development opportunities to foster a culture of continuous learning and growth.
Cross-Functional Collaboration: Collaborate closely with other departments, including IT, finance, marketing, operations, and product development, to understand their data needs and priorities. Partner with business leaders to develop data-driven solutions that drive value and competitive advantage.
Vendor Management: Evaluate and manage relationships with third-party data vendors, software providers, and consultants to ensure the successful implementation of data-related projects and initiatives. Negotiate contracts, oversee vendor performance, and assess emerging technologies and trends in the data management space.
Qualifications:
Bachelor's degree in computer science, engineering, mathematics, statistics, or a related field; advanced degree (e.g. MBA, MS, or PhD) preferred.
10+ years of experience in data management, analytics, and business intelligence, with at least 5 years in a leadership role.
Proven track record of developing and implementing data strategies that drive business growth and innovation.
Deep understanding of data governance principles, regulatory compliance requirements (e.g. GDPR, CCPA), and industry best practices.
Strong technical proficiency in data modelling, SQL, ETL tools, data visualisation tools (e.g. Tableau, Power BI), and advanced analytics techniques (e.g. machine learning, predictive modelling).
Excellent leadership, communication, and interpersonal skills, with the ability to influence and collaborate effectively across all levels of the organisation.
Demonstrated experience in managing cross-functional teams and driving cultural change towards a data-driven mindset.
Ability to thrive in a fast-paced, dynamic environment and effectively prioritise and manage multiple projects and initiatives.
Interested? Please submit your updated CV to Lucy Morgan at Crimson for immediate consideration. Not interested? Do you know someone who might be a perfect fit for this role? Refer a friend and earn £250 worth of vouchers!
Crimson is acting as an employment agency regarding this vacancy. Please see our website for Crimson's Privacy Statement, should you wish to view it prior to applying for this vacancy.
Apr 18, 2024
Full time
This is a role for a sector specialist who has experience working as Business Intelligence or Data Management Lead in a fast-paced environment and for someone who is friendly, approachable and proactive in bringing new ideas to the table. Working within the IT department, the Head of Business Intelligence owns the reporting and analysis platform to design and deliver enterprise operational reporting, management information and analytics solutions. Capability Development: Develop a strategy to transition from manual desktop reporting to a unified business intelligence platform. Act as a Solution Architect for reporting and analytics, prioritizing activities aligned with business goals. Lead development teams, coordinate deliverables, and communicate progress. Establish a Data Governance Framework and ensure data quality and ownership. Own the Data Dictionary and standard reporting services, promoting understanding within the firm. Strategize self-service reporting and analysis, fostering business intelligence skills across functions. Provide training and documentation for effective technology usage. Business Intelligence Solution Delivery: Analyze reporting requirements and design solutions to meet objectives. Simplify, standardize, and optimize processes using BI and data tools. Identify data needs aligned with business priorities. Advise on data solution designs for performance and accuracy. Prototype, demonstrate, and document solutions. Refine delivery methods, automation, and QA processes. Manage external development teams and align delivery with cloud and change management processes. Oversee testing and contribute to solution architecture and frameworks. Performance & Quality: Update technical knowledge and skills regularly. Ensure development meets quality and timeliness standards, providing feedback and coaching. Manage time effectively between new solutions and support. 
Act as an IT ambassador, facilitating communication on software and service delivery and building cross-departmental relationships. Core Qualities and Skills: Proficient in leading business intelligence platforms/tools (e.g., QlikSense, Power BI, Tableau). Expertise in business intelligence concepts, data visualization, and analytics. Skilled in data modeling for optimized reporting and analysis. Extensive experience in business analysis and process optimization. Comprehensive understanding of the Azure platform and its services. Strong background in data engineering, including ETL, data cleaning, and warehousing. Hands-on experience with Azure data services like Azure SQL Database and Data Factory. Proficiency in SQL databases, particularly Azure SQL Database. Understanding of DevOps principles and CI/CD pipelines, with experience in related tools. Effective communication with technical and non-technical stakeholders. Strong problem-solving skills, with a focus on technical troubleshooting and long-term strategies. Experience in Technology and Data teams within financial sectors. Desired Qualities and Skills: Stakeholder relationship management, maintaining effective communication and understanding needs. Familiarity with big data technologies like Apache Spark and Hadoop. Understanding of broader cloud architecture principles and services from AWS or Google Cloud. Additional relevant certifications such as Azure Solutions Architect Expert or Tableau certification. Experience with project management methodologies like Agile, Scrum, or Kanban.
Apr 18, 2024
Full time
AWS Data Engineer UK Wide 60,000 - 80,000 per annum + permanent benefits A leading IT Consultancy is looking to strengthen its Data Engineering team; the successful candidate will have a hands-on design and engineering background in AWS, across a wide range of AWS services, with the ability to demonstrate working on large engagements. This role requires candidates to go through SC Clearance, so you must be eligible. Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR) Java, Scala, Python, Spark, SQL Experience of developing enterprise-grade ETL/ELT data pipelines. NoSQL databases: DynamoDB/Neo4j/Elastic, Google Cloud Datastore. Snowflake Data Warehouse/Platform Experience of working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible, etc. Experience building and deploying solutions to Cloud (AWS, Google Cloud) including Cloud provisioning tools (e.g. Terraform, AWS CloudFormation or Cloud Deployment Manager) Have a broad understanding of AWS services including but not limited to EC2, Storage, AWS Security, Container technologies, IAM, Cloud Networking, data processing and machine learning.
Apr 18, 2024
Full time
Our client, an established insurance FinTech with offices in the City of London and the United States, is looking for an experienced Lead Full Stack Software Developer. They are embarking on an exciting initiative to re-develop their existing product suite from the ground up, modernising their technology stack and their engineering processes. The successful candidate will lead the existing small-but-experienced engineering team and be a significant contributor to the design and development of the new product and to the various DevOps processes that are required to enable this. As the Lead Software Developer, you won't just be a member of the team; you'll be at the heart of their transformative journey. Collaborating closely with Project Leads and Developers / Engineers, you will: • Champion the execution of the project architectural design and rollout of the new product portfolio • Apply your extensive full-stack web application expertise to drive technical solutions, lead on aspects of code quality and influence other developers • Act as the key liaison between the project and developer teams, resolving technical blockers and ushering an efficient project lifecycle Experience / Skills / Knowledge required: Competencies Required: A strong background in full stack web application development Expert knowledge of relational databases, database design and querying Experience in building DevOps build and deployment pipelines Experience in working as part of an application development team; experience in leading a development team a plus Demonstrable experience applying OOD to create reusable, maintainable, and modular code that can adapt to changing requirements Desirable: Experience of delivering applications on the Azure platform Experience of infrastructure-as-code Experience in designing and developing ETL processes or similar Insurance knowledge Scrum or Kanban methodologies Technologies: Required: Front end: Angular 12 or above, TypeScript, JavaScript, 
CSS/Sass. Back end: C# .NET (Core) 5 or above, Microsoft SQL Server. Desirable: Microsoft Azure, Azure DevOps, Git, Entity Framework Core, Pulumi, SignalR, NUnit, Playwright or Selenium, Windows Auth/Azure AD/Azure AD B2C
Apr 18, 2024
Full time
The successful candidate will work from home and their prime responsibility will be our sites in Scotland. There will be visits to sites within the rest of the UK for training, support and cover. The safety systems typically consist of radar, high-power imagery as well as other safety transponder systems. The systems are PC based on a TCP/IP network which includes WLAN networks with distances around 5 km. Typical duties will involve an element of electrical/electronic, mechanical and IT skills, so knowledge and/or experience within any or as many of these areas would be advantageous. Experience working with Magnetron and Solid State X and S Band Radar and Systems up to 650kW would be highly desirable, although systems are typically 25kW (Magnetron). Electronic component fault-finding and repair skills would also be beneficial. The typical working week is an average of 35 hours, attending preventative and reactive call-outs within agreed contractual obligations. Sometimes call-outs might need to be answered around the customer requirements, which will involve some weekend and evening work. Candidates must be comfortable working at heights on fixed, temporary and natural fixtures, and fit enough to perform climbing duties and pass an annual revalidation. We would welcome applications from candidates with recent experience in an electronics field-based role who can demonstrate their aptitude to learn (e.g. ex-Military). Also of consideration may be exceptional HNC/HND or other students/graduates that might want to consider this role with their existing studies or as a first role after recently completing their courses. Due to the nature of our typical clients, the applicant must be able to seek security clearance and therefore this position is only suitable for UK nationals. Transport is essential; normally this is done by reimbursement of business mileage where your own vehicle is used. Candidates can join immediately or after they have served any notice period.
Apr 18, 2024
Full time
Interested candidates can reach out to Viswash Beesetti at Principal Consultant Why You? As a MySQL Principal Consultant you will, as part of a team-based approach, supply complete support for all aspects of MySQL database administration and/or system administration to a variety of clients, as assigned. You will assist the Team Manager, Lead Database Consultant, and Director Manager Services in ensuring quality and adherence to technical processes and standards with regard to the servicing of clients, take on a greater role in supporting Pythian's external profile and in the training and mentoring of team members. What will you be doing? As a MySQL Principal Consultant at Pythian, your primary focus will be on project work, debugging performance degradations, audits, and health checks. Once trained, you will be included on a third-tier on-call rotation with other Consultants to respond to escalations and you will be expected to provide training and mentoring to junior team members. Designing and helping implement new MySQL deployments Evaluating existing clusters and providing recommendations on best practices Debugging high-priority issues on mission-critical production environments Being involved with the open-source community through user lists, IRC, blog posts, webinars, and open-source projects Contributing to rapid brainstorming, designing and developing of prototypes Automating and providing documentation on operational procedures Training and mentoring junior team members Providing performance and forecast reports on the health and load of critical business processes to help ensure the infrastructure has adequate capacity Participating in company and team meetings What do you get in return? Competitive total rewards package including an annual bonus plan Flexible work environment Outstanding people: Collaborate with the industry's top minds around the world. 
Substantial training allowance: Hone your skills or learn new ones; participate in professional development days, attend conferences, become certified, whatever you like! Office Allowance: A device of your choosing and personalise your work environment! Blog during work hours; take a day off and volunteer for your favorite charity. Why Pythian? Pythian excels at helping businesses use their data and cloud to transform how they compete and win in this ever-changing environment by delivering advanced on-prem, hybrid, cloud and multi-cloud solutions to solve the toughest data challenges faster and better than anyone else. Founded and headquartered in Ottawa, Canada in 1997, Pythian now has more than 330 employees located around the globe with over 350 clients spanning industries from SaaS, media, gaming, and financial services to e-commerce and more. Pythian is known for its technology-enabled data expertise covering everything from ETL to ML. We pride ourselves on our ability to deliver innovative solutions that meet the specific data goals of each client and have built meaningful partnerships with major cloud vendors AWS, Google and Microsoft. The powerful combination of our extensive expertise in data and cloud and our ability to keep on top of the latest bleeding-edge technologies make us the perfect partner to help mid- and large-sized businesses transform to stay ahead in today's rapidly changing digital economy. If you are a Google Workspace Engineer, live in Hyderabad, love your data and want to love your career, then join us! Disclaimer For this job an equivalent combination of education and experience, which results in demonstrated ability to apply skills, will also be considered. Pythian is an equal opportunity employer and welcomes applications from people with disabilities. Accommodations are available upon request for candidates taking part in all aspects of the selection process. 
The successful applicant will need to fulfill the requirements necessary to obtain a background check. Applicants must be legally authorized to work in their country of residence permanently- Pythian will not relocate, sponsor, or file petitions of any kind on behalf of a foreign worker to gain a work visa, become a permanent resident based on a permanent job offer, or to otherwise obtain authorization to work. No recruitment agencies
Apr 18, 2024
Full time
Job Description - AVP - Data Architect & Advisory (BFS034948) With a startup spirit and 115,000+ curious and courageous minds, we have the expertise to go deep with the world's biggest brands-and we have fun doing it. We dream in digital, dare in reality, and reinvent the ways companies work to make an impact far bigger than just our bottom line. We're harnessing the power of technology and humanity to create meaningful transformation that moves us forward in our pursuit of a world that works better for people. Now, we're calling upon the thinkers and doers, those with a natural curiosity and a hunger to keep learning, keep growing. People who thrive on fearlessly experimenting, seizing opportunities, and pushing boundaries to turn our vision into reality. And as you help us create a better world, we will help you build your own intellectual firepower. Welcome to the relentless pursuit of better. Inviting applications for the role of Assistant Vice President, Data Architect & Advisory! Responsibilities Extensive experience with data architecture, consulting, and implementation of large-scale enterprise-level Data Warehousing, modernization of Data Platforms, Business Intelligence, and Analytics applications. Should have led multiple engagements in the Data space in terms of solutioning, architecture, and delivery. Design and develop a scalable platform architecture that supports global deployment without the need for rebuilds, aligning with the 'Build Once & Activate Many' strategy. Work closely with stakeholders to ensure the platform architecture supports high-quality, personalized user experiences and content relevancy. Collaborate with the Digital Marketing team to integrate the Growth Assist Digital Marketing framework, enabling targeted, accelerated, and cost-effective global user acquisition. Partner with the AI and Managed Services teams to implement AI-enabled solutions for domain expertise, data management, and operational efficiencies. 
Act as a liaison between technical teams and strategic partners, ensuring the Impact Commercial Model is effectively integrated into the platform's growth strategy. Oversee the technical execution of 7 out of the 9 lots outlined in the RFP, ensuring comprehensive coverage across build, integration, testing, operations, domain expertise, marketing, and data insights. Stay abreast of emerging technologies and methodologies that could further enhance the platform's capabilities and user experience. Ability to interact with, report out, and make recommendations to the executive-level steering committee. Qualifications we seek in you! Minimum Qualifications Master's or Bachelor's degree in Computer Science, Information Systems, Engineering, or related fields Preferred Qualifications/Skills Required technical skills • Strong Cloud experience on AWS, Azure, or GCP • Ability to work on design exercises for a Data Modernization platform • Strong experience with Databricks or Snowflake • Strong experience designing data ingestion and data aggregation • Experience designing data exchange through APIs, streaming, batch, and ETL processes • Experience with Databricks and Anypoint API Gateway • Strong on Data Lake and storage design for structured and unstructured data • Experience with Data Lineage, Data Dictionary, and governance Soft Skills • Should have excellent client interaction and presentation skills • Excellent English communication, both written and verbal • Excellent thought leadership is required • Should be capable of mentoring, building teams, and enhancing technical skills for the team. 
• Should be extremely good at internal and external stakeholder management Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit . Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Apr 18, 2024
Full time
Job Description - AVP - Data Architect & Advisory (BFS034948) With a startup spirit and 115,000+ curious and courageous minds, we have the expertise to go deep with the world's biggest brands-and we have fun doing it. We dream in digital, dare in reality, and reinvent the ways companies work to make an impact far bigger than just our bottom line. We're harnessing the power of technology and humanity to create meaningful transformation that moves us forward in our pursuit of a world that works better for people. Now, we're calling upon the thinkers and doers, those with a natural curiosity and a hunger to keep learning, keep growing., People who thrive on fearlessly experimenting, seizing opportunities, and pushing boundaries to turn our vision into reality. And as you help us create a better world, we will help you build your own intellectual firepower. Welcome to the relentless pursuit of better. Inviting applications for the role of Assistant Vice President, Data Architect & Advisory! Responsibilities Extensive experience w.r.t. Data architecture, Consulting, Implementation of large-scale Enterprise-level Data Warehousing, Modernization of Data Platform , Business Intelligence, and Analytics applications. Should have led multiple engagements in the Data space in terms of Solutioning and Architecture and Delivery. DDesign and develop a scalable platform architecture that supports global deployment without the need for rebuilds, aligning with the 'Build Once & Activate Many' strategy. Work closely with stakeholders to ensure the platform architecture supports high-quality, personalized user experiences and content relevancy. Collaborate with the Digital Marketing team to integrate the Growth Assist Digital Marketing framework, enabling targeted, accelerated, and cost-effective global user acquisition. Partner with the AI and Managed Services teams to implement AI-enabled solutions for domain expertise, data management, and operational efficiencies. 
Act as a liaison between technical teams and strategic partners, ensuring the Impact Commercial Model is effectively integrated into the platform's growth strategy. Oversee the technical execution of 7 out of the 9 lots outlined in the RFP, ensuring comprehensive coverage across build, integration, testing, operations, domain expertise, marketing, and data insights. Stay abreast of emerging technologies and methodologies that could further enhance the platform's capabilities and user experience. Ability to interact with, report out and make recommendations to the executive level steering committee. Qualifications we seek in you! Minimum Qualifications Master or Bachelor's degree in Computer Science, Information Systems, Engineering, related fields Preferred Qualifications/ Skills Required technical skills • Strong Cloud exp on AWS or Azure or GCP • Ability to work on design exercise of Data Modernization platform. • Strong exp on DataBricks or Snowflake • Strong exp on designing data ingestion and data aggregation. • Exp of design data exchange through API's Streaming, batch and ETL process • Exp on Data Bricks and Anypoint API Gateway • Strong on Data Lake and storage design for structured and unstructured data. • Exp on Data Lineage , Data Dictionary and governance Soft Skills • Should have excellent client interaction and presentation skills • Excellent English communication both written and verbal • Excellent thought leadership is required. • Should be capable to be a Mentor in building teams and enhance technical skills for the team. 
• Should be extremely good at internal and external stakeholder management Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit . Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Job Description: Pet Nutrition (PN) is the most vibrant category in the FMCG sector. As we work to transform this exciting category, a new program, Digital First, has been mobilized by the Mars Pet Nutrition (PN) leadership team. Digital First places pet parents at the center of all we do in Mars PN, while digitalizing a wide range of business process areas and creating future-fit capabilities to achieve ambitious targets in top-line growth, earnings, and pet parent centricity. The Digital First agenda requires digitizing at scale and requires you to demonstrate significant thought leadership, quality decision making, deep technical know-how, and an ability to navigate complex business challenges while building and leading a team of world-class data and analytics leaders. Are you passionate about Data and Analytics and excited about how it can completely transform the way an enterprise works? Do you have the strategic vision, technical expertise, and leadership skills to drive data-driven solutions? Do you want to work in a dynamic, fast-growing category? If so, you might be the ideal candidate for the role of Solution Architect Data Foundations, in the Enterprise Architecture function for Global Pet Nutrition (PN) at Mars. The Solution Architect Data Foundations is a strategic leadership role that oversees delivery of cross-product, transversal data capabilities that are foundational to our success. This role is accountable for the architecture, design, and optimization of data platforms, data architecture, data operations, data engineering, and the development of data assets/products for the multi-billion-dollar Pet Nutrition division's digital needs. Reporting to the Head of Enterprise Architecture, the person in this role will be a part of the Global PN Architecture of Tomorrow team. The role operates globally and partners with PN business and digital leaders across all functions.
'This role is an incubation role (temporary) with an estimated end date of December 2026. The purpose is to fast-track and support the build of this specific product. At the completion of the product, a permanent BAU role will open to maintain and support the product: the role will be permanent and will have a different job description more suited to the needs of the organisation at end state. If you are unable to secure the role by December 2026 you will be eligible for a separation package.' What are we looking for? Bachelor's degree or equivalent (IT degree preferred, in particular computer science, data science, or a related field) Industry-leading expertise in building and delivering data foundations, preferably in the CPG or retail industry. Established and deep understanding of a range of technology solutions and business processes across CPG functional capabilities Proven track record of delivering value through data products in a fast-paced, agile environment. Extensive knowledge of data principles, architecture/modeling, ingestion, and ETL principles and practices Extensive knowledge of Azure-based big data platforms; exposure to other clouds such as GCP is desirable. Experience architecting and designing data platforms such as data lakes, data warehouses, and the data pipelines and data services that support various types of data and analytics use cases. Prior experience of successfully leading large-scale data initiatives to support analytics, BI, and AI use cases. Prior experience in decentralized data management, specifically in data governance of managing fragmented data domains like sales, finance, and marketing.
Proven track record of establishing and leading a DDF design authority Proven track record of mastering new and emerging technologies Successful experience, established over several years, performing architecture leadership within a technology environment A strong customer-centric mindset, especially within an internal customer base, with the purpose of driving adoption and use Strategic thinking, problem solving, and innovation, with the ability to anticipate and navigate challenges and opportunities. Excellent at engaging with technical and functional leadership in a matrix organization. Ability to navigate a complex matrix organisation What will be your key responsibilities? Mars Principles: Live and exemplify the Five Principles of Mars, Inc. within self and team. Strategy and Thought Leadership: Work with PN Digital Leadership to create and execute the data foundations strategy and roadmap for the Pet Nutrition segment, in alignment with Pet Nutrition's business strategic priorities, goals, and analytics needs. Stakeholder Engagement: Collaborate with PN D&A leadership, PN product owners, and segment D&A leadership. You align with and support Enterprise Architecture efforts in Mars Petcare, corporate EA, GDO, and CISO teams. Architectural governance, review and assurance: you are accountable for effective and proportionate governance to approve or reject high-level solution designs, solution architectures, other Technology services, or substantial changes to existing services for compliance, including granting waivers where justified. You ensure that critical DDF design decisions and issues escalated by delivery teams across PN DT are reviewed and resolved promptly. You ensure that the governance, review, and assurance processes provide insight and information to drive future revisions of the strategy and roadmap, so that the Technology architecture continues to evolve to meet the changing needs of Mars PN.
You drive architectural governance, review, and assurance in partnership with the Technology Leadership Team, PN/Petcare/Corporate EAs, and colleagues in the wider Mars PN. Roadmap to achieve the target architecture: you are accountable for setting out a roadmap to move from the current-state architecture to the target architecture for DDF, taking account of the change portfolio and expected future change plans. You ensure that the roadmap is maintained to account for evolving requirements. Data as a Product: Bring technical mastery, knowledge, and acumen to lead the creation and deployment of scalable, secure data platforms and data assets tailored to our organization's evolving requirements while ensuring data quality and trust. Embed thought leadership in modeling data such that it is domain-driven, easily discoverable, and self-service enabled (where appropriate), with a strong-willed approach to avoid duplication and promote trust and integrity in data assets. What can you expect from Mars? Work with over 130,000 diverse and talented Associates, all guided by the Five Principles. Join a purpose-driven company where we're striving to build the world we want tomorrow, today. Best-in-class learning and development support from day one, including access to our in-house Mars University. An industry-competitive salary and benefits package, including company bonus. Mars is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. If you need assistance or an accommodation during the application process because of a disability, it is available upon request. The company is pleased to provide such assistance, and no applicant will be penalized as a result of such a request.
Apr 18, 2024
Full time
Position: Azure Data Engineer Contract Length: 6 months Location: Bristol, UK (5 days onsite) IR35 Status: Inside IR35 Rate: 500 - 650 per day Role Overview: As an Azure Data Engineer, you will play a pivotal role in designing, developing, and maintaining data solutions on the Azure platform. Your expertise will drive the success of data projects, ensuring optimal performance, scalability, and reliability. Key Responsibilities: Design, develop, and implement data pipelines and ETL processes on Azure. Optimize and troubleshoot data solutions to ensure performance and reliability. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions. Utilize Azure services such as Azure Data Factory, Azure Databricks, and Azure Synapse Analytics to build robust data architectures. Implement best practices for data governance, security, and compliance. Skills and Qualifications: Proven experience as a Data Engineer with expertise in Azure data technologies. Strong proficiency in SQL, Python, or other relevant programming languages. Hands-on experience with Azure Data Factory, Azure Databricks, and Azure Synapse Analytics. Knowledge of data modeling, warehousing, and ETL concepts. Familiarity with data visualization tools such as Power BI or Tableau. Excellent problem-solving skills and ability to thrive in a fast-paced environment. Additional Details: Contract length: 6 months Location: Bristol, UK (5 days onsite) IR35 status: Inside IR35 Rate: 500 - 650 per day
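The role above centers on designing and implementing ETL pipelines; the Azure services named (Data Factory, Databricks, Synapse) orchestrate the same extract-transform-load pattern that can be sketched vendor-neutrally. This is a minimal illustrative sketch only: the table name, sample records, and the in-memory SQLite store standing in for a warehouse are all invented here, not taken from the client's stack.

```python
import sqlite3

def extract(rows):
    """Extract: in production this would read from a source system
    (an API, blob storage, etc.); here it is just a static list."""
    return rows

def transform(rows):
    """Transform: normalise names and drop records with no price."""
    cleaned = []
    for name, price in rows:
        if price is None:
            continue  # data-quality rule: skip incomplete records
        cleaned.append((name.strip().lower(), round(price, 2)))
    return cleaned

def load(conn, rows):
    """Load: write the cleaned rows into a target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS products (name TEXT, price REAL)")
    conn.executemany("INSERT INTO products VALUES (?, ?)", rows)
    conn.commit()

source = [(" Widget ", 9.999), ("Gadget", None), ("Sprocket", 4.5)]
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(source)))
total = conn.execute("SELECT COUNT(*), SUM(price) FROM products").fetchone()
```

In a real Azure deployment each stage would typically map to a pipeline activity or notebook task, with the orchestration, retries, and scheduling handled by the platform rather than in application code.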
Apr 18, 2024
Contractor
Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive. An exciting opportunity within a financial services client for a DevOps Engineer / SRE / Site Reliability Engineer based in London. Role: DevOps Engineer / SRE / Site Reliability Engineer Location: London (2 days a week onsite) Duration: 6 Months Status: Inside IR35 Experienced and knowledgeable in AWS Cloud, with knowledge of EKS, Jenkins, DevOps, Terraform, Kubernetes, Docker, Helm, GitOps, and troubleshooting skills. Experience and skills required: Experience in backend development or data engineering. Hands-on experience with AWS services like S3, EC2, EMR. Experience with Kubernetes, Terraform, CI/CD, Jenkins Proficiency in SQL and experience with CDAP, Spark, Kafka. Experience building scalable ETL processes and workflows. Responsibilities: Develop and enhance data pipelines and ETL processes using CDAP on AWS infrastructure. Build data integration flows to migrate large datasets into the Snowflake data warehouse. Implement AWS infrastructure-as-code solutions for deployment automation. Instrument data pipelines and leverage monitoring for performance tuning and reliability. Work with data scientists to optimize data workflows and models on Databricks. Follow security best practices for access control, encryption, and auditing across data platforms. Participate in architecture reviews and technology selections. Continuously monitor and improve data platforms for scalability and costs. Candidates will ideally show evidence of the above in their CV to be considered.
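One responsibility listed above is instrumenting data pipelines and leveraging monitoring for reliability. A common building block for that is a retry wrapper that records how many attempts a flaky step needed, so the count can feed a metric or alert. The sketch below uses only the Python standard library; the function names, backoff values, and the simulated failure are all made up for illustration, not taken from the client's codebase.

```python
import time
from functools import wraps

def with_retries(max_attempts=3, backoff_s=0.01):
    """Retry a flaky pipeline step, exposing the attempt count for monitoring."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    result = fn(*args, **kwargs)
                    wrapper.attempts = attempt  # expose for metrics/alerting
                    return result
                except Exception:
                    if attempt == max_attempts:
                        raise  # give up after the final attempt
                    time.sleep(backoff_s * attempt)  # simple linear backoff
        return wrapper
    return decorator

calls = {"n": 0}

@with_retries(max_attempts=3)
def flaky_load():
    """Simulated load step that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "loaded"

result = flaky_load()
```

In practice the attempt count and step latency would be shipped to a monitoring backend (e.g. CloudWatch in an AWS setup like the one described) rather than stored on the wrapper, but the pattern of capturing them at the retry boundary is the same.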
Please be advised that if you haven't heard from us within 48 hours, then unfortunately your application has not been successful on this occasion; we may, however, keep your details on file for any suitable future vacancies and contact you accordingly. Pontoon is an employment consultancy and operates as an equal opportunities employer.
Apr 18, 2024
Contractor
A forward-thinking technology company working with major global brands is entering a period of growth within their new office location in Llandysul. They are currently seeking an experienced Data Engineer (with a focus on Power BI) to play a pivotal role in managing and optimising their data infrastructure. You'll leverage your technical expertise and join a supportive, open, and collaborative environment with a culture of learning, who are not afraid of making mistakes and thinking outside of the box. Your role will involve working alongside Data Scientists and Analysts to drive data-driven insights for their clients to make strategic decisions. You'll collaborate with stakeholders, design/maintain data pipelines and ETL processes, ensuring the accuracy and security of the data and identifying opportunities for automation. Experience required At least 2 years in Data Engineering with proven experience in Power BI development Knowledge in data analysis, modelling, and visualization Proficient in: Microsoft Power BI (Power Query, DAX, Power View) Strong knowledge of SQL and relational databases (e.g., MySQL, PostgreSQL) Skilled in at least one programming language, e.g., Python or Java Cloud platforms knowledge - AWS / Azure, and familiarity with data warehousing Applicants must be within commutable distance of Llandysul for 3 days a week working in office (working 2 days a week at home) and must have the right to work in the UK as we do not offer sponsorship. Please Apply Now to be considered or contact Rachael for a confidential chat: (url removed)
Apr 18, 2024
Full time
Senior Data Engineer - Inside IR35 Contract - London As a Senior Data Engineer, you will play a pivotal role in architecting, building, and maintaining our data infrastructure. Leveraging your expertise in data engineering, you will collaborate closely with cross-functional teams to design scalable data pipelines, optimise data workflows, and integrate advanced analytics solutions. Key Responsibilities: Design, develop, and deploy robust data pipelines and ETL processes to ingest, transform, and store large volumes of structured and unstructured data from diverse sources. Implement best practices for data modelling, storage, and retrieval to ensure scalability, reliability, and performance of our data infrastructure. Collaborate with data scientists and analysts to operationalise machine learning models and enable real-time data analytics for actionable insights. Work closely with stakeholders to understand business requirements, define data engineering solutions, and drive the adoption of data-driven decision-making. Requirements: Proven experience as a Data Engineer or similar role, with a focus on designing and implementing scalable data infrastructure solutions. Strong proficiency in programming languages such as Python or Scala Hands-on experience with cloud platforms, ideally AWS or Azure Solid understanding of tools including Spark, Hadoop, and dbt Experience within FMCG, Retail, or consumer-facing businesses would be beneficial
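The responsibilities above include ingesting large volumes of structured and unstructured data while ensuring reliability of the data infrastructure. A standard best practice implied there is validating records against a lightweight schema before load, so malformed data is quarantined rather than propagated. The sketch below is a generic, stdlib-only illustration of that idea; the field names and rules are invented for the example, not drawn from any particular employer's pipeline.

```python
def validate(record, required, types):
    """Return a list of data-quality violations for one record."""
    errors = []
    for field in required:
        if record.get(field) is None:
            errors.append(f"missing:{field}")  # required field absent or null
    for field, expected in types.items():
        value = record.get(field)
        if value is not None and not isinstance(value, expected):
            errors.append(f"type:{field}")  # present but wrong type
    return errors

# Hypothetical schema for an orders feed
REQUIRED = ["order_id", "amount"]
TYPES = {"order_id": str, "amount": (int, float)}

records = [
    {"order_id": "A1", "amount": 12.5},
    {"order_id": "A2", "amount": "12.5"},  # wrong type: amount is a string
    {"amount": 3.0},                       # missing order_id
]
good = [r for r in records if not validate(r, REQUIRED, TYPES)]
bad = [r for r in records if validate(r, REQUIRED, TYPES)]
```

In a production stack the same checks would typically live in a framework layer (dbt tests, Spark schema enforcement, or a tool like Great Expectations) rather than hand-rolled functions, but the pass/quarantine split is the common core.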
Apr 18, 2024
Full time
Process Control (OT) Systems Engineer
Location: Shotton, North West including North Wales, GB
Job Title: Process Control (OT) Systems Engineer
Department: Computer Operations
Location: Shotton (CH5)
Salary: £42,758 - £48,867 + Annualised Hours
Closing Date: Sunday 7th April 2024

At Tata Steel, we're committed to excellence in everything we do. As a global leader in the steel industry, we're constantly pushing boundaries and innovating to meet the needs of our customers and communities. Join us in our mission to create a sustainable future through cutting-edge technology and unparalleled expertise.

What you will do
We're currently seeking a talented Process Control (OT) Systems Engineer to join our Colors team at our site in Shotton. Reporting to the Computer Operations Manager, you'll play a pivotal role in supporting and enhancing our Operational Technology infrastructure, ensuring seamless operations and driving continuous improvement.

Key Responsibilities:
Prioritize safety above all else, fostering a secure working environment and ensuring adherence to safety protocols.
Oversee the management of infrastructure and data flow across all system levels, minimizing production downtime due to critical interface failures.
Lead and manage specialist IT and process control system maintenance contractors, ensuring compliance with technical standards and safety protocols.
Maintain and develop process control computer systems and associated software for real-time product tracking and material processing data.
Ensure compliance of all OT assets with Company Cybersecurity policy, in alignment with HSEOG86 standards.

This is an exceptional opportunity to join a dynamic team working across a diverse range of OT technologies. We're dedicated to investing in the ongoing training and development of our staff, offering regular specialist training on emerging technologies and ample opportunities for career progression.

What you will need
We're seeking a Process Control (OT) Systems Engineer who holds a degree in a computer-related discipline and is CCNA certified, with expertise in switching, routing, wireless, and security. Beyond academic achievements, you'll possess a solid understanding of VMware and the Windows Server operating system. Your experience in applying cybersecurity measures to industrial automation and control systems, aligned with HSEOG86 standards, will be invaluable. Familiarity with SAP business/manufacturing management software systems, WinCC, iFIX SCADA and, ideally, Visual Basic is highly desirable. Strong communication skills are a must, as you'll influence behaviours and manage contractors across various disciplines. Your ability to thrive under pressure, swiftly resolving faults to minimize production halts, is crucial. Additionally, you'll play a pivotal role in maintaining compliance with our company's cybersecurity policies, ensuring the integrity and security of our operational technology assets.

What we can offer you
Tata Steel UK offers its employees a significant benefits package. For this role, you will benefit from:
A market-competitive salary
35 days' holiday per annum
Annual Pay Review
Quarterly Bonus Scheme, subject to business performance
Private Healthcare Scheme (individual cover)
One of the UK's leading defined contribution pension schemes (10% employer contribution / 6% employee contribution)
We also offer an extensive list of lifestyle benefits, including free onsite parking at all of our sites, an employee assistance programme, and an employee discount scheme for companies including Vodafone, Jaguar Land Rover and various local services.

Why us?
Tata Steel is one of the world's top 10 steel producers. The combined group has an annual aggregate crude steel capacity of more than 33 million tonnes with approximately 80,000 employees across four continents. We're part of the Tata Group, one of the largest, most diverse conglomerates in the world, with businesses in the UK including Tata Steel, Jaguar Land Rover and Tetley Tea. Sustainability is at the very heart of what we do, and we are dedicated to managing our operations responsibly and to continuously improving our performance. Innovating for tomorrow, making a positive impact today.
Apr 18, 2024
Full time
Are you ready to be a game-changer in the world of sports data? Do you want to be part of a team that's rewriting the playbook on sports analytics? This company is on a mission to revolutionise the beautiful game with cutting-edge data technology, and they need your skills to make it happen. Step onto the field with your new team as Data Engineer and lead the charge in sports data innovation. From scouting talent to optimising game strategy, they're building the tools that will shape the future of sports. Join their sports analytics team today and be at the forefront of this exciting journey!

They're seeking a star player in the data engineering game. As a sports Data Engineer, you'll be instrumental in designing and building the data infrastructure that will power a game-changing analytics platform. Your goal? To ensure your team has the data they need to knock it out of the park!

Key Responsibilities as Data Engineer
Craft a winning data architecture that's as solid as a defender's tackle.
Score big with efficient data pipelines for seamless extraction, transformation, and loading.
Keep the data flowing smoothly with top-notch reliability and performance.
Play offence with optimisation techniques like data partitioning and indexing.
Team up with external data providers to bring in the stats that matter most.

Star player key qualities
5+ years of experience in data engineering, with a passion for sports.
Pro-level skills in Python and Bash scripting.
Experience dribbling through ETL processes with tools like Prefect or Airflow.
Goal-scoring knowledge of containerisation and orchestration tools like Docker and Kubernetes.
Captain of the cloud with expertise in AWS, Azure, or GCP.

This is your shot to join a winning team, and you can do it all from the comfort of your home pitch! This remote Data Engineer opportunity awaits, so lace up your boots and get ready to make history in the world of sports data!
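To illustrate the kind of pipeline work the role describes: below is a minimal, purely hypothetical sketch of the extract-transform-load shape that orchestrators like Prefect or Airflow schedule and monitor at scale. All function names and sample data are invented for illustration; a real pipeline would pull from a provider API and load into a warehouse.

```python
# Illustrative toy ETL pipeline in plain Python: the extract -> transform
# -> load stages that a workflow engine would orchestrate. Hypothetical
# data only; a real "extract" would call an external stats provider.

def extract():
    # Stand-in for fetching raw match stats from a data provider.
    return [
        {"player": "A", "shots": 4, "goals": 2},
        {"player": "B", "shots": 5, "goals": 1},
    ]

def transform(rows):
    # Derive a per-player conversion-rate metric from the raw counts.
    return [{**row, "conversion": row["goals"] / row["shots"]} for row in rows]

def load(rows, store):
    # Stand-in for a warehouse write: an in-memory dict keyed by player.
    for row in rows:
        store[row["player"]] = row
    return store

def run_pipeline():
    return load(transform(extract()), {})
```

In Prefect or Airflow each stage would become a task with retries and scheduling around it; the data flow itself is no more than this.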
Eligo Recruitment is acting as an Employment Business in relation to this vacancy. Eligo is proud to be an equal opportunity employer dedicated to fostering diversity and creating an inclusive and equitable environment for employees and applicants. We actively celebrate and embrace differences, including but not limited to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran status, and disability. We encourage applications from individuals of all backgrounds and experiences and all will be considered for employment without discrimination. At Eligo Recruitment diversity, equity and inclusion is integral to achieving our mission to ensure every workplace reflects the richness of human diversity.
Apr 18, 2024
Full time
End Date: Friday 19 April 2024
Salary Range: £78,849 - £87,610
We support agile working.
Agile Working Options: Hybrid Working, Job Share

Job Description
Data Security Engineer - Data Resilience
Lloyds Banking Group
London - hybrid working, two days per week in the office and the rest from home.
Salary & Benefits: £78,849 to £96,371 per annum, plus annual personal bonus, 15% employer pension contribution, private medical insurance, and 30 days' holiday plus bank holidays.

About the Role
As the Data Resilience Security Engineer, you'll focus on data security: assuring that the Group safeguards data and associated assets from vulnerabilities and threats that could compromise their integrity and availability, leading to customer harm. The role reports to the Data Resilience Technical Lead and requires ambitious individuals with a proactive, can-do attitude and a solution-oriented approach to deliver at pace.

Key Responsibilities:
Be the primary security contact for data resilience queries.
Provide input and direction on security assessments to identify gaps that could lead to IBS Impact Tolerance thresholds being breached.
Develop security initiatives and guidance for Operational Resilience, the Chief Security Office, and change frameworks.
Oversee the development of security controls and collaborate with platform teams and the Chief Security Office to remediate security gaps.
Perform horizon scanning and provide input to group policies and procedures.
Support and grow team members in the security domains of data resilience.
Present data resilience security gaps to peers and senior collaborators.

What we're looking for
We'd welcome applicants from diverse cultural and technological backgrounds; however, financial services exposure will be important for this position. We'll need to see evidence of the following in your CV:
Prior experience working at mid to senior level within a relevant role.
Experience of security scanning and testing, including Qualys, ethical hacking, SAST and DAST.
Experience of vulnerability management (CVSS).
Hands-on experience of modern security architecture, along with diagnostic and monitoring tooling.
Proficiency in cryptographic key management and encryption deployments.
Knowledge of ISO 27001/27002, NIST and/or CIS.
Experience of working with SIEM tooling (Splunk) or similar.
Knowledge of Endpoint Detection and Response tooling (SentinelOne).
Knowledge of zero trust security for applications.
Good experience in Identity and Access Management.
Knowledge of operating systems (Windows, Linux, z/OS, CentOS, Unix, Ubuntu and Solaris).
Familiarity with analytic platforms and databases such as MSSQL, Kafka, S3, etc.
Experience of ransomware attack techniques and mitigation strategies.
Exposure to security concepts (MITRE, Kill Chain).
Experience of incident response (triage, classification, investigation, and escalation).
Financial services experience and exposure to some (but not all) of: payments, cards, pensions, insurance, markets, trade & settlement, logon customer journeys.
Solid verbal and written communication skills to discuss and describe the target architecture with stakeholders.

It's great if you have:
Public cloud (AWS, GCP, Azure) experience.
Knowledge of Extract, Transform & Load (ETL), Disaster Recovery, or backup-and-restore domains.
Prior experience supporting or remediating resilience issues on assets such as batch, messaging queues, third-party data connections, data recovery & backup, data vaulting, and data integrity.
Technical knowledge of FCA, PRA and EBA guidelines on operational resilience.
CISSP/CSSP/CISM or equivalent experience.
Experience in financial services is a nice to have but not mandatory.

About working for us
We want our people to feel that they belong and can be their best, regardless of background, identity or culture. We were one of the first major organisations to set goals on diversity in senior roles, create a menopause health package, and a dedicated Working with Cancer initiative. We're disability confident, so if you'd like reasonable adjustments to be made to our recruitment processes, just let us know. Ready for a career where you can have a positive impact as you learn, grow and thrive? Apply today and find out more.

At Lloyds Banking Group, we're driven by a clear purpose: to help Britain prosper. Across the Group, our colleagues are focused on making a difference to customers, businesses and communities. With us you'll have a key role to play in shaping the financial services of the future, whilst the scale and reach of our Group means you'll have many opportunities to learn, grow and develop.

We keep your data safe. We'll only ever ask you to provide confidential or sensitive information once you have formally been invited to an interview or accepted a verbal offer to join us, which is when we run our background checks. We'll always explain what we need and why, with any request coming from a trusted Lloyds Banking Group person.

We're focused on creating a values-led culture and are committed to building a workforce which reflects the diversity of the customers and communities we serve. Together we're building a truly inclusive workplace where all of our colleagues have the opportunity to make a real difference.
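On the vulnerability-management requirement above: CVSS assigns each vulnerability a base score from 0.0 to 10.0, and the v3.1 specification maps score ranges to qualitative severity ratings. A triage script might encode that mapping like this (the function name is illustrative, but the score bands follow the published specification):

```python
# CVSS v3.1 qualitative severity rating scale (per the FIRST spec):
# 0.0 None, 0.1-3.9 Low, 4.0-6.9 Medium, 7.0-8.9 High, 9.0-10.0 Critical.

def cvss_severity(score: float) -> str:
    """Map a CVSS v3.1 base score to its qualitative severity rating."""
    if not 0.0 <= score <= 10.0:
        raise ValueError("CVSS base scores range from 0.0 to 10.0")
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"
```

Scanners such as Qualys report these base scores directly, so a bucketing like this is typically how findings are routed to remediation SLAs.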
Apr 18, 2024
Full time
End Date: Thursday 18 April 2024
Salary Range: £78,849 - £87,610
We support agile working.
Agile Working Options: Hybrid Working, Job Share

Job Description
Database Engineer - Data Resilience
Lloyds Banking Group
London - hybrid working, two days per week in the office and the rest from home.
Salary & Benefits: £78,849 to £96,371 per annum, plus annual personal bonus, 15% employer pension contribution, private medical insurance, and 30 days' holiday plus bank holidays. We also offer flexible working hours, agile working practices, and flexibility to suit your needs to ensure a good work-life balance.

About us
Data is at the heart of Lloyds Banking Group, enabling the delivery of innovative financial services to our 26 million customers and helping Britain prosper. A resilient organisation instils confidence and trust with customers, staff and regulators, reduces risk, and protects the Group from regulatory censure and fines. The Data Resilience team is a new chapter within the Chief Data and Analytics Office. We have the responsibility to facilitate the identification, end-to-end data flow mapping, and assessment of IBS critical technical assets to establish the Data Resilience position and proactively mitigate Group exposure to data loss or corruption events.

Background
Disruptions are inevitable, and regulators expect financial institutions to take the necessary steps to protect data and recover from severe but plausible data loss or corruption events, such as a cyber-attack, to meet IBS impact tolerance thresholds. Data Resilience's purpose is to proactively protect the integrity, availability, and security of our data to mitigate the risk of disruption to the Group's Important Business Services. The Data Resilience team has three main objectives to achieve this:
Identify and define assets that are critical to the delivery of Important Business Services.
Map the end-to-end data flow of critical assets from source to consumer for each Important Business Service.
Assess the resilience maturity of critical assets to identify gaps that could impact the Group's ability to maintain service within ITOL.

About the Role
As the Data Resilience Data Engineer, you'll focus on ensuring IBS critical data is stored, handled, and processed effectively to maintain its availability, confidentiality and integrity and fulfil the Group's Important Business Services. This will safeguard our critical data and associated assets from vulnerabilities and threats that could compromise their integrity and availability, leading to customer harm. The role requires ambitious individuals with a proactive, can-do attitude and a solution-oriented approach to deliver at pace.

Key Responsibilities:
Be the technical data reliability point of contact for data resilience.
Provide input and direction on database resilience assessments to identify gaps that could lead to IBS Impact Tolerance thresholds being breached.
Develop database optimisation initiatives to drive improvement, and guidance for Operational Resilience and change frameworks.
Oversee the development of database controls and collaborate with platform teams and the Chief Security Office to remediate security gaps.
Embed proactive database hygiene, including ROT data, compression rates, effective maintenance plans, and compaction and reorganisation across IBS critical applications.
Ensure the Data Resilience Information Asset Register (IAR) for databases is accurately maintained.
Perform horizon scanning and provide input to group policies and procedures.
Review backup and recovery procedures for IBS critical databases.
Support and grow team members in the database domains of data resilience.
Present data resilience database gaps to peers and senior stakeholders.

What we're looking for
We'd welcome applicants from diverse cultural and technological backgrounds; however, financial services exposure will be important for this position. We will need to see evidence of the following in your CV:
Career experience working as a Senior Database Administrator, including Oracle and MS SQL (DB2 and IMS desirable).
Knowledge of traditional operating systems (Windows, Linux, z/OS, F5).
Experience in Data Fabric and Data Mesh concepts, including Systems of Record, Engagement and Insight strategies.
Experience in database management and optimisation, e.g. reorganisation and rebuild.
Strong knowledge of database backup and recovery procedures.
Experience of incident response (triage, classification, investigation, and escalation).
Proficiency in database encryption at rest, in transit and in memory.
Financial services experience and exposure to some (but not all) of: payments, cards, pensions, insurance, markets, trade & settlement, logon customer journeys.
Solid verbal and written communication skills to discuss and describe the target architecture with technical and non-technical stakeholders.

It's great if you have:
Knowledge of Extract, Transform & Load (ETL), Disaster Recovery, or backup-and-restore domains.
Public cloud data management experience, including Databricks, MongoDB, CockroachDB, GCP Dataproc and BigQuery.
Experience in Data Mesh, Enterprise Data Hub (EDH) and Enterprise Data Warehouse (EDW).
Prior experience supporting or remediating resilience issues on assets such as batch, messaging queues, third-party data connections, data recovery & backup, data vaulting, and data integrity.

About working for us
We want our people to feel that they belong and can be their best, regardless of background, identity or culture. We were one of the first major organisations to set goals on diversity in senior roles, create a menopause health package, and a dedicated Working with Cancer initiative. We're disability confident, so if you'd like reasonable adjustments to be made to our recruitment processes, just let us know. Ready for a career where you can have a positive impact as you learn, grow and thrive? Apply today and find out more.

At Lloyds Banking Group, we're driven by a clear purpose: to help Britain prosper. Across the Group, our colleagues are focused on making a difference to customers, businesses and communities. With us you'll have a key role to play in shaping the financial services of the future, whilst the scale and reach of our Group means you'll have many opportunities to learn, grow and develop.

We keep your data safe. We'll only ever ask you to provide confidential or sensitive information once you have formally been invited to an interview or accepted a verbal offer to join us, which is when we run our background checks. We'll always explain what we need and why, with any request coming from a trusted Lloyds Banking Group person.

We're focused on creating a values-led culture and are committed to building a workforce which reflects the diversity of the customers and communities we serve. Together we're building a truly inclusive workplace where all of our colleagues have the opportunity to make a real difference.
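On the backup-and-recovery knowledge this role asks for: the essential discipline is that a backup only counts once a restore from it has been verified. As a toy analogue of that drill (the production engines here would be Oracle or MS SQL, not SQLite), Python's standard-library sqlite3 module can sketch the backup-then-verify loop end to end. The schema and data are hypothetical.

```python
# Illustrative only: an online backup followed by a verification query,
# using Python's stdlib sqlite3 as a stand-in for a production engine.
import sqlite3

def backup_and_verify():
    # Source database with some hypothetical critical data.
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
    src.executemany("INSERT INTO accounts VALUES (?, ?)",
                    [(1, 100.0), (2, 250.0)])
    src.commit()

    # Take an online backup into a second connection (the restore target).
    dst = sqlite3.connect(":memory:")
    src.backup(dst)

    # Recovery is only real once verified: compare row counts on both sides.
    (src_rows,) = src.execute("SELECT COUNT(*) FROM accounts").fetchone()
    (dst_rows,) = dst.execute("SELECT COUNT(*) FROM accounts").fetchone()
    return src_rows, dst_rows
```

A production procedure would additionally checksum the data and time the restore against the recovery objectives the IBS impact tolerances imply.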
Apr 18, 2024
Full time