Data Engineer - Liverpool / Hybrid - £40,000 to £50,000 + Bonus

We have partnered with an organisation that prides itself on being the best at what it does: a business specialising in the transportation industry. The company is recruiting a Data Engineer to make a difference by using business data to build dashboards and reports in Power BI and Qlik Sense and to carry out data transformations. The position is a great opportunity to take a step towards a managerial role, working alongside the Solutions Architect to make a real impact on the company. The Data Engineer position operates on a hybrid structure of three days from home and two onsite per week.

Essential experience for the Data Engineer:
- 3+ years of experience as a Data Engineer or in a similar position.
- Strong understanding of Power BI.
- Extensive knowledge of Qlik Sense.
- Experience with Azure data services (Data Lake and Data Warehouse).
- Strong people skills, the ability to influence, and attention to detail.

Responsibilities of the Data Engineer:
- Visualise data through dashboards and reports using Power BI.
- Produce quality code to technical business criteria.
- Design, develop, and test scripts to import data from test dashboards and business systems to meet client requirements.
- Support end-users as required with existing technical content and developed solutions.

To discuss this opportunity further, APPLY NOW for a confidential conversation with your VIQU Consultant. For additional information, contact Dan Freeman. If you refer someone ideal for this role, VIQU offers an introduction fee of up to £1,000 once your referral starts work with our client (terms apply). Stay updated on opportunities, technology, and recruitment news by following 'VIQU IT Recruitment' on LinkedIn.
Apr 19, 2024
Full time
BI Developer
Location: London, UK - Hybrid Working
Salary offering: £50K - £60K per annum
Fixed Term Contract: 12 - 18 Months

My client, a leading regulatory body in professional services, is looking for an experienced BI Developer to join their growing Data & Analytics team. In this role, you will be responsible for developing and enhancing their data warehouse and BI capabilities, delivering critical reporting and analytics to support the business.

Key responsibilities include:
- Integrating new data sources and enabling reporting on the client's data
- Performing source-to-target mappings and aligning data structures to enterprise data models
- Designing and implementing data warehouse solutions, including data flows, ETL processes, and data models
- Developing interactive dashboards and reports using tools like Power BI
- Ensuring data quality, governance, and security standards are met
- Collaborating with business analysts and stakeholders to understand requirements
- Leading the integration of new data sources, enhancing the data warehouse, and driving the delivery of critical business intelligence and reporting

The ideal candidate will have:
- Extensive experience as a BI Developer or equivalent, with a strong focus on data engineering and analytics
- In-depth knowledge of the Microsoft BI stack, including SSIS, SQL Server, and Power BI
- Proven skills in data warehousing, ETL, data modelling, and data analysis
- Hands-on experience with modern BI tools and Azure data technologies
- Deep technical expertise in data engineering and BI development, combined with strong business acumen and stakeholder management skills, to deliver high-impact data and analytics solutions
- Experience in migrating to Dynamics 365 (advantageous)

Morgan Hunt is a multi-award-winning recruitment business for interim, contract and temporary recruitment and acts as an Employment Agency in relation to permanent vacancies. Morgan Hunt is an equal opportunities employer.
Job suitability is assessed on merit in accordance with the individual's skills, qualifications and abilities to perform the relevant duties required in a particular role.
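The source-to-target mapping responsibility described above is essentially a column-level translation between a source extract and the warehouse model. A minimal sketch in Python (illustrative only — the role would implement this in SSIS/SQL Server, and all column names here are invented):

```python
# Hypothetical source-to-target mapping: each target column is defined by
# the source field it comes from plus an optional transformation.
from datetime import date

MAPPING = {
    # target column    (source field, transform)
    "customer_key":    ("CustID",  int),
    "full_name":       ("Name",    str.strip),
    "registered_date": ("RegDate", date.fromisoformat),
}

def map_row(source_row: dict) -> dict:
    """Apply the source-to-target mapping to one source record."""
    return {target: transform(source_row[field])
            for target, (field, transform) in MAPPING.items()}

source = {"CustID": "1042", "Name": "  Jane Doe ", "RegDate": "2024-03-01"}
print(map_row(source))
# → {'customer_key': 1042, 'full_name': 'Jane Doe', 'registered_date': datetime.date(2024, 3, 1)}
```

In a real warehouse load the mapping table itself would live in metadata (and feed the data dictionary), not in code, but the shape of the translation is the same.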
Apr 19, 2024
Full time
Job Title: Data Engineer
Location: Manchester
Package: £40,000 - £55,000 + Benefits
Type: Permanent

Sanderson Recruitment is recruiting a Data Engineer on behalf of our leading insurance client based in Manchester.

Company Overview:
Are you interested in joining a leading insurance company headquartered in the UK? Established over a decade ago, my client specialises in providing a range of insurance services tailored to meet the diverse needs of their customers. With a primary focus on the motor insurance market, they offer comprehensive car insurance directly through their brand, as well as underwriting services to other insurers. They also provide various supporting services related to insurance, including financing, distribution, and legal assistance. My client's commitment to utilising technology and data-driven strategies ensures they deliver high-quality products and services to their customers while mitigating risks effectively.

Role & Responsibilities:
As a Data Engineer, you will be actively involved in technical tasks, focusing on constructing data solutions for projects and ongoing data products. Your responsibilities will include:
- Developing secure, efficient data pipelines of varying complexity, integrating data from diverse sources, both on-premise and off-premise, internal and external.
- Ensuring data integrity and quality by cleansing, mapping, transforming, and optimising data for storage, aligning with business and technical requirements.
- Incorporating data observability and quality measures into pipelines to facilitate self-testing and early detection of processing issues or discrepancies.
- Constructing solutions to transform and store data across different storage areas, including data lakes, databases, and reporting structures, spanning data warehouse, Business Intelligence systems, and analytics applications.
- Designing physical data models tailored to business needs and storage optimisation, emphasising reusability and scalability.
- Conducting thorough unit testing of your own code, and peer testing, to maintain high quality and integrity.
- Documenting pipelines and code comprehensively to ensure transparency and facilitate understanding.
- Adhering to coding standards, architectural principles, and release management processes to ensure code safety, quality, and compliance.
- Providing guidance and support to Associate Data Engineers through coaching and mentoring.
- Developing BI solutions of varying complexity, including data marts, semantic layers, and reporting and visualisation solutions using recognised BI tools such as Power BI.

Essential Requirements:
To thrive in this role, candidates must have:
- Demonstrated proficiency in PySpark and SQL development, with a strong interest in advancing a career in data engineering.
- Enthusiasm for applying Azure best practices to facilitate seamless daily data delivery from source to consumption.
- A talent for translating customer requirements into actionable designs and timely delivery.
- 2-5 years of experience designing and implementing end-to-end data solutions.
- Proficiency in SQL Server and Azure technologies such as Data Factory and Synapse, along with expertise in associated ETL technologies.
- Experience working with large, event-based datasets in an enterprise setting.
- Familiarity with testing techniques and tools to ensure data quality and integrity.
- Strong interpersonal and communication skills, with an ability to build strong relationships.
- Active engagement in the data community and a keen interest in using data to drive business value.
- A comprehensive understanding of the complete data life cycle.
- Experience with Continuous Integration/Continuous Delivery (CI/CD) practices.
- A proven track record of thriving in agile environments and self-managing teams.

This role offers an exciting opportunity to drive data innovation within a forward-thinking organisation. If you're ready to make a direct and meaningful contribution to my client's dynamic work environment, apply now.
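The data observability and quality measures mentioned in this role usually come down to pipeline steps that validate their own output before loading and fail loudly when something drifts. A minimal sketch in plain Python (the client's stated stack is PySpark on Azure; the field names and 10% threshold here are illustrative assumptions):

```python
def cleanse(rows):
    """Drop records that fail basic quality rules, returning the survivors
    plus simple observability metrics for the pipeline run."""
    good, rejected = [], 0
    for row in rows:
        # Quality rules (assumed): policy_id present, premium a positive number.
        premium = row.get("premium")
        if row.get("policy_id") and isinstance(premium, (int, float)) and premium > 0:
            good.append(row)
        else:
            rejected += 1
    metrics = {"input": len(rows), "loaded": len(good), "rejected": rejected}
    # Early detection: abort the run if too much data is being discarded.
    if metrics["input"] and rejected / metrics["input"] > 0.10:
        raise ValueError(f"Rejection rate too high: {metrics}")
    return good, metrics

rows = [{"policy_id": "P1", "premium": 320.0},
        {"policy_id": "P2", "premium": 210.5}]
good, metrics = cleanse(rows)
print(metrics)  # → {'input': 2, 'loaded': 2, 'rejected': 0}
```

Emitting the metrics alongside the cleansed data is what makes the step observable: a downstream monitor can alert on the rejection rate without re-reading the source.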
Apr 19, 2024
Full time
Trading since 1989, e.surv Chartered Surveyors is the UK's number one residential surveyor and the largest provider of property risk expertise and residential surveying services. To put it into numbers, we complete more than one property inspection every 12 seconds and employ over 600 surveyors from Land's End to John O'Groats and Northern Ireland. This gives us the flexibility to offer nationwide coverage combined with invaluable local knowledge. We're part of the LSL Property Services Group PLC, which includes household names Your Move and Reeds Rains as well as the mortgage network PRIMIS. We work with lenders, intermediaries, social housing entities and estate agents in addition to private customers.

Reporting to the Head of Data, this is an exciting time to join the Data and Analytics team, where we truly believe in the value of being a data-driven organisation. We are searching for a good communicator and facilitator with very good stakeholder management skills, capable of understanding and refining non-technical descriptions into actionable stories. Responsibilities include building and maintaining data pipelines, models, automation routines and data warehouses, while providing consultation and guidance on integrating new data sources into the warehouse and facilitating data usage.

Main Accountabilities:
- Create and maintain ETL pipelines.
- Automate data management tasks.
- Create and provide observability.
- Document data pipelines and storage.
- Maintain data dictionaries and lineage diagrams.
- Consult on data source integration and usage.

Knowledge and Expertise (specialist knowledge and expertise required to undertake the role):
Essential:
- Data architecture and management.
- SQL and T-SQL understanding and programming.
- Knowledge and use of SQL Server Integration Services.
- Source code management.
- MSSQL Server.
- Azure.
Desirable:
- Power BI.
- CI/CD / DataOps processes and environments.
- Git.
- Python.

Experience, Qualifications and other requirements specific to the role:
Essential:
- Experience creating and maintaining complex SQL code.
- Experience in all stages of the SSIS ETL package lifecycle.
- Creating and maintaining data dictionaries.
- Master data management.
- At least two years' experience in a data management role.
Qualifications:
- Microsoft SQL (nice to have).
- MS Power BI.

Apply:
If you feel you match our requirements and are looking for your next career challenge, or for a confidential discussion on the full details of this role, please contact Mike Brett. LSL Property Services are dedicated to protecting your data - our Recruitment Privacy Notice can be viewed HERE.

PRE-EMPLOYMENT SCREENING - All of our employees have to pass a Criminal Records Disclosure and Credit Referencing process in order to work with our lender clients. If you are unsure about this, ask the team and we'll be happy to explain the process.
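The "creating and maintaining data dictionaries" requirement above can be partly automated from the database's own catalogue. A self-contained sketch using SQLite's metadata (standing in for the sys.columns / INFORMATION_SCHEMA queries one would run on the MSSQL Server this role actually uses):

```python
import sqlite3

def data_dictionary(conn):
    """Build a column-level data dictionary from the database catalogue."""
    entries = []
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        # PRAGMA table_info yields (cid, name, type, notnull, dflt_value, pk)
        for _cid, name, ctype, notnull, _default, pk in conn.execute(
                f"PRAGMA table_info({table})"):
            entries.append({"table": table, "column": name, "type": ctype,
                            "nullable": not notnull, "primary_key": bool(pk)})
    return entries

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE valuation (id INTEGER PRIMARY KEY, "
             "property_ref TEXT NOT NULL, value REAL)")
for entry in data_dictionary(conn):
    print(entry)
```

Generated entries still need the human half of the dictionary (business definitions, ownership), but deriving the structural half from the catalogue keeps it from drifting out of date.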
Apr 19, 2024
Full time
This global pharmaceutical services provider is looking for a Data Engineer to work within their growing Business Intelligence technical service function. You will support the technical development of their cloud data platforms, offering options for near real time process reporting solutions.

Client Details:
Global pharmaceutical services provider.

Description:
You will work on designing and developing the company's data acquisition strategy, allowing for near real time process reporting and full support of contractual reporting to customers.

Key Responsibilities:
- Monitor and maintain ETL processes to ensure the continuation of an accurate data reporting platform for the business.
- Explore solutions for optimising the performance of the strategic data platform pipelines, following development design principles.
- Develop and scale the cloud data platforms utilising CI/CD pipelines and Terraform.
- Help identify and implement the approach to modelling our business processes in a manner that allows for agile implementation and improvement.
- Technical implementation of process protocols, quality measures and outcome definitions that allow us to use our data as real-world evidence.
- Work with the analytics team to understand and build solutions that combine internal insight, patient outcomes, and external data feeds - creating an opportunity to explore big data and machine learning algorithms.
- Support the implementation of the security controls that ensure patient data privacy and full compliance with the Data Protection Act / GDPR.
- Help develop and implement the strategic data platform for insights, including support for NHS national KPIs and standard operational/financial reporting, standard pharma offerings, bespoke customer solutions, commercial/financial analytics and planning, and internal self-service reporting.
- Identify key processes where BI can link our business operations with our patients.
- Help develop and implement the data acquisition strategy, ensuring that data is available when needed, accurate, and understood.

Key Technical Skills required (Profile):
- Extensive experience with Azure data platform technologies, including Data Lake Gen2, Synapse, Analysis Services, and Power BI.
- Extensive experience with Azure Data Factory and Databricks.
- Extensive SQL knowledge (Microsoft SQL Server 2005+) and experience with relational databases and query authoring (T-SQL).
- Microsoft SQL Server Integration Services (SSIS), including ETL/ELT design and development experience.
- Microsoft DevOps source control and development lifecycle software.
- Developing and maintaining objects within data warehouses / lakehouses.
- Experience working with business data domains / data as a product (Data Mesh).
- Metadata management / data quality.

Job Offer:
- Opportunity to work on a major data engineering programme.
- Opportunity to work with a global organisation.
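Near real time reporting of the kind this role describes typically relies on incremental rather than full loads; a common pattern is a high watermark tracked per pipeline. A minimal sketch in plain Python (standing in for the Azure Data Factory / Databricks implementation; the field names are illustrative):

```python
def incremental_load(source_rows, watermark):
    """Pick up only records modified since the last successful load,
    then advance the watermark for the next run."""
    new_rows = [r for r in source_rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

source = [{"id": 1, "modified": "2024-04-18T09:00"},
          {"id": 2, "modified": "2024-04-19T07:30"},
          {"id": 3, "modified": "2024-04-19T08:15"}]

# First run picks up everything after the stored watermark...
rows, wm = incremental_load(source, "2024-04-18T12:00")
print([r["id"] for r in rows], wm)  # → [2, 3] 2024-04-19T08:15

# ...and a re-run with the advanced watermark loads nothing (safe to retry).
rows, wm = incremental_load(source, wm)
print(rows)  # → []
```

ISO-8601 timestamps compare correctly as strings, which keeps the sketch dependency-free; a production pipeline would persist the watermark in a control table between runs.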
Apr 19, 2024
Full time
IT Engineer (Logistics Operations)

*Applicants should be aware this role is fully onsite and will require you to take part in a shift rota covering 6am-12:30am Sunday-Friday (more details will be provided during the initial screening call).

Are you passionate about optimising logistics operations? Do you thrive in a dynamic environment where problem-solving and technical expertise are key? Our client is seeking a Logistics Operations Engineer to join their team in the Derby area.

Role Overview:
As a Logistics Operations Engineer, you'll play a pivotal role in ensuring the smooth functioning of the logistics operations, meeting both customer and company specifications. Your responsibilities will encompass supervisory, technical, and administrative tasks, ensuring operational stability and efficiency.

Key Responsibilities:
- Identify and address operational issues promptly
- Lead troubleshooting procedures including Incident, Problem, and Change Management
- Drive continuous improvement processes to enhance stability and efficiency
- Anticipate and mitigate operational challenges
- Establish and promote best practices within the team
- Provide operational management insights based on data analysis

Skills and Qualifications:
- NVQ/BTEC/C&G Level 3 in an IT discipline, or equivalent experience
- Technical knowledge of Linux and Windows operating systems
- Strong knowledge of SQL databases
- Experience writing scripts
- The ability to "talk tech" to non-technical stakeholders
- Desirable: understanding of warehouse operations and supply chain management
- Desirable: experience in software application support and incident investigation
- Desirable: familiarity with programming languages (Java, Python, PL/SQL, C++)
- Desirable: operational experience within an automated distribution facility
- ITIL Foundation certification is a plus

Company Benefits:
- Competitive salary up to £40,000
- Pension contribution up to 6%
- Private healthcare, including dental and optical cover
- Cycle to work scheme
- Employee referral scheme
- Employee assistance programme
- Employee of the Quarter awards

Due to the volume of applications received for positions, it will not be possible to respond to all applications; only applicants considered suitable for interview will be contacted. Proactive Appointments Limited operates as an employment agency and employment business and is an equal opportunities organisation. We take our obligations to protect your personal data very seriously. Any information provided to us will be processed as detailed in our Privacy Notice, a copy of which can be found on our website.
Apr 19, 2024
Full time
Data Engineer - SQL, Spotfire, Dataiku

The role is remote (main site in Cheltenham; candidates must be based in the UK). The duration of the contract is 12 months. The pay rate on offer is £50 per hour via umbrella.

Key accountabilities of the role:
- Review and manage surge capacity to facilitate the seamless transition to Redshift and effectively manage heightened demand for digital tools.
- Conduct validation of data ingested from the source database, ensuring data accuracy and reliability.
- Utilise Dataiku and SQL coding for data preparation, and optimise efficiency in the processing of data.
- Manage the transition of projects and data between cloud-based data warehouses, ensuring continuity and effective data management.
- Lead the migration of System services datasets from Greenplum to AWS Redshift, supporting the development of UK business metrics dashboards using Spotfire and creating published datasets using SQL and Dataiku.
- Implement validation and alignment procedures for data consistency across both UK and US data projects.
- Drive the consistent delivery of high-quality data to meet consumers' needs.

Key skills and experience:
- Spotfire, SQL and Excel experience essential
- Dataiku experience preferred
- Oracle database experience preferred
- Highly motivated self-starter
- Problem solver, with a keen eye for detail
- Comfortable with big data analytics, data preparation and validation
Apr 19, 2024
Contractor
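The validation duties described in this role come down to checking that data landed intact after a copy between warehouses. A minimal sketch of that idea, using Python's built-in sqlite3 as a stand-in for the actual Greenplum/Redshift pair (table and column names are illustrative, not taken from the advert):

```python
import sqlite3

def validate_migration(conn, source_table, target_table, key_column):
    """Compare row counts and the sum of a numeric column between two tables.

    A crude but fast first check after copying data between warehouses;
    table and column names here are placeholders for real ones.
    """
    cur = conn.cursor()
    checks = {}
    for label, table in (("source", source_table), ("target", target_table)):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({key_column}), 0) FROM {table}")
        count, total = cur.fetchone()
        checks[label] = (count, total)
    return checks["source"] == checks["target"], checks

# Demo with an in-memory database standing in for the two warehouses.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount INTEGER);
    CREATE TABLE dst (id INTEGER, amount INTEGER);
    INSERT INTO src VALUES (1, 100), (2, 250);
    INSERT INTO dst VALUES (1, 100), (2, 250);
""")
ok, detail = validate_migration(conn, "src", "dst", "amount")
print(ok)  # True when counts and sums match
```

In practice the same counts-and-checksums query would run against each warehouse through its own driver, with per-column checksums added for stricter validation.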
Product Data Analyst
Bristol - Hybrid Working (3 days a week onsite)
£50,000 - £60,000 + Bonus, Great Pension, Private Healthcare, 28 Days Holiday, Progression, Other Benefits

This is an excellent opportunity for an experienced Data Analyst to join a well-established company in a highly autonomous and technical role. This company offer a brilliant service and are one of the leading car leasing companies in the UK. With a fantastic client base, working with some of the biggest brands out there, you would be joining a great company where you will feel valued and part of a team.

In this role you will deliver valuable and robust data solutions for this client's customers. You will articulate business requirements for the Data Lake and insight capability, as well as requirements for the cloud-based data warehouse. The ideal candidate will have proven experience in a similar role, strong knowledge of Python and SQL, and the ability to articulate data requirements for Data Science, ML and AI. Experience with Snowflake, Power BI, Tableau and Google Analytics is highly desirable. This is a fantastic opportunity for a Data Analyst to join a growing and rewarding company where you will have excellent opportunities for career progression.

The Role:
- Deliver valuable and robust data solutions
- Articulate business requirements for the Data Lake and insight capability
- Cloud-based data warehouse insights
- Hybrid working in Bristol - 3 days a week onsite

The Person:
- Proven experience in a similar role
- Strong knowledge of Python and SQL
- Experience with Snowflake, Power BI, Tableau and Google Analytics is highly desirable
- Must be commutable to Bristol or willing to relocate

Reference Number: BBBH(phone number removed)

To apply for this role or to be considered for further roles, please click "Apply Now" or contact Ryan McIntyre at Rise Technical Recruitment. This vacancy is being advertised by Rise Technical Recruitment Ltd. The services of Rise Technical Recruitment Ltd are that of an Employment Agency. Rise Technical Recruitment Ltd regrets to inform that our client can only accept applications from engineering candidates who have a valid legal permit or right to work in the United Kingdom. Potential candidates who do not have this right or permit, or are pending an application to obtain this right or permit, should not apply as your details will not be processed.
Apr 19, 2024
Full time
JOB TITLE: Technical Support Engineer
LOCATION: Home-based
INDUSTRY: Warehouse Management Systems

THE ROLE
Our Technical Support Engineer is someone who delivers automated solutions for software processes. You'll work closely with other teams to help discover and eliminate problems by gathering requirements and implementing process automation. Sometimes this will involve hardware or software, but at other times you might be asked to automate service or business processes.

You will be shown how to:
- Identify opportunities for automation within software processes.
- Run tests for databases, systems, networks, applications, hardware and software.
- Identify bugs and quality issues in development, service or business processes.
- Install applications and databases relevant to automation.
- Collaborate with other business units to understand how automation can improve workflow.
- Gather requirements from clients, customers or end-users to develop the best automation solutions.
- Provide 1st and eventually up to 3rd line support, investigate tickets/incidents and provide a solution within the SLAs.

No two days are the same in Support; you will thrive on challenges and embrace ambiguity. You will be expected to take ownership and responsibility, implementing support solutions with ingenuity and simplicity when client systems are not performing as expected. You will know how things work in your areas of responsibility and keep changes under control. The role will be home-based, and you will provide effective technical support for our clients. An ability to understand code is preferable but not essential.

KEY RESPONSIBILITIES
- 24/7 support cover for shift rotation (05:30-14:00, 09:00-17:30, 13:30-22:00, 21:30-06:00)
- Effective processing of incoming technical product queries via telephone and email.
- Ensure details of all calls and enquiries are accurately logged onto the internal service database.
- Resolve technical queries in a quick and efficient manner, ensuring compliance with all company policy and procedures.
- Empathetic communicator, able to see things from the other person's point of view.
- Ensure clients are continually updated and ETAs provided.
- If unable to provide a solution, escalate and seek support to resolve the client issue.
- Run monitoring tools and report back.

Core Competencies
- Previous Technical Support role experience
- Experience in cloud hosting technology
- Excellent communicator
- Good written and verbal communication skills, with the ability to present complex technical information in a clear and concise manner to a variety of audiences
- Committed team player, willing to work in a challenging and cross-platform environment
- Self-driven, results-oriented with a positive outlook, keen for new experiences and responsibilities, with a clear focus on high quality and operational efficiency
- A natural forward planner who critically assesses own performance and is proactive and self-driven in achieving goals and objectives

Desirable
- SQL
- Ability to code software according to published standards and design guidelines (C# .Net)
- Experience of working within the Warehouse Management industry
Apr 19, 2024
Full time
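As an illustration only (nothing below comes from the advert), resolving tickets "within the SLAs" is often backed by a small monitoring script that flags tickets nearing their deadline. A hedged Python sketch with invented ticket data and an assumed 4-hour SLA:

```python
from datetime import datetime, timedelta

SLA = timedelta(hours=4)  # hypothetical response-time SLA

def tickets_at_risk(tickets, now, warn_fraction=0.75):
    """Return IDs of tickets that have used more than warn_fraction of the SLA window."""
    at_risk = []
    for ticket in tickets:
        elapsed = now - ticket["opened"]
        if elapsed >= SLA * warn_fraction:
            at_risk.append(ticket["id"])
    return at_risk

# Made-up tickets: one opened 3.5 hours ago, one 30 minutes ago.
now = datetime(2024, 4, 19, 12, 0)
tickets = [
    {"id": "INC-1", "opened": now - timedelta(hours=3, minutes=30)},
    {"id": "INC-2", "opened": now - timedelta(minutes=30)},
]
print(tickets_at_risk(tickets, now))  # ['INC-1']
```

A real helpdesk would pull the open tickets from the service database mentioned above rather than a hard-coded list.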
GPA's data ambition is to deliver high quality, standardised, easily accessible data systems across all GPA functions that enable data flows across business processes. Data will be clearly owned, managed and maintained; it will be secure and assured, and will be created with the purpose of enabling earlier, better decisions to drive value for money.

Client Details
The Government Property Agency is changing the way the Civil Service works and is at the forefront of Government's transformation agenda, reshaping the relationship civil servants have with their place of work. The Agency is central to the delivery of key Government policies, including moving 22,000 Civil Service roles out of London by 2030 and tackling climate change by contributing to the Net Zero agenda. To do this we are delivering a major change programme across the UK and consolidating our portfolio in order to save £1.4bn over 10 years. Beyond the bricks and mortar, the GPA is about providing great workplaces for our people. Through programmes like Hubs, Whitehall Campus and Smart Working you will be in the vanguard of creating model working environments and promoting flexible working practices. This is an ambitious and exciting task, for which we need innovative people, with strong commercial acumen, who are passionate about visualising and implementing customer needs. Launched as an Executive Agency of the Cabinet Office in 2018, we're a relatively new department and we are growing fast, so we also need people who thrive in ambiguity, can adapt quickly to change and are comfortable stepping outside of their remit to drive outcomes.

This role has a G6 salary. The package is broken down as below:
- National (Birmingham, Leeds, Nottingham, Manchester, Newport, Norwich, Swindon) - £62,900 - £67,900
- There is also a potential Recruitment and Retention Allowance of £5,000, which is non-pensionable.
- GPA is also committed to recognising and rewarding staff who hold the 'Gold Standard' accreditation relevant to their specialism, and offers a specific allowance to staff who have achieved this. This amounts to an extra £5,000, which is also non-pensionable.

Description
The Head of BI & Data will be integral to helping deliver this strategy, overseeing data governance and quality, information management, data maturity, data analytics, data architecture, data management, data integration, and data engineering and platforms. The individual will have 3 direct reports covering Data Governance, Data Platforms & Integrations and Data Analytics, and will also support a wider team of professionals including Data Engineers, Data Architects and Data Developers.

The candidate will:
- Support the delivery of GPA's Information & Data Strategy
- Be responsible for the definition of the organisation's data strategy
- Champion data architecture across GPA
- Set the standards and ways of working for the data architecture community
- Oversee the design of multiple data models and have a broad understanding of how each model fulfils the needs of the business
- Provide advice to project teams and oversee the management of the full data product life cycle
- Be responsible for ensuring that GPA's systems are designed in accordance with the data architecture
- Data operations & integration - support the wider team that will provide all sourcing, extraction, reference, and onboarding of key data into GPA's data warehouse and between source systems
- Data analysis and synthesis - lead on the vision to embed new analytics initiatives that enhance user experience and decision making, and be actively involved in the delivery
- Data governance and quality - provide strategic direction and support, focusing on delivering the highest quality data in a timely manner, supported by governance processes including data security
- Data platforms - able to support and understand the operation of the AWS Redshift data warehouse, as well as other industry-leading data platforms, to support master data management, governance, architecture and quality
- Data standards - strong understanding of standards across Government and/or within similar sectors, and experience of their adoption and integration
- Data architecture & integration - strong understanding of data integration between systems, transactional and reference data, leading on the design and mapping
- Data modelling and engineering - able to produce data models and understand where to use different types of data model; understands different tools and is able to compare different data models
- Programming and build (data engineering) - able to lead by example and design, write and iterate code to support data operations; understanding of security, accessibility and version control; can use a range of coding tools and languages
- Business engagement - facilitate interactions between business divisions to optimise data usage

Profile
We encourage applications from people from all backgrounds and aim to have a workforce that represents the communities and wider society that we serve. We pride ourselves on being an employer of choice. We champion diversity, inclusion and wellbeing and aim to create a sense of belonging in a workplace where everyone feels valued.
- Data integration - ETL design & development
- Data engineering - implementation of performant models within an AWS and Azure data warehouse environment
- Data modelling - conceptual, logical and physical data modelling
- Strong experience of data governance & quality
- Strong understanding of data architecture & integration
- Strong experience of data analytics and Business Intelligence platforms

Job Offer
Alongside your salary of £62,900, GPA contributes £13,959 towards you being a member of the CS DBP Pension scheme.
- Learning and development tailored to your role
- An environment with flexible working options
- A culture encouraging inclusion and diversity
- A Civil Service pension with an average employer contribution of 27%
- Generous annual leave

This vacancy is using Civil Service Success Profiles: these will assess your Behaviours, Strengths, Experience and Technical skills. The Civil Service Code sets out the standards of behaviour expected of civil servants. We recruit by merit on the basis of fair and open competition, as outlined in the Civil Service Commission's recruitment principles. The Civil Service embraces diversity and promotes equal opportunities. As such, we run a Disability Confident Scheme (DCS) for candidates with disabilities who meet the minimum selection criteria. The Civil Service also offers a Redeployment Interview Scheme (RIS) to civil servants who are at risk of redundancy and who meet the minimum requirements for the advertised vacancy. This vacancy is part of the Great Place to Work for Veterans initiative. The Civil Service welcomes applications from people who have recently left prison or have an unspent conviction. Read more about prison leaver recruitment on our website.

Sift
The closing date is 6.5.24; the sift is due to take place by 10.5.24 but is subject to change. At interview, applicants will be scored against 4 behaviours: Managing a Quality Service, Seeing the Bigger Picture, Changing & Improving, and Leadership. Applicants successful at sift will be invited to interviews, due to take place in the weeks commencing 13.5.24 and 20.5.24; these will be virtual interviews. This is subject to change dependent upon where most successful candidates are based. Interview questions will be a blend of Behaviour, Experience, Strength and Technical (core skill) questions.
Apr 19, 2024
Full time
GPA's data ambition is to deliver high-quality, standardised, easily accessible data systems across all GPA functions that enable data flows across business processes. Data will be clearly owned, managed and maintained; it will be secure and assured, and will be created with the purpose of enabling earlier, better decisions to drive value for money. Client Details The Government Property Agency is changing the way the Civil Service works and is at the forefront of Government's transformation agenda, reshaping the relationship civil servants have with their place of work. The Agency is central to the delivery of key Government policies, including moving 22,000 Civil Service roles out of London by 2030 and tackling climate change by contributing to the Net Zero agenda. To do this we are delivering a major change programme across the UK and consolidating our portfolio in order to save £1.4bn over 10 years. Beyond the bricks and mortar, the GPA is about providing great workplaces for our people. Through programmes like Hubs, Whitehall Campus and Smart Working you will be in the vanguard of creating model working environments and promoting flexible working practices. This is an ambitious and exciting task, for which we need innovative people with strong commercial acumen who are passionate about visualising and implementing customer needs. Launched as an Executive Agency of the Cabinet Office in 2018, we're a relatively new department and we are growing fast, so we also need people who thrive in ambiguity, can adapt quickly to change and are comfortable stepping outside of their remit to drive outcomes. This role has a G6 salary. The package is broken down as below: National (Birmingham, Leeds, Nottingham, Manchester, Newport, Norwich, Swindon) - £62,900 - £67,900. There is also a potential Recruitment and Retention Allowance of £5,000, which is non-pensionable.
GPA is also committed to recognising and rewarding staff who hold the 'Gold Standard' accreditation relevant to their specialism, and offers a specific allowance to staff who have achieved this. This amounts to an extra £5,000, which is also non-pensionable. Description GPA's data ambition is to deliver high-quality, standardised, easily accessible data systems across all GPA functions that enable data flows across business processes. Data will be clearly owned, managed and maintained; it will be secure and assured, and will be created with the purpose of enabling earlier, better decisions to drive value for money. The Head of BI & Data will be integral to helping deliver this strategy, overseeing data governance and quality, information management, data maturity, data analytics, data architecture, data management, data integration, and data engineering and platforms. The individual will have three direct reports, covering Data Governance, Data Platforms & Integrations and Data Analytics, and will also support a wider team of professionals including Data Engineers, Data Architects and Data Developers. The candidate will:
- Support the delivery of GPA's Information & Data Strategy
- Be responsible for the definition of the organisation's data strategy
- Champion data architecture across GPA
- Set the standards and ways of working for the data architecture community
- Oversee the design of multiple data models and have a broad understanding of how each model fulfils the needs of the business
- Provide advice to project teams and oversee the management of the full data product life cycle
- Be responsible for ensuring that GPA's systems are designed in accordance with the data architecture
Data operations & integration - Support the wider team that will provide all sourcing, extraction, reference, and onboarding of key data into GPA's data warehouse and between source systems.
Data analysis and synthesis - lead on the vision to embed new analytics initiatives that enhance user experience and decision making, and be actively involved in the delivery.
Data Governance and Quality - provide strategic direction and support to focus on delivering the highest quality data in a timely manner, supported by governance processes including data security.
Data Platforms - able to support and understand the operations of the AWS Redshift data warehouse as well as other industry-leading data platforms to support master data management, governance, architecture and quality.
Data Standards - strong understanding of standards across Government and/or within similar sectors, and experience of adoption and integration.
Data Architecture & Integration - strong understanding of data integration between systems, transactional and reference data, and leading on the design and mapping.
Data modelling and engineering - able to produce data models and understand where to use different types of data model; understands different tools and is able to compare different data models.
Programming and build (data engineering) - able to lead by example and design, write and iterate code to support data operations; understands security, accessibility and version control; can use a range of coding tools and languages.
Business engagement - facilitate interactions between business divisions to optimise data usage.
Profile We encourage applications from people from all backgrounds and aim to have a workforce that represents the communities and wider society that we serve. We pride ourselves on being an employer of choice. We champion diversity, inclusion and wellbeing and aim to create a sense of belonging in a workplace where everyone feels valued.
Data Integration - ETL design & development
Data Engineering - implementation of performant models within an AWS and Azure data warehouse environment
Data modelling - conceptual, logical and physical data modelling
Strong experience of Data Governance & Quality
Strong understanding of Data Architecture & Integration
Strong experience of data analytics and Business Intelligence platforms
Job Offer Alongside your salary of £62,900, GPA contributes £13,959 towards you being a member of the CS DBP Pension scheme.
- Learning and development tailored to your role
- An environment with flexible working options
- A culture encouraging inclusion and diversity
- A Civil Service pension with an average employer contribution of 27%
- Generous annual leave
This vacancy is using Civil Service Success Profiles, which will assess your Behaviours, Strengths, Experience and Technical skills. We encourage applications from people from all backgrounds and aim to have a workforce that represents the communities and wider society that we serve. The Civil Service Code sets out the standards of behaviour expected of civil servants. We recruit by merit on the basis of fair and open competition, as outlined in the Civil Service Commission's recruitment principles. The Civil Service embraces diversity and promotes equal opportunities. As such, we run a Disability Confident Scheme (DCS) for candidates with disabilities who meet the minimum selection criteria. The Civil Service also offers a Redeployment Interview Scheme (RIS) to civil servants who are at risk of redundancy and who meet the minimum requirements for the advertised vacancy. This vacancy is part of the Great Place to Work for Veterans initiative. The Civil Service welcomes applications from people who have recently left prison or have an unspent conviction. Read more about prison leaver recruitment on our website. Sift: The closing date is 9.5.24; the sift is due to take place by 16.5.24 but is subject to change.
(At interview, applicants will be scored against four behaviours: Managing a Quality Service, Seeing the Bigger Picture, Changing & Improving, and Leadership.) Applicants successful at sift will be invited to interviews, due to take place in the weeks commencing 20.5.24 and 27.5.24; interviews will be virtual. This is subject to change depending on where the most successful candidates are based. Interview questions will be a blend of Behaviour, Experience, Strength and Technical (core skill) questions.
Apr 19, 2024
Full time
End Date: Tuesday 23 April 2024. Salary Range: £100,657 - £118,420. We support agile working. Agile Working Options: Hybrid Working, Job Share.
Job Description Database Technical Lead, Lloyds Banking Group, London - hybrid working, two days per week in the office and the rest from home. Salary & Benefits: £100,657 to £130,262 per annum (experience dependent), plus annual personal bonus, 15% employer pension contribution, flexible benefits package, private medical insurance, and 30 days holiday plus bank holidays.
Background: The Data Resilience team is a new chapter within the Chief Data and Analytics Office. We have the responsibility to facilitate the identification, end-to-end data flow mapping, and assessment of IBS-critical technical assets to establish the Data Resilience position and proactively mitigate Group exposure to data loss or corruption events. Disruptions are inevitable, and regulators expect financial institutions to take the necessary steps to protect data and recover from severe but plausible data loss or corruption events, such as a cyber-attack, to meet IBS impact tolerance thresholds. The Data Resilience team's purpose is to proactively protect the integrity, availability, and security of our data to mitigate the risk of disruption to the Group's Important Business Services (IBS).
About the Role: As the Data Resilience Data Engineer, you will focus on ensuring IBS-critical data is stored, managed, and processed effectively to maintain availability, confidentiality and integrity and fulfil the Group's Important Business Services. This will safeguard our critical data and associated assets from vulnerabilities and threats that could lead to a compromise of integrity and availability, leading to customer harm. The role requires ambitious individuals with a proactive, can-do attitude and a solution-oriented approach to deliver at pace.
Key Responsibilities:
- Be the technical data reliability point of contact for data resilience.
- Provide input and direction on database resilience assessments to identify gaps that could lead to IBS Impact Tolerance thresholds being breached.
- Develop database optimisation initiatives to drive improvement and guidance for Operational Resilience and change frameworks.
- Oversee the development of database controls and collaborate with platform teams and the Chief Security Office to remediate security gaps.
- Embed proactive database hygiene, including ROT data, compression rates, effective maintenance plans, and compaction & reorganisation across IBS-critical applications.
- Ensure the Data Resilience Information Asset Register (IAR) for databases is accurately maintained.
- Perform horizon scanning and provide input to group policies and procedures.
- Review backup and recovery procedures for IBS-critical databases.
- Support and grow team members in the database domains of data resilience.
- Present data resilience database gaps to peers and senior collaborators.
What we're looking for: We'd welcome applicants from diverse cultural and technological backgrounds; however, financial services exposure will be important for this position. We'll need to see evidence of the following in your CV:
- Experience as a Senior Lead Database Administrator, including exposure to Oracle, MS SQL (DB2 and IMS desirable)
- Knowledge of traditional operating systems (Windows, Linux, z/OS, F5)
- Experience in Data Fabric and Data Mesh concepts, including Systems of Record, Engagement and Insight strategies
- Experience in database management and optimisation, e.g. reorganisation and rebuild.
- Strong knowledge of database backup and recovery procedures
- Experience of incident response (triage, classification, investigation, and escalation)
- Proficient in database encryption at rest, in transit and in memory
- Financial services experience and exposure to some, but not all, of: payments, cards, pensions, insurance, markets, trade & settlement, logon customer journeys
- Solid verbal and written communication skills to discuss and describe the target architecture with technical and non-technical stakeholders
It's great if you have:
- Knowledge of Extract, Transform & Load (ETL), Disaster Recovery, or backup and restore domains
- Public cloud data management experience, including Databricks, MongoDB, CockroachDB, GCP Dataproc, BigQuery
- Enterprise Data Hub (EDH) and Enterprise Data Warehouse (EDW)
- Prior experience supporting or remediating resilience issues on assets such as batch, messaging queues, third-party data connections, data recovery & backup, data vaulting, and data integrity
About working for us: We want our people to feel that they belong and can be their best, regardless of background, identity or culture. We were one of the first major organisations to set goals on diversity in senior roles, create a menopause health package, and launch a dedicated Working with Cancer initiative. We're disability confident, so if you'd like reasonable adjustments to be made to our recruitment processes, just let us know. Ready for a career where you can have a positive impact as you learn, grow and thrive? Apply today and find out more. At Lloyds Banking Group, we're driven by a clear purpose: to help Britain prosper. Across the Group, our colleagues are focused on making a difference to customers, businesses and communities. With us you'll have a key role to play in shaping the financial services of the future, whilst the scale and reach of our Group means you'll have many opportunities to learn, grow and develop. We keep your data safe.
So, we'll only ever ask you to provide confidential or sensitive information once you have formally been invited along to an interview or accepted a verbal offer to join us, which is when we run our background checks. We'll always explain what we need and why, with any request coming from a trusted Lloyds Banking Group person. We're focused on creating a values-led culture and are committed to building a workforce which reflects the diversity of the customers and communities we serve. Together we're building a truly inclusive workplace where all of our colleagues have the opportunity to make a real difference.
Apr 19, 2024
Full time
Data Engineer - Dynamics 365 CE / CRM and Power Platform experience required
A well-established MS partner, experiencing growth through a very strong project pipeline, is looking to add a Data Engineer and a Data Analytics Consultant to their team. They pride themselves on their innovative approach and their commitment to delivering outstanding results for their clients.
Job Purpose:
- Developing accurate, efficient data transformations that meet customer needs within agreed deadlines.
- Ensuring the reliability, robustness, and resilience of the projects you design and build, while working independently within agreed standards.
- Creating and maintaining data pipelines, data storage solutions, data processing, and data integration.
- Troubleshooting issues and implementing necessary fixes.
Key Responsibilities:
- Designing and building reliable, robust, and accurate data pipelines based on agreed best practices.
- Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL and other technologies.
- Designing ETL processes, developing integration workflows, and managing data load processes to support both regular and ad-hoc activities.
- Creating and maintaining optimal data pipeline architecture, which might include integrating with Dataverse via cloud flows or external data sources.
- Designing, developing, and maintaining data warehouse environments.
- Identifying, designing, and implementing internal process improvements, including automating manual processes, optimising data delivery, and re-designing infrastructure for greater scalability.
Technical Skills:
- 3-5 years' established experience as a Data Engineer
- Experience with programming languages and tools: C#, Python, Visual Studio Code and JSON
- Dynamics 365 CRM experience
- Power Platform: Power Automate, Dataverse
- Data management and warehousing
- ETL and ELT
- Data architecture, and experience building complex database systems for businesses
- SQL Server, including query optimisation
- Coding experience, including low-code/no-code solutions
- Comfortable working with a range of data sources and formats, e.g. JSON, XML, flat files, API integration
- Machine Learning and AI
- Normalisation and de-normalisation
- Understanding of Snowflake architecture, data modelling, and administration
£50,000 - £60,000 based on experience - great benefits. They would consider contract options for these roles, so please do reach out if you are looking for a contract opportunity - Outside IR35. This role will be fully remote, with occasional travel to client sites. You must have the right to work in the UK as sponsorship is not provided. Please reach out to me on (phone number removed) or (url removed) to find out more information and get your application moving!
Apr 18, 2024
Full time
Position: Azure Data Architect
Location: Remote
Type: 6 Month Contract (Outside IR35)
Rate: £550 to £600 per day

Role: This is a fantastic opportunity to work for a leading consultancy. My client is currently looking for an experienced Senior Data Architect to act as client architecture lead for various programmes of work. Data Architect is a multi-disciplinary role, requiring collaboration with a wide range of stakeholders, from developers to C-level executives. You will be responsible for working with customers to influence and shape the end-to-end data management and analytics workstreams within fast-paced and complex programmes, engaging in a wide variety of data management and analytics activities.

Key Responsibilities:
- Support and influence Data Strategy, and Data Governance Policies and Principles.
- Promote Data Management standards and best practices.
- Support business and data requirements gathering.
- Provide input and guidance to the business on Data Catalog, Master Data and Metadata Management.
- Lead the data solution designs and execution of data models for solutions such as Data Warehouse, Data Lake, and Data Lakehouse.
- Work with Data Engineers and Analysts to architect scalable and secure solutions across Data Integration, Data Orchestration, Data Processing, Data Storage, and Data Visualisation.
- Work with cross-functional teams to support delivery of the data solutions.
- Engage with customers and end-users to understand solution impact and develop technology operation plans.
- Work with customers or partners to promote the company brand and develop healthy relationships.
- Coach and mentor upcoming Data Architects.

Requirements:
- Demonstrable experience in Data Architecture in the last 3 years.
- Experience architecting data solutions which meet high data security and compliance requirements.
- Experience working with various open-source, on-prem, COTS, and cloud (AWS, Azure, GCP) tools and technologies.
- Advanced data modelling skills and experience in relational, dimensional and NoSQL databases.
- Demonstrable experience in advanced SQL/T-SQL.
- Knowledge and experience working with a variety of frameworks and platforms for data management and analytics.
- Data engineering experience, and familiarity with Git, Python and R.
- Data analysis, data profiling and data visualisation experience.
- Knowledge of Big Data; hands-on experience desirable.
Apr 18, 2024
Contractor
AWS Data Engineer - UK Wide - £60,000 - £80,000 per annum + permanent benefits

A leading IT consultancy is looking to strengthen their Data Engineering team. The successful candidate will have a hands-on design and engineering background in AWS, across a wide range of AWS services, with the ability to demonstrate working on large engagements. This role requires candidates to go through SC Clearance, so you must be eligible.

- Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR).
- Java, Scala, Python, Spark, SQL.
- Experience of developing enterprise-grade ETL/ELT data pipelines.
- NoSQL databases: DynamoDB, Neo4j, Elastic, Google Cloud Datastore.
- Snowflake Data Warehouse/Platform.
- Experience of working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible, etc.
- Experience building and deploying solutions to cloud (AWS, Google Cloud), including cloud provisioning tools (e.g. Terraform, AWS CloudFormation or Cloud Deployment Manager).
- A broad understanding of AWS services including but not limited to EC2, storage, AWS security, container technologies, IAM, cloud networking, data processing and machine learning.
Apr 18, 2024
Full time
We are currently recruiting for a warehouse operative to work as part of a small team for our client based in Harlow. If you also have experience or an interest in stripping down IT units and testing parts, this may be of particular interest to you.

Within the warehouse the role will involve:
- Receipt and inspection of inbound products.
- Verifying goods are correct as per purchase orders raised.
- Data entry receipt of inbound product.
- Stock location of received product.
- Picking/cleaning of product.
- Packaging of product in readiness for outbound shipping.
- General upkeep and tidying of the work environment.

As an IT Engineer it could also involve:
- Rigorous testing and grading of various types of IT equipment.
- Data erasure of units.
- Splitting down of units to a reverse bill of materials for saleable parts.
- Part identification and part numbering of individual parts.
- Database stocking of individual parts and complete working units.

The role requires the candidate to have dexterity, high concentration levels, high attention to detail and a willingness to adapt to any task requested. You must be willing to work hard and on occasion show your own initiative. In-house training will be provided. Hours are between 9am and 6pm with an hour for lunch. If you are interested in finding out more about the role, please apply in the first instance.
Apr 18, 2024
Full time
Job Description: Pet Nutrition (PN) is the most vibrant category in the FMCG sector. As we work to transform this exciting category, a new program, Digital First, has been mobilized by the Mars Pet Nutrition (PN) leadership team. Digital First places pet parents at the center of all we do in Mars PN, while digitalizing a wide range of business process areas and creating future-fit capabilities to achieve ambitious targets in top-line growth, earnings, and pet parent centricity. The Digital First agenda requires digitizing at scale, and requires you to demonstrate significant thought leadership, quality decision making, deep technical know-how, and an ability to navigate complex business challenges while building and leading a team of world-class data and analytics leaders. Are you passionate about Data and Analytics and excited about how it can completely transform the way an enterprise works? Do you have the strategic vision, technical expertise, and leadership skills to drive data-driven solutions? Do you want to work in a dynamic, fast-growing category? If so, you might be the ideal candidate for the role of Solution Architect Data Foundations in the Enterprise Architecture function for Global Pet Nutrition (PN) at Mars. The Solution Architect Data Foundations is a strategic leadership role that oversees delivery of cross-product, transversal data capabilities that are foundational to our success. This role is accountable for the architecture, design and optimization of data platforms, data architecture, data operations, and data engineering, and for the development of data assets/products for the multi-billion-dollar Pet Nutrition division's digital needs. Reporting to the Head of Enterprise Architecture, the person in this role will be part of the Global PN Architecture of Tomorrow team. The role operates globally and partners with PN business and digital leaders across all functions.
'This role is an incubation role (temporary) with an estimated end date of December 2026. The purpose is to fast-track and support the build of this specific product. At the completion of the product, a permanent BAU role will open to maintain and support the product: the role will be permanent and will have a different job description more suited to the needs of the organisation at end state. If you are unable to secure the role by December 2026 you will be eligible for a separation package.'

What are we looking for?
- Bachelor's degree or equivalent (IT degree preferred, in particular computer science, data science or a related field).
- Industry-leading expertise in building and delivering data foundations, preferably in the CPG or retail industry.
- Established and deep understanding of a range of technology solutions and business processes across CPG functional capabilities.
- Proven track record of delivering value through data products in a fast-paced, agile environment.
- Extensive knowledge of data principles, architecture/modeling, ingestion, and ETL principles and practices.
- Extensive knowledge of Azure-based big data platforms; exposure to other clouds such as GCP is desirable.
- Experience architecting and designing data platforms such as data lakes, data warehouses, and the data pipelines and data services that support various types of data and analytics use cases.
- Prior experience of successfully leading large-scale data initiatives to support analytics, BI and AI use cases.
- Prior experience in decentralized data management, specifically in data governance of managing fragmented data domains like sales, finance, and marketing.
- Proven track record of establishing and leading a DDF design authority.
- Proven track record of mastering new and emerging technologies.
- Successful experience, established over several years, of performing architecture leadership within a technology environment.
- A strong customer-centric mindset, especially within an internal customer base, with the purpose of driving adoption and use.
- Strategic thinking, problem solving and innovation, with the ability to anticipate and navigate challenges and opportunities.
- Excellent at engaging with technical and functional leadership in a matrix organization.
- Ability to navigate a complex matrix organisation.

What will be your key responsibilities?
- Mars Principles: Live and exemplify the Five Principles of Mars, Inc. within self and team.
- Strategy and Thought Leadership: Work with PN Digital Leadership to create and execute the data foundations strategy and roadmap for the Pet Nutrition segment, in alignment with Pet Nutrition's business strategic priorities, goals and analytics needs.
- Stakeholder Engagement: Collaborate with PN D&A leadership, PN product owners, and segment D&A leadership. You align with and support enterprise architecture efforts in Mars Petcare, corporate EA, GDO, and CISO teams.
- Architectural Governance, Review and Assurance: You are accountable for effective and proportionate governance to approve or reject high-level solution designs, solution architectures, other technology services or substantial changes to existing services for compliance, including granting waivers where justified. You ensure that critical DDF design decisions and issues escalated by delivery teams across PN DT are reviewed and resolved promptly. You ensure that the governance, review and assurance processes provide insight and information to drive future revisions of the strategy and roadmap, so that the technology architecture continues to evolve to meet the changing needs of Mars PN. You drive architectural governance, review and assurance in partnership with the Technology Leadership Team, PN/Petcare/Corporate EAs and colleagues in the wider Mars PN.
- Roadmap to Achieve the Target Architecture: You are accountable for setting out a roadmap to move from the current-state architecture to the target architecture for DDF, taking account of the change portfolio and expected future change plans. You ensure that the roadmap is maintained to account for evolving requirements.
- Data as a Product: Bring technical mastery, knowledge, and acumen to lead the creation and deployment of scalable, secure data platforms and data assets tailored to our organization's evolving requirements while ensuring data quality and trust. Embed thought leadership in modeling data such that it is domain-driven, easily discoverable and self-service enabled (where appropriate), with a strong-willed approach to avoid duplication and promote trust and integrity in data assets.

What can you expect from Mars?
- Work with over 130,000 diverse and talented Associates, all guided by the Five Principles.
- Join a purpose-driven company where we're striving to build the world we want tomorrow, today.
- Best-in-class learning and development support from day one, including access to our in-house Mars University.
- An industry-competitive salary and benefits package, including company bonus.

Mars is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. If you need assistance or an accommodation during the application process because of a disability, it is available upon request. The company is pleased to provide such assistance, and no applicant will be penalized as a result of such a request.
Apr 18, 2024
Full time
Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive.

An exciting opportunity: a financial services client is looking for a DevOps Engineer / SRE / Site Reliability Engineer based in London.

Role: DevOps Engineer / SRE / Site Reliability Engineer
Location: London (2 days a week onsite)
Duration: 6 Months
Status: Inside IR35

Experienced and knowledgeable in AWS Cloud, with knowledge of EKS, Jenkins, DevOps, Terraform, Kubernetes, Docker, Helm, GitOps, and strong troubleshooting skills.

Experience and skills required:
- Experience in backend development or data engineering.
- Hands-on experience with AWS services like S3, EC2, EMR.
- Experience with Kubernetes, Terraform, CI/CD, Jenkins.
- Proficiency in SQL and experience with CDAP, Spark, Kafka.
- Experience building scalable ETL processes and workflows.

Responsibilities:
- Develop and enhance data pipelines and ETL processes using CDAP on AWS infrastructure.
- Build data integration flows to migrate large datasets into the Snowflake data warehouse.
- Implement AWS infrastructure-as-code solutions for deployment automation.
- Instrument data pipelines and leverage monitoring for performance tuning and reliability.
- Work with data scientists to optimize data workflows and models on Databricks.
- Follow security best practices for access control, encryption, and auditing across data platforms.
- Participate in architecture reviews and technology selections.
- Continuously monitor and improve data platforms for scalability and costs.

Candidates will ideally show evidence of the above in their CV to be considered. Please be advised that if you haven't heard from us within 48 hours then unfortunately your application has not been successful on this occasion; we may, however, keep your details on file for any suitable future vacancies and contact you accordingly. Pontoon is an employment consultancy and operates as an equal opportunities employer.
Apr 18, 2024
Contractor
Dynamics 365 Finance and Operations Integration Developer 12 month contract Surrey / hybrid remote Advantage are recruiting on behalf of an established global engineering consultancy in Surrey for a D365 Integration Developer to design, develop, and implement integrations between Dynamics 365 Finance and Operations and other systems within the business. We're looking for candidates who can offer demonstrable experience developing integrations for D365 finance and operations, along with skills in X and T-SQL, experience in developing within the Power Platform, and familiarity with Azure integration technologies (Functions, Logic Apps, Data Lake, Synapse, ADO) Key Responsibilities System Integration: Collaborating with stakeholders to understand business requirements and designing integration solutions to connect Dynamics 365 Finance and Operations with other systems such as ProjOps, Dataverse, Data Lake, Synapse, Enterprise Data Warehouse (EDW), or third-party applications. Development: Writing code and developing custom solutions using relevant programming languages and technologies (such as X , C#, SQL, XML, JSON) to create seamless data flow and functionality between different systems. API Development: Utilizing APIs (Application Programming Interfaces) provided by Dynamics 365 Finance and Operations and other systems to facilitate data exchange and communication between them. Data Mapping and Transformation: Defining data mapping rules and implementing data transformation processes to ensure data consistency and accuracy across integrated systems. Testing and Debugging: Conducting thorough testing of integration solutions to identify and resolve any issues or bugs, ensuring smooth functionality and data integrity. Documentation: Documenting technical specifications, integration processes, and system configurations for reference purposes and future maintenance. 
- Collaboration: Working closely with cross-functional teams including business analysts, system administrators, and other developers to ensure successful delivery of integration projects.
- Maintenance and Support: Providing ongoing support and troubleshooting assistance to address integration-related issues or enhancements post-implementation.
- Compliance and Security: Ensuring compliance with data protection regulations and implementing security measures to safeguard sensitive information during integration processes.
- Staying Updated: Keeping abreast of the latest technologies, best practices, and updates related to Dynamics 365 Finance and Operations and integration methodologies to continuously improve integration solutions and processes.
Skills and Competencies:
- Several years' experience developing integrations for Microsoft Dynamics 365 Finance & Operations.
- X++ development skills, including solution development.
- T-SQL development skills.
- Cloud integration specialism, with familiarity across the range of Azure integration technologies: Functions, Logic Apps, Azure Data Lake, Azure Synapse, Azure DevOps.
- Experience developing solutions within the Power Platform (Power Apps, Power Automate) and Dataverse.
- Experience of integration work with payroll/HR/finance systems.
- Good understanding of financial data concepts.
- Experience with Microsoft Dynamics 365 Project Operations is a plus.
- Experience with Informatica is a plus.
Submit your CV now to be considered for this brilliant opportunity supporting a globally renowned consulting business.
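The "Data Mapping and Transformation" duty above is essentially rule-driven field mapping with validation. A minimal sketch, assuming a simple dictionary of mapping rules; the source and target field names (LegalEntity, VendorAccountNumber, etc.) are hypothetical illustrations, not taken from a real D365 F&O entity schema:

```python
# Hypothetical data mapping rule: rename source fields to target fields
# and flag incomplete records instead of silently importing them.
FIELD_MAP = {
    "company": "LegalEntity",            # illustrative target names,
    "supplier_id": "VendorAccountNumber",  # not a real F&O schema
    "supplier_name": "VendorName",
}

def map_record(source: dict) -> dict:
    """Apply the mapping rules to one source record."""
    target = {}
    missing = []
    for src_field, tgt_field in FIELD_MAP.items():
        value = source.get(src_field)
        if value not in (None, ""):
            target[tgt_field] = value
        else:
            missing.append(src_field)
    if missing:
        # Surface gaps so data consistency issues are caught early.
        raise ValueError(f"missing source fields: {missing}")
    return target
```

For example, `{"company": "GB01", "supplier_id": "V-1001", "supplier_name": "Acme Ltd"}` maps cleanly to the target field names, while a record missing `supplier_id` raises an error for investigation rather than propagating bad data downstream.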
Apr 18, 2024
Contractor
Data Engineer - 5 Month Remote Contract. Interviewing now for a Data Engineer for a remote project (Outside IR35). We are seeking an experienced Data Engineer responsible for developing and delivering data solutions, optimising data to ensure availability and accuracy. You will extract data from the data lake using BigQuery and build data pipelines from gathered requirements to support delivery of data solutions. You will also track down discrepancies in data across integration points and report data issues that impact service delivery.
Type: 5 Month Contract
Day Rate: Market Rates (Outside IR35)
Location: Remote/UK
Start: ASAP
Skills:
- Previous experience in database services, design, and implementation, plus experience with enterprise-level data platforms, data lake concepts, and data handling.
- Proficient with SQL: joins, primary keys, foreign keys, referential integrity.
- Previous data warehouse skills, including ETL.
- Proficient working with data in a production environment.
Please apply now to be considered for this role.
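The discrepancy-tracking duty above amounts to reconciling record keys between two integration points. A minimal sketch, assuming each side's primary keys have already been extracted (in practice they would come from BigQuery queries; here they are plain Python sets with made-up key values):

```python
# Sketch of reconciling records across two integration points.
# The key sets are illustrative stand-ins for query results.

def reconcile(source_keys: set, target_keys: set) -> dict:
    """Report keys missing on either side of an integration."""
    return {
        "missing_in_target": sorted(source_keys - target_keys),
        "unexpected_in_target": sorted(target_keys - source_keys),
        "matched": len(source_keys & target_keys),
    }

report = reconcile({"A1", "A2", "A3"}, {"A2", "A3", "B9"})
# report["missing_in_target"] == ["A1"]  -> a discrepancy to investigate
```

Anything listed under `missing_in_target` or `unexpected_in_target` is a discrepancy worth reporting before it impacts service delivery.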
Apr 18, 2024
Contractor