Our growing Technology team plays a key role in ensuring OS is at the cutting edge of geospatial capability and is looking for people to join them. Its mission is to work across the business to provide customer-centric design and technology services. Join us and you'll have an opportunity to make an impact: to empower projects that deliver real-world benefits across Britain and internationally; to hear our customers say they couldn't have done it without us; and to be central to OS's vision of being recognised as world leaders in geospatial services, creating location insight for positive impact.

About the Role
We have an opportunity for an Engineer specialising in DevOps to join Ordnance Survey's Automatic Feature Extraction team in developing next-generation Microsoft Azure-based systems. Using Azure DevOps YAML pipelines and DevOps practices to deploy and manage our systems in the Azure cloud (in a forward-thinking agile environment), this role will challenge you to understand requirements and develop cutting-edge solutions for our users and customers. We are looking for a problem solver who is keen to learn and fail fast. The team will look to you to develop and deliver systems that are functional, performant, scalable, tested, secure, maintainable and supportable.

Here is a snapshot of the technologies that we use: Azure cloud technologies (AzureML, Databricks, Azure Batch, Function Apps), Bicep, PowerShell, Azure DevOps, YAML pipelines and Pester testing. The wider team also uses Python, Databricks, Esri ArcGIS, FME and QGIS. We implement industry-standard security recommendations on our infrastructure, including recommendations from CIS and Azure Defender.

What we are looking for
If you are interested in joining a team that lies at the heart of what OS is about, we are looking for someone who can demonstrate essential skills and experience in:
• Microsoft Azure cloud services (or similar, with a desire to work in Azure).
• Engineering, monitoring and supporting secure cloud production systems.
• Infrastructure as Code to create reproducible systems (e.g. Bicep, Terraform, ARM, AzCLI), and configuring deployment and release pipelines.
• Programming in a language such as C# or Python, scripting in a shell language such as Bash or PowerShell, and using version control such as Git.
• Software engineering best practices (TDD, CI, clean code, creating testable code, design patterns).
• Iterative and incremental development.
• Problem resolution and selection of technical solutions.
• Willingness to collaborate on technical problems through pair and mob programming.
• Comfort with engaging stakeholders, gathering feedback, bringing recommendations and driving continuous improvement.

Desirable skills:
• Experience advocating best practice within your team and to its junior members.
• Knowledge of open-source Python data libraries such as pandas and NumPy, of geospatial frameworks (Esri, FME), or of machine learning and computer vision.

The rewards
We want you to love what you do. That's why our benefits package rewards a job well done. We'll give you:
• Salary: £37,511 - £44,130
• Performance-related bonus
• A competitive pension scheme
• Flexible working: we can consider different working hours dependent on the role and your personal circumstances
• 25 days annual leave (30 days after five years), plus bank holidays and an extra 3 days over Christmas
• Plus a suite of excellent additional benefits

Location
OSHQ is based in Southampton, Hampshire, but at OS we believe work is something you do, not somewhere you go. We embrace a hybrid working model where the choice is with the individual on when they work from our fantastic office or from home. We want to continue to be an inclusive employer and recognise that one size doesn't necessarily fit all.

Closing date: Sunday, 31 March 2024

We believe diversity and inclusion is about working together in an encouraging and respectful environment to reach our full potential. We believe combining different backgrounds, experiences and perspectives will help us reach our vision and be trusted and admired across the globe for setting the standards and leading the way. We are looking for passionate people from a range of backgrounds and welcome applications regardless of race, age, gender, background or religion. We're individually talented and collectively powerful, and we give you the space to take your career in whichever direction you want.
Mar 29, 2024
Full time
A leading consultancy is looking for a Snowflake Technical Architect to join their InsureTech team! You will get the opportunity to work with a world-leading consultancy, massively expand your network, and be exposed to some of the most innovative tools in the market, including Snowflake, Databricks, Airflow and more! You will work with international insurance companies, developing their Databricks implementations from a strategic perspective and creating their implementation roadmap. The role is hybrid, based in central London, and offers up to £90,000 + bonus + excellent benefits!

Requirements:
• More than 5 years of experience in data engineering and data architecture
• Experience designing and delivering Snowflake solutions for 2 or more projects/clients
• Background of working for a consultancy
• Knowledge of Python, SQL, R or Scala, and any cloud platform such as Amazon Web Services (AWS), Azure or Google Cloud Platform (GCP)

Desirable:
• Experience working for insurance clients
• Experience with more than one cloud platform is a plus

Don't miss the chance to work with a world-leading consultancy, with some of the most innovative tools in the market! Apply now below. Lawrence Harvey is acting as an Employment Business in regard to this position. Visit our website and follow us on Twitter for all live vacancies (lawharveyjobs).
Mar 28, 2024
Full time
Data Engineering / Tech Lead - Azure, Databricks, Python - 2-year contract - 2 positions available

I'm looking for a Data Engineering Lead for a 2-year contract with a company specialising in waste management.

Key Skills:
• 5+ years in software development (various languages) with expertise in data structures, algorithms, data platforms, data lakes and business intelligence.
• Experience as a data engineer: creating data pipelines using tools like PySpark, Spark SQL, Scala, Airflow, etc.
• 1+ year in a technical leadership role.
• Proficiency in AWS, Google Cloud or Azure, and big data services (EMR, Databricks, Synapse, HDInsight, Kinesis, Snowflake, etc.).
• Skilled in Power Platform (Power Apps, Power Automate, Power BI).
• Advanced knowledge of Excel and office applications.
• Relevant system admin certification.
• Strong problem-solving ability and analytical skills.
• Excellent written and verbal communication.
• Agile environment experience; a team player with self-initiative.
• Results-driven and quality-focused.

If this sounds like a suitable opportunity, please click apply to be considered for the role.
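As a rough illustration of the pipeline work the advert describes (filtering and aggregating records before loading), here is a minimal plain-Python sketch; in practice this would be written in PySpark or Spark SQL, and the field names (`site`, `tonnage`) are hypothetical, not from the advert.

```python
# Hypothetical sketch of a simple batch pipeline step: aggregate waste
# tonnage per site while dropping malformed rows. Illustrative only;
# a real pipeline would use PySpark/Spark SQL against a data lake.
from collections import defaultdict

def aggregate_waste_tonnage(records):
    """Sum tonnage per site, skipping rows with missing or bad fields."""
    totals = defaultdict(float)
    for row in records:
        site = row.get("site")
        tonnage = row.get("tonnage")
        if site is None or not isinstance(tonnage, (int, float)):
            continue  # drop malformed rows rather than failing the batch
        totals[site] += tonnage
    return dict(totals)

sample = [
    {"site": "A", "tonnage": 12.5},
    {"site": "B", "tonnage": 3.0},
    {"site": "A", "tonnage": 7.5},
    {"site": "C"},  # malformed: no tonnage, silently skipped
]
print(aggregate_waste_tonnage(sample))  # {'A': 20.0, 'B': 3.0}
```

The same filter-then-aggregate shape maps directly onto a Spark `filter`/`groupBy`/`sum` chain when the data no longer fits in memory.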
Mar 28, 2024
Full time
Mar 28, 2024
Contractor
We are seeking a Junior Developer for UK Power Networks in central Ipswich. This will be for 6-6 months but could become long term.

JOB PURPOSE:
To play a key role in the implementation of UK Power Networks' Digital Strategy by transforming the way the business utilises technology and data to sustain and increase business performance. As a Junior Developer, your primary focus will be to support the Technology Specialists to define, build and continuously improve the core systems and data infrastructure of the business. You will be involved in designing solutions for on-premises and cloud-based technologies. You will be a vital part of a dynamic team defining and delivering continuous change into our systems and data architecture, driven increasingly by cloud-based solutions. What you deliver will drive the realisation of the Digital Strategy through enabling technology services that allow true business agility, robust systems resilience and excellent data intelligence. Specifically, you will be maintaining in-house business applications and working on minor bug fixes, including front-end user interface design, server-side code, business logic and database integration. This role is vital to enable the delivery of the UK Power Networks digital strategy and to set UK Power Networks up to operate within the expectations of the RIIO-ED2 regime.

PRINCIPAL ACCOUNTABILITIES:
This role will be responsible for the following:

Supporting technology change
• Take a supporting role in the customer service agile team, prioritising your own workload.
• Be the primary contact in the customer service agile team, prioritising workload according to agreed attributes, including understanding the impact of not delivering.
• Collaborate with representatives from across the business in the delivery of change for on-premises and cloud-based solutions.
• Support waterfall project delivery for certain projects and scope, working with project managers and team members to achieve the required outcomes.

Technology solving business challenges
• Showing a thoughtful approach to rapidly assessing and selecting relevant technology solutions that deliver business needs in a scalable and supportable manner.
• Keeping abreast of technology advances and identifying how and when such capabilities can be tested and utilised to drive digital innovation.
• Being passionate about how tech works and getting hands-on to develop cloud-based prototypes to rapidly prove business value and drive learning through action.

Driving data change
• Ensure your work is done with robustness, scalability and best practices in mind.
• Advocating best practices in data governance, ownership and stewardship within the team.

Ensuring standards and controls
• Document customer service technology system strategies, roadmaps and standards, ensuring their alignment with and support of UK Power Networks' business vision and objectives.
• Support the development of, and adherence to, technology principles and guardrails that underpin the realisation of the enterprise architecture and the corresponding business vision and strategy.

Managing communication
• Maintain key stakeholder relationships in the business to continuously understand their biggest challenges and data needs, incorporating these needs into technology design and implementation.

NATURE AND SCOPE:
UK Power Networks is the UK's largest electricity distribution network operator, delivering electricity to approximately 18 million customers, including critical infrastructure and businesses across the country. As we face new opportunities and challenges such as smart meters, electric vehicles and a carbon-neutral future, we need to simultaneously transform the way we use technology to evolve the service we offer to customers and the platform our teams use to serve them. It is the purpose of this initiative to drive this transformation, complementing the digital strategy that will seek to make UK Power Networks a leader in data-driven decision making. You will have the opportunity to have a major impact across the company and change the way technology and data are utilised by employees and customers of UK Power Networks. We are accountable for supporting the business in achieving its vision of becoming an 'employer of choice', 'a respected and trusted corporate citizen' and 'sustainably cost efficient'.

SKILLS, QUALIFICATIONS AND EXPERIENCE:

Qualifications
• Education to degree level is desirable but not essential.

Experience
• Knowledge or experience of working in agile delivery teams and/or leveraging the Scaled Agile Framework for Enterprises (SAFe).
• Experience is required in the creation and maintenance of databases within SQL Server, including schema normalisation and the production of views, stored procedures, triggers and user access control.
• Extensive experience with Office 365 and Azure, as well as data visualisation tools such as Power BI.
• Knowledge of the energy and utilities sector is very much preferred.

Technical Skills
• Writing clean, concise, commented code to exceptional standards in (but not limited to) C#, JavaScript, VBA, HTML, Razor markup and SQL.
• Using and deploying Microsoft O365 and Azure cloud services utilising SaaS, PaaS and IaaS models.
• Designing and delivering solutions on, or integrating with, real-time systems.
• Architecting, designing and delivering packaged solutions, including integration into the wider IT estate.
• Designing and implementing cloud-based data engineering solutions, including designing the networking and security arrangements, using technologies such as Azure Data Lake, Azure Data Factory, Azure Databricks, Azure SQL Data Warehouse/Synapse Analytics, HDInsight, Azure Analysis Services and NoSQL databases.
• Championing an automation and data-driven mindset, with knowledge of any programming/scripting language (e.g. PowerShell, Go, Python, Ruby, .NET).
• Regularly using data analytics technology and scripting (e.g. SQL, Python/R, NoSQL, Power BI, notebooks, machine learning).
• In-depth understanding of Windows, Linux and networking systems.
• Working with real-time event ingestion engines such as Azure Event Hubs or Apache Kafka, having developed connectors for a range of different data sources.
• An excellent range of communication skills, including being able to translate technical issues into non-technical terms and obtain complex details from stakeholders quickly and easily.
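To illustrate the event-ingestion bullet above, here is a minimal plain-Python stand-in for the kind of consumer logic an Event Hubs or Kafka connector would wrap: decode incoming JSON payloads defensively and group them into fixed-size batches before a downstream write. The payload shape (`meter`, `kwh`) is hypothetical, not from the advert.

```python
# Hypothetical sketch: decode JSON events and batch them, as a real-time
# ingestion consumer might before writing to storage. Illustrative only;
# a production consumer would use the Event Hubs or Kafka client SDKs.
import json

def batch_events(raw_events, batch_size=2):
    """Decode JSON events, skip bad payloads, group into batches."""
    decoded = []
    for raw in raw_events:
        try:
            decoded.append(json.loads(raw))
        except json.JSONDecodeError:
            continue  # skip undecodable payloads rather than crashing
    return [decoded[i:i + batch_size] for i in range(0, len(decoded), batch_size)]

events = ['{"meter": 1, "kwh": 0.4}', 'not-json', '{"meter": 2, "kwh": 0.9}']
print(batch_events(events))  # one batch holding the two valid readings
```

Batching like this amortises per-write overhead; the skip-on-decode-failure choice mirrors the common ingestion pattern of routing bad payloads to a dead-letter store rather than halting the stream.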
Mar 28, 2024
Contractor
Are you looking to join a consultancy who can genuinely offer you the opportunity to work on pure cloud-based projects considered the best in the UK? If you already have experience with the Azure analytics platform (Azure Data Factory, Azure Synapse, Azure Databricks, Apache Spark, Python/PySpark) and the eagerness to learn, then this could be a fantastic role for you. You will work alongside the nicest teammates, who also happen to be MVPs, and despite being fully remote, you'll meet up once in a while for drinks and catch-ups.

Role & Responsibilities
• Gathering requirements from senior stakeholders/clients.
• Delivering end-to-end Azure-based solutions through data engineering and data science methodologies.
• Promoting company services and looking for extra business opportunities.

Skills & Qualifications
• Extensive experience with SQL Server.
• Demonstrable experience with Microsoft Azure in a data capacity (use of Azure Data Factory, Azure Databricks, Azure Synapse, Python, PySpark, Apache Spark, Scala/Java).
• Ideally, existing consultancy experience.

Benefits
• Salary of up to £75,000 plus various bonuses
• Very generous benefits package
• Fully remote working

To apply for this role, please send an up-to-date CV to Jay Dixon or call for a catch-up in complete confidence. Frank Group's Data Teams offer more opportunities across the UK than any other recruiter. We're the proud sponsor and supporter of SQLBits, AWS re:Invent, Power Platform World Tour, the London Power BI User Group, the Newcastle Power BI User Group and the Newcastle Data Platform and Cloud User Group.
Mar 27, 2024
Full time
About Us
We're an award-winning, innovative tech consultancy - a team of creative problem solvers. Since 1993 we've been finding better, more sustainable ways to solve complex technology problems for some of the world's leading organisations, and we've delivered solutions that millions of people use every day. Over the last 30 years we have won several awards, including a prestigious Queen's Award for Enterprise in the Innovation category for our Enterprise Agile delivery approach. Operating from 26 locations across the world, we bring together teams of creative experts with diverse backgrounds and experiences, who enjoy working and learning in our collaborative and open culture and are committed to world-class delivery. We want to continue to grow our team with people just like you!

We are DataOps advocates and use software engineering best practices to build scalable, reusable data solutions that help clients use their data to gain insights, drive decisions and deliver business value. Clients don't engage BJSS to do the straightforward things; they ask us to help with their biggest challenges, which means we get to work with a wide range of tools and technologies, and there are always new things to learn.

About the Role
BJSS data engineers are specialist software engineers who build, optimise and maintain data applications, systems and services. The role combines the discipline of software engineering with the knowledge and experience of building data solutions in order to deliver business value. As a BJSS data engineer you'll help our clients deploy data pipelines and processes in a production-safe manner, using the latest technologies and within a DataOps culture. You'll work in a fast-moving, agile environment, within multi-disciplinary teams of highly skilled consultants, delivering modern data platforms into large organisations. You can expect to get involved in a variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3 and Cloud Data Fusion.

About You
You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications, systems, services and platforms. You have a good understanding of coding best practices and design patterns, along with experience with code and data versioning, dependency management, code quality and optimisation, error handling, logging, monitoring, validation and alerting. You have experience in:
- writing well-tested, object-oriented Python;
- using CI/CD tooling to analyse, build, test and deploy your code;
- making design choices for data storage and data processing, with a particular focus on cloud data services;
- using parallel computing to process large datasets and to optimise computationally intensive tasks;
- programmatically deploying, scheduling and monitoring components in a workflow;
- writing complex queries against relational and non-relational data stores.

Some of the Perks
- Flexible benefits allowance - you choose how to spend your allowance (additional pension contributions, healthcare, dental and more)
- Industry-leading health and wellbeing plan - we partner with several wellbeing support providers to cater to each individual's needs, including 24/7 GP services, mental health support and more
- Life assurance (4 x annual salary)
- 25 days annual leave plus bank holidays
- Hybrid working - our roles are not fully remote, as we take pride in the tight-knit communities we have created at our local offices, but we offer plenty of flexibility and you can split your time between the office, client site and working from home
- Discounts - we have preferred rates from dozens of retail, lifestyle and utility brands
- An industry-leading referral scheme with no limit on the number of referrals
- Flexible holiday buy/sell option
- Electric vehicle scheme
- Training opportunities and incentives - we support professional certifications across engineering and non-engineering roles, including unlimited access to O'Reilly
- Giving back - the ability to get involved nationally and regionally with partnerships that get people from diverse backgrounds into tech
- You will become part of a squad with people from different areas of the business who will help you grow at BJSS
- A busy social calendar that you can choose to join - quarterly town halls, squad nights out, weekends away with families included, office get-togethers
- GymFlex gym membership programme
Mar 27, 2024
Full time
About Us We're an award-winning innovative tech consultancy - a team of creative problem solvers. Since 1993 we've been finding better, more sustainable ways to solve complex technology problems for some of the world's leading organisations and delivered solutions that millions of people use every day. In the last 30 years we won several awards, including a prestigious Queen's Award for Enterprise in the Innovation category for our Enterprise Agile delivery approach. Operating from 26 locations across the world, we bring together teams of creative experts with diverse backgrounds and experiences, who enjoy working and learning in our collaborative and open culture and are committed to world-class delivery. We want to continue to grow our team with people just like you! We are DataOps advocates and use software engineering best practices to build scalable and re-usable data solutions to help clients use their data to gain insights, drive decisions and deliver business value. Clients don't engage BJSS to do the straightforward things, they ask us to help on their biggest challenges which means we get to work with a wide range of tools and technologies and there are always new things to learn. About the Role BJSS data engineers are specialist software engineers that build, optimise and maintain data applications, systems and services. This role combines the discipline of software engineering with the knowledge and experience of building data solutions in order to deliver business value. As a BJSS data engineer you'll help our clients deploy data pipelines and processes in a production-safe manner, using the latest technologies and with a DataOps culture. 
You'll work in a fast moving, agile environment, within multi-disciplinary teams of highly skilled consultants, delivering modern data platforms into large organisations.You can expect to get involved in variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications systems, services and platforms. You have a good understanding of coding best practices and design patterns and experience with code and data versioning, dependency management, code quality and optimisation, error handling, logging, monitoring, validation and alerting. You have experience in writing well tested object-oriented Python. You have experience with using CI/CD tooling to analyse, build, test and deploy your code. You have a good understanding of design choices for data storage and data processing, with a particular focus on cloud data services. You have experience in using parallel computing to process large datasets and to optimise computationally intensive tasks. You have experience in programmatically deploying, scheduling and monitoring components in a workflow. You have experience in writing complex queries against relational and non-relational data stores. Some of the Perks Flexible benefits allowance - you choose how to spend your allowance (additional pension contributions, healthcare, dental and more) Industry leading health and wellbeing plan - we partner with several wellbeing support functions to cater to each individual's need, including 24/7 GP services, mental health support, and other Life Assurance (4 x annual salary) 25 days annual leave plus bank holidays Hybrid working - Our roles are not fully remote as we take pride in the tight knit communities we have created at our local offices. 
But we offer plenty of flexibility and you can split your time between the office, client site and WFH Discounts - we have preferred rates from dozens of retail, lifestyle, and utility brands An industry-leading referral scheme with no limits on the number of referrals Flexible holiday buy/sell option Electric vehicle scheme Training opportunities and incentives - we support professional certifications across engineering and non-engineering roles, including unlimited access to O'Reilly Giving back - the ability to get involved nationally and regionally with partnerships to get people from diverse backgrounds into tech You will become part of a squad with people from different areas within the business who will help you grow at BJSS We have a busy social calendar that you can choose to join- quarterly town halls/squad nights out/weekends away with families included/office get togethers GymFlex gym membership programme
About Us We're an award-winning innovative tech consultancy - a team of creative problem solvers. Since 1993 we've been finding better, more sustainable ways to solve complex technology problems for some of the world's leading organisations and delivered solutions that millions of people use every day. In the last 30 years we won several awards, including a prestigious Queen's Award for Enterprise in the Innovation category for our Enterprise Agile delivery approach. Operating from 26 locations across the world, we bring together teams of creative experts with diverse backgrounds and experiences, who enjoy working and learning in our collaborative and open culture and are committed to world-class delivery. We want to continue to grow our team with people just like you! We are DataOps advocates and use software engineering best practices to build scalable and re-usable data solutions to help clients use their data to gain insights, drive decisions and deliver business value. Clients don't engage BJSS to do the straightforward things, they ask us to help on their biggest challenges which means we get to work with a wide range of tools and technologies and there are always new things to learn. About the Role BJSS data engineers are specialist software engineers that build, optimise and maintain data applications, systems and services. This role combines the discipline of software engineering with the knowledge and experience of building data solutions in order to deliver business value. As a BJSS data engineer you'll help our clients deploy data pipelines and processes in a production-safe manner, using the latest technologies and with a DataOps culture. 
You'll work in a fast moving, agile environment, within multi-disciplinary teams of highly skilled consultants, delivering modern data platforms into large organisations.You can expect to get involved in variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications systems, services and platforms. You have a good understanding of coding best practices and design patterns and experience with code and data versioning, dependency management, code quality and optimisation, error handling, logging, monitoring, validation and alerting. You have experience in writing well tested object-oriented Python. You have experience with using CI/CD tooling to analyse, build, test and deploy your code. You have a good understanding of design choices for data storage and data processing, with a particular focus on cloud data services. You have experience in using parallel computing to process large datasets and to optimise computationally intensive tasks. You have experience in programmatically deploying, scheduling and monitoring components in a workflow. You have experience in writing complex queries against relational and non-relational data stores. Some of the Perks Flexible benefits allowance - you choose how to spend your allowance (additional pension contributions, healthcare, dental and more) Industry leading health and wellbeing plan - we partner with several wellbeing support functions to cater to each individual's need, including 24/7 GP services, mental health support, and other Life Assurance (4 x annual salary) 25 days annual leave plus bank holidays Hybrid working - Our roles are not fully remote as we take pride in the tight knit communities we have created at our local offices. 
But we offer plenty of flexibility and you can split your time between the office, client site and WFH Discounts - we have preferred rates from dozens of retail, lifestyle, and utility brands An industry-leading referral scheme with no limits on the number of referrals Flexible holiday buy/sell option Electric vehicle scheme Training opportunities and incentives - we support professional certifications across engineering and non-engineering roles, including unlimited access to O'Reilly Giving back - the ability to get involved nationally and regionally with partnerships to get people from diverse backgrounds into tech You will become part of a squad with people from different areas within the business who will help you grow at BJSS We have a busy social calendar that you can choose to join- quarterly town halls/squad nights out/weekends away with families included/office get togethers GymFlex gym membership programme
Mar 27, 2024
Full time
About Us We're an award-winning innovative tech consultancy - a team of creative problem solvers. Since 1993 we've been finding better, more sustainable ways to solve complex technology problems for some of the world's leading organisations and delivered solutions that millions of people use every day. In the last 30 years we won several awards, including a prestigious Queen's Award for Enterprise in the Innovation category for our Enterprise Agile delivery approach. Operating from 26 locations across the world, we bring together teams of creative experts with diverse backgrounds and experiences, who enjoy working and learning in our collaborative and open culture and are committed to world-class delivery. We want to continue to grow our team with people just like you! We are DataOps advocates and use software engineering best practices to build scalable and re-usable data solutions to help clients use their data to gain insights, drive decisions and deliver business value. Clients don't engage BJSS to do the straightforward things, they ask us to help on their biggest challenges which means we get to work with a wide range of tools and technologies and there are always new things to learn. About the Role BJSS data engineers are specialist software engineers that build, optimise and maintain data applications, systems and services. This role combines the discipline of software engineering with the knowledge and experience of building data solutions in order to deliver business value. As a BJSS data engineer you'll help our clients deploy data pipelines and processes in a production-safe manner, using the latest technologies and with a DataOps culture. 
You'll work in a fast moving, agile environment, within multi-disciplinary teams of highly skilled consultants, delivering modern data platforms into large organisations.You can expect to get involved in variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications systems, services and platforms. You have a good understanding of coding best practices and design patterns and experience with code and data versioning, dependency management, code quality and optimisation, error handling, logging, monitoring, validation and alerting. You have experience in writing well tested object-oriented Python. You have experience with using CI/CD tooling to analyse, build, test and deploy your code. You have a good understanding of design choices for data storage and data processing, with a particular focus on cloud data services. You have experience in using parallel computing to process large datasets and to optimise computationally intensive tasks. You have experience in programmatically deploying, scheduling and monitoring components in a workflow. You have experience in writing complex queries against relational and non-relational data stores. Some of the Perks Flexible benefits allowance - you choose how to spend your allowance (additional pension contributions, healthcare, dental and more) Industry leading health and wellbeing plan - we partner with several wellbeing support functions to cater to each individual's need, including 24/7 GP services, mental health support, and other Life Assurance (4 x annual salary) 25 days annual leave plus bank holidays Hybrid working - Our roles are not fully remote as we take pride in the tight knit communities we have created at our local offices. 
But we offer plenty of flexibility and you can split your time between the office, client site and WFH Discounts - we have preferred rates from dozens of retail, lifestyle, and utility brands An industry-leading referral scheme with no limits on the number of referrals Flexible holiday buy/sell option Electric vehicle scheme Training opportunities and incentives - we support professional certifications across engineering and non-engineering roles, including unlimited access to O'Reilly Giving back - the ability to get involved nationally and regionally with partnerships to get people from diverse backgrounds into tech You will become part of a squad with people from different areas within the business who will help you grow at BJSS We have a busy social calendar that you can choose to join- quarterly town halls/squad nights out/weekends away with families included/office get togethers GymFlex gym membership programme
About Us We're an award-winning innovative tech consultancy - a team of creative problem solvers. Since 1993 we've been finding better, more sustainable ways to solve complex technology problems for some of the world's leading organisations and delivered solutions that millions of people use every day. In the last 30 years we won several awards, including a prestigious Queen's Award for Enterprise in the Innovation category for our Enterprise Agile delivery approach. Operating from 26 locations across the world, we bring together teams of creative experts with diverse backgrounds and experiences, who enjoy working and learning in our collaborative and open culture and are committed to world-class delivery. We want to continue to grow our team with people just like you! We are DataOps advocates and use software engineering best practices to build scalable and re-usable data solutions to help clients use their data to gain insights, drive decisions and deliver business value. Clients don't engage BJSS to do the straightforward things, they ask us to help on their biggest challenges which means we get to work with a wide range of tools and technologies and there are always new things to learn. About the Role BJSS data engineers are specialist software engineers that build, optimise and maintain data applications, systems and services. This role combines the discipline of software engineering with the knowledge and experience of building data solutions in order to deliver business value. As a BJSS data engineer you'll help our clients deploy data pipelines and processes in a production-safe manner, using the latest technologies and with a DataOps culture. 
You'll work in a fast moving, agile environment, within multi-disciplinary teams of highly skilled consultants, delivering modern data platforms into large organisations.You can expect to get involved in variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications systems, services and platforms. You have a good understanding of coding best practices and design patterns and experience with code and data versioning, dependency management, code quality and optimisation, error handling, logging, monitoring, validation and alerting. You have experience in writing well tested object-oriented Python. You have experience with using CI/CD tooling to analyse, build, test and deploy your code. You have a good understanding of design choices for data storage and data processing, with a particular focus on cloud data services. You have experience in using parallel computing to process large datasets and to optimise computationally intensive tasks. You have experience in programmatically deploying, scheduling and monitoring components in a workflow. You have experience in writing complex queries against relational and non-relational data stores. Some of the Perks Flexible benefits allowance - you choose how to spend your allowance (additional pension contributions, healthcare, dental and more) Industry leading health and wellbeing plan - we partner with several wellbeing support functions to cater to each individual's need, including 24/7 GP services, mental health support, and other Life Assurance (4 x annual salary) 25 days annual leave plus bank holidays Hybrid working - Our roles are not fully remote as we take pride in the tight knit communities we have created at our local offices. 
But we offer plenty of flexibility and you can split your time between the office, client site and WFH Discounts - we have preferred rates from dozens of retail, lifestyle, and utility brands An industry-leading referral scheme with no limits on the number of referrals Flexible holiday buy/sell option Electric vehicle scheme Training opportunities and incentives - we support professional certifications across engineering and non-engineering roles, including unlimited access to O'Reilly Giving back - the ability to get involved nationally and regionally with partnerships to get people from diverse backgrounds into tech You will become part of a squad with people from different areas within the business who will help you grow at BJSS We have a busy social calendar that you can choose to join- quarterly town halls/squad nights out/weekends away with families included/office get togethers GymFlex gym membership programme
Mar 27, 2024
Full time
About Us We're an award-winning innovative tech consultancy - a team of creative problem solvers. Since 1993 we've been finding better, more sustainable ways to solve complex technology problems for some of the world's leading organisations and delivered solutions that millions of people use every day. In the last 30 years we won several awards, including a prestigious Queen's Award for Enterprise in the Innovation category for our Enterprise Agile delivery approach. Operating from 26 locations across the world, we bring together teams of creative experts with diverse backgrounds and experiences, who enjoy working and learning in our collaborative and open culture and are committed to world-class delivery. We want to continue to grow our team with people just like you! We are DataOps advocates and use software engineering best practices to build scalable and re-usable data solutions to help clients use their data to gain insights, drive decisions and deliver business value. Clients don't engage BJSS to do the straightforward things, they ask us to help on their biggest challenges which means we get to work with a wide range of tools and technologies and there are always new things to learn. About the Role BJSS data engineers are specialist software engineers that build, optimise and maintain data applications, systems and services. This role combines the discipline of software engineering with the knowledge and experience of building data solutions in order to deliver business value. As a BJSS data engineer you'll help our clients deploy data pipelines and processes in a production-safe manner, using the latest technologies and with a DataOps culture. 
You'll work in a fast moving, agile environment, within multi-disciplinary teams of highly skilled consultants, delivering modern data platforms into large organisations.You can expect to get involved in variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications systems, services and platforms. You have a good understanding of coding best practices and design patterns and experience with code and data versioning, dependency management, code quality and optimisation, error handling, logging, monitoring, validation and alerting. You have experience in writing well tested object-oriented Python. You have experience with using CI/CD tooling to analyse, build, test and deploy your code. You have a good understanding of design choices for data storage and data processing, with a particular focus on cloud data services. You have experience in using parallel computing to process large datasets and to optimise computationally intensive tasks. You have experience in programmatically deploying, scheduling and monitoring components in a workflow. You have experience in writing complex queries against relational and non-relational data stores. Some of the Perks Flexible benefits allowance - you choose how to spend your allowance (additional pension contributions, healthcare, dental and more) Industry leading health and wellbeing plan - we partner with several wellbeing support functions to cater to each individual's need, including 24/7 GP services, mental health support, and other Life Assurance (4 x annual salary) 25 days annual leave plus bank holidays Hybrid working - Our roles are not fully remote as we take pride in the tight knit communities we have created at our local offices. 
But we offer plenty of flexibility and you can split your time between the office, client site and WFH Discounts - we have preferred rates from dozens of retail, lifestyle, and utility brands An industry-leading referral scheme with no limits on the number of referrals Flexible holiday buy/sell option Electric vehicle scheme Training opportunities and incentives - we support professional certifications across engineering and non-engineering roles, including unlimited access to O'Reilly Giving back - the ability to get involved nationally and regionally with partnerships to get people from diverse backgrounds into tech You will become part of a squad with people from different areas within the business who will help you grow at BJSS We have a busy social calendar that you can choose to join- quarterly town halls/squad nights out/weekends away with families included/office get togethers GymFlex gym membership programme
About Us We're an award-winning innovative tech consultancy - a team of creative problem solvers. Since 1993 we've been finding better, more sustainable ways to solve complex technology problems for some of the world's leading organisations and delivered solutions that millions of people use every day. In the last 30 years we won several awards, including a prestigious Queen's Award for Enterprise in the Innovation category for our Enterprise Agile delivery approach. Operating from 26 locations across the world, we bring together teams of creative experts with diverse backgrounds and experiences, who enjoy working and learning in our collaborative and open culture and are committed to world-class delivery. We want to continue to grow our team with people just like you! We are DataOps advocates and use software engineering best practices to build scalable and re-usable data solutions to help clients use their data to gain insights, drive decisions and deliver business value. Clients don't engage BJSS to do the straightforward things, they ask us to help on their biggest challenges which means we get to work with a wide range of tools and technologies and there are always new things to learn. About the Role BJSS data engineers are specialist software engineers that build, optimise and maintain data applications, systems and services. This role combines the discipline of software engineering with the knowledge and experience of building data solutions in order to deliver business value. As a BJSS data engineer you'll help our clients deploy data pipelines and processes in a production-safe manner, using the latest technologies and with a DataOps culture. 
You'll work in a fast moving, agile environment, within multi-disciplinary teams of highly skilled consultants, delivering modern data platforms into large organisations.You can expect to get involved in variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications systems, services and platforms. You have a good understanding of coding best practices and design patterns and experience with code and data versioning, dependency management, code quality and optimisation, error handling, logging, monitoring, validation and alerting. You have experience in writing well tested object-oriented Python. You have experience with using CI/CD tooling to analyse, build, test and deploy your code. You have a good understanding of design choices for data storage and data processing, with a particular focus on cloud data services. You have experience in using parallel computing to process large datasets and to optimise computationally intensive tasks. You have experience in programmatically deploying, scheduling and monitoring components in a workflow. You have experience in writing complex queries against relational and non-relational data stores. Some of the Perks Flexible benefits allowance - you choose how to spend your allowance (additional pension contributions, healthcare, dental and more) Industry leading health and wellbeing plan - we partner with several wellbeing support functions to cater to each individual's need, including 24/7 GP services, mental health support, and other Life Assurance (4 x annual salary) 25 days annual leave plus bank holidays Hybrid working - Our roles are not fully remote as we take pride in the tight knit communities we have created at our local offices. 
But we offer plenty of flexibility and you can split your time between the office, client site and WFH Discounts - we have preferred rates from dozens of retail, lifestyle, and utility brands An industry-leading referral scheme with no limits on the number of referrals Flexible holiday buy/sell option Electric vehicle scheme Training opportunities and incentives - we support professional certifications across engineering and non-engineering roles, including unlimited access to O'Reilly Giving back - the ability to get involved nationally and regionally with partnerships to get people from diverse backgrounds into tech You will become part of a squad with people from different areas within the business who will help you grow at BJSS We have a busy social calendar that you can choose to join- quarterly town halls/squad nights out/weekends away with families included/office get togethers GymFlex gym membership programme
Mar 27, 2024
Full time
About Us
We're an award-winning, innovative tech consultancy - a team of creative problem solvers. Since 1993 we've been finding better, more sustainable ways to solve complex technology problems for some of the world's leading organisations, and we've delivered solutions that millions of people use every day. In the last 30 years we have won several awards, including a prestigious Queen's Award for Enterprise in the Innovation category for our Enterprise Agile delivery approach. Operating from 26 locations across the world, we bring together teams of creative experts with diverse backgrounds and experiences, who enjoy working and learning in our collaborative and open culture and are committed to world-class delivery. We want to continue to grow our team with people just like you!

We are DataOps advocates and use software engineering best practices to build scalable, reusable data solutions that help clients use their data to gain insights, drive decisions and deliver business value. Clients don't engage BJSS to do the straightforward things; they ask us to help with their biggest challenges, which means we get to work with a wide range of tools and technologies and there are always new things to learn.

About the Role
BJSS data engineers are specialist software engineers who build, optimise and maintain data applications, systems and services. The role combines the discipline of software engineering with the knowledge and experience of building data solutions in order to deliver business value. As a BJSS data engineer you'll help our clients deploy data pipelines and processes in a production-safe manner, using the latest technologies and with a DataOps culture. 
You'll work in a fast-moving, agile environment, within multi-disciplinary teams of highly skilled consultants, delivering modern data platforms into large organisations. You can expect to get involved in a variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3 and Cloud Data Fusion.

About You
You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications, systems, services and platforms. You have a good understanding of coding best practices and design patterns, and experience with code and data versioning, dependency management, code quality and optimisation, error handling, logging, monitoring, validation and alerting. You have experience in writing well-tested, object-oriented Python. You have experience with using CI/CD tooling to analyse, build, test and deploy your code. You have a good understanding of design choices for data storage and data processing, with a particular focus on cloud data services. You have experience in using parallel computing to process large datasets and to optimise computationally intensive tasks. You have experience in programmatically deploying, scheduling and monitoring components in a workflow. You have experience in writing complex queries against relational and non-relational data stores.

Some of the Perks
Flexible benefits allowance - you choose how to spend your allowance (additional pension contributions, healthcare, dental and more)
Industry-leading health and wellbeing plan - we partner with several wellbeing support functions to cater to each individual's needs, including 24/7 GP services, mental health support and others
Life Assurance (4 x annual salary)
25 days annual leave plus bank holidays
Hybrid working - our roles are not fully remote as we take pride in the tight-knit communities we have created at our local offices, but we offer plenty of flexibility and you can split your time between the office, client site and WFH
Discounts - we have preferred rates from dozens of retail, lifestyle and utility brands
An industry-leading referral scheme with no limits on the number of referrals
Flexible holiday buy/sell option
Electric vehicle scheme
Training opportunities and incentives - we support professional certifications across engineering and non-engineering roles, including unlimited access to O'Reilly
Giving back - the ability to get involved nationally and regionally with partnerships to get people from diverse backgrounds into tech
You will become part of a squad with people from different areas within the business who will help you grow at BJSS
We have a busy social calendar that you can choose to join - quarterly town halls, squad nights out, weekends away with families included, office get-togethers
GymFlex gym membership programme
Our growing Technology and Design team plays a key role in ensuring OS is at the cutting edge of geospatial capability, and it is looking for people to join it. Its mission is to work across the business to provide customer-centric design and technology services. Join us and you'll have an opportunity to make an impact: to empower projects that deliver real-world benefits across Britain and internationally, to hear our customers say they couldn't have done it without us, and to be central to OS's vision - to be recognised as world leaders in geospatial services, creating location insight for positive impact.

About the Role
Having an appetite for knowledge and the ability to think creatively are highly valued within OS. Our engineering team is working to develop the next generation of cloud-based systems that drive our world-leading geospatial and business applications. Be part of this growth by joining our forward-thinking Agile development team as a Geospatial Engineer. The team is focused on the use of automatic feature extraction to deliver solutions and data layers for our customers using cutting-edge technology and approaches. This exciting role will challenge you to translate requirements into cutting-edge solutions for our customers. You will have the opportunity to use Databricks and Big Data technologies to create innovative geospatial products, whilst supporting production systems that you and the team have developed. The role involves:
Delivering at pace to a high standard
Development, deployment, debugging, testing (TDD, BDD), scaling and monitoring of secure systems
Always looking for opportunities to advocate best practice to Associate Engineers in your team
We are keen to invest in you, so we encourage development and training to allow you to grow within your role. There are also paths for development opportunities within the business as your expertise grows.

Here is a snapshot of the technologies that we use: Azure Cloud Technologies, Python, Databricks, Azure DevOps, PowerShell, YAML Pipelines, Esri ArcGIS, FME, QGIS.

What we are looking for
If you are interested in joining a team that lies at the heart of what OS is about, we are looking for someone who can demonstrate essential skills and experience in:
Managing and visualising geospatial data using GIS software
Databricks, utilising PySpark, Scala and SQL to build ETL workflows
Azure data solutions, using Azure Data Factory, Azure Data Lake and Storage Account configuration
Open-source Python data libraries such as Pandas and NumPy
Software configuration and engineering of Python-based solutions (use of OOP, TDD, SOLID)
Azure DevOps, including version control and Azure Pipeline configuration
A consistent track record in iterative and incremental development
Problem resolution and selection of technical solutions
Desirable skills:
Experience working with Big Data/Machine Learning technologies
Experience in advocating best practice within your team and to junior members of the team

The rewards
We want you to love what you do. That's why our benefits package rewards a job well done. We'll give you:
Salary: £37,511 - £44,130
Performance-related bonus
A competitive pension scheme
We embrace flexible working and can consider different working hours dependent on the role and your personal circumstances
25 days annual leave (30 days after five years), bank holidays and an extra 3 days over Christmas
Plus, a suite of excellent additional benefits

Location
OSHQ is based in Southampton, Hampshire, but at OS we believe work is something you do, not somewhere you go. We embrace a hybrid working model where we believe the choice is with the individual on when they work from our fantastic office or from home. We want to continue to be an inclusive employer and recognise that one size doesn't necessarily fit all. 
Closing date: Sunday March We believe diversity and inclusion is about working together - in an encouraging and respectful environment to reach our full potential. We believe combining different backgrounds, experiences and perspectives will help us reach our vision and be trusted and admired across the globe for setting the standards and leading the way. We are looking for passionate people from a range of backgrounds and welcome applications from any race, age, gender, background or religion. We're individually talented and collectively powerful, and we give you the space to take your career in whichever direction you want.
Mar 27, 2024
Full time
The Senior Data Scientist is a critical role supporting the growth of our Data and AI services, reporting to the Practice Director. The role will work with clients and our team to set out machine learning best practices and take responsibility for the low-level design and development of client ML solutions, including selection and fine-tuning of foundation models and pre-trained services such as Azure Cognitive Services. The role will also support technical pre-sales, providing technical guidance on client AI requirements.

Responsibilities
Technical leadership and development of machine learning models, including requirements and data scoping, model selection, training, evaluation, deployment and monitoring. Includes selection and fine-tuning of foundation and pre-trained models. Comfortable advising on and selecting appropriate ML solutions against requirements, including cost implications.
Set the machine learning best practices and standards for the Data and AI practice, including identifying opportunities for re-usable components and leading their development, maintenance and sharing.
Collaborate with stakeholders - including client teams, your wider project team (Security, Data Engineering, Cloud Architect, App Developer, Scrum Master, BA) and partner technical representatives (Databricks, Microsoft, AWS) - to ensure that ML solutions meet business requirements and are scalable, secure and compliant with organisational policies and standards.
Support technical pre-sales to provide guidance on client requirements for proposals, providing guidance on the technical approach and timeframes.
Coach and support the development of data scientists on the team. Potential to take on line management or continue a technical SME track without direct people management responsibilities.
Advance our ML technology vendor partnerships in data science, including maintaining and taking relevant certifications, and identifying opportunities for further partnerships.
Contribute to marketing events and thought leadership articles, including providing input to the strategy and direction.

About You
Design, development and maintenance of machine learning models in a highly regulated industry. Must have experience with productionised models for business-critical functions that are consumed in downstream services, whether in infrastructure, reports or applications.
Knowledge of all stages of the ML lifecycle, and of applying MLOps principles to manage experimentation, evaluation, deployment, integration and monitoring.
Experienced in Databricks and Azure AI services, including Cognitive Services, AzureML and Azure OpenAI.
Excellent Python or R expertise, including writing, testing and quality-assuring ML code.
Experienced in working with sensitive data; understands and applies security, ethics and privacy best practices.
Team management and leadership.
Pre-sales responses to RFPs and client proposals.
Consulting experience.

Why people choose to grow their careers at UBDS
People are the most important aspect of our business, so adding the right people to the team and helping them grow is critical. This is why we've invested in a people-focused team to look after the entire employee experience. With an impressive portfolio of customers in both the public and private sectors, we have a variety of exciting projects to be involved in. As a technology-agnostic organisation, you'll gain exposure to the world's leading and latest technology.

Employee Benefits
Training - all team members are offered a number of options for personal development, whether technical, business acumen or methodologies. We want you to grow with us and to help us achieve more
Private medical cover for you and your spouse/partner, offered via Vitality
Discretionary bonus based on a blend of personal and company performance
Holiday - you will receive 25 days holiday, plus 1 day for your birthday and 1 day for your work anniversary, in addition to UK bank holidays
Electric vehicle leasing with salary sacrifice
Contributed pension scheme
Death in service cover
Hybrid working - UBDS offers a flexible working environment to help enable you to operate at your maximum regardless of your location. With offices in London & Manchester, we offer a culture that is focused on outcomes and giving you a work-life balance, while at the same time creating and driving a culture of inclusivity and togetherness

About UBDS
UBDS was born out of a vision to build lasting relationships by delivering digital transformation solutions with unrivalled speed and efficiency. We have taken complex organisations to the frontier of innovation, transforming enterprise and public sector to be faster, leaner and more competitive. Organisations turn to us for deep knowledge, specialist skills, years of on-the-job experience and our can-do, get-it-done culture. Projects are personal. Our work is an extension of the values we embody, and we are always looking for ways to fill the gap for our clients. For us, it's about top- and bottom-line growth, and equipping our clients with cutting-edge technology that empowers innovation. We exist to deliver significant, measurable and sustainable digital transformation, and we achieve this by delivering value to our customers in the following ways: 1. Accelerating change 2. Ensuring frictionless high performance 3. Mitigating risk and ensuring security. From advisory to design and execution, we implement the technology that aligns with our clients' goals - to help them innovate and thrive.

We have four key values that guide the way we work together, engage with our customers, make decisions, and ultimately succeed:
Our reputation is everything
We are passionate about technology and innovation
We deliver value and make an impact
We keep it simple and make it happen

Interested in joining our innovative team? Get in touch. To find out more about this role, one of our other vacancies, or just to talk about UBDS and where you might fit into one of our free-thinking and ever-advancing teams, please head to or send your CV to 

Equal Opportunities
We are an equal opportunities employer and do not discriminate on the grounds of gender, sexual orientation, marital or civil partner status, pregnancy or maternity, gender reassignment, race, colour, nationality, ethnic or national origin, religion or belief, disability or age.
Mar 25, 2024
Full time
The Senior Data Scientist is a critical role supporting the growth of our Data and AI services, reporting to the Practice Director. This role will work with clients and our team to set out the machine learning best practices and take responsibility for the low-level design and development of client ML solutions, including selection and fine-tuning of foundation models and pre-trained services such as Azure Computer Services. The role will also support technical pre-sales, providing technical guidance on client AI requirements. Responsibilities Technical leadership and development of machine learning models, including requirements and data scoping, model selection, training, evaluation, deployment and monitoring. Includes selection and fine-tuning of foundation and pre-trained models. Comfortable advising and selecting appropriate ML solutions against requirements, including cost implications. Set the machine learning best practices and standards for the Data and AI practice, including identifying opportunities for re-usable components and leading their development, maintenance and sharing. Collaborate with stakeholders, including client teams, your wider project team such as Security, Data Engineering, Cloud Architect, App Developer, Scrum Master, BA, and Partner technical representatives (Databricks, Microsoft, AWS) to ensure that ML solutions meet business requirements and are scalable, secure, and compliant with organizational policies and standards. Support technical pre-sales to provide guidance on client requirements for proposals, providing guidance on the technical approach and timeframes. Coach and support development of data scientists on the team. Potential to take on line management or continue a technical SME track without direct people management responsibilities. Advance our ML technology vendor partnerships in data science, including maintaining and taking relevant certifications, and identifying opportunities for further partnerships. 
Contribute to marketing events and thought leadership articles, including providing input to the strategy and direction About You Design, development, and maintenance of machine learning models in a highly regulated industry. Must have experience with productionised models for business-critical functions that is consumed in downstream services, whether in infrastructure, reports, or applications Knowledge of all stages of the ML lifecycle, and applying MLOps principles to manage experimentation, evaluation, deployment, integration, and monitoring. Experienced in Databricks and Azure AI services, including Cognitive Services, AzureML, Azure OpenAI. Excellent Python or R expertise, including writing, testing and quality assuring ML code. Experienced in working with sensitive data, understands and applies security, ethics, and privacy best practices. Team management and leadership. Pre-sales responses to RFPs and client proposals. Consulting experience. Why people choose to grow their careers at UBDS People are the most important aspect of our business, so adding the right people to the team and helping them grow is critical. This is why we've invested in a people-focused team to look after the entire employee experience. With an impressive portfolio of customers in both the public and private sectors, we have a variety of exciting projects to be involved in. As a technology agnostic organisation, you'll gain exposure to the world's leading and latest technology. Employee Benefits Training - All team members are offered a number of options in terms of personal development, whether it is technical led, business acumen or methodologies. 
We want you to grow with us and to help us achieve more.
- Private medical cover for you and your spouse/partner, offered via Vitality
- Discretionary bonus based on a blend of personal and company performance
- Holiday: 25 days, plus 1 day for your birthday and 1 day for your work anniversary, in addition to UK bank holidays
- Electric vehicle leasing with salary sacrifice
- Contributed pension scheme
- Death in service cover
- Hybrid working: UBDS offers a flexible working environment to help you operate at your maximum regardless of your location. With offices in London and Manchester, we offer a culture that is focused on outcomes and work-life balance, while at the same time creating and driving a culture of inclusivity and togetherness.

About UBDS
UBDS was born out of a vision to build lasting relationships by delivering digital transformation solutions with unrivalled speed and efficiency. We have taken complex organisations to the frontier of innovation, transforming enterprise and the public sector to be faster, leaner and more competitive. Organisations turn to us for deep knowledge, specialist skills, years of on-the-job experience and our can-do, get-it-done culture. Projects are personal. Our work is an extension of the values we embody, and we are always looking for ways to fill the gap for our clients. For us, it's about top- and bottom-line growth, and equipping our clients with cutting-edge technology that empowers innovation. We exist to deliver significant, measurable and sustainable digital transformation, and we achieve this by delivering value to our customers in the following ways: 1. Accelerating change 2. Ensuring frictionless high performance 3. Mitigating risk and ensuring security. From advisory to design and execution, we implement the technology that aligns with our clients' goals, to help them innovate and thrive.
We have four key values that guide the way we work together, engage with our customers, make decisions, and ultimately succeed:
- Our reputation is everything
- We are passionate about technology and innovation
- We deliver value and make an impact
- We keep it simple and make it happen
Interested in joining our innovative team? Get in touch. To find out more about this role, one of our other vacancies, or just to talk about UBDS and where you might fit into one of our free-thinking and ever-advancing teams, please head to or send your CV to
Equal Opportunities
We are an equal opportunities employer and do not discriminate on the grounds of gender, sexual orientation, marital or civil partner status, pregnancy or maternity, gender reassignment, race, colour, nationality, ethnic or national origin, religion or belief, disability or age.
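The ML lifecycle this Senior Data Scientist role describes (experimentation, evaluation, deployment and monitoring, managed with MLOps principles) can be illustrated with a minimal, self-contained Python sketch. Everything here is invented for illustration: the synthetic data, the trivial threshold classifier and the deployment gate are stand-ins, not part of any employer's actual stack.

```python
import random

def train(rows):
    """Fit a trivial classifier: threshold at the midpoint of the class means."""
    pos = [x for x, y in rows if y == 1]
    neg = [x for x, y in rows if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

def predict(threshold, x):
    return 1 if x >= threshold else 0

def evaluate(threshold, rows):
    """Accuracy on held-out rows: the 'evaluation' stage of the lifecycle."""
    return sum(1 for x, y in rows if predict(threshold, x) == y) / len(rows)

# Experimentation: synthetic, seeded data so the run is reproducible.
random.seed(0)
labels = [random.randint(0, 1) for _ in range(200)]
data = [(random.gauss(1.0 if y else -1.0, 0.5), y) for y in labels]
train_rows, test_rows = data[:150], data[150:]

model = train(train_rows)
accuracy = evaluate(model, test_rows)

# Deployment gate (an MLOps-style check): promote only if the metric clears a bar.
READY = accuracy >= 0.9
print(f"accuracy={accuracy:.2f} ready_to_deploy={READY}")
```

In a real Databricks or AzureML setting, the same gate would sit in a pipeline, with the metric logged to an experiment tracker and the promotion step feeding a model registry.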
The Company: At Amber Labs, we are a cutting-edge UK and European technology consultancy that prioritises empowering autonomy, promoting experimentation, and facilitating rapid learning to provide exceptional value to our clients. Our company culture is centred around collaboration, where all colleagues, regardless of their role, work together to minimise risk and shorten delivery times. Our team consists of highly skilled cross-functional consultants, analysts, and support staff. Our clients have the opportunity to earn R&D credits that can be used towards our areas of expertise: Data, Governance, and Cloud Engineering, allowing us to drive customer-focused innovation. Our work extends across both the public and private sectors, providing our colleagues with a diverse and interesting landscape of experience. Amber Labs was founded on three key principles:
1. A people-first internal culture, with diverse investments and exciting opportunities for our team, and a partnership structure that ensures everyone has a chance to share in the success of the company.
2. Constant iteration to identify opportunities to develop artifacts, accelerators, and automation solutions that allow for rapid deployment of highly technical cloud or on-premises solutions for our clients.
3. Consistent investment in our ADM (Amber Labs Delivery Methodology, underpinned by Agile methodology) to ensure maximum velocity, quality, and value.
With demand for our services at an all-time high and continuous growth in our market sectors, we are embarking on a major recruitment drive. We are eager to recruit a talented Data Architect to join our Digital Practice and one of our project delivery teams.

Expectations:
- Proven experience as a Data Architect
- Strong dimensional modelling experience is a must
- Create ideas within feasibility stages, considering and identifying risks and identifying viable solution options that offer value-for-money delivery
- Resolve complex technical design conflicts through stakeholder engagement and negotiation
- Own technical design documentation, design standards, metadata and data quality, and adherence and compliance across the team
- Ensure that designs are fit for purpose, clearly understood and meet the strategic direction of Technology
- Be instrumental in creating innovative ideas, considering risks, opportunities and impacts arising from digital technologies
- Shape propositions and suggest viable solution options
- Provide delivery resource estimates and timescales as required
- Ensure capacity and capability to carry out the work assigned
- Work closely with Solution Architects to define database types and advise on the best solution based on use cases

Requirements:
- Experience of Azure, GCP or AWS
- Advanced SQL knowledge
- Experience with relational, MPP and NoSQL databases and object storage
- Experience with Python / Scala / Databricks / Spark
- Strong background in data modelling
- Streaming and batch-mode processes
- Agile ways of working
- Source control management, coding standards, code reviews, testing, CI/CD
- Experience in designing and implementing distributed systems

Benefits:
- Join a rapidly expanding startup where personal growth is part of our DNA
- Benefit from a flexible work environment focused on deliverable outcomes
- Private medical insurance through Aviva
- Company pension plan through Nest
- 25 days of annual leave plus UK bank holidays
- Access to Perkbox, a global employee rewards platform offering discounts, perks and wellness resources
- A generous employee referral programme
- A highly collaborative and collegial environment with opportunities for career advancement
- Encouragement to take bold steps and embrace a mindset of experimentation
- Choose your preferred device, PC or Mac

Diversity & Inclusion: Here at Amber Labs, we are dedicated to fostering an inclusive and equitable workplace for all.
Our commitment to diversity, equality, and inclusion includes:
- Valuing the unique experiences, perspectives, and backgrounds of all employees and creating an environment where everyone feels welcomed, respected, and valued.
- Prohibiting all forms of harassment, bullying, discrimination, and victimisation and promoting a culture of dignity and respect for all.
- Educating all new hires on our Diversity and Inclusion policies and ensuring they are aware of their rights and responsibilities to create a safe and inclusive workplace.
By taking these steps, we are dedicated to building a workplace that reflects and celebrates the diversity of our employees and communities. This role at Amber Labs is a permanent position, and all employees are required to meet the Baseline Personnel Security Standard (BPSS). Please be advised that, at this time, we are unable to consider candidates who require sponsorship or hold a visa of any type.
Mar 25, 2024
Full time
Location(s):
- Swindon, Wiltshire, GB, SN5 6PB
- London, City of London, GB, EC2R 8HP
- Essen, NW, DE, 45141
RWE Supply & Trading GmbH, Lead in Data. To start as soon as possible, full time, permanent.

Your future plans
A permanent full-time role within the Trading IT arena at RWE Supply & Trading GmbH (RWEST). The role could be located in Essen (Germany), London (UK) or Swindon (UK), with some travel between sites required. Although the role is full-time, we do have some degree of flexibility with working times; we are happy to discuss this at the interview stage. It's an exciting time to join us! RWEST is implementing and further developing a Data & Analytics platform to make the sharing, searching, transforming and on-boarding of data simple and efficient. Data is fundamental to the company's business model and will become an ever-increasing asset in the future. As part of an expanding multi-disciplinary team, you will play a key role in this company-wide initiative. We are developing a cloud-based platform and infrastructure using the latest tech, capable of cleaning, transforming and efficiently accessing data, running on-demand as well as scheduled models, and integrating with pre-existing technology. The initiative is sponsored by the Board and is a key enabler for the company's commercial strategy. We are therefore looking for an experienced individual to join us: someone who will play a key role in shaping and building the platform as we scale it, influencing how it is built and the technical decisions we take. If this sounds appealing, then apply now!
As an interdisciplinary AWS Cloud Expert, you will:
- Take ownership of architecting and implementing improvements to our data platform cloud environment
- Work in a supportive, collaborative environment focused on delivering value to our users
- Be a specialist in AWS cloud-native solutions, defining and improving tools and designs, and working with several vendors in the market
- Support a multi-disciplinary team of DevOps and Data Engineers, promoting knowledge sharing and upskilling
- Be a real technical problem solver, able to create awesome solutions for complex requirements with a hands-on attitude
- Specialise in continuous delivery and automation techniques for managing cloud-native services using a wide range of technologies
- Keep up to date with new trends in cloud, always looking to new solutions to solve complex problems
- Develop a detailed understanding of our cloud infrastructure (and how it fits into the wider organisation), our codebases (including Infrastructure as Code), and our support and availability requirements

Your powerful skills
- A degree in, or related to, Computer Science or Mathematics, or equivalent experience
- Experience as a junior DevOps engineer with exposure to cloud platforms, automation and continuous software delivery
- Knowledge of Infrastructure as Code (Terraform, CloudFormation, etc.)
- Fundamental experience of CI/CD pipelines (Azure DevOps, GitLab or similar)
- Experience in development and scripting (e.g. Python, Ruby, Bash)
- Good working experience with Linux and Windows server management and Active Directory integration
- Experience with logging and monitoring platforms (ELK, CloudWatch, etc.)
- Fluent in English (verbal and written)

Advantageous, but not a must
- AWS certifications to professional level
- Experience building and supporting robust and scalable big data platforms (Dremio, Databricks)
- Deployment tools: Azure DevOps, AWS CodeDeploy or similar
- Hands-on experience with IaaS and PaaS in Amazon Web Services, including S3, Lambda, EC2, KMS, IAM and CloudWatch
- Knowledge of cloud security

What we value most is passion, willingness to learn and a determined and resilient work ethic. So, if you can display most of the skills above, we would like to hear from you.

Benefits you can rely on
Working alongside our traders and analysts on this initiative gives you the opportunity to shape the commercial business from an IT perspective! Right from the beginning you'll be making a difference within your team, supplemented by a variety of comprehensive on-the-job learning. We are also able to offer you the following:
- Working in one of the most interesting business contexts: a mixture of energy supply, trading and, of course, IT
- Working in an agile team on things which matter
- Competitive salary plus annual discretionary bonus
- Open and diverse company culture

Apply now with just a few clicks: ad code 74826. Any questions? Deborah Münz (Recruiting), , Nick Plaßmann (specialist department), We look forward to meeting you! We value diversity and therefore welcome all applications, irrespective of gender, disability, nationality, ethnic and social background, religion and beliefs, age or sexual orientation and identity. Of course, you can find us on LinkedIn, Twitter and Xing, too. 170 traders in the energy business, 290 experts on Europe's largest energy trading floor and 1,100 additional professionals.
Join RWE Supply & Trading and shape the future of the trading business to make energy clean, reliable and affordable.
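The Infrastructure as Code skills this role lists (Terraform, CloudFormation) rest on one idea: the environment is declared as version-controlled data rather than configured by hand. As a hedged sketch of that idea, here is a minimal, hypothetical CloudFormation-style template built as plain Python data; the logical resource name "DataLakeBucket" and the bucket name are invented for illustration:

```python
import json

# Minimal, hypothetical CloudFormation-style template expressed as data.
# The resource and bucket names below are invented for this example.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "DataLakeBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {
                "BucketName": "example-data-platform-raw",
                "VersioningConfiguration": {"Status": "Enabled"},
            },
        }
    },
}

# Because the template is ordinary data, it can be rendered, linted and
# diffed in a CI/CD pipeline before any deployment tool applies it.
rendered = json.dumps(template, indent=2, sort_keys=True)
print(rendered)
```

Terraform expresses the same declaration in HCL rather than JSON, but the workflow the role is really asking for is identical: plan, review and apply from version control.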
Sep 24, 2022
Full time
Location
Whilst you may have any of our UK offices as a base location, you must be fully flexible in terms of assignment location, as these roles may involve periods of time away from home during the week at short notice. Capgemini requires our employees to be geographically mobile and able to travel to customer sites to perform our jobs.

Who you'll be working with
The Cloud Data Platforms team is part of the Insights and Data Global Practice and has seen strong growth and continued success across a variety of projects and sectors. Cloud Data Platforms is the home of the Data Engineers, Platform Engineers, Solutions Architects and Business Analysts who are focused on driving our customers' digital and data transformation journeys using modern cloud platforms. We specialise in using the latest frameworks, reference architectures and technologies on AWS, Azure and GCP. We continue to grow and are looking for talented individuals who want to join our high-performing team. If you would like to develop your career as part of a team of highly skilled professionals who are passionate about increasing the value of data and analytics in organisations, you have come to the right place.

The focus of your role
We are looking for strong GCP Data Engineers who are passionate about cloud technology and who ideally have skills in many of the following areas:
• Build and deliver GCP data engineering solutions as part of a larger project
• Use Google data products (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.) to build solutions for our customers
• Experience in Spark (Scala/Python/Java) and Kafka
• Experience in MDM, Metadata Management, Data Quality and Data Lineage tools
• E2E data engineering and lifecycle management (including non-functional requirements and operations)
• E2E solution design skills: prototyping, usability testing and data visualisation literacy
• Experience with SQL and NoSQL modern data stores
• Build relationships with client stakeholders to establish a high level of rapport and confidence
• Work with clients, local teams and offshore resources to deliver modern data products
• Work effectively on client sites, in Capgemini offices and from home
• Use the GCP data-focused reference architecture
• Design and build data service APIs
• Analyse current business practices, processes and procedures and identify future opportunities for leveraging GCP services
• Design solutions and support the planning and implementation of data platform services, including sizing, configuration and needs assessment
• Implement effective metrics and monitoring processes

Skills Needed
• Minimum 3-4 years of experience with Google data products (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.)
• Google Cloud Platform
• Java, Scala, Python, Spark, SQL
• Experience of developing enterprise-grade ETL/ELT data pipelines
• Deep understanding of data manipulation/wrangling techniques
• Demonstrable knowledge of applying data engineering best practices (applying coding practices to DS, unit testing, version control, code review)
• Big data ecosystems: Cloudera/Hortonworks, AWS EMR, GCP Dataproc or GCP Cloud Data Fusion
• NoSQL databases: DynamoDB/Neo4j/Elastic, Google Cloud Datastore
• Snowflake data warehouse/platform
• Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming
• Experience of working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible, etc.
• Experience and knowledge of application containerisation: Docker, Kubernetes, etc.
• Experience building and deploying solutions to the cloud (AWS, Google Cloud), including cloud provisioning tools
• Strong interpersonal skills, with the ability to work with clients to establish requirements in non-technical language
• Ability to translate business requirements into plausible technical solutions for articulation to other development staff
• Good understanding of Lambda architecture patterns
• Good understanding of data governance, including Master Data Management (MDM) and Data Quality tools and processes
• Influencing and supporting project delivery through involvement in project/sprint planning and QA
• Experience with Agile methodology
• Experience with collaboration tools such as JIRA, Kanban boards, Confluence, etc.

Nice to Haves:
• Knowledge of other cloud platforms
• AWS (e.g. Athena, Redshift, Glue, EMR)
• Relevant certifications
• Python
• Snowflake
• Databricks

What we'll offer you
Professional development. Accelerated career progression. An environment that encourages entrepreneurial spirit. It's all on offer at Capgemini, and although collaboration is at the core of the way we work, we also recognise individual needs with a flexible benefits package you can tailor to suit you.

Why we're different
At Capgemini, we help organisations across the world become more agile, more competitive and more successful. Smart, tailored, often-groundbreaking technical solutions to complex problems are the norm. But so, too, is a culture that's as collaborative as it is forward-thinking. Working closely with each other, and with our clients, we get under the skin of businesses and to the heart of their goals. You will too. Capgemini is proud to represent nearly 130 nationalities and its cultural diversity. Our holistic definition of diversity extends beyond gender, gender identity, sexual orientation, disability, ethnicity, race, age and religion. Capgemini views diversity as everything that makes us who we are as an organisation, including our social background, our experiences in life and work, our communication styles and even our personality. These dimensions contribute to the type of diversity we value the most: diversity of thought.
About Capgemini
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organisation of 270,000 team members in nearly 50 countries. With its strong 50-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fuelled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms. The Group reported 2020 global revenues of €16 billion. Discover more about what Capgemini can offer you. Visit: and
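The skills list above asks for a good understanding of Lambda architecture patterns: a batch layer periodically recomputes complete views over all historical data, a speed layer covers events that have arrived since the last batch run, and a serving layer merges the two at query time. A minimal sketch with invented in-memory event data (the page names and counts are illustrative only):

```python
from collections import Counter

# Batch layer: complete, periodic recomputation over all historical events.
historical_events = ["page_a", "page_b", "page_a", "page_c"]
batch_view = Counter(historical_events)

# Speed layer: incremental counts for events since the last batch run.
recent_events = ["page_a", "page_c", "page_c"]
realtime_view = Counter(recent_events)

def query(page):
    """Serving layer: merge the batch and real-time views at query time."""
    return batch_view[page] + realtime_view[page]

print(query("page_a"))  # → 3  (2 from batch + 1 from the speed layer)
print(query("page_c"))  # → 3  (1 from batch + 2 from the speed layer)
```

In the stacks this role names, the batch layer would typically be a Spark or BigQuery job, the speed layer Kafka or Pub/Sub feeding Spark Streaming, and the merge would happen in the serving store.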
Sep 23, 2022
Full time
Location Whilst you may have any of our UK offices as a base location, you must be fully flexible in terms of assignment location, as these roles may involve periods of time away from home during the week at short notice. Capgemini requires our employees to be geographically mobile and to be able to travel to customer site to perform our jobs. Who you'll be working with The Cloud Data Platforms team is part of the Insights and Data Global Practice and has seen strong growth and continued success across a variety of projects and sectors. Cloud Data Platforms is the home of the Data Engineers, Platform Engineers, Solutions Architects and Business Analysts who are focused on driving our customers digital and data transformation journey using the modern cloud platforms. We specialise on using the latest frameworks, reference architectures and technologies using AWS, Azure and GCP. We continue to grow and are looking for talented individuals who want to join our high performing team. If you would like to develop your career as part of a team of highly skilled professionals who are passionate about increasing the value of the data and analytics in organisations you have come to the right place. The focus of your role We are looking for strong GCP Data Engineers who are passionate about Cloud technology and who ideally have skills in many of the following areas: • Build and deliver GCP data engineering solutions as part of a larger project • Use Google Data Products tools (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.) to build solutions for our customers • Experience in Spark (Scala/Python/Java) and Kafka. • Experience in MDM, Metadata Management, Data Quality and Data Lineage tools. • E2E Data Engineering and Lifecycle (including non-functional requirements and operations) management. • E2E Solution Design skills - Prototyping, Usability testing and data visualization literacy. 
• Experience with SQL and NoSQL modern data stores. • Build relationships with client stakeholders to establish a high-level of rapport and confidence • Work with clients, local teams and offshore resources to deliver modern data products • Work effectively on client sites, Capgemini offices and from home • Use GCP Data focused Reference Architecture • Design and build data service APIs • Analyze current business practices, processes and procedures and identify future opportunities for leveraging GCP services • Design solutions and support the planning and implementation of data platform services including sizing, configuration, and needs assessment • Implement effective metrics and monitoring processes Skills Needed • Minimum 3-4 years of experience with Google Data Products tools (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.) • Google Cloud Platform • Java, Scala, Python, Spark, SQL • Experience of developing enterprise grade ETL/ELT data pipelines. • Deep understanding of data manipulation/wrangling techniques • Demonstrable knowledge of applying Data Engineering best practices (coding practices to DS, unit testing, version control, code review). • Big Data Eco-Systems, Cloudera/Hortonworks, AWS EMR, GCP DataProc or GCP Cloud Data Fusion. • NoSQL Databases. Dynamo DB/Neo4j/Elastic, Google Cloud Datastore. • Snowflake Data Warehouse/Platform • Streaming technologies and processing engines, Kinesis, Kafka, Pub/Sub and Spark Streaming. • Experience of working CI/CD technologies, Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible etc • Experience and knowledge of application Containerisation, Docker, Kubernetes etc • Experience building and deploying solutions to Cloud (AWS, Google Cloud) including Cloud provisioning tools • Strong interpersonal skills with the ability to work with clients to establish requirements in non-technical language. 
• Ability to translate business requirements into plausible technical solutions for articulation to other development staff. • Good understanding of Lambda architecture patterns • Good understanding of Data Governance, including Master Data Management (MDM) and Data Quality tools and processes • Influencing and supporting project delivery through involvement in project/sprint planning and QA • Experience with Agile methodology • Experience of collaboration tools such as JIRA, Kanban boards, Confluence etc. Nice to Haves: • Knowledge of other cloud platforms • AWS (e.g. Athena, Redshift, Glue, EMR) • Relevant certifications • Python • Snowflake • Databricks What we'll offer you Professional development. Accelerated career progression. An environment that encourages entrepreneurial spirit. It's all on offer at Capgemini and although collaboration is at the core of the way we work, we also recognise individual needs with a flexible benefits package you can tailor to suit you. Why we're different At Capgemini, we help organisations across the world become more agile, more competitive and more successful. Smart, tailored, often-groundbreaking technical solutions to complex problems are the norm. But so, too, is a culture that's as collaborative as it is forward-thinking. Working closely with each other, and with our clients, we get under the skin of businesses and to the heart of their goals. You will too. Capgemini is proud to represent nearly 130 nationalities and their cultural diversity. Our holistic definition of diversity extends beyond gender, gender identity, sexual orientation, disability, ethnicity, race, age and religion. Capgemini views diversity as everything that makes us who we are as an organisation, including our social background, our experiences in life and work, our communication styles and even our personality. These dimensions contribute to the type of diversity we value the most: diversity of thought. 
About Capgemini Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organisation of 270,000 team members in nearly 50 countries. With its strong 50-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fuelled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms. The Group reported 2020 global revenues of €16 billion. Discover more about what Capgemini can offer you.
Job Introduction BBC R&D has recently established an Automation Applied Research Area focussed on the use of Machine Learning across the BBC. Automation works closely with other BBC R&D Applied Research Areas, BBC Product and Technology Groups and senior business stakeholders across the BBC to accelerate Machine Learning based innovation. Reporting to the Head of Automation, this role will lead a team of experts exploring the ML platforms, tools, performance, and sustainability that will underpin the BBC's approach to Machine Learning innovation. It will ensure that best practice and correct technology choices are downstreamed into R&D ML applications as well as supporting the wider BBC in making the right strategic decisions for its future ML technology. BBC R&D has five applied research areas focussed on Audiences, Automation, Distribution, Infrastructure and Production, which are looking to solve some of the most interesting challenges in Media and Broadcasting; as well as our Commercial, Partnerships & Engagement team who ensure we're collaborating with the right external partners and optimising commercial returns through the exploitation of our Intellectual Property and grant funding. Our work supports the BBC's current ambition as well as informing future strategy. If you're excited by the prospect of working in an innovative environment with smart and supportive colleagues, then BBC R&D is the place for you. Role Responsibility This is a hands-on role. Your key responsibilities will be: Build and lead a team of ML engineers to develop an infrastructure to manage the ML lifecycle through experimentation, deployment, and testing. Own the Automation MLOps strategy, roadmap, and backlog. Provide leadership and guidance on the delivery of ML models from prototypes to production, mentor and coach team members on ML engineering best practices; work alongside researchers to enable the BBC to benefit more rapidly from fundamental ML research. 
Contribute to the design of ML systems and infrastructure to shape how ML is used across the BBC. Develop relationships with pan-BBC and external contributors and stakeholders. You will need to bring to life long-term ambitions to secure required support and buy-in for tangible and intangible benefits and outcomes. Focus on ensuring our ML technology delivers on performance, cost and sustainability goals and is supportive of the BBC's responsible and ethical ML objectives. Work with our Technology Strategy and Governance team to identify and communicate strategic investment decisions required to mature the BBC's ML technology in line with business needs. Are you the right candidate? Solid understanding of machine learning concepts and algorithms Experience deploying machine learning solutions Expert knowledge of Python programming and machine learning libraries (Scikit-learn, TensorFlow, Keras, PyTorch, MXNet, etc.) Experience implementing ML automation, MLOps (scalable deployment practices aimed at deploying and maintaining machine learning models in production reliably and efficiently) and related tools (e.g., MLflow, Kubeflow, Airflow, SageMaker) Experience working in accordance with DevOps principles, and with industry deployment best practices using CI/CD tools and infrastructure as code (e.g., Docker, Kubernetes, Terraform) Experience in at least one cloud platform (e.g., AWS, GCP, Azure) and associated machine learning services, e.g., Amazon SageMaker, Azure ML, Databricks. Package Description Band: E Contract type: Permanent - Full time Location: UK wide We're happy to discuss flexible working. Please indicate your choice under the flexible working question in the application. There is no obligation to raise this at the application stage but if you wish to do so, you are welcome to. Flexible working will be part of the discussion at offer stage. 
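The MLOps practice the posting describes (deploying and maintaining models in production reliably and efficiently) often comes down to automated promotion gates. The sketch below is a minimal, hypothetical example of such a gate; the class, function names, metrics and thresholds are all invented for illustration and do not reflect the BBC's actual tooling, which the posting says uses tools such as MLflow or Kubeflow.

```python
# Illustrative only: a hypothetical "promotion gate" of the kind an MLOps
# pipeline automates. All names and thresholds here are invented.
from dataclasses import dataclass

@dataclass
class CandidateModel:
    name: str
    version: int
    accuracy: float      # offline evaluation metric
    latency_ms: float    # p95 inference latency from a load test

def should_promote(candidate: CandidateModel,
                   production: CandidateModel,
                   max_latency_ms: float = 100.0) -> bool:
    """Promote only if the candidate beats production on accuracy
    without exceeding the latency budget."""
    return (candidate.accuracy > production.accuracy
            and candidate.latency_ms <= max_latency_ms)

prod = CandidateModel("tagger", 3, accuracy=0.91, latency_ms=80.0)
new = CandidateModel("tagger", 4, accuracy=0.93, latency_ms=85.0)
print(should_promote(new, prod))  # True
```

In a real pipeline a check like this would run automatically in CI/CD after training, with the metrics logged by a tracking tool such as MLflow rather than hard-coded.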
Excellent career progression - the BBC offers great opportunities for employees to seek new challenges and work in different areas of the organisation. Unrivalled training and development opportunities - our in-house Academy hosts a wide range of internal and external courses and certification. Benefits - We offer a competitive salary package, a flexible 35-hour working week for work-life balance and 26 days annual leave (1 of which is a corporation day) with the option to buy an extra 5 days, a defined pension scheme and discounted dental, health care, gym and much more. The situation regarding the coronavirus outbreak is developing quickly and the BBC is keen to continue to ensure the safety and wellbeing of people across the BBC, while continuing to protect our services. To reduce the risk, access to BBC buildings is limited to those essential to our broadcast output. From Wednesday 18th March until further notice all assessments and interviews will be conducted remotely. About the BBC We don't focus simply on what we do - we also care how we do it. Our values and the way we behave are important to us. Please make sure you've read about our values and behaviours in the document attached below. Diversity matters at the BBC. We have a working environment where we value and respect every individual's unique contribution, enabling all of our employees to thrive and achieve their full potential. 
We want to attract the broadest range of talented people to be part of the BBC - whether that's to contribute to our programming or our wide range of non-production roles. The more diverse our workforce, the better able we are to respond to and reflect our audiences in all their diversity. We are committed to equality of opportunity and welcome applications from individuals, regardless of age, gender, ethnicity, disability, sexual orientation, gender identity, socio-economic background, religion and/or belief. We will consider flexible working requests for all roles, unless operational requirements prevent otherwise. To find out more about Diversity and Inclusion at the BBC, please click here
Sep 23, 2022
Full time
Data Engineer London, Edinburgh or Manchester - Remote Initially What is a Data Engineer? Dufrain consider a data engineer to be a multi-skilled individual with experience in delivering solutions across the technical data landscape. Our data engineers have expertise in technical delivery, technologies and concepts in areas such as Data Storage, Data Ingestion, Data Integration, Data Warehousing, Data Preparation and Cloud Infrastructure. A suitable candidate will also have a wider understanding of how their role and delivery contributes to wider business outcomes and be able to concisely articulate to stakeholders and interested parties their role and solutions in a way that can be easily understood. Essential Requirements: • 2+ years of experience in a data engineer role • Expert SQL skills • Experience working with the Hadoop ecosystem (Spark, Hive/Impala, Databricks) • Strong programming background in Java, Scala, R or Python • Experience in designing, developing and managing data pipelines to process large amounts of data • Experience of cloud-based big data offerings such as Amazon EMR, Azure HDInsight or Google Dataproc • Experience of cloud platforms such as Azure, AWS or GCP • Experience working with an ETL tool e.g. Azure Data Factory, SSIS or Talend • Understanding of data modelling techniques e.g. Kimball • Experience working with data streams e.g. Kafka • Experience working with large, structured and unstructured datasets • Experience with database technologies (relational, NoSQL, graph) • Expertise with version control e.g. git • Flexibility to travel to and work on client sites within the UK and occasionally Europe. • Excellent track record in executive stakeholder and sponsor management and maintaining valuable relationships. 
Key Skills: • Takes ownership and accountability for mission-critical initiatives and deliverables both internally and for clients • A proven leader of people, having significant influence on the careers and aspirations of those working in proximity • Awareness of current market trends in data, with the ability to influence opinion and decision-making across the Data Management spectrum #LifeAtDufrain #GetBusyLiving We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, colour, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Feb 04, 2022
Full time
Microsoft Azure Data Engineer £50,000 - £90,000 DOE + Benefits Flexible Location, UK Wide, Permanent. Office locations to choose from: London, Manchester, Birmingham, Bristol, Glasgow, Wales, Liverpool or Sheffield (remote for now). As a trusted and preferred recruitment partner to this prestigious global consultancy, we have been asked to assist in the hire of a permanent Microsoft Azure Data Engineer. We are looking for strong Azure Data Engineers who are passionate about Microsoft technology and who ideally have skills in many of the following areas: *Azure DevOps - with skills to design, build, deploy and integrate with other components to provide: -Repository -CI/CD pipelines -Build -Release -Environment Config -Agent Management *Azure Data Factory - Full working knowledge including DevOps integration -Runtime management -Linked service config -Pipelines -Triggers -Logic Apps -API Integration -Data flows/Databricks *Azure Databricks - Full working knowledge including DevOps integration, ADF orchestration with languages -R -Python -Scala (optional) *Building Azure Templates and Automation for re-use *Azure Synapse Analytics/Azure SQL Database *Azure Stream Analytics *Event Hub Security Clearance: Whilst not required, it would be beneficial for candidates to be eligible for SC Clearance. Deerfoot IT Resources Ltd is one of the UK's leading IT recruitment agencies, trusted by many of the UK's leading employers. Established in 1997, we have over twenty years of experience as IT recruitment specialists. We will never send your CV anywhere without your authorisation and only after you have seen the complete details of this opportunity. Deerfoot is acting as an employment agency in relation to this vacancy. Each time Deerfoot sends a CV to a recruiting client we donate £1 to The Born Free Foundation.
Nov 18, 2021
Full time
Team Overview Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights to inform policy decisions. In addition to existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. It is expected that this new area of work will expand the scope of data-driven activities at Ofcom in terms of the variety and volume of data, as well as the range of analysis we do. To ensure that Ofcom has the appropriate data capabilities to undertake the analysis effectively, ICT has created a new role for a Data Solutions Architect. Purpose of the Role The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operations of data solutions that empower data professionals to efficiently and effectively deliver their work. Candidates will exhibit critical thinking skills, the ability to synthesise complex problems, and relevant skills and experience for enabling the transformation of data to create solutions that add value to a myriad of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics, and reporting. You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures and their trade-offs in cost, performance and scalability, and ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end-to-end solution will be fit for purpose - i.e. meet the needs of the business and the agreed requirements, and support the strategic direction of Ofcom. 
Work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. By maintaining your knowledge of emerging trends in data usage, tools and analysis techniques you will support our on-going development activities and continually promote data innovation as a means to achieve the business outcomes for specific Groups, and Ofcom. You will need to be self-motivated, an effective communicator and have a collaborative delivery approach. You will work in a collaborative cross-functional environment and interact with the full spectrum of colleagues (data engineers, data analysts, data scientists, operational support and policy makers), and you will need to inform and influence senior managers. Requirements of the Role Build strong relationships with colleagues across the business, understanding their motivations behind projects, and own technical activities to translate business requirements (both functional and non-functional) into a solution, ensuring the required business value is delivered. Foster a customer-centric approach to ensure delivery of business value and an iterative approach that responds to feedback and changing needs. Perform deep dives into technical areas to solve specific solution or design challenges, using trials or PoCs to prove or discount an approach and to critique your own design. You will be responsible for ensuring that the solutions you help deliver form an integral part of the ICT estate and align with the wider reference architecture and domain roadmaps. Manage stakeholder expectations and be flexible, working on many different projects and topics at the same time. Manage proactive and reactive communication. Facilitate difficult discussions within the team or with diverse senior stakeholders and external / 3rd parties as necessary. Provide documentation of solutions detailing the business, data, application and technology layers. 
Work with Data Engineers to define data pipelines and data lakes, covering the ingestion, ETL or ELT, and the cataloguing of data. Take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation. Develop analytics policy, standards and guidelines. Ensure successful transition of solutions into production, ensuring production support have the necessary knowledge and documentation to support the service. Skills, knowledge and experience Robust Data and Technical/Solutions Architecture skills - sets direction for and possesses a deep understanding of architecture and strategies which integrate with industry trends e.g. TOGAF Hands-on experience with analytical tools and languages such as: Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI, Git etc. Experience of infrastructure: cloud-based technologies used for storage, data lakes, data warehouses, data streaming, databases, ETL / ELT / transformation. Experience of DevOps/DataOps methods in the development of data solutions to ensure pipelines and processes can be automated and controlled. Experience with cloud-based data and analytical initiatives utilising Azure, AWS, Google Cloud or similar cloud services. Experience of working closely with Data Professionals (e.g. Data Scientists and Data Analysts) to understand their needs. Experience of implementing statistical, Artificial Intelligence, Machine Learning and Deep Learning applications. Experience with integrations (e.g. 
via APIs) with external vendors to share data between organisations. Experience of working with external technology suppliers and service providers to deliver business solutions. SFIA Skills: Enterprise and business architecture (STPL) - Level 5; Solution architecture (ARCH) - Level 5; Requirements definition and management (REQM) - Level 5; Database design (DBDS) - Level 5; Analytics (INAN) - Level 4; Emerging Technology Monitoring (EMRG) - Level 4; Relationship Management (RLMT) - Level 5
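The pipeline responsibilities described above (ingestion, ETL or ELT, and the cataloguing of data) can be sketched generically. The skeleton below is an illustrative pure-Python example, not Ofcom's actual platform; the function, dataset and field names are invented, and a production pipeline would be built on tools such as Azure Data Factory or Databricks from the skills list.

```python
# A minimal, generic extract -> transform -> load sketch with a toy catalogue
# entry. All names here are invented for illustration.
import csv
import io
from datetime import date

def extract(raw_csv: str) -> list[dict]:
    """Ingest raw CSV text into a list of records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Clean and type the raw records."""
    return [{"provider": r["provider"].strip().title(),
             "complaints": int(r["complaints"])} for r in records]

def load(records: list[dict], store: dict, dataset: str) -> dict:
    """Write to a (toy) data store and return a catalogue entry."""
    store[dataset] = records
    return {"dataset": dataset, "rows": len(records),
            "loaded_on": date.today().isoformat()}

raw = "provider,complaints\n acme telecom ,12\nbolt media,7\n"
store: dict = {}
entry = load(transform(extract(raw)), store, "complaints_q1")
print(entry["rows"], store["complaints_q1"][0]["provider"])  # 2 Acme Telecom
```

Keeping the three stages as separate functions is what makes a pipeline testable and automatable under the DevOps/DataOps methods the role calls for.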
Nov 04, 2021
Full time
Team Overview Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights to inform policy decisions. In addition to existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. It is expected that this new area of work will expand the scope of data-driven activities at Ofcom in terms of the variety and volume of data, as well as the range of analysis we do. To ensure that Ofcom has the appropriate data capabilities to undertake the analysis effectively, ICT has created a new role for an Data Solutions Architect Purpose of the Role The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operations of data solutions that empower data professionals to efficiently and effectively deliver their work. Candidates will exhibit critical thinking skills, the ability to synthesize complex problems, and have relevant skills and experience for enabling the transformation of data to create solutions that add value to a myriad of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics, and reporting. You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures, their trade-offs in cost, performance and scalability. And ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end to end solution will be fit for purpose - i.e. meet the needs of business, the agreed requirements, and support the strategic direction of Ofcom. 
Work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. By maintaining your knowledge of emerging trends in data usage, tools and analysis techniques, you will support our ongoing development activities and continually promote data innovation as a means of achieving business outcomes for specific Groups and for Ofcom as a whole. You will need to be self-motivated, an effective communicator and collaborative in your delivery approach. You will work in a collaborative cross-functional environment, interacting with the full spectrum of colleagues (data engineers, data analysts, data scientists, operational support and policy makers), and you will need to inform and influence senior managers.

Requirements of the Role

- Build strong relationships with colleagues across the business, understanding the motivations behind their projects, and own the technical activities that translate business requirements (both functional and non-functional) into a solution, ensuring the required business value is delivered.
- Foster a customer-centric approach to ensure delivery of business value, and an iterative approach that responds to feedback and changing needs.
- Perform deep dives into technical areas to solve specific solution or design challenges, using trials or POCs to prove or discount an approach and to critique your own designs.
- Ensure that the solutions you help deliver form an integral part of the ICT estate and align with the wider reference architecture and domain roadmaps.
- Manage stakeholder expectations and be flexible, working on many different projects and topics at the same time. Manage proactive and reactive communication, and facilitate difficult discussions within the team or with diverse senior stakeholders and external/third parties as necessary.
- Provide documentation of solutions detailing the business, data, application and technology layers.
- Work with Data Engineers to define data pipelines and data lakes, covering the ingestion, ETL or ELT, and cataloguing of data.
- Take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation. Develop analytics policy, standards and guidelines.
- Ensure successful transitions of solutions into production, so that production support has the necessary knowledge and documentation to support the service.

Skills, knowledge and experience

- Robust data and technical/solutions architecture skills - sets direction for, and possesses a deep understanding of, architectures and strategies that integrate with industry trends (e.g. TOGAF).
- Hands-on experience with analytical tools and languages such as Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI and Git.
- Experience of infrastructure: cloud-based technologies used for storage, data lakes, data warehouses, data streaming, databases and ETL/ELT/transformation.
- Experience of DevOps/DataOps methods in the development of data solutions, ensuring pipelines and processes can be automated and controlled.
- Experience with cloud-based data and analytical initiatives utilising Azure, AWS, Google Cloud or similar cloud services.
- Experience of working closely with data professionals (e.g. Data Scientists and Data Analysts) to understand their needs.
- Experience of implementing statistical, Artificial Intelligence, Machine Learning and Deep Learning applications.
- Experience with integrations (e.g. via APIs) with external vendors to share data between organizations.
- Experience of working with external technology suppliers and service providers to deliver business solutions.

SFIA Skills

- Enterprise and business architecture (STPL) - Level 5
- Solution architecture (ARCH) - Level 5
- Requirements definition and management (REQM) - Level 5
- Database design (DBDS) - Level 5
- Analytics (INAN) - Level 4
- Emerging Technology Monitoring (EMRG) - Level 4
- Relationship Management (RLMT) - Level 5