Azure Data Architect

  • Tech Providers Inc.
  • Worcester, Massachusetts
  • 04/14/2026
Full time · Information Technology · Telecommunications · SQL · Python · Data Analyst

Job Description

Target client bill rate: /hr. Target conversion salary: K + 10% IC bonus. For a senior-level candidate (8+ years), if identified, target conversion salary: K + 15% IC bonus.

• Local candidates are highly preferred; onsite at least 1 day/week in Worcester, MA. Target local hybrid candidates first.
• Candidates must be a US Citizen / Green Card holder; no sponsorship for employment.
• Focus on P&C insurance industry experience with hands-on development (SQL + Python).
• Strong experience in data architecture, data modeling, and database design across transactional and analytical systems.

Job Title: Azure Data Architect (Hybrid Role)

Position Overview/Summary:

The Azure Data Architect plays a critical role in defining, governing, and evolving the enterprise data architecture that enables trusted, scalable, and high-performing data solutions across the organization. Operating within both cloud-native and legacy environments, this role provides advanced technical leadership in data architecture, data modeling, and database management to support analytics, operational systems, and strategic initiatives. The Azure Data Architect ensures that data assets are well designed, secure, interoperable, and aligned with business and technology strategy.

Working closely with business stakeholders, solution architects, data engineering teams, and platform owners, this role translates complex business requirements into technical components, evaluates and recommends architectural patterns, and guides the adoption of modern data technologies and standards. This role is responsible for researching and experimenting with emerging data platforms, defining data integration and governance frameworks, ensuring data quality and consistency, and supporting both legacy systems and modern cloud ecosystems.
Success requires strong end-to-end architectural thinking, the ability to analyze and design scalable solutions under evolving constraints, influence across cross-functional teams, and the ability to clearly communicate complex architectural concepts to both technical and non-technical audiences.

Responsibilities/Essential Functions:

• Accountable for data architecture delivery at the project and domain level; consults and collaborates with business stakeholders, solution architects, and engineering teams.
• Gather, analyze, and validate business and technical data requirements; translate requirements into conceptual, logical, and physical data models with ownership and authority.
• Design and maintain enterprise data models, database schemas, and message models supporting transactional systems, ODS, data warehouses, and analytics platforms.
• Ensure data architecture and designs conform to enterprise reference architecture, data standards, naming conventions, and compliance rules.
• Define and maintain data integration architectures, patterns, and roadmaps aligned to the enterprise data and integration strategy.
• Provide technical leadership for ETL and data integration design and development across Azure Data Factory, Synapse, and Informatica.
• Author and maintain source-to-target mappings, transformation logic, business rules, and data integration documentation.
• Partner with business and product owners to define KPIs, metric logic, and shared metric catalogs; ensure consistent interpretation and use across analytics and reporting.
• Create and maintain curated analytics datasets and semantic models (SQL views, tables, BI models) to support dashboards and self-service reporting.
• Perform data profiling, reconciliation, and validation using SQL and Python to ensure data accuracy, completeness, and consistency across systems.
• Define data validation rules and test scenarios; support QA, UAT, and production readiness activities.
• Participate in data governance processes, including metadata management, lineage documentation, certified datasets, and reduction of redundant data assets.
• Develop and maintain processes for capturing and managing metadata across operational, integration, and analytical platforms.
• Collaborate with application, infrastructure, and integration teams to ensure end-to-end information flow across systems and platforms.
• Lead or participate in architecture design sessions and architecture reviews; identify risks and recommend mitigation strategies.
• Provide technical coaching and mentoring to data engineers, data analysts, and junior architects; promote modeling and integration best practices.
• Support project planning, estimation, and delivery activities; contribute to technical designs and implementation approaches.
• Proactively research and apply modern data architecture patterns, cloud best practices, and emerging technologies to improve scalability and performance.

Key Measures of Success:

At an intermediate level, delivers the following items:
• Erwin data models (conceptual, logical, and physical)
• Technical approach and design documentation
• Data context and flow diagrams
• Data engineering pipelines
• Data mapping logic, transformation rules, and business rules
• Data migrations and integrations

REQUIRED SKILLS, EXPERIENCE & KNOWLEDGE

• Property and Casualty insurance industry experience.
• Strong experience in data architecture, data modeling, and database design across transactional and analytical systems.
• Hands-on experience with Microsoft Azure data services (ADF, Synapse, ADLS, Azure SQL).
• Advanced SQL skills; working knowledge of Python for data validation, automation, and data processing.
• Experience designing and supporting ETL/data integration architectures, including legacy platforms such as Informatica.
• Understanding of enterprise data governance, metadata, lineage, and data quality concepts.
• Ability to translate complex business requirements into scalable, governed data solutions.
• Strong written and verbal communication skills; ability to communicate architectural concepts to both technical and business audiences.

NICE TO HAVE

• Experience with Power BI semantic models, Microsoft Fabric, or Purview / metadata tools.
• Familiarity with Delta Lake, Parquet, Spark/PySpark, and lakehouse architectures.
• CI/CD experience using Azure DevOps.
• Knowledge of event streaming, API integrations, or data quality automation frameworks.

Experience:

• Degree in business management, computer science, computer engineering, electrical engineering, systems analysis, or a related field of study.
• 6-8 years of overall information systems, services/consulting experience, with the most recent 5+ years in data architecture and related data fields.
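For candidates gauging the hands-on SQL + Python expectation, the responsibilities above mention data reconciliation and validation across systems. A minimal sketch of that kind of check follows; the table names (`policies_src`, `policies_tgt`) and the in-memory SQLite database are hypothetical stand-ins, since real work would run against the actual source and target platforms (e.g. Azure SQL):

```python
import sqlite3

def reconcile_row_counts(conn, source_table, target_table):
    """Compare row counts between a source and target table and
    report whether they match (a basic migration validation check)."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

if __name__ == "__main__":
    # Hypothetical demo data in an in-memory database.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE policies_src (policy_id INTEGER);
        CREATE TABLE policies_tgt (policy_id INTEGER);
        INSERT INTO policies_src VALUES (1), (2), (3);
        INSERT INTO policies_tgt VALUES (1), (2), (3);
    """)
    print(reconcile_row_counts(conn, "policies_src", "policies_tgt"))
```

In practice such checks extend well beyond row counts (checksums, key-level diffs, domain-value profiling), but this is the shape of validation the role's SQL/Python requirement refers to.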