  • Home
  • Find IT Jobs
  • Register CV
  • Register as Employer
  • Contact us
  • Career Advice
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

3657 jobs found

Current Search: data engineer
Qualys Security Engineer- Active Secret Clearance Required
VETS, Inc Washington, Washington DC
Staffing Pros, a division of VETS Inc., is recruiting for a full-time Qualys Security Engineer onsite in Washington, DC or Beltsville, MD. This position requires an active Secret clearance. The Senior Qualys Security Engineer will support our customer's enterprise vulnerability management initiatives. This role involves maintaining and optimizing Qualys toolsets, performing vulnerability assessments, and working collaboratively across technical teams to strengthen organizational cybersecurity posture. This position is based on-site at either the Washington, DC or Beltsville, MD office, with occasional travel between the two locations.
What you'll do:
  • Oversee day-to-day management of the Qualys platform, including agents, scanners, and connectors
  • Optimize scan configurations, authentication methods, and template deployments
  • Review and interpret scan results to generate actionable intelligence for technical and non-technical audiences
  • Partner with infrastructure, development, and SOC teams to validate findings and drive remediation efforts
  • Automate tasks using Qualys APIs and custom scripts to support reporting and data integration
  • Maintain an up-to-date asset inventory through discovery and classification workflows
  • Minimize false positives through tuning and validation
  • Conduct policy compliance assessments in support of regulatory frameworks
  • Provide guidance and mentorship to junior analysts in vulnerability management best practices
Required Qualifications:
  • 5+ years of hands-on expertise with Qualys
  • Must be able to commute to Beltsville, MD or Washington, DC for full-time onsite work
  • Secret clearance with the ability to obtain a Top Secret clearance is required
  • Proficiency in scripting (Python, PowerShell, or Bash)
  • Familiarity with network protocols, OS security (Windows/Linux), and web application vulnerabilities
  • Understanding of compliance standards and frameworks (e.g., NIST 800-53, CIS Controls, ISO 27001)
  • Qualys Vulnerability Management & Policy Compliance
  • Qualys Web Application Scanning
  • Automation using Qualys APIs
  • Network architecture and protocol knowledge
  • Database and OS-level security
  • Vulnerability lifecycle and remediation strategies
  • Excellent written and verbal communication
  • Strong problem-solving and analytical mindset
  • Ability to operate independently or as part of a multi-disciplinary team
  • Solid documentation and reporting practices
  • Experience engaging with cross-functional stakeholders
  • US citizenship is required
Preferred Qualifications:
  • Professional certifications: CISSP, CEH, GIAC, or equivalent
  • Exposure to other scanning tools (e.g., Tenable, Rapid7)
  • Familiarity with public cloud security models (AWS, Azure, GCP)
  • Experience with configuration management tools and CI/CD pipelines
  • Background in system administration, network engineering, or DevSecOps
EEO Statement: Staffing Pros, a division of VETS Inc., is an Equal Opportunity Employer/Protected Veterans/Individuals with Disabilities. The contractor will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.
02/11/2026
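The automation bullet above (Qualys APIs plus custom scripts) typically comes down to pulling detection data and filtering it for remediation. A minimal illustrative sketch in Python; the XML layout and tag names below are invented for the example and are not the real Qualys API schema:

```python
# Hypothetical sketch: parse a Qualys-style host detection XML export and
# surface high-severity findings. Tag names are illustrative assumptions,
# not the actual Qualys API response format.
import xml.etree.ElementTree as ET

SAMPLE_XML = """
<HOST_LIST>
  <HOST>
    <IP>10.0.0.5</IP>
    <DETECTION><QID>38167</QID><SEVERITY>5</SEVERITY></DETECTION>
    <DETECTION><QID>11827</QID><SEVERITY>2</SEVERITY></DETECTION>
  </HOST>
  <HOST>
    <IP>10.0.0.9</IP>
    <DETECTION><QID>90043</QID><SEVERITY>4</SEVERITY></DETECTION>
  </HOST>
</HOST_LIST>
"""

def high_severity_findings(xml_text, min_severity=4):
    """Return (ip, qid, severity) tuples at or above min_severity."""
    findings = []
    for host in ET.fromstring(xml_text).iter("HOST"):
        ip = host.findtext("IP")
        for det in host.iter("DETECTION"):
            sev = int(det.findtext("SEVERITY"))
            if sev >= min_severity:
                findings.append((ip, det.findtext("QID"), sev))
    return findings

print(high_severity_findings(SAMPLE_XML))
# → [('10.0.0.5', '38167', 5), ('10.0.0.9', '90043', 4)]
```

In a real pipeline the XML would come from an authenticated API call rather than a literal string, and the output would feed a ticketing or reporting integration.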
Mainframe developer
Enin Systems Penasco, New Mexico
Location: Onsite / Hybrid / Remote
Experience: 4-10+ Years
Employment Type: Full Time / Contract
Job Description: We are looking for an experienced Mainframe Developer to design, develop, maintain, and support enterprise mainframe applications. The ideal candidate should have strong hands-on experience in IBM Mainframe technologies and be comfortable working in a production support and development environment.
Key Responsibilities:
  • Design, develop, test, and maintain mainframe applications
  • Work on batch and online programs using COBOL and JCL
  • Analyze business requirements and convert them into technical solutions
  • Perform code enhancements, bug fixes, and performance tuning
  • Participate in production support, including issue analysis and resolution
  • Collaborate with business analysts, QA, and onsite/offshore teams
  • Prepare and maintain technical documentation
Required Skills:
  • Strong experience in COBOL (Batch & Online)
  • Hands-on experience with JCL
  • Experience with CICS and/or IMS
  • Strong knowledge of DB2 and SQL
  • Experience with VSAM datasets
  • Familiarity with Endevor / Changeman / ISPW
  • Experience in batch scheduling tools like Control-M / CA-7 / OPC
Nice to Have:
  • Experience in Banking, Financial, Insurance, or Healthcare domains
  • Exposure to Mainframe modernization or migration projects
  • Knowledge of REXX, Easytrieve, or Assembler (basic)
  • Experience working in Agile/Scrum environments
Education: Bachelor's degree in Computer Science, Engineering, or related field
02/11/2026
Golang Developer - Onsite
Panacea Direct Inc Plano, Texas
Interview: Virtual + Onsite
Key Responsibilities:
  • Design and develop Golang-based APIs and web services
  • Maintain REST API documentation using Swagger
  • Create and maintain technical/design documentation
  • Perform testing and validation before releases
  • Troubleshoot and debug applications
  • Work on the AWS cloud platform, including EKS
  • Maintain and enhance existing codebases
  • Perform peer code reviews
  • Collaborate with team members to meet project milestones
  • Communicate effectively with internal teams and suppliers
Required Skills & Qualifications:
  • 2-3+ years of Golang experience (mandatory)
  • Experience with Java or NodeJS (plus)
  • Experience with AWS/GCP/Azure cloud platforms
  • Experience with DevOps, cloud engineering, and SaaS environments
  • Experience building large-scale distributed systems
  • Strong problem-solving and analytical skills
  • Good understanding of algorithms, data structures, and complexity analysis
  • Overall experience: 5-8 years
  • Golang experience: 3-4 years
02/11/2026
Generative AI Engineer / GenAI Developer
Enin Systems United, Pennsylvania
Experience: 3-10+ Years
Employment Type: Full Time / Contract
Job Description: We are seeking a skilled Generative AI Engineer to design, build, and deploy GenAI solutions using Large Language Models (LLMs). The ideal candidate will work on AI-powered applications such as chatbots, copilots, document intelligence, and automation tools, collaborating closely with data science, product, and engineering teams.
Key Responsibilities:
  • Design and develop GenAI applications using LLMs
  • Build and optimize prompt engineering and RAG (Retrieval Augmented Generation) pipelines
  • Integrate GenAI models into web and enterprise applications
  • Fine-tune and evaluate LLMs for performance and accuracy
  • Work with structured and unstructured data (PDFs, documents, APIs)
  • Implement AI safety, monitoring, and cost optimization strategies
  • Collaborate with cross-functional teams to deliver AI solutions to production
Required Skills:
  • Strong programming experience in Python (mandatory)
  • Hands-on experience with LLMs (OpenAI, Azure OpenAI, Anthropic, Gemini, LLaMA, etc.)
  • Experience with LangChain, LlamaIndex, or similar frameworks
  • Knowledge of Prompt Engineering and RAG architectures
  • Experience with Vector Databases (Pinecone, FAISS, Weaviate, Chroma, Milvus)
  • Familiarity with REST APIs and microservices
  • Experience deploying models on AWS / Azure / GCP
Nice to Have:
  • Fine-tuning using LoRA / PEFT
  • Experience with MLOps tools (MLflow, Kubeflow, CI/CD)
  • Knowledge of NLP, embeddings, transformers
  • Experience with Docker, Kubernetes
  • Exposure to AI governance, security, and compliance
Education: Bachelor's or Master's degree in Computer Science, AI, Data Science, or related field
02/11/2026
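The RAG pipelines this posting mentions reduce to three steps: embed a corpus, retrieve the documents most similar to a query, and feed those documents into the LLM prompt. A toy sketch of the retrieval step, substituting bag-of-words cosine similarity for a real embedding model and vector database; the corpus and query are invented:

```python
# Toy retrieval step of a RAG pipeline. Real systems would score with
# embeddings from a model and search a vector DB (Pinecone, FAISS, etc.);
# here a bag-of-words cosine stands in so the sketch is self-contained.
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts (stand-in for an embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, corpus, k=1):
    """Rank documents by similarity to the query; top-k feed the LLM prompt."""
    q = vectorize(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, vectorize(d)), reverse=True)
    return ranked[:k]

corpus = [
    "Invoices are processed within 30 days of receipt.",
    "Employees may work remotely two days per week.",
    "Password resets require multi-factor authentication.",
]
print(retrieve("when are invoices processed", corpus))
# → ['Invoices are processed within 30 days of receipt.']
```

The retrieved passages would then be concatenated into the prompt ("answer using the following context: …"), which is what "Retrieval Augmented" refers to.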
Power BI developer
Robotics technology LLC Iselin, New Jersey
Job Description:
Role: Power BI developer
Must Have: Power BI + SQL + DAX + Paginated Reports
We are seeking a highly skilled and motivated Business Intelligence Engineer to join our team. The ideal candidate will be responsible for designing, developing, and maintaining business intelligence solutions that drive data-driven decision-making across the organization. This role requires a deep understanding of data warehousing, data modeling, and ETL processes, as well as strong analytical and problem-solving skills, and experience with business intelligence tools, particularly Power BI, including creating complex dashboards and reports. The candidate will be responsible for the design and development of reports, dashboards, and ad-hoc queries. The position calls for someone who is comfortable working with business users and brings business analyst expertise. Excellent communication skills and the ability to interact effectively with users are essential.
Experience with the following technologies will be required:
  • Power BI Dashboards, Paginated Reports, DAX
  • Snowflake
  • PBRS (Power BI Report Scheduler)
  • Power On
  • SQL Server
  • Data Warehousing
  • Semantic Model Tooling (i.e., Cube Dev)
Key Responsibilities:
  • Design, develop, and maintain data warehouse and business intelligence solutions
  • Comprehension of ETL processes to ensure data is accurately and efficiently loaded into data warehouses
  • Develop and maintain dashboards, reports, and visualizations to support the business
  • Work with cross-functional teams to understand business requirements and translate them into technical solutions
  • Perform data analysis to identify trends, patterns, and insights that can drive business improvements
  • Ensure data quality and integrity by implementing data validation and cleansing processes
  • Optimize and tune SQL queries and database performance
  • Communicate complex technical concepts to non-technical stakeholders in a clear and concise manner
  • Engage with users to gather requirements, provide support, and ensure the successful adoption of business intelligence solutions
  • Stay up-to-date with the latest industry trends and technologies in business intelligence and data analytics
Qualifications:
  • Minimum 8+ years of combined experience in data warehousing/business intelligence/analytics and reporting systems
  • Minimum 8+ years of relational and multi-dimensional (OLAP) data modeling
  • Proficiency in SQL, including relational and dimensional database structures and query optimization, specifically with SQL Server and Snowflake
  • Strong knowledge of Fixed Income, Equity, and Derivative businesses
  • Strong ability to analyze user requirements, make recommendations, and implement solutions
  • Self-driven; able to troubleshoot and provide quick resolutions to issues
  • Full project management and development life cycle experience
  • Strong oral and written communication skills
  • Strong presentation and interpersonal skills
  • Ability to prioritize and execute in a high-pressure environment
  • University bachelor's degree (Computer Science, Information Systems, or Computer Engineering)
Equal Opportunity Employer: We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.
02/11/2026
Technical Architect
Enin Systems New York, New York
Experience: 10-15+ Years
Employment Type: Full Time / Contract
Job Description: We are looking for an experienced Technical Architect to design and lead the implementation of scalable, secure, and high-performance enterprise solutions. The ideal candidate will work closely with business stakeholders, development teams, and leadership to define technical strategy and ensure best practices across projects.
Key Responsibilities:
  • Define end-to-end system architecture and technical solutions
  • Translate business requirements into scalable technical designs
  • Lead architectural decisions for applications, integrations, and platforms
  • Review and guide development teams on coding and design standards
  • Ensure performance, security, scalability, and reliability of systems
  • Drive cloud adoption, modernization, and migration initiatives
  • Participate in technical governance and architecture reviews
  • Mentor developers and technical leads
Required Skills:
  • Strong hands-on experience in .NET / Java / Cloud technologies
  • Expertise in Microservices architecture and REST APIs
  • Strong knowledge of Design Patterns and SOLID principles
  • Experience with Cloud platforms (Azure / AWS / GCP)
  • Experience with CI/CD pipelines, DevOps, and Infrastructure as Code
  • Strong knowledge of Databases (SQL & NoSQL)
  • Experience with Security, Authentication, and Authorization
Nice to Have:
  • Azure/AWS Architecture certifications
  • Knowledge of Containerization (Docker, Kubernetes)
  • Experience in Data, AI/ML, or GenAI solutions
  • Experience working in Agile/Scrum environments
  • Domain experience in Banking, Healthcare, Insurance, or Government
Education: Bachelor's or Master's degree in Computer Science, Engineering, or related field
02/11/2026
Software Engineers for ERP Integrations - onsite
Panacea Direct Inc Irving, Texas
Software Engineers for ERP Integration
Responsibilities:
  • Design, build, and maintain integration services across ERP systems (SAP, Workday, Oracle, Infor, or similar)
  • Develop REST and SOAP APIs and ensure scalable, maintainable integration patterns
  • Implement backend services using C#, .NET Core, Node.js, Java, Python, or TypeScript depending on system needs
  • Build microservices and event-driven architectures leveraging Azure EventHub, Service Bus, or Kafka
  • Engineer cloud-based solutions using Azure Functions, Logic Apps, API Management, Key Vault, and App Services
  • Develop and optimize data integration pipelines, including ETL/ELT transformations and handling JSON, XML, and EDI formats
  • Work with relational database technologies (RDBMS, PL/SQL) and perform data modeling and performance tuning
  • Interpret architecture diagrams and sequence flows, and design loosely coupled integration layers
  • Follow strong Git branching strategies and engineering practices such as TDD and unit testing
  • Work in CI/CD pipelines using Azure DevOps or GitHub Actions
  • Monitor production systems and integrations using App Insights, Splunk, or similar tools
  • Partner with QE, Product, and cross-functional teams to deliver high-quality integration capabilities
Experience: 8+ Years
Location: Irving, TX (onsite 2 days/week)
Educational Qualifications: Engineering Degree (BE/ME/BTech/MTech/BSc/MSc). Technical certification in multiple technologies is desirable.
Mandatory Skills:
  • Minimum 8 years of software engineering experience
  • Experience with ERP integrations and external system connectivity
  • API development expertise with REST and SOAP
  • Backend engineering proficiency with C#, .NET Core, Node.js, Java, Python, or TypeScript
  • Experience with microservices and event-driven messaging frameworks such as EventHub, Service Bus, or Kafka
  • Hands-on experience with Azure cloud services and Infrastructure-as-Code (Pulumi or Terraform)
  • Strong understanding of ETL/ELT patterns, data transformation, and ERP-related data formats
  • Solid SQL skills, including schema design and performance tuning
  • Strong debugging abilities and comfort working in complex, interconnected environments
  • Strong communication skills and experience working with evolving requirements
Good-to-Have Skills:
  • Proactive and prevention-focused mindset
  • Strong analytical and problem-solving approach
  • Ability to collaborate across teams (engineering, product, business, data, QE)
  • Proven experience maintaining system stability and delivering high-quality solutions
  • Familiarity with AI-assisted development tools such as Copilot or ChatGPT
  • Awareness of semantic search, ML-driven enrichment, or Azure Cognitive Services is a plus
02/11/2026
Data Engineer
SpreadsheetBroccoli New York, New York
Data Engineer - New York, NY

SpreadsheetBroccoli is looking for a skilled Data Engineer to join our team in New York City. We build CSV and Excel processing tools with 500+ integrations for e-commerce, accounting, and business platforms.

Responsibilities:
- Design, build, and maintain scalable data pipelines and ETL processes
- Develop and optimize data architectures for CSV and Excel processing workflows
- Build and manage integrations with e-commerce, accounting, and business platforms
- Implement data validation, reconciliation, and migration solutions
- Collaborate with engineering and product teams to support data-driven features
- Monitor and troubleshoot data infrastructure for performance and reliability
- Write clean, well-documented, and testable code

Requirements:
- 3+ years of experience as a Data Engineer or in a similar role
- Strong proficiency in Python and SQL
- Experience with ETL/ELT frameworks and data pipeline orchestration tools (e.g., Apache Airflow, dbt)
- Familiarity with cloud platforms (AWS, GCP, or Azure)
- Experience working with CSV, Excel, and structured file formats at scale
- Knowledge of relational and non-relational databases
- Strong problem-solving and communication skills

Nice to Have:
- Experience with e-commerce or accounting platform APIs
- Familiarity with data quality and validation frameworks
- Experience with containerization (Docker, Kubernetes)
- Knowledge of streaming data technologies (Kafka, Spark Streaming)

Location: New York, NY 10003
Job Type: Full-Time

About SpreadsheetBroccoli: We provide online tools to convert, validate, import, connect, integrate, reconcile, migrate and process CSV and Excel files. No signup required. Learn more at (link removed).
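The data validation work this listing describes can be sketched in a few lines of stdlib Python. This is a minimal illustration, not SpreadsheetBroccoli's actual pipeline; the column names and schema are invented for the example.

```python
import csv
import io

# Hypothetical schema for an orders export; column names are illustrative only.
SCHEMA = {"order_id": int, "sku": str, "qty": int, "unit_price": float}

def validate_rows(fh):
    """Yield (line_number, parsed_row_or_None, error_or_None) per CSV row."""
    reader = csv.DictReader(fh)
    for lineno, row in enumerate(reader, start=2):  # header is line 1
        try:
            parsed = {col: cast(row[col]) for col, cast in SCHEMA.items()}
            yield lineno, parsed, None
        except (KeyError, ValueError) as exc:
            yield lineno, None, f"line {lineno}: {exc!r}"

raw = "order_id,sku,qty,unit_price\n1,A-100,2,9.99\n2,B-200,two,4.50\n"
results = list(validate_rows(io.StringIO(raw)))
good = [r for _, r, err in results if err is None]   # rows that pass the schema
bad = [err for _, r, err in results if err is not None]  # quarantined rows
```

Separating good rows from quarantined rows this way lets a reconciliation step report per-line errors instead of failing the whole file.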
02/11/2026
Data engineer with Python & GCP
Avance Consulting Services Hartford, Connecticut
Role: Google Cloud (GCP) Data Engineer with Python. Hartford, CT. Full Time. Required Qualifications: At least 4 years of Information Technology experience. Experience working with GCP data engineering technologies such as Dataflow/Airflow, Pub/Sub, Kafka, Dataproc/Hadoop, and BigQuery. ETL development experience with a strong SQL background, using languages and tools such as Python/R, Scala, Java, Hive, Spark, and Kafka. Strong knowledge of Python program development to build reusable frameworks and enhance existing frameworks. Good experience in end-to-end implementation of data warehouses and data marts. Strong knowledge and hands-on experience in Python and SQL. Knowledge of CI/CD pipelines using Terraform in Git. Preferred Qualifications: Good knowledge of Google BigQuery, using advanced SQL programming techniques to build BigQuery datasets in the ingestion and transformation layers. Experience in relational modeling, dimensional modeling, and modeling of unstructured data. Knowledge of Airflow DAG creation, execution, and monitoring. Good understanding of Agile software development frameworks. Ability to work in teams in a diverse, multi-stakeholder environment comprising Business and Technology teams. Experience and desire to work in a global delivery environment.
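The "reusable framework" idea the listing emphasizes boils down to separating extract, transform, and load steps so the same skeleton can drive many pipelines. Below is a minimal sketch using sqlite3 as a stand-in warehouse; a real implementation would load into BigQuery via the google-cloud-bigquery client, and the table and column names here are invented.

```python
import sqlite3

# Minimal reusable extract-transform-load skeleton (illustrative names only).
def run_pipeline(conn, extract, transform):
    rows = [transform(r) for r in extract()]
    conn.executemany("INSERT INTO fact_sales(region, amount) VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales(region TEXT, amount REAL)")

loaded = run_pipeline(
    conn,
    extract=lambda: [("east", "10.5"), ("west", "7.25")],  # stand-in source
    transform=lambda r: (r[0].upper(), float(r[1])),       # cleanse/cast step
)
total = conn.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
```

Because extract and transform are injected as callables, new sources reuse the same load and error-handling code, which is the point of a framework over one-off scripts.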
02/11/2026
Entry Level Python Developer
ConsultAdd New York, New York
Role: Python Developer. Definition: A Python web developer is responsible for writing server-side web application logic. Python web developers usually develop back-end components, connect the application with other (often third-party) web services, and support the front-end developers by integrating their work with the Python application. Skills and qualifications: Work experience as a Python Developer. Expertise in at least one popular Python framework (like Django, Flask, or Pyramid). Knowledge of object-relational mapping (ORM). Familiarity with front-end technologies (like JavaScript and HTML5). Team spirit. Good problem-solving skills. Graduate degree in Computer Science, Engineering, or a relevant field. Responsibilities: Write effective, scalable code. Develop back-end components to improve responsiveness and overall performance. Integrate user-facing elements into applications. Test and debug programs. Improve functionality of existing systems. Implement security and data protection solutions. Assess and prioritize feature requests. Coordinate with internal teams to understand user requirements and provide technical solutions. If you are interested and available in the job market or looking for a job change, please go to this link and fill out the form (link removed). Point of contact: Pratik Balladkar
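The server-side routing idea described above (which frameworks like Flask or Django provide) can be sketched with the stdlib WSGI machinery so it runs without third-party packages. The route path and JSON payload are made up for the example; this is the mechanism, not any particular framework's API.

```python
from wsgiref.util import setup_testing_defaults
import json

ROUTES = {}

def route(path):
    """Decorator that registers a handler for a URL path (Flask-style)."""
    def register(fn):
        ROUTES[path] = fn
        return fn
    return register

@route("/health")
def health(environ):
    return 200, {"status": "ok"}

def app(environ, start_response):
    handler = ROUTES.get(environ.get("PATH_INFO", ""))
    code, payload = handler(environ) if handler else (404, {"error": "not found"})
    body = json.dumps(payload).encode()
    start_response(f"{code} OK" if code == 200 else f"{code} ERROR",
                   [("Content-Type", "application/json")])
    return [body]

# Exercise the app without a network socket, the way a test client would.
environ = {}
setup_testing_defaults(environ)
environ["PATH_INFO"] = "/health"
captured = {}
def start_response(status, headers):
    captured["status"] = status
response = b"".join(app(environ, start_response))
```

Calling the WSGI callable directly with a fake environ, as at the bottom, is also how one unit-tests back-end logic without starting a server.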
02/11/2026
Full Stack Lead Python Developer
Robotics technology LLC New York, New York
Job Details Title: Full Stack Lead Python Developer. Location: NYC, NY 10001 (Hybrid). We are seeking a highly skilled Full Stack Lead Python Developer to join a leading financial services client. In this role, you will leverage your expertise in Python, ReactJS, and SQL to build scalable, secure, and high-performing applications tailored to the financial industry. You will work across the stack, contributing to both backend and frontend development while ensuring best practices in software design, performance, and security. You will lead a team of engineers working in a global delivery model and engage with customer stakeholders to drive innovative financial solutions. Key Responsibilities: Design, develop, and maintain web applications using Python, ReactJS, and SQL. Implement efficient, reusable, and scalable code for frontend and backend components. Develop and integrate RESTful APIs with a focus on security and compliance. Ensure application security, authentication, and authorization using industry best practices. Utilize CI/CD pipelines for smooth and secure deployments. Lead and mentor a team of engineers in a global delivery environment. Collaborate with cross-functional teams to deliver new features aligned with financial regulations. Stay updated with emerging full-stack technologies and trends. Required Skills & Experience: Strong proficiency in Python (Django, Flask, or FastAPI). Expertise in JavaScript/TypeScript and ReactJS. Experience with Git version control systems. Strong knowledge of relational databases such as PostgreSQL or MySQL. Familiarity with OAuth and JWT authentication mechanisms. Experience with cloud platforms (AWS, Azure, Google Cloud Platform) is a plus. Basic knowledge of Docker, containerization, and CI/CD tools. Experience with testing frameworks such as pytest, Jest, or Mocha. Proven leadership experience in a global delivery model. We are an equal opportunity employer.
All aspects of employment including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, national origin, citizenship/ immigration status, veteran status, or any other status protected under federal, state, or local law.
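The JWT authentication mechanism this listing asks for can be illustrated with a stdlib HS256 sign/verify sketch. In production you would use a maintained library such as PyJWT; this only shows the mechanics, and the secret and claims are made up.

```python
import base64, hashlib, hmac, json

def _b64(data: bytes) -> str:
    """URL-safe base64 without padding, as the JWT spec requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(claims: dict, secret: bytes) -> str:
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = _b64(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify(token: str, secret: bytes) -> dict:
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = _b64(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):  # constant-time comparison
        raise ValueError("bad signature")
    padded = payload + "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign({"sub": "trader-42", "role": "analyst"}, b"secret")
claims = verify(token, b"secret")
```

Note the `hmac.compare_digest` call: comparing signatures with `==` would leak timing information, which matters in the financial-services context the listing describes.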
02/11/2026
Job Title : Databricks Developer / Hybrid but Atlanta preferred
CCM Consulting Atlanta, Georgia
Job Title: Databricks Developer Location: Hybrid, but local resources to Atlanta preferred Duration: 12-24 months Job Description: A PySpark and Databricks Developer with a good understanding of the entire ETL/Azure lifecycle and a background in data projects. Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes using Azure Databricks, Data Factory, and other Azure services. Implement and optimize Spark jobs, data transformations, and data processing workflows, managing Databricks notebooks and Delta Lake with Python and Spark SQL in Databricks. Leverage Azure DevOps and CI/CD best practices to automate deployment (DAB deployments) and management of data pipelines and infrastructure. Ensure data integrity and data quality checks with zero errors when deployed to production. Understand new Databricks features: Unity Catalog, Lakeflow, DAB deployments, and catalog federation. Hands-on experience with data extraction (extract, schemas, corrupt records, error handling, parallelized code), transformations and loads (user-defined functions, join optimizations), and production optimization (automated ETL). Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum of 5 years of experience in data engineering or similar roles. Proven expertise with Azure Databricks and data processing frameworks. Strong understanding of data warehousing, ETL processes, and data pipeline design. Experience with SQL, Python, and Spark. Excellent problem-solving and analytical skills. Effective communication and teamwork abilities. Skills: Azure Databricks, Python, Apache Spark, SQL, ETL processes, Data Warehousing, Data Pipeline Design, Cloud Architecture, Performance Tuning
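The "corrupt records, error handling" requirement above mirrors Spark's corrupt-record column pattern: rows that fail the schema are quarantined instead of killing the job. Here is that idea sketched in plain Python (field names invented); in Databricks itself you would use `spark.read` with a corrupt-record column rather than this hand-rolled loop.

```python
import json

# Expected fields per record; purely illustrative.
SCHEMA = ("device_id", "reading")

def parse_batch(lines):
    """Split raw JSON lines into schema-conforming tuples and quarantined rows."""
    parsed, quarantined = [], []
    for line in lines:
        try:
            rec = json.loads(line)
            parsed.append(tuple(rec[field] for field in SCHEMA))
        except (json.JSONDecodeError, KeyError):
            quarantined.append(line)  # kept for later inspection, job continues
    return parsed, quarantined

parsed, quarantined = parse_batch([
    '{"device_id": "d1", "reading": 3.2}',
    '{"device_id": "d2"}',   # missing field -> quarantined
    'not json at all',       # malformed -> quarantined
])
```

The key design point is that a bad record costs one list append, not a failed production run, which is what "zero errors when deployed to production" demands in practice.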
02/11/2026
Principal Business Analysts - SAP
Simpson Strong-Tie Company, Inc.
Job duties: Implement, configure, and support SAP WM, EWM, and Transportation modules. Work closely with business users to understand their warehouse management processes and requirements, then translate those requirements into SAP EWM solutions. Configure the SAP EWM system to support warehouse management processes, including warehouse structure setup, storage bin management, yard management, labor management, and cross-docking. May oversee the work of junior analysts. May manage SAP projects, etc. 50% domestic & international travel. Remote position - work from anywhere in the U.S. Education and Experience required: Bachelor's degree in Computer Science, Computer Engineering, Management Information Systems, any Science field, or a related technical field and 6 years of experience as an SAP analyst, engineer, and/or consultant. Alternate Experience: No degree and 8 years of experience as an SAP analyst, engineer, and/or consultant. Background: 2 years of experience configuring SAP EWM (Extended Warehouse Management). 4 years of experience with SAP EWM Outbound and Inbound and 1 year of experience with SAP EWM Internal processes. 3 years of configuration experience with various communication channels, including Application Link Enabling intermediate documents (ALE IDocs), queued and transactional remote function calls (qRFC and tRFC), and 1 year of experience with Core Interface (CIF). 5 years of experience with Master Data. 1 year of experience with Ship ERP TMS, BluJay, and/or Yard Con. 2 years of SAP MM (Material Master) configuration experience. 1 year of PI/PO (Process Integration & Orchestration) experience. Location: HQ: Pleasanton, CA. Remote position - work from anywhere in the U.S. Rate of pay: $174,637 per year. How to apply: Send resume to and include job reference in the subject line.
02/11/2026
Senior Data Engineer
Charles River Laboratories, Inc. Wilmington, Massachusetts
Data Architecture Design and Optimization: Designing, implementing, and optimizing data architecture on Azure, including databases, data lakes, and data warehouses. Azure Data Services Implementation: Implementing and managing Azure data services such as Azure SQL Database, Azure Data Lake Storage, and others. ETL (Extract, Transform, Load) Pipeline Development: Building and maintaining ETL pipelines to move and transform data from various sources to target destinations in Azure. Data Integration and Transformation: Integrating data from diverse sources and transforming it into a unified format for analysis. Performance Monitoring and Optimization: Monitoring the performance of data systems and optimizing queries, storage, and processing for efficiency. Collaboration with Cross-functional Teams: Collaborating with data scientists, analysts, and other teams to understand their data requirements and provide necessary support. Minimum Job Requirements: Bachelor's degree in Computer Engineering, Computer Science, Electronic Engineering, or a related field, or foreign degree equivalent, plus seven (7) years of experience in ETL design, performance optimization, and implementation in a multi-dimensional data warehousing environment.
The experience (which may be gained concurrently) must also include each of the following: 7 years of advanced SQL programming (T-SQL); 5 years of experience in designing and implementing Enterprise Data & Analytics solutions, focusing on architecture and strategy development to drive data-driven business decisions; 3 years of hands-on experience with data-heavy and analytics applications, utilizing relational databases, data warehousing, and big data technologies such as HDFS, Hive, Sqoop, Spark, and Python for data processing and analysis; and 2 years of experience with Azure cloud technologies, including Azure Data Factory, Azure Data Lake Gen2, Azure Databricks, Blob Storage, Azure SQL Database, Azure Functions, and Cosmos DB, enabling scalable and efficient data pipelines and solutions. Job Location: Charles River Laboratories, Inc., 251 Ballardvale Street, Wilmington, MA 01887 (100% telecommuting allowed from any U.S. location). 40 hours per week, 9:00 am to 5:00 pm. Salary: $185,000 per year. To apply, send resume and letter of application detailing experience to Emily VanGilder, HR Business Partner, Charles River Laboratories, Inc.,
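A core pattern in the multi-dimensional warehousing work described above is the dimension-table upsert (a `MERGE` in T-SQL terms). Here it is sketched with sqlite3 so the example is self-contained; the table and column names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer(customer_key TEXT PRIMARY KEY, city TEXT)")

def upsert_customers(conn, batch):
    """Insert new dimension members; update the city of existing ones."""
    conn.executemany(
        """INSERT INTO dim_customer(customer_key, city) VALUES (?, ?)
           ON CONFLICT(customer_key) DO UPDATE SET city = excluded.city""",
        batch,
    )
    conn.commit()

upsert_customers(conn, [("C1", "Boston"), ("C2", "Lowell")])
upsert_customers(conn, [("C2", "Wilmington"), ("C3", "Salem")])  # C2 is updated
rows = sorted(conn.execute("SELECT customer_key, city FROM dim_customer"))
```

Overwriting the attribute in place like this is a Type 1 slowly changing dimension; a Type 2 design would instead close out the old row and insert a new one with validity dates.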
02/11/2026
Database Administrator
MI Windows and Doors Tacoma, Washington
Job Description Location: Tacoma, WA Pay Range: $103,000/Yr. - $129,500/Yr., depending on experience and qualifications. Join MITER Brands - Where Innovation Meets Craftsmanship. MITER Brands is more than a window and door manufacturer; we're shaping the future of residential living. As the powerhouse behind trusted names like Milgard, MI Windows & Doors, and PGT, we're one of the nation's largest suppliers of vinyl windows and patio doors, with state-of-the-art facilities across the country. We are looking for a Database Administrator in the Tacoma, Washington area. In this role you will ensure the stability, integrity, and efficient operation of the MSSQL, MySQL, or related databases that support core organizational functions. This includes designing, installing, configuring, administering, and fine-tuning database components across the organization in a timely and efficient manner. There is a considerable focus for this role on supporting Microsoft Azure, particularly Azure Synapse and data analytics. The Database Administrator will apply proven communication, analytical, and problem-solving skills to help identify, communicate, and resolve issues in order to maximize the benefit of IT systems investments.
Responsibilities: Azure Analytics and Infrastructure Engineering: Promote changes in Azure DevOps to the production Synapse environment. Manage Synapse pipelines and coordinate with other infrastructure team members to deliver data. Manage, build, and design Azure infrastructure to support business demand. Troubleshoot data issues related to the Enterprise BI platform in Azure Synapse and Power BI. Ensure all database servers are backed up in a way that meets the business's Recovery Point Objectives (RPO). Test backups to ensure we can meet the business's Recovery Time Objectives (RTO). Troubleshoot SQL Server service outages as they occur, including after-hours and weekends. Configure SQL Server monitoring utilities to minimize false alarms. Install and configure new SQL Servers. Deploy database change scripts provided by third-party vendors or internal Development and Business Analysts after approvals. When performance issues arise, determine the most effective way to increase performance, including server configuration changes or index/query changes. Document the company's database environment. Manage and plan database capacity, storage, and disk space, and monitor database growth. Monitor and proactively identify locks/blocking and resolve concurrency issues. Monitor database server health and create alerts to capture poor server performance issues. Manage database security as defined and dictated by the company. Qualifications: On-call troubleshooting experience with at least one production SQL Server for a year. You don't have to be the only DBA or have DBA in your job description, but you should have been the one person the company would call if the SQL Server service stopped working.
Writing DMV queries to answer questions about server-level performance. Using tools like Extended Events, the Database Engine Tuning Advisor (DTA), SQL Profiler, DMVs, and Data Collection to diagnose server reliability and performance issues. Tuning T-SQL queries to improve performance. 3-7 years of experience. Expertise in designing database schemas, normalization, and indexing strategies. Experience with data modeling tools and techniques. Knowledge of backup strategies and tools. Proficiency in disaster recovery planning and implementation. Understanding of data warehousing concepts and design. Experience with ETL (Extract, Transform, Load) processes and tools. Certifications (Optional but Beneficial): Microsoft Certified: Azure Database Administrator Associate; Certified MySQL Database Administrator; MongoDB Certified DBA. What We Offer: Our benefits package includes coverage of your health, wealth, and wellness for you and your eligible spouse/dependents. We offer a competitive salary and benefits package, including a 401k with company match and generous paid time off to help you balance your life. Below is a list of benefits you will enjoy while working with our company. - Three comprehensive Medical plan options - Prescription - Dental - Vision - Company-Paid Life Insurance - Voluntary Life Insurance - Supplemental Hospital Indemnity, Critical Illness, and Accident Insurance - Company-paid Short-Term Disability - Company-paid Long-Term Disability - Paid time off (PTO) and paid Holidays - 401k retirement plan with company match - Employee Assistance Program - Teladoc - Legal Insurance - Identity Theft Protection - Pet Insurance - Team Member Discount Program - Tuition Reimbursement - Yearly Wellness Clinic. MITER Brands, also known as MI Windows and Doors, Milgard, and PGT Industries, is an equal-opportunity employer.
The company does not discriminate based on religion, race, creed, color, national origin, sex, age, disability, handicap, veteran status, sexual orientation, genetic information, or any other applicable legally protected category.
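The index/query tuning work this role describes comes down to checking whether a query scans the table or seeks an index. SQLite's `EXPLAIN QUERY PLAN` demonstrates the idea in a self-contained way; on SQL Server you would read the execution plan and DMVs such as `sys.dm_exec_query_stats` instead. The table here is made up for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders(id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders(customer, total) VALUES (?, ?)",
                 [(f"cust{i % 100}", float(i)) for i in range(1000)])

def plan(conn, sql):
    """Return the query plan detail text for a statement."""
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = plan(conn, "SELECT * FROM orders WHERE customer = 'cust7'")
conn.execute("CREATE INDEX ix_orders_customer ON orders(customer)")
after = plan(conn, "SELECT * FROM orders WHERE customer = 'cust7'")
# 'before' reports a full scan; 'after' reports a search using the new index.
```

Comparing plans before and after adding an index, rather than guessing, is the habit that separates configuration changes from index/query changes as the listing puts it.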
02/11/2026
Full time
Job Description Location: Tacoma, WA Pay Range: $103,000/Yr. - $129,500/Yr., depending on experience and qualifications. Join MITER Brands Where Innovation Meets Craftsmanship MITER Brands is more than a window and door manufacturer we're shaping the future of residential living. As the powerhouse behind trusted names like Milgard, MI Windows & Doors, and PGT, we're one of the nation's largest suppliers of vinyl windows and patio doors, with state-of-the-art facilities across the country. We are looking for a Database Administrator in the Tacoma, Washington area. In this role you will ensure the stability, integrity, and efficient operation of the MSSQL, mysql, or any related database as required that support core organizational functions. This includes designing, installing, configuring, administering, and fine-tuning database components across the organization in a timely and efficient manner. There is a considerable focus and emphasis for this role in the support of Microsoft Azure with a focus on Azure Synapse and data analytics. The Database Administrator will apply proven communication, analytical, and problem-solving skills to help identify, communicate, and resolve issues in order to maximize the benefit of IT systems investments. 
Responsibilities

Azure Analytics and Infrastructure Engineering
  • Promote changes in Azure DevOps to the production Synapse environment
  • Manage Synapse pipelines and coordinate with other infrastructure team members to deliver data
  • Manage, build, and design Azure infrastructure to support business demand
  • Troubleshoot data issues related to the Enterprise BI platform in Azure Synapse and Power BI
  • Ensure all database servers are backed up in a way that meets the business's Recovery Point Objectives (RPO)
  • Test backups to ensure we can meet the business's Recovery Time Objectives (RTO)
  • Troubleshoot SQL Server service outages as they occur, including after hours and on weekends
  • Configure SQL Server monitoring utilities to minimize false alarms
  • Install and configure new SQL Servers
  • Deploy database change scripts provided by third-party vendors or internal Development and Business Analysts after approvals
  • When performance issues arise, determine the most effective way to increase performance, including server configuration changes or index/query changes
  • Document the company's database environment
  • Plan and manage database capacity and storage, monitor disk space and database growth
  • Monitor for and proactively identify locks/blocking and resolve concurrency issues
  • Monitor database server health and create alerts to capture poor server performance
  • Manage database security as defined and dictated by the company

Qualifications
  • On-call troubleshooting experience with at least one production SQL Server for a year. You don't have to be the only DBA or have DBA in your job description, but you should have been the one person the company would call if the SQL Server service stopped working.
  • Writing DMV queries to answer questions about server-level performance
  • Using tools such as Extended Events, the Database Engine Tuning Advisor (DTA), SQL Profiler, DMVs, and Data Collection to diagnose server reliability and performance issues
  • Tuning T-SQL queries to improve performance
  • 3-7 years of experience
  • Expertise in designing database schemas, normalization, and indexing strategies
  • Experience with data modeling tools and techniques
  • Knowledge of backup strategies and tools
  • Proficiency in disaster recovery planning and implementation
  • Understanding of data warehousing concepts and design
  • Experience with ETL (Extract, Transform, Load) processes and tools

Certifications (Optional but Beneficial)
  • Microsoft Certified: Azure Database Administrator Associate
  • Certified MySQL Database Administrator
  • MongoDB Certified DBA

What We Offer
Our benefits package includes coverage of your health, wealth, and wellness for you and your eligible spouse/dependents. We offer a competitive salary and benefits package, including a 401k with company match and generous paid time off to help you balance your life. Below is a list of benefits you will enjoy while working with our company.
  • Three comprehensive Medical plan options
  • Prescription, Dental, and Vision coverage
  • Company-paid Life Insurance
  • Voluntary Life Insurance
  • Supplemental Hospital Indemnity, Critical Illness, and Accident Insurance
  • Company-paid Short-Term Disability
  • Company-paid Long-Term Disability
  • Paid time off (PTO) and paid Holidays
  • 401k retirement plan with company match
  • Employee Assistance Program
  • Teladoc
  • Legal Insurance
  • Identity Theft Protection
  • Pet Insurance
  • Team Member Discount Program
  • Tuition Reimbursement
  • Yearly Wellness Clinic

MITER Brands, also known as MI Windows and Doors, Milgard, and PGT Industries, is an equal-opportunity employer.
The company does not discriminate based on religion, race, creed, color, national origin, sex, age, disability, handicap, veteran status, sexual orientation, genetic information, or any other applicable legally protected category.
Staff Software Engineer
Chicago Mercantile Exchange Inc. Chicago, Illinois
40 hrs/week, Mon-Fri, 8:30 a.m. - 5:30 p.m. Salary: $165,200 - $203,900/yr. Standard company benefits. MINIMUM REQUIREMENTS: Bachelor's degree, or foreign equivalent degree, in Information Technology, Electrical Engineering, or a related field and five (5) years of post-bachelor's, progressive, related work experience. Must have five (5) years of experience with/in the following: cloud infrastructure components such as load balancers, API gateways, and service meshes to support robust and scalable application deployments; Google Cloud; developing applications using Java, Python, the Spring Framework, and RESTful APIs to enhance performance and scalability; leveraging the in-memory JVM, Java queue processors, multithreading, and data persistence to enhance data processing efficiency; utilizing industry-standard caching and polling methods to optimize data performance; and integrating and automating workflows using Jenkins and Chef. Telecommuting permitted on a hybrid schedule as determined by the employer. To apply, please email resume to: and reference: IL0205.
02/11/2026
Geospatial Data Engineer
The Water Institute of the Gulf Baton Rouge, Louisiana
Requirements: Bachelor's in Geomatics Engineering or a related field. 5 years of GIS experience, including solid experience with each of the following: web GIS architectures and deployment of distributed geospatial services; Esri technologies, including ArcGIS Pro, ArcGIS Online, and ArcGIS Enterprise (Portal, Server, Datastore); prototyping and deploying spatial web applications and dashboards using ArcGIS Web AppBuilder and ArcGIS Dashboards, including HTML and CSS customization; spatial databases, including relational systems (PostgreSQL, PostGIS) and NoSQL databases (MongoDB); Python, with emphasis on geospatial libraries; ETL workflows for high-volume spatial data pipelines; Open Geospatial Consortium standards; containerizing environments using Docker; Git technologies (GitLab) and CI/CD pipelines. In-depth knowledge of: spatial statistics and geostatistics (kriging, spatial autocorrelation, hot spot analysis); remote sensing data and techniques (satellite imagery, LiDAR). A hybrid schedule is an option for this position. Apply: You MUST follow these specific application instructions in order to be considered: Send CV and cover letter to or Brandy Rush, The Water Institute of the Gulf, 1110 River Road S, Suite 200, Baton Rouge, LA 70802 within 30 days, ref Job # W2021-489.
02/11/2026
Sr DevSecOps Engineer
Northwestern Mutual Milwaukee, Wisconsin
Bring your best! What this role needs:
  • Passion for security
  • A team player who enjoys collaborating with cross-functional teams
  • A great communicator (written and verbal) with an ability to articulate complex topics in a clear and concise manner
  • A flexible and constructive approach when solving problems
  • Proficiency with development and scripting languages; Python and JavaScript preferred
  • Strong knowledge of data security principles, encryption techniques, access controls, and secure coding practices
  • Experience with infrastructure-as-code concepts and tooling, including Terraform and YAML
  • Continuously looking for opportunities to improve our processes and capabilities
  • Experience working with application and engineering teams
  • A self-directed individual contributor

What you'll get to do:
  • Engineer solutions with a focus on automation to reduce manual and repetitive tasks
  • Guide and advise application and engineering teams in the area of Data Security
  • Manage day-to-day support of Data Security tools integrated into our on-premises and cloud database environments (relational & NoSQL)
  • Manage technical support of Data Security capabilities and respond to service and escalation tickets within service-level agreements
  • Design, implement, and maintain procedures, processes, and methodologies that support DevSecOps capabilities
  • Actively monitor, assess, and recommend tactical and strategic initiatives based on new and emerging threats posing risk to our company
  • Stay apprised of current and proposed security changes impacting regulatory, privacy, and security industry best practices
  • Manage remediation efforts after security assessment findings outline weaknesses requiring attention
  • Mentor other staff members to ensure consistency, quality, and productivity of deliverables

Further impress us with:
  • Bachelor's degree or equivalent experience with an emphasis in computer science, computer engineering, software engineering, or an MIS-related field
  • 5+ years of experience in cloud and on-premises technologies (systems administration of Unix/Linux/Windows, AWS PaaS databases, database activity monitoring, DSPM tools)
  • 5+ years of experience in development, infrastructure, or cybersecurity
  • Understanding of applicable risk management frameworks from NIST and the Data Security Maturity Model
  • Experience with CI/CD pipelines to automate application and infrastructure code deployments
  • Experience with workload orchestration platforms such as Kubernetes
  • Understanding of a wide range of cybersecurity capabilities, including data security, security engineering, identity and access management, incident response, logging and monitoring, and penetration testing
  • Relevant certifications from GIAC, ISC(2), ISACA, and other recognized cybersecurity industry organizations

Compensation Range: Pay Range - Start: $104,090.00 Pay Range - End: $193,310.00 Geographic Specific Pay Structure: Structure 110: $114,520.00 USD - $212,680.00 USD Structure 115: $119,700.00 USD - $222,300.00 USD We believe in fairness and transparency. It's why we share the salary range for most of our roles. However, final salaries are based on a number of factors, including the skills and experience of the candidate; the current market; the location of the candidate; and other factors uncovered in the hiring process. The standard pay structure is listed, but if you're living in California, New York City, or another eligible location, geographic-specific pay structures, compensation, and benefits could be applicable. Grow your career with a best-in-class company that puts our clients' interests at the center of all we do. Get started now! Northwestern Mutual is an equal opportunity employer who welcomes and encourages diversity in the workforce. We are committed to creating and maintaining an environment in which each employee can contribute creative ideas, seek challenges, assume leadership, and continue to focus on meeting and exceeding business and personal objectives.
Skills: Analytical Thinking (NM) - Advanced, Cloud & IT Infrastructure (NM) - Advanced, Cloud Security (NM) - Intermediate, Threat Awareness (NM) - Advanced, Application Security (NM) - Intermediate, Cross Functional Partnering & Planning (NM) - Intermediate, DevOps (NM) - Advanced, Professional Curiosity (NM) - Advanced, Risk Management (NM) - Intermediate, System Development (NM) - Advanced, Customer Centricity (NM) - Intermediate, Strategic Thinking (NM) - Intermediate, Technology R&D (NM) - Intermediate, Adaptive Communication (NM) - Intermediate, Engineering Expertise & Practices (NM) - Advanced, Vulnerability Assessment & Management (NM) - Intermediate, Scripting & Integration (NM) - Intermediate, Coaching & Mentoring (NM) - Intermediate, Triage (NM) - Advanced, Security Practices (NM) - Intermediate, Secure Information Management (NM) - Intermediate

FIND YOUR FUTURE
We're excited about the potential people bring to Northwestern Mutual. You can grow your career here while enjoying first-class perks, benefits, and our commitment to a culture of belonging.
  • Flexible work schedules
  • Concierge service
  • Comprehensive benefits
  • Employee resource groups

By applying, you consent to your information being transmitted to the Employer by SonicJobs. See the Northwestern Mutual Privacy Policy and the SonicJobs Privacy Policy and Terms of Use.
02/11/2026
Full time
Lead Database Developer
Epsilon Data Management LLC Irving, Texas
Requirements: Employer will accept a Master's degree in Computer Science, Engineering, Information Technology, or a related field and three years of experience in the job offered, or three years of experience in any occupation in which the required experience was gained. Position also requires three years of experience in each of the following: 1. SQL 2. PL/SQL 3. Pentaho ETL tool 4. Oracle 5. XML 6. Data Warehousing 7. Data Analytics 8. JSON 9. ProJS. Telecommuting available from anywhere in the US. Contact: In order to be considered for this position, please reference job 6630.5622.17.
02/11/2026
Senior Software Engineer
jobs New Castle, Delaware
Our team understands the value of their work in the real world. As a Senior Software Engineer in New Castle, Delaware, you'll be working on instruments that are pivotal in electric cars, 3D printing, batteries, recyclable plastics, space suits, candy that melts in your mouth and not in your hands, paint that goes on smooth and dries without streaks, and even ensuring your French fries have the perfect crunch! Material science uniquely spans the fields of physics, chemistry, engineering, and manufacturing. Consider this: today's lab instruments can only be viewed in the lab; our mission here is to enable scientists to see their lab results from Starbucks, home, or anywhere they can access the internet. We are just getting started with building our next-generation, best-in-class platform with a goal of being the best in the industry at supporting our customers across life, materials, food, and environmental sciences. We're looking for a passionate Senior Software Engineer with a talent for building quality software solutions. You will work in a fast-paced, agile environment and engage in technical discussions, participate in technical designs, demonstrate problem-solving abilities, and present and share ideas through global collaboration. This role follows a hybrid work model, requiring three days per week on-site at the TA Instruments headquarters located at 159 Lukens Dr, New Castle, DE 19072.

Responsibilities
As a Senior Software Engineer you will:
  • Research, design, develop, and release/maintain application, user interface, data analysis, and instrument control software for thermal analysis, rheological, and calorimetric instruments.
  • Evaluate system specifications and translate system requirements into task specifications.
  • Provide ongoing support of current programs, including performance, diagnosis, and troubleshooting of problem programs, and design solutions to problematic programming.
  • Work within a formal development process covering the full development lifecycle.
  • Work in collaboration with the agile team and appropriate experts to implement your designs.
  • Provide code and supporting documentation in accordance with the coding guidelines, quality processes, and applicable procedures, including the team's definition of DONE.
  • Work in collaboration with the agile team to generate automated and manual tests to verify implemented software.
  • Provide work estimation and tracking information to support management decisions and planning.
  • Keep a proactive attitude to ensure continuous improvement of software quality, work processes, and individual skills.
  • Contribute to the application architecture with a focus on scalability, maintainability, security, and performance.
  • Provide product-specific and technical support to internal and external users where appropriate.

As a Team Member you will:
  • Participate in all team meetings and ceremonies in direct collaboration with other sites, provide input and feedback, and take ownership of identified improvements.
  • Actively participate in learning and sharing activities, whether informal or formal training and demos.
  • Demonstrate continuous technical improvement.

Qualifications
Abilities that will help you be successful:
  • Bachelor's Degree in Computer Science or similar, or equivalent relevant experience
  • 5+ years' experience designing, building, and supporting complex large-scale applications/platforms and/or solutions
  • 5+ years of experience designing, building, and scaling solutions using C#, .NET, ASP.NET, RESTful Web API, EF Core, and PostgreSQL
  • Solid knowledge and proven experience as a software developer, with exposure to elements of our back-end technology stack (C#, .NET, ASP.NET, Web API)
  • Knowledge of front-end JavaScript frameworks, especially React with TypeScript
  • Knowledge and application of software engineering practices (e.g., unit testing, TDD, CI/CD, SOLID)
  • Proven ability to work as part of an Agile delivery team
  • Good knowledge of software engineering principles
  • Ability to develop an application from end to end, from the database to the user interface
  • Excellent written and verbal communication skills
  • Good knowledge of multi-threading and tasks
  • Database skills: design and code databases using a language such as SQL

Desired:
  • Continuous delivery, with pipelines implemented in Kubernetes and Docker
  • Experience using GitHub and GitHub Actions
  • Behaviour Driven Development (BDD), with SpecFlow
  • Software security best practices and implementation (e.g., OWASP, PKI, X.509 certificates, TLS)
  • Software development for regulated environments (e.g., 21 CFR Part 11)
  • Analytical instrumentation domain experience
02/11/2026
© 2008-2026 IT Job Board