Salesforce Developer
Location: Okemos, MI (3 days onsite, 2 days remote)
In-person interview required
Visa: W2
Salesforce developer-related certification required
Mandatory Skills Required:
- Provides vision and strategic oversight during product evaluation, solution prototyping, and proof-of-concept activities.
- Develops enterprise strategy for application design, approves SDLC processes, evaluates application quality, and advises IT leadership on risk.
- Architects, designs, implements, and maintains scalable and performant solutions on the Salesforce platform, including integration with other data sources, ensuring alignment with business requirements and Salesforce best practices.
- Aligns enterprise-wide architecture strategies and initiatives with Salesforce platform capabilities, acting as a bridge between business and technology stakeholders.
- Proven experience in solution and technical design/implementation using Salesforce Health Cloud, Sales Cloud, and Service Cloud within the healthcare payer ecosystem.
- Strong understanding of the Salesforce data model, security architecture, sharing and visibility, Apex programming, LWC/Aura, and automation capabilities (e.g., Flows, Triggers).
- In-depth knowledge of Salesforce best practices, support mechanisms, Salesforce operational procedures, and platform limitations.
- Hands-on experience with real-time and batch integration patterns, including tools such as Informatica PowerCenter, Informatica Intelligent Cloud Services (IICS), and Snowflake.
- Salesforce certifications such as Application Architect, System Architect, and/or Platform Developer II.
- Familiarity with HIPAA/Salesforce Shield, FedRAMP/Salesforce Government Cloud, and BPO (Business Process Outsourcing) services/OSP Model licensing is a plus.
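The batch integration patterns this posting asks for often come down to respecting per-request record limits when loading data into Salesforce. As a minimal, hedged sketch (the 200-record default mirrors a common per-request limit for sObject collection APIs; the function name is our own, not part of any Salesforce SDK), records can be chunked before submission:

```python
def chunk_records(records, batch_size=200):
    """Split a record list into fixed-size batches before submitting
    them to a bulk-style load API. 200 is used here as an illustrative
    per-request record limit; check the target API's documented limits."""
    return [records[i:i + batch_size]
            for i in range(0, len(records), batch_size)]

# 450 records -> three batches of 200, 200, and 50
batches = chunk_records(list(range(450)))
```

Each batch would then be sent as one API call, keeping retries and error handling scoped to a single chunk rather than the whole load.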
03/15/2026
Job ID: 65397
Position: Data Analyst
Client: CO CDE
Location: 201 E Colfax Ave, Denver, Colorado (Remote)
Duration: 6 Months
Role Summary
IMS is seeking a mid-level Informatica ETL Developer/Administrator to support the migration of ETL workflows from Informatica PowerCenter to Informatica Intelligent Data Management Cloud (IDMC). This role is hands-on and execution-focused, providing development, administration, and database support during migration and stabilization efforts. The contractor will primarily focus on ETL asset/workflow migration and may also assist with Oracle database and stored-procedure-based systems, including components supporting Accountability Frameworks. This position works under the direction of IMS technical leads and does not include platform ownership, architecture authority, or people management.
Key Responsibilities
Informatica IDMC Migration
- Support migration of ETL workflows from PowerCenter to IDMC.
- Analyze existing mappings and workflows to support rebuild or refactoring in IDMC.
- Assist with testing, validation, and troubleshooting of migrated workflows.
- Support cutover and post-migration stabilization activities.
Informatica Administration (Execution-Level)
- Support IDMC environment configuration and operations.
- Perform system configuration activities to support the hybrid legacy configuration.
- Assist with job scheduling, monitoring, and error resolution.
- Follow established operational procedures and escalation paths.
Database and Stored Procedure Support
- Support Oracle databases used by ETL and reporting systems.
- Assist with analysis and troubleshooting of SQL and stored procedures.
- Validate data transformations and outputs, particularly for Accountability Frameworks.
Documentation and Knowledge Transfer
- Contribute to migration and configuration documentation.
- Document changes made during migration to support operational continuity.
- Participate in knowledge transfer to IMS staff as requested.
Required Skills and Experience
- 4-7 years of hands-on Informatica ETL experience.
- Experience with Informatica IDMC/Informatica Cloud or PowerCenter, with migration exposure.
- Experience supporting or administering Informatica environments.
- Strong SQL skills with Oracle databases.
- Experience working with stored procedures and database-driven logic.
- Experience supporting ETL testing, validation, and issue resolution.
- Experience with Linux-based operating systems and the command-line interface.
Preferred Skills
- Prior PowerCenter-to-IDMC migration experience.
- Familiarity with cloud or hybrid execution environments.
- Experience or familiarity with Linux system administration activities.
- Experience supporting regulated or audit-sensitive data systems.
Engagement Objective
Provide targeted ETL and database expertise to:
- Accelerate the PowerCenter-to-IDMC migration
- Reduce load on permanent IMS staff
- Support Oracle/stored-procedure-based Accountability Frameworks during the transition
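The ETL testing and validation work this role describes usually includes confirming that a migrated IDMC mapping produces the same output as its PowerCenter original. One simple, hedged approach (the function names are our own; real validation suites typically also compare schemas and row-level diffs) is an order-insensitive fingerprint of each result set:

```python
import hashlib

def table_fingerprint(rows):
    """Order-insensitive fingerprint of a result set: hash each row,
    sort the row hashes, then hash their concatenation. Two loads with
    the same rows in any order produce the same fingerprint."""
    row_hashes = sorted(
        hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()
        for row in rows
    )
    return hashlib.sha256("".join(row_hashes).encode()).hexdigest()

def outputs_match(legacy_rows, migrated_rows):
    """Quick post-migration check: same row count and same fingerprint."""
    return (len(legacy_rows) == len(migrated_rows)
            and table_fingerprint(legacy_rows) == table_fingerprint(migrated_rows))
```

Comparing fingerprints is cheap enough to run on every cutover rehearsal; a mismatch then triggers a detailed row-level diff.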
03/13/2026
Full time
54257: ETL (Informatica) Developer
In-person interview
Phoenix, AZ (onsite), 6 months
LOCAL CANDIDATES ONLY FOR ONSITE CLIENT EVAL
Required:
- Extensive experience with Informatica PowerCenter development and SQL.
- Design and map complex workflows to move data from heterogeneous sources (SQL, Oracle, XML, flat files) to target databases.
- Strong understanding of relational databases.
- Ability to analyze data, identify discrepancies, and resolve issues.
- Good knowledge of SQL, PL/SQL, and Unix shell scripting.
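"Analyze data, identify discrepancies, and resolve issues" typically means reconciling a loaded target against its source by key. A minimal sketch (the function and column names here are illustrative, not tied to any client system) of that reconciliation:

```python
def reconcile(source, target, key):
    """Compare two row sets (lists of dicts) by a key column and report
    rows missing from the target, extra in the target, or present in
    both with differing values -- the basic discrepancy analysis done
    when validating an ETL load."""
    src = {row[key]: row for row in source}
    tgt = {row[key]: row for row in target}
    missing = sorted(set(src) - set(tgt))
    extra = sorted(set(tgt) - set(src))
    mismatched = sorted(k for k in set(src) & set(tgt) if src[k] != tgt[k])
    return {"missing": missing, "extra": extra, "mismatched": mismatched}
```

In practice the same comparison is often pushed down into SQL (full outer join on the key), but the logic is identical.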
03/10/2026
Full time
Adnet Advertising Agency, Inc.
San Diego, California
Programmer Analyst sought by The University of San Diego in San Diego, CA to build data models, reports, dashboards, and analytics using knowledge of Cognos, Salesforce, and Workday. Telecommuting is permitted 5 days per week from anywhere in the U.S. Salary: $124,280 - $142,204/yr. Requires a Master's degree in Information Technology or Business Administration and 1 year (or a Bachelor's degree and 5 years) of experience as a software developer. At least 1 year of experience is required in each of the following: Cognos Analytics or Tableau; applying data warehousing and Business Intelligence concepts and using data modeling and reporting tools; using Oracle development tools, including the Oracle Developer Suite 10g/11g toolset, SQL Developer, and SQL; writing functions, packages, procedures, triggers, and views in Oracle PL/SQL for database development; Cognos installation, configuration, performance tuning, security settings, troubleshooting, and root cause analysis; development and implementation, including diagnosing and resolving technical problems; managing deployments and migrations of Cognos content across environments; working on IBM Cognos implementations and automated testing using Motio; utilizing at least 1 of the following ETL tools: Oracle Warehouse Builder (OWB), Oracle Data Integrator (ODI), Informatica, or SSIS; design and application tuning of relational databases on Oracle, SQL Server, MySQL, or Snowflake; handling and supporting patches and upgrades in a production environment; developing interfaces and integrations to migrate data between integrated systems, either on-premise or cloud-based; and user acceptance testing, data validation, and documentation. Email resume to or apply online at Cite job number "114" in the response.
01/15/2026
Full time
Senior Manager, Enterprise Integration
Austin Community College
Job Posting Closing Times: Job postings are removed from advertising at 12:00 A.M. on the closing date, e.g., at midnight on the day before the closing date. Austin Community College employees are required to maintain a domicile in the State of Texas while working for the college and throughout the duration of employment. - AR 4.0300.01
If you are a current Austin Community College employee, please click this link to apply through your Workday account.
Austin Community College is a public two-year institution that serves a multicultural population of approximately 41,000 credit students each Fall and Spring semester. We embrace our identity as a community college, as reflected in our mission statement. We promote student success and community development by providing affordable access, through traditional and distance learning modes, to higher education and workforce training, including appropriate applied baccalaureate degrees, in our service area. As a community college committed to our mission, we seek to recruit and retain a workforce that:
- Values intellectual curiosity and innovative teaching
- Is attracted by the college's mission to promote equitable access to educational opportunities
- Cares about student success and collaborates on strategies to facilitate success for populations including first-generation college students, low-income students, and students from underserved communities
- Is focused on student academic achievement and postgraduate outcomes
- Welcomes difference and models respectful interaction with others
- Engages with the community both within and outside of ACC
Job Posting Title: Senior Manager, Enterprise Integration
Job Description Summary: The Sr. Manager, Enterprise Integration is responsible for building and managing an enterprise data hub to share data among multiple systems, identifying data silos, bringing that data into the mainstream, and creating a strategy for those silo systems.
The Manager should also have data architect knowledge and be able to lead the definition and roadmap of the college's BI strategy, which includes data warehousing, analytical reporting, operational reporting, ETL, data governance, and master data management.
Job Description: Description of Duties and Tasks
- Supervises, trains, coaches, directs, coordinates, and disciplines personnel while adhering to organizational human resource policies and procedures as well as related employment laws. Recommends hire and termination personnel actions for positions supervised.
- Defines and designs the data warehousing solution and ETL/ELT processes.
- Works closely with the business to understand the data and provide meaningful analytics to users/customers.
- Manages Informatica IICS and data projects, performs strategic analysis of data best practices, designs databases, and performs data profiling.
- Supervises the ETL and Informatica developers.
- Prioritizes and delegates requests received from the product owner and scrum manager.
- Communicates with multiple teams on data requirements.
- Analyzes data and develops data architecture.
- In coordination with the Quality Control team, ensures accuracy of standards and procedural documentation related to data.
- Participates in the definition, implementation, and support of the overall integration strategy.
- Participates in the selection, design, and implementation of the data hub, the data warehouse, operational data layers, and ETL.
- Participates in meetings with customers, gathers requirements, and works with developers to provide solutions to end users.
- Provides technical leadership to other members of the Business Intelligence team.
- Participates in BI product evaluations and business decisions to ensure fit and scale into the platform and the organization.
- Participates in an energetic team of experienced people in an agile environment.
- Creates high-level and detailed designs of the current and future state of the BI system.
- Creates architecture diagrams and models of data, the data warehouse, and integrations, plus mapping documents with high-level design.
Knowledge
- Familiar with data technology design, implementation, or consulting in one or more of the following areas: Enterprise Data Warehouse; Business Intelligence Reporting and Analytics Tools; Cloud Platforms; Data Lake and Big Data; Artificial Intelligence.
- Familiar with enterprise architecture or solution architecture frameworks, methodologies, templates, and tools.
- Extensive knowledge of sound data modeling techniques.
- Competent in MS SQL Server, Oracle SQL, stored procedures, and other SQL. Unidata experience a plus.
- Knowledge of best practices for Data Quality Management, Master Data Management, and near-real-time data warehousing.
- Experience with ETL/ELT tools such as the Informatica IICS suite.
Skills
- Maintaining an established work schedule.
- Effectively using interpersonal and communication skills, including tact and diplomacy.
- Effectively using organizational and planning skills with attention to detail and follow-through.
- Outstanding analytical and problem-solving abilities.
- Presentation skills, with a high degree of comfort with both large and small audiences.
- Ability to take complex concepts and create models for discussion, evaluation, and collaboration.
- Excellent oral and written communication skills, including the ability to explain technology concepts to business leaders, explain business concepts to technologists, and sell ideas and processes internally at all levels.
- Ability to work independently and be self-motivated, committed, and detail-oriented.
- Ability to understand and follow instructions precisely.
- Maintaining confidentiality of work-related information and materials.
- Establishing and maintaining effective working relationships.
- Experience in strategic technology planning and execution, as well as policy development and maintenance.
- Knowledge and skills in SQL (Structured Query Language).
Technology Skills
- Use a variety of spreadsheet, word processing, database, and presentation software.
- Use query and control languages, administer applications, and provide technical support to end users.
- Skills in modeling tools such as Erwin, Embarcadero, Visio, and Lucidchart.
Preferred Work Experience
- Experience with modern databases such as SQL Server, Oracle, or DB2.
- Experience with modern analytics platforms such as Tableau, SAS, Domo, and Microsoft Power BI.
- Hands-on experience with ETL using leading vendor software such as the Informatica Cloud (IICS) suite, DataStage, or Microsoft SSIS.
- Hands-on experience with SQL and ETL tuning.
Required Education
Bachelor's degree.
Special Requirements
Reliable transportation for travel in the Austin area as required.
Safety
Supervise safe operation of the unit. Facilitate safety inspections. Take reasonable and prudent actions to eliminate identified hazards. Ensure employees receive appropriate safety training and foster a workplace safety culture.
Salary Range (PG 127): $99,155 - $123,943
Number of Openings: 1
Job Posting Close Date: January 22, 2026
Clery Act
As required by the US Department of Education, employees are required to report violations under Title IX and, under the Jeanne Clery Disclosure of Campus Security Policy and Crime Statistics Act (Clery Act), select individuals are required to report crimes. If this position is identified as a Campus Security Authority (Clery Act), you will be notified, trained, and provided resources for reporting.
Disclaimer
The above description is an overview of the job. It is not intended to be an all-inclusive list of duties and responsibilities of the job, nor is it an all-inclusive list of the skills and abilities required to do the job. Duties and responsibilities may change with business needs. ACC reserves the right to add, change, amend, or delete portions of this job description at any time, with or without notice.
Employees may be required to perform other duties as requested, directed, or assigned. In addition, reasonable accommodations may be made by ACC at its discretion to enable individuals with disabilities to perform essential functions of the job. To apply, please visit:
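The Data Quality Management practice this posting references is, at its core, a set of named rules applied to incoming rows before they reach the warehouse. A toy sketch of that idea (rule names and predicates here are invented for illustration; production platforms like IICS express these as declarative rule specifications):

```python
def run_quality_checks(rows, rules):
    """Apply named data-quality rules (rule name -> predicate) to each
    row and collect (row_index, rule_name) pairs for every failure.
    A governance process would route these failures to stewards
    rather than silently loading the rows."""
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in rules.items():
            if not predicate(row):
                failures.append((i, name))
    return failures
```

Keeping rules as data (a dict of predicates) rather than hard-coded branches makes the rule set easy to extend and audit.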
01/14/2026
Full time
Data Integration Developer
Austin Community College
Job Posting Title: Data Integration Developer
Job Description Summary: Austin Community College, the 6th largest community college in the nation, is looking for multiple Data Integration Developers to help support its new enterprise information management strategy.
We are seeking highly skilled developers with a strong background in data integration, ETL/ELT processes, and data modeling to join our dynamic and growing BI team. You will have the unique opportunity to have a hand in building a new data integration hub from the ground up to support a robust and meaningful business intelligence platform that serves the whole college.

Job Description: Description of Duties and Tasks
- Designing, developing, and unit testing complex integration processes.
- Developing data integrations using ETL and Informatica Real Time process integration components.
- Developing application integrations using Informatica, connecting to platforms such as Salesforce, Workday, and other enterprise applications.
- Making design decisions and selecting optimal technology strategies and patterns to support varied application integration use cases.
- Working closely with other BI and IT teams to maintain an accurate and efficient information ecosystem.
- Writing complex SQL queries, stored procedures, and DB triggers.
- Designing and developing SOAP API components on Informatica Cloud.
- Writing technical design documents, mapping documents, and unit testing documents.
- Other duties as assigned.

Knowledge
- Demonstrated knowledge of SOA and enterprise architecture, including underlying SOA and BPM standards such as XML, web services (SOAP/REST), WSDL, etc.; experience in Java, JavaScript, .NET, and related toolchains.
- Demonstrated experience with both data integrations and application integrations.
- Demonstrated experience working with different relational SQL and NoSQL databases, ODBC, and JDBC.
- Experience with cloud, on-premises, and hybrid integrations, e.g., AWS, Google Cloud, Azure, Oracle, SAP, the Hadoop ecosystem, Workday, Salesforce, NetApp, Coupa, Siebel, etc.
- Understanding of service interface design principles, core ETL abstractions, and design best practices.
- Ability to design and apply process and integration design patterns.
- Knowledge of the tools and technologies used in the ICS and ICRT cloud platforms.
- Working knowledge of data warehouses, data marts, and dimensional modeling.
- Working knowledge of ETL/ELT, SQL, Python, APIs, scripting, and data modeling.

Skills
- Ability to work independently and be self-motivated, committed, and detail-oriented.
- Maintaining an established work schedule.
- Effectively using organizational and planning skills.
- Excellent communication (written and verbal) and interpersonal skills.
- Prioritizing multiple tasks, projects, and demands.
- Ability to design and apply process and integration design patterns.
- Willingness to learn new tools and processes to improve efficiency.
- Ability to automate manual processes to reduce time and errors.

Required Work Experience: Two (2) years of related work experience.

Preferred Work Experience
- Six (6) years of experience with data management, integration systems, iPaaS, cloud-based enterprise tools, etc., including Informatica IICS or IDMC.
- Eight (8) years of experience with ETL tools; Informatica PowerCenter preferred, or any equivalent ETL tool.
- Twelve (12) years of experience writing SQL queries, DB triggers, and procedures.
- Four (4) years of experience writing and implementing SOAP APIs on the Informatica cloud platform.

Required Education: Bachelor's degree. Additional related work experience may be substituted for the bachelor's degree requirement.

Special Requirements: Reliable transportation for local Austin-area travel.

Physical Requirements: Work is performed in an office environment. Subject to standing, walking, sitting, bending, reaching, pushing, pulling, and at times stooping, crawling, and climbing. Occasional lifting of objects up to 30 pounds.

Safety: Supervise safe operation of the unit. Facilitate safety inspections. Take reasonable and prudent actions to eliminate identified hazards. Ensure employees receive appropriate safety training and foster a workplace safety culture.
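As a concrete illustration of the ETL/ELT work the duties above describe — extracting records from a source application, transforming them, and loading them into a target table — here is a minimal Python sketch. The source data, field names, and target schema are invented for illustration; a real ACC integration would run through Informatica connectors rather than hand-written code.

```python
import sqlite3

# Extract: stubbed here; in practice this would call a source system's API
# (e.g., Workday or Salesforce) via an iPaaS connector. Data is invented.
def extract():
    return [
        {"id": 1, "name": "Ada Lovelace", "dept": "MATH"},
        {"id": 2, "name": "Alan Turing", "dept": "CS"},
    ]

# Transform: normalize field casing before loading.
def transform(rows):
    return [(r["id"], r["name"].upper(), r["dept"].lower()) for r in rows]

# Load: upsert the transformed rows into a target table.
def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staff (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO staff (id, name, dept) VALUES (?, ?, ?)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*) FROM staff").fetchone()[0])  # 2
```

The upsert (`INSERT OR REPLACE`) makes the load step idempotent, which is the usual design choice for batch integrations that may be re-run.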
Salary Range: $88,465 - $110,581. No H-1B visas or sponsorships. Number of Openings: 3. Job Posting Close Date: January 21, 2026.

Clery Act: As required by the US Department of Education, employees are required to report violations under Title IX and, under the Jeanne Clery Disclosure of Campus Security Policy and Crime Statistics Act (Clery Act), select individuals are required to report crimes. If this position is identified as a Campus Security Authority (Clery Act), you will be notified, trained, and provided resources for reporting.

Disclaimer: The above description is an overview of the job. It is not intended to be an all-inclusive list of duties and responsibilities of the job, nor is it an all-inclusive list of the skills and abilities required to do the job. Duties and responsibilities may change with business needs. ACC reserves the right to add, change, amend, or delete portions of this job description at any time, with or without notice. Employees may be required to perform other duties as requested, directed, or assigned. In addition, reasonable accommodations may be made by ACC at its discretion to enable individuals with disabilities to perform essential functions of the job.
01/14/2026
Full time
Role: Tableau Developer
Experience: 10-15 years
Location: Atlanta, GA

Job Description: The Tableau Developer is responsible for delivering enterprise-level BI solutions by transforming complex data into actionable insights. This role involves data modeling, dashboard architecture, performance optimization, and business stakeholder engagement.

Key Responsibilities
- Design and develop interactive Tableau dashboards and reports
- Translate business requirements into analytical solutions
- Perform data modeling and preparation for BI reporting
- Optimize Tableau performance and data extracts
- Implement security, governance, and access controls
- Integrate Tableau with multiple data sources
- Lead BI initiatives and mentor junior developers
- Collaborate with data engineers, analysts, and business teams
- Ensure data accuracy, consistency, and usability

Required Technical Skills
- Tableau Desktop, Tableau Server, Tableau Online
- Advanced dashboard design and storytelling
- Data modeling (star/snowflake schemas)
- SQL (advanced queries, optimization)
- Tableau performance tuning
- Data sources: SQL Server, Oracle, Snowflake, Redshift
- ETL tools (Informatica, Talend, Alteryx preferred)
- Tableau security, governance, and permissions
- Integration with Python/R (optional)

Soft Skills
- Strong business and data analysis skills
- Data storytelling and presentation
- Stakeholder communication
- Mentoring and leadership
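For context on the data-modeling skill listed above: a star schema joins a central fact table to its surrounding dimension tables, and the typical BI rollup aggregates the fact grain by dimension attributes. A minimal sketch in Python with SQLite follows; table names and data are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A tiny star schema: one fact table keyed to two dimension tables.
cur.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (date_id INTEGER, product_id INTEGER, amount REAL);

INSERT INTO dim_date    VALUES (1, 2025), (2, 2026);
INSERT INTO dim_product VALUES (10, 'widget'), (20, 'gadget');
INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 20, 50.0), (2, 10, 75.0);
""")

# The canonical BI query shape: fact joined to dimensions, grouped
# by dimension attributes — exactly what a Tableau view generates.
rows = cur.execute("""
    SELECT d.year, p.name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d    ON d.date_id = f.date_id
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY d.year, p.name
    ORDER BY d.year, p.name
""").fetchall()
for r in rows:
    print(r)
```

Keeping measures in the fact table and descriptive attributes in dimensions is what lets a BI tool slice the same totals by any attribute without rewriting the query.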
01/13/2026
Hi, greetings for the day. I'm Symponia from SoftPath Technologies LLC, and I work in recruitment. We have a position for an ETL and DevOps role in Boston, MA or Woodbridge, NJ (onsite, local candidates only) with our client, and your profile matches the requirement. Please review the JD below and, if you are interested, share your updated resume.

Role: ETL and DevOps
Location: Boston, MA or Woodbridge, NJ (onsite, local candidates only)
Contract: 6-12+ months

Job Description: Must have AWS, Snowflake, and dbt. The client needs someone who can establish CI/CD pipelines (i.e., DevOps), who understands the push and pull model, and who knows how the DevOps framework and mapping work. This would be the only person on the team with this experience, so they need to be strong with DevOps.

ETL Developer with DevOps:
- Informatica knowledge (PowerCenter)
- dbt Cloud knowledge
- Python
- DevOps knowledge and the ability to manage it
- AWS knowledge
- Ideally, Fivetran HVR knowledge
- Snowflake knowledge
- GitHub and all associated components
- Knowledge of Secret Password Facility
- Ability to manage and define security needs
01/06/2026
- Ensuring Data Quality: They use Bigeye's data observability features to monitor data quality and identify potential issues.
- Understanding Data Relationships: They use Alation's data catalog to understand the relationships between different data assets.
- Data Lineage: They use data lineage to identify the impact of changes to data assets.
- Data Profiling: They use data profiling to determine the usability of data assets.
- Trust Flags: They use trust flags to signal the trustworthiness of data.
- Data Tagging: Identifying data elements and applying appropriate tags to ensure proper tracking and measurement.
- Data Quality Rules/Plans: Determine and assign appropriate data quality rules, testing, and plans to ensure the veracity of data when the engine runs.
- Must have experience with all USAA internal systems, policies, standards, and procedures, including IGC, RRAIT, IAI, IA, UDC/Alation, and BigEye.
- Must have experience playing a lead role in at least 2 to 3 mid- to large-sized projects.
- Leads a team of data modelers in large projects to create logical and physical data models using best practices.
- Optimizes and updates logical and physical data models to support new and existing projects.
- Maintains conceptual, logical, and physical data models along with corresponding metadata.
- Closely coordinates and liaises with the Business Analysts and Technical Architects on high-complexity work to ensure the effective translation of business and technical requirements into the logical data model.
- Designs standard naming conventions and coding practices to ensure consistency of data models.
- Evaluates data models and physical databases for variances and discrepancies.
- Must validate data models and physical databases for variances.
- Highly experienced with ETL tools such as Informatica, DataStage, Ab Initio, etc.

Equal Opportunity Employer: We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.
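The data quality rules and trust flags described above can be pictured as named predicates evaluated over a dataset, with a "trusted" flag earned only when every rule passes. A minimal sketch follows; the rule names, thresholds, and data are invented for illustration, and real observability engines such as Bigeye are far richer.

```python
# Minimal data-quality rule engine: each rule is a named predicate over rows;
# the dataset earns a "trusted" flag only if every rule passes on every row.
def run_rules(rows, rules):
    results = {name: all(check(r) for r in rows) for name, check in rules.items()}
    return results, all(results.values())

# Hypothetical rules — names and thresholds are illustrative only.
rules = {
    "id_not_null": lambda r: r.get("id") is not None,
    "amount_in_range": lambda r: 0 <= r.get("amount", -1) <= 1_000_000,
}

rows = [{"id": 1, "amount": 250}, {"id": 2, "amount": 980_000}]
results, trusted = run_rules(rows, rules)
print(results, trusted)
```

Reporting per-rule results rather than a single pass/fail mirrors how observability tools surface which specific check broke, which is what makes the trust flag actionable.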
12/17/2025
Job Title: BI Solutions Architect Lead Specialist Engineer
Location: Orlando, FL
Experience: 12+ years
Employment Type: Contract
Interview Type: In-person or webcam

Job Overview: We are seeking an experienced BI Solutions Architect Lead Specialist Engineer to lead the design, development, and implementation of Business Intelligence solutions. The ideal candidate will have deep expertise in BI architecture, data warehousing, data modeling, analytics platforms, and enterprise reporting capabilities. This role requires strong leadership skills, hands-on technical ability, and experience guiding teams to deliver scalable BI and analytics solutions that support business strategy and informed decision-making.

Key Responsibilities
- Lead the architecture, design, and delivery of enterprise-level Business Intelligence and data analytics solutions.
- Work closely with business stakeholders to define BI strategy, reporting needs, and data integration requirements.
- Architect, design, and manage data warehouse solutions, including ETL pipelines, data models, and metadata frameworks.
- Evaluate and recommend BI tools, data platforms, modeling approaches, and architectural best practices.
- Oversee end-to-end technical solution delivery, including planning, execution, and optimization.
- Drive standardization and governance around data quality, consistency, performance, and security.
- Collaborate with cross-functional teams, including data engineers, developers, analysts, and enterprise architects.
- Lead performance tuning, testing, migration, and infrastructure optimization for BI platforms.
- Provide leadership, mentorship, and technical guidance to engineering and analytics teams.
- Ensure alignment of BI solutions with business goals, scalability needs, and future-state architecture.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or an equivalent field.
- 12+ years of professional experience in Business Intelligence, Data Engineering, or Analytics roles.
- Strong experience designing enterprise BI architectures and data warehouse solutions.
- Hands-on experience with BI tools such as Power BI, Tableau, Qlik, Looker, or MicroStrategy.
- Strong understanding of ETL frameworks and tools such as Informatica, Talend, DataStage, SSIS, or ADF.
- Expertise in SQL and data modeling, including star schema, dimensional modeling, and OLAP concepts.
- Practical experience with cloud platforms such as AWS, Azure, or Google Cloud data services.
- Good knowledge of data governance, data security, and master data management practices.
- Proven track record of leading technical teams and large-scale BI transformation projects.
- Excellent analytical, communication, and stakeholder management skills.

Preferred Skills
- Experience with modern data platforms such as Snowflake, Redshift, Databricks, or BigQuery.
- Knowledge of Python or other scripting languages for automation and data processing.
- Experience with real-time data integration technologies, including Kafka or streaming pipelines.
- Exposure to machine learning, predictive analytics, and advanced analytics ecosystems.
- Prior consulting or enterprise-level solution delivery experience.
12/17/2025
Job Summary

We are seeking a Principal SQL Developer to provide technical leadership and expertise in the design, development, and optimization of our enterprise database solutions. This top-level role will be instrumental in architecting scalable, reliable, and high-performance data systems that support critical business applications and analytics. The ideal candidate will be a data expert, capable of translating complex business requirements into robust technical solutions, mentoring junior developers, and driving data strategy across the organization.

Key Responsibilities

Database Architecture & Design: Lead the design and implementation of complex, scalable database architectures (relational and potentially NoSQL), including schemas, tables, views, and indexing strategies, ensuring data integrity and consistency. Design and gain approval on data architecture decisions, data modeling approaches, design standards, development tools, and lifecycle practices.

Performance Optimization: Serve as the primary expert for performance tuning and optimization of all SQL queries, stored procedures, and database configurations, using advanced tools (e.g., SQL Server Profiler) to identify and resolve bottlenecks in high-volume environments.

ETL and Data Pipelines: Architect, develop, and maintain robust ETL/ELT processes and data pipelines for data migration, transformation, and integration from various sources into data warehouses and other data stores.

Technical Leadership & Mentorship: Provide technical guidance, perform code reviews, and mentor junior and mid-level developers on best practices for SQL development, data modeling, and database management.

Collaboration & Strategy: Work closely with senior management, data scientists, application developers, and business analysts to understand data needs, align database strategy with organizational goals, and influence the overall technology roadmap.

Data Governance & Security: Define and enforce database security measures, backup/recovery, and disaster recovery strategies to safeguard critical data and ensure compliance with industry regulations (e.g., GDPR, HIPAA).

Troubleshooting & Problem Solving: Proactively monitor and troubleshoot the most complex database issues, providing high-level support and implementing permanent fixes to ensure minimal downtime and disruption.

Documentation & Standards: Champion comprehensive documentation of database architectures, processes, and operational procedures to ensure knowledge sharing and maintainability.

CONTRACT JOB DESCRIPTION

Responsibilities:
1. Leads the adoption or implementation of an advanced technology or platform.
2. Serves as an expert on the functionality or usage of a particular system, platform, or technology product.
3. Serves as a consultant to clients, guiding the efficient use or adoption of a particular IT product or platform.
4. Creates implementation, testing, and/or integration plans.
5. Demonstrates expertise in a particular IT platform or service, maximizing the return on IT investment.

Minimum Education/Certification Requirements:
Bachelor's degree in Information Technology or a related field, or equivalent experience
Training or certification in a particular product or IT platform/service, as required

Skills:
6-10 yrs. leading advanced technology or service projects
6-10 yrs. across the full system engineering lifecycle
6-10 yrs. creating implementation/integration plans, test plans, and training materials
6-10 yrs. hands-on experience with a specific product or IT platform

Proven experience:
Delivering high-quality, complex technology solutions in commercial and government organizations
Delivering data analytics platforms and enterprise applications
Working with SQL, SSRS, SSIS, SSAS, Azure Data Factory, Databricks, Python, and DAX
Senior-level understanding of health care management systems
Working with senior executives

Required qualifications:
Deep expertise in one or more major DBMS platforms (e.g., Microsoft SQL Server, Oracle, PostgreSQL, MySQL) and advanced T-SQL/PL-SQL programming
Experience with data modeling (dimensional, star/snowflake schemas), ETL tools (SSIS, Informatica), and cloud-based data platforms (AWS RDS, Azure SQL)
Strong understanding of database architecture principles, high-availability solutions (clustering, replication), and capacity planning
Bachelor's degree (Master's highly desired) in CS, IT, or a quantitative data field of study, or equivalent experience

Desired: certifications (e.g., Microsoft Certified: Azure Database Administrator Associate), experience with Big Data technologies (Spark, Hadoop), and DevOps experience
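The dimensional-modeling requirement above (star/snowflake schemas, indexing strategy) can be illustrated with a minimal sketch using Python's built-in sqlite3 module. All table and column names here are hypothetical examples, not from the posting, and the index choice is one plausible approach rather than the only correct one.

```python
# Minimal star-schema sketch: one fact table joined to two dimension
# tables, with an index supporting the join keys. Illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, cal_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
-- A composite index on the fact table's foreign keys supports the joins.
CREATE INDEX ix_fact_keys ON fact_sales(date_key, product_key);
""")

cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(1, "2025-01-01"), (2, "2025-01-02")])
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(10, "widget"), (20, "gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 5.0), (1, 20, 7.5), (2, 10, 2.5)])

# A typical analytic query: aggregate the fact table by a dimension.
cur.execute("""
SELECT p.name, SUM(f.amount)
FROM fact_sales f
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY p.name
ORDER BY p.name
""")
totals = dict(cur.fetchall())
```

The same shape scales to the warehouse work the role describes: dimensions stay small and descriptive, the fact table carries the volume, and indexing decisions focus on the fact table's join and filter columns.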
12/17/2025
FatPipe Technologies Inc has openings for Software Engineers to gather requirements, analyze, design, write code, test, implement, document, maintain, and modify software applications using any or a combination of the following technologies: ADF, OAF, Oracle MAF, Java/J2EE, JavaScript, JSF, Angular, Spring, Hibernate, Struts, Web Services/Microservices, JDeveloper, ADF integrations with EBS, JMS, Oracle Cloud Infrastructure, Oracle EBS, Hyperion, UiPath, Elasticsearch, Kibana, Oracle DBA, GoldenGate, Oracle Engineered Systems, PostgreSQL, MySQL, DevOps, EC2, VPC, Route53, Jenkins, Docker, Chef/Ansible, Git, SonarQube, IIB, AWS, Azure, Google Cloud, CI/CD, Big Data, Hadoop, Oracle Apps, Salesforce/Siebel, ETL, Ab Initio, DataStage, SSIS, SSRS, SSAS, Pega, FileNet, Documentum, PL/SQL, Linux, Solaris, shell scripting, SQL Server, Jira, ESB Mule, WebLogic, WebSphere, JBoss, Python, QTP, QC, LoadRunner/WinRunner, RPA, Selenium, ALM, OBIEE, Tableau, Cognos, QlikView, Informatica, Power BI, Talend, Alteryx, TIBCO, Android, MicroStrategy, UI/UX design, C, C++, and other technologies. All positions require either a U.S. Master's or Bachelor's degree, with or without experience. In lieu of a Master's degree, we will accept a Bachelor's degree with five years of experience. Work location is Salt Lake City, UT, and various unanticipated client locations throughout the US. Travel and/or relocation may be required. Mail resumes to: FatPipe Technologies Inc, 392 E Winchester St, 5th Fl, Salt Lake City, UT 84107. EOE
12/17/2025
Design, develop, validate, and deploy Talend ETL processes using Talend data integration and data quality tools in an AWS environment. Successful candidates should have working knowledge of Agile methodology.

Skills:
1) Informatica
2) Talend
3) AWS Services
4) Python/Java

Description:
6+ years of ETL development experience
2+ years of Informatica development experience
1+ year of Talend development experience
3+ years of hands-on experience with the AWS platform, using services such as RDS, S3, IAM, Lambda, and API Gateway
Experience working in AWS cloud and Big Data environments
At least one language common to cloud platforms, such as Java/Python
Creation and maintenance of CI/CD pipelines
Familiarity with Docker
Test-driven development and/or behavior-driven development
Migration of Informatica and Talend objects to production systems
Experience with Autosys and data workflow automation
Experience working with Oracle
Experience working with Redshift on AWS
AWS cloud ETL experience
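The extract-transform-load pattern this posting centers on can be sketched in plain Python; in Talend or Informatica the same stages are built as graphical components rather than functions. The function names, field names, and data-quality rule below are hypothetical illustrations, not part of any tool's API.

```python
# Minimal ETL sketch: extract rows, apply transform/data-quality rules,
# load into a target store. All names and rules here are illustrative.
import csv
import io

def extract(raw_csv: str) -> list:
    """Extract: parse source rows (a CSV string stands in for a file or table)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list) -> list:
    """Transform: type-cast, drop invalid rows, and derive a field."""
    out = []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # data-quality rule: reject rows with a non-numeric amount
        out.append({"id": r["id"], "amount": amount,
                    "bucket": "high" if amount >= 100 else "low"})
    return out

def load(rows: list, target: list) -> int:
    """Load: append to the target store and report the row count."""
    target.extend(rows)
    return len(rows)

source = "id,amount\n1,250\n2,40\n3,oops\n"
warehouse = []
loaded = load(transform(extract(source)), warehouse)
```

Here the bad row is silently dropped; a production pipeline would more likely route rejects to an error table for audit, which is exactly the kind of design decision the Talend and Informatica experience above covers.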
12/17/2025