Job Description

Clearance Level: Public Trust
US Citizenship: Required
Job Classification: Full Time
Location: Remote
Years of Experience: 5-7 years of relevant experience
Education Level: BS Degree; experience may be considered in place of the education requirement.

Briefly Describe the Work:
GITI is seeking a Senior RF Software Engineer to support Cyber Operations Research and Development on passive RF emitter identification and network analysis from real-time sensor data streams. The candidate will implement, test, and maintain components of a production software pipeline: a stream ingestion, rollup, and post-processing system operating on NDF (Network Description File) data produced by TDMA network sensors in dense, contested RF environments. Working under the direction of the Principal Engineer and the Technical Lead, the Senior RF Software Engineer supports Cyber Operations by contributing to pipeline development across a range of functional areas, including stream processing, database integration, display and reporting tools, simulation infrastructure, and CI/CD tooling. The role requires strong Python skills, comfort with air-gapped Linux environments, and the ability to work independently on well-defined components with minimal supervision in support of real-world cyber operations.
Responsibilities:
- Implement, test, and maintain assigned pipeline components, including stream ingestion, rollup processing, database write, and batch post-processing modules, in support of real-world cyber operations
- Develop and maintain browser-based visualization and reporting tools (track plots, waterfall displays, SmartBook report generation) that consume pipeline database output
- Implement and maintain stream simulation infrastructure, including TDMA network mission log replay and stream generation at controllable rates for pipeline testing
- Develop lightweight TNS simulator components: emitter and receiver models capable of following track plots and emitting in accordance with a network description
- Contribute to database integration work on tactical-box-spec hardware, including MySQL schema design, query optimization, and performance benchmarking
- Write comprehensive unit and integration tests for assigned components; implement and maintain CI/CD pipelines using GitLab to verify functionality on hardware or in a cloud environment
- Identify and report performance bottlenecks in Python pipeline components; assist with porting mature components to Rust or C as directed
- Perform basic Linux system administration on remote servers, including package management, user configuration, and environment setup
- Manage source code using GitLab; follow disciplined versioning, branching, and code review practices as established by the Principal Engineer
- Produce clear technical documentation for implemented components, including interface specifications, configuration guides, and test procedures
- Participate in periodic technical check-ins with the program technical lead; share findings and flag blockers promptly

Career level with a complete understanding and wide application of technical principles, theories, and concepts.
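The rollup and database-write responsibilities above can be sketched in miniature. This is an illustrative example only: the record fields (emitter ID, slot time, power) are hypothetical, not taken from the NDF ICD, and SQLite stands in for the program's MySQL write path.

```python
import sqlite3
from collections import defaultdict

# Hypothetical detection records: (emitter_id, slot_time_s, power_db).
# Field names are invented for illustration, not drawn from any real spec.
RECORDS = [
    ("E1", 0.2, -41.0),
    ("E1", 0.7, -43.0),
    ("E2", 0.9, -55.0),
    ("E1", 1.4, -42.0),
]

def rollup(records, window_s=1.0):
    """Aggregate per-emitter detection count and mean power per time window."""
    buckets = defaultdict(list)
    for emitter, t, power in records:
        buckets[(emitter, int(t // window_s))].append(power)
    return [
        (emitter, win, len(powers), sum(powers) / len(powers))
        for (emitter, win), powers in sorted(buckets.items())
    ]

def write_rollups(rows, db_path=":memory:"):
    """Persist rollup rows; sqlite3 stands in for the MySQL write path."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS rollup "
        "(emitter TEXT, window INTEGER, n INTEGER, mean_power REAL)"
    )
    con.executemany("INSERT INTO rollup VALUES (?, ?, ?, ?)", rows)
    con.commit()
    return con

rows = rollup(RECORDS)
con = write_rollups(rows)
print(rows[0])  # ('E1', 0, 2, -42.0)
```

A real component would consume a live stream rather than a list, but the window-bucket-then-batch-insert shape is the essence of a rollup stage.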
Working under general direction from the Principal Engineer, provides technical solutions to a wide range of well-defined problems and independently executes on assigned components. Bachelor's (or equivalent) with 5-7 years of experience, or a Master's with 3-5 years of experience.

Required Skills:
- Strong proficiency in Python, with demonstrated experience in data processing pipelines, stream ingestion, or ETL development
- Proficiency with Python data science libraries, including NumPy, Pandas (or Polars), and scikit-learn
- Experience with relational database development using MySQL, PostgreSQL, or SQLite, including schema design and query optimization
- Experience parsing or generating binary serialization formats (FlatBuffers, Protocol Buffers, or equivalent)
- Ability to develop, test, and debug on remote Linux servers via SSH using command-line tools and a modern IDE
- Solid Linux operating system fundamentals, including file system management, process control, and basic security hardening (Ubuntu)
- Proficiency in software engineering practices, including Git/GitLab version control, unit testing, and CI/CD pipeline usage
- Experience developing browser-based data visualization or reporting tools, or demonstrated ability to learn React/D3-based tooling on the job
- Strong written and oral communication skills; ability to produce clear technical documentation for engineering audiences
- Ability to work independently on assigned components with minimal supervision in a small, distributed team

Desired Skills:
- Experience with TNS (Target Network System) sensor data formats and NDF ICD specifications
- Familiarity with TDMA network protocols, time-division access architectures, and passive RF signal processing concepts
- Experience with lightweight stream or message queue architectures (ZeroMQ, RabbitMQ, or equivalent)
- Experience with Rust or Go for systems-level or performance-critical development on Linux
- Experience with Polars or DuckDB for high-performance analytical workloads
- Experience with performance profiling and optimization of Python pipelines on resource-constrained x86 hardware
- Experience with LLM-assisted software development tools (e.g., Claude Code, GitHub Copilot, JetBrains AI Assistant, or equivalent); demonstrated ability to use AI tools productively for code generation, refactoring, and test case development while maintaining engineering judgment and code quality standards
- Familiarity with AI/ML libraries (PyTorch, TensorFlow); ability to integrate trained model inference into a pipeline without requiring deep ML expertise
- Experience with Jupyter Notebooks and research enclave environments; ability to read and adapt research prototype code
- Experience with simulation or synthetic data generation for pipeline testing
- Familiarity with Apache data science tools such as Spark or Dask for large-scale data processing

Relevant Certifications: Certifications in software engineering, computer science, or related fields (e.g., Certified Software Development Professional (CSDP); Certified Scrum Developer (CSD); Red Hat Certified Enterprise Application Developer; Certified Secure Software Lifecycle Professional (CSSLP); C++ Certified Associate Programmer (CPA); Professional Software Developer Certification (PSD))

Global InfoTek, Inc. is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability.

About Global InfoTek, Inc.
Global InfoTek, Inc. has an award-winning track record of designing, developing, and deploying best-of-breed technologies that address the nation's pressing cyber and advanced technology needs. GITI has merged pioneering technologies, operational effectiveness, and best business practices for over two decades.
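The required skills for this posting call out binary serialization formats. FlatBuffers and Protocol Buffers both depend on generated schema code, so the sketch below uses the standard-library `struct` module as a minimal stand-in to illustrate fixed-width binary record packing and parsing; the record layout is invented for illustration and is not the NDF format.

```python
import struct

# Hypothetical fixed-width record (NOT the real NDF layout):
#   uint32 emitter_id, float64 timestamp_s, float32 power_db, little-endian.
RECORD_FMT = "<Idf"
RECORD_SIZE = struct.calcsize(RECORD_FMT)  # 16 bytes, no padding with "<"

def pack_record(emitter_id, timestamp_s, power_db):
    """Serialize one record to bytes."""
    return struct.pack(RECORD_FMT, emitter_id, timestamp_s, power_db)

def iter_records(buf):
    """Yield (emitter_id, timestamp_s, power_db) from a concatenated buffer."""
    for off in range(0, len(buf), RECORD_SIZE):
        yield struct.unpack_from(RECORD_FMT, buf, off)

blob = b"".join(pack_record(i, i * 0.5, -40.0 - i) for i in range(3))
records = list(iter_records(blob))
print(records[1])  # (1, 0.5, -41.0)
```

A schema-based format adds versioning and zero-copy access on top of this, but the offset-stepping parse loop is the same basic motion.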
Posted: 04/23/2026
Full time
Job Description

Clearance Level:
US Citizenship: Required
Job Classification: Full Time
Location: Remote
Years of Experience: 10+ years of relevant experience
Education Level: BS Degree; experience may be considered in place of the education requirement.

Briefly Describe the Work:
GITI is seeking a Principal Software Engineer to support Cyber Operations Research and Development as the technical lead for production software development on a passive RF emitter identification and network analysis system driven by real-time sensor data streams. The candidate will own the architecture, implementation, and delivery of the production pipeline: a stream ingestion, rollup, and post-processing system that operates on NDF (Network Description File) data produced by TDMA network sensors in dense, contested RF environments. The candidate will lead a small team of senior software engineers and coordinate closely with the program technical lead and AI/ML researchers to drive Cyber Operations software from prototype to production-quality, resource-efficient components deployable on tactical edge hardware. This is a hands-on technical leadership role: the Principal Engineer writes code, makes architecture decisions, and is accountable for pipeline performance and reliability in support of real-world cyber operations.
Responsibilities:
- Own the architecture and implementation of the production software pipeline, including stream ingestion, rollup, database write, and batch post-processing components
- Lead a team of Senior Software Engineers in support of real-world cyber operations; assign work, conduct code reviews, enforce quality standards, and provide technical mentorship
- Establish and maintain disciplined software engineering practices: versioning, CI/CD pipelines, unit and integration testing, and documentation standards
- Design and evaluate database and storage architecture for the tactical system and research enclave environments
- Collaborate with the program technical lead to translate research findings and batch optimization algorithms into production pipeline components
- Evaluate and benchmark Python pipeline performance on tactical-box-spec hardware; identify bottlenecks and lead porting of mature components to Rust or C for edge deployment
- Manage and coordinate the tactical system VM environment and stream simulation infrastructure; ensure the research VM is not disrupted by development activity
- Define and enforce stream interface contracts between the ingestion layer, database, and downstream consumers
- Evaluate emerging technologies (e.g., DuckDB/Parquet, Polars, message queues) against program requirements and recommend adoption decisions to the technical lead
- Maintain the program's GitLab repository structure, branching strategy, and release management
- Produce clear technical documentation, including architecture decision records, interface specifications, and deployment guides
- Support technical reviews and provide written inputs for sponsor deliverables as directed by the program technical lead

Expert-level career professional with broad and deep application of software engineering principles across the full development lifecycle.
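The multi-consumer responsibility above (one write path serving concurrent display, analytics, and batch readers) can be sketched with standard-library primitives. This is a toy fan-out under stated assumptions: the `TrackUpdate` message type and its fields are invented for illustration, and a production system would use a real message queue (e.g., ZeroMQ) with bounded buffers and an explicit drop policy.

```python
import queue
import threading
from dataclasses import dataclass

# Hypothetical stream contract: one stable message type shared by the
# ingestion layer and every downstream consumer. Fields are illustrative.
@dataclass(frozen=True)
class TrackUpdate:
    emitter_id: str
    t: float
    power_db: float

class FanOut:
    """Single write path, multiple concurrent readers. Each subscriber gets
    its own queue so a slow consumer never blocks the others. Queues here
    are unbounded; a real system would bound them and define drop policy."""

    def __init__(self):
        self._queues = []
        self._lock = threading.Lock()

    def subscribe(self):
        q = queue.Queue()
        with self._lock:
            self._queues.append(q)
        return q

    def publish(self, msg: TrackUpdate):
        with self._lock:
            for q in self._queues:
                q.put(msg)

bus = FanOut()
display_q, analytics_q = bus.subscribe(), bus.subscribe()
bus.publish(TrackUpdate("E1", 0.2, -41.0))
print(display_q.get_nowait() == analytics_q.get_nowait())  # True
```

The design choice worth noting is per-consumer queues rather than a shared one: it decouples reader speeds, which is exactly the property the display/analytics/batch split requires.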
Exercises independent judgment in evaluating methods, techniques, and approaches; identifies and resolves complex technical problems with significant program impact. Provides technical leadership and direction to other engineers. Bachelor's (or equivalent) with 10+ years of experience.

Required Skills:
- Demonstrated experience leading a software engineering team on a production data pipeline or streaming system; ability to set technical direction and mentor junior engineers
- Expert-level Python development, including stream processing, multi-threaded/async architectures, and performance profiling
- Proficiency in one or more compiled or systems languages (Rust, C, C++, or Go) for performance-critical components; experience porting Python to compiled targets
- Hands-on experience designing and implementing relational database schemas and write-intensive data pipelines (MySQL, PostgreSQL, or equivalent)
- Experience parsing binary serialization formats such as FlatBuffers or Protocol Buffers in a production context
- Demonstrated ability to benchmark and optimize pipeline throughput on resource-constrained hardware or in a cloud environment
- Strong proficiency with Linux system administration, remote server management via SSH, and air-gapped development environments
- Experience architecting multi-consumer data systems where a single write path must serve concurrent display, analytics, and batch processing readers
- Proficiency in disciplined software engineering practices: GitLab/Git, CI/CD pipeline design, test-driven development, and code review
- Excellent written and oral communication skills; ability to produce architecture decision records and technical documentation for both engineering and leadership audiences

Desired Skills:
- Experience with TNS (Target Network System) sensor data formats and NDF ICD specifications
- Familiarity with TDMA network protocols, time-division access architectures, and passive RF signal processing concepts
- Experience deploying and operating software on tactical edge hardware co-located with a sensor system
- Experience with lightweight stream or message queue architectures (ZeroMQ, RabbitMQ, or equivalent)
- Experience with Polars or DuckDB for high-performance analytical workloads and write-once/read-many storage patterns
- Experience with LLM-assisted software development tools (e.g., Claude Code, GitHub Copilot, JetBrains AI Assistant, or equivalent); demonstrated ability to use AI tools productively for code generation, refactoring, and test case development while maintaining engineering judgment and code quality standards
- Familiarity with AI/ML model inference integration; ability to incorporate batch optimizer outputs into the production pipeline without requiring ML expertise
- Experience with browser-based data visualization or reporting tools (React, D3, or equivalent) as a consumer of pipeline output
- Experience with Jupyter Notebooks and research enclave environments; ability to bridge from research prototype to production code
- Experience with FlatBuffers binary stream replay and simulation infrastructure for pipeline testing
- Familiarity with the Rust toolchain and ecosystem for systems-level development on Linux

Relevant Certifications: Certifications in software engineering, computer science, or related fields (e.g., Certified Software Development Professional (CSDP); Certified Secure Software Lifecycle Professional (CSSLP); Red Hat Certified Engineer (RHCE); C++ Certified Professional Programmer (CPP); Professional Software Developer Certification (PSD))

Global InfoTek, Inc. is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability.

About Global InfoTek, Inc.
Global InfoTek, Inc. has an award-winning track record of designing, developing, and deploying best-of-breed technologies that address the nation's pressing cyber and advanced technology needs. GITI has merged pioneering technologies, operational effectiveness, and best business practices for over two decades.
Posted: 04/23/2026
Full time
Job Description

Clearance Level: Public Trust
US Citizenship: Required
Job Classification: Full Time
Location: Remote
Years of Experience: 7-10 years of relevant experience
Education Level: BS Degree; experience may be considered in place of the education requirement.

Briefly Describe the Work:
GITI is seeking a Lead RF Software Engineer to support Cyber Operations Research and Development on passive RF emitter identification and network analysis from real-time sensor data streams. The candidate will implement, test, and maintain components of a production software pipeline: a stream ingestion, rollup, and post-processing system operating on NDF (Network Description File) data produced by TDMA network sensors in dense, contested RF environments. Working under the direction of the Principal Engineer and the Technical Lead, the Lead RF Software Engineer supports Cyber Operations by contributing to pipeline development across a range of functional areas, including stream processing, database integration, display and reporting tools, simulation infrastructure, and CI/CD tooling. The role requires strong Python skills, comfort with air-gapped Linux environments, and the ability to work independently on well-defined components with minimal supervision in support of real-world cyber operations.
Responsibilities:
- Implement, test, and maintain assigned pipeline components, including stream ingestion, rollup processing, database write, and batch post-processing modules, in support of real-world cyber operations
- Develop and maintain browser-based visualization and reporting tools (track plots, waterfall displays, SmartBook report generation) that consume pipeline database output
- Implement and maintain stream simulation infrastructure, including TDMA network mission log replay and stream generation at controllable rates for pipeline testing
- Develop lightweight TNS simulator components: emitter and receiver models capable of following track plots and emitting in accordance with a network description
- Contribute to database integration work on tactical-box-spec hardware, including MySQL schema design, query optimization, and performance benchmarking
- Write comprehensive unit and integration tests for assigned components; implement and maintain CI/CD pipelines using GitLab to verify functionality on hardware or in a cloud environment
- Identify and report performance bottlenecks in Python pipeline components; assist with porting mature components to Rust or C as directed
- Perform basic Linux system administration on remote servers, including package management, user configuration, and environment setup
- Manage source code using GitLab; follow disciplined versioning, branching, and code review practices as established by the Principal Engineer
- Produce clear technical documentation for implemented components, including interface specifications, configuration guides, and test procedures
- Participate in periodic technical check-ins with the program technical lead; share findings and flag blockers promptly

Career level with a complete understanding and wide application of technical principles, theories, and concepts.
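The simulator responsibility above (emitter models that follow track plots and emit per a network description) can be sketched as follows. This is an illustrative toy, not based on any real TNS specification: the waypoint format, the single-slot assignment, and the frame length are all invented assumptions.

```python
# Illustrative emitter model: follows (t, x, y) track waypoints by linear
# interpolation and emits only in its assigned TDMA slot. All names and
# parameters here are hypothetical, not drawn from the TNS/NDF ICDs.
class SimEmitter:
    def __init__(self, emitter_id, waypoints, slot, frame_slots=8):
        self.emitter_id = emitter_id
        self.waypoints = sorted(waypoints)  # list of (t, x, y)
        self.slot = slot                    # assigned slot within a frame
        self.frame_slots = frame_slots      # slots per TDMA frame (assumed)

    def position(self, t):
        """Linear interpolation along the track plot; clamps at the ends."""
        pts = self.waypoints
        if t <= pts[0][0]:
            return pts[0][1:]
        for (t0, x0, y0), (t1, x1, y1) in zip(pts, pts[1:]):
            if t0 <= t <= t1:
                a = (t - t0) / (t1 - t0)
                return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))
        return pts[-1][1:]

    def emits_at(self, slot_index):
        """True when the running slot counter lands on this emitter's slot."""
        return slot_index % self.frame_slots == self.slot

e = SimEmitter("E1", [(0.0, 0.0, 0.0), (10.0, 100.0, 0.0)], slot=3)
print(e.position(5.0))  # (50.0, 0.0)
print(e.emits_at(11))   # True, since 11 % 8 == 3
```

A fuller simulator would add a receiver model and rate control for replay, but track-following plus slot-gated emission is the core of the component described.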
Working under general direction from the Principal Engineer, provides technical solutions to a wide range of well-defined problems and independently executes on assigned components. Bachelor's (or equivalent) with 7-10 years of experience, or a Master's with 5-7 years of experience.

Required Skills:
- Strong proficiency in Python, with demonstrated experience in data processing pipelines, stream ingestion, or ETL development
- Proficiency with Python data science libraries, including NumPy, Pandas (or Polars), and scikit-learn
- Experience with relational database development using MySQL, PostgreSQL, or SQLite, including schema design and query optimization
- Experience parsing or generating binary serialization formats (FlatBuffers, Protocol Buffers, or equivalent)
- Ability to develop, test, and debug on remote Linux servers via SSH using command-line tools and a modern IDE
- Solid Linux operating system fundamentals, including file system management, process control, and basic security hardening (Ubuntu)
- Proficiency in software engineering practices, including Git/GitLab version control, unit testing, and CI/CD pipeline usage
- Experience developing browser-based data visualization or reporting tools, or demonstrated ability to learn React/D3-based tooling on the job
- Strong written and oral communication skills; ability to produce clear technical documentation for engineering audiences
- Ability to work independently on assigned components with minimal supervision in a small, distributed team

Desired Skills:
- Experience with TNS (Target Network System) sensor data formats and NDF ICD specifications
- Familiarity with TDMA network protocols, time-division access architectures, and passive RF signal processing concepts
- Experience with lightweight stream or message queue architectures (ZeroMQ, RabbitMQ, or equivalent)
- Experience with Rust or Go for systems-level or performance-critical development on Linux
- Experience with Polars or DuckDB for high-performance analytical workloads
- Experience with performance profiling and optimization of Python pipelines on resource-constrained x86 hardware
- Experience with LLM-assisted software development tools (e.g., Claude Code, GitHub Copilot, JetBrains AI Assistant, or equivalent); demonstrated ability to use AI tools productively for code generation, refactoring, and test case development while maintaining engineering judgment and code quality standards
- Familiarity with AI/ML libraries (PyTorch, TensorFlow); ability to integrate trained model inference into a pipeline without requiring deep ML expertise
- Experience with Jupyter Notebooks and research enclave environments; ability to read and adapt research prototype code
- Experience with simulation or synthetic data generation for pipeline testing purposes
- Familiarity with distributed data processing tools such as Apache Spark or Dask for large-scale workloads

Relevant Certifications: Certifications in software engineering, computer science, or related fields, e.g., Certified Software Development Professional (CSDP), Certified Scrum Developer (CSD), Red Hat Certified Enterprise Application Developer, Certified Secure Software Lifecycle Professional (CSSLP), C++ Certified Associate Programmer (CPA), or Professional Software Developer Certification (PSD).

Global InfoTek, Inc. is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability.

About Global InfoTek, Inc.: Global InfoTek, Inc. has an award-winning track record of designing, developing, and deploying best-of-breed technologies that address the nation's pressing cyber and advanced technology needs. For over two decades, GITI has merged pioneering technologies, operational effectiveness, and best business practices.
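The relational database skills required above — schema design and query optimization — can be sketched with SQLite, one of the databases the posting names. The table and index below are hypothetical, not the program's actual schema; the point is verifying with EXPLAIN QUERY PLAN that a query hits an index rather than scanning the table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE emitter_obs (
        id INTEGER PRIMARY KEY,
        emitter_id INTEGER NOT NULL,
        obs_time REAL NOT NULL,
        rssi_dbm REAL
    );
    -- Composite index to support time-windowed per-emitter queries.
    CREATE INDEX idx_obs_emitter_time ON emitter_obs (emitter_id, obs_time);
""")

# EXPLAIN QUERY PLAN shows whether the optimizer uses the index
# rather than performing a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT obs_time, rssi_dbm FROM emitter_obs "
    "WHERE emitter_id = ? AND obs_time BETWEEN ? AND ?",
    (42, 0.0, 60.0),
).fetchall()
detail = plan[0][3]  # human-readable plan step, names the index if used
```

On tactical-box-spec hardware, confirming index usage this way before benchmarking avoids chasing phantom bottlenecks; the same discipline applies to MySQL via its own EXPLAIN.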
04/23/2026
Full time
Job Description

Clearance Level: Public Trust
US Citizenship: Required
Job Classification: Full Time
Location: Remote
Years of Experience: 7-10 years of relevant experience
Education Level: BS degree; experience may be considered in place of the education requirement.

Briefly Describe the Work: GITI is seeking a Lead RF Software Engineer to support Cyber Operations research and development on passive RF emitter identification and network analysis from real-time sensor data streams. The candidate will implement, test, and maintain components of a production software pipeline: a stream ingestion, rollup, and post-processing system operating on NDF (Network Description File) data produced by TDMA network sensors in dense, contested RF environments. Working under the direction of the Principal Engineer and the Technical Lead, the Lead RF Software Engineer supports Cyber Operations by contributing to pipeline development across a range of functional areas, including stream processing, database integration, display and reporting tools, simulation infrastructure, and CI/CD tooling. The role requires strong Python skills, comfort with air-gapped Linux environments, and the ability to work independently on well-defined components with minimal supervision in support of real-world cyber operations.
Responsibilities:
- Implement, test, and maintain assigned pipeline components, including stream ingestion, rollup processing, database writes, and batch post-processing modules, in support of real-world cyber operations
- Develop and maintain browser-based visualization and reporting tools (track plots, waterfall displays, SmartBook report generation) that consume pipeline database output
- Implement and maintain stream simulation infrastructure, including TDMA network mission log replay and stream generation at controllable rates for pipeline testing
- Develop lightweight TNS simulator components: emitter and receiver models capable of following track plots and emitting in accordance with a network description
- Contribute to database integration work on tactical-box-spec hardware, including MySQL schema design, query optimization, and performance benchmarking
- Write comprehensive unit and integration tests for assigned components; implement and maintain CI/CD pipelines using GitLab to verify functionality on hardware or in a cloud environment
- Identify and report performance bottlenecks in Python pipeline components; assist with porting mature components to Rust or C as directed
- Perform basic Linux system administration on remote servers, including package management, user configuration, and environment setup
- Manage source code using GitLab; follow disciplined versioning, branching, and code review practices as established by the Principal Engineer
- Produce clear technical documentation for implemented components, including interface specifications, configuration guides, and test procedures
- Participate in periodic technical check-ins with the program technical lead; share findings and flag blockers promptly

This is a career-level position requiring a complete understanding and wide application of technical principles, theories, and concepts.
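The stream ingestion work above ultimately consumes binary sensor records. A minimal sketch of parsing and generating a fixed-size binary format with the standard library is below; the record layout here is invented for illustration and is NOT the NDF ICD format, which is specification-controlled.

```python
import struct

# Hypothetical little-endian record: uint32 emitter_id, float64
# timestamp, float32 freq_hz, uint8 slot. Whitespace in the format
# string is ignored by struct.
RECORD = struct.Struct("<I d f B")


def pack_record(emitter_id: int, timestamp: float,
                freq_hz: float, slot: int) -> bytes:
    return RECORD.pack(emitter_id, timestamp, freq_hz, slot)


def unpack_records(buf: bytes):
    """Yield (emitter_id, timestamp, freq_hz, slot) tuples from a
    buffer of back-to-back fixed-size records."""
    for offset in range(0, len(buf), RECORD.size):
        yield RECORD.unpack_from(buf, offset)


buf = pack_record(7, 12.5, 969.0e6, 3) + pack_record(9, 12.6, 1030.0e6, 4)
decoded = list(unpack_records(buf))
```

In practice a schema-driven serializer such as FlatBuffers or Protocol Buffers (both named in the skills list) replaces hand-rolled `struct` layouts, but the offset-and-unpack pattern is the same idea at its simplest.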
Working under general direction from the Principal Engineer, the engineer provides technical solutions to a wide range of well-defined problems and independently executes assigned components. Bachelor's degree (or equivalent) with 7-10 years of experience, or a Master's degree with 5-7 years of experience.

Required Skills:
- Strong proficiency in Python, with demonstrated experience in data processing pipelines, stream ingestion, or ETL development
- Proficiency with Python data science libraries, including NumPy, Pandas (or Polars), and scikit-learn
- Experience with relational database development using MySQL, PostgreSQL, or SQLite, including schema design and query optimization
- Experience parsing or generating binary serialization formats (FlatBuffers, Protocol Buffers, or equivalent)
- Ability to develop, test, and debug on remote Linux servers via SSH using command-line tools and a modern IDE
- Solid Linux operating system fundamentals, including file system management, process control, and basic security hardening (Ubuntu)
- Proficiency in software engineering practices, including Git/GitLab version control, unit testing, and CI/CD pipeline usage
- Experience developing browser-based data visualization or reporting tools, or demonstrated ability to learn React/D3-based tooling on the job
- Strong written and oral communication skills; ability to produce clear technical documentation for engineering audiences
- Ability to work independently on assigned components with minimal supervision in a small, distributed team

Desired Skills:
- Experience with TNS (Target Network System) sensor data formats and NDF ICD specifications
- Familiarity with TDMA network protocols, time-division access architectures, and passive RF signal processing concepts
- Experience with lightweight stream or message queue architectures (ZeroMQ, RabbitMQ, or equivalent)
- Experience with Rust or Go for systems-level or performance-critical development on Linux
- Experience with Polars or DuckDB for high-performance analytical workloads
- Experience with performance profiling and optimization of Python pipelines on resource-constrained x86 hardware
- Experience with LLM-assisted software development tools (e.g., Claude Code, GitHub Copilot, JetBrains AI Assistant, or equivalent); demonstrated ability to use AI tools productively for code generation, refactoring, and test case development while maintaining engineering judgment and code quality standards
- Familiarity with AI/ML libraries (PyTorch, TensorFlow); ability to integrate trained model inference into a pipeline without requiring deep ML expertise
- Experience with Jupyter Notebooks and research enclave environments; ability to read and adapt research prototype code
- Experience with simulation or synthetic data generation for pipeline testing purposes
- Familiarity with distributed data processing tools such as Apache Spark or Dask for large-scale workloads

Relevant Certifications: Certifications in software engineering, computer science, or related fields, e.g., Certified Software Development Professional (CSDP), Certified Scrum Developer (CSD), Red Hat Certified Enterprise Application Developer, Certified Secure Software Lifecycle Professional (CSSLP), C++ Certified Associate Programmer (CPA), or Professional Software Developer Certification (PSD).

Global InfoTek, Inc. is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability.

About Global InfoTek, Inc.: Global InfoTek, Inc. has an award-winning track record of designing, developing, and deploying best-of-breed technologies that address the nation's pressing cyber and advanced technology needs. For over two decades, GITI has merged pioneering technologies, operational effectiveness, and best business practices.
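The profiling skill the desired-skills list opens with — finding bottlenecks in Python pipeline components before any port to Rust or C — can be sketched entirely with the standard library. `rollup()` here is a deliberately naive stand-in for a real pipeline stage, not actual program code.

```python
import cProfile
import io
import pstats


def rollup(samples):
    # Naive per-key aggregation: the kind of hot loop a profile makes
    # visible before optimizing or porting it.
    totals = {}
    for key, value in samples:
        totals[key] = totals.get(key, 0.0) + value
    return totals


samples = [(i % 100, float(i)) for i in range(50_000)]

profiler = cProfile.Profile()
profiler.enable()
result = rollup(samples)
profiler.disable()

# Summarize by cumulative time; in practice the top entries identify
# which components are worth rewriting.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
```

On resource-constrained x86 hardware, measuring first in this way keeps optimization effort (and any Rust/C port) focused on components that actually dominate runtime.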