Description:
You will act as a senior back-end software engineer, helping to design and develop an entirely new graph analysis platform that will, for the first time, allow our mission customers to visualize, analyze, and traverse their expansive, complex mission data as a graph in near real time. This is an ambitious, high-visibility project with a tremendous opportunity to transform core customer workflows; performance, usability, and scalability are key project goals. To succeed, you will work closely with analysts and operators to gain firsthand insight into their missions, workflows, and perspectives, then use that knowledge to inform the platform's design. You will write and optimize graph retrieval queries, design and maintain ingest processes, and develop batch and streaming analytics that answer key customer questions and surface critical insights.
Responsibilities:
- Design and architect complex, enterprise-grade software solutions for a streaming analytics application built with Java, Spring Boot, and Kafka.
- Work with another senior back-end engineer to support the project's containerized environments in both Docker and Kubernetes.
- Utilize Java and Spring Boot while applying leading design patterns to ensure the product's scalability and maintainability.
- Develop and optimize various streaming analytics to transform and analyze data in a performant way.
- Become proficient with the project's Neo4j graph database; develop, optimize, and troubleshoot graph queries.
- Work regularly with stakeholders to understand the domain, elicit requirements, and devise solutions. Solve real customer problems!
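To give a flavor of the streaming analytics mentioned above, the sketch below shows the kind of keyed aggregation such an analytic performs. The event keys are hypothetical, and plain Java collections stand in for the streaming library so the example is self-contained; a production version would express the same operation as a Kafka Streams topology (e.g., `groupByKey().count()`).

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

/**
 * Stand-in for a streaming keyed aggregation: counts events per entity.
 * In the real pipeline this would be a Kafka Streams topology
 * (groupByKey().count()); plain collections keep the sketch self-contained.
 */
public class KeyedCount {
    // Each "event" is reduced here to the entity key it concerns (hypothetical).
    static Map<String, Long> countByEntity(List<String> entityKeys) {
        return entityKeys.stream()
                .collect(Collectors.groupingBy(k -> k, Collectors.counting()));
    }
}
```

The same grouping logic carries over directly once the input is a `KStream` instead of a `List`.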
Skills Requirements:
- 12 years of relevant experience and a B.S. in a technical discipline, or 4 additional years of experience in lieu of the B.S.
- Active and current TS/SCI with Full-Scope Polygraph (FSP) from Maryland.
- Expert in Java and Spring Boot, with demonstrated experience using them to build enterprise-scale applications.
- Experience building real-time data processing applications using streaming libraries like Kafka Streams.
- Understanding of common Enterprise Integration Patterns (EIP) and how to apply them.
- Experience with service containerization and deployment using Docker and/or Kubernetes.
- Experience with Extract, Transform, Load (ETL) software patterns to ingest large and complex datasets.
- Familiarity with Git and GitLab CI/CD.
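One EIP that comes up constantly in ingest and messaging work is the Content-Based Router, which inspects each message and directs it to an output channel. A minimal sketch follows; the channel names and predicates are hypothetical, not from the project.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Predicate;

/**
 * Content-Based Router (EIP): picks the first output channel whose
 * predicate matches the message; unmatched messages fall through to a
 * default channel (often a dead-letter channel).
 */
public class Router {
    private final Map<String, Predicate<String>> routes = new LinkedHashMap<>();
    private final String defaultChannel;

    Router(String defaultChannel) {
        this.defaultChannel = defaultChannel;
    }

    // Register a channel together with the predicate that selects it.
    Router when(String channel, Predicate<String> matches) {
        routes.put(channel, matches);
        return this;
    }

    // Return the channel the message should be delivered to.
    String route(String message) {
        for (Map.Entry<String, Predicate<String>> e : routes.entrySet()) {
            if (e.getValue().test(message)) {
                return e.getKey();
            }
        }
        return defaultChannel;
    }
}
```

In a Spring-based system the same pattern is usually expressed declaratively (e.g., a Spring Integration router endpoint) rather than hand-written dispatch like this.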
Preferred:
- Experience with graph databases such as Neo4j.
- Experience modeling data and relationships in graph databases.
- Experience with networking concepts, protocols, and analysis (routers, switches, etc.).
- Knowledge of SIGINT collection and analysis systems.
- Experience with production CNO capabilities and operations.
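As a rough illustration of the traversals above, the hop-counting a graph query (e.g., Cypher's `shortestPath()`) performs can be sketched as a plain breadth-first search over an adjacency map. The graph contents here are hypothetical; inside Neo4j this work is done by the query engine, not application code.

```java
import java.util.ArrayDeque;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Queue;
import java.util.Set;

/**
 * Minimal breadth-first search: number of hops between two nodes in an
 * unweighted, directed graph, represented as an adjacency map so the
 * sketch is self-contained. Returns -1 if the target is unreachable.
 */
public class HopCount {
    static int hops(Map<String, List<String>> adj, String from, String to) {
        if (from.equals(to)) return 0;
        Set<String> seen = new HashSet<>(Set.of(from));
        Queue<String> frontier = new ArrayDeque<>(List.of(from));
        int depth = 0;
        while (!frontier.isEmpty()) {
            depth++;
            Queue<String> next = new ArrayDeque<>();
            for (String node : frontier) {
                for (String nbr : adj.getOrDefault(node, List.of())) {
                    if (nbr.equals(to)) return depth;   // found at this depth
                    if (seen.add(nbr)) next.add(nbr);   // enqueue unseen nodes
                }
            }
            frontier = next;
        }
        return -1; // unreachable
    }
}
```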
This position is 100% on-site.
Applicants who do not meet the security clearance requirement will be automatically rejected.