
Data Pipeline Technical Lead at ECS

ECS · Fairfax, United States of America · Hybrid


ECS is seeking a Data Pipeline Technical Lead to work in a hybrid setting out of our Fairfax, VA office.


ECS is seeking talented professionals to join our successful and growing team in building the next-generation Continuous Diagnostics and Mitigation (CDM) Cyber data solution. The CDM Program is the Cybersecurity and Infrastructure Security Agency’s (CISA) dynamic approach to strengthening the cybersecurity of Federal networks and systems through better awareness and visibility into their security posture and cyber threats. ECS is responsible for designing, building, deploying, operating, and maintaining a complete ‘Data Services’ solution, which includes the collection, normalization, visualization, and sharing of cyber data from more than 100 Federal agencies. The CDM Data Services product is an integrated suite of multiple Commercial Off-the-Shelf (COTS) products, software configuration packages, and custom code that work together as a solution tailored to meet Department of Homeland Security (DHS) requirements.


We are seeking professionals who thrive in a dynamic, fast-paced, and highly collaborative environment where problem-solving, critical thinking, and a holistic approach to serving the mission are key. Our program operates within the Scaled Agile Framework (SAFe). Aptitude and enthusiasm for continuous learning, improvement, and cybersecurity are a must!


ECS is seeking a highly skilled Data Pipeline Technical Lead to design, build, and maintain large-scale, real-time data pipelines that provide mission-critical information for our customers. You’ll be responsible for leading technical design, guiding a cross-functional team of engineers, and ensuring our pipeline is resilient, scalable, and efficient.

Key Responsibilities: 

  • Accountable for all technical facets in the delivery of our data pipeline product. 
  • Lead the architecture, design, development, and delivery of our data pipeline product, built by a team of 20+ members. 
  • Build and maintain microservices and data connectors that ingest, process, and distribute data across multiple systems and APIs. 
  • Ensure reliability, scalability, and performance of data pipelines through best practices in resiliency, fault tolerance, and eventual consistency. 
  • Optimize data processing and enrichment by leveraging caching layers such as Redis, ElastiCache, or Valkey (see the caching sketch after this list). 
  • Guide data modeling decisions and define and enforce data format standards (JSON, Avro, Parquet), leveraging a schema registry (see the Avro sketch after this list). 
  • Define technical work necessary for cross-functional team members (DevSecOps, Site Reliability Engineering, Security, Testing, Infrastructure) to ensure successful delivery of the data pipeline product. 
  • Work within a SAFe Agile framework to plan work, decompose Epics, develop Stories, and task team members appropriately. 
  • Represent the data pipeline team across the broader program, whether with internal teams, our customer, or other stakeholders: provide demos, deliver product status, and answer technical questions about the data pipeline. 
  • Provide mentorship to team members to strengthen their skills, knowledge, and capabilities. 
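To give candidates a feel for the caching-layer responsibility above, here is a minimal sketch of cache-assisted enrichment using the redis-py client. It is not the program’s actual implementation; the connection settings, key names, and the lookup_asset_metadata helper are hypothetical stand-ins.

```python
import json
import redis  # redis-py; the same protocol is spoken by ElastiCache and Valkey

# Hypothetical connection settings; real endpoints and credentials will differ.
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

CACHE_TTL_SECONDS = 300  # keep enrichment data for five minutes


def lookup_asset_metadata(asset_id: str) -> dict:
    """Placeholder for an expensive enrichment lookup (database, API, etc.)."""
    return {"asset_id": asset_id, "owner_agency": "unknown"}


def enrich_record(record: dict) -> dict:
    """Attach asset metadata to a pipeline record, consulting the cache first."""
    key = f"asset:{record['asset_id']}"
    cached = cache.get(key)
    if cached is not None:
        metadata = json.loads(cached)  # cache hit: skip the slow lookup
    else:
        metadata = lookup_asset_metadata(record["asset_id"])
        cache.setex(key, CACHE_TTL_SECONDS, json.dumps(metadata))  # populate cache
    return {**record, **metadata}
```

The time-to-live on cached entries is one of the knobs that trades freshness against lookup load, which ties into the eventual-consistency and performance goals listed above.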
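Similarly, as a rough illustration of the data format standards bullet, the sketch below validates and serializes a record against an Avro schema using the fastavro library. The CyberFinding schema and its fields are invented for illustration; in practice the schema would be versioned in and fetched from a schema registry rather than defined inline.

```python
import io
import fastavro

# Hypothetical schema; a real deployment would pull this from a schema registry.
finding_schema = fastavro.parse_schema({
    "type": "record",
    "name": "CyberFinding",
    "fields": [
        {"name": "asset_id", "type": "string"},
        {"name": "severity", "type": "int"},
        {"name": "detected_at", "type": "long"},
    ],
})


def encode_finding(finding: dict) -> bytes:
    """Serialize one record to Avro binary, failing fast on schema violations."""
    buffer = io.BytesIO()
    fastavro.schemaless_writer(buffer, finding_schema, finding)
    return buffer.getvalue()


def decode_finding(payload: bytes) -> dict:
    """Deserialize Avro binary back into a Python dict using the same schema."""
    return fastavro.schemaless_reader(io.BytesIO(payload), finding_schema)
```

Enforcing a shared schema at the serialization boundary is what lets independently built connectors and microservices exchange data without drift.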
