Data Lake Services Developer

Job Location: US-VA-Chantilly
Posted Date: 4/18/2024 5:01 PM
Job ID: 2024-10528
Category: Engineering

Company Overview

We are a world-class team of professionals who deliver next-generation technology and products in robotic and autonomous platforms and ground, soldier, and maritime systems across 50+ locations worldwide. Much of our work contributes to innovative research in the fields of sensor science, signal processing, data fusion, artificial intelligence (AI), machine learning (ML), and augmented reality (AR).


QinetiQ US’s dedicated experts in defense, aerospace, security, and related fields all work together to explore new ways of protecting the American Warfighter, Security Forces, and Allies. Being a part of QinetiQ US means being central to the safety and security of the world around us. Partnering with our customers, we help save lives; reduce risks to society; and maintain the global infrastructure on which we all depend.


Why Join QinetiQ US?


If you have the courage to take on a wide variety of complex challenges, then you will experience a unique working environment where innovative teams blend different perspectives, disciplines, and technologies to discover new ways of solving complex problems. In our diverse and inclusive environment, you can be authentic, feel valued, be respected, and realize your full potential. QinetiQ US will support you with workplace flexibility and a commitment to the health and well-being of you and your family, and will provide opportunities to work with purpose. We are committed to supporting your success in both your professional and personal lives.

Position Overview

We are recruiting for a Data Lake Services Developer who will leverage their development skills and experience to support the successful ingestion, cleansing, transformation, loading, and display of significant amounts of data, while maintaining and improving the Client's data lake services, Hadoop environment, and Hadoop services.

Responsibilities

  • Designing and implementing large-scale ingest systems in a Big Data environment.
  • Optimizing all stages of the data lifecycle, from initial planning and ingest through final display and beyond.
  • Designing and implementing data extraction, cleansing, transformation, loading, and replication/distribution capabilities.
  • Developing custom solutions/code to ingest and exploit new and existing data sources.
  • Working with Sponsor development teams to improve system and application performance.
  • Providing support to maintain, optimize, troubleshoot, and configure data lake services, the Hadoop environment, and Hadoop services as needed.
  • Organizing and maintaining documentation so others are able to understand and use it.
  • Collaborating with teammates, other service providers, vendors, and users to develop new and more efficient methods.

Required Qualifications

  • Active TS/SCI clearance with polygraph required #qinetiqclearedjob
  • Strong programming/development skills and experience in Java.
  • Experience with service-oriented design and architectures.
  • Strong ability to manage competing priorities and communicate with multiple stakeholders.

Preferred Qualifications

  • Hands-on experience with Hadoop and Hadoop services, particularly with large Hadoop clusters.
  • Experience with NiFi flows and deployments.

Company EEO Statement

Accessibility/Accommodation:

If, because of a medical condition or disability, you need a reasonable accommodation for any part of the employment process, please send an e-mail to staffing@us.QinetiQ.com or call (540) 658-2720, Opt. 4, and let us know the nature of your request and your contact information.

 

QinetiQ US is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive equal consideration for employment without regard to race, age, color, religion, creed, sex, sexual orientation, gender identity, national origin, disability, or protected Veteran status.
