Posted by: Booz Allen Hamilton on Feb 18, 2020
Job Number: R0078655
DataOps Engineer

Key Role:
Perform DataOps activities leveraging cutting-edge capabilities, a cloud environment, and data tools, including architecting data systems, standing up data platforms, building out ETL pipelines, and writing custom code to support continuous integration and continuous delivery/deployment of data pipelines, tools, and environments. Work as part of a client-facing, internal consulting team that addresses data challenges, including discussing, designing, developing, and maintaining a scalable data platform.
Basic Qualifications:
- 3+ years of experience with automating, managing, and monitoring data pipeline operations at large scale
- 2+ years of experience with continuous integration and automated deployments of data pipelines, tools, and environments
- Experience with AWS data services, including Redshift, S3, RDS, DMS, and DynamoDB
- Experience with ETL tools, including Pentaho Data Integration, StreamSets, or NiFi
- Experience with BI tools, including MicroStrategy or Tableau
- Experience with Cloud infrastructure and automating the deployment of infrastructure components using one of the following tools: Ansible, Chef, or Puppet
- Ability to obtain a security clearance
- BA or BS degree
Additional Qualifications:
- Experience with Linux, RHEL 7, or Windows
- Experience with NoSQL databases, including MongoDB, DynamoDB, Elasticsearch, or Redis
- Experience with data catalog, data quality, and data modeling tools
- Experience with Agile software development
- Knowledge of Jira, Git, or Jenkins
- Possession of excellent oral and written communication skills
Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information.
We're an EOE that empowers our people, no matter their race, color, religion, sex, gender identity, sexual orientation, national origin, disability, veteran status, or other protected characteristic, to fearlessly drive change.