• Support the agile development of enterprise tools
• Develop custom code to quickly extract, triage, and exploit data across domains in support of analytic work, while supporting the strategic development of replicable processes.
• Leverage standard commercial tools to Extract, Transform, and Load (ETL) data between databases.
• Attend regular user group meetings and make appropriate recommendations to improve the use and dissemination of existing tools as well as new datasets.
• Education: Bachelor’s degree in Computer Science, Engineering, or a related technical discipline, or the equivalent combination of education, technical certifications or training, or work experience.
• Minimum of 11 years of related experience.
• Technical Skills: Expertise in Python and Java (demonstrated proficiency in both); open-source ETL work and experience with Apache NiFi, Kudu, Oozie, and the Cloudera ecosystem, including Impala, HBase, and Spark. Understanding of and proficiency in cross-domain solutions (ETL of data from unclassified to classified environments and across classified environments). Experience using Agile and proficiency with continuous integration/delivery tools such as Jenkins, Artifactory, and Git. Experience with AWS and container technologies such as Docker is desired.
• Security Clearance Level: Top Secret/SCI with Polygraph