
Work on cutting-edge topics involving large datasets. Perform activities including data architecture, building data and analytics platforms, building out ETL pipelines and data access services, and ensuring data discoverability and quality. Support the assessment, design, building, and maintenance of scalable data platforms that use the latest and best Big Data tools. Transform data into a useful format for data scientists. Work with a multidisciplinary team of analysts, data engineers, data scientists, developers, and data consumers in an agile, fast-paced environment that is pushing the envelope of cutting-edge Big Data implementations.
Basic Qualifications:
-Experience with coding using Java, Scala, or Python
-Experience with developing and deploying ETL pipelines
-Experience in interfacing with modern databases
-Experience in working with Big Data platforms, including Hadoop, AWS, Azure, or Databricks
-Ability to learn technical concepts quickly and communicate with multiple functional groups
-Top Secret Security Clearance
-9 years of experience and a Master's degree, or 19 years of equivalent experience
Additional Qualifications:
-Experience with NoSQL data stores, including HBase, MongoDB, JanusGraph or Neo4j, and Cassandra
-Experience with ETL tools, including StreamSets, NiFi, and Talend
-Experience in working with enterprise production systems
-Excellent oral and written communication skills
-AWS or related certifications
-Active TS/SCI with polygraph clearance
Job Features:
Job Category: Data Scientist, Software Developer, Software Engineer