Description
Boeing Intelligence & Analytics (BI&A) is seeking a Principal Analytics Software Developer to join the development team on one of our many prime programs. This prime contract is in year three (3) of a seven (7) year period of performance.

Work Location: Annapolis Junction, MD
Telework Availability: N/A

What You Will Do:
Build and enhance high-visibility, collaborative web application tools used in the day-to-day mission, throughout the customer and across the broader Intelligence Community (IC). Your work will enable data to be updated and shared in real time and will help drive more informed and timely decisions in matters of national security. You will work with real-time data and create tools using cutting-edge visualization, development, and analytic technologies.

Thus far, our team has created:
- Streamlined, collaborative user interfaces (UIs) that share data across agencies within the IC
- UIs that allow users to consolidate, organize, reconcile, and update data in real time across tabular, graphical, and map visualizations
- An analyst tool that enables users to quickly capture, document, and access information

A Day in the Life (Just a Few of the Things You Will Do):
- Develop and deploy Map/Reduce analytics to the corporate cloud environment
- Create and update Java Map/Reduce (batch/corporate thread) analytics that use cloud processing for large data sets and store results in HDFS, Accumulo, or egress to a corporate tool
- Create analytics that query other corporate sources to enrich data
- Upload Map/Reduce analytics to Job Management Control to schedule and execute analytic jobs daily
- Perform queries against large cloud data sets
- Work on submission of analytics through the compliance process via the JIRA ticketing system
- Use the Git code repository for source version control
- Be part of a collaborative and diverse team of junior, senior, and expert-level developers and mission leaders, allowing you to mentor and learn from others
- Collaborate closely with multiple product teams to design solutions from the user's perspective
- Review and test software components and gain experience developing comprehensive system, performance, and design plans
- Develop software analytics for knowledge data anomalies within enterprise solutions
- Develop ETL processes to retrieve and deliver data across multiple persistence technologies in a performant manner
- Understand and ensure corporate compliance rule sets are upheld
- Connect to outside data sources to retrieve, collate, and cache data
- Write automated unit and integration tests
- Monitor running applications and troubleshoot errors

Required Education / Years of Experience:
- Twenty (20) years' experience as a System Architect or Systems Engineer on programs and contracts of similar scope, type, and complexity is required.
- A bachelor's degree in Electrical Engineering, Systems Engineering, Computer Science, Information Systems, Engineering Science, Engineering Management, or a related discipline from an accredited college or university is required. Five (5) years of additional Systems Engineering experience may be substituted for a bachelor's degree.

Required Skills/Qualifications (What You Must Have):
- Active TS/SCI clearance with polygraph
- Experience developing software with high-level languages (such as Java, C, C++), developing software in UNIX/Linux (RedHat versions 3-5+), and software integration and testing (including developing and implementing test plans and scripts)
- Experience writing analytics that query other corporate sources to enrich data
- Experience with distributed, scalable Big Data stores (NoSQL) such as HBase, CloudBase/Accumulo, and BigTable, as well as the Map/Reduce programming model, the Hadoop Distributed File System (HDFS), and technologies such as Hadoop, Hive, and Pig
- Demonstrated work experience with (1) serialization formats such as JSON and/or BSON, (2) developing RESTful services, and (3) using source code management tools
- Experience developing applications using Java application frameworks (e.g., Spring Boot)
- Full Software Development Lifecycle (SDLC) experience that includes twenty (20) years or more of:
  - Analyzing and translating user requirements into software requirements
  - Developing software solutions by analyzing system performance standards
  - A combination of providing both new software development and capabilities enhancement
  - Developing simple data queries for existing or proposed databases or data repositories
  - Software integration (new and existing systems)
  - Troubleshooting, debugging, and defect correction
  - Creation and editing of software system documentation
  - Ensuring unit testing and software quality control of all developed and modified software
  - Serving as a mentor to junior team members

Desired Experience/Skills (Nice to Have):
- Experience writing Linux-based scripts to facilitate application integration using one or more appropriate server-side languages (e.g., Shell, Python)
- Modifying the repository schema of an existing application to support new data items, developing data mining and data ingest processes for new data, and supporting integration with middle-tier and back-end APIs
- Experience with cloud technologies and cloud development
- Experience with Apache NiFi
- Experience with Java/Pig MapReduce
- Experience performing queries against large data sets
- Experience with Machine Learning
- Experience with Python and Jupyter Notebook
- Experience developing software in a Linux environment
- Experience with customer GHOSTMACHINE analytic development
- Familiarity with customer Government Off-the-Shelf (GOTS) corporate tools for data UI visualization
- Experience working in an Agile software development environment
- Experience managing software code using Git and Maven
- Experience using Jira and Confluence
- Ability to work in a team environment
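The Map/Reduce analytic work described throughout this posting follows one basic shape: a map phase that turns each input record into a key/value pair, and a reduce phase that aggregates values by key. Below is a minimal sketch of that shape in plain Java, runnable without a Hadoop cluster; the "source,payload" record format and the `BatchAnalyticSketch` class name are illustrative assumptions, not program artifacts. A real analytic on this program would use the Hadoop MapReduce API and persist results to HDFS or Accumulo.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

/**
 * Hedged sketch of the map/reduce shape in plain Java.
 * Counts records per source, the way a batch analytic might
 * aggregate events before storing results in HDFS or Accumulo.
 */
public class BatchAnalyticSketch {

    // Map phase: turn each record into a (key, 1) pair.
    // Assumes illustrative "source,payload" record lines.
    static Map.Entry<String, Integer> map(String record) {
        String source = record.split(",", 2)[0];
        return Map.entry(source, 1);
    }

    // Reduce phase: sum the values emitted for each key.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        return pairs.stream().collect(
            Collectors.groupingBy(Map.Entry::getKey,
                                  Collectors.summingInt(Map.Entry::getValue)));
    }

    public static Map<String, Integer> run(List<String> records) {
        return reduce(records.stream()
                             .map(BatchAnalyticSketch::map)
                             .collect(Collectors.toList()));
    }

    public static void main(String[] args) {
        List<String> batch = List.of("alpha,x", "beta,y", "alpha,z");
        // Counts per source key; map ordering is not guaranteed.
        System.out.println(run(batch));
    }
}
```

In the real Hadoop version, the same two functions become a `Mapper` and a `Reducer` class, and the framework handles partitioning, shuffling the pairs by key, and parallel execution across the cluster.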