Description
Boeing Intelligence & Analytics (BI&A) is seeking a Data Engineer with data transformation (ETL) experience using current industry tools: Elasticsearch, Apache Kafka, and Apache NiFi.

COVID-19: As a U.S. government contractor, Boeing will require all new hires and employees located in the United States to be fully vaccinated against COVID-19 by December 8, 2021. Individuals who are unable to be fully vaccinated due to a disability/medical condition or sincerely held religious belief may apply for a reasonable accommodation during the post-offer process. Individuals who are approved for a reasonable accommodation will be subject to frequent COVID-19 testing and possibly other health and safety measures. Employees working in certain positions may be required to undergo frequent COVID-19 testing prior to December 8th if they are not fully vaccinated.

DUTIES ENTAIL:
- Work with a teammate on data integration requirements.
- Write code on the ETL platform to transform data into suitable formats as defined by IC ITE initiatives.
- Add features to the ETL platform to shorten timelines for future data integration efforts.
- Develop and maintain code, and integrate software into a fully functional software system.
- Participate in daily scrum meetings, sprint retrospectives, and other agile processes.
- Work with external teams to validate data ingest.
- Provide and maintain documentation of system architecture, development, and enhancements.

EDUCATION: Bachelor's Degree and 6 or more years' experience, or Master's Degree and 3 or more years' experience, from an accredited course of study in engineering, computer science, mathematics, physics, or chemistry.
REQUIRED EXPERIENCE:
- Active TS/SCI clearance with polygraph
- 6+ years of software development experience
- Demonstrated understanding of high-scale cloud architecture
- Linux/Unix experience
- Experience with an object-oriented programming language
- Strong verbal and written communication skills
- Strong analytical skills, with excellent problem-solving abilities in the face of ambiguity

DESIRED EXPERIENCE:
- Expertise in data ingestion, data transformation (ETL), and data modeling
- Experience with Java, Ruby, or Python
- Experience in Agile/Scrum enterprise-scale software development
- 3 years' experience working with batch processing and related tools (e.g., NiFi, Midpoint, MapReduce, YARN, Pig, Hive, HDFS, Oozie)
- 1 year working with RESTful web services
- Experience with code development, deployment, versioning, and build tools (e.g., Eclipse, Git, SVN, Maven, Jenkins)
- Experience with stream-processing tools (e.g., Storm)
- Experience developing applications that work with NoSQL stores (e.g., Elasticsearch, HBase, Cassandra, MongoDB, CouchDB)
- Experience working in cloud architectures with AWS EC2, RDS, S3, VPC, and Elasticsearch