Cyber Support Developer, Principal (Government)



AT&T Public Sector

Post Date: May 31, 2022
Location: Virginia - Chantilly
Security Clearance: Top Secret - SCI, Top Secret w/ CI Poly
Job Type: Permanent
Start Date: - n/a -
Salary: - n/a -
Job Reference: 2218763

Description

AT&T Global Public Sector is a trusted provider of secure, IP-enabled, cloud-based network solutions and professional services to the Federal Government. We are dedicated to recruiting, developing, and empowering a diverse, high-performing workforce that is passionate about what they do, committed to our shared values, and dedicated to our customers’ mission.



Our Cyber Security Team supports the customer by investigating, analyzing, and mitigating cybersecurity incidents that attempt to breach the Customer’s network infrastructure, applications, and operating systems.



AT&T has an opening for a Principal Cyber Support Developer to support the Grimlock contract.



The job duties of the Principal Cyber Support Developer are as follows:



  • Work as an integral part of the Data Analytics team, including cyber architect and cloud security architect SMEs, to collect key stakeholder requirements, characterize system use cases from customer requirements, and apply them to a technical specification

  • Provide data analytics support and data science expertise for specialized cybersecurity applications and big data analytical systems

  • Collect information on enterprise data sources, formats, and key stakeholder use cases to transform data sets into a common schema, e.g., merging different event types to provide an enriched data set

  • Assist with technical design and documentation; perform testing and resolve identified bugs/defects

  • Design, customize, and document services based on COTS products using Agile development approaches and methodologies

  • Support the consolidation of metadata collections, data management, and search tools into a Cybersecurity Data Catalog

  • Turn data into action with intelligent analytics and clear insights; define raw input requirements to support data models and final outputs for quick analysis and summary reporting

  • Meet professional obligations through efficient work habits such as meeting deadlines, honoring schedules, and coordinating resources and meetings in an effective and timely manner
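As an illustration of the schema-merging duty above (transforming different event types into a common, enriched data set), here is a minimal Python sketch; all field names and event shapes are hypothetical, not taken from the contract:

```python
# Sketch: normalize two heterogeneous cyber event types into one common
# schema, then merge them into a single time-ordered stream.
# All field names here are illustrative assumptions.

def normalize_firewall(event: dict) -> dict:
    """Map a firewall log record onto the shared schema."""
    return {
        "timestamp": event["ts"],
        "source_ip": event["src"],
        "dest_ip": event["dst"],
        "action": event["action"],
        "event_type": "firewall",
    }

def normalize_proxy(event: dict) -> dict:
    """Map a web-proxy log record onto the same shared schema."""
    return {
        "timestamp": event["time"],
        "source_ip": event["client"],
        "dest_ip": event["server"],
        "action": event["verdict"],
        "event_type": "proxy",
    }

def merge_events(firewall_events: list, proxy_events: list) -> list:
    """Produce one enriched, time-ordered stream in the common schema."""
    merged = [normalize_firewall(e) for e in firewall_events]
    merged += [normalize_proxy(e) for e in proxy_events]
    return sorted(merged, key=lambda e: e["timestamp"])
```

With every event type mapped onto the same field set, downstream analytics and search can query one schema instead of one parser per source.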



Required Clearance:



TS/SCI with polygraph



Required Qualifications:



  • A minimum of 7+ years of overall relevant experience with a Bachelor’s degree; or an Associate’s degree and 9+ years of relevant experience; or 11+ years of overall relevant experience with no degree; or a Master’s degree and 3-5 years of relevant experience

  • A minimum of 5 years of Linux scripting and automation using Bash, Python, and/or Java

  • Experience with installation and administration of COTS applications on RHEL and/or CentOS Linux

  • Ability to create dashboard content and visualizations that provide actionable intelligence, e.g., leveraging machine learning (ML) to make enterprise cyber data more easily consumable, identify uncommon characteristics of a dataset, and ensure fuller use of cyber data

  • Ability to understand the types of data assets through discovery, description, and organization of datasets

  • Ability to provide development support for emerging requirements around event ingest and transformation into a common data schema for consistent data analytic search capabilities

  • Experience designing and developing Elastic Beats or Logstash configurations that accept or extract data from an audit generation system such as an application database, file, or syslog stream

  • Ability to establish and implement a Data Governance workflow focused primarily on metadata, data sources, data quality, policies, and procedures
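To illustrate the kind of Logstash configuration referenced above, here is a minimal pipeline sketch that ingests a syslog stream and ships it to Elasticsearch; the port, host, and index names are hypothetical placeholders:

```conf
# Sketch of a Logstash pipeline: syslog in, Elasticsearch out.
# Port, host, and index values below are illustrative assumptions.
input {
  syslog {
    port => 5514                      # hypothetical listening port
  }
}
filter {
  mutate {
    # Tag each event so it can be routed in the common schema
    add_field => { "event_source" => "example_app" }
  }
}
output {
  elasticsearch {
    hosts => ["https://es.example.internal:9200"]  # hypothetical cluster
    index => "cyber-events-%{+YYYY.MM.dd}"         # daily index pattern
  }
}
```

A Beats-based variant would replace the syslog input with a `beats` input listening for Filebeat shippers on the endpoints.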



Desired Qualifications:



  • Experience with streaming data tools and software, such as Apache or Confluent Kafka

  • Experience with Data Integration, Data Engineering, and Data Lake implementations using ETL, Big Data, and Cloud technology

  • Experience with JIRA, Confluence, and Git

  • Familiarity with Security Information and Event Management (SIEM) software

  • Experience with container services such as Docker and Kubernetes



Ready to join our team? Apply Today!
