Overview

Bellevue, WA, USA
Boeing

We are looking for a forward-thinking Hadoop Systems Administrator to join our team in Bellevue, Washington. As a Hadoop Administrator, you will provide systems management and infrastructure support for Hadoop analytics systems, which include various open-source analytics applications running on Linux-based servers. You will also review and analyze Hadoop analytics tools for appropriateness, robustness, and available support in a leading-edge hardware and software infrastructure delivery environment.

Responsibilities Include:

  • Provide Hadoop systems infrastructure support
  • Provide general systems management and maintenance of software
  • Provide computing system software troubleshooting and problem resolution
  • Provide system monitoring capabilities using approved tools and processes to validate system health (see the sketch following this list)
  • Perform root cause analysis
  • Coordinate with peers, end users, focals, program/product managers, and vendors as needed
  • Maintain and improve delivery systems availability, functionality, and performance
  • Develop and maintain procedures used for computing system management
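
By way of illustration, the monitoring work above might resemble the following minimal sketch, assuming the standard HDFS and YARN client tools are on the PATH and the invoking user has read access to the cluster. This is a sketch of common Hadoop commands, not an approved Boeing procedure:

    #!/usr/bin/env bash
    # Minimal daily Hadoop cluster health check (illustrative sketch only).
    set -euo pipefail

    # Cluster capacity, live/dead DataNodes, and under-replicated block counts.
    hdfs dfsadmin -report | head -n 20

    # Filesystem integrity summary; fsck reports problems but does not repair them.
    hdfs fsck / | tail -n 20

    # Status of every NodeManager as YARN sees it.
    yarn node -list -all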

Boeing is the world’s largest aerospace company and leading manufacturer of commercial airplanes and defense, space and security systems. We are engineers and technicians. Skilled scientists and thinkers. Bold innovators and dreamers. Join us, and you can build something better for yourself, for our customers and for the world.

Division
CIO, Information & Analytics

Relocation Assistance Available
No. Relocation assistance is not a negotiable benefit.

Qualifications

Typical Education/Experience:

  • Technical bachelor’s degree and typically 5 or more years’ related work experience, or a Master’s degree with typically 3 or more years’ related work experience, or a PhD degree, or an equivalent combination of education and experience. A technical degree is defined as any four-year degree, or greater, in a mathematics, scientific, or information technology field of study.

Required Qualifications:

  • This position must meet Export Control compliance requirements; therefore, a “US Person” as defined by 22 C.F.R. § 120.15 is required. “US Person” includes US citizen, lawful permanent resident, refugee, or asylee.
  • Minimum of 1 year of work experience with Hadoop systems
  • Minimum of 3 years of work experience with Linux
  • Minimum of 5 years of experience in the IT industry
  • Experience with the Hadoop Distributed File System (HDFS) and the analytics applications that use it

Preferred Qualifications:

  • An understanding of the function and purpose of the Hortonworks Hadoop components, including:
  • Database: Hive, LLAP, Druid.
  • Machine Learning: Zeppelin, Spark, TensorFlow.
  • Stream Processing: Kafka, Storm, NiFi.
  • Operational Data Store: Phoenix, HBase.
  • Security: Knox, Atlas, Ranger, Kerberos (see the Kerberos sketch following this list).
  • Workload Management: YARN, Ambari.
  • Developer tools: Python, Java, Bash.
  • Experience with Linux clustering best practices.
  • Networking knowledge such as TCP/IP, Active Directory, SSO, and multi-homing.
  • Ability to provide technical solutions to complex problems that require ingenuity and creativity.
  • Good interpersonal, communication, and customer relations skills; a reputation as a ‘team player’.
  • Ability to work independently, one-on-one, or in a group setting.
  • Experience in a large and diverse production environment.
  • Positive ‘can-do’ attitude and ability to work in a fast-paced, demanding environment.
  • Knowledge of parallel database systems is a bonus.
  • Experience with both physical and virtual systems is a bonus.
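
For context on the security items above, access to a Kerberized Hadoop cluster is typically established before any HDFS command will succeed. A minimal sketch, assuming a headless keytab is available; the principal, realm, and keytab path below are hypothetical examples:

    # Obtain a Kerberos ticket from a keytab, then verify and use it.
    # Principal, realm, and keytab path are hypothetical examples.
    kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM
    klist                      # confirm a valid ticket-granting ticket was obtained
    hdfs dfs -ls /user         # succeeds only with a valid ticket on a secured cluster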

Experience Level
Individual Contributor

Job Type
Standard

Travel
Yes, 10% of the Time

Contingent Upon Program Award
No

Union
No

Job Code
BAUGP3 [...]
