Jr. Hadoop Administrator

Job Description

Responsibilities

  • Database backup and recovery
  • Database connectivity and security
  • Performance monitoring
  • Disk space management
  • Software patches and upgrades
  • Automation of manual tasks
  • File system management and monitoring
  • Develop processes, tools, and documentation in support of production operations including code migration
  • Software installation and configuration
  • HDFS support and maintenance
  • Manage and review Hadoop log files
  • Performance tuning of Hadoop clusters and Hadoop jobs
  • Screening Hadoop cluster job performance and capacity planning
  • Monitoring Hadoop cluster connectivity and security
  • Teaming diligently with the infrastructure, network, database, application, and business intelligence teams to ensure high data quality and availability
  • Loading large data volumes in a timely manner
  • Evaluate new software, hardware and infrastructure solutions
  • Serve as the point of contact for production issues and alerts, and resolve them in a timely manner
  • Serve as the point of contact for vendor escalations
  • Work with the business intelligence team to build the Hadoop platform infrastructure reference architecture
  • Work with data delivery teams to set up new Hadoop users
  • This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users (a minimal sketch follows this list)
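
To make the user-onboarding item above concrete, here is a minimal shell sketch of that workflow on a Kerberized, HDP-style cluster. The user name jdoe, realm EXAMPLE.COM, and HiveServer2 host are placeholders, and the exact steps vary by distribution and security setup:

    #!/usr/bin/env bash
    # Sketch: onboard a new Hadoop user (hypothetical user "jdoe", realm "EXAMPLE.COM").
    # Assumes an edge node with kadmin.local access and the hdfs superuser available.
    set -euo pipefail

    NEWUSER=jdoe
    REALM=EXAMPLE.COM

    # 1. Create the Linux account.
    sudo useradd -m "$NEWUSER"

    # 2. Create a Kerberos principal and export a keytab for it.
    sudo kadmin.local -q "addprinc -randkey ${NEWUSER}@${REALM}"
    sudo kadmin.local -q "ktadd -k /home/${NEWUSER}/${NEWUSER}.keytab ${NEWUSER}@${REALM}"
    sudo chown "${NEWUSER}:${NEWUSER}" "/home/${NEWUSER}/${NEWUSER}.keytab"

    # 3. Create the user's HDFS home directory as the hdfs superuser.
    sudo -u hdfs hdfs dfs -mkdir -p "/user/${NEWUSER}"
    sudo -u hdfs hdfs dfs -chown "${NEWUSER}:${NEWUSER}" "/user/${NEWUSER}"

    # 4. Smoke-test HDFS access as the new user: authenticate, write, and list.
    sudo -u "$NEWUSER" kinit -kt "/home/${NEWUSER}/${NEWUSER}.keytab" "${NEWUSER}@${REALM}"
    sudo -u "$NEWUSER" hdfs dfs -put /etc/hosts "/user/${NEWUSER}/smoke_test.txt"
    sudo -u "$NEWUSER" hdfs dfs -ls "/user/${NEWUSER}"

    # 5. Optionally verify Hive access via Kerberized HiveServer2 (host is a placeholder).
    sudo -u "$NEWUSER" beeline -u "jdbc:hive2://hive-host:10000/default;principal=hive/_HOST@${REALM}" -e "SHOW DATABASES;"

Pig and MapReduce access can be smoke-tested the same way once the Kerberos ticket is in place.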

Required Skills

  • Understanding of all components of the platform infrastructure in order to analyze the impact of alerts and other system messages
  • Knowledge of networking to troubleshoot and isolate infrastructure issues
  • Excellent verbal and written communication and persuasion skills; able to collaborate and engage effectively with technical and non-technical resources, speaking the language of the business

Required Experience

  • 1-3 years of overall industry experience, with 1-2+ years in big data, Hadoop, and the Hadoop ecosystem
  • 1-3 years of experience supporting databases and Linux operating system platforms
  • 1-2 years of experience with scripting and automation (shell or another language; see the example after this list)
  • 1-2 years of experience with SQL
  • 1-3 years of experience with the entire Software Development Lifecycle (SDLC) and hands-on experience with monitoring tools
  • Experienced in the deployment and administration of different Hadoop distributions for leading organizations with enterprise-scale solution architectures and implementations; HDP (Hortonworks Data Platform) Certified Administrator preferred
  • Experienced in designing and deploying architectures, with a broad understanding of the components of Hadoop and the Hadoop ecosystem
  • Experience in the design and implementation of security for Hadoop and Hadoop ecosystem applications
  • Experience in building the monitoring environment for Hadoop clusters using vendor-provided and open-source tools
  • Experienced in anticipating problems and taking decisive action to resolve issues with minimal impact on development and production clusters
  • Experience installing and maintaining development tools (e.g., source control, testing, and collaboration tools)
  • Experience using AWS EC2 a plus
  • Experience and exposure to our existing BI toolset (Teradata, Cognos, SQL Server, ParAccel, DataStage) a plus
  • Retail industry and e-commerce experience a plus
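
As an illustration of the scripting, automation, and monitoring items above, here is a small cron-style shell sketch that alerts when cluster-wide HDFS usage crosses a threshold. The threshold value, recipient address, and use of mail are assumptions for the example, not part of the role description:

    #!/usr/bin/env bash
    # Sketch: alert when cluster-wide HDFS usage exceeds a threshold.
    # Intended to run from cron as a user allowed to query the NameNode.
    set -euo pipefail

    THRESHOLD=80                       # percent; placeholder value
    RECIPIENT="hadoop-ops@example.com" # hypothetical ops mailing list

    # The first "DFS Used%" line of `hdfs dfsadmin -report` is the cluster-wide figure.
    used_pct=$(hdfs dfsadmin -report | awk '/DFS Used%/ {gsub(/%/, "", $3); print $3; exit}')
    used_int=${used_pct%.*}  # drop the fractional part for an integer comparison

    if [ "$used_int" -ge "$THRESHOLD" ]; then
        echo "HDFS usage at ${used_pct}% (threshold ${THRESHOLD}%)" \
            | mail -s "HDFS capacity alert on $(hostname)" "$RECIPIENT"
    fi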

Education Requirements

  • Bachelor’s degree in Computer Science or equivalent work experience in a related field

Equal Employment Opportunity (EEO): All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
