Responsibilities
- Run data aggregation either directly from Kafka or from HBase
- Replace the Storm topology with Spark Streaming, consuming the data directly and writing it to HBase and Parquet files (see the Spark Streaming sketch after this list)
- Real-time application events are published to a Kafka message queue
- Storm acts as a consumer of those real-time events from the Kafka queue
- The Storm topology processes the data, writing it to HBase and a subset of the data to HDFS in ORC format
- Data written to HDFS is processed by a Spark job that performs partial aggregation
- A REST API (Jersey) is used to query data from HBase and render it to the user interface (see the Jersey sketch below)
- Dynamic aggregation is done using Phoenix (see the Phoenix sketch below)
- Built on the Hortonworks distribution, HDP 2.4
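A minimal sketch of the Spark Streaming consumer described above, assuming Spark 1.6 (the version bundled with HDP 2.4) and the spark-streaming-kafka direct API; the broker address, topic name, and batch interval are illustrative placeholders, not values from the posting:

```java
import java.util.*;

import kafka.serializer.StringDecoder;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

public class EventStreamJob {
    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("kafka-to-hbase");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));

        // "broker1:6667" and "app-events" are placeholder broker/topic names.
        Map<String, String> kafkaParams = new HashMap<>();
        kafkaParams.put("metadata.broker.list", "broker1:6667");
        Set<String> topics = Collections.singleton("app-events");

        // Direct (receiver-less) stream: one RDD partition per Kafka partition.
        JavaPairInputDStream<String, String> events = KafkaUtils.createDirectStream(
                jssc, String.class, String.class,
                StringDecoder.class, StringDecoder.class,
                kafkaParams, topics);

        // Each micro-batch would be written to HBase (via the HBase client or
        // Phoenix) and/or persisted as Parquet; shown here as a count only.
        events.count().print();

        jssc.start();
        jssc.awaitTermination();
    }
}
```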
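A minimal sketch of the Jersey resource that serves HBase data to the user interface, using the HBase 1.x client API shipped with HDP 2.4; the table, column family, and qualifier names are hypothetical:

```java
import java.io.IOException;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

@Path("/events")
public class EventResource {

    // "events", "d", and "payload" are placeholder table/family/qualifier names.
    @GET
    @Path("/{rowKey}")
    @Produces(MediaType.APPLICATION_JSON)
    public String getEvent(@PathParam("rowKey") String rowKey) throws IOException {
        // A real service would share one long-lived Connection rather than
        // opening it per request; kept inline here for brevity.
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("events"))) {
            Result result = table.get(new Get(Bytes.toBytes(rowKey)));
            byte[] value = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("payload"));
            return value == null ? "{}" : Bytes.toString(value);
        }
    }
}
```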
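And a sketch of dynamic aggregation through Phoenix's JDBC driver; the ZooKeeper quorum, table, and column names are hypothetical, since the actual schema is not given in the posting:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class PhoenixAggregation {
    public static void main(String[] args) throws Exception {
        // "zk1:2181:/hbase" is a placeholder ZooKeeper quorum; EVENTS is a
        // hypothetical Phoenix table mapped over the HBase event table.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk1:2181:/hbase");
             PreparedStatement stmt = conn.prepareStatement(
                     "SELECT EVENT_TYPE, COUNT(*) FROM EVENTS " +
                     "WHERE EVENT_TIME > CURRENT_DATE() - 1 GROUP BY EVENT_TYPE")) {
            // Phoenix pushes the aggregation down to the HBase region servers,
            // which is what makes on-the-fly rollups over HBase data practical.
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " -> " + rs.getLong(2));
                }
            }
        }
    }
}
```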
Required Skills
- Strong analytical and diagnostic skills
- Analyze and design products and services in the Wireless business to fulfill the needs of strategic initiatives
- Understand business needs, make critical design decisions, and implement them with other team members
- Participate in day-to-day activities involving design, development, test support, deployment, and production monitoring for the Order Management platform
- Communicate with external teams, including business clients, on issues and their resolutions to provide the best customer experience
- Participate in requirements gathering and create design documents
- Certification in Hadoop preferred
Required Experience
- Experience in Big Data with a strong suit in Spark/Kafka/HBase (must-have)
- 8 years' experience with all phases of the SDLC, including system analysis, design, coding, testing, debugging, and documentation
- 5 years' development experience on the Hadoop platform, including Spark, Kafka, Pig, Hive, Sqoop, HBase, Flume, and related tools
- 5 years of coding Java MapReduce, Python, Pig, Hadoop Streaming, and HiveQL
- 3 years' experience with Big Data ETL on the Hadoop stack
- 3 years' professional experience designing and developing applications on RDBMS, Teradata version 14 or 15
- 5 years' professional experience using 3 or more development languages or tools (e.g., C#, ASP.NET, J2EE application frameworks, Hadoop)
- 3 years of building Java tools and applications; 1 year of experience developing REST web services
- 2 years of building and coding applications using Hadoop components: HDFS, HBase, Hive, Sqoop, Flume, etc.
- 3 years implementing relational data models, with an understanding of traditional ETL tools and RDBMS
- 5 years' experience designing and developing applications on one operating system (Unix or Windows 2000) or designing complex multi-tiered applications
- 3 years of work experience as a developer is desirable, preferably in the wireless industry
- Experience working with at least 3 business applications/systems, including providing tier-4 production support
Education Requirements
- B.S. in Computer Science or Management Information Systems, or equivalent experience