Hadoop Developer
Job Description:
(Hadoop/Hive/Spark/Python)
· Experience extracting data from a variety of sources, and a desire to expand those skills (working knowledge of SQL is required; Spark is required)
· Excellent data analysis skills; must be comfortable querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark
· Experience with object-oriented programming in Python and its design patterns
· Experience managing Unix systems for optimal hosting of enterprise web applications
· Expert in SQL, with a deep understanding of relational databases and strong experience in SQL performance tuning
· Strong understanding of Hadoop internals
· Expert in Hadoop: HDFS, Hive, MapReduce, HiveQL, Spark SQL
· Knowledge of scripting languages – Python
· Creating ETL pipelines using SQL, Python, Hive, and Spark to populate data models (a minimal sketch follows this list)
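As a rough illustration of the last requirement, the sketch below shows a minimal PySpark ETL job that queries a Hive table on HDFS with Spark SQL, aggregates the result, and writes it back as a modeled table. All database, table, and column names (raw.orders, analytics.customer_daily_totals, etc.) are hypothetical placeholders, not part of the role's actual data model.

```python
# Minimal PySpark ETL sketch (hypothetical table and column names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets Spark SQL read and write Hive tables stored on HDFS.
spark = (
    SparkSession.builder
    .appName("daily_orders_etl")
    .enableHiveSupport()
    .getOrCreate()
)

# Extract: query a (hypothetical) raw Hive table with Spark SQL.
raw_orders = spark.sql("""
    SELECT order_id, customer_id, order_ts, amount
    FROM raw.orders
    WHERE ds = '2024-01-01'
""")

# Transform: aggregate order amounts and counts per customer per day.
daily_totals = (
    raw_orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("order_id").alias("order_count"),
    )
)

# Load: populate the (hypothetical) analytics data-model table.
(daily_totals
    .write
    .mode("overwrite")
    .saveAsTable("analytics.customer_daily_totals"))

spark.stop()
```

The same pipeline could load into an existing partitioned table with insertInto instead of saveAsTable; the choice depends on how the target data model is managed.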
Key Skills:
- Hadoop/Hive/Spark/Python