Role: Big Data ETL Developer
Location: Chevy Chase, MD
Duration: 3 months
Job Description:
The client is looking for two resources to develop ETL/ELT jobs on the Hadoop platform using Spark and other Big Data technologies. Other responsibilities include designing the needed transformations, mapping data from source to target for jobs on the platform, developing and using frameworks for data transformation, and creating and scheduling jobs with Oozie and other job coordinators.
Essential Skills:
- 3 years of hands-on experience with the Hadoop ecosystem (HDFS AND YARN AND MapReduce AND Oozie AND Hive)
- 2 years of hands-on experience with Spark Core, Scala AND Spark SQL
Plusses:
- Experience with any one of the ETL tools such as Ab Initio, Talend, or Kettle
- Hadoop OR NoSQL performance optimization and benchmarking using tools such as HiBench OR YCSB
- Experience with continuous build and test processes using tools such as Gradle or Maven AND Jenkins
- Understanding of Kafka AND Spark Streaming
- Experience with Graph Databases