Java/Hadoop Developer
Dearborn, MI
Job Description:
Hi,
Greetings from Smart Folks!
My name is Krishna. We have a job opportunity for you as a Java/Hadoop Developer with our client based in Dearborn, MI. Please find the job description below. If you are available and interested, please send a Word copy of your resume with the following details to sai@smartfolksinc.com, or call me on 469-888-5469.
Client: Ford Direct
Job Title: Java/Hadoop Developer
Location: Dearborn, MI
Duration: 6 months
Must Have Skills:
1. Java
2. J2EE, Web Applications, Tomcat (or any equivalent app server), RESTful Services, JSON
3. Spring, Spring Boot, Struts, Design Patterns
4. Hadoop (preferably Cloudera (CDH)), HDFS, Hive, Impala, Spark, Oozie, HBase
5. Scala
6. SQL
7. Linux
Nice to Have:
8. Google Analytics, Adobe Analytics
9. Python, Perl
10. Flume, Solr
11. Strong Database Design Skills
12. ETL Tools
13. NoSQL databases (Mongo, Couchbase, Cassandra)
14. JavaScript UI frameworks (Angular, NodeJS, Bootstrap)
Detailed Job Description:
Candidates should have at least 7-8 years of total experience, with at least 5 years in Hadoop development.
Job Summary: The Java/Hadoop Developer position will provide expertise in a wide range of technical areas, including but not limited to: the Cloudera Hadoop ecosystem, Java, collaboration toolset integration using SSO, configuration management, hardware and software configuration and tuning, software design and development, and the application of new technologies and languages aligned with other FordDirect internal projects.
Essential Job Functions:
1. Design and development of data ingestion pipelines.
2. Perform data migration and conversion activities.
3. Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns and taking into account critical performance characteristics and security measures.
4. Collaborate with Business Analysts, Architects, and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).
5. Perform end-to-end automation of the ETL process for the various datasets ingested into the big data platform.
Other Responsibilities:
1. Document and maintain project artifacts.
2. Suggest best practices and implementation strategies using Hadoop, Java, and ETL tools.
3. Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.
4. Other duties as assigned.
Minimum Qualifications and Job Requirements:
• Must have a Bachelor’s degree in Computer Science or a related IT discipline.
• Must have at least 5 years of IT development experience.
• Must have strong, hands-on J2EE development experience.
• Must have in-depth knowledge of Scala/Spark programming.
• Must have 3+ years of relevant professional experience working with Hadoop (HBase, Hive, MapReduce, Sqoop, Flume), Java, JavaScript, .Net, SQL, Perl, Python, or an equivalent scripting language.
• Must have experience with ETL tools
• Must have experience integrating web services
• Knowledge of standard software development methodologies such as Agile and Waterfall
• Strong communication skills.
• Must be willing to flex work hours as needed to support application launches and manage production outages if necessary.
Specific Knowledge, Skills and Abilities:
• Ability to multitask across numerous projects and responsibilities
• Experience working with JIRA and WIKI
• Must have experience working in a fast-paced, dynamic environment.
• Must have strong analytical and problem-solving skills.
• Must have excellent verbal and written communication skills.
• Must be able and willing to participate as an individual contributor as needed.
• Must have the ability to work the time necessary to complete projects and/or meet deadlines.