Big Data Architect

Dallas, TX


Face-to-face interview on 4th May

Flexible to Relocate

Job Description:

The primary function of this position is the development of sophisticated solutions for various customers using Big Data technologies and related platforms. The Big Data Architect will take technical leadership of and ownership for the Big Data competency, and will help the Big Data practice lead at TekLink with pre-sales, marketing, team skills building, and delivery of Big Data technology solutions.

The candidate will be expected to architect solutions in cloud deployments of big data platforms that integrate data from various sources and position it for analytics, reporting, visualization, and quick access by online applications. This position will provide technical leadership and mentoring, working with a team of junior Big Data developers on design, development, deployment, and systems integration activities. In addition, the Big Data Architect is expected to do hands-on coding and to develop/implement design patterns and other best practices.

Required Skills:

  • 12+ years of experience in the IT industry
  • 2+ years of hands-on experience working with Big Data/Hadoop architecture

  • Experience building and maintaining Hadoop clusters in a multi-tenant environment
  • Work experience with the Cloudera, Hortonworks, or MapR Hadoop distributions is preferred
  • Proficiency with HDFS, MR2, Hive, HBase, Sqoop, Oozie, Big SQL, Kafka, Spark, and SQL
  • Hands-on experience "commercializing" Hadoop applications (e.g., administration, security, configuration management, monitoring, debugging, and performance tuning)
  • In-depth experience working with cloud computing infrastructure

  • Experience with any of the following technologies (or similar) is a plus:
      • Deep knowledge of search engines such as Apache Lucene and Solr/Elasticsearch
      • Knowledge of data structures, modeling standards, and machine learning frameworks
  • Working experience with Linux shell scripting and Windows PowerShell scripting
  • Experience working in an Agile Scrum environment
  • Skills and experience managing an onsite/offshore team of 10 or more in delivering software development services

Responsibilities:

  • Take end-to-end, bottom-line responsibility for project delivery on the Big Data track, including building the project team, scheduling assignments, monitoring, reviewing, and reporting project status, managing project risks, and managing production releases
  • Big data analysis, data management, data migration, data governance, and handling of data warehouse workloads (raw data, atomic data store, data mart) and big data workloads (e.g., event-based log data)
  • Solution design, data and technical architecture, component design, technical oversight, and data governance
  • Solution and Design: Provide solutions and prepare high-level and detailed designs for implementing the specified requirements
  • Build: Perform development activities
  • Support System Testing: Provide support for System Integration Testing
  • Support Performance Testing: Provide support for Performance Testing
  • Technical Code Review: Perform technical code reviews and ensure high-quality deliverables
  • UAT Support: Provide support for User Acceptance Testing
  • Release Management: Contribute to release management; prepare the production deployment plan, communicate it to all stakeholders, and ensure production activities are performed successfully
  • Project Scheduling: Contribute to preparing a detailed project schedule and manage any changes to ensure on-time delivery of the project
  • People Management: Allocate activities to the right people, plan for and mentor the team in competency development, and drive processes related to rewards & recognition and team building


Key Skills:

  • Big Data Architect, Hadoop Architecture
