Cloud Data Engineer Loc: Framingham, MA Dur: 12 M+ Rate:Open

Job Description:

A Softworld client is seeking to hire a Cloud Data Engineer to "build what the data goes into".

Responsibilities:

  • As a key member of the Cloud Platform Data Team, you will be responsible for designing and developing major database components of our next-generation cloud platform.
  • Build highly scalable data abstractions and infrastructure, such as key-value and secret storage
  • Design large graph infrastructure and query/search interfaces
  • Collect and process data streams at scale (including stream-processing and batch ETL)
  • Work closely with engineering teams to help build and maintain systems that support advanced analytics
  • Work with platform architects on software and system optimizations, helping to identify and remove potential performance bottlenecks
  • Stay up to date on relevant technologies, plug into user groups, understand trends and opportunities that ensure we are using the best techniques and tools

Required Experience/Skills:

  • Strong data systems and microservices development experience
  • Experience working with modern big data and machine learning platforms, cloud deployment models, and test-driven development in a fast-paced agile environment
  • Experience architecting, building, and maintaining (commercial or open source) software platforms and large-scale data infrastructure
  • Experience with cloud resource platforms such as Amazon AWS, Google Cloud Platform, Microsoft Azure
  • Experience building big data solutions using technologies such as Cassandra, HBase, Hadoop, Kafka, and Titan
  • Experience with a stream or micro-batch processing tool such as Storm or Spark
  • Strong knowledge of data technologies including SQL, Hive, and MapReduce
  • Familiarity operating in UNIX/Linux environments
  • A strong team player, able to quickly triage and troubleshoot complex problems

Desired Skills:

  • Experience with cloud solutions, APIs, IaaS and PaaS components; Amazon Web Services preferred
  • Interest in development tools (APIs, frameworks, and environments) for distributed applications, such as Spring, Elastic Beanstalk, and Cloud Foundry
  • Modern tools experience, such as Scala, Phoenix, Docker, Vagrant, Jenkins
  • Experience with Hadoop deployment and automation tools such as Ambari, Cloudbreak, EMR
  • Experience with securing Hadoop (Kerberos, Knox, Ranger, etc.) is a plus
  • Experience with traditional data warehouse systems and concepts