Big Data Developer
Phoenix, AZ (On-Site)
Job Description:
Role: Big Data Developer with GCP
Location: Phoenix, AZ – Day 1 onsite
Job Type: Contract
Key Responsibilities:
Design and Development:
- Design and implement scalable, high-performance data processing solutions using GCP services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage); see the pipeline sketch after this list.
- Develop, maintain, and optimize ETL pipelines and data integration workflows.
- Ensure data quality and integrity throughout the data lifecycle.
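For context on the services named above, here is a minimal sketch of the kind of streaming pipeline this role would build with the Apache Beam Python SDK, reading from Pub/Sub and writing to BigQuery. All project, topic, bucket, and table names are hypothetical placeholders, not part of the actual environment.

```python
# Minimal sketch only: project ID, topic, bucket, and table are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    streaming=True,
    runner="DataflowRunner",
    project="my-project",                 # hypothetical project ID
    region="us-central1",
    temp_location="gs://my-bucket/tmp",   # hypothetical staging bucket
)

with beam.Pipeline(options=options) as p:
    (
        p
        # Read raw event messages from a Pub/Sub topic.
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")
        # Decode each message from JSON bytes to a dict.
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        # Drop records missing the fields downstream queries rely on.
        | "FilterValid" >> beam.Filter(
            lambda row: "event_id" in row and "event_ts" in row)
        # Append the cleaned rows to a BigQuery table.
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```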
Data Architecture:
- Develop and manage data architectures that support analytical and reporting needs.
- Implement best practices for data storage, partitioning, and distribution to optimize performance; see the table-definition sketch after this list.
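As a concrete illustration of the partitioning bullet, the sketch below creates a day-partitioned, clustered BigQuery table with the google-cloud-bigquery client; the project, dataset, and field names are hypothetical.

```python
# Minimal sketch only: project, dataset, and field names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("payload", "STRING"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)
# Partition by day on the event timestamp so date-filtered queries prune old data.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
# Cluster by customer_id so per-customer scans read fewer blocks.
table.clustering_fields = ["customer_id"]

client.create_table(table)
```

Day partitioning lets queries prune by date, and clustering co-locates rows that share a customer_id, reducing bytes scanned (and cost) on selective queries.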
Collaboration and Communication:
- Collaborate with data engineers, data scientists, and business stakeholders to understand data requirements and translate them into technical solutions.
- Provide technical guidance and support to team members on GCP best practices.
Performance Optimization:
- Optimize and troubleshoot complex SQL queries, data processing jobs, and GCP services for performance and cost efficiency; see the dry-run sketch after this list.
- Monitor and manage data pipeline performance, ensuring minimal downtime and high availability.
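A common cost-efficiency technique for the optimization work above is a BigQuery dry run, which validates a query and reports the bytes it would scan without executing (or billing) it. A minimal sketch, with a hypothetical query and table:

```python
# Minimal sketch only: the query and table are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

sql = """
    SELECT customer_id, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY customer_id
"""

# dry_run=True plans the query without executing it; disabling the cache
# keeps the byte estimate honest.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(sql, job_config=job_config)

print(f"Query would scan {job.total_bytes_processed / 1e9:.2f} GB")
```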
Security and Compliance:
- Implement data security and privacy measures, ensuring compliance with relevant regulations and company policies.
- Manage access controls and permissions for GCP data services; see the access-control sketch after this list.
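A minimal sketch of dataset-level access management with the BigQuery client, granting read-only access to a group; the project, dataset, and group address are hypothetical.

```python
# Minimal sketch only: project, dataset, and group address are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
dataset = client.get_dataset("my-project.analytics")

# Grant read-only access to an analyst group without disturbing other entries.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="groupByEmail",
        entity_id="analysts@example.com",   # hypothetical group
    )
)
dataset.access_entries = entries

# Persist only the access_entries field of the dataset.
client.update_dataset(dataset, ["access_entries"])
```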
Automation and CI/CD:
- Develop and maintain CI/CD pipelines for data workflows and applications; see the pipeline-test sketch after this list.
- Automate repetitive tasks and processes to improve efficiency and reduce manual effort.
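CI/CD for data workflows typically runs unit tests on every commit. A minimal sketch using Beam's TestPipeline utilities; the transform under test is a hypothetical stand-in:

```python
# Minimal sketch only: parse_and_filter is a hypothetical transform under test.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


def parse_and_filter(events):
    """Hypothetical transform: keep only events that carry an event_id."""
    return events | beam.Filter(lambda row: "event_id" in row)


def test_parse_and_filter_drops_invalid_rows():
    # TestPipeline runs the transform locally and verifies its output.
    with TestPipeline() as p:
        rows = p | beam.Create([
            {"event_id": "a1", "value": 1},
            {"value": 2},  # missing event_id; should be dropped
        ])
        assert_that(parse_and_filter(rows),
                    equal_to([{"event_id": "a1", "value": 1}]))
```

A test like this runs under pytest in any CI system (e.g., Jenkins or Cloud Build) without touching real GCP resources.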
Qualifications:
Education:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Experience:
- 6+ years of experience in big data development, with a strong focus on GCP.
- Proven experience with GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Hands-on experience with ETL processes, data modeling, and data warehousing.
- Proficiency in programming languages such as Python, Java, or Scala.
- Strong SQL skills and experience with query optimization.
Skills:
- Expertise in building and maintaining data pipelines using tools like Apache Beam, Apache Spark, or similar.
- Familiarity with data security and governance practices on GCP.
- Experience with version control systems (e.g., Git) and CI/CD tools (e.g., Jenkins, Cloud Build).
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Key Skills:
- GCP
- Hadoop