Big Data Engineers and Leads

Big Data, Spark, PySpark, Java, Python, AWS, GCP, Azure, Scala
Description

GSPANN is looking for Big Data Engineers and Leads. Our culture fosters individual initiative and excellence. Join our global workforce and unlock a wealth of opportunities.

Location: Anywhere in India
Role Type: Full Time
Published On: 10 October 2020
Experience: 1-10 Years
Role and Responsibilities
  • Develop high-quality, scalable, high-volume data pipelines using Big Data technologies such as Spark, with Java or Python. 
  • Conduct unit and system integration testing and fix issues. 
  • Analyze source and target system data, and map transformations that meet the requirements. 
  • Interact with clients and onsite teams during different phases of a project. 
  • Collaborate with the team to explore and learn the existing systems. 
  • Maintain a high standard of code quality and unit test coverage. 
  • Coordinate and collaborate with business stakeholders, architects, and other teams (such as UX, QA, and DevOps). 
  • Participate in daily scrum, sprint planning, reviews, demos, retrospectives, and grooming sessions. 
  • Provide daily and weekly updates along with the corresponding status reports to different teams. 
  • Conduct performance and scalability tuning.
Skills and Experience
  • Experience in developing Big Data/ETL data warehouse solutions and building cloud-native data pipelines. 
  • Prior experience in Agile/Scrum development using Jira. 
  • Sound knowledge of Hive, Spark, Scala, Java/Python, and SQL. 
  • Prior experience in object-oriented and functional programming using Python. 
  • Good understanding of REST and SOAP-based APIs to extract data for data pipelines. 
  • Expertise in Hadoop and related processing frameworks. 
  • Prior experience in working in a public cloud environment, i.e., GCP, AWS, or Azure. 
  • Hands-on experience in working with real-time data streams and the Kafka platform. 
  • Good knowledge of workflow orchestration tools, such as Apache Airflow, to design and deploy Directed Acyclic Graphs (DAGs).


Apply Now