Snowflake Developer/Lead

Snowflake, Spark, PySpark, Java, Python, Scala, Cloud Technologies
Description

GSPANN is looking for a Snowflake Developer/Lead who is determined to solve the organization’s most challenging problems. Join our global workforce and unleash a pool of opportunities.

Who We Are

GSPANN has been in business for over a decade, has more than 1,800 employees worldwide, and serves some of the largest retail, high-technology, and manufacturing clients in North America. We provide an environment that enables career growth while still letting you interact with company leadership.

Visit Why GSPANN for more information.

Role and Responsibilities
  • Develop the continuous integration/continuous delivery (CI/CD) environment using tools such as Jenkins and CircleCI.
  • Maintain the build and deployment process using build integration tools.
  • Design instrumentation into code and integrate it with logging and analysis tools such as log4Python, New Relic, SignalFx, and/or Splunk.
  • Conduct knowledge-sharing sessions and publish case studies. Take accountability for maintaining program or project documents in a knowledge base repository.
  • Identify accelerators and innovations and understand complex interdependencies to identify the right team composition for delivery.
  • Develop relevant big data/ETL data warehouse solutions to build cloud-native data pipelines.
  • Work with REST and SOAP-based APIs to extract data for data pipelines (a sketch of a REST extraction follows this list).
  • Conduct performance and scalability tuning.
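
For illustration, a minimal sketch of pulling records from a paginated REST API for a downstream pipeline; the endpoint, resource name, pagination scheme, and authentication are placeholders and not details of any specific client system:

    import requests

    def fetch_records(base_url: str, token: str) -> list[dict]:
        """Page through a REST endpoint and collect every record."""
        records, page = [], 1
        while True:
            response = requests.get(
                f"{base_url}/orders",                       # hypothetical resource
                params={"page": page},
                headers={"Authorization": f"Bearer {token}"},
                timeout=30,
            )
            response.raise_for_status()
            batch = response.json().get("results", [])
            if not batch:                                   # an empty page means we are done
                break
            records.extend(batch)
            page += 1
        return records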
Skills and Experience
  • Hands-on experience with Snowflake architecture (access control, provisioning).
  • Good understanding of data transformation and processing using dbt (Data Build Tool).
  • SnowPro Data engineering certification and prior experience in Teradata and Snowflake would be an advantage.
  • Expertise in source control, merging strategies, and coding standards, specifically Bitbucket/Git and deployment through Jenkins pipelines.
  • Should be comfortable communicating with business stakeholders and architects.
  • Experience in PySpark, Scala, Java, and SQL, along with object-oriented and functional programming experience in Python.
  • Extensive experience working with Hadoop and related processing frameworks, such as Spark, Hive, and Sqoop.
  • Must have experience with a public cloud environment, particularly AWS.
  • Implement solutions with Amazon Virtual Private Cloud (VPC), EC2, AWS Data Pipeline, AWS CloudFormation, Auto Scaling, Amazon S3 (Simple Storage Service), EMR, Athena, Hive, and other AWS products.
  • Comfortable working with real-time data streams and the Kafka platform (see the PySpark streaming sketch after this list).
  • Good grasp of workflow orchestration tools like Apache Airflow, including designing and deploying DAGs (see the DAG sketch after this list).
  • Expertise in Agile/Scrum application development using Jira.
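
As a point of reference, a minimal PySpark Structured Streaming sketch that reads a Kafka topic; the broker address, topic name, and console sink are placeholders for demonstration only, and the spark-sql-kafka package must be available on the cluster:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = (
        SparkSession.builder
        .appName("kafka-stream-sketch")  # illustrative application name
        .getOrCreate()
    )

    # Read the Kafka topic as a streaming DataFrame and decode key/value as strings.
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "orders")                      # placeholder topic
        .load()
        .select(col("key").cast("string"), col("value").cast("string"))
    )

    # Write the decoded stream to the console purely for demonstration.
    query = events.writeStream.format("console").outputMode("append").start()
    query.awaitTermination()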
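
Similarly, a minimal Apache Airflow DAG sketch (Airflow 2.x style); the DAG id, schedule, and load_daily_orders callable are illustrative assumptions rather than part of any existing pipeline:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def load_daily_orders(**context):
        # Placeholder for the actual extract/load logic of the pipeline.
        print(f"Loading orders for {context['ds']}")

    with DAG(
        dag_id="daily_orders_pipeline",    # hypothetical DAG name
        start_date=datetime(2022, 12, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_orders = PythonOperator(
            task_id="load_orders",
            python_callable=load_daily_orders,
        )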

Key Details

Location: Hyderabad / Gurugram / Pune
Role Type: Full Time
Published On: 15 December 2022
Experience: 5+ Years

Apply Now