Big Data Architect

Big Data, GCP, Kafka, ETL Tools, SQL
Description

We are looking for a Big Data Architect to join our global workforce. Our dynamic team offers valuable opportunities and a tangible career support system for your professional and personal development.

Who We Are

GSPANN has been in business for over a decade, has more than 1,800 employees worldwide, and serves some of the largest retail, high-technology, and manufacturing clients in North America. We provide an environment that enables career growth while still interacting with company leadership.

Visit Why GSPANN for more information.

Location: Hyderabad / Gurugram / Pune
Role Type: Full Time
Published On: 20 December 2022
Experience: 10+ Years
Role and Responsibilities
  • Participate in architecture design, support projects, and review information elements, including models, glossaries, flows, and data usage.
  • Provide guidance to the team in achieving the project goals/milestones.
  • Contribute to multiple delivery teams, define best practices, build reusable components, develop team capability, align with industry trends, and actively engage with the wider data community.
  • Build real-time and batch ETL/ELT solutions using open-source technologies such as Spark, Flink, Storm, and Kafka Streams.
  • Create data models (ER and dimensional models) to help data consumers build a high-performance consumption layer.
  • Investigate new technologies, data modeling methods, and information management systems to determine which ones should be incorporated into data architecture and develop implementation timelines and milestones.
  • Create or support production software/systems by identifying and resolving performance bottlenecks for production systems.
  • Run multiple projects simultaneously following both waterfall and Agile methodologies.
  • Maintain data quality by introducing data governance/validation framework.
Skills and Experience
  • Prior experience with database concepts such as Online Transaction Processing (OLTP), Online Analytical Processing (OLAP), Star and Snowflake schemas, normalization and denormalization, etc.
  • Expertise in SAP, Talend, and Apache NiFi will be advantageous.
  • Familiarity with open-source workflow management software such as Airflow or Oozie.
  • Hands-on experience managing an engineering team and owning architecture best practices.
  • Excellent coding experience with Big Data technologies such as Hadoop, Kafka, Hive, Spark, Flink, and Storm.
  • Experience with any two of the major cloud platforms: GCP, AWS, or Azure.
  • Good understanding of cloud data lake implementation, preferably on GCP.
  • Expert in Python and SQL.


Apply Now