Big Data Architect

Big Data, GCP, Kafka, ETL Tools, SQL
Description

We are looking for a Big Data Architect to join our global workforce. Our dynamic team offers valuable opportunities and a tangible career support system for your professional and personal development.

Who We Are

GSPANN has been in business for over a decade, has over 2,000 employees worldwide, and serves some of the largest retail, high-technology, and manufacturing clients in North America. We provide an environment that enables career growth while still interacting with company leadership.

Visit Why GSPANN for more information.

Role and Responsibilities
  • Participate in architecture design, support projects, and review information elements, including models, glossaries, flows, and data usage.
  • Guide the team in achieving project goals/milestones.
  • Contribute to multiple delivery teams, define best practices, build reusable components, build capabilities, align with industry trends, and actively engage with the wider data community.
  • Build real-time and batch ETL/ELT solutions using open-source technologies such as Spark, Flink, Storm, and Kafka streaming (see the sketch after this list).
  • Create data models (ER and dimensional models) to help data consumers build a high-performance consumption layer.
  • Investigate new technologies, data modeling methods, and information management systems to determine which ones should be incorporated into data architecture and develop implementation timelines and milestones.
  • Create or support production software and systems by identifying and resolving performance bottlenecks.
  • Run multiple projects simultaneously following both waterfall and Agile methodologies.
  • Maintain data quality by introducing a data governance/validation framework.
  • Manage the engineering team and own architecture best practices.
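
To give a flavor of the streaming ETL/ELT work described in this list, the following is a minimal, illustrative sketch in PySpark Structured Streaming. It assumes a hypothetical local Kafka broker (localhost:9092), a hypothetical topic named "orders", and an assumed JSON event schema; it is not a prescribed implementation for the role.

    # Minimal streaming ETL sketch: Kafka -> parse JSON -> Parquet.
    # Assumes the spark-sql-kafka connector is available on the classpath.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

    spark = SparkSession.builder.appName("orders-streaming-etl").getOrCreate()

    # Schema of the incoming JSON events (assumed for illustration).
    schema = (StructType()
              .add("order_id", StringType())
              .add("amount", DoubleType())
              .add("event_time", TimestampType()))

    # Read raw events from Kafka as a streaming DataFrame.
    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "localhost:9092")
           .option("subscribe", "orders")
           .load())

    # Parse the JSON payload and keep only the fields the consumption layer needs.
    parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
              .select(from_json(col("json"), schema).alias("data"))
              .select("data.*"))

    # Write the curated stream to Parquet, with checkpointing for fault tolerance.
    query = (parsed.writeStream
             .format("parquet")
             .option("path", "/tmp/orders_curated")
             .option("checkpointLocation", "/tmp/orders_checkpoint")
             .outputMode("append")
             .start())

    query.awaitTermination()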
Skills and Experience
  • Prior experience with database concepts such as Online Transaction Processing (OLTP), Online Analytical Processing (OLAP), star and snowflake schemas, and normalization and denormalization is preferred.
  • Expertise in SAP, Talend, and Apache NiFi will be advantageous.
  • Should be familiar with open-source workflow management software such as Airflow or Oozie (see the sketch after this list).
  • Excellent coding experience in big data technologies like Hadoop, Kafka, Hive, Spark, Flink, Storm, etc., is desirable.
  • Should be experienced in any two of the cloud services: GCP, AWS, or Azure.
  • Must have a good understanding of cloud data lake implementation, preferably GCP.
  • Expertise in Python and SQL is mandatory.
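
As an illustration of the workflow-orchestration experience mentioned in this list, here is a minimal Airflow DAG sketch, assuming Airflow 2.x; the DAG id, schedule, and task callables are hypothetical placeholders, not part of the role description.

    # Minimal Airflow 2.x DAG sketch: a daily extract -> load pipeline.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Placeholder: pull raw data from a source system.
        print("extracting")


    def load():
        # Placeholder: load transformed data into the warehouse.
        print("loading")


    with DAG(
        dag_id="daily_etl_example",
        start_date=datetime(2023, 8, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Simple linear dependency: extract before load.
        extract_task >> load_task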

Key Details

Location: Hyderabad / Pune / Gurugram
Role Type: Full Time
Published On: 10 August 2023
Experience: 10+ Years

Apply Now