Snowflake Data Engineer

Exusia
Full-time
Remote
Data Science and Analytics

Department:                          Sales and Delivery Team - Empower

Industry:                                 Information Technology & Services, Computer Software, Management Consulting

Location:                                 WFH/ India Remote

Experience Range:                6 - 10 years

Basic Qualification:               Bachelor of Engineering or Equivalent

Travel Requirements:           Not required

Website:                                 www.exusia.com

 

Exusia, a cutting-edge digital transformation consultancy, is looking for top talent in the DWH & Data Engineering space with specific skills in Snowflake/Python/DBT to join our global delivery team's Industry Analytics practice in India.


What’s the Role?

 

·        A full-time role working with Exusia's clients in the United States to deliver bleeding-edge, data-driven solutions

·        Develop and manage large-scale data pipelines and data repositories

·        Collaborate with Product Owners and Solution Architects to develop optimized data engineering solutions

 

Criteria for the Role!

 

·        Minimum of 6 years of experience working as a Data Engineer

·        Minimum of 2 years of experience with Snowflake and DBT

·        Master of Science (preferably in Computer and Information Sciences or Business Information Technology) or an Engineering degree in the above areas

·        Excellent communication skills, with the ability to work directly with business stakeholders; a creative, flexible, and proactive problem solver with an eagerness to learn new tools and technologies

 

Responsibilities

 

·      Implement end-to-end data pipelines to move data from source systems into a data lake or data warehouse

·      Build pipeline automation and orchestration processes

·      Develop Snowflake data models (e.g., star schema, snowflake schema) for optimized query performance

·      Work with Data Analysts to ensure pipelines are tested and optimized to provide accurate and timely data

·      Work in an agile software delivery model and manage changing requirements and priorities during the SDLC

Mandatory Skills

 

·      Experience developing and maintaining Snowflake data models (e.g., star schema, snowflake schema) for optimized query performance

·      Experience creating and maintaining Snowpipe and SnowSQL scripts for data loading, transformation, and retrieval

·      Proficiency in SQL for data manipulation, transformation, and processing

·      Expertise in DBT to develop modular, scalable, and well-documented data pipelines

·      Strong Python programming experience to support data processing and automation

·      Hands-on experience with Airflow for orchestrating data pipelines

·      Knowledge of cloud platforms, specifically storage and databases, to source or stage data for Snowflake

·      Problem-solving skills, with the ability to work with large datasets and debug data pipeline issues

 

Nice-to-Have Skills

 

·      Understanding of data modeling and legacy ETL technologies

·      Prior migration experience moving on-premises legacy databases to Snowflake

·      Knowledge of Spark and Spark-based data processing tools

·      Exposure to one or more cloud platforms: Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP)

·      Exposure to data governance aspects such as metadata management, data dictionaries, data glossaries, and data lineage