
Snowflake DW Developer

FlairsTech
Maadi, Cairo
Posted 1 year ago
1 open position

Job Details

Experience Needed:
Career Level:
Education Level:
Salary:
Job Categories:

Skills And Tools:

Job Description

Snowflake Inc.’s Data Engineers will be responsible for creating very large-scale data analytics solutions based on the Snowflake Data Warehouse.  

The ability to design, implement, and optimize large-scale data and analytics solutions on Snowflake Cloud Data Warehouse is essential. Expertise with Amazon Redshift is a must. A Data Engineer at Snowflake is responsible for:

  • Implementing ETL pipelines within and outside of a data warehouse using Python and Snowflake's SnowSQL (a minimal sketch follows this list)
  • Querying Snowflake using SQL
  • Developing scripts in Unix, Python, etc. for loading, extracting, and transforming data
  • Working with Amazon Web Services (AWS) Redshift
  • Assisting with production issues in data warehouses, such as reloading data, transformations, and translations
  • Developing a database design and reporting design based on Business Intelligence and reporting requirements
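
To illustrate the first two responsibilities, here is a minimal sketch of a Python job that loads a staged file into Snowflake and then queries it through the snowflake-connector-python package. The connection parameters, stage, and table names are hypothetical placeholders, not details taken from this posting.

    import snowflake.connector  # pip install snowflake-connector-python

    # All connection parameters below are hypothetical placeholders.
    conn = snowflake.connector.connect(
        account="xy12345.eu-west-1",
        user="ETL_USER",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )

    try:
        cur = conn.cursor()

        # Load step: copy a staged CSV file into a raw table.
        # The stage (@RAW_STAGE) and table (ORDERS_RAW) are illustrative names.
        cur.execute("""
            COPY INTO ORDERS_RAW
            FROM @RAW_STAGE/orders.csv
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """)

        # Query step: plain SQL against Snowflake.
        cur.execute("""
            SELECT order_date, SUM(amount) AS daily_revenue
            FROM ORDERS_RAW
            GROUP BY order_date
            ORDER BY order_date
        """)
        for order_date, daily_revenue in cur.fetchall():
            print(order_date, daily_revenue)
    finally:
        conn.close()

In practice the same pattern scales out: the COPY INTO and transformation statements are parameterized and scheduled by an orchestrator, while SnowSQL is used for the same statements interactively.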

Data Engineers will work closely with data scientists. Working effectively with data platforms requires a deep understanding of data models, algorithms, and data transformation techniques. Engineers will be responsible for building ETL (extract, transform, load) tools, storage, and analytics tools, so experience with current ETL and BI solutions is essential.

Big data projects that use Kafka or Hadoop require more specific expertise. Data Engineers also need knowledge of machine learning libraries and artificial intelligence frameworks, including TensorFlow, Spark, PyTorch, and mlpack.

  • A solid understanding of data science concepts
  • Data analysis expertise
  • Working knowledge of ETL tools
  • Knowledge of BI tools
  • Experience with big data technologies such as Hadoop and Kafka
  • Extensive experience with ML frameworks and libraries, including TensorFlow, Spark, PyTorch, and mlpack (see the sketch after this list)
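
As a rough illustration of the Spark-based ML work mentioned above, the sketch below trains a simple logistic regression model with PySpark's MLlib. The input path, feature columns, and label column are hypothetical examples, not requirements from this posting.

    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("warehouse-ml-example").getOrCreate()

    # Hypothetical feature table exported from the data warehouse.
    df = spark.read.parquet("s3://example-bucket/customer_features/")

    # Assemble numeric columns (illustrative names) into a single feature vector.
    assembler = VectorAssembler(
        inputCols=["tenure_months", "monthly_spend", "support_tickets"],
        outputCol="features",
    )
    train_df = assembler.transform(df)

    # Fit a simple classifier; "churned" is a hypothetical 0/1 label column.
    model = LogisticRegression(featuresCol="features", labelCol="churned").fit(train_df)
    print(model.coefficients)

    spark.stop()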

Job Requirements

Snowflake Data Engineer Basic Qualifications

Snowflake Data Engineers are required to have the following qualifications:

  • A minimum of 1 year of experience designing and implementing a full-scale data warehouse solution based on Snowflake.
  • A minimum of three years' experience developing production-ready data ingestion and processing pipelines using Java, Spark, Scala, or Python.
  • At least 2 years of hands-on experience with complex data warehouse solutions on Teradata, Oracle, or DB2 platforms.
  • Expertise with Snowflake internals and with integrating Snowflake with other technologies for data processing and reporting.
  • Highly effective communication skills, both oral and written.
  • Problem-solving and architecting skills in cases of unclear requirements.
  • A minimum of one year of experience architecting large-scale data solutions, performing architectural assessments, examining architectural alternatives, and choosing the best solution in collaboration with both IT and business stakeholders.
  • Extensive experience with Talend, Informatica, and building data ingestion pipelines.
  • Expertise with Amazon Web Services, Microsoft Azure, and Google Cloud.

