Data Engineering Intern
QUANT - Riyadh, Saudi Arabia
Job Description
Quant is a prominent consultancy that provides services and products in Data Science, Artificial Intelligence, and Business Intelligence. Our aim is to empower aspiring businesses and dynamic governments to become data-driven, allowing them to optimize operations, enhance efficiency, and improve decision making.
The Data Engineer is a core team member who enables Quant to collect, store, process, and analyze vast data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.
Summarized Job Responsibilities:
1. Design, construct, install, test and maintain highly scalable data management systems
2. Ensure systems meet business requirements and industry practices
3. Build high-performance algorithms, prototypes, predictive models, and proofs of concept
4. Research opportunities for data acquisition and new uses for existing data
5. Develop data set processes for data modeling, mining, and production
6. Integrate new data management technologies and software engineering tools into existing structures
7. Create custom software components (e.g. specialized UDFs) and analytics applications (see the UDF sketch after this list)
8. Employ a variety of languages and tools (e.g. scripting languages) to integrate systems together
9. Implement and update disaster recovery procedures
10. Recommend ways to improve data reliability, efficiency, and quality
11. Collaborate with data architects, modelers, and IT team members on project goals
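For illustration only, and not taken from the posting itself, the snippet below sketches the kind of specialized UDF mentioned in responsibility 7, using PySpark; the column and function names are hypothetical.

    # A minimal PySpark sketch of a specialized UDF; names are illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import udf
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.appName("udf-sketch").getOrCreate()

    def normalize_phone(raw):
        # Keep digits only so downstream jobs see a single consistent format.
        return "".join(ch for ch in raw if ch.isdigit()) if raw else None

    normalize_phone_udf = udf(normalize_phone, StringType())

    df = spark.createDataFrame([("+966 11 123 4567",)], ["phone"])
    df.withColumn("phone_clean", normalize_phone_udf("phone")).show()

Registering the same function with spark.udf.register would also make it callable from Spark SQL queries.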
Requirements
Academic Prerequisites:
Bachelor's and/or Master's degree in Computer Science or other suitable majors.
Personal Qualities Required:
1. Agile
2. Critical and logical thinker
3. Innovative
4. Pragmatic and passionate personality
5. Strong commitment and initiative taking
6. Emotionally intelligent and strong social skills
7. Aptitude to work in a diverse environment
8. Full proficiency in English and Arabic
9. Efficient time and task management
Technical Skills Required:
1. Capable in programming, with a solid understanding of programming languages
2. Deep understanding of various database structures, concepts, uses and practices
3. Working knowledge of PostgreSQL and MySQL, and expert knowledge of SQL
4. Experienced with data architecture
5. Desire and ability to continuously learn new technologies and skills
6. Experienced in building data pipelines and data stream workflows (a minimal pipeline sketch follows this list)
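As a hedged illustration of item 6, the sketch below shows a small batch pipeline that extracts rows from PostgreSQL, cleans them with Pandas, and loads them back; the connection string, table, and column names are assumptions made for the example.

    # Minimal extract-transform-load sketch; connection details and table
    # names are hypothetical placeholders, not details from the posting.
    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/warehouse")

    # Extract: read the raw events table into a DataFrame.
    raw = pd.read_sql("SELECT id, event_time, amount FROM raw_events", engine)

    # Transform: drop duplicate ids and fill missing amounts with zero.
    clean = raw.drop_duplicates(subset="id").fillna({"amount": 0})

    # Load: write the cleaned rows to a reporting table.
    clean.to_sql("clean_events", engine, if_exists="replace", index=False)

In production the same steps would typically be scheduled and monitored by an orchestrator rather than run as an ad-hoc script.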
Programming Languages and Software Knowledge Required:
1. Python, R, Pandas and other data processing tools/languages
2. Languages such as Bash, Awk, JavaScript, Java, Objective-C, or Swift
3. NoSQL databases such as Elasticsearch and MongoDB
4. Stream-processing systems such as Storm or Spark Streaming (see the streaming sketch after this list)
5. SQL Server T-SQL Stored Procedures, Functions, and Triggers
6. SQL Server Management Studio, Visual Studio IDE, SQL Command, MySQL Workbench, and pgAdmin
7. SSIS Import/Export and other utilities, such as backup and bulk import
8. ETL solutions for various data sources such as PostgreSQL, MongoDB, APIs, and flat files
9. SAS
10. Visualization Tools such as Tableau and Power BI
11. Expertise in using cloud platforms (e.g. Azure and AWS)
12. Analysis tools such as Alteryx
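The following sketch illustrates item 4 with Spark Structured Streaming, counting words that arrive on a local socket; the host and port are assumptions for local testing, not part of the role's actual stack.

    # Structured Streaming word count; run `nc -lk 9999` locally to feed it.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import explode, split

    spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

    # Read a stream of text lines from a socket source.
    lines = (spark.readStream.format("socket")
             .option("host", "localhost")
             .option("port", 9999)
             .load())

    # Split each line into words and keep a running count per word.
    words = lines.select(explode(split(lines.value, " ")).alias("word"))
    counts = words.groupBy("word").count()

    # Print the running counts to the console until the query is stopped.
    query = counts.writeStream.outputMode("complete").format("console").start()
    query.awaitTermination()

The same pattern applies when the source is Kafka or another streaming system rather than a socket.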