Job Description
- Develop, construct, test, and maintain architectures such as databases and large-scale processing systems
- Collaborate with data architects, modelers, and IT team members on project goals
- Implement complex, scalable, and robust data pipelines to move data across systems (a minimal Python sketch follows this list)
- Ensure systems meet business requirements and follow industry best practices
- Integrate new data management technologies and software engineering tools into existing structures
- Create custom software components and analytics applications
- Leverage knowledge of database design and data processing to organize data as needed
- Perform periodic system improvements and upgrades to optimize performance
- Maintain security and data privacy in all data-related activities in accordance with best practices and legal requirements
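
The pipeline responsibilities above usually reduce to an extract-transform-load step. The sketch below is illustrative only and not part of the posting: it uses Python (required below) with in-memory SQLite databases standing in for a hypothetical source system and warehouse, and the table and column names are invented for the example.

```python
import sqlite3


def extract(conn):
    """Pull raw order rows from the (hypothetical) source system."""
    return conn.execute(
        "SELECT order_id, amount_cents, created_at FROM orders"
    ).fetchall()


def transform(rows):
    """Convert cents to a decimal amount and drop rows with missing amounts."""
    return [(oid, cents / 100.0, ts) for oid, cents, ts in rows if cents is not None]


def load(conn, rows):
    """Write the cleaned rows into a warehouse-style fact table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, created_at TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO fact_orders VALUES (?, ?, ?)", rows)
    conn.commit()


if __name__ == "__main__":
    # In-memory databases stand in for real source and target systems,
    # so the sketch runs end to end without any external setup.
    src = sqlite3.connect(":memory:")
    dst = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE orders (order_id INTEGER, amount_cents INTEGER, created_at TEXT)")
    src.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(1, 1250, "2024-01-01"), (2, None, "2024-01-02")],
    )
    load(dst, transform(extract(src)))
    print(dst.execute("SELECT * FROM fact_orders").fetchall())
```

A production version would replace the SQLite connections with the organization's actual source and target systems and run the step under a workflow scheduler.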
Job Requirements
- Proven experience as a Data Engineer, Software Developer, or similar role.
- Proficiency in the Python programming language.
- Strong knowledge of the Linux operating system.
- Excellent understanding of database design principles.
- Extensive experience with data pipeline and workflow management tools, specifically Apache Airflow (see the DAG sketch after this list).
- Experience with big data tools such as PySpark or Hadoop.
- Ability to analyze, organize, and visualize data to provide meaningful insights.
- Experience with cloud services such as AWS or Azure is preferred.
- Knowledge of data warehousing solutions is preferred.
- Familiarity with machine learning algorithms and data science techniques is a plus.
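
Since the requirements call out Apache Airflow by name, the sketch below shows the shape of a minimal DAG, assuming Airflow 2.x and the built-in PythonOperator. The DAG id, task names, and callables are hypothetical placeholders, not something specified by this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from a hypothetical source system.
    print("extracting")


def transform():
    # Placeholder: clean and reshape the extracted data.
    print("transforming")


def load():
    # Placeholder: write the result to a hypothetical warehouse.
    print("loading")


with DAG(
    dag_id="example_daily_etl",       # hypothetical name for this sketch
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # The dependency chain defines the order in which Airflow runs the tasks.
    extract_task >> transform_task >> load_task
```

In an interview setting, candidates are often expected to explain choices like scheduling intervals, catchup behavior, and how task dependencies map onto the pipeline steps described in the responsibilities above.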