Job Details
Experience Needed:
Career Level:
Education Level:
Salary:
Job Categories:
Skills And Tools:
Job Description
Responsibilities
- Work as an effective member of the DWH & Big Data teams to develop and deliver the proposed data architecture
- Help create logical and physical data models for the required data structures, using the client's modelling standards and standard ERwin modelling tools
- Effectively apply the process of data mapping between source systems and the EDW, and from the EDW to different integrated systems
Data Transformation (ETL):
- Develop and implement data integration and ETL workflows and jobs between data sources and the EDW, and from the EDW to different integrated systems
- Adhere to best practices for data integration and transformation
Data Consumption:
- Develop data feeds from the EDW to serve the data needs of different applications
- Implement the best approaches for integrating application data with the EDW
Data Development:
- Put the Hadoop and big data technology stack to work and develop the data applications needed for better data development
- Develop and implement data integration and distribution across Hadoop nodes
- Apply best-practice techniques for moving data to and from Hadoop
Controls
- Fully adhere to a set of guiding principles for data warehousing and big data
- Develop a feedback loop with the data science and EPM/ERM teams to continually enhance data structures
Job Requirements
Qualifications
- Background in data science
- Over 2 years' experience in enterprise data warehouse and big data management; experience in data architecture is a plus
Skills:
- Excellent data management skills using SQL, ERwin, and data management tools
- Excellent ETL skills, with knowledge of Informatica and DataStage
- Strong programming and object-oriented experience (Java, Python, C#, etc.)
- Sound understanding of the financial services industry, allowing for detailed interpretation of budgetary, financial, and related management information
- Experience manipulating complex, high-volume, high-dimensionality data from varying sources
- Demonstrated ability to: 1) work effectively in a dynamic environment; 2) work effectively in a cross-team environment; and 3) work independently and directly with business partners
- Excellent Hadoop development skills (HBase, Hive, Impala, MapReduce, etc.)