Job Description
- Provide and implement data ingestion, data transformation, use-case management, stakeholder management, data analysis, and the production of dashboards, MI, and KPIs
- Implement data pipeline solutions using the ELK stack; install, configure, and maintain existing and new production solutions built on Kafka and ELK
- Resolve technical problems and incidents
- The Big Data Technical Log Lake team manages the bank's monitoring and alerting solution, which is built on Big Data technologies
- The team is responsible for creating data ingestion pipelines, visualisations, data stores, and alerting on the data
- It also handles installation, upgrades, and maintenance of the Big Data clusters
- Provisioning Elasticsearch clusters for production and testing
- Tuning, administering, and refactoring the Elasticsearch cluster setup
- Reviewing and improving current Elasticsearch queries and processes within the product
- Managing all plugins and interfaces that integrate with our various products
- Developing Kibana dashboards to provide insight into the operations of our various products
- Supporting the Agile/Scrum team with all Elasticsearch and other data requirements for development, and supporting the DevOps team with Ansible and other forms of infrastructure as code
- Assisting with the role of data and the ELK stack in CI/CD
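To make the pipeline duties above concrete, here is a minimal sketch of a Logstash pipeline that consumes application logs from Kafka and indexes them into Elasticsearch. Broker, topic, host, and index names are illustrative placeholders, not details from this posting.

```conf
# Hypothetical Logstash pipeline: Kafka -> Elasticsearch log ingestion.
input {
  kafka {
    bootstrap_servers => "kafka01:9092"
    topics            => ["app-logs"]
    group_id          => "logstash-loglake"
    codec             => "json"
  }
}

filter {
  # Parse the event timestamp so Kibana time filters work correctly.
  date {
    match  => ["timestamp", "ISO8601"]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["http://es01:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}
```

Daily-dated indices like this keep retention and cluster maintenance manageable, which matters for the cluster-administration duties listed above.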
Job Requirements
- Bachelor's degree in IT or a related field
- 3–5 years of working experience with Big Data analytics and the Elastic Stack
- Recent, strong experience with Elasticsearch and Kafka
- Strong experience writing data ingestion pipelines using Logstash (desirable)
- Strong experience with the Elastic Stack, including enterprise production experience with multi-node deployments
- Good Kibana and Elasticsearch scripting knowledge
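As a rough illustration of the Elasticsearch query skills called for above, the snippet below builds a Query DSL body that counts error-level log events per service within a time window. The field names (`level`, `service`, `@timestamp`) are hypothetical and would have to match the actual index mapping; this is a sketch, not tied to any particular cluster.

```python
import json


def error_count_query(window: str = "now-1h") -> dict:
    """Build a Query DSL body counting error-level log events per service.

    Field names (level, service, @timestamp) are hypothetical examples.
    """
    return {
        "size": 0,  # aggregation only; no document hits needed
        "query": {
            "bool": {
                "filter": [
                    {"term": {"level": "error"}},
                    {"range": {"@timestamp": {"gte": window}}},
                ]
            }
        },
        "aggs": {
            "errors_per_service": {
                "terms": {"field": "service", "size": 10}
            }
        },
    }


body = error_count_query()
print(json.dumps(body, indent=2))
```

A body like this would be sent to an index's `_search` endpoint; the same aggregation could back a Kibana visualisation showing error volume by service.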