
Big Data Engineer

Vodafone Egypt
Cairo, Egypt

Job Description

The Big Data Engineer provides expert guidance and delivers, both personally and through others, to:

  • Integrate data from several sources into the Big Data Program, as needed for analysis and for Technology actions
  • Build applications that make use of large volumes of data and produce outputs that enable commercial actions generating incremental value
  • Deliver and implement core capabilities (frameworks, platform, development infrastructure, documentation, guidelines and support) to speed up delivery by Local Markets and tenants in the Big Data Program, assuring quality, performance and alignment of component releases in the platform to the Group technology blueprint
  • Support Local Markets, tenants and Group functions in obtaining business value from the operational data.

Key accountabilities and decision ownership:

  • Design and produce high-performing, stable end-to-end applications that perform complex processing of massive volumes of batch and streaming data in a multi-tenancy big data platform, both on-premises Hadoop and in the cloud, and output insights back to business systems according to their requirements (a minimal batch-and-streaming sketch in Scala follows this list).
  • Design and implement core platform capabilities, tools, processes, ways of working and conventions under agile development to support the integration of Local Market and tenant data sourcing and use-case implementation, promoting re-usability, easing delivery and ensuring standardization across Local Market deliverables in the platform.
  • Support the distributed data engineering teams, including technical support and training in the Big Data Program frameworks and ways of working, review and integration of source code, and support for releases and source code quality control
  • Work with the Group architecture team to define the strategy for evolving the Big Data capability, including solution architecture decisions aligned with the platform architecture
  • Define the technologies to be used on the Big Data Platform and investigate new technologies to identify where they can bring benefits
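
As an illustration of the first accountability above, a minimal end-to-end sketch in Scala might read reference data in batch from Hive, enrich a Kafka stream with it, and write columnar output for downstream business systems. This is a sketch only, assuming a Spark-on-Hadoop platform with Kafka; all table, topic, broker and path names are hypothetical.

    // Sketch only: batch reference data from Hive joined onto a Kafka stream,
    // with enriched results written back as Parquet for business systems.
    // All table, topic, broker and path names are hypothetical placeholders.
    import org.apache.spark.sql.SparkSession

    object UsageEnrichmentJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("usage-enrichment")
          .enableHiveSupport()                              // assumes a Hive-backed Hadoop platform
          .getOrCreate()

        // Batch side: reference data already landed in the platform
        val customers = spark.table("ref_db.customers")     // hypothetical Hive table

        // Streaming side: usage events arriving on Kafka
        val usage = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092") // placeholder brokers
          .option("subscribe", "usage-events")              // hypothetical topic
          .load()
          .selectExpr("CAST(key AS STRING) AS customer_id",
                      "CAST(value AS STRING) AS payload")

        // Enrich the stream with the reference data and push results downstream
        usage.join(customers, Seq("customer_id"), "left")
          .writeStream
          .format("parquet")                                // columnar output
          .option("path", "/data/out/enriched_usage")       // placeholder output path
          .option("checkpointLocation", "/data/chk/enriched_usage")
          .start()
          .awaitTermination()
      }
    }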

Key performance indicators:

  • Development of core frameworks to speed up and facilitate integration of Local Markets developments in the BDP
  • Speed of on-boarding data sources and use cases for EU Hub markets and new tenants
  • Delivered integrated use cases from Local Markets and tenants to add value to the business using the Big Data Program

Job Requirements

Must have technical / professional qualifications:

  • Expert-level experience with the Hadoop ecosystem (Spark, Hive/Impala, HBase, YARN); experience with the Cloudera distribution is desirable; experience with comparable cloud provider solutions (AWS, GCP, Azure) is also considered
  • Strong software development experience in the Scala and Python programming languages; Java and functional languages desirable
  • Experience with Unix-based systems, including bash scripting
  • Experience with columnar data formats (see the sketch after this list)
  • Experience with other distributed technologies, such as Cassandra, Splunk, Solr/Elasticsearch, Flink, Heron or Beam, would also be desirable.
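
By way of example, columnar formats such as Parquet let the engine read only the columns and row groups a query needs. The sketch below assumes Spark and uses hypothetical paths and column names; it shows column pruning and filter pushdown against a Parquet dataset.

    // Sketch only: reading a columnar (Parquet) dataset so Spark can prune
    // columns and push the filter down to the reader; names are hypothetical.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object ColumnarReadExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("columnar-read").getOrCreate()

        spark.read.parquet("/data/cdrs")                    // hypothetical call-record dataset
          .select("msisdn", "duration_s")                   // only these columns are scanned
          .filter(col("duration_s") > 60)                   // predicate pushed to the Parquet reader
          .groupBy("msisdn")
          .agg(sum("duration_s").as("total_duration_s"))
          .write.mode("overwrite")
          .parquet("/data/agg/long_calls")                  // hypothetical output path

        spark.stop()
      }
    }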

Core competencies, knowledge and experience:

  • Expert-level experience in designing, building and managing applications that process large amounts of data in the Hadoop ecosystem or other big data frameworks
  • Extensive experience with performance tuning of applications on Hadoop or other big data frameworks, and with configuring those systems to maximise performance (see the tuning sketch after this list)
  • Experience building systems to perform real-time data processing using Spark Streaming, Flink, Storm or Heron data processing frameworks, and Kafka, Beam, Dataflow, Kinesis or similar data streaming frameworks
  • Experience with the common SDLC, including SCM, build tools, unit, integration, functional and performance testing from an automation perspective, TDD/BDD, CI and continuous delivery, under agile practices
  • Experience working in large-scale multi-tenancy big data environments
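
As a sketch of the tuning referred to above, the snippet below sets a few common Spark tuning knobs in code; the values are placeholders and would normally be chosen per workload or passed via spark-submit rather than hard-coded.

    // Sketch only: common Spark tuning settings shown in code for illustration.
    // The values are placeholders, not recommendations for a particular workload.
    import org.apache.spark.sql.SparkSession

    object TunedSessionExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("tuned-job")
          .config("spark.sql.shuffle.partitions", "400")    // size to the shuffle volume
          .config("spark.executor.memory", "8g")            // placeholder executor sizing
          .config("spark.executor.cores", "4")
          .config("spark.sql.adaptive.enabled", "true")     // adaptive query execution
          .config("spark.serializer",
                  "org.apache.spark.serializer.KryoSerializer")
          .getOrCreate()

        // ... actual job logic would go here ...

        spark.stop()
      }
    }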
