Job Details
Experience Needed:
Career Level:
Education Level:
Salary:
Job Categories:
Skills And Tools:
Job Description
- Selecting and integrating the Big Data tools and frameworks required to provide the requested capabilities
- Monitoring performance and advising on any necessary infrastructure changes
- Defining data retention policies
- Mentoring junior team members
Job Requirements
- Experience managing a Hadoop cluster and all of its included services
- Minimum 2 years of experience with various messaging systems, such as Kafka
- Ability to solve any ongoing issues with operating the cluster
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, and MongoDB
- Knowledge of various ETL techniques and frameworks, such as Flume and NiFi