Job Description
- Responsible for the documentation, design, development, and architecture of big data applications.
- Handle the installation, configuration, and support of the Hadoop ecosystem and other big data components.
- Build the complete infrastructure to ingest, transform, and store data for analysis and business requirements while maintaining data security and privacy (see the sketch after this list).
- Build and operate highly available, distributed systems for the extraction, ingestion, and processing of large data sets.
- Optimize performance by automating processes, improving data delivery, and re-designing the architecture where needed.
- Work with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Apply advanced experience in database architectures, data modeling, ETL development, and data warehousing.
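
As a rough illustration of the ingest, transform, and store responsibility above, the sketch below shows a minimal PySpark batch step; the paths, column names, and table layout are hypothetical placeholders, not part of the actual role.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal ingest -> transform -> store sketch.
# Paths and column names below are illustrative assumptions.
spark = (SparkSession.builder
         .appName("daily-events-etl")
         .enableHiveSupport()
         .getOrCreate())

# Ingest: read raw JSON events landed by an upstream process.
raw = spark.read.json("hdfs:///data/raw/events/2024-01-01")

# Transform: keep valid records and derive a partition column.
clean = (raw
         .filter(F.col("event_id").isNotNull())
         .withColumn("event_date", F.to_date("event_ts")))

# Store: write partitioned Parquet for downstream analysis.
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("hdfs:///data/curated/events"))
```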
Job Requirements
- 2+ years of experience with the suite of open-source big data technologies and platforms (Cloudera/Hortonworks, the Hadoop ecosystem, Spark, Kafka, Hive, Cassandra).
- Experience with object-oriented/functional scripting languages (Python, Java, C++, Scala, etc.) is a plus.
- Basic knowledge of Linux/UNIX, including using it to process large data sets (see the sketch after this list).
- Excellent communication skills and a dynamic team player.
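
As a loose illustration of the scripting and large-data-set requirements above, here is a minimal Python sketch that streams a large delimited file without loading it into memory; the file path and column names are assumptions made for the example.

```python
import csv
from collections import Counter

# Minimal sketch: count events per user from a large CSV by streaming
# one row at a time. The path and "user_id" column are assumptions.
def count_events(path="events.csv"):
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # reads the file lazily, row by row
            counts[row["user_id"]] += 1
    return counts

if __name__ == "__main__":
    for user, n in count_events().most_common(10):
        print(user, n)
```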