Intelliswift Software, Inc
Requirements
3-5 years of experience with the Hadoop ecosystem and Big Data technologies
Hands-on experience with the Hadoop stack (HDFS, MapReduce, HBase, Pig, Hive, Storm, Impala, Spark)
Scripting and automation skills in Java or Python
Experience deploying new Hadoop infrastructure, Hadoop cluster upgrades, and cluster maintenance
Experience developing Hadoop integrations for data ingestion, data mapping, and data processing
Familiarity with cloud computing infrastructure (e.g. Amazon Web Services EC2, Elastic MapReduce) and design considerations for scalable, distributed systems
Knowledge of design and architecture principles
Strong communication skills and ability to interact with our business, product, and development teams
Proficiency in Scala
Experience with Cloudera distributions CDH 4.x/5.x is a plus
Experience with MQ technologies (Kafka, RabbitMQ) is a plus
Experience with graph processing systems like Apache Giraph is a plus
Prior experience with analytical tools, languages, or libraries (e.g. SAS, SPSS, R, Mahout) is a plus
Hadoop Developer, Big Data, HDFS, Hive, Hadoop, AWS, MapR
Multiple Openings