Role: Hadoop Data Integration Developer
No. of Positions: 5
Duration: 7 months
Rate: $42/hr on C2C (MAX)
Location: San Francisco, CA
Job Description:
Hadoop Data Integration Developer / Engineer
Requirements:
- Experience with big-data technologies such as Hadoop/Hive, MongoDB, or other NoSQL data stores.
- Strong experience with traditional RDBMS systems such as Oracle and Teradata.
- 3+ years of Java ETL / data integration experience.
- 2+ years of experience with the Hadoop ecosystem; Cloudera, Hortonworks, or MapR preferred.
- Comfortable working in a Linux environment.
- Experience with scripting (shell scripting, Python, etc.).
- Ability to profile and analyze large amounts of source-system data, including structured and semi-structured data.
- Work with data originators to analyze gaps in the data collected.
- Expert-level SQL coding/querying skills are a must.
- Conduct ETL performance tuning, troubleshooting, and support.
- Must be comfortable working in a fast-paced, flexible environment and willing to take the initiative to learn new tools quickly.
- Strong understanding of Unix operating systems and Unix security paradigms.
- Excellent communication skills and experience with first-level support.