Role : ETL Developer with Pentaho
Location : Philadelphia, PA
Duration : 10 Months
Skills: ETL, SQL, Java, and Unix/Linux are a must; nice to have: Python, Bash, Pig, Hive, Scala, Spark
Required Skills: Java, ETL
Candidates MUST have Pentaho experience; no candidates will be considered without it. Candidates must also have Java programming experience in order to develop plug-ins for Pentaho. Oracle database experience is preferred.
- Candidates must be comfortable with traditional ETL work in Pentaho as well as with newer technologies and Java programming
- Team Lead experience would be a plus
- Need to think broadly about improving the process
Job Description
Data Warehouse Analyst
Responsible for gathering and assessing business information needs and preparing system requirements.
Performs analysis, development, and evaluation of data mining in a data warehouse environment, including data design, database architecture, metadata, and repository creation.
Uses data mining and data analysis tools.
Reviews and validates data loaded into the data warehouse for accuracy.
Interacts with user community to produce reporting requirements.
Provides technical consulting to users of the various data warehouses and advises users on conflicts and inappropriate data usage.
Responsible for prototyping solutions, preparing test scripts, and conducting tests, as well as for data replication, extraction, loading, cleansing, and data modeling for data warehouses.
Maintains knowledge of software tools, languages, scripts, and shells that effectively support the data warehouse environment in different operating system environments.
Possesses working knowledge of Relational Database Management Systems (RDBMS) and data warehouse front-end tools.
Must have an extensive knowledge of data warehouse and data mart concepts.
Bachelor’s Degree in Computer Science, Information Systems, or other related field, or equivalent work experience.
Typically has 1-3 years of IT work experience with business intelligence tools and systems.
Job Responsibilities
- Create and manage ETL code (Pentaho) to support high-volume and high-velocity data pipelines.
- Develop and test ETL components to high standards of quality and performance.
- Perform design/code/test reviews for ETL components.
- Assist with planning and executing releases of ETL components into production.
- Develop or enhance Pentaho plugins (Java) as needed (see the sketch after this list).
- Provide operational support for the analytics infrastructure.
- Research, identify, and recommend technical and operational improvements that increase reliability and efficiency in maintaining and developing the application.
- Evaluate and advise on technical aspects of open work requests in the product backlog with the project lead.
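As context for the plug-in development responsibility above, the sketch below shows the rough shape of a custom step for Pentaho Data Integration (Kettle) in Java, assuming the org.pentaho.di step API. The class name ExamplePassThroughStep and its pass-through body are illustrative only; a complete plugin would also need matching StepMeta/StepData classes and plugin registration.

```java
import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;
import org.pentaho.di.trans.step.BaseStep;
import org.pentaho.di.trans.step.StepDataInterface;
import org.pentaho.di.trans.step.StepInterface;
import org.pentaho.di.trans.step.StepMeta;
import org.pentaho.di.trans.step.StepMetaInterface;

// Hypothetical pass-through step: reads each incoming row and forwards it unchanged.
public class ExamplePassThroughStep extends BaseStep implements StepInterface {

    public ExamplePassThroughStep(StepMeta stepMeta, StepDataInterface stepDataInterface,
                                  int copyNr, TransMeta transMeta, Trans trans) {
        super(stepMeta, stepDataInterface, copyNr, transMeta, trans);
    }

    @Override
    public boolean processRow(StepMetaInterface smi, StepDataInterface sdi) throws KettleException {
        Object[] row = getRow();        // fetch the next row from the input stream
        if (row == null) {              // no more rows: tell the engine this step is finished
            setOutputDone();
            return false;
        }
        // Row-level transformation logic (extraction, cleansing, enrichment) would go here.
        putRow(getInputRowMeta(), row); // forward the row to the next step in the transformation
        return true;                    // return true so the engine calls processRow() again
    }
}
```

The one-row-at-a-time processRow() loop is what allows a custom step to participate in the high-volume, high-velocity pipelines described in the responsibilities above.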