ETL Developer with PENTAHO

Cloud Big Data Technologies

Philadelphia, PA, US, 19107

21 July 2024

Role: ETL Developer with PENTAHO

Location: Philadelphia, PA

Duration: 10 Months


Skills: ETL, SQL, Java, and Unix/Linux are a must. Nice to have: Python, Bash, Pig, Hive, Scala, Spark.

Required Skills: Java, ETL

Candidates MUST have Pentaho experience; no candidates will be considered without it. Candidates must also have Java programming experience in order to develop plug-ins for Pentaho. Oracle database experience is preferred.

  1. Candidates must be comfortable with traditional ETL work in Pentaho as well as with new technologies and Java programming.
  2. Team Lead experience is a plus.
  3. Candidates should think broadly about improving the process.

Job Description

Data Warehouse Analyst

Responsible for gathering and assessing business information needs and preparing system requirements.

Performs analysis, development, and evaluation of data mining in a data warehouse environment, including data design, database architecture, and metadata and repository creation.

Uses data mining and data analysis tools.

Reviews and validates data loaded into the data warehouse for accuracy.

Interacts with the user community to produce reporting requirements.

Provides technical consulting to users of the various data warehouses and advises users on conflicts and inappropriate data usage.

Responsible for prototyping solutions, preparing test scripts, and conducting tests, as well as for data replication, extraction, loading, cleansing, and data modeling for data warehouses.

Maintains knowledge of software tools, languages, scripts, and shells that effectively support the data warehouse environment in different operating system environments.

Possesses working knowledge of Relational Database Management Systems (RDBMS) and data warehouse front-end tools.

Must have extensive knowledge of data warehouse and data mart concepts.

Bachelor’s Degree in Computer Science, Information Systems, or another related field, or equivalent work experience.

Typically has 1–3 years of IT work experience in business intelligence tools and systems.

Job Responsibilities

- Create and manage ETL code (Pentaho) to support high-volume and high-velocity data pipelines.
- Develop and test ETL components to high standards of quality and performance.
- Perform design/code/test reviews for ETL components.
- Assist with planning and executing releases of ETL components into production.
- Develop or enhance Pentaho plugins (Java) as needed (see the sketch after this list).
- Provide operational support for the analytics infrastructure.
- Research, identify, and recommend technical and operational improvements that increase reliability and efficiency in maintaining and developing the application.
- Evaluate and advise on technical aspects of open work requests in the product backlog with the project lead.
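For context on the plugin responsibility above: in Pentaho Data Integration (Kettle), a custom step plugin is written in Java by extending BaseStep and implementing processRow(). The sketch below is illustrative only; the package and class names are hypothetical, and a real plugin also needs a corresponding step metadata class and plugin registration, which are omitted here.

```java
package com.example.pdi.steps; // hypothetical package name

import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;
import org.pentaho.di.trans.step.BaseStep;
import org.pentaho.di.trans.step.StepDataInterface;
import org.pentaho.di.trans.step.StepInterface;
import org.pentaho.di.trans.step.StepMeta;
import org.pentaho.di.trans.step.StepMetaInterface;

// Minimal pass-through step: reads rows from the previous step and forwards them unchanged.
public class ExampleStep extends BaseStep implements StepInterface {

    public ExampleStep(StepMeta stepMeta, StepDataInterface stepDataInterface,
                       int copyNr, TransMeta transMeta, Trans trans) {
        super(stepMeta, stepDataInterface, copyNr, transMeta, trans);
    }

    @Override
    public boolean processRow(StepMetaInterface smi, StepDataInterface sdi) throws KettleException {
        Object[] row = getRow();            // fetch the next row from the input stream
        if (row == null) {                  // no more input: signal completion and stop
            setOutputDone();
            return false;
        }
        // ... row transformation logic for this step would go here ...
        putRow(getInputRowMeta(), row);     // forward the (possibly modified) row downstream
        return true;                        // ask the engine to call processRow() again
    }
}
```

The same processRow() loop pattern underlies most custom ETL steps; heavier setup (for example, database lookups) would normally be done once in the step's init() method rather than per row.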

Equal Opportunity Employer

Cloud Big Data Technologies is an equal opportunity employer inclusive of female, minority, disability and veterans, (M/F/D/V). Hiring, promotion, transfer, compensation, benefits, discipline, termination and all other employment decisions are made without regard to race, color, religion, sex, sexual orientation, gender identity, age, disability, national origin, citizenship/immigration status, veteran status or any other protected status. Cloud Big Data Technologies will not make any posting or employment decision that does not comply with applicable laws relating to labor and employment, equal opportunity, employment eligibility requirements or related matters. Nor will Cloud Big Data Technologies require, in a posting or otherwise, U.S. citizenship or lawful permanent residency in the U.S. as a condition of employment except as necessary to comply with law, regulation, executive order, or a federal, state, or local government contract.
Apply Now