• Develop software using Hadoop-ecosystem technologies such as HBase, Spark, Sqoop, NiFi, and Kafka
• Utilize programming languages such as Java, Scala, and Python, with an emphasis on tuning, optimization, and best practices for application developers
• Analyze requirements to support design activities
• Design and build integration components and interfaces in collaboration with Architects and Infrastructure Engineers
• Perform all technical aspects of software development (write, test, support)
• Perform unit, component, and integration testing of software components, including test design, implementation, evaluation, and execution
• Conduct code reviews and tests of automated build scripts
• Debug software components; identify, fix, and verify remediation of code defects in both your own work and the work of others
• Work with product owners to prioritize features for ongoing sprints
• Maintain a backlog of technical requirements informed by industry trends, new technologies, known defects, and open issues
• Develop custom data pipelines (cloud and locally hosted), working heavily within the Hadoop ecosystem
• Experience within Insurance, Financial Services, or other regulated industries