Company Description
At Fannie Mae, futures are made. The inspiring work we do helps make a home a possibility for millions of homeowners and renters. Every day offers compelling opportunities to use tech to tackle housing’s biggest challenges and impact the future of the industry. You’ll be a part of an expert team thriving in an energizing, flexible environment. Here, you will grow your career and help create access to fair, affordable housing finance.
Job Description
As a valued colleague on our team, you will provide expert advice and guidance on developing data infrastructure and pipelines that capture, integrate, organize, and centralize data, while testing and ensuring that data is readily accessible, usable, and quality-assured.
THE IMPACT YOU WILL MAKE
The Lead Data Engineer role will offer you the flexibility to make each day your own, while working alongside people who care so that you can deliver on the following responsibilities:
- Assess customer needs and intended use of requested data in the development of database requirements and support the planning and engineering of enterprise databases.
- Maintain comprehensive knowledge of database technologies, complex coding languages, and computer system skills.
- Lead the team to organize and integrate data into readily available formats, while maintaining existing structures and governing their use according to business requirements.
- Lead the analysis of new data sources and monitoring of performance, scalability, and security of data.
- Review the initial analysis and deliver a user interface (UI) to the customer to enable further analysis.
Qualifications
THE EXPERIENCE YOU BRING TO THE TEAM
Minimum Required Experiences:
- 4+ years with Big Data Hadoop clusters (HDFS, YARN, Hive, MapReduce), Spark, and AWS EMR
- 4+ years of recent experience building and deploying applications in AWS (S3, Hive, Glue, AWS Batch, DynamoDB, Redshift, EMR, CloudWatch, RDS, Lambda, SNS, SQS, etc.)
- 4+ years of experience with Python, SQL, Spark SQL, and PySpark
- Excellent problem-solving skills and strong verbal and written communication skills
- Ability to work independently as well as part of an agile team (Scrum / Kanban)
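As a rough illustration of the day-to-day SQL and Python work these requirements describe, here is a minimal, self-contained sketch. It uses Python's built-in sqlite3 in place of a Spark or Redshift cluster, and the table and column names are hypothetical:

```python
import sqlite3

# In-memory database standing in for a warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (loan_id TEXT, state TEXT, upb REAL)")
conn.executemany(
    "INSERT INTO loans VALUES (?, ?, ?)",
    [("L1", "VA", 250000.0), ("L2", "VA", 310000.0), ("L3", "TX", 180000.0)],
)

# Aggregate unpaid principal balance (UPB) by state -- the kind of
# summarization typically expressed in SQL or Spark SQL on the job.
rows = conn.execute(
    "SELECT state, SUM(upb) FROM loans GROUP BY state ORDER BY state"
).fetchall()
print(rows)  # [('TX', 180000.0), ('VA', 560000.0)]
```

On a real engagement the same query would run as Spark SQL over data in S3, with the result landing in Redshift or behind a reporting UI.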
Desired Experiences:
- Bachelor's degree or equivalent
- Knowledge of Spark streaming technologies
- Experience in working with agile development teams
- Familiarity with Hadoop / Spark information architecture, Data Modeling, Machine Learning (ML)
- Knowledge of Environmental, Social, and Corporate Governance (ESG)
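For context on the Spark streaming item above: Spark Structured Streaming processes an unbounded input in micro-batches, maintaining running aggregates across batches. Below is a toy pure-Python analogue of that pattern, with no Spark dependency and purely illustrative names and data:

```python
from collections import defaultdict

def process_stream(micro_batches):
    """Maintain a running count per key across micro-batches,
    mimicking a streaming groupBy().count() aggregation."""
    running = defaultdict(int)
    snapshots = []
    for batch in micro_batches:
        for key in batch:
            running[key] += 1
        snapshots.append(dict(running))  # aggregate state after each batch
    return snapshots

# Two micro-batches of event keys arriving over time (illustrative data).
snaps = process_stream([["click", "view", "click"], ["view"]])
print(snaps)  # [{'click': 2, 'view': 1}, {'click': 2, 'view': 2}]
```

In real Spark, the per-batch state management shown here is handled by the engine; the engineer declares the aggregation and Spark updates it incrementally as data arrives.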
Skills
- Skilled in discovering patterns in large data sets with the use of relevant software such as Oracle Data Mining or Informatica
- Skilled in documentation and database reporting for the purposes of analysis, data discovery, and decision-making with the use of relevant software such as Crystal Reports, Excel, or SSRS
- Skilled in cloud technologies and cloud computing
- Experience using software and computer systems' architectural principles to integrate enterprise computer applications such as xMatters, AWS Application Integration, or WebSphere
- Skilled in determining causes of operating errors and taking corrective action
- Experience in the process of analyzing data to identify trends or relationships to inform conclusions about the data
- Skilled in creating and managing databases with the use of relevant software such as MySQL, Hadoop, or MongoDB
- Business Insight including advising, designing business models, interpreting customer and market insights, forecasting, benchmarking, etc.
- Programming including coding, debugging, and using relevant programming languages
- Communication including communicating in writing or verbally, copywriting, planning and distributing communication, etc.
- Governance and Compliance including creating policies, evaluating compliance, conducting internal investigations, developing data governance, etc.
- Adept at managing project plans, resources, and people to ensure successful project completion
- Skilled in working respectfully and cooperatively with people of different functional expertise toward a common goal
Tools
- SageMaker
- AWS
- Python