Create enterprise-grade streaming, messaging, and batch data pipelines and ETL processes.
Develop conceptual and logical data models across a variety of storage types, including but not limited to data warehouses, data marts, NoSQL, search, and graph databases, to support data analysis, applications, and business intelligence.
Write “clean”, well-designed data engineering pipelines in Python.
Set the standard for data modeling and data engineering.
Contribute to architecture and solution design sessions to establish design patterns and ensure best practices around data ingress and egress.
Design and develop high-volume data ingestion components and ETL solutions to collect data from various internal and external sources.
Recommend data governance processes, security standards, and models that will provide architectural guidelines to support the development initiatives.
Ensure data quality and controls are in place to support the various data ingestion inflow and outflow processes.
Administer the overall data warehouse strategy, including architecture and security, definition of data models and all data marts, evaluation of infrastructure components and software, performance, and data application design.
Effectively estimate, groom, and deliver on sprint planning and sprint goals.
Troubleshoot, test, and maintain the performance of our data systems to ensure optimization and speed.
Participate in and contribute to scrum ceremonies and ideation sessions.
Hold yourself and others accountable for delivery on timelines and functionality.
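To make the expectations above concrete, here is a minimal sketch of the kind of "clean", well-designed Python batch ETL pipeline this role involves. The file layout, schema, and transformation are illustrative assumptions, not part of the role description; it uses only the standard library (csv and sqlite3) in place of a production warehouse.

```python
# Hypothetical batch ETL sketch: extract rows from CSV text, apply a
# simple transformation, and load them idempotently into SQLite.
# Names, schema, and the transformation are illustrative assumptions.
import csv
import io
import sqlite3


def extract(csv_text: str):
    """Extract: parse CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(csv_text)))


def transform(rows):
    """Transform: normalize names and cast dollar amounts to integer cents."""
    return [
        (row["name"].strip().lower(), int(round(float(row["amount"]) * 100)))
        for row in rows
    ]


def load(records, conn):
    """Load: idempotent upsert into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments "
        "(name TEXT PRIMARY KEY, amount_cents INTEGER)"
    )
    conn.executemany(
        "INSERT INTO payments VALUES (?, ?) "
        "ON CONFLICT(name) DO UPDATE SET amount_cents = excluded.amount_cents",
        records,
    )
    conn.commit()


if __name__ == "__main__":
    source = "name,amount\n Alice ,12.50\nBob,3.99\n"
    conn = sqlite3.connect(":memory:")
    load(transform(extract(source)), conn)
    print(conn.execute("SELECT * FROM payments ORDER BY name").fetchall())
    # → [('alice', 1250), ('bob', 399)]
```

The upsert keeps the load step safe to re-run, which is one simple way the "data quality and controls" responsibility shows up in pipeline code.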
QUALIFICATIONS
6+ years' experience in a similar role
6+ years working with NoSQL, search, and relational databases
6+ years of deep expertise in architecting and optimizing enterprise-scale relational data systems
2+ years' experience working with Elasticsearch
4+ years working with Python to build data processing pipelines.
Experience with AWS RDS and ElastiCache
Proficiency in data governance practices
Deep understanding of data security best practices, including PII and PCI data concerns
Experience developing streaming, messaging, batch, and microbatch ETL and data pipelines.
Experience with domain-driven design (DDD) as it pertains to data architecture.
Experience working in an Agile environment with tools like JIRA.
Experience with version control tools and Git principles.
Excellent communication skills
NICE TO HAVE
BS degree in Computer Science, Engineering, or a related subject preferred
Experience with advanced analytics and complex data processing