Are you a Data Engineer looking to join a team that truly understands the power of data? As a member of this highly respected team, you will be responsible for all stages of the data lifecycle, including capture, maintenance, and publication. You will also design, develop, test, deploy, and support data management solutions, spanning BI development, data pipeline (ETL/ELT) development, API data integrations, data quality, and process monitoring/alerting.
How will your talents be utilized?
Design, develop and support database entities, data pipelines and processes required for data integration (ETL/ELT)
Implement processes that ensure data quality, code quality, and the confidentiality of all data
Learn emerging technologies quickly and apply them effectively
Design, implement, lead and manage large-scale, enterprise-wide and complex data projects
Define and comply with internal coding standards, procedural guides, and checklists that support the data platform
Apply in-depth troubleshooting skills to resolve errors and performance issues, including tier 2 production support
Understand business requirements, contribute to technical requirements, and deliver on time within a Scrum methodology
Required Expertise:
6+ years of experience with relational databases and SQL
3+ years of data warehouse and ETL/ELT experience
2+ years of programming with Python (Pandas and other common libraries)
2+ years of object-oriented programming experience
2+ years of API integration experience
1+ year of experience performing data integrations or data warehousing on a cloud data platform, preferably Google Cloud
Preferred Qualifications:
Experience with data warehousing using BigQuery in Google Cloud
Experience with GCP services such as Cloud Dataflow, Cloud Firestore, Cloud Composer, and Stackdriver
Broad knowledge across Big Data, Azure, Data Vault, Tableau, and other emerging technologies
Education: