Develop and maintain ETL workflows and data integration processes using Spark with Python, SQL, and other relevant technologies
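A minimal sketch of the extract-transform-load pattern this kind of workflow follows, using the standard-library sqlite3 module in place of an actual Spark/SQL stack; the table names, columns, and cleanup rule are purely illustrative assumptions, not details from the role.

```python
import sqlite3

def transform(rows):
    # Illustrative transform step: trim whitespace and
    # normalize region codes to uppercase.
    return [(order_id, region.strip().upper(), amount)
            for order_id, region, amount in rows]

# Extract: read raw records from a source table (in-memory DB for the sketch).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                 [(1, " us-east ", 10.5), (2, "eu-west", 7.25)])
raw = conn.execute("SELECT order_id, region, amount FROM raw_orders").fetchall()

# Load: write the cleaned records to a target table.
conn.execute("CREATE TABLE clean_orders (order_id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO clean_orders VALUES (?, ?, ?)", transform(raw))

result = conn.execute(
    "SELECT region FROM clean_orders ORDER BY order_id").fetchall()
# result -> [('US-EAST',), ('EU-WEST',)]
```

In a real Spark job the same shape holds, with `spark.read` for the extract, DataFrame operations or Spark SQL for the transform, and `DataFrame.write` for the load.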
Develop data models, schemas, and data dictionaries to support business intelligence and reporting needs
Create and maintain data pipelines using Airflow
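As a sketch of what such a pipeline definition looks like, here is a minimal Airflow DAG wiring three stages in sequence. The DAG id, schedule, and task bodies are assumptions for illustration only; it uses the Airflow 2.x `PythonOperator` API (on versions before 2.4, `schedule` is spelled `schedule_interval`). Shown as a configuration fragment rather than a runnable script, since it needs an Airflow deployment to execute.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task callables standing in for real extract/transform/load logic.
def extract():
    print("pull source data")

def transform():
    print("clean and reshape")

def load():
    print("write to warehouse")

with DAG(
    dag_id="daily_sales_etl",      # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # Airflow 2.4+; schedule_interval on older versions
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Declare ordering: extract runs before transform, which runs before load.
    t_extract >> t_transform >> t_load
```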
Manage multiple proofs of concept (POCs) to optimize data flow performance and response times
Collaborate with cross-functional teams to gather and understand data requirements, ensuring alignment with business objectives
Work in a collaborative Agile environment to understand requirements and the product roadmap