Job Title:
Data Engineer – Databricks & Azure
Location:
Columbus, OH preferred / Remote (US or Canada)
Experience Required:
7–12 years
Compensation:
USD $150K / CAD $170K–$190K
Job Type:
Full-Time
Position Overview:
We are seeking a skilled Data Engineer to join our team. The successful candidate will be responsible for developing and optimizing data pipelines, implementing robust data quality checks, and ensuring the accuracy and integrity of data flows. This role is critical in supporting data-driven decision-making, especially in the context of our insurance-focused business operations.
Responsibilities:
Collaborate with data analysts, the reporting team, and business advisors to gather requirements and define data models
Develop and maintain scalable and efficient data pipelines
Implement robust data quality checks and validate large datasets
Monitor data jobs and troubleshoot pipeline issues
Review and audit data processes for compliance
Work within Agile methodologies, including Scrum ceremonies and sprint planning
Qualifications:
Bachelor’s degree in Computer Science, IT, or a related field
7–12 years of experience in data engineering with Databricks and cloud platforms
Strong proficiency in PySpark, Python, SQL
Experience with data modeling, ETL/ELT pipeline development, and automation
Hands-on experience with Azure Data Factory, Azure Databricks, and Azure Data Lake
Experience with Delta Lake, Delta Live Tables, Auto Loader, and Unity Catalog
Preferred: Knowledge of insurance industry data requirements
Strong analytical and communication skills
This is a remote position.