About Us: We are a company that provides innovative, transformative IT services and solutions. We are passionate about helping our clients achieve their goals and exceed their expectations, and we strive to provide the best possible experience for our clients and employees. We are committed to continuous improvement and are always looking for ways to strengthen our services and solutions. We believe in working collaboratively with our clients and employees to achieve success.
DS Technologies Inc is looking for a Databricks Developer for one of our premier clients.
Job Title: Databricks Developer
Location: Atlanta, GA (Hybrid)
Position Type: Contract Only, W2
Job Description:
We are seeking a PySpark and Databricks Developer with a solid understanding of the entire ETL/Azure lifecycle and a background in data projects.
Responsibilities:
Design, develop, and maintain scalable data pipelines and ETL processes using Azure Databricks, Data Factory, and other Azure services
Implement and optimize Spark jobs, data transformations, and data processing workflows; manage Databricks notebooks and work with Delta Lake using Python and Spark SQL in Databricks
Leverage Azure DevOps and CI/CD best practices to automate deployment (including Databricks Asset Bundle deployments) and management of data pipelines and infrastructure
Implement data integrity and data quality checks so that pipelines reach production with zero errors
Understand new Databricks features such as Unity Catalog, Lakeflow, Databricks Asset Bundle (DAB) deployments, and catalog federation
Hands-on experience with data extraction (explicit schemas, corrupt-record handling, error handling, parallelized code), transformations and loads (user-defined functions, join optimizations), and production optimization (automated ETL); a brief illustrative sketch follows this list
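For illustration only, the kind of PySpark and Delta Lake work described above might look like the following minimal sketch. The source path, table name, and schema are hypothetical placeholders, not part of the client's actual codebase.

```python
# Minimal PySpark sketch: read raw JSON with an explicit schema, capture corrupt
# records, apply a simple transformation, and write the result to a Delta Lake table.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("example-etl").getOrCreate()

schema = StructType([
    StructField("order_id", StringType(), nullable=False),
    StructField("amount", DoubleType(), nullable=True),
    StructField("order_ts", TimestampType(), nullable=True),
    StructField("_corrupt_record", StringType(), nullable=True),  # holds unparseable rows
])

# Extract: read permissively so malformed rows land in _corrupt_record
raw = (
    spark.read
    .schema(schema)
    .option("mode", "PERMISSIVE")
    .option("columnNameOfCorruptRecord", "_corrupt_record")
    .json("/mnt/raw/orders/")          # hypothetical source path
)
raw = raw.cache()  # caching avoids Spark's restriction on querying only the corrupt-record column

# Basic data-quality split: keep clean rows, set corrupt ones aside for review
clean = raw.filter(F.col("_corrupt_record").isNull()).drop("_corrupt_record")
corrupt = raw.filter(F.col("_corrupt_record").isNotNull())

# Transform: derive a partition column from the order timestamp
transformed = clean.withColumn("order_date", F.to_date("order_ts"))

# Load: append to a Delta table partitioned by date
(
    transformed.write
    .format("delta")
    .mode("append")
    .partitionBy("order_date")
    .saveAsTable("analytics.orders")   # hypothetical target table
)
```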
Qualifications:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
Minimum of 5 years of experience in data engineering or similar roles.
Proven expertise with Azure Databricks and data processing frameworks.
Strong understanding of data warehousing, ETL processes, and data pipeline design.
Experience with SQL, Python, and Spark.
Excellent problem-solving and analytical skills.
Effective communication and teamwork abilities.
Skills:
Azure Databricks
Python
Apache Spark
SQL
ETL processes
Data Warehousing
Data Pipeline Design
Cloud Architecture
Performance Tuning
Flexible work-from-home options are available.