Job Description
REMOTE START, THEN ONSITE IN CHICAGO, IL AFTER COVID RESTRICTIONS ARE LIFTED.
- Bachelor's degree in Computer Science or similar field
- 4+ years of experience in traditional and modern Big Data technologies (HDFS, Hadoop, Hive, Pig, Sqoop, Kafka, Apache Spark, HBase, Oozie, NoSQL databases, PostgreSQL, Git, Python, REST APIs, Snowflake, etc.)
- Experience in Java/Python/Scala
- Experience extracting/querying/joining large data sets at scale
- Experience building data platforms on the Azure stack
- Experience building data ingestion pipelines using Azure Data Factory to ingest structured and unstructured data
- Strong knowledge of Azure Storage offerings such as Data Lake Storage Gen1 and Gen2
- Experience in harmonizing raw data into a consumer-friendly format using Azure Databricks
- Knowledge of Azure networking, security, key vaults, etc.
- Experience in data wrangling, advanced analytic modeling, and AI/ML capabilities is preferred
- Experience utilizing Snowflake to build data marts with the data residing in Azure storage is a plus
- Knowledge of SAS, Teradata, Oracle, or other databases a plus
- Exposure to R and ML technologies a plus
- Strong communication and organizational skills