KeyPoint Consulting is seeking a senior data engineer for a client in the Baltimore area. This role combines software development and analytics experience.
Qualifications:
- Familiarity with modern cloud data platforms (Azure preferred, but AWS is acceptable). Experience with some of the following Azure technologies is preferred:
- Azure Technologies:
- Data Lake (Analytics and Storage)
- Data Warehouse / Synapse Analytics / Amazon Redshift
- Data Factory
- Logic Apps
- Databricks
- 2+ years of data taxonomy, BI, data migration or data lake integration experience
- Proven experience using one or more data storage and query technologies (including, but not limited to, SQL, Spark, Elasticsearch, Hive, Azure Data Lake, BigQuery, etc.)
- Proven experience using object-oriented programming languages (including, but not limited to, Python, JavaScript, Java, Scala, etc.) and data science tools such as R
- Experience with data visualization tools (including, but not limited to, Matplotlib, Splunk, Kibana, Data Studio, Tableau, Cognos, etc.)
- Experience with consumer telemetry, infrastructure monitoring, and process improvement preferred
- Machine learning and artificial intelligence experience preferred
- Thrive in a fast-paced and demanding environment, possess a high level of intellectual curiosity, and demonstrate strong judgment in the face of ambiguity
Responsibilities:
- Integrate data within Corporate Data Lake and Enterprise Data Warehouse environments; maintain data warehouse
- Migrate legacy data components to future state environments
- Analyze application and infrastructure performance data to prevent, detect, and mitigate broadcast center incidents
- Create and maintain a data broker layer for orchestrating broadcast center telemetry messages
- Create a consumer telemetry layer
- Establish metadata taxonomy and data governance standards
- Build scalable, high-performance infrastructure for delivering clear business insights from raw data sources
- Responsible for designing, building, testing, integrating, managing, and optimizing data pipelines
- Responsible for data format, resilience, scaling, and security
- Track operations and continually look for ways to make processes work better, faster, and more smoothly