Underwriting Internship - Spring Semester 2020

Chromalox

Tempe, AZ
Internship
Paid
  • Responsibilities

    At Chromalox, a Spirax-Sarco Engineering company headquartered in Pittsburgh, Pennsylvania, we build advanced thermal technologies for the world’s most challenging industrial heating applications. We do it better, and have been doing it longer, than anyone else.

    Chromalox started with an innovative solution 100 years ago when a self-taught engineer invented the first metal-sheathed resistance heating element. It was this then-advanced thermal technology that launched an entire industry.

    That pioneering, innovative spirit continues today. Built on opportunity and innovation, Chromalox has grown to serve an increasing number of global markets and industries. We excel in industries that have high expectations. We are acknowledged as experts at delivering solutions that exceed specifications, limit risk, and reduce operating costs.

    Join us as we continue to provide solutions to our customers and the world!

    The Role:

    Under the supervision of the Senior Manager of Enterprise Data and Integrations, the Azure Data Engineer’s primary purpose will be to expand the company’s use of data as a strategic enabler of company and group objectives.

    This role will involve designing, building, analyzing, and supporting data solutions and pipelines using Azure cloud technologies. In this position, the candidate will develop and maintain a data platform that will become the single source of truth for the enterprise.

    Responsibilities:

    • Develop, build, maintain, and manage data pipelines for the enterprise data platform.
    • Build and maintain large data sets in the cloud (data lakes, delta lakes, data lakehouses, etc.).
    • Design, develop, and support the data pipelines the platform requires.
    • Develop and support ELT/ETL processes that transform data using Azure Data Factory and Databricks.
    • Identify new opportunities for data ingestion and for automating data pipelines.
    • Implement processes and systems to monitor data quality, ensuring that production data is accurate and available to critical stakeholders.
    • Help define new, forward-thinking business intelligence and data management strategies.
    • Manage requests for data from internal and external clients, ensuring timely delivery of high-quality results.
    • Define and modify the data lake design and delta lake pipelines, ensuring data quality at each medallion layer (bronze, silver, gold); a minimal sketch of one such step follows this list.
    • Build the semantic layer consumed by Power BI.
    • Protect sensitive data, including implementing row-level security where appropriate.
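
    For context, here is a minimal sketch of what one bronze-to-silver step in a medallion-style Delta Lake pipeline could look like. It assumes a Databricks notebook where a SparkSession is predefined as `spark`; the table and column names (bronze.raw_orders, silver.orders, order_id, order_date) are hypothetical illustrations, not details from this posting.

        from pyspark.sql import functions as F

        # Bronze layer: raw data landed as-is from the source system.
        bronze_df = spark.read.table("bronze.raw_orders")

        # Silver layer: cleaned and conformed -- deduplicate, enforce types,
        # and apply a basic data-quality gate before promoting the data.
        silver_df = (
            bronze_df
            .dropDuplicates(["order_id"])
            .withColumn("order_date", F.to_date("order_date"))
            .filter(F.col("order_id").isNotNull())
        )

        # Persist the silver table in Delta format for downstream (gold) use.
        silver_df.write.format("delta").mode("overwrite").saveAsTable("silver.orders")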

    Required Skills and Experience

    • At least 5 years of related experience, a bachelor's degree in a related field, or an equivalent combination of education and experience.
    • Work requires continual attention to detail in composing, typing, and proofing materials, establishing priorities, and meeting deadlines.
    • Experience developing data pipelines in an enterprise cloud environment is required.
    • Experience working in Azure, AWS, or other cloud technologies.
    • Experience with Databricks, Delta Lake, and lakehouse and medallion architectures.
    • Technical expertise in data warehouse concepts and design.
    • Programming language expertise (SQL, Python, R, JavaScript, etc.).
    • Power BI experience in an enterprise environment is a plus.
    • Experience with CI/CD pipelines in Azure DevOps is a plus.
    • Manufacturing industry experience is a plus.
  • Qualifications

    • Has an interest in a career within the insurance industry
    • Demonstrates a good work ethic, relationship-building skills, and analytical thinking
    • Strong research and writing skills
    • Basic computer knowledge: Microsoft Word, Excel, and PowerPoint
  • Industry
    Other