
Data Engineer

Infima Technologies


National
Full Time
Paid

    Join a world-class team to deliver game-changing AI prediction technologies to financial markets!

    infima is a capital market technology startup building transformative prediction services for the $25T credit market. Our disruptive deep learning technologies deliver actionable predictions of borrower, security and market behavior that set new accuracy and latency standards in the market. Our predictions provide new, performance-boosting edges for investors and dealers, thus making them a must-have for many market participants. We are backed by leading venture capital and strategic investors.

    We are looking for a Data Engineer/Architect to join the engineering team as a core contributor to the design and development of the datasets and pipelines that form the backbone of our credit analytics product. The ideal candidate is an expert with multiple years of experience who will challenge our current architecture, design new systems, and conceptualize frameworks that guide our overall modeling, warehousing, and pipeline practices.

    _THIS POSITION CAN BE REMOTE OR BASED IN OUR SAN MATEO, CA OFFICE._

    RESPONSIBILITIES:    

    • Design and develop a modern data lake.
    • Build robust and reliable ETL pipelines following best practices (see the sketch after this list).
    • Scale existing data pipelines along multiple dimensions (volume, sources, etc.).
    • Improve orchestration and monitoring of data processes.        
    • Optimize machine learning data pipelines.
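
    A minimal sketch of what one such orchestrated ETL step could look like, using the orchestration and data tooling named in the tech stack below (Prefect, Dask, Parquet). The S3 paths and column names are hypothetical placeholders, not a description of our actual pipelines.

    ```python
    # Minimal sketch of an orchestrated ETL step, assuming Prefect 2.x and Dask.
    # All bucket names, paths, and column names are hypothetical placeholders.
    import dask.dataframe as dd
    from prefect import flow, task


    @task(retries=3, retry_delay_seconds=60)
    def extract(source_path: str) -> dd.DataFrame:
        # Lazily read raw Parquet files (e.g. from S3 via s3fs).
        return dd.read_parquet(source_path)


    @task
    def transform(df: dd.DataFrame) -> dd.DataFrame:
        # Example transformation: keep only rows with a positive balance.
        return df[df["current_balance"] > 0]


    @task(retries=3, retry_delay_seconds=60)
    def load(df: dd.DataFrame, target_path: str) -> None:
        # Write the curated dataset back to the data lake, partitioned by date.
        df.to_parquet(target_path, partition_on=["as_of_date"], write_index=False)


    @flow(log_prints=True)
    def nightly_etl(source_path: str = "s3://example-raw/loans/",
                    target_path: str = "s3://example-curated/loans/") -> None:
        raw = extract(source_path)
        curated = transform(raw)
        load(curated, target_path)


    if __name__ == "__main__":
        nightly_etl()
    ```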

    REQUIREMENTS:

    • Bachelor's Degree in Computer Science, or closely related field, from an accredited university or college.       
    • 3+ years of data-oriented software development and experience building data backends using Python.
    • Strong understanding of database and data lake design. 
    • Architect's mindset: a source of proposals, able to work autonomously.
    • Proficient in software development best practices: code reviews, version control, testing. 
    • Experience with a workflow orchestration system such as Prefect, Airflow, etc.  
    • Proactive about following developments in the data and software engineering community.

     

    NICE TO HAVE:

    • Experience with building machine learning feature stores.
    • Experience with deploying applications on AWS using Kubernetes.
    • Experience with serverless frameworks such as AWS Lambda.
    • Experience with web crawling / scraping.

     

    TECH STACK: 

    • Data engineering: Python, Prefect, Dask, Parquet 
    • Cloud: AWS (S3, Glue, Athena, EKS, RDS) 
    • DevOps: Docker, Kubernetes, Gitlab CI 
    • Back-end: Golang (Google Protobuf), PostgreSQL, TimescaleDB
    • Front-end: React, Redux
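
    As a minimal illustration of how the S3 / Glue / Athena / Parquet pieces of this stack can fit together, the sketch below writes a Parquet dataset to S3, registers it in the Glue catalog, and queries it with Athena. It assumes the AWS SDK for pandas (awswrangler), which is not part of the stack listed above; bucket, database, and table names are hypothetical placeholders.

    ```python
    # Minimal sketch tying together S3, Glue, Athena, and Parquet,
    # assuming the AWS SDK for pandas (awswrangler) is available.
    # Bucket, database, and table names are hypothetical placeholders.
    import awswrangler as wr
    import pandas as pd

    # Write a small dataframe to S3 as Parquet and register it in the Glue catalog.
    df = pd.DataFrame({"security_id": ["A1", "B2"], "prediction": [0.42, 0.87]})
    wr.s3.to_parquet(
        df=df,
        path="s3://example-lake/predictions/",
        dataset=True,                   # enables Glue catalog integration
        database="example_analytics",   # Glue database (hypothetical)
        table="predictions",            # Glue table (hypothetical)
        mode="overwrite",
    )

    # Query the registered table with Athena and read the result as a dataframe.
    result = wr.athena.read_sql_query(
        "SELECT security_id, prediction FROM predictions WHERE prediction > 0.5",
        database="example_analytics",
    )
    print(result)
    ```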