Mid-Level Backend Engineer (Python Specialist)

Alaffia Health

New York, NY
Full Time
Paid

    About Alaffia & Our Mission

    Each year, the U.S. healthcare system suffers from over $500B in wasted spending due to medical billing fraud, waste, and administrative burden. At Alaffia, we’re on a mission to change that. We’ve assembled a team of clinicians, AI/ML engineers, and product experts to build advanced AI that finally bends the cost curve for all patients across our ecosystem. Collectively, we’re building best-in-class AI software to provide our customers with co-pilot tools, AI agents, and other cutting-edge solutions that reduce administrative burden and lower healthcare costs. We’re a high-growth, venture-backed startup based in NYC and are actively scaling our company.

    About the Role & What You’ll Be Doing

    In this role, you'll be at the forefront of developing and enhancing our data infrastructure, driving the efficiency and effectiveness of our healthcare payment processing platform. You'll have the opportunity to design and implement robust data pipelines, integrate cutting-edge technologies, and collaborate with cross-functional teams to deliver high-quality solutions. Your responsibilities will include writing production-level code, architecting scalable systems, and ensuring data integrity throughout the pipeline. Additionally, you'll play a key role in innovating our data engineering practices and addressing security challenges to safeguard sensitive information.

    Your Responsibilities

    • Developing production-level code in Python for our data pipelines and adjacent services to ensure efficient data processing
    • Designing and implementing new services to extract valuable insights from real-time data ingestion, enhancing the platform's capabilities
    • Architecting robust data pipelines using Apache Airflow and Kubernetes, enabling the integration of human-in-the-loop machine learning models into our production environment (see the DAG sketch after this list)
    • Integrating technologies like GraphQL and PostgreSQL to develop a flexible and high-performing web application used by various stakeholders in the healthcare industry
    • Driving innovation in data engineering by leveraging modern tools such as Apache Arrow, Postgres Foreign Data Wrappers, Delta Lake, and MLflow to optimize data processing workflows
    • Making architectural decisions to support live updates for users and ensure scalability for continuous ingestion of medical claims
    • Establishing comprehensive testing suites to validate data processing tasks and maintain data integrity throughout the pipeline
    • Collaborating closely with backend and machine learning engineers to implement new APIs and services, enhancing the platform's functionality
    • Working alongside the Payment Integrity team to incorporate their domain expertise into core development and data initiatives, ensuring alignment with industry best practices
    • Partnering with the Product team to enable frontend features driven by a robust and efficient data pipeline
    • Addressing security challenges related to private data at scale by implementing hardened data access policies, including Postgres Row-Level Security (RLS) policies, to safeguard sensitive information (see the policy sketch after this list)
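
    For a flavor of the Airflow work described in this list, below is a minimal sketch of a DAG in Python. The DAG, task, and function names are hypothetical placeholders rather than Alaffia's actual pipeline, and the sketch assumes Airflow 2.4+ (for the schedule argument).

    ```python
    # Minimal, hypothetical Airflow DAG: ingest a batch of claims, then score it.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def ingest_claims(**context):
        """Pull newly submitted medical claims from an upstream source (placeholder)."""
        ...


    def score_claims(**context):
        """Run a human-in-the-loop ML scoring step over the ingested batch (placeholder)."""
        ...


    with DAG(
        dag_id="claims_processing",       # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",               # continuous ingestion of medical claims
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest_claims", python_callable=ingest_claims)
        score = PythonOperator(task_id="score_claims", python_callable=score_claims)

        ingest >> score                   # ingestion runs before scoring
    ```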
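
    Similarly, as a hedged illustration of the row-level security item above, the sketch below shows one way an RLS policy can be applied from Python with psycopg2, assuming a non-superuser application role. The table (claims), column (payer_id, assumed to be text), and session setting (app.current_payer) are hypothetical placeholders, not Alaffia's actual schema.

    ```python
    # Hypothetical Postgres Row-Level Security (RLS) setup applied via psycopg2.
    import psycopg2

    RLS_DDL = """
    ALTER TABLE claims ENABLE ROW LEVEL SECURITY;
    ALTER TABLE claims FORCE ROW LEVEL SECURITY;  -- apply the policy to the table owner too

    CREATE POLICY payer_isolation ON claims
        USING (payer_id = current_setting('app.current_payer'));
    """

    with psycopg2.connect("dbname=example") as conn:  # placeholder connection string
        with conn.cursor() as cur:
            cur.execute(RLS_DDL)
            # Each service sets the session variable before querying, so a
            # connection only ever sees rows belonging to its own payer.
            cur.execute("SELECT set_config('app.current_payer', %s, false)", ("payer_123",))
            cur.execute("SELECT count(*) FROM claims")
            print(cur.fetchone()[0])
    ```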

    Requirements for the Role

    • 5+ years developing data engineering services and data processing pipelines
    • Algorithmic, statistical, or ML-based service development
    • Demonstrated experience working in a fast-paced, innovation-centric environment
    • Deep experience with pipeline solutions and modern data-related technologies, including:
      • Dataframe frameworks such as Pandas, Polars, Nushell, or Spark
      • Orchestrators such as Airflow or Kubeflow
      • Production level, object-oriented code written in Python
      • Testing and development tooling in Python, such as Pytest, Poetry, and Pydantic (see the sketch after the requirements list)
      • PostgreSQL or similar SQL technology
      • Data lakes and warehouses such as Snowflake, Delta Lake, Azure Synapse, or AWS Athena/Glue
    • PDF/image extraction and normalization experience
    • Object-oriented programming experience
    • Large file system and data lake management experience
    • Team development and mentorship
    • Preferred experience with modern infrastructure platforms and patterns
      • Kubernetes
      • AWS
      • Microservice architectures
      • Message queues (Kafka, AMQP, JetStream) and event-driven architectures
      • Datadog or similar logging and monitoring solutions
      • Container-based workflows with Docker
    • Nice to Haves:
      • Experience in the Healthcare, Insurance, or Healthcare Payments Industry
      • Experience working with FHIR, EDI, or 837 data formats
      • Experience with Software as a Service (SaaS) enterprise systems
      • Templating languages such as Helm, Jinja, or Go Templates
      • GitHub, CI/CD using GitHub Actions
      • Large Language Model development
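
    To make the Python testing and tooling item above concrete, here is a minimal, hypothetical sketch pairing a Pydantic model with Pytest checks; the ClaimLine model and its fields are illustrative placeholders rather than Alaffia's actual claim schema.

    ```python
    # Hypothetical Pydantic model for a claim line, with Pytest checks on validation.
    from datetime import date
    from decimal import Decimal

    import pytest
    from pydantic import BaseModel, ValidationError


    class ClaimLine(BaseModel):
        claim_id: str
        service_date: date
        billed_amount: Decimal
        procedure_code: str


    def test_valid_claim_line_parses():
        line = ClaimLine(
            claim_id="CLM-0001",
            service_date="2024-06-01",  # coerced to datetime.date by Pydantic
            billed_amount="125.50",     # coerced to Decimal
            procedure_code="99213",
        )
        assert line.billed_amount == Decimal("125.50")


    def test_missing_amount_is_rejected():
        # A claim line without a billed amount should fail validation.
        with pytest.raises(ValidationError):
            ClaimLine(
                claim_id="CLM-0002",
                service_date="2024-06-01",
                procedure_code="99213",
            )
    ```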

    Our Culture

    At Alaffia, we fundamentally believe that the whole is more valuable than the sum of its individual parts. Further to that point, we believe a diverse team of individuals with various backgrounds, ideologies, and types of training generates the most value. If you want to work alongside driven people on a mission to make a major impact at the core of U.S. healthcare by implementing the latest in cutting-edge technologies, then we’d like to meet you!

    What Else Do You Get Working With Us?

    • Company stock options
    • Hybrid work environment - work from the office and home
    • Employer-sponsored Medical, Dental, and Vision benefits
    • Flexible, paid vacation policy
    • Work in a flat organizational structure — direct access to Leadership

    Please note: Alaffia Health does not provide employment sponsorships at this time.