Flink with DataStream API

Katalyst Healthcares & Life Sciences

Dallas, TX
Full Time
Paid
  • Responsibilities

    Job Description

    Requirements for the resource:

    • Proficient in writing and supporting both the functional and non-functional aspects of Flink applications built with the DataStream API.

    Flink Functional Requirements (see the sketch after this list):

    • Expertise in Flink APIs (DataStream, ProcessFunction, etc.).
    • Competence in state management (checkpoints and savepoints) with local storage.
    • Configuration of connectors like EventHub, Kafka, and MongoDB.
    • Implementation of Flink API Aggregators.
    • Handling watermarks for out-of-order events.
    • Management of state using Azure Data Lake Storage (ADLS).
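
    As an illustration only (not part of the listing's own wording), here is a minimal sketch of a DataStream job touching several of the items above: a Kafka source, bounded-out-of-orderness watermarks for out-of-order events, a windowed AggregateFunction, and periodic checkpointing. It assumes Flink 1.x with the Kafka connector on the classpath; the broker address, topic, group id, and event-parsing helpers are placeholder assumptions.

    ```java
    import java.time.Duration;

    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.api.common.functions.AggregateFunction;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.kafka.source.KafkaSource;
    import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows;
    import org.apache.flink.streaming.api.windowing.time.Time;

    public class EventCountJob {

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Checkpoint every 60s so keyed state can be restored from checkpoints/savepoints.
            env.enableCheckpointing(60_000);

            // Kafka source; broker, topic, and group id are placeholders.
            KafkaSource<String> source = KafkaSource.<String>builder()
                    .setBootstrapServers("kafka-broker:9092")
                    .setTopics("events")
                    .setGroupId("flink-demo")
                    .setStartingOffsets(OffsetsInitializer.earliest())
                    .setValueOnlyDeserializer(new SimpleStringSchema())
                    .build();

            // Watermarks tolerate events arriving up to 30 seconds out of order.
            WatermarkStrategy<String> watermarks = WatermarkStrategy
                    .<String>forBoundedOutOfOrderness(Duration.ofSeconds(30))
                    .withTimestampAssigner((event, ts) -> extractEventTime(event));

            DataStream<String> events = env.fromSource(source, watermarks, "kafka-source");

            // Keyed tumbling-window aggregation: count events per key per minute.
            events.keyBy(EventCountJob::extractKey)
                  .window(TumblingEventTimeWindows.of(Time.minutes(1)))
                  .aggregate(new CountAggregator())
                  .print();

            env.execute("event-count-job");
        }

        // Placeholder parsers; a real job would deserialize a proper event schema.
        private static long extractEventTime(String event) {
            return System.currentTimeMillis();
        }

        private static String extractKey(String event) {
            return event.isEmpty() ? "unknown" : event.substring(0, 1);
        }

        /** Minimal AggregateFunction: counts events in each window. */
        public static class CountAggregator implements AggregateFunction<String, Long, Long> {
            @Override public Long createAccumulator() { return 0L; }
            @Override public Long add(String value, Long acc) { return acc + 1; }
            @Override public Long getResult(Long acc) { return acc; }
            @Override public Long merge(Long a, Long b) { return a + b; }
        }
    }
    ```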

    Flink Non-Functional Requirements (see the sketch after this list):

    • Set up a private Flink cluster within a designated AKS environment.
    • Configure both session-mode and application-mode deployments.
    • Define and provision cluster nodes and task slots.
    • Manage and configure Job/Task Managers.
    • Establish necessary connectors, e.g., external storage for the Flink Cluster.
    • Configure heap memory and RocksDB for state management.
    • Define and set up checkpoints and savepoints for state recovery.
    • Enable Auto-Pilot capabilities.
    • Integrate network resources, such as Azure EventHub and external databases like MongoDB.
    • Implement integration with Argo CD for job submissions.
    • Install LTM agents for logging and Dynatrace agents for monitoring purposes.
    • Provide access to the Flink Dashboard.
    • Establish High Availability (HA) and Disaster Recovery (DR) configurations.
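
    For the state-management items above (RocksDB, checkpoints/savepoints, ADLS), a hedged sketch of the programmatic configuration follows. It assumes Flink 1.x with the flink-statebackend-rocksdb dependency and the flink-azure-fs-hadoop plugin for the abfss:// scheme; the ADLS container and storage-account names are placeholders.

    ```java
    import org.apache.flink.contrib.streaming.state.EmbeddedRocksDBStateBackend;
    import org.apache.flink.streaming.api.CheckpointingMode;
    import org.apache.flink.streaming.api.environment.CheckpointConfig;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public final class StateBackendSetup {

        /** Applies RocksDB and ADLS checkpoint settings to a job's environment. */
        public static void configure(StreamExecutionEnvironment env) {
            // RocksDB keeps large keyed state on local disk and supports incremental checkpoints.
            env.setStateBackend(new EmbeddedRocksDBStateBackend(true));

            // Exactly-once checkpoints every 60s; savepoints use the same state backend.
            env.enableCheckpointing(60_000, CheckpointingMode.EXACTLY_ONCE);

            CheckpointConfig config = env.getCheckpointConfig();
            // Durable checkpoint storage on ADLS (placeholder container/account names).
            config.setCheckpointStorage(
                    "abfss://flink-checkpoints@examplestorage.dfs.core.windows.net/checkpoints");
            // Retain the latest checkpoint on cancellation so state can be recovered.
            config.setExternalizedCheckpointCleanup(
                    CheckpointConfig.ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION);
        }
    }
    ```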

    Experience:

    • 10 years of hands-on design and Java coding experience in back-end system development.
    • 5 years of hands-on experience with Kafka, Flink, cloud platforms, unit/functional/integration testing, SQL or kSQL, Java, GitHub Actions, Dynatrace, code scanning tools, and MongoDB.

    Additional details:
    Delivery of a fully configured Confluent Cloud infrastructure with Flink integration and an automated CI/CD pipeline to deploy to all environments (Test, Stage, Prod).

  • Qualifications

    Additional Information

    All your information will be kept confidential according to EEO guidelines.

  • Industry
    Hospital and Health Care