
Cloud Data Engineer / Big Data Engineer

Nr Consulting LLC


Irvine, CA
Full Time
Paid
  • Responsibilities

    Job Description

    ROLE: Cloud Data Engineer / Big Data Engineer

    WORK LOCATION: Irvine, CA [local candidates preferred]

    EXPERIENCE: 6+ years (ideal)

    TO BE SUCCESSFUL, A CANDIDATE WILL NEED TO:

    Translate application requirements, storyboards, and use cases into functional applications

    Design, build, and maintain efficient, reusable, and reliable code with documentation

    Ensure the best possible performance, quality, and responsiveness of applications

    Identify bottlenecks and bugs, and devise solutions to mitigate and address them

    Help maintain code quality, organization, and automation

    Follow the project-defined process and organizational standards and policies

    Provide regular progress updates to the reporting manager

    Interact with the team, participate in peer reviews, and fix peer-reported issues

    Communicate with stakeholders

    Design, develop, unit-test, and release high-quality code for new and existing products/applications.

    Participate actively in design, code, and quality reviews.

    Participate actively as a member of a scrum team including story creation, estimation, planning, sprint review demonstrations, and retrospectives.

    Adhere to the complete set of software development, design, coding, test-automation, and release-management standards, guidelines, and processes; suggest and implement improvements.

    Higher education industry experience is desirable

    SKILLSETS WE ARE LOOKING FOR IN A CANDIDATE:

    Minimum of 4+ years of professional experience developing applications in Python on AWS or Google Cloud

    4 to 6 years of IT experience implementing DW/BI projects in both on-premises and cloud environments

    4+ years of experience with Python, Pandas, NumPy, Matplotlib, and Seaborn

    2+ years of experience with Apache Spark

    2+ years of experience with Apache Beam and Apache Airflow

    4+ years of experience with SQL/PL-SQL

    Strong proficiency in Python and PySpark

    Strong exposure to data-processing frameworks such as Hadoop, Spark, and Kafka

    Strong proficiency in AWS services such as S3, EC2, EMR, RDS, and Redshift, or Google Cloud services such as Dataflow, GCS, Dataproc, and BigQuery

    Familiarity with various design and architectural patterns is a must

    Ability to write clean, readable, well-commented, and easily maintainable code

    Understanding of the fundamental design principles behind scalable applications

    Ability to write reusable libraries

    Experience implementing automated testing platforms and unit tests

    Proficiency with version-control tools such as Git, SVN, and TFS

    Familiarity with Continuous Delivery, Continuous Testing, and Continuous Deployment (DevOps, Jenkins, etc.)

    Good exposure to various SDLC methodologies such as Agile, iterative, and Waterfall