Data Engineer II - Big Data, GCP (EDAI)

American Express

Phoenix, AZ
Full Time
Paid
  • Responsibilities

    JOB DESCRIPTION

    The Enterprise Data and AI organization unites the data governance, strategy, engineering, and product teams with those responsible for AI engineering, generative AI enablement, and automation product and engineering. This group plays a pivotal role in leveraging data as a core driver of innovation and integrating AI capabilities to transform products, operations, and customer experiences.

    The EDAI organization also incorporates technology research and development and experimentation with emerging capabilities, along with engineering support for Amex Digital Labs. This integration ensures that research breakthroughs seamlessly translate into business impact.

    Unified Data Intelligence Technology builds and enables a trusted, scalable, and accessible enterprise data foundation that powers analytics and AI/ML across the organization. It delivers a centralized, cost-efficient unified data platform with elastic performance, standardized batch and streaming ingestion, and optimized compute, while embedding strong governance through enterprise metadata, data quality monitoring, role-based security, and regulatory compliance. The technology empowers teams with self-service analytics, reusable and well-governed data products, and end-to-end ML enablement, all enhanced by an intelligence layer that leverages metadata, semantic services, knowledge graphs, and AI-driven capabilities to improve data discovery, understanding, and decision-making at scale.

    RESPONSIBILITIES

    • Evaluates data requirements and stories, documents them for seamless integration into existing architectures, and maintains data models to support business needs

    • Builds and enhances data pipelines and database designs to meet performance, scalability, and security requirements

    • Collaborates in design reviews and testing, and provides production environment support with guidance from peers and leaders

    • Operates data assets in accordance with enterprise standards, guidelines, and policies

    • Completes work reviews and fosters a collaborative learning environment with guidance from peers and leaders

    • Communicates and collaborates with business and product teams to facilitate changes and implementation

    • Meets Big Data requirements by implementing basic partitioning and indexing solutions

    • Collaborates and co-creates with product and business teams to align technology initiatives with business objectives

    QUALIFICATIONS

    • 8+ years of experience in business intelligence and data engineering.

    • Strong hands-on experience with Google Cloud Platform (GCP) data engineering stack, including BigQuery, Dataproc, Cloud Composer, and data transformation tools (e.g., dbt).

    • 8+ years of experience in Big Data technologies, with proven expertise in designing and building scalable data pipelines and distributed systems.

    • Advanced proficiency in SQL and relational databases (e.g., PostgreSQL), including strong data modeling and performance optimization skills.

    • Experience integrating and consuming APIs (REST/GraphQL).

    • Working knowledge of machine learning concepts, with hands-on experience building and deploying models such as time series and linear regression, along with experience in hyperparameter tuning and model optimization.

    • Familiarity with unsupervised learning techniques, including clustering, dimensionality reduction, and association rule mining.

    • Proficiency in Python backend development and shell scripting for building automation and data workflows.

    • Experience with AI-assisted development tools such as GitHub Copilot and ChatGPT to enhance development efficiency and code quality.

    • Strong understanding of software engineering best practices, including design patterns, testing frameworks, and code quality standards.

    • Experience with CI/CD pipelines and tools such as Git, Docker, Jenkins, Maven/Gradle, and XL Release, ensuring reliable and automated deployments.

    • Solid understanding of Software Development Lifecycle (SDLC) methodologies, particularly SAFe Agile, and experience with tools such as Jira, Rally, and Confluence.

    • Depending on factors such as business unit requirements, the nature of the position, cost, and applicable laws, American Express may provide visa sponsorship for certain positions.

  • Industry
    Financial Services