Data Engineer III - Python, SQL - Senior Associate

J.P. Morgan

Software Engineering, Data Science
London, UK
Posted on Mar 16, 2026

Are you ready to shape the future of data engineering at JPMorgan Chase? Join a dynamic team where your unique skills will help build innovative solutions and contribute to a winning culture. You’ll have opportunities for career growth, collaborate with talented professionals, and make a real impact on our business objectives. Your expertise will empower our teams and drive success across the firm.

As a Data Engineer III in our agile team, you will design and deliver reliable data collection, storage, access, and analytics solutions that are secure, stable, and scalable. You will develop, test, and maintain essential data pipelines and architectures, supporting various business functions to achieve the firm’s goals. Working with us, you will use your skills to drive innovation and help shape our team culture. Together, we focus on excellence, collaboration, and continuous improvement.

Job responsibilities

  • Develop workflows and ELT pipelines using Python and Databricks.
  • Support review of controls to ensure sufficient protection of enterprise data.
  • Implement data security using entitlements frameworks.
  • Update logical or physical data models based on new use cases.
  • Use SQL frequently and understand NoSQL databases.
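The workflow and ELT responsibilities above can be sketched at toy scale in plain Python. The source records, table names, and transform rule below are hypothetical, and sqlite3 stands in for the warehouse; a production Databricks job would typically use PySpark or Delta Lake APIs instead.

```python
import sqlite3

# Hypothetical raw records, standing in for an extracted source feed.
raw_trades = [
    {"trade_id": 1, "symbol": "aapl", "qty": 100, "price": 190.5},
    {"trade_id": 2, "symbol": "msft", "qty": 50, "price": 410.0},
    {"trade_id": 3, "symbol": "aapl", "qty": -20, "price": 191.2},
]

def load_raw(conn, records):
    """EL step: land the extracted data as-is in a raw table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS raw_trades "
        "(trade_id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER, price REAL)"
    )
    conn.executemany(
        "INSERT INTO raw_trades VALUES (:trade_id, :symbol, :qty, :price)",
        records,
    )

def transform(conn):
    """T step: derive a cleaned table (upper-cased symbols, notional value)."""
    conn.execute(
        "CREATE TABLE trades AS "
        "SELECT trade_id, UPPER(symbol) AS symbol, qty, qty * price AS notional "
        "FROM raw_trades"
    )

conn = sqlite3.connect(":memory:")
load_raw(conn, raw_trades)
transform(conn)
rows = conn.execute("SELECT symbol, notional FROM trades ORDER BY trade_id").fetchall()
```

The key ELT (as opposed to ETL) point is that the raw data is loaded first and transformed inside the store, so the original feed stays queryable for audit and reprocessing.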

Required qualifications, capabilities, and skills

  • Formal training or certification in software engineering concepts and 3 years of applied experience.
  • Good working knowledge of AWS, Databricks, and Python.
  • Experience across the data lifecycle.
  • Advanced SQL skills, including joins and aggregations.
  • Working understanding of NoSQL databases.
  • Significant experience with statistical data analysis and the ability to determine appropriate tools and data patterns for analysis.
  • Experience using AWS cloud services to develop, deploy, and manage applications at scale.
  • Good understanding and working knowledge of software development lifecycle tools used for configuration management, CI/CD pipelines, unit testing, regression testing, and performance testing.
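As a small illustration of the SQL depth the role asks for (joins and aggregations), the snippet below joins two tables and computes grouped totals. All table and column names are invented for the example; sqlite3 is used only so the SQL is runnable anywhere.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Invented example schema: accounts and their payments.
conn.executescript("""
    CREATE TABLE accounts (account_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE payments (payment_id INTEGER PRIMARY KEY,
                           account_id INTEGER, amount REAL);
    INSERT INTO accounts VALUES (1, 'EMEA'), (2, 'EMEA'), (3, 'APAC');
    INSERT INTO payments VALUES (10, 1, 250.0), (11, 2, 100.0),
                                (12, 1, 50.0), (13, 3, 75.0);
""")

# Join payments to accounts, then aggregate volume and count per region.
totals = conn.execute("""
    SELECT a.region, SUM(p.amount) AS total, COUNT(*) AS n_payments
    FROM payments p
    JOIN accounts a ON a.account_id = p.account_id
    GROUP BY a.region
    ORDER BY a.region
""").fetchall()
```

The same join-then-aggregate shape carries over directly to Databricks SQL or Spark DataFrames.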

Preferred qualifications, capabilities, and skills

  • Familiarity with standardized data layer practices (e.g., Medallion architecture).
  • Exposure to Aurora Postgres and MongoDB.
  • Skill in designing efficient data models, including normalization, denormalization, and schema design, with an understanding of relational and star schemas.
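The star-schema modelling mentioned above can be illustrated with a toy central fact table surrounded by dimension tables; every name here is hypothetical, and sqlite3 again stands in for a real warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A minimal star schema: one fact table keyed to two dimension tables.
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY,
                              name TEXT, category TEXT);
    CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date,
                             product_key INTEGER REFERENCES dim_product,
                             units INTEGER, revenue REAL);

    INSERT INTO dim_date VALUES (20260316, '2026-03-16');
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'),
                                   (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales VALUES (20260316, 1, 3, 30.0),
                                  (20260316, 2, 1, 25.0);
""")

# The characteristic star-schema query: join the fact table to its
# dimensions and aggregate over a dimension attribute.
by_category = conn.execute("""
    SELECT p.category, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY p.category
""").fetchall()
```

The dimensions are deliberately denormalized (category lives directly on dim_product) so analytical queries need only one join per dimension, which is the trade-off star schemas make against fully normalized relational models.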


J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world’s most prominent corporations, governments, wealthy individuals and institutional investors. Our first-class business in a first-class way approach to serving clients drives everything we do. We strive to build trusted, long-term partnerships to help our clients achieve their business objectives.
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

J.P. Morgan’s Commercial & Investment Bank is a global leader across banking, markets, securities services and payments. Corporations, governments and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk and extends liquidity in markets around the world.