Software Engineer, Data Platform

Published on October 19, 2023

Who we are

About Stripe

Stripe is a financial infrastructure platform for businesses. Millions of companies—from the world’s largest enterprises to the most ambitious startups—use Stripe to accept payments, grow their revenue, and accelerate new business opportunities. Our mission is to increase the GDP of the internet, and we have a staggering amount of work ahead. That means you have an unprecedented opportunity to put the global economy within everyone’s reach while doing the most important work of your career.

About the team

You’ll join one of the teams behind the data processing platforms used by the rest of engineering, such as our event bus, stream processing systems, batch processing systems, real-time analytics, and asynchronous processing platforms like task queues and workflow engines. You’ll make decisions with significant impact across Stripe. There is a lot of work to do to make Stripe engineers’ work easier and our platforms even more reliable than they are today, and we’d love for you to be a part of it. We stay close to the people using our systems and constantly gather feedback that we use to make them better.

What you’ll do

We have a few dozen infrastructure engineers today spread across several different teams, and you’ll work with other infrastructure engineers as well as the product engineers who use the systems we build.

We’re looking for people with a strong background (or interest!) in data. We’d love to hear from you whether you’re a seasoned software engineer or whether you’ve just learned you might like working with real-time systems. Many of our infrastructure engineers work remotely, and we’d be happy to talk to you about the possibility of working remotely.

Responsibilities

  • Design, build, and maintain data infrastructure systems such as Kafka, Flink, Pinot and Airflow used by all of Stripe’s engineering teams
  • Design alerting and testing systems to ensure the accuracy and timeliness of these pipelines (e.g., improving instrumentation, optimizing logging, etc.)
  • Debug production issues across services and levels of the stack
  • Plan for the growth of Stripe’s infrastructure
  • Build a great customer experience for developers using your infrastructure
  • Work with teams to build and continue to evolve data models and data flows that enable data-driven decision-making
  • Identify shared data needs across Stripe, understand their specific requirements, and build efficient and scalable data pipelines that meet them and enable data-driven decisions across the company

Who you are

We’re looking for someone who meets the minimum requirements to be considered for the role. If you meet these requirements, you are encouraged to apply. The preferred qualifications are a bonus, not a requirement.

Minimum requirements

  • A strong engineering background and interest in data
  • Experience developing, maintaining and debugging distributed systems built with open source tools
  • Experience building infrastructure as a product centered on users’ needs
  • Experience optimizing the end-to-end performance of distributed systems
  • Experience with scaling distributed systems in a rapidly moving environment
  • Experience managing and designing data pipelines
  • Ability to follow the flow of data through various pipelines to debug data issues

Preferred qualifications

  • Experience working on stream processing systems such as Kafka, Kinesis, Flink or Beam
  • Experience working with batch processing systems such as Airflow or Spark
  • Experience working on real-time analytics systems such as Presto, Druid, or Pinot
  • Experience with orchestration platforms such as Cadence
  • Experience with Java, Scala and Ruby
  • Experience with Lambda Architecture systems