Senior Data Engineer

Los Angeles / Engineering — Platform Engineering / Full-time
Note to applicants: remote work within the US is OK, except in Colorado

Who we are
Albert is a new type of financial service that uses powerful technology to automate your finances, with a team of human experts to guide you. Albert saves and invests automatically for you, helps you avoid overdrafts, finds savings you’re missing, identifies bills you’re overpaying, and much more. Text Albert a financial question, and our geniuses won’t just offer guidance — they’ll help you take action.

We're an LA-based startup with a proven business model, backed by top-tier institutional investors and with nearly 6 million users who have trusted Albert to help them achieve their financial goals. We're on a mission to democratize money management through our simple, beautifully designed product, and we're looking for thoughtful, talented people to join us on our journey.

About the role
Managing, transforming, and accessing data efficiently is critical to every business process at Albert, from backend and mobile development to growth and business analytics. We are looking for talented Data Engineers to help support our data analytics pipelines and systems, as well as help us evolve our data architecture as we continue to scale.
Things you're good at
  • Shipping: Delivering great products that you're proud of on a regular basis.
  • Architecture: Getting it done is important. Getting it done in a way that will scale is equally important.
  • Diving in: Taking ownership of the data stack.
  • Collaboration: We bring the best out of each other. We're looking for people who will bring the best out of all of us.

What you'll do
  • Take over existing data pipelines, ETL, and task-running processes, starting with our ETL processes for BI analytics
  • Partner closely with VP of Analytics to make data accessible to the entire company so we can make timely decisions backed by data
  • Monitor our analytics data pipelines to ensure data quality and timeliness
  • Continuously improve our BI tooling, platforms, and monitoring to help the team create dynamic tools and reporting
  • Drive optimization, testing, and tooling to improve data quality
  • Write clean, maintainable, and well-documented code to support our data processes. Help improve and evolve our data architecture over time by planning, developing, and deploying infrastructure using state-of-the-art tools and practices appropriate for our needs
  • Concisely and effectively communicate the benefits and implications of adding new data technologies and techniques to our infrastructure

Requirements
  • Bachelor's degree
  • 4+ years of experience in a Data Engineering role, with a focus on building data pipelines
  • Experience with BI tooling and/or data app development
  • Proficiency in Python
  • Experience with some or all of the following: Postgres, Redshift, Celery, Elasticsearch, Kafka, and Airflow

Benefits
  • Competitive salary and meaningful equity
  • Health, vision and dental insurance
  • Daily meals provided
  • Monthly wellness stipend
  • 401k match
Apply for this job