Data Warehouse Engineer


Job description

Engineering at Klaviyo

Klaviyo is a startup located right in the heart of downtown Boston. We craft software that helps thousands of ecommerce companies build engaging relationships with hundreds of millions of consumers. We love tackling tough engineering problems and look for full-stack engineers who specialize in certain areas but are passionate about building, owning, and scaling features end to end from scratch, and about breaking through any obstacle or technical challenge in their way. We push each other to move out of our comfort zones, learn new technologies, and work hard to ensure each day is better than the last.

About the Role

As a Data Warehouse Engineer at Klaviyo, you will have the opportunity to architect, deliver, and optimize a data warehouse storing terabytes of data and growing to petabytes in the coming year. Klaviyo’s mission is to help businesses grow by unlocking the data stored in ecommerce companies’ existing systems; in a similar vein, your mission within Klaviyo will be to turn our data into a competitive advantage and put it at the fingertips of all internal Klaviyo teams. This data set spans thousands of customers, hundreds of millions of consumers, and billions of monthly events.

The person ideally suited for this job has equal passion for the technical and business sides of harnessing massive quantities of data. You have strong programming skills, data modeling and querying expertise, and systems chops. You get energized working with individuals across the organization to understand their data insight needs, how best to answer their questions, and how what you build and optimize every day will generate value for them and transform Klaviyo’s product and our customers’ businesses.

About You

You Will

  • Have the chance to manage a data warehouse holding 50+ TB of data, growing 200% YoY
  • Work on business projects that have impact across Klaviyo and turn easy access to data into a competitive advantage for internal Klaviyo teams and the Klaviyo product
  • Wholly own your systems and the data warehousing strategy
  • Research, prototype, build and evolve a data warehouse using industry-standard technologies and AWS-optimized solutions and partners
  • Grow your own technical skills by mastering not only real-time data processing techniques and the latest data warehousing technologies, but relevant engineering skills such as Python programming, database administration & optimization, machine learning, distributed computing and more
  • Pair with product, marketing, and sales experts to derive insights that transform Klaviyo’s and our customers’ businesses
  • Pair with Klaviyo product engineers to improve the engineering data pipeline, surface more streams and sources of data reliably and cost effectively, and improve the Klaviyo product with insights derived from the data warehouse

You Have

  • Bachelor’s Degree in Computer Science, Information Systems or similar programs (or similar relevant experience)
  • Strong programming experience in languages such as Python, Ruby, Go, etc.
  • Experience writing enterprise-grade code that is understandable, maintainable, testable and reusable
  • An innate desire to deliver results, break through obstacles, deliver value quickly, and master new technical and business skills
  • Experience working with business partners and other engineers in a product company to gather, understand, and bridge definitions and requirements
  • Exceptional relational database and SQL query expertise
  • Proven ability to bring data in from internal and third-party sources reliably; you take operational reliability and performance seriously
  • Love for getting your hands dirty with the data implementing custom ETLs to shape it into information
  • Frugal nature: you love efficiency and always have an itch to tune your data pipelines to deliver accurate data efficiently, without excessive spend on storing or processing extraneous information

Bonus Experience

  • AWS Cloud data warehouse and pipeline tools (RedShift, Data Pipeline, Kinesis, EMR, etc.)
  • Hadoop ecosystem tools (Spark, Hive, Pig, MapReduce, etc.)
  • Business intelligence tools (Tableau, QlikView, Looker, etc.)
  • Database management and performance concepts such as indices, segmentation, projections, partitions and advanced query performance analysis
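For a sense of the day-to-day work, the "custom ETLs to shape data into information" mentioned above might look like this minimal Python sketch. It is illustrative only; the source format, table, and column names are hypothetical, not Klaviyo's actual schema, and SQLite stands in for a real warehouse.

```python
# Minimal ETL sketch: extract rows from CSV text, transform them,
# and load them into a SQLite table. All names are hypothetical.
import csv
import io
import sqlite3


def run_etl(raw_csv: str, conn: sqlite3.Connection) -> int:
    """Extract, transform, and load event rows; return the row count loaded."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events "
        "(customer_id TEXT, event_type TEXT, amount_cents INTEGER)"
    )
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        # Transform: trim whitespace, normalize the event type,
        # and convert dollar amounts to integer cents.
        rows.append((
            rec["customer_id"].strip(),
            rec["event_type"].strip().lower(),
            int(round(float(rec["amount"]) * 100)),
        ))
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)


raw = "customer_id,event_type,amount\nc1, Purchase ,19.99\nc2,refund,5.00\n"
conn = sqlite3.connect(":memory:")
loaded = run_etl(raw, conn)
```

A production pipeline would add the operational concerns the posting emphasizes: idempotent loads, schema validation, and monitoring for reliability and cost.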
