April 22, 2024

Data Engineer

Company: Klaus
Location: Estonia 🇪🇪 (Hybrid Remote from Estonia)

What's the opportunity?

Zendesk’s category-defining quality management platform for customer support teams makes giving internal feedback easy and systematic. We’re at the forefront of the burgeoning customer experience market, enabling support teams to review & improve their customer service quality. We know it takes an amazing team to build such fantastic products.

That is why we are looking for a Data Engineer who loves to tackle complex problems to join our team and help us shape our vision. Does that sound like you? If so, read on!

Our Stack

  • Go, Python, Java, PostgreSQL, Google Cloud Spanner, gRPC, REST, microservices
  • Google AI Platform, Apache Beam, and Airflow (see the illustrative sketch after this list)
  • Google Kubernetes Engine, Cloud Run, GitHub Actions, Bazel, and a monorepo
  • dbt
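
To give a concrete feel for this stack, here is a minimal sketch of how Airflow and dbt might fit together: a DAG that builds the dbt models daily and then runs their tests. The DAG id, schedule, and project directory are illustrative assumptions, not details from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG: build dbt models daily, then run dbt tests.
# The dag_id, schedule, and --project-dir path are assumptions for illustration.
with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/analytics",
    )
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/analytics",
    )

    # Tests only run once the models have built successfully.
    run_models >> test_models
```

The `>>` operator encodes the dependency, so the test task runs only after the build task succeeds.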

What you'll be doing

  • Publish well-written and tested code to production
  • Investigate production issues and fine-tune our data pipelines
  • Continually improve data pipelines for high efficiency, throughput and quality of data
  • Collaborate with team members on researching and brainstorming different solutions for technical challenges we face
  • Solve complex problems with passion and technical leadership
  • Develop standard methodologies and mentor others to help make technical decisions on projects

What you'll bring to the role

Basic Qualifications

  • 2+ years of hands-on experience building scalable data platforms and/or reliable data pipelines
  • Java, Python, Scala (proficient in at least one)
  • Strong developer skills and a demonstrated passion for designing scalable, fault-tolerant software systems
  • Experience with AWS, Google Cloud or related cloud technologies
  • Experience in developing and operating high-volume, high-availability environments
  • Solid working understanding of data modeling
  • Working understanding of Kubernetes infrastructure and security best practices
  • Ability to work effectively in a remote environment with geographically distributed teams and customers
  • Proven track record with cloud and data-related technologies

Preferred Qualifications

  • Experience writing ETL jobs to address a range of data engineering challenges (a minimal sketch follows this list)
  • Strong understanding of build and deployment tools
  • Familiarity with Kafka, Flink, or Spark, and a proven understanding of at least one job scheduling tool: Airflow, Celery, or AWS Step Functions
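
As a rough illustration of the ETL work mentioned above, here is a minimal Apache Beam pipeline in Python that reads raw JSON events, drops malformed records, keeps a single event type, and writes the cleaned output. The bucket paths, field names, and filter rule are hypothetical, chosen only to show the shape of such a job.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str):
    """Parse one JSON line into a dict; return None for malformed records."""
    try:
        return json.loads(line)
    except json.JSONDecodeError:
        return None


def run():
    # Paths and the "review" event type below are illustrative assumptions.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.json")
            | "Parse" >> beam.Map(parse_event)
            | "DropMalformed" >> beam.Filter(lambda event: event is not None)
            | "KeepReviews" >> beam.Filter(lambda event: event.get("type") == "review")
            | "Serialize" >> beam.Map(json.dumps)
            | "WriteClean" >> beam.io.WriteToText("gs://example-bucket/clean/reviews")
        )


if __name__ == "__main__":
    run()
```

With Beam, the same pipeline code can run locally on the DirectRunner for testing or on a managed runner such as Dataflow in production.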

About Klaus    

For the past five years we’ve been building the best quality management solution that helps companies drive revenue by identifying gaps in customer experience – using AI, automation, and data analysis. This year we hit an important milestone and were acquired by Zendesk!

Zendesk is a global, digital-first company on a mission to make the world better—one customer experience at a time.
