Data Engineer
Location: Berlin, Germany

About the opportunity

At Contentful we collect and centralize data from the different components of our microservice-based platform. We enrich our platform data with the wealth of information that comes from Salesforce, Zendesk, and the several other services we use to interact and communicate with our users. As we grow, we add services to our data aggregation pipeline and curate the collected data.

As a Data Engineer at Contentful, you will work on the Data Pipeline across the entire Contentful stack, together with our Infrastructure, Backend, and Frontend engineers, to ensure that data flows consistently and is collected in a form that allows for the generation of insights.

You will also be responsible for continually iterating on and innovating our Data Warehouse, Data Lake, and the ETL procedures around them, ensuring that the flow of new data doesn’t degrade performance. On the topics of ETL and data modeling, you will work closely with Data Analysts to understand in depth the performance of domain models and the analytical requirements.

Ideally, you have already worked on the Business Intelligence infrastructure of an online business and you’re looking for a challenge in a company that puts data-driven decisions at the center of its product and business.

What to expect?

  • Develop and maintain Data Pipeline Applications (data collectors that live within or outside the Contentful core infrastructure)
  • Maintain and evolve the Data Warehouse (AWS Redshift) and Data Lake (Athena) ensuring appropriate performance
  • Maintain and evolve ETL services to enrich the data model and serve analytics and the generation of insights, using tools such as Airflow

What you need to be successful?

  • Solid professional knowledge of Python and/or Go; TypeScript is an advantage
  • You can build applications to help and support data collection in a microservice architecture
  • Comfortable with Docker, as we are using it extensively
  • Experience with Athena, AWS Redshift, and relational databases (PostgreSQL, MySQL)
  • You are familiar with AWS services and concepts (VPC, IAM)
  • Experience using Segment is considered a plus, and knowledge of Airflow, Kubernetes, and Terraform is a strong benefit
  • Experience with pub/sub and streaming systems (RabbitMQ, AWS Kinesis, Kafka) and BI tools such as Looker is considered a plus
  • You have strong SQL modeling and optimization knowledge for different analytics workloads and scenarios
  • Experience designing and building ETL pipelines, both batch and near-real-time

What's in it for you?

  • Join an ambitious tech company reshaping the way people build digital products
  • Full-time employees receive Stock Options for the opportunity to share in the ownership and success of our company
  • We value work-life balance and You Time! A generous combination of paid time off, sick days, and paid holidays
  • Comprehensive health care package (health, vision, dental insurance, life and disability insurance) and a retirement savings plan
  • Use your personal education budget to improve your skills and grow in your career. Join a free German class or one of our many internal learning initiatives!
  • Enjoy a full range of virtual events, including workshops, guest speakers, and fun team activities, supporting learning and networking exchange beyond the usual work duties 
  • Our Berlin-based employees: Get fit! We offer an Urban Sports Club discount and virtual morning fitness classes to help you reach your fitness goals.
  • Share and navigate the excitement of a new workplace with your CFF (Contentful First Friend) 
  • Plus, Contentful socks! And other amazing swag as part of company events. Oh yeah!

