Istanbul, Turkey

Senior Data Engineer

Full-Time / Remote

Join us on our exciting journey at Dataroid, the award-winning digital analytics and customer engagement platform!

Dataroid empowers leading brands to enhance every individual customer experience through deep customer insight, data science modelling, and omnichannel marketing.

As Turkey’s fastest-growing data analytics platform, we embrace challenges, explore new technologies, and aim to impact industries globally by helping businesses harness data to create seamless customer experiences.

At Dataroid, our developers drive innovation, staying ahead of technology trends to deliver simple and seamless solutions. Already used by leading enterprises in finance, airlines, and retail, Dataroid reshapes the experience of over 120 million users.

Dataroid is in search of a Senior Data Engineer who will design and build large-scale, resilient data pipelines, ensuring high performance, scalability, and seamless integration with a variety of frameworks.

Key Responsibilities

  • Design, architect, and build large-scale, resilient real-time data pipelines that process billions of events daily from 120M+ users using Apache Spark, Apache Flink, Apache Kafka, and other tools and frameworks (a minimal pipeline sketch follows this list)
  • Own end-to-end data platform architecture from SDK data ingestion to real-time analytics APIs serving major enterprises across banking, airlines, and retail, making key technical decisions on storage, processing, and serving layers
  • Build unified customer profile systems that consolidate fragmented user data across web, mobile, and IoT channels in real time
  • Write well-designed, reusable, testable, secure and scalable high-quality code that powers mission-critical analytics for major enterprise brands
  • Drive architectural decisions on streaming, storage, and processing layers as data volumes and client requirements grow
  • Collaborate with cross-functional teams including product, ML, and analytics to shape data strategy and enable data-driven insights
  • Mentor engineers and establish data engineering best practices across the organization
  • Ensure platform reliability and performance to meet enterprise SLAs for systems processing millions of events per second
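
To give candidates a concrete sense of the kind of streaming work described above, here is a minimal, hypothetical sketch of consuming SDK events from Kafka with Spark Structured Streaming and computing per-minute event counts per channel. The broker address, topic name, and event schema are invented for illustration and are not Dataroid's actual pipeline.

```python
# Minimal streaming sketch (assumed broker, topic, and schema; illustrative only).
# Requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col, window
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("event-pipeline-sketch").getOrCreate()

# Assumed SDK event schema; a production payload would be richer.
event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("channel", StringType()),      # e.g. web, mobile, iot
    StructField("event_name", StringType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from a hypothetical Kafka topic.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
       .option("subscribe", "sdk-events")                  # hypothetical topic
       .load())

# Parse the JSON payload into typed columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), event_schema).alias("e"))
          .select("e.*"))

# One-minute tumbling windows of event counts per channel, tolerating late data.
counts = (events
          .withWatermark("event_time", "5 minutes")
          .groupBy(window(col("event_time"), "1 minute"), col("channel"))
          .count())

# Console sink for the sketch; a real deployment would write to an analytics
# store such as Druid or ClickHouse.
query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```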

Qualifications & Experience

  • BSc/MSc/PhD degree in Computer Science or a related field or equivalent work experience
  • 5+ years of experience in Data Engineering, Data Architecture or similar role building production systems at scale
  • Proven experience architecting and operating real-time analytics systems handling large volumes of data with demonstrated ability to discuss technical tradeoffs and scalability challenges
  • Strong experience with real-time data modeling, ETL/ELT practices, and streaming architectures at enterprise scale
  • Expert-level proficiency in one or more high-level, Python- or Java-based batch and/or stream processing frameworks such as Apache Spark, Apache Flink, or Kafka Streams
  • Production experience with columnar stores and real-time analytics databases (Druid, ClickHouse strongly preferred)
  • Strong experience with relational and non-relational data stores, key-value stores and search engines (PostgreSQL, ScyllaDB, Redis, Hazelcast, Elasticsearch etc.)
  • Hands-on experience with data workflow orchestration and transformation tools such as Airflow or dbt (see the orchestration sketch after this list)
  • Deep understanding of storage formats such as Parquet, ORC and/or Avro
  • Strong experience with distributed systems, concurrent programming, and real-time data processing at scale
  • Experience with distributed storage systems like HDFS and/or S3
  • Familiarity with data lake and data warehouse solutions including Hive, Iceberg, Hudi and/or Delta Lake
  • Strong analytical thinking and problem-solving skills, with the ability to debug complex distributed systems
  • Strong verbal and written communication skills, with the ability to explain technical decisions to both engineers and business stakeholders
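
As a small illustration of the orchestration experience mentioned above, here is a minimal, hypothetical Airflow DAG sketch (Airflow 2.4+ style; the DAG name, task, and compaction step are invented for illustration) that schedules a daily job compacting raw events into Parquet partitions:

```python
# Minimal Airflow DAG sketch (hypothetical names; illustrative only).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def compact_events(ds, **_):
    # Placeholder: in practice this would trigger a Spark job that rewrites
    # the day's raw events as partitioned Parquet files in the data lake.
    print(f"Compacting events for {ds}")


with DAG(
    dag_id="daily_event_compaction",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="compact_events", python_callable=compact_events)
```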

Nice to have

  • Familiarity with containerization & orchestration – Docker and/or Kubernetes
  • Experience with product analytics, behavioral data, or customer journey analytics platforms
  • Background in real-time analytics, customer intelligence, or B2B SaaS platforms
  • Familiarity with generative models and a strong enthusiasm for generative AI, large language models (LLMs), and agentic systems
  • Prior experience with SCRUM/Agile methodologies

Why Dataroid?

🌟 Great Compensation & Benefits: We provide an attractive compensation package, including private health insurance, company-supported pension plans, meal vouchers, commute assistance, remote work benefits, and a paid day off for your birthday.

💻 Enhanced Workdays: Enjoy the flexibility of adaptable working hours. We offer online events, inspiring guest speakers, office snacks, a culture that limits unnecessary meetings, and many other perks designed to make your weekdays more enjoyable.

📚 Growth & Learning: Your development is our priority, with access to premier online learning platforms like Udemy, digital libraries, and tailored training programs to support your career journey.

🚀 Thriving Community: Be part of a vibrant and close-knit team that values connection. Enjoy happy hours, workshops, seasonal celebrations, and other events that bring us together.

🎯 Open Dialogue: Our flat organizational structure fosters direct and transparent communication. Your ideas and feedback are always welcome, ensuring your voice is heard and valued.

We will process your personal data as part of our recruitment procedures. To find out more, please consult our Candidate Privacy Notices.