Driving Impact at Scale
All one needs is strategy, skill, and resources to make digitalization and AI happen. Innovation, added value, competitive advantage, and a short time-to-market are just around the corner. So why is everything taking so long? Shouldn't you have been finished yesterday? Mastering innovation means keeping many balls in the air at once. In this talk I'll present a transformation use case from an established player, including our best practices and anti-patterns.
The road to a billion forecasts per day at Albert Heijn
Every day, millions of Dutch and Belgian households do their groceries at Albert Heijn. Fulfilling these customer needs while minimizing waste is one of the operational optimization problems Albert Heijn faces every day. In this session we will share details on how our data scientists use the cloud and data science to tackle this large-scale problem and keep Albert Heijn the number-one supermarket in the Netherlands and Belgium.
dbt Vision and Developments
After five years of changing the way analytics teams work, dbt Core is reaching its v1.0 milestone. Jeremy will discuss the big priorities leading up to v1, what still feels missing in dbt-anchored data stacks in 2021, and give a sneak preview of what the team will be working on in 2022.
Networks! project - real-time analytics that controls 50% of the mobile network in Poland
The ability to analyze data in real time is crucial for diagnosing a mobile network and ensuring quality of service for end customers. To achieve this, we have built a real-time ingestion and analytics platform that processes 2.2 billion messages a day from mobile network hardware. During the talk we will show how we used Flink and Flink SQL to build this platform. The solution computes more than 5,000 KPIs and 1,500 aggregations defined in SQL, across 750 Kafka topics. We will describe how we manage Flink jobs at scale using Ververica and Kubernetes, how we monitor the platform using ClickHouse, and what problems we had to overcome in the project.