Processing 1 billion rows per second
Hans-Jürgen Schönig has been a PostgreSQL expert and database specialist since the 1990s. He is CEO and technical lead of CYBERTEC, one of the database market leaders worldwide, which has served countless customers around the globe since the year 2000. As CEO of CYBERTEC, he regularly advises customers on database services and creates individual concepts tailored to each client's needs. Additionally, he regularly gives training on PostgreSQL Advanced Optimization & Performance Tuning, PostgreSQL for Business Intelligence and Mass Data Analysis, PostgreSQL Replication Professional, and Linux for PostgreSQL DBAs - just to name a few.
Everybody is talking about Big Data and about processing large amounts of data in real time or close to real time. However, processing a lot of data does not require commercial software or some NoSQL stuff. PostgreSQL can do exactly what you need and process A LOT of data in real time. During our tests we have seen that crunching 1 billion rows of data in real time is perfectly feasible, practical, and definitely useful. This talk shows which settings have to be changed inside PostgreSQL and what we learned when processing so much data for analytical purposes.
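The abstract does not name the specific settings the talk covers, so the following is only an illustrative sketch: PostgreSQL 9.6, current at the time of this talk, introduced parallel query execution, which is typically among the first things tuned for large analytical scans. The parameter values below are assumptions for a many-core analytics server, not figures from the talk:

```sql
-- postgresql.conf excerpt (hypothetical values, tune to your hardware):
--   max_worker_processes = 32             -- upper bound on background workers
--   max_parallel_workers_per_gather = 16  -- workers allowed per parallel scan

-- The same knob can be raised for a single session:
SET max_parallel_workers_per_gather = 16;

-- When PostgreSQL chooses a parallel plan, EXPLAIN shows a Gather node
-- with a Parallel Seq Scan underneath it:
EXPLAIN SELECT count(*) FROM big_table;  -- big_table is a placeholder name
```

Whether the planner actually parallelizes a scan also depends on table size and cost settings, so results vary by workload.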
- 2017 March 30 09:00 EDT
- 50 min
- Liberty III
- PGConf US 2017 [PgConf.US]
- Use Cases