PostgreSQL exponent EDB has enhanced its data platform, claiming the update will help bring transactional, analytical, and AI workloads into a single environment.
The release, EDB Postgres AI, includes a purpose-built analytics engine for PostgreSQL that can scale independently of storage in the cloud and is optimized for columnar open table formats including Apache Iceberg and Delta Lake.
EDB claimed the new engine, which pushes queries down to the open source Apache DataFusion engine, ran analytical queries 30x faster than standard Postgres, while tiering that offloads cold transactional data to cheaper storage is up to 18x more cost-efficient.
DataFusion is an extensible query engine written in Rust that uses Apache Arrow as its in-memory format.
"Our analytics accelerator is a tiered table, where particular database analytical queries can be pushed to another open source engine, Apache DataFusion. We've done the glue and the hook-up between that and our Postgres distributed product to make an analytics accelerator that you can use right from within PostgreSQL," said Tom Kincaid, EDB senior veep for database server and tools.
Packaged as part of EDB's Postgres AI platform, the tiered storage and query engine can offer 6x better TCO and 30 percent faster transactional performance than SQL Server, the vendor claims.
PostgreSQL has mainly been known as a transactional database, but the latest move reflects increased interest in running analytics in the same environment, in part to support AI use cases. It follows EDB's decision to fork Greenplum - an OLAP system built on PostgreSQL that once hoped to compete with mainstream data warehouse products such as Teradata - to build WarehousePG.
"We've also started the open source PostgreSQL data warehouse project, which effectively [handles] Greenplum-related workloads doing traditional analytical queries," said Kincaid, a former engineering veep at Oracle.
However, the objective is not to reposition PostgreSQL as an analytics database per se, but rather to offer users a solution when they need to do analytics within the PostgreSQL environment.
"The threshold for when you need to go from a traditional PostgreSQL OLTP database to a specialized analytics variant will get a little higher. Maybe [it is] set around [the] 2TB range now, and maybe it goes up to the 3TB range.
"For a large percentage of cases, PostgreSQL will be an OLTP database, but handling more and more analytics use cases along the way. I wouldn't say it will be a full-scale analytics engine in the future," he said. ®