Just having transactions isn't enough if they come at the cost of performance. In today's environment of Big Data, even the most reliable results are of little use if they arrive late. Trading algorithms need to update aggregate figures in real time, processing petabytes of trade data each moment. Healthcare applications need to rapidly track test results and compare them against millions of similar cases in order to assist doctors in choosing the best course of action. Point of Sale (POS) applications need to update inventory totals, regional sales and purchasing needs on the hour. These requirements call for a database capable of handling massive amounts of data in real time, while guaranteeing integrity for every transaction. And the database must accomplish this while consuming as few resources as possible, so that teams with aging hardware can run it as if they had top-notch equipment.
In some cases, data is streamed from small edge points like sensors embedded in machinery, clothing and even the human body, then relayed to servers for immediate processing. On a Raspberry Pi, a $25 machine running on low-powered ARM chips with a mere 1 GB of RAM, RavenDB can handle over 13,000 reads per second and over 1,000 writes per second. This is more than enough for most small to medium applications. For demands higher than that, a single RavenDB server running on a machine costing less than $1,000 can handle over 150,000 writes per second and over a million reads per second.