Requirements: Memory size

Hello!
We’re planning to use TigerGraph Enterprise to capture streaming data. The average daily ingest is currently about 350 GB, and we would process the same volume continuously (24x7). We expect the unique data to stabilize at around 1 TB; for most items we will only update timestamps. We also plan to remove elements older than 90 days.
So the question is: can we run a production cluster with a total of 64 cores and 488 GB of memory, or will the system fail due to lack of memory?
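
For the 90-day retention part, this is a minimal sketch of how such a daily cleanup job could look from pyTigerGraph, assuming a vertex type `Item` with an epoch attribute `updated_at` (both names are hypothetical, as are the connection details) and that the filtered `delVertices` call is available in your client version; your schema will differ.

```python
# Hypothetical retention job: delete Item vertices whose updated_at epoch
# timestamp is older than 90 days. Vertex type, attribute name, and
# connection details are illustrative assumptions, not from the original post.
import time
import pyTigerGraph as tg

RETENTION_DAYS = 90

conn = tg.TigerGraphConnection(
    host="https://your-tigergraph-host",  # placeholder host
    graphname="MyGraph",                  # placeholder graph name
    username="tigergraph",
    password="password",
)

# Cutoff: anything last updated before this epoch second is considered stale.
cutoff = int(time.time()) - RETENTION_DAYS * 24 * 60 * 60

# delVertices issues a filtered delete against the vertices endpoint;
# the filter syntax is "attribute<value".
deleted = conn.delVertices("Item", where=f"updated_at<{cutoff}")
print(f"Deleted {deleted} Item vertices older than {RETENTION_DAYS} days")
```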

Hey @centerco, TigerGraph’s compression ratio is pretty good. 1 TB should compress to less than 488 GB of RAM, but there may be constraints when running complex queries, so you will be sitting pretty close to full throttle. Let me close this thread, and I’ll ask one of our Solution Architects to connect with you.
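
For a rough sanity check, here is a back-of-the-envelope sizing sketch; the compression ratio and the query-workspace headroom are illustrative assumptions, not measured TigerGraph figures, so treat the output as an order-of-magnitude estimate only.

```python
# Back-of-the-envelope memory check. The compression ratio and the headroom
# reserved for query execution / OS are assumptions for illustration.
raw_unique_data_gb = 1000          # ~1 TB of unique data expected at steady state
assumed_compression_ratio = 0.5    # assumption: data compresses to ~50% of raw size
query_headroom_fraction = 0.30     # assumption: keep ~30% of RAM free for queries/OS
cluster_ram_gb = 488

compressed_gb = raw_unique_data_gb * assumed_compression_ratio
usable_for_data_gb = cluster_ram_gb * (1 - query_headroom_fraction)

print(f"Estimated in-memory footprint: {compressed_gb:.0f} GB")
print(f"RAM budget for data after headroom: {usable_for_data_gb:.0f} GB")
print("Fits" if compressed_gb <= usable_for_data_gb else "Tight / may not fit")
```

Under these assumptions the data itself fits in 488 GB, but the remaining headroom for complex queries is thin, which is why the cluster would be running close to its limit.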