Failed - Accelerator (creation)

My data source is S3. Through Dremio I can connect very easily and quickly, and I created a dataset. I plan to visualize this data through Tableau, but when I configure a reflection or query the live dataset, it generates a memory error. I would like to know how I can solve this problem, or what the best way is to create a reflection when there is a lot of data.

Hi Adam - where are you running Dremio? The macOS and Windows installers assume a laptop environment and configure minimal use of RAM. We designed Dremio to run on clusters of servers with healthy amounts of RAM. It is possible that, based on the size of your raw data, you are exceeding the capacity of the resources allocated to Dremio.

Just a guess as to what may be going on - tell us more and we will see what we can figure out!

We installed Dremio on an EC2 instance (c4.2xlarge) with 600GB of storage. Only Dremio runs there. Is it possible to expand the RAM that Dremio can use? How would this configuration be done?

By default, Dremio will use all the memory available on the box (as direct memory, minus what is set aside for the heap). Dremio's processing is typically in-memory and hence memory intensive. Do you have anything else running on this box consuming resources? How large are the S3 datasets?
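For reference, memory limits can be adjusted in Dremio's `dremio-env` file (typically under the `conf` directory of the install). A minimal sketch, assuming a c4.2xlarge with 15GB of RAM; the exact split between heap and direct memory is something you should tune for your workload:

```shell
# conf/dremio-env -- example values only, tune for your instance

# JVM heap used for planning, metadata, and coordination work (in MB)
DREMIO_MAX_HEAP_MEMORY_SIZE_MB=4096

# Direct (off-heap) memory used for query execution (in MB).
# On a 15GB box, leaving some headroom for the OS:
DREMIO_MAX_DIRECT_MEMORY_SIZE_MB=8192
```

After changing these values, restart the Dremio service for them to take effect.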

Only Dremio runs on this box. So far it works well when the data volume is small, between 10MB and 150MB; the memory error is generated with the biggest monthly files. Is there any relationship between RAM and data volume?