Hi Team,
We are trying to run a pivot table report and are getting an error message: "Dataset is too large. Some fields cannot be expanded. Please narrow down the dataset". Can you please help us resolve this issue?
Is there a limitation on the size of data that the pivot table can handle? If so, what is the limitation?
Thanks
Hello,
Thank you for writing to us.
We want to inform you that the problem is likely to be caused by multiple expands. Expansion of cells is a heavy operation and requires time and browser resources. We suggest reducing the number of expands in the report.
For example, if you have slice.expands.expandAll in the Report Object set to true, we kindly recommend defining it as false. Also, we recommend not using the expandAllData API call. This will help the component process your data without the mentioned alert appearing.
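To illustrate the recommendation above, here is a minimal sketch of a report configuration with expandAll disabled. The field names and the data source filename are hypothetical placeholders; substitute your own slice configuration:

```javascript
// Sketch of a Report Object that keeps cells collapsed by default,
// avoiding the heavy "expand all" operation.
// "data.csv", "Country", and "Price" are placeholder names.
const report = {
  dataSource: {
    filename: "data.csv" // placeholder; use your actual data source
  },
  slice: {
    rows: [{ uniqueName: "Country" }],
    measures: [{ uniqueName: "Price", aggregation: "sum" }],
    expands: {
      expandAll: false // keep cells collapsed; let users expand on demand
    }
  }
};

// Pass this report when creating the component, and avoid calling
// expandAllData() afterwards, e.g. (browser context):
// new Flexmonster({ container: "#pivot", report: report });
console.log(report.slice.expands.expandAll); // false
```

With this setup, users can still expand individual cells interactively, which spreads the computation out instead of performing it all at once.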
As for the limitation on the data set size:
Flexmonster does not impose any programmatical limitations on the size of the data set and the number of expands. Still, the component uses resources available to the client's browser in order to perform calculations. Heavy operations performed over large data sets can lead to exceptions and page crashes.
Flexmonster aborts the operation and shows the mentioned alert if it takes too much time to complete. This helps avoid page crashes and errors.
Please let us know if it helps.
Do not hesitate to contact us in case further assistance is needed.
Kind regards,
Illia
Hi Team,
We have also encountered a similar issue. Is there a fix in recent versions of Flexmonster for the issue described above? Kindly advise.
Thanks
Nagaraj
Hello, Nagaraj,
Thank you for your question.
Our team would like to kindly explain that the message's purpose is to prevent page crashes and improve the user experience:
The pop-up message appears only when Flexmonster detects that the client's browser can't handle the current load. This approach helps to avoid potential page crashes and errors that can occur when working with large data volumes.
We would like to confirm that the best practices provided earlier in this thread are relevant.
We hope this helps. If further questions arise, please feel free to reach out.
Kind regards,
Vera
Hi,
I see this is an old ticket, and we are experiencing the same issue where the end user is trying to look at large amounts of data.
One large set of data for them is about 3 million rows, and each row could have up to 30 cells... can you confirm whether this is considered too large?
If I am understanding the responses on this issue correctly, a customer with a more powerful computer and plenty of RAM might allow the browser to handle larger amounts of data? Do you have any advice on client computer requirements for this size of data?
Your advice would be appreciated.
Thanks