I am facing an out-of-memory issue. The data contains only 30,000 records, and the field count is 185.
Please provide a solution to this issue.
Thanks.
Inva Team.
Hello, Nagaraj!
Thank you for reporting this issue to us.
Please note that such an issue may occur while loading big datasets directly on the client. This happens due to RAM limitations, which may vary for different machines. To provide you with more accurate assistance, we would like to know the following details:
Looking forward to hearing from you.
Best Regards,
Maksym
Hi Maksym,
Please find the details below:
Thanks,
Nilesh Mane
Hello, Nagaraj!
Thank you for sharing this information with us.
Several steps can be taken to optimize the loading process for big JSON files. First, you can try enabling the dataSource.useStreamLoader
property, which makes Flexmonster process large JSON files with the stream loader. Furthermore, the JSON data can be converted to the array-of-arrays format, which reduces the total JSON size by not duplicating field names in every record.
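As a minimal sketch of the second suggestion, the conversion to the array-of-arrays format could look like this (the helper name and sample fields are illustrative, not part of the Flexmonster API):

```javascript
// Convert an array of objects to the array-of-arrays format:
// the first row lists the field names once, so they are not
// repeated in every record of the serialized JSON.
function toArrayOfArrays(records) {
  if (records.length === 0) return [];
  const fields = Object.keys(records[0]);
  const rows = records.map((record) => fields.map((field) => record[field]));
  return [fields, ...rows];
}

const data = [
  { Country: "France", Price: 100 },
  { Country: "Japan", Price: 250 },
];
console.log(toArrayOfArrays(data));
// [["Country", "Price"], ["France", 100], ["Japan", 250]]
```

With 185 fields per record, dropping the repeated field names from 30,000 records can noticeably shrink the payload.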
If these steps do not help, we suggest filtering the data on the server side so that a smaller JSON is loaded on the page. For example, you could implement a custom user input to prefilter the dataset by year in the request for data. When another year is selected, the updateData()
API call can be used to load the corresponding data. Alternatively to filtering out rows, some unnecessary columns can be omitted from the server's response.
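As a rough sketch of how the pieces above could be wired together (the container id, selector id, endpoint URL, and year parameter are all illustrative assumptions):

```javascript
// Sketch only: assumes a Flexmonster instance on the page and a server
// endpoint that accepts a "year" query parameter.
const pivot = new Flexmonster({
  container: "#pivotContainer",
  report: {
    dataSource: {
      filename: "https://your.server.com/data?year=2023",
      useStreamLoader: true, // stream the large JSON instead of loading it at once
    },
  },
});

// When the user picks another year, request only that chunk of data.
document.getElementById("yearSelect").addEventListener("change", (event) => {
  pivot.updateData({
    filename: "https://your.server.com/data?year=" + event.target.value,
    useStreamLoader: true,
  });
});
```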
Please let us know if our recommendations helped you.
Best Regards,
Maksym
Hello, Nagaraj!
Hope you are doing well.
Our team is wondering if you had time to try the suggested solutions for the memory issue.
Looking forward to hearing your feedback.
Best Regards,
Maksym
Hello, Nagaraj!
We would like to know if you were able to solve the issue with the "Out of Memory" error.
Looking forward to hearing from you.
Best Regards,
Maksym
Hi Maksym,
We are still working on resolving this issue.
We use a stored procedure and C# .NET in the backend to bind the data that is returned to Flexmonster.
The stored procedure returns data in table format, which we convert to JSON. We are facing the "Out of Memory" issue while converting the data from table format to JSON.
Could you please suggest whether we can pass the data from the stored procedure, which is in table format, directly to Flexmonster?
Thanks,
Nilesh Mane
Hello,
Thank you for your reply.
Please note that Flexmonster does not support direct connection to the database from the client due to security reasons.
To avoid the "Out of Memory" error, we recommend implementing server-side filters to exclude data before loading it on the client. The filter parameter can be changed through an additionally implemented control. To switch between different data chunks, we suggest using the updateData
API call.
The filter can be passed through the query parameters:
filename: "https://your.server.com/data?year=2023"
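The server-side part of this idea can be sketched as follows. The endpoint shape, field names, and sample data are assumptions for illustration; in a C#/stored-procedure backend like yours, the equivalent filter would typically be a procedure parameter or a WHERE clause:

```javascript
// Illustrative only: parse the "year" query parameter from the request
// URL and filter the dataset before it is serialized, so a much smaller
// JSON reaches the client.
const records = [
  { Year: 2022, Amount: 10 },
  { Year: 2023, Amount: 20 },
  { Year: 2023, Amount: 30 },
];

function handleDataRequest(requestUrl) {
  const year = Number(new URL(requestUrl).searchParams.get("year"));
  const filtered = records.filter((record) => record.Year === year);
  return JSON.stringify(filtered);
}

console.log(handleDataRequest("https://your.server.com/data?year=2023"));
// [{"Year":2023,"Amount":20},{"Year":2023,"Amount":30}]
```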
Please let us know if creating a server-side filter is a viable solution for you.
Best Regards,
Maksym
Hi Maksym,
The provided solution does not fit our logic. We are facing the issue with the latest 6 months of data.
Could you please provide another solution?
Thank you,
Nilesh Mane
Hello,
Thank you for your feedback.
Kindly note that we have already provided all available solutions in our previous messages. You are welcome to check the quick summary of these options:
Please let us know if one of the suggested approaches helped you.
Best Regards,
Maksym
Hello, Nagaraj!
Hope you are doing well.
Our team is wondering if you were able to solve the issue with the "Out of Memory" error.
Looking forward to hearing your feedback.
Best Regards,
Maksym