New Pivot 2.7: optimized for your browser's memory
In just one release, our development team massively reduced the amount of browser memory our Pivot consumes. This is a tangible optimization for all clients: it extends our product's ability to load more data.
Why do we pay such close attention to this?
As a front-end solution, our Pivot is limited by the browser's memory. That is why we are constantly working on smart optimization of the memory our tool consumes.
If you have a large dataset, you care about this too.
This optimization not only increases the amount of data that can be pivoted but also noticeably speeds up pivoting. As a result, you get your report considerably faster.
So, what numbers are we talking about?
It's not feasible to name specific numbers or volumes that would act as hard limits. There is no amount of data that Flexmonster couldn't handle, but there is an amount that a browser couldn't.
Based on the latest tests of our new version, we can say that the memory performance of our grid improved by around 36% on average.
Let us show you more details about the tests we've done and the results we've got.
This time, the final measurements were done with a JSON data source in the Chrome browser by means of Chrome DevTools (it should be said that the measurement feature we used is a recent addition to Chrome DevTools).
For the tests, we used JSON data with the following characteristics:
- 100k unique records, 5 fields in each object (14.4 MB)
- 1M unique records, 5 fields in each object (126 MB)
In every test, we measured the total JS heap size, including live objects, garbage, and reserved space, after a forced garbage collection. For the measurements, we used a simple HTML page with an embedded Flexmonster Pivot Table that renders a report based on the JSON data.
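To give a concrete picture, here is a minimal sketch of such a test page, assuming Flexmonster's standard JavaScript embedding. The script path, container id, and record fields are illustrative, not the exact ones used in our tests; the `dataSource.data` property is the inline-JSON option from the table below.

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Flexmonster library; the path is illustrative -->
  <script src="flexmonster/flexmonster.js"></script>
</head>
<body>
  <div id="pivotContainer"></div>
  <script>
    // Hypothetical inline JSON: 5 fields per object, like the test datasets
    var data = [
      { "Category": "Bikes", "Country": "Australia", "Color": "red",  "Price": 150, "Quantity": 3 },
      { "Category": "Cars",  "Country": "Canada",    "Color": "blue", "Price": 500, "Quantity": 1 }
      // ...100k or 1M such records in the real tests
    ];

    new Flexmonster({
      container: "pivotContainer",
      toolbar: true,
      report: {
        dataSource: {
          data: data // inline JSON via the dataSource.data report property
        }
      }
    });
  </script>
</body>
</html>
```

With a page like this open, one way to reproduce such a measurement is to open the Memory panel in Chrome DevTools, force garbage collection, and check the reported JS heap size.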
Four tests were performed: two with inline JSON and two with JSON data loaded from a file. Here is what we've got:
Test dataset | JSON source | Flexmonster 2.6.15, MB of RAM | Flexmonster 2.7.0, MB of RAM | Memory consumption improvement
---|---|---|---|---
100k unique records JSON | Inline (via dataSource.data report property) | 121 | 84.9 | 30%
100k unique records JSON | From file (via dataSource.filename report property) | 112 | 77.3 | 31%
1M unique records JSON | Inline (via dataSource.data report property) | 972 | 560 | 42%
1M unique records JSON | From file (via dataSource.filename report property) | 900 | 522 | 42%
These figures clearly show that, compared to 2.6.15, the new 2.7 version needs notably less browser memory for a JSON dataset with 100k unique records.
We observe the same reduction for a report based on a JSON data source with 1M unique records.
In brief, memory performance improved by around 36%.
Another good example of how big the improvement is: previously, JSON data with 1M unique records and 11 fields in each object (256 MB) couldn't be processed properly. Now it works just fine. Want to give it a try?
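If you want to try it with your own data, loading a large JSON file boils down to the dataSource.filename report property shown in the table above. A minimal sketch, assuming the same embedding as before and a hypothetical file URL:

```html
<script>
  // Same embedding as above, but the JSON is loaded from a file
  // via the dataSource.filename report property (the URL is illustrative).
  new Flexmonster({
    container: "pivotContainer",
    toolbar: true,
    report: {
      dataSource: {
        filename: "https://example.com/data/1m-records.json"
      }
    }
  });
</script>
```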
How it affects your experience
- You pivot more data
- You do it a whole lot faster
- The icing on the cake: all operations run far more smoothly.
Sounds promising? Try it and see for yourself.
Download our latest version and check out the other new and outstanding features we've prepared for you this time.