Morpho Problems with Very Large Data Files
One of the REOT students tried submitting a very large (360,000-line) data file
and discovered that Morpho 'hangs' while trying to work with the file (i.e. when
going from one screen to the next in the Package wizard).
It looks like the problem is due to code that copies the data file to a string
(i.e. an in-memory data structure).
Updated by Dan Higgins about 21 years ago
Previously, the FileSystemDataStore class handled data files just like XML
metadata files. Data files were thus copied to a string for insertion of ids,
and an exception was used to detect that the data file was not a known type of
EML doc. This resulted in multimegabyte strings in memory when the data files
were very large, which could cause out-of-memory errors. Also, the streams in
the FileSystemDataStore were not buffered, making reading and writing slow.
To fix the problem, all streams were buffered. This gives roughly a factor-of-20
speed increase. In addition, new methods were added to handle data files
explicitly and avoid making 'in-memory' copies.
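The approach described above can be sketched as follows. This is a minimal, hypothetical example (the class and method names are illustrative, not Morpho's actual FileSystemDataStore API): the file is copied through buffered streams in fixed-size chunks, so memory use stays constant no matter how large the data file is.

```java
import java.io.*;

public class BufferedFileCopy {
    // Copy a data file using buffered streams, without ever loading the
    // whole file into a string. Memory use is bounded by the buffer size.
    // Hypothetical sketch of the technique; not Morpho's actual code.
    public static void copyFile(File src, File dest) throws IOException {
        try (InputStream in = new BufferedInputStream(new FileInputStream(src));
             OutputStream out = new BufferedOutputStream(new FileOutputStream(dest))) {
            byte[] buf = new byte[8192];   // fixed-size chunk, not the whole file
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Small demonstration: write a temp file, copy it, compare sizes.
        File src = File.createTempFile("data", ".txt");
        src.deleteOnExit();
        try (FileWriter w = new FileWriter(src)) {
            w.write("col1\tcol2\n1\t2\n3\t4\n");
        }
        File dest = File.createTempFile("copy", ".txt");
        dest.deleteOnExit();
        copyFile(src, dest);
        System.out.println(dest.length() == src.length() ? "ok" : "mismatch");
    }
}
```

Wrapping the raw file streams in BufferedInputStream/BufferedOutputStream is what gives the large speedup: each read/write goes to an in-memory buffer rather than issuing a system call per small read.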