Bug #401

closed

Morpho Problems with Very Large Data Files

Added by Dan Higgins about 22 years ago. Updated about 22 years ago.

Status: Resolved
Priority: Normal
Assignee:
Category: morpho - general
Target version:
Start date: 01/23/2002
Due date:
% Done: 0%
Estimated time:
Bugzilla-Id: 401

Description

One of the REOT students tried submitting a very large (360,000-line) data file
and discovered that Morpho 'hangs' when working with the file (i.e. when going
from one screen to the next in the Package wizard).

It looks like the problem is due to code that copies the data file into a string
(i.e. an in-memory data structure).
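
As an illustration only (a minimal sketch, not the actual Morpho code), the
problematic pattern looks roughly like this in Java: the entire data file is
read into a single in-memory String, so memory use grows with the file size.

    import java.io.FileReader;
    import java.io.IOException;
    import java.io.Reader;

    public class WholeFileRead {
        // Loads the complete file contents into one in-memory String.
        // For a 360,000-line file this can be many megabytes and can
        // exhaust the JVM heap, making the wizard appear to hang.
        static String readAll(String path) throws IOException {
            StringBuilder sb = new StringBuilder();
            try (Reader in = new FileReader(path)) {
                char[] buf = new char[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    sb.append(buf, 0, n);
                }
            }
            return sb.toString();   // the whole file is now held in memory
        }
    }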

#1

Updated by Dan Higgins about 22 years ago

Previously, the FileSystemDataStore class handled data files just like XML
metadata files. Data files were thus copied into a string so that ids could be
inserted, and an exception was used to detect that the data file was not a
known type of EML document. For very large data files this resulted in
multi-megabyte strings in memory, which could cause out-of-memory errors. Also,
the streams in FileSystemDataStore were not buffered, making reading and
writing very slow.

To fix the problem, all streams were buffered, which gives roughly a 20x speed
increase. In addition, new methods were added to handle data files explicitly
and avoid making in-memory copies.
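
A minimal sketch of the buffered, streaming approach (class and method names
here are illustrative, not the actual FileSystemDataStore API): the file is
copied in fixed-size chunks, so memory use stays constant regardless of file
size, and buffering avoids per-byte I/O calls.

    import java.io.BufferedInputStream;
    import java.io.BufferedOutputStream;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;

    public class BufferedCopy {
        // Streams the data file in 8 KB chunks instead of building a String.
        static void copy(String src, String dest) throws IOException {
            try (InputStream in = new BufferedInputStream(new FileInputStream(src));
                 OutputStream out = new BufferedOutputStream(new FileOutputStream(dest))) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            }
        }
    }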

#2

Updated by Redmine Admin about 11 years ago

Original Bugzilla ID was 401
