is to minimize memory consumption to the point where the data fits entirely in RAM, thereby preventing swapping to the hard drive. A good way to achieve this is to think of your data as a 3D structure whose volume you must minimize. Your data has three dimensions: columns, rows, and average cell content length. Average cell content length is a loose measure, as it varies greatly with the type of data being handled (text, number, date, etc.).
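The volume idea can be turned into a back-of-the-envelope estimate. The sketch below is a hypothetical helper, not part of ABC Softwork; the two-bytes-per-character factor is an assumption based on UTF-16 strings, as used on the .NET platform.

```python
# Rough memory estimate for a tabular dataset:
#   rows x (sum of average cell content lengths per column) x bytes per character
# All names here are illustrative, not part of ABC Softwork.

def estimate_memory_bytes(rows: int, avg_cell_lengths: list[int],
                          bytes_per_char: int = 2) -> int:
    """Estimate the in-memory size of a table.

    avg_cell_lengths holds the average content length (in characters)
    of each column; bytes_per_char accounts for string encoding
    (assumed 2 for UTF-16 strings on the .NET platform).
    """
    return rows * sum(avg_cell_lengths) * bytes_per_char

# Example: 1,000,000 rows; three columns averaging 10, 40 and 8 characters.
size = estimate_memory_bytes(1_000_000, [10, 40, 8])
print(f"{size / 1024**2:.0f} MiB")  # roughly 111 MiB
```

Even a crude estimate like this makes it clear which of the three dimensions dominates, and therefore where trimming will pay off the most.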
Text fields are by far the biggest consumers of space. We strongly encourage you to look carefully at all your text columns when importing and to consider omitting them. In most cases this will improve performance and data throughput, especially if you are experiencing difficulties in that area.
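As a sketch of what omitting text columns at import time can look like, the snippet below filters columns while reading a CSV file. The CSV source, the column names, and the use of Python's standard `csv` module are all illustrative assumptions; they are not ABC Softwork's actual import mechanism.

```python
import csv
import io

# Hypothetical input: a text-heavy 'description' column we do not need.
csv_data = io.StringIO(
    "item_id,description,price\n"
    "1,A very long free-text description of the item,9.95\n"
    "2,Another lengthy description,4.50\n"
)

# Keep only the columns we actually need; the heavy text column
# is discarded per row and never accumulates in memory.
keep = {"item_id", "price"}
reader = csv.DictReader(csv_data)
rows = [{k: v for k, v in row.items() if k in keep} for row in reader]

print(rows[0])  # {'item_id': '1', 'price': '9.95'}
```

Filtering per row while streaming, rather than loading everything and dropping columns afterwards, keeps the peak memory footprint proportional to the columns you keep.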
At ABC Softwork, we are always looking for new ways to improve the performance of our software. We cannot do much about your data and its size (we leave that to you), but the way we handle your data can matter a great deal. During the two and a half years (and counting) the program has existed on the Microsoft .NET platform, we have made many performance-enhancing changes. One success story is the classification process, which received a speed boost and is now roughly 100-150 times faster. We will continue to hunt down the points in the software that hold back performance; finding and eliminating performance bottlenecks remains one of our focus areas in the development process.