
During August 2012 we quietly added a new crash reporting module to FileLocator Pro. Based on CrashRpt (an open source product hosted on Google Code) it’s one of the most useful quality control features we’ve ever added, although we hope it’s a ‘feature’ most of our users will never have cause to see.
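
Hooking up such a module takes only a little code. Here is a minimal sketch of what a typical CrashRpt installation looks like, using the library’s public crInstall/crUninstall API; the application name, version, and submission URL are placeholder values, not FileLocator Pro’s actual configuration.

```cpp
#include <string.h>
#include <tchar.h>
#include "CrashRpt.h"

int _tmain(int argc, _TCHAR* argv[])
{
    // Describe the application to CrashRpt; all values below are
    // placeholders for illustration.
    CR_INSTALL_INFO info;
    memset(&info, 0, sizeof(CR_INSTALL_INFO));
    info.cb = sizeof(CR_INSTALL_INFO);
    info.pszAppName    = _T("MyApp");                    // hypothetical name
    info.pszAppVersion = _T("1.0.0");                    // hypothetical version
    info.pszUrl        = _T("http://example.com/crash"); // hypothetical endpoint

    // Install the exception handlers; crInstall returns 0 on success.
    if (crInstall(&info) != 0)
        return 1;

    // ... normal application code runs here; an unhandled crash now
    // generates a report instead of terminating silently ...

    // Remove the handlers on clean shutdown.
    crUninstall();
    return 0;
}
```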

Since then you may have noticed an increase in memory management related upgrades to FileLocator Pro. We’ve had a slow trickle of crash reports over the last few months and, while most were odd, quick-to-fix edge cases, the majority have been related to memory management issues. It didn’t take long to see that FileLocator Pro had a problem on low-spec’d machines performing searches where the data was in the gigabyte range and involved millions of files. We found a few problems that were simply bugs in the code, e.g. algorithms that reserved more memory than was necessary, but some of the problems were more subtle, function-related issues. By default FileLocator Pro will record up to 10,000 lines of text per file, and each line can be up to around 20,000 characters. Rarely will a file have 10,000 hits or a line have 20,000 characters, so that’s not usually a problem when searching in a limited set of files.
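
To see why those limits matter at scale, here is a hypothetical illustration (not FileLocator Pro’s actual source) of the kind of over-reservation bug described above: sizing result buffers for the worst case up front versus treating the limits as caps and growing on demand.

```cpp
#include <string>
#include <vector>

// Limits quoted above: up to 10,000 recorded lines per file, each line
// up to around 20,000 characters.
static const size_t kMaxLinesPerFile = 10000;
static const size_t kMaxCharsPerLine = 20000;

// Wasteful pattern: allocate the worst case before a single hit is
// found. With UTF-16 text this is 10,000 * 20,000 * sizeof(wchar_t)
// bytes, roughly 400 MB, per file being searched.
std::vector<std::wstring> makeEagerResults()
{
    return std::vector<std::wstring>(kMaxLinesPerFile,
                                     std::wstring(kMaxCharsPerLine, L' '));
}

// Frugal pattern: memory grows only in proportion to the hits actually
// recorded, with the limits applied as caps rather than reservations.
void recordHit(std::vector<std::wstring>& results, const std::wstring& line)
{
    if (results.size() < kMaxLinesPerFile)
        results.push_back(line.substr(0, kMaxCharsPerLine));
}
```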

However, when searching over a very large data set with criteria that might not be very selective (e.g. searching for the letter ‘a’ – which was the actual search phrase in one of the crash reports we received) it can be a problem.
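
A back-of-the-envelope check shows how quickly an unselective search goes wrong; the figures below are made-up but plausible averages, not numbers taken from the crash reports.

```cpp
#include <cstdio>

int main()
{
    // Hypothetical unselective search: nearly every file matches and
    // nearly every line contains an 'a'.
    const double matchedFiles = 1e6;  // "millions of files"
    const double linesPerFile = 200;  // modest average of recorded hits
    const double charsPerLine = 100;  // modest average line length
    const double bytesPerChar = 2;    // UTF-16 storage

    double totalBytes = matchedFiles * linesPerFile * charsPerLine * bytesPerChar;
    std::printf("Recorded hit text alone: ~%.0f GB\n", totalBytes / 1e9);
    // Prints ~40 GB, far more than a low-spec machine can hold in RAM.
    return 0;
}
```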
