Qtegra LabBook Extraction Tool

Choose LabBook File: (*.imexp, max. size 2000 MB)

Full Extraction?: (generally not required, but use this to extract all files from the LabBook)

After clicking Submit, please be patient. The file has to be uploaded to our compute server in Oxford, processed, and then downloaded back to your location.

What?

This tool extracts iCAP data from Qtegra LabBook files, even if a file is corrupted.

The Qtegra LabBook is a compressed binary file that contains a virtual file system (Solid FS). When the file is corrupted (typically by a crash of Qtegra or the operating system), it becomes truncated and Qtegra will refuse to open it. If this happens to you, this online tool may be able to extract some of the MS data from the file.

The first part of the extraction recovers any readable content that remains in the file. If the file is truncated, we can generally read up to the point where data stopped being written; however, because the file structure is complex and the data are compressed in chunks, some data will most likely have been lost (e.g. the last half hour of a six-hour data file).

The second part of the extraction identifies the mass spectrometer data files ("MainRuns.bin") and decodes their binary data into an easy-to-read CSV format. This format is the same as that exported from Qtegra, but without the extensive header information, as parsing the Qtegra XML files to extract it is simply not practical.

If you perform a full extraction, all files from the LabBook will be extracted and downloaded. Many of these files are in native formats that only Qtegra can understand, but there may still be some value in accessing them; for example, you may wish to inspect data saved by other plugins (e.g. laser or autosampler) or by other instruments (e.g. OES).

The regular extraction simply identifies the "MainRuns.bin" files and extracts them. This is what most people are after.

Please note that the LabBook can contain a lot of data. Furthermore, it contains "versioned data": it keeps track of all data that were collected, even if later data overwrote earlier data. It is hoped that the extracted CSV files are numbered in the order they were collected, but I am sorry to say this cannot be guaranteed.

Finally, to use the CSV files in your data reduction software it might be necessary to prepend a suitable header to each file. To do this, take a look at a valid CSV file (ideally one measured using the same method as the file you are extracting) exported from your Qtegra software. Copy the header section and paste it into your recovered CSV file, then edit the header to set the sample name and time stamp. This new file should be identical to what Qtegra would normally export, and can be loaded and processed in the usual ways.
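The header-splicing step above can be sketched in a few lines of Python. This is a minimal illustration, not part of the tool itself: the file names are hypothetical, and the marker line that begins the data table (here assumed to start with "Time") may differ between Qtegra versions and export settings, so check it against your own exported files.

```python
import pathlib

def prepend_header(reference_csv, recovered_csv, output_csv, data_marker="Time"):
    """Copy the header lines of a Qtegra-exported CSV onto a recovered CSV.

    Everything in reference_csv before the first line starting with
    data_marker is treated as the header block. The marker is an
    assumption; adjust it to match your own exports.
    """
    header = []
    for line in pathlib.Path(reference_csv).read_text().splitlines(keepends=True):
        if line.startswith(data_marker):
            break  # reached the data table; stop copying header lines
        header.append(line)
    body = pathlib.Path(recovered_csv).read_text()
    pathlib.Path(output_csv).write_text("".join(header) + body)

# Hypothetical usage: splice the header from a good export onto a recovered file.
# prepend_header("good_export.csv", "recovered_001.csv", "spliced.csv")
```

Remember to then edit the sample name and time stamp in the spliced header by hand, as described above.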

While this tool appears to work with all of the test files I have used, I cannot guarantee that it will work on all LabBook files. If your file does not work, please email me for assistance.

Future improvements to this tool could include additional parsing of LabBook files with the intention of extracting sample names, acquisition parameters (e.g. dwell times), and validation of the acquisition order of the data files. If you find this tool useful and want us to add these features, please email us and let us know.

Where?

Currently this tool is hosted on our compute server, located in Oxford, UK. We may move the tool to another host to best suit the load. To ensure you can find the tool again in the future we recommend bookmarking the following landing page:

https://norsci.com/?p=software-qtx

How?

In case you are curious, the original tool was written as a command line application in C# using .NET v4.6.1. To make it accessible from the web, the tool has been set up to run under Wine on Ubuntu Linux. It would have worked under Mono, but the Solid FS library calls native code that is simply unavailable in the current Mono stack. The good news is that Wine handles this easily, and it's impressive how good the cross-platform support is these days for .NET software, especially for "Framework" projects. In my experience the "Core" code is harder to get running under Linux, despite it being designed with that in mind, but as always YMMV.

When you submit your LabBook to this online tool, the file is uploaded, processed by the tool, and the output compressed into a ZIP archive for download. The entire operation is performed immediately, so for large LabBooks (>200 MB) it can take some time. The maximum LabBook file size is 2000 MB and the maximum processing time is 10 minutes. Please contact me for assistance if your file is too large for this online service.

Why?

I work with many LA-ICP-MS laboratories around the world, and unfortunately I have encountered many users complaining about corrupted LabBook files. While Thermo has been generally helpful, they appear to recover files using a manual process, and many users tell me it can take several months to receive any recovered data.

To help users in the LA-ICP-MS community I reverse engineered the LabBook file format and wrote this tool to extract MS data from the file.

Who?

We are Norris Scientific, an independent company providing software and hardware solutions in the field of microanalysis, with an emphasis on LA-ICP-MS.

If this tool has been of assistance, please consider sending me an email, or come and say g'day when you see me at a conference.

Ashley Norris, Norris Scientific - © 2022

Privacy Statement

Files uploaded to this site are processed and downloaded immediately; there is no technical reason for them to be stored on our server. However, we do retain your file on our server for a period of two weeks for the sole purpose of troubleshooting issues with the tool. Your data will never be examined beyond a casual inspection to determine "did the extraction work", and no data will ever be shared. Most importantly, no data will ever be retained beyond this two-week period; files are deleted automatically. If you require any additional certainty with regard to data privacy, please email us to make arrangements.

When you use this site we make a record in our logs of your IP address and the time when you accessed the tool. This information is used for the sole purpose of assessing demand for the tool and load on the server. By understanding load on the server we are able to plan ahead and take necessary steps to ensure that the tool can be hosted effectively, keeping it free of charge and available for all users into the future.