International Tables for Crystallography Volume F: Crystallography of biological macromolecules. Edited by E. Arnold, D. M. Himmel and M. G. Rossmann

International Tables for Crystallography (2012). Vol. F, ch. 19.9, pp. 624-628

Chapter 19.9. Four-dimensional cryo-electron microscopy at quasi-atomic resolution: IMAGIC 4D

M. van Heel,a* R. Portugal,a A. Rohou,a C. Linnemayr,b C. Bebeacua,a R. Schmidt,c T. Granta and M. Schatzc

aFaculty of Natural Sciences, Division of Molecular Biosciences, Imperial College London, London SW7 2AZ, England, bDivision of Internal Medicine, Inflammation Research, University Hospital Zürich, Switzerland, and cImage Science Software GmbH, Gillweg 3, D-14193 Berlin, Germany
Correspondence e-mail:

The traditional tools of the structural biologist seeking to understand macromolecules and their complexes are X-ray crystallography and NMR spectroscopy. Single-particle cryo-electron microscopy (cryo-EM) has established itself as a new structural-biology technique over the last 15 years. Spectacular insights into the functioning of macromolecular complexes have been achieved especially from combining cryo-EM with the earlier approaches. The resolution levels achieved improved over the last decade from ∼10 Å to sometimes better than ∼4 Å, meaning that a de novo structure determination based on single-particle cryo-EM studies alone is now feasible. More challenging is the new perspective that cryo-EM brings: sorting heterogeneous populations of molecules into individual three-dimensional conformers resulting in sequences of related three-dimensional structures, or in short `4D cryo-EM'. Thanks to these developments, single-particle cryo-EM has become the technique of choice for shedding light on the functioning of many a complex biological system. The design of the software instrumentation for 4D cryo-EM is crucial. In this chapter we elaborate on organization issues for single-particle cryo-EM software, as exemplified by recent developments in the IMAGIC 4D software system.

19.9.1. Introduction


During the 1950s and 1960s the electron microscope became a routine research instrument. Many biological structures we now take for granted, such as the ribosome (Chemistry Nobel Prize 2009), were first discovered using the electron microscope. An avalanche of such discoveries emerged after the appropriate preparation techniques for biological specimens were developed, such as the contrasting of molecular complexes with heavy-metal salts (Brenner & Horne, 1959[link]). This dry `negative-stain' approach, however, can also create structural deformations of the sample, hampering its detailed analysis. The introduction of the vitreous water (`vitreous ice') specimen preparation technique by Jacques Dubochet and co-workers (Adrian et al., 1984[link]) represented a huge improvement in structural preservation for biological specimens in the harsh vacuum environment of the electron microscope. Notwithstanding the contributions made to the cryo-electron microscopy (cryo-EM) field by the data-processing methodology and software – the subject of this chapter – it was the vitreous ice specimen preparation approach that laid the foundation for the future success of single-particle cryo-EM. For all practical purposes, vitreous ice is water, and the first electron images of virus particles trapped in their native solution state were spectacular (Adrian et al., 1984[link]). The vitreous ice preparations can trap the molecular complexes in full action, in different conformational states, and may represent a window into the detailed functioning of the complexes in their natural environment.

Structural analysis by single-particle cryo-EM is therefore very different to that carried out by traditional X-ray crystallography. With the latter technique, the data collected are diffraction intensities, necessarily reflecting already averaged contributions from all unit cells in the crystal. Any information relating to the behaviour of individual molecules is thus lost during data collection. Single-particle cryo-EM, in contrast, is unique in that it preserves the identity of each individual molecular image registered in whatever structural state it happens to be. The marked difference with X-ray crystallography is that the information about each individual molecule or complex is explicitly available in digital form, albeit strongly deteriorated by noise. This also means that in cryo-EM all averaging takes place in the computer; the computational requirements of cryo-EM therefore exceed those of X-ray crystallography by orders of magnitude.

The intrinsic heterogeneity of cryo-EM samples was originally known as the curse of cryo-EM because it degrades the quality of the resulting average structures. Many structures simply would not refine to a reasonable resolution [for reviews of the single-particle methods see van Heel et al. (2000[link]) and Frank (2006[link])]. In recent years, however, heterogeneity has been seen more as a blessing, since, with the appropriate advanced data-processing technology, it is now possible to separate the different conformations of the complex into individual three-dimensional structures. Each of these three-dimensional structures may reveal a different state of the biological complex, as was first shown with a mixture of functional states of the E. coli 70S ribosome in complex with RF3 (Klaholz et al., 2004[link]). For recent reviews on the challenges posed by heterogeneous biological samples, see Leschziner & Nogales (2007[link]) and Spahn & Penczek (2009[link]). These fascinating new developments build upon over four decades of advances in software instrumentation, which started in the late 1960s (DeRosier & Klug, 1968[link]). We here review recent developments of the IMAGIC system aimed at optimally processing large heterogeneous `four-dimensional' (4D) cryo-EM data sets to high resolution.

19.9.2. The IMAGIC software system


The IMAGIC software system (van Heel & Keegstra, 1981[link]; van Heel et al., 1996[link]) is the result of over 30 years of continuous developments, including various major redesign phases. It was one of the first software packages in electron microscopy and many now-standard procedures were pioneered in IMAGIC, including surface rendering, multivariate statistical analysis (MSA), automatic classification, projection matching, angular reconstitution, automatic particle selection etc. The basic file format of IMAGIC consists of a header file, to hold image parameters and processing information, and a separate file holding all actual image data. This `stack' format can hold millions of individual molecular images and all IMAGIC programs are designed to loop over all the images in the stack (van Heel & Keegstra, 1981[link]). Users are not burdened with the task of formulating loops over the two-dimensional images needing the same treatment.
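The stack design described above can be sketched in a few lines. This is a toy illustration, not the real on-disk layout: a numpy memmap stands in for the image-data file, a structured array for the separate header file, and the file name, dtypes and header fields are all illustrative assumptions.

```python
import numpy as np

# Toy sketch of the IMAGIC "stack" idea: all image data live in one
# contiguous (here memory-mapped) file, a parallel per-image header
# structure holds the metadata, and programs loop over every image in
# the stack so the user never writes the loop.
n_images, h, w = 100, 32, 32
data = np.memmap('stack.img', dtype=np.float32, mode='w+',
                 shape=(n_images, h, w))
data[:] = np.random.default_rng(0).normal(size=(n_images, h, w))

# Per-image "header" records standing in for the separate header file.
headers = np.zeros(n_images, dtype=[('index', 'i4'), ('orig_mean', 'f4')])

# The system-level loop: every image gets the same treatment (here a
# simple normalization to zero mean and unit variance).
for i in range(n_images):
    img = data[i].astype(np.float64)
    headers[i] = (i, img.mean())
    data[i] = (img - img.mean()) / img.std()
```

The separation of bulk data from per-image headers is what lets the same loop scale from a handful of test images to millions of molecular images.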

The IMAGIC philosophy is to create specialized high-level programs for all more complicated tasks. High-level programs allow much better interactive guidance for the user than do image-processing scripts (which are operating-system dependent). A unique feature of all IMAGIC programs is that all user interaction is concentrated in a `user interaction block' (UIB). Whether interactively processing images or creating a large `batch job' for later processing, the user always communicates with the program's UIB with all interactive guidance available (each question comes with its own interactive help). The IMAGIC system is available for the most popular current operating systems including Linux, Windows XP/Vista/7 and Mac OS-X.

19.9.3. IMAGIC `4D' processing/data format


Recently, another major overhaul of the system was undertaken in order to better handle three-dimensional structural heterogeneity. The relevant IMAGIC programs can now loop over entire three-dimensional volumes (and not just two-dimensional images) allowing programs to tackle problems at a much higher level of complexity. About 25 years ago, a radical redesign of the system allowed small two-dimensional images to reside `in core' (the main memory of the computer) rather than on disk, facilitating all alignment algorithms. The new redesign allows whole three-dimensional volumes to remain in core and to be manipulated efficiently by single subroutine calls without I/O (input/output) overhead. The file format for `4D' processing has hardly needed any updating other than defining some new header parameters. The reason is that a stack of three-dimensional (3D) volumes (a `4D' data format) is still just a stack of two-dimensional (2D) images, the traditional IMAGIC data format (Fig.[link]). This `4D' upgrade, however, did require a radical overhaul of all programs, since all relevant programs needed an extra loop over a set of 3D volumes. At the same time, these 3D programs have all become much faster, more compact and more easily maintained because they rely on the new in-core 3D libraries and are now freed of all excess I/O calls.


Figure

IMAGIC 4D file format. The 4D file format is not very different from the original file format with one header record per 2D image in a stack (van Heel & Keegstra, 1981[link]). The main difference is that specific header locations now indicate the additional organization of a sequence of 2D images in the stack belonging together in a 3D volume. Substantial changes were required in the software, however, to make programs loop over the 3D volumes in one 4D IMAGIC file in the way that the programs looped over the 2D images in a stack.
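The point that a stack of 3D volumes is still just a stack of 2D images can be made concrete with a zero-copy reshape. In this sketch the grouping into volumes is pure bookkeeping (header fields in the real format, an array view here); all sizes are illustrative.

```python
import numpy as np

# A 4D file is still just a stack of 2D images: here 6 volumes of
# 16 slices each are stored as one flat stack of 96 images.
n_vol, depth, h, w = 6, 16, 32, 32
stack_2d = np.arange(n_vol * depth * h * w, dtype=np.float32).reshape(
    n_vol * depth, h, w)

# The grouping into 3D volumes is bookkeeping only: a reshape creates a
# view, no image data are moved or copied.
volumes = stack_2d.reshape(n_vol, depth, h, w)
print(np.shares_memory(volumes, stack_2d))  # True

# A "4D-aware" program simply adds an outer loop over the volumes:
volume_means = [volumes[v].mean() for v in range(n_vol)]
```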

19.9.4. Software parallelization


In cryo-EM all averaging and information extraction from all individual molecular images takes place in silico; the computational requirements are thus huge. Computers are never fast enough for the most demanding single-particle approaches. Over the last 15 years much emphasis has been placed on the parallelization of critical IMAGIC code (van Heel et al., 2000[link]), mainly using a message passing interface (MPI; Gropp et al., 1994[link]) to take advantage of modern `cluster' computing environments. Other software packages have since followed the same parallelization path (Smith & Carragher, 2008[link]). The IMAGIC software is implemented such that the same code will run on all machines from a single-CPU notebook computer up to large cluster systems with hundreds of CPUs.
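The underlying scatter/process/gather pattern can be sketched as follows, with Python's multiprocessing standing in for an MPI cluster; function names and the placeholder per-image operation are illustrative assumptions, not IMAGIC code. The same code path also runs serially (one worker), matching the "one code from notebook to cluster" design.

```python
import numpy as np
from multiprocessing import Pool

def process_chunk(chunk):
    # Per-image work goes here; we just remove each image's mean as a
    # placeholder for a real operation such as filtering or alignment.
    return chunk - chunk.mean(axis=(1, 2), keepdims=True)

def parallel_map(stack, n_workers):
    chunks = np.array_split(stack, n_workers)   # scatter the stack
    with Pool(n_workers) as pool:
        results = pool.map(process_chunk, chunks)
    return np.concatenate(results)              # gather the results

if __name__ == '__main__':
    stack = np.random.default_rng(1).normal(size=(64, 16, 16))
    out = parallel_map(stack, 4)
    print(out.shape)  # (64, 16, 16)
```

Splitting the stack into contiguous chunks keeps each worker's I/O sequential, which matters as much as the CPU parallelism for terabyte-sized data sets.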

Recently, a `GPU' library has been implemented in IMAGIC to exploit the properties of cheap graphics processors. However, for most `standard library' operations, like 2D fast Fourier transforms, which can be almost directly linked to existing programs, most time gained in GPU processing is still lost in transporting data to and from the limited memory of the GPUs. The question is whether the time invested in software development for exploiting specific properties of any specific parallel computing hardware will pay off in the longer term for the specific needs of cryo-EM. In the case of the GPUs, the amount of memory typically available per GPU core is rather low.

19.9.5. Full 2D (parallel) astigmatic contrast transfer function correction


The IMAGIC contrast transfer function (CTF) estimation and correction programs (van Heel et al., 2000[link]) have recently been upgraded to work fully in two dimensions, enabling the accurate detection of all CTF parameters including astigmatism. This has been carried out in order to accomplish two goals. Firstly, the programs are now capable of operating on an entire data set of charge-coupled device (CCD) images or patches of micrographs. This now allows the use of the parallel MSA programs (see below) to classify sets of amplitude spectra (and create class averages thereof) prior to the precise determination of the defocus and astigmatism parameters. This enables the simple and largely automatic determination of CTF parameters, and subsequent CTF correction via phase flipping of entire data-set stacks. Secondly, the ability to accurately detect even extreme levels of astigmatism enables the use of highly astigmatic images collected extremely close to focus, helping to push the achievable resolution. These programs have been used for the processing of a highly astigmatic data set of the Limulus polyphemus hemocyanin, resulting in a ∼4 Å reconstruction from only 15 000 raw molecular images, corresponding to 60 000 asymmetric units for this C2 point-group symmetry structure (Grant et al., 2011[link]).

19.9.6. Parallel automatic particle picking


Parallel processing is of increasing importance for collecting the large raw data sets required for 4D cryo-EM. If we are interested in a structure that represents 1% of all molecular complexes, we need to increase the size of the data set 100-fold in order to achieve the same resolution we had for a monodisperse data set. This implies that we need to be able to rapidly process data sets of the order of 1 Tbyte in size. One of the fastest automatic particle-selection procedures is still one of the oldest ones, based on the local variance (van Heel, 1982[link]) in the raw micrographs or CCD images. It is applied immediately after the full-data-set CTF correction discussed above. The choice of the frequency range used for discriminating the presence of a particle with respect to the background is critical for the calculation of the variance image. The sensitivity of the approach is approximately as good as that of particle searching using the cross-correlation function (CCF) (Saxton & Frank, 1977[link]). However, like all CCF alignments (see below), CCF particle picking requires templates and may tend to bias the particle selection towards the references (Boekema et al., 1986[link]; Stewart & Grigorieff, 2004[link]). Modulation-image particle picking is a variant of the original variance-image detection (van Heel, 1982[link]) which is based on the local standard deviation rather than the local variance and thus avoids the squaring of amplitudes. This new algorithm is often the method of choice for an unbiased automatic particle selection. Once a reliable first 3D structure has been calculated, the particle-picking program can then look for all possible views in all possible orientations in an extensive stack of input images (the massive calculations involved exploit MPI parallelization).
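The local-variance criterion can be sketched in a few lines: var = ⟨f²⟩_box − ⟨f⟩²_box over a running box roughly the particle size, with the modulation (standard-deviation) variant simply taking the square root. The box size and the synthetic micrograph below are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(micrograph, box):
    # Running-box moments give the local variance in two filter passes.
    mean = uniform_filter(micrograph, box)
    mean_sq = uniform_filter(micrograph**2, box)
    return np.maximum(mean_sq - mean**2, 0.0)

rng = np.random.default_rng(3)
micro = rng.normal(0.0, 0.1, size=(256, 256))          # flat background
micro[100:120, 60:80] += rng.normal(0.0, 1.0, (20, 20))  # one "particle"

var_img = local_variance(micro, box=21)
peak = np.unravel_index(np.argmax(var_img), var_img.shape)
print(peak)  # near (110, 70), the particle centre
```

Because no template enters the calculation, the variance (or standard-deviation) image cannot bias the selection towards any reference, in contrast to CCF picking.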

19.9.7. Parallel multi-reference alignments and reference bias


Large multi-reference alignment (MRA) schemes (van Heel & Stöffler-Meilicke, 1985[link]) have been accelerated by some orders of magnitude with the introduction of MPI parallelization on computer clusters (van Heel et al., 2000[link]), and MRA remains the largest source of CPU needs in single-particle cryo-EM. However, with the recent focus on discriminating between different functional states, the issue of reference bias resurfaces. With the renewed emphasis on avoiding reference bias (Boekema et al., 1986[link]; Stewart & Grigorieff, 2004[link]), `alignment by classification' of centred molecular images (Dube et al., 1993[link]) has taken on new importance. An especially useful variant is the rotational alignment of the data set with respect to the main symmetry-related eigenvector. This alignment does not bias the data set with respect to specific references, yet it concentrates much of the variance in the data set into the lower eigenimages (van Heel et al., 2009[link]) and thus yields better class averages while using the same total number of eigenvectors. Such new unbiased approaches are still in full development.

MRAs are now used as a tool in 4D analysis by generating MRA references from a set of 3D structures rather than just a single one (Klaholz et al., 2004[link]; Spahn & Penczek, 2009[link]). Although not different from the 2D versions of MRA, the new IMAGIC 4D programs make it simpler to explicitly track a `3D membership' indicating to which 3D volume the best alignment was achieved. The overall administration of the 4D procedures is greatly facilitated, especially during iterative refinement rounds (Fig.[link]).
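The competitive 3D-membership idea can be sketched at the 2D level: each candidate 3D structure contributes a group of reference projections, and every experimental image is assigned to the volume whose references it correlates with best. Pre-centred images and a zero-shift normalized correlation are simplifying assumptions in this sketch, not the full rotational/translational MRA search.

```python
import numpy as np

def normalize(a):
    a = a - a.mean()
    return a / np.linalg.norm(a)

def assign_membership(images, refs_per_volume):
    # For each image, score every reference group and record the index
    # of the best-matching volume (the "3D membership").
    members = []
    for img in images:
        img_n = normalize(img).ravel()
        scores = [max(float(img_n @ normalize(r).ravel()) for r in refs)
                  for refs in refs_per_volume]
        members.append(int(np.argmax(scores)))
    return members

rng = np.random.default_rng(4)
vol_a = rng.normal(size=(32, 32))     # stand-ins for re-projections
vol_b = rng.normal(size=(32, 32))
refs = [[vol_a], [vol_b]]
images = [vol_a + 0.3 * rng.normal(size=(32, 32)),
          vol_b + 0.3 * rng.normal(size=(32, 32))]
print(assign_membership(images, refs))  # [0, 1]
```

In the iterative scheme of the figure, each new membership list regroups the images, new 3D reconstructions are computed per group, and the loop repeats until the memberships stabilize.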


Figure

One starts with the (a priori) assignment of the images in a stack (original images or class averages) to a number of 3D volumes (named 3D-1 to 3D-3). This assignment can be of various origins: 2D MSA manifold separation; 3D MSA classification; or even a random-number generator, depending on the strategy pursued. Once different 3Ds have been generated, various `competitive' 3D membership assignment schemes can be applied to the 2D images in the stack. A classical one is multi-reference alignment with respect to re-projections of the different 3D volumes. Another approach is the multi-anchor set Euler angle assignment by `angular reconstitution' as described in the text. The resulting new 3Ds (named 3D-1′ to 3D-3′) take the place of the earlier 3Ds in this generic iterative refinement scheme. The iterations are stopped once convergence criteria like the Fourier shell correlation (FSC) resolution of the 3Ds reach stability (Harauz & van Heel, 1986[link]; van Heel & Schatz, 2005[link]).
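The Fourier shell correlation used as a convergence and resolution criterion (Harauz & van Heel, 1986; van Heel & Schatz, 2005) can be sketched directly: the two volumes are compared shell by shell in Fourier space. Binning by integer radius is a simplification of the shell definition.

```python
import numpy as np

def fsc(vol1, vol2):
    # Correlate the Fourier transforms of two volumes over spherical
    # shells of increasing spatial frequency.
    f1, f2 = np.fft.fftn(vol1), np.fft.fftn(vol2)
    n = vol1.shape[0]
    freq = np.fft.fftfreq(n) * n
    gz, gy, gx = np.meshgrid(freq, freq, freq, indexing='ij')
    radius = np.sqrt(gx**2 + gy**2 + gz**2).round().astype(int)
    curve = np.zeros(n // 2)
    for r in range(n // 2):
        mask = radius == r
        num = np.sum(f1[mask] * np.conj(f2[mask]))
        den = np.sqrt(np.sum(np.abs(f1[mask])**2) *
                      np.sum(np.abs(f2[mask])**2))
        curve[r] = np.real(num) / max(den, 1e-30)
    return curve

vol = np.random.default_rng(5).normal(size=(32, 32, 32))
print(np.allclose(fsc(vol, vol), 1.0))  # identical volumes: FSC = 1
```

In the refinement loop above, iterations stop once the frequency at which the FSC curve drops below the chosen threshold no longer improves between rounds.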

19.9.8. MSA and its parallelization


Multivariate statistical analysis (MSA) was originally introduced to electron microscopy to handle different views of a macromolecule (van Heel & Frank, 1981[link]; Borland & van Heel, 1990[link]; van Heel et al., 2000[link]; Frank, 2006[link]; for an extensive review, see van Heel et al., 2009[link]). MSA approaches are the techniques of choice to structure complex, heterogeneous data sets. Today, a wealth of information on MSA techniques is readily available on the Internet. The highly efficient IMAGIC MSA programs are specifically optimized for large data sets: the CPU time required for calculating the main eigenvectors is directly proportional to the total size of the data set. The naive use of standard MSA libraries in most EM software leads to eigenvector/eigenvalue calculations that are proportional to the square of the number of images in a data set, making it impossible to process more than a few thousand images (van Heel et al., 2009[link]).
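Why the iterative approach scales linearly can be shown with plain power iteration standing in for the more refined scheme used in practice: the leading eigenimage of the covariance XᵀX is refined by repeated products with the data matrix X, costing O(n_images × n_pixels) per pass, and the n_images × n_images matrix of the naive approach is never formed. All sizes and the synthetic data are illustrative.

```python
import numpy as np

def leading_eigenimage(X, n_iter=200, seed=0):
    # Power iteration on the covariance X^T X, applied implicitly as two
    # linear-cost matrix-vector products per pass.
    X = X - X.mean(axis=0)                  # centre the image cloud
    v = np.random.default_rng(seed).normal(size=X.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        v = X.T @ (X @ v)
        v /= np.linalg.norm(v)
    return v

rng = np.random.default_rng(6)
signal = rng.normal(size=64)                # one dominant variance mode
X = np.outer(rng.normal(size=500), signal) + 0.1 * rng.normal(size=(500, 64))

v = leading_eigenimage(X)
ref = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)[2][0]
print(abs(v @ ref) > 0.99)  # matches the direct SVD answer up to sign
```

Each pass reads the whole data set once, which is why the parallel MSA implementation discussed below must parallelize the I/O as well as the arithmetic.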

In the first 4D cryo-EM paper revealing different functional states of the same structure (Klaholz et al., 2004[link]), the MSA separation of the data set into the two states was achieved by a manually supported separation of the two conformers in eigenvector space, each representing a different functional state of the 70S ribosome in complex with RF3. Each functional state of a complex represents a `manifold' in factor space, and one seeks to separate the different manifolds in an unambiguous way, possibly by weighting the eigenvectors (van Heel, 1984[link]; Elad et al., 2008[link]). An alternative is to use the MSA programs to explicitly analyse large sets of 3D volumes in the way one traditionally analyses large sets of 2D images (see van Heel et al., 2009[link]).

The ever-increasing size of the data sets posed a new problem: the MSA processing of large data sets containing, say, a million images (totalling, say, ∼1 Tbyte) takes months to process on a single CPU. An MPI parallelization project was undertaken to speed up the eigenvector/eigenvalue calculations. This required parallelization of not only the pure computations but also the I/O operations, since the algorithm reads the large input data sets many times (∼30 times) in order to iteratively refine towards a stable answer, leading to I/O bottlenecks (van Heel et al., 2009[link]). Like all other programs mentioned in this chapter, the parallel MSA program is now part of the normal IMAGIC distribution.

19.9.9. Handling multiple 3D reconstructions in parallel


A new parallel 3D reconstruction program was created based on the earlier out-of-core program implementing the `exact filter' algorithm (Harauz & van Heel, 1986[link]). With all 3D reconstructions now organized in in-core subroutines, the new program is organized to loop over many 3D volumes and to generate whole series of 3D reconstructions using new options. It is an ideal tool for generating (large) 4D data sets as needed, for example, for the 4D MSA approaches, and it complements the 4D MRA (above) and the 4D angular reconstitution programs (Fig.[link]).

The new program, apart from the exact filter algorithm, now also features a novel 3D deconvolution algorithm based on the idea that one can predict the 3D point-spread function (PSF) for any 3D reconstruction geometry. The correct 3D reconstruction is then calculated by the 3D deconvolution of the unfiltered 3D reconstruction with the PSF. This 3D deconvolution algorithm scales proportionally to the number of projections used to generate a 3D reconstruction (the exact filter algorithm scales as the square of the number of projections) and yields results of the same overall quality as the classical exact filter algorithm (Harauz & van Heel, 1986[link]).
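The deconvolution idea reduces to a familiar operation: if the unfiltered reconstruction is (approximately) the true volume convolved with a predictable 3D point-spread function, the corrected volume follows from Fourier-space division. In the sketch below a Gaussian PSF, a Gaussian test object and the Wiener-style regularizer eps are all illustrative assumptions; the real PSF depends on the reconstruction geometry.

```python
import numpy as np

def deconvolve_3d(blurred, psf, eps=1e-6):
    # Regularized Fourier-space division by the PSF transform.
    F_b = np.fft.fftn(blurred)
    F_p = np.fft.fftn(np.fft.ifftshift(psf))   # PSF centred at the origin
    filt = np.conj(F_p) / (np.abs(F_p)**2 + eps)
    return np.real(np.fft.ifftn(F_b * filt))

n = 32
z, y, x = np.mgrid[:n, :n, :n] - n // 2
psf = np.exp(-(x**2 + y**2 + z**2) / (2 * 2.0**2))   # Gaussian stand-in
psf /= psf.sum()

true = np.exp(-(x**2 + (y - 3)**2 + z**2) / (2 * 3.0**2))
blurred = np.real(np.fft.ifftn(np.fft.fftn(true) *
                               np.fft.fftn(np.fft.ifftshift(psf))))

restored = deconvolve_3d(blurred, psf)
print(float(np.abs(restored - true).max()))  # small residual error
```

A single forward and inverse FFT per volume explains the favourable scaling quoted in the text: the cost of building the unfiltered reconstruction grows only linearly with the number of projections.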

19.9.10. Angular reconstitution 4D refinements


With the angular reconstitution approach (van Heel, 1987[link]; van Heel et al., 2000[link]) one typically assigns a Euler angle orientation to a 2D image by finding its best overall sinogram correlation peaks with respect to an `anchor set' of 2D projections from one single 3D structure. The IMAGIC `Euler' program, which performs the angular reconstitution orientational search, has now been extended to loop over multiple anchor sets and has thus been upgraded to a 4D level. As a consequence, the Euler program is now capable of competitively choosing the best anchor set among anchor sets generated from different 3D structures. The program can thus now also be used for refining towards multiple 3D structures, as was already the case with the MRA program in IMAGIC.
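The sinogram search at the heart of angular reconstitution can be sketched in a 2D analogue: an image's 1D line projection is matched against the rows of an "anchor" sinogram, and the best-correlating row gives the orientation. In the real method the comparison is between sinograms of 2D projections of a 3D structure; the angles and the test object below are illustrative.

```python
import numpy as np
from scipy.ndimage import rotate

def line_projection(img, angle_deg):
    # 1D line projection of the image at the given in-plane angle.
    return rotate(img, angle_deg, reshape=False, order=1).sum(axis=0)

def sinogram(img, angles_deg):
    return np.stack([line_projection(img, a) for a in angles_deg])

def best_angle(projection, sino, angles_deg):
    # Normalized correlation of the query projection with every row.
    p = projection - projection.mean()
    rows = sino - sino.mean(axis=1, keepdims=True)
    scores = rows @ p / (np.linalg.norm(rows, axis=1) * np.linalg.norm(p))
    return angles_deg[int(np.argmax(scores))]

rng = np.random.default_rng(7)
img = np.zeros((64, 64))
img[20:30, 15:50] = 1.0                      # an asymmetric test object
img += 0.05 * rng.normal(size=img.shape)

angles = np.arange(0, 180, 2)
sino = sinogram(img, angles)
query = line_projection(img, 34.0)           # "unknown" orientation
print(best_angle(query, sino, angles))       # 34
```

The 4D extension described above simply wraps this search in an outer loop over anchor sets from different 3D structures and keeps the best-scoring combination of anchor set and angle.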

19.9.11. Discussion


The potential benefits of 4D cryo-EM at atomic resolution are obvious. With some further developments, 4D cryo-EM may become the structural-biology method par excellence. The growing importance of software in cryo-EM has been the subject of various editorials and special issues of journals (Carragher & Smith, 1996[link]; Smith & Carragher, 2008[link]). The processing of 4D data sets, however, requires solving a large set of 3D structures simultaneously and that is necessarily a complex matter. The researcher will have to understand the basic principles of the approach and its pitfalls, and will thus have to go through a steep learning curve. The basic methodology (and the use of IMAGIC) is taught in courses such as the EMBO course for 3D cryo-EM, or the Brazil school for single-particle cryo-EM. The IMAGIC 4D system is aimed at accompanying the user at the appropriate level of complexity, without forcing the user to become an expert script programmer. Throughout the IMAGIC 4D system, the software will directly understand the 4D frame of thinking of the user.

There is a growing understanding in the field that continuity of development, maintenance and support of the software for cryo-EM is an important issue for the success of a project. One needs to safeguard the human investment in mastering the learning curve, and one needs continuity of support over the duration of a (long-term) project. Software continuity is at least as important as it is to continue maintaining electron microscopes for data collection in cryo-EM. An example of a successful community effort to consolidate software instrumentation in the field of X-ray crystallography is the maintenance of the CCP4 suite of programs (Collaborative Computational Project, Number 4, 1994[link]). One must realize, however, that this model cannot be transferred directly to the cryo-EM field because each of the handful of packages in use in single-particle cryo-EM is at least as complex as the CCP4 system, largely due to the necessity of parallelization.

One further remark about using scripts for running existing software: scripts tend to lead a life of their own and are often exchanged among users; combinations of scripts are sometimes even distributed as a `new' package. However, scripts are operating-system dependent, and any change in the original programs requires updating of all associated scripts in all different operating systems. The lack of `backward' compatibility with existing scripts actually impedes the development of the original software, since many users may refuse to update the core software to avoid having to modify their (borrowed) scripts. Creating wrappers (Python) to interface with original programs has a similar effect of possibly hampering software evolution.

Another compatibility issue is that of the formats used in cryo-EM. The original 2D stack format of IMAGIC (van Heel & Keegstra, 1981[link]) is one of the richest formats in use in cryo-EM and therefore it is possible to convert from all file formats in use in cryo-EM into IMAGIC without loss of information. The popular EM2EM conversion program for converting all cryo-EM formats into each other is an IMAGIC program that converts formats without any unnecessary loss of header data. The popular CCP4 volume data format (Collaborative Computational Project, Number 4, 1994[link]) has been adopted as a standard for the current EM databases (Smith & Carragher, 2008[link]). This format is a 3D format, with one header record for the full 3D volume. It can also hold a single 2D image, but it cannot hold a stack of images, each with its own header record, as is required for single-particle cryo-EM. For 4D cryo-EM applications a 4D format will be required for the structure databases. There is some urgency for the cryo-EM and X-ray communities to agree on a more elaborate common density format for our structure databases.

Atomic resolution structures (∼3 Å) have hitherto been elucidated mainly by X-ray crystallography, where the biological molecules are confined to the rigidity of a 3D crystal. Single-particle cryo-EM gives a direct window into the solution, revealing a plethora of different views, of different complexes, in different functional states, albeit at a resolution level still somewhat inferior to that achieved by X-ray crystallography and NMR spectroscopy. Single-particle cryo-EM techniques are now approaching resolution levels previously only achievable by X-ray crystallography. For a better understanding of biological processes, it is essential to see the sequence of conformational changes and interactions that molecules undergo during their functional cycle. A primary challenge in structural biology today is to generate `4D movies' of biological complexes at (quasi)-atomic resolution. The new IMAGIC 4D software has been tailored for such 4D analysis at atomic resolution.


We acknowledge recent financial support from the EU/NOE (grant No. NOE-PE0748), from the Dutch Ministry of Economic Affairs (Cyttron project No. BIBCR_PX0948) and from the BBSRC (grant No. BB/G015236/1). We thank all users of the IMAGIC system for constructive feedback and past contributors for their code and ideas. We also acknowledge the use of the Eclipse/SVN software suite for helping to keep the IMAGIC system coherent with contributors operating from three different continents.


Adrian, M., Dubochet, J., Lepault, J. & McDowall, A. W. (1984). Cryo-electron microscopy of viruses. Nature (London), 308, 32–36.
Boekema, E. J., Berden, J. A. & van Heel, M. G. (1986). Structure of mitochondrial F1-ATPase studied by electron microscopy and image processing. Biochim. Biophys. Acta, 851, 353–360.
Borland, L. & van Heel, M. (1990). Classification of image data in conjugate representation spaces. J. Opt. Soc. Am. A, 7, 601–610.
Brenner, S. & Horne, R. W. (1959). A negative staining method for high resolution electron microscopy of viruses. Biochim. Biophys. Acta, 34, 103–110.
Carragher, B. & Smith, P. R. (1996). Advances in computational image processing for microscopy. J. Struct. Biol. 116, 2–8.
Collaborative Computational Project, Number 4 (1994). The CCP4 suite: programs for protein crystallography. Acta Cryst. D50, 760–763.
DeRosier, D. J. & Klug, A. (1968). Reconstruction of three-dimensional structures from electron micrographs. Nature (London), 217, 130–134.
Dube, P., Tavares, P., Lurz, R. & van Heel, M. (1993). The portal protein of bacteriophage SPP1: a DNA pump with 13-fold symmetry. EMBO J. 12, 1303–1309.
Elad, N., Clare, D. K., Saibil, H. R. & Orlova, E. V. (2008). Detection and separation of heterogeneity in molecular complexes by statistical analysis of their two-dimensional projections. J. Struct. Biol. 162, 108–120.
Frank, J. (2006). Three-Dimensional Electron Microscopy of Macromolecular Assemblies. Oxford University Press.
Grant, T., van Duinen, G., Patwardhan, A. & van Heel, M. (2011). Exploiting astigmatism in single particle cryo electron microscopy. Submitted.
Gropp, W., Lusk, E. & Skjellum, A. (1994). Using MPI: Portable Parallel Programming with the Message-Passing Interface. Scientific and Engineering Computation Series. Cambridge, MA: MIT Press.
Harauz, G. & van Heel, M. (1986). Exact filters for general geometry three dimensional reconstruction. Optik, 73, 146–156.
Heel, M. van (1982). Detection of objects in quantum noise-limited images. Ultramicroscopy, 7, 331–342.
Heel, M. van (1984). Multivariate statistical classification of noisy images (randomly oriented biological macromolecules). Ultramicroscopy, 13, 165–183.
Heel, M. van (1987). Angular reconstitution: a posteriori assignment of projection directions for 3D reconstruction. Ultramicroscopy, 21, 111–124.
Heel, M. van & Frank, J. (1981). Use of multivariate statistics in analyzing the images of biological macromolecules. Ultramicroscopy, 6, 187–194.
Heel, M. van, Gowen, B., Matadeen, R., Orlova, E. V., Finn, R., Pape, T., Cohen, D., Stark, H., Schmidt, R., Schatz, M. & Patwardhan, A. (2000). Single-particle electron cryo-microscopy: towards atomic resolution. Q. Rev. Biophys. 33, 307–369.
Heel, M. van, Harauz, G., Orlova, E. V., Schmidt, R. & Schatz, M. (1996). A new generation of the IMAGIC image processing system. J. Struct. Biol. 116, 17–24.
Heel, M. van & Keegstra, W. (1981). IMAGIC: a fast flexible and friendly image analysis software system. Ultramicroscopy, 7, 113–130.
Heel, M. van, Portugal, R. & Schatz, M. (2009). Multivariate statistical analysis in single particle (cryo) electron microscopy. In An Electronic Text Book: Electron Microscopy in Life Science, 3D-EM Network of Excellence, edited by A. Verkley & E. V. Orlova. London: 3D-EM Network of Excellence.
Heel, M. van & Schatz, M. (2005). Fourier shell correlation threshold criteria. J. Struct. Biol. 151, 250–262.
Heel, M. van & Stöffler-Meilicke, M. (1985). Characteristic views of E. coli and B. stearothermophilus 30S ribosomal subunits in the electron microscope. EMBO J. 4, 2389–2395.
Klaholz, B. P., Myasnikov, A. G. & van Heel, M. (2004). Release factor 3 seen on the ribosome during termination of protein synthesis. Nature (London), 427, 862–865.
Leschziner, A. E. & Nogales, E. (2007). Visualizing flexibility at molecular resolution: analysis of heterogeneity in single-particle electron microscopy reconstructions. Annu. Rev. Biophys. Biomol. Struct. 36, 43–62.
Saxton, W. O. & Frank, J. (1977). Motif detection in quantum noise-limited electron micrographs by cross-correlation. Ultramicroscopy, 2, 219–227.
Smith, R. & Carragher, B. (2008). Software tools for molecular microscopy. J. Struct. Biol. 163, 224–228.
Spahn, C. M. T. & Penczek, P. A. (2009). Exploring conformational modes of macromolecular assemblies by multiparticle cryo-EM. Curr. Opin. Struct. Biol. 19, 1–9.
Stewart, A. & Grigorieff, N. (2004). Noise bias in the refinement of structures derived from single particles. Ultramicroscopy, 102, 67–84.
