Computing environments

Dynamo can be run on GPU systems, multicore machines or CPU clusters. We advise using a workstation for manipulating tomograms, and a dedicated computing system (GPUs or clusters) for alignment projects, where the most intensive number crunching occurs.


Computing systems

Our advice is to use GPU machines. A single server controlling several (up to eight) GPUs will typically yield the performance of several hundred single CPU cores. The price of such a GPU server is on the order of 10K CHF.

By multicore machine we refer to a desktop (or a remote server) with several cores that allows login for an interactive session. Most modern desktops have at least 4 cores, and dedicated computing servers with up to 64 cores are becoming increasingly common. Such a machine may be useful for projects involving fewer than 1000 subtomograms (a coarse estimate, highly dependent on the number of cores, the project and the patience of the user); for more demanding projects you should turn to GPU computing or parallel clusters.

CPU clusters normally require submitting a computing job to a queue. The performance of Dynamo on a CPU cluster is variable and depends strongly on the availability of a fast file system.


RAM requirements

Number crunching

During the alignment stage, cropped particles are stored on disk and read into memory one at a time by each process. After reading one particle (a time-expensive operation), the process keeps in memory at any given time a copy of a single subtomogram, a template, a rotated version of the template, Fourier-transformed versions of all of them, and four other auxiliary volumes. Even for particles with a sidelength of 256 pixels, that amounts to less than 2 GB.
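
A minimal back-of-the-envelope sketch of that estimate, assuming single-precision (4-byte) voxels and complex Fourier transforms at 8 bytes per voxel; the exact count and type of the auxiliary buffers is an assumption, not taken from Dynamo's internals:

```python
# Rough per-process memory estimate for the alignment stage.
# Assumptions (not taken from Dynamo's internals): single-precision voxels
# (4 bytes), complex Fourier transforms (8 bytes per voxel), and four real
# auxiliary volumes.
sidelength = 256                 # particle sidelength in pixels
voxels = sidelength ** 3         # voxels per volume

real_volumes = 3                 # subtomogram, template, rotated template
fourier_volumes = 3              # their Fourier transforms (complex)
auxiliary_volumes = 4            # other working buffers (assumed real)

total_bytes = voxels * (real_volumes * 4 + fourier_volumes * 8 + auxiliary_volumes * 4)
print(f"~{total_bytes / 1024**3:.2f} GiB per process")  # ~0.81 GiB, well under 2 GB
```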

This boils down to a rather low RAM demand when working with GPUs. Each GPU is driven by a separate process, so you will have at most eight processes running concurrently in the same project. In a multicore machine with, say, 32 cores, you will probably need at least 64 GB of RAM.
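
The same estimate scaled by the number of concurrent processes gives a rough per-machine figure, treating the ~2 GB above as the per-process upper bound:

```python
# Rough total-RAM figure: one alignment process per GPU or per CPU core,
# using the ~2 GB per-process upper bound discussed above.
per_process_gb = 2

for processes in (8, 32):        # e.g. an 8-GPU server vs. a 32-core machine
    print(f"{processes:2d} processes -> ~{processes * per_process_gb} GB of RAM")
# prints ~16 GB for 8 GPU processes and ~64 GB for 32 CPU processes
```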

Visualization

For the stage of inspecting and annotating tomograms, it is important to have access to a powerful workstation, ideally sitting under your desk: remote access is usually a poor companion for visualization. We advise at least 64 GB of RAM for such a workstation, so that you can keep several full tomograms in memory.
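
To put that number in perspective, here is a rough estimate of the memory footprint of a single tomogram; the dimensions used are purely illustrative assumptions:

```python
# Rough memory footprint of one tomogram held in memory for visualization.
# The dimensions below are an illustrative assumption, not a Dynamo requirement.
nx, ny, nz = 2048, 2048, 512     # hypothetical tomogram dimensions in voxels
bytes_per_voxel = 4              # single precision

gib = nx * ny * nz * bytes_per_voxel / 1024**3
print(f"~{gib:.0f} GiB per tomogram")  # ~8 GiB; several such volumes fit in 64 GB
```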