Ed Grochowski

Written 9-1-2017

Megatasking is the latest buzzword to describe usage scenarios for high-core-count CPUs.

It turns out I have been doing megatasking on my home PCs all along. The rationale is "why run jobs sequentially when I can run them all at once?" Just throw the jobs at the computer and let it grind through them. A powerful computer can take this onslaught in stride.

My most common megatasking scenarios are:

  • Building a hundred application programs simultaneously
  • Regression testing of a program against a hundred test cases
  • Batch processing of hundreds of files
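All three scenarios boil down to the same thing: launching many independent jobs at once. A minimal sketch of that pattern using xargs -P (the files and the gzip step are illustrative stand-ins for whatever batch job is being run):

```shell
#!/bin/sh
# Fire off a batch of independent jobs in parallel with xargs.
# The "job" here is just gzip -k (compress, keep the original);
# the file names are placeholders for a real workload.
mkdir -p batch && cd batch
for i in 1 2 3 4; do echo "data $i" > "file$i.txt"; done

# -P 16: run up to 16 jobs at once (one per hardware thread
#        on a 16-thread CPU); -n 1: one filename per invocation.
ls *.txt | xargs -P 16 -n 1 gzip -k
```

With -P set to the thread count, xargs keeps the machine saturated without launching more processes than it can schedule at once.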

It is very impressive to watch the load average on my 8-core/16-thread Haswell-E climb into the hundreds while the machine remains responsive. This is something that the PDP-11/70 (on which I first encountered UNIX in the early 1980s) could only dream of.

Often, memory usage is the limiting factor in how many jobs can be run simultaneously. 64GB of memory divided among a hundred jobs leaves only about 640MB for each job.
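That arithmetic is easy to automate. A sketch that sizes the job count from installed memory rather than core count (the 512 MiB per-job estimate is an assumed placeholder you would replace with a measured figure, and /proc/meminfo is Linux-specific):

```shell
#!/bin/sh
# Pick a parallel job count limited by memory, not cores.
# Read total memory in kB from /proc/meminfo (Linux only).
mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)

# Assumed per-job footprint: 512 MiB (a placeholder estimate).
per_job_kb=$((512 * 1024))

jobs=$((mem_kb / per_job_kb))
echo "memory allows $jobs simultaneous jobs"
```

The resulting number can be fed straight into something like xargs -P or make -j in place of the thread count.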

A word of caution: don't try megatasking on a shared computer. I once tested one of my programs with 500 test cases simultaneously on a shared server. This prompted an email from a sysadmin: "Don't do that!"