ARSC HPC Users' Newsletter 360, April 20, 2007

More fun with GNU Make, Part IV of III

[ by: Kate Hedstrom ]

Two years ago I described how to take advantage of some of the new features in GNU make. I got as far as describing a build system for our ocean model in which we split the source code into multiple directories. You may review that series of articles in part I, part II, and part III.

At that time we were happy to let all the object and other temporary files be created in the top-level directory, from which make was launched. Time has passed, the model has grown, and it's time for Part IV.

The number of these temporary files is now such that we've been wanting to redirect them to a Build directory. The GNU make bible ("Managing Projects with GNU Make," by Robert Mecklenburg, O'Reilly, 2004) provides a couple of examples showing how to do this for C programs. One method requires the user to cd into the Build directory first, but we've opted to keep the Makefile in the top directory and have the user type "make" there.


As a reminder, our source files have an extension of .F and need to be preprocessed by cpp into intermediate .f90 files. Some contain F90 modules which produce .mod files as well as .o files when compiled. Most of our subdirectories will have the resulting .o files archived in a library which will be linked to the main program. We want the .f90, .o, .mod, and libxx.a files to all end up in the Build directory. The book's example uses a whole directory tree under Build, but we're lazy and we'll let them all just land in ./Build.
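For a single file, the pipeline just described looks roughly like this (the file name is made up for illustration; libNLM.a is the library used later in this article):

```
# preprocess:  Nonlinear/step3d.F -> Build/step3d.f90
cpp $(CPPFLAGS) Nonlinear/step3d.F > Build/step3d.f90

# compile inside Build so the .o and any .mod files land there
cd Build; $(FC) -c $(FFLAGS) step3d.f90

# archive the object into the directory's library
ar r Build/libNLM.a Build/step3d.o
```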

User-Defined Make Functions

From Mecklenburg:

"GNU make supports both built-in and user-defined functions. A function invocation looks much like a variable reference, but includes one or more parameters separated by commas. Most built-in functions expand to some value that is then assigned to a variable or passed to a subshell. A user-defined function is stored in a variable or macro and expects one or more parameters to be passed by the caller."

We make extensive use of user-defined functions, and start off by defining them in the top Makefile:

# ---------
#  Make functions for putting the temporary files in $(SCRATCH_DIR)
# ---------

# $(call source-dir-to-binary-dir, directory-list)
source-dir-to-binary-dir = $(addprefix $(SCRATCH_DIR)/, $(notdir $1))

# $(call source-to-object, source-file-list)
source-to-object = $(call source-dir-to-binary-dir,   \
                   $(subst .F,.o,$(filter %.F,$1)))

# $(call f90-source, source-file-list)
f90-source = $(call source-dir-to-binary-dir,     \
                   $(subst .F,.f90,$1))

# $(call make-library, library-name, source-file-list)
define make-library
   libraries += $(SCRATCH_DIR)/$1
   sources   += $2

   $(SCRATCH_DIR)/$1: $(call source-dir-to-binary-dir,    \
                      $(subst .F,.o,$2))
       $(AR) $(ARFLAGS) $$@ $$^
       $(RANLIB) $$@
endef

# $(call one-compile-rule, binary-file, f90-file, source-files)
define one-compile-rule
  $1: $2 $3
       cd $$(SCRATCH_DIR); $$(FC) -c $$(FFLAGS) $(notdir $2)

  $2: $3
       $$(CPP) $$(CPPFLAGS) $$< > $$@
endef


# $(compile-rules)
define compile-rules
  $(foreach f, $(local_src),       \
    $(call one-compile-rule,$(call source-to-object,$f), \
    $(call f90-source,$f),$f))
endef

Here are descriptions of these functions:

  1. We define a function to convert the path from the source directory to the Build directory, called source-dir-to-binary-dir. Note that the Build directory is called $(SCRATCH_DIR) here. All it does is strip off the leading directory with the built-in function "notdir", then paste on the Build directory.
  2. Next comes source-to-object, which calls the function above to return the object file name when given the source file name. Our version assumes that all sources have a .F extension.
  3. A similar function is f90-source, which returns the name of the intermediate source which is created by cpp from our .F file.
  4. The makefile fragment in each library source directory invokes make-library, which takes the library name and the list of sources as its arguments. The function adds its library to the global list of libraries and provides rules for building itself. The double dollar signs are to delay the variable substitution. Note that we call source-dir-to-binary-dir instead of source-to-object - this is a work-around for a make bug.
  5. The next, one-compile-rule, takes three arguments: the .o file name, the .f90 file name, and the .F file name. From these, it produces the make rules for running cpp and the compiler. It would be possible to compile from the top directory and put the .o file in Build with the appropriate arguments, but I don't know how to get the .mod file into Build short of a "mv" command. Likewise, if we compile in the top directory, we need to know the compiler option to tell it to look in Build for the .mod files it uses. Doing a "cd" to Build before compiling simplifies these steps.
  6. The last, compile-rules, is given a list of sources, then calls one-compile-rule once per source file.
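To make the first three functions concrete, here is what they expand to for one possible setting of $(SCRATCH_DIR) and a made-up file name:

```
SCRATCH_DIR := Build

# $(call source-dir-to-binary-dir, Nonlinear/step3d.F)
#   -> Build/step3d.F
# $(call source-to-object, Nonlinear/step3d.F Utility/ran1.F)
#   -> Build/step3d.o Build/ran1.o
# $(call f90-source, Nonlinear/step3d.F)
#   -> Build/step3d.f90
```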

Other Makefile Changes

Don't forget to set the $(SCRATCH_DIR) variable, for instance (the exact value is up to you):

  SCRATCH_DIR ?= ./Build

add its contents to the clean list:

  clean_list += $(SCRATCH_DIR)/*

and add its files to vpath:

vpath %.f90 $(SCRATCH_DIR)
vpath %.o $(SCRATCH_DIR)

One annoyance is that the compile phase of one-compile-rule didn't seem to be working. I was forced to keep the pattern rule for .o files:

%.o: %.f90
<TAB>  cd $(SCRATCH_DIR); $(FC) -c $(FFLAGS) $(notdir $<)
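If you preferred to compile from the top directory instead, you would need the compiler's module-path flag, which is unfortunately vendor-specific. A hedged sketch (check your compiler's manual before trusting these):

```
# gfortran: -J dir      writes .mod files to dir
# ifort:    -module dir
# so a top-directory compile rule might look something like:
#   $(FC) -c $(FFLAGS) -module $(SCRATCH_DIR) -o $(SCRATCH_DIR)/$*.o $<
```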

Makefile Fragments

The library directory files are now shorter:

local_lib  := libNLM.a
local_src  := $(wildcard $(subdirectory)/*.F)

$(eval $(call make-library,$(local_lib),$(local_src)))

$(eval $(compile-rules))

The eval causes make to parse the expanded function as makefile syntax, so the generated rules actually take effect. The main directory is a little trickier:

local_src  := $(wildcard $(subdirectory)/*.F)
#local_objs := $(call source-to-object,$(local_src))
local_objs := $(subst .F,.o,$(local_src))
local_objs := $(addprefix $(SCRATCH_DIR)/, $(notdir $(local_objs)))

sources    += $(local_src)

$(BIN): $(libraries) $(local_objs)
        $(LD) $(FFLAGS) $(LDFLAGS) $(local_objs) -o $@ $(libraries) $(LIBS)

$(eval $(compile-rules))                          

You'll note the commented-out version of local_objs, above. I thought this version would work, but it doesn't, producing instead (as indicated by "make -p") an empty value for local_objs.

Final Warnings

I'm going ahead with this change for my code, though it may cause trouble for some:

  1. We're a little closer to the GNU make bugs here, and probably need a newer version of GNU make than before. I've been testing it with version 3.80.
  2. This Makefile takes more memory than the old one and led to a malloc failure from make on my Mac laptop. The older Makefile works just fine on that system.
  3. Makefile dependencies get just a little trickier with every change we make. Note that F90 has potentially both include and module use dependencies. The book's example uses the C compiler to produce its own dependencies for each source file into a corresponding .d file to be included by make. Our Fortran compilers are not so smart. For these hairy compiles, it's critical to have accurate dependency information unless we're willing to "make clean" between compiles. I've got an old Perl script that seems to still be doing the job, but I cross my fingers every time I have to mess with it.

ARSC Faculty Boot Camp 2007, Informational Meeting Next Week

For the eighth glorious year, ARSC is offering Faculty Camp, now called "Faculty Boot Camp" (but don't let that scare you!), dedicated to High Performance Computing in Alaska, computational science, training, and the building of collaborations between faculty and ARSC.

Don't miss the:

"Informational Meeting and Open House"

  • Tuesday, April 24, 1-2pm
  • West Ridge Research Building, rm: 010
  • (refreshments will be served... no RSVP necessary...)

ARSC Faculty Boot Camp 2007 will take place July 30 - Aug. 17. This is an excellent opportunity to learn how to manage extremely large volumes of data, run simulations for longer durations and visualize data in 3-D, in addition to receiving one-on-one attention to help use ARSC resources effectively.

Seminars, workshops and hands-on learning experiences will include, but are not limited to, Unix basics, batch processing, scripting, parallel programming concepts and techniques, and basic familiarization with visualization packages. Presented by ARSC staff, UAF/ARSC joint faculty and current users, Faculty Boot Camp 2007 provides participants with assistance and expertise as they focus on independent work and self-guided study for individual projects.

If you're interested, please submit a short, 250-word description of the skills you want to develop and the project you intend to pursue. This will assist us in organizing events and speakers to address the specific needs of the attendees. Submit your abstract to Tom Logan ( ) in ASCII or pdf format by May 16.


Send questions on next week's Open House or Faculty Boot Camp, in general, to Tom Logan ( ).

Visualization with IDL

[ by: Sergei Maurits ]

ARSC recently offered a two-part course for users on Interactive Data Language, or IDL ( ). About twenty participants took this hands-on training in the ARSC computer classroom, recently re-equipped with Linux workstations. For those who were unable to attend, here's a brief description of IDL; we hope to see you at future training opportunities. Also, as described below, self-guided tutorials and other course materials are available.

IDL is a popular data visualization package. The primary IDL interface is a C-like language which supports multiple data types, do-loops, begin-end modules, various I/O operations, subroutines, and many other standard features of high-level programming languages. A well-developed mathematical library and 100+ graphical procedures for data visualization and analysis complete the picture. An on-line searchable guide, "idlhelp", and an extended demo suite, "idldemo", provide convenient reference materials for the IDL user. "idldemo" groups demos by IDL function and by the scientific or engineering field of the application. The source code of all demos is accessible for further reference and study.

Another strength of IDL is its tools for image manipulation, which are included in the demo suite. An interesting IDL feature is its several dozen predefined color palettes, professionally enhanced for various visualization tasks.

A relatively new addition is the "Interactive Tools" or "iTools" suite. iTools provide a fully interactive GUI for development of various graphical data representations, ranging from simple XY-plots to complex volume rendering applications. The "iMap" tool allows for placing data into geographical map context and combining it with geographic and geophysical information.

The recent training events, "Introduction to IDL" and "iTools," covered such topics as establishing the IDL user environment, using the IDL programming tool "idlde," IDL I/O methods, and "iTools" basics. The permanent location of all materials for these courses is the directory /projects/CLASSES/IDL in ARSC's on-line storage system. The directory includes README files describing the directory contents (README.content); detailed instructions on how to perform the tutorial exercises (README.howto); heavily commented samples of IDL code for data visualization; and an "iTools" tutorial with step-by-step PDF instructions and data files. The content is available for independent training and/or as reference material for all interested users.

Quick-Tip Q & A

A: [[ If I have an array of double-precision floating point values in C
   [[ and I want to write each value out to a file in an ASCII format 
   [[ so that when they are read back in I have no loss of precision, 
   [[ how would I do it?  I'd like the solution to avoid writing 
   [[ characters that don't affect the precision, like trailing 0's.  
   [[ (This question might be of interest to Fortran users as well).

   # No responses yet.  Feel free to send an answer (or an article!) 
   # whenever inspiration strikes...

Q: When I accidentally "cat" a binary file and my terminal gets messed
   up, how can I make it usable again?  And why doesn't "stty sane"
   seem to help?

   # Thanks to Dale Clark for this question.
[[ Answers, Questions, and Tips Graciously Accepted ]]

Current Editors:
Ed Kornkven ARSC HPC Specialist ph: 907-450-8669
Kate Hedstrom ARSC Oceanographic Specialist ph: 907-450-8678
Arctic Region Supercomputing Center
University of Alaska Fairbanks
PO Box 756020
Fairbanks AK 99775-6020
E-mail Subscriptions: Archives:
    Back issues of the ASCII e-mail edition of the ARSC T3D/T3E/HPC Users' Newsletter are available by request. Please contact the editors.