-*- outline -*-

This file contains instructions for the installation of HDF5 on
Unix-like systems. Users of the Intel TFLOPS machine should see the
INSTALL.ascired file for instructions.

* Obtaining HDF5

The latest supported public release of HDF5 is available from
ftp://hdf.ncsa.uiuc.edu/pub/dist/HDF5 and is available in tar format,
either uncompressed or compressed with compress, gzip, or bzip2.

The HDF team also makes snapshots of the source code available on a
regular basis. These snapshots are unsupported (that is, the HDF team
will not release a bug fix on a particular snapshot; rather, any bug
fixes will be rolled into the next snapshot). Furthermore, the
snapshots have only been tested on a few machines and may not test
correctly for parallel applications. Snapshots, in a limited number of
formats, can be found at
ftp://hdf.ncsa.uiuc.edu/pub/outgoing/hdf5/snapshots.

* Warnings about compilers

OUTPUT FROM THE FOLLOWING COMPILERS SHOULD BE EXTREMELY SUSPECT WHEN
USED TO COMPILE THE HDF5 LIBRARY, ESPECIALLY IF OPTIMIZATIONS ARE
ENABLED. IN ALL CASES, HDF5 ATTEMPTS TO WORK AROUND THE COMPILER BUGS,
BUT THE HDF5 DEVELOPMENT TEAM MAKES NO GUARANTEES THAT THERE ARE NO
OTHER CODE GENERATION PROBLEMS.

** GNU (Intel platforms)

Versions before 2.8.1 have serious problems allocating registers when
functions contain operations on `long long' data types. Supplying the
`--disable-hsizet' switch to configure (documented below) will prevent
HDF5 from using `long long' data types in situations that are known
not to work, but it limits the HDF5 address space to 2GB.

** DEC

The V5.2-038 compiler (and possibly others) occasionally generates
incorrect code for memcpy() calls when optimizations are enabled,
resulting in unaligned access faults. HDF5 works around the problem by
casting the second argument to `char *'.

** SGI (Irix64 6.2)

The Mongoose 7.00 compiler has serious optimization bugs and should be
upgraded to MIPSpro 7.2.1.2m. Patches are available from SGI.

** Windows/NT

The Microsoft Win32 5.0 compiler is unable to cast unsigned long long
values to doubles. HDF5 works around this bug by first casting to
signed long long and then to double.

* Quick installation

For those that don't like to read ;-) the following steps can be used
to configure, build, test, and install the HDF5 library, header files,
and support programs.

    $ gunzip < hdf5-<version>.tar.gz | tar xf -
    $ cd hdf5-<version>
    $ ./configure --prefix=<install directory>
    $ make
    $ make check
    $ make install

*** Large (>2GB) vs. small (<2GB) file capability

In order to read or write files that could potentially be larger than
2GB it is necessary to use the non-ANSI `long long' data type on some
platforms. However, some compilers (e.g., GNU gcc versions before
2.8.1 on Intel platforms) are unable to produce correct machine code
for this data type. To disable use of the `long long' type on these
machines, say:

    $ ./configure --disable-hsizet

*** Parallel vs. serial library

The HDF5 library can be configured to use MPI and MPI-IO for
parallelism on a distributed multi-processor system. The easy way to
do this is to have a properly installed parallel compiler (e.g.,
MPICH's mpicc or IBM's mpcc) and supply that executable as the value
of the CC environment variable:

    $ CC=mpcc ./configure
    $ CC=/usr/local/mpi/bin/mpicc ./configure

If no such wrapper script is available then you must specify your
normal C compiler along with the distribution of MPI/MPI-IO which is
to be used (values other than `mpich' will be added at a later date):

    $ ./configure --enable-parallel=mpich

If the MPI/MPI-IO include files and/or libraries cannot be found by
the compiler then their directories must be given as arguments to
CPPFLAGS and/or LDFLAGS:

    $ CPPFLAGS=-I/usr/local/mpi/include \
      LDFLAGS=-L/usr/local/mpi/lib/LINUX/ch_p4 \
      ./configure --enable-parallel=mpich

If a parallel library is being built then configure attempts to
determine how to run a parallel application on one processor and on
many processors. If the compiler is mpicc and the user hasn't
specified values for RUNSERIAL and RUNPARALLEL then configure chooses
`mpirun' from the same directory as `mpicc':

    RUNSERIAL:   /usr/local/mpi/bin/mpirun -np 1
    RUNPARALLEL: /usr/local/mpi/bin/mpirun -np $${NPROCS:=2}

The `$${NPROCS:=2}' will be substituted with the value of the NPROCS
environment variable at the time `make check' is run (or the value 2).

** Building

The library, confidence tests, and programs can be built by saying
just

    $ make

Note that if you supplied some other make command via the MAKE
variable during the configuration step then that same command must be
used here. When using GNU make you can add `-j -l6' to the make
command to compile in parallel on SMP machines. Do not give a number
after the `-j' since GNU make will turn it off for recursive
invocations of make.

    $ make -j -l6

** Testing

HDF5 comes with various test suites, all of which can be run by saying

    $ make check

To run only the tests for the library change to the `test' directory
before issuing the command. Similarly, tests for the parallel aspects
of the library are in `testpar' and tests for the support programs are
in `tools'.

Temporary files will be deleted by each test when it completes, but
may continue to exist in an incomplete state if the test fails. To
prevent deletion of the files define the HDF5_NOCLEANUP environment
variable.
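For example, to run only the library tests and keep their temporary
files around for inspection, something like the following should work
from a Bourne-compatible shell (the value `yes' is only an
illustration; any value will do to define HDF5_NOCLEANUP):

    $ cd test
    $ HDF5_NOCLEANUP=yes make check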
** Installing

The HDF5 library, include files, and support programs can be installed
in a (semi-)public place by saying `make install'. The files are
installed under the directory specified with `--prefix=DIR' (or
`/usr/local' by default) in directories named `lib', `include', and
`bin'. The prefix directory must exist prior to `make install', but
its subdirectories are created automatically.

The library can be used without installing it by pointing the compiler
at the `src' directory for both include files and libraries. However,
the minimum which must be installed to make the library publicly
available is:

    The library:                    ./src/libhdf5.a
    The public header files:        ./src/H5*public.h
    The main header file:           ./src/hdf5.h
    The configuration information:  ./src/H5config.h

    The support programs that are useful are:
        ./tools/h5ls      (list file contents)
        ./tools/h5dump    (dump file contents)
        ./tools/h5repart  (repartition file families)
        ./tools/h5toh4    (hdf5 to hdf4 file converter)
        ./tools/h5debug   (low-level file debugging)
        ./tools/h5import  (a demo)

* Using the Library

Please see the User Manual in the doc/html directory.

Most programs will include <hdf5.h> and link with -lhdf5. Additional
libraries may also be necessary depending on whether support for
compression, etc., was compiled into the hdf5 library. A summary of
the hdf5 installation can be found in the libhdf5.settings file in the
same directory as the static and/or shared hdf5 libraries.
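For example, for a library installed under a hypothetical prefix of
/usr/local/hdf5 that was built with zlib compression support (see the
libhdf5.settings summary mentioned above), compiling and linking a
program of your own might look like the following; the source file
name and the need for -lz and -lm are illustrative assumptions that
depend on your site and platform:

    $ cc myprog.c -I/usr/local/hdf5/include \
         -L/usr/local/hdf5/lib -lhdf5 -lz -lm -o myprog

When using the library directly from the build tree instead, point the
-I and -L options at the `src' directory as described under Installing
above.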
* Support

Support is described in the README file.