author     Bill Wendling <wendling@ncsa.uiuc.edu>    2000-12-04 22:22:11 (GMT)
committer  Bill Wendling <wendling@ncsa.uiuc.edu>    2000-12-04 22:22:11 (GMT)
commit     2876134885a79ed5e692ee7c2ed15ecc203b3c29 (patch)
tree       7b2bfe304921b973c6540b5303312983d0a8acbc /INSTALL
parent     e552b96ce3289133b462c1cb2be92ca667b2fb12 (diff)
[svn-r3075] Purpose:
Reformatting
Diffstat (limited to 'INSTALL')
-rw-r--r--   INSTALL   444
1 file changed, 219 insertions, 225 deletions
@@ -51,41 +51,40 @@
1. Obtaining HDF5
     The latest supported public release of HDF5 is available from
     ftp://hdf.ncsa.uiuc.edu/pub/dist/HDF5.  For Unix platforms, it is
     available in tar format uncompressed or compressed with compress,
     gzip, or bzip2.  For Microsoft Windows, it is in ZIP format.

     The HDF team also makes snapshots of the source code available on
     a regular basis.  These snapshots are unsupported (that is, the
     HDF team will not release a bug fix on a particular snapshot;
     rather, any bug fixes will be rolled into the next snapshot).
     Furthermore, the snapshots have only been tested on a few
     machines and may not test correctly for parallel applications.
     Snapshots can be found at
     ftp://hdf.ncsa.uiuc.edu/pub/outgoing/hdf5/snapshots in a limited
     number of formats.

2. Warnings about compilers
     OUTPUT FROM THE FOLLOWING COMPILERS SHOULD BE EXTREMELY SUSPECT
     WHEN USED TO COMPILE THE HDF5 LIBRARY, ESPECIALLY IF
     OPTIMIZATIONS ARE ENABLED.  IN ALL CASES, HDF5 ATTEMPTS TO WORK
     AROUND THE COMPILER BUGS, BUT THE HDF5 DEVELOPMENT TEAM MAKES NO
     GUARANTEE THAT THERE ARE NO OTHER CODE GENERATION PROBLEMS.

2.1. GNU (Intel platforms)
     Versions before 2.8.1 have serious problems allocating registers
     when functions contain operations on `long long' data types.
     Supplying the `--disable-hsizet' switch to configure (documented
     below) will prevent hdf5 from using `long long' data types in
     situations that are known not to work, but it limits the hdf5
     address space to 2GB.

2.2. DEC
     The V5.2-038 compiler (and possibly others) occasionally
     generates incorrect code for memcpy() calls when optimizations
     are enabled, resulting in unaligned access faults.  HDF5 works
     around the problem by casting the second argument to `char *'.

2.3. SGI (Irix64 6.2)
     The Mongoose 7.00 compiler has serious optimization bugs and

@@ -93,16 +92,15 @@
     from SGI.

2.4. Windows/NT
     The Microsoft Win32 5.0 compiler is unable to cast unsigned long
     long values to doubles.  HDF5 works around this bug by first
     casting to signed long long and then to double.

3. Quick installation
     For those that don't like to read ;-) the following steps can be
     used to configure, build, test, and install the HDF5 library,
     header files, and support programs.

         $ gunzip < hdf5-1.2.0.tar.gz | tar xf -
         $ cd hdf5-1.2.0
@@ -114,122 +112,126 @@
     should see the INSTALL_TFLOPS for more instructions.

3.2. Windows
     Users of Microsoft Windows should see INSTALL_Windows.txt for
     detailed instructions.

3.3. Certain Virtual File Layers (VFL)
     Users who want to install HDF5 with a special Virtual File Layer
     (VFL) should read the INSTALL_VFL file.  The SRB and Globus-GASS
     drivers are documented there.

4. HDF5 dependencies

4.1. Zlib
     The HDF5 library has a predefined compression filter that uses
     the "deflate" method for chunked datasets.  If zlib-1.1.2 or
     later is found then HDF5 will use it, otherwise HDF5's predefined
     compression method will degenerate to a no-op (the compression
     filter will succeed but the data will not be compressed).
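     To illustrate what that filter looks like from an application,
     the fragment below creates a chunked dataset with deflate
     compression.  This is a minimal sketch, assuming the 1.2/1.4-era
     C API (the five-argument H5Dcreate); the file and dataset names
     are made up for the example.

         #include "hdf5.h"

         int main(void)
         {
             hsize_t dims[2]  = {100, 200};  /* dataset extent            */
             hsize_t chunk[2] = {10, 20};    /* deflate requires chunking */
             hid_t   file, space, dcpl, dset;

             file  = H5Fcreate("compressed.h5", H5F_ACC_TRUNC,
                               H5P_DEFAULT, H5P_DEFAULT);
             space = H5Screate_simple(2, dims, NULL);

             /* Dataset creation properties: chunked layout, deflate level 6.
              * If HDF5 was built without zlib the filter becomes a no-op. */
             dcpl = H5Pcreate(H5P_DATASET_CREATE);
             H5Pset_chunk(dcpl, 2, chunk);
             H5Pset_deflate(dcpl, 6);

             dset = H5Dcreate(file, "/data", H5T_NATIVE_INT, space, dcpl);

             H5Dclose(dset);
             H5Pclose(dcpl);
             H5Sclose(space);
             H5Fclose(file);
             return 0;
         }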
4.2. MPI and MPI-IO
     The parallel version of the library is built upon the foundation
     provided by MPI and MPI-IO.  If these libraries are not available
     when HDF5 is configured then only a serial version of HDF5 can be
     built.
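     As a sketch of what the parallel build makes possible, the
     program below opens one file from all MPI ranks through the
     MPI-IO driver.  It assumes a library configured with an MPI
     compiler and the H5Pset_fapl_mpio call of the 1.4-series virtual
     file layer (older releases used a different call); the file name
     is a placeholder.

         #include <mpi.h>
         #include "hdf5.h"

         int main(int argc, char *argv[])
         {
             hid_t fapl, file;

             MPI_Init(&argc, &argv);

             /* File access property list that routes I/O through MPI-IO. */
             fapl = H5Pcreate(H5P_FILE_ACCESS);
             H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);

             /* All ranks create/open the same file collectively. */
             file = H5Fcreate("parallel.h5", H5F_ACC_TRUNC,
                              H5P_DEFAULT, fapl);

             H5Fclose(file);
             H5Pclose(fapl);
             MPI_Finalize();
             return 0;
         }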
5. Full installation instructions for source distributions

5.1. Unpacking the distribution
     The HDF5 source code is distributed in a variety of formats which
     can be unpacked with the following commands, each of which
     creates an `hdf5-1.2.0' directory.

5.1.1. Non-compressed tar archive (*.tar)

         $ tar xf hdf5-1.2.0.tar

5.1.2. Compressed tar archive (*.tar.Z)

         $ uncompress -c < hdf5-1.2.0.tar.Z | tar xf -

5.1.3. Gzip'd tar archive (*.tar.gz)

         $ gunzip < hdf5-1.2.0.tar.gz | tar xf -

5.1.4. Bzip'd tar archive (*.tar.bz2)

         $ bunzip2 < hdf5-1.2.0.tar.bz2 | tar xf -

5.2. Source vs. Build Directories
     On most systems the build can occur in a directory other than the
     source directory, allowing multiple concurrent builds and/or
     read-only source code.  In order to accomplish this, one should
     create a build directory, cd into that directory, and run the
     `configure' script found in the source directory (configure
     details are below).

     Unfortunately, this does not work on recent Irix platforms (6.5?
     and later) because that `make' doesn't understand the VPATH
     variable.  However, hdf5 also supports Irix `pmake', which has a
     .PATH target that serves a similar purpose.  Here's what the man
     pages say about VPATH, which is the facility used by HDF5
     makefiles for this feature:

         The VPATH facility is a derivation of the undocumented VPATH
         feature in the System V Release 3 version of make.  System V
         Release 4 has a new VPATH implementation, much like the
         pmake(1) .PATH feature.  This new feature is also
         undocumented in the standard System V Release 4 manual pages.
         For this reason it is not available in the IRIX version of
         make.  The VPATH facility should not be used with the new
         parallel make option.
5.3. Configuring
     HDF5 uses the GNU autoconf system for configuration, which
     detects various features of the host system and creates the
     Makefiles.  On most systems it should be sufficient to say:

         $ ./configure        OR
         $ sh configure

     The configuration process can be controlled through environment
     variables, command-line switches, and host configuration files.
     For a complete list of switches type:

         $ ./configure --help

     The host configuration files are located in the `config'
     directory and are based on the architecture name, vendor name,
     and/or operating system, which are displayed near the beginning
     of the `configure' output.  The host config file influences the
     behavior of configure by setting or augmenting shell variables.

5.3.1. Specifying the installation directories
     Typing `make install' will install the HDF5 library, header
     files, and support programs in /usr/local/lib,
     /usr/local/include, and /usr/local/bin.  To use a path other than
     /usr/local, specify the path with the `--prefix=PATH' switch:

         $ ./configure --prefix=$HOME

     If shared libraries are being built (the default) then the final
     home of the shared library must be specified with this switch
     before the library and executables are built.

5.3.2. Using an alternate C compiler
     By default, configure will look for the C compiler by trying
     `gcc' and `cc'.  However, if the environment variable "CC" is set
     then its value is used as the C compiler (users of csh and
     derivatives will need to prefix the commands below with `env').
     For instance, to use the native C compiler on a system which also
     has the GNU gcc compiler:

         $ CC=cc ./configure

     A parallel version of hdf5 can be built by specifying `mpicc' as
     the C compiler (the `--enable-parallel' flag documented below is
     optional).  Using the `mpicc' compiler will ensure that the
     correct MPI and MPI-IO header files and libraries are used.

         $ CC=/usr/local/mpi/bin/mpicc ./configure

     On Irix64 the default compiler is `cc'.  To use an alternate
     compiler, specify it with the CC variable:

         $ CC='cc -o32' ./configure

@@ -242,46 +244,46 @@
5.3.3. Additional compilation flags
     If additional flags must be passed to the compilation commands
     then specify those flags with the CFLAGS variable.  For instance,
     to enable symbolic debugging of a production version of HDF5 one
     might say:

         $ CFLAGS=-g ./configure --enable-production

5.3.4. Compiling HDF5 wrapper libraries
     One can optionally build the Fortran and/or C++ interface to the
     HDF5 C library.  By default, both options are disabled.  To build
     them, specify `--enable-fortran' and `--enable-cxx' respectively.

         $ ./configure --enable-fortran
         $ ./configure --enable-cxx

     Configuration will halt if a working Fortran 90 or 95 compiler or
     C++ compiler is not found.  Currently, the Fortran configure
     tests for these compilers in order: f90, pgf90, f95.  To use an
     alternative compiler, specify it with the F9X variable:

         $ F9X=/usr/local/bin/g95 ./configure --enable-fortran

     Note: The Fortran and C++ interfaces are not supported on all the
     platforms the main HDF5 library supports.  Also, the Fortran
     interface supports parallel HDF5 while the C++ interface does
     not.

5.3.5. Specifying other programs
     The build system has been tuned for use with GNU make but also
     works with other versions of make.  If the `make' command runs a
     non-GNU version but a GNU version is available under a different
     name (perhaps `gmake') then HDF5 can be configured to use it by
     setting the MAKE variable.  Note that whatever value is used for
     MAKE must also be used as the make command when building the
     library:

         $ MAKE=gmake ./configure
         $ gmake

     The `AR' and `RANLIB' variables can also be set to the names of
     the `ar' and `ranlib' (or `:') commands to override values
     detected by configure.

     The HDF5 library, include files, and utilities are installed
@@ -293,37 +295,34 @@
     machine as of March 2, 1999) you have two choices:

         1. Copy the bin/install-sh program to your $HOME/bin
            directory, name it `install', and make sure that $HOME/bin
            is searched before the system bin directories.

         2. Specify the full path name of the `install-sh' program as
            the value of the INSTALL environment variable.  Note: do
            not use `cp' or some other program in place of install
            because the HDF5 makefiles also use the install program to
            change file ownership and/or access permissions.

5.3.6. Specifying other libraries and headers
     Configure searches the standard places (those places known by the
     system's compiler) for include files and header files.  However,
     additional directories can be specified by using the CPPFLAGS
     and/or LDFLAGS variables:

         $ CPPFLAGS=-I/home/robb/include \
           LDFLAGS=-L/home/robb/lib \
           ./configure

     HDF5 uses the zlib library for two purposes: it provides support
     for the HDF5 deflate data compression filter, and it is used by
     the h5toh4 converter and the h4toh5 converter in support of HDF4.
     Configure searches the standard places (plus those specified
     above with the CPPFLAGS and LDFLAGS variables) for the zlib
     headers and library.  The search can be disabled by specifying
     `--without-zlib', or alternate directories can be specified with
     `--with-zlib=INCDIR,LIBDIR' or through the CPPFLAGS and LDFLAGS
     variables:

         $ ./configure --with-zlib=/usr/unsup/include,/usr/unsup/lib

@@ -340,44 +339,42 @@
     directory.

5.3.7. Static versus shared linking
     The build process will create static libraries on all systems and
     shared libraries on systems that support dynamic linking to a
     sufficient degree.  Either form of library may be suppressed by
     saying `--disable-static' or `--disable-shared'.

         $ ./configure --disable-shared

     To build only statically linked executables on platforms which
     support shared libraries, use the `--enable-static-exec' flag.

         $ ./configure --enable-static-exec

5.3.8. Optimization versus symbolic debugging
     The library can be compiled to provide symbolic debugging support
     so it can be debugged with gdb, dbx, ddd, etc., or it can be
     compiled with various optimizations.  To compile for symbolic
     debugging (the default for snapshots) say `--disable-production';
     to compile with optimizations (the default for supported public
     releases) say `--enable-production'.  On some systems the library
     can also be compiled for profiling with gprof by saying
     `--enable-production=profile'.

         $ ./configure --disable-production         #symbolic debugging
         $ ./configure --enable-production          #optimized code
         $ ./configure --enable-production=profile  #for use with gprof

     Regardless of whether support for symbolic debugging is enabled,
     the library is also able to perform runtime debugging of certain
     packages (such as type conversion execution times and extensive
     invariant condition checking).  To enable this debugging, supply
     a comma-separated list of package names to the `--enable-debug'
     switch (see Debugging.html for a list of package names).
     Debugging can be disabled by saying `--disable-debug'.  The
     default debugging level for snapshots is a subset of the
     available packages; the default for supported releases is no
     debugging (debugging can incur a significant runtime penalty).

         $ ./configure --enable-debug=s,t   #debug only H5S and H5T
         $ ./configure --enable-debug       #debug normal packages
@@ -385,20 +382,19 @@
         $ ./configure --disable-debug      #no debugging

     HDF5 is also able to print a trace of all API function calls,
     their arguments, and the return values.  To enable or disable the
     ability to trace the API say `--enable-trace' (the default for
     snapshots) or `--disable-trace' (the default for public
     releases).  The tracing must also be enabled at runtime to see
     any output (see Debugging.html).

5.3.9. Large (>2GB) vs. small (<2GB) file capability
     In order to read or write files that could potentially be larger
     than 2GB it is necessary to use the non-ANSI `long long' data
     type on some platforms.  However, some compilers (e.g., GNU gcc
     versions before 2.8.1 on Intel platforms) are unable to produce
     correct machine code for this data type.  To disable use of the
     `long long' type on these machines say:

         $ ./configure --disable-hsizet
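     One way to see which configuration an installed library uses is
     to check the width of the hsize_t type, which HDF5 uses for
     dataset sizes and file addresses.  This is a small sketch, based
     on the assumption that hsize_t is the type governed by the
     switch:

         #include <stdio.h>
         #include "hdf5.h"

         int main(void)
         {
             /* 8 bytes: `long long' support, files may exceed 2GB.
              * 4 bytes: built with --disable-hsizet, 2GB address limit. */
             printf("sizeof(hsize_t) = %u bytes\n",
                    (unsigned)sizeof(hsize_t));
             return 0;
         }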
@@ -409,78 +405,78 @@
5.3.11. Threadsafe capability
     The HDF5 library can be configured to be thread-safe (on a very
     large scale) with the `--enable-threadsafe' flag to configure.
     Read the file doc/TechNotes/ThreadSafeLibrary.html for further
     details.

5.3.12. Backward compatibility
     The 1.4 version of the HDF5 library can be configured to operate
     identically to the v1.2 library with the `--enable-hdf5v1_2'
     configure flag.  This allows existing code to be compiled with
     the v1.4 library without requiring immediate changes to the
     application source code.  This flag will only be supported in the
     v1.4 branch of the library; it will not be available in v1.5+.

5.3.13. Network stream capability
     The HDF5 library can be configured with a network stream file
     driver with the `--enable-stream-vfd' configure flag.  This
     option compiles the "stream" Virtual File Driver into the main
     library.  See the documentation on the Virtual File Layer for
     more details about the use of this driver.

5.4. Building
     The library, confidence tests, and programs can be built by
     saying just:

         $ make

     Note that if you supplied some other make command via the MAKE
     variable during the configuration step then that same command
     must be used here.

     When using GNU make you can add `-j -l6' to the make command to
     compile in parallel on SMP machines.  Do not give a number after
     the `-j' since GNU make will turn it off for recursive
     invocations of make.

         $ make -j -l6

5.5. Testing
     HDF5 comes with various test suites, all of which can be run by
     saying

         $ make check

     To run only the tests for the library, change to the `test'
     directory before issuing the command.  Similarly, tests for the
     parallel aspects of the library are in `testpar' and tests for
     the support programs are in `tools'.

     Temporary files will be deleted by each test when it completes,
     but may continue to exist in an incomplete state if the test
     fails.  To prevent deletion of the files, define the
     HDF5_NOCLEANUP environment variable.

5.6. Installing
     The HDF5 library, include files, and support programs can be
     installed in a (semi-)public place by saying `make install'.  The
     files are installed under the directory specified with
     `--prefix=DIR' (or /usr/local) in directories named `lib',
     `include', and `bin'.  The prefix directory must exist prior to
     `make install', but its subdirectories are created automatically.

     If `make install' fails because the install command at your site
     somehow fails, you may use the install-sh that comes with the
     source.  You need to run ./configure again.

         $ INSTALL="$PWD/bin/install-sh -c" ./configure ...
         $ make install

     The library can be used without installing it by pointing the
     compiler at the `src' directory for both include files and
     libraries.  However, the minimum which must be installed to make
     the library publicly available is:

         The library:
             ./src/libhdf5.a

@@ -505,15 +501,13 @@
6. Using the Library
     Please see the User Manual in the doc/html directory.

     Most programs will include <hdf5.h> and link with -lhdf5.
     Additional libraries may also be necessary depending on whether
     support for compression, etc. was compiled into the hdf5 library.

     A summary of the hdf5 installation can be found in the
     libhdf5.settings file in the same directory as the static and/or
     shared hdf5 libraries.
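     A minimal program of that kind is sketched below; the file name
     is a placeholder, and the -I/-L paths and the trailing -lz (only
     needed when zlib support was compiled in) depend on how and where
     the library was installed.

         /* hello_hdf5.c -- compile with something like:
          *     cc hello_hdf5.c -I/usr/local/include \
          *        -L/usr/local/lib -lhdf5 -lz -lm
          */
         #include <hdf5.h>

         int main(void)
         {
             hid_t file = H5Fcreate("hello.h5", H5F_ACC_TRUNC,
                                    H5P_DEFAULT, H5P_DEFAULT);
             if (file < 0)
                 return 1;

             /* ... create groups, datasets, and attributes here ... */

             H5Fclose(file);
             return 0;
         }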
7. Support
     Support is described in the README file.