author    lrknox <lrknox> 2018-05-04 21:18:29 (GMT)
committer lrknox <lrknox> 2018-05-04 21:18:29 (GMT)
commit    cc2eea14f6a6532a201dde13af13353d4d73fd92 (patch)
tree      e8f8441cf03001362edf75bae862ab5b4b93a658 /release_docs
parent    5f702f40b719a05bf2d271797179b368c5fd39c5 (diff)
parent    bd0de172504724b67cd4000a78efc426a696e9fb (diff)
Merge tag 'hdf5-1_10_2' into 1.10/master.
Tag revisions for HDF5 1.10.2 release.
Commit HDF5 1.10.2 release versions.
Diffstat (limited to 'release_docs')
-rw-r--r--               release_docs/HISTORY-1_10.txt       734
-rwxr-xr-x [-rw-r--r--]  release_docs/INSTALL                373
-rw-r--r--               release_docs/INSTALL_CMake.txt      186
-rwxr-xr-x [-rw-r--r--]  release_docs/INSTALL_Cygwin.txt      20
-rwxr-xr-x [-rw-r--r--]  release_docs/INSTALL_parallel        16
-rwxr-xr-x [-rw-r--r--]  release_docs/RELEASE.txt           1305
-rw-r--r--               release_docs/USING_HDF5_CMake.txt   240
-rw-r--r--               release_docs/USING_HDF5_VS.txt        4
8 files changed, 1992 insertions, 886 deletions
diff --git a/release_docs/HISTORY-1_10.txt b/release_docs/HISTORY-1_10.txt
index 03d0e3e..52eb273 100644
--- a/release_docs/HISTORY-1_10.txt
+++ b/release_docs/HISTORY-1_10.txt
@@ -3,11 +3,745 @@ HDF5 History
This file contains development history of the HDF5 1.10 branch
+03. Release Information for hdf5-1.10.1
02. Release Information for hdf5-1.10.0-patch1
01. Release Information for hdf5-1.10.0
[Search on the string '%%%%' for section breaks of each release.]
+%%%%1.10.1%%%%
+
+HDF5 version 1.10.1 released on 2017-04-27
+================================================================================
+
+INTRODUCTION
+
+This document describes the differences between HDF5-1.10.0-patch1 and
+HDF5 1.10.1, and contains information on the platforms tested and known
+problems in HDF5-1.10.1. For more details check the HISTORY*.txt files
+in the HDF5 source.
+
+Links to HDF5 1.10.1 source code, documentation, and additional materials can
+be found on The HDF5 web page at:
+
+ https://support.hdfgroup.org/HDF5/
+
+The HDF5 1.10.1 release can be obtained from:
+
+ https://support.hdfgroup.org/HDF5/release/obtain5.html
+
+User documentation for the snapshot can be accessed directly at this location:
+
+ https://support.hdfgroup.org/HDF5/doc/
+
+New features in the HDF5-1.10.x release series, including brief general
+descriptions of some new and modified APIs, are described in the "New Features
+in HDF5 Release 1.10" document:
+
+ https://support.hdfgroup.org/HDF5/docNewFeatures/index.html
+
+All new and modified APIs are listed in detail in the "HDF5 Software Changes
+from Release to Release" document, in the section "Release 1.10.1 (current
+release) versus Release 1.10.0":
+
+ https://support.hdfgroup.org/HDF5/doc/ADGuide/Changes.html
+
+If you have any questions or comments, please send them to the HDF Help Desk:
+
+ help@hdfgroup.org
+
+
+CONTENTS
+
+- Major New Features Introduced in HDF5 1.10.1
+- Other New Features and Enhancements
+- Support for New Platforms, Languages, and Compilers
+- Bug Fixes since HDF5-1.10.0-patch1
+- Supported Platforms
+- Tested Configuration Features Summary
+- More Tested Platforms
+- Known Problems
+
+
+Major New Features Introduced in HDF5 1.10.1
+============================================
+
+For links to the RFCs and documentation in this section please view
+https://support.hdfgroup.org/HDF5/docNewFeatures in a web browser.
+
+________________________________________
+Metadata Cache Image
+________________________________________
+
+ HDF5 metadata is typically small, and scattered throughout the HDF5 file.
+ This can affect performance, particularly on large HPC systems. The
+ Metadata Cache Image feature can improve performance by writing the
+ metadata cache in a single block on file close, and then populating the
+ cache with the contents of this block on file open, thus avoiding the many
+ small I/O operations that would otherwise be required on file open and
+ close. See the RFC for complete details regarding this feature. Also,
+ see the Fine Tuning the Metadata Cache documentation.
+
+ At present, metadata cache images may not be generated by parallel
+ applications. Parallel applications can read files with metadata cache
+ images, but since this is a collective operation, a deadlock is possible
+ if one or more processes do not participate.
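The feature described above is enabled through a file access property list. A minimal sketch in C, assuming the H5Pset_mdc_image_config() API and H5AC_cache_image_config_t structure as documented for HDF5 1.10.1 (requires linking against the HDF5 library; error checking trimmed):

```c
#include "hdf5.h"

/* Enable metadata cache image generation on a file access property list.
 * On close, the metadata cache is written as a single block; on the next
 * open, the cache is repopulated from that block. */
hid_t open_with_cache_image(const char *filename)
{
    H5AC_cache_image_config_t config;
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);

    config.version            = H5AC__CURR_CACHE_IMAGE_CONFIG_VERSION;
    config.generate_image     = 1;   /* write a cache image on file close */
    config.save_resize_status = 0;
    config.entry_ageout       = H5AC__CACHE_IMAGE__ENTRY_AGEOUT__NONE;
    H5Pset_mdc_image_config(fapl, &config);

    return H5Fopen(filename, H5F_ACC_RDWR, fapl);
}
```

The function name and the choice of configuration values here are illustrative only; see the Metadata Cache Image RFC for the meaning of each field.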
+
+________________________________________
+Metadata Cache Evict on Close
+________________________________________
+
+ The HDF5 library's metadata cache is fairly conservative about holding on
+ to HDF5 object metadata (object headers, chunk index structures, etc.),
+ which can cause the cache size to grow, resulting in memory pressure on
+ an application or system. The "evict on close" property will cause all
+ metadata for an object to be evicted from the cache as long as metadata
+ is not referenced from any other open object. See the Fine Tuning the
+ Metadata Cache documentation for information on the APIs.
+
+ At present, evict on close is disabled in parallel builds.
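The property described above is set on a file access property list. A minimal sketch, assuming the H5Pset_evict_on_close() API shipped with serial builds of HDF5 1.10.1 (function name of the wrapper is illustrative):

```c
#include "hdf5.h"

/* Request that an object's metadata be evicted from the metadata cache
 * when the object is closed (and not referenced by any other open
 * object), reducing cache growth and memory pressure. */
hid_t fapl_with_evict_on_close(void)
{
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_evict_on_close(fapl, 1 /* hbool_t: enable */);
    return fapl;
}
```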
+
+________________________________________
+Paged Aggregation
+________________________________________
+
+ The current HDF5 file space allocation accumulates small pieces of metadata
+ and raw data in aggregator blocks which are not page aligned and vary
+ widely in size. The paged aggregation feature was implemented to provide
+ efficient paged access to these small pieces of metadata and raw data.
+ See the RFC for details. Also, see the File Space Management documentation.
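Paged aggregation is selected at file creation time through the file creation property list. A minimal sketch, assuming the H5Pset_file_space_strategy() and H5Pset_file_space_page_size() APIs introduced in HDF5 1.10.1 (page size and persist/threshold values are illustrative):

```c
#include "hdf5.h"

/* Create a file that aggregates metadata and small raw data into
 * page-aligned 4 KiB blocks. */
hid_t create_paged_file(const char *filename)
{
    hid_t fcpl = H5Pcreate(H5P_FILE_CREATE);

    /* persist = 0: free-space tracking is not persisted across opens;
     * threshold = 1: track free-space sections of 1 byte or larger.   */
    H5Pset_file_space_strategy(fcpl, H5F_FSPACE_STRATEGY_PAGE, 0, (hsize_t)1);
    H5Pset_file_space_page_size(fcpl, (hsize_t)4096);

    return H5Fcreate(filename, H5F_ACC_TRUNC, fcpl, H5P_DEFAULT);
}
```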
+
+________________________________________
+Page Buffering
+________________________________________
+
+ Small and random I/O accesses on parallel file systems result in poor
+ performance for applications. Page buffering in conjunction with paged
+ aggregation can improve performance by giving an application control
+ over the granularity and alignment of HDF5 I/O requests.
+ See the RFC for details. Also, see the Page Buffering documentation.
+
+ At present, page buffering is disabled in parallel builds.
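Page buffering is enabled on a file access property list and applies to files created with the paged aggregation strategy. A minimal sketch, assuming the H5Pset_page_buffer_size() API from HDF5 1.10.1 (buffer size and percentages are illustrative):

```c
#include "hdf5.h"

/* Enable a 1 MiB page buffer. The file being opened must use the
 * paged file space strategy for the buffer to take effect. */
hid_t fapl_with_page_buffer(void)
{
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);

    /* buf_size = 1 MiB; reserve at least 50% of the buffer for
     * metadata pages and set no minimum for raw data pages. */
    H5Pset_page_buffer_size(fapl, (size_t)(1024 * 1024), 50, 0);
    return fapl;
}
```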
+
+
+
+Other New Features and Enhancements
+===================================
+
+ Library
+ -------
+ - Added a mechanism for disabling the SWMR file locking scheme.
+
+ The file locking calls used in HDF5 1.10.0 (including patch1)
+ will fail when the underlying file system does not support file
+ locking or where locks have been disabled. To disable all file
+ locking operations, an environment variable named
+ HDF5_USE_FILE_LOCKING can be set to the five-character string
+ 'FALSE'. This does not fundamentally change HDF5 library
+ operation (aside from the initial file open/create, SWMR is
+ lock-free), but users will have to be more careful about opening
+ files to avoid problematic access patterns (e.g., multiple
+ writers) that the file locking was designed to prevent.
+
+ Additionally, the error message that is emitted when file lock
+ operations set errno to ENOSYS (typical when file locking has been
+ disabled) has been updated to describe the problem and potential
+ resolution better.
+
+ (DER, 2016/10/26, HDFFV-9918)
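For illustration, the variable can be set from the shell (export HDF5_USE_FILE_LOCKING=FALSE) or from within a program before the first HDF5 file open, as sketched below; the variable name and the five-character value are those described above, while the helper function name is hypothetical:

```c
#include <stdlib.h>

/* Disable HDF5 file locking for this process. Must be called before
 * the first H5Fopen/H5Fcreate; the value must be exactly the
 * five-character string "FALSE". Returns 0 on success. */
static int disable_hdf5_file_locking(void)
{
    return setenv("HDF5_USE_FILE_LOCKING", "FALSE", 1 /* overwrite */);
}
```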
+
+ - The return type of H5Pget_driver_info() has been changed from void *
+ to const void *.
+
+ The pointer returned by this function points to internal library
+ memory and should not be freed by the user.
+
+ (DER, 2016/11/04, HDFFV-10017)
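A short sketch of correct usage after this signature change (the function wrapper is illustrative; the key point is that the returned pointer is library-owned):

```c
#include "hdf5.h"

/* The pointer returned by H5Pget_driver_info() is const as of 1.10.1
 * and points into library-internal memory -- never free it. */
void inspect_driver_info(hid_t file_id)
{
    hid_t fapl = H5Fget_access_plist(file_id);
    const void *info = H5Pget_driver_info(fapl); /* was void * before */

    (void)info;            /* read-only inspection only            */
    /* free((void *)info);    WRONG: would corrupt library state   */
    H5Pclose(fapl);
}
```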
+
+ - The direct I/O VFD has been removed from the list of VFDs that
+ support SWMR.
+
+ This configuration was never officially tested and several SWMR
+ tests fail when this VFD is set.
+
+ (DER, 2016/11/03, HDFFV-10169)
+
+ Configuration:
+ --------------
+ - The minimum version of CMake required to build HDF5 is now 3.2.2.
+
+ (ADB, 2017/01/10)
+
+ - An --enable/disable-developer-warnings option has been added to
+ configure.
+
+ This option controls warnings that do not indicate poor code
+ quality, such as -Winline and gcc's -Wsuggest-attribute. Developer
+ warnings are disabled by default.
+
+ (DER, 2017/01/10)
+
+ - A bin/restore.sh script was added that reverts autogen.sh processing.
+
+ (DER, 2016/11/08)
+
+ - CMake: Added NAMESPACE hdf5:: to package configuration files to allow
+ projects using installed HDF5 binaries built with CMake to link with
+ them without specifying the HDF5 library location via IMPORTED_LOCATION.
+
+ (ABD, 2016/10/17, HDFFV-10003)
+
+ - CMake: Changed the CTEST_BUILD_CONFIGURATION option to
+ CTEST_CONFIGURATION_TYPE as recommended by the CMake documentation.
+
+ (ABD, 2016/10/17, HDFFV-9971)
+
+
+ Fortran Library:
+ ----------------
+
+ - The HDF5 Fortran library can now be compiled with the NAG compiler.
+
+ (MSB, 2017/2/10, HDFFV-9973)
+
+
+ C++ Library:
+ ------------
+
+ - The following C++ API wrappers have been added to the C++ Library:
+
+ // Sets/Gets the strategy and the threshold value that the library
+ // will employ in managing file space.
+ FileCreatPropList::setFileSpaceStrategy - H5Pset_file_space_strategy
+ FileCreatPropList::getFileSpaceStrategy - H5Pget_file_space_strategy
+
+ // Sets/Gets the file space page size for paged aggregation.
+ FileCreatPropList::setFileSpacePagesize - H5Pset_file_space_page_size
+ FileCreatPropList::getFileSpacePagesize - H5Pget_file_space_page_size
+
+ // Checks if the given ID is valid.
+ IdComponent::isValid - H5Iis_valid
+
+ // Sets/Gets the number of soft or user-defined links that can be
+ // traversed before a failure occurs.
+ LinkAccPropList::setNumLinks - H5Pset_nlinks
+ LinkAccPropList::getNumLinks - H5Pget_nlinks
+
+ // Returns a copy of the creation property list of a datatype.
+ DataType::getCreatePlist - H5Tget_create_plist
+
+ // Opens/Closes an object within a group or a file, regardless of object
+ // type
+ Group::getObjId - H5Oopen
+ Group::closeObjId - H5Oclose
+
+ // Maps elements of a virtual dataset to elements of the source dataset.
+ DSetCreatPropList::setVirtual - H5Pset_virtual
+
+ // Gets general information about this file.
+ H5File::getFileInfo - H5Fget_info2
+
+ // Returns the number of members in a type.
+ IdComponent::getNumMembers - H5Inmembers
+
+ // Determines if an element type exists.
+ IdComponent::typeExists - H5Itype_exists
+
+ // Determines if an object exists.
+ H5Location::exists - H5Lexists.
+
+ // Returns the header version of an HDF5 object.
+ H5Object::objVersion - H5Oget_info for version
+
+ (BMR, 2017/03/20, HDFFV-10004, HDFFV-10139, HDFFV-10145)
+
+ - New exception: ObjHeaderIException for H5O interface.
+
+ (BMR, 2017/03/15, HDFFV-10145)
+
+ - New class LinkAccPropList for link access property list, to be used by
+ wrappers of H5Lexists.
+
+ (BMR, 2017/01/04, HDFFV-10145)
+
+ - New constructors to open datatypes in ArrayType, CompType, DataType,
+ EnumType, FloatType, IntType, StrType, and VarLenType.
+
+ (BMR, 2016/12/26, HDFFV-10056)
+
+ - New member functions:
+
+ DSetCreatPropList::setNbit() to setup N-bit compression for a dataset.
+
+ ArrayType::getArrayNDims() const
+ ArrayType::getArrayDims() const
+ both to replace the non-const versions.
+
+ (BMR, 2016/04/25, HDFFV-8623, HDFFV-9725)
+
+
+ Tools:
+ ------
+ - The following options have been added to h5clear:
+ -s: clear the status_flags field in the file's superblock
+ -m: Remove the metadata cache image from the file
+
+ (QAK, 2017/03/22, PR#361)
+
+
+ High-Level APIs:
+ ----------------
+ - Added a new Fortran 2003 API for h5tbmake_table_f.
+
+ (MSB, 2017/02/10, HDFFV-8486)
+
+
+
+Support for New Platforms, Languages, and Compilers
+===================================================
+
+ - Added NAG compiler
+
+
+
+Bug Fixes since HDF5-1.10.0-patch1 release
+==========================================
+
+ Library
+ -------
+ - Outdated data structure was used in H5D_CHUNK_DEBUG blocks, causing
+ compilation errors when H5D_CHUNK_DEBUG was defined. This is fixed.
+
+ (BMR, 2017/04/04, HDFFV-8089)
+
+ - The SWMR implementation in the HDF5 1.10.0 and 1.10.0-patch1 releases
+ has a broken metadata flush dependency that manifested itself with the
+ following error at the end of the HDF5 error stack:
+
+ H5Dint.c line 846 in H5D__swmr_setup(): dataspace chunk index must be 0
+ for SWMR access, chunkno = 1
+ major: Dataset
+ minor: Bad value
+
+ It was also reported at https://github.com/areaDetector/ADCore/issues/203
+
+ The flush dependency is fixed in this release.
+
+ - Changed the plugins dlopen option from RTLD_NOW to RTLD_LAZY
+
+ (ABD, 2016/12/12, PR#201)
+
+ - A number of issues were fixed when reading/writing from/to corrupted
+ files to ensure that the library fails gracefully in these cases:
+
+ * Writing to a corrupted file that has an object message which is
+ incorrectly marked as sharable on disk results in a buffer overflow /
+ invalid write instead of a clean error message.
+
+ * Decoding data from a corrupted file with a dataset encoded with the
+ H5Z_NBIT decoding can result in a code execution vulnerability under
+ the context of the application using the HDF5 library.
+
+ * When decoding an array datatype from a corrupted file, the HDF5 library
+ fails to return an error in production if the number of dimensions
+ decoded is greater than the maximum rank.
+
+ * When decoding an "old style" array datatype from a corrupted file, the
+ HDF5 library fails to return an error in production if the number of
+ dimensions decoded is greater than the maximum rank.
+
+ (NAF, 2016/10/06, HDFFV-9950, HDFFV-9951, HDFFV-9992, HDFFV-9993)
+
+ - Fixed an error that would occur when copying an object with an attribute
+ which is a compound datatype consisting of a variable length string.
+
+ (VC, 2016/08/24, HDFFV-7991)
+
+ - H5DOappend will no longer fail if a dataset has no append callback
+ registered.
+
+ (VC, 2016/08/14, HDFFV-9960)
+
+ - Fixed an issue where H5Pset_alignment could result in misaligned blocks
+ with some input combinations, causing an assertion failure in debug mode.
+
+ (NAF, 2016/08/11, HDFFV-9948)
+
+ - Fixed a problem where a plugin compiled into a DLL in the default plugin
+ directory could not be found by the HDF5 library at runtime on Windows
+ when the HDF5_PLUGIN_PATH environment variable was not set.
+
+ (ABD, 2016/08/01, HDFFV-9706)
+
+ - Fixed an error that would occur when calling H5Adelete on an attribute
+ which is attached to an externally linked object in the target file and
+ whose datatype is a committed datatype in the main file.
+
+ (VC, 2016/07/06, HDFFV-9940)
+
+ - (a) Throw an error instead of an assertion when the v1 B-tree level
+ hits the 1-byte limit.
+ (b) Modifications to better handle error recovery when conversion by
+ h5format_convert fails.
+
+ (VC, 2016/05/29, HDFFV-9434)
+
+ - Fixed a memory leak where an array used by the library to track SWMR
+ read retries was never freed.
+
+ The leaked memory was small (on the order of a few tens of ints) and
+ allocated per-file. The memory was allocated (and lost) only when a
+ file was opened for SWMR access.
+
+ (DER, 2016/04/27, HDFFV-9786)
+
+ - Fixed a memory leak that could occur when opening a file for the first
+ time (including creating) and the call fails.
+
+ This occurred when the file-driver-specific info was not cleaned up.
+ The amount of memory leaked varied with the file driver, but would
+ normally be less than 1 kB.
+
+ (DER, 2016/12/06, HDFFV-10168)
+
+ - Fixed a failure in collective metadata writes.
+
+ This failure only appeared when collective metadata writes
+ were enabled (via H5Pset_coll_metadata_write()).
+
+ (JRM, 2017/04/10, HDFFV-10055)
+
+
+ Parallel Library
+ ----------------
+ - Fixed a bug that could occur when allocating a chunked dataset in parallel
+ with an alignment set and an alignment threshold greater than the chunk
+ size but less than or equal to the raw data aggregator size.
+
+ (NAF, 2016/08/11, HDFFV-9969)
+
+
+ Configuration
+ -------------
+ - Configuration will check for the strtoll and strtoull functions
+ before using alternatives.
+
+ (ABD, 2017/03/17, PR#340)
+
+ - CMake uses a Windows pdb directory variable if available and
+ will generate both static and shared pdb files.
+
+ (ABD, 2017/02/06, HDFFV-9875)
+
+ - CMake now builds shared versions of tools.
+
+ (ABD, 2017/02/01, HDFFV-10123)
+
+ - Makefiles and test scripts have been updated to correctly remove files
+ created when running "make check" and to avoid removing any files under
+ source control. In-source builds followed by "make clean" and "make
+ distclean" should result in the original source files.
+ (LRK, 2017/01/17, HDFFV-10099)
+
+ - The tools directory has been divided into two separate source and test
+ directories. This resolves a build dependency and, as a result,
+ 'make check' will no longer fail in the tools directory if 'make' was
+ not executed first.
+
+ (ABD, 2016/10/27, HDFFV-9719)
+
+ - CMake: Fixed a timeout error that would occasionally occur when running
+ the virtual file driver tests simultaneously due to test directory
+ and file name collisions.
+
+ (ABD, 2016/09/19, HDFFV-9431)
+
+ - CMake: Fixed a command length overflow error by converting custom
+ commands inside CMakeTest.cmake files into regular dependencies and
+ targets.
+
+ (ABD, 2016/07/12, HDFFV-9939)
+
+ - Fixed a problem preventing HDF5 from being built on 32-bit Cygwin by
+ condensing the Cygwin configuration files into a single file and
+ removing outdated compiler settings.
+
+ (ABD, 2016/07/12, HDFFV-9946)
+
+
+ Fortran
+ --------
+ - Changed H5S_ALL_F from INTEGER to INTEGER(HID_T)
+
+ (MSB, 2016/10/14, HDFFV-9987)
+
+
+ Tools
+ -----
+ - h5diff now correctly ignores strpad in comparing strings.
+
+ (ABD, 2017/03/03, HDFFV-10128)
+
+ - h5repack now correctly parses the command line filter options.
+
+ (ABD, 2017/01/24, HDFFV-10046)
+
+ - h5diff now correctly returns an error when it cannot read data due
+ to an unavailable filter plugin.
+
+ (ADB 2017/01/18, HDFFV-9994 )
+
+ - Fixed an error in the compiler wrapper scripts (h5cc, h5fc, et al.)
+ in which they would erroneously drop the file argument specified via
+ the -o flag when the -o flag was specified before the -c flag on the
+ command line, resulting in a failure to compile.
+
+ (LRK, 2016/11/04, HDFFV-9938, HDFFV-9530)
+
+ - h5repack User Defined (UD) filter parameters were not parsed correctly.
+
+ The parameter-parsing code was reworked to read the correct values
+ and to verify the number of parameters.
+
+ (ABD, 2016/10/19, HDFFV-9996, HDFFV-9974, HDFFV-9515, HDFFV-9039)
+
+ - h5repack allows the --enable-error-stack option on the command line.
+
+ (ADB, 2016/08/08, HDFFV-9775)
+
+
+ C++ APIs
+ --------
+ - The member function H5Location::getNumObjs() was moved to class
+ Group because objects exist only in a group or a file, and
+ H5Object::getNumAttrs() was moved to H5Location so that the number
+ of attributes can be queried at any given location.
+
+ (BMR, 2017/03/17, PR#466)
+
+ - Due to a change in the C API, the overloaded functions of
+ PropList::setProperty now require const for some arguments. The old
+ versions are planned for deprecation and are replaced by new
+ versions with the proper consts.
+
+ (BMR, 2017/03/17, PR#344)
+
+ - The high-level API Packet Table (PT) did not write data correctly when
+ the datatype is a compound type that has string type as one of the
+ members. This problem started in 1.8.15, after the fix of HDFFV-9042
+ was applied, which caused the Packet Table to use native type to access
+ the data. It should be up to the application to specify whether the
+ buffer to be read into memory is in the machine's native architecture.
+ Thus, the PT is fixed to not use native type but to make a copy of the
+ user's provided datatype during creation or the packet table's datatype
+ during opening. If an application wishes to use the native type to
+ read the data, it must request that explicitly; however, the Packet
+ Table does not provide a way to specify a memory datatype in this
+ release. This feature will be available in a future release.
+
+ (BMR, 2016/10/27, HDFFV-9758)
+
+ - The obsolete macros H5_NO_NAMESPACE and H5_NO_STD have been removed from
+ the HDF5 C++ API library.
+
+ (BMR, 2016/10/23, HDFFV-9532)
+
+ - Fixed a problem where a user-defined function could not access both
+ an attribute and a dataset using only one argument.
+
+ (BMR, 2016/10/11, HDFFV-9920)
+
+ - In-memory array information, ArrayType::rank and
+ ArrayType::dimensions, were removed. This is an implementation
+ detail and should not affect applications.
+
+ (BMR, 2016/04/25, HDFFV-9725)
+
+
+ Testing
+ -------
+ - Fixed a problem that caused tests using SWMR to occasionally fail when
+ running "make check" using parallel make.
+
+ (LRK, 2016/03/22, PR#338, PR#346, PR#358)
+
+
+Supported Platforms
+===================
+
+ Linux 2.6.32-573.18.1.el6.ppc64 gcc (GCC) 4.4.7 20120313 (Red Hat 4.4.7-4)
+ #1 SMP ppc64 GNU/Linux g++ (GCC) 4.4.7 20120313 (Red Hat 4.4.7-4)
+ (ostrich) GNU Fortran (GCC) 4.4.7 20120313
+ (Red Hat 4.4.7-4)
+ IBM XL C/C++ V13.1
+ IBM XL Fortran V15.1
+
+ Linux 3.10.0-327.10.1.el7 GNU C (gcc), Fortran (gfortran), C++ (g++)
+ #1 SMP x86_64 GNU/Linux compilers:
+ (kituo/moohan) Version 4.8.5 20150623 (Red Hat 4.8.5-4)
+ Version 4.9.3, Version 5.2.0
+ Intel(R) C (icc), C++ (icpc), Fortran (icc)
+ compilers:
+ Version 15.0.3.187 Build 20150407
+ MPICH 3.1.4 compiled with GCC 4.9.3
+
+ SunOS 5.11 32- and 64-bit Sun C 5.12 SunOS_sparc
+ (emu) Sun Fortran 95 8.6 SunOS_sparc
+ Sun C++ 5.12 SunOS_sparc
+
+ Windows 7 Visual Studio 2012 w/ Intel Fortran 15 (cmake)
+ Visual Studio 2013 w/ Intel Fortran 15 (cmake)
+ Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+
+ Windows 7 x64 Visual Studio 2012 w/ Intel Fortran 15 (cmake)
+ Visual Studio 2013 w/ Intel Fortran 15 (cmake)
+ Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+ Visual Studio 2015 w/ MSMPI 8 (cmake)
+ Cygwin (CYGWIN_NT-6.1 2.8.0(0.309/5/3))
+ gcc and gfortran compilers (GCC 5.4.0)
+ (cmake and autotools)
+
+ Windows 10 Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+ Cygwin (CYGWIN_NT-6.1 2.8.0(0.309/5/3))
+ gcc and gfortran compilers (GCC 5.4.0)
+ (cmake and autotools)
+
+ Windows 10 x64 Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+
+ Mac OS X Mt. Lion 10.8.5 Apple clang/clang++ version 5.1 from Xcode 5.1
+ 64-bit gfortran GNU Fortran (GCC) 4.8.2
+ (swallow/kite) Intel icc/icpc/ifort version 15.0.3
+
+ Mac OS X Mavericks 10.9.5 Apple clang/clang++ version 6.0 from Xcode 6.2
+ 64-bit gfortran GNU Fortran (GCC) 4.9.2
+ (wren/quail) Intel icc/icpc/ifort version 15.0.3
+
+ Mac OS X Yosemite 10.10.5 Apple clang/clang++ version 6.1 from Xcode 7.0
+ 64-bit gfortran GNU Fortran (GCC) 4.9.2
+ (osx1010dev/osx1010test) Intel icc/icpc/ifort version 15.0.3
+
+ Mac OS X El Capitan 10.11.6 Apple clang/clang++ version 7.3 from Xcode 7.3
+ 64-bit gfortran GNU Fortran (GCC) 5.2.0
+ (osx1010dev/osx1010test) Intel icc/icpc/ifort version 16.0.2
+
+
+Tested Configuration Features Summary
+=====================================
+
+ In the tables below
+ y = tested
+ n = not tested in this release
+ C = Cluster
+ W = Workstation
+ x = not working in this release
+ dna = does not apply
+ ( ) = footnote appears below second table
+ <blank> = testing incomplete on this feature or platform
+
+Platform C F90/ F90 C++ zlib SZIP
+ parallel F2003 parallel
+Solaris2.11 32-bit n y/y n y y y
+Solaris2.11 64-bit n y/n n y y y
+Windows 7 y y/y n y y y
+Windows 7 x64 y y/y y y y y
+Windows 7 Cygwin n y/n n y y y
+Windows 7 x64 Cygwin n y/n n y y y
+Windows 10 y y/y n y y y
+Windows 10 x64 y y/y n y y y
+Mac OS X Mountain Lion 10.8.5 64-bit n y/y n y y y
+Mac OS X Mavericks 10.9.5 64-bit n y/y n y y y
+Mac OS X Yosemite 10.10.5 64-bit n y/y n y y y
+Mac OS X El Capitan 10.11.6 64-bit n y/y n y y y
+CentOS 7.2 Linux 2.6.32 x86_64 PGI n y/y n y y y
+CentOS 7.2 Linux 2.6.32 x86_64 GNU y y/y y y y y
+CentOS 7.2 Linux 2.6.32 x86_64 Intel n y/y n y y y
+Linux 2.6.32-573.18.1.el6.ppc64 n y/y n y y y
+
+
+Platform Shared Shared Shared Thread-
+ C libs F90 libs C++ libs safe
+Solaris2.11 32-bit y y y y
+Solaris2.11 64-bit y y y y
+Windows 7 y y y y
+Windows 7 x64 y y y y
+Windows 7 Cygwin n n n y
+Windows 7 x64 Cygwin n n n y
+Windows 10 y y y y
+Windows 10 x64 y y y y
+Mac OS X Mountain Lion 10.8.5 64-bit y n y y
+Mac OS X Mavericks 10.9.5 64-bit y n y y
+Mac OS X Yosemite 10.10.5 64-bit y n y y
+Mac OS X El Capitan 10.11.6 64-bit y n y y
+CentOS 7.2 Linux 2.6.32 x86_64 PGI y y y n
+CentOS 7.2 Linux 2.6.32 x86_64 GNU y y y y
+CentOS 7.2 Linux 2.6.32 x86_64 Intel y y y n
+Linux 2.6.32-573.18.1.el6.ppc64 y y y n
+
+Compiler versions for each platform are listed in the preceding
+"Supported Platforms" table.
+
+
+More Tested Platforms
+=====================
+
+The following platforms are not supported but have been tested for this release.
+
+ Linux 2.6.32-573.22.1.el6 GNU C (gcc), Fortran (gfortran), C++ (g++)
+ #1 SMP x86_64 GNU/Linux compilers:
+ (mayll/platypus) Version 4.4.7 20120313
+ Version 4.8.4
+ PGI C, Fortran, C++ for 64-bit target on
+ x86-64;
+ Version 16.10-0
+ Intel(R) C (icc), C++ (icpc), Fortran (icc)
+ compilers:
+ Version 15.0.3.187 (Build 20150407)
+ MPICH 3.1.4 compiled with GCC 4.9.3
+
+ Linux 3.10.0-327.18.2.el7 GNU C (gcc) and C++ (g++) compilers
+ #1 SMP x86_64 GNU/Linux Version 4.8.5 20150623 (Red Hat 4.8.5-4)
+ (jelly) with NAG Fortran Compiler Release 6.1(Tozai)
+ Intel(R) C (icc) and C++ (icpc) compilers
+ Version 15.0.3.187 (Build 20150407)
+ with NAG Fortran Compiler Release 6.1(Tozai)
+
+ Linux 2.6.32-573.18.1.el6.ppc64 MPICH mpich 3.1.4 compiled with
+ #1 SMP ppc64 GNU/Linux IBM XL C/C++ for Linux, V13.1
+ (ostrich) and IBM XL Fortran for Linux, V15.1
+
+ Debian 8.4 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1 x86_64 GNU/Linux
+ gcc, g++ (Debian 4.9.2-10) 4.9.2
+ GNU Fortran (Debian 4.9.2-10) 4.9.2
+ (cmake and autotools)
+
+ Fedora 24 4.7.2-201.fc24.x86_64 #1 SMP x86_64 x86_64 x86_64 GNU/Linux
+ gcc, g++ (GCC) 6.1.1 20160621
+ (Red Hat 6.1.1-3)
+ GNU Fortran (GCC) 6.1.1 20160621
+ (Red Hat 6.1.1-3)
+ (cmake and autotools)
+
+ Ubuntu 16.04.1 4.4.0-38-generic #57-Ubuntu SMP x86_64 GNU/Linux
+ gcc, g++ (Ubuntu 5.4.0-6ubuntu1~16.04.2)
+ 5.4.0 20160609
+ GNU Fortran (Ubuntu 5.4.0-6ubuntu1~16.04.2)
+ 5.4.0 20160609
+ (cmake and autotools)
+
+
+Known Problems
+==============
+
+ At present, metadata cache images may not be generated by parallel
+ applications. Parallel applications can read files with metadata cache
+ images, but since this is a collective operation, a deadlock is possible
+ if one or more processes do not participate.
+
+ Known problems in previous releases can be found in the HISTORY*.txt files
+ in the HDF5 source. Please report any new problems found to
+ help@hdfgroup.org.
+
+
%%%%1.10.0-patch1%%%%
diff --git a/release_docs/INSTALL b/release_docs/INSTALL
index 2dcb9be..baad559 100644..100755
--- a/release_docs/INSTALL
+++ b/release_docs/INSTALL
@@ -3,10 +3,11 @@ Instructions for the Installation of HDF5 Software
==================================================
This file provides instructions for installing the HDF5 software.
-If you have any problems with the installation, please see The HDF Group's
-support page at the following location:
- http://www.hdfgroup.org/services/support.html
+For help with installation, questions can be posted to the HDF Forum or
+sent to the HDF Helpdesk:
+
+ HDF Forum: https://forum.hdfgroup.org/
+ HDF Helpdesk: https://portal.hdfgroup.org/display/support/The+HDF+Help+Desk
CONTENTS
--------
@@ -31,59 +32,34 @@ CONTENTS
4.3. Configuring
4.3.1. Specifying the installation directories
4.3.2. Using an alternate C compiler
- 4.3.3. Configuring for 64-bit support
- 4.3.4. Additional compilation flags
- 4.3.5. Compiling HDF5 wrapper libraries
- 4.3.6. Specifying other programs
- 4.3.7. Specifying other libraries and headers
- 4.3.8. Static versus shared linking
- 4.3.9. Optimization versus symbolic debugging
- 4.3.10. Parallel versus serial library
- 4.3.11. Threadsafe capability
- 4.3.12. Backward compatibility
+ 4.3.3. Additional compilation flags
+ 4.3.4. Compiling HDF5 wrapper libraries
+ 4.3.5. Specifying other programs
+ 4.3.6. Specifying other libraries and headers
+ 4.3.7. Static versus shared linking
+ 4.3.8. Optimization versus symbolic debugging
+ 4.3.9. Parallel versus serial library
+ 4.3.10. Threadsafe capability
+ 4.3.11. Backward compatibility
4.4. Building
4.5. Testing
4.6. Installing HDF5
5. Using the Library
- 6. Support
-
- A. Warnings about compilers
- A.1. GNU (Intel platforms)
- A.2. DEC
- A.3. SGI (Irix64 6.2)
- A.4. Windows/NT
-
- B. Large (>2GB) versus small (<2GB) file capability
-
- C. Building and testing with other compilers
- C.1. Building and testing with Intel compilers
- C.2. Building and testing with PGI compilers
*****************************************************************************
1. Obtaining HDF5
The latest supported public release of HDF5 is available from
- ftp://ftp.hdfgroup.org/HDF5/current/src. For Unix and UNIX-like
+ https://www.hdfgroup.org/downloads/hdf5/. For Unix and UNIX-like
platforms, it is available in tar format compressed with gzip.
For Microsoft Windows, it is in ZIP format.
- The HDF team also makes snapshots of the source code available on
- a regular basis. These snapshots are unsupported (that is, the
- HDF team will not release a bug-fix on a particular snapshot;
- rather any bug fixes will be rolled into the next snapshot).
- Furthermore, the snapshots have only been tested on a few
- machines and may not test correctly for parallel applications.
- Snapshots, in a limited number of formats, can be found on THG's
- development FTP server:
-
- ftp://ftp.hdfgroup.uiuc.edu/pub/outgoing/hdf5/snapshots
-
2. Quick installation
For those who don't like to read ;-) the following steps can be used
- to configure, build, test, and install the HDF5 Library, header files,
+ to configure, build, test, and install the HDF5 library, header files,
and support programs. For example, to install HDF5 version X.Y.Z at
location /usr/local/hdf5, use the following steps.
@@ -125,28 +101,30 @@ CONTENTS
3. HDF5 dependencies
3.1. Zlib
- The HDF5 Library includes a predefined compression filter that
+ The HDF5 library includes a predefined compression filter that
uses the "deflate" method for chunked datasets. If zlib-1.1.2 or
later is found, HDF5 will use it. Otherwise, HDF5's predefined
compression method will degenerate to a no-op; the compression
filter will succeed but the data will not be compressed.
3.2. Szip (optional)
- The HDF5 Library includes a predefined compression filter that
+ The HDF5 library includes a predefined compression filter that
uses the extended-Rice lossless compression algorithm for chunked
- datasets. For more information about Szip compression and license
- terms, see http://hdfgroup.org/doc_resource/SZIP/.
+ datasets. For information on Szip compression, license terms,
+ and obtaining the Szip source code, see:
+
+ https://portal.hdfgroup.org/display/HDF5/Szip+Compression+in+HDF+Products
- The Szip source code can be obtained from the HDF5 Download page
- http://www.hdfgroup.org/HDF5/release/obtain5.html#extlibs. Building
- instructions are available with the Szip source code.
+ Building instructions are available with the Szip source code.
The HDF Group does not distribute separate Szip precompiled libraries,
- but the HDF5 binaries available from
- http://www.hdfgroup.org/HDF5/release/obtain5.html include
- the Szip encoder enabled binary for the corresponding platform.
+ but the HDF5 pre-built binaries provided on The HDF Group download page
+ include the Szip library with the encoder enabled. These can be found
+ here:
- To configure the HDF5 Library with the Szip compression filter, use
+ https://www.hdfgroup.org/downloads/hdf5/
+
+ To configure the HDF5 library with the Szip compression filter, use
the '--with-szlib=/PATH_TO_SZIP' flag. For more information, see
section 4.3.7, "Specifying other libraries and headers."
@@ -204,20 +182,6 @@ CONTENTS
$ cd build-fortran
$ ../hdf5-X.Y.Z/configure --enable-fortran ...
- Unfortunately, this does not work on recent Irix platforms (6.5?
- and later) because that `make' does not understand the VPATH variable.
- However, HDF5 also supports Irix `pmake' which has a .PATH target
- which serves a similar purpose. Here's what the Irix man pages say
- about VPATH, the facility used by HDF5 makefiles for this feature:
-
- The VPATH facility is a derivation of the undocumented
- VPATH feature in the System V Release 3 version of make.
- System V Release 4 has a new VPATH implementation, much
- like the pmake(1) .PATH feature. This new feature is also
- undocumented in the standard System V Release 4 manual
- pages. For this reason it is not available in the IRIX
- version of make. The VPATH facility should not be used
- with the new parallel make option.
4.3. Configuring
HDF5 uses the GNU autoconf system for configuration, which
@@ -243,7 +207,7 @@ CONTENTS
4.3.1. Specifying the installation directories
The default installation location is the HDF5 directory created in
the build directory. Typing `make install' will install the HDF5
- Library, header files, examples, and support programs in hdf5/lib,
+ library, header files, examples, and support programs in hdf5/lib,
hdf5/include, hdf5/doc/hdf5/examples, and hdf5/bin. To use a path
other than hdf5, specify the path with the `--prefix=PATH' switch:
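For example, a sketch of installing under a custom prefix (the path shown is hypothetical):

```shell
# Install the library, headers, examples, and tools under
# /usr/local/hdf5 instead of the default hdf5 directory.
$ ./configure --prefix=/usr/local/hdf5
```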
@@ -275,45 +239,24 @@ CONTENTS
$ CC=/usr/local/mpi/bin/mpicc ./configure
-4.3.3. Configuring for 64-bit support
- Several machine architectures support 32-bit or 64-bit binaries.
- The options below describe how to enable support for different options.
-
- On Irix64, the default compiler is `cc'. To use an alternate compiler,
- specify it with the CC variable:
-
- $ CC='cc -n32' ./configure
-
- Similarly, users compiling on a Solaris machine and desiring to
- build the distribution with 64-bit support should specify the
- correct flags with the CC variable:
-
- $ CC='cc -m64' ./configure
-
- To configure AIX 64-bit support including the Fortran and C++ APIs,
- (Note: need to set $AR to 'ar -X 64'.)
- Serial:
- $ CFLAGS=-q64 FCFLAGS=-q64 CXXFLAGS=-q64 AR='ar -X 64'\
- ./configure --enable-fortran
- Parallel: (C++ not supported with parallel)
- $ CFLAGS=-q64 FCFLAGS=-q64 AR='ar -X 64'\
- ./configure --enable-fortran
-4.3.4. Additional compilation flags
- If addtional flags must be passed to the compilation commands,
+4.3.3. Additional compilation flags
+ If additional flags must be passed to the compilation commands,
specify those flags with the CFLAGS variable. For instance,
to enable symbolic debugging of a production version of HDF5, one
might say:
- $ CFLAGS=-g ./configure --enable-production
+ $ CFLAGS=-g ./configure --enable-build-mode=production
-4.3.5. Compiling HDF5 wrapper libraries
- One can optionally build the Fortran and/or C++ interfaces to the
- HDF5 C library. By default, both options are disabled. To build
- them, specify `--enable-fortran' and `--enable-cxx', respectively.
+4.3.4. Compiling HDF5 wrapper libraries
+ One can optionally build the Fortran, C++, and Java interfaces to
+ the HDF5 C library. By default, these options are disabled. To build
+ them, specify '--enable-fortran', '--enable-cxx', or '--enable-java',
+ respectively.
$ ./configure --enable-fortran
$ ./configure --enable-cxx
+ $ ./configure --enable-java
Configuration will halt if a working Fortran 90 or 95 compiler or
C++ compiler is not found. Currently, the Fortran configure tests
@@ -322,15 +265,8 @@ CONTENTS
$ FC=/usr/local/bin/g95 ./configure --enable-fortran
- Note: The Fortran and C++ interfaces are not supported on all the
- platforms the main HDF5 Library supports. Also, the Fortran
- interface supports parallel HDF5 while the C++ interface does
- not.
- Note: See sections 4.7 and 4.8 for building the Fortran library with
- Intel or PGI compilers.
-
-4.3.6. Specifying other programs
+4.3.5. Specifying other programs
The build system has been tuned for use with GNU make but also
works with other versions of make. If the `make' command runs a
non-GNU version but a GNU version is available under a different
@@ -346,7 +282,7 @@ CONTENTS
the `ar' and `ranlib' (or `:') commands to override values
detected by configure.
- The HDF5 Library, include files, and utilities are installed
+ The HDF5 library, include files, and utilities are installed
during `make install' (described below) with a BSD-compatible
install program detected automatically by configure. If none is
found, the shell script bin/install-sh is used. Configure does not
@@ -364,7 +300,7 @@ CONTENTS
because the HDF5 makefiles also use the install program to
change file ownership and/or access permissions.
-4.3.7. Specifying other libraries and headers
+4.3.6. Specifying other libraries and headers
Configure searches the standard places (those places known by the
systems compiler) for include files and header files. However,
additional directories can be specified by using the CPPFLAGS
@@ -389,12 +325,12 @@ CONTENTS
./configure
HDF5 includes Szip as a predefined compression method (see 3.2).
- To enable Szip compression, the HDF5 Library must be configured
- and built using the Szip Library:
+ To enable Szip compression, the HDF5 library must be configured
+ and built using the Szip library:
$ ./configure --with-szlib=/Szip_Install_Directory
-4.3.8. Static versus shared linking
+4.3.7. Static versus shared linking
The build process will create static libraries on all systems and
shared libraries on systems that support dynamic linking to a
sufficient degree. Either form of the library may be suppressed by
@@ -410,75 +346,75 @@ CONTENTS
$ ./configure --enable-static-exec
-4.3.9. Optimization versus symbolic debugging
+4.3.8. Optimization versus symbolic debugging
The library can be compiled to provide symbolic debugging support
so it can be debugged with gdb, dbx, ddd, etc., or it can be
compiled with various optimizations. To compile for symbolic
- debugging (the default for snapshots), say `--disable-production';
- to compile with optimizations (the default for supported public
- releases), say `--enable-production'. On some systems the library
- can also be compiled for profiling with gprof by saying
+   debugging (the default for snapshots), say
+   `--enable-build-mode=debug'; to compile with optimizations
+   (the default for supported public releases), say
+   `--enable-build-mode=production'. On some systems the
+   library can also be compiled for profiling with gprof by saying
`--enable-production=profile'.
- $ ./configure --disable-production #symbolic debugging
- $ ./configure --enable-production #optimized code
- $ ./configure --enable-production=profile #for use with gprof
+ $ ./configure --enable-build-mode=debug #symbolic debugging
+ $ ./configure --enable-build-mode=production #optimized code
+ $ ./configure --enable-production=profile #for use with gprof
Regardless of whether support for symbolic debugging is enabled,
the library can also perform runtime debugging of certain packages
(such as type conversion execution times and extensive invariant
- condition checking). To enable this debugging, supply a
- comma-separated list of package names to to the `--enable-debug'
- switch. See "Debugging HDF5 Applications" for a list of package
- names:
-
- http://www.hdfgroup.org/HDF5/doc/H5.user/Debugging.html
+ condition checking). To enable this debugging, supply a
+ comma-separated list of package names to the `--enable-internal-debug'
+ switch.
- Debugging can be disabled by saying `--disable-debug'.
+ Debugging can be disabled by saying `--disable-internal-debug'.
The default debugging level for snapshots is a subset of the
available packages; the default for supported releases is no
debugging (debugging can incur a significant runtime penalty).
- $ ./configure --enable-debug=s,t #debug only H5S and H5T
- $ ./configure --enable-debug #debug normal packages
- $ ./configure --enable-debug=all #debug all packages
- $ ./configure --disable-debug #no debugging
+ $ ./configure --enable-internal-debug=s,t #debug only H5S and H5T
+ $ ./configure --enable-internal-debug #debug normal packages
+ $ ./configure --enable-internal-debug=all #debug all packages
+ $ ./configure --disable-internal-debug #no debugging
HDF5 can also print a trace of all API function calls, their
arguments, and the return values. To enable or disable the
ability to trace the API say `--enable-trace' (the default for
        snapshots) or `--disable-trace' (the default for public releases).
- The tracing must also be enabled at runtime to see any output
- (see "Debugging HDF5 Applications," reference above).
+ The tracing must also be enabled at runtime to see any output.
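A sketch of enabling tracing at both build time and runtime. The use of the `HDF5_DEBUG` environment variable to switch tracing on at runtime is an assumption here, and `my_hdf5_app` is a hypothetical application:

```shell
# Build the library with API tracing compiled in.
$ ./configure --enable-trace
$ make && make install

# Switch tracing on at runtime (HDF5_DEBUG variable assumed;
# trace output is written to stderr).
$ HDF5_DEBUG="trace" ./my_hdf5_app
```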
-4.3.10. Parallel versus serial library
- The HDF5 Library can be configured to use MPI and MPI-IO for
+4.3.9. Parallel versus serial library
+ The HDF5 library can be configured to use MPI and MPI-IO for
parallelism on a distributed multi-processor system. Read the
- file INSTALL_parallel for detailed explanations.
+ file INSTALL_parallel for detailed information.
-4.3.11. Threadsafe capability
- The HDF5 Library can be configured to be thread-safe (on a very
+4.3.10. Threadsafe capability
+ The HDF5 library can be configured to be thread-safe (on a very
large scale) with the `--enable-threadsafe' flag to the configure
script. Some platforms may also require the '-with-pthread=INC,LIB'
(or '--with-pthread=DIR') flag to the configure script.
- For further details, see "HDF5 Thread Safe Library":
+ For further information, see:
+
+ https://portal.hdfgroup.org/display/knowledge/Questions+about+thread-safety+and+concurrent+access
- http://www.hdfgroup.org/HDF5/doc/TechNotes/ThreadSafeLibrary.html
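A sketch of a thread-safe configuration; the pthread paths below are hypothetical and only needed on platforms where pthreads is not detected automatically:

```shell
# Basic thread-safe build.
$ ./configure --enable-threadsafe

# If pthreads must be located explicitly (paths are examples only):
$ ./configure --enable-threadsafe \
              --with-pthread=/usr/local/include,/usr/local/lib
```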
-4.3.12. Backward compatibility
- The 1.8 version of the HDF5 Library can be configured to operate
- identically to the v1.6 library with the
+4.3.11. Backward compatibility
+ The 1.10 version of the HDF5 library can be configured to operate
+ identically to the v1.8 library with the
+ --with-default-api-version=v18
+ configure flag, or identically to the v1.6 library with the
--with-default-api-version=v16
configure flag. This allows existing code to be compiled with the
- v1.8 library without requiring immediate changes to the application
- source code. For addtional configuration options and other details,
- see "API Compatibility Macros in HDF5":
+ v1.10 library without requiring immediate changes to the application
+ source code. For additional configuration options and other details,
+ see "API Compatibility Macros":
- http://www.hdfgroup.org/HDF5/doc/RM/APICompatMacros.html
+ https://portal.hdfgroup.org/display/HDF5/API+Compatibility+Macros
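A sketch of both mechanisms: configuring the library-wide default, and selecting an API level per application with a compatibility macro. The `H5_USE_18_API` define and the `myprog.c` source file are assumptions taken from the "API Compatibility Macros" page referenced above:

```shell
# Build the library so the default API matches 1.8.x.
$ ./configure --with-default-api-version=v18

# Alternatively, select the 1.8.x API for one application at
# compile time (macro and source file are illustrative):
$ h5cc -DH5_USE_18_API myprog.c -o myprog
```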
4.4. Building
The library, confidence tests, and programs can be built by
- saying just:
+ specifying:
$ make
@@ -495,7 +431,7 @@ CONTENTS
4.5. Testing
HDF5 comes with various test suites, all of which can be run by
- saying
+ specifying:
$ make check
@@ -524,13 +460,13 @@ CONTENTS
longer test, set HDF5TestExpress to 0. 1 is the default.
4.6. Installing HDF5
- The HDF5 Library, include files, and support programs can be
- installed in a (semi-)public place by saying `make install'. The
- files are installed under the directory specified with
- `--prefix=DIR' (default is 'hdf5') in directories named `lib',
- `include', and `bin'. The directories, if not existing, will be
- created automatically, provided the mkdir command supports the -p
- option.
+ The HDF5 library, include files, and support programs can be
+ installed by specifying `make install'. The files are installed under the
+ directory specified with `--prefix=DIR' (or if not specified, in 'hdf5'
+ in the top directory of the HDF5 source code). They will be
+ placed in directories named `lib', `include', and `bin'. The directories,
+ if not existing, will be created automatically, provided the mkdir command
+ supports the -p option.
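The full configure-build-test-install sequence can be sketched as follows; the prefix is an example, and the `check-install` target for testing the installed library is an assumption:

```shell
$ ./configure --prefix=/usr/local/hdf5   # prefix is only an example
$ make
$ make check          # optional: run the confidence tests
$ make install
$ make check-install  # assumed target: test the installed library
```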
If `make install' fails because the install command at your site
somehow fails, you may use the install-sh that comes with the
@@ -587,134 +523,15 @@ CONTENTS
5. Using the Library
- Please see the "HDF5 User's Guide" and the "HDF5 Reference Manual":
-
- http://www.hdfgroup.org/HDF5/doc/
-
- Most programs will include <hdf5.h> and link with -lhdf5.
- Additional libraries may also be necessary depending on whether
- support for compression, etc., was compiled into the HDF5 Library.
-
- A summary of the HDF5 installation can be found in the
- libhdf5.settings file in the same directory as the static and/or
- shared HDF5 Libraries.
-
-
-6. Support
- Support is described in the README file.
-
-
-*****************************************************************************
- APPENDIX
-*****************************************************************************
-
-A. Warnings about compilers
- Output from the following compilers should be extremely suspected
- when used to compile the HDF5 Library, especially if optimizations are
- enabled. In all cases, HDF5 attempts to work around the compiler bugs.
-
-A.1. GNU (Intel platforms)
- Versions before 2.8.1 have serious problems allocating registers
- when functions contain operations on `long long' datatypes.
-
-A.2. COMPAQ/DEC
- The V5.2-038 compiler (and possibly others) occasionally
- generates incorrect code for memcpy() calls when optimizations
- are enabled, resulting in unaligned access faults. HDF5 works
- around the problem by casting the second argument to `char *'.
- The Fortran module (5.4.1a) fails in compiling some Fortran
- programs. Use 5.5.0 or higher.
-
-A.3. SGI (Irix64 6.2)
- The Mongoose 7.00 compiler has serious optimization bugs and
- should be upgraded to MIPSpro 7.2.1.2m. Patches are available
- from SGI.
-
-A.4. Windows/NT
- The Microsoft Win32 5.0 compiler is unable to cast unsigned long
- long values to doubles. HDF5 works around this bug by first
- casting to signed long long and then to double.
-
- A link warning: defaultlib "LIBC" conflicts with use of other libs
- appears for debug version of VC++ 6.0. This warning will not affect
- building and testing HDF5 Libraries.
-
-
-B. Large (>2GB) versus small (<2GB) file capability
- In order to read or write files that could potentially be larger
- than 2GB, it is necessary to use the non-ANSI `long long' data
- type on some platforms. However, some compilers (e.g., GNU gcc
- versions before 2.8.1 on Intel platforms) are unable to produce
- correct machine code for this datatype.
-
-
-C. Building and testing with other compilers
-C.1. Building and testing with Intel compilers
- When Intel compilers are used (icc or ecc), you will need to modify
- the generated "libtool" program after configuration is finished.
- On or around line 104 of the libtool file, there are lines which
- look like:
-
- # How to pass a linker flag through the compiler.
- wl=""
-
- Change these lines to this:
-
- # How to pass a linker flag through the compiler.
- wl="-Wl,"
-
- UPDATE: This is now done automatically by the configure script.
- However, if you still experience a problem, you may want to check this
- line in the libtool file and make sure that it has the correct value.
-
- * To build the Fortran library using Intel compiler on Linux 2.4,
- one has to perform the following steps:
- x Use the -fpp -DDEC$=DEC_ -DMS$=MS_ compiler flags to disable
- DEC and MS compiler directives in source files in the fortran/src,
- fortran/test, and fortran/examples directories.
- E.g., setenv F9X 'ifc -fpp -DDEC$=DEC_ -DMS$=MS_'
- Do not use double quotes since $ is interpreted in them.
-
- x If Version 6.0 of Fortran compiler is used, the build fails in
- the fortran/test directory and then in the fortran/examples
- directory. To proceed, edit the work.pcl files in those
- directories to contain two lines:
-
- work.pc
- ../src/work.pc
-
- x Do the same in the fortran/examples directory.
-
- x A problem with work.pc files was resolved for the newest version
- of the compiler (7.0).
-
- * To build the Fortran library on IA32, follow the steps described
- above, except that the DEC and MS compiler directives should be
- removed manually or use a patch from HDF FTP server:
-
- ftp://ftp.hdfgroup.org/HDF5/current/
-
-
-C.2. Building and testing with PGI compilers
- When PGI C and C++ compilers are used (pgcc or pgCC), you will need to
- modify the generated "libtool" program after configuration is finished.
- On or around line 104 of the libtool file, there are lines which
- look like this:
-
- # How to pass a linker flag through the compiler.
- wl=""
+
+ For information on using HDF5, see the documentation, tutorials, and examples
+ found here:
- Change these lines to this:
+ https://portal.hdfgroup.org/display/HDF5/HDF5
- # How to pass a linker flag through the compiler.
- wl="-Wl,"
+ A summary of the features included in the built HDF5 installation can be found
+ in the libhdf5.settings file in the same directory as the static and/or
+ shared HDF5 libraries.
- UPDATE: This is now done automatically by the configure script. However,
- if you still experience a problem, you may want to check this line in
- the libtool file and make sure that it has the correct value.
- To build the HDF5 C++ Library with pgCC (version 4.0 and later), set
- the environment variable CXX to "pgCC -tlocal"
- setenv CXX "pgCC -tlocal"
- before running the configure script.
diff --git a/release_docs/INSTALL_CMake.txt b/release_docs/INSTALL_CMake.txt
index 4c4460e..a2d209a 100644
--- a/release_docs/INSTALL_CMake.txt
+++ b/release_docs/INSTALL_CMake.txt
@@ -24,13 +24,13 @@ Obtaining HDF5 source code
1. Create a directory for your development; for example, "myhdfstuff".
2. Obtain compressed (*.tar or *.zip) HDF5 source from
- http://www.hdfgroup.org/ftp/HDF5/current/src/
+ https://portal.hdfgroup.org/display/support/Building+HDF5+with+CMake
and put it in "myhdfstuff".
Uncompress the file. There should be a hdf5-1.10."X" folder.
CMake version
1. We suggest you obtain the latest CMake from the Kitware web site.
- The HDF5 1.10."X" product requires a minimum CMake version 3.2.2,
+ The HDF5 1.10."X" product requires a minimum CMake version 3.10,
where "X" is the current HDF5 release version.
Note:
@@ -48,12 +48,12 @@ the config/cmake/cacheinit.cmake file.
HDF Group recommends using the ctest script mode to build HDF5.
The following files referenced below are available at the HDF web site:
- http://www.hdfgroup.org/HDF5/release/cmakebuild.html
+ https://portal.hdfgroup.org/display/support/Building+HDF5+with+CMake
Single compressed file with all the files needed, including source:
- hdf5-1.10.X-CMake.zip or hdf5-1.10.X-CMake.tar.gz
+ CMake-hdf5-1.10.X.zip or CMake-hdf5-1.10.X.tar.gz
-Individual files
+Individual files included in the above-mentioned compressed files
-----------------------------------------------
CMake build script:
CTestScript.cmake
@@ -62,37 +62,44 @@ External compression szip and zlib libraries:
SZip.tar.gz
ZLib.tar.gz
-Platform configuration files:
- HDF518config.cmake
+Examples Source package:
+ HDF5Examples-1.10.x-Source.tar.gz
+
+Configuration files:
+ HDF5config.cmake
+ HDF5options.cmake
+
+Build scripts for Windows or Linux
-----------------------------------------------
To build HDF5 with the SZIP and ZLIB external libraries you will need to:
1. Change to the development directory "myhdfstuff".
- 2. Download the SZip.tar.gz and ZLib.tar.gz to "myhdfstuff".
- Do not uncompress the files.
+ 2. Download the CMake-hdf5-1.10.X.zip(.tar.gz) file to "myhdfstuff".
+ Uncompress the file.
- 3. Download the CTestScript.cmake file to "myhdfstuff".
+ 3. Change to the source directory "hdf5-1.10.x".
CTestScript.cmake file should not be modified.
- 4. Download the platform configuration file, HDF518config.cmake,
- to "myhdfstuff". Do not modify the file unless you want to change
- default build environment. (See http://www.hdfgroup.org/HDF5/release/chgcmkbuild.html)
+ 4. Edit the platform configuration file, HDF5options.cmake, if you want to change
+ the default build environment.
+ (See https://portal.hdfgroup.org/display/support/How+to+Change+HDF5+CMake+Build+Options)
5. From the "myhdfstuff" directory execute the CTest Script with the
following options:
- On 32-bit Windows with Visual Studio 2012, execute:
- ctest -S HDF5config.cmake,BUILD_GENERATOR=VS2012 -C Release -VV -O hdf5.log
- On 64-bit Windows with Visual Studio 2012, execute:
- ctest -S HDF5config.cmake,BUILD_GENERATOR=VS201264 -C Release -VV -O hdf5.log
+ On 32-bit Windows with Visual Studio 2015, execute:
+ ctest -S HDF5config.cmake,BUILD_GENERATOR=VS2015 -C Release -VV -O hdf5.log
+ On 64-bit Windows with Visual Studio 2015, execute:
+ ctest -S HDF5config.cmake,BUILD_GENERATOR=VS201564 -C Release -VV -O hdf5.log
On 32-bit Windows with Visual Studio 2013, execute:
ctest -S HDF5config.cmake,BUILD_GENERATOR=VS2013 -C Release -VV -O hdf5.log
On 64-bit Windows with Visual Studio 2013, execute:
ctest -S HDF5config.cmake,BUILD_GENERATOR=VS201364 -C Release -VV -O hdf5.log
On Linux and Mac, execute:
ctest -S HDF5config.cmake,BUILD_GENERATOR=Unix -C Release -VV -O hdf5.log
+ The supplied build scripts are versions of the above.
The command above will configure, build, test, and create an install
package in the myhdfstuff folder. It will have the format:
@@ -103,19 +110,20 @@ To build HDF5 with the SZIP and ZLIB external libraries you will need to:
installer on your system, you will also see a similar file that ends
in either .exe (NSIS) or .msi (WiX).
- The -S option uses the script version of ctest.
+ Notes on the command-line options:
+ The -S option uses the script version of ctest.
- The value for the -C option (as shown above, "-C Release") must
- match the setting for CTEST_CONFIGURATION_TYPE in the platform
- configuration file.
+ The value for the -C option (as shown above, "-C Release") must
+ match the setting for CTEST_CONFIGURATION_TYPE in the platform
+ configuration file.
- The -VV option is for most verbose; use -V for less verbose.
+ The -VV option is for most verbose; use -V for less verbose.
- The "-O hdf5.log" option saves the output to a log file hdf5.log.
+ The "-O hdf5.log" option saves the output to a log file hdf5.log.
6. To install, "X" is the current release version
- On Windows, execute:
+ On Windows (with WiX installed), execute:
HDF5-1.10."X"-win32.msi or HDF5-1.10."X"-win64.msi
By default this program will install the hdf5 library into the
"C:\Program Files" directory and will create the following
@@ -165,8 +173,9 @@ To build HDF5 with the SZIP and ZLIB external libraries you will need to:
III. Quick Step Building HDF5 C Static Libraries and Tools with CMake
========================================================================
Notes: This short set of instructions is written for users who want to
- quickly build the just the HDF5 C static library and tools from
+ quickly build just the HDF5 C static library and tools from
the HDF5 source code package using the CMake command line tools.
+ Avoid the use of drive letters in paths on Windows.
Go through these steps:
@@ -201,7 +210,7 @@ Notes: This short set of instructions is written for users who want to
cpack -C Release CPackConfig.cmake
9. To install
- On Windows, execute:
+ On Windows (with WiX installed), execute:
HDF5-1.10."X"-win32.msi or HDF5-1.10."X"-win64.msi
By default this program will install the hdf5 library into the
"C:\Program Files" directory and will create the following
@@ -249,7 +258,7 @@ IV. Further considerations
========================================================================
1. We suggest you obtain the latest CMake for windows from the Kitware
- web site. The HDF5 1.10."X" product requires a minimum CMake version 3.2.2.
+ web site. The HDF5 1.10."X" product requires a minimum CMake version 3.10.
2. If you plan to use Zlib or Szip:
A. Download the binary packages and install them in a central location.
@@ -261,7 +270,7 @@ IV. Further considerations
-DSZIP_INCLUDE_DIR:PATH=some_location/include
where "some_location" is the full path to the extlibs folder.
- B. Use source packages from an SVN server by adding the following CMake
+ B. Use source packages from a GIT server by adding the following CMake
options:
HDF5_ALLOW_EXTERNAL_SUPPORT:STRING="GIT"
ZLIB_GIT_URL:STRING="http://some_location/zlib"
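Those options can be passed on the cmake command line. A sketch, assuming an out-of-source build directory; the generator, repository URLs, and the `SZIP_GIT_URL` option name are placeholders or assumptions:

```shell
# Configure HDF5 to fetch zlib and szip sources from git
# (URLs and source path are placeholders).
$ cmake -G "Unix Makefiles" \
        -DHDF5_ALLOW_EXTERNAL_SUPPORT:STRING="GIT" \
        -DZLIB_GIT_URL:STRING="http://some_location/zlib" \
        -DSZIP_GIT_URL:STRING="http://some_location/szip" \
        ../hdf5-1.10.x
```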
@@ -314,9 +323,21 @@ Notes: CMake and HDF5
how CMake support can be improved on any system. Visit the
KitWare site for more information about CMake.
- 3. Build and test results can be submitted to our CDash server,
- please read the HDF and CDash document at:
- www.hdfgroup.org/CDash/HowToSubmit.
+ 3. Build and test results can be submitted to our CDash server:
+ The CDash server for community submissions of HDF5 is at
+ https://cdash.hdfgroup.org.
+
+ Submitters are requested to register their name and contact info and
+ maintain their test sites. After your first submission, login and go
+ to your "My CDash" link and claim your site.
+
+ We ask that all submissions include the configuration information and
+ contact information in the CTest Notes Files upload step. See the
+ current reports on CDash for examples.
+
+ Please follow the convention that "NIGHTLY" submissions maintain the same
+ configuration every time. "EXPERIMENTAL" submissions can be expected to
+ be different for each submission.
4. See the appendix at the bottom of this file for examples of using
a ctest script for building and testing. Using a ctest script is
@@ -396,34 +417,16 @@ These five steps are described in detail below.
# EXTERNAL cache entries
########################
set (CMAKE_INSTALL_FRAMEWORK_PREFIX "Library/Frameworks" CACHE STRING "Frameworks installation directory" FORCE)
- set (HDF5_GENERATE_HEADERS ON CACHE BOOL "Rebuild Generated Files" FORCE)
set (HDF_PACKAGE_EXT "" CACHE STRING "Name of HDF package extension" FORCE)
set (HDF5_BUILD_FORTRAN ON CACHE BOOL "Build FORTRAN support" FORCE)
- set (HDF5_BUILD_GENERATORS OFF CACHE BOOL "Build Test Generators" FORCE)
set (HDF5_ENABLE_Z_LIB_SUPPORT ON CACHE BOOL "Enable Zlib Filters" FORCE)
set (HDF5_ENABLE_SZIP_SUPPORT ON CACHE BOOL "Use SZip Filter" FORCE)
set (HDF5_ENABLE_SZIP_ENCODING ON CACHE BOOL "Use SZip Encoding" FORCE)
- set (HDF5_ENABLE_HSIZET ON CACHE BOOL "Enable datasets larger than memory" FORCE)
- set (ALLOW_UNSUPPORTED OFF CACHE BOOL "Enable unsupported combinations of configuration options" FORCE)
- set (HDF5_ENABLE_DEPRECATED_SYMBOLS ON CACHE BOOL "Enable deprecated public API symbols" FORCE)
- set (HDF5_ENABLE_DIRECT_VFD OFF CACHE BOOL "Build the Direct I/O Virtual File Driver" FORCE)
- set (HDF5_ENABLE_PARALLEL OFF CACHE BOOL "Enable parallel build (requires MPI)" FORCE)
set (MPIEXEC_MAX_NUMPROCS "3" CACHE STRING "Minimum number of processes for HDF parallel tests" FORCE)
- set (HDF5_BUILD_PARALLEL_ALL OFF CACHE BOOL "Build Parallel Programs" FORCE)
- set (HDF5_ENABLE_COVERAGE OFF CACHE BOOL "Enable code coverage for Libraries and Programs" FORCE)
- set (HDF5_ENABLE_USING_MEMCHECKER OFF CACHE BOOL "Indicate that a memory checker is used" FORCE)
- set (HDF5_MEMORY_ALLOC_SANITY_CHECK OFF CACHE BOOL "Indicate that internal memory allocation sanity checks are enabled" FORCE)
- set (HDF5_DISABLE_COMPILER_WARNINGS OFF CACHE BOOL "Disable compiler warnings" FORCE)
set (HDF5_ENABLE_ALL_WARNINGS ON CACHE BOOL "Enable all warnings" FORCE)
- set (HDF5_USE_FOLDERS ON CACHE BOOL "Enable folder grouping of projects in IDEs." FORCE)
- set (HDF5_USE_16_API_DEFAULT OFF CACHE BOOL "Use the HDF5 1.6.x API by default" FORCE)
- set (HDF5_USE_18_API_DEFAULT OFF CACHE BOOL "Use the HDF5 1.8.x API by default" FORCE)
- set (HDF5_ENABLE_THREADSAFE OFF CACHE BOOL "(WINDOWS)Enable Threadsafety" FORCE)
set (HDF_TEST_EXPRESS "2" CACHE STRING "Control testing framework (0-3)" FORCE)
- set (HDF5_PACKAGE_EXTLIBS OFF CACHE BOOL "(WINDOWS)CPACK - include external libraries" FORCE)
- set (HDF5_NO_PACKAGES OFF CACHE BOOL "CPACK - Disable packaging" FORCE)
- set (HDF5_ALLOW_EXTERNAL_SUPPORT "NO" CACHE STRING "Allow External Library Building (NO GIT SVN TGZ)" FORCE)
- set_property (CACHE HDF5_ALLOW_EXTERNAL_SUPPORT PROPERTY STRINGS NO GIT SVN TGZ)
+ set (HDF5_ALLOW_EXTERNAL_SUPPORT "NO" CACHE STRING "Allow External Library Building (NO GIT TGZ)" FORCE)
+ set_property (CACHE HDF5_ALLOW_EXTERNAL_SUPPORT PROPERTY STRINGS NO GIT TGZ)
set (ZLIB_TGZ_NAME "ZLib.tar.gz" CACHE STRING "Use ZLib from compressed file" FORCE)
set (SZIP_TGZ_NAME "SZip.tar.gz" CACHE STRING "Use SZip from compressed file" FORCE)
set (ZLIB_PACKAGE_NAME "zlib" CACHE STRING "Name of ZLIB package" FORCE)
@@ -433,7 +436,7 @@ These five steps are described in detail below.
2.1 Visual CMake users, click the Configure button. If this is the first time you are
running cmake-gui in this directory, you will be prompted for the
- generator you wish to use (for example on Windows, Visual Studio 11).
+ generator you wish to use (for example on Windows, Visual Studio 12).
CMake will read in the CMakeLists.txt files from the source directory and
display options for the HDF5 project. After the first configure you
can adjust the cache settings and/or specify the locations of other programs.
@@ -580,7 +583,7 @@ HDF5_BUILD_TOOLS "Build HDF5 Tools" ON
---------------- HDF5 Advanced Options ---------------------
ALLOW_UNSUPPORTED "Allow unsupported combinations of configure options" OFF
HDF5_DISABLE_COMPILER_WARNINGS "Disable compiler warnings" OFF
-HDF5_ENABLE_INSTRUMENT "Instrument The library" OFF
+HDF5_ENABLE_ALL_WARNINGS "Enable all warnings" OFF
HDF5_ENABLE_CODESTACK "Enable the function stack tracing (for developer debugging)." OFF
HDF5_ENABLE_COVERAGE "Enable code coverage for Libraries and Programs" OFF
HDF5_ENABLE_DEBUG_APIS "Turn on extra debug output in all packages" OFF
@@ -588,32 +591,33 @@ HDF5_ENABLE_DEPRECATED_SYMBOLS "Enable deprecated public API symbols"
HDF5_ENABLE_DIRECT_VFD "Build the Direct I/O Virtual File Driver" OFF
HDF5_ENABLE_EMBEDDED_LIBINFO "embed library info into executables" ON
HDF5_ENABLE_HSIZET "Enable datasets larger than memory" ON
-HDF5_ENABLE_LARGE_FILE "Enable support for large (64-bit) files on Linux." ON
HDF5_ENABLE_PARALLEL "Enable parallel build (requires MPI)" OFF
HDF5_ENABLE_TRACE "Enable API tracing capability" OFF
HDF5_ENABLE_USING_MEMCHECKER "Indicate that a memory checker is used" OFF
-HDF5_GENERATE_HEADERS "Rebuild Generated Files" OFF
+HDF5_GENERATE_HEADERS "Rebuild Generated Files" ON
+HDF5_BUILD_GENERATORS "Build Test Generators" OFF
HDF5_JAVA_PACK_JRE "Package a JRE installer directory" OFF
HDF5_MEMORY_ALLOC_SANITY_CHECK "Indicate that internal memory allocation sanity checks are enabled" OFF
HDF5_METADATA_TRACE_FILE "Enable metadata trace file collection" OFF
HDF5_NO_PACKAGES "Do not include CPack Packaging" OFF
HDF5_PACK_EXAMPLES "Package the HDF5 Library Examples Compressed File" OFF
HDF5_PACK_MACOSX_FRAMEWORK "Package the HDF5 Library in a Frameworks" OFF
+HDF5_BUILD_FRAMEWORKS "TRUE to build as frameworks libraries,
+ FALSE to build according to BUILD_SHARED_LIBS" FALSE
HDF5_PACKAGE_EXTLIBS "CPACK - include external libraries" OFF
HDF5_STRICT_FORMAT_CHECKS "Whether to perform strict file format checks" OFF
HDF_TEST_EXPRESS "Control testing framework (0-3)" "0"
HDF5_TEST_VFD "Execute tests with different VFDs" OFF
HDF5_USE_16_API_DEFAULT "Use the HDF5 1.6.x API by default" OFF
HDF5_USE_18_API_DEFAULT "Use the HDF5 1.8.x API by default" OFF
-HDF5_USE_FOLDERS "Enable folder grouping of projects in IDEs." OFF
+HDF5_USE_FOLDERS "Enable folder grouping of projects in IDEs." ON
HDF5_WANT_DATA_ACCURACY "IF data accuracy is guaranteed during data conversions" ON
HDF5_WANT_DCONV_EXCEPTION "exception handling functions is checked during data conversions" ON
HDF5_ENABLE_THREADSAFE "Enable Threadsafety" OFF
-SKIP_HDF5_FORTRAN_SHARED "Do not build the fortran shared libraries" OFF
if (APPLE)
HDF5_BUILD_WITH_INSTALL_NAME "Build with library install_name set to the installation path" OFF
if (CMAKE_BUILD_TYPE MATCHES Debug)
- HDF5_ENABLE_TRACE "Enable API tracing capability" ON
+ HDF5_ENABLE_INSTRUMENT "Instrument The library" OFF
if (HDF5_TEST_VFD)
HDF5_TEST_FHEAP_VFD "Execute fheap test with different VFDs" ON
@@ -659,34 +663,30 @@ adding an option (${CTEST_SCRIPT_ARG}) to the platform configuration script.
#############################################################################################
### ${CTEST_SCRIPT_ARG} is of the form OPTION=VALUE ###
-### BUILD_GENERATOR required [Unix, VS2015, VS201564, VS2013, VS201364, VS2012, VS201264] ###
-### ctest -S HDF5config.cmake,BUILD_GENERATOR=VS201264 -C Release -VV -O hdf5.log ###
+### BUILD_GENERATOR required [Unix, VS2017, VS201764, VS2015, VS201564, VS2013, VS201364] ###
+### ctest -S HDF5config.cmake,BUILD_GENERATOR=VS201764 -C Release -VV -O hdf5.log ###
#############################################################################################
-cmake_minimum_required (VERSION 3.2.2 FATAL_ERROR)
+cmake_minimum_required (VERSION 3.10)
############################################################################
# Usage:
# ctest -S HDF5config.cmake,OPTION=VALUE -C Release -VV -O test.log
# where valid options for OPTION are:
# BUILD_GENERATOR - The cmake build generator:
-# Unix * Unix Makefiles
+# Unix * Unix Makefiles
+# VS2017 * Visual Studio 15 2017
+# VS201764 * Visual Studio 15 2017 Win64
# VS2015 * Visual Studio 14 2015
-# VS201564 * Visual Studio 14 2015 Win64
+# VS201564 * Visual Studio 14 2015 Win64
# VS2013 * Visual Studio 12 2013
-# VS201364 * Visual Studio 12 2013 Win64
-# VS2012 * Visual Studio 11 2012
-# VS201264 * Visual Studio 11 2012 Win64
+# VS201364 * Visual Studio 12 2013 Win64
#
# INSTALLDIR - root folder where hdf5 is installed
# CTEST_CONFIGURATION_TYPE - Release, Debug, etc
# CTEST_SOURCE_NAME - source folder
-# STATIC_ONLY - Build/use static libraries
-# FORTRAN_LIBRARIES - Build/use fortran libraries
-# JAVA_LIBRARIES - Build/use java libraries
-# NO_MAC_FORTRAN - Yes to be SHARED on a Mac
##############################################################################
-set (CTEST_SOURCE_VERSION 1.10.1)
+set (CTEST_SOURCE_VERSION "1.11.0")
set (CTEST_SOURCE_VERSEXT "")
##############################################################################
@@ -695,10 +695,6 @@ set (CTEST_SOURCE_VERSEXT "")
#INSTALLDIR - HDF5-1.10.0 root folder
#CTEST_CONFIGURATION_TYPE - Release, Debug, RelWithDebInfo
#CTEST_SOURCE_NAME - name of source folder; HDF5-1.10.0
-#STATIC_ONLY - Default is YES
-#FORTRAN_LIBRARIES - Default is NO
-#JAVA_LIBRARIES - Default is NO
-#NO_MAC_FORTRAN - set to TRUE to allow shared libs on a Mac
if (DEFINED CTEST_SCRIPT_ARG)
# transform ctest script arguments of the form
# script.ctest,var1=value1,var2=value2
@@ -713,25 +709,7 @@ endif ()
# build generator must be defined
if (NOT DEFINED BUILD_GENERATOR)
- message (FATAL_ERROR "BUILD_GENERATOR must be defined - Unix, VS2015, VS201564, VS2013, VS201364, VS2012, or VS201264")
-else ()
- if (${BUILD_GENERATOR} STREQUAL "Unix")
- set (CTEST_CMAKE_GENERATOR "Unix Makefiles")
- elseif (${BUILD_GENERATOR} STREQUAL "VS2015")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 14 2015")
- elseif (${BUILD_GENERATOR} STREQUAL "VS201564")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 14 2015 Win64")
- elseif (${BUILD_GENERATOR} STREQUAL "VS2013")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 12 2013")
- elseif (${BUILD_GENERATOR} STREQUAL "VS201364")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 12 2013 Win64")
- elseif (${BUILD_GENERATOR} STREQUAL "VS2012")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 11 2012")
- elseif (${BUILD_GENERATOR} STREQUAL "VS201264")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 11 2012 Win64")
- else ()
- message (FATAL_ERROR "Invalid BUILD_GENERATOR must be - Unix, VS2015, VS201564, VS2013, VS201364, VS2012, or VS201264")
- endif ()
+ message (FATAL_ERROR "BUILD_GENERATOR must be defined - Unix, VS2017, VS201764, VS2015, VS201564, VS2013, or VS201364")
endif ()
###################################################################
@@ -783,30 +761,48 @@ endif ()
if (WIN32)
set (SITE_OS_NAME "Windows")
set (SITE_OS_VERSION "WIN7")
- if (${BUILD_GENERATOR} STREQUAL "VS201564")
+ if (${BUILD_GENERATOR} STREQUAL "VS201764")
+ set (CTEST_CMAKE_GENERATOR "Visual Studio 15 2017 Win64")
+ set (SITE_OS_BITS "64")
+ set (SITE_COMPILER_NAME "vs2017")
+ set (SITE_COMPILER_VERSION "15")
+ elseif (${BUILD_GENERATOR} STREQUAL "VS2017")
+ set (CTEST_CMAKE_GENERATOR "Visual Studio 15 2017")
+ set (SITE_OS_BITS "32")
+ set (SITE_COMPILER_NAME "vs2017")
+ set (SITE_COMPILER_VERSION "15")
+ elseif (${BUILD_GENERATOR} STREQUAL "VS201564")
+ set (CTEST_CMAKE_GENERATOR "Visual Studio 14 2015 Win64")
set (SITE_OS_BITS "64")
set (SITE_COMPILER_NAME "vs2015")
set (SITE_COMPILER_VERSION "14")
elseif (${BUILD_GENERATOR} STREQUAL "VS2015")
+ set (CTEST_CMAKE_GENERATOR "Visual Studio 14 2015")
set (SITE_OS_BITS "32")
set (SITE_COMPILER_NAME "vs2015")
set (SITE_COMPILER_VERSION "14")
elseif (${BUILD_GENERATOR} STREQUAL "VS201364")
+ set (CTEST_CMAKE_GENERATOR "Visual Studio 12 2013 Win64")
set (SITE_OS_BITS "64")
set (SITE_COMPILER_NAME "vs2013")
set (SITE_COMPILER_VERSION "12")
elseif (${BUILD_GENERATOR} STREQUAL "VS2013")
+ set (CTEST_CMAKE_GENERATOR "Visual Studio 12 2013")
set (SITE_OS_BITS "32")
set (SITE_COMPILER_NAME "vs2013")
set (SITE_COMPILER_VERSION "12")
elseif (${BUILD_GENERATOR} STREQUAL "VS201264")
+ set (CTEST_CMAKE_GENERATOR "Visual Studio 11 2012 Win64")
set (SITE_OS_BITS "64")
set (SITE_COMPILER_NAME "vs2012")
set (SITE_COMPILER_VERSION "11")
elseif (${BUILD_GENERATOR} STREQUAL "VS2012")
+ set (CTEST_CMAKE_GENERATOR "Visual Studio 11 2012")
set (SITE_OS_BITS "32")
set (SITE_COMPILER_NAME "vs2012")
set (SITE_COMPILER_VERSION "11")
+ else ()
+ message (FATAL_ERROR "Invalid BUILD_GENERATOR - must be Unix, VS2017, VS201764, VS2015, VS201564, VS2013, or VS201364")
endif ()
## Set the following to unique id your computer ##
set (CTEST_SITE "WIN7${BUILD_GENERATOR}.XXXX")
diff --git a/release_docs/INSTALL_Cygwin.txt b/release_docs/INSTALL_Cygwin.txt
index ddffcf1..f5f1d6a 100644..100755
--- a/release_docs/INSTALL_Cygwin.txt
+++ b/release_docs/INSTALL_Cygwin.txt
@@ -66,12 +66,12 @@ Preconditions:
2.2.2 Szip
The HDF5 library has a predefined compression filter that uses
the extended-Rice lossless compression algorithm for chunked
- datatsets. For more information about Szip compression and
- license terms see
- http://hdfgroup.org/HDF5/doc_resource/SZIP/index.html.
-
- The latest supported public release of SZIP is available from
- ftp://ftp.hdfgroup.org/lib-external/szip/2.1.
+ datasets. For information on Szip compression, license terms,
+ and obtaining the Szip source code, see:
+
+ https://portal.hdfgroup.org/display/HDF5/Szip+Compression+in+HDF+Products
+
+
2.3 Additional Utilities
@@ -93,7 +93,7 @@ Build, Test and Install HDF5 on Cygwin
1. Get HDF5 source code package
Users can download HDF5 source code package from HDF website
- (http://hdfgroup.org).
+ (https://www.hdfgroup.org/downloads/hdf5/).
2. Unpacking the distribution
@@ -266,4 +266,8 @@ Build, Test and Install HDF5 on Cygwin
-----------------------------------------------------------------------
-Need Further assistance, email help@hdfgroup.org
+For further assistance, contact:
+
+ HDF Forum: https://forum.hdfgroup.org/
+ HDF Helpdesk: https://portal.hdfgroup.org/display/support/The+HDF+Help+Desk
+
diff --git a/release_docs/INSTALL_parallel b/release_docs/INSTALL_parallel
index 5a8b603..f32fffc 100644..100755
--- a/release_docs/INSTALL_parallel
+++ b/release_docs/INSTALL_parallel
@@ -40,9 +40,11 @@ and the parallel file system.
1.2. Further Help
-----------------
-If you still have difficulties installing PHDF5 in your system, please send
-mail to
- help@hdfgroup.org
+
+For help with installation, questions can be posted to the HDF Forum or sent to the HDF Helpdesk:
+
+ HDF Forum: https://forum.hdfgroup.org/
+ HDF Helpdesk: https://portal.hdfgroup.org/display/support/The+HDF+Help+Desk
In your mail, please include the output of "uname -a". If you have run the
"configure" command, attach the output of the command and the content of
@@ -87,12 +89,8 @@ The following steps are for building HDF5 for the Hopper compute
nodes. They would probably work for other Cray systems but have
not been verified.
-Obtain a copy from the HDF ftp server:
-http://www.hdfgroup.org/ftp/HDF5/current/src/
-(link might change, so always double check the HDF group website).
-
-$ wget http://www.hdfgroup.org/ftp/HDF5/current/src/hdf5-x.x.x.tar.gz
-unpack the tarball
+Obtain the HDF5 source code:
+ https://portal.hdfgroup.org/display/support/Downloads
The entire build process should be done on a MOM node in an interactive allocation and on a file system accessible by all compute nodes.
Request an interactive allocation with qsub:
diff --git a/release_docs/RELEASE.txt b/release_docs/RELEASE.txt
index c1adf8d..0c3873b 100644..100755
--- a/release_docs/RELEASE.txt
+++ b/release_docs/RELEASE.txt
@@ -1,576 +1,1090 @@
-HDF5 version 1.10.1 released on 2017-04-27
+HDF5 version 1.10.2 released on 2018-03-29
================================================================================
-INTRODUCTION
-
-This document describes the differences between HDF5-1.10.0-patch1 and
-HDF5 1.10.1, and contains information on the platforms tested and known
-problems in HDF5-1.10.1. For more details check the HISTORY*.txt files
-in the HDF5 source.
-
-Links to HDF5 1.10.1 source code, documentation, and additional materials can
-be found on The HDF5 web page at:
- https://support.hdfgroup.org/HDF5/
+INTRODUCTION
-The HDF5 1.10.1 release can be obtained from:
+This document describes the differences between this release and the previous
+HDF5 release. It contains information on the platforms tested and known
+problems in this release. For more details check the HISTORY*.txt files in the
+HDF5 source.
- https://support.hdfgroup.org/HDF5/release/obtain5.html
+Note that documentation in the links below will be updated at the time of each
+final release.
-User documentation for the snapshot can be accessed directly at this location:
+Links to HDF5 documentation can be found on The HDF5 web page:
- https://support.hdfgroup.org/HDF5/doc/
+ https://portal.hdfgroup.org/display/HDF5/HDF5
-New features in the HDF5-1.10.x release series, including brief general
-descriptions of some new and modified APIs, are described in the "New Features
-in HDF5 Release 1.10" document:
+The official HDF5 releases can be obtained from:
- https://support.hdfgroup.org/HDF5/docNewFeatures/index.html
+ https://www.hdfgroup.org/downloads/hdf5/
-All new and modified APIs are listed in detail in the "HDF5 Software Changes
-from Release to Release" document, in the section "Release 10.1 (current
-release) versus Release 1.10.0
+Changes from Release to Release and New Features in the HDF5-1.10.x release series
+can be found at:
- https://support.hdfgroup.org/HDF5/doc/ADGuide/Changes.html
+ https://portal.hdfgroup.org/display/HDF5/HDF5+Application+Developer%27s+Guide
If you have any questions or comments, please send them to the HDF Help Desk:
- help@hdfgroup.org
+ help@hdfgroup.org
CONTENTS
-- Major New Features Introduced in HDF5 1.10.1
-- Other New Features and Enhancements
-- Support for New Platforms, Languages, and Compilers
-- Bug Fixes since HDF5-1.10.0-patch1
+- New Features
+- Support for new platforms and languages
+- Bug Fixes since HDF5-1.10.1
- Supported Platforms
- Tested Configuration Features Summary
- More Tested Platforms
- Known Problems
-Major New Features Introduced in HDF5 1.10.1
-============================================
+New Features
+============
-For links to the RFCs and documentation in this section please view
-https://support.hdfgroup.org/HDF5/docNewFeatures in a web browser.
+ Configuration and Build Systems:
+ --------------------------------
+ - CMake builds
+ --------------
-________________________________________
-Metadata Cache Image
-________________________________________
+ - Changed minimum CMake required version to 3.10.
- HDF5 metadata is typically small, and scattered throughout the HDF5 file.
- This can affect performance, particularly on large HPC systems. The
- Metadata Cache Image feature can improve performance by writing the
- metadata cache in a single block on file close, and then populating the
- cache with the contents of this block on file open, thus avoiding the many
- small I/O operations that would otherwise be required on file open and
- close. See the RFC for complete details regarding this feature. Also,
- see the Fine Tuning the Metadata Cache documentation.
+ This change removed the need to carry a copy of the FindMPI.cmake module;
+ that copy, along with its subfolder in the config/cmake_ext_mod
+ location, has been deleted.
- At present, metadata cache images may not be generated by parallel
- applications. Parallel applications can read files with metadata cache
- images, but since this is a collective operation, a deadlock is possible
- if one or more processes do not participate.
+ (ADB - 2018/03/09)
-________________________________________
-Metadata Cache Evict on Close
-________________________________________
+ - Added pkg-config file generation
- The HDF5 library's metadata cache is fairly conservative about holding on
- to HDF5 object metadata (object headers, chunk index structures, etc.),
- which can cause the cache size to grow, resulting in memory pressure on
- an application or system. The "evict on close" property will cause all
- metadata for an object to be evicted from the cache as long as metadata
- is not referenced from any other open object. See the Fine Tuning the
- Metadata Cache documentation for information on the APIs.
+ Added pkg-config file generation for the C, C++, HL, and HL C++ libraries.
+ In addition, builds on Linux will create h5cc, h5c++, h5hlcc, and h5hlc++ scripts in the bin
+ directory that use the pkg-config files. The scripts can be used to build HDF5 C and C++
+ applications (i.e., similar to the compiler scripts produced by the Autotools builds).
- At present, evict on close is disabled in parallel builds.
+ (ADB - 2018/03/08, HDFFV-4359)
-________________________________________
-Paged Aggregation
-________________________________________
+ - Refactored use of CMAKE_BUILD_TYPE for a new variable, which understands
+ the type of generator in use.
- The current HDF5 file space allocation accumulates small pieces of metadata
- and raw data in aggregator blocks which are not page aligned and vary
- widely in sizes. The paged aggregation feature was implemented to provide
- efficient paged access of these small pieces of metadata and raw data.
- See the RFC for details. Also, see the File Space Management documentation.
-
-________________________________________
-Page Buffering
-________________________________________
+ Added new configuration macros to use new HDF_BUILD_TYPE variable. This
+ variable is set correctly for the type of generator being used for the build.
- Small and random I/O accesses on parallel file systems result in poor
- performance for applications. Page buffering in conjunction with paged
- aggregation can improve performance by giving an application control of
- minimizing HDF5 I/O requests to a specific granularity and alignment.
- See the RFC for details. Also, see the Page Buffering documentation.
+ (ADB - 2018/01/08, HDFFV-10385, HDFFV-10296)
- At present, page buffering is disabled in parallel builds.
+ - Autotools builds
+ ------------------
+ - Removed version-specific gcc/gfortran flags for version 4.0 (inclusive)
+ and earlier.
+ The config/gnu-flags file, which is sourced as a part of the configure
+ process, adds version-specific flags for use when building HDF5. Most of
+ these flags control warnings and do not affect the final product.
-Other New Features and Enhancements
-===================================
+ Flags for older versions of the compiler were consolidated into the
+ common flags section. Moving these flags simplifies maintenance of
+ the file.
- Library
- -------
- - Added a mechanism for disabling the SWMR file locking scheme.
+ The upshot of this is that building with ancient versions of gcc
+ (<= 4.0) will possibly no longer work without hand-hacking the file
+ to remove the flags not understood by that version of the compiler.
+ Nothing should change when building with gcc >= 4.1.
- The file locking calls used in HDF5 1.10.0 (including patch1)
- will fail when the underlying file system does not support file
- locking or where locks have been disabled. To disable all file
- locking operations, an environment variable named
- HDF5_USE_FILE_LOCKING can be set to the five-character string
- 'FALSE'. This does not fundamentally change HDF5 library
- operation (aside from initial file open/create, SWMR is lock-free),
- but users will have to be more careful about opening files
- to avoid problematic access patterns (i.e.: multiple writers)
- that the file locking was designed to prevent.
+ (DER - 2017/05/31, HDFFV-9937)
- Additionally, the error message that is emitted when file lock
- operations set errno to ENOSYS (typical when file locking has been
- disabled) has been updated to describe the problem and potential
- resolution better.
+ - -fno-omit-frame-pointer was added when building with debugging symbols
+ enabled.
- (DER, 2016/10/26, HDFFV-9918)
+ Debugging symbols can be enabled independently of the overall build
+ mode in both the autotools and CMake. This allows (limited) debugging
+ of optimized code. Since many debuggers rely on the frame pointer,
+ we've disabled this optimization when debugging symbols are requested
+ (e.g.: via building with --enable-symbols).
- - The return type of H5Pget_driver_info() has been changed from void *
- to const void *.
+ (DER - 2017/05/31, HDFFV-10226)
- The pointer returned by this function points to internal library
- memory and should not be freed by the user.
- (DER, 2016/11/04, HDFFV-10017)
+ Library:
+ --------
+ - Added an enumerated value to H5F_libver_t for H5Pset_libver_bounds().
- - The direct I/O VFD has been removed from the list of VFDs that
- support SWMR.
+ Currently, the library defines two values for H5F_libver_t and supports
+ only two pairs of (low, high) combinations as derived from these values.
+ Thus the bounds setting via H5Pset_libver_bounds() is rather restricted.
- This configuration was never officially tested and several SWMR
- tests fail when this VFD is set.
+ Added an enumerated value (H5F_LIBVER_V18) to H5F_libver_t and
+ H5Pset_libver_bounds() now supports five pairs of (low, high) combinations
+ as derived from these values. This addition provides the user more
+ flexibility in setting bounds for object creation.
- (DER, 2016/11/03, HDFFV-10169)
+ (VC - 2018/03/14)
- Configuration:
- --------------
- - The minimum version of CMake required to build HDF5 is now 3.2.2.
+ - Added prefix option to VDS files.
- (ADB, 2017/01/10)
+ Currently, VDS source files must be in the active directory to be
+ found by the virtual file. Adding the option of a prefix, set on the
+ virtual file via a data access property list (DAPL), allows the
+ source files to be located at an absolute or relative path to the
+ virtual file.
+ Private utility functions in the H5D and H5L packages were merged into
+ a single function in the H5F package.
- - An --enable/disable-developer-warnings option has been added to
- configure.
+ New public APIs:
+ herr_t H5Pset_virtual_prefix(hid_t dapl_id, const char* prefix);
+ ssize_t H5Pget_virtual_prefix(hid_t dapl_id, char* prefix /*out*/, size_t size);
+ The prefix can also be set with an environment variable, HDF5_VDS_PREFIX.
- This disables warnings that do not indicate poor code quality such
- as -Winline and gcc's -Wsuggest-attribute. Developer warnings are
- disabled by default.
+ (ADB - 2017/12/12, HDFFV-9724, HDFFV-10361)
- (DER, 2017/01/10)
+ - H5FDdriver_query() API call added to the C library.
- - A bin/restore.sh script was added that reverts autogen.sh processing.
+ This new library call allows the user to query a virtual file driver
+ (VFD) for the feature flags it supports (listed in H5FDpublic.h).
+ This can be useful to determine if a VFD supports SWMR, for example.
- (DER, 2016/11/08)
+ Note that some VFDs have feature flags that may only be present
+ after a file has been created or opened (e.g.: the core VFD will
+ have the H5FD_FEAT_POSIX_COMPAT_HANDLE flag set if the backing
+ store is switched on). Since the new API call queries a generic VFD
+ unassociated with a file, these flags will never be returned.
- - CMake: Added NAMESPACE hdf5:: to package configuration files to allow
- projects using installed HDF5 binaries built with CMake to link with
- them without specifying the HDF5 library location via IMPORTED_LOCATION.
+ (DER - 2017/05/31, HDFFV-10215)
- (ABD, 2016/10/17, HDFFV-10003)
+ - H5FD_FEAT_DEFAULT_VFD_COMPATIBLE VFD feature flag added to the C library.
- - CMake: Changed the CTEST_BUILD_CONFIGURATION option to
- CTEST_CONFIGURATION_TYPE as recommended by the CMake documentation.
+ This new feature flag indicates that the VFD is compatible with the
+ default VFD. VFDs that set this flag create single files that follow
+ the canonical HDF5 file format.
- (ABD, 2016/10/17, HDFFV-9971)
-
+ (DER - 2017/05/31, HDFFV-10214)
- Fortran Library:
- ----------------
+ - The H5I_REFERENCE value in the H5I_type_t enum (defined in H5Ipublic.h)
+ has been marked as deprecated.
+
+ This ID type value is not used in the C library; i.e., there are no
+ hid_t values that are of ID type H5I_REFERENCE.
+
+ This enum value will be removed in a future major version of the library.
+ The code will remain unchanged in the HDF5 1.10.x releases and branches.
+
+ (DER - 2017/04/05, HDFFV-10252)
+
+
+ Parallel Library:
+ -----------------
+ - Enabled compression for parallel applications.
+
+ With this release, parallel applications can create and write compressed
+ datasets (or datasets with filters such as Fletcher32 applied).
+
+ (EIP - 2018/03/29)
+
+ - Addressed slow file close on some Lustre file systems.
+
+ Slow file close has been reported on some Lustre file systems.
+ While the ultimate cause is not understood fully, the proximate
+ cause appears to be long delays in MPI_File_set_size() calls at
+ file close and flush.
+
+ To minimize this problem pending a definitive diagnosis and fix,
+ PHDF5 has been modified to avoid MPI_File_set_size() calls when
+ possible. This is done by comparing the library's EOA (End of
+ Allocation) with the file system's EOF, and skipping the
+ MPI_File_set_size() call if the two match.
+
+ (JRM - 2018/03/29)
- - The HDF5 Fortran library can now be compiled with the NAG compiler.
+ - Optimized parallel open/location of the HDF5 super-block.
- (MSB, 2017/2/10, HDFFV-9973)
+ Previous releases of PHDF5 required all parallel ranks to
+ search for the HDF5 superblock signature when opening the
+ file. As this is accomplished more or less as a synchronous
+ operation, a large number of processes can experience a
+ slowdown in the file open due to filesystem contention.
+
+ As a first step in improving the startup/file-open performance,
+ we allow MPI rank 0 of the associated MPI communicator to locate
+ the base offset of the super-block and then broadcast that result
+ to the remaining ranks in the parallel group. Note that this
+ approach is utilized ONLY during file opens which employ the MPIO
+ file driver in HDF5 by previously having called H5Pset_fapl_mpio().
+
+ HDF5 parallel file operations which do not employ multiple ranks,
+ e.g. specifying MPI_COMM_SELF (whose MPI_Comm_size == 1)
+ as opposed to MPI_COMM_WORLD, will not be affected by this
+ optimization. Conversely, parallel file operations on subgroups
+ of MPI_COMM_WORLD are allowed to be run in parallel, with each
+ subgroup operating as an independent collection of processes.
+
+ (RAW - 2017/10/10, HDFFV-10294)
+
+ - Added large (>2GB) MPI-IO transfers.
+
+ Previous releases of PHDF5 would fail when attempting to
+ read or write greater than 2GB of data in a single IO operation.
+ This issue stems principally from an MPI API whose definitions
+ utilize 32 bit integers to describe the number of data elements
+ and datatype that MPI should use to effect a data transfer.
+ Historically, HDF5 has invoked MPI-IO with the number of
+ elements in a contiguous buffer represented as the length
+ of that buffer in bytes.
+
+ Resolving the issue and thus enabling larger MPI-IO transfers
+ is accomplished first, by detecting when a user IO request would
+ exceed the 2GB limit as described above. Once a transfer request
+ is identified as requiring special handling, PHDF5 now creates a
+ derived datatype consisting of a vector of fixed sized blocks
+ which is in turn wrapped within a single MPI_Type_struct to
+ contain the vector and any remaining data. The newly created
+ datatype is then used in place of MPI_BYTE and can be used to
+ fulfill the original user request without encountering API
+ errors.
+
+ (RAW - 2017/09/10, HDFFV-8839)
C++ Library:
------------
-
- The following C++ API wrappers have been added to the C++ Library:
+ + H5Lcreate_soft:
+ // Creates a soft link from link_name to target_name.
+ void link(const char *target_name, const char *link_name,...)
+ void link(const H5std_string& target_name,...)
- // Sets/Gets the strategy and the threshold value that the library
- // will employ in managing file space.
- FileCreatPropList::setFileSpaceStrategy - H5Pset_file_space_strategy
- FileCreatPropList::getFileSpaceStrategy - H5Pget_file_space_strategy
+ + H5Lcreate_hard:
+ // Creates a hard link from new_name to curr_name.
+ void link(const char *curr_name, const Group& new_loc,...)
+ void link(const H5std_string& curr_name, const Group& new_loc,...)
- // Sets/Gets the file space page size for paged aggregation.
- FileCreatPropList::setFileSpacePagesize - H5Pset_file_space_page_size
- FileCreatPropList::getFileSpacePagesize - H5Pget_file_space_page_size
+ // Creates a hard link from new_name to curr_name in same location.
+ void link(const char *curr_name, const hid_t same_loc,...)
+ void link(const H5std_string& curr_name, const hid_t same_loc,...)
- // Checks if the given ID is valid.
- IdComponent::isValid - H5Iis_valid
+ Note: previous version of H5Location::link will be deprecated.
- // Sets/Gets the number of soft or user-defined links that can be
- // traversed before a failure occurs.
- LinkAccPropList::setNumLinks - H5Pset_nlinks
- LinkAccPropList::getNumLinks - H5Pget_nlinks
+ + H5Lcopy:
+ // Copy an object from a group or file to another.
+ void copyLink(const char *src_name, const Group& dst,...)
+ void copyLink(const H5std_string& src_name, const Group& dst,...)
- // Returns a copy of the creation property list of a datatype.
- DataType::getCreatePlist - H5Tget_create_plist
+ // Copy an object from a group or file to the same location.
+ void copyLink(const char *src_name, const char *dst_name,...)
+ void copyLink(const H5std_string& src_name,...)
- // Opens/Closes an object within a group or a file, regardless of object
- // type
- Group::getObjId - H5Oopen
- Group::closeObjId - H5Oclose
+ + H5Lmove:
+ // Rename an object in a group or file to a new location.
+ void moveLink(const char* src_name, const Group& dst,...)
+ void moveLink(const H5std_string& src_name, const Group& dst,...)
- // Maps elements of a virtual dataset to elements of the source dataset.
- DSetCreatPropList::setVirtual - H5Pset_virtual
+ // Rename an object in a group or file to the same location.
+ void moveLink(const char* src_name, const char* dst_name,...)
+ void moveLink(const H5std_string& src_name,...)
- // Gets general information about this file.
- H5File::getFileInfo - H5Fget_info2
+ Note: previous version of H5Location::move will be deprecated.
- // Returns the number of members in a type.
- IdComponent::getNumMembers - H5Inmembers
+ + H5Ldelete:
+ // Removes the specified link from this location.
+ void unlink(const char *link_name,
+ const LinkAccPropList& lapl = LinkAccPropList::DEFAULT)
+ void unlink(const H5std_string& link_name,
+ const LinkAccPropList& lapl = LinkAccPropList::DEFAULT)
- // Determines if an element type exists.
- IdComponent::typeExists - H5Itype_exists
+ Note: an additional parameter was added to the previous H5Location::unlink.
- // Determines if an object exists.
- H5Location::exists - H5Lexists.
+ + H5Tencode and H5Tdecode:
+ // Creates a binary object description of this datatype.
+ void DataType::encode() - C API H5Tencode()
- // Returns the header version of an HDF5 object.
- H5Object::objVersion - H5Oget_info for version
+ // Returns the decoded type from the binary object description.
+ DataType::decode() - C API H5Tdecode()
+ ArrayType::decode() - C API H5Tdecode()
+ CompType::decode() - C API H5Tdecode()
+ EnumType::decode() - C API H5Tdecode()
+ FloatType::decode() - C API H5Tdecode()
+ IntType::decode() - C API H5Tdecode()
+ StrType::decode() - C API H5Tdecode()
+ VarLenType::decode() - C API H5Tdecode()
- (BMR, 2017/03/20, HDFFV-10004, HDFFV-10139, HDFFV-10145)
+ + H5Lget_info:
+ // Returns the information of the named link.
+ H5L_info_t getLinkInfo(const H5std_string& link_name,...)
- - New exception: ObjHeaderIException for H5O interface.
+ (BMR - 2018/03/11, HDFFV-10149)
- (BMR, 2017/03/15, HDFFV-10145)
+ - Added class LinkCreatPropList for link create property list.
- - New class LinkAccPropList for link access property list, to be used by
- wrappers of H5Lexists.
+ (BMR - 2018/03/11, HDFFV-10149)
- (BMR, 2017/01/04, HDFFV-10145)
+ - Added overloaded functions H5Location::createGroup to take a link
+ creation property list.
+ Group createGroup(const char* name, const LinkCreatPropList& lcpl)
+ Group createGroup(const H5std_string& name, const LinkCreatPropList& lcpl)
- - New constructors to open datatypes in ArrayType, CompType, DataType,
- EnumType, FloatType, IntType, StrType, and VarLenType.
-
- (BMR, 2016/12/26, HDFFV-10056)
+ (BMR - 2018/03/11, HDFFV-10149)
- - New member functions:
+ - A document was added to the HDF5 C++ API Reference Manual showing the
+ mapping from each C API to its C++ wrappers. It can be found on the
+ main page of the C++ API Reference Manual.
- DSetCreatPropList::setNbit() to setup N-bit compression for a dataset.
+ (BMR - 2017/10/17, HDFFV-10151)
- ArrayType::getArrayNDims() const
- ArrayType::getArrayDims() const
- both to replace the non-const versions.
- (BMR, 2016/04/25, HDFFV-8623, HDFFV-9725)
+ Java Library:
+ ----------------
+ - Wrapper added for enabling the error stack.
+ H5error_off disables error stack reporting. To re-enable
+ reporting, the error stack state must be saved so that
+ H5error_on can restore it.
- Tools:
- ------
- - The following options have been added to h5clear:
- -s: clear the status_flags field in the file's superblock
- -m: Remove the metadata cache image from the file
+ (ADB - 2018/03/13, HDFFV-10412)
- (QAK, 2017/03/22, PR#361)
+ - Wrappers were added for the following C APIs:
+ H5Pset_evict_on_close
+ H5Pget_evict_on_close
+ H5Pset_chunk_opts
+ H5Pget_chunk_opts
+ H5Pset_efile_prefix
+ H5Pget_efile_prefix
+ H5Pset_virtual_prefix
+ H5Pget_virtual_prefix
+ (ADB - 2017/12/20)
- High-Level APIs:
- ---------------
- - Added New Fortran 2003 API for h5tbmake_table_f.
+ - The H5I_REFERENCE value in the H5I_type_t enum (defined in H5Ipublic.h)
+ has been marked as deprectated.
+
+ JNI code which refers to this value will be removed in a future
+ major version of the library. The code will remain unchanged in the
+ 1.10.x releases and branches.
+
+ See the C library section, above, for further information.
- (MSB, 2017/02/10, HDFFV-8486)
+ (HDFFV-10252, DER, 2017/04/05)
+ Tools:
+ ------
+ - h5diff has a new option to display error stack.
-Support for New Platforms, Languages, and Compilers
-===================================================
+ Updated h5diff with the --enable-error-stack argument, which
+ enables the display of the hdf5 error stack. This completes the
+ improvement to the main tools: h5copy, h5diff, h5dump, h5ls and
+ h5repack.
- - Added NAG compiler
+ (ADB - 2017/08/30, HDFFV-9774)
+Support for New Platforms, Languages, and Compilers
+===================================================
+ - None
-Bug Fixes since HDF5-1.10.0-patch1 release
+Bug Fixes since HDF5-1.10.1 release
==================================
Library
-------
- - Outdated data structure was used in H5D_CHUNK_DEBUG blocks, causing
- compilation errors when H5D_CHUNK_DEBUG was defined. This is fixed.
+ - The data read after a direct chunk write to a chunked dataset with
+ one chunk was incorrect.
+
+ The problem was due to a null dataset pointer being passed to
+ the insert callback for the chunk index in the routine
+ H5D__chunk_direct_write() in H5Dchunk.c.
+ The dataset was a single-chunked dataset, which uses the
+ single chunk index when the latest format is enabled at file
+ creation. The single chunk index was the only index that used
+ this pointer in the insert callback.
- (BMR, 2017/04/04, HDFFV-8089)
+ Passed the dataset pointer to the insert callback for the chunk
+ index in H5D__chunk_direct_write().
- - SWMR implementation in the HDF5 1.10.0 and 1.10.0-patch1 releases has a
- broken metadata flush dependency that manifested itself with the following
- error at the end of the HDF5 error stack:
+ (VC - 2018/03/20, HDFFV-10425)
- H5Dint.c line 846 in H5D__swmr_setup(): dataspace chunk index must be 0
- for SWMR access, chunkno = 1
- major: Dataset
- minor: Bad value
+ - Added public routine H5DOread_chunk to the high-level C library.
- It was also reported at https://github.com/areaDetector/ADCore/issues/203
+ The patch for H5DOwrite_chunk() to write an entire chunk to the file
+ directly was contributed by GE Healthcare and integrated by The HDF Group
+ developers.
+
+ (VC - 2017/05/19, HDFFV-9934)
- The flush dependency is fixed in this release.
+ - Freeing of object header after failed checksum verification.
- - Changed the plugins dlopen option from RTLD_NOW to RTLD_LAZY
+ It was discovered that the object header (in H5Ocache.c) was not released properly
+ when the checksum verification failed and a re-load of the object
+ header was needed.
- (ABD, 2016/12/12, PR#201)
+ Freed the object header that failed the checksum verification only
+ after the new object header was reloaded, deserialized, and set up.
- - A number of issues were fixed when reading/writing from/to corrupted
- files to ensure that the library fails gracefully in these cases:
+ (VC - 2018/03/14, HDFFV-10209)
- * Writing to a corrupted file that has an object message which is
- incorrectly marked as sharable on disk results in a buffer overflow /
- invalid write instead of a clean error message.
+ - Updated H5Pset_evict_on_close in H5Pfapl.c
- * Decoding data from a corrupted file with a dataset encoded with the
- H5Z_NBIT decoding can result in a code execution vulnerability under
- the context of the application using the HDF5 library.
+ Changed the minor error number from H5E_CANTSET to H5E_UNSUPPORTED for
+ parallel library.
- * When decoding an array datatype from a corrupted file, the HDF5 library
- fails to return an error in production if the number of dimensions
- decoded is greater than the maximum rank.
+ (ADB - 2018/03/06, HDFFV-10414)
- * When decoding an "old style" array datatype from a corrupted file, the
- HDF5 library fails to return an error in production if the number of
- dimensions decoded is greater than the maximum rank.
+ - Fixed a utility function that could not handle lowercase
+ Windows drive letters.
- (NAF, 2016/10/06, HDFFV-9950, HDFFV-9951, HDFFV-9992, HDFFV-9993)
+ Added a call to an upper-case conversion function for the drive letter.
- - Fixed an error that would occur when copying an object with an attribute
- which is a compound datatype consisting of a variable length string.
+ (ADB - 2017/12/18, HDFFV-10307)
- (VC, 2016/08/24, HDFFV-7991)
+ - Fixed H5Sencode() bug when the number of elements selected was > 2^32.
- - H5DOappend will no longer fail if a dataset has no append callback
- registered.
+ H5Sencode() incorrectly encoded dataspace selections with more
+ than 2^32 elements. When decoding such a selection via H5Sdecode(),
+ the number of elements in the decoded dataspace was not the same as
+ what was encoded. This problem existed for both H5S_SEL_HYPER and
+ H5S_SEL_POINTS encodings.
- (VC, 2016/08/14, HDFFV-9960)
+ The problem was caused by the library using only 32 bits to
+ encode counts and block offsets for the selection.
+ The solution is to use the original 32-bit encoding when possible,
+ but a different encoding when more than 32 bits are needed.
+ See details in the RFC: H5Sencode/H5Sdecode Format Change at
+ https://bitbucket.hdfgroup.org/projects/HDFFV/repos/hdf5doc/browse/RFCs/HDF5_Library/H5SencodeFormatChange.
- - Fixed an issue where H5Pset_alignment could result in misaligned blocks
- with some input combinations, causing an assertion failure in debug mode.
+ (VC - 2017/11/28, HDFFV-9947)
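The fallback described above can be sketched as a simple version-selection rule; the function name and version numbers below are illustrative, not the library's actual encode format:

```c
#include <stdint.h>

/* Keep the original 32-bit encoding when every count and block offset in
 * the selection fits in 32 bits; otherwise switch to a wider encoding.
 * Version numbers are hypothetical: 1 = legacy 32-bit, 2 = wide. */
unsigned select_selection_encoding(uint64_t max_count, uint64_t max_offset)
{
    uint64_t largest = (max_count > max_offset) ? max_count : max_offset;
    return (largest <= UINT32_MAX) ? 1u : 2u;
}
```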
- (NAF, 2016/08/11, HDFFV-9948)
+ - Fixed filter plugin handling in H5PL.c and H5Z.c to not require the
+ availability of dependent libraries (e.g., szip or zlib).
- - Fixed a problem where a plugin compiled into a DLL in the default plugin
- directory could not be found by the HDF5 library at runtime on Windows
- when the HDF5_PLUGIN_PATH environment variable was not set.
+ It was discovered that the dynamic loading process used by
+ filter plugins had issues with library dependencies.
- (ABD, 2016/08/01, HDFFV-9706)
+ The CMake build process was changed to use LINK INTERFACE keywords,
+ which allowed the HDF5 C library to make dependent libraries private. The
+ filter plugin libraries no longer require dependent libraries
+ (such as szip or zlib) to be available.
+
+ (ADB - 2017/11/16, HDFFV-10328)
- - Fixed an error that would occur when calling H5Adelete on an attribute
- which is attached to an externally linked object in the target file and
- whose datatype is a committed datatype in the main file.
+ - Fixed rare object header corruption bug.
- (VC, 2016/07/06, HDFFV-9940)
+ In certain cases, such as when converting large attributes to dense
+ storage, an error could occur which would either fail an assertion or
+ cause file corruption. Fixed and added test.
- - (a) Throw an error instead of assertion when v1 btree level hits the 1
- byte limit.
- (b) Modifications to better handle error recovery when conversion by
- h5format_convert fails.
+ (NAF - 2017/11/14, HDFFV-10274)
- (VC, 2016/05/29, HDFFV-9434)
+ - Updated H5Zfilter_avail in H5Z.c.
- - Fixed a memory leak where an array used by the library to track SWMR
- read retries was unfreed.
+ The public function checked for plugins, while the private
+ function did not.
- The leaked memory was small (on the order of a few tens of ints) and
- allocated per-file. The memory was allocated (and lost) only when a
- file was opened for SWMR access.
+ Modified H5Zfilter_avail and private function, H5Z_filter_avail.
+ Moved check for plugin from public to private function. Updated
+ H5P__set_filter due to change in H5Z_filter_avail. Updated tests.
- (DER, 2016/04/27, HDFFV-9786)
+ (ADB - 2017/10/10, HDFFV-10297, HDFFV-10319)
- - Fixed a memory leak that could occur when opening a file for the first
- time (including creating) and the call fails.
+ - h5dump produced a SEGFAULT when dumping a corrupted file.
+
+ The behavior was due to an error in the internal function H5HL_offset_into().
- This occurred when the file-driver-specific info was not cleaned up.
- The amount of memory leaked varied with the file driver, but would
- normally be less than 1 kB.
+ (1) Fixed H5HL_offset_into() to return an error when the offset exceeds
+ the heap data block size.
+ (2) Fixed other places in the library that call this routine to detect
+ the error return.
- (DER, 2016/12/06, HDFFV-10168)
+ (VC - 2017/08/30, HDFFV-10216)
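Fix (1) amounts to a bounds check before forming a pointer into the heap data block; a minimal standalone sketch (names are illustrative, not the library's internals):

```c
#include <stddef.h>

/* Return -1 (error) instead of producing an out-of-bounds pointer when the
 * requested offset lies beyond the heap data block. */
int heap_offset_into(const unsigned char *heap_data, size_t heap_size,
                     size_t offset, const unsigned char **out)
{
    if (offset >= heap_size)
        return -1;              /* corrupt file: offset exceeds block size */
    *out = heap_data + offset;
    return 0;
}
```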
- - Fixed a failure in collective metadata writes.
+ - Fixes for paged aggregation feature.
- This failure only appeared when collective metadata writes
- were enabled (via H5Pset_coll_metadata_write()).
+ Skip test in test/fheap.c when:
+ (1) multi/split drivers and
+ (2) persisting free-space or using paged aggregation strategy
- (JRM, 2017/04/10, HDFFV-10055)
+ (VC, 2017/07/10)
+ Changes made based on RFC review comments:
+ (1) Added maximum value for file space page size
+ (2) Dropped check for page end metadata threshold
+ (3) Removed "can_shrink" and "shrink" callbacks for small section class
- Parallel Library
- ----------------
- - Fixed a bug that could occur when allocating a chunked dataset in parallel
- with an alignment set and an alignment threshold greater than the chunk
- size but less than or equal to the raw data aggregator size.
+ (VC - 2017/06/09)
+
+ - Fixed an infinite loop in H5VM_power2up().
+
+ The function H5VM_power2up() returns the next power of 2
+ for n. When n exceeds 2^63, the value overflows to 0, causing
+ an infinite loop.
+
+ The fix ensures that the function checks for n >= 2^63
+ and returns 0.
+
+ (VC - 2017/07/10, HDFFV-10217)
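The guarded loop can be sketched in standalone C; the real routine is internal to the library, so the function below merely mirrors the fix:

```c
#include <stdint.h>

/* Next power of 2 >= n, guarded against 64-bit overflow: for n >= 2^63 the
 * doubling loop would wrap ret to 0 and spin forever, so return 0 instead. */
uint64_t power2up(uint64_t n)
{
    uint64_t ret = 1;

    if (n >= (UINT64_C(1) << 63))
        return 0;               /* cannot be represented; signal failure */
    while (ret < n)
        ret <<= 1;
    return ret;
}
```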
- (NAF, 2016/08/11, HDFFV-9969)
+ - Fixed H5Ocopy not working with open identifiers.
+
+ Changes were made so that raw data for dataset objects is copied from
+ cached info when possible instead of flushing objects to the file and
+ reading them back in again.
+
+ (VC - 2017/07/05, HDFFV-7853)
+
+ - An uninitialized struct could cause a memory access error when using
+ variable-length or reference types in a compressed, chunked dataset.
+
+ A struct containing a callback function pointer and a pointer to some
+ associated data was used before initialization. This could cause a
+ memory access error and system crash. This could only occur under
+ unusual conditions when using variable-length and reference types in
+ a compressed, chunked dataset.
+
+ On recent versions of Visual Studio, when built in debug mode, the
+ debug heap will complain and cause a crash if the code in question
+ is executed (this will cause the objcopy test to fail).
+
+ (DER - 2017/11/21, HDFFV-10330)
+
+ - Fixed collective metadata writes on file close.
+
+ It was discovered that metadata was being written twice as part of
+ the parallel file close behavior, once independently and once
+ collectively.
+
+ A fix for this error was included as part of the parallel compression
+ feature but remained undocumented here.
+
+ (RAW - 2017/12/01, HDFFV-10272)
+
+ - If an HDF5 file contains a filter pipeline message with a 'number of
+ filters' field that exceeds the maximum number of allowed filters,
+ the error handling code will attempt to dereference a NULL pointer.
+
+ This issue was reported to The HDF Group as issue #CVE-2017-17505.
+ https://security-tracker.debian.org/tracker/CVE-2017-17505
+ https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17505
+
+ NOTE: The HDF5 C library cannot produce such a file. This condition
+ should only occur in a corrupt (or deliberately altered) file
+ or a file created by third-party software.
+
+ This problem arose because the error handling code assumed that
+ the 'number of filters' field implied that a dynamic array of that
+ size had already been created and that the cleanup code should
+ iterate over that array and clean up each element's resources. If
+ an error occurred before the array had been allocated, this would
+ not be true.
+
+ This has been changed so that the number of filters is set to
+ zero on errors. Additionally, the filter array traversal in the
+ error handling code now requires that the filter array not be NULL.
+
+ (DER - 2018/02/06, HDFFV-10354)
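The hardened error path can be sketched as follows; the struct and function names are hypothetical stand-ins for the library's internal pipeline code:

```c
#include <stddef.h>
#include <stdlib.h>

/* Illustrative cleanup after a decode error: the traversal requires a
 * non-NULL array, and the count is zeroed on failure, so a filter count
 * read from a corrupt message can no longer drive a walk over an array
 * that was never allocated. */
struct pipeline {
    size_t nused;       /* number of filters actually set up */
    void **filters;     /* per-filter resources, or NULL if never allocated */
};

void pipeline_reset(struct pipeline *p)
{
    if (p->filters != NULL) {           /* guard: array may be unallocated */
        for (size_t i = 0; i < p->nused; i++)
            free(p->filters[i]);        /* release each element's resources */
        free(p->filters);
        p->filters = NULL;
    }
    p->nused = 0;                       /* error path: no filters remain */
}
```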
+
+ - If an HDF5 file contains a filter pipeline message which contains
+ a 'number of filters' field that exceeds the actual number of
+ filters in the message, the HDF5 C library will read off the end of
+ the read buffer.
+
+ This issue was reported to The HDF Group as issue #CVE-2017-17506.
+ https://security-tracker.debian.org/tracker/CVE-2017-17506
+ https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17506
+
+ NOTE: The HDF5 C library cannot produce such a file. This condition
+ should only occur in a corrupt (or deliberately altered) file
+ or a file created by third-party software.
+
+ The problem was fixed by passing the buffer size with the buffer
+ and ensuring that the pointer cannot be incremented off the end
+ of the buffer. A mismatch between the number of filters declared
+ and the actual number of filters will now invoke normal HDF5
+ error handling.
+
+ (DER - 2018/02/26, HDFFV-10355)
+
+ - If an HDF5 file contains a malformed compound datatype with a
+ suitably large offset, the type conversion code can run off
+ the end of the type conversion buffer, causing a segmentation
+ fault.
+
+ This issue was reported to The HDF Group as issue #CVE-2017-17507.
+ https://security-tracker.debian.org/tracker/CVE-2017-17507
+ https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17507
+
+ NOTE: The HDF5 C library cannot produce such a file. This condition
+ should only occur in a corrupt (or deliberately altered) file
+ or a file created by third-party software.
+
+ THE HDF GROUP WILL NOT FIX THIS BUG AT THIS TIME
+
+ Fixing this problem would involve updating the publicly visible
+ H5T_conv_t function pointer typedef and versioning the API calls
+ which use it. We normally only modify the public API during
+ major releases, so this bug will not be fixed at this time.
+
+ (DER - 2018/02/26, HDFFV-10356)
+
+ - If an HDF5 file contains a malformed compound type which contains
+ a member of size zero, a division by zero error will occur while
+ processing the type.
+
+ This issue was reported to The HDF Group as issue #CVE-2017-17508.
+ https://security-tracker.debian.org/tracker/CVE-2017-17508
+ https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17508
+
+ NOTE: The HDF5 C library cannot produce such a file. This condition
+ should only occur in a corrupt (or deliberately altered) file
+ or a file created by third-party software.
+
+ Checking for zero before dividing fixes the problem. Instead of the
+ division by zero, the normal HDF5 error handling is invoked.
+
+ (DER - 2018/02/26, HDFFV-10357)
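The guard amounts to a zero check before the division; a minimal sketch with an illustrative function name:

```c
#include <stddef.h>

/* A zero-size member in a malformed compound type now produces an error
 * return instead of a divide-by-zero fault. */
int elements_per_buffer(size_t buf_size, size_t member_size, size_t *out)
{
    if (member_size == 0)
        return -1;              /* malformed datatype: invoke error handling */
    *out = buf_size / member_size;
    return 0;
}
```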
+
+ - If an HDF5 file contains a malformed symbol table node that declares
+ it contains more symbols than it actually contains, the library
+ can run off the end of the metadata cache buffer while processing
+ the symbol table node.
+
+ This issue was reported to The HDF Group as issue #CVE-2017-17509.
+ https://security-tracker.debian.org/tracker/CVE-2017-17509
+ https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17509
+
+ NOTE: The HDF5 C library cannot produce such a file. This condition
+ should only occur in a corrupt (or deliberately altered) file
+ or a file created by third-party software.
+
+ Performing bounds checks on the buffer while processing fixes the
+ problem. Instead of the segmentation fault, the normal HDF5 error
+ handling is invoked.
+
+ (DER - 2018/03/12, HDFFV-10358)
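The bounds check can be sketched as a guarded decode loop; the entry size and names below are illustrative:

```c
#include <stddef.h>

enum { SYM_ENTRY_SIZE = 8 };    /* illustrative serialized entry size */

/* Stop with an error when the declared symbol count would read past the
 * cached buffer, instead of running off its end. */
int decode_symbols(const unsigned char *buf, size_t buf_len, size_t nsyms)
{
    const unsigned char *p = buf;

    for (size_t i = 0; i < nsyms; i++) {
        if ((size_t)(buf + buf_len - p) < SYM_ENTRY_SIZE)
            return -1;          /* corrupt node: count exceeds buffer */
        p += SYM_ENTRY_SIZE;    /* decode one symbol table entry here */
    }
    return 0;
}
```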
+
+ - Fixed permissions passed to open(2) on file create.
+
+ On Windows, the POSIX permissions passed to open(2) when creating files
+ were only incidentally correct. They are now set to the correct value of
+ (_S_IREAD | _S_IWRITE).
+
+ On other platforms, the permissions were set to a mix of 666, 644, and
+ 000. They are now set uniformly to 666.
+
+ (DER - 2017/04/28, HDFFV-9877)
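On POSIX systems the creation call now looks like the following sketch (the wrapper name is illustrative; the process umask still narrows the mode):

```c
#include <fcntl.h>
#include <sys/stat.h>

/* Create a file with the uniform mode 0666 described above. */
int create_with_default_perms(const char *path)
{
    return open(path, O_RDWR | O_CREAT | O_TRUNC, 0666);
}
```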
+
+ - The H5FD_FEAT_POSIX_COMPAT_HANDLE flag is no longer used to determine
+ if a virtual file driver (VFD) is compatible with SWMR.
+
+ Use of this VFD feature flag was not in line with the documentation in
+ the public H5FDpublic.h file. In particular, it was being used as a
+ proxy for determining if SWMR I/O is allowed. This is unnecessary as we
+ already have a feature flag for this (H5FD_FEAT_SUPPORTS_SWMR_IO).
+
+ (DER - 2017/05/31, HDFFV-10214)
Configuration
- -------------
- - Configuration will check for the strtoll and strtoull functions
- before using alternatives
+ -------------
+ - CMake changes
- (ABD, 2017/03/17, PR#340)
+ - Updated CMake commands configuration.
- - CMake uses a Windows pdb directory variable if available and
- will generate both static and shared pdb files.
+ A number of improvements were made to the CMake commands. Most
+ changes simplify usage or eliminate unused constructs. Some
+ changes also improve cross-platform support.
- (ABD, 2017/02/06, HDFFV-9875)
+ (ADB - 2018/02/01, HDFFV-10398)
- - CMake now builds shared versions of tools.
+ - Corrected usage of CMAKE_BUILD_TYPE variable.
- (ABD, 2017/02/01, HDFFV-10123)
+ The use of CMAKE_BUILD_TYPE was incorrect for multi-config
+ generators (Visual Studio and Xcode) and is optional for
+ single-config generators. Created a new macro to check the
+ GLOBAL PROPERTY -> GENERATOR_IS_MULTI_CONFIG.
+ Created two new HDF variables, HDF_BUILD_TYPE and HDF_CFG_BUILD_TYPE.
+ The default for these variables is "Release".
- - Makefiles and test scripts have been updated to correctly remove files
- created when running "make check" and to avoid removing any files under
- source control. In-source builds followed by "make clean" and "make
- distclean" should result in the original source files.
+ (ADB - 2018/01/10, HDFFV-10385)
- (LRK, 2017/01/17, HDFFV-10099)
+ - Added replacement of Fortran flags if using static CRT.
- - The tools directory has been divided into two separate source and test
- directories. This resolves a build dependency and, as a result,
- 'make check' will no longer fail in the tools directory if 'make' was
- not executed first.
-
- (ABD, 2016/10/27, HDFFV-9719)
+ Added TARGET_STATIC_CRT_FLAGS call to HDFUseFortran.cmake file in
+ config/cmake_ext_mod folder.
- - CMake: Fixed a timeout error that would occasionally occur when running
- the virtual file driver tests simultaneously due to test directory
- and file name collisions.
+ (ADB - 2018/01/08, HDFFV-10334)
- (ABD, 2016/09/19, HDFFV-9431)
- - CMake: Fixed a command length overflow error by converting custom
- commands inside CMakeTest.cmake files into regular dependencies and
- targets.
+ - The hdf5 library used shared szip and zlib, which needlessly required
+ applications to link with the same szip and zlib libraries.
- (ABD, 2016/07/12, HDFFV-9939)
+ Changed the target_link_libraries commands to use the static libs.
+ Removed improper link duplication of szip and zlib.
+ Adjusted the link dependencies and the link interface values of
+ the target_link_libraries commands.
- - Fixed a problem preventing HDF5 to be built on 32-bit CYGWIN by
- condensing cygwin configuration files into a single file and
- removing outdated compiler settings.
+ (ADB - 2017/11/14, HDFFV-10329)
- (ABD, 2016/07/12, HDFFV-9946)
+ - CMake MPI
+ The CMake implementation for MPI was problematic and would create
+ incorrect MPI library references in the HDF5 libraries.
- Fortran
- --------
- - Changed H5S_ALL_F from INTEGER to INTEGER(HID_T)
+ Reworked the CMake MPI code to properly create CMake targets. Also merged
+ the latest CMake FindMPI.cmake changes to the local copy. This is necessary
+ until HDF changes the CMake minimum to 3.9 or greater.
+
+ (ADB - 2017/11/02, HDFFV-10321)
+
+ - Corrected FORTRAN_HAVE_C_LONG_DOUBLE processing in the autotools.
- (MSB, 2016/10/14, HDFFV-9987)
+ A bug in the autotools Fortran processing code always set the
+ FORTRAN_HAVE_C_LONG_DOUBLE variable to be true regardless of
+ whether or not a C long double type was present.
+ This would cause compilation failures on platforms where a C
+ long double type was not available and the Fortran wrappers
+ were being built.
+
+ (DER - 2017/07/05, HDFFV-10247)
+
+ - The deprecated --enable-production and --enable-debug configure options
+ failed to emit errors when passed an empty string
+ (e.g.: --enable-debug="").
+
+ Due to the way we checked for these options being set, it was possible
+ to avoid the error message and continue configuration if an empty string
+ was passed to the option.
+
+ Any use of --enable-production or --enable-debug will now halt the
+ configuration step and emit a helpful error message
+ (use --enable-build-mode=debug|production instead).
+
+ (DER - 2017/07/05, HDFFV-10248)
+
+ - CMake
+
+ Too many commands for the POST_BUILD step caused the command line
+ to be too long on Windows.
+
+ Changed the foreach of the copy command to use a custom command
+ with the HDFTEST_COPY_FILE macro.
+
+ (ADB - 2017/07/12, HDFFV-10254)
+
+ - CMake test execution environment
+
+ The parallel HDF5 test 't_pread' assumed the use of the autotools
+ and the directory structure associated with that testing approach.
+ Modified the test code to check whether the 'h5jam' utility can be
+ found in the same directory as the test executable (the preferred
+ directory structure used by CMake) and, if found, to invoke the
+ tool directly rather than via a relative path.
+
+ (RAW - 2017/11/03, HDFFV-10318)
+
+ - Fortran compilation fails for xlf and CMake builds.
+
+ Fixed the CMake shared library build for H5match_types and modules.
+
+ (MSB - 2017/12/19, HDFFV-10363)
+
+ - Shared libraries fail test on OSX with Fortran enabled with CMake.
+
+ Fixed by removing the F77 use of EQUIVALENCE and COMMON, replaced
+ using MODULES. Updated CMake.
+
+ (MSB - 2017/12/07, HDFFV-10223)
+
+ - The bin/trace script now emits an error code on problems and autogen.sh
+ will fail if bin/trace fails.
+
+ The bin/trace script adds tracing functionality to public HDF5 API calls.
+ It is only of interest to developers who modify the HDF5 source code.
+ Previously, bin/trace just wrote an error message to stdout when it
+ encountered problems, so autogen.sh processing did not halt and a broken
+ version of the library could be built. The script will now return an
+ error code when it encounters problems, and autogen.sh will fail.
+
+ This only affects users who run autogen.sh to rebuild the Autotools files,
+ which is not necessary to build HDF5 from source in official releases of the
+ library. CMake users are unaffected as bin/trace is not run via CMake
+ at this time.
+
+ (DER - 2017/04/25, HDFFV-10178)
+
+ - FC_BASENAME was changed from gfortran40 to gfortran in a few places.
+
+ In the autotools, FC_BASENAME was set to gfortran40 in a few locations
+ (config/gnu-fflags and config/freebsd). This was probably a historical
+ artifact and did not seem to affect many users.
+
+ The value is now correctly set to gfortran.
+
+ (DER - 2017/05/26, HDFFV-10249)
+
+ - The ar flags were changed to -cr (was: -cru)
+
+ The autotools set the flags for ar to -cru by default. The -u flag,
+ which allows selective replacement of only the members which have
+ changed, raises warnings on some platforms, so the flags are now set to
+ -cr via AR_FLAGS in configure.ac. This causes the static library to
+ always be completely recreated from the object files on each build.
+
+ (DER - 2017/11/15, HDFFV-10428)
+
+
+ Fortran
+ --------
+ - Fixed compilation errors when using Intel 18 Fortran compilers
+ (MSB - 2017/11/3, HDFFV-10322)
Tools
-----
- - h5diff now correctly ignores strpad in comparing strings.
+ - h5clear
- (ABD, 2017/03/03, HDFFV-10128)
+ An enhancement to the tool for setting a file's stored EOA.
- - h5repack now correctly parses the command line filter options.
+ It was discovered that a crashed file's stored EOA in the superblock
+ was smaller than the actual file's EOF. When the file was reopened
+ and closed, the library truncated the file to the stored EOA.
- (ABD, 2017/01/24, HDFFV-10046)
+ Added an option to the tool to set the file's stored EOA in the
+ superblock to the maximum of (EOA, EOF) + increment.
+ An option was also added to print the file's EOA and EOF.
- - h5diff now correctly returns an error when it cannot read data due
- to an unavailable filter plugin.
+ (VC - 2018/03/14, HDFFV-10360)
- (ADB 2017/01/18, HDFFV-9994 )
+ - h5repack
- - Fixed an error in the compiler wrapper scripts (h5cc, h5fc, et al.)
- in which they would erroneously drop the file argument specified via
- the -o flag when the -o flag was specified before the -c flag on the
- command line, resulting in a failure to compile.
+ h5repack changed the chunk parameters when a change of layout was not
+ specified and a filter was applied.
- (LRK, 2016/11/04, HDFFV-9938, HDFFV-9530)
+ HDFFV-10297 and HDFFV-10319 reworked the h5repack and h5diff code
+ in the tools library. The check for an existing layout was incorrectly
+ placed inside an if block and not executed. The check was moved into
+ the normal path of the function.
- - h5repack User Defined (UD) filter parameters were not parsed correctly.
+ (ADB - 2018/02/21, HDFFV-10412)
- The UD filter parameters were not being parsed correctly. Reworked coding
- section to parse the correct values and verify number of parameters.
+ - h5dump
- (ABD, 2016/10/19, HDFFV-9996, HDFFV-9974, HDFFV-9515, HDFFV-9039)
+ The tools library will hide the error stack during file open.
- - h5repack allows the --enable-error-stack option on the command line.
+ While this is preferable almost always, there are reasons to enable
+ display of the error stack when a tool will not open a file. An
+ optional argument to --enable-error-stack provides this use case.
+ As an optional argument, it does not affect the existing operation of
+ --enable-error-stack. h5dump is the only tool to implement this change.
- (ADB, 2016/08/08, HDFFV-9775)
+ (ADB - 2018/02/15, HDFFV-10384)
+ - h5dump
- C++ APIs
- --------
- - The member function H5Location::getNumObjs() is moved to
- class Group because the objects are in a group or a file only,
- and H5Object::getNumAttrs to H5Location to get the number of
- attributes at a given location.
+ h5dump would output an indented blank line in the filters section.
+
+ h5dump overused the h5tools_simple_prefix function, which is
+ intended to account for the data index (x,y,z) option.
+ Removed the function call for header information.
+
+ (ADB - 2018/01/25, HDFFV-10396)
+
+ - h5repack
+
+ h5repack incorrectly searched the internal object table for a name.
+
+ h5repack searched the table of objects for a name; if the name
+ did not match, it tried to determine whether the name without a
+ leading slash would match. The logic was flawed: the table
+ stored names (paths) without a leading slash and did a strstr
+ of the table path against the name.
+ The assumption was that a difference of one character meant a
+ match, but "pressure" would match "/pressure" as well as
+ "/pressure1", "/pressure2", etc. Changed the logic to remove
+ any leading slash and then do a full comparison of the name.
+
+ (ADB - 2018/01/18, HDFFV-10393)
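The corrected comparison can be sketched in a few lines; the function name is illustrative:

```c
#include <string.h>

/* Strip a single leading slash from the requested path, then compare the
 * full names. The old strstr-based check let "pressure" match
 * "/pressure1" and "/pressure2" as well as "/pressure". */
int names_match(const char *table_name, const char *request)
{
    if (request[0] == '/')
        request++;              /* table stores paths without the slash */
    return strcmp(table_name, request) == 0;
}
```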
+
+ - h5repack
+
+ h5repack failed to handle command line parameters for custom filters.
+
+ User-defined filter parameter conversions would fail when integers
+ were represented on the command line by character strings
+ longer than 9 characters. Increased the local array used for storing
+ the current command line parameter to prevent buffer overflows.
+
+ (ADB - 2018/01/17, HDFFV-10392)
+
+ - h5diff
+
+ h5diff segfaulted if comparing VL strings against fixed strings.
+
+ Reworked solution for HDFFV-8625 and HDFFV-8639. Implemented the check
+ for string objects of same type in the diff_can_type function by
+ adding an if(tclass1 == H5T_STRING) block. This "if block" moves the
+ same check that was added for attributes to this function, which is
+ used by all object types. This function handles complex type structures.
+ Also added a new test file in h5diffgentest for testing this issue
+ and removed the temporary files used in the test scripts.
- (BMR, 2017/03/17, PR#466)
+ (ADB - 2018/01/04, HDFFV-8745)
- - Due to the change in the C API, the overloaded functions of
- PropList::setProperty now need const for some arguments. They are
- planned for deprecation and are replaced by new versions with proper
- consts.
+ - h5repack
- (BMR, 2017/03/17, PR#344)
+ h5repack failed to copy a dataset with existing filter.
- - The high-level API Packet Table (PT) did not write data correctly when
- the datatype is a compound type that has string type as one of the
- members. This problem started in 1.8.15, after the fix of HDFFV-9042
- was applied, which caused the Packet Table to use native type to access
- the data. It should be up to the application to specify whether the
- buffer to be read into memory is in the machine's native architecture.
- Thus, the PT is fixed to not use native type but to make a copy of the
- user's provided datatype during creation or the packet table's datatype
- during opening. If an application wishes to use native type to read the
- data, then the application will request that. However, the Packet Table
- doesn't provide a way to specify memory datatype in this release. This
- feature will be available in future releases.
+ Reworked code for h5repack and h5diff code in the tools library. Added
+ improved error handling, cleanup of resources and checks of calls.
+ Modified H5Zfilter_avail and private function, H5Z_filter_avail.
+ Moved check for plugin from public to private function. Updated
+ H5P__set_filter due to change in H5Z_filter_avail. Updated tests.
+ Note, h5repack output display has changed to clarify the individual
+ steps of the repack process. The output indicates if an operation
+ applies to all objects. Lines with notation and no information
+ have been removed.
- (BMR, 2016/10/27, HDFFV-9758)
+ (ADB - 2017/10/10, HDFFV-10297, HDFFV-10319)
- - The obsolete macros H5_NO_NAMESPACE and H5_NO_STD have been removed from
- the HDF5 C++ API library.
+ - h5repack
- (BMR, 2016/10/23, HDFFV-9532)
+ h5repack always set the User Defined filter flag to H5Z_FLAG_MANDATORY.
- - The problem where a user-defined function cannot access both, attribute
- and dataset, using only one argument is now fixed.
+ Added another parameter to the 'UD=' option to set the flag by default
+ to '0' or H5Z_FLAG_MANDATORY, the other choice is '1' or H5Z_FLAG_OPTIONAL.
- (BMR, 2016/10/11, HDFFV-9920)
+ (ADB - 2017/08/31, HDFFV-10269)
- - In-memory array information, ArrayType::rank and
- ArrayType::dimensions, were removed. This is an implementation
- detail and should not affect applications.
+ - h5ls
- (BMR, 2016/04/25, HDFFV-9725)
+ h5ls generated an error on the stack when it encountered an
+ H5S_NULL dataspace.
+ Adding checks for H5S_NULL before calling H5Sis_simple (located
+ in the h5tools_dump_mem function) fixed the issue.
+
+ (ADB - 2017/08/17, HDFFV-10188)
+
+ - h5repack
+
+ Added tests to h5repack.sh.in to verify options added for paged
+ aggregation work as expected.
+
+ (VC - 2017/08/03)
+
+ - h5dump
+
+ h5dump segfaulted on output of XML file.
+
+ The function that escaped strings used the full buffer length
+ instead of just the length of the replacement string in a
+ strncpy call. Using the correct length fixed the issue.
+
+ (ADB - 2017/08/01, HDFFV-10256)
+
+ - h5diff
+
+ h5diff segfaulted on compare of a NULL variable length string.
+
+ Improved h5diff compare of strings by adding a check for
+ NULL strings and setting the lengths to zero.
+
+ (ADB - 2017/07/25, HDFFV-10246)
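The guard can be sketched as a NULL-tolerant length helper (name illustrative):

```c
#include <string.h>

/* Treat a NULL variable-length string as empty (length 0) instead of
 * dereferencing it during the comparison. */
size_t vlstr_len(const char *s)
{
    return (s != NULL) ? strlen(s) : 0;
}
```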
+
+ - h5import
+
+ h5import crashed trying to import data from a subset of a dataset.
+
+ Improved h5import by adding the SUBSET keyword. h5import now uses
+ Count times Block as the size of the dimensions.
+ Added INPUT_B_ORDER keyword to old-style configuration files.
+ The import from h5dump function expects the binary files to use native
+ types (FILE '-b' option) in the binary file.
+
+ (ADB - 2017/06/15, HDFFV-10219)
+
+ - h5repack
+
+ h5repack did not maintain the creation order flag of the root
+ group.
+
+ Improved h5repack by reading the creation order and applying the
+ flag to the new root group. Also added arguments to set the
+ order and index direction, which applies to the traversing of the
+ original file, on the command line.
+
+ (ADB - 2017/05/26, HDFFV-8611)
+
+ - h5diff
+
+ h5diff failed to account for strpad type and null terminators
+ of char strings. Also, h5diff failed to account for string length
+ differences and would give a different result depending on file
+ order in the command line.
+
+      Improved h5diff compare of strings and arrays by adding a check
+      for string lengths and whether the strpad was null-filled.
+
+ (ADB - 2017/05/18, HDFFV-9055, HDFFV-10128)
+
+ High-Level APIs:
+ ------
+ - H5DOwrite_chunk() problems when overwriting an existing chunk with
+ no filters enabled.
+
+ When overwriting chunks and no filters were being used, the library would
+ fail (when asserts are enabled, e.g. debug builds) or incorrectly
+ insert additional chunks instead of overwriting (when asserts are not
+ enabled, e.g. production builds).
+
+ This has been fixed and a test was added to the hl/test_dset_opt test.
+
+ (DER - 2017/05/11, HDFFV-10187)
+
+ C++ APIs
+ --------
+ - Removal of memory leaks.
+
+ A private function was inadvertently called, causing memory leaks. This
+ is now fixed.
+
+      (BMR - 2018/03/12, reported by a user via email)
Testing
-------
- - Fixed a problem that caused tests using SWMR to occasionally fail when
- running "make check" using parallel make.
+ - Memory for three variables in testphdf5's coll_write_test was malloced
+ but not freed, leaking memory when running the test.
+
+ The variables' memory is now freed.
- (LRK, 2016/03/22, PR#338, PR#346, PR#358)
+ (LRK - 2018/03/12, HDFFV-10397)
+
+ - Refactored the testpar/t_bigio.c test to include ALARM macros
+
+      Changed the test to include the ALARM_ON and ALARM_OFF macros,
+      which are intended to prevent nightly test hangs that have been
+      observed with this particular parallel test example. The code was
+      also modified to simplify status reporting (only from MPI rank 0)
+      and to add status checking.
+
+ (RAW - 2017/11/08, HDFFV-10301)
Supported Platforms
===================
- Linux 2.6.32-573.18.1.el6.ppc64 gcc (GCC) 4.4.7 20120313 (Red Hat 4.4.7-4)
- #1 SMP ppc64 GNU/Linux g++ (GCC) 4.4.7 20120313 (Red Hat 4.4.7-4)
- (ostrich) GNU Fortran (GCC) 4.4.7 20120313
- (Red Hat 4.4.7-4)
+ Linux 2.6.32-696.16.1.el6.ppc64 gcc (GCC) 4.4.7 20120313 (Red Hat 4.4.7-18)
+ #1 SMP ppc64 GNU/Linux g++ (GCC) 4.4.7 20120313 (Red Hat 4.4.7-18)
+ (ostrich) GNU Fortran (GCC) 4.4.7 20120313 (Red Hat 4.4.7-18)
IBM XL C/C++ V13.1
IBM XL Fortran V15.1
- Linux 3.10.0-327.10.1.el7 GNU C (gcc), Fortran (gfortran), C++ (g++)
+ Linux 3.10.0-327.10.1.el7 GNU C (gcc), Fortran (gfortran), C++ (g++)
#1 SMP x86_64 GNU/Linux compilers:
- (kituo/moohan) Version 4.8.5 20150623 (Red Hat 4.8.5-4)
- Version 4.9.3, Version 5.2.0
+ (kituo/moohan) Version 4.8.5 20150623 (Red Hat 4.8.5-4)
+ Version 4.9.3, Version 5.2.0,
Intel(R) C (icc), C++ (icpc), Fortran (icc)
compilers:
- Version 15.0.3.187 Build 20150407
+ Version 17.0.0.098 Build 20160721
MPICH 3.1.4 compiled with GCC 4.9.3
SunOS 5.11 32- and 64-bit Sun C 5.12 SunOS_sparc
@@ -584,33 +1098,30 @@ Supported Platforms
Windows 7 x64 Visual Studio 2012 w/ Intel Fortran 15 (cmake)
Visual Studio 2013 w/ Intel Fortran 15 (cmake)
Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+ Visual Studio 2015 w/ Intel C, Fortran 2017 (cmake)
Visual Studio 2015 w/ MSMPI 8 (cmake)
- Cygwin(CYGWIN_NT-6.1 2.8.0(0.309/5/3)
+ Cygwin(CYGWIN_NT-6.1 2.8.0(0.309/5/3)
gcc and gfortran compilers (GCC 5.4.0)
(cmake and autotools)
Windows 10 Visual Studio 2015 w/ Intel Fortran 16 (cmake)
- Cygwin(CYGWIN_NT-6.1 2.8.0(0.309/5/3)
+ Cygwin(CYGWIN_NT-6.1 2.8.0(0.309/5/3)
gcc and gfortran compilers (GCC 5.4.0)
(cmake and autotools)
Windows 10 x64 Visual Studio 2015 w/ Intel Fortran 16 (cmake)
- Mac OS X Mt. Lion 10.8.5 Apple clang/clang++ version 5.1 from Xcode 5.1
- 64-bit gfortran GNU Fortran (GCC) 4.8.2
- (swallow/kite) Intel icc/icpc/ifort version 15.0.3
-
- Mac OS X Mavericks 10.9.5 Apple clang/clang++ version 6.0 from Xcode 6.2
- 64-bit gfortran GNU Fortran (GCC) 4.9.2
- (wren/quail) Intel icc/icpc/ifort version 15.0.3
-
Mac OS X Yosemite 10.10.5 Apple clang/clang++ version 6.1 from Xcode 7.0
64-bit gfortran GNU Fortran (GCC) 4.9.2
(osx1010dev/osx1010test) Intel icc/icpc/ifort version 15.0.3
- Mac OS X El Capitan 10.11.6 Apple clang/clang++ version 7.3 from Xcode 7.3
+ Mac OS X El Capitan 10.11.6 Apple clang/clang++ version 7.3.0 from Xcode 7.3
64-bit gfortran GNU Fortran (GCC) 5.2.0
- (osx1010dev/osx1010test) Intel icc/icpc/ifort version 16.0.2
+ (osx1011dev/osx1011test) Intel icc/icpc/ifort version 16.0.2
+
+ Mac OS Sierra 10.12.6 Apple LLVM version 8.1.0 (clang/clang++-802.0.42)
+ 64-bit gfortran GNU Fortran (GCC) 7.1.0
+ (swallow/kite) Intel icc/icpc/ifort version 17.0.2
Tested Configuration Features Summary
@@ -671,28 +1182,33 @@ Compiler versions for each platform are listed in the preceding
More Tested Platforms
=====================
-
The following platforms are not supported but have been tested for this release.
- Linux 2.6.32-573.22.1.el6 GNU C (gcc), Fortran (gfortran), C++ (g++)
+ Linux 2.6.32-573.22.1.el6 GNU C (gcc), Fortran (gfortran), C++ (g++)
#1 SMP x86_64 GNU/Linux compilers:
(mayll/platypus) Version 4.4.7 20120313
- Version 4.8.4
+ Version 4.9.3, 5.3.0, 6.2.0
PGI C, Fortran, C++ for 64-bit target on
x86-64;
- Version 16.10-0
+ Version 17.10-0
Intel(R) C (icc), C++ (icpc), Fortran (icc)
compilers:
- Version 15.0.3.187 (Build 20150407)
+ Version 17.0.4.196 Build 20170411
MPICH 3.1.4 compiled with GCC 4.9.3
Linux 3.10.0-327.18.2.el7 GNU C (gcc) and C++ (g++) compilers
#1 SMP x86_64 GNU/Linux Version 4.8.5 20150623 (Red Hat 4.8.5-4)
- (jelly) with NAG Fortran Compiler Release 6.1(Tozai)
+ (jelly) with NAG Fortran Compiler Release 6.1(Tozai)
+ GCC Version 7.1.0
+ OpenMPI 3.0.0-GCC-7.2.0-2.29
Intel(R) C (icc) and C++ (icpc) compilers
- Version 15.0.3.187 (Build 20150407)
+ Version 17.0.0.098 Build 20160721
with NAG Fortran Compiler Release 6.1(Tozai)
+ Linux 3.10.0-327.10.1.el7 MPICH 3.2 compiled with GCC 5.3.0
+ #1 SMP x86_64 GNU/Linux
+ (moohan)
+
Linux 2.6.32-573.18.1.el6.ppc64 MPICH mpich 3.1.4 compiled with
#1 SMP ppc64 GNU/Linux IBM XL C/C++ for Linux, V13.1
(ostrich) and IBM XL Fortran for Linux, V15.1
@@ -703,16 +1219,16 @@ The following platforms are not supported but have been tested for this release.
(cmake and autotools)
Fedora 24 4.7.2-201.fc24.x86_64 #1 SMP x86_64 x86_64 x86_64 GNU/Linux
- gcc, g++ (GCC) 6.1.1 20160621
+ gcc, g++ (GCC) 6.1.1 20160621
(Red Hat 6.1.1-3)
- GNU Fortran (GCC) 6.1.1 20160621
+ GNU Fortran (GCC) 6.1.1 20160621
(Red Hat 6.1.1-3)
(cmake and autotools)
Ubuntu 16.04.1 4.4.0-38-generic #57-Ubuntu SMP x86_64 GNU/Linux
- gcc, g++ (Ubuntu 5.4.0-6ubuntu1~16.04.2)
+ gcc, g++ (Ubuntu 5.4.0-6ubuntu1~16.04.2)
5.4.0 20160609
- GNU Fortran (Ubuntu 5.4.0-6ubuntu1~16.04.2)
+ GNU Fortran (Ubuntu 5.4.0-6ubuntu1~16.04.2)
5.4.0 20160609
(cmake and autotools)
@@ -720,11 +1236,18 @@ The following platforms are not supported but have been tested for this release.
Known Problems
==============
- At present, metadata cache images may not be generated by parallel
+ At present, metadata cache images may not be generated by parallel
applications. Parallel applications can read files with metadata cache
images, but since this is a collective operation, a deadlock is possible
if one or more processes do not participate.
- Known problems in previous releases can be found in the HISTORY*.txt files
- in the HDF5 source. Please report any new problems found to
+ Three tests fail with OpenMPI 3.0.0/GCC-7.2.0-2.29:
+ testphdf5 (ecdsetw, selnone, cchunk1, cchunk3, cchunk4, and actualio)
+ t_shapesame (sscontig2)
+        t_pflush1 (fails on exit)
+ The first two tests fail attempting collective writes.
+
+ Known problems in previous releases can be found in the HISTORY*.txt files
+ in the HDF5 source. Please report any new problems found to
help@hdfgroup.org.
+
diff --git a/release_docs/USING_HDF5_CMake.txt b/release_docs/USING_HDF5_CMake.txt
index ecf972d..169a06f 100644
--- a/release_docs/USING_HDF5_CMake.txt
+++ b/release_docs/USING_HDF5_CMake.txt
@@ -37,7 +37,7 @@ I. Preconditions
1. We suggest you obtain the latest CMake for windows from the Kitware
web site. The HDF5 1.10.x product requires a minimum CMake version
- of 3.2.2.
+ of 3.10.1.
2. You have installed the HDF5 library built with CMake, by executing
the HDF Install Utility (the *.msi file in the binary package for
@@ -101,10 +101,10 @@ These steps are described in more detail below.
* Unix Makefiles
* Visual Studio 12 2013
* Visual Studio 12 2013 Win64
- * Visual Studio 11 2012
- * Visual Studio 11 2012 Win64
* Visual Studio 14 2015
* Visual Studio 14 2015 Win64
+ * Visual Studio 15 2017
+ * Visual Studio 15 2017 Win64
<options> is:
* BUILD_TESTING:BOOL=ON
@@ -180,7 +180,7 @@ Given the preconditions in section I, create a CMakeLists.txt file at the
source root. Include the following text in the file:
##########################################################
-cmake_minimum_required (VERSION 3.2.2)
+cmake_minimum_required (VERSION 3.10)
project (HDF5MyApp C CXX)
set (LIB_TYPE STATIC) # or SHARED
@@ -194,7 +194,6 @@ set (LINK_LIBS ${LINK_LIBS} ${HDF5_C_${LIB_TYPE}_LIBRARY})
set (example hdf_example)
add_executable (${example} ${PROJECT_SOURCE_DIR}/${example}.c)
-TARGET_NAMING (${example} ${LIB_TYPE})
TARGET_C_PROPERTIES (${example} ${LIB_TYPE} " " " ")
target_link_libraries (${example} ${LINK_LIBS})
@@ -217,132 +216,116 @@ adjust the forward slash to double backslashes, except for the HDF_DIR
environment variable.
NOTE: this file is available at the HDF web site:
- http://www.hdfgroup.org/HDF5/release/cmakebuild.html
+ https://portal.hdfgroup.org/display/support/Building+HDF5+with+CMake
HDF5_Examples.cmake
+ HDF5_Examples_options.cmake
Also available at the HDF web site is a CMake application framework template.
You can quickly add files to the framework and execute the script to compile
your application with an installed HDF5 binary.
========================================================================
-ctest
+ctest use of HDF5_Examples.cmake and HDF5_Examples_options.cmake
========================================================================
-cmake_minimum_required(VERSION 3.2.2 FATAL_ERROR)
+cmake_minimum_required (VERSION 3.10)
###############################################################################################################
# This script will build and run the examples from a folder
# Execute from a command line:
-# ctest -S HDF5_Examples.cmake,OPTION=VALUE -C Release -V -O test.log
+# ctest -S HDF5_Examples.cmake,OPTION=VALUE -C Release -VV -O test.log
###############################################################################################################
-set (CTEST_CMAKE_GENERATOR "@CMAKE_GENERATOR@")
-set (CTEST_DASHBOARD_ROOT ${CTEST_SCRIPT_DIRECTORY})
+set(CTEST_CMAKE_GENERATOR "@CMAKE_GENERATOR@")
+if("@CMAKE_GENERATOR_TOOLSET@")
+ set(CMAKE_GENERATOR_TOOLSET "@CMAKE_GENERATOR_TOOLSET@")
+endif()
+set(CTEST_DASHBOARD_ROOT ${CTEST_SCRIPT_DIRECTORY})
# handle input parameters to script.
#INSTALLDIR - HDF5 root folder
#CTEST_CONFIGURATION_TYPE - Release, Debug, RelWithDebInfo
-#CTEST_SOURCE_NAME - name of source folder; HDF4Examples
-#STATIC_ONLY - Default is YES
-#FORTRAN_LIBRARIES - Default is NO
-#JAVA_LIBRARIES - Default is NO
-##NO_MAC_FORTRAN - set to TRUE to allow shared libs on a Mac)
-if (DEFINED CTEST_SCRIPT_ARG)
- # transform ctest script arguments of the form
- # script.ctest,var1=value1,var2=value2
- # to variables with the respective names set to the respective values
- string (REPLACE "," ";" script_args "${CTEST_SCRIPT_ARG}")
- foreach (current_var ${script_args})
- if ("${current_var}" MATCHES "^([^=]+)=(.+)$")
- set("${CMAKE_MATCH_1}" "${CMAKE_MATCH_2}")
- endif ()
- endforeach ()
-endif ()
-if (NOT DEFINED INSTALLDIR)
- set (INSTALLDIR "@CMAKE_INSTALL_PREFIX@")
-endif ()
-if (NOT DEFINED CTEST_CONFIGURATION_TYPE)
- set (CTEST_CONFIGURATION_TYPE "Release")
-endif ()
-if (NOT DEFINED CTEST_SOURCE_NAME)
- set (CTEST_SOURCE_NAME "HDF5Examples")
-endif ()
-if (NOT DEFINED STATIC_ONLY)
- set (STATICONLYLIBRARIES "YES")
-else ()
- set (STATICONLYLIBRARIES "NO")
-endif ()
-if (NOT DEFINED FORTRAN_LIBRARIES)
- set (FORTRANLIBRARIES "NO")
-else ()
- set (FORTRANLIBRARIES "YES")
-endif ()
-if (NOT DEFINED JAVA_LIBRARIES)
- set (JAVALIBRARIES "NO")
-else ()
- set (JAVALIBRARIES "YES")
-endif ()
+#CTEST_SOURCE_NAME - name of source folder; HDF5Examples
+if(DEFINED CTEST_SCRIPT_ARG)
+ # transform ctest script arguments of the form
+ # script.ctest,var1=value1,var2=value2
+ # to variables with the respective names set to the respective values
+ string(REPLACE "," ";" script_args "${CTEST_SCRIPT_ARG}")
+ foreach(current_var ${script_args})
+ if("${current_var}" MATCHES "^([^=]+)=(.+)$")
+ set("${CMAKE_MATCH_1}" "${CMAKE_MATCH_2}")
+ endif()
+ endforeach()
+endif()
+
+###################################################################
+### Following Line is one of [Release, RelWithDebInfo, Debug] #####
+set(CTEST_CONFIGURATION_TYPE "$ENV{CMAKE_CONFIG_TYPE}")
+if(NOT DEFINED CTEST_CONFIGURATION_TYPE)
+ set(CTEST_CONFIGURATION_TYPE "Release")
+endif()
+set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DCTEST_CONFIGURATION_TYPE:STRING=${CTEST_CONFIGURATION_TYPE}")
+##################################################################
+
+if(NOT DEFINED INSTALLDIR)
+ set(INSTALLDIR "@CMAKE_INSTALL_PREFIX@")
+endif()
+
+if(NOT DEFINED CTEST_SOURCE_NAME)
+ set(CTEST_SOURCE_NAME "HDF5Examples")
+endif()
+
+if(NOT DEFINED HDF_LOCAL)
+ set(CDASH_LOCAL "NO")
+else()
+ set(CDASH_LOCAL "YES")
+endif()
+if(NOT DEFINED CTEST_SITE)
+ set(CTEST_SITE "local")
+endif()
+if(NOT DEFINED CTEST_BUILD_NAME)
+ set(CTEST_BUILD_NAME "examples")
+endif()
+set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DSITE:STRING=${CTEST_SITE} -DBUILDNAME:STRING=${CTEST_BUILD_NAME}")
#TAR_SOURCE - name of tarfile
-#if (NOT DEFINED TAR_SOURCE)
-# set (CTEST_USE_TAR_SOURCE "HDF5Examples-1.10.1-Source")
-#endif ()
+#if(NOT DEFINED TAR_SOURCE)
+# set(CTEST_USE_TAR_SOURCE "HDF5Examples-1.10.7-Source")
+#endif()
###############################################################################################################
-# Adjust the following SET Commands as needed
-###############################################################################################################
-if (WIN32)
- if (${STATICONLYLIBRARIES})
- set (BUILD_OPTIONS "${BUILD_OPTIONS} -DBUILD_SHARED_LIBS:BOOL=OFF")
- endif ()
- set (ENV{HDF5_DIR} "${INSTALLDIR}/cmake")
- set (CTEST_BINARY_NAME ${CTEST_SOURCE_NAME}\\build)
- set (CTEST_SOURCE_DIRECTORY "${CTEST_DASHBOARD_ROOT}\\${CTEST_SOURCE_NAME}")
- set (CTEST_BINARY_DIRECTORY "${CTEST_DASHBOARD_ROOT}\\${CTEST_BINARY_NAME}")
-else (WIN32)
- if (${STATICONLYLIBRARIES})
- set (BUILD_OPTIONS "${BUILD_OPTIONS} -DBUILD_SHARED_LIBS:BOOL=OFF -DCMAKE_ANSI_CFLAGS:STRING=-fPIC")
- endif ()
- set (ENV{HDF5_DIR} "${INSTALLDIR}/share/cmake")
- set (ENV{LD_LIBRARY_PATH} "${INSTALLDIR}/lib")
- set (CTEST_BINARY_NAME ${CTEST_SOURCE_NAME}/build)
- set (CTEST_SOURCE_DIRECTORY "${CTEST_DASHBOARD_ROOT}/${CTEST_SOURCE_NAME}")
- set (CTEST_BINARY_DIRECTORY "${CTEST_DASHBOARD_ROOT}/${CTEST_BINARY_NAME}")
-endif(WIN32)
-if (${FORTRANLIBRARIES})
- set (BUILD_OPTIONS "${BUILD_OPTIONS} -DHDF_BUILD_FORTRAN:BOOL=ON")
-else ()
- set (BUILD_OPTIONS "${BUILD_OPTIONS} -DHDF_BUILD_FORTRAN:BOOL=OFF")
-endif ()
-if (${JAVALIBRARIES})
- set (BUILD_OPTIONS "${BUILD_OPTIONS} -DHDF_BUILD_JAVA:BOOL=ON")
-else ()
- set (BUILD_OPTIONS "${BUILD_OPTIONS} -DHDF_BUILD_JAVA:BOOL=OFF")
-endif ()
-set (BUILD_OPTIONS "${BUILD_OPTIONS} -DHDF5_PACKAGE_NAME:STRING=@HDF5_PACKAGE@@HDF_PACKAGE_EXT@")
+if(WIN32)
+ set(SITE_OS_NAME "Windows")
+ set(ENV{HDF5_DIR} "${INSTALLDIR}/cmake")
+ set(CTEST_BINARY_NAME ${CTEST_SOURCE_NAME}\\build)
+ set(CTEST_SOURCE_DIRECTORY "${CTEST_DASHBOARD_ROOT}\\${CTEST_SOURCE_NAME}")
+ set(CTEST_BINARY_DIRECTORY "${CTEST_DASHBOARD_ROOT}\\${CTEST_BINARY_NAME}")
+else()
+ set(ENV{HDF5_DIR} "${INSTALLDIR}/share/cmake")
+ set(ENV{LD_LIBRARY_PATH} "${INSTALLDIR}/lib")
+ set(CTEST_BINARY_NAME ${CTEST_SOURCE_NAME}/build)
+ set(CTEST_SOURCE_DIRECTORY "${CTEST_DASHBOARD_ROOT}/${CTEST_SOURCE_NAME}")
+ set(CTEST_BINARY_DIRECTORY "${CTEST_DASHBOARD_ROOT}/${CTEST_BINARY_NAME}")
+endif()
+if(${CDASH_LOCAL})
+ set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DCDASH_LOCAL:BOOL=ON")
+endif()
+set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_PACKAGE_NAME:STRING=@HDF5_PACKAGE@@HDF_PACKAGE_EXT@")
###############################################################################################################
-# For any comments please contact cdashhelp@hdfgroup.org
+# For any comments please contact help@hdfgroup.org
#
###############################################################################################################
-#-----------------------------------------------------------------------------
-# MAC machines need special option
-#-----------------------------------------------------------------------------
-if (APPLE)
- # Compiler choice
- execute_process (COMMAND xcrun --find cc OUTPUT_VARIABLE XCODE_CC OUTPUT_STRIP_TRAILING_WHITESPACE)
- execute_process (COMMAND xcrun --find c++ OUTPUT_VARIABLE XCODE_CXX OUTPUT_STRIP_TRAILING_WHITESPACE)
- set (ENV{CC} "${XCODE_CC}")
- set (ENV{CXX} "${XCODE_CXX}")
- if (NOT NO_MAC_FORTRAN)
- # Shared fortran is not supported, build static
- set (BUILD_OPTIONS "${BUILD_OPTIONS} -DBUILD_SHARED_LIBS:BOOL=OFF -DCMAKE_ANSI_CFLAGS:STRING=-fPIC")
- else ()
- set (BUILD_OPTIONS "${BUILD_OPTIONS} -DHDF_BUILD_FORTRAN:BOOL=OFF")
- endif ()
- set (BUILD_OPTIONS "${BUILD_OPTIONS} -DCTEST_USE_LAUNCHERS:BOOL=ON -DCMAKE_BUILD_WITH_INSTALL_RPATH:BOOL=OFF")
-endif ()
+#############################################################################################
+#### Change default configuration of options in config/cmake/cacheinit.cmake file ###
+#### format for file: set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DXXX:YY=ZZZZ") ###
+#############################################################################################
+if(WIN32)
+ include(${CTEST_SCRIPT_DIRECTORY}\\HDF5_Examples_options.cmake)
+else()
+ include(${CTEST_SCRIPT_DIRECTORY}/HDF5_Examples_options.cmake)
+endif()
#-----------------------------------------------------------------------------
set (CTEST_CMAKE_COMMAND "\"${CMAKE_COMMAND}\"")
@@ -416,6 +399,57 @@ endif ()
#-----------------------------------------------------------------------------
##############################################################################################################
+##############################################################################################################
+#### HDF5_Examples_options.cmake ###
+#### Change default configuration of options in config/cmake/cacheinit.cmake file ###
+##############################################################################################################
+#############################################################################################
+#### Change default configuration of options in config/cmake/cacheinit.cmake file ###
+#### format: set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DXXX:YY=ZZZZ") ###
+#### DEFAULT: ###
+#### BUILD_SHARED_LIBS:BOOL=OFF ###
+#### HDF_BUILD_C:BOOL=ON ###
+#### HDF_BUILD_CXX:BOOL=OFF ###
+#### HDF_BUILD_FORTRAN:BOOL=OFF ###
+#### HDF_BUILD_JAVA:BOOL=OFF ###
+#### BUILD_TESTING:BOOL=OFF ###
+#### HDF_ENABLE_PARALLEL:BOOL=OFF ###
+#### HDF_ENABLE_THREADSAFE:BOOL=OFF ###
+#############################################################################################
+
+### uncomment/comment and change the following lines for other configuration options
+### build with shared libraries
+#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DBUILD_SHARED_LIBS:BOOL=ON")
+
+#############################################################################################
+#### languages ####
+### disable C builds
+#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_BUILD_C:BOOL=OFF")
+
+### enable C++ builds
+#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_BUILD_CXX:BOOL=ON")
+
+### enable Fortran builds
+#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_BUILD_FORTRAN:BOOL=ON")
+
+### enable JAVA builds
+#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_BUILD_JAVA:BOOL=ON")
+
+#############################################################################################
+### enable parallel program builds
+#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_ENABLE_PARALLEL:BOOL=ON")
+
+#############################################################################################
+### enable threadsafe program builds
+#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_ENABLE_THREADSAFE:BOOL=ON")
+
+#############################################################################################
+### enable test program builds, requires reference files in testfiles subdirectory
+#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DBUILD_TESTING:BOOL=ON")
+#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DCOMPARE_TESTING:BOOL=ON")
+
+#############################################################################################
+
========================================================================
diff --git a/release_docs/USING_HDF5_VS.txt b/release_docs/USING_HDF5_VS.txt
index 3aaa56d..3019631 100644
--- a/release_docs/USING_HDF5_VS.txt
+++ b/release_docs/USING_HDF5_VS.txt
@@ -81,11 +81,11 @@ Using Visual Studio 2008 with HDF5 Libraries built with Visual Studio 2008
Many other common questions and hints are located online and being updated
in the HDF5 FAQ. For Windows-specific questions, please see:
- http://www.hdfgroup.org/HDF5/faq/windows.html
+ https://support.hdfgroup.org/HDF5/faq/windows.html
For all other general questions, you can look in the general FAQ:
- http://hdfgroup.org/HDF5-FAQ.html
+ https://support.hdfgroup.org/HDF5/HDF5-FAQ.html
************************************************************************
Please send email to help@hdfgroup.org for further assistance.