Diffstat (limited to 'release_docs')
-rw-r--r--   release_docs/HISTORY-1_10.txt          | 1257
-rw-r--r--   release_docs/INSTALL_CMake.txt         |  352
-rwxr-xr-x   release_docs/INSTALL_Cygwin.txt        |   20
-rwxr-xr-x   release_docs/RELEASE.txt               | 1229
-rw-r--r--   release_docs/USING_CMake_Examples.txt  |    2
-rw-r--r--   release_docs/USING_HDF5_CMake.txt      |  230
-rw-r--r--   release_docs/USING_HDF5_VS.txt         |   14
7 files changed, 1614 insertions, 1490 deletions
diff --git a/release_docs/HISTORY-1_10.txt b/release_docs/HISTORY-1_10.txt
index 52eb273..9887a54 100644
--- a/release_docs/HISTORY-1_10.txt
+++ b/release_docs/HISTORY-1_10.txt
@@ -3,12 +3,1269 @@ HDF5 History
This file contains development history of the HDF5 1.10 branch
+04. Release Information for hdf5-1.10.2
03. Release Information for hdf5-1.10.1
02. Release Information for hdf5-1.10.0-patch1
01. Release Information for hdf5-1.10.0
[Search on the string '%%%%' for section breaks of each release.]
+%%%%1.10.2%%%%
+
+HDF5 version 1.10.2 released on 2018-03-29
+================================================================================
+
+
+INTRODUCTION
+
+This document describes the differences between this release and the previous
+HDF5 release. It contains information on the platforms tested and known
+problems in this release. For more details check the HISTORY*.txt files in the
+HDF5 source.
+
+Note that documentation in the links below will be updated at the time of each
+final release.
+
+Links to HDF5 documentation can be found on The HDF5 web page:
+
+ https://portal.hdfgroup.org/display/HDF5/HDF5
+
+The official HDF5 releases can be obtained from:
+
+ https://www.hdfgroup.org/downloads/hdf5/
+
+Changes from Release to Release and New Features in the HDF5-1.10.x release series
+can be found at:
+
+ https://portal.hdfgroup.org/display/HDF5/HDF5+Application+Developer%27s+Guide
+
+If you have any questions or comments, please send them to the HDF Help Desk:
+
+ help@hdfgroup.org
+
+
+CONTENTS
+
+- New Features
+- Support for new platforms and languages
+- Bug Fixes since HDF5-1.10.1
+- Supported Platforms
+- Tested Configuration Features Summary
+- More Tested Platforms
+- Known Problems
+
+
+New Features
+============
+
+ Configuration and Build Systems:
+ --------------------------------
+ - CMake builds
+ --------------
+
+ - Changed minimum CMake required version to 3.10.
+
+ This change removed the need to support a copy of the FindMPI.cmake module,
+ which has been removed, along with its subfolder in the config/cmake_ext_mod
+ location.
+
+ (ADB - 2018/03/09)
+
+ - Added pkg-config file generation
+
+ Added pkg-config file generation for the C, C++, HL, and HL C++ libraries.
+ In addition, builds on Linux will create h5cc, h5c++, h5hlcc, and h5hlc++ scripts in the bin
+ directory that use the pkg-config files. The scripts can be used to build HDF5 C and C++
+      applications (i.e., similar to the compiler scripts produced by the Autotools builds).
+
+ (ADB - 2018/03/08, HDFFV-4359)
+
+    - Refactored the use of CMAKE_BUILD_TYPE into a new variable that accounts
+      for the type of generator in use.
+
+      Added new configuration macros that use the new HDF_BUILD_TYPE variable,
+      which is set correctly for the type of generator being used for the build.
+
+ (ADB - 2018/01/08, HDFFV-10385, HDFFV-10296)
+
+ - Autotools builds
+ ------------------
+
+ - Removed version-specific gcc/gfortran flags for version 4.0 (inclusive)
+ and earlier.
+
+ The config/gnu-flags file, which is sourced as a part of the configure
+ process, adds version-specific flags for use when building HDF5. Most of
+ these flags control warnings and do not affect the final product.
+
+ Flags for older versions of the compiler were consolidated into the
+ common flags section. Moving these flags simplifies maintenance of
+ the file.
+
+      The upshot is that building with ancient versions of gcc (<= 4.0)
+      may no longer work without hand-editing the file to remove the flags
+      not understood by that version of the compiler. Nothing should change
+      when building with gcc >= 4.1.
+
+ (DER - 2017/05/31, HDFFV-9937)
+
+ - -fno-omit-frame-pointer was added when building with debugging symbols
+ enabled.
+
+ Debugging symbols can be enabled independently of the overall build
+ mode in both the autotools and CMake. This allows (limited) debugging
+ of optimized code. Since many debuggers rely on the frame pointer,
+ we've disabled this optimization when debugging symbols are requested
+ (e.g.: via building with --enable-symbols).
+
+ (DER - 2017/05/31, HDFFV-10226)
+
+
+ Library:
+ --------
+ - Added an enumerated value to H5F_libver_t for H5Pset_libver_bounds().
+
+      Previously, the library defined only two values for H5F_libver_t and
+      supported only two pairs of (low, high) combinations derived from these
+      values, so the bounds setting via H5Pset_libver_bounds() was rather restricted.
+
+ Added an enumerated value (H5F_LIBVER_V18) to H5F_libver_t and
+ H5Pset_libver_bounds() now supports five pairs of (low, high) combinations
+ as derived from these values. This addition provides the user more
+ flexibility in setting bounds for object creation.
+
+ (VC - 2018/03/14)
+
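+      A minimal usage sketch of the new value (the file name and the chosen
+      (low, high) pair below are illustrative only; error checking omitted):
+
+          #include "hdf5.h"
+
+          int main(void)
+          {
+              /* Do not allow object formats newer than the 1.8.x range. */
+              hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
+              H5Pset_libver_bounds(fapl, H5F_LIBVER_EARLIEST, H5F_LIBVER_V18);
+
+              hid_t file = H5Fcreate("bounded.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);
+
+              H5Fclose(file);
+              H5Pclose(fapl);
+              return 0;
+          }
+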
+ - Added prefix option to VDS files.
+
+      Previously, VDS source files had to be in the active directory to be
+      found by the virtual file. Adding the option of a prefix, set on
+      the virtual file via a dataset access property list (DAPL), allows the
+      source files to be located at an absolute path or at a path relative
+      to the virtual file.
+      Private utility functions in the H5D and H5L packages were merged into
+      a single function in the H5F package.
+
+ New public APIs:
+ herr_t H5Pset_virtual_prefix(hid_t dapl_id, const char* prefix);
+ ssize_t H5Pget_virtual_prefix(hid_t dapl_id, char* prefix /*out*/, size_t size);
+ The prefix can also be set with an environment variable, HDF5_VDS_PREFIX.
+
+ (ADB - 2017/12/12, HDFFV-9724, HDFFV-10361)
+
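+      A sketch of the new calls (the prefix, file, and dataset names here are
+      hypothetical; error checking omitted):
+
+          hid_t dapl = H5Pcreate(H5P_DATASET_ACCESS);
+
+          /* Look for the VDS source files under /data/source instead of
+           * the active directory. */
+          H5Pset_virtual_prefix(dapl, "/data/source");
+
+          hid_t file = H5Fopen("virtual.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
+          hid_t dset = H5Dopen2(file, "vds_dset", dapl);
+
+          /* The prefix can be read back from the DAPL. */
+          char prefix[64];
+          H5Pget_virtual_prefix(dapl, prefix, sizeof(prefix));
+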
+ - H5FDdriver_query() API call added to the C library.
+
+ This new library call allows the user to query a virtual file driver
+ (VFD) for the feature flags it supports (listed in H5FDpublic.h).
+ This can be useful to determine if a VFD supports SWMR, for example.
+
+ Note that some VFDs have feature flags that may only be present
+ after a file has been created or opened (e.g.: the core VFD will
+ have the H5FD_FEAT_POSIX_COMPAT_HANDLE flag set if the backing
+ store is switched on). Since the new API call queries a generic VFD
+ unassociated with a file, these flags will never be returned.
+
+ (DER - 2017/05/31, HDFFV-10215)
+
+ - H5FD_FEAT_DEFAULT_VFD_COMPATIBLE VFD feature flag added to the C library.
+
+ This new feature flag indicates that the VFD is compatible with the
+ default VFD. VFDs that set this flag create single files that follow
+ the canonical HDF5 file format.
+
+ (DER - 2017/05/31, HDFFV-10214)
+
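+      A sketch tying the two additions above together: query a registered
+      VFD's feature flags with H5FDdriver_query() and test the new flag.
+      The choice of the sec2 driver is only an example; hdf5.h and stdio.h
+      are assumed to be included and error checking is omitted:
+
+          /* Report whether a VFD writes files in the canonical HDF5 format. */
+          static void check_vfd(hid_t driver_id)
+          {
+              unsigned long feat_flags = 0;
+
+              if (H5FDdriver_query(driver_id, &feat_flags) >= 0 &&
+                      (feat_flags & H5FD_FEAT_DEFAULT_VFD_COMPATIBLE))
+                  printf("VFD creates files in the canonical HDF5 format\n");
+          }
+
+          /* e.g. check_vfd(H5FD_SEC2) for the default POSIX (sec2) driver */
+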
+ - The H5I_REFERENCE value in the H5I_type_t enum (defined in H5Ipublic.h)
+ has been marked as deprecated.
+
+      This ID type value is not used in the C library; that is, there are no
+      hid_t values that are of ID type H5I_REFERENCE.
+
+ This enum value will be removed in a future major version of the library.
+ The code will remain unchanged in the HDF5 1.10.x releases and branches.
+
+ (DER - 2017/04/05, HDFFV-10252)
+
+
+ Parallel Library:
+ -----------------
+ - Enabled compression for parallel applications.
+
+      With this release, parallel applications can create and write compressed
+      datasets (or, more generally, datasets with filters such as Fletcher32 applied).
+
+ (EIP - 2018/03/29)
+
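+      A sketch of the newly possible pattern, assuming an MPI-enabled
+      (parallel) HDF5 build; the file name, dataset shape, and compression
+      level are illustrative and error checking is omitted:
+
+          #include <stdlib.h>
+          #include <mpi.h>
+          #include "hdf5.h"
+
+          int main(int argc, char **argv)
+          {
+              int rank, nprocs, i;
+
+              MPI_Init(&argc, &argv);
+              MPI_Comm_rank(MPI_COMM_WORLD, &rank);
+              MPI_Comm_size(MPI_COMM_WORLD, &nprocs);
+
+              /* Open the file for parallel access through the MPI-IO driver. */
+              hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
+              H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);
+              hid_t file = H5Fcreate("pzip.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);
+
+              /* One chunk per rank; gzip compression is now allowed here. */
+              hsize_t dims[2]  = {(hsize_t)nprocs, 1024};
+              hsize_t chunk[2] = {1, 1024};
+              hid_t   fspace   = H5Screate_simple(2, dims, NULL);
+              hid_t   dcpl     = H5Pcreate(H5P_DATASET_CREATE);
+              H5Pset_chunk(dcpl, 2, chunk);
+              H5Pset_deflate(dcpl, 6);
+              hid_t dset = H5Dcreate2(file, "data", H5T_NATIVE_INT, fspace,
+                                      H5P_DEFAULT, dcpl, H5P_DEFAULT);
+
+              /* Each rank selects and fills its own row. */
+              hsize_t start[2] = {(hsize_t)rank, 0};
+              hsize_t count[2] = {1, 1024};
+              H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
+              hid_t mspace = H5Screate_simple(2, count, NULL);
+              int  *buf    = (int *)malloc(1024 * sizeof(int));
+              for (i = 0; i < 1024; i++)
+                  buf[i] = rank;
+
+              /* Writes to filtered datasets must be collective in parallel. */
+              hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
+              H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE);
+              H5Dwrite(dset, H5T_NATIVE_INT, mspace, fspace, dxpl, buf);
+
+              free(buf);
+              H5Pclose(dxpl); H5Sclose(mspace); H5Dclose(dset); H5Pclose(dcpl);
+              H5Sclose(fspace); H5Fclose(file); H5Pclose(fapl);
+              MPI_Finalize();
+              return 0;
+          }
+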
+ - Addressed slow file close on some Lustre file systems.
+
+ Slow file close has been reported on some Lustre file systems.
+ While the ultimate cause is not understood fully, the proximate
+ cause appears to be long delays in MPI_File_set_size() calls at
+ file close and flush.
+
+ To minimize this problem pending a definitive diagnosis and fix,
+ PHDF5 has been modified to avoid MPI_File_set_size() calls when
+ possible. This is done by comparing the library's EOA (End of
+      Allocation) with the file system's EOF, and skipping the
+ MPI_File_set_size() call if the two match.
+
+ (JRM - 2018/03/29)
+
+ - Optimized parallel open/location of the HDF5 super-block.
+
+ Previous releases of PHDF5 required all parallel ranks to
+ search for the HDF5 superblock signature when opening the
+ file. As this is accomplished more or less as a synchronous
+ operation, a large number of processes can experience a
+ slowdown in the file open due to filesystem contention.
+
+ As a first step in improving the startup/file-open performance,
+ we allow MPI rank 0 of the associated MPI communicator to locate
+ the base offset of the super-block and then broadcast that result
+ to the remaining ranks in the parallel group. Note that this
+      approach is utilized ONLY during file opens which employ the MPIO
+      file driver, i.e., when H5Pset_fapl_mpio() has previously been called.
+
+      HDF5 parallel file operations which do not employ multiple ranks,
+      e.g. specifying MPI_COMM_SELF (whose MPI_Comm_size == 1)
+      as opposed to MPI_COMM_WORLD, will not be affected by this
+      optimization. Conversely, parallel file operations on subgroups
+      of MPI_COMM_WORLD are allowed to be run in parallel, with each
+      subgroup operating as an independent collection of processes.
+
+ (RAW - 2017/10/10, HDFFV-10294)
+
+ - Added large (>2GB) MPI-IO transfers.
+
+ Previous releases of PHDF5 would fail when attempting to
+ read or write greater than 2GB of data in a single IO operation.
+      This issue stems principally from an MPI API whose definitions
+      utilize 32-bit integers to describe the number of data elements
+      and the datatype that MPI should use to effect a data transfer.
+ Historically, HDF5 has invoked MPI-IO with the number of
+ elements in a contiguous buffer represented as the length
+ of that buffer in bytes.
+
+ Resolving the issue and thus enabling larger MPI-IO transfers
+ is accomplished first, by detecting when a user IO request would
+ exceed the 2GB limit as described above. Once a transfer request
+ is identified as requiring special handling, PHDF5 now creates a
+      derived datatype consisting of a vector of fixed-size blocks
+ which is in turn wrapped within a single MPI_Type_struct to
+ contain the vector and any remaining data. The newly created
+ datatype is then used in place of MPI_BYTE and can be used to
+ fulfill the original user request without encountering API
+ errors.
+
+ (RAW - 2017/09/10, HDFFV-8839)
+
+
+ C++ Library:
+ ------------
+ - The following C++ API wrappers have been added to the C++ Library:
+ + H5Lcreate_soft:
+ // Creates a soft link from link_name to target_name.
+ void link(const char *target_name, const char *link_name,...)
+ void link(const H5std_string& target_name,...)
+
+ + H5Lcreate_hard:
+ // Creates a hard link from new_name to curr_name.
+ void link(const char *curr_name, const Group& new_loc,...)
+ void link(const H5std_string& curr_name, const Group& new_loc,...)
+
+ // Creates a hard link from new_name to curr_name in same location.
+ void link(const char *curr_name, const hid_t same_loc,...)
+ void link(const H5std_string& curr_name, const hid_t same_loc,...)
+
+ Note: previous version of H5Location::link will be deprecated.
+
+ + H5Lcopy:
+        // Copy an object from a group or file to another.
+ void copyLink(const char *src_name, const Group& dst,...)
+ void copyLink(const H5std_string& src_name, const Group& dst,...)
+
+        // Copy an object from a group or file to the same location.
+ void copyLink(const char *src_name, const char *dst_name,...)
+ void copyLink(const H5std_string& src_name,...)
+
+ + H5Lmove:
+ // Rename an object in a group or file to a new location.
+ void moveLink(const char* src_name, const Group& dst,...)
+ void moveLink(const H5std_string& src_name, const Group& dst,...)
+
+ // Rename an object in a group or file to the same location.
+ void moveLink(const char* src_name, const char* dst_name,...)
+ void moveLink(const H5std_string& src_name,...)
+
+      Note: the previous version of H5Location::move will be deprecated.
+
+ + H5Ldelete:
+ // Removes the specified link from this location.
+ void unlink(const char *link_name,
+ const LinkAccPropList& lapl = LinkAccPropList::DEFAULT)
+ void unlink(const H5std_string& link_name,
+ const LinkAccPropList& lapl = LinkAccPropList::DEFAULT)
+
+      Note: an additional parameter was added to the previous H5Location::unlink.
+
+ + H5Tencode and H5Tdecode:
+ // Creates a binary object description of this datatype.
+ void DataType::encode() - C API H5Tencode()
+
+ // Returns the decoded type from the binary object description.
+ DataType::decode() - C API H5Tdecode()
+ ArrayType::decode() - C API H5Tdecode()
+ CompType::decode() - C API H5Tdecode()
+ DataType::decode() - C API H5Tdecode()
+ EnumType::decode() - C API H5Tdecode()
+ FloatType::decode() - C API H5Tdecode()
+ IntType::decode() - C API H5Tdecode()
+ StrType::decode() - C API H5Tdecode()
+ VarLenType::decode() - C API H5Tdecode()
+
+ + H5Lget_info:
+ // Returns the information of the named link.
+ H5L_info_t getLinkInfo(const H5std_string& link_name,...)
+
+ (BMR - 2018/03/11, HDFFV-10149)
+
+ - Added class LinkCreatPropList for link create property list.
+
+ (BMR - 2018/03/11, HDFFV-10149)
+
+ - Added overloaded functions H5Location::createGroup to take a link
+ creation property list.
+ Group createGroup(const char* name, const LinkCreatPropList& lcpl)
+ Group createGroup(const H5std_string& name, const LinkCreatPropList& lcpl)
+
+ (BMR - 2018/03/11, HDFFV-10149)
+
+    - A document was added to the HDF5 C++ API Reference Manual to show the
+      mapping from C APIs to C++ wrappers. It can be reached from the main
+      page of the C++ API Reference Manual.
+
+ (BMR - 2017/10/17, HDFFV-10151)
+
+
+ Java Library:
+ ----------------
+ - Wrapper added for enabling the error stack.
+
+ H5error_off would disable the error stack reporting. In order
+ to re-enable the reporting, the error stack info needs to be
+ saved so that H5error_on can revert state.
+
+ (ADB - 2018/03/13, HDFFV-10412)
+
+ - Wrappers were added for the following C APIs:
+ H5Pset_evict_on_close
+ H5Pget_evict_on_close
+ H5Pset_chunk_opts
+ H5Pget_chunk_opts
+ H5Pset_efile_prefix
+ H5Pget_efile_prefix
+ H5Pset_virtual_prefix
+ H5Pget_virtual_prefix
+
+ (ADB - 2017/12/20)
+
+ - The H5I_REFERENCE value in the H5I_type_t enum (defined in H5Ipublic.h)
+      has been marked as deprecated.
+
+ JNI code which refers to this value will be removed in a future
+ major version of the library. The code will remain unchanged in the
+ 1.10.x releases and branches.
+
+ See the C library section, above, for further information.
+
+ (HDFFV-10252, DER, 2017/04/05)
+
+
+ Tools:
+ ------
+ - h5diff has a new option to display error stack.
+
+ Updated h5diff with the --enable-error-stack argument, which
+ enables the display of the hdf5 error stack. This completes the
+ improvement to the main tools: h5copy, h5diff, h5dump, h5ls and
+ h5repack.
+
+ (ADB - 2017/08/30, HDFFV-9774)
+
+
+Support for new platforms, languages and compilers.
+===================================================
+ - None
+
+Bug Fixes since HDF5-1.10.1 release
+==================================
+
+ Library
+ -------
+ - The data read after a direct chunk write to a chunked dataset with
+ one chunk was incorrect.
+
+ The problem was due to the passing of a null dataset pointer to
+ the insert callback for the chunk index in the routine
+      H5D__chunk_direct_write() in H5Dchunk.c.
+      The dataset was a single-chunked dataset which uses the
+      single chunk index when the latest format is enabled on file creation.
+ The single chunk index was the only index that used this pointer
+ in the insert callback.
+
+ Passed the dataset pointer to the insert callback for the chunk
+ index in H5D__chunk_direct_write().
+
+ (VC - 2018/03/20, HDFFV-10425)
+
+ - Added public routine H5DOread_chunk to the high-level C library.
+
+ The patch for H5DOwrite_chunk() to write an entire chunk to the file
+ directly was contributed by GE Healthcare and integrated by The HDF Group
+ developers.
+
+ (VC - 2017/05/19, HDFFV-9934)
+
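+      A fragment showing the new routine next to the existing
+      H5DOwrite_chunk() (assumes an open 1-D chunked dataset 'dset' whose
+      chunks hold 1024 ints; error checking omitted):
+
+          #include "hdf5_hl.h"
+
+          int      chunk_in[1024], chunk_out[1024];
+          uint32_t filter_mask = 0;
+          hsize_t  offset[1]   = {0};     /* logical offset of the chunk */
+
+          /* Write the chunk bytes directly, bypassing the filter pipeline. */
+          H5DOwrite_chunk(dset, H5P_DEFAULT, 0, offset, sizeof(chunk_in), chunk_in);
+
+          /* Read the stored chunk back directly; filter_mask reports which
+           * filters (if any) were applied to the stored chunk. */
+          H5DOread_chunk(dset, H5P_DEFAULT, offset, &filter_mask, chunk_out);
+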
+ - Freeing of object header after failed checksum verification.
+
+ It was discovered that the object header (in H5Ocache.c) was not released properly
+ when the checksum verification failed and a re-load of the object
+ header was needed.
+
+      Freed the object header that failed the checksum verification only
+      after the new object header was reloaded, deserialized, and set up.
+
+ (VC - 2018/03/14, HDFFV-10209)
+
+ - Updated H5Pset_evict_on_close in H5Pfapl.c
+
+ Changed the minor error number from H5E_CANTSET to H5E_UNSUPPORTED for
+      the parallel library.
+
+ (ADB - 2018/03/06, HDFFV-10414)
+
+    - Fixed a problem with a utility function that could not handle lowercase
+      Windows drive letters.
+
+      Added a call to convert the drive letter to uppercase.
+
+ (ADB - 2017/12/18, HDFFV-10307)
+
+ - Fixed H5Sencode() bug when the number of elements selected was > 2^32.
+
+ H5Sencode() incorrectly encodes dataspace selection with number of
+ elements exceeding 2^32. When decoding such selection via H5Sdecode(),
+ the number of elements in the decoded dataspace is not the same as
+ what is encoded. This problem exists for H5S_SEL_HYPER and
+ H5S_SEL_POINTS encoding.
+
+      The cause of the problem is that the library uses 32 bits to
+      encode counts and block offsets for the selection.
+      The solution is to use the original 32-bit encodings if possible,
+      but use a different way to encode the selection if more than 32 bits are needed.
+      See details in the RFC: H5Sencode/H5Sdecode Format Change in
+ https://bitbucket.hdfgroup.org/projects/HDFFV/repos/hdf5doc/browse/RFCs/HDF5_Library/H5SencodeFormatChange.
+
+ (VC - 2017/11/28, HDFFV-9947)
+
+    - Fixed filter plugin handling in H5PL.c and H5Z.c so that it does not
+      require the availability of dependent libraries (e.g., szip or zlib).
+
+ It was discovered that the dynamic loading process used by
+ filter plugins had issues with library dependencies.
+
+      The CMake build process was changed to use LINK INTERFACE keywords, which
+      allowed the HDF5 C library to make dependent libraries private. The
+ filter plugin libraries no longer require dependent libraries
+ (such as szip or zlib) to be available.
+
+ (ADB - 2017/11/16, HDFFV-10328)
+
+ - Fixed rare object header corruption bug.
+
+ In certain cases, such as when converting large attributes to dense
+ storage, an error could occur which would either fail an assertion or
+ cause file corruption. Fixed and added test.
+
+ (NAF - 2017/11/14, HDFFV-10274)
+
+ - Updated H5Zfilter_avail in H5Z.c.
+
+ The public function checked for plugins, while the private
+ function did not.
+
+ Modified H5Zfilter_avail and private function, H5Z_filter_avail.
+ Moved check for plugin from public to private function. Updated
+ H5P__set_filter due to change in H5Z_filter_avail. Updated tests.
+
+ (ADB - 2017/10/10, HDFFV-10297, HDFFV-10319)
+
+    - h5dump produced a SEGFAULT when dumping a corrupted file.
+
+      The behavior was due to an error in the internal function H5HL_offset_into().
+
+      (1) Fixed H5HL_offset_into() to return an error when the offset exceeds the
+          heap data block size.
+      (2) Fixed other places in the library that call this routine to detect
+          the error return.
+
+ (VC - 2017/08/30, HDFFV-10216)
+
+ - Fixes for paged aggregation feature.
+
+      Skip the test in test/fheap.c when:
+      (1) multi/split drivers are used and
+      (2) free space is persisted or the paged aggregation strategy is used
+
+ (VC, 2017/07/10)
+
+ Changes made based on RFC review comments:
+ (1) Added maximum value for file space page size
+ (2) Dropped check for page end metadata threshold
+ (3) Removed "can_shrink" and "shrink" callbacks for small section class
+
+ (VC - 2017/06/09)
+
+    - Fixed an infinite loop in H5VM_power2up().
+
+      The function H5VM_power2up() returns the next power of 2
+      for n. When n exceeds 2^63, it overflows and becomes 0, causing
+      an infinite loop.
+
+      The fix ensures that the function checks for n >= 2^63
+      and returns 0.
+
+ (VC - 2017/07/10, HDFFV-10217)
+
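+      The shape of the fix, as an illustration only (not the library's
+      internal code):
+
+          #include <stdint.h>
+
+          /* Next power of 2 that is >= n.  Mirroring the fix described above,
+           * n >= 2^63 returns 0 so the doubling loop cannot overflow. */
+          static uint64_t next_power2up(uint64_t n)
+          {
+              uint64_t v = 1;
+
+              if (n >= ((uint64_t)1 << 63))
+                  return 0;
+              while (v < n)
+                  v <<= 1;    /* cannot overflow because of the check above */
+              return v;
+          }
+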
+    - Fixed H5Ocopy not working with open identifiers.
+
+      Changes were made so that raw data for dataset objects is copied from
+      cached information when possible, instead of flushing objects to the
+      file and reading them back in again.
+
+ (VC - 2017/07/05, HDFFV-7853)
+
+ - An uninitialized struct could cause a memory access error when using
+ variable-length or reference types in a compressed, chunked dataset.
+
+ A struct containing a callback function pointer and a pointer to some
+ associated data was used before initialization. This could cause a
+ memory access error and system crash. This could only occur under
+      unusual conditions when using variable-length and reference types in
+ a compressed, chunked dataset.
+
+ On recent versions of Visual Studio, when built in debug mode, the
+ debug heap will complain and cause a crash if the code in question
+ is executed (this will cause the objcopy test to fail).
+
+ (DER - 2017/11/21, HDFFV-10330)
+
+ - Fixed collective metadata writes on file close.
+
+ It was discovered that metadata was being written twice as part of
+ the parallel file close behavior, once independently and once
+ collectively.
+
+ A fix for this error was included as part of the parallel compression
+ feature but remained undocumented here.
+
+ (RAW - 2017/12/01, HDFFV-10272)
+
+ - If an HDF5 file contains a filter pipeline message with a 'number of
+ filters' field that exceeds the maximum number of allowed filters,
+ the error handling code will attempt to dereference a NULL pointer.
+
+ This issue was reported to The HDF Group as issue #CVE-2017-17505.
+ https://security-tracker.debian.org/tracker/CVE-2017-17505
+      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17505
+
+ NOTE: The HDF5 C library cannot produce such a file. This condition
+ should only occur in a corrupt (or deliberately altered) file
+ or a file created by third-party software.
+
+ This problem arose because the error handling code assumed that
+ the 'number of filters' field implied that a dynamic array of that
+ size had already been created and that the cleanup code should
+ iterate over that array and clean up each element's resources. If
+      an error occurred before the array had been allocated, this would
+      not be true.
+
+ This has been changed so that the number of filters is set to
+ zero on errors. Additionally, the filter array traversal in the
+ error handling code now requires that the filter array not be NULL.
+
+ (DER - 2018/02/06, HDFFV-10354)
+
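+      The general shape of the fix, as an illustrative pattern only (not the
+      HDF5 decode code): reset the count and pointer on error so cleanup has
+      nothing to walk, and never traverse a NULL array.
+
+          #include <stdlib.h>
+
+          struct pipeline {
+              unsigned   nused;   /* number of filters actually decoded       */
+              char     **names;   /* per-filter resources; NULL until alloc'd */
+          };
+
+          static void pipeline_mark_failed(struct pipeline *p)
+          {
+              p->nused = 0;       /* the declared count is not to be trusted  */
+              p->names = NULL;
+          }
+
+          static void pipeline_cleanup(struct pipeline *p)
+          {
+              unsigned u;
+
+              if (p->names == NULL)       /* failed before allocation */
+                  return;
+              for (u = 0; u < p->nused; u++)
+                  free(p->names[u]);
+              free(p->names);
+              p->names = NULL;
+              p->nused = 0;
+          }
+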
+ - If an HDF5 file contains a filter pipeline message which contains
+ a 'number of filters' field that exceeds the actual number of
+ filters in the message, the HDF5 C library will read off the end of
+ the read buffer.
+
+ This issue was reported to The HDF Group as issue #CVE-2017-17506.
+ https://security-tracker.debian.org/tracker/CVE-2017-17506
+      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17506
+
+ NOTE: The HDF5 C library cannot produce such a file. This condition
+ should only occur in a corrupt (or deliberately altered) file
+ or a file created by third-party software.
+
+ The problem was fixed by passing the buffer size with the buffer
+ and ensuring that the pointer cannot be incremented off the end
+ of the buffer. A mismatch between the number of filters declared
+ and the actual number of filters will now invoke normal HDF5
+ error handling.
+
+ (DER - 2018/02/26, HDFFV-10355)
+
+ - If an HDF5 file contains a malformed compound datatype with a
+ suitably large offset, the type conversion code can run off
+ the end of the type conversion buffer, causing a segmentation
+ fault.
+
+ This issue was reported to The HDF Group as issue #CVE-2017-17507.
+      https://security-tracker.debian.org/tracker/CVE-2017-17507
+      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17507
+
+ NOTE: The HDF5 C library cannot produce such a file. This condition
+ should only occur in a corrupt (or deliberately altered) file
+ or a file created by third-party software.
+
+ THE HDF GROUP WILL NOT FIX THIS BUG AT THIS TIME
+
+ Fixing this problem would involve updating the publicly visible
+ H5T_conv_t function pointer typedef and versioning the API calls
+ which use it. We normally only modify the public API during
+ major releases, so this bug will not be fixed at this time.
+
+ (DER - 2018/02/26, HDFFV-10356)
+
+ - If an HDF5 file contains a malformed compound type which contains
+ a member of size zero, a division by zero error will occur while
+ processing the type.
+
+ This issue was reported to The HDF Group as issue #CVE-2017-17508.
+ https://security-tracker.debian.org/tracker/CVE-2017-17508
+      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17508
+
+ NOTE: The HDF5 C library cannot produce such a file. This condition
+ should only occur in a corrupt (or deliberately altered) file
+ or a file created by third-party software.
+
+ Checking for zero before dividing fixes the problem. Instead of the
+ division by zero, the normal HDF5 error handling is invoked.
+
+ (DER - 2018/02/26, HDFFV-10357)
+
+ - If an HDF5 file contains a malformed symbol table node that declares
+ it contains more symbols than it actually contains, the library
+ can run off the end of the metadata cache buffer while processing
+ the symbol table node.
+
+ This issue was reported to The HDF Group as issue #CVE-2017-17509.
+ https://security-tracker.debian.org/tracker/CVE-2017-17509
+      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17509
+
+ NOTE: The HDF5 C library cannot produce such a file. This condition
+ should only occur in a corrupt (or deliberately altered) file
+ or a file created by third-party software.
+
+ Performing bounds checks on the buffer while processing fixes the
+ problem. Instead of the segmentation fault, the normal HDF5 error
+ handling is invoked.
+
+ (DER - 2018/03/12, HDFFV-10358)
+
+ - Fixed permissions passed to open(2) on file create.
+
+ On Windows, the POSIX permissions passed to open(2) when creating files
+ were only incidentally correct. They are now set to the correct value of
+ (_S_IREAD | _S_IWRITE).
+
+ On other platforms, the permissions were set to a mix of 666, 644, and
+ 000. They are now set uniformly to 666.
+
+ (DER - 2017/04/28, HDFFV-9877)
+
+ - The H5FD_FEAT_POSIX_COMPAT_HANDLE flag is no longer used to determine
+ if a virtual file driver (VFD) is compatible with SWMR.
+
+ Use of this VFD feature flag was not in line with the documentation in
+ the public H5FDpublic.h file. In particular, it was being used as a
+      proxy for determining if SWMR I/O is allowed. This is unnecessary as we
+      already have a feature flag for this (H5FD_FEAT_SUPPORTS_SWMR_IO).
+
+ (DER - 2017/05/31, HDFFV-10214)
+
+
+ Configuration
+ -------------
+ - CMake changes
+
+ - Updated CMake commands configuration.
+
+ A number of improvements were made to the CMake commands. Most
+      changes simplify usage or eliminate unused constructs. Other
+      changes improve cross-platform support.
+
+ (ADB - 2018/02/01, HDFFV-10398)
+
+ - Corrected usage of CMAKE_BUILD_TYPE variable.
+
+      The use of CMAKE_BUILD_TYPE is incorrect for multi-config
+      generators (Visual Studio and XCode) and is optional for single
+      config generators. Created a new macro to check the
+      GLOBAL PROPERTY -> GENERATOR_IS_MULTI_CONFIG property.
+      Created two new HDF variables, HDF_BUILD_TYPE and HDF_CFG_BUILD_TYPE.
+      The default for these variables is "Release".
+
+ (ADB - 2018/01/10, HDFFV-10385)
+
+ - Added replacement of fortran flags if using static CRT.
+
+ Added TARGET_STATIC_CRT_FLAGS call to HDFUseFortran.cmake file in
+ config/cmake_ext_mod folder.
+
+ (ADB - 2018/01/08, HDFFV-10334)
+
+
+ - The hdf5 library used shared szip and zlib, which needlessly required
+ applications to link with the same szip and zlib libraries.
+
+ Changed the target_link_libraries commands to use the static libs.
+ Removed improper link duplication of szip and zlib.
+ Adjusted the link dependencies and the link interface values of
+ the target_link_libraries commands.
+
+ (ADB - 2017/11/14, HDFFV-10329)
+
+ - CMake MPI
+
+ CMake implementation for MPI was problematic and would create incorrect
+ MPI library references in the hdf5 libraries.
+
+ Reworked the CMake MPI code to properly create CMake targets. Also merged
+ the latest CMake FindMPI.cmake changes to the local copy. This is necessary
+ until HDF changes the CMake minimum to 3.9 or greater.
+
+ (ADB - 2017/11/02, HDFFV-10321)
+
+ - Corrected FORTRAN_HAVE_C_LONG_DOUBLE processing in the autotools.
+
+ A bug in the autotools Fortran processing code always set the
+ FORTRAN_HAVE_C_LONG_DOUBLE variable to be true regardless of
+ whether or not a C long double type was present.
+
+ This would cause compilation failures on platforms where a C
+ long double type was not available and the Fortran wrappers
+ were being built.
+
+ (DER - 2017/07/05, HDFFV-10247)
+
+ - The deprecated --enable-production and --enable-debug configure options
+ failed to emit errors when passed an empty string
+ (e.g.: --enable-debug="").
+
+ Due to the way we checked for these options being set, it was possible
+ to avoid the error message and continue configuration if an empty string
+ was passed to the option.
+
+ Any use of --enable-production or --enable-debug will now halt the
+ configuration step and emit a helpful error message
+ (use --enable-build-mode=debug|production instead).
+
+ (DER - 2017/07/05, HDFFV-10248)
+
+ - CMake
+
+      Too many commands for the POST_BUILD step caused the command line to be
+      too long on Windows.
+
+ Changed foreach of copy command to use a custom command with the
+ use of the HDFTEST_COPY_FILE macro.
+
+ (ADB - 2017/07/12, HDFFV-10254)
+
+ - CMake test execution environment
+
+      The parallel HDF5 test 't_pread' assumed the use of autotools
+      and the directory structure associated with that testing approach.
+      Modified the test code to check whether the 'h5jam' utility can be
+      found in the same directory as the test executable (which is the
+      directory structure preferred by CMake) and, if found, to invoke
+      the tool directly rather than using a relative path.
+
+ (RAW - 2017/11/03, HDFFV-10318)
+
+    - Fortran compilation failed for xlf and CMake builds.
+
+      Fixed the CMake shared library build for H5match_types and modules.
+
+ (MSB - 2017/12/19, HDFFV-10363)
+
+    - Shared libraries failed testing on OS X with Fortran enabled with CMake.
+
+      Fixed by removing the F77 use of EQUIVALENCE and COMMON, replacing
+      them with modules. Updated CMake.
+
+ (MSB - 2017/12/07, HDFFV-10223)
+
+ - The bin/trace script now emits an error code on problems and autogen.sh
+ will fail if bin/trace fails.
+
+ The bin/trace script adds tracing functionality to public HDF5 API calls.
+ It is only of interest to developers who modify the HDF5 source code.
+ Previously, bin/trace just wrote an error message to stdout when it
+ encountered problems, so autogen.sh processing did not halt and a broken
+ version of the library could be built. The script will now return an
+ error code when it encounters problems, and autogen.sh will fail.
+
+ This only affects users who run autogen.sh to rebuild the Autotools files,
+ which is not necessary to build HDF5 from source in official releases of the
+ library. CMake users are unaffected as bin/trace is not run via CMake
+ at this time.
+
+ (DER - 2017/04/25, HDFFV-10178)
+
+ - FC_BASENAME was changed from gfortran40 to gfortran in a few places.
+
+ In the autotools, FC_BASENAME was set to gfortran40 in a few locations
+ (config/gnu-fflags and config/freebsd). This was probably a historical
+ artifact and did not seem to affect many users.
+
+ The value is now correctly set to gfortran.
+
+ (DER - 2017/05/26, HDFFV-10249)
+
+ - The ar flags were changed to -cr (was: -cru)
+
+ The autotools set the flags for ar to -cru by default. The -u flag,
+ which allows selective replacement of only the members which have
+ changed, raises warnings on some platforms, so the flags are now set to
+ -cr via AR_FLAGS in configure.ac. This causes the static library to
+ always be completely recreated from the object files on each build.
+
+ (DER - 2017/11/15, HDFFV-10428)
+
+
+ Fortran
+ --------
+ - Fixed compilation errors when using Intel 18 Fortran compilers
+ (MSB - 2017/11/3, HDFFV-10322)
+
+ Tools
+ -----
+ - h5clear
+
+      An enhancement to the tool for setting a file's stored EOA.
+
+ It was discovered that a crashed file's stored EOA in the superblock
+ was smaller than the actual file's EOF. When the file was reopened
+ and closed, the library truncated the file to the stored EOA.
+
+      Added an option to the tool to set the file's stored EOA in the
+      superblock to the maximum of (EOA, EOF) + increment.
+ An option was also added to print the file's EOA and EOF.
+
+ (VC - 2018/03/14, HDFFV-10360)
+
+ - h5repack
+
+      h5repack changed the chunk parameters when a change of layout was not
+      specified and a filter was applied.
+
+      The HDFFV-10297 and HDFFV-10319 work reworked the h5repack and h5diff
+      code in the tools library. The check for an existing layout was
+      incorrectly placed inside an if block and not executed. The check was
+      moved into the normal path of the function.
+
+ (ADB - 2018/02/21, HDFFV-10412)
+
+ - h5dump
+
+ The tools library will hide the error stack during file open.
+
+      While this is almost always preferable, there are reasons to enable
+      display of the error stack when a tool will not open a file. An
+      optional argument was added to --enable-error-stack to provide this
+      use case. As an optional argument it does not affect the existing
+      operation of --enable-error-stack. h5dump is the only tool to
+      implement this change.
+
+ (ADB - 2018/02/15, HDFFV-10384)
+
+ - h5dump
+
+ h5dump would output an indented blank line in the filters section.
+
+ h5dump overused the h5tools_simple_prefix function, which is a
+ function intended to account for the data index (x,y,z) option.
+ Removed the function call for header information.
+
+ (ADB - 2018/01/25, HDFFV-10396)
+
+ - h5repack
+
+ h5repack incorrectly searched internal object table for name.
+
+      h5repack would search the table of objects for a name; if the
+      name did not match, it tried to determine if the name without a
+      leading slash would match. The logic was flawed: the table
+      stored names (paths) without a leading slash and did a strstr
+      of the table path against the name.
+      The assumption was that if there was a difference of one then
+      it was a match; however, "pressure" would match "/pressure" as
+      well as "/pressure1", "/pressure2", etc. The logic was changed to
+      remove any leading slash and then do a full compare of the name.
+
+ (ADB - 2018/01/18, HDFFV-10393)
+
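+      An illustrative sketch of the corrected comparison (not the h5repack
+      source):
+
+          #include <string.h>
+
+          /* Strip at most one leading slash from each side, then require an
+           * exact match, so "pressure" no longer matches "/pressure1". */
+          static int paths_match(const char *table_path, const char *name)
+          {
+              if (*table_path == '/')
+                  table_path++;
+              if (*name == '/')
+                  name++;
+              return strcmp(table_path, name) == 0;
+          }
+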
+ - h5repack
+
+      h5repack failed to handle command line parameters for custom filters.
+
+      User-defined filter parameter conversions would fail when integers were
+      represented on the command line with character strings
+      longer than 9 characters. Increased the local variable array for storing
+      the current command line parameter to prevent buffer overflows.
+
+ (ADB - 2018/01/17, HDFFV-10392)
+
+ - h5diff
+
+      h5diff segfaulted when comparing VL strings against fixed-length strings.
+
+ Reworked solution for HDFFV-8625 and HDFFV-8639. Implemented the check
+ for string objects of same type in the diff_can_type function by
+ adding an if(tclass1 == H5T_STRING) block. This "if block" moves the
+ same check that was added for attributes to this function, which is
+ used by all object types. This function handles complex type structures.
+      Also added a new test file in h5diffgentest for testing this issue
+ and removed the temporary files used in the test scripts.
+
+ (ADB - 2018/01/04, HDFFV-8745)
+
+ - h5repack
+
+ h5repack failed to copy a dataset with existing filter.
+
+ Reworked code for h5repack and h5diff code in the tools library. Added
+ improved error handling, cleanup of resources and checks of calls.
+ Modified H5Zfilter_avail and private function, H5Z_filter_avail.
+ Moved check for plugin from public to private function. Updated
+ H5P__set_filter due to change in H5Z_filter_avail. Updated tests.
+ Note, h5repack output display has changed to clarify the individual
+ steps of the repack process. The output indicates if an operation
+ applies to all objects. Lines with notation and no information
+ have been removed.
+
+ (ADB - 2017/10/10, HDFFV-10297, HDFFV-10319)
+
+ - h5repack
+
+ h5repack always set the User Defined filter flag to H5Z_FLAG_MANDATORY.
+
+      Added another parameter to the 'UD=' option to set the flag: the default
+      is '0' (H5Z_FLAG_MANDATORY); the other choice is '1' (H5Z_FLAG_OPTIONAL).
+
+ (ADB - 2017/08/31, HDFFV-10269)
+
+ - h5ls
+
+      h5ls generated an error on the stack when it encountered an H5S_NULL
+      dataspace.
+
+ Adding checks for H5S_NULL before calling H5Sis_simple (located
+ in the h5tools_dump_mem function) fixed the issue.
+
+ (ADB - 2017/08/17, HDFFV-10188)
+
+ - h5repack
+
+ Added tests to h5repack.sh.in to verify options added for paged
+ aggregation work as expected.
+
+ (VC - 2017/08/03)
+
+ - h5dump
+
+ h5dump segfaulted on output of XML file.
+
+      The function that escaped strings used the full buffer length
+      instead of just the length of the replacement string in a
+      strncpy call. Using the correct length fixed the issue.
+
+ (ADB - 2017/08/01, HDFFV-10256)
+
+ - h5diff
+
+ h5diff segfaulted on compare of a NULL variable length string.
+
+ Improved h5diff compare of strings by adding a check for
+ NULL strings and setting the lengths to zero.
+
+ (ADB - 2017/07/25, HDFFV-10246)
+
+ - h5import
+
+ h5import crashed trying to import data from a subset of a dataset.
+
+      Improved h5import by adding the SUBSET keyword. h5import now uses
+      the Count times the Block as the size of the dimensions.
+ Added INPUT_B_ORDER keyword to old-style configuration files.
+ The import from h5dump function expects the binary files to use native
+ types (FILE '-b' option) in the binary file.
+
+ (ADB - 2017/06/15, HDFFV-10219)
+
+ - h5repack
+
+ h5repack did not maintain the creation order flag of the root
+ group.
+
+ Improved h5repack by reading the creation order and applying the
+ flag to the new root group. Also added arguments to set the
+ order and index direction, which applies to the traversing of the
+ original file, on the command line.
+
+ (ADB - 2017/05/26, HDFFV-8611)
+
+ - h5diff
+
+ h5diff failed to account for strpad type and null terminators
+ of char strings. Also, h5diff failed to account for string length
+ differences and would give a different result depending on file
+ order in the command line.
+
+      Improved the h5diff compare of strings and arrays by adding a check for
+      string lengths and for whether the strpad is null-filled.
+
+ (ADB - 2017/05/18, HDFFV-9055, HDFFV-10128)
+
+ High-Level APIs:
+    ----------------
+ - H5DOwrite_chunk() problems when overwriting an existing chunk with
+ no filters enabled.
+
+ When overwriting chunks and no filters were being used, the library would
+ fail (when asserts are enabled, e.g. debug builds) or incorrectly
+ insert additional chunks instead of overwriting (when asserts are not
+ enabled, e.g. production builds).
+
+ This has been fixed and a test was added to the hl/test_dset_opt test.
+
+ (DER - 2017/05/11, HDFFV-10187)
+
+ C++ APIs
+ --------
+ - Removal of memory leaks.
+
+ A private function was inadvertently called, causing memory leaks. This
+ is now fixed.
+
+      (BMR - 2018/03/12 - user-reported via email)
+
+ Testing
+ -------
+ - Memory for three variables in testphdf5's coll_write_test was malloced
+ but not freed, leaking memory when running the test.
+
+ The variables' memory is now freed.
+
+ (LRK - 2018/03/12, HDFFV-10397)
+
+ - Refactored the testpar/t_bigio.c test to include ALARM macros
+
+ Changed the test to include the ALARM_ON and ALARM_OFF macros which
+ are intended to prevent nightly test hangs that have been observed
+ with this particular parallel test example. The code was also modified to
+      simplify status reporting (only from MPI rank 0), and additional
+      status checking was added.
+
+ (RAW - 2017/11/08, HDFFV-10301)
+
+
+Supported Platforms
+===================
+
+ Linux 2.6.32-696.16.1.el6.ppc64 gcc (GCC) 4.4.7 20120313 (Red Hat 4.4.7-18)
+ #1 SMP ppc64 GNU/Linux g++ (GCC) 4.4.7 20120313 (Red Hat 4.4.7-18)
+ (ostrich) GNU Fortran (GCC) 4.4.7 20120313 (Red Hat 4.4.7-18)
+ IBM XL C/C++ V13.1
+ IBM XL Fortran V15.1
+
+ Linux 3.10.0-327.10.1.el7 GNU C (gcc), Fortran (gfortran), C++ (g++)
+ #1 SMP x86_64 GNU/Linux compilers:
+ (kituo/moohan) Version 4.8.5 20150623 (Red Hat 4.8.5-4)
+ Version 4.9.3, Version 5.2.0,
+ Intel(R) C (icc), C++ (icpc), Fortran (icc)
+ compilers:
+ Version 17.0.0.098 Build 20160721
+ MPICH 3.1.4 compiled with GCC 4.9.3
+
+ SunOS 5.11 32- and 64-bit Sun C 5.12 SunOS_sparc
+ (emu) Sun Fortran 95 8.6 SunOS_sparc
+ Sun C++ 5.12 SunOS_sparc
+
+ Windows 7 Visual Studio 2012 w/ Intel Fortran 15 (cmake)
+ Visual Studio 2013 w/ Intel Fortran 15 (cmake)
+ Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+
+ Windows 7 x64 Visual Studio 2012 w/ Intel Fortran 15 (cmake)
+ Visual Studio 2013 w/ Intel Fortran 15 (cmake)
+ Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+ Visual Studio 2015 w/ Intel C, Fortran 2017 (cmake)
+ Visual Studio 2015 w/ MSMPI 8 (cmake)
+ Cygwin(CYGWIN_NT-6.1 2.8.0(0.309/5/3)
+ gcc and gfortran compilers (GCC 5.4.0)
+ (cmake and autotools)
+
+ Windows 10 Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+ Cygwin(CYGWIN_NT-6.1 2.8.0(0.309/5/3)
+ gcc and gfortran compilers (GCC 5.4.0)
+ (cmake and autotools)
+
+ Windows 10 x64 Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+
+ Mac OS X Yosemite 10.10.5 Apple clang/clang++ version 6.1 from Xcode 7.0
+ 64-bit gfortran GNU Fortran (GCC) 4.9.2
+ (osx1010dev/osx1010test) Intel icc/icpc/ifort version 15.0.3
+
+ Mac OS X El Capitan 10.11.6 Apple clang/clang++ version 7.3.0 from Xcode 7.3
+ 64-bit gfortran GNU Fortran (GCC) 5.2.0
+ (osx1011dev/osx1011test) Intel icc/icpc/ifort version 16.0.2
+
+ Mac OS Sierra 10.12.6 Apple LLVM version 8.1.0 (clang/clang++-802.0.42)
+ 64-bit gfortran GNU Fortran (GCC) 7.1.0
+ (swallow/kite) Intel icc/icpc/ifort version 17.0.2
+
+
+Tested Configuration Features Summary
+=====================================
+
+ In the tables below
+ y = tested
+ n = not tested in this release
+ C = Cluster
+ W = Workstation
+ x = not working in this release
+ dna = does not apply
+ ( ) = footnote appears below second table
+ <blank> = testing incomplete on this feature or platform
+
+Platform C F90/ F90 C++ zlib SZIP
+ parallel F2003 parallel
+Solaris2.11 32-bit n y/y n y y y
+Solaris2.11 64-bit n y/n n y y y
+Windows 7 y y/y n y y y
+Windows 7 x64 y y/y y y y y
+Windows 7 Cygwin n y/n n y y y
+Windows 7 x64 Cygwin n y/n n y y y
+Windows 10 y y/y n y y y
+Windows 10 x64 y y/y n y y y
+Mac OS X Mountain Lion 10.8.5 64-bit n y/y n y y y
+Mac OS X Mavericks 10.9.5 64-bit n y/y n y y y
+Mac OS X Yosemite 10.10.5 64-bit n y/y n y y y
+Mac OS X El Capitan 10.11.6 64-bit n y/y n y y y
+CentOS 7.2 Linux 2.6.32 x86_64 PGI n y/y n y y y
+CentOS 7.2 Linux 2.6.32 x86_64 GNU y y/y y y y y
+CentOS 7.2 Linux 2.6.32 x86_64 Intel n y/y n y y y
+Linux 2.6.32-573.18.1.el6.ppc64 n y/y n y y y
+
+
+Platform Shared Shared Shared Thread-
+ C libs F90 libs C++ libs safe
+Solaris2.11 32-bit y y y y
+Solaris2.11 64-bit y y y y
+Windows 7 y y y y
+Windows 7 x64 y y y y
+Windows 7 Cygwin n n n y
+Windows 7 x64 Cygwin n n n y
+Windows 10 y y y y
+Windows 10 x64 y y y y
+Mac OS X Mountain Lion 10.8.5 64-bit y n y y
+Mac OS X Mavericks 10.9.5 64-bit y n y y
+Mac OS X Yosemite 10.10.5 64-bit y n y y
+Mac OS X El Capitan 10.11.6 64-bit y n y y
+CentOS 7.2 Linux 2.6.32 x86_64 PGI y y y n
+CentOS 7.2 Linux 2.6.32 x86_64 GNU y y y y
+CentOS 7.2 Linux 2.6.32 x86_64 Intel y y y n
+Linux 2.6.32-573.18.1.el6.ppc64 y y y n
+
+Compiler versions for each platform are listed in the preceding
+"Supported Platforms" table.
+
+
+More Tested Platforms
+=====================
+The following platforms are not supported but have been tested for this release.
+
+ Linux 2.6.32-573.22.1.el6 GNU C (gcc), Fortran (gfortran), C++ (g++)
+ #1 SMP x86_64 GNU/Linux compilers:
+ (mayll/platypus) Version 4.4.7 20120313
+ Version 4.9.3, 5.3.0, 6.2.0
+ PGI C, Fortran, C++ for 64-bit target on
+ x86-64;
+ Version 17.10-0
+ Intel(R) C (icc), C++ (icpc), Fortran (icc)
+ compilers:
+ Version 17.0.4.196 Build 20170411
+ MPICH 3.1.4 compiled with GCC 4.9.3
+
+ Linux 3.10.0-327.18.2.el7 GNU C (gcc) and C++ (g++) compilers
+ #1 SMP x86_64 GNU/Linux Version 4.8.5 20150623 (Red Hat 4.8.5-4)
+ (jelly) with NAG Fortran Compiler Release 6.1(Tozai)
+ GCC Version 7.1.0
+ OpenMPI 3.0.0-GCC-7.2.0-2.29
+ Intel(R) C (icc) and C++ (icpc) compilers
+ Version 17.0.0.098 Build 20160721
+ with NAG Fortran Compiler Release 6.1(Tozai)
+
+ Linux 3.10.0-327.10.1.el7 MPICH 3.2 compiled with GCC 5.3.0
+ #1 SMP x86_64 GNU/Linux
+ (moohan)
+
+ Linux 2.6.32-573.18.1.el6.ppc64 MPICH mpich 3.1.4 compiled with
+ #1 SMP ppc64 GNU/Linux IBM XL C/C++ for Linux, V13.1
+ (ostrich) and IBM XL Fortran for Linux, V15.1
+
+ Debian 8.4 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1 x86_64 GNU/Linux
+ gcc, g++ (Debian 4.9.2-10) 4.9.2
+ GNU Fortran (Debian 4.9.2-10) 4.9.2
+ (cmake and autotools)
+
+ Fedora 24 4.7.2-201.fc24.x86_64 #1 SMP x86_64 x86_64 x86_64 GNU/Linux
+ gcc, g++ (GCC) 6.1.1 20160621
+ (Red Hat 6.1.1-3)
+ GNU Fortran (GCC) 6.1.1 20160621
+ (Red Hat 6.1.1-3)
+ (cmake and autotools)
+
+ Ubuntu 16.04.1 4.4.0-38-generic #57-Ubuntu SMP x86_64 GNU/Linux
+ gcc, g++ (Ubuntu 5.4.0-6ubuntu1~16.04.2)
+ 5.4.0 20160609
+ GNU Fortran (Ubuntu 5.4.0-6ubuntu1~16.04.2)
+ 5.4.0 20160609
+ (cmake and autotools)
+
+
+Known Problems
+==============
+
+ At present, metadata cache images may not be generated by parallel
+ applications. Parallel applications can read files with metadata cache
+ images, but since this is a collective operation, a deadlock is possible
+ if one or more processes do not participate.
+
+ Three tests fail with OpenMPI 3.0.0/GCC-7.2.0-2.29:
+ testphdf5 (ecdsetw, selnone, cchunk1, cchunk3, cchunk4, and actualio)
+ t_shapesame (sscontig2)
+ t_pflush1/fails on exit
+ The first two tests fail attempting collective writes.
+
+ Known problems in previous releases can be found in the HISTORY*.txt files
+ in the HDF5 source. Please report any new problems found to
+ help@hdfgroup.org.
+
+
%%%%1.10.1%%%%
HDF5 version 1.10.1 released on 2017-04-27
diff --git a/release_docs/INSTALL_CMake.txt b/release_docs/INSTALL_CMake.txt
index a2d209a..3a69022 100644
--- a/release_docs/INSTALL_CMake.txt
+++ b/release_docs/INSTALL_CMake.txt
@@ -12,7 +12,6 @@ Section IV: Further considerations
Section V: Options for building HDF5 Libraries with CMake command line
Section VI: CMake option defaults for HDF5
Section VII: User Defined Options for HDF5 Libraries with CMake
-Section VIII: Options for platform configuration files
************************************************************************
@@ -35,7 +34,8 @@ CMake version
Note:
To change the install prefix from the platform defaults initialize
- the CMake variable, CMAKE_INSTALL_PREFIX.
+ the CMake variable, CMAKE_INSTALL_PREFIX. Users of build scripts
+ will use the INSTALLDIR option.
========================================================================
@@ -89,14 +89,14 @@ To build HDF5 with the SZIP and ZLIB external libraries you will need to:
5. From the "myhdfstuff" directory execute the CTest Script with the
following options:
+ On 32-bit Windows with Visual Studio 2017, execute:
+ ctest -S HDF5config.cmake,BUILD_GENERATOR=VS2017 -C Release -VV -O hdf5.log
+ On 64-bit Windows with Visual Studio 2017, execute:
+ ctest -S HDF5config.cmake,BUILD_GENERATOR=VS201764 -C Release -VV -O hdf5.log
On 32-bit Windows with Visual Studio 2015, execute:
ctest -S HDF5config.cmake,BUILD_GENERATOR=VS2015 -C Release -VV -O hdf5.log
On 64-bit Windows with Visual Studio 2015, execute:
ctest -S HDF5config.cmake,BUILD_GENERATOR=VS201564 -C Release -VV -O hdf5.log
- On 32-bit Windows with Visual Studio 2013, execute:
- ctest -S HDF5config.cmake,BUILD_GENERATOR=VS2013 -C Release -VV -O hdf5.log
- On 64-bit Windows with Visual Studio 2013, execute:
- ctest -S HDF5config.cmake,BUILD_GENERATOR=VS201364 -C Release -VV -O hdf5.log
On Linux and Mac, execute:
ctest -S HDF5config.cmake,BUILD_GENERATOR=Unix -C Release -VV -O hdf5.log
The supplied build scripts are versions of the above.
@@ -167,6 +167,14 @@ To build HDF5 with the SZIP and ZLIB external libraries you will need to:
By default the installation will create the bin, include, lib and cmake
folders in the <install destination directory>/HDF_Group/HDF5/1.10."X"
+ The <install destination directory> depends on the build platform;
+ Windows will set the default to:
+ C:/Program Files/HDF_Group/HDF5/1.10."X"
+ Linux will set the default to:
+ "myhdfstuff/HDF_Group/HDF5/1.10."X"
+ The default can be changed by adding ",INSTALLDIR=<my new dir>" to the
+ "ctest -S HDF5config.cmake..." command. For example on linux:
+ ctest -S HDF5config.cmake,INSTALLDIR=/usr/local/myhdf5,BUILD_GENERATOR=Unix -C Release -VV -O hdf5.log
========================================================================
@@ -610,6 +618,7 @@ HDF_TEST_EXPRESS "Control testing framework (0-3)"
HDF5_TEST_VFD "Execute tests with different VFDs" OFF
HDF5_USE_16_API_DEFAULT "Use the HDF5 1.6.x API by default" OFF
HDF5_USE_18_API_DEFAULT "Use the HDF5 1.8.x API by default" OFF
+HDF5_USE_110_API_DEFAULT "Use the HDF5 1.10.x API by default" ON
HDF5_USE_FOLDERS "Enable folder grouping of projects in IDEs." ON
HDF5_WANT_DATA_ACCURACY "IF data accuracy is guaranteed during data conversions" ON
HDF5_WANT_DCONV_EXCEPTION "exception handling functions is checked during data conversions" ON
@@ -635,6 +644,11 @@ else ()
H5_DEFAULT_PLUGINDIR "/usr/local/hdf5/lib/plugin"
endif ()
+NOTE:
+  The BUILD_STATIC_EXECS ("Build Static Executables") option is only valid
+  on some Unix operating systems. It adds the "-static" flag to cflags. This
+  flag is not available on Windows, and some modern Linux systems will
+  ignore the flag.
========================================================================
@@ -651,332 +665,6 @@ UserMacros.cmake file. Then enable the option to the CMake configuration,
build and test process.
========================================================================
-VIII. Options for Platform Configuration Files
-========================================================================
-
-Below is the HDF5config.cmake and HDF5options.cmake ctest scripts.
-Execute:
- ctest -S HDF5config.cmake,BUILD_GENERATOR=xxx -C Release -VV -O hdf5.log
-The same scripts can be used on Linux, Mac OSX or a Windows machine by
-adding an option (${CTEST_SCRIPT_ARG}) to the platform configuration script.
-
-
-#############################################################################################
-### ${CTEST_SCRIPT_ARG} is of the form OPTION=VALUE ###
-### BUILD_GENERATOR required [Unix, VS2017, VS201764, VS2015, VS201564, VS2013, VS201364] ###
-### ctest -S HDF5config.cmake,BUILD_GENERATOR=VS201764 -C Release -VV -O hdf5.log ###
-#############################################################################################
-
-cmake_minimum_required (VERSION 3.10)
-############################################################################
-# Usage:
-# ctest -S HDF5config.cmake,OPTION=VALUE -C Release -VV -O test.log
-# where valid options for OPTION are:
-# BUILD_GENERATOR - The cmake build generator:
-# Unix * Unix Makefiles
-# VS2017 * Visual Studio 15 2017
-# VS201764 * Visual Studio 15 2017 Win64
-# VS2015 * Visual Studio 14 2015
-# VS201564 * Visual Studio 14 2015 Win64
-# VS2013 * Visual Studio 12 2013
-# VS201364 * Visual Studio 12 2013 Win64
-#
-# INSTALLDIR - root folder where hdf5 is installed
-# CTEST_CONFIGURATION_TYPE - Release, Debug, etc
-# CTEST_SOURCE_NAME - source folder
-##############################################################################
-
-set (CTEST_SOURCE_VERSION "1.11.0")
-set (CTEST_SOURCE_VERSEXT "")
-
-##############################################################################
-# handle input parameters to script.
-#BUILD_GENERATOR - which CMake generator to use, required
-#INSTALLDIR - HDF5-1.10.0 root folder
-#CTEST_CONFIGURATION_TYPE - Release, Debug, RelWithDebInfo
-#CTEST_SOURCE_NAME - name of source folder; HDF5-1.10.0
-if (DEFINED CTEST_SCRIPT_ARG)
- # transform ctest script arguments of the form
- # script.ctest,var1=value1,var2=value2
- # to variables with the respective names set to the respective values
- string (REPLACE "," ";" script_args "${CTEST_SCRIPT_ARG}")
- foreach (current_var ${script_args})
- if ("${current_var}" MATCHES "^([^=]+)=(.+)$")
- set ("${CMAKE_MATCH_1}" "${CMAKE_MATCH_2}")
- endif ()
- endforeach ()
-endif ()
-
-# build generator must be defined
-if (NOT DEFINED BUILD_GENERATOR)
- message (FATAL_ERROR "BUILD_GENERATOR must be defined - Unix, VS2017, or VS201764, VS2015, VS201564, VS2013, VS201364")
-endif ()
-
-###################################################################
-### Following Line is one of [Release, RelWithDebInfo, Debug] #####
-set (CTEST_CONFIGURATION_TYPE "$ENV{CMAKE_CONFIG_TYPE}")
-###################################################################
-
-if (NOT DEFINED INSTALLDIR)
- if (WIN32)
- set (INSTALLDIR "C:/Program Files/HDF_Group/HDF5/${CTEST_SOURCE_VERSION}")
- else ()
- set (INSTALLDIR "${CTEST_SCRIPT_DIRECTORY}/HDF_Group/HDF5/${CTEST_SOURCE_VERSION}")
- endif ()
-endif ()
-if (NOT DEFINED CTEST_CONFIGURATION_TYPE)
- set (CTEST_CONFIGURATION_TYPE "Release")
-endif ()
-if (NOT DEFINED CTEST_SOURCE_NAME)
- set (CTEST_SOURCE_NAME "hdf5-${CTEST_SOURCE_VERSION}${CTEST_SOURCE_VERSEXT}")
-endif ()
-if (NOT DEFINED STATIC_ONLY)
- set (STATICONLYLIBRARIES "YES")
-else ()
- set (STATICONLYLIBRARIES "NO")
-endif ()
-if (NOT DEFINED FORTRAN_LIBRARIES)
- set (FORTRANLIBRARIES "NO")
-else ()
- set(FORTRANLIBRARIES "YES")
-endif ()
-if (NOT DEFINED JAVA_LIBRARIES)
- set (JAVALIBRARIES "NO")
-else ()
- set (JAVALIBRARIES "YES")
-endif ()
-
-set (CTEST_BINARY_NAME "build")
-set (CTEST_DASHBOARD_ROOT "${CTEST_SCRIPT_DIRECTORY}")
-if (WIN32)
- set (CTEST_SOURCE_DIRECTORY "${CTEST_DASHBOARD_ROOT}\\${CTEST_SOURCE_NAME}")
- set (CTEST_BINARY_DIRECTORY "${CTEST_DASHBOARD_ROOT}\\${CTEST_BINARY_NAME}")
-else ()
- set (CTEST_SOURCE_DIRECTORY "${CTEST_DASHBOARD_ROOT}/${CTEST_SOURCE_NAME}")
- set (CTEST_BINARY_DIRECTORY "${CTEST_DASHBOARD_ROOT}/${CTEST_BINARY_NAME}")
-endif ()
-
-###################################################################
-######### Following describes compiler ############
-if (WIN32)
- set (SITE_OS_NAME "Windows")
- set (SITE_OS_VERSION "WIN7")
- if (${BUILD_GENERATOR} STREQUAL "VS201764")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 15 2017 Win64")
- set (SITE_OS_BITS "64")
- set (SITE_COMPILER_NAME "vs2017")
- set (SITE_COMPILER_VERSION "15")
- elseif (${BUILD_GENERATOR} STREQUAL "VS2017")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 15 2017")
- set (SITE_OS_BITS "32")
- set (SITE_COMPILER_NAME "vs2017")
- set (SITE_COMPILER_VERSION "15")
- elseif (${BUILD_GENERATOR} STREQUAL "VS201564")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 14 2015 Win64")
- set (SITE_OS_BITS "64")
- set (SITE_COMPILER_NAME "vs2015")
- set (SITE_COMPILER_VERSION "14")
- elseif (${BUILD_GENERATOR} STREQUAL "VS2015")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 14 2015")
- set (SITE_OS_BITS "32")
- set (SITE_COMPILER_NAME "vs2015")
- set (SITE_COMPILER_VERSION "14")
- elseif (${BUILD_GENERATOR} STREQUAL "VS201364")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 12 2013 Win64")
- set (SITE_OS_BITS "64")
- set (SITE_COMPILER_NAME "vs2013")
- set (SITE_COMPILER_VERSION "12")
- elseif (${BUILD_GENERATOR} STREQUAL "VS2013")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 12 2013")
- set (SITE_OS_BITS "32")
- set (SITE_COMPILER_NAME "vs2013")
- set (SITE_COMPILER_VERSION "12")
- elseif (${BUILD_GENERATOR} STREQUAL "VS201264")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 11 2012 Win64")
- set (SITE_OS_BITS "64")
- set (SITE_COMPILER_NAME "vs2012")
- set (SITE_COMPILER_VERSION "11")
- elseif (${BUILD_GENERATOR} STREQUAL "VS2012")
- set (CTEST_CMAKE_GENERATOR "Visual Studio 11 2012")
- set (SITE_OS_BITS "32")
- set (SITE_COMPILER_NAME "vs2012")
- set (SITE_COMPILER_VERSION "11")
- else ()
- message (FATAL_ERROR "Invalid BUILD_GENERATOR must be - Unix, VS2017, or VS201764, VS2015, VS201564, VS2013, VS201364")
- endif ()
-## Set the following to unique id your computer ##
- set (CTEST_SITE "WIN7${BUILD_GENERATOR}.XXXX")
-else ()
- set (CTEST_CMAKE_GENERATOR "Unix Makefiles")
-## Set the following to unique id your computer ##
- if (APPLE)
- set (CTEST_SITE "MAC.XXXX")
- else ()
- set (CTEST_SITE "LINUX.XXXX")
- endif ()
- if (APPLE)
- execute_process (COMMAND xcrun --find cc OUTPUT_VARIABLE XCODE_CC OUTPUT_STRIP_TRAILING_WHITESPACE)
- execute_process (COMMAND xcrun --find c++ OUTPUT_VARIABLE XCODE_CXX OUTPUT_STRIP_TRAILING_WHITESPACE)
- set (ENV{CC} "${XCODE_CC}")
- set (ENV{CXX} "${XCODE_CXX}")
- set (CTEST_USE_LAUNCHERS 1)
- set (RR_WARNINGS_COMMON "-Wno-format-nonliteral -Wno-cast-align -Wno-unused -Wno-unused-variable -Wno-unused-function -Wno-self-assign -Wno-unused-parameter -Wno-sign-compare")
- set (RR_WARNINGS_C "${RR_WARNINGS_COMMON} -Wno-deprecated-declarations -Wno-uninitialized")
- set (RR_WARNINGS_CXX "${RR_WARNINGS_COMMON} -Woverloaded-virtual -Wshadow -Wwrite-strings -Wc++11-compat")
- set (RR_FLAGS_COMMON "-g -O0 -fstack-protector-all -D_FORTIFY_SOURCE=2")
- set (RR_FLAGS_C "${RR_FLAGS_COMMON}")
- set (RR_FLAGS_CXX "${RR_FLAGS_COMMON}")
- set (ENV{CFLAGS} "${RR_WARNINGS_C} ${RR_FLAGS_C}")
- set (ENV{CXXFLAGS} "${RR_WARNINGS_CXX} ${RR_FLAGS_CXX}")
- endif ()
-endif ()
-###################################################################
-
-###################################################################
-######### Following is for submission to CDash ############
-###################################################################
-set (MODEL "Experimental")
-###################################################################
-
-###################################################################
-##### Following controls CDash submission #####
-#set (LOCAL_SUBMIT "TRUE")
-##### Following controls test process #####
-#set (LOCAL_SKIP_TEST "TRUE")
-#set (LOCAL_MEMCHECK_TEST "TRUE")
-#set (LOCAL_COVERAGE_TEST "TRUE")
-##### Following controls cpack command #####
-#set (LOCAL_NO_PACKAGE "TRUE")
-##### Following controls source update #####
-#set (LOCAL_UPDATE "TRUE")
-set (REPOSITORY_URL "https://git@bitbucket.hdfgroup.org/scm/hdffv/hdf5.git")
-set (REPOSITORY_BRANCH "develop")
-
-#uncomment to use a compressed source file: *.tar on linux or mac *.zip on windows
-#set(CTEST_USE_TAR_SOURCE "${CTEST_SOURCE_VERSION}")
-###################################################################
-
-###################################################################
-if (${STATICONLYLIBRARIES})
- set (ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DBUILD_SHARED_LIBS:BOOL=OFF")
- ######### Following describes computer ############
- ## following is optional to describe build ##
- set (SITE_BUILDNAME_SUFFIX "STATIC")
-endif ()
-###################################################################
-#### fortran ####
-if (${FORTRANLIBRARIES})
- set (ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_BUILD_FORTRAN:BOOL=ON")
- ### enable Fortran 2003 depends on HDF5_BUILD_FORTRAN
- set (ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_ENABLE_F2003:BOOL=ON")
-else ()
- set (ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_BUILD_FORTRAN:BOOL=OFF")
- ### enable Fortran 2003 depends on HDF5_BUILD_FORTRAN
- set (ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_ENABLE_F2003:BOOL=OFF")
-endif ()
-#### java ####
-if (${JAVALIBRARIES})
- set (ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_BUILD_JAVA:BOOL=ON")
-else ()
- set (ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_BUILD_JAVA:BOOL=OFF")
-endif ()
-
-### change install prefix
-set (ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DCMAKE_INSTALL_PREFIX:PATH=${INSTALLDIR}")
-set (ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DCTEST_CONFIGURATION_TYPE:STRING=$ENV{CMAKE_CONFIG_TYPE}")
-
-###################################################################
-
-if (WIN32)
- set (BINFILEBASE "HDF5-${CTEST_SOURCE_VERSION}${CTEST_SOURCE_VERSEXT}-win${SITE_OS_BITS}")
- include (${CTEST_SCRIPT_DIRECTORY}\\HDF5options.cmake)
- include (${CTEST_SCRIPT_DIRECTORY}\\CTestScript.cmake)
- if (EXISTS "${CTEST_BINARY_DIRECTORY}\\${BINFILEBASE}.exe")
- file (COPY "${CTEST_BINARY_DIRECTORY}\\${BINFILEBASE}.exe" DESTINATION ${CTEST_SCRIPT_DIRECTORY})
- endif ()
- if (EXISTS "${CTEST_BINARY_DIRECTORY}\\${BINFILEBASE}.msi")
- file (COPY "${CTEST_BINARY_DIRECTORY}\\${BINFILEBASE}.msi" DESTINATION ${CTEST_SCRIPT_DIRECTORY})
- endif ()
- if (EXISTS "${CTEST_BINARY_DIRECTORY}\\${BINFILEBASE}.zip")
- file (COPY "${CTEST_BINARY_DIRECTORY}\\${BINFILEBASE}.zip" DESTINATION ${CTEST_SCRIPT_DIRECTORY})
- endif ()
-else ()
- set (BINFILEBASE "HDF5-${CTEST_SOURCE_VERSION}${CTEST_SOURCE_VERSEXT}")
- include (${CTEST_SCRIPT_DIRECTORY}/HDF5options.cmake)
- include (${CTEST_SCRIPT_DIRECTORY}/CTestScript.cmake)
- if (APPLE)
- if (EXISTS "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-Darwin.dmg")
- file (COPY "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-Darwin.dmg" DESTINATION ${CTEST_SCRIPT_DIRECTORY})
- endif ()
- if (EXISTS "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-Darwin.tar.gz")
- file (COPY "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-Darwin.tar.gz" DESTINATION ${CTEST_SCRIPT_DIRECTORY})
- endif ()
- if (EXISTS "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-Darwin.sh")
- file (COPY "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-Darwin.sh" DESTINATION ${CTEST_SCRIPT_DIRECTORY})
- endif ()
- else ()
- if (CYGWIN)
- if (EXISTS "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-CYGWIN.sh")
- file (COPY "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-CYGWIN.sh" DESTINATION ${CTEST_SCRIPT_DIRECTORY})
- endif ()
- if (EXISTS "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-CYGWIN.tar.gz")
- file (COPY "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-CYGWIN.tar.gz" DESTINATION ${CTEST_SCRIPT_DIRECTORY})
- endif ()
- else ()
- if (EXISTS "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-Linux.sh")
- file (COPY "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-Linux.sh" DESTINATION ${CTEST_SCRIPT_DIRECTORY})
- endif ()
- if (EXISTS "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-Linux.tar.gz")
- file (COPY "${CTEST_BINARY_DIRECTORY}/${BINFILEBASE}-Linux.tar.gz" DESTINATION ${CTEST_SCRIPT_DIRECTORY})
- endif ()
- endif ()
- endif ()
-endif ()
-
-HDF5options.cmake:
-#############################################################################################
-#### Change default configuration of options in config/cmake/cacheinit.cmake file ###
-#### format: set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DXXX:YY=ZZZZ") ###
-#############################################################################################
-
-### uncomment/comment and change the following lines for other configuration options
-
-#############################################################################################
-#### alternate toolsets ####
-#set(CMAKE_GENERATOR_TOOLSET "Intel C++ Compiler 17.0")
-
-#############################################################################################
-#### ext libraries ####
-
-### ext libs from tgz
-set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_ALLOW_EXTERNAL_SUPPORT:STRING=TGZ -DTGZPATH:PATH=${CTEST_SCRIPT_DIRECTORY}")
-### ext libs from git
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_ALLOW_EXTERNAL_SUPPORT:STRING=GIT")
-### ext libs on system
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DZLIB_LIBRARY:FILEPATH=some_location/lib/zlib.lib -DZLIB_INCLUDE_DIR:PATH=some_location/include")
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DSZIP_LIBRARY:FILEPATH=some_location/lib/szlib.lib -DSZIP_INCLUDE_DIR:PATH=some_location/include")
-
-### disable ext zlib building
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_ENABLE_Z_LIB_SUPPORT:BOOL=OFF")
-### disable ext szip building
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_ENABLE_SZIP_SUPPORT:BOOL=OFF")
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_ENABLE_SZIP_ENCODING:BOOL=OFF")
-
-#############################################################################################
-### disable test program builds
-
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DBUILD_TESTING:BOOL=OFF")
-
-#############################################################################################
-### disable packaging
-
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_NO_PACKAGES:BOOL=ON")
-### Create install package with external libraries (szip, zlib)
-set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_PACKAGE_EXTLIBS:BOOL=ON")
-
-#############################################################################################
-
-========================================================================
For further assistance, send email to help@hdfgroup.org
========================================================================
diff --git a/release_docs/INSTALL_Cygwin.txt b/release_docs/INSTALL_Cygwin.txt
index f5f1d6a..ddffcf1 100755
--- a/release_docs/INSTALL_Cygwin.txt
+++ b/release_docs/INSTALL_Cygwin.txt
@@ -66,12 +66,12 @@ Preconditions:
2.2.2 Szip
The HDF5 library has a predefined compression filter that uses
the extended-Rice lossless compression algorithm for chunked
- datasets. For information on Szip compression, license terms,
- and obtaining the Szip source code, see:
-
- https://portal.hdfgroup.org/display/HDF5/Szip+Compression+in+HDF+Products
-
-
+      datasets. For more information about Szip compression and
+      license terms, see
+      http://hdfgroup.org/HDF5/doc_resource/SZIP/index.html.
+
+ The latest supported public release of SZIP is available from
+ ftp://ftp.hdfgroup.org/lib-external/szip/2.1.
2.3 Additional Utilities
@@ -93,7 +93,7 @@ Build, Test and Install HDF5 on Cygwin
1. Get HDF5 source code package
Users can download HDF5 source code package from HDF website
- (https://www.hdfgroup.org/downloads/hdf5/).
+ (http://hdfgroup.org).
2. Unpacking the distribution
@@ -266,8 +266,4 @@ Build, Test and Install HDF5 on Cygwin
-----------------------------------------------------------------------
-For further assistance, contact:
-
- HDF Forum: https://forum.hdfgroup.org/
- HDF Helpdesk: https://portal.hdfgroup.org/display/support/The+HDF+Help+Desk
-
+For further assistance, email help@hdfgroup.org
diff --git a/release_docs/RELEASE.txt b/release_docs/RELEASE.txt
index 0c3873b..8f3c434 100755
--- a/release_docs/RELEASE.txt
+++ b/release_docs/RELEASE.txt
@@ -1,15 +1,15 @@
-HDF5 version 1.10.2 released on 2018-03-29
+HDF5 version 1.10.3 released on 2018-08-21
================================================================================
INTRODUCTION
-This document describes the differences between this release and the previous
-HDF5 release. It contains information on the platforms tested and known
-problems in this release. For more details check the HISTORY*.txt files in the
+This document describes the differences between this release and the previous
+HDF5 release. It contains information on the platforms tested and known
+problems in this release. For more details check the HISTORY*.txt files in the
HDF5 source.
-Note that documentation in the links below will be updated at the time of each
+Note that documentation in the links below will be updated at the time of each
final release.
Links to HDF5 documentation can be found on The HDF5 web page:
@@ -33,575 +33,306 @@ If you have any questions or comments, please send them to the HDF Help Desk:
CONTENTS
- New Features
-- Support for new platforms and languages
-- Bug Fixes since HDF5-1.10.1
+- Bug Fixes since HDF5-1.10.2
- Supported Platforms
- Tested Configuration Features Summary
- More Tested Platforms
- Known Problems
+- CMake vs. Autotools installations
New Features
============
- Configuration and Build Systems:
- --------------------------------
- - CMake builds
- --------------
-
- - Changed minimum CMake required version to 3.10.
-
- This change removed the need to support a copy of the FindMPI.cmake module,
- which has been removed, along with its subfolder in the config/cmake_ext_mod
- location.
-
- (ADB - 2018/03/09)
-
- - Added pkg-config file generation
-
- Added pkg-config file generation for the C, C++, HL, and HL C++ libraries.
- In addition, builds on Linux will create h5cc, h5c++, h5hlcc, and h5hlc++ scripts in the bin
- directory that use the pkg-config files. The scripts can be used to build HDF5 C and C++
- applications (i.e, similar to the compiler scripts produced by the Autotools builds).
-
- (ADB - 2018/03/08, HDFFV-4359)
-
- - Refactored use of CMAKE_BUILD_TYPE for new variable, which understands
- the type of generator in use.
-
- Added new configuration macros to use new HDF_BUILD_TYPE variable. This
- variable is set correctly for the type of generator being used for the build.
-
- (ADB - 2018/01/08, HDFFV-10385, HDFFV-10296)
-
- - Autotools builds
- ------------------
-
- - Removed version-specific gcc/gfortran flags for version 4.0 (inclusive)
- and earlier.
-
- The config/gnu-flags file, which is sourced as a part of the configure
- process, adds version-specific flags for use when building HDF5. Most of
- these flags control warnings and do not affect the final product.
-
- Flags for older versions of the compiler were consolidated into the
- common flags section. Moving these flags simplifies maintenance of
- the file.
-
- The upshot of this is that building with ancient versions of gcc
- (<= 4.0) will possibly no longer work without hand-hacking the file
- to remove the flags not understood by that version of the compiler.
- Nothing should change when building with gcc >= 4.1.
-
- (DER - 2017/05/31, HDFFV-9937)
-
- - -fno-omit-frame-pointer was added when building with debugging symbols
- enabled.
-
- Debugging symbols can be enabled independently of the overall build
- mode in both the autotools and CMake. This allows (limited) debugging
- of optimized code. Since many debuggers rely on the frame pointer,
- we've disabled this optimization when debugging symbols are requested
- (e.g.: via building with --enable-symbols).
-
- (DER - 2017/05/31, HDFFV-10226)
-
-
- Library:
- --------
- - Added an enumerated value to H5F_libver_t for H5Pset_libver_bounds().
-
- Currently, the library defines two values for H5F_libver_t and supports
- only two pairs of (low, high) combinations as derived from these values.
- Thus the bounds setting via H5Pset_libver_bounds() is rather restricted.
-
- Added an enumerated value (H5F_LIBVER_V18) to H5F_libver_t and
- H5Pset_libver_bounds() now supports five pairs of (low, high) combinations
- as derived from these values. This addition provides the user more
- flexibility in setting bounds for object creation.
-
- (VC - 2018/03/14)
-
- - Added prefix option to VDS files.
-
- Currently, VDS source files must be in the active directory to be
- found by the virtual file. Adding the option of a prefix to be set
- on the virtual file, using a data access property list (DAPL),
- allows the source files to locate at an absolute or relative path
- to the virtual file.
- Private utility functions in H5D and H5L packages merged into single
- function in H5F package.
-
- New public APIs:
- herr_t H5Pset_virtual_prefix(hid_t dapl_id, const char* prefix);
- ssize_t H5Pget_virtual_prefix(hid_t dapl_id, char* prefix /*out*/, size_t size);
- The prefix can also be set with an environment variable, HDF5_VDS_PREFIX.
-
- (ADB - 2017/12/12, HDFFV-9724, HDFFV-10361)
-
- - H5FDdriver_query() API call added to the C library.
-
- This new library call allows the user to query a virtual file driver
- (VFD) for the feature flags it supports (listed in H5FDpublic.h).
- This can be useful to determine if a VFD supports SWMR, for example.
-
- Note that some VFDs have feature flags that may only be present
- after a file has been created or opened (e.g.: the core VFD will
- have the H5FD_FEAT_POSIX_COMPAT_HANDLE flag set if the backing
- store is switched on). Since the new API call queries a generic VFD
- unassociated with a file, these flags will never be returned.
-
- (DER - 2017/05/31, HDFFV-10215)
-
- - H5FD_FEAT_DEFAULT_VFD_COMPATIBLE VFD feature flag added to the C library.
-
- This new feature flag indicates that the VFD is compatible with the
- default VFD. VFDs that set this flag create single files that follow
- the canonical HDF5 file format.
-
- (DER - 2017/05/31, HDFFV-10214)
-
- - The H5I_REFERENCE value in the H5I_type_t enum (defined in H5Ipublic.h)
-      has been marked as deprecated.
-
- This ID type value is not used in the C library. i.e.: There are no
- hid_t values that are of ID type H5I_REFERENCE.
-
- This enum value will be removed in a future major version of the library.
- The code will remain unchanged in the HDF5 1.10.x releases and branches.
-
- (DER - 2017/04/05, HDFFV-10252)
-
-
- Parallel Library:
- -----------------
- - Enabled compression for parallel applications.
-
- With this release parallel applications can create and write compressed
- datasets (or the datasets with the filters such as Fletcher32 applied).
-
- (EIP - 2018/03/29)
-
- - Addressed slow file close on some Lustre file systems.
-
- Slow file close has been reported on some Lustre file systems.
- While the ultimate cause is not understood fully, the proximate
- cause appears to be long delays in MPI_File_set_size() calls at
- file close and flush.
+ Library
+ -------
+ - Moved the H5DOread/write_chunk() API calls to H5Dread/write_chunk()
+
+ The functionality of the direct chunk I/O calls in the high-level
+ library has been moved to the H5D package in the main library. This
+ will allow using those functions without building the high-level
+ library. The parameters and functionality of the H5D calls are
+ identical to the H5DO calls.
+
+ The original H5DO high-level API calls have been retained, though
+ they are now just wrappers for the H5D calls. They are marked as
+ deprecated and are only available when the library is built with
+ deprecated functions. New code should use the H5D calls for this
+ reason.
+
+ As a part of this work, the following symbols from H5Dpublic.h are no
+ longer used:
+
+ H5D_XFER_DIRECT_CHUNK_WRITE_FLAG_NAME
+ H5D_XFER_DIRECT_CHUNK_WRITE_FILTERS_NAME
+ H5D_XFER_DIRECT_CHUNK_WRITE_OFFSET_NAME
+ H5D_XFER_DIRECT_CHUNK_WRITE_DATASIZE_NAME
+ H5D_XFER_DIRECT_CHUNK_READ_FLAG_NAME
+ H5D_XFER_DIRECT_CHUNK_READ_OFFSET_NAME
+ H5D_XFER_DIRECT_CHUNK_READ_FILTERS_NAME
+
+ And properties with these names are no longer stored in the dataset
+ transfer property lists. The symbols are still defined in H5Dpublic.h,
+ but only when the library is built with deprecated symbols.
+
+ (DER - 2018/05/04)
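+
+      As a minimal sketch (file name, dataset name, and the 64x64 int chunk
+      size are illustrative, and the dataset is assumed to be chunked with
+      no filters), a pre-packed chunk could be written directly with the
+      relocated call:
+
+          #include "hdf5.h"
+
+          int main(void)
+          {
+              hid_t   file = H5Fopen("example.h5", H5F_ACC_RDWR, H5P_DEFAULT);
+              hid_t   dset = H5Dopen2(file, "/chunked_dset", H5P_DEFAULT);
+              static int chunk[64][64];       /* buffer holding exactly one chunk */
+              hsize_t offset[2] = {0, 0};     /* logical offset of that chunk     */
+
+              /* Filter mask 0: no pipeline filters are flagged as skipped
+               * (this sketch assumes the dataset has no filters at all). */
+              H5Dwrite_chunk(dset, H5P_DEFAULT, 0, offset, sizeof(chunk), chunk);
+
+              H5Dclose(dset);
+              H5Fclose(file);
+              return 0;
+          }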
+
+ Configuration:
+ -------------
+ - Add missing USE_110_API_DEFAULT option.
- To minimize this problem pending a definitive diagnosis and fix,
- PHDF5 has been modified to avoid MPI_File_set_size() calls when
- possible. This is done by comparing the library's EOA (End of
- Allocation) with the file systems EOF, and skipping the
- MPI_File_set_size() call if the two match.
+ Option USE_110_API_DEFAULT sets the default version of
+ versioned APIs. The bin/makevers perl script did not set
+ the maxidx variable correctly when the 1.10 branch was
+ created. This caused the versioning process to always use
+ the latest version of any API.
- (JRM - 2018/03/29)
+ (ADB - 2018/08/17, HDFFV-10552)
- - Optimized parallel open/location of the HDF5 super-block.
+ - Added configuration checks for the following MPI functions:
- Previous releases of PHDF5 required all parallel ranks to
- search for the HDF5 superblock signature when opening the
- file. As this is accomplished more or less as a synchronous
- operation, a large number of processes can experience a
- slowdown in the file open due to filesystem contention.
+ MPI_Mprobe - Used for the Parallel Compression feature
+ MPI_Imrecv - Used for the Parallel Compression feature
- As a first step in improving the startup/file-open performance,
- we allow MPI rank 0 of the associated MPI communicator to locate
- the base offset of the super-block and then broadcast that result
- to the remaining ranks in the parallel group. Note that this
- approach is utilized ONLY during file opens which employ the MPIO
- file driver in HDF5 by previously having called H5Pset_fapl_mpio().
+ MPI_Get_elements_x - Used for the "big Parallel I/O" feature
+ MPI_Type_size_x - Used for the "big Parallel I/O" feature
- HDF5 parallel file operations which do not employ multiple ranks
-      e.g. specifying MPI_COMM_SELF (whose MPI_Comm_size == 1)
- as opposed to MPI_COMM_WORLD, will not be affected by this
- optimization. Conversely, parallel file operations on subgroups
- of MPI_COMM_WORLD are allowed to be run in parallel with each
-      subgroup operating as an independent collection of processes.
+ (JTH - 2018/08/02, HDFFV-10512)
- (RAW - 2017/10/10, HDFFV-10294)
+ - Added section to the libhdf5.settings file to indicate
+ the status of the Parallel Compression and "big Parallel I/O"
+ features.
- - Added large (>2GB) MPI-IO transfers.
+ (JTH - 2018/08/02, HDFFV-10512)
- Previous releases of PHDF5 would fail when attempting to
- read or write greater than 2GB of data in a single IO operation.
- This issue stems principally from an MPI API whose definitions
- utilize 32 bit integers to describe the number of data elements
- and datatype that MPI should use to effect a data transfer.
- Historically, HDF5 has invoked MPI-IO with the number of
- elements in a contiguous buffer represented as the length
- of that buffer in bytes.
+ - Add option to execute swmr shell scripts from CMake.
- Resolving the issue and thus enabling larger MPI-IO transfers
- is accomplished first, by detecting when a user IO request would
- exceed the 2GB limit as described above. Once a transfer request
- is identified as requiring special handling, PHDF5 now creates a
- derived datatype consisting of a vector of fixed sized blocks
- which is in turn wrapped within a single MPI_Type_struct to
- contain the vector and any remaining data. The newly created
- datatype is then used in place of MPI_BYTE and can be used to
- fulfill the original user request without encountering API
- errors.
+ Option TEST_SHELL_SCRIPTS redirects processing into a
+ separate ShellTests.cmake file for UNIX types. The tests
+ execute the shell scripts if a SH program is found.
- (RAW - 2017/09/10, HDFFV-8839)
+ (ADB - 2018/07/16)
C++ Library:
------------
- - The following C++ API wrappers have been added to the C++ Library:
- + H5Lcreate_soft:
- // Creates a soft link from link_name to target_name.
- void link(const char *target_name, const char *link_name,...)
- void link(const H5std_string& target_name,...)
-
- + H5Lcreate_hard:
- // Creates a hard link from new_name to curr_name.
- void link(const char *curr_name, const Group& new_loc,...)
- void link(const H5std_string& curr_name, const Group& new_loc,...)
-
- // Creates a hard link from new_name to curr_name in same location.
- void link(const char *curr_name, const hid_t same_loc,...)
- void link(const H5std_string& curr_name, const hid_t same_loc,...)
-
- Note: previous version of H5Location::link will be deprecated.
-
- + H5Lcopy:
- // Copy an object from a group of file to another.
- void copyLink(const char *src_name, const Group& dst,...)
- void copyLink(const H5std_string& src_name, const Group& dst,...)
-
- // Copy an object from a group of file to the same location.
- void copyLink(const char *src_name, const char *dst_name,...)
- void copyLink(const H5std_string& src_name,...)
-
- + H5Lmove:
- // Rename an object in a group or file to a new location.
- void moveLink(const char* src_name, const Group& dst,...)
- void moveLink(const H5std_string& src_name, const Group& dst,...)
-
- // Rename an object in a group or file to the same location.
- void moveLink(const char* src_name, const char* dst_name,...)
- void moveLink(const H5std_string& src_name,...)
-
- Note: previous version H5Location::move will be deprecated.
+ - New wrappers
- + H5Ldelete:
- // Removes the specified link from this location.
- void unlink(const char *link_name,
- const LinkAccPropList& lapl = LinkAccPropList::DEFAULT)
- void unlink(const H5std_string& link_name,
- const LinkAccPropList& lapl = LinkAccPropList::DEFAULT)
+ Added the following items:
- Note: additional parameter is added to previous H5Location::unlink.
+ + Class DSetAccPropList for the dataset access property list.
- + H5Tencode and H5Tdecode:
- // Creates a binary object description of this datatype.
- void DataType::encode() - C API H5Tencode()
+ + Wrapper for H5Dget_access_plist to class DataSet
+ // Gets the access property list of this dataset.
+ DSetAccPropList getAccessPlist() const;
- // Returns the decoded type from the binary object description.
- DataType::decode() - C API H5Tdecode()
- ArrayType::decode() - C API H5Tdecode()
- CompType::decode() - C API H5Tdecode()
- DataType::decode() - C API H5Tdecode()
- EnumType::decode() - C API H5Tdecode()
- FloatType::decode() - C API H5Tdecode()
- IntType::decode() - C API H5Tdecode()
- StrType::decode() - C API H5Tdecode()
- VarLenType::decode() - C API H5Tdecode()
+ + Wrappers for H5Pset_chunk_cache and H5Pget_chunk_cache to class DSetAccPropList
+ // Sets the raw data chunk cache parameters.
+ void setChunkCache(size_t rdcc_nslots, size_t rdcc_nbytes, double rdcc_w0)
- + H5Lget_info:
- // Returns the information of the named link.
- H5L_info_t getLinkInfo(const H5std_string& link_name,...)
+ // Retrieves the raw data chunk cache parameters.
+ void getChunkCache(size_t &rdcc_nslots, size_t &rdcc_nbytes, double &rdcc_w0)
- (BMR - 2018/03/11, HDFFV-10149)
+ + New operator!= to class DataType (HDFFV-10472)
+ // Determines whether two datatypes are not the same.
+ bool operator!=(const DataType& compared_type)
- - Added class LinkCreatPropList for link create property list.
+ + Wrappers for H5Oget_info2, H5Oget_info_by_name2, and H5Oget_info_by_idx2
+ (HDFFV-10458)
- (BMR - 2018/03/11, HDFFV-10149)
+ // Retrieves information about an HDF5 object.
+ void getObjinfo(H5O_info_t& objinfo, unsigned fields = H5O_INFO_BASIC) const;
- - Added overloaded functions H5Location::createGroup to take a link
- creation property list.
- Group createGroup(const char* name, const LinkCreatPropList& lcpl)
- Group createGroup(const H5std_string& name, const LinkCreatPropList& lcpl)
+ // Retrieves information about an HDF5 object, given its name.
+ void getObjinfo(const char* name, H5O_info_t& objinfo,
+ unsigned fields = H5O_INFO_BASIC,
+ const LinkAccPropList& lapl = LinkAccPropList::DEFAULT) const;
+ void getObjinfo(const H5std_string& name, H5O_info_t& objinfo,
+ unsigned fields = H5O_INFO_BASIC,
+ const LinkAccPropList& lapl = LinkAccPropList::DEFAULT) const;
- (BMR - 2018/03/11, HDFFV-10149)
+ // Retrieves information about an HDF5 object, given its index.
+ void getObjinfo(const char* grp_name, H5_index_t idx_type,
+ H5_iter_order_t order, hsize_t idx, H5O_info_t& objinfo,
+ unsigned fields = H5O_INFO_BASIC,
+ const LinkAccPropList& lapl = LinkAccPropList::DEFAULT) const;
+ void getObjinfo(const H5std_string& grp_name, H5_index_t idx_type,
+ H5_iter_order_t order, hsize_t idx, H5O_info_t& objinfo,
+ unsigned fields = H5O_INFO_BASIC,
+ const LinkAccPropList& lapl = LinkAccPropList::DEFAULT) const;
- - A document is added to the HDF5 C++ API Reference Manual to show the
- mapping from a C API to C++ wrappers. It can be found from the main
- page of the C++ API Reference Manual.
-
- (BMR - 2017/10/17, HDFFV-10151)
+ (BMR - 2018/07/22, HDFFV-10150, HDFFV-10458, HDFFV-1047)
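+
+      As a short sketch (file and dataset names are illustrative, and the
+      dataset is assumed to exist and be chunked), the new class and
+      wrappers can be combined as follows:
+
+          #include "H5Cpp.h"
+          using namespace H5;
+
+          int main()
+          {
+              H5File  file("example.h5", H5F_ACC_RDONLY);
+              DataSet dset = file.openDataSet("/chunked_dset");
+
+              // Retrieve the dataset access property list and query the
+              // raw data chunk cache parameters through the new wrappers.
+              DSetAccPropList dapl = dset.getAccessPlist();
+              size_t nslots = 0, nbytes = 0;
+              double w0 = 0.0;
+              dapl.getChunkCache(nslots, nbytes, w0);
+              return 0;
+          }
+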
Java Library:
----------------
- - Wrapper added for enabling the error stack.
-
- H5error_off would disable the error stack reporting. In order
- to re-enable the reporting, the error stack info needs to be
- saved so that H5error_on can revert state.
-
- (ADB - 2018/03/13, HDFFV-10412)
-
- - Wrappers were added for the following C APIs:
- H5Pset_evict_on_close
- H5Pget_evict_on_close
- H5Pset_chunk_opts
- H5Pget_chunk_opts
- H5Pset_efile_prefix
- H5Pget_efile_prefix
- H5Pset_virtual_prefix
- H5Pget_virtual_prefix
-
- (ADB - 2017/12/20)
-
- - The H5I_REFERENCE value in the H5I_type_t enum (defined in H5Ipublic.h)
-      has been marked as deprecated.
-
- JNI code which refers to this value will be removed in a future
- major version of the library. The code will remain unchanged in the
- 1.10.x releases and branches.
+ - Java HDFLibraryException class
- See the C library section, above, for further information.
+ Change parent class from Exception to RuntimeException.
- (HDFFV-10252, DER, 2017/04/05)
+ (ADB - 2018/07/30, HDFFV-10534)
+ - JNI Read and Write
- Tools:
- ------
- - h5diff has a new option to display error stack.
+ Refactored variable-length functions, H5DreadVL and H5AreadVL,
+ to correct dataset and attribute reads. New write functions,
+ H5DwriteVL and H5AwriteVL, are under construction.
- Updated h5diff with the --enable-error-stack argument, which
- enables the display of the hdf5 error stack. This completes the
- improvement to the main tools: h5copy, h5diff, h5dump, h5ls and
- h5repack.
+ (ADB - 2018/06/02, HDFFV-10519)
- (ADB - 2017/08/30, HDFFV-9774)
-
-Support for new platforms, languages and compilers.
-=======================================
- - None
-
-Bug Fixes since HDF5-1.10.1 release
+Bug Fixes since HDF5-1.10.2 release
==================================
Library
-------
- - The data read after a direct chunk write to a chunked dataset with
- one chunk was incorrect.
-
- The problem was due to the passing of a null dataset pointer to
- the insert callback for the chunk index in the routine
- H5D__chunk_direct_write() in H5Dchunk.c
- The dataset was a single-chunked dataset which will use the
- single chunk index when latest format was enabled on file creation.
- The single chunk index was the only index that used this pointer
- in the insert callback.
-
- Passed the dataset pointer to the insert callback for the chunk
- index in H5D__chunk_direct_write().
-
- (VC - 2018/03/20, HDFFV-10425)
-
- - Added public routine H5DOread_chunk to the high-level C library.
-
- The patch for H5DOwrite_chunk() to write an entire chunk to the file
- directly was contributed by GE Healthcare and integrated by The HDF Group
- developers.
-
- (VC - 2017/05/19, HDFFV-9934)
+ - Performance issue with H5Oget_info
- - Freeing of object header after failed checksum verification.
+      The H5Oget_info family of routines retrieves information about an object,
+      such as object type, access time, number of attributes, and storage space.
+      Retrieving all of this information unconditionally is overkill and causes
+      a performance issue when doing so for many objects.
- It was discovered that the object header (in H5Ocache.c) was not released properly
- when the checksum verification failed and a re-load of the object
- header was needed.
+      Add an additional parameter "fields" to the H5Oget_info family of routines
+ indicating the type of information to be retrieved. The same is done to
+ the H5Ovisit family of routines which recursively visits an object
+ returning object information in a callback function. Both sets of routines
+ are versioned and the corresponding compatibility macros are added.
- Freed the object header that failed the chksum verification only
-      Freed the object header that failed the checksum verification only
+ The version 2 names of the two sets of routines are:
+ (1) H5Oget_info2, H5Oget_info_by_idx2, H5Oget_info_by_name2
+ (2) H5Ovisit2, H5Ovisit_by_name2
- (VC - 2018/03/14, HDFFV-10209)
+ (VC - 2018/08/15, HDFFV-10180)
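+
+      A minimal sketch (object and file names are illustrative) of requesting
+      only the basic object information through the new "fields" parameter:
+
+          #include "hdf5.h"
+
+          int main(void)
+          {
+              hid_t      file = H5Fopen("example.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
+              H5O_info_t oinfo;
+
+              /* H5O_INFO_BASIC requests only the basic fields (object type,
+               * address, reference count), skipping more expensive items
+               * such as access times and the number of attributes. */
+              H5Oget_info_by_name2(file, "/chunked_dset", &oinfo,
+                                   H5O_INFO_BASIC, H5P_DEFAULT);
+
+              H5Fclose(file);
+              return 0;
+          }
+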
- - Updated H5Pset_evict_on_close in H5Pfapl.c
+ - Test failure due to metadata size in test/vds.c
- Changed the minor error number from H5E_CANTSET to H5E_UNSUPPORTED for
- parallel library.
+ The size of metadata from test_api_get_ex_dcpl() in test/vds.c is not as expected
+ because the latest format should be used when encoding the layout for VDS.
- (ADB - 2018/03/06, HDFFV-10414)
+ Set the latest format in a temporary fapl and pass the setting to the routines that
+ encode the dataset selection for VDS.
- - Fixed the problems with the utility function that could not handle lowercase
- Windows drive letters.
+      (VC - 2018/08/14, HDFFV-10469)
- Added call to upper function for drive letter.
+ - Java HDF5LibraryException class
- (ADB - 2017/12/18, HDFFV-10307)
+ The error minor and major values would be lost after the
+ constructor executed.
- - Fixed H5Sencode() bug when the number of elements selected was > 2^32.
+ Created two local class variables to hold the values obtained during
+ execution of the constructor. Refactored the class functions to retrieve
+      the class values rather than calling the native functions.
+ The native functions were renamed and called only during execution
+ of the constructor.
+ Added error checking to calling class constructors in JNI classes.
- H5Sencode() incorrectly encodes dataspace selection with number of
- elements exceeding 2^32. When decoding such selection via H5Sdecode(),
- the number of elements in the decoded dataspace is not the same as
- what is encoded. This problem exists for H5S_SEL_HYPER and
- H5S_SEL_POINTS encoding.
+ (ADB - 2018/08/06, HDFFV-10544)
- The cause of the problem is due to the fact that the library uses 32 bits to
- encode counts and block offsets for the selection.
- The solution is to use the original 32 bit encodings if possible,
-      but use a different way to encode the selection if more than 32 bits is needed.
-      See details in the RFC: H5Sencode/H5Sdecode Format Change
- https://bitbucket.hdfgroup.org/projects/HDFFV/repos/hdf5doc/browse/RFCs/HDF5_Library/H5SencodeFormatChange.
+ - Added checks of the defined MPI_VERSION to guard against usage of
+ MPI-3 functions in the Parallel Compression and "big Parallel I/O"
+ features when HDF5 is built with MPI-2. Previously, the configure
+ step would pass but the build itself would fail when it could not
+ locate the MPI-3 functions used.
- (VC - 2017/11/28, HDFFV-9947)
+ As a result of these new checks, HDF5 can again be built with MPI-2,
+ but the Parallel Compression feature will be disabled as it relies
+ on the MPI-3 functions used.
-    - Fixed filter plugin handling in H5PL.c and H5Z.c to not require availability of
- dependent libraries (e.g., szip or zlib).
+ (JTH - 2018/08/02, HDFFV-10512)
- It was discovered that the dynamic loading process used by
- filter plugins had issues with library dependencies.
+ - User's patches: CVEs
- CMake build process changed to use LINK INTERFACE keywords, which
- allowed HDF5 C library to make dependent libraries private. The
- filter plugin libraries no longer require dependent libraries
- (such as szip or zlib) to be available.
-
- (ADB - 2017/11/16, HDFFV-10328)
+ The following patches have been applied:
- - Fixed rare object header corruption bug.
+ CVE-2018-11202 - NULL pointer dereference was discovered in
+ H5S_hyper_make_spans in H5Shyper.c (HDFFV-10476)
+ https://security-tracker.debian.org/tracker/CVE-2018-11202
+      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11202
- In certain cases, such as when converting large attributes to dense
- storage, an error could occur which would either fail an assertion or
- cause file corruption. Fixed and added test.
+ CVE-2018-11203 - A division by zero was discovered in
+ H5D__btree_decode_key in H5Dbtree.c (HDFFV-10477)
+ https://security-tracker.debian.org/tracker/CVE-2018-11203
+      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11203
- (NAF - 2017/11/14, HDFFV-10274)
+ CVE-2018-11204 - A NULL pointer dereference was discovered in
+ H5O__chunk_deserialize in H5Ocache.c (HDFFV-10478)
+ https://security-tracker.debian.org/tracker/CVE-2018-11204
+      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11204
- - Updated H5Zfilter_avail in H5Z.c.
+ CVE-2018-11206 - An out of bound read was discovered in
+ H5O_fill_new_decode and H5O_fill_old_decode in H5Ofill.c
+ (HDFFV-10480)
+ https://security-tracker.debian.org/tracker/CVE-2018-11206
+      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11206
- The public function checked for plugins, while the private
- function did not.
+ CVE-2018-11207 - A division by zero was discovered in
+ H5D__chunk_init in H5Dchunk.c (HDFFV-10481)
+ https://security-tracker.debian.org/tracker/CVE-2018-11207
+      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11207
- Modified H5Zfilter_avail and private function, H5Z_filter_avail.
- Moved check for plugin from public to private function. Updated
- H5P__set_filter due to change in H5Z_filter_avail. Updated tests.
+ (BMR - 2018/7/22, PR#s: 1134 and 1139,
+ HDFFV-10476, HDFFV-10477, HDFFV-10478, HDFFV-10480, HDFFV-10481)
- (ADB - 2017/10/10, HDFFV-10297, HDFFV-10319)
+ - H5Adelete
-    - h5dump produced a SEGFAULT when dumping a corrupted file.
-
- The behavior was due to the error in the internal function H5HL_offset_into().
+ H5Adelete failed when deleting the last "large" attribute that
+ is stored densely via fractal heap/v2 b-tree.
- (1) Fixed H5HL_offset_into() to return error when offset exceeds heap data
- block size.
- (2) Fixed other places in the library that call this routine to detect
- error routine.
+ After removing the attribute, update the ainfo message. If the
+ number of attributes goes to zero, remove the message.
- (VC - 2017/08/30, HDFFV-10216)
+ (VC - 2018/07/20, HDFFV-9277)
- - Fixes for paged aggregation feature.
+ - A bug was discovered in the parallel library which caused partial
+ parallel reads of filtered datasets to return incorrect data. The
+ library used the incorrect dataspace for each chunk read, causing
+ the selection used in each chunk to be wrong.
- Skip test in test/fheap.c when:
- (1) multi/split drivers and
- (2) persisting free-space or using paged aggregation strategy
+ The bug was not caught during testing because all of the current
+ tests which do parallel reads of filtered data read all of the data
+ using an H5S_ALL selection. Several tests were added which exercise
+ partial parallel reads.
- (VC, 2017/07/10)
+ (JTH - 2018/07/16, HDFFV-10467)
- Changes made based on RFC review comments:
- (1) Added maximum value for file space page size
- (2) Dropped check for page end metadata threshold
- (3) Removed "can_shrink" and "shrink" callbacks for small section class
+ - A bug was discovered in the parallel library which caused parallel
+ writes of filtered datasets to trigger an assertion failure in the
+ file free space manager.
- (VC - 2017/06/09)
+ This occurred when the filter used caused chunks to repeatedly shrink
+ and grow over the course of several dataset writes. The previous chunk
+ information, such as the size of the chunk and the offset in the file,
+ was being cached and not updated after each write, causing the next write
+ to the chunk to retrieve the incorrect cached information and run into
+ issues when reallocating space in the file for the chunk.
-    - Fixed an infinite loop in H5VM_power2up().
+ (JTH - 2018/07/16, HDFFV-10509)
- The function H5VM_power2up() returns the next power of 2
- for n. When n exceeds 2^63, it overflows and becomes 0 causing
- the infinite looping.
+ - A bug was discovered in the parallel library which caused the
+ H5D__mpio_array_gatherv() function to allocate too much memory.
- The fix ensures that the function checks for n >= 2^63
- and returns 0.
+ When the function is called with the 'allgather' parameter set
+ to a non-true value, the function will receive data from all MPI
+      ranks and gather it to the single rank specified by the 'root'
+ parameter. However, the bug in the function caused memory for
+ the received data to be allocated on all MPI ranks, not just the
+ singular rank specified as the receiver. In some circumstances,
+ this would cause an application to fail due to the large amounts
+ of memory being allocated.
- (VC - 2017/07/10, HDFFV-10217)
+ (JTH - 2018/07/16, HDFFV-10467)
-    - Fixed H5Ocopy not working with open identifiers.
+ - Error checks in h5stat and when decoding messages
- Changes made so that raw data for dataset objects are copied from
- cached info when possible instead of flushing objects to file and
- read them back in again.
+      h5stat exited with a segfault or core dump when
+      errors were encountered in the internal library.
- (VC - 2017/07/05, HDFFV-7853)
-
- - An uninitialized struct could cause a memory access error when using
- variable-length or reference types in a compressed, chunked dataset.
-
- A struct containing a callback function pointer and a pointer to some
- associated data was used before initialization. This could cause a
- memory access error and system crash. This could only occur under
-      unusual conditions when using variable-length and reference types in
- a compressed, chunked dataset.
-
- On recent versions of Visual Studio, when built in debug mode, the
- debug heap will complain and cause a crash if the code in question
- is executed (this will cause the objcopy test to fail).
-
- (DER - 2017/11/21, HDFFV-10330)
-
- - Fixed collective metadata writes on file close.
-
- It was discovered that metadata was being written twice as part of
- the parallel file close behavior, once independently and once
- collectively.
-
- A fix for this error was included as part of the parallel compression
- feature but remained undocumented here.
-
- (RAW - 2017/12/01, HDFFV-10272)
-
- - If an HDF5 file contains a filter pipeline message with a 'number of
- filters' field that exceeds the maximum number of allowed filters,
- the error handling code will attempt to dereference a NULL pointer.
-
- This issue was reported to The HDF Group as issue #CVE-2017-17505.
- https://security-tracker.debian.org/tracker/CVE-2017-17505
-      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17505
-
- NOTE: The HDF5 C library cannot produce such a file. This condition
- should only occur in a corrupt (or deliberately altered) file
- or a file created by third-party software.
+ Add error checks and --enable-error-stack option to h5stat.
+ Add range checks when decoding messages: old fill value, old
+ layout and refcount.
- This problem arose because the error handling code assumed that
- the 'number of filters' field implied that a dynamic array of that
- size had already been created and that the cleanup code should
- iterate over that array and clean up each element's resources. If
- an error occurred before the array has been allocated, this will
- not be true.
-
- This has been changed so that the number of filters is set to
- zero on errors. Additionally, the filter array traversal in the
- error handling code now requires that the filter array not be NULL.
-
- (DER - 2018/02/06, HDFFV-10354)
-
- - If an HDF5 file contains a filter pipeline message which contains
- a 'number of filters' field that exceeds the actual number of
- filters in the message, the HDF5 C library will read off the end of
- the read buffer.
-
- This issue was reported to The HDF Group as issue #CVE-2017-17506.
- https://security-tracker.debian.org/tracker/CVE-2017-17506
-      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17506
-
- NOTE: The HDF5 C library cannot produce such a file. This condition
- should only occur in a corrupt (or deliberately altered) file
- or a file created by third-party software.
-
- The problem was fixed by passing the buffer size with the buffer
- and ensuring that the pointer cannot be incremented off the end
- of the buffer. A mismatch between the number of filters declared
- and the actual number of filters will now invoke normal HDF5
- error handling.
-
- (DER - 2018/02/26, HDFFV-10355)
+ (VC - 2018/07/11, HDFFV-10333)
- If an HDF5 file contains a malformed compound datatype with a
suitably large offset, the type conversion code can run off
@@ -609,8 +340,6 @@ Bug Fixes since HDF5-1.10.1 release
fault.
This issue was reported to The HDF Group as issue #CVE-2017-17507.
- https://security-tracker.debian.org/tracker/CVE-2017-17506
-      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17506
NOTE: The HDF5 C library cannot produce such a file. This condition
should only occur in a corrupt (or deliberately altered) file
@@ -625,448 +354,102 @@ Bug Fixes since HDF5-1.10.1 release
(DER - 2018/02/26, HDFFV-10356)
- - If an HDF5 file contains a malformed compound type which contains
- a member of size zero, a division by zero error will occur while
- processing the type.
-
- This issue was reported to The HDF Group as issue #CVE-2017-17508.
- https://security-tracker.debian.org/tracker/CVE-2017-17508
-      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17508
-
- NOTE: The HDF5 C library cannot produce such a file. This condition
- should only occur in a corrupt (or deliberately altered) file
- or a file created by third-party software.
-
- Checking for zero before dividing fixes the problem. Instead of the
- division by zero, the normal HDF5 error handling is invoked.
-
- (DER - 2018/02/26, HDFFV-10357)
-
- - If an HDF5 file contains a malformed symbol table node that declares
- it contains more symbols than it actually contains, the library
- can run off the end of the metadata cache buffer while processing
- the symbol table node.
-
- This issue was reported to The HDF Group as issue #CVE-2017-17509.
- https://security-tracker.debian.org/tracker/CVE-2017-17509
-      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-17509
-
- NOTE: The HDF5 C library cannot produce such a file. This condition
- should only occur in a corrupt (or deliberately altered) file
- or a file created by third-party software.
-
- Performing bounds checks on the buffer while processing fixes the
- problem. Instead of the segmentation fault, the normal HDF5 error
- handling is invoked.
-
- (DER - 2018/03/12, HDFFV-10358)
-
- - Fixed permissions passed to open(2) on file create.
-
- On Windows, the POSIX permissions passed to open(2) when creating files
- were only incidentally correct. They are now set to the correct value of
- (_S_IREAD | _S_IWRITE).
-
- On other platforms, the permissions were set to a mix of 666, 644, and
- 000. They are now set uniformly to 666.
-
- (DER - 2017/04/28, HDFFV-9877)
-
- - The H5FD_FEAT_POSIX_COMPAT_HANDLE flag is no longer used to determine
- if a virtual file driver (VFD) is compatible with SWMR.
-
- Use of this VFD feature flag was not in line with the documentation in
-      the public H5FDpublic.h file. In particular, it was being used as a
-      proxy for determining if SWMR I/O is allowed. This is unnecessary as we
- already have a feature flag for this (H5FD_SUPPORTS_SWMR_IO).
-
- (DER - 2017/05/31, HDFFV-10214)
-
Configuration
-------------
- - CMake changes
-
- - Updated CMake commands configuration.
-
- A number of improvements were made to the CMake commands. Most
- changes simplify usage or eliminate unused constructs. Also,
-      some changes improve cross-platform support.
-
- (ADB - 2018/02/01, HDFFV-10398)
-
- - Corrected usage of CMAKE_BUILD_TYPE variable.
-
- The use of the CMAKE_BUILD_TYPE is incorrect for multi-config
- generators (Visual Studio and XCode) and is optional for single
- config generators. Created a new macro to check
- GLOBAL PROPERTY -> GENERATOR_IS_MULTI_CONFIG
-      Created two new HDF variables, HDF_BUILD_TYPE and HDF_CFG_BUILD_TYPE.
-      The default for these variables is "Release".
-
- (ADB - 2018/01/10, HDFFV-10385)
-
- - Added replacement of fortran flags if using static CRT.
-
- Added TARGET_STATIC_CRT_FLAGS call to HDFUseFortran.cmake file in
- config/cmake_ext_mod folder.
-
- (ADB - 2018/01/08, HDFFV-10334)
-
-
- - The hdf5 library used shared szip and zlib, which needlessly required
- applications to link with the same szip and zlib libraries.
-
- Changed the target_link_libraries commands to use the static libs.
- Removed improper link duplication of szip and zlib.
- Adjusted the link dependencies and the link interface values of
- the target_link_libraries commands.
-
- (ADB - 2017/11/14, HDFFV-10329)
-
- - CMake MPI
-
- CMake implementation for MPI was problematic and would create incorrect
- MPI library references in the hdf5 libraries.
-
- Reworked the CMake MPI code to properly create CMake targets. Also merged
- the latest CMake FindMPI.cmake changes to the local copy. This is necessary
- until HDF changes the CMake minimum to 3.9 or greater.
-
- (ADB - 2017/11/02, HDFFV-10321)
-
- - Corrected FORTRAN_HAVE_C_LONG_DOUBLE processing in the autotools.
-
- A bug in the autotools Fortran processing code always set the
- FORTRAN_HAVE_C_LONG_DOUBLE variable to be true regardless of
- whether or not a C long double type was present.
-
- This would cause compilation failures on platforms where a C
- long double type was not available and the Fortran wrappers
- were being built.
-
- (DER - 2017/07/05, HDFFV-10247)
-
- - The deprecated --enable-production and --enable-debug configure options
- failed to emit errors when passed an empty string
- (e.g.: --enable-debug="").
+    - Applied patches to address Cygwin build issues
- Due to the way we checked for these options being set, it was possible
- to avoid the error message and continue configuration if an empty string
- was passed to the option.
+ There were three issues for Cygwin builds:
+ - Shared libs were not built.
+ - The -std=c99 flag caused a SIG_SETMASK undeclared error.
+          - Undefined errors when building test shared libraries.
- Any use of --enable-production or --enable-debug will now halt the
- configuration step and emit a helpful error message
- (use --enable-build-mode=debug|production instead).
+ Patches to address these issues were received and incorporated in this version.
- (DER - 2017/07/05, HDFFV-10248)
+ (LRK - 2018/07/18, HDFFV-10475)
- - CMake
+ - The --enable-debug/production configure flags are listed as 'deprecated'
+ when they should really be listed as 'removed'.
- Too many commands for POST_BUILD step caused command line to be
- too big on windows.
+ In the autotools overhaul several years ago, we removed these flags and
+ implemented a new --enable-build-mode= flag. This was done because we
+ changed the semantics of the modes and didn't want users to silently
+ be exposed to them. The newer system is also more flexible and us to
+      be exposed to them. The newer system is also more flexible and allows us to
- Changed foreach of copy command to use a custom command with the
- use of the HDFTEST_COPY_FILE macro.
+ The --enable-debug/production flags are now listed as removed.
- (ADB - 2017/07/12, HDFFV-10254)
+ (DER - 2018/05/31, HDFFV-10505)
- - CMake test execution environment
+ - Moved the location of gcc attribute.
- The parallel HDF5 test: 't_pread' assumed the use of autotools
- and the directory structure associated with that testing approach.
- Modified the test code to check whether the 'h5jam' utility can be
- found in the same directory as the test executable (which is
- preferred directory structure utilized by cmake) and if found
- will invoke the tool directly rather than utilizing a relative path.
+ The gcc attribute(no_sanitize), named as the macro HDF_NO_UBSAN,
+ was located after the function name. Builds with GCC 7 did not
+ indicate any problem, but GCC 8 issued errors. Moved the
+ attribute before the function name, as required.
- (RAW - 2017/11/03, HDFFV-10318)
+ (ADB - 2018/05/22, HDFFV-10473)
- - Fortran compilation fails for xlf and CMake builds.
+ - Reworked java test suite into individual JUnit tests.
- Fixed CMake shared library build for H5match_types and modules
+ Testing the whole suite of java unit tests in a single JUnit run
+ made it difficult to determine actual failures when tests would fail.
+      Running each file's set of tests individually allows individual failures
+      to be diagnosed more easily. A side benefit is that tests for optional components
+ of the library can be disabled if not configured.
- (MSB - 2017/12/19, HDFFV-10363)
+ (ADB - 2018/05/16, HDFFV-9739)
- - Shared libraries fail test on OSX with Fortran enabled with CMake.
+ - Converted CMake global commands ADD_DEFINITIONS and INCLUDE_DIRECTORIES
+ to use target_* type commands. This change modernizes the CMake usage
+ in the HDF5 library.
- Fixed by removing the F77 use of EQUIVALENCE and COMMON, replaced
- using MODULES. Updated CMake.
+ In addition, there is the intention to convert to generator expressions,
+ where possible. The exception is Fortran FLAGS on Windows Visual Studio.
+ The HDF macros TARGET_C_PROPERTIES and TARGET_FORTRAN_PROPERTIES have
+ been removed with this change in usage.
- (MSB - 2017/12/07, HDFFV-10223)
+ The additional language (C++ and Fortran) checks have also been localized
+ to only be checked when that language is enabled.
- - The bin/trace script now emits an error code on problems and autogen.sh
- will fail if bin/trace fails.
+ (ADB - 2018/05/08)
- The bin/trace script adds tracing functionality to public HDF5 API calls.
- It is only of interest to developers who modify the HDF5 source code.
- Previously, bin/trace just wrote an error message to stdout when it
- encountered problems, so autogen.sh processing did not halt and a broken
- version of the library could be built. The script will now return an
- error code when it encounters problems, and autogen.sh will fail.
- This only affects users who run autogen.sh to rebuild the Autotools files,
- which is not necessary to build HDF5 from source in official releases of the
- library. CMake users are unaffected as bin/trace is not run via CMake
- at this time.
-
- (DER - 2017/04/25, HDFFV-10178)
-
- - FC_BASENAME was changed from gfortran40 to gfortran in a few places.
-
- In the autotools, FC_BASENAME was set to gfortran40 in a few locations
- (config/gnu-fflags and config/freebsd). This was probably a historical
- artifact and did not seem to affect many users.
-
- The value is now correctly set to gfortran.
-
- (DER - 2017/05/26, HDFFV-10249)
-
- - The ar flags were changed to -cr (was: -cru)
-
- The autotools set the flags for ar to -cru by default. The -u flag,
- which allows selective replacement of only the members which have
- changed, raises warnings on some platforms, so the flags are now set to
- -cr via AR_FLAGS in configure.ac. This causes the static library to
- always be completely recreated from the object files on each build.
+ Performance
+ -------------
+ - Revamped internal use of DXPLs, improving performance
- (DER - 2017/11/15, HDFFV-10428)
+ (QAK - 2018/05/20)
Fortran
--------
- - Fixed compilation errors when using Intel 18 Fortran compilers
- (MSB - 2017/11/3, HDFFV-10322)
-
- Tools
- -----
- - h5clear
-
- An enhancement to the tool in setting a file's stored EOA.
-
- It was discovered that a crashed file's stored EOA in the superblock
- was smaller than the actual file's EOF. When the file was reopened
- and closed, the library truncated the file to the stored EOA.
-
- Added an option to the tool in setting the file's stored EOA in the
- superblock to the maximum of (EOA, EOF) + increment.
- An option was also added to print the file's EOA and EOF.
-
- (VC - 2018/03/14, HDFFV-10360)
-
- - h5repack
-
- h5repack changes the chunk parameters when a change of layout is not
- specified and a filter is applied.
-
- HDFFV-10297, HDFFV-10319 reworked code for h5repack and h5diff code
- in the tools library. The check for an existing layout was incorrectly
- placed into an if block and not executed. The check was moved into
- the normal path of the function.
-
- (ADB - 2018/02/21, HDFFV-10412)
-
- - h5dump
-
- The tools library will hide the error stack during file open.
-
- While this is preferable almost always, there are reasons to enable
- display of the error stack when a tool will not open a file. Adding an
- optional argument to the --enable-error-stack will provide this use case.
- As an optional argument it will not affect the operation of the
- --enable-error-stack. h5dump is the only tool to implement this change.
-
- (ADB - 2018/02/15, HDFFV-10384)
-
- - h5dump
-
- h5dump would output an indented blank line in the filters section.
-
- h5dump overused the h5tools_simple_prefix function, which is a
- function intended to account for the data index (x,y,z) option.
- Removed the function call for header information.
-
- (ADB - 2018/01/25, HDFFV-10396)
-
- - h5repack
-
- h5repack incorrectly searched internal object table for name.
+    - Fixed an issue where h5fget_obj_count_f did not return the correct count
+      when used with a file id of H5F_OBJ_ALL_F.
- h5repack would search the table of objects for a name, if the
- name did not match it tried to determine if the name without a
- leading slash would match. The logic was flawed! The table
- stored names(paths) without a leading slash and did a strstr
- of the table path to the name.
- The assumption was that if there was a difference of one then
- it was a match, however "pressure" would match "/pressure" as
- well as "/pressure1", "/pressure2", etc. Changed logic to remove
- any leading slash and then do a full compare of the name.
+ (MSB - 2018/5/15, HDFFV-10405)
- (ADB - 2018/01/18, HDFFV-10393)
-
- - h5repack
-
-      h5repack failed to handle command line parameters for custom filters.
-
-      User-defined filter parameter conversions would fail when integers were
-      represented on the command line with character strings
-      larger than 9 characters. Increased the local variable array for storing
- the current command line parameter to prevent buffer overflows.
-
- (ADB - 2018/01/17, HDFFV-10392)
-
- - h5diff
-
- h5diff seg faulted if comparing VL strings against fixed strings.
-
- Reworked solution for HDFFV-8625 and HDFFV-8639. Implemented the check
- for string objects of same type in the diff_can_type function by
- adding an if(tclass1 == H5T_STRING) block. This "if block" moves the
- same check that was added for attributes to this function, which is
- used by all object types. This function handles complex type structures.
- Also added a new test file in h5diffgentest for testing this issue
- and removed the temporary files used in the test scripts.
-
- (ADB - 2018/01/04, HDFFV-8745)
-
- - h5repack
-
- h5repack failed to copy a dataset with existing filter.
-
- Reworked the h5repack and h5diff code in the tools library. Added
- improved error handling, cleanup of resources, and checks of calls.
- Modified H5Zfilter_avail and the private function H5Z_filter_avail.
- Moved the check for a plugin from the public to the private function.
- Updated H5P__set_filter due to the change in H5Z_filter_avail. Updated tests.
- Note, h5repack output display has changed to clarify the individual
- steps of the repack process. The output indicates if an operation
- applies to all objects. Lines with notation and no information
- have been removed.
-
- (ADB - 2017/10/10, HDFFV-10297, HDFFV-10319)
-
- - h5repack
-
- h5repack always set the User Defined filter flag to H5Z_FLAG_MANDATORY.
-
- Added another parameter to the 'UD=' option to set the flag; the default
- is '0' or H5Z_FLAG_MANDATORY, and the other choice is '1' or H5Z_FLAG_OPTIONAL.
-
- (ADB - 2017/08/31, HDFFV-10269)
-
- - h5ls
-
- h5ls generated an error on the stack when it encountered an H5S_NULL
- dataspace.
-
- Adding checks for H5S_NULL before calling H5Sis_simple (located
- in the h5tools_dump_mem function) fixed the issue.
-
- (ADB - 2017/08/17, HDFFV-10188)
-
- - h5repack
-
- Added tests to h5repack.sh.in to verify options added for paged
- aggregation work as expected.
-
- (VC - 2017/08/03)
-
- - h5dump
-
- h5dump segfaulted on output of XML file.
-
- The function that escaped strings used the full buffer length
- instead of just the length of the replacement string in a
- strncpy call. Using the correct length fixed the issue.
-
- (ADB - 2017/08/01, HDFFV-10256)
-
- - h5diff
-
- h5diff segfaulted on compare of a NULL variable length string.
-
- Improved h5diff comparison of strings by adding a check for
- NULL strings and setting the lengths to zero.
-
- (ADB - 2017/07/25, HDFFV-10246)
-
- - h5import
-
- h5import crashed trying to import data from a subset of a dataset.
-
- Improved h5import by adding the SUBSET keyword. h5import now uses
- the Count times the Block as the size of the dimensions.
- Added the INPUT_B_ORDER keyword to old-style configuration files.
- The import-from-h5dump function expects the binary file to use native
- types (h5dump FILE '-b' option).
-
- (ADB - 2017/06/15, HDFFV-10219)
-
- - h5repack
-
- h5repack did not maintain the creation order flag of the root
- group.
-
- Improved h5repack by reading the creation order and applying the
- flag to the new root group. Also added command line arguments to set
- the order and index direction, which apply to the traversal of the
- original file.
-
- (ADB - 2017/05/26, HDFFV-8611)
-
- - h5diff
-
- h5diff failed to account for strpad type and null terminators
- of char strings. Also, h5diff failed to account for string length
- differences and would give a different result depending on file
- order in the command line.
-
- Improved h5diff comparison of strings and arrays by adding a check for
- string lengths and for whether the strpad was null filled.
-
- (ADB - 2017/05/18, HDFFV-9055, HDFFV-10128)
-
- High-Level APIs:
- ------
- - H5DOwrite_chunk() problems when overwriting an existing chunk with
- no filters enabled.
-
- When overwriting chunks and no filters were being used, the library would
- fail (when asserts are enabled, e.g. debug builds) or incorrectly
- insert additional chunks instead of overwriting (when asserts are not
- enabled, e.g. production builds).
-
- This has been fixed and a test was added to the hl/test_dset_opt test.
-
- (DER - 2017/05/11, HDFFV-10187)
C++ APIs
--------
- - Removal of memory leaks.
-
- A private function was inadvertently called, causing memory leaks. This
- is now fixed.
+ - Added default arguments to existing functions (see the sketch after
+ this entry)
- (BMR - 2018/03/12 - user reported in email)
+ Added the following items:
+ + Two more property list arguments are added to H5Location::createDataSet:
+ const DSetAccPropList& dapl = DSetAccPropList::DEFAULT
+ const LinkCreatPropList& lcpl = LinkCreatPropList::DEFAULT
- Testing
- -------
- - Memory for three variables in testphdf5's coll_write_test was malloced
- but not freed, leaking memory when running the test.
-
- The variables' memory is now freed.
+ + One more property list argument is added to H5Location::openDataSet:
+ const DSetAccPropList& dapl = DSetAccPropList::DEFAULT
- (LRK - 2018/03/12, HDFFV-10397)
+ (BMR - 2018/07/21, PR# 1146)
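+ A minimal C++ sketch of the calls described above; the file and dataset
+ names are illustrative, and the exact parameter order shown (dcpl, then
+ dapl, then lcpl) is an assumption based on the defaults listed in this
+ entry:
+
+   #include "H5Cpp.h"
+   using namespace H5;
+
+   int main()
+   {
+       H5File file("example.h5", H5F_ACC_TRUNC);   // illustrative name
+
+       hsize_t dims[1] = {10};
+       DataSpace space(1, dims);
+
+       // Existing calls still compile: the new dapl/lcpl arguments
+       // default to DSetAccPropList::DEFAULT / LinkCreatPropList::DEFAULT.
+       DataSet d1 = file.createDataSet("data", PredType::NATIVE_INT, space);
+
+       // Callers can now supply explicit access and link creation
+       // property lists as well.
+       DSetCreatPropList dcpl;            // creation properties (unchanged)
+       DSetAccPropList   dapl;            // new: dataset access properties
+       LinkCreatPropList lcpl;            // new: link creation properties
+       DataSet d2 = file.createDataSet("data2", PredType::NATIVE_INT, space,
+                                       dcpl, dapl, lcpl);
+
+       // openDataSet gains the dataset access property list argument.
+       DataSet d3 = file.openDataSet("data", dapl);
+       return 0;
+   }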
- - Refactored the testpar/t_bigio.c test to include ALARM macros
+ - Improved C++ documentation
- Changed the test to include the ALARM_ON and ALARM_OFF macros which
- are intended to prevent nightly test hangs that have been observed
- with this particular parallel test example. The code was also modified to
- simplify status reporting (only from MPI rank 0), and additional
- status checking was added.
+ Replaced the table in the main page of the C++ documentation, converting
+ it from mht to htm format for portability.
- (RAW - 2017/11/08, HDFFV-10301)
+ (BMR - 2018/07/17, PR# 1141)
Supported Platforms
@@ -1081,7 +464,7 @@ Supported Platforms
Linux 3.10.0-327.10.1.el7 GNU C (gcc), Fortran (gfortran), C++ (g++)
#1 SMP x86_64 GNU/Linux compilers:
(kituo/moohan) Version 4.8.5 20150623 (Red Hat 4.8.5-4)
- Version 4.9.3, Version 5.2.0,
+ Version 4.9.3, Version 5.2.0
Intel(R) C (icc), C++ (icpc), Fortran (icc)
compilers:
Version 17.0.0.098 Build 20160721
@@ -1091,25 +474,18 @@ Supported Platforms
(emu) Sun Fortran 95 8.6 SunOS_sparc
Sun C++ 5.12 SunOS_sparc
- Windows 7 Visual Studio 2012 w/ Intel Fortran 15 (cmake)
- Visual Studio 2013 w/ Intel Fortran 15 (cmake)
- Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+ Windows 7 Visual Studio 2015 w/ Intel Fortran 16 (cmake)
Windows 7 x64 Visual Studio 2012 w/ Intel Fortran 15 (cmake)
Visual Studio 2013 w/ Intel Fortran 15 (cmake)
Visual Studio 2015 w/ Intel Fortran 16 (cmake)
Visual Studio 2015 w/ Intel C, Fortran 2017 (cmake)
Visual Studio 2015 w/ MSMPI 8 (cmake)
- Cygwin(CYGWIN_NT-6.1 2.8.0(0.309/5/3)
- gcc and gfortran compilers (GCC 5.4.0)
- (cmake and autotools)
- Windows 10 Visual Studio 2015 w/ Intel Fortran 16 (cmake)
- Cygwin(CYGWIN_NT-6.1 2.8.0(0.309/5/3)
- gcc and gfortran compilers (GCC 5.4.0)
- (cmake and autotools)
+ Windows 10 Visual Studio 2015 w/ Intel Fortran 18 (cmake)
- Windows 10 x64 Visual Studio 2015 w/ Intel Fortran 16 (cmake)
+ Windows 10 x64 Visual Studio 2015 w/ Intel Fortran 18 (cmake)
+ Visual Studio 2017 w/ Intel Fortran 18 (cmake)
Mac OS X Yosemite 10.10.5 Apple clang/clang++ version 6.1 from Xcode 7.0
64-bit gfortran GNU Fortran (GCC) 4.9.2
@@ -1123,7 +499,6 @@ Supported Platforms
64-bit gfortran GNU Fortran (GCC) 7.1.0
(swallow/kite) Intel icc/icpc/ifort version 17.0.2
-
Tested Configuration Features Summary
=====================================
@@ -1147,10 +522,10 @@ Windows 7 Cygwin n y/n n y y y
Windows 7 x64 Cygwin n y/n n y y y
Windows 10 y y/y n y y y
Windows 10 x64 y y/y n y y y
-Mac OS X Mountain Lion 10.8.5 64-bit n y/y n y y y
Mac OS X Mavericks 10.9.5 64-bit n y/y n y y y
Mac OS X Yosemite 10.10.5 64-bit n y/y n y y y
Mac OS X El Capitan 10.11.6 64-bit n y/y n y y y
+Mac OS Sierra 10.12.6 64-bit n y/y n y y y
CentOS 7.2 Linux 2.6.32 x86_64 PGI n y/y n y y y
CentOS 7.2 Linux 2.6.32 x86_64 GNU y y/y y y y y
CentOS 7.2 Linux 2.6.32 x86_64 Intel n y/y n y y y
@@ -1167,10 +542,10 @@ Windows 7 Cygwin n n n y
Windows 7 x64 Cygwin n n n y
Windows 10 y y y y
Windows 10 x64 y y y y
-Mac OS X Mountain Lion 10.8.5 64-bit y n y y
Mac OS X Mavericks 10.9.5 64-bit y n y y
Mac OS X Yosemite 10.10.5 64-bit y n y y
Mac OS X El Capitan 10.11.6 64-bit y n y y
+Mac OS Sierra 10.12.6 64-bit y n y y
CentOS 7.2 Linux 2.6.32 x86_64 PGI y y y n
CentOS 7.2 Linux 2.6.32 x86_64 GNU y y y y
CentOS 7.2 Linux 2.6.32 x86_64 Intel y y y n
@@ -1190,19 +565,20 @@ The following platforms are not supported but have been tested for this release.
Version 4.9.3, 5.3.0, 6.2.0
PGI C, Fortran, C++ for 64-bit target on
x86-64;
- Version 17.10-0
+ Version 17.10-0
Intel(R) C (icc), C++ (icpc), Fortran (icc)
compilers:
- Version 17.0.4.196 Build 20170411
+ Version 17.0.4.196 Build 20170411
MPICH 3.1.4 compiled with GCC 4.9.3
Linux 3.10.0-327.18.2.el7 GNU C (gcc) and C++ (g++) compilers
#1 SMP x86_64 GNU/Linux Version 4.8.5 20150623 (Red Hat 4.8.5-4)
(jelly) with NAG Fortran Compiler Release 6.1(Tozai)
GCC Version 7.1.0
- OpenMPI 3.0.0-GCC-7.2.0-2.29
+ OpenMPI 3.0.0-GCC-7.2.0-2.29,
+ 3.1.0-GCC-7.2.0-2.29
Intel(R) C (icc) and C++ (icpc) compilers
- Version 17.0.0.098 Build 20160721
+ Version 17.0.0.098 Build 20160721
with NAG Fortran Compiler Release 6.1(Tozai)
Linux 3.10.0-327.10.1.el7 MPICH 3.2 compiled with GCC 5.3.0
@@ -1251,3 +627,38 @@ Known Problems
in the HDF5 source. Please report any new problems found to
help@hdfgroup.org.
+
+CMake vs. Autotools installations
+=================================
+While both build systems produce similar results, there are differences.
+Each system produces the same set of folders on Linux (only CMake works
+on standard Windows): bin, include, lib, and share. Autotools places the
+COPYING and RELEASE.txt files in the root folder, while CMake places them
+in the share folder.
+
+The bin folder contains the tools and the build scripts. Additionally, CMake
+creates dynamic versions of the tools with the suffix "-shared". Autotools
+installs one set of tools depending on the "--enable-shared" configuration
+option.
+ build scripts
+ -------------
+ Autotools: h5c++, h5cc, h5fc
+ CMake: h5c++, h5cc, h5hlc++, h5hlcc
+
+The include folder holds the header files and the Fortran mod files. CMake
+places the Fortran mod files into separate shared and static subfolders,
+while Autotools places one set of mod files into the include folder. Because
+CMake produces a tools library, the header files for tools will appear in
+the include folder.
+
+The lib folder contains the library files, and CMake adds the pkgconfig
+subfolder with the hdf5*.pc files used by the bin/build scripts created by
+the CMake build. CMake separates the C interface code from the Fortran code by
+creating C-stub libraries for each Fortran library. In addition, only CMake
+installs the tools library. The names of the szip libraries are different
+between the build systems.
+
+The share folder will have the most differences, because CMake builds include
+a number of CMake-specific files that support CMake's find_package command
+and the HDF5 Examples CMake project.
+
diff --git a/release_docs/USING_CMake_Examples.txt b/release_docs/USING_CMake_Examples.txt
index f188ab3..6f744d9 100644
--- a/release_docs/USING_CMake_Examples.txt
+++ b/release_docs/USING_CMake_Examples.txt
@@ -48,6 +48,8 @@ Default installation process:
with the CTEST_SOURCE_NAME script option.
The default installation folder is defined as "@CMAKE_INSTALL_PREFIX@".
It can be changed with the INSTALLDIR script option.
+ (Note: Windows has issues with spaces in paths; the path will need to
+ be set correctly.)
The default ctest configuration is defined as "Release". It can be changed
with the CTEST_CONFIGURATION_TYPE script option. Note that this must
be the same as the value used with the -C command line option.
diff --git a/release_docs/USING_HDF5_CMake.txt b/release_docs/USING_HDF5_CMake.txt
index 169a06f..73a24d9 100644
--- a/release_docs/USING_HDF5_CMake.txt
+++ b/release_docs/USING_HDF5_CMake.txt
@@ -188,13 +188,13 @@ string(TOLOWER ${LIB_TYPE} SEARCH_TYPE)
find_package (HDF5 NAMES hdf5 COMPONENTS C ${SEARCH_TYPE})
# find_package (HDF5) # Find non-cmake built HDF5
-INCLUDE_DIRECTORIES (${HDF5_INCLUDE_DIR})
+set_directory_properties(PROPERTIES INCLUDE_DIRECTORIES "${HDF5_INCLUDE_DIR}")
set (LINK_LIBS ${LINK_LIBS} ${HDF5_C_${LIB_TYPE}_LIBRARY})
set (example hdf_example)
add_executable (${example} ${PROJECT_SOURCE_DIR}/${example}.c)
-TARGET_C_PROPERTIES (${example} ${LIB_TYPE} " " " ")
+TARGET_C_PROPERTIES (${example} PRIVATE ${LIB_TYPE})
target_link_libraries (${example} ${LINK_LIBS})
enable_testing ()
@@ -225,232 +225,6 @@ Also available at the HDF web site is a CMake application framework template.
You can quickly add files to the framework and execute the script to compile
your application with an installed HDF5 binary.
-========================================================================
-ctest use of HDF5_Examples.cmake and HDF5_Examples_options.cmake
-========================================================================
-
-cmake_minimum_required (VERSION 3.10)
-###############################################################################################################
-# This script will build and run the examples from a folder
-# Execute from a command line:
-# ctest -S HDF5_Examples.cmake,OPTION=VALUE -C Release -VV -O test.log
-###############################################################################################################
-
-set(CTEST_CMAKE_GENERATOR "@CMAKE_GENERATOR@")
-if("@CMAKE_GENERATOR_TOOLSET@")
- set(CMAKE_GENERATOR_TOOLSET "@CMAKE_GENERATOR_TOOLSET@")
-endif()
-set(CTEST_DASHBOARD_ROOT ${CTEST_SCRIPT_DIRECTORY})
-
-# handle input parameters to script.
-#INSTALLDIR - HDF5 root folder
-#CTEST_CONFIGURATION_TYPE - Release, Debug, RelWithDebInfo
-#CTEST_SOURCE_NAME - name of source folder; HDF5Examples
-if(DEFINED CTEST_SCRIPT_ARG)
- # transform ctest script arguments of the form
- # script.ctest,var1=value1,var2=value2
- # to variables with the respective names set to the respective values
- string(REPLACE "," ";" script_args "${CTEST_SCRIPT_ARG}")
- foreach(current_var ${script_args})
- if("${current_var}" MATCHES "^([^=]+)=(.+)$")
- set("${CMAKE_MATCH_1}" "${CMAKE_MATCH_2}")
- endif()
- endforeach()
-endif()
-
-###################################################################
-### Following Line is one of [Release, RelWithDebInfo, Debug] #####
-set(CTEST_CONFIGURATION_TYPE "$ENV{CMAKE_CONFIG_TYPE}")
-if(NOT DEFINED CTEST_CONFIGURATION_TYPE)
- set(CTEST_CONFIGURATION_TYPE "Release")
-endif()
-set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DCTEST_CONFIGURATION_TYPE:STRING=${CTEST_CONFIGURATION_TYPE}")
-##################################################################
-
-if(NOT DEFINED INSTALLDIR)
- set(INSTALLDIR "@CMAKE_INSTALL_PREFIX@")
-endif()
-
-if(NOT DEFINED CTEST_SOURCE_NAME)
- set(CTEST_SOURCE_NAME "HDF5Examples")
-endif()
-
-if(NOT DEFINED HDF_LOCAL)
- set(CDASH_LOCAL "NO")
-else()
- set(CDASH_LOCAL "YES")
-endif()
-if(NOT DEFINED CTEST_SITE)
- set(CTEST_SITE "local")
-endif()
-if(NOT DEFINED CTEST_BUILD_NAME)
- set(CTEST_BUILD_NAME "examples")
-endif()
-set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DSITE:STRING=${CTEST_SITE} -DBUILDNAME:STRING=${CTEST_BUILD_NAME}")
-
-#TAR_SOURCE - name of tarfile
-#if(NOT DEFINED TAR_SOURCE)
-# set(CTEST_USE_TAR_SOURCE "HDF5Examples-1.10.7-Source")
-#endif()
-
-###############################################################################################################
-if(WIN32)
- set(SITE_OS_NAME "Windows")
- set(ENV{HDF5_DIR} "${INSTALLDIR}/cmake")
- set(CTEST_BINARY_NAME ${CTEST_SOURCE_NAME}\\build)
- set(CTEST_SOURCE_DIRECTORY "${CTEST_DASHBOARD_ROOT}\\${CTEST_SOURCE_NAME}")
- set(CTEST_BINARY_DIRECTORY "${CTEST_DASHBOARD_ROOT}\\${CTEST_BINARY_NAME}")
-else()
- set(ENV{HDF5_DIR} "${INSTALLDIR}/share/cmake")
- set(ENV{LD_LIBRARY_PATH} "${INSTALLDIR}/lib")
- set(CTEST_BINARY_NAME ${CTEST_SOURCE_NAME}/build)
- set(CTEST_SOURCE_DIRECTORY "${CTEST_DASHBOARD_ROOT}/${CTEST_SOURCE_NAME}")
- set(CTEST_BINARY_DIRECTORY "${CTEST_DASHBOARD_ROOT}/${CTEST_BINARY_NAME}")
-endif()
-if(${CDASH_LOCAL})
- set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DCDASH_LOCAL:BOOL=ON")
-endif()
-set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF5_PACKAGE_NAME:STRING=@HDF5_PACKAGE@@HDF_PACKAGE_EXT@")
-
-###############################################################################################################
-# For any comments please contact help@hdfgroup.org
-#
-###############################################################################################################
-
-#############################################################################################
-#### Change default configuration of options in config/cmake/cacheinit.cmake file ###
-#### format for file: set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DXXX:YY=ZZZZ") ###
-#############################################################################################
-if(WIN32)
- include(${CTEST_SCRIPT_DIRECTORY}\\HDF5_Examples_options.cmake)
-else()
- include(${CTEST_SCRIPT_DIRECTORY}/HDF5_Examples_options.cmake)
-endif()
-
-#-----------------------------------------------------------------------------
-set (CTEST_CMAKE_COMMAND "\"${CMAKE_COMMAND}\"")
-## --------------------------
-if (CTEST_USE_TAR_SOURCE)
- ## Uncompress source if tar or zip file provided
- ## --------------------------
- if (WIN32)
- message (STATUS "extracting... [${CMAKE_EXECUTABLE_NAME} -E tar -xvf ${CTEST_USE_TAR_SOURCE}.zip]")
- execute_process (COMMAND ${CMAKE_EXECUTABLE_NAME} -E tar -xvf ${CTEST_DASHBOARD_ROOT}\\${CTEST_USE_TAR_SOURCE}.zip RESULT_VARIABLE rv)
- else ()
- message (STATUS "extracting... [${CMAKE_EXECUTABLE_NAME} -E tar -xvf ${CTEST_USE_TAR_SOURCE}.tar]")
- execute_process (COMMAND ${CMAKE_EXECUTABLE_NAME} -E tar -xvf ${CTEST_DASHBOARD_ROOT}/${CTEST_USE_TAR_SOURCE}.tar RESULT_VARIABLE rv)
- endif ()
-
- if (NOT rv EQUAL 0)
- message (STATUS "extracting... [error-(${rv}) clean up]")
- file (REMOVE_RECURSE "${CTEST_SOURCE_DIRECTORY}")
- message (FATAL_ERROR "error: extract of ${CTEST_SOURCE_NAME} failed")
- endif ()
-endif ()
-
-#-----------------------------------------------------------------------------
-## Clear the build directory
-## --------------------------
-set (CTEST_START_WITH_EMPTY_BINARY_DIRECTORY TRUE)
-if (EXISTS "${CTEST_BINARY_DIRECTORY}" AND IS_DIRECTORY "${CTEST_BINARY_DIRECTORY}")
- ctest_empty_binary_directory (${CTEST_BINARY_DIRECTORY})
-else ()
- file (MAKE_DIRECTORY "${CTEST_BINARY_DIRECTORY}")
-endif ()
-
-# Use multiple CPU cores to build
-include (ProcessorCount)
-ProcessorCount (N)
-if (NOT N EQUAL 0)
- if (NOT WIN32)
- set (CTEST_BUILD_FLAGS -j${N})
- endif ()
- set (ctest_test_args ${ctest_test_args} PARALLEL_LEVEL ${N})
-endif ()
-set (CTEST_CONFIGURE_COMMAND
- "${CTEST_CMAKE_COMMAND} -C \"${CTEST_SOURCE_DIRECTORY}/config/cmake/cacheinit.cmake\" -DCMAKE_BUILD_TYPE:STRING=${CTEST_CONFIGURATION_TYPE} ${BUILD_OPTIONS} \"-G${CTEST_CMAKE_GENERATOR}\" \"${CTEST_SOURCE_DIRECTORY}\""
-)
-
-#-----------------------------------------------------------------------------
-## -- set output to english
-set ($ENV{LC_MESSAGES} "en_EN")
-
-#-----------------------------------------------------------------------------
-configure_file (${CTEST_SOURCE_DIRECTORY}/config/cmake/CTestCustom.cmake ${CTEST_BINARY_DIRECTORY}/CTestCustom.cmake)
-ctest_read_custom_files ("${CTEST_BINARY_DIRECTORY}")
-## NORMAL process
-## --------------------------
-ctest_start (Experimental)
-ctest_configure (BUILD "${CTEST_BINARY_DIRECTORY}")
-if (LOCAL_SUBMIT)
- ctest_submit (PARTS Configure Notes)
-endif ()
-ctest_build (BUILD "${CTEST_BINARY_DIRECTORY}" APPEND)
-if (LOCAL_SUBMIT)
- ctest_submit (PARTS Build)
-endif ()
-ctest_test (BUILD "${CTEST_BINARY_DIRECTORY}" APPEND ${ctest_test_args} RETURN_VALUE res)
-if (LOCAL_SUBMIT)
- ctest_submit (PARTS Test)
-endif ()
-if (res GREATER 0)
- message (FATAL_ERROR "tests FAILED")
-endif ()
-#-----------------------------------------------------------------------------
-##############################################################################################################
-
-##############################################################################################################
-#### HDF5_Examples_options.cmake ###
-#### Change default configuration of options in config/cmake/cacheinit.cmake file ###
-##############################################################################################################
-#############################################################################################
-#### Change default configuration of options in config/cmake/cacheinit.cmake file ###
-#### format: set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DXXX:YY=ZZZZ") ###
-#### DEFAULT: ###
-#### BUILD_SHARED_LIBS:BOOL=OFF ###
-#### HDF_BUILD_C:BOOL=ON ###
-#### HDF_BUILD_CXX:BOOL=OFF ###
-#### HDF_BUILD_FORTRAN:BOOL=OFF ###
-#### HDF_BUILD_JAVA:BOOL=OFF ###
-#### BUILD_TESTING:BOOL=OFF ###
-#### HDF_ENABLE_PARALLEL:BOOL=OFF ###
-#### HDF_ENABLE_THREADSAFE:BOOL=OFF ###
-#############################################################################################
-
-### uncomment/comment and change the following lines for other configuration options
-### build with shared libraries
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DBUILD_SHARED_LIBS:BOOL=ON")
-
-#############################################################################################
-#### languages ####
-### disable C builds
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_BUILD_C:BOOL=OFF")
-
-### enable C++ builds
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_BUILD_CXX:BOOL=ON")
-
-### enable Fortran builds
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_BUILD_FORTRAN:BOOL=ON")
-
-### enable JAVA builds
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_BUILD_JAVA:BOOL=ON")
-
-#############################################################################################
-### enable parallel program builds
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_ENABLE_PARALLEL:BOOL=ON")
-
-#############################################################################################
-### enable threadsafe program builds
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DHDF_ENABLE_THREADSAFE:BOOL=ON")
-
-#############################################################################################
-### enable test program builds, requires reference files in testfiles subdirectory
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DBUILD_TESTING:BOOL=ON")
-#set(ADD_BUILD_OPTIONS "${ADD_BUILD_OPTIONS} -DCOMPARE_TESTING:BOOL=ON")
-
-#############################################################################################
-
-
========================================================================
For further assistance, send email to help@hdfgroup.org
diff --git a/release_docs/USING_HDF5_VS.txt b/release_docs/USING_HDF5_VS.txt
index 3019631..ba22753 100644
--- a/release_docs/USING_HDF5_VS.txt
+++ b/release_docs/USING_HDF5_VS.txt
@@ -15,9 +15,9 @@ NOTE: Building applications with the dynamic/shared hdf5 libraries requires
The following two sections are helpful if you do not use CMake to build
your applications.
-========================================================================
-Using Visual Studio 2010 with HDF5 Libraries built with Visual Studio 2010
-========================================================================
+==============================================================================================
+Using Visual Studio 2010 and above with HDF5 Libraries built with Visual Studio 2010 and above
+==============================================================================================
1. Set up path for external libraries and headers
@@ -79,13 +79,9 @@ Using Visual Studio 2008 with HDF5 Libraries built with Visual Studio 2008
3.1 FAQ
Many other common questions and hints are located online and being updated
- in the HDF5 FAQ. For Windows-specific questions, please see:
-
- https://support.hdfgroup.org/HDF5/faq/windows.html
-
- For all other general questions, you can look in the general FAQ:
+ in the HDF Knowledge Base; please see:
- https://support.hdfgroup.org/HDF5/HDF5-FAQ.html
+ https://portal.hdfgroup.org/display/knowledge/HDF+Knowledge+Base
************************************************************************
Please send email to help@hdfgroup.org for further assistance.