author     Larry Knox <lrknox@hdfgroup.org>    2023-05-19 16:30:12 (GMT)
committer  GitHub <noreply@github.com>         2023-05-19 16:30:12 (GMT)
commit     722574e05c34297e27ea8ef13f7b55efe5fc2de0 (patch)
tree       eccb9e3a2c19e8ca7c4a0ecf125ac1147bc63de5 /release_docs/RELEASE.txt
parent     dd82e186ff1719303f07a1c11c08f0905eb6c177 (diff)
Update version (#2971)
* Update version
Copy RELEASE.txt from 1.14.1-2 release into HISTORY-1_14.txt
Clear out RELEASE.txt
Diffstat (limited to 'release_docs/RELEASE.txt')
-rw-r--r--  release_docs/RELEASE.txt  376
1 file changed, 9 insertions(+), 367 deletions(-)
diff --git a/release_docs/RELEASE.txt b/release_docs/RELEASE.txt
index d956a1e..a163f35 100644
--- a/release_docs/RELEASE.txt
+++ b/release_docs/RELEASE.txt
@@ -1,4 +1,4 @@
-HDF5 version 1.14.1-1 currently under development
+HDF5 version 1.14.2-1 currently under development
 ================================================================================
@@ -47,117 +47,12 @@ New Features
 
     Configuration:
     -------------
-    - Added new CMake options for building and running HDF5 API tests
-      (Experimental)
-
-      HDF5 API tests are an experimental feature, primarily targeted
-      toward HDF5 VOL connector authors, that is currently being developed.
-      These tests exercise the HDF5 API and are being integrated back
-      into the HDF5 library from the HDF5 VOL tests repository
-      (https://github.com/HDFGroup/vol-tests). To support this feature,
-      the following new options have been added to CMake:
-
-        * HDF5_TEST_API: ON/OFF (Default: OFF)
-
-          Controls whether the HDF5 API tests will be built. These tests
-          will only be run during testing of HDF5 if the HDF5_TEST_SERIAL
-          (for serial tests) and HDF5_TEST_PARALLEL (for parallel tests)
-          options are enabled.
-
-        * HDF5_TEST_API_INSTALL: ON/OFF (Default: OFF)
-
-          Controls whether the HDF5 API test executables will be installed
-          on the system alongside the HDF5 library. This option is currently
-          not functional.
-
-        * HDF5_TEST_API_ENABLE_ASYNC: ON/OFF (Default: OFF)
-
-          Controls whether the HDF5 Async API tests will be built. These
-          tests will only be run if the VOL connector used supports Async
-          operations.
-
-        * HDF5_TEST_API_ENABLE_DRIVER: ON/OFF (Default: OFF)
-
-          Controls whether to build the HDF5 API test driver program. This
-          test driver program is useful for VOL connectors that use a
-          client/server model where the server needs to be up and running
-          before the VOL connector can function. This option is currently
-          not functional.
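[Editor's note, not part of the diff: as a concrete illustration of the API-test options named in this entry, a configure step might look like the sketch below. The source tree and build directory names are hypothetical, and only the `HDF5_TEST_*` options are taken from the release notes.]

```shell
# Sketch: configure HDF5 with the experimental API tests enabled and run
# them as part of the serial test suite (hypothetical paths "hdf5"/"build").
cmake -S hdf5 -B build \
      -DHDF5_TEST_API=ON \
      -DHDF5_TEST_SERIAL=ON \
      -DHDF5_TEST_API_ENABLE_ASYNC=OFF
cmake --build build
ctest --test-dir build
```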
-
-        * HDF5_TEST_API_SERVER: String (Default: "")
-
-          Used to specify a path to the server executable that the test
-          driver program should execute.
-
-    - Added support for CMake presets file.
-
-      CMake supports two main files, CMakePresets.json and CMakeUserPresets.json,
-      that allow users to specify common configure options and share them with
-      others. HDF added a CMakePresets.json file of a typical configuration and
-      support file, config/cmake-presets/hidden-presets.json.
-      Also added a section to INSTALL_CMake.txt with very basic explanation of
-      the process to use CMakePresets.
-
-    - Deprecated and removed old SZIP library in favor of LIBAEC library
-
-      LIBAEC library has been used in HDF5 binaries as the szip library of choice
-      for a few years. We are removing the options for using the old SZIP library.
-
-      Also removed the config/cmake/FindSZIP.cmake file.
-
-    - Enabled instrumentation of the library by default in CMake for parallel
-      debug builds
-
-      HDF5 can be configured to instrument portions of the parallel library to
-      aid in debugging. Autotools builds of HDF5 turn this capability on by
-      default for parallel debug builds and off by default for other build types.
-      CMake has been updated to match this behavior.
-
-    - Added new option to build libaec and zlib inline with CMake.
-
-      Using the CMake FetchContent module, the external filters can populate
-      content at configure time via any method supported by the ExternalProject
-      module. Whereas ExternalProject_Add() downloads at build time, the
-      FetchContent module makes content available immediately, allowing the
-      configure step to use the content in commands like add_subdirectory(),
-      include() or file() operations.
-
-      The HDF options (and defaults) for using this are:
-          BUILD_SZIP_WITH_FETCHCONTENT:BOOL=OFF
-          LIBAEC_USE_LOCALCONTENT:BOOL=OFF
-          BUILD_ZLIB_WITH_FETCHCONTENT:BOOL=OFF
-          ZLIB_USE_LOCALCONTENT:BOOL=OFF
-
-      The CMake variables to control the path and file names:
-          LIBAEC_TGZ_ORIGPATH:STRING
-          LIBAEC_TGZ_ORIGNAME:STRING
-          ZLIB_TGZ_ORIGPATH:STRING
-          ZLIB_TGZ_ORIGNAME:STRING
-
-      See the CMakeFilters.cmake and config/cmake/cacheinit.cmake files for usage.
+    -
 
     Library:
     --------
-    - Added a Subfiling VFD configuration file prefix environment variable
-
-      The Subfiling VFD now checks for values set in a new environment
-      variable "H5FD_SUBFILING_CONFIG_FILE_PREFIX" to determine if the
-      application has specified a pathname prefix to apply to the file
-      path for its configuration file. For example, this can be useful
-      for cases where the application wishes to write subfiles to a
-      machine's node-local storage while placing the subfiling configuration
-      file on a file system readable by all machine nodes.
-
-    - Added H5Pset_selection_io(), H5Pget_selection_io(), and
-      H5Pget_no_selection_io_cause() API functions to manage the selection I/O
-      feature. This can be used to enable collective I/O with type conversion,
-      or it can be used with custom VFDs that support vector or selection I/O.
-
-    - Added H5Pset_modify_write_buf() and H5Pget_modify_write_buf() API
-      functions to allow the library to modify the contents of write buffers, in
-      order to avoid malloc/memcpy. Currently only used for type conversion
-      with selection I/O.
+    -
 
     Parallel Library:
@@ -167,11 +62,8 @@ New Features
 
     Fortran Library:
     ----------------
-    - Fortran async APIs H5A, H5D, H5ES, H5G, H5F, H5L and H5O were added.
+    -
-
-    - Added Fortran APIs:
-      h5pset_selection_io_f, h5pget_selection_io_f
-      h5pset_modify_write_buf_f, h5pget_modify_write_buf_f
 
     C++ Library:
     ------------
@@ -205,9 +97,7 @@ New Features
 
     Documentation:
     --------------
-    - Ported the existing VOL Connector Author Guide document to doxygen.
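[Editor's note, not part of the diff: the selection I/O property-list functions named in the Library entries above might be exercised as in this sketch. Only the H5Pset_selection_io()/H5Pget_selection_io() names come from the release notes; the surrounding boilerplate is ordinary HDF5 usage, and building it requires an HDF5 1.14.1+ installation.]

```c
#include <stdio.h>
#include <hdf5.h>

int main(void)
{
    /* Create a dataset transfer property list; selection I/O is a
     * per-transfer setting, so it lives on a DXPL. */
    hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
    if (dxpl < 0)
        return 1;

    /* Request selection I/O for transfers that use this DXPL. */
    if (H5Pset_selection_io(dxpl, H5D_SELECTION_IO_MODE_ON) < 0)
        return 1;

    /* Read the setting back to confirm it took effect. */
    H5D_selection_io_mode_t mode;
    if (H5Pget_selection_io(dxpl, &mode) < 0)
        return 1;
    printf("selection I/O %s\n",
           mode == H5D_SELECTION_IO_MODE_ON ? "enabled" : "not enabled");

    H5Pclose(dxpl);
    return 0;
}
```

The DXPL would then be passed as the transfer property list to H5Dread()/H5Dwrite().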
-
-      Added new dox file, VOLConnGuide.dox.
+    -
 
 Support for new platforms, languages and compilers
@@ -215,199 +105,11 @@ Support for new platforms, languages and compilers
 
-
-Bug Fixes since HDF5-1.14.0 release
+Bug Fixes since HDF5-1.14.1 release
 ===================================
 
     Library
     -------
-    - Fixed a bug in H5Ocopy that could generate invalid HDF5 files
-
-      H5Ocopy was missing a check to determine whether the new object's
-      object header version is greater than version 1. Without this check,
-      copying of objects with object headers that are smaller than a
-      certain size would cause H5Ocopy to create an object header for the
-      new object that has a gap in the header data. According to the
-      HDF5 File Format Specification, this is not allowed for version
-      1 of the object header format.
-
-      Fixes GitHub issue #2653
-
-    - Fixed H5Pget_vol_cap_flags and H5Pget_vol_id to accept H5P_DEFAULT
-
-      H5Pget_vol_cap_flags and H5Pget_vol_id were updated to correctly
-      accept H5P_DEFAULT for the 'plist_id' FAPL parameter. Previously,
-      they would fail if provided with H5P_DEFAULT as the FAPL.
-
-    - Fixed ROS3 VFD anonymous credential usage with h5dump and h5ls
-
-      ROS3 VFD anonymous credential functionality became broken in h5dump
-      and h5ls in the HDF5 1.14.0 release with the added support for VFD
-      plugins, which changed the way that the tools handled setting of
-      credential information that the VFD uses. The tools could be
-      provided the command-line option of "--s3-cred=(,,)" as a workaround
-      for anonymous credential usage, but the documentation for this
-      option stated that anonymous credentials could be used by simply
-      omitting the option. The latter functionality has been restored.
-
-      Fixes GitHub issue #2406
-
-    - Fixed memory leaks when processing malformed object header continuation
-      messages
-
-      Malformed object header continuation messages can result in a too-small
-      buffer being passed to the decode function, which could lead to reading
-      past the end of the buffer. Additionally, errors in processing these
-      malformed messages can lead to allocated memory not being cleaned up.
-
-      This fix adds bounds checking and cleanup code to the object header
-      continuation message processing.
-
-      Fixes GitHub issue #2604
-
-    - Fixed memory leaks, aborts, and overflows in H5O EFL decode
-
-      The external file list code could call assert(), read past buffer
-      boundaries, and not properly clean up resources when parsing malformed
-      external data files messages.
-
-      This fix cleans up allocated memory, adds buffer bounds checks, and
-      converts asserts to HDF5 error checking.
-
-      Fixes GitHub issue #2605
-
-    - Fixed potential heap buffer overflow in decoding of link info message
-
-      Detections of buffer overflow were added for decoding version, index
-      flags, link creation order value, and the next three addresses. The
-      checkings will remove the potential invalid read of any of these
-      values that could be triggered by a malformed file.
-
-      Fixes GitHub issue #2603
-
-    - Memory leak
-
-      Memory leak was detected when running h5dump with "pov". The memory was
-      allocated via H5FL__malloc() in hdf5/src/H5FL.c
-
-      The fuzzed file "pov" was an HDF5 file containing an illegal continuation
-      message. When deserializing the object header chunks for the file, memory
-      is allocated for the array of continuation messages (cont_msg_info->msgs)
-      in continuation message info struct. As error is encountered in loading
-      the illegal message, the memory allocated for cont_msg_info->msgs needs
-      to be freed.
-
-      Fixes GitHub issue #2599
-
-    - Fixed memory leaks that could occur when reading a dataset from a
-      malformed file
-
-      When attempting to read layout, pline, and efl information for a
-      dataset, memory leaks could occur if attempting to read pline/efl
-      information threw an error, which is due to the memory that was
-      allocated for pline and efl not being properly cleaned up on error.
-
-      Fixes GitHub issue #2602
-
-    - Fixed potential heap buffer overrun in group info header decoding from
-      malformed file
-
-      H5O__ginfo_decode could sometimes read past allocated memory when parsing
-      a group info message from the header of a malformed file.
-
-      It now checks buffer size before each read to properly throw an error in
-      these cases.
-
-      Fixes GitHub issue #2601
-
-    - Fixed potential buffer overrun issues in some object header decode routines
-
-      Several checks were added to H5O__layout_decode and H5O__sdspace_decode to
-      ensure that memory buffers don't get overrun when decoding buffers read from
-      a (possibly corrupted) HDF5 file.
-
-    - Fixed issues in the Subfiling VFD when using the SELECT_IOC_EVERY_NTH_RANK
-      or SELECT_IOC_TOTAL I/O concentrator selection strategies
-
-      Multiple bugs involving these I/O concentrator selection strategies
-      were fixed, including:
-
-        * A bug that caused the selection strategy to be altered when
-          criteria for the strategy was specified in the
-          H5FD_SUBFILING_IOC_SELECTION_CRITERIA environment variable as
-          a single value, rather than in the old and undocumented
-          'integer:integer' format
-        * Two bugs which caused a request for 'N' I/O concentrators to
-          result in 'N - 1' I/O concentrators being assigned, which also
-          lead to issues if only 1 I/O concentrator was requested
-
-      Also added a regression test for these two I/O concentrator selection
-      strategies to prevent future issues.
-
-    - Fixed a heap buffer overflow that occurs when reading from
-      a dataset with a compact layout within a malformed HDF5 file
-
-      During opening of a dataset that has a compact layout, the
-      library allocates a buffer that stores the dataset's raw data.
-      The dataset's object header that gets written to the file
-      contains information about how large of a buffer the library
-      should allocate. If this object header is malformed such that
-      it causes the library to allocate a buffer that is too small
-      to hold the dataset's raw data, future I/O to the dataset can
-      result in heap buffer overflows. To fix this issue, an extra
-      check is now performed for compact datasets to ensure that
-      the size of the allocated buffer matches the expected size
-      of the dataset's raw data (as calculated from the dataset's
-      dataspace and datatype information). If the two sizes do not
-      match, opening of the dataset will fail.
-
-      Fixes GitHub issue #2606
-
-    - Fixed a memory corruption issue that can occur when reading
-      from a dataset using a hyperslab selection in the file
-      dataspace and a point selection in the memory dataspace
-
-      When reading from a dataset using a hyperslab selection in
-      the dataset's file dataspace and a point selection in the
-      dataset's memory dataspace where the file dataspace's "rank"
-      is greater than the memory dataspace's "rank", memory corruption
-      could occur due to an incorrect number of selection points
-      being copied when projecting the point selection onto the
-      hyperslab selection's dataspace.
-
-    - Fixed an issue with collective metadata writes of global heap data
-
-      New test failures in parallel netCDF started occurring with debug
-      builds of HDF5 due to an assertion failure and this was reported in
-      GitHub issue #2433. The assertion failure began happening after the
-      collective metadata write pathway in the library was updated to use
-      vector I/O so that parallel-enabled HDF5 Virtual File Drivers (other
-      than the existing MPI I/O VFD) can support collective metadata writes.
-
-      The assertion failure was fixed by updating collective metadata writes
-      to treat global heap metadata as raw data, as done elsewhere in the
-      library.
-
-      Fixes GitHub issue #2433
-
-    - Fix CVE-2021-37501 / GHSA-rfgw-5vq3-wrjf
-
-      Check for overflow when calculating on-disk attribute data size.
-
-      A bogus hdf5 file may contain dataspace messages with sizes
-      which lead to the on-disk data sizes to exceed what is addressable.
-      When calculating the size, make sure, the multiplication does not
-      overflow.
-      The test case was crafted in a way that the overflow caused the
-      size to be 0.
-
-      Fixes GitHub issue #2458
-
-    - Fixed buffer overflow error in image decoding function.
-
-      The error occurred in the function for decoding address from the specified
-      buffer, which is called many times from the function responsible for image
-      decoding. The length of the buffer is known in the image decoding function,
-      but no checks are produced, so the buffer overflow can occur in many places,
-      including callee functions for address decoding.
-
-      The error was fixed by inserting corresponding checks for buffer overflow.
-
-      Fixes GitHub issue #2432
+    -
 
     Java Library
@@ -417,72 +119,12 @@ Bug Fixes since HDF5-1.14.0 release
 
     Configuration
     -------------
-    - Fixed syntax of generator expressions used by CMake
-
-      Add quotes around the generator expression should allow CMake to
-      correctly parse the expression. Generator expressions are typically
-      parsed after command arguments. If a generator expression contains
-      spaces, new lines, semicolons or other characters that may be
-      interpreted as command argument separators, the whole expression
-      should be surrounded by quotes when passed to a command. Failure to
-      do so may result in the expression being split and it may no longer
-      be recognized as a generator expression.
-
-      Fixes GitHub issue #2906
-
-    - Fixed improper include of Subfiling VFD build directory
-
-      With the release of the Subfiling Virtual File Driver feature, compiler
-      flags were added to the Autotools build's CPPFLAGS and AM_CPPFLAGS
-      variables to always include the Subfiling VFD source code directory,
-      regardless of whether the VFD is enabled and built or not. These flags
-      are needed because the header files for the VFD contain macros that are
-      assumed to always be available, such as H5FD_SUBFILING_NAME, so the
-      header files are unconditionally included in the HDF5 library. However,
-      these flags are only needed when building HDF5, so they belong in the
-      H5_CPPFLAGS variable instead. Inclusion in the CPPFLAGS and AM_CPPFLAGS
-      variables would export these flags to the h5cc and h5c++ wrapper scripts,
-      as well as the libhdf5.settings file, which would break builds of software
-      that use HDF5 and try to use or parse information out of these files after
-      deleting temporary HDF5 build directories.
-
-      Fixes GitHub issues #2422 and #2621
-
-    - Correct the CMake generated pkg-config file
-
-      The pkg-config file generated by CMake had the order and placement of the
-      libraries wrong. Also added support for debug library names.
-
-      Changed the order of Libs.private libraries so that dependencies come after
-      dependents. Did not move the compression libraries into Requires.private
-      because there was not a way to determine if the compression libraries had
-      supported pkconfig files. Still recommend that the CMake config file method
-      be used for building projects with CMake.
-
-      Fixes GitHub issues #1546 and #2259
-
-    - Force lowercase Fortran module file names
-
-      The Cray Fortran compiler uses uppercase Fortran module file names, which
-      caused CMake installs to fail. A compiler option was added to use lowercase
-      instead.
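[Editor's note, not part of the diff: the pkg-config correction described above matters for builds like the following sketch. It assumes an installed HDF5 whose hdf5.pc is on PKG_CONFIG_PATH; the source file name is hypothetical.]

```shell
# Sketch: compile an application against HDF5 via the generated hdf5.pc.
# With the ordering fix, Libs.private lists dependencies after their
# dependents, so static links resolve in the correct order.
pkg-config --cflags --libs hdf5
cc my_app.c $(pkg-config --cflags --libs hdf5) -o my_app
```

As the entry notes, the CMake config-file package remains the recommended way to consume HDF5 from CMake projects.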
+    -
 
     Tools
     -----
-    - Names of objects with square brackets will have trouble without the
-      special argument, --no-compact-subset, on the h5dump command line.
-
-      h5diff did not have this option and now it has been added.
-
-      Fixes GitHub issue #2682
-
-    - In the tools traverse function - an error in either visit call
-      will bypass the cleanup of the local data variables.
-
-      Replaced the H5TOOLS_GOTO_ERROR with just H5TOOLS_ERROR.
-
-      Fixes GitHub issue #2598
+    -
 
     Performance