author     Larry Knox <lrknox@hdfgroup.org>    2024-02-15 22:51:33 (GMT)
committer  GitHub <noreply@github.com>         2024-02-15 22:51:33 (GMT)
commit     413d10f6e3d4db5341413ba7cd4f819eb5156a51 (patch)
tree       8741384bb37a09d0e8d31116c64d9065a0b046f3
parent     987a734e759c82c65a661ae6090b2252d63a7aec (diff)
parent     424cb6ecd35bc262120e250ee25706c3d3c3c15d (diff)
download   hdf5-413d10f6e3d4db5341413ba7cd4f819eb5156a51.zip
           hdf5-413d10f6e3d4db5341413ba7cd4f819eb5156a51.tar.gz
           hdf5-413d10f6e3d4db5341413ba7cd4f819eb5156a51.tar.bz2
Merge pull request #4019 from lrknox/1_14_dev_sync2_lrk
* Update upload-artifact to match download version (#3929)
* Reorg and update options for doc and cmake config (#3934)
* Add binary build for linux S3 (#3936)
* Clean up Doxygen for szip functions and constants (#3943)
* Replace off_t with HDoff_t internally (#3944)
  off_t is a 32-bit signed value on Windows, so we should use HDoff_t (which is __int64 on Windows) internally instead. Also defines HDftell on Windows to be _ftelli64().
* Fix chid_t to hid_t (#3948)
* Fortran API work (#3941)
  - Added Fortran APIs: H5FGET_INTENT_F, H5SSELECT_ITER_CREATE_F, H5SSEL_ITER_GET_SEQ_LIST_F, H5SSELECT_ITER_CLOSE_F, H5S_mp_H5SSELECT_ITER_RESET_F
  - Added Fortran parameters: H5S_SEL_ITER_GET_SEQ_LIST_SORTED_F, H5S_SEL_ITER_SHARE_WITH_DATASPACE_F
  - Added tests for the new APIs
  - Removed H5F C wrapper stubs
  - Miscellaneous documentation cleanup
* Add the user test program for committed types from HDFFV-9174 (#3937)
* Remove cached datatype conversion path table entries on file close (#3942)
* Fixed BIND name (#3957)
* Update H5Ssel_iter_reset_f test
* Change 'extensible' to 'fixed' in H5FA code (#3964)
* RF: move codespell configuration into .codespellrc so it can also be used locally (#3958)
* Add RELEASE.txt note for the fix for issue #1256 (#3955)
* Fix doxygen errors (#3962)
* Add API support for Fortran MPI_F08 module definitions (#3959)
* Revert to using the C stub for _F08 MPI APIs
* Use MPI compiler wrappers for cmake and nvhpc
* Added a GitHub Codespaces configuration (#3966)
* Fixed XL and gfortran errors (#3968)
* h5 compiler wrappers now pass all arguments passed to them to the compile line (#3954)
  The issue was that the "allargs" variable was not being used in the final command of the compiler wrapper. Any entries containing an escaped quote (\", \') or other non-matching argument (*) would not be passed to the compile line. Fixed by ensuring all arguments passed to the compiler wrapper are now included in the compile line.
* Add binary testing to CI testing (#3971)
* Replace 'T2' with ' ' to avoid failure to match expected output due to (#3975)
* Clarify vlen string datatype message (#3950)
* Append '-WF,' when passing C preprocessor directives to the xlf compiler (#3976)
* Create CITATION.cff (#3927)
  Add citation source based on http://web.archive.org/web/20230610185232/https://portal.hdfgroup.org/display/knowledge/How+do+I+properly+cite+HDF5%
* Fix the space difference in the Fortran examples to match the expected output for compression filter examples
* Corrected warning: implicit conversion changes signedness (#3982)
* Skip mac bintest until more reliable (#3983)
* Make platform-specific test presets for Windows and macOS (#3988)
* chore: fix typo (#3989)
* Add a missing left parenthesis in RELEASE.txt (#3990)
* Remove ADB signature from RELEASE.txt (#3986)
* Bump the github-actions group with 6 updates (#3981)
* Sync API tests with vol-tests (#3940)
* Fix for github issue #2414: segfault when copying dataset with attributes (#3967)
  This also fixes github issue #3241: segfault when copying dataset. Need to set the location via H5T_set_loc() of the src datatype when copying dense attributes; otherwise the vlen callbacks are not set up, causing a segfault in H5T_convert() -> H5T__conv_vlen().
* Fix broken links caused by examples relocation (#3995)
* Add abi-compliance check and upload to releases (#3996)
* Fix h5watch test failures to ignore system warnings on ppc64le (#3997)
* Remove oneapi/clang compiler printf() type warning (#3994)
* Updated information about obtaining the HDF5 source code to use the repos (#3972)
* Fix overwritten preset names (#4000)
* Fix incompatible pointer type warnings in object reference examples (#3999)
* Fix build issue and some warnings in H5_api_dataset_test.c (#3998)
* Modern C++ dtor declarations (#1830)
  - Replaced a bunch of empty dtors with `= default`
  - Removed deprecated `throw()`; in C++11, dtors are `noexcept` by default
* Remove incorrect check for environ (#4002)
* Add a missing file into Makefile.am for MinGW Autotools build error (#4004)
* Issue #1824: Replaced most remaining sprintf with safer snprintf (#4003)
* Add hl and cpp ABI reports to daily build (#4006)
* Don't add files and directories with names that begin with ., or that match *autom4te*, to release tar & zip files (#4009)
* Fix some output issues with ph5diff (#4008)
* Update install texts (#4010)
* Add C in project line for CMake to fix #4012 (#4014)
* Separate out individual checks for string removal (#4015)
* Add compound subset ops on attributes to API tests (#4005)
----------
-rw-r--r-- .codespellrc | 6
-rw-r--r-- .devcontainer/Dockerfile | 5
-rw-r--r-- .devcontainer/devcontainer.json | 26
-rw-r--r-- .devcontainer/noop.txt | 3
-rw-r--r-- .github/workflows/abi-report.yml | 171
-rw-r--r-- .github/workflows/autotools.yml | 12
-rw-r--r-- .github/workflows/clang-format-fix.yml | 2
-rw-r--r-- .github/workflows/cmake-bintest.yml | 218
-rw-r--r-- .github/workflows/cmake-ctest.yml | 126
-rw-r--r-- .github/workflows/cmake.yml | 27
-rw-r--r-- .github/workflows/codespell.yml | 3
-rw-r--r-- .github/workflows/daily-build.yml | 12
-rw-r--r-- .github/workflows/linux-auto-aocc-ompi.yml | 2
-rw-r--r-- .github/workflows/main-auto-par-spc.yml | 137
-rw-r--r-- .github/workflows/main-auto-par.yml | 102
-rw-r--r-- .github/workflows/main-auto.yml | 60
-rw-r--r-- .github/workflows/main-cmake-par.yml | 77
-rw-r--r-- .github/workflows/main-cmake.yml | 132
-rw-r--r-- .github/workflows/main.yml | 3
-rw-r--r-- .github/workflows/nvhpc-cmake.yml | 4
-rw-r--r-- .github/workflows/release-files.yml | 38
-rw-r--r-- .github/workflows/release.yml | 16
-rw-r--r-- .github/workflows/tarball.yml | 4
-rw-r--r-- CITATION.cff | 12
-rw-r--r-- CMakeInstallation.cmake | 5
-rw-r--r-- CMakeLists.txt | 19
-rw-r--r-- CMakePresets.json | 96
-rw-r--r-- HDF5Examples/C/CMakeLists.txt | 2
-rw-r--r-- HDF5Examples/C/H5PAR/CMakeLists.txt | 2
-rw-r--r-- HDF5Examples/C/H5T/h5ex_t_convert.c | 2
-rw-r--r-- HDF5Examples/C/H5T/h5ex_t_objref.c | 2
-rw-r--r-- HDF5Examples/C/H5T/h5ex_t_objrefatt.c | 2
-rw-r--r-- HDF5Examples/C/H5T/h5ex_t_opaque.c | 2
-rw-r--r-- HDF5Examples/C/Perf/CMakeLists.txt | 2
-rw-r--r-- HDF5Examples/CMakePresets.json | 50
-rw-r--r-- HDF5Examples/FORTRAN/H5D/h5ex_d_checksum.F90 | 12
-rw-r--r-- HDF5Examples/FORTRAN/H5D/h5ex_d_gzip.F90 | 12
-rw-r--r-- HDF5Examples/FORTRAN/H5D/h5ex_d_nbit.F90 | 14
-rw-r--r-- HDF5Examples/FORTRAN/H5D/h5ex_d_soint.F90 | 16
-rw-r--r-- HDF5Examples/FORTRAN/H5D/h5ex_d_szip.F90 | 16
-rw-r--r-- HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_checksum.tst | 2
-rw-r--r-- HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_gzip.tst | 2
-rw-r--r-- HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_nbit.tst | 2
-rw-r--r-- HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_soint.tst | 2
-rw-r--r-- HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_szip.tst | 2
-rw-r--r-- HDF5Examples/config/cmake-presets/hidden-presets.json | 2
-rw-r--r-- HDF5Examples/config/cmake/HDFExampleMacros.cmake | 26
-rw-r--r-- HDF5Examples/config/cmake/HDFMacros.cmake | 8
-rwxr-xr-x autogen.sh | 36
-rw-r--r-- bin/h5cc.in | 16
-rwxr-xr-x bin/release | 2
-rw-r--r-- c++/examples/testh5c++.sh.in | 32
-rw-r--r-- c++/src/H5AbstractDs.cpp | 8
-rw-r--r-- c++/src/H5AbstractDs.h | 2
-rw-r--r-- c++/src/H5ArrayType.cpp | 8
-rw-r--r-- c++/src/H5ArrayType.h | 2
-rw-r--r-- c++/src/H5AtomType.cpp | 10
-rw-r--r-- c++/src/H5AtomType.h | 2
-rw-r--r-- c++/src/H5CommonFG.cpp | 8
-rw-r--r-- c++/src/H5CommonFG.h | 2
-rw-r--r-- c++/src/H5CompType.cpp | 8
-rw-r--r-- c++/src/H5CompType.h | 2
-rw-r--r-- c++/src/H5DaccProp.cpp | 8
-rw-r--r-- c++/src/H5DaccProp.h | 2
-rw-r--r-- c++/src/H5DcreatProp.cpp | 8
-rw-r--r-- c++/src/H5DcreatProp.h | 2
-rw-r--r-- c++/src/H5DxferProp.cpp | 8
-rw-r--r-- c++/src/H5DxferProp.h | 2
-rw-r--r-- c++/src/H5EnumType.cpp | 8
-rw-r--r-- c++/src/H5EnumType.h | 2
-rw-r--r-- c++/src/H5Exception.cpp | 93
-rw-r--r-- c++/src/H5Exception.h | 26
-rw-r--r-- c++/src/H5FaccProp.cpp | 8
-rw-r--r-- c++/src/H5FaccProp.h | 2
-rw-r--r-- c++/src/H5FcreatProp.cpp | 8
-rw-r--r-- c++/src/H5FcreatProp.h | 2
-rw-r--r-- c++/src/H5FloatType.cpp | 8
-rw-r--r-- c++/src/H5FloatType.h | 2
-rw-r--r-- c++/src/H5IdComponent.cpp | 8
-rw-r--r-- c++/src/H5IdComponent.h | 2
-rw-r--r-- c++/src/H5IntType.cpp | 8
-rw-r--r-- c++/src/H5IntType.h | 2
-rw-r--r-- c++/src/H5LaccProp.cpp | 8
-rw-r--r-- c++/src/H5LaccProp.h | 2
-rw-r--r-- c++/src/H5LcreatProp.cpp | 9
-rw-r--r-- c++/src/H5LcreatProp.h | 2
-rw-r--r-- c++/src/H5Library.cpp | 8
-rw-r--r-- c++/src/H5Library.h | 2
-rw-r--r-- c++/src/H5Location.cpp | 8
-rw-r--r-- c++/src/H5Location.h | 2
-rw-r--r-- c++/src/H5Object.cpp | 10
-rw-r--r-- c++/src/H5Object.h | 2
-rw-r--r-- c++/src/H5OcreatProp.cpp | 8
-rw-r--r-- c++/src/H5OcreatProp.h | 2
-rw-r--r-- c++/src/H5PredType.cpp | 9
-rw-r--r-- c++/src/H5PredType.h | 2
-rw-r--r-- c++/src/H5StrType.cpp | 8
-rw-r--r-- c++/src/H5StrType.h | 2
-rw-r--r-- c++/src/H5VarLenType.cpp | 8
-rw-r--r-- c++/src/H5VarLenType.h | 2
-rw-r--r-- c++/src/h5c++.in | 16
-rw-r--r-- c++/test/h5cpputil.cpp | 14
-rw-r--r-- c++/test/h5cpputil.h | 4
-rw-r--r-- config/cmake-presets/hidden-presets.json | 2
-rw-r--r-- config/cmake/H5pubconf.h.in | 3
-rw-r--r-- config/cmake/HDF5ExampleCache.cmake | 2
-rw-r--r-- config/cmake/HDF5PluginCache.cmake | 2
-rw-r--r-- config/cmake/HDFMacros.cmake | 8
-rw-r--r-- config/cmake/hdf5-config.cmake.in | 39
-rw-r--r-- config/cmake/runTest.cmake | 5
-rw-r--r-- config/cmake/scripts/CTestScript.cmake | 2
-rw-r--r-- configure.ac | 23
-rw-r--r-- doc/parallel-compression.md | 4
-rw-r--r-- doxygen/dox/LearnBasics3.dox | 2
-rw-r--r-- doxygen/dox/PredefinedDatatypeTables.dox | 35
-rw-r--r-- doxygen/examples/H5.format.html | 24
-rw-r--r-- doxygen/examples/tables/fileDriverLists.dox | 2
-rw-r--r-- examples/testh5cc.sh.in | 79
-rw-r--r-- fortran/examples/testh5fc.sh.in | 40
-rw-r--r-- fortran/src/CMakeLists.txt | 7
-rw-r--r-- fortran/src/H5Ff.c | 92
-rw-r--r-- fortran/src/H5Fff.F90 | 166
-rw-r--r-- fortran/src/H5Pff.F90 | 266
-rw-r--r-- fortran/src/H5Sff.F90 | 144
-rw-r--r-- fortran/src/H5_f.c | 3
-rw-r--r-- fortran/src/H5_ff.F90 | 4
-rw-r--r-- fortran/src/H5config_f.inc.cmake | 8
-rw-r--r-- fortran/src/H5config_f.inc.in | 3
-rw-r--r-- fortran/src/H5f90global.F90 | 5
-rw-r--r-- fortran/src/h5fc.in | 16
-rw-r--r-- fortran/src/hdf5_fortrandll.def.in | 17
-rw-r--r-- fortran/test/fortranlib_test.F90 | 8
-rw-r--r-- fortran/test/tH5F.F90 | 13
-rw-r--r-- fortran/test/tH5Sselect.F90 | 144
-rw-r--r-- fortran/testpar/CMakeLists.txt | 1
-rw-r--r-- fortran/testpar/Makefile.am | 2
-rw-r--r-- fortran/testpar/mpi_param.F90 | 326
-rw-r--r-- fortran/testpar/ptest.F90 | 10
-rw-r--r-- hl/tools/gif2h5/gif2hdf.c | 6
-rw-r--r-- hl/tools/h5watch/CMakeTests.cmake | 5
-rw-r--r-- java/src/hdf/hdf5lib/CMakeLists.txt | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/Callbacks.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5A_iterate_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5D_append_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5D_iterate_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5E_walk_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5L_iterate_t.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5O_iterate_t.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_cls_close_func_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_cls_copy_func_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_cls_create_func_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_iterate_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_prp_close_func_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_prp_compare_func_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_prp_copy_func_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_prp_create_func_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_prp_delete_func_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_prp_get_func_cb.java | 2
-rw-r--r-- java/src/hdf/hdf5lib/callbacks/H5P_prp_set_func_cb.java | 2
-rw-r--r-- java/src/jni/h5util.c | 19
-rw-r--r-- java/test/TestH5.java | 10
-rw-r--r-- java/test/testfiles/JUnit-TestH5.txt | 7
-rw-r--r-- release_docs/INSTALL_Auto.txt (renamed from release_docs/INSTALL) | 21
-rw-r--r-- release_docs/INSTALL_CMake.txt | 48
-rw-r--r-- release_docs/RELEASE.txt | 52
-rw-r--r-- src/H5Aint.c | 8
-rw-r--r-- src/H5Apublic.h | 2
-rw-r--r-- src/H5Dmodule.h | 9
-rw-r--r-- src/H5FAcache.c | 2
-rw-r--r-- src/H5FAhdr.c | 2
-rw-r--r-- src/H5FDfamily.c | 33
-rw-r--r-- src/H5FDsplitter.c | 33
-rw-r--r-- src/H5FDsubfiling/H5FDsubfiling.c | 2
-rw-r--r-- src/H5Fint.c | 12
-rw-r--r-- src/H5Fmodule.h | 9
-rw-r--r-- src/H5Gmodule.h | 2
-rw-r--r-- src/H5Pdcpl.c | 2
-rw-r--r-- src/H5Pmodule.h | 75
-rw-r--r-- src/H5Ppublic.h | 17
-rw-r--r-- src/H5Spublic.h | 21
-rw-r--r-- src/H5T.c | 112
-rw-r--r-- src/H5Tmodule.h | 18
-rw-r--r-- src/H5Tpkg.h | 3
-rw-r--r-- src/H5Tprivate.h | 2
-rw-r--r-- src/H5VLmodule.h | 6
-rw-r--r-- src/H5Zpublic.h | 49
-rw-r--r-- src/H5checksum.c | 2
-rw-r--r-- src/H5win32defs.h | 3
-rw-r--r-- src/Makefile.am | 2
-rw-r--r-- test/API/H5_api_async_test.c | 22
-rw-r--r-- test/API/H5_api_attribute_test.c | 535
-rw-r--r-- test/API/H5_api_attribute_test.h | 13
-rw-r--r-- test/API/H5_api_dataset_test.c | 1476
-rw-r--r-- test/API/H5_api_dataset_test.h | 40
-rw-r--r-- test/API/H5_api_file_test.c | 2
-rw-r--r-- test/API/H5_api_group_test.c | 8
-rw-r--r-- test/API/H5_api_object_test.c | 263
-rw-r--r-- test/API/H5_api_object_test.h | 19
-rw-r--r-- test/dtypes.c | 169
-rw-r--r-- test/filter_plugin.c | 10
-rw-r--r-- test/h5test.c | 18
-rw-r--r-- test/tmisc.c | 190
-rw-r--r-- testpar/API/H5_api_async_test_parallel.c | 22
-rw-r--r-- testpar/t_subfiling_vfd.c | 2
-rw-r--r-- testpar/t_vfd.c | 9
-rw-r--r-- tools/lib/h5diff.c | 11
-rw-r--r-- tools/src/h5diff/ph5diff_main.c | 3
-rw-r--r-- tools/test/h5dump/CMakeVFDTests.cmake | 45
-rw-r--r-- tools/test/h5dump/h5dumpgentest.c | 2
-rw-r--r-- tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5 | bin 0 -> 211 bytes
-rw-r--r-- tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile.config | 2
-rw-r--r-- tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile_1_of_2 | bin 0 -> 22497827 bytes
-rw-r--r-- tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile_2_of_2 | bin 0 -> 22495773 bytes
-rw-r--r-- tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5 | bin 0 -> 96 bytes
-rw-r--r-- tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5.subfile.config | 2
-rw-r--r-- tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5.subfile_1_of_1 | bin 0 -> 889 bytes
-rw-r--r-- utils/tools/h5dwalk/h5dwalk.c | 4
217 files changed, 5620 insertions, 1359 deletions
diff --git a/.codespellrc b/.codespellrc
new file mode 100644
index 0000000..42fd9a7
--- /dev/null
+++ b/.codespellrc
@@ -0,0 +1,6 @@
+# Ref: https://github.com/codespell-project/codespell#using-a-config-file
+[codespell]
+skip = .git,*.svg,.codespellrc,./bin/trace,./hl/tools/h5watch/h5watch.c,./tools/test/h5jam/tellub.c,./config/sanitizer/LICENSE,./config/sanitizer/sanitizers.cmake,./tools/test/h5repack/testfiles/*.dat,./test/API/driver,./configure,./bin/ltmain.sh,./bin/depcomp,./bin/config.guess,./bin/config.sub,./autom4te.cache,./m4/libtool.m4,./c++/src/*.html,./HDF5Examples/depcomp
+check-hidden = true
+# ignore-regex =
+ignore-words-list = ot,isnt,inout,nd,parms,parm,ba,offsetP,ser,ois,had,fiter,fo,clude,refere,minnum,offsetp,creat,ans:,eiter,lastr,ans,isn't,ifset,sur,trun,dne,tthe,hda,filname,te,htmp,ake,gord,numer,ro,oce,msdos
diff --git a/.devcontainer/Dockerfile b/.devcontainer/Dockerfile
new file mode 100644
index 0000000..8b942f5
--- /dev/null
+++ b/.devcontainer/Dockerfile
@@ -0,0 +1,5 @@
+FROM mcr.microsoft.com/devcontainers/base:debian
+
+RUN apt-get update && apt-get -y install --no-install-recommends \
+ build-essential cmake cmake-curses-gui doxygen git graphviz \
+ less libtool-bin libyajl-dev mpi-default-dev valgrind wget
\ No newline at end of file
diff --git a/.devcontainer/devcontainer.json b/.devcontainer/devcontainer.json
new file mode 100644
index 0000000..5e78e03
--- /dev/null
+++ b/.devcontainer/devcontainer.json
@@ -0,0 +1,26 @@
+{
+ "name": "HDF5 Developer",
+ "build": {
+ "context": "..",
+ "dockerfile": "Dockerfile"
+ },
+ "customizations": {
+ "vscode": {
+ "extensions": [
+ "ms-python.python",
+ "ms-toolsai.jupyter",
+ "ms-vscode.cpptools",
+ "ms-vscode.live-server",
+ "ms-vscode-remote.remote-containers",
+ "ms-azuretools.vscode-docker",
+ "h5web.vscode-h5web",
+ "davidanson.vscode-markdownlint"
+ ],
+ "settings": {
+ "C_Cpp.default.cppStandard": "c++17",
+ "C_Cpp.default.cStandard": "c99",
+ "terminal.integrated.shell.linux": "/bin/bash"
+ }
+ }
+ }
+}
diff --git a/.devcontainer/noop.txt b/.devcontainer/noop.txt
new file mode 100644
index 0000000..4682d3a
--- /dev/null
+++ b/.devcontainer/noop.txt
@@ -0,0 +1,3 @@
+This file is copied into the container along with environment.yml* from the
+parent folder. This file prevents the Dockerfile COPY instruction from failing
+if no environment.yml is found.
\ No newline at end of file
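The noop.txt comment above refers to a common devcontainer pattern: a Dockerfile `COPY` with a glob source fails if the glob matches nothing, so a guaranteed-present file is listed alongside it. A hypothetical Dockerfile line using this pattern (paths illustrative, not from this commit) would be:

```dockerfile
# environment.yml* may match nothing; noop.txt guarantees COPY always
# has at least one existing source file, so the build does not fail.
COPY environment.yml* .devcontainer/noop.txt /tmp/conda-tmp/
```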
diff --git a/.github/workflows/abi-report.yml b/.github/workflows/abi-report.yml
new file mode 100644
index 0000000..93b9adb
--- /dev/null
+++ b/.github/workflows/abi-report.yml
@@ -0,0 +1,171 @@
+name: hdf5 1.14 Check Application Binary Interface (ABI)
+
+on:
+ workflow_call:
+ inputs:
+ use_tag:
+ description: 'Release version tag'
+ type: string
+ required: false
+ default: snapshot
+ use_environ:
+ description: 'Environment to locate files'
+ type: string
+ required: true
+ default: snapshots
+ file_base:
+ description: "The common base name of the binary"
+ required: true
+ type: string
+ file_ref:
+ description: "The reference name for the release binary"
+ required: true
+ type: string
+
+permissions:
+ contents: read
+
+jobs:
+ check:
+ runs-on: ubuntu-latest
+ continue-on-error: true
+
+ steps:
+ - name: Install System dependencies
+ run: |
+ sudo apt update
+ sudo apt install -q -y abi-compliance-checker abi-dumper
+ sudo apt install -q -y japi-compliance-checker
+
+ - name: Convert hdf5 reference name (Linux)
+ id: convert-hdf5lib-refname
+ run: |
+ FILE_DOTS=$(echo "${{ inputs.file_ref }}" | sed -r "s/([0-9]+)\_([0-9]+)\_([0-9]+).*/\1\.\2\.\3/")
+ echo "HDF5R_DOTS=$FILE_DOTS" >> $GITHUB_OUTPUT
+
+ - uses: actions/checkout@v4.1.1
+
+ - name: Get published binary (Linux)
+ uses: actions/download-artifact@f44cd7b40bfd40b6aa1cc1b9b5b7bf03d3c67110 # v4.1.0
+ with:
+ name: tgz-ubuntu-2204_gcc-binary
+ path: ${{ github.workspace }}
+
+ - name: List files for the space (Linux)
+ run: |
+ ls -l ${{ github.workspace }}
+
+ - name: Uncompress gh binary (Linux)
+ run: tar -zxvf ${{ github.workspace }}/${{ inputs.file_base }}-ubuntu-2204_gcc.tar.gz
+
+ - name: Uncompress hdf5 binary (Linux)
+ run: |
+ cd "${{ github.workspace }}/hdf5"
+ tar -zxvf ${{ github.workspace }}/hdf5/HDF5-*-Linux.tar.gz --strip-components 1
+
+ - name: List files for the HDF space (Linux)
+ run: |
+ ls -l ${{ github.workspace }}/hdf5
+ ls -l ${{ github.workspace }}/hdf5/HDF_Group/HDF5
+
+ - name: set hdf5lib name
+ id: set-hdf5lib-name
+ run: |
+ HDF5DIR=${{ github.workspace }}/hdf5/HDF_Group/HDF5/
+ FILE_NAME_HDF5=$(ls ${{ github.workspace }}/hdf5/HDF_Group/HDF5)
+ FILE_VERS=$(echo "$FILE_NAME_HDF5" | sed -r "s/([0-9]+\.[0-9]+\.[0-9]+)\..*/\1/")
+ echo "HDF5_ROOT=$HDF5DIR$FILE_NAME_HDF5" >> $GITHUB_OUTPUT
+ echo "HDF5_VERS=$FILE_VERS" >> $GITHUB_OUTPUT
+
+ - name: Download reference version
+ run: |
+ mkdir "${{ github.workspace }}/hdf5R"
+ cd "${{ github.workspace }}/hdf5R"
+ wget -q https://github.com/HDFGroup/hdf5/releases/download/hdf5-${{ inputs.file_ref }}/hdf5-${{ inputs.file_ref }}-ubuntu-2204.tar.gz
+ tar zxf hdf5-${{ inputs.file_ref }}-ubuntu-2204.tar.gz
+
+ - name: List files for the space (Linux)
+ run: |
+ ls -l ${{ github.workspace }}/hdf5R
+
+ - name: Uncompress hdf5 reference binary (Linux)
+ run: |
+ cd "${{ github.workspace }}/hdf5R"
+ tar -zxvf ${{ github.workspace }}/hdf5R/hdf5/HDF5-${{ steps.convert-hdf5lib-refname.outputs.HDF5R_DOTS }}-Linux.tar.gz --strip-components 1
+
+ - name: List files for the HDFR space (Linux)
+ run: |
+ ls -l ${{ github.workspace }}/hdf5R
+ ls -l ${{ github.workspace }}/hdf5R/HDF_Group/HDF5
+
+ - name: set hdf5lib reference name
+ id: set-hdf5lib-refname
+ run: |
+ HDF5RDIR=${{ github.workspace }}/hdf5R/HDF_Group/HDF5/
+ FILE_NAME_HDF5R=$(ls ${{ github.workspace }}/hdf5R/HDF_Group/HDF5)
+ echo "HDF5R_ROOT=$HDF5RDIR$FILE_NAME_HDF5R" >> $GITHUB_OUTPUT
+ echo "HDF5R_VERS=$FILE_NAME_HDF5R" >> $GITHUB_OUTPUT
+
+ - name: List files for the lib spaces (Linux)
+ run: |
+ ls -l ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/lib
+ ls -l ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/lib
+
+ - name: Run Java API report
+ run: |
+ japi-compliance-checker ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/lib/jarhdf5-${{ steps.convert-hdf5lib-refname.outputs.HDF5R_DOTS }}.jar ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/lib/jarhdf5-${{ steps.set-hdf5lib-name.outputs.HDF5_VERS }}.jar
+
+ - name: Run ABI report
+ run: |
+ abi-dumper ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/lib/libhdf5.so -o ABI-0.dump -public-headers ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/include
+ abi-dumper ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/lib/libhdf5.so -o ABI-1.dump -public-headers ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/include
+ abi-compliance-checker -l ${{ inputs.file_base }} -old ABI-0.dump -new ABI-1.dump
+ continue-on-error: true
+
+ - name: Run hl ABI report
+ run: |
+ abi-dumper ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/lib/libhdf5_hl.so -o ABI-2.dump -public-headers ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/include
+ abi-dumper ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/lib/libhdf5_hl.so -o ABI-3.dump -public-headers ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/include
+ abi-compliance-checker -l ${{ inputs.file_base }}_hl -old ABI-2.dump -new ABI-3.dump
+ continue-on-error: true
+
+ - name: Run cpp ABI report
+ run: |
+ abi-dumper ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/lib/libhdf5_cpp.so -o ABI-4.dump -public-headers ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/include
+ abi-dumper ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/lib/libhdf5_cpp.so -o ABI-5.dump -public-headers ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/include
+ abi-compliance-checker -l ${{ inputs.file_base }}_cpp -old ABI-4.dump -new ABI-5.dump
+ continue-on-error: true
+
+ - name: Run hl_cpp ABI report
+ run: |
+ abi-dumper ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/lib/libhdf5_hl_cpp.so -o ABI-6.dump -public-headers ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/include
+ abi-dumper ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/lib/libhdf5_hl_cpp.so -o ABI-7.dump -public-headers ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/include
+ abi-compliance-checker -l ${{ inputs.file_base }}_hl_cpp -old ABI-6.dump -new ABI-7.dump
+ continue-on-error: true
+
+ - name: Copy ABI reports
+ run: |
+ cp compat_reports/jarhdf5-/${{ steps.set-hdf5lib-refname.outputs.HDF5R_VERS }}_to_${{ steps.set-hdf5lib-name.outputs.HDF5_VERS }}/compat_report.html ${{ inputs.file_base }}-java_compat_report.html
+ ls -l compat_reports/${{ inputs.file_base }}/X_to_Y
+ cp compat_reports/${{ inputs.file_base }}/X_to_Y/compat_report.html ${{ inputs.file_base }}-hdf5_compat_report.html
+ ls -l compat_reports/${{ inputs.file_base }}_hl/X_to_Y
+ cp compat_reports/${{ inputs.file_base }}_hl/X_to_Y/compat_report.html ${{ inputs.file_base }}-hdf5_hl_compat_report.html
+ ls -l compat_reports/${{ inputs.file_base }}_cpp/X_to_Y
+ cp compat_reports/${{ inputs.file_base }}_cpp/X_to_Y/compat_report.html ${{ inputs.file_base }}-hdf5_cpp_compat_report.html
+# ls -l compat_reports/${{ inputs.file_base }}_hl_cpp/X_to_Y
+# cp compat_reports/${{ inputs.file_base }}_hl_cpp/X_to_Y/compat_report.html ${{ inputs.file_base }}-hdf5_hl_cpp_compat_report.html
+
+ - name: List files for the report spaces (Linux)
+ run: |
+ ls -l compat_reports
+ ls -l *.html
+
+ - name: Save output as artifact
+ uses: actions/upload-artifact@v4
+ with:
+ name: abi-reports
+ path: |
+ ${{ inputs.file_base }}-hdf5_compat_report.html
+ ${{ inputs.file_base }}-hdf5_hl_compat_report.html
+ ${{ inputs.file_base }}-hdf5_cpp_compat_report.html
+ ${{ inputs.file_base }}-java_compat_report.html
diff --git a/.github/workflows/autotools.yml b/.github/workflows/autotools.yml
index 68d5a2f..e61c14b 100644
--- a/.github/workflows/autotools.yml
+++ b/.github/workflows/autotools.yml
@@ -17,7 +17,19 @@ jobs:
call-parallel-special-autotools:
name: "Autotools Parallel Special Workflows"
+ uses: ./.github/workflows/main-auto-par-spc.yml
+
+ call-debug-parallel-autotools:
+ name: "Autotools Parallel Workflows"
uses: ./.github/workflows/main-auto-par.yml
+ with:
+ build_mode: "debug"
+
+ call-release-parallel-autotools:
+ name: "Autotools Parallel Workflows"
+ uses: ./.github/workflows/main-auto-par.yml
+ with:
+ build_mode: "production"
call-debug-thread-autotools:
name: "Autotools Debug Thread-Safety Workflows"
diff --git a/.github/workflows/clang-format-fix.yml b/.github/workflows/clang-format-fix.yml
index 80befa2..a27db18 100644
--- a/.github/workflows/clang-format-fix.yml
+++ b/.github/workflows/clang-format-fix.yml
@@ -31,7 +31,7 @@ jobs:
inplace: True
style: file
exclude: './config ./hl/src/H5LTanalyze.c ./hl/src/H5LTparse.c ./hl/src/H5LTparse.h ./src/H5Epubgen.h ./src/H5Einit.h ./src/H5Eterm.h ./src/H5Edefin.h ./src/H5version.h ./src/H5overflow.h'
- - uses: EndBug/add-and-commit@1bad3abcf0d6ec49a5857d124b0bfb52dc7bb081 # v9.1.3
+ - uses: EndBug/add-and-commit@a94899bca583c204427a224a7af87c02f9b325d5 # v9.1.4
with:
author_name: github-actions
author_email: 41898282+github-actions[bot]@users.noreply.github.com
diff --git a/.github/workflows/cmake-bintest.yml b/.github/workflows/cmake-bintest.yml
new file mode 100644
index 0000000..3306c0a
--- /dev/null
+++ b/.github/workflows/cmake-bintest.yml
@@ -0,0 +1,218 @@
+name: hdf5 1.14 examples bintest runs
+
+# Controls when the action will run. Triggers the workflow on a schedule
+on:
+ workflow_call:
+ inputs:
+ build_mode:
+ description: "release vs. debug build"
+ required: true
+ type: string
+
+permissions:
+ contents: read
+
+# A workflow run is made up of one or more jobs that can run sequentially or
+# in parallel
+jobs:
+ test_binary_win:
+ # Windows w/ MSVC + CMake
+ #
+ name: "Windows MSVC Binary Test"
+ runs-on: windows-latest
+ steps:
+ - name: Install Dependencies (Windows)
+ run: choco install ninja
+
+ - name: Set up JDK 19
+ uses: actions/setup-java@v4
+ with:
+ java-version: '19'
+ distribution: 'temurin'
+
+ - name: Enable Developer Command Prompt
+ uses: ilammy/msvc-dev-cmd@v1.13.0
+
+ # Get files created by cmake-ctest script
+ - name: Get published binary (Windows)
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
+ with:
+ name: zip-vs2022_cl-${{ inputs.build_mode }}-binary
+ path: ${{ github.workspace }}/hdf5
+
+ - name: Uncompress hdf5 binary (Win)
+ working-directory: ${{ github.workspace }}/hdf5
+ run: 7z x HDF5-*-win64.zip
+ shell: bash
+
+ - name: List files for the space (Win)
+ run: |
+ ls -l ${{ github.workspace }}
+ ls -l ${{ github.workspace }}/hdf5
+
+ - name: create hdf5 location (Win)
+ working-directory: ${{ github.workspace }}/hdf5
+ run: |
+ New-Item -Path "${{ github.workspace }}/HDF_Group/HDF5" -ItemType Directory
+ Copy-Item -Path "${{ github.workspace }}/hdf5/HDF*/*" -Destination "${{ github.workspace }}/HDF_Group/HDF5" -Recurse
+ shell: pwsh
+
+ - name: List files for the space (Win)
+ run: ls -l ${{ github.workspace }}/HDF_Group/HDF5
+
+ - name: set hdf5lib name
+ id: set-hdf5lib-name
+ run: |
+ HDF5DIR="${{ github.workspace }}/HDF_Group/HDF5"
+ echo "HDF5_ROOT=$HDF5DIR$FILE_NAME_HDF5" >> $GITHUB_OUTPUT
+ echo "HDF5_PLUGIN_PATH=$HDF5_ROOT/lib/plugin" >> $GITHUB_OUTPUT
+ shell: bash
+
+ - name: List files for the binaries (Win)
+ run: |
+ ls -l ${{ github.workspace }}/HDF_Group/HDF5
+
+ - name: using powershell
+ shell: pwsh
+ run: Get-Location
+
+ - name: List files for the space (Windows)
+ run: |
+ Get-ChildItem -Path ${{ github.workspace }}
+ Get-ChildItem -Path ${{ runner.workspace }}
+ shell: pwsh
+
+ - name: Run ctest (Windows)
+ env:
+ HDF5_ROOT: ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}
+ HDF5_PLUGIN_PATH: ${{ steps.set-hdf5lib-name.outputs.HDF5_PLUGIN_PATH }}
+ run: |
+ cd "${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/HDF5Examples"
+ cmake --workflow --preset=ci-StdShar-MSVC --fresh
+ shell: bash
+
+ test_binary_linux:
+ # Linux (Ubuntu) w/ gcc + CMake
+ #
+ name: "Ubuntu gcc Binary Test"
+ runs-on: ubuntu-latest
+ steps:
+ - name: Install CMake Dependencies (Linux)
+ run: sudo apt-get install ninja-build doxygen graphviz
+
+ - name: Set up JDK 19
+ uses: actions/setup-java@v4
+ with:
+ java-version: '19'
+ distribution: 'temurin'
+
+ - name: Get published binary (Linux)
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
+ with:
+ name: tgz-ubuntu-2204_gcc-${{ inputs.build_mode }}-binary
+ path: ${{ github.workspace }}
+
+ - name: Uncompress hdf5 binary (Linux)
+ run: |
+ cd "${{ github.workspace }}"
+ tar -zxvf ${{ github.workspace }}/HDF5-*-Linux.tar.gz --strip-components 1
+
+ - name: set hdf5lib name
+ id: set-hdf5lib-name
+ run: |
+ HDF5DIR=${{ github.workspace }}/HDF_Group/HDF5/
+ FILE_NAME_HDF5=$(ls ${{ github.workspace }}/HDF_Group/HDF5)
+ echo "HDF5_ROOT=$HDF5DIR$FILE_NAME_HDF5" >> $GITHUB_OUTPUT
+ echo "HDF5_PLUGIN_PATH=$HDF5_ROOT/lib/plugin" >> $GITHUB_OUTPUT
+
+ - name: List files for the binaries (Linux)
+ run: |
+ ls -l ${{ github.workspace }}/HDF_Group/HDF5
+
+ - name: Set file base name (Linux)
+ id: set-file-base
+ run: |
+ FILE_NAME_BASE=$(echo "${{ inputs.file_base }}")
+ echo "FILE_BASE=$FILE_NAME_BASE" >> $GITHUB_OUTPUT
+
+ - name: List files for the space (Linux)
+ run: |
+ ls -l ${{ github.workspace }}
+ ls ${{ runner.workspace }}
+
+ - name: Run ctest (Linux)
+ env:
+ HDF5_ROOT: ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}
+ HDF5_PLUGIN_PATH: ${{ steps.set-hdf5lib-name.outputs.HDF5_PLUGIN_PATH }}
+ run: |
+ cd "${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/share/HDF5Examples"
+ cmake --workflow --preset=ci-StdShar-GNUC --fresh
+ shell: bash
+
+ test_binary_mac:
+ # MacOS w/ Clang + CMake
+ #
+ name: "MacOS Clang Binary Test"
+ runs-on: macos-13
+ steps:
+ - name: Install Dependencies (MacOS)
+ run: brew install ninja doxygen
+
+ - name: Set up JDK 19
+ uses: actions/setup-java@v4
+ with:
+ java-version: '19'
+ distribution: 'temurin'
+
+ - name: Get published binary (MacOS)
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
+ with:
+ name: tgz-osx12-${{ inputs.build_mode }}-binary
+ path: ${{ github.workspace }}
+
+ - name: Uncompress hdf5 binary (MacOS)
+ run: |
+ cd "${{ github.workspace }}"
+ tar -zxvf ${{ github.workspace }}/HDF5-*-Darwin.tar.gz --strip-components 1
+
+ - name: set hdf5lib name
+ id: set-hdf5lib-name
+ run: |
+ HDF5DIR=${{ github.workspace }}/HDF_Group/HDF5/
+ FILE_NAME_HDF5=$(ls ${{ github.workspace }}/HDF_Group/HDF5)
+ echo "HDF5_ROOT=$HDF5DIR$FILE_NAME_HDF5" >> $GITHUB_OUTPUT
+ echo "HDF5_PLUGIN_PATH=$HDF5_ROOT/lib/plugin" >> $GITHUB_OUTPUT
+
+ - name: List files for the binaries (MacOS)
+ run: |
+ ls -l ${{ github.workspace }}/HDF_Group/HDF5
+
+ - name: Set file base name (MacOS)
+ id: set-file-base
+ run: |
+ FILE_NAME_BASE=$(echo "${{ inputs.file_base }}")
+ echo "FILE_BASE=$FILE_NAME_BASE" >> $GITHUB_OUTPUT
+
+ - name: List files for the space (MacOS)
+ run: |
+ ls ${{ github.workspace }}
+ ls ${{ runner.workspace }}
+
+ # symlinks the compiler executables to a common location
+ - name: Setup GNU Fortran
+ uses: fortran-lang/setup-fortran@v1
+ id: setup-fortran
+ with:
+ compiler: gcc
+ version: 12
+
+ - name: Run ctest (MacOS)
+ id: run-ctest
+ env:
+ HDF5_ROOT: ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}
+ HDF5_PLUGIN_PATH: ${{ steps.set-hdf5lib-name.outputs.HDF5_PLUGIN_PATH }}
+ run: |
+ cd "${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/share/HDF5Examples"
+ cmake --workflow --preset=ci-StdShar-OSX-Clang --fresh
+ shell: bash
+
diff --git a/.github/workflows/cmake-ctest.yml b/.github/workflows/cmake-ctest.yml
index b4302cc..54642a4 100644
--- a/.github/workflows/cmake-ctest.yml
+++ b/.github/workflows/cmake-ctest.yml
@@ -8,6 +8,10 @@ on:
description: "The common base name of the source tarballs"
required: true
type: string
+ preset_name:
+ description: "The common base name of the CMake preset configuration used to control the build"
+ required: true
+ type: string
permissions:
contents: read
@@ -24,6 +28,11 @@ jobs:
- name: Install Dependencies (Windows)
run: choco install ninja
+ - name: Install Dependencies
+ uses: ssciwr/doxygen-install@v1
+ with:
+ version: "1.10.0"
+
- name: Enable Developer Command Prompt
uses: ilammy/msvc-dev-cmd@v1.13.0
@@ -36,7 +45,7 @@ jobs:
# Get files created by release script
- name: Get zip-tarball (Windows)
- uses: actions/download-artifact@v4
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: zip-tarball
path: ${{ github.workspace }}
@@ -59,7 +68,7 @@ jobs:
- name: Run ctest (Windows)
run: |
cd "${{ runner.workspace }}/hdf5/hdfsrc"
- cmake --workflow --preset=ci-StdShar-MSVC --fresh
+ cmake --workflow --preset=${{ inputs.preset_name }}-MSVC --fresh
shell: bash
- name: Publish binary (Windows)
@@ -83,7 +92,7 @@ jobs:
# Save files created by ctest script
- name: Save published binary (Windows)
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: zip-vs2022_cl-binary
path: ${{ runner.workspace }}/build114/${{ steps.set-file-base.outputs.FILE_BASE }}-win-vs2022_cl.zip
@@ -96,7 +105,12 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Install CMake Dependencies (Linux)
- run: sudo apt-get install ninja-build doxygen graphviz
+ run: sudo apt-get install ninja-build graphviz
+
+ - name: Install Dependencies
+ uses: ssciwr/doxygen-install@v1
+ with:
+ version: "1.10.0"
- name: Set file base name (Linux)
id: set-file-base
@@ -106,7 +120,7 @@ jobs:
# Get files created by release script
- name: Get tgz-tarball (Linux)
- uses: actions/download-artifact@v4
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: tgz-tarball
path: ${{ github.workspace }}
@@ -122,7 +136,7 @@ jobs:
- name: Run ctest (Linux)
run: |
cd "${{ runner.workspace }}/hdf5/hdfsrc"
- cmake --workflow --preset=ci-StdShar-GNUC --fresh
+ cmake --workflow --preset=${{ inputs.preset_name }}-GNUC --fresh
shell: bash
- name: Publish binary (Linux)
@@ -145,7 +159,7 @@ jobs:
# Save files created by ctest script
- name: Save published binary (Linux)
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: tgz-ubuntu-2204_gcc-binary
path: ${{ runner.workspace }}/build114/${{ steps.set-file-base.outputs.FILE_BASE }}-ubuntu-2204_gcc.tar.gz
@@ -153,7 +167,7 @@ jobs:
# Save doxygen files created by ctest script
- name: Save published doxygen (Linux)
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: docs-doxygen
path: ${{ runner.workspace }}/hdf5/build114/ci-StdShar-GNUC/hdf5lib_docs/html
@@ -163,10 +177,15 @@ jobs:
# MacOS w/ Clang + CMake
#
name: "MacOS Clang CMake"
- runs-on: macos-11
+ runs-on: macos-13
steps:
- name: Install Dependencies (MacOS)
- run: brew install ninja doxygen
+ run: brew install ninja
+
+ - name: Install Dependencies
+ uses: ssciwr/doxygen-install@v1
+ with:
+ version: "1.10.0"
- name: Set file base name (MacOS)
id: set-file-base
@@ -176,7 +195,7 @@ jobs:
# Get files created by release script
- name: Get tgz-tarball (MacOS)
- uses: actions/download-artifact@v4
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: tgz-tarball
path: ${{ github.workspace }}
@@ -201,12 +220,13 @@ jobs:
id: run-ctest
run: |
cd "${{ runner.workspace }}/hdf5/hdfsrc"
- cmake --workflow --preset=ci-StdShar-Clang --fresh
+ cmake --workflow --preset=${{ inputs.preset_name }}-OSX-Clang --fresh
shell: bash
- name: Publish binary (MacOS)
id: publish-ctest-binary
run: |
+
mkdir "${{ runner.workspace }}/build114"
mkdir "${{ runner.workspace }}/build114/hdf5"
cp ${{ runner.workspace }}/hdf5/hdfsrc/COPYING ${{ runner.workspace }}/build114/hdf5
@@ -224,12 +244,76 @@ jobs:
# Save files created by ctest script
- name: Save published binary (MacOS)
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: tgz-osx12-binary
path: ${{ runner.workspace }}/build/${{ steps.set-file-base.outputs.FILE_BASE }}-osx12.tar.gz
if-no-files-found: error # 'warn' or 'ignore' are also available, defaults to `warn`
+ build_and_test_S3_linux:
+ # Linux S3 (Ubuntu) w/ gcc + CMake
+ #
+ name: "Ubuntu gcc CMake S3"
+ runs-on: ubuntu-latest
+ steps:
+ - name: Install CMake Dependencies (Linux S3)
+ run: |
+ sudo apt-get install ninja-build doxygen graphviz
+ sudo apt install libssl3 libssl-dev libcurl4 libcurl4-openssl-dev
+
+ - name: Set file base name (Linux S3)
+ id: set-file-base
+ run: |
+ FILE_NAME_BASE=$(echo "${{ inputs.file_base }}")
+ echo "FILE_BASE=$FILE_NAME_BASE" >> $GITHUB_OUTPUT
+
+ # Get files created by release script
+ - name: Get tgz-tarball (Linux S3)
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
+ with:
+ name: tgz-tarball
+ path: ${{ github.workspace }}
+
+ - name: List files for the space (Linux S3)
+ run: |
+ ls -l ${{ github.workspace }}
+ ls ${{ runner.workspace }}
+
+ - name: Uncompress source (Linux S3)
+ run: tar -zxvf ${{ github.workspace }}/${{ steps.set-file-base.outputs.FILE_BASE }}.tar.gz
+
+ - name: Run ctest (Linux S3)
+ run: |
+ cd "${{ runner.workspace }}/hdf5/hdfsrc"
+ cmake --workflow --preset=${{ inputs.preset_name }}-GNUC-S3 --fresh
+ shell: bash
+
+ - name: Publish binary (Linux S3)
+ id: publish-ctest-binary
+ run: |
+ mkdir "${{ runner.workspace }}/build114"
+ mkdir "${{ runner.workspace }}/build114/hdf5"
+ cp ${{ runner.workspace }}/hdf5/hdfsrc/COPYING ${{ runner.workspace }}/build114/hdf5
+ cp ${{ runner.workspace }}/hdf5/hdfsrc/COPYING_LBNL_HDF5 ${{ runner.workspace }}/build114/hdf5
+ cp ${{ runner.workspace }}/hdf5/hdfsrc/README.md ${{ runner.workspace }}/build114/hdf5
+ cp ${{ runner.workspace }}/hdf5/build114/${{ inputs.preset_name }}-GNUC-S3/*.tar.gz ${{ runner.workspace }}/build114/hdf5
+ cd "${{ runner.workspace }}/build114"
+ tar -zcvf ${{ steps.set-file-base.outputs.FILE_BASE }}-ubuntu-2204_gcc_s3.tar.gz hdf5
+ shell: bash
+
+ - name: List files in the space (Linux S3)
+ run: |
+ ls ${{ github.workspace }}
+ ls -l ${{ runner.workspace }}
+
+ # Save files created by ctest script
+ - name: Save published binary (Linux S3)
+ uses: actions/upload-artifact@v4
+ with:
+ name: tgz-ubuntu-2204_gcc_s3-binary
+ path: ${{ runner.workspace }}/build114/${{ steps.set-file-base.outputs.FILE_BASE }}-ubuntu-2204_gcc_s3.tar.gz
+ if-no-files-found: error # 'warn' or 'ignore' are also available, defaults to `warn`
+
####### intel builds
build_and_test_win_intel:
# Windows w/ OneAPI + CMake
@@ -259,7 +343,7 @@ jobs:
# Get files created by release script
- name: Get zip-tarball (Windows_intel)
- uses: actions/download-artifact@v4
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: zip-tarball
path: ${{ github.workspace }}
@@ -286,7 +370,7 @@ jobs:
CXX: ${{ steps.setup-fortran.outputs.cxx }}
run: |
cd "${{ runner.workspace }}/hdf5/hdfsrc"
- cmake --workflow --preset=ci-StdShar-Intel --fresh
+ cmake --workflow --preset=${{ inputs.preset_name }}-win-Intel --fresh
shell: pwsh
- name: Publish binary (Windows_intel)
@@ -297,7 +381,7 @@ jobs:
Copy-Item -Path ${{ runner.workspace }}/hdf5/hdfsrc/COPYING -Destination ${{ runner.workspace }}/build/hdf5/
Copy-Item -Path ${{ runner.workspace }}/hdf5/hdfsrc/COPYING_LBNL_HDF5 -Destination ${{ runner.workspace }}/build/hdf5/
Copy-Item -Path ${{ runner.workspace }}/hdf5/hdfsrc/README.md -Destination ${{ runner.workspace }}/build/hdf5/
- Copy-Item -Path ${{ runner.workspace }}/hdf5/build/ci-StdShar-Intel/* -Destination ${{ runner.workspace }}/build/hdf5/ -Include *.zip
+ Copy-Item -Path ${{ runner.workspace }}/hdf5/build/${{ inputs.preset_name }}-Intel/* -Destination ${{ runner.workspace }}/build/hdf5/ -Include *.zip
cd "${{ runner.workspace }}/build"
7z a -tzip ${{ steps.set-file-base.outputs.FILE_BASE }}-win-vs2022_intel.zip hdf5
shell: pwsh
@@ -310,7 +394,7 @@ jobs:
# Save files created by ctest script
- name: Save published binary (Windows_intel)
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: zip-vs2022_intel-binary
path: ${{ runner.workspace }}/build/${{ steps.set-file-base.outputs.FILE_BASE }}-win-vs2022_intel.zip
@@ -340,7 +424,7 @@ jobs:
# Get files created by release script
- name: Get tgz-tarball (Linux_intel)
- uses: actions/download-artifact@v4
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: tgz-tarball
path: ${{ github.workspace }}
@@ -360,7 +444,7 @@ jobs:
CXX: ${{ steps.setup-fortran.outputs.cxx }}
run: |
cd "${{ runner.workspace }}/hdf5/hdfsrc"
- cmake --workflow --preset=ci-StdShar-Intel --fresh
+ cmake --workflow --preset=${{ inputs.preset_name }}-Intel --fresh
shell: bash
- name: Publish binary (Linux_intel)
@@ -371,7 +455,7 @@ jobs:
cp ${{ runner.workspace }}/hdf5/hdfsrc/COPYING ${{ runner.workspace }}/build/hdf5
cp ${{ runner.workspace }}/hdf5/hdfsrc/COPYING_LBNL_HDF5 ${{ runner.workspace }}/build/hdf5
cp ${{ runner.workspace }}/hdf5/hdfsrc/README.md ${{ runner.workspace }}/build/hdf5
- cp ${{ runner.workspace }}/hdf5/build/ci-StdShar-Intel/*.tar.gz ${{ runner.workspace }}/build/hdf5
+ cp ${{ runner.workspace }}/hdf5/build/${{ inputs.preset_name }}-Intel/*.tar.gz ${{ runner.workspace }}/build/hdf5
cd "${{ runner.workspace }}/build"
tar -zcvf ${{ steps.set-file-base.outputs.FILE_BASE }}-ubuntu-2204_intel.tar.gz hdf5
shell: bash
@@ -383,7 +467,7 @@ jobs:
# Save files created by ctest script
- name: Save published binary (Linux_intel)
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: tgz-ubuntu-2204_intel-binary
path: ${{ runner.workspace }}/build/${{ steps.set-file-base.outputs.FILE_BASE }}-ubuntu-2204_intel.tar.gz
diff --git a/.github/workflows/cmake.yml b/.github/workflows/cmake.yml
index 11a010c..cca783a 100644
--- a/.github/workflows/cmake.yml
+++ b/.github/workflows/cmake.yml
@@ -19,30 +19,49 @@ jobs:
name: "CMake Debug Thread-Safety Workflows"
uses: ./.github/workflows/main-cmake.yml
with:
- thread_safety: true
+ thread_safety: "TS"
build_mode: "Debug"
call-release-thread-cmake:
name: "CMake Release Thread-Safety Workflows"
uses: ./.github/workflows/main-cmake.yml
with:
- thread_safety: true
+ thread_safety: "TS"
build_mode: "Release"
call-debug-cmake:
name: "CMake Debug Workflows"
uses: ./.github/workflows/main-cmake.yml
with:
- thread_safety: false
+ thread_safety: ""
build_mode: "Debug"
call-release-cmake:
name: "CMake Release Workflows"
uses: ./.github/workflows/main-cmake.yml
with:
- thread_safety: false
+ thread_safety: ""
build_mode: "Release"
+ call-release-bintest:
+ name: "CMake Test Release Binaries"
+ needs: call-release-cmake
+ uses: ./.github/workflows/cmake-bintest.yml
+ with:
+ build_mode: "Release"
+
+ call-release-par:
+ name: "CMake Parallel Release Workflows"
+ uses: ./.github/workflows/main-cmake-par.yml
+ with:
+ build_mode: "Release"
+
+ call-debug-par:
+ name: "CMake Parallel Debug Workflows"
+ uses: ./.github/workflows/main-cmake-par.yml
+ with:
+ build_mode: "Debug"
+
call-release-cmake-intel:
name: "CMake Intel Workflows"
uses: ./.github/workflows/intel-cmake.yml
diff --git a/.github/workflows/codespell.yml b/.github/workflows/codespell.yml
index cb68361..1477b14 100644
--- a/.github/workflows/codespell.yml
+++ b/.github/workflows/codespell.yml
@@ -12,6 +12,3 @@ jobs:
steps:
- uses: actions/checkout@v4.1.1
- uses: codespell-project/actions-codespell@master
- with:
- skip: ./.github/workflows/codespell.yml,./bin/trace,./hl/tools/h5watch/h5watch.c,./tools/test/h5jam/tellub.c,./config/sanitizer/LICENSE,./config/sanitizer/sanitizers.cmake,./tools/test/h5repack/testfiles/*.dat,./test/API/driver,./configure,./bin/ltmain.sh,./bin/depcomp,./bin/config.guess,./bin/config.sub,./autom4te.cache,./m4/libtool.m4,./c++/src/*.html,./HDF5Examples/depcomp
- ignore_words_list: ot,isnt,inout,nd,parms,parm,ba,offsetP,ser,ois,had,fiter,fo,clude,refere,minnum,offsetp,creat,ans:,eiter,lastr,ans,isn't,ifset,sur,trun,dne,tthe,hda,filname,te,htmp,ake,gord,numer,ro,oce,msdos
diff --git a/.github/workflows/daily-build.yml b/.github/workflows/daily-build.yml
index fadf6ff..257b352 100644
--- a/.github/workflows/daily-build.yml
+++ b/.github/workflows/daily-build.yml
@@ -27,8 +27,18 @@ jobs:
#use_environ: snapshots
if: ${{ needs.call-workflow-tarball.outputs.has_changes == 'true' }}
- call-workflow-release:
+ call-workflow-abi:
needs: [call-workflow-tarball, call-workflow-ctest]
+ uses: ./.github/workflows/abi-report.yml
+ with:
+ file_ref: '1_14_3'
+ file_base: ${{ needs.call-workflow-tarball.outputs.file_base }}
+ use_tag: snapshot
+ use_environ: snapshots
+ if: ${{ needs.call-workflow-tarball.outputs.has_changes == 'true' }}
+
+ call-workflow-release:
+ needs: [call-workflow-tarball, call-workflow-ctest, call-workflow-abi]
permissions:
contents: write # In order to allow tag creation
uses: ./.github/workflows/release-files.yml
diff --git a/.github/workflows/linux-auto-aocc-ompi.yml b/.github/workflows/linux-auto-aocc-ompi.yml
index d39ac40..c535bf7 100644
--- a/.github/workflows/linux-auto-aocc-ompi.yml
+++ b/.github/workflows/linux-auto-aocc-ompi.yml
@@ -44,7 +44,7 @@ jobs:
clang -v
- name: Cache OpenMPI 4.1.5 installation
id: cache-openmpi-4_1_5
- uses: actions/cache@v3
+ uses: actions/cache@v4
with:
path: /home/runner/work/hdf5/hdf5/openmpi-4.1.5-install
key: ${{ runner.os }}-${{ runner.arch }}-openmpi-4_1_5-cache
diff --git a/.github/workflows/main-auto-par-spc.yml b/.github/workflows/main-auto-par-spc.yml
new file mode 100644
index 0000000..5047685
--- /dev/null
+++ b/.github/workflows/main-auto-par-spc.yml
@@ -0,0 +1,137 @@
+name: hdf5 1.14 autotools parallel special CI
+
+# Controls when the action will run. Triggers the workflow on a call
+on:
+ workflow_call:
+
+permissions:
+ contents: read
+
+# A workflow run is made up of one or more jobs that can run sequentially or
+# in parallel. We just have one job, but the matrix items defined below will
+# run in parallel.
+jobs:
+ #
+ # SPECIAL AUTOTOOLS BUILDS
+ #
+ # These do not run tests and are not built into the matrix and instead
+ # become NEW configs as their name would clobber one of the matrix
+ # names (so make sure the names are UNIQUE).
+ #
+
+ build_parallel_debug_werror:
+ name: "gcc DBG parallel -Werror (build only)"
+ runs-on: ubuntu-latest
+ steps:
+ # SETUP
+ # Only CMake needs ninja-build, but we just install it unconditionally
+ # libssl, etc. are needed for the ros3 VFD
+ - name: Install Linux Dependencies
+ run: |
+ sudo apt update
+ sudo apt-get install ninja-build doxygen graphviz
+ sudo apt install libssl3 libssl-dev libcurl4 libcurl4-openssl-dev
+ sudo apt install gcc-12 g++-12 gfortran-12
+ sudo apt install automake autoconf libtool libtool-bin
+ sudo apt install libaec0 libaec-dev
+ sudo apt install openmpi-bin openmpi-common mpi-default-dev
+ echo "CC=mpicc" >> $GITHUB_ENV
+ echo "FC=mpif90" >> $GITHUB_ENV
+
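The `echo "CC=mpicc" >> $GITHUB_ENV` lines above use the companion mechanism to step outputs: `KEY=value` lines appended to `$GITHUB_ENV` become environment variables for subsequent steps of the same job (they do not affect the step that writes them). A sketch, with `$GITHUB_ENV` stood in by a temp file:

```shell
# KEY=value lines appended to $GITHUB_ENV are injected into the
# environment of *later* steps; a later step effectively sees them
# as exported variables.
GITHUB_ENV=$(mktemp)           # GitHub provides this file in a real run
echo "CC=mpicc" >> "$GITHUB_ENV"
echo "FC=mpif90" >> "$GITHUB_ENV"
# Emulate the next step picking the assignments up:
. "$GITHUB_ENV"
echo "$CC $FC"
```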
+ # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
+ - name: Get Sources
+ uses: actions/checkout@v4.1.1
+
+ # AUTOTOOLS CONFIGURE
+ - name: Autotools Configure
+ run: |
+ sh ./autogen.sh
+ mkdir "${{ runner.workspace }}/build"
+ cd "${{ runner.workspace }}/build"
+ CFLAGS=-Werror $GITHUB_WORKSPACE/configure \
+ --enable-build-mode=debug \
+ --enable-deprecated-symbols \
+ --with-default-api-version=v114 \
+ --enable-shared \
+ --enable-parallel \
+ --enable-subfiling-vfd \
+ --disable-cxx \
+ --disable-fortran \
+ --disable-java \
+ --disable-mirror-vfd \
+ --enable-direct-vfd \
+ --disable-ros3-vfd
+ shell: bash
+
+ # BUILD
+ - name: Autotools Build
+ run: make -j3
+ working-directory: ${{ runner.workspace }}/build
+
+ # INSTALL (note that this runs even when we don't run the tests)
+ - name: Autotools Install
+ run: make install
+ working-directory: ${{ runner.workspace }}/build
+
+ - name: Autotools Verify Install
+ run: make check-install
+ working-directory: ${{ runner.workspace }}/build
+
+ build_parallel_release_werror:
+ name: "gcc REL parallel -Werror (build only)"
+ runs-on: ubuntu-latest
+ steps:
+ # SETUP
+ # Only CMake needs ninja-build, but we just install it unconditionally
+ # libssl, etc. are needed for the ros3 VFD
+ - name: Install Linux Dependencies
+ run: |
+ sudo apt update
+ sudo apt-get install ninja-build doxygen graphviz
+ sudo apt install libssl3 libssl-dev libcurl4 libcurl4-openssl-dev
+ sudo apt install gcc-12 g++-12 gfortran-12
+ sudo apt install automake autoconf libtool libtool-bin
+ sudo apt install libaec0 libaec-dev
+ sudo apt install openmpi-bin openmpi-common mpi-default-dev
+ echo "CC=mpicc" >> $GITHUB_ENV
+ echo "FC=mpif90" >> $GITHUB_ENV
+
+ # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
+ - name: Get Sources
+ uses: actions/checkout@v4.1.1
+
+ # AUTOTOOLS CONFIGURE
+ - name: Autotools Configure
+ run: |
+ sh ./autogen.sh
+ mkdir "${{ runner.workspace }}/build"
+ cd "${{ runner.workspace }}/build"
+ CFLAGS=-Werror $GITHUB_WORKSPACE/configure \
+ --enable-build-mode=production \
+ --enable-deprecated-symbols \
+ --with-default-api-version=v114 \
+ --enable-shared \
+ --enable-parallel \
+ --enable-subfiling-vfd \
+ --disable-cxx \
+ --disable-fortran \
+ --disable-java \
+ --disable-mirror-vfd \
+ --enable-direct-vfd \
+ --disable-ros3-vfd
+ shell: bash
+
+ # BUILD
+ - name: Autotools Build
+ run: make -j3
+ working-directory: ${{ runner.workspace }}/build
+
+ # INSTALL (note that this runs even when we don't run the tests)
+ - name: Autotools Install
+ run: make install
+ working-directory: ${{ runner.workspace }}/build
+
+ - name: Autotools Verify Install
+ run: make check-install
+ working-directory: ${{ runner.workspace }}/build
+
diff --git a/.github/workflows/main-auto-par.yml b/.github/workflows/main-auto-par.yml
index 63f5992..913ff52 100644
--- a/.github/workflows/main-auto-par.yml
+++ b/.github/workflows/main-auto-par.yml
@@ -3,6 +3,11 @@ name: hdf5 1.14 autotools CI
# Controls when the action will run. Triggers the workflow on a call
on:
workflow_call:
+ inputs:
+ build_mode:
+ description: "release vs. debug build"
+ required: true
+ type: string
permissions:
contents: read
@@ -12,78 +17,23 @@ permissions:
# run in parallel.
jobs:
#
- # SPECIAL AUTOTOOLS BUILDS
- #
- # These do not run tests and are not built into the matrix and instead
- # become NEW configs as their name would clobber one of the matrix
- # names (so make sure the names are UNIQUE).
+ # The GitHub runners are inadequate for running parallel HDF5 tests,
+ # so we catch most issues in daily testing. What we have here is just
+ # a compile check to make sure nothing obvious is broken.
+ # A workflow that builds the library
+ # Parallel Linux (Ubuntu) w/ gcc + Autotools
#
+ Autotools_build_parallel:
+ name: "Parallel GCC-${{ inputs.build_mode }}"
+ # Don't run the action if the commit message says to skip CI
+ if: "!contains(github.event.head_commit.message, 'skip-ci')"
- build_parallel_debug_werror:
- name: "gcc DBG parallel -Werror (build only)"
+ # The type of runner that the job will run on
runs-on: ubuntu-latest
- steps:
- # SETUP
- # Only CMake need ninja-build, but we just install it unilaterally
- # libssl, etc. are needed for the ros3 VFD
- - name: Install Linux Dependencies
- run: |
- sudo apt update
- sudo apt-get install ninja-build doxygen graphviz
- sudo apt install libssl3 libssl-dev libcurl4 libcurl4-openssl-dev
- sudo apt install gcc-12 g++-12 gfortran-12
- sudo apt install automake autoconf libtool libtool-bin
- sudo apt install libaec0 libaec-dev
- sudo apt install openmpi-bin openmpi-common mpi-default-dev
- echo "CC=mpicc" >> $GITHUB_ENV
- echo "FC=mpif90" >> $GITHUB_ENV
-
- # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- - name: Get Sources
- uses: actions/checkout@v4.1.1
- # AUTOTOOLS CONFIGURE
- - name: Autotools Configure
- run: |
- sh ./autogen.sh
- mkdir "${{ runner.workspace }}/build"
- cd "${{ runner.workspace }}/build"
- CFLAGS=-Werror $GITHUB_WORKSPACE/configure \
- --enable-build-mode=debug \
- --enable-deprecated-symbols \
- --with-default-api-version=v114 \
- --enable-shared \
- --enable-parallel \
- --enable-subfiling-vfd \
- --disable-cxx \
- --disable-fortran \
- --disable-java \
- --disable-mirror-vfd \
- --enable-direct-vfd \
- --disable-ros3-vfd \
- shell: bash
-
- # BUILD
- - name: Autotools Build
- run: make -j3
- working-directory: ${{ runner.workspace }}/build
-
- # INSTALL (note that this runs even when we don't run the tests)
- - name: Autotools Install
- run: make install
- working-directory: ${{ runner.workspace }}/build
-
- - name: Autotools Verify Install
- run: make check-install
- working-directory: ${{ runner.workspace }}/build
-
- build_parallel_release_werror:
- name: "gcc REL parallel -Werror (build only)"
- runs-on: ubuntu-latest
+ # Steps represent a sequence of tasks that will be executed as part of the job
steps:
# SETUP
- # Only CMake need ninja-build, but we just install it unilaterally
- # libssl, etc. are needed for the ros3 VFD
- name: Install Linux Dependencies
run: |
sudo apt update
@@ -106,32 +56,22 @@ jobs:
sh ./autogen.sh
mkdir "${{ runner.workspace }}/build"
cd "${{ runner.workspace }}/build"
- CFLAGS=-Werror $GITHUB_WORKSPACE/configure \
- --enable-build-mode=production \
+ CC=mpicc $GITHUB_WORKSPACE/configure \
+ --enable-build-mode=${{ inputs.build_mode }} \
--enable-deprecated-symbols \
--with-default-api-version=v114 \
--enable-shared \
--enable-parallel \
- --enable-subfiling-vfd \
--disable-cxx \
- --disable-fortran \
+ --enable-fortran \
--disable-java \
--disable-mirror-vfd \
- --enable-direct-vfd \
+ --disable-direct-vfd \
--disable-ros3-vfd \
+ --with-szlib=yes
shell: bash
# BUILD
- name: Autotools Build
run: make -j3
working-directory: ${{ runner.workspace }}/build
-
- # INSTALL (note that this runs even when we don't run the tests)
- - name: Autotools Install
- run: make install
- working-directory: ${{ runner.workspace }}/build
-
- - name: Autotools Verify Install
- run: make check-install
- working-directory: ${{ runner.workspace }}/build
-
diff --git a/.github/workflows/main-auto.yml b/.github/workflows/main-auto.yml
index 7147dd6..f8b806d 100644
--- a/.github/workflows/main-auto.yml
+++ b/.github/workflows/main-auto.yml
@@ -120,63 +120,3 @@ jobs:
- name: Autotools Verify Install
run: make check-install
working-directory: ${{ runner.workspace }}/build
-
- #
- # The GitHub runners are inadequate for running parallel HDF5 tests,
- # so we catch most issues in daily testing. What we have here is just
- # a compile check to make sure nothing obvious is broken.
- # A workflow that builds the library
- # Parallel Linux (Ubuntu) w/ gcc + Autotools
- #
- Autotools_build_parallel:
- name: "Parallel GCC-${{ inputs.build_mode }}-TS=${{ inputs.thread_safety }}d"
- # Don't run the action if the commit message says to skip CI
- if: "!contains(github.event.head_commit.message, 'skip-ci')"
-
- # The type of runner that the job will run on
- runs-on: ubuntu-latest
-
- # Steps represent a sequence of tasks that will be executed as part of the job
- steps:
- # SETUP
- - name: Install Linux Dependencies
- run: |
- sudo apt update
- sudo apt-get install ninja-build doxygen graphviz
- sudo apt install libssl3 libssl-dev libcurl4 libcurl4-openssl-dev
- sudo apt install gcc-12 g++-12 gfortran-12
- sudo apt install automake autoconf libtool libtool-bin
- sudo apt install libaec0 libaec-dev
- sudo apt install openmpi-bin openmpi-common mpi-default-dev
- echo "CC=mpicc" >> $GITHUB_ENV
- echo "FC=mpif90" >> $GITHUB_ENV
-
- # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- - name: Get Sources
- uses: actions/checkout@v4.1.1
-
- # AUTOTOOLS CONFIGURE
- - name: Autotools Configure
- run: |
- sh ./autogen.sh
- mkdir "${{ runner.workspace }}/build"
- cd "${{ runner.workspace }}/build"
- CC=mpicc $GITHUB_WORKSPACE/configure \
- --enable-build-mode=${{ inputs.build_mode }} \
- --enable-deprecated-symbols \
- --with-default-api-version=v114 \
- --enable-shared \
- --enable-parallel \
- --disable-cxx \
- --enable-fortran \
- --disable-java \
- --disable-mirror-vfd \
- --disable-direct-vfd \
- --disable-ros3-vfd \
- --with-szlib=yes
- shell: bash
-
- # BUILD
- - name: Autotools Build
- run: make -j3
- working-directory: ${{ runner.workspace }}/build
diff --git a/.github/workflows/main-cmake-par.yml b/.github/workflows/main-cmake-par.yml
new file mode 100644
index 0000000..8b5adb8
--- /dev/null
+++ b/.github/workflows/main-cmake-par.yml
@@ -0,0 +1,77 @@
+name: hdf5 1.14 PAR CMake CI
+
+# Controls when the action will run. Triggers the workflow on a call
+on:
+ workflow_call:
+ inputs:
+ build_mode:
+ description: "release vs. debug build"
+ required: true
+ type: string
+
+permissions:
+ contents: read
+
+# A workflow run is made up of one or more jobs that can run sequentially or
+# in parallel. We just have one job, but the matrix items defined below will
+# run in parallel.
+jobs:
+ #
+ # The GitHub runners are inadequate for running parallel HDF5 tests,
+ # so we catch most issues in daily testing. What we have here is just
+ # a compile check to make sure nothing obvious is broken.
+ # A workflow that builds the library
+ # Parallel Linux (Ubuntu) w/ gcc + CMake
+ #
+ CMake_build_parallel:
+ name: "Parallel GCC-${{ inputs.build_mode }}"
+ # Don't run the action if the commit message says to skip CI
+ if: "!contains(github.event.head_commit.message, 'skip-ci')"
+
+ # The type of runner that the job will run on
+ runs-on: ubuntu-latest
+
+ # Steps represent a sequence of tasks that will be executed as part of the job
+ steps:
+ # SETUP
+ - name: Install Linux Dependencies
+ run: |
+ sudo apt update
+ sudo apt-get install ninja-build doxygen graphviz
+ sudo apt install libssl3 libssl-dev libcurl4 libcurl4-openssl-dev
+ sudo apt install gcc-12 g++-12 gfortran-12
+ sudo apt install libaec0 libaec-dev
+ sudo apt install openmpi-bin openmpi-common mpi-default-dev
+ echo "CC=mpicc" >> $GITHUB_ENV
+ echo "FC=mpif90" >> $GITHUB_ENV
+
+ # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
+ - name: Get Sources
+ uses: actions/checkout@v4.1.1
+
+ # CMAKE CONFIGURE
+ - name: CMake Configure
+ run: |
+ mkdir "${{ runner.workspace }}/build"
+ cd "${{ runner.workspace }}/build"
+ CC=mpicc cmake -C $GITHUB_WORKSPACE/config/cmake/cacheinit.cmake \
+ -DCMAKE_BUILD_TYPE=${{ inputs.build_mode }} \
+ -DBUILD_SHARED_LIBS=ON \
+ -DHDF5_ENABLE_ALL_WARNINGS=ON \
+ -DHDF5_ENABLE_PARALLEL:BOOL=ON \
+ -DHDF5_BUILD_CPP_LIB:BOOL=OFF \
+ -DHDF5_BUILD_FORTRAN=ON \
+ -DHDF5_BUILD_JAVA=OFF \
+ -DLIBAEC_USE_LOCALCONTENT=OFF \
+ -DZLIB_USE_LOCALCONTENT=OFF \
+ -DHDF5_ENABLE_MIRROR_VFD:BOOL=OFF \
+ -DHDF5_ENABLE_DIRECT_VFD:BOOL=OFF \
+ -DHDF5_ENABLE_ROS3_VFD:BOOL=OFF \
+ -DHDF5_PACK_EXAMPLES:BOOL=ON \
+ $GITHUB_WORKSPACE
+ shell: bash
+
+ # BUILD
+ - name: CMake Build
+ run: cmake --build . --parallel 3 --config ${{ inputs.build_mode }}
+ working-directory: ${{ runner.workspace }}/build
diff --git a/.github/workflows/main-cmake.yml b/.github/workflows/main-cmake.yml
index 935c7eb..5d36b6a 100644
--- a/.github/workflows/main-cmake.yml
+++ b/.github/workflows/main-cmake.yml
@@ -5,9 +5,10 @@ on:
workflow_call:
inputs:
thread_safety:
- description: "thread-safety on/off"
+ description: "TS or empty"
required: true
- type: boolean
+ type: string
+
build_mode:
description: "release vs. debug build"
required: true
@@ -47,12 +48,12 @@ jobs:
#
# No Fortran, parallel, or VFDs that rely on POSIX things
- name: "Windows MSVC"
- os: windows-2022
+ os: windows-latest
toolchain: ""
cpp: ON
fortran: OFF
java: ON
- docs: OFF
+ docs: ON
libaecfc: ON
localaec: OFF
zlibfc: ON
@@ -82,7 +83,7 @@ jobs:
mirror_vfd: ON
direct_vfd: ON
ros3_vfd: ON
- toolchain: "config/toolchain/gcc.cmake"
+ toolchain: "-DCMAKE_TOOLCHAIN_FILE=config/toolchain/gcc.cmake"
generator: "-G Ninja"
run_tests: true
@@ -96,7 +97,7 @@ jobs:
cpp: ON
fortran: OFF
java: ON
- docs: OFF
+ docs: ON
libaecfc: ON
localaec: OFF
zlibfc: ON
@@ -105,14 +106,14 @@ jobs:
mirror_vfd: ON
direct_vfd: OFF
ros3_vfd: OFF
- toolchain: "config/toolchain/clang.cmake"
+ toolchain: "-DCMAKE_TOOLCHAIN_FILE=config/toolchain/clang.cmake"
generator: "-G Ninja"
run_tests: true
# Sets the job's name from the properties
- name: "${{ matrix.name }}-${{ inputs.build_mode }}-TS=${{ inputs.thread_safety }}"
+ name: "${{ matrix.name }}-${{ inputs.build_mode }}-${{ inputs.thread_safety }}"
# Don't run the action if the commit message says to skip CI
if: "!contains(github.event.head_commit.message, 'skip-ci')"
@@ -136,7 +137,7 @@ jobs:
- name: Install Linux Dependencies
run: |
sudo apt update
- sudo apt-get install ninja-build doxygen graphviz
+ sudo apt-get install ninja-build graphviz
sudo apt install libssl3 libssl-dev libcurl4 libcurl4-openssl-dev
sudo apt install gcc-12 g++-12 gfortran-12
echo "CC=gcc-12" >> $GITHUB_ENV
@@ -151,16 +152,21 @@ jobs:
if: matrix.os == 'windows-latest'
- name: Install Dependencies (macOS)
- run: brew install ninja doxygen
+ run: brew install ninja
if: matrix.os == 'macos-13'
+ - name: Install Dependencies
+ uses: ssciwr/doxygen-install@v1
+ with:
+ version: "1.9.7"
+
- name: Set environment for MSVC (Windows)
run: |
# Set these environment variables so CMake picks the correct compiler
echo "CXX=cl.exe" >> $GITHUB_ENV
echo "CC=cl.exe" >> $GITHUB_ENV
if: matrix.os == 'windows-latest'
-
+
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- name: Get Sources
uses: actions/checkout@v4.1.1
@@ -176,9 +182,10 @@ jobs:
cmake -C $GITHUB_WORKSPACE/config/cmake/cacheinit.cmake \
${{ matrix.generator }} \
-DCMAKE_BUILD_TYPE=${{ inputs.build_mode }} \
- -DCMAKE_TOOLCHAIN_FILE=${{ matrix.toolchain }} \
+ ${{ matrix.toolchain }} \
-DBUILD_SHARED_LIBS=ON \
-DHDF5_ENABLE_ALL_WARNINGS=ON \
+ -DHDF5_ENABLE_DOXY_WARNINGS=ON \
-DHDF5_ENABLE_PARALLEL:BOOL=${{ matrix.parallel }} \
-DHDF5_BUILD_CPP_LIB:BOOL=${{ matrix.cpp }} \
-DHDF5_BUILD_FORTRAN=${{ matrix.fortran }} \
@@ -189,9 +196,11 @@ jobs:
-DHDF5_ENABLE_MIRROR_VFD:BOOL=${{ matrix.mirror_vfd }} \
-DHDF5_ENABLE_DIRECT_VFD:BOOL=${{ matrix.direct_vfd }} \
-DHDF5_ENABLE_ROS3_VFD:BOOL=${{ matrix.ros3_vfd }} \
+ -DHDF5_PACK_EXAMPLES:BOOL=ON \
+ -DHDF5_PACKAGE_EXTLIBS:BOOL=ON \
$GITHUB_WORKSPACE
shell: bash
- if: "! (matrix.thread_safety)"
+ if: ${{ inputs.thread_safety != 'TS' }}
- name: CMake Configure (Thread-Safe)
@@ -201,8 +210,9 @@ jobs:
cmake -C $GITHUB_WORKSPACE/config/cmake/cacheinit.cmake \
${{ matrix.generator }} \
-DCMAKE_BUILD_TYPE=${{ inputs.build_mode }} \
- -DCMAKE_TOOLCHAIN_FILE=${{ matrix.toolchain }} \
+ ${{ matrix.toolchain }} \
-DBUILD_SHARED_LIBS=ON \
+ -DBUILD_STATIC_LIBS=${{ (matrix.os != 'windows-latest') }} \
-DHDF5_ENABLE_ALL_WARNINGS=ON \
-DHDF5_ENABLE_THREADSAFE:BOOL=ON \
-DHDF5_ENABLE_PARALLEL:BOOL=${{ matrix.parallel }} \
@@ -216,9 +226,10 @@ jobs:
-DHDF5_ENABLE_MIRROR_VFD:BOOL=${{ matrix.mirror_vfd }} \
-DHDF5_ENABLE_DIRECT_VFD:BOOL=${{ matrix.direct_vfd }} \
-DHDF5_ENABLE_ROS3_VFD:BOOL=${{ matrix.ros3_vfd }} \
+ -DHDF5_PACK_EXAMPLES:BOOL=ON \
$GITHUB_WORKSPACE
shell: bash
- if: (matrix.thread_safety)
+ if: ${{ inputs.thread_safety == 'TS' }}
#
# BUILD
@@ -237,75 +248,48 @@ jobs:
- name: CMake Run Tests
run: ctest . --parallel 2 -C ${{ inputs.build_mode }} -V
working-directory: ${{ runner.workspace }}/build
- if: (matrix.run_tests) && ! (matrix.thread_safety)
+ if: ${{ matrix.run_tests && (inputs.thread_safety != 'TS') }}
# THREAD-SAFE
- name: CMake Run Thread-Safe Tests
run: ctest . --parallel 2 -C ${{ inputs.build_mode }} -V -R ttsafe
working-directory: ${{ runner.workspace }}/build
- if: (matrix.run_tests) && (matrix.thread_safety)
+ if: ${{ matrix.run_tests && (inputs.thread_safety == 'TS') }}
#
# INSTALL (note that this runs even when we don't run the tests)
#
- #
- # The GitHub runners are inadequate for running parallel HDF5 tests,
- # so we catch most issues in daily testing. What we have here is just
- # a compile check to make sure nothing obvious is broken.
- # A workflow that builds the library
- # Parallel Linux (Ubuntu) w/ gcc + Autotools
- #
- CMake_build_parallel:
- name: "Parallel GCC-${{ inputs.build_mode }}-TS=${{ inputs.thread_safety }}"
- # Don't run the action if the commit message says to skip CI
- if: "!contains(github.event.head_commit.message, 'skip-ci')"
-
- # The type of runner that the job will run on
- runs-on: ubuntu-latest
-
- # Steps represent a sequence of tasks that will be executed as part of the job
- steps:
- # SETUP
- - name: Install Linux Dependencies
- run: |
- sudo apt update
- sudo apt-get install ninja-build doxygen graphviz
- sudo apt install libssl3 libssl-dev libcurl4 libcurl4-openssl-dev
- sudo apt install gcc-12 g++-12 gfortran-12
- sudo apt install libaec0 libaec-dev
- sudo apt install openmpi-bin openmpi-common mpi-default-dev
- echo "CC=mpicc" >> $GITHUB_ENV
- echo "FC=mpif90" >> $GITHUB_ENV
-
- # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- - name: Get Sources
- uses: actions/checkout@v4.1.1
+ - name: CMake Run Package
+ run: cpack -C ${{ inputs.build_mode }} -V
+ working-directory: ${{ runner.workspace }}/build
- # CMAKE CONFIGURE
- - name: CMake Configure
+ - name: List files in the space
run: |
- mkdir "${{ runner.workspace }}/build"
- cd "${{ runner.workspace }}/build"
- CC=mpicc cmake -C $GITHUB_WORKSPACE/config/cmake/cacheinit.cmake \
- -DCMAKE_BUILD_TYPE=${{ inputs.build_mode }} \
- -DCMAKE_TOOLCHAIN_FILE=${{ matrix.toolchain }} \
- -DBUILD_SHARED_LIBS=ON \
- -DHDF5_ENABLE_ALL_WARNINGS=ON \
- -DHDF5_ENABLE_PARALLEL:BOOL=ON \
- -DHDF5_BUILD_CPP_LIB:BOOL=OFF \
- -DHDF5_BUILD_FORTRAN=ON \
- -DHDF5_BUILD_JAVA=OFF \
- -DLIBAEC_USE_LOCALCONTENT=OFF \
- -DZLIB_USE_LOCALCONTENT=OFF \
- -DHDF5_ENABLE_MIRROR_VFD:BOOL=OFF \
- -DHDF5_ENABLE_DIRECT_VFD:BOOL=OFF \
- -DHDF5_ENABLE_ROS3_VFD:BOOL=OFF \
- $GITHUB_WORKSPACE
- shell: bash
-
- # BUILD
- - name: CMake Build
- run: cmake --build . --parallel 3 --config ${{ inputs.build_mode }}
- working-directory: ${{ runner.workspace }}/build
+ ls -l ${{ runner.workspace }}/build
+
+ # Save files created by ctest script
+ - name: Save published binary (Windows)
+ uses: actions/upload-artifact@v4
+ with:
+ name: zip-vs2022_cl-${{ inputs.build_mode }}-binary
+ path: ${{ runner.workspace }}/build/HDF5-*-win64.zip
+ if-no-files-found: error # 'warn' or 'ignore' are also available, defaults to `warn`
+ if: ${{ (matrix.os == 'windows-latest') && (inputs.thread_safety != 'TS') }}
+
+ - name: Save published binary (linux)
+ uses: actions/upload-artifact@v4
+ with:
+ name: tgz-ubuntu-2204_gcc-${{ inputs.build_mode }}-binary
+ path: ${{ runner.workspace }}/build/HDF5-*-Linux.tar.gz
+ if-no-files-found: error # 'warn' or 'ignore' are also available, defaults to `warn`
+ if: ${{ (matrix.os == 'ubuntu-latest') && (inputs.thread_safety != 'TS') }}
+
+ - name: Save published binary (Mac)
+ uses: actions/upload-artifact@v4
+ with:
+ name: tgz-osx12-${{ inputs.build_mode }}-binary
+ path: ${{ runner.workspace }}/build/HDF5-*-Darwin.tar.gz
+ if-no-files-found: error # 'warn' or 'ignore' are also available, defaults to `warn`
+ if: ${{ (matrix.os == 'macos-13') && (inputs.thread_safety != 'TS') }}
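The hunks above replace the old matrix-boolean gating (`if: (matrix.thread_safety)`) with string comparison against a workflow input (`inputs.thread_safety == 'TS'`). The selection logic can be sketched as plain booleans; the step names mirror the workflow, but the helper itself is illustrative, not part of the change:

```python
def select_steps(thread_safety: str, run_tests: bool) -> list:
    """Mirror the workflow's `if:` conditions after this diff:
    the literal 'TS' selects the thread-safe configure/test steps,
    any other value selects the regular ones."""
    steps = []
    if thread_safety != "TS":
        steps.append("CMake Configure")
        if run_tests:
            steps.append("CMake Run Tests")
    else:
        steps.append("CMake Configure (Thread-Safe)")
        if run_tests:
            steps.append("CMake Run Thread-Safe Tests")
    return steps

print(select_steps("TS", True))
```

Note that exactly one configure step fires for any input value, which is why the diff flips only the comparison operator between the two steps.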
diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml
index df3bcec..7e9805a 100644
--- a/.github/workflows/main.yml
+++ b/.github/workflows/main.yml
@@ -30,7 +30,10 @@ jobs:
workflow-autotools:
name: "Autotools Workflows"
uses: ./.github/workflows/autotools.yml
+ if: "!contains(github.event.head_commit.message, 'skip-ci')"
workflow-cmake:
name: "CMake Workflows"
uses: ./.github/workflows/cmake.yml
+ if: "!contains(github.event.head_commit.message, 'skip-ci')"
+
diff --git a/.github/workflows/nvhpc-cmake.yml b/.github/workflows/nvhpc-cmake.yml
index 0fd6974..b0b3143 100644
--- a/.github/workflows/nvhpc-cmake.yml
+++ b/.github/workflows/nvhpc-cmake.yml
@@ -33,8 +33,8 @@ jobs:
echo 'deb [signed-by=/usr/share/keyrings/nvidia-hpcsdk-archive-keyring.gpg] https://developer.download.nvidia.com/hpc-sdk/ubuntu/amd64 /' | sudo tee /etc/apt/sources.list.d/nvhpc.list
sudo apt-get update -y
sudo apt-get install -y nvhpc-23-9
- echo "CC=nvc" >> $GITHUB_ENV
- echo "FC=nvfortran" >> $GITHUB_ENV
+ echo "CC=/opt/nvidia/hpc_sdk/Linux_x86_64/23.9/comm_libs/openmpi4/bin/mpicc" >> $GITHUB_ENV
+ echo "FC=/opt/nvidia/hpc_sdk/Linux_x86_64/23.9/comm_libs/openmpi4/bin/mpifort" >> $GITHUB_ENV
echo "NVHPCSDK=/opt/nvidia/hpc_sdk" >> $GITHUB_ENV
echo "OMPI_CXX=/opt/nvidia/hpc_sdk/Linux_x86_64/23.9/compilers/bin/nvc++" >> $GITHUB_ENV
echo "OMPI_CC=/opt/nvidia/hpc_sdk/Linux_x86_64/23.9/compilers/bin/nvc" >> $GITHUB_ENV
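The `echo "CC=..." >> $GITHUB_ENV` lines above work because GitHub Actions re-reads the file named by `$GITHUB_ENV` after each step and exports its `KEY=VALUE` lines to subsequent steps. A minimal sketch of that parse (the parser is illustrative; Actions also supports a multiline-delimiter syntax omitted here):

```python
import io

# Simulated contents of the $GITHUB_ENV file after the step above runs.
env_file = io.StringIO(
    "CC=/opt/nvidia/hpc_sdk/Linux_x86_64/23.9/comm_libs/openmpi4/bin/mpicc\n"
    "FC=/opt/nvidia/hpc_sdk/Linux_x86_64/23.9/comm_libs/openmpi4/bin/mpifort\n"
)

def parse_env_file(fh) -> dict:
    """Collect simple KEY=VALUE lines into a dict, as later steps see them."""
    env = {}
    for line in fh:
        key, _, value = line.rstrip("\n").partition("=")
        if key:
            env[key] = value
    return env

env = parse_env_file(env_file)
print(env["CC"])
```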
diff --git a/.github/workflows/release-files.yml b/.github/workflows/release-files.yml
index 732fa1e..e443588 100644
--- a/.github/workflows/release-files.yml
+++ b/.github/workflows/release-files.yml
@@ -75,7 +75,7 @@ jobs:
# Get files created by tarball script
- name: Get doxygen (Linux)
- uses: actions/download-artifact@f44cd7b40bfd40b6aa1cc1b9b5b7bf03d3c67110 # v4.1.0
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: docs-doxygen
path: ${{ github.workspace }}/${{ steps.get-file-base.outputs.FILE_BASE }}.doxygen
@@ -84,48 +84,60 @@ jobs:
run: zip -r ${{ steps.get-file-base.outputs.FILE_BASE }}.doxygen.zip ./${{ steps.get-file-base.outputs.FILE_BASE }}.doxygen
- name: Get tgz-tarball (Linux)
- uses: actions/download-artifact@f44cd7b40bfd40b6aa1cc1b9b5b7bf03d3c67110 # v4.1.0
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: tgz-tarball
path: ${{ github.workspace }}
- name: Get zip-tarball (Windows)
- uses: actions/download-artifact@f44cd7b40bfd40b6aa1cc1b9b5b7bf03d3c67110 # v4.1.0
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: zip-tarball
path: ${{ github.workspace }}
# Get files created by cmake-ctest script
- name: Get published binary (Windows)
- uses: actions/download-artifact@f44cd7b40bfd40b6aa1cc1b9b5b7bf03d3c67110 # v4.1.0
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: zip-vs2022_cl-binary
path: ${{ github.workspace }}
- name: Get published binary (MacOS)
- uses: actions/download-artifact@f44cd7b40bfd40b6aa1cc1b9b5b7bf03d3c67110 # v4.1.0
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: tgz-osx12-binary
path: ${{ github.workspace }}
- name: Get published binary (Linux)
- uses: actions/download-artifact@f44cd7b40bfd40b6aa1cc1b9b5b7bf03d3c67110 # v4.1.0
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: tgz-ubuntu-2204_gcc-binary
path: ${{ github.workspace }}
+ - name: Get published binary (Linux S3)
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
+ with:
+ name: tgz-ubuntu-2204_gcc_s3-binary
+ path: ${{ github.workspace }}
+
- name: Get published binary (Windows_intel)
- uses: actions/download-artifact@f44cd7b40bfd40b6aa1cc1b9b5b7bf03d3c67110 # v4.1.0
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: zip-vs2022_intel-binary
path: ${{ github.workspace }}
- name: Get published binary (Linux_intel)
- uses: actions/download-artifact@f44cd7b40bfd40b6aa1cc1b9b5b7bf03d3c67110 # v4.1.0
+ uses: actions/download-artifact@6b208ae046db98c579e8a3aa621ab581ff575935 # v4.1.1
with:
name: tgz-ubuntu-2204_intel-binary
path: ${{ github.workspace }}
+ - name: Get published abi reports (Linux)
+ uses: actions/download-artifact@f44cd7b40bfd40b6aa1cc1b9b5b7bf03d3c67110 # v4.1.0
+ with:
+ name: abi-reports
+ path: ${{ github.workspace }}
+
- name: Store snapshot name
run: |
echo "${{ steps.get-file-base.outputs.FILE_BASE }}" > ./last-file.txt
@@ -139,11 +151,16 @@ jobs:
prerelease: true
files: |
last-file.txt
+ ${{ steps.get-file-base.outputs.FILE_BASE }}-hdf5_compat_report.html
+ ${{ steps.get-file-base.outputs.FILE_BASE }}-hdf5_hl_compat_report.html
+ ${{ steps.get-file-base.outputs.FILE_BASE }}-hdf5_cpp_compat_report.html
+ ${{ steps.get-file-base.outputs.FILE_BASE }}-java_compat_report.html
${{ steps.get-file-base.outputs.FILE_BASE }}.doxygen.zip
${{ steps.get-file-base.outputs.FILE_BASE }}.tar.gz
${{ steps.get-file-base.outputs.FILE_BASE }}.zip
${{ steps.get-file-base.outputs.FILE_BASE }}-osx12.tar.gz
${{ steps.get-file-base.outputs.FILE_BASE }}-ubuntu-2204_gcc.tar.gz
+ ${{ steps.get-file-base.outputs.FILE_BASE }}-ubuntu-2204_gcc_s3.tar.gz
${{ steps.get-file-base.outputs.FILE_BASE }}-win-vs2022_cl.zip
${{ steps.get-file-base.outputs.FILE_BASE }}-ubuntu-2204_intel.tar.gz
${{ steps.get-file-base.outputs.FILE_BASE }}-win-vs2022_intel.zip
@@ -158,11 +175,16 @@ jobs:
prerelease: false
#body_path: ${{ github.workspace }}-CHANGELOG.txt
files: |
+ ${{ steps.get-file-base.outputs.FILE_BASE }}-hdf5_compat_report.html
+ ${{ steps.get-file-base.outputs.FILE_BASE }}-hdf5_hl_compat_report.html
+ ${{ steps.get-file-base.outputs.FILE_BASE }}-hdf5_cpp_compat_report.html
+ ${{ steps.get-file-base.outputs.FILE_BASE }}-java_compat_report.html
${{ steps.get-file-base.outputs.FILE_BASE }}.doxygen.zip
${{ steps.get-file-base.outputs.FILE_BASE }}.tar.gz
${{ steps.get-file-base.outputs.FILE_BASE }}.zip
${{ steps.get-file-base.outputs.FILE_BASE }}-osx12.tar.gz
${{ steps.get-file-base.outputs.FILE_BASE }}-ubuntu-2204_gcc.tar.gz
+ ${{ steps.get-file-base.outputs.FILE_BASE }}-ubuntu-2204_gcc_s3.tar.gz
${{ steps.get-file-base.outputs.FILE_BASE }}-win-vs2022_cl.zip
${{ steps.get-file-base.outputs.FILE_BASE }}-ubuntu-2204_intel.tar.gz
${{ steps.get-file-base.outputs.FILE_BASE }}-win-vs2022_intel.zip
diff --git a/.github/workflows/release.yml b/.github/workflows/release.yml
index 2e65978..768581d 100644
--- a/.github/workflows/release.yml
+++ b/.github/workflows/release.yml
@@ -66,14 +66,14 @@ jobs:
# Save files created by release script
- name: Save tgz-tarball
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: tgz-tarball
path: ${{ steps.set-file-base.outputs.FILE_BASE }}.tar.gz
if-no-files-found: error # 'warn' or 'ignore' are also available, defaults to `warn`
- name: Save zip-tarball
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: zip-tarball
path: ${{ steps.set-file-base.outputs.FILE_BASE }}.zip
@@ -84,10 +84,20 @@ jobs:
uses: ./.github/workflows/cmake-ctest.yml
with:
file_base: ${{ needs.create-files-ctest.outputs.file_base }}
+ preset_name: ci-StdShar
+
+ call-workflow-abi:
+ needs: [log-the-inputs, create-files-ctest, call-workflow-ctest]
+ uses: ./.github/workflows/abi-report.yml
+ with:
+ file_ref: '1_14_3'
+ file_base: ${{ needs.create-files-ctest.outputs.file_base }}
+ use_tag: ${{ needs.log-the-inputs.outputs.rel_tag }}
+ use_environ: release
call-workflow-release:
#needs: [call-workflow-tarball, call-workflow-ctest]
- needs: [log-the-inputs, create-files-ctest, call-workflow-ctest]
+ needs: [log-the-inputs, create-files-ctest, call-workflow-ctest, call-workflow-abi]
permissions:
contents: write # In order to allow tag creation
uses: ./.github/workflows/release-files.yml
diff --git a/.github/workflows/tarball.yml b/.github/workflows/tarball.yml
index 66b1e7d..4e2d4a4 100644
--- a/.github/workflows/tarball.yml
+++ b/.github/workflows/tarball.yml
@@ -133,14 +133,14 @@ jobs:
# Save files created by release script
- name: Save tgz-tarball
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: tgz-tarball
path: ${{ steps.set-file-base.outputs.FILE_BASE }}.tar.gz
if-no-files-found: error # 'warn' or 'ignore' are also available, defaults to `warn`
- name: Save zip-tarball
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: zip-tarball
path: ${{ steps.set-file-base.outputs.FILE_BASE }}.zip
diff --git a/CITATION.cff b/CITATION.cff
new file mode 100644
index 0000000..4e611a5
--- /dev/null
+++ b/CITATION.cff
@@ -0,0 +1,12 @@
+cff-version: 1.2.0
+title: 'Hierarchical Data Format, version 5'
+message: >-
+ If you use this software, please cite it using the
+ metadata from this file.
+type: software
+authors:
+ - name: The HDF Group
+ website: 'https://www.hdfgroup.org'
+repository-code: 'https://github.com/HDFGroup/hdf5'
+url: 'https://www.hdfgroup.org/HDF5/'
+repository-artifact: 'https://www.hdfgroup.org/downloads/hdf5/'
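The new CITATION.cff is plain YAML following the Citation File Format. As a sketch, its top-level scalar keys can be read without a YAML dependency (the sample text is copied from the diff; the parsing helper is illustrative and ignores nested keys such as `authors` and the folded `message`):

```python
CFF_TEXT = """\
cff-version: 1.2.0
title: 'Hierarchical Data Format, version 5'
type: software
repository-code: 'https://github.com/HDFGroup/hdf5'
"""

def scalar_fields(text: str) -> dict:
    """Collect top-level `key: value` pairs, stripping quotes.
    Indented (nested) lines and valueless keys are skipped."""
    out = {}
    for line in text.splitlines():
        if line[:1].isspace() or ":" not in line:
            continue
        key, _, value = line.partition(":")
        value = value.strip().strip("'\"")
        if value:
            out[key.strip()] = value
    return out

print(scalar_fields(CFF_TEXT)["cff-version"])
```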
diff --git a/CMakeInstallation.cmake b/CMakeInstallation.cmake
index aae6d65..9229c4e 100644
--- a/CMakeInstallation.cmake
+++ b/CMakeInstallation.cmake
@@ -356,7 +356,10 @@ if (NOT HDF5_EXTERNALLY_CONFIGURED AND NOT HDF5_NO_PACKAGES)
endif ()
elseif (APPLE)
list (APPEND CPACK_GENERATOR "STGZ")
- list (APPEND CPACK_GENERATOR "DragNDrop")
+ option (HDF5_PACK_MACOSX_DMG "Package the HDF5 Library using DragNDrop" OFF)
+ if (HDF5_PACK_MACOSX_DMG)
+ list (APPEND CPACK_GENERATOR "DragNDrop")
+ endif ()
set (CPACK_COMPONENTS_ALL_IN_ONE_PACKAGE ON)
set (CPACK_PACKAGING_INSTALL_PREFIX "/${CPACK_PACKAGE_INSTALL_DIRECTORY}")
set (CPACK_PACKAGE_ICON "${HDF_RESOURCES_DIR}/hdf.icns")
diff --git a/CMakeLists.txt b/CMakeLists.txt
index 1994458..2446958 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -953,7 +953,7 @@ if (HDF5_BUILD_DOC AND EXISTS "${HDF5_DOXYGEN_DIR}" AND IS_DIRECTORY "${HDF5_DOX
# check if Doxygen is installed
find_package(Doxygen)
if (DOXYGEN_FOUND)
- option (HDF5_ENABLE_DOXY_WARNINGS "Enable fail if doxygen parsing has warnings." ON)
+ option (HDF5_ENABLE_DOXY_WARNINGS "Enable fail if doxygen parsing has warnings." OFF)
mark_as_advanced (HDF5_ENABLE_DOXY_WARNINGS)
if (HDF5_ENABLE_DOXY_WARNINGS)
set (HDF5_DOXY_WARNINGS "FAIL_ON_WARNINGS")
@@ -1077,6 +1077,13 @@ if (EXISTS "${HDF5_SOURCE_DIR}/fortran" AND IS_DIRECTORY "${HDF5_SOURCE_DIR}/for
if (MPI_Fortran_LINK_FLAGS)
set (CMAKE_Fortran_EXE_LINKER_FLAGS "${MPI_Fortran_LINK_FLAGS} ${CMAKE_EXE_LINKER_FLAGS}")
endif ()
+ # Check if MPI-3 Fortran 2008 module mpi_f08 is supported
+ if (MPI_Fortran_HAVE_F08_MODULE)
+ set (H5_HAVE_MPI_F08 1)
+ message (VERBOSE "MPI-3 Fortran 2008 module mpi_f08 is supported")
+ else ()
+ message (VERBOSE "MPI-3 Fortran 2008 module mpi_f08 is NOT supported")
+ endif ()
endif ()
#option (HDF5_INSTALL_MOD_FORTRAN "Copy FORTRAN mod files to include directory (NO SHARED STATIC)" "NO")
@@ -1148,6 +1155,11 @@ if (EXISTS "${HDF5_SOURCE_DIR}/java" AND IS_DIRECTORY "${HDF5_SOURCE_DIR}/java")
endif ()
#-----------------------------------------------------------------------------
+# Generate the H5pubconf.h file containing user settings needed by compilation
+#-----------------------------------------------------------------------------
+configure_file (${HDF_RESOURCES_DIR}/H5pubconf.h.in ${HDF5_SRC_BINARY_DIR}/H5pubconf.h @ONLY)
+
+#-----------------------------------------------------------------------------
# Option to build examples
#-----------------------------------------------------------------------------
if (EXISTS "${HDF5_SOURCE_DIR}/HDF5Examples" AND IS_DIRECTORY "${HDF5_SOURCE_DIR}/HDF5Examples")
@@ -1159,9 +1171,4 @@ if (EXISTS "${HDF5_SOURCE_DIR}/HDF5Examples" AND IS_DIRECTORY "${HDF5_SOURCE_DIR
endif ()
endif ()
-#-----------------------------------------------------------------------------
-# Generate the H5pubconf.h file containing user settings needed by compilation
-#-----------------------------------------------------------------------------
-configure_file (${HDF_RESOURCES_DIR}/H5pubconf.h.in ${HDF5_SRC_BINARY_DIR}/H5pubconf.h @ONLY)
-
include (CMakeInstallation.cmake)
diff --git a/CMakePresets.json b/CMakePresets.json
index 84b4f2f..61afadd 100644
--- a/CMakePresets.json
+++ b/CMakePresets.json
@@ -94,6 +94,14 @@
}
},
{
+ "name": "ci-S3",
+ "hidden": true,
+ "cacheVariables": {
+ "HDF5_ENABLE_ROS3_VFD": "ON",
+ "HDF5_ENABLE_HDFS": "OFF"
+ }
+ },
+ {
"name": "ci-StdShar",
"hidden": true,
"inherits": ["ci-StdCompression", "ci-StdExamples", "ci-StdPlugins"],
@@ -150,6 +158,14 @@
]
},
{
+ "name": "ci-StdShar-GNUC-S3",
+ "description": "GNUC S3 Config for x64 (Release)",
+ "inherits": [
+ "ci-StdShar-GNUC",
+ "ci-S3"
+ ]
+ },
+ {
"name": "ci-StdShar-Intel",
"description": "Intel Standard Config for x64 (Release)",
"inherits": [
@@ -188,6 +204,15 @@
]
},
{
+ "name": "ci-StdShar-GNUC-S3",
+ "description": "GNUC S3 Build for x64 (Release)",
+ "configurePreset": "ci-StdShar-GNUC-S3",
+ "verbose": true,
+ "inherits": [
+ "ci-x64-Release-GNUC"
+ ]
+ },
+ {
"name": "ci-StdShar-Intel",
"description": "Intel Standard Build for x64 (Release)",
"configurePreset": "ci-StdShar-Intel",
@@ -218,6 +243,23 @@
]
},
{
+ "name": "ci-StdShar-OSX-Clang",
+ "configurePreset": "ci-StdShar-Clang",
+ "inherits": [
+ "ci-x64-Release-Clang"
+ ],
+ "execution": {
+ "noTestsAction": "error",
+ "timeout": 180,
+ "jobs": 2
+ },
+ "condition": {
+ "type": "equals",
+ "lhs": "${hostSystemName}",
+ "rhs": "Darwin"
+ }
+ },
+ {
"name": "ci-StdShar-GNUC",
"configurePreset": "ci-StdShar-GNUC",
"inherits": [
@@ -225,7 +267,14 @@
]
},
{
- "name": "ci-StdShar-Intel",
+ "name": "ci-StdShar-GNUC-S3",
+ "configurePreset": "ci-StdShar-GNUC-S3",
+ "inherits": [
+ "ci-x64-Release-GNUC"
+ ]
+ },
+ {
+ "name": "ci-StdShar-win-Intel",
"configurePreset": "ci-StdShar-Intel",
"inherits": [
"ci-x64-Release-Intel"
@@ -234,7 +283,19 @@
"exclude": {
"name": "H5DUMP-tfloatsattrs"
}
+ },
+ "condition": {
+ "type": "equals",
+ "lhs": "${hostSystemName}",
+ "rhs": "Windows"
}
+ },
+ {
+ "name": "ci-StdShar-Intel",
+ "configurePreset": "ci-StdShar-Intel",
+ "inherits": [
+ "ci-x64-Release-Intel"
+ ]
}
],
"packagePresets": [
@@ -254,6 +315,11 @@
"inherits": "ci-x64-Release-GNUC"
},
{
+ "name": "ci-StdShar-GNUC-S3",
+ "configurePreset": "ci-StdShar-GNUC-S3",
+ "inherits": "ci-x64-Release-GNUC"
+ },
+ {
"name": "ci-StdShar-Intel",
"configurePreset": "ci-StdShar-Intel",
"inherits": "ci-x64-Release-Intel"
@@ -279,6 +345,15 @@
]
},
{
+ "name": "ci-StdShar-OSX-Clang",
+ "steps": [
+ {"type": "configure", "name": "ci-StdShar-Clang"},
+ {"type": "build", "name": "ci-StdShar-Clang"},
+ {"type": "test", "name": "ci-StdShar-OSX-Clang"},
+ {"type": "package", "name": "ci-StdShar-Clang"}
+ ]
+ },
+ {
"name": "ci-StdShar-GNUC",
"steps": [
{"type": "configure", "name": "ci-StdShar-GNUC"},
@@ -288,6 +363,15 @@
]
},
{
+ "name": "ci-StdShar-GNUC-S3",
+ "steps": [
+ {"type": "configure", "name": "ci-StdShar-GNUC-S3"},
+ {"type": "build", "name": "ci-StdShar-GNUC-S3"},
+ {"type": "test", "name": "ci-StdShar-GNUC-S3"},
+ {"type": "package", "name": "ci-StdShar-GNUC-S3"}
+ ]
+ },
+ {
"name": "ci-StdShar-Intel",
"steps": [
{"type": "configure", "name": "ci-StdShar-Intel"},
@@ -295,7 +379,15 @@
{"type": "test", "name": "ci-StdShar-Intel"},
{"type": "package", "name": "ci-StdShar-Intel"}
]
+ },
+ {
+ "name": "ci-StdShar-win-Intel",
+ "steps": [
+ {"type": "configure", "name": "ci-StdShar-Intel"},
+ {"type": "build", "name": "ci-StdShar-Intel"},
+ {"type": "test", "name": "ci-StdShar-win-Intel"},
+ {"type": "package", "name": "ci-StdShar-Intel"}
+ ]
}
]
}
-
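The CMakePresets.json additions above (e.g. `ci-StdShar-GNUC-S3`, `ci-StdShar-win-Intel`) only work if every build/test/package preset's `configurePreset` names a defined configure preset. A small consistency check one could run over such a file — the embedded JSON is a trimmed stand-in for the real presets, and the checker is illustrative:

```python
import json

presets_json = json.loads("""
{
  "configurePresets": [
    {"name": "ci-StdShar-GNUC"},
    {"name": "ci-StdShar-GNUC-S3"}
  ],
  "buildPresets": [
    {"name": "ci-StdShar-GNUC-S3", "configurePreset": "ci-StdShar-GNUC-S3"}
  ],
  "testPresets": [
    {"name": "ci-StdShar-GNUC-S3", "configurePreset": "ci-StdShar-GNUC-S3"}
  ]
}
""")

def dangling_refs(doc: dict) -> list:
    """Return names of build/test presets whose configurePreset is undefined."""
    known = {p["name"] for p in doc.get("configurePresets", [])}
    return [p["name"]
            for kind in ("buildPresets", "testPresets")
            for p in doc.get(kind, [])
            if p.get("configurePreset") not in known]

print(dangling_refs(presets_json))
```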
diff --git a/HDF5Examples/C/CMakeLists.txt b/HDF5Examples/C/CMakeLists.txt
index 4e589bc..12882cf 100644
--- a/HDF5Examples/C/CMakeLists.txt
+++ b/HDF5Examples/C/CMakeLists.txt
@@ -1,5 +1,5 @@
cmake_minimum_required (VERSION 3.12)
-PROJECT (HDF5Examples_C)
+project (HDF5Examples_C C)
#-----------------------------------------------------------------------------
# Build the C Examples
diff --git a/HDF5Examples/C/H5PAR/CMakeLists.txt b/HDF5Examples/C/H5PAR/CMakeLists.txt
index 6e569b4..9016220 100644
--- a/HDF5Examples/C/H5PAR/CMakeLists.txt
+++ b/HDF5Examples/C/H5PAR/CMakeLists.txt
@@ -1,5 +1,5 @@
cmake_minimum_required (VERSION 3.12)
-PROJECT (H5PAR_C)
+project (H5PAR_C C)
#-----------------------------------------------------------------------------
# Define Sources
diff --git a/HDF5Examples/C/H5T/h5ex_t_convert.c b/HDF5Examples/C/H5T/h5ex_t_convert.c
index b6f46b6..b7036e3 100644
--- a/HDF5Examples/C/H5T/h5ex_t_convert.c
+++ b/HDF5Examples/C/H5T/h5ex_t_convert.c
@@ -120,7 +120,7 @@ main(void)
* Output the data to the screen.
*/
for (i = 0; i < DIM0; i++) {
- printf("sensor[%d]:\n", i);
+ printf("sensor[%" PRIuHSIZE "]:\n", i);
printf("Serial number : %d\n", sensor[i].serial_no);
printf("Location : %s\n", sensor[i].location);
printf("Temperature (F) : %f\n", sensor[i].temperature);
diff --git a/HDF5Examples/C/H5T/h5ex_t_objref.c b/HDF5Examples/C/H5T/h5ex_t_objref.c
index 1109720..660cc11 100644
--- a/HDF5Examples/C/H5T/h5ex_t_objref.c
+++ b/HDF5Examples/C/H5T/h5ex_t_objref.c
@@ -36,7 +36,7 @@ main(void)
hid_t ref_type = H5T_STD_REF; /* Reference datatype */
H5R_ref_t wdata[DIM0]; /* buffer to write to disk */
H5R_ref_t *rdata = NULL; /* buffer to read into*/
- H5R_type_t objtype; /* Reference type */
+ H5O_type_t objtype; /* Reference type */
#else
hid_t ref_type = H5T_STD_REF_OBJ; /* Reference datatype */
hobj_ref_t wdata[DIM0]; /* Write buffer */
diff --git a/HDF5Examples/C/H5T/h5ex_t_objrefatt.c b/HDF5Examples/C/H5T/h5ex_t_objrefatt.c
index a464e9e..1d9d1fe 100644
--- a/HDF5Examples/C/H5T/h5ex_t_objrefatt.c
+++ b/HDF5Examples/C/H5T/h5ex_t_objrefatt.c
@@ -38,7 +38,7 @@ main(void)
hid_t ref_type = H5T_STD_REF; /* Reference datatype */
H5R_ref_t wdata[DIM0]; /* buffer to write to disk */
H5R_ref_t *rdata = NULL; /* buffer to read into*/
- H5R_type_t objtype; /* Reference type */
+ H5O_type_t objtype; /* Reference type */
#else
hid_t ref_type = H5T_STD_REF_OBJ; /* Reference datatype */
hobj_ref_t wdata[DIM0]; /* Write buffer */
diff --git a/HDF5Examples/C/H5T/h5ex_t_opaque.c b/HDF5Examples/C/H5T/h5ex_t_opaque.c
index 085183a..11a58ae 100644
--- a/HDF5Examples/C/H5T/h5ex_t_opaque.c
+++ b/HDF5Examples/C/H5T/h5ex_t_opaque.c
@@ -111,7 +111,7 @@ main(void)
*/
printf("Datatype tag for %s is: \"%s\"\n", DATASET, tag);
for (i = 0; i < dims[0]; i++) {
- printf("%s[%u]: ", DATASET, i);
+ printf("%s[%" PRIuHSIZE "]: ", DATASET, i);
for (j = 0; j < len; j++)
printf("%c", rdata[j + i * len]);
printf("\n");
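The `%u` → `%" PRIuHSIZE "` fixes above matter because `hsize_t` is a 64-bit unsigned type: a 32-bit conversion specifier silently truncates large values. The effect can be demonstrated numerically (the masking below models what a 32-bit `%u` would print; it is an illustration, not HDF5 code):

```python
# hsize_t is 64-bit unsigned; the old "%u"/"%d" format reads only 32 bits.
value = 5_000_000_000              # a dataset index that needs > 32 bits
truncated = value & 0xFFFFFFFF     # what a 32-bit %u conversion would show
print(value, truncated)
```

PRIuHSIZE (from H5public.h, via `<inttypes.h>`-style macros) selects the correct 64-bit conversion on every platform, including LLP64 Windows.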
diff --git a/HDF5Examples/C/Perf/CMakeLists.txt b/HDF5Examples/C/Perf/CMakeLists.txt
index e41def2..66f9327 100644
--- a/HDF5Examples/C/Perf/CMakeLists.txt
+++ b/HDF5Examples/C/Perf/CMakeLists.txt
@@ -1,5 +1,5 @@
cmake_minimum_required (VERSION 3.12)
-PROJECT (HDF5Examples_C_PERFORM)
+project (HDF5Examples_C_PERFORM C)
#-----------------------------------------------------------------------------
# Define Sources
diff --git a/HDF5Examples/CMakePresets.json b/HDF5Examples/CMakePresets.json
index 263ff29..d9fdd04 100644
--- a/HDF5Examples/CMakePresets.json
+++ b/HDF5Examples/CMakePresets.json
@@ -129,6 +129,23 @@
]
},
{
+ "name": "ci-StdShar-OSX-Clang",
+ "configurePreset": "ci-StdShar-Clang",
+ "inherits": [
+ "ci-x64-Release-Clang"
+ ],
+ "execution": {
+ "noTestsAction": "error",
+ "timeout": 180,
+ "jobs": 2
+ },
+ "condition": {
+ "type": "equals",
+ "lhs": "${hostSystemName}",
+ "rhs": "Darwin"
+ }
+ },
+ {
"name": "ci-StdShar-GNUC",
"configurePreset": "ci-StdShar-GNUC",
"inherits": [
@@ -136,6 +153,23 @@
]
},
{
+ "name": "ci-StdShar-win-Intel",
+ "configurePreset": "ci-StdShar-Intel",
+ "inherits": [
+ "ci-x64-Release-Intel"
+ ],
+ "filter": {
+ "exclude": {
+ "name": "H5DUMP-tfloatsattrs"
+ }
+ },
+ "condition": {
+ "type": "equals",
+ "lhs": "${hostSystemName}",
+ "rhs": "Windows"
+ }
+ },
+ {
"name": "ci-StdShar-Intel",
"configurePreset": "ci-StdShar-Intel",
"inherits": [
@@ -161,6 +195,14 @@
]
},
{
+ "name": "ci-StdShar-OSX-Clang",
+ "steps": [
+ {"type": "configure", "name": "ci-StdShar-Clang"},
+ {"type": "build", "name": "ci-StdShar-Clang"},
+ {"type": "test", "name": "ci-StdShar-OSX-Clang"}
+ ]
+ },
+ {
"name": "ci-StdShar-GNUC",
"steps": [
{"type": "configure", "name": "ci-StdShar-GNUC"},
@@ -175,6 +217,14 @@
{"type": "build", "name": "ci-StdShar-Intel"},
{"type": "test", "name": "ci-StdShar-Intel"}
]
+ },
+ {
+ "name": "ci-StdShar-win-Intel",
+ "steps": [
+ {"type": "configure", "name": "ci-StdShar-Intel"},
+ {"type": "build", "name": "ci-StdShar-Intel"},
+ {"type": "test", "name": "ci-StdShar-win-Intel"}
+ ]
}
]
}
diff --git a/HDF5Examples/FORTRAN/H5D/h5ex_d_checksum.F90 b/HDF5Examples/FORTRAN/H5D/h5ex_d_checksum.F90
index b0464a3..cab742b 100644
--- a/HDF5Examples/FORTRAN/H5D/h5ex_d_checksum.F90
+++ b/HDF5Examples/FORTRAN/H5D/h5ex_d_checksum.F90
@@ -120,15 +120,17 @@ PROGRAM main
!
nelmts = 0
CALL H5Pget_filter_f(dcpl, 0, flags, nelmts, cd_values, MaxChrLen, name, filter_id, hdferr)
- WRITE(*,'("Filter type is: ")', ADVANCE='NO')
+ WRITE(*,'(A,1X)', ADVANCE='NO') "Filter type is:"
IF(filter_id.EQ.H5Z_FILTER_DEFLATE_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_DEFLATE_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_DEFLATE_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SHUFFLE_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SHUFFLE_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SHUFFLE_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_FLETCHER32_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_FLETCHER32_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_FLETCHER32_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SZIP_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SZIP_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SZIP_F"
+ ELSE
+ WRITE(*,'(A)') "UNKNOWN"
ENDIF
!
! Read the data using the default properties.
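The IF/ELSE IF ladder patched above (and repeated in the gzip, nbit, soint, and szip examples below) maps a filter id to a printable name, now with an `UNKNOWN` fallback. The same mapping as a lookup table — the integer values are illustrative stand-ins for the `H5Z_FILTER_*_F` constants:

```python
# Illustrative stand-ins for HDF5's H5Z_FILTER_*_F integer constants.
H5Z_FILTER_DEFLATE_F, H5Z_FILTER_SHUFFLE_F = 1, 2
H5Z_FILTER_FLETCHER32_F, H5Z_FILTER_SZIP_F = 3, 4

FILTER_NAMES = {
    H5Z_FILTER_DEFLATE_F: "H5Z_FILTER_DEFLATE_F",
    H5Z_FILTER_SHUFFLE_F: "H5Z_FILTER_SHUFFLE_F",
    H5Z_FILTER_FLETCHER32_F: "H5Z_FILTER_FLETCHER32_F",
    H5Z_FILTER_SZIP_F: "H5Z_FILTER_SZIP_F",
}

def filter_name(filter_id: int) -> str:
    """Mirror the Fortran ladder, including the new UNKNOWN fallback."""
    return FILTER_NAMES.get(filter_id, "UNKNOWN")

print(filter_name(H5Z_FILTER_FLETCHER32_F))
```

The fallback is what lets the `.tst` expected-output files stay stable even if an unexpected filter id ever comes back from `H5Pget_filter_f`.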
diff --git a/HDF5Examples/FORTRAN/H5D/h5ex_d_gzip.F90 b/HDF5Examples/FORTRAN/H5D/h5ex_d_gzip.F90
index b46e3fc..7e7b6b5 100644
--- a/HDF5Examples/FORTRAN/H5D/h5ex_d_gzip.F90
+++ b/HDF5Examples/FORTRAN/H5D/h5ex_d_gzip.F90
@@ -118,15 +118,17 @@ PROGRAM main
!
nelmts = 1
CALL H5Pget_filter_f(dcpl, 0, flags, nelmts, cd_values, MaxChrLen, name, filter_id, hdferr)
- WRITE(*,'("Filter type is: ")', ADVANCE='NO')
+ WRITE(*,'(A,1X)', ADVANCE='NO') "Filter type is:"
IF(filter_id.EQ.H5Z_FILTER_DEFLATE_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_DEFLATE_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_DEFLATE_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SHUFFLE_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SHUFFLE_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SHUFFLE_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_FLETCHER32_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_FLETCHER32_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_FLETCHER32_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SZIP_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SZIP_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SZIP_F"
+ ELSE
+ WRITE(*,'(A)') "UNKNOWN"
ENDIF
!
! Read the data using the default properties.
diff --git a/HDF5Examples/FORTRAN/H5D/h5ex_d_nbit.F90 b/HDF5Examples/FORTRAN/H5D/h5ex_d_nbit.F90
index 27e4d52..636898c 100644
--- a/HDF5Examples/FORTRAN/H5D/h5ex_d_nbit.F90
+++ b/HDF5Examples/FORTRAN/H5D/h5ex_d_nbit.F90
@@ -125,17 +125,19 @@ PROGRAM main
! first filter because we know that we only added one filter.
!
CALL H5Pget_filter_f(dcpl, 0, flags, nelmts, cd_values, INT(MaxChrLen, SIZE_T), name, filter_id, hdferr)
- WRITE(*,'("Filter type is: ")', ADVANCE='NO')
+ WRITE(*,'(A,1X)', ADVANCE='NO') "Filter type is:"
IF(filter_id.EQ.H5Z_FILTER_DEFLATE_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_DEFLATE_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_DEFLATE_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SHUFFLE_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SHUFFLE_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SHUFFLE_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_FLETCHER32_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_FLETCHER32_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_FLETCHER32_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SZIP_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SZIP_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SZIP_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_NBIT_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_NBIT_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_NBIT_F"
+ ELSE
+ WRITE(*,'(A)') "UNKNOWN"
ENDIF
!
! Read the data using the default properties.
diff --git a/HDF5Examples/FORTRAN/H5D/h5ex_d_soint.F90 b/HDF5Examples/FORTRAN/H5D/h5ex_d_soint.F90
index 120e896..e3bcc9f 100644
--- a/HDF5Examples/FORTRAN/H5D/h5ex_d_soint.F90
+++ b/HDF5Examples/FORTRAN/H5D/h5ex_d_soint.F90
@@ -133,19 +133,21 @@ PROGRAM main
!
nelmts = 1
CALL H5Pget_filter_f(dcpl, 0, flags, nelmts, cd_values, INT(MaxChrLen, SIZE_T), name, filter_id, hdferr)
- WRITE(*,'("Filter type is: ")', ADVANCE='NO')
+ WRITE(*,'(A,1X)', ADVANCE='NO') "Filter type is:"
IF(filter_id.EQ.H5Z_FILTER_DEFLATE_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_DEFLATE_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_DEFLATE_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SHUFFLE_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SHUFFLE_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SHUFFLE_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_FLETCHER32_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_FLETCHER32_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_FLETCHER32_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SZIP_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SZIP_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SZIP_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_NBIT_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_NBIT_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_NBIT_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SCALEOFFSET_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SCALEOFFSET_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SCALEOFFSET_F"
+ ELSE
+ WRITE(*,'(A)') "UNKNOWN"
ENDIF
!
! Read the data using the default properties.
diff --git a/HDF5Examples/FORTRAN/H5D/h5ex_d_szip.F90 b/HDF5Examples/FORTRAN/H5D/h5ex_d_szip.F90
index f66036e..fdd6ecf 100644
--- a/HDF5Examples/FORTRAN/H5D/h5ex_d_szip.F90
+++ b/HDF5Examples/FORTRAN/H5D/h5ex_d_szip.F90
@@ -119,20 +119,22 @@ PROGRAM main
nelmts = 1
CALL H5Pget_filter_f(dcpl, 0, flags, nelmts, cd_values, INT(MaxChrLen,SIZE_T), name, filter_id, hdferr)
- WRITE(*,'("Filter type is: ")', ADVANCE='NO')
+ WRITE(*,'(A,1X)', ADVANCE='NO') "Filter type is:"
IF(filter_id.EQ.H5Z_FILTER_DEFLATE_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_DEFLATE_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_DEFLATE_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SHUFFLE_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SHUFFLE_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SHUFFLE_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_FLETCHER32_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_FLETCHER32_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_FLETCHER32_F"
ELSE IF(filter_id.EQ.H5Z_FILTER_SZIP_F)THEN
- WRITE(*,'(T2,"H5Z_FILTER_SZIP_F")')
+ WRITE(*,'(A)') "H5Z_FILTER_SZIP_F"
! DEFINED ONLY IN F2003 hdf5 branch
! ELSE IF(filter_id.EQ.H5Z_FILTER_NBIT_F)THEN
-! WRITE(*,'(T2,"H5Z_FILTER_NBIT_F")')
+! WRITE(*,'(" H5Z_FILTER_NBIT_F")')
! ELSE IF(filter_id.EQ.H5Z_FILTER_SCALEOFFSET_F)THEN
-! WRITE(*,'(T2,"H5Z_FILTER_SCALEOFFSET_F")')
+! WRITE(*,'(" H5Z_FILTER_SCALEOFFSET_F")')
+ ELSE
+ WRITE(*,'(A)') "UNKNOWN"
ENDIF
!
! Read the data using the default properties.
diff --git a/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_checksum.tst b/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_checksum.tst
index 01ed866..d2690e3 100644
--- a/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_checksum.tst
+++ b/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_checksum.tst
@@ -1,2 +1,2 @@
-Filter type is: H5Z_FILTER_FLETCHER32_F
+Filter type is: H5Z_FILTER_FLETCHER32_F
Maximum value in DS1 is: 1984
diff --git a/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_gzip.tst b/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_gzip.tst
index 9efcd78a..6fbaba1 100644
--- a/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_gzip.tst
+++ b/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_gzip.tst
@@ -1,2 +1,2 @@
-Filter type is: H5Z_FILTER_DEFLATE_F
+Filter type is: H5Z_FILTER_DEFLATE_F
Maximum value in DS1 is: 1890
diff --git a/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_nbit.tst b/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_nbit.tst
index 90f7a67..49c46ba 100644
--- a/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_nbit.tst
+++ b/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_nbit.tst
@@ -1,2 +1,2 @@
-Filter type is: H5Z_FILTER_NBIT_F
+Filter type is: H5Z_FILTER_NBIT_F
Maximum value in DS1 is: 1890
diff --git a/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_soint.tst b/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_soint.tst
index ddf8b30..d3dad13 100644
--- a/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_soint.tst
+++ b/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_soint.tst
@@ -1,5 +1,5 @@
Maximum value in write buffer is: 1890
Minimum value in write buffer is: -63
-Filter type is: H5Z_FILTER_SCALEOFFSET_F
+Filter type is: H5Z_FILTER_SCALEOFFSET_F
Maximum value in DS1 is: 1890
Minimum value in DS1 is: -63
diff --git a/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_szip.tst b/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_szip.tst
index 8f6ba90..bfd93d4 100644
--- a/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_szip.tst
+++ b/HDF5Examples/FORTRAN/H5D/tfiles/18/h5ex_d_szip.tst
@@ -1,2 +1,2 @@
-Filter type is: H5Z_FILTER_SZIP_F
+Filter type is: H5Z_FILTER_SZIP_F
Maximum value in DS1 is: 1890
diff --git a/HDF5Examples/config/cmake-presets/hidden-presets.json b/HDF5Examples/config/cmake-presets/hidden-presets.json
index 883b903..8b7f71b 100644
--- a/HDF5Examples/config/cmake-presets/hidden-presets.json
+++ b/HDF5Examples/config/cmake-presets/hidden-presets.json
@@ -274,7 +274,7 @@
"execution": {
"noTestsAction": "error",
"timeout": 600,
- "jobs": 8
+ "jobs": 4
}
},
{
diff --git a/HDF5Examples/config/cmake/HDFExampleMacros.cmake b/HDF5Examples/config/cmake/HDFExampleMacros.cmake
index 3bb1d48..245003c 100644
--- a/HDF5Examples/config/cmake/HDFExampleMacros.cmake
+++ b/HDF5Examples/config/cmake/HDFExampleMacros.cmake
@@ -182,21 +182,23 @@ macro (HDF5_SUPPORT)
set (H5EX_HDF5_LINK_LIBS ${H5EX_HDF5_LINK_LIBS} ${HDF5_C_STATIC_LIBRARY})
set_property (TARGET ${HDF5_NAMESPACE}h5dump PROPERTY IMPORTED_LOCATION "${HDF5_TOOLS_DIR}/h5dump")
endif ()
- if (HDF_BUILD_FORTRAN AND ${HDF5_BUILD_FORTRAN})
- if (BUILD_SHARED_LIBS AND HDF5_shared_Fortran_FOUND)
- set (H5EX_HDF5_LINK_LIBS ${H5EX_HDF5_LINK_LIBS} ${HDF5_FORTRAN_SHARED_LIBRARY})
- elseif (HDF5_static_Fortran_FOUND)
- set (H5EX_HDF5_LINK_LIBS ${H5EX_HDF5_LINK_LIBS} ${HDF5_FORTRAN_STATIC_LIBRARY})
- else ()
- set (HDF_BUILD_FORTRAN OFF CACHE BOOL "Build FORTRAN support" FORCE)
- message (STATUS "HDF5 Fortran libs not found - disable build of Fortran examples")
- endif ()
- else ()
+ if (NOT HDF5_static_Fortran_FOUND AND NOT HDF5_shared_Fortran_FOUND)
set (HDF_BUILD_FORTRAN OFF CACHE BOOL "Build FORTRAN support" FORCE)
message (STATUS "HDF5 Fortran libs not found - disable build of Fortran examples")
+ else ()
+ if (HDF_BUILD_FORTRAN AND ${HDF5_BUILD_FORTRAN})
+ if (BUILD_SHARED_LIBS AND HDF5_shared_Fortran_FOUND)
+ set (H5EX_HDF5_LINK_LIBS ${H5EX_HDF5_LINK_LIBS} ${HDF5_FORTRAN_SHARED_LIBRARY})
+ elseif (HDF5_static_Fortran_FOUND)
+ set (H5EX_HDF5_LINK_LIBS ${H5EX_HDF5_LINK_LIBS} ${HDF5_FORTRAN_STATIC_LIBRARY})
+ else ()
+ set (HDF_BUILD_FORTRAN OFF CACHE BOOL "Build FORTRAN support" FORCE)
+ message (STATUS "HDF5 Fortran libs not found - disable build of Fortran examples")
+ endif ()
+ endif ()
endif ()
- if (HDF_BUILD_JAVA)
- if (${HDF5_BUILD_JAVA} AND HDF5_Java_FOUND)
+ if (HDF_BUILD_JAVA AND HDF5_Java_FOUND)
+ if (${HDF5_BUILD_JAVA})
set (CMAKE_JAVA_INCLUDE_PATH "${CMAKE_JAVA_INCLUDE_PATH};${HDF5_JAVA_INCLUDE_DIRS}")
set (H5EX_JAVA_LIBRARY ${HDF5_JAVA_LIBRARY})
set (H5EX_JAVA_LIBRARIES ${HDF5_JAVA_LIBRARY})
diff --git a/HDF5Examples/config/cmake/HDFMacros.cmake b/HDF5Examples/config/cmake/HDFMacros.cmake
index 66a25aa..59efbfb 100644
--- a/HDF5Examples/config/cmake/HDFMacros.cmake
+++ b/HDF5Examples/config/cmake/HDFMacros.cmake
@@ -90,7 +90,7 @@ macro (HDFTEST_COPY_FILE src dest target)
endmacro ()
macro (HDF_DIR_PATHS package_prefix)
- option (H5EX_USE_GNU_DIRS "TRUE to use GNU Coding Standard install directory variables, FALSE to use historical settings" FALSE)
+ option (H5EX_USE_GNU_DIRS "ON to use GNU Coding Standard install directory variables, OFF to use historical settings" OFF)
if (H5EX_USE_GNU_DIRS)
include(GNUInstallDirs)
if (NOT ${package_prefix}_INSTALL_BIN_DIR)
@@ -121,7 +121,7 @@ macro (HDF_DIR_PATHS package_prefix)
endif ()
if (APPLE)
- option (${package_prefix}_BUILD_FRAMEWORKS "TRUE to build as frameworks libraries, FALSE to build according to BUILD_SHARED_LIBS" FALSE)
+ option (${package_prefix}_BUILD_FRAMEWORKS "ON to build as frameworks libraries, OFF to build according to BUILD_SHARED_LIBS" OFF)
endif ()
if (NOT ${package_prefix}_INSTALL_BIN_DIR)
@@ -170,10 +170,10 @@ macro (HDF_DIR_PATHS package_prefix)
message(STATUS "Final: ${${package_prefix}_INSTALL_DOC_DIR}")
# Always use full RPATH, i.e. don't skip the full RPATH for the build tree
- set (CMAKE_SKIP_BUILD_RPATH FALSE)
+ set (CMAKE_SKIP_BUILD_RPATH OFF)
# when building, don't use the install RPATH already
# (but later on when installing)
- set (CMAKE_INSTALL_RPATH_USE_LINK_PATH FALSE)
+ set (CMAKE_INSTALL_RPATH_USE_LINK_PATH OFF)
# add the automatically determined parts of the RPATH
# which point to directories outside the build tree to the install RPATH
set (CMAKE_BUILD_WITH_INSTALL_RPATH ON)
diff --git a/autogen.sh b/autogen.sh
index 74c6c45..142375d 100755
--- a/autogen.sh
+++ b/autogen.sh
@@ -44,7 +44,7 @@
# HDF5_AUTOHEADER
# HDF5_AUTOMAKE
# HDF5_AUTOCONF
-# HDF5_LIBTOOL
+# HDF5_LIBTOOLIZE
# HDF5_M4
#
# Note that aclocal will attempt to include libtool's share/aclocal
@@ -111,16 +111,12 @@ fi
if test -z "${HDF5_ACLOCAL}"; then
HDF5_ACLOCAL="$(command -v aclocal)"
fi
-if test -z "${HDF5_LIBTOOL}"; then
- case "$(uname)" in
- Darwin*)
- # libtool on OS-X is non-gnu
- HDF5_LIBTOOL="$(command -v glibtool)"
- ;;
- *)
- HDF5_LIBTOOL="$(command -v libtool)"
- ;;
- esac
+if test -z "${HDF5_LIBTOOLIZE}"; then
+ # check for glibtoolize first (likely found on macOS); if not found, check for libtoolize
+ HDF5_LIBTOOLIZE="$(command -v glibtoolize)"
+ if [ ! -f "$HDF5_LIBTOOLIZE" ] ; then
+ HDF5_LIBTOOLIZE="$(command -v libtoolize)"
+ fi
fi
if test -z "${HDF5_M4}"; then
HDF5_M4="$(command -v m4)"
@@ -129,26 +125,10 @@ fi
# Make sure that these versions of the autotools are in the path
AUTOCONF_DIR=$(dirname "${HDF5_AUTOCONF}")
-LIBTOOL_DIR=$(dirname "${HDF5_LIBTOOL}")
+LIBTOOL_DIR=$(dirname "${HDF5_LIBTOOLIZE}")
M4_DIR=$(dirname "${HDF5_M4}")
PATH=${AUTOCONF_DIR}:${LIBTOOL_DIR}:${M4_DIR}:$PATH
-# Make libtoolize match the specified libtool
-case "$(uname)" in
-Darwin*)
- # On OS X, libtoolize could be named glibtoolize or
- # libtoolize. Try the former first, then fall back
- # to the latter if it's not found.
- HDF5_LIBTOOLIZE="${LIBTOOL_DIR}/glibtoolize"
- if [ ! -f "$HDF5_LIBTOOLIZE" ] ; then
- HDF5_LIBTOOLIZE="${LIBTOOL_DIR}/libtoolize"
- fi
- ;;
-*)
- HDF5_LIBTOOLIZE="${LIBTOOL_DIR}/libtoolize"
- ;;
-esac
-
# Run scripts that process source.
#
# These should be run before the autotools so that failures here block
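The autogen.sh hunk above replaces the uname-based libtool lookup with a direct probe: try glibtoolize (the usual name for GNU libtoolize on macOS, where the system libtool is non-GNU), then fall back to libtoolize. A minimal standalone sketch of that probe; the `find_libtoolize` wrapper name is ours, not part of the patch:

```shell
#!/bin/sh
# Probe for GNU libtoolize: prefer glibtoolize (common on macOS),
# then fall back to libtoolize, mirroring the patched autogen.sh.
find_libtoolize() {
    lt="$(command -v glibtoolize)"
    if [ ! -f "$lt" ]; then
        lt="$(command -v libtoolize)"
    fi
    printf '%s\n' "$lt"
}

echo "libtoolize found at: $(find_libtoolize)"
```

Note that `command -v` returns an empty string when the tool is absent, so the `[ ! -f "$lt" ]` test doubles as the "not found" check.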
diff --git a/bin/h5cc.in b/bin/h5cc.in
index e3dc988..e4d4368 100644
--- a/bin/h5cc.in
+++ b/bin/h5cc.in
@@ -62,7 +62,7 @@ host_os="@host_os@"
prog_name="`basename $0`"
-allargs=""
+misc_args=""
compile_args=""
libraries=""
link_args=""
@@ -202,7 +202,6 @@ for arg in $@ ; do
case "$arg" in
-c)
- allargs="$allargs $arg"
compile_args="$compile_args $arg"
if test "x$do_link" = "xyes" -a -n "$output_file"; then
@@ -213,7 +212,6 @@ for arg in $@ ; do
dash_c="yes"
;;
-o)
- allargs="$allargs $arg"
dash_o="yes"
if test "x$dash_c" = "xyes"; then
@@ -225,14 +223,12 @@ for arg in $@ ; do
fi
;;
-E|-M|-MT)
- allargs="$allargs $arg"
compile_args="$compile_args $arg"
dash_c="yes"
do_link="no"
;;
-l*)
libraries=" $libraries $arg "
- allargs="$allargs $arg"
;;
-prefix=*)
prefix="`expr "$arg" : '-prefix=\(.*\)'`"
@@ -264,14 +260,14 @@ for arg in $@ ; do
;;
*\"*)
qarg="'"$arg"'"
- allargs="$allargs $qarg"
+ misc_args="$misc_args $qarg"
;;
*\'*)
- qarg='\"'"$arg"'\"'
- allargs="$allargs $qarg"
+ qarg='"'"$arg"'"'
+ misc_args="$misc_args $qarg"
;;
*)
- allargs="$allargs $qarg"
+ misc_args="$misc_args $qarg"
if test -s "$arg"; then
ext=`expr "$arg" : '.*\(\..*\)'`
@@ -313,7 +309,7 @@ if test "x$do_compile" = "xyes"; then
compile_args="-c $compile_args"
fi
- $SHOW $CC -I$includedir $H5BLD_CPPFLAGS $CPPFLAGS $H5BLD_CFLAGS $CFLAGS $compile_args
+ $SHOW $CC -I$includedir $H5BLD_CPPFLAGS $CPPFLAGS $H5BLD_CFLAGS $CFLAGS $misc_args $compile_args
status=$?
if test "$status" != "0"; then
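The h5cc.in hunk above renames the accumulator to misc_args, actually passes it to the compile line, and fixes how arguments that themselves contain quotes are re-quoted: an argument with an embedded double quote is wrapped in single quotes, and one with an embedded single quote is wrapped in double quotes (instead of the previous broken backslash-escaping). A standalone sketch of that quoting rule; `quote_arg` is a hypothetical helper, not a function in the wrapper:

```shell
#!/bin/sh
# Re-quote an argument for a flat command string, as the patched h5cc
# does: wrap args containing a double quote in single quotes, and args
# containing a single quote in double quotes.
quote_arg() {
    arg="$1"
    case "$arg" in
        *\"*) qarg="'$arg'" ;;     # e.g. -DDBL_QUOTE="HDF"
        *\'*) qarg="\"$arg\"" ;;   # e.g. -DSGL_QUOTE='H'
        *)    qarg="$arg" ;;
    esac
    printf '%s\n' "$qarg"
}

quote_arg '-DDBL_QUOTE="HDF"'   # -> '-DDBL_QUOTE="HDF"'
quote_arg "-DSGL_QUOTE='H'"     # -> "-DSGL_QUOTE='H'"
```

This is exactly the kind of argument exercised by the new testh5c++.sh test (`-DSGL_QUOTE=\'H\' -DDBL_QUOTE=\"HDF\"`).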
diff --git a/bin/release b/bin/release
index 26a756c..5eb8b8c 100755
--- a/bin/release
+++ b/bin/release
@@ -466,7 +466,7 @@ done
# Create the tar file
test "$verbose" && echo " Running tar..." 1>&2
-(cd "$tmpdir" && exec tar -ch --exclude-vcs -f "$HDF5_VERS.tar" "./$HDF5_IN_VERS" || exit 1 )
+(cd "$tmpdir" && exec tar -ch --exclude-vcs --exclude=*autom4te* --exclude=.clang-format --exclude=.codespellrc --exclude=.devcontainer --exclude=.git* --exclude=.h5chkright.ini -f "$HDF5_VERS.tar" "./$HDF5_IN_VERS" || exit 1 )
# Compress
#SHA256=$HDF5_VERS.sha256
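The bin/release hunk above extends the tar invocation to drop VCS metadata and tool configuration (autom4te caches, .clang-format, .codespellrc, .devcontainer, .git*, .h5chkright.ini) from the release tarball. A small self-contained demonstration of the same --exclude patterns on a throwaway tree (the directory and file names here are made up for the demo):

```shell
#!/bin/sh
# Build a demo tree, then tar it with the release script's exclusion
# patterns; only the real source file should land in the archive.
set -e
src=demo_tree
mkdir -p "$src/.git" "$src/autom4te.cache"
touch "$src/code.c" "$src/.clang-format" "$src/.codespellrc"
tar -cf demo.tar \
    --exclude='*autom4te*' --exclude='.clang-format' \
    --exclude='.codespellrc' --exclude='.git*' \
    "$src"
tar -tf demo.tar    # lists demo_tree/ and demo_tree/code.c only
rm -rf "$src" demo.tar
```

With GNU tar, unanchored --exclude patterns match against any path component, so `.git*` excludes the `.git` directory at any depth in the tree.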
diff --git a/c++/examples/testh5c++.sh.in b/c++/examples/testh5c++.sh.in
index 84c7752..f3a973c 100644
--- a/c++/examples/testh5c++.sh.in
+++ b/c++/examples/testh5c++.sh.in
@@ -47,6 +47,8 @@ prog1_o=${H5TOOL}_prog1.o
prog2=${H5TOOL}_prog2.$suffix
prog2_o=${H5TOOL}_prog2.o
applib=libapp${H5TOOL}.a
+args=${H5TOOL}_args.$suffix
+args_o=${H5TOOL}_args.o
# short hands
# Caution: if some *.h5 files must be cleaned here, list them by names.
@@ -134,16 +136,38 @@ int main (void)
}
EOF
+# Generate args:
+# An application main that tests misc command-line arguments being passed.
+cat > $args <<EOF
+#include <string>
+#include <iostream>
+#include "H5Cpp.h"
+#ifndef H5_NO_NAMESPACE
+using namespace H5;
+#endif
+
+const H5std_string FILE_NAME( "args.h5" );
+
+int main (void)
+{
+ char c = SGL_QUOTE; // 'H'
+ char *s = DBL_QUOTE; // "HDF"
+ int val = MISC; // 42
+
+ H5File file( FILE_NAME, H5F_ACC_TRUNC );
+ return 0;
+}
+EOF
# Parse option
# None
-# Print a line-line message left justified in a field of 70 characters
+# Print a one-line message left justified in a field of 74 characters
# beginning with the word "Testing".
#
TESTING() {
SPACES=" "
- echo "Testing $* $SPACES" | cut -c1-70 | tr -d '\012'
+ echo "Testing $* $SPACES" | cut -c1-74 | tr -d '\012'
}
@@ -231,6 +255,10 @@ echo "***"Just preprocess, no compile, no link.
TOOLTEST -E $hdf5main
TOOLTEST -E $appmain $prog1 $prog2
+# HDF5 program that depends on input args.
+echo "***"Simple Compile and Link in one step with user-supplied arguments.
+TOOLTEST -DSGL_QUOTE=\'H\' -DDBL_QUOTE=\"HDF\" -DMISC=42 $args
+
##############################################################################
# END
##############################################################################
diff --git a/c++/src/H5AbstractDs.cpp b/c++/src/H5AbstractDs.cpp
index 7ea107c..ab3faba 100644
--- a/c++/src/H5AbstractDs.cpp
+++ b/c++/src/H5AbstractDs.cpp
@@ -316,12 +316,4 @@ AbstractDs::getVarLenType() const
}
}
-//--------------------------------------------------------------------------
-// Function: AbstractDs destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-AbstractDs::~AbstractDs()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5AbstractDs.h b/c++/src/H5AbstractDs.h
index 49bcfeb..d831bd6 100644
--- a/c++/src/H5AbstractDs.h
+++ b/c++/src/H5AbstractDs.h
@@ -67,7 +67,7 @@ class H5_DLLCPP AbstractDs {
virtual H5std_string fromClass() const = 0;
// Destructor
- virtual ~AbstractDs();
+ virtual ~AbstractDs() = default;
protected:
// Default constructor
diff --git a/c++/src/H5ArrayType.cpp b/c++/src/H5ArrayType.cpp
index c5ad7ea..3a2da2b 100644
--- a/c++/src/H5ArrayType.cpp
+++ b/c++/src/H5ArrayType.cpp
@@ -199,12 +199,4 @@ ArrayType::getArrayDims(hsize_t *dims) const
return (ndims);
}
-//--------------------------------------------------------------------------
-// Function: ArrayType destructor
-///\brief Properly terminates access to this array datatype.
-//--------------------------------------------------------------------------
-ArrayType::~ArrayType()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5ArrayType.h b/c++/src/H5ArrayType.h
index e9afb9c..e02a3a2 100644
--- a/c++/src/H5ArrayType.h
+++ b/c++/src/H5ArrayType.h
@@ -60,7 +60,7 @@ class H5_DLLCPP ArrayType : public DataType {
ArrayType(const hid_t existing_id);
// Noop destructor
- virtual ~ArrayType() override;
+ virtual ~ArrayType() override = default;
// Default constructor
ArrayType();
diff --git a/c++/src/H5AtomType.cpp b/c++/src/H5AtomType.cpp
index db6c8f8..f2e037a 100644
--- a/c++/src/H5AtomType.cpp
+++ b/c++/src/H5AtomType.cpp
@@ -276,14 +276,4 @@ AtomType::setPad(H5T_pad_t lsb, H5T_pad_t msb) const
}
}
-#ifndef DOXYGEN_SHOULD_SKIP_THIS
-//--------------------------------------------------------------------------
-// Function: AtomType destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-AtomType::~AtomType()
-{
-}
-#endif // DOXYGEN_SHOULD_SKIP_THIS
-
} // namespace H5
diff --git a/c++/src/H5AtomType.h b/c++/src/H5AtomType.h
index bb2cf48..327604d 100644
--- a/c++/src/H5AtomType.h
+++ b/c++/src/H5AtomType.h
@@ -67,7 +67,7 @@ class H5_DLLCPP AtomType : public DataType {
AtomType(const AtomType &original);
// Noop destructor
- virtual ~AtomType() override;
+ virtual ~AtomType() override = default;
#endif // DOXYGEN_SHOULD_SKIP_THIS
protected:
diff --git a/c++/src/H5CommonFG.cpp b/c++/src/H5CommonFG.cpp
index 277ba34..d26c83f 100644
--- a/c++/src/H5CommonFG.cpp
+++ b/c++/src/H5CommonFG.cpp
@@ -343,14 +343,6 @@ CommonFG::CommonFG()
}
//--------------------------------------------------------------------------
-// Function: CommonFG destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-CommonFG::~CommonFG()
-{
-}
-
-//--------------------------------------------------------------------------
// Function: f_DataType_setId - friend
// Purpose: This function is friend to class H5::DataType so that it
// can set DataType::id in order to work around a problem
diff --git a/c++/src/H5CommonFG.h b/c++/src/H5CommonFG.h
index e675617..6eb01db 100644
--- a/c++/src/H5CommonFG.h
+++ b/c++/src/H5CommonFG.h
@@ -72,7 +72,7 @@ class H5_DLLCPP CommonFG {
CommonFG();
// Noop destructor.
- virtual ~CommonFG();
+ virtual ~CommonFG() = default;
protected:
virtual void p_setId(const hid_t new_id) = 0;
diff --git a/c++/src/H5CompType.cpp b/c++/src/H5CompType.cpp
index c89fa5c..70bbe66 100644
--- a/c++/src/H5CompType.cpp
+++ b/c++/src/H5CompType.cpp
@@ -527,12 +527,4 @@ CompType::setSize(size_t size) const
}
}
-//--------------------------------------------------------------------------
-// Function: CompType destructor
-///\brief Properly terminates access to this compound datatype.
-//--------------------------------------------------------------------------
-CompType::~CompType()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5CompType.h b/c++/src/H5CompType.h
index 0675d20..a642b0d 100644
--- a/c++/src/H5CompType.h
+++ b/c++/src/H5CompType.h
@@ -113,7 +113,7 @@ class H5_DLLCPP CompType : public DataType {
}
// Noop destructor.
- virtual ~CompType() override;
+ virtual ~CompType() override = default;
private:
// Contains common code that is used by the member functions
diff --git a/c++/src/H5DaccProp.cpp b/c++/src/H5DaccProp.cpp
index 8b01665..ba4c8ef 100644
--- a/c++/src/H5DaccProp.cpp
+++ b/c++/src/H5DaccProp.cpp
@@ -153,12 +153,4 @@ DSetAccPropList::getChunkCache(size_t &rdcc_nslots, size_t &rdcc_nbytes, double
}
}
-//--------------------------------------------------------------------------
-// Function: DSetAccPropList destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-DSetAccPropList::~DSetAccPropList()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5DaccProp.h b/c++/src/H5DaccProp.h
index bb404ce..3f101f3 100644
--- a/c++/src/H5DaccProp.h
+++ b/c++/src/H5DaccProp.h
@@ -50,7 +50,7 @@ class H5_DLLCPP DSetAccPropList : public LinkAccPropList {
DSetAccPropList(const hid_t plist_id);
// Noop destructor.
- virtual ~DSetAccPropList() override;
+ virtual ~DSetAccPropList() override = default;
#ifndef DOXYGEN_SHOULD_SKIP_THIS
diff --git a/c++/src/H5DcreatProp.cpp b/c++/src/H5DcreatProp.cpp
index 8b199a8..8f4ee7e 100644
--- a/c++/src/H5DcreatProp.cpp
+++ b/c++/src/H5DcreatProp.cpp
@@ -754,12 +754,4 @@ DSetCreatPropList::setVirtual(const DataSpace &vspace, const H5std_string src_fn
setVirtual(vspace, src_fname.c_str(), src_dsname.c_str(), sspace);
}
-//--------------------------------------------------------------------------
-// Function: DSetCreatPropList destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-DSetCreatPropList::~DSetCreatPropList()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5DcreatProp.h b/c++/src/H5DcreatProp.h
index 94ecbb5..3c8587d 100644
--- a/c++/src/H5DcreatProp.h
+++ b/c++/src/H5DcreatProp.h
@@ -140,7 +140,7 @@ class H5_DLLCPP DSetCreatPropList : public ObjCreatPropList {
DSetCreatPropList(const hid_t plist_id);
// Noop destructor.
- virtual ~DSetCreatPropList() override;
+ virtual ~DSetCreatPropList() override = default;
#ifndef DOXYGEN_SHOULD_SKIP_THIS
diff --git a/c++/src/H5DxferProp.cpp b/c++/src/H5DxferProp.cpp
index 33e2ff5..1b9d651 100644
--- a/c++/src/H5DxferProp.cpp
+++ b/c++/src/H5DxferProp.cpp
@@ -527,12 +527,4 @@ DSetMemXferPropList::getEDCCheck() const
return (check);
}
-//--------------------------------------------------------------------------
-// Function: DSetMemXferPropList destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-DSetMemXferPropList::~DSetMemXferPropList()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5DxferProp.h b/c++/src/H5DxferProp.h
index d0a65c1..b86202d 100644
--- a/c++/src/H5DxferProp.h
+++ b/c++/src/H5DxferProp.h
@@ -112,7 +112,7 @@ class H5_DLLCPP DSetMemXferPropList : public PropList {
DSetMemXferPropList(const hid_t plist_id);
// Noop destructor
- virtual ~DSetMemXferPropList() override;
+ virtual ~DSetMemXferPropList() override = default;
#ifndef DOXYGEN_SHOULD_SKIP_THIS
diff --git a/c++/src/H5EnumType.cpp b/c++/src/H5EnumType.cpp
index 03c07d9..569e56a 100644
--- a/c++/src/H5EnumType.cpp
+++ b/c++/src/H5EnumType.cpp
@@ -317,12 +317,4 @@ EnumType::getMemberValue(unsigned memb_no, void *value) const
}
}
-//--------------------------------------------------------------------------
-// Function: EnumType destructor
-///\brief Properly terminates access to this enum datatype.
-//--------------------------------------------------------------------------
-EnumType::~EnumType()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5EnumType.h b/c++/src/H5EnumType.h
index a98688e..a5096fc 100644
--- a/c++/src/H5EnumType.h
+++ b/c++/src/H5EnumType.h
@@ -81,7 +81,7 @@ class H5_DLLCPP EnumType : public DataType {
// Copy constructor: same as the original EnumType.
EnumType(const EnumType &original);
- virtual ~EnumType() override;
+ virtual ~EnumType() override = default;
}; // end of EnumType
} // namespace H5
diff --git a/c++/src/H5Exception.cpp b/c++/src/H5Exception.cpp
index 01b9cc1..906bd27 100644
--- a/c++/src/H5Exception.cpp
+++ b/c++/src/H5Exception.cpp
@@ -331,14 +331,6 @@ Exception::printErrorStack(FILE *stream, hid_t err_stack)
//}
//--------------------------------------------------------------------------
-// Function: Exception destructor
-///\brief Noop destructor
-//--------------------------------------------------------------------------
-Exception::~Exception() throw()
-{
-}
-
-//--------------------------------------------------------------------------
// Subclass: FileIException
//--------------------------------------------------------------------------
//--------------------------------------------------------------------------
@@ -359,13 +351,6 @@ FileIException::FileIException(const H5std_string &func, const H5std_string &mes
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: FileIException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-FileIException::~FileIException() throw()
-{
-}
//--------------------------------------------------------------------------
// Subclass: GroupIException
@@ -389,14 +374,6 @@ GroupIException::GroupIException(const H5std_string &func, const H5std_string &m
{
}
//--------------------------------------------------------------------------
-// Function: GroupIException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-GroupIException::~GroupIException() throw()
-{
-}
-
-//--------------------------------------------------------------------------
// Subclass: DataSpaceIException
//--------------------------------------------------------------------------
//--------------------------------------------------------------------------
@@ -417,13 +394,6 @@ DataSpaceIException::DataSpaceIException(const H5std_string &func, const H5std_s
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: DataSpaceIException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-DataSpaceIException::~DataSpaceIException() throw()
-{
-}
//--------------------------------------------------------------------------
// Subclass: DataTypeIException
@@ -446,13 +416,6 @@ DataTypeIException::DataTypeIException(const H5std_string &func, const H5std_str
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: DataTypeIException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-DataTypeIException::~DataTypeIException() throw()
-{
-}
//--------------------------------------------------------------------------
// Subclass: ObjHeaderIException
@@ -475,13 +438,6 @@ ObjHeaderIException::ObjHeaderIException(const H5std_string &func, const H5std_s
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: ObjHeaderIException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-ObjHeaderIException::~ObjHeaderIException() throw()
-{
-}
//--------------------------------------------------------------------------
// Subclass: PropListIException
@@ -504,13 +460,6 @@ PropListIException::PropListIException(const H5std_string &func, const H5std_str
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: PropListIException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-PropListIException::~PropListIException() throw()
-{
-}
//--------------------------------------------------------------------------
// Subclass: DataSetIException
@@ -533,13 +482,6 @@ DataSetIException::DataSetIException(const H5std_string &func, const H5std_strin
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: DataSetIException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-DataSetIException::~DataSetIException() throw()
-{
-}
//--------------------------------------------------------------------------
// Subclass: AttributeIException
@@ -562,13 +504,6 @@ AttributeIException::AttributeIException(const H5std_string &func, const H5std_s
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: AttributeIException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-AttributeIException::~AttributeIException() throw()
-{
-}
//--------------------------------------------------------------------------
// Subclass: ReferenceException
@@ -591,13 +526,6 @@ ReferenceException::ReferenceException(const H5std_string &func, const H5std_str
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: ReferenceException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-ReferenceException::~ReferenceException() throw()
-{
-}
//--------------------------------------------------------------------------
// Subclass: LibraryIException
@@ -620,13 +548,6 @@ LibraryIException::LibraryIException(const H5std_string &func, const H5std_strin
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: LibraryIException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-LibraryIException::~LibraryIException() throw()
-{
-}
//--------------------------------------------------------------------------
// Subclass: LocationException
@@ -649,13 +570,6 @@ LocationException::LocationException(const H5std_string &func, const H5std_strin
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: LocationException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-LocationException::~LocationException() throw()
-{
-}
//--------------------------------------------------------------------------
// Subclass: IdComponentException
@@ -678,12 +592,5 @@ IdComponentException::IdComponentException(const H5std_string &func, const H5std
: Exception(func, message)
{
}
-//--------------------------------------------------------------------------
-// Function: IdComponentException destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-IdComponentException::~IdComponentException() throw()
-{
-}
} // namespace H5
diff --git a/c++/src/H5Exception.h b/c++/src/H5Exception.h
index d4533e5..6bf51ef 100644
--- a/c++/src/H5Exception.h
+++ b/c++/src/H5Exception.h
@@ -74,7 +74,7 @@ class H5_DLLCPP Exception {
Exception(const Exception &orig);
// virtual Destructor
- virtual ~Exception() throw();
+ virtual ~Exception() = default;
protected:
// Default value for detail_message
@@ -89,84 +89,84 @@ class H5_DLLCPP FileIException : public Exception {
public:
FileIException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
FileIException();
- virtual ~FileIException() throw() override;
+ virtual ~FileIException() override = default;
};
class H5_DLLCPP GroupIException : public Exception {
public:
GroupIException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
GroupIException();
- virtual ~GroupIException() throw() override;
+ virtual ~GroupIException() override = default;
};
class H5_DLLCPP DataSpaceIException : public Exception {
public:
DataSpaceIException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
DataSpaceIException();
- virtual ~DataSpaceIException() throw() override;
+ virtual ~DataSpaceIException() override = default;
};
class H5_DLLCPP DataTypeIException : public Exception {
public:
DataTypeIException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
DataTypeIException();
- virtual ~DataTypeIException() throw() override;
+ virtual ~DataTypeIException() override = default;
};
class H5_DLLCPP ObjHeaderIException : public Exception {
public:
ObjHeaderIException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
ObjHeaderIException();
- virtual ~ObjHeaderIException() throw() override;
+ virtual ~ObjHeaderIException() override = default;
};
class H5_DLLCPP PropListIException : public Exception {
public:
PropListIException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
PropListIException();
- virtual ~PropListIException() throw() override;
+ virtual ~PropListIException() override = default;
};
class H5_DLLCPP DataSetIException : public Exception {
public:
DataSetIException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
DataSetIException();
- virtual ~DataSetIException() throw() override;
+ virtual ~DataSetIException() override = default;
};
class H5_DLLCPP AttributeIException : public Exception {
public:
AttributeIException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
AttributeIException();
- virtual ~AttributeIException() throw() override;
+ virtual ~AttributeIException() override = default;
};
class H5_DLLCPP ReferenceException : public Exception {
public:
ReferenceException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
ReferenceException();
- virtual ~ReferenceException() throw() override;
+ virtual ~ReferenceException() override = default;
};
class H5_DLLCPP LibraryIException : public Exception {
public:
LibraryIException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
LibraryIException();
- virtual ~LibraryIException() throw() override;
+ virtual ~LibraryIException() override = default;
};
class H5_DLLCPP LocationException : public Exception {
public:
LocationException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
LocationException();
- virtual ~LocationException() throw() override;
+ virtual ~LocationException() override = default;
};
class H5_DLLCPP IdComponentException : public Exception {
public:
IdComponentException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
IdComponentException();
- virtual ~IdComponentException() throw() override;
+ virtual ~IdComponentException() override = default;
}; // end of IdComponentException
} // namespace H5
diff --git a/c++/src/H5FaccProp.cpp b/c++/src/H5FaccProp.cpp
index ea5692a..dc4b949 100644
--- a/c++/src/H5FaccProp.cpp
+++ b/c++/src/H5FaccProp.cpp
@@ -769,12 +769,4 @@ FileAccPropList::getLibverBounds(H5F_libver_t &libver_low, H5F_libver_t &libver_
}
}
-//--------------------------------------------------------------------------
-// Function: FileAccPropList destructor
-///\brief Noop destructor
-//--------------------------------------------------------------------------
-FileAccPropList::~FileAccPropList()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5FaccProp.h b/c++/src/H5FaccProp.h
index 27028a2..67394f1 100644
--- a/c++/src/H5FaccProp.h
+++ b/c++/src/H5FaccProp.h
@@ -149,7 +149,7 @@ class H5_DLLCPP FileAccPropList : public PropList {
FileAccPropList(const hid_t plist_id);
// Noop destructor
- virtual ~FileAccPropList() override;
+ virtual ~FileAccPropList() override = default;
#ifndef DOXYGEN_SHOULD_SKIP_THIS
diff --git a/c++/src/H5FcreatProp.cpp b/c++/src/H5FcreatProp.cpp
index fe46dee..66e4ceb 100644
--- a/c++/src/H5FcreatProp.cpp
+++ b/c++/src/H5FcreatProp.cpp
@@ -357,12 +357,4 @@ FileCreatPropList::getFileSpacePagesize() const
return (fsp_psize);
}
-//--------------------------------------------------------------------------
-// Function: FileCreatPropList destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-FileCreatPropList::~FileCreatPropList()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5FcreatProp.h b/c++/src/H5FcreatProp.h
index 399db71..76c2ae5 100644
--- a/c++/src/H5FcreatProp.h
+++ b/c++/src/H5FcreatProp.h
@@ -90,7 +90,7 @@ class H5_DLLCPP FileCreatPropList : public PropList {
FileCreatPropList(const hid_t plist_id);
// Noop destructor
- virtual ~FileCreatPropList() override;
+ virtual ~FileCreatPropList() override = default;
#ifndef DOXYGEN_SHOULD_SKIP_THIS
diff --git a/c++/src/H5FloatType.cpp b/c++/src/H5FloatType.cpp
index 74170da..41ee8a8 100644
--- a/c++/src/H5FloatType.cpp
+++ b/c++/src/H5FloatType.cpp
@@ -326,12 +326,4 @@ FloatType::setInpad(H5T_pad_t inpad) const
}
}
-//--------------------------------------------------------------------------
-// Function: FloatType destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-FloatType::~FloatType()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5FloatType.h b/c++/src/H5FloatType.h
index 42437ec..b804892 100644
--- a/c++/src/H5FloatType.h
+++ b/c++/src/H5FloatType.h
@@ -78,7 +78,7 @@ class H5_DLLCPP FloatType : public AtomType {
FloatType(const FloatType &original);
// Noop destructor.
- virtual ~FloatType() override;
+ virtual ~FloatType() override = default;
}; // end of FloatType
} // namespace H5
diff --git a/c++/src/H5IdComponent.cpp b/c++/src/H5IdComponent.cpp
index 93df343..0d15aac 100644
--- a/c++/src/H5IdComponent.cpp
+++ b/c++/src/H5IdComponent.cpp
@@ -307,14 +307,6 @@ IdComponent::setId(const hid_t new_id)
incRefCount();
}
-//--------------------------------------------------------------------------
-// Function: IdComponent destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-IdComponent::~IdComponent()
-{
-}
-
//
// Implementation of protected functions for HDF5 Reference Interface
// and miscellaneous helpers.
diff --git a/c++/src/H5IdComponent.h b/c++/src/H5IdComponent.h
index d64bdb5..2fef96f 100644
--- a/c++/src/H5IdComponent.h
+++ b/c++/src/H5IdComponent.h
@@ -81,7 +81,7 @@ class H5_DLLCPP IdComponent {
#endif // DOXYGEN_SHOULD_SKIP_THIS
// Destructor
- virtual ~IdComponent();
+ virtual ~IdComponent() = default;
#ifndef DOXYGEN_SHOULD_SKIP_THIS
diff --git a/c++/src/H5IntType.cpp b/c++/src/H5IntType.cpp
index 87a287f..7c8b7d3 100644
--- a/c++/src/H5IntType.cpp
+++ b/c++/src/H5IntType.cpp
@@ -182,12 +182,4 @@ IntType::setSign(H5T_sign_t sign) const
}
}
-//--------------------------------------------------------------------------
-// Function: IntType destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-IntType::~IntType()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5IntType.h b/c++/src/H5IntType.h
index 170ed37..1ca0ab1 100644
--- a/c++/src/H5IntType.h
+++ b/c++/src/H5IntType.h
@@ -60,7 +60,7 @@ class H5_DLLCPP IntType : public AtomType {
IntType(const IntType &original);
// Noop destructor.
- virtual ~IntType() override;
+ virtual ~IntType() override = default;
}; // end of IntType
} // namespace H5
diff --git a/c++/src/H5LaccProp.cpp b/c++/src/H5LaccProp.cpp
index 0285ee7..7a66c13 100644
--- a/c++/src/H5LaccProp.cpp
+++ b/c++/src/H5LaccProp.cpp
@@ -140,12 +140,4 @@ LinkAccPropList::getNumLinks() const
return (nlinks);
}
-//--------------------------------------------------------------------------
-// Function: LinkAccPropList destructor
-///\brief Noop destructor
-//--------------------------------------------------------------------------
-LinkAccPropList::~LinkAccPropList()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5LaccProp.h b/c++/src/H5LaccProp.h
index 53389e2..6f4b919 100644
--- a/c++/src/H5LaccProp.h
+++ b/c++/src/H5LaccProp.h
@@ -51,7 +51,7 @@ class H5_DLLCPP LinkAccPropList : public PropList {
size_t getNumLinks() const;
// Noop destructor
- virtual ~LinkAccPropList() override;
+ virtual ~LinkAccPropList() override = default;
#ifndef DOXYGEN_SHOULD_SKIP_THIS
diff --git a/c++/src/H5LcreatProp.cpp b/c++/src/H5LcreatProp.cpp
index 2f34375..0dbb0b2 100644
--- a/c++/src/H5LcreatProp.cpp
+++ b/c++/src/H5LcreatProp.cpp
@@ -184,13 +184,4 @@ LinkCreatPropList::getCharEncoding() const
return (encoding);
}
-//--------------------------------------------------------------------------
-// Function: LinkCreatPropList destructor
-///\brief Noop destructor
-// December, 2016
-//--------------------------------------------------------------------------
-LinkCreatPropList::~LinkCreatPropList()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5LcreatProp.h b/c++/src/H5LcreatProp.h
index 233a98b..272f260 100644
--- a/c++/src/H5LcreatProp.h
+++ b/c++/src/H5LcreatProp.h
@@ -58,7 +58,7 @@ class H5_DLLCPP LinkCreatPropList : public PropList {
H5T_cset_t getCharEncoding() const;
// Noop destructor
- virtual ~LinkCreatPropList() override;
+ virtual ~LinkCreatPropList() override = default;
#ifndef DOXYGEN_SHOULD_SKIP_THIS
diff --git a/c++/src/H5Library.cpp b/c++/src/H5Library.cpp
index 019ae67..c16bd81 100644
--- a/c++/src/H5Library.cpp
+++ b/c++/src/H5Library.cpp
@@ -273,14 +273,6 @@ H5Library::setFreeListLimits(int reg_global_lim, int reg_list_lim, int arr_globa
H5Library::H5Library()
{
}
-
-//--------------------------------------------------------------------------
-// Function: H5Library destructor
-///\brief Noop destructor
-//--------------------------------------------------------------------------
-H5Library::~H5Library()
-{
-}
#endif // DOXYGEN_SHOULD_SKIP_THIS
} // namespace H5
diff --git a/c++/src/H5Library.h b/c++/src/H5Library.h
index 3770639..10e5536 100644
--- a/c++/src/H5Library.h
+++ b/c++/src/H5Library.h
@@ -62,7 +62,7 @@ class H5_DLLCPP H5Library {
H5Library();
// Destructor
- ~H5Library();
+ ~H5Library() = default;
#endif // DOXYGEN_SHOULD_SKIP_THIS
}; // end of H5Library
diff --git a/c++/src/H5Location.cpp b/c++/src/H5Location.cpp
index 8befefc..87eac67 100644
--- a/c++/src/H5Location.cpp
+++ b/c++/src/H5Location.cpp
@@ -2367,12 +2367,4 @@ f_DataSpace_setId(DataSpace *dspace, hid_t new_id)
dspace->p_setId(new_id);
}
-//--------------------------------------------------------------------------
-// Function: H5Location destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-H5Location::~H5Location()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5Location.h b/c++/src/H5Location.h
index ee45d67..f10a005 100644
--- a/c++/src/H5Location.h
+++ b/c++/src/H5Location.h
@@ -333,7 +333,7 @@ class H5_DLLCPP H5Location : public IdComponent {
#endif // DOXYGEN_SHOULD_SKIP_THIS
// Noop destructor.
- virtual ~H5Location() override;
+ virtual ~H5Location() override = default;
}; // end of H5Location
} // namespace H5
diff --git a/c++/src/H5Object.cpp b/c++/src/H5Object.cpp
index 2b898e7..5411437 100644
--- a/c++/src/H5Object.cpp
+++ b/c++/src/H5Object.cpp
@@ -541,14 +541,4 @@ H5Object::getObjName(H5std_string &obj_name, size_t len) const
return name_size;
}
-#ifndef DOXYGEN_SHOULD_SKIP_THIS
-//--------------------------------------------------------------------------
-// Function: H5Object destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-H5Object::~H5Object()
-{
-}
-#endif // DOXYGEN_SHOULD_SKIP_THIS
-
} // namespace H5
diff --git a/c++/src/H5Object.h b/c++/src/H5Object.h
index 1e93c0c..b290584 100644
--- a/c++/src/H5Object.h
+++ b/c++/src/H5Object.h
@@ -124,7 +124,7 @@ class H5_DLLCPP H5Object : public H5Location {
virtual void p_setId(const hid_t new_id) override = 0;
// Noop destructor.
- virtual ~H5Object() override;
+ virtual ~H5Object() override = default;
#endif // DOXYGEN_SHOULD_SKIP_THIS
diff --git a/c++/src/H5OcreatProp.cpp b/c++/src/H5OcreatProp.cpp
index 54808cb..0f1da1c 100644
--- a/c++/src/H5OcreatProp.cpp
+++ b/c++/src/H5OcreatProp.cpp
@@ -199,12 +199,4 @@ ObjCreatPropList::getAttrCrtOrder() const
return (crt_order_flags);
}
-//--------------------------------------------------------------------------
-// Function: ObjCreatPropList destructor
-///\brief Noop destructor
-//--------------------------------------------------------------------------
-ObjCreatPropList::~ObjCreatPropList()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5OcreatProp.h b/c++/src/H5OcreatProp.h
index 6d752ed..fbe3991 100644
--- a/c++/src/H5OcreatProp.h
+++ b/c++/src/H5OcreatProp.h
@@ -56,7 +56,7 @@ class H5_DLLCPP ObjCreatPropList : public PropList {
ObjCreatPropList(const hid_t plist_id);
// Noop destructor
- virtual ~ObjCreatPropList() override;
+ virtual ~ObjCreatPropList() override = default;
#ifndef DOXYGEN_SHOULD_SKIP_THIS
diff --git a/c++/src/H5PredType.cpp b/c++/src/H5PredType.cpp
index 6aa5b17..0338a01 100644
--- a/c++/src/H5PredType.cpp
+++ b/c++/src/H5PredType.cpp
@@ -108,15 +108,6 @@ PredType::committed()
}
#endif // DOXYGEN_SHOULD_SKIP_THIS
-// Default destructor
-//--------------------------------------------------------------------------
-// Function: PredType destructor
-///\brief Noop destructor.
-//--------------------------------------------------------------------------
-PredType::~PredType()
-{
-}
-
/*****************************************************************************
The following section is regarding the global constants PredType,
DataSpace, and PropList.
diff --git a/c++/src/H5PredType.h b/c++/src/H5PredType.h
index 1e305fc..85b6e96 100644
--- a/c++/src/H5PredType.h
+++ b/c++/src/H5PredType.h
@@ -41,7 +41,7 @@ class H5_DLLCPP PredType : public AtomType {
PredType(const PredType &original);
// Noop destructor
- virtual ~PredType() override;
+ virtual ~PredType() override = default;
/*! \brief This dummy function does not inherit from DataType - it will
throw a DataTypeIException if invoked.
diff --git a/c++/src/H5StrType.cpp b/c++/src/H5StrType.cpp
index 2c47809..1b45814 100644
--- a/c++/src/H5StrType.cpp
+++ b/c++/src/H5StrType.cpp
@@ -290,12 +290,4 @@ StrType::setStrpad(H5T_str_t strpad) const
}
}
-//--------------------------------------------------------------------------
-// Function: StrType destructor
-///\brief Properly terminates access to this string datatype.
-//--------------------------------------------------------------------------
-StrType::~StrType()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5StrType.h b/c++/src/H5StrType.h
index ccae3e7..0f51e75 100644
--- a/c++/src/H5StrType.h
+++ b/c++/src/H5StrType.h
@@ -72,7 +72,7 @@ class H5_DLLCPP StrType : public AtomType {
StrType(const StrType &original);
// Noop destructor.
- virtual ~StrType() override;
+ virtual ~StrType() override = default;
}; // end of StrType
} // namespace H5
diff --git a/c++/src/H5VarLenType.cpp b/c++/src/H5VarLenType.cpp
index e8b7cbb..49f2cbd 100644
--- a/c++/src/H5VarLenType.cpp
+++ b/c++/src/H5VarLenType.cpp
@@ -146,12 +146,4 @@ VarLenType::decode() const
return (encoded_vltype);
}
-//--------------------------------------------------------------------------
-// Function: VarLenType destructor
-///\brief Properly terminates access to this datatype.
-//--------------------------------------------------------------------------
-VarLenType::~VarLenType()
-{
-}
-
} // namespace H5
diff --git a/c++/src/H5VarLenType.h b/c++/src/H5VarLenType.h
index 318681a..d7f0ff1 100644
--- a/c++/src/H5VarLenType.h
+++ b/c++/src/H5VarLenType.h
@@ -52,7 +52,7 @@ class H5_DLLCPP VarLenType : public DataType {
VarLenType(const H5Location &loc, const H5std_string &name);
// Noop destructor
- virtual ~VarLenType() override;
+ virtual ~VarLenType() override = default;
// Default constructor
VarLenType();
diff --git a/c++/src/h5c++.in b/c++/src/h5c++.in
index 078fa73..e666ba9 100644
--- a/c++/src/h5c++.in
+++ b/c++/src/h5c++.in
@@ -60,7 +60,7 @@ host_os="@host_os@"
prog_name="`basename $0`"
-allargs=""
+misc_args=""
compile_args=""
libraries=""
link_args=""
@@ -198,7 +198,6 @@ for arg in $@ ; do
case "$arg" in
-c)
- allargs="$allargs $arg"
compile_args="$compile_args $arg"
if test "x$do_link" = "xyes" -a -n "$output_file"; then
@@ -209,7 +208,6 @@ for arg in $@ ; do
dash_c="yes"
;;
-o)
- allargs="$allargs $arg"
dash_o="yes"
if test "x$dash_c" = "xyes"; then
@@ -221,14 +219,12 @@ for arg in $@ ; do
fi
;;
-E|-M|-MT)
- allargs="$allargs $arg"
compile_args="$compile_args $arg"
dash_c="yes"
do_link="no"
;;
-l*)
libraries=" $libraries $arg "
- allargs="$allargs $arg"
;;
-prefix=*)
prefix="`expr "$arg" : '-prefix=\(.*\)'`"
@@ -254,15 +250,15 @@ for arg in $@ ; do
;;
*\"*)
qarg="'"$arg"'"
- allargs="$allargs $qarg"
+ misc_args="$misc_args $qarg"
;;
*\'*)
- qarg='\"'"$arg"'\"'
- allargs="$allargs $qarg"
+ qarg='"'"$arg"'"'
+ misc_args="$misc_args $qarg"
;;
*)
- allargs="$allargs $qarg"
+ misc_args="$misc_args $qarg"
if [ -s "$arg" ] ; then
ext=`expr "$arg" : '.*\(\..*\)'`
@@ -300,7 +296,7 @@ if test "x$do_compile" = "xyes"; then
compile_args="-c $compile_args"
fi
- $SHOW $CXX -I$includedir $H5BLD_CPPFLAGS $CPPFLAGS $H5BLD_CXXFLAGS $CXXFLAGS $compile_args
+ $SHOW $CXX -I$includedir $H5BLD_CPPFLAGS $CPPFLAGS $H5BLD_CXXFLAGS $CXXFLAGS $misc_args $compile_args
status=$?
if test "$status" != "0"; then
diff --git a/c++/test/h5cpputil.cpp b/c++/test/h5cpputil.cpp
index c3feefa..933aa7d 100644
--- a/c++/test/h5cpputil.cpp
+++ b/c++/test/h5cpputil.cpp
@@ -198,13 +198,6 @@ InvalidActionException::InvalidActionException(const H5std_string &func, const H
}
//--------------------------------------------------------------------------
-// Function: InvalidActionException destructor
-//--------------------------------------------------------------------------
-InvalidActionException::~InvalidActionException() throw()
-{
-}
-
-//--------------------------------------------------------------------------
// Function: TestFailedException default constructor
//--------------------------------------------------------------------------
TestFailedException::TestFailedException() : Exception()
@@ -225,10 +218,3 @@ TestFailedException::TestFailedException(const H5std_string &func, const H5std_s
: Exception(func, message)
{
}
-
-//--------------------------------------------------------------------------
-// Function: TestFailedException destructor
-//--------------------------------------------------------------------------
-TestFailedException::~TestFailedException() throw()
-{
-}
diff --git a/c++/test/h5cpputil.h b/c++/test/h5cpputil.h
index 392382d..fa6822a 100644
--- a/c++/test/h5cpputil.h
+++ b/c++/test/h5cpputil.h
@@ -49,14 +49,14 @@ class InvalidActionException : public Exception {
public:
InvalidActionException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
InvalidActionException();
- ~InvalidActionException() throw() override;
+ ~InvalidActionException() override = default;
};
class TestFailedException : public Exception {
public:
TestFailedException(const H5std_string &func_name, const H5std_string &message = DEFAULT_MSG);
TestFailedException();
- ~TestFailedException() throw() override;
+ ~TestFailedException() override = default;
};
// Overloaded/Template functions to verify values and display proper info
diff --git a/config/cmake-presets/hidden-presets.json b/config/cmake-presets/hidden-presets.json
index ab8fdf1..df27de0 100644
--- a/config/cmake-presets/hidden-presets.json
+++ b/config/cmake-presets/hidden-presets.json
@@ -400,7 +400,7 @@
"execution": {
"noTestsAction": "error",
"timeout": 600,
- "jobs": 8
+ "jobs": 4
}
},
{
diff --git a/config/cmake/H5pubconf.h.in b/config/cmake/H5pubconf.h.in
index 0ff22fe..f8756e9 100644
--- a/config/cmake/H5pubconf.h.in
+++ b/config/cmake/H5pubconf.h.in
@@ -243,6 +243,9 @@
/* Define if we have parallel support */
#cmakedefine H5_HAVE_PARALLEL @H5_HAVE_PARALLEL@
+/* Define if MPI Fortran supports mpi_f08 module */
+#cmakedefine H5_HAVE_MPI_F08 @H5_HAVE_MPI_F08@
+
/* Define if we have support for writing to filtered datasets in parallel */
#cmakedefine H5_HAVE_PARALLEL_FILTERED_WRITES @H5_HAVE_PARALLEL_FILTERED_WRITES@
diff --git a/config/cmake/HDF5ExampleCache.cmake b/config/cmake/HDF5ExampleCache.cmake
index 6ac9cc0..7c3bf13 100644
--- a/config/cmake/HDF5ExampleCache.cmake
+++ b/config/cmake/HDF5ExampleCache.cmake
@@ -14,7 +14,7 @@ set (HDF_BUILD_CPP_LIB ${HDF5_BUILD_CPP_LIB} CACHE BOOL "Build HDF5 C++ Library"
set (HDF_BUILD_HL_LIB ${HDF5_BUILD_HL_LIB} CACHE BOOL "Build HIGH Level examples" FORCE)
set (HDF_ENABLE_THREADSAFE ${HDF5_ENABLE_THREADSAFE} CACHE BOOL "Enable examples thread-safety" FORCE)
set (HDF_ENABLE_PARALLEL ${HDF5_ENABLE_PARALLEL} CACHE BOOL "Enable examples parallel build (requires MPI)" FORCE)
-set (H5EX_USE_GNU_DIRS ${HDF5_USE_GNU_DIRS} CACHE BOOL "TRUE to use GNU Coding Standard install directory variables, FALSE to use historical settings" FORCE)
+set (H5EX_USE_GNU_DIRS ${HDF5_USE_GNU_DIRS} CACHE BOOL "ON to use GNU Coding Standard install directory variables, OFF to use historical settings" FORCE)
#preset HDF5 cache vars to this projects libraries instead of searching
set (H5EX_HDF5_HEADER "H5pubconf.h" CACHE STRING "Name of HDF5 header" FORCE)
diff --git a/config/cmake/HDF5PluginCache.cmake b/config/cmake/HDF5PluginCache.cmake
index d299d9f..7cdaf02 100644
--- a/config/cmake/HDF5PluginCache.cmake
+++ b/config/cmake/HDF5PluginCache.cmake
@@ -39,4 +39,4 @@ set (H5PL_TGZ_NAME "${PLUGIN_TGZ_NAME}" CACHE STRING "Use plugins from compresse
set (PL_PACKAGE_NAME "${PLUGIN_PACKAGE_NAME}" CACHE STRING "Name of plugins package" FORCE)
set (H5PL_CPACK_ENABLE OFF CACHE BOOL "Enable CPack include and components" FORCE)
-set (H5PL_USE_GNU_DIRS ${HDF5_USE_GNU_DIRS} CACHE BOOL "TRUE to use GNU Coding Standard install directory variables" FORCE)
+set (H5PL_USE_GNU_DIRS ${HDF5_USE_GNU_DIRS} CACHE BOOL "ON to use GNU Coding Standard install directory variables, OFF to use historical settings" FORCE)
diff --git a/config/cmake/HDFMacros.cmake b/config/cmake/HDFMacros.cmake
index 30c16e6..64e77c0 100644
--- a/config/cmake/HDFMacros.cmake
+++ b/config/cmake/HDFMacros.cmake
@@ -369,7 +369,7 @@ macro (HDFTEST_COPY_FILE src dest target)
endmacro ()
macro (HDF_DIR_PATHS package_prefix)
- option (HDF5_USE_GNU_DIRS "TRUE to use GNU Coding Standard install directory variables, FALSE to use historical settings" FALSE)
+ option (HDF5_USE_GNU_DIRS "ON to use GNU Coding Standard install directory variables, OFF to use historical settings" OFF)
if (HDF5_USE_GNU_DIRS)
include(GNUInstallDirs)
if (NOT ${package_prefix}_INSTALL_BIN_DIR)
@@ -400,7 +400,7 @@ macro (HDF_DIR_PATHS package_prefix)
endif ()
if (APPLE)
- option (${package_prefix}_BUILD_FRAMEWORKS "TRUE to build as frameworks libraries, FALSE to build according to BUILD_SHARED_LIBS" FALSE)
+ option (${package_prefix}_BUILD_FRAMEWORKS "ON to build as frameworks libraries, OFF to build according to BUILD_SHARED_LIBS" OFF)
endif ()
if (NOT ${package_prefix}_INSTALL_BIN_DIR)
@@ -449,10 +449,10 @@ macro (HDF_DIR_PATHS package_prefix)
message(STATUS "Final: ${${package_prefix}_INSTALL_DOC_DIR}")
# Always use full RPATH, i.e. don't skip the full RPATH for the build tree
- set (CMAKE_SKIP_BUILD_RPATH FALSE)
+ set (CMAKE_SKIP_BUILD_RPATH OFF)
# when building, don't use the install RPATH already
# (but later on when installing)
- set (CMAKE_INSTALL_RPATH_USE_LINK_PATH FALSE)
+ set (CMAKE_INSTALL_RPATH_USE_LINK_PATH OFF)
# add the automatically determined parts of the RPATH
# which point to directories outside the build tree to the install RPATH
set (CMAKE_BUILD_WITH_INSTALL_RPATH ON)
diff --git a/config/cmake/hdf5-config.cmake.in b/config/cmake/hdf5-config.cmake.in
index 186ae67..b04b201 100644
--- a/config/cmake/hdf5-config.cmake.in
+++ b/config/cmake/hdf5-config.cmake.in
@@ -33,39 +33,42 @@ set (${HDF5_PACKAGE_NAME}_VALID_COMPONENTS
# User Options
#-----------------------------------------------------------------------------
# Languages:
+#-----------------------------------------------------------------------------
set (${HDF5_PACKAGE_NAME}_BUILD_FORTRAN @HDF5_BUILD_FORTRAN@)
set (${HDF5_PACKAGE_NAME}_BUILD_CPP_LIB @HDF5_BUILD_CPP_LIB@)
set (${HDF5_PACKAGE_NAME}_BUILD_JAVA @HDF5_BUILD_JAVA@)
set (${HDF5_PACKAGE_NAME}_INSTALL_MOD_FORTRAN "@HDF5_INSTALL_MOD_FORTRAN@")
#-----------------------------------------------------------------------------
# Features:
-set (${HDF5_PACKAGE_NAME}_ENABLE_PARALLEL @HDF5_ENABLE_PARALLEL@)
-set (${HDF5_PACKAGE_NAME}_PARALLEL_FILTERED_WRITES @PARALLEL_FILTERED_WRITES@)
-set (${HDF5_PACKAGE_NAME}_LARGE_PARALLEL_IO @LARGE_PARALLEL_IO@)
+#-----------------------------------------------------------------------------
set (${HDF5_PACKAGE_NAME}_BUILD_HL_LIB @HDF5_BUILD_HL_LIB@)
-set (${HDF5_PACKAGE_NAME}_BUILD_DIMENSION_SCALES_WITH_NEW_REF @DIMENSION_SCALES_WITH_NEW_REF@)
-set (${HDF5_PACKAGE_NAME}_BUILD_TOOLS @HDF5_BUILD_TOOLS@)
-set (${HDF5_PACKAGE_NAME}_BUILD_HL_GIF_TOOLS @HDF5_BUILD_HL_GIF_TOOLS@)
+set (${HDF5_PACKAGE_NAME}_BUILD_SHARED_LIBS @H5_ENABLE_SHARED_LIB@)
+set (${HDF5_PACKAGE_NAME}_BUILD_STATIC_LIBS @H5_ENABLE_STATIC_LIB@)
set (${HDF5_PACKAGE_NAME}_ENABLE_THREADSAFE @HDF5_ENABLE_THREADSAFE@)
+set (${HDF5_PACKAGE_NAME}_ENABLE_PARALLEL @HDF5_ENABLE_PARALLEL@)
set (${HDF5_PACKAGE_NAME}_DEFAULT_API_VERSION "@DEFAULT_API_VERSION@")
set (${HDF5_PACKAGE_NAME}_ENABLE_DEPRECATED_SYMBOLS @HDF5_ENABLE_DEPRECATED_SYMBOLS@)
+set (${HDF5_PACKAGE_NAME}_BUILD_DIMENSION_SCALES_WITH_NEW_REF @DIMENSION_SCALES_WITH_NEW_REF@)
+#-----------------------------------------------------------------------------
+set (${HDF5_PACKAGE_NAME}_BUILD_TOOLS @HDF5_BUILD_TOOLS@)
+set (${HDF5_PACKAGE_NAME}_BUILD_HL_GIF_TOOLS @HDF5_BUILD_HL_GIF_TOOLS@)
+#-----------------------------------------------------------------------------
set (${HDF5_PACKAGE_NAME}_ENABLE_Z_LIB_SUPPORT @HDF5_ENABLE_Z_LIB_SUPPORT@)
set (${HDF5_PACKAGE_NAME}_ENABLE_SZIP_SUPPORT @HDF5_ENABLE_SZIP_SUPPORT@)
set (${HDF5_PACKAGE_NAME}_ENABLE_SZIP_ENCODING @HDF5_ENABLE_SZIP_ENCODING@)
-set (${HDF5_PACKAGE_NAME}_ENABLE_MAP_API @H5_HAVE_MAP_API@)
-set (${HDF5_PACKAGE_NAME}_ENABLE_DIRECT_VFD @H5_HAVE_DIRECT@)
-set (${HDF5_PACKAGE_NAME}_ENABLE_MIRROR_VFD @H5_HAVE_MIRROR_VFD@)
-set (${HDF5_PACKAGE_NAME}_ENABLE_SUBFILING_VFD @HDF5_ENABLE_SUBFILING_VFD@)
+#-----------------------------------------------------------------------------
+set (${HDF5_PACKAGE_NAME}_ENABLE_MAP_API @HDF5_ENABLE_MAP_API@)
+set (${HDF5_PACKAGE_NAME}_ENABLE_DIRECT_VFD @HDF5_ENABLE_DIRECT_VFD@)
+set (${HDF5_PACKAGE_NAME}_ENABLE_MIRROR_VFD @HDF5_ENABLE_MIRROR_VFD@)
set (${HDF5_PACKAGE_NAME}_ENABLE_ROS3_VFD @HDF5_ENABLE_ROS3_VFD@)
-set (${HDF5_PACKAGE_NAME}_ENABLE_HDFS_VFD @H5_HAVE_LIBHDFS@)
+set (${HDF5_PACKAGE_NAME}_ENABLE_HDFS_VFD @HDF5_ENABLE_HDFS@)
+set (${HDF5_PACKAGE_NAME}_ENABLE_SUBFILING_VFD @HDF5_ENABLE_SUBFILING_VFD@)
set (${HDF5_PACKAGE_NAME}_ENABLE_PLUGIN_SUPPORT @HDF5_ENABLE_PLUGIN_SUPPORT@)
#-----------------------------------------------------------------------------
-set (${HDF5_PACKAGE_NAME}_BUILD_SHARED_LIBS @H5_ENABLE_SHARED_LIB@)
-set (${HDF5_PACKAGE_NAME}_BUILD_STATIC_LIBS @H5_ENABLE_STATIC_LIB@)
-set (${HDF5_PACKAGE_NAME}_PACKAGE_EXTLIBS @HDF5_PACKAGE_EXTLIBS@)
-set (${HDF5_PACKAGE_NAME}_EXPORT_LIBRARIES @HDF5_LIBRARIES_TO_EXPORT@)
-set (${HDF5_PACKAGE_NAME}_ARCHITECTURE "@CMAKE_GENERATOR_ARCHITECTURE@")
-set (${HDF5_PACKAGE_NAME}_TOOLSET "@CMAKE_GENERATOR_TOOLSET@")
+set (${HDF5_PACKAGE_NAME}_PACKAGE_EXTLIBS @HDF5_PACKAGE_EXTLIBS@)
+set (${HDF5_PACKAGE_NAME}_EXPORT_LIBRARIES @HDF5_LIBRARIES_TO_EXPORT@)
+set (${HDF5_PACKAGE_NAME}_ARCHITECTURE "@CMAKE_GENERATOR_ARCHITECTURE@")
+set (${HDF5_PACKAGE_NAME}_TOOLSET "@CMAKE_GENERATOR_TOOLSET@")
#-----------------------------------------------------------------------------
# Dependencies
@@ -77,6 +80,8 @@ if (${HDF5_PACKAGE_NAME}_ENABLE_PARALLEL)
set (${HDF5_PACKAGE_NAME}_MPI_Fortran_INCLUDE_PATH "@MPI_Fortran_INCLUDE_DIRS@")
set (${HDF5_PACKAGE_NAME}_MPI_Fortran_LIBRARIES "@MPI_Fortran_LIBRARIES@")
endif ()
+ set (${HDF5_PACKAGE_NAME}_PARALLEL_FILTERED_WRITES @PARALLEL_FILTERED_WRITES@)
+ set (${HDF5_PACKAGE_NAME}_LARGE_PARALLEL_IO @LARGE_PARALLEL_IO@)
find_package(MPI QUIET REQUIRED)
endif ()
diff --git a/config/cmake/runTest.cmake b/config/cmake/runTest.cmake
index d21765a..0cfb9a3 100644
--- a/config/cmake/runTest.cmake
+++ b/config/cmake/runTest.cmake
@@ -139,6 +139,11 @@ if (TEST_FIND_RESULT GREATER -1)
string (REGEX REPLACE "^.*_pmi_alps[^\n]+\n" "" TEST_STREAM "${TEST_STREAM}")
file (WRITE ${TEST_FOLDER}/${TEST_OUTPUT} ${TEST_STREAM})
endif ()
+string (FIND "${TEST_STREAM}" "ulimit -s" TEST_FIND_RESULT)
+if (TEST_FIND_RESULT GREATER -1)
+ string (REGEX REPLACE "^.*ulimit -s.*\n" "" TEST_STREAM "${TEST_STREAM}")
+ file (WRITE ${TEST_FOLDER}/${TEST_OUTPUT} ${TEST_STREAM})
+endif ()
# remove special error output
if (NOT TEST_ERRREF)
diff --git a/config/cmake/scripts/CTestScript.cmake b/config/cmake/scripts/CTestScript.cmake
index 37bf0d4..4f7eb4b 100644
--- a/config/cmake/scripts/CTestScript.cmake
+++ b/config/cmake/scripts/CTestScript.cmake
@@ -81,7 +81,7 @@ if (CTEST_USE_TAR_SOURCE)
## Uncompress source if tar file provided
## --------------------------
if (WIN32 AND NOT MINGW)
- message (STATUS "extracting... [${CMAKE_EXECUTABLE_NAME} x ${CTEST_DASHBOARD_ROOT}\\${CTEST_USE_TAR_SOURCE}.zip]")
+ message (STATUS "extracting... [${CMAKE_EXECUTABLE_NAME} -E tar -xvf ${CTEST_DASHBOARD_ROOT}\\${CTEST_USE_TAR_SOURCE}.zip]")
execute_process (COMMAND ${CMAKE_EXECUTABLE_NAME} -E tar -xvf ${CTEST_DASHBOARD_ROOT}\\${CTEST_USE_TAR_SOURCE}.zip RESULT_VARIABLE rv)
else ()
message (STATUS "extracting... [${CMAKE_EXECUTABLE_NAME} -E tar -xvf ${CTEST_DASHBOARD_ROOT}/${CTEST_USE_TAR_SOURCE}.tar]")
diff --git a/configure.ac b/configure.ac
index 76303d3..44dc430 100644
--- a/configure.ac
+++ b/configure.ac
@@ -1220,14 +1220,14 @@ AC_MSG_RESULT([$HDF5_DOXYGEN])
## This needs to be exposed for the library info file.
AC_SUBST([HDF5_DOXY_WARNINGS])
-## Default is to consider doxygen warnings as errors
+## Default is not to consider doxygen warnings as errors
DOXY_ERR=yes
AC_MSG_CHECKING([if doxygen warnings as errors is enabled])
AC_ARG_ENABLE([doxygen-errors],
[AS_HELP_STRING([--enable-doxygen-errors],
- [Error on HDF5 doxygen warnings [default=yes]])],
+ [Error on HDF5 doxygen warnings [default=no]])],
[DOXY_ERR=$enableval])
if test "X$DOXY_ERR" = "Xyes"; then
@@ -2840,6 +2840,25 @@ if test -n "$PARALLEL"; then
AC_MSG_RESULT([yes])],
[AC_MSG_RESULT([no])]
)
+
+ AC_LANG_PUSH([Fortran])
+ AC_MSG_CHECKING([for MPI-3 module mpi_f08])
+ AC_LINK_IFELSE(
+ [AC_LANG_PROGRAM([],
+ [
+ USE mpi_f08
+ IMPLICIT NONE
+ TYPE(MPI_Comm) :: comm
+ TYPE(MPI_INFO) :: info
+ ]
+ )
+ ],
+ [AC_DEFINE([HAVE_MPI_F08], [1],
+                     [Define if mpi_f08 module exists])
+ AC_MSG_RESULT([yes])],
+ [AC_MSG_RESULT([no])]
+ )
+ AC_LANG_POP(Fortran)
fi
## ----------------------------------------------------------------------
diff --git a/doc/parallel-compression.md b/doc/parallel-compression.md
index b3a4592..484f501 100644
--- a/doc/parallel-compression.md
+++ b/doc/parallel-compression.md
@@ -64,9 +64,9 @@ H5Dwrite(..., dxpl_id, ...);
The following are two simple examples of using the parallel
compression feature:
-[ph5_filtered_writes.c](https://github.com/HDFGroup/hdf5/blob/hdf5_1_14/examples/ph5_filtered_writes.c)
+[ph5_filtered_writes.c](https://github.com/HDFGroup/hdf5/blob/hdf5_1_14/HDF5Examples/C/H5PAR/ph5_filtered_writes.c)
-[ph5_filtered_writes_no_sel.c](https://github.com/HDFGroup/hdf5/blob/hdf5_1_14/examples/ph5_filtered_writes_no_sel.c)
+[ph5_filtered_writes_no_sel.c](https://github.com/HDFGroup/hdf5/blob/hdf5_1_14/HDF5Examples/C/H5PAR/ph5_filtered_writes_no_sel.c)
The former contains simple examples of using the parallel
compression feature to write to compressed datasets, while the
diff --git a/doxygen/dox/LearnBasics3.dox b/doxygen/dox/LearnBasics3.dox
index 435187e..195213b 100644
--- a/doxygen/dox/LearnBasics3.dox
+++ b/doxygen/dox/LearnBasics3.dox
@@ -288,7 +288,7 @@ is met, at a certain point in the future.)
\section secLBContentsProg Programming Example
-\subsection subsecLBContentsProgUsing Using #H5Literate, #H5Lvisit and #H5Ovisit
+\subsection subsecLBContentsProgUsing Using H5Literate, H5Lvisit and H5Ovisit
For example code, see the \ref HDF5Examples page.
Specifically look at the \ref ExAPI.
There are examples for different languages, where examples of using #H5Literate and #H5Ovisit/#H5Lvisit are included.
diff --git a/doxygen/dox/PredefinedDatatypeTables.dox b/doxygen/dox/PredefinedDatatypeTables.dox
index fbafa94..0b40516 100644
--- a/doxygen/dox/PredefinedDatatypeTables.dox
+++ b/doxygen/dox/PredefinedDatatypeTables.dox
@@ -1,22 +1,41 @@
/** \page predefined_datatypes_tables HDF5 Predefined Datatypes
+ *
+ * \section sec_predefined_datatypes_tables HDF5 Predefined Datatypes
*
* The following datatypes are predefined in HDF5.
- *
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_ieee_datatypes_table
+ * </div>
*
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_std_datatypes_table
- *
+ * </div>
+ *
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_unix_datatypes_table
- *
+ * </div>
+ *
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_string_datatypes_table
- *
+ * </div>
+ *
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_intel_datatypes_table
- *
+ * </div>
+ *
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_dec_datatypes_table
- *
+ * </div>
+ *
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_mips_datatypes_table
- *
+ * </div>
+ *
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_native_datatypes_table
- *
+ * </div>
+ *
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_c9x_datatypes_table
+ * </div>
*/
diff --git a/doxygen/examples/H5.format.html b/doxygen/examples/H5.format.html
index 832e3fc..7aba5fe 100644
--- a/doxygen/examples/H5.format.html
+++ b/doxygen/examples/H5.format.html
@@ -10780,6 +10780,12 @@ within the embedded dataspace]<br />
<a name="ClassReference"></a>
<p>Class specific information for the Reference class (Class 7):</p>
+ <p>Note that for region references, the stored data is
+ a <a href="#GlobalHeapID">Global Heap ID</a> pointing to information
+ about the region stored in the global heap.
+ </p>
+
+
<div align="center">
<table class="desc">
<caption>
@@ -11118,6 +11124,11 @@ within the embedded dataspace]<br />
<a name="ClassVarLen"></a>
<p>Class specific information for the Variable-length class (Class 9):</p>
+ <p>Note that data with a variable length type is stored on the global heap.
+ Locations that would normally store the data directly (e.g. attribute message)
+ will instead contain a <a href="#GlobalHeapID">Global Heap ID</a>.
+ </p>
+
<div align="center">
<table class="desc">
<caption>
@@ -11276,7 +11287,7 @@ within the embedded dataspace]<br />
</tr>
<tr>
- <td colspan="4"><br />Base Type<br /><br /></td>
+ <td colspan="4"><br />Parent Type Message<br /><br /></td>
</tr>
</table>
@@ -11294,11 +11305,14 @@ within the embedded dataspace]<br />
</tr>
<tr>
- <td><p>Base Type</p></td>
+ <td><p>Parent Type</p></td>
<td>
- <p>Each variable-length type is based on some parent type. The
- information for that parent type is described recursively by
- this field.
+ <p>Each variable-length type is based on some parent type.
+ This field contains the datatype message describing that parent type.
+ In the case of nested variable-length types, this parent datatype message will
+ recursively contain all parent datatype messages.
+
+ Variable-length strings are considered to have the parent type `H5T_NATIVE_UCHAR`.
</p>
</td>
</tr>
diff --git a/doxygen/examples/tables/fileDriverLists.dox b/doxygen/examples/tables/fileDriverLists.dox
index b1f873f..437d32a 100644
--- a/doxygen/examples/tables/fileDriverLists.dox
+++ b/doxygen/examples/tables/fileDriverLists.dox
@@ -1,5 +1,4 @@
/** File Driver List
- *
//! [file_driver_table]
<table>
<caption>I/O file drivers</caption>
@@ -70,7 +69,6 @@
</table>
//! [file_driver_table]
*
- *
//! [supported_file_driver_table]
<table>
<caption id="table_file_drivers">Supported file drivers</caption>
diff --git a/examples/testh5cc.sh.in b/examples/testh5cc.sh.in
index 17b2563..291aaf6 100644
--- a/examples/testh5cc.sh.in
+++ b/examples/testh5cc.sh.in
@@ -66,6 +66,8 @@ prog1=${H5TOOL}_prog1.$suffix
prog1_o=${H5TOOL}_prog1.o
prog2=${H5TOOL}_prog2.$suffix
prog2_o=${H5TOOL}_prog2.o
+args=${H5TOOL}_args.$suffix
+args_o=${H5TOOL}_args.o
applib=libapp${H5TOOL}.a
# short hands
@@ -277,16 +279,85 @@ main (void)
}
EOF
+# Generate HDF5 v1.14 Main Program:
+# This makes unique V1.14 API calls.
+cat > $v114main <<EOF
+/* This is a V1.14 API calls example Program. */
+#include "hdf5.h"
+#define H5FILE_NAME "tmp.h5"
+#define SPACE1_RANK 3
+int
+main (void)
+{
+ hid_t sid; /* Dataspace ID */
+ hid_t fapl = -1; /* File access property list ID */
+ int rank; /* Logical rank of dataspace */
+ hsize_t dims[] = {3, 3, 15};
+ size_t sbuf_size=0;
+ herr_t ret; /* Generic return value */
+ hsize_t start[] = {0, 0, 0};
+ hsize_t stride[] = {2, 5, 3};
+ hsize_t count[] = {2, 2, 2};
+ hsize_t block[] = {1, 3, 1};
+
+ /* Create the file access property list */
+ fapl = H5Pcreate(H5P_FILE_ACCESS);
+
+ /* Set low/high bounds in the fapl */
+ ret = H5Pset_libver_bounds(fapl, H5F_LIBVER_EARLIEST,
+ H5F_LIBVER_LATEST);
+
+ /* Create the dataspace */
+ sid = H5Screate_simple(SPACE1_RANK, dims, NULL);
+
+ /* Set the hyperslab selection */
+ ret = H5Sselect_hyperslab(sid, H5S_SELECT_SET, start, stride, count, block);
+
+ /* Encode simple dataspace in a buffer with the fapl setting */
+ ret = H5Sencode(sid, NULL, &sbuf_size, fapl);
+
+ /* Encode the dataspace again, using H5Sencode2 with the fapl setting */
+ ret = H5Sencode2(sid, NULL, &sbuf_size, fapl);
+
+ printf("HDF5 C program created with V1.14 API ran successfully. ");
+/* "File %s generated.\n", H5FILE_NAME);
+ remove(H5FILE_NAME); */
+ return 0;
+}
+EOF
+
+# Generate args:
+# An application main that tests misc command-line arguments being passed.
+cat > $args <<EOF
+#include "hdf5.h"
+#define H5FILE_NAME "check_args.h5"
+int
+main (void)
+{
+ char c = SGL_QUOTE; /* 'H' */
+ char *s = DBL_QUOTE; /* "HDF" */
+ int val = MISC; /* 42 */
+ hid_t file; /* file and dataset handles */
+
+ file = H5Fcreate(H5FILE_NAME, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
+ H5Fclose(file);
+
+ printf("HDF5 C Sample program ran successfully. File %s generated.\n", H5FILE_NAME);
+ remove(H5FILE_NAME);
+
+ return 0;
+}
+EOF
# Parse option
# None
-# Print a line-line message left justified in a field of 70 characters
+# Print a one-line message left justified in a field of 71 characters
# beginning with the word "Testing".
#
TESTING() {
SPACES=" "
- echo "Testing $* $SPACES" | cut -c1-70 | tr -d '\012'
+ echo "Testing $* $SPACES" | cut -c1-71 | tr -d '\012'
}
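The widened TESTING banner above can be exercised on its own. A minimal sketch reproducing the helper (the padding string and 71-column width come from the patched script):

```shell
#!/bin/sh
# Sketch of the script's TESTING helper: pad the message, clip the line
# to 71 columns, and strip the trailing newline so a PASSED/FAILED
# verdict can be printed on the same line afterward.
TESTING() {
  SPACES="                                                                        "
  echo "Testing $* $SPACES" | cut -c1-71 | tr -d '\012'
}

TESTING h5cc sample; echo " PASSED"
```

Because `tr -d '\012'` removes the newline, every banner is exactly 71 bytes regardless of the message length being tested.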
@@ -450,6 +521,10 @@ else
TOOLTEST $v112main
fi
+# Group 6: # HDF5 program that depends on input args.
+echo "***"Simple Compile and Link in one step with user-supplied arguments.
+TOOLTEST -DSGL_QUOTE=\'H\' -DDBL_QUOTE=\"HDF\" -DMISC=42 $args
+
##############################################################################
# END
##############################################################################
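The new Group 6 test exercises the compiler-wrapper fix described in the merge message: arguments containing escaped quotes were dropped because the collected argument list ("allargs") was never used in the final compile command. A minimal sketch of the corrected forwarding, with a hypothetical `compile` function standing in for the real compile line:

```shell
#!/bin/sh
# Forward every collected argument to the compile line. "$@" keeps each
# argument as a single word, so defines with embedded quote characters
# survive intact; $* (or dropping the list) would lose or re-split them.
compile() {
  # Stand-in for the real compiler invocation: print argv one per line.
  for a in "$@"; do printf '%s\n' "$a"; done
}

compile -DSGL_QUOTE="'H'" -DDBL_QUOTE='"HDF"' -DMISC=42 args.c
```

Each define arrives at the stand-in compiler as one intact argument, quotes included, which is exactly what the args test program above relies on.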
diff --git a/fortran/examples/testh5fc.sh.in b/fortran/examples/testh5fc.sh.in
index 0328bbb..f8f3706 100644
--- a/fortran/examples/testh5fc.sh.in
+++ b/fortran/examples/testh5fc.sh.in
@@ -42,11 +42,13 @@ myos=`uname -s`
myhostnama=`uname -n`
# Generate some source files and library for tests.
-suffix=f90 # source file suffix
+suffix=F90 # source file suffix
hdf5main=${H5TOOL}_hdf5main.$suffix
hdf5main_o=${H5TOOL}_hdf5main.o
appmain=${H5TOOL}_appmain.$suffix
appmain_o=${H5TOOL}_appmain.o
+args=${H5TOOL}_args.$suffix
+args_o=${H5TOOL}_args.o
prog1=${H5TOOL}_prog1.$suffix
prog1_o=${H5TOOL}_prog1.o
prog2=${H5TOOL}_prog2.$suffix
@@ -106,7 +108,7 @@ cat > $hdf5main <<EOF
IMPLICIT NONE
- CHARACTER(LEN=8), PARAMETER :: filename = "apptmp.h5" ! File name
+ CHARACTER(LEN=9), PARAMETER :: filename = "apptmp.h5" ! File name
INTEGER(HID_T) :: file_id ! File identifier
INTEGER :: error ! Error flag
@@ -118,17 +120,38 @@ cat > $hdf5main <<EOF
END PROGRAM FILEEXAMPLE
EOF
+# Generate an args Main Program:
+# An application main that tests misc command-line arguments being passed.
+cat > $args <<EOF
+ PROGRAM ARGS
+ USE HDF5 ! This module contains all necessary modules
+
+ IMPLICIT NONE
+ CHARACTER(LEN=1), PARAMETER :: chr1 = SGL_QUOTE ! 'H'
+ CHARACTER(LEN=3), PARAMETER :: chr3 = DBL_QUOTE ! "HDF"
+ INTEGER, PARAMETER :: val = MISC ! 42
+
+ CHARACTER(LEN=9), PARAMETER :: filename = "argtmp.h5" ! File name
+ INTEGER(HID_T) :: file_id ! File identifier
+
+ INTEGER :: error ! Error flag
+ CALL h5open_f (error)
+ CALL h5fcreate_f(filename, H5F_ACC_TRUNC_F, file_id, error)
+ CALL h5fclose_f(file_id, error)
+ CALL h5close_f(error)
+ END PROGRAM ARGS
+EOF
# Parse option
# None
-# Print a line-line message left justified in a field of 70 characters
+# Print a one-line message left justified in a field of 83 characters
# beginning with the word "Testing".
#
TESTING() {
SPACES=" "
- echo "Testing $* $SPACES" | cut -c1-70 | tr -d '\012'
+ echo "Testing $* $SPACES" | cut -c1-83 | tr -d '\012'
}
@@ -199,6 +222,15 @@ $RANLIB $applib
TOOLTEST $appmain $applib
TOOLTEST $appmain_o $applib
+# HDF5 program that depends on input args.
+echo "***"Simple Compile and Link in one step with user-supplied arguments.
+FCBASE=`grep "FCBASE=" $H5TOOL_BIN | xargs basename`
+WF=""
+if grep -qi "xlf" <<< "$FCBASE"; then
+ WF="-WF,"
+fi
+TOOLTEST $WF-DSGL_QUOTE=\'H\' $WF-DDBL_QUOTE=\"HDF\" $WF-DMISC=42 $args
+
# No preprocess test since -E is not a common option for Fortran compilers.
##############################################################################
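The xlf check above follows IBM XL Fortran's convention that preprocessor options be forwarded through the compiler driver as `-WF,-D...`, while other Fortran compilers accept `-D` directly. A standalone sketch of the prefix selection (the function name is illustrative, not part of the script):

```shell
#!/bin/sh
# IBM XL Fortran (xlf, xlf90, xlf2008, ...) wants preprocessor defines
# wrapped as -WF,-D<name>=<value>; most other Fortran compilers accept
# -D directly, so the prefix is empty for them.
wf_prefix() {
  case "$1" in
    *xlf*) printf '%s' "-WF," ;;
    *)     printf '%s' ""     ;;
  esac
}

WF=$(wf_prefix xlf2008)
echo "${WF}-DMISC=42"    # XL Fortran: -WF,-DMISC=42
WF=$(wf_prefix gfortran)
echo "${WF}-DMISC=42"    # gfortran:   -DMISC=42
```

A `case` pattern match avoids the bash-only `<<<` here-string used in the patched script, so the sketch stays POSIX sh.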
diff --git a/fortran/src/CMakeLists.txt b/fortran/src/CMakeLists.txt
index 60c0c3a..4c28046 100644
--- a/fortran/src/CMakeLists.txt
+++ b/fortran/src/CMakeLists.txt
@@ -24,6 +24,13 @@ if (WIN32)
endif ()
endif ()
+if (H5_HAVE_MPI_F08) # MPI-3 module mpi_f08 supported
+ set (CMAKE_H5_HAVE_MPI_F08 1)
+else ()
+ set (H5_NOMPI_F08 ";")
+ set (CMAKE_H5_HAVE_MPI_F08 0)
+endif ()
+
# configure for Fortran preprocessor
# Define Parallel variable for passing to H5config_f.inc.cmake
diff --git a/fortran/src/H5Ff.c b/fortran/src/H5Ff.c
index 9703486..4b56cca 100644
--- a/fortran/src/H5Ff.c
+++ b/fortran/src/H5Ff.c
@@ -189,98 +189,6 @@ h5funmount_c(hid_t_f *loc_id, _fcd dsetname, int_f *namelen)
return ret_value;
}
-/****if* H5Ff/h5fget_create_plist_c
- * NAME
- * h5fget_create_plist_c
- * PURPOSE
- * Call H5Fget_create_plist to get the file creation property list
- * INPUTS
- * file_id - file identifier
- * OUTPUTS
- * prop_id - creation property list identifier
- * RETURNS
- * 0 on success, -1 on failure
- * SOURCE
- */
-int_f
-h5fget_create_plist_c(hid_t_f *file_id, hid_t_f *prop_id)
-/******/
-{
- int ret_value = -1;
- hid_t c_file_id, c_prop_id;
-
- c_file_id = (hid_t)*file_id;
- c_prop_id = H5Fget_create_plist(c_file_id);
-
- if (c_prop_id < 0)
- return ret_value;
- *prop_id = (hid_t_f)c_prop_id;
-
- ret_value = 0;
- return ret_value;
-}
-
-/****if* H5Ff/h5fget_access_plist_c
- * NAME
- * h5fget_access_plist_c
- * PURPOSE
- * Call H5Fget_access_plist to get the file access property list
- * INPUTS
- * file_id - file identifier
- * OUTPUTS
- * access_id - access property list identifier
- * RETURNS
- * 0 on success, -1 on failure
- * SOURCE
- */
-int_f
-h5fget_access_plist_c(hid_t_f *file_id, hid_t_f *access_id)
-/******/
-{
- int ret_value = -1;
- hid_t c_file_id, c_access_id;
-
- c_file_id = (hid_t)*file_id;
- c_access_id = H5Fget_access_plist(c_file_id);
-
- if (c_access_id < 0)
- return ret_value;
- *access_id = (hid_t_f)c_access_id;
-
- ret_value = 0;
- return ret_value;
-}
-
-/****if* H5Ff/h5fget_obj_count_c
- * NAME
- * h5fget_obj_count_c
- * PURPOSE
- * Call H5Fget_obj_count to get number of open objects within a file
- * INPUTS
- * file_id - identifier of the file to be closed
- * obj_type - type of the object
- * RETURNS
- * obj_count - number of objects
- * 0 on success, -1 on failure
- * SOURCE
- */
-
-int_f
-h5fget_obj_count_c(hid_t_f *file_id, int_f *obj_type, size_t_f *obj_count)
-/******/
-{
- int ret_value = 0;
- hid_t c_file_id;
- unsigned c_obj_type;
- ssize_t c_obj_count;
-
- c_file_id = (hid_t)*file_id;
- c_obj_type = (unsigned)*obj_type;
- if ((c_obj_count = H5Fget_obj_count(c_file_id, c_obj_type)) < 0)
- ret_value = -1;
- *obj_count = (size_t_f)c_obj_count;
- return ret_value;
-}
/****if* H5Ff/h5fget_obj_ids_c
* NAME
* h5fget_obj_ids_c
diff --git a/fortran/src/H5Fff.F90 b/fortran/src/H5Fff.F90
index 79aa5a7..e8b765b 100644
--- a/fortran/src/H5Fff.F90
+++ b/fortran/src/H5Fff.F90
@@ -639,19 +639,21 @@ CONTAINS
!!
SUBROUTINE h5fget_create_plist_f(file_id, prop_id, hdferr)
IMPLICIT NONE
- INTEGER(HID_T), INTENT(IN) :: file_id
+ INTEGER(HID_T), INTENT(IN) :: file_id
INTEGER(HID_T), INTENT(OUT) :: prop_id
- INTEGER, INTENT(OUT) :: hdferr
+ INTEGER , INTENT(OUT) :: hdferr
INTERFACE
- INTEGER FUNCTION h5fget_create_plist_c(file_id, prop_id) BIND(C,NAME='h5fget_create_plist_c')
+ INTEGER(HID_T) FUNCTION H5Fget_create_plist(file_id) BIND(C,NAME='H5Fget_create_plist')
IMPORT :: HID_T
IMPLICIT NONE
- INTEGER(HID_T), INTENT(IN) :: file_id
- INTEGER(HID_T), INTENT(OUT) :: prop_id
- END FUNCTION h5fget_create_plist_c
+ INTEGER(HID_T), VALUE :: file_id
+ END FUNCTION H5Fget_create_plist
END INTERFACE
- hdferr = h5fget_create_plist_c(file_id, prop_id)
+ prop_id = H5Fget_create_plist(file_id)
+
+ hdferr = 0
+ IF(prop_id.LT.0) hdferr = -1
END SUBROUTINE h5fget_create_plist_f
!>
@@ -667,19 +669,21 @@ CONTAINS
!!
SUBROUTINE h5fget_access_plist_f(file_id, access_id, hdferr)
IMPLICIT NONE
- INTEGER(HID_T), INTENT(IN) :: file_id
+ INTEGER(HID_T), INTENT(IN) :: file_id
INTEGER(HID_T), INTENT(OUT) :: access_id
INTEGER, INTENT(OUT) :: hdferr
INTERFACE
- INTEGER FUNCTION h5fget_access_plist_c(file_id, access_id) BIND(C,NAME='h5fget_access_plist_c')
+ INTEGER(HID_T) FUNCTION H5Fget_access_plist(file_id) BIND(C,NAME='H5Fget_access_plist')
IMPORT :: HID_T
IMPLICIT NONE
- INTEGER(HID_T), INTENT(IN) :: file_id
- INTEGER(HID_T), INTENT(OUT) :: access_id
- END FUNCTION h5fget_access_plist_c
+ INTEGER(HID_T), VALUE :: file_id
+ END FUNCTION H5Fget_access_plist
END INTERFACE
- hdferr = h5fget_access_plist_c(file_id, access_id)
+ access_id = H5Fget_access_plist(file_id)
+
+ hdferr = 0
+ IF(access_id.LT.0) hdferr = -1
END SUBROUTINE h5fget_access_plist_f
@@ -835,37 +839,41 @@ CONTAINS
!>
!! \ingroup FH5F
!!
-!! \brief Gets number of the objects open within a file
+!! \brief Gets the number of open objects within a file.
!!
-!! \param file_id File identifier.
+!! \param file_id File identifier
!! \param obj_type Type of the object; possible values are:
!! \li H5F_OBJ_FILE_F
!! \li H5F_OBJ_DATASET_F
!! \li H5F_OBJ_GROUP_F
!! \li H5F_OBJ_DATATYPE_F
!! \li H5F_OBJ_ALL_F
-!! \param obj_count Number of open objects.
+!! \param obj_count Number of open objects
!! \param hdferr \fortran_error
!!
!! See C API: @ref H5Fget_obj_count()
!!
SUBROUTINE h5fget_obj_count_f(file_id, obj_type, obj_count, hdferr)
IMPLICIT NONE
- INTEGER(HID_T), INTENT(IN) :: file_id
- INTEGER, INTENT(IN) :: obj_type
+ INTEGER(HID_T) , INTENT(IN) :: file_id
+ INTEGER , INTENT(IN) :: obj_type
INTEGER(SIZE_T), INTENT(OUT) :: obj_count
- INTEGER, INTENT(OUT) :: hdferr
+ INTEGER , INTENT(OUT) :: hdferr
+
INTERFACE
- INTEGER FUNCTION h5fget_obj_count_c(file_id, obj_type, obj_count) BIND(C,NAME='h5fget_obj_count_c')
+ INTEGER(SIZE_T) FUNCTION H5Fget_obj_count(file_id, obj_type) BIND(C,NAME='H5Fget_obj_count')
+ IMPORT :: C_INT
IMPORT :: HID_T, SIZE_T
IMPLICIT NONE
- INTEGER(HID_T), INTENT(IN) :: file_id
- INTEGER, INTENT(IN) :: obj_type
- INTEGER(SIZE_T), INTENT(OUT) :: obj_count
- END FUNCTION h5fget_obj_count_c
+ INTEGER(HID_T), VALUE :: file_id
+ INTEGER(C_INT), VALUE :: obj_type
+ END FUNCTION H5Fget_obj_count
END INTERFACE
- hdferr = h5fget_obj_count_c(file_id, obj_type, obj_count)
+ obj_count = H5Fget_obj_count(file_id, INT(obj_type, C_INT))
+
+ hdferr = 0
+ IF(obj_count.LT.0) hdferr = -1
! Don't include objects created by H5open in the H5F_OBJ_ALL_F count
IF(file_id.EQ.INT(H5F_OBJ_ALL_F,HID_T))THEN
@@ -877,47 +885,51 @@ CONTAINS
!>
!! \ingroup FH5F
!!
-!! \brief Get list of open objects identifiers within a file
+!! \brief Gets a list of open object identifiers within a file.
!!
-!! \param file_id File identifier.
+!! \param file_id File identifier
!! \param obj_type Type of the object; possible values are:
!! \li H5F_OBJ_FILE_F
!! \li H5F_OBJ_DATASET_F
!! \li H5F_OBJ_GROUP_F
!! \li H5F_OBJ_DATATYPE_F
!! \li H5F_OBJ_ALL_F
-!! \param max_objs Maximum # of objects to retrieve.
-!! \param obj_ids Array of open object identifiers.
+!! \param max_objs Maximum # of objects to retrieve
+!! \param obj_ids Array of open object identifiers
!! \param hdferr \fortran_error
-!! \param num_objs Number of open objects.
+!! \param num_objs Number of open objects
!!
!! See C API: @ref H5Fget_obj_ids()
!!
SUBROUTINE h5fget_obj_ids_f(file_id, obj_type, max_objs, obj_ids, hdferr, num_objs)
IMPLICIT NONE
- INTEGER(HID_T), INTENT(IN) :: file_id
- INTEGER, INTENT(IN) :: obj_type
+ INTEGER(HID_T) , INTENT(IN) :: file_id
+ INTEGER , INTENT(IN) :: obj_type
INTEGER(SIZE_T), INTENT(IN) :: max_objs
- INTEGER(HID_T), DIMENSION(*), INTENT(INOUT) :: obj_ids
- INTEGER, INTENT(OUT) :: hdferr
+ INTEGER(HID_T) , DIMENSION(*), INTENT(INOUT) :: obj_ids
+ INTEGER , INTENT(OUT) :: hdferr
INTEGER(SIZE_T), INTENT(OUT), OPTIONAL :: num_objs
INTEGER(SIZE_T) :: c_num_objs ! Number of open objects of the specified type
INTERFACE
- INTEGER FUNCTION h5fget_obj_ids_c(file_id, obj_type, max_objs, obj_ids, c_num_objs) &
- BIND(C,NAME='h5fget_obj_ids_c')
+ INTEGER(SIZE_T) FUNCTION H5Fget_obj_ids(file_id, obj_type, max_objs, obj_ids) &
+ BIND(C,NAME='H5Fget_obj_ids')
+ IMPORT :: C_INT
IMPORT :: HID_T, SIZE_T
IMPLICIT NONE
- INTEGER(HID_T), INTENT(IN) :: file_id
- INTEGER, INTENT(IN) :: obj_type
- INTEGER(SIZE_T), INTENT(IN) :: max_objs
- INTEGER(HID_T), DIMENSION(*), INTENT(INOUT) :: obj_ids
- INTEGER(SIZE_T), INTENT(OUT) :: c_num_objs
- END FUNCTION h5fget_obj_ids_c
+ INTEGER(HID_T) , VALUE :: file_id
+ INTEGER(C_INT) , VALUE :: obj_type
+ INTEGER(SIZE_T), VALUE :: max_objs
+ INTEGER(HID_T) , DIMENSION(*) :: obj_ids
+ END FUNCTION H5Fget_obj_ids
END INTERFACE
- hdferr = h5fget_obj_ids_c(file_id, obj_type, max_objs, obj_ids, c_num_objs)
+ c_num_objs = H5Fget_obj_ids(file_id, INT(obj_type, C_INT), max_objs, obj_ids)
+
+ hdferr = 0
+ IF(c_num_objs.LT.0) hdferr = -1
+
IF (PRESENT(num_objs)) num_objs= c_num_objs
END SUBROUTINE h5fget_obj_ids_f
@@ -926,8 +938,8 @@ CONTAINS
!!
!! \brief Get amount of free space within a file.
!!
-!! \param file_id File identifier.
-!! \param free_space Amount of free space in file.
+!! \param file_id File identifier
+!! \param free_space Amount of free space in file
!! \param hdferr \fortran_error
!!
!! See C API: @ref H5Fget_freespace()
@@ -955,9 +967,9 @@ CONTAINS
!!
!! \brief Gets the name of the file from the object identifier.
!!
-!! \param obj_id Object identifier.
-!! \param buf Buffer to store the read name.
-!! \param size Actual size of the name.
+!! \param obj_id Object identifier
+!! \param buf Buffer to store the read name
+!! \param size Actual size of the name
!! \param hdferr \fortran_error
!!
!! See C API: @ref H5Fget_name()
@@ -990,8 +1002,8 @@ CONTAINS
!!
!! \brief Retrieves the file size of the HDF5 file.
!!
-!! \param file_id File identifier.
-!! \param size File size.
+!! \param file_id File identifier
+!! \param size File size
!! \param hdferr \fortran_error
!!
!! See C API: @ref H5Fget_filesize()
@@ -1018,8 +1030,8 @@ CONTAINS
!!
!! \brief Retrieves the file number of the HDF5 file.
!!
-!! \param file_id File identifier.
-!! \param fileno File number.
+!! \param file_id File identifier
+!! \param fileno File number
!! \param hdferr \fortran_error
!!
!! See C API: @ref H5Fget_fileno()
@@ -1046,11 +1058,11 @@ CONTAINS
!!
!! \brief Retrieves a copy of the image of an existing, open file.
!!
-!! \param file_id Target file identifier.
-!! \param buf_ptr Pointer to the buffer into which the image of the HDF5 file is to be copied.
-!! \param buf_len Size of the supplied buffer.
+!! \param file_id Target file identifier
+!! \param buf_ptr Pointer to the buffer into which the image of the HDF5 file is to be copied
+!! \param buf_len Size of the supplied buffer
!! \param hdferr \fortran_error
-!! \param buf_size Returns the size in bytes of the buffer required to store the file image, no data will be copied.
+!! \param buf_size Returns the size in bytes of the buffer required to store the file image, no data will be copied
!!
!! See C API: @ref H5Fget_file_image()
!!
@@ -1094,8 +1106,8 @@ CONTAINS
!! \brief Gets the value of the "minimize dataset headers" value which creates
!! smaller dataset object headers when its set and no attributes are present.
!!
-!! \param file_id Target file identifier.
-!! \param minimize Value of the setting.
+!! \param file_id Target file identifier
+!! \param minimize Value of the setting
!! \param hdferr \fortran_error
!!
!! See C API: @ref H5Fget_dset_no_attrs_hint()
@@ -1129,8 +1141,8 @@ CONTAINS
!! \brief Sets the value of the "minimize dataset headers" value which creates
!! smaller dataset object headers when its set and no attributes are present.
!!
-!! \param file_id Target file identifier.
-!! \param minimize Value of the setting.
+!! \param file_id Target file identifier
+!! \param minimize Value of the setting
!! \param hdferr \fortran_error
!!
!! See C API: @ref H5Fset_dset_no_attrs_hint()
@@ -1192,5 +1204,39 @@ CONTAINS
END SUBROUTINE H5Fget_info_f
+!>
+!! \ingroup FH5F
+!!
+!! \brief Determines the read/write or read-only status of a file.
+!!
+!! \param file_id File identifier
+!! \param intent Access mode flag as originally passed with H5Fopen_f()
+!! \param hdferr \fortran_error
+!!
+!! See C API: @ref H5Fget_intent()
+!!
+ SUBROUTINE h5fget_intent_f(file_id, intent, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: file_id
+ INTEGER, INTENT(OUT) :: intent
+ INTEGER, INTENT(OUT) :: hdferr
+
+ INTEGER(C_INT) :: c_intent
+
+ INTERFACE
+ INTEGER(C_INT) FUNCTION H5Fget_intent(file_id, intent) BIND(C,NAME='H5Fget_intent')
+ IMPORT :: C_INT
+ IMPORT :: HID_T
+ IMPLICIT NONE
+ INTEGER(HID_T), VALUE :: file_id
+ INTEGER(C_INT) :: intent
+ END FUNCTION H5Fget_intent
+ END INTERFACE
+
+ hdferr = INT(H5Fget_intent(file_id, c_intent))
+ intent = INT(c_intent)
+
+ END SUBROUTINE h5fget_intent_f
+
END MODULE H5F
diff --git a/fortran/src/H5Pff.F90 b/fortran/src/H5Pff.F90
index afd17f3..90a74f7 100644
--- a/fortran/src/H5Pff.F90
+++ b/fortran/src/H5Pff.F90
@@ -49,6 +49,12 @@ MODULE H5P
PRIVATE h5pget_integer, h5pget_char, h5pget_ptr
PRIVATE h5pregister_integer, h5pregister_ptr
PRIVATE h5pinsert_integer, h5pinsert_char, h5pinsert_ptr
+#ifdef H5_HAVE_PARALLEL
+ PRIVATE h5pset_fapl_mpio_f90, h5pget_fapl_mpio_f90
+#ifdef H5_HAVE_MPI_F08
+ PRIVATE h5pset_fapl_mpio_f08, h5pget_fapl_mpio_f08
+#endif
+#endif
#ifndef H5_DOXYGEN
@@ -101,7 +107,6 @@ MODULE H5P
MODULE PROCEDURE h5pinsert_ptr
END INTERFACE
-
INTERFACE
INTEGER(C_INT) FUNCTION H5Pset_fill_value(prp_id, type_id, fillvalue) &
BIND(C, NAME='H5Pset_fill_value')
@@ -182,6 +187,35 @@ MODULE H5P
#endif
#ifdef H5_HAVE_PARALLEL
+
+ INTERFACE h5pset_fapl_mpio_f
+ MODULE PROCEDURE h5pset_fapl_mpio_f90
+#ifdef H5_HAVE_MPI_F08
+ MODULE PROCEDURE h5pset_fapl_mpio_f08
+#endif
+ END INTERFACE
+
+ INTERFACE h5pget_fapl_mpio_f
+ MODULE PROCEDURE h5pget_fapl_mpio_f90
+#ifdef H5_HAVE_MPI_F08
+ MODULE PROCEDURE h5pget_fapl_mpio_f08
+#endif
+ END INTERFACE
+
+ INTERFACE H5Pset_mpi_params_f
+ MODULE PROCEDURE H5Pset_mpi_params_f90
+#ifdef H5_HAVE_MPI_F08
+ MODULE PROCEDURE H5Pset_mpi_params_f08
+#endif
+ END INTERFACE
+
+ INTERFACE H5Pget_mpi_params_f
+ MODULE PROCEDURE H5Pget_mpi_params_f90
+#ifdef H5_HAVE_MPI_F08
+ MODULE PROCEDURE H5Pget_mpi_params_f08
+#endif
+ END INTERFACE
+
#ifdef H5_HAVE_SUBFILING_VFD
!> \addtogroup FH5P
!> @{
@@ -5125,6 +5159,8 @@ SUBROUTINE h5pset_attr_phase_change_f(ocpl_id, max_compact, min_dense, hdferr)
! *********************************************************************
#ifdef H5_HAVE_PARALLEL
+
+#ifdef H5_DOXYGEN
!>
!! \ingroup FH5P
!!
@@ -5143,21 +5179,69 @@ SUBROUTINE h5pset_attr_phase_change_f(ocpl_id, max_compact, min_dense, hdferr)
INTEGER, INTENT(IN) :: comm
INTEGER, INTENT(IN) :: info
INTEGER, INTENT(OUT) :: hdferr
+ END SUBROUTINE h5pset_fapl_mpio_f
+!>
+!! \ingroup FH5P
+!!
+!! \brief Stores MPI I/O communicator information in the file access property list.
+!!
+!! \note Supports MPI Fortran module mpi_f08
+!!
+!! \param prp_id File access property list identifier.
+!! \param comm MPI-3 communicator.
+!! \param info MPI-3 info object.
+!! \param hdferr \fortran_error
+!!
+!! See C API: @ref H5Pset_fapl_mpio()
+!!
+ SUBROUTINE h5pset_fapl_mpio_f(prp_id, comm, info, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ TYPE(MPI_COMM), INTENT(IN) :: comm
+ TYPE(MPI_INFO), INTENT(IN) :: info
+ INTEGER, INTENT(OUT) :: hdferr
+ END SUBROUTINE h5pset_fapl_mpio_f
+
+#else
+
+ SUBROUTINE h5pset_fapl_mpio_f90(prp_id, comm, info, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ INTEGER, INTENT(IN) :: comm
+ INTEGER, INTENT(IN) :: info
+ INTEGER, INTENT(OUT) :: hdferr
INTERFACE
INTEGER FUNCTION h5pset_fapl_mpio_c(prp_id, comm, info) &
BIND(C,NAME='h5pset_fapl_mpio_c')
IMPORT :: HID_T
IMPLICIT NONE
- INTEGER(HID_T), INTENT(IN) :: prp_id
- INTEGER , INTENT(IN) :: comm
- INTEGER , INTENT(IN) :: info
+ INTEGER(HID_T) :: prp_id
+ INTEGER :: comm
+ INTEGER :: info
END FUNCTION h5pset_fapl_mpio_c
END INTERFACE
hdferr = h5pset_fapl_mpio_c(prp_id, comm, info)
- END SUBROUTINE h5pset_fapl_mpio_f
+ END SUBROUTINE h5pset_fapl_mpio_f90
+#ifdef H5_HAVE_MPI_F08
+ SUBROUTINE h5pset_fapl_mpio_f08(prp_id, comm, info, hdferr)
+ USE mpi_f08
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ TYPE(MPI_COMM), INTENT(IN) :: comm
+ TYPE(MPI_INFO), INTENT(IN) :: info
+ INTEGER, INTENT(OUT) :: hdferr
+
+ CALL h5pset_fapl_mpio_f90(prp_id, comm%mpi_val, info%mpi_val, hdferr)
+
+ END SUBROUTINE h5pset_fapl_mpio_f08
+#endif
+
+#endif
+
+#ifdef H5_DOXYGEN
!>
!! \ingroup FH5P
!!
@@ -5168,9 +5252,44 @@ SUBROUTINE h5pset_attr_phase_change_f(ocpl_id, max_compact, min_dense, hdferr)
!! \param info MPI-2 info object.
!! \param hdferr \fortran_error
!!
+!! \attention It is the responsibility of the application to free the MPI objects.
+!!
!! See C API: @ref H5Pget_fapl_mpio()
!!
- SUBROUTINE h5pget_fapl_mpio_f(prp_id, comm, info, hdferr)
+SUBROUTINE h5pget_fapl_mpio_f(prp_id, comm, info, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ INTEGER, INTENT(OUT) :: comm
+ INTEGER, INTENT(OUT) :: info
+ INTEGER, INTENT(OUT) :: hdferr
+END SUBROUTINE h5pget_fapl_mpio_f
+!>
+!! \ingroup FH5P
+!!
+!! \brief Returns MPI communicator information.
+!!
+!! \note Supports MPI Fortran module mpi_f08
+!!
+!! \param prp_id File access property list identifier.
+!! \param comm MPI-3 communicator.
+!! \param info MPI-3 info object.
+!! \param hdferr \fortran_error
+!!
+!! \attention It is the responsibility of the application to free the MPI objects.
+!!
+!! See C API: @ref H5Pget_fapl_mpio()
+!!
+SUBROUTINE h5pget_fapl_mpio_f(prp_id, comm, info, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ TYPE(MPI_COMM), INTENT(OUT) :: comm
+ TYPE(MPI_INFO), INTENT(OUT) :: info
+ INTEGER , INTENT(OUT) :: hdferr
+END SUBROUTINE h5pget_fapl_mpio_f
+
+#else
+
+ SUBROUTINE h5pget_fapl_mpio_f90(prp_id, comm, info, hdferr)
IMPLICIT NONE
INTEGER(HID_T), INTENT(IN) :: prp_id
INTEGER, INTENT(OUT) :: comm
@@ -5181,15 +5300,30 @@ SUBROUTINE h5pset_attr_phase_change_f(ocpl_id, max_compact, min_dense, hdferr)
BIND(C,NAME='h5pget_fapl_mpio_c')
IMPORT :: HID_T
IMPLICIT NONE
- INTEGER(HID_T), INTENT(IN) :: prp_id
- INTEGER , INTENT(OUT) :: comm
- INTEGER , INTENT(OUT) :: info
+ INTEGER(HID_T) :: prp_id
+ INTEGER :: comm
+ INTEGER :: info
END FUNCTION h5pget_fapl_mpio_c
END INTERFACE
hdferr = h5pget_fapl_mpio_c(prp_id, comm, info)
- END SUBROUTINE h5pget_fapl_mpio_f
+ END SUBROUTINE h5pget_fapl_mpio_f90
+
+#ifdef H5_HAVE_MPI_F08
+ SUBROUTINE h5pget_fapl_mpio_f08(prp_id, comm, info, hdferr)
+ USE mpi_f08
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ TYPE(MPI_COMM), INTENT(OUT) :: comm
+ TYPE(MPI_INFO), INTENT(OUT) :: info
+ INTEGER, INTENT(OUT) :: hdferr
+
+ CALL h5pget_fapl_mpio_f90(prp_id, comm%mpi_val, info%mpi_val, hdferr)
+
+ END SUBROUTINE h5pget_fapl_mpio_f08
+#endif
+#endif
#ifdef H5_HAVE_SUBFILING_VFD
!>
@@ -5376,14 +5510,15 @@ SUBROUTINE h5pset_attr_phase_change_f(ocpl_id, max_compact, min_dense, hdferr)
END SUBROUTINE h5pget_mpio_no_collective_cause_f
+#ifdef H5_DOXYGEN
!>
!! \ingroup FH5P
!!
-!! \brief Set the MPI communicator and info.
+!! \brief Set the MPI communicator and information.
!!
!! \param prp_id File access property list identifier.
-!! \param comm The MPI communicator.
-!! \param info The MPI info object.
+!! \param comm MPI-2 communicator.
+!! \param info MPI-2 info object.
!! \param hdferr \fortran_error
!!
!! See C API: @ref H5Pset_mpi_params()
@@ -5394,6 +5529,37 @@ SUBROUTINE h5pset_attr_phase_change_f(ocpl_id, max_compact, min_dense, hdferr)
INTEGER , INTENT(IN) :: comm
INTEGER , INTENT(IN) :: info
INTEGER , INTENT(OUT) :: hdferr
+ END SUBROUTINE H5Pset_mpi_params_f
+!>
+!! \ingroup FH5P
+!!
+!! \brief Set the MPI communicator and information.
+!!
+!! \note Supports MPI Fortran module mpi_f08
+!!
+!! \param prp_id File access property list identifier.
+!! \param comm MPI-3 communicator.
+!! \param info MPI-3 info object.
+!! \param hdferr \fortran_error
+!!
+!! See C API: @ref H5Pset_mpi_params()
+!!
+ SUBROUTINE H5Pset_mpi_params_f(prp_id, comm, info, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ TYPE(MPI_COMM), INTENT(IN) :: comm
+ TYPE(MPI_INFO), INTENT(IN) :: info
+ INTEGER , INTENT(OUT) :: hdferr
+ END SUBROUTINE H5Pset_mpi_params_f
+
+#else
+
+ SUBROUTINE H5Pset_mpi_params_f90(prp_id, comm, info, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ INTEGER , INTENT(IN) :: comm
+ INTEGER , INTENT(IN) :: info
+ INTEGER , INTENT(OUT) :: hdferr
INTERFACE
INTEGER FUNCTION h5pset_mpi_params_c(prp_id, comm, info) &
@@ -5408,16 +5574,33 @@ SUBROUTINE h5pset_attr_phase_change_f(ocpl_id, max_compact, min_dense, hdferr)
hdferr = H5Pset_mpi_params_c(prp_id, comm, info)
- END SUBROUTINE H5Pset_mpi_params_f
+ END SUBROUTINE H5Pset_mpi_params_f90
+
+#ifdef H5_HAVE_MPI_F08
+ SUBROUTINE H5Pset_mpi_params_f08(prp_id, comm, info, hdferr)
+ USE mpi_f08
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ TYPE(MPI_COMM), INTENT(IN) :: comm
+ TYPE(MPI_INFO), INTENT(IN) :: info
+ INTEGER , INTENT(OUT) :: hdferr
+
+ CALL H5Pset_mpi_params_f90(prp_id, comm%mpi_val, info%mpi_val, hdferr)
+
+ END SUBROUTINE H5Pset_mpi_params_f08
+#endif
+
+#endif
+#ifdef H5_DOXYGEN
!>
!! \ingroup FH5P
!!
!! \brief Get the MPI communicator and info.
!!
!! \param prp_id File access property list identifier.
-!! \param comm The MPI communicator.
-!! \param info The MPI info object.
+!! \param comm MPI-2 communicator.
+!! \param info MPI-2 info object.
!! \param hdferr \fortran_error
!!
!! See C API: @ref H5Pget_mpi_params()
@@ -5428,6 +5611,39 @@ SUBROUTINE h5pset_attr_phase_change_f(ocpl_id, max_compact, min_dense, hdferr)
INTEGER , INTENT(OUT) :: comm
INTEGER , INTENT(OUT) :: info
INTEGER , INTENT(OUT) :: hdferr
+ END SUBROUTINE H5Pget_mpi_params_f
+!>
+!! \ingroup FH5P
+!!
+!! \brief Get the MPI communicator and information.
+!!
+!! \note Supports MPI Fortran module mpi_f08
+!!
+!! \param prp_id File access property list identifier.
+!! \param comm MPI-3 communicator.
+!! \param info MPI-3 info object.
+!! \param hdferr \fortran_error
+!!
+!! \attention It is the responsibility of the application to free the MPI objects.
+!!
+!! See C API: @ref H5Pget_mpi_params()
+!!
+ SUBROUTINE H5Pget_mpi_params_f(prp_id, comm, info, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ TYPE(MPI_COMM), INTENT(OUT) :: comm
+ TYPE(MPI_INFO), INTENT(OUT) :: info
+ INTEGER , INTENT(OUT) :: hdferr
+ END SUBROUTINE H5Pget_mpi_params_f
+
+#else
+
+ SUBROUTINE H5Pget_mpi_params_f90(prp_id, comm, info, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ INTEGER , INTENT(OUT) :: comm
+ INTEGER , INTENT(OUT) :: info
+ INTEGER , INTENT(OUT) :: hdferr
INTERFACE
INTEGER FUNCTION h5pget_mpi_params_c(prp_id, comm, info) &
@@ -5442,7 +5658,23 @@ SUBROUTINE h5pset_attr_phase_change_f(ocpl_id, max_compact, min_dense, hdferr)
hdferr = H5Pget_mpi_params_c(prp_id, comm, info)
- END SUBROUTINE H5Pget_mpi_params_f
+ END SUBROUTINE H5Pget_mpi_params_f90
+
+#ifdef H5_HAVE_MPI_F08
+ SUBROUTINE H5Pget_mpi_params_f08(prp_id, comm, info, hdferr)
+ USE mpi_f08
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: prp_id
+ TYPE(MPI_COMM), INTENT(OUT) :: comm
+ TYPE(MPI_INFO), INTENT(OUT) :: info
+ INTEGER , INTENT(OUT) :: hdferr
+
+ CALL H5Pget_mpi_params_f90(prp_id, comm%mpi_val, info%mpi_val, hdferr)
+
+ END SUBROUTINE H5Pget_mpi_params_f08
+#endif
+
+#endif
!>
!! \ingroup FH5P
diff --git a/fortran/src/H5Sff.F90 b/fortran/src/H5Sff.F90
index e734c03..5f2f1d2 100644
--- a/fortran/src/H5Sff.F90
+++ b/fortran/src/H5Sff.F90
@@ -1445,4 +1445,148 @@ CONTAINS
END SUBROUTINE H5Sis_regular_hyperslab_f
+!>
+!! \ingroup FH5S
+!!
+!! \brief Closes a dataspace selection iterator.
+!!
+!! \param sel_iter_id Dataspace selection iterator identifier
+!! \param hdferr \fortran_error
+!!
+!! See C API: @ref H5Ssel_iter_close()
+!!
+ SUBROUTINE h5ssel_iter_close_f(sel_iter_id, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: sel_iter_id
+ INTEGER, INTENT(OUT) :: hdferr
+ INTERFACE
+ INTEGER(C_INT) FUNCTION H5Ssel_iter_close(sel_iter_id) &
+ BIND(C,NAME='H5Ssel_iter_close')
+ IMPORT :: HID_T, C_INT
+ IMPLICIT NONE
+ INTEGER(HID_T), VALUE :: sel_iter_id
+ END FUNCTION H5Ssel_iter_close
+ END INTERFACE
+
+ hdferr = INT(h5ssel_iter_close(sel_iter_id), C_INT)
+
+ END SUBROUTINE h5ssel_iter_close_f
+
+!>
+!! \ingroup FH5S
+!!
+!! \brief Creates a dataspace selection iterator for a dataspace's selection.
+!!
+!! \param space_id Dataspace identifier
+!! \param elmt_size Size of element in the selection
+!! \param flags Selection iterator flag, valid values are:
+!! \li H5S_SEL_ITER_GET_SEQ_LIST_SORTED_F, ref. @ref H5S_SEL_ITER_GET_SEQ_LIST_SORTED
+!! \li H5S_SEL_ITER_SHARE_WITH_DATASPACE_F, ref. @ref H5S_SEL_ITER_SHARE_WITH_DATASPACE
+!! \param ds_iter_id Dataspace selection iterator identifier
+!! \param hdferr \fortran_error
+!!
+!! See C API: @ref H5Ssel_iter_create()
+!!
+ SUBROUTINE h5ssel_iter_create_f(space_id, elmt_size, flags, ds_iter_id, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T) , INTENT(IN) :: space_id
+ INTEGER(SIZE_T), INTENT(IN) :: elmt_size
+ INTEGER , INTENT(IN) :: flags
+ INTEGER(HID_T) , INTENT(OUT) :: ds_iter_id
+ INTEGER , INTENT(OUT) :: hdferr
+ INTERFACE
+ INTEGER(HID_T) FUNCTION H5Ssel_iter_create(space_id, elmt_size, flags) &
+ BIND(C,NAME='H5Ssel_iter_create')
+ IMPORT :: HID_T, C_INT, SIZE_T
+ IMPLICIT NONE
+ INTEGER(HID_T) , VALUE :: space_id
+ INTEGER(SIZE_T), VALUE :: elmt_size
+ INTEGER(C_INT) , VALUE :: flags
+ END FUNCTION H5Ssel_iter_create
+ END INTERFACE
+
+ ds_iter_id = H5Ssel_iter_create(space_id, elmt_size, INT(flags, C_INT))
+
+ hdferr = 0
+ IF(ds_iter_id.LT.0) hdferr = -1
+
+ END SUBROUTINE h5ssel_iter_create_f
+
+!>
+!! \ingroup FH5S
+!!
+!! \brief Retrieves a list of offset / length sequences for the elements in an iterator.
+!!
+!! \param sel_iter_id Dataspace selection iterator identifier
+!! \param maxseq Maximum number of sequences to retrieve
+!! \param maxbytes Maximum number of bytes to retrieve in sequences
+!! \param nseq Number of sequences retrieved
+!! \param nbytes Number of bytes retrieved, in all sequences
+!! \param off Array of sequence offsets
+!! \param len Array of sequence lengths
+!! \param hdferr \fortran_error
+!!
+!! See C API: @ref H5Ssel_iter_get_seq_list()
+!!
+ SUBROUTINE h5ssel_iter_get_seq_list_f(sel_iter_id, maxseq, maxbytes, nseq, nbytes, off, len, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T) , INTENT(IN) :: sel_iter_id
+ INTEGER(SIZE_T), INTENT(IN) :: maxseq
+ INTEGER(SIZE_T), INTENT(IN) :: maxbytes
+ INTEGER(SIZE_T), INTENT(OUT) :: nseq
+ INTEGER(SIZE_T), INTENT(OUT) :: nbytes
+ INTEGER(HSIZE_T), DIMENSION(*), INTENT(OUT) :: off
+ INTEGER(SIZE_T), DIMENSION(*), INTENT(OUT) :: len
+ INTEGER, INTENT(OUT) :: hdferr
+
+ INTERFACE
+ INTEGER(C_INT) FUNCTION H5Ssel_iter_get_seq_list(sel_iter_id, maxseq, maxbytes, nseq, nbytes, off, len) &
+ BIND(C,NAME='H5Ssel_iter_get_seq_list')
+ IMPORT :: HID_T, C_INT, SIZE_T, HSIZE_T
+ IMPLICIT NONE
+ INTEGER(HID_T) , VALUE :: sel_iter_id
+ INTEGER(SIZE_T), VALUE :: maxseq
+ INTEGER(SIZE_T), VALUE :: maxbytes
+ INTEGER(SIZE_T) :: nseq
+ INTEGER(SIZE_T) :: nbytes
+ INTEGER(HSIZE_T), DIMENSION(*) :: off
+ INTEGER(SIZE_T), DIMENSION(*) :: len
+ END FUNCTION H5Ssel_iter_get_seq_list
+ END INTERFACE
+
+ hdferr = INT(H5Ssel_iter_get_seq_list(sel_iter_id, maxseq, maxbytes, nseq, nbytes, off, len), C_INT)
+
+ END SUBROUTINE h5ssel_iter_get_seq_list_f
+
+!>
+!! \ingroup FH5S
+!!
+!! \brief Resets a dataspace selection iterator back to an initial state.
+!!
+!! \param sel_iter_id Identifier of the dataspace selection iterator to reset
+!! \param space_id Identifier of the dataspace with selection to iterate over
+!! \param hdferr \fortran_error
+!!
+!! See C API: @ref H5Ssel_iter_reset()
+!!
+ SUBROUTINE h5ssel_iter_reset_f(sel_iter_id, space_id, hdferr)
+ IMPLICIT NONE
+ INTEGER(HID_T), INTENT(IN) :: sel_iter_id
+ INTEGER(HID_T), INTENT(IN) :: space_id
+ INTEGER, INTENT(OUT) :: hdferr
+ INTERFACE
+ INTEGER(C_INT) FUNCTION H5Ssel_iter_reset(sel_iter_id, space_id) &
+ BIND(C,NAME='H5Ssel_iter_reset')
+ IMPORT :: HID_T, C_INT
+ IMPLICIT NONE
+ INTEGER(HID_T), VALUE :: sel_iter_id
+ INTEGER(HID_T), VALUE :: space_id
+ END FUNCTION H5Ssel_iter_reset
+ END INTERFACE
+
+ hdferr = INT(h5ssel_iter_reset(sel_iter_id, space_id), C_INT)
+
+ END SUBROUTINE h5ssel_iter_reset_f
+
+
END MODULE H5S
diff --git a/fortran/src/H5_f.c b/fortran/src/H5_f.c
index b1dc7db..e6f7b6d 100644
--- a/fortran/src/H5_f.c
+++ b/fortran/src/H5_f.c
@@ -798,6 +798,9 @@ h5init_flags_c(int_f *h5d_flags, size_t_f *h5d_size_flags, int_f *h5e_flags, hid
h5s_flags[16] = (int_f)H5S_SEL_HYPERSLABS;
h5s_flags[17] = (int_f)H5S_SEL_ALL;
+ h5s_flags[18] = (int_f)H5S_SEL_ITER_GET_SEQ_LIST_SORTED;
+ h5s_flags[19] = (int_f)H5S_SEL_ITER_SHARE_WITH_DATASPACE;
+
/*
* H5T flags
*/
diff --git a/fortran/src/H5_ff.F90 b/fortran/src/H5_ff.F90
index fe6337a..e83768a 100644
--- a/fortran/src/H5_ff.F90
+++ b/fortran/src/H5_ff.F90
@@ -135,7 +135,7 @@ MODULE H5LIB
!
! H5S flags declaration
!
- INTEGER, PARAMETER :: H5S_FLAGS_LEN = 18
+ INTEGER, PARAMETER :: H5S_FLAGS_LEN = 20
INTEGER, DIMENSION(1:H5S_FLAGS_LEN) :: H5S_flags
INTEGER, PARAMETER :: H5S_HSIZE_FLAGS_LEN = 1
INTEGER(HSIZE_T), DIMENSION(1:H5S_HSIZE_FLAGS_LEN) :: H5S_hsize_flags
@@ -656,6 +656,8 @@ CONTAINS
H5S_SEL_POINTS_F = H5S_flags(16)
H5S_SEL_HYPERSLABS_F = H5S_flags(17)
H5S_SEL_ALL_F = H5S_flags(18)
+ H5S_SEL_ITER_GET_SEQ_LIST_SORTED_F = H5S_flags(19)
+ H5S_SEL_ITER_SHARE_WITH_DATASPACE_F = H5S_flags(20)
!
! H5T flags declaration
!
diff --git a/fortran/src/H5config_f.inc.cmake b/fortran/src/H5config_f.inc.cmake
index 77ff707..e6fa7b9 100644
--- a/fortran/src/H5config_f.inc.cmake
+++ b/fortran/src/H5config_f.inc.cmake
@@ -19,6 +19,14 @@
#define H5_HAVE_PARALLEL
#endif
+! Define if MPI supports mpi_f08 module
+#cmakedefine01 CMAKE_H5_HAVE_MPI_F08
+#if CMAKE_H5_HAVE_MPI_F08 == 0
+#undef H5_HAVE_MPI_F08
+#else
+#define H5_HAVE_MPI_F08
+#endif
+
! Define if there is subfiling support
#cmakedefine01 CMAKE_H5_HAVE_SUBFILING_VFD
#if CMAKE_H5_HAVE_SUBFILING_VFD == 0
diff --git a/fortran/src/H5config_f.inc.in b/fortran/src/H5config_f.inc.in
index afcfa6e..7f52255 100644
--- a/fortran/src/H5config_f.inc.in
+++ b/fortran/src/H5config_f.inc.in
@@ -17,6 +17,9 @@
! Define if we have parallel support
#undef HAVE_PARALLEL
+! Define if MPI supports mpi_f08 module
+#undef HAVE_MPI_F08
+
! Define if we have subfiling support
#undef HAVE_SUBFILING_VFD
diff --git a/fortran/src/H5f90global.F90 b/fortran/src/H5f90global.F90
index fb25f7e..5b4fc64 100644
--- a/fortran/src/H5f90global.F90
+++ b/fortran/src/H5f90global.F90
@@ -793,6 +793,8 @@ MODULE H5GLOBAL
!DEC$ATTRIBUTES DLLEXPORT :: H5S_SEL_POINTS_F
!DEC$ATTRIBUTES DLLEXPORT :: H5S_SEL_HYPERSLABS_F
!DEC$ATTRIBUTES DLLEXPORT :: H5S_SEL_ALL_F
+ !DEC$ATTRIBUTES DLLEXPORT :: H5S_SEL_ITER_GET_SEQ_LIST_SORTED_F
+ !DEC$ATTRIBUTES DLLEXPORT :: H5S_SEL_ITER_SHARE_WITH_DATASPACE_F
!DEC$endif
!> \addtogroup FH5S
!> @{
@@ -822,6 +824,9 @@ MODULE H5GLOBAL
INTEGER :: H5S_SEL_POINTS_F !< H5S_SEL_POINTS
INTEGER :: H5S_SEL_HYPERSLABS_F !< H5S_SEL_HYPERSLABS
INTEGER :: H5S_SEL_ALL_F !< H5S_SEL_ALL
+
+ INTEGER :: H5S_SEL_ITER_GET_SEQ_LIST_SORTED_F !< H5S_SEL_ITER_GET_SEQ_LIST_SORTED
+ INTEGER :: H5S_SEL_ITER_SHARE_WITH_DATASPACE_F !< H5S_SEL_ITER_SHARE_WITH_DATASPACE
!> @}
!
! H5T flags declaration
diff --git a/fortran/src/h5fc.in b/fortran/src/h5fc.in
index c5da815..6d7329a 100644
--- a/fortran/src/h5fc.in
+++ b/fortran/src/h5fc.in
@@ -60,7 +60,7 @@ host_os="@host_os@"
prog_name="`basename $0`"
-allargs=""
+misc_args=""
compile_args=""
link_args=""
link_objs=""
@@ -176,7 +176,6 @@ for arg in $@ ; do
case "$arg" in
-c)
- allargs="$allargs $arg"
compile_args="$compile_args $arg"
if test "x$do_link" = "xyes" -a -n "$output_file"; then
@@ -187,7 +186,6 @@ for arg in $@ ; do
dash_c="yes"
;;
-o)
- allargs="$allargs $arg"
dash_o="yes"
if test "x$dash_c" = "xyes"; then
@@ -199,14 +197,12 @@ for arg in $@ ; do
fi
;;
-E|-M|-MT)
- allargs="$allargs $arg"
compile_args="$compile_args $arg"
dash_c="yes"
do_link="no"
;;
-l*)
link_args="$link_args $arg"
- allargs="$allargs $arg"
;;
-prefix=*)
prefix="`expr "$arg" : '-prefix=\(.*\)'`"
@@ -238,14 +234,14 @@ for arg in $@ ; do
;;
*\"*)
qarg="'"$arg"'"
- allargs="$allargs $qarg"
+ misc_args="$misc_args $qarg"
;;
*\'*)
- qarg='\"'"$arg"'\"'
- allargs="$allargs $qarg"
+ qarg='"'"$arg"'"'
+ misc_args="$misc_args $qarg"
;;
- *) allargs="$allargs $arg"
+ *) misc_args="$misc_args $arg"
if [ -s "$arg" ] ; then
ext=`expr "$arg" : '.*\(\..*\)'`
if [ "$ext" = ".f" -o "$ext" = ".F" -o \
@@ -293,7 +289,7 @@ if test "x$do_compile" = "xyes"; then
fi
- $SHOW $FC $H5BLD_FCFLAGS $FCFLAGS ${F9XSUFFIXFLAG} ${fmodules} $compile_args
+ $SHOW $FC $H5BLD_FCFLAGS $FCFLAGS ${F9XSUFFIXFLAG} ${fmodules} $misc_args $compile_args
status=$?
if test "$status" != "0"; then
diff --git a/fortran/src/hdf5_fortrandll.def.in b/fortran/src/hdf5_fortrandll.def.in
index 119e140..e29488f 100644
--- a/fortran/src/hdf5_fortrandll.def.in
+++ b/fortran/src/hdf5_fortrandll.def.in
@@ -147,6 +147,7 @@ H5F_mp_H5FGET_FILE_IMAGE_F
H5F_mp_H5FGET_DSET_NO_ATTRS_HINT_F
H5F_mp_H5FSET_DSET_NO_ATTRS_HINT_F
H5F_mp_H5FGET_INFO_F
+H5F_mp_H5FGET_INTENT_F
; H5G
H5G_mp_H5GOPEN_F
H5G_mp_H5GOPEN_ASYNC_F
@@ -420,14 +421,18 @@ H5P_mp_H5PSET_FILE_SPACE_PAGE_SIZE_F
H5P_mp_H5PGET_FILE_SPACE_PAGE_SIZE_F
H5P_mp_H5PGET_ACTUAL_SELECTION_IO_MODE_F
; Parallel
-@H5_NOPAREXP@H5P_mp_H5PSET_FAPL_MPIO_F
-@H5_NOPAREXP@H5P_mp_H5PGET_FAPL_MPIO_F
+@H5_NOPAREXP@H5P_mp_H5PSET_FAPL_MPIO_F90
+@H5_NOPAREXP@@H5_NOMPI_F08@H5P_mp_H5PSET_FAPL_MPIO_F08
+@H5_NOPAREXP@H5P_mp_H5PGET_FAPL_MPIO_F90
+@H5_NOPAREXP@@H5_NOMPI_F08@H5P_mp_H5PGET_FAPL_MPIO_F08
@H5_NOPAREXP@@H5_NOSUBFILING@H5P_mp_H5PSET_FAPL_SUBFILING_F
@H5_NOPAREXP@@H5_NOSUBFILING@H5P_mp_H5PGET_FAPL_SUBFILING_F
@H5_NOPAREXP@@H5_NOSUBFILING@H5P_mp_H5PSET_FAPL_IOC_F
@H5_NOPAREXP@@H5_NOSUBFILING@H5P_mp_H5PGET_FAPL_IOC_F
-@H5_NOPAREXP@H5P_mp_H5PSET_MPI_PARAMS_F
-@H5_NOPAREXP@H5P_mp_H5PGET_MPI_PARAMS_F
+@H5_NOPAREXP@H5P_mp_H5PSET_MPI_PARAMS_F90
+@H5_NOPAREXP@@H5_NOMPI_F08@H5P_mp_H5PSET_MPI_PARAMS_F08
+@H5_NOPAREXP@H5P_mp_H5PGET_MPI_PARAMS_F90
+@H5_NOPAREXP@@H5_NOMPI_F08@H5P_mp_H5PGET_MPI_PARAMS_F08
@H5_NOPAREXP@H5P_mp_H5PSET_DXPL_MPIO_F
@H5_NOPAREXP@H5P_mp_H5PGET_DXPL_MPIO_F
@H5_NOPAREXP@H5P_mp_H5PGET_MPIO_ACTUAL_IO_MODE_F
@@ -483,6 +488,10 @@ H5S_mp_H5SENCODE_F
H5S_mp_H5SEXTENT_EQUAL_F
H5S_mp_H5SGET_REGULAR_HYPERSLAB_F
H5S_mp_H5SIS_REGULAR_HYPERSLAB_F
+H5S_mp_H5SSEL_ITER_CREATE_F
+H5S_mp_H5SSEL_ITER_GET_SEQ_LIST_F
+H5S_mp_H5SSEL_ITER_CLOSE_F
+H5S_mp_H5SSEL_ITER_RESET_F
; H5T
H5T_mp_H5TOPEN_F
H5T_mp_H5TCOMMIT_F
diff --git a/fortran/test/fortranlib_test.F90 b/fortran/test/fortranlib_test.F90
index e0a837a..05ae593 100644
--- a/fortran/test/fortranlib_test.F90
+++ b/fortran/test/fortranlib_test.F90
@@ -158,9 +158,8 @@ PROGRAM fortranlibtest
CALL test_basic_select(cleanup, ret_total_error)
CALL write_test_status(ret_total_error, ' Basic selection test', total_error)
-
ret_total_error = 0
- CALL test_select_hyperslab( cleanup, ret_total_error)
+ CALL test_select_hyperslab(cleanup, ret_total_error)
CALL write_test_status(ret_total_error, ' Hyperslab selection test', total_error)
ret_total_error = 0
@@ -179,6 +178,11 @@ PROGRAM fortranlibtest
CALL test_select_bounds(ret_total_error)
CALL write_test_status(ret_total_error, ' Selection bounds test ', total_error)
+ ret_total_error = 0
+ CALL test_select_iter(cleanup, ret_total_error)
+ CALL write_test_status(ret_total_error, ' Dataspace selection iterators test', total_error)
+
+
!
! '========================================='
! 'Testing DATATYPE interface '
diff --git a/fortran/test/tH5F.F90 b/fortran/test/tH5F.F90
index 7f9490b..569d459 100644
--- a/fortran/test/tH5F.F90
+++ b/fortran/test/tH5F.F90
@@ -197,6 +197,7 @@ CONTAINS
!flag to check operation success
!
INTEGER :: error
+ INTEGER :: fintent
!
!general purpose integer
@@ -215,8 +216,8 @@ CONTAINS
!data buffers
!
INTEGER, DIMENSION(NX,NY) :: data_in, data_out
-
INTEGER(HSIZE_T), DIMENSION(2) :: data_dims
+
filename1 = "mount1"
filename2 = "mount2"
@@ -377,6 +378,13 @@ CONTAINS
CALL h5fopen_f (fix_filename1, H5F_ACC_RDWR_F, file1_id, error)
 CALL check("h5fopen_f",error,total_error)
+ CALL h5fget_intent_f(file1_id, fintent, error)
+ CALL check("h5fget_intent_f",error,total_error)
+
+ IF(fintent.NE.H5F_ACC_RDWR_F)THEN
+ total_error = total_error + 1
+ ENDIF
+
CALL h5fget_obj_count_f(INT(H5F_OBJ_ALL_F,HID_T), H5F_OBJ_ALL_F, obj_count, error)
CALL check(" h5fget_obj_count_f",error,total_error)
@@ -389,7 +397,6 @@ CONTAINS
CALL h5fget_obj_count_f(INT(H5F_OBJ_ALL_F,HID_T), H5F_OBJ_ALL_F, obj_count, error)
CALL check(" h5fget_obj_count_f",error,total_error)
-
IF(obj_count.NE.2)THEN
total_error = total_error + 1
ENDIF
@@ -1038,7 +1045,7 @@ CONTAINS
total_error = total_error + 1
write(*,*) " Open with H5F_CLOSE_STRONG should fail "
endif
-
+
CALL h5fget_obj_count_f(fid1, H5F_OBJ_ALL_F, obj_count, error)
CALL check("h5fget_obj_count_f",error,total_error)
if(error .eq.0 .and. obj_count .ne. 3) then
diff --git a/fortran/test/tH5Sselect.F90 b/fortran/test/tH5Sselect.F90
index b6d28d3..bcf254a 100644
--- a/fortran/test/tH5Sselect.F90
+++ b/fortran/test/tH5Sselect.F90
@@ -314,7 +314,146 @@ CONTAINS
END SUBROUTINE test_select_hyperslab
!
- !Subroutine to test element selection
+ ! Subroutine to test selection iterations
+ !
+
+ SUBROUTINE test_select_iter(cleanup, total_error)
+
+ IMPLICIT NONE
+ LOGICAL, INTENT(IN) :: cleanup
+ INTEGER, INTENT(INOUT) :: total_error
+
+ INTEGER, PARAMETER :: POINT1_NPOINTS = 10
+ INTEGER(SIZE_T), PARAMETER :: SEL_ITER_MAX_SEQ = 256 ! Information for testing selection iterators
+ INTEGER, PARAMETER :: rank = 2
+ INTEGER(SIZE_T), PARAMETER :: NUMP = 4
+
+ INTEGER(hsize_t), DIMENSION(2) :: dims1 = (/12, 6/) ! 2-D Dataspace dimensions
+ INTEGER(HID_T) :: sid ! Dataspace ID
+ INTEGER(HID_T) :: iter_id ! Dataspace selection iterator ID
+ INTEGER(HSIZE_T), DIMENSION(rank, POINT1_NPOINTS) :: coord1 ! Coordinates for point selection
+ INTEGER(HSIZE_T), DIMENSION(2) :: start ! Hyperslab start
+ INTEGER(HSIZE_T), DIMENSION(2) :: stride ! Hyperslab stride
+ INTEGER(HSIZE_T), DIMENSION(2) :: count ! Hyperslab block count
+ INTEGER(HSIZE_T), DIMENSION(2) :: BLOCK ! Hyperslab block size
+ INTEGER(SIZE_T) :: nseq ! # of sequences retrieved
+ INTEGER(SIZE_T) :: nbytes ! # of bytes retrieved
+ INTEGER(HSIZE_T), DIMENSION(SEL_ITER_MAX_SEQ) :: off ! Offsets for retrieved sequences
+ INTEGER(SIZE_T), DIMENSION(SEL_ITER_MAX_SEQ) :: ilen ! Lengths for retrieved sequences
+ INTEGER :: sel_type ! Selection type
+ INTEGER :: error ! Error return value
+ INTEGER(SIZE_T) :: i
+
+ ! Create dataspace
+ CALL H5Screate_simple_f(2, dims1, sid, error)
+ CALL check("H5Screate_simple_f", error, total_error)
+
+ ! Test iterators on various basic selection types
+ DO sel_type = H5S_SEL_NONE_F, H5S_SEL_ALL_F
+ IF(sel_type .EQ. H5S_SEL_NONE_F)THEN ! "None" selection
+ CALL H5Sselect_none_f(sid, error)
+ CALL check("H5Sselect_none_f", error, total_error)
+ ELSE IF(sel_type.EQ.H5S_SEL_POINTS_F)THEN ! Point selection
+ ! Select sequence of four points
+ coord1(1, 1) = 1
+ coord1(2, 1) = 2
+ coord1(1, 2) = 3
+ coord1(2, 2) = 4
+ coord1(1, 3) = 5
+ coord1(2, 3) = 6
+ coord1(1, 4) = 7
+ coord1(2, 4) = 8
+ CALL H5Sselect_elements_f(sid, H5S_SELECT_SET_F, rank, NUMP, coord1, error)
+ CALL check("H5Sselect_elements_f", error, total_error)
+ ELSE IF(sel_type.EQ.H5S_SEL_HYPERSLABS_F)THEN ! Hyperslab selection
+ ! Select regular hyperslab
+ start(1) = 0
+ start(2) = 0
+ stride(1) = 1
+ stride(2) = 1
+ COUNT(1) = 4
+ COUNT(2) = 4
+ BLOCK(1) = 1
+ BLOCK(2) = 1
+ CALL H5Sselect_hyperslab_f(sid, H5S_SELECT_SET_F, start, count, error, stride=stride, BLOCK=BLOCK)
+ CALL check("H5Sselect_hyperslab_f", error, total_error)
+ ELSE IF(sel_type.EQ.H5S_SEL_ALL_F)THEN ! "All" selection
+ CALL H5Sselect_all_f(sid, error)
+ CALL check("H5Sselect_all_f", error, total_error)
+ ELSE
+ CALL check("Incorrect selection option", error, total_error)
+ ENDIF
+
+ ! Create selection iterator object
+ CALL H5Ssel_iter_create_f(sid, 1_size_t, H5S_SEL_ITER_SHARE_WITH_DATASPACE_F, iter_id, error)
+ CALL check("H5Ssel_iter_create_f", error, total_error)
+
+ ! Try retrieving all sequences
+ off = -99
+ ilen = -99
+ CALL H5Ssel_iter_get_seq_list_f(iter_id, SEL_ITER_MAX_SEQ, 1024_size_t * 1024_size_t, nseq, nbytes, off, ilen, error)
+ CALL check("H5Ssel_iter_get_seq_list_f", error, total_error)
+
+ ! Check results from retrieving sequence list
+
+ IF (sel_type .EQ. H5S_SEL_NONE_F)THEN ! "None" selection
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", nseq, INT(0,SIZE_T), total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", nbytes, INT(0,SIZE_T), total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", off(1), INT(-99,HSIZE_T), total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", ilen(1), INT(-99,SIZE_T), total_error)
+ ELSE IF (sel_type .EQ. H5S_SEL_POINTS_F)THEN ! Point selection
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", nseq, 4_SIZE_T, total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", nbytes, 4_SIZE_T, total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", off(NUMP+1), INT(-99,HSIZE_T), total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", ilen(NUMP+1), INT(-99,SIZE_T), total_error)
+ DO i = 1, NUMP
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", off(i), INT((i-1)*26+12,HSIZE_T), total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", ilen(i), INT(1,SIZE_T), total_error)
+ ENDDO
+ ELSE IF (sel_type .eq. H5S_SEL_HYPERSLABS_F)THEN ! Hyperslab selection
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", nseq, 4_SIZE_T, total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", nbytes, 16_SIZE_T, total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", off(NUMP+1), INT(-99,HSIZE_T), total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", ilen(NUMP+1), INT(-99,SIZE_T), total_error)
+ DO i = 1, NUMP
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", off(i), INT((i-1)*12,HSIZE_T), total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", ilen(i), INT(4,SIZE_T), total_error)
+ ENDDO
+ ELSE IF (sel_type.EQ.H5S_SEL_ALL_F)THEN ! "All" selection
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", nseq, 1_SIZE_T, total_error )
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", nbytes, 72_SIZE_T, total_error )
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", off(1), INT(0,HSIZE_T), total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", ilen(1), INT(72,SIZE_T), total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", off(2), INT(-99,HSIZE_T), total_error)
+ CALL VERIFY("H5Ssel_iter_get_seq_list_f", ilen(2), INT(-99,SIZE_T), total_error)
+ ELSE
+ CALL check("Incorrect selection option", error, total_error)
+ ENDIF
+
+ ! Reset iterator
+ CALL H5Ssel_iter_reset_f(iter_id, sid, error)
+ CALL check("H5Ssel_iter_reset_f", error, total_error)
+
+ ! Close selection iterator
+ CALL H5Ssel_iter_close_f(iter_id, error)
+ CALL check("H5Ssel_iter_close_f", error, total_error)
+ END DO
+
+ ! Create selection iterator object
+ CALL H5Ssel_iter_create_f(sid, 1_size_t, H5S_SEL_ITER_GET_SEQ_LIST_SORTED_F, iter_id, error)
+ CALL check("H5Ssel_iter_create_f", error, total_error)
+
+ ! Reset iterator
+ CALL H5Ssel_iter_reset_f(iter_id, sid, error)
+ CALL check("H5Ssel_iter_reset_f", error, total_error)
+
+ CALL h5sclose_f(sid, error)
+ CALL check("h5sclose_f", error, total_error)
+
+ END SUBROUTINE test_select_iter
+
+ !
+ ! Subroutine to test element selection
!
SUBROUTINE test_select_element(cleanup, total_error)
@@ -1043,9 +1182,6 @@ CONTAINS
!
DEALLOCATE(pointlist)
-
-
-
!
!Close the dataspace for the dataset.
!
diff --git a/fortran/testpar/CMakeLists.txt b/fortran/testpar/CMakeLists.txt
index e8f0107..4d3a330 100644
--- a/fortran/testpar/CMakeLists.txt
+++ b/fortran/testpar/CMakeLists.txt
@@ -20,6 +20,7 @@ add_executable (parallel_test
ptest.F90
hyper.F90
mdset.F90
+ mpi_param.F90
multidsetrw.F90
)
target_include_directories (parallel_test
diff --git a/fortran/testpar/Makefile.am b/fortran/testpar/Makefile.am
index 1c37409..3df1fee 100644
--- a/fortran/testpar/Makefile.am
+++ b/fortran/testpar/Makefile.am
@@ -39,7 +39,7 @@ check_PROGRAMS=$(TEST_PROG_PARA)
CHECK_CLEANFILES+=parf[12].h5 h5*_tests.h5 subf.h5* *.mod
# Test source files
-parallel_test_SOURCES=ptest.F90 hyper.F90 mdset.F90 multidsetrw.F90
+parallel_test_SOURCES=ptest.F90 hyper.F90 mdset.F90 multidsetrw.F90 mpi_param.F90
subfiling_test_SOURCES=subfiling.F90
async_test_SOURCES=async.F90
diff --git a/fortran/testpar/mpi_param.F90 b/fortran/testpar/mpi_param.F90
new file mode 100644
index 0000000..ba4eaaa
--- /dev/null
+++ b/fortran/testpar/mpi_param.F90
@@ -0,0 +1,326 @@
+! * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
+! Copyright by The HDF Group. *
+! All rights reserved. *
+! *
+! This file is part of HDF5. The full HDF5 copyright notice, including *
+! terms governing use, modification, and redistribution, is contained in *
+! the COPYING file, which can be found at the root of the source code *
+! distribution tree, or in https://www.hdfgroup.org/licenses. *
+! If you do not have access to either file, you may request a copy from *
+! help@hdfgroup.org. *
+! * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
+
+#include <H5config_f.inc>
+
+!
+! tests setting/getting MPI communicator and info parameters on a FAPL
+!
+
+SUBROUTINE mpi_param_03(nerrors)
+
+ USE MPI
+ USE HDF5
+ USE TH5_MISC
+ USE TH5_MISC_GEN
+
+ IMPLICIT NONE
+ INTEGER, INTENT(inout) :: nerrors ! number of errors
+
+ INTEGER :: hdferror ! HDF hdferror flag
+ INTEGER(hid_t) :: fapl_id ! file access identifier
+ INTEGER :: mpi_size, mpi_size_ret ! number of processes in the group of communicator
+ INTEGER :: mpierror ! MPI hdferror flag
+ INTEGER :: mpi_rank ! rank of the calling process in the communicator
+
+ INTEGER :: info, info_ret
+ INTEGER :: comm, comm_ret
+ INTEGER :: nkeys
+ LOGICAL :: flag
+ INTEGER :: iconfig
+ CHARACTER(LEN=4) , PARAMETER :: in_key="host"
+ CHARACTER(LEN=10), PARAMETER :: in_value="myhost.org"
+
+ CHARACTER(LEN=MPI_MAX_INFO_KEY) :: key, value
+
+ ! Get the original sizes
+ CALL mpi_comm_rank( MPI_COMM_WORLD, mpi_rank, mpierror )
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_RANK *FAILED*"
+ nerrors = nerrors + 1
+ ENDIF
+ CALL mpi_comm_size( MPI_COMM_WORLD, mpi_size, mpierror )
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_SIZE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+
+ DO iconfig = 1, 2
+
+ ! Create the file access property
+ CALL h5pcreate_f(H5P_FILE_ACCESS_F, fapl_id, hdferror)
+ CALL check("h5pcreate_f", hdferror, nerrors)
+
+ ! Split the communicator
+ IF(mpi_rank.EQ.0)THEN
+ CALL MPI_Comm_split(MPI_COMM_WORLD, 1, mpi_rank, comm, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_SPLIT *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ ELSE
+ CALL MPI_Comm_split(MPI_COMM_WORLD, 0, mpi_rank, comm, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_SPLIT *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ ENDIF
+
+ ! Create and set an MPI INFO parameter
+
+ CALL MPI_Info_create(info, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_CREATE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL MPI_Info_set(info, in_key, in_value, mpierror )
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_SET *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+
+ IF(iconfig.EQ.1)THEN
+ ! Set and get the MPI parameters
+ CALL h5pset_fapl_mpio_f(fapl_id, comm, info, hdferror)
+ CALL check("h5pset_fapl_mpio_f", hdferror, nerrors)
+
+ CALL h5pget_fapl_mpio_f(fapl_id, comm_ret, info_ret, hdferror)
+ CALL check("h5pget_fapl_mpio_f", hdferror, nerrors)
+ ELSE
+ CALL h5pset_mpi_params_f(fapl_id, comm, info, hdferror)
+ CALL check("h5pset_mpi_params_f", hdferror, nerrors)
+
+ CALL h5pget_mpi_params_f(fapl_id, comm_ret, info_ret, hdferror)
+ CALL check("h5pget_mpi_params_f", hdferror, nerrors)
+ ENDIF
+
+
+ ! Check comm returned
+ CALL mpi_comm_size(comm_ret, mpi_size_ret, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_SIZE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ IF (mpi_rank.EQ.0)THEN
+ CALL VERIFY("h5pget_fapl_mpio_f", mpi_size_ret, 1, hdferror)
+ ELSE
+ CALL VERIFY("h5pget_fapl_mpio_f", mpi_size_ret, mpi_size-1, hdferror)
+ ENDIF
+
+ ! Check info returned
+ CALL MPI_info_get_nkeys( info_ret, nkeys, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_GET_NKEYS *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL VERIFY("h5pget_fapl_mpio_f", nkeys, 1, hdferror)
+
+ CALL MPI_Info_get_nthkey(info_ret, 0, key, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_GET_NTHKEY *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL VERIFY("h5pget_fapl_mpio_f", TRIM(key), in_key, hdferror)
+
+ CALL MPI_Info_get(info, key, MPI_MAX_INFO_KEY, value, flag, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_GET *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL VERIFY("h5pget_fapl_mpio_f", flag, .TRUE., hdferror)
+ CALL VERIFY("h5pget_fapl_mpio_f", TRIM(value), in_value, hdferror)
+
+ ! Free the MPI resources
+ CALL MPI_info_free(info_ret, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_FREE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL MPI_comm_free(comm_ret, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_FREE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL MPI_info_free(info, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_FREE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL MPI_comm_free(comm, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_FREE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+
+ CALL h5pclose_f(fapl_id, hdferror)
+ CALL check("h5pclose_f", hdferror, nerrors)
+ ENDDO
+
+END SUBROUTINE mpi_param_03
+
+SUBROUTINE mpi_param_08(nerrors)
+
+#ifdef H5_HAVE_MPI_F08
+
+ USE MPI_F08
+ USE HDF5
+ USE TH5_MISC
+ USE TH5_MISC_GEN
+
+ IMPLICIT NONE
+ INTEGER, INTENT(inout) :: nerrors ! number of errors
+
+ INTEGER :: hdferror ! HDF hdferror flag
+ INTEGER(hid_t) :: fapl_id ! file access identifier
+ INTEGER :: mpi_size, mpi_size_ret ! number of processes in the group of communicator
+ INTEGER :: mpierror ! MPI hdferror flag
+ INTEGER :: mpi_rank ! rank of the calling process in the communicator
+
+ TYPE(MPI_INFO) :: info, info_ret
+ TYPE(MPI_COMM) :: comm, comm_ret
+ INTEGER :: nkeys
+ LOGICAL :: flag
+ INTEGER :: iconfig
+ CHARACTER(LEN=4) , PARAMETER :: in_key="host"
+ CHARACTER(LEN=10), PARAMETER :: in_value="myhost.org"
+
+ CHARACTER(LEN=MPI_MAX_INFO_KEY) :: key, value
+
+ ! Get the original sizes
+ CALL mpi_comm_rank( MPI_COMM_WORLD, mpi_rank, mpierror )
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_RANK *FAILED*"
+ nerrors = nerrors + 1
+ ENDIF
+ CALL mpi_comm_size( MPI_COMM_WORLD, mpi_size, mpierror )
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_SIZE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+
+ DO iconfig = 1, 2
+
+ ! Create the file access property
+ CALL h5pcreate_f(H5P_FILE_ACCESS_F, fapl_id, hdferror)
+ CALL check("h5pcreate_f", hdferror, nerrors)
+
+ ! Split the communicator
+ IF(mpi_rank.EQ.0)THEN
+ CALL MPI_Comm_split(MPI_COMM_WORLD, 1, mpi_rank, comm, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_SPLIT *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ ELSE
+ CALL MPI_Comm_split(MPI_COMM_WORLD, 0, mpi_rank, comm, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_SPLIT *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ ENDIF
+
+ ! Create and set an MPI INFO parameter
+
+ CALL MPI_Info_create(info, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_CREATE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL MPI_Info_set(info, in_key, in_value, mpierror )
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_SET *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+
+ IF(iconfig.EQ.1)THEN
+ ! Set and get the MPI parameters
+ CALL h5pset_fapl_mpio_f(fapl_id, comm, info, hdferror)
+ CALL check("h5pset_fapl_mpio_f", hdferror, nerrors)
+
+ CALL h5pget_fapl_mpio_f(fapl_id, comm_ret, info_ret, hdferror)
+ CALL check("h5pget_fapl_mpio_f", hdferror, nerrors)
+ ELSE
+ CALL h5pset_mpi_params_f(fapl_id, comm, info, hdferror)
+ CALL check("h5pset_mpi_params_f", hdferror, nerrors)
+
+ CALL h5pget_mpi_params_f(fapl_id, comm_ret, info_ret, hdferror)
+ CALL check("h5pget_mpi_params_f", hdferror, nerrors)
+ ENDIF
+
+
+ ! Check comm returned
+ CALL mpi_comm_size(comm_ret, mpi_size_ret, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_SIZE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ IF (mpi_rank.EQ.0)THEN
+ CALL VERIFY("h5pget_fapl_mpio_f", mpi_size_ret, 1, hdferror)
+ ELSE
+ CALL VERIFY("h5pget_fapl_mpio_f", mpi_size_ret, mpi_size-1, hdferror)
+ ENDIF
+
+ ! Check info returned
+ CALL MPI_info_get_nkeys( info_ret, nkeys, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_GET_NKEYS *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL VERIFY("h5pget_fapl_mpio_f", nkeys, 1, hdferror)
+
+ CALL MPI_Info_get_nthkey(info_ret, 0, key, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_GET_NTHKEY *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL VERIFY("h5pget_fapl_mpio_f", TRIM(key), in_key, hdferror)
+
+ CALL MPI_Info_get(info, key, MPI_MAX_INFO_KEY, value, flag, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_GET *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL VERIFY("h5pget_fapl_mpio_f", flag, .TRUE., hdferror)
+ CALL VERIFY("h5pget_fapl_mpio_f", TRIM(value), in_value, hdferror)
+
+ ! Free the MPI resources
+ CALL MPI_info_free(info_ret, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_FREE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL MPI_comm_free(comm_ret, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_FREE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL MPI_info_free(info, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_INFO_FREE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+ CALL MPI_comm_free(comm, mpierror)
+ IF (mpierror .NE. MPI_SUCCESS) THEN
+ WRITE(*,*) "MPI_COMM_FREE *FAILED* Process = ", mpi_rank
+ nerrors = nerrors + 1
+ ENDIF
+
+ CALL h5pclose_f(fapl_id, hdferror)
+ CALL check("h5pclose_f", hdferror, nerrors)
+ ENDDO
+#else
+ INTEGER, INTENT(inout) :: nerrors ! number of errors
+ nerrors = -1 ! Skip test
+#endif
+
+END SUBROUTINE mpi_param_08
+
diff --git a/fortran/testpar/ptest.F90 b/fortran/testpar/ptest.F90
index b754e29..d2e9d10 100644
--- a/fortran/testpar/ptest.F90
+++ b/fortran/testpar/ptest.F90
@@ -58,6 +58,16 @@ PROGRAM parallel_test
IF(mpi_rank==0) CALL write_test_header("COMPREHENSIVE PARALLEL FORTRAN TESTS")
+ ret_total_error = 0
+ CALL mpi_param_03(ret_total_error)
+ IF(mpi_rank==0) CALL write_test_status(ret_total_error, &
+ 'Testing MPI communicator and info (F03)', total_error)
+
+ ret_total_error = 0
+ CALL mpi_param_08(ret_total_error)
+ IF(mpi_rank==0) CALL write_test_status(ret_total_error, &
+ 'Testing MPI communicator and info (F08)', total_error)
+
!
! test write/read dataset by hyperslabs (contiguous/chunk) with independent/collective MPI I/O
!
diff --git a/hl/tools/gif2h5/gif2hdf.c b/hl/tools/gif2h5/gif2hdf.c
index 9220655..161761c 100644
--- a/hl/tools/gif2h5/gif2hdf.c
+++ b/hl/tools/gif2h5/gif2hdf.c
@@ -26,8 +26,8 @@ main(int argv, char *argc[])
FILE *fpGif;
/* replacing int32 with long */
- long i, ImageCount;
- long filesize;
+ long i, ImageCount;
+ HDoff_t filesize;
GIFBYTE *MemGif;
GIFBYTE *StartPos;
@@ -71,7 +71,7 @@ main(int argv, char *argc[])
/* Get the whole file into memory. Mem's much faster than I/O */
fseek(fpGif, 0L, 2);
- filesize = ftell(fpGif);
+ filesize = HDftell(fpGif);
fseek(fpGif, 0L, 0);
if (filesize == 0)
diff --git a/hl/tools/h5watch/CMakeTests.cmake b/hl/tools/h5watch/CMakeTests.cmake
index aa4c41a..ce0e10e 100644
--- a/hl/tools/h5watch/CMakeTests.cmake
+++ b/hl/tools/h5watch/CMakeTests.cmake
@@ -105,8 +105,9 @@ add_custom_target(H5WATCH_files ALL COMMENT "Copying files needed by H5WATCH tes
-D "TEST_OUTPUT=${resultfile}.out"
-D "TEST_EXPECT=${resultcode}"
-D "TEST_REFERENCE=${resultfile}.mty"
- -D "TEST_ERRREF=${resultfile}.err"
- -P "${HDF_RESOURCES_DIR}/runTest.cmake"
+ -D "TEST_ERRREF=h5watch error"
+ -D "TEST_SKIP_COMPARE=true"
+ -P "${HDF_RESOURCES_DIR}/grepTest.cmake"
)
set_tests_properties (H5WATCH_ARGS-h5watch-${resultfile} PROPERTIES
DEPENDS ${last_test}
diff --git a/java/src/hdf/hdf5lib/CMakeLists.txt b/java/src/hdf/hdf5lib/CMakeLists.txt
index 41cf4e9..69b5a93 100644
--- a/java/src/hdf/hdf5lib/CMakeLists.txt
+++ b/java/src/hdf/hdf5lib/CMakeLists.txt
@@ -10,6 +10,7 @@ SET_GLOBAL_VARIABLE (HDF5_JAVA_SOURCE_PACKAGES
)
set (HDF5_JAVA_HDF_HDF5_CALLBACKS_SOURCES
+ callbacks/Callbacks.java
callbacks/H5A_iterate_cb.java
callbacks/H5A_iterate_t.java
callbacks/H5D_append_cb.java
@@ -37,7 +38,6 @@ set (HDF5_JAVA_HDF_HDF5_CALLBACKS_SOURCES
callbacks/H5P_prp_set_func_cb.java
callbacks/H5P_iterate_cb.java
callbacks/H5P_iterate_t.java
- callbacks/Callbacks.java
)
set (HDF5_JAVADOC_HDF_HDF5_CALLBACKS_SOURCES
diff --git a/java/src/hdf/hdf5lib/callbacks/Callbacks.java b/java/src/hdf/hdf5lib/callbacks/Callbacks.java
index 3d5fbd1..013e0ec 100644
--- a/java/src/hdf/hdf5lib/callbacks/Callbacks.java
+++ b/java/src/hdf/hdf5lib/callbacks/Callbacks.java
@@ -28,7 +28,7 @@ package hdf.hdf5lib.callbacks;
* exceptions thrown will be passed to the default callback exception
* handler.
*
- * @defgroup JCALL HDF5 Library Java Callbacks
+ * @defgroup JCALLBK HDF5 Library Java Callbacks
*/
public interface Callbacks {
}
diff --git a/java/src/hdf/hdf5lib/callbacks/H5A_iterate_cb.java b/java/src/hdf/hdf5lib/callbacks/H5A_iterate_cb.java
index 9958b3b..2d37044 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5A_iterate_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5A_iterate_cb.java
@@ -20,7 +20,7 @@ import hdf.hdf5lib.structs.H5A_info_t;
*/
public interface H5A_iterate_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each attribute
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5D_append_cb.java b/java/src/hdf/hdf5lib/callbacks/H5D_append_cb.java
index 49323a2..92024f8 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5D_append_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5D_append_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5D_append_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each dataset access property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5D_iterate_cb.java b/java/src/hdf/hdf5lib/callbacks/H5D_iterate_cb.java
index 5f77998..f9ea6a9 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5D_iterate_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5D_iterate_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5D_iterate_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each dataset element
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5E_walk_cb.java b/java/src/hdf/hdf5lib/callbacks/H5E_walk_cb.java
index a8ef5df..a9690a5 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5E_walk_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5E_walk_cb.java
@@ -20,7 +20,7 @@ import hdf.hdf5lib.structs.H5E_error2_t;
*/
public interface H5E_walk_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each error stack element
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5L_iterate_t.java b/java/src/hdf/hdf5lib/callbacks/H5L_iterate_t.java
index 7342e58..cf0ac0e 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5L_iterate_t.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5L_iterate_t.java
@@ -20,7 +20,7 @@ import hdf.hdf5lib.structs.H5L_info_t;
*/
public interface H5L_iterate_t extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each group
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5O_iterate_t.java b/java/src/hdf/hdf5lib/callbacks/H5O_iterate_t.java
index bfe8c67..f0dd587 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5O_iterate_t.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5O_iterate_t.java
@@ -20,7 +20,7 @@ import hdf.hdf5lib.structs.H5O_info_t;
*/
public interface H5O_iterate_t extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each group
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_cls_close_func_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_cls_close_func_cb.java
index a235861..e4f10cc 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_cls_close_func_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_cls_close_func_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_cls_close_func_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_cls_copy_func_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_cls_copy_func_cb.java
index b218e0c..bdaad5f 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_cls_copy_func_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_cls_copy_func_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_cls_copy_func_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_cls_create_func_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_cls_create_func_cb.java
index 3d407d0..0b9ced2 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_cls_create_func_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_cls_create_func_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_cls_create_func_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_iterate_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_iterate_cb.java
index 51a5768..941fd15 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_iterate_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_iterate_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_iterate_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_prp_close_func_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_prp_close_func_cb.java
index 2ddc980..33bde76 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_prp_close_func_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_prp_close_func_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_prp_close_func_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_prp_compare_func_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_prp_compare_func_cb.java
index 53caa94..3149d17 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_prp_compare_func_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_prp_compare_func_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_prp_compare_func_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_prp_copy_func_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_prp_copy_func_cb.java
index 0b2349e..d3d6b37 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_prp_copy_func_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_prp_copy_func_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_prp_copy_func_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_prp_create_func_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_prp_create_func_cb.java
index 6065ce0..2fe3338 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_prp_create_func_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_prp_create_func_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_prp_create_func_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_prp_delete_func_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_prp_delete_func_cb.java
index 4384ca7..3019284 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_prp_delete_func_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_prp_delete_func_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_prp_delete_func_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_prp_get_func_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_prp_get_func_cb.java
index 999c7b0..cfc8e31 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_prp_get_func_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_prp_get_func_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_prp_get_func_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/hdf/hdf5lib/callbacks/H5P_prp_set_func_cb.java b/java/src/hdf/hdf5lib/callbacks/H5P_prp_set_func_cb.java
index 893344b..2272869 100644
--- a/java/src/hdf/hdf5lib/callbacks/H5P_prp_set_func_cb.java
+++ b/java/src/hdf/hdf5lib/callbacks/H5P_prp_set_func_cb.java
@@ -18,7 +18,7 @@ package hdf.hdf5lib.callbacks;
*/
public interface H5P_prp_set_func_cb extends Callbacks {
/**
- * @ingroup JCALL
+ * @ingroup JCALLBK
*
* application callback for each property list
*
diff --git a/java/src/jni/h5util.c b/java/src/jni/h5util.c
index 76c726a..bf798b8 100644
--- a/java/src/jni/h5util.c
+++ b/java/src/jni/h5util.c
@@ -1192,7 +1192,8 @@ h5str_sprintf(JNIEnv *env, h5str_t *out_str, hid_t container, hid_t tid, void *i
* object.
*/
- if (NULL == (this_str = (char *)malloc(64)))
+ const size_t size = 64;
+ if (NULL == (this_str = (char *)malloc(size)))
H5_JNI_FATAL_ERROR(ENVONLY, "h5str_sprintf: failed to allocate string buffer");
if ((obj = H5Rdereference2(container, H5P_DEFAULT, H5R_OBJECT, cptr)) < 0)
@@ -1206,25 +1207,25 @@ h5str_sprintf(JNIEnv *env, h5str_t *out_str, hid_t container, hid_t tid, void *i
switch (oi.type) {
case H5O_TYPE_GROUP:
- if (sprintf(this_str, "%s %s", H5_TOOLS_GROUP, obj_tok_str) < 0)
- H5_JNI_FATAL_ERROR(ENVONLY, "h5str_sprintf: sprintf failure");
+ if (snprintf(this_str, size, "%s %s", H5_TOOLS_GROUP, obj_tok_str) < 0)
+ H5_JNI_FATAL_ERROR(ENVONLY, "h5str_sprintf: snprintf failure");
break;
case H5O_TYPE_DATASET:
- if (sprintf(this_str, "%s %s", H5_TOOLS_DATASET, obj_tok_str) < 0)
- H5_JNI_FATAL_ERROR(ENVONLY, "h5str_sprintf: sprintf failure");
+ if (snprintf(this_str, size, "%s %s", H5_TOOLS_DATASET, obj_tok_str) < 0)
+ H5_JNI_FATAL_ERROR(ENVONLY, "h5str_sprintf: snprintf failure");
break;
case H5O_TYPE_NAMED_DATATYPE:
- if (sprintf(this_str, "%s %s", H5_TOOLS_DATATYPE, obj_tok_str) < 0)
- H5_JNI_FATAL_ERROR(ENVONLY, "h5str_sprintf: sprintf failure");
+ if (snprintf(this_str, size, "%s %s", H5_TOOLS_DATATYPE, obj_tok_str) < 0)
+ H5_JNI_FATAL_ERROR(ENVONLY, "h5str_sprintf: snprintf failure");
break;
case H5O_TYPE_UNKNOWN:
case H5O_TYPE_NTYPES:
default:
- if (sprintf(this_str, "%u-%s", (unsigned)oi.type, obj_tok_str) < 0)
- H5_JNI_FATAL_ERROR(ENVONLY, "h5str_sprintf: sprintf failure");
+ if (snprintf(this_str, size, "%u-%s", (unsigned)oi.type, obj_tok_str) < 0)
+ H5_JNI_FATAL_ERROR(ENVONLY, "h5str_sprintf: snprintf failure");
break;
}
diff --git a/java/test/TestH5.java b/java/test/TestH5.java
index fd1a926..762f83d 100644
--- a/java/test/TestH5.java
+++ b/java/test/TestH5.java
@@ -423,7 +423,7 @@ public class TestH5 {
}
}
- @Test
+ @Ignore
public void testH5export_dataset()
{
int[][] dset_data = new int[DIM_X][DIM_Y];
@@ -489,7 +489,7 @@ public class TestH5 {
_deleteH5file();
}
- @Test
+ @Ignore
public void testH5export_region()
{
int[] dset_data_expect = {66, 69, 72, 75, 78, 81, 96, 99, 102, 105, 108, 111,
@@ -532,7 +532,7 @@ public class TestH5 {
dset_indata[row] == dset_data_expect[row]);
}
- @Test
+ @Ignore
public void testH5export_attribute()
{
int[] dset_data_expect = {0, 3, 6, 9, 1, 4, 7, 10, 2, 5, 8, 11};
@@ -573,7 +573,7 @@ public class TestH5 {
dset_indata[row] == dset_data_expect[row]);
}
- @Test
+ @Ignore
public void testH5export_regdataset()
{
int[] dset_data_expect = {66, 69, 72, 75, 78, 81, 96, 99, 102, 105, 108, 111,
@@ -616,7 +616,7 @@ public class TestH5 {
dset_indata[row] == dset_data_expect[row]);
}
- @Test
+ @Ignore
public void testH5export_attrdataset()
{
int[] dset_data_expect = {66, 69, 72, 75, 78, 81, 96, 99, 102, 105, 108, 111,
diff --git a/java/test/testfiles/JUnit-TestH5.txt b/java/test/testfiles/JUnit-TestH5.txt
index fb50a57..b282a91 100644
--- a/java/test/testfiles/JUnit-TestH5.txt
+++ b/java/test/testfiles/JUnit-TestH5.txt
@@ -1,14 +1,9 @@
JUnit version 4.11
-.testH5export_region
.testH5get_libversion_null_param
.testJ2C
-.testH5export_dataset
.testIsSerializable
-.testH5export_attrdataset
.testH5garbage_collect
.testH5error_off
-.testH5export_regdataset
-.testH5export_attribute
.serializeToDisk
.testH5open
.testH5check_version
@@ -17,5 +12,5 @@ JUnit version 4.11
Time: XXXX
-OK (15 tests)
+OK (10 tests)
diff --git a/release_docs/INSTALL b/release_docs/INSTALL_Auto.txt
index 2b8f3b6..969fbac 100644
--- a/release_docs/INSTALL
+++ b/release_docs/INSTALL_Auto.txt
@@ -7,7 +7,7 @@ This file provides instructions for installing the HDF5 software.
For help with installing, questions can be posted to the HDF Forum or sent to the HDF Helpdesk:
HDF Forum: https://forum.hdfgroup.org/
- HDF Helpdesk: https://portal.hdfgroup.org/display/support/The+HDF+Help+Desk
+ HDF Helpdesk: https://hdfgroup.atlassian.net/servicedesk/customer/portals
CONTENTS
--------
@@ -15,7 +15,6 @@ CONTENTS
2. Quick installation
2.1. Windows
- 2.2. RedStorm (Cray XT3)
3. HDF5 dependencies
3.1. Zlib
@@ -51,10 +50,8 @@ CONTENTS
*****************************************************************************
1. Obtaining HDF5
- The latest supported public release of HDF5 is available from
- https://www.hdfgroup.org/downloads/hdf5/. For Unix and UNIX-like
- platforms, it is available in tar format compressed with gzip.
- For Microsoft Windows, it is in ZIP format.
+ The latest supported public releases of HDF5 are available on
+ https://github.com/HDFGroup/hdf5.
2. Quick installation
@@ -63,7 +60,6 @@ CONTENTS
and support programs. For example, to install HDF5 version X.Y.Z at
location /usr/local/hdf5, use the following steps.
- $ gunzip < hdf5-X.Y.Z.tar.gz | tar xf -
$ cd hdf5-X.Y.Z
$ ./configure --prefix=/usr/local/hdf5 <more configure_flags>
$ make
@@ -71,10 +67,6 @@ CONTENTS
$ make install
$ make check-install # verify installation.
- Some versions of the tar command support the -z option. In such cases,
- the first step above can be simplified to the following:
-
- $ tar zxf hdf5-X.Y.Z.tar.gz
<configure_flags> above refers to the configure flags appropriate
to your installation. For example, to install HDF5 with the
@@ -91,13 +83,6 @@ CONTENTS
Users of Microsoft Windows should see the INSTALL_Windows files for
detailed instructions.
-2.2. RedStorm (Cray XT3)
- Users of the Red Storm machine, after reading this file, should read
- the Red Storm section in the INSTALL_parallel file for specific
- instructions for the Red Storm machine. The same instructions would
- probably work for other Cray XT3 systems, but they have not been
- verified.
-
3. HDF5 dependencies
3.1. Zlib
diff --git a/release_docs/INSTALL_CMake.txt b/release_docs/INSTALL_CMake.txt
index 7839f42..f794d3c 100644
--- a/release_docs/INSTALL_CMake.txt
+++ b/release_docs/INSTALL_CMake.txt
@@ -788,13 +788,14 @@ if (WINDOWS)
DISABLE_PDB_FILES "Do not install PDB files" OFF
---------------- HDF5 Build Options ---------------------
-HDF5_BUILD_CPP_LIB "Build HDF5 C++ Library" OFF
-HDF5_BUILD_EXAMPLES "Build HDF5 Library Examples" ON
-HDF5_BUILD_FORTRAN "Build FORTRAN support" OFF
-HDF5_BUILD_JAVA "Build JAVA support" OFF
-HDF5_BUILD_HL_LIB "Build HIGH Level HDF5 Library" ON
-HDF5_BUILD_TOOLS "Build HDF5 Tools" ON
-HDF5_BUILD_HL_GIF_TOOLS "Build HIGH Level HDF5 GIF Tools" OFF
+HDF5_BUILD_CPP_LIB "Build HDF5 C++ Library" OFF
+HDF5_BUILD_EXAMPLES "Build HDF5 Library Examples" ON
+HDF5_BUILD_FORTRAN "Build FORTRAN support" OFF
+HDF5_BUILD_JAVA "Build JAVA support" OFF
+HDF5_BUILD_HL_LIB "Build HIGH Level HDF5 Library" ON
+HDF5_BUILD_TOOLS "Build HDF5 Tools" ON
+HDF5_BUILD_HL_GIF_TOOLS "Build HIGH Level HDF5 GIF Tools" OFF
+HDF5_BUILD_PARALLEL_TOOLS "Build Parallel HDF5 Tools" OFF
---------------- HDF5 Folder Build Options ---------------------
Defaults relative to $<INSTALL_PREFIX>
@@ -809,6 +810,8 @@ else ()
HDF5_INSTALL_DATA_DIR "share"
HDF5_INSTALL_DOC_DIR "HDF5_INSTALL_DATA_DIR"
+HDF5_USE_GNU_DIRS "ON to use GNU Coding Standard install directory variables,
+ OFF to use historical settings" OFF
Defaults as defined by the `GNU Coding Standards`
HDF5_INSTALL_BIN_DIR "bin"
HDF5_INSTALL_LIB_DIR "lib"
@@ -819,11 +822,13 @@ HDF5_INSTALL_DATA_DIR "share"
HDF5_INSTALL_DOC_DIR "HDF5_INSTALL_DATA_DIR/doc/hdf5"
---------------- HDF5 Advanced Options ---------------------
-HDF5_USE_GNU_DIRS "TRUE to use GNU Coding Standard install directory variables,
- FALSE to use historical settings" FALSE
ONLY_SHARED_LIBS "Only Build Shared Libraries" OFF
ALLOW_UNSUPPORTED "Allow unsupported combinations of configure options" OFF
+HDF5_ENABLE_PARALLEL "Enable parallel build (requires MPI)" OFF
+HDF5_ENABLE_THREADSAFE "Enable Threadsafety" OFF
+HDF5_DIMENSION_SCALES_NEW_REF "Use new-style references with dimension scale APIs" OFF
HDF5_EXTERNAL_LIB_PREFIX "Use prefix for custom library naming." ""
+
HDF5_DISABLE_COMPILER_WARNINGS "Disable compiler warnings" OFF
HDF5_ENABLE_ALL_WARNINGS "Enable all warnings" OFF
HDF5_SHOW_ALL_WARNINGS "Show all warnings (i.e. not suppress "noisy" ones internally)" OFF
@@ -831,14 +836,14 @@ HDF5_ENABLE_CODESTACK "Enable the function stack tracing (for developer
HDF5_ENABLE_COVERAGE "Enable code coverage for Libraries and Programs" OFF
HDF5_ENABLE_DEBUG_APIS "Turn on extra debug output in all packages" OFF
HDF5_ENABLE_DEPRECATED_SYMBOLS "Enable deprecated public API symbols" ON
-HDF5_ENABLE_DIRECT_VFD "Build the Direct I/O Virtual File Driver" OFF
HDF5_ENABLE_EMBEDDED_LIBINFO "embed library info into executables" ON
-HDF5_ENABLE_PARALLEL "Enable parallel build (requires MPI)" OFF
HDF5_ENABLE_PREADWRITE "Use pread/pwrite in sec2/log/core VFDs in place of read/write (when available)" ON
HDF5_ENABLE_TRACE "Enable API tracing capability" OFF
HDF5_ENABLE_USING_MEMCHECKER "Indicate that a memory checker is used" OFF
+HDF5_ENABLE_MAP_API "Build the map API" OFF
HDF5_GENERATE_HEADERS "Rebuild Generated Files" ON
HDF5_BUILD_GENERATORS "Build Test Generators" OFF
+
HDF5_JAVA_PACK_JRE "Package a JRE installer directory" OFF
HDF5_NO_PACKAGES "Do not include CPack Packaging" OFF
HDF5_PACK_EXAMPLES "Package the HDF5 Library Examples Compressed File" OFF
@@ -847,11 +852,11 @@ HDF5_BUILD_FRAMEWORKS "TRUE to build as frameworks libraries,
FALSE to build according to BUILD_SHARED_LIBS" FALSE
HDF5_PACKAGE_EXTLIBS "CPACK - include external libraries" OFF
HDF5_STRICT_FORMAT_CHECKS "Whether to perform strict file format checks" OFF
-DEFAULT_API_VERSION "Enable default API (v16, v18, v110, v112, v114)" "v114"
-HDF5_USE_FOLDERS "Enable folder grouping of projects in IDEs." ON
HDF5_WANT_DATA_ACCURACY "IF data accuracy is guaranteed during data conversions" ON
HDF5_WANT_DCONV_EXCEPTION "exception handling functions is checked during data conversions" ON
-HDF5_ENABLE_THREADSAFE "Enable Threadsafety" OFF
+
+DEFAULT_API_VERSION "Enable default API (v16, v18, v110, v112, v114)" "v114"
+HDF5_USE_FOLDERS "Enable folder grouping of projects in IDEs." ON
HDF5_MSVC_NAMING_CONVENTION "Use MSVC Naming conventions for Shared Libraries" OFF
HDF5_MINGW_STATIC_GCC_LIBS "Statically link libgcc/libstdc++" OFF
if (APPLE)
@@ -864,12 +869,19 @@ if (HDF5_BUILD_FORTRAN)
if (BUILD_SHARED_LIBS AND NOT BUILD_STATIC_LIBS) default HDF5_INSTALL_MOD_FORTRAN is SHARED
if (NOT BUILD_SHARED_LIBS AND BUILD_STATIC_LIBS) default HDF5_INSTALL_MOD_FORTRAN is STATIC
if (NOT BUILD_SHARED_LIBS AND NOT BUILD_STATIC_LIBS) default HDF5_INSTALL_MOD_FORTRAN is SHARED
-HDF5_BUILD_DOC "Build documentation" OFF
+
HDF5_ENABLE_ANALYZER_TOOLS "enable the use of Clang tools" OFF
HDF5_ENABLE_SANITIZERS "execute the Clang sanitizer" OFF
HDF5_ENABLE_FORMATTERS "format source files" OFF
-HDF5_DIMENSION_SCALES_NEW_REF "Use new-style references with dimension scale APIs" OFF
-HDF5_ENABLE_DOXY_WARNINGS "Enable fail if doxygen parsing has warnings." ON
+HDF5_BUILD_DOC "Build documentation" OFF
+HDF5_ENABLE_DOXY_WARNINGS "Enable fail if doxygen parsing has warnings." OFF
+
+---------------- HDF5 VFD Options ---------------------
+HDF5_ENABLE_DIRECT_VFD "Build the Direct I/O Virtual File Driver" OFF
+HDF5_ENABLE_MIRROR_VFD "Build the Mirror Virtual File Driver" OFF
+HDF5_ENABLE_ROS3_VFD "Build the ROS3 Virtual File Driver" OFF
+HDF5_ENABLE_HDFS "Enable HDFS" OFF
+HDF5_ENABLE_SUBFILING_VFD "Build Parallel HDF5 Subfiling VFD" OFF
---------------- HDF5 Advanced Test Options ---------------------
if (BUILD_TESTING)
@@ -894,7 +906,7 @@ if (BUILD_TESTING)
---------------- External Library Options ---------------------
HDF5_ALLOW_EXTERNAL_SUPPORT "Allow External Library Building (NO GIT TGZ)" "NO"
HDF5_ENABLE_PLUGIN_SUPPORT "Enable PLUGIN Filters" OFF
-HDF5_ENABLE_SZIP_SUPPORT "Use SZip Filter" ON
+HDF5_ENABLE_SZIP_SUPPORT "Use SZip Filter" OFF
HDF5_ENABLE_Z_LIB_SUPPORT "Enable Zlib Filters" ON
ZLIB_USE_EXTERNAL "Use External Library Building for ZLIB" OFF
diff --git a/release_docs/RELEASE.txt b/release_docs/RELEASE.txt
index c1188ae..7e42b01 100644
--- a/release_docs/RELEASE.txt
+++ b/release_docs/RELEASE.txt
@@ -102,6 +102,18 @@ New Features
Fortran Library:
----------------
+ - Add API support for Fortran MPI_F08 module definitions:
+ Adds support for MPI's MPI_F08 module datatypes: type(MPI_COMM) and type(MPI_INFO) for HDF5 APIs:
+ H5PSET_FAPL_MPIO_F, H5PGET_FAPL_MPIO_F, H5PSET_MPI_PARAMS_F, H5PGET_MPI_PARAMS_F
+ Ref. #3951
+
+ - Added Fortran APIs:
+ H5FGET_INTENT_F, H5SSEL_ITER_CREATE_F, H5SSEL_ITER_GET_SEQ_LIST_F,
+ H5SSEL_ITER_CLOSE_F, H5S_mp_H5SSEL_ITER_RESET_F
+
+ - Added Fortran Parameters:
+ H5S_SEL_ITER_GET_SEQ_LIST_SORTED_F, H5S_SEL_ITER_SHARE_WITH_DATASPACE_F
+
- Added Fortran Parameters:
H5S_BLOCK_F and H5S_PLIST_F
@@ -159,6 +171,34 @@ Bug Fixes since HDF5-1.14.3 release
===================================
Library
-------
+ - Memory usage growth issue
+
+ Starting with the HDF5 1.12.1 release, an issue (GitHub issue #1256)
+ was observed where running a simple program that has a loop of opening
+ a file, reading from an object with a variable-length datatype and
+ then closing the file would result in the process fairly quickly
+ running out of memory. Upon further investigation, it was determined
+ that this memory was being kept around in the library's datatype
+ conversion pathway cache that is used to speed up datatype conversions
+ which are repeatedly used within an HDF5 application's lifecycle. For
+ conversions involving variable-length or reference datatypes, each of
+ these cached pathway entries keeps a reference to its associated file
+ for later use. Since the file was being closed and reopened on each
+ loop iteration, and since the library compares for equality between
+ instances of opened files (rather than equality of the actual files)
+ when determining if it can reuse a cached conversion pathway, it was
+ determining that no cached conversion pathways could be reused and was
+ creating a new cache entry on each loop iteration during I/O. This
+ would lead to constant growth of that cache and the memory it consumed,
+ as well as constant growth of the memory consumed by each cached entry
+ for the reference to its associated file.
+
+ To fix this issue, the library now removes any cached datatype
+ conversion path entries for variable-length or reference datatypes
+ associated with a particular file when that file is closed.
+
+ Fixes GitHub #1256
+
- Suppressed floating-point exceptions in H5T init code
The floating-point datatype initialization code in H5Tinit_float.c
@@ -195,6 +235,15 @@ Bug Fixes since HDF5-1.14.3 release
Configuration
-------------
+ - Changed default of 'Error on HDF5 doxygen warnings' DOXYGEN_WARN_AS_ERROR option.
+
+ The default setting of DOXYGEN_WARN_AS_ERROR to 'FAIL_ON_WARNINGS' has been changed
+ to 'NO'. It was decided that the setting was too aggressive and should be a user choice.
+ The github actions and scripts have been updated to reflect this.
+
+ * HDF5_ENABLE_DOXY_WARNINGS: ON/OFF (Default: OFF)
+ * --enable-doxygen-errors: enable/disable (Default: disable)
+
- Removed an Autotools configure hack that causes problems on MacOS
A sed line in configure.ac was added in the past to paper over some
@@ -382,7 +431,7 @@ Platforms Tested
Windows 10 x64 Visual Studio 2019 w/ clang 12.0.0
with MSVC-like command-line (C/C++ only - cmake)
- Visual Studio 2019 w/ Intel C/C++ only cmake)
+ Visual Studio 2019 w/ Intel (C/C++ only - cmake)
Visual Studio 2022 w/ clang 15.0.1
with MSVC-like command-line (C/C++ only - cmake)
Visual Studio 2022 w/ Intel C/C++/Fortran oneAPI 2023 (cmake)
@@ -417,7 +466,6 @@ Known Problems
CMake files do not behave correctly with paths containing spaces.
Do not use spaces in paths because the required escaping for handling spaces
results in very complex and fragile build files.
- ADB - 2019/05/07
At present, metadata cache images may not be generated by parallel
applications. Parallel applications can read files with metadata cache
diff --git a/src/H5Aint.c b/src/H5Aint.c
index c272402..0cbe462 100644
--- a/src/H5Aint.c
+++ b/src/H5Aint.c
@@ -2455,6 +2455,10 @@ H5A__dense_post_copy_file_cb(const H5A_t *attr_src, void *_udata)
assert(udata->file);
assert(udata->cpy_info);
+ /* Set the location of the src datatype */
+ if (H5T_set_loc(attr_src->shared->dt, H5F_VOL_OBJ(udata->oloc_src->file), H5T_LOC_DISK) < 0)
+ HGOTO_ERROR(H5E_DATATYPE, H5E_CANTINIT, H5_ITER_ERROR, "cannot mark datatype on disk");
+
if (NULL ==
(attr_dst = H5A__attr_copy_file(attr_src, udata->file, udata->recompute_size, udata->cpy_info)))
HGOTO_ERROR(H5E_ATTR, H5E_CANTCOPY, H5_ITER_ERROR, "can't copy attribute");
@@ -2464,7 +2468,7 @@ H5A__dense_post_copy_file_cb(const H5A_t *attr_src, void *_udata)
/* Reset shared location information */
if (H5O_msg_reset_share(H5O_ATTR_ID, attr_dst) < 0)
- HGOTO_ERROR(H5E_OHDR, H5E_CANTINIT, FAIL, "unable to reset attribute sharing");
+ HGOTO_ERROR(H5E_OHDR, H5E_CANTINIT, H5_ITER_ERROR, "unable to reset attribute sharing");
/* Set COPIED tag for destination object's metadata */
H5_BEGIN_TAG(H5AC__COPIED_TAG)
@@ -2478,7 +2482,7 @@ H5A__dense_post_copy_file_cb(const H5A_t *attr_src, void *_udata)
done:
if (attr_dst && H5A__close(attr_dst) < 0)
- HDONE_ERROR(H5E_ATTR, H5E_CLOSEERROR, FAIL, "can't close destination attribute");
+ HDONE_ERROR(H5E_ATTR, H5E_CLOSEERROR, H5_ITER_ERROR, "can't close destination attribute");
FUNC_LEAVE_NOAPI(ret_value)
} /* end H5A__dense_post_copy_file_cb() */
diff --git a/src/H5Apublic.h b/src/H5Apublic.h
index 232ae0a..1eb3eff 100644
--- a/src/H5Apublic.h
+++ b/src/H5Apublic.h
@@ -929,7 +929,7 @@ H5_DLL herr_t H5Aread(hid_t attr_id, hid_t type_id, void *buf);
H5_DLL herr_t H5Aread_async(const char *app_file, const char *app_func, unsigned app_line, hid_t attr_id,
hid_t dtype_id, void *buf, hid_t es_id);
#else
-H5_DLL herr_t H5Aread_async(chid_t attr_id, hid_t dtype_id, void *buf, hid_t es_id);
+H5_DLL herr_t H5Aread_async(hid_t attr_id, hid_t dtype_id, void *buf, hid_t es_id);
#endif
/*-------------------------------------------------------------------------*/
/**
diff --git a/src/H5Dmodule.h b/src/H5Dmodule.h
index 84d00e8..81f197d 100644
--- a/src/H5Dmodule.h
+++ b/src/H5Dmodule.h
@@ -179,10 +179,15 @@
* </table>
*
* \anchor dcpl_table_tag Dataset creation property list functions (H5P)
+ * <div>
* \snippet{doc} tables/propertyLists.dox dcpl_table
+ * </div>
*
* \anchor dapl_table_tag Dataset access property list functions (H5P)
+ *
+ * <div>
* \snippet{doc} tables/propertyLists.dox dapl_table
+ * </div>
*
* \subsection subsec_dataset_program Programming Model for Datasets
* This section explains the programming model for datasets.
@@ -863,7 +868,9 @@
* the pipeline processing: the pipeline and filter operations are identical no matter what data access
* mechanism is used.
*
+ * <div>
* \snippet{doc} tables/propertyLists.dox lcpl_table
+ * </div>
*
* Each file driver writes/reads contiguous blocks of bytes from a logically contiguous address
* space. The file driver is responsible for managing the details of the different physical storage
@@ -880,7 +887,9 @@
* Data transfer properties set optional parameters that control parts of the data pipeline. The
* function listing below shows transfer properties that control the behavior of the library.
*
+ * <div>
* \snippet{doc} tables/fileDriverLists.dox file_driver_table
+ * </div>
*
* Some filters and file drivers require or use additional parameters from the application program.
* These can be passed in the data transfer property list. The table below shows file driver property
diff --git a/src/H5FAcache.c b/src/H5FAcache.c
index 6d9e22e..5aa06f6 100644
--- a/src/H5FAcache.c
+++ b/src/H5FAcache.c
@@ -505,7 +505,7 @@ H5FA__cache_hdr_free_icr(void *thing)
/* Check arguments */
assert(thing);
- /* Release the extensible array header */
+ /* Release the fixed array header */
if (H5FA__hdr_dest((H5FA_hdr_t *)thing) < 0)
HGOTO_ERROR(H5E_FARRAY, H5E_CANTFREE, FAIL, "can't free fixed array header");
diff --git a/src/H5FAhdr.c b/src/H5FAhdr.c
index cfe5001..ef3d689 100644
--- a/src/H5FAhdr.c
+++ b/src/H5FAhdr.c
@@ -202,7 +202,7 @@ H5FA__hdr_create(H5F_t *f, const H5FA_create_t *cparam, void *ctx_udata)
if (HADDR_UNDEF == (hdr->addr = H5MF_alloc(f, H5FD_MEM_FARRAY_HDR, (hsize_t)hdr->size)))
HGOTO_ERROR(H5E_FARRAY, H5E_CANTALLOC, HADDR_UNDEF, "file allocation failed for Fixed Array header");
- /* Create 'top' proxy for extensible array entries */
+ /* Create 'top' proxy for fixed array entries */
if (hdr->swmr_write)
if (NULL == (hdr->top_proxy = H5AC_proxy_entry_create()))
HGOTO_ERROR(H5E_FARRAY, H5E_CANTCREATE, HADDR_UNDEF, "can't create fixed array entry proxy");
diff --git a/src/H5FDfamily.c b/src/H5FDfamily.c
index 3f43ae9..323909f 100644
--- a/src/H5FDfamily.c
+++ b/src/H5FDfamily.c
@@ -234,27 +234,28 @@ H5FD__family_get_default_printf_filename(const char *old_filename)
HGOTO_ERROR(H5E_VFL, H5E_CANTALLOC, NULL, "can't allocate new filename buffer");
/* Determine if filename contains a ".h5" extension. */
- if ((file_extension = strstr(old_filename, ".h5"))) {
+ file_extension = strstr(old_filename, ".h5");
+ if (file_extension) {
/* Insert the printf format between the filename and ".h5" extension. */
- strcpy(tmp_buffer, old_filename);
- file_extension = strstr(tmp_buffer, ".h5");
- sprintf(file_extension, "%s%s", suffix, ".h5");
+ intptr_t beginningLength = file_extension - old_filename;
+ snprintf(tmp_buffer, new_filename_len, "%.*s%s%s", (int)beginningLength, old_filename, suffix, ".h5");
}
- else if ((file_extension = strrchr(old_filename, '.'))) {
- char *new_extension_loc = NULL;
-
+ else {
/* If the filename doesn't contain a ".h5" extension, but contains
* AN extension, just insert the printf format before that extension.
*/
- strcpy(tmp_buffer, old_filename);
- new_extension_loc = strrchr(tmp_buffer, '.');
- sprintf(new_extension_loc, "%s%s", suffix, file_extension);
- }
- else {
- /* If the filename doesn't contain an extension at all, just insert
- * the printf format at the end of the filename.
- */
- snprintf(tmp_buffer, new_filename_len, "%s%s", old_filename, suffix);
+ file_extension = strrchr(old_filename, '.');
+ if (file_extension) {
+ intptr_t beginningLength = file_extension - old_filename;
+ snprintf(tmp_buffer, new_filename_len, "%.*s%s%s", (int)beginningLength, old_filename, suffix,
+ file_extension);
+ }
+ else {
+ /* If the filename doesn't contain an extension at all, just insert
+ * the printf format at the end of the filename.
+ */
+ snprintf(tmp_buffer, new_filename_len, "%s%s", old_filename, suffix);
+ }
}
ret_value = tmp_buffer;
diff --git a/src/H5FDsplitter.c b/src/H5FDsplitter.c
index 723b191..56b2cdb 100644
--- a/src/H5FDsplitter.c
+++ b/src/H5FDsplitter.c
@@ -532,27 +532,28 @@ H5FD__splitter_get_default_wo_path(char *new_path, size_t new_path_len, const ch
HGOTO_ERROR(H5E_VFL, H5E_CANTSET, FAIL, "filename exceeds max length");
/* Determine if filename contains a ".h5" extension. */
- if ((file_extension = strstr(base_filename, ".h5"))) {
+ file_extension = strstr(base_filename, ".h5");
+ if (file_extension) {
/* Insert the suffix between the filename and ".h5" extension. */
- strcpy(new_path, base_filename);
- file_extension = strstr(new_path, ".h5");
- sprintf(file_extension, "%s%s", suffix, ".h5");
+ intptr_t beginningLength = file_extension - base_filename;
+ snprintf(new_path, new_path_len, "%.*s%s%s", (int)beginningLength, base_filename, suffix, ".h5");
}
- else if ((file_extension = strrchr(base_filename, '.'))) {
- char *new_extension_loc = NULL;
-
+ else {
/* If the filename doesn't contain a ".h5" extension, but contains
* AN extension, just insert the suffix before that extension.
*/
- strcpy(new_path, base_filename);
- new_extension_loc = strrchr(new_path, '.');
- sprintf(new_extension_loc, "%s%s", suffix, file_extension);
- }
- else {
- /* If the filename doesn't contain an extension at all, just insert
- * the suffix at the end of the filename.
- */
- snprintf(new_path, new_path_len, "%s%s", base_filename, suffix);
+ file_extension = strrchr(base_filename, '.');
+ if (file_extension) {
+ intptr_t beginningLength = file_extension - base_filename;
+ snprintf(new_path, new_path_len, "%.*s%s%s", (int)beginningLength, base_filename, suffix,
+ file_extension);
+ }
+ else {
+ /* If the filename doesn't contain an extension at all, just insert
+ * the suffix at the end of the filename.
+ */
+ snprintf(new_path, new_path_len, "%s%s", base_filename, suffix);
+ }
}
done:
diff --git a/src/H5FDsubfiling/H5FDsubfiling.c b/src/H5FDsubfiling/H5FDsubfiling.c
index bf175e6..71dd4ba 100644
--- a/src/H5FDsubfiling/H5FDsubfiling.c
+++ b/src/H5FDsubfiling/H5FDsubfiling.c
@@ -2828,7 +2828,7 @@ get_iovec_sizes(subfiling_context_t *sf_context, size_t in_count, haddr_t file_o
* I/O of a size greater than the block size definitionally
* touches all subfiles at least once.
*/
- cur_max_num_subfiles = (size_t)num_subfiles;
+ cur_max_num_subfiles = (int64_t)num_subfiles;
}
else if (data_size < stripe_size) {
/*
diff --git a/src/H5Fint.c b/src/H5Fint.c
index 8738026..1feada6 100644
--- a/src/H5Fint.c
+++ b/src/H5Fint.c
@@ -1615,6 +1615,18 @@ H5F__dest(H5F_t *f, bool flush, bool free_on_failure)
if (vol_wrap_ctx && (NULL == H5VL_object_unwrap(f->vol_obj)))
HDONE_ERROR(H5E_FILE, H5E_CANTGET, FAIL, "can't unwrap VOL object");
+ /*
+ * Clean up any cached type conversion path table entries that
+ * may have been keeping a reference to the file's VOL object
+ * in order to prevent the file from being closed out from
+ * underneath other places that may access the conversion path
+ * or its src/dst datatypes later on (currently, conversions on
+ * variable-length and reference datatypes involve this)
+ */
+ if (H5T_unregister(H5T_PERS_SOFT, NULL, NULL, NULL, f->vol_obj, NULL) < 0)
+ HDONE_ERROR(H5E_FILE, H5E_CANTRELEASE, FAIL,
+ "unable to free cached type conversion path table entries");
+
if (H5VL_free_object(f->vol_obj) < 0)
HDONE_ERROR(H5E_FILE, H5E_CANTDEC, FAIL, "unable to free VOL object");
f->vol_obj = NULL;
diff --git a/src/H5Fmodule.h b/src/H5Fmodule.h
index 706dedc..17c27f4 100644
--- a/src/H5Fmodule.h
+++ b/src/H5Fmodule.h
@@ -408,14 +408,19 @@
* </table>
*
* \anchor fcpl_table_tag File creation property list functions (H5P)
+ * <div>
* \snippet{doc} tables/propertyLists.dox fcpl_table
+ * </div>
*
* \anchor fapl_table_tag File access property list functions (H5P)
+ * <div>
* \snippet{doc} tables/propertyLists.dox fapl_table
+ * </div>
*
* \anchor fd_pl_table_tag File driver property list functions (H5P)
+ * <div>
* \snippet{doc} tables/propertyLists.dox fd_pl_table
- *
+ * </div>
*
* \subsection subsec_file_create Creating or Opening an HDF5 File
* This section describes in more detail how to create and how to open files.
@@ -672,7 +677,9 @@
* #H5FD_SEC2. Alternative layouts and drivers are designed to suit the needs of a variety of
* systems, environments, and applications. The drivers are listed in the table below.
*
+ * <div>
* \snippet{doc} tables/fileDriverLists.dox supported_file_driver_table
+ * </div>
*
* For more information, see the HDF5 Reference Manual entries for the function calls shown in
* the column on the right in the table above.
diff --git a/src/H5Gmodule.h b/src/H5Gmodule.h
index 4c435eb..c330fcd 100644
--- a/src/H5Gmodule.h
+++ b/src/H5Gmodule.h
@@ -477,7 +477,9 @@
* </tr>
* </table>
*
+ * <div>
* \snippet{doc} tables/propertyLists.dox gcpl_table
+ * </div>
*
* <table>
* <caption>Other external link functions</caption>
diff --git a/src/H5Pdcpl.c b/src/H5Pdcpl.c
index cdee942..ae426ee 100644
--- a/src/H5Pdcpl.c
+++ b/src/H5Pdcpl.c
@@ -1550,7 +1550,7 @@ H5P__dcrt_ext_file_list_dec(const void **_pp, void *_value)
enc_size = *(*pp)++;
assert(enc_size < 256);
UINT64DECODE_VAR(*pp, enc_value, enc_size);
- efl->slot[u].offset = (off_t)enc_value;
+ efl->slot[u].offset = (HDoff_t)enc_value;
/* decode size */
enc_size = *(*pp)++;
diff --git a/src/H5Pmodule.h b/src/H5Pmodule.h
index ea0b2de..ef300f9 100644
--- a/src/H5Pmodule.h
+++ b/src/H5Pmodule.h
@@ -891,48 +891,75 @@
* properties. Property lists are deleted by closing the associated handles.
*
* \ref PLCR
+ * <div>
* \snippet{doc} tables/propertyLists.dox plcr_table
+ * </div>
*
* \ref PLCR
+ * <div>
* \snippet{doc} tables/propertyLists.dox plcra_table
+ * </div>
*
* \ref PLCR / \ref OCPL / \ref GCPL
+ * <div>
* \snippet{doc} tables/propertyLists.dox fcpl_table
+ * </div>
*
* \ref PLCR
+ * <div>
* \snippet{doc} tables/propertyLists.dox fapl_table
* \snippet{doc} tables/propertyLists.dox fd_pl_table
+ * </div>
*
* \ref PLCR
+ * <div>
* \snippet{doc} tables/propertyLists.dox lapl_table
+ * </div>
*
* \ref PLCR / \ref OCPL
+ * <div>
* \snippet{doc} tables/propertyLists.dox dcpl_table
+ * </div>
*
* \ref PLCR / \ref LAPL
+ * <div>
* \snippet{doc} tables/propertyLists.dox dapl_table
+ * </div>
*
* \ref PLCR / \ref OCPL
+ * <div>
* \snippet{doc} tables/propertyLists.dox gcpl_table
+ * </div>
*
* \ref PLCR / \ref LAPL
+ * <div>
* \snippet{doc} tables/propertyLists.dox gapl_table
+ * </div>
*
* \ref PLCR
+ * <div>
* \snippet{doc} tables/propertyLists.dox ocpl_table
+ * </div>
*
* \ref PLCR
+ * <div>
* \snippet{doc} tables/propertyLists.dox ocpypl_table
+ * </div>
*
* \ref PLCR
+ * <div>
* \snippet{doc} tables/propertyLists.dox strcpl_table
+ * </div>
*
* \ref PLCR / \ref STRCPL
+ * <div>
* \snippet{doc} tables/propertyLists.dox lcpl_table
+ * </div>
*
* \ref PLCR / \ref STRCPL
+ * <div>
* \snippet{doc} tables/propertyLists.dox acpl_table
- *
+ * </div>
*
* \defgroup STRCPL String Creation Properties
* \ingroup H5P
@@ -941,30 +968,33 @@
* choice of a character encoding, applies to both attributes and links.
* The second creation property applies to links only, and advises the library
* to automatically create missing intermediate groups when creating new objects.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox strcpl_table
+ * </div>
*
* \defgroup LCPL Link Creation Properties
* \ingroup STRCPL
* This creation property applies to links only, and advises the library
* to automatically create missing intermediate groups when creating new objects.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox lcpl_table
- *
+ * </div>
* @see STRCPL
*
* \defgroup ACPL Attribute Creation Properties
* \ingroup STRCPL
* The creation property, the choice of a character encoding, applies to attributes.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox acpl_table
+ * </div>
*
* @see STRCPL
*
* \defgroup LAPL Link Access Properties
* \ingroup H5P
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox lapl_table
+ * </div>
*
* \defgroup DAPL Dataset Access Properties
* \ingroup LAPL
@@ -974,8 +1004,9 @@
* dataset file paths, and controlling flush behavior, etc. These properties
* are \Emph{not} persisted with datasets, and can be adjusted at runtime before
* a dataset is created or opened.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox dapl_table
+ * </div>
*
* \defgroup DCPL Dataset Creation Properties
* \ingroup OCPL
@@ -984,8 +1015,9 @@
* Unlike dataset access and transfer properties, creation properties \Emph{are}
* stored with the dataset, and cannot be changed once a dataset has been
* created.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox dcpl_table
+ * </div>
*
* \defgroup DXPL Dataset Transfer Properties
* \ingroup H5P
@@ -993,8 +1025,9 @@
* and writing datasets such as transformations, MPI-IO I/O mode, error
* detection, etc. These properties are \Emph{not} persisted with datasets,
* and can be adjusted at runtime before a dataset is read or written.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox dxpl_table
+ * </div>
*
* \defgroup FAPL File Access Properties
* \ingroup H5P
@@ -1003,9 +1036,10 @@
* file driver (VFD), configuring the metadata cache (MDC), control
* file locking, etc. These properties are \Emph{not} persisted with files, and
* can be adjusted at runtime before a file is created or opened.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox fapl_table
* \snippet{doc} tables/propertyLists.dox fd_pl_table
+ * </div>
*
* \defgroup FCPL File Creation Properties
* \ingroup GCPL
@@ -1014,14 +1048,16 @@
* Unlike file access properties, creation properties \Emph{are}
* stored with the file, and cannot be changed once a file has been
* created.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox fcpl_table
+ * </div>
*
* \defgroup GAPL Group Access Properties
* \ingroup LAPL
* The functions in this section can be applied to group property lists.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox gapl_table
+ * </div>
*
* \defgroup GCPL Group Creation Properties
* \ingroup OCPL
@@ -1030,32 +1066,37 @@
* Unlike file access properties, creation properties \Emph{are}
* stored with the group, and cannot be changed once a group has been
* created.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox gcpl_table
+ * </div>
*
* \defgroup PLCR Property List Class Root
* \ingroup H5P
* Use the functions in this module to manage HDF5 property lists.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox plcr_table
+ * </div>
*
* \defgroup PLCRA Property List Class Root (Advanced)
* \ingroup H5P
* You can create and customize user-defined property list classes using the
* functions described below. Arbitrary user-defined properties can also
* be inserted into existing property lists as so-called temporary properties.
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox plcra_table
+ * </div>
*
* \defgroup OCPL Object Creation Properties
* \ingroup H5P
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox ocpl_table
+ * </div>
*
* \defgroup OCPYPL Object Copy Properties
* \ingroup H5P
- *
+ * <div>
* \snippet{doc} tables/propertyLists.dox ocpypl_table
+ * </div>
*
* \defgroup FMPL File Mount Properties
* \ingroup H5P
diff --git a/src/H5Ppublic.h b/src/H5Ppublic.h
index 1b5d2f4..903d549 100644
--- a/src/H5Ppublic.h
+++ b/src/H5Ppublic.h
@@ -3360,8 +3360,9 @@ H5_DLL herr_t H5Pget_core_write_tracking(hid_t fapl_id, hbool_t *is_enabled, siz
*
* Valid driver identifiers distributed with HDF5 are listed and
* described in the following table.
- *
+ * <div>
* \snippet{doc} tables/fileDriverLists.dox supported_file_driver_table
+ * </div>
*
* This list does not include custom drivers that might be
* defined and registered by a user.
@@ -6392,6 +6393,9 @@ H5_DLL herr_t H5Pset_dset_no_attrs_hint(hid_t dcpl_id, hbool_t minimize);
* when H5Dwrite() is called to write data to it, the library
* will create the file.
*
+ * \note On Windows, off_t is typically a 32-bit signed long value, which
+ * limits the valid offset that can be set to 2 GiB.
+ *
* \since 1.0.0
*
*/
@@ -6772,7 +6776,7 @@ H5_DLL herr_t H5Pset_scaleoffset(hid_t plist_id, H5Z_SO_scale_type_t scale_type,
* Valid values are #H5_SZIP_EC_OPTION_MASK and
* #H5_SZIP_NN_OPTION_MASK.
* \param[in] pixels_per_block The number of pixels or data elements in each
- * data block
+ * data block (max #H5_SZIP_MAX_PIXELS_PER_BLOCK)
*
* \return \herr_t
*
@@ -6807,7 +6811,7 @@ H5_DLL herr_t H5Pset_scaleoffset(hid_t plist_id, H5Z_SO_scale_type_t scale_type,
* <table>
* <tr>
* <th>Option</th>
- * <th>Description (Mutually exclusive; select one.)</th>
+ * <th>Description (Mutually exclusive; select one)</th>
* </tr>
* <tr>
* <td>#H5_SZIP_EC_OPTION_MASK</td>
@@ -6815,7 +6819,7 @@ H5_DLL herr_t H5Pset_scaleoffset(hid_t plist_id, H5Z_SO_scale_type_t scale_type,
* </tr>
* <tr>
* <td>#H5_SZIP_NN_OPTION_MASK</td>
- * <td>Selects nearest neighbor coding method</td>
+ * <td>Selects nearest neighbor preprocessing followed by entropy coding</td>
* </tr>
* </table>
*
@@ -6863,9 +6867,10 @@ H5_DLL herr_t H5Pset_scaleoffset(hid_t plist_id, H5Z_SO_scale_type_t scale_type,
* conflict can be detected only when the property list is used.
* - Users should be aware that there are factors that affect one's
* rights and ability to use SZIP compression by reviewing the
- * SZIP copyright notice.
+ * SZIP copyright notice. (This limitation does not apply to the
+ * libaec library).
*
- * \note \b For \b Users \b Familiar \b with \b SZIP \b in \b Other \b Contexts:
+ * \note <b> For Users Familiar with SZIP in Other Contexts: </b>
*
* \note The following notes are of interest primarily to those who have
* used SZIP compression outside of the HDF5 context.
diff --git a/src/H5Spublic.h b/src/H5Spublic.h
index 2b6384f..5422d96 100644
--- a/src/H5Spublic.h
+++ b/src/H5Spublic.h
@@ -848,7 +848,7 @@ H5_DLL herr_t H5Soffset_simple(hid_t space_id, const hssize_t *offset);
*
* \brief Closes a dataspace selection iterator
*
- * \space_id{sel_iter_id}
+ * \param[in] sel_iter_id Identifier of the dataspace selection iterator
*
* \return \herr_t
*
@@ -865,8 +865,9 @@ H5_DLL herr_t H5Ssel_iter_close(hid_t sel_iter_id);
*
* \space_id{spaceid}
* \param[in] elmt_size Size of element in the selection
- * \param[in] flags Selection iterator flag
- *
+ * \param[in] flags Selection iterator flags; valid values are:
+ * \li @ref H5S_SEL_ITER_GET_SEQ_LIST_SORTED
+ * \li @ref H5S_SEL_ITER_SHARE_WITH_DATASPACE
* \return \hid_t{valid dataspace selection iterator}
*
* \details H5Ssel_iter_create() creates a selection iterator and initializes
@@ -882,13 +883,13 @@ H5_DLL hid_t H5Ssel_iter_create(hid_t spaceid, size_t elmt_size, unsigned flags)
* \brief Retrieves a list of offset / length sequences for the elements in
* an iterator
*
- * \space_id{sel_iter_id}
- * \param[in] maxseq Maximum number of sequences to retrieve
- * \param[in] maxelmts Maximum number of elements to retrieve in sequences
- * \param[out] nseq Number of sequences retrieved
- * \param[out] nelmts Number of elements retrieved, in all sequences
- * \param[out] off Array of sequence offsets
- * \param[out] len Array of sequence lengths
+ * \param[in] sel_iter_id Identifier of the dataspace selection iterator
+ * \param[in] maxseq Maximum number of sequences to retrieve
+ * \param[in] maxelmts Maximum number of elements to retrieve in sequences
+ * \param[out] nseq Number of sequences retrieved
+ * \param[out] nelmts Number of elements retrieved, in all sequences
+ * \param[out] off Array of sequence offsets
+ * \param[out] len Array of sequence lengths
*
* \return \herr_t
*
diff --git a/src/H5T.c b/src/H5T.c
index 4a5c7cf..d665566 100644
--- a/src/H5T.c
+++ b/src/H5T.c
@@ -343,12 +343,13 @@ typedef H5T_t *(*H5T_copy_func_t)(H5T_t *old_dt);
static herr_t H5T__register_int(H5T_pers_t pers, const char *name, H5T_t *src, H5T_t *dst,
H5T_lib_conv_t func);
static herr_t H5T__register(H5T_pers_t pers, const char *name, H5T_t *src, H5T_t *dst, H5T_conv_func_t *conv);
-static herr_t H5T__unregister(H5T_pers_t pers, const char *name, H5T_t *src, H5T_t *dst, H5T_conv_t func);
static htri_t H5T__compiler_conv(H5T_t *src, H5T_t *dst);
static herr_t H5T__set_size(H5T_t *dt, size_t size);
static herr_t H5T__close_cb(H5T_t *dt, void **request);
static H5T_path_t *H5T__path_find_real(const H5T_t *src, const H5T_t *dst, const char *name,
H5T_conv_func_t *conv);
+static bool H5T_path_match(H5T_path_t *path, H5T_pers_t pers, const char *name, H5T_t *src, H5T_t *dst,
+ H5VL_object_t *owned_vol_obj, H5T_conv_t func);
static bool H5T__detect_vlen_ref(const H5T_t *dt);
static H5T_t *H5T__initiate_copy(const H5T_t *old_dt);
static H5T_t *H5T__copy_transient(H5T_t *old_dt);
@@ -2671,7 +2672,7 @@ done:
} /* end H5Tregister() */
/*-------------------------------------------------------------------------
- * Function: H5T__unregister
+ * Function: H5T_unregister
*
* Purpose: Removes conversion paths that match the specified criteria.
* All arguments are optional. Missing arguments are wild cards.
@@ -2682,18 +2683,33 @@ done:
*
*-------------------------------------------------------------------------
*/
-static herr_t
-H5T__unregister(H5T_pers_t pers, const char *name, H5T_t *src, H5T_t *dst, H5T_conv_t func)
+herr_t
+H5T_unregister(H5T_pers_t pers, const char *name, H5T_t *src, H5T_t *dst, H5VL_object_t *owned_vol_obj,
+ H5T_conv_t func)
{
H5T_path_t *path = NULL; /*conversion path */
H5T_soft_t *soft = NULL; /*soft conversion information */
int nprint = 0; /*number of paths shut down */
int i; /*counter */
- FUNC_ENTER_PACKAGE_NOERR
+ FUNC_ENTER_NOAPI_NOERR
- /* Remove matching entries from the soft list */
- if (H5T_PERS_DONTCARE == pers || H5T_PERS_SOFT == pers) {
+ /*
+ * Remove matching entries from the soft list if:
+ *
+ * - The caller didn't specify a particular type (soft or hard)
+ * of conversion path to match against or specified that soft
+ * conversion paths should be matched against
+ *
+ * AND
+ *
+ * - The caller didn't provide the `owned_vol_obj` parameter;
+ * if this parameter is provided, we want to leave the soft
+ * list untouched and only remove cached conversion paths
+ * below where the file VOL object associated with the path's
+ * source or destination types matches the given VOL object.
+ */
+ if ((H5T_PERS_DONTCARE == pers || H5T_PERS_SOFT == pers) && !owned_vol_obj) {
for (i = H5T_g.nsoft - 1; i >= 0; --i) {
soft = H5T_g.soft + i;
assert(soft);
@@ -2713,13 +2729,15 @@ H5T__unregister(H5T_pers_t pers, const char *name, H5T_t *src, H5T_t *dst, H5T_c
/* Remove matching conversion paths, except no-op path */
for (i = H5T_g.npaths - 1; i > 0; --i) {
+ bool nomatch;
+
path = H5T_g.path[i];
assert(path);
+ nomatch = !H5T_path_match(path, pers, name, src, dst, owned_vol_obj, func);
+
/* Not a match */
- if (((H5T_PERS_SOFT == pers && path->is_hard) || (H5T_PERS_HARD == pers && !path->is_hard)) ||
- (name && *name && strcmp(name, path->name) != 0) || (src && H5T_cmp(src, path->src, false)) ||
- (dst && H5T_cmp(dst, path->dst, false)) || (func && func != path->conv.u.app_func)) {
+ if (nomatch) {
/*
* Notify all other functions to recalculate private data since some
* functions might cache a list of conversion functions. For
@@ -2768,7 +2786,7 @@ H5T__unregister(H5T_pers_t pers, const char *name, H5T_t *src, H5T_t *dst, H5T_c
} /* end for */
FUNC_LEAVE_NOAPI(SUCCEED)
-} /* end H5T__unregister() */
+} /* end H5T_unregister() */
/*-------------------------------------------------------------------------
* Function: H5Tunregister
@@ -2798,7 +2816,7 @@ H5Tunregister(H5T_pers_t pers, const char *name, hid_t src_id, hid_t dst_id, H5T
if (dst_id > 0 && (NULL == (dst = (H5T_t *)H5I_object_verify(dst_id, H5I_DATATYPE))))
HGOTO_ERROR(H5E_ARGS, H5E_BADTYPE, FAIL, "dst is not a data type");
- if (H5T__unregister(pers, name, src, dst, func) < 0)
+ if (H5T_unregister(pers, name, src, dst, NULL, func) < 0)
HGOTO_ERROR(H5E_DATATYPE, H5E_CANTDELETE, FAIL, "internal unregister function failed");
done:
@@ -5149,6 +5167,53 @@ done:
} /* end H5T__path_find_real() */
/*-------------------------------------------------------------------------
+ * Function: H5T_path_match
+ *
+ * Purpose: Helper function to determine whether a datatype conversion
+ * path object matches against a given set of criteria.
+ *
+ * Return: true/false (can't fail)
+ *
+ *-------------------------------------------------------------------------
+ */
+static bool
+H5T_path_match(H5T_path_t *path, H5T_pers_t pers, const char *name, H5T_t *src, H5T_t *dst,
+ H5VL_object_t *owned_vol_obj, H5T_conv_t func)
+{
+ bool ret_value = true;
+
+ assert(path);
+
+ FUNC_ENTER_NOAPI_NOINIT_NOERR
+
+ if (
+ /* Check that the specified conversion function persistence matches */
+ ((H5T_PERS_SOFT == pers && path->is_hard) || (H5T_PERS_HARD == pers && !path->is_hard)) ||
+
+ /* Check that the specified conversion path name matches */
+ (name && *name && strcmp(name, path->name) != 0) ||
+
+ /*
+ * Check that the specified source and destination datatypes match
+ * the source and destination datatypes in the conversion path
+ */
+ (src && H5T_cmp(src, path->src, false)) || (dst && H5T_cmp(dst, path->dst, false)) ||
+
+ /*
+ * Check that the specified VOL object matches the VOL object
+ * in the conversion path
+ */
+ (owned_vol_obj && (owned_vol_obj != path->src->shared->owned_vol_obj) &&
+ (owned_vol_obj != path->dst->shared->owned_vol_obj)) ||
+
+ /* Check that the specified conversion function matches */
+ (func && func != path->conv.u.app_func))
+ ret_value = false;
+
+ FUNC_LEAVE_NOAPI(ret_value)
+} /* H5T_path_match() */
+
+/*-------------------------------------------------------------------------
* Function: H5T_path_noop
*
* Purpose: Is the path the special no-op path? The no-op function can be
@@ -6095,3 +6160,26 @@ H5T_own_vol_obj(H5T_t *dt, H5VL_object_t *vol_obj)
done:
FUNC_LEAVE_NOAPI(ret_value)
} /* end H5T_own_vol_obj() */
+
+/*-------------------------------------------------------------------------
+ * Function: H5T__get_path_table_npaths
+ *
+ * Purpose: Testing function to return the number of type conversion
+ * paths currently stored in the type conversion path table
+ * cache.
+ *
+ * Return: Number of type conversion paths (can't fail)
+ *
+ *-------------------------------------------------------------------------
+ */
+int
+H5T__get_path_table_npaths(void)
+{
+ int ret_value = 0;
+
+ FUNC_ENTER_PACKAGE_NOERR
+
+ ret_value = H5T_g.npaths;
+
+ FUNC_LEAVE_NOAPI(ret_value)
+}
diff --git a/src/H5Tmodule.h b/src/H5Tmodule.h
index b4f9289..f1b7b17 100644
--- a/src/H5Tmodule.h
+++ b/src/H5Tmodule.h
@@ -3892,30 +3892,42 @@ filled according to the value of this property. The padding can be:
* \details CPU-specific datatypes
* \defgroup PDTALPHA DEC Alpha
* \ingroup PDTCPU
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_dec_datatypes_table
+ * </div>
* \defgroup PDTX86 AMD & INTEL
* \ingroup PDTCPU
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_intel_datatypes_table
+ * </div>
* \defgroup PDTMIPS SGI MIPS
* \ingroup PDTCPU
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_mips_datatypes_table
+ * </div>
*
* \defgroup PDTIEEE IEEE
* \ingroup PDT
* \details The IEEE floating point types in big- and little-endian byte orders.
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_ieee_datatypes_table
+ * </div>
*
* \defgroup PDTSTD Standard Datatypes
* \ingroup PDT
* \details These are "standard" types. For instance, signed (2's complement)
* and unsigned integers of various sizes in big- and little-endian
* byte orders.
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_std_datatypes_table
+ * </div>
*
* \defgroup PDTUNIX UNIX-specific Datatypes
* \ingroup PDT
* \details Types which are particular to Unix.
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_unix_datatypes_table
+ * </div>
*
* \defgroup PDTNAT Native Datatypes
* \ingroup PDT
@@ -3928,16 +3940,22 @@ filled according to the value of this property. The padding can be:
 * \li The datatype \c LLONG corresponds to C's \Code{long long} and
* \c LDOUBLE is \Code{long double}. These types might be the same
* as \c LONG and \c DOUBLE, respectively.
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_native_datatypes_table
+ * </div>
*
* \defgroup PDTC9x C9x Integer Datatypes
* \ingroup PDTNAT
* \details C9x integer types
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_c9x_datatypes_table
+ * </div>
*
* \defgroup PDTS Strings
* \ingroup PDT
+ * <div>
* \snippet{doc} tables/predefinedDatatypes.dox predefined_string_datatypes_table
+ * </div>
*
*/
diff --git a/src/H5Tpkg.h b/src/H5Tpkg.h
index b9e24be..ef5ba36 100644
--- a/src/H5Tpkg.h
+++ b/src/H5Tpkg.h
@@ -877,4 +877,7 @@ H5_DLL herr_t H5T__sort_name(const H5T_t *dt, int *map);
/* Debugging functions */
H5_DLL herr_t H5T__print_stats(H5T_path_t *path, int *nprint /*in,out*/);
+/* Testing functions */
+H5_DLL int H5T__get_path_table_npaths(void);
+
#endif /* H5Tpkg_H */
diff --git a/src/H5Tprivate.h b/src/H5Tprivate.h
index 0332679..3120053 100644
--- a/src/H5Tprivate.h
+++ b/src/H5Tprivate.h
@@ -133,6 +133,8 @@ H5_DLL H5T_path_t *H5T_path_find(const H5T_t *src, const H5T_t *dst);
H5_DLL bool H5T_path_noop(const H5T_path_t *p);
H5_DLL H5T_bkg_t H5T_path_bkg(const H5T_path_t *p);
H5_DLL H5T_subset_info_t *H5T_path_compound_subset(const H5T_path_t *p);
+H5_DLL herr_t H5T_unregister(H5T_pers_t pers, const char *name, H5T_t *src, H5T_t *dst,
+ H5VL_object_t *owned_vol_obj, H5T_conv_t func);
H5_DLL herr_t H5T_convert(H5T_path_t *tpath, hid_t src_id, hid_t dst_id, size_t nelmts, size_t buf_stride,
size_t bkg_stride, void *buf, void *bkg);
H5_DLL herr_t H5T_reclaim(hid_t type_id, struct H5S_t *space, void *buf);
diff --git a/src/H5VLmodule.h b/src/H5VLmodule.h
index 546b31e..19baf34 100644
--- a/src/H5VLmodule.h
+++ b/src/H5VLmodule.h
@@ -606,15 +606,21 @@
* fact, implement some of this functionality as it is possible to mimic the native
* HDF5 connector, however this will probably not be true for most non-native
* VOL connectors.
+ * <div>
* \snippet{doc} tables/volAPIs.dox vol_native_table
+ * </div>
*
* \subsubsection subsubsec_vol_compat_indep List of HDF5 VOL-Independent API Calls
* These HDF5 API calls do not depend on a particular VOL connector being loaded.
+ * <div>
* \snippet{doc} tables/volAPIs.dox vol_independent_table
+ * </div>
*
* \subsubsection subsubsec_vol_compat_opt List of Native VOL Optional Operation Values By Subclass
 * These values can be passed to the \p opt_type parameter of H5VLquery_optional().
+ * <div>
* \snippet{doc} tables/volAPIs.dox vol_optional_table
+ * </div>
*
*
*
diff --git a/src/H5Zpublic.h b/src/H5Zpublic.h
index 44d91c0..bd557fc 100644
--- a/src/H5Zpublic.h
+++ b/src/H5Zpublic.h
@@ -110,24 +110,55 @@ typedef int H5Z_filter_t;
*/
#define H5Z_FLAG_SKIP_EDC 0x0200
-/* Special parameters for szip compression */
-/* [These are aliases for the similar definitions in szlib.h, which we can't
- * include directly due to the duplication of various symbols with the zlib.h
- * header file] */
+/* Special parameters for szip compression
+ *
+ * These are aliases for similarly-named definitions in szlib.h, which we
+ * can't include directly due to the duplication of various symbols with the
+ * zlib.h header file.
+ *
+ * The flag values are set to the same values as in szlib.h. The following
+ * symbols are internal and defined in H5Zprivate.h:
+ *
+ * - H5_SZIP_LSB_OPTION_MASK
+ * - H5_SZIP_MSB_OPTION_MASK
+ * - H5_SZIP_RAW_OPTION_MASK
+ *
+ * TODO: These symbols should probably be deprecated and moved to H5Zprivate.h
+ * in the next major release of the library since they are only used
+ * internally:
+ *
+ * - H5_SZIP_ALLOW_K13_OPTION_MASK
+ * - H5_SZIP_CHIP_OPTION_MASK
+ */
/**
- * \ingroup SZIP */
+ * \ingroup SZIP
+ *
+ * Used internally. Always added to the \p options_mask parameter of H5Pset_szip().
+ */
#define H5_SZIP_ALLOW_K13_OPTION_MASK 1
/**
- * \ingroup SZIP */
+ * \ingroup SZIP
+ *
+ * Used internally. Always removed from the \p options_mask parameter of H5Pset_szip().
+ */
#define H5_SZIP_CHIP_OPTION_MASK 2
/**
- * \ingroup SZIP */
+ * \ingroup SZIP
+ *
+ * Use the entropy coding method
+ */
#define H5_SZIP_EC_OPTION_MASK 4
/**
- * \ingroup SZIP */
+ * \ingroup SZIP
+ *
+ * Use nearest neighbor preprocessing and then the entropy coding method
+ */
#define H5_SZIP_NN_OPTION_MASK 32
/**
- * \ingroup SZIP */
+ * \ingroup SZIP
+ *
+ * The maximum number of pixels per block (see H5Pset_szip())
+ */
#define H5_SZIP_MAX_PIXELS_PER_BLOCK 32
/* Macros for the shuffle filter */
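The new Doxygen comments state that `H5_SZIP_ALLOW_K13_OPTION_MASK` is always added to, and `H5_SZIP_CHIP_OPTION_MASK` always removed from, the `options_mask` passed to `H5Pset_szip()`. A standalone sketch of that mask normalization (the function name `normalize_szip_mask` is hypothetical; the mask values match the definitions above):

```c
#define H5_SZIP_ALLOW_K13_OPTION_MASK 1
#define H5_SZIP_CHIP_OPTION_MASK      2
#define H5_SZIP_EC_OPTION_MASK        4
#define H5_SZIP_NN_OPTION_MASK        32

/* Normalize a user-supplied szip options mask the way the docs above
 * describe: force ALLOW_K13 on and CHIP off, leaving the user's choice
 * of EC vs. NN coding intact. */
static unsigned
normalize_szip_mask(unsigned user_mask)
{
    user_mask |= H5_SZIP_ALLOW_K13_OPTION_MASK;        /* always added   */
    user_mask &= ~(unsigned)H5_SZIP_CHIP_OPTION_MASK;  /* always removed */
    return user_mask;
}
```

Because the two internal bits are forced either way, user code only needs to pick one of `H5_SZIP_EC_OPTION_MASK` or `H5_SZIP_NN_OPTION_MASK` (they are mutually exclusive, per the table above).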
diff --git a/src/H5checksum.c b/src/H5checksum.c
index 66a6c86..6eafc0d 100644
--- a/src/H5checksum.c
+++ b/src/H5checksum.c
@@ -345,7 +345,7 @@ the return value. Two keys differing by one or two bits will have
totally different hash values.
The best hash table sizes are powers of 2. There is no need to do
-mod a prime (mod is sooo slow!). If you need less than 32 bits,
+mod a prime (mod is so slow!). If you need less than 32 bits,
use a bitmask. For example, if you need only 10 bits, do
h = (h & hashmask(10));
In which case, the hash table should have hashsize(10) elements.
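The comment's advice can be sketched directly: for a power-of-two table, reducing a 32-bit hash to a bucket index is a single AND with a mask rather than a modulo by a prime. The `hashsize`/`hashmask` macros follow the lookup3 code quoted in H5checksum.c; `bucket_for` is a hypothetical wrapper for illustration:

```c
#include <stdint.h>

/* Power-of-two table size and the matching low-bit mask */
#define hashsize(n) ((uint32_t)1 << (n))
#define hashmask(n) (hashsize(n) - 1)

/* Reduce a full 32-bit hash to an index into a table of
 * hashsize(bits) slots -- one AND instead of a division. */
static uint32_t
bucket_for(uint32_t h, unsigned bits)
{
    return h & hashmask(bits);
}
```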
diff --git a/src/H5win32defs.h b/src/H5win32defs.h
index ba6028a..a9a4628 100644
--- a/src/H5win32defs.h
+++ b/src/H5win32defs.h
@@ -22,7 +22,7 @@
/* off_t exists on Windows, but is always a 32-bit long, even on 64-bit Windows,
* so we define HDoff_t to be __int64, which is the type of the st_size field
- * of the _stati64 struct.
+ * of the _stati64 struct and what is returned by _ftelli64().
*/
#define HDoff_t __int64
@@ -42,6 +42,7 @@ struct timezone {
#define HDcreat(S, M) Wopen_utf8(S, O_CREAT | O_TRUNC | O_RDWR, M)
#define HDflock(F, L) Wflock(F, L)
#define HDfstat(F, B) _fstati64(F, B)
+#define HDftell(F) _ftelli64(F)
#define HDgetdcwd(D, S, Z) _getdcwd(D, S, Z)
#define HDgetdrive() _getdrive()
#define HDgettimeofday(V, Z) Wgettimeofday(V, Z)
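The reason for routing offsets through `HDoff_t` and `_ftelli64()` is the 2 GiB ceiling of a 32-bit signed `off_t`. A portable sketch of the range check involved (`int32_t` stands in for the Windows `off_t`, `int64_t` for `HDoff_t`/`__int64`; the helper name is hypothetical):

```c
#include <stdint.h>

/* Would this file offset survive a round-trip through a 32-bit
 * signed off_t, as used by the non-64-bit Windows stat/tell calls? */
static int
offset_fits_32(int64_t off)
{
    return off >= INT32_MIN && off <= INT32_MAX;
}
```

Any offset at or beyond 2 GiB fails this check, which is why the 64-bit `_fstati64`/`_ftelli64` variants are mapped in above.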
diff --git a/src/Makefile.am b/src/Makefile.am
index 2272389c..e662577 100644
--- a/src/Makefile.am
+++ b/src/Makefile.am
@@ -56,7 +56,7 @@ libhdf5_la_SOURCES= H5.c H5build_settings.c H5checksum.c H5dbg.c H5system.c \
H5FD.c H5FDcore.c H5FDfamily.c H5FDint.c H5FDlog.c H5FDmulti.c \
H5FDonion.c H5FDonion_header.c H5FDonion_history.c H5FDonion_index.c \
H5FDperform.c H5FDsec2.c H5FDspace.c \
- H5FDsplitter.c H5FDstdio.c H5FDtest.c \
+ H5FDsplitter.c H5FDstdio.c H5FDtest.c H5FDwindows.c \
H5FL.c H5FO.c H5FS.c H5FScache.c H5FSdbg.c H5FSint.c H5FSsection.c \
H5FSstat.c H5FStest.c \
H5G.c H5Gbtree2.c H5Gcache.c H5Gcompact.c H5Gdense.c H5Gdeprec.c \
diff --git a/test/API/H5_api_async_test.c b/test/API/H5_api_async_test.c
index 7777e10..6bcbe8d 100644
--- a/test/API/H5_api_async_test.c
+++ b/test/API/H5_api_async_test.c
@@ -369,7 +369,7 @@ test_multi_dataset_io(void)
/* Loop over datasets */
for (i = 0; i < 5; i++) {
/* Set dataset name */
- sprintf(dset_name, "dset%d", i);
+ snprintf(dset_name, sizeof(dset_name), "dset%d", i);
/* Create the dataset asynchronously */
if ((dset_id[i] = H5Dcreate_async(file_id, dset_name, H5T_NATIVE_INT, space_id, H5P_DEFAULT,
@@ -450,7 +450,7 @@ test_multi_dataset_io(void)
/* Loop over datasets */
for (i = 0; i < 5; i++) {
/* Set dataset name */
- sprintf(dset_name, "dset%d", i);
+ snprintf(dset_name, sizeof(dset_name), "dset%d", i);
/* Open the dataset asynchronously */
if ((dset_id[0] = H5Dopen_async(file_id, dset_name, H5P_DEFAULT, es_id)) < 0)
@@ -479,7 +479,7 @@ test_multi_dataset_io(void)
/* Loop over datasets */
for (i = 0; i < 5; i++) {
/* Set dataset name */
- sprintf(dset_name, "dset%d", i);
+ snprintf(dset_name, sizeof(dset_name), "dset%d", i);
/* Open the dataset asynchronously */
if ((dset_id[0] = H5Dopen_async(file_id, dset_name, H5P_DEFAULT, es_id)) < 0)
@@ -619,7 +619,7 @@ test_multi_file_dataset_io(void)
/* Loop over files */
for (i = 0; i < 5; i++) {
/* Set file name */
- sprintf(file_name, ASYNC_API_TEST_FILE_PRINTF, i);
+ snprintf(file_name, sizeof(file_name), ASYNC_API_TEST_FILE_PRINTF, i);
/* Create file asynchronously */
if ((file_id[i] =
@@ -761,7 +761,7 @@ test_multi_file_dataset_io(void)
/* Loop over files */
for (i = 0; i < 5; i++) {
/* Set file name */
- sprintf(file_name, ASYNC_API_TEST_FILE_PRINTF, i);
+ snprintf(file_name, sizeof(file_name), ASYNC_API_TEST_FILE_PRINTF, i);
/* Open the file asynchronously */
if ((file_id[0] = H5Fopen_async(file_name, H5F_ACC_RDWR, H5P_DEFAULT, es_id)) < 0)
@@ -799,7 +799,7 @@ test_multi_file_dataset_io(void)
/* Loop over files */
for (i = 0; i < 5; i++) {
/* Set file name */
- sprintf(file_name, ASYNC_API_TEST_FILE_PRINTF, i);
+ snprintf(file_name, sizeof(file_name), ASYNC_API_TEST_FILE_PRINTF, i);
/* Open the file asynchronously */
if ((file_id[0] = H5Fopen_async(file_name, H5F_ACC_RDONLY, H5P_DEFAULT, es_id)) < 0)
@@ -929,7 +929,7 @@ test_multi_file_grp_dset_io(void)
/* Loop over files */
for (i = 0; i < 5; i++) {
/* Set file name */
- sprintf(file_name, ASYNC_API_TEST_FILE_PRINTF, i);
+ snprintf(file_name, sizeof(file_name), ASYNC_API_TEST_FILE_PRINTF, i);
/* Create file asynchronously */
if ((file_id = H5Fcreate_async(file_name, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT, es_id)) <
@@ -981,7 +981,7 @@ test_multi_file_grp_dset_io(void)
/* Loop over files */
for (i = 0; i < 5; i++) {
/* Set file name */
- sprintf(file_name, ASYNC_API_TEST_FILE_PRINTF, i);
+ snprintf(file_name, sizeof(file_name), ASYNC_API_TEST_FILE_PRINTF, i);
/* Open the file asynchronously */
if ((file_id = H5Fopen_async(file_name, H5F_ACC_RDONLY, H5P_DEFAULT, es_id)) < 0)
@@ -1039,7 +1039,7 @@ test_multi_file_grp_dset_io(void)
/* Loop over files */
for (i = 0; i < 5; i++) {
/* Set file name */
- sprintf(file_name, ASYNC_API_TEST_FILE_PRINTF, i);
+ snprintf(file_name, sizeof(file_name), ASYNC_API_TEST_FILE_PRINTF, i);
/* Create file asynchronously */
if ((file_id = H5Fcreate_async(file_name, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT, es_id)) <
@@ -1096,7 +1096,7 @@ test_multi_file_grp_dset_io(void)
/* Loop over files */
for (i = 0; i < 5; i++) {
/* Set file name */
- sprintf(file_name, ASYNC_API_TEST_FILE_PRINTF, i);
+ snprintf(file_name, sizeof(file_name), ASYNC_API_TEST_FILE_PRINTF, i);
/* Open the file asynchronously */
if ((file_id = H5Fopen_async(file_name, H5F_ACC_RDONLY, H5P_DEFAULT, es_id)) < 0)
@@ -2676,7 +2676,7 @@ cleanup_files(void)
H5Fdelete(ASYNC_API_TEST_FILE, H5P_DEFAULT);
for (i = 0; i <= max_printf_file; i++) {
- snprintf(file_name, 64, ASYNC_API_TEST_FILE_PRINTF, i);
+ snprintf(file_name, sizeof(file_name), ASYNC_API_TEST_FILE_PRINTF, i);
H5Fdelete(file_name, H5P_DEFAULT);
} /* end for */
}
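The repeated change above swaps `sprintf` for `snprintf` bounded by `sizeof(file_name)`. A minimal plain-C sketch (not HDF5 code; `format_name` and the file-name pattern are illustrative only) of why bounding by the destination array's `sizeof` is the safe idiom:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical sketch: snprintf bounded by the destination's sizeof()
 * truncates safely instead of overflowing like sprintf. The return value
 * is the length that WOULD have been written, so a result >= bufsize
 * signals truncation. */
static int
format_name(char *buf, size_t bufsize, int index)
{
    return snprintf(buf, bufsize, "async_test_file_%d.h5", index);
}
```

Passing the array itself (so `sizeof` sees the real capacity) rather than a hard-coded `64`, as the `cleanup_files` hunk does, keeps the bound correct if the buffer is ever resized.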
diff --git a/test/API/H5_api_attribute_test.c b/test/API/H5_api_attribute_test.c
index 680ee43..fd56be8 100644
--- a/test/API/H5_api_attribute_test.c
+++ b/test/API/H5_api_attribute_test.c
@@ -51,6 +51,8 @@ static int test_attribute_iterate_datatype(void);
static int test_attribute_iterate_index_saving(void);
static int test_attribute_iterate_invalid_params(void);
static int test_attribute_iterate_0_attributes(void);
+static int test_attribute_compound_subset(void);
+static int test_attribute_string_encodings(void);
static int test_delete_attribute(void);
static int test_delete_attribute_invalid_params(void);
static int test_attribute_exists(void);
@@ -99,6 +101,8 @@ static int (*attribute_tests[])(void) = {test_create_attribute_on_root,
test_attribute_iterate_index_saving,
test_attribute_iterate_invalid_params,
test_attribute_iterate_0_attributes,
+ test_attribute_compound_subset,
+ test_attribute_string_encodings,
test_delete_attribute,
test_delete_attribute_invalid_params,
test_attribute_exists,
@@ -8333,6 +8337,535 @@ error:
return 1;
}
+/* A compound type for test_attribute_compound_subset */
+typedef struct attribute_compound_io_t {
+ int a;
+ int b;
+} attribute_compound_io_t;
+
+/*
+ * A test to ensure that data is read back correctly from an attribute after it has
+ * been written, using subsets of compound datatypes
+ */
+static int
+test_attribute_compound_subset(void)
+{
+ hsize_t dims[1] = {ATTRIBUTE_COMPOUND_IO_ATTR_DIMS};
+ size_t i;
+ hid_t file_id = H5I_INVALID_HID;
+ hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
+ hid_t attr_id = H5I_INVALID_HID;
+ hid_t space_id = H5I_INVALID_HID;
+ hid_t full_type_id = H5I_INVALID_HID;
+ hid_t a_type_id = H5I_INVALID_HID;
+ hid_t b_type_id = H5I_INVALID_HID;
+ attribute_compound_io_t wbuf[ATTRIBUTE_COMPOUND_IO_ATTR_DIMS];
+ attribute_compound_io_t rbuf[ATTRIBUTE_COMPOUND_IO_ATTR_DIMS];
+ attribute_compound_io_t fbuf[ATTRIBUTE_COMPOUND_IO_ATTR_DIMS];
+ attribute_compound_io_t erbuf[ATTRIBUTE_COMPOUND_IO_ATTR_DIMS];
+
+ TESTING_MULTIPART(
+ "verification of attribute data using H5Awrite then H5Aread with compound type subsets");
+
+ /* Make sure the connector supports the API functions being tested */
+ if (!(vol_cap_flags_g & H5VL_CAP_FLAG_FILE_BASIC) || !(vol_cap_flags_g & H5VL_CAP_FLAG_GROUP_BASIC) ||
+ !(vol_cap_flags_g & H5VL_CAP_FLAG_ATTR_BASIC)) {
+ SKIPPED();
+ printf(
+ " API functions for basic file, group, or attribute aren't supported with this connector\n");
+ return 0;
+ }
+
+ TESTING_2("test setup");
+
+ if ((file_id = H5Fopen(H5_api_test_filename, H5F_ACC_RDWR, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open file '%s'\n", H5_api_test_filename);
+ goto error;
+ }
+
+ if ((container_group = H5Gopen2(file_id, ATTRIBUTE_TEST_GROUP_NAME, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open container group '%s'\n", ATTRIBUTE_TEST_GROUP_NAME);
+ goto error;
+ }
+
+ if ((group_id = H5Gcreate2(container_group, ATTRIBUTE_COMPOUND_IO_TEST_GROUP_NAME, H5P_DEFAULT,
+ H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create container sub-group '%s'\n", ATTRIBUTE_COMPOUND_IO_TEST_GROUP_NAME);
+ goto error;
+ }
+
+ if ((space_id = H5Screate_simple(1, dims, NULL)) < 0)
+ TEST_ERROR;
+
+ if ((full_type_id = H5Tcreate(H5T_COMPOUND, sizeof(attribute_compound_io_t))) < 0)
+ TEST_ERROR;
+ if (H5Tinsert(full_type_id, "a", HOFFSET(attribute_compound_io_t, a), H5T_NATIVE_INT) < 0)
+ TEST_ERROR;
+ if (H5Tinsert(full_type_id, "b", HOFFSET(attribute_compound_io_t, b), H5T_NATIVE_INT) < 0)
+ TEST_ERROR;
+
+ if ((a_type_id = H5Tcreate(H5T_COMPOUND, sizeof(attribute_compound_io_t))) < 0)
+ TEST_ERROR;
+ if (H5Tinsert(a_type_id, "a", HOFFSET(attribute_compound_io_t, a), H5T_NATIVE_INT) < 0)
+ TEST_ERROR;
+
+ if ((b_type_id = H5Tcreate(H5T_COMPOUND, sizeof(attribute_compound_io_t))) < 0)
+ TEST_ERROR;
+ if (H5Tinsert(b_type_id, "b", HOFFSET(attribute_compound_io_t, b), H5T_NATIVE_INT) < 0)
+ TEST_ERROR;
+
+ if ((attr_id = H5Acreate2(group_id, ATTRIBUTE_COMPOUND_IO_TEST_ATTR_NAME, full_type_id, space_id,
+ H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create attribute '%s'\n", ATTRIBUTE_COMPOUND_IO_TEST_ATTR_NAME);
+ goto error;
+ }
+
+ PASSED();
+
+ BEGIN_MULTIPART
+ {
+ PART_BEGIN(write_full_read_full)
+ {
+ TESTING_2("H5Awrite then H5Aread with all compound members");
+
+ /* Initialize wbuf */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ wbuf[i].a = (int)(2 * i);
+ wbuf[i].b = (int)(2 * i + 1);
+ }
+
+ /* Write data */
+ if (H5Awrite(attr_id, full_type_id, wbuf) < 0)
+ PART_TEST_ERROR(write_full_read_full);
+
+ /* Update fbuf to match file state */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ fbuf[i].a = wbuf[i].a;
+ fbuf[i].b = wbuf[i].b;
+ }
+
+ /* Initialize rbuf to -1 */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ rbuf[i].a = -1;
+ rbuf[i].b = -1;
+ }
+
+ /* Set erbuf (simply match file state since we're reading the whole
+ * thing) */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ erbuf[i].a = fbuf[i].a;
+ erbuf[i].b = fbuf[i].b;
+ }
+
+ /* Read data */
+ if (H5Aread(attr_id, full_type_id, rbuf) < 0)
+ PART_TEST_ERROR(write_full_read_full);
+
+ /* Verify data */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ if (rbuf[i].a != erbuf[i].a)
+ PART_TEST_ERROR(write_full_read_full);
+ if (rbuf[i].b != erbuf[i].b)
+ PART_TEST_ERROR(write_full_read_full);
+ }
+
+ PASSED();
+ }
+ PART_END(write_full_read_full);
+
+ PART_BEGIN(read_a)
+ {
+ TESTING_2("H5Aread with compound member a");
+
+ /* Initialize rbuf to -1 */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ rbuf[i].a = -1;
+ rbuf[i].b = -1;
+ }
+
+            /* Set erbuf (element a comes from the file, element b is untouched)
+ */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ erbuf[i].a = fbuf[i].a;
+ erbuf[i].b = rbuf[i].b;
+ }
+
+ /* Read data */
+ if (H5Aread(attr_id, a_type_id, rbuf) < 0)
+ PART_TEST_ERROR(read_a);
+
+ /* Verify data */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ if (rbuf[i].a != erbuf[i].a)
+ PART_TEST_ERROR(read_a);
+ if (rbuf[i].b != erbuf[i].b)
+ PART_TEST_ERROR(read_a);
+ }
+
+ PASSED();
+ }
+ PART_END(read_a);
+
+ PART_BEGIN(write_b_read_full)
+ {
+ TESTING_2("H5Awrite with compound member b then H5Aread with all compound members");
+
+ /* Initialize wbuf */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ wbuf[i].a = (int)(2 * ATTRIBUTE_COMPOUND_IO_ATTR_DIMS + 2 * i);
+ wbuf[i].b = (int)(2 * ATTRIBUTE_COMPOUND_IO_ATTR_DIMS + 2 * i + 1);
+ }
+
+ /* Write data */
+ if (H5Awrite(attr_id, b_type_id, wbuf) < 0)
+ PART_TEST_ERROR(write_b_read_full);
+
+ /* Update fbuf to match file state - only element b was updated */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ fbuf[i].b = wbuf[i].b;
+ }
+
+ /* Initialize rbuf to -1 */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ rbuf[i].a = -1;
+ rbuf[i].b = -1;
+ }
+
+ /* Set erbuf (simply match file state since we're reading the whole
+ * thing) */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ erbuf[i].a = fbuf[i].a;
+ erbuf[i].b = fbuf[i].b;
+ }
+
+ /* Read data */
+ if (H5Aread(attr_id, full_type_id, rbuf) < 0)
+ PART_TEST_ERROR(write_b_read_full);
+
+ /* Verify data */
+ for (i = 0; i < ATTRIBUTE_COMPOUND_IO_ATTR_DIMS; i++) {
+ if (rbuf[i].a != erbuf[i].a)
+ PART_TEST_ERROR(write_b_read_full);
+ if (rbuf[i].b != erbuf[i].b)
+ PART_TEST_ERROR(write_b_read_full);
+ }
+
+ PASSED();
+ }
+ PART_END(write_b_read_full);
+ }
+ END_MULTIPART;
+
+ TESTING_2("test cleanup");
+
+ if (H5Sclose(space_id) < 0)
+ TEST_ERROR;
+ if (H5Aclose(attr_id) < 0)
+ TEST_ERROR;
+ if (H5Gclose(group_id) < 0)
+ TEST_ERROR;
+ if (H5Gclose(container_group) < 0)
+ TEST_ERROR;
+ if (H5Fclose(file_id) < 0)
+ TEST_ERROR;
+ if (H5Tclose(full_type_id) < 0)
+ TEST_ERROR;
+ if (H5Tclose(a_type_id) < 0)
+ TEST_ERROR;
+ if (H5Tclose(b_type_id) < 0)
+ TEST_ERROR;
+
+ PASSED();
+
+ return 0;
+
+error:
+ H5E_BEGIN_TRY
+ {
+ H5Sclose(space_id);
+ H5Aclose(attr_id);
+ H5Gclose(group_id);
+ H5Gclose(container_group);
+ H5Fclose(file_id);
+ H5Tclose(full_type_id);
+ H5Tclose(a_type_id);
+ H5Tclose(b_type_id);
+ }
+ H5E_END_TRY;
+
+ return 1;
+}
+
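The subset reads above rely on HDF5's compound-type field matching: a memory type containing only member "a" copies just that member and leaves the rest of each destination struct untouched. A plain-C sketch (an assumption-level illustration, not HDF5 internals; `pair_t` and `read_member_a` are invented names) of that behavior, using `offsetof` the way the test uses `HOFFSET`:

```c
#include <stddef.h>
#include <string.h>

/* Sketch of a compound-subset read: copy only the selected member at its
 * struct offset, leaving other members of the destination untouched. */
typedef struct {
    int a;
    int b;
} pair_t;

static void
read_member_a(const pair_t *file_buf, pair_t *mem_buf, size_t n)
{
    for (size_t i = 0; i < n; i++)
        memcpy((char *)&mem_buf[i] + offsetof(pair_t, a),
               (const char *)&file_buf[i] + offsetof(pair_t, a), sizeof(int));
}
```

This mirrors why the test expects `rbuf[i].b` to keep its initialized `-1` after the `a_type_id` read.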
+/*
+ * A test to check that attributes preserve data
+ * correctness for strings with ASCII or UTF-8 char sets
+ */
+static int
+test_attribute_string_encodings(void)
+{
+ hid_t file_id = H5I_INVALID_HID;
+ hid_t container_group = H5I_INVALID_HID;
+ hid_t dset_id1 = H5I_INVALID_HID;
+ hid_t dset_id2 = H5I_INVALID_HID;
+ hid_t type_id1 = H5I_INVALID_HID;
+ hid_t type_id2 = H5I_INVALID_HID;
+ hid_t space_id = H5I_INVALID_HID;
+ hid_t attr_id1 = H5I_INVALID_HID;
+ hid_t attr_id2 = H5I_INVALID_HID;
+ hsize_t dims[ATTRIBUTE_STRING_ENCODINGS_RANK] = {ATTRIBUTE_STRING_ENCODINGS_EXTENT};
+ size_t ascii_str_size = 0;
+ size_t utf8_str_size = 0;
+ char *write_buf = NULL;
+ char *read_buf = NULL;
+
+ TESTING_MULTIPART("string encoding read/write correctness on attributes");
+
+ /* Make sure the connector supports the API functions being tested */
+ if (!(vol_cap_flags_g & H5VL_CAP_FLAG_FILE_BASIC) || !(vol_cap_flags_g & H5VL_CAP_FLAG_GROUP_BASIC) ||
+ !(vol_cap_flags_g & H5VL_CAP_FLAG_DATASET_BASIC) || !(vol_cap_flags_g & H5VL_CAP_FLAG_ATTR_BASIC)) {
+ SKIPPED();
+        printf("    API functions for basic file, group, dataset, or attribute aren't supported with this "
+               "connector\n");
+ return 0;
+ }
+
+ TESTING_2("test setup");
+
+ ascii_str_size = strlen(ATTRIBUTE_STRING_ENCODINGS_ASCII_STRING);
+ utf8_str_size = strlen(ATTRIBUTE_STRING_ENCODINGS_UTF8_STRING);
+
+ if ((file_id = H5Fopen(H5_api_test_filename, H5F_ACC_RDWR, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open file '%s'\n", H5_api_test_filename);
+ goto error;
+ }
+
+ if ((container_group = H5Gopen2(file_id, ATTRIBUTE_TEST_GROUP_NAME, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open container group '%s'\n", ATTRIBUTE_TEST_GROUP_NAME);
+ goto error;
+ }
+
+ if ((space_id = H5Screate_simple(ATTRIBUTE_STRING_ENCODINGS_RANK, dims, NULL)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create dataspace\n");
+ goto error;
+ }
+
+ if ((type_id1 = H5Tcopy(H5T_C_S1)) < 0) {
+ H5_FAILED();
+ printf(" couldn't copy builtin string datatype\n");
+ goto error;
+ }
+
+ if ((H5Tset_size(type_id1, ascii_str_size)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set size of string datatype\n");
+ goto error;
+ }
+
+ if ((H5Tset_cset(type_id1, H5T_CSET_ASCII)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set character set of string to ASCII\n");
+ goto error;
+ }
+
+ if ((dset_id1 = H5Dcreate(container_group, ATTRIBUTE_STRING_ENCODINGS_DSET_NAME1, type_id1, space_id,
+ H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+        printf("    couldn't create dataset with ASCII string\n");
+ goto error;
+ }
+
+ if ((attr_id1 = H5Acreate(dset_id1, ATTRIBUTE_STRING_ENCODINGS_ATTR_NAME1, type_id1, space_id,
+ H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+        printf("    couldn't create attribute with ASCII string\n");
+ goto error;
+ }
+
+ if ((type_id2 = H5Tcopy(H5T_C_S1)) < 0) {
+ H5_FAILED();
+ printf(" couldn't copy builtin string datatype\n");
+ goto error;
+ }
+
+ if ((H5Tset_size(type_id2, utf8_str_size)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set size of string datatype\n");
+ goto error;
+ }
+
+ if ((H5Tset_cset(type_id2, H5T_CSET_UTF8)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set character set of string to UTF-8\n");
+ goto error;
+ }
+
+ if ((dset_id2 = H5Dcreate(container_group, ATTRIBUTE_STRING_ENCODINGS_DSET_NAME2, type_id2, space_id,
+ H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create dataset with UTF-8 string\n");
+ goto error;
+ }
+
+ if ((attr_id2 = H5Acreate(dset_id2, ATTRIBUTE_STRING_ENCODINGS_ATTR_NAME2, type_id2, space_id,
+ H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+        printf("    couldn't create attribute with UTF-8 string\n");
+ goto error;
+ }
+
+ PASSED();
+
+ BEGIN_MULTIPART
+ {
+ PART_BEGIN(ASCII_cset)
+ {
+ TESTING_2("ASCII character set");
+ if ((write_buf = calloc(1, ascii_str_size + 1)) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for write buffer\n");
+ PART_ERROR(ASCII_cset);
+ }
+
+ memcpy(write_buf, ATTRIBUTE_STRING_ENCODINGS_ASCII_STRING, ascii_str_size);
+
+ if ((read_buf = calloc(1, ascii_str_size + 1)) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for read buffer\n");
+ PART_ERROR(ASCII_cset);
+ }
+
+ if (H5Awrite(attr_id1, type_id1, write_buf) < 0) {
+ H5_FAILED();
+ printf(" couldn't write to attribute with ASCII string\n");
+ PART_ERROR(ASCII_cset);
+ }
+
+ if (H5Aread(attr_id1, type_id1, read_buf) < 0) {
+ H5_FAILED();
+ printf(" couldn't read from attribute with ASCII string\n");
+ PART_ERROR(ASCII_cset);
+ }
+
+ if (strncmp(write_buf, read_buf, ascii_str_size)) {
+ H5_FAILED();
+ printf(" incorrect data read from attribute with ASCII string\n");
+ PART_ERROR(ASCII_cset);
+ }
+
+ free(write_buf);
+ write_buf = NULL;
+
+ free(read_buf);
+ read_buf = NULL;
+
+ PASSED();
+ }
+ PART_END(ASCII_cset);
+
+ PART_BEGIN(UTF8_cset)
+ {
+ TESTING_2("UTF-8 character set");
+
+ if ((write_buf = calloc(1, utf8_str_size + 1)) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for write buffer\n");
+ PART_ERROR(UTF8_cset);
+ }
+
+ memcpy(write_buf, ATTRIBUTE_STRING_ENCODINGS_UTF8_STRING, utf8_str_size);
+
+ if ((read_buf = calloc(1, utf8_str_size + 1)) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for read buffer\n");
+ PART_ERROR(UTF8_cset);
+ }
+
+ if (H5Awrite(attr_id2, type_id2, write_buf) < 0) {
+ H5_FAILED();
+ printf(" couldn't write to attribute with UTF-8 string\n");
+ PART_ERROR(UTF8_cset);
+ }
+
+ if (H5Aread(attr_id2, type_id2, read_buf) < 0) {
+ H5_FAILED();
+ printf(" couldn't read from attribute with UTF-8 string\n");
+ PART_ERROR(UTF8_cset);
+ }
+
+ if (strncmp(write_buf, read_buf, utf8_str_size)) {
+ H5_FAILED();
+ printf(" incorrect data read from attribute with UTF-8 string\n");
+ PART_ERROR(UTF8_cset);
+ }
+
+ free(write_buf);
+ write_buf = NULL;
+
+ free(read_buf);
+ read_buf = NULL;
+
+ PASSED();
+ }
+ PART_END(UTF8_cset);
+ }
+ END_MULTIPART;
+
+ TESTING_2("test cleanup");
+
+ if (H5Fclose(file_id) < 0)
+ TEST_ERROR;
+ if (H5Gclose(container_group) < 0)
+ TEST_ERROR;
+ if (H5Dclose(dset_id1) < 0)
+ TEST_ERROR;
+ if (H5Dclose(dset_id2) < 0)
+ TEST_ERROR;
+ if (H5Tclose(type_id1) < 0)
+ TEST_ERROR;
+ if (H5Tclose(type_id2) < 0)
+ TEST_ERROR;
+ if (H5Aclose(attr_id1) < 0)
+ TEST_ERROR;
+ if (H5Aclose(attr_id2) < 0)
+ TEST_ERROR;
+ if (write_buf)
+ free(write_buf);
+ if (read_buf)
+ free(read_buf);
+ PASSED();
+
+ return 0;
+
+error:
+ H5E_BEGIN_TRY
+ {
+ H5Fclose(file_id);
+ H5Gclose(container_group);
+ H5Dclose(dset_id1);
+ H5Dclose(dset_id2);
+ H5Tclose(type_id1);
+ H5Tclose(type_id2);
+ H5Aclose(attr_id1);
+ H5Aclose(attr_id2);
+ if (write_buf)
+ free(write_buf);
+ if (read_buf)
+ free(read_buf);
+ }
+ H5E_END_TRY;
+
+ return 1;
+}
+
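One detail worth noting in the test above: `H5Tset_size` takes a size in bytes, and `strlen` on a UTF-8 string counts bytes, not characters, so sizing the datatype with `strlen(ATTRIBUTE_STRING_ENCODINGS_UTF8_STRING)` is correct even though the string has fewer characters than bytes. A tiny plain-C sketch (no HDF5; the helper name is illustrative):

```c
#include <string.h>

/* strlen() counts BYTES. Each Greek alpha in the test's UTF-8 string
 * encodes as the two-byte sequence 0xCE 0xB1, so the byte length exceeds
 * the character count -- exactly what a bytes-sized string datatype needs. */
static size_t
utf8_byte_len(const char *s)
{
    return strlen(s);
}
```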
/*
* A test to check that an attribute can be deleted
* using H5Adelete(_by_idx).
@@ -10457,7 +10990,7 @@ test_attribute_many(void)
/* Create many attributes */
for (u = 0; u < ATTRIBUTE_MANY_NUMB; u++) {
- sprintf(attrname, "many-%06u", u);
+ snprintf(attrname, sizeof(attrname), "many-%06u", u);
if ((attr_id = H5Acreate2(group_id, attrname, attr_dtype, space_id, H5P_DEFAULT, H5P_DEFAULT)) < 0) {
H5_FAILED();
diff --git a/test/API/H5_api_attribute_test.h b/test/API/H5_api_attribute_test.h
index dd83e73..7b455dc 100644
--- a/test/API/H5_api_attribute_test.h
+++ b/test/API/H5_api_attribute_test.h
@@ -156,6 +156,19 @@ int H5_api_attribute_test(void);
#define ATTRIBUTE_ITERATE_TEST_0_ATTRIBUTES_SUBGROUP_NAME "attribute_iterate_test_0_attributes"
#define ATTRIBUTE_ITERATE_TEST_0_ATTRIBUTES_DSET_NAME "attribute_iterate_dset"
+#define ATTRIBUTE_COMPOUND_IO_ATTR_DIMS 10
+#define ATTRIBUTE_COMPOUND_IO_TEST_GROUP_NAME "attribute_compound_io_test_group"
+#define ATTRIBUTE_COMPOUND_IO_TEST_ATTR_NAME "attribute_compound_io_test_attr"
+
+#define ATTRIBUTE_STRING_ENCODINGS_RANK 1
+#define ATTRIBUTE_STRING_ENCODINGS_EXTENT 1
+#define ATTRIBUTE_STRING_ENCODINGS_DSET_NAME1 "encoding_dset1"
+#define ATTRIBUTE_STRING_ENCODINGS_DSET_NAME2 "encoding_dset2"
+#define ATTRIBUTE_STRING_ENCODINGS_ASCII_STRING "asciistr"
+#define ATTRIBUTE_STRING_ENCODINGS_UTF8_STRING "αaααaaaα"
+#define ATTRIBUTE_STRING_ENCODINGS_ATTR_NAME1 "encoding_attr1"
+#define ATTRIBUTE_STRING_ENCODINGS_ATTR_NAME2 "encoding_attr2"
+
#define ATTRIBUTE_ITERATE_INVALID_PARAMS_TEST_ATTR_SPACE_RANK 1
#define ATTRIBUTE_ITERATE_INVALID_PARAMS_TEST_SUBGROUP_NAME "attribute_iterate_invalid_params_test"
#define ATTRIBUTE_ITERATE_INVALID_PARAMS_TEST_ATTR_NAME "invalid_params_iter_attr1"
diff --git a/test/API/H5_api_dataset_test.c b/test/API/H5_api_dataset_test.c
index c183697..e9ad15c 100644
--- a/test/API/H5_api_dataset_test.c
+++ b/test/API/H5_api_dataset_test.c
@@ -63,7 +63,9 @@ static int test_write_multi_dataset_small_hyperslab(void);
static int test_write_multi_dataset_small_point_selection(void);
static int test_write_multi_dataset_data_verification(void);
static int test_write_dataset_invalid_params(void);
+static int test_dataset_string_encodings(void);
static int test_dataset_builtin_type_conversion(void);
+static int test_dataset_real_to_int_conversion(void);
static int test_dataset_compound_partial_io(void);
static int test_dataset_set_extent_chunked_unlimited(void);
static int test_dataset_set_extent_chunked_fixed(void);
@@ -133,6 +135,7 @@ static int (*dataset_tests[])(void) = {
test_read_multi_dataset_small_point_selection,
test_dataset_io_point_selections,
test_read_dataset_invalid_params,
+ test_dataset_string_encodings,
test_write_dataset_small_all,
test_write_dataset_small_hyperslab,
test_write_dataset_small_point_selection,
@@ -143,6 +146,7 @@ static int (*dataset_tests[])(void) = {
test_write_multi_dataset_data_verification,
test_write_dataset_invalid_params,
test_dataset_builtin_type_conversion,
+ test_dataset_real_to_int_conversion,
test_dataset_compound_partial_io,
test_dataset_set_extent_chunked_unlimited,
test_dataset_set_extent_chunked_fixed,
@@ -166,6 +170,9 @@ static int (*dataset_tests[])(void) = {
test_get_vlen_buf_size,
};
+size_t filter(unsigned int flags, size_t H5_ATTR_UNUSED cd_nelmts,
              const unsigned int H5_ATTR_UNUSED cd_values[], size_t nbytes, size_t *buf_size,
+ void H5_ATTR_UNUSED **buf);
/*
* A test to check that a dataset can be
* created under the root group.
@@ -1211,7 +1218,7 @@ test_create_dataset_random_shapes(void)
goto error;
}
- sprintf(name, "%s%zu", DATASET_SHAPE_TEST_DSET_BASE_NAME, i + 1);
+ snprintf(name, sizeof(name), "%s%zu", DATASET_SHAPE_TEST_DSET_BASE_NAME, i + 1);
if ((dset_id = H5Dcreate2(group_id, name, dset_dtype, space_id, H5P_DEFAULT, H5P_DEFAULT,
H5P_DEFAULT)) < 0) {
@@ -1309,7 +1316,7 @@ test_create_dataset_predefined_types(void)
generate_random_dataspace(DATASET_PREDEFINED_TYPE_TEST_SPACE_RANK, NULL, NULL, false)) < 0)
TEST_ERROR;
- sprintf(name, "%s%zu", DATASET_PREDEFINED_TYPE_TEST_BASE_NAME, i);
+ snprintf(name, sizeof(name), "%s%zu", DATASET_PREDEFINED_TYPE_TEST_BASE_NAME, i);
if ((dset_id = H5Dcreate2(group_id, name, predefined_type_test_table[i], fspace_id, H5P_DEFAULT,
H5P_DEFAULT, H5P_DEFAULT)) < 0) {
@@ -2041,6 +2048,15 @@ error:
return 1;
}
+size_t
+filter(unsigned int H5_ATTR_UNUSED flags, size_t H5_ATTR_UNUSED cd_nelmts,
+       const unsigned int H5_ATTR_UNUSED cd_values[], size_t nbytes, size_t *buf_size,
+ void H5_ATTR_UNUSED **buf)
+{
+ *buf_size = 0;
+ return nbytes;
+}
+
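The `filter` function above is a no-op filter used only so the DCPL round-trip can be tested. A plain-C sketch (assumption: no HDF5 headers; `passthrough` is an invented name) of the callback contract such a filter satisfies:

```c
#include <stddef.h>

/* Sketch of the H5Z filter-callback contract: the callback returns the
 * number of valid bytes left in *buf, and returning the input nbytes
 * unchanged makes it a pass-through. Returning 0 would signal failure
 * to the filter pipeline. */
static size_t
passthrough(size_t nbytes, size_t *buf_size, void **buf)
{
    (void)buf_size; /* a real filter may reallocate *buf and update *buf_size */
    (void)buf;
    return nbytes;
}
```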
/*
* A test to check the functionality of the different
* dataset creation properties.
@@ -2048,15 +2064,21 @@ error:
static int
test_create_dataset_creation_properties(void)
{
- hsize_t dims[DATASET_CREATION_PROPERTIES_TEST_SHAPE_RANK];
- hsize_t chunk_dims[DATASET_CREATION_PROPERTIES_TEST_SHAPE_RANK];
- size_t i;
- hid_t file_id = H5I_INVALID_HID;
- hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
- hid_t dset_id = H5I_INVALID_HID, dcpl_id = H5I_INVALID_HID;
- hid_t dset_dtype = H5I_INVALID_HID, compact_dtype = H5I_INVALID_HID;
- hid_t fspace_id = H5I_INVALID_HID, compact_fspace_id = H5I_INVALID_HID;
-
+ hsize_t dims[DATASET_CREATION_PROPERTIES_TEST_SHAPE_RANK];
+ hsize_t chunk_dims[DATASET_CREATION_PROPERTIES_TEST_SHAPE_RANK];
+ size_t i;
+ hid_t file_id = H5I_INVALID_HID;
+ hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
+ hid_t dset_id = H5I_INVALID_HID, dcpl_id = H5I_INVALID_HID;
+ hid_t dset_dtype = H5I_INVALID_HID, compact_dtype = H5I_INVALID_HID;
+ hid_t fspace_id = H5I_INVALID_HID, compact_fspace_id = H5I_INVALID_HID;
+ void *read_buf = NULL;
+ unsigned int filter_params[DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_NUM_PARAMS] = {1, 2, 3};
+ unsigned int filter_params_out[DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_NUM_PARAMS];
+ char ud_filter_name[sizeof(DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_NAME)];
+ int nfilters = 0;
+    H5Z_filter_t retrieved_filter_id = H5Z_FILTER_ERROR;
+ size_t num_filter_params = DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_NUM_PARAMS;
TESTING_MULTIPART("dataset creation properties");
/* Make sure the connector supports the API functions being tested */
@@ -2132,7 +2154,8 @@ test_create_dataset_creation_properties(void)
PART_ERROR(DCPL_alloc_time_test);
}
- sprintf(name, "%s%zu", DATASET_CREATION_PROPERTIES_TEST_ALLOC_TIMES_BASE_NAME, i);
+ snprintf(name, sizeof(name), "%s%zu", DATASET_CREATION_PROPERTIES_TEST_ALLOC_TIMES_BASE_NAME,
+ i);
if ((dset_id = H5Dcreate2(group_id, name, dset_dtype, fspace_id, H5P_DEFAULT, dcpl_id,
H5P_DEFAULT)) < 0) {
@@ -2208,7 +2231,8 @@ test_create_dataset_creation_properties(void)
PART_ERROR(DCPL_attr_crt_order_test);
}
- sprintf(name, "%s%zu", DATASET_CREATION_PROPERTIES_TEST_CRT_ORDER_BASE_NAME, i);
+ snprintf(name, sizeof(name), "%s%zu", DATASET_CREATION_PROPERTIES_TEST_CRT_ORDER_BASE_NAME,
+ i);
if ((dset_id = H5Dcreate2(group_id, name, dset_dtype, fspace_id, H5P_DEFAULT, dcpl_id,
H5P_DEFAULT)) < 0) {
@@ -2341,7 +2365,8 @@ test_create_dataset_creation_properties(void)
PART_ERROR(DCPL_fill_time_property_test);
}
- sprintf(name, "%s%zu", DATASET_CREATION_PROPERTIES_TEST_FILL_TIMES_BASE_NAME, i);
+ snprintf(name, sizeof(name), "%s%zu", DATASET_CREATION_PROPERTIES_TEST_FILL_TIMES_BASE_NAME,
+ i);
if ((dset_id = H5Dcreate2(group_id, name, dset_dtype, fspace_id, H5P_DEFAULT, dcpl_id,
H5P_DEFAULT)) < 0) {
@@ -2388,7 +2413,275 @@ test_create_dataset_creation_properties(void)
}
PART_END(DCPL_fill_time_property_test);
- /* TODO: Test the fill value property */
+ PART_BEGIN(DCPL_fill_value_test)
+ {
+ TESTING_2("fill values");
+
+ int int_fill_value = DATASET_FILL_VALUE_TEST_INT_FILL_VALUE;
+ double double_fill_value = DATASET_FILL_VALUE_TEST_DOUBLE_FILL_VALUE;
+
+ void *val = NULL;
+ size_t num_elems = 1;
+ hid_t type_id = H5I_INVALID_HID;
+
+ if (!(vol_cap_flags_g & H5VL_CAP_FLAG_FILL_VALUES)) {
+ SKIPPED();
+ printf(" dataset fill values are not supported by this VOL connector\n");
+ PART_EMPTY(DCPL_fill_value_test);
+ }
+
+ /* Integer Fill Value */
+ if ((dcpl_id = H5Pcreate(H5P_DATASET_CREATE)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create DCPL\n");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Pset_fill_value(dcpl_id, DATASET_FILL_VALUE_TEST_INT_TYPE, (const void *)&int_fill_value) <
+ 0) {
+ H5_FAILED();
+ printf(" couldn't set integer fill value in property list");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if ((dset_id =
+ H5Dcreate(group_id, DATASET_FILL_VALUE_TEST_DSET_NAME1, DATASET_FILL_VALUE_TEST_INT_TYPE,
+ fspace_id, H5P_DEFAULT, dcpl_id, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create dataset with integer fill value");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if ((H5Sget_simple_extent_dims(fspace_id, dims, NULL)) < 0) {
+ H5_FAILED();
+ printf(" couldn't get dataspace dimensions");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ for (i = 0; i < DATASET_CREATION_PROPERTIES_TEST_SHAPE_RANK; i++)
+ num_elems *= (size_t)dims[i];
+
+            if ((read_buf = calloc(num_elems, sizeof(int))) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for read buffer");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Dread(dset_id, DATASET_FILL_VALUE_TEST_INT_TYPE, H5S_ALL, H5S_ALL, H5P_DEFAULT, read_buf) <
+ 0) {
+ H5_FAILED();
+ printf(" couldn't read from dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ for (i = 0; i < num_elems; i++) {
+ val = (int *)(read_buf) + i;
+
+ if (*(int *)val != DATASET_FILL_VALUE_TEST_INT_FILL_VALUE) {
+ H5_FAILED();
+ printf(" incorrect value read from dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+ }
+
+ if (H5Dclose(dset_id) < 0) {
+ H5_FAILED();
+ printf(" couldn't close integer fill value dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Pclose(dcpl_id) < 0) {
+ H5_FAILED();
+ printf(" couldn't close dcpl");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ /* Re-open integer dataset */
+ if ((dset_id = H5Dopen2(group_id, DATASET_FILL_VALUE_TEST_DSET_NAME1, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open integer fill value dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Dclose(dset_id) < 0) {
+ H5_FAILED();
+ printf(" couldn't close opened integer fill value dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ free(read_buf);
+ read_buf = NULL;
+
+ /* Double fill value */
+ if ((dcpl_id = H5Pcreate(H5P_DATASET_CREATE)) == H5I_INVALID_HID) {
+ H5_FAILED();
+ printf(" couldn't create dcpl");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if ((H5Pset_fill_value(dcpl_id, DATASET_FILL_VALUE_TEST_DOUBLE_TYPE,
+ (const void *)&double_fill_value)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set double fill value in property list");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if ((dset_id = H5Dcreate2(group_id, DATASET_FILL_VALUE_TEST_DSET_NAME2,
+ DATASET_FILL_VALUE_TEST_DOUBLE_TYPE, fspace_id, H5P_DEFAULT, dcpl_id,
+ H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create dataset with double fill value");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+            if ((read_buf = calloc(num_elems, sizeof(double))) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for read buffer");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Dread(dset_id, DATASET_FILL_VALUE_TEST_DOUBLE_TYPE, H5S_ALL, H5S_ALL, H5P_DEFAULT,
+ read_buf) < 0) {
+ H5_FAILED();
+ printf(" couldn't read from dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ for (i = 0; i < num_elems; i++) {
+ val = (double *)(read_buf) + i;
+
+ if (!(H5_DBL_REL_EQUAL(*(double *)val, DATASET_FILL_VALUE_TEST_DOUBLE_FILL_VALUE,
+ 0.0000001))) {
+ H5_FAILED();
+ printf(" incorrect value read from dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+ }
+
+ if (H5Dclose(dset_id) < 0) {
+ H5_FAILED();
+ printf(" couldn't close double fill value dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Pclose(dcpl_id) < 0) {
+ H5_FAILED();
+ printf(" couldn't close dcpl");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ /* Re-open double dataset */
+ if ((dset_id = H5Dopen2(group_id, DATASET_FILL_VALUE_TEST_DSET_NAME2, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open double fill value dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Dclose(dset_id) < 0) {
+ H5_FAILED();
+ printf(" couldn't close opened double fill value dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ free(read_buf);
+ read_buf = NULL;
+
+ /* Fixed-length string fill value */
+ if ((dcpl_id = H5Pcreate(H5P_DATASET_CREATE)) == H5I_INVALID_HID) {
+ H5_FAILED();
+ printf(" couldn't create dcpl");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if ((type_id = H5Tcopy(H5T_C_S1)) < 0) {
+ H5_FAILED();
+ printf(" couldn't copy string datatype");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if ((H5Tset_size(type_id, DATASET_FILL_VALUE_TEST_STRING_SIZE)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set size of string datatype");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if ((H5Pset_fill_value(dcpl_id, type_id,
+ (const void *)DATASET_FILL_VALUE_TEST_STRING_FILL_VALUE)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set string fill value in property list");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if ((dset_id = H5Dcreate2(group_id, DATASET_FILL_VALUE_TEST_DSET_NAME3, type_id, fspace_id,
+ H5P_DEFAULT, dcpl_id, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create dataset with string fill value");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if ((read_buf = calloc(num_elems, DATASET_FILL_VALUE_TEST_STRING_SIZE)) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for read buffer");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Dread(dset_id, type_id, H5S_ALL, H5S_ALL, H5P_DEFAULT, read_buf) < 0) {
+ H5_FAILED();
+ printf(" couldn't read from dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ for (i = 0; i < num_elems; i++) {
+ char val_str[DATASET_FILL_VALUE_TEST_STRING_SIZE + 1];
+
+ memcpy(val_str, ((char *)read_buf) + i * DATASET_FILL_VALUE_TEST_STRING_SIZE,
+ DATASET_FILL_VALUE_TEST_STRING_SIZE);
+ val_str[DATASET_FILL_VALUE_TEST_STRING_SIZE] = '\0';
+
+ if (strcmp(val_str, DATASET_FILL_VALUE_TEST_STRING_FILL_VALUE)) {
+ H5_FAILED();
+ printf(" incorrect value read from string dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+ }
+
+ if (H5Dclose(dset_id) < 0) {
+ H5_FAILED();
+ printf(" couldn't close string fill value dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Pclose(dcpl_id) < 0) {
+ H5_FAILED();
+ printf(" couldn't close dcpl");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Tclose(type_id) < 0) {
+ H5_FAILED();
+ printf(" couldn't close string type");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ free(read_buf);
+ read_buf = NULL;
+
+ /* Re-open string dataset */
+ if ((dset_id = H5Dopen2(group_id, DATASET_FILL_VALUE_TEST_DSET_NAME3, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open string fill value dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ if (H5Dclose(dset_id) < 0) {
+ H5_FAILED();
+ printf(" couldn't close opened string fill value dataset");
+ PART_ERROR(DCPL_fill_value_test);
+ }
+
+ PASSED();
+ }
+ PART_END(DCPL_fill_value_test);
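The fill-value part above checks the core semantics of `H5Pset_fill_value`: every element that has never been written reads back as the fill value. A minimal plain-C sketch of that contract (not HDF5 code; `fill_int` is an illustrative stand-in for what the library does when allocating unwritten elements):

```c
#include <stddef.h>

/* Sketch: unwritten storage behaves as if pre-initialized with the fill
 * value, which is what the test's H5Dread-after-H5Dcreate verifies. */
static void
fill_int(int *buf, size_t n, int fill)
{
    for (size_t i = 0; i < n; i++)
        buf[i] = fill;
}
```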
/* Test filters */
PART_BEGIN(DCPL_filters_test)
@@ -2490,6 +2783,120 @@ test_create_dataset_creation_properties(void)
}
PART_END(DCPL_filters_test);
+ /* Test a user-defined filter */
+ PART_BEGIN(DCPL_user_defined_filter_test)
+ {
+ TESTING_2("user-defined dataset filters");
+ /* Create user-defined filter and register with library */
+ const H5Z_class2_t filter_cls[1] = {
+ {H5Z_CLASS_T_VERS, DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_ID, 1, 1,
+ DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_NAME, NULL, NULL, &filter}};
+
+ if (H5Zregister((const void *)&filter_cls) < 0) {
+ H5_FAILED();
+ printf(" couldn't register filter\n");
+ PART_ERROR(DCPL_user_defined_filter_test);
+ }
+
+ if ((dcpl_id = H5Pcreate(H5P_DATASET_CREATE)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create DCPL\n");
+ PART_ERROR(DCPL_user_defined_filter_test);
+ }
+
+ if (H5Pset_chunk(dcpl_id, DATASET_CREATION_PROPERTIES_TEST_SHAPE_RANK, chunk_dims) < 0) {
+ H5_FAILED();
+ printf(" couldn't set chunking on DCPL\n");
+ PART_ERROR(DCPL_user_defined_filter_test);
+ }
+
+ /* Set user-defined filter on the DCPL */
+ if (H5Pset_filter(dcpl_id, (H5Z_filter_t)DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_ID,
+ H5Z_FLAG_MANDATORY, 3, filter_params) < 0) {
+ H5_FAILED();
+ printf(" couldn't set user-defined filter on DCPL\n");
+ PART_ERROR(DCPL_user_defined_filter_test);
+ }
+
+ /* Use a simple datatype, as not all filters support all datatypes. */
+ if ((dset_id = H5Dcreate2(group_id, DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_DSET_NAME,
+ H5T_NATIVE_INT, fspace_id, H5P_DEFAULT, dcpl_id, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create dataset '%s'\n",
+ DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_DSET_NAME);
+ PART_ERROR(DCPL_user_defined_filter_test);
+ }
+
+ if (dset_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Dclose(dset_id);
+ }
+ H5E_END_TRY;
+ dset_id = H5I_INVALID_HID;
+ }
+
+ if ((dset_id = H5Dopen2(group_id, DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_DSET_NAME,
+ H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open dataset '%s'\n",
+ DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_DSET_NAME);
+ PART_ERROR(DCPL_user_defined_filter_test);
+ }
+
+ if (dcpl_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Pclose(dcpl_id);
+ }
+ H5E_END_TRY;
+ dcpl_id = H5I_INVALID_HID;
+ }
+
+ /* Test that parameters are preserved in the DCPL */
+ memset(filter_params_out, 0,
+ sizeof(unsigned int) * DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_NUM_PARAMS);
+
+ if ((dcpl_id = H5Dget_create_plist(dset_id)) < 0) {
+ H5_FAILED();
+ printf(" couldn't retrieve DCPL\n");
+ PART_ERROR(DCPL_user_defined_filter_test);
+ }
+
+ if ((nfilters = H5Pget_nfilters(dcpl_id)) != 1) {
+ H5_FAILED();
+ printf(" retrieved incorrect number of filters from DCPL\n");
+ PART_ERROR(DCPL_user_defined_filter_test);
+ }
+
+ if ((retrieved_filter_id = H5Pget_filter2(
+ dcpl_id, 0, NULL, &num_filter_params, filter_params_out,
+ strlen(DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_NAME) + 1, ud_filter_name, NULL)) !=
+ (H5Z_filter_t)DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_ID) {
+ H5_FAILED();
+ printf(" retrieved incorrect user-defined filter ID\n");
+ PART_ERROR(DCPL_user_defined_filter_test);
+ }
+
+ for (i = 0; i < DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_NUM_PARAMS; i++)
+ if (filter_params[i] != filter_params_out[i]) {
+ H5_FAILED();
+ printf(" retrieved incorrect parameter value from DCPL\n");
+ PART_ERROR(DCPL_user_defined_filter_test);
+ }
+
+ if (dset_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Dclose(dset_id);
+ }
+ H5E_END_TRY;
+ dset_id = H5I_INVALID_HID;
+ }
+
+ PASSED();
+ }
+ PART_END(DCPL_user_defined_filter_test);
+
/* Test the dataset storage layout property */
PART_BEGIN(DCPL_storage_layout_test)
{
@@ -2527,7 +2934,7 @@ test_create_dataset_creation_properties(void)
}
}
- sprintf(name, "%s%zu", DATASET_CREATION_PROPERTIES_TEST_LAYOUTS_BASE_NAME, i);
+ snprintf(name, sizeof(name), "%s%zu", DATASET_CREATION_PROPERTIES_TEST_LAYOUTS_BASE_NAME, i);
if ((dset_id =
H5Dcreate2(group_id, name, (H5D_COMPACT == layouts[i]) ? compact_dtype : dset_dtype,
@@ -2689,6 +3096,11 @@ test_create_dataset_creation_properties(void)
TESTING_2("test cleanup");
+ if (read_buf) {
+ free(read_buf);
+ read_buf = NULL;
+ }
+
if (H5Sclose(compact_fspace_id) < 0)
TEST_ERROR;
if (H5Sclose(fspace_id) < 0)
@@ -2711,6 +3123,10 @@ test_create_dataset_creation_properties(void)
error:
H5E_BEGIN_TRY
{
+ if (read_buf) {
+ free(read_buf);
+ }
+
H5Sclose(compact_fspace_id);
H5Sclose(fspace_id);
H5Tclose(compact_dtype);
@@ -2720,10 +3136,11 @@ error:
H5Gclose(group_id);
H5Gclose(container_group);
H5Fclose(file_id);
- }
- H5E_END_TRY
- return 1;
+ }
+ H5E_END_TRY
+
+ return 1;
}
/*
@@ -2778,7 +3195,7 @@ test_create_many_dataset(void)
printf("\n");
for (i = 0; i < DATASET_NUMB; i++) {
printf("\r %u/%u", i + 1, DATASET_NUMB);
- sprintf(dset_name, "dset_%02u", i);
+ snprintf(dset_name, sizeof(dset_name), "dset_%02u", i);
data = i % 256;
if ((dset_id = H5Dcreate2(group_id, dset_name, H5T_NATIVE_UCHAR, dataspace_id, H5P_DEFAULT,
@@ -3652,8 +4069,8 @@ test_dataset_property_lists(void)
{
TESTING_2("H5Dget_create_plist");
- /* Try to receive copies of the two property lists, one which has the property set and one which
- * does not */
+ /* Try to receive copies of the two property lists, one which has the property set and one
+ * which does not */
if ((dcpl_id1 = H5Dget_create_plist(dset_id1)) < 0) {
H5_FAILED();
printf(" couldn't get property list\n");
@@ -3743,8 +4160,8 @@ test_dataset_property_lists(void)
dapl_id1 = H5I_INVALID_HID;
}
- /* Try to receive copies of the two property lists, one which has the property set and one which
- * does not */
+ /* Try to receive copies of the two property lists, one which has the property set and one
+ * which does not */
if ((dapl_id1 = H5Dget_access_plist(dset_id3)) < 0) {
H5_FAILED();
printf(" couldn't get property list\n");
@@ -4721,7 +5138,7 @@ test_read_multi_dataset_small_point_selection(void)
hid_t dset_id_arr[DATASET_MULTI_COUNT];
hid_t mspace_id_arr[DATASET_MULTI_COUNT], fspace_id_arr[DATASET_MULTI_COUNT];
hid_t dtype_arr[DATASET_MULTI_COUNT];
- void *data[DATASET_MULTI_COUNT];
+ void *read_buf[DATASET_MULTI_COUNT];
TESTING("small multi read from datasets with point selections");
@@ -4735,7 +5152,7 @@ test_read_multi_dataset_small_point_selection(void)
/* Prevent uninitialized memory usage on test failure */
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- data[i] = NULL;
+ read_buf[i] = NULL;
dset_id_arr[i] = H5I_INVALID_HID;
}
@@ -4797,7 +5214,7 @@ test_read_multi_dataset_small_point_selection(void)
goto error;
}
- if (NULL == (data[i] = malloc(data_size)))
+ if (NULL == (read_buf[i] = malloc(data_size)))
TEST_ERROR;
dtype_arr[i] = DATASET_SMALL_READ_TEST_POINT_SELECTION_DSET_DTYPE;
@@ -4806,16 +5223,16 @@ test_read_multi_dataset_small_point_selection(void)
}
if (H5Dread_multi(DATASET_MULTI_COUNT, dset_id_arr, dtype_arr, mspace_id_arr, fspace_id_arr, H5P_DEFAULT,
- data) < 0) {
+ read_buf) < 0) {
H5_FAILED();
printf(" couldn't read from dataset '%s'\n", DATASET_SMALL_READ_TEST_POINT_SELECTION_DSET_NAME);
goto error;
}
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- if (data[i]) {
- free(data[i]);
- data[i] = NULL;
+ if (read_buf[i]) {
+ free(read_buf[i]);
+ read_buf[i] = NULL;
}
if (H5Dclose(dset_id_arr[i]) < 0)
TEST_ERROR;
@@ -4840,9 +5257,9 @@ error:
H5E_BEGIN_TRY
{
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- if (data[i]) {
- free(data[i]);
- data[i] = NULL;
+ if (read_buf[i]) {
+ free(read_buf[i]);
+ read_buf[i] = NULL;
}
H5Dclose(dset_id_arr[i]);
}
@@ -5228,8 +5645,8 @@ test_dataset_io_point_selections(void)
for (i = 0; i < DATASET_IO_POINT_DIM_0; i++)
for (j = 0; j < DATASET_IO_POINT_DIM_1; j++)
if (buf_all[i][j] != file_state[i][j])
- FAIL_PUTS_ERROR(
- "Incorrect data found after writing from hyperslab in memory to points in dataset");
+ FAIL_PUTS_ERROR("Incorrect data found after writing from hyperslab in memory to "
+ "points in dataset");
PASSED();
@@ -5302,8 +5719,8 @@ test_dataset_io_point_selections(void)
for (i = 0; i < DATASET_IO_POINT_DIM_0; i++)
for (j = 0; j < DATASET_IO_POINT_DIM_1; j++)
if (buf_all[i][j] != file_state[i][j])
- FAIL_PUTS_ERROR(
- "Incorrect data found after writing from points in memory to hyperslab in dataset");
+ FAIL_PUTS_ERROR("Incorrect data found after writing from points in memory to "
+ "hyperslab in dataset");
if (!do_chunk)
PASSED();
@@ -6501,22 +6918,24 @@ error:
static int
test_write_multi_dataset_small_all(void)
{
- hssize_t space_npoints;
- hsize_t dims[DATASET_SMALL_WRITE_TEST_ALL_DSET_SPACE_RANK] = {10, 5, 3};
- size_t i;
- hid_t file_id = H5I_INVALID_HID;
- hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
- hid_t dset_id_arr[DATASET_MULTI_COUNT];
- hid_t fspace_id = H5I_INVALID_HID, fspace_id_arr[DATASET_MULTI_COUNT];
- hid_t dtype_id_arr[DATASET_MULTI_COUNT];
- void *data[DATASET_MULTI_COUNT];
+ hssize_t space_npoints;
+ hsize_t dims[DATASET_SMALL_WRITE_TEST_ALL_DSET_SPACE_RANK] = {10, 5, 3};
+ size_t i;
+ hid_t file_id = H5I_INVALID_HID;
+ hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
+ hid_t dset_id_arr[DATASET_MULTI_COUNT];
+ hid_t fspace_id = H5I_INVALID_HID, fspace_id_arr[DATASET_MULTI_COUNT];
+ hid_t dtype_id_arr[DATASET_MULTI_COUNT];
+ const void *write_buf[DATASET_MULTI_COUNT];
+ void *wbuf_temp[DATASET_MULTI_COUNT];
TESTING("small multi write to datasets with H5S_ALL");
/* Prevent uninitialized memory usage on test failure */
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
dset_id_arr[i] = H5I_INVALID_HID;
- data[i] = NULL;
+ write_buf[i] = NULL;
+ wbuf_temp[i] = NULL;
}
/* Make sure the connector supports the API functions being tested */
@@ -6600,23 +7019,26 @@ test_write_multi_dataset_small_all(void)
dtype_id_arr[i] = DATASET_SMALL_WRITE_TEST_ALL_DSET_DTYPE;
fspace_id_arr[i] = H5S_ALL;
- if (NULL == (data[i] = malloc((hsize_t)space_npoints * DATASET_SMALL_WRITE_TEST_ALL_DSET_DTYPESIZE)))
+ if (NULL ==
+ (wbuf_temp[i] = malloc((hsize_t)space_npoints * DATASET_SMALL_WRITE_TEST_ALL_DSET_DTYPESIZE)))
TEST_ERROR;
for (size_t j = 0; j < (size_t)space_npoints; j++)
- ((int **)data)[i][j] = (int)i;
+ ((int **)wbuf_temp)[i][j] = (int)i;
+
+ write_buf[i] = wbuf_temp[i];
}
if (H5Dwrite_multi(DATASET_MULTI_COUNT, dset_id_arr, dtype_id_arr, fspace_id_arr, fspace_id_arr,
- H5P_DEFAULT, (const void **)data) < 0) {
+ H5P_DEFAULT, write_buf) < 0) {
H5_FAILED();
printf(" couldn't write to dataset '%s'\n", DATASET_SMALL_WRITE_MULTI_TEST_ALL_DSET_NAME);
goto error;
}
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- free(data[i]);
- data[i] = NULL;
+ free(wbuf_temp[i]);
+ wbuf_temp[i] = NULL;
if (H5Dclose(dset_id_arr[i]) < 0)
TEST_ERROR;
}
@@ -6637,8 +7059,8 @@ error:
H5E_BEGIN_TRY
{
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- if (data[i])
- free(data[i]);
+ if (wbuf_temp[i])
+ free(wbuf_temp[i]);
H5Dclose(dset_id_arr[i]);
}
@@ -6659,19 +7081,20 @@ error:
static int
test_write_multi_dataset_small_hyperslab(void)
{
- hsize_t start[DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK];
- hsize_t stride[DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK];
- hsize_t count[DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK];
- hsize_t block[DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK];
- hsize_t dims[DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK] = {10, 5, 3};
- size_t i, data_size;
- hid_t file_id = H5I_INVALID_HID;
- hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
- hid_t dset_id_arr[DATASET_MULTI_COUNT];
- hid_t mspace_id = H5I_INVALID_HID, fspace_id = H5I_INVALID_HID;
- hid_t mspace_id_arr[DATASET_MULTI_COUNT], fspace_id_arr[DATASET_MULTI_COUNT];
- hid_t dtype_id_arr[DATASET_MULTI_COUNT];
- void *data[DATASET_MULTI_COUNT];
+ hsize_t start[DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK];
+ hsize_t stride[DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK];
+ hsize_t count[DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK];
+ hsize_t block[DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK];
+ hsize_t dims[DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK] = {10, 5, 3};
+ size_t i, data_size;
+ hid_t file_id = H5I_INVALID_HID;
+ hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
+ hid_t dset_id_arr[DATASET_MULTI_COUNT];
+ hid_t mspace_id = H5I_INVALID_HID, fspace_id = H5I_INVALID_HID;
+ hid_t mspace_id_arr[DATASET_MULTI_COUNT], fspace_id_arr[DATASET_MULTI_COUNT];
+ hid_t dtype_id_arr[DATASET_MULTI_COUNT];
+ const void *write_buf[DATASET_MULTI_COUNT];
+ void *wbuf_temp[DATASET_MULTI_COUNT];
TESTING("small multi write to datasets with hyperslab selections");
@@ -6685,7 +7108,8 @@ test_write_multi_dataset_small_hyperslab(void)
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
dset_id_arr[i] = H5I_INVALID_HID;
- data[i] = NULL;
+ write_buf[i] = NULL;
+ wbuf_temp[i] = NULL;
}
if ((file_id = H5Fopen(H5_api_test_filename, H5F_ACC_RDWR, H5P_DEFAULT)) < 0) {
@@ -6732,11 +7156,13 @@ test_write_multi_dataset_small_hyperslab(void)
goto error;
}
- if (NULL == (data[i] = malloc(data_size)))
+ if (NULL == (wbuf_temp[i] = malloc(data_size)))
TEST_ERROR;
for (size_t j = 0; j < data_size / DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_DTYPESIZE; j++)
- ((int **)data)[i][j] = (int)i;
+ ((int **)wbuf_temp)[i][j] = (int)i;
+
+ write_buf[i] = (const void *)wbuf_temp[i];
}
for (i = 0; i < DATASET_SMALL_WRITE_TEST_HYPERSLAB_DSET_SPACE_RANK; i++) {
@@ -6758,16 +7184,16 @@ test_write_multi_dataset_small_hyperslab(void)
}
if (H5Dwrite_multi(DATASET_MULTI_COUNT, dset_id_arr, dtype_id_arr, mspace_id_arr, fspace_id_arr,
- H5P_DEFAULT, (const void **)data) < 0) {
+ H5P_DEFAULT, (const void **)write_buf) < 0) {
H5_FAILED();
printf(" couldn't write to dataset '%s'\n", DATASET_SMALL_WRITE_MULTI_TEST_HYPERSLAB_DSET_NAME);
goto error;
}
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- if (data[i]) {
- free(data[i]);
- data[i] = NULL;
+ if (wbuf_temp[i]) {
+ free(wbuf_temp[i]);
+ wbuf_temp[i] = NULL;
}
if (H5Dclose(dset_id_arr[i]) < 0)
TEST_ERROR;
@@ -6792,9 +7218,9 @@ error:
H5E_BEGIN_TRY
{
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- if (data[i]) {
- free(data[i]);
- data[i] = NULL;
+ if (wbuf_temp[i]) {
+ free(wbuf_temp[i]);
+ wbuf_temp[i] = NULL;
}
H5Dclose(dset_id_arr[i]);
}
@@ -6816,18 +7242,19 @@ error:
static int
test_write_multi_dataset_small_point_selection(void)
{
- hsize_t points[DATASET_SMALL_WRITE_TEST_POINT_SELECTION_NUM_POINTS *
+ hsize_t points[DATASET_SMALL_WRITE_TEST_POINT_SELECTION_NUM_POINTS *
DATASET_SMALL_WRITE_TEST_POINT_SELECTION_DSET_SPACE_RANK];
- hsize_t dims[DATASET_SMALL_WRITE_TEST_POINT_SELECTION_DSET_SPACE_RANK] = {10, 10, 10};
- hsize_t mdims[] = {DATASET_SMALL_WRITE_TEST_POINT_SELECTION_NUM_POINTS};
- size_t i, data_size;
- hid_t file_id = H5I_INVALID_HID;
- hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
- hid_t dset_id_arr[DATASET_MULTI_COUNT];
- hid_t fspace_id = H5I_INVALID_HID, fspace_id_arr[DATASET_MULTI_COUNT];
- hid_t mspace_id = H5I_INVALID_HID, mspace_id_arr[DATASET_MULTI_COUNT];
- hid_t dtype_id_arr[DATASET_MULTI_COUNT];
- void *data[DATASET_MULTI_COUNT];
+ hsize_t dims[DATASET_SMALL_WRITE_TEST_POINT_SELECTION_DSET_SPACE_RANK] = {10, 10, 10};
+ hsize_t mdims[] = {DATASET_SMALL_WRITE_TEST_POINT_SELECTION_NUM_POINTS};
+ size_t i, data_size;
+ hid_t file_id = H5I_INVALID_HID;
+ hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
+ hid_t dset_id_arr[DATASET_MULTI_COUNT];
+ hid_t fspace_id = H5I_INVALID_HID, fspace_id_arr[DATASET_MULTI_COUNT];
+ hid_t mspace_id = H5I_INVALID_HID, mspace_id_arr[DATASET_MULTI_COUNT];
+ hid_t dtype_id_arr[DATASET_MULTI_COUNT];
+ const void *write_buf[DATASET_MULTI_COUNT];
+ void *wbuf_temp[DATASET_MULTI_COUNT];
TESTING("small multi write to datasets with point selections");
@@ -6840,7 +7267,8 @@ test_write_multi_dataset_small_point_selection(void)
}
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- data[i] = NULL;
+ write_buf[i] = NULL;
+ wbuf_temp[i] = NULL;
dset_id_arr[i] = H5I_INVALID_HID;
}
@@ -6888,11 +7316,13 @@ test_write_multi_dataset_small_point_selection(void)
goto error;
}
- if (NULL == (data[i] = malloc(data_size)))
+ if (NULL == (wbuf_temp[i] = malloc(data_size)))
TEST_ERROR;
for (size_t j = 0; j < data_size / DATASET_SMALL_WRITE_TEST_POINT_SELECTION_DSET_DTYPESIZE; j++)
- ((int **)data)[i][j] = (int)i;
+ ((int **)wbuf_temp)[i][j] = (int)i;
+
+ write_buf[i] = (const void *)wbuf_temp[i];
}
for (i = 0; i < DATASET_SMALL_WRITE_TEST_POINT_SELECTION_NUM_POINTS; i++) {
@@ -6916,16 +7346,16 @@ test_write_multi_dataset_small_point_selection(void)
}
if (H5Dwrite_multi(DATASET_MULTI_COUNT, dset_id_arr, dtype_id_arr, mspace_id_arr, fspace_id_arr,
- H5P_DEFAULT, (const void **)data) < 0) {
+ H5P_DEFAULT, (const void **)write_buf) < 0) {
H5_FAILED();
printf(" couldn't write to multiple datasets\n");
goto error;
}
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- if (data[i]) {
- free(data[i]);
- data[i] = NULL;
+ if (wbuf_temp[i]) {
+ free(wbuf_temp[i]);
+ wbuf_temp[i] = NULL;
}
if (H5Dclose(dset_id_arr[i]) < 0)
@@ -6951,8 +7381,8 @@ error:
H5E_BEGIN_TRY
{
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- if (data[i])
- free(data[i]);
+ if (wbuf_temp[i])
+ free(wbuf_temp[i]);
H5Dclose(dset_id_arr[i]);
}
@@ -6992,9 +7422,10 @@ test_write_multi_dataset_data_verification(void)
hid_t mspace_id = H5I_INVALID_HID, mspace_id_arr[DATASET_MULTI_COUNT];
hid_t select_all_arr[DATASET_MULTI_COUNT];
void *data[DATASET_MULTI_COUNT];
- void *write_buf[DATASET_MULTI_COUNT];
- void *read_buf[DATASET_MULTI_COUNT];
- char dset_names[DATASET_MULTI_COUNT][DSET_NAME_BUF_SIZE];
+ const void *write_buf[DATASET_MULTI_COUNT];
+ void *wbuf_temp[DATASET_MULTI_COUNT];
+ void *read_buf[DATASET_MULTI_COUNT];
+ char dset_names[DATASET_MULTI_COUNT][DSET_NAME_BUF_SIZE];
TESTING_MULTIPART("verification of datasets' data using H5Dwrite_multi then H5Dread_multi");
@@ -7013,6 +7444,7 @@ test_write_multi_dataset_data_verification(void)
select_all_arr[i] = H5S_ALL;
read_buf[i] = NULL;
write_buf[i] = NULL;
+ wbuf_temp[i] = NULL;
data[i] = NULL;
}
@@ -7062,6 +7494,8 @@ test_write_multi_dataset_data_verification(void)
for (size_t j = 0; j < data_size / DATASET_DATA_VERIFY_WRITE_TEST_DSET_DTYPESIZE; j++)
((int **)data)[i][j] = (int)j;
+
+ write_buf[i] = (const void *)data[i];
}
PASSED();
@@ -7073,7 +7507,7 @@ test_write_multi_dataset_data_verification(void)
TESTING_2("H5Dwrite_multi using H5S_ALL then H5Dread_multi");
if (H5Dwrite_multi(DATASET_MULTI_COUNT, dset_id_arr, dtype_id_arr, select_all_arr, select_all_arr,
- H5P_DEFAULT, (const void **)data) < 0) {
+ H5P_DEFAULT, (const void **)write_buf) < 0) {
H5_FAILED();
printf(" couldn't write to datasets");
PART_ERROR(H5Dwrite_multi_all_read);
@@ -7170,14 +7604,14 @@ test_write_multi_dataset_data_verification(void)
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
data_size = dims[1] * 2 * DATASET_DATA_VERIFY_WRITE_TEST_DSET_DTYPESIZE;
- if (NULL == (write_buf[i] = malloc(data_size))) {
+ if (NULL == (wbuf_temp[i] = malloc(data_size))) {
H5_FAILED();
printf(" couldn't allocate buffer for dataset write\n");
PART_ERROR(H5Dwrite_multi_hyperslab_read);
}
for (size_t j = 0; j < data_size / DATASET_DATA_VERIFY_WRITE_TEST_DSET_DTYPESIZE; j++) {
- ((int *)write_buf[i])[j] = 56;
+ ((int *)wbuf_temp[i])[j] = 56;
}
data_size = 1;
@@ -7190,6 +7624,8 @@ test_write_multi_dataset_data_verification(void)
printf(" couldn't allocate buffer for datasets' data verification\n");
PART_ERROR(H5Dwrite_multi_hyperslab_read);
}
+
+ write_buf[i] = (const void *)wbuf_temp[i];
}
if (H5Dread_multi(DATASET_MULTI_COUNT, dset_id_arr, dtype_id_arr, select_all_arr, select_all_arr,
@@ -7316,9 +7752,9 @@ test_write_multi_dataset_data_verification(void)
data[i] = NULL;
}
- if (write_buf[i]) {
- free(write_buf[i]);
- write_buf[i] = NULL;
+ if (wbuf_temp[i]) {
+ free(wbuf_temp[i]);
+ wbuf_temp[i] = NULL;
}
if (read_buf[i]) {
@@ -7332,14 +7768,9 @@ test_write_multi_dataset_data_verification(void)
PART_END(H5Dwrite_multi_hyperslab_read);
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- if (data[i]) {
- free(data[i]);
- data[i] = NULL;
- }
-
- if (write_buf[i]) {
- free(write_buf[i]);
- write_buf[i] = NULL;
+ if (wbuf_temp[i]) {
+ free(wbuf_temp[i]);
+ wbuf_temp[i] = NULL;
}
if (read_buf[i]) {
@@ -7356,14 +7787,16 @@ test_write_multi_dataset_data_verification(void)
DATASET_DATA_VERIFY_WRITE_TEST_NUM_POINTS * DATASET_DATA_VERIFY_WRITE_TEST_DSET_DTYPESIZE;
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
- if (NULL == (write_buf[i] = malloc(data_size))) {
+ if (NULL == (wbuf_temp[i] = malloc(data_size))) {
H5_FAILED();
printf(" couldn't allocate buffer for dataset write\n");
PART_ERROR(H5Dwrite_multi_point_sel_read);
}
for (size_t j = 0; j < data_size / DATASET_DATA_VERIFY_WRITE_TEST_DSET_DTYPESIZE; j++)
- ((int **)write_buf)[i][j] = 13;
+ ((int **)wbuf_temp)[i][j] = 13;
+
+ write_buf[i] = (const void *)wbuf_temp[i];
data_size = 1;
@@ -7516,9 +7949,9 @@ test_write_multi_dataset_data_verification(void)
data[i] = NULL;
}
- if (write_buf[i]) {
- free(write_buf[i]);
- write_buf[i] = NULL;
+ if (wbuf_temp[i]) {
+ free(wbuf_temp[i]);
+ wbuf_temp[i] = NULL;
}
if (read_buf[i]) {
@@ -7549,8 +7982,8 @@ error:
for (i = 0; i < DATASET_MULTI_COUNT; i++) {
if (data[i])
free(data[i]);
- if (write_buf[i])
- free(write_buf[i]);
+ if (wbuf_temp[i])
+ free(wbuf_temp[i]);
if (read_buf[i])
free(read_buf[i]);
@@ -7810,6 +8243,251 @@ error:
}
/*
+ * A test to ensure that strings of any encoding
+ * can be written to and read from a dataset
+ */
+static int
+test_dataset_string_encodings(void)
+{
+ hid_t file_id = H5I_INVALID_HID;
+ hid_t container_group = H5I_INVALID_HID;
+ hid_t dset_id1 = H5I_INVALID_HID;
+ hid_t dset_id2 = H5I_INVALID_HID;
+ hid_t type_id1 = H5I_INVALID_HID;
+ hid_t type_id2 = H5I_INVALID_HID;
+ hid_t space_id = H5I_INVALID_HID;
+ hsize_t dims[DATASET_STRING_ENCODINGS_RANK] = {DATASET_STRING_ENCODINGS_EXTENT};
+ size_t ascii_str_size = 0;
+ size_t utf8_str_size = 0;
+ char *write_buf = NULL;
+ char *read_buf = NULL;
+
+ TESTING_MULTIPART("string encoding read/write correctness on datasets");
+
+ /* Make sure the connector supports the API functions being tested */
+ if (!(vol_cap_flags_g & H5VL_CAP_FLAG_FILE_BASIC) || !(vol_cap_flags_g & H5VL_CAP_FLAG_GROUP_BASIC) ||
+ !(vol_cap_flags_g & H5VL_CAP_FLAG_DATASET_BASIC) || !(vol_cap_flags_g & H5VL_CAP_FLAG_ATTR_BASIC)) {
+ SKIPPED();
+ printf(" API functions for basic file, group, basic or more dataset aren't supported with this "
+ "connector\n");
+ return 0;
+ }
+
+ TESTING_2("test setup");
+
+ ascii_str_size = strlen(DATASET_STRING_ENCODINGS_ASCII_STRING);
+ utf8_str_size = strlen(DATASET_STRING_ENCODINGS_UTF8_STRING);
+
+ if ((file_id = H5Fopen(H5_api_test_filename, H5F_ACC_RDWR, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open file '%s'\n", H5_api_test_filename);
+ goto error;
+ }
+
+ if ((container_group = H5Gopen2(file_id, DATASET_TEST_GROUP_NAME, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open container group '%s'\n", DATASET_TEST_GROUP_NAME);
+ goto error;
+ }
+
+ if ((space_id = H5Screate_simple(DATASET_STRING_ENCODINGS_RANK, dims, NULL)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create dataspace\n");
+ goto error;
+ }
+
+ if ((type_id1 = H5Tcopy(H5T_C_S1)) < 0) {
+ H5_FAILED();
+ printf(" couldn't copy builtin string datatype\n");
+ goto error;
+ }
+
+ if ((H5Tset_size(type_id1, ascii_str_size)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set size of string datatype\n");
+ goto error;
+ }
+
+ if ((H5Tset_cset(type_id1, H5T_CSET_ASCII)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set character set of string to ASCII\n");
+ goto error;
+ }
+
+ if ((dset_id1 = H5Dcreate2(container_group, DATASET_STRING_ENCODINGS_DSET_NAME1, type_id1, space_id,
+ H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create dataset with ASCII string\n");
+ goto error;
+ }
+
+ if ((type_id2 = H5Tcopy(H5T_C_S1)) < 0) {
+ H5_FAILED();
+ printf(" couldn't copy builtin string datatype\n");
+ goto error;
+ }
+
+ if ((H5Tset_size(type_id2, utf8_str_size)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set size of string datatype\n");
+ goto error;
+ }
+
+ if ((H5Tset_cset(type_id2, H5T_CSET_UTF8)) < 0) {
+ H5_FAILED();
+ printf(" couldn't set character set of string to UTF-8\n");
+ goto error;
+ }
+
+ if ((dset_id2 = H5Dcreate2(container_group, DATASET_STRING_ENCODINGS_DSET_NAME2, type_id2, space_id,
+ H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create dataset with UTF-8 string\n");
+ goto error;
+ }
+
+ PASSED();
+
+ BEGIN_MULTIPART
+ {
+ PART_BEGIN(ASCII_cset)
+ {
+ TESTING_2("ASCII character set");
+ /* Dataset with ASCII string datatype */
+ if ((write_buf = calloc(1, ascii_str_size + 1)) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for write buffer\n");
+ PART_ERROR(ASCII_cset);
+ }
+
+ memcpy(write_buf, DATASET_STRING_ENCODINGS_ASCII_STRING, ascii_str_size);
+
+ if ((H5Dwrite(dset_id1, type_id1, H5S_ALL, H5S_ALL, H5P_DEFAULT, write_buf)) < 0) {
+ H5_FAILED();
+ printf(" couldn't write to dataset with ASCII string\n");
+ PART_ERROR(ASCII_cset);
+ }
+
+ if ((read_buf = calloc(1, ascii_str_size + 1)) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for read buffer\n");
+ PART_ERROR(ASCII_cset);
+ }
+
+ if ((H5Dread(dset_id1, type_id1, H5S_ALL, H5S_ALL, H5P_DEFAULT, read_buf)) < 0) {
+ H5_FAILED();
+ printf(" couldn't read from dataset with ASCII string\n");
+ PART_ERROR(ASCII_cset);
+ }
+
+ if (strncmp(write_buf, read_buf, ascii_str_size)) {
+ H5_FAILED();
+ printf(" incorrect data read from dataset with ASCII string\n");
+ PART_ERROR(ASCII_cset);
+ }
+
+ free(write_buf);
+ write_buf = NULL;
+
+ free(read_buf);
+ read_buf = NULL;
+
+ PASSED();
+ }
+ PART_END(ASCII_cset);
+
+ PART_BEGIN(UTF8_cset)
+ {
+ TESTING_2("UTF-8 character set");
+ /* Dataset with UTF-8 string datatype */
+ if ((write_buf = calloc(1, utf8_str_size + 1)) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for write buffer\n");
+ PART_ERROR(UTF8_cset);
+ }
+
+ memcpy(write_buf, DATASET_STRING_ENCODINGS_UTF8_STRING, utf8_str_size);
+
+ if ((H5Dwrite(dset_id2, type_id2, H5S_ALL, H5S_ALL, H5P_DEFAULT, write_buf)) < 0) {
+ H5_FAILED();
+ printf(" couldn't write to dataset with ASCII string\n");
+ PART_ERROR(UTF8_cset);
+ }
+
+ if ((read_buf = calloc(1, utf8_str_size + 1)) == NULL) {
+ H5_FAILED();
+ printf(" couldn't allocate memory for read buffer\n");
+ PART_ERROR(UTF8_cset);
+ }
+
+ if ((H5Dread(dset_id2, type_id2, H5S_ALL, H5S_ALL, H5P_DEFAULT, read_buf)) < 0) {
+ H5_FAILED();
+ printf(" couldn't read from dataset with ASCII string\n");
+ PART_ERROR(UTF8_cset);
+ }
+
+ if (strncmp(write_buf, read_buf, utf8_str_size)) {
+ H5_FAILED();
+ printf(" incorrect data read from dataset with ASCII string\n");
+ PART_ERROR(UTF8_cset);
+ }
+
+ free(write_buf);
+ write_buf = NULL;
+
+ free(read_buf);
+ read_buf = NULL;
+
+ PASSED();
+ }
+ PART_END(UTF8_cset);
+
+ PASSED();
+ }
+ END_MULTIPART;
+
+ TESTING_2("test cleanup");
+
+ if (H5Dclose(dset_id1) < 0)
+ TEST_ERROR;
+ if (H5Dclose(dset_id2) < 0)
+ TEST_ERROR;
+ if (H5Tclose(type_id1) < 0)
+ TEST_ERROR;
+ if (H5Tclose(type_id2) < 0)
+ TEST_ERROR;
+ if (H5Sclose(space_id) < 0)
+ TEST_ERROR;
+ if (H5Gclose(container_group) < 0)
+ TEST_ERROR;
+ if (H5Fclose(file_id) < 0)
+ TEST_ERROR;
+ if (write_buf)
+ free(write_buf);
+ if (read_buf)
+ free(read_buf);
+ PASSED();
+
+ return 0;
+
+error:
+ H5E_BEGIN_TRY
+ {
+ H5Dclose(dset_id1);
+ H5Dclose(dset_id2);
+ H5Tclose(type_id1);
+ H5Tclose(type_id2);
+ H5Sclose(space_id);
+ H5Gclose(container_group);
+ H5Fclose(file_id);
+ if (write_buf)
+ free(write_buf);
+ if (read_buf)
+ free(read_buf);
+ }
+ H5E_END_TRY;
+
+ return 1;
+}
+
+/*
* A test to ensure that data is read back correctly from a dataset after it has
* been written, using type conversion with builtin types.
*/
@@ -8347,6 +9025,548 @@ error:
return 1;
}
+static int
+test_dataset_real_to_int_conversion(void)
+{
+ hssize_t space_npoints;
+ hsize_t dims[DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK] = {10, 10, 10};
+ hsize_t start[DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK];
+ hsize_t stride[DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK];
+ hsize_t count[DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK];
+ hsize_t block[DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK];
+ hsize_t points[DATASET_DATA_REAL_CONVERSION_TEST_NUM_POINTS *
+ DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK];
+ size_t i, data_size;
+ hid_t file_id = H5I_INVALID_HID;
+ hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
+ hid_t dset_id = H5I_INVALID_HID;
+ hid_t fspace_id = H5I_INVALID_HID;
+ hid_t mspace_id = H5I_INVALID_HID;
+ hid_t real_type_id = DATASET_DATA_REAL_CONVERSION_TEST_REAL_TYPE;
+ void *data = NULL;
+ void *write_buf = NULL;
+ void *read_buf = NULL;
+
+ TESTING_MULTIPART(
+ "verification of dataset data using H5Dwrite then H5Dread with real <-> integer type conversion");
+
+ /* Make sure the connector supports the API functions being tested */
+ if (!(vol_cap_flags_g & H5VL_CAP_FLAG_FILE_BASIC) || !(vol_cap_flags_g & H5VL_CAP_FLAG_GROUP_BASIC) ||
+ !(vol_cap_flags_g & H5VL_CAP_FLAG_DATASET_BASIC)) {
+ SKIPPED();
+ printf(" API functions for basic file, group, basic or more dataset aren't supported with this "
+ "connector\n");
+ return 0;
+ }
+
+ TESTING_2("test setup");
+
+ if ((file_id = H5Fopen(H5_api_test_filename, H5F_ACC_RDWR, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open file '%s'\n", H5_api_test_filename);
+ goto error;
+ }
+
+ if ((container_group = H5Gopen2(file_id, DATASET_TEST_GROUP_NAME, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't open container group '%s'\n", DATASET_TEST_GROUP_NAME);
+ goto error;
+ }
+
+ if ((group_id = H5Gcreate2(container_group, DATASET_DATA_REAL_CONVERSION_TEST_GROUP_NAME, H5P_DEFAULT,
+ H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create container sub-group '%s'\n",
+ DATASET_DATA_REAL_CONVERSION_TEST_GROUP_NAME);
+ goto error;
+ }
+
+ if ((fspace_id = H5Screate_simple(DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK, dims, NULL)) < 0)
+ TEST_ERROR;
+
+ if ((dset_id = H5Dcreate2(group_id, DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME, real_type_id, fspace_id,
+ H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ goto error;
+ }
+
+ for (i = 0, data_size = 1; i < DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK; i++)
+ data_size *= dims[i];
+ data_size *= DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE;
+
+ if (NULL == (data = malloc(data_size)))
+ TEST_ERROR;
+
+ for (i = 0; i < data_size / DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE; i++)
+ ((int *)data)[i] = (int)i;
+
+ PASSED();
+
+ BEGIN_MULTIPART
+ {
+ PART_BEGIN(H5Dwrite_all_read)
+ {
+ TESTING_2("write then read int from real dataset with H5S_ALL selection");
+
+ if (H5Dwrite(dset_id, DATASET_DATA_REAL_CONVERSION_TEST_INT_TYPE, H5S_ALL, H5S_ALL, H5P_DEFAULT,
+ data) < 0) {
+ H5_FAILED();
+ printf(" couldn't write to dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ PART_ERROR(H5Dwrite_all_read);
+ }
+
+ if (data) {
+ free(data);
+ data = NULL;
+ }
+
+ if (fspace_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Sclose(fspace_id);
+ }
+ H5E_END_TRY;
+ fspace_id = H5I_INVALID_HID;
+ }
+ if (dset_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Dclose(dset_id);
+ }
+ H5E_END_TRY;
+ dset_id = H5I_INVALID_HID;
+ }
+
+ if ((dset_id = H5Dopen2(group_id, DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME, H5P_DEFAULT)) <
+ 0) {
+ H5_FAILED();
+ printf(" couldn't open dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ PART_ERROR(H5Dwrite_all_read);
+ }
+
+ if ((fspace_id = H5Dget_space(dset_id)) < 0) {
+ H5_FAILED();
+ printf(" couldn't get dataset dataspace\n");
+ PART_ERROR(H5Dwrite_all_read);
+ }
+
+ if ((space_npoints = H5Sget_simple_extent_npoints(fspace_id)) < 0) {
+ H5_FAILED();
+ printf(" couldn't get dataspace num points\n");
+ PART_ERROR(H5Dwrite_all_read);
+ }
+
+ if (NULL ==
+ (data = malloc((hsize_t)space_npoints * DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE))) {
+ H5_FAILED();
+ printf(" couldn't allocate buffer for dataset read\n");
+ PART_ERROR(H5Dwrite_all_read);
+ }
+
+ if (H5Dread(dset_id, DATASET_DATA_REAL_CONVERSION_TEST_INT_TYPE, H5S_ALL, H5S_ALL, H5P_DEFAULT,
+ data) < 0) {
+ H5_FAILED();
+ printf(" couldn't read from dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ PART_ERROR(H5Dwrite_all_read);
+ }
+
+ for (i = 0; i < (hsize_t)space_npoints; i++)
+ if (((int *)data)[i] != (int)i) {
+ H5_FAILED();
+ printf(" H5S_ALL selection data verification failed\n");
+ PART_ERROR(H5Dwrite_all_read);
+ }
+
+ if (data) {
+ free(data);
+ data = NULL;
+ }
+
+ PASSED();
+ }
+ PART_END(H5Dwrite_all_read);
+
+ if (data) {
+ free(data);
+ data = NULL;
+ }
+
+ if (write_buf) {
+ free(write_buf);
+ write_buf = NULL;
+ }
+
+ if (read_buf) {
+ free(read_buf);
+ read_buf = NULL;
+ }
+
+ PART_BEGIN(H5Dwrite_hyperslab_read)
+ {
+ TESTING_2("write then read int from real dataset with hyperslab selection");
+
+ data_size = dims[1] * 2 * DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE;
+
+ if (NULL == (write_buf = malloc(data_size))) {
+ H5_FAILED();
+ printf(" couldn't allocate buffer for dataset write\n");
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+
+ for (i = 0; i < data_size / DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE; i++)
+ ((int *)write_buf)[i] = 56;
+
+ for (i = 0, data_size = 1; i < DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK; i++)
+ data_size *= dims[i];
+ data_size *= DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE;
+
+ if (NULL == (data = calloc(1, data_size))) {
+ H5_FAILED();
+ printf(" couldn't allocate buffer for dataset data verification\n");
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+
+ for (i = 0; i < dims[0] * dims[1] * dims[2]; i++)
+ ((int *)data)[i] = (int)i;
+
+ for (i = 0; i < 2; i++) {
+ size_t j;
+
+ for (j = 0; j < dims[1]; j++)
+ ((int *)data)[(i * dims[1] * dims[2]) + (j * dims[2])] = 56;
+ }
+
+ /* Write to the first element of each row in the first two planes of the dataset */
+ start[0] = start[1] = start[2] = 0;
+ stride[0] = stride[1] = stride[2] = 1;
+ count[0] = 2;
+ count[1] = dims[1];
+ count[2] = 1;
+ block[0] = block[1] = block[2] = 1;
+
+ if (H5Sselect_hyperslab(fspace_id, H5S_SELECT_SET, start, stride, count, block) < 0) {
+ H5_FAILED();
+ printf(" couldn't select hyperslab for dataset write\n");
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+
+ {
+ hsize_t mdims[] = {(hsize_t)2 * dims[1]};
+
+ if ((mspace_id = H5Screate_simple(1, mdims, NULL)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create memory dataspace\n");
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+ }
+
+ if (H5Dwrite(dset_id, DATASET_DATA_REAL_CONVERSION_TEST_INT_TYPE, mspace_id, fspace_id,
+ H5P_DEFAULT, write_buf) < 0) {
+ H5_FAILED();
+ printf(" couldn't write to dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+
+ if (mspace_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Sclose(mspace_id);
+ }
+ H5E_END_TRY;
+ mspace_id = H5I_INVALID_HID;
+ }
+ if (fspace_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Sclose(fspace_id);
+ }
+ H5E_END_TRY;
+ fspace_id = H5I_INVALID_HID;
+ }
+ if (dset_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Dclose(dset_id);
+ }
+ H5E_END_TRY;
+ dset_id = H5I_INVALID_HID;
+ }
+
+ if ((dset_id = H5Dopen2(group_id, DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME, H5P_DEFAULT)) <
+ 0) {
+ H5_FAILED();
+ printf(" couldn't open dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+
+ if ((fspace_id = H5Dget_space(dset_id)) < 0) {
+ H5_FAILED();
+ printf(" couldn't get dataset dataspace\n");
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+
+ if ((space_npoints = H5Sget_simple_extent_npoints(fspace_id)) < 0) {
+ H5_FAILED();
+ printf(" couldn't get dataspace num points\n");
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+
+ if (NULL == (read_buf = malloc((hsize_t)space_npoints *
+ DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE))) {
+ H5_FAILED();
+ printf(" couldn't allocate buffer for dataset read\n");
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+
+ if (H5Dread(dset_id, DATASET_DATA_REAL_CONVERSION_TEST_INT_TYPE, H5S_ALL, H5S_ALL, H5P_DEFAULT,
+ read_buf) < 0) {
+ H5_FAILED();
+ printf(" couldn't read from dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+
+ if (memcmp(data, read_buf, data_size)) {
+ H5_FAILED();
+ printf(" hyperslab selection data verification failed\n");
+ PART_ERROR(H5Dwrite_hyperslab_read);
+ }
+
+ if (data) {
+ free(data);
+ data = NULL;
+ }
+
+ if (write_buf) {
+ free(write_buf);
+ write_buf = NULL;
+ }
+
+ if (read_buf) {
+ free(read_buf);
+ read_buf = NULL;
+ }
+
+ PASSED();
+ }
+ PART_END(H5Dwrite_hyperslab_read);
+
+ if (data) {
+ free(data);
+ data = NULL;
+ }
+
+ if (write_buf) {
+ free(write_buf);
+ write_buf = NULL;
+ }
+
+ if (read_buf) {
+ free(read_buf);
+ read_buf = NULL;
+ }
+
+ PART_BEGIN(H5Dwrite_point_sel_read)
+ {
+ TESTING_2("write then read int from real dataset with point selection");
+
+ data_size = DATASET_DATA_REAL_CONVERSION_TEST_NUM_POINTS *
+ DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE;
+
+ if (NULL == (write_buf = malloc(data_size))) {
+ H5_FAILED();
+ printf(" couldn't allocate buffer for dataset write\n");
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ for (i = 0; i < data_size / DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE; i++)
+ ((int *)write_buf)[i] = 13;
+
+ for (i = 0, data_size = 1; i < DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK; i++)
+ data_size *= dims[i];
+ data_size *= DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE;
+
+ if (NULL == (data = malloc(data_size))) {
+ H5_FAILED();
+ printf(" couldn't allocate buffer for dataset data verification\n");
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ if (H5Dread(dset_id, DATASET_DATA_REAL_CONVERSION_TEST_INT_TYPE, H5S_ALL, H5S_ALL, H5P_DEFAULT,
+ data) < 0) {
+ H5_FAILED();
+ printf(" couldn't read from dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ for (i = 0; i < dims[0]; i++) {
+ size_t j;
+
+ for (j = 0; j < dims[1]; j++) {
+ size_t k;
+
+ for (k = 0; k < dims[2]; k++) {
+ if (i == j && j == k)
+ ((int *)data)[(i * dims[1] * dims[2]) + (j * dims[2]) + k] = 13;
+ }
+ }
+ }
+
+ /* Select a series of 10 points in the dataset */
+ for (i = 0; i < DATASET_DATA_REAL_CONVERSION_TEST_NUM_POINTS; i++) {
+ size_t j;
+
+ for (j = 0; j < DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK; j++)
+ points[(i * DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK) + j] = i;
+ }
+
+ if (H5Sselect_elements(fspace_id, H5S_SELECT_SET, DATASET_DATA_REAL_CONVERSION_TEST_NUM_POINTS,
+ points) < 0) {
+ H5_FAILED();
+ printf(" couldn't select elements in dataspace\n");
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ {
+ hsize_t mdims[] = {(hsize_t)DATASET_DATA_REAL_CONVERSION_TEST_NUM_POINTS};
+
+ if ((mspace_id = H5Screate_simple(1, mdims, NULL)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create memory dataspace\n");
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+ }
+
+ if (H5Dwrite(dset_id, DATASET_DATA_REAL_CONVERSION_TEST_INT_TYPE, mspace_id, fspace_id,
+ H5P_DEFAULT, write_buf) < 0) {
+ H5_FAILED();
+ printf(" couldn't write to dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ if (mspace_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Sclose(mspace_id);
+ }
+ H5E_END_TRY;
+ mspace_id = H5I_INVALID_HID;
+ }
+ if (fspace_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Sclose(fspace_id);
+ }
+ H5E_END_TRY;
+ fspace_id = H5I_INVALID_HID;
+ }
+ if (dset_id >= 0) {
+ H5E_BEGIN_TRY
+ {
+ H5Dclose(dset_id);
+ }
+ H5E_END_TRY;
+ dset_id = H5I_INVALID_HID;
+ }
+
+ if ((dset_id = H5Dopen2(group_id, DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME, H5P_DEFAULT)) <
+ 0) {
+ H5_FAILED();
+ printf(" couldn't open dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ if ((fspace_id = H5Dget_space(dset_id)) < 0) {
+ H5_FAILED();
+ printf(" couldn't get dataset dataspace\n");
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ if ((space_npoints = H5Sget_simple_extent_npoints(fspace_id)) < 0) {
+ H5_FAILED();
+ printf(" couldn't get dataspace num points\n");
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ if (NULL == (read_buf = malloc((hsize_t)space_npoints *
+ DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE))) {
+ H5_FAILED();
+ printf(" couldn't allocate buffer for dataset read\n");
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ if (H5Dread(dset_id, DATASET_DATA_REAL_CONVERSION_TEST_INT_TYPE, H5S_ALL, H5S_ALL, H5P_DEFAULT,
+ read_buf) < 0) {
+ H5_FAILED();
+ printf(" couldn't read from dataset '%s'\n", DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME);
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ if (memcmp(data, read_buf, data_size)) {
+ H5_FAILED();
+ printf(" point selection data verification failed\n");
+ PART_ERROR(H5Dwrite_point_sel_read);
+ }
+
+ PASSED();
+ }
+ PART_END(H5Dwrite_point_sel_read);
+ }
+ END_MULTIPART;
+
+ TESTING_2("test cleanup");
+
+ if (data) {
+ free(data);
+ data = NULL;
+ }
+
+ if (write_buf) {
+ free(write_buf);
+ write_buf = NULL;
+ }
+
+ if (read_buf) {
+ free(read_buf);
+ read_buf = NULL;
+ }
+
+ if (H5Sclose(fspace_id) < 0)
+ TEST_ERROR;
+ if (H5Dclose(dset_id) < 0)
+ TEST_ERROR;
+ if (H5Gclose(group_id) < 0)
+ TEST_ERROR;
+ if (H5Gclose(container_group) < 0)
+ TEST_ERROR;
+ if (H5Fclose(file_id) < 0)
+ TEST_ERROR;
+
+ PASSED();
+
+ return 0;
+
+error:
+ H5E_BEGIN_TRY
+ {
+ if (data)
+ free(data);
+ if (write_buf)
+ free(write_buf);
+ if (read_buf)
+ free(read_buf);
+ H5Sclose(mspace_id);
+ H5Sclose(fspace_id);
+ H5Dclose(dset_id);
+ H5Gclose(group_id);
+ H5Gclose(container_group);
+ H5Fclose(file_id);
+ }
+ H5E_END_TRY;
+
+ return 1;
+}
+
/*
* A test to ensure that data is read back correctly from a dataset after it has
* been written, using partial element I/O with compound types
@@ -8373,8 +9593,8 @@ test_dataset_compound_partial_io(void)
dataset_compount_partial_io_t fbuf[DATASET_COMPOUND_PARTIAL_IO_DSET_DIMS];
dataset_compount_partial_io_t erbuf[DATASET_COMPOUND_PARTIAL_IO_DSET_DIMS];
- TESTING_MULTIPART(
- "verification of dataset data using H5Dwrite then H5Dread with partial element compound type I/O");
+ TESTING_MULTIPART("verification of dataset data using H5Dwrite then H5Dread with partial element "
+ "compound type I/O");
/* Make sure the connector supports the API functions being tested */
if (!(vol_cap_flags_g & H5VL_CAP_FLAG_FILE_BASIC) || !(vol_cap_flags_g & H5VL_CAP_FLAG_GROUP_BASIC) ||
@@ -11999,10 +13219,10 @@ test_read_partial_chunk_all_selection(void)
for (j = 0; j < FIXED_DIMSIZE; j++)
if (read_buf[i][j] != (int)((i * FIXED_DIMSIZE) + j)) {
H5_FAILED();
- printf(
- " data verification failed for read buffer element %lld: expected %lld but was %lld\n",
- (long long)((i * FIXED_DIMSIZE) + j), (long long)((i * FIXED_DIMSIZE) + j),
- (long long)read_buf[i][j]);
+ printf(" data verification failed for read buffer element %lld: expected %lld but was "
+ "%lld\n",
+ (long long)((i * FIXED_DIMSIZE) + j), (long long)((i * FIXED_DIMSIZE) + j),
+ (long long)read_buf[i][j]);
goto error;
}
diff --git a/test/API/H5_api_dataset_test.h b/test/API/H5_api_dataset_test.h
index bba3073..086dc1c 100644
--- a/test/API/H5_api_dataset_test.h
+++ b/test/API/H5_api_dataset_test.h
@@ -42,6 +42,13 @@ int H5_api_dataset_test(void);
#define DATASET_CREATE_ANONYMOUS_INVALID_PARAMS_GROUP_NAME "anon_dset_creation_invalid_params_test"
#define DATASET_CREATE_ANONYMOUS_INVALID_PARAMS_SPACE_RANK 2
+#define DATASET_STRING_ENCODINGS_RANK 1
+#define DATASET_STRING_ENCODINGS_EXTENT 1
+#define DATASET_STRING_ENCODINGS_DSET_NAME1 "encoding_dset1"
+#define DATASET_STRING_ENCODINGS_DSET_NAME2 "encoding_dset2"
+#define DATASET_STRING_ENCODINGS_ASCII_STRING "asciistr"
+#define DATASET_STRING_ENCODINGS_UTF8_STRING "αaααaaaα"
+
#define DATASET_CREATE_NULL_DATASPACE_TEST_SUBGROUP_NAME "dataset_with_null_space_test"
#define DATASET_CREATE_NULL_DATASPACE_TEST_DSET_NAME "dataset_with_null_space"
@@ -53,7 +60,7 @@ int H5_api_dataset_test(void);
#define ZERO_DIM_DSET_TEST_DSET_NAME "zero_dim_dset"
#define DATASET_MANY_CREATE_GROUP_NAME "group_for_many_datasets"
-#define DSET_NAME_BUF_SIZE 64u
+#define DSET_NAME_BUF_SIZE 64
#define DATASET_NUMB 100u
#define DATASET_SHAPE_TEST_DSET_BASE_NAME "dataset_shape_test"
@@ -106,6 +113,10 @@ int H5_api_dataset_test(void);
#define DATASET_CREATION_PROPERTIES_TEST_MAX_COMPACT 12
#define DATASET_CREATION_PROPERTIES_TEST_MIN_DENSE 8
#define DATASET_CREATION_PROPERTIES_TEST_SHAPE_RANK 3
+#define DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_ID 32004
+#define DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_NAME "lz4"
+#define DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_DSET_NAME "ud_filter_test"
+#define DATASET_CREATION_PROPERTIES_TEST_UD_FILTER_NUM_PARAMS 3
#define DATASET_OPEN_INVALID_PARAMS_SPACE_RANK 2
#define DATASET_OPEN_INVALID_PARAMS_GROUP_NAME "dataset_open_test"
@@ -126,6 +137,24 @@ int H5_api_dataset_test(void);
#define DATASET_PROPERTY_LIST_TEST_DSET_NAME3 "property_list_test_dataset3"
#define DATASET_PROPERTY_LIST_TEST_DSET_NAME4 "property_list_test_dataset4"
+#define DATASET_STORAGE_SIZE_TEST_ALL_DSET_SPACE_RANK 2
+#define DATASET_STORAGE_SIZE_TEST_ALL_DSET_EXTENT 10
+#define DATASET_STORAGE_SIZE_TEST_GROUP_NAME "dataset_get_storage_size_test"
+#define DATASET_STORAGE_SIZE_TEST_DSET_CONTIGUOUS_NAME "dataset_contiguous"
+#define DATASET_STORAGE_SIZE_TEST_DSET_CHUNKED_NAME "dataset_chunked"
+#define DATASET_STORAGE_SIZE_TEST_DSET_FILTERED_NAME "dataset_filtered"
+#define DATASET_STORAGE_SIZE_TEST_TYPE H5T_NATIVE_INT
+
+#define DATASET_FILL_VALUE_TEST_DSET_NAME1 "dataset_fill_value_test_dataset1"
+#define DATASET_FILL_VALUE_TEST_DSET_NAME2 "dataset_fill_value_test_dataset2"
+#define DATASET_FILL_VALUE_TEST_DSET_NAME3 "dataset_fill_value_test_dataset3"
+#define DATASET_FILL_VALUE_TEST_INT_TYPE H5T_NATIVE_INT
+#define DATASET_FILL_VALUE_TEST_INT_FILL_VALUE 1
+#define DATASET_FILL_VALUE_TEST_DOUBLE_TYPE H5T_NATIVE_DOUBLE
+#define DATASET_FILL_VALUE_TEST_DOUBLE_FILL_VALUE 2.002
+#define DATASET_FILL_VALUE_TEST_STRING_FILL_VALUE "abcdefgh"
+#define DATASET_FILL_VALUE_TEST_STRING_SIZE 8 /* No null terminator for fixed-length string */
+
#define DATASET_SMALL_READ_TEST_ALL_DSET_SPACE_RANK 3
#define DATASET_SMALL_READ_TEST_ALL_DSET_DTYPESIZE sizeof(int)
#define DATASET_SMALL_READ_TEST_ALL_DSET_DTYPE H5T_NATIVE_INT
@@ -214,6 +243,15 @@ int H5_api_dataset_test(void);
#define DATASET_DATA_BUILTIN_CONVERSION_TEST_GROUP_NAME "dataset_builtin_conversion_verification_test"
#define DATASET_DATA_BUILTIN_CONVERSION_TEST_DSET_NAME "dataset_builtin_conversion_verification_dset"
+#define DATASET_DATA_REAL_CONVERSION_TEST_DSET_SPACE_RANK 3
+#define DATASET_DATA_REAL_CONVERSION_TEST_NUM_POINTS 10
+#define DATASET_DATA_REAL_CONVERSION_TEST_GROUP_NAME "dataset_real_conversion_verification_test"
+#define DATASET_DATA_REAL_CONVERSION_TEST_DSET_NAME "dataset_real_conversion_verification_dset"
+#define DATASET_DATA_REAL_CONVERSION_TEST_INT_DTYPESIZE sizeof(int)
+#define DATASET_DATA_REAL_CONVERSION_TEST_INT_TYPE H5T_NATIVE_INT
+#define DATASET_DATA_REAL_CONVERSION_TEST_REAL_DTYPESIZE sizeof(double)
+#define DATASET_DATA_REAL_CONVERSION_TEST_REAL_TYPE H5T_NATIVE_DOUBLE
+
#define DATASET_COMPOUND_PARTIAL_IO_DSET_DIMS 10
#define DATASET_DATA_COMPOUND_PARTIAL_IO_TEST_GROUP_NAME "dataset_compound_partial_io_test"
#define DATASET_DATA_COMPOUND_PARTIAL_IO_TEST_DSET_NAME "dataset_compound_partial_io_test"
diff --git a/test/API/H5_api_file_test.c b/test/API/H5_api_file_test.c
index 804b3bd..5b91551 100644
--- a/test/API/H5_api_file_test.c
+++ b/test/API/H5_api_file_test.c
@@ -948,7 +948,7 @@ test_flush_file(void)
}
for (u = 0; u < 10; u++) {
- sprintf(dset_name, "Dataset %u", u);
+ snprintf(dset_name, sizeof(dset_name), "Dataset %u", u);
if ((dset_id = H5Dcreate2(file_id, dset_name, H5T_STD_U32LE, dspace_id, H5P_DEFAULT, H5P_DEFAULT,
H5P_DEFAULT)) < 0) {
diff --git a/test/API/H5_api_group_test.c b/test/API/H5_api_group_test.c
index 4132f64..0203ebe 100644
--- a/test/API/H5_api_group_test.c
+++ b/test/API/H5_api_group_test.c
@@ -229,7 +229,7 @@ test_create_many_groups(void)
printf("\n");
for (i = 0; i < GROUP_NUMB_MANY; i++) {
printf("\r %u/%u", i + 1, GROUP_NUMB_MANY);
- sprintf(group_name, "group %02u", i);
+ snprintf(group_name, sizeof(group_name), "group %02u", i);
if ((child_group_id =
H5Gcreate2(parent_group_id, group_name, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT)) < 0) {
H5_FAILED();
@@ -342,11 +342,11 @@ create_group_recursive(hid_t parent_gid, unsigned counter)
printf("\r %u/%u", counter, GROUP_DEPTH);
if (counter == 1)
- sprintf(gname, "2nd_child_group");
+ snprintf(gname, sizeof(gname), "2nd_child_group");
else if (counter == 2)
- sprintf(gname, "3rd_child_group");
+ snprintf(gname, sizeof(gname), "3rd_child_group");
else
- sprintf(gname, "%dth_child_group", counter + 1);
+ snprintf(gname, sizeof(gname), "%dth_child_group", counter + 1);
if ((child_gid = H5Gcreate2(parent_gid, gname, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT)) < 0) {
H5_FAILED();
printf(" couldn't create group '%s'\n", gname);
diff --git a/test/API/H5_api_object_test.c b/test/API/H5_api_object_test.c
index d861661..77af8c9 100644
--- a/test/API/H5_api_object_test.c
+++ b/test/API/H5_api_object_test.c
@@ -51,6 +51,8 @@ static herr_t object_copy_soft_link_expand_callback(hid_t group, const char *nam
void *op_data);
static herr_t object_visit_callback(hid_t o_id, const char *name, const H5O_info2_t *object_info,
void *op_data);
+static herr_t object_visit_simple_callback(hid_t o_id, const char *name, const H5O_info2_t *object_info,
+ void *op_data);
static herr_t object_visit_dset_callback(hid_t o_id, const char *name, const H5O_info2_t *object_info,
void *op_data);
static herr_t object_visit_dtype_callback(hid_t o_id, const char *name, const H5O_info2_t *object_info,
@@ -5048,15 +5050,23 @@ test_object_comments_invalid_params(void)
static int
test_object_visit(void)
{
- size_t i;
- hid_t file_id = H5I_INVALID_HID;
- hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
- hid_t group_id2 = H5I_INVALID_HID;
- hid_t gcpl_id = H5I_INVALID_HID;
- hid_t type_id = H5I_INVALID_HID;
- hid_t dset_id = H5I_INVALID_HID;
- hid_t dset_dtype = H5I_INVALID_HID;
- hid_t fspace_id = H5I_INVALID_HID;
+ size_t i;
+ hid_t file_id = H5I_INVALID_HID;
+ hid_t file_id2 = H5I_INVALID_HID;
+ hid_t container_group = H5I_INVALID_HID, group_id = H5I_INVALID_HID;
+ hid_t group_id2 = H5I_INVALID_HID;
+ hid_t gcpl_id = H5I_INVALID_HID;
+ hid_t type_id = H5I_INVALID_HID;
+ hid_t dset_id = H5I_INVALID_HID;
+ hid_t dset_dtype = H5I_INVALID_HID;
+ hid_t fspace_id = H5I_INVALID_HID;
+ hid_t attr_id = H5I_INVALID_HID;
+ hid_t group_id3 = H5I_INVALID_HID;
+ hid_t group_id4 = H5I_INVALID_HID;
+ hid_t group_id5 = H5I_INVALID_HID;
+ hssize_t num_elems = 0;
+ size_t elem_size = 0;
+ char visit_filename[H5_API_TEST_FILENAME_MAX_LENGTH];
TESTING_MULTIPART("object visiting");
@@ -5079,6 +5089,15 @@ test_object_visit(void)
goto error;
}
+ snprintf(visit_filename, H5_API_TEST_FILENAME_MAX_LENGTH, "%s%s", test_path_prefix,
+ OBJECT_VISIT_TEST_FILE_NAME);
+
+ if ((file_id2 = H5Fcreate(visit_filename, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create file '%s'\n", OBJECT_VISIT_TEST_FILE_NAME);
+ goto error;
+ }
+
if ((container_group = H5Gopen2(file_id, OBJECT_TEST_GROUP_NAME, H5P_DEFAULT)) < 0) {
H5_FAILED();
printf(" couldn't open container group '%s'\n", OBJECT_TEST_GROUP_NAME);
@@ -5106,18 +5125,44 @@ test_object_visit(void)
goto error;
}
- if ((fspace_id = generate_random_dataspace(OBJECT_VISIT_TEST_SPACE_RANK, NULL, NULL, false)) < 0)
- TEST_ERROR;
+ /* Make sure not to generate too much data for an attribute to hold */
+ do {
+ if (fspace_id != H5I_INVALID_HID)
+ H5Sclose(fspace_id);
- if ((dset_dtype = generate_random_datatype(H5T_NO_CLASS, false)) < 0)
- TEST_ERROR;
+ if (dset_dtype != H5I_INVALID_HID)
+ H5Tclose(dset_dtype);
+
+ if ((fspace_id = generate_random_dataspace(OBJECT_VISIT_TEST_SPACE_RANK, NULL, NULL, FALSE)) < 0) {
+ TEST_ERROR;
+ }
+
+ if ((dset_dtype = generate_random_datatype(H5T_NO_CLASS, FALSE)) < 0) {
+ TEST_ERROR;
+ }
- if ((type_id = generate_random_datatype(H5T_NO_CLASS, false)) < 0) {
+ if ((num_elems = H5Sget_simple_extent_npoints(fspace_id)) < 0)
+ TEST_ERROR;
+
+ if ((elem_size = H5Tget_size(dset_dtype)) == 0)
+ TEST_ERROR;
+
+ } while (((long unsigned int)num_elems * elem_size) > OBJECT_VISIT_TEST_TOTAL_DATA_SIZE_LIMIT);
+
+ if ((type_id = generate_random_datatype(H5T_NO_CLASS, FALSE)) < 0) {
H5_FAILED();
printf(" couldn't create datatype '%s'\n", OBJECT_VISIT_TEST_TYPE_NAME);
goto error;
}
+ if ((attr_id = H5Acreate2(group_id, OBJECT_VISIT_TEST_ATTR_NAME, dset_dtype, fspace_id, H5P_DEFAULT,
+ H5P_DEFAULT)) == H5I_INVALID_HID) {
+ H5_FAILED();
+ printf(" couldn't create attribute '%s' on group '%s'\n", OBJECT_VISIT_TEST_ATTR_NAME,
+ OBJECT_VISIT_TEST_SUBGROUP_NAME);
+ goto error;
+ }
+
if ((group_id2 = H5Gcreate2(group_id, OBJECT_VISIT_TEST_GROUP_NAME, H5P_DEFAULT, gcpl_id, H5P_DEFAULT)) <
0) {
H5_FAILED();
@@ -5125,6 +5170,27 @@ test_object_visit(void)
goto error;
}
+ if ((group_id3 = H5Gcreate2(file_id2, OBJECT_VISIT_TEST_GROUP_NAME_PARENT, H5P_DEFAULT, gcpl_id,
+ H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create group '%s'\n", OBJECT_VISIT_TEST_GROUP_NAME_PARENT);
+ goto error;
+ }
+
+ if ((group_id4 = H5Gcreate2(group_id3, OBJECT_VISIT_TEST_GROUP_NAME_CHILD, H5P_DEFAULT, gcpl_id,
+ H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create group '%s'\n", OBJECT_VISIT_TEST_GROUP_NAME_CHILD);
+ goto error;
+ }
+
+ if ((group_id5 = H5Gcreate2(group_id4, OBJECT_VISIT_TEST_GROUP_NAME_GRANDCHILD, H5P_DEFAULT, gcpl_id,
+ H5P_DEFAULT)) < 0) {
+ H5_FAILED();
+ printf(" couldn't create group '%s'\n", OBJECT_VISIT_TEST_GROUP_NAME_GRANDCHILD);
+ goto error;
+ }
+
if ((dset_id = H5Dcreate2(group_id, OBJECT_VISIT_TEST_DSET_NAME, dset_dtype, fspace_id, H5P_DEFAULT,
H5P_DEFAULT, H5P_DEFAULT)) < 0) {
H5_FAILED();
@@ -5257,16 +5323,49 @@ test_object_visit(void)
}
PART_END(H5Ovisit_create_order_decreasing);
+ PART_BEGIN(H5Ovisit_group)
+ {
+ TESTING_2("H5Ovisit on a group");
+
+ i = 0;
+
+ if (H5Ovisit3(group_id3, H5_INDEX_CRT_ORDER, H5_ITER_INC, object_visit_simple_callback, &i,
+ H5O_INFO_ALL) < 0) {
+ H5_FAILED();
+ printf(" H5Ovisit on a group failed!\n");
+ PART_ERROR(H5Ovisit_group);
+ }
+
+ if (i != OBJECT_VISIT_TEST_SUBGROUP_LAYERS) {
+ H5_FAILED();
+ printf(" some objects were not visited!\n");
+ PART_ERROR(H5Ovisit_group);
+ }
+
+ PASSED();
+ }
+ PART_END(H5Ovisit_group);
+
PART_BEGIN(H5Ovisit_file)
{
TESTING_2("H5Ovisit on a file ID");
- /*
- * XXX:
- */
+ i = 0;
+
+ if (H5Ovisit3(file_id2, H5_INDEX_CRT_ORDER, H5_ITER_INC, object_visit_simple_callback, &i,
+ H5O_INFO_ALL) < 0) {
+ H5_FAILED();
+ printf(" H5Ovisit on a file ID failed!\n");
+ PART_ERROR(H5Ovisit_file);
+ }
+
+ if (i != OBJECT_VISIT_TEST_NUM_OBJS_VISITED) {
+ H5_FAILED();
+ printf(" some objects were not visited!\n");
+ PART_ERROR(H5Ovisit_file);
+ }
- SKIPPED();
- PART_EMPTY(H5Ovisit_file);
+ PASSED();
}
PART_END(H5Ovisit_file);
@@ -5300,6 +5399,30 @@ test_object_visit(void)
}
PART_END(H5Ovisit_dtype);
+ PART_BEGIN(H5Ovisit_attr)
+ {
+ TESTING_2("H5Ovisit on an attribute");
+
+ i = 0;
+
+ if (H5Ovisit3(attr_id, H5_INDEX_CRT_ORDER, H5_ITER_INC, object_visit_simple_callback, &i,
+ H5O_INFO_ALL) < 0) {
+ H5_FAILED();
+ printf(" H5Ovisit on an attribute failed!\n");
+ PART_ERROR(H5Ovisit_attr);
+ }
+
+ /* Should have the same effect as calling H5Ovisit on group_id */
+ if (i != OBJECT_VISIT_TEST_NUM_OBJS_VISITED) {
+ H5_FAILED();
+ printf(" some objects were not visited!\n");
+ PART_ERROR(H5Ovisit_attr);
+ }
+
+ PASSED();
+ }
+ PART_END(H5Ovisit_attr);
+
PART_BEGIN(H5Ovisit_by_name_obj_name_increasing)
{
TESTING_2("H5Ovisit_by_name by object name in increasing order");
@@ -5480,12 +5603,22 @@ test_object_visit(void)
{
TESTING_2("H5Ovisit_by_name on a file ID");
- /*
- * XXX:
- */
+ i = 0;
+
+ if (H5Ovisit_by_name3(file_id2, "/", H5_INDEX_CRT_ORDER, H5_ITER_INC,
+ object_visit_simple_callback, &i, H5O_INFO_ALL, H5P_DEFAULT) < 0) {
+ H5_FAILED();
+ printf(" H5Ovisit on a file ID failed!\n");
+ PART_ERROR(H5Ovisit_by_name_file);
+ }
- SKIPPED();
- PART_EMPTY(H5Ovisit_by_name_file);
+ if (i != OBJECT_VISIT_TEST_NUM_OBJS_VISITED) {
+ H5_FAILED();
+ printf(" some objects were not visited!\n");
+ PART_ERROR(H5Ovisit_by_name_file);
+ }
+
+ PASSED();
}
PART_END(H5Ovisit_by_name_file);
@@ -5518,6 +5651,30 @@ test_object_visit(void)
PASSED();
}
PART_END(H5Ovisit_by_name_dtype);
+
+ PART_BEGIN(H5Ovisit_by_name_attr)
+ {
+ TESTING_2("H5Ovisit_by_name on an attribute");
+
+ i = 0;
+
+ if (H5Ovisit_by_name(attr_id, ".", H5_INDEX_CRT_ORDER, H5_ITER_INC, object_visit_simple_callback,
+ &i, H5O_INFO_ALL, H5P_DEFAULT) < 0) {
+ H5_FAILED();
+ printf(" H5Ovisit_by_name on an attribute failed!\n");
+ PART_ERROR(H5Ovisit_by_name_attr);
+ }
+
+ /* Should have the same effect as calling H5Ovisit on group_id */
+ if (i != OBJECT_VISIT_TEST_NUM_OBJS_VISITED) {
+ H5_FAILED();
+ printf(" some objects were not visited!\n");
+ PART_ERROR(H5Ovisit_by_name_attr);
+ }
+
+ PASSED();
+ }
+ PART_END(H5Ovisit_by_name_attr);
}
END_MULTIPART;
@@ -5535,12 +5692,22 @@ test_object_visit(void)
TEST_ERROR;
if (H5Gclose(group_id2) < 0)
TEST_ERROR;
+ if (H5Gclose(group_id3) < 0)
+ TEST_ERROR;
+ if (H5Gclose(group_id4) < 0)
+ TEST_ERROR;
+ if (H5Gclose(group_id5) < 0)
+ TEST_ERROR;
+ if (H5Aclose(attr_id) < 0)
+ TEST_ERROR;
if (H5Gclose(group_id) < 0)
TEST_ERROR;
if (H5Gclose(container_group) < 0)
TEST_ERROR;
if (H5Fclose(file_id) < 0)
TEST_ERROR;
+ if (H5Fclose(file_id2) < 0)
+ TEST_ERROR;
PASSED();
@@ -5555,11 +5722,16 @@ error:
H5Dclose(dset_id);
H5Pclose(gcpl_id);
H5Gclose(group_id2);
+ H5Gclose(group_id3);
+ H5Gclose(group_id4);
+ H5Gclose(group_id5);
+ H5Aclose(attr_id);
H5Gclose(group_id);
H5Gclose(container_group);
H5Fclose(file_id);
+ H5Fclose(file_id2);
}
- H5E_END_TRY
+ H5E_END_TRY;
return 1;
}
@@ -7068,6 +7240,29 @@ done:
}
/*
+ * H5Ovisit callback to count the number of visited objects
+ */
+static herr_t
+object_visit_simple_callback(hid_t o_id, const char *name, const H5O_info2_t *object_info, void *op_data)
+{
+ size_t *i = (size_t *)op_data;
+ herr_t ret_val = 0;
+
+ UNUSED(o_id);
+ UNUSED(object_info);
+
+ if (name)
+ goto done;
+
+ ret_val = -1;
+
+done:
+ (*i)++;
+
+ return ret_val;
+}
+
+/*
* H5Ovisit callback for visiting a singular dataset.
*/
static herr_t
@@ -7128,6 +7323,14 @@ object_visit_soft_link_callback(hid_t o_id, const char *name, const H5O_info2_t
UNUSED(o_id);
+ if (!strcmp(name, OBJECT_VISIT_TEST_GROUP_NAME_PARENT) ||
+ !strcmp(name, OBJECT_VISIT_TEST_GROUP_NAME_PARENT "/" OBJECT_VISIT_TEST_GROUP_NAME_CHILD) ||
+ !strcmp(name, OBJECT_VISIT_TEST_GROUP_NAME_PARENT "/" OBJECT_VISIT_TEST_GROUP_NAME_CHILD
+ "/" OBJECT_VISIT_TEST_GROUP_NAME_GRANDCHILD)) {
+ (*i)--;
+ goto done;
+ }
+
if (!strncmp(name, ".", strlen(".") + 1) && (counter_val <= 5)) {
if (H5O_TYPE_GROUP == object_info->type)
goto done;
@@ -7166,7 +7369,15 @@ object_visit_noop_callback(hid_t o_id, const char *name, const H5O_info2_t *obje
static void
cleanup_files(void)
{
- H5Fdelete(OBJECT_COPY_BETWEEN_FILES_TEST_FILE_NAME, H5P_DEFAULT);
+ char filename[H5_API_TEST_FILENAME_MAX_LENGTH];
+
+ snprintf(filename, H5_API_TEST_FILENAME_MAX_LENGTH, "%s%s", test_path_prefix,
+ OBJECT_COPY_BETWEEN_FILES_TEST_FILE_NAME);
+ H5Fdelete(filename, H5P_DEFAULT);
+
+ snprintf(filename, H5_API_TEST_FILENAME_MAX_LENGTH, "%s%s", test_path_prefix,
+ OBJECT_VISIT_TEST_FILE_NAME);
+ H5Fdelete(filename, H5P_DEFAULT);
}
int
diff --git a/test/API/H5_api_object_test.h b/test/API/H5_api_object_test.h
index 75c3961..68d89d0 100644
--- a/test/API/H5_api_object_test.h
+++ b/test/API/H5_api_object_test.h
@@ -121,12 +121,19 @@ int H5_api_object_test(void);
#define OBJECT_COPY_INVALID_PARAMS_TEST_GROUP_NAME "object_copy_invalid_params_group"
#define OBJECT_COPY_INVALID_PARAMS_TEST_GROUP_NAME2 "object_copy_invalid_params_group_copy"
-#define OBJECT_VISIT_TEST_NUM_OBJS_VISITED 4
-#define OBJECT_VISIT_TEST_SUBGROUP_NAME "object_visit_test"
-#define OBJECT_VISIT_TEST_SPACE_RANK 2
-#define OBJECT_VISIT_TEST_GROUP_NAME "object_visit_test_group"
-#define OBJECT_VISIT_TEST_DSET_NAME "object_visit_test_dset"
-#define OBJECT_VISIT_TEST_TYPE_NAME "object_visit_test_type"
+#define OBJECT_VISIT_TEST_NUM_OBJS_VISITED 4
+#define OBJECT_VISIT_TEST_SUBGROUP_NAME "object_visit_test"
+#define OBJECT_VISIT_TEST_SPACE_RANK 2
+#define OBJECT_VISIT_TEST_GROUP_NAME "object_visit_test_group"
+#define OBJECT_VISIT_TEST_DSET_NAME "object_visit_test_dset"
+#define OBJECT_VISIT_TEST_TYPE_NAME "object_visit_test_type"
+#define OBJECT_VISIT_TEST_ATTR_NAME "object_visit_test_attr"
+#define OBJECT_VISIT_TEST_FILE_NAME "object_visit_test_file"
+#define OBJECT_VISIT_TEST_SUBGROUP_LAYERS 3
+#define OBJECT_VISIT_TEST_GROUP_NAME_PARENT "object_visit_test_group_parent"
+#define OBJECT_VISIT_TEST_GROUP_NAME_CHILD "object_visit_test_group_child"
+#define OBJECT_VISIT_TEST_GROUP_NAME_GRANDCHILD "object_visit_test_group_grandchild"
+#define OBJECT_VISIT_TEST_TOTAL_DATA_SIZE_LIMIT 32000
#define OBJECT_VISIT_SOFT_LINK_TEST_NUM_OBJS_VISITED 1
#define OBJECT_VISIT_SOFT_LINK_TEST_SUBGROUP_NAME "object_visit_soft_link"
diff --git a/test/dtypes.c b/test/dtypes.c
index cbc031b..ac8697e 100644
--- a/test/dtypes.c
+++ b/test/dtypes.c
@@ -7567,6 +7567,174 @@ error:
return 1;
} /* end test_named_indirect_reopen() */
+/*-------------------------------------------------------------------------
+ * Function: test_named_indirect_reopen_file
+ *
+ * Purpose: Tests that a named compound datatype that refers to a named
+ * string datatype can be reopened indirectly through H5Dget_type
+ * and that it reports the correct H5Tcommitted() state, including
+ * after the file has been closed and reopened.
+ *
+ * Return: Success: 0
+ * Failure: number of errors
+ *
+ *-------------------------------------------------------------------------
+ */
+static int
+test_named_indirect_reopen_file(hid_t fapl)
+{
+ hid_t file = H5I_INVALID_HID;
+ hid_t space = H5I_INVALID_HID;
+ hid_t cmptype = H5I_INVALID_HID;
+ hid_t reopened_cmptype = H5I_INVALID_HID;
+ hid_t strtype = H5I_INVALID_HID;
+ hid_t reopened_strtype = H5I_INVALID_HID;
+ hid_t dset = H5I_INVALID_HID;
+ hsize_t dims[1] = {3};
+ size_t strtype_size, cmptype_size;
+ char filename[1024];
+
+ TESTING("indirectly reopening recursively committed datatypes including file reopening");
+
+ /* PREPARATION */
+
+ /* Create file, dataspace */
+ h5_fixname(FILENAME[1], fapl, filename, sizeof filename);
+ if ((file = H5Fcreate(filename, H5F_ACC_TRUNC, H5P_DEFAULT, fapl)) < 0)
+ TEST_ERROR;
+ if ((space = H5Screate_simple(1, dims, dims)) < 0)
+ TEST_ERROR;
+
+ /* Create string type */
+ if ((strtype = H5Tcopy(H5T_C_S1)) < 0)
+ TEST_ERROR;
+ if (H5Tset_size(strtype, H5T_VARIABLE) < 0)
+ TEST_ERROR;
+
+ /* Get size of string type */
+ if ((strtype_size = H5Tget_size(strtype)) == 0)
+ TEST_ERROR;
+
+ /* Commit string type and verify the size doesn't change */
+ if (H5Tcommit2(file, "str_type", strtype, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT) < 0)
+ TEST_ERROR;
+ if (strtype_size != H5Tget_size(strtype))
+ TEST_ERROR;
+
+ /* Create compound type */
+ if ((cmptype = H5Tcreate(H5T_COMPOUND, sizeof(char *))) < 0)
+ TEST_ERROR;
+ if (H5Tinsert(cmptype, "vlstr", (size_t)0, strtype) < 0)
+ TEST_ERROR;
+
+ /* Get size of compound type */
+ if ((cmptype_size = H5Tget_size(cmptype)) == 0)
+ TEST_ERROR;
+
+ /* Commit compound type and verify the size doesn't change */
+ if (H5Tcommit2(file, "cmp_type", cmptype, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT) < 0)
+ TEST_ERROR;
+ if (cmptype_size != H5Tget_size(cmptype))
+ TEST_ERROR;
+
+ /* Create dataset with compound type */
+ if ((dset = H5Dcreate2(file, "cmp_dset", cmptype, space, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT)) < 0)
+ TEST_ERROR;
+
+ /* Close original types */
+ if (H5Tclose(strtype) < 0)
+ TEST_ERROR;
+ if (H5Tclose(cmptype) < 0)
+ TEST_ERROR;
+
+ /* CHECK DATA TYPES WHILE STILL HOLDING THE FILE OPEN */
+
+ /* Indirectly reopen the compound type; verify that it reports as committed and its size doesn't change */
+ if ((reopened_cmptype = H5Dget_type(dset)) < 0)
+ TEST_ERROR;
+ if (cmptype_size != H5Tget_size(reopened_cmptype))
+ TEST_ERROR;
+ if (H5Tcommitted(reopened_cmptype) != 1)
+ TEST_ERROR;
+
+ /* Indirectly reopen the string type; verify that it reports as NOT committed and its size doesn't change */
+ if ((reopened_strtype = H5Tget_member_type(reopened_cmptype, 0)) < 0)
+ TEST_ERROR;
+ if (strtype_size != H5Tget_size(reopened_strtype))
+ TEST_ERROR;
+ if (H5Tcommitted(reopened_strtype) != 0)
+ TEST_ERROR;
+
+ /* Close types and dataset */
+ if (H5Tclose(reopened_strtype) < 0)
+ TEST_ERROR;
+ if (H5Tclose(reopened_cmptype) < 0)
+ TEST_ERROR;
+ if (H5Dclose(dset) < 0)
+ TEST_ERROR;
+
+ /* CHECK DATA TYPES AFTER REOPENING THE SAME FILE */
+
+ /* Close file */
+ if (H5Fclose(file) < 0)
+ TEST_ERROR;
+
+ /* Reopen file */
+ if ((file = H5Fopen(filename, H5F_ACC_RDWR, fapl)) < 0)
+ TEST_ERROR;
+
+ /* Reopen dataset */
+ if ((dset = H5Dopen2(file, "cmp_dset", H5P_DEFAULT)) < 0)
+ TEST_ERROR;
+
+ /* Indirectly reopen the compound type; verify that it reports as committed and its size doesn't change */
+ if ((reopened_cmptype = H5Dget_type(dset)) < 0)
+ TEST_ERROR;
+ if (cmptype_size != H5Tget_size(reopened_cmptype))
+ TEST_ERROR;
+ if (H5Tcommitted(reopened_cmptype) != 1)
+ TEST_ERROR;
+
+ /* Indirectly reopen the string type; verify that it reports as NOT committed and its size doesn't change */
+ if ((reopened_strtype = H5Tget_member_type(reopened_cmptype, 0)) < 0)
+ TEST_ERROR;
+ if (strtype_size != H5Tget_size(reopened_strtype))
+ TEST_ERROR;
+ if (H5Tcommitted(reopened_strtype) != 0)
+ TEST_ERROR;
+
+ /* Close types and dataset */
+ if (H5Tclose(reopened_strtype) < 0)
+ TEST_ERROR;
+ if (H5Tclose(reopened_cmptype) < 0)
+ TEST_ERROR;
+ if (H5Dclose(dset) < 0)
+ TEST_ERROR;
+
+ /* DONE */
+
+ /* Close file and dataspace */
+ if (H5Sclose(space) < 0)
+ TEST_ERROR;
+ if (H5Fclose(file) < 0)
+ TEST_ERROR;
+ PASSED();
+ return 0;
+
+error:
+ H5E_BEGIN_TRY
+ {
+ H5Tclose(cmptype);
+ H5Tclose(strtype);
+ H5Tclose(reopened_cmptype);
+ H5Tclose(reopened_strtype);
+ H5Sclose(space);
+ H5Dclose(dset);
+ H5Fclose(file);
+ }
+ H5E_END_TRY;
+ return 1;
+} /* end test_named_indirect_reopen() */
static void
create_del_obj_named_test_file(const char *filename, hid_t fapl, H5F_libver_t low, H5F_libver_t high)
{
@@ -8870,6 +9038,7 @@ main(void)
nerrors += test_latest();
nerrors += test_int_float_except();
nerrors += test_named_indirect_reopen(fapl);
+ nerrors += test_named_indirect_reopen_file(fapl);
nerrors += test_delete_obj_named(fapl);
nerrors += test_delete_obj_named_fileid(fapl);
nerrors += test_set_order_compound(fapl);
diff --git a/test/filter_plugin.c b/test/filter_plugin.c
index 1571bf2..9207d9e 100644
--- a/test/filter_plugin.c
+++ b/test/filter_plugin.c
@@ -847,10 +847,7 @@ test_creating_groups_using_plugins(hid_t fid)
/* Create multiple groups under the top-level group */
for (i = 0; i < N_SUBGROUPS; i++) {
- char *sp = subgroup_name;
-
- sp += snprintf(subgroup_name, sizeof(subgroup_name), SUBGROUP_PREFIX);
- sprintf(sp, "%d", i);
+ snprintf(subgroup_name, sizeof(subgroup_name), SUBGROUP_PREFIX "%d", i);
if ((sub_gid = H5Gcreate2(gid, subgroup_name, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT)) < 0)
TEST_ERROR;
@@ -906,10 +903,7 @@ test_opening_groups_using_plugins(hid_t fid)
/* Open all the sub-groups under the top-level group */
for (i = 0; i < N_SUBGROUPS; i++) {
- char *sp = subgroup_name;
-
- sp += snprintf(subgroup_name, sizeof(subgroup_name), SUBGROUP_PREFIX);
- sprintf(sp, "%d", i);
+ snprintf(subgroup_name, sizeof(subgroup_name), SUBGROUP_PREFIX "%d", i);
if ((sub_gid = H5Gopen2(gid, subgroup_name, H5P_DEFAULT)) < 0)
TEST_ERROR;
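The sprintf-to-snprintf conversions in this hunk rely on snprintf's bound and return value to make the name formatting overflow-safe. A minimal standalone sketch of the pattern (the `SUBGROUP_PREFIX` value and the `format_subgroup` helper are illustrative, not the test's actual definitions):

```c
#include <stdio.h>
#include <string.h>

#define SUBGROUP_PREFIX "subgroup_" /* hypothetical value, for illustration only */

/* Build "<prefix><index>" into buf; return 0 on success, -1 if the result
 * would not fit. snprintf never writes more than n bytes (including the NUL)
 * and returns the number of characters that were *needed*, so a return value
 * >= n signals truncation. */
int format_subgroup(char *buf, size_t n, int i)
{
    int needed = snprintf(buf, n, SUBGROUP_PREFIX "%d", i);

    return (needed < 0 || (size_t)needed >= n) ? -1 : 0;
}
```

Collapsing the old sprintf pair into one snprintf call also removes the intermediate `sp` pointer arithmetic that the original loop bodies used.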
diff --git a/test/h5test.c b/test/h5test.c
index 6983c37..1f1430b 100644
--- a/test/h5test.c
+++ b/test/h5test.c
@@ -1808,7 +1808,7 @@ h5_get_version_string(H5F_libver_t libver)
/*-------------------------------------------------------------------------
* Function: h5_compare_file_bytes()
*
- * Purpose: Helper function to compare two files byte-for-byte.
+ * Purpose: Helper function to compare two files byte-for-byte
*
* Return: Success: 0, if files are identical
* Failure: -1, if files differ
@@ -1818,14 +1818,14 @@ h5_get_version_string(H5F_libver_t libver)
int
h5_compare_file_bytes(char *f1name, char *f2name)
{
- FILE *f1ptr = NULL; /* two file pointers */
- FILE *f2ptr = NULL;
- off_t f1size = 0; /* size of the files */
- off_t f2size = 0;
- char f1char = 0; /* one char from each file */
- char f2char = 0;
- off_t ii = 0;
- int ret_value = 0; /* for error handling */
+ FILE *f1ptr = NULL; /* two file pointers */
+ FILE *f2ptr = NULL;
+ HDoff_t f1size = 0; /* size of the files */
+ HDoff_t f2size = 0;
+ char f1char = 0; /* one char from each file */
+ char f2char = 0;
+ HDoff_t ii = 0;
+ int ret_value = 0; /* for error handling */
/* Open files for reading */
f1ptr = fopen(f1name, "rb");
diff --git a/test/tmisc.c b/test/tmisc.c
index a8103af..ddebc3d 100644
--- a/test/tmisc.c
+++ b/test/tmisc.c
@@ -21,6 +21,7 @@
*************************************************************/
#define H5D_FRIEND /*suppress error about including H5Dpkg */
+#define H5T_FRIEND /*suppress error about including H5Tpkg */
/* Define this macro to indicate that the testing APIs should be available */
#define H5D_TESTING
@@ -28,6 +29,7 @@
#include "testhdf5.h"
#include "H5srcdir.h"
#include "H5Dpkg.h" /* Datasets */
+#include "H5Tpkg.h" /* Datatypes */
#include "H5MMprivate.h" /* Memory */
/* Definitions for misc. test #1 */
@@ -335,6 +337,8 @@ typedef struct {
See https://nvd.nist.gov/vuln/detail/CVE-2020-10812 */
#define CVE_2020_10812_FILENAME "cve_2020_10812.h5"
+#define MISC38_FILE "type_conversion_path_table_issue.h5"
+
/****************************************************************
**
** test_misc1(): test unlinking a dataset from a group and immediately
@@ -6259,6 +6263,190 @@ test_misc37(void)
/****************************************************************
**
+** test_misc38():
+** Test for issue where the type conversion path table cache
+** would grow continuously when variable-length datatypes
+** were involved, since file VOL object comparisons prevented
+** the library from reusing cached type conversion paths
+**
+****************************************************************/
+static void
+test_misc38(void)
+{
+ H5VL_object_t *file_vol_obj = NULL;
+ const char *buf[] = {"attr_value"};
+ herr_t ret = SUCCEED;
+ hid_t file_id = H5I_INVALID_HID;
+ hid_t attr_id = H5I_INVALID_HID;
+ hid_t str_type = H5I_INVALID_HID;
+ hid_t space_id = H5I_INVALID_HID;
+ int init_npaths = 0;
+ int *irbuf = NULL;
+ char **rbuf = NULL;
+ bool vol_is_native;
+
+ /* Output message about test being performed */
+ MESSAGE(5, ("Fix for type conversion path table issue\n"));
+
+ /*
+ * Get the initial number of type conversion path table
+ * entries that are currently defined
+ */
+ init_npaths = H5T__get_path_table_npaths();
+
+ file_id = H5Fcreate(MISC38_FILE, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
+ CHECK(file_id, H5I_INVALID_HID, "H5Fcreate");
+
+ /* Check if native VOL is being used */
+ CHECK(h5_using_native_vol(H5P_DEFAULT, file_id, &vol_is_native), FAIL, "h5_using_native_vol");
+ if (!vol_is_native) {
+ CHECK(H5Fclose(file_id), FAIL, "H5Fclose");
+ MESSAGE(5, (" -- SKIPPED --\n"));
+ return;
+ }
+
+ /* Retrieve file's VOL object field for further use */
+ file_vol_obj = H5F_VOL_OBJ((H5F_t *)H5VL_object(file_id));
+
+ /*
+ * Check reference count of file's VOL object field. At this point,
+ * the object should have a reference count of 1 since the file
+ * was just created.
+ */
+ VERIFY(file_vol_obj->rc, 1, "checking reference count");
+
+ str_type = H5Tcopy(H5T_C_S1);
+ CHECK(str_type, H5I_INVALID_HID, "H5Tcopy");
+
+ ret = H5Tset_size(str_type, H5T_VARIABLE);
+ CHECK(ret, FAIL, "H5Tset_size");
+
+ space_id = H5Screate(H5S_SCALAR);
+ CHECK(space_id, H5I_INVALID_HID, "H5Screate");
+
+ /*
+ * Check the number of type conversion path table entries currently
+ * stored in the cache. It shouldn't have changed yet.
+ */
+ VERIFY(H5T__get_path_table_npaths(), init_npaths,
+ "checking number of type conversion path table entries");
+
+ attr_id = H5Acreate2(file_id, "attribute", str_type, space_id, H5P_DEFAULT, H5P_DEFAULT);
+ CHECK(attr_id, H5I_INVALID_HID, "H5Acreate2");
+
+ /*
+ * Check the number of type conversion path table entries currently
+ * stored in the cache. It shouldn't have changed yet.
+ */
+ VERIFY(H5T__get_path_table_npaths(), init_npaths,
+ "checking number of type conversion path table entries");
+
+ /*
+ * Check reference count of file's VOL object field. At this point,
+ * the object should have a reference count of 2. Creating the
+ * attribute on the file will have caused an H5T_set_loc call that
+ * associates the attribute's datatype with the file's VOL object
+ * and will have incremented the reference count by 1.
+ */
+ VERIFY(file_vol_obj->rc, 2, "checking reference count");
+
+ ret = H5Awrite(attr_id, str_type, buf);
+ CHECK(ret, FAIL, "H5Awrite");
+
+ /*
+ * Check the number of type conversion path table entries currently
+ * stored in the cache. The H5Awrite call should have added a new
+ * type conversion path. Note that if another test in this file uses
+ * the same conversion path, this check may fail and need to be
+ * refactored.
+ */
+ VERIFY(H5T__get_path_table_npaths(), init_npaths + 1,
+ "checking number of type conversion path table entries");
+
+ /*
+ * Check reference count of file's VOL object field. At this point,
+ * the object should have a reference count of 3. Writing to the
+ * variable-length typed attribute will have caused an H5T_convert
+ * call that ends up incrementing the reference count of the
+ * associated file's VOL object.
+ */
+ VERIFY(file_vol_obj->rc, 3, "checking reference count");
+
+ ret = H5Aclose(attr_id);
+ CHECK(ret, FAIL, "H5Aclose");
+ ret = H5Fclose(file_id);
+ CHECK(ret, FAIL, "H5Fclose");
+
+ irbuf = malloc(100 * 100 * sizeof(int));
+ CHECK_PTR(irbuf, "int read buf allocation");
+ rbuf = malloc(sizeof(char *));
+ CHECK_PTR(rbuf, "varstr read buf allocation");
+
+ for (size_t i = 0; i < 10; i++) {
+ file_id = H5Fopen(MISC38_FILE, H5F_ACC_RDONLY, H5P_DEFAULT);
+ CHECK(file_id, H5I_INVALID_HID, "H5Fopen");
+
+ /* Retrieve file's VOL object field for further use */
+ file_vol_obj = H5F_VOL_OBJ((H5F_t *)H5VL_object(file_id));
+
+ /*
+ * Check reference count of file's VOL object field. At this point,
+ * the object should have a reference count of 1 since the file
+ * was just opened.
+ */
+ VERIFY(file_vol_obj->rc, 1, "checking reference count");
+
+ attr_id = H5Aopen(file_id, "attribute", H5P_DEFAULT);
+ CHECK(attr_id, H5I_INVALID_HID, "H5Aopen");
+
+ /*
+ * Check reference count of file's VOL object field. At this point,
+ * the object should have a reference count of 2 since opening
+ * the attribute will also have associated its type with the file's
+ * VOL object.
+ */
+ VERIFY(file_vol_obj->rc, 2, "checking reference count");
+
+ ret = H5Aread(attr_id, str_type, rbuf);
+ CHECK(ret, FAIL, "H5Aread");
+
+ /*
+ * Check the number of type conversion path table entries currently
+ * stored in the cache. Each H5Aread call shouldn't cause this number
+ * to go up, as the library should have removed the cached conversion
+ * paths on file close.
+ */
+ VERIFY(H5T__get_path_table_npaths(), init_npaths + 1,
+ "checking number of type conversion path table entries");
+
+ /*
+ * Check reference count of file's VOL object field. At this point,
+ * the object should have a reference count of 3. Reading from the
+ * variable-length typed attribute will have caused an H5T_convert
+ * call that ends up incrementing the reference count of the
+ * associated file's VOL object.
+ */
+ VERIFY(file_vol_obj->rc, 3, "checking reference count");
+
+ ret = H5Treclaim(str_type, space_id, H5P_DEFAULT, rbuf);
+ CHECK(ret, FAIL, "H5Treclaim");
+
+ ret = H5Aclose(attr_id);
+ CHECK(ret, FAIL, "H5Aclose");
+ ret = H5Fclose(file_id);
+ CHECK(ret, FAIL, "H5Fclose");
+ }
+
+ free(irbuf);
+ free(rbuf);
+
+ ret = H5Tclose(str_type);
+ CHECK(ret, FAIL, "H5Tclose");
+ ret = H5Sclose(space_id);
+ CHECK(ret, FAIL, "H5Sclose");
+}
+
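The failure mode test_misc38() guards against — a per-file value leaking into the conversion-path cache key — can be shown with a toy cache (illustrative C, not HDF5 internals): when the owning file object takes part in entry comparison, every file reopen misses the cache and appends a new path, so the table grows without bound.

```c
#include <stddef.h>

#define MAX_PATHS 64

/* Toy conversion-path cache entry: source/destination type IDs plus, in the
 * buggy variant, the "file object" that was current when it was cached. */
struct path {
    int   src, dst;
    void *owner; /* stand-in for the file's VOL object */
};

struct path table[MAX_PATHS];
int         npaths = 0;

/* Find-or-insert a path. When compare_owner is nonzero, the owner pointer is
 * part of the lookup key, so a path cached for one file never matches a path
 * requested for another file; returns the entry index, or -1 if full. */
int find_path(int src, int dst, void *owner, int compare_owner)
{
    for (int i = 0; i < npaths; i++)
        if (table[i].src == src && table[i].dst == dst &&
            (!compare_owner || table[i].owner == owner))
            return i;
    if (npaths == MAX_PATHS)
        return -1;
    table[npaths].src   = src;
    table[npaths].dst   = dst;
    table[npaths].owner = owner;
    return npaths++;
}
```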
+/****************************************************************
+**
** test_misc(): Main misc. test routine.
**
****************************************************************/
@@ -6325,6 +6513,7 @@ test_misc(void)
test_misc35(); /* Test behavior of free-list & allocation statistics API calls */
test_misc36(); /* Exercise H5atclose and H5is_library_terminating */
test_misc37(); /* Test for seg fault failure at file close */
+ test_misc38(); /* Test for type conversion path table issue */
} /* test_misc() */
@@ -6380,6 +6569,7 @@ cleanup_misc(void)
#ifndef H5_NO_DEPRECATED_SYMBOLS
H5Fdelete(MISC31_FILE, H5P_DEFAULT);
#endif /* H5_NO_DEPRECATED_SYMBOLS */
+ H5Fdelete(MISC38_FILE, H5P_DEFAULT);
}
H5E_END_TRY
} /* end cleanup_misc() */
diff --git a/testpar/API/H5_api_async_test_parallel.c b/testpar/API/H5_api_async_test_parallel.c
index 79327d0..768bbc2 100644
--- a/testpar/API/H5_api_async_test_parallel.c
+++ b/testpar/API/H5_api_async_test_parallel.c
@@ -542,7 +542,7 @@ test_multi_dataset_io(void)
size_t buf_end_idx;
/* Set dataset name */
- sprintf(dset_name, "dset%d", (int)i);
+ snprintf(dset_name, sizeof(dset_name), "dset%d", (int)i);
/* Create the dataset asynchronously */
if ((dset_id[i] = H5Dcreate_async(file_id, dset_name, H5T_NATIVE_INT, space_id, H5P_DEFAULT,
@@ -611,7 +611,7 @@ test_multi_dataset_io(void)
size_t buf_end_idx;
/* Set dataset name */
- sprintf(dset_name, "dset%d", (int)i);
+ snprintf(dset_name, sizeof(dset_name), "dset%d", (int)i);
/* Open the dataset asynchronously */
if ((dset_id[0] = H5Dopen_async(file_id, dset_name, H5P_DEFAULT, es_id)) < 0)
@@ -641,7 +641,7 @@ test_multi_dataset_io(void)
/* Loop over datasets */
for (i = 0; i < MULTI_DATASET_IO_TEST_NDSETS; i++) {
/* Set dataset name */
- sprintf(dset_name, "dset%d", (int)i);
+ snprintf(dset_name, sizeof(dset_name), "dset%d", (int)i);
/* Open the dataset asynchronously */
if ((dset_id[0] = H5Dopen_async(file_id, dset_name, H5P_DEFAULT, es_id)) < 0)
@@ -864,7 +864,7 @@ test_multi_file_dataset_io(void)
size_t buf_end_idx;
/* Set file name */
- sprintf(file_name, PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
+ snprintf(file_name, sizeof(file_name), PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
/* Create file asynchronously */
if ((file_id[i] = H5Fcreate_async(file_name, H5F_ACC_TRUNC, H5P_DEFAULT, fapl_id, es_id)) < 0)
@@ -1018,7 +1018,7 @@ test_multi_file_dataset_io(void)
size_t buf_end_idx;
/* Set file name */
- sprintf(file_name, PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
+ snprintf(file_name, sizeof(file_name), PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
/* Open the file asynchronously */
if ((file_id[0] = H5Fopen_async(file_name, H5F_ACC_RDWR, fapl_id, es_id)) < 0)
@@ -1057,7 +1057,7 @@ test_multi_file_dataset_io(void)
/* Loop over files */
for (i = 0; i < MULTI_FILE_DATASET_IO_TEST_NFILES; i++) {
/* Set file name */
- sprintf(file_name, PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
+ snprintf(file_name, sizeof(file_name), PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
/* Open the file asynchronously */
if ((file_id[0] = H5Fopen_async(file_name, H5F_ACC_RDONLY, fapl_id, es_id)) < 0)
@@ -1287,7 +1287,7 @@ test_multi_file_grp_dset_io(void)
size_t buf_end_idx;
/* Set file name */
- sprintf(file_name, PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
+ snprintf(file_name, sizeof(file_name), PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
/* Create file asynchronously */
if ((file_id = H5Fcreate_async(file_name, H5F_ACC_TRUNC, H5P_DEFAULT, fapl_id, es_id)) < 0)
@@ -1339,7 +1339,7 @@ test_multi_file_grp_dset_io(void)
/* Loop over files */
for (i = 0; i < MULTI_FILE_GRP_DSET_IO_TEST_NFILES; i++) {
/* Set file name */
- sprintf(file_name, PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
+ snprintf(file_name, sizeof(file_name), PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
/* Open the file asynchronously */
if ((file_id = H5Fopen_async(file_name, H5F_ACC_RDONLY, fapl_id, es_id)) < 0)
@@ -1401,7 +1401,7 @@ test_multi_file_grp_dset_io(void)
size_t buf_end_idx;
/* Set file name */
- sprintf(file_name, PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
+ snprintf(file_name, sizeof(file_name), PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
/* Create file asynchronously */
if ((file_id = H5Fcreate_async(file_name, H5F_ACC_TRUNC, H5P_DEFAULT, fapl_id, es_id)) < 0)
@@ -1459,7 +1459,7 @@ test_multi_file_grp_dset_io(void)
/* Loop over files */
for (i = 0; i < MULTI_FILE_GRP_DSET_IO_TEST_NFILES; i++) {
/* Set file name */
- sprintf(file_name, PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
+ snprintf(file_name, sizeof(file_name), PAR_ASYNC_API_TEST_FILE_PRINTF, (int)i);
/* Open the file asynchronously */
if ((file_id = H5Fopen_async(file_name, H5F_ACC_RDONLY, fapl_id, es_id)) < 0)
@@ -3582,7 +3582,7 @@ cleanup_files(void)
if (MAINPROCESS) {
H5Fdelete(PAR_ASYNC_API_TEST_FILE, H5P_DEFAULT);
for (i = 0; i <= max_printf_file; i++) {
- snprintf(file_name, 64, PAR_ASYNC_API_TEST_FILE_PRINTF, i);
+ snprintf(file_name, sizeof(file_name), PAR_ASYNC_API_TEST_FILE_PRINTF, i);
H5Fdelete(file_name, H5P_DEFAULT);
} /* end for */
}
diff --git a/testpar/t_subfiling_vfd.c b/testpar/t_subfiling_vfd.c
index 4f109cb..2ebb0e4 100644
--- a/testpar/t_subfiling_vfd.c
+++ b/testpar/t_subfiling_vfd.c
@@ -320,7 +320,7 @@ test_config_file(void)
FILE *config_file;
char *config_filename = NULL;
char *config_buf = NULL;
- long config_file_len;
+ HDoff_t config_file_len;
hid_t file_id = H5I_INVALID_HID;
hid_t fapl_id = H5I_INVALID_HID;
int read_stripe_count;
diff --git a/testpar/t_vfd.c b/testpar/t_vfd.c
index 79b7e01..cce5cf7 100644
--- a/testpar/t_vfd.c
+++ b/testpar/t_vfd.c
@@ -3987,17 +3987,20 @@ vector_write_test_7(int file_name_id, int mpi_rank, int mpi_size, H5FD_mpio_xfer
if (xfer_mode == H5FD_MPIO_INDEPENDENT) {
- sprintf(test_title, "parallel vector write test 7 -- %s / independent", vfd_name);
+ snprintf(test_title, sizeof(test_title), "parallel vector write test 7 -- %s / independent",
+ vfd_name);
}
else if (coll_opt_mode == H5FD_MPIO_INDIVIDUAL_IO) {
- sprintf(test_title, "parallel vector write test 7 -- %s / col op / ind I/O", vfd_name);
+ snprintf(test_title, sizeof(test_title), "parallel vector write test 7 -- %s / col op / ind I/O",
+ vfd_name);
}
else {
assert(coll_opt_mode == H5FD_MPIO_COLLECTIVE_IO);
- sprintf(test_title, "parallel vector write test 7 -- %s / col op / col I/O", vfd_name);
+ snprintf(test_title, sizeof(test_title), "parallel vector write test 7 -- %s / col op / col I/O",
+ vfd_name);
}
TESTING(test_title);
diff --git a/tools/lib/h5diff.c b/tools/lib/h5diff.c
index 15f2a14..bdbda6e 100644
--- a/tools/lib/h5diff.c
+++ b/tools/lib/h5diff.c
@@ -116,7 +116,7 @@ print_incoming_data(void)
MPI_Recv(data, PRINT_DATA_MAX_SIZE, MPI_CHAR, Status.MPI_SOURCE, MPI_TAG_PRINT_DATA,
MPI_COMM_WORLD, &Status);
- printf("%s", data);
+ parallel_print("%s", data);
}
} while (incomingMessage);
}
@@ -1247,7 +1247,8 @@ diff_match(hid_t file1_id, const char *grp1, trav_info_t *info1, hid_t file2_id,
/*Set up args to pass to worker task. */
if (strlen(obj1_fullpath) > 255 || strlen(obj2_fullpath) > 255) {
- printf("The parallel diff only supports object names up to 255 characters\n");
+ fprintf(stderr,
+ "The parallel diff only supports object names up to 255 characters\n");
MPI_Abort(MPI_COMM_WORLD, 0);
} /* end if */
@@ -1392,7 +1393,7 @@ diff_match(hid_t file1_id, const char *grp1, trav_info_t *info1, hid_t file2_id,
MPI_COMM_WORLD);
} /* end else-if */
else {
- printf("ERROR: Invalid tag (%d) received \n", Status.MPI_TAG);
+ fprintf(stderr, "ERROR: Invalid tag (%d) received \n", Status.MPI_TAG);
MPI_Abort(MPI_COMM_WORLD, 0);
MPI_Finalize();
} /* end else */
@@ -1477,10 +1478,10 @@ diff_match(hid_t file1_id, const char *grp1, trav_info_t *info1, hid_t file2_id,
MPI_Recv(data, PRINT_DATA_MAX_SIZE, MPI_CHAR, Status.MPI_SOURCE, MPI_TAG_PRINT_DATA,
MPI_COMM_WORLD, &Status);
- printf("%s", data);
+ parallel_print("%s", data);
} /* end else-if */
else {
- printf("ph5diff-manager: ERROR!! Invalid tag (%d) received \n", Status.MPI_TAG);
+ fprintf(stderr, "ph5diff-manager: ERROR!! Invalid tag (%d) received \n", Status.MPI_TAG);
MPI_Abort(MPI_COMM_WORLD, 0);
} /* end else */
} /* end while */
diff --git a/tools/src/h5diff/ph5diff_main.c b/tools/src/h5diff/ph5diff_main.c
index f90bd48..98e0c1d 100644
--- a/tools/src/h5diff/ph5diff_main.c
+++ b/tools/src/h5diff/ph5diff_main.c
@@ -85,8 +85,9 @@ main(int argc, char *argv[])
MPI_Barrier(MPI_COMM_WORLD);
- print_info(&opts);
print_manager_output();
+
+ print_info(&opts);
}
/* All other tasks become workers and wait for assignments. */
else {
diff --git a/tools/test/h5dump/CMakeVFDTests.cmake b/tools/test/h5dump/CMakeVFDTests.cmake
index 0118725..433eced 100644
--- a/tools/test/h5dump/CMakeVFDTests.cmake
+++ b/tools/test/h5dump/CMakeVFDTests.cmake
@@ -24,7 +24,42 @@ set (HDF5_VFD_H5DUMP_FILES
packedbits
)
+set (HDF5_SF_VFD_H5DUMP_FILES
+ test_subfiling_stripe_sizes.h5
+)
+
+set (HDF5_SF2_VFD_H5DUMP_FILES
+ test_subfiling_precreate_rank_0.h5
+)
+
foreach (vfdtest ${VFD_LIST})
+ if (vfdtest STREQUAL "subfiling")
+ foreach (h5_tfile ${HDF5_SF_VFD_H5DUMP_FILES})
+ file(COPY "${PROJECT_SOURCE_DIR}/testfiles/${h5_tfile}" DESTINATION "${PROJECT_BINARY_DIR}/${vfdtest}")
+ execute_process(
+ COMMAND ls -i ${PROJECT_BINARY_DIR}/${vfdtest}/${h5_tfile}
+ OUTPUT_VARIABLE OUTPUT_VALUE
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ )
+ string(REGEX MATCH "^ *([0-9]+) *" INODE_VALUE "${OUTPUT_VALUE}")
+ string(STRIP ${INODE_VALUE} INODE_STR)
+ HDFTEST_COPY_FILE("${PROJECT_SOURCE_DIR}/testfiles/${h5_tfile}.subfile_1_of_1" "${PROJECT_BINARY_DIR}/${vfdtest}/${h5_tfile}.subfile_${INODE_STR}_1_of_1" "HDF5_SF_VFD_H5DUMP_files")
+ HDFTEST_COPY_FILE("${PROJECT_SOURCE_DIR}/testfiles/${h5_tfile}.subfile.config" "${PROJECT_BINARY_DIR}/${vfdtest}/${h5_tfile}.subfile_${INODE_STR}.config" "HDF5_SF_VFD_H5DUMP_files")
+ endforeach ()
+ foreach (h5_tfile ${HDF5_SF2_VFD_H5DUMP_FILES})
+ file(COPY "${PROJECT_SOURCE_DIR}/testfiles/${h5_tfile}" DESTINATION "${PROJECT_BINARY_DIR}/${vfdtest}")
+ execute_process(
+ COMMAND ls -i ${PROJECT_BINARY_DIR}/${vfdtest}/${h5_tfile}
+ OUTPUT_VARIABLE OUTPUT_VALUE
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ )
+ string(REGEX MATCH "^ *([0-9]+) *" INODE_VALUE "${OUTPUT_VALUE}")
+ string(STRIP ${INODE_VALUE} INODE_STR)
+ HDFTEST_COPY_FILE("${PROJECT_SOURCE_DIR}/testfiles/${h5_tfile}.subfile_1_of_2" "${PROJECT_BINARY_DIR}/${vfdtest}/${h5_tfile}.subfile_${INODE_STR}_1_of_2" "HDF5_SF2_VFD_H5DUMP_files")
+ HDFTEST_COPY_FILE("${PROJECT_SOURCE_DIR}/testfiles/${h5_tfile}.subfile_2_of_2" "${PROJECT_BINARY_DIR}/${vfdtest}/${h5_tfile}.subfile_${INODE_STR}_2_of_2" "HDF5_SF2_VFD_H5DUMP_files")
+ HDFTEST_COPY_FILE("${PROJECT_SOURCE_DIR}/testfiles/${h5_tfile}.subfile.config" "${PROJECT_BINARY_DIR}/${vfdtest}/${h5_tfile}.subfile_${INODE_STR}.config" "HDF5_SF2_VFD_H5DUMP_files")
+ endforeach ()
+ endif ()
foreach (h5_tfile ${HDF5_VFD_H5DUMP_FILES})
HDFTEST_COPY_FILE("${PROJECT_SOURCE_DIR}/testfiles/${h5_tfile}.h5" "${PROJECT_BINARY_DIR}/${vfdtest}/${h5_tfile}.h5" "HDF5_VFD_H5DUMP_files")
HDFTEST_COPY_FILE("${PROJECT_SOURCE_DIR}/expected/${h5_tfile}.ddl" "${PROJECT_BINARY_DIR}/${vfdtest}/${h5_tfile}.ddl" "HDF5_VFD_H5DUMP_files")
@@ -32,6 +67,8 @@ foreach (vfdtest ${VFD_LIST})
endforeach ()
add_custom_target(HDF5_VFD_H5DUMP_files ALL COMMENT "Copying files needed by HDF5_VFD_H5DUMP tests" DEPENDS ${HDF5_VFD_H5DUMP_files_list})
+add_custom_target(HDF5_SF_VFD_H5DUMP_files ALL COMMENT "Copying files needed by HDF5_SF_VFD_H5DUMP tests" DEPENDS ${HDF5_SF_VFD_H5DUMP_files_list})
+add_custom_target(HDF5_SF2_VFD_H5DUMP_files ALL COMMENT "Copying files needed by HDF5_SF2_VFD_H5DUMP tests" DEPENDS ${HDF5_SF2_VFD_H5DUMP_files_list})
##############################################################################
##############################################################################
@@ -69,6 +106,12 @@ endmacro ()
# Run test with different Virtual File Driver
foreach (vfd ${VFD_LIST})
- # test for signed/unsigned datasets
+ if (vfd STREQUAL "subfiling")
+ ADD_VFD_H5DUMP_TEST (${vfd} filedriver_subfiling 0 --enable-error-stack=2 --filedriver=subfiling test_subfiling_stripe_sizes.h5)
+ ADD_VFD_H5DUMP_TEST (${vfd} vfd_name_subfiling 0 --enable-error-stack=2 --vfd-name=subfiling test_subfiling_stripe_sizes.h5)
+ ADD_VFD_H5DUMP_TEST (${vfd} vfd_value_subfiling 0 --enable-error-stack=2 --vfd-value=12 test_subfiling_stripe_sizes.h5)
+ ADD_VFD_H5DUMP_TEST (${vfd} vfd_value_subfiling_2 0 --enable-error-stack=2 --vfd-value=12 -d DSET -s 0 -S 100 -c 10 test_subfiling_precreate_rank_0.h5)
+ endif ()
+ # test for signed/unsigned datasets
ADD_VFD_H5DUMP_TEST (${vfd} packedbits 0 --enable-error-stack packedbits.h5)
endforeach ()
diff --git a/tools/test/h5dump/h5dumpgentest.c b/tools/test/h5dump/h5dumpgentest.c
index aed3eda..e12690c 100644
--- a/tools/test/h5dump/h5dumpgentest.c
+++ b/tools/test/h5dump/h5dumpgentest.c
@@ -3884,7 +3884,7 @@ gent_multi(void)
for (mt = H5FD_MEM_DEFAULT; mt < H5FD_MEM_NTYPES; mt++) {
memb_fapl[mt] = H5P_DEFAULT;
memb_map[mt] = mt;
- sprintf(sv[mt], "%%s-%c.h5", multi_letters[mt]);
+ snprintf(sv[mt], 1024, "%%s-%c.h5", multi_letters[mt]);
memb_name[mt] = sv[mt];
/*printf("memb_name[%d]=%s, memb_map[%d]=%d; ", mt, memb_name[mt], mt, memb_map[mt]);*/
memb_addr[mt] = (haddr_t)MAX(mt - 1, 0) * (HADDR_MAX / 10);
diff --git a/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5 b/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5
new file mode 100644
index 0000000..92abb78
--- /dev/null
+++ b/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5
Binary files differ
diff --git a/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile.config b/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile.config
new file mode 100644
index 0000000..fb73927
--- /dev/null
+++ b/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile.config
@@ -0,0 +1,2 @@
+stripe_size=22495773
+subfile_count=2
diff --git a/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile_1_of_2 b/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile_1_of_2
new file mode 100644
index 0000000..0f7d317
--- /dev/null
+++ b/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile_1_of_2
Binary files differ
diff --git a/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile_2_of_2 b/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile_2_of_2
new file mode 100644
index 0000000..fc78f6a
--- /dev/null
+++ b/tools/test/h5dump/testfiles/test_subfiling_precreate_rank_0.h5.subfile_2_of_2
Binary files differ
diff --git a/tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5 b/tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5
new file mode 100644
index 0000000..d040215
--- /dev/null
+++ b/tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5
Binary files differ
diff --git a/tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5.subfile.config b/tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5.subfile.config
new file mode 100644
index 0000000..8b8b597
--- /dev/null
+++ b/tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5.subfile.config
@@ -0,0 +1,2 @@
+stripe_size=12149997
+subfile_count=1
diff --git a/tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5.subfile_1_of_1 b/tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5.subfile_1_of_1
new file mode 100644
index 0000000..ba79d7c
--- /dev/null
+++ b/tools/test/h5dump/testfiles/test_subfiling_stripe_sizes.h5.subfile_1_of_1
Binary files differ
diff --git a/utils/tools/h5dwalk/h5dwalk.c b/utils/tools/h5dwalk/h5dwalk.c
index b510f3e..f994a90 100644
--- a/utils/tools/h5dwalk/h5dwalk.c
+++ b/utils/tools/h5dwalk/h5dwalk.c
@@ -1209,10 +1209,10 @@ MFU_PRED_EXEC(mfu_flist flist, uint64_t idx, void *arg)
snprintf(cmdline, sizeof(cmdline), "\n---------\nCommand:");
b_offset = strlen(cmdline);
for (k = 0; k < count; k++) {
- sprintf(&cmdline[b_offset], " %s", argv[k]);
+ snprintf(&cmdline[b_offset], sizeof(cmdline) - b_offset, " %s", argv[k]);
b_offset = strlen(cmdline);
}
- sprintf(&cmdline[b_offset], "\n");
+ snprintf(&cmdline[b_offset], sizeof(cmdline) - b_offset, "\n");
run_command(count, argv, cmdline, fname);
mfu_free(argv);
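The b_offset bookkeeping above appends each argument at the current end of the buffer while capping every write at the remaining space, so a long argument list is truncated instead of overflowing. A self-contained sketch of that append pattern (`append_arg` is an illustrative helper, not part of h5dwalk):

```c
#include <stdio.h>
#include <string.h>

/* Append " <arg>" at offset in buf, never writing past len bytes total;
 * return the new end-of-string offset, found with strlen() as in h5dwalk. */
size_t append_arg(char *buf, size_t len, size_t offset, const char *arg)
{
    if (offset >= len)
        return offset; /* buffer already full; nothing more fits */
    snprintf(&buf[offset], len - offset, " %s", arg);
    return strlen(buf);
}
```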