Saving large numpy arrays as a .mat file



I am struggling with the following problem: I have two large 2D numpy arrays (about 5 GB) that I want to save to a .mat file loadable from Matlab. I tried scipy.io and wrote:

from scipy.io import savemat
data = {'A': a, 'B': b}
savemat('myfile.mat', data, appendmat=True, format='5',
long_field_names=False, do_compression=False, oned_as='row')

but I get the error: OverflowError: Python int too large to convert to C long

Edit: Python 3.8, Matlab 2017b

Here is the traceback:

a.shape (600, 1048261) type <class 'numpy.float64'>

b.shape (1048261,) type <class 'numpy.float64'>

data = {'A': a, 'B': b}
savemat('myfile.mat', data, appendmat=True, format='5',
long_field_names=False, do_compression=False, oned_as='row')
---------------------------------------------------------------------------
OverflowError                             Traceback (most recent call last)
<ipython-input-19-4d1d08a54148> in <module>
1 data = {'A': a, 'B': b}
----> 2 savemat('myfile.mat', data, appendmat=True, format='5',
3         long_field_names=False, do_compression=False, oned_as='row')
~\miniconda3\envs\work\lib\site-packages\scipy\io\matlab\mio.py in savemat(file_name, mdict, appendmat, format, long_field_names, do_compression, oned_as)
277         else:
278             raise ValueError("Format should be '4' or '5'")
--> 279         MW.put_variables(mdict)
280 
281 
~\miniconda3\envs\work\lib\site-packages\scipy\io\matlab\mio5.py in put_variables(self, mdict, write_header)
847                 self.file_stream.write(out_str)
848             else:  # not compressing
--> 849                 self._matrix_writer.write_top(var, asbytes(name), is_global)
~\miniconda3\envs\work\lib\site-packages\scipy\io\matlab\mio5.py in write_top(self, arr, name, is_global)
588         self._var_name = name
589         # write the header and data
--> 590         self.write(arr)
591 
592     def write(self, arr):
~\miniconda3\envs\work\lib\site-packages\scipy\io\matlab\mio5.py in write(self, arr)
627             self.write_char(narr, codec)
628         else:
--> 629             self.write_numeric(narr)
630         self.update_matrix_tag(mat_tag_pos)
631 
~\miniconda3\envs\work\lib\site-packages\scipy\io\matlab\mio5.py in write_numeric(self, arr)
653             self.write_element(arr.imag)
654         else:
--> 655             self.write_element(arr)
656 
657     def write_char(self, arr, codec='ascii'):
~\miniconda3\envs\work\lib\site-packages\scipy\io\matlab\mio5.py in write_element(self, arr, mdtype)
494             self.write_smalldata_element(arr, mdtype, byte_count)
495         else:
--> 496             self.write_regular_element(arr, mdtype, byte_count)
497 
498     def write_smalldata_element(self, arr, mdtype, byte_count):
~\miniconda3\envs\work\lib\site-packages\scipy\io\matlab\mio5.py in write_regular_element(self, arr, mdtype, byte_count)
508         tag = np.zeros((), NDT_TAG_FULL)
509         tag['mdtype'] = mdtype
--> 510         tag['byte_count'] = byte_count
511         self.write_bytes(tag)
512         self.write_bytes(arr)
OverflowError: Python int too large to convert to C long
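The traceback points at the real constraint: the failing line is tag['byte_count'] = byte_count, and in the v5 MAT format that byte count is a 32-bit field, so a single variable of several GB cannot be written at all (MATLAB itself requires the HDF5-based v7.3 format for variables of roughly 2 GB and larger). A quick check of the sizes, assuming the (600, 1048261) float64 shape quoted above:

import numpy as np

# size of the array in bytes, computed without allocating it
nbytes = 600 * 1048261 * np.dtype(np.float64).itemsize
print(nbytes)               # about 5 GB
print(nbytes > 2**31 - 1)   # True: too big for a signed 32-bit C long
print(nbytes > 2**32 - 1)   # True: too big for the 32-bit byte count of a v5 tag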

I also tried hdf5storage:

hdf5storage.write(data, 'myfile.mat', matlab_compatible=True)

but it fails as well.

Edit:

It issues this warning:

miniconda3\envs\work\lib\site-packages\hdf5storage\__init__.py:1306: 
H5pyDeprecationWarning: The default file mode will change to 'r' (read-only) 
in h5py 3.0. To suppress this warning, pass the mode you need to 
h5py.File(), or set the global default h5.get_config().default_file_mode, or 
set the environment variable H5PY_DEFAULT_READONLY=1. Available modes are: 
'r', 'r+', 'w', 'w-'/'x', 'a'. See the docs for details.
f = h5py.File(filename)

In any case, it creates a 5 GB file, but when I load it in Matlab I get a variable named after the file path and apparently no data.
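That symptom likely comes from the argument order rather than the array size: hdf5storage.write() takes the in-file HDF5 path as its second positional argument and the output file as the filename keyword, so 'myfile.mat' above was presumably used as the path inside the file, which would explain a variable named after the file path. A minimal sketch of the intended call, using small stand-in arrays:

import numpy as np
import hdf5storage

a = np.random.rand(3, 4)   # stand-ins for the ~5 GB arrays
b = np.random.rand(4)

# path is the location inside the HDF5 file, filename is the .mat file on disk
hdf5storage.write(a, path='/A', filename='myfile.mat', matlab_compatible=True)
hdf5storage.write(b, path='/B', filename='myfile.mat', matlab_compatible=True)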

Finally I tried h5py:

import h5py
hf = h5py.File('C:/Users/flavio/Desktop/STRA-pattern.mat', 'w')
hf.create_dataset('A', data=a)
hf.create_dataset('B', data=b)
hf.close()

but the output file is not recognized/readable by Matlab.
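The file written by h5py is valid HDF5; as far as I understand, what it lacks is the MATLAB-specific part of a v7.3 MAT file (the 512-byte header and the per-variable MATLAB_class attributes), which is why load() does not recognize it. The datasets themselves are intact and can be checked like this, and read from MATLAB with h5read instead of load:

import h5py

# re-open the file written above and confirm the datasets are there
with h5py.File('C:/Users/flavio/Desktop/STRA-pattern.mat', 'r') as hf:
    print(list(hf.keys()))               # expected: ['A', 'B']
    print(hf['A'].shape, hf['A'].dtype)
# In MATLAB, h5read('STRA-pattern.mat', '/A') can read such a plain HDF5 file,
# but load() expects a real v7.3 MAT file.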

Is splitting the arrays the only solution? I was hoping there is a better way to deal with this.

For anyone still looking for an answer, this works with hdf5storage:

hdf5storage.savemat(save_path, data_dict, format=7.3, matlab_compatible=True, compress=False)
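For completeness, a self-contained sketch of that call with small stand-in arrays and an assumed file name (format 7.3 is the HDF5-based MAT format that supports variables larger than 2 GB):

import numpy as np
import hdf5storage

a = np.random.rand(3, 4)   # stand-ins for the large arrays
b = np.random.rand(4)

data_dict = {'A': a, 'B': b}
hdf5storage.savemat('myfile.mat', data_dict, format=7.3,
                    matlab_compatible=True, compress=False)
# In MATLAB: load('myfile.mat') then gives the variables A and B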
