I've noticed that the output of a continuous function I wrote in Python (it takes a vector and returns a scalar) is discretized at a resolution of 9.536743e-7.
I googled this number and found sites saying that 1 bit = 9.536743e-7 megabits. Isn't 1 bit equal to 1e-6 megabits? Why do I see this number everywhere, and why do I see it in my code?
My code is written with jax.numpy.
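For reference, here is how the same step size shows up for plain float32 values (the magnitude 8.0 is just an illustrative choice, and I'm assuming 32-bit floats, which is jax.numpy's default):

>>> import numpy as np
>>> # gap between 8.0 and the next representable float32 value
>>> print(np.spacing(np.float32(8.0)))
9.536743e-07
>>> print(2.0 ** -20)
9.5367431640625e-07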
When talking about memory, "mega" usually means 2^20 = 1048576. And indeed, 1/1048576 = 9.536743e-7.
However, when talking about bandwidth/speed, "mega" usually means 1000000. Then again, memory is usually measured in bytes (as opposed to bits for bandwidth/speed), so that part is odd. I suppose you could say that the time taken by 1 bit on a 1-megabit link is 9.536743e-7 seconds, but that's an atypical case at best. I don't see how this 1/x idea applies to memory.
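To make the two conventions concrete (plain Python, nothing specific to your code):

>>> print(1 / 2**20)    # "mega" in the memory sense
9.5367431640625e-07
>>> print(1 / 10**6)    # "mega" in the bandwidth/SI sense
1e-06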
Note that this number cannot be represented exactly as a floating-point value; here is the closest 64-bit float:
>>> print "%1000.1000fn" % 0.000000953674316
0.0000009536743159999999753652735225151193532155957655049860477447509765625000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
>>>
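If you want the exact value without the wall of trailing zeros, the decimal module shows the same digits more compactly (constructing a Decimal from a float converts exactly):

>>> from decimal import Decimal
>>> Decimal(0.000000953674316)
Decimal('9.536743159999999753652735225151193532155957655049860477447509765625E-7')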