pyshark packets raise a RuntimeError when queued or pickled



I am trying to capture packets in real time with pyshark.

When I try to get a packet from a multiprocessing.Queue, or to unpickle one, I get the following error:

python2.7/site-packages/pyshark/packet/layer.py", line 48, in __getattr__
    val = self.get_field_value(item, raw=self.raw_mode)
(... repeated many times ...)
RuntimeError: maximum recursion depth exceeded while calling a Python object

I suspect the problem lies in rebuilding the objects, whether they are retrieved from the queue or unpickled.
Surprisingly, when I do the same thing with a Queue.Queue, no error occurs at all.
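The key difference is that Queue.Queue only passes object references around inside one process, whereas multiprocessing.Queue pickles every item to push it through a pipe, so both failing cases go through pickle. The recursion can be reproduced without pyshark by a minimal sketch: a hypothetical `Layer` class (not pyshark's actual code) whose `__getattr__` delegates to an internal field dict, the same pattern the traceback points at:

```python
import pickle

class Layer(object):
    """Hypothetical stand-in for a pyshark layer (not pyshark's
    actual code): attribute access is delegated to a field dict."""

    def __init__(self):
        self._fields = {"highest_layer": "TCP"}

    def __getattr__(self, name):
        # Called only for attributes missing from __dict__. When
        # pickle rebuilds the instance, __dict__ is still empty, so
        # looking up self._fields re-enters __getattr__ for
        # "_fields", which looks up self._fields again, and so on.
        try:
            return self._fields[name]
        except KeyError:
            raise AttributeError(name)

layer = Layer()
print(layer.highest_layer)  # delegated lookup works: TCP

try:
    pickle.loads(pickle.dumps(layer))
except RuntimeError:  # RecursionError in Python 3 is a subclass
    print("maximum recursion depth exceeded")
```

Dumping succeeds because `_fields` exists on the live object; it is during *unpickling*, before `__dict__` is restored, that the probe for `__setstate__` falls into `__getattr__` and recurses.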

Here is the code to reproduce the issue:

import pyshark
import multiprocessing
import Queue
import cPickle as pickle
# Capture on eth0
interface = pyshark.LiveCapture(interface="eth0")
def queue_test(queue):
    """ Puts captured packets in a queue, then un-queues and displays them """
    for packet in interface.sniff_continuously(packet_count=5):
        queue.put(packet)
    while not queue.empty():
        packet = queue.get()
        print "Packet {} {}".format(packet.highest_layer, packet._packet_string)
def pickle_test():
    """ Immediately pickle and unpickle the packet to display it"""
    for packet in interface.sniff_continuously(packet_count=5):
        pickled_packet = pickle.loads(pickle.dumps(packet, pickle.HIGHEST_PROTOCOL))
        print "Packet {} {}".format(pickled_packet.highest_layer, pickled_packet._packet_string)

if __name__ == "__main__":
    normal_queue = Queue.Queue()
    process_queue = multiprocessing.Queue()
    # Runs fine
    queue_test(normal_queue)
    # Both crash with a RuntimeError
    queue_test(process_queue)
    pickle_test()

Why am I getting these RuntimeErrors, and what can I do about it?
Am I doing something wrong, or is it a problem with pyshark?

Having had no luck here, I opened an issue on pyshark's GitHub, and it turned out the library was indeed missing something:

This is due to the fact that some of the packet classes use an overridden getattr. Fixed in 541fc52

Link to the issue: https://github.com/KimiNewt/pyshark/issues/63
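The general shape of the fix is to keep the overridden `__getattr__` out of pickle's way: raise AttributeError for unknown names and restore state through explicit `__getstate__`/`__setstate__` methods, which pickle finds on the class without ever entering `__getattr__`. A minimal sketch with the same hypothetical stand-in class (an illustration of the pattern, not pyshark's actual commit):

```python
import pickle

class Layer(object):
    """Hypothetical stand-in (not pyshark's code): the same
    delegating __getattr__, but made pickle-safe."""

    def __init__(self):
        self._fields = {"highest_layer": "TCP"}

    def __getattr__(self, name):
        # Read _fields through __dict__ directly so a half-built
        # instance cannot re-enter __getattr__; unknown names raise
        # AttributeError, which pickle's special-method probes expect.
        try:
            return self.__dict__["_fields"][name]
        except KeyError:
            raise AttributeError(name)

    def __getstate__(self):
        return self.__dict__

    def __setstate__(self, state):
        # Restore __dict__ directly, bypassing __getattr__ entirely.
        self.__dict__.update(state)

clone = pickle.loads(pickle.dumps(Layer()))
print(clone.highest_layer)  # prints: TCP
```

With the state methods defined on the class, the unpickler never has to probe a half-built instance for missing attributes, so the round trip completes.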
