
The proper use of prefetch in TensorFlow (dataset prefetch)

TensorFlow's `tf.data` API provides a `prefetch` method. As the name suggests, it prefetches data to improve I/O performance. So when should `prefetch` be used?

I came across exactly this question on Stack Overflow:
What is the proper use of Tensorflow dataset prefetch and cache options?

One of the answers reads:

"When the GPU is working on forward / backward propagation on the current batch, we want the CPU to process the next batch of data so that it is immediately ready. As the most expensive part of the computer, we want the GPU to be fully used all the time during training. We call this consumer / producer overlap, where the consumer is the GPU and the producer is the CPU.

With tf.data, you can do this with a simple call to dataset.prefetch(1) at the end of the pipeline (after batching). This will always prefetch one batch of data and make sure that there is always one ready.

In some cases, it can be useful to prefetch more than one batch. For instance if the duration of the preprocessing varies a lot, prefetching 10 batches would average out the processing time over 10 batches, instead of sometimes waiting for longer batches.

To give a concrete example, suppose that 10% of the batches take 10s to compute, and 90% take 1s. If the GPU takes 2s to train on one batch, by prefetching multiple batches you make sure that we never wait for these rare longer batches."

A follow-up comment adds: "I'm not quite sure how to determine processing time of each batch but that's the next step. If your batches are roughly taking the same amount of time to process then I believe prefetch(1) should suffice as your GPU wouldn't be waiting for the CPU to finish processing a computationally expensive batch."
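The pattern the answer describes (batch first, then `prefetch` at the very end of the pipeline) can be sketched with a toy dataset. The `map` step here is a stand-in for whatever CPU-side preprocessing a real pipeline would do; the numbers are made up for illustration.

```python
import tensorflow as tf

# Toy dataset: map() stands in for CPU-side preprocessing (the "producer").
ds = tf.data.Dataset.range(8)
ds = ds.map(lambda x: x * 2)   # preprocessing on the CPU
ds = ds.batch(4)               # batch first...
ds = ds.prefetch(1)            # ...then keep one batch ready for the GPU

batches = [b.numpy().tolist() for b in ds]
print(batches)  # [[0, 2, 4, 6], [8, 10, 12, 14]]
```

While the GPU (the "consumer") trains on one batch, the runtime prepares the next one in the background, which is exactly the producer/consumer overlap described above.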

Summarizing the answer above:

1. While the GPU runs the forward/backward pass on the current batch, we want the CPU to prepare the next batch so it is ready immediately. The GPU is the most expensive part of the system, so we want it busy at all times. This is called consumer/producer overlap: the consumer is the GPU (computation), and the producer is the CPU (data preparation).

2. With `tf.data`, this is done by calling `dataset.prefetch(1)` at the very end of the pipeline (after batching), which keeps one batch of data ready for computation at all times.

3. In some cases, prefetching more than one batch is useful. For example, if preprocessing time varies a lot, prefetching 10 batches averages the processing time over those 10 batches instead of occasionally stalling on a slow one.

4. As a concrete example, suppose 10% of batches take 10 seconds to compute and 90% take 1 second. If the GPU needs 2 seconds to train on one batch, prefetching multiple batches ensures we never wait for those rare slow batches.

5. If every batch takes roughly the same time to process, `prefetch(1)` is usually enough (note that the argument is `buffer_size`, the number of elements to buffer, not a batch size), so the GPU will not sit idle waiting for the CPU to finish a computationally expensive batch.
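When per-batch preprocessing time is hard to predict (point 3 above), a fixed buffer size is a guess. Although not mentioned in the quoted answer, `tf.data` also accepts `tf.data.AUTOTUNE` as the buffer size, letting the runtime tune it dynamically; a minimal sketch:

```python
import tensorflow as tf

AUTOTUNE = tf.data.AUTOTUNE  # let the tf.data runtime pick buffer sizes

ds = (tf.data.Dataset.range(100)
        .map(lambda x: x + 1, num_parallel_calls=AUTOTUNE)  # parallel preprocessing
        .batch(10)
        .prefetch(AUTOTUNE))  # prefetch buffer size tuned at runtime

first = next(iter(ds)).numpy().tolist()
print(first)  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```

This is a reasonable default when you have not profiled the pipeline; a hand-picked `buffer_size` can still win once you know your batch timing distribution.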

Original Q&A:
https://stackoverflow.com/questions/63796936/what-is-the-proper-use-of-tensorflow-dataset-prefetch-and-cache-options
