In-depth examination of the status quo
the latency and energy consumption of executing state-of-the-art DNNs in the cloud and on the mobile device
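A minimal sketch of how such an end-to-end latency measurement could be taken with pycaffe; the model file names, blob name, and input shape are placeholders, not taken from the paper:

```python
import time
import numpy as np
import caffe

# Placeholder model files; the paper's measurements use AlexNet in Caffe.
net = caffe.Net('deploy.prototxt', 'alexnet.caffemodel', caffe.TEST)

# Dummy input with AlexNet's usual (batch, channels, height, width) shape.
net.blobs['data'].reshape(1, 3, 227, 227)
net.blobs['data'].data[...] = np.random.rand(1, 3, 227, 227)
net.forward()  # warm-up pass

runs = 50
start = time.perf_counter()
for _ in range(runs):
    net.forward()
avg_ms = (time.perf_counter() - start) / runs * 1000.0
print('average end-to-end latency: %.2f ms' % avg_ms)
```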
DNN compute and data size characteristics study
DNN layers have significantly different compute and data size characteristics depending on their type and configurations
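The per-layer output sizes behind this observation can be read straight off a loaded network; a rough pycaffe sketch, with placeholder model file names:

```python
import caffe

net = caffe.Net('deploy.prototxt', 'alexnet.caffemodel', caffe.TEST)
net.forward()  # populate the blobs once

# net.blobs maps top-blob names to blobs; for AlexNet these track the layer
# outputs, i.e. the data that would have to be uploaded if the DNN were cut
# after that layer.
for name, blob in net.blobs.items():
    print('%-8s shape %-20s %.1f KB'
          % (name, blob.data.shape, blob.data.nbytes / 1024.0))
```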
DNN computation partitioning across the cloud and mobile edge
Neurosurgeon runtime system and layer performance prediction models
a set of models to predict the latency and power consumption of a DNN layer based on its type and configuration
a system to intelligently partition DNN computation between the mobile device and the cloud
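A minimal sketch of the partition-point selection this implies, assuming per-layer latency estimates (e.g. from such prediction models) and a measured uplink bandwidth are already available; the function name and all numbers are illustrative, not the paper's:

```python
def best_partition(mobile_ms, cloud_ms, out_bytes, bandwidth_bps, input_bytes):
    """Choose where to cut the DNN so that end-to-end latency is minimized.

    mobile_ms[i]  -- predicted latency of layer i on the mobile device (ms)
    cloud_ms[i]   -- predicted latency of layer i on the cloud server (ms)
    out_bytes[i]  -- size of layer i's output data (bytes)
    bandwidth_bps -- measured uplink bandwidth (bits per second)
    input_bytes   -- size of the raw input (for the cloud-only case)

    Returns (cut, total_ms): layers [0, cut) run on the mobile device and
    layers [cut, n) run in the cloud; cut == 0 is cloud-only, cut == n is
    mobile-only.
    """
    n = len(mobile_ms)
    best = None
    for cut in range(n + 1):
        if cut == n:                      # everything stays on the device
            xfer_ms = 0.0
        else:                             # upload the data crossing the cut
            xfer = input_bytes if cut == 0 else out_bytes[cut - 1]
            xfer_ms = xfer * 8.0 / bandwidth_bps * 1000.0
        total = sum(mobile_ms[:cut]) + xfer_ms + sum(cloud_ms[cut:])
        if best is None or total < best[1]:
            best = (cut, total)
    return best


# Illustrative numbers only.
cut, total_ms = best_partition(
    mobile_ms=[12.0, 20.0, 5.0, 3.0],
    cloud_ms=[1.5, 2.5, 0.6, 0.4],
    out_bytes=[560_000, 290_000, 64_000, 4_000],
    bandwidth_bps=5e6,                    # ~5 Mbps uplink
    input_bytes=150_000)
print('cut after layer %d, estimated latency %.1f ms' % (cut, total_ms))
```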
Experimental setup
mobile devices:
server:
framework: Caffe
model: AlexNet
other:
TestMyNet (used to measure network bandwidth)
Watts Up? Power Meter (used to measure energy consumption). https://www.wattsupmeters.com/. Accessed: 2015-05.
Thrift: an open-source, flexible RPC interface for inter-process communication
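A rough sketch of the mobile-side Thrift plumbing in Python; the DNNOffload service stub and its run_from_layer RPC are hypothetical (they would be generated by the Thrift compiler from an IDL that these notes do not give):

```python
from thrift.transport import TSocket, TTransport
from thrift.protocol import TBinaryProtocol
# from gen_py.dnn_offload import DNNOffload   # hypothetical generated stub

def open_offload_channel(host, port=9090):
    """Open a buffered, binary-protocol Thrift connection to the server."""
    transport = TTransport.TBufferedTransport(TSocket.TSocket(host, port))
    protocol = TBinaryProtocol.TBinaryProtocol(transport)
    transport.open()
    # client = DNNOffload.Client(protocol)
    # client.run_from_layer(layer_name, blob_bytes)   # hypothetical RPC
    return transport, protocol

# transport, _ = open_offload_channel('cloud-server.example.com')
# ... send the intermediate data, receive the result ...
# transport.close()
```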
MAUI
a general offloading framework (used as a baseline for experimental comparison)
Eduardo Cuervo, Aruna Balasubramanian, Dae-ki Cho, Alec Wolman, Stefan Saroiu, Ranveer Chandra, and Paramvir Bahl. Maui: making smartphones last longer with code offload. In Proceedings of the 8th international conference on Mobile systems, applications, and services, pages 49–62. ACM, 2010.
BigHouse (used to evaluate data center throughput)
David Meisner, Junjie Wu, and Thomas F. Wenisch. BigHouse: A Simulation Infrastructure for Data Center Systems. ISPASS '12: International Symposium on Performance Analysis of Systems and Software, April 2012.