Volume , Issue , 2018, Pages 31-36

Edge intelligence: On-demand deep learning model co-inference with device-edge synergy

Author keywords

Computation offloading; Deep learning; Edge computing; Edge intelligence

Indexed keywords

DEEP LEARNING; EDGE COMPUTING; WIDE AREA NETWORKS;

EID: 85056854955     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1145/3229556.3229562     Document Type: Conference Paper
Times cited: 360
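
The title and author keywords (computation offloading, edge computing) refer to device-edge co-inference: splitting a deep model so that the first layers run on the mobile device and the remaining layers run on an edge server. The sketch below illustrates that general partitioning idea only; it is not the paper's algorithm, and every layer name, latency figure, and data size is a hypothetical stand-in for values one would obtain by profiling a real model and link.

    # Illustrative sketch of device-edge DNN partitioning. All layer names,
    # latencies, and output sizes are hypothetical placeholders.

    # Hypothetical per-layer profile: (name, device_latency_ms, edge_latency_ms,
    # output_size_kb) for a small CNN.
    LAYERS = [
        ("conv1", 12.0, 1.5, 560.0),
        ("pool1",  2.0, 0.3, 140.0),
        ("conv2", 25.0, 3.0, 280.0),
        ("pool2",  1.5, 0.2,  70.0),
        ("fc1",   18.0, 2.0,   4.0),
        ("fc2",    3.0, 0.4,   0.1),
    ]

    INPUT_KB = 150.0  # hypothetical size of the raw input


    def end_to_end_latency(partition, bandwidth_kb_per_s):
        """Latency if LAYERS[:partition] run on the device and the rest on the edge.

        partition = 0 ships the raw input to the edge; partition = len(LAYERS)
        keeps everything on the device.
        """
        device = sum(l[1] for l in LAYERS[:partition])
        edge = sum(l[2] for l in LAYERS[partition:])
        sent_kb = INPUT_KB if partition == 0 else LAYERS[partition - 1][3]
        upload = 0.0 if partition == len(LAYERS) else sent_kb / bandwidth_kb_per_s * 1000.0
        return device + upload + edge


    def best_partition(bandwidth_kb_per_s):
        """Pick the split point with the lowest estimated end-to-end latency."""
        return min(range(len(LAYERS) + 1),
                   key=lambda p: end_to_end_latency(p, bandwidth_kb_per_s))


    if __name__ == "__main__":
        for bw in (500.0, 5000.0):  # kB/s, roughly 4 Mbps and 40 Mbps
            p = best_partition(bw)
            where = "edge-only" if p == 0 else "after " + LAYERS[p - 1][0]
            print(f"bandwidth {bw:>6.0f} kB/s -> split {where}, "
                  f"{end_to_end_latency(p, bw):.1f} ms")

With these toy numbers, a slow link keeps more layers on the device while a fast link favors offloading earlier, which is the bandwidth sensitivity that an on-demand device-edge split has to respond to.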

References (16)
  • 1. A. Krizhevsky, I. Sutskever, and G. Hinton. 2012. ImageNet Classification with Deep Convolutional Neural Networks. In NIPS.
  • 5. S. Han, H. Mao, and W. Dally. 2015. Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding. Fiber (2015).
  • 7. Y. Kim, E. Park, S. Yoo, T. Choi, L. Yang, and D. Shin. 2016. Compression of Deep Convolutional Neural Networks for Fast and Low Power Mobile Applications. In ICLR.
  • 10. V. Mulhollon. 2004. WonderShaper. http://manpages.ubuntu.com/manpages/trusty/man8/wondershaper-8.html.
  • 11. Preferred Networks. 2017. Chainer. https://github.com/chainer/chainer/tree/v1.
  • 15. S. Teerapittayanon, B. McDanel, and H. T. Kung. 2016. BranchyNet: Fast inference via early exiting from deep neural networks. In 2016 23rd ICPR. (See the early-exit sketch after this list.)
  • 16. D. Wang and E. Nyberg. 2015. A Long Short-Term Memory Model for Answer Sentence Selection in Question Answering. In ACL-IJCNLP.
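
Reference 15 above (BranchyNet) concerns early-exit inference: side classifiers attached partway through a network let a sample leave as soon as its prediction is confident enough, instead of always running every layer. The sketch below illustrates that idea only; the stage and branch functions, the entropy threshold, and the random weights are assumptions, not the reference implementation.

    # Illustrative early-exit inference: run the network stage by stage and stop
    # at the first side classifier whose prediction is confident (low entropy).
    import numpy as np


    def softmax(z):
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()


    def entropy(p):
        return float(-(p * np.log(p + 1e-12)).sum())


    def early_exit_infer(x, stages, branches, threshold=0.5):
        """stages: per-stage feature extractors; branches: one side classifier each.

        Returns (predicted_class, index_of_exit_taken).
        """
        h = x
        for i, (stage, branch) in enumerate(zip(stages, branches)):
            h = stage(h)            # main-path computation for this stage
            p = softmax(branch(h))  # side-branch prediction
            if entropy(p) < threshold or i == len(stages) - 1:
                return int(p.argmax()), i


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Toy random "stages" and "branches" standing in for trained layers.
        stages = [lambda h, W=rng.normal(size=(8, 8)): np.tanh(W @ h) for _ in range(3)]
        branches = [lambda h, W=rng.normal(size=(4, 8)): W @ h for _ in range(3)]
        cls, exit_at = early_exit_infer(rng.normal(size=8), stages, branches)
        print(f"predicted class {cls} at exit {exit_at}")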


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.