PolyThrottle: Energy-efficient Neural Network Inference on Edge Devices: Architecture Overview

This paper investigates how the configuration of on-device hardware affects energy consumption for neural network inference with regular fine-tuning.

This paper is available on arxiv under CC BY-NC-ND 4.0 DEED license.

Authors:
  • Minghao Yan, University of Wisconsin-Madison (myan@cs.wisc.edu)
  • Hongyi Wang, Carnegie Mellon University
  • Shivaram Venkataraman, University of Wisconsin-Madison

Table of Links:
  • Abstract & Introduction
  • Motivation
  • Opportunities
  • Architecture Overview
  • Problem Formulation: Two-Phase Tuning
  • Modeling Workload Interference
  • Experiments
  • Conclusion & References
  • A. Hardware Details
  • B. Experimental Results
  • C. Arithmetic Intensity
  • D.

Offline, we automatically find the CPU frequency, GPU frequency, memory frequency, and recommended inference batch size that satisfy the latency constraints while minimizing per-query energy consumption. We discuss the details of the optimization procedure in Section 5. We also show that our formulation can find near-optimal energy configurations in a few minutes using just a handful of samples.
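As a rough illustration of this offline phase, the sketch below enumerates candidate (CPU frequency, GPU frequency, memory frequency, batch size) tuples, measures latency and per-query energy for each, and keeps the cheapest configuration that meets the latency constraint. The `Config`, `tune_offline`, and `fake_measure` names, the frequency values, and the exhaustive grid search are illustrative placeholders only; the paper's actual optimization procedure (Section 5) is sample-efficient rather than exhaustive, which is how it reaches a near-optimal configuration with just a handful of samples.

```python
# Minimal sketch of the offline configuration search, assuming a hypothetical
# measure() helper. On a real device, measure() would set the hardware clocks
# (e.g., via the platform's frequency-scaling interface), run a batch of
# inference requests, and report the observed latency and per-query energy.

from dataclasses import dataclass
from itertools import product
from typing import Callable, Iterable, Optional, Tuple

@dataclass(frozen=True)
class Config:
    cpu_freq_khz: int
    gpu_freq_khz: int
    mem_freq_khz: int
    batch_size: int

def tune_offline(
    cpu_freqs: Iterable[int],
    gpu_freqs: Iterable[int],
    mem_freqs: Iterable[int],
    batch_sizes: Iterable[int],
    measure: Callable[[Config], Tuple[float, float]],  # -> (latency_ms, energy_per_query_mj)
    latency_slo_ms: float,
) -> Optional[Config]:
    """Return the lowest-energy configuration that meets the latency SLO."""
    best_cfg, best_energy = None, float("inf")
    for cpu, gpu, mem, bs in product(cpu_freqs, gpu_freqs, mem_freqs, batch_sizes):
        cfg = Config(cpu, gpu, mem, bs)
        latency_ms, energy_mj = measure(cfg)
        if latency_ms <= latency_slo_ms and energy_mj < best_energy:
            best_cfg, best_energy = cfg, energy_mj
    return best_cfg

if __name__ == "__main__":
    # Toy stand-in for on-device measurement: higher clocks cut latency but
    # cost more energy per query; larger batches amortize fixed overheads.
    def fake_measure(cfg: Config) -> Tuple[float, float]:
        latency = 5e7 / (cfg.gpu_freq_khz + 0.3 * cfg.cpu_freq_khz) * (1 + 0.2 * cfg.batch_size)
        energy = (cfg.gpu_freq_khz + cfg.mem_freq_khz) / 1e4 / cfg.batch_size + 2.0
        return latency, energy

    cfg = tune_offline(
        cpu_freqs=[730_000, 1_200_000, 1_900_000],
        gpu_freqs=[300_000, 600_000, 900_000],
        mem_freqs=[800_000, 1_600_000],
        batch_sizes=[1, 2, 4, 8],
        measure=fake_measure,
        latency_slo_ms=60.0,  # hypothetical SLO; would come from the application's latency budget
    )
    print("selected configuration:", cfg)
```

The grid search above is only meant to make the objective concrete: pick the configuration with the lowest per-query energy among those that stay within the latency budget. Because the configuration space grows quickly on real hardware, this is exactly where a sample-efficient optimizer, as described in Section 5, replaces exhaustive enumeration.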
