An energy-efficient network-on-chip design using reinforcement learning

Hao Zheng, Ahmed Louri

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

The design space for energy-efficient Network-on-Chips (NoCs) has expanded significantly, comprising a number of techniques. Applying these techniques simultaneously to achieve maximum energy efficiency requires monitoring a large number of system parameters, which often results in substantial engineering effort and complicated control policies. This motivates us to explore a reinforcement learning (RL) approach that automatically learns an optimal control policy to improve NoC energy efficiency. First, we deploy power-gating (PG) and dynamic voltage and frequency scaling (DVFS) to simultaneously reduce both static and dynamic power. Second, we use RL to automatically explore the dynamic interactions among PG, DVFS, and system parameters, learn the critical system parameters contained in the router and cache, and eventually evolve optimal per-router control policies that significantly improve energy efficiency. Moreover, we introduce an artificial neural network (ANN) to efficiently implement the large state-action table required by RL. Simulation results using the PARSEC benchmarks show that the proposed RL approach reduces power consumption by 26% while improving system performance by 7%, as compared to a combined PG and DVFS design without RL. Additionally, the ANN design yields a 67% area reduction compared to a conventional RL implementation.
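For concreteness, the sketch below illustrates the kind of control loop the abstract describes: a Q-learning agent that chooses, per router, between power-gating and a few voltage/frequency levels, and a small neural network that replaces the state-action (Q) table with a compact function approximator. This is a minimal illustration under assumed details only: the state features (buffer utilization, link utilization, cache miss rate), the action set, the reward weighting of energy versus latency, and all hyperparameters are placeholders and are not taken from the paper.

# Hypothetical sketch of a per-router RL controller in the spirit of the abstract.
# All state features, actions, rewards, and hyperparameters below are illustrative
# assumptions, not values from the paper.
import numpy as np

# Assumed action set: one power-gating action plus a few DVFS levels.
ACTIONS = ["power_gate", "vf_low", "vf_mid", "vf_high"]

class QTableController:
    """Tabular Q-learning over discretized router/cache statistics."""
    def __init__(self, n_states, n_actions=len(ACTIONS),
                 alpha=0.1, gamma=0.9, epsilon=0.05):
        self.q = np.zeros((n_states, n_actions))
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def act(self, s):
        # Epsilon-greedy action selection.
        if np.random.rand() < self.epsilon:
            return np.random.randint(self.q.shape[1])
        return int(np.argmax(self.q[s]))

    def update(self, s, a, reward, s_next):
        # Standard Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        td_target = reward + self.gamma * np.max(self.q[s_next])
        self.q[s, a] += self.alpha * (td_target - self.q[s, a])

class AnnQApproximator:
    """Small MLP that stands in for the state-action table: features -> Q-values."""
    def __init__(self, n_features, n_actions=len(ACTIONS), hidden=16,
                 lr=0.01, gamma=0.9):
        rng = np.random.default_rng(0)
        self.w1 = rng.normal(0.0, 0.1, (n_features, hidden))
        self.w2 = rng.normal(0.0, 0.1, (hidden, n_actions))
        self.lr, self.gamma = lr, gamma

    def q_values(self, x):
        # One hidden layer with tanh activation, linear output per action.
        return np.tanh(x @ self.w1) @ self.w2

    def update(self, x, a, reward, x_next):
        h = np.tanh(x @ self.w1)
        q = h @ self.w2
        target = reward + self.gamma * np.max(self.q_values(x_next))
        err = target - q[a]                      # TD error on the taken action
        # One gradient-descent step on 0.5 * err**2 (target treated as constant).
        grad_w2_col = -err * h                   # gradient w.r.t. column a of w2
        grad_h = -err * self.w2[:, a]
        grad_w1 = np.outer(x, grad_h * (1.0 - h ** 2))
        self.w2[:, a] -= self.lr * grad_w2_col
        self.w1 -= self.lr * grad_w1

# Illustrative usage with made-up telemetry; the reward trades off router
# energy against added packet latency (weights and values are arbitrary).
ctrl = AnnQApproximator(n_features=3)
x = np.array([0.4, 0.2, 0.1])        # e.g. buffer utilization, link utilization, cache miss rate
a = int(np.argmax(ctrl.q_values(x)))
reward = -(0.7 * 0.3 + 0.3 * 0.1)    # -(w_energy * energy + w_latency * latency), placeholders
x_next = np.array([0.35, 0.25, 0.1])
ctrl.update(x, a, reward, x_next)

Storing Q-values in a small set of network weights rather than an explicit per-state table is the general idea behind the area savings the abstract reports for the ANN-based implementation relative to a conventional table-based RL design.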

Original language: English (US)
Title of host publication: Proceedings of the 56th Annual Design Automation Conference 2019, DAC 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781450367257
DOIs: https://doi.org/10.1145/3316781.3317768
State: Published - Jun 2 2019
Event: 56th Annual Design Automation Conference, DAC 2019 - Las Vegas, United States
Duration: Jun 2 2019 – Jun 6 2019

Publication series

Name: Proceedings - Design Automation Conference
ISSN (Print): 0738-100X

Conference

Conference: 56th Annual Design Automation Conference, DAC 2019
Country: United States
City: Las Vegas
Period: 6/2/19 – 6/6/19

Fingerprint

Reinforcement learning
Energy efficiency
Control policy
Voltage
Scaling
Router
Artificial neural network
Network design
Optimal policy
Cache
Power consumption
Design
Network-on-chip
System performance

ASJC Scopus subject areas

  • Computer Science Applications
  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Modeling and Simulation

Cite this

Zheng, H., & Louri, A. (2019). An energy-efficient network-on-chip design using reinforcement learning. In Proceedings of the 56th Annual Design Automation Conference 2019, DAC 2019 [a47] (Proceedings - Design Automation Conference). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1145/3316781.3317768
