Superior training of artificial neural networks using weight-space partitioning

Hoshin Vijai Gupta, Kuo lin Hsu, Soroosh Sorooshian

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

16 Citations (Scopus)

Abstract

LLSSIM (Linear Least Squares SIMplex) is a new algorithm for batch training of three-layer feedforward Artificial Neural Networks (ANN), based on a partitioning of the weight space. The input-hidden weights are trained using a "Multi-Start Downhill Simplex" global search algorithm, and the hidden-output weights are estimated using "conditional linear least squares". Monte Carlo testing shows that LLSSIM provides globally superior weight estimates with significantly fewer function evaluations than the conventional backpropagation, adaptive backpropagation, and conjugate gradient strategies.
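The weight-space partitioning idea can be sketched in a few lines: because the hidden-output weights enter the network output linearly, they can be solved exactly by linear least squares once the input-hidden weights are fixed, so the global search only has to explore the input-hidden weight space. The sketch below, assuming a toy 1-5-1 sigmoid network on synthetic data, uses random multi-start restarts as a simple stand-in for the paper's multi-start downhill simplex search; all variable names and the data are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: fit y = sin(x) with a 1-5-1 feedforward network.
X = np.linspace(-np.pi, np.pi, 50).reshape(-1, 1)
y = np.sin(X)

def hidden(X, W1, b1):
    # Sigmoid hidden-layer activations for fixed input-hidden weights.
    return 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))

def conditional_lls(X, y, W1, b1):
    # Conditional on W1/b1, the output is linear in the hidden-output
    # weights, so those weights have a closed-form least-squares solution.
    H = hidden(X, W1, b1)
    H1 = np.hstack([H, np.ones((len(H), 1))])   # append output-bias column
    W2, *_ = np.linalg.lstsq(H1, y, rcond=None)
    mse = float(np.mean((H1 @ W2 - y) ** 2))
    return W2, mse

# Global search over the (much smaller) input-hidden weight space.
# Random restarts here stand in for multi-start downhill simplex.
best_mse = np.inf
for _ in range(200):
    W1 = rng.normal(scale=2.0, size=(1, 5))
    b1 = rng.normal(scale=2.0, size=5)
    _, mse = conditional_lls(X, y, W1, b1)
    best_mse = min(best_mse, mse)

print(f"best MSE over restarts: {best_mse:.4f}")
```

Each candidate in the search is scored after an exact linear solve for the output layer, which is why the partitioned scheme needs far fewer function evaluations than training all weights by gradient descent.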

Original language: English (US)
Title of host publication: IEEE International Conference on Neural Networks - Conference Proceedings
Publisher: IEEE
Pages: 1919-1923
Number of pages: 5
Volume: 3
State: Published - 1997
Event: Proceedings of the 1997 IEEE International Conference on Neural Networks. Part 4 (of 4) - Houston, TX, USA
Duration: Jun 9, 1997 - Jun 12, 1997



ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Artificial Intelligence

Cite this

Gupta, H. V., Hsu, K. L., & Sorooshian, S. (1997). Superior training of artificial neural networks using weight-space partitioning. In IEEE International Conference on Neural Networks - Conference Proceedings (Vol. 3, pp. 1919-1923). IEEE.
