### Abstract

The authors introduce a constraint on intermediate representations that reduces the number of allowable solutions and leads to generalization for certain classes of problems. Specifically, they require that the number of distinct intermediate representations be minimized during training. This representational constraint also defines a training algorithm for multilayered networks. They describe the class of problems for which the algorithm is well suited and discuss its performance on several problems using two-layer networks.
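The quantity the abstract proposes to minimize — the number of distinct intermediate (hidden-layer) representations over the training set — can be sketched in a few lines. The snippet below is an illustration only, not the authors' algorithm: the bipolar inputs, sign activation, network sizes, and function names are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: 32 bipolar (+/-1) training patterns, 8 inputs, 4 hidden units.
X = rng.integers(0, 2, size=(32, 8)) * 2 - 1
W1 = rng.normal(size=(8, 4))  # input-to-hidden weights

def hidden_representations(X, W1):
    """Binarized hidden-layer pattern for each input (sign activation assumed)."""
    return np.sign(X @ W1)

def representation_count(X, W1):
    """Number of *distinct* hidden representations over the training set --
    the quantity the representational constraint would drive down."""
    H = hidden_representations(X, W1)
    return len({tuple(row) for row in H})

print(representation_count(X, W1))
```

With 4 hidden units there are at most 2^4 = 16 distinct sign patterns, so the count is bounded by min(16, number of training patterns); a training procedure honoring the constraint would keep this count as small as the task allows.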

| Original language | English (US) |
| --- | --- |
| Title of host publication | IEEE Int Conf on Neural Networks |
| Publisher | Publ by IEEE |
| Pages | 371-381 |
| Number of pages | 11 |
| State | Published - 1988 |
| Externally published | Yes |

### ASJC Scopus subject areas

- Engineering(all)

### Cite this

Psaltis, D., & Neifeld, M. A. (1988). Emergence of generalization in networks with constrained representations. In *IEEE Int Conf on Neural Networks* (pp. 371-381). Publ by IEEE.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Emergence of generalization in networks with constrained representations

AU - Psaltis, Demetri

AU - Neifeld, Mark A.

PY - 1988

Y1 - 1988

N2 - The authors introduce a constraint on intermediate representations that reduces the number of allowable solutions and leads to generalization for certain classes of problems. Specifically, they require that the number of distinct intermediate representations be minimized during training. This representational constraint also defines a training algorithm for multilayered networks. They describe the class of problems for which the algorithm is well suited and discuss its performance on several problems using two-layer networks.

AB - The authors introduce a constraint on intermediate representations that reduces the number of allowable solutions and leads to generalization for certain classes of problems. Specifically, they require that the number of distinct intermediate representations be minimized during training. This representational constraint also defines a training algorithm for multilayered networks. They describe the class of problems for which the algorithm is well suited and discuss its performance on several problems using two-layer networks.

UR - http://www.scopus.com/inward/record.url?scp=0024122889&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0024122889&partnerID=8YFLogxK

M3 - Conference contribution

SP - 371

EP - 381

BT - IEEE Int Conf on Neural Networks

PB - Publ by IEEE

ER -