<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <title>E-Library Collection:</title>
  <link rel="alternate" href="https://elib.bsu.by:443/handle/123456789/53789" />
  <subtitle />
  <id>https://elib.bsu.by:443/handle/123456789/53789</id>
  <updated>2026-04-20T23:09:52Z</updated>
  <dc:date>2026-04-20T23:09:52Z</dc:date>
  <entry>
    <title>Training algorithm for forecasting multilayer neural network</title>
    <link rel="alternate" href="https://elib.bsu.by:443/handle/123456789/53793" />
    <author>
      <name>Maniakov, N.</name>
    </author>
    <author>
      <name>Makhnist, L.</name>
    </author>
    <author>
      <name>Rubanov, V.</name>
    </author>
    <id>https://elib.bsu.by:443/handle/123456789/53793</id>
    <updated>2018-08-08T07:47:29Z</updated>
    <published>2003-01-01T00:00:00Z</published>
    <summary type="text">Document title: Training algorithm for forecasting multilayer neural network
Authors: Maniakov, N.; Makhnist, L.; Rubanov, V.
Abstract: This paper discusses a method of building a forecasting neural network based on the dynamical properties of a time series. Its construction is explained on the basis of the embedding theorem. For this type of feed-forward multilayer neural network, recurrent training methods and their matrix algorithmization are proposed, which is very helpful for program implementation.</summary>
    <dc:date>2003-01-01T00:00:00Z</dc:date>
  </entry>
  <entry>
    <title>Algorithms of Solving Pattern Recognition Tasks with Uncertain Data</title>
    <link rel="alternate" href="https://elib.bsu.by:443/handle/123456789/53792" />
    <author>
      <name>Ryabtsev, A.</name>
    </author>
    <id>https://elib.bsu.by:443/handle/123456789/53792</id>
    <updated>2018-08-07T09:26:25Z</updated>
    <published>2003-01-01T00:00:00Z</published>
    <summary type="text">Document title: Algorithms of Solving Pattern Recognition Tasks with Uncertain Data
Authors: Ryabtsev, A.
Abstract: Extensions of the set of inductive resolution algorithms are proposed. The extensions allow operating with uncertain (or missing) attribute values. It is established that the algorithms are monotonic with respect to the volume of known data.</summary>
    <dc:date>2003-01-01T00:00:00Z</dc:date>
  </entry>
  <entry>
    <title>A Modification Of Tamura's Pattern Classification Method</title>
    <link rel="alternate" href="https://elib.bsu.by:443/handle/123456789/53791" />
    <author>
      <name>Viattchenin, D. A.</name>
    </author>
    <author>
      <name>Klempach, V.</name>
    </author>
    <author>
      <name>Yarkou, Y.</name>
    </author>
    <id>https://elib.bsu.by:443/handle/123456789/53791</id>
    <updated>2018-08-07T09:22:20Z</updated>
    <published>2003-01-01T00:00:00Z</published>
    <summary type="text">Document title: A Modification Of Tamura's Pattern Classification Method
Authors: Viattchenin, D. A.; Klempach, V.; Yarkou, Y.
Abstract: A modification of Tamura's method of pattern classification based on a fuzzy relation is proposed in the paper. Results of numerical experiments are presented, and a comparative analysis of the original version and the modified method is made. Some conclusions are drawn as well.</summary>
    <dc:date>2003-01-01T00:00:00Z</dc:date>
  </entry>
  <entry>
    <title>"Accelerated Perceptron": A Self-Learning Linear Decision Algorithm</title>
    <link rel="alternate" href="https://elib.bsu.by:443/handle/123456789/53790" />
    <author>
      <name>Zuev, Yu. A.</name>
    </author>
    <id>https://elib.bsu.by:443/handle/123456789/53790</id>
    <updated>2018-08-07T09:25:38Z</updated>
    <published>2003-01-01T00:00:00Z</published>
    <summary type="text">Document title: "Accelerated Perceptron": A Self-Learning Linear Decision Algorithm
Authors: Zuev, Yu. A.
Abstract: The class of linear decision rules is studied. A new weight-correction algorithm, called the "accelerated perceptron", is proposed. In contrast to the classical Rosenblatt perceptron, this algorithm modifies the weight vector at each step. The algorithm may be employed in both learning and self-learning modes. The theoretical behaviour of the algorithm is studied when it is used to increase decision reliability by means of weighted voting. In this case, a simple majority vote may be used as the initial decision.</summary>
    <dc:date>2003-01-01T00:00:00Z</dc:date>
  </entry>
</feed>

