Please use this identifier to cite or link to this item: https://elib.bsu.by/handle/123456789/306218
Title: Fast Random Search Algorithm in Neural Networks Training
Authors: Matskevich, Vadim
Keywords: ЭБ БГУ::NATURAL AND EXACT SCIENCES::Cybernetics
ЭБ БГУ::NATURAL AND EXACT SCIENCES::Mathematics
Issue Date: 2023
Publisher: Minsk : BSU
Citation: Pattern Recognition and Information Processing (PRIP’2023). Artificial Intelliverse: Expanding Horizons : Proceedings of the 16th International Conference, Belarus, Minsk, October 17–19, 2023 / Belarusian State University ; eds. A. Nedzved, A. Belotserkovsky. – Minsk : BSU, 2023. – Pp. 22–24.
Abstract: The paper deals with a state-of-the-art applied problem related to neural network training. Currently, gradient descent algorithms are widely used for training. Despite their high convergence rate, they have a number of disadvantages which, as the range of neural network applications expands, can turn out to be critical. The paper proposes a fast algorithm for neural network training based on random search. It is shown experimentally that the proposed algorithm is almost comparable to the best of the gradient algorithms in convergence rate, while significantly outperforming it in quality.
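For orientation only, the sketch below illustrates the general idea of training by random search that the abstract refers to: candidate weights are generated by random perturbation and kept only if they lower the loss. This is a minimal generic illustration, not the specific fast algorithm proposed in the paper (whose details are not given in this record); the function names, step size, and toy data are assumptions.

```python
# Generic random-search training sketch (illustrative assumption,
# NOT the algorithm from the cited paper).
import numpy as np

def random_search_train(loss_fn, weights, n_iters=1000, step=0.1, rng=None):
    """Perturb weights with Gaussian noise; keep a candidate only if it reduces the loss."""
    rng = np.random.default_rng() if rng is None else rng
    best_loss = loss_fn(weights)
    for _ in range(n_iters):
        candidate = weights + step * rng.standard_normal(weights.shape)
        candidate_loss = loss_fn(candidate)
        if candidate_loss < best_loss:  # accept only improving moves
            weights, best_loss = candidate, candidate_loss
    return weights, best_loss

# Toy usage: fit a single linear neuron y = w.x on synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w
loss = lambda w: np.mean((X @ w - y) ** 2)
w_hat, final_loss = random_search_train(loss, np.zeros(3), n_iters=5000, step=0.05, rng=rng)
print(final_loss)
```

Unlike gradient descent, this kind of search needs only loss evaluations (no derivatives), which is the property that motivates random-search-based training in the abstract.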
URI: https://elib.bsu.by/handle/123456789/306218
ISBN: 978-985-881-522-6
Licence: info:eu-repo/semantics/openAccess
Appears in Collections: 2023. Pattern Recognition and Information Processing (PRIP’2023). Artificial Intelliverse: Expanding Horizons

Files in This Item:
File: 22-24.pdf (237,95 kB, Adobe PDF)


