
Department of Mathematics Seminar Announcement

Registered: 2023-09-18 | Author: Department Administrator | Views: 1058

[Department of Mathematics seminar poster]

DGU seminar introduction


Theory and System of an Abstract Machine Implementing the Depth of Neural Networks and Data Instinct
About the Depth of Layers in Boltzmann Machines
- as a hypothesis to resolve a key problem: an instinct of AI -


Young S. Han


An entropic metric is introduced that measures the configurational distribution of
data particles at each layer. Abstract Machines are defined by the property that the
proposed measure decreases "near monotonically" with the depth of layers. It is shown
that the expected entropy over a set of inference data is related to the difficulty of
that data. The process of developing the abstract machine and its implications for
deep learning applications are explained in the context of thermodynamics and
Bayesian (belief) inference.
One key idea is the entropic force defined at each layer of a deep architecture;
the force arises naturally from the entropy differences between layers. For now
it is suggested that the entropic force 𝟋 is related to the difficulty of the data and
may produce abstract or kinetic motivations in AI agents.
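
As one concrete reading of these ideas (the abstract does not fix the exact definitions), the layer measure can be taken as the Shannon entropy S_l of a binned distribution over a layer's activations, and the entropic force between adjacent layers as the difference F_l = S_l - S_{l+1}; "near monotonic decrease" then amounts to F_l being non-negative for most layers. The sketch below, with hypothetical helpers layer_entropy and entropic_forces and a toy stack of sigmoid layers standing in for the stochastic layers of a deep architecture, illustrates only this reading, not the speaker's actual system.

    import numpy as np

    def layer_entropy(activations, bins=32):
        # Histogram a layer's activations and return the Shannon entropy
        # of the resulting (normalized) configurational distribution.
        hist, _ = np.histogram(activations, bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def entropic_forces(layer_activations):
        # Entropy per layer, then the drop between consecutive layers.
        S = [layer_entropy(a) for a in layer_activations]
        return [S[l] - S[l + 1] for l in range(len(S) - 1)]

    # Toy three-layer stack: random projections followed by sigmoids.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(1000, 64))
    weights = [rng.normal(size=(64, 32)), rng.normal(size=(32, 16))]
    acts = [x]
    for W in weights:
        acts.append(1.0 / (1.0 + np.exp(-acts[-1] @ W)))

    print(entropic_forces(acts))  # one value per pair of adjacent layers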
An implemented system of the AM is introduced. Encapsulating the AM as the kernel of
an AI module is intended to make it easier to build large-scale real-time systems.
The system has been extensively tested and optimised to compute unique
unsupervised codes as a deep Boltzmann machine, and is expected to advance research and
commercial applications in a way that other means cannot replace.
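
The abstract does not spell out how the system's "unique unsupervised codes" are produced. A common convention for Boltzmann machines is to read the binary states of the hidden units as the code for an input; the fragment below sketches that convention for a single restricted Boltzmann machine layer with toy random weights (the function name rbm_hidden_code is hypothetical and the parameters are illustrative, not those of the AM system).

    import numpy as np

    def rbm_hidden_code(v, W, b_h):
        # Probability that each hidden unit turns on given visible vector v,
        # then a deterministic threshold to obtain a binary code.
        p_h = 1.0 / (1.0 + np.exp(-(v @ W + b_h)))
        return (p_h > 0.5).astype(int)

    rng = np.random.default_rng(1)
    W = rng.normal(scale=0.1, size=(64, 16))
    b_h = np.zeros(16)
    v = rng.integers(0, 2, size=(5, 64))   # five binary "data particles"
    print(rbm_hidden_code(v, W, b_h))      # one 16-bit code per input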