3 results for the Okinawa Trough
in the Bulgarian Digital Mathematics Library at IMI-BAS
Abstract:
Станимир Д. Илиев, Нина Хр. Пешева - A hybrid experimental-numerical method, operating in real time, is presented for determining the macroscopic dynamic contact angle that the meniscus of a liquid in a vessel forms with a vertical plate being immersed into, or withdrawn from, the vessel at constant velocity. The method is applicable when the system is in a stationary state. It is based on the full hydrodynamic model of Voinov. It makes it possible to obtain numerically, with high accuracy, the stationary shape of the dynamic meniscus profile (and hence the slope angle of the meniscus), using as a boundary condition the experimentally determined height of the meniscus on the plate.
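The hydrodynamic model of Voinov mentioned above relates the dynamic contact angle to the plate velocity. A minimal sketch, using the well-known Cox-Voinov form θ_d³ = θ_e³ + 9·Ca·ln(L/L_m) rather than the authors' hybrid real-time scheme; the cutoff lengths `macro_length` and `micro_length` are assumed illustrative values:

```python
import math

def dynamic_contact_angle(theta_eq_deg, velocity, viscosity, surface_tension,
                          macro_length=1e-3, micro_length=1e-9):
    """Estimate the macroscopic dynamic contact angle (degrees) from the
    Cox-Voinov relation theta_d^3 = theta_eq^3 + 9*Ca*ln(L/L_m).

    Illustrative sketch only, not the hybrid experimental-numerical
    method of the abstract; the two length scales are assumed values.
    """
    ca = viscosity * velocity / surface_tension      # capillary number
    theta_eq = math.radians(theta_eq_deg)
    theta_d3 = theta_eq ** 3 + 9.0 * ca * math.log(macro_length / micro_length)
    return math.degrees(theta_d3 ** (1.0 / 3.0))

# water-like liquid (mu = 1 mPa*s, sigma = 72 mN/m), plate withdrawn at 1 mm/s
print(round(dynamic_contact_angle(30.0, 1e-3, 1e-3, 0.072), 2))
```

For withdrawal (positive velocity) the dynamic angle exceeds the equilibrium one; for immersion the sign of the capillary number reverses and the angle decreases.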
Abstract:
When Recurrent Neural Networks (RNNs) are to be used as pattern recognition systems, the problem to consider is how to impose prescribed prototype vectors ξ^1, ξ^2, ..., ξ^p as fixed points. In the classical approach, the synaptic matrix W is interpreted as a kind of sign correlation matrix of the prototypes. The weak point of this approach is that it lacks the appropriate tools to deal efficiently with the correlation between the state vectors and the prototype vectors: one can only know whether a given vector is adequately correlated with the prototypes or not, but not its exact degree of correlation, so the capacity of the net is very poor. The interest of our approach lies precisely in the fact that it provides these tools. In this paper, a geometrical view of the dynamics of states is presented. A fixed point is viewed as a point in the Euclidean plane R². The retrieval procedure is analyzed through the statistical frequency distribution of the prototypes. The capacity of the net is improved and the spurious states are reduced. In order to clarify and corroborate the theoretical results, an application is presented together with the formal theory.
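The "sign correlation matrix" of the classical approach is the Hebbian outer-product rule. A minimal sketch (not the paper's method) that builds W from bipolar prototypes and then computes the exact overlap of a state with each prototype, i.e. the correlation degree the abstract says the classical approach cannot expose:

```python
import numpy as np

def hebbian_matrix(prototypes):
    """Classical Hebbian rule W = (1/n) * sum_k xi^k (xi^k)^T, zero diagonal."""
    X = np.array(prototypes, dtype=float)      # shape (p, n)
    n = X.shape[1]
    W = X.T @ X / n
    np.fill_diagonal(W, 0.0)                   # no self-connections
    return W

def overlaps(state, prototypes):
    """Normalized correlation degree of `state` with each prototype, in [-1, 1]."""
    X = np.array(prototypes, dtype=float)
    s = np.array(state, dtype=float)
    return X @ s / len(s)

# two orthogonal bipolar prototypes (illustrative values)
xi1 = [1, 1, -1, -1, 1, -1]
xi2 = [1, -1, 1, -1, 1, 1]
W = hebbian_matrix([xi1, xi2])

# one synchronous update step s <- sign(W s), started at a stored prototype
s = np.sign(W @ np.array(xi1, dtype=float))
print(overlaps(s, [xi1, xi2]))                 # xi1 is recovered as a fixed point
```

A stored prototype is a fixed point of the update, and `overlaps` gives its exact correlation with every prototype rather than a mere yes/no answer.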
Abstract:
As is well known, the Convergence Theorem for Recurrent Neural Networks is based on Lyapunov's second method, which states that to any given net state there corresponds a real number, in other words an element of the one-dimensional Euclidean space R, such that when the state of the net changes, its associated real number decreases. In this paper we introduce the two-dimensional Euclidean space R² as the space associated to the net, and we define a pair of real numbers (x, y) associated to any given state of the net. We prove that when the net changes its state, the product x·y decreases. All the states whose projection onto the energy field lies on the same hyperbolic surface are considered points with the same energy level. On the other hand, we prove that if the states are classified according to their distances to the zero vector, only one pattern in each of the different classes may be at the same energy level. The retrieval procedure is analyzed through the projection of the states onto that plane. The geometrical properties of the synaptic matrix W may be used to classify the n-dimensional state-vector space into n classes. A pattern to be recognized is seen as a point belonging to one of these classes, and depending on the class to which the pattern belongs, different weight parameters are used. The capacity of the net is improved and the spurious states are reduced. In order to clarify and corroborate the theoretical results, an application is presented together with the formal theory.
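The classical one-dimensional convergence argument that the abstract builds on can be demonstrated directly. A minimal sketch (the paper's two-dimensional (x, y) energy is not reproduced here): under asynchronous sign updates with a symmetric, zero-diagonal W, the Lyapunov energy E(s) = -½ sᵀWs never increases:

```python
import numpy as np

def energy(W, s):
    """Classical Hopfield-style Lyapunov energy E(s) = -1/2 s^T W s."""
    return -0.5 * s @ W @ s

def async_step(W, s, i):
    """Update a single unit i; a zero local field keeps the previous state."""
    h = W[i] @ s
    if h != 0:
        s = s.copy()
        s[i] = 1.0 if h > 0 else -1.0
    return s

rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))
W = (A + A.T) / 2                 # symmetric weights, as the theorem requires
np.fill_diagonal(W, 0.0)          # zero self-connections
s = rng.choice([-1.0, 1.0], size=n)

energies = [energy(W, s)]
for _ in range(5):                # a few asynchronous sweeps
    for i in range(n):
        s = async_step(W, s, i)
        energies.append(energy(W, s))

# the energy sequence is monotonically non-increasing
print(all(b <= a + 1e-12 for a, b in zip(energies, energies[1:])))
```

Each single-unit flip changes the energy by -(s'_i - s_i)·h_i ≤ 0, which is exactly why the sequence above cannot increase and the net must settle in a fixed point.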