Abstract:

In this thesis, we studied Restricted Boltzmann Machines with binary priors as models of unsupervised learning, analyzing how the number of hidden neurons affects the number of examples needed for successful training. We simulated a teacher-student scenario and, under the assumption of replica symmetry, computed the efficiency of the machine in order to locate the critical threshold beyond which learning begins. Our results confirm the conjecture that, in the absence of correlation between the weights of the data-generating machine, the critical threshold does not depend on the number of hidden units (as long as it is finite), and hence does not depend on the complexity of the data. By contrast, the presence of correlation significantly reduces the number of examples needed for training, and we showed that this effect becomes more pronounced as the number of hidden units increases. The entire analysis is supported by numerical simulations.
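To make the teacher-student scenario concrete, the following is a minimal sketch, not the thesis code: a teacher RBM with binary (+/-1) weights generates examples through a simplified conditional-sampling rule, and a student RBM with the same architecture is trained by one-step contrastive divergence (CD-1). All names, the data-generation rule, and the parameter values (N, K, M, beta, learning rate) are illustrative assumptions, not taken from the thesis.

```python
# Minimal teacher-student RBM sketch (illustrative assumptions, not the thesis protocol).
import numpy as np

rng = np.random.default_rng(0)

N, K, M = 100, 2, 2000           # visible units, hidden units, number of examples (assumed)
beta = 1.0                       # assumed inverse temperature

# Teacher: binary (+/-1) weights, one row per hidden unit.
W_teacher = rng.choice([-1.0, 1.0], size=(K, N))

def sample_data(W, m):
    """Sample hidden spins uniformly, then visible spins given the teacher weights
    (a simplified conditional-sampling scheme)."""
    K, N = W.shape
    h = rng.choice([-1.0, 1.0], size=(m, K))
    fields = beta * h @ W / np.sqrt(N)
    p = 1.0 / (1.0 + np.exp(-2.0 * fields))        # P(v_i = +1 | h)
    return np.where(rng.random((m, N)) < p, 1.0, -1.0)

V = sample_data(W_teacher, M)

# Student: real-valued weights trained with one-step contrastive divergence (CD-1).
W = 0.01 * rng.standard_normal((K, N))
lr, epochs = 0.05, 50

def hidden_probs(W, v):
    """P(h_k = +1 | v) for +/-1 hidden spins."""
    return 1.0 / (1.0 + np.exp(-2.0 * beta * v @ W.T / np.sqrt(N)))

for _ in range(epochs):
    # Positive phase: hidden expectations given the data.
    ph = hidden_probs(W, V)
    h = np.where(rng.random(ph.shape) < ph, 1.0, -1.0)
    # Negative phase: one Gibbs step back to the visible layer and up again.
    fields = beta * h @ W / np.sqrt(N)
    pv = 1.0 / (1.0 + np.exp(-2.0 * fields))
    v_neg = np.where(rng.random(pv.shape) < pv, 1.0, -1.0)
    ph_neg = hidden_probs(W, v_neg)
    # CD-1 gradient: <h v>_data - <h v>_reconstruction (E[h|v] = 2*p - 1 for +/-1 spins).
    grad = ((2 * ph - 1).T @ V - (2 * ph_neg - 1).T @ v_neg) / M
    W += lr * grad

# Overlap of each teacher unit with its best-matching student unit (up to sign).
num = np.abs(W_teacher @ W.T)
denom = np.outer(np.linalg.norm(W_teacher, axis=1), np.linalg.norm(W, axis=1))
print("best teacher-student overlaps:", (num / denom).max(axis=1))
```

In a setup of this kind, the learning transition can be probed by repeating the experiment over a range of M and recording where the teacher-student overlap departs from zero; the thesis studies this threshold analytically under replica symmetry rather than by the brute-force training sketched here.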