913 results for Divide and conquer
Abstract:
A select-divide-and-conquer variational method to approximate configuration interaction (CI) is presented. Given an orthonormal set made up of occupied orbitals (Hartree-Fock or similar) and suitable correlation orbitals (natural or localized orbitals), a large N-electron target space S is split into subspaces S0, S1, S2, ..., SR. S0, of dimension d0, contains all configurations K with attributes (energy contributions, etc.) above thresholds T0 = {T0egy, T0etc.}; the CI coefficients in S0 always remain free to vary. S1 accommodates configurations K with attributes above T1 ≤ T0. An eigenproblem of dimension d0 + d1 for S0 + S1 is solved first, after which the last d1 rows and columns are contracted into a single row and column, thus freezing the last d1 CI coefficients from then on. The process is repeated with successive Sj (j ≥ 2) chosen so that the corresponding CI matrices fit in random access memory (RAM). Davidson's eigensolver is used R times. The final energy eigenvalue (lowest or an excited one) is always above the corresponding exact eigenvalue in S. Threshold values {Tj; j = 0, 1, 2, ..., R} regulate accuracy; for large-dimensional S, high accuracy requires S0 + S1 to be solved outside RAM. From there on, however, usually only a few Davidson iterations in RAM are needed for each step, so that Hamiltonian matrix-element evaluation becomes rate determining. One-μhartree accuracy is achieved for an eigenproblem of order 24 × 10⁶, involving 1.2 × 10¹² nonzero matrix elements and 8.4 × 10⁹ Slater determinants.
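As a purely illustrative sketch of the solve-then-contract step described above (assuming a small dense Hamiltonian over S0 + S1, and using numpy's dense eigensolver in place of Davidson's iterative method; the helper name contract_block and the data layout are assumptions, not the authors' code):

```python
import numpy as np

def contract_block(H, d0):
    """Solve the (d0 + d1)-dimensional eigenproblem over S0 + S1, then contract
    the trailing d1 rows/columns into a single row/column whose internal
    CI coefficients stay frozen at their current values."""
    # Lowest eigenpair of the current subspace Hamiltonian (stand-in for Davidson).
    evals, evecs = np.linalg.eigh(H)
    c = evecs[:, 0]

    # Frozen (normalized) coefficients of the trailing block S1.
    c1 = c[d0:]
    c1 = c1 / np.linalg.norm(c1)

    # Effective Hamiltonian of dimension d0 + 1: the S0 block is kept as is,
    # the S1 block collapses onto the single contracted vector c1.
    H00 = H[:d0, :d0]
    h01 = H[:d0, d0:] @ c1        # coupling of S0 with the contracted state
    h11 = c1 @ H[d0:, d0:] @ c1   # energy of the contracted state

    H_eff = np.zeros((d0 + 1, d0 + 1))
    H_eff[:d0, :d0] = H00
    H_eff[:d0, d0] = h01
    H_eff[d0, :d0] = h01
    H_eff[d0, d0] = h11
    return evals[0], H_eff
```

Each later subspace Sj would then be appended to this (d0 + 1)-dimensional effective matrix and the same solve-then-contract step repeated, which is what keeps every eigenproblem after the first small enough for a few in-RAM Davidson iterations.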
Abstract:
The identification of northern and southern components in different vertebrate species led researchers to accept a two-component hypothesis for the Brazilian Atlantic forest (BAF). Nevertheless, neither a formal proposal nor a meta-analysis to confirm this coincidence was ever made. Our main objective here was therefore to systematically test into how many vertebrate components the BAF could be divided by analysing existing empirical data. We used two approaches: (1) mapping and comparing the proposed areas of vertebrate endemism in the BAF and (2) analysing studies mentioning spatial subdivisions in distinct forest-dependent vertebrates within the biome, by means of panbiogeography. The four large-scale endemism-area components, together with the six small-scale panbiogeographical ones, allowed the definition of three BAF greater regions, subdivided into nine vertebrate components organized latitudinally and longitudinally. Empirical time estimates of the diversification events within the BAF were also reviewed. Diversification of these vertebrates occurred not only in the Pleistocene but also throughout the Miocene. Our results confirm the BAF's complex history, both in space and time. We propose that future research should be small-scale and focused on the vertebrate components identified herein. Given the BAF's heterogeneity, studying it section by section will be much more useful in identifying the BAF's historical biogeography. (c) 2012 The Linnean Society of London, Biological Journal of the Linnean Society, 2012, 107, 39-55.
Abstract:
With the failure of the traditional mechanisms for distributing bibliographic materials in developing countries, digital libraries emerge as a strong alternative for accomplishing such a task, despite the challenges of the digital divide. This paper discusses the challenges of building a digital library (DL) in a developing country. The case of Cape Verde as a digital-divide country is analyzed, in terms of current digital library usage and its potential for overcoming the difficulties of accessing bibliographic resources in the country. The paper also introduces an ongoing project to build a digital library at the University Jean Piaget of Cape Verde.
Abstract:
We propose a model where an autocrat rules over an ethnically divided society. The dictator selects the tax rate on domestic production and the nation's natural resources to maximize his rents under the threat of a regime-switching revolution. We show that a weak ruler may let the country plunge into civil war to increase his personal rents. Inter-group fighting weakens potential opposition to the ruler, thereby allowing him to increase fiscal pressure. We show that the presence of natural resources exacerbates the ruler's incentives to promote civil conflict for his own profit, especially if the resources are unequally distributed across ethnic groups. We validate the main predictions of the model using cross-country data over the period 1960-2007, and show that our empirical results are unlikely to be driven by omitted observable determinants of civil war incidence or by unobservable country-specific heterogeneity.
Abstract:
In order to gain knowledge from large databases, scalable data mining technologies are needed. Data are captured on a large scale and thus databases are growing at a fast pace. This leads to the utilisation of parallel computing technologies in order to cope with large amounts of data. In the area of classification rule induction, parallelisation of classification rule induction has focused on the divide-and-conquer approach, also known as Top Down Induction of Decision Trees (TDIDT). An alternative approach to classification rule induction is separate and conquer, which has only recently become a focus of parallelisation. This work introduces and empirically evaluates a framework for the parallel induction of classification rules generated by members of the Prism family of algorithms; all members of the Prism family follow the separate-and-conquer approach.
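As a rough, hypothetical sketch of the separate-and-conquer control flow referred to above (not the paper's parallel framework; term-selection details, tie handling, the data layout and the helper names are all assumptions made for illustration):

```python
def covers(rule, attrs):
    return all(attrs.get(a) == v for a, v in rule)

def induce_rules_for_class(instances, target):
    """instances: list of (attribute dict, class label); returns a list of rules,
    each rule being a list of (attribute, value) terms predicting `target`."""
    rules, remaining = [], list(instances)
    while any(label == target for _, label in remaining):
        covered, rule = list(remaining), []
        # "Conquer": specialise the rule until it covers only target instances.
        while any(label != target for _, label in covered):
            candidates = {(a, v) for attrs, _ in covered
                          for a, v in attrs.items() if (a, v) not in rule}
            if not candidates:
                break
            def precision(term, cov=covered):
                sub = [lab for attrs, lab in cov if attrs.get(term[0]) == term[1]]
                return sum(lab == target for lab in sub) / len(sub)
            best = max(candidates, key=precision)
            rule.append(best)
            covered = [x for x in covered if x[0].get(best[0]) == best[1]]
        rules.append(rule)
        # "Separate": remove the instances the new rule covers, then repeat.
        remaining = [x for x in remaining if not covers(rule, x[0])]
    return rules
```

Since rules for different class labels can be induced independently of one another, that per-class independence is one natural place to distribute the work; the framework actually evaluated in the paper is not reproduced here.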
Abstract:
Semi-supervised learning is one of the important topics in machine learning, concerned with pattern classification where only a small subset of the data is labeled. In this paper, a new network-based (or graph-based) semi-supervised classification model is proposed. It employs a combined random-greedy walk of particles, with competition and cooperation mechanisms, to propagate class labels to the whole network. Due to the competition mechanism, the proposed model has a local label-spreading fashion, i.e., each particle only visits the portion of nodes potentially belonging to it, while it is not allowed to visit nodes definitely occupied by particles of other classes. In this way, a "divide-and-conquer" effect is naturally embedded in the model. As a result, the proposed model achieves a good classification rate while exhibiting a low computational complexity order in comparison to other network-based semi-supervised algorithms. Computer simulations carried out on synthetic and real-world data sets provide a numeric quantification of the method's performance.
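A heavily simplified, hypothetical sketch of such a combined random-greedy walk with competition is given below; the update rule and the parameters p_greedy and delta are assumptions made for illustration only, not the published model:

```python
import random
import numpy as np

def particle_competition(adj, labels, n_classes, steps=10000, p_greedy=0.6, delta=0.1):
    """adj: list of neighbour lists; labels: dict node -> class for labeled nodes."""
    n = len(adj)
    # Domination level of each class over each node; labeled nodes are fixed one-hot.
    dom = np.full((n, n_classes), 1.0 / n_classes)
    for v, c in labels.items():
        dom[v] = 0.0
        dom[v, c] = 1.0
    # One particle per labeled node, carrying that node's class.
    particles = [(v, c) for v, c in labels.items()]
    positions = [v for v, _ in particles]

    for _ in range(steps):
        for i, (_, c) in enumerate(particles):
            nbrs = adj[positions[i]]
            if not nbrs:
                continue
            if random.random() < p_greedy:
                # Greedy move: prefer neighbours already dominated by this class.
                target = max(nbrs, key=lambda u: dom[u, c])
            else:
                target = random.choice(nbrs)  # random move
            positions[i] = target
            if target not in labels:
                # Competition: raise this class's domination, lower the others.
                dom[target, c] = min(1.0, dom[target, c] + delta)
                others = [k for k in range(n_classes) if k != c]
                dom[target, others] = np.clip(
                    dom[target, others] - delta / (n_classes - 1), 0.0, None)
                dom[target] /= dom[target].sum()
    return dom.argmax(axis=1)  # predicted class per node
```

Here the greedy move steers each particle toward territory already dominated by its own class, which is what produces the local, "divide-and-conquer"-like label spreading the abstract mentions.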
Abstract:
In this paper, a novel approach for obtaining 3D models from video sequences captured with hand-held cameras is presented. We define a pipeline that robustly deals with different types of sequences and acquisition devices. Our system follows a divide-and-conquer approach: after a frame decimation that pre-conditions the input sequence, the video is split into short-length clips. This allows the reconstruction step to be parallelized, which translates into a reduction in the computational resources required. The short length of the clips allows an intensive search for the best solution at each step of the reconstruction, which makes the system more robust. Unlike other approaches, the feature-tracking process is embedded within the reconstruction loop for each clip. A final registration step merges all the processed clips into the same coordinate frame.
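A structural sketch of that pipeline, with hypothetical stand-ins for the per-clip reconstruction and the final registration (only the split / parallel reconstruction / merge control flow is illustrated), might look as follows:

```python
from concurrent.futures import ProcessPoolExecutor

def split_into_clips(frames, clip_len=30, overlap=5):
    """Cut the (already decimated) frame sequence into short overlapping clips,
    so that neighbouring reconstructions share frames for later registration."""
    clips, start = [], 0
    while start < len(frames):
        clips.append(frames[start:start + clip_len])
        start += clip_len - overlap
    return clips

def reconstruct_clip(clip):
    # Stand-in for per-clip feature tracking + incremental reconstruction,
    # which would return cameras and 3D points in the clip's own frame.
    return {"n_frames": len(clip), "cameras": [], "points": []}

def register(partial_models):
    # Stand-in for estimating, from the overlapping frames, the transforms
    # that merge every partial model into one global coordinate frame.
    return {"clips_merged": len(partial_models)}

def reconstruct_video(frames):
    clips = split_into_clips(frames)
    with ProcessPoolExecutor() as pool:
        # Clips are independent, so their reconstruction runs in parallel here.
        partial_models = list(pool.map(reconstruct_clip, clips))
    return register(partial_models)
```

Overlapping the clips gives the registration stage shared frames from which the transforms into a single coordinate frame can be estimated.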
Abstract:
This article aims to gain a greater understanding of relevant and successful methods of stimulating an ICT culture and skills development in rural areas. The paper distils good practice activities, utilizing criteria derived from a review of the rural dimensions of ICT learning, from a range of relevant initiatives and programmes. These good practice activities cover: community resource centres providing opportunities for 'tasting' ICTs; video games and Internet cafés as tools removing 'entry barriers'; emphasis on 'user management' as a means of creating ownership; service delivery beyond fixed locations; use of ICT capacities in the delivery of general services; and selected use of financial support.