933 results for "parallel and distributed information processing"


Relevance: 100.00%

Abstract:

The purpose of this study was to evaluate the accuracy of two electronic apex locators, Digital Signal Processing (DSP) and ProPex, for root canal length determination in primary teeth. Fifteen primary molars (a total of 34 root canals) were divided into two groups: Group I - without physiological resorption (n = 16); and Group II - with physiological resorption (n = 18). The length of each canal was measured by introducing a file until its tip was visible and then retracting it 1 mm. For electronic measurement, the devices were set to 1 mm short of the apical resorption. The data were analysed statistically using the intraclass correlation coefficient (ICC). Results showed that the ICC was high for both electronic apex locators in all situations - with (ICC: DSP = 0.82 and ProPex = 0.89) or without resorption (ICC: DSP = 0.92 and ProPex = 0.90). Both apex locators were extremely accurate in determining the working length in primary teeth, both with and without physiological resorption.

Relevance: 100.00%

Abstract:

A novel cryptography method based on the Lorenz attractor chaotic system is presented. The proposed algorithm is secure and fast, making it practical for general use. We introduce the chaotic operation mode, which provides an interaction among the password, the message and a chaotic system, ensuring that the algorithm yields a secure codification even if the nature of the chaotic system is known. The algorithm has been implemented in two versions: one sequential and slow, the other parallel and fast. Our algorithm assures the integrity of the ciphertext (we know if it has been altered, which traditional algorithms do not assure) and consequently its authenticity. Numerical experiments are presented and discussed, showing the behavior of the method in terms of security and performance. The fast version of the algorithm has performance comparable to AES, a widely used commercial encryption standard, but is more secure, which makes it immediately suitable for general-purpose cryptography applications. An internet page has been set up to enable readers to test the algorithm and to try to break the cipher.
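The abstract does not reproduce the algorithm itself. As a rough, deliberately insecure sketch of the general idea - deriving a keystream from a password-seeded Lorenz trajectory and XOR-ing it with the message - one might write the following; every parameter and design choice here is an illustrative assumption, not the paper's method:

```python
import hashlib
import struct

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit-Euler step of the Lorenz system."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def keystream(password: bytes, n: int) -> bytes:
    """n pseudo-random bytes from a password-seeded Lorenz trajectory.
    Illustrative only: NOT the paper's algorithm and NOT secure."""
    h = hashlib.sha256(password).digest()
    x, y, z = (1.0 + b / 256.0 for b in h[:3])  # seed the state from the password
    for _ in range(200):                        # burn-in to decorrelate from the seed
        x, y, z = lorenz_step(x, y, z)
    out = bytearray()
    while len(out) < n:
        x, y, z = lorenz_step(x, y, z)
        out.append(struct.pack("<d", x)[3])     # extract a mid-mantissa byte of x
    return bytes(out)

def xor_cipher(message: bytes, password: bytes) -> bytes:
    """XOR the message with the keystream; the function is its own inverse."""
    return bytes(m ^ k for m, k in zip(message, keystream(password, len(message))))
```

Because encryption and decryption are the same XOR operation, `xor_cipher(xor_cipher(msg, pw), pw)` recovers `msg`; the paper's actual scheme additionally ties the chaotic state to the message to provide the integrity guarantee described above.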

Relevance: 100.00%

Abstract:

The coexistence of different types of templates has been the choice solution to the information crisis of prebiotic evolution, triggered by the finding that a single RNA-like template cannot carry enough information to code for any useful replicase. In principle, confining d distinct templates of length L in a package or protocell, whose survival depends on the coexistence of the templates it holds, could resolve this crisis provided that d is made sufficiently large. Here we review the prototypical package model of Niesert et al. [1981. Origin of life between Scylla and Charybdis. J. Mol. Evol. 17, 348-353], which guarantees the greatest possible region of viability of the protocell population, and show that this model, and hence the entire package approach, does not resolve the information crisis. In particular, we show that the total information stored in a viable protocell (Ld) tends to a constant value that depends only on the spontaneous error rate per nucleotide of the template replication mechanism. As a result, an increase of d must be followed by a decrease of L, so that the net information gain is null.
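The "Ld tends to a constant" result has the flavor of Eigen's error-threshold argument: if each nucleotide is copied correctly with probability 1 - u, an error-free copy of all d templates of length L occurs with probability (1 - u)^(L*d), so holding that probability above any fixed viability level pins the product L*d. A rough numeric sketch of that bound (the threshold q_min is an illustrative assumption, not a value from the paper):

```python
import math

def max_total_length(u, q_min):
    """Largest total genome length L*d for which an error-free copy of all
    templates still occurs with probability >= q_min, given per-nucleotide
    error rate u: solve (1 - u)**(L*d) >= q_min for L*d."""
    return math.log(q_min) / math.log(1.0 - u)

# For a fixed error rate the product L*d is pinned, so raising the number
# of templates d forces shorter templates L; halving u roughly doubles
# the admissible total length.
for u in (0.01, 0.005):
    print(f"u = {u}: L*d <= {max_total_length(u, q_min=0.5):.0f}")
```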

Relevance: 100.00%

Abstract:

This presentation was offered as part of the CUNY Library Assessment Conference, Reinventing Libraries: Reinventing Assessment, held at the City University of New York in June 2014.

Relevance: 100.00%

Abstract:

Metaheuristic techniques are known to solve optimization problems classified as NP-complete and are successful in obtaining good-quality solutions. They use non-deterministic approaches to generate near-optimal solutions, without guaranteeing that the global optimum will be found. Motivated by the difficulty of these problems, this work proposes parallel hybrid methods combining reinforcement learning, the GRASP metaheuristic and Genetic Algorithms. With these techniques, we aim to improve the efficiency of obtaining good solutions. Instead of using the Q-learning reinforcement-learning algorithm merely to generate the initial solutions of the metaheuristics, we use it in a cooperative and competitive approach with the Genetic Algorithm and GRASP, in a parallel implementation. The implementations in this study showed satisfactory results under both strategies, that is, cooperation and competition between the individual methods, and cooperation and competition between groups. In some instances the global optimum was found; in others, the implementations came close to it. A performance analysis of the proposed approach was carried out and showed good results on the measures that demonstrate the efficiency and speedup (gain in speed from parallel processing) of the implementations.
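As a standalone illustration of just the GRASP component (greedy-randomized construction with a restricted candidate list, followed by local search) - leaving out the Q-learning and Genetic Algorithm cooperation the work actually studies - a toy sketch for a small symmetric TSP might look like this:

```python
import itertools
import random

def grasp_tsp(dist, iters=100, alpha=0.3, seed=0):
    """GRASP for a small symmetric TSP: greedy-randomized construction with
    a restricted candidate list (RCL), then first-improvement 2-opt."""
    rng = random.Random(seed)
    n = len(dist)
    best_tour, best_len = None, float("inf")

    def tour_len(t):
        return sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))

    for _ in range(iters):
        # Construction phase: pick the next city at random from the RCL.
        tour, remaining = [0], set(range(1, n))
        while remaining:
            costs = {j: dist[tour[-1]][j] for j in remaining}
            lo, hi = min(costs.values()), max(costs.values())
            rcl = [j for j, c in costs.items() if c <= lo + alpha * (hi - lo)]
            nxt = rng.choice(rcl)
            tour.append(nxt)
            remaining.remove(nxt)
        # Local search phase: 2-opt segment reversals until no improvement.
        improved = True
        while improved:
            improved = False
            for i, k in itertools.combinations(range(1, n), 2):
                cand = tour[:i] + tour[i:k + 1][::-1] + tour[k + 1:]
                if tour_len(cand) < tour_len(tour):
                    tour, improved = cand, True
        if tour_len(tour) < best_len:
            best_tour, best_len = tour[:], tour_len(tour)
    return best_tour, best_len
```

The `alpha` parameter controls greediness (0 = pure greedy, 1 = pure random); in the hybrid the paper describes, Q-learning would interact with this construction rather than leave it purely random.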

Relevance: 100.00%

Abstract:

A study was conducted in a 1566 ha watershed situated in the Capivara River basin, municipality of Botucatu, São Paulo State, Brazil. This environment is fragile and can be subjected to different forms of negative impact, among them soil erosion by water. The main objective of the research was to develop a methodology for assessing soil erosion fragility at the various watershed positions, using the geographic information system ILWIS version 3.3 for Windows. An impact model was created to generate the soil erosion fragility plan, based on four indicators of fragility to water erosion: land use and cover, slope, percentage of fine sand in the soil, and accumulated water flow. Thematic plans were generated in a geographic information system (GIS) environment. First, all the variables except land use and cover were described by continuous numerical plans in a raster structure. The land use and cover plan was also represented by numerical values associated with the weights attributed to each class, starting from a pairwise comparison matrix and using the analytical hierarchy process. A final field check was done to record evidence of erosive processes in the areas indicated as presenting the highest levels of fragility, i.e., sites with steep slopes, a high percentage of fine sand in the soil, a tendency to accumulate surface water flow, and pastureland. The methodology used to diagnose the environmental problems of the study area can be employed in places with similar relief, soil and climatic conditions.
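Deriving class weights "starting from a pairwise comparison matrix and using the analytical hierarchy process" can be sketched with the standard row geometric-mean approximation of AHP priorities; the comparison matrix and cover classes below are hypothetical, not taken from the study:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise comparison matrix
    using the row geometric-mean method (weights normalized to sum to 1)."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical fragility comparison of three cover classes on Saaty's 1-9
# scale: pasture vs. annual crops vs. forest.
M = [
    [1.0, 3.0, 7.0],     # pasture judged 3x as fragile as crops, 7x as forest
    [1 / 3, 1.0, 3.0],
    [1 / 7, 1 / 3, 1.0],
]
print([round(w, 3) for w in ahp_weights(M)])
```

The resulting weights would then be the numerical values attached to each land use and cover class in the raster plan.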

Relevance: 100.00%

Abstract:

A neural model for solving nonlinear optimization problems is presented in this paper. More specifically, a modified Hopfield network is developed and its internal parameters are computed using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points that represent an optimal feasible solution. The network is shown to be completely stable and globally convergent to the solutions of nonlinear optimization problems. A study of the modified Hopfield model is also developed to analyze its stability and convergence. Simulation results are presented to validate the developed methodology.
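The abstract does not detail the valid-subspace computation of the network parameters. As a loose toy analogue, one can mimic the idea - Hopfield-style descent dynamics interleaved with projection of the state back onto the valid (feasible) subspace - on a small equality-constrained quadratic program; everything below is an illustrative assumption, not the paper's formulation:

```python
def hopfield_subspace(target, total, eta=0.2, steps=500):
    """Toy projected dynamics: minimize sum((v_i - target_i)**2) subject to
    sum(v) = total. Each iteration takes a gradient-descent step (the
    Hopfield-style dynamics) and then projects the state back onto the
    valid subspace sum(v) = total."""
    n = len(target)
    v = [0.0] * n
    for _ in range(steps):
        v = [vi - eta * 2.0 * (vi - ti) for vi, ti in zip(v, target)]  # descent step
        shift = (total - sum(v)) / n                                   # projection
        v = [vi + shift for vi in v]
    return v
```

For this separable objective the constrained optimum is known in closed form, v* = target_i + (total - sum(target)) / n, so the convergence claimed for the real network can be checked directly on the toy problem.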

Relevance: 100.00%

Abstract:

Information has increasingly become a crucial resource for organizations that want to remain competitive in the market. For this reason, the analysis and correct understanding of the types of information present in these environments become relevant to achieving the highest levels of performance. The aim of this paper is to review the literature on the concepts of organic and archival information within the organizational context and business environments. This is still an emerging theoretical field and is therefore conducive to intense discussion. We point out elements that help to characterize and distinguish these two types of information.

Relevance: 100.00%

Abstract:

The purpose of this study was to examine the effects of visual and somatosensory information on body sway in individuals with Down syndrome (DS). Nine adults with DS (19-29 years old) and nine control subjects (CS) (19-29 years old) stood in the upright stance in four experimental conditions: no vision and no touch; vision and no touch; no vision and touch; and vision and touch. In the vision condition, participants looked at a target placed in front of them; in the no vision condition, participants wore a black cotton mask. In the touch condition, participants touched a stationary surface with their right index finger; in the no touch condition, participants kept their arms hanging alongside their bodies. A force plate was used to estimate center of pressure excursion for both anterior-posterior and medial-lateral directions. MANOVA revealed that both the individuals with DS and the control subjects used vision and touch to reduce overall body sway, although individuals with DS still oscillated more than did the CS. These results indicate that adults with DS are able to use sensory information to reduce body sway, and they demonstrate that there is no difference in sensory integration between the individuals with DS and the CS.

Relevance: 100.00%

Abstract:

A thorough study of the thermal performance of multipass parallel cross-flow and counter-cross-flow heat exchangers has been carried out by applying a new numerical procedure. According to this procedure, the heat exchanger is discretized into small elements following the tube-side fluid circuits. Each element is itself a one-pass mixed-unmixed cross-flow heat exchanger. Simulated results have been validated through comparison with analytical solutions for one- to four-pass parallel cross-flow and counter-cross-flow arrangements. Very accurate results have been obtained over wide ranges of NTU (number of transfer units) and C* (heat capacity rate ratio) values. New effectiveness data for the aforementioned configurations and a higher number of tube passes are presented, along with data for a complex flow configuration proposed elsewhere. The proposed procedure constitutes a useful research tool for both theoretical and experimental studies of the thermal performance of cross-flow heat exchangers.
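The per-pass building block described here (a one-pass mixed-unmixed cross-flow exchanger) and the coupling of passes in overall counterflow both have standard effectiveness-NTU relations. A sketch using those textbook formulas - not the paper's discretized numerical procedure, and assuming the total NTU splits evenly across passes - might look like this:

```python
import math

def eps_crossflow_cmax_mixed(ntu, cr):
    """Effectiveness of a one-pass cross-flow exchanger with the Cmax fluid
    mixed and the Cmin fluid unmixed (standard epsilon-NTU relation)."""
    return (1.0 / cr) * (1.0 - math.exp(-cr * (1.0 - math.exp(-ntu))))

def eps_counter_coupled(eps_p, cr, n):
    """Overall effectiveness of n identical passes, each with effectiveness
    eps_p, coupled in overall counterflow (standard multipass relation)."""
    if cr == 1.0:
        return n * eps_p / (1.0 + (n - 1.0) * eps_p)
    r = ((1.0 - eps_p * cr) / (1.0 - eps_p)) ** n
    return (r - 1.0) / (r - cr)

# Example: total NTU = 2, C* = 0.8, three passes with NTU split evenly.
ntu, cr, n = 2.0, 0.8, 3
eps_p = eps_crossflow_cmax_mixed(ntu / n, cr)
print(f"overall effectiveness: {eps_counter_coupled(eps_p, cr, n):.3f}")
```

The paper's element-by-element discretization generalizes this kind of closed-form composition to arbitrary tube-side circuit layouts.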

Relevance: 100.00%

Abstract:

The purpose of this work was to study the fragmentation of forest formations (mesophytic forest, riparian woodland and savannah vegetation (cerrado)) in a 15,774-ha study area located in the Municipal District of Botucatu in Southeastern Brazil (São Paulo State). A land use and land cover map was made from a color composition of a Landsat-5 thematic mapper (TM) image. The edge effect caused by habitat fragmentation was assessed by overlaying, on a geographic information system (GIS), the land use and land cover data with the spectral ratio. The degree of habitat fragmentation was analyzed by deriving: 1. mean patch area and perimeter; 2. patch number and density; 3. perimeter-area ratio, fractal dimension (D), and shape diversity index (SI); and 4. distance between patches and dispersion index (R). In addition, the following relationships were modeled: 1. distribution of natural vegetation patch sizes; 2. the perimeter-area relationship and the number and area of natural vegetation patches; and 3. the edge effect caused by habitat fragmentation. The values of R indicated that savannah patches (R = 0.86) were aggregated while patches of natural vegetation as a whole (R = 1.02) were randomly dispersed in the landscape. There was a high frequency of small patches in the landscape, whereas large patches were rare. In the perimeter-area relationship, there was no sign of scale distinction in the patch shapes. In the patch number-landscape area relationship, D, though apparently scale-dependent, tends to be constant as area increases. This phenomenon was correlated with the tendency to reach a constant density as the working scale was increased. In the edge effect analysis, the edge-center distance was properly estimated by a model in which the edge-center distance was considered a function of the total patch area and the SI.
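The reported dispersion values (R near 1 for random placement, R below 1 for aggregated patches) are consistent with the Clark-Evans nearest-neighbour index; assuming that is the R intended, a minimal sketch over patch centroids would be:

```python
import math

def dispersion_index(points, area):
    """Clark-Evans nearest-neighbour dispersion index R: the observed mean
    nearest-neighbour distance divided by the value expected for a random
    (Poisson) pattern of the same density, 0.5 / sqrt(n / area).
    R < 1 suggests aggregation; R near 1 suggests random placement."""
    n = len(points)
    nearest = [
        min(math.dist(points[i], points[j]) for j in range(n) if j != i)
        for i in range(n)
    ]
    observed = sum(nearest) / n
    expected = 0.5 / math.sqrt(n / area)
    return observed / expected
```

Applied to, say, savannah patch centroids and the landscape area, a value like the paper's R = 0.86 would indicate patches sitting closer together than a random pattern of the same density.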