7 results for Distributed operating systems (Computers) - Design

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 100.00%

Publisher:

Abstract:

Objective: The aim of this study was to evaluate, ex vivo, the precision of five electronic root canal length measurement devices (ERCLMDs) with different operating systems: the Root ZX, Mini Apex Locator, Propex II, iPex, and RomiApex A-15, and the possible influence of positioning the instrument tips short of the apical foramen. Material and Methods: Forty-two mandibular bicuspids had their real canal lengths (RL) determined beforehand. Electronic measurements were performed 1.0 mm short of the apical foramen (-1.0), followed by measurements at the apical foramen (0.0). The data resulting from the comparison of the ERCLMD measurements with the RL were evaluated by the Wilcoxon and Friedman tests at a significance level of 5%. Results: Considering the measurements performed at 0.0 and -1.0, the precision rates for the ERCLMDs were 73.5% and 47.1% (Root ZX), 73.5% and 55.9% (Mini Apex Locator), 67.6% and 41.1% (Propex II), 61.7% and 44.1% (iPex), and 79.4% and 44.1% (RomiApex A-15), respectively, within a tolerance of ±0.5 mm. Regarding the mean discrepancies, no differences were observed at 0.0; however, in the measurements at -1.0, the iPex, a multi-frequency ERCLMD, had significantly more discrepant readings short of the apical foramen than the other devices, except for the Propex II, which had intermediate results. When the ERCLMD measurements at -1.0 were compared with those at 0.0, the Propex II, iPex, and RomiApex A-15 presented significantly higher discrepancies in their readings. Conclusions: Under the conditions of the present study, all the ERCLMDs provided acceptable measurements at the 0.0 position. However, at the -1.0 position, the ERCLMDs had lower precision, with statistically significant differences for the Propex II, iPex, and RomiApex A-15.
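
For readers unfamiliar with the statistical comparison described above, the sketch below shows how paired device-versus-reference discrepancies could be evaluated with the Wilcoxon and Friedman tests in Python (scipy). The data are synthetic placeholders; this is not the authors' analysis script.

```python
# Synthetic example only: paired comparison of device readings against the
# real canal length (RL) using the tests cited in the abstract. The device
# names come from the study; the data below are made up for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rl = rng.uniform(19.0, 23.0, size=34)            # hypothetical real lengths (mm)
readings = {                                      # simulated readings at the 0.0 position
    dev: rl + rng.normal(0.0, 0.4, size=rl.size)
    for dev in ["Root ZX", "Mini Apex Locator", "Propex II", "iPex", "RomiApex A-15"]
}

# Wilcoxon signed-rank test: is each device's discrepancy from RL centered on zero?
for dev, measured in readings.items():
    _, p = stats.wilcoxon(measured - rl)
    print(f"{dev:18s} Wilcoxon p = {p:.3f}")

# Friedman test: do the five devices differ from one another on the same teeth?
_, p = stats.friedmanchisquare(*(m - rl for m in readings.values()))
print(f"Friedman p = {p:.3f} (significance level 0.05)")
```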

Relevance: 100.00%

Publisher:

Abstract:

Current scientific applications produce large amounts of data. The processing, handling, and analysis of such data require large-scale computing infrastructures such as clusters and grids. In this area, studies aim at improving the performance of data-intensive applications by optimizing data accesses. To achieve this goal, distributed storage systems have employed techniques such as data replication, migration, distribution, and access parallelism. However, the main drawback of those studies is that they do not take application behavior into account when optimizing data access. This limitation motivated this paper, which applies strategies to support the online prediction of application behavior in order to optimize data access operations on distributed systems, without requiring any information on past executions. To accomplish this, the approach organizes application behavior as time series and then analyzes and classifies those series according to their properties. Based on these properties, the approach selects modeling techniques to represent the series and perform predictions, which are later used to optimize data access operations. This new approach was implemented and evaluated using the OptorSim simulator, sponsored by the LHC-CERN project and widely employed by the scientific community. Experiments confirm that the new approach reduces application execution time by about 50 percent, especially when handling large amounts of data.
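
As a rough illustration of the idea of classifying an access time series by a property and selecting a predictor accordingly, here is a minimal Python sketch. The property test (trend versus noise) and the two models are simple stand-ins, not the classification or modeling techniques actually used in the paper.

```python
# Minimal sketch: treat an application's recent per-interval data-access
# volumes as a time series, classify the series with a crude property test,
# and choose a predictor accordingly. Thresholds and models are assumptions.
import numpy as np

def predict_next_access(history_mb: np.ndarray) -> float:
    """Predict the next per-interval access volume (MB) from recent history."""
    t = np.arange(len(history_mb))
    slope, intercept = np.polyfit(t, history_mb, 1)
    if abs(slope) * len(history_mb) > history_mb.std():
        # Pronounced trend: extrapolate the linear fit one step ahead.
        return float(slope * len(history_mb) + intercept)
    # Otherwise treat the series as roughly stationary: short moving average.
    return float(history_mb[-5:].mean())

def should_prefetch(history_mb: np.ndarray, threshold_mb: float = 500.0) -> bool:
    """Trigger replication/prefetching when demand is predicted to spike."""
    return predict_next_access(history_mb) > threshold_mb

trace = np.array([120, 150, 180, 260, 340, 470, 610], dtype=float)
print(predict_next_access(trace), should_prefetch(trace))
```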

Relevance: 100.00%

Publisher:

Abstract:

Breakthrough advances in microprocessor technology and efficient power management have altered the course of processor development with the emergence of multi-core processor technology, bringing higher levels of processing power. The use of many-core technology has boosted the computing power provided by clusters of workstations or SMPs, delivering large computational power at an affordable cost using solely commodity components. Different implementations of message-passing libraries and system software (including operating systems) are installed on such cluster and multi-cluster computing systems. To guarantee correct execution of a message-passing parallel application in a computing environment other than the one for which it was originally developed, the application code must be reviewed. In this paper, a hybrid communication interfacing strategy is proposed to execute a parallel application on a group of computing nodes belonging to different clusters or multi-clusters (computing systems that may be running different operating systems and MPI implementations), interconnected with public or private IP addresses, and responding interchangeably to user execution requests. Experimental results demonstrate the feasibility and effectiveness of the proposed strategy through the execution of benchmark parallel applications.
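
The sketch below illustrates, in simplified form, one way nodes behind private IP addresses in different clusters could exchange messages through a publicly reachable relay. It is not the hybrid interface proposed in the paper; the framing, ports, and topology are invented for the example.

```python
# Illustration only (not the paper's interface): a user-level relay on a
# publicly reachable head node that forwards length-prefixed messages between
# two compute nodes sitting behind private IP addresses in different clusters.
import socket
import struct
import threading

def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes, or return b'' if the peer closed the connection."""
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            return b""
        data += chunk
    return data

def forward(src: socket.socket, dst: socket.socket) -> None:
    """Copy length-prefixed messages from src to dst until src closes."""
    while True:
        header = recv_exact(src, 4)
        if not header:
            break
        (length,) = struct.unpack("!I", header)
        payload = recv_exact(src, length)
        dst.sendall(header + payload)

def relay(port_a: int, port_b: int) -> None:
    """Accept one connection from each cluster and shuttle traffic both ways."""
    def accept_on(port: int) -> socket.socket:
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("0.0.0.0", port))
        srv.listen(1)
        conn, _ = srv.accept()
        return conn
    conn_a, conn_b = accept_on(port_a), accept_on(port_b)
    threading.Thread(target=forward, args=(conn_a, conn_b), daemon=True).start()
    forward(conn_b, conn_a)
```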

Relevance: 100.00%

Publisher:

Abstract:

Synchronous telecommunication networks, distributed control systems, and integrated circuits depend, for accurate operation, on a reliable time-base signal extracted from the line data stream and available to each node. In this sense, the existence of a sub-network (inside the main network) dedicated to the distribution of clock signals is crucially important. There are different solutions for the architecture of the time distribution sub-network, and choosing one of them depends on cost, precision, reliability, and operational security. In this work we present: (i) the possible time distribution networks and their usual topologies and arrangements; (ii) how parameters of the network nodes can affect the reachability and stability of the synchronous state of a network; and (iii) optimization methods for synchronous networks that can provide low-cost architectures with operational precision, reliability, and security. (C) 2011 Elsevier B.V. All rights reserved.
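
To make item (ii) concrete, the toy simulation below models a master-slave chain of clocks in which each slave applies a first-order phase correction toward its upstream node; varying the loop gain k shows how a node parameter governs whether the chain reaches and keeps the synchronous state. The model and the numbers are assumptions for illustration, not taken from the paper.

```python
# Toy model (assumed, not from the paper): a master-slave chain of clocks where
# each slave applies a first-order phase correction toward its upstream node.
# The loop gain k stands in for a node parameter affecting reachability and
# stability of the synchronous state.
import numpy as np

def simulate_chain(n_nodes: int = 6, k: float = 0.3, steps: int = 200) -> np.ndarray:
    """Return the worst phase error (vs. the master) at each simulation step."""
    rng = np.random.default_rng(1)
    freq = np.ones(n_nodes)
    freq[1:] += 0.01 * rng.standard_normal(n_nodes - 1)   # slave frequency offsets
    phase = np.zeros(n_nodes)                              # node 0 is the master
    worst_error = np.empty(steps)
    for s in range(steps):
        phase += freq
        for i in range(1, n_nodes):                        # first-order correction
            phase[i] += k * (phase[i - 1] - phase[i])
        worst_error[s] = np.abs(phase[1:] - phase[0]).max()
    return worst_error

# A small gain locks slowly with a large residual error, a moderate gain locks
# tightly, and a gain above this loop's stability limit (k = 2) diverges.
for gain in (0.05, 0.3, 2.1):
    print(f"k = {gain:4.2f}  final error = {simulate_chain(k=gain)[-1]:.3g}")
```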

Relevance: 100.00%

Publisher:

Abstract:

Objectives. This study recorded and evaluated the degree of intra- and inter-group agreement among different examiners for the classification of lower third molars according to both Winter's and Pell & Gregory's systems. Study Design. An observational, cross-sectional study was carried out on forty lower third molars from twenty digital panoramic radiographs. Four examiner groups (undergraduates, maxillofacial surgeons, oral radiologists, and clinical dentists) from Aracaju, Sergipe, Brazil, classified them with respect to angulation, class, and position. Analysis of variance (ANOVA) was applied to the examiners' findings with a significance level of p<0.05 and 95% confidence intervals. Results. Intra- and inter-group agreement was observed in Winter's classification system among all examiners. Pell & Gregory's classification system showed average intra-group agreement and a statistically significant difference for the position variable in the inter-group analysis, with greater disagreement in the clinical dentists group (p<0.05). Conclusions. High reproducibility was associated with Winter's classification, whereas the system proposed by Pell & Gregory did not demonstrate appropriate levels of reliability.

Relevance: 100.00%

Publisher:

Abstract:

In this paper we point out some aspects of workers' activities in offshore units in the oil industry. These units have become more verticalized and have a greater number of operating systems. Our goal is to present the main difficulties that workers face in these units.

Relevance: 50.00%

Publisher:

Abstract:

A complete census of planetary systems around a volume-limited sample of solar-type stars (FGK dwarfs) in the Solar neighborhood (d ≤ 15 pc), with uniform sensitivity down to Earth-mass planets within their Habitable Zones out to several AU, would be a major milestone in extrasolar planet astrophysics. This fundamental goal can be achieved with a mission concept such as NEAT, the Nearby Earth Astrometric Telescope. NEAT is designed to carry out space-borne, extremely-high-precision astrometric measurements at the 0.05 μas (1 sigma) accuracy level, sufficient to detect dynamical effects due to orbiting planets of mass even lower than Earth's around the nearest stars. Such a survey mission would provide the actual planetary masses and the full orbital geometry for all the components of the detected planetary systems down to the Earth-mass limit. The NEAT performance limits can be achieved by carrying out differential astrometry between the targets and a set of suitable reference stars in the field. The NEAT instrument design consists of an off-axis parabola single-mirror telescope (D = 1 m); a large-field-of-view detector located 40 m away from the telescope, made of 8 small movable CCDs arranged around a fixed central CCD; and an interferometric calibration system monitoring dynamical Young's fringes originating from metrology fibers located at the primary mirror. The mission profile is driven by the fact that the two main modules of the payload, the telescope and the focal plane, must be located 40 m apart, leading to the choice of a formation-flying option as the reference mission and a deployable-boom option as an alternative. The proposed mission architecture relies on two satellites, of about 700 kg each, operating at L2 for 5 years, flying in formation and offering a capability of more than 20,000 reconfigurations. The two satellites will be launched in a stacked configuration using a Soyuz ST launch vehicle. The NEAT primary science program will encompass an astrometric survey of our 200 closest F-, G-, and K-type stellar neighbors, with an average of 50 visits each distributed over the nominal mission duration. The main survey operation will use approximately 70% of the mission lifetime. The remaining 30% of NEAT observing time might be allocated, for example, to improve the characterization of the architecture of selected planetary systems around nearby targets of specific interest (low-mass stars, young stars, etc.) discovered by Gaia, ground-based high-precision radial-velocity surveys, and other programs. With its exquisite astrometric precision, NEAT holds the promise of providing the first thorough census of Earth-mass planets around stars in the immediate vicinity of our Sun.
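
As a back-of-the-envelope check of the quoted 0.05 μas requirement: the astrometric signature of a planet is approximately α ≈ (M_p/M_*)(a/d), with a in AU and d in pc giving α in arcseconds, so an Earth analog around a Sun-like star at 10 pc produces about 0.3 μas. The short Python sketch below runs this calculation; the noise-averaging step at the end assumes independent per-visit errors and is an illustration, not a mission requirement.

```python
# Illustrative calculation (not from the paper): astrometric signature
# alpha ≈ (M_p / M_*) * (a[AU] / d[pc]) in arcseconds, plus simple sqrt(N)
# averaging of the quoted per-visit precision over 50 visits.
M_EARTH_OVER_SUN = 3.003e-6   # Earth/Sun mass ratio

def astrometric_signature_uas(mass_ratio: float, a_au: float, d_pc: float) -> float:
    """Host-star wobble amplitude in microarcseconds."""
    return mass_ratio * (a_au / d_pc) * 1e6

# Earth analog (1 M_Earth at 1 AU) around a Sun-like star at 10 pc:
print(f"signal ≈ {astrometric_signature_uas(M_EARTH_OVER_SUN, 1.0, 10.0):.2f} uas")  # ~0.30 uas

# 50 visits at 0.05 uas each, assuming independent errors:
per_visit_uas, visits = 0.05, 50
print(f"noise  ≈ {per_visit_uas / visits ** 0.5:.3f} uas")
```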