60 results for Distributed computer-controlled systems


Relevance:

40.00%

Publisher:

Abstract:

In this work, we take advantage of association rule mining to support two types of medical systems: Content-based Image Retrieval (CBIR) systems and Computer-Aided Diagnosis (CAD) systems. For content-based retrieval, association rules are employed to reduce the dimensionality of the feature vectors that represent the images and to improve the precision of the similarity queries. We refer to the association rule-based method proposed here to improve CBIR systems as Feature selection through Association Rules (FAR). To improve CAD systems, we propose the Image Diagnosis Enhancement through Association rules (IDEA) method. Association rules are employed to suggest a second opinion to the radiologist or a preliminary diagnosis of a new image. A second opinion obtained automatically can either accelerate the diagnostic process or strengthen a hypothesis, increasing the probability that a prescribed treatment will be successful. Two new algorithms are proposed to support the IDEA method: one to pre-process low-level features and another to propose a preliminary diagnosis based on association rules. We performed several experiments to validate the proposed methods. The results indicate that association rules can be successfully applied to improve CBIR and CAD systems, strengthening the arsenal of techniques to support medical image analysis in medical systems. (C) 2009 Elsevier B.V. All rights reserved.
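
The mechanism can be illustrated with a small, hedged sketch. The Python below is not the authors' IDEA implementation: the transactions, item names, thresholds and the confidence-weighted vote are illustrative assumptions. It only shows the general idea of mining "feature-bin -> diagnosis" association rules from discretized image features and using them to suggest a preliminary diagnosis for a new image.

    # Hedged sketch, not the IDEA algorithms themselves: toy association-rule
    # mining over discretized image features, then a confidence-weighted vote
    # to suggest a preliminary diagnosis for a new image.
    from itertools import combinations
    from collections import Counter

    # Toy transactions: discretized feature items plus one diagnosis label each.
    transactions = [
        {"texture=high", "shape=round", "dx=benign"},
        {"texture=high", "shape=round", "dx=benign"},
        {"texture=low", "shape=irregular", "dx=malignant"},
        {"texture=low", "shape=irregular", "dx=malignant"},
        {"texture=low", "shape=round", "dx=benign"},
    ]
    LABELS = {"dx=benign", "dx=malignant"}
    MIN_SUPPORT, MIN_CONFIDENCE = 0.2, 0.8

    def mine_rules(data):
        """Return rules (antecedent -> label, confidence) above both thresholds."""
        n = len(data)
        features = sorted(set().union(*data) - LABELS)
        rules = []
        for size in (1, 2):
            for ante in combinations(features, size):
                covered = [t for t in data if set(ante) <= t]
                if len(covered) / n < MIN_SUPPORT:
                    continue
                label, hits = Counter(
                    next(iter(t & LABELS)) for t in covered).most_common(1)[0]
                confidence = hits / len(covered)
                if confidence >= MIN_CONFIDENCE:
                    rules.append((set(ante), label, confidence))
        return rules

    def suggest_diagnosis(rules, new_image_items):
        """Confidence-weighted vote among rules whose antecedents hold."""
        votes = Counter()
        for ante, label, confidence in rules:
            if ante <= new_image_items:
                votes[label] += confidence
        return votes.most_common(1)[0][0] if votes else "no suggestion"

    rules = mine_rules(transactions)
    print(suggest_diagnosis(rules, {"texture=low", "shape=irregular"}))  # dx=malignant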

Relevance:

40.00%

Publisher:

Abstract:

The evolution of commodity computing led to the possibility of efficiently using interconnected machines to solve computationally intensive tasks that were previously solvable only by expensive supercomputers. This, however, required new methods for process scheduling and distribution that consider network latency, communication cost, heterogeneous environments and distributed computing constraints. An efficient distribution of processes over such environments requires an adequate scheduling strategy, as the cost of inefficient process allocation is unacceptably high. Therefore, knowledge and prediction of application behavior are essential to perform effective scheduling. In this paper, we overview the evolution of scheduling approaches, focusing on distributed environments. We also evaluate the current approaches for process behavior extraction and prediction, aiming at selecting an adequate technique for online prediction of application execution. Based on this evaluation, we propose a novel model for application behavior prediction, considering the chaotic properties of such behavior and the automatic detection of critical execution points. The proposed model is applied and evaluated for process scheduling in cluster and grid computing environments. The obtained results demonstrate that prediction of process behavior is essential for efficient scheduling in large-scale and heterogeneous distributed environments, outperforming conventional scheduling policies by a factor of 10, and even more in some cases. Furthermore, the proposed approach proves to be efficient for online predictions due to its low computational cost and good precision. (C) 2009 Elsevier B.V. All rights reserved.
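
As a hedged illustration of why behavior prediction matters for scheduling (this is not the paper's chaotic-behavior model; the smoothing predictor, node attributes and numbers are assumptions), the sketch below dispatches each incoming process to the node with the earliest expected completion time, using an online exponentially weighted estimate of the process's demand.

    # Hedged sketch, not the paper's prediction model: exponential smoothing of
    # observed per-application demand, used to pick the node expected to finish
    # the next process earliest.
    class BurstPredictor:
        """Online exponentially weighted estimate of per-application CPU demand."""
        def __init__(self, alpha=0.5, default=1.0):
            self.alpha, self.default, self.estimates = alpha, default, {}

        def predict(self, app):
            return self.estimates.get(app, self.default)

        def update(self, app, observed):
            self.estimates[app] = (self.alpha * observed
                                   + (1 - self.alpha) * self.predict(app))

    def dispatch(app, nodes, predictor):
        """Assign 'app' to the node with the earliest expected completion time."""
        demand = predictor.predict(app)
        finish = lambda n: nodes[n]["queued"] + demand / nodes[n]["speed"]
        chosen = min(nodes, key=finish)
        nodes[chosen]["queued"] = finish(chosen)
        return chosen

    predictor = BurstPredictor()
    nodes = {"node-a": {"queued": 0.0, "speed": 1.0},
             "node-b": {"queued": 2.5, "speed": 2.0}}
    print(dispatch("render-job", nodes, predictor))   # -> node-a
    predictor.update("render-job", observed=3.2)      # refine estimate after completion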

Relevance:

40.00%

Publisher:

Abstract:

In chemical analyses performed by laboratories, one faces the problem of determining the concentration of a chemical element in a sample. In practice, this problem is handled with the so-called linear calibration model, which assumes that the errors associated with the independent variables are negligible compared with those of the dependent variable. In this work, a new linear calibration model is proposed, assuming that the independent variables are subject to heteroscedastic measurement errors. A simulation study is carried out to verify some properties of the estimators derived for the new model, and the usual calibration model is also considered for comparison with the new approach. Three applications are presented to verify the performance of the new approach. Copyright (C) 2010 John Wiley & Sons, Ltd.
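
For readers who want the shape of such a model, the LaTeX sketch below writes a generic structural calibration model with heteroscedastic errors in the covariate; the symbols and distributional choices are illustrative assumptions, not necessarily the paper's exact specification.

    % Hedged sketch of a generic calibration model with heteroscedastic
    % measurement error in the covariate (the paper's specification may differ).
    \begin{align*}
      y_i &= \alpha + \beta x_i + e_i, & e_i &\sim N(0,\sigma^2),\\
      X_i &= x_i + u_i,                & u_i &\sim N(0,\sigma_{u_i}^2), \qquad i = 1,\dots,n,
    \end{align*}
    % where only $(y_i, X_i)$ are observed and the error variances
    % $\sigma_{u_i}^2$ change across observations. In the prediction stage the
    % unknown concentration $x_0$ of a new sample with response $y_0$ is
    % estimated by inverting the fitted line:
    \[
      \hat{x}_0 = \frac{y_0 - \hat{\alpha}}{\hat{\beta}}.
    \]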

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to evaluate the influence of intrapulpal pressure simulation on the bonding effectiveness of etch & rinse and self-etch adhesives to dentin. Eighty sound human molars were distributed into eight groups according to the permeability level of each sample, measured by an apparatus that assesses hydraulic conductance (Lp), so that a similar mean permeability was achieved in each group. Three etch & rinse adhesives (Prime & Bond NT - PB, Single Bond - SB, and Excite - EX) and one self-etch system (Clearfil SE Bond - SE) were employed, varying the presence or absence of an intrapulpal pressure (IPP) simulation of 15 cmH2O. After the adhesive and restorative procedures were carried out, the samples were stored in distilled water for 24 hours at 37°C and subjected to tensile bond strength (TBS) testing. Fracture analysis was performed using a light microscope at 40× magnification. The data, obtained in MPa, were then submitted to the Kruskal-Wallis test (α = 0.05). The results revealed that the TBS of SB and EX was significantly reduced under IPP simulation, differing from the TBS of PB and SE. Moreover, SE obtained the highest bond strength values in the presence of IPP. It could be concluded that IPP simulation can influence the bond strength of certain adhesive systems to dentin and should be considered when in vitro studies are conducted.

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes an architecture for machining process and production monitoring to be applied in machine tools with open computer numerical control (CNC). A brief description of the advantages of using open CNC for machining process and production monitoring is presented, with an emphasis on the CNC architecture using a personal computer (PC)-based human-machine interface. The proposed architecture uses the CNC data and sensors to gather information about the machining process and production. It allows the development of different levels of monitoring systems with minimum investment, minimum need for sensor installation, and low intrusiveness to the process. Successful examples of the utilization of this architecture in a laboratory environment are briefly described. In conclusion, it is shown that a wide range of monitoring solutions can be implemented in production processes using the proposed architecture.
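
The following Python sketch only illustrates the spirit of the proposal; the variable names (spindle load, feed rate, vibration) and the threshold are hypothetical, and no real open-CNC API is being invoked. It shows a PC-side task that combines internal CNC data with one external sensor and serves both process-level and production-level monitoring.

    # Hedged sketch with hypothetical data sources (no real open-CNC API):
    # a PC-based task polls internal CNC variables and one external sensor and
    # applies simple process- and production-monitoring rules.
    import random
    import time

    def read_cnc_variables():
        """Placeholder for internal CNC data (e.g. spindle load, feed rate)."""
        return {"spindle_load_pct": random.uniform(20, 95),
                "feed_rate_mm_min": 800.0}

    def read_external_sensor():
        """Placeholder for a low-cost external sensor (e.g. vibration RMS)."""
        return {"vibration_rms": random.uniform(0.1, 1.2)}

    def monitor_once(production_counter, load_limit_pct=90.0):
        sample = {**read_cnc_variables(), **read_external_sensor()}
        if sample["spindle_load_pct"] > load_limit_pct:   # process-level monitoring
            print("warning: possible tool wear or overload:", sample)
        production_counter["cycles"] += 1                 # production-level monitoring
        return sample

    counter = {"cycles": 0}
    for _ in range(3):
        monitor_once(counter)
        time.sleep(0.1)
    print("machining cycles observed:", counter["cycles"])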

Relevance:

30.00%

Publisher:

Abstract:

Consider N sites randomly and uniformly distributed in a d-dimensional hypercube. A walker explores this disordered medium by going to the nearest site that has not been visited in the last μ (memory) steps. The walker trajectory is composed of a transient part and a periodic part (cycle). For one-dimensional systems, travelers may or may not explore all the available space, giving rise to a crossover between localized and extended regimes at the critical memory μ₁ = log₂ N. The deterministic rule can be softened to consider more realistic situations with the inclusion of a stochastic parameter T (temperature). In this case, the walker movement is driven by a probability density function parameterized by T and a cost function. The cost function increases with the distance between two sites and favors hops to closer sites. As the temperature increases, the walker can escape from cycles that are reminiscent of the deterministic nature and extend the exploration. Here, we report an analytical model and numerical studies of the influence of the temperature and the critical memory on the exploration of one-dimensional disordered systems.
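
A minimal numerical sketch of the walk just described is given below; the one-dimensional sites, the exponential weight exp(-d/T) used as the stochastic rule, and the parameter values are illustrative assumptions rather than the paper's exact cost function.

    # Hedged sketch of the walk: N random sites on a line, deterministic
    # nearest-unvisited rule for T = 0 and an exp(-d/T)-weighted choice for T > 0.
    import math
    import random

    def tourist_walk(n_sites=64, mu=6, T=0.0, n_steps=200, seed=1):
        rng = random.Random(seed)
        sites = [rng.random() for _ in range(n_sites)]   # 1-D disordered medium
        current, recent, trajectory = 0, [0], [0]        # 'recent' = memory window
        for _ in range(n_steps):
            allowed = [i for i in range(n_sites) if i not in recent]
            dists = [abs(sites[i] - sites[current]) for i in allowed]
            if T <= 0:                                   # deterministic rule
                nxt = allowed[dists.index(min(dists))]
            else:                                        # temperature-softened rule
                weights = [math.exp(-d / T) for d in dists]
                nxt = rng.choices(allowed, weights=weights, k=1)[0]
            current = nxt
            trajectory.append(current)
            recent.append(current)
            if len(recent) > mu:                         # forget sites older than mu steps
                recent.pop(0)
        return trajectory

    # Compare exploration with and without temperature; mu = log2(64) = 6 is the
    # critical memory mentioned in the abstract.
    print("distinct sites, T=0:   ", len(set(tourist_walk(T=0.0))))
    print("distinct sites, T=0.05:", len(set(tourist_walk(T=0.05))))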

Relevance:

30.00%

Publisher:

Abstract:

Background: The Borg Scale may be a useful tool for heart failure patients to self-monitor and self-regulate exercise on land or in water (hydrotherapy) by maintaining the heart rate (HR) between the anaerobic threshold and the respiratory compensation point. Methods and Results: Patients performed a cardiopulmonary exercise test to determine their anaerobic threshold and respiratory compensation points. The percentage of the mean HR during the exercise session was calculated in relation to the anaerobic threshold HR (%EHR-AT), the respiratory compensation point HR (%EHR-RCP), the peak HR from the exercise test (%EHR-Peak) and the maximum predicted HR (%EHR-Predicted). Next, patients were randomized into the land or water exercise group. One blinded investigator instructed the patients in each group to exercise at a level between "relatively easy and slightly tiring". The mean HR throughout the 30-min exercise session was recorded. The %EHR-AT and %EHR-Predicted did not differ between the land and water exercise groups, but the groups differed in the %EHR-RCP (95 ± 7 vs 86 ± 7, P<0.001) and in the %EHR-Peak (85 ± 8 vs 78 ± 9, P=0.007). Conclusions: Exercise guided by the Borg scale maintains the patient's HR between the anaerobic threshold and the respiratory compensation point (ie, in the exercise training zone). (Circ J 2009; 73: 1871-1876)
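
Reading off the abbreviations, the reported percentages appear to be simple ratios of the mean exercise heart rate to each reference heart rate; the LaTeX below spells that reading out, and it is an inference from the abstract rather than the paper's stated formula.

    % Hedged reading of the reported percentages (inferred from the
    % abbreviations; the paper may define them slightly differently):
    \[
      \%\mathrm{EHR\text{-}AT} = 100 \,
        \frac{\overline{\mathrm{HR}}_{\mathrm{exercise}}}{\mathrm{HR}_{\mathrm{AT}}},
      \qquad
      \%\mathrm{EHR\text{-}RCP} = 100 \,
        \frac{\overline{\mathrm{HR}}_{\mathrm{exercise}}}{\mathrm{HR}_{\mathrm{RCP}}},
    \]
    % with analogous ratios for the peak and the maximum predicted heart rates.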

Relevance:

30.00%

Publisher:

Abstract:

Several experimental studies have altered the phase relationship between photic and non-photic environmental 24 h cycles (zeitgebers) in order to assess their role in the synchronization of circadian rhythms. To assist in the interpretation of the complex activity patterns that emerge from these "conflicting zeitgeber" protocols, we present computer simulations of coupled circadian oscillators forced by two independent zeitgebers. This circadian system configuration was first employed by Pittendrigh and Bruce (1959) to model their studies of the light and temperature entrainment of the eclosion oscillator in Drosophila. Whereas most of the recent experiments have restricted conflicting zeitgeber experiments to two experimental conditions, comparing circadian oscillator phases under two distinct phase relationships between zeitgebers (usually 0 and 12 h), Pittendrigh and Bruce compared eclosion phase under 12 distinct phase relationships spanning the 24 h interval. Our simulations using non-linear differential equations replicated complex non-linear phenomena, such as "phase jumps" and sudden switches in zeitgeber preference, which had previously been difficult to interpret. Our simulations reveal that these phenomena generally arise when inter-oscillator coupling is high in relation to the zeitgeber strength. Manipulations of the structural symmetry of the model indicated that these results can be expected to apply to a wide range of system configurations. Finally, our studies recommend the use of the complete protocol employed by Pittendrigh and Bruce, because different system configurations can generate similar results when a "conflicting zeitgeber" experiment incorporates only two phase relationships between zeitgebers.
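
To make the setup concrete, the short Python sketch below integrates a toy phase-only version of two mutually coupled oscillators, each forced by one of two 24 h zeitgebers whose offset can be varied; the phase model, coupling and forcing strengths are illustrative assumptions, not the paper's differential equations.

    # Hedged toy model (phase-only; not the paper's equations): two coupled
    # circadian oscillators, each forced by one of two 24 h zeitgebers whose
    # mutual phase offset psi_hours is the "conflicting zeitgeber" variable.
    import math

    def simulate(psi_hours=12.0, coupling=0.05, forcing=0.15,
                 omega1=2 * math.pi / 23.5, omega2=2 * math.pi / 24.5,
                 dt=0.01, t_end=2400.0):
        """Euler-integrate the two phases; return their final difference in hours."""
        w_z = 2 * math.pi / 24.0                  # zeitgeber angular frequency
        psi = 2 * math.pi * psi_hours / 24.0      # offset between the two zeitgebers
        th1 = th2 = t = 0.0
        while t < t_end:
            z1 = forcing * math.sin(w_z * t - th1)         # "light" acts on oscillator 1
            z2 = forcing * math.sin(w_z * t + psi - th2)   # "temperature" acts on oscillator 2
            d1 = omega1 + coupling * math.sin(th2 - th1) + z1
            d2 = omega2 + coupling * math.sin(th1 - th2) + z2
            th1, th2, t = th1 + d1 * dt, th2 + d2 * dt, t + dt
        return ((th1 - th2) % (2 * math.pi)) * 24.0 / (2 * math.pi)

    for psi in (0.0, 6.0, 12.0):
        print(f"zeitgeber offset {psi:4.1f} h -> "
              f"internal phase difference {simulate(psi_hours=psi):5.2f} h")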

Relevance:

30.00%

Publisher:

Abstract:

In this Letter we extend current perspectives in engineering reservoirs by producing a time-dependent master equation leading to a nonstationary superposition equilibrium state that can be nonadiabatically controlled by the system-reservoir parameters. Working with an ion trapped inside a nonideal cavity, we first engineer effective interactions, which allow us to achieve two classes of decoherence-free evolution of superpositions of the ground and excited ionic levels: those with a time-dependent azimuthal or polar angle. As an application, we generalize the purpose of an earlier study [Phys. Rev. Lett. 96, 150403 (2006)], showing how to observe the geometric phases acquired by the protected nonstationary states even under nonadiabatic evolution.
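
For orientation, the block below writes the generic Lindblad form that such an engineered, time-dependent master equation takes; the specific Hamiltonian, rate and jump operator of the trapped-ion scheme are not reproduced here, so the symbols should be read as placeholders.

    % Hedged sketch of the generic time-dependent Lindblad form underlying such
    % reservoir engineering (the paper's specific operators are not shown here):
    \[
      \dot{\rho}(t) = -\frac{i}{\hbar}\bigl[H(t),\rho(t)\bigr]
        + \Gamma(t)\Bigl( L(t)\,\rho(t)\,L^{\dagger}(t)
        - \tfrac{1}{2}\bigl\{ L^{\dagger}(t)\,L(t),\,\rho(t)\bigr\}\Bigr),
    \]
    % where the engineered jump operator $L(t)$ is chosen so that the target
    % superposition of the ionic levels is a (possibly nonstationary) dark
    % state, $L(t)\,|\psi(t)\rangle = 0$, and is thus protected from decoherence.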

Relevance:

30.00%

Publisher:

Abstract:

A novel solid phase extraction technique is described where DNA is bound and eluted from magnetic silica beads in a manner where efficiency is dependent on the magnetic manipulation of the beads and not on the flow of solution through a packed bed. The utility of this technique in the isolation of reasonably pure, PCR-amplifiable DNA from complex samples is shown by isolating DNA from whole human blood, and subsequently amplifying a fragment of the beta-globin gene. By effectively controlling the movement of the solid phase in the presence of a static sample, the issues associated with reproducibly packing a solid phase in a microchannel and maintaining consistent flow rates are eliminated. The technique described here is rapid, simple, and efficient, allowing for recovery of more than 60% of DNA from 0.6 μL of blood at a concentration which is suitable for PCR amplification. In addition, the technique presented here requires inexpensive, common laboratory equipment, making it easily adopted for both clinical point-of-care applications and on-site forensic sample analysis.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, artificial neural networks are employed in a novel approach to identify harmonic components of single-phase nonlinear load currents, whose amplitude and phase angle are subject to unpredictable changes, even in steady-state. The first six harmonic current components are identified through the analysis of variations in waveform characteristics. The effectiveness of this method is tested by applying it to the model of a single-phase active power filter, dedicated to the selective compensation of harmonic current drained by an AC controller. Simulation and experimental results are presented to validate the proposed approach. (C) 2010 Elsevier B.V. All rights reserved.
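
As a hedged point of reference (this is not the paper's neural-network method; the sampling rate, synthetic current and least-squares estimator are assumptions), the numpy sketch below computes the amplitudes and phase angles of the first six harmonics of a sampled load current, i.e. exactly the quantities such a network is trained to track online.

    # Hedged reference computation, not the neural-network identifier: amplitudes
    # and phases of harmonics 1..6 of a sampled current, by least squares.
    import numpy as np

    F0, FS, N_HARM = 60.0, 7680.0, 6            # fundamental, sampling rate, harmonics
    t = np.arange(0, 1 / F0, 1 / FS)            # one fundamental cycle of samples

    # Synthetic nonlinear-load current: fundamental plus 3rd and 5th harmonics.
    i_load = (10.0 * np.sin(2 * np.pi * F0 * t + 0.1)
              + 3.0 * np.sin(2 * np.pi * 3 * F0 * t + 0.8)
              + 1.5 * np.sin(2 * np.pi * 5 * F0 * t - 0.4))

    # Design matrix with sine/cosine columns for harmonics 1..6, solved by least squares.
    cols = []
    for h in range(1, N_HARM + 1):
        cols += [np.sin(2 * np.pi * h * F0 * t), np.cos(2 * np.pi * h * F0 * t)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, i_load, rcond=None)

    for h in range(1, N_HARM + 1):
        s, c = coef[2 * (h - 1)], coef[2 * (h - 1) + 1]
        amp, phase = np.hypot(s, c), np.arctan2(c, s)   # i_h = amp*sin(h*w*t + phase)
        print(f"harmonic {h}: amplitude {amp:5.2f} A, phase {phase:+5.2f} rad")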

Relevance:

30.00%

Publisher:

Abstract:

Recent advances in energy generation technology and new directions in electricity regulation have made distributed generation (DG) more widespread, with significant impacts on the operational characteristics of distribution networks. For this reason, new methods for identifying such impacts are needed, together with research and development of new tools and resources to maintain and facilitate continued expansion towards DG. This paper presents a study aimed at determining appropriate DG sites for distribution systems. The main considerations that determine DG sites are also presented, together with an account of the advantages gained from correct DG placement. The paper defines quantitative and qualitative parameters evaluated with the Digsilent®, GARP3® and DSA-GD software. A multi-objective approach based on the Bellman-Zadeh algorithm and fuzzy logic is used to determine appropriate DG sites. The study also aims to find acceptable DG locations both for distribution system feeders and for nodes inside a given feeder. (C) 2010 Elsevier Ltd. All rights reserved.
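
The Bellman-Zadeh step can be illustrated compactly; in the Python sketch below the candidate nodes, objective values and membership functions are toy assumptions, and only the max-min decision rule reflects the approach named in the abstract.

    # Hedged sketch of the Bellman-Zadeh max-min idea (toy data, not the paper's
    # criteria or software): each candidate site gets a fuzzy membership per
    # objective, and the preferred site maximizes the smallest membership.
    candidates = {
        "node-12": {"losses_kw": 110.0, "voltage_gain": 0.8},
        "node-27": {"losses_kw": 95.0,  "voltage_gain": 0.5},
        "node-40": {"losses_kw": 130.0, "voltage_gain": 0.9},
    }

    def memberships(values, maximize):
        """Linear membership in [0, 1] across the candidate set."""
        lo, hi = min(values.values()), max(values.values())
        span = (hi - lo) or 1.0
        return {k: ((v - lo) / span if maximize else (hi - v) / span)
                for k, v in values.items()}

    mu_loss = memberships({k: c["losses_kw"] for k, c in candidates.items()},
                          maximize=False)
    mu_volt = memberships({k: c["voltage_gain"] for k, c in candidates.items()},
                          maximize=True)

    # Bellman-Zadeh decision: intersection of fuzzy objectives = min membership.
    decision = {k: min(mu_loss[k], mu_volt[k]) for k in candidates}
    best = max(decision, key=decision.get)
    print("memberships:", decision)
    print("preferred DG site:", best)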

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a novel graphical approach to adjust and evaluate frequency-based relays employed in anti-islanding protection schemes of distributed synchronous generators, in order to meet the anti-islanding and abnormal frequency variation requirements simultaneously. The proposed method defines a region in the power mismatch space inside which the relay non-detection zone should be located if the above-mentioned requirements are to be met. Such a region is called the power imbalance application region. Results show that this method can help protection engineers adjust frequency-based relays to improve anti-islanding capability and minimize false operation occurrences, while keeping the abnormal frequency variation utility requirements satisfied. Moreover, the proposed method can be employed to coordinate different types of frequency-based relays, aiming at improving the overall performance of the distributed generator frequency protection scheme. (C) 2011 Elsevier B.V. All rights reserved.
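
The physics behind such a power-mismatch region can be summarized with the standard swing-equation relation below; this is hedged background context, not the paper's derivation, and the symbols (inertia constant H, generator rating, frequency band limits) are generic.

    % Hedged background (not the paper's derivation): after islanding, the
    % active power mismatch drives the generator frequency through the swing
    % equation,
    \[
      \frac{2H}{f_0}\,\frac{df}{dt} = \frac{\Delta P}{S_{\mathrm{gen}}},
    \]
    % and the relay trips only if $f$ leaves the band $[f_{\min}, f_{\max}]$
    % within the allowed detection time. Mismatches $\Delta P$ too small to push
    % $f$ outside that band form the relay non-detection zone in the
    % power-mismatch space, which should lie inside the power imbalance
    % application region defined by the paper.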

Relevance:

30.00%

Publisher:

Abstract:

The design of supplementary damping controllers to mitigate the effects of electromechanical oscillations in power systems is a highly complex and time-consuming process, which requires a significant amount of knowledge on the part of the designer. In this study, the authors propose an automatic technique that takes the burden of tuning the controller parameters away from the power engineer and places it on the computer. Unlike other approaches that do the same based on robust control theories or evolutionary computing techniques, the proposed procedure uses an optimisation algorithm that works over a formulation of the classical tuning problem in terms of bilinear matrix inequalities. Using this formulation, it is possible to apply linear matrix inequality solvers to find a solution to the tuning problem via an iterative process, with the advantage that these solvers are widely available and have well-known convergence properties. The proposed algorithm is applied to tune the parameters of supplementary controllers for thyristor-controlled series capacitors placed in the New England/New York benchmark test system, aiming at improving the damping factor of inter-area modes under several different operating conditions. The results of the linear analysis are validated by non-linear simulation and demonstrate the effectiveness of the proposed procedure.
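
The bilinearity mentioned above can be seen in compact, hedged form below; the decay-rate constraint shown is a common way of imposing damping and may be simpler than the study's actual constraints, so it should be read as a sketch of the structure rather than the exact formulation.

    % Hedged sketch of the structure of the problem (the study's constraints may
    % be richer): with closed-loop matrix $A_{c}(K) = A + B K C$ for a static
    % output-feedback gain $K$, a minimum decay rate $\alpha$ requires
    \[
      A_{c}(K)^{\mathsf T} P + P\,A_{c}(K) + 2\alpha P \prec 0, \qquad P \succ 0,
    \]
    % which is bilinear because $P$ multiplies $K$. Fixing $K$ makes it an LMI
    % in $P$; fixing $P$ makes it an LMI in $K$; alternating these two LMI
    % solves is the kind of iterative process the abstract refers to.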

Relevance:

30.00%

Publisher:

Abstract:

The main purpose of this paper is to present the architecture of an automated system that allows real-time (online) monitoring and tracking of the possible occurrence of faults and electromagnetic transients observed in primary power distribution networks. Through the interconnection of this automated system to the utility operation center, it will be possible to provide an efficient tool to assist decision-making by the operation center. In short, the aim is to provide all the tools necessary to identify, almost instantaneously, the occurrence of faults and transient disturbances in the primary power distribution system, as well as to determine their respective origin and probable location. The compiled results from the application of this automated system show that the developed techniques provide accurate results, identifying and locating several occurrences of faults observed in the distribution system.