66 results for reliability algorithms
Abstract:
This technical note develops information filter and array algorithms for a linear minimum mean square error estimator of discrete-time Markovian jump linear systems. A numerical example for a two-mode Markovian jump linear system is provided to show the advantage of using array algorithms to filter this class of systems.
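As a rough illustration of the information-form filtering the note builds on, the sketch below runs one-step predict/update cycles for a hypothetical single-mode scalar system x_{k+1} = a x_k + w, z_k = h x_k + v, propagating the information pair Y = P^-1 and y = Y x_hat rather than the covariance. The parameters a, h, q, r and the measurements are invented; the paper's jump-system and array variants are not reproduced here.

```python
# Minimal information-filter sketch for a scalar linear system (hypothetical
# parameters). The information filter carries Y = P^-1 and y = Y * x_hat,
# which is the representation the note extends to Markovian jump systems.

def info_filter_step(y, Y, z, a=0.9, h=1.0, q=0.1, r=0.5):
    # Recover the estimate, predict in covariance form, then convert the
    # measurement update back to information form (scalar case for clarity).
    x = y / Y
    P = 1.0 / Y
    x_pred = a * x
    P_pred = a * P * a + q
    # Information update: add h * R^-1 * h to Y and h * R^-1 * z to y.
    Y_new = 1.0 / P_pred + h * (1.0 / r) * h
    y_new = x_pred / P_pred + h * (1.0 / r) * z
    return y_new, Y_new

y, Y = 0.0, 1.0            # initial information pair (x_hat = 0, P = 1)
for z in [1.0, 0.8, 1.2]:  # made-up measurement sequence
    y, Y = info_filter_step(y, Y, z)
print(y / Y)               # filtered state estimate
```

The information form is convenient when measurements arrive from several sources, since their information contributions simply add.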
Abstract:
The continuous growth of peer-to-peer networks has made them responsible for a considerable portion of current Internet traffic. For this reason, improvements in the usage of P2P network resources are of central importance. One effective approach to this issue is the deployment of locality algorithms, which allow the system to optimize its peer-selection policy for different network situations and thus maximize performance. To date, several locality algorithms have been proposed for use in P2P networks. However, they usually adopt heterogeneous criteria for measuring the proximity between peers, which hinders a coherent comparison between the different solutions. In this paper, we present a thorough review of popular locality algorithms based on three main characteristics: the adopted network architecture, the distance metric, and the resulting peer-selection algorithm. As a result of this study, we propose a novel and generic taxonomy for locality algorithms in peer-to-peer networks, aiming to enable a better and more coherent evaluation of any individual locality algorithm.
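The peer-selection step that all such locality algorithms share can be sketched as ranking candidate peers by a distance metric and preferring the closest. The metric, peer names and RTT values below are illustrative assumptions, not taken from any surveyed algorithm.

```python
# Hypothetical sketch of locality-aware peer selection: rank candidates by
# a measured distance metric (here round-trip time in ms) and keep the k
# closest. All names and values are made up for illustration.

def select_peers(candidates, k):
    """candidates: dict peer_id -> distance (e.g. RTT in ms); returns the
    k peers with the smallest distance."""
    return sorted(candidates, key=candidates.get)[:k]

peers = {"A": 120.0, "B": 15.0, "C": 48.0, "D": 300.0}
print(select_peers(peers, 2))  # the two nearest peers
```

Swapping the dictionary values for hop counts or AS-path lengths changes the distance metric without changing the selection policy, which is exactly the axis the proposed taxonomy separates.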
Abstract:
This paper presents a computational implementation of an evolutionary algorithm (EA) to tackle the problem of reconfiguring radial distribution systems. The developed module considers power quality indices such as long-duration interruptions and customer process disruptions due to voltage sags, using the Monte Carlo simulation method. Power quality costs are modeled into the mathematical problem formulation and added to the cost of network losses. For the proposed EA codification, a decimal representation is used. The EA operators considered for the reconfiguration algorithm, namely selection, recombination and mutation, are analyzed herein. Several selection procedures are examined, namely tournament, elitism and a mixed technique using both elitism and tournament. The recombination operator was developed by considering a chromosome structure representation that maps the network branches and system radiality, and another structure that takes into account the network topology and the feasibility of network operation when exchanging genetic material. The topologies of the initial population are randomly produced so that radial configurations are generated through the Prim and Kruskal algorithms, which rapidly build minimum spanning trees. (C) 2009 Elsevier B.V. All rights reserved.
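Of the selection procedures analyzed, tournament selection is the simplest to sketch: draw a few individuals at random and keep the one with the best (here, lowest) cost. The population, codings and cost values below are invented; in the paper the cost would combine network losses and power quality costs.

```python
import random

# Sketch of a tournament selection operator (one of the selection
# procedures discussed above). Fitness is a cost to minimize, so the
# tournament keeps the cheaper contender. Individuals and costs are
# hypothetical stand-ins for network-configuration chromosomes.

def tournament(population, cost, size=2, rng=random):
    contenders = rng.sample(population, size)
    return min(contenders, key=cost)

population = [[1, 0, 1], [0, 1, 1], [1, 1, 0]]   # toy chromosomes
costs = {tuple(ind): c for ind, c in zip(population, [10.0, 7.5, 9.0])}
winner = tournament(population, cost=lambda ind: costs[tuple(ind)])
print(winner)
```

With a tournament of size 2 over three individuals, the worst individual can never win, which gives the operator its selection pressure while preserving diversity.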
Abstract:
The most widely used refrigeration system is the vapor-compression system. In this cycle, the compressor is the most complex and expensive component, especially the reciprocating semihermetic type, which is often used in food product conservation. This component is very sensitive to variations in its operating conditions. If these conditions reach unacceptable levels, failures are practically inevitable. Therefore, maintenance actions should be taken in order to maintain good compressor performance and to avoid undesirable stops of the system. To achieve this goal, one has to evaluate the reliability of the system and/or its components. In this context, reliability means the probability that equipment can perform its required functions for an established time period under defined operating conditions. One of the tools used to improve component reliability is failure mode and effect analysis (FMEA). This paper proposes that the FMEA methodology be used as a tool to evaluate the main failures found in semihermetic reciprocating compressors used in refrigeration systems. Based on the results, some suggestions for maintenance are addressed.
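A common way FMEA ranks failure modes is the risk priority number RPN = severity x occurrence x detection, each rated on a 1-10 scale. The compressor failure modes and ratings below are hypothetical illustrations, not the paper's data.

```python
# Minimal FMEA-style sketch: score each failure mode on severity (S),
# occurrence (O) and detection (D), compute RPN = S * O * D, and rank
# maintenance priorities by RPN. Modes and ratings are invented.

failure_modes = {
    "valve breakage":        (8, 5, 4),
    "bearing wear":          (6, 7, 3),
    "motor winding burnout": (9, 3, 6),
}

rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
ranked = sorted(rpn, key=rpn.get, reverse=True)
for mode in ranked:
    print(f"{mode}: RPN = {rpn[mode]}")
```

The highest-RPN modes would be the first targets for the maintenance actions the paper recommends.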
Abstract:
This paper presents a reliability-based analysis for calculating critical tool life in machining processes. By obtaining the sequence of operations for the machining procedure, it is possible to determine the running time for each tool involved in the process. Usually, the reliability of an operation depends on three independent factors: the operator, the machine tool and the cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature; from experimental results, a statistical distribution of drilling-tool wear was defined and the reliability of the drilling process was modeled. (C) 2010 Elsevier Ltd. All rights reserved.
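The series configuration mentioned above means the process survives only if every operation survives, so the process reliability is the product of the operation reliabilities. The sketch below assumes a hypothetical Weibull model R(t) = exp(-(t/eta)^beta) for each cutting operation; the beta/eta values are illustrative, not the paper's fitted parameters.

```python
import math

# Series-configuration reliability sketch: R_system(t) is the product of
# the component reliabilities. Each operation gets a hypothetical Weibull
# reliability R(t) = exp(-(t / eta) ** beta).

def weibull_reliability(t, beta, eta):
    return math.exp(-((t / eta) ** beta))

def series_reliability(t, components):
    r = 1.0
    for beta, eta in components:
        r *= weibull_reliability(t, beta, eta)
    return r

# (beta, eta in minutes) for a turning and a drilling operation (invented)
ops = [(2.0, 60.0), (1.5, 90.0)]
print(series_reliability(30.0, ops))
```

Because the product only shrinks as operations are added, the weakest tool dominates when a change must be scheduled.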
Abstract:
This paper presents a family of algorithms for approximate inference in credal networks (that is, models based on directed acyclic graphs and set-valued probabilities) that contain only binary variables. Such networks can represent incomplete or vague beliefs, lack of data, and disagreements among experts; they can also encode models based on belief functions and possibilistic measures. All algorithms for approximate inference in this paper rely on exact inferences in credal networks based on polytrees with binary variables, as these inferences have polynomial complexity. We are inspired by approximate algorithms for Bayesian networks; thus the Loopy 2U algorithm resembles Loopy Belief Propagation, while the Iterated Partial Evaluation and Structured Variational 2U algorithms are, respectively, based on Localized Partial Evaluation and variational techniques. (C) 2007 Elsevier Inc. All rights reserved.
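To give a flavor of the interval computations a credal network performs, the sketch below bounds the posterior of a single binary variable whose prior P(x=1) is only known to lie in an interval. Since the posterior is monotone in the prior, the extremes occur at the interval endpoints. This is a one-node toy, not the 2U propagation scheme itself; the likelihood values are hypothetical.

```python
# Toy credal-style inference for one binary variable: the prior P(x=1) is
# set-valued, P(x=1) in [lo, hi], and the posterior after evidence e is
# bounded by evaluating Bayes' rule at the interval endpoints (valid
# because the posterior is monotone in the prior). Likelihoods invented.

def posterior(p, like1, like0):
    # Bayes' rule for a binary variable: P(x=1 | e).
    return p * like1 / (p * like1 + (1.0 - p) * like0)

def posterior_bounds(lo, hi, like1, like0):
    a = posterior(lo, like1, like0)
    b = posterior(hi, like1, like0)
    return min(a, b), max(a, b)

print(posterior_bounds(0.3, 0.6, like1=0.9, like0=0.2))
```

In a full credal polytree, the 2U algorithm propagates exactly such interval bounds between binary nodes, which is why the exact binary-polytree case stays polynomial.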
Abstract:
The flowshop scheduling problem with blocking in-process is addressed in this paper. In this environment there are no buffers between successive machines; therefore, intermediate queues of jobs waiting in the system for their next operations are not allowed. Heuristic approaches are proposed to minimize the total tardiness criterion. A constructive heuristic that explores specific characteristics of the problem is presented. Moreover, a GRASP-based heuristic is proposed and coupled with a path relinking strategy to search for better outcomes. Computational tests are presented, and the comparisons made with an adaptation of the NEH algorithm and with a branch-and-bound algorithm indicate that the new approaches are promising. (c) 2007 Elsevier Ltd. All rights reserved.
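The objective these heuristics evaluate can be sketched directly: with no buffers, a job leaves machine i only when machine i+1 is free, so departure times follow the standard blocking recurrence, and total tardiness sums the lateness of each completion against its due date. The processing times and due dates below are illustrative.

```python
# Total tardiness for a permutation schedule in a blocking flowshop.
# dep[i] is the time the current job departs machine i; with blocking,
# a job cannot leave machine i until machine i+1 has been vacated by the
# previous job. Jobs, times and due dates are made-up examples.

def total_tardiness(seq, p, due):
    m = len(p[0])
    prev = [0.0] * (m + 1)        # previous job's departures (guard at end)
    tard = 0.0
    for j in seq:
        dep = [0.0] * (m + 1)
        arrive = prev[0]          # machine 0 frees when prev job left it
        for i in range(m):
            finish = max(arrive, prev[i]) + p[j][i]
            # blocked until machine i+1 is free (guard entry is 0 for the
            # last machine, which never blocks)
            dep[i] = max(finish, prev[i + 1])
            arrive = dep[i]
        tard += max(0.0, dep[m - 1] - due[j])
        prev = dep
    return tard

p = [[3, 2], [1, 4], [2, 2]]      # p[job][machine]
due = [4, 8, 9]
print(total_tardiness([0, 1, 2], p, due))
```

A constructive heuristic like the NEH adaptation mentioned above would call such an evaluator once per candidate insertion position.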
Abstract:
When building genetic maps, it is necessary to choose from several marker ordering algorithms and criteria, and the choice is not always simple. In this study, we evaluate the efficiency of the algorithms try (TRY), seriation (SER), rapid chain delineation (RCD), recombination counting and ordering (RECORD) and unidirectional growth (UG), as well as the criteria PARF (product of adjacent recombination fractions), SARF (sum of adjacent recombination fractions), SALOD (sum of adjacent LOD scores) and LHMC (likelihood through hidden Markov chains), used with the RIPPLE algorithm for error verification, in the construction of genetic linkage maps. A linkage map of a hypothetical diploid and monoecious plant species was simulated, containing one linkage group and 21 markers with a fixed distance of 3 cM between them. In all, 700 F(2) populations were randomly simulated with and 400 individuals, with different combinations of dominant and co-dominant markers, as well as 10 and 20% of missing data. The simulations showed that, in the presence of co-dominant markers only, any combination of algorithm and criteria may be used, even for a reduced population size. In the case of a smaller proportion of dominant markers, any of the algorithms and criteria investigated (except SALOD) may be used. In the presence of high proportions of dominant markers and smaller samples (around 100), the probability of linkage in repulsion between markers increases and, in this case, using the algorithms TRY and SER associated with RIPPLE under the LHMC criterion would provide better results. Heredity (2009) 103, 494-502; doi:10.1038/hdy.2009.96; published online 29 July 2009
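Among the criteria compared, SARF is the easiest to sketch: for a candidate marker order, sum the recombination fractions between adjacent markers, and prefer orders with a smaller sum. The 4-marker recombination-fraction matrix below is invented for illustration; real maps would have far too many markers for brute-force search.

```python
from itertools import permutations

# SARF (sum of adjacent recombination fractions) for a candidate marker
# order; smaller is better. The symmetric rf matrix is a made-up example
# for 4 markers.

def sarf(order, rf):
    return sum(rf[a][b] for a, b in zip(order, order[1:]))

rf = [
    [0.00, 0.03, 0.10, 0.20],
    [0.03, 0.00, 0.05, 0.12],
    [0.10, 0.05, 0.00, 0.04],
    [0.20, 0.12, 0.04, 0.00],
]
best = min(permutations(range(4)), key=lambda o: sarf(o, rf))
print(best, sarf(best, rf))
```

Note that an order and its reverse always score the same SARF, which is why ordering criteria only determine maps up to orientation.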
Abstract:
This paper proposes the use of the q-Gaussian mutation with self-adaptation of the shape of the mutation distribution in evolutionary algorithms. The shape of the q-Gaussian mutation distribution is controlled by a real parameter q. In the proposed method, the real parameter q of the q-Gaussian mutation is encoded in the chromosome of individuals and hence is allowed to evolve during the evolutionary process. In order to test the new mutation operator, evolution strategy and evolutionary programming algorithms with self-adapted q-Gaussian mutation generated from anisotropic and isotropic distributions are presented. The theoretical analysis of the q-Gaussian mutation is also provided. In the experimental study, the q-Gaussian mutation is compared to Gaussian and Cauchy mutations in the optimization of a set of test functions. Experimental results show the efficiency of the proposed method of self-adapting the mutation distribution in evolutionary algorithms.
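The self-adaptation idea above encodes the distribution parameter in the chromosome so it evolves with the solution. As a simplified sketch, the code below self-adapts a Gaussian step size sigma with the classical log-normal rule; the paper applies the same principle to the q parameter of the q-Gaussian, whose sampling (a generalized Box-Muller transform) is omitted here. All values are illustrative.

```python
import math
import random

# Self-adaptive mutation sketch: the strategy parameter (here a Gaussian
# step size sigma, standing in for the paper's q parameter) is carried by
# the individual, perturbed log-normally, and then used to mutate the
# object variables.

def mutate(x, sigma, tau=None, rng=random):
    n = len(x)
    tau = tau if tau is not None else 1.0 / math.sqrt(n)   # learning rate
    sigma_new = sigma * math.exp(tau * rng.gauss(0.0, 1.0))
    x_new = [xi + sigma_new * rng.gauss(0.0, 1.0) for xi in x]
    return x_new, sigma_new

random.seed(1)
child, sigma = mutate([0.0, 0.0, 0.0], sigma=0.5)
print(child, sigma)
```

Replacing the inner `rng.gauss` draw with a heavier-tailed sampler (Cauchy, or the q-Gaussian for q > 1) changes the mutation distribution without touching the self-adaptation machinery, which is the comparison the paper carries out.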
Abstract:
Objective To assess the validity and the reliability of the Portuguese version of the Delirium Rating Scale-Revised-98 (DRS-R-98). Methods The scale was translated into Portuguese and back-translated into English. After assessing its face validity, five diagnostic groups (n = 64; delirium, depression, dementia, schizophrenia and others) were evaluated by two independent researchers blinded to the diagnosis. Diagnosis and severity of delirium as measured by the DRS-R-98 were compared to clinical diagnosis, Mini-Mental State Exam, Confusion Assessment Method, and Clinical Global Impressions scale (CGI). Results Mean and median DRS-R-98 total scores significantly distinguished delirium from the other groups (p < 0.001). Inter-rater reliability (ICC between 0.9 and 1) and internal consistency (alpha = 0.91) were very high. DRS-R-98 severity scores correlated highly with the CGI. Mean DRS-R-98 severity scores during delirium differed significantly (p < 0.01) from the post-treatment values. The area under the curve established by ROC analysis was 0.99, and using the cut-off value of 20 the scale showed sensitivity and specificity of 92.6% and 94.6%, respectively. Conclusion The Portuguese version of the DRS-R-98 is a valid and reliable measure of delirium that distinguishes delirium from other disorders and is sensitive to change in delirium severity, which may be of great value for longitudinal studies. Copyright (c) 2007 John Wiley & Sons, Ltd.
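The sensitivity and specificity reported at the cut-off of 20 come from a simple dichotomization: scores at or above the cut-off are called positive. The sketch below shows the computation on invented score lists, not the study's data.

```python
# Sensitivity and specificity at a scale cut-off: scores >= cutoff are
# classified positive. The DRS-R-98-style score lists are hypothetical.

def sens_spec(cases, controls, cutoff):
    tp = sum(s >= cutoff for s in cases)       # true positives
    tn = sum(s < cutoff for s in controls)     # true negatives
    return tp / len(cases), tn / len(controls)

delirium = [25, 31, 19, 28, 22]   # invented DRS-R-98 totals, delirium group
others = [8, 14, 21, 5, 11]       # invented totals, other diagnoses
print(sens_spec(delirium, others, cutoff=20))
```

Sweeping the cut-off over all observed scores and plotting sensitivity against 1 - specificity yields the ROC curve whose area the study reports.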
Abstract:
Objective: In this study we assessed how often patients manifesting a myocardial infarction (MI) would not be considered candidates for intensive lipid-lowering therapy based on the current guidelines. Methods: In 355 consecutive patients manifesting ST-elevation MI (STEMI), admission plasma C-reactive protein (CRP) was measured, and the Framingham risk score (FRS), PROCAM risk score, Reynolds risk score, ASSIGN risk score, QRISK, and SCORE algorithms were applied. Cardiac computed tomography and carotid ultrasound were performed to assess the coronary artery calcium score (CAC), carotid intima-media thickness (cIMT) and the presence of carotid plaques. Results: Less than 50% of STEMI patients would be identified as high risk before the event by any of these algorithms. With the exception of FRS (9%), all other algorithms would assign low risk to about half of the enrolled patients. Plasma CRP was <1.0 mg/L in 70% and >2 mg/L in 14% of the patients. The average cIMT was 0.8 +/- 0.2 mm, and it was >= 1.0 mm in only 24% of patients. Carotid plaques were found in 74% of patients. CAC > 100 was found in 66% of patients. By adding CAC > 100 plus the presence of carotid plaque, a high-risk condition would be identified in 100% of the patients using any of the above-mentioned algorithms. Conclusion: More than half of patients manifesting STEMI would not be considered candidates for intensive preventive therapy by the current clinical algorithms. The addition of anatomical parameters such as CAC and the presence of carotid plaques can substantially reduce CVD risk underestimation. (C) 2010 Elsevier Ireland Ltd. All rights reserved.
Abstract:
The objective was to determine the reliability of isokinetic strength and endurance testing in the ankle joints of patients with intermittent claudication. Twenty-three patients with peripheral artery disease (PAD) and symptoms of intermittent claudication participated in the study. Isokinetic strength and endurance testing of the ankle joint were performed in symptomatic and asymptomatic legs on 3 separate days. Intraclass correlation coefficients of peak torque (PT) and total work (TW) ranged from 0.77 to 0.92 and 0.89 to 0.96, respectively. PT and TW increased significantly and similarly in both legs from day 1 to day 2 (PT: +42 +/- 84% in the symptomatic leg and +33 +/- 51% in the asymptomatic leg, p < 0.05; TW: +38 +/- 26% in the symptomatic leg and +26 +/- 50% in the asymptomatic leg, p < 0.05). In conclusion, isokinetic strength and endurance testing in the ankle joints of patients with PAD presents reliability coefficients ranging from 0.77 to 0.96. However, strength and endurance increased between the first and the subsequent test sessions performed on separate days, suggesting that two test sessions are necessary for the accurate evaluation of strength and endurance in patients with PAD.
Abstract:
The objective of this prospective study was to perform a cross-cultural adaptation of the Functional Assessment Measure (FAM) into Brazilian Portuguese, and to assess the test-retest reliability. The instrument was translated, back-translated, pretested, and reviewed by a committee. The Brazilian version was assessed in 61 brain-injury patients. Intrarater and interrater reliability was verified by a test-retest procedure (intraclass correlation). Intrarater reliability was moderate-to-excellent; interrater reliability was moderate-to-excellent, with the exception of one item. The Brazilian version of the FAM has acceptable test-retest reliability. Results suggest the use of the Brazilian version of the FAM in the Brazilian population, for disability evaluation and outcome assessment. Further research is required to evaluate the psychometric properties of the scale. International Journal of Rehabilitation Research 34:89-91 (C) 2011 Wolters Kluwer Health | Lippincott Williams & Wilkins.
Abstract:
The Direct Assessment of Functional Status-Revised (DAFS-R) is an instrument developed to objectively measure functional capacities required for independent living. The objective of this study was to translate and culturally adapt the DAFS-R for Brazilian Portuguese (DAFS-BR) and to evaluate its reliability and validity. The DAFS-BR was administered to 89 older patients classified previously as normal controls, mild cognitive impairment (MCI) and Alzheimer's disease (AD). The results indicated good internal consistency (Cronbach's alpha = 0.78) in the total sample. The DAFS-BR showed high interobserver reliability (0.996; p < .001) as well as test-retest stability over a 1-week interval (0.995; p < .001). Correlation between the DAFS-BR total score and the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE) was moderate and significant (r = -.65, p < .001) in the total sample, whereas it did not reach statistical significance within each diagnostic group. Receiver operating characteristic curve analyses suggested that the DAFS-BR has good sensitivity and specificity to identify MCI and AD. Results suggest that the DAFS-BR can document degrees of severity of functional impairment among Brazilian older adults.
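The internal-consistency statistic reported above, Cronbach's alpha, is computed as alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The tiny 3-item, 4-respondent data set below is invented purely to show the arithmetic.

```python
from statistics import pvariance

# Cronbach's alpha sketch: items is a list of per-item score lists, one
# column per respondent. Uses population variance consistently for both
# the item variances and the total-score variance. Data are made up.

def cronbach_alpha(items):
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # per-respondent sums
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1.0 - item_var / pvariance(totals))

items = [  # one row per item, one column per respondent (hypothetical)
    [3, 4, 2, 4],
    [2, 4, 2, 3],
    [3, 5, 1, 4],
]
print(round(cronbach_alpha(items), 3))
```

Alpha rises when items co-vary (the total-score variance outgrows the summed item variances), which is why it is read as a measure of internal consistency.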