82 results for tabu search algorithm
Abstract:
Context: Foreign body aspiration (FbA) is a serious problem in children. Accurate clinical and radiographic diagnosis is important because a missed or delayed diagnosis can result in respiratory difficulties ranging from life-threatening airway obstruction to chronic wheezing or recurrent pneumonia. Bronchoscopy also has risks, and accurate clinical and radiographic diagnosis can support the decision to perform bronchoscopy. Objective: To review the diagnostic accuracy of clinical presentation (CP) and pulmonary radiography (PR) for the diagnosis of FbA. There is no previous review. Methods: A search of Medline was conducted for articles containing data regarding CP and PR signs of FbA. Likelihood ratios (LR) and pre- and post-test probabilities were calculated using Bayes' theorem for all signs of CP and PR. Inclusion criteria: articles containing prospective data regarding CP and PR of FbA. Exclusion criteria: retrospective studies; articles containing incomplete data for calculation of LR. Results: Five prospective studies were included, with a total of 585 patients. The prevalence of FbA was 63% in children suspected of FbA. If CP is normal, the probability of FbA is 25%; if PR is normal, the probability is 14%. If CP is pathologic, the probability of FbA is 69-76% in the presence of cough (LR = 1.32), dyspnea (LR = 1.84), or localized crackles (LR = 1.5). The probability is 81-88% if cyanosis (LR = 4.8), decreased breath sounds (LR = 4.3), asymmetric auscultation (LR = 2.9), or localized wheezing (LR = 2.5) is present. When CP is abnormal and PR shows mediastinal shift (LR = 100), pneumomediastinum (LR = 100), a radio-opaque foreign body (LR = 100), lobar distention (LR = 4), atelectasis (LR = 2.5), or abnormal inspiratory/expiratory views (LR = 7), the probability of FbA is 96-100%. If CP is normal and PR is abnormal, the probability is 40-100%. If CP is abnormal and PR is normal, the probability is 55-75%.
Conclusions: This review of prospective studies demonstrates the importance of CP and PR, and an algorithm can be proposed. When CP is abnormal, with or without a pathologic PR, the probability of FbA is high and bronchoscopy is indicated. When CP and PR are normal, the probability of FbA is low and bronchoscopy is not immediately necessary; observation should be proposed. This approach should be validated in a prospective study.
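The pre-/post-test calculation used throughout these results follows Bayes' theorem in odds form (post-test odds = pre-test odds × LR); a minimal sketch using the 63% prevalence and the cyanosis LR reported above:

```python
def post_test_probability(pre_test_prob, lr):
    """Post-test probability from a pre-test probability and a
    likelihood ratio, via Bayes' theorem in odds form."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * lr          # Bayes update: odds multiply by the LR
    return post_odds / (1.0 + post_odds)

# 63% prevalence, cyanosis LR = 4.8 (values from the results above)
print(round(post_test_probability(0.63, 4.8), 2))  # prints 0.89
```

With these inputs the result is roughly 89%, in the vicinity of the 81-88% range reported; the exact figures depend on the pre-test probability each study used.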
Abstract:
For the last 2 decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially with regard to the widely used supermatrix approach). In this study, seven of the most commonly used supertree methods are investigated using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees with the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree, and computational time required by the algorithm. Additional analyses were also conducted on a reduced data set to test whether the performance levels were affected by the heuristic searches rather than by the algorithms themselves. Based on our results, two main groups of supertree methods were identified: on the one hand, the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria; on the other hand, the average consensus, split fit, and most similar supertree methods showed poorer performance or at least did not behave the same way as the total evidence tree. Results for the super distance matrix, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method was only slightly improved when applied to the reduced data set, suggesting a correct behavior of the heuristic searches and a relatively low sensitivity of the algorithms to data set size and missing data.
Results also showed that the MRP analyses could reach a high level of quality even when using a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in increased computing power to handle large data sets. The latter would prove particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
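The "similarity of the supertrees with the input trees" criterion is typically quantified with a tree-to-tree distance. A minimal sketch using the Robinson-Foulds (symmetric-difference) distance on rooted trees encoded as sets of clades; this is an illustration, not necessarily the exact metric used in the study:

```python
def rf_distance(clades_a, clades_b):
    """Robinson-Foulds distance between two rooted trees, each encoded
    as the set of its internal clades (frozensets of taxon names).
    The distance counts clades present in one tree but not the other."""
    return len(clades_a ^ clades_b)  # symmetric difference of clade sets

# Two four-taxon trees sharing the clade {A, B} but disagreeing elsewhere
t1 = {frozenset({"A", "B"}), frozenset({"A", "B", "C"})}
t2 = {frozenset({"A", "B"}), frozenset({"B", "C"})}
print(rf_distance(t1, t2))  # prints 2
```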
Abstract:
BACKGROUND: Small RNAs (sRNAs) are widespread among bacteria and have diverse regulatory roles. Most of these sRNAs have been discovered by a combination of computational and experimental methods. In Pseudomonas aeruginosa, a ubiquitous Gram-negative bacterium and opportunistic human pathogen, the GacS/GacA two-component system positively controls the transcription of two sRNAs (RsmY, RsmZ), which are crucial for the expression of genes involved in virulence. In the biocontrol bacterium Pseudomonas fluorescens CHA0, three GacA-controlled sRNAs (RsmX, RsmY, RsmZ) regulate the response to oxidative stress and the expression of extracellular products including biocontrol factors. RsmX, RsmY and RsmZ contain multiple unpaired GGA motifs and control the expression of target mRNAs at the translational level, by sequestration of translational repressor proteins of the RsmA family. RESULTS: A combined computational and experimental approach enabled us to identify 14 intergenic regions encoding sRNAs in P. aeruginosa. Eight of these regions encode newly identified sRNAs. The intergenic region 1698 was found to specify a novel GacA-controlled sRNA termed RgsA. GacA regulation appeared to be indirect. In P. fluorescens CHA0, an RgsA homolog was also expressed under positive GacA control. This 120-nt sRNA contained a single GGA motif and, unlike RsmX, RsmY and RsmZ, was unable to derepress translation of the hcnA gene (involved in the biosynthesis of the biocontrol factor hydrogen cyanide), but contributed to the bacterium's resistance to hydrogen peroxide. In both P. aeruginosa and P. fluorescens the stress sigma factor RpoS was essential for RgsA expression. CONCLUSION: The discovery of an additional sRNA expressed under GacA control in two Pseudomonas species highlights the complexity of this global regulatory system and suggests that the mode of action of GacA control may be more elaborate than previously suspected. 
Our results also confirm that several GGA motifs are required in an sRNA for sequestration of the RsmA protein.
Abstract:
This paper analyses and discusses arguments that emerge from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this latter individual is found as a result of a database search and (ii) remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper here intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic that has repeatedly demonstrated that the latter two requirements (i) and (ii) should not be a cause of concern.
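The odds-form Bayesian update underlying the likelihood-ratio argument can be sketched numerically. All values below are hypothetical, and the LR = 1/(random-match probability) form assumes a match is certain if the suspect is the source, as in the standard source-level treatment:

```python
def posterior_source_prob(prior, match_prob):
    """Posterior probability that the suspect is the source of the
    crime stain, given a reported correspondence.
    LR = P(match | source) / P(match | not source) = 1 / match_prob."""
    lr = 1.0 / match_prob
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * lr        # the database search adds no 1/n correction
    return post_odds / (1.0 + post_odds)

# Hypothetical prior of 0.001 and random-match probability of 1e-6
print(round(posterior_source_prob(0.001, 1e-6), 3))
```

The point made in the paper is that this same LR applies whether or not the suspect was found through a database search, so no correction factor equal to the database size is needed.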
Abstract:
In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is needed for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated diversity management. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and, contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the latter. This validation illustrates the efficiency of our sampling strategy, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the alphaVbeta3 integrin, starting far away from the binding pocket.
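The 2 Å RMSD success criterion used above can be sketched as follows. This is a minimal illustration (coordinates assumed already superposed), not EADock's implementation:

```python
import math

def rmsd(coords_a, coords_b):
    """Root mean square deviation between two equally sized lists of
    (x, y, z) atom coordinates; no superposition is performed."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

def is_correct_binding_mode(coords_pred, coords_xtal, cutoff=2.0):
    """A predicted pose counts as correct if its RMSD to the crystal
    structure is below the cutoff (2 Å in the validation above)."""
    return rmsd(coords_pred, coords_xtal) < cutoff
```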
Abstract:
Fetal MRI reconstruction aims at finding a high-resolution image given a small set of low-resolution images. It is usually modeled as an inverse problem in which the regularization term plays a central role in the reconstruction quality. The literature has considered several regularization terms, such as Dirichlet/Laplacian energy, Total Variation (TV)-based energies, and, more recently, non-local means. Although TV energies are quite attractive because of their ability to preserve edges, only standard explicit steepest-gradient techniques have been applied to optimize fetal-based TV energies. The main contribution of this work lies in the introduction of a well-posed TV algorithm from the point of view of convex optimization. Specifically, our proposed TV optimization algorithm for fetal reconstruction is optimal w.r.t. the asymptotic and iterative convergence speeds O(1/n²) and O(1/√ε), while existing techniques are in O(1/n) and O(1/ε). We apply our algorithm to (1) clinical newborn data, considered as ground truth, and (2) clinical fetal acquisitions. Our algorithm compares favorably with the literature in terms of speed and accuracy.
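The TV regularizer at the heart of this approach can be written, for a discrete image, as the sum of gradient magnitudes. A minimal sketch using forward differences (an illustration of the energy term only, not the authors' optimization algorithm):

```python
import math

def total_variation(img):
    """Discrete isotropic total variation of a 2-D image given as a
    list of rows. Forward differences; boundary differences are zero."""
    h, w = len(img), len(img[0])
    tv = 0.0
    for i in range(h):
        for j in range(w):
            dx = img[i][j + 1] - img[i][j] if j + 1 < w else 0.0
            dy = img[i + 1][j] - img[i][j] if i + 1 < h else 0.0
            tv += math.sqrt(dx * dx + dy * dy)  # gradient magnitude at (i, j)
    return tv

# A vertical edge contributes one unit of TV per row it crosses
print(total_variation([[0.0, 1.0], [0.0, 1.0]]))  # prints 2.0
```

Minimizing a data-fidelity term plus this energy favors piecewise-constant reconstructions, which is why TV preserves edges better than Dirichlet/Laplacian smoothing.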
Abstract:
This review assesses the presentation, management, and outcome of delayed postpancreatectomy hemorrhage (PPH) and suggests a novel algorithm as a possible standard of care. An electronic search of the Medline and Embase databases from January 1990 to February 2010 was undertaken. A random-effect meta-analysis of the success rate and mortality of laparotomy vs. interventional radiology after delayed PPH was performed. Fifteen studies comprising 248 patients with delayed PPH were included. Its incidence was 3.3%. A sentinel bleed heralding a delayed PPH was observed in 45% of cases. Pancreatic leaks or intraabdominal abscesses were found in 62%. Interventional radiology was attempted in 41%, and laparotomy was undertaken in 49%. On meta-analysis comparing laparotomy vs. interventional radiology, no significant difference could be found in terms of complete hemostasis (76% vs. 80%; P = 0.35). A statistically significant difference favored interventional radiology vs. laparotomy in terms of mortality (22% vs. 47%; P = 0.02). Proper management of postoperative complications, such as pancreatic leak and intraabdominal abscess, minimizes the risk of delayed PPH. Sentinel bleeding needs to be thoroughly investigated. If a pseudoaneurysm is detected, it has to be treated by interventional angiography in order to prevent a further delayed PPH. Early angiography and embolization or stenting is safe and should be the procedure of choice. Surgery remains a therapeutic option if no interventional radiology is available or if patients cannot be resuscitated for an interventional treatment.
Abstract:
Pyogenic liver abscess is a severe condition and a therapeutic challenge. Treatment failure may be due to an unrecognized ingested foreign body that migrated from the gastrointestinal tract. There has recently been a marked increase in the number of reported cases of this condition, but initial misdiagnosis as cryptogenic liver abscess still occurs in the majority of cases. We conducted the current study to characterize this entity and provide a diagnostic strategy applicable worldwide. To this end, data were collected from our case and from a systematic review that identified 59 well-described cases. Another systematic review identified series of cryptogenic and Asian Klebsiella liver abscess; these data were pooled and compared with the data from the cases of migrated foreign body liver abscess. The review points out the low diagnostic accuracy of history taking, modern imaging, and even surgical exploration. A fistula found through imaging procedures or endoscopy warrants surgical exploration. Findings suggestive of foreign body migration are symptoms of gastrointestinal perforation, computed tomography demonstration of a thickened gastrointestinal wall in continuity with the abscess, and adhesions seen during surgery. Treatment failure, left lobe location, unique location (that is, only 1 abscess location within the liver), and absence of underlying conditions also point to the diagnosis, as shown by comparison with the cryptogenic liver abscess series. This study demonstrates that migrated foreign body liver abscess is a specific entity, increasingly reported. It usually is not cured when unrecognized, and diagnosis is mainly delayed. This study provides what we consider the best available evidence for timely diagnosis with worldwide applicability. Increased awareness is required to treat this underestimated condition effectively, and further studies are needed.
Abstract:
BACKGROUND: Surveillance of multiple congenital anomalies is considered to be more sensitive for the detection of new teratogens than surveillance of all or isolated congenital anomalies. Current literature proposes the manual review of all cases for classification into isolated or multiple congenital anomalies. METHODS: Multiple anomalies were defined as two or more major congenital anomalies, excluding sequences and syndromes. A computer algorithm for classification of major congenital anomaly cases in the EUROCAT database according to International Classification of Diseases, 10th revision (ICD-10) codes was programmed, further developed, and implemented for 1 year's data (2004) from 25 registries. The group of cases classified with potential multiple congenital anomalies was manually reviewed by three geneticists to reach a final agreement on classification as "multiple congenital anomaly" cases. RESULTS: A total of 17,733 cases with major congenital anomalies were reported, giving an overall prevalence of major congenital anomalies of 2.17%. The computer algorithm classified 10.5% of all cases as "potentially multiple congenital anomalies". After manual review of these cases, 7% were agreed to have true multiple congenital anomalies. Furthermore, the algorithm classified 15% of all cases as having chromosomal anomalies, 2% as monogenic syndromes, and 76% as isolated congenital anomalies. The proportion of multiple anomalies varies by congenital anomaly subgroup, reaching 35% for bilateral renal agenesis. CONCLUSIONS: The implementation of the EUROCAT computer algorithm is a feasible, efficient, and transparent way to improve classification of congenital anomalies for surveillance and research.
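The triage logic described above can be sketched as follows. This is a heavily simplified illustration: the real EUROCAT algorithm applies detailed ICD-10 code tables and exclusion rules for sequences and syndromes, which are not reproduced here. Only the use of the Q90-Q99 range for chromosomal anomalies reflects actual ICD-10 structure:

```python
def classify_case(icd10_codes):
    """Toy sketch of an EUROCAT-style case triage from ICD-10 codes.
    Returns one of: 'chromosomal', 'potential multiple congenital
    anomaly' (flagged for manual review by geneticists), 'isolated'."""
    codes = set(icd10_codes)
    # Q90-Q99 is the ICD-10 block for chromosomal abnormalities
    if any(c.startswith("Q9") for c in codes):
        return "chromosomal"
    # Two or more major anomalies (Q chapter) -> candidate multiple case
    major = [c for c in codes if c.startswith("Q")]
    if len(major) >= 2:
        return "potential multiple congenital anomaly"
    return "isolated"

print(classify_case(["Q21.0", "Q35.1"]))  # prints: potential multiple congenital anomaly
```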
Abstract:
The atomic force microscope is not only a very convenient tool for studying the topography of different samples, but it can also be used to measure specific binding forces between molecules. For this purpose, one type of molecule is attached to the tip and the other one to the substrate. Approaching the tip to the substrate allows the molecules to bind together. Retracting the tip breaks the newly formed bond. The rupture of a specific bond appears in the force-distance curves as a spike from which the binding force can be deduced. In this article we present an algorithm to automatically process force-distance curves in order to obtain bond-strength histograms. The algorithm is based on a fuzzy logic approach that permits an evaluation of "quality" for every event and makes the detection procedure much faster compared to a manual selection. The software has been applied to measure the binding strength between tubulin and microtubule-associated proteins.
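The fuzzy-logic scoring idea can be sketched as follows: each candidate rupture event gets a membership value per criterion, and the event "quality" is their fuzzy conjunction (minimum). The criteria and thresholds below are hypothetical, chosen only to illustrate the approach:

```python
def fuzzy_membership(x, lo, hi):
    """Linear ramp membership function: 0 below lo, 1 above hi,
    linear in between."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def event_quality(force_jump_pN, slope_pN_per_nm):
    """Fuzzy AND (minimum) of two hypothetical spike criteria:
    the size of the force jump and the sharpness of the drop."""
    return min(fuzzy_membership(force_jump_pN, 10.0, 50.0),
               fuzzy_membership(slope_pN_per_nm, 1.0, 5.0))

# A large, sharp spike scores 1.0; a tiny bump scores 0.0
print(event_quality(100.0, 10.0), event_quality(5.0, 10.0))
```

Events whose quality exceeds a chosen threshold would then contribute their rupture force to the bond-strength histogram.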
Abstract:
Background: The first AO comprehensive pediatric long bone fracture classification system has been established following a structured path of development and validation with experienced pediatric surgeons. Methods: A follow-up series of agreement studies was applied to specify and evaluate a grading system for displacement of pediatric supracondylar fractures. An iterative process comprising an international group of 5 experienced pediatric surgeons (Phase 1) followed by a pragmatic multicenter agreement study involving 26 raters (Phase 2) was used. The last evaluations were conducted on a consecutive collection of 154 supracondylar fractures documented by standard anteroposterior and lateral radiographs. Results: Fractures were classified according to 1 of 4 grades: I = incomplete fracture with no or minimal displacement; II = incomplete fracture with continuity of the posterior (extension fracture) or anterior cortex (flexion fracture); III = lack of bone continuity (broken cortex), but still some contact between the fracture planes; IV = complete fracture with no bone continuity (broken cortex) and no contact between the fracture planes. A diagnostic algorithm to support the practical application of the grading system in a clinical setting, as well as an aid using a circle placed over the capitellum, was proposed. The overall kappa coefficients were 0.68 and 0.61 in the Phase 1 and Phase 2 studies, respectively. In the Phase 1 study, fracture grades I, II, III, and IV were classified with median accuracies of 91%, 82%, 83%, and 99.5%, respectively. Similar median accuracies of 86% (Grade I), 73% (Grade II), 83% (Grade III), and 92% (Grade IV) were reported for the Phase 2 study. Reliability was high in distinguishing complete, unstable fractures from stable injuries [i.e., kappa coefficients of 0.84 (Phase 1) and 0.83 (Phase 2) were calculated]; in Phase 2, surgeons' accuracies in classifying complete fractures were all above 85%.
Conclusions: With clear and unambiguous definitions, this new grading system for supracondylar fracture displacement has proved to be sufficiently reliable and accurate when applied by pediatric surgeons in routine clinical practice as well as in research.
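The agreement statistic reported in these results (Cohen's kappa) can be computed as follows; the sample rater labels are hypothetical:

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters' categorical labels: observed
    agreement corrected for chance agreement. Assumes pe < 1."""
    n = len(labels_a)
    cats = set(labels_a) | set(labels_b)
    po = sum(a == b for a, b in zip(labels_a, labels_b)) / n  # observed agreement
    pe = sum((labels_a.count(c) / n) * (labels_b.count(c) / n)
             for c in cats)                                   # chance agreement
    return (po - pe) / (1 - pe)

# Two raters grading four fractures; they disagree on one case
print(cohens_kappa(["I", "I", "II", "II"], ["I", "II", "II", "II"]))  # prints 0.5
```

Values around 0.61-0.68 (as in Phases 1 and 2 above) are conventionally read as substantial agreement.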
Abstract:
Through this article, we propose a mixed management of patients' medical records, so as to share responsibilities between the patient and the Medical Practitioner (MP) by making patients responsible for the validation of their administrative information, and MPs responsible for the validation of their patients' medical information. Our proposal can be considered a solution to the main problem faced by patients, health practitioners, and the authorities, namely the gathering and updating of administrative and medical data belonging to the patient in order to accurately reconstitute a patient's medical history. This method is based on two processes. The aim of the first process is to provide a patient's administrative data, in order to know where and when the patient received care (name of the health structure or health practitioner, type of care: outpatient or inpatient). The aim of the second process is to provide a patient's medical information and to validate it under the accountability of the Medical Practitioner, with the help of the patient if needed. During these two processes, the patient's privacy will be ensured through cryptographic hash functions like the Secure Hash Algorithm, which allows pseudonymisation of a patient's identity. The proposed Medical Record Search Engines will be able to retrieve and provide, upon a request formulated by the Medical Practitioner, all the available information concerning a patient who has received care in different health structures, without divulging the patient's identity. Our method can lead to improved efficiency of personal medical record management under the mixed responsibilities of the patient and the MP.
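The pseudonymisation step can be sketched with Python's standard hashlib. The abstract specifies only the Secure Hash Algorithm family; the SHA-256 variant and the secret salt are assumptions added here for illustration:

```python
import hashlib

def pseudonymise(identity, salt):
    """Derive a stable pseudonym from a patient identity using SHA-256.
    The same identity + salt always yields the same pseudonym, so records
    can be linked across health structures without revealing the identity.
    A secret salt hinders dictionary attacks on known identities."""
    return hashlib.sha256((salt + identity).encode("utf-8")).hexdigest()

# Same inputs -> same pseudonym; the search engine never sees the identity
print(pseudonymise("patient-123", "secret-salt")[:16])  # first 16 hex chars
```

Because the hash is one-way, a Medical Record Search Engine holding only pseudonyms can match a practitioner's query without being able to recover the patient's identity.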