956 results for Cluster Counting Algorithm


Relevance: 20.00%

Abstract:

BACKGROUND: Surveillance of multiple congenital anomalies is considered more sensitive for the detection of new teratogens than surveillance of all or isolated congenital anomalies. The current literature proposes the manual review of all cases for classification into isolated or multiple congenital anomalies. METHODS: Multiple anomalies were defined as two or more major congenital anomalies, excluding sequences and syndromes. A computer algorithm for classifying major congenital anomaly cases in the EUROCAT database according to International Classification of Diseases, 10th revision (ICD-10) codes was programmed, further developed, and implemented for one year's data (2004) from 25 registries. The group of cases classified as potential multiple congenital anomalies was manually reviewed by three geneticists to reach final agreement on classification as "multiple congenital anomaly" cases. RESULTS: A total of 17,733 cases with major congenital anomalies were reported, giving an overall prevalence of major congenital anomalies of 2.17%. The computer algorithm classified 10.5% of all cases as "potentially multiple congenital anomalies". After manual review of these cases, 7% were agreed to have true multiple congenital anomalies. Furthermore, the algorithm classified 15% of all cases as having chromosomal anomalies, 2% as monogenic syndromes, and 76% as isolated congenital anomalies. The proportion of multiple anomalies varies by congenital anomaly subgroup, reaching up to 35% of cases with bilateral renal agenesis. CONCLUSIONS: The implementation of the EUROCAT computer algorithm is a feasible, efficient, and transparent way to improve the classification of congenital anomalies for surveillance and research.
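The classification logic described above lends itself to a compact sketch. This is a hypothetical simplification: the real EUROCAT algorithm uses detailed ICD-10 inclusion and exclusion lists, whereas here the chromosomal code set and the "two or more distinct major Q codes" rule are illustrative assumptions only.

```python
# Hypothetical sketch of the EUROCAT-style classification step.  The code
# groups below are illustrative; the published algorithm uses detailed
# ICD-10 inclusion/exclusion lists for syndromes and sequences.

CHROMOSOMAL = {"Q90", "Q91", "Q92", "Q93", "Q96", "Q97", "Q98", "Q99"}

def classify_case(icd10_codes):
    """Classify a congenital-anomaly case from its 3-character ICD-10 stems."""
    stems = {code[:3] for code in icd10_codes}
    if stems & CHROMOSOMAL:
        return "chromosomal"
    # Two or more distinct major anomalies -> flag for manual genetic review.
    major = [s for s in stems if s.startswith("Q")]
    if len(major) >= 2:
        return "potentially multiple"
    return "isolated"

print(classify_case(["Q21.0"]))            # isolated
print(classify_case(["Q21.0", "Q79.2"]))   # potentially multiple
print(classify_case(["Q90.9", "Q21.0"]))   # chromosomal
```

Only the "potentially multiple" output would go on to manual review, mirroring the 10.5% flagged in the study.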

Relevance: 20.00%

Abstract:

The atomic force microscope is not only a very convenient tool for studying the topography of different samples; it can also be used to measure specific binding forces between molecules. For this purpose, one type of molecule is attached to the tip and the other to the substrate. Bringing the tip to the substrate allows the molecules to bind together; retracting the tip breaks the newly formed bond. The rupture of a specific bond appears in the force-distance curve as a spike from which the binding force can be deduced. In this article we present an algorithm to automatically process force-distance curves in order to obtain bond-strength histograms. The algorithm is based on a fuzzy-logic approach that assigns a "quality" score to every event and makes the detection procedure much faster than manual selection. The software has been applied to measure the binding strength between tubulin and microtubule-associated proteins.
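A sketch of how such a fuzzy-logic event detector might look, run on a synthetic retraction curve. The membership functions, thresholds, and the fuzzy AND (minimum) combination are illustrative assumptions, not the published algorithm:

```python
import numpy as np

def ramp(x, lo, hi):
    # Piecewise-linear fuzzy membership: 0 below lo, 1 above hi.
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def detect_ruptures(force, noise_sd, min_quality=0.5):
    """Return (index, jump, quality) for candidate bond-rupture spikes.

    A rupture shows up as an abrupt upward jump in force when the bond
    breaks.  Quality is the fuzzy AND (minimum) of two memberships:
    jump height relative to noise, and one-sample abruptness.
    Thresholds are illustrative assumptions.
    """
    events = []
    d = np.diff(force)
    for i, jump in enumerate(d):
        if jump <= 0:
            continue
        mu_height = ramp(jump / noise_sd, 3.0, 6.0)
        neighbour = max(abs(d[i - 1]) if i > 0 else 0.0,
                        abs(d[i + 1]) if i + 1 < len(d) else 0.0)
        mu_sharp = ramp(jump / (neighbour + noise_sd), 1.0, 3.0)
        q = min(mu_height, mu_sharp)
        if q >= min_quality:
            events.append((i + 1, float(jump), q))
    return events

# Synthetic retraction curve: drifting baseline with one rupture jump of 80 pN.
rng = np.random.default_rng(0)
f = np.cumsum(rng.normal(0, 0.5, 200))
f[120:] += 80.0
print(detect_ruptures(f, noise_sd=1.0))
```

The jump heights of the accepted events would then be binned into the bond-strength histogram.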

Relevance: 20.00%

Abstract:

Background: The first AO comprehensive pediatric long bone fracture classification system has been established following a structured path of development and validation with experienced pediatric surgeons. Methods: A follow-up series of agreement studies was applied to specify and evaluate a grading system for displacement of pediatric supracondylar fractures. An iterative process comprising an international group of 5 experienced pediatric surgeons (Phase 1), followed by a pragmatic multicenter agreement study involving 26 raters (Phase 2), was used. The last evaluations were conducted on a consecutive collection of 154 supracondylar fractures documented by standard anteroposterior and lateral radiographs. Results: Fractures were classified according to 1 of 4 grades: I = incomplete fracture with no or minimal displacement; II = incomplete fracture with continuity of the posterior (extension fracture) or anterior cortex (flexion fracture); III = lack of bone continuity (broken cortex) but still some contact between the fracture planes; IV = complete fracture with no bone continuity (broken cortex) and no contact between the fracture planes. A diagnostic algorithm to support the practical application of the grading system in a clinical setting, as well as an aid using a circle placed over the capitellum, was proposed. The overall kappa coefficients were 0.68 and 0.61 in the Phase 1 and Phase 2 studies, respectively. In the Phase 1 study, fracture grades I, II, III, and IV were classified with median accuracies of 91%, 82%, 83%, and 99.5%, respectively. Similar median accuracies of 86% (Grade I), 73% (Grade II), 83% (Grade III), and 92% (Grade IV) were reported for the Phase 2 study. Reliability was high in distinguishing complete, unstable fractures from stable injuries (kappa coefficients of 0.84 (Phase 1) and 0.83 (Phase 2)); in Phase 2, surgeons' accuracies in classifying complete fractures were all above 85%.
Conclusions: With clear and unambiguous definitions, this new grading system for supracondylar fracture displacement has proved sufficiently reliable and accurate when applied by pediatric surgeons in routine clinical practice as well as in research.
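The four grades can be encoded as a small decision function. The boolean inputs below are simplifications of what a surgeon reads off the AP and lateral radiographs, stated here only to make the grading logic explicit:

```python
# Hypothetical encoding of the four displacement grades described above.
# Inputs are simplified booleans standing in for radiographic findings.

def grade_supracondylar(displaced, hinge_cortex_intact, fragments_in_contact):
    """Return displacement grade I-IV for a pediatric supracondylar fracture."""
    if not displaced:
        return "I"    # incomplete fracture, no or minimal displacement
    if hinge_cortex_intact:
        return "II"   # posterior (extension) or anterior (flexion) cortex continuous
    if fragments_in_contact:
        return "III"  # both cortices broken, some contact between fracture planes
    return "IV"       # complete fracture, no contact between fracture planes

print(grade_supracondylar(True, False, False))  # IV
```

The high reliability reported for distinguishing complete (III/IV) from stable (I/II) fractures corresponds to the `hinge_cortex_intact` branch above.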

Relevance: 20.00%

Abstract:

The relationship between source separation and blind deconvolution is well known: if a filtered version of an unknown i.i.d. signal is observed, temporal independence between samples can be used to retrieve the original signal, just as spatial independence is used for source separation. In this paper we propose using a Genetic Algorithm (GA) to blindly invert linear channels. The use of a GA is justified when the number of samples is small, where gradient-based methods fail because of poor estimation of the statistics.
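As a concrete illustration of the approach, the sketch below evolves a short FIR equalizer with a toy GA: since the source is i.i.d. and strongly non-Gaussian, the absolute excess kurtosis of the equalized output can serve as the fitness. Everything here (channel, population size, mutation scheme, kurtosis fitness) is an assumption for illustration, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Unknown i.i.d. source (binary +-1, strongly sub-Gaussian) and channel.
N = 150                            # deliberately few samples: the regime
s = rng.choice([-1.0, 1.0], N)     # where gradient methods suffer
h = np.array([1.0, 0.5])
x = np.convolve(s, h)[:N]          # observed filtered signal

def fitness(w):
    """|excess kurtosis| of the equalized output; an i.i.d. non-Gaussian
    source maximizes this, so larger is better."""
    y = np.convolve(x, w)[:N]
    y = (y - y.mean()) / (y.std() + 1e-12)
    return abs(np.mean(y ** 4) - 3.0)

def ga(pop_size=40, gens=60, L=4, sigma=0.3):
    pop = rng.normal(0, 1, (pop_size, L))
    pop[0] = np.eye(1, L)[0]              # seed the identity filter as a baseline
    for _ in range(gens):
        fit = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(fit)[::-1][: pop_size // 2]]
        children = parents + rng.normal(0, sigma, parents.shape)  # mutation
        pop = np.vstack([parents, children])  # elitism: parents survive
    fit = np.array([fitness(w) for w in pop])
    return pop[fit.argmax()], fit.max()

w_best, f_best = ga()
print(f_best, fitness(np.array([1.0, 0.0, 0.0, 0.0])))
```

Because the best individual always survives, the evolved filter's fitness can never fall below that of the unequalized baseline.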

Relevance: 20.00%

Abstract:

Defining the limits of an urban agglomeration is essential for both fundamental and applied studies in quantitative and theoretical geography. A simple and consistent way of defining such urban clusters is important for performing statistical analyses and comparisons. Traditionally, agglomerations are defined using a rather qualitative approach based on various statistical measures; the definition generally varies from one country to another, and the data taken into account differ. In this paper, we explore the use of the City Clustering Algorithm (CCA) for defining agglomerations in Switzerland. This algorithm provides a systematic and easy way to define an urban area based only on population data. The CCA allows the spatial resolution used to define the urban clusters to be specified. The results from different resolutions are compared and analysed, and the effect of filtering the data is investigated. Different scales and parameters highlight different phenomena. The study of Zipf's law using the visual rank-size rule shows that it is valid only for some specific urban clusters, inside a narrow range of the spatial resolution of the CCA. The scale at which one main cluster emerges can also be found in the analysis using Zipf's law. Studying the urban clusters at different scales using the lacunarity measure, a complement to the fractal dimension, makes it possible to highlight the change of scale at a given range.
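A minimal version of the CCA idea, assuming a gridded population raster, can be sketched as a flood fill: populated cells that are adjacent (here, 4-connected) are merged into one urban cluster, and the grid cell size plays the role of the spatial resolution discussed above. The published CCA has further variants (e.g. coarse-graining radii) not shown here:

```python
from collections import deque

def city_clusters(pop_grid, threshold=0.0):
    """Group 4-connected grid cells with population above `threshold`
    into urban clusters; return cluster populations, largest first."""
    rows, cols = len(pop_grid), len(pop_grid[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c] or pop_grid[r][c] <= threshold:
                continue
            total, q = 0, deque([(r, c)])
            seen[r][c] = True
            while q:                      # breadth-first flood fill
                i, j = q.popleft()
                total += pop_grid[i][j]
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < rows and 0 <= nj < cols \
                            and not seen[ni][nj] and pop_grid[ni][nj] > threshold:
                        seen[ni][nj] = True
                        q.append((ni, nj))
            sizes.append(total)
    return sorted(sizes, reverse=True)

grid = [[9, 8, 0, 0],
        [0, 7, 0, 3],
        [0, 0, 0, 2]]
print(city_clusters(grid))  # [24, 5]
```

The sorted cluster sizes are exactly what the rank-size (Zipf) analysis in the paper operates on.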

Relevance: 20.00%

Abstract:

OBJECTIVE: To test the effect of a multidimensional lifestyle intervention on aerobic fitness and adiposity in predominantly migrant preschool children. DESIGN: Cluster randomised controlled single-blinded trial (Ballabeina study) over one school year; randomisation was performed after stratification for linguistic region. SETTING: 40 preschool classes in areas with a high migrant population in the German- and French-speaking regions of Switzerland. PARTICIPANTS: 652 of the 727 preschool children had informed consent and were present for baseline measures (mean age 5.1 years (SD 0.7); 72% migrants of multicultural origins). No children withdrew, but 26 moved away. INTERVENTION: The multidimensional, culturally tailored lifestyle intervention included a physical activity programme; lessons on nutrition, media use (television and computers), and sleep; and adaptation of the built environment of the preschool class. It lasted from August 2008 to June 2009. MAIN OUTCOME MEASURES: Primary outcomes were aerobic fitness (20 m shuttle run test) and body mass index (BMI). Secondary outcomes included motor agility, balance, percentage body fat, waist circumference, physical activity, eating habits, media use, sleep, psychological health, and cognitive abilities. RESULTS: Compared with controls, children in the intervention group had an increase in aerobic fitness at the end of the intervention (adjusted mean difference 0.32 stages, 95% confidence interval 0.07 to 0.57; P=0.01) but no difference in BMI (-0.07 kg/m², -0.19 to 0.06; P=0.31). Relative to controls, children in the intervention group had beneficial effects in motor agility (-0.54 s, -0.90 to -0.17; P=0.004), percentage body fat (-1.1%, -2.0 to -0.2; P=0.02), and waist circumference (-1.0 cm, -1.6 to -0.4; P=0.001). There were also significant benefits in the intervention group in reported physical activity, media use, and eating habits, but not in the remaining secondary outcomes.
CONCLUSIONS: A multidimensional intervention increased aerobic fitness and reduced body fat but not BMI in predominantly migrant preschool children.

Relevance: 20.00%

Abstract:

Context: Ovarian tumor (OT) typing is a competency expected of pathologists, with significant clinical implications. OTs, however, come in numerous different types, some rather rare, with the consequence that some departments have few opportunities for practice. Aim: Our aim was to design a tool for pathologists to train in typing less common OTs. Method and Results: Representative slides of 20 less common OTs were scanned (NanoZoomer Digital, Hamamatsu®) and the diagnostic algorithm proposed by Young and Scully was applied to each case (Young RH and Scully RE, Seminars in Diagnostic Pathology 2001, 18: 161-235), comprising: recognition of the morphological pattern(s); shortlisting of differential diagnoses; and proposal of relevant immunohistochemical markers. The next steps of this project will be: evaluation of the tool in several postgraduate training centers in Europe and Québec; improvement of its design based on the evaluation results; and diffusion to a larger public. Discussion: In clinical medicine, solving many cases is recognized as essential for a novice to become an expert. This project relies on virtual-slide technology to provide pathologists with a learning tool aimed at increasing their skills in OT typing. After due evaluation, this model might be extended to other uncommon tumors.

Relevance: 20.00%

Abstract:

We present molecular dynamics (MD) simulation results for dense fluids of ultrasoft, fully penetrable particles. These are a binary mixture and a polydisperse system of particles interacting via the generalized exponential model, which is known to yield cluster crystal phases for the corresponding monodisperse systems. Because of the dispersity in particle size, the systems investigated in this work do not crystallize and instead form disordered cluster phases. The clustering transition appears as a smooth crossover to a regime in which particles are mostly located in clusters, with isolated particles being infrequent. The analysis of the internal cluster structure reveals microsegregation of the big and small particles, with strong homo-coordination in the binary mixture. Upon further lowering the temperature below the clustering transition, the motion of the clusters' centers of mass slows down dramatically, giving way to a cluster glass transition. In the cluster glass, the diffusivities remain finite and display an activated temperature dependence, indicating that relaxation in the cluster glass occurs via particle hopping in a nearly arrested matrix of clusters. Finally, we discuss the influence of the microscopic dynamics on the transport properties by comparing the MD results with Monte Carlo simulations.
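For reference, the generalized exponential model (GEM-n) pair potential underlying these simulations is u(r) = ε·exp(−(r/σ)^n); for n > 2 the energy cost of full overlap is bounded by ε, which is what makes cluster phases possible. A small sketch (parameter values are illustrative, not those of the paper):

```python
import numpy as np

def gem_potential(r, eps=1.0, sigma=1.0, n=4):
    """GEM-n pair potential u(r) = eps * exp(-(r/sigma)**n).

    Bounded at r = 0 (full overlap costs only eps), and for n > 2 it
    decays faster than a Gaussian -- the combination known to stabilize
    cluster crystal / cluster fluid phases."""
    return eps * np.exp(-(np.asarray(r) / sigma) ** n)

r = np.array([0.0, 0.5, 1.0, 2.0])
print(gem_potential(r))
```

In an MD code this u(r) and its derivative would supply the pair forces; polydispersity enters by drawing a different σ for each particle.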

Relevance: 20.00%

Abstract:

In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic one and then apply simulation to the latter. To achieve this, we rely on Monte Carlo simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity, because it is not constrained by prior assumptions and relies on well-tested heuristics.
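The two-step methodology — transforming the stochastic instance into a deterministic one, then evaluating by simulation — can be sketched as follows. The NEH insertion heuristic and the lognormal processing times are assumptions chosen for illustration; the paper's adapted heuristic is not specified here:

```python
import numpy as np

rng = np.random.default_rng(7)

def makespan(seq, P):
    """Permutation flow-shop makespan for time matrix P[job, machine]."""
    m = P.shape[1]
    C = np.zeros(m)
    for j in seq:
        C[0] += P[j, 0]
        for k in range(1, m):
            C[k] = max(C[k], C[k - 1]) + P[j, k]
    return C[-1]

def neh(P):
    """NEH insertion heuristic on a deterministic time matrix."""
    order = np.argsort(-P.sum(axis=1))   # jobs by decreasing total work
    seq = []
    for j in order:
        best = min((makespan(seq[:i] + [j] + seq[i:], P), i)
                   for i in range(len(seq) + 1))
        seq.insert(best[1], j)
    return seq

# Stochastic instance: lognormal processing times (assumed distribution).
n_jobs, n_mach, n_samples = 6, 3, 200
mu = rng.uniform(1.0, 3.0, (n_jobs, n_mach))
samples = rng.lognormal(np.log(mu), 0.25, (n_samples, n_jobs, n_mach))

# Step 1: reduce to a deterministic problem via the Monte Carlo mean...
P_mean = samples.mean(axis=0)
seq = neh(P_mean)
# Step 2: ...then evaluate the sequence by simulation on the sampled scenarios.
expected = np.mean([makespan(seq, s) for s in samples])
print(seq, expected)
```

The final Monte Carlo evaluation is what distinguishes this from simply solving the expected-value instance: it scores the heuristic sequence against the full distribution of scenarios.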

Relevance: 20.00%

Abstract:

The pharmaceutical industry has been facing several challenges in recent years, and the optimization of the drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches that aim to rationalize the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still have several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, combined with sophisticated diversity management. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root-mean-square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic: correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped explain the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, which led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides showed activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.


Relevance: 20.00%

Abstract:

General clustering deals with weighted objects and fuzzy memberships. We investigate the group- or object-aggregation-invariance properties possessed by the relevant functionals (effective number of groups or objects, centroids, dispersion, mutual object-group information, etc.). The classical squared Euclidean case can be generalized to non-Euclidean distances, as well as to non-linear transformations of the memberships, yielding the c-means clustering algorithm as well as two presumably new procedures, the convex and pairwise convex clustering. Cluster stability and aggregation-invariance of the optimal memberships associated to the various clustering schemes are examined as well.
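A sketch of the weighted fuzzy clustering scheme discussed above, in the classical squared-Euclidean case that yields the c-means algorithm (the non-Euclidean and convex variants are not shown; the fuzzifier m and the random initialization are illustrative choices):

```python
import numpy as np

def weighted_c_means(X, w, c=2, m=2.0, iters=50, seed=0):
    """Fuzzy c-means with object weights w (squared-Euclidean case).

    Memberships z[i, g] are updated from distances to the centroids;
    centroids are weight- and membership-weighted means; m > 1 is the
    usual fuzzifier exponent."""
    rng = np.random.default_rng(seed)
    n = len(X)
    z = rng.dirichlet(np.ones(c), n)           # random fuzzy memberships
    for _ in range(iters):
        u = w[:, None] * z ** m                # weighted, fuzzified memberships
        centroids = (u.T @ X) / u.sum(axis=0)[:, None]
        d2 = ((X[:, None, :] - centroids[None]) ** 2).sum(-1) + 1e-12
        z = d2 ** (-1.0 / (m - 1.0))           # standard FCM membership update
        z /= z.sum(axis=1, keepdims=True)
    return z, centroids

X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1]])
w = np.ones(5)
z, cent = weighted_c_means(X, w)
print(np.round(cent.ravel(), 1))
```

With unit weights this reduces to ordinary fuzzy c-means; aggregating two identical objects into one with doubled weight leaves the centroids unchanged, which is the aggregation-invariance property examined in the text.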

Relevance: 20.00%

Abstract:

The determination of gross alpha, gross beta, and 226Ra activity in natural waters is useful in a wide range of environmental studies. Furthermore, the gross alpha and gross beta parameters are included in international legislation on the quality of drinking water [Council Directive 98/83/EC]. In this work, a low-background liquid scintillation counter (Wallac Quantulus 1220) was used to simultaneously determine gross alpha, gross beta, and 226Ra activity in natural water samples. Sample preparation involved evaporation to remove 222Rn and its short-lived decay daughters. The evaporation process concentrated the sample ten-fold. Afterwards, a sample aliquot of 8 mL was mixed with 12 mL of Ultima Gold AB scintillation cocktail in low-diffusion vials. In this study, a theoretical mathematical model based on secular-equilibrium conditions between 226Ra and its short-lived decay daughters is presented. The proposed model makes it possible to determine 226Ra activity from two measurements, which also allow gross alpha and gross beta to be determined simultaneously. To validate the proposed model, spiked samples with different activity levels for each parameter were analysed. Additionally, to evaluate the model's applicability to natural water, eight natural water samples from different parts of Spain were analysed. These samples were also characterised by alpha spectrometry for the naturally occurring isotopes of uranium (234U, 235U and 238U), radium (224Ra and 226Ra), 210Po, and 232Th. The results for gross alpha and 226Ra activity were compared with the alpha-spectrometry characterisation, and acceptable concordance was obtained.
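The two-measurement idea can be illustrated as a small linear system. The ingrowth coefficients below are hypothetical placeholders, not the paper's calibrated values; the point is only that two alpha count rates — one taken before significant ingrowth of the 222Rn daughters and one after — determine both the gross alpha and the 226Ra activity:

```python
import numpy as np

# Hedged sketch of the two-measurement model.  Sample preparation removes
# 222Rn; the short-lived 226Ra daughters then grow back in, so the alpha
# count rate at each measurement time is linear in the unknown activities.
# k1, k2 are illustrative ingrowth factors, not the paper's fitted values.

def solve_activities(alpha1, alpha2, k1=0.1, k2=3.0):
    """Solve  alpha_i = A_gross + k_i * A_Ra  for (A_gross, A_Ra)."""
    M = np.array([[1.0, k1],
                  [1.0, k2]])
    return np.linalg.solve(M, np.array([alpha1, alpha2]))

# If gross alpha is 0.5 Bq/L and 226Ra is 0.2 Bq/L, the two measured
# rates would be 0.5 + 0.1*0.2 = 0.52 and 0.5 + 3.0*0.2 = 1.10:
gross, ra = solve_activities(0.52, 1.10)
print(gross, ra)  # recovers 0.5 and 0.2
```

In practice the coefficients would come from the decay equations under secular equilibrium and the measured alpha/beta discrimination efficiencies.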

Relevance: 20.00%

Abstract:

STUDY DESIGN: Retrospective database query to identify all anterior spinal approaches. OBJECTIVES: To assess all patients with pharyngo-cutaneous fistulas (PCF) after anterior cervical spine surgery (ACSS). SUMMARY OF BACKGROUND DATA: Patients treated at the University of Heidelberg Spine Medical Center, Spinal Cord Injury Unit, and Department of Otolaryngology (Germany) between 2005 and 2011 with a diagnosis of PCF. METHODS: We conducted a retrospective study of 5 patients treated between 2005 and 2011 for PCF after ACSS, covering their management and outcome according to radiologic data and patient charts. RESULTS: Upon presentation, 4 patients were paraplegic. Two had PCF arising from one piriform sinus, two from the posterior pharyngeal wall and piriform sinus combined, and one only from the posterior pharyngeal wall. Two had undergone previous unsuccessful surgical repair elsewhere, and one had prior radiation therapy. In 3 patients, speech and swallowing could be completely restored; 2 patients, both paraplegic, died. The patients needed an average of 2-3 procedures for complete functional recovery, consisting of primary closure with various vascularised regional flaps and refining laser procedures, supplemented with negative-pressure wound therapy where needed. CONCLUSION: Based on our experience, we propose a treatment algorithm indicating that chronic, as opposed to acute, fistulas require primary surgical closure combined with a vascularised flap, accompanied by the immediate application of negative-pressure wound therapy. We also conclude that, particularly in paraplegic patients, the risk of a fatal outcome from this complication is substantial.