930 results for classification algorithm
Abstract:
We present an in-depth study of tissue modelling and classification techniques on T1-weighted MR images. Three approaches have been taken into account to perform this validation study. Two of them are based on the Finite Gaussian Mixture (FGM) model: the first uses only pure Gaussian distributions (FGM-EM), while the second uses a different model for partial volume (PV) voxels (FGM-GA). The third is based on a Hidden Markov Random Field (HMRF) model. All methods have been tested on a digital brain phantom image considered as the ground truth. Noise and intensity non-uniformities have been added to simulate real image conditions, and the effect of an anisotropic filter is also considered. Results demonstrate that methods relying on both intensity and spatial information are in general more robust to noise and inhomogeneities. However, in some cases there are no significant differences between the presented methods.
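The intensity-only finite Gaussian mixture modelling compared above can be illustrated with a minimal sketch (not the authors' FGM-EM implementation) that fits scikit-learn's GaussianMixture to the voxel intensities of a hypothetical volume; the array name volume, the three-class setup and the synthetic phantom are assumptions:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def classify_tissues(volume, n_classes=3):
        """Fit a finite Gaussian mixture to voxel intensities and return a
        label volume (intensity-only, no spatial prior)."""
        intensities = volume.reshape(-1, 1).astype(float)
        gmm = GaussianMixture(n_components=n_classes, covariance_type="full",
                              random_state=0)
        labels = gmm.fit_predict(intensities)
        # Reorder labels so class index increases with mean intensity
        # (e.g. CSF < grey matter < white matter on T1-weighted images).
        order = np.argsort(gmm.means_.ravel())
        remap = np.zeros_like(order)
        remap[order] = np.arange(n_classes)
        return remap[labels].reshape(volume.shape)

    # Example: a synthetic three-class "phantom" with added Gaussian noise.
    rng = np.random.default_rng(0)
    phantom = rng.choice([30.0, 80.0, 140.0], size=(32, 32, 32))
    segmentation = classify_tissues(phantom + rng.normal(0.0, 5.0, phantom.shape))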
Abstract:
The primary purposes of this investigation are: 1) to delineate flood plain deposits with different geologic and engineering properties; 2) to provide basic data necessary for any attempt at stabilizing flood plain deposits. The alluvial valley of the Missouri River adjacent to Iowa was chosen as the logical place to begin this study. The river forms the western boundary of the state for an airline distance of approximately 139 miles, and the flood plain varies from a maximum width of approximately 18 miles (Plates 2 and 3, Sheets 75 and 75L) to approximately 4 miles near Crescent, Iowa (Plate 8, Sheet 66). The area studied includes parts of Woodbury, Monona, Harrison, Pottawattamie, Mills, and Fremont counties in Iowa and parts of Dakota, Thurston, Burt, Washington, Douglas, Sarpy, Cass and Otoe counties in Nebraska. Plate 1 is an index map of the area under consideration.
Abstract:
The Iowa D.O.T. has a classification system designed to rate coarse aggregates by their skid-resistant characteristics. Aggregates have been classified into five functional types, with Type 1 being the most skid resistant. A complete description of the classification system can be found in the Office of Materials Instructional Memorandum T-203. Due to the variability of ledges within any given quarry, the classification of individual ledges becomes necessary. The type of aggregate is then specified for each asphaltic concrete surface course. As various aggregates come into use in a.c. paving, there is a continuing process of evaluating the frictional properties of the pavement surface. It is primarily through an effort of this sort that information on aggregate sources and individual ledges becomes more refined. This study is being conducted to provide the up-to-date information needed to monitor the aggregate classification system.
Abstract:
This document, Classifications and Pay Plans, is produced by the State of Iowa Executive Branch, Department of Administrative Services. It is an informational document about the pay plan codes and classification codes and how to use them.
Abstract:
Context: Ovarian tumor (OT) typing is a competency expected from pathologists, with significant clinical implications. OT, however, come in numerous different types, some rather rare, with the consequence that some departments have few opportunities for practice. Aim: Our aim was to design a tool for pathologists to train in the typing of less common OT. Method and Results: Representative slides of 20 less common OT were scanned (Nano Zoomer Digital Hamamatsu®) and the diagnostic algorithm proposed by Young and Scully was applied to each case (Young RH and Scully RE, Seminars in Diagnostic Pathology 2001, 18: 161-235), including: recognition of morphological pattern(s); shortlisting of differential diagnoses; proposition of relevant immunohistochemical markers. The next steps of this project will be: evaluation of the tool in several post-graduate training centers in Europe and Québec; improvement of its design based on the evaluation results; diffusion to a larger public. Discussion: In clinical medicine, solving many cases is recognized as being of utmost importance for a novice to become an expert. This project relies on virtual slide technology to provide pathologists with a learning tool aimed at increasing their skills in OT typing. After due evaluation, this model might be extended to other uncommon tumors.
Abstract:
OBJECTIVE: To assess whether Jass staging enhances prognostic prediction in Dukes' B colorectal carcinoma. DESIGN: A historical cohort observational study. SETTING: A university tertiary care centre, Switzerland. SUBJECTS: 108 consecutive patients. INTERVENTIONS: Curative resection of Dukes' B colorectal carcinoma between January 1985 and December 1988. Patients with familial adenomatous polyposis, hereditary non-polyposis colorectal cancer, Crohn's disease, ulcerative colitis, and synchronous or recurrent tumours were excluded. A comparable group of 155 consecutive patients with Dukes' C carcinoma was included for reference purposes. MAIN OUTCOME MEASURES: Disease-free and overall survival for Dukes' B and overall survival for Dukes' C tumours. RESULTS: Dukes' B tumours in Jass group III or with an infiltrated margin had significantly worse disease-free survival (p = 0.001 and 0.0001, respectively), and those with infiltrated margins had significantly worse overall survival (p = 0.002). Overall survival among those with Dukes' B Jass III tumours and Dukes' B tumours with infiltrated margins was no better than overall survival among all patients with Dukes' C tumours. CONCLUSION: Jass staging and the nature of the margin of invasion allow patients undergoing curative surgery for Dukes' B colorectal carcinoma to be separated into prognostic groups. A group of patients with Dukes' B tumours whose prognosis is inseparable from that of patients with Dukes' C tumours can be identified, the nature of the margin of invasion being used to classify a larger number of patients.
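The survival comparisons summarized above are of the kind usually carried out with Kaplan-Meier estimates and a log-rank test; the sketch below uses the lifelines package on placeholder data (not the study data), with all column names assumed:

    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Placeholder cohort: follow-up time in months, event indicator
    # (1 = recurrence or death) and whether the tumour is Jass group III.
    df = pd.DataFrame({
        "months":   [12, 34, 56, 23, 45, 67, 18, 29],
        "event":    [1, 0, 1, 1, 0, 0, 1, 1],
        "jass_iii": [1, 0, 1, 0, 1, 0, 0, 1],
    })
    a, b = df[df.jass_iii == 1], df[df.jass_iii == 0]

    # Kaplan-Meier estimate for one group (repeat for the other to compare curves).
    KaplanMeierFitter().fit(a["months"], event_observed=a["event"], label="Jass III")

    # Log-rank test between the two groups.
    res = logrank_test(a["months"], b["months"],
                       event_observed_A=a["event"], event_observed_B=b["event"])
    print(res.p_value)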
Abstract:
In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic problem and then apply simulation to the latter. In order to achieve this goal, we rely on Monte Carlo Simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity because it is not constrained by any prior assumptions and relies on well-tested heuristics.
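As a rough illustration of this idea (not the authors' exact algorithm), a sketch might replace the stochastic processing times by their Monte Carlo sample means, order the jobs with a simple deterministic rule, and then estimate the expected makespan of that sequence by simulation; all names, the problem sizes and the choice of "longest total work first" as a stand-in heuristic are assumptions:

    import numpy as np

    def makespan(times):
        """Permutation flow-shop makespan for times[job, machine]."""
        n_jobs, n_machines = times.shape
        c = np.zeros((n_jobs, n_machines))
        for j in range(n_jobs):
            for m in range(n_machines):
                c[j, m] = times[j, m] + max(c[j - 1, m] if j else 0.0,
                                            c[j, m - 1] if m else 0.0)
        return c[-1, -1]

    def hybrid_sfsp(sample_times):
        """sample_times[s, job, machine]: Monte Carlo draws of processing times.
        1) Reduce to a deterministic instance via sample means.
        2) Apply a simple deterministic rule (stand-in for a heuristic such as NEH).
        3) Evaluate the chosen sequence by simulation over the samples."""
        mean_times = sample_times.mean(axis=0)
        order = np.argsort(-mean_times.sum(axis=1))   # longest total work first
        expected = np.mean([makespan(s[order]) for s in sample_times])
        return order, expected

    rng = np.random.default_rng(1)
    samples = rng.lognormal(mean=1.0, sigma=0.3, size=(200, 5, 3))  # 200 scenarios
    sequence, exp_makespan = hybrid_sfsp(samples)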
Abstract:
A table showing a comparison and classification of tools (intelligent tutoring systems) for e-learning of Logic at a college level.
Abstract:
The pharmaceutical industry has been facing several challenges during the last years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information that they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented, EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of the diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, and led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
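The 2 Å RMSD success criterion used in this validation is straightforward to reproduce; a minimal, hypothetical sketch (assuming two already-superposed coordinate arrays with matching atom order, which is not how EADock itself is implemented) could be:

    import numpy as np

    def rmsd(pose, reference):
        """Root mean square deviation between two (n_atoms, 3) coordinate
        arrays with identical atom ordering, without re-superposition."""
        diff = np.asarray(pose) - np.asarray(reference)
        return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

    def is_correct_binding_mode(pose, crystal, threshold=2.0):
        """A binding mode counts as correct if its RMSD to the crystal pose
        is below the threshold (2 Angstroms in the validation above)."""
        return rmsd(pose, crystal) < threshold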
Abstract:
This letter presents advanced classification methods for very high resolution images. Multisource information, both spectral and spatial, is efficiently exploited through the use of composite kernels in support vector machines. Weighted summations of kernels accounting for separate sources of spectral and spatial information are analyzed and compared to classical approaches such as pure spectral classification or stacked approaches using all the features in a single vector. Model selection problems are addressed, as well as the importance of the different kernels in the weighted summation.
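A weighted summation of spectral and spatial kernels of the kind analyzed here can be sketched with a precomputed kernel in scikit-learn; the weight mu, the RBF widths, the feature split and the random data are illustrative assumptions, not the letter's exact setup:

    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel
    from sklearn.svm import SVC

    def composite_kernel(X_spec, X_spat, Y_spec=None, Y_spat=None,
                         mu=0.5, gamma_spec=0.1, gamma_spat=0.1):
        """Weighted sum of a spectral and a spatial RBF kernel:
        K = mu * K_spectral + (1 - mu) * K_spatial."""
        k_spec = rbf_kernel(X_spec, Y_spec, gamma=gamma_spec)
        k_spat = rbf_kernel(X_spat, Y_spat, gamma=gamma_spat)
        return mu * k_spec + (1.0 - mu) * k_spat

    # Illustrative data: 100 pixels, 30 spectral features, 10 spatial features.
    rng = np.random.default_rng(0)
    X_spec = rng.normal(size=(100, 30))
    X_spat = rng.normal(size=(100, 10))
    y = rng.integers(0, 3, size=100)

    K_train = composite_kernel(X_spec, X_spat)
    clf = SVC(kernel="precomputed").fit(K_train, y)

    # Prediction needs the kernel between test pixels and the training set.
    K_test = composite_kernel(X_spec[:10], X_spat[:10], X_spec, X_spat)
    pred = clf.predict(K_test)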
Abstract:
The objective of this work was to evaluate the application of the spectral-temporal response surface (STRS) classification method to Moderate Resolution Imaging Spectroradiometer (MODIS, 250 m) sensor images in order to estimate soybean areas in Mato Grosso state, Brazil. The classification was carried out using the maximum likelihood algorithm (MLA) adapted to the STRS method. Thirty segments of 30x30 km were chosen along the main agricultural regions of Mato Grosso state, using data from the 2005/2006 summer season (October to March), and were mapped based on fieldwork data and TM/Landsat-5 and CCD/CBERS-2 images. Five thematic classes were considered: Soybean, Forest, Cerrado, Pasture and Bare Soil. The STRS classification was performed over the area intersecting the subset of 30x30-km segments. In regions where soybean predominates, the STRS classification overestimated the reference values by 21.31%. In regions where soybean fields were less prevalent, the classifier overestimated the reference acreage by 132.37%. The overall classification accuracy was 80%. MODIS sensor images and the STRS algorithm proved promising for the classification of soybean areas in regions dominated by large farms. However, the results for fragmented areas and smaller farms were less accurate, overestimating soybean areas.
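The maximum likelihood rule referred to above assigns each pixel (here represented by its multi-date spectral profile) to the class whose estimated multivariate Gaussian gives the highest likelihood; the generic sketch below, with all names and shapes assumed, is not the study's adapted MLA:

    import numpy as np
    from scipy.stats import multivariate_normal

    def train_mlc(samples_by_class):
        """samples_by_class: dict class_name -> (n_samples, n_features) array,
        where the features are e.g. a pixel's multi-date spectral profile."""
        return {c: (x.mean(axis=0), np.cov(x, rowvar=False))
                for c, x in samples_by_class.items()}

    def classify_mlc(pixels, params):
        """Assign each pixel (row of a (n_pixels, n_features) array) to the
        class with the highest Gaussian log-likelihood."""
        classes = list(params)
        scores = np.column_stack([
            multivariate_normal(mean=m, cov=cov, allow_singular=True).logpdf(pixels)
            for m, cov in (params[c] for c in classes)])
        return np.asarray(classes)[np.argmax(scores, axis=1)]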
Abstract:
The glasses of the rosette forming the main window of the transept of the Gothic Cathedral of Tarragona have been characterised by means of SEM/EDS, XRD, FTIR and electron microprobe. The multivariate statistical treatment of these data allows a classification of the samples to be established, forming groups of historical significance that reflect ancient restorations. Furthermore, the decay patterns and mechanisms have been determined and the weathering by-products characterised. A clear influence of bioactivity on the decay of these glasses has been demonstrated; this activity is partially controlled by the chemical composition of the glasses.
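A multivariate treatment of this kind, grouping samples by their measured compositions, can be illustrated, purely as an assumed sketch unrelated to the actual data, by standardization followed by hierarchical clustering:

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    # Placeholder composition matrix: rows = glass samples,
    # columns = measured oxide concentrations (e.g. SiO2, Na2O, K2O, CaO, ...).
    rng = np.random.default_rng(0)
    compositions = rng.normal(size=(24, 8))

    # Standardize each variable, then cluster with Ward linkage.
    z = (compositions - compositions.mean(axis=0)) / compositions.std(axis=0)
    tree = linkage(pdist(z), method="ward")
    groups = fcluster(tree, t=3, criterion="maxclust")   # e.g. three groups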
Abstract:
General clustering deals with weighted objects and fuzzy memberships. We investigate the group- or object-aggregation-invariance properties possessed by the relevant functionals (effective number of groups or objects, centroids, dispersion, mutual object-group information, etc.). The classical squared Euclidean case can be generalized to non-Euclidean distances, as well as to non-linear transformations of the memberships, yielding the c-means clustering algorithm as well as two presumably new procedures, convex and pairwise convex clustering. Cluster stability and the aggregation-invariance of the optimal memberships associated with the various clustering schemes are examined as well.
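For reference, the standard (unweighted, squared-Euclidean) fuzzy c-means iteration that the text generalizes alternates membership and centroid updates; a compact sketch, with the fuzzifier m and all names assumed, is:

    import numpy as np

    def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
        """Standard fuzzy c-means on data X (n_samples, n_features):
        alternate fuzzy membership updates and weighted centroid updates."""
        rng = np.random.default_rng(seed)
        u = rng.random((len(X), c))
        u /= u.sum(axis=1, keepdims=True)       # memberships sum to 1 per object
        for _ in range(n_iter):
            w = u ** m
            centroids = (w.T @ X) / w.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            d = np.maximum(d, 1e-12)            # avoid division by zero
            inv = d ** (-2.0 / (m - 1.0))
            u = inv / inv.sum(axis=1, keepdims=True)
        return u, centroids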