41 results for typological classification of languages


Relevance:

100.00%

Abstract:

Clinical and pathological heterogeneity of breast cancer hinders the selection of appropriate treatment for individual cases. Molecular profiling at the gene or protein level may elucidate the biological variance of tumors and provide a new classification system that correlates better with biological, clinical and prognostic parameters. We studied the immunohistochemical profile of a panel of seven important biomarkers using tumor tissue arrays. The tumor samples were then classified with a monothetic (binary variables) clustering algorithm. Two distinct groups of tumors were characterized by estrogen receptor (ER) status and tumor grade (p = 0.0026). Four biomarkers, c-erbB2, Cox-2, p53 and VEGF, were significantly overexpressed in tumors with the ER-negative (ER-) phenotype. Eight subsets of tumors were further identified according to the expression status of VEGF, c-erbB2 and p53. The malignant potential of the ER-/VEGF+ subgroup was associated with the strong correlations of Cox-2 and c-erbB2 with VEGF. Our results indicate that this molecular classification system, based on statistical analysis of immunohistochemical profiling, is a useful approach for tumor grouping. Some of these subgroups show a relative genetic homogeneity that may allow further study of specific genetically controlled metabolic pathways. This approach may hold great promise in rationalizing the application of different therapeutic strategies to different subgroups of breast tumors. (C) 2003 Elsevier Inc. All rights reserved.
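For readers unfamiliar with monothetic clustering, the sketch below illustrates the general idea on binary marker data: unlike k-means, each split is made on one variable at a time, so every cluster is defined by explicit marker statuses (e.g., ER-/VEGF+). This is a minimal MONA-style illustration; the seven-marker data, the association score, and the stopping rule are assumptions for illustration, not the paper's actual algorithm.

# Minimal sketch of monothetic (divisive) clustering on binary biomarker
# profiles, in the spirit of the MONA algorithm. Data and split criterion
# are hypothetical, not reproduced from the paper.
import numpy as np

def association(x, y):
    # |ad - bc| from the 2x2 contingency table of two binary variables
    a = np.sum((x == 1) & (y == 1))
    b = np.sum((x == 1) & (y == 0))
    c = np.sum((x == 0) & (y == 1))
    d = np.sum((x == 0) & (y == 0))
    return abs(a * d - b * c)

def mona_split(data, depth=0, max_depth=3):
    # Stop when the cluster is too small or the tree is deep enough
    n, p = data.shape
    if n < 2 or depth >= max_depth:
        return {"samples": n}
    # Split on the variable most associated with all the others
    scores = [sum(association(data[:, j], data[:, k])
                  for k in range(p) if k != j) for j in range(p)]
    j = int(np.argmax(scores))
    pos, neg = data[data[:, j] == 1], data[data[:, j] == 0]
    if len(pos) == 0 or len(neg) == 0:
        return {"samples": n}
    return {"split_on_marker": j,
            "positive": mona_split(pos, depth + 1, max_depth),
            "negative": mona_split(neg, depth + 1, max_depth)}

# Hypothetical 0/1 expression calls for a seven-marker panel
rng = np.random.default_rng(0)
profiles = rng.integers(0, 2, size=(40, 7))
print(mona_split(profiles))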

Relevance:

100.00%

Abstract:

Color segmentation of images usually requires manual selection and classification of samples to train the system. This paper presents an automatic system that performs these tasks without the need for lengthy training, providing a useful tool to detect and identify figures. In real situations the training process must be repeated whenever lighting conditions change, or when, within the same scenario, the colors of the figures or the background change, so a fast training method is valuable. A direct application of this method is the detection and identification of football players.
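As an illustration of fast, unsupervised color training, the sketch below clusters pixel colors with k-means and then labels every pixel by its nearest learned color class, so retraining after a lighting change is just one re-run. OpenCV, the HSV color space, and the three-class setup (two teams plus background) are assumptions for illustration, not the system described in the paper.

# Minimal sketch: automatic color-class training via k-means, then
# per-pixel classification. Pipeline details are hypothetical.
import cv2
import numpy as np

def train_color_classes(image_bgr, n_classes=3):
    # Cluster the pixels in HSV space; the cluster centers act as the
    # automatically learned color classes
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    pixels = hsv.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, _, centers = cv2.kmeans(pixels, n_classes, None, criteria,
                               3, cv2.KMEANS_PP_CENTERS)
    return centers  # re-run this when lighting conditions change

def classify_pixels(image_bgr, centers):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    pixels = hsv.reshape(-1, 3).astype(np.float32)
    # Assign each pixel to the nearest learned color class
    dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
    return dists.argmin(axis=1).reshape(image_bgr.shape[:2])

frame = cv2.imread("match_frame.png")        # hypothetical input frame
centers = train_color_classes(frame)         # fast, unsupervised training
label_map = classify_pixels(frame, centers)  # per-pixel class indices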

Relevance:

100.00%

Abstract:

PURPOSE. To describe and classify patterns of abnormal fundus autofluorescence (FAF) in eyes with early nonexudative age-related macular disease (AMD). METHODS. FAF images were recorded in eyes with early AMD by confocal scanning laser ophthalmoscopy (cSLO) with excitation at 488 nm (argon or OPSL laser) and emission above 500 or 521 nm (barrier filter). A standardized protocol for image acquisition and generation of mean images after automated alignment was applied, and routine fundus photographs were obtained. FAF images were classified by two independent observers. The kappa (κ) statistic was applied to assess intra- and interobserver variability. RESULTS. Alterations in FAF were classified into eight phenotypic patterns: normal, minimal change, focal increased, patchy, linear, lacelike, reticular, and speckled. Areas with abnormally increased or decreased FAF signal may or may not have corresponded to funduscopically visible alterations. For intraobserver variability, κ was 0.80 for observer I (95% confidence interval [CI], 0.71-0.89) and 0.74 for observer II (95% CI, 0.64-0.84). For interobserver variability, κ was 0.77 (95% CI, 0.67-0.87). CONCLUSIONS. Various phenotypic patterns of abnormal FAF can be identified with cSLO imaging. Distinct patterns may reflect heterogeneity at the cellular and molecular level, in contrast to a nonspecific aging process. The results indicate that the classification system yields a relatively high degree of intra- and interobserver agreement. It may be applicable for the determination of novel prognostic determinants in longitudinal natural history studies, for the identification of genetic risk factors, and for the monitoring of future therapeutic interventions to slow the progression of early AMD. Copyright © Association for Research in Vision and Ophthalmology.
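The κ values reported above can be illustrated with a short computation. The sketch below implements Cohen's kappa for two observers grading the same eyes into the eight FAF patterns; the gradings are hypothetical, and the confidence intervals quoted in the paper would require an additional variance estimate or a bootstrap, which is omitted here.

# Minimal sketch of Cohen's kappa for two observers' pattern gradings.
# The gradings are hypothetical examples, not study data.
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    n = len(ratings_a)
    # Observed agreement: fraction of cases both observers label alike
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from each observer's marginal label frequencies
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

observer1 = ["normal", "patchy", "reticular", "patchy", "linear", "patchy"]
observer2 = ["normal", "patchy", "reticular", "lacelike", "linear", "patchy"]
print(f"kappa = {cohen_kappa(observer1, observer2):.2f}")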