992 results for "Discrete valued features"
Abstract:
Breast cancer is the most common cancer among women. In CAD systems, several studies have investigated the use of the wavelet transform as a multiresolution analysis tool for texture analysis, whose coefficients can serve as inputs to a classifier. For classification, the polynomial classifier has been used because it provides a single model for optimal separation of the classes, which can be taken as the solution of the problem. In this paper, a system is proposed for texture analysis and classification of lesions in mammographic images. Multiresolution analysis features were extracted from the region of interest of a given image. These features were computed with three different wavelet functions: Daubechies 8, Symlet 8 and biorthogonal 3.7. For classification, we used the polynomial classification algorithm to label the mammogram images as normal or abnormal, and compared it with other artificial intelligence algorithms (Decision Tree, SVM, K-NN). A Receiver Operating Characteristic (ROC) curve is used to evaluate the performance of the proposed system. The system was evaluated on 360 digitized mammograms from the DDSM database, and the algorithm achieved an area under the ROC curve Az of 0.98 ± 0.03. The polynomial classifier proved to perform better than the other classification algorithms. © 2013 Elsevier Ltd. All rights reserved.
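As an illustration of the kind of pipeline this abstract describes, here is a minimal sketch assuming PyWavelets and scikit-learn. The wavelet families follow the abstract (db8, sym8, bior3.7); the energy/entropy features and the polynomial-expansion-plus-logistic-regression classifier are stand-ins for the paper's unspecified feature set and polynomial classifier.

```python
# Illustrative sketch only: wavelet sub-band features from a mammogram ROI,
# classified with a polynomial feature expansion (a stand-in for the paper's
# polynomial classifier). Wavelets follow the abstract; everything else is assumed.
import numpy as np
import pywt
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def wavelet_features(roi, wavelets=("db8", "sym8", "bior3.7"), level=3):
    """Energy and entropy of each detail sub-band, for each wavelet family."""
    feats = []
    for w in wavelets:
        coeffs = pywt.wavedec2(roi, w, level=level)
        for detail in coeffs[1:]:                 # (cH, cV, cD) per level
            for band in detail:
                energy = np.sum(band ** 2)
                p = band.ravel() ** 2 / (energy + 1e-12)
                feats += [energy, -np.sum(p * np.log(p + 1e-12))]
    return np.array(feats)

# rois: list of 2-D ROI arrays; y: 0 = normal, 1 = abnormal (placeholders)
# X = np.vstack([wavelet_features(r) for r in rois])
# clf = make_pipeline(StandardScaler(), PolynomialFeatures(degree=2),
#                     LogisticRegression(max_iter=1000))
# clf.fit(X_train, y_train)
# print("Az:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```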
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Background: Factor analyses indicate that hoarding symptoms constitute a distinctive dimension of obsessive-compulsive disorder (OCD), usually associated with higher severity and limited insight. The aim was to compare demographic and clinical features of OCD patients with and without hoarding symptoms. Method: A cross-sectional study was conducted with 1001 DSM-IV OCD patients from the Brazilian Research Consortium of Obsessive-Compulsive Spectrum Disorders (CTOC), using several instruments. The presence and severity of hoarding symptoms were determined using the Dimensional Yale-Brown Obsessive-Compulsive Scale. Univariate statistical analyses comparing factors possibly associated with hoarding symptoms were conducted, followed by logistic regression to adjust the results for possible confounders. Results: Approximately half of the sample (52.7%, n = 528) presented hoarding symptoms, but only four patients presented solely the hoarding dimension. Hoarding was the least severe dimension in the total sample (mean score: 3.89). The most common lifetime hoarding symptom was the obsessive thought of needing to collect and keep things for the future (44.0%, n = 440). After logistic regression, the following variables remained independently associated with hoarding symptoms: being older, living alone, earlier age at symptom onset, insidious onset of obsessions, higher anxiety scores, poorer insight and higher frequency of the symmetry-ordering symptom dimension. Concerning comorbidities, major depressive, posttraumatic stress and attention deficit/hyperactivity disorders, compulsive buying and tic disorders remained associated with the hoarding dimension. Conclusion: OCD hoarding patients are more likely to present certain clinical features, but further studies are needed to determine whether OCD patients with hoarding symptoms constitute an etiologically discrete subgroup. © 2012 Elsevier Ltd. All rights reserved.
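A minimal sketch of the analysis strategy described (logistic regression adjusting for possible confounders), assuming statsmodels; the data frame and all column names below are hypothetical and do not correspond to the actual CTOC variables.

```python
# Illustrative only: hypothetical variable names, not the CTOC data set.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# df: a pandas DataFrame with one row per patient, 'hoarding' coded 0/1.
# model = smf.logit(
#     "hoarding ~ age + lives_alone + onset_age + anxiety_score"
#     " + insight_score + symmetry_ordering",
#     data=df,
# ).fit()
# print(model.summary())
# print(np.exp(model.params))   # adjusted odds ratios
```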
Abstract:
For a locally compact Hausdorff space $K$ and a Banach space $X$ we denote by $C_0(K, X)$ the space of $X$-valued continuous functions on $K$ which vanish at infinity, provided with the supremum norm. Let $n$ be a positive integer, $\Gamma$ an infinite set with the discrete topology, and $X$ a Banach space having non-trivial cotype. We first prove that if the $n$th derived set of $K$ is not empty, then the Banach-Mazur distance between $C_0(\Gamma, X)$ and $C_0(K, X)$ is greater than or equal to $2n + 1$. We also show that the Banach-Mazur distance between $C_0(\mathbb{N}, X)$ and $C([1, \omega^{n} k], X)$ is exactly $2n + 1$, for any positive integers $n$ and $k$. These results extend and provide a vector-valued version of some 1970 Cambern theorems, concerning the cases where $n = 1$ and $X$ is the scalar field.
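Stated in symbols (writing $d_{BM}$ for the Banach-Mazur distance and $K^{(n)}$ for the $n$th derived set of $K$), the two results read:

```latex
\[
  K^{(n)} \neq \emptyset \;\Longrightarrow\;
  d_{BM}\bigl(C_0(\Gamma, X),\, C_0(K, X)\bigr) \ge 2n + 1,
\]
\[
  d_{BM}\bigl(C_0(\mathbb{N}, X),\, C\bigl([1, \omega^{n} k], X\bigr)\bigr) = 2n + 1
  \qquad \text{for all positive integers } n,\, k.
\]
```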
Abstract:
The surface electrocardiogram (ECG) is an established diagnostic tool for the detection of abnormalities in the electrical activity of the heart. The interest of the ECG, however, extends beyond diagnosis. In recent years, studies in cognitive psychophysiology have related heart rate variability (HRV) to memory performance and mental workload. The aim of this thesis was to analyze the variability of surface-ECG-derived rhythms at two different time scales: the discrete-event time scale, typical of beat-related features (Objective I), and the “continuous” time scale of separated sources in the ECG (Objective II), in selected scenarios relevant to psychophysiological and clinical research, respectively. Objective I) Joint time-frequency and non-linear analysis of HRV was carried out, with the goal of assessing psychophysiological workload (PPW) in response to working-memory-engaging tasks. Results from fourteen healthy young subjects suggest the potential use of the proposed indices in discriminating PPW levels in response to varying memory-search task difficulty. Objective II) A novel source-cancellation method based on morphology clustering was proposed for the estimation of the atrial wavefront in atrial fibrillation (AF) from body surface potential maps. A strong direct correlation between the spectral concentration (SC) of the atrial wavefront and the temporal variability of the spectral distribution was shown in persistent AF patients, suggesting that with higher SC a shorter observation time is required to collect the spectral distribution from which the fibrillatory rate is estimated. This could be time- and cost-effective in clinical decision-making. The results held for reduced lead sets, suggesting that a simplified setup could also be considered, further reducing the costs. In designing the methods of this thesis, an online signal-processing approach was adopted, with the goal of contributing to real-world applicability. An algorithm for automatic assessment of ambulatory ECG quality and an automatic ECG delineation algorithm were designed and validated.
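A minimal sketch of how a fibrillatory rate and spectral concentration could be computed from an extracted atrial signal, assuming SciPy. The thesis' exact definitions are not given in the abstract, so the 3-12 Hz search band and the ±1 Hz concentration window below are illustrative assumptions.

```python
# Illustrative sketch: dominant frequency (DF) and spectral concentration (SC)
# of an extracted atrial signal via a Welch periodogram.
import numpy as np
from scipy.signal import welch

def dominant_frequency_and_sc(atrial, fs, band=(3.0, 12.0), half_width=1.0):
    f, pxx = welch(atrial, fs=fs, nperseg=4 * int(fs))
    in_band = (f >= band[0]) & (f <= band[1])
    df = f[in_band][np.argmax(pxx[in_band])]          # fibrillatory rate (Hz)
    around = (f >= df - half_width) & (f <= df + half_width)
    sc = pxx[around].sum() / pxx[in_band].sum()       # fraction of atrial power near DF
    return df, sc

# fs = 1000.0
# df_hz, sc = dominant_frequency_and_sc(atrial_signal, fs)
```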
Abstract:
Astroblastoma is a historically traded microscopic diagnosis denoting a rare neuroepithelial tumor of uncertain nosology, involving a distinctive pattern of pseudorosette arrangement of neoplastic cells. While displaying some glial properties, the latter shall not, by definition, be either reducible to or part of any conventional glioma type. We report on clinicopathologic correlations in a case of astroblastoma involving an extensive rhabdoid phenotype of tumor cells. The male patient was operated on at the ages of 53 and 59 years for a left parietal tumor measuring 5.8 cm in diameter at first presentation. On magnetic resonance imaging and angiography, both the primary tumor and its recurrence were discrete, highly vascularized, and contrast-enhancing. The second surgery was complemented with radiotherapy of 66 Gy, followed by chemotherapy with temozolomide. Twelve years into the clinical history, the patient has stable minimal residual disease at the age of 65. A review of pathology samples from both surgeries showed well-differentiated astroblastoma according to current standards, with an MIB-1 labeling index of 1% and 4%, respectively. Neither of the specimens showed cellular anaplasia, overt mitotic activity, microvascular proliferation, or palisading necrosis. Most tumor cells harbored paranuclear filamentous rhabdoid inclusions that were immunostained for vimentin and, in part, also for GFAP. No polyantigenic reactivity was observed. This example contributes another facet to the spectrum of the so-called composite rhabdoid tumors. Involving a low-grade parent neoplasm, it also further substantiates the incipient perception that neither is the rhabdoid phenotype a peculiar but nonspecific convergence point of anaplastic evolution, nor are such lesions indiscriminately bound for a relentless course.
Abstract:
This paper presents a multi-stage algorithm for the dynamic condition monitoring of a gear. The algorithm provides information about the gear status (faulty or normal condition) and estimates the mesh stiffness per shaft revolution if any abnormality is detected. In the first stage, the analysis of coefficients generated through the discrete wavelet transform (DWT) is proposed as a fault detection and localization tool. The second stage establishes the mesh stiffness reduction associated with local failures by applying supervised learning coupled with analytical models. To do this, a multi-layer perceptron neural network has been configured using as input features statistical parameters that are sensitive to the torsional stiffness decrease and derived from wavelet transforms of the response signal. The proposed method is applied to gear condition monitoring, and the results show that it can update the mesh dynamic properties of the gear online.
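A minimal sketch of the two-stage idea (DWT-based statistical features per shaft revolution, then a multi-layer perceptron mapping them to a mesh-stiffness reduction), assuming PyWavelets and scikit-learn. The wavelet, the features and the network configuration below are illustrative assumptions, not the paper's exact choices.

```python
# Illustrative sketch: DWT detail-coefficient features fed to an MLP regressor
# that estimates the mesh-stiffness reduction per shaft revolution.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

def dwt_stat_features(vibration, wavelet="db4", level=4):
    """Per-level RMS, peak factor and energy of the detail coefficients."""
    feats = []
    for d in pywt.wavedec(vibration, wavelet, level=level)[1:]:
        rms = np.sqrt(np.mean(d ** 2))
        feats += [rms, np.max(np.abs(d)) / (rms + 1e-12), np.sum(d ** 2)]
    return np.array(feats)

# X = np.vstack([dwt_stat_features(rev) for rev in revolutions])  # one row per revolution
# mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X_train, k_reduction_train)
# k_hat = mlp.predict(X)   # estimated mesh-stiffness reduction per revolution
```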
Abstract:
Many systems in chemistry, biology, finance, and the social sciences present emerging features that are not easy to guess from the elementary interactions of their microscopic individual components. In the past, the macroscopic behavior of such systems was modeled by assuming that the collective dynamics of the microscopic components can be effectively described by equations acting on spatially continuous density distributions. It turns out that, on the contrary, taking into account the actual individual/discrete character of the microscopic components of these systems is crucial for explaining their macroscopic behavior. In fact, we find that in conditions in which the continuum approach would predict the extinction of the entire population (respectively, the vanishing of the invested capital, or of the concentration of a chemical substance, etc.), the microscopic granularity ensures the emergence of macroscopic localized subpopulations with collective adaptive properties that allow their survival and development. In particular, it is found that in two dimensions “life” (the localized proliferating phase) always prevails.
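A toy illustration of the granularity effect described, not the authors' model: with the same average birth and death rates, the continuum (mean-field) density decays toward extinction, while discrete agents reproducing on a quenched random catalyst field survive on locally favourable sites. All parameters below are arbitrary assumptions.

```python
# Toy comparison of a mean-field ("continuum") prediction with a discrete
# stochastic population on a lattice of random catalysts.
import numpy as np

rng = np.random.default_rng(1)
L, T = 64, 200
lam, mu = 0.10, 0.12                         # birth rate per catalyst, death probability
A = rng.poisson(1.0, size=(L, L))            # quenched random catalyst field, mean 1

# Mean-field prediction: b_{t+1} = b_t * (1 + lam*<A> - mu), which decays
# toward extinction because lam*<A> < mu.
b_meanfield = 10.0 * (1.0 + lam * A.mean() - mu) ** T

# Discrete stochastic population: integer agents per lattice site.
B = np.full((L, L), 10)
for _ in range(T):
    births = rng.poisson(lam * A * B)        # catalyst-driven reproduction
    deaths = rng.binomial(B, mu)             # individual deaths
    B = np.minimum(B + births - deaths, 10_000)   # cap = crude carrying capacity

print("mean-field density per site:", b_meanfield)   # ~0.2 and shrinking
print("discrete population per site:", B.mean())     # sustained by favourable sites
```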
Abstract:
This article presents an array antenna with beam-steering capability in azimuth over a wide frequency band, using real-valued weighting coefficients that can be realized in practice by amplifiers or attenuators. The described beamforming scheme relies on a 2D (instead of 1D) array structure in order to ensure that there are enough degrees of freedom to realize a given radiation pattern in both the angular and frequency domains. In the presented approach, weights are determined using an inverse discrete Fourier transform (IDFT) technique, neglecting the mutual coupling between array elements. Because of the presence of mutual coupling, the actual array produces a radiation pattern with increased side-lobe levels. To counter this effect, the design aims to realize the initial radiation pattern with a lower side-lobe level. This strategy is demonstrated in the design example of a 4 × 4 element array. © 2005 Wiley Periodicals, Inc.
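A minimal numerical sketch of the IDFT weight-synthesis idea for a uniform 4 × 4 array, with mutual coupling ignored as in the abstract. The desired pattern samples on the DFT grid are assumptions chosen so that the resulting excitations come out real-valued; the paper's actual design example is not reproduced here.

```python
# Illustrative sketch: element weights obtained as the 2-D IDFT of desired
# array-factor samples; the realized pattern on the same grid matches exactly
# when mutual coupling is neglected.
import numpy as np

N = M = 4                                            # 4 x 4 element array
u = np.arange(N)                                     # DFT bins along the angle axis

# Hypothetical desired samples: broadside beam with a symmetric cosine taper
# along the angle axis, identical at every frequency bin (second dimension).
taper = 0.5 + 0.5 * np.cos(2 * np.pi * u / N)
desired = np.outer(taper, np.ones(M))

weights = np.fft.ifft2(desired).real                 # real-valued element excitations
pattern = np.fft.fft2(weights)                       # pattern samples actually realized
assert np.allclose(pattern, desired)                 # exact without mutual coupling
```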
Abstract:
The traditional method of classifying neurodegenerative diseases is based on the original clinico-pathological concept supported by 'consensus' criteria and data from molecular pathological studies. This review discusses first, current problems in classification resulting from the coexistence of different classificatory schemes, the presence of disease heterogeneity and multiple pathologies, the use of 'signature' brain lesions in diagnosis, and the existence of pathological processes common to different diseases. Second, three models of neurodegenerative disease are proposed: (1) that distinct diseases exist ('discrete' model), (2) that relatively distinct diseases exist but exhibit overlapping features ('overlap' model), and (3) that distinct diseases do not exist and neurodegenerative disease is a 'continuum' in which there is continuous variation in clinical/pathological features from one case to another ('continuum' model). Third, to distinguish between models, the distribution of the most important molecular 'signature' lesions across the different diseases is reviewed. Such lesions often have poor 'fidelity', i.e., they are not unique to individual disorders but are distributed across many diseases consistent with the overlap or continuum models. Fourth, the question of whether the current classificatory system should be rejected is considered and three alternatives are proposed, viz., objective classification, classification for convenience (a 'dissection'), or analysis as a continuum.
Abstract:
In this paper we examine discrete functions that depend on their variables in a particular way, namely H-functions. The results obtained in this work make the “construction” of these functions possible. H-functions, as well as their matrix representation by Latin hypercubes, are generalized.
Abstract:
The paper considers the basic concepts for constructing many-valued intelligent systems that are suited to the principal problems of human activity and use hybrid tools with many-valued coding. The many-valued intelligent systems created are two-valued in implementation, yet they simulate neural processes of spatial summation that differ in the level of action, the inertial and threshold properties of neuron membranes, and the modulation of the repetition frequency of the transmitted messages. All of the enumerated properties and functions are, in essence, not only discrete in time but also many-valued.
Abstract:
Engineering education in the United Kingdom is at the point of embarking upon an interesting journey into uncharted waters. At no point in the past have there been so many drivers for change and so many opportunities for the development of engineering pedagogy. This paper will look at how Engineering Education Research (EER) has developed within the UK and what differentiates it from the many small-scale practitioner interventions, perhaps without a clear research question or with little evaluation, which are presented at numerous staff development sessions, workshops and conferences. From this position some examples of current projects will be described, outcomes of funding opportunities will be summarised and the benefits of collaboration with other disciplines illustrated. In this study, I will account for how the design of the task structure according to variation theory, as well as the probe-ware technology, makes the laws of force and motion visible and learnable and, especially, in the lab studied makes Newton's third law visible and learnable. I will also, as a comparison, include data from a mechanics lab that uses the same probe-ware technology and deals with the same topics in mechanics, but uses a differently designed task structure. I will argue that the lower achievements on the FMCE test in this latter case can be attributed to these differences in the task structure in the lab instructions. According to my analysis, the necessary pattern of variation is not included in the design. I will also present a microanalysis of 15 hours of video recordings of engineering students' activities in a lab about impulse and collisions. The important object of learning in this lab is the development of an understanding of Newton's third law. The approach to analysing students' interaction using video data is inspired by ethnomethodology and conversation analysis, i.e., I focus on students' practical, contingent and embodied inquiry in the setting of the lab. I argue that my results corroborate variation theory and show that this theory can be used as a 'tool' for designing labs as well as for analysing labs and lab instructions. Thus my results have implications outside the domain of this study, in particular for understanding critical features for student learning in labs. Engineering higher education is well used to change. As technology develops, the abilities expected of graduates by employers expand, yet our understanding of how to make informed decisions about learning and teaching strategies does not, without a conscious effort to do so. With the numerous demands of academic life, we often fail to acknowledge our incomplete understanding of how our students learn within our discipline. The journey facing engineering education in the UK is being driven by two classes of driver. Firstly, there are those which we have been working to expand our understanding of, such as retention and employability, and secondly the new challenges, such as substantial changes to funding systems allied with an increase in student expectations. Only through continued research can priorities be identified and addressed, and a coherent and strong voice for informed change be heard within the wider engineering education community.
This new position makes it even more important that through EER we acquire the knowledge and understanding needed to make informed decisions regarding approaches to teaching, curriculum design and measures to promote effective student learning. This then raises the question 'how does EER function within a diverse academic community?' Within an existing community of academics interested in taking meaningful steps towards understanding the ongoing challenges of engineering education, a Special Interest Group (SIG) has formed in the UK. The formation of this group has itself been part of the rapidly changing environment through its facilitation by the Higher Education Academy's Engineering Subject Centre, an entity which, through the Academy's current restructuring, will no longer exist as a discrete Centre dedicated to supporting engineering academics. The aims of this group, the activities it is currently undertaking and how it expects to network and collaborate with the global EER community will be reported in this paper. This will include an explanation of how the group has identified barriers to the progress of EER and how it is seeking, through a series of activities, to facilitate recognition and growth of EER both within the UK and with our valued international colleagues.