873 results for Support Vector Machines and Naive Bayes Classifier


Relevance: 100.00%

Publisher:

Abstract:

Smart homes for the aging population have recently started attracting the attention of the research community. The "health state" of smart homes comprises many different levels; starting with the physical health of citizens, it also includes longer-term health norms and outcomes, as well as the arena of positive behavior changes. One of the problems of interest is to monitor the activities of daily living (ADL) of the elderly, aiming at their protection and well-being. For this purpose, we installed passive infrared (PIR) sensors to detect motion in a specific area inside a smart apartment and used them to collect a set of ADL. In a novel approach, we describe a technology that allows the ground truth collected in one smart home to train activity recognition systems for other smart homes. We asked the users to label all instances of all ADL only once and subsequently applied data mining techniques to cluster in-home sensor firings, so that each cluster represents the instances of the same activity. Once the clusters were associated with their corresponding activities, our system was able to recognize future activities. To improve activity recognition accuracy, our system preprocessed raw sensor data by identifying overlapping activities. To evaluate recognition performance on a 200-day dataset, we implemented and compared three different active learning classification algorithms: naive Bayesian (NB), support vector machine (SVM) and random forest (RF). Based on our results, the RF classifier recognized activities with an average specificity of 96.53%, a sensitivity of 68.49%, a precision of 74.41% and an F-measure of 71.33%, outperforming both the NB and SVM classifiers. Further clustering markedly improved the results of the RF classifier. An activity recognition system based on PIR sensors in conjunction with a clustering classification approach was able to detect ADL from datasets collected in different homes. Thus, our PIR-based smart home technology could improve care and provide valuable information to better understand the functioning of our societies, as well as to inform both individual and collective action in a smart city scenario.
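
As an illustration only (not the authors' data or pipeline), the short Python sketch below compares the three classifier families named in the abstract on a synthetic binary activity-recognition task and reports the same metrics: specificity, sensitivity, precision and F-measure. All dataset parameters are assumptions.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Synthetic stand-in for labelled sensor-derived feature vectors.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("NB", GaussianNB()),
                  ("SVM", SVC(kernel="rbf", gamma="scale")),
                  ("RF", RandomForestClassifier(n_estimators=200, random_state=0))]:
    y_hat = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_hat).ravel()
    specificity = tn / (tn + fp)
    sensitivity = tp / (tp + fn)          # recall
    precision = tp / (tp + fp)
    f_measure = 2 * precision * sensitivity / (precision + sensitivity)
    print(f"{name}: spec={specificity:.3f} sens={sensitivity:.3f} "
          f"prec={precision:.3f} F={f_measure:.3f}")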

Relevance: 100.00%

Publisher:

Abstract:

In this work we propose an image acquisition and processing methodology (framework) developed for in-field grape and leaf detection and quantification, based on a six-step methodology: 1) image segmentation through Fuzzy C-Means with Gustafson-Kessel (FCM-GK) clustering; 2) use of the FCM-GK outputs (centroids) as seeds for K-Means clustering; 3) identification of the clusters generated by K-Means using a Support Vector Machine (SVM) classifier; 4) morphological operations over the grape and leaf clusters in order to fill holes and eliminate small pixel clusters; 5) creation of a mosaic image using the Scale-Invariant Feature Transform (SIFT) in order to avoid overlapping between images; 6) calculation of the leaf and grape areas and location of the centroids of the grape bunches. Image data are collected using a colour camera fixed to a mobile platform. This platform was developed to provide a stabilized surface and to guarantee that the images were acquired parallel to the vineyard rows; in this way, the platform avoids image distortions that lead to poor estimation of the areas. Our preliminary results are promising, although they show that it is still necessary to implement a camera stabilization system to avoid undesired camera movements, as well as a parallel processing procedure to speed up the mosaicking process.
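
A minimal Python sketch of steps 2 and 3 of the pipeline described above is given below. FCM-GK itself is not implemented here; hypothetical RGB centroids stand in for its output, and the SVM is trained on a handful of hand-labelled pixels purely for illustration.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)
pixels = rng.uniform(0, 255, size=(10000, 3))          # stand-in for an RGB image, N x 3

# Centroids that FCM-GK would provide (hypothetical values: grape, leaf, background).
seed_centroids = np.array([[90.0, 40.0, 110.0],
                           [60.0, 140.0, 50.0],
                           [200.0, 200.0, 200.0]])

# Step 2: K-Means seeded with the FCM-GK centroids.
km = KMeans(n_clusters=3, init=seed_centroids, n_init=1, random_state=0).fit(pixels)

# Step 3: SVM trained on a few labelled pixels (0=grape, 1=leaf, 2=other),
# then used to name each K-Means cluster by classifying its centroid.
labelled_px = np.array([[85, 45, 100], [65, 135, 55], [210, 205, 195]])
labels = np.array([0, 1, 2])
svm = SVC(kernel="rbf", gamma="scale").fit(labelled_px, labels)
cluster_names = svm.predict(km.cluster_centers_)
print("cluster -> class:", dict(enumerate(cluster_names)))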

Relevance: 100.00%

Publisher:

Abstract:

Inertial sensors (accelerometers and gyroscopes) have been gradually embedded in the devices that people use in their daily lives thanks to their miniaturization. Nowadays all smartphones have at least one embedded accelerometer and magnetometer, and the most up-to-date ones also contain gyroscopes and barometers. This, together with the fact that the penetration of smartphones is growing steadily, has made possible the design of systems that rely on the information gathered by wearable sensors (in the future contained in smart textiles) or by inertial sensors embedded in a smartphone. The role of these sensors has become key to the development of context-aware and ambient intelligence applications. Some examples are the monitoring of rehabilitation exercises, the provision of information related to the place that the user is visiting, or the interaction with objects by gesture recognition. The work of this thesis explores to what extent this kind of sensor can be useful to support activity recognition and pedestrian tracking, which have been proven to be essential for these applications. Regarding the recognition of the activity that a user performs, the use of sensors embedded in a smartphone (proximity and light sensors, gyroscopes, magnetometers and accelerometers) has been explored. The activities that are detected belong to the group known as 'atomic' activities (e.g. walking at different paces, running, standing), that is, activities or movements that are part of more complex activities such as doing the dishes or commuting. Simple, well-known classifiers that can run embedded in a smartphone have been tested, such as Naïve Bayes, Decision Tables and Decision Trees. In addition, another aim is to estimate the on-body position in which the user is carrying the mobile phone, not only to choose a classifier trained with data recorded in the corresponding position (a strategy that improves the activity estimation) but also to trigger actions. Finally, the performance of the different classifiers is analysed, taking into consideration different features and numbers of sensors; the computational and memory load of the classifiers is also measured. 
In addition, an algorithm based on step counting has been proposed. The acceleration information is provided by an accelerometer placed on the foot. The aim is to detect the activity that the user is performing together with an estimate of the distance covered. The step counting strategy is based on detecting minima and their corresponding maxima; although the counting strategy itself is not novel (it uses time windows and amplitude thresholds to prevent under- or overestimation), no user-specific information is required. Indoor pedestrian tracking is of particular interest due to the lack of a localization standard for this kind of environment. A loosely coupled, centralized Extended Kalman Filter has been proposed to perform the fusion of inertial and position measurements. Zero-velocity updates have been applied whenever the foot is detected to be placed on the ground. The results have been obtained in indoor environments using positions estimated by a triangulation algorithm based on received signal strength (RSS) measurements, and with GPS outdoors. Finally, some applications have been designed to test the usefulness of the work. The first one is an activity monitor whose aim is to prevent sedentary behaviours and to help modify habits towards a desired level of activity. Two different versions of the application have been implemented: the first uses the activity estimation based on the step counting algorithm, integrated in an OSGi mobile framework that acquires data from a Bluetooth accelerometer placed on the foot of the individual; the second uses the activity classifiers embedded in an Android smartphone. In addition, the design of a travel logbook has been planned. The input of this application is the information provided by the activity and localization modules, external databases (e.g. pictures, points of interest, weather) and the phone's embedded and virtual sensors (agenda, camera, etc.). The aim is to detect important events in the journey and gather the information necessary to store them as journal pages.
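
Below is a minimal Python sketch of the step-counting idea summarised above (peak/valley detection constrained by a time window and an amplitude threshold), applied to a synthetic acceleration signal. The sampling rate, thresholds and signal are assumptions, not the thesis' actual parameters.

import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                   # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
# Synthetic foot-acceleration magnitude: ~1.8 steps per second plus noise.
accel = 1.0 + 0.8 * np.sin(2 * np.pi * 1.8 * t) \
        + 0.05 * np.random.default_rng(0).normal(size=t.size)

min_step_interval = 0.3                      # seconds; time-window constraint
amp_threshold = 0.3                          # amplitude constraint

peaks, _ = find_peaks(accel, distance=int(min_step_interval * fs), prominence=amp_threshold)
valleys, _ = find_peaks(-accel, distance=int(min_step_interval * fs), prominence=amp_threshold)

steps = min(len(peaks), len(valleys))        # a step needs a matched maximum/minimum pair
print(f"detected steps: {steps}")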

Relevance: 100.00%

Publisher:

Abstract:

Pragmatism is the leading motivation of regularization. We can understand regularization as a modification of the maximum-likelihood estimator so that a reasonable answer can be given in an unstable or ill-posed situation. To mention some typical examples, this happens when fitting parametric or non-parametric models with more parameters than data, or when estimating large covariance matrices. Regularization is also commonly used to improve the bias-variance tradeoff of an estimation. The definition of regularization is therefore quite general and, although the introduction of a penalty is probably the most popular type, it is just one out of multiple forms of regularization. In this dissertation, we focus on the applications of regularization for obtaining sparse or parsimonious representations, where only a subset of the inputs is used. A particular form of regularization, L1-regularization, plays a key role in reaching sparsity. Most of the contributions presented here revolve around L1-regularization, although other forms of regularization are explored (also pursuing sparsity in some sense). In addition to presenting a compact review of L1-regularization and its applications in statistics and machine learning, we devise methodology for regression, supervised classification and structure induction of graphical models. Within the regression paradigm, we focus on kernel smoothing learning, proposing techniques for kernel design that are suitable for high-dimensional settings and sparse regression functions. We also present an application of regularized regression techniques for modeling the response of biological neurons. The supervised classification advances deal, on the one hand, with the application of regularization for obtaining a naïve Bayes classifier and, on the other hand, with a novel algorithm for brain-computer interface design that uses group regularization in an efficient manner. Finally, we present a heuristic for inducing the structure of Gaussian Bayesian networks using L1-regularization as a filter.
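
As a simple illustration of the sparsity that L1-regularization induces (not the dissertation's own methodology), the Python sketch below fits a Lasso model to a synthetic problem with more features than samples and counts the surviving coefficients.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

# More parameters (200) than data points (100), only 10 truly informative features.
X, y = make_regression(n_samples=100, n_features=200, n_informative=10,
                       noise=1.0, random_state=0)

lasso = Lasso(alpha=1.0).fit(X, y)           # L1 penalty drives most coefficients to zero
print("non-zero coefficients:", np.sum(lasso.coef_ != 0), "out of", X.shape[1])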

Relevance: 100.00%

Publisher:

Abstract:

This paper describes our participation in the RepLab 2014 reputation dimensions scenario. Our goal was to evaluate the best strategy for combining a machine learning classifier with a rule-based algorithm based on logical expressions of terms. Results show that our baseline experiment, using just Naive Bayes Multinomial with a term-vector representation of the tweet text, ranked second among all participants' runs in terms of accuracy.
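
A minimal Python sketch of such a baseline, assuming a scikit-learn style setup and toy tweets rather than the RepLab 2014 corpus, is shown below; the reputation-dimension labels are purely hypothetical.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy examples only; in the real task these would be the labelled RepLab tweets.
tweets = ["great customer service today", "stock price drops after report",
          "love the new product line", "lawsuit filed against the company"]
dimensions = ["products", "finance", "products", "legal"]

# Term-vector representation + Multinomial Naive Bayes, as in the baseline run.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(tweets, dimensions)
print(model.predict(["new product launch announced"]))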

Relevance: 100.00%

Publisher:

Abstract:

A new LIBS quantitative analysis method based on adaptive analytical line selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of the spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines that will be used as input variables of the regression model are determined adaptively according to the samples used for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of confidence intervals of a probabilistic distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples were carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness than methods based on partial least squares regression, artificial neural networks and a standard support vector machine.
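
scikit-learn ships no RVM implementation, so the Python sketch below uses ARDRegression as a loosely comparable sparse Bayesian stand-in that also returns a predictive standard deviation. The line intensities and concentrations are synthetic, and the snippet only illustrates the regression step described above, not the line-selection scheme.

import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
line_intensities = rng.uniform(0, 1, size=(23, 8))        # 23 standards, 8 selected lines
concentration = line_intensities @ rng.uniform(0, 5, 8) + rng.normal(0, 0.05, 23)

# Sparse Bayesian linear regression; predict() can return a standard deviation,
# which gives an approximate confidence interval for each prediction.
model = ARDRegression().fit(line_intensities, concentration)
mean, std = model.predict(line_intensities[:3], return_std=True)
for m, s in zip(mean, std):
    print(f"predicted concentration: {m:.2f} +/- {1.96 * s:.2f} (95% interval)")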

Relevance: 100.00%

Publisher:

Abstract:

Background: Reactivation of p53 by either gene transfer or pharmacologic approaches may compensate for loss of p19Arf or excess mdm2 expression, common events in melanoma and glioma. In our previous work, we constructed the pCLPG retroviral vector, in which transgene expression is controlled by p53 through a p53-responsive promoter. The use of this vector to introduce p19Arf into tumor cells that harbor p53wt should yield viral expression of p19Arf which, in turn, would activate the endogenous p53 and result in enhanced vector expression and tumor suppression. Since nutlin-3 can activate p53 by blocking its interaction with mdm2, we explored the possibility that the combination of p19Arf gene transfer and nutlin-3 drug treatment may provide an additive benefit in stimulating p53 function. Methods: B16 (mouse melanoma) and C6 (rat glioma) cell lines, which harbor p53wt, were transduced with pCLPGp19 and additionally treated with nutlin-3 or the DNA-damaging agent doxorubicin. Viral expression was confirmed by Western, Northern and immunofluorescence assays. p53 function was assessed by reporter gene activity provided by a p53-responsive construct. Alterations in proliferation and viability were measured by colony formation, growth curve, cell cycle and MTT assays. In an animal model, B16 cells were treated with the pCLPGp19 virus and/or drugs before subcutaneous injection in C57BL/6 mice, observation of tumor progression and histopathologic analyses. Results: Here we show that the functional activation of endogenous p53wt in B16 was particularly challenging, but was accomplished when combined gene transfer and drug treatments were applied, resulting in increased transactivation by p53, marked cell cycle alteration and reduced viability in culture. In the animal model, B16 cells treated with both p19Arf and nutlin-3 yielded increased necrosis and decreased BrdU marking. In comparison, C6 cells were quite susceptible to either treatment, yet p53 was further activated by the combination of p19Arf and nutlin-3. Conclusions: To the best of our knowledge, this is the first study to apply both p19Arf and nutlin-3 for the stimulation of p53 activity. These results support the notion that a p53-responsive vector may prove to be an interesting gene transfer tool, especially when combined with p53-activating agents, for the treatment of tumors that retain wild-type p53.

Relevance: 100.00%

Publisher:

Abstract:

This study evaluated two different support materials (polystyrene and expanded clay) for biohydrogen production in anaerobic fluidized bed reactors (AFBRs) treating synthetic wastewater containing glucose (4000 mg L(-1)). The AFBRs, containing either polystyrene (R1) or expanded clay (R2) as support material, were inoculated with thermally pre-treated anaerobic sludge and operated at a temperature of 30 degrees C and a pH of approximately 5.5. The AFBRs were operated with a range of hydraulic retention times (HRTs) between 1 and 8 h. For R1 with an HRT of 2 h, the maximum hydrogen yield (HY) was 1.90 mol H(2) mol(-1) glucose, with 0.805 mg of biomass (as total volatile solids, or TVS) attached to each g of polystyrene. For R2 operated at an HRT of 2 h, the maximum HY was 2.59 mol H(2) mol(-1) glucose, with 1.100 mg of attached biomass (as TVS) g(-1) expanded clay. The highest hydrogen production rates (HPR) were 0.95 and 1.21 L h(-1) L(-1) for R1 and R2, respectively, using an HRT of 1 h. The H(2) content increased from 16% to 47% for R1 and from 22% to 51% for R2. No methane was detected in the biogas produced throughout the period of AFBR operation. These results show that the values of HY, HPR, H(2) content, and g of attached biomass g(-1) support material were all higher for AFBRs containing expanded clay than for reactors containing polystyrene. (C) 2010 Professor T. Nejat Veziroglu. Published by Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Publisher:

Abstract:

Objectives: This study examines the direct and mediated effects of shift workers' coping strategies and social support on structural work-nonwork conflict and subjective health. Methods: The participants were 172 registered female nurses, aged 21 to 40 years. They all worked full-time, on rapidly rotating 8-hour shifts, in metropolitan general hospitals. All the respondents completed a self-administered questionnaire requesting demographic information and data on sources of social support, work-nonwork conflict, and coping strategies. Results: A path model with good fit (chi(2)=28.88, df=23, P>.23, CFI=0.97) demonstrated complex effects of social support and coping on structural work-nonwork conflict and health. Conclusions: Structural work-nonwork conflict mediated the effects of social support from supervisors and of emotionally expressive coping on psychological symptoms. Control of shifts mediated the effect of social support from supervisors on structural work-nonwork conflict. Disengagement coping had both direct and mediated effects on psychological and physical health: its effect on psychological health was mediated by support from co-workers, and its effect on physical symptoms by family support. Co-worker support mediated the effect of social support from supervisors on psychological symptoms. Overall, these findings support previous research and clarify the process by which coping strategies and social support affect structural work-nonwork conflict and health in shift work.

Relevance: 100.00%

Publisher:

Abstract:

Here, we examine morphological changes in cortical thickness of patients with Alzheimer's disease (AD) using image analysis algorithms for brain structure segmentation, and we study the automatic classification of AD patients using cortical and volumetric data. Cortical thickness of AD patients (n = 14) was measured using MRI cortical surface-based analysis and compared with healthy subjects (n = 20). Data were analyzed using an automated algorithm for tissue segmentation and classification. A Support Vector Machine (SVM) was applied to the volumetric measurements of subcortical and cortical structures to separate AD patients from controls. The group analysis showed cortical thickness reduction in the superior temporal lobe, parahippocampal gyrus, and entorhinal cortex in both hemispheres. We also found cortical thinning in the isthmus of the cingulate gyrus and the middle temporal gyrus in the right hemisphere, as well as a reduction of the cortical mantle in areas previously shown to be associated with AD. We also confirmed that automatic classification algorithms (SVM) can help distinguish AD patients from healthy controls. Moreover, the same areas implicated in the pathogenesis of AD were the main parameters driving the classification algorithm. While the patient sample used in this study was relatively small, we expect that using a database of regional volumes derived from MRI scans of a large number of subjects will increase the power of SVM-based AD patient identification.
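
A minimal Python sketch of the classification step only, assuming synthetic regional measurements rather than the study's MRI data: a linear SVM with feature standardization, evaluated by cross-validation.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
controls = rng.normal(loc=1.0, scale=0.1, size=(20, 12))   # 12 hypothetical regional measures
patients = rng.normal(loc=0.9, scale=0.1, size=(14, 12))   # thinner/smaller on average
X = np.vstack([controls, patients])
y = np.array([0] * 20 + [1] * 14)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")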

Relevance: 100.00%

Publisher:

Abstract:

Pattern recognition methods have been successfully applied in several functional neuroimaging studies. These methods can be used to infer cognitive states, so-called brain decoding. Using such approaches, it is possible to predict the mental state of a subject or a stimulus class by analyzing the spatial distribution of neural responses. In addition, it is possible to identify the regions of the brain containing the information that underlies the classification. The Support Vector Machine (SVM) is one of the most popular methods used to carry out this type of analysis. The aim of the current study is the evaluation of SVM and Maximum uncertainty Linear Discriminant Analysis (MLDA) in extracting the voxels containing discriminative information for the prediction of mental states. The comparison was carried out using fMRI data from 41 healthy control subjects who participated in two experiments, one involving visual-auditory stimulation and the other based on bimanual finger-tapping sequences. The results suggest that MLDA uses significantly more voxels containing discriminative information (related to different experimental conditions) to classify the data. On the other hand, SVM is more parsimonious and uses fewer voxels to achieve similar classification accuracies. In conclusion, MLDA focuses on extracting all the discriminative information available, while SVM extracts only the information that is sufficient for classification. (C) 2009 Elsevier Inc. All rights reserved.
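
The Python sketch below illustrates, on synthetic voxel patterns, how discriminative voxels can be read off the weight vector of a linear classifier. MLDA is not reproduced here; scikit-learn's shrinkage LDA is used only as a rough point of contrast with the linear SVM.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 500
X = rng.normal(size=(n_trials, n_voxels))
y = rng.integers(0, 2, size=n_trials)
X[y == 1, :20] += 0.8                      # only the first 20 voxels are informative

svm = SVC(kernel="linear").fit(X, y)
lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(X, y)

# Voxels with the largest absolute weights carry the discriminative information.
top_svm = np.argsort(np.abs(svm.coef_[0]))[::-1][:20]
top_lda = np.argsort(np.abs(lda.coef_[0]))[::-1][:20]
print("SVM top-weight voxels:", np.sort(top_svm))
print("LDA top-weight voxels:", np.sort(top_lda))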

Relevance: 100.00%

Publisher:

Abstract:

Objective: To develop a model to predict the bleeding source and identify the cohort amongst patients with acute gastrointestinal bleeding (GIB) who require urgent intervention, including endoscopy. Patients with acute GIB, an unpredictable event, are most commonly evaluated and managed by non-gastroenterologists. Rapid and consistently reliable risk stratification of patients with acute GIB for urgent endoscopy may potentially improve outcomes amongst such patients by targeting scarce health-care resources to those who need them most. Design and methods: Using ICD-9 codes for acute GIB, 189 patients with acute GIB and all available data variables required to develop and test models were identified from a hospital medical records database. Data on 122 patients were used to develop the models and data on 67 patients to perform a comparative analysis of the models. Clinical data such as presenting signs and symptoms, demographic data, presence of co-morbidities, laboratory data and the corresponding endoscopic diagnoses and outcomes were collected. The clinical data and endoscopic diagnosis collected for each patient were used to retrospectively ascertain optimal management for each patient, and clinical presentations with the corresponding treatment were used as training examples. Eight mathematical models, including artificial neural network (ANN), support vector machine (SVM), k-nearest neighbor, linear discriminant analysis (LDA), shrunken centroid (SC), random forest (RF), logistic regression, and boosting, were trained and tested. The performance of these models was compared using standard statistical analysis and ROC curves. Results: Overall, the random forest model best predicted the source, need for resuscitation, and disposition, with accuracies of approximately 80% or higher (accuracy for endoscopy was greater than 75%). The area under the ROC curve for RF was greater than 0.85, indicating excellent performance by the random forest model. Conclusion: While most mathematical models are effective as a decision support system for the evaluation and management of patients with acute GIB, in our testing the RF model consistently demonstrated the best performance. Amongst patients presenting with acute GIB, mathematical models may facilitate the identification of the source of GIB and the need for intervention, and allow optimization of care and healthcare resource allocation; these findings, however, require further validation. (c) 2007 Elsevier B.V. All rights reserved.
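
For illustration only (synthetic features, not the study's clinical records), the Python sketch below trains a random forest on a 122/67 split of 189 cases and reports the area under the ROC curve, mirroring the evaluation described above.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for clinical presentation variables of 189 patients.
X, y = make_classification(n_samples=189, n_features=15, n_informative=6,
                           weights=[0.6, 0.4], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=67, random_state=0)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
proba = rf.predict_proba(X_te)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_te, proba):.2f}")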

Relevance: 100.00%

Publisher:

Abstract:

Stress can affect anyone, regardless of age, sex, or ethnicity. The human organism uses it as an adaptive response to diverse situations that require some adjustment in order to be faced. Depending on the stressor stimulus, physical, mental, or emotional strain may be produced in the individual; however, stress is not necessarily something bad or pathological, since it is an adaptation mechanism vital for the survival of the human species. Nevertheless, the number of people negatively affected by stress has grown immensely in recent decades. Research highlights that in the United States about 60% to 90% of medical consultations are related in some way to stress, while in Brazil approximately 80% of the population suffers from stress, 30% of whom are in the most critical stage, the so-called exhaustion phase. Given that stress is still mainly identified through self-report questionnaires, the present study contributes a methodology for stress-level analysis based on variations in galvanic skin conductance and on electroencephalography signals, using as parameters the asymmetry of the alpha rhythm and the ratio between the beta and alpha rhythms in the frontal and prefrontal cortex. EEG signals were recorded with a portable device, with electrodes placed specifically at positions aF3, F3, F4 and aF4, according to the International 10/20 electrode placement system. The participants in this study are military firefighters of the 1ª Cia of Vitória-ES. Three classes of emotional stimuli were used (positive, calm and negative), through images belonging to the IAPS (International Affective Picture System) database. The accuracy results obtained with an SVM (Support Vector Machine) classifier reach 88.24% for the positive stimulus class, 84.09% for the calm class and 92.86% for the negative stimuli. This research thus presents a combination of parameters that can be measured with low-cost equipment and that make it possible to differentiate stressful stimuli, and it can therefore be used to support the training of urgency and emergency professionals.
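
The Python sketch below shows, on synthetic signals, how the two EEG features mentioned above (frontal alpha asymmetry and the beta/alpha ratio) can be computed from band powers; the channel pair, sampling rate and band limits are assumptions, and the resulting features would then feed an SVM (e.g. sklearn.svm.SVC) together with the skin-conductance measures.

import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

fs = 128.0
rng = np.random.default_rng(0)
eeg = {ch: rng.normal(size=int(10 * fs)) for ch in ("F3", "F4")}   # 10 s of noise per channel

def band_power(x, lo, hi):
    # Welch power spectral density, integrated over the chosen frequency band.
    f, pxx = welch(x, fs=fs, nperseg=256)
    mask = (f >= lo) & (f <= hi)
    return trapezoid(pxx[mask], f[mask])

alpha_f3, alpha_f4 = band_power(eeg["F3"], 8, 13), band_power(eeg["F4"], 8, 13)
beta_f3, beta_f4 = band_power(eeg["F3"], 13, 30), band_power(eeg["F4"], 13, 30)

alpha_asymmetry = np.log(alpha_f4) - np.log(alpha_f3)      # frontal alpha asymmetry
beta_alpha_ratio = (beta_f3 + beta_f4) / (alpha_f3 + alpha_f4)
print(f"alpha asymmetry = {alpha_asymmetry:.3f}, beta/alpha ratio = {beta_alpha_ratio:.3f}")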

Relevance: 100.00%

Publisher:

Abstract:

Steatosis, also known as fatty liver, corresponds to an abnormal retention of lipids within the hepatic cells and reflects an impairment of the normal processes of synthesis and elimination of fat. Several causes may lead to this condition, namely obesity, diabetes, or alcoholism. In this paper an automatic classification algorithm is proposed for the diagnosis of liver steatosis from ultrasound images. The features are selected in order to capture the same characteristics used by physicians when diagnosing the disease by visual inspection of the ultrasound images. The algorithm, designed in a Bayesian framework, computes two images: i) a despeckled one, containing the anatomic and echogenic information of the liver, and ii) an image containing only the speckle, used to compute the textural features. These images are computed from the estimated RF signal generated by the ultrasound probe, taking into account the dynamic range compression performed by the equipment. A Bayes classifier, trained with data manually classified by expert clinicians and used as ground truth, reaches an overall accuracy of 95% and a sensitivity of 100%. The main novelty of the method is the estimation of the RF and speckle images, which makes it possible to accurately compute textural features of the liver parenchyma that are relevant for the diagnosis.
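
As a rough illustration of the final classification stage only (the RF-signal and speckle estimation is not reproduced), the Python sketch below trains a Gaussian naive Bayes classifier, used here as a simple stand-in for the paper's Bayes classifier, on synthetic textural features.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
# Hypothetical 3-dimensional textural feature vectors per subject.
normal = rng.normal(loc=[0.5, 1.0, 2.0], scale=0.1, size=(40, 3))
steatotic = rng.normal(loc=[0.7, 1.3, 2.4], scale=0.1, size=(40, 3))
X = np.vstack([normal, steatotic])
y = np.array([0] * 40 + [1] * 40)

clf = GaussianNB()
print(f"cross-validated accuracy: {cross_val_score(clf, X, y, cv=5).mean():.2f}")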

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVE To analyze whether the level of institutional and matrix support is associated with better certification of primary healthcare teams. METHODS In this cross-sectional study, we evaluated two kinds of primary healthcare support – 14,489 teams received institutional support and 14,306 teams received matrix support. Logistic regression models were applied. In the institutional support model, the independent variable was the "level of support" (calculated as the sum of supporting activities for both modalities). In the matrix support model, in turn, the independent variables were the supporting activities themselves. The multivariate analysis considered variables with p < 0.20, and the model was adjusted by the Hosmer-Lemeshow test. RESULTS The teams performed institutional and matrix supporting activities (84.0% and 85.0%, respectively), with 55.0% of them performing between six and eight activities. For institutional support, teams with medium and high levels of support had 1.96 and 3.77 times the odds, respectively, of obtaining a very good or good certification. For matrix support, the corresponding odds of a very good or good certification were 1.79 and 3.29, respectively. Regarding the association between institutional support activities and certification, a very good or good certification was positively associated with self-assessment (OR = 1.95), permanent education (OR = 1.43), shared evaluation (OR = 1.40), and supervision and evaluation of indicators (OR = 1.37). Regarding matrix support, a very good or good certification was positively associated with permanent education (OR = 1.50), interventions in the territory (OR = 1.30), and discussion of the work processes (OR = 1.23). CONCLUSIONS In Brazil, supporting activities are being incorporated into primary healthcare, and there is an association between the level of support, both matrix and institutional, and the certification result.
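
A minimal Python sketch of the type of analysis described above, using synthetic data and statsmodels: a logistic regression of certification outcome on supporting activities, with coefficients exponentiated into odds ratios. The activity names and effect sizes are assumptions, not the study's data.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
# Three binary indicators, e.g. self-assessment, permanent education, shared evaluation.
activities = rng.integers(0, 2, size=(n, 3))
logit = -0.5 + activities @ np.array([0.67, 0.36, 0.34])
good_cert = rng.binomial(1, 1 / (1 + np.exp(-logit)))      # 1 = very good or good certification

X = sm.add_constant(activities)
fit = sm.Logit(good_cert, X).fit(disp=0)
print("odds ratios:", np.round(np.exp(fit.params[1:]), 2))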