963 results for Regression methods


Relevance:

30.00%

Abstract:

Current practice for analysing functional neuroimaging data is to average the brain signals recorded at multiple sensors or channels on the scalp over time, across hundreds of trials or replicates, to eliminate noise and enhance the underlying signal of interest. Studies recording brain signals non-invasively with functional neuroimaging techniques such as electroencephalography (EEG) and magnetoencephalography (MEG) generate complex, high-dimensional and noisy data for many subjects at a number of replicates. Single-replicate (or single-trial) analysis of neuroimaging data has gained attention because it allows the features of the signals to be studied at each replicate, without averaging out features that current methods remove. The research here systematically develops flexible mixed-effects regression models for single-trial analysis of specific brain activities, using examples from EEG and MEG to illustrate the models. This thesis follows three specific themes: i) artefact correction to estimate the `brain' signal of interest, ii) characterisation of the signals to reduce their dimensions, and iii) model fitting for single trials after accounting for variation between subjects and within subjects (between replicates). The models are developed to establish evidence of two specific neurological phenomena: entrainment of brain signals to the $\alpha$ band of frequencies (8-12 Hz) in an EEG experiment, and dipolar brain activation in the same $\alpha$ frequency band in a MEG study.
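As an illustration of the kind of single-trial quantity such models work with, the sketch below estimates power in the $\alpha$ band (8-12 Hz) for one synthetic trial. The sampling rate, trial length and signal are invented for the example and are not taken from the thesis.

```python
import numpy as np

def band_power(signal, fs, lo=8.0, hi=12.0):
    """Mean power spectral density of `signal` inside the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

fs = 250.0                        # sampling rate in Hz (illustrative)
t = np.arange(0, 2.0, 1.0 / fs)   # one 2-second trial
# A 10 Hz oscillation (inside the alpha band) plus a little noise.
trial = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

alpha = band_power(trial, fs, 8, 12)   # should dominate
beta = band_power(trial, fs, 20, 24)   # noise floor only
```

In a single-trial mixed model, a per-trial summary such as `alpha` would enter as the response or a covariate rather than being averaged away across trials.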

Relevance:

30.00%

Abstract:

This dissertation proposes statistical methods to formulate, estimate and apply complex transportation models. Two main problems are analysed in this dissertation. The first method solves an econometric problem and concerns the joint estimation of models that contain both discrete and continuous decision variables. The use of ordered models together with a regression is proposed, and their effectiveness is evaluated with respect to unordered models. Procedures to calculate and optimize the log-likelihood functions of both discrete-continuous approaches are derived, and the difficulties associated with the estimation of unordered models are explained. Numerical approximation methods based on the Genz algorithm are implemented in order to solve the multidimensional integral associated with the unordered modeling structure. The problems deriving from the lack of smoothness of the probit model around the maximum of the log-likelihood function, which makes optimization and the calculation of standard deviations very difficult, are carefully analyzed. A methodology to perform out-of-sample validation in the context of a joint model is proposed. Comprehensive numerical experiments were conducted on both simulated and real data. In particular, the discrete-continuous models are estimated and applied to vehicle ownership and use on data extracted from the 2009 National Household Travel Survey. The second part of this work offers a comprehensive statistical analysis of the free-flow speed distribution; the method is applied to data collected on a sample of roads in Italy. A linear mixed model that includes speed quantiles among its predictors is estimated. Results show that there is no road effect in the analysis of free-flow speeds, which is particularly important for model transferability.
A very general framework to predict random effects with few observations and incomplete access to model covariates is formulated and applied to predict the distribution of free-flow speed quantiles. The speed distribution of most road sections is successfully predicted; jack-knife estimates are calculated and used to explain why some sections are poorly predicted. Overall, this work contributes to the literature in transportation modeling by proposing econometric model formulations for discrete-continuous variables, more efficient methods for the calculation of multivariate normal probabilities, and random effects models for free-flow speed estimation that take the survey design into account. All methods are rigorously validated on both real and simulated data.
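The jack-knife estimates mentioned above follow a generic leave-one-out recipe; a minimal sketch, with invented speed values and the sample mean standing in for the dissertation's actual model-based statistic:

```python
def jackknife(values, stat):
    """Leave-one-out jackknife: return (full-sample estimate,
    bias-corrected estimate) for any statistic `stat`."""
    n = len(values)
    full = stat(values)
    loo = [stat(values[:i] + values[i + 1:]) for i in range(n)]
    mean_loo = sum(loo) / n
    bias = (n - 1) * (mean_loo - full)
    return full, full - bias

speeds = [92.0, 101.5, 88.0, 110.2, 97.3]   # illustrative free-flow speeds (km/h)
est, corrected = jackknife(speeds, lambda v: sum(v) / len(v))
```

For the sample mean the jackknife bias is exactly zero, so `corrected` equals `est`; for the nonlinear predictors in the dissertation the two would differ, and the spread of the leave-one-out replicates flags poorly predicted sections.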

Relevance:

30.00%

Abstract:

Mass spectrometry (MS)-based proteomics has seen significant technical advances during the past two decades, and mass spectrometry has become a central tool in many biosciences. Despite the popularity of MS-based methods, handling the systematic non-biological variation in the data remains a common problem. This biasing variation can arise from several sources, ranging from sample handling to differences caused by the instrumentation. Normalization is the procedure that aims to account for this biasing variation and make samples comparable. Many normalization methods commonly used in proteomics have been adapted from the DNA-microarray world. Studies comparing normalization methods on proteomics data sets using some variability measures exist. However, a more thorough comparison is lacking: one that looks at the quantitative and qualitative differences in the performance of the different normalization methods, and at their ability to preserve the true differential expression signal of proteins. In this thesis, several popular and widely used normalization methods (the Linear regression normalization, Local regression normalization, Variance stabilizing normalization, Quantile normalization, Median central tendency normalization, and variants of some of the aforementioned methods), representing different normalization strategies, are compared and evaluated with a benchmark spike-in proteomics data set. The normalization methods are evaluated in several ways. Their performance is assessed qualitatively and quantitatively, both on a global scale and in pairwise comparisons of sample groups. In addition, it is investigated whether performing the normalization globally on the whole data set, or pairwise for the comparison pairs examined, affects how well a method normalizes the data and preserves the true differential expression signal.
In this thesis, both major and minor differences in the performance of the different normalization methods were found. Also, the way in which the normalization was performed (global normalization of the whole data or pairwise normalization of the comparison pair) affected the performance of some of the methods in pairwise comparisons. Differences among variants of the same methods were also observed.
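Of the compared methods, median central tendency normalization is the simplest to state: each sample is shifted so that all samples share a common median. A sketch under the assumption of log-scale intensities; the data, offsets and matrix layout (rows = proteins, columns = samples) are invented for the example:

```python
import numpy as np

def median_normalize(intensities):
    """Median central-tendency normalization: shift each sample (column)
    so that every column's median equals the median of the column medians.
    Assumes log-scale intensities, rows = proteins, columns = samples."""
    col_medians = np.median(intensities, axis=0)
    target = np.median(col_medians)
    return intensities - (col_medians - target)

rng = np.random.default_rng(1)
# Three samples with deliberate location offsets (systematic bias).
data = rng.normal(loc=[20.0, 22.5, 21.0], scale=1.0, size=(100, 3))
normed = median_normalize(data)
```

After the shift, every column has exactly the same median, while within-sample differences between proteins (the biological signal of interest) are untouched.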

Relevance:

30.00%

Abstract:

Assessing the fit of a model is an important final step in any statistical analysis, but this is not straightforward when complex discrete response models are used. Cross validation and posterior predictions have been suggested as methods to aid model criticism. In this paper a comparison is made between four methods of model predictive assessment in the context of a three-level logistic regression model for clinical mastitis in dairy cattle: cross validation, a prediction using the full posterior predictive distribution, and two “mixed” predictive methods that incorporate higher-level random effects simulated from the underlying model distribution. Cross validation is considered a gold-standard method but is computationally intensive, and thus a comparison is made between posterior predictive assessments and cross validation. The analyses revealed that the mixed prediction methods produced results close to cross validation, whilst the full posterior predictive assessment gave predictions that were over-optimistic (closer to the observed disease rates) compared with cross validation. A mixed prediction method that simulated random effects from both higher levels was best at identifying the outlying level-two (farm-year) units of interest. It is concluded that this mixed prediction method, simulating random effects from both higher levels, is straightforward and may be of value in model criticism of multilevel logistic regression, a technique commonly used for animal health data with a hierarchical structure.
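The "mixed" predictive idea, replacing estimated higher-level random effects with fresh draws from their fitted distribution, can be sketched for a two-level logistic model as follows. The intercept, random-effect SD and herd size below are invented, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative fitted quantities from a two-level logistic model
# (fixed intercept plus a farm-level random effect).
beta0 = -2.0          # fixed intercept on the logit scale
sigma_farm = 0.8      # fitted SD of the farm random effect
n_sims = 5000

def mixed_predict(n_cows):
    """'Mixed' prediction: draw a fresh farm effect from N(0, sigma^2)
    for every replicate instead of reusing the estimated farm effect,
    which is what makes the check less over-optimistic."""
    u = rng.normal(0.0, sigma_farm, size=n_sims)
    p = 1.0 / (1.0 + np.exp(-(beta0 + u)))
    return rng.binomial(n_cows, p)       # simulated case counts per replicate

cases = mixed_predict(n_cows=100)
# The simulated distribution is the reference against which an observed
# farm count is judged, e.g. via a tail probability.
p_value = (cases >= 40).mean()
```

An observed farm falling far in the tail of `cases` would be flagged as an outlying farm-year unit.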

Relevance:

30.00%

Abstract:

The purpose of this study was to examine the relationship between the structure of jobs and burnout, and to assess to what extent, if any, this relationship was moderated by individual coping methods. This study was grounded in Karasek's (1998) Job Demand-Control-Support theory of work stress as well as Maslach and Leiter's (1993) theory of burnout. Coping was examined as a moderator based on the conceptualization of Lazarus and Folkman (1984). Two overarching questions framed this study: (a) What is the relationship between job structure, as operationalized by job title, and burnout across different occupations in support services in a large municipal school district? (b) To what extent do individual differences in coping methods moderate this relationship? This was a cross-sectional study of county public school bus drivers, bus aides, mechanics, and clerical workers (N = 253) at three bus depot locations within the same district, using validated survey instruments for data collection. Hypotheses were tested using simultaneous regression analyses. Findings indicated statistically significant and relevant relationships among the variables of interest: job demands, job control, burnout, and ways of coping. There was a relationship between job title and physical job demands, but no evidence of a relationship between job title and psychological demands. Furthermore, there was a relationship between physical demands and emotional exhaustion and personal accomplishment, key indicators of burnout. Results showed significant correlations supporting individual ways of coping as a moderator between job structure, operationalized by job title, and individual employee burnout, adding empirical evidence to the occupational stress literature. Based on the findings, there are implications for theory, research, and practice.
For theory and research, the findings suggest the importance of incorporating transactional models in the study of occupational stress. In the area of practice, the findings highlight the importance of enriching jobs, increasing job control, and providing individual-level training related to stress reduction.
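Testing moderation of this kind typically amounts to a simultaneous regression that includes a demands x coping interaction term; a sketch on simulated data (variable names and effect sizes are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Illustrative data: burnout rises with demands, and coping dampens
# (moderates) that effect via a negative interaction.
demands = rng.normal(size=n)
coping = rng.normal(size=n)
burnout = (0.6 * demands - 0.2 * coping
           - 0.3 * demands * coping
           + rng.normal(scale=0.5, size=n))

# Simultaneous regression with the interaction term included.
X = np.column_stack([np.ones(n), demands, coping, demands * coping])
coef, *_ = np.linalg.lstsq(X, burnout, rcond=None)
interaction = coef[3]   # a clearly nonzero estimate indicates moderation
```

A significantly negative `interaction` coefficient is exactly the pattern read as "coping moderates the demands-burnout relationship."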

Relevance:

30.00%

Abstract:

Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models to support the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat—it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative and geospatial variables that differ in terms of scale, weight, and type. Though many of these variables are recognized by specialists in security studies, there remains controversy with respect to their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically-weighted regression analysis produced the most accurate result to accommodate non-stationary coefficient behavior, demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism.
This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality-of-life.
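Geographically weighted regression, the technique the dissertation found most accurate, fits a separate weighted least-squares model at every location, down-weighting distant observations with a kernel. A minimal sketch with an invented east-west trend in the coefficient; the bandwidth, kernel and data are illustrative:

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Geographically weighted regression: at each location, solve the
    weighted least-squares normal equations with Gaussian-kernel weights
    based on distance to that location (captures non-stationarity)."""
    betas = []
    for c in coords:
        d = np.linalg.norm(coords - c, axis=1)
        w = np.exp(-(d / bandwidth) ** 2)
        Xw = X * w[:, None]                       # W X
        beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)  # (X'WX) b = X'Wy
        betas.append(beta)
    return np.array(betas)

rng = np.random.default_rng(3)
n = 200
coords = rng.uniform(0, 10, size=(n, 2))
x = rng.normal(size=n)
# Non-stationary coefficient: the slope grows from west to east.
slope = 0.5 + 0.2 * coords[:, 0]
y = slope * x + rng.normal(scale=0.1, size=n)
X = np.column_stack([np.ones(n), x])

local = gwr_coefficients(coords, X, y, bandwidth=2.0)
```

A single global regression would average the slope away; the local fits recover its spatial gradient, which is the point of using GWR for non-stationary coefficient behavior.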

Relevance:

30.00%

Abstract:

Neuroimaging research involves analyses of huge amounts of biological data that might or might not be related to cognition. This relationship is usually approached using univariate methods, and, therefore, correction methods are mandatory for reducing false positives. Nevertheless, the probability of false negatives is also increased. Multivariate frameworks have been proposed to help alleviate this balance. Here we apply multivariate distance matrix regression for the simultaneous analysis of biological and cognitive data, namely, structural connections among 82 brain regions and several latent factors estimating cognitive performance. We tested whether cognitive differences predict distances among individuals regarding their connectivity pattern. Beginning with 3,321 connections among regions, the 36 edges best predicted by the individuals' cognitive scores were selected. Cognitive scores were related to connectivity distances in both the full (3,321) and reduced (36) connectivity patterns. The selected edges connect regions distributed across the entire brain, and the network defined by these edges supports high-order cognitive processes such as (a) (fluid) executive control, (b) (crystallized) recognition, learning, and language processing, and (c) visuospatial processing. This multivariate study suggests that a widespread but limited set of regions in the human brain supports high-level cognitive ability differences. Hum Brain Mapp, 2016. © 2016 Wiley Periodicals, Inc.
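Multivariate distance matrix regression tests whether predictors explain a matrix of pairwise distances via a pseudo-F statistic on the Gower-centred distance matrix. A sketch on synthetic data, with the 82-region connectome replaced by a small invented profile; names and sizes are illustrative:

```python
import numpy as np

def mdmr_pseudo_f(D, X):
    """Multivariate distance matrix regression: pseudo-F testing whether
    the predictors in X (first column = intercept) explain the pairwise
    distances in D (McArdle-Anderson style)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    G = -0.5 * J @ (D ** 2) @ J              # Gower-centred inner products
    H = X @ np.linalg.pinv(X.T @ X) @ X.T    # hat matrix of the predictors
    m = X.shape[1]
    num = np.trace(H @ G @ H) / (m - 1)
    den = np.trace((np.eye(n) - H) @ G @ (np.eye(n) - H)) / (n - m)
    return num / den

rng = np.random.default_rng(7)
n = 60
score = rng.normal(size=n)                     # cognitive score per subject
# Invented 5-dimensional 'connectivity profile' driven by the score.
profile = np.outer(score, np.ones(5)) + rng.normal(scale=0.3, size=(n, 5))
D = np.linalg.norm(profile[:, None, :] - profile[None, :, :], axis=2)

f_related = mdmr_pseudo_f(D, np.column_stack([np.ones(n), score]))
f_null = mdmr_pseudo_f(D, np.column_stack([np.ones(n), rng.normal(size=n)]))
```

A large pseudo-F for the real score and a small one for a random predictor reproduces, in miniature, the test of whether cognitive differences predict connectivity distances.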

Relevance:

30.00%

Abstract:

The aim of this thesis project is to automatically localize HCC tumors in the human liver and subsequently predict whether the tumor will undergo microvascular infiltration (MVI), the initial stage of metastasis development. The input data for this work were partially supplied by the Sant'Orsola Hospital and partially downloaded from online medical databases. Two U-Net models have been implemented for the automatic segmentation of the liver and of the HCC malignancies within it. The segmentation models have been evaluated with the Intersection-over-Union and Dice Coefficient metrics. The outcomes obtained for automatic liver segmentation are quite good (IOU = 0.82; DC = 0.35); the outcomes obtained for automatic tumor segmentation (IOU = 0.35; DC = 0.46) are instead affected by some limitations: it can be stated that the algorithm is almost always able to detect the location of the tumor, but it tends to underestimate its dimensions. The purpose of this step is to obtain the CT images of the HCC tumors, which are necessary for feature extraction. The 14 Haralick features calculated from the 3D-GLCM, the 120 radiomic features, and the patients' clinical information are collected to build a dataset of 153 features. The goal is then to build a model able to discriminate, based on the features given, the tumors that will undergo MVI from those that will not. This task can be seen as a classification problem: each tumor needs to be classified as either “MVI positive” or “MVI negative”. Feature selection techniques are implemented to identify the most descriptive features for the problem at hand, and a set of classification models are trained and compared. The models with the best performance (around 80-84% ± 8-15%) turn out to be the XGBoost Classifier, the SGD Classifier, and the Logistic Regression models (without penalization and with Lasso, Ridge or Elastic Net penalization).
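Haralick features are computed from a grey-level co-occurrence matrix (GLCM). A 2D sketch of one such feature, contrast; the thesis uses the 3D-GLCM, and the image sizes, offset and number of grey levels here are illustrative:

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=8):
    """Grey-level co-occurrence matrix for a single pixel offset,
    normalized to probabilities (2D analogue of the 3D-GLCM)."""
    P = np.zeros((levels, levels))
    h, w = image.shape
    for i in range(h - dy):
        for j in range(w - dx):
            P[image[i, j], image[i + dy, j + dx]] += 1
    return P / P.sum()

def haralick_contrast(P):
    """Haralick contrast: sum over (i - j)^2 * P[i, j]."""
    idx = np.arange(P.shape[0])
    diff2 = (idx[:, None] - idx[None, :]) ** 2
    return (diff2 * P).sum()

flat = np.zeros((16, 16), dtype=int)                         # uniform region
noisy = np.random.default_rng(5).integers(0, 8, size=(16, 16))  # textured region

c_flat = haralick_contrast(glcm(flat))
c_noisy = haralick_contrast(glcm(noisy))
```

A uniform region has zero contrast while a textured one does not, which is why such features can carry discriminative information about tumor tissue.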

Relevance:

30.00%

Abstract:

The cerebral cortex presents self-similarity within a proper interval of spatial scales, a property typical of natural objects exhibiting fractal geometry. Its complexity can therefore be characterized by the value of its fractal dimension (FD). In computing this metric, a frequentist approach to probability has usually been employed, with point-estimator methods yielding only the optimal values of the FD. In our study, we aimed at retrieving a more complete evaluation of the FD by using a Bayesian model for the linear regression analysis of the box-counting algorithm. We used T1-weighted MRI data of 86 healthy subjects (age 44.2 ± 17.1 years, mean ± standard deviation, 48% males) to gain insight into the confidence of our measure and to investigate the relationship between mean Bayesian FD and age. Our approach yielded a stronger and significant (P < .001) correlation between mean Bayesian FD and age compared to the previous implementation. Our results therefore suggest that the Bayesian FD is a more faithful estimate of the fractal dimension of the cerebral cortex than the frequentist FD.
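The frequentist baseline the study improves on is an ordinary least-squares fit of log box counts against log inverse box size. A sketch on a filled 2D mask, whose fractal dimension is exactly 2; the array size and box sizes are illustrative:

```python
import numpy as np

def box_count_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Estimate fractal dimension via the box-counting algorithm:
    OLS slope of log(count) vs log(1/size). (The study replaces this
    frequentist point estimate with a Bayesian linear regression.)"""
    counts = []
    for s in sizes:
        h, w = mask.shape
        # Tile the mask into s x s boxes and count non-empty boxes.
        grid = mask[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(grid.any(axis=(1, 3)).sum())
    x = np.log(1.0 / np.array(sizes, dtype=float))
    y = np.log(np.array(counts, dtype=float))
    slope, _ = np.polyfit(x, y, 1)
    return slope

square = np.ones((64, 64), dtype=bool)   # a filled plane region has FD = 2
fd = box_count_dimension(square)
```

A Bayesian treatment of the same regression yields a posterior over the slope rather than a single point estimate, which is what supplies the "confidence of the measure" discussed above.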

Relevance:

20.00%

Abstract:

The aim of this investigation was to compare the skeletal stability of three different rigid fixation methods after mandibular advancement. Fifty-five class II malocclusion patients treated with bilateral sagittal split ramus osteotomy and mandibular advancement were selected for this retrospective study. Group 1 (n = 17) had miniplates with monocortical screws, Group 2 (n = 16) had bicortical screws, and Group 3 (n = 22) had the osteotomy fixed by means of the hybrid technique. Cephalograms were taken preoperatively, within 1 week postoperatively, and 6 months after the orthognathic surgery. Linear and angular changes of the cephalometric landmarks of the chin region were measured at each period, and the changes at each cephalometric landmark were determined for the time gaps. Postoperative changes in mandibular shape were analyzed to determine the stability of the fixation methods. There was minimal difference in the relapse of the mandibular advancement among the three groups, and statistical analysis showed no significant difference in postoperative stability. However, a positive correlation between the amount of advancement and the amount of postoperative relapse was demonstrated by multiple linear regression (p < 0.05). It can be concluded that all three techniques can be used to obtain stable postoperative results in mandibular advancement after 6 months.

Relevance:

20.00%

Abstract:

The aim of this clinical study was to determine the efficacy of Uncaria tomentosa (cat's claw) against denture stomatitis (DS). Fifty patients with DS were randomly assigned into 3 groups to receive 2% miconazole, placebo, or 2% U tomentosa gel. DS level was recorded at baseline, after 1 week of treatment, and 1 week after treatment. The clinical effectiveness of each treatment was measured using Newton's criteria. Mycologic samples from the palatal mucosa and prosthesis were obtained to determine colony-forming units per milliliter (CFU/mL) and fungal identification at each evaluation period. Candida species were identified with HiCrome Candida and the API 20C AUX biochemical test. DS severity decreased in all groups (P < .05). A significant reduction in the number of CFU/mL after 1 week (P < .05) was observed for all groups and remained after 14 days (P > .05). C albicans was the most prevalent microorganism before treatment, followed by C tropicalis, C glabrata, and C krusei, regardless of the group and time evaluated. U tomentosa gel had the same effect as 2% miconazole gel. U tomentosa gel is an effective topical adjuvant treatment for denture stomatitis.

Relevance:

20.00%

Abstract:

What is the contribution of the provision, at no cost to users, of long-acting reversible contraceptive methods (LARC; the copper intrauterine device [IUD], the levonorgestrel-releasing intrauterine system [LNG-IUS], contraceptive implants and the depot-medroxyprogesterone [DMPA] injection) towards the disability-adjusted life years (DALY) averted through a Brazilian university-based clinic established over 30 years ago? Over the last 10 years of evaluation, provision of LARC methods and DMPA by the clinic is estimated to have averted between 37 and 60 maternal deaths, 315-424 child deaths, 634-853 combined cases of maternal morbidity and mortality and child mortality, and 1056-1412 unsafe abortions. LARC methods are associated with high contraceptive effectiveness when compared with contraceptive methods that need frequent attention, perhaps because LARC methods are independent of individual or couple compliance. In general, however, previous studies have evaluated contraceptive methods during clinical studies over a short period of time, or not more than 10 years, and information regarding the estimation of the DALY averted is scarce. We reviewed 50 004 medical charts from women who consulted for the first time looking for a contraceptive method over the period from 2 January 1980 through 31 December 2012. Women who consulted at the Department of Obstetrics and Gynaecology, University of Campinas, Brazil were new users and users switching contraceptive, including the copper IUD (n = 13 826), the LNG-IUS (n = 1525), implants (n = 277) and DMPA (n = 9387). Estimation of the DALY averted included maternal morbidity and mortality, child mortality and unsafe abortions averted. We obtained 29 416 contraceptive segments of use, including 25 009 segments from 20 821 new users or switchers to any LARC method or DMPA with at least 1 year of follow-up.
The mean (± SD) age of the women at first consultation ranged from 25.3 ± 5.7 (range 12-47) years in the 1980s, to 31.9 ± 7.4 (range 16-50) years in 2010-2011. The most common contraceptive chosen at the first consultation was copper IUD (48.3, 74.5 and 64.7% in the 1980s, 1990s and 2000s, respectively). For an evaluation over 20 years, the cumulative pregnancy rates (SEM) were 0.4 (0.2), 2.8 (2.1), 4.0 (0.4) and 1.3 (0.4) for the LNG-IUS, the implants, copper IUD and DMPA, respectively and cumulative continuation rates (SEM) were 15.1 (3.7), 3.9 (1.4), 14.1 (0.6) and 7.3 (1.7) for the LNG-IUS, implants, copper IUD and DMPA, respectively (P < 0.001). Over the last 10 years of evaluation, the estimation of the contribution of the clinic through the provision of LARC methods and DMPA to DALY averted was 37-60 maternal deaths; between 315 and 424 child mortalities; combined maternal morbidity and mortality and child mortality of between 634 and 853, and 1056-1412 unsafe abortions averted. The main limitations are the number of women who never returned to the clinic (overall 14% among the four methods under evaluation); consequently the pregnancy rate could be different. Other limitations include the analysis of two kinds of copper IUD and two kinds of contraceptive implants as the same IUD or implant, and the low number of users of implants. In addition, the DALY calculation relies on a number of estimates, which may vary in different parts of the world. LARC methods and DMPA are highly effective and women who were well-counselled used these methods for a long time. The benefit of averting maternal morbidity and mortality, child mortality, and unsafe abortions is an example to health policy makers to implement more family planning programmes and to offer contraceptive methods, mainly LARC and DMPA, at no cost or at affordable cost for the underprivileged population. 
This study received partial financial support from the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP), grant # 2012/12810-4 and from the National Research Council (CNPq), grant #573747/2008-3. B.F.B., M.P.G., and V.M.C. were fellows of the scientific initiation programme of FAPESP. Since the year 2001, all the TCu380A IUDs were donated by Injeflex, São Paulo, Brazil, and from the year 2006 all the LNG-IUS were donated by the International Contraceptive Access Foundation (ICA), Turku, Finland. Both donations were unrestricted grants. The authors declare that there are no conflicts of interest associated with this study.

Relevance:

20.00%

Abstract:

Enamel microabrasion consists of selectively abrading discolored areas of enamel or causing superficial structural changes. Because the microabrasion technique uses abrasive products associated with acids, evaluation of enamel roughness after this treatment, as well as after surface polishing, is necessary. This in-vitro study evaluated enamel roughness after microabrasion followed by different polishing techniques. Roughness analyses were performed before microabrasion (L1), after microabrasion (L2), and after polishing (L3). Sixty bovine incisor teeth were selected and divided into two groups (n=30): G1 - 37% phosphoric acid (Dentsply) and pumice; G2 - 6.6% hydrochloric acid associated with silicon carbide (Opalustre - Ultradent). Thereafter, the groups were divided into three sub-groups (n=10) according to the polishing system: A - fine and superfine aluminum oxide discs (SofLex 3M); B - diamond paste (FGM) associated with felt discs (FGM); C - silicone tips (Enhance - Dentsply). A PROC MIXED procedure was applied after exploratory data analysis, as well as the Tukey-Kramer test (5%). No statistical differences were found between groups G1 and G2. L2 differed statistically from L1 and showed greater roughness. Differences in post-polishing roughness arose for specific sub-groups (1A, 2B, and 1C), which showed less roughness at L3 and differed statistically from L2. All products increased enamel roughness, and the effectiveness of the polishing systems depended on the abrasive used.

Relevance:

20.00%

Abstract:

This paper examines the spatial pattern of ill-defined causes of death across Brazilian regions, and its relationship with the evolution of death-registry completeness and changes in the mortality age profile. We make use of the Brazilian Health Informatics Department mortality database and population censuses from 1980 to 2010. We applied demographic methods to evaluate the quality of mortality data for 137 small areas and to correct for under-registration of death counts where necessary. The second part of the analysis uses linear regression models to investigate the relationship between, on the one hand, changes in death-count coverage and the age profile of mortality, and, on the other, changes in the reporting of ill-defined causes of death. The completeness of death-count coverage increased from about 80% in 1980-1991 to over 95% in 2000-2010, while the percentage of ill-defined causes of death fell by about 53% across the country. The analysis suggests that the government's efforts to improve data quality are proving successful, and they will allow for a better understanding of the dynamics of health and the mortality transition.

Relevance:

20.00%

Abstract:

Silk fibroin has been widely explored for many biomedical applications, due to its biocompatibility and biodegradability. Sterilization is a fundamental step in biomaterials processing, and it must not jeopardize the functionality of medical devices. The aim of this study was to analyze the influence of different sterilization methods on the physical, chemical, and biological characteristics of dense and porous silk fibroin membranes. Silk fibroin membranes were treated by several procedures: immersion in 70% ethanol solution, ultraviolet radiation, autoclave, ethylene oxide, and gamma radiation, and were analyzed by scanning electron microscopy, Fourier-transform infrared spectroscopy (FTIR), X-ray diffraction, tensile strength and in vitro cytotoxicity to Chinese hamster ovary cells. The results indicated that the sterilization methods did not cause perceivable morphological changes in the membranes and that the membranes were not toxic to cells. The sterilization methods that used an organic solvent or increased humidity and/or temperature (70% ethanol, autoclave, and ethylene oxide) increased the silk II content in the membranes: the dense membranes became more brittle, while the porous membranes showed increased strength at break. Membranes sterilized by UV and gamma radiation presented properties similar to the nonsterilized membranes, mainly in terms of tensile strength and FTIR results.