956 results for Multinomial logit models with random coefficients (RCL)
Abstract:
The subject of the present study was the identification and interpretation of trauma in human skeletal remains. In addition to a comprehensive account of the current state of knowledge from various perspectives, human remains from the Battle of Dornach (AD 1499) were examined. This was complemented by a series of experiments in which replicas of medieval weapons were tested on artificial heads. No uniform and generally accepted system exists for the identification and categorization of trauma in skeletal finds. The various approaches and their advantages and disadvantages were outlined and discussed. Subsequently, the manifestations of pre-, peri- and post-mortem trauma and defects were described, as were injuries caused by blunt and sharp force, projectile injuries, and other types of injury. Further aspects discussed were the differentiation of trauma from pathological changes and anatomical variants, as well as the methodology and problems of recording injury frequencies. In addition to the determination of sex, age at death, and stature, the skulls (N=106) and femora (N=33) from the Battle of Dornach available for examination were assessed for pathological and post-mortem changes and, as the main focus, for pre- and peri-mortem trauma, which were identified and described. The anthropological findings painted the picture of a group of men heterogeneous in age at death and stature, with few pathological changes. The results were discussed against the background of late medieval mercenary service. A total of 417 peri-mortem traumata were identified on the skulls, with injuries from hacking blows strongly predominating. How the characteristic features of such blade injuries arise could be reproduced experimentally. It further emerged that cutting injuries inflicted by swords and by halberds can be distinguished from one another only in exceptional cases. Injuries from pointed force, blunt force, and projectiles were observed far less frequently. Experiments showed that the puncture injuries are consistent with infliction by pikes, by the thrusting points and hooks of halberds, and by crossbow bolts, although considerable limitations on any more precise attribution to a specific weapon became evident. The injuries could be described as presumably typical of the period, as they clearly reflect the contemporary spectrum of weapons. The location of the peri-mortem traumata on the skull revealed no pattern, apart from the observation that larger cranial bones bore more injuries than smaller ones. This random distribution was taken as an indication that the manner of fighting was probably not a "chivalrous" one, which is consistent with the military ordinances in force at the time. Post-mortem alterations of various kinds suggested that the individuals examined were not buried and that the bones gathered from the battlefield were kept in ossuaries. The results thus confirmed statements in written sources and allowed the skeletal remains to be attributed to fallen soldiers of the imperial army (Reichsheer).
Comparison of the Dornach sample with other medieval battlefield series revealed clear similarities, both in the anthropological findings and in the injuries and injury patterns. These not only filled in the fragmentary picture of late medieval armies and their manner of fighting, but also highlighted differences between medieval and modern warfare.
Abstract:
In this thesis we consider a class of second-order partial differential operators with non-negative characteristic form and smooth coefficients. The main assumptions on the relevant operators are hypoellipticity and the existence of a well-behaved global fundamental solution. We first carry out a detailed analysis of the L-Green function for arbitrary open sets and of its applications to Riesz-type Representation Theorems for L-subharmonic and L-superharmonic functions. Then, we prove an Inverse Mean Value Theorem characterizing the superlevel sets of the fundamental solution by means of L-harmonic functions. Furthermore, we establish a Lebesgue-type result showing the role of the mean-integral operator in solving the homogeneous Dirichlet problem related to L in the Perron-Wiener sense. Finally, we compare Perron-Wiener and weak variational solutions of the homogeneous Dirichlet problem under specific hypotheses on the boundary datum.
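To fix ideas, here is the classical Euclidean model case that results of this kind generalize (a sketch added for orientation; the thesis itself treats general hypoelliptic operators L):

```latex
% Classical model case (a sketch): L = -\Delta on \mathbb{R}^n, n \ge 3.
% Fundamental solution and its superlevel sets:
\Gamma(x,y) = \frac{c_n}{|x-y|^{\,n-2}}, \qquad
\{\, y : \Gamma(x,y) > r \,\} = B(x,\rho_r), \quad \rho_r = (c_n/r)^{1/(n-2)} .
% Gauss mean value theorem: if u is harmonic on a neighbourhood of
% \overline{B(x,\rho)}, then
u(x) = \frac{1}{|B(x,\rho)|} \int_{B(x,\rho)} u(y)\, dy .
% Inverse direction (Kuran): if a bounded open set D \ni x satisfies this
% identity for every integrable harmonic u, then D is a ball centred at x --
% the kind of statement the thesis extends to general L.
```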
Abstract:
Analyzing and modeling relationships between the structure of chemical compounds, their physico-chemical properties, and biological or toxic effects in chemical datasets is a challenging task for scientific researchers in the field of cheminformatics. (Q)SAR model validation is therefore essential to ensure a model's predictivity on unseen compounds. Proper validation is also one of the requirements of regulatory authorities for approving the use of such models in real-world scenarios as an alternative testing method. At the same time, however, the question of how to validate a (Q)SAR model is still under discussion. In this work, we empirically compare k-fold cross-validation with external test set validation. The introduced workflow makes it possible to apply the built and validated models to large amounts of unseen data and to compare the performance of the different validation approaches. Our experimental results indicate that cross-validation produces (Q)SAR models with higher predictivity than external test set validation and reduces the variance of the results. Statistical validation is important for evaluating the performance of (Q)SAR models, but it does not help the user to better understand the properties of the model or the underlying correlations. We present the 3D molecular viewer CheS-Mapper (Chemical Space Mapper), which arranges compounds in 3D space such that their spatial proximity reflects their similarity. The user can indirectly determine similarity by selecting which features to employ in the process. The tool can use and calculate different kinds of features, such as structural fragments as well as quantitative chemical descriptors. Comprehensive functionalities, including clustering, alignment of compounds according to their 3D structure, and feature highlighting, help the chemist to better understand patterns and regularities and to relate the observations to established scientific knowledge. Even though visualization tools for analyzing (Q)SAR information in small-molecule datasets exist, integrated visualization methods that allow for the investigation of model validation results are still lacking. We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. New functionalities in CheS-Mapper 2.0 facilitate the analysis of (Q)SAR information and allow the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. Our approach reveals whether the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org.
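To make the two compared validation schemes concrete, here is a minimal Python sketch with scikit-learn on synthetic data (an illustration under assumed placeholder data, not the thesis workflow or its tools):

```python
# Illustration only: k-fold cross-validation vs. a single external test split
# on a synthetic binary classification dataset standing in for a (Q)SAR set.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
model = RandomForestClassifier(random_state=0)

# 10-fold cross-validation: every compound is predicted exactly once,
# and the spread of the fold scores indicates the variance of the estimate.
cv_scores = cross_val_score(model, X, y, cv=10)
print(f"CV accuracy: {cv_scores.mean():.3f} +/- {cv_scores.std():.3f}")

# External test set validation: one held-out split, a single score.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0)
print(f"External test accuracy: {model.fit(X_tr, y_tr).score(X_te, y_te):.3f}")
```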
Abstract:
Important insights into the molecular mechanisms of T cell extravasation across the blood-brain barrier (BBB) have already been obtained using immortalized mouse brain endothelioma cell lines (bEnd). However, compared with bEnd, primary brain endothelial cells have been shown to establish better barrier characteristics, including complex tight junctions and low permeability. In this study, we asked whether bEnd5 and primary mouse brain microvascular endothelial cells (pMBMECs) were equally suited as in vitro models with which to study the cellular and molecular mechanisms of T cell extravasation across the BBB. We found that both in vitro BBB models equally supported T cell adhesion, under static as well as physiologic flow conditions, and T cell crawling on the endothelial surface against the direction of flow. In contrast, the distances T cells crawled on pMBMECs were strikingly longer than on bEnd5, whereas diapedesis of T cells across pMBMECs was dramatically reduced compared with bEnd5. Thus, both in vitro BBB models are suited to studying T cell adhesion. However, because pMBMECs better reflect endothelial BBB specialization in vivo, we propose that more reliable information about the cellular and molecular mechanisms of T cell diapedesis across the BBB can be obtained using pMBMECs.
Abstract:
Background: Replicative phenotypic HIV resistance testing (rPRT) uses recombinant infectious virus to measure viral replication in the presence of antiretroviral drugs. Owing to its high sensitivity in detecting viral minorities and its power to dissect complex viral resistance patterns and mixed virus populations, rPRT might help to improve HIV resistance diagnostics, particularly for patients with multiple drug failures. The aim was to investigate whether adding rPRT to genotypic resistance testing (GRT), compared to GRT alone, is beneficial for obtaining a virological response in heavily pre-treated HIV-infected patients. Methods: Patients with resistance tests between 2002 and 2006 were followed within the Swiss HIV Cohort Study (SHCS). We assessed patients' virological success after their antiretroviral therapy was switched following resistance testing. Multilevel logistic regression models with SHCS centre as a random effect were used to investigate the association between the type of resistance test and virological response (HIV-1 RNA <50 copies/mL or a ≥1.5 log reduction). Results: Of 1158 individuals with resistance tests, 221 with GRT+rPRT and 937 with GRT were eligible for analysis. Overall virological response rates were 85.1% for GRT+rPRT and 81.4% for GRT. In the subgroup of patients with >2 previous failures, the odds ratio (OR) for virological response of GRT+rPRT compared to GRT was 1.45 (95% CI 1.00-2.09). Multivariate analyses indicated a significant improvement with GRT+rPRT compared to GRT alone (OR 1.68, 95% CI 1.31-2.15). Conclusions: In heavily pre-treated patients, rPRT-based resistance information adds benefit, contributing to a higher rate of treatment success.
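How the reported odds ratios relate to logistic regression coefficients can be sketched in a few lines of Python (an illustration; `or_with_ci` is a hypothetical helper, and only the OR and CI figures come from the abstract):

```python
# Illustration only: an odds ratio and its Wald 95% CI recovered from a
# logistic regression coefficient (log-odds) and its standard error.
import math

def or_with_ci(beta: float, se: float) -> tuple[float, float, float]:
    """Return (OR, lower, upper) for a log-odds coefficient."""
    return (math.exp(beta),
            math.exp(beta - 1.96 * se),
            math.exp(beta + 1.96 * se))

# Back-deriving the multivariate result OR 1.68 (95% CI 1.31-2.15):
# beta = ln(1.68); the CI width implies se = ln(2.15/1.31) / (2 * 1.96).
beta = math.log(1.68)
se = math.log(2.15 / 1.31) / (2 * 1.96)
print(or_with_ci(beta, se))  # ~ (1.68, 1.31, 2.15)
```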
Abstract:
Full axon counting of optic nerve cross-sections is the most accurate method for quantifying axonal damage, but such analysis is very labour-intensive. Recently, a new method termed targeted sampling has been developed, which combines the salient features of a grading scheme with axon counting. Preliminary findings revealed that the method compared favourably with random sampling. The aim of the current study was to advance our understanding of the effect of sampling patterns on axon counts by comparing estimated axon counts from targeted sampling with those obtained from fixed-pattern sampling in a large collection of optic nerves with different severities of axonal injury.
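For intuition about why the sampling pattern matters, here is a small synthetic Python sketch contrasting random with fixed-pattern selection of counting frames (an illustration with invented data, not the study's targeted-sampling method):

```python
# Illustration only: estimating a total axon count from a subset of counting
# frames, comparing simple random vs. fixed-pattern (systematic) sampling.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 20x20 grid of counting frames with a focal low-count sector.
counts = rng.poisson(100, size=(20, 20))
counts[:10, :10] = rng.poisson(20, size=(10, 10))  # localized axonal loss
true_total = counts.sum()

n_frames = 40
flat = counts.ravel()

# Simple random sampling of frames, scaled up to the full grid.
rand_idx = rng.choice(flat.size, size=n_frames, replace=False)
rand_est = flat[rand_idx].mean() * flat.size

# Fixed-pattern sampling: every k-th frame across the section.
step = flat.size // n_frames
fixed_est = flat[::step][:n_frames].mean() * flat.size

print(true_total, round(rand_est), round(fixed_est))
```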
Abstract:
In this research, the supportive role of the family in coping with everyday problems was studied using two large data sets. The results show the importance of the structural aspect of social support. Mapping individual preferences onto support referents showed the crucial role of spouse and parents in solving everyday problems. The individual choices of particular support referents could be predicted fairly accurately from knowledge of the composition of the family, in both categorical regression and logit models. By contrast, a wide range of socioeconomic, social, and demographic indicators predicted the criterion variable far less well. Residence in small cities and indicators of extreme occupational strata were particularly predictive of the choice of support referent. The supportive role of the family was also traced in the personal projects of young adults, which were seen as ecological, natural, and dynamic middle-level units of analysis of personality. Different aspects of personal projects, including reliance on social support referents, turned out to be highly interrelated. On the one hand, expectations of support were determined by the content of the project; on the other, expected social support also influenced the content of the project. Sivuha sees this as one of the ways in which others can enter self-structures.
Abstract:
Generalized linear mixed models (GLMM) are generalized linear models with normally distributed random effects in the linear predictor. Penalized quasi-likelihood (PQL), an approximate method of inference in GLMMs, involves repeated fitting of linear mixed models with “working” dependent variables and iterative weights that depend on parameter estimates from the previous cycle of iteration. The generality of PQL, and its implementation in commercially available software, has encouraged the application of GLMMs in many scientific fields. Caution is needed, however, since PQL may sometimes yield badly biased estimates of variance components, especially with binary outcomes. Recent developments in numerical integration, including adaptive Gaussian quadrature, higher order Laplace expansions, stochastic integration and Markov chain Monte Carlo (MCMC) algorithms, provide attractive alternatives to PQL for approximate likelihood inference in GLMMs. Analyses of some well known datasets, and simulations based on these analyses, suggest that PQL still performs remarkably well in comparison with more elaborate procedures in many practical situations. Adaptive Gaussian quadrature is a viable alternative for nested designs where the numerical integration is limited to a small number of dimensions. Higher order Laplace approximations hold the promise of accurate inference more generally. MCMC is likely the method of choice for the most complex problems that involve high dimensional integrals.
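For reference, the PQL iteration described above can be summarized as follows (a sketch in standard GLMM notation, not the paper's own derivation):

```latex
% PQL working-response update (a sketch in standard GLMM notation).
% Model: g(\mu) = \eta = X\beta + Zb, \quad b \sim N(0, D), \quad
% \operatorname{Var}(y_i \mid b) = \phi\, V(\mu_i).
% Given current estimates, form the working variable and iterative weights
z_i = \hat\eta_i + (y_i - \hat\mu_i)\, g'(\hat\mu_i), \qquad
w_i = \left[ g'(\hat\mu_i)^2\, V(\hat\mu_i) \right]^{-1},
% then refit the weighted linear mixed model
z = X\beta + Zb + \varepsilon, \qquad
\varepsilon \sim N\!\big(0,\; \phi\, W^{-1}\big), \quad W = \operatorname{diag}(w_i),
% and iterate until \hat\beta and \hat{b} stabilize. With binary outcomes,
% the normal approximation behind this step is what can bias the variance
% components, as the abstract cautions.
```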
Abstract:
Suppose that, having established a marginal total effect of a point exposure on a time-to-event outcome, an investigator wishes to decompose this effect into its direct and indirect pathways, also known as natural direct and indirect effects, mediated by a variable known to occur after the exposure and prior to the outcome. This paper proposes a theory of estimation of natural direct and indirect effects in two important semiparametric models for a failure time outcome. The underlying survival model for the marginal total effect, and thus for the direct and indirect effects, can be either a marginal structural Cox proportional hazards model or a marginal structural additive hazards model. The proposed theory delivers new estimators for mediation analysis in each of these models, with appealing robustness properties. Specifically, in order to guarantee ignorability with respect to the exposure and mediator variables, the approach, which is multiply robust, allows the investigator to use several flexible working models to adjust for confounding by a large number of pre-exposure variables. Multiple robustness is appealing because it requires only a subset of the working models to be correct for consistency; furthermore, the analyst need not know which subset is in fact correct in order to report valid inferences. Finally, a novel semiparametric sensitivity analysis technique is developed for each of these models to assess the impact on inference of a violation of the assumption of ignorability of the mediator.
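The natural effects being estimated can be written in standard counterfactual notation (a sketch for orientation on a generic mean scale; the paper works on hazard scales):

```latex
% Counterfactual decomposition (a sketch on a generic mean scale).
% Y(a,m): outcome under exposure a and mediator level m; M(a): mediator under a.
\underbrace{E[Y(1,M(1))] - E[Y(0,M(0))]}_{\text{total effect}}
 = \underbrace{E[Y(1,M(0))] - E[Y(0,M(0))]}_{\text{natural direct effect}}
 + \underbrace{E[Y(1,M(1))] - E[Y(1,M(0))]}_{\text{natural indirect effect}} .
% In the paper's setting the decomposition is applied on the scale of a
% marginal structural Cox model (log hazard ratios) or an additive hazards
% model (hazard differences) rather than the mean differences shown here.
```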
Abstract:
DNA sequence copy number has been shown to be associated with cancer development and progression. Array-based Comparative Genomic Hybridization (aCGH) is a recent development that seeks to identify the copy number ratio at large numbers of markers across the genome. Due to experimental and biological variations across chromosomes and across hybridizations, current methods are limited to analyses of single chromosomes. We propose a more powerful approach that borrows strength across chromosomes and across hybridizations. We assume a Gaussian mixture model, with a hidden Markov dependence structure, and with random effects to allow for intertumoral variation, as well as intratumoral clonal variation. For ease of computation, we base estimation on a pseudolikelihood function. The method produces quantitative assessments of the likelihood of genetic alterations at each clone, along with a graphical display for simple visual interpretation. We assess the characteristics of the method through simulation studies and through analysis of a brain tumor aCGH data set. We show that the pseudolikelihood approach is superior to existing methods both in detecting small regions of copy number alteration and in accurately classifying regions of change when intratumoral clonal variation is present.
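A stripped-down, single-hybridization analogue of the segmentation step can be sketched with the third-party hmmlearn package (an illustration only; the paper's model additionally includes random effects for inter- and intratumoral variation and a pseudolikelihood across hybridizations):

```python
# Illustration only: a 3-state Gaussian HMM labels each clone along a
# chromosome as loss / neutral / gain from its log2 copy-number ratio.
import numpy as np
from hmmlearn import hmm  # third-party package

rng = np.random.default_rng(1)
# Synthetic log2 ratios along one chromosome: neutral, gain, neutral.
obs = np.concatenate([rng.normal(0.0, 0.15, 100),
                      rng.normal(0.58, 0.15, 30),   # single-copy gain
                      rng.normal(0.0, 0.15, 100)]).reshape(-1, 1)

model = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                        n_iter=200, random_state=1)
model.fit(obs)
states = model.predict(obs)   # most likely hidden state per clone
print(model.means_.ravel())   # estimated mean log2 ratio per state
print(np.bincount(states))    # number of clones assigned to each state
```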
Abstract:
In linear mixed models, model selection frequently includes the selection of random effects. Two versions of the Akaike information criterion (AIC) have been used, based either on the marginal or on the conditional distribution. We show that the marginal AIC is no longer an asymptotically unbiased estimator of the Akaike information, and in fact favours smaller models without random effects. For the conditional AIC, we show that ignoring estimation uncertainty in the random effects covariance matrix, as is common practice, induces a bias that leads to the selection of any random effect not predicted to be exactly zero. We derive an analytic representation of a corrected version of the conditional AIC, which avoids the high computational cost and imprecision of available numerical approximations. An implementation in an R package is provided. All theoretical results are illustrated in simulation studies, and their impact in practice is investigated in an analysis of childhood malnutrition in Zambia.
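The two criteria under discussion take the following familiar forms (a sketch of the usual definitions; the paper's contribution is the analytically corrected effective degrees of freedom):

```latex
% Marginal vs. conditional AIC in a linear mixed model (a sketch of the
% usual definitions; \beta fixed effects, b random effects, \theta covariance
% parameters).
\mathrm{mAIC} = -2 \log \left. \int f(y \mid \beta, b)\, p(b \mid \theta)\, db
\,\right|_{\hat\beta,\hat\theta} + 2p, \qquad p = \dim(\beta, \theta).
% The conditional AIC evaluates the likelihood at the predicted random
% effects, with \rho an effective number of parameters, e.g. the trace of
% the hat matrix mapping y to \hat{y}:
\mathrm{cAIC} = -2 \log f(y \mid \hat\beta, \hat b) + 2\rho .
% The paper shows that computing \rho as if the random-effects covariance
% were known induces the selection bias described above, and derives an
% analytic correction.
```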
Abstract:
Correspondence establishment is a key step in statistical shape model building. Several automated methods exist for solving this problem in 3D, but they can usually only handle objects with simple topology, like that of a sphere or a disc. We propose an extension to correspondence establishment over a population, based on the optimization of the minimum description length function, that makes it possible to consider objects with arbitrary topology. Instead of using a fixed structure of kernel placement on a sphere for the systematic manipulation of point landmark positions, we rely on an adaptive, hierarchical organization of surface patches. This hierarchy can be built on surfaces of arbitrary topology, and the resulting patches are used as a basis for a consistent, multi-scale modification of the surfaces' parameterization, based on point distribution models. The feasibility of the approach is demonstrated on synthetic models with different topologies.
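The point distribution models mentioned at the end can be sketched compactly in Python (an illustration with synthetic landmarks; `build_pdm` is a hypothetical helper, not the paper's code):

```python
# Illustration only: a point distribution model (mean shape plus principal
# modes of variation) built from corresponded, aligned landmark sets.
import numpy as np

def build_pdm(shapes: np.ndarray, n_modes: int):
    """shapes: (n_shapes, n_landmarks * 3) corresponded landmark vectors."""
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # Principal modes via SVD of the centered data matrix.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    variances = s**2 / (shapes.shape[0] - 1)
    return mean, vt[:n_modes], variances[:n_modes]

# Usage with synthetic data: 20 shapes, 100 3-D landmarks each.
rng = np.random.default_rng(2)
shapes = rng.normal(size=(20, 300))
mean, modes, var = build_pdm(shapes, n_modes=5)
# Generate a plausible new shape by perturbing the mean along the modes.
sample = mean + modes.T @ (np.sqrt(var) * rng.normal(size=5))
```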
Abstract:
OBJECTIVE: To describe the electronic medical databases used in antiretroviral therapy (ART) programmes in lower-income countries and assess the measures such programmes employ to maintain and improve data quality and reduce the loss of patients to follow-up. METHODS: In 15 countries of Africa, South America and Asia, a survey was conducted from December 2006 to February 2007 on the use of electronic medical record systems in ART programmes. Patients enrolled in the sites at the time of the survey but not seen during the previous 12 months were considered lost to follow-up. The quality of the data was assessed by computing the percentage of missing key variables (age, sex, clinical stage of HIV infection, CD4+ lymphocyte count and year of ART initiation). Associations of site characteristics (such as the number of staff members dedicated to data management) and of measures to reduce loss to follow-up (such as the presence of staff dedicated to tracing patients) with data quality and loss to follow-up were analysed using multivariate logit models. FINDINGS: Twenty-one sites that together provided ART to 50 060 patients were included (median number of patients per site: 1000; interquartile range, IQR: 72-19 320). Eighteen sites (86%) used an electronic database for medical record-keeping; 15 (83%) of these sites relied on software intended for personal or small business use. The median percentage of missing data for key variables per site was 10.9% (IQR: 2.0-18.9%) and declined with training in data management (odds ratio, OR: 0.58; 95% confidence interval, CI: 0.37-0.90) and with weekly hours spent by a clerk on the database per 100 patients on ART (OR: 0.95; 95% CI: 0.90-0.99). About 10 weekly hours per 100 patients on ART were required to reduce missing data for key variables to below 10%. The median percentage of patients lost to follow-up 1 year after starting ART was 8.5% (IQR: 4.2-19.7%). Strategies to reduce loss to follow-up included outreach teams, community-based organizations and checking death registry data. Implementation of all three strategies substantially reduced losses to follow-up (OR: 0.17; 95% CI: 0.15-0.20). CONCLUSION: The quality of the data collected and the retention of patients in ART treatment programmes are unsatisfactory for many sites involved in the scale-up of ART in resource-limited settings, mainly because of insufficient staff trained to manage data and trace patients lost to follow-up.
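The data-quality measure described in METHODS amounts to a per-variable missingness audit; a minimal pandas sketch follows (an illustration with hypothetical column names, not the survey's actual database schema):

```python
# Illustration only: percentage of missing values for the key variables
# used in the survey's data-quality measure.
import pandas as pd

KEY_VARS = ["age", "sex", "clinical_stage", "cd4_count", "art_start_year"]

def missing_key_percent(records: pd.DataFrame) -> pd.Series:
    """Percent missing per key variable, plus the overall rate."""
    pct = records[KEY_VARS].isna().mean() * 100
    pct["overall"] = records[KEY_VARS].isna().to_numpy().mean() * 100
    return pct.round(1)

# Example with three toy patient records:
df = pd.DataFrame({"age": [34, None, 51], "sex": ["F", "M", None],
                   "clinical_stage": [2, 3, None], "cd4_count": [180, None, 320],
                   "art_start_year": [2005, 2006, 2006]})
print(missing_key_percent(df))
```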
Abstract:
OBJECTIVE: Meta-analysis of studies of the accuracy of diagnostic tests currently uses a variety of methods. Statistically rigorous hierarchical models require expertise and sophisticated software. We assessed whether any of the simpler methods can in practice give adequately accurate and reliable results. STUDY DESIGN AND SETTING: We reviewed six methods for meta-analysis of diagnostic accuracy: four simple commonly used methods (simple pooling, separate random-effects meta-analyses of sensitivity and specificity, separate meta-analyses of positive and negative likelihood ratios, and the Littenberg-Moses summary receiver operating characteristic [ROC] curve) and two more statistically rigorous approaches using hierarchical models (bivariate random-effects meta-analysis and hierarchical summary ROC curve analysis). We applied the methods to data from a sample of eight systematic reviews chosen to illustrate a variety of patterns of results. RESULTS: In each meta-analysis, there was substantial heterogeneity between the results of different studies. Simple pooling of results gave misleading summary estimates of sensitivity and specificity in some meta-analyses, and the Littenberg-Moses method produced summary ROC curves that diverged from those produced by more rigorous methods in some situations. CONCLUSION: The closely related hierarchical summary ROC curve or bivariate models should be used as the standard method for meta-analysis of diagnostic accuracy.
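The bivariate random-effects model recommended in the conclusion has the following standard form (a sketch of the usual formulation, not taken from the paper):

```latex
% Bivariate random-effects model for diagnostic accuracy (a sketch of the
% standard formulation). For study i with sensitivity Se_i and specificity Sp_i:
\begin{pmatrix} \operatorname{logit} \mathrm{Se}_i \\ \operatorname{logit} \mathrm{Sp}_i \end{pmatrix}
\sim N\!\left(
\begin{pmatrix} \mu_{\mathrm{Se}} \\ \mu_{\mathrm{Sp}} \end{pmatrix},
\begin{pmatrix} \sigma_{\mathrm{Se}}^2 & \sigma_{\mathrm{Se,Sp}} \\
\sigma_{\mathrm{Se,Sp}} & \sigma_{\mathrm{Sp}}^2 \end{pmatrix}
\right),
% combined with within-study binomial likelihoods for the observed true/false
% positive and negative counts. The covariance term models the
% sensitivity-specificity trade-off that simple pooling ignores, which is
% why this model (and the closely related HSROC model) avoids the
% misleading summaries described above.
```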
Abstract:
This project addresses the potential impacts of changing climate on dry-season water storage and discharge from a small mountain catchment in Tanzania. Villagers and water managers around the catchment have experienced worsening water scarcity and attribute it to increasing population and demand, but very little has been done to understand the physical characteristics and hydrological behavior of the spring catchment. The physical nature of the aquifer was characterized, and water balance models were calibrated to discharge observations so as to be able to explore relative changes in aquifer storage resulting from climate changes. To characterize the shallow aquifer supplying water to the Jandu spring, water quality and geochemistry data were analyzed, discharge recession analysis was performed, and two water balance models were developed and tested. Jandu geochemistry suggests a shallow, meteorically recharged aquifer system with short circulation times. Baseflow recession analysis showed that the catchment behavior could be represented by a linear storage model with an average recession constant of 0.151/month from 2004 to 2010. Two modified Thornthwaite-Mather Water Balance (TMWB) models were calibrated using historic rainfall and discharge data and shown to reproduce dry-season flows with Nash-Sutcliffe efficiencies between 0.86 and 0.91. The modified TMWB models were then used to examine nineteen perturbed climate scenarios and thereby test the potential impacts of regional climate change on catchment storage during the dry season. Forcing the models with realistic scenarios for average monthly temperature, annual precipitation, and seasonal rainfall distribution demonstrated that even small climate changes might adversely impact aquifer storage conditions at the onset of the dry season. The scale of the change depended on the direction (increasing vs. decreasing) and magnitude of the climate change (temperature and precipitation). This study demonstrates that characterization of small mountain aquifers is possible using simple water quality parameters, that recession analysis can be integrated into modeling aquifer storage parameters, and that water balance models can accurately reproduce dry-season discharges and might be useful tools for assessing climate change impacts. However, uncertainty in current climate projections and the lack of data for testing the predictive capabilities of the model beyond the present data set make the forecasts of changes in discharge also uncertain. The hydrologic tools used herein offer promise for future research on small, shallow, mountainous aquifers and could potentially be developed and used by water resource professionals to assess climatic influences on local hydrologic systems.
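The recession analysis described here rests on the linear-reservoir relation Q(t) = Q0·e^(-kt), with storage S = Q/k; a short Python sketch of fitting k from dry-season discharge follows (synthetic, illustrative numbers only, not the Jandu data):

```python
# Illustration only: fitting a linear-reservoir recession constant k from
# dry-season discharge, Q(t) = Q0 * exp(-k t), so storage S = Q / k.
import numpy as np

def fit_recession(t_months: np.ndarray, q: np.ndarray) -> float:
    """Least-squares slope of ln(Q) vs. t gives -k for a linear reservoir."""
    slope, _ = np.polyfit(t_months, np.log(q), 1)
    return -slope

# Synthetic dry-season record with a constant of the reported magnitude.
t = np.arange(0, 6)                     # months since onset of the dry season
q0, k_true = 12.0, 0.15                 # illustrative discharge and constant
rng = np.random.default_rng(3)
q = q0 * np.exp(-k_true * t) * rng.lognormal(0.0, 0.02, t.size)

k_hat = fit_recession(t, q)
print(f"k = {k_hat:.3f} /month; storage at onset ~ {q[0] / k_hat:.1f} (Q units x months)")
```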