729 results for Mixed Methods


Relevance:

30.00%

Abstract:

BACKGROUND: Dropouts and missing data are nearly ubiquitous in obesity randomized controlled trials, threatening the validity and generalizability of conclusions. Herein, we meta-analytically evaluate the extent of missing data, the frequency with which various analytic methods are employed to accommodate dropouts, and the performance of multiple statistical methods. METHODOLOGY/PRINCIPAL FINDINGS: We searched PubMed and Cochrane databases (2000-2006) for articles published in English and manually searched bibliographic references. Articles of pharmaceutical randomized controlled trials with weight loss or weight gain prevention as major endpoints were included. Two authors independently reviewed each publication for inclusion. 121 articles met the inclusion criteria. Two authors independently extracted treatment, sample size, drop-out rates, study duration, and statistical method used to handle missing data from all articles and resolved disagreements by consensus. In the meta-analysis, drop-out rates were substantial, with the survival (non-dropout) rates being approximated by an exponential decay curve e^(-λt), where λ was estimated to be 0.0088 (95% bootstrap confidence interval: 0.0076 to 0.0100) and t represents time in weeks. The estimated drop-out rate at 1 year was 37%. Most studies used last observation carried forward as the primary analytic method to handle missing data. We also obtained 12 raw obesity randomized controlled trial datasets for empirical analyses. Analyses of raw randomized controlled trial data suggested that both mixed models and multiple imputation performed well, but that multiple imputation may be more robust when missing data are extensive. CONCLUSION/SIGNIFICANCE: Our analysis offers an equation for predicting dropout rates that is useful for future study planning. Our raw data analyses suggest that multiple imputation is better than other methods for handling missing data in obesity randomized controlled trials, followed closely by mixed models. We suggest these methods supplant last observation carried forward as the primary method of analysis.
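
As a quick check on the arithmetic, the reported decay curve can be evaluated directly; the short Python sketch below is a minimal illustration using only the λ estimate and confidence bounds quoted above, and it reproduces the ~37% one-year dropout figure.

```python
import math

# Point estimate and bootstrap CI bounds for lambda (per week), as reported above.
LAMBDA = 0.0088
LAMBDA_CI = (0.0076, 0.0100)

def retention(weeks: float, lam: float = LAMBDA) -> float:
    """Expected proportion of participants still enrolled after `weeks`."""
    return math.exp(-lam * weeks)

if __name__ == "__main__":
    t = 52  # one year, in weeks
    kept = retention(t)
    print(f"Retention at {t} weeks: {kept:.2%}")      # ~63%
    print(f"Dropout at {t} weeks:   {1 - kept:.2%}")  # ~37%, matching the abstract
    lo, hi = (1 - retention(t, lam) for lam in LAMBDA_CI)
    print(f"Dropout CI at {t} weeks: {lo:.1%} to {hi:.1%}")
```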

Relevance:

30.00%

Abstract:

Gaussian factor models have proven widely useful for parsimoniously characterizing dependence in multivariate data. There is a rich literature on their extension to mixed categorical and continuous variables, using latent Gaussian variables or through generalized latent trait models accommodating measurements in the exponential family. However, when generalizing to non-Gaussian measured variables, the latent variables typically influence both the dependence structure and the form of the marginal distributions, complicating interpretation and introducing artifacts. To address this problem, we propose a novel class of Bayesian Gaussian copula factor models which decouple the latent factors from the marginal distributions. A semiparametric specification for the marginals based on the extended rank likelihood yields straightforward implementation and substantial computational gains. We provide new theoretical and empirical justifications for using this likelihood in Bayesian inference. We propose new default priors for the factor loadings and develop efficient parameter-expanded Gibbs sampling for posterior computation. The methods are evaluated through simulations and applied to a dataset in political science. The models in this paper are implemented in the R package bfa.
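
As a rough illustration of the decoupling idea (and not the paper's semiparametric bfa implementation, which is in R), the Python sketch below simulates data from a one-factor Gaussian copula model: a latent factor drives the dependence among standard-normal scores, and arbitrary marginals are imposed afterwards through inverse CDFs. The loadings and marginal choices are invented for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p = 1000, 3

# One-factor Gaussian model for the latent scores: z_j = lambda_j * eta + eps_j.
loadings = np.array([0.9, 0.7, 0.5])                 # illustrative factor loadings
eta = rng.standard_normal(n)                         # latent factor
eps = rng.standard_normal((n, p)) * np.sqrt(1 - loadings**2)
z = eta[:, None] * loadings + eps                    # correlated standard-normal scores

# Copula step: push each column through the normal CDF, then through an arbitrary
# marginal inverse CDF.  The dependence comes from the factor; the marginals are free.
u = stats.norm.cdf(z)
y_continuous = stats.gamma(a=2.0).ppf(u[:, 0])       # skewed continuous margin
y_count      = stats.poisson(mu=3.0).ppf(u[:, 1])    # count margin
y_binary     = (u[:, 2] > 0.6).astype(int)           # binary margin

# Rank correlations reflect the factor structure regardless of the marginals.
rho, _ = stats.spearmanr(np.column_stack([y_continuous, y_count, y_binary]))
print(rho)
```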

Relevance:

30.00%

Abstract:

This paper reports the findings from a discrete-choice experiment designed to estimate the economic benefits associated with rural landscape improvements in Ireland. Using a mixed logit model, the panel nature of the dataset is exploited to retrieve willingness-to-pay values for every individual in the sample. This departs from customary approaches in which the willingness-to-pay estimates are normally expressed as measures of central tendency of an a priori distribution. Random-effects models for panel data are subsequently used to identify the determinants of the individual-specific willingness-to-pay estimates. In comparison with the standard methods used to incorporate individual-specific variables into the analysis of discrete-choice experiments, the analytical approach outlined in this paper is shown to add considerable explanatory power to the welfare estimates.
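
The two-stage logic described here can be sketched with toy numbers: given individual-specific attribute and cost coefficients (here simulated placeholders standing in for mixed logit output), willingness to pay is the negative ratio of the two, and a second-stage regression relates the individual WTP values to respondent characteristics. This is only an illustration of the workflow, not the paper's estimation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical individual-specific coefficients, e.g. conditional means recovered
# from a mixed logit model (the estimation step itself is not shown here).
beta_attr = rng.normal(1.5, 0.4, n)              # marginal utility of the landscape attribute
beta_cost = -np.abs(rng.normal(0.05, 0.01, n))   # marginal utility of cost (negative)

# Individual willingness to pay for the attribute: -beta_attr / beta_cost.
wtp = -beta_attr / beta_cost

# Second stage: regress individual WTP on respondent characteristics (simulated).
income = rng.normal(40, 10, n)
rural = rng.integers(0, 2, n)
X = np.column_stack([np.ones(n), income, rural])
coef, *_ = np.linalg.lstsq(X, wtp, rcond=None)
print(dict(zip(["intercept", "income", "rural"], coef.round(3))))
```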

Relevance:

30.00%

Abstract:

In many finite element analyses it is desirable to combine reduced- or lower-dimensional element types with higher-dimensional element types in a single model. In order to achieve compatibility of displacements and stress equilibrium at the junction or interface between the differing element types, it is important in such cases to incorporate into the analysis a scheme for coupling the element types. A novel and effective scheme for establishing compatibility and equilibrium at the dimensional interface is described and its merits and capabilities are demonstrated. Copyright (C) 2000 John Wiley & Sons, Ltd.

Relevance:

30.00%

Abstract:

Emotion research has long been dominated by the “standard method” of displaying posed or acted static images of facial expressions of emotion. While this method has been useful, it cannot investigate the dynamic nature of emotion expression. Although continuous self-report traces have enabled the measurement of dynamic expressions of emotion, no consensus has been reached on the correct statistical techniques that permit inferences to be made with such measures. We propose Generalized Additive Models and Generalized Additive Mixed Models as techniques that can account for the dynamic nature of such continuous measures. These models allow us to hold constant the shared components of responses that are due to perceived emotion across time, while enabling inference concerning linear differences between groups. The mixed-model GAMM approach is preferred as it can account for autocorrelation in time-series data and allows emotion-decoding participants to be modelled as random effects. To increase confidence in linear differences, we assess methods that address interactions between categorical variables and dynamic changes over time. In addition, we comment on the use of Generalized Additive Models to assess the effect size of shared perceived emotion and discuss sample sizes. Finally, we address additional uses: the inference of feature detection, continuous-variable interactions, and the measurement of ambiguity.
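
For readers unfamiliar with how such smoothers are fitted, the sketch below implements the penalized B-spline (P-spline) regression that underlies GAM smooth terms, applied to a simulated continuous rating trace. It is a minimal illustration with assumed settings (basis size, penalty weight) and deliberately omits the random effects and autocorrelation structure that the GAMM approach above adds; a full analysis would normally use a dedicated GAMM package such as mgcv in R.

```python
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(x, n_basis=12, degree=3):
    """Evaluate a clamped cubic B-spline basis on x."""
    lo, hi = float(np.min(x)), float(np.max(x))
    n_interior = n_basis - degree - 1
    interior = np.linspace(lo, hi, n_interior + 2)[1:-1]
    knots = np.r_[[lo] * (degree + 1), interior, [hi] * (degree + 1)]
    basis = np.empty((len(x), n_basis))
    for j in range(n_basis):
        coef = np.zeros(n_basis); coef[j] = 1.0
        basis[:, j] = BSpline(knots, coef, degree)(x)
    return basis

def fit_pspline(x, y, n_basis=12, lam=5.0):
    """Penalized (second-difference) least-squares spline fit: the GAM smoother."""
    B = bspline_basis(x, n_basis)
    D = np.diff(np.eye(n_basis), n=2, axis=0)        # smoothness penalty matrix
    coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
    return B @ coef

# Simulated continuous emotion-rating trace: a smooth trend plus noise.
t = np.linspace(0, 60, 300)                          # seconds
rating = np.sin(t / 8.0) + 0.3 * np.random.default_rng(3).standard_normal(t.size)
smooth = fit_pspline(t, rating)
print("residual SD after smoothing:", np.std(rating - smooth).round(3))
```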

Relevance:

30.00%

Abstract:

With the advancement of flexible fixtures and flexible tooling, mixed production has become possible for aircraft assembly, as the manufacturing processes of different aircraft/sub-assembly models are similar. However, it is a great challenge to model the problem and provide a practical solution due to the low volume and complex constraints of aircraft assemblies. To tackle this problem, this work proposes a methodology for designing the mixed production system, together with a new scheduling approach that combines backward and forward scheduling methods. These methods are validated through a real-life industrial case study. Simulation results show that the number of workstations and the cycle time for making a fuselage can be reduced by 50% and 39%, respectively, with the newly designed mixed-model system.
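
As a generic illustration of combining forward and backward passes over a precedence graph (critical-path style, not the paper's specific mixed-model scheduling method), the Python sketch below computes earliest and latest start times and the resulting slack for a handful of invented tasks.

```python
# Minimal forward/backward scheduling pass over a task precedence graph.
# Tasks, durations and precedences are illustrative placeholders, not data from the paper.
tasks = {            # task: (duration, predecessors)
    "A": (3, []),
    "B": (2, ["A"]),
    "C": (4, ["A"]),
    "D": (2, ["B", "C"]),
}

# Forward pass: earliest start/finish times.
es, ef = {}, {}
for t in tasks:                           # dict order already respects precedence here
    dur, preds = tasks[t]
    es[t] = max((ef[p] for p in preds), default=0)
    ef[t] = es[t] + dur

makespan = max(ef.values())

# Backward pass: latest finish/start times that do not delay the makespan.
lf, ls = {}, {}
for t in reversed(list(tasks)):
    succs = [s for s in tasks if t in tasks[s][1]]
    lf[t] = min((ls[s] for s in succs), default=makespan)
    ls[t] = lf[t] - tasks[t][0]

for t in tasks:
    print(f"{t}: ES={es[t]} LS={ls[t]} slack={ls[t] - es[t]}")
```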

Relevance:

30.00%

Abstract:

A new approach for extracting stress intensity factors (SIFs) by the element-free Galerkin (EFG) class of methods through a modified crack closure integral (MCCI) scheme is proposed. Its primary feature is that it allows accurate calculation of mode I and mode II SIFs with a relatively simple and straightforward analysis even when a coarser nodal density is employed. The details of the adoption of the MCCI technique in the EFG method are described. Its performance is demonstrated through a number of case studies including mixed-mode and thermal problems in linear elastic fracture mechanics (LEFM). The results are compared with published theoretical solutions and those based on the displacement method, stress method, crack closure integral in conjunction with local smoothing (CCI–LS) technique, as well as the M-integral method. Its advantages are discussed.
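
For orientation, the arithmetic at the heart of a crack closure integral can be shown with the standard virtual crack closure relations (this is the textbook 2D form, not the paper's EFG-specific MCCI formulation); the sketch below converts a crack-tip nodal force and the relative displacement one spacing behind the tip into energy release rates and mode I/II SIFs, using made-up input values.

```python
import math

# Standard (virtual) crack closure integral arithmetic for a 2D crack under plane strain.
# All values below are illustrative placeholders, not results from the paper's analysis.
E, nu = 70e9, 0.33          # Young's modulus [Pa], Poisson's ratio
E_eff = E / (1 - nu**2)     # plane-strain effective modulus
da = 1.0e-3                 # crack extension / element spacing behind the tip [m]
thickness = 1.0             # unit thickness [m]

Fy, Fx = 1.2e3, 0.3e3       # crack-tip nodal forces [N] (opening, sliding)
dv, du = 4.0e-6, 1.0e-6     # relative opening/sliding displacements one spacing behind the tip [m]

# Energy release rates from the work done to close the crack over `da`.
G_I  = Fy * dv / (2.0 * da * thickness)
G_II = Fx * du / (2.0 * da * thickness)

# Mode I / mode II stress intensity factors.
K_I  = math.sqrt(G_I * E_eff)
K_II = math.sqrt(G_II * E_eff)
print(f"G_I={G_I:.1f} J/m^2, K_I={K_I/1e6:.2f} MPa*sqrt(m), K_II={K_II/1e6:.2f} MPa*sqrt(m)")
```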

Relevance:

30.00%

Abstract:

Traditional internal combustion engine vehicles are a major contributor to global greenhouse gas emissions and other air pollutants, such as particulate matter and nitrogen oxides. If tailpipe point emissions could be managed centrally without reducing commercial and personal user functionality, then one of the most attractive solutions for achieving a significant reduction of emissions in the transport sector would be the mass deployment of electric vehicles. Though electric vehicle sales are still hindered by battery performance, cost and a few other technological bottlenecks, focused commercialisation and support from government policies are encouraging large-scale electric vehicle adoption. The mass proliferation of plug-in electric vehicles is likely to bring a significant additional electric load onto the grid, creating a highly complex operational problem for power system operators. Electric vehicle batteries also have the ability to act as energy storage points on the distribution system. Because consumers will want maximum flexibility, this dual charging-and-storage impact of many uncontrollable kW-scale loads on a distribution system that was not originally designed for such operation has the potential to be detrimental to grid balancing. Intelligent scheduling methods, if established correctly, could smoothly integrate electric vehicles into the grid: such methods can help to avoid cycling of large combustion plants and the use of expensive fossil-fuel peaking plant, match renewable generation to electric vehicle charging, and prevent overloading of the distribution system that would reduce power quality. In this paper, a state-of-the-art review is presented in which scheduling methods for integrating plug-in electric vehicles are examined and categorised based on their computational techniques. Approaches covering analytical scheduling, conventional optimisation methods (e.g. linear and non-linear mixed-integer programming and dynamic programming), game theory, and meta-heuristic algorithms including genetic algorithms and particle swarm optimisation are comprehensively surveyed, offering a systematic reference for grid scheduling that considers intelligent electric vehicle integration.
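
As a toy example of the conventional-optimisation category surveyed here (not a method drawn from any specific reviewed paper), the sketch below schedules a single EV's charging over a few hours as a linear program with scipy.optimize.linprog: minimise energy cost subject to a required energy delivery and a charger power limit. Prices, energy requirement and power limit are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Toy EV charging schedule as a linear program: choose the energy drawn in each hour
# to minimise cost, deliver the required energy, and respect the charger limit.
price = np.array([0.30, 0.25, 0.12, 0.10, 0.11, 0.28])  # price per kWh, each hour
need_kwh = 20.0                                          # energy the battery needs
p_max = 7.0                                              # charger limit, kWh per hour

res = linprog(
    c=price,                              # minimise sum(price_t * e_t)
    A_eq=np.ones((1, price.size)),        # sum of hourly energy ...
    b_eq=[need_kwh],                      # ... equals the requirement
    bounds=[(0, p_max)] * price.size,     # 0 <= e_t <= charger limit
    method="highs",
)
print("hourly kWh:", np.round(res.x, 2), " total cost:", round(res.fun, 2))
```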

Relevance:

30.00%

Abstract:

Objectives: The purpose of this investigation was to determine, for dispensed multiples (1 through 4) of powder (P) and liquid (L) in hand-mixed dental cement, whether: (1) the mean (P/L) ratio (m/m) and (2) the maximum difference in (P/L) ratio depend on the number of multiples dispensed. The null hypotheses were: (a) the mean (P/L) ratio is independent of the number of multiples dispensed and (b) the maximum difference in (P/L) ratio is independent of the number of multiples dispensed.
Methods: The materials investigated are listed in the Table. The masses of dispensed aliquots of powder and liquid were measured by a single operator (n=10, for multiples 1 through 4) on a 4-place analytical balance. All measurements were made independently and all possible (P/L) ratios were calculated for each sample. The effect of the dispensed multiple on (P/L) ratios and on maximum (P/L) differences was assessed by one-way ANOVA and linear regression, respectively, with the Tukey post-hoc correction for multiple comparisons.

Table. Mean (SD) powder/liquid ratio (m/m) by dispensed multiple. Superscript letters denote significance groupings (α = 0.05) within each material.

Material        Manufacturer   (x1)              (x2)              (x3)              (x4)
Zinc phosphate  Heraeus        12.271 (0.691)a   13.051 (1.269)b   13.215 (0.824)b   13.118 (1.149)b
Fuji IX         GC             4.209 (0.373)a    4.085 (0.275)b    4.095 (0.226)b    4.095 (0.217)b
IRM             Dentsply       7.933 (0.767)a    7.430 (0.451)b    7.977 (0.729)a    8.186 (0.929)a
Ketac-Cem       3M Espe        9.6206 (0.613)a   9.714 (0.523)a    9.298 (0.314)b    9.321 (0.292)b

Results: Mean (SD) (P/L) ratios are presented in the Table. Null hypothesis (a) is rejected: dispensing (x1) or (x2) yields a different (P/L) ratio from (x3) or (x4) (p < 0.05). Null hypothesis (b) is rejected: a negative correlation is observed between the maximum (P/L) ratio difference and the dispensed multiple for Ketac-Cem (p = 0.029).
Conclusion: For hand-mixed dental cements: (1) more consistent (P/L) ratios may be observed with multiple dispensations of powder and liquid; (2) the maximum difference in (P/L) ratio may be negatively correlated with the dispensed multiple for some materials.
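
A minimal sketch of the kind of one-way ANOVA comparison described in the Methods, using scipy; the arrays are simulated stand-ins generated from the Table's means and SDs, not the study's raw measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Placeholder powder/liquid ratios for dispensed multiples x1..x4 (n = 10 each);
# simulated from the Table's zinc phosphate means/SDs, not the measured data.
x1 = rng.normal(12.27, 0.69, 10)
x2 = rng.normal(13.05, 1.27, 10)
x3 = rng.normal(13.22, 0.82, 10)
x4 = rng.normal(13.12, 1.15, 10)

f_stat, p_value = stats.f_oneway(x1, x2, x3, x4)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
# A pairwise follow-up with a Tukey-style correction (e.g. scipy.stats.tukey_hsd(x1, x2, x3, x4))
# would then identify which dispensed multiples differ.
```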

Relevance:

30.00%

Abstract:

It is important to be able to assess the contribution of donor cells to the graft following bone marrow transplantation (BMT), as complete engraftment of marrow progenitors that can give rise to long-term donor-derived hemopoiesis may be important in long-term disease-free survival. The contribution of the donor marrow, both in terms of filling the marrow "space" created by the intense conditioning regimen and in its ability to mediate a graft-versus-leukemia effect, may be assessed by studying the kinetics of the engraftment process. As BMT involves repopulation of the host hemopoietic system with donor cells, recipients of allogeneic marrow are referred to as hemopoietic chimeras. A donor chimera is an individual who exhibits complete donor hemopoiesis, and donor chimerism would be expected to carry the best long-term prognosis. A patient in whom donor and recipient cells coexist in a stable fashion post-BMT, without hematological evidence of relapse or graft rejection, is referred to as a mixed chimera. Mixed chimerism may be a prelude to graft rejection or leukemic relapse; therefore, it is important to be able to monitor the presence of these cells in a precise manner.

Relevance:

30.00%

Abstract:

The influence of mixed hematopoietic chimerism (MC) after allogeneic bone marrow transplantation remains unknown. Increasingly sensitive detection methods have shown that MC occurs frequently. We report a highly sensitive novel method to assess MC based on the polymerase chain reaction (PCR). Simple dinucleotide repeat sequences called microsatellites have been found to vary in their repeat number between individuals. We use this variation to type donor-recipient pairs following allogeneic BMT. A panel of seven microsatellites was used to distinguish between donor and recipient cells of 32 transplants. Informative microsatellites were subsequently used to assess MC after BMT in this group of patients. Seventeen of the 32 transplants involved a donor of opposite sex; hence, cytogenetics and Y chromosome-specific PCR were also used as an index of chimerism in these patients. MC was detected in bone marrow aspirates and peripheral blood in 18 of 32 patients (56%) by PCR. In several cases, only stored slide material was available for analysis but PCR of microsatellites or Y chromosomal material could be used successfully to assess the origin of cells in this archival material. Cytogenetic analysis was possible in 17 patients and MC was detected in three patients. Twelve patients received T-cell-depleted marrow and showed a high incidence of MC as revealed by PCR (greater than 80%). Twenty patients received unmanipulated marrow, and while the incidence of MC was lower (44%), this was a high percentage when compared with other studies. Once MC was detected, the percentages of recipient cells tended to increase. However, in patients exhibiting MC who subsequently relapsed, this increase was relatively sudden. The overall level of recipient cells in the group of MC patients who subsequently relapsed was higher than in those who exhibited stable MC. Thus, while the occurrence of MC was not indicative of a poor prognosis per se, sudden increases in the proportions of recipient cells may be a prelude to graft rejection or relapse.

Relevance:

30.00%

Abstract:

This paper addresses the representation of landscape complexity in stated preferences research. It integrates landscape ecology and landscape economics, conducting the landscape analysis in a three-dimensional space to provide ecologically meaningful quantitative landscape indicators that are used as variables for the monetary valuation of landscape in a stated preferences study. Expected heterogeneity in taste intensity across respondents is addressed with a mixed logit model in willingness-to-pay space. Our methodology is applied to value, in monetary terms, the landscape of the Sorrento Peninsula in Italy, an area that has faced increasing pressure from urbanization affecting its traditional horticultural, herbaceous, and arboreal structure, with loss of biodiversity and an increasing risk of landslides. We find that residents of the Sorrento Peninsula would prefer landscapes characterized by large open views and natural features. Residents also appear to dislike heterogeneous landscapes and the presence of lemon orchards and farmers' stewardship, which are associated with the current failure to protect the traditional landscape. The outcomes suggest that the use of landscape ecology metrics in a stated preferences model may be an effective way to advance integrated methodologies that better understand and represent landscape and its complexity.
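
As an example of the sort of quantitative indicator landscape ecology contributes to such valuation work (the paper's own three-dimensional indicators are not reproduced here), the sketch below computes Shannon's diversity index and Pielou's evenness from land-cover patch areas; the cover classes and areas are invented.

```python
import math

# Shannon's diversity index H' = -sum(p_i * ln p_i) over land-cover proportions.
# The cover classes and areas below are invented placeholders.
patch_area_ha = {
    "lemon orchard": 120.0,
    "herbaceous crops": 80.0,
    "woodland": 45.0,
    "urban": 55.0,
}

total = sum(patch_area_ha.values())
shannon = -sum((a / total) * math.log(a / total) for a in patch_area_ha.values())
evenness = shannon / math.log(len(patch_area_ha))   # Pielou's evenness, in [0, 1]
print(f"Shannon diversity H' = {shannon:.3f}, evenness = {evenness:.3f}")
```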

Relevance:

30.00%

Abstract:

To create a clinically relevant gold nanoparticle (AuNP) treatment, the surface must be functionalized with multiple ligands, such as drugs, antifouling agents and targeting moieties. However, attaching several ligands of differing chemistries and lengths, while ensuring they all retain their biological functionality, remains a challenge. This review compares the two most widely employed methods of surface co-functionalization, namely mixed monolayers and hetero-bifunctional linkers. While there are numerous in vitro studies successfully utilizing both surface arrangements, there is little consensus regarding their relative merits. Animal and preclinical studies have demonstrated the effectiveness of mixed-monolayer functionalization, and while some promising in vitro results have been reported for PEG-linker-capped AuNPs, any potential benefits of that approach are not yet fully understood.

Relevance:

30.00%

Abstract:

An important feature of UK housing policy has been the promotion of consortia between local authorities, private developers and housing associations in order to develop mixed-tenure estates to meet a wide range of housing needs. Central to this approach has been a focus on the management of neighbourhoods, based on the assumption that high densities and the inter-mixing of tenure exacerbate the potential for incivility and anti-social behaviour and exert a disproportionate impact on residents' quality of life. Landlord strategies are therefore based on a need to address such issues at an early stage in the development. In some cases, community-based third-sector organisations are established to manage community assets and to provide a community development service to residents. In others, a common response is to appoint caretakers and wardens to tackle social and environmental problems before they escalate and undermine residents' quality of life. A number of innovative developments have promoted such neighbourhood governance approaches to housing practice by applying community development methods to address potential management problems. In the process, there is an increasing trend towards strategies that shape behaviour, govern ethical conduct, promote aesthetic standards and determine resident and landlord expectations. These processes can be related to the wider concept of governmentality, whereby residents are encouraged to become actively engaged in managing their own environments, based on the assumption that this produces more cohesive, integrated communities and projects positive images. Evidence is emerging from a number of countries that increasingly integrated and mutually supportive roles and relationships between public, private and third-sector agencies are transforming neighbourhood governance in similar ways. This paper reviews the evidence for this trend towards community governance in mixed housing developments by drawing on a series of UK case studies prepared for two national agencies in 2007. It reviews in particular the contractual arrangements with different tenures, identifies codes and guidelines promoting 'good neighbour' behaviour, and discusses the role of community development trusts and other neighbourhood organisations in providing facilities and services designed to generate a well-integrated community. The second part of the paper reviews evidence from the USA and Australia to see how far there is a convergence in this respect in advanced economies. The paper concludes by discussing the extent to which housing management practice is changing, particularly in areas of mixed development, whether there is a convergence in practice between different countries, and how far these trends are supported by theories of governmentality.