984 results for Bayesian method


Relevance:

30.00%

Publisher:

Abstract:

Surveys can collect important data that inform policy decisions and drive social science research. Large government surveys collect information from the U.S. population on a wide range of topics, including demographics, education, employment, and lifestyle. Analysis of survey data presents unique challenges. In particular, one needs to account for missing data, for complex sampling designs, and for measurement error. In principle, a survey organization could devote substantial resources to obtaining high-quality responses from a simple random sample, resulting in survey data that are easy to analyze. In practice, however, this scenario is rarely realistic. To address these practical issues, survey organizations can leverage the information available from other sources of data. For example, in longitudinal studies that suffer from attrition, they can use the information from refreshment samples to correct for potential attrition bias. They can use information from known marginal distributions or survey design to improve inferences. They can use information from gold standard sources to correct for measurement error.

This thesis presents novel approaches to combining information from multiple sources that address the three problems described above.

The first method addresses nonignorable unit nonresponse and attrition in a panel survey with a refreshment sample. Panel surveys typically suffer from attrition, which can lead to biased inference when basing analysis only on cases that complete all waves of the panel. Unfortunately, the panel data alone cannot inform the extent of the bias due to attrition, so analysts must make strong and untestable assumptions about the missing data mechanism. Many panel studies also include refreshment samples, which are data collected from a random sample of new individuals during some later wave of the panel. Refreshment samples offer information that can be utilized to correct for biases induced by nonignorable attrition while reducing reliance on strong assumptions about the attrition process. To date, these bias correction methods have not dealt with two key practical issues in panel studies: unit nonresponse in the initial wave of the panel and in the refreshment sample itself. As we illustrate, nonignorable unit nonresponse can significantly compromise the analyst's ability to use the refreshment samples for attrition bias correction. Thus, it is crucial for analysts to assess how sensitive their inferences, corrected for panel attrition, are to different assumptions about the nature of the unit nonresponse. We present an approach that facilitates such sensitivity analyses, both for suspected nonignorable unit nonresponse in the initial wave and in the refreshment sample. We illustrate the approach using simulation studies and an analysis of data from the 2007-2008 Associated Press/Yahoo News election panel study.
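For context, a common identifying strategy in the refreshment-sample literature is the additive nonignorable attrition model, often attributed to Hirano, Imbens, Ridder, and Rubin; the sketch below states it for a two-wave panel as background, and is not necessarily the exact specification used in this thesis. With W the indicator of responding in wave 2 and Y_1, Y_2 the wave 1 and wave 2 outcomes,

    \mathrm{logit}\,\Pr(W = 1 \mid Y_1, Y_2) \;=\; \beta_0 + \beta_1 Y_1 + \beta_2 Y_2 ,

where the exclusion of a Y_1 Y_2 interaction is the key restriction; the refreshment sample supplies the marginal distribution of Y_2 needed to identify \beta_2.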

The second method incorporates informative prior beliefs about marginal probabilities into Bayesian latent class models for categorical data. The basic idea is to append synthetic observations to the original data such that (i) the empirical distributions of the desired margins match those of the prior beliefs, and (ii) the values of the remaining variables are left missing. The degree of prior uncertainty is controlled by the number of augmented records. Posterior inferences can be obtained via typical MCMC algorithms for latent class models, tailored to deal efficiently with the missing values in the concatenated data.
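As an illustration of this data-augmentation idea, the following sketch constructs synthetic records whose empirical margin for a single variable matches a stated prior belief, leaving every other column missing. The function name, the pandas/NumPy representation, and the example data are assumptions for illustration, not the thesis code:

```python
import numpy as np
import pandas as pd

def augment_with_margin(data, variable, prior_probs, n_aug):
    """Append roughly n_aug synthetic records whose empirical distribution for
    `variable` matches `prior_probs`; all other columns are left missing.
    A larger n_aug expresses stronger prior certainty about the margin."""
    levels = list(prior_probs.keys())
    probs = np.array([prior_probs[l] for l in levels], dtype=float)
    probs = probs / probs.sum()
    # Deterministic rounding keeps the augmented margin as close as possible
    # to the stated prior probabilities.
    counts = np.round(probs * n_aug).astype(int)
    synthetic = pd.DataFrame(
        {col: np.nan for col in data.columns}, index=range(counts.sum())
    )
    synthetic[variable] = np.repeat(levels, counts)
    return pd.concat([data, synthetic], ignore_index=True)

# Hypothetical example: a categorical survey with three variables and a prior
# belief that 52% of the population is female.
survey = pd.DataFrame({"sex": ["m", "f", "f"], "educ": [1, 2, 3], "emp": [0, 1, 1]})
augmented = augment_with_margin(survey, "sex", {"f": 0.52, "m": 0.48}, n_aug=200)
```

The strength of the prior is then tuned simply by choosing n_aug relative to the size of the original sample.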

We illustrate the approach using a variety of simulations based on data from the American Community Survey, including an example of how augmented records can be used to fit latent class models to data from stratified samples.

The third method leverages the information from a gold standard survey to model reporting error. Survey data are subject to reporting error when respondents misunderstand the question or accidentally select the wrong response. Sometimes survey respondents knowingly select the wrong response, for example, by reporting a higher level of education than they actually have attained. We present an approach that allows an analyst to model reporting error by incorporating information from a gold standard survey. The analyst can specify various reporting error models and assess how sensitive their conclusions are to different assumptions about the reporting error process. We illustrate the approach using simulations based on data from the 1993 National Survey of College Graduates. We use the method to impute error-corrected educational attainments in the 2010 American Community Survey using the 2010 National Survey of College Graduates as the gold standard survey.
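As a toy illustration of the kind of reporting-error specification such an approach can accommodate (a sketch in generic notation, not the thesis model), let Y denote true educational attainment and Z the reported value. A misclassification model

    \Pr(Z = z \mid Y = y, X) = \pi_{yz}(X), \qquad \Pr(Y = y \mid Z = z, X) \propto \pi_{yz}(X)\,\Pr(Y = y \mid X),

lets the error probabilities \pi_{yz}(X) be estimated from the gold standard survey, where both the true and reported values are effectively observed, and then used to impute error-corrected values of Y in the survey where only Z is observed.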

Relevance:

30.00%

Publisher:

Abstract:

Learning Bayesian networks with bounded tree-width has attracted much attention recently, because low tree-width allows exact inference to be performed efficiently. Some existing methods (Korhonen and Parviainen, 2013; Nie et al., 2014) tackle the problem by using k-trees to learn the optimal Bayesian network with tree-width up to k. Finding the best k-tree, however, is computationally intractable. In this paper, we propose a sampling method to efficiently find representative k-trees by introducing an informative score function that characterizes the quality of a k-tree. To further improve the quality of the k-trees, we propose a probabilistic hill climbing approach that locally refines the sampled k-trees. The proposed algorithm can efficiently learn a high-quality Bayesian network with tree-width at most k. Experimental results demonstrate that our approach is more computationally efficient than the exact methods while achieving comparable accuracy, and that it outperforms most existing approximate methods.
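To make the sample-then-refine idea concrete, here is a much-simplified sketch. It represents a k-tree by its construction sequence (a root clique of k+1 vertices, with each later vertex attached to an existing k-clique), scores a k-tree by a placeholder edge-weight sum, and refines the best sample with a probabilistic hill climb over the construction order. The score function, the move set, and all names are assumptions for illustration; the paper's informative score and refinement operators are more sophisticated:

```python
import itertools
import random

def build_ktree(order, choices, k):
    """Build a k-tree over the vertices in `order`: the first k+1 vertices form
    a clique, and each later vertex attaches to an existing k-clique selected
    by the corresponding index in `choices`."""
    root = order[:k + 1]
    edges = set(frozenset(e) for e in itertools.combinations(root, 2))
    kcliques = [frozenset(c) for c in itertools.combinations(root, k)]
    for v, choice in zip(order[k + 1:], choices):
        clique = kcliques[choice % len(kcliques)]
        edges |= {frozenset((v, u)) for u in clique}
        kcliques += [clique - {u} | {v} for u in clique]
    return edges

def score(edges, weight):
    # Placeholder "informative" score: total weight of the k-tree's edges,
    # e.g. pairwise mutual-information estimates (an assumption, not the
    # score function of the cited paper).
    return sum(weight[min(f)][max(f)] for f in edges)

def sample_and_refine(n, k, weight, n_samples=50, n_moves=200, rng=random):
    """Sample random k-trees, keep the best, then probabilistically hill climb
    by perturbing the construction order."""
    best, best_score = None, float("-inf")
    for _ in range(n_samples):
        order = list(range(n)); rng.shuffle(order)
        choices = [rng.randrange(10**6) for _ in range(n - k - 1)]
        s = score(build_ktree(order, choices, k), weight)
        if s > best_score:
            best, best_score = (order, choices), s
    order, choices = best
    for _ in range(n_moves):
        new_order = order[:]
        i, j = rng.sample(range(n), 2)
        new_order[i], new_order[j] = new_order[j], new_order[i]
        s = score(build_ktree(new_order, choices, k), weight)
        if s > best_score or rng.random() < 0.05:   # occasionally accept worse moves
            order, best_score = new_order, s
    return build_ktree(order, choices, k)

# Hypothetical usage: weight[i][j] could hold an estimate of the mutual
# information between variables i and j computed from data.
# edges = sample_and_refine(n=8, k=3, weight=my_weight_matrix)
```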

Relevance:

30.00%

Publisher:

Abstract:

We present a method for learning treewidth-bounded Bayesian networks from data sets containing thousands of variables. Bounding the treewidth of a Bayesian network greatly reduces the complexity of inferences. Yet, being a global property of the graph, it considerably increases the difficulty of the learning process. Our novel algorithm accomplishes this task, scaling both to large domains and to large treewidths, and consistently outperforms the state of the art in experiments with up to thousands of variables.

Relevance:

30.00%

Publisher:

Abstract:

When something unfamiliar emerges, or when something familiar does something unexpected, people need to make sense of what is emerging or going on in order to act. Social representations theory suggests how individuals and society make sense of the unfamiliar and hence how the resultant social representations (SRs) cognitively, emotionally, and actively orient people and enable communication. SRs are social constructions that emerge through individual and collective engagement with media and with everyday conversations among people. Recent developments in text analysis techniques, and in particular topic modeling, provide a potentially powerful analytical method to examine the structure and content of SRs using large samples of narrative or text. In this paper I describe the methods and results of applying topic modeling to 660 micronarratives collected from Australian academics/researchers, government employees, and members of the public in 2010-2011. The narrative fragments focused on adaptation to climate change (CC) and hence provide an example of Australian society making sense of an emerging and conflict-ridden phenomenon. The results of the topic modeling reflect elements of SRs of adaptation to CC that are consistent with findings in the literature, as well as being reasonably robust predictors of classes of action in response to CC. Bayesian Network (BN) modeling was used to identify relationships among the topics (SR elements) and in particular to identify relationships among topics, sentiment, and action. Finally, the resulting model and topic modeling results are used to highlight differences in the salience of SR elements among social groups. The approach of linking topic modeling and BN modeling offers a new and promising analytical approach for ongoing research on SRs.
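As a rough illustration of the text-analysis step only (a sketch using a generic scikit-learn topic model and made-up narratives, not the study's corpus or pipeline), topic proportions can be extracted from short texts and then passed to a downstream model relating topics, sentiment, and action:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical micronarratives; the study used 660 fragments on climate
# change adaptation collected in 2010-2011.
narratives = [
    "councils need flood planning and better drainage",
    "farmers are changing crop varieties as rainfall shifts",
    "we installed water tanks and shade trees at home",
]

# Bag-of-words representation followed by LDA topic extraction.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(narratives)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X)          # document-topic proportions

# Top words per topic, a common way to label candidate SR elements.
terms = vectorizer.get_feature_names_out()
for t, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[-5:][::-1]]
    print(f"topic {t}: {top}")

# In the paper, topic proportions (with sentiment and reported action) feed a
# Bayesian Network; here theta could likewise serve as features downstream.
```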

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

International audience

Relevance:

30.00%

Publisher:

Abstract:

Dust attenuation affects nearly all observational aspects of galaxy evolution, yet very little is known about the form of the dust-attenuation law in the distant universe. Here, we model the spectral energy distributions of galaxies at z ~ 1.5-3 from CANDELS with rest-frame UV to near-IR imaging under different assumptions about the dust law, and compare the amount of inferred attenuated light with the observed infrared (IR) luminosities. Some individual galaxies show strong Bayesian evidence in preference of one dust law over another, and this preference agrees with their observed location on the plane of infrared excess (IRX, L_TIR/L_UV) and UV slope (β). We generalize the shape of the dust law with an empirical model, A_{λ,δ} = E(B-V) k_λ (λ/λ_V)^δ, where k_λ is the dust law of Calzetti et al., and show that there exists a correlation between the color excess E(B-V) and the tilt δ, with δ = (0.62 ± 0.05) log(E(B-V)) + (0.26 ± 0.02). Galaxies with high color excess have a shallower, starburst-like law, and those with low color excess have a steeper, SMC-like law. Surprisingly, the galaxies in our sample show no correlation between the shape of the dust law and stellar mass, star formation rate, or β. The change in the dust law with color excess is consistent with a model where attenuation is caused by scattering, a mixed star-dust geometry, and/or trends with stellar population age, metallicity, and dust grain size. This rest-frame UV-to-near-IR method shows potential to constrain the dust law at even higher redshifts (z > 3).
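To make the parameterization concrete, a short sketch is given below. It assumes k_λ is the widely used Calzetti et al. (2000) starburst curve with R_V = 4.05, takes λ_V = 0.55 μm, and reads the abstract's fitted relation as a base-10 logarithm; these are illustrative assumptions rather than the paper's exact implementation:

```python
import numpy as np

R_V = 4.05            # Calzetti et al. (2000) normalization
LAMBDA_V = 0.55       # V band, microns

def k_calzetti(lam_um):
    """Calzetti et al. (2000) starburst attenuation curve k(lambda),
    with lambda in microns (valid roughly 0.12-2.2 um)."""
    lam = np.asarray(lam_um, dtype=float)
    blue = 2.659 * (-2.156 + 1.509 / lam - 0.198 / lam**2 + 0.011 / lam**3) + R_V
    red = 2.659 * (-1.857 + 1.040 / lam) + R_V
    return np.where(lam < 0.63, blue, red)

def attenuation(lam_um, ebv, delta):
    """Generalized law A(lambda) = E(B-V) * k(lambda) * (lambda/lambda_V)**delta.
    delta = 0 recovers the Calzetti curve; negative delta gives a steeper,
    more SMC-like curve."""
    lam = np.asarray(lam_um, dtype=float)
    return ebv * k_calzetti(lam) * (lam / LAMBDA_V) ** delta

def delta_from_ebv(ebv):
    # Tilt implied by the abstract's fitted relation (log taken as base 10).
    return 0.62 * np.log10(ebv) + 0.26

# Example: attenuation in magnitudes at rest-frame 1600 A for E(B-V) = 0.15.
ebv = 0.15
print(attenuation(0.16, ebv, delta_from_ebv(ebv)))
```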

Relevance:

30.00%

Publisher:

Abstract:

Understanding how virus strains offer protection against closely related emerging strains is vital for creating effective vaccines. For many viruses, including Foot-and-Mouth Disease Virus (FMDV) and the Influenza virus, where multiple serotypes often co-circulate, in vitro testing of large numbers of vaccines can be infeasible. Therefore the development of an in silico predictor of cross-protection between strains is important to help optimise vaccine choice. Vaccines will offer cross-protection against closely related strains, but not against those that are antigenically distinct. To be able to predict cross-protection we must understand the antigenic variability within a virus serotype and across distinct lineages of a virus, and identify the antigenic residues and evolutionary changes that cause the variability. In this thesis we present a family of sparse hierarchical Bayesian models for detecting relevant antigenic sites in virus evolution (SABRE), as well as an extended version of the method, the extended SABRE (eSABRE) method, which better takes into account the data collection process. The SABRE methods are a family of sparse Bayesian hierarchical models that use spike and slab priors to identify sites in the viral protein that are important for the neutralisation of the virus. In this thesis we demonstrate how the SABRE methods can be used to identify antigenic residues within different serotypes and show how the SABRE method outperforms established methods, such as mixed-effects models based on forward variable selection or l1 regularisation, on both synthetic and viral datasets. In addition we also test a number of different versions of the SABRE method, compare conjugate and semi-conjugate prior specifications, and evaluate an alternative to the spike and slab prior: the binary mask model. We also propose novel proposal mechanisms for the Markov chain Monte Carlo (MCMC) simulations, which improve mixing and convergence over that of the established component-wise Gibbs sampler. The SABRE method is then applied to datasets from FMDV and the Influenza virus in order to identify a number of known antigenic residues and to provide hypotheses about other potentially antigenic residues. We also demonstrate how the SABRE methods can be used to create accurate predictions of the important evolutionary changes of the FMDV serotypes.

In this thesis we also provide an extended version of the SABRE method, the eSABRE method, based on a latent variable model. The eSABRE method further takes into account the structure of the datasets for FMDV and the Influenza virus through the latent variable model, and gives an improvement in the modelling of the error. We show how the eSABRE method outperforms the SABRE methods in simulation studies and propose a new information criterion for selecting the random effects factors that should be included in the eSABRE method: the block integrated Widely Applicable Information Criterion (biWAIC). We demonstrate how biWAIC performs comparably to two other methods for selecting the random effects factors, and combine it with the eSABRE method to apply it to two large Influenza datasets. Inference in these large datasets is computationally infeasible with the SABRE methods, but as a result of the improved structure of the likelihood, we are able to show that the eSABRE method offers a computational improvement, allowing it to be used on these datasets. The results of the eSABRE method show that we can use the method in a fully automatic manner to identify a large number of antigenic residues on a variety of the antigenic sites of two Influenza serotypes, as well as making predictions of a number of nearby sites that may also be antigenic and are worthy of further experimental investigation.
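For readers unfamiliar with the key ingredient, the sketch below shows a minimal spike and slab variable-selection Gibbs sampler for ordinary linear regression with fixed hyperparameters. It is a generic illustration of the prior, not the SABRE or eSABRE model, which add hierarchical structure, random effects, and latent variables:

```python
import numpy as np

def spike_slab_gibbs(X, y, n_iter=2000, tau2=1.0, sigma2=1.0, pi=0.2, seed=0):
    """Toy component-wise Gibbs sampler for spike-and-slab regression:
    y = X @ beta + noise, with beta_j = 0 with probability 1 - pi (spike)
    or beta_j ~ N(0, tau2) (slab).  The noise variance sigma2 and the
    hyperparameters are held fixed for simplicity."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    incl = np.zeros(p)                       # running inclusion counts
    log_prior_odds = np.log(pi / (1.0 - pi))
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]        # residual excluding j
            v = 1.0 / (X[:, j] @ X[:, j] / sigma2 + 1.0 / tau2)
            m = v * (X[:, j] @ r) / sigma2
            # Log Bayes factor for including beta_j, integrating over the slab.
            log_bf = 0.5 * np.log(v / tau2) + 0.5 * m * m / v
            prob = 1.0 / (1.0 + np.exp(-(log_prior_odds + log_bf)))
            if rng.random() < prob:
                beta[j] = rng.normal(m, np.sqrt(v))
                incl[j] += 1
            else:
                beta[j] = 0.0
    return incl / n_iter                     # posterior inclusion frequencies

# Hypothetical example: 3 of 20 predictors truly affect the response.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 20))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + X[:, 7] + rng.normal(size=100)
print(np.round(spike_slab_gibbs(X, y), 2))
```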

Relevance:

20.00%

Publisher:

Abstract:

The present paper describes a novel, simple and reliable differential pulse voltammetric method for determining amitriptyline (AMT) in pharmaceutical formulations. It has been reported by many authors that this antidepressant is electrochemically inactive at carbon electrodes. However, the procedure proposed herein consisted of electrochemically oxidizing AMT at an unmodified carbon nanotube paste electrode in the presence of 0.1 mol L(-1) sulfuric acid used as the electrolyte. At this concentration, the acid facilitated the AMT electroxidation through a one-electron transfer at 1.33 V vs. Ag/AgCl, as observed by the increase in peak current. Under optimized conditions (modulation time of 5 ms, scan rate of 90 mV s(-1), and pulse amplitude of 120 mV), a linear calibration curve was constructed in the range of 0.0-30.0 μmol L(-1), with a correlation coefficient of 0.9991 and a limit of detection of 1.61 μmol L(-1). The procedure was successfully validated for intra- and inter-day precision and accuracy. Moreover, its feasibility was assessed through the analysis of commercial pharmaceutical formulations, and the results were compared to those of the UV-vis spectrophotometric method used as the standard analytical technique recommended by the Brazilian Pharmacopoeia.

Relevance:

20.00%

Publisher:

Abstract:

The present work compared the local injection of mononuclear cells into the spinal cord lateral funiculus with the alternative approach of local delivery with fibrin sealant after ventral root avulsion (VRA) and reimplantation. For that, female adult Lewis rats were divided into the following groups: avulsion only; reimplantation with fibrin sealant; root repair with fibrin sealant associated with mononuclear cells; and repair with fibrin sealant and injected mononuclear cells. Cell therapy resulted in greater survival of spinal motoneurons up to four weeks post-surgery, especially when mononuclear cells were added to the fibrin glue. Injection of mononuclear cells into the lateral funiculus yielded results similar to reimplantation alone. Additionally, mononuclear cells added to the fibrin glue increased neurotrophic factor gene transcript levels in the spinal cord ventral horn. Regarding motor recovery, evaluated by the peroneal functional index as well as paw print pressure, cell-treated rats performed as well as reimplanted-only animals and significantly better than avulsion-only subjects. The results herein demonstrate that mononuclear cell therapy is neuroprotective by increasing levels of brain-derived neurotrophic factor (BDNF) and glial-derived neurotrophic factor (GDNF). Moreover, delivery of mononuclear cells with the fibrin sealant gave the best and most long-lasting results.

Relevance:

20.00%

Publisher:

Abstract:

It is well known that long-term use of shampoo causes damage to human hair. Although the Lowry method has been widely used to quantify hair damage, it is unsuitable in the presence of some surfactants, and no alternative method has been proposed in the literature. In this work, a different method is used to investigate and compare the hair damage induced by four types of surfactants (including three commercial-grade surfactants) and water. Hair samples were immersed in aqueous solutions of the surfactants under conditions that resemble a shower (38 °C, constant shaking). These solutions became colored with time of contact with hair, and their UV-vis spectra were recorded. For comparison, the amounts of protein extracted from hair by sodium dodecyl sulfate (SDS) and by water were estimated by the Lowry method. Additionally, non-pigmented and pigmented hair, as well as sepia melanin, were used to interpret the color of the washing solutions and their spectra. The results presented herein show that hair degradation is mostly caused by the extraction of proteins, cuticle fragments and melanin granules from the hair fiber. It was found that the intensity of the solution color varies with the charge density of the surfactants. Furthermore, the intensity of the solution color can be correlated with the amount of protein quantified by the Lowry method as well as with the degree of hair damage. The UV-vis spectrum of the hair washing solutions is thus a simple and straightforward way to quantify and compare the hair damage induced by different commercial surfactants.

Relevance:

20.00%

Publisher:

Abstract:

In this study, the transmission-line modeling (TLM) method applied to bio-thermal problems was improved by incorporating several novel computational techniques. These include the application of graded meshes, which resulted in computation nine times faster while using only a fraction (16%) of the computational resources required by regular meshes when analyzing heat flow through heterogeneous media. Graded meshes, unlike regular meshes, allow heat sources to be modeled in all segments of the mesh. A new boundary condition that considers thermal properties, resulting in more realistic modeling of complex problems, is introduced. Also, a new way of calculating an error parameter is introduced. The calculated temperatures between nodes were compared against results from the literature and agreed to within less than 1%. It is reasonable, therefore, to conclude that the improved TLM model described herein has great potential for modeling heat transfer in biological systems.

Relevance:

20.00%

Publisher:

Abstract:

It is well known that trichomes protect plant organs, and several studies have investigated their role in the adaptation of plants to harsh environments. Recent studies have shown that the production of hydrophilic substances by glandular trichomes and the deposition of this secretion on young organs may facilitate water retention, thus preventing desiccation and favouring organ growth until the plant develops other protective mechanisms. Lychnophora diamantinana is a species endemic to the Brazilian 'campos rupestres' (rocky fields), a region characterized by intense solar radiation and water deficits. This study sought to investigate trichomes and the origin of the substances observed on the stem apices of L. diamantinana. Samples of stem apices, young and expanded leaves were studied using standard techniques, including light microscopy and scanning and transmission electron microscopy. Histochemical tests were used to identify the major groups of metabolites present in the trichomes and the hyaline material deposited on the apices. Non-glandular trichomes and glandular trichomes were observed. The material deposited on the stem apices was hyaline, highly hydrophilic and viscous. This hyaline material primarily consists of carbohydrates that result from the partial degradation of the cell wall of uniseriate trichomes. This degradation occurs at the same time that glandular trichomes secrete terpenoids, phenolic compounds and proteins. These results suggest that the non-glandular trichomes on the leaves of L. diamantinana help protect the young organ, particularly against desiccation, by deposition of highly hydrated substances on the apices. Furthermore, the secretion of glandular trichomes probably repels herbivore and pathogen attacks.

Relevance:

20.00%

Publisher:

Abstract:

To determine the most adequate number and size of tissue microarray (TMA) cores for pleomorphic adenoma immunohistochemical studies, eighty-two pleomorphic adenoma cases were distributed in 3 TMA blocks assembled in triplicate containing 1.0-, 2.0-, and 3.0-mm cores. Immunohistochemical analysis against cytokeratin 7, Ki67, p63, and CD34 was performed and subsequently evaluated with PixelCount, nuclear, and microvessel software applications. The 1.0-mm TMA yielded lower values than the 2.0- and 3.0-mm TMAs when compared with conventional whole-section slides. Possibly because of an increased amount of stromal tissue, the 3.0-mm cores presented a higher microvessel density. Comparing the results obtained with one, two, and three 2.0-mm cores, there was no difference between triplicate or duplicate TMAs and a single-core TMA. Considering the possible loss of cylinders during immunohistochemical reactions, 2.0-mm TMAs in duplicate are the more reliable approach for immunohistochemical studies of pleomorphic adenoma.

Relevance:

20.00%

Publisher:

Abstract:

An HPLC-PAD method using a gold working electrode and a triple-potential waveform was developed for the simultaneous determination of streptomycin and dihydrostreptomycin in veterinary drugs. Glucose was used as the internal standard, and the triple-potential waveform was optimized using a factorial and a central composite design. The optimum potentials were as follows: amperometric detection, E1=-0.15V; cleaning potential, E2=+0.85V; and reactivation of the electrode surface, E3=-0.65V. For the separation of the aminoglycosides and the internal standard of glucose, a CarboPac™ PA1 anion exchange column was used together with a mobile phase consisting of a 0.070 mol L(-1) sodium hydroxide solution in the isocratic elution mode with a flow rate of 0.8 mL min(-1). The method was validated and applied to the determination of streptomycin and dihydrostreptomycin in veterinary formulations (injection, suspension and ointment) without any previous sample pretreatment, except for the ointments, for which a liquid-liquid extraction was required before HPLC-PAD analysis. The method showed adequate selectivity, with an accuracy of 98-107% and a precision of less than 3.9%.