965 results for Bayesian Analysis


Relevance:

30.00%

Publisher:

Abstract:

When something unfamiliar emerges, or when something familiar does something unexpected, people need to make sense of what is going on in order to act. Social representations theory suggests how individuals and society make sense of the unfamiliar and hence how the resultant social representations (SRs) cognitively, emotionally, and actively orient people and enable communication. SRs are social constructions that emerge through individual and collective engagement with media and with everyday conversations among people. Recent developments in text analysis techniques, and in particular topic modeling, provide a potentially powerful analytical method to examine the structure and content of SRs using large samples of narrative or text. In this paper I describe the methods and results of applying topic modeling to 660 micronarratives collected from Australian academics/researchers, government employees, and members of the public in 2010-2011. The narrative fragments focused on adaptation to climate change (CC) and hence provide an example of Australian society making sense of an emerging and conflict-ridden phenomenon. The results of the topic modeling reflect elements of SRs of adaptation to CC that are consistent with findings in the literature, as well as being reasonably robust predictors of classes of action in response to CC. Bayesian Network (BN) modeling was then used to identify relationships among the topics (SR elements) and, in particular, among topics, sentiment, and action. Finally, the resulting model and topic modeling results are used to highlight differences in the salience of SR elements among social groups. Linking topic modeling and BN modeling offers a new and encouraging approach to analysis for ongoing research on SRs.
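As a rough illustration of the topic-modeling step described above, the sketch below fits a latent Dirichlet allocation model to a handful of toy micronarratives with scikit-learn and prints the top words per topic. The corpus, the number of topics, and all variable names are illustrative assumptions, not the study's actual data or pipeline.

```python
# Minimal sketch of the topic-modelling step (not the study's actual pipeline).
# Assumes a list of micronarrative strings; corpus, n_components, and names are illustrative.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

micronarratives = [
    "we need to adapt our farming practices to a drier climate",
    "government policy on climate change keeps shifting",
    # ... the study used 660 narrative fragments
]

vectorizer = CountVectorizer(stop_words="english", min_df=1)
dtm = vectorizer.fit_transform(micronarratives)          # document-term matrix

lda = LatentDirichletAllocation(n_components=5, random_state=0)
doc_topics = lda.fit_transform(dtm)                      # per-document topic proportions

# Top words per topic: the raw material for interpreting SR elements
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:8]]
    print(f"topic {k}: {', '.join(top)}")
```

The per-document topic proportions (doc_topics) are what a downstream Bayesian Network model would relate to sentiment and reported action.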

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents quantitative studies of T cell and dendritic cell (DC) behaviour in mouse lymph nodes (LNs) in the naive state and following immunisation. These processes are of importance and interest in basic immunology, and a better understanding could improve both diagnostic capacity and therapeutic manipulation, potentially helping to produce more effective vaccines or to develop treatments for autoimmune diseases. The problem is also interesting conceptually, as it is relevant to other fields where the 3D movement of objects is tracked with a discrete scanning interval. A general immunology introduction is presented in chapter 1. In chapter 2, I apply quantitative methods to multi-photon imaging data to measure how T cells and DCs are spatially arranged in LNs. This arrangement has previously been studied to describe differences between the naive and immunised states and as an indicator of the magnitude of the immune response in LNs, but previous analyses have been largely descriptive. The quantitative analysis shows that some of the previous conclusions may have been premature. In chapter 3, I use Bayesian state-space models to test hypotheses about the mode of T cell search for DCs. A two-state mode of movement, in which T cells can be classified as either interacting with a DC or freely migrating, is supported over a model in which T cells home in on DCs at a distance through, for example, the action of chemokines. In chapter 4, I study whether T cell migration is linked to the geometric structure of the fibroblast reticular cell (FRC) network. I find support for the hypothesis that movement is constrained to the FRC network over an alternative 'random walk with persistence time' model in which cells move randomly, with a short-term persistence driven by a hypothetical T cell intrinsic 'clock'. I also present unexpected results on the FRC network geometry. Finally, a quantitative method is presented for addressing some measurement biases inherent to multi-photon imaging. In all three chapters, novel findings are made, and the methods developed have the potential for further use on important problems in the field. In chapter 5, I present a summary and synthesis of the results from chapters 3-4 and a more speculative discussion of these results and potential future directions.
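To give a concrete flavour of the two-state idea, here is a minimal sketch that decodes a sequence of T cell step speeds into "interacting" versus "migrating" states with a two-state Gaussian hidden Markov model and the Viterbi algorithm. This is a simplified stand-in for the Bayesian state-space models actually fitted in the thesis; the speeds, state parameters, and transition probability are illustrative assumptions.

```python
# Two-state interpretation of T cell step speeds:
# state 0 ~ "interacting with a DC" (slow), state 1 ~ "freely migrating" (fast).
# A simple Gaussian HMM decoded with Viterbi, not the thesis's Bayesian state-space model.
import numpy as np

def viterbi_two_state(speeds, means=(2.0, 10.0), sds=(1.0, 3.0), p_stay=0.9):
    """Most likely state sequence for observed step speeds (um/min)."""
    speeds = np.asarray(speeds, dtype=float)
    n = len(speeds)
    log_trans = np.log(np.array([[p_stay, 1 - p_stay],
                                 [1 - p_stay, p_stay]]))
    # Gaussian log-likelihood of each observation under each state, shape (n, 2)
    log_emit = np.stack([
        -0.5 * ((speeds - m) / s) ** 2 - np.log(s * np.sqrt(2 * np.pi))
        for m, s in zip(means, sds)
    ], axis=1)

    delta = np.zeros((n, 2))
    back = np.zeros((n, 2), dtype=int)
    delta[0] = np.log(0.5) + log_emit[0]
    for t in range(1, n):
        scores = delta[t - 1][:, None] + log_trans   # [previous state, current state]
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emit[t]

    states = np.zeros(n, dtype=int)
    states[-1] = delta[-1].argmax()
    for t in range(n - 2, -1, -1):
        states[t] = back[t + 1, states[t + 1]]
    return states                                    # 0 = interacting, 1 = migrating

print(viterbi_two_state([1.5, 2.2, 1.8, 9.0, 11.5, 10.2, 2.4]))
```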

Relevance:

30.00%

Publisher:

Abstract:

In Australia, as in many other parts of the world, fumigation with phosphine is a vital component of controlling stored-grain insect pests. However, resistance may limit the continued efficacy of this fumigant. While strong resistance to phosphine has been identified and characterised, very little information is available on the causes of its development and spread. Data obtained from a unique national resistance monitoring and management program were analysed, using Bayesian hurdle modelling, to determine which factors may be responsible. Fumigation in unsealed storages, combined with a high frequency of weak resistance, was found to be the main factor leading to the development of strong resistance in Sitophilus oryzae. Independent development, rather than gene flow via migration, appears to be primarily responsible for the geographic incidence of strong resistance to phosphine in S. oryzae. This information can now be used to direct resources and education into high-risk areas and to refine phosphine resistance management strategies.
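The binary component of a hurdle model (whether strong resistance is detected at all) can be sketched as a Bayesian logistic regression; the full hurdle model adds a truncated count component for sites where resistance is found. The sketch below uses PyMC with simulated data; the predictors, priors, and coefficient values are illustrative assumptions, not the monitoring-program data or the paper's actual model.

```python
# Sketch of the binary "hurdle" component: a Bayesian logistic regression for whether
# strong phosphine resistance is detected at a storage site. Data are simulated.
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n = 200
unsealed = rng.integers(0, 2, n)              # 1 = fumigation in an unsealed storage
weak_freq = rng.uniform(0, 1, n)              # local frequency of weak resistance
logit_p = -3.0 + 1.5 * unsealed + 2.0 * weak_freq
strong = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

with pm.Model():
    b0 = pm.Normal("intercept", 0, 2)
    b1 = pm.Normal("b_unsealed", 0, 2)
    b2 = pm.Normal("b_weak_freq", 0, 2)
    p = pm.math.invlogit(b0 + b1 * unsealed + b2 * weak_freq)
    pm.Bernoulli("strong_resistance", p=p, observed=strong)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

# Posterior means of the two illustrative risk factors
print(idata.posterior[["b_unsealed", "b_weak_freq"]].mean())
```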

Relevance:

30.00%

Publisher:

Abstract:

Many existing encrypted Internet protocols leak information through packet sizes and timing. Though seemingly innocuous, prior work has shown that such leakage can be used to recover part or all of the plaintext being encrypted. The prevalence of encrypted protocols as the underpinning of critical services such as e-commerce, remote login, and anonymity networks, together with the increasing feasibility of attacks on these services, represents a considerable risk to communications security. Existing mechanisms for preventing traffic analysis focus on re-routing and padding. These prevention techniques have considerable resource and overhead requirements. Furthermore, padding is easily detectable and, in some cases, can introduce its own vulnerabilities. To address these shortcomings, we propose embedding real traffic in synthetically generated encrypted cover traffic. Novel to our approach is the use of realistic network protocol behavior models to generate cover traffic. The observable traffic we generate also has the benefit of being indistinguishable from other real encrypted traffic, further thwarting an adversary's ability to target attacks. In this dissertation, we introduce the design of a proxy system called TrafficMimic that implements realistic cover traffic tunneling and can be used alone or integrated with the Tor anonymity system. We describe the cover traffic generation process, including the subtleties of implementing a secure traffic generator. We show that TrafficMimic cover traffic can fool a complex protocol classification attack with 91% of the accuracy of real traffic. TrafficMimic cover traffic is also not detected by a binary classification attack specifically designed to detect it. We evaluate the performance of tunneling with independent cover traffic models and find that they are comparable to, and in some cases more efficient than, generic constant-rate defenses. We then use simulation and analytic modeling to understand the performance of cover traffic tunneling more deeply. We find that we can take measurements from real or simulated traffic with no tunneling and use them to estimate parameters for an accurate analytic model of the performance impact of cover traffic tunneling. Once validated, we use this model to better understand how delay, bandwidth, tunnel slowdown, and stability affect cover traffic tunneling. Finally, we take the insights from our simulation study and develop several biasing techniques that match the cover traffic to the real traffic while bounding external information leakage. We study these bias methods using simulation and evaluate their security with a Bayesian inference attack. We find that we can safely improve performance with biasing while preventing both traffic analysis and defense detection attacks. We then apply these biasing methods to the real TrafficMimic implementation and evaluate it on the Internet. We find that biasing can provide a 3-5x improvement in bandwidth for bulk transfers and a 2.5-9.5x speedup for Web browsing over tunneling without biasing.
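As a toy illustration of the kind of classification attack such a defense must resist, the sketch below trains a naive Bayes classifier on two synthetic per-flow features (mean packet size and mean inter-arrival time). The features, data, and class labels are fabricated assumptions; this is not the dissertation's protocol classifier or its Bayesian inference attack.

```python
# Illustrative stand-in for a traffic-classification attack: naive Bayes over
# simple per-flow features. Synthetic data only; not TrafficMimic's evaluation pipeline.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synth_flows(n, size_mu, iat_mu, label):
    sizes = rng.normal(size_mu, 100, n)        # mean packet size per flow (bytes)
    iats = rng.exponential(iat_mu, n)          # mean inter-arrival time (ms)
    return np.column_stack([sizes, iats]), np.full(n, label)

X_web, y_web = synth_flows(500, 800, 20, 0)    # "web browsing"-like flows
X_ssh, y_ssh = synth_flows(500, 120, 150, 1)   # "interactive SSH"-like flows
X = np.vstack([X_web, X_ssh])
y = np.concatenate([y_web, y_ssh])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GaussianNB().fit(X_tr, y_tr)
print("classification accuracy:", clf.score(X_te, y_te))
```

Effective cover traffic aims to make the feature distributions of tunneled flows indistinguishable from those of the cover protocol, driving such a classifier toward chance accuracy.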

Relevance:

30.00%

Publisher:

Abstract:

Statistical methodology is proposed for comparing molecular shapes. In order to account for the continuous nature of molecules, classical shape analysis methods are combined with techniques used for predicting random fields in spatial statistics. Applying a modification of Procrustes analysis, Bayesian inference is carried out using Markov chain Monte Carlo methods for the pairwise alignment of the resulting molecular fields. Superimposing entire fields rather than the configuration matrices of nuclear positions thereby solves the problem that there is usually no clear one-to-one correspondence between the atoms of the two molecules under consideration. Using a similar concept, we also propose an adaptation of the generalised Procrustes analysis algorithm for the simultaneous alignment of multiple molecular fields. The methodology is applied to a dataset of 31 steroid molecules.
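Ordinary Procrustes alignment of two landmark configurations is the rigid-body building block that the field-based method generalises. A minimal numpy sketch, with illustrative coordinates rather than real nuclear positions or molecular fields:

```python
# Classical (ordinary) Procrustes alignment of two point configurations.
# Coordinates are illustrative, not real nuclear positions.
import numpy as np

def procrustes_align(X, Y):
    """Rotate/translate Y onto X (both k x 3 matrices of landmark coordinates)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    U, _, Vt = np.linalg.svd(Xc.T @ Yc)
    # Guard against reflections: force a proper rotation (det = +1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt
    Y_aligned = Yc @ R.T + X.mean(axis=0)
    rmsd = np.sqrt(((Y_aligned - X) ** 2).sum(axis=1).mean())
    return Y_aligned, rmsd

X = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
theta = np.pi / 6
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
Y = X @ Rz.T + np.array([2.0, -1.0, 0.5])      # rotated and shifted copy of X
print(procrustes_align(X, Y)[1])               # RMSD ~ 0 after alignment
```

The paper's approach replaces the landmark matrices with continuous molecular fields and treats the alignment parameters within a Bayesian MCMC scheme, avoiding the need for an explicit atom-to-atom correspondence.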

Relevance:

30.00%

Publisher:

Abstract:

The aim of the present study was to propose and evaluate the use of factor analysis (FA) to obtain latent variables (factors) that represent a set of pig traits simultaneously, for use in genome-wide selection (GWS) studies. We used an outbred F2 population obtained from crosses between Brazilian Piau and commercial pigs. Data were obtained on 345 F2 pigs, genotyped for 237 SNPs, with records for 41 traits. FA allowed us to obtain four biologically interpretable factors: "weight", "fat", "loin", and "performance". These factors were used as dependent variables in multiple regression models of genomic selection (Bayes A, Bayes B, RR-BLUP, and Bayesian LASSO). FA is presented as an interesting alternative for selecting individuals on multiple variables simultaneously in GWS studies; accuracy measurements of the factors were similar to those obtained when the original traits were considered individually. The overlap between the top 10% of individuals selected by each factor and those selected by the corresponding individual traits was also satisfactory. Moreover, the estimated marker effects for the traits were similar to those found for the relevant factor.
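A minimal sketch of the two-stage idea, using scikit-learn: extract latent factors from a trait matrix with factor analysis, then predict a factor score from SNP genotypes with a shrinkage regression (ridge regression here as a simple stand-in for RR-BLUP; the Bayesian alternatives would replace that step). The data are simulated and only the dimensions mirror those reported in the abstract.

```python
# Two-stage sketch: latent factors from many traits, then genomic prediction of a factor.
# Simulated data; ridge regression stands in for RR-BLUP / Bayesian shrinkage models.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import Ridge

rng = np.random.default_rng(7)
n_animals, n_traits, n_snps = 345, 41, 237

snps = rng.integers(0, 3, size=(n_animals, n_snps)).astype(float)  # 0/1/2 genotypes
traits = rng.normal(size=(n_animals, n_traits))                    # placeholder phenotypes

# Stage 1: summarise the 41 traits with 4 latent factors
fa = FactorAnalysis(n_components=4, random_state=0)
factor_scores = fa.fit_transform(traits)             # (n_animals, 4)

# Stage 2: genomic prediction of one factor from SNPs
model = Ridge(alpha=10.0).fit(snps, factor_scores[:, 0])
gebv = model.predict(snps)                            # genomic values for the factor
print("accuracy (correlation):", np.corrcoef(gebv, factor_scores[:, 0])[0, 1])
```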

Relevance:

30.00%

Publisher:

Abstract:

This dissertation focused on the longitudinal analysis of business start-ups using three waves of data from the Kauffman Firm Survey. The first essay used data from 2004-2008 and examined the simultaneous relationship between a firm's capital structure, its human resource policies, and their impact on the level of innovation. Firm leverage was calculated as debt divided by total financial resources. An index of employee well-being was constructed from a set of nine dichotomous questions asked in the survey. A negative binomial fixed effects model was used to analyze the effect of employee well-being and leverage on the count of patents and copyrights, which was used as a proxy for innovation. The essay demonstrated that employee well-being positively affects a firm's innovation, while a higher leverage ratio had a negative impact on innovation. No significant relation was found between leverage and employee well-being. The second essay used data from 2004-2009 and asked whether a higher entrepreneurial speed of learning is desirable, and whether there is a linkage between the speed of learning and the growth rate of the firm. The change in the speed of learning was measured using a pooled OLS estimator in repeated cross-sections. There was evidence of a declining speed of learning over time, and it was concluded that a higher speed of learning is not necessarily a good thing, because the speed of learning is contingent on the entrepreneur's initial knowledge and the precision of the signals received from the market. There was also no reason to expect the speed of learning to be related to the growth of the firm in one direction over another. The third essay used data from 2004-2010 and examined the timing of diversification by business start-ups. It captured when a start-up diversified for the first time and explored the association between an early diversification strategy and a firm's survival rate. A semi-parametric Cox proportional hazards model was used to examine the survival pattern. The results demonstrated that firms diversifying at an early stage in their lives show a higher survival rate; however, this effect fades over time.
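The survival analysis in the third essay can be sketched with a Cox proportional hazards fit, for example using the lifelines package on a synthetic data frame. The column names, the single covariate, and the data are illustrative assumptions, not the Kauffman Firm Survey variables.

```python
# Sketch of a Cox proportional hazards model relating early diversification
# to start-up survival. Synthetic data; column names are illustrative.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 500
early_div = rng.integers(0, 2, n)                  # diversified early (1) or not (0)
# Longer survival times, on average, for early diversifiers in this toy data
time = rng.exponential(scale=4 + 2 * early_div, size=n).round(1)
observed = rng.binomial(1, 0.8, n)                 # 1 = firm exit observed, 0 = censored

df = pd.DataFrame({"years": time, "exit": observed, "early_div": early_div})
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="exit")
print(cph.summary[["coef", "exp(coef)", "p"]])     # hazard ratio for early_div
```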

Relevance:

30.00%

Publisher:

Abstract:

Background: Raised blood pressure is an important risk factor for cardiovascular diseases and chronic kidney disease. We estimated worldwide trends in mean systolic and mean diastolic blood pressure, and the prevalence of, and number of people with, raised blood pressure, defined as systolic blood pressure of 140 mm Hg or higher or diastolic blood pressure of 90 mm Hg or higher. Methods: For this analysis, we pooled national, subnational, or community population-based studies that had measured blood pressure in adults aged 18 years and older. We used a Bayesian hierarchical model to estimate trends from 1975 to 2015 in mean systolic and mean diastolic blood pressure, and the prevalence of raised blood pressure for 200 countries. We calculated the contributions of changes in prevalence versus population growth and ageing to the increase in the number of adults with raised blood pressure. Findings: We pooled 1479 studies that had measured the blood pressures of 19·1 million adults. Global age-standardised mean systolic blood pressure in 2015 was 127·0 mm Hg (95% credible interval 125·7–128·3) in men and 122·3 mm Hg (121·0–123·6) in women; age-standardised mean diastolic blood pressure was 78·7 mm Hg (77·9–79·5) for men and 76·7 mm Hg (75·9–77·6) for women. Global age-standardised prevalence of raised blood pressure was 24·1% (21·4–27·1) in men and 20·1% (17·8–22·5) in women in 2015. Mean systolic and mean diastolic blood pressure decreased substantially from 1975 to 2015 in high-income western and Asia Pacific countries, moving these countries from having some of the highest worldwide blood pressure in 1975 to the lowest in 2015. Mean blood pressure also decreased in women in central and eastern Europe, Latin America and the Caribbean, and, more recently, central Asia, Middle East, and north Africa, but the estimated trends in these super-regions had larger uncertainty than in high-income super-regions. By contrast, mean blood pressure might have increased in east and southeast Asia, south Asia, Oceania, and sub-Saharan Africa. In 2015, central and eastern Europe, sub-Saharan Africa, and south Asia had the highest blood pressure levels. Prevalence of raised blood pressure decreased in high-income and some middle-income countries; it remained unchanged elsewhere. The number of adults with raised blood pressure increased from 594 million in 1975 to 1·13 billion in 2015, with the increase largely in low-income and middle-income countries. The global increase in the number of adults with raised blood pressure is a net effect of increase due to population growth and ageing, and decrease due to declining age-specific prevalence. Interpretation: During the past four decades, the highest worldwide blood pressure levels have shifted from high-income countries to low-income countries in south Asia and sub-Saharan Africa due to opposite trends, while blood pressure has been persistently high in central and eastern Europe.
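A heavily simplified sketch of the kind of Bayesian hierarchical model involved: country-specific linear trends in mean systolic blood pressure, partially pooled through common hyperpriors, written in PyMC on simulated data. The actual analysis is far richer (age and sex structure, regional nesting, non-linear time trends, study-level covariates); everything below is an illustrative assumption.

```python
# Sketch of a hierarchical model for country-level trends in mean systolic BP.
# Simulated data; not the study's model or data.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_countries, n_years = 20, 41                     # e.g. 1975-2015
year = np.tile(np.arange(n_years), n_countries) / n_years
country = np.repeat(np.arange(n_countries), n_years)
true_slope = rng.normal(-3, 2, n_countries)       # mm Hg change over the period
sbp = 130 + true_slope[country] * year + rng.normal(0, 2, n_countries * n_years)

with pm.Model():
    mu_a = pm.Normal("mu_a", 130, 20)
    sd_a = pm.HalfNormal("sd_a", 10)
    mu_b = pm.Normal("mu_b", 0, 10)
    sd_b = pm.HalfNormal("sd_b", 10)
    a = pm.Normal("a", mu_a, sd_a, shape=n_countries)   # country intercepts
    b = pm.Normal("b", mu_b, sd_b, shape=n_countries)   # country trends
    sigma = pm.HalfNormal("sigma", 5)
    pm.Normal("obs", mu=a[country] + b[country] * year, sigma=sigma, observed=sbp)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

print(idata.posterior["mu_b"].mean())             # pooled mean trend over the period
```

Partial pooling is what lets countries with sparse data borrow strength from the global trend while countries with rich data retain their own estimates.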

Relevance:

30.00%

Publisher:

Abstract:

Background: One of the global targets for non-communicable diseases is to halt, by 2025, the rise in the age-standardised adult prevalence of diabetes at its 2010 levels. We aimed to estimate worldwide trends in diabetes, how likely it is for countries to achieve the global target, and how changes in prevalence, together with population growth and ageing, are affecting the number of adults with diabetes. Methods: We pooled data from population-based studies that had collected data on diabetes through measurement of its biomarkers. We used a Bayesian hierarchical model to estimate trends in diabetes prevalence - defined as fasting plasma glucose of 7·0 mmol/L or higher, or history of diagnosis with diabetes, or use of insulin or oral hypoglycaemic drugs - in 200 countries and territories in 21 regions, by sex and from 1980 to 2014. We also calculated the posterior probability of meeting the global diabetes target if post-2000 trends continue. Findings: We used data from 751 studies including 4 372 000 adults from 146 of the 200 countries we make estimates for. Global age-standardised diabetes prevalence increased from 4·3% (95% credible interval 2·4-7·0) in 1980 to 9·0% (7·2-11·1) in 2014 in men, and from 5·0% (2·9-7·9) to 7·9% (6·4-9·7) in women. The number of adults with diabetes in the world increased from 108 million in 1980 to 422 million in 2014 (28·5% due to the rise in prevalence, 39·7% due to population growth and ageing, and 31·8% due to the interaction of these two factors). Age-standardised adult diabetes prevalence in 2014 was lowest in northwestern Europe, and highest in Polynesia and Micronesia, at nearly 25%, followed by Melanesia and the Middle East and north Africa. Between 1980 and 2014 there was little change in age-standardised diabetes prevalence in adult women in continental western Europe, although crude prevalence rose because of ageing of the population. By contrast, age-standardised adult prevalence rose by 15 percentage points in men and women in Polynesia and Micronesia. In 2014, American Samoa had the highest national prevalence of diabetes (>30% in both sexes), with age-standardised adult prevalence also higher than 25% in some other islands in Polynesia and Micronesia. If post-2000 trends continue, the probability of meeting the global target of halting the rise in the prevalence of diabetes by 2025 at the 2010 level worldwide is lower than 1% for men and is 1% for women. Only nine countries for men and 29 countries for women, mostly in western Europe, have a 50% or higher probability of meeting the global target. Interpretation: Since 1980, age-standardised diabetes prevalence in adults has increased, or at best remained unchanged, in every country. Together with population growth and ageing, this rise has led to a near quadrupling of the number of adults with diabetes worldwide. The burden of diabetes, both in terms of prevalence and number of adults affected, has increased faster in low-income and middle-income countries than in high-income countries.
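The "posterior probability of meeting the global target" has a simple mechanical reading: the fraction of posterior draws in which the projected 2025 prevalence is at or below the 2010 prevalence. A tiny sketch with fabricated posterior draws:

```python
# How a "probability of meeting the target" is read off posterior draws.
# The draws below are fabricated for illustration only.
import numpy as np

rng = np.random.default_rng(0)
prev_2010 = rng.normal(0.090, 0.005, 4000)        # posterior draws of 2010 prevalence
prev_2025 = rng.normal(0.110, 0.010, 4000)        # projected 2025 prevalence draws

p_meet_target = np.mean(prev_2025 <= prev_2010)
print(f"posterior probability of halting the rise by 2025: {p_meet_target:.3f}")
```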

Relevance:

30.00%

Publisher:

Abstract:

This paper estimates Bejarano and Charry's (2014) small open economy model with financial frictions for the Colombian economy using Bayesian estimation techniques. Additionally, I compute the welfare gains of adding an optimal response to credit spreads to an augmented Taylor rule. The main result is that reacting to credit spreads does not imply significant welfare gains unless economic disturbances become more volatile, as in the disruption implied by a financial crisis; otherwise, its impact on macroeconomic variables is negligible.
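A minimal sketch of a Taylor rule augmented with a response to credit spreads, the kind of policy rule whose welfare implications the paper evaluates. The functional form, smoothing, and coefficient values are illustrative assumptions, not the paper's estimated or optimal parameters.

```python
# Illustrative Taylor rule with interest-rate smoothing and a credit-spread response.
# Coefficients are illustrative, not the paper's estimated or optimal values.
def augmented_taylor_rule(i_prev, inflation_gap, output_gap, credit_spread,
                          rho=0.8, phi_pi=1.5, phi_y=0.125, phi_s=0.5, r_star=0.01):
    """Nominal policy rate combining a smoothed lag and a spread-augmented target."""
    target = r_star + phi_pi * inflation_gap + phi_y * output_gap - phi_s * credit_spread
    return rho * i_prev + (1 - rho) * target

# A widening spread (e.g. a financial-crisis shock) pushes the rule toward easing
print(augmented_taylor_rule(0.04, 0.01, -0.02, 0.00))
print(augmented_taylor_rule(0.04, 0.01, -0.02, 0.03))
```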

Relevance:

30.00%

Publisher:

Abstract:

The cerebral cortex presents self-similarity over a proper interval of spatial scales, a property typical of natural objects exhibiting fractal geometry. Its complexity can therefore be characterized by the value of its fractal dimension (FD). The computation of this metric has usually employed a frequentist approach to probability, with point-estimator methods yielding only the optimal values of the FD. In our study, we aimed at retrieving a more complete evaluation of the FD by using a Bayesian model for the linear regression step of the box-counting algorithm. We used T1-weighted MRI data of 86 healthy subjects (age 44.2 ± 17.1 years, mean ± standard deviation, 48% males) in order to gain insight into the confidence of our measure and to investigate the relationship between mean Bayesian FD and age. Our approach yielded a stronger and significant (P < .001) correlation between mean Bayesian FD and age compared with the previous implementation. Our results therefore suggest that the Bayesian FD is a more reliable estimate of the fractal dimension of the cerebral cortex than the frequentist FD.
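A minimal sketch of the box-counting estimate that underlies the FD: count occupied boxes at several scales and regress log(count) on log(1/box size); the slope is the FD. The paper's contribution is to replace this ordinary least-squares fit with a Bayesian linear regression, yielding a full posterior for the FD rather than a point estimate. Here a toy 2D binary image stands in for the segmented cortex.

```python
# Box-counting fractal dimension of a binary image, with an OLS fit of
# log(count) vs log(1/size). A toy ring stands in for the cortical surface.
import numpy as np

def box_count(image, size):
    """Number of size x size boxes containing at least one foreground pixel."""
    h, w = image.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if image[i:i + size, j:j + size].any():
                count += 1
    return count

# Toy object: a thin ring on a 256 x 256 grid (its FD is close to 1)
n = 256
yy, xx = np.mgrid[0:n, 0:n]
r = np.hypot(xx - n / 2, yy - n / 2)
image = np.abs(r - n / 4) < 1.5

sizes = np.array([2, 4, 8, 16, 32])
counts = np.array([box_count(image, s) for s in sizes])
slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
print("box-counting fractal dimension estimate:", round(slope, 2))
```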

Relevance:

30.00%

Publisher:

Abstract:

Despite the success of the ΛCDM model in describing the Universe, a possible tension between early- and late-Universe cosmological measurements is calling for new independent cosmological probes. Amongst the most promising, gravitational waves (GWs) can provide a self-calibrated measurement of the luminosity distance. However, to obtain cosmological constraints, additional information is needed to break the degeneracy between parameters in the gravitational waveform. In this thesis, we exploit the latest LIGO-Virgo-KAGRA Gravitational Wave Transient Catalog (GWTC-3) of GW sources to constrain the background cosmological parameters together with the astrophysical properties of Binary Black Holes (BBHs), using information from their mass distribution. We expand the public code MGCosmoPop, previously used for the application of this technique, by implementing a state-of-the-art model for the mass distribution, needed to account for the presence of non-trivial features: a truncated power law with two additional Gaussian peaks, referred to as Multipeak. We then analyse GWTC-3, comparing this model with simpler and more commonly adopted ones, both with fixed and with varying cosmology, and assess their goodness of fit with different model selection criteria as well as their constraining power on the cosmological and population parameters. We also explore different sampling methods, namely Markov Chain Monte Carlo and Nested Sampling, comparing their performance and evaluating the advantages of each. We find concurring evidence that the Multipeak model is favoured by the data, in line with previous results, and show that this conclusion is robust to variation of the cosmological parameters. We find a constraint on the Hubble constant of H0 = 61.10 (+38.65/−22.43) km/s/Mpc (68% C.L.), which shows the potential of this method for providing independent constraints on cosmological parameters. The results obtained in this work have been included in [1].
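A minimal sketch of a "Multipeak"-style primary-mass density: a truncated power law plus two Gaussian peaks, normalised numerically over the allowed mass range. The parameter values and the exact parametrisation are illustrative assumptions, not MGCosmoPop's implementation or the GWTC-3 posteriors.

```python
# Illustrative "Multipeak"-style primary-mass density: truncated power law
# plus two Gaussian peaks, normalised numerically. Parameter values are illustrative.
import numpy as np

def multipeak_pdf(m, alpha=3.4, m_min=5.0, m_max=87.0,
                  lam=0.05, lam1=0.7,
                  mu1=10.0, sig1=1.5, mu2=34.0, sig2=3.0):
    """Density of the primary mass (solar masses) under the mixture model."""
    def unnorm(x):
        x = np.asarray(x, dtype=float)
        inside = (x >= m_min) & (x <= m_max)
        pl_norm = (m_min ** (1 - alpha) - m_max ** (1 - alpha)) / (alpha - 1)
        pl = np.where(inside, x ** (-alpha) / pl_norm, 0.0)
        g1 = np.exp(-0.5 * ((x - mu1) / sig1) ** 2) / (sig1 * np.sqrt(2 * np.pi))
        g2 = np.exp(-0.5 * ((x - mu2) / sig2) ** 2) / (sig2 * np.sqrt(2 * np.pi))
        return np.where(inside, (1 - lam) * pl + lam * (lam1 * g1 + (1 - lam1) * g2), 0.0)

    grid = np.linspace(m_min, m_max, 4000)
    norm = np.trapz(unnorm(grid), grid)
    return unnorm(m) / norm

# Note the local excess over the pure power law near ~35 solar masses
print(multipeak_pdf([8.0, 10.0, 20.0, 35.0, 50.0, 80.0]))
```

In the hierarchical analysis, such a population model enters the likelihood of each detected BBH and, through the source-frame/detector-frame mass relation, carries the cosmological dependence that makes the joint mass-cosmology inference possible.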

Relevance:

20.00%

Publisher:

Abstract:

The Fourier transform-infrared (FT-IR) signature of dry samples of DNA and DNA-polypeptide complexes, as studied by IR microspectroscopy using a diamond attenuated total reflection (ATR) objective, has revealed important discriminatory characteristics relative to the PO2(-) vibrational stretchings. However, DNA IR marks that provide information on the sample's richness in hydrogen bonds have not been resolved in the spectral profiles obtained with this objective. Here we investigated the performance of an all-reflecting objective (ARO) for analysis of the FT-IR signal of hydrogen bonds in DNA samples differing in base richness (salmon testis vs calf thymus). The results obtained using the ARO show prominent band peaks in the spectral region representative of the vibration of nitrogenous base hydrogen bonds and of NH and NH2 groups. When using the ARO, the band areas in this spectral region differ in agreement with the DNA base richness type. A peak assigned to adenine was more evident in the AT-rich salmon DNA using either the ARO or the ATR objective. It is concluded that, for the discrimination of DNA IR hydrogen bond vibrations associated with varying base type proportions, the use of an ARO is recommended.

Relevance:

20.00%

Publisher:

Abstract:

Although various abutment connections and materials have recently been introduced, insufficient data exist regarding the effect of stress distribution on their mechanical performance. The purpose of this study was to investigate the effect of different abutment materials and platform connections on stress distribution in single anterior implant-supported restorations using the finite element method. Nine experimental groups were modeled from the combination of 3 platform connections (external hexagon, internal hexagon, and Morse tapered) and 3 abutment materials (titanium, zirconia, and hybrid) as follows: external hexagon-titanium, external hexagon-zirconia, external hexagon-hybrid, internal hexagon-titanium, internal hexagon-zirconia, internal hexagon-hybrid, Morse tapered-titanium, Morse tapered-zirconia, and Morse tapered-hybrid. The finite element models consisted of a 4×13-mm implant, an anatomic abutment, and a lithium disilicate central incisor crown cemented over the abutment. A 49 N occlusal load was applied in 6 steps to simulate the incisal guidance. The equivalent von Mises stress (σvM) was used for the qualitative and quantitative evaluation of the implant and abutment in all the groups, and the maximum (σmax) and minimum (σmin) principal stresses were used for the numerical comparison of the zirconia parts. The highest abutment σvM occurred in the Morse-tapered groups and the lowest in the external hexagon-hybrid, internal hexagon-titanium, and internal hexagon-hybrid groups. The σmax and σmin values were lower in the hybrid groups than in the zirconia groups. The stress distribution was concentrated at the abutment-implant interface in all the groups, regardless of the platform connection or abutment material. The platform connection influenced the stress on the abutments more than the abutment material did. The stress values for the implants were similar among the different platform connections, but greater stress concentrations were observed in the internal connections.