877 results for Attitudes, Persuasion, Confidence, Voice, Elaboration Likelihood Model
Abstract:
Single-vehicle run-off-road (ROR) crashes are the largest category of fatal passenger-vehicle crashes in the United States (NCHRP 500, 2003). In Iowa in 2006, ROR crashes accounted for 36% of rural crashes and 9% of total crashes, and for more than 61.8% of rural fatal crashes and 32.6% of total fatal crashes. Paved shoulders are a potential countermeasure for ROR crashes. Several studies have generally indicated that paved shoulders are effective in reducing crashes, but the number of studies that quantify the benefits is limited. The research described in this report evaluates the effectiveness of paved shoulders. Model results indicated that the covariate for speed limit was not significant at the 0.05 significance level and was removed from the model; all variables retained in the final model were significant at that level. The final model indicated that season of the year was a significant predictor of the expected number of total monthly crashes, with more crashes occurring in winter and fall than in spring and summer. The model also indicated that the presence of rumble strips, paved shoulder width, unpaved shoulder width, and the presence of a divided median were associated with a decrease in crashes, and that roadway sections with paved shoulders had fewer crashes in the after period than in both the before period and the control sections. The actual impact of paved shoulders depends on several other covariates in the final model, such as installation year and paved shoulder width. Nevertheless, comparing the expected number of total crashes before and after installation of paved shoulders across several scenarios indicated around a 4.6% reduction in the expected number of monthly crashes in the after period.
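The before/after comparison above can be illustrated with a log-linear (Poisson-type) crash model. The coefficients below are purely illustrative, not the report's fitted values:

```python
import math

def expected_monthly_crashes(intercept, coefs, covariates):
    """Expected count from a log-linear (Poisson) model:
    E[y] = exp(intercept + sum_k beta_k * x_k)."""
    eta = intercept + sum(coefs[k] * covariates[k] for k in coefs)
    return math.exp(eta)

# Illustrative (not fitted) coefficients: paved-shoulder width and
# rumble strips reduce the expected count; winter raises it.
coefs = {"paved_width_ft": -0.012, "rumble_strips": -0.08, "winter": 0.15}

before = expected_monthly_crashes(
    0.5, coefs, {"paved_width_ft": 0.0, "rumble_strips": 0, "winter": 1})
after = expected_monthly_crashes(
    0.5, coefs, {"paved_width_ft": 4.0, "rumble_strips": 1, "winter": 1})

reduction = 1.0 - after / before  # relative reduction in expected crashes
```

Because the model is log-linear, the relative reduction depends only on the covariates that change between the two scenarios, not on the intercept or season.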
Abstract:
We focus on a comparison of three statistical models used to estimate the treatment effect in meta-analysis when individually pooled data are available: two conventional models, namely a multi-level model and a model based on an approximate likelihood, and a newly developed profile likelihood model, which can be viewed as an extension of the Mantel-Haenszel approach. To exemplify these methods, we use results from a meta-analysis of 22 trials to prevent respiratory tract infections. We show that, in the case of baseline heterogeneity, the multi-level approach considerably over-estimates the number of clusters or components. The approximate and profile likelihood methods showed nearly the same pattern for the treatment effect distribution. To provide more evidence, two simulation studies were carried out. The profile likelihood can be considered a clear alternative to the approximate likelihood model; in the case of strong baseline heterogeneity, the profile likelihood method shows superior behaviour compared with the multi-level model. Copyright (C) 2006 John Wiley & Sons, Ltd.
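The classical Mantel-Haenszel approach that the profile likelihood model extends can be sketched in a few lines; the 2x2 tables below are toy numbers, not the 22-trial data:

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel pooled odds ratio over 2x2 tables (a, b, c, d),
    where a/b are events/non-events in the treated arm and c/d in control."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Two hypothetical trials: (events, non-events) for treated and control.
trials = [(10, 90, 20, 80), (5, 45, 12, 38)]
pooled = mantel_haenszel_or(trials)
```

A pooled odds ratio below 1 indicates a protective treatment effect; the profile likelihood model generalizes this fixed-effect pooling to allow for heterogeneity.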
Abstract:
We present projections of winter storm-induced insured losses in the German residential building sector for the 21st century. To this end, two structurally independent downscaling methods and one hybrid downscaling method are applied to a 3-member ensemble of ECHAM5/MPI-OM1 A1B scenario simulations. The first method uses dynamical downscaling of intense winter storm events in the global model and a transfer function to relate regional wind speeds to losses. The second method is based on a reshuffling of present-day weather situations and sequences, taking into account the change in their frequencies according to the linear temperature trends of the global runs. The third method uses statistical-dynamical downscaling, considering frequency changes in the occurrence of storm-prone weather patterns, and translation into loss using empirical statistical distributions. The A1B scenario ensemble was downscaled by all three methods until 2070, and by the (statistical-)dynamical methods until 2100. All methods assume a constant statistical relationship between meteorology and insured losses, and no developments other than climate change, such as changes in construction or claims management. The study utilizes data provided by the German Insurance Association encompassing 24 years at district-scale resolution. Compared to 1971–2000, the downscaling methods indicate an increase in 10-year return values (i.e. loss ratios per return period) of 6–35% for 2011–2040, 20–30% for 2041–2070, and 40–55% for 2071–2100. When various sources of uncertainty (data, loss model, storm realization, and Pareto fit) are convolved into one confidence statement, the confidence interval for the 15-year return level expands by more than a factor of two. Finally, we suggest how practitioners can deal with alternative scenarios or possible natural excursions of observed losses.
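A 10-year return value of the kind quoted above is essentially a high quantile of the annual loss series. A minimal sketch using a plain empirical quantile (the study itself fits Pareto tails) on hypothetical loss ratios:

```python
def empirical_return_value(annual_losses, return_period):
    """Empirical T-year return level: the (1 - 1/T) quantile of the
    annual loss series (linear interpolation between order statistics)."""
    xs = sorted(annual_losses)
    p = 1.0 - 1.0 / return_period
    pos = p * (len(xs) - 1)            # fractional index into sorted data
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

# Toy annual loss ratios (not the insurance data used in the study).
losses = [0.2, 0.1, 0.5, 0.3, 0.9, 0.4, 0.25, 0.15, 0.6, 0.35]
rv10 = empirical_return_value(losses, 10)
```

With only 24 years of data, the empirical quantile is noisy for long return periods, which is why the study fits a Pareto tail and reports confidence intervals instead.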
Abstract:
This thesis develops and evaluates statistical methods for different types of genetic analyses, including quantitative trait loci (QTL) analysis, genome-wide association study (GWAS), and genomic evaluation. The main contribution of the thesis is to provide novel insights into modeling genetic variance, especially via random effects models. In variance component QTL analysis, a full likelihood model accounting for uncertainty in the identity-by-descent (IBD) matrix was developed; it was found to correctly adjust for bias in genetic variance component estimation and to improve precision, and hence power, in QTL mapping. Double hierarchical generalized linear models, and a non-iterative simplified version, were implemented and applied to fit data for an entire genome. These whole-genome models were shown to perform well in both QTL mapping and genomic prediction. A re-analysis of a publicly available GWAS data set identified significant loci in Arabidopsis that control phenotypic variance rather than the mean, validating the idea of variance-controlling genes. The work in this thesis is accompanied by R packages available online, including a general statistical tool for fitting random effects models (hglm), an efficient generalized ridge regression for high-dimensional data (bigRR), a double-layer mixed model for genomic data analysis (iQTL), a stochastic IBD matrix calculator (MCIBD), a computational interface for QTL mapping (qtl.outbred), and a GWAS analysis tool for mapping variance-controlling loci (vGWAS).
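The generalized ridge regression implemented in packages such as bigRR can be illustrated, in its simplest one-predictor form, as a shrinkage estimator; this sketch is not the package's actual algorithm:

```python
def ridge_1d(x, y, lam):
    """Ridge estimate for a single centred predictor:
    beta = sum(x_i * y_i) / (sum(x_i^2) + lambda)."""
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    return sxy / (sxx + lam)

# Toy centred data with a true slope near 2.
x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [-3.9, -2.1, 0.1, 2.0, 3.9]

ols = ridge_1d(x, y, 0.0)      # ordinary least squares (no penalty)
shrunk = ridge_1d(x, y, 5.0)   # heavier penalty shrinks the effect
```

The penalty shrinks each estimated effect toward zero, which is what makes ridge-type models usable when the number of markers far exceeds the number of individuals.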
Abstract:
We combine results from searches by the CDF and D0 collaborations for a standard model Higgs boson (H) in the process gg → H → W⁺W⁻ in pp̄ collisions at the Fermilab Tevatron Collider at √s = 1.96 TeV. With 4.8 fb⁻¹ of integrated luminosity analyzed at CDF and 5.4 fb⁻¹ at D0, the 95% confidence level upper limit on σ(gg → H) × B(H → W⁺W⁻) is 1.75 pb at m_H = 120 GeV, 0.38 pb at m_H = 165 GeV, and 0.83 pb at m_H = 200 GeV. Assuming the presence of a fourth sequential generation of fermions with large masses, we exclude at the 95% confidence level a standard-model-like Higgs boson with a mass between 131 and 204 GeV.
Abstract:
Various types of trill exercises have long been used as tools in the treatment and preparation of the voice. Although they are reported to produce vocal benefits in most subjects, their physiology has not yet been studied in depth. The aim of this study was to compare the mean and standard deviation of the closed quotient in lip- and tongue-trill exercises with the sustained vowel /ε/ in opera singers. Ten professional classical (operatic) singers, reportedly in perfect laryngeal health, served as subjects and underwent electroglottography. During the examination, the subjects were instructed to produce the sustained vowel /ε/ and lip and tongue trills at the same pre-established frequency and intensity. The mean values and standard deviation of the closed quotient were obtained using software developed for this purpose. Comparisons were made within subjects: maximum intensities were compared only with each other, as were minimum intensities. Differences in the mean closed quotient were statistically significant only at strong intensities, where the lip trill differed from the tongue trill and the sustained vowel /ε/. The standard deviation of the closed quotient distinguished the sustained vowel /ε/ from the lip and tongue trills at both intensities. We conclude that the closed quotient oscillates during tongue- and lip-trill exercises, and that it is higher during lip-trill exercises than during the other two utterances, but only at strong intensities.
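The closed quotient measured above is the fraction of each glottal cycle during which the vocal folds are in contact. A minimal sketch from a thresholded electroglottographic contact signal (real EGG analysis uses more robust criteria than a fixed threshold):

```python
def closed_quotient(egg_cycle, threshold=0.5):
    """Closed quotient of one glottal cycle: fraction of samples where
    the normalised EGG contact signal exceeds a threshold."""
    closed = sum(1 for s in egg_cycle if s > threshold)
    return closed / len(egg_cycle)

# Toy normalised EGG cycle: 4 of 10 samples above the contact threshold.
cycle = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.2, 0.3, 0.4]
cq = closed_quotient(cycle)
```

Tracking this quotient cycle by cycle over a trill yields the oscillation (standard deviation) that the study uses to distinguish trills from the sustained vowel.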
Abstract:
It has been difficult to consistently replicate the experimental model of axonal Guillain-Barré syndrome (GBS). We immunized rabbits with two lipo-oligosaccharides (LOS1 and LOS2) derived from the same Campylobacter jejuni strain but purified in slightly different ways. LOS1 did not contain proteins, whereas several proteins were present in LOS2. In spite of a robust anti-GM1 antibody response in all animals, the neuropathy developed only in rabbits immunized with LOS1. To explain this discrepancy, we investigated the fine specificity, affinity, and complement-activating ability of the anti-GM1 antibodies. Only rabbits immunized with LOS1 showed monospecific, high-affinity antibodies, which activated complement more effectively. Although it is not well understood how monospecific high-affinity antibodies are induced, they are crucial for the induction of experimental axonal neuropathy. Only strict adherence to protocols demonstrated to be successful can guarantee reproducibility and increase confidence in the animal model as a reliable tool for the study of human axonal GBS.
Abstract:
This article presents a probabilistic method for vehicle detection and tracking through the analysis of monocular images obtained from a vehicle-mounted camera. The method is designed to address the main shortcomings of traditional particle filtering approaches, namely Bayesian methods based on importance sampling, for use in traffic environments. These methods do not scale well as the dimensionality of the feature space grows, which significantly limits multiple-object tracking. Instead, the proposed method is based on a Markov chain Monte Carlo (MCMC) approach, which allows efficient sampling of the feature space. The method makes important contributions to both the motion and observation models of the tracker. Indeed, as opposed to particle filter-based tracking methods in the literature, which typically resort to observation models based on appearance or template matching, this study introduces a likelihood model that combines appearance analysis with information from motion parallax. Regarding the motion model, a new interaction treatment based on Markov random fields (MRF) is defined that allows the handling of possible inter-dependencies in vehicle trajectories. As for vehicle detection, the method relies on a supervised classification stage using support vector machines (SVM). The contribution in this field is twofold. First, a new descriptor based on the analysis of gradient orientations in concentric rectangles is defined. This descriptor involves a much smaller feature space than traditional descriptors, which are too costly for real-time applications. Second, a new vehicle image database is generated to train the SVM and made public. The proposed vehicle detection and tracking method is shown to outperform existing methods and to successfully handle challenging situations in the test sequences.
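The idea of pooling gradient orientations over concentric rectangles can be sketched as follows; this toy descriptor only illustrates the concept and is not the paper's exact definition:

```python
import math

def ring_orientation_descriptor(patch, n_bins=4):
    """Toy descriptor: one unsigned-orientation histogram per concentric
    rectangular ring of a square grayscale patch (border pixels skipped)."""
    n = len(patch)
    n_rings = (n - 1) // 2
    hist = [[0.0] * n_bins for _ in range(n_rings)]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            gx = patch[i][j + 1] - patch[i][j - 1]    # central differences
            gy = patch[i + 1][j] - patch[i - 1][j]
            mag = math.hypot(gx, gy)
            ang = math.atan2(gy, gx) % math.pi        # fold to [0, pi)
            b = min(int(ang / math.pi * n_bins), n_bins - 1)
            ring = min(i, j, n - 1 - i, n - 1 - j) - 1  # 0 = outermost ring
            hist[ring][b] += mag
    return hist

# Vertical edge: all gradient energy falls in the horizontal-orientation bin.
patch = [[1.0 if j >= 3 else 0.0 for j in range(6)] for _ in range(6)]
desc = ring_orientation_descriptor(patch)
```

Because there is one histogram per ring rather than per dense cell, the resulting feature vector is far shorter than a dense grid descriptor, which is the motivation given in the abstract.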
Abstract:
In this study, a method for vehicle tracking through video analysis based on Markov chain Monte Carlo (MCMC) particle filtering with Metropolis sampling is proposed. The method handles multiple targets with low computational requirements and is therefore ideally suited for advanced driver-assistance systems that require real-time operation. The method exploits the perspective-corrected domain given by inverse perspective mapping (IPM) to define a fast and efficient likelihood model. Additionally, the method encompasses an interaction model based on Markov random fields (MRF) that allows treatment of dependencies between the motions of targets. The proposed method is tested on highway sequences and compared to state-of-the-art methods for vehicle tracking, namely independent target tracking with Kalman filtering (KF) and joint tracking with particle filtering. The results show fewer tracking failures with the proposed method.
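The Metropolis sampling at the core of such an MCMC tracker can be sketched on a one-dimensional toy target (the tracker's actual likelihood combines IPM-domain appearance cues):

```python
import math
import random

def metropolis_sample(log_target, x0, n_steps, step=0.5, rng=None):
    """Random-walk Metropolis: propose x' = x + N(0, step), accept with
    probability min(1, target(x') / target(x))."""
    rng = rng or random.Random(0)
    x, lp = x0, log_target(x0)
    chain = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, step)
        lpp = log_target(xp)
        if math.log(rng.random()) < lpp - lp:   # accept/reject in log space
            x, lp = xp, lpp
        chain.append(x)
    return chain

# Toy target: standard normal log-density (up to a constant).
chain = metropolis_sample(lambda x: -0.5 * x * x, 0.0, 5000)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

Only the ratio of target densities is needed, so the likelihood never has to be normalized over the whole joint state of all tracked vehicles.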
Abstract:
Markov chain Monte Carlo (MCMC) is a methodology that is gaining widespread use in the phylogenetics community and is central to phylogenetic software packages such as MrBayes. An important issue for users of MCMC methods is how to select appropriate values for adjustable parameters such as the length of the Markov chain or chains, the sampling density, the proposal mechanism, and, if Metropolis-coupled MCMC is being used, the number of heated chains and their temperatures. Although some parameter settings have been examined in detail in the literature, others are frequently chosen with more regard to computational time or personal experience with other data sets. Such choices may lead to inadequate sampling of tree space or an inefficient use of computational resources. We performed a detailed study of convergence and mixing for 70 randomly selected, putatively orthologous protein sets with different sizes and taxonomic compositions. Replicated runs from multiple random starting points permit a more rigorous assessment of convergence, and we developed two novel statistics, delta and epsilon, for this purpose. Although likelihood values invariably stabilized quickly, adequate sampling of the posterior distribution of tree topologies took considerably longer. Our results suggest that multimodality is common for data sets with 30 or more taxa and that this results in slow convergence and mixing. However, we also found that the pragmatic approach of combining data from several short, replicated runs into a metachain to estimate bipartition posterior probabilities provided good approximations, and that such estimates were no worse in approximating a reference posterior distribution than those obtained using a single long run of the same length as the metachain. Precision appears to be best when heated Markov chains have low temperatures, whereas chains with high temperatures appear to sample trees with high posterior probabilities only rarely. 
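The paper's delta and epsilon statistics are its own; the general idea of checking convergence by comparing bipartition posterior probabilities across replicated runs can be sketched as follows (toy numbers):

```python
def mean_split_frequency_diff(run_a, run_b):
    """Mean absolute difference in bipartition (split) posterior
    probabilities between two replicated runs; values near zero suggest
    both runs are sampling the same posterior distribution."""
    splits = set(run_a) | set(run_b)
    return sum(abs(run_a.get(s, 0.0) - run_b.get(s, 0.0))
               for s in splits) / len(splits)

# Toy split -> posterior-probability maps from two replicate runs.
run1 = {"AB|CD": 0.90, "AC|BD": 0.08, "AD|BC": 0.02}
run2 = {"AB|CD": 0.86, "AC|BD": 0.11, "AD|BC": 0.03}
diff = mean_split_frequency_diff(run1, run2)
```

A large value of this diagnostic on real runs would signal the multimodality and slow mixing the abstract reports for data sets with 30 or more taxa.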
[Bayesian phylogenetic inference; heating parameter; Markov chain Monte Carlo; replicated chains.]
Abstract:
The retrieval of wind vectors from satellite scatterometer observations is a non-linear inverse problem. A common approach to solving inverse problems is to adopt a Bayesian framework and to infer the posterior distribution of the parameters of interest given the observations by using a likelihood model relating the observations to the parameters, and a prior distribution over the parameters. We show how Gaussian process priors can be used efficiently with a variety of likelihood models, using local forward (observation) models and direct inverse models for the scatterometer. We present an enhanced Markov chain Monte Carlo method to sample from the resulting multimodal posterior distribution. We go on to show how the computational complexity of the inference can be controlled by using a sparse, sequential Bayes algorithm for estimation with Gaussian processes. This helps to overcome the most serious barrier to the use of probabilistic, Gaussian process methods in remote sensing inverse problems, which is the prohibitively large size of the data sets. We contrast the sampling results with the approximations that are found by using the sparse, sequential Bayes algorithm.
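A Gaussian process prior is specified by its covariance (kernel) function; a minimal sketch with the common RBF kernel, which is illustrative and not necessarily the kernel used in the paper:

```python
import math

def rbf_kernel(xs, length_scale=1.0, variance=1.0):
    """Covariance matrix of a Gaussian process prior under the RBF kernel:
    k(x, x') = v * exp(-(x - x')^2 / (2 * l^2))."""
    return [[variance * math.exp(-((a - b) ** 2) / (2.0 * length_scale ** 2))
             for b in xs] for a in xs]

# Prior covariance among three 1-D input locations.
K = rbf_kernel([0.0, 1.0, 3.0])
```

The full kernel matrix scales quadratically with the number of observations, which is exactly the barrier that the sparse, sequential Bayes algorithm in the abstract is designed to overcome.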
Abstract:
Credible endorsers are often used in advertisements. However, there is conflicting evidence on the role source credibility plays in persuasion. Early research found that source credibility affects persuasion when subjects pay attention to the communication. Other research indicates that a credible source enhances persuasion when people do not scrutinize the message claims carefully and thoroughly, the opposite of what early research indicated. More recent research suggests that source credibility may affect persuasion when people do scrutinize the message claims, but limits this effect to advertisements with certain types of claims (i.e., ambiguous or extreme claims). This dissertation proposes that source credibility may play a broader role in persuasion than the empirical literature suggests: it may affect persuasion at low levels of involvement by serving as a peripheral cue, and at high involvement by serving as an argument or by biasing elaboration. Each of these possibilities was explored in an experiment using a 3 (source credibility) × 2 (type of claim) × 2 (level of involvement) full factorial design. The sample consisted of 180 undergraduate students from a major southeastern university. Results indicated that, at high levels of involvement, the credibility of the source affected persuasion, with source credibility acting as an argument within the advertisement. The study did not find that source credibility affected persuasion by biasing elaboration at high involvement, or by serving as a peripheral cue at low involvement.
Abstract:
This thesis investigates the numerical modelling of Dynamic Positioning (DP) in pack ice. A two-dimensional numerical model for ship-ice interaction was developed using the Discrete Element Method (DEM). A viscoelastic ice rheology was adopted to model the dynamic behaviour of the ice floes. Both ship-ice and ice-ice contacts were considered in the interaction force. The environmental forces and the hydrodynamic forces were calculated using empirical formulas. Once the current position and external forces had been calculated, Proportional-Integral-Derivative (PID) control and thrust allocation algorithms were applied to the vessel to control its motion and heading. The numerical model was coded in Fortran 90 and validated by comparing computational results to published data. Validation was first carried out for the ship-ice interaction calculation, using earlier researchers' simulation and model-test results for comparison. With confidence in the interaction model established, case studies were conducted to predict the DP capability of a sample Arctic DP vessel.
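The PID control loop mentioned above can be sketched in one dimension; the toy integrator plant below stands in for the full multi-degree-of-freedom DP vessel model with thrust allocation:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy surge-only plant: position rate responds directly to thrust (x' = u).
pid = PID(kp=1.0, ki=0.1, kd=0.05, dt=0.1)
x = 0.0
for _ in range(1000):
    x += pid.update(setpoint=1.0, measured=x) * pid.dt
```

In the actual DP simulation the controller output is a demanded force and moment, which a thrust allocation algorithm then distributes among the thrusters.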
Abstract:
Master's dissertation (Dissertação de mestrado), Qualidade em Análises, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2014