966 results for penalized likelihood


Relevance: 20.00%

Abstract:

The purpose of the current study was to attempt to model various cognitive and social processes that are believed to lead to false confessions. More specifically, this study manipulated the variables of experimenter expectancy, guilt-innocence of the suspect, and interrogation techniques using the Russano et al. (2005) paradigm. The primary measure of interest was the likelihood of the participant signing the confession statement. By manipulating experimenter expectancy, the current study sought to further explore the social interactions that may occur in the interrogation room. In addition, in past experiments, the interrogator has typically been restricted to the use of one or two interrogation techniques. In the present study, interrogators were permitted to select from 15 different interrogation techniques when attempting to solicit a confession from participants. Consistent with Russano et al. (2005), guilty participants (94%) were more likely to confess to the act of cheating than innocent participants (31%). The variable of experimenter expectancy did not affect confession rates, length of interrogation, or the type of interrogation techniques used. Path analysis revealed that feelings of pressure and the weighing of consequences on the part of the participant were associated with the signing of the confession statement. The findings suggest the guilt/innocence of the participant, the participant's perceptions of the interrogation situation, and the length of interrogation play a pivotal role in the signing of the confession statement. Further examination of these variables may provide researchers with a better understanding of the relationship between interrogations and confessions.

Relevance: 20.00%

Abstract:

The lognormal distribution has abundant applications in various fields. In the literature, most inferences on the two parameters of the lognormal distribution are based on Type-I censored sample data. However, exact measurements are not always attainable, especially when the observation is below or above the detection limits, and only the numbers of measurements falling into predetermined intervals can be recorded instead. Such data are called grouped data. In this paper, we show the existence and uniqueness of the maximum likelihood estimators of the two parameters of the underlying lognormal distribution with Type-I censored data and grouped data. The proof is first established for the normal distribution and extended to the lognormal distribution through the invariance property. The results are applied to estimate the median and mean of the lognormal population.
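
The censored-likelihood setup described above can be sketched numerically. The following is a minimal illustration (not the paper's proof), assuming right-censoring at a known detection limit: it maximizes the Type-I censored lognormal likelihood with SciPy and recovers the median and mean via the invariance property. The simulated data and all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Simulated lognormal sample, censored above a known detection limit.
rng = np.random.default_rng(0)
mu_true, sigma_true, limit = 1.0, 0.5, 4.0
x = rng.lognormal(mu_true, sigma_true, 1000)
obs = x[x <= limit]              # exact measurements
n_cens = int(np.sum(x > limit))  # above the limit only the count is known

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)    # reparameterize so sigma stays positive
    z = (np.log(obs) - mu) / sigma
    # density term for exact observations + survival term for censored ones
    ll = np.sum(norm.logpdf(z) - np.log(sigma) - np.log(obs))
    ll += n_cens * norm.logsf((np.log(limit) - mu) / sigma)
    return -ll

res = minimize(neg_log_lik, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
median_hat = np.exp(mu_hat)                    # lognormal median, by invariance
mean_hat = np.exp(mu_hat + sigma_hat**2 / 2)   # lognormal mean, by invariance
```

The invariance property is what licenses the last two lines: any smooth function of the MLEs is itself the MLE of that function of the parameters.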

Relevance: 20.00%

Abstract:

Acknowledgements: This research has been conducted using the UK Biobank resource, and was funded by the University of Aberdeen. The authors have no conflicts of interest to declare.

Relevance: 20.00%

Abstract:

The purpose of this study was to assess the effect of performance feedback on Athletic Trainers' (ATs') perceived knowledge (PK) and likelihood to pursue continuing education (CE). The investigation was grounded in the theories of "the definition of the situation" (Thomas & Thomas, 1928) and the "illusion of knowing" (Glenberg, Wilkinson, & Epstein, 1982), which suggest that PK drives behavior. This investigation measured the degree to which knowledge gap predicted CE-seeking behavior by providing performance feedback designed to change PK. A pre-test post-test control-group design was used to measure PK and likelihood to pursue CE before and after assessing actual knowledge. ATs (n = 103) were randomly sampled and assigned to two groups, with and without performance feedback. Two independent-samples t-tests were used to compare groups on the difference scores of the dependent variables. Likelihood to pursue CE was predicted from three variables using multiple linear regression: perceived knowledge, pre-test likelihood to pursue CE, and knowledge gap. There was a significant 68.4% difference (t(101) = 2.72, p = 0.01, ES = 0.45) between groups in the change scores for likelihood to pursue CE as a result of the performance feedback (experimental group: 13.7% increase; control group: 4.3% increase). The strongest relationship among the dependent variables was between the pre-test and post-test measures of likelihood to pursue CE (F(2,102) = 56.80, p < 0.01, r = 0.73, R² = 0.53). This predictive relationship was enhanced when group was included in the model. In this model [Y_CEpost = 0.76 X_CEpre - 0.34 X_group + 2.24 + E], group accounted for a significant amount of unique variance in predicting CE while pre-test likelihood to pursue CE was held constant (F(3,102) = 40.28, p < 0.01, r = 0.74, R² = 0.55). Pre-test knowledge gap, regardless of group allocation, was a linear predictor of likelihood to pursue CE (F(1,102) = 10.90, p = 0.01, r = 0.31, R² = 0.10).
In this investigation, performance feedback significantly increased participants' likelihood to pursue CE. Pre-test knowledge gap was a significant predictor of likelihood to pursue CE regardless of whether performance feedback was provided. ATs may have self-assessed and engaged in internal feedback as a result of their test-taking experience. These findings indicate that feedback, both internal and external, may be necessary to trigger CE-seeking behavior.
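
The reported regression equation can be written as a small numeric sketch. Note that the 0/1 coding of the group variable is not given in the abstract, so the coding below is an assumption for illustration only.

```python
def predict_ce_post(ce_pre, group):
    """Predicted post-test likelihood to pursue CE from the reported model.

    ce_pre -- pre-test likelihood to pursue CE (on the study's scale)
    group  -- 0/1 group indicator; this coding is assumed, not stated above
    """
    return 0.76 * ce_pre - 0.34 * group + 2.24

# Two participants with the same pre-test score but different group codes
# differ in predicted post-test likelihood by exactly the group coefficient.
gap = predict_ce_post(10.0, 0) - predict_ce_post(10.0, 1)
```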

Relevance: 20.00%

Abstract:

When we study the variables that affect survival time, we usually estimate their effects by the Cox regression model. In biomedical research, effects of the covariates are often modified by a biomarker variable, which leads to covariate-biomarker interactions. Here the biomarker is an objective measurement of patient characteristics at baseline. Liu et al. (2015) built a local partial likelihood bootstrap model to estimate and test this interaction effect of covariates and biomarker, but the R code developed by Liu et al. (2015) can only handle one variable and one interaction term, and cannot fit the model with adjustment for nuisance variables. In this project, we expand the model to allow adjustment for nuisance variables, extend the R code to take any chosen interaction terms, and provide many parameters for users to customize their research. We also build an R package called "lplb" to integrate the complex computations into a simple interface. We conduct numerical simulations to show that the new method has excellent finite-sample properties under both the null and alternative hypotheses. We also apply the method to analyze data from a prostate cancer clinical trial with the acid phosphatase (AP) biomarker.

Relevance: 20.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 20.00%

Abstract:

Roads represent a new source of mortality: animal-vehicle collisions threaten the long-term viability of populations. The risk of road mortality for each species depends on the characteristics of the roads and on the bioecological characteristics of the species, including its sensitivity to roads and its specific life-history traits. In this study we assess the importance of climatic parameters (temperature and precipitation) together with traffic and life-history traits, and examine the role of drought in the viability of a barn owl population that is also affected by road mortality, under three scenarios: high mobility, high population density, and the combination of the two (mixed) (Manuscript). For the first objective we correlated the climatic, traffic, and life-history parameters and used the most correlated variables to build a predictive generalized linear mixed model (GLMM) of road-kills. Using a population model, we then evaluated barn owl population viability under all three scenarios. The model revealed that precipitation, traffic, and dispersal have a negative, though non-significant, relationship with road-kills. The scenarios gave different results: the high-mobility scenario showed greater population depletion, more fluctuations over time, and greater risk of extinction; the high-population-density scenario showed a more stable population with lower risk of extinction; and the mixed scenario showed results similar to the first. Climate seems to play an indirect role in barn owl road-kills: it may influence prey availability, which in turn influences barn owl reproductive success and activity. The high-mobility scenario also showed a greater negative impact on population viability, which may reduce the population's resilience to other stochastic events. Future research should take climate into account, and how it may influence species' life cycles and activity periods, for a more complete approach to road-kills.
It is also important to make the best mitigation decisions, which might include improving the quality of prey habitat.
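
The study's GLMM cannot be reproduced from the abstract, but the shape of such a count model can be sketched as a simplified Poisson regression (no random effects) fit by maximum likelihood on simulated data; the variable names and coefficients are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated road-kill counts driven by climate and traffic covariates.
rng = np.random.default_rng(5)
n = 300
precip = rng.normal(size=n)    # standardized precipitation
traffic = rng.normal(size=n)   # standardized traffic volume
X = np.column_stack([np.ones(n), precip, traffic])
beta_true = np.array([0.5, -0.3, 0.4])
y = rng.poisson(np.exp(X @ beta_true))  # counts per road segment and period

def neg_loglik(b):
    eta = X @ b
    # Poisson log-likelihood, dropping the constant log(y!) term
    return -np.sum(y * eta - np.exp(eta))

res = minimize(neg_loglik, np.zeros(3), method="BFGS")
b_hat = res.x  # b_hat[1] recovers the (here simulated) precipitation effect
```

A GLMM would add a random intercept per site or year on top of this linear predictor; the fixed-effect structure is the same.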

Relevance: 20.00%

Abstract:

This report reviews literature on the rate of convergence of maximum likelihood estimators and establishes a Central Limit Theorem, which yields an O(1/sqrt(n)) rate of convergence of the maximum likelihood estimator under somewhat relaxed smoothness conditions. These conditions require only the existence of a one-sided derivative of the pdf with respect to θ, compared with the up to three derivatives that are classically required. A verification through simulation is included at the end of the report.
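
The simulation idea can be sketched with a standard example of a non-smooth likelihood (this is an illustration, not the report's own simulation): the Laplace location MLE is the sample median, the log-density |x - θ| has only one-sided derivatives in θ at the data points, and yet the RMSE still shrinks like 1/sqrt(n).

```python
import numpy as np

# Monte Carlo check of the O(1/sqrt(n)) rate for a non-smooth likelihood.
rng = np.random.default_rng(2)
reps = 2000
rmse = {}
for n in (100, 400, 1600):
    # Laplace location MLE = sample median; true location is 0
    medians = np.median(rng.laplace(0.0, 1.0, size=(reps, n)), axis=1)
    rmse[n] = np.sqrt(np.mean(medians**2))

# If the rate is O(1/sqrt(n)), quadrupling n should roughly halve the RMSE.
ratio_1 = rmse[100] / rmse[400]
ratio_2 = rmse[400] / rmse[1600]
```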

Relevance: 20.00%

Abstract:

This report discusses the calculation of analytic second-order bias-correction techniques for the maximum likelihood estimates (for short, MLEs) of the unknown parameters of distributions used in quality and reliability analysis. It is well known that MLEs are widely used to estimate the unknown parameters of probability distributions due to their various desirable properties; for example, the MLEs are asymptotically unbiased, consistent, and asymptotically normal. However, many of these properties depend on extremely large sample sizes. Properties such as unbiasedness may not be valid for small or even moderate sample sizes, which are more common in real data applications. Therefore, bias-corrected techniques for the MLEs are desired in practice, especially when the sample size is small. Two commonly used techniques to reduce the bias of the MLEs are the 'preventive' and 'corrective' approaches. Both can reduce the bias of the MLEs to order O(n⁻²), but the 'preventive' approach does not have an explicit closed-form expression. Consequently, we mainly focus on the 'corrective' approach in this report. To illustrate the importance of bias correction in practice, we apply the bias-corrected method to two popular lifetime distributions: the inverse Lindley distribution and the weighted Lindley distribution. Numerical studies based on the two distributions show that the considered bias-corrected technique is highly recommended over commonly used estimators without bias correction. Therefore, special attention should be paid when estimating the unknown parameters of probability distributions when the sample size is small or moderate.
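
The report applies the 'corrective' approach to the inverse and weighted Lindley distributions, whose corrections are lengthy; the idea can be shown on a simpler stand-in. For the exponential rate, the MLE 1/x̄ has first-order bias λ/n, and the corrective estimator (1 - 1/n)/x̄ happens to remove the bias exactly in this case.

```python
import numpy as np

# Monte Carlo illustration of 'corrective' first-order bias removal.
rng = np.random.default_rng(3)
lam, n, reps = 2.0, 10, 200_000

x = rng.exponential(scale=1/lam, size=(reps, n))
mle = 1.0 / x.mean(axis=1)           # MLE of the exponential rate
corrected = mle * (1.0 - 1.0 / n)    # first-order bias-corrected estimator

bias_mle = mle.mean() - lam          # theory: lam / (n - 1), about 0.222 here
bias_corrected = corrected.mean() - lam   # about 0
```

With n = 10 the uncorrected bias is sizable (over 10% of the true rate), which is exactly the small-sample regime the report warns about.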

Relevance: 20.00%

Abstract:

The objective of this work was to apply fuzzy majority multicriteria group decision-making to determine risk areas for foot-and-mouth disease (FMD) introduction along the border between Brazil and Paraguay. The study was conducted in three municipalities in the state of Mato Grosso do Sul, Brazil, located along the border with Paraguay. Four scenarios were built, applying the following linguistic quantifiers to describe risk factors: few, half, many, and most. The three criteria considered most likely to affect vulnerability to the introduction of FMD, according to experts' opinions, were: the introduction of animals onto the farm, the distance from the border, and the type of property settlement. The resulting maps show a strong spatial heterogeneity in the risk of FMD introduction. The methodology used provides a new approach that can help policy makers in the control and eradication of FMD.
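
Fuzzy-majority aggregation of risk criteria is typically done with OWA weights derived from a linguistic quantifier. The sketch below is a generic illustration, with an assumed membership function for 'most' ((a, b) = (0.3, 0.8)), not the paper's exact parameters.

```python
import numpy as np

def quantifier(r, a=0.3, b=0.8):
    # RIM linguistic quantifier; (a, b) = (0.3, 0.8) is a common choice
    # for 'most' -- assumed here, not taken from the paper.
    return np.clip((r - a) / (b - a), 0.0, 1.0)

def owa_weights(n, a=0.3, b=0.8):
    # w_i = Q(i/n) - Q((i-1)/n); the weights telescope to sum to 1
    i = np.arange(1, n + 1)
    return quantifier(i / n, a, b) - quantifier((i - 1) / n, a, b)

def owa(scores, a=0.3, b=0.8):
    s = np.sort(np.asarray(scores))[::-1]   # OWA orders the scores first
    return float(owa_weights(len(s), a, b) @ s)

# three risk criteria for one location, each scored in [0, 1]
risk = owa([0.9, 0.4, 0.2])
```

Changing the quantifier (few, half, many, most) changes the weight vector and hence how demanding the aggregated risk score is, which is what generates the four scenarios.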

Relevance: 10.00%

Abstract:

Design as seen from the designer's perspective is a series of amazing imaginative jumps or creative leaps. But design as seen by the design historian is so smooth a progression or evolution of ideas that they seem self-evident and inevitable after the event. Yet the next step is anything but obvious for the artist/creator/inventor/designer stuck at the point just before the creative leap. They know where they have come from and have a general sense of where they are going, but often do not have a precise target or goal. This is why it is misleading to talk of design as a problem-solving activity; it is better defined as a problem-finding activity. This has been very frustrating for those trying to assist the design process with computer-based, problem-solving techniques. By the time the problem has been defined, it has been solved; indeed, the solution is often the very definition of the problem. Design must be creative, or it is mere imitation. But since this crucial creative leap seems inevitable after the event, the question arises: can we find some way of searching the space ahead? Of course there are serious problems in knowing what we are looking for, and in the vastness of the search space. It may be better to discard the term "searching" altogether in the context of the design process. Conceptual analogies such as search, search spaces, and fitness landscapes aim to elucidate the design process, but the vastness of the multidimensional spaces involved makes these analogies misguided, and they actually confound the issue further. The term "search" becomes a misnomer, since it carries the connotation that it is possible to find what you are looking for; in such vast spaces the term must be discarded. Any attempt to search for the highest peak in the fitness landscape as an optimal solution is therefore also meaningless. Furthermore, even the very existence of a fitness landscape is fallacious.
Although alternatives in the same region of the vast space can be compared to one another, distant alternatives will stem from radically different roots and will therefore not be comparable in any straightforward manner (Janssen 2000). Nevertheless, we still have the tantalizing possibility that, if a creative idea seems inevitable after the event, the process might somehow be reversed. This may be as improbable as attempting to reverse time. A more helpful analogy is from nature, where it is generally assumed that the process of evolution is not long-term goal-directed or teleological. Dennett points out a common misunderstanding of Darwinism: the idea that evolution by natural selection is a procedure for producing human beings. Evolution can have produced humankind by an algorithmic process without its being true that evolution is an algorithm for producing us. If we were to wind the tape of life back and run this algorithm again, the likelihood of "us" being created again is infinitesimally small (Gould 1989; Dennett 1995). Nevertheless, Mother Nature has proved a remarkably successful, resourceful, and imaginative inventor, generating a constant flow of incredible new design ideas to fire our imagination. Hence the current interest in the potential of the evolutionary paradigm in design. These evolutionary methods are frequently based on techniques such as evolutionary algorithms, which are usually thought of as search algorithms. It is necessary to abandon such connections with searching and to see the evolutionary algorithm as a direct analogy with the evolutionary processes of nature. Natural selection generates a wealth of alternative experiments, and the better ones survive. There is no one solution, there is no optimal solution, but there is continuous experiment. Nature is profligate with her prototyping and ruthless in her elimination of less successful experiments.
Most importantly, nature has all the time in the world. As designers we can afford neither such profligate prototyping and ruthless elimination, nor the time scale of the natural design process. Instead we can use the computer to compress space and time and to perform virtual prototyping and evaluation before committing ourselves to actual prototypes. This is the hypothesis underlying the evolutionary paradigm in design (1992, 1995).
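
The generate-evaluate-select loop that the evolutionary paradigm borrows from nature can be sketched as a minimal genetic algorithm. The 'design' here is just a toy parameter vector and the fitness function is an assumed placeholder, since defining a real design evaluation is exactly the hard part the passage identifies.

```python
import numpy as np

rng = np.random.default_rng(4)
TARGET = np.array([0.2, 0.8, 0.5])  # stand-in for a real design evaluation

def fitness(pop):
    # toy evaluation: negative squared distance from an assumed target profile
    return -np.sum((pop - TARGET) ** 2, axis=1)

pop = rng.random((30, 3))  # 30 candidate 'designs', 3 parameters each
for generation in range(100):
    scores = fitness(pop)
    parents = pop[np.argsort(scores)[-10:]]           # select the 10 best
    children = parents[rng.integers(0, 10, size=30)]  # clone with replacement
    pop = children + rng.normal(0.0, 0.05, children.shape)  # mutate

best = pop[np.argmax(fitness(pop))]
```

Read as the passage suggests: not a search for a known optimum, but a continuous stream of variant prototypes with the less successful ones eliminated each generation.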

Relevance: 10.00%

Abstract:

Participatory evaluation and participatory action research (PAR) are increasingly used in community-based programs and initiatives, and there is a growing acknowledgement of their value. These methodologies focus more on knowledge generated and constructed through lived experience than through social science (Vanderplaat 1995). The scientific ideal of objectivity is usually rejected in favour of a holistic approach that acknowledges and takes into account the diverse perspectives, values and interpretations of participants and evaluation professionals. However, evaluation rigour need not be lost in this approach. Increasing the rigour and trustworthiness of participatory evaluations and PAR increases the likelihood that results are seen as credible and are used to continually improve programs and policies.

Drawing on learnings and critical reflections about the use of feminist and participatory forms of evaluation and PAR over a 10-year period, significant sources of rigour identified include:

• participation and communication methods that develop relations of mutual trust and open communication
• using multiple theories and methodologies, multiple sources of data, and multiple methods of data collection
• ongoing meta-evaluation and critical reflection
• critically assessing the intended and unintended impacts of evaluations, using relevant theoretical models
• using rigorous data analysis and reporting processes
• participant reviews of evaluation case studies, impact assessments and reports.

Relevance: 10.00%

Abstract:

Sequences of two chloroplast photosystem genes, psaA and psbB, together comprising about 3,500 bp, were obtained for all five major groups of extant seed plants and several outgroups among other vascular plants. Strongly supported, but significantly conflicting, phylogenetic signals were obtained in parsimony analyses from partitions of the data into first and second codon positions versus third positions. In the former, both genes agreed on monophyletic gymnosperms, with Gnetales closely related to certain conifers. In the latter, Gnetales are inferred to be the sister group of all other seed plants, with gymnosperms paraphyletic. None of the data supported the modern "anthophyte hypothesis," which places Gnetales as the sister group of flowering plants. A series of simulation studies was undertaken to examine the error rate of parsimony inference. Three kinds of errors were examined: random error, systematic bias (both properties of finite data sets), and statistical inconsistency owing to long-branch attraction (an asymptotic property). Parsimony reconstructions were extremely biased for third-position data for psbB. Regardless of the true underlying tree, a tree in which Gnetales are sister to all other seed plants was likely to be reconstructed from these data. None of the combinations of genes or partitions permits the anthophyte tree to be reconstructed with high probability. Simulations of progressively larger data sets indicate the existence of long-branch attraction (statistical inconsistency) for third-position psbB data if either the anthophyte tree or the gymnosperm tree is correct. This is also true for the anthophyte tree using either psaA third positions or psbB first and second positions. A factor contributing to bias and inconsistency is extremely short branches at the base of the seed plant radiation, coupled with extremely high rates in Gnetales and non-seed-plant outgroups.
M. J. Sanderson, M. F. Wojciechowski, J.-M. Hu, T. Sher Khan, and S. G. Brady

Relevance: 10.00%

Abstract:

A plethora of methods for procuring building projects is available to meet the needs of clients. Deciding which method to use for a given project is a difficult and challenging task, as a client's objectives and priorities need to marry with the selected method so as to improve the likelihood of the project being procured successfully. The decision as to which procurement system to use should be made as early as possible and be underpinned by the client's business case for the project. The risks, and how they can potentially affect the client's business, should also be considered. This report emphasises the need for clients to develop a procurement strategy that outlines the key means by which the objectives of the project are to be achieved. Once a client has established a business case for a project, appointed a principal advisor, and determined their requirements and brief, consideration should be given to which procurement method to adopt. An understanding of the characteristics of the various procurement options is required before a recommendation can be made to a client. Procurement systems can be categorised as traditional, design and construct, management, and collaborative. The characteristics of these systems, along with the procurement methods commonly used, are described. The main advantages and disadvantages, and the circumstances under which each system could be considered applicable to a given project, are also identified.