158 results for prior probabilities
at University of Queensland eSpace - Australia
Abstract:
There is concern over the safety of calcium channel blockers (CCBs) in acute coronary disease. We sought to determine if patients taking calcium channel blockers (CCBs) at the time of admission with acute myocardial infarction (AMI) had a higher case-fatality compared with those taking beta-blockers or neither medication. Clinical and drug treatment variables at the time of hospital admission predictive of survival at 28 days were examined in a community-based registry of patients aged under 65 years admitted to hospital for suspected AMI in Perth, Australia, between 1984 and 1993. Among 7766 patients, 1291 (16.6%) were taking a CCB and 1259 (16.2%) a beta-blocker alone at hospital admission. Patients taking CCBs had a worse clinical profile than those taking a beta-blocker alone or neither drug (control group), and a higher unadjusted 28-day mortality (17.6% versus 9.3% and 11.1% respectively, both P < 0.001). There was no significant heterogeneity with respect to mortality between nifedipine, diltiazem, or verapamil when used alone, or with a beta-blocker. After adjustment for factors predictive of death at 28 days, patients taking a CCB were found not to have an excess chance of death compared with the control group (odds ratio [OR] 1.06, 95% confidence interval [CI] 0.87, 1.30), whereas those taking a beta-blocker alone had a lower odds of death (OR 0.75, 95% CI 0.59, 0.94). These results indicate that established calcium channel blockade is not associated with an excess risk of death following AMI once other differences between patients are taken into account, but neither does it have the survival advantage seen with prior beta-blocker therapy.
Abstract:
The probit model is a popular device for explaining binary choice decisions in econometrics. It has been used to describe choices such as labor force participation, travel mode, home ownership, and type of education. These and many more examples can be found in papers by Amemiya (1981) and Maddala (1983). Given the contribution of economics towards explaining such choices, and given the nature of data that are collected, prior information on the relationship between a choice probability and several explanatory variables frequently exists. Bayesian inference is a convenient vehicle for including such prior information. Given the increasing popularity of Bayesian inference it is useful to ask whether inferences from a probit model are sensitive to a choice between Bayesian and sampling theory techniques. Of interest is the sensitivity of inference on coefficients, probabilities, and elasticities. We consider these issues in a model designed to explain choice between fixed and variable interest rate mortgages. Two Bayesian priors are employed: a uniform prior on the coefficients, designed to be noninformative for the coefficients, and an inequality restricted prior on the signs of the coefficients. We often know, a priori, whether increasing the value of a particular explanatory variable will have a positive or negative effect on a choice probability. This knowledge can be captured by using a prior probability density function (pdf) that is truncated to be positive or negative. Thus, three sets of results are compared: those from maximum likelihood (ML) estimation, those from Bayesian estimation with an unrestricted uniform prior on the coefficients, and those from Bayesian estimation with a uniform prior truncated to accommodate inequality restrictions on the coefficients.
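The three estimation strategies the abstract compares can be sketched numerically. The following is a minimal illustration, not the authors' mortgage-choice model: simulated binary-choice data, a probit likelihood maximised for the ML fit, and a random-walk Metropolis sampler in which the sign-restricted (truncated uniform) prior is imposed by assigning zero prior density to any draw with a wrong-signed slope. All variable names and the data-generating values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated binary choices: P(y=1) = Phi(b0 + b1*x), with b1 > 0 known a priori.
n = 500
x = rng.normal(size=n)
true_b = np.array([-0.3, 0.8])
y = (rng.uniform(size=n) < norm.cdf(true_b[0] + true_b[1] * x)).astype(int)
X = np.column_stack([np.ones(n), x])

def neg_loglik(b):
    p = np.clip(norm.cdf(X @ b), 1e-10, 1 - 1e-10)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p)).sum()

# 1) Maximum likelihood estimate.
ml = minimize(neg_loglik, np.zeros(2)).x

# 2) Random-walk Metropolis. With a flat prior the posterior is proportional
#    to the likelihood; the truncated prior additionally rejects any proposal
#    with a non-positive slope (its prior density is zero there).
def sample(truncate, draws=4000, step=0.15):
    b, ll = ml.copy(), -neg_loglik(ml)
    out = []
    for _ in range(draws):
        prop = b + step * rng.normal(size=2)
        if truncate and prop[1] <= 0:
            out.append(b.copy())
            continue
        pll = -neg_loglik(prop)
        if np.log(rng.uniform()) < pll - ll:
            b, ll = prop, pll
        out.append(b.copy())
    return np.array(out)

flat = sample(truncate=False)
trunc = sample(truncate=True)
print("ML:", ml.round(2),
      "| flat-prior mean:", flat.mean(0).round(2),
      "| truncated-prior mean:", trunc.mean(0).round(2))
```

With a correctly signed, well-identified slope the three answers agree closely; the truncated prior matters most when the data alone leave the sign of a coefficient in doubt.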
Abstract:
The importance of education and experience to the successful performance of new firms is well recognized both by management practitioners and academics. Yet empirical research to support the significance of this relationship is inconclusive. This paper discusses theories describing the relationship between education and experience and firm performance. It also analyses and classifies the differing measures of performance, education and experience, and compares the results of multiple studies undertaken between 1977 and 2000. Possible reasons for conflicting results are identified, such as lack of sound theoretical bases that relate education and experience to performance, varying definitions of the key variables and the diversity of measures used. Finally, a framework is developed that incorporates variables that interact with experience and education to influence new venture performance.
Abstract:
What fundamental constraints characterize the relationship between a mixture ρ = Σ_i p_i ρ_i of quantum states, the states ρ_i being mixed, and the probabilities p_i? What fundamental constraints characterize the relationship between prior and posterior states in a quantum measurement? In this paper we show that there are many surprisingly strong constraints on these mixing and measurement processes that can be expressed simply in terms of the eigenvalues of the quantum states involved. These constraints capture in a succinct fashion what it means to say that a quantum measurement acquires information about the system being measured, and considerably simplify the proofs of many results about entanglement transformation.
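One eigenvalue constraint of the kind described follows from Ky Fan's maximum principle: every partial sum of the sorted spectrum of the mixture ρ is bounded by the corresponding partial sum of Σ_i p_i λ↓(ρ_i), where λ↓ denotes eigenvalues in descending order. A short numerical check on randomly generated mixed states (the dimension, weights and helper names are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def random_density(d):
    # Random mixed state: A A-dagger, normalised to unit trace.
    a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = a @ a.conj().T
    return rho / np.trace(rho).real

def spectrum(rho):
    # Eigenvalues sorted in descending order.
    return np.sort(np.linalg.eigvalsh(rho))[::-1]

d, p = 4, np.array([0.3, 0.5, 0.2])
parts = [random_density(d) for _ in p]
rho = sum(pi * r for pi, r in zip(p, parts))

lam = spectrum(rho)
bound = sum(pi * spectrum(r) for pi, r in zip(p, parts))

# Ky Fan: each partial sum of the mixture's sorted spectrum is bounded by
# the corresponding partial sum of sum_i p_i * (sorted spectrum of rho_i).
for k in range(1, d + 1):
    assert lam[:k].sum() <= bound[:k].sum() + 1e-12
print("spectrum of mixture:", lam.round(3), "is majorized by:", bound.round(3))
```

Both vectors sum to 1 (unit trace), so the partial-sum inequalities say the mixture's spectrum is majorized by the weighted combination of the components' spectra, i.e. mixing can only make the state more disordered.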
Abstract:
A firm's prior knowledge facilitates the absorption of new knowledge, thereby renewing a firm's systematic search, transfer and absorption capabilities. The rapidly expanding field of biotechnology is characterised by the convergence of disparate sciences and technologies. This paper examines the shift from protein-based to DNA-based diagnostic technologies and quantifies the value of a firm's prior knowledge and its relation to future knowledge development. Four dimensions of diagnostics and four dimensions of knowledge in biotechnology firms are analysed. A simple scaled matrix method is developed to quantify the positive and negative heuristic values of prior scientific and technological knowledge that is useful for the acquisition and absorption of new knowledge.
Abstract:
An efficient Lanczos subspace method has been devised for calculating state-to-state reaction probabilities. The method recasts the time-independent wave packet Lippmann-Schwinger equation [Kouri, Chem. Phys. Lett. 203, 166 (1993)] inside a tridiagonal (Lanczos) representation in which action of the causal Green's operator is effected easily with a QR algorithm. The method is designed to yield all state-to-state reaction probabilities from a given reactant-channel wave packet using a single Lanczos subspace; the spectral properties of the tridiagonal Hamiltonian allow calculations to be undertaken at arbitrary energies within the spectral range of the initial wave packet. The method is applied to the H+O2 system (J=0), and the results indicate the approach is accurate and stable. (C) 2002 American Institute of Physics.
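The core of the method, building a tridiagonal (Lanczos) representation of a Hermitian operator from an initial wave-packet vector, can be sketched on a generic model Hamiltonian. This is an assumption-laden illustration (a dense random symmetric matrix standing in for the Hamiltonian, full reorthogonalisation for numerical safety), not the authors' scattering code:

```python
import numpy as np

rng = np.random.default_rng(2)

def lanczos(H, v0, m):
    """m-step Lanczos tridiagonalisation of symmetric H, started from v0
    (the analogue of the initial wave packet). Full reorthogonalisation
    is used for clarity and numerical robustness."""
    n = H.shape[0]
    V = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    v = v0 / np.linalg.norm(v0)
    for j in range(m):
        V[:, j] = v
        w = H @ v
        alpha[j] = v @ w                               # diagonal entry
        w -= V[:, : j + 1] @ (V[:, : j + 1].T @ w)     # orthogonalise
        if j < m - 1:
            beta[j] = np.linalg.norm(w)                # off-diagonal entry
            v = w / beta[j]
    return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

n, m = 200, 60
A = rng.normal(size=(n, n))
H = (A + A.T) / 2                 # model symmetric "Hamiltonian"
T = lanczos(H, rng.normal(size=n), m)

# The extreme eigenvalues of the small tridiagonal matrix T converge
# rapidly to those of H; working with T instead of H is what makes
# subspace methods of this kind efficient.
print("top eigenvalue:", np.linalg.eigvalsh(T)[-1], "vs", np.linalg.eigvalsh(H)[-1])
```

In the paper's setting the tridiagonal form additionally makes the action of the causal Green's operator cheap (via a QR factorisation of a shifted tridiagonal matrix); the sketch above only shows the subspace construction itself.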
Abstract:
Knowledge, especially scientific and technological knowledge, grows according to knowledge trajectories and guideposts that make up the prior knowledge of an organization. We argue that these knowledge structures and their specific components lead to successful innovation. A firm's prior knowledge facilitates the absorption of new knowledge, thereby renewing a firm's systematic search, transfer and acquisition of knowledge and capabilities. In particular, the exponential growth in biotechnology is characterized by the convergence of disparate scientific and technological knowledge resources. This paper examines the shift from protein-based to DNA-based diagnostic technologies as an example, to quantify the value of a firm's prior knowledge using relative values of knowledge distance. The distance between core prior knowledge and the rate of transition from one knowledge system to another has been identified as a proxy for the value of a firm's prior knowledge. The overall difficulty of transition from one technology paradigm to another is discussed. We argue this transition is possible when the knowledge distance is minimal and the transition process has a correspondingly high value of absorptive capacities. Our findings show knowledge distance is a determinant of the feasibility, continuity and capture of scientific and technological knowledge. Copyright © 2003 John Wiley & Sons, Ltd.
Abstract:
Background: The surgical cure rate for primary hyperparathyroidism is greater than 95%. For those who have recurrent or persistent disease, preoperative localization improves reoperation success rates. Selective parathyroid venous sampling (SPVS) for intact parathyroid hormone is particularly useful when non-invasive localization techniques are negative or inconclusive. Methods: We present all known cases (n = 13) between 1994 and 2002 that underwent venous sampling for localization at our institution prior to reoperation for recurrent or persistent primary hyperparathyroidism. Comparison was made with non-invasive localization procedures. Results of invasive and non-invasive localization were correlated with surgical findings. Results: Of the nine reoperated cases, eight had positive correlations between SPVS and operative findings and histopathology. SPVS did not reveal the parathyroid hormone source in one case with negative non-invasive localization procedures. Comparisons between SPVS, computerized tomography (CT), and parathyroid scintigraphy (MIBI) as expressed in terms of true positive (TP), false positive (FP) and false negative (FN) were: SPVS - TP 88.8%, FP 0%, FN 11.1%; CT - TP 22.2%, FP 22.2%, FN 55.5%; and MIBI - TP 33.3%, FP 0%, FN 66.6%. At least seven of the nine operated cases have been cured; another remained normocalcaemic 2 weeks after subtotal parathyroidectomy. Conclusion: In our institution SPVS has proven to be a valuable tool in cases with recurrent or persistent primary hyperparathyroidism and negative non-invasive localization procedures.
Abstract:
Frequency of exposure to very low- and high-frequency words was manipulated in a three-phase (familiarisation, study, and test) design. During familiarisation, words were presented with their definition (once, four times, or not presented). One week (Experiment 1) or one day (Experiment 2) later, participants studied a list of homogeneous pairs (i.e., pair members were matched on background and familiarisation frequency). Item and associative recognition of high- and very low-frequency words presented in intact, rearranged, old-new, or new-new pairs were tested in Experiment 1. Associative recognition of very low-frequency words was tested in Experiment 2. Results showed that prior familiarisation improved associative recognition of very low-frequency pairs, but had no effect on high-frequency pairs. The role of meaning in the formation of item-to-item and item-to-context associations and the implications for current models of memory are discussed.
Abstract:
Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: How can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize their decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes, new information, and are consistent and replicable. Rational choice of an action from among various alternatives-defined as a choice that makes preferred consequences more likely-requires accounting for costs, benefits and the change in risks associated with each candidate action. 
Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the relevance of new information, relative to the initial (and smaller) set of data on which the decision was based. We exemplify this seemingly simple situation using risk management of BSE. As an integral aspect of causal analysis under risk, the methods developed in this paper permit the addition of non-linear, hormetic dose-response models to the current set of regulatory defaults such as the linear, non-threshold models. This increase in the number of defaults is an important improvement because most of the variants of the precautionary principle require cost-benefit balancing. Specifically, increasing the set of causal defaults accounts for beneficial effects at very low doses. We also show and conclude that quantitative risk assessment dominates qualitative risk assessment, supporting the extension of the set of default causal models.
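The expected value of information idea invoked here can be made concrete with a two-state, two-action toy decision. All payoffs and the prior below are invented for illustration (they are not BSE figures): a regulator either acts now at a fixed cost or waits, and a hazard is either present or absent.

```python
import numpy as np

# Rows: state of the world (hazard present / absent).
# Columns: action (act now / wait). Hypothetical payoffs.
payoff = np.array([[-10.0, -100.0],   # hazard present: act, wait
                   [-10.0,    0.0]])  # hazard absent:  act, wait
prior = np.array([0.2, 0.8])          # P(hazard present), P(hazard absent)

# Expected value acting on the prior alone: commit to the single best action.
ev_prior = (prior @ payoff).max()

# Expected value with perfect information: learn the state first, then choose
# the best action for that state, averaged over the prior.
ev_perfect = (prior * payoff.max(axis=1)).sum()

# EVPI: the most it could be worth to wait for (perfect) new information.
evpi = ev_perfect - ev_prior
print(f"EV(prior) = {ev_prior}, EV(perfect info) = {ev_perfect}, EVPI = {evpi}")
```

Here acting now is the best prior-based choice (expected payoff -10 versus -20 for waiting), but perfect information would raise the expected payoff to -2, so EVPI = 8: the Bayesian framing prices the contingency of current scientific knowledge, which is exactly the link to precautionary decision-making the abstract describes.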