965 results for Assumptions


Relevance:

20.00%

Publisher:

Abstract:

This licentiate thesis sets out to analyse how a retail price decision frame can be understood. It is argued that price determination within retailing can be viewed by determining the level of rationality and using behavioural theories, which makes it possible to use assumptions derived from economics and marketing to establish a decision frame. By taking a management perspective, it is possible to consider how the retailer is assumed to strategically manage price decisions, which decisions might be assumed to be price decisions, and which decisions can be assumed to be under the retailer's control. Theoretically, the thesis is founded on different assumptions about decision frames regarding the level of information collected, the goal of the decisions, and the outcomes of the decisions. Since the concepts analysed in this thesis are price decisions, the latter part of the theory discusses price decisions specifically: sequential price decisions, the point of decision, and the trade-offs made when deciding. It becomes evident that a conceptual decision frame intended to illustrate price decisions includes several aspects: the available decision alternatives and the assumptions of rationality that can be made in relation to the frame. A semi-structured literature review was conducted, which revealed that two important aspects of the decision frame were unclear: the time assumptions regarding the decisions and the amount of information assumed for the different decision alternatives. Using the same articles that were used to adjust the decision frame, a topical study was carried out to determine the time-specific assumptions, as well as the analytical level implied by the information assumed necessary for individual decision alternatives. This, together with an experimental study, was necessary to discuss the consequences of the rationality assumption. When the retail literature is analysed for the level of rationality and the consequences of adopting certain rationality assumptions, three main findings emerge. First, the level of rationality, or the assumptions of rationality, is seldom stated or accounted for in the literature; in fact, there are indications that perfect and bounded rationality assumptions are used simultaneously within studies. Second, although bounded rationality is a recognised theoretical perspective, very few articles seem to use its assumptions. Third, since the outcome of a price decision seems to provide no incremental sales, it is questionable which assumptions of rationality should be used; it might even be that no rationality assumptions should be used at all. In a broader perspective, the findings of this licentiate thesis show that the assumptions of rationality within retail research are unclear. There is an imbalance between the perspectives used, with the main assumptions concentrated on perfect rationality. It is suggested, however, that clarifying which assumptions of rationality are used, and applying bounded rationality assumptions in research, would give a clearer picture of the multifaceted price decisions that can be assumed within retailing. The theoretical contribution of this thesis mainly concerns identifying how the level of rationality imposes limiting assumptions within retail research. Furthermore, since indications show that learning might not occur within this specific context, it is questioned whether the basic learning assumption within bounded rationality should be used here.

Relevance:

20.00%

Publisher:

Abstract:

Almost a full century separates Lewis Carroll's Alice in Wonderland (1865) and the second, lengthier and more elaborate edition of Hans Kelsen's Pure Theory of Law (1960; first edition published in 1934). And yet, it is possible to argue that the former anticipates and critically addresses many of the philosophical assumptions that underlie and are elemental to the argument of the latter. Both texts, with the illuminating differences that arise from their disparate genres, have norms and their functioning as one of their key themes. Wonderland, as Alice soon finds out, is a world beset by rules of all kinds: from the etiquette rituals of the mad tea-party to the changing setting of the croquet game to the procedural insanity of the trial with which the novel ends. Pure Theory of Law, as Kelsen emphatically stresses, has the grundnorm as the cornerstone upon which the whole theoretical edifice rests. This paper discusses some of the assumptions underlying Kelsen's argument as an instance of the modern worldview which Carroll satirically scrutinizes. The first section (Sleepy and stupid) discusses Carroll's critique of the idea that, to correctly apprehend an object (in the case of Kelsen's study, law), one has to free it from its alien elements. The second section (Do bats eat cats?) discusses the notion of systemic coherence and its impact on modern ways of thinking about truth, law and society. The third section (Off with their heads!) explores the connections between readings of systems as neutral entities and the perpetuation of political power. The fourth and final section (Important, Unimportant) explains the sense in which a "critical anticipation" is both possible and useful for discussing the philosophical assumptions structuring some positivist arguments. It also discusses the reasons for choosing to focus on Kelsen's work, rather than on that of Carroll's contemporary, John Austin, whose The Province of Jurisprudence Determined (published in 1832) remains influential in legal debates today.

Relevance:

20.00%

Publisher:

Abstract:

Includes bibliography

Relevance:

20.00%

Publisher:

Abstract:

Latent class regression models are useful tools for assessing associations between covariates and latent variables. However, evaluation of key model assumptions cannot be performed using methods from standard regression models, owing to the unobserved nature of the latent outcome variable. This paper presents graphical diagnostic tools to evaluate whether latent class regression models adhere to the model's standard assumptions: conditional independence and non-differential measurement. An integral part of these methods is the use of a Markov chain Monte Carlo (MCMC) estimation procedure. Unlike standard maximum likelihood implementations of latent class regression model estimation, the MCMC approach allows us to calculate posterior distributions and point estimates of any function of the parameters, and it is this convenience that enables the diagnostic methods we introduce. As a motivating example, we present an analysis of the association between depression and socioeconomic status using data from the Epidemiologic Catchment Area study: the latent variable depression is regressed on education and income indicators, in addition to age, gender, and marital status variables. While the fitted latent class regression model yields interesting results, the model parameters are found to be invalid because model assumptions are violated, and this violation is clearly identified by the presented diagnostic plots. These methods can be applied to standard latent class and latent class regression models, and the general principle can be extended to evaluate model assumptions in other types of models.
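
As an illustration of the kind of posterior-based diagnostic this abstract describes, the sketch below (hypothetical code, not taken from the paper) assumes binary item indicators and MCMC draws of the class prevalences and item-response probabilities, and computes observed-minus-expected pairwise co-occurrence residuals that could then be plotted to flag violations of conditional independence.

```python
# Hypothetical sketch of a conditional-independence diagnostic for a latent class
# model: compare observed pairwise item co-occurrence with the co-occurrence
# implied by the fitted model, averaged over MCMC posterior draws.
import numpy as np

def pairwise_residuals(y, class_probs, item_probs):
    """
    y           : (n, J) array of binary item responses
    class_probs : (S, K) posterior draws of the class prevalences
    item_probs  : (S, K, J) posterior draws of P(item j = 1 | class k)
    Returns the J x J matrix of observed-minus-expected co-occurrence
    proportions; large entries suggest conditional dependence between items.
    """
    n, _ = y.shape
    observed = (y.T @ y) / n  # observed P(y_j = 1, y_l = 1) for every item pair
    expected = np.zeros_like(observed, dtype=float)
    for pi, theta in zip(class_probs, item_probs):
        # Under conditional independence: sum_k pi_k * theta_kj * theta_kl
        expected += np.einsum("k,kj,kl->jl", pi, theta, theta)
    expected /= len(class_probs)
    return observed - expected
```

Averaging over the posterior draws, rather than plugging in a single point estimate, is what the MCMC machinery buys here: the same residuals can be computed draw by draw to obtain posterior intervals for each item pair.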

Relevance:

20.00%

Publisher:

Abstract:

To evaluate strategies used to select cases and controls and how reported odds ratios are interpreted, the authors examined 150 case-control studies published in leading general medicine, epidemiology, and clinical specialist journals from 2001 to 2007. Most of the studies (125/150; 83%) were based on incident cases; among these, the source population was mostly dynamic (102/125; 82%). A minority (23/125; 18%) sampled from a fixed cohort. Among studies with incident cases, 105 (84%) could interpret the odds ratio as a rate ratio. Fifty-seven (46% of 125) required the source population to be stable for such interpretation, while the remaining 48 (38% of 125) did not need any assumptions because of matching on time or concurrent sampling. Another 17 (14% of 125) studies with incident cases could interpret the odds ratio as a risk ratio, with 16 of them requiring the rare disease assumption for this interpretation. The rare disease assumption was discussed in 4 studies but was not relevant to any of them. No investigators mentioned the need for a stable population. The authors conclude that in current case-control research, a stable exposure distribution is much more frequently needed to interpret odds ratios than the rare disease assumption. At present, investigators conducting case-control studies rarely discuss what their odds ratios estimate.
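
For context, the rare disease assumption referred to above rests on a standard epidemiologic identity (this derivation is background, not taken from the article): with exposure-specific risks $p_1$ and $p_0$,

```latex
\mathrm{OR} \;=\; \frac{p_1/(1-p_1)}{p_0/(1-p_0)}
          \;=\; \frac{p_1}{p_0}\cdot\frac{1-p_0}{1-p_1}
          \;\approx\; \frac{p_1}{p_0} \;=\; \mathrm{RR}
          \qquad \text{when } p_1,\, p_0 \ll 1 .
```

By contrast, when controls are sampled concurrently with case occurrence (density sampling), the exposure odds ratio estimates the rate ratio directly, without any rarity requirement, which is why the studies matching on time needed no such assumption.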

Relevance:

20.00%

Publisher:

Abstract:

Conservative procedures in low-dose risk assessment are used to set safety standards for known or suspected carcinogens. However, the assumptions upon which these methods are based, and the effects of the methods, are not well understood.

To minimize the number of false negatives and to reduce the cost of bioassays, animals are given very high doses of potential carcinogens. Results must then be extrapolated to much smaller doses to set safety standards for risks such as one per million. A number of competing methods add a conservative safety factor into these calculations.

A method of quantifying the conservatism of these methods was described and tested on eight procedures used in setting low-dose safety standards. The results of these procedures were compared by computer simulation and by the use of data from a large-scale animal study.

The method consisted of determining a "true safe dose" (tsd) according to an assumed underlying model. If one assumes that Y, the probability of cancer, equals P(d), a known mathematical function of the dose, then by setting Y to some predetermined acceptable risk one can solve for d, the model's true safe dose.

Simulations were generated, assuming a binomial distribution, for an artificial bioassay. The eight procedures were then used to determine a "virtual safe dose" (vsd) that estimates the tsd, assuming a risk of one per million. A ratio R = (tsd - vsd)/vsd was calculated for each "experiment" (simulation). The mean R over 500 simulations and the probability that R < 0 were used to measure the over- and under-conservatism of each procedure.

The eight procedures included Weil's method, Hoel's method, the Mantel-Bryan method, the improved Mantel-Bryan, Gross's method, fitting a one-hit model, Crump's procedure, and applying Rai and Van Ryzin's method to a Weibull model.

None of the procedures performed uniformly well for all types of dose-response curves. When the data were linear, the one-hit model, Hoel's method, or the Gross-Mantel method worked reasonably well. However, when the data were non-linear, these same methods were overly conservative; Crump's procedure and the Weibull model performed better in these situations.
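
To make the tsd/vsd construction concrete, here is a hypothetical sketch (not the study's procedure; the doses, sample sizes, and parameter values are made up) using a one-hit model P(d) = 1 - exp(-lam*d). It fits lam to simulated high-dose binomial data by plain maximum likelihood, without the extra conservatism that the compared procedures add, then solves for the dose giving a one-per-million risk and computes R.

```python
# Hypothetical illustration of the tsd/vsd comparison with a one-hit model
# P(d) = 1 - exp(-lam * d); all doses and sample sizes are illustrative only.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
true_lam = 0.05
doses = np.array([5.0, 10.0, 20.0])      # high experimental doses
n_per_dose = 50
tumours = rng.binomial(n_per_dose, 1 - np.exp(-true_lam * doses))

def neg_log_lik(lam):
    # Binomial log-likelihood of the observed tumour counts under the one-hit model.
    p = np.clip(1 - np.exp(-lam * doses), 1e-12, 1 - 1e-12)
    return -np.sum(tumours * np.log(p) + (n_per_dose - tumours) * np.log(1 - p))

lam_hat = minimize_scalar(neg_log_lik, bounds=(1e-6, 10.0), method="bounded").x

risk = 1e-6
tsd = -np.log(1 - risk) / true_lam   # "true safe dose" under the assumed model
vsd = -np.log(1 - risk) / lam_hat    # estimated ("virtual") safe dose
R = (tsd - vsd) / vsd                # R < 0 means vsd > tsd, i.e. under-conservative
print(f"tsd={tsd:.3e}  vsd={vsd:.3e}  R={R:+.3f}")
```

Repeating the simulation many times and recording the mean of R and the proportion of runs with R < 0 mirrors the way the study summarises each procedure's conservatism.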

Relevance:

20.00%

Publisher:

Abstract:

The set agreement problem states that, of n proposed values, at most n - 1 can be decided. Traditionally, this problem is solved using a failure detector in asynchronous systems where processes may crash but do not recover, where processes have distinct identities, and where all processes initially know the membership. In this paper we study the set agreement problem and the weakest failure detector L used to solve it in asynchronous message-passing systems where processes may crash and recover, with homonyms (i.e., processes may have equal identities) and without complete initial knowledge of the membership.
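
As a point of reference, the (n-1)-set agreement conditions on a completed run can be stated as a simple predicate; the sketch below is an illustrative check only (it is not the paper's algorithm and says nothing about the failure detector L or how agreement is reached).

```python
# Hypothetical helper: checks validity and (n-1)-set agreement on the proposals
# and decisions of a completed run.
def satisfies_set_agreement(proposals, decisions):
    n = len(proposals)
    validity = all(d in proposals for d in decisions)   # every decided value was proposed
    agreement = len(set(decisions)) <= n - 1            # at most n - 1 distinct decisions
    return validity and agreement

# Example: three processes propose 1, 2, 3; deciding only {1, 2} satisfies 2-set agreement.
assert satisfies_set_agreement([1, 2, 3], [1, 2, 1])
```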

Relevance:

20.00%

Publisher:

Abstract:

Steam Generator Tube Rupture (SGTR) sequences in Pressurized Water Reactors are known to be among the most demanding transients for the operating crew. SGTR sequences are a special kind of transient because they can lead to radiological releases without core damage or containment failure, since they constitute a direct path from the reactor coolant system to the environment. The first methodology used to perform the Deterministic Safety Analysis (DSA) of an SGTR did not credit operator action during the first 30 min of the transient, assuming that the operating crew would be able to stop the primary-to-secondary leakage within that time. However, the real SGTR accidents that have occurred in the USA and elsewhere demonstrated that operators usually take more than 30 min to stop the leakage in actual sequences. Several methodologies were developed to overcome this limitation, crediting operator actions from the beginning of the transient, as is done in Probabilistic Safety Analysis. This paper presents the results of comparing different assumptions regarding the single failure criterion and the operator actions, taken from the most common methodologies included in the different Deterministic Safety Analyses. A single failure criterion that has not previously been analysed in the literature is also proposed and analysed here. The comparison is performed with a three-loop Westinghouse PWR model (Almaraz NPP) in the TRACE code, using best-estimate assumptions but including deterministic hypotheses such as the single failure criterion or loss of offsite power. The behaviour of the reactor differs considerably depending on the assumptions made regarding the operator actions. On the other hand, although strong conservatisms are included in the hypotheses, such as the single failure criterion, all the results are quite far from the regulatory limits. In addition, some improvements to the Emergency Operating Procedures to minimize the offsite release from the damaged SG in case of an SGTR are outlined, taking into account the offsite dose sensitivity results.