973 results for probability models


Relevance:

20.00%

Publisher:

Abstract:

Recently, ‘business model’ and ‘business model innovation’ have gained substantial attention in management literature and practice. However, many firms lack the capability to develop a novel business model to capture the value from new technologies. Existing literature on business model innovation highlights the central role of ‘customer value’. Further, it suggests that firms need to experiment with different business models and engage in ‘trial-and-error’ learning when participating in business model innovation. Trial-and-error processes and prototyping with tangible artifacts are a fundamental characteristic of design. This conceptual paper explores the role of design-led innovation in enabling firms to conceive and prototype novel and meaningful business models. It provides a brief review of the conceptual discussion on business model innovation and highlights the opportunities for linking it with the research stream of design-led innovation. We propose design-led business model innovation as a future research area and highlight the role that design-led prototyping and new types of artifacts and prototypes play within it. We present six propositions to outline future research avenues.


The term “vagueness” describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno’s sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib and Pelletier’s (2011) theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier, and of its probabilistic pendant, is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon.
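To illustrate the kind of mechanism the abstract appeals to, here is a minimal sketch (not the paper's model) of how quantum probabilities differ from classical ones: when an outcome is composed from two amplitudes, the squared magnitude of their sum contains an interference term that the classical additive rule lacks. The amplitudes and the relative phase below are illustrative assumptions.

```python
import cmath

def quantum_prob(amp_a: complex, amp_b: complex) -> float:
    """Probability from combined amplitudes: |a + b|^2."""
    return abs(amp_a + amp_b) ** 2

def classical_sum(amp_a: complex, amp_b: complex) -> float:
    """Classical additive rule applied to the same components: |a|^2 + |b|^2."""
    return abs(amp_a) ** 2 + abs(amp_b) ** 2

# Hypothetical amplitudes for the two readings of a borderline case:
# equal magnitudes, with a relative phase of pi/3 between them.
a = cmath.rect(0.5, 0.0)
b = cmath.rect(0.5, cmath.pi / 3)

# The quantum rule adds an interference term 2*Re(conj(a)*b) that the
# classical rule lacks; a term of this kind is what a quantum model can
# use to account for elevated rates of borderline contradictions.
interference = quantum_prob(a, b) - classical_sum(a, b)
```

When the phase difference is zero the interference term is maximal; at a phase of π/2 it vanishes and the quantum and classical rules coincide.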


The identification of the primary drivers of stock returns has been of great interest to financial practitioners and academics alike for many decades. Influenced by classical financial theories such as the CAPM (Sharpe, 1964; Lintner, 1965) and the APT (Ross, 1976), a linear relationship is conventionally assumed between company characteristics, as derived from their financial accounts, and forward returns. Whilst this assumption may be a fair approximation to the underlying structural relationship, it is often adopted for the purpose of convenience. It is actually quite rare that the assumptions of distributional normality and a linear relationship are explicitly assessed in advance, even though this information would help to inform the appropriate choice of modelling technique. Non-linear models have nevertheless been applied successfully to the task of stock selection in the past (Sorensen et al., 2000). However, their take-up by the investment community has been limited, despite the fact that researchers in other fields have found them to be a useful way to express knowledge and aid decision-making...


Animal models typically require a known genetic pedigree to estimate quantitative genetic parameters. Here we test whether animal models can alternatively be based on estimates of relatedness derived entirely from molecular marker data. Our case study is the morphology of a wild bird population, for which we report estimates of the genetic variance-covariance matrices (G) of six morphological traits using three methods: the traditional animal model; a molecular marker-based approach to estimating heritability based on Ritland's pairwise regression method; and a new approach using a molecular genealogy arranged in a relatedness matrix (R) to replace the pedigree in an animal model. Using the traditional animal model, we found significant genetic variance for all six traits and positive genetic covariance among traits. The pairwise regression method did not return reliable estimates of quantitative genetic parameters in this population, with estimates of genetic variance and covariance typically being very small or negative. In contrast, we found mixed evidence for the use of the pedigree-free animal model. Like the pairwise regression method, the pedigree-free approach performed poorly when the full-rank R matrix based on the molecular genealogy was employed. However, performance improved substantially when we reduced the dimensionality of the R matrix in order to maximize the signal-to-noise ratio: reduced-rank R matrices generated estimates of genetic variance that were much closer to those from the traditional model. Nevertheless, this method was less reliable at estimating covariances, which were often estimated to be negative. Taken together, these results suggest that pedigree-free animal models can recover quantitative genetic information, although the signal remains relatively weak. It remains to be determined whether this problem can be overcome by the use of a more powerful battery of molecular markers and improved methods for reconstructing genealogies.
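One common way to reduce the dimensionality of a symmetric relatedness matrix, sketched here under the assumption that a spectral truncation of this general kind is meant (the abstract does not specify the method), is to keep only the leading eigenpairs of R:

```python
import numpy as np

def reduced_rank(R: np.ndarray, k: int) -> np.ndarray:
    """Approximate a symmetric relatedness matrix R by its top-k eigenpairs,
    discarding the low-variance (noisiest) dimensions."""
    vals, vecs = np.linalg.eigh(R)           # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:k]         # indices of the k largest eigenvalues
    return (vecs[:, idx] * vals[idx]) @ vecs[:, idx].T

# Toy marker-based "relatedness" matrix for four individuals (hypothetical values).
R = np.array([[1.0, 0.5, 0.1, 0.0],
              [0.5, 1.0, 0.1, 0.0],
              [0.1, 0.1, 1.0, 0.2],
              [0.0, 0.0, 0.2, 1.0]])
R2 = reduced_rank(R, 2)  # rank-2 approximation retaining the strongest structure
```

The truncated matrix keeps the dominant relatedness signal while zeroing out the dimensions most contaminated by marker sampling noise, which is the stated motivation for the reduced-rank R in the abstract.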


Traffic safety studies demand more than current micro-simulation models can provide, as these models presume that all drivers of motor vehicles exhibit safe behaviours. Several car-following models are used in various micro-simulation models. This research compares the mainstream car-following models’ capabilities of emulating precise driver behaviour parameters such as headways and Time to Collision. The comparison first illustrates which model is more robust in reproducing these metrics. Second, the study conducted a series of sensitivity tests to further explore the behaviour of each model. Based on the outcome of this two-step exploration of the models, a modified structure and parameter adjustment for each car-following model is proposed to simulate more realistic vehicle movements, particularly headways and Time to Collision below a certain critical threshold. NGSIM vehicle trajectory data are used to evaluate the modified models’ performance in assessing critical safety events within traffic flow. The simulation test outcomes indicate that the proposed modified models reproduce the frequency of critical Time to Collision events better than the generic models, while the improvement in headways is not significant. The outcome of this paper facilitates traffic safety assessment using microscopic simulation.
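Time to Collision is conventionally defined as the spacing between two vehicles divided by their closing speed, and is undefined when the follower is not closing on the leader. A minimal sketch of the metric and a criticality flag (the 1.5 s threshold is an illustrative assumption, not the paper's calibrated value):

```python
from typing import Optional

def time_to_collision(gap_m: float, v_follow: float, v_lead: float) -> Optional[float]:
    """Time to Collision in seconds: spacing divided by closing speed.

    Returns None when the follower is not closing on the leader,
    in which case no collision course exists."""
    closing = v_follow - v_lead          # m/s; positive means catching up
    if closing <= 0:
        return None
    return gap_m / closing

def is_critical(ttc: Optional[float], threshold_s: float = 1.5) -> bool:
    """Flag a critical safety event when TTC falls below a threshold.
    The 1.5 s default is illustrative, not a calibrated value."""
    return ttc is not None and ttc < threshold_s

# Follower at 20 m/s, leader at 14 m/s, 12 m apart: TTC = 12 / 6 = 2.0 s.
ttc = time_to_collision(gap_m=12.0, v_follow=20.0, v_lead=14.0)
```

Counting how often simulated TTC values fall below the threshold, and comparing that frequency against trajectory data such as NGSIM, is the kind of safety assessment the abstract describes.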


The standard approach to tax compliance applies the economics-of-crime methodology pioneered by Becker (1968): in its first application, due to Allingham and Sandmo (1972), it models the behaviour of agents as a decision involving a choice of the extent of their income to report to tax authorities, given a certain institutional environment represented by parameters such as the probability of detection and the penalties applied in the event the agent is caught. While this basic framework yields important insights on tax compliance behaviour, it has some critical limitations. Specifically, it indicates a level of compliance that is significantly below what is observed in the data. This thesis revisits the original framework with a view towards addressing this issue, and examines the political economy implications of tax evasion for progressivity in the tax structure. The approach followed involves building a macroeconomic, dynamic equilibrium model for the purpose of examining these issues, using a step-wise model-building procedure that starts with some very simple variations of the basic Allingham and Sandmo construct, which are eventually integrated into a dynamic general equilibrium overlapping generations framework with heterogeneous agents. One of the variations involves incorporating the Allingham and Sandmo construct into a two-period model of a small open economy of the type originally attributed to Fisher (1930). A further variation of this simple construct involves allowing agents to initially decide whether to evade taxes or not. In the event they decide to evade, the agents then have to decide the extent of income or wealth they wish to under-report.
We find that the ‘evade or not’ assumption has strikingly different and more realistic implications for the extent of evasion, and demonstrate that it is a more appropriate modelling strategy in the context of macroeconomic models, which are essentially dynamic in nature and involve consumption smoothing across time and across various states of nature. Specifically, since deciding to undertake tax evasion impacts on the consumption smoothing ability of the agent by creating two states of nature, in which the agent is either ‘caught’ or ‘not caught’, there is a possibility that their utility under certainty, when they choose not to evade, is higher than the expected utility obtained when they choose to evade. Furthermore, the simple two-period model incorporating an ‘evade or not’ choice can be used to demonstrate some strikingly different political economy implications relative to its Allingham and Sandmo counterpart. In variations of the two models that allow for voting on the tax parameter, we find that agents typically choose to vote for a high degree of progressivity by choosing the highest available tax rate from the menu of choices available to them. There is, however, a small range of inequality levels for which agents in the ‘evade or not’ model vote for a relatively low value of the tax rate. The final steps in the model-building procedure involve grafting the two-period models with a political economy choice onto a dynamic overlapping generations setting with more general, non-linear tax schedules and a ‘cost-of-evasion’ function that is increasing in the extent of evasion. Results based on numerical simulations of these models show further improvement in the model’s ability to match empirically plausible levels of tax evasion.
In addition, the differences between the political economy implications of the ‘evade or not’ version of the model and its Allingham and Sandmo counterpart are now very striking; there is now a large range of values of the inequality parameter for which agents in the ‘evade or not’ model vote for a low degree of progressivity. This is because, in the ‘evade or not’ version of the model, low values of the tax rate encourage a large number of agents to choose the ‘not-evade’ option, so that the redistributive mechanism is more ‘efficient’ relative to situations in which tax rates are high. Some further implications of the models of this thesis relate to whether variations in the level of inequality, and in parameters such as the probability of detection and the penalties for tax evasion, matter for the political economy results. We find that (i) the political economy outcomes for the tax rate are quite insensitive to changes in inequality, and (ii) the voting outcomes change in non-monotonic ways in response to changes in the probability of detection and penalty rates. Specifically, the model suggests that changes in inequality should not matter, although the political outcome for the tax rate for a given level of inequality is conditional on whether there is a large or small extent of evasion in the economy. We conclude that further theoretical research into macroeconomic models of tax evasion is required to identify the structural relationships underpinning the link between inequality and redistribution in the presence of tax evasion. The models of this thesis provide a necessary first step in that direction.
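The ‘evade or not’ choice described above can be sketched with the basic Allingham–Sandmo payoffs: the agent compares the certain utility of full compliance against the best expected utility attainable by under-reporting. Log utility, the grid search, and the parameter values below are illustrative assumptions, not the thesis's specification or calibration.

```python
import math

def eu_evade(W, x, theta, p, penalty):
    """Expected log utility when reporting income x < W
    (Allingham-Sandmo payoffs; log utility is an illustrative choice)."""
    not_caught = W - theta * x                  # pay tax only on reported income x
    caught = not_caught - penalty * (W - x)     # plus a fine on the undeclared part
    if caught <= 0:
        return float("-inf")                    # ruinous outcome: never optimal
    return (1 - p) * math.log(not_caught) + p * math.log(caught)

def evade_or_not(W=100.0, theta=0.3, p=0.05, penalty=0.8):
    """Compare the certain utility of full compliance with the best evasion plan."""
    u_comply = math.log(W - theta * W)          # no risk: utility is certain
    best_evasion = max(eu_evade(W, W * i / 100, theta, p, penalty)
                       for i in range(100))     # grid over reported income < W
    return "evade" if best_evasion > u_comply else "comply"
```

With a low detection probability the expected-utility gain from under-reporting dominates, while a higher detection probability makes the certain compliance payoff preferable, which is the discrete margin the ‘evade or not’ variant adds to the standard interior-choice model.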


In various industrial and scientific fields, conceptual models are derived from real-world problem spaces to understand and communicate the entities and coherencies they contain. Abstracted models mirror the common understanding and information demand of engineers, who apply conceptual models in performing their daily tasks. However, most standardized models in Process Management, Product Lifecycle Management and Enterprise Resource Planning lack a scientific foundation for their notation. In collaboration scenarios with stakeholders from several disciplines, tailored conceptual models complicate communication processes, as a common understanding is not shared or implemented in the specific models. To support direct communication between experts from several disciplines, a visual language is developed which allows a common visualization of discipline-specific conceptual models. For visual discrimination and to overcome visual complexity issues, conceptual models are arranged in a three-dimensional space. The visual language introduced here follows and extends established principles of Visual Language science.


Dengue fever is one of the world’s most important vector-borne diseases. The transmission area of this disease continues to expand due to many factors, including urban sprawl, increased travel and global warming. Current preventative techniques are primarily based on controlling mosquito vectors, as other prophylactic measures, such as a tetravalent vaccine, are unlikely to be available in the foreseeable future. However, the continually increasing dengue incidence suggests that this strategy alone is not sufficient. Epidemiological models attempt to predict future outbreaks using information on the risk factors of the disease. Through a systematic literature review, this paper analyzes the different modeling methods and their outputs in terms of accurately predicting disease outbreaks. We found that many previous studies have not sufficiently accounted for the spatio-temporal features of the disease in the modeling process. Yet advances in technology have made it possible to incorporate such information, as well as socio-environmental aspects, allowing a model to be used as an early warning system, albeit limited geographically to a local scale.


A fundamental problem faced by stereo matching algorithms is the matching or correspondence problem, and a wide range of algorithms have been proposed for it. For all matching algorithms, it would be useful to be able to compute a measure of the probability of correctness, or reliability, of a match. This paper focuses in particular on one class of matching algorithms, which are based on the rank transform. The interest in these algorithms for stereo matching stems from their invariance to radiometric distortion and their amenability to fast hardware implementation. This work differs from previous work in that it derives, from first principles, an expression for the probability of a correct match, based on an enumeration of all possible symbols for matching. The theoretical results for disparity error prediction obtained using this method were found to agree well with experimental results. However, disadvantages of the technique are that it is not easily applicable to real images and that it is too computationally expensive for practical window sizes. Nevertheless, the exercise provides an interesting and novel analysis of match reliability.
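For concreteness, the rank transform itself can be sketched as follows: each pixel is replaced by the number of pixels in a surrounding window whose intensity lies below the centre value, which makes the result invariant to any monotonic radiometric distortion. This is a generic illustration, not the paper's implementation.

```python
def rank_transform(img, win=3):
    """Rank transform: each pixel becomes the count of window neighbours
    whose intensity is strictly below the centre pixel's.
    Border pixels (where the window does not fit) are left at 0 for simplicity."""
    h, w = len(img), len(img[0])
    r = win // 2
    out = [[0] * w for _ in range(h)]
    for y in range(r, h - r):
        for x in range(r, w - r):
            centre = img[y][x]
            out[y][x] = sum(
                1
                for dy in range(-r, r + 1)
                for dx in range(-r, r + 1)
                if (dy, dx) != (0, 0) and img[y + dy][x + dx] < centre
            )
    return out

# A monotonic brightness distortion (here 2*v + 5) preserves intensity order,
# so the transformed image is unchanged.
img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
bright = [[2 * v + 5 for v in row] for row in img]
```

After the transform, matching proceeds on the rank images (typically by comparing windows of rank values between the two views), which is why hardware-friendly integer comparisons suffice.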


Background/objectives: This study estimates the economic outcomes of a nutrition intervention for at-risk patients, compared with standard care, in the prevention of pressure ulcers.

Subjects/methods: Statistical models were developed to predict ‘cases of pressure ulcer avoided’, ‘number of bed days gained’ and ‘change to economic costs’ in public hospitals in 2002–2003 in Queensland, Australia. Input parameters were specified and appropriate probability distributions fitted for: number of discharges per annum; incidence rate for pressure ulcer; independent effect of pressure ulcer on length of stay; cost of a bed day; change in the risk of developing a pressure ulcer associated with nutrition support; and annual cost of the provision of a nutrition support intervention for at-risk patients. A total of 1000 random re-samples were made and the results expressed as output probability distributions.

Results: The model predicts a mean of 2,896 (s.d. 632) cases of pressure ulcer avoided, 12,397 (s.d. 4,491) bed days released and a corresponding mean economic cost saving of €2,869,526 (s.d. 2,078,715) with a nutrition support intervention, compared with standard care.

Conclusion: Nutrition intervention is predicted to be a cost-effective approach to the prevention of pressure ulcers in at-risk patients.
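The re-sampling procedure described above can be sketched as a simple Monte Carlo loop: draw each input parameter from its fitted distribution, compute the model output, and repeat 1000 times to obtain an output distribution. The distribution families and parameter values below are illustrative stand-ins, not the study's fitted values.

```python
import random
import statistics

random.seed(1)  # reproducible re-sampling

def simulate_once():
    """One random draw of the model inputs (illustrative distributions only)."""
    discharges = random.gauss(120_000, 5_000)       # at-risk discharges per annum
    incidence = random.betavariate(2, 38)           # baseline pressure-ulcer incidence
    risk_reduction = random.uniform(0.15, 0.35)     # relative risk reduction from nutrition support
    return discharges * incidence * risk_reduction  # cases avoided under the intervention

# 1000 random re-samples, with the result expressed as an output distribution,
# mirroring the study's probabilistic approach.
samples = [simulate_once() for _ in range(1000)]
mean_avoided = statistics.mean(samples)
sd_avoided = statistics.stdev(samples)
```

Reporting the mean and standard deviation of the output samples, as the Results section does, summarises the uncertainty propagated from the input distributions.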


The aim of this study was to identify which outcome measures or quality indicators are being used to evaluate advanced and new roles in nine allied health professions, and whether the measures evaluate outcomes of interest to the patient, the clinician, or the healthcare provider. A systematic search strategy was used: medical and allied health databases were searched and relevant articles extracted, and relevant studies with at least one outcome measure were evaluated. A total of 106 articles were identified that described advanced roles; however, only 23 of these described an outcome measure in sufficient detail to be included for review. The majority of the reported measures fit into the economic and process categories. The most commonly reported patient-related outcome was satisfaction surveys, while measures of patient health outcomes were infrequently reported. It is unclear from the studies evaluated whether new models of allied healthcare can be shown to be as safe and effective as traditional care for a given procedure. Outcome measures chosen to evaluate these services often reflect organizational need and not patient outcomes. Organizations need to ensure that high-quality performance measures are chosen to evaluate the success of new health service innovations. There needs to be a move away from in-house surveys that add little or no valid evidence as to the effect of a new innovation, and more importance needs to be placed on patient outcomes as a measure of the quality of allied health interventions.


Emergency health is a critical component of Australia’s health system and one which is increasingly congested from growing demand and blocked access to inpatient beds. The Emergency Health Services Queensland (EHSQ) study aims to identify the factors driving increased demand for emergency health and to evaluate strategies which may safely reduce future demand growth. This monograph addresses the characteristics of users of emergency health services with the aim of identifying those that appear to contribute to demand growth. The study utilises data on patients treated by Emergency Departments (ED) and the Queensland Ambulance Service (QAS) across Queensland. ED data were derived from the Emergency Department Information System (EDIS) for the period 2001-02 through to 2010-11. Ambulance data were extracted from the QAS’ Ambulance Information Management System (AIMS) and electronic Ambulance Report Form (eARF) for the period 2001-02 through to 2009-10. Due to discrepancies and comparability issues in the ED data, this monograph compares data from 2003-04 with 2010-11 data for 21 of the reporting EDs. A snapshot of users for the 2010-11 financial year for 31 reporting EDs is also used to describe the characteristics of users and to compare those characteristics with population demographics. For QAS data, the 2002-03 and 2009-10 time periods were selected for detailed analyses to identify trends.

• Demand for emergency health care services is increasing, representing both increased population and increased relative utilisation. Per capita demand for ED attention has increased by 2% per annum over the last decade, and for ambulance attention by 3.7% per annum.

• The growth in ED demand is prominent in the more urgent triage categories, with an actual decline in less urgent patients. An estimated 55% of patients attend hospital EDs outside normal working hours. There is no evidence that patients presenting out of hours differ significantly from those presenting within working hours; they have similar triage assessments and outcomes.

• Patients suffering from injuries and poisoning comprise 28% of the ED workload (an increase of 65% over the study period), whilst declines of 32% in cardiovascular and circulatory conditions, and in musculoskeletal problems, have been observed.

• 25.6% of patients attending EDs are admitted to hospital. 19% of admitted patients, and 7% of patients who die in the ED, are triage category 4 or 5 on arrival.

• The average age of ED patients is 35.6 years. Demand has grown in all age groups and amongst both men and women. Men have higher ED utilisation rates in all age groups. The only group in which the growth rate for women has exceeded that for men is the 20-29 age group; this growth is particularly in the injury and poisoning categories.

• Considerable public attention has been paid to ED performance criteria. It is worth noting that 50% of all patients were treated within 33 minutes of arrival.

• Patients from lower socioeconomic areas appear to have higher utilisation rates, and the utilisation rate for Indigenous people appears to exceed those of European and other backgrounds. The utilisation rates for immigrant people are generally lower than for the Australian-born, although it has not been possible to eliminate the confounding impact of different age and socioeconomic profiles.

• Demand for ambulance services is also increasing at a rate that exceeds population growth. Utilisation rates have increased by an average of 5% per annum in Queensland compared with 3.6% nationally, and the utilisation rate in Queensland is 27% higher than the national average.

• The growth in ambulance utilisation has also been amongst the more urgent categories of dispatch, and utilisation rates are higher in rural and regional areas than in the metropolitan area. The demand for ambulance services increases with age, but the growth in demand has been more prominent in younger age groups.

These findings contribute significantly to an understanding of the growth in demand for emergency health. They show that the growth is amongst patients in genuine need of emergency healthcare, and public rhetoric that the congestion of emergency health services is due to inappropriate attendees cannot be substantiated. The consistency of the growth in demand over the last decade reflects not only the changing demographics of the Australian population but also changes in health status, standards of acute health care and other social factors. The growth is also amongst patients with acute injury and poisoning, which is inconsistent with rates of chronic disease as a fundamental driver. We have also interviewed patients regarding their decision-making choices for acute health care and the factors that influence these decisions; this will be the subject of a third monograph and publications.


Developers and policy makers are consistently at odds in the debate over whether impact fees increase house prices. This debate continues despite the extensive body of theoretical and empirical international literature that discusses the passing on of impact fees to home buyers, and the corresponding increase in housing prices. In attempting to quantify this impact, over a dozen empirical studies have been carried out in the US and Canada since the 1980s. However, the methodologies used vary greatly, as do the results. Despite similar infrastructure funding policies in numerous developed countries, no such empirical works exist outside the US and Canada. The purpose of this research is to analyse the existing econometric models in order to identify, compare and contrast the theoretical bases, methodologies, key assumptions and findings of each. This research will assist in identifying whether further model development is required and/or whether any of these models have external validity and are readily transferable outside the US. The findings conclude that there is very little explicit rationale behind the various model selections and that significant model deficiencies appear still to exist.


A synthesis is presented of the predictive capability of a family of near-wall wall-normal free Reynolds stress models (which are completely independent of wall topology, i.e., of the distance from the wall and the normal-to-the-wall orientation) for oblique-shock-wave/turbulent-boundary-layer interactions. For the purpose of comparison, results are also presented using a standard low turbulence Reynolds number k–ε closure and a Reynolds stress model that uses geometric wall normals and wall distances. Studied shock-wave Mach numbers are in the range M_SW = 2.85–2.9 and incoming boundary-layer-thickness Reynolds numbers are in the range Re_δ0 = 1–2×10^6. Computations were carefully checked for grid convergence. Comparison with measurements shows satisfactory agreement, improving on results obtained using a k–ε model, and highlights the relative importance of the redistribution and diffusion closures, indicating directions for future modeling work.