944 results for Risk-Neutral Probability
Abstract:
In this study, I determined the identity, taxonomic placement, and distribution of digenetic trematodes parasitizing the snails Pomacea paludosa and Planorbella duryi at Pa-hay-okee, Everglades National Park. I also characterized temporal and geographic variation in the probability of parasite infection for these snails based on two years of sampling. Although studies indicate that digenean parasites may have important effects both on individual species and on the structure of communities, there have been no studies of digenean parasitism on snails within the Everglades ecosystem. For example, the endangered Everglade Snail Kite, a specialist that feeds almost exclusively on Pomacea paludosa and is a known definitive host of digenean parasites, may suffer direct and indirect effects from consuming parasitized apple snails. Information on the diversity and abundance of parasites harbored in Everglades snail populations should therefore be of considerable interest for the management and conservation of wildlife. Juvenile digeneans (cercariae) representing 20 species were isolated from these two snails, a quadrupling of the number of species known from these hosts. Species were characterized based on morphological, morphometric, and sequence data (18S rDNA, COI, and ITS). Species richness of shed cercariae was greater for P. duryi than for P. paludosa, with 13 and 7 species respectively. These species represented 14 families. P. paludosa and P. duryi had no digenean species in common. The probability of digenean infection was higher for P. duryi than for P. paludosa, and adults showed a greater risk of infection than juveniles in both snails. Planorbella duryi showed variation in probability of infection between sampling sites and hydrological seasons. The number of unique combinations of multi-species infections was greatest among P. duryi individuals, while the overall percentage of multi-species infections was greatest in P. paludosa.
Analyses of six frequently-observed multiple infections from P. duryi suggest the presence of negative interactions, positive interactions, and neutral associations between larval digeneans. These results should contribute to an understanding of the factors controlling the abundance and distribution of key species in the Everglades ecosystem and may in particular help in the management and recovery planning for the Everglade Snail Kite.
Abstract:
Australia’s civil infrastructure assets of roads, bridges, railways, buildings and other structures are worth billions of dollars. Road assets alone are valued at around A$140 billion. As the condition of assets deteriorates over time, close to A$10 billion is spent annually on maintaining Australia's roads, the equivalent of A$27 million per day. To manage road infrastructure effectively, road agencies need, first, to optimise expenditure on asset data collection without jeopardising the reliability of using the optimised data to predict maintenance and rehabilitation costs. Second, they need to predict the deterioration rates of infrastructure accurately, reflecting local conditions, so that budgets can be estimated reliably. Finally, the predicted budgets for maintenance and rehabilitation must carry a known degree of reliability. A procedure for assessing investment decisions for road asset management has been developed. The procedure includes:
• A methodology for optimising asset data collection;
• A methodology for calibrating deterioration prediction models;
• A methodology for assessing risk-adjusted life-cycle cost estimates;
• A decision framework in the form of a risk map.
Abstract:
Risks and uncertainties are inevitable in engineering projects and infrastructure investments. Decisions about investment in infrastructure, such as maintenance, rehabilitation and construction works, can pose risks and may generate significant impacts on social, cultural, environmental and other related issues. This report presents the results of a literature review of current practice in identifying, quantifying and managing risks, and in predicting impacts, as part of the planning and assessment process for infrastructure investment proposals. In assessing proposals for investment in infrastructure, it is necessary to consider social, cultural and environmental risks and impacts to the overall community, as well as financial risks to the investor. The report defines and explains the concepts of risk and uncertainty, and describes the three main methodological approaches to the analysis of risk and uncertainty in investment planning for infrastructure, viz. examining a range of scenarios or options, sensitivity analysis, and a statistical probability approach, listed here in order of increasing merit and complexity. Forecasts of costs, benefits and community impacts of infrastructure are recognised as central to developing and assessing investment proposals. Increasingly complex modelling techniques are being used for investment evaluation. The literature review identified forecasting errors as the major cause of risk. The report contains a summary of the broad nature of decision-making tools used by governments and other organisations in Australia, New Zealand, Europe and North America, and shows their overall approach to risk assessment in assessing public infrastructure proposals. While there are established techniques to quantify financial and economic risks, quantification is far less developed for political, social and environmental risks and impacts.
For risks that cannot be readily quantified, assessment techniques commonly include classification or rating systems for likelihood and consequence. The report outlines the system used by the Australian Defence Organisation and in the Australian Standard on risk management. After each risk is identified and quantified or rated, consideration can be given to reducing the risk and to managing any remaining risk as part of the scope of the project. The literature review identified the use of risk mapping techniques by a North American chemical company and by the Australian Defence Organisation. This literature review has enabled a risk assessment strategy to be developed, and will underpin an examination of the feasibility of developing a risk assessment capability using a probability approach.
Abstract:
Realistic estimates of short- and long-term (strategic) budgets for road maintenance and rehabilitation in asset management should consider the stochastic characteristics of asset conditions across the road network, so that the overall variability of road asset condition data is taken into account. Probability theory has been used for assessing life-cycle costs for bridge infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and Frangopol (2004), Noortwijk and Frangopol (2004), and Novick (1993). Salem et al. (2003) cited the importance of collecting and analysing existing data on total costs for all life-cycle phases of existing infrastructure, including bridges and roads, and of using realistic methods for calculating the probable useful life of these infrastructures. Zayed et al. (2002) reported conflicting results in life-cycle cost analysis using deterministic and stochastic methods. Frangopol et al. (2001) suggested that additional research was required to develop better life-cycle models and tools to quantify the risks and benefits associated with infrastructure. It is evident from the review of the literature that there is very limited information on methodologies that use the stochastic characteristics of asset condition data for assessing budgets/costs for road maintenance and rehabilitation (Abaza 2002; Salem et al. 2003; Zhao et al. 2004). Given this gap in the research literature, this report describes and summarises the methodologies presented in each publication and also suggests a methodology for the current research project funded under the Cooperative Research Centre for Construction Innovation (CRC CI), project no. 2003-029-C.
Abstract:
An estimation of costs for maintenance and rehabilitation is subject to variation due to the uncertainties of input parameters. This paper presents the results of an analysis to identify the input parameters that affect the predicted variation in road deterioration. Road data from 1688 km of a national highway located in the tropical north-east of Queensland, Australia, were used in the analysis. Data were analysed using a probability-based method, the Monte Carlo simulation technique, and HDM-4's roughness prediction model. The results indicated that, among the input parameters, the variability of pavement strength, rut depth, annual equivalent axle load and initial roughness affected the variability of the predicted roughness. The second part of the paper presents an analysis assessing the variation in cost estimates due to the variability of the identified critical input parameters.
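The propagation of input-parameter variability through a deterioration model can be sketched with a small Monte Carlo loop. The roughness surrogate and all distribution parameters below are illustrative assumptions for the sketch, not the HDM-4 equations or the Queensland data:

```python
import random

def roughness_after(years, strength, rut_depth, axle_load, initial_iri):
    """Toy surrogate for a roughness-progression model (NOT the HDM-4
    equations): roughness grows faster on weak, heavily loaded pavements."""
    growth = 0.03 * axle_load / strength + 0.01 * rut_depth
    return initial_iri * (1 + growth) ** years

def simulate(n=10_000, years=10, seed=1):
    """Propagate assumed input variability to the predicted roughness."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        strength = rng.gauss(5.0, 0.8)   # pavement strength (assumed spread)
        rut = rng.gauss(6.0, 1.5)        # rut depth, mm (assumed)
        axle = rng.gauss(1.2, 0.3)       # annual equivalent axle load (assumed)
        iri0 = rng.gauss(2.0, 0.25)      # initial roughness, IRI m/km (assumed)
        out.append(roughness_after(years, strength, rut, axle, iri0))
    return out

results = simulate()
mean = sum(results) / len(results)
```

Tightening the spread of any one input while holding the others fixed shows how much of the output variability that parameter drives, which is the kind of critical-parameter screening the paper describes.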
Abstract:
Queensland Department of Main Roads, Australia, spends approximately A$1 billion annually on road infrastructure asset management. To manage road infrastructure effectively, road agencies need, first, to optimise expenditure on data collection without jeopardising the reliability of using the optimised data to predict maintenance and rehabilitation costs. Second, they need to predict the deterioration rates of infrastructure accurately, reflecting local conditions, so that budgets can be estimated reliably. Finally, the predicted budgets for maintenance and rehabilitation must carry a known degree of reliability. This paper presents the results of case studies using the probability-based method in an integrated approach: assessing the optimal cost of pavement strength data collection; calibrating deterioration prediction models to suit local conditions; and assessing risk-adjusted budget estimates for road maintenance and rehabilitation over the life cycle. The probability concept opens the path to predicting life-cycle maintenance and rehabilitation budget estimates that have a known probability of success (e.g. a budget estimate for a project's life-cycle cost with a 5% probability of being exceeded). The paper also presents a conceptual decision-making framework in the form of a risk map, in which the life-cycle budget/cost investment can be considered in conjunction with social, environmental and political issues.
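In a Monte Carlo setting, a budget with a known probability of being exceeded reduces to reading off a percentile of the simulated life-cycle cost distribution. The cost components and their distributions below are illustrative assumptions for the sketch, not Main Roads figures:

```python
import random

def simulate_lifecycle_costs(n=20_000, seed=7):
    """Simulate total maintenance-and-rehabilitation cost over a 20-year
    life cycle. All components are illustrative placeholders."""
    rng = random.Random(seed)
    costs = []
    for _ in range(n):
        # 20 years of routine maintenance, uncertain annual cost (A$M)
        routine = sum(rng.gauss(1.0, 0.2) for _ in range(20))
        # a major rehabilitation is needed with assumed probability 0.6
        rehab = max(rng.gauss(15.0, 4.0), 0.0) if rng.random() < 0.6 else 0.0
        costs.append(routine + rehab)
    return costs

def risk_adjusted_budget(costs, exceedance=0.05):
    """Budget with the given probability of being exceeded (e.g. 5%)."""
    s = sorted(costs)
    return s[int((1 - exceedance) * len(s)) - 1]

costs = simulate_lifecycle_costs()
budget_95 = risk_adjusted_budget(costs, 0.05)   # 5% chance of exceedance
mean_cost = sum(costs) / len(costs)
```

The gap between `budget_95` and `mean_cost` is the risk adjustment: the margin needed to move from an average-cost estimate to one with a stated 5% exceedance probability.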
Abstract:
A study has been conducted to investigate current practice in decision-making under risk and uncertainty for infrastructure project investments. It was found that many countries, including the UK, France, Germany and Australia, use scenarios to investigate the effects of risk and uncertainty on project investments. Alternative scenarios are mostly considered during the engineering economic cost-benefit analysis stage. For instance, the World Bank requires an analysis of risks in all project appraisals; risk in economic evaluation is addressed by calculating the sensitivity of the rate of return to a number of events. Risks and uncertainties in project development arise from various sources of error, including data, model and forecasting errors. It was found that the most influential factors affecting risk and uncertainty were forecasting errors; data errors and model errors had trivial effects. Many analysts have argued that scenarios do not forecast what will happen, but indicate only what can happen under given alternatives. It was suggested that probability distributions of the end products of project appraisal, such as cost-benefit ratios that take forecasting errors into account, are feasible decision tools for economic evaluation. Political, social and environmental risk issues, as well as economic and other related ones, have been addressed and included in decision-making frameworks, such as multi-criteria decision-making frameworks, but no suggestion has been made on how to incorporate risk into the investment decision-making process.
Abstract:
A shortage of affordable housing is a major problem in Australia today, due mainly to the limited supply of affordable housing provided by the non-government housing sector. Some private housing developers see the provision of affordable housing for lower-income people as a high-risk investment offering a lower return than broader market-based housing. The scarcity of suitable land, a limited government ‘subsidy’, and increasing housing costs have not provided sufficient development incentives to encourage their investment, despite the existing high demand for affordable housing. This study analyses the risk management process conducted by some private and not-for-profit housing providers in South East Queensland, and draws conclusions about the relationship between risk assessments/responses and past experiences. In-depth interviews with selected non-government housing providers were conducted to facilitate an understanding of their approach to risk assessment and response in developing and managing affordable housing projects. These developers use an informal risk management process as part of their normal business process, in accordance with industry standards. A simple qualitative matrix has been used to analyse probability and impacts on a qualitative scale: low, medium and high. For housing providers who have considered investing in affordable housing but have not yet implemented any such projects, affordable housing development is seen as an opportunity that needs to be approached with caution. The risks associated with such projects, and the levels of acceptance of these risks, are not consistently identified by current housing providers. Many interviewees agree that the recognition of financial risk and the fear of community rejection of such housing projects have restrained them from committing to such investments.
This study suggests that implementing improvements to the risk mitigation and management framework may assist in promoting the supply of affordable housing by non-government providers.
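The simple likelihood-by-impact matrix described above can be sketched in a few lines. The rating rules and the example risks below are hypothetical illustrations, not those used by the interviewed providers:

```python
# Qualitative risk matrix on low/medium/high scales, as commonly used in
# informal risk management processes; the combination rule is illustrative.
LEVELS = ["low", "medium", "high"]

def risk_rating(likelihood: str, impact: str) -> str:
    """Combine qualitative likelihood and impact into an overall rating."""
    score = LEVELS.index(likelihood) + LEVELS.index(impact)
    if score >= 3:
        return "high"
    if score == 2:
        return "medium"
    return "low"

# Hypothetical risks for an affordable-housing project
risks = {
    "community rejection": ("medium", "high"),
    "financing shortfall": ("medium", "medium"),
    "land availability":   ("low", "high"),
}
ratings = {name: risk_rating(*cells) for name, cells in risks.items()}
```

A matrix like this makes the providers' informal judgements explicit and comparable across projects, which is the improvement to the risk management framework the study suggests.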
Abstract:
Crash risk is the statistical probability of a crash. It can be assessed through ex post statistical analysis or in real time with on-vehicle systems, which can be cooperative. Cooperative Vehicle-Infrastructure Systems (CVIS) are a developing research avenue in the automotive industry worldwide. This paper provides a survey of existing CVIS systems and of methods for assessing crash risk with them, and describes the advantages of cooperative systems over non-cooperative systems. A sample of cooperative crash risk assessment systems is analysed to extract vulnerabilities according to three criteria: market penetration, over-reliance on GPS, and broadcasting issues. The analysis shows that cooperative risk assessment systems are still in their infancy and require further development to provide their full benefits to road users.
Abstract:
Rodenticide use in agriculture can lead to the secondary poisoning of avian predators. Currently the Australian sugarcane industry has two rodenticides, Racumin® and Rattoff®, available for in-crop use but, like many agricultural industries, it lacks an ecologically-based method of determining the potential secondary poisoning risk the use of these rodenticides poses to avian predators. The material presented in this thesis addresses this by: a. determining where predator/prey interactions take place in sugar producing districts; b. quantifying the amount of rodenticide available to avian predators and the probability of encounter; and c. developing a stochastic model that allows secondary poisoning risk under various rodenticide application scenarios to be investigated. Results demonstrate that predator/prey interactions are highly constrained by environmental structure. Rodents used crops that provided high levels of canopy cover and therefore predator protection and poorly utilised open canopy areas. In contrast, raptors over-utilised areas with low canopy cover and low rodent densities, but which provided high accessibility to prey. Given this pattern of habitat use, and that industry baiting protocols preclude rodenticide application in open canopy crops, these results indicate that secondary poisoning can only occur if poisoned rodents leave closed canopy crops and become available for predation in open canopy areas. Results further demonstrate that after in-crop rodenticide application, only a small proportion of rodents available in open areas are poisoned and that these rodents carry low levels of toxicant. Coupled with the low level of rodenticide use in the sugar industry, the high toxic threshold raptors have to these toxicants and the low probability of encountering poisoned rodents, results indicate that the risk of secondary poisoning events occurring is minimal. 
A stochastic model was developed to investigate the effect of manipulating factors that might influence secondary poisoning hazard in a sugarcane agro-ecosystem. These simulations further suggest that in all but extreme scenarios, the risk of secondary poisoning is also minimal. Collectively, these studies demonstrate that secondary poisoning of avian predators associated with the use of the currently available rodenticides in Australian sugar producing districts is minimal. Further, the ecologically-based method of assessing secondary poisoning risk developed in this thesis has broader applications in other agricultural systems where rodenticide use may pose risks to avian predators.
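The kind of stochastic encounter-and-dose model described above can be sketched as a Monte Carlo loop over exposure days. Every probability, residue level and threshold below is a hypothetical placeholder, not a fitted value from the thesis:

```python
import random

def poisoning_risk(n_trials=50_000, seed=3):
    """Monte Carlo sketch: a raptor suffers secondary poisoning only if it
    (1) catches a rodent in open-canopy habitat, (2) that rodent happens to
    be poisoned, and (3) the accumulated toxicant dose exceeds the raptor's
    (high) toxic threshold. All parameters are illustrative placeholders."""
    rng = random.Random(seed)
    events = 0
    for _ in range(n_trials):
        dose = 0.0
        for _day in range(30):                    # 30-day exposure window
            if rng.random() < 0.4:                # catches a rodent in the open
                if rng.random() < 0.05:           # that rodent is poisoned
                    dose += rng.uniform(0.1, 1.0)  # residue per rodent (assumed)
        if dose > 20.0:                           # high toxic threshold (assumed)
            events += 1
    return events / n_trials

risk = poisoning_risk()
```

With parameters of this shape (few poisoned rodents in the open, low residue loads, a high threshold), the simulated exceedance probability is effectively zero, mirroring the thesis's conclusion that secondary poisoning risk is minimal in all but extreme scenarios.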
Abstract:
In this thesis we are interested in financial risk, and the instrument we use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist, and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits; percentiles describe tail behaviour. The estimation of VaR is a complex task, and it is important to know the main characteristics of financial data in order to choose the best model. The existing literature is very wide, at times controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness: is skewness constant, or is there significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach for modelling time-varying volatility (conditional variance) and skewness simultaneously. The new tools are modifications of the Generalised Lambda Distributions (GLDs). These are four-parameter distributions which allow the first four moments to be modelled nearly independently; in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs will be used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs.
Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We used local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the data-generating process (DGP) underlying the series and helps in choosing a good technique to model the data. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness while implicitly assuming the existence of the third moment. The GLDs further suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on the three real indices, ASX 200, S&P 500 and FT 30, with results very similar to those provided by historical simulation.
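The percentile view of VaR used as the benchmark above, historical simulation, can be sketched in a few lines: the VaR at confidence level c is the loss read off the empirical (1 - c) percentile of past returns. The synthetic heavy-tailed return series below is an illustrative stand-in for the real index data:

```python
import random

def historical_var(returns, confidence=0.99):
    """Value-at-Risk by historical simulation: the loss exceeded with
    probability (1 - confidence), taken from the empirical percentile
    of the return sample."""
    s = sorted(returns)                 # ascending: worst returns first
    idx = int((1 - confidence) * len(s))
    return -s[idx]                      # VaR is the negated tail return

# Synthetic daily returns with heavy tails via a two-regime normal
# mixture (a crude stand-in for a Student-t); not real index data.
rng = random.Random(0)
returns = [rng.gauss(0.0005, 0.01 if rng.random() < 0.95 else 0.03)
           for _ in range(2500)]

var_99 = historical_var(returns, confidence=0.99)
```

Because the estimate is a raw empirical percentile, it inherits whatever skewness and tail weight the sample has, which is why the abstract uses it as the comparison point for the GLD-based estimates.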
Abstract:
Objective: Diarrhoea in the enterally tube fed (ETF) intensive care unit (ICU) patient is a multifactorial problem. Diarrhoeal aetiologies in this patient cohort remain debatable; however, the consequences of diarrhoea are well established and include electrolyte imbalance, dehydration, bacterial translocation, peri-anal wound contamination and sleep deprivation. This study examined the incidence of diarrhoea and explored factors contributing to its development in the ETF, critically ill, adult patient. ---------- Method: After institutional ethical review and approval, a single-centre medical chart audit was undertaken to examine the incidence of diarrhoea in ETF, critically ill patients. Retrospective, non-probability sequential sampling was used for all emergency-admission adult ICU patients who met the inclusion/exclusion criteria. ---------- Results: Fifty patients were audited. Faecal frequency, consistency and quantity were considered important criteria in defining ETF diarrhoea. The incidence of diarrhoea was 78%. Total patient diarrhoea days (r = 0.422; p = 0.02) and total diarrhoea frequency (r = 0.313; p = 0.027) increased the longer the patient was ETF. Increased severity of illness, peripheral oxygen saturation (SpO2), glucose control, albumin and white cell count were statistically significant factors for the development of diarrhoea. ---------- Conclusion: Diarrhoea in ETF critically ill patients is multifactorial. The early identification of diarrhoea risk factors and the development of a diarrhoea risk management algorithm are recommended.
Abstract:
Driver distraction is a research area that continues to receive considerable research interest but the drivers’ perspective is less well documented. The current research focuses on how drivers perceive the risks associated with a range of driver distractions with the aim of identifying features that contribute to their risk perception judgements. Multidimensional scaling analysis was employed to better understand drivers’ risk perceptions for 15 in-vehicle and external distractions. Results identify both salient qualitative characteristics that underpin drivers’ risk perceptions, such as the probability of a crash, as well as identify other features inherent in the distractions that may also contribute to risk perceptions. The implications of the results are discussed for better understanding drivers’ perceptions of distractions and the potential for improving road safety messages related to distracted driving.
Abstract:
From a ‘cultural science’ perspective, this paper traces one aspect of a more general shift, from the realist representational regime of modernity to the productive DIY systems of the internet era. It argues that collecting and archiving is transformed by this change. Modern museums – and also broadcast television – were based on determinist or ‘essence’ theory; while internet archives like YouTube (and the internet as an archive) are based on ‘probability’ theory. The paper goes through the differences between modernist ‘essence’ and postmodern ‘probability’; starting from the obvious difference that in a museum each object is selected by experts for its intrinsic properties, while on the internet you don’t know what you will find. The status of individual objects is uncertain, although the productivity of the overall archive is unlimited. The paper links these differences with changes in contemporary culture – from a Newtonian to a quantum universe, progress to risk, institutional structure to evolutionary change, objectivity to uncertainty, identity to performance. Borrowing some of its methodology from science fiction, the paper uses examples from museums and online archives, ranging from the oldest stone tool in the world to the latest tribute vid on the net.