528 results for Probability


Relevance:

10.00%

Publisher:

Abstract:

In this paper, we consider a time-space fractional diffusion equation of distributed order (TSFDEDO). The TSFDEDO is obtained from the standard advection-dispersion equation by replacing the first-order time derivative with the Caputo fractional derivative of order α ∈ (0,1], and the first-order and second-order space derivatives with the Riesz fractional derivatives of orders β₁ ∈ (0,1) and β₂ ∈ (1,2], respectively. We derive the fundamental solution for the TSFDEDO with an initial condition (TSFDEDO-IC). The fundamental solution can be interpreted as a spatial probability density function evolving in time. We also investigate a discrete random walk model based on an explicit finite difference approximation for the TSFDEDO-IC.
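The random-walk interpretation can be illustrated in the simplest special case. The sketch below assumes the integer-order limit (α = 1, β₂ = 2, no advection) rather than the paper's fractional scheme: a symmetric walk whose position histogram approaches the Gaussian fundamental solution, with variance growing linearly in time.

```python
import random

def random_walk_density(steps=200, walkers=20000, p=0.5, seed=1):
    """Simulate a simple symmetric random walk and return sample mean/variance.

    In the integer-order limit (alpha = 1, beta_2 = 2, no advection) the walk's
    position histogram approaches the Gaussian fundamental solution; in lattice
    units the variance after `steps` steps is simply `steps`.
    """
    random.seed(seed)
    positions = []
    for _ in range(walkers):
        x = 0
        for _ in range(steps):
            x += 1 if random.random() < p else -1
        positions.append(x)
    mean = sum(positions) / walkers
    var = sum((x - mean) ** 2 for x in positions) / walkers
    return mean, var
```

The fractional case replaces these equal ±1 jump probabilities with heavy-tailed transition weights derived from the finite difference approximation.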

Relevance:

10.00%

Publisher:

Abstract:

Integrity of Real Time Kinematic (RTK) positioning solutions refers to the level of confidence that can be placed in the information provided by the RTK system. It includes the ability of the RTK system to provide timely, valid warnings to users when the system must not be used for the intended operation. For instance, in a controlled traffic farming (CTF) system, which confines machinery traffic so as to separate wheel beds and root beds, RTK positioning error causes overlap and increases soil compaction. The RTK system's integrity capability can inform users when the actual positional errors of the RTK solutions have exceeded the Horizontal Protection Level (HPL) within a certain Time-To-Alert (TTA) at a given Integrity Risk (IR). The latter is defined as the probability that the system claims normal operational status while actually being in an abnormal status, e.g., the ambiguities being incorrectly fixed and positional errors having exceeded the HPL. The paper studies the required positioning performance (RPP) of GPS positioning systems for precision agriculture applications such as a CTF system, based on a literature review and a survey conducted among a number of farming companies. The HPL and IR are derived from these RPP parameters. An RTK-specific rover autonomous integrity monitoring (RAIM) algorithm is developed to determine the system integrity from real-time outputs such as the residual sum of squares (RSS) and HDOP values. A two-station baseline data set is analyzed to demonstrate the concept of RTK integrity and to assess the RTK solution continuity, missed detection probability and false alarm probability.
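The false alarm component of integrity risk can be illustrated with a minimal Monte Carlo sketch. It assumes a hypothetical monitor that alerts when a unit-variance Gaussian test statistic (standing in for a normalised RSS-based statistic) exceeds a threshold while the system is actually operating normally; the function name and parameter values are illustrative, not part of the RAIM algorithm described above.

```python
import random

def false_alarm_rate(n_epochs=100000, threshold=3.0, seed=42):
    """Estimate the false alarm probability of a simple integrity monitor.

    An alarm is raised whenever the magnitude of a zero-mean, unit-variance
    Gaussian test statistic exceeds `threshold` during normal operation, so
    the returned rate estimates P(alert | system healthy).
    """
    random.seed(seed)
    alarms = sum(1 for _ in range(n_epochs)
                 if abs(random.gauss(0.0, 1.0)) > threshold)
    return alarms / n_epochs
```

For a two-sided 3-sigma threshold the analytic false alarm probability is about 0.27%, which the simulated rate should approach as `n_epochs` grows.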

Relevance:

10.00%

Publisher:

Abstract:

The refractive error of a human eye varies across the pupil and therefore may be treated as a random variable. The probability distribution of this random variable provides a means for assessing the main refractive properties of the eye without the traditional functional representation of wavefront aberrations. To demonstrate this approach, the statistical properties of refractive error maps are investigated. Closed-form expressions are derived for the probability density function (PDF) and its statistical moments for the general case of rotationally symmetric aberrations. A closed-form expression for the PDF of a general non-rotationally-symmetric wavefront aberration is difficult to derive; however, for specific cases, such as astigmatism, a closed-form expression can be obtained. Further, interpretation of the distribution of the refractive error map, as well as its moments, is provided for a range of wavefront aberrations measured in real eyes. These are evaluated using kernel density and sample moment estimators. It is concluded that the refractive error domain allows non-functional analysis of wavefront aberrations based on simple statistics in the form of its sample moments. Clinicians may find this approach to wavefront analysis easier to interpret due to the clinical familiarity and intuitive appeal of refractive error maps.
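The sample-moment and kernel-density machinery mentioned above is simple enough to sketch. Assuming refractive error values sampled over the pupil are available as a flat list (in dioptres), the following hypothetical helpers compute the first four sample moments and a Gaussian kernel density estimate of the PDF:

```python
import math

def sample_moments(values):
    """Mean, variance, skewness and kurtosis of sampled refractive errors."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = math.sqrt(var)
    skew = sum(((v - mean) / std) ** 3 for v in values) / n if std else 0.0
    kurt = sum(((v - mean) / std) ** 4 for v in values) / n if std else 0.0
    return mean, var, skew, kurt

def gaussian_kde(values, bandwidth):
    """Return a Gaussian kernel density estimator for the refractive error PDF."""
    n = len(values)
    c = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))
    return lambda x: c * sum(math.exp(-0.5 * ((x - v) / bandwidth) ** 2)
                             for v in values)
```

In practice the bandwidth would be chosen by a standard rule (e.g. Silverman's), and the samples would be weighted by pupil area.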

Relevance:

10.00%

Publisher:

Abstract:

Background: Reducing rates of healthcare-acquired infection has been identified by the Australian Commission on Safety and Quality in Health Care as a national priority. One of its goals is the prevention of central venous catheter-related bloodstream infection (CR-BSI). At least 3,500 cases of CR-BSI occur annually in Australian hospitals, resulting in unnecessary deaths and costs to the healthcare system of between $25.7 and $95.3 million. Two approaches to preventing these infections have been proposed: the use of antimicrobial catheters (A-CVCs), or a catheter care and management 'bundle'. Given finite healthcare budgets, decisions about the optimal infection control policy require consideration of the effectiveness and value for money of each approach.

Objectives: The aim of this research is to use a rational economic framework to inform efficient infection control policy relating to the prevention of CR-BSI in the intensive care unit. It addresses three questions relating to decision-making in this area: 1. Is additional investment in activities aimed at preventing CR-BSI an efficient use of healthcare resources? 2. What is the optimal infection control strategy from amongst the two major approaches that have been proposed to prevent CR-BSI? 3. What uncertainty is there in this decision, and can a research agenda to improve decision-making in this area be identified?

Methods: A decision-analytic model-based economic evaluation was undertaken to identify an efficient approach to preventing CR-BSI in Queensland Health intensive care units. A Markov model describing the epidemiology and prognosis of CR-BSI was developed in conjunction with a panel of clinical experts. The model was parameterised using data systematically identified from the published literature and extracted from routine databases. The quality of the data used in the model, its validity to clinical experts and its sensitivity to modelling assumptions were assessed.
Two separate economic evaluations were conducted. The first compared all commercially available A-CVCs alongside uncoated catheters to identify which was cost-effective for routine use. The uncertainty in this decision was estimated, along with the value of collecting further information to inform the decision. The second evaluation compared the use of A-CVCs to a catheter care bundle. We were unable to estimate the cost of the bundle because it is unclear what the full resource requirements for its implementation are, and what their value would be in an Australian context. We therefore undertook a threshold analysis to identify the cost and effectiveness thresholds at which a hypothetical bundle would dominate the use of A-CVCs under various clinical scenarios.

Results: In the first evaluation of A-CVCs, the findings from the baseline analysis, in which uncertainty is not considered, show that the use of any of the four A-CVCs will result in health gains accompanied by cost-savings. The minocycline/rifampicin (MR) catheters dominate the baseline analysis, generating 1.64 QALYs and cost-savings of $130,289 per 1,000 catheters. With uncertainty, and based on current information, the MR catheters remain the optimal decision and return the highest average net monetary benefit ($948 per catheter) relative to all other catheter types. This conclusion was robust to all scenarios tested; however, the probability of error in this conclusion is high, 62% in the baseline scenario. Using a value of $40,000 per QALY, the expected value of perfect information associated with this decision is $7.3 million. An analysis of the expected value of perfect information for individual parameters suggests that it may be worthwhile for future research to focus on providing better estimates of the mortality attributable to CR-BSI and the effectiveness of both SPC and CH/SSD (int/ext) catheters.
In the second evaluation, of the catheter care bundle relative to A-CVCs, the results that do not consider uncertainty indicate that a bundle must achieve a relative risk of CR-BSI of 0.45 or lower to be cost-effective relative to MR catheters. If the bundle can reduce rates of infection from 2.5% to effectively zero, it is cost-effective relative to MR catheters if national implementation costs are less than $2.6 million ($56,610 per ICU). If the bundle can achieve a relative risk of 0.34 (comparable to that reported in the literature), it is cost-effective relative to MR catheters if costs over an 18-month period are below $613,795 nationally ($13,343 per ICU). Once uncertainty in the decision is considered, the cost threshold for the bundle increases to $2.2 million. Therefore, if each of the 46 Level III ICUs could implement an 18-month catheter care bundle for less than $47,826, this approach would be cost-effective relative to A-CVCs. However, the uncertainty is substantial, and the probability of error in concluding that the bundle is the cost-effective approach at a cost of $2.2 million is 89%.

Conclusions: This work highlights that infection control to prevent CR-BSI is an efficient use of healthcare resources in the Australian context. If there is no further investment in infection control, an opportunity cost is incurred: the potential for a more efficient healthcare system. Minocycline/rifampicin catheters are the optimal choice of antimicrobial catheter for routine use in Australian Level III ICUs; however, if a catheter care bundle implemented in Australia were as effective as those used in the large studies in the United States, it would be preferred over the catheters if it could be implemented for less than $47,826 per Level III ICU. Uncertainty is very high in this decision and arises from multiple sources.
There are likely greater costs to this uncertainty for A-CVCs, which may carry hidden costs, than for a catheter care bundle, which is more likely to provide indirect benefits to clinical practice and patient safety. Research into the mortality attributable to CR-BSI, the effectiveness of SPC and CH/SSD (int/ext) catheters, and the cost and effectiveness of a catheter care bundle in Australia should be prioritised to reduce uncertainty in this decision. This thesis provides the economic evidence to inform one area of infection control, but there are many other infection control decisions for which information about the cost-effectiveness of competing interventions does not exist. This work highlights some of the challenges and benefits of generating and using economic evidence for infection control decision-making, and provides support for commissioning more research into the cost-effectiveness of infection control.
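The decision rule underlying these comparisons is the standard net monetary benefit identity, NMB = λ × QALYs gained − incremental cost, where λ is the willingness-to-pay per QALY. The sketch below reproduces the baseline arithmetic for the MR catheters at the $40,000 per QALY threshold used in the thesis; it is illustrative arithmetic only, not the Markov model itself.

```python
def net_monetary_benefit(qalys_gained, incremental_cost, wtp=40000.0):
    """Net monetary benefit: value QALYs at willingness-to-pay, subtract cost.

    A positive NMB means the intervention is cost-effective at that threshold;
    cost-savings enter as a negative incremental cost.
    """
    return wtp * qalys_gained - incremental_cost

# Baseline figures from the first evaluation: 1.64 QALYs gained and
# $130,289 saved per 1,000 catheters for the MR catheters.
nmb_per_1000 = net_monetary_benefit(1.64, -130289.0)
```

An intervention "dominates" when it both gains QALYs and saves money, as here, so its NMB is positive at any non-negative threshold.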

Relevance:

10.00%

Publisher:

Abstract:

Engineering assets are often complex systems. In a complex system, components often have failure interactions which lead to interactive failures, and a system with interactive failures may have an increased failure probability. Hence, interactive failures may need to be taken into account when designing and maintaining complex engineering systems. To address this issue, Sun et al. have developed an analytical model of interactive failures, in which the degree of interaction between two components is represented by interactive coefficients. To use this model for failure analysis, the related interactive coefficients must be estimated; however, methods for estimating them have not been reported. To fill this gap, this paper presents five methods for estimating the interactive coefficients: a probabilistic method; a failure-data-based analysis method; a laboratory experimental method; a failure-interaction-mechanism-based method; and an expert estimation method. Examples are given to demonstrate the applications of the proposed methods, and comparisons among the methods are also presented.
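As an illustration of what a failure-data-based estimate might look like, the hypothetical estimator below compares the conditional failure probability of one component given the failure of another with its unconditional failure probability. This is a sketch of the general idea only, not Sun et al.'s actual estimator or coefficient definition.

```python
def interactive_coefficient(p_joint, p_a, p_b):
    """Hypothetical failure-data-based estimate of an interaction strength.

    Compares P(B fails | A failed) = p_joint / p_a with the unconditional
    P(B fails) = p_b; returns 0 when the failures are independent and a
    positive value when A's failure raises B's failure probability.
    """
    p_b_given_a = p_joint / p_a
    return (p_b_given_a - p_b) / p_b
```

With `p_joint = 0.02`, `p_a = p_b = 0.1`, the conditional probability doubles the unconditional one, giving a coefficient of 1.0.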

Relevance:

10.00%

Publisher:

Abstract:

A Split System Approach (SSA) based methodology is presented to assist in making optimal Preventive Maintenance (PM) decisions for serial production lines. The methodology treats a production line as a complex series system with multiple PM actions over multiple intervals. Both risk-related cost and maintenance-related cost are factored into the methodology as either deterministic or random variables. This SSA-based methodology enables Asset Management (AM) decisions to be optimized considering a variety of factors, including failure probability, failure cost, maintenance cost, PM performance, and the type of PM strategy. The application of this new methodology and an evaluation of the effects of these factors on PM decisions are demonstrated using an example. The results of this work show that the performance of a PM strategy can be measured by its Total Expected Cost Index (TECI). The optimal PM interval depends on TECI, PM performance and the type of PM strategy, and these factors are interrelated. Generally, it was found that a trade-off between reliability and the number of PM actions needs to be made in order to minimize the Total Expected Cost (TEC) of asset maintenance.
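The trade-off between failure risk and PM frequency can be sketched with a textbook age-replacement model (not the SSA/TECI formulation itself): with Weibull failure times, the expected cost per unit time balances the planned PM cost against the cost of in-service failure, and the optimal interval minimises that rate. All parameter values below are hypothetical.

```python
import math

def expected_cost_rate(t, cp, cf, eta, beta):
    """Expected cost per unit time when PM is performed at interval t.

    Weibull failure time (scale eta, shape beta); cp = planned PM cost,
    cf = failure (risk-related) cost. Expected cycle length E[min(T, t)]
    is integrated numerically with a left Riemann sum.
    """
    n = 1000
    dt = t / n
    survival = lambda x: math.exp(-((x / eta) ** beta))
    cycle_len = sum(survival(i * dt) * dt for i in range(n))
    p_fail = 1.0 - survival(t)
    return (cp * (1.0 - p_fail) + cf * p_fail) / cycle_len

def optimal_interval(cp, cf, eta, beta, grid):
    """Grid-search the PM interval minimising the expected cost rate."""
    return min(grid, key=lambda t: expected_cost_rate(t, cp, cf, eta, beta))
```

With wear-out failures (beta > 1) and failures much costlier than planned PM, the optimum sits well below the characteristic life, mirroring the reliability-versus-number-of-PM-actions trade-off noted above.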

Relevance:

10.00%

Publisher:

Abstract:

Rodenticide use in agriculture can lead to the secondary poisoning of avian predators. Currently the Australian sugarcane industry has two rodenticides, Racumin® and Rattoff®, available for in-crop use but, like many agricultural industries, it lacks an ecologically based method of determining the potential secondary poisoning risk these rodenticides pose to avian predators. The material presented in this thesis addresses this by: (a) determining where predator/prey interactions take place in sugar-producing districts; (b) quantifying the amount of rodenticide available to avian predators and the probability of encounter; and (c) developing a stochastic model that allows secondary poisoning risk under various rodenticide application scenarios to be investigated. Results demonstrate that predator/prey interactions are highly constrained by environmental structure. Rodents used crops that provided high levels of canopy cover, and therefore predator protection, but made little use of open-canopy areas. In contrast, raptors over-utilised areas with low canopy cover and low rodent densities but high accessibility to prey. Given this pattern of habitat use, and given that industry baiting protocols preclude rodenticide application in open-canopy crops, these results indicate that secondary poisoning can occur only if poisoned rodents leave closed-canopy crops and become available for predation in open-canopy areas. Results further demonstrate that after in-crop rodenticide application, only a small proportion of the rodents available in open areas are poisoned, and that these rodents carry low levels of toxicant. Coupled with the low level of rodenticide use in the sugar industry, the high toxic threshold of raptors to these toxicants and the low probability of encountering poisoned rodents, these results indicate that the risk of secondary poisoning events occurring is minimal.
A stochastic model was developed to investigate the effect of manipulating factors that might influence secondary poisoning hazard in a sugarcane agro-ecosystem. These simulations further suggest that in all but extreme scenarios, the risk of secondary poisoning is also minimal. Collectively, these studies demonstrate that secondary poisoning of avian predators associated with the use of the currently available rodenticides in Australian sugar producing districts is minimal. Further, the ecologically-based method of assessing secondary poisoning risk developed in this thesis has broader applications in other agricultural systems where rodenticide use may pose risks to avian predators.
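The stochastic model itself is not reproduced here, but the core encounter calculation can be sketched as a small Monte Carlo simulation. All parameter values (captures per day, fraction of available rodents carrying toxicant) are hypothetical placeholders, not the thesis's fitted values.

```python
import random

def encounter_probability(n_days=10000, poisoned_fraction=0.05,
                          daily_captures=2, seed=7):
    """Estimate the per-day probability of at least one poisoned capture.

    Each day the raptor makes `daily_captures` captures from rodents available
    in open-canopy areas; each capture is poisoned independently with
    probability `poisoned_fraction`. Returns the fraction of simulated days
    with at least one poisoned capture.
    """
    random.seed(seed)
    days_with_event = 0
    for _ in range(n_days):
        if any(random.random() < poisoned_fraction
               for _ in range(daily_captures)):
            days_with_event += 1
    return days_with_event / n_days
```

The simulated rate should approach the analytic value 1 − (1 − p)^captures, and scaling `poisoned_fraction` down (as the field data suggest) drives the risk toward zero.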

Relevance:

10.00%

Publisher:

Abstract:

Secondary tasks such as cell phone calls or interaction with automated speech dialog systems (SDSs) increase the driver's cognitive load as well as the probability of driving errors. This study analyzes speech production variations due to cognitive load and emotional state of drivers in real driving conditions. Speech samples were acquired from 24 female and 17 male subjects (approximately 8.5 h of data) while talking to a co-driver and communicating with two automated call centers, with emotional states (neutral, negative) and the number of necessary SDS query repetitions also labeled. A consistent shift in a number of speech production parameters (pitch, first formant center frequency, spectral center of gravity, spectral energy spread, and duration of voiced segments) was observed when comparing SDS interaction against co-driver interaction; further increases were observed when considering negative emotion segments and the number of requested SDS query repetitions. A mel-frequency cepstral coefficient (MFCC) based Gaussian mixture classifier trained on 10 male and 10 female sessions provided 91% accuracy on the open test set task of distinguishing co-driver interactions from SDS interactions, suggesting, together with the acoustic analysis, that it is possible to monitor the level of driver distraction directly from speech.
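The classification step can be illustrated with a stripped-down stand-in for the MFCC-based Gaussian mixture classifier: one single-component Gaussian per interaction type, scored by log-likelihood. The feature values and model parameters below are hypothetical, chosen only to show the decision rule.

```python
import math

def gaussian_loglik(x, mean, var):
    """Log-likelihood of a scalar feature under a 1-D Gaussian model."""
    return -0.5 * (math.log(2.0 * math.pi * var) + (x - mean) ** 2 / var)

def classify(x, sds_model, codriver_model):
    """Label an utterance feature by whichever model gives higher likelihood.

    Each model is a (mean, variance) pair; a real system would use full
    multi-component GMMs over MFCC vectors rather than a single scalar.
    """
    if gaussian_loglik(x, *sds_model) > gaussian_loglik(x, *codriver_model):
        return "SDS"
    return "co-driver"
```

A full implementation would accumulate log-likelihoods over all frames of an utterance before comparing the two models.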

Relevance:

10.00%

Publisher:

Abstract:

The expansion of economics into 'non-market' topics has received increased attention in recent years; the economics of sport (football) is one such sub-field. This paper reports empirical evidence on team and referee performances in the FIFA World Cup 2002. The results reveal that being a host nation has a significant impact on the probability of winning a game. Furthermore, the strength of a team as measured by the FIFA World Ranking does not play the important role presumed, which indicates that an element of outcome uncertainty is at work. The findings also indicate that the influence of the referee on the game result should not be neglected. Finally, previous World Cup experience appears to have the strongest impact on referees' performances during the game.

Relevance:

10.00%

Publisher:

Abstract:

Recent observations of particle size distributions and particle concentrations near a busy road cannot be explained by the conventional mechanisms for the evolution of combustion aerosols. Specifically, those mechanisms appear inadequate to explain the experimental observations of particle transformation and the evolution of the total number concentration. This led to the development of a new mechanism, based on thermal fragmentation, for the evolution of combustion aerosol nanoparticles. A complex and comprehensive pattern of evolution of combustion aerosols, involving particle fragmentation, was then proposed and justified. In that model it was suggested that thermal fragmentation occurs in aggregates of primary particles, each of which contains a solid graphite/carbon core surrounded by volatile molecules bonded to the core by strong covalent bonds. Due to these strong covalent bonds between the core and the volatile (frill) molecules, such primary composite particles can be regarded as solid, despite the presence of a significant (possibly dominant) volatile component. Fragmentation occurs when the weak van der Waals forces between such primary particles are overcome by their thermal (Brownian) motion. In this work, the accepted concept of thermal fragmentation is advanced to determine whether fragmentation is likely in liquid composite nanoparticles. It has been demonstrated that, at least at some stages of evolution, combustion aerosols contain a large number of composite liquid particles containing several components, such as water, oil, volatile compounds, and minerals. It is possible that such composite liquid particles may also experience thermal fragmentation and thus contribute to, for example, the evolution of the total number concentration as a function of distance from the source.
Therefore, the aim of this project is to examine theoretically the possibility of thermal fragmentation of composite liquid nanoparticles consisting of immiscible liquid components. The specific focus is on ternary systems comprising two immiscible liquid droplets surrounded by another medium (e.g., air). The analysis shows that three different structures are possible: complete encapsulation of one liquid by the other, partial encapsulation of the two liquids in a composite particle, and two droplets separated from each other. The probability of thermal fragmentation of two coagulated liquid droplets is examined for different volumes of the immiscible fluids in a composite liquid particle, and for different surface and interfacial tensions, through the determination of the Gibbs free energy difference between the coagulated and fragmented states and comparison of this energy difference with the typical thermal energy kT. The analysis reveals that fragmentation is much more likely for a partially encapsulated particle than for a completely encapsulated particle. In particular, it was found that thermal fragmentation is much more likely when the volumes of the two liquid droplets that constitute the composite particle are very different; conversely, when the two liquid droplets are of similar volumes, the probability of thermal fragmentation is small. It is also demonstrated that the Gibbs free energy difference between the coagulated and fragmented states is not the only important factor determining the probability of thermal fragmentation of composite liquid particles. The second essential factor is the actual structure of the composite particle. It is shown that the probability of thermal fragmentation also depends strongly on the distance that each of the liquid droplets must travel to reach the fragmented state.
In particular, if this distance is larger than the mean free path of the considered droplets in air, the probability of thermal fragmentation should be negligible. It follows that fragmentation of a composite particle in the state of complete encapsulation is highly unlikely because of the larger distance that the two droplets must travel in order to separate. The analysis of composite liquid particles with the interfacial parameters expected in combustion aerosols demonstrates that thermal fragmentation of these particles may occur, and this mechanism may play a role in the evolution of combustion aerosols. Conditions for thermal fragmentation to play a significant role (for aerosol particles other than those from motor vehicle exhaust) are determined and examined theoretically. Conditions for spontaneous transformation between the states of composite particles with complete and partial encapsulation are also examined, demonstrating the possibility of such transformation in combustion aerosols. Indeed, it was shown that for some typical components found in aerosols this transformation could take place on time scales of less than 20 s. The analysis showed that factors influencing surface and interfacial tension play an important role in this transformation process. It is suggested that such transformation may, for example, result in delayed evaporation of composite particles with a significant water component, leading to observable effects in the evolution of combustion aerosols (including possible local humidity maxima near a source, such as a busy road). The obtained results will be important for the further development and understanding of aerosol physics and technologies, including combustion aerosols and their evolution near a source.
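The central comparison, Gibbs free energy difference versus thermal energy kT, reduces to a Boltzmann factor. A minimal sketch (illustrative only; the actual analysis also accounts for the travel distance relative to the mean free path and the particle structure):

```python
import math

def fragmentation_ratio(delta_g_joules, temperature_kelvin=300.0):
    """Relative thermal weight of the fragmented vs the coagulated state.

    Boltzmann factor exp(-dG / kT), where dG is the Gibbs free energy
    difference between the fragmented and coagulated states. Fragmentation
    is thermally accessible when dG is comparable to, or smaller than, kT.
    """
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    return math.exp(-delta_g_joules / (k_b * temperature_kelvin))
```

For dG = kT the ratio is e⁻¹ ≈ 0.37, while dG of tens of kT makes fragmentation vanishingly improbable, which is why the interfacial tensions and droplet volume ratio matter so much.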

Relevance:

10.00%

Publisher:

Abstract:

Background: Currently used Trauma and Injury Severity Score (TRISS) coefficients, which measure probability of survival (Ps), were derived from the Major Trauma Outcome Study (MTOS) in 1995 and are now unlikely to be optimal. This study aims to estimate new TRISS coefficients using a contemporary database of injured patients presenting to emergency departments in the United States, and to compare these against the MTOS coefficients.

Methods: Data were obtained from the National Trauma Data Bank (NTDB) and the NTDB National Sample Project (NSP). TRISS coefficients were estimated using logistic regression. Separate coefficients were derived from complete-case and multistage multiple-imputation analyses for each of the NTDB and NSP datasets. Associated Ps over Injury Severity Score values were graphed and compared by age group (adult ≥ 15 years; pediatric < 15 years) and injury mechanism (blunt; penetrating). Area under the Receiver Operating Characteristic (ROC) curve was used to assess the coefficients' predictive performance.

Results: Overall, 1,072,033 NTDB and 1,278,563 weighted NSP injury events were included, compared with 23,177 used in the original MTOS analyses. Large differences were seen between results from complete-case and imputed analyses. For blunt mechanism and adult penetrating mechanism injuries, there were similarities between coefficients estimated on imputed samples, and marked divergences between the associated Ps estimates and those from the MTOS. However, negligible differences existed between area under the ROC curve estimates, because the overwhelming majority of patients had minor trauma and survived. For pediatric penetrating mechanism injuries, variability in coefficients was large and Ps estimates unreliable.

Conclusions: Imputed NTDB coefficients are recommended as the 2009 revision of the TRISS coefficients for blunt mechanism and adult penetrating mechanism injuries.
Coefficients for pediatric penetrating mechanism injuries could not be reliably estimated.
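TRISS computes Ps from a logistic model, b = b0 + b1·RTS + b2·ISS + b3·AgeIndex, with a coefficient set specific to each mechanism and age group; this is the model refit by logistic regression above. The sketch below implements that formula; the coefficient values in the test are placeholders, not the MTOS or revised coefficients.

```python
import math

def probability_of_survival(rts, iss, age_index, coeffs):
    """TRISS probability of survival: Ps = 1 / (1 + exp(-b)).

    b = b0 + b1 * RTS + b2 * ISS + b3 * AgeIndex, where RTS is the Revised
    Trauma Score, ISS the Injury Severity Score, and AgeIndex the age-group
    indicator. `coeffs` = (b0, b1, b2, b3) for the relevant mechanism/age
    group; values here are placeholders for the published coefficient sets.
    """
    b0, b1, b2, b3 = coeffs
    b = b0 + b1 * rts + b2 * iss + b3 * age_index
    return 1.0 / (1.0 + math.exp(-b))
```

With any coefficient set of the usual signs (b1 > 0, b2 < 0), Ps rises with RTS and falls with ISS, which is what the graphed Ps-over-ISS comparisons above exploit.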

Relevance:

10.00%

Publisher:

Abstract:

In this thesis, an investigation into theoretical models for the formation and interaction of nanoparticles is presented. The work includes a literature review of current models followed by five chapters of original research. This thesis has been submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy by publication, and each of the five chapters therefore consists of a peer-reviewed journal article. The thesis concludes with a discussion of what has been achieved during the PhD candidature, the potential applications of this research, and ways in which the research could be extended in the future. In this thesis we explore stochastic models pertaining to the interaction and evolution mechanisms of nanoparticles. In particular, we explore in depth the stochastic evaporation of molecules due to thermal activation and its ultimate effect on nanoparticle sizes and concentrations. Secondly, we analyse the thermal vibrations of nanoparticles suspended in a fluid and subject to standing oscillating drag forces (as would occur in a standing sound wave), and finally the motion of particles on lattice surfaces in the presence of high heat gradients. We describe a number of new models for multicompartment networks joined by multiple, stochastically evaporating links. The primary motivation for this work is the description of thermal fragmentation, in which multiple molecules holding parts of a carbonaceous nanoparticle together may evaporate. Ultimately, these models predict the rate at which the network or aggregate fragments into smaller networks/aggregates, and with what aggregate size distribution. The models are highly analytic and describe the fragmentation of a link holding multiple bonds using Markov processes that best describe different physical situations; these processes have been analysed using a number of mathematical methods.
The fragmentation of the network/aggregate is then predicted using combinatorial arguments. Whilst there is some scepticism in the scientific community pertaining to the proposed mechanism of thermal fragmentation, we have presented compelling evidence in this thesis supporting the currently proposed mechanism and shown that our models can accurately match experimental results. This was achieved using a realistic simulation of the fragmentation of the fractal carbonaceous aggregate structure using our models. Furthermore, in this thesis a method of manipulation using acoustic standing waves is investigated. In our investigation we analysed the effect of frequency and particle size on the ability of a particle to be manipulated by means of a standing acoustic wave. In our results, we report the existence of a critical frequency for a particular particle size; this frequency is inversely proportional to the Stokes time of the particle in the fluid. We also find that at large frequencies the subtle Brownian motion of even larger particles plays a significant role in the efficacy of the manipulation. This is due to the decreasing size of the boundary layer between acoustic nodes. Our model utilises a multiple-time-scale approach to calculate the long-term effects of the standing acoustic field on the particles interacting with the sound. These effects are then combined with the effects of Brownian motion in order to obtain a complete mathematical description of the particle dynamics in such acoustic fields. Finally, in this thesis, we develop a numerical routine for the description of "thermal tweezers". Currently, the technique of thermal tweezers is predominantly theoretical; however, there have been a handful of successful experiments demonstrating the effect in practice. Thermal tweezers is the name given to the way in which particles can be easily manipulated on a lattice surface by careful selection of a heat distribution over the surface.
Typically, theoretical simulations of the effect can be rather time-consuming, with supercomputer facilities processing data over days or even weeks. Our alternative numerical method for simulating particle distributions in the thermal tweezers effect uses the Fokker-Planck equation to derive a quick numerical method for calculating the effective diffusion constant that results from the lattice and the temperature. We then use this diffusion constant and solve the diffusion equation numerically using the finite volume method. This saves the algorithm from calculating many individual particle trajectories, since it describes the flow of the probability distribution of particles in a continuous manner. The alternative method outlined in this thesis can produce a larger quantity of accurate results on a household PC in a matter of hours, which is much better than was previously achievable.
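The finite-volume step described above can be sketched for the one-dimensional diffusion equation. Assuming the effective diffusion constant has already been obtained from the Fokker-Planck reduction, the scheme below conserves total probability with zero-flux boundaries (the function and parameter names are illustrative):

```python
def diffuse_fv(u, d_eff, dx, dt, steps):
    """Explicit finite-volume update for the 1-D diffusion equation.

    u: list of cell-averaged probability density values; d_eff: effective
    diffusion constant from the Fokker-Planck reduction. Fluxes are computed
    on cell faces, with zero flux at both boundaries, so total probability
    is conserved exactly. Stable when d_eff * dt / dx**2 <= 0.5.
    """
    n = len(u)
    for _ in range(steps):
        flux = [0.0] * (n + 1)  # flux across each face; boundaries stay 0
        for i in range(1, n):
            flux[i] = -d_eff * (u[i] - u[i - 1]) / dx
        u = [u[i] - (dt / dx) * (flux[i + 1] - flux[i]) for i in range(n)]
    return u
```

Starting from a point-like initial distribution, the density spreads toward uniformity while its integral stays fixed, which is exactly the continuous probability flow the thesis substitutes for trajectory-by-trajectory simulation.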

Relevance:

10.00%

Publisher:

Abstract:

Information Retrieval (IR) is an important albeit imperfect component of information technologies. The problem of insufficient diversity of retrieved documents is one of the primary issues studied in this research. This study shows that this problem leads to a decrease in precision and recall, the traditional measures of information retrieval effectiveness. This thesis presents an adaptive IR system based on the theory of adaptive dual control. The aim of the approach is the optimization of retrieval precision after all feedback has been issued. This is done by increasing the diversity of retrieved documents, and this study shows that the value of recall reflects this diversity. The Probability Ranking Principle is viewed in the literature as the "bedrock" of current probabilistic Information Retrieval theory. Neither the proposed approach nor other methods from the literature for diversifying retrieved documents conform to this principle. This study shows by counterexample that the Probability Ranking Principle does not in general lead to optimal precision in a search session with feedback (a setting for which it may not have been designed, but in which it is actively used). To accomplish the aim, retrieval precision over the search session should be optimized with a multistage stochastic programming model; however, such models are computationally intractable. Therefore, approximate linear multistage stochastic programming models are derived in this study, in which the multistage improvement of the probability distribution is modelled using the proposed feedback correctness method. The proposed optimization models are based on several assumptions, starting with the assumption that Information Retrieval is conducted in units of topics. The use of clusters is the primary reason why a new method of probability estimation is proposed. The adaptive dual control of the topic-based IR system was evaluated in a series of experiments conducted on the Reuters, Wikipedia and TREC collections of documents.
The Wikipedia experiment revealed that the dual control feedback mechanism improves precision and S-recall when all the underlying assumptions are satisfied. In the TREC experiment, this feedback mechanism was compared to a state-of-the-art adaptive IR system based on BM-25 term weighting and the Rocchio relevance feedback algorithm. The baseline system exhibited better effectiveness than the cluster-based optimization model of ADTIR; the main reason for this was the insufficient quality of the clusters generated from the TREC collection, which violated the underlying assumption.
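The diversity measure used in the Wikipedia experiment, S-recall (subtopic recall), is easy to state: the fraction of a query's subtopics covered by the top-k retrieved documents. The toy example below, with hypothetical documents and subtopic labels, also illustrates the general point that a ranking ordered by relevance probability alone can lose to a diversified one on this measure:

```python
def s_recall(ranking, k, doc_subtopics, all_subtopics):
    """Subtopic recall at rank k: fraction of subtopics covered by top-k docs."""
    covered = set()
    for doc in ranking[:k]:
        covered |= doc_subtopics[doc]
    return len(covered) / len(all_subtopics)

# Hypothetical collection: d1 and d2 are near-duplicates on subtopic 'a'.
doc_subtopics = {"d1": {"a"}, "d2": {"a"}, "d3": {"b"}}
prp_ranking = ["d1", "d2", "d3"]   # ordered by relevance probability alone
diversified = ["d1", "d3", "d2"]   # trades some probability for coverage
```

At k = 2, the probability-ordered ranking covers only one of the two subtopics while the diversified ranking covers both, which is the intuition behind optimizing the session rather than each rank independently.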

Relevance:

10.00%

Publisher:

Abstract:

Carlin and Finch, this issue, compare goodwill impairment discount rates used by a sample of large Australian firms with ‘independently’ generated discount rates. Their objective is to empirically determine whether managers opportunistically select goodwill discount rates subsequent to the 2005 introduction of International Financial Reporting Standards (IFRS) in Australia. This is a worthwhile objective given that IFRS introduced an impairment regime, and within this regime, discount rate selection plays a key role in goodwill valuation decisions. It is also timely to consider the goodwill valuation issue. Following the recent downturn in the economy, there is a high probability that many firms will be forced to write down impaired goodwill arising from boom period acquisitions. Hence, evidence of bias in rate selection is likely to be of major concern to investors, policymakers and corporate regulators. Carlin and Finch claim their findings provide evidence of such bias. In this commentary I review the validity of their claims.