93 results for reasonable accuracy
Abstract:
Recent efforts in mission planning for underwater vehicles have utilised predictive models to aid navigation and optimal path planning and to drive opportunistic sampling. Although these models provide information at unprecedented resolutions and have proven to increase accuracy and effectiveness in multiple campaigns, most are deterministic in nature. Thus, predictions cannot be incorporated into probabilistic planning frameworks, nor do they provide any metric on the variance or confidence of the output variables. In this paper, we provide an initial investigation into determining the confidence of ocean model predictions based on the results of multiple field deployments of two autonomous underwater vehicles. For multiple missions conducted over a two-month period in 2011, we compare actual vehicle executions to simulations of the same missions through the Regional Ocean Modeling System in an ocean region off the coast of southern California. This comparison provides a qualitative analysis of the current velocity predictions for areas within the selected deployment region. Ultimately, we present a spatial heat-map of the correlation between the ocean model predictions and the actual mission executions. Knowledge of where the model provides unreliable predictions can be incorporated into planners to increase the utility and applicability of the deterministic estimations.
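To make the comparison concrete, the sketch below shows one plausible way to build such a spatial correlation heat-map: bin co-located predicted and observed current speeds onto a coarse grid and compute a per-cell Pearson correlation. The file name, column names and grid size are assumptions for illustration, not the paper's actual data format.

```python
# Hypothetical sketch: per-cell correlation between model-predicted and
# AUV-observed current speeds, rendered as a spatial heat map.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("missions.csv")  # columns assumed: lat, lon, v_pred, v_obs

# Bin observations onto a coarse lat/lon grid.
lat_bins = np.linspace(df.lat.min(), df.lat.max(), 20)
lon_bins = np.linspace(df.lon.min(), df.lon.max(), 20)
df["i"] = np.digitize(df.lat, lat_bins)
df["j"] = np.digitize(df.lon, lon_bins)

corr = np.full((len(lat_bins) + 1, len(lon_bins) + 1), np.nan)
for (i, j), cell in df.groupby(["i", "j"]):
    if len(cell) > 2:  # need a few samples for a meaningful correlation
        corr[i, j] = np.corrcoef(cell.v_pred, cell.v_obs)[0, 1]

plt.imshow(corr, origin="lower", cmap="RdYlGn", vmin=-1, vmax=1)
plt.colorbar(label="prediction / observation correlation")
plt.title("Spatial reliability of the ocean model")
plt.show()
```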
Abstract:
The development and design of high-power electric devices with electromagnetic computer-aided engineering (EM-CAE) software, based on methods such as the Finite Element Method (FEM) and the Boundary Element Method (BEM), have been widely adopted. This paper presents the analysis of a Fault Current Limiter (FCL), which acts as a high-voltage surge protector for power grids. A prototype FCL was built. The magnetic flux in the core and the resulting electromagnetic forces in the winding of the FCL were analyzed using both FEM and BEM. An experiment on the prototype was conducted in a laboratory. The data obtained from the experiment are compared to the numerical solutions to determine the suitability and accuracy of the two methods.
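For context, force computations of this kind typically start from the Lorentz force density acting on the current-carrying winding; whether the paper uses this formulation or, for example, a Maxwell stress tensor approach is not stated, so the expression below is only the standard textbook form.

```latex
% Lorentz force density on the winding and the resulting total force,
% obtained by integrating over the winding volume V.
\[
  \mathbf{f} = \mathbf{J} \times \mathbf{B},
  \qquad
  \mathbf{F}_{\text{winding}} = \int_{V} \mathbf{J} \times \mathbf{B} \,\mathrm{d}V .
\]
```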
Abstract:
The Australian Disability Standards for Education 2005 (Cth) require education providers to make reasonable adjustments in educational assessment so that students with disability can participate on the same basis as other students and be able to demonstrate what they know and can do. Reasonableness is governed by a determination of the balance of interests, benefits and detriment to the parties involved. The Standards require providers to consult with students and associates on adjustments, although guidance on how consultation should occur and how the views of students and associates are to be taken into account is vague. In this article, we identify three principles to be considered in order to put appropriate and effective reasonable adjustments in assessment into practice. While Australian law and assessment contexts are used to examine intentions, expectations and practices in educational assessment for students with disability, we argue that these three principles must be considered in any national education system to ensure equitable assessment practices and achieve equitable educational inclusion for students with disability.
Abstract:
This paper explains the legislation which underpins the right to reasonable adjustment in education for students with disabilities in Australian schools. It gives examples of the kinds of adjustment which may be made to promote equality of opportunity in the area of assessment. It also considers how the law has constructed the border between reasonable adjustment and academic integrity.
Abstract:
Objectives: The relationship between performance variability and accuracy in cricket fast bowlers of different skill levels under three different task conditions was investigated. Bowlers of different skill levels were examined to observe whether they could adapt movement patterns to maintain performance accuracy on a bowling skills test. Design: 8 national, 12 emerging and 12 junior pace bowlers completed an adapted version of the Cricket Australia bowling skills test, in which they performed 30 trials involving short (n = 10), good (n = 10), and full (n = 10) length deliveries. Methods: Bowling accuracy was recorded by digitising ball position relative to the centre of a target. Performance measures were mean radial error (accuracy), variable error (consistency), centroid error (bias), bowling score and ball speed. Radial error changes across the duration of the skills test were used to record accuracy adjustment in subsequent deliveries. Results: Elite fast bowlers performed better in speed, accuracy, and test scores than developing athletes. Bowlers who were less variable were also more accurate across all delivery lengths. National and emerging bowlers were able to adapt subsequent performance trials within the same bowling session for short length deliveries. Conclusions: Accuracy and adaptive variability were key components of elite performance in fast bowling which improved with skill level. In this study, only national elite bowlers showed the requisite levels of adaptive variability to bowl a range of lengths to different pitch locations.
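A minimal sketch of the three error measures named above, assuming each delivery's landing position is recorded as an (x, y) offset in metres from the target centre; the exact operational definitions used in the study may differ.

```python
import numpy as np

def bowling_errors(points):
    """Return (mean radial error, variable error, centroid error) for 2-D offsets."""
    pts = np.asarray(points, dtype=float)          # shape (n, 2), metres from target
    radial = np.linalg.norm(pts, axis=1)
    mean_radial_error = radial.mean()              # accuracy
    centroid = pts.mean(axis=0)
    centroid_error = np.linalg.norm(centroid)      # bias
    variable_error = np.sqrt(((pts - centroid) ** 2).sum(axis=1).mean())  # consistency
    return mean_radial_error, variable_error, centroid_error

# Placeholder deliveries, not data from the study.
print(bowling_errors([(0.1, -0.2), (0.3, 0.0), (-0.1, 0.25)]))
```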
Abstract:
One of the primary desired capabilities of any future air traffic separation management system is the ability to provide early conflict detection and resolution effectively and efficiently. In this paper, we consider the risk of conflict as a primary measurement to be used for early conflict detection. This paper focuses on developing a novel approach to assess the impact of different measurement uncertainty models on the estimated risk of conflict. The measurement uncertainty model can be used to represent different sensor accuracies and sensor choices. Our study demonstrates the value of modelling measurement uncertainty in the conflict risk estimation problem and presents techniques for assessing sensor requirements to achieve the desired conflict detection performance.
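As an illustration of how a measurement uncertainty model feeds into an estimated risk of conflict, the sketch below runs a simple Monte Carlo simulation over noisy position measurements for a two-aircraft encounter; the geometry, separation threshold and Gaussian noise model are assumptions for illustration, not the approach developed in the paper.

```python
import numpy as np

def conflict_risk(p1, v1, p2, v2, sigma, horizon=300.0, sep=9260.0, n=20000):
    """Probability the two aircraft come within `sep` metres over `horizon` seconds,
    given measured positions p1, p2 (metres) with isotropic noise std `sigma`."""
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, horizon, 60)
    hits = 0
    for _ in range(n):
        q1 = np.array(p1) + rng.normal(0, sigma, 2)   # noisy measurement of aircraft 1
        q2 = np.array(p2) + rng.normal(0, sigma, 2)   # noisy measurement of aircraft 2
        d = q1 + np.outer(t, v1) - (q2 + np.outer(t, v2))   # relative track over time
        if np.min(np.linalg.norm(d, axis=1)) < sep:
            hits += 1
    return hits / n

# Larger sigma (poorer sensor accuracy) typically inflates the estimated risk.
print(conflict_risk([0, 0], [220, 0], [60000, 8000], [-220, 0], sigma=500.0))
```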
Abstract:
This paper looks at the accuracy of using the built-in camera of smart phones and free software as an economical way to quantify and analyse light exposure by producing luminance maps from High Dynamic Range (HDR) images. HDR images were captured with an Apple iPhone 4S across a wide variation of luminance within an indoor and an outdoor scene. The HDR images were then processed using Photosphere software (Ward, 2010) to produce luminance maps, in which individual pixel values were compared with calibrated luminance meter readings. This comparison has shown an average luminance error of ~8% between the HDR image pixel values and luminance meter readings when the range of luminances in the image is limited to approximately 1,500 cd/m².
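The comparison itself reduces to a relative error between paired luminance values; a minimal sketch follows, with placeholder numbers rather than the study's measurements.

```python
import numpy as np

# Paired values (cd/m^2): picked from the HDR-derived luminance map vs spot meter.
pixel_cd_m2 = np.array([120.0, 860.0, 1450.0, 95.0])    # from the luminance map
meter_cd_m2 = np.array([131.0, 905.0, 1390.0, 101.0])   # luminance meter readings

rel_error = np.abs(pixel_cd_m2 - meter_cd_m2) / meter_cd_m2
print(f"average luminance error: {100 * rel_error.mean():.1f}%")
```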
Abstract:
A fundamental proposition is that the accuracy of the designer's tender price forecasts is positively correlated with the amount of information available for that project. The paper describes an empirical study of the effects of the quantity of information available on practicing Quantity Surveyors' forecasting accuracy. The methodology involved the surveyors repeatedly revising tender price forecasts on receipt of chunks of project information. Each of twelve surveyors undertook two projects and selected information chunks from a total of sixteen information types. The analysis indicated marked differences in accuracy between different project types and experts/non-experts. The expert surveyors' forecasts were not found to be significantly improved by information other than that of basic building type and size, even after eliminating project type effects. The expert surveyors' forecasts based on the knowledge of building type and size alone were, however, found to be of similar accuracy to that of average practitioners pricing full bills of quantities.
Abstract:
Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients - from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer based cost estimating systems during this first phase and, to a lesser extent, the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument of empiricists and deductionists is that, as theories need factual support, so do we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship - accuracy, and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs. Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date. Knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.
Abstract:
Several methods of estimating the costs or price of construction projects are now available for use in the construction industry. Due to the conservative approach of estimators and quantity surveyors, and the fact that the industry is undergoing one of its deepest recessions this century, it is difficult to implement any changes in these processes. Several methods have been tried and tested and probably discarded forever, whereas other methods are still in their infancy. There is also a movement towards greater use of the computer, whichever method seems to be adopted. An important consideration with any method of estimating is the accuracy with which costs can be calculated. Any improvement in this respect will be welcomed by all parties, because existing methods are poor when measured by this criterion. Estimating, particularly by contractors, has always carried some mystique, and many of the processes discussed both in the classroom and in practice are little more than fallacy when properly investigated. What makes an estimator or quantity surveyor good at forecasting the right price? To what extent does human behaviour influence or have a part to play? These and some of the other aspects of effective estimating are now examined in more detail.
Abstract:
Objectives: This study examines the accuracy of Gestational Diabetes Mellitus (GDM) case-ascertainment in routinely collected data. Methods: This retrospective cohort study analysed routinely collected data from all births at Cairns Base Hospital, Australia, from 1 January 2004 to 31 December 2010 in the Cairns Base Hospital Clinical Coding system (CBHCC) and the Queensland Perinatal Data Collection (QPDC). GDM case-ascertainment in the National Diabetes Services Scheme (NDSS) and Cairns Diabetes Centre (CDC) data was compared. Results: From 2004 to 2010, the specificity of GDM case-ascertainment in the QPDC was 99%. In 2010, only 2 of 225 additional cases were identified from the CDC and CBHCC, suggesting QPDC sensitivity is also over 99%. In comparison, the sensitivity of the CBHCC data was 80% during 2004–2010. The sensitivity of CDC data was 74% in 2010. During 2010, 223 births were coded as GDM in the QPDC, and the NDSS registered 247 women with GDM from the same postcodes, suggesting reasonable uptake on the NDSS register. However, the proportion of Aboriginal and Torres Strait Islander women was lower than expected. Conclusion: The accuracy of GDM case-ascertainment in the QPDC appears high, with lower accuracy in routinely collected hospital and local health service data. This limits the capacity of local data for planning, evaluation and the development of structured systems to improve post-pregnancy care, and may lead to underestimation of the resources required. Implications: Data linkage should be considered to improve the accuracy of routinely collected local health service data. The accuracy of the NDSS for Aboriginal and Torres Strait Islander women requires further evaluation.
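For readers unfamiliar with the accuracy measures quoted, the sketch below computes sensitivity and specificity of a data source against a reference standard; the counts are illustrative placeholders, not figures from the study.

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 comparison against a reference standard."""
    sensitivity = tp / (tp + fn)   # proportion of true GDM cases the source captures
    specificity = tn / (tn + fp)   # proportion of non-GDM births correctly excluded
    return sensitivity, specificity

# Placeholder counts only.
print(sens_spec(tp=180, fn=45, tn=4000, fp=5))
```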
Abstract:
In Hauff v Miller [2013] QCA 48 the Queensland Court of Appeal considered an issue that has not previously arisen at appellate level. The case concerned the interaction of the well-known subject to finance clause and other standard contractual provisions...
Abstract:
Currently, finite element analyses are usually done by means of commercial software tools. Accuracy of analysis and computational time are two important factors in the efficiency of these tools. This paper studies the parameters that affect the computational time and accuracy of finite element analyses performed with ANSYS and provides guidelines for users of this software whenever they use it to study the deformation of orthopedic bone plates or similar cases. It is not a fundamental scientific study; it only shares the authors' findings about structural analysis by means of ANSYS Workbench. It gives readers an idea of how to improve the performance of the software and avoid common traps. The solutions provided in this paper are not the only possible solutions to the problems, and in similar cases there are other solutions which are not given here. The parameters of solution method, material model, geometric model, mesh configuration, number of analysis steps, program-controlled parameters and computer settings are discussed thoroughly in this paper.
Abstract:
This study explores the accuracy and valuation implications of the application of a comprehensive list of equity multiples in the takeover context. Motivating the study are the prevalent use of equity multiples in practice, the observed long-run underperformance of acquirers following takeovers, and the scarcity of multiples-based research in the merger and acquisition setting. In exploring the application of equity multiples in this context, three research questions are addressed: (1) how accurate are equity multiples (RQ1); (2) which equity multiples are more accurate in valuing the firm (RQ2); and (3) which equity multiples are associated with greater misvaluation of the firm (RQ3). Following a comprehensive review of the extant multiples-based literature, it is hypothesised that the accuracy of multiples in estimating stock market prices in the takeover context will rank as follows (from best to worst): (1) forecasted earnings multiples, (2) multiples closer to bottom-line earnings, (3) multiples based on Net Cash Flow from Operations (NCFO) and trading revenue. The relative inaccuracies in multiples are expected to flow through to equity misvaluation (as measured by the ratio of estimated market capitalisation to residual income value, or P/V). Accordingly, it is hypothesised that greater overvaluation will be exhibited for multiples based on Trading Revenue, NCFO, Book Value (BV) and earnings before interest, tax, depreciation and amortisation (EBITDA) versus multiples based on bottom-line earnings; and that multiples based on Intrinsic Value will display the least overvaluation. The hypotheses are tested using a sample of 147 acquirers and 129 targets involved in Australian takeover transactions announced between 1990 and 2005. The results show that, first, the majority of computed multiples examined exhibit valuation errors within 30 percent of stock market values. Second, and consistent with expectations, the results provide support for the superiority of multiples based on forecasted earnings in valuing targets and acquirers engaged in takeover transactions. Although a gradual improvement in estimating stock market values is not entirely evident when moving down the Income Statement, historical earnings multiples perform better than multiples based on Trading Revenue or NCFO. Third, while multiples based on forecasted earnings have the highest valuation accuracy, they, along with Trading Revenue multiples for targets, produce the most overvalued valuations for acquirers and targets. Consistent with predictions, greater overvaluation is exhibited for multiples based on Trading Revenue for targets, and NCFO and EBITDA for both acquirers and targets. Finally, as expected, multiples based on Intrinsic Value (along with BV) are associated with the least overvaluation. Given the widespread usage of valuation multiples in takeover contexts, these findings offer a unique insight into their relative effectiveness. Importantly, the findings add to the growing body of valuation accuracy literature, especially within Australia, and should assist market participants to better understand the relative accuracy and misvaluation consequences of various equity multiples used in takeover documentation and assist them in subsequent investment decision-making.
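As a minimal illustration of the multiples-based valuation accuracy examined here, the sketch below values a target from a peer-median price/earnings multiple and measures the error against its observed market capitalisation (the 30 percent band mentioned above); all figures are hypothetical.

```python
import numpy as np

peer_pe = np.array([14.2, 16.8, 12.5, 18.1, 15.0])   # peer price/earnings multiples
target_earnings = 42.0e6                              # forecast earnings (AUD)
observed_market_cap = 700.0e6                         # observed market capitalisation

estimated_value = np.median(peer_pe) * target_earnings
valuation_error = (estimated_value - observed_market_cap) / observed_market_cap
print(f"valuation error: {valuation_error:+.1%}, "
      f"within 30%: {abs(valuation_error) <= 0.30}")
```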