58 results for Error Floor

in Deakin Research Online - Australia


Relevance:

20.00%

Publisher:

Abstract:

This paper demonstrates how the "error-bar" feature can be used to extend the utility of "worldware" spreadsheet packages in producing high-quality graphs for university teaching and learning, and for research. To further utilize the advantages of spreadsheets in university education, this paper seeks to overcome some of the earlier reservations about the lack of scientific plotting capabilities of spreadsheet applications. Specific examples of educational material in the areas of enzyme kinetics, vibrational spectroscopy, vibronic spectroscopy, and mass spectrometry are discussed. It is argued that, where practical, university educators should use "worldware" packages to prepare teaching aids, since these would better prepare their students for future employment. The use of software features for purposes that were not envisioned by the programmers has additional educational benefits in fostering flexibility and innovation. Other graphing packages can also use the "error-bar" feature in a manner similar to that described here for Excel.
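The abstract's central point is that mean ± error-bar pairs can drive high-quality scientific plots; a minimal sketch of computing such pairs, as one would before feeding them to a spreadsheet's "error-bar" feature (the replicate readings below are invented for illustration):

```python
from statistics import mean, stdev
from math import sqrt

def mean_sem(values):
    """Return (mean, standard error of the mean) for a list of replicates.

    The (mean, SEM) pair is exactly what a spreadsheet's error-bar
    feature consumes as the point value and bar half-length.
    """
    m = mean(values)                      # sample mean
    sem = stdev(values) / sqrt(len(values))  # sample SD / sqrt(n)
    return m, sem

# Hypothetical replicate absorbance readings at one substrate concentration
replicates = [0.42, 0.45, 0.39, 0.44]
m, sem = mean_sem(replicates)
print(round(m, 3), round(sem, 4))  # → 0.425 0.0132
```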

Relevance:

20.00%

Publisher:

Abstract:

There are many variations within sheet metal forming, some of which are manifest in the final geometry of the formed component. It is important that this geometric variation be quantified and measured for use in a process or quality control system. The contribution of this paper is to propose a novel way of measuring the geometric difference between the desired shape and an actual formed "U" channel. The metric is based upon measuring errors in terms of the significant manufacturing variations, and it accords with the manually measured errors of the channel set. The shape error metric is then extended to develop a simple empirical, whole-component, springback error measure, which combines into one value all the angle springback and side wall curl geometric errors for a single channel. Two trends were observed: combined springback decreases when the blank holder force is increased, and marginally decreases when the die radius is increased.

Relevance:

20.00%

Publisher:

Abstract:

The lasso procedure is an estimator shrinkage and variable selection method. This paper shows that there always exists an interval of tuning parameter values such that the corresponding mean squared prediction error for the lasso estimator is smaller than for the ordinary least squares estimator. For an estimator satisfying some condition, such as unbiasedness, the paper defines the corresponding generalized lasso estimator. Its mean squared prediction error is shown to be smaller than that of the estimator for values of the tuning parameter in some interval. This implies that no unbiased estimator is admissible. Simulation results for five models support the theoretical results.
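The abstract's interval result can be illustrated in the simplest shrinkage setting, not the paper's lasso derivation: for estimating a normal mean, the shrunken estimator c·x̄ beats the unbiased sample mean in MSE for tuning values c just below 1. All numbers below are made up:

```python
def mse_shrunken(c, mu, sigma2, n):
    """MSE of the estimator c * xbar for the mean of N(mu, sigma2)
    from n samples: variance term plus squared-bias term."""
    variance = c * c * sigma2 / n
    bias_sq = (1.0 - c) ** 2 * mu * mu
    return variance + bias_sq

mu, sigma2, n = 2.0, 4.0, 10          # invented problem parameters
mse_ols = mse_shrunken(1.0, mu, sigma2, n)   # c = 1: the unbiased estimator
mse_c = mse_shrunken(0.95, mu, sigma2, n)    # c in the shrinkage interval
print(mse_c < mse_ols)  # → True
```

The interval of winning c values depends on the unknown mean, which mirrors why the lasso's favourable tuning interval exists but is not directly observable.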

Relevance:

20.00%

Publisher:

Abstract:

This work describes an error correction method based on the Euler Superpath problem. Sequence data are mapped to an Euler Superpath dynamically by a Merging Transformation. With restriction and guiding rules, data consistency is maintained and error paths are separated from correct data: error edges are mapped to the correct ones, and after substituting the error edges with the right paths, the corresponding errors in the sequencing data are eliminated.

Relevance:

20.00%

Publisher:

Abstract:

This pilot study attempted to examine the additional efficacy of interferential therapy in reducing the symptoms of urinary stress and urge incontinence. Twenty-four subjects were randomly allocated to the experimental group, which received interferential therapy plus pelvic floor exercises, or the control group, which received pelvic floor exercises only. Treatment was given three times a week for 4 weeks. Subjects were given urinary diaries to record urinary symptoms (including frequency of passing urine and number of times woken by the desire to pass urine) for 5 days prior to and after treatment. Perineometer readings, a pad weighing test and a start/stop test were also performed in a physiotherapy clinic before and at completion of the treatment regimes. Significant improvements were observed in all the outcome variables in the experimental group, but in only the perineometer readings in controls. When the changes from pre- to post-treatment were compared between the two groups, four of the dependent variables did not reach statistical significance. Power analysis indicated that the sample size for each group needed to be 70 for all results to be statistically significant. This study shows that interferential therapy plus pelvic floor exercise appears to be a more effective treatment modality than pelvic floor muscle strengthening exercise alone for incontinence, but a larger trial with longer follow-up is needed before definitive conclusions can be reached.

Relevance:

20.00%

Publisher:

Abstract:

This article examines the factors influencing the annual dissent rate on the High Court of Australia from its first full year of operation in 1904 up to 2001 within a cointegration and error correction framework. We hypothesize that institutional factors, socioeconomic complexity, and leadership style explain variations in the dissent rate on the High Court of Australia over time. The institutional factors that we consider are the Court's caseload, whether it had discretion to select the cases it hears, and whether it was a final court of appeal. To measure socioeconomic complexity we use the divorce rate, urbanization rate, and real GDP per capita. Our main finding is that, in both the long run and the short run, caseload and real income are the main factors influencing dissent. We find that 1 percent increases in caseload and real income reduce the dissent rate on the High Court of Australia by 0.3 percent and 0.6 percent, respectively, holding other factors constant.
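A back-of-envelope sketch of applying the reported elasticities, assuming they hold constant over the change considered; the baseline dissent rate is invented:

```python
def predicted_dissent(dissent_rate, caseload_growth_pct, income_growth_pct):
    """Apply the paper's long-run elasticities to a baseline dissent rate:
    a 1% rise in caseload lowers dissent by 0.3%, a 1% rise in real
    income lowers it by 0.6% (other factors held constant)."""
    change_pct = -0.3 * caseload_growth_pct - 0.6 * income_growth_pct
    return dissent_rate * (1.0 + change_pct / 100.0)

# e.g. a hypothetical 20% dissent rate with a 10% caseload rise, income flat:
# dissent falls by 3%, i.e. 20% -> 19.4%
print(predicted_dissent(20.0, 10.0, 0.0))  # → 19.4
```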

Relevance:

20.00%

Publisher:

Abstract:

In data stream applications, a good approximation obtained in a timely manner is often better than the exact answer that is delayed beyond the window of opportunity. Of course, the quality of the approximation is as important as its timely delivery. Unfortunately, algorithms capable of online processing do not conform strictly to a precise error guarantee. Since online processing is essential, and so is the precision of the error, stream algorithms must meet both criteria. Yet this is not the case for mining frequent sets in data streams. We present EStream, a novel algorithm that allows online processing while producing results strictly within the error bound. Our theoretical and experimental results show that EStream is a better candidate for finding frequent sets in data streams when both constraints need to be satisfied.

Relevance:

20.00%

Publisher:

Abstract:

The positioning error of a large cantilevered mass that is actuated at its supported end is minimized as this mass travels at challenging high speeds and accelerations. An integrated approach is adopted to realize the task. After selecting the appropriate actuator that would provide higher rigidity, the system is viewed as a multi-degree-of-freedom system, and hence the concept of system-generated disturbance is introduced. This allows the use of appropriate mechanical design considerations and a proper generation of the kinematics commands to minimize such disturbance. A disturbance observer is then designed to detect and compensate for the remaining disturbance, thereby minimizing the positioning error.

Relevance:

20.00%

Publisher:

Abstract:

The objective of our present paper is to derive a computationally efficient genetic pattern learning algorithm to evolutionarily derive the optimal rebalancing weights (i.e. dynamic hedge ratios) to engineer a structured financial product out of a multi-asset, best-of option. The stochastic target function is formulated as an expected squared cost of hedging (tracking) error, which is assumed to be partly dependent on the governing Markovian process underlying the individual asset returns and partly on randomness, i.e. pure white noise. A simple haploid genetic algorithm is advanced as an alternative numerical scheme, which is deemed to be computationally more efficient than numerically deriving an explicit solution to the formulated optimization model. An extension to our proposed scheme is suggested by means of adapting the genetic algorithm parameters based on fuzzy logic controllers.
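A minimal haploid (single-chromosome) genetic algorithm in the spirit described, though not the paper's hedging formulation; the quadratic "hedging cost", the target weights, and all GA parameters below are invented for illustration:

```python
import random

def hedging_cost(weights, target=(0.5, 0.3, 0.2)):
    """Stand-in squared tracking-error cost: distance from made-up target weights."""
    return sum((w - t) ** 2 for w, t in zip(weights, target))

def evolve(pop_size=30, genes=3, generations=100, seed=1):
    """Truncation-selection GA over real-valued haploid chromosomes."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=hedging_cost)          # rank by cost (lower is fitter)
        survivors = pop[: pop_size // 2]    # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, genes)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:          # occasional Gaussian mutation
                i = rng.randrange(genes)
                child[i] += rng.gauss(0, 0.05)
            children.append(child)
        pop = survivors + children
    return min(pop, key=hedging_cost)

best = evolve()
print(hedging_cost(best))   # small residual cost near the target weights
```

The fuzzy-logic extension mentioned in the abstract would adapt the mutation and crossover rates between generations rather than fixing them as here.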

Relevance:

20.00%

Publisher:

Abstract:

In this habitat mapping study, multi-beam acoustic data are integrated with extensive, precisely geo-referenced video validation data in a GIS environment to classify benthic substrates and biota at a 33 km² site in the nearshore waters of Victoria, Australia. Using an automated decision-tree classification method, five representative biotic groups were identified in the Cape Nelson survey area using a combination of multi-beam bathymetry, backscatter and derivative products. Rigorous error assessment of the derived, classified maps produced high overall accuracies (>85%) for all mapping products. In addition, a discrete multivariate analysis technique (kappa analysis) was used to assess classification accuracy. The high-resolution (2.5 m cell size) representation of sea floor morphology and textural characteristics provided by the multi-beam bathymetry and backscatter datasets allowed the interpretation of the benthic substrates of the Cape Nelson site and the communities of sessile organisms that populate them. Non-parametric multivariate statistical analysis (ANOSIM) revealed a significant difference in biotic composition between depth strata and between substrate types. Incorporated with other descriptive measures, these results indicate that depth and substrate are important factors in the distributional ecology of the biotic communities at the Cape Nelson study site. BIOENV analysis indicates that derivatives of both multi-beam datasets (bathymetry and backscatter) are correlated with the distribution and density of biotic communities. Results from this study provide new tools for research and management of the coastal zone.

Relevance:

20.00%

Publisher:

Abstract:

Purpose – The purpose of this article is to present an empirical analysis of complex sample data with regard to the biasing effect of non-independence of observations on standard error (SE) parameter estimates. Using field data structured in the form of repeated measurements, it is shown, in a two-factor confirmatory factor analysis model, how the bias in SE can be derived when the non-independence is ignored.

Design/methodology/approach – Three estimation procedures are compared: normal asymptotic theory (maximum likelihood); non-parametric standard error estimation (naïve bootstrap); and sandwich (robust covariance matrix) estimation (pseudo-maximum likelihood).

Findings – The study reveals that, when using either normal asymptotic theory or non-parametric standard error estimation, the SE bias produced by the non-independence of observations can be noteworthy.

Research limitations/implications – Considering the methodological constraints in employing field data, the three analyses examined must be interpreted independently and, as a result, taxonomic generalisations are limited. However, the study still provides “case study” evidence suggesting the existence of the relationship between non-independence of observations and standard error bias estimates.

Originality/value – Given the increasing popularity of structural equation models in the social sciences and in particular in the marketing discipline, the paper provides a theoretical and practical insight into how to treat repeated measures and clustered data in general, adding to previous methodological research. Some conclusions and suggestions for researchers who make use of partial least squares modelling are also drawn.
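The naïve bootstrap compared in the abstract can be sketched in a few lines. Note that resampling individual observations, as below, is precisely the step that ignores non-independence when the data are clustered or repeated measures; the data values are invented:

```python
import random
from statistics import mean, stdev

def bootstrap_se(sample, stat=mean, n_boot=2000, seed=0):
    """Naive bootstrap standard error of `stat`: resample the observations
    with replacement and take the spread of the resampled statistics.
    Treats every observation as independent, which is exactly the
    assumption the article's repeated-measures data violate."""
    rng = random.Random(seed)
    reps = [stat([rng.choice(sample) for _ in sample]) for _ in range(n_boot)]
    return stdev(reps)

data = [4.1, 5.0, 3.8, 4.6, 5.2, 4.4, 4.9, 4.0]  # invented measurements
print(round(bootstrap_se(data), 3))
```

A cluster-aware variant would resample whole subjects (clusters) rather than single observations, which is one standard remedy for the SE bias the article documents.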

Relevance:

20.00%

Publisher:

Abstract:

For most data stream applications, the volume of data is too large to be stored in permanent devices or to be thoroughly scanned more than once. It is hence recognized that approximate answers are usually sufficient, where a good approximation obtained in a timely manner is often better than the exact answer that is delayed beyond the window of opportunity. Unfortunately, this is not the case for mining frequent patterns over data streams, where algorithms capable of online processing do not conform strictly to a precise error guarantee. Since the quality of approximate answers is as important as their timely delivery, it is necessary to design algorithms that meet both criteria at the same time. In this paper, we propose an algorithm that allows online processing of streaming data while guaranteeing that the support error of frequent patterns stays strictly within a user-specified threshold. Our theoretical and experimental studies show that our algorithm is an effective and reliable method for finding frequent sets in data stream environments when both constraints need to be satisfied.
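The classic way to obtain this kind of strict support-error guarantee is the Lossy Counting idea; a single-item sketch of that idea follows, not necessarily the algorithm proposed in the paper (which handles frequent sets, not single items):

```python
from math import ceil

def lossy_count(stream, epsilon):
    """Lossy Counting for single items: every surviving count underestimates
    the true count by at most epsilon * n, where n is the stream length."""
    width = ceil(1.0 / epsilon)             # bucket width = ceil(1/epsilon)
    counts = {}                             # item -> [count, delta]
    n = 0
    for item in stream:
        n += 1
        bucket = (n + width - 1) // width   # current bucket id
        if item in counts:
            counts[item][0] += 1
        else:
            counts[item] = [1, bucket - 1]  # delta bounds the missed count
        if n % width == 0:                  # bucket boundary: prune rare items
            counts = {k: v for k, v in counts.items() if v[0] + v[1] > bucket}
    return counts, n

def frequent(counts, n, support, epsilon):
    """Items whose true frequency can exceed `support` (no false negatives)."""
    return {k for k, (c, d) in counts.items() if c >= (support - epsilon) * n}

stream = ['a', 'b'] * 30 + ['c'] * 3        # toy stream with two heavy hitters
counts, n = lossy_count(stream, epsilon=0.05)
print(frequent(counts, n, support=0.4, epsilon=0.05))  # → {'a', 'b'}
```

Memory stays bounded by roughly (1/ε)·log(εn) entries regardless of stream length, which is what makes the strict error bound compatible with online processing.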

Relevance:

20.00%

Publisher:

Abstract:

The creation of an electronic limit order book is discussed as the basis for distinguishing between the floor trading and screen trading of derivative instruments. Distinguishing between FTP and ETP in terms of market transparency allows investors to contemplate the trade-off between the two platforms. Distinguishing between FTP and ETP in terms of memory preservation allows practitioners to contemplate the different experiences when analyzing floor data and screen data. A comparable set of floor and screen data is used to examine the impact on the trading dynamics and price discovery of LIFFE's FTSE 100 index futures market when trading is automated on LIFFE CONNECT. The dynamics of the quote change equation are shortened when moving from the floor to the screen sample. Using the model's measure of trade informativeness, it is found that in four out of five daily sub-samples, screen trades are more than twice as informative as floor trades. Variability within a system of equations is explained more by order size history than by trade size history.

Relevance:

20.00%

Publisher:

Abstract:

In recent years there has been increasing recognition internationally that health care is not as safe as it ought to be and that patient safety outcomes need to be improved. To this end, patient safety has become the focus of a world-wide endeavour aimed at reducing the incidence and impact of preventable human errors and related adverse events in health care domains. The emergency department has been identified as a significant site of preventable human errors and adverse events in the health care system, raising important questions about the nature of human error management and patient safety ethics in rapidly changing environments. In this article (the first of a two-part discussion on the subject), an overview of the incidence and impact of preventable adverse events in ED contexts is presented. The development of a ‘culture of safety’ in other hazardous industries and the ‘lessons learned’ and applied to the health care industry are also briefly examined. In a second article (to be presented as Part II), some of the ethical tensions that have arisen in the context of implementing patient safety processes and their possible implications for ED contexts are explored.