999 results for implant impression accuracy


Abstract:

The development and design of high-power electrical devices with electromagnetic computer-aided engineering (EM-CAE) software, based on methods such as the Finite Element Method (FEM) and the Boundary Element Method (BEM), has been widely adopted. This paper presents the analysis of a Fault Current Limiter (FCL), which acts as a high-voltage surge protector for power grids. A prototype FCL was built, and the magnetic flux in its core and the resulting electromagnetic forces in its winding were analyzed using both FEM and BEM. An experiment on the prototype was conducted in a laboratory, and the measured data were compared with the numerical solutions to determine the suitability and accuracy of the two methods.
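As a rough illustration of the kind of numerical-versus-experimental comparison described, the sketch below computes two common agreement metrics over hypothetical force values; the paper's actual data and metrics are not reproduced here.

```python
import numpy as np

# Hypothetical measured forces (N) at sample points on the FCL winding,
# alongside FEM and BEM predictions at the same points.
measured = np.array([12.1, 15.4, 18.9, 22.7, 26.3])
fem_pred = np.array([11.8, 15.9, 18.2, 23.1, 27.0])
bem_pred = np.array([12.6, 14.7, 19.8, 21.9, 25.1])

def rmse(pred, ref):
    """Root-mean-square error between predictions and reference data."""
    return np.sqrt(np.mean((pred - ref) ** 2))

def mean_rel_error(pred, ref):
    """Mean absolute relative error, as a percentage of the reference."""
    return 100.0 * np.mean(np.abs(pred - ref) / np.abs(ref))

for name, pred in [("FEM", fem_pred), ("BEM", bem_pred)]:
    print(f"{name}: RMSE = {rmse(pred, measured):.2f} N, "
          f"mean relative error = {mean_rel_error(pred, measured):.1f}%")
```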

Abstract:

Objectives: The relationship between performance variability and accuracy in cricket fast bowlers of different skill levels was investigated under three task conditions, to observe whether bowlers could adapt movement patterns to maintain performance accuracy on a bowling skills test. Design: Eight national, 12 emerging, and 12 junior pace bowlers completed an adapted version of the Cricket Australia bowling skills test, performing 30 trials comprising short (n = 10), good (n = 10), and full (n = 10) length deliveries. Methods: Bowling accuracy was recorded by digitising ball position relative to the centre of a target. Performance measures were mean radial error (accuracy), variable error (consistency), centroid error (bias), bowling score, and ball speed. Changes in radial error across the duration of the skills test were used to record accuracy adjustment in subsequent deliveries. Results: Elite fast bowlers outperformed developing athletes in speed, accuracy, and test scores. Bowlers who were less variable were also more accurate across all delivery lengths. National and emerging bowlers were able to adapt subsequent performance trials within the same bowling session for short length deliveries. Conclusions: Accuracy and adaptive variability were key components of elite fast bowling performance and improved with skill level. In this study, only national elite bowlers showed the requisite adaptive variability to bowl a range of lengths to different pitch locations.
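The three error measures named above have standard definitions in the motor-control literature; the following minimal sketch computes them from hypothetical ball-landing coordinates.

```python
import numpy as np

# Hypothetical ball landing positions (x, y) in metres relative to the
# target centre. Mean radial error = accuracy, centroid error = bias,
# variable error = consistency (spread about the mean landing point).
positions = np.array([[0.10, -0.05], [0.22, 0.18], [-0.08, 0.02],
                      [0.15, -0.12], [0.05, 0.09]])

radii = np.linalg.norm(positions, axis=1)
mean_radial_error = radii.mean()            # average distance from target
centroid = positions.mean(axis=0)
centroid_error = np.linalg.norm(centroid)   # distance of mean landing point from target
variable_error = np.sqrt(np.mean(np.sum((positions - centroid) ** 2, axis=1)))

print(f"MRE = {mean_radial_error:.3f} m, CE = {centroid_error:.3f} m, "
      f"VE = {variable_error:.3f} m")
```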

Abstract:

One of the primary desired capabilities of any future air traffic separation management system is the ability to provide early conflict detection and resolution both effectively and efficiently. In this paper, we consider the risk of conflict as a primary measurement for early conflict detection, and we develop a novel approach to assess the impact of different measurement uncertainty models on the estimated risk of conflict. The measurement uncertainty model can represent different sensor accuracies and sensor choices. Our study demonstrates the value of modelling measurement uncertainty in the conflict risk estimation problem and presents techniques for assessing the sensor requirements needed to achieve a desired conflict detection performance.
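A minimal sketch of one way such an assessment can work, assuming a simple Gaussian measurement-uncertainty model and straight-line relative motion; the paper's actual models and parameters are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scenario: two aircraft on straight-line tracks, with the
# relative position measurement corrupted by zero-mean Gaussian sensor
# noise. Conflict risk is estimated as the probability that predicted
# minimum separation falls below a threshold.
SEPARATION_MIN_NM = 5.0
N_SAMPLES = 100_000

rel_pos = np.array([20.0, 8.0])    # assumed relative position (NM)
rel_vel = np.array([-4.0, -1.0])   # assumed relative velocity (NM/min)

def conflict_risk(sigma):
    """Monte Carlo conflict-risk estimate for a given sensor accuracy (std dev, NM)."""
    noisy_pos = rel_pos + rng.normal(0.0, sigma, size=(N_SAMPLES, 2))
    # Time of closest approach for each sampled relative trajectory.
    t_cpa = np.clip(-(noisy_pos @ rel_vel) / (rel_vel @ rel_vel), 0.0, None)
    closest = noisy_pos + t_cpa[:, None] * rel_vel
    d_min = np.linalg.norm(closest, axis=1)
    return np.mean(d_min < SEPARATION_MIN_NM)

for sigma in (0.5, 1.0, 2.0):  # compare different sensor uncertainty models
    print(f"sigma = {sigma} NM -> estimated conflict risk = {conflict_risk(sigma):.3f}")
```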

Abstract:

This paper examines the accuracy of using the built-in camera of a smartphone together with free software as an economical way to quantify and analyse light exposure by producing luminance maps from High Dynamic Range (HDR) images. HDR images covering a wide range of luminances within an indoor and an outdoor scene were captured with an Apple iPhone 4S and processed with the Photosphere software (Ward, 2010) to produce luminance maps, in which individual pixel values were compared with calibrated luminance meter readings. This comparison showed an average luminance error of approximately 8% between the HDR image pixel values and the luminance meter readings when the range of luminances in the image was limited to approximately 1,500 cd/m².
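The underlying comparison is straightforward to express; the sketch below computes the average relative luminance error from hypothetical paired HDR-pixel and meter readings.

```python
import numpy as np

# Hypothetical paired readings: luminance derived from HDR image pixels
# versus a calibrated luminance meter at the same scene points (cd/m^2).
hdr_pixel_luminance = np.array([120.0, 450.0, 890.0, 1340.0, 60.0])
meter_luminance     = np.array([112.0, 430.0, 960.0, 1410.0, 55.0])

# Relative error per point, with the meter taken as the reference.
rel_error = np.abs(hdr_pixel_luminance - meter_luminance) / meter_luminance
print(f"average luminance error = {100 * rel_error.mean():.1f}%")
```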

Abstract:

A fundamental proposition is that the accuracy of a designer's tender price forecasts is positively correlated with the amount of information available for the project. This paper describes an empirical study of the effect of the quantity of information available on practising quantity surveyors' forecasting accuracy. The methodology involved surveyors repeatedly revising their tender price forecasts on receipt of successive chunks of project information. Each of twelve surveyors undertook two projects, selecting information chunks from a total of sixteen information types. The analysis indicated marked differences in accuracy between project types and between experts and non-experts. The expert surveyors' forecasts were not significantly improved by information beyond basic building type and size, even after eliminating project type effects. The expert surveyors' forecasts based on knowledge of building type and size alone were, however, of similar accuracy to those of average practitioners pricing full bills of quantities.
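Forecasting accuracy in this literature is usually summarised by the bias and spread of percentage errors against the accepted tender; a minimal sketch with hypothetical figures:

```python
import numpy as np

# Hypothetical sketch: one surveyor's successive forecasts for a project
# as information chunks arrive, compared against the eventual low bid.
low_bid = 2_450_000.0
forecasts = np.array([2_100_000, 2_250_000, 2_380_000, 2_400_000, 2_430_000],
                     dtype=float)

pct_error = 100.0 * (forecasts - low_bid) / low_bid
print("percentage error after each information chunk:", np.round(pct_error, 1))
print(f"bias = {pct_error.mean():.1f}%, "
      f"consistency (std dev) = {pct_error.std(ddof=1):.1f}%")
```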

Abstract:

Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients: from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer-based cost estimating systems during this first phase and, to a lesser extent, into the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming.

All theories need the support of some observational data, and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument of empiricists and deductionists is that, as theories need factual support, so we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features.

This paper presents some of the facts we have been able to acquire regarding one part of this relationship: accuracy and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs. Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented; the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date; knowledge of any work done but omitted here would be most welcome. The second part presents an analysis of some recently acquired data pertaining to this growing subject.

Abstract:

Several methods of estimating the costs or price of construction projects are now available for use in the construction industry. Owing to the conservative approach of estimators and quantity surveyors, and the fact that the industry is undergoing one of its deepest recessions this century, it is difficult to implement any changes to these processes. Several methods have been tried and tested, and probably discarded forever, whereas other methods are still in their infancy. There is also a movement towards greater use of the computer, whichever method is adopted. An important consideration with any method of estimating is the accuracy with which costs can be calculated. Any improvement in this respect will be welcomed by all parties, because existing methods are poor when measured by this criterion. Estimating, particularly by contractors, has always carried a certain mystique, and many of the processes discussed both in the classroom and in practice are little more than fallacy when properly investigated. What makes an estimator or quantity surveyor good at forecasting the right price? To what extent does human behaviour influence or have a part to play? These and other aspects of effective estimating are examined in more detail.

Abstract:

In this study, a treatment plan for a spinal lesion, with all beams transmitted through a titanium vertebral reconstruction implant, was used to investigate the potential effect of a high-density implant on the three-dimensional dose distribution of a radiotherapy treatment. The BEAMnrc/DOSXYZnrc and MCDTK Monte Carlo codes were used to simulate the treatment using both a simplified, rectilinear model and a detailed model incorporating the full complexity of the patient anatomy and treatment plan. The resulting Monte Carlo dose distributions showed that the commercial treatment planning system failed to accurately predict both the depletion of dose downstream of the implant and the increase in scattered dose adjacent to the implant. Overall, the dosimetric effect of the implant was underestimated by the commercial treatment planning system and overestimated by the simplified Monte Carlo model, demonstrating the value of performing detailed Monte Carlo calculations using the full patient and treatment geometry.
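A hedged sketch of the voxel-wise comparison such a study relies on, using stand-in dose grids rather than real BEAMnrc/DOSXYZnrc output:

```python
import numpy as np

# Hypothetical illustration: comparing a treatment-planning-system (TPS)
# dose grid with a Monte Carlo (MC) reference grid, voxel by voxel.
# Real BEAMnrc/DOSXYZnrc results would be loaded from .3ddose files.
rng = np.random.default_rng(1)
mc_dose  = rng.uniform(0.5, 2.0, size=(4, 4, 4))                  # stand-in MC grid (Gy)
tps_dose = mc_dose * rng.normal(1.02, 0.03, size=mc_dose.shape)   # stand-in TPS grid

# Percent dose difference relative to MC, restricted to voxels that
# receive a meaningful dose.
mask = mc_dose > 0.1 * mc_dose.max()
pct_diff = 100.0 * (tps_dose[mask] - mc_dose[mask]) / mc_dose[mask]
print(f"mean TPS-vs-MC dose difference: {pct_diff.mean():+.1f}% "
      f"(range {pct_diff.min():+.1f}% to {pct_diff.max():+.1f}%)")
```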

Abstract:

Finite element modelling of bone fracture fixation systems allows computational investigation of the deformation response of the bone to load. Once validated, these models can easily be adapted to explore changes in the design or configuration of a fixator. The deformation of the tissue within the fracture gap determines its healing and is often summarised as the stiffness of the construct. FE models capable of reproducing this behaviour would provide valuable insight into the healing potential of different fixation systems. Current model validation techniques lack depth in 6D load and deformation measurements, and other aspects of FE model creation, such as the definition of interfaces between components, have also not been explored.

This project investigated the mechanical testing and FE modelling of a bone–plate construct for the determination of stiffness. In-depth 6D measurement and analysis of the generated forces, moments, and movements showed large out-of-plane behaviours that had not previously been characterised. Stiffness calculated from the interfragmentary movement was found to be an unsuitable summary parameter because the error propagation is too large. Current FE modelling techniques were applied in compression and torsion, mimicking the experimental setup. Compressive stiffness was well replicated, though torsional stiffness was not, and the out-of-plane behaviours prevalent in the experimental work were not replicated in the model.

The interfaces between the components were investigated experimentally and through modification of the FE model. Incorporating the interface modelling techniques into the full construct models had no effect in compression but did reduce torsional stiffness, bringing it closer to that of the experiment. The interface definitions had no effect on out-of-plane behaviours, which were still not replicated. Neither current nor novel FE modelling techniques were able to replicate the out-of-plane behaviours evident in the experimental work; new techniques for modelling loads and boundary conditions need to be developed to mimic the effects of the entire experimental system.
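The error-propagation problem with stiffness derived from interfragmentary movement can be made concrete with a standard uncertainty calculation; the values below are assumed for illustration.

```python
import numpy as np

# Stiffness is a quotient, k = F / d, so for independent errors the
# relative uncertainties of force and displacement add in quadrature.
# A small interfragmentary movement (IFM) with a modest absolute error
# therefore inflates the stiffness uncertainty dramatically.
F, sigma_F = 500.0, 5.0    # applied load (N) and its uncertainty -- assumed
d, sigma_d = 0.20, 0.05    # measured IFM (mm) and its uncertainty -- assumed

k = F / d                  # construct stiffness (N/mm)
rel_err_k = np.sqrt((sigma_F / F) ** 2 + (sigma_d / d) ** 2)
print(f"k = {k:.0f} N/mm +/- {100 * rel_err_k:.0f}%")
# A 1% force error combined with a 25% displacement error yields ~25%
# uncertainty in stiffness, dominated entirely by the IFM measurement.
```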

Abstract:

Currently, finite element analyses are usually performed by means of commercial software tools, whose efficiency depends on two important factors: accuracy of analysis and computational time. This paper studies the parameters affecting the computational time and accuracy of finite element analyses performed with ANSYS and provides guidelines for users of this software when studying the deformation of orthopaedic bone plates or similar cases. It is not a fundamental scientific study; it shares the authors' findings on structural analysis with ANSYS Workbench, giving readers ideas for improving the performance of the software and avoiding its pitfalls. The solutions provided here are not the only possible ones, and in similar cases other solutions exist that are not covered in this paper. The parameters of solution method, material model, geometric model, mesh configuration, number of analysis steps, program-controlled parameters, and computer settings are discussed thoroughly.

Abstract:

This study explores the accuracy and valuation implications of applying a comprehensive list of equity multiples in the takeover context. Motivating the study are the prevalent use of equity multiples in practice, the observed long-run underperformance of acquirers following takeovers, and the scarcity of multiples-based research in the merger and acquisition setting. Three research questions are addressed: (RQ1) how accurate are equity multiples; (RQ2) which equity multiples are more accurate in valuing the firm; and (RQ3) which equity multiples are associated with greater misvaluation of the firm. Following a comprehensive review of the extant multiples-based literature, it is hypothesised that the accuracy of multiples in estimating stock market prices in the takeover context will rank as follows (from best to worst): (1) forecasted earnings multiples; (2) multiples closer to bottom-line earnings; (3) multiples based on Net Cash Flow from Operations (NCFO) and trading revenue. The relative inaccuracies in multiples are expected to flow through to equity misvaluation (as measured by the ratio of estimated market capitalisation to residual income value, or P/V). Accordingly, it is hypothesised that greater overvaluation will be exhibited by multiples based on Trading Revenue, NCFO, Book Value (BV), and earnings before interest, tax, depreciation and amortisation (EBITDA) than by multiples based on bottom-line earnings, and that multiples based on Intrinsic Value will display the least overvaluation.

The hypotheses are tested using a sample of 147 acquirers and 129 targets involved in Australian takeover transactions announced between 1990 and 2005. The results show that, first, the majority of the computed multiples exhibit valuation errors within 30 percent of stock market values. Second, and consistent with expectations, the results support the superiority of multiples based on forecasted earnings in valuing targets and acquirers engaged in takeover transactions. Although a gradual improvement in estimating stock market values is not entirely evident when moving down the Income Statement, historical earnings multiples perform better than multiples based on Trading Revenue or NCFO. Third, while multiples based on forecasted earnings have the highest valuation accuracy, they, along with Trading Revenue multiples for targets, produce the most overvalued valuations for acquirers and targets. Consistent with predictions, greater overvaluation is exhibited by multiples based on Trading Revenue for targets, and on NCFO and EBITDA for both acquirers and targets. Finally, as expected, multiples based on Intrinsic Value (along with BV) are associated with the least overvaluation. Given the widespread use of valuation multiples in takeover contexts, these findings offer a unique insight into their relative effectiveness. Importantly, they add to the growing body of valuation accuracy literature, especially within Australia, and should help market participants better understand the relative accuracy and misvaluation consequences of the various equity multiples used in takeover documentation, assisting them in subsequent investment decision making.
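A minimal sketch of multiple-based valuation and its error measurement, with hypothetical peer multiples and prices:

```python
import numpy as np

# Hedged illustration: price a target firm with the median peer P/E
# multiple and measure the error against the observed market price.
# All figures are hypothetical, not drawn from the study's sample.
peer_pe = np.array([14.2, 16.8, 15.1, 18.3, 13.9])  # comparable firms' P/E
target_eps = 1.25                                    # target earnings per share
market_price = 21.00                                 # observed share price

estimated_price = np.median(peer_pe) * target_eps
valuation_error = (estimated_price - market_price) / market_price
print(f"estimated price = {estimated_price:.2f}, "
      f"valuation error = {100 * valuation_error:+.1f}%")
```

The same calculation applies to any other multiple (EBITDA, BV, NCFO, Trading Revenue) by swapping in the corresponding peer ratios and value driver.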

Abstract:

Purpose: The purpose of this study was to retrospectively identify the predictors of implant survival when a flapless protocol was used in two private dental practices. Materials and Methods: A computer search was first used to identify the patients; a hand search of patient records then identified all flapless implants consecutively inserted over the previous 10 years. The demographic information gathered on statistical predictors included age, sex, periodontal and peri-implantitis status, smoking, details of the implants inserted, implant locations, placement time after extraction, use of simultaneous guided hard and soft tissue regeneration procedures, loading protocols, type of prosthesis, and treatment outcomes (implant survival and complications). Excluded were any implants that required flaps or simultaneous guided hard and soft tissue regeneration procedures, and implants narrower than 3.25 mm. Results: A total of 1,241 implants had been placed in 472 patients. Life table analysis indicated cumulative 5-year and 10-year implant survival rates of 97.9% and 96.5%, respectively. Most of the failed implants had been placed in the posterior maxilla (54%) and in type 4 bone (74%), and 55% of failed implants had been placed in smokers. Conclusion: Flapless dental implant surgery can yield an implant survival rate comparable to that reported in studies using traditional flap techniques.
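The cumulative survival rates follow from the standard life-table (actuarial) calculation; the interval counts below are hypothetical, not the study's data.

```python
# Life-table (actuarial) cumulative survival: the product over intervals
# of (1 - failures / implants at risk). Each entry below is a
# hypothetical (at_risk, failures) pair for one follow-up interval.
intervals = [(1241, 8), (1100, 5), (950, 4), (700, 3), (400, 2)]

cumulative_survival = 1.0
for at_risk, failures in intervals:
    interval_survival = 1.0 - failures / at_risk
    cumulative_survival *= interval_survival
    print(f"interval survival = {interval_survival:.4f}, "
          f"cumulative = {100 * cumulative_survival:.1f}%")
```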

Abstract:

iTRAQ (isobaric tags for relative or absolute quantitation) is a mass spectrometry technology that allows quantitative comparison of protein abundance by measuring the peak intensities of reporter ions released from iTRAQ-tagged peptides by fragmentation during MS/MS. However, current data analysis techniques for iTRAQ struggle to report reliable relative protein abundance estimates and suffer from problems of precision and accuracy. The precision of the data is affected by variance heterogeneity: low-signal data have higher relative variability, yet low-abundance peptides dominate data sets. Accuracy is compromised because ratios are compressed toward 1, leading to underestimation of the true ratio. This study investigated both issues and proposed a methodology that combines the peptide measurements to give a robust protein estimate even when the data for the protein are sparse or of low intensity. Our data indicated that ratio compression arises from contamination during precursor ion selection, which occurs at a consistent proportion within an experiment and thus results in a linear relationship between expected and observed ratios; we proposed that a correction factor can be calculated from proteins spiked at known ratios. We then demonstrated that variance heterogeneity is present in iTRAQ data sets irrespective of the analytical package, LC-MS/MS instrumentation, and iTRAQ labeling kit (4-plex or 8-plex) used. We proposed an additive-multiplicative error model for peak intensities in MS/MS quantitation and demonstrated that a variance-stabilizing normalization is able to address this error structure and stabilize the variance across the entire intensity range; the resulting uniform variance structure simplifies the downstream analysis. Heterogeneity of variance consistent with an additive-multiplicative model has been reported in other MS-based quantitation, including fields outside proteomics; consequently, the variance-stabilizing normalization methodology has the potential to increase the capabilities of MS quantitation across diverse areas of biology and chemistry.
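A minimal sketch of both ideas with assumed numbers: a linear ratio-compression correction fitted on spiked-in proteins, and an arsinh (generalized-log) transform of the kind commonly used for variance stabilization under an additive-multiplicative error model. The study's own correction and normalization live in its analysis pipeline and are not reproduced here.

```python
import numpy as np

# (1) Ratio-compression correction. Spiked-in proteins at known ratios
# let us fit observed = slope * expected in log2 space; a slope below 1
# quantifies compression, and dividing by it de-compresses the data.
expected_log2 = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])    # known spike-in log2 ratios
observed_log2 = np.array([-1.3, -0.6, 0.05, 0.7, 1.35])  # compressed observations

slope = np.polyfit(expected_log2, observed_log2, 1)[0]
corrected_log2 = observed_log2 / slope
print(f"compression slope = {slope:.2f}")

# (2) Variance stabilization. The arsinh transform behaves like a log
# for large intensities but remains stable near zero, flattening the
# variance across the intensity range; a and b are hypothetical
# calibration parameters of the additive-multiplicative model.
def vs_transform(intensity, a=0.0, b=1.0):
    return np.arcsinh((intensity - a) / b)

print(vs_transform(np.array([5.0, 50.0, 500.0])))
```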