966 results for ORDER ACCURACY APPROXIMATIONS
Abstract:
This paper formulates a node-based smoothed conforming point interpolation method (NS-CPIM) for solid mechanics. In the proposed NS-CPIM, higher-order conforming PIM (CPIM) shape functions are constructed to produce a continuous, piecewise-quadratic displacement field over the whole problem domain, and the smoothed strain field is obtained through a smoothing operation over each smoothing domain associated with the domain nodes. The smoothed Galerkin weak form is then used to derive the discretized system equations. Numerical studies demonstrate the following properties of NS-CPIM: (1) it passes both the standard and the quadratic patch tests; (2) it provides an upper bound on the strain energy; (3) it avoids volumetric locking; (4) it achieves higher accuracy than the node-based smoothed schemes of the original PIMs.
Abstract:
A magneto-rheological (MR) fluid damper is a semi-active control device that has recently begun to receive more attention in the vibration control community. However, the inherently nonlinear nature of the MR fluid damper makes it challenging to achieve high damping control performance with this device, so an accurate modeling method is needed to take advantage of its unique characteristics. Our goal was to develop an alternative method for modeling an MR fluid damper using a self-tuning fuzzy (STF) method based on a neural technique. The behavior of the damper under study is estimated directly through a fuzzy mapping system. To improve the accuracy of the STF model, back-propagation and a gradient descent method are used to train the fuzzy parameters online, minimizing the model error function. A series of simulations was carried out to validate the effectiveness of the proposed modeling method against data measured from experiments on a test rig fitted with the damper. The modeling results show that the proposed STF inference system, trained online using the neural technique, describes the behavior of the MR fluid damper well, without the computation time otherwise needed to generate the model parameters.
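The online training loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: a one-input Takagi–Sugeno-style fuzzy model with Gaussian memberships, an arbitrary rule count and learning rate, and a hypothetical tanh-shaped stand-in for the measured damper force.

```python
import math

class STFModel:
    """Toy self-tuning fuzzy model: Gaussian memberships, linear consequents."""

    def __init__(self, centers, sigma=0.5, lr=0.05):
        self.c = list(centers)         # membership centers (assumed layout)
        self.s = sigma                 # shared membership width
        self.w = [0.0] * len(centers)  # consequent weights to be tuned online
        self.lr = lr

    def _mu(self, x):
        return [math.exp(-((x - c) / self.s) ** 2) for c in self.c]

    def predict(self, x):
        mu = self._mu(x)
        z = sum(mu)
        return sum(wi * mi for wi, mi in zip(self.w, mu)) / z

    def train_step(self, x, y_measured):
        """One online update: gradient of 0.5*(y_hat - y)^2 w.r.t. the weights."""
        mu = self._mu(x)
        z = sum(mu)
        y_hat = sum(wi * mi for wi, mi in zip(self.w, mu)) / z
        err = y_hat - y_measured
        for i, mi in enumerate(mu):
            self.w[i] -= self.lr * err * mi / z
        return err

# Synthetic stand-in for measured damper behavior: a tanh-like force curve.
model = STFModel(centers=[-1.0, -0.5, 0.0, 0.5, 1.0])
velocities = [-1 + 0.05 * k for k in range(41)]
for _ in range(200):                   # repeated passes emulate online epochs
    for v in velocities:
        model.train_step(v, math.tanh(3 * v))
```

Because the model is linear in the consequent weights, the gradient step reduces to a normalised least-mean-squares update, which is what keeps the online tuning cheap.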
Abstract:
Given global demand for new infrastructure, governments face substantial challenges in funding new infrastructure while simultaneously delivering Value for Money (VfM). As background to this challenge, a brief review is given of current practice in the selection of major public sector infrastructure in Australia, along with a review of the related literature concerning the Multi-Attribute Utility Approach (MAUA) and the effect of MAUA on the role of risk management in procurement selection. To address the key weaknesses of MAUA, a new first-order procurement decision-making model is introduced. A brief summary is also given of the research method and hypothesis used to test and develop the new procurement model, which uses competition as the dependent variable and as a proxy for VfM. The hypothesis is as follows: when the actual procurement mode matches the theoretical/predicted procurement mode (informed by the new procurement model), then actual competition is expected to match optimum competition (based on actual prevailing capacity vis-à-vis the theoretical/predicted procurement mode), subject to efficient tendering. The aim of this paper is to report on progress towards testing this hypothesis in terms of an analysis of two of the four data components in the hypothesis: actual procurement and actual competition across 87 road and health major public sector projects in Australia. In conclusion, it is noted that the Global Financial Crisis (GFC) has seen a significant increase in competition in public sector major road and health infrastructure. If any imperfections in procurement and/or tendering are discernible, this would create the opportunity, through the deployment of the economic principles embedded in the new procurement model and/or adjustments in tendering, to maintain some of this higher post-GFC level of competition throughout the next business cycle/upturn in demand, including private sector demand.
Finally, the paper previews the next steps in the research with regard to collection and analysis of data concerning theoretical/predicted procurement and optimum competition.
Abstract:
The aim of this study was to develop a reliable technique for measuring the area of a curved surface from an axial computed tomography (CT) scan, and to apply it clinically to the measurement of articular cartilage surface area in acetabular fractures. The method used was a triangulation algorithm. To determine the accuracy of the technique, areas of hemispheres of known size were measured to give the percentage error in area measurement. Seven such hemispheres were machined into a Perspex block and their areas measured geometrically, and also from CT scans by means of the triangulation algorithm. Scans of 1, 2 and 4 mm slice thickness and separation were used. The error varied with slice thickness and hemisphere diameter. The 2 mm slice thickness was shown to provide the most accurate area measurement, while 1 mm cuts overestimate and 4 mm cuts underestimate the area. For a hemisphere diameter of 5 cm, which is of similar size to the acetabulum, the error was -11.2% for 4 mm cuts, +4.2% for 2 mm cuts and +5.1% for 1 mm cuts. As expected, area measurement was more accurate for larger hemispheres. The method can be applied clinically to quantify acetabular fractures by measuring the percentage area of intact articular cartilage. In the case of both-column fractures, the percentage area of secondary congruence can be determined. This technique of quantifying acetabular fractures has a potential clinical application as a prognostic factor and an indication for surgery in the long term.
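For illustration only (this is not the paper's algorithm; the slice spacing and contour resolution are arbitrary choices), the idea of triangulating between successive slice contours and summing triangle areas can be sketched against the same hemisphere test case:

```python
import math

def contour(radius_sphere, z, n=180):
    """Points of the circular contour cut at height z (0 <= z < r)."""
    r = math.sqrt(radius_sphere ** 2 - z ** 2)
    return [(r * math.cos(2 * math.pi * k / n),
             r * math.sin(2 * math.pi * k / n), z) for k in range(n)]

def tri_area(a, b, c):
    """Area of a 3-D triangle via the cross product."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

def hemisphere_area(r, slice_thickness, n=180):
    zs = [k * slice_thickness for k in range(int(r / slice_thickness))]
    area = 0.0
    for z0, z1 in zip(zs, zs[1:]):
        lower, upper = contour(r, z0, n), contour(r, z1, n)
        for k in range(n):
            k2 = (k + 1) % n
            # Two triangles tile each quad between adjacent contours.
            area += tri_area(lower[k], lower[k2], upper[k])
            area += tri_area(upper[k], upper[k2], lower[k2])
    # Close the cap with a triangle fan to the pole.
    top, pole = contour(r, zs[-1], n), (0.0, 0.0, r)
    for k in range(n):
        area += tri_area(top[k], top[(k + 1) % n], pole)
    return area

exact = 2 * math.pi * 2.5 ** 2       # hemisphere of 5 cm diameter
approx = hemisphere_area(2.5, 0.2)   # 2 mm "slices" (in cm)
```

With idealised circular contours the triangulated (inscribed) surface slightly underestimates the true area; the paper's slice-thickness-dependent over/underestimates arise from real CT sampling effects that this sketch does not model.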
Abstract:
For the analysis of material nonlinearity, an effective shear modulus approach based on the strain control method is proposed in this paper using the point collocation method. Hencky’s total deformation theory is used to evaluate the effective shear modulus, Young’s modulus and Poisson’s ratio, which are treated as spatial field variables. These effective properties are obtained by the strain-controlled projection method in an iterative manner. To evaluate the second-order derivatives of the shape functions at the field points, radial basis functions (RBFs) in the local support domain are used. Several numerical examples are presented to demonstrate the efficiency and accuracy of the proposed method, and comparisons are made with analytical solutions and the finite element method (ABAQUS).
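A hedged sketch of the RBF ingredient mentioned above: because the Gaussian kernel has an analytic second derivative, second-order derivatives of an RBF interpolant at a field point are just weighted sums of kernel derivatives. The shape parameter, node layout and test function below are invented for the example.

```python
import math

def gauss_solve(A, b):
    """Small dense solver (Gaussian elimination with partial pivoting)."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for j in range(i, n + 1):
                M[r][j] -= f * M[i][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

EPS = 3.0  # Gaussian shape parameter (an arbitrary choice here)

def phi(d):
    return math.exp(-(EPS * d) ** 2)

def phi_dd(d):
    # Analytic second derivative of the Gaussian kernel.
    return math.exp(-(EPS * d) ** 2) * (4 * EPS ** 4 * d * d - 2 * EPS ** 2)

# Interpolate sin(x) on [0, pi] and recover its second derivative (-sin x).
nodes = [k * math.pi / 14 for k in range(15)]
A = [[phi(xi - xj) for xj in nodes] for xi in nodes]
coef = gauss_solve(A, [math.sin(x) for x in nodes])

def d2(x):
    """Second derivative of the RBF interpolant at a field point."""
    return sum(c * phi_dd(x - xj) for c, xj in zip(coef, nodes))
```

In a collocation setting, `d2`-style evaluations over each node's local support domain supply the derivative terms of the governing equations without a background mesh.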
Abstract:
Collaborative question answering (cQA) portals such as Yahoo! Answers allow users, as askers or answer authors, to communicate and exchange information through the asking and answering of questions in the network. In their current set-up, answers to a question are arranged in chronological order. For effective information retrieval, it would be advantageous to have the users’ answers ranked according to their quality. This paper proposes a novel approach to evaluating and ranking the users’ answers and recommending the top-n quality answers to information seekers. The proposed approach is based on a user-reputation method which assigns a score to an answer reflecting its author’s reputation level in the network. The approach is evaluated on a dataset collected from a live cQA portal, namely Yahoo! Answers. To compare against the results obtained by the non-content-based user-reputation method, experiments were also conducted with several content-based methods that assign a score to an answer reflecting its content quality. Various combinations of non-content-based and content-based scores were also compared. Empirical analysis shows that the proposed method is able to rank the users’ answers and recommend the top-n answers with good accuracy. The results of the proposed method outperform the content-based methods, the various combinations, and the results obtained by the popular link analysis method, HITS.
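A toy sketch of the non-content-based idea: each answer inherits a score from its author's reputation, and the top-n answers are recommended instead of the chronological list. The reputation estimate here (a Laplace-smoothed best-answer ratio) and all data are assumptions, not the paper's scheme.

```python
def reputation(user, history):
    """Fraction of a user's past answers chosen as 'best', Laplace-smoothed."""
    best, total = history.get(user, (0, 0))
    return (best + 1) / (total + 2)

def rank_answers(answers, history, n=2):
    """Score answers by author reputation and return the top-n answer ids."""
    scored = [(reputation(a["author"], history), a["id"]) for a in answers]
    scored.sort(key=lambda t: (-t[0], t[1]))   # high reputation first
    return [aid for _, aid in scored[:n]]

# Invented history: (best answers, total answers) per user.
history = {"alice": (8, 10), "bob": (1, 10), "carol": (5, 20)}
answers = [
    {"id": "a1", "author": "bob"},
    {"id": "a2", "author": "alice"},
    {"id": "a3", "author": "carol"},
]
top = rank_answers(answers, history)
```

The smoothing keeps new authors from scoring exactly zero, and combining this score with a content-based one (as the paper's "various combinations" do) would just be a weighted sum inside `rank_answers`.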
Abstract:
With the growing number of XML documents on the Web, it becomes essential to organise these documents effectively in order to retrieve useful information from them. A possible solution is to apply clustering to the XML documents to discover knowledge that promotes effective data management, information retrieval and query processing. However, many issues arise in discovering knowledge from these types of semi-structured documents due to their heterogeneity and structural irregularity. Most existing research on clustering techniques focuses on only one feature of the XML documents, either their structure or their content, due to scalability and complexity problems. The knowledge gained in the form of clusters based on structure or content alone is not suitable for real-life datasets. It therefore becomes essential to include both the structure and the content of XML documents in order to improve the accuracy and meaning of the clustering solution. However, including both kinds of information in the clustering process results in a huge overhead for the underlying clustering algorithm because of the high dimensionality of the data. The overall objective of this thesis is to address these issues by: (1) proposing methods that utilise frequent pattern mining techniques to reduce the dimensionality; (2) developing models to effectively combine the structure and content of XML documents; and (3) utilising the proposed models in clustering. This research first determines the structural similarity in the form of frequent subtrees and then uses these frequent subtrees to represent the constrained content of the XML documents in order to determine the content similarity. A clustering framework with two types of models, implicit and explicit, is developed. The implicit model uses a Vector Space Model (VSM) to combine the structure and the content information. The explicit model uses a higher-order model, namely a 3rd-order Tensor Space Model (TSM), to combine the structure and the content information explicitly. This thesis also proposes a novel incremental technique to decompose large-sized tensor models and utilises the decomposed solution for clustering the XML documents. The proposed framework and its components were extensively evaluated on several real-life datasets exhibiting extreme characteristics to understand the usefulness of the framework in real-life situations. Additionally, this research evaluates the outcome of the clustering process on the collection selection problem in information retrieval on the Wikipedia dataset. The experimental results demonstrate that the proposed frequent pattern mining and clustering methods outperform related state-of-the-art approaches. In particular, the proposed framework of utilising frequent structures to constrain the content shows an improvement in accuracy over content-only and structure-only clustering results. The scalability experiments conducted on large-scale datasets clearly show the strengths of the proposed methods over state-of-the-art methods. In particular, this thesis contributes to effectively combining the structure and the content of XML documents for clustering, in order to improve the accuracy of the clustering solution. It also addresses research gaps in frequent pattern mining by generating efficient and concise frequent subtrees with various node relationships that can be used in clustering.
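The implicit (VSM) model can be illustrated with a deliberately tiny example. The feature sets, documents and unit weights below are invented for the sketch and are not drawn from the thesis: each document becomes one vector whose first block indicates which frequent subtrees it contains and whose second block holds term counts constrained to those subtrees.

```python
import math

# Hypothetical frequent subtrees (structure features) and terms (content).
SUBTREES = ["book/title", "book/author", "article/abstract"]
TERMS = ["xml", "mining", "cluster"]

def to_vector(doc):
    """Concatenate a structure-indicator block and a term-count block."""
    sv = [1.0 if s in doc["subtrees"] else 0.0 for s in SUBTREES]
    cv = [float(doc["terms"].get(t, 0)) for t in TERMS]
    return sv + cv

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

d1 = {"subtrees": {"book/title", "book/author"}, "terms": {"xml": 2}}
d2 = {"subtrees": {"book/title", "book/author"}, "terms": {"xml": 1, "cluster": 1}}
d3 = {"subtrees": {"article/abstract"}, "terms": {"mining": 3}}
```

Any vector-based clustering algorithm can then work on these combined vectors; the explicit TSM model instead keeps structure and content as separate tensor modes rather than concatenating them.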
Abstract:
In Mango Boulevard Pty Ltd v Spencer [2010] QCA 207, a self-executing order had been made in consequence of continuing default by parties to the proceedings in meeting their disclosure obligations. The case involved several questions about the construction and implications of the self-executing order. This note focuses on the aspects of the case relating to that order.
Abstract:
The formation of a venture relies, in part, upon the participants reaching a shared understanding of purpose and process. Yet in circumstances of great complexity and uncertainty, how can such a shared understanding be created? If the response to complexity and uncertainty is to seek simplicity in order to find commonality, then what is lost and what is at risk? Can shared understandings of purpose and process be arrived at by embracing complexity and uncertainty, and if so, how? These questions led us to explore the process of dialogue and communication of a team in its formative stages. Our interest was centred not upon the behavioural characteristics of the individuals in the 'forming' stage of group dynamics but rather upon the process of cognitive and linguistic turns, the wax and wane of ideas, and the formation of shared meaning. This process of cognitive and linguistic turns was focused thematically on the areas of foresight, innovation, entrepreneurship and public policy. This cross-disciplinary exploration sought to identify potential synergies between these domains, in particular in developing a conceptual basis for long-term thinking that can inform wiser public policy.
Abstract:
An array of monopole elements with reduced element spacing of λ/6 to λ/20 is considered for application in digital beam-forming and direction-finding. The small element spacing introduces strong mutual coupling between the array elements. This paper shows that decoupling can be achieved analytically for arrays with three elements and describes the use of Kuroda’s identities to realize the lumped elements of the derived decoupling network. Design procedures and equations are proposed, and experimental results are presented. The decoupled array has a bandwidth of 1% and a superdirective radiation pattern.
Abstract:
The objective quantification of three-dimensional kinematics during different functional and occupational tasks is now in more demand than ever. The introduction of a new generation of low-cost passive motion capture systems from a number of manufacturers has made this technology accessible for teaching, clinical practice and small/medium industry. Despite the attractive nature of these systems, their accuracy remains unproven in independent tests. We assessed static linear accuracy and dynamic linear accuracy, and compared gait kinematics from a Vicon MX20 system with those from a Natural Point OptiTrack system. In all experiments data were sampled simultaneously. We found that both systems perform excellently in linear accuracy tests, with absolute errors not exceeding 1%. In the gait data there was again strong agreement between the two systems in sagittal and coronal plane kinematics. Transverse plane kinematics differed by up to 3° at the knee and hip, which we attributed to the impact of soft tissue artifact accelerations on the data. We suggest that low-cost systems are comparably accurate to their high-end competitors and offer a platform with accuracy acceptable for research in laboratories with a limited budget.
Abstract:
Purpose: To assess the accuracy of intraocular pressure (IOP) measurements using rebound tonometry over disposable hydrogel (etafilcon A) and silicone hydrogel (senofilcon A) contact lenses (CLs) of different powers. Methods: The experimental group comprised 36 subjects (19 male, 17 female). IOP measurements were taken on the subjects’ right eyes in random order using a rebound tonometer (ICare). The CLs had powers of +2.00 D, −2.00 D and −6.00 D. Six measurements were taken over each contact lens, and also before and after the CLs had been worn. Results: A good correlation was found between IOP measurements with and without CLs (all r ≥ 0.80; p < 0.05). Bland–Altman plots did not show any significant trend in the difference between IOP readings with and without CLs as a function of IOP value. A two-way ANOVA revealed a significant effect of material and power (p < 0.01) but no interaction. All comparisons between the measurements without CLs and with hydrogel CLs were significant (p < 0.01); the comparisons with silicone hydrogel CLs were not. Conclusions: Rebound tonometry can be reliably performed over silicone hydrogel CLs. With hydrogel CLs, the measurements were lower than those without CLs; however, although these differences were statistically significant, their clinical significance was minimal.
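For readers unfamiliar with the analysis used above: a Bland–Altman summary reduces paired measurements to a bias (the mean difference) and 95% limits of agreement, bias ± 1.96 SD of the differences. The IOP values below are invented illustration data, not the study's measurements.

```python
import math

def bland_altman(a, b):
    """Return (bias, lower limit, upper limit) for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical IOP readings (mmHg) without and with a hydrogel CL.
iop_no_cl = [14.0, 16.5, 12.0, 18.0, 15.0]
iop_with_cl = [13.0, 15.5, 11.5, 17.0, 14.0]
bias, lo, hi = bland_altman(iop_no_cl, iop_with_cl)
```

A positive bias, as in this toy data, corresponds to the study's finding that readings taken over hydrogel lenses are systematically lower; whether the limits of agreement are narrow enough decides clinical significance.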
Abstract:
When compared with similar joint arthroplasties, the prognosis of Total Ankle Replacement (TAR) is not satisfactory, although it shows promising results post-surgery. To date, most models do not provide the full anatomical functionality and biomechanical range of motion of the healthy ankle joint. This has sparked additional research and evaluation of clinical outcomes in order to enhance ankle prosthesis design. However, the limited biomechanical data that exist in the literature are based upon two-dimensional, discrete and outdated techniques [1] and may be inaccurate. Since accurate force estimates are crucial to prosthesis design, a paper based on a new biomechanical modeling approach, providing the three-dimensional forces acting on the ankle joint and the surrounding tissues, was published recently, but the identified forces were suspected of being under-estimated. The present paper reports an attempt to improve the accuracy of the analysis by means of novel methods for kinematic processing of gait data, provided in release 4.1 of the AnyBody Modeling System (AnyBody Technology, Aalborg, Denmark). Results from the new method are shown and remaining issues are discussed.
Abstract:
In information retrieval (IR) research, increasing focus has been placed on optimizing a query language model by detecting and estimating the dependencies between the query and the observed terms occurring in the selected relevance feedback documents. In this paper, we propose a novel Aspect Language Modeling framework featuring term association acquisition, document segmentation, query decomposition, and an Aspect Model (AM) for parameter optimization. Through the proposed framework, we advance the theory and practice of applying high-order and context-sensitive term relationships to IR. We first decompose a query into subsets of query terms. We then segment the relevance feedback documents into chunks using multiple sliding windows. Finally, we discover the higher-order term associations, that is, the terms in these chunks with a high degree of association to the subsets of the query. In this process, we adopt an approach that combines the AM with Association Rule (AR) mining. In our approach, the AM not only treats the subsets of a query as “hidden” states and estimates their prior distributions, but also evaluates the dependencies between the subsets of a query and the observed terms extracted from the chunks of feedback documents. The AR mining provides a reasonable initial estimate of the high-order term associations by discovering association rules from the document chunks. Experimental results on various TREC collections verify the effectiveness of our approach, which significantly outperforms a baseline language model and two state-of-the-art query language models, namely the Relevance Model and the Information Flow model.
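Two of the steps above, sliding-window segmentation and support-based association mining, can be sketched as follows. The window size, support threshold and toy document are assumptions for the example, not the paper's settings:

```python
def sliding_chunks(tokens, size=4, step=2):
    """Segment a token list into overlapping chunks with a sliding window."""
    return [tokens[i:i + size]
            for i in range(0, max(len(tokens) - size + 1, 1), step)]

def associated_terms(chunks, query_subset, min_support=3):
    """Bare-bones AR step: terms co-occurring with a query-term subset
    in at least `min_support` chunks."""
    counts = {}
    for chunk in chunks:
        if query_subset.issubset(chunk):
            for t in set(chunk) - query_subset:
                counts[t] = counts.get(t, 0) + 1
    return {t for t, c in counts.items() if c >= min_support}

doc = ("query language model retrieval query language model ranking "
       "query language aspect model").split()
chunks = [set(c) for c in sliding_chunks(doc)]
assoc = associated_terms(chunks, {"query", "language"})
```

In the full framework these supported associations would only seed the Aspect Model, whose hidden-state estimation then refines the dependencies between query subsets and observed terms.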
Abstract:
Purpose: To demonstrate that relatively simple third-order theory can provide a framework showing how peripheral refraction can be manipulated by altering the forms of spectacle lenses. Method: Third-order equations were used to yield lens forms that correct peripheral power errors, either for the lenses alone or in combination with the typical peripheral refractions of myopic eyes. These results were compared with those of finite ray-tracing. Results: The approximate forms of spherical and conicoidal lenses provided by third-order theory were flatter over a moderate myopic range than the forms obtained by rigorous ray-tracing. Lenses designed to correct peripheral refractive errors produced large errors when used with foveal vision and a rotating eye. Correcting astigmatism tended to give large errors in mean oblique error, and vice versa. When only spherical lens forms are used, correcting the relative hypermetropic peripheral refractions observed experimentally in myopic eyes, or providing relative myopic peripheral refractions in such eyes, seems impossible in the majority of cases. Conclusion: The third-order spectacle lens design approach can readily be used to show trends in peripheral refraction.