315 results for accuracy of estimation


Relevance:

100.00%

Publisher:

Abstract:

Electrical resistivity of soils and sediments is strongly influenced by the presence of interstitial water. Taking advantage of this dependency, electrical-resistivity imaging (ERI) can be effectively used to estimate subsurface soil-moisture distributions. The ability to obtain spatially extensive data, combined with time-lapse measurements, provides further opportunities to understand links between land use and climate processes. In natural settings, spatial and temporal changes in temperature and porewater salinity influence the relationship between soil moisture and electrical resistivity. Apart from environmental factors, technical, theoretical, and methodological ambiguities may also interfere with accurate estimation of soil moisture from ERI data. We have examined several of these complicating factors using data from a two-year study at a forest-grassland ecotone, a boundary between neighboring but different plant communities. At this site, temperature variability accounts for approximately 20%-45% of resistivity changes from cold winter to warm summer months. Temporal changes in groundwater conductivity (mean = 650 µS/cm, σ = 57.7) and a roughly 100 µS/cm spatial difference between the forest and grassland had only a minor influence on the moisture estimates. Significant seasonal fluctuations in temperature and precipitation had negligible influence on the basic measurement errors in the data sets. Extracting accurate temporal changes from ERI can be hindered by nonuniqueness of the inversion process and uncertainties related to time-lapse inversion schemes. The accuracy of soil moisture obtained from ERI depends on all of these factors, in addition to the empirical parameters that define the petrophysical soil-moisture/resistivity relationship. Many of the complicating factors and modifying variables involved in accurately quantifying soil-moisture changes with ERI can be accounted for using field data and theoretical principles.
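
The abstract does not reproduce the petrophysical relationship it relies on, so the sketch below only illustrates the general idea: converting a temperature-corrected bulk resistivity to volumetric water content with an Archie-type model. The linear ratio temperature correction, the Archie parameters and the numerical values are illustrative assumptions, not the site calibration used in the study.

```python
import numpy as np

def correct_to_reference_temperature(rho, t_celsius, t_ref=25.0, c=0.02):
    """Standardise a field resistivity measured at t_celsius to a reference
    temperature with a linear ratio model (~2% change per degree C; c is
    illustrative and would be calibrated for the site)."""
    return rho * (1.0 + c * (t_celsius - t_ref))

def moisture_from_resistivity(rho, rho_water, phi, a=1.0, m=1.5, n=2.0):
    """Invert an Archie-type relationship for volumetric water content.
    rho: bulk resistivity (ohm-m), rho_water: porewater resistivity (ohm-m),
    phi: porosity. a, m, n are empirical parameters that must be calibrated;
    the defaults are placeholders."""
    saturation = (a * rho_water / (rho * phi**m)) ** (1.0 / n)
    return phi * np.clip(saturation, 0.0, 1.0)

# Illustrative use: a reading of 120 ohm-m at 18 C, porosity 0.40, and
# porewater resistivity ~15.4 ohm-m (i.e. ~650 microS/cm conductivity).
rho_ref = correct_to_reference_temperature(120.0, 18.0)
theta = moisture_from_resistivity(rho_ref, rho_water=15.4, phi=0.40)
print(f"estimated volumetric water content: {theta:.2f}")
```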

Relevance:

100.00%

Publisher:

Abstract:

Experimental work can be conducted either in the laboratory or at a field site. Generally, laboratory experiments are carried out in an artificial, highly controlled environment. By contrast, field experiments often take place in a natural setting and are subject to the influence of many uncontrolled factors. It is therefore necessary to carefully assess the possible limitations and appropriateness of an experiment before embarking on it. In this paper, a case study of field monitoring of the energy performance of air conditioners is presented. Significant challenges facing the experimental work are described, and lessons learnt from the case study are discussed. In particular, it was found that on-going analysis of the monitoring data and the correction of abnormal issues are two of the keys to a successful field test program. It was also shown that the installation of the monitoring systems can have a significant impact on the accuracy of the data being collected. Before a monitoring system is set up to collect data, it is recommended that an initial analysis of sample monitored data be conducted to confirm that the data can achieve the expected precision. Where inherent errors are inevitably introduced by the installation of field monitoring systems, appropriate remediation may need to be developed and implemented to improve the accuracy of the estimated results. On-going analysis of monitoring data and correction of any abnormal issues remain the key to a successful field test program.

Relevance:

100.00%

Publisher:

Abstract:

Urbanisation significantly changes the characteristics of a catchment as natural areas are transformed into impervious surfaces such as roads, roofs and parking lots. The increased fraction of impervious surfaces changes the stormwater runoff characteristics, while a variety of anthropogenic activities common to urban areas generate a range of pollutants such as nutrients, solids and organic matter. These pollutants accumulate on catchment surfaces and are removed and transported by stormwater runoff, thereby contributing pollutant loads to receiving waters. In summary, urbanisation influences the stormwater characteristics of a catchment, including hydrology and water quality. Due to the growing recognition that stormwater pollution is a significant environmental problem, the implementation of mitigation strategies to improve the quality of stormwater runoff is becoming increasingly common in urban areas. A scientifically robust stormwater quality treatment strategy is an essential requirement for effective urban stormwater management. The efficient design of treatment systems depends closely on the state of knowledge of the primary factors influencing stormwater quality. In this regard, stormwater modelling outcomes provide designers with important guidance and datasets which significantly underpin the design of effective stormwater treatment systems. Therefore, the accuracy of modelling approaches and the reliability of modelling outcomes are of particular concern. This book discusses the inherent complexity and key characteristics of urban hydrology and stormwater quality, based on the influence exerted by a range of rainfall and catchment characteristics. A comprehensive field sampling and testing programme on pollutant build-up, an urban catchment monitoring programme on stormwater quality and the outcomes of advanced statistical analyses provided the platform for the knowledge creation. Two case studies and two real-world applications illustrate the translation of this knowledge into practice regarding the role of rainfall and catchment characteristics in urban stormwater quality. An innovative rainfall classification based on stormwater quality was developed to support the effective and scientifically robust design of stormwater treatment systems. Underpinned by the rainfall classification methodology, a reliable approach for design rainfall selection is proposed in order to optimise stormwater treatment based on both stormwater quality and quantity. This is a paradigm shift from the common approach in which stormwater treatment systems are designed based solely on stormwater quantity data. Additionally, how pollutant build-up and stormwater runoff quality vary with a range of catchment characteristics was also investigated. Based on the study outcomes, it can be concluded that the use of only a limited number of catchment parameters, such as land use and impervious surface percentage, as is the case in current modelling approaches, could result in appreciable error in water quality estimation. Influential catchment characteristics incorporated into modelling should also include urban form and the distribution of impervious surface area.
The knowledge created through the research investigations discussed in this monograph is expected to make a significant contribution to engineering practice such as hydrologic and stormwater quality modelling, stormwater treatment design and urban planning, as the study outcomes provide practical approaches and recommendations for urban stormwater quality enhancement. Furthermore, this monograph demonstrates how fundamental knowledge of stormwater quality processes can be translated into guidance for engineering practice, the comprehensive application of multivariate data analysis techniques, and a paradigm for the integrative use of computer and mathematical models to derive practical outcomes.
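
The monograph's own water quality models are not specified in this summary; as a minimal illustration of the kind of pollutant build-up and wash-off relationships that underpin such modelling, the sketch below uses commonly cited exponential forms with purely illustrative coefficients (not the formulations or values developed in the book).

```python
import numpy as np

def pollutant_buildup(dry_days, b_max=12.0, k=0.4):
    """Exponential build-up of surface pollutant load (kg/ha) over the
    antecedent dry period; b_max and k are illustrative, site-specific values."""
    return b_max * (1.0 - np.exp(-k * dry_days))

def washoff_fraction(rainfall_mm, w=0.1):
    """Exponential wash-off fraction of the available load for a storm of a
    given depth; w is an illustrative wash-off coefficient."""
    return 1.0 - np.exp(-w * rainfall_mm)

# Illustrative use: load exported by a 20 mm storm after 7 antecedent dry days.
available = pollutant_buildup(7)
exported = available * washoff_fraction(20.0)
print(f"available {available:.1f} kg/ha, exported {exported:.1f} kg/ha")
```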

Relevance:

100.00%

Publisher:

Abstract:

Cost estimating has been acknowledged as a crucial component of construction projects. Depending on the available information and project requirements, cost estimates evolve in tandem with project lifecycle stages: conceptualisation, design development, execution and facility management. The accuracy of cost estimates is crucial to producing project tenders and, eventually, to budget management. Notwithstanding the initial slow pace of its adoption, Building Information Modelling (BIM) has successfully addressed a number of challenges previously characteristic of traditional approaches in the AEC industry, including poor communication, the prevalence of islands of information and frequent rework. It is therefore conceivable that BIM can be leveraged to address specific shortcomings of cost estimation. The impetus for leveraging BIM models for accurate cost estimation is to align budgeted and actual cost. This paper hypothesises that the accuracy of BIM-based estimation, as a more efficient process mirror of traditional cost estimation methods, can be enhanced by simulating the factors and variables of traditional cost estimation. Through literature reviews and preliminary expert interviews, this paper explores the factors that could potentially lead to more accurate cost estimates for construction projects. The findings show numerous factors that affect cost estimates, ranging from project information and its characteristics to the project team, clients, contractual matters and other external influences. This paper makes a particular contribution to the early phase of BIM-based project estimation.

Relevance:

100.00%

Publisher:

Abstract:

Objective: We aimed to assess the impact of task demands and individual characteristics on threat detection in baggage screeners. Background: Airport security staff work under time constraints to ensure optimal threat detection. Understanding the impact of individual characteristics and task demands on performance is vital to ensure accurate threat detection. Method: We examined threat detection in baggage screeners as a function of event rate (i.e., number of bags per minute) and time on task across 4 months. We measured performance in terms of the accuracy of detection of Fictitious Threat Items (FTIs) randomly superimposed on X-ray images of real passenger bags. Results: Analyses of the percentage of correct FTI identifications (hits) show that longer shifts with high baggage throughput result in worse threat detection. Importantly, these significant performance decrements emerge within the first 10 min of these busy screening shifts only. Conclusion: Longer shift lengths, especially when combined with high baggage throughput, increase the likelihood that threats go undetected. Application: Shorter shift rotations, although perhaps difficult to implement during busy screening periods, would ensure more consistently high vigilance in baggage screeners and, therefore, optimal threat detection and passenger safety.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: To examine the effects of gaze position and optical blur, similar to that used in multifocal corrections, on stepping accuracy in a precision stepping task among older adults. Methods: Nineteen healthy older adults (mean age 71.6 ± 8.8 years) with normal vision performed a series of precision stepping tasks onto a fixed target. The stepping tasks were performed using a repeated-measures design for three gaze positions (fixating on the stepping target, and 30 and 60 cm farther forward of the stepping target) and two visual conditions (best-corrected vision and +2.50 DS blur). Participants' gaze position was tracked using a head-mounted eye tracker. Absolute, anteroposterior, and mediolateral foot placement errors and within-subject foot placement variability were calculated from the locations of foot and floor-mounted retroreflective markers captured by flash photography of the final foot position. Results: Participants made significantly larger absolute and anteroposterior foot placement errors and exhibited greater foot placement variability when their gaze was directed farther forward of the stepping target. Blur led to significantly increased absolute and anteroposterior foot placement errors and increased foot placement variability. Furthermore, blur differentially increased the absolute and anteroposterior foot placement errors and variability when gaze was directed 60 cm farther forward of the stepping target. Conclusions: Directing gaze farther ahead of stepping locations and the presence of blur negatively impact the stepping accuracy of older adults. These findings indicate that blur, similar to that used in multifocal corrections, has the potential to increase the risk of trips and falls among older populations when negotiating challenging environments where precision stepping is required, particularly as gaze is directed farther ahead of stepping locations while walking.
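
As an illustration of how the reported outcome measures could be computed, the sketch below derives absolute, anteroposterior and mediolateral foot placement errors and within-subject variability from foot and target coordinates. The coordinate convention and the numbers are hypothetical, not the study data.

```python
import numpy as np

# Hypothetical 2D floor-plane coordinates (metres): x = anteroposterior,
# y = mediolateral. Rows are repeated trials for one participant.
target = np.array([0.60, 0.00])
foot_positions = np.array([
    [0.63, 0.01],
    [0.66, -0.02],
    [0.58, 0.02],
])

errors = foot_positions - target
ap_error = np.abs(errors[:, 0]).mean()             # anteroposterior error
ml_error = np.abs(errors[:, 1]).mean()             # mediolateral error
abs_error = np.linalg.norm(errors, axis=1).mean()  # absolute (resultant) error
variability = foot_positions.std(axis=0, ddof=1)   # within-subject variability

print(ap_error, ml_error, abs_error, variability)
```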

Relevance:

90.00%

Publisher:

Abstract:

The time for conducting Preventive Maintenance (PM) on an asset is often determined using a predefined alarm limit based on trends in a hazard function. In this paper, the authors propose using both the hazard and the reliability functions to improve the accuracy of the prediction, particularly when the failure characteristics over the asset's whole life are modelled using different failure distributions for different stages of its life. The proposed method is validated using simulations and case studies.
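
The paper's specific prediction scheme is not reproduced in the abstract; the sketch below only illustrates the general idea of checking both a hazard function and a reliability function against alarm limits, assuming a two-stage Weibull life model with illustrative parameters and thresholds.

```python
import numpy as np

def weibull_hazard(t, beta, eta):
    """Weibull hazard function h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def weibull_reliability(t, beta, eta):
    """Weibull reliability (survival) function R(t) = exp(-(t/eta)**beta)."""
    return np.exp(-((t / eta) ** beta))

def pm_due(t, stage_params, hazard_limit=0.01, reliability_limit=0.90):
    """Flag PM when either the hazard exceeds its alarm limit or the
    reliability falls below its limit, using the failure distribution for the
    life stage that t falls in. stage_params: list of (t_start, beta, eta)."""
    t_start, beta, eta = max(
        (p for p in stage_params if p[0] <= t), key=lambda p: p[0]
    )
    age = t - t_start + 1e-9  # age within the current life stage
    h = weibull_hazard(age, beta, eta)
    r = weibull_reliability(age, beta, eta)
    return h > hazard_limit or r < reliability_limit

# Illustrative two-stage model: early life (beta < 1) then wear-out (beta > 1).
stages = [(0.0, 0.8, 20000.0), (5000.0, 2.5, 3000.0)]
print([pm_due(t, stages) for t in (1000.0, 6000.0, 7500.0)])  # -> [False, False, True]
```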

Relevance:

90.00%

Publisher:

Abstract:

Cat’s claw creeper, Macfadyena unguis-cati (L.) Gentry (Bignoniaceae), is a major environmental weed of riparian areas, rainforest communities and remnant natural vegetation in coastal Queensland and New South Wales, Australia. In densely infested areas, it smothers standing vegetation, including large trees, and causes canopy collapse. Quantitative data on the ecology of this invasive vine are generally lacking. The present study examines the underground tuber traits of M. unguis-cati and explores their links with aboveground parameters at five infested sites spanning both riparian and inland vegetation. Tubers were abundant in terms of density (~1000 per m²), although small in size and low in level of interconnectivity. M. unguis-cati also exhibits multiple stems per plant. Of all the traits screened, the link between stand (stem) density and tuber density was the most significant, yielding a promising bivariate relationship for the estimation, prediction and management of what lies beneath the soil surface of a given M. unguis-cati infestation site. The study also suggests that new recruitment is primarily from seeds, not from vegetative propagation as previously thought. The results highlight the need for future biological-control efforts to focus on introducing specialist seed- and pod-feeding insects to reduce seed output.
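
As an illustration of how the reported bivariate relationship could be used for prediction, the sketch below fits a simple linear regression of tuber density on stand (stem) density. The data and the resulting coefficients are invented for the example; the actual relationship is reported in the study.

```python
import numpy as np

# Hypothetical paired field measurements per quadrat (counts per square metre).
stem_density = np.array([40, 75, 120, 180, 240, 310])
tuber_density = np.array([150, 260, 430, 620, 830, 1050])

# Ordinary least-squares fit: tuber_density ~ slope * stem_density + intercept.
slope, intercept = np.polyfit(stem_density, tuber_density, 1)
predicted = slope * stem_density + intercept
r_squared = 1 - np.sum((tuber_density - predicted) ** 2) / \
    np.sum((tuber_density - tuber_density.mean()) ** 2)

# Predict what lies below ground for a new stand with 200 stems per square metre.
print(slope * 200 + intercept, r_squared)
```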

Relevance:

90.00%

Publisher:

Abstract:

With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service according to the requirements of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery in matching the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user’s interest. Considering the semantic relationships of the words used to describe the services, as well as their input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individually matched services should then fully satisfy the requirements the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology has been proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web service description language documents, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find hidden meaning in the query terms that could not otherwise be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum cost of traversal. The third phase, system integration, integrates the results from the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations of individual and composite Web services to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and the machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase-I for linking. Empirical results also show that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase-I) and the link analysis (phase-II) in a systematic fashion.
Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
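
The thesis does not state which all-pairs shortest-path algorithm is used in the link analysis phase; the sketch below assumes Floyd-Warshall, with hypothetical service names and linking costs, to illustrate how a minimum-cost composition could be recovered from the graph of matched services.

```python
import math

def floyd_warshall(services, edges):
    """All-pairs shortest paths over a directed graph of Web services.
    edges maps (src, dst) -> linking cost; unreachable pairs stay at infinity."""
    dist = {(u, v): (0.0 if u == v else math.inf) for u in services for v in services}
    nxt = {}
    for (u, v), w in edges.items():
        dist[(u, v)] = w
        nxt[(u, v)] = v
    for k in services:
        for i in services:
            for j in services:
                if dist[(i, k)] + dist[(k, j)] < dist[(i, j)]:
                    dist[(i, j)] = dist[(i, k)] + dist[(k, j)]
                    nxt[(i, j)] = nxt[(i, k)]
    return dist, nxt

def compose(nxt, src, dst):
    """Reconstruct the cheapest service composition from src to dst."""
    if (src, dst) not in nxt:
        return None
    path = [src]
    while src != dst:
        src = nxt[(src, dst)]
        path.append(src)
    return path

# Hypothetical candidate services and linking costs from phase-I matching.
services = ["FlightSearch", "FareQuote", "Booking", "Payment"]
edges = {("FlightSearch", "FareQuote"): 1.0, ("FareQuote", "Booking"): 2.0,
         ("FlightSearch", "Booking"): 4.0, ("Booking", "Payment"): 1.0}
dist, nxt = floyd_warshall(services, edges)
print(dist[("FlightSearch", "Payment")], compose(nxt, "FlightSearch", "Payment"))
```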

Relevance:

90.00%

Publisher:

Abstract:

This report summarises a project designed to enhance commercial real estate performance in both operational and investment contexts through the development of a model aimed at supporting improved decision-making. The model is based on a risk-adjusted discounted cash flow and provides a valuable toolkit for building managers, owners and potential investors for evaluating individual building performance against financial, social and environmental criteria over the complete life cycle of the asset. The ‘triple bottom line’ approach to the evaluation of commercial property has particular significance for the administrators of public property portfolios. It also has broader applications for the wider real estate industry, given that the advent of ‘green’ construction requires new methods for evaluating both new and existing building stocks. The research is unique in that it focuses on the accuracy of the input variables required by the model. These key variables were largely determined by market-based research and an extensive literature review, and have been fine-tuned with extensive testing. In essence, the project considered probability-based risk analysis techniques that required market-based assessment. The projections listed in the partner engineers’ building audit reports for the four case-study buildings were fed into the property evaluation model developed by the research team. The results are strongly consistent with previously existing, less robust evaluation techniques. Importantly, this model pioneers an approach that takes full account of the triple bottom line, establishing a benchmark for related research to follow. The project’s industry partners expressed a high degree of satisfaction with the project outcomes at a recent demonstration seminar. The project in its existing form has not been geared towards commercial applications, but it is anticipated that QDPW and other industry partners will benefit greatly from using this tool for the performance evaluation of property assets. The project met the objectives of the original proposal as well as all the specified milestones, and was completed within budget and on time. The project achieved its objective by establishing research foci on the model structure, the identification of key input variables, the drivers of the relevant property markets and the determinants of the key variables (Research Engine no. 1); the examination of risk measurement and the incorporation of risk simulation exercises (Research Engine no. 2); and the importance of environmental and social factors and the impact of the triple-bottom-line measures on the asset (Research Engine no. 3).
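
The evaluation model itself is not reproduced in this summary; as a minimal sketch of a risk-adjusted discounted cash flow of the kind described, the example below expresses the risk adjustment as a premium added to the discount rate (one common convention; the project model may treat risk differently, for example through simulation) and uses purely illustrative figures.

```python
def risk_adjusted_npv(cash_flows, risk_free_rate, risk_premium):
    """Net present value of annual net cash flows (years 1..n) discounted at a
    risk-adjusted rate. The cash flows may include financial, social and
    environmental components, provided they are expressed in dollar terms."""
    rate = risk_free_rate + risk_premium
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows, start=1))

# Illustrative 5-year holding period: net operating income minus outgoings,
# with the terminal (sale) value included in the final year.
flows = [120_000, 123_000, 126_000, 129_000, 1_650_000]
print(f"risk-adjusted NPV: ${risk_adjusted_npv(flows, 0.045, 0.03):,.0f}")
```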

Relevance:

90.00%

Publisher:

Abstract:

Introduction: Bone mineral density (BMD) is currently the preferred surrogate for bone strength in clinical practice. Finite element analysis (FEA) is a computer simulation technique that can predict the deformation of a structure when a load is applied, providing a measure of stiffness (N mm⁻¹). Finite element analysis of X-ray images (3D-FEXI) is an FEA technique whose analysis is derived from a single 2D radiographic image. Methods: 18 excised human femora had previously been scanned with quantitative computed tomography, from which 2D BMD-equivalent radiographic images were derived, and mechanically tested to failure in a stance-loading configuration. A 3D proximal femur shape was generated from each 2D radiographic image and used to construct 3D-FEA models. Results: The coefficient of determination (R², %) for predicting failure load was 54.5% for BMD and 80.4% for 3D-FEXI. Conclusions: This ex vivo study demonstrates that 3D-FEXI derived from a conventional 2D radiographic image has the potential to significantly increase the accuracy of failure load assessment of the proximal femur compared with that currently achieved with BMD. This approach may be readily extended to routine clinical BMD images derived by dual energy X-ray absorptiometry.
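
For readers unfamiliar with the reported statistic, the sketch below shows how the coefficient of determination for predicting failure load from a single predictor (BMD or 3D-FEXI stiffness) could be computed. The data are invented; the 54.5% and 80.4% values come from the study itself.

```python
import numpy as np

def r_squared_percent(predictor, failure_load):
    """Coefficient of determination (%) of a simple linear regression of
    measured failure load on a single predictor."""
    slope, intercept = np.polyfit(predictor, failure_load, 1)
    fitted = slope * predictor + intercept
    ss_res = np.sum((failure_load - fitted) ** 2)
    ss_tot = np.sum((failure_load - failure_load.mean()) ** 2)
    return 100.0 * (1.0 - ss_res / ss_tot)

# Hypothetical values for a handful of femora (failure load in N).
bmd = np.array([0.71, 0.85, 0.92, 1.05, 1.18])             # g/cm^2
fexi_stiffness = np.array([950, 1400, 1650, 2100, 2600])   # N/mm
failure_load = np.array([3100, 4500, 4900, 6400, 7600])
print(r_squared_percent(bmd, failure_load),
      r_squared_percent(fexi_stiffness, failure_load))
```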

Relevance:

90.00%

Publisher:

Abstract:

Differential axial shortening, distortion and deformation in high-rise buildings are a serious concern. They are caused by three time-dependent modes of volume change: shrinkage, creep and elastic shortening, which take place in every concrete element during and after construction. Vertical concrete components in a high-rise building are sized and designed on the basis of their strength demand to carry gravity and lateral loads. Columns and walls are therefore sized, shaped and reinforced differently, with varying concrete grades and volume-to-surface-area ratios. These structural components may be subjected to the detrimental effects of differential axial shortening, which escalate with increasing building height and can have an adverse impact on other structural and non-structural elements. Limited procedures are available to quantify axial shortening, and the results obtained from them differ because each procedure is based on different assumptions and is limited to a few parameters. All of this points to the need to develop an accurate numerical procedure to quantify the axial shortening of concrete buildings, taking into account the important time-varying functions of (i) construction sequence, (ii) Young’s modulus and (iii) the creep and shrinkage models associated with reinforced concrete. General assumptions are refined to minimize the variability of creep and shrinkage parameters and improve the accuracy of the results. Finite element techniques are used in the procedure, which employs time-history analysis along with compression-only elements to simulate staged construction behaviour. This paper presents such a procedure and illustrates it through an example. Keywords: Differential Axial Shortening, Concrete Buildings, Creep and Shrinkage, Construction Sequence, Finite Element Method.
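
The paper's numerical procedure is not reproduced in the abstract; the sketch below only illustrates the underlying idea that the axial shortening of a column lift is the sum of elastic, creep and shrinkage strains accumulated as construction proceeds. The formulation is deliberately simplified and the coefficients are illustrative, not drawn from any particular creep or shrinkage model.

```python
def stage_shortening(stress_increments_mpa, elastic_modulus_mpa,
                     creep_coefficient, shrinkage_strain, storey_height_mm):
    """Axial shortening (mm) of one column lift: elastic strain from each load
    increment applied as storeys above are built, amplified by a creep
    coefficient, plus shrinkage. Simplified, illustrative formulation only."""
    elastic = sum(stress_increments_mpa) / elastic_modulus_mpa
    creep = creep_coefficient * elastic
    return (elastic + creep + shrinkage_strain) * storey_height_mm

# Illustrative comparison of two vertical elements at the same level whose
# differing load histories and properties give different shortening values.
column = stage_shortening([4.0, 3.5, 3.0], 30_000, 2.0, 400e-6, 3500)
wall = stage_shortening([2.5, 2.0, 1.8], 32_000, 1.6, 350e-6, 3500)
print(f"differential shortening: {column - wall:.2f} mm per storey")
```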

Relevance:

90.00%

Publisher:

Abstract:

Computer-aided joint replacement surgery has become very popular in recent years and is being performed in increasing numbers all over the world. The accuracy of the system depends to a major extent on accurate registration and on the immobility of the tracker attachment devices in the bone. This study was designed to assess the forces needed to displace the tracker attachment devices in bone simulators. Bone simulators were used to maintain the uniformity of the bone structure during the study. The fixation devices tested were a 3 mm diameter self-drilling, self-tapping threaded pin, a 4 mm diameter self-tapping cortical threaded pin, a 5 mm diameter self-tapping cancellous threaded pin and a triplanar fixation device (‘ortholock’) used with three 3 mm pins. All the devices were tested for pull-out, translational and rotational forces in unicortical and bicortical fixation modes. Also tested were the forces generated by a normal bang and by leaning on the devices. The forces required to produce translation increased with the diameter of the pins: 105 N, 185 N and 225 N for unicortical fixation, and 130 N, 200 N and 225 N for bicortical fixation, for the 3 mm, 4 mm and 5 mm diameter pins respectively. The forces required to pull out the pins were 1475 N, 1650 N and 2050 N for unicortical fixation, and 1020 N, 3044 N and 3042 N for bicortical fixation, for the 3 mm, 4 mm and 5 mm diameter pins. The ortholock was tested to 900 N in translation and 920 N in pull-out and still did not fail. The rotatory forces required to displace the tracker on single pins were of the order of 30 N before failure, whereas the ortholock device had rotational forces applied up to 135 N and still did not fail. The manual leaning forces and sudden bang forces generated were of the order of 210 N and 150 N respectively. The strength of the fixation pins in translation increases with increasing diameter from 3 to 5 mm. There is no significant difference between the pull-out forces of the 4 mm and 5 mm diameter pins, although both exceed that of the 3 mm diameter pins; this reflects failure of the material at that stage rather than of the fixation device. The rotatory forces required to displace the tracker on single pins are very small and much less than those that can be produced by the surgeon or assistants. Although the ortholock device was tested to 135 N in rotation without failing, great care must be taken not to apply forces to the tracker devices during the operation, to ensure the accuracy of the procedure.

Relevance:

90.00%

Publisher:

Abstract:

Many older adults have difficulty using modern consumer products due to their complexity, both in terms of functionality and interface design. It has also been observed that older people have more problems learning new systems. It was hypothesised that designing technological products that are more intuitive for older people to use can solve this problem. An intuitive interface allows a user to employ prior knowledge, thus minimizing the learning needed for effective interaction. This paper discusses an experiment investigating the effectiveness of redundancy in interface design. The primary objective of the experiment was to find out whether using more than one modality in a product’s interface improves the speed and intuitiveness of interactions for older adults. Preliminary analysis showed a strong correlation between technology familiarity and time on task, but redundancy in interface design improved the speed and accuracy of use only for participants with moderate to high technology familiarity.

Relevance:

90.00%

Publisher:

Abstract:

The accuracy of data derived from linked-segment models depends on how well the system has been represented. Previous investigations describing the gait of persons with partial foot amputation did not account for the unique anthropometry of the residuum or the inclusion of a prosthesis and footwear in the model and, as such, are likely to have underestimated the magnitude of the peak joint moments and powers. This investigation determined the effect of inaccuracies in the anthropometric input data on the kinetics of gait. Toward this end, a geometric model was developed and validated to estimate body segment parameters of various intact and partial feet. These data were then incorporated into customized linked-segment models, and the kinetic data were compared with that obtained from conventional models. Results indicate that accurate modeling increased the magnitude of the peak hip and knee joint moments and powers during terminal swing. Conventional inverse dynamic models are sufficiently accurate for research questions relating to stance phase. More accurate models that account for the anthropometry of the residuum, prosthesis, and footwear better reflect the work of the hip extensors and knee flexors to decelerate the limb during terminal swing phase.
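
As an illustration of where the body segment parameters enter the kinetic calculations, the sketch below performs the single-segment Newton-Euler step that a linked-segment model repeats up the limb. The values are placeholders; the point is that residuum-, prosthesis- and footwear-specific parameters (mass, centre of mass, moment of inertia) change the resulting proximal joint force and moment.

```python
import numpy as np

def cross2(a, b):
    """z-component of the cross product of two 2D vectors."""
    return a[0] * b[1] - a[1] * b[0]

def proximal_joint_load(mass, inertia, com_acc, ang_acc, r_prox, r_dist,
                        distal_force, distal_moment, g=(0.0, -9.81)):
    """2D Newton-Euler step for one segment: given the reaction force and
    moment already computed at the distal joint, return the force and moment
    at the proximal joint. r_prox and r_dist are vectors from the segment's
    centre of mass to the proximal and distal joints; mass, inertia and the
    centre-of-mass location are the anthropometric (body segment) parameters."""
    g = np.asarray(g)
    f_prox = mass * com_acc - mass * g - distal_force
    m_prox = (inertia * ang_acc - distal_moment
              - cross2(r_dist, distal_force) - cross2(r_prox, f_prox))
    return f_prox, m_prox

# Placeholder swing-phase numbers for a shank-plus-prosthesis segment.
force, moment = proximal_joint_load(
    mass=3.2, inertia=0.045,
    com_acc=np.array([1.5, -2.0]), ang_acc=-8.0,
    r_prox=np.array([0.0, 0.18]), r_dist=np.array([0.0, -0.22]),
    distal_force=np.array([5.0, -28.0]), distal_moment=1.2,
)
print(force, moment)
```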