958 results for Calibration estimators


Relevance:

10.00%

Publisher:

Abstract:

Near-infrared spectroscopy (NIRS) calibrations were developed for the discrimination of Chinese hawthorn (Crataegus pinnatifida Bge. var. major) fruit from three geographical regions as well as for the estimation of total sugar, total acid, total phenolic content, and total antioxidant activity. Principal component analysis (PCA) was used to discriminate the fruit on the basis of geographical origin. Three pattern recognition methods, linear discriminant analysis, partial least-squares discriminant analysis, and back-propagation artificial neural networks, were applied to classify and compare the samples. Furthermore, three multivariate calibration models based on first-derivative NIR spectra, namely partial least-squares regression, back-propagation artificial neural networks, and least-squares support vector machines, were constructed for quantitative analysis of the four analytes and validated against prediction data sets.
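
As an illustration only, the following minimal sketch shows the general shape of such a pipeline in scikit-learn: first-derivative pretreatment, PCA scores fed to a classifier for origin discrimination, and a PLS calibration validated on a held-out set. All arrays, component counts, and smoothing settings are placeholders, not the authors' data or settings.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
spectra = rng.normal(size=(90, 700))       # placeholder NIR absorbance spectra
origin = rng.integers(0, 3, size=90)       # placeholder region labels (3 regions)
total_sugar = rng.normal(10, 2, size=90)   # placeholder reference values

# First-derivative pretreatment (Savitzky-Golay), as in the abstract.
d1 = savgol_filter(spectra, window_length=11, polyorder=2, deriv=1, axis=1)

# Discrimination by geographical origin: PCA scores fed to a classifier
# (LDA here; PLS-DA and a BP-ANN would be used in the same way).
scores = PCA(n_components=5).fit_transform(d1)
lda = LinearDiscriminantAnalysis().fit(scores, origin)

# PLS calibration model for one analyte, validated on a held-out set.
X_cal, X_val, y_cal, y_val = train_test_split(d1, total_sugar, random_state=0)
pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
print("validation R^2:", pls.score(X_val, y_val))
```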

Relevance:

10.00%

Publisher:

Abstract:

In outdoor environments shadows are common. These typically strong visual features cause considerable change in the appearance of a place and therefore confound vision-based localisation approaches. In this paper we describe how to convert a colour image of the scene to a greyscale invariant image in which pixel values are a function of the underlying material properties, not the lighting. We summarise the theory of shadow-invariant images and discuss the modelling and calibration issues that are important for non-ideal, off-the-shelf colour cameras. We evaluate the technique with a commonly used robotic camera and an autonomous car operating in an outdoor environment, and show that it can outperform ordinary greyscale images for the task of visual localisation.
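
To make the idea concrete, here is a minimal sketch of one common shadow-invariant formulation (a Finlayson-style log-chromaticity projection); the paper's exact model and the camera-specific angle `alpha` obtained by calibration are assumptions here, not taken from the source.

```python
# Sketch of a log-chromaticity shadow-invariant transform. `alpha` is a
# camera-dependent angle that must be found by calibration, which is the
# kind of modelling/calibration issue the abstract discusses.
import numpy as np

def invariant_image(bgr, alpha):
    """Map a colour image (H, W, 3, BGR order) to a 1-channel invariant image."""
    b, g, r = [np.clip(bgr[..., i].astype(float), 1, None) for i in range(3)]
    # 2D log-chromaticity coordinates (geometric-mean normalisation is
    # another common choice).
    chi1 = np.log(r / g)
    chi2 = np.log(b / g)
    # Project onto the invariant direction, orthogonal to the direction in
    # which a blackbody illuminant change moves the chromaticities.
    return chi1 * np.cos(alpha) + chi2 * np.sin(alpha)
```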

Relevance:

10.00%

Publisher:

Abstract:

We describe recent biologically-inspired mapping research incorporating brain-based multi-sensor fusion and calibration processes and a new multi-scale, homogeneous mapping framework. We also review the interdisciplinary approach to the development of the RatSLAM robot mapping and navigation system over the past decade and discuss the insights gained from combining pragmatic modelling of biological processes with attempts to close the loop back to biology. Our aim is to encourage the pursuit of truly interdisciplinary approaches to robotics research by providing successful case studies.

Relevance:

10.00%

Publisher:

Abstract:

This document describes large, accurately calibrated and time-synchronised datasets, gathered in controlled environmental conditions, using an unmanned ground vehicle equipped with a wide variety of sensors: multiple laser scanners, a millimetre-wave radar scanner, a colour camera and an infrared camera. Full details of the sensors are given, as well as the calibration parameters needed to locate them with respect to each other and to the platform. The report also specifies the format and content of the data and the conditions in which the data were gathered. Data were collected with the vehicle in two situations: static and dynamic. The static tests consisted of sensing a fixed 'reference' terrain, containing simple known objects, from a motionless vehicle. For the dynamic tests, data were acquired from a moving vehicle in various environments, mainly rural, including an open area, a semi-urban zone and a natural area with different types of vegetation. For both categories, data were gathered in controlled environmental conditions, which included the presence of dust, smoke and rain. Most of the environments were static, except for a few specific datasets that involve a walking pedestrian. Finally, the document presents illustrations of the effects of adverse environmental conditions on sensor data, as a first step towards reliability and integrity in autonomous perceptual systems.
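
For illustration, a hedged sketch of how extrinsic calibration parameters of the kind the report provides might be applied, expressing sensor returns in a common platform frame; `R_sp` and `t_sp` are hypothetical placeholders, not values from the report.

```python
# Illustrative use of sensor-to-platform extrinsics: rotate and translate
# laser returns from the sensor frame into the shared platform frame.
import numpy as np

def to_platform_frame(points_sensor, R_sp, t_sp):
    """points_sensor: (N, 3) points in the sensor frame."""
    return points_sensor @ R_sp.T + t_sp

R_sp = np.eye(3)                  # placeholder rotation from calibration
t_sp = np.array([1.2, 0.0, 0.5])  # placeholder lever arm (metres)
pts = to_platform_frame(np.array([[10.0, 0.0, 0.0]]), R_sp, t_sp)
print(pts)
```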

Relevance:

10.00%

Publisher:

Abstract:

This thesis developed a method for real-time and handheld 3D temperature mapping using a combination of off-the-shelf devices and efficient computer algorithms. It contributes a new sensing and data processing framework to the science of 3D thermography, unlocking its potential for application areas such as building energy auditing and industrial monitoring. New techniques for the precise calibration of multi-sensor configurations were developed, along with several algorithms that ensure both accurate and comprehensive surface temperature estimates can be made for rich 3D models as they are generated by a non-expert user.
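
A hedged sketch of the core data-processing step implied above: projecting reconstructed 3D surface points into a calibrated thermal camera to assign temperatures. The intrinsics `K`, extrinsics `T`, and the thermal image here are placeholders standing in for the multi-sensor calibration the thesis develops.

```python
# Assign a temperature to each 3D surface point by projecting it into a
# thermal image with assumed pinhole intrinsics K and a 4x4 world-to-camera
# transform T, keeping only points that land inside the image.
import numpy as np

def sample_temperatures(points_world, K, T, thermal_image):
    """points_world: (N, 3); returns per-point temperatures (NaN if unseen)."""
    homog = np.hstack([points_world, np.ones((len(points_world), 1))])
    cam = (T @ homog.T).T[:, :3]                # into the camera frame
    uv = (K @ (cam / cam[:, 2:3]).T).T[:, :2]   # pinhole projection
    u, v = np.round(uv).astype(int).T
    h, w = thermal_image.shape
    ok = (cam[:, 2] > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    temps = np.full(len(points_world), np.nan)
    temps[ok] = thermal_image[v[ok], u[ok]]
    return temps

K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])  # assumed intrinsics
temps = sample_temperatures(np.array([[0.0, 0.0, 2.0]]), K, np.eye(4),
                            np.full((480, 640), 21.5))
print(temps)
```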

Relevance:

10.00%

Publisher:

Abstract:

Due to the health impacts of exposure to air pollutants in urban areas, monitoring and forecasting of air quality parameters have become an important topic in atmospheric and environmental research. The dynamics and complexity of air pollutant behaviour have made artificial intelligence models a useful tool for more accurate pollutant concentration prediction. This paper focuses on an innovative method of daily air pollution prediction that combines a Support Vector Machine (SVM) as the predictor with Partial Least Squares (PLS) as a data selection tool, based on measured CO concentrations. CO concentrations from the Rey monitoring station in the south of Tehran, from January 2007 to February 2011, were used to test the effectiveness of the method. Hourly CO concentrations were predicted using the SVM and the hybrid PLS-SVM models, and daily CO concentrations were predicted from the same four years of measured data. Results demonstrate that both models have good prediction ability, but the hybrid PLS-SVM is more accurate. Statistical estimators, namely the relative mean error, the root mean squared error and the mean absolute relative error, were employed to compare the performance of the models. The errors decrease after size reduction, and the coefficients of determination increase from 56-81% for the SVM model to 65-85% for the hybrid PLS-SVM model. The hybrid PLS-SVM model also required less computational time than the SVM model, as expected, supporting its more accurate and faster prediction ability.
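
The following is a minimal sketch of such a hybrid scheme, not the paper's implementation: PLS compresses the inputs, then an SVM regressor predicts the concentration from the PLS scores. The data, component count, and SVM settings are placeholder assumptions.

```python
# Hybrid PLS-SVM sketch: PLS as the data-selection/compression step,
# SVR as the predictor, evaluated by RMSE on a held-out set.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 12))           # placeholder input features
y = X[:, 0] * 2 + rng.normal(size=1000)   # placeholder CO concentrations

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

pls = PLSRegression(n_components=4).fit(X_tr, y_tr)       # size reduction
svr = SVR(C=10.0, epsilon=0.1).fit(pls.transform(X_tr), y_tr)

pred = svr.predict(pls.transform(X_te))
print("hybrid PLS-SVM RMSE:", np.sqrt(np.mean((pred - y_te) ** 2)))
```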

Relevance:

10.00%

Publisher:

Abstract:

Obtaining attribute values of non-chosen alternatives in a revealed preference context is challenging because non-chosen alternative attributes are unobserved by choosers, chooser perceptions of attribute values may not reflect reality, existing methods for imputing these values suffer from shortcomings, and obtaining non-chosen attribute values is resource intensive. This paper presents a unique Bayesian (multiple) Imputation Multinomial Logit model that imputes unobserved travel times and distances of non-chosen travel modes based on random draws from the conditional posterior distribution of missing values. The calibrated Bayesian (multiple) Imputation Multinomial Logit model imputes non-chosen time and distance values that convincingly replicate observed choice behavior. Although network skims were used for calibration, more realistic data such as supplemental geographically referenced surveys or stated preference data may be preferred. The model is ideally suited for imputing variation in intrazonal non-chosen mode attributes and for assessing the marginal impacts of travel policies, programs, or prices within traffic analysis zones.
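
A conceptual sketch of the multiple-imputation idea, not the paper's estimator: unobserved non-chosen travel times are drawn from an assumed normal conditional posterior, a simple conditional logit is refit on each completed data set, and the coefficient is pooled across imputations.

```python
# Multiple imputation + conditional logit sketch. The normal draw for the
# missing times is an assumption standing in for the model's Bayesian
# conditional-posterior sampling step.
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

rng = np.random.default_rng(2)
n, J = 500, 3                                    # choosers, travel modes
travel_time = rng.gamma(4.0, 5.0, size=(n, J))   # true times (partly unseen)
choice = np.argmax(-0.1 * travel_time + rng.gumbel(size=(n, J)), axis=1)
miss = np.ones_like(travel_time, dtype=bool)
miss[np.arange(n), choice] = False               # only chosen times observed

def negll(beta, t):
    u = beta[0] * t                              # utility: beta * travel time
    return -(u[np.arange(n), choice] - logsumexp(u, axis=1)).sum()

betas = []
for _ in range(20):                              # 20 multiple imputations
    t_m = travel_time.copy()
    # Draw unobserved times from an assumed normal conditional posterior.
    t_m[miss] = rng.normal(20.0, 5.0, size=miss.sum())
    betas.append(minimize(negll, x0=[0.0], args=(t_m,)).x[0])

print("pooled time coefficient:", np.mean(betas))  # Rubin-style pooling
```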

Relevance:

10.00%

Publisher:

Abstract:

Disjoint top-view networked cameras are among the most commonly deployed camera networks in many applications. One of the open questions in the study of these cameras is the computation of their extrinsic parameters (positions and orientations), known as extrinsic calibration or camera localization. Current approaches either rely on strict assumptions about the object motion to obtain accurate results, or fail to provide high accuracy when no requirement is placed on the object motion. To address these shortcomings, we present a location-constrained maximum a posteriori (LMAP) approach that exploits known locations in the surveillance area, some of which the object passes opportunistically. The LMAP approach formulates the problem as a joint inference of the extrinsic parameters and the object trajectory based on the cameras' observations and the known locations. In addition, a new task-oriented evaluation metric, named MABR (the Maximum value of All image points' Back-projected localization errors' L2 norms Relative to the area of field of view), is presented to assess the quality of the calibration results in an indoor object-tracking context. Finally, results demonstrate the superior performance of the proposed method over a state-of-the-art algorithm, in terms of both MABR and a classical evaluation metric, in simulations and real experiments.
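
To make the joint-inference formulation concrete, here is a toy 2D sketch (orientations are omitted for brevity, although the actual approach estimates them too): camera positions and the object trajectory are recovered jointly by least squares, with a tight prior tying selected trajectory points to known locations. All numbers are illustrative.

```python
# Toy LMAP-style joint estimation: without the known-location constraints
# the problem has a gauge freedom (a common shift of cameras and
# trajectory); the constrained points pin it down.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)
T = 40
traj = np.cumsum(rng.normal(0, 0.5, size=(T, 2)), axis=0)       # true path
cams = np.array([[0.0, 5.0], [5.0, -3.0]])                      # true positions
obs = np.stack([traj - c for c in cams]) + rng.normal(0, 0.05, (2, T, 2))
known_idx, known_loc = np.array([0, 20, 39]), traj[[0, 20, 39]]  # known spots

def residuals(p):
    cam_hat = p[:4].reshape(2, 2)
    traj_hat = p[4:].reshape(T, 2)
    r_obs = (np.stack([traj_hat - c for c in cam_hat]) - obs).ravel()
    r_loc = (traj_hat[known_idx] - known_loc).ravel() / 0.01     # tight prior
    return np.concatenate([r_obs, r_loc])

fit = least_squares(residuals, np.zeros(4 + 2 * T))
print("camera position error:",
      np.abs(fit.x[:4].reshape(2, 2) - cams).max())
```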

Relevance:

10.00%

Publisher:

Abstract:

The accident record of the repair, maintenance, minor alteration, and addition (RMAA) sector has been alarmingly high; however, research in the RMAA sector remains limited. Unsafe behavior is considered one of the key causes of accidents, so the organizational factors that influence individual safety behavior at work continue to be the focus of many studies. The safety climate, which reflects the true priority of safety in an organization, has drawn much attention, and safety climate measurement helps to identify areas for safety improvement. The current study aims to identify safety climate factors in the RMAA sector. A questionnaire survey was conducted in the RMAA sector in Hong Kong, and the data were randomly split into a calibration sample and a validation sample. The RMAA safety climate factors were determined by exploratory factor analysis on the calibration sample. Three safety climate factors of RMAA works were identified: (1) management commitment to occupational health and safety (OHS) and employee involvement; (2) application of safety rules and work practices; and (3) responsibility for health and safety. Confirmatory factor analysis (CFA) was then conducted on the validation sample. The CFA model showed satisfactory goodness of fit, reliability, and validity. The suggested RMAA safety climate factors can be used by construction industry practitioners in developed economies to measure the safety climate of their RMAA projects, thereby enhancing the safety of RMAA works.
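
A minimal sketch of the split-sample workflow described above: exploratory factor analysis with varimax rotation on a calibration half; the confirmatory step on the validation half would then be run in a dedicated SEM package. The item-response matrix and factor count here are placeholders.

```python
# Split the survey data, run EFA on the calibration half, and inspect the
# item-by-factor loadings that would define the safety climate factors.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
responses = rng.normal(size=(400, 15))          # placeholder survey items
calib, valid = train_test_split(responses, test_size=0.5, random_state=4)

efa = FactorAnalysis(n_components=3, rotation="varimax").fit(calib)
loadings = efa.components_.T                    # item-by-factor loadings
print(np.round(loadings, 2))
# `valid` is held out for the confirmatory factor analysis step.
```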

Relevance:

10.00%

Publisher:

Abstract:

Measurement of moisture variation in soils is required for geotechnical design and research because soil properties and behavior can vary as moisture content changes. The neutron probe, developed more than 40 years ago, is commonly used to monitor soil moisture variation in the field. This study reports full-scale field monitoring of soil moisture using a neutron moisture probe over a period of more than 2 years in the Melbourne (Australia) region. On the basis of the soil types present in the Melbourne region, 23 sites were chosen for moisture monitoring down to a depth of 1500 mm. A field calibration method was used to develop correlations relating volumetric moisture content to neutron counts. Observed results showed that the deepest "wetting front" during the wet season was limited to the top 800 to 1000 mm of soil, whilst the top soil layer, down to about 550 mm, responded almost immediately to rainfall events. At greater depths (550 to 800 mm and below 800 mm), the moisture variations were relatively low and displayed predominantly periodic fluctuations. This periodic behaviour was captured with Fourier analysis to develop a cyclic moisture model based on an analytical solution of the one-dimensional moisture flow equation for homogeneous soils. It is argued that the model can be used to predict soil moisture variations as applicable to buried structures such as pipes.
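
For illustration, a sketch of fitting a cyclic model of the assumed classical form for a homogeneous soil: a sinusoidal annual surface cycle whose amplitude decays and whose phase lags with depth, consistent with the one-dimensional flow solution mentioned above. The data and parameter values are placeholders, not the Melbourne measurements.

```python
# Fit mean moisture, surface amplitude, damping depth and phase of an
# exponentially damped annual cycle to depth/time moisture observations.
import numpy as np
from scipy.optimize import curve_fit

def cyclic_moisture(zt, mean, amp, damp, phase):
    z, t = zt                                   # depth (mm), time (days)
    return mean + amp * np.exp(-z / damp) * np.sin(
        2 * np.pi * t / 365.0 - z / damp + phase)

# Placeholder observations (three depths, two years, monthly readings).
z = np.repeat([600.0, 900.0, 1200.0], 24)
t = np.tile(np.linspace(0, 730, 24), 3)
vmc = cyclic_moisture((z, t), 0.30, 0.05, 400.0, 0.0) + \
      np.random.default_rng(5).normal(0, 0.005, z.size)

params, _ = curve_fit(cyclic_moisture, (z, t), vmc, p0=[0.3, 0.05, 500, 0])
print("fitted mean, amplitude, damping depth, phase:", np.round(params, 3))
```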

Relevance:

10.00%

Publisher:

Abstract:

The Comment by Mayers and Reiter criticizes our work on two counts. Firstly, it is claimed that the quantum decoherence effects that we report, in consequence of our experimental analysis of neutron Compton scattering from H in gaseous H2, are not, as we maintain, outside the framework of conventional neutron scattering theory. Secondly, it is claimed that we did not really observe such effects, owing to a faulty analysis of the experimental data, which are claimed to be in agreement with conventional theory. We focus in this response on the critical issue of the reliability of our experimental results and analysis. Using the same standard Vesuvio instrument programs used by Mayers et al., we show that, if the experimental results for H in gaseous H2 are in agreement with conventional theory, then those for D in gaseous D2 obtained in the same way cannot be, and vice versa. We expose a flaw in the calibration methodology used by Mayers et al. that leads to the present disagreement over the behaviour of H, namely the ad hoc adjustment of the measured H peak positions in TOF during the calibration of Vesuvio so that agreement is obtained with the expectation of conventional theory. We briefly address the question of the necessity of applying the theory of open quantum systems.

Relevance:

10.00%

Publisher:

Abstract:

Moving cell fronts are an essential feature of wound healing, development and disease. The rate at which a cell front moves is driven, in part, by the cell motility, quantified in terms of the cell diffusivity $D$, and by the cell proliferation rate $\lambda$. Scratch assays are a commonly reported procedure used to investigate the motion of cell fronts, in which an initial cell monolayer is scratched and the motion of the front is monitored over a short period of time, often less than 24 hours. The simplest way of quantifying a scratch assay is to monitor the progression of the leading edge. Leading edge data are very convenient since, unlike other methods, collecting them is nondestructive and does not require labeling, tracking or counting individual cells in the population. In this work we study short-time leading edge data in a scratch assay using a discrete mathematical model and automated image analysis, with the aim of investigating whether such data allow us to reliably identify $D$ and $\lambda$. Using a naive calibration approach, in which we simply scan the relevant region of the $(D, \lambda)$ parameter space, we show that there are many choices of $D$ and $\lambda$ for which our model produces indistinguishable short-time leading edge data. Therefore, without due care, it is impossible to estimate $D$ and $\lambda$ from this kind of data. To address this, we present a modified approach that accounts for the fact that cell motility occurs over a much shorter time scale than proliferation. Using this information we divide the duration of the experiment into two periods: we estimate $D$ using data from the first period, and we estimate $\lambda$ using data from the second period. We confirm the accuracy of our approach using in silico data and a new set of in vitro data, which shows that our method recovers estimates of $D$ and $\lambda$ that are consistent with previously reported values, with the advantage that our approach is fast, inexpensive, nondestructive and avoids the need for cell labeling and cell counting.
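
A hedged sketch of the two-period idea, using a simple Fisher-KPP continuum model as a stand-in for the paper's discrete model: $D$ is scanned against early-period edge data with proliferation switched off (motility dominates early), then $\lambda$ is scanned against late-period data with $D$ fixed. All numbers are illustrative.

```python
# Two-period calibration sketch on synthetic leading-edge data generated
# by an explicit finite-difference Fisher-KPP solver.
import numpy as np

def leading_edge(D, lam, times, L=2000.0, dx=10.0, dt=0.02, thresh=0.05):
    """Leading-edge position (um) of the front at the requested times (h)."""
    x = np.arange(0, L, dx)
    c = (x < 500.0).astype(float)            # scratched monolayer at t = 0
    edges, k = [], 0
    for t in np.arange(0, times[-1] + dt, dt):
        if k < len(times) and t >= times[k]:
            edges.append(x[np.where(c > thresh)[0].max()])
            k += 1
        lap = (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
        lap[0] = lap[-1] = 0.0               # crude no-flux boundaries
        c = c + dt * (D * lap + lam * c * (1 - c))
    return np.array(edges)

times1 = np.array([2.0, 4.0, 6.0])           # early period (hours)
times2 = np.array([18.0, 21.0, 24.0])        # late period (hours)
obs1 = leading_edge(1000.0, 0.05, times1)    # synthetic "data": D = 1000 um^2/h
obs2 = leading_edge(1000.0, 0.05, times2)    # and lambda = 0.05 /h

# Period 1: scan D with proliferation off, since motility dominates early.
Ds = np.linspace(200, 2000, 19)
D_hat = Ds[np.argmin([np.sum((leading_edge(D, 0.0, times1) - obs1) ** 2)
                      for D in Ds])]
# Period 2: scan lambda with D fixed at the period-1 estimate.
lams = np.linspace(0.0, 0.12, 13)
lam_hat = lams[np.argmin([np.sum((leading_edge(D_hat, lm, times2) - obs2) ** 2)
                          for lm in lams])]
print("D_hat, lam_hat:", D_hat, lam_hat)
```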

Relevance:

10.00%

Publisher:

Abstract:

We investigate the utility to computational Bayesian analyses of a particular family of recursive marginal likelihood estimators characterized by the (equivalent) algorithms known as "biased sampling" or "reverse logistic regression" in the statistics literature and "the density of states" in physics. Through a pair of numerical examples (including mixture modeling of the well-known galaxy dataset) we highlight the remarkable diversity of sampling schemes amenable to such recursive normalization, as well as the notable efficiency of the resulting pseudo-mixture distributions for gauging prior-sensitivity in the Bayesian model selection context. Our key theoretical contributions are to introduce a novel heuristic ("thermodynamic integration via importance sampling") for qualifying the role of the bridging sequence in this procedure, and to reveal various connections between these recursive estimators and the nested sampling technique.
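
As a toy illustration of the recursive normalization these estimators perform (viewed through the biased-sampling / self-consistent importance-sampling lens), the sketch below solves iteratively for the relative normalizing constants of a tempered sequence of Gaussians; the bridging sequence, sample sizes, and iteration count are arbitrary choices, not the paper's examples.

```python
# Self-consistent recursive normalization over a bridging (tempering)
# sequence: each Z_k is re-estimated by importance sampling against the
# pooled mixture of all sampled distributions.
import numpy as np

rng = np.random.default_rng(6)
betas = np.array([0.2, 0.5, 1.0])            # bridging sequence q_k ~ exp(-b x^2/2)
n_k = 2000
# Exact sampling is available for this toy target: N(0, 1/beta).
samples = np.concatenate([rng.normal(0, 1 / np.sqrt(b), n_k) for b in betas])

# Unnormalized log-densities of every pooled sample under every level.
logq = -0.5 * betas[:, None] * samples[None, :] ** 2   # shape (K, N)

logz = np.zeros(len(betas))                  # log normalizing constants
for _ in range(200):                         # fixed-point iteration
    # Log pooled-mixture density (equal sample sizes per level).
    logmix = (np.logaddexp.reduce(logq - logz[:, None], axis=0)
              - np.log(len(betas)))
    logz = (np.logaddexp.reduce(logq - logmix[None, :], axis=1)
            - np.log(samples.size))
    logz -= logz[-1]                         # fix the gauge at the last level

# True log Z_k relative to beta = 1: Z_k = sqrt(2*pi/beta_k).
print("estimated:", logz, "true:", 0.5 * np.log(betas[-1] / betas))
```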

Relevance:

10.00%

Publisher:

Abstract:

The accuracy of early cost estimates is critical to the success of construction projects. The selected tender price (the client's building cost) has usually been treated in previous research as a holistic dependent variable when examining early-stage estimates. Unlike other components of construction cost, the amount of contingencies is decided by clients/consultants in consideration of early project information. Cost drivers of contingency estimates are associated with uncertainty and complexity, and include project size, schedule, ground conditions, construction site access, market conditions and so on. A path analysis of 133 UK school building contracts was conducted to identify the impacts of nine major cost drivers on the determination of contingencies by different clients/cost estimators. This research finds that gross floor area (GFA), schedule and the requirement for air conditioning have statistically significant impacts on contingency determination. The mediating role of schedule between gross floor area and contingencies (GFA→Schedule→Contingencies) was confirmed with the Sobel test. The total effects of the three variables on contingency estimates were obtained with consideration of this indirect effect. The squared multiple correlation (SMC) of contingencies (0.624) indicates that the three identified variables can explain 62.4% of the variance in contingencies, which is comparatively satisfactory considering the heterogeneity among different estimators, unknown estimating techniques and different projects.
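
A minimal sketch of the mediation logic reported above (GFA → Schedule → Contingencies), computing the indirect effect and a Sobel z from two OLS regressions; the data are synthetic placeholders, not the 133 UK school contracts.

```python
# Mediation with a Sobel test: a = effect of GFA on Schedule; b = effect of
# Schedule on Contingencies controlling for GFA; indirect effect = a * b.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
gfa = rng.normal(size=200)
schedule = 0.6 * gfa + rng.normal(size=200)
conting = 0.4 * schedule + 0.3 * gfa + rng.normal(size=200)

m1 = sm.OLS(schedule, sm.add_constant(gfa)).fit()       # mediator model (a)
m2 = sm.OLS(conting,
            sm.add_constant(np.column_stack([gfa, schedule]))).fit()  # (b)

a, sa = m1.params[1], m1.bse[1]
b, sb = m2.params[2], m2.bse[2]
z = a * b / np.sqrt(b**2 * sa**2 + a**2 * sb**2)        # Sobel statistic
print("indirect effect:", a * b, "Sobel z:", z)
```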

Relevance:

10.00%

Publisher:

Abstract:

Purpose: This study evaluated the predictive validity of three previously published ActiGraph energy expenditure (EE) prediction equations developed for children and adolescents. Methods: A total of 45 healthy children and adolescents (mean age: 13.7 +/- 2.6 yr) completed four 5-min activity trials (normal walking, brisk walking, easy running, and fast running) in an indoor exercise facility. During each trial, participants wore an ActiGraph accelerometer on the right hip. EE was monitored breath by breath using the Cosmed K4b(2) portable indirect calorimetry system. Differences and associations between measured and predicted EE were assessed using dependent t-tests and Pearson correlations, respectively. Classification accuracy was assessed using percent agreement, sensitivity, specificity, and area under the receiver operating characteristic (ROC) curve. Results: None of the equations accurately predicted mean EE across all four activity trials. Each equation, however, accurately predicted mean EE in at least one activity trial. The Puyau equation accurately predicted EE during slow walking, the Trost equation during slow running, and the Freedson equation during fast running. None of the three equations accurately predicted EE during brisk walking. The equations exhibited fair to excellent classification accuracy with respect to activity intensity, with the Trost equation exhibiting the highest classification accuracy and the Puyau equation the lowest. Conclusions: These data suggest that the three accelerometer prediction equations do not accurately predict EE on a minute-by-minute basis in children and adolescents during overground walking and running. The equations may be useful, however, for estimating participation in moderate and vigorous activity.
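
A brief sketch of the classification-accuracy portion of such an evaluation: dichotomizing measured and predicted intensity at an assumed moderate-intensity cutoff and computing sensitivity, specificity, and ROC AUC. The arrays and the 3-MET cutoff are placeholders, not the study's data or criterion.

```python
# Compare equation-predicted intensity against the calorimetry criterion
# using a confusion matrix and the area under the ROC curve.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(8)
measured_mets = rng.gamma(3.0, 1.5, size=300)             # criterion measure
predicted_mets = measured_mets + rng.normal(0, 1.0, 300)  # an EE equation

y_true = (measured_mets >= 3.0).astype(int)               # moderate-or-above
y_pred = (predicted_mets >= 3.0).astype(int)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
print("ROC AUC:", roc_auc_score(y_true, predicted_mets))
```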