916 results for Forecast accuracy


Relevance:

20.00%

Publisher:

Abstract:

The along-track stereo images of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) sensor, with 15 m resolution, were used to generate a Digital Elevation Model (DEM) of an area of low, near Mean Sea Level (MSL) elevation in Johor, Malaysia. The absolute DEM was generated using the Rational Polynomial Coefficient (RPC) model, run in ENVI 4.8 software. To generate the absolute DEM, 60 Ground Control Points (GCPs) with vertical accuracies of less than about 10 m were extracted from the topographic map of the study area. The accuracy assessment was carried out on the uncorrected and corrected DEMs using dozens of Independent Check Points (ICPs). The uncorrected DEM showed an RMSEz of ±26.43 m, which decreased to ±16.49 m for the corrected DEM after post-processing. Overall, the corrected DEM derived from the ASTER stereo images met expectations.
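For readers unfamiliar with the accuracy metric quoted above, RMSEz is the root-mean-square of the elevation differences between the DEM and the independent check points. A minimal sketch of that calculation follows; the function and variable names are illustrative and the elevation arrays are not from the study.

import numpy as np

def rmse_z(dem_elev, icp_elev):
    # Vertical RMSE: root-mean-square of the elevation differences between
    # DEM heights and independent check point heights (metres).
    diff = np.asarray(dem_elev, dtype=float) - np.asarray(icp_elev, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

# Illustrative call with hypothetical elevation arrays:
# rmse_z(dem_heights_at_icps, surveyed_icp_heights)  # e.g. ~26.43 before correction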

Relevance:

20.00%

Publisher:

Abstract:

This study assesses the accuracy of a Digital Elevation Model (DEM) generated using Toutin's model, which was run in OrthoEngineSE of PCI Geomatics 10.3. The along-track stereo images of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) sensor, with 15 m resolution, were used to produce a DEM of an area of low, near Mean Sea Level (MSL) elevation in Johor, Malaysia. Despite satisfactory pre-processing results, visual assessment of the DEM generated from Toutin's model showed that it contained many outliers and incorrect values. The failure of Toutin's model is most likely due to the inaccuracy and insufficiency of the ASTER ephemeris data for low terrain, as well as the large water body present in the stereo images.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: Older adults have increased visual impairment, including refractive blur from presbyopic multifocal spectacle corrections, and are less able to extract visual information from the environment to plan and execute appropriate stepping actions; these factors may collectively contribute to their higher risk of falls. The aim of this study was to examine the effect of refractive blur and target visibility on the stepping accuracy and visuomotor stepping strategies of older adults during a precision stepping task. Methods: Ten healthy, visually normal older adults (mean age 69.4 ± 5.2 years) walked up and down a 20 m indoor corridor, stepping onto selected high- and low-contrast targets while viewing under three visual conditions: best-corrected vision, +2.00 DS blur and +3.00 DS blur; the order of blur conditions was randomised between participants. Stepping accuracy and gaze behaviours were recorded using an eye tracker and a secondary hand-held camera. Results: Older adults made significantly more stepping errors with increasing levels of blur, in particular under-stepping (stepping more posteriorly) onto the targets (p < 0.05), while their visuomotor stepping strategies did not significantly alter. Stepping errors were also significantly greater for the low-contrast than for the high-contrast targets, and differences in visuomotor stepping strategies were found, including increased duration of gaze and an increased interval between gaze onset and initiation of the leg swing when stepping onto the low-contrast targets. Conclusions: These findings show that stepping accuracy is reduced for low-visibility targets and for refractive blur at levels typically present in multifocal spectacle corrections, despite significant changes in some of the visuomotor stepping strategies. They highlight the importance of maximising the contrast of objects in the environment, and may help explain why older adults wearing multifocal spectacle corrections exhibit an increased risk of falling.

Relevance:

20.00%

Publisher:

Abstract:

Introduction The provision of a written comment on traumatic abnormalities of the musculoskeletal system detected by radiographers can assist referrers and may improve patient management, but the practice has not been widely adopted outside the United Kingdom. The purpose of this study was to investigate Australian radiographers' perceptions of their readiness for practice in a radiographer commenting system and their educational preferences in relation to two delivery formats of image interpretation education, intensive and non-intensive. Methods A cross-sectional web-based questionnaire was administered between August and September 2012. Participants were radiographers with experience working in emergency settings at four Australian metropolitan hospitals. Conventional descriptive statistics, frequency histograms, and thematic analysis were undertaken. A Wilcoxon signed-rank test examined whether there was a difference in preference ratings between intensive and non-intensive education delivery. Results The questionnaire was completed by 73 radiographers (68% response rate). Radiographers reported higher confidence and self-perceived accuracy in detecting traumatic abnormalities of the musculoskeletal system than in describing them. Radiographers frequently gave high desirability ratings to both the intensive and the non-intensive education delivery formats, and no difference in desirability ratings between the two formats was evident (z = 1.66, P = 0.11). Conclusions Some Australian radiographers perceive that they are not ready to practise in a frontline radiographer commenting system. Overall, radiographers indicated mixed preferences for image interpretation education delivered via intensive and non-intensive formats. Further research, preferably randomised trials, investigating the effectiveness of intensive and non-intensive formats of image interpretation education for radiographers is warranted.
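To illustrate the paired, non-parametric comparison reported above (z = 1.66, P = 0.11), the sketch below applies a Wilcoxon signed-rank test to hypothetical desirability ratings; the data are invented, and scipy.stats.wilcoxon is assumed as the implementation, not necessarily the software used in the study.

import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired desirability ratings (one pair per radiographer, 1-5 scale);
# the actual survey data are not reproduced here.
intensive     = np.array([5, 4, 4, 3, 5, 4, 2, 4, 5, 3])
non_intensive = np.array([4, 4, 5, 3, 4, 3, 3, 4, 4, 4])

# Paired, non-parametric test of a difference between the two delivery formats.
stat, p_value = wilcoxon(intensive, non_intensive)
print(f"W = {stat:.1f}, P = {p_value:.2f}")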

Relevance:

20.00%

Publisher:

Abstract:

This article fills the gap of unconditionally stable, second-order accurate schemes for the time-fractional subdiffusion equation. Two fully discrete schemes are first proposed for the time-fractional subdiffusion equation, with space discretized by the finite element method and time discretized by fractional linear multistep methods. These two methods are unconditionally stable with a maximum global convergence order of $O(\tau+h^{r+1})$ in the $L^2$ norm, where $\tau$ and $h$ are the step sizes in time and space, respectively, and $r$ is the degree of the piecewise polynomial space. The average convergence rates of the two methods in time are also investigated and shown to be $O(\tau^{1.5}+h^{r+1})$. Furthermore, two improved algorithms are constructed; they are also unconditionally stable and convergent of order $O(\tau^2+h^{r+1})$. Numerical examples are provided to verify the theoretical analysis. Comparisons between the present algorithms and existing ones are included, showing that our numerical algorithms perform better than the known ones.
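Empirical convergence orders of the kind claimed above are typically verified by halving the time step and checking how fast the error decays. The sketch below estimates the observed order from successive errors; it is not the authors' code, and the error values are hypothetical, chosen to mimic second-order decay in time.

import numpy as np

def observed_order(errors, steps):
    # Empirical convergence order between successive refinements:
    # order ~= log(e_k / e_{k+1}) / log(tau_k / tau_{k+1}).
    errors = np.asarray(errors, dtype=float)
    steps = np.asarray(steps, dtype=float)
    return np.log(errors[:-1] / errors[1:]) / np.log(steps[:-1] / steps[1:])

# Hypothetical L2-norm errors under time-step halving (space kept fixed and fine):
tau = [0.1, 0.05, 0.025, 0.0125]
err = [4.1e-3, 1.0e-3, 2.6e-4, 6.4e-5]
print(observed_order(err, tau))   # values close to 2, i.e. O(tau^2)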

Relevance:

20.00%

Publisher:

Abstract:

Understanding the loads generated within the prosthetic leg can aid engineers in the design of components and clinicians in the process of rehabilitation. Traditional methods of assessing these loads have relied on inverse dynamics. This indirect method estimates the applied load using video recordings and force plates located at a distance from the region of interest, such as the base of the residuum. The well-known limitations of this method relate to the accuracy of the recursive model and the experimental conditions required (Frossard et al., 2003). Recent developments in sensors (Frossard et al., 2003) and prosthetic fixation (Brånemark et al., 2000) permit direct measurement of the loads applied to the residuum of transfemoral amputees. In principle, direct measurement should be an appropriate tool for assessing the accuracy of inverse dynamics. The purpose of this paper is to determine the validity of this assumption. The comparative variable used in this study is the velocity of the relative body center of mass, VCOM(t); the relative formulation is used to align the force-plate measurement, which is static with respect to position, with the dynamic load cell measurement.
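For context, a common way to obtain a body centre-of-mass velocity from force-plate data is to integrate the net ground reaction force divided by body mass (impulse-momentum). The sketch below shows the vertical component only, under assumed values for the initial velocity and sampling rate; it is illustrative and not necessarily the exact procedure used by the authors.

import numpy as np

def com_vertical_velocity(fz, body_mass, fs, v0=0.0, g=9.81):
    # Vertical centre-of-mass velocity from the vertical ground reaction force:
    # integrate a(t) = (Fz(t) - m*g) / m over time (rectangular rule, step 1/fs).
    accel = (np.asarray(fz, dtype=float) - body_mass * g) / body_mass
    return v0 + np.cumsum(accel) / fs

# Illustrative call with a hypothetical 1 kHz force-plate trace:
# v_com = com_vertical_velocity(fz_trace, body_mass=80.0, fs=1000.0)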

Relevance:

20.00%

Publisher:

Abstract:

Research publication is one of the final steps in the research process, which begins with the development of a research idea. The process then moves through bringing collaborators together, designing the study protocol, securing grant or study funding, obtaining ethics approval to conduct the research, implementing the research, and analysing the data and drawing conclusions, leading finally to publication of the study results. Although the final step of the research process entails dissemination of the results, many studies go unreported or are improperly reported. Indeed, reviewers have suggested that many randomized controlled trials, observational studies, and qualitative studies lack crucial methodological features or details that lend credibility to study results (Simera et al., 2010).

Relevance:

20.00%

Publisher:

Abstract:

Groundwater modelling studies rely on an accurate determination of the inputs and outputs that make up the water balance. There is often large uncertainty associated with estimates of recharge and unmetered groundwater use. This can translate into equivalent uncertainty in forecasts of sustainable yields, impacts of extraction, and the susceptibility of groundwater-dependent ecosystems. In the case of Coal Seam Gas (CSG), it is important to characterise the temporal and spatial distribution of depressurisation in the reservoir and whether this depressurisation extends to the adjacent aquifers. A regional groundwater flow model has been developed by the Queensland Government to predict drawdown impacts due to CSG activities in the Surat Basin. This groundwater model is undergoing continued refinement, and there is currently scope to address some of the key areas of uncertainty, including better quantification of groundwater recharge and unmetered groundwater extractions. Research is currently underway to improve the accuracy of estimates of both of these components of the groundwater balance in order to reduce uncertainty in predicted groundwater drawdowns due to CSG activities.

Relevance:

20.00%

Publisher:

Abstract:

Cost estimating has been acknowledged as a crucial component of construction projects. Depending on the available information and project requirements, cost estimates evolve in tandem with the project lifecycle stages: conceptualisation, design development, execution and facility management. Accurate cost estimates are crucial to producing project tenders and, eventually, to budget management. Notwithstanding the initial slow pace of its adoption, Building Information Modelling (BIM) has successfully addressed a number of challenges previously characteristic of traditional approaches in the AEC industry, including poor communication, the prevalence of islands of information and frequent rework. It is therefore conceivable that BIM can be leveraged to address specific shortcomings of cost estimation. The impetus for leveraging BIM models for accurate cost estimation is to align budgeted and actual costs. This paper hypothesises that the accuracy of BIM-based estimation, as a more efficient process mirror of traditional cost estimation methods, can be enhanced by simulating the factors and variables of traditional cost estimation. Through literature reviews and preliminary expert interviews, this paper explores the factors that could potentially lead to more accurate cost estimates for construction projects. The findings show numerous factors that affect cost estimates, ranging from project information and its characteristics to the project team, clients, contractual matters, and other external influences. This paper makes a particular contribution to the early phase of BIM-based project estimation.

Relevance:

20.00%

Publisher:

Abstract:

A new technique called the reef resource inventory (RRI) was developed to map the distribution and abundance of benthos and substratum on reefs. This rapid field sampling technique uses divers to visually estimate the percentage cover of categories of benthos and substratum along 2 x 20 m plotless strip-transects positioned randomly over the tops of reefs and systematically along their edges. The purpose of this study was to compare the relative sampling accuracy of the RRI against the line intercept transect technique (LIT), an international standard for sampling reef benthos and substratum. Analysis of paired sampling with LIT and RRI at 51 sites indicated that sampling accuracy was not different (P > 0.05) for 8 of the 12 benthos and substratum categories used in the study. Significant differences were attributed to small-scale patchiness and cryptic coloration of some benthos; effects associated with sampling a sparsely distributed animal along a line versus an area; difficulties in discriminating some of the benthos and substratum categories; and differences in visual acuity, since LIT measurements were taken by divers close to the seabed whereas RRI measurements were taken by divers higher in the water column. The relative cost efficiency of the RRI technique was at least three times that of LIT for all benthos and substratum categories, and as much as 10 times higher for two categories. These results suggest that the RRI can be used to obtain reliable and accurate estimates of the relative abundance of broad categories of reef benthos and substratum.
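The paired site-by-site comparison described above can be illustrated with a small sketch: for each benthos or substratum category, the percentage-cover estimates from the two techniques at the same sites are compared with a paired test. The data, category names and the choice of scipy.stats.ttest_rel are all hypothetical; the abstract does not state which paired test was used.

import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(7)
n_sites = 51

# Hypothetical percentage-cover estimates at the same paired sites.
for category in ["hard_coral", "sand", "rubble"]:
    lit = rng.uniform(0, 60, n_sites)            # LIT estimates (%)
    rri = lit + rng.normal(0.0, 3.0, n_sites)    # RRI estimates at the same sites (%)
    t_stat, p = ttest_rel(lit, rri)              # paired comparison per category
    print(f"{category}: t = {t_stat:.2f}, P = {p:.3f}")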

Relevance:

20.00%

Publisher:

Abstract:

Many statistical forecast systems are available to interested users. To be useful for decision-making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanisms and their statistical manifestations have been firmly established, the forecasts must also provide some quantitative evidence of 'quality'. However, the quality of statistical climate forecast systems (forecast quality) is an ill-defined and frequently misunderstood property. Often, providers and users of such forecast systems are unclear about what 'quality' entails and how to measure it, leading to confusion and misinformation. Here we present a generic framework for quantifying aspects of forecast quality using an inferential approach to calculate nominal significance levels (p-values), obtained either by directly applying non-parametric statistical tests such as Kruskal-Wallis (KW) or Kolmogorov-Smirnov (KS), or by using Monte Carlo methods (in the case of forecast skill scores). Once converted to p-values, these forecast quality measures provide a means to objectively evaluate and compare temporal and spatial patterns of forecast quality across datasets and forecast systems. Our analysis demonstrates the importance of providing p-values rather than adopting arbitrarily chosen significance levels such as p < 0.05 or p < 0.01, which is still common practice. This is illustrated by applying non-parametric tests (KW and KS) and skill scoring methods (LEPS and RPSS) to the 5-phase Southern Oscillation Index classification system using historical rainfall data from Australia, the Republic of South Africa and India. The selection of quality measures is based solely on their common use and does not constitute endorsement. We found that non-parametric statistical tests can be adequate proxies for skill measures such as LEPS or RPSS. The framework can be implemented anywhere, regardless of dataset, forecast system or quality measure. Eventually, such inferential evidence should be complemented by descriptive statistical methods in order to fully assist in operational risk management.
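As an illustration of the inferential approach described above, the sketch below computes p-values with the Kruskal-Wallis and two-sample Kolmogorov-Smirnov tests on rainfall grouped by a 5-phase classification; the data are synthetic, and scipy is assumed as the implementation rather than the software used in the study.

import numpy as np
from scipy.stats import kruskal, ks_2samp

rng = np.random.default_rng(0)

# Hypothetical seasonal rainfall totals (mm) grouped by SOI phase;
# the real study uses the 5-phase SOI classification with historical data.
phases = {p: rng.gamma(shape=2.0, scale=60.0 + 15.0 * p, size=40) for p in range(5)}

# Kruskal-Wallis: do the rainfall distributions differ across all five phases?
h_stat, p_kw = kruskal(*phases.values())

# Kolmogorov-Smirnov: compare the distributions of two individual phases.
d_stat, p_ks = ks_2samp(phases[0], phases[4])

print(f"KW p-value: {p_kw:.4f}, KS p-value (phase 0 vs 4): {p_ks:.4f}")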

Relevance:

20.00%

Publisher:

Abstract:

Modeling of cultivar x trial effects for multienvironment trials (METs) within a mixed model framework is now common practice in many plant breeding programs. The factor analytic (FA) model is a parsimonious form used to approximate the fully unstructured form of the genetic variance-covariance matrix in the model for MET data. In this study, we demonstrate that the FA model is generally the model of best fit across a range of data sets taken from early-generation trials in a breeding program. In addition, we demonstrate the superiority of the FA model in achieving the most common aim of METs, namely the selection of superior genotypes. Selection is achieved using best linear unbiased predictions (BLUPs) of cultivar effects in each environment, considered either individually or as a weighted average across environments. In practice, empirical BLUPs (E-BLUPs) of cultivar effects must be used instead of BLUPs, since the variance parameters in the model must be estimated rather than assumed known. While the optimal properties of minimum mean squared error of prediction (MSEP) and maximum correlation between true and predicted effects possessed by BLUPs do not hold for E-BLUPs, a simulation study shows that E-BLUPs perform well in terms of MSEP.
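The MSEP and true-versus-predicted correlation mentioned above can be illustrated with a toy simulation. The sketch below uses a balanced one-way random-effects setting with known variances, in which the BLUP of a genotype effect reduces to a simple shrinkage of its phenotypic mean; it is not the factor analytic mixed model of the study, and all parameter values are invented.

import numpy as np

rng = np.random.default_rng(1)

# Toy balanced setting: g genotypes, n replicate plots each.
g, n = 200, 4
sigma2_g, sigma2_e = 1.0, 4.0                   # genetic and residual variances (assumed known)

u = rng.normal(0.0, np.sqrt(sigma2_g), size=g)               # true genotype effects
y = u[:, None] + rng.normal(0.0, np.sqrt(sigma2_e), (g, n))  # plot-level phenotypes

# BLUP of each genotype effect shrinks its phenotypic mean toward zero.
shrink = sigma2_g / (sigma2_g + sigma2_e / n)
u_hat = shrink * y.mean(axis=1)

msep = np.mean((u_hat - u) ** 2)                # mean squared error of prediction
corr = np.corrcoef(u, u_hat)[0, 1]              # correlation of true vs predicted effects
print(f"MSEP = {msep:.3f}, corr(true, predicted) = {corr:.3f}")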