985 results for Error estimate.


Relevance:

20.00%

Publisher:

Abstract:

In this paper a 3D human pose tracking framework is presented. A new dimensionality reduction method (Hierarchical Temporal Laplacian Eigenmaps) is introduced to represent activities in hierarchies of low-dimensional spaces. Such a hierarchy provides increasing independence between limbs, allowing higher flexibility and adaptability that result in improved accuracy. Moreover, a novel deterministic optimisation method (Hierarchical Manifold Search) is applied to efficiently estimate the positions of the corresponding body parts. Finally, evaluation on public datasets such as HumanEva demonstrates that our approach achieves a 62.5-65 mm average joint error for the walking activity and outperforms state-of-the-art methods in terms of accuracy and computational cost.
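
The hierarchical temporal variant named above is not spelled out in the abstract, but a minimal sketch of the standard Laplacian Eigenmaps step it builds on may clarify how a low-dimensional activity representation is obtained; the pose data, neighbourhood size and kernel width below are illustrative assumptions only.

```python
# Minimal sketch: standard Laplacian Eigenmaps on stacked pose vectors.
# The hierarchical temporal extension from the paper is not reproduced here.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=3, k=10, sigma=1.0):
    """Embed rows of X (n_samples x n_features) into n_components dimensions."""
    d = cdist(X, X)                             # pairwise Euclidean distances
    W = np.exp(-d**2 / (2 * sigma**2))          # heat-kernel affinities
    far = np.argsort(d, axis=1)[:, k + 1:]      # everything beyond the k nearest
    for i, cols in enumerate(far):
        W[i, cols] = 0.0                        # sparsify to a kNN graph
    W = np.maximum(W, W.T)                      # symmetrise
    D = np.diag(W.sum(axis=1))                  # degree matrix
    L = D - W                                   # unnormalised graph Laplacian
    vals, vecs = eigh(L, D)                     # generalised eigenproblem L v = lambda D v
    return vecs[:, 1:n_components + 1]          # skip the trivial constant eigenvector

# Hypothetical usage: 200 frames of a 60-dimensional joint-parameter vector.
poses = np.random.rand(200, 60)
low_dim = laplacian_eigenmaps(poses)
```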

Relevance:

20.00%

Publisher:

Abstract:

Social work in the United Kingdom remains embroiled in concerns about child protection error. The serious injury or death of vulnerable children continues to evince much consternation in the public and private spheres. Governmental responses to these concerns invariably draw on technocratic solutions involving more procedures, case management systems, information technology and bureaucratic regulation. Such solutions flow from an implicit use of instrumental rationality based on a ‘means-end’ logic. While bringing an important perspective to the problem of child protection error, instrumental rationality has been overused, limiting discretion and crowding out other modes of rational inquiry. This paper argues that the social work profession should apply an enlarged form of rationality comprising not only the instrumental-rational mode but also the critical-rational, affective-rational and communicative-rational forms. It is suggested that this combined conceptual arsenal of rational inquiry leads to a gestalt that has been termed the holistic-rational perspective. It is also argued that embracing a more rounded perspective of this kind might offer greater opportunities for reducing child protection error.

Relevance:

20.00%

Publisher:

Abstract:

Objective: Molecular pathology relies on identifying anomalies using PCR or analysis of DNA/RNA. This is important in solid tumours, where molecular stratification of patients defines targeted treatment. These molecular biomarkers rely on examination of the tumour, annotation for possible macro-dissection/tumour cell enrichment, and estimation of the percentage of tumour. Manually marking up tumour is error prone. Method: We have developed a method for automated tumour mark-up and percentage tumour cell calculation using an image analysis tool, TissueMark®, based on texture analysis, applied to lung, colorectal and breast cancer (245, 100 and 100 cases respectively). Pathologists marked slides for tumour and reviewed the automated analysis. A subset of slides was manually counted for tumour cells to provide a benchmark for the automated image analysis. Results: There was a strong concordance between pathological and automated mark-up (100% acceptance rate for macro-dissection). We also showed a strong concordance between manually and automatically drawn boundaries (median exclusion/inclusion error of 91.70%/89%). EGFR mutation analysis was identical for manual and automated annotation-based macro-dissection. The annotation accuracy rates in breast and colorectal cancer were 83% and 80% respectively. Finally, region-based estimations of tumour percentage using image analysis showed significant correlation with actual cell counts. Conclusion: Image analysis can be used for macro-dissection to (i) annotate tissue for tumour and (ii) estimate the percentage of tumour cells, and represents an approach to standardising and improving molecular diagnostics.
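
TissueMark's texture-based segmentation itself is proprietary and not described in the abstract; purely to illustrate the final region-based calculation, the percentage of tumour cells in an annotated region reduces to a simple ratio over a labelled cell (or pixel) mask, as in this sketch with hypothetical labels.

```python
# Hedged sketch: region-based % tumour from a labelled cell mask (1 = tumour cell).
import numpy as np

def tumour_percentage(cell_labels):
    labels = np.asarray(cell_labels)
    return 100.0 * (labels == 1).sum() / labels.size

print(tumour_percentage([1, 1, 0, 1, 0, 0, 1, 1]))   # -> 62.5
```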

Relevance:

20.00%

Publisher:

Abstract:

Viscosity represents a key indicator of product quality in polymer extrusion but has traditionally been difficult to measure in-process in real time. An innovative yet simple solution to this problem is proposed in the form of a Prediction-Feedback observer mechanism. A 'Prediction' model based on the operating conditions generates an open-loop estimate of the melt viscosity; this estimate is used as an input to a second, 'Feedback' model to predict the pressure of the system. The predicted pressure is compared with the actual measured melt pressure, and the error is used to correct the viscosity estimate. The Prediction model captures the relationship between the operating conditions and the resulting melt viscosity and as such describes the specific material behaviour. The Feedback model, on the other hand, describes the fundamental physical relationship between viscosity and extruder pressure and is a function of the machine geometry. The resulting system yields viscosity estimates within 1% error, shows excellent disturbance rejection properties and can be applied directly to model-based control. This is of major significance for achieving higher quality and reducing waste and set-up times in the polymer extrusion industry.
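
As a concrete illustration of the Prediction-Feedback loop, the sketch below runs a single correction step. The real Prediction and Feedback models are material- and machine-specific and are not given in the abstract; predict_viscosity, pressure_model and every coefficient here are hypothetical placeholders.

```python
# Hedged sketch of one Prediction-Feedback observer step (placeholder models).
def predict_viscosity(screw_speed, barrel_temp):
    """'Prediction' model: operating conditions -> open-loop viscosity estimate (Pa.s)."""
    return 2500.0 - 8.0 * screw_speed - 4.5 * (barrel_temp - 180.0)

def pressure_model(viscosity, screw_speed):
    """'Feedback' model: viscosity and machine geometry -> predicted melt pressure (bar)."""
    geometry_constant = 0.001                    # assumed machine-geometry factor
    return geometry_constant * viscosity * screw_speed

def observer_step(screw_speed, barrel_temp, measured_pressure, gain=0.5):
    """Correct the open-loop viscosity estimate using the pressure error."""
    eta_hat = predict_viscosity(screw_speed, barrel_temp)
    p_hat = pressure_model(eta_hat, screw_speed)
    error = measured_pressure - p_hat
    # dP/d(eta): exact here because the placeholder Feedback model is linear in viscosity
    sensitivity = pressure_model(1.0, screw_speed)
    return eta_hat + gain * error / sensitivity

corrected_eta = observer_step(screw_speed=60.0, barrel_temp=200.0, measured_pressure=115.0)
```

Repeating this correction at each sampling instant yields the closed-loop viscosity estimate that the abstract reports as accurate to within 1% error.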

Relevance:

20.00%

Publisher:

Abstract:

Currently, wind power in the British Isles is dominated by onshore wind farms, but both the United Kingdom and the Republic of Ireland have high renewable energy targets, expected to be met mostly from wind power. As demand for wind power grows to ensure security of energy supply, to provide a potentially cheaper alternative to fossil fuels and to meet greenhouse gas emissions reduction targets, offshore wind power will grow rapidly as the availability of suitable onshore sites decreases. However, wind is variable and stochastic by nature and thus difficult to schedule. To plan for these uncertainties, market operators use wind forecasting tools, reserve plant and ancillary service agreements. Onshore wind power forecasting techniques have improved dramatically and continue to advance, but offshore wind power forecasting is more difficult due to limited datasets and knowledge. As the amount of offshore wind power in the British Isles increases, robust forecasting and planning techniques become even more critical. This paper presents a methodology to investigate the impacts of better offshore wind forecasting on the operation and management of the single wholesale electricity market in the Republic of Ireland and Northern Ireland using PLEXOS for Power Systems. © 2013 IEEE.

Relevance:

20.00%

Publisher:

Abstract:

Indoor personnel localization research has generated a range of potential techniques and algorithms. However, these typically do not account for the influence of the user's body upon the radio channel. In this paper an active RFID-based patient tracking system is demonstrated, and three localization algorithms are used to estimate the location of a user within a modern office building. It is shown that disregarding body effects reduces the accuracy of the algorithms' location estimates, and that body shadowing creates a systematic position error, biasing the estimated location towards the RFID reader to which the active tag has line of sight.
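
The three algorithms evaluated in the paper are not specified in the abstract; as a hedged illustration of the class of methods involved, the sketch below implements a simple weighted-centroid estimator driven by RSSI. Reader coordinates, transmit power and the path-loss exponent are assumed values.

```python
# Hedged sketch: weighted-centroid localization from active-RFID RSSI readings.
import numpy as np

readers = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0], [10.0, 8.0]])  # reader x, y (m)

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Invert a log-distance path-loss model: RSSI -> estimated range (m)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def weighted_centroid(rssi_dbm):
    """Weight each reader by the inverse of its estimated range to the tag."""
    ranges = np.array([rssi_to_distance(r) for r in rssi_dbm])
    w = 1.0 / np.maximum(ranges, 1e-6)
    return (w[:, None] * readers).sum(axis=0) / w.sum()

# Body shadowing attenuates the RSSI seen by obstructed readers, inflating their
# estimated range and shrinking their weight, so the estimate is pulled towards
# the line-of-sight reader (the systematic error described in the abstract).
print(weighted_centroid([-55.0, -70.0, -68.0, -75.0]))
```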

Relevance:

20.00%

Publisher:

Abstract:

To explore the quality of reporting (writing and graphics) of articles that used time-to-event analyses to report dental treatment outcomes. A systematic search of the top 50 dental journals in 2008 produced the sample of articles for this analysis. Articles reporting treatment outcomes with (n = 95) and without (n = 91) time-to-event statistics were reviewed. Survival descriptive words used in the two groups were analysed (Pearson's chi-square). The quality of life tables, survival curves and time-to-event statistics was assessed and explored (agreement analysed with kappa). Words describing dental outcomes 'over time' were more common in time-to-event articles than in control articles (77% vs 3%, P < 0.001). Non-specific use of 'rate' was common across both groups. Life tables and survival curves were used by 39% and 48% of the time-to-event articles respectively, with at least one used by 82%. Construction quality was poor: 21% of life tables and 28% of survival curves achieved an acceptable standard. Time-to-event statistical reporting was also poor: 3% achieved a high and 59% an acceptable standard. The survival statistic, summary figure and standard error were reported in 76%, 95% and 20% of time-to-event articles respectively. Individual statistical terms and graphic aids were common within, and unique to, time-to-event articles. Unfortunately, important details were regularly omitted from statistical descriptions and survival figures, making the overall quality poor. It is likely that such articles will be incorrectly indexed in databases, missed by searchers and not fully understood if identified.
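
As a hedged illustration of the kind of time-to-event statistic the review audits (a survival estimate reported with its standard error), the sketch below computes a Kaplan-Meier curve with Greenwood standard errors on made-up restoration data; it is not drawn from any of the reviewed articles.

```python
# Kaplan-Meier survival estimate with Greenwood standard errors (illustrative data).
import numpy as np

def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 = failure, 0 = censored."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    surv, var_sum, out = 1.0, 0.0, []
    for t in np.unique(times):
        d = int(((times == t) & (events == 1)).sum())     # failures at time t
        n = int((times >= t).sum())                        # number still at risk
        if d > 0:
            surv *= 1 - d / n                              # product-limit step
            var_sum += d / (n * (n - d))
            out.append((t, surv, surv * np.sqrt(var_sum))) # Greenwood SE
    return out

# Ten hypothetical restorations: failure (1) or censored (0) at the given year.
for t, s, se in kaplan_meier([1, 2, 2, 3, 4, 5, 5, 6, 7, 8],
                             [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]):
    print(f"t = {t:.0f} yr  S(t) = {s:.2f}  SE = {se:.2f}")
```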

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: This systematic review reports on the survival of feldspathic porcelain veneers.

MATERIALS AND METHODS: The Cochrane Library, MEDLINE (OVID), Embase, Web of Knowledge, selected journals, clinical trials registers, and conference proceedings were searched independently by two reviewers. Academic colleagues were also contacted to identify relevant research. Inclusion criteria were human cohort studies (prospective and retrospective) and controlled trials assessing outcomes of feldspathic porcelain veneers in more than 15 patients and with at least some of the veneers in situ for 5 years. Of 4,294 articles identified, 116 underwent full-text screening and 69 were further reviewed for eligibility. Of these, 11 were included in the qualitative analysis and 6 (5 cohorts) were included in meta-analyses. Estimated cumulative survival and standard error for each study were assessed and used for meta-, sensitivity, and post hoc analyses. The I² statistic and the Cochran Q test and its associated P value were used to evaluate statistical heterogeneity, with a random-effects meta-analysis used when the P value for heterogeneity was less than .1. Galbraith, forest, and funnel plots were used to explore heterogeneity, publication patterns, and small-study biases.

RESULTS: The estimated cumulative survival for feldspathic porcelain veneers was 95.7% (95% confidence interval [CI]: 92.9% to 98.4%) at 5 years and ranged from 64% to 95% at 10 years across three studies. A post hoc meta-analysis indicated that the 10-year best estimate may approach 95.6% (95% CI: 93.8% to 97.5%). High levels of statistical heterogeneity were found.

CONCLUSIONS: When bonded to an enamel substrate, feldspathic porcelain veneers have a very high 10-year survival rate that may approach 95%. Clinical heterogeneity is associated with differences in reported survival rates. Use of clinically relevant survival definitions and careful reporting of tooth characteristics, censoring, clustering, and precise results in future research would improve meta-analytic estimates and aid treatment decisions.
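
As a hedged sketch of the meta-analytic machinery named in the methods (Cochran's Q, the I² statistic and a DerSimonian-Laird random-effects pool of per-study survival estimates and standard errors), with illustrative inputs rather than the review's data:

```python
# DerSimonian-Laird random-effects pooling with Cochran's Q and I² (illustrative inputs).
import numpy as np

def random_effects_pool(estimates, std_errors):
    y, se = np.asarray(estimates, float), np.asarray(std_errors, float)
    w = 1.0 / se**2                                       # fixed-effect weights
    y_fixed = (w * y).sum() / w.sum()
    Q = (w * (y - y_fixed)**2).sum()                      # Cochran's Q
    df = len(y) - 1
    I2 = 100.0 * max(0.0, (Q - df) / Q) if Q > 0 else 0.0
    tau2 = max(0.0, (Q - df) / (w.sum() - (w**2).sum() / w.sum()))  # between-study variance
    w_re = 1.0 / (se**2 + tau2)                           # random-effects weights
    pooled = (w_re * y).sum() / w_re.sum()
    return pooled, np.sqrt(1.0 / w_re.sum()), Q, I2

# Five hypothetical cohorts reporting 5-year cumulative survival proportions.
pooled, se, Q, I2 = random_effects_pool([0.97, 0.95, 0.93, 0.96, 0.94],
                                        [0.010, 0.020, 0.030, 0.015, 0.020])
print(f"pooled = {pooled:.3f} (95% CI ± {1.96 * se:.3f}), I² = {I2:.0f}%")
```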

Relevance:

20.00%

Publisher:

Abstract:

Background: Heckman-type selection models have been used to control HIV prevalence estimates for selection bias when participation in HIV testing and HIV status are associated after controlling for observed variables. These models typically rely on the strong assumption that the error terms in the participation and the outcome equations that comprise the model are distributed as bivariate normal.
Methods: We introduce a novel approach for relaxing the bivariate normality assumption in selection models using copula functions. We apply this method to estimating HIV prevalence and new confidence intervals (CI) in the 2007 Zambia Demographic and Health Survey (DHS) by using interviewer identity as the selection variable that predicts participation (consent to test) but not the outcome (HIV status).
Results: We show in a simulation study that selection models can generate biased results when the bivariate normality assumption is violated. In the 2007 Zambia DHS, HIV prevalence estimates are similar irrespective of the structure of the association assumed between participation and outcome. For men, we estimate a population HIV prevalence of 21% (95% CI = 16%–25%) compared with 12% (11%–13%) among those who consented to be tested; for women, the corresponding figures are 19% (13%–24%) and 16% (15%–17%).
Conclusions: Copula approaches to Heckman-type selection models are a useful addition to the methodological toolkit of HIV epidemiology and of epidemiology in general. We develop the use of this approach to systematically evaluate the robustness of HIV prevalence estimates based on selection models, both empirically and in a simulation study.
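
A minimal sketch of the core idea, with made-up marginal probabilities: a copula joins the marginal probabilities of consenting to test and of being HIV-positive into a joint probability, and the choice of copula fixes the dependence structure. The Gaussian copula reproduces the bivariate-normal assumption of the classical selection model, while alternatives such as the Frank copula relax it.

```python
# Joint probability of (consent, HIV-positive) under two different copulas.
import numpy as np
from scipy.stats import norm, multivariate_normal

def gaussian_copula(u, v, rho):
    """Dependence structure implied by the bivariate-normal assumption."""
    z = [norm.ppf(u), norm.ppf(v)]
    return multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]]).cdf(z)

def frank_copula(u, v, theta):
    """An alternative (non-normal) dependence structure."""
    num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
    return -np.log(1.0 + num / (np.exp(-theta) - 1.0)) / theta

u, v = 0.80, 0.15    # illustrative marginals: P(consent), P(HIV+)
print("Gaussian copula:", gaussian_copula(u, v, rho=0.3))
print("Frank copula:   ", frank_copula(u, v, theta=2.0))
```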

Relevance:

20.00%

Publisher:

Abstract:

We present optical and near-infrared (NIR) photometry and spectroscopy as well as modelling of the lightcurves of the Type IIb supernova (SN) 2011dh. Our extensive dataset, for which we present the observations obtained after day 100, spans two years; complemented with Spitzer mid-infrared (MIR) data, we use it to build an optical-to-MIR bolometric lightcurve between days 3 and 732. To model the bolometric lightcurve before day 400 we use a grid of hydrodynamical SN models, which allows us to determine the errors in the derived quantities, and a bolometric correction determined with steady-state non-local thermodynamic equilibrium (NLTE) modelling. Using this method we find a helium core mass of 3.1 (+0.7, −0.4) M⊙ for SN 2011dh, consistent within error bars with previous results obtained using the bolometric lightcurve before day 80. We compute bolometric and broad-band lightcurves between days 100 and 500 from spectral steady-state NLTE models, presented and discussed in a companion paper. The preferred 12 M⊙ (initial mass) model, previously found to agree well with the observed spectra, shows good overall agreement with the observed lightcurves, although some discrepancies exist. Time-dependent NLTE modelling shows that after day ∼600 a steady-state assumption is no longer valid. The radioactive energy deposition in this phase is likely dominated by the positrons emitted in the decay of 56Co, but seems insufficient to reproduce the lightcurves, and it is unclear which energy source dominates the emitted flux. We find an excess in the K and MIR bands developing between days 100 and 250, during which an increase in the optical decline rate is also observed. A local origin of the excess is suggested by the depth of the He I 20 581 Å absorption. Steady-state NLTE models with a modest dust opacity in the core (τ = 0.44), turned on during this period, reproduce the observed behaviour, but an additional excess in the Spitzer 4.5 μm band remains. Carbon monoxide (CO) first-overtone band emission is detected at day 206, and possibly at day 89, and assuming the additional excess to be dominated by CO fundamental band emission, we find fundamental-to-first-overtone band ratios considerably higher than observed in SN 1987A. The profiles of the [O I] 6300 Å and Mg I] 4571 Å lines show a remarkable similarity, suggesting that these lines originate from a common nuclear burning zone (O/Ne/Mg), and using small-scale fluctuations in the line profiles we estimate a filling factor of ≲ 0.07 for the emitting material. This paper concludes our extensive observational and modelling work on SN 2011dh. The results from hydrodynamical modelling, steady-state NLTE modelling, and stellar evolutionary progenitor analysis are all consistent, and suggest an initial mass of ∼12 M⊙ for the progenitor.
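
Since the late-time energy input discussed above comes from the 56Ni → 56Co → 56Fe chain, a short sketch of the standard full-trapping decay luminosity may help fix ideas. The decay times and energy-generation rates are commonly quoted literature values, the nickel mass is illustrative rather than the value derived for SN 2011dh, and the time-dependent gamma-ray and positron trapping that actually sets the deposition is not modelled.

```python
# Full-trapping radioactive luminosity from the 56Ni -> 56Co -> 56Fe decay chain.
import numpy as np

M_SUN = 1.989e33      # g
TAU_NI = 8.8          # 56Ni e-folding time (days)
TAU_CO = 111.3        # 56Co e-folding time (days)
EPS_NI = 3.9e10       # erg s^-1 per g of 56Ni
EPS_CO = 6.8e9        # erg s^-1 per g of initial 56Ni, from daughter 56Co decays

def decay_luminosity(t_days, m_ni_msun=0.075):
    """Radioactive decay power (erg/s) at t days after explosion, full trapping."""
    m = m_ni_msun * M_SUN
    return m * (EPS_NI * np.exp(-t_days / TAU_NI)
                + EPS_CO * (np.exp(-t_days / TAU_CO) - np.exp(-t_days / TAU_NI)))

# By day ~600 most gamma-rays escape, so only the few per cent of the 56Co decay
# energy carried by positron kinetic energy is deposited locally, which is why
# the deposition can fall short of the observed flux, as noted in the abstract.
print(f"L(200 d) = {decay_luminosity(200):.2e} erg/s")
```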