402 results for Semi-parametric estimation
Abstract:
Data-driven approaches such as Gaussian Process (GP) regression have been used extensively in recent robotics literature to achieve estimation by learning from experience. To ensure satisfactory performance, in most cases, multiple learning inputs are required. Intuitively, adding new inputs can often contribute to better estimation accuracy; however, it may come at the cost of a new sensor, a larger training dataset and/or more complex learning, sometimes for limited benefit. Therefore, it is crucial to have a systematic procedure to determine the actual impact each input has on the estimation performance. To address this issue, in this paper we propose to analyse the impact of each input on the estimate using a variance-based sensitivity analysis method. We propose an approach built on Analysis of Variance (ANOVA) decomposition, which can characterise how the prediction changes as one or more of the inputs change, and also quantify the prediction uncertainty attributable to each of the inputs in a framework of dependent inputs. We apply the proposed approach to a terrain-traversability estimation method we proposed in prior work, which is based on multi-task GP regression, and we validate this implementation experimentally using a rover on a Mars-analogue terrain.
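A minimal sketch of the variance-based idea: Saltelli's pick-and-freeze estimator of first-order Sobol indices applied to a toy GP's predictive mean. The independent-input assumption, toy data, kernel, and sample sizes are illustrative; the paper's method instead handles dependent inputs via an ANOVA decomposition of a multi-task GP.

```python
# First-order Sobol indices of a GP's predictive mean via pick-and-freeze.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Toy training data: y depends strongly on x0, weakly on x1.
X_train = rng.uniform(0.0, 1.0, size=(80, 2))
y_train = np.sin(2 * np.pi * X_train[:, 0]) + 0.1 * X_train[:, 1]

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=1e-4)
gp.fit(X_train, y_train)
f = gp.predict

# Saltelli-style estimator for first-order indices (independent inputs).
N, d = 4096, 2
A = rng.uniform(0.0, 1.0, size=(N, d))
B = rng.uniform(0.0, 1.0, size=(N, d))
fA, fB = f(A), f(B)
var_total = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]          # freeze all inputs except x_i
    S_i = np.mean(fB * (f(ABi) - fA)) / var_total
    print(f"first-order Sobol index for input {i}: {S_i:.3f}")
```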
Abstract:
We present a Bayesian sampling algorithm called adaptive importance sampling or population Monte Carlo (PMC), whose computational workload is easily parallelizable and thus has the potential to considerably reduce the wall-clock time required for sampling, among other benefits. To assess the performance of the approach for cosmological problems, we use simulated and actual data consisting of CMB anisotropies, supernovae of type Ia, and weak cosmological lensing, and provide a comparison of results to those obtained using state-of-the-art Markov chain Monte Carlo (MCMC). For both types of data sets, we find comparable parameter estimates for PMC and MCMC, with the advantage of a significantly lower wall-clock time for PMC. In the case of WMAP5 data, for example, the wall-clock time drops from days for MCMC to hours for PMC on a cluster of processors. Other benefits of the PMC approach, along with potential difficulties in using the approach, are analyzed and discussed.
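A minimal sketch of the PMC idea: draw from a proposal, weight against the unnormalised target, and adapt the proposal from the weighted sample; the per-iteration draws are embarrassingly parallel. The 2-D Gaussian target and single-component proposal below are illustrative stand-ins for cosmological likelihoods and mixture proposals.

```python
# Adaptive importance sampling (PMC) with a Gaussian proposal.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)

def log_target(x):
    # Stand-in for a log posterior, e.g. CMB + SNIa + lensing likelihoods.
    return multivariate_normal(mean=[2.0, -1.0],
                               cov=[[1.0, 0.6], [0.6, 1.0]]).logpdf(x)

mu, cov = np.zeros(2), 4.0 * np.eye(2)   # deliberately poor initial proposal
for it in range(5):
    x = rng.multivariate_normal(mu, cov, size=2000)   # parallelizable step
    logw = log_target(x) - multivariate_normal(mu, cov).logpdf(x)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    ess = 1.0 / np.sum(w**2)                          # effective sample size
    mu = w @ x                                        # weighted-moment update
    cov = (x - mu).T @ ((x - mu) * w[:, None]) + 1e-6 * np.eye(2)
    print(f"iter {it}: ESS = {ess:.0f}, mean = {mu.round(2)}")
```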
Abstract:
We describe a novel approach to treatment planning for focal brachytherapy that utilizes a biologically based inverse optimization algorithm and biological imaging to deliver an ablative dose to known regions of significant tumour burden and a lower, therapeutic dose to low-risk regions.
Abstract:
In this paper, we examine approaches to estimating a Bayesian mixture model at both single and multiple time points for a sample of actual and simulated aerosol particle size distribution (PSD) data. For estimation of a mixture model at a single time point, we use Reversible Jump Markov Chain Monte Carlo (RJMCMC) to estimate the mixture model parameters, including the number of components, which is assumed to be unknown. We compare the results of this approach to a commonly used estimation method in the aerosol physics literature. As PSD data are often measured over time, frequently at small time intervals, we also examine the use of an informative prior for estimation of the mixture parameters that takes into account the correlated nature of the parameters. The Bayesian mixture model offers a promising approach, providing advantages in both estimation and inference.
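A minimal sketch, deliberately not RJMCMC: a variational Bayesian mixture whose Dirichlet prior prunes unused components, similarly yielding an effective number of modes. The lognormal toy data mimic an aerosol PSD on the log-diameter scale; all sizes and weights are illustrative.

```python
# Bayesian mixture with automatic pruning of unused components.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(2)
# Two lognormal modes (nucleation + accumulation), diameters in micrometres.
log_d = np.concatenate([rng.normal(np.log(0.02), 0.3, 600),
                        rng.normal(np.log(0.15), 0.4, 400)])[:, None]

bgm = BayesianGaussianMixture(n_components=8,              # generous upper bound
                              weight_concentration_prior=0.01,
                              max_iter=500, random_state=0).fit(log_d)

active = bgm.weights_ > 0.01
print("effective number of components:", active.sum())
print("mode diameters (um):", np.exp(bgm.means_[active, 0]).round(3))
print("weights:", bgm.weights_[active].round(2))
```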
Abstract:
This research informs the provision of seismic mitigation in building structures through the use of semi-active and passive dampers. A Magneto-Rheological (MR) semi-active damper model was developed using control algorithms and integrated into seismically excited structures as a time-domain function. Linear and nonlinear structural models are evaluated in real-time scenarios. The research findings can be used for the design and construction of earthquake-safe buildings with optimally employed MR dampers and MR-passive damper combinations.
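A minimal sketch of how such a semi-active law can be exercised in the time domain: a Bingham-type MR damper force with a simple on-off switching rule, attached to a single-storey shear model under sinusoidal ground motion. All parameter values, the switching rule, and the excitation are illustrative assumptions, not the calibrated models used in the research.

```python
# Single-storey shear model with an on-off semi-active MR (Bingham) damper.
import numpy as np

m, k, c = 1e4, 4e5, 4e3                # kg, N/m, N*s/m (toy storey)
c0, f_on, f_off = 2e3, 8e3, 1e3        # Bingham viscous and yield forces (N)
dt, T = 1e-3, 10.0
t = np.arange(0.0, T, dt)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 1.0 * t)   # ground acceleration (m/s^2)

x = v = 0.0
peak = 0.0
for a_g in ag:
    # On-off rule: high yield force while the storey drifts away from rest.
    f_y = f_on if x * v > 0 else f_off
    f_mr = c0 * v + f_y * np.sign(v)            # Bingham damper force
    a = -(k * x + c * v + f_mr) / m - a_g       # relative-coordinate EOM
    v += a * dt                                 # semi-implicit Euler step
    x += v * dt
    peak = max(peak, abs(x))
print(f"peak drift with semi-active MR damper: {peak * 1000:.1f} mm")
```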
Abstract:
Exposure to ambient air pollution is a major risk factor for global disease. Assessment of the impacts of air pollution on population health and the evaluation of trends relative to other major risk factors requires regularly updated, accurate, spatially resolved exposure estimates. We combined satellite-based estimates, chemical transport model (CTM) simulations and ground measurements from 79 different countries to produce new global estimates of annual average fine particle (PM2.5) and ozone concentrations at 0.1° × 0.1° spatial resolution for five-year intervals from 1990–2010 and the year 2013. These estimates were then applied to assess population-weighted mean concentrations for 1990–2013 for each of 188 countries. In 2013, 87% of the world's population lived in areas exceeding the World Health Organization (WHO) Air Quality Guideline of 10 μg/m³ PM2.5 (annual average). Between 1990 and 2013, decreases in population-weighted mean concentrations of PM2.5 were evident in most high income countries, in contrast to increases estimated in South Asia, throughout much of Southeast Asia, and in China. Population-weighted mean concentrations of ozone increased in most countries from 1990–2013, with modest decreases in North America, parts of Europe, and several countries in Southeast Asia.
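A minimal sketch of the population-weighted aggregation described above, run on random placeholder grids standing in for the satellite/CTM concentration estimates and gridded population; the country mask and grid size are hypothetical.

```python
# Population-weighted mean PM2.5 and guideline exceedance over a country mask.
import numpy as np

rng = np.random.default_rng(3)
pm25 = rng.gamma(shape=3.0, scale=8.0, size=(180, 360))   # ug/m^3 per cell
pop = rng.pareto(a=2.0, size=(180, 360)) * 1e4            # persons per cell
country = np.zeros((180, 360), dtype=bool)
country[40:60, 100:140] = True                            # hypothetical mask

pw_mean = np.sum(pm25[country] * pop[country]) / np.sum(pop[country])
exceed = np.sum(pop[country & (pm25 > 10.0)]) / np.sum(pop[country])
print(f"population-weighted mean PM2.5: {pw_mean:.1f} ug/m^3")
print(f"share of population above WHO guideline (10 ug/m^3): {exceed:.0%}")
```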
Abstract:
This research develops a design support system that can estimate the life cycle cost of different product families at the early stage of product development. By implementing the system, a designer is able to develop various cost-effective product families in a shorter lead time and minimise the destructive impact of the product family on the environment.
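A minimal sketch of the kind of roll-up such a system might perform: a discounted life cycle cost comparison of two hypothetical product-family variants. The cost figures, lifetime, and discount rate are illustrative assumptions.

```python
# Discounted life cycle cost (LCC) comparison of product-family variants.
def life_cycle_cost(production, annual_use, end_of_life, years, rate=0.05):
    """Net present cost over the product's life."""
    use = sum(annual_use / (1 + rate) ** y for y in range(1, years + 1))
    return production + use + end_of_life / (1 + rate) ** years

# (production cost, annual use-phase cost, end-of-life cost), arbitrary units.
variants = {"shared-platform": (120.0, 14.0, 8.0),
            "bespoke": (90.0, 22.0, 15.0)}
for name, (prod, use, eol) in variants.items():
    print(f"{name}: LCC = {life_cycle_cost(prod, use, eol, years=7):.1f}")
```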
Abstract:
The aim of this paper is to assess the heritability of the cerebral cortex, based on measurements of grey matter (GM) thickness derived from structural MR images (sMRI). With data acquired from a large twin cohort (328 subjects), an automated method was used to estimate the cortical thickness, and the EM-ICP surface registration algorithm was used to establish the correspondence of the cortex across the population. An ACE model was then employed to compute the heritability of cortical thickness. Cortical thickness was found to be heritable in various cortical regions, especially in the frontal and parietal lobes, such as the bilateral postcentral gyri, superior occipital gyri, superior parietal gyri, precuneus, the orbital part of the right frontal gyrus, right medial superior frontal gyrus, right middle occipital gyrus, right paracentral lobule, left precentral gyrus, and left dorsolateral superior frontal gyrus.
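A minimal sketch of the twin logic behind the ACE model, using Falconer's approximation from MZ and DZ twin correlations; the paper's full ACE model is fit by structural equation modelling, and the per-pair thickness values below are simulated, not the study's data.

```python
# Falconer-style A/C/E decomposition from simulated twin-pair thicknesses.
import numpy as np

rng = np.random.default_rng(4)
n_pairs = 80
g = rng.normal(0, 1, n_pairs)                       # genes shared by MZ co-twins
mz = np.stack([2.5 + 0.25 * g + rng.normal(0, 0.1, n_pairs),
               2.5 + 0.25 * g + rng.normal(0, 0.1, n_pairs)])
gc, g1, g2 = rng.normal(0, 1, (3, n_pairs))         # DZ co-twins share half
s = np.sqrt(0.5)
dz = np.stack([2.5 + 0.25 * (s * gc + s * g1) + rng.normal(0, 0.1, n_pairs),
               2.5 + 0.25 * (s * gc + s * g2) + rng.normal(0, 0.1, n_pairs)])

r_mz, r_dz = np.corrcoef(mz)[0, 1], np.corrcoef(dz)[0, 1]
a2 = 2 * (r_mz - r_dz)        # additive genetic variance (heritability)
c2 = 2 * r_dz - r_mz          # shared environment
e2 = 1 - r_mz                 # unique environment + measurement error
print(f"h^2 = {a2:.2f}, c^2 = {c2:.2f}, e^2 = {e2:.2f}")
```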
Abstract:
Parallel programming and effective partitioning of applications for embedded many-core architectures require optimization algorithms. However, these algorithms have to evaluate thousands of different partitions quickly. We present a fast performance estimator embedded in a parallelizing compiler for streaming applications. The estimator combines a single execution-based simulation and an analytic approach. Experimental results demonstrate that the estimator has a mean error of 2.6% and computes its estimate 2848 times faster than a cycle-accurate simulator.
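A minimal sketch of the analytic half of such an estimator: per-actor costs are assumed to come from a single simulated run, and each candidate partition is scored as the busiest core's load plus an inter-core communication penalty. The actor costs, graph, and penalty are hypothetical, not the compiler's actual cost model.

```python
# Analytic scoring of streaming-graph partitions from one-off measured costs.
import itertools
import random

random.seed(5)
actor_cost = {a: random.randint(50, 500) for a in "ABCDEFGH"}  # cycles/firing
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"),
         ("E", "F"), ("F", "G"), ("G", "H")]
COMM = 120  # cycles charged per edge crossing cores

def estimate(partition):
    """Steady-state period = busiest core's load + inter-core traffic."""
    load = {}
    for actor, core in partition.items():
        load[core] = load.get(core, 0) + actor_cost[actor]
    comm = sum(COMM for u, v in edges if partition[u] != partition[v])
    return max(load.values()) + comm

# Exhaustively score all 2-core partitions (stand-in for the compiler's search).
best = min(({a: c for a, c in zip(actor_cost, bits)}
            for bits in itertools.product((0, 1), repeat=len(actor_cost))),
           key=estimate)
print("best partition:", best, "-> estimated period:", estimate(best), "cycles")
```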
Abstract:
The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects and, in a case study example, provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens (live high-train low (LHTL) and intermittent hypoxic exposure (IHE)) on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and to be able to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a 0.93 probability of a substantially greater improvement in running economy, and a greater than 0.96 probability that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a placebo. The conclusions are consistent with those obtained using the ‘magnitude-based inference’ approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.
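A minimal sketch of how posterior draws yield the kind of probabilistic statements reported above, e.g. P(effect > smallest worthwhile change); the normal "posterior" and the threshold are illustrative stand-ins for the study's MCMC output.

```python
# Probability of a substantial effect from posterior draws.
import numpy as np

rng = np.random.default_rng(6)
# Stand-in for MCMC draws of the LHTL-minus-IHE difference (% change).
effect_draws = rng.normal(loc=3.0, scale=1.5, size=20000)

threshold = 1.0   # smallest substantial effect, same units as the draws
p_substantial = np.mean(effect_draws > threshold)
lo, hi = np.percentile(effect_draws, [2.5, 97.5])
print(f"P(effect > {threshold}): {p_substantial:.2f}")
print(f"95% credible interval: ({lo:.1f}, {hi:.1f})")
```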
Abstract:
Background There is a strong link between antibiotic consumption and the rate of antibiotic resistance. In Australia, the vast majority of antibiotics are prescribed by general practitioners, and the most common indication is for acute respiratory infections. The aim of this study is to assess if implementing a package of integrated, multifaceted interventions reduces antibiotic prescribing for acute respiratory infections in general practice. Methods/design This is a cluster randomised trial comparing two parallel groups of general practitioners in 28 urban general practices in Queensland, Australia: 14 intervention and 14 control practices. The protocol was peer-reviewed by content experts who were nominated by the funding organization. This study evaluates an integrated, multifaceted evidence-based package of interventions implemented over a six month period. The included interventions, which have previously been demonstrated to be effective at reducing antibiotic prescribing for acute respiratory infections, are: delayed prescribing; patient decision aids; communication training; commitment to a practice prescribing policy for antibiotics; patient information leaflet; and near patient testing with C-reactive protein. In addition, two sub-studies are nested in the main study: (1) point prevalence estimation of the carriage of bacterial upper respiratory pathogens in practice staff and asymptomatic patients; (2) the feasibility of direct measures of antibiotic resistance by nose/throat swabbing. The main outcome data are from Australia’s national health insurance scheme, Medicare, which will be accessed after the completion of the intervention phase. They include the number of antibiotic prescriptions and the number of patient visits per general practitioner for periods before and during the intervention. The incidence of antibiotic prescriptions will be modelled using the numbers of patients as the denominator and seasonal and other factors as explanatory variables. Results will compare the change in prescription rates before and during the intervention in the two groups of practices. Semi-structured interviews will be conducted with the general practitioners and practice staff (practice nurse and/or practice manager) from the intervention practices on conclusion of the intervention phase to assess the feasibility and uptake of the interventions. An economic evaluation will be conducted to estimate the costs of implementing the package, and its cost-effectiveness in terms of cost per unit reduction in prescribing. Discussion The results on the effectiveness, cost-effectiveness, acceptability and feasibility of this package of interventions will inform the policy for any national implementation.
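A minimal sketch of the incidence model described above: a Poisson GLM with the log of patient numbers as an offset and a group-by-period interaction (a difference-in-differences on the log scale). The simulated data and column names are assumptions; the trial will use Medicare records.

```python
# Prescribing incidence as a Poisson GLM with a patient-count offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 28 * 2  # 28 practices x (before, during) periods
df = pd.DataFrame({
    "practice": np.repeat(np.arange(28), 2),
    "intervention": np.repeat([0] * 14 + [1] * 14, 2),
    "during": np.tile([0, 1], 28),
    "visits": rng.integers(800, 3000, n),
})
# Simulate a 30% (log-scale) reduction in intervention practices during trial.
base_rate = 0.25 * np.exp(-0.3 * df.intervention * df.during)
df["prescriptions"] = rng.poisson(base_rate * df.visits)

model = smf.glm("prescriptions ~ intervention * during",
                data=df, family=sm.families.Poisson(),
                offset=np.log(df["visits"])).fit()
# The interaction coefficient estimates the intervention's rate change.
print(model.summary().tables[1])
```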
Abstract:
Diffusion in a composite slab consisting of a large number of layers provides an ideal prototype problem for developing and analysing two-scale modelling approaches for heterogeneous media. Numerous analytical techniques have been proposed for solving the transient diffusion equation in a one-dimensional composite slab consisting of an arbitrary number of layers. Most of these approaches, however, require the solution of a complex transcendental equation arising from a matrix determinant for the eigenvalues that is difficult to solve numerically for a large number of layers. To overcome this issue, in this paper, we present a semi-analytical method based on the Laplace transform and an orthogonal eigenfunction expansion. The proposed approach uses eigenvalues local to each layer that can be obtained either explicitly, or by solving simple transcendental equations. The semi-analytical solution is applicable to both perfect and imperfect contact at the interfaces between adjacent layers and either Dirichlet, Neumann or Robin boundary conditions at the ends of the slab. The solution approach is verified for several test cases and is shown to work well for a large number of layers. The work is concluded with an application to macroscopic modelling where the solution of a fine-scale multilayered medium consisting of two hundred layers is compared against an “up-scaled” variant of the same problem involving only ten layers.
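A minimal sketch of the kind of numerical reference such a semi-analytical solution can be verified against: an explicit finite-volume solution of transient diffusion through an alternating-diffusivity slab with Dirichlet ends and perfect contact at interfaces. The layer properties and discretisation are illustrative assumptions, not the paper's Laplace-transform/eigenfunction method.

```python
# Finite-volume reference for 1-D transient diffusion in a layered slab.
import numpy as np

m = 20                                   # layers (the paper scales to hundreds)
cells_per_layer, L = 10, 1.0
n = m * cells_per_layer
dx = L / n
D_layer = np.tile([0.1, 1.0], m)[:m]     # alternating slow/fast layers
D = np.repeat(D_layer, cells_per_layer)  # cell-wise diffusivity

# Harmonic mean at faces enforces flux continuity (perfect contact).
D_face = 2 * D[:-1] * D[1:] / (D[:-1] + D[1:])

u = np.zeros(n)                          # initial condition u = 0
u_left, u_right = 1.0, 0.0               # Dirichlet end conditions
dt = 0.4 * dx**2 / D.max()               # explicit stability limit
for _ in range(20000):
    flux = np.empty(n + 1)
    flux[1:-1] = -D_face * (u[1:] - u[:-1]) / dx
    flux[0] = -2 * D[0] * (u[0] - u_left) / dx      # half-cell boundary flux
    flux[-1] = -2 * D[-1] * (u_right - u[-1]) / dx
    u -= dt * (flux[1:] - flux[:-1]) / dx
print("u at layer midpoints:", u[cells_per_layer // 2::cells_per_layer].round(3))
```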