960 results for "Kernel density estimates"


Relevance:

40.00%

Publisher:

Abstract:

Time-dependent density-functional theory is a rather accurate and efficient way to compute electronic excitations for finite systems. However, in the macroscopic limit (systems of increasing size), for the usual adiabatic random-phase, local-density, or generalized-gradient approximations, one recovers the Kohn-Sham independent-particle picture, and thus the incorrect band gap. To clarify this trend, we investigate the macroscopic limit of the exchange-correlation kernel in such approximations by means of an algebraic analysis complemented with numerical studies of a one-dimensional tight-binding model. We link the failure of these approximate kernels to shift the Kohn-Sham spectrum to the fact that the corresponding operators in the transition space act only on a finite subspace.
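The abstract does not specify its tight-binding model, so as a hedged illustration of the kind of one-dimensional numerical study it mentions, here is a dimerized (SSH-type) chain whose spectrum has a band gap; the model choice and parameters are assumptions for illustration, not the authors' setup.

```python
import numpy as np

def dimerized_chain(n_sites, t1, t2):
    """Nearest-neighbour tight-binding Hamiltonian with alternating hoppings."""
    H = np.zeros((n_sites, n_sites))
    for i in range(n_sites - 1):
        t = t1 if i % 2 == 0 else t2
        H[i, i + 1] = H[i + 1, i] = -t
    return H

# Strong bond first (t1 > t2) avoids in-gap edge states on an open chain.
E = np.linalg.eigvalsh(dimerized_chain(100, 1.5, 0.5))
gap = E[50] - E[49]   # gap at half filling, close to 2*|t1 - t2| for large chains
```

For a long chain the gap converges to the bulk value 2|t1 - t2|, here 2.0.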

Relevance:

40.00%

Publisher:

Abstract:

The aim of this paper is the numerical treatment of a boundary value problem for the system of Stokes' equations. For this we extend the method of approximate approximations to boundary value problems. This method was introduced by V. Maz'ya in 1991 and has been used until now for the approximation of smooth functions defined on the whole space and for the approximation of volume potentials. In the present paper we develop an approximation procedure for the solution of the interior Dirichlet problem for the system of Stokes' equations in two dimensions. The procedure is based on potential-theoretic considerations in connection with a boundary integral equation method and consists of three approximation steps as follows. In a first step the unknown source density in the potential representation of the solution is replaced by approximate approximations. In a second step the decay behavior of the generating functions is used to gain a suitable approximation for the potential kernel, and in a third step Nyström's method leads to a linear algebraic system for the approximate source density. For every step a convergence analysis is established and corresponding error estimates are given.
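For readers unfamiliar with Maz'ya's approximate approximations, a one-dimensional quasi-interpolant with a Gaussian generating function may help. This is the generic textbook construction, not the paper's Stokes procedure, and the parameter choices (`h`, `D`) are illustrative assumptions.

```python
import math

def quasi_interpolant(f, x, h, D, m_range=400):
    """Maz'ya-type approximate approximation with a Gaussian generating
    function: M_h f(x) = (pi*D)**-0.5 * sum_m f(mh) * exp(-(x-mh)^2/(D*h^2)).
    The error does not vanish as h -> 0 but saturates at roughly exp(-pi^2*D),
    which is negligibly small for moderate D."""
    s = sum(f(m * h) * math.exp(-((x - m * h) ** 2) / (D * h * h))
            for m in range(-m_range, m_range + 1))
    return s / math.sqrt(math.pi * D)

approx = quasi_interpolant(math.sin, 0.3, h=0.05, D=2.0)
```

With D = 2 the saturation error is of order exp(-2π²) ≈ 3e-9, so the visible error is the O(h²) term.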

Relevance:

40.00%

Publisher:

Abstract:

We study Hardy spaces on the boundary of a smooth open subset of R^n and prove that they can be defined either through the intrinsic maximal function or through Poisson integrals, yielding identical spaces. This extends to any smooth open subset of R^n results already known for the unit ball. As an application, a characterization of the weak boundary values of functions that belong to holomorphic Hardy spaces is given, which implies an F. and M. Riesz type theorem. (C) 2004 Elsevier B.V. All rights reserved.

Relevance:

40.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

40.00%

Publisher:

Abstract:

Acknowledgements We would like to thank Erik Rexstad and Rob Williams for useful reviews of this manuscript. The collection of visual and acoustic data was funded by the UK Department of Energy & Climate Change, the Scottish Government, Collaborative Offshore Wind Research into the Environment (COWRIE) and Oil & Gas UK. Digital aerial surveys were funded by Moray Offshore Renewables Ltd and additional funding for analysis of the combined datasets was provided by Marine Scotland. Collaboration between the University of Aberdeen and Marine Scotland was supported by MarCRF. We thank colleagues at the University of Aberdeen, Moray First Marine, NERI, Hi-Def Aerial Surveying Ltd and Ravenair for essential support in the field, particularly Tim Barton, Bill Ruck, Rasmus Nielson and Dave Rutter. Thanks also to Andy Webb, David Borchers, Len Thomas, Kelly McLeod, David L. Miller, Dinara Sadykova and Thomas Cornulier for advice on survey design and statistical approaches. Data Accessibility Data are available from the Dryad Digital Repository: http://dx.doi.org/10.5061/dryad.cf04g

Relevance:

40.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62G07, 60F10.
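MSC 62G07 is nonparametric density estimation, the topic of this search. As generic context (the entry's abstract itself is not shown), a minimal Gaussian kernel density estimator with Silverman's rule-of-thumb bandwidth looks like:

```python
import numpy as np

def gaussian_kde(data, x, bandwidth):
    """Kernel density estimate f_hat(x) = (1/(n*h)) * sum_i K((x - x_i)/h)
    with a standard normal kernel K."""
    u = (x - data[:, None]) / bandwidth
    return np.exp(-0.5 * u * u).sum(axis=0) / (len(data) * bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
sample = rng.standard_normal(10_000)
h = 1.06 * sample.std() * len(sample) ** -0.2   # Silverman's rule of thumb
density_at_0 = gaussian_kde(sample, np.array([0.0]), h)[0]
```

For a standard normal sample the estimate at 0 should be near 1/sqrt(2*pi) ≈ 0.399.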

Relevance:

30.00%

Publisher:

Abstract:

Objective: Because studies of crowding in long-term care settings are lacking, the authors sought to: (1) generate initial estimates of crowding in nursing homes and assisted living facilities; and (2) evaluate two operational approaches to its measurement.

Background: Reactions to density and proximity are complex. Greater density intensifies people's reaction to a situation in the direction (positive or negative) that they would react if the situation were to occur under less dense conditions. People with dementia are especially reactive to the environment.

Methods: Using a cross-sectional correlational design in nursing homes and assisted living facilities involving 185 participants, multiple observations (N = 6,455) of crowding and other environmental variables were made. Crowding, location, and sound were measured three times per observation; ambiance was measured once. Data analyses consisted of descriptive statistics, t-tests, and one-way analysis of variance.

Results: Crowding estimates were higher for nursing homes and in dining and activity rooms. Crowding also varied across settings and locations by time of day. Overall, the interaction of location and time affected crowding significantly (N = 5,559, df [47, 511], F = 105.69, p < .0001); effects were greater within location-by-hour than between location-by-hour, but the effect explained slightly less variance in Long-Term Care Crowding Index (LTC-CI) estimates (47.41%) than location alone. Crowding had small, direct, and highly significant correlations with sound and with the engaging subscale for ambiance; a similar, though inverse, correlation was seen with the soothing subscale for ambiance.

Conclusions: Crowding fluctuates consistent with routine activities such as meals in long-term care settings. Furthermore, a relationship between crowding and other physical characteristics of the environment was found. The LTC-CI is likely to be more sensitive than simple people counts when seeking to evaluate the effects of crowding on the behavior of elders, particularly those with dementia, in long-term care settings.

Keywords: aging in place.

Relevance:

30.00%

Publisher:

Abstract:

X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained by using the local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes; thus an extremely large number of calculations are required. To resolve this large-memory problem, parallelization with OpenMP was used to optimally harness the shared-memory infrastructure on cache-coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 Supercomputer. We see adequate visualization of the results as an important element in this first pioneering study.
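The percolation analysis above relies on cluster labelling. A minimal sketch of the idea behind the Hoshen-Kopelman algorithm, here via union-find on a 2-D binary pore grid (a deliberate simplification of the paper's 3-D analysis):

```python
def percolates(grid):
    """Label pore clusters (4-connectivity, Hoshen-Kopelman-style union-find)
    and test whether any cluster spans from the top row to the bottom row."""
    rows, cols = len(grid), len(grid[0])
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                parent[(r, c)] = (r, c)
                if r and grid[r - 1][c]:
                    union((r, c), (r - 1, c))
                if c and grid[r][c - 1]:
                    union((r, c), (r, c - 1))
    top = {find((0, c)) for c in range(cols) if grid[0][c]}
    bottom = {find((rows - 1, c)) for c in range(cols) if grid[rows - 1][c]}
    return bool(top & bottom)
```

A spanning cluster in the vertical direction corresponds to a percolating pore network.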

Relevance:

30.00%

Publisher:

Abstract:

A critical requirement for safe autonomous navigation of a planetary rover is the ability to accurately estimate the traversability of the terrain. This work considers the problem of predicting the attitude and configuration angles of the platform from terrain representations that are often incomplete due to occlusions and sensor limitations. Using Gaussian Processes (GP) and exteroceptive data as training input, we can provide a continuous and complete representation of terrain traversability, with uncertainty in the output estimates. In this paper, we propose a novel method that focuses on exploiting the explicit correlation in vehicle attitude and configuration during operation by learning a kernel function from vehicle experience to perform GP regression. We provide an extensive experimental validation of the proposed method on a planetary rover. We show significant improvement in the accuracy of our estimation compared with results obtained using standard kernels (Squared Exponential and Neural Network), and compared to traversability estimation made over terrain models built using state-of-the-art GP techniques.
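As background for the kernel comparison above, GP regression with the standard Squared Exponential kernel reduces to a linear solve. This generic sketch is not the authors' learned kernel; the toy data and hyperparameters are assumptions.

```python
import numpy as np

def sq_exp_kernel(a, b, length_scale=1.0):
    """Squared-exponential covariance k(a, b) = exp(-|a - b|^2 / (2 * l^2))."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-6):
    """GP posterior mean: k(x*, X) @ (K(X, X) + noise*I)^-1 @ y."""
    K = sq_exp_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)
    return sq_exp_kernel(x_test, x_train) @ alpha

x = np.arange(0.0, 6.5, 0.5)
mean = gp_predict(x, np.sin(x), np.array([1.05]))
```

With densely spaced, nearly noise-free training points the posterior mean interpolates the underlying function closely.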

Relevance:

30.00%

Publisher:

Abstract:

An important aspect of decision support systems involves applying sophisticated and flexible statistical models to real datasets and communicating these results to decision makers in interpretable ways. An important class of problem is the modelling of incidence such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging as they are 'doubly stochastic', i.e. obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches to the problem either use simple models that obtain predictions from plug-in point estimates and do not distinguish between Cox processes and density estimation, but do use sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models but do not use visualization to render interpretable, complex spatio-temporal forecasts. The contribution here is to fill this gap by inferring predictive distributions of log-Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model on a large-scale discretized grid and adapting an existing spatial-diurnal kernel to the log-Gaussian Cox process context.
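The 'doubly stochastic' structure can be illustrated on a discretized grid: conditional on a log-Gaussian intensity, counts are Poisson, so marginal counts are overdispersed relative to a plain Poisson model. This toy uses independent cells; a real log-Gaussian Cox process would correlate the latent field through a spatial covariance kernel.

```python
import numpy as np

rng = np.random.default_rng(1)

# Latent Gaussian field on a discretized grid (iid cells for illustration only;
# an actual LGCP imposes spatial correlation between cells).
gauss = rng.normal(0.0, 1.0, size=10_000)
intensity = np.exp(gauss)          # log-Gaussian intensity per cell
counts = rng.poisson(intensity)    # doubly stochastic counts

# Overdispersion: marginal variance exceeds the mean, unlike plain Poisson.
overdispersed = counts.var() > counts.mean()
```

For a unit-variance latent field the theoretical variance-to-mean ratio of the counts is well above 1, which the simulation reproduces.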

Relevance:

30.00%

Publisher:

Abstract:

Cells respond to various biochemical and physical cues during wound-healing and tumour progression. In vitro assays used to study these processes are typically conducted in one particular geometry and it is unclear how the assay geometry affects the capacity of cell populations to spread, or whether the relevant mechanisms, such as cell motility and cell proliferation, are somehow sensitive to the geometry of the assay. In this work we use a circular barrier assay to characterise the spreading of cell populations in two different geometries. Assay 1 describes a tumour-like geometry where a cell population spreads outwards into an open space. Assay 2 describes a wound-like geometry where a cell population spreads inwards to close a void. We use a combination of discrete and continuum mathematical models and automated image processing methods to obtain independent estimates of the effective cell diffusivity, D, and the effective cell proliferation rate, λ. Using our parameterised mathematical model we confirm that our estimates of D and λ accurately predict the time-evolution of the location of the leading edge and the cell density profiles for both assay 1 and assay 2. Our work suggests that the effective cell diffusivity is up to 50% lower for assay 2 compared to assay 1, whereas the effective cell proliferation rate is up to 30% lower for assay 2 compared to assay 1.
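The continuum model typically paired with such barrier assays is the Fisher-KPP equation with diffusivity D and proliferation rate λ. A minimal explicit finite-difference sketch in one dimension (grid, parameters, and boundary handling are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def fisher_kpp(D=1.0, lam=1.0, dx=0.5, dt=0.1, t_end=10.0, length=50.0):
    """Explicit finite-difference solution of c_t = D*c_xx + lam*c*(1 - c).
    Stability of the explicit scheme requires D*dt/dx^2 <= 1/2."""
    x = np.arange(0.0, length + dx, dx)
    c = (x < 5.0).astype(float)         # initial population occupies x < 5
    for _ in range(int(t_end / dt)):
        lap = (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
        lap[0] = lap[-1] = 0.0          # crude zero-flux boundaries
        c = c + dt * (D * lap + lam * c * (1 - c))
    return x, c

x, c = fisher_kpp()
```

The solution develops a travelling front of asymptotic speed 2*sqrt(D*lam); behind the front the density saturates at the carrying capacity, ahead of it the density is essentially zero.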

Relevance:

30.00%

Publisher:

Abstract:

Quantifying the impact of biochemical compounds on collective cell spreading is an essential element of drug design, with various applications including developing treatments for chronic wounds and cancer. Scratch assays are a technically simple and inexpensive method used to study collective cell spreading; however, most previous interpretations of scratch assays are qualitative and do not provide estimates of the cell diffusivity, D, or the cell proliferation rate, λ. Estimating D and λ is important for investigating the efficacy of a potential treatment and provides insight into the mechanism through which the potential treatment acts. While a few methods for estimating D and λ have been proposed, these previous methods lead to point estimates of D and λ, and provide no insight into the uncertainty in these estimates. Here, we compare various types of information that can be extracted from images of a scratch assay, and quantify D and λ using discrete computational simulations and approximate Bayesian computation. We show that it is possible to robustly recover estimates of D and λ from synthetic data, as well as a new set of experimental data. For the first time, our approach also provides a method to estimate the uncertainty in our estimates of D and λ. We anticipate that our approach can be generalized to deal with more realistic experimental scenarios in which we are interested in estimating D and λ, as well as additional relevant parameters such as the strength of cell-to-cell adhesion or the strength of cell-to-substrate adhesion.
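The approximate Bayesian computation idea can be sketched with a toy deterministic simulator and a single rate parameter standing in for λ; the actual paper uses discrete stochastic simulations of cell spreading, so everything below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(2)

def logistic_curve(lam, t):
    """Toy stand-in simulator: logistic growth c(t) = 1 / (1 + 9*exp(-lam*t))."""
    return 1.0 / (1.0 + 9.0 * np.exp(-lam * t))

t = np.linspace(0.0, 10.0, 20)
observed = logistic_curve(0.8, t)   # synthetic 'data' with true rate 0.8

# ABC rejection: draw from a uniform prior, simulate, and keep the draws
# whose output lies closest to the observation (the best 2% of 5000 draws).
draws = rng.uniform(0.0, 2.0, 5000)
dist = np.array([np.sum((logistic_curve(d, t) - observed) ** 2) for d in draws])
accepted = draws[np.argsort(dist)[:100]]
posterior_mean = accepted.mean()
```

The accepted draws concentrate around the true rate, and their spread gives exactly the uncertainty quantification that point estimates lack.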

Relevance:

30.00%

Publisher:

Abstract:

Objective. To assess the cost-effectiveness of bone density screening programmes for osteoporosis. Study design. Using published and locally available data regarding fracture rates and treatment costs, the overall costs per fracture prevented, cost per quality-adjusted life year (QALY) saved and cost per year of life gained were estimated for different bone density screening and osteoporosis treatment programmes. Main outcome measures. Cost per fracture prevented, cost per QALY saved, and cost per year of life gained. Results. In women over the age of 50 years, the costs per fracture prevented of treating all women with hormone replacement therapy, or treating only if osteoporosis is demonstrated on bone density screening, were £32,594 or £23,867 respectively. For alendronate therapy for the same groups, the costs were £171,067 and £14,067 respectively. Once the background rate of treatment with alendronate reaches 18%, bone density screening becomes cost-saving. Cost estimates per QALY saved ranged from £1,514 to £39,076 for osteoporosis treatment with alendronate following bone density screening. Conclusions. For relatively expensive medications such as alendronate, treatment programmes with prior bone density screening are far more cost-effective than those without, and in some circumstances become cost-saving. Costs per QALY saved and per year of life gained for osteoporosis treatment with prior bone density screening compare favourably with treatment of hypertension and hypercholesterolemia.
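The headline quantity, cost per fracture prevented, is the net programme cost divided by the expected number of fractures prevented. A worked sketch with hypothetical inputs (these are NOT the paper's figures):

```python
def cost_per_fracture_prevented(n_screened, screen_cost, treat_fraction,
                                treat_cost, baseline_risk, risk_reduction):
    """Total programme cost divided by expected fractures prevented."""
    n_treated = n_screened * treat_fraction
    total_cost = n_screened * screen_cost + n_treated * treat_cost
    prevented = n_treated * baseline_risk * risk_reduction
    return total_cost / prevented

# Hypothetical example: screen 1000 women at 40 each; treat the 20% found
# osteoporotic at 1500 each; a 15% baseline fracture risk halved by treatment.
cpfp = cost_per_fracture_prevented(1000, 40.0, 0.20, 1500.0, 0.15, 0.5)
```

Here 200 women are treated, total cost is 340,000, and 15 fractures are prevented, giving roughly 22,667 per fracture prevented; the same arithmetic underlies the screen-versus-treat-all comparisons in the abstract.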

Relevance:

30.00%

Publisher:

Abstract:

Error estimates for the error reproducing kernel method (ERKM) are provided. The ERKM is a mesh-free functional approximation scheme [A. Shaw, D. Roy, A NURBS-based error reproducing kernel method with applications in solid mechanics, Computational Mechanics (2006), to appear (available online)], wherein a targeted function and its derivatives are first approximated via non-uniform rational B-spline (NURBS) basis functions. Errors in the NURBS approximation are then reproduced via a family of non-NURBS basis functions, constructed using a polynomial reproduction condition, and added to the NURBS approximation of the function obtained in the first step. In addition to the derivation of error estimates, convergence studies are undertaken for a couple of test boundary value problems with known exact solutions. The ERKM is next applied to a one-dimensional Burgers equation where time evolution leads to a breakdown of the continuous solution and the appearance of a shock. Many available mesh-free schemes appear to be unable to capture this shock without numerical instability. However, given that any desired order of continuity is achievable through NURBS approximations, the ERKM can even accurately approximate functions with discontinuous derivatives. Moreover, due to the variation diminishing property of NURBS, it has advantages in representing sharp changes in gradients. This paper is focused on demonstrating this ability of ERKM via some numerical examples. Comparisons of some of the results with those via the standard form of the reproducing kernel particle method (RKPM) demonstrate the relative numerical advantages and accuracy of the ERKM.
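The NURBS machinery underlying the ERKM builds on B-spline basis functions defined by the Cox-de Boor recursion (a NURBS basis is a weighted rational combination of these). The sketch below shows only that recursion, not the error-reproduction step; the knot vector is an illustrative choice.

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right

# Open knot vector, degree 2, five basis functions.
knots = [0.0, 0.0, 0.0, 1.0, 2.0, 3.0, 3.0, 3.0]
values = [bspline_basis(i, 2, 1.5, knots) for i in range(5)]
```

At any interior parameter value the basis functions are non-negative and sum to one (partition of unity), which is what gives B-splines and NURBS their variation diminishing behaviour.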

Relevance:

30.00%

Publisher:

Abstract:

Urban encroachment on dense, coastal koala populations has ensured that their management has received increasing government and public attention. The recently developed National Koala Conservation Strategy calls for maintenance of viable populations in the wild. Yet the success of this, and other, conservation initiatives is hampered by lack of reliable and generally accepted national and regional population estimates. In this paper we address this problem in a potentially large, but poorly studied, regional population in the State that is likely to have the largest wild populations. We draw on findings from previous reports in this series and apply the faecal standing-crop method (FSCM) to derive a regional estimate of more than 59 000 individuals. Validation trials in riverine communities showed that estimates of animal density obtained from the FSCM and direct observation were in close agreement. Bootstrapping and Monte Carlo simulations were used to obtain variance estimates for our population estimates in different vegetation associations across the region. The most favoured habitat was riverine vegetation, which covered only 0.9% of the region but supported 45% of the koalas. We also estimated that between 1969 and 1995 ~30% of the native vegetation associations that are considered as potential koala habitat were cleared, leading to a decline of perhaps 10% in koala numbers. Management of this large regional population has significant implications for the national conservation of the species: the continued viability of this population is critically dependent on the retention and management of riverine and residual vegetation communities, and future vegetation-management guidelines should be cognisant of the potential impacts of clearing even small areas of critical habitat. We also highlight eight management implications.
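The faecal standing-crop method converts pellet counts into animal density by dividing by the per-animal deposition rate times the mean pellet persistence time. Combined with the bootstrap variance estimation mentioned above, a hedged sketch (all survey numbers and parameters below are hypothetical, not the paper's values):

```python
import random

def fscm_density(pellet_counts_per_ha, defecation_rate, decay_days):
    """Faecal standing-crop estimate: animals/ha = mean pellet standing crop
    per ha / (pellets per animal per day * mean pellet persistence in days)."""
    mean_crop = sum(pellet_counts_per_ha) / len(pellet_counts_per_ha)
    return mean_crop / (defecation_rate * decay_days)

def bootstrap_ci(counts, defecation_rate, decay_days, n_boot=2000, seed=0):
    """Percentile bootstrap interval for the density estimate."""
    rng = random.Random(seed)
    reps = sorted(
        fscm_density([rng.choice(counts) for _ in counts],
                     defecation_rate, decay_days)
        for _ in range(n_boot))
    return reps[int(0.025 * n_boot)], reps[int(0.975 * n_boot)]

# Hypothetical survey: pellet counts on ten 1-ha plots, with assumed
# deposition-rate and decay parameters.
plots = [900, 1500, 1100, 1300, 1000, 1250, 1400, 950, 1200, 1350]
point = fscm_density(plots, defecation_rate=150.0, decay_days=60.0)
lo, hi = bootstrap_ci(plots, 150.0, 60.0)
```

Scaling the per-hectare density by habitat area within each vegetation association is then what yields a regional population total.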