69 results for Hyperbolic smoothing


Relevance: 10.00%

Abstract:

Realistic virtual models of leaf surfaces are important for a number of applications in the plant sciences, such as modelling agrichemical spray droplet movement and spreading on the surface. In this context, the virtual surfaces are required to be sufficiently smooth to facilitate the use of the mathematical equations that govern the motion of the droplet. While an effective approach is to apply discrete smoothing D2-spline algorithms to reconstruct the leaf surfaces from three-dimensional scanned data, difficulties arise when dealing with wheat leaves that tend to twist and bend. To overcome this topological difficulty, we develop a parameterisation technique that rotates and translates the original data, allowing the surface to be fitted using the discrete smoothing D2-spline methods in the new parameter space. Our algorithm uses finite element methods to represent the surface as a linear combination of compactly supported shape functions. Numerical results confirm that the parameterisation, along with the use of discrete smoothing D2-spline techniques, produces realistic virtual representations of wheat leaves.
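As a rough sketch of the kind of pre-processing this abstract describes (hypothetical, not the authors' exact algorithm), the rotation-and-translation step can be viewed as moving the scanned points into their principal-axis frame, where a single-valued surface z(x, y) is easier to fit with smoothing splines:

```python
import numpy as np

def principal_axis_parameterisation(points):
    """Rotate and translate 3D points into their principal-axis frame.

    Hypothetical pre-processing in the spirit of the paper's
    parameterisation: centre the scanned leaf points, then rotate them
    so the directions of greatest spread align with the coordinate axes.
    """
    centred = points - points.mean(axis=0)
    # SVD of the centred cloud gives the principal directions as rows of vt.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt.T  # coordinates in the rotated frame

# toy "twisted leaf": an elongated, tilted, translated point cloud
rng = np.random.default_rng(0)
raw = rng.normal(size=(200, 3)) * np.array([5.0, 1.0, 0.2])
tilt = np.array([[0.8, -0.6, 0.0], [0.6, 0.8, 0.0], [0.0, 0.0, 1.0]])
cloud = raw @ tilt.T + np.array([10.0, -3.0, 2.0])

flat = principal_axis_parameterisation(cloud)
cov = np.cov(flat.T)  # cross-covariances vanish in the new frame
```

In the rotated frame the coordinate covariances are diagonal, so the surface has a natural "flat" parameter space for the D2-spline fit.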

Abstract:

This thesis has contributed to the advancement of knowledge in disease modelling by addressing interesting and crucial issues relevant to modelling health data over space and time. The research has led to an increased understanding of spatial scales, temporal scales, and spatial smoothing for modelling diseases, in terms of both their methodology and their applications. This research is of particular significance to researchers seeking to employ statistical modelling techniques over space and time in various disciplines. A broad class of statistical models is employed to assess what impact spatial and temporal scales have on simulated and real data.

Abstract:

Given the drawbacks for using geo-political areas in mapping outcomes unrelated to geo-politics, a compromise is to aggregate and analyse data at the grid level. This has the advantage of allowing spatial smoothing and modelling at a biologically or physically relevant scale. This article addresses two consequent issues: the choice of the spatial smoothness prior and the scale of the grid. Firstly, we describe several spatial smoothness priors applicable for grid data and discuss the contexts in which these priors can be employed based on different aims. Two such aims are considered, i.e., to identify regions with clustering and to model spatial dependence in the data. Secondly, the choice of the grid size is shown to depend largely on the spatial patterns. We present a guide on the selection of spatial scales and smoothness priors for various point patterns based on the two aims for spatial smoothing.
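One common grid-level smoothness prior of the kind discussed is the intrinsic CAR prior, whose precision matrix can be built directly from the grid's neighbour structure. A minimal sketch with rook (first-order) neighbours; the article compares several priors, not necessarily this exact one:

```python
import numpy as np

def icar_precision(nx, ny):
    """Precision matrix Q = D - W of an intrinsic CAR smoothness prior
    on an nx-by-ny grid with rook (first-order) neighbours.

    W is the binary adjacency matrix and D the diagonal matrix of
    neighbour counts; Q has zero row sums, which is the defining
    feature of the intrinsic (improper) CAR prior.
    """
    n = nx * ny
    W = np.zeros((n, n))
    for i in range(nx):
        for j in range(ny):
            k = i * ny + j
            if i + 1 < nx:              # neighbour in the next row
                W[k, k + ny] = W[k + ny, k] = 1.0
            if j + 1 < ny:              # neighbour in the next column
                W[k, k + 1] = W[k + 1, k] = 1.0
    D = np.diag(W.sum(axis=1))
    return D - W

Q = icar_precision(4, 5)   # 4 x 5 grid, 20 cells
```

Note how the grid size directly sets the dimension of Q, which is one reason the choice of spatial scale matters computationally as well as statistically.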

Abstract:

To enhance the efficiency of regression parameter estimation by modeling the correlation structure of correlated binary error terms in quantile regression with repeated measurements, we propose a Gaussian pseudolikelihood approach for estimating correlation parameters and selecting the most appropriate working correlation matrix simultaneously. The induced smoothing method is applied to estimate the covariance of the regression parameter estimates, which can bypass density estimation of the errors. Extensive numerical studies indicate that the proposed method performs well in selecting an accurate correlation structure and improving regression parameter estimation efficiency. The proposed method is further illustrated by analyzing a dental dataset.
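The induced smoothing idea can be illustrated in the simplest setting, a single quantile: the non-smooth indicator in the quantile score is replaced by a normal CDF, making the estimating function differentiable. A hypothetical one-parameter sketch (the bandwidth here is a fixed illustrative value, not the paper's data-driven choice):

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def smoothed_quantile_score(beta, y, tau, h):
    """Induced-smoothed estimating function for a scalar tau-quantile.

    The exact score uses the indicator I(y - beta < 0); induced
    smoothing replaces it with Phi((beta - y) / h), so the function is
    smooth in beta and covariance estimation can bypass density
    estimation of the errors.
    """
    u = (beta - y) / h
    return np.mean(tau - np.array([norm_cdf(v) for v in u]))

rng = np.random.default_rng(1)
y = rng.normal(size=2000)
tau, h = 0.5, 0.05

# the score is monotone decreasing in beta, so bisection finds the root
lo, hi = -3.0, 3.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if smoothed_quantile_score(mid, y, tau, h) > 0:
        lo = mid
    else:
        hi = mid
beta_hat = 0.5 * (lo + hi)   # close to the sample median
```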

Abstract:

Evolutionary algorithms are playing an increasingly important role as search methods in cognitive science domains. In this study, methodological issues in the use of evolutionary algorithms were investigated via simulations in which procedures were systematically varied to modify the selection pressures on populations of evolving agents. Traditional roulette wheel, tournament, and variations of these selection algorithms were compared on the “needle-in-a-haystack” problem developed by Hinton and Nowlan in their 1987 study of the Baldwin effect. The task is an important one for cognitive science, as it demonstrates the power of learning as a local search technique in smoothing a fitness landscape that lacks gradient information. One aspect that has continued to foster interest in the problem is the observation of residual learning ability in simulated populations even after long periods of time. Effective evolutionary algorithms balance their search effort between broad exploration of the search space and in-depth exploitation of promising solutions already found. Issues discussed include the differential effects of rank and proportional selection, the tradeoff between migration of populations towards good solutions and maintenance of diversity, and the development of measures that illustrate how each selection algorithm affects the search process over generations. We show that both roulette wheel and tournament algorithms can be modified to appropriately balance search between exploration and exploitation, and effectively eliminate residual learning in this problem.
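The two baseline selection schemes compared in such studies can be sketched in a few lines; the fitness values below are illustrative, not Hinton and Nowlan's actual landscape:

```python
import random

def roulette_select(fitness, rng):
    """Fitness-proportional (roulette wheel) selection of one index.
    Assumes non-negative fitness; selection pressure depends on the
    raw fitness scale."""
    total = sum(fitness)
    pick = rng.uniform(0, total)
    acc = 0.0
    for i, f in enumerate(fitness):
        acc += f
        if acc >= pick:
            return i
    return len(fitness) - 1

def tournament_select(fitness, k, rng):
    """Tournament selection: best of k uniformly drawn competitors.
    Depends only on fitness ranks, so pressure is set by k, not by
    the fitness scale."""
    contestants = [rng.randrange(len(fitness)) for _ in range(k)]
    return max(contestants, key=lambda i: fitness[i])

rng = random.Random(42)
fitness = [1.0, 1.0, 1.0, 10.0]   # one "needle" individual at index 3

roulette_hits = sum(roulette_select(fitness, rng) == 3 for _ in range(5000))
tourney_hits = sum(tournament_select(fitness, 3, rng) == 3 for _ in range(5000))
```

Roulette wheel picks the needle with probability 10/13, while size-3 tournament picks it with probability 1 - (3/4)^3, illustrating how the two schemes exert different selection pressure on the same population.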

Abstract:

Most of the existing algorithms for approximate Bayesian computation (ABC) assume that it is feasible to simulate pseudo-data from the model at each iteration. However, the computational cost of these simulations can be prohibitive for high dimensional data. An important example is the Potts model, which is commonly used in image analysis. Images encountered in real world applications can have millions of pixels, therefore scalability is a major concern. We apply ABC with a synthetic likelihood to the hidden Potts model with additive Gaussian noise. Using a pre-processing step, we fit a binding function to model the relationship between the model parameters and the synthetic likelihood parameters. Our numerical experiments demonstrate that the precomputed binding function dramatically improves the scalability of ABC, reducing the average runtime required for model fitting from 71 hours to only 7 minutes. We also illustrate the method by estimating the smoothing parameter for remotely sensed satellite imagery. Without precomputation, Bayesian inference is impractical for datasets of that scale.
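The precomputation idea (pay the simulation cost once on a pilot grid of parameter values, then reuse a fitted map inside ABC) can be sketched as follows; the toy summary statistic and the polynomial form of the binding function are both illustrative assumptions, not the paper's construction:

```python
import numpy as np

def precompute_binding_function(simulate, thetas, degree=3):
    """Fit a polynomial 'binding function' mapping a scalar model
    parameter to the mean of a summary statistic, from pilot runs.

    Hypothetical sketch: the expensive simulator is called once per
    grid point here, and never again during inference.
    """
    summaries = np.array([simulate(t) for t in thetas])
    coeffs = np.polyfit(thetas, summaries, degree)
    return np.poly1d(coeffs)

rng = np.random.default_rng(7)

def simulate(theta, n=500):
    # stand-in for an expensive Potts-model simulation: the summary
    # statistic is a noisy monotone function of the parameter
    return np.mean(np.tanh(theta) + 0.1 * rng.normal(size=n))

grid = np.linspace(0.0, 2.0, 41)
binding = precompute_binding_function(simulate, grid)

# at "inference time", evaluate the cheap fitted map instead of simulating
approx = binding(1.0)
```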

Abstract:

In this paper, the axial performance of two heavily instrumented barrette piles, with and without toe grouting, socketed into a gravel layer in Taipei is evaluated based on the results of pile load tests. Both piles are 44 m long with the same cross-section of 0.8 m by 2.7 m, installed by a hydraulic long bucket. One pile, with toe grouting, was socketed 6 m into the gravel layer; the other, without toe grouting, was socketed 3 m into the gravel layer. The load versus displacement relationships at the pile head, the t-z curves of the upper soil layers and of the bottom gravel layer, and the tip resistance versus displacement relationships are of particular interest and are presented in the paper. The t-z curves interpreted from the measured data along the pile depth are also simulated with the hyperbolic model.
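The hyperbolic t-z model mentioned at the end has a simple closed form, t = z / (a + b z), where 1/a is the initial stiffness and 1/b the asymptotic (ultimate) unit shaft resistance. A minimal sketch with illustrative parameters, not values fitted to the Taipei tests:

```python
def hyperbolic_tz(z, a, b):
    """Hyperbolic t-z model: shaft resistance t mobilised at local pile
    displacement z is t = z / (a + b*z).

    1/a is the initial stiffness and 1/b the ultimate unit shaft
    resistance; in practice a and b are fitted to measured t-z data.
    """
    return z / (a + b * z)

a, b = 0.5, 0.01          # illustrative units, e.g. mm/kPa and 1/kPa
small = hyperbolic_tz(0.01, a, b)   # near-origin response: slope ~ 1/a
large = hyperbolic_tz(1e6, a, b)    # large displacement: t -> 1/b
```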

Abstract:

In this paper, a class of unconditionally stable difference schemes based on the Padé approximation is presented for the Riesz space-fractional telegraph equation. Firstly, we introduce a new variable to transform the original differential equation into an equivalent differential equation system. Then, we apply a second-order fractional central difference scheme to discretise the Riesz space-fractional operator. Finally, we use the (1, 1), (2, 2) and (3, 3) Padé approximations to give a fully discrete difference scheme for the resulting linear system of ordinary differential equations. Matrix analysis is used to show the unconditional stability of the proposed algorithms. Two examples with known exact solutions are chosen to assess the proposed difference schemes. Numerical results demonstrate that these schemes provide accurate and efficient methods for solving a space-fractional hyperbolic equation.
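The weights of a second-order fractional central difference for the Riesz operator can be generated by a simple recurrence from the Gamma-function formula; for alpha = 2 they reduce to the classical three-point second-difference stencil. A minimal sketch (the truncation length is illustrative):

```python
from math import gamma

def riesz_central_weights(alpha, K):
    """Weights g_0..g_K of a second-order fractional central difference
    for the Riesz derivative of order alpha (1 < alpha <= 2); by
    symmetry g_{-k} = g_k.

    Uses the stable recurrence
        g_{k+1} = g_k * (k - alpha/2) / (k + alpha/2 + 1),
    started from g_0 = Gamma(alpha + 1) / Gamma(alpha/2 + 1)**2, which
    avoids evaluating Gamma at its poles for integer alpha.
    """
    g = [gamma(alpha + 1.0) / gamma(alpha / 2.0 + 1.0) ** 2]
    for k in range(K):
        g.append(g[-1] * (k - alpha / 2.0) / (k + alpha / 2.0 + 1.0))
    return g

w2 = riesz_central_weights(2.0, 4)     # classical case: [2, -1, 0, 0, 0]
w15 = riesz_central_weights(1.5, 50)   # genuinely fractional case
```

For alpha = 2 only the centre and first neighbours are non-zero, recovering the usual second derivative; for fractional alpha the weights decay slowly and (including both symmetric tails) sum to zero.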

Abstract:

This study aimed to investigate the spatial clustering and dynamic dispersion of dengue incidence in Queensland, Australia. We used Moran's I statistic to assess the spatial autocorrelation of reported dengue cases. Spatial empirical Bayes smoothing estimates were used to display the spatial distribution of dengue in postal areas throughout Queensland. Local indicators of spatial association (LISA) maps and logistic regression models were used to identify spatial clusters and examine the spatio-temporal patterns of the spread of dengue. The results indicate that the spatial distribution of dengue was clustered during each of the three periods of 1993–1996, 1997–2000 and 2001–2004. The high-incidence clusters of dengue were primarily concentrated in the north of Queensland and low-incidence clusters occurred in the south-east of Queensland. The study concludes that the geographical range of notified dengue cases has significantly expanded in Queensland over recent years.
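Moran's I itself is a short computation once a spatial weight matrix is in hand. A toy sketch on a six-area chain (not the Queensland postal areas), contrasting a clustered pattern with a checkerboard one:

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I for values x with spatial weight matrix W
    (W[i, j] > 0 when areas i and j are neighbours). Positive values
    indicate spatial clustering, negative values spatial dispersion."""
    z = x - x.mean()
    n = len(x)
    return (n / W.sum()) * (z @ W @ z) / (z @ z)

# toy 1-D chain of 6 areas with rook neighbours
W = np.zeros((6, 6))
for i in range(5):
    W[i, i + 1] = W[i + 1, i] = 1.0

clustered = np.array([9.0, 8.0, 7.0, 2.0, 1.0, 0.0])    # smooth trend
alternating = np.array([9.0, 0.0, 9.0, 0.0, 9.0, 0.0])  # checkerboard

i_clustered = morans_i(clustered, W)      # strongly positive
i_alternating = morans_i(alternating, W)  # strongly negative
```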

Abstract:

The ability to estimate the expected Remaining Useful Life (RUL) is critical to reducing maintenance costs, operational downtime and safety hazards. In most industries, reliability analysis is based on Reliability Centred Maintenance (RCM) and lifetime distribution models. In these models, the lifetime of an asset is estimated using failure time data; however, statistically sufficient failure time data are often difficult to obtain in practice, owing to fixed time-based replacement and the small populations of identical assets. When condition indicator data are available in addition to failure time data, an alternative to the traditional reliability models is Condition-Based Maintenance (CBM). Covariate-based hazard modelling is one such CBM approach. A number of covariate-based hazard models exist; however, little work has evaluated their performance in asset life prediction across different condition indicators and levels of data availability. This paper reviews two covariate-based hazard models, the Proportional Hazard Model (PHM) and the Proportional Covariate Model (PCM). To assess the models' performance, the expected RUL is compared to the actual RUL. The outcomes demonstrate that both models achieve convincingly good RUL predictions; however, PCM has a smaller absolute prediction error. In addition, PHM shows an over-smoothing tendency compared to PCM under sudden changes in the condition data. Moreover, the case studies show that PCM is not biased when the sample size is small.
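Under a proportional hazard model, the expected RUL at the current time is the mean residual life of the conditional survival function. A toy sketch with a Weibull baseline and a single covariate (illustrative numbers, not the paper's condition-monitoring data):

```python
import numpy as np

def phm_expected_rul(t_now, eta, k, beta, z, t_max=500.0, n=20000):
    """Expected remaining useful life under a proportional hazard model
    with Weibull baseline hazard h0(t) = (k/eta) * (t/eta)**(k - 1)
    and covariate effect exp(beta * z).

    Computes the mean residual life E[T - t_now | T > t_now, z] by
    trapezoidal integration of the conditional survival function.
    """
    grid = np.linspace(t_now, t_max, n)
    # PHM survival: S(t | z) = exp(-(t/eta)**k * exp(beta * z))
    surv = np.exp(-((grid / eta) ** k) * np.exp(beta * z))
    dt = grid[1] - grid[0]
    integral = dt * (surv.sum() - 0.5 * (surv[0] + surv[-1]))
    return integral / surv[0]

healthy = phm_expected_rul(t_now=10.0, eta=100.0, k=2.0, beta=0.8, z=0.0)
degraded = phm_expected_rul(t_now=10.0, eta=100.0, k=2.0, beta=0.8, z=2.0)
```

A worse condition indicator (larger z) multiplies the hazard and shortens the expected RUL, which is the mechanism both PHM and PCM exploit.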

Abstract:

This project constructed virtual plant leaf surfaces from digitised data sets for use in droplet spray models. Digitisation techniques for obtaining data sets for cotton, chenopodium and wheat leaves are discussed and novel algorithms for the reconstruction of the leaves from these three plant species are developed. The reconstructed leaf surfaces are included into agricultural droplet spray models to investigate the effect of the nozzle and spray formulation combination on the proportion of spray retained by the plant. A numerical study of the post-impaction motion of large droplets that have formed on the leaf surface is also considered.

Abstract:

We extended genetic linkage analysis - an analysis widely used in quantitative genetics - to 3D images to analyze single gene effects on brain fiber architecture. We collected 4 Tesla diffusion tensor images (DTI) and genotype data from 258 healthy adult twins and their non-twin siblings. After high-dimensional fluid registration, at each voxel we estimated the genetic linkage between the single nucleotide polymorphism (SNP), Val66Met (dbSNP number rs6265), of the BDNF gene (brain-derived neurotrophic factor) with fractional anisotropy (FA) derived from each subject's DTI scan, by fitting structural equation models (SEM) from quantitative genetics. We also examined how image filtering affects the effect sizes for genetic linkage by examining how the overall significance of voxelwise effects varied with respect to full width at half maximum (FWHM) of the Gaussian smoothing applied to the FA images. Raw FA maps with no smoothing yielded the greatest sensitivity to detect gene effects, when corrected for multiple comparisons using the false discovery rate (FDR) procedure. The BDNF polymorphism significantly contributed to the variation in FA in the posterior cingulate gyrus, where it accounted for around 90-95% of the total variance in FA. Our study generated the first maps to visualize the effect of the BDNF gene on brain fiber integrity, suggesting that common genetic variants may strongly determine white matter integrity.
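The FDR step is the standard Benjamini-Hochberg procedure; applied voxelwise it controls the expected fraction of falsely flagged voxels. A minimal sketch on a handful of p-values:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg procedure: return a boolean mask of rejected
    hypotheses at false discovery rate q.

    Sort the p-values, find the largest rank k with
    p_(k) <= q * k / n, and reject the k smallest p-values.
    """
    p = np.asarray(pvals)
    n = len(p)
    order = np.argsort(p)
    ranked = p[order]
    below = ranked <= q * (np.arange(1, n + 1) / n)
    reject = np.zeros(n, dtype=bool)
    if below.any():
        k = np.max(np.where(below)[0])   # largest passing rank (0-based)
        reject[order[: k + 1]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.30, 0.74]
mask = benjamini_hochberg(pvals, q=0.05)   # rejects the two smallest
```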

Abstract:

A key question in diffusion imaging is how many diffusion-weighted images suffice to provide adequate signal-to-noise ratio (SNR) for studies of fiber integrity. Motion, physiological effects, and scan duration all affect the achievable SNR in real brain images, making theoretical studies and simulations only partially useful. We therefore scanned 50 healthy adults with 105-gradient high-angular resolution diffusion imaging (HARDI) at 4T. From gradient image subsets of varying size (6 ≤ N ≤ 94) that optimized a spherical angular distribution energy, we created SNR plots (versus gradient numbers) for seven common diffusion anisotropy indices: fractional and relative anisotropy (FA, RA), mean diffusivity (MD), volume ratio (VR), geodesic anisotropy (GA), its hyperbolic tangent (tGA), and generalized fractional anisotropy (GFA). SNR, defined in a region of interest in the corpus callosum, was near-maximal with 58, 66, and 62 gradients for MD, FA, and RA, respectively, and with about 55 gradients for GA and tGA. For VR and GFA, SNR increased rapidly with more gradients. SNR was optimized when the ratio of diffusion-sensitized to non-sensitized images was 9.13 for GA and tGA, 10.57 for FA, 9.17 for RA, and 26 for MD and VR. In orientation density functions modeling the HARDI signal as a continuous mixture of tensors, the diffusion profile reconstruction accuracy rose rapidly with additional gradients. These plots may help in making trade-off decisions when designing diffusion imaging protocols.
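Several of the anisotropy indices compared here are simple functions of the diffusion tensor's eigenvalues (GA and tGA, which involve the log-eigenvalue geometry, are omitted from this sketch). A minimal sketch of the standard definitions:

```python
import numpy as np

def anisotropy_indices(ev):
    """Common diffusion anisotropy indices from the three eigenvalues
    of a diffusion tensor: mean diffusivity (MD), fractional anisotropy
    (FA), relative anisotropy (RA) and volume ratio (VR)."""
    l1, l2, l3 = ev
    md = (l1 + l2 + l3) / 3.0
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    fa = np.sqrt(1.5 * num / (l1 ** 2 + l2 ** 2 + l3 ** 2))
    ra = np.sqrt(num / 3.0) / md          # std of eigenvalues over mean
    vr = (l1 * l2 * l3) / md ** 3         # ellipsoid volume over sphere
    return md, fa, ra, vr

iso = anisotropy_indices((1.0, 1.0, 1.0))     # isotropic: FA = 0, VR = 1
aniso = anisotropy_indices((1.7, 0.2, 0.2))   # fibre-like: high FA, low VR
```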

Abstract:

Lattice-based cryptographic primitives are believed to offer resilience against attacks by quantum computers. We demonstrate the practicality of post-quantum key exchange by constructing cipher suites for the Transport Layer Security (TLS) protocol that provide key exchange based on the ring learning with errors (R-LWE) problem, and we accompany these cipher suites with a rigorous proof of security. Our approach ties lattice-based key exchange together with traditional authentication using RSA or elliptic curve digital signatures: the post-quantum key exchange provides forward secrecy against future quantum attackers, while authentication can be provided using RSA keys issued by today's commercial certificate authorities, smoothing the path to adoption. Our cryptographically secure implementation, aimed at the 128-bit security level, reveals that the performance price of switching from non-quantum-safe key exchange is not too high. With our R-LWE cipher suites integrated into the OpenSSL library and using the Apache web server on a 2-core desktop computer, we could serve 506 RLWE-ECDSA-AES128-GCM-SHA256 HTTPS connections per second for a 10 KiB payload. Compared to elliptic curve Diffie-Hellman, this amounts to an 8 KiB larger handshake and a reduction in throughput of only 21%. This demonstrates that provably secure post-quantum key exchange can already be considered practical.

Abstract:

A smoothed rank-based procedure is developed for the accelerated failure time model to overcome computational issues. The proposed estimator is based on an EM-type procedure coupled with induced smoothing. The proposed iterative approach converges provided the initial value is based on a consistent estimator, and the limiting covariance matrix can be obtained from a sandwich-type formula. The consistency and asymptotic normality of the proposed estimator are also established. Extensive simulations show that the new estimator is not only computationally less demanding but also more reliable than the other existing estimators.