950 results for Gaussian random fields
Abstract:
Public acceptance is consistently listed as having an enormous impact on the implementation and success of a congestion charge scheme. This paper investigates public acceptance of such a scheme in Australia. Surveys were conducted in Brisbane and Melbourne, the two fastest-growing Australian cities. Using an ordered logit modeling approach, the survey data, including stated preferences, were analyzed to pinpoint the important factors influencing people's attitudes to a congestion charge and, in turn, their transport mode choices. To accommodate the nature of the panel data and account for the resulting heterogeneity, random effects were included in the models. As expected, the study found that the amount of the congestion charge and the financial benefits of implementing it have a significant influence on respondents' support for the charge and on the likelihood of their taking a bus to city areas; however, respondents' current primary transport mode for travelling to the city has a more pronounced impact. Respondents' perceptions of the charge's role in protecting the environment by reducing vehicle emissions, and of the extent to which it would mean they travelled less frequently to the city for shopping or entertainment, also significantly affect their level of support for its implementation. Notable differences between the two cities were also found and explained. Finally, the findings are discussed in detail in relation to the literature.
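As a generic sketch of the ordered logit machinery behind such an analysis (not the authors' code; the covariates, coefficients, and cutpoints below are hypothetical), category probabilities are successive differences of logistic CDFs:

```python
import numpy as np

def ordered_logit_probs(X, beta, cutpoints):
    """Category probabilities for an ordered logit model.

    P(y <= k | x) = logistic(c_k - x'beta); the probability of each of the
    K = len(cutpoints) + 1 ordered categories is a successive difference.
    """
    eta = X @ beta                                        # linear predictor
    cum = 1.0 / (1.0 + np.exp(-(cutpoints[None, :] - eta[:, None])))
    n = len(eta)
    cum = np.hstack([np.zeros((n, 1)), cum, np.ones((n, 1))])
    return np.diff(cum, axis=1)                           # shape (n, K)

# hypothetical covariates: charge amount ($) and a car-as-main-mode flag
X = np.array([[10.0, 1.0], [2.0, 0.0]])
beta = np.array([-0.1, 0.8])                              # hypothetical effects
cutpoints = np.array([-1.0, 0.0, 1.0])                    # 4 support levels
probs = ordered_logit_probs(X, beta, cutpoints)
```

A random-effects version would add a subject-specific intercept inside `eta`; the fixed-effects core shown here is the same.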
Abstract:
Abnormal event detection has attracted a lot of attention in the computer vision research community in recent years due to the increased focus on automated surveillance systems to improve security in public places. Because training data are scarce and the definition of an abnormality depends on context, abnormal event detection is generally formulated as a data-driven approach in which activities are modeled in an unsupervised fashion during the training phase. In this work, we use a Gaussian mixture model (GMM) to cluster the activities during the training phase, and propose a Gaussian mixture model based Markov random field (GMM-MRF) to estimate the likelihood scores of new videos in the testing phase. Furthermore, we propose two new features, optical acceleration and the histogram of optical flow gradients, to detect the presence of abnormal objects and speed violations in the scene. We show that our proposed method outperforms other state-of-the-art abnormal event detection algorithms on the publicly available UCSD dataset.
Abstract:
To enhance the efficiency of regression parameter estimation by modeling the correlation structure of correlated binary error terms in quantile regression with repeated measurements, we propose a Gaussian pseudolikelihood approach for estimating correlation parameters and selecting the most appropriate working correlation matrix simultaneously. The induced smoothing method is applied to estimate the covariance of the regression parameter estimates, which can bypass density estimation of the errors. Extensive numerical studies indicate that the proposed method performs well in selecting an accurate correlation structure and improving regression parameter estimation efficiency. The proposed method is further illustrated by analyzing a dental dataset.
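The Gaussian pseudolikelihood idea can be sketched for an exchangeable working correlation: standardised residual vectors are scored under candidate correlation matrices, and the structure/parameter maximising the pseudolikelihood is selected. This toy (simulated residuals, grid search over rho) is an assumption-laden illustration and omits the quantile-regression and induced-smoothing machinery:

```python
import numpy as np

def exchangeable(m, rho):
    """Exchangeable working correlation matrix of size m."""
    return (1.0 - rho) * np.eye(m) + rho * np.ones((m, m))

def gaussian_pseudo_loglik(resid, R):
    """Gaussian pseudolikelihood of standardised residual vectors under R."""
    Rinv = np.linalg.inv(R)
    _, logdet = np.linalg.slogdet(R)
    return sum(-0.5 * (r @ Rinv @ r) - 0.5 * logdet for r in resid)

# simulate repeated measurements with true exchangeable correlation 0.5
rng = np.random.default_rng(3)
resid = rng.multivariate_normal(np.zeros(4), exchangeable(4, 0.5), size=300)

# estimate rho by grid search; rho = 0 corresponds to working independence
rhos = np.linspace(0.0, 0.8, 33)
lls = [gaussian_pseudo_loglik(resid, exchangeable(4, r)) for r in rhos]
rho_hat = rhos[int(np.argmax(lls))]
```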
Abstract:
Most of the existing algorithms for approximate Bayesian computation (ABC) assume that it is feasible to simulate pseudo-data from the model at each iteration. However, the computational cost of these simulations can be prohibitive for high dimensional data. An important example is the Potts model, which is commonly used in image analysis. Images encountered in real world applications can have millions of pixels, therefore scalability is a major concern. We apply ABC with a synthetic likelihood to the hidden Potts model with additive Gaussian noise. Using a pre-processing step, we fit a binding function to model the relationship between the model parameters and the synthetic likelihood parameters. Our numerical experiments demonstrate that the precomputed binding function dramatically improves the scalability of ABC, reducing the average runtime required for model fitting from 71 hours to only 7 minutes. We also illustrate the method by estimating the smoothing parameter for remotely sensed satellite imagery. Without precomputation, Bayesian inference is impractical for datasets of that scale.
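The binding-function idea can be sketched on a toy model: simulations are spent once, up front, to fit a cheap map from the parameter to the synthetic-likelihood parameters, after which the synthetic log-likelihood is evaluated without any further simulation. Everything below (the model, the summary statistic, the polynomial form of the binding function) is a hypothetical stand-in for the Potts-model setting:

```python
import numpy as np

rng = np.random.default_rng(1)

def summary(theta, n=100):
    """Simulate pseudo-data from a toy model and reduce it to a summary statistic."""
    return rng.normal(theta, 1.0, n).mean()

# pre-processing step: fit a binding function mapping theta to the
# parameters (mean, sd) of the synthetic (Gaussian) likelihood
grid = np.linspace(-3.0, 3.0, 25)
reps = np.array([[summary(t) for _ in range(200)] for t in grid])
mu_coef = np.polyfit(grid, reps.mean(axis=1), deg=1)   # mean(theta): linear fit
sd_coef = np.polyfit(grid, reps.std(axis=1), deg=0)    # sd(theta): constant fit

def synthetic_loglik(theta, s_obs):
    """Synthetic log-likelihood evaluated WITHOUT new simulations."""
    m = np.polyval(mu_coef, theta)
    s = np.polyval(sd_coef, theta)
    return -0.5 * ((s_obs - m) / s) ** 2 - np.log(s)

# "observed" summary generated at theta = 1, then a point estimate by grid search
s_obs = summary(1.0)
thetas = np.linspace(-3.0, 3.0, 601)
theta_hat = thetas[np.argmax([synthetic_loglik(t, s_obs) for t in thetas])]
```

The speedup reported in the abstract comes from exactly this trade: the expensive simulations happen once in pre-processing, so each posterior evaluation is nearly free.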
Abstract:
With the overwhelming increase in the amount of data on the web and in databases, many text mining techniques have been proposed for mining useful patterns in text documents. Extracting closed sequential patterns using the Pattern Taxonomy Model (PTM) is one of the pruning methods used to remove noisy, inconsistent, and redundant patterns. However, the PTM treats each extracted pattern as a whole without considering its constituent terms, which can affect the quality of the extracted patterns. This paper proposes an innovative and effective method that extends the random set to accurately weight patterns based on their distribution in the documents and the distribution of terms within patterns. The proposed approach then finds the specific closed sequential patterns (SCSP) based on the newly calculated weights. Experimental results on the Reuters Corpus Volume 1 (RCV1) data collection and TREC topics show that the proposed method significantly outperforms other state-of-the-art methods on several popular measures.
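A deliberately simple sketch of weighting patterns by both document support and the terms they contain; the corpus, patterns, and weighting formula below are hypothetical and far simpler than the paper's random-set treatment:

```python
from collections import Counter

# toy corpus: each document is a list of extracted closed sequential patterns
docs = [
    [("data", "mining"), ("text", "mining")],
    [("data", "mining"), ("pattern", "taxonomy")],
]

# term weight: frequency of each term across all patterns, normalised
term_counts = Counter(t for doc in docs for pat in doc for t in pat)
total = sum(term_counts.values())
term_w = {t: c / total for t, c in term_counts.items()}

def pattern_weight(pattern):
    """Hypothetical weight: document support scaled by the pattern's term weights."""
    support = sum(pattern in doc for doc in docs) / len(docs)
    return support * sum(term_w[t] for t in pattern)
```

A pattern made of frequent terms that appears in many documents thus outweighs a rare pattern of rare terms, which is the intuition the abstract describes.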
Abstract:
Successful prediction of groundwater flow and solute transport through highly heterogeneous aquifers has remained elusive due to the limitations of methods to characterize hydraulic conductivity (K) and generate realistic stochastic fields from such data. As a result, many studies have suggested that the classical advective-dispersive equation (ADE) cannot reproduce such transport behavior. Here we demonstrate that when high-resolution K data are used with a fractal stochastic method that produces K fields with adequate connectivity, the classical ADE can accurately predict solute transport at the macrodispersion experiment site in Mississippi. This development offers great promise for accurately predicting contaminant plume migration, designing more effective remediation schemes, and reducing environmental risks. Key Points: (1) Non-Gaussian transport behavior at the MADE site is unraveled. (2) The ADE can reproduce tracer transport in heterogeneous aquifers with no calibration. (3) The new fractal method generates heterogeneous K fields with adequate connectivity.
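A minimal 1-D spectral-synthesis sketch of a fractal (fBm-like) field of the kind such methods build on; the Hurst exponent and grid size here are assumed, and the paper's method for producing connected 3-D K fields is considerably more involved:

```python
import numpy as np

rng = np.random.default_rng(2)
n, H = 1024, 0.8                      # grid size and Hurst exponent (assumed)

# power-law amplitude spectrum |A(f)| ~ f^-(H + 1/2) with random phases
freqs = np.fft.rfftfreq(n)
amp = np.zeros_like(freqs)
amp[1:] = freqs[1:] ** (-(H + 0.5))   # skip the DC bin
phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
field = np.fft.irfft(amp * np.exp(1j * phases), n)
field = (field - field.mean()) / field.std()   # standardise, e.g. to log-K units
```

Larger H yields smoother, more spatially persistent fields, which is what gives the generated K fields their connected high-conductivity pathways.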
Abstract:
Background: Random Breath Testing (RBT) has been a cornerstone of enforcement attempts to deter (as well as apprehend) drink drivers in Queensland (Australia) for decades. However, scant published research has examined the relationship between the frequency of RBT activities and subsequent drink driving apprehension rates over time. Aim: This study aimed to examine the prevalence of apprehending drink drivers in Queensland over a 12-year period. It was hypothesised that an increase in breath testing rates would produce a corresponding decrease in drink driving apprehension rates over time, reflecting general deterrent effects. Method: The Queensland Police Service provided the RBT data that were analysed. Results: Between 1 January 2000 and 31 December 2011, 35,082,386 random breath tests (both mobile and stationary) were conducted in Queensland, resulting in 248,173 individuals being apprehended for drink driving offences. A total of 342,801 offences were recorded during this period, representing an intercept rate of 0.96. Of these offences, 276,711 (80.72%) were recorded against males and 66,024 (19.28%) against females. The most common drink driving offence was a BAC between 0.05 and 0.08. The largest proportion of offences was detected on weekends, with Saturday (27.60%) the most common drink driving night, followed by Sunday (21.41%). The prevalence of drink driving detections rose steadily over time, peaking in 2008 and 2009 before declining slightly. This decline was observed across all Queensland regions, and any increase in annual figures was due to new offence types being introduced. Discussion: This paper further outlines the major findings of the study with regard to tailoring RBT operations to increase detection rates and improve the general deterrent effect of the initiative.
Abstract:
One of the main challenges facing online and offline path planners is uncertainty in the magnitude and direction of environmental energy, which is dynamic, changes with time, and is hard to forecast. This thesis develops an artificial intelligence approach that enables a mobile robot to learn from historical or forecasted data on the environmental energy available in the area of interest, supporting persistent monitoring under uncertainty using the developed algorithm.
Abstract:
We show that the parallax motion resulting from non-nodal rotation in panorama capture can be exploited for light field construction from commodity hardware. Automated panoramic image capture typically seeks to rotate a camera exactly about its nodal point, for which no parallax motion is observed. This can be difficult or impossible to achieve due to limitations of the mounting or optical systems, and consequently a wide range of captured panoramas suffer from parallax between images. We show that by capturing such imagery over a regular grid of camera poses, then appropriately transforming the captured imagery to a common parameterisation, a light field can be constructed. The resulting four-dimensional image encodes scene geometry as well as texture, allowing an increasingly rich range of light field processing techniques to be applied. Employing an Ocular Robotics REV25 camera pointing system, we demonstrate light field capture, refocusing and low-light image enhancement.
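The refocusing that a constructed light field enables can be sketched with the classic shift-and-add algorithm; integer shifts and a synthetic constant scene are simplifying assumptions here, not the paper's pipeline:

```python
import numpy as np

def refocus(lf, slope):
    """Shift-and-add refocus of a light field lf[u, v, s, t].

    Each sub-aperture view is shifted in proportion to its offset from the
    central view, then all views are averaged; `slope` selects the focal
    depth. Integer shifts only, for brevity.
    """
    U, V, S, T = lf.shape
    out = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            du = int(round(slope * (u - U // 2)))
            dv = int(round(slope * (v - V // 2)))
            out += np.roll(lf[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)

# sanity case: a constant scene refocuses to the same constant at any depth
flat = refocus(np.full((3, 3, 8, 8), 0.5), slope=1.0)
```

Averaging many views in this way is also what drives the low-light enhancement mentioned in the abstract: noise averages down across the sub-aperture images.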
Abstract:
This paper presents an efficient noniterative method for distribution state estimation using the conditional multivariate complex Gaussian distribution (CMCGD). In the proposed method, the mean and standard deviation (SD) of the state variables are obtained in one step, taking into account load uncertainties, measurement errors, and load correlations. First, the bus voltages, branch currents, and injection currents are represented by the MCGD using a direct load flow and a linear transformation. Then, the mean and SD of the bus voltages, or other states, are calculated using the CMCGD and the estimation-of-variance method. The mean and SD of pseudo measurements, as well as spatial correlations between pseudo measurements, are modeled from historical data for different levels of the load duration curve. The proposed method can handle load uncertainties without resorting to time-consuming approaches such as Monte Carlo simulation. Simulation results for two case studies, a six-bus network and a realistic 747-bus distribution network, show the effectiveness of the proposed method in terms of speed, accuracy, and quality compared with the conventional approach.
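The Gaussian conditioning step at the heart of such an estimator can be sketched with a real-valued Gaussian for clarity (the paper works with complex-valued states; the numbers below are hypothetical):

```python
import numpy as np

def condition_gaussian(mu, Sigma, obs_idx, z):
    """Mean/covariance of the remaining states given observed entries z.

    Standard Gaussian conditioning:
      mu_c  = mu_1 + S_12 S_22^-1 (z - mu_2)
      Sig_c = S_11 - S_12 S_22^-1 S_21
    """
    obs = np.asarray(obs_idx)
    hid = np.setdiff1d(np.arange(len(mu)), obs)
    S12 = Sigma[np.ix_(hid, obs)]
    K = S12 @ np.linalg.inv(Sigma[np.ix_(obs, obs)])
    mu_c = mu[hid] + K @ (z - mu[obs])
    Sig_c = Sigma[np.ix_(hid, hid)] - K @ S12.T
    return mu_c, Sig_c

# two correlated "state variables"; observe the second one at 1.0
mu = np.zeros(2)
Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
mu_c, Sig_c = condition_gaussian(mu, Sigma, [1], np.array([1.0]))
```

Because the conditioning is a single linear-algebra step, no iteration (and no Monte Carlo sampling) is required, which is the source of the method's speed.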
Abstract:
This paper presents an uncertainty quantification study of the performance analysis of the high pressure ratio single-stage radial-inflow turbine used in the Sundstrand Power Systems T-100 Multi-purpose Small Power Unit. A deterministic 3D volume-averaged Computational Fluid Dynamics (CFD) solver is coupled with a non-statistical generalized Polynomial Chaos (gPC) representation based on a pseudo-spectral projection method. One advantage of this approach is that it does not require any modification of the CFD code for the propagation of random disturbances in the aerodynamic and geometric fields. The stochastic results highlight the importance of the blade thickness and trailing edge tip radius for the total-to-static efficiency of the turbine, compared with the angular velocity and trailing edge tip length. From a theoretical point of view, the use of the gPC representation on an arbitrary grid also allows investigation of the sensitivity of the turbine efficiency to the blade thickness profiles. The gPC approach is also applied to coupled random parameters. The results show that the most influential coupled random variables are the trailing edge tip radius coupled with the angular velocity.
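The pseudo-spectral gPC projection can be sketched in one dimension: the deterministic solver is treated as a black box evaluated at quadrature nodes, which is why no modification of the CFD code is needed. The toy response below is a stand-in for the CFD model:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_coefficients(f, order, n_quad=20):
    """Pseudo-spectral gPC projection onto probabilists' Hermite polynomials.

    c_k = E[f(xi) He_k(xi)] / k!  for xi ~ N(0, 1), computed by
    Gauss-HermiteE quadrature; f is only ever called at the nodes.
    """
    x, w = hermegauss(n_quad)
    w = w / np.sqrt(2.0 * np.pi)          # normalise to the standard normal measure
    fx = f(x)
    coeffs = []
    for k in range(order + 1):
        basis = np.zeros(k + 1)
        basis[k] = 1.0
        Hk = hermeval(x, basis)           # He_k at the quadrature nodes
        coeffs.append(np.sum(w * fx * Hk) / math.factorial(k))
    return np.array(coeffs)

# toy "solver": an efficiency-like response f(xi) = xi^2 to a normal input
c = pce_coefficients(lambda xi: xi ** 2, order=4)
mean = c[0]
variance = sum(c[k] ** 2 * math.factorial(k) for k in range(1, len(c)))
```

The mean and variance of the response then fall directly out of the coefficients, which is how the sensitivities in the abstract are ranked.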
Abstract:
Skin temperature is an important physiological measure that can reflect the presence of illness and injury as well as provide insight into the localised interactions between the body and the environment. The aim of this systematic review was to analyse the agreement between conductive and infrared means of assessing skin temperature, which are commonly employed in clinical, occupational, sports medicine, public health and research settings. Full-text eligibility was determined independently by two reviewers. Studies meeting the following criteria were included in the review: 1) the literature was written in English, 2) participants were human (in vivo), 3) skin surface temperature was assessed at the same site, 4) at least two commercially available devices were employed, one conductive and one infrared, and 5) skin temperature data were reported in the study. A computerised search of four electronic databases, using a combination of 21 keywords, and citation tracking was performed in January 2015. A total of 8,602 records were returned. Methodological quality was assessed by two authors independently, using the Cochrane risk of bias tool. A total of 16 articles (n = 245) met the inclusion criteria. Devices were classified as in agreement if they met the clinically meaningful recommendations of mean differences within ±0.5 °C and limits of agreement of ±1.0 °C. Twelve of the included studies found mean differences greater than ±0.5 °C between conductive and infrared devices. In the presence of an external stimulus (e.g. exercise and/or heat), five studies found exacerbated measurement differences between conductive and infrared devices. This is the first review to investigate the presence of any systematic bias between infrared and conductive measures by collectively evaluating the current evidence base.
There was also a consistently high risk of bias across the studies, in terms of sample size, random sequence generation, allocation concealment, blinding and incomplete outcome data. This systematic review questions the suitability of using infrared cameras in stable, resting, laboratory conditions. Furthermore, both infrared cameras and thermometers in the presence of sweat and environmental heat demonstrate poor agreement when compared to conductive devices. These findings have implications for clinical, occupational, public health, sports science and research fields.
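The agreement criteria used in the review can be checked with a short Bland-Altman-style computation; the paired skin-temperature readings below are hypothetical:

```python
import numpy as np

def agreement(conductive, infrared):
    """Mean difference (bias) and 95% limits of agreement between two devices."""
    d = np.asarray(infrared, float) - np.asarray(conductive, float)
    bias = d.mean()
    half_loa = 1.96 * d.std(ddof=1)
    # the review's clinical criteria: bias within +/-0.5 C, LoA within +/-1.0 C
    meets_criteria = abs(bias) <= 0.5 and half_loa <= 1.0
    return bias, half_loa, meets_criteria

# hypothetical paired readings at the same site (deg C)
cond = [33.0, 33.5, 34.0, 34.2]
infr = [33.2, 33.6, 34.1, 34.4]
bias, loa, ok = agreement(cond, infr)
```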
Abstract:
South Africa has an electrical transmission grid of over 25 000 km of overhead power lines with voltages of 132 kV to 765 kV. The grid has been largely designed and built by the power utility, Eskom. This book embodies the planning philosophies, design principles and construction practices of Eskom. It is the culmination of decades of thought, study, research and the practical experience of many overhead power line engineers and researchers. The book covers the main aspects of overhead power line design and construction, from electrical first principles, system planning, insulation co-ordination (including live line working), mechanical design through to environmental impact management and power line communications. The content emphasises the need for close interaction between all technical disciplines involved and the importance of optimising designs for economy and performance. Additional challenges in South Africa are the relatively high altitude of the interior plateau (1 000 m to 1 700 m above sea level), severe lightning in some areas and long transmission distances. The book explains how these factors are accommodated in modern designs. Other advanced work covered includes the use and understanding of polymeric insulators, the judicious reduction of phase-to-phase spacings and the adoption of guyed structures.