220 results for Markov random fields (MRFs)


Relevance:

20.00%

Publisher:

Abstract:

Much of the existing empirical research on journalism focuses largely on hard-news journalism, at the expense of its less traditional forms, particularly the soft-news areas of lifestyle and entertainment journalism. In focussing on one particular area of lifestyle journalism – the reporting of travel stories – this paper argues for renewed scholarly efforts in this increasingly important field. Travel journalism’s location at the intersection of information and entertainment, and of journalism and advertising, as well as its increasingly significant role in the representation of foreign cultures, makes it an important site for scholarly research. By reviewing existing research about travel journalism and examining in detail the special exigencies that constrain it, the article proposes a number of dimensions for future research into the production practices of travel journalism. These dimensions include travel journalism’s role in mediating foreign cultures, its market orientation, its motivational aspects and its ethical standards.

Relevance:

20.00%

Publisher:

Abstract:

Research suggests that the length and quality of police–citizen encounters affect policing outcomes. The Koper Curve, for example, shows that the optimal length of police presence in hot spots is between 14 and 15 minutes, with diminishing returns observed thereafter. Our study, using data from the Queensland Community Engagement Trial (QCET), examines the impact of encounter length on citizen perceptions of police performance. QCET was a randomised field trial in which 60 random breath test (RBT) traffic stop operations were randomly allocated to an experimental condition involving a procedurally just encounter or a business-as-usual control condition. Our results show that the optimal length for procedurally just encounters during RBT traffic stops is just under 2 minutes. We show, therefore, that it is important to encourage and facilitate positive police–citizen encounters during RBT traffic stops, while ensuring that the length of these interactions does not pass the point of diminishing returns.

Relevance:

20.00%

Publisher:

Abstract:

We construct two efficient Identity-Based Encryption (IBE) systems that admit selective-identity security reductions without random oracles in groups equipped with a bilinear map. Selective-identity secure IBE is a slightly weaker security model than the standard security model for IBE. In this model the adversary must commit ahead of time to the identity that it intends to attack, whereas in an adaptive-identity attack the adversary is allowed to choose this identity adaptively. Our first system—BB1—is based on the well studied decisional bilinear Diffie–Hellman assumption, and extends naturally to systems with hierarchical identities, or HIBE. Our second system—BB2—is based on a stronger assumption which we call the Bilinear Diffie–Hellman Inversion assumption and provides another approach to building IBE systems. Our first system, BB1, is very versatile and well suited for practical applications: the basic hierarchical construction can be efficiently secured against chosen-ciphertext attacks, and further extended to support efficient non-interactive threshold decryption, among others, all without using random oracles. Both systems, BB1 and BB2, can be modified generically to provide “full” IBE security (i.e., against adaptive-identity attacks), either using random oracles, or in the standard model at the expense of a non-polynomial but easy-to-compensate security reduction.
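For reference, the decisional bilinear Diffie–Hellman assumption on which BB1 rests can be stated as follows (a standard formulation; the notation is assumed here rather than taken from the abstract):

```latex
\textbf{Decisional BDH.} Let $\mathbb{G}$, $\mathbb{G}_T$ be cyclic groups of prime
order $p$ with generator $g \in \mathbb{G}$ and an efficiently computable bilinear map
$e : \mathbb{G} \times \mathbb{G} \to \mathbb{G}_T$. For uniformly random
$a, b, c \in \mathbb{Z}_p$ and uniformly random $T \in \mathbb{G}_T$, no efficient
adversary can distinguish the tuples
\[
  \bigl(g,\ g^a,\ g^b,\ g^c,\ e(g,g)^{abc}\bigr)
  \qquad\text{and}\qquad
  \bigl(g,\ g^a,\ g^b,\ g^c,\ T\bigr)
\]
with non-negligible advantage.
\]
```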

Relevance:

20.00%

Publisher:

Abstract:

We describe a short signature scheme that is strongly existentially unforgeable under an adaptive chosen message attack in the standard security model. Our construction works in groups equipped with an efficient bilinear map, or, more generally, an algorithm for the Decision Diffie-Hellman problem. The security of our scheme depends on a new intractability assumption we call Strong Diffie-Hellman (SDH), by analogy to the Strong RSA assumption with which it shares many properties. Signature generation in our system is fast and the resulting signatures are as short as DSA signatures for comparable security. We give a tight reduction proving that our scheme is secure in any group in which the SDH assumption holds, without relying on the random oracle model.
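For reference, the q-SDH assumption can be stated as follows (the single-group formulation; the notation is assumed here rather than taken from the abstract):

```latex
\textbf{$q$-Strong Diffie–Hellman.} Let $\mathbb{G}$ be a cyclic group of prime order
$p$ with generator $g$. Given the $(q+1)$-tuple
\[
  \bigl(g,\ g^{x},\ g^{x^2},\ \dots,\ g^{x^q}\bigr)
\]
for uniformly random $x \in \mathbb{Z}_p^{*}$, no efficient adversary can output a pair
$\bigl(c,\ g^{1/(x+c)}\bigr)$ with $c \in \mathbb{Z}_p \setminus \{-x\}$ with
non-negligible probability.
```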

Relevance:

20.00%

Publisher:

Abstract:

Standard Monte Carlo (sMC) simulation models have been widely used in AEC industry research to address system uncertainties. Although the benefits of probabilistic simulation analyses over deterministic methods are well documented, the sMC simulation technique is quite sensitive to the probability distributions of the input variables. This phenomenon becomes highly pronounced when the region of interest within the joint probability distribution (a function of the input variables) is small. In such cases, the standard Monte Carlo approach is often impractical from a computational standpoint. In this paper, a comparative analysis of standard Monte Carlo simulation and Markov Chain Monte Carlo with subset simulation (MCMC/ss) is presented. The MCMC/ss technique constitutes a more complex simulation method (relative to sMC), wherein a structured sampling algorithm is employed in place of completely randomized sampling. Consequently, gains in computational efficiency can be made. The two simulation methods are compared via theoretical case studies.
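To make the computational point concrete, the following minimal sketch (illustrative only; the limit-state function, distributions and names are invented for this example, not taken from the paper) estimates a small failure probability by plain Monte Carlo. When failures are rare, the sample count needed for a stable estimate grows rapidly, which is the inefficiency that subset simulation targets:

```python
import random

def standard_mc_failure_prob(limit_state, sample_inputs, n=100_000, seed=1):
    """Estimate P(failure) = P(limit_state(x) < 0) by plain Monte Carlo."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample_inputs(rng)) < 0)
    return failures / n

# Illustrative limit state: failure when the sum of two unit normals exceeds 4.5,
# a small failure region where sMC needs many samples per observed failure.
g = lambda x: 4.5 - (x[0] + x[1])
draw = lambda rng: (rng.gauss(0, 1), rng.gauss(0, 1))

p_hat = standard_mc_failure_prob(g, draw, n=200_000)
```

With a failure probability near 10⁻³, on the order of 10⁵–10⁶ plain samples are needed before the estimate stabilises; subset simulation reaches comparable accuracy with far fewer model evaluations by sampling conditionally on intermediate failure thresholds.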

Relevance:

20.00%

Publisher:

Abstract:

A dry, non-hydrostatic sub-cloud model is used to simulate an isolated stationary downburst wind event to study the influence that topographic features have on the near-ground wind structure of these storms. It was generally found that storm maximum wind speeds could be increased by up to 30% because of the presence of a topographic feature at the location of maximum wind speeds. Comparing predicted velocity profile amplification with that of a steady-flow impinging jet, similar results were found despite the simplifications made in the impinging jet model. Comparison of these amplification profiles with those found in simulated boundary layer winds reveals reductions of up to 30% in the downburst cases. Downburst and boundary layer amplification profiles were shown to become more similar as the topographic feature height was reduced relative to the outflow depth.

Relevance:

20.00%

Publisher:

Abstract:

Convective downburst wind storms generate the peak annual gust wind speed for many parts of the non-cyclonic world at return periods of importance for ultimate limit state design. Despite this, there is little clear understanding of how to design appropriately for these wind events, given their significant dissimilarities to the boundary layer winds on which most design is based. To enhance the understanding of wind fields associated with these storms, a three-dimensional numerical model was developed to simulate a multitude of idealised downburst scenarios and to investigate their near-ground wind characteristics. Stationary and translating downdraft wind events in still and sheared environments were simulated, with baseline results showing good agreement with previous numerical work and full-scale observational data. Significant differences are shown in the normalised peak wind speed velocity profiles depending on the environmental wind conditions in the vicinity of the simulated event. When integrated over the height of mid- to high-rise structures, all simulated profiles are shown to produce wind loads smaller than an equivalent 10 m height-matched open-terrain boundary layer profile. This suggests that for these structures the current design approach is conservative from an ultimate loading standpoint. Investigating the influence of topography on the structure of the simulated near-ground downburst wind fields, it is shown that these features amplify wind speeds in a manner similar to that expected for boundary layer winds, but the extent of amplification is reduced. The level of reduction is shown to depend on the depth of the simulated downburst outflow.

Relevance:

20.00%

Publisher:

Abstract:

Robust facial expression recognition (FER) under occluded face conditions is challenging. It requires robust feature extraction algorithms and investigation of how different types of occlusion affect recognition performance. Previous FER studies in this area have been limited: they have focused on recovery strategies for lost local texture information, tested only a few types of occlusion, and predominantly used a matched train–test strategy. This paper proposes a robust approach that employs a Monte Carlo algorithm to extract a set of Gabor-based part-face templates from gallery images and converts these templates into template match distance features. The resulting feature vectors are robust to occlusion because occluded parts are covered by some, but not all, of the random templates. The method is evaluated using facial images with occluded regions around the eyes and the mouth, randomly placed occlusion patches of different sizes, and near-realistic occlusion of the eyes with clear and solid glasses. Both matched and mismatched train–test strategies are adopted to analyze the effects of such occlusion. Overall recognition performance and the performance for each facial expression are investigated. Experimental results on the Cohn-Kanade and JAFFE databases demonstrate the high robustness and fast processing speed of our approach, and provide useful insight into the effects of occlusion on FER. The parameter sensitivity results demonstrate a certain level of robustness of the approach to changes in the orientation and scale of the Gabor filters, the size of the templates, and the occlusion ratios. Performance comparisons with previous approaches show that the proposed method is more robust to occlusion, with smaller reductions in accuracy from occlusion of the eyes or mouth.
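The template-matching idea can be sketched as follows (a simplified illustration, not the authors' implementation: raw-pixel sum-of-squared-differences stands in for the Gabor-based matching, and all names here are invented). Each image becomes a vector of best-match distances to a set of randomly sampled part-face templates:

```python
import numpy as np

def random_templates(gallery_img, n_templates=50, tmpl_size=8, seed=0):
    """Sample random part-face patches (templates) from a gallery image."""
    rng = np.random.default_rng(seed)
    h, w = gallery_img.shape
    tops = rng.integers(0, h - tmpl_size, n_templates)
    lefts = rng.integers(0, w - tmpl_size, n_templates)
    return [gallery_img[t:t + tmpl_size, l:l + tmpl_size]
            for t, l in zip(tops, lefts)]

def template_distance_features(img, templates):
    """Represent an image by its minimum match distance to each template."""
    feats = []
    for tmpl in templates:
        th, tw = tmpl.shape
        # Best (minimum) sum-of-squared-differences over all placements.
        best = min(
            float(np.sum((img[i:i + th, j:j + tw] - tmpl) ** 2))
            for i in range(img.shape[0] - th + 1)
            for j in range(img.shape[1] - tw + 1)
        )
        feats.append(best)
    return np.asarray(feats)
```

Because each feature depends only on its own template's best match, an occlusion patch corrupts only the subset of templates that overlap it — the robustness property the paper exploits.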

Relevance:

20.00%

Publisher:

Abstract:

Recent advances suggest that encoding images through Symmetric Positive Definite (SPD) matrices and then interpreting such matrices as points on Riemannian manifolds can lead to increased classification performance. Taking into account manifold geometry is typically done via (1) embedding the manifolds in tangent spaces, or (2) embedding into Reproducing Kernel Hilbert Spaces (RKHS). While embedding into tangent spaces allows the use of existing Euclidean-based learning algorithms, manifold shape is only approximated which can cause loss of discriminatory information. The RKHS approach retains more of the manifold structure, but may require non-trivial effort to kernelise Euclidean-based learning algorithms. In contrast to the above approaches, in this paper we offer a novel solution that allows SPD matrices to be used with unmodified Euclidean-based learning algorithms, with the true manifold shape well-preserved. Specifically, we propose to project SPD matrices using a set of random projection hyperplanes over RKHS into a random projection space, which leads to representing each matrix as a vector of projection coefficients. Experiments on face recognition, person re-identification and texture classification show that the proposed approach outperforms several recent methods, such as Tensor Sparse Coding, Histogram Plus Epitome, Riemannian Locality Preserving Projection and Relational Divergence Classification.
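The coefficient-vector idea can be illustrated with a simpler stand-in (this sketch uses the log-Euclidean tangent map rather than the paper's RKHS construction, and all names are invented): map each SPD matrix into a Euclidean vector, then take inner products with a set of random hyperplanes, so each matrix becomes a fixed-length vector of projection coefficients usable by unmodified Euclidean learners.

```python
import numpy as np
from scipy.linalg import logm

def spd_to_vec(S):
    """Log-Euclidean map: matrix logarithm, then the upper triangle as a vector."""
    L = logm(S)
    iu = np.triu_indices(S.shape[0])
    return np.real(L[iu])

def random_projection_coeffs(S, hyperplanes):
    """Project the log-mapped SPD matrix onto random hyperplanes, giving a
    fixed-length coefficient vector for a Euclidean-based learner."""
    return hyperplanes @ spd_to_vec(S)

d = 4                                   # e.g. a 4x4 region covariance descriptor
rng = np.random.default_rng(0)
A = rng.standard_normal((d, d))
S = A @ A.T + d * np.eye(d)             # a well-conditioned SPD matrix
dim = d * (d + 1) // 2                  # length of the vectorised upper triangle
H = rng.standard_normal((16, dim))      # 16 random projection hyperplanes
coeffs = random_projection_coeffs(S, H)
```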

Relevance:

20.00%

Publisher:

Abstract:

The occurrence of extreme water level events along low-lying, highly populated and/or developed coastlines can lead to devastating impacts on coastal infrastructure. Therefore, it is very important that the probabilities of extreme water levels are accurately evaluated to inform flood and coastal management and future planning. The aim of this study was to provide estimates of present day extreme total water level exceedance probabilities around the whole coastline of Australia, arising from combinations of mean sea level, astronomical tide and storm surges generated by both extra-tropical and tropical storms, but exclusive of surface gravity waves. The study has been undertaken in two main stages. In the first stage, a high-resolution (~10 km along the coast) hydrodynamic depth averaged model has been configured for the whole coastline of Australia using the Danish Hydraulics Institute’s Mike21 modelling suite of tools. The model has been forced with astronomical tidal levels, derived from the TPXO7.2 global tidal model, and meteorological fields, from the US National Center for Environmental Prediction’s global reanalysis, to generate a 61-year (1949 to 2009) hindcast of water levels. This model output has been validated against measurements from 30 tide gauge sites around Australia with long records. At each of the model grid points located around the coast, time series of annual maxima and the several highest water levels for each year were derived from the multi-decadal water level hindcast and have been fitted to extreme value distributions to estimate exceedance probabilities. Stage 1 provided a reliable estimate of the present day total water level exceedance probabilities around southern Australia, which is mainly impacted by extra-tropical storms.
However, as the meteorological fields used to force the hydrodynamic model only weakly include the effects of tropical cyclones, the resultant water level exceedance probabilities were underestimated around western, northern and north-eastern Australia at higher return periods. Even if the resolution of the meteorological forcing were adequate to represent tropical cyclone-induced surges, multi-decadal periods yield insufficient instances of tropical cyclones to enable the use of traditional extreme value extrapolation techniques. Therefore, in the second stage of the study, a statistical model of tropical cyclone tracks and central pressures was developed using historic observations. This model was then used to generate synthetic events representing 10,000 years of cyclone activity for the Australian region, with characteristics based on the observed tropical cyclones of the last ~40 years. Wind and pressure fields, derived from these synthetic events using analytical profile models, were used to drive the hydrodynamic model to predict the associated storm surge response. A random time period during the tropical cyclone season was chosen, and astronomical tidal forcing for this period was included to account for non-linear interactions between the tidal and surge components. For each model grid point around the coast, annual maximum total levels for these synthetic events were calculated and used to estimate exceedance probabilities. The exceedance probabilities from stages 1 and 2 were then combined to provide a single estimate of present day extreme water level probabilities around the whole coastline of Australia.
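The extreme value fitting step applied to the hindcast annual maxima can be sketched as follows (the annual maxima below are synthetic stand-ins, not hindcast output, and the numbers are invented for illustration):

```python
import numpy as np
from scipy.stats import genextreme

# Illustrative annual-maximum water levels (metres); real values would come
# from the multi-decadal hindcast at one coastal grid point.
rng = np.random.default_rng(42)
annual_maxima = 1.8 + 0.25 * rng.gumbel(size=61)   # 61 years, as in the hindcast

# Fit a generalised extreme value (GEV) distribution to the annual maxima.
shape, loc, scale = genextreme.fit(annual_maxima)

# Water level with a 1% annual exceedance probability (the "100-year" level).
level_100yr = genextreme.ppf(1 - 0.01, shape, loc=loc, scale=scale)
```

More generally, the return level for an annual exceedance probability p is `genextreme.ppf(1 - p, ...)` under the fitted parameters.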

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates compressed sensing using hidden Markov models (HMMs), and hence extends recent single-frame, bounded-error sparse decoding problems into a class of sparse estimation problems containing both temporal evolution and stochastic aspects. This paper presents two optimal estimators for compressed HMMs. The impact of measurement compression on HMM filtering performance is examined experimentally in the context of an important image-based aircraft target tracking application. Surprisingly, tracking of dim small-sized targets (as small as 5–10 pixels, with local detectability/SNR as low as −1.05 dB) was only mildly impacted by compressed sensing down to 15% of the original image size.
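The HMM filtering recursion at the core of such estimators can be sketched as follows (a generic two-state illustration with invented numbers; in the compressed setting, the measurement likelihoods would be computed from the compressed observations):

```python
import numpy as np

def hmm_filter(A, likelihoods, pi):
    """Recursive HMM filter: posterior over hidden states after each observation.
    A[i, j] = P(x_{k+1}=j | x_k=i); likelihoods[k, j] = p(y_k | x_k=j)."""
    post = pi.copy()
    history = []
    for lik in likelihoods:
        pred = post @ A          # predict through the transition model
        post = pred * lik        # correct with the measurement likelihood
        post /= post.sum()       # normalise to a probability vector
        history.append(post.copy())
    return np.array(history)

# Two-state example: "target absent" / "target present".
A = np.array([[0.95, 0.05],
              [0.10, 0.90]])
pi = np.array([0.5, 0.5])
# Likelihoods for three measurements favouring state 1 increasingly.
liks = np.array([[0.6, 0.4],
                 [0.3, 0.7],
                 [0.1, 0.9]])
posteriors = hmm_filter(A, liks, pi)
```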

Relevance:

20.00%

Publisher:

Abstract:

Effective control of the ion current distribution over large-area (up to 10³ cm²) substrates with magnetic fields of a complex structure, by using two additional magnetic coils installed under the substrate exposed to vacuum arc plasmas, is demonstrated. When the magnetic field generated by the additional coils is aligned with the direction of the magnetic field generated by the guiding and focusing coils of the vacuum arc source, a narrow ion density distribution with a maximum current density of 117 A m⁻² is achieved. When one of the additional coils is set to generate a magnetic field of the opposite direction, an ion current distribution that is almost uniform over the 10³ cm² substrate, with a mean value of 45 A m⁻², is achieved. Our findings suggest that the system with the vacuum arc source and two additional magnetic coils can be used for efficient, high-throughput, and highly controllable plasma processing.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a novel data-driven approach to monitoring systems operating under variable operating conditions is described. The method is based on characterizing the degradation process via a set of operation-specific hidden Markov models (HMMs), whose hidden states represent the unobservable degradation states of the monitored system while its observable symbols represent the sensor readings. Using the HMM framework, modeling, identification and monitoring methods are detailed that allow one to identify an HMM of degradation for each operation from mixed-operation data and to perform operation-specific monitoring of the system. Using a large data set provided by a major manufacturer, the new methods are applied to a semiconductor manufacturing process running multiple operations in a production environment.
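Once an operation-specific HMM has been identified, the most likely degradation-state path for a sequence of sensor symbols can be decoded with the standard Viterbi algorithm, sketched below (the three-state model and its numbers are invented for illustration):

```python
import numpy as np

def viterbi(A, B, pi, obs):
    """Most likely hidden-state path given transition matrix A, emission
    matrix B[j, o] = P(symbol o | state j) and initial distribution pi."""
    with np.errstate(divide="ignore"):   # allow log(0) = -inf for forbidden moves
        logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
    n, T = A.shape[0], len(obs)
    logd = logpi + logB[:, obs[0]]
    back = np.zeros((T, n), dtype=int)
    for t in range(1, T):
        scores = logd[:, None] + logA            # scores[i, j]: from i, land in j
        back[t] = np.argmax(scores, axis=0)
        logd = scores[back[t], np.arange(n)] + logB[:, obs[t]]
    path = [int(np.argmax(logd))]
    for t in range(T - 1, 0, -1):                # backtrack the best path
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Three degradation states (healthy -> worn -> failing), two sensor symbols.
A = np.array([[0.90, 0.10, 0.00],
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])
B = np.array([[0.9, 0.1],    # healthy mostly emits symbol 0
              [0.5, 0.5],
              [0.1, 0.9]])   # failing mostly emits symbol 1
pi = np.array([1.0, 0.0, 0.0])
path = viterbi(A, B, pi, obs=[0, 0, 1, 1, 1])
```

The left-to-right transition matrix encodes that degradation does not reverse, so the decoded path is monotone through the degradation states.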

Relevance:

20.00%

Publisher:

Abstract:

Radial and axial distributions of magnetic fields in a low-frequency (∼460 kHz) inductively coupled plasma source with two internal crossed planar rf current sheets are reported. The internal antenna configuration comprises two orthogonal sets of eight alternately reconnected parallel and equidistant copper litz wires in quartz enclosures and generates three magnetic (Hz, Hr, and Hφ) and two electric (Eφ and Er) field components at the fundamental frequency. The measurements were performed in rarefied and dense plasmas generated in the electrostatic (E) and electromagnetic (H) discharge modes using two miniature magnetic probes. It is shown that the radial uniformity and depth of the rf power deposition can be improved compared with conventional inductively coupled plasma sources with external flat spiral (“pancake”) antennas. The relatively deeper rf power deposition in the plasma source results in more uniform profiles of the optical emission intensity, which indicates an improvement of the plasma uniformity over large chamber volumes. The results of the numerical modeling of the radial magnetic field profiles are in reasonable agreement with the experimental data.

Relevance:

20.00%

Publisher:

Abstract:

Radial profiles of magnetic fields in the electrostatic (E) and electromagnetic (H) modes of low-frequency (∼500 kHz) inductively coupled plasmas (ICP) were measured using miniature magnetic probes. A simplified plasma fluid model explaining the generation of the second harmonic of the azimuthal magnetic field in the plasma source is proposed. Because of the apparent similarity in the derivation of the ponderomotive-force-induced nonlinear terms, pronounced generation of a nonlinear static azimuthal magnetic field can also be expected.