925 results for post-Newtonian approximation to general relativity
Abstract:
Automatic indexing and retrieval of digital data poses major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years, research has been ongoing in the field of ontological engineering with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system (Dynamic REtrieval Analysis and semantic metadata Management (DREAM)) designed to automatically and intelligently index huge repositories of special effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase to support the storage, indexing and retrieval of large data sets of special effects video clips as an exemplar application domain. This paper presents its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes. (C) 2009 Published by Elsevier B.V.
Abstract:
A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization errors when choosing amongst different network architectures (M. Stone, "Cross validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of LOO criteria, namely the mean of squared LOO errors for regression and the LOO misclassification rate for classification, we present two backward elimination algorithms as model post-processing procedures for regression and classification problems. The proposed backward elimination procedures exploit an orthogonalization step to enforce orthogonality between the subspace spanned by the pruned model and the deleted regressor. It is then shown that the LOO criteria used in both algorithms can be calculated via analytic recursive formulae, derived in this contribution, without actually splitting the estimation data set, thereby reducing computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, without an additional stopping criterion; and (iii) model structure selection is based directly on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods for pruning a model to gain extra sparsity and improved generalization.
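The computational saving behind analytic LOO formulae can be illustrated with ordinary least squares, where the classic identity e_i(loo) = e_i / (1 - h_ii), with h_ii the hat-matrix diagonal, recovers all leave-one-out residuals from a single fit. This is a minimal sketch of that general idea with synthetic data of our own, not the orthogonalized recursion derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.1 * rng.normal(size=50)

# Fit once; hat matrix H = X (X^T X)^{-1} X^T.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
H = X @ np.linalg.solve(X.T @ X, X.T)
loo = resid / (1.0 - np.diag(H))          # analytic LOO residuals, one fit

# Brute-force check: refit with each point deleted in turn.
brute = np.empty_like(y)
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    b_i, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    brute[i] = y[i] - X[i] @ b_i
```

The brute-force loop refits the model n times; the analytic route needs a single factorization, which is the kind of saving the paper's recursive formulae deliver for its pruned models.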
Abstract:
This paper describes experiments relating to the perception of the roughness of simulated surfaces via the haptic and visual senses. Subjects used a magnitude estimation technique to judge the roughness of “virtual gratings” presented via a PHANToM haptic interface device, and a standard visual display unit. It was shown that under haptic perception, subjects tended to perceive roughness as decreasing with increased grating period, though this relationship was not always statistically significant. Under visual exploration, the exact relationship between spatial period and perceived roughness was less well defined, though linear regressions provided a reliable approximation to individual subjects’ estimates.
Abstract:
Texture and small-scale surface details are widely recognised as playing an important role in the haptic identification of objects. To simulate realistic textures in haptic virtual environments, it has become increasingly necessary to identify a robust technique for modelling surface profiles. This paper describes a method whereby Fourier series spectral analysis is employed to describe the measured surface profiles of several characteristic surfaces. The results presented suggest that a bandlimited Fourier series can provide a realistic approximation to surface amplitude profiles.
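The bandlimited idea can be sketched in a few lines: keep only the first few harmonics of a profile's Fourier spectrum and reconstruct. The synthetic profile below is our own stand-in for a measured surface, not data from the paper:

```python
import numpy as np

# Synthetic "measured" surface profile: two harmonics plus roughness noise.
n = 512
x = np.linspace(0.0, 1.0, n, endpoint=False)
rng = np.random.default_rng(1)
profile = (0.5 * np.sin(2 * np.pi * 3 * x)
           + 0.2 * np.sin(2 * np.pi * 11 * x)
           + 0.05 * rng.normal(size=n))

def bandlimited(profile, k_max):
    """Reconstruct the profile keeping only harmonics up to k_max."""
    c = np.fft.rfft(profile)
    c[k_max + 1:] = 0.0                    # discard high-frequency terms
    return np.fft.irfft(c, n=len(profile))

err5 = np.linalg.norm(profile - bandlimited(profile, 5))
err20 = np.linalg.norm(profile - bandlimited(profile, 20))
```

Retaining more harmonics can only reduce the residual, so the choice of band limit trades fidelity against compactness of the texture model.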
Abstract:
The background error covariance matrix, B, is often used in variational data assimilation for numerical weather prediction as a static and hence poor approximation to the fully dynamic forecast error covariance matrix, Pf. In this paper the concept of an Ensemble Reduced Rank Kalman Filter (EnRRKF) is outlined. In the EnRRKF the forecast error statistics in a subspace defined by an ensemble of states forecast by the dynamic model are found. These statistics are merged in a formal way with the static statistics, which apply in the remainder of the space. The combined statistics may then be used in a variational data assimilation setting. It is hoped that the nonlinear error growth of small-scale weather systems will be accurately captured by the EnRRKF, to produce accurate analyses and ultimately improved forecasts of extreme events.
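A toy illustration of the merging idea (our own simplification, not the EnRRKF itself): use ensemble-derived statistics inside the subspace spanned by the ensemble anomalies, and retain the static B in the orthogonal complement:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 6, 3                        # state dimension, ensemble size (toy)
B = np.eye(n)                      # static background covariance (toy)
ens = rng.normal(size=(n, m))      # ensemble of forecast states

# Orthonormal basis E for the ensemble-spanned subspace.
anoms = ens - ens.mean(axis=1, keepdims=True)
E, _ = np.linalg.qr(anoms)
E = E[:, :m - 1]                   # anomalies lose one degree of freedom
P_ens = anoms @ anoms.T / (m - 1)  # ensemble forecast covariance

# Merge: ensemble statistics inside the subspace, static B outside it.
Pi = E @ E.T                       # projector onto the ensemble subspace
P_hybrid = Pi @ P_ens @ Pi + (np.eye(n) - Pi) @ B @ (np.eye(n) - Pi)
```

The resulting matrix is symmetric positive semi-definite by construction; the paper's formal merging of the two sets of statistics is more careful than this block-diagonal sketch.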
Abstract:
European economic and political integration have been recognised as having implications for patterns of performance in national real estate and capital markets and have generated a wide body of research and commentary. In 1999, progress towards monetary integration within the European Union culminated in the introduction of a common currency and monetary policy. This paper investigates the effects of this ‘event’ on the behaviour of stock returns in European real estate companies. A range of statistical tests is applied to the performance of European property companies to test for changes in segmentation, co-movement and causality. The results suggest that, relative to the wider equity markets, the dispersion of performance is higher, correlations are lower, a common contemporaneous factor has much lower explanatory power whilst lead-lag relationships are stronger. Consequently, the evidence of transmission of monetary integration to real estate securities is less noticeable than to general securities. Less and slower integration is attributed to the relatively small size of the real estate securities market and the local and national nature of the majority of the companies’ portfolios.
Abstract:
Background: Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the proportion of patients at risk from hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods: In this pragmatic, cluster randomised trial, general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings: 72 general practices with a combined list size of 480 942 patients were randomised. At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0∙58, 95% CI 0∙38–0∙89); a β blocker if they had asthma (0∙73, 0∙58–0∙91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0∙51, 0∙34–0∙78). PINCER has a 95% probability of being cost effective if the decision-maker's ceiling willingness to pay reaches £75 per error avoided at 6 months. Interpretation: The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Funding: Patient Safety Research Portfolio, Department of Health, England.
Abstract:
A periodic structure of finite extent is embedded within an otherwise uniform two-dimensional system consisting of finite-depth fluid covered by a thin elastic plate. An incident harmonic flexural-gravity wave is scattered by the structure. By using an approximation to the corresponding linearised boundary value problem that is based on a slowly varying structure in conjunction with a transfer matrix formulation, a method is developed that generates the whole solution from that for just one cycle of the structure, providing both computational savings and insight into the scattering process. Numerical results show that variations in the plate produce strong resonances about the ‘Bragg frequencies’ for relatively few periods. We find that certain geometrical variations in the plate generate these resonances above the Bragg value, whereas other geometries produce the resonance below the Bragg value. The familiar resonances due to periodic bed undulations tend to be damped by the plate.
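The transfer-matrix step that generates the whole solution from one cycle can be sketched abstractly: if one cycle maps the wave amplitudes through a matrix T, then N identical cycles map them through T^N, so only a single cycle ever needs to be computed. The 2x2 matrix below is a hypothetical, energy-conserving placeholder of our own, not the matrix arising from the flexural-gravity problem:

```python
import numpy as np

def single_cycle_matrix(k, d):
    """Hypothetical unimodular 2x2 transfer matrix for one cycle
    of length d at wavenumber k (illustrative placeholder only)."""
    c, s = np.cos(k * d), np.sin(k * d)
    return np.array([[c, s], [-s, c]])    # det = 1: no energy gain/loss

T = single_cycle_matrix(k=2.0, d=0.7)
T_total = np.linalg.matrix_power(T, 10)   # ten periods from one cycle
```

Diagonalising T instead of forming the matrix power also exposes the Bloch factors whose near-unit-magnitude eigenvalues signal the Bragg resonances discussed above.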
Abstract:
The problem of water wave scattering by a circular ice floe, floating in fluid of finite depth, is formulated and solved numerically. Unlike previous investigations of such situations, here we allow the thickness of the floe (and the fluid depth) to vary axisymmetrically and also incorporate a realistic non-zero draught. A numerical approximation to the solution of this problem is obtained to an arbitrary degree of accuracy by combining a Rayleigh–Ritz approximation of the vertical motion with an appropriate variational principle. This numerical solution procedure builds upon the work of Bennetts et al. (2007, J. Fluid Mech., 579, 413–443). As part of the numerical formulation, we utilize a Fourier cosine expansion of the azimuthal motion, resulting in a system of ordinary differential equations to solve in the radial coordinate for each azimuthal mode. The displayed results concentrate on the response of the floe rather than the scattered wave field and show that the effects of introducing the new features of varying floe thickness and a realistic draught are significant.
Abstract:
We reconsider the theory of the linear response of non-equilibrium steady states to perturbations. We first show that, by using a general functional decomposition for space-time dependent forcings, we can define elementary susceptibilities from which the response of the system to general perturbations can be constructed. Starting from the definition of the SRB measure, we then study the consequences of adopting different sampling schemes for analysing the response of the system. We show that only a specific choice of the time horizon for evaluating the response of the system to a general time-dependent perturbation yields the formula first presented by Ruelle. We also discuss the special case of periodic perturbations, showing that in this case the sampling can be fine-tuned to make the definition of the correct time horizon immaterial. Finally, we discuss the implications of our results for strategies for analysing the outputs of numerical experiments, providing a critical review of a formula proposed by Reick.
Abstract:
A three-point difference scheme recently proposed in Ref. 1 for the numerical solution of a class of linear, singularly perturbed, two-point boundary-value problems is investigated. The scheme is derived from a first-order approximation to the original problem with a small deviating argument. It is shown here that, in the limit, as the deviating argument tends to zero, the difference scheme converges to a one-sided approximation to the original singularly perturbed equation in conservation form. The limiting scheme is shown to be stable on any uniform grid. Therefore, no advantage arises from using the deviating argument, and the most accurate and efficient results are obtained with the deviation at its zero limit.
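The flavour of the limiting one-sided scheme can be seen on a model problem. The sketch below applies a standard upwind discretization to eps*u'' + u' = 0 with a boundary layer at x = 0 (our own illustration; the scheme of Ref. 1 and its deviating-argument derivation are not reproduced here) and exhibits the monotone, oscillation-free behaviour that stability on a uniform grid implies:

```python
import numpy as np

# Model problem: eps*u'' + u' = 0, u(0) = 0, u(1) = 1, eps << h.
eps, n = 1e-3, 100
h = 1.0 / n
A = np.zeros((n - 1, n - 1))
rhs = np.zeros(n - 1)
for i in range(n - 1):                    # interior nodes x_1 .. x_{n-1}
    A[i, i] = -2 * eps / h**2 - 1 / h     # diagonal: diffusion + upwinded u'
    if i > 0:
        A[i, i - 1] = eps / h**2
    if i < n - 2:
        A[i, i + 1] = eps / h**2 + 1 / h
rhs[-1] = -(eps / h**2 + 1 / h)           # boundary value u(1) = 1
u = np.linalg.solve(A, rhs)

x = np.linspace(h, 1 - h, n - 1)
exact = (1 - np.exp(-x / eps)) / (1 - np.exp(-1 / eps))
```

The discrete solution increases monotonically through the layer even though h is far larger than eps; a centred difference on the same grid would oscillate.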
Abstract:
This study explores how children learn the meaning (semantics) and spelling patterns (orthography) of novel words encountered in story context. English-speaking children (N = 88) aged 7 to 8 years read 8 stories and each story contained 1 novel word repeated 4 times. Semantic cues were provided by the story context such that children could infer the meaning of the word (specific context) or the category that the word belonged to (general context). Following story reading, posttests indicated that children showed reliable semantic and orthographic learning. Decoding was the strongest predictor of orthographic learning, indicating that self-teaching via phonological recoding was important for this aspect of word learning. In contrast, oral vocabulary emerged as the strongest predictor of semantic learning.
Abstract:
Non-Gaussian/non-linear data assimilation is becoming an increasingly important area of research in the Geosciences as the resolution and non-linearity of models are increased and more and more non-linear observation operators are being used. In this study, we look at the effect of relaxing the assumption of a Gaussian prior on the impact of observations within the data assimilation system. Three different measures of observation impact are studied: the sensitivity of the posterior mean to the observations, mutual information and relative entropy. The sensitivity of the posterior mean is derived analytically when the prior is modelled by a simplified Gaussian mixture and the observation errors are Gaussian. It is found that the sensitivity is a strong function of the value of the observation and proportional to the posterior variance. Similarly, relative entropy is found to be a strong function of the value of the observation. However, the errors in estimating these two measures using a Gaussian approximation to the prior can differ significantly. This hampers conclusions about the effect of the non-Gaussian prior on observation impact. Mutual information does not depend on the value of the observation and is seen to be close to its Gaussian approximation. These findings are illustrated with the particle filter applied to the Lorenz '63 system. This article is concluded with a discussion of the appropriateness of these measures of observation impact for different situations.
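In the scalar Gaussian case the two information measures have closed forms, which makes the contrast above concrete: relative entropy of the analysis with respect to the prior depends on the observed value, while mutual information does not. A minimal sketch with toy numbers of our own (not the paper's Gaussian-mixture setting):

```python
import numpy as np

# Scalar Gaussian toy: prior N(0, sb2); observation y = x + noise N(0, so2).
sb2, so2 = 2.0, 1.0

def analysis(y):
    """Posterior mean and variance after assimilating observation y."""
    sa2 = 1.0 / (1.0 / sb2 + 1.0 / so2)   # posterior variance (y-independent)
    mu = sa2 * y / so2                    # posterior mean (y-dependent)
    return mu, sa2

def relative_entropy(y):
    """KL( N(mu, sa2) || N(0, sb2) ): depends on the observed value y."""
    mu, sa2 = analysis(y)
    return 0.5 * (sa2 / sb2 + mu**2 / sb2 - 1.0 - np.log(sa2 / sb2))

# Mutual information for jointly Gaussian (x, y): independent of y.
mutual_info = 0.5 * np.log(sb2 / analysis(0.0)[1])
```

A larger innovation inflates the relative-entropy impact measure while leaving mutual information untouched, mirroring the dependence described in the abstract.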
Abstract:
Several methods are examined that produce forecasts for time series in the form of probability assignments. The necessary concepts are presented, addressing questions such as how to assess the performance of a probabilistic forecast. A particular class of models, cluster weighted models (CWMs), is given particular attention. CWMs, originally proposed for deterministic forecasts, can be employed for probabilistic forecasting with little modification. Two examples are presented. The first involves estimating the state of (numerically simulated) dynamical systems from noise-corrupted measurements, a problem also known as filtering. There is an optimal solution to this problem, the optimal filter, against which the considered time series models are compared. (The optimal filter requires the dynamical equations to be known.) In the second example, we aim at forecasting the chaotic oscillations of an experimental bronze spring system. Both examples demonstrate that the considered time series models, and especially the CWMs, provide useful probabilistic information about the underlying dynamical relations. In particular, they provide more than just an approximation to the conditional mean.
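One standard way to assess the performance of a probabilistic forecast is the mean negative log predictive density (the ignorance score), where lower is better. The sketch below uses our own synthetic data rather than the paper's simulations or bronze-spring series, and shows that an over-dispersed forecast scores worse than a calibrated one:

```python
import numpy as np

rng = np.random.default_rng(3)
truth = rng.normal(loc=0.0, scale=1.0, size=1000)   # verifying observations

def ignorance(obs, mu, sigma):
    """Mean negative log predictive density of a Gaussian forecast N(mu, sigma^2)."""
    z = (obs - mu) / sigma
    return np.mean(0.5 * z**2 + np.log(sigma) + 0.5 * np.log(2 * np.pi))

sharp = ignorance(truth, 0.0, 1.0)    # well-calibrated forecast
vague = ignorance(truth, 0.0, 5.0)    # overly wide (under-confident) forecast
```

Scores of this kind reward forecasts that assign high probability to what actually happens, which is exactly the information a conditional-mean forecast cannot convey.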
Abstract:
This article examines the ways in which the BNP utilises the elements of British national identity in its discourse and argues that, during Griffin's leadership, the party has made a discursive choice to shift the emphasis from an ethnic to a civic narrative. We put forward two hypotheses: (1) the modernisation of the discourse of extreme right parties in the British context is likely to be related to the adoption of a predominantly civic narrative; and (2) in the context of British party competition the BNP is likely to converge towards UKIP, drawing upon elements of its perceived winning formula, i.e. a predominantly civic rhetoric of national identity. We proceed to test our hypotheses empirically by conducting a twofold comparison: first, we compare the BNP's discourse pre- and post-1999, showing the BNP's progressive adoption of a civic narrative; second, we compare the BNP's post-1999 discourse to that of UKIP in order to illustrate their similarities in terms of civic values.