29 results for Objective assumptions
in CentAUR: Central Archive University of Reading - UK
Abstract:
Data from four recent reanalysis projects [ECMWF, NCEP-NCAR, NCEP-Department of Energy (DOE), NASA] have been diagnosed at the scale of synoptic weather systems using an objective feature tracking method. The tracking statistics indicate that, overall, the reanalyses correspond very well in the Northern Hemisphere (NH) lower troposphere, although differences in the spatial distribution of mean intensities show that the ECMWF reanalysis is systematically stronger in the main storm track regions but weaker around major orographic features. A direct comparison of the track ensembles indicates a number of systems with a broad range of intensities that compare well among the reanalyses. In addition, a number of small-scale weak systems are found that have no correspondence among the reanalyses or that only correspond upon relaxing the matching criteria, indicating possible differences in location and/or temporal coherence. These are distributed throughout the storm tracks, particularly in the regions known for small-scale activity, such as secondary development regions and the Mediterranean. For the Southern Hemisphere (SH), agreement is found to be generally less consistent in the lower troposphere, with significant differences in both track density and mean intensity. The systems that correspond between the various reanalyses are considerably reduced in number, and those that do not match span a broad range of storm intensities. Relaxing the matching criteria indicates that there is a larger degree of uncertainty in both the location of systems and their intensities compared with the NH. At upper-tropospheric levels, significant differences in the level of activity occur between the ECMWF reanalysis and the other reanalyses in both the NH and SH winters. This occurs due to a lack of coherence in the apparent propagation of the systems in ERA15 and appears most acute above 500 hPa. This is probably due to the use of optimal interpolation data assimilation in ERA15. Also shown are results based on using the same techniques to diagnose tropical easterly wave activity. Results indicate that the wave activity is sensitive not only to the resolution and assimilation methods used but also to the model formulation.
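As a rough illustration of the kind of matching criteria mentioned above (the function names and the 500 km / 60% thresholds are assumptions for illustration, not the paper's actual tracking software), two tracks from different reanalyses can be declared the same system when their temporally overlapping points are close on average and the overlap is long enough; relaxing either threshold admits the more weakly corresponding systems discussed in the abstract.

import math

def great_circle_km(p, q):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    c = (math.sin(lat1) * math.sin(lat2) +
         math.cos(lat1) * math.cos(lat2) * math.cos(lon2 - lon1))
    return 6371.0 * math.acos(max(-1.0, min(1.0, c)))

def tracks_match(track_a, track_b, max_sep_km=500.0, min_overlap=0.6):
    """Tracks are dicts mapping time step -> (lat, lon). The pair matches if the
    mean separation over common times is within max_sep_km and the common times
    cover at least min_overlap of the shorter track."""
    common = sorted(set(track_a) & set(track_b))
    if not common:
        return False
    overlap = len(common) / min(len(track_a), len(track_b))
    mean_sep = sum(great_circle_km(track_a[t], track_b[t]) for t in common) / len(common)
    return mean_sep <= max_sep_km and overlap >= min_overlap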
Abstract:
A new objective climatology of polar lows in the Nordic (Norwegian and Barents) seas has been derived from a database of diagnostics of objectively identified cyclones spanning the period January 2000 to April 2004. There are two distinct parts to this study: the development of the objective climatology and a characterization of the dynamical forcing of the polar lows identified. Polar lows are an intense subset of polar mesocyclones. Polar mesocyclones are distinguished from other cyclones in the database as those that occur in cold air outbreaks over the open ocean. The difference between the wet-bulb potential temperature at 700 hPa and the sea surface temperature (SST) is found to be an effective discriminator between the atmospheric conditions associated with polar lows and other cyclones in the Nordic seas. A verification study shows that the objective identification method is reliable in the Nordic seas region. After demonstrating success at identifying polar lows using the above method, the dynamical forcing of the polar lows in the Nordic seas is characterized. Diagnostics of the ratio of mid-level vertical motion attributable to quasi-geostrophic forcing from upper and lower levels (U/L ratio) are used to determine the prevalence of a recently proposed category of extratropical cyclogenesis, type C, for which latent heat release is crucial to development. Thirty-one percent of the objectively identified polar low events (36 from 115) exceeded the U/L ratio of 4.0, previously identified as a threshold for type C cyclones. There is a contrast between polar lows to the north and south of the Nordic seas. In the southern Norwegian Sea, the population of polar low events is dominated by type C cyclones. These possess strong convection and weak low-level baroclinicity. Over the Barents and northern Norwegian seas, the well-known cyclogenesis types A and B dominate. These possess stronger low-level baroclinicity and weaker convection.
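As a schematic of the two diagnostics described above (only the 4.0 U/L threshold comes from the abstract; the cold-air-outbreak threshold below is a placeholder, not the value derived in the paper), a cyclone might be flagged as a polar-low candidate when the 700 hPa wet-bulb potential temperature minus the SST falls below a chosen threshold, and labelled type C when the U/L ratio exceeds 4.0:

def is_polar_low_candidate(theta_w_700_K, sst_K, threshold_K=-10.0):
    """Cold-air outbreak test: theta_w(700 hPa) - SST below a threshold.
    The threshold value here is purely illustrative."""
    return (theta_w_700_K - sst_K) <= threshold_K

def cyclogenesis_type(u_over_l_ratio, type_c_threshold=4.0):
    """Type C if the upper/lower QG forcing ratio exceeds 4.0 (as in the abstract);
    otherwise grouped here with the classical types A/B."""
    return "type C" if u_over_l_ratio > type_c_threshold else "type A/B"

# Example: theta_w(700) - SST = -12 K and U/L ratio = 5.3
if is_polar_low_candidate(261.0, 273.0):
    print(cyclogenesis_type(5.3))   # -> "type C"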
Abstract:
Using mixed logit models to analyse choice data is common but requires ex ante specification of the functional forms of preference distributions. We make the case for greater use of bounded functional forms and propose the use of the Marginal Likelihood, calculated using Bayesian techniques, as a single measure of model performance across non-nested mixed logit specifications. Using this measure leads to very different rankings of model specifications compared to alternative rule-of-thumb measures. The approach is illustrated using data from a choice experiment on GM food types, which provides insights into the recent WTO dispute between the EU and the US, Canada and Argentina, and into whether labelling and trade regimes should be based on the production process or product composition.
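To make the model-comparison idea concrete, here is a toy sketch (a plain binary logit stands in for the mixed logit, and the normal prior and prior-sampling estimator are illustrative assumptions, not the paper's Bayesian machinery) of estimating a log marginal likelihood by averaging the likelihood over prior draws; the specification with the higher value would be ranked first.

import numpy as np

def log_lik_logit(beta, X, y):
    """Binary logit log-likelihood (a stand-in for the mixed logit in the paper)."""
    eta = X @ beta
    return np.sum(y * eta - np.logaddexp(0.0, eta))

def log_marginal_likelihood(X, y, prior_sd=1.0, n_draws=20000, seed=0):
    """Crude prior-sampling estimate of log p(y) = log E_prior[L(y | beta)]."""
    rng = np.random.default_rng(seed)
    betas = rng.normal(0.0, prior_sd, size=(n_draws, X.shape[1]))
    log_liks = np.array([log_lik_logit(b, X, y) for b in betas])
    m = log_liks.max()                      # log-mean-exp for numerical stability
    return m + np.log(np.mean(np.exp(log_liks - m)))

Higher values indicate specifications that the data support better, which is the ranking criterion proposed above.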
Abstract:
Several studies have highlighted the importance of the cooling period in oil absorption in deep-fat fried products. Specifically, it has been established that the largest proportion of the oil that ends up in the food is sucked into the porous crust region after the fried product is removed from the oil bath, stressing the importance of this time interval. The main objective of this paper was to develop a predictive mechanistic model that can be used to understand the principles behind post-frying cooling oil absorption kinetics and that can also help identify the key parameters that affect the final oil intake by the fried product. The model was developed for two different geometries, an infinite slab and an infinite cylinder, and was divided into two main sub-models, one describing the immersion frying period itself and the other describing the post-frying cooling period. The immersion frying period was described by a transient moving-front model that considered the movement of the crust/core interface, whereas post-frying cooling oil absorption was considered to be a pressure-driven flow mediated by capillary forces. A key element in the model was the hypothesis that oil suction would only begin once a positive pressure driving force had developed. The mechanistic model was based on measurable physical and thermal properties and process parameters, with no need for empirical data fitting, and can be used to study oil absorption in any deep-fat fried product that satisfies the assumptions made.
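A minimal sketch of the pressure-balance hypothesis during cooling (all property values, the constant-volume ideal-gas assumption and the Young-Laplace capillary term are illustrative choices, not the paper's parameters): oil suction can start only once atmospheric plus capillary pressure exceeds the pressure of the cooling gas trapped in the crust pores.

import math

def net_driving_pressure(T_pore_K, T_frying_K, p_atm=101325.0,
                         surface_tension=0.03, contact_angle_deg=30.0,
                         pore_radius=5e-6):
    """Positive return value => oil suction into the crust pores can begin.
    The pore gas is assumed to cool at constant volume (ideal gas), and the
    capillary pressure follows the Young-Laplace relation for a cylindrical pore."""
    p_internal = p_atm * T_pore_K / T_frying_K
    p_capillary = 2.0 * surface_tension * math.cos(math.radians(contact_angle_deg)) / pore_radius
    return (p_atm + p_capillary) - p_internal

# Example: pore gas initially at the frying temperature (180 C), cooled to 120 C
print(net_driving_pressure(T_pore_K=393.15, T_frying_K=453.15))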
Abstract:
Two experiments examine the effects of extraneous speech and nonspeech noise on a visual short-term memory task administered to younger and older adults. Experiment 1 confirms an earlier report that playing task-irrelevant speech is no more distracting for older adults than for younger adults (Rouleau & Belleville, 1996), indicating that "irrelevant sound effects" in short-term memory operate in a different manner to recalling targets in the presence of competing speech (Tun, O'Kane, & Wingfield, 2002). Experiment 2 extends this result to nonspeech noise and demonstrates that the result cannot be ascribed to hearing difficulties amongst the older age group, although the data also show that older adults rated the noise as less annoying and uncomfortable than younger adults did. Implications for theories of the irrelevant sound effect, and for cognitive ageing, are discussed.
Abstract:
A fast Knowledge-based Evolution Strategy, KES, for the multi-objective minimum spanning tree problem is presented. The proposed algorithm is validated, for the bi-objective case, against an exhaustive search for small problems (4-10 nodes), and compared with a deterministic algorithm, EPDA, and with NSGA-II for larger problems (up to 100 nodes) using benchmark hard instances. Experimental results show that KES finds the true Pareto fronts for small instances of the problem and calculates good approximations of the Pareto sets for the larger instances tested. It is shown that the fronts calculated by KES are superior to the NSGA-II fronts and almost as good as those established by EPDA. KES is designed to be scalable to multi-objective problems and fast due to its low computational complexity.
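For concreteness, the sketch below (illustrative code, not KES itself) shows the bi-objective bookkeeping that any of the compared methods must perform: each spanning tree is scored under two edge-weight vectors and only non-dominated trees are kept as the approximation of the Pareto front.

def tree_cost(tree_edges, weights):
    """Sum one objective's weights over the edges of a spanning tree.
    weights maps an edge (u, v) to that objective's cost for the edge."""
    return sum(weights[e] for e in tree_edges)

def pareto_front(trees, weights1, weights2):
    """Keep only spanning trees that are non-dominated under the two objectives."""
    scored = [(tree_cost(t, weights1), tree_cost(t, weights2), t) for t in trees]
    front = []
    for c1, c2, t in scored:
        dominated = any((d1 <= c1 and d2 <= c2) and (d1 < c1 or d2 < c2)
                        for d1, d2, _ in scored)
        if not dominated:
            front.append((c1, c2, t))
    return front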
Abstract:
Whilst radial basis function (RBF) equalizers have been employed to combat the linear and nonlinear distortions in modern communication systems, most of them do not take into account the equalizer's generalization capability. In this paper, it is first proposed that the model's generalization capability can be improved by treating the modelling problem as a multi-objective optimization (MOO) problem, with each objective based on one of several training sets. Then, as a modelling application, a new RBF equalizer learning scheme is introduced based on directional evolutionary MOO (EMOO). Directional EMOO improves the computational efficiency of conventional EMOO, which has been widely applied in solving MOO problems, by explicitly making use of directional information. Computer simulation demonstrates that the new scheme can be used to derive RBF equalizers with good performance not only in explaining the training samples but also in predicting unseen samples.
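A minimal sketch of the objects involved (the names, the Gaussian basis and the bit-error-rate objectives are assumptions for illustration, not the paper's exact formulation): an RBF equalizer maps a window of received samples through basis functions centred on channel states, and the MOO formulation scores one error rate per training set.

import numpy as np

def rbf_equalizer_output(received_window, centres, weights, width=1.0):
    """Gaussian RBF equalizer: received_window is the vector of recent channel
    outputs, centres are the learned channel-state vectors."""
    d2 = np.sum((centres - received_window) ** 2, axis=1)
    phi = np.exp(-d2 / (2.0 * width ** 2))
    return np.sign(weights @ phi)   # hard decision on a binary symbol

def multiobjective_fitness(params, training_sets):
    """One objective (error rate) per training set, as in the MOO formulation."""
    centres, weights, width = params
    fitnesses = []
    for inputs, symbols in training_sets:
        decisions = [rbf_equalizer_output(x, centres, weights, width) for x in inputs]
        fitnesses.append(np.mean(np.array(decisions) != np.array(symbols)))
    return fitnesses   # a vector of error rates, one per training set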
Abstract:
In this paper, a new equalizer learning scheme is introduced based on directional evolutionary multi-objective optimization (EMOO). Whilst nonlinear channel equalizers such as radial basis function (RBF) equalizers have been widely studied to combat the linear and nonlinear distortions in modern communication systems, most of them do not take into account the equalizers' generalization capabilities. In this paper, equalizers are designed with the aim of improving their generalization capabilities. It is proposed that this objective can be achieved by treating the equalizer design problem as a multi-objective optimization (MOO) problem, with each objective based on one of several training sets, and then deriving equalizers that recover the signals well for all the training sets. Conventional EMOO, which is widely applied to MOO problems, suffers from disadvantages such as slow convergence. Directional EMOO improves the computational efficiency of conventional EMOO by explicitly making use of directional information. The new equalizer learning scheme based on directional EMOO is applied to RBF equalizer design. Computer simulation demonstrates that the new scheme can be used to derive RBF equalizers with good generalization capabilities, i.e., good performance in predicting unseen samples.
Abstract:
Book review of 'Subjective, intersubjective, objective' by Donald Davidson.
Abstract:
Background: The aim of this study was to evaluate stimulant medication response following a single dose of methylphenidate (MPH) in children and young people with hyperkinetic disorder, using infrared motion analysis combined with a continuous performance task (QbTest system) as objective measures. The hypothesis was put forward that a moderate test dose of stimulant medication could determine a robust treatment response, partial response and non-response in relation to activity, attention and impulse control measures. Methods: The study included 44 children and young people aged 7-18 years with a diagnosis of hyperkinetic disorder (F90 & F90.1). A single-dose protocol incorporated the time course effects of both immediate-release MPH and extended-release MPH (Concerta XL, Equasym XL) to determine comparable peak efficacy periods post intake. Results: A robust treatment response, with objective measures reverting to the population mean, was found in 37 participants (84%). Three participants (7%) demonstrated a partial response to MPH and four participants (9%) were determined to be non-responders due to deteriorating activity measures together with no improvement in attention and impulse control measures. Conclusion: Objective measures provide, early in the prescribing process, the opportunity to measure treatment response and monitor adverse reactions to stimulant medication. Most treatment responders demonstrated an effective response to MPH at a moderate test dose, facilitating a swift and more optimal titration process.
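Purely as an illustration of the three outcome categories above (the z-score representation and the one-standard-deviation band are hypothetical conventions, not the study's clinical criteria), a response classification along these lines could look like:

def classify_response(pre, post, normal_band=1.0):
    """pre/post are dicts of z-scores relative to the population mean for
    'activity', 'attention' and 'impulsivity'; higher means more impaired.
    Thresholds are illustrative only."""
    reverted = all(abs(post[m]) <= normal_band
                   for m in ("activity", "attention", "impulsivity"))
    worse_activity = post["activity"] > pre["activity"]
    no_improvement = (post["attention"] >= pre["attention"] and
                      post["impulsivity"] >= pre["impulsivity"])
    if reverted:
        return "robust response"
    if worse_activity and no_improvement:
        return "non-response"
    return "partial response"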
Abstract:
Using the integral manifold approach, a composite control (the sum of a fast control and a slow control) is derived for a particular class of non-linear singularly perturbed systems. The fast control is designed completely at the outset, thus ensuring the stability of the fast transients of the system and, furthermore, the existence of the integral manifold. A new method is then presented which simplifies the derivation of a slow control such that the singularly perturbed system meets a preselected design objective to within some specified order of accuracy. Though this approach is, by its very nature, ad hoc, the underlying procedure is easily extended to more general classes of singularly perturbed systems, as illustrated by way of three examples.
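For orientation, a standard statement of the setting (notation assumed here, not taken verbatim from the paper) is the singularly perturbed system with small parameter \(\varepsilon\) and composite control
\[
\dot{x} = f(x, z, u, \varepsilon), \qquad
\varepsilon\,\dot{z} = g(x, z, u, \varepsilon), \qquad
u = u_s(x) + u_f(x, z),
\]
where the fast control \(u_f\) stabilizes the fast transients so that an integral manifold \(z = h(x, \varepsilon)\) exists and is attractive, and the slow control \(u_s\) is then designed on the reduced model \(\dot{x} = f\bigl(x, h(x, \varepsilon), u_s, \varepsilon\bigr)\) so that the design objective is met to \(O(\varepsilon^{k})\) accuracy.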