55 results for Specifications.

in CentAUR: Central Archive University of Reading - UK


Relevance:

10.00%

Publisher:

Abstract:

In most climate simulations used by the Intergovernmental Panel on Climate Change 2007 fourth assessment report, stratospheric processes are only poorly represented. For example, climatological or simple specifications of time-varying ozone concentrations are imposed and the quasi-biennial oscillation (QBO) of equatorial stratospheric zonal wind is absent. Here we investigate the impact of an improved stratospheric representation using two sets of perturbed simulations with the Hadley Centre coupled ocean atmosphere model HadGEM1 with natural and anthropogenic forcings for the 1979–2003 period. In the first set of simulations, the usual zonal mean ozone climatology with superimposed trends is replaced with a time series of observed zonal mean ozone distributions that includes interannual variability associated with the solar cycle, QBO and volcanic eruptions. In addition to this, the second set of perturbed simulations includes a scheme in which the stratospheric zonal wind in the tropics is relaxed to appropriate zonal mean values obtained from the ERA-40 re-analysis, thus forcing a QBO. Both of these changes are applied strictly to the stratosphere only. The improved ozone field results in an improved simulation of the stepwise temperature transitions observed in the lower stratosphere in the aftermath of the two major recent volcanic eruptions. The contribution of the solar cycle signal in the ozone field to this improved representation of the stepwise cooling is discussed. The improved ozone field and also the QBO result in an improved simulation of observed trends, both globally and at tropical latitudes. The Eulerian upwelling in the lower stratosphere in the equatorial region is enhanced by the improved ozone field and is affected by the QBO relaxation, yet neither induces a significant change in the upwelling trend.
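Forcing a QBO by relaxing tropical stratospheric winds towards reanalysis values is, in essence, Newtonian relaxation (nudging). A minimal sketch of that idea, assuming an illustrative relaxation timescale and time step (none of the names or numbers are taken from HadGEM1):

```python
import numpy as np

def relax_zonal_wind(u, u_target, tau, dt):
    """Nudge model zonal wind u towards a target profile u_target.

    u, u_target : zonal-mean zonal wind on model levels (m/s)
    tau         : relaxation timescale (s)
    dt          : model time step (s)
    """
    # Newtonian relaxation: du/dt = -(u - u_target) / tau
    return u + dt * (u_target - u) / tau

# Illustrative use with arbitrary numbers (not HadGEM1 values)
u = np.array([-5.0, 10.0, 20.0])          # model winds on three levels
u_era40 = np.array([0.0, 15.0, 25.0])     # ERA-40 zonal-mean target
u_next = relax_zonal_wind(u, u_era40, tau=5 * 86400.0, dt=1800.0)
```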

Relevance:

10.00%

Publisher:

Abstract:

In the summer of 1982, the ICLCUA CAFS Special Interest Group defined three subject areas for working party activity. These were: 1) interfaces with compilers and databases, 2) end-user language facilities and display methods, and 3) text-handling and office automation. The CAFS SIG convened one working party to address the first subject with the following terms of reference: 1) review facilities and map requirements onto them, 2) "Database or CAFS" or "Database on CAFS", 3) training needs for users to bridge to new techniques, and 4) repair specifications to cover gaps in software. The working party interpreted the topic broadly as the data processing professional's, rather than the end-user's, view of and relationship with CAFS. This report is the result of the working party's activities. For good reasons, the report's content exceeds the terms of reference in their strictest sense. For example, we examine QUERYMASTER, which ICL deems an end-user tool, from both the DP and end-user perspectives. First, it is the only interface to CAFS in the current SV201; second, the DP department needs to understand the end-user's interface to CAFS; and third, the other subjects have not yet been addressed by other active working parties.

Relevance:

10.00%

Publisher:

Abstract:

In England, drama is embedded in the National Curriculum as part of the programmes of study for the subject of English. This means that all children aged between 5 and 16 in state-funded schools are entitled to be taught some aspects of the subject. While the manifestation of drama in primary schools is diverse, in a great many schools for students aged between 11 and 19, drama and theatre art is taught as a discrete subject in the same way that the visual arts and music are. Students may opt for public examination courses in the subject at ages 16 and 18. In order to satisfy the specifications laid down for such examinations, many schools recognise the need for specialist teachers and indeed specialist teaching rooms and equipment. This chapter outlines how drama is taught in secondary schools in England (there being subtle variations in the education systems of the other countries that make up the United Kingdom) and discusses the theories that underpin drama's place in the curriculum, both as a subject in its own right and as a vehicle for delivering other aspects of the prescribed curriculum. The chapter goes on to review how the way in which drama is taught articulates with the requirements and current initiatives laid down by the government. Given this context, the chapter moves on to explore what specialist subject and pedagogical knowledge secondary school drama teachers need. Furthermore, consideration is given to the tensions that may be seen to exist between the way drama teachers perceive their own identity as subject specialists and the restrictions and demands placed upon them by the education system within which they work. An insight into the backgrounds of those who become drama teachers in England is provided, and the reasons for choosing such a career and the expectations and concerns that underpin their training are identified and analysed.

Relevance:

10.00%

Publisher:

Abstract:

Context: Learning can be regarded as knowledge construction in which prior knowledge and experience serve as the basis on which learners expand their knowledge base. Such a process of knowledge construction has to take place continuously in order to enhance the learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals and expectations. Objectives: The current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. The user's requirements can be represented as a case in a defined structure which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled; however, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing an individual user's requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulation, mapping, configuration, and learning content. The results were used as feedback for improvement. Result: The research has produced an ontology model with a set of techniques which support profiling of user requirements, reasoning over requirement patterns, generating workflows from norms, and formulating information provision specifications. Conclusion: Current requirements engineering (RE) approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance RE approaches for modelling individual users' needs and discovering their requirements.
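As a loose illustration only (all class and field names are hypothetical, not the paper's ontology), representing a user's requirements as a structured case that can be matched against information provision specifications might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class RequirementCase:
    # Hypothetical fields describing one learner's requirements
    purpose: str
    goals: set = field(default_factory=set)
    prior_knowledge: set = field(default_factory=set)

@dataclass
class ProvisionSpec:
    topic: str
    prerequisites: set = field(default_factory=set)

def matches(case: RequirementCase, spec: ProvisionSpec) -> bool:
    # A specification suits a case if its topic serves one of the learner's
    # goals and the learner already holds the prerequisites.
    return spec.topic in case.goals and spec.prerequisites <= case.prior_knowledge
```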

Relevance:

10.00%

Publisher:

Abstract:

A new approach is presented that simultaneously deals with misreporting and Don't Know (DK) responses within a dichotomous-choice contingent valuation framework. Utilising a modification of the standard Bayesian probit framework, a Gibbs sampler with Metropolis-Hastings steps is used to estimate the posterior densities for the parameters of interest. Several model specifications are applied to two contingent valuation datasets: one on wolf management plans, and one on the US Fee Demonstration Program. We find that DKs are more likely to come from people who would be predicted to have positive utility for the bid; therefore, a DK is more likely to be a YES than a NO. We also find evidence of misreporting, primarily in favour of the NO option. The inclusion of DK responses has an unpredictable impact on willingness-to-pay estimates, since it affects the results for the two datasets differently. Copyright (C) 2009 John Wiley & Sons, Ltd.
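For orientation, the standard data-augmentation Gibbs sampler for a plain Bayesian probit (without the paper's misreporting and DK extensions) can be sketched as follows; the flat prior and variable names are assumptions:

```python
import numpy as np
from scipy.stats import truncnorm

def probit_gibbs(X, y, n_iter=2000, seed=0):
    """Albert-Chib Gibbs sampler for a Bayesian probit with a flat prior.

    X : (n, k) design matrix, y : (n,) array of 0/1 responses.
    Returns an (n_iter, k) array of posterior draws of beta.
    """
    rng = np.random.default_rng(seed)
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = np.zeros(k)
    draws = np.empty((n_iter, k))
    for it in range(n_iter):
        mu = X @ beta
        # Latent utilities: truncated normals whose sign matches the observed choice
        lo = np.where(y == 1, -mu, -np.inf)
        hi = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
        # Conditional posterior of beta given the latent utilities (flat prior)
        beta_hat = XtX_inv @ X.T @ z
        beta = rng.multivariate_normal(beta_hat, XtX_inv)
        draws[it] = beta
    return draws
```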

Relevance:

10.00%

Publisher:

Abstract:

The paper presents the method and findings of a Delphi expert survey to assess the impact of UK government farm animal welfare policy, farm assurance schemes and major food retailer specifications on the welfare of animals on farms. Two case-study livestock production systems are considered: dairy and cage egg production. The method identifies how well the various standards perform in terms of their effects on a number of key farm animal welfare variables, and provides estimates of the impact of the three types of standard on the welfare of animals on farms, taking account of producer compliance. The study highlights that there remains considerable scope for government policy, together with farm assurance schemes, to improve the welfare of farm animals by introducing standards that address key factors affecting animal welfare and by increasing compliance of livestock producers. There is a need for more comprehensive, regular and random surveys of on-farm welfare to monitor compliance with welfare standards (legislation and welfare codes) and the welfare of farm animals over time, and a need to collect farm data on the costs of compliance with standards.

Relevance:

10.00%

Publisher:

Abstract:

This article explores how data envelopment analysis (DEA), along with a smoothed bootstrap method, can be used in applied analysis to obtain more reliable efficiency rankings for farms. The main focus is the smoothed homogeneous bootstrap procedure introduced by Simar and Wilson (1998) to implement statistical inference for the original efficiency point estimates. Two main model specifications, constant and variable returns to scale, are investigated along with various choices regarding data aggregation. The coefficient of separation (CoS), a statistic that indicates the degree of statistical differentiation within the sample, is used to demonstrate the findings. The CoS suggests a substantive dependency of the results on the methodology and assumptions employed. Accordingly, some observations are made on how to conduct DEA in order to get more reliable efficiency rankings, depending on the purpose for which they are to be used. In addition, attention is drawn to the ability of the SLICE MODEL, implemented in GAMS, to enable researchers to overcome the computational burdens of conducting DEA (with bootstrapping).
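As a rough illustration of the underlying point estimates (not of the Simar and Wilson bootstrap itself), an input-oriented, constant-returns-to-scale DEA efficiency score for one farm can be obtained from a linear program; the toy data below are purely illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def dea_crs_input_oriented(X, Y, j0):
    """Input-oriented CRS (CCR) efficiency of unit j0.

    X : (n, m) inputs, Y : (n, s) outputs for n decision-making units.
    Returns theta in (0, 1]; 1 means technically efficient.
    """
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimise theta
    # Inputs:  sum_j lambda_j * x_ij - theta * x_i,j0 <= 0
    A_in = np.hstack([-X[j0].reshape(-1, 1), X.T])
    b_in = np.zeros(m)
    # Outputs: -sum_j lambda_j * y_rj <= -y_r,j0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[j0]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# Toy data: 4 farms, 2 inputs, 1 output (illustrative numbers only)
X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [6.0, 6.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
scores = [dea_crs_input_oriented(X, Y, j) for j in range(len(X))]
```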

Relevance:

10.00%

Publisher:

Abstract:

When formulating least-cost poultry diets, metabolisable energy (ME) concentration should be optimised by an iterative procedure, not entered as a fixed value. This iteration must calculate profit margins by taking into account the way in which feed intake and saleable outputs vary with ME concentration. In the case of broilers, adjusting critical amino acid contents in direct proportion to ME concentration does not result in birds of equal fatness. To avoid an increase in fat deposition at higher energy levels, it is proposed that amino acid specifications should instead be adjusted in proportion to changes in the net energy supplied by the feed. A model is available which will both interpret responses to amino acids in laying trials and give economically optimal estimates of amino acid inputs for practical feed formulation. Flocks coming into lay and flocks nearing the end of the pullet year have bimodal distributions of rates of lay, with the result that calculations of requirement based on mean output will underestimate the optimal amino acid input for the flock. Chick diets containing surplus protein can lead to impaired utilisation of the first-limiting amino acid. This difficulty can be avoided by stating amino acid requirements as a proportion of the protein.
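The iterative idea can be caricatured as a grid search over candidate ME concentrations, recomputing intake, output value and feed cost at each step; the response functions and prices below are placeholders, not the model described here:

```python
def margin_at_me(me, feed_price_per_mj, egg_price, intake_fn, output_fn):
    """Profit margin per bird per day at a given ME concentration (MJ/kg).

    intake_fn(me) -> daily feed intake (kg); output_fn(me) -> saleable output (kg).
    """
    intake = intake_fn(me)                       # intake typically falls as ME rises
    feed_cost = intake * me * feed_price_per_mj  # cost of the energy supplied
    return output_fn(me) * egg_price - feed_cost

# Placeholder response functions and prices, for illustration only
intake_fn = lambda me: 0.12 * (11.5 / me)            # kg/day, inversely related to ME
output_fn = lambda me: 0.055 + 0.001 * (me - 11.5)   # kg saleable output/day

# Search candidate ME levels from 10.50 to 13.50 MJ/kg in 0.05 steps
best_me = max((m / 100 for m in range(1050, 1351, 5)),
              key=lambda me: margin_at_me(me, 0.025, 1.2, intake_fn, output_fn))
```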

Relevance:

10.00%

Publisher:

Abstract:

Using mixed logit models to analyse choice data is common but requires ex ante specification of the functional forms of the preference distributions. We make the case for greater use of bounded functional forms and propose the marginal likelihood, calculated using Bayesian techniques, as a single measure of model performance across non-nested mixed logit specifications. Using this measure leads to very different rankings of model specifications compared with alternative rule-of-thumb measures. The approach is illustrated using data from a choice experiment on GM food types, which provides insights into the recent WTO dispute between the EU and the US, Canada and Argentina, and into whether labelling and trade regimes should be based on the production process or on product composition.
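To make the case for bounded mixing distributions concrete, a simulated mixed logit choice probability with a triangular (bounded) coefficient distribution can be sketched as follows; the utility specification and draw count are illustrative:

```python
import numpy as np

def simulated_choice_probs(V_fixed, x_random, mode=1.0, spread=0.5, n_draws=500, seed=0):
    """Simulated mixed logit probabilities for one respondent.

    V_fixed  : (J,) fixed part of utility for J alternatives
    x_random : (J,) attribute whose coefficient is random
    The random coefficient follows a triangular distribution on
    [mode - spread, mode + spread], i.e. a bounded mixing distribution.
    """
    rng = np.random.default_rng(seed)
    betas = rng.triangular(mode - spread, mode, mode + spread, size=n_draws)
    V = V_fixed + np.outer(betas, x_random)        # (n_draws, J) utilities
    expV = np.exp(V - V.max(axis=1, keepdims=True))
    probs = expV / expV.sum(axis=1, keepdims=True)
    return probs.mean(axis=0)                      # average over the draws

p = simulated_choice_probs(np.array([0.0, 0.2, -0.1]), np.array([1.0, 0.0, 2.0]))
```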

Relevance:

10.00%

Publisher:

Abstract:

The influence matrix is used in ordinary least-squares applications for monitoring statistical multiple-regression analyses. Concepts related to the influence matrix provide diagnostics on the influence of individual data on the analysis - the analysis change that would occur by leaving one observation out, and the effective information content (degrees of freedom for signal) in any sub-set of the analysed data. In this paper, the corresponding concepts have been derived in the context of linear statistical data assimilation in numerical weather prediction. An approximate method to compute the diagonal elements of the influence matrix (the self-sensitivities) has been developed for a large-dimension variational data assimilation system (the four-dimensional variational system of the European Centre for Medium-Range Weather Forecasts). Results show that, in the boreal spring 2003 operational system, 15% of the global influence is due to the assimilated observations in any one analysis, and the complementary 85% is the influence of the prior (background) information, a short-range forecast containing information from earlier assimilated observations. About 25% of the observational information is currently provided by surface-based observing systems, and 75% by satellite systems. Low-influence data points usually occur in data-rich areas, while high-influence data points are in data-sparse areas or in dynamically active regions. Background-error correlations also play an important role: high correlation diminishes the observation influence and amplifies the importance of the surrounding real and pseudo observations (prior information in observation space). Incorrect specifications of background and observation-error covariance matrices can be identified, interpreted and better understood by the use of influence-matrix diagnostics for the variety of observation types and observed variables used in the data assimilation system. Copyright © 2004 Royal Meteorological Society
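In the ordinary least-squares setting referred to above, the influence (hat) matrix and the associated diagnostics can be computed directly; a small numpy sketch with hypothetical data:

```python
import numpy as np

# Toy regression data (illustrative only)
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(20), rng.normal(size=20)])   # design matrix with intercept
y = X @ np.array([2.0, 0.5]) + rng.normal(scale=0.3, size=20)

# Influence (hat) matrix: S = X (X'X)^{-1} X'
S = X @ np.linalg.inv(X.T @ X) @ X.T

self_sensitivities = np.diag(S)   # influence of each observation on its own fit
dof_signal = np.trace(S)          # degrees of freedom for signal (= number of parameters here)
fitted = S @ y

# Change in the fitted value at point i if observation i were left out:
# (y_i - fitted_i) * S_ii / (1 - S_ii)
loo_change = (y - fitted) * self_sensitivities / (1 - self_sensitivities)
```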

Relevance:

10.00%

Publisher:

Abstract:

Growing pot poinsettia and similar crops involves careful crop monitoring and management to ensure that height specifications are met. Graphical tracking represents a target-driven approach to decision support with simple interpretation. HDC (Horticultural Development Council) Poinsettia Tracker implements a graphical track based on the generalised logistic curve, similar to that of other tracking packages. Any set of curve parameters can be used to track crop progress. However, graphical tracks must be expected to be site- and cultivar-specific. By providing a simple curve-fitting function, the tool lets growers easily develop their own site- and variety-specific ideal tracks based on past records, with increasing quality as more seasons' data are added. (C) 2009 Elsevier B.V. All rights reserved.
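A minimal sketch of fitting a generalised logistic height track to past crop records, assuming weekly height measurements; the parameterisation and data are illustrative and not taken from the HDC tool:

```python
import numpy as np
from scipy.optimize import curve_fit

def generalised_logistic(t, lower, upper, growth_rate, t_mid, nu):
    """Richards / generalised logistic curve for crop height against time."""
    return lower + (upper - lower) / (1.0 + np.exp(-growth_rate * (t - t_mid))) ** (1.0 / nu)

# Hypothetical past-season records: day of tracking period vs. height (cm)
days = np.array([0, 7, 14, 21, 28, 35, 42, 49, 56])
height = np.array([10.0, 11.5, 14.0, 18.0, 23.0, 28.0, 31.5, 33.0, 34.0])

params, _ = curve_fit(generalised_logistic, days, height,
                      p0=[10.0, 35.0, 0.15, 28.0, 1.0], maxfev=10000)
target_track = generalised_logistic(np.arange(0, 57), *params)
```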

Relevance:

10.00%

Publisher:

Abstract:

Background and Purpose: Clinical research into the treatment of acute stroke is complicated, costly, and often unsuccessful. Developments in imaging technology based on computed tomography and magnetic resonance imaging scans offer opportunities for screening experimental therapies during phase II testing so as to deliver only the most promising interventions to phase III. We discuss the design and the appropriate sample size for phase II studies in stroke based on lesion volume. Methods: The relation between analyses of lesion volumes and of neurologic outcomes is illustrated using data from placebo-arm patients in the Virtual International Stroke Trials Archive. The size of an effect on lesion volume that would lead to a clinically relevant treatment effect on a measure such as the modified Rankin score (mRS) is found, and the sample size to detect that magnitude of effect on lesion volume is then calculated. Simulation is used to evaluate different criteria for proceeding from phase II to phase III. Results: The odds ratios for mRS correspond roughly to the square root of the odds ratios for lesion volume, implying that, for equivalent power specifications, sample sizes based on lesion volumes should be about one fourth of those based on mRS. Relaxing the power requirements, as is appropriate for phase II, leads to further sample size reductions. For example, a phase III trial comparing a novel treatment with placebo with a total sample size of 1518 patients might be motivated from a phase II trial of 126 patients comparing the same 2 treatment arms. Discussion: Definitive phase III trials in stroke should aim to demonstrate significant effects of treatment on clinical outcomes. However, more direct outcomes such as lesion volume can be useful in phase II for determining whether such phase III trials should be undertaken in the first place. (Stroke. 2009;40:1347-1352.)
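The "one fourth" relation follows from the sample size for detecting a log odds ratio scaling as the inverse of its square; a short sketch of that arithmetic with an illustrative odds ratio:

```python
import numpy as np

# Sample size for detecting a log odds ratio scales as 1 / (log OR)^2.
# If OR_mRS is roughly the square root of OR_lesion, then
# log(OR_lesion) = 2 * log(OR_mRS), so the lesion-volume trial needs ~1/4 the patients.

or_mrs = 1.5                      # illustrative treatment effect on mRS
or_lesion = or_mrs ** 2           # implied effect on lesion volume

ratio = (np.log(or_mrs) / np.log(or_lesion)) ** 2
print(ratio)                      # 0.25: lesion-volume sample size ~ one fourth of mRS-based
```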

Relevance:

10.00%

Publisher:

Abstract:

We discuss the use of pulse shaping for optimal excitation of samples in time-domain THz spectroscopy. Pulse shaping can be performed in a 4f optical system to specifications derived from state-space models of the system's dynamics. Subspace algorithms may be used for the identification of the state-space models.
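As a loose sketch of the state-space formulation mentioned, a discrete-time model x(k+1) = A x(k) + B u(k), y(k) = C x(k) can be simulated to see how a shaped excitation pulse drives the sample response; the matrices below are arbitrary placeholders, not an identified THz system:

```python
import numpy as np

def simulate_state_space(A, B, C, u):
    """Simulate y_k = C x_k for x_{k+1} = A x_k + B u_k, starting from x_0 = 0."""
    x = np.zeros(A.shape[0])
    y = []
    for u_k in u:
        y.append(C @ x)
        x = A @ x + B * u_k
    return np.array(y)

# Placeholder second-order system and a shaped (Gaussian) excitation pulse
A = np.array([[0.9, 0.1], [-0.1, 0.85]])
B = np.array([1.0, 0.0])
C = np.array([0.0, 1.0])
t = np.arange(200)
u = np.exp(-0.5 * ((t - 50) / 8.0) ** 2)      # shaped input pulse
response = simulate_state_space(A, B, C, u)
```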

Relevance:

10.00%

Publisher:

Abstract:

Many photovoltaic inverter designs make use of a buck-based switched-mode power supply (SMPS) to produce a rectified sinusoidal waveform. This waveform is then unfolded by a low-frequency switching structure to produce a fully sinusoidal waveform. The Cuk SMPS could offer advantages over the buck in such applications. Unfortunately, the Cuk converter is considered difficult to control using classical methods, and correct closed-loop design is essential for its stable operation. Due to these stability issues, Cuk-converter-based designs often require stiff, low-bandwidth control loops. In order to achieve this stable closed-loop performance, traditional designs invariably need large, unreliable electrolytic capacitors. In this paper, an inverter with a sliding mode control approach is presented which enables the designer to make use of the Cuk converter's advantages while ameliorating the control difficulties. This control method allows the selection of passive components based predominantly on ripple and reliability specifications while requiring only one state reference signal, allowing much smaller, more reliable non-electrolytic capacitors to be used. A prototype inverter has been constructed and results obtained which demonstrate the design flexibility of the Cuk topology when coupled with sliding mode control.
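A heavily simplified sketch of the sliding-mode idea for a switching converter: the switch command is taken from the sign of a sliding surface built from a single state reference (here the output-voltage error and its derivative); the surface, gain and polarity are illustrative, not the paper's design:

```python
def sliding_mode_switch(v_out, v_ref, dv_out, dv_ref, k=0.001):
    """Return the converter switch command (1 = on, 0 = off).

    Sliding surface: s = (v_ref - v_out) + k * (dv_ref - dv_out).
    Driving s towards zero keeps the output on the reference trajectory.
    """
    s = (v_ref - v_out) + k * (dv_ref - dv_out)
    return 1 if s > 0 else 0

# Example: output below its rectified-sinusoid reference, so the switch turns on
cmd = sliding_mode_switch(v_out=310.0, v_ref=325.0, dv_out=0.0, dv_ref=100.0)
```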