74 results for "confidence in policing"
in CentAUR: Central Archive University of Reading - UK
Abstract:
A means of assessing, monitoring and controlling aggregate emissions from multi-instrument Emissions Trading Schemes is proposed. The approach allows contributions from different instruments with different forms of emissions targets to be integrated. Where Emissions Trading Schemes are helping meet specific national targets, the approach allows the entry requirements of new participants to be calculated and set at a level that will achieve these targets. The approach is multi-levelled, and may be extended downwards to support pooling of participants within instruments, or upwards to embed Emissions Trading Schemes within a wider suite of policies and measures with hard and soft targets. Aggregate emissions from each instrument are treated stochastically. Emissions from the scheme as a whole are then the joint probability distribution formed by integrating the emissions from its instruments. Because a Bayesian approach is adopted, qualitative and semi-qualitative data from expert opinion can be used where quantitative data is not currently available, or is incomplete. This approach helps government retain sufficient control over emissions trading scheme targets to allow them to meet their emissions reduction obligations, while minimising the need for retrospectively adjusting existing participants’ conditions of entry. This maintains participant confidence, while providing the necessary policy levers for good governance.
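The stochastic aggregation step described above lends itself to a compact numerical sketch. The Python fragment below is a minimal Monte Carlo illustration, assuming three hypothetical instruments with independent, normally distributed aggregate emissions; the instrument names, units and parameters are invented for illustration, and the paper's Bayesian treatment, which also admits expert opinion, is richer than this:

```python
import random

# Hypothetical per-instrument emission distributions (Mt CO2e/yr).
# Each instrument's aggregate emissions are treated stochastically;
# the (mean, std dev) pairs here are illustrative, not from the paper.
instruments = {
    "cap_and_trade": (50.0, 4.0),
    "baseline_credit": (30.0, 6.0),
    "voluntary_pool": (12.0, 3.0),
}

def scheme_emissions_samples(n=100_000, seed=42):
    """Monte Carlo draw of scheme-wide emissions: the joint distribution
    formed by summing independent draws from each instrument."""
    rng = random.Random(seed)
    return [
        sum(rng.gauss(mu, sigma) for mu, sigma in instruments.values())
        for _ in range(n)
    ]

def prob_meeting_target(samples, target):
    """Probability that aggregate scheme emissions are at or below target."""
    return sum(s <= target for s in samples) / len(samples)

samples = scheme_emissions_samples()
p = prob_meeting_target(samples, target=100.0)
```

Summing independent draws realises the joint distribution of scheme emissions; the probability of staying within a hard national target then falls out directly, and a regulator could set entry requirements for new participants so that this probability remains acceptably high.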
Abstract:
Context: During development, managers, analysts and designers often need to know whether enough requirements analysis work has been done and whether or not it is safe to proceed to the design stage. Objective: This paper describes a new, simple and practical method for assessing our confidence in a set of requirements. Method: We identified four confidence factors and used a goal-oriented framework with a simple ordinal scale to develop a method for assessing confidence. We illustrate the method and show how it has been applied to a real systems development project. Results: We show how assessing confidence in the requirements could have revealed problems in this project earlier and so saved both time and money. Conclusion: Our meta-level assessment of requirements provides a practical and pragmatic method that can prove useful to managers, analysts and designers who need to know when sufficient requirements analysis has been performed.
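An ordinal-scale, multi-factor confidence assessment of this kind is easy to sketch in code. In the Python fragment below the four factor names and the three-level scale are invented for illustration (the paper's actual factors may differ); the design point it demonstrates is that on an ordinal scale it is safer to report the weakest factor than to average:

```python
from enum import IntEnum

class Confidence(IntEnum):
    """Illustrative three-level ordinal scale."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# Hypothetical confidence factors, not necessarily the paper's four.
FACTORS = ("stakeholder_coverage", "requirement_stability",
           "validation_evidence", "domain_understanding")

def assess(ratings: dict) -> Confidence:
    """Overall confidence is the minimum across factors: averaging ordinal
    values would hide a single weak factor that makes design unsafe."""
    missing = set(FACTORS) - set(ratings)
    if missing:
        raise ValueError(f"unrated factors: {sorted(missing)}")
    return min(ratings[f] for f in FACTORS)

overall = assess({
    "stakeholder_coverage": Confidence.HIGH,
    "requirement_stability": Confidence.MEDIUM,
    "validation_evidence": Confidence.LOW,
    "domain_understanding": Confidence.HIGH,
})
```

Here the single LOW rating on validation evidence drags the overall assessment to LOW, flagging that it is not yet safe to proceed to design.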
Abstract:
The system dynamics concept of 'generic structure' is divisible into three sub-types. This paper analyses the validity of these three, using both practical and theoretical perspectives. Firstly, a new set of measures is developed for generating validity ('confidence') amongst a group using generic structures in a practical modelling situation. It is concluded that different confidence criteria are implicitly employed; that there is an argument for trading off model precision and analytical quality for simplicity and ease of use; and that future research is needed to combine these 'process' and 'content' aspects of confidence. From a theoretical stance it is shown that with two of the sub-types a scientific notion of confidence is achievable, whereas the third ('archetypes') involves merely metaphorical thinking. It is concluded that the theoretical status of archetypes requires further development, whilst ensuring that their benefits are retained.
Abstract:
Tropical cyclones have been investigated in a T159 version of the MPI ECHAM5 climate model using a novel technique to diagnose the evolution of the three-dimensional vorticity structure of tropical cyclones, including their full life cycle from weak initial vortex to their possible extra-tropical transition. Results have been compared with reanalyses (ERA40 and JRA25) and observed tropical storms during the period 1978-1999 for the Northern Hemisphere. There is no indication of any trend in the number or intensity of tropical storms during this period in ECHAM5 or in the reanalyses, but there are distinct inter-annual variations. The storms simulated by ECHAM5 are realistic both in space and time, but the model, and even more so the reanalyses, underestimate the intensities of the most intense storms (in terms of their maximum wind speeds). There is an indication of a response to ENSO with a smaller number of Atlantic storms during El Niño, in agreement with previous studies. The global divergence circulation responds to El Niño by setting up a large-scale convergence flow, with its center over the central Pacific and enhanced subsidence over the tropical Atlantic. At the same time there is an increase in the vertical wind shear in the region of the tropical Atlantic where tropical storms normally develop. There is a good correspondence between the model and ERA40, except that the divergence circulation is somewhat stronger in the model. The model underestimates storms in the Atlantic but tends to overestimate them in the Western Pacific and in the North Indian Ocean. It is suggested that the overestimation of storms in the Pacific by the model is related to an overly strong response to the tropical Pacific SST anomalies. The overestimation in the North Indian Ocean is likely to be due to an overprediction in the intensity of monsoon depressions, which are then classified as intense tropical storms.
Nevertheless, overall results are encouraging and will further contribute to increased confidence in simulating intense tropical storms with high-resolution climate models.
Abstract:
Convectively coupled equatorial waves are fundamental components of the interaction between the physics and dynamics of the tropical atmosphere. A new methodology, which isolates individual equatorial wave modes, has been developed and applied to observational data. The methodology assumes that the horizontal structures given by equatorial wave theory can be used to project upper- and lower-tropospheric data onto equatorial wave modes. The dynamical fields are first separated into eastward- and westward-moving components with a specified domain of frequency–zonal wavenumber. Each of the components for each field is then projected onto the different equatorial modes using the y structures of these modes given by the theory. The latitudinal scale y0 of the modes is predetermined by data to fit the equatorial trapping in a suitable latitude belt y = ±Y. The extent to which the different dynamical fields are consistent with one another in their depiction of each equatorial wave structure determines the confidence in the reality of that structure. Comparison of the analyzed modes with the eastward- and westward-moving components in the convection field enables the identification of the dynamical structure and nature of convectively coupled equatorial waves. In a case study, the methodology is applied to two independent data sources, ECMWF Reanalysis and satellite-observed window brightness temperature (Tb) data for the summer of 1992. Various convectively coupled equatorial Kelvin, mixed Rossby–gravity, and Rossby waves have been detected. The results indicate a robust consistency between the two independent data sources. Different vertical structures for different wave modes and a significant Doppler shifting effect of the background zonal winds on wave structures are found and discussed.
It is found that in addition to low-level convergence, anomalous fluxes induced by strong equatorial zonal winds associated with equatorial waves are important for inducing equatorial convection. There is evidence that equatorial convection associated with Rossby waves leads to a change in structure involving a horizontal structure similar to that of a Kelvin wave moving westward with it. The vertical structure may also be radically changed. The analysis method should make a very powerful diagnostic tool for investigating convectively coupled equatorial waves and the interaction of equatorial dynamics and physics in the real atmosphere. The results from application of the analysis method for a reanalysis dataset should provide a benchmark against which model studies can be compared.
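The projection step can be illustrated with a toy example. The meridional structures given by equatorial wave theory are parabolic cylinder functions (Hermite polynomials times a Gaussian); the Python sketch below projects a sampled field onto those structures after nondimensionalizing latitude by the trapping scale y0. The grid, test field and normalization choices here are illustrative, not the paper's implementation:

```python
import math

def psi(n, y):
    """Parabolic cylinder function: Hermite polynomial H_n(y) times a
    Gaussian, built with the recurrence H_{k+1} = 2y H_k - 2k H_{k-1}."""
    if n == 0:
        return math.exp(-0.5 * y * y)
    h0, h1 = 1.0, 2.0 * y
    for k in range(1, n):
        h0, h1 = h1, 2.0 * y * h1 - 2.0 * k * h0
    return h1 * math.exp(-0.5 * y * y)

def project(field, ys, n):
    """Projection coefficient of field(y) onto mode n: a rectangle-rule
    integral divided by the known norm integral sqrt(pi) * 2^n * n!."""
    dy = ys[1] - ys[0]
    num = sum(f * psi(n, y) for f, y in zip(field, ys)) * dy
    norm = math.sqrt(math.pi) * (2 ** n) * math.factorial(n)
    return num / norm

ys = [i * 0.05 - 6.0 for i in range(241)]  # nondimensional latitude y/y0
field = [psi(1, y) for y in ys]            # a field with pure n=1 structure
c1 = project(field, ys, 1)                 # close to 1: mode 1 present
c0 = project(field, ys, 0)                 # close to 0: orthogonal mode
```

Applied independently to winds, geopotential and brightness temperature, agreement among the resulting coefficients is what gives confidence that a detected wave structure is real.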
Abstract:
The NERC UK SOLAS-funded Reactive Halogens in the Marine Boundary Layer (RHaMBLe) programme comprised three field experiments. This manuscript presents an overview of the measurements made within the two simultaneous remote experiments conducted in the tropical North Atlantic in May and June 2007. Measurements were made from two mobile and one ground-based platforms. The heavily instrumented cruise D319 on the RRS Discovery from Lisbon, Portugal to São Vicente, Cape Verde and back to Falmouth, UK was used to characterise the spatial distribution of boundary layer components likely to play a role in reactive halogen chemistry. Measurements onboard the ARSF Dornier aircraft were used to allow the observations to be interpreted in the context of their vertical distribution and to confirm the interpretation of atmospheric structure in the vicinity of the Cape Verde islands. Long-term ground-based measurements at the Cape Verde Atmospheric Observatory (CVAO) on São Vicente were supplemented by long-term measurements of reactive halogen species and characterisation of additional trace gas and aerosol species during the intensive experimental period. This paper presents a summary of the measurements made within the RHaMBLe remote experiments and discusses them in their meteorological and chemical context as determined from these three platforms and from additional meteorological analyses. Air always arrived at the CVAO from the North East with a range of air mass origins (European, Atlantic and North American continental). Trace gases were present at stable and fairly low concentrations with the exception of a slight increase in some anthropogenic components in air of North American origin, though NOx mixing ratios during this period remained below 20 pptv. 
Consistency with these air mass classifications is observed in the time series of soluble gas and aerosol composition measurements, with additional identification of periods of slightly elevated dust concentrations consistent with the trajectories passing over the African continent. The CVAO is shown to be broadly representative of the wider North Atlantic marine boundary layer; measurements of NO, O3 and black carbon from the ship are consistent with a clean Northern Hemisphere marine background. Aerosol composition measurements do not indicate elevated organic material associated with clean marine air. Closer to the African coast, black carbon and NO levels start to increase, indicating greater anthropogenic influence. Lower ozone in this region is possibly associated with the increased levels of measured halocarbons, associated with the nutrient rich waters of the Mauritanian upwelling. Bromide and chloride deficits in coarse mode aerosol at both the CVAO and on D319 and the continuous abundance of inorganic gaseous halogen species at CVAO indicate significant reactive cycling of halogens. Aircraft measurements of O3 and CO show that surface measurements are representative of the entire boundary layer in the vicinity both in diurnal variability and absolute levels. Above the inversion layer similar diurnal behaviour in O3 and CO is observed at lower mixing ratios in the air that had originated from south of Cape Verde, possibly from within the ITCZ. ECMWF calculations on two days indicate very different boundary layer depths and aircraft flights over the ship replicate this, giving confidence in the calculated boundary layer depth.
Abstract:
With both climate change and air quality on political and social agendas from local to global scale, the links between these hitherto separate fields are becoming more apparent. Black carbon, largely from combustion processes, scatters and absorbs incoming solar radiation, contributes to poor air quality and induces respiratory and cardiovascular problems. Uncertainties in the amount, location, size and shape of atmospheric black carbon cause large uncertainty in both climate change estimates and toxicology studies alike. Increased research has led to new effects and areas of uncertainty being uncovered. Here we draw together recent results and explore the increasing opportunities for synergistic research that will lead to improved confidence in the impact of black carbon on climate change, air quality and human health. Topics of mutual interest include better information on spatial distribution, size, mixing state and measuring and monitoring.
Abstract:
The study reviews the literature on global chain governance and food standards to allow for an assessment of Brazilian beef exports to the European Union. The empirical approach employed is based on company case studies. The results suggest that the Brazilian beef chain has little choice but to adapt to market changes as standards evolve. Costs of compliance for meeting international food standards reduce Brazil's comparative advantage. At the same time, changes in the nature of demand have created the need for a more integrated supply chain in order to enhance confidence in Brazil's beef production and processing abroad.
Abstract:
The journey from the concept of a building to the actual built form is mediated by the use of various artefacts, such as drawings, product samples and models. These artefacts are produced for different purposes and for people with different levels of understanding of the design and construction processes. This paper studies design practice as it occurs naturally in a real-world situation by observing the conversations that surround the use of artefacts at the early stages of a building's design. Drawing on ethnographic data, insights are given into how the use of artefacts can reveal a participant's understanding of the scheme. The appropriateness of the method of conversation analysis for revealing the users' understanding of a scheme is explored by observing spoken micro-interactional behaviours. It is shown that the users' understanding of the design was developed in the conversations around the use of artefacts, as well as through the knowledge that is embedded in the artefacts themselves. The users' confidence in the appearance of the building was gained in conversation, rather than through the ability of the artefacts to represent a future reality.
Abstract:
This article describes the development and validation of a diagnostic test of German and its integration in a programme of formative assessment during a one-year initial teacher-training course. The test focuses on linguistic aspects that cause difficulty for trainee teachers of German as a foreign language and assesses implicit and explicit grammatical knowledge as well as students' confidence in this knowledge. Administration of the test to 57 German speakers in four groups (first-year undergraduates, fourth-year undergraduates, postgraduate trainees, and native speakers) provided evidence of its reliability and validity.
Abstract:
During the last 15 years, a series of food scares and crises (BSE, dioxin, foot and mouth disease) have seriously undermined public confidence in food producers and operators and their capacity to produce safe food. As a result, food safety has become a top priority of the European legislative authorities, and systems of national food control have been tightened up, including the establishment of the European Food Safety Authority. In Greece a law creating the Hellenic Food Safety Authority has been approved. The main objectives of this Authority are to promote food safety for consumers and to inform them of any changes or developments in the food and health sector. The paper reviews the general structure of the current food control system in Greece. It describes the structure and the mission of the Hellenic Food Safety Authority and explains the strategy for carrying out inspections and the analysis of the preliminary results of such inspections. Details are also given of the personnel training and the certification and accreditation standards to be met by the Authority by the end of 2004.
Patients' attitudes towards, and information needs in relation to, nurse prescribing in rheumatology
Abstract:
Aims and objectives: To assess the level of confidence that rheumatology patients would have in nurse prescribing, the effects on likely adherence and particular concerns that these patients have. In addition, given that information provision has been cited as a potential benefit of nurse prescribing, the present study assessed the extent to which these patients would want an explanation for the selected medicine, as well as which types of information should be included in such an explanation. Background: Nurse prescribing has been successfully implemented in the UK in several healthcare settings. Existing research has not addressed the effects on patients' confidence and likely adherence, nor have patients' information needs been established. However, we know that inadequate medicines information provision by health professionals is one of the largest causes of patient dissatisfaction. Methods: Fifty-four patients taking disease-modifying drugs for inflammatory joint disease attending a specialist rheumatology clinic self-completed a written questionnaire. Results: Patients indicated a relatively high level of confidence in nurse prescribing and stated that they would be very likely to take the selected medication. The level of concern was relatively low and the majority of concerns raised did not relate to the nurse's status. Strong support was expressed for the nurse providing an explanation for medicine choice. Conclusion: This research provides support for the prescription of medicines by nurses working in the area of rheumatology, the importance of nurses providing a full explanation about the selected medicines they prescribe for these patients and some indication as to which categories of information should be included. Relevance to clinical practice: Rheumatology patients who have not yet experienced nurse prescribing are, in general, positive about nurses adopting this role. 
It is important that nurses provide appropriate information about the prescribed medicines, in a form that can be understood.
Abstract:
A method of estimating dissipation rates from a vertically pointing Doppler lidar with high temporal and spatial resolution has been evaluated by comparison with independent measurements derived from a balloon-borne sonic anemometer. This method utilizes the variance of the mean Doppler velocity from a number of sequential samples and requires an estimate of the horizontal wind speed. The noise contribution to the variance can be estimated from the observed signal-to-noise ratio and removed where appropriate. The relative size of the noise variance to the observed variance provides a measure of the confidence in the retrieval. Comparison with in situ dissipation rates derived from the balloon-borne sonic anemometer reveals that this particular Doppler lidar is capable of retrieving dissipation rates over a range of at least three orders of magnitude. This method is most suitable for retrieval of dissipation rates within the convective well-mixed boundary layer, where the scales of motion that the Doppler lidar probes remain well within the inertial subrange. Caution must be applied when estimating dissipation rates in more quiescent conditions. For the particular Doppler lidar described here, the selection of suitably short integration times will permit this method to be applicable in such situations, but at the expense of accuracy in the Doppler velocity estimates. The two case studies presented here suggest that, with profiles every 4 s, reliable estimates of ϵ can be derived to within at least an order of magnitude throughout almost all of the lowest 2 km and, in the convective boundary layer, to within 50%. Increasing the integration time for individual profiles to 30 s can improve the accuracy substantially but potentially confines retrievals to within the convective boundary layer. Therefore, optimization of certain instrument parameters may be required for specific implementations.
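The variance method lends itself to a compact sketch. The Python fragment below follows the standard inertial-subrange scaling: the turbulent velocity variance over a sampling interval, with the SNR-derived noise variance subtracted, is converted to a dissipation rate using the horizontal wind speed to set the advective length scale. The Kolmogorov constant and the exact functional form are assumptions taken from the general literature, not necessarily the paper's formulation:

```python
import math

def dissipation_rate(velocities, noise_var, wind_speed, dt, a=0.55):
    """Inertial-subrange dissipation-rate sketch.

    velocities : sequential mean Doppler velocities (m/s)
    noise_var  : noise contribution to the variance, from SNR (m^2/s^2)
    wind_speed : horizontal wind speed advecting eddies past the beam (m/s)
    dt         : time spanned by the sample sequence (s)
    a          : Kolmogorov constant (assumed value)
    """
    n = len(velocities)
    mean_v = sum(velocities) / n
    var_obs = sum((v - mean_v) ** 2 for v in velocities) / n
    var_turb = max(var_obs - noise_var, 0.0)  # remove noise where appropriate
    length_scale = wind_speed * dt            # scale of eddies sampled
    if var_turb == 0.0 or length_scale == 0.0:
        return 0.0
    return (2.0 * math.pi * (2.0 / (3.0 * a)) ** 1.5
            * var_turb ** 1.5 / length_scale)

# Illustrative retrieval: five sequential velocity samples over 20 s.
eps = dissipation_rate([0.0, 0.5, -0.5, 0.3, -0.3],
                       noise_var=0.02, wind_speed=5.0, dt=20.0)
```

Note how the ratio of `noise_var` to `var_obs` directly controls the retrieval: when noise dominates, `var_turb` collapses to zero and no meaningful dissipation rate can be reported, which is the confidence measure the abstract describes.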
Abstract:
Real-world text classification tasks often suffer from poor class structure with many overlapping classes and blurred boundaries. Training data pooled from multiple sources tend to be inconsistent and contain erroneous labelling, leading to poor performance of standard text classifiers. The classification of health service products to specialized procurement classes is used to examine and quantify the extent of these problems. A novel method is presented to analyze the labelled data by selectively merging classes where there is not enough information for the classifier to distinguish them. Initial results show the method can identify the most problematic classes, which can be used either as a focus to improve the training data or to merge classes to increase confidence in the predicted results of the classifier.
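Selective merging of indistinguishable classes can be sketched with a pairwise confusion matrix and a union-find structure. In the Python fragment below the health-procurement class names, the confusion rates and the merge threshold are all invented for illustration; the paper's actual criterion for "not enough information to distinguish" may differ:

```python
def merge_confusable(classes, confusion, threshold=0.3):
    """Merge class pairs whose mutual confusion rate exceeds `threshold`.

    confusion[(a, b)] : fraction of class-a items the classifier labels b
                        (e.g. estimated by cross-validation).
    Returns a mapping from each class to its merged-group representative.
    """
    parent = {c: c for c in classes}

    def find(c):
        # Union-find root lookup with path compression.
        while parent[c] != c:
            parent[c] = parent[parent[c]]
            c = parent[c]
        return c

    for a in classes:
        for b in classes:
            if a < b:
                rate = confusion.get((a, b), 0.0) + confusion.get((b, a), 0.0)
                if rate > threshold:
                    parent[find(a)] = find(b)  # union the two groups
    return {c: find(c) for c in classes}

# Hypothetical procurement classes with heavy two-way confusion between
# the two glove classes and almost none involving syringes.
groups = merge_confusable(
    ["gloves_surgical", "gloves_exam", "syringes"],
    {("gloves_surgical", "gloves_exam"): 0.25,
     ("gloves_exam", "gloves_surgical"): 0.20,
     ("syringes", "gloves_exam"): 0.02},
)
```

The two glove classes end up in one merged group while syringes stay separate; the merged groups can then either be kept (boosting prediction confidence) or handed back to annotators as the classes most in need of better training data.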