41 results for Exception Handling. Exceptional Behavior. Exception Policy. Software Testing. Design Rules


Relevance:

30.00%

Publisher:

Abstract:

An eddy current testing system consists of a multi-sensor probe, a computer, and a special expansion card with software for data collection and analysis. The probe incorporates an excitation coil and sensor coils; at least one sensor coil is a lateral current-normal coil and at least one is a current perturbation coil.
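
The record describes the acquisition architecture but not its software interface. The sketch below is a minimal, hypothetical illustration of the data-collection step: `read_channels` stands in for whatever driver the expansion card actually exposes, and the RMS ratio is only a placeholder for the analysis stage.

```python
# Hypothetical sketch of the data-collection loop for a multi-coil probe.
# `read_channels` is an assumed stand-in for the expansion-card driver.
import numpy as np

def read_channels(n_channels: int, n_samples: int) -> np.ndarray:
    """Placeholder driver call: one row of sampled voltages per coil."""
    return np.random.normal(size=(n_channels, n_samples))  # simulated signals

def scan_indicator(n_lcn: int = 1, n_cp: int = 1, n_samples: int = 1024) -> float:
    data = read_channels(n_lcn + n_cp, n_samples)
    lcn = data[:n_lcn]        # lateral current-normal coil signals
    cp = data[n_lcn:]         # current perturbation coil signals
    # Illustrative defect indicator: RMS of the perturbation channels
    # relative to the RMS of the current-normal channels.
    return float(np.sqrt((cp ** 2).mean()) / np.sqrt((lcn ** 2).mean()))

if __name__ == "__main__":
    print("perturbation/normal RMS ratio:", scan_indicator())
```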

Relevance:

30.00%

Publisher:

Abstract:

An eddy current testing system consists of a multi-sensor probe, a computer, and a special expansion card with software for data collection and analysis. The probe incorporates an excitation coil and sensor coils; at least one sensor coil is a lateral current-normal coil and at least one is a current perturbation coil.

Relevance:

30.00%

Publisher:

Abstract:

There is a need for better links between hydrology and ecology, specifically between landscapes and riverscapes, to understand how the processes and factors controlling the transport and storage of environmental pollution have affected or will affect the freshwater biota. Here we show how the INCA modelling framework, specifically INCA-Sed (the Integrated Catchments model for Sediments), can be used to link sediment delivery from the landscape to sediment changes in-stream. INCA-Sed is a dynamic, process-based, daily time step model. The first complete description of the equations used in the INCA-Sed software (version 1.9.11) is presented. This is followed by an application of INCA-Sed to the River Lugg (1077 km²) in Wales. Excess suspended sediment can negatively affect salmonid health. The Lugg has a large and potentially threatened population of both Atlantic salmon (Salmo salar) and brown trout (Salmo trutta). With the exception of the extreme sediment transport processes, the model satisfactorily simulated both the hydrology and the sediment dynamics in the catchment. Model results indicate that diffuse soil loss is the most important sediment generation process in the catchment. In the River Lugg, the mean annual Guideline Standard for suspended sediment concentration proposed by UKTAG, 25 mg l⁻¹, is only slightly exceeded during the simulation period (1995–2000), indicating only a minimal effect on the Atlantic salmon population. However, the daily time step of INCA-Sed also allows investigation of the critical spawning period. It shows that sediment may have a significant negative effect on the fish population in years with high sediment runoff. It is proposed that the fine settled particles probably do not affect the salmonid egg incubation process, though suspended particles may damage the gills of fish and make the area unfavourable for spawning if conditions do not improve.
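
Because INCA-Sed runs on a daily time step, compliance with the 25 mg l⁻¹ annual-mean guideline and conditions during the spawning window can be checked directly from the simulated series. The sketch below illustrates that check on synthetic data; the November–January spawning window and the lognormal series are assumptions, not model output.

```python
# Illustrative guideline check on a daily suspended-sediment series.
# The series here is synthetic; in practice it would be INCA-Sed output.
import numpy as np
import pandas as pd

days = pd.date_range("1995-01-01", "2000-12-31", freq="D")
ssc = pd.Series(np.random.lognormal(mean=2.9, sigma=0.5, size=len(days)),
                index=days, name="ssc_mg_per_l")

# Annual means against the 25 mg/l UKTAG guideline
annual_means = ssc.groupby(ssc.index.year).mean()
print(annual_means[annual_means > 25.0])

# Assumed spawning window (Nov-Jan): share of days above the guideline
spawning = ssc[ssc.index.month.isin([11, 12, 1])]
print("spawning-season days above 25 mg/l:", round((spawning > 25.0).mean(), 3))
```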

Relevance:

30.00%

Publisher:

Abstract:

Sixty cattle farmers in England were questioned about the costs associated with premovement testing for bovine tuberculosis (TB). On average, the farmers had premovement tested 2.45 times in the previous 12 months, but the majority had tested only once. An average of 28.6 animals were tested on each occasion, but there were wide variations. The average farm labour costs were £4.00 per animal tested, veterinary costs were £4.33 and other costs were £0.51, giving a total cost of £8.84, but there were wide variations between farms, and many incurred costs of more than £20 per animal. A majority of the farmers also cited disruption to the farm business or missed market opportunities as costs, but few could estimate their financial cost. Most of the farmers thought that premovement testing was a cost burden on their business, and over half thought it was not an effective policy to control bovine TB.
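
A quick check of the reported cost components; note that the per-occasion figure is not quoted in the study, it is simply the product of the reported per-animal average and the average number of animals tested.

```python
# Per-animal premovement testing cost, from the averages reported above.
labour, vet, other = 4.00, 4.33, 0.51      # GBP per animal tested
per_animal = labour + vet + other          # = 8.84 GBP
animals_per_occasion = 28.6                # average animals tested per occasion
print(f"per animal: £{per_animal:.2f}")
print(f"per testing occasion (derived): £{per_animal * animals_per_occasion:.2f}")
```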

Relevance:

30.00%

Publisher:

Abstract:

This chapter introduces agent-based models (ABMs), their construction, and the pros and cons of their use. Although relatively new, ABMs have great potential for use in ecotoxicological research, their primary advantage being the realistic simulations that can be constructed, particularly their explicit handling of space and time. Examples of their use in ecotoxicology are provided, drawn mainly from different implementations of the ALMaSS system. These examples demonstrate how multiple stressors, landscape structure, toxicological detail, animal behavior, and socioeconomic effects can and should be taken into account when constructing simulations for risk assessment. As in ecological systems, behavior at the system level in an ABM is not simply the mean of the component responses but the sum of the often nonlinear interactions between components in the system; hence this modeling approach opens the door to implementing and testing much more realistic and holistic ecotoxicological models than are currently used.
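
A toy sketch (not ALMaSS, whose landscape, behaviour and toxicology sub-models are far richer) of why system-level outcomes in an ABM are not the mean of individual responses: each agent's mortality risk depends on local crowding as well as a toxicant dose, so population survival emerges from nonlinear local interactions.

```python
# Toy agent-based sketch: individual mortality risk depends nonlinearly on
# local crowding and a toxicant dose, so the population-level response is
# not the average of isolated individual responses.
import random

def step(agents, dose, k=0.05):
    survivors = []
    for x, y in agents:
        neighbours = sum(abs(x - a) <= 1 and abs(y - b) <= 1 for a, b in agents) - 1
        p_die = 1 - (1 - k * dose) ** (1 + neighbours)   # crowding makes this nonlinear
        if random.random() > p_die:
            # survivors move to a random neighbouring cell
            survivors.append((x + random.choice([-1, 0, 1]),
                              y + random.choice([-1, 0, 1])))
    return survivors

agents = [(random.randrange(20), random.randrange(20)) for _ in range(200)]
for _ in range(10):
    agents = step(agents, dose=2.0)
print("survivors after 10 steps:", len(agents))
```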

Relevance:

30.00%

Publisher:

Abstract:

This paper considers methods for testing for superiority or non-inferiority in active-control trials with binary data, when the relative treatment effect is expressed as an odds ratio. Three asymptotic tests for the log-odds ratio based on the unconditional binary likelihood are presented, namely the likelihood ratio, Wald and score tests. All three tests can be implemented straightforwardly in standard statistical software packages, as can the corresponding confidence intervals. Simulations indicate that the three alternatives are similar in terms of the Type I error, with values close to the nominal level. However, when the non-inferiority margin becomes large, the score test slightly exceeds the nominal level. In general, the highest power is obtained from the score test, although all three tests are similar and the observed differences in power are not of practical importance. Copyright (C) 2007 John Wiley & Sons, Ltd.
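
Of the three tests, the Wald version has the simplest closed form and is sketched below for a non-inferiority comparison on the odds-ratio scale; the margin and the counts are invented for illustration, and the score and likelihood-ratio tests would instead require the constrained maximum-likelihood fit.

```python
# Hedged sketch of the Wald test for the log-odds ratio in an active-control
# non-inferiority trial (margin and data are hypothetical).
import math
from scipy.stats import norm

def wald_noninferiority(x_t, n_t, x_c, n_c, margin_or=0.8):
    """One-sided Wald test of H0: OR <= margin_or (treatment unacceptably worse)."""
    a, b = x_t, n_t - x_t          # treatment successes / failures
    c, d = x_c, n_c - x_c          # control successes / failures
    log_or = math.log((a * d) / (b * c))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    z = (log_or - math.log(margin_or)) / se
    return z, 1 - norm.cdf(z)      # reject H0 (declare non-inferiority) for small p

z, p = wald_noninferiority(x_t=80, n_t=100, x_c=85, n_c=100)
print(f"z = {z:.3f}, one-sided p = {p:.4f}")
```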

Relevance:

30.00%

Publisher:

Abstract:

While only about 100–200 species are used intensively in commercial floriculture (e.g. carnations, chrysanthemums, gerbera, narcissus, orchids, tulips, lilies, roses, pansies and violas, saintpaulias, etc.) and 400–500 as house plants, several thousand species of herbs, shrubs and trees are traded commercially by nurseries and garden centres as ornamentals or amenity species. Most of these have been introduced from the wild with little selection or breeding. In Europe alone, 12 000 species are found in cultivation in general garden collections (i.e. excluding specialist collections and botanic gardens). In addition, specialist collections (often very large) of many other species and/or cultivars of groups such as orchids, bromeliads, cacti and succulents, primulas, rhododendrons, conifers and cycads are maintained in several centres such as botanic gardens and specialist nurseries, as are 'national collections' of cultivated species and cultivars in some countries. Specialist growers, both professional and amateur, also maintain collections of plants for cultivation, including, increasingly, native plants. The trade in ornamental and amenity horticulture cannot be fully estimated but runs into many billions of dollars annually, and there is considerable potential for further development and the introduction of many new species into the trade. Despite this, most of the collections are ad hoc and no co-ordinated efforts have been made to ensure that adequate germplasm samples of these species are maintained for conservation purposes, and few of them are represented at all adequately in seed banks. Few countries have paid much attention to the germplasm needs of ornamentals; the Ornamental Plant Germplasm Center at The Ohio State University, operated in conjunction with the USDA National Plant Germplasm System, is an exception. Generally there is a serious gap in national and international germplasm strategies, which have tended to focus primarily on food plants and some forage and industrial crops. Adequate arrangements need to be put in place to ensure the long- and medium-term conservation of representative samples of the genetic diversity of ornamental species. The problems of achieving this will be discussed. In addition, a policy for the conservation of old cultivars or 'heritage' varieties of ornamentals needs to be formulated. The considerable potential for introduction of new ornamental species needs to be assessed. Consideration needs to be given to setting up a co-ordinating structure with overall responsibility for the conservation of germplasm of ornamental and amenity plants.

Relevance:

30.00%

Publisher:

Abstract:

Objective: The aims of these studies were (a) to investigate the relationship between attentional bias and eating disorders and (b) to examine the impact of psychological treatment on attentional bias. Method: The first study compared performance on a pictorial dot probe task of 82 female patients with clinical eating disorders and 44 healthy female controls. The second study compared the performance of 31 patients with eating disorders on the same task before and after 20 weeks of standardized cognitive behavior therapy. Twenty-four patients with eating disorders served as wait-list controls. Results: With the exception of neutral shape stimuli, attentional biases for eating, shape, and weight stimuli were greater in the patient sample than in the healthy controls. The second study found that attentional biases reduced significantly after active treatment only. Conclusion: Attentional biases may be an expression of the eating disorder. The question of whether such biases warrant specific intervention requires further investigation. (C) 2008 by Wiley Periodicals, Inc.
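
For reference, a dot-probe bias score is conventionally the mean reaction-time difference between trials where the probe replaces the neutral picture and trials where it replaces the salient (eating, shape or weight) picture; positive values indicate vigilance toward the salient stimuli. The sketch below uses invented reaction times and is not the authors' analysis code.

```python
# Illustrative dot-probe attentional bias score for one participant.
import statistics

def bias_score(rt_congruent_ms, rt_incongruent_ms):
    """congruent = probe replaces salient picture; incongruent = probe replaces neutral picture."""
    return statistics.mean(rt_incongruent_ms) - statistics.mean(rt_congruent_ms)

# hypothetical reaction times (ms) for one stimulus category
congruent = [512, 498, 530, 505, 520]
incongruent = [548, 541, 560, 533, 552]
print(f"attentional bias: {bias_score(congruent, incongruent):.1f} ms")
```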

Relevance:

30.00%

Publisher:

Abstract:

This paper demonstrates that recent influential contributions to monetary policy imply an emerging consensus whereby neither rigid rules nor complete discretion are found optimal. Instead, middle-ground monetary regimes based on rules (operative under 'normal' circumstances) to anchor inflation expectations over the long run, but designed with enough flexibility to mitigate the short-run effect of shocks (with communicated discretion in 'exceptional' circumstances temporarily overriding these rules), are gaining support in theoretical models and in policy formulation and implementation. The opposition of 'rules versus discretion' has thus reappeared as the synthesis of 'rules cum discretion', in essence as inflation-forecast targeting. But such a synthesis is not without major theoretical problems, as we argue in this contribution. Furthermore, very recent real-world events have made it obvious that the inflation-targeting strategy of monetary policy, which rests upon the new consensus paradigm in modern macroeconomics, is at best a 'fair weather' model. In today's turbulent economic climate of highly unstable inflation, deep financial crisis and abrupt worldwide economic slowdown, this approach needs serious rethinking, to say the least, if not abandoning altogether.
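
A stylised illustration of the 'rules cum discretion' regime described above: a Taylor-type rule operates in normal times, with a communicated, temporary override in exceptional circumstances. The coefficients and the override rate are conventional textbook values, not taken from the paper.

```python
# Stylised "rules cum discretion" policy-rate sketch (illustrative only).
def policy_rate(inflation, output_gap, *, target=2.0, neutral_real=2.0,
                crisis=False, crisis_rate=0.5):
    if crisis:
        return crisis_rate                     # temporary discretionary override
    # Taylor (1993)-style rule in normal circumstances
    return neutral_real + inflation + 0.5 * (inflation - target) + 0.5 * output_gap

print(policy_rate(inflation=3.0, output_gap=-1.0))                # rule-based: 5.0
print(policy_rate(inflation=1.0, output_gap=-5.0, crisis=True))   # override: 0.5
```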

Relevance:

30.00%

Publisher:

Abstract:

Many weeds occur in patches but farmers frequently spray whole fields to control the weeds in these patches. Given a geo-referenced weed map, technology exists to confine spraying to these patches. Adoption of patch spraying by arable farmers has, however, been negligible, partly due to the difficulty of constructing weed maps. Building on previous DEFRA and HGCA projects, this proposal aims to develop and evaluate a machine vision system to automate the weed mapping process. The project thereby addresses the principal technical stumbling block to widespread adoption of site-specific weed management (SSWM). The accuracy of weed identification by machine vision based on a single field survey may be inadequate to create herbicide application maps. We therefore propose to test the hypothesis that sufficiently accurate weed maps can be constructed by integrating information from geo-referenced images captured automatically at different times of the year during normal field activities. Accuracy of identification will also be increased by utilising a priori knowledge of weeds present in fields. To prove this concept, images will be captured from arable fields on two farms and processed offline to identify and map the weeds, focussing especially on black-grass, wild oats, barren brome, couch grass and cleavers. As advocated by Lutman et al. (2002), the approach uncouples the weed mapping and treatment processes and builds on the observation that patches of these weeds are quite stable in arable fields.

There are three main aspects to the project.

1) Machine vision hardware. Hardware components of the system are one or more cameras connected to a single-board computer (Concurrent Solutions LLC) and interfaced with an accurate Global Positioning System (GPS) supplied by Patchwork Technology. The camera(s) will take separate measurements for each of the three primary colours of visible light (red, green and blue) in each pixel. The basic proof of concept can be achieved in principle using a single camera system, but in practice systems with more than one camera may need to be installed so that larger fractions of each field can be photographed. Hardware will be reviewed regularly during the project in response to feedback from other work packages and updated as required.

2) Image capture and weed identification software. The machine vision system will be attached to toolbars of farm machinery so that images can be collected during different field operations. Images will be captured at different ground speeds, in different directions and at different crop growth stages, as well as in different crop backgrounds. Having captured geo-referenced images in the field, image analysis software will be developed to identify weed species by Murray State and Reading Universities with advice from The Arable Group. A wide range of pattern recognition techniques, and in particular Bayesian networks, will be used to advance the state of the art in machine vision-based weed identification and mapping. Weed identification algorithms used by others are inadequate for this project as we intend to collect and correlate images collected at different growth stages. Plants grown for this purpose by Herbiseed will be used in the first instance. In addition, our image capture and analysis system will include plant characteristics such as leaf shape, size, vein structure, colour and textural pattern, some of which are not detectable by other machine vision systems or are omitted by their algorithms. Using such a list of features observable with our machine vision system, we will determine those that can be used to distinguish the weed species of interest.

3) Weed mapping. Geo-referenced maps of weeds in arable fields (Reading University and Syngenta) will be produced with advice from The Arable Group and Patchwork Technology. Natural infestations will be mapped in the fields, but we will also introduce specimen plants in pots to facilitate more rigorous system evaluation and testing. Manual weed maps of the same fields will be generated by Reading University, Syngenta and Peter Lutman so that the accuracy of automated mapping can be assessed.

The principal hypothesis and concept to be tested is that, by combining maps from several surveys, a weed map with acceptable accuracy for end-users can be produced. If the concept is proved and can be commercialised, systems could be retrofitted at low cost onto existing farm machinery. The outputs of the weed mapping software would then link with the precision farming options already built into many commercial sprayers, allowing their use for targeted, site-specific herbicide applications. Immediate economic benefits would therefore arise directly from reducing herbicide costs. SSWM will also reduce the overall pesticide load on the crop and so may reduce pesticide residues in food and drinking water, and reduce adverse impacts of pesticides on non-target species and beneficials. Farmers may even choose to leave unsprayed some non-injurious, environmentally beneficial, low-density weed infestations. These benefits fit very well with the anticipated legislation emerging in the new EU Thematic Strategy for Pesticides, which will encourage more targeted use of pesticides and greater uptake of Integrated Crop (Pest) Management approaches, and also with the requirements of the Water Framework Directive to reduce levels of pesticides in water bodies. The greater precision of weed management offered by SSWM is therefore a key element in preparing arable farming systems for a future where policy makers and consumers want to minimise pesticide use and the carbon footprint of farming while maintaining food production and security. The mapping technology could also be used on organic farms to identify areas of fields needing mechanical weed control, thereby reducing both carbon footprints and damage to crops by, for example, spring tines.

Objectives: i. To develop a prototype machine vision system for automated image capture during agricultural field operations; ii. To prove the concept that images captured by the machine vision system over a series of field operations can be processed to identify and geo-reference specific weeds in the field; iii. To generate weed maps from the geo-referenced weed plants/patches identified in objective (ii).
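
As a simplified stand-in for the proposed classification step (the project itself plans Bayesian networks and fusion of images from several dates), the sketch below trains a Gaussian naive Bayes model on synthetic per-plant features (mean R, G, B, leaf area, elongation); the feature values and species centres are invented.

```python
# Simplified stand-in for the weed classification step: Gaussian naive Bayes
# over per-plant image features, trained and tested on synthetic data.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
species = ["black-grass", "wild oats", "cleavers"]

def synthetic_features(label, n):
    centres = {"black-grass": [60, 110, 50, 3.0, 4.5],
               "wild oats":   [80, 130, 60, 5.0, 6.0],
               "cleavers":    [70, 140, 70, 2.0, 1.5]}
    return rng.normal(centres[label], [8, 8, 8, 0.5, 0.5], size=(n, 5))

X = np.vstack([synthetic_features(s, 200) for s in species])
y = np.repeat(species, 200)

clf = GaussianNB().fit(X, y)
# classify one new geo-referenced plant observation (R, G, B, area, elongation)
print(clf.predict([[62, 112, 52, 3.2, 4.4]]))   # expected: black-grass
```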

Relevance:

30.00%

Publisher:

Abstract:

The Agri-Environment Footprint Index (AFI) has been developed as a generic methodology to assess changes in the overall environmental impact of agriculture at the farm level and to assist in the evaluation of European agri-environmental schemes (AES). The methodology is based on multicriteria analysis (MCA) and involves stakeholder participation to provide a locally customised evaluation based on weighted environmental indicators. The methodology was subjected to a feasibility assessment in a series of case studies across the EU. The AFI approach was able to measure significant differences in environmental status between farms that participated in an AES and non-participants. Wider environmental concerns, beyond the scheme objectives, were also considered in some case studies, and the benefits for identifying unintentional (and often beneficial) impacts of AESs are presented. The participatory approach to AES evaluation proved efficient in different environments and administrative contexts. The approach proved to be appropriate for environmental evaluation of complex agri-environment systems and can complement any evaluation conducted under the Common Monitoring and Evaluation Framework. The applicability of the AFI in routine monitoring of AES impacts and in providing feedback to improve policy design is discussed.
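
In spirit, the AFI aggregates locally weighted indicator scores; the published methodology specifies its own normalisation and stakeholder-weighting protocol, so the weights and scores below are purely illustrative.

```python
# Generic weighted-indicator sketch in the spirit of the AFI (weights and
# scores invented for illustration; not the published index definition).
def footprint_index(scores, weights):
    """Weighted average of environmental indicator scores (0-10 scale)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * scores[k] for k in weights)

weights = {"soil": 0.3, "water": 0.3, "biodiversity": 0.25, "landscape": 0.15}
aes_farm     = {"soil": 7, "water": 6, "biodiversity": 8, "landscape": 7}
non_aes_farm = {"soil": 5, "water": 5, "biodiversity": 4, "landscape": 6}

print(footprint_index(aes_farm, weights), footprint_index(non_aes_farm, weights))
```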

Relevance:

30.00%

Publisher:

Abstract:

Models used in neoclassical economics assume human behaviour to be purely rational. On the other hand, models adopted in social and behavioural psychology are founded on the ‘black box’ of human cognition. In view of these observations, this paper aims to bridge the gap by introducing psychological constructs into the well-established microeconomic framework of choice behaviour based on random utility theory. In particular, it combines constructs developed using Ajzen’s theory of planned behaviour with Lancaster’s theory of consumer demand for product characteristics to explain stated preferences for certified animal-friendly foods. To reach this objective a web survey was administered in the five largest EU-25 countries: France, Germany, Italy, Spain and the UK. Findings identify some salient cross-cultural differences between northern and southern Europe and suggest that psychological constructs developed using the Ajzen model are useful in explaining heterogeneity of preferences. Implications for policy makers and marketers involved with certified animal-friendly foods are discussed.
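
A minimal random-utility sketch of the kind of model described: Lancaster-style product attributes enter deterministic utility alongside TPB-derived psychological constructs, and choice probabilities follow the multinomial logit form. All coefficients and alternatives below are invented, not estimates from the survey.

```python
# Random-utility (multinomial logit) sketch with product attributes plus
# TPB-style psychological constructs; all numbers are illustrative.
import numpy as np

def choice_probabilities(attributes, beta, psych, gamma):
    """Multinomial logit: P(j) = exp(V_j) / sum_k exp(V_k)."""
    v = attributes @ beta + psych @ gamma      # deterministic utility per alternative
    expv = np.exp(v - v.max())                 # numerically stable softmax
    return expv / expv.sum()

# alternatives: [price, animal-welfare certification (0/1)]
attributes = np.array([[3.0, 0], [3.8, 1], [4.5, 1]])
beta = np.array([-0.9, 1.2])
# alternative-specific interaction of certification with the respondent's attitude score
psych = np.array([[0.0], [0.6], [0.8]])
gamma = np.array([0.5])

print(choice_probabilities(attributes, beta, psych, gamma))
```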

Relevance:

30.00%

Publisher:

Abstract:

Patterns of forest cover and forest degradation determine the size and types of ecosystem services forests provide. Particularly in low-income countries, nontimber forest product (NTFP) extraction by rural people, which provides important resources and income to the rural poor, contributes to the level and pattern of forest degradation. Although recent policy, particularly in Africa, emphasizes forest degradation, relatively little research describes the spatial aspects of NTFP collection that lead to spatial degradation patterns. This paper reviews both the spatial empirical work on NTFP extraction and related forest degradation patterns, and spatial models of behavior of rural people who extract NTFPs from forest. Despite the impact of rural people's behavior on resulting quantities and patterns of forest resources, spatial–temporal models/patterns rarely inform park siting and sizing decisions, econometric assessments of park effectiveness, development projects to support conservation, or REDD protocols. Using the literature review as a lens, we discuss the models' implications for these policies with particular emphasis on effective conservation spending and leakage.
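
Many of the spatial extraction models reviewed share a simple distance-decay structure: collection is profitable only up to the distance at which travel cost exhausts the value of a trip, producing a degradation ring around settlements. The sketch below illustrates that structure with invented prices, wages and yields; it is not any specific model from the literature.

```python
# Stylized distance-decay NTFP extraction model (illustrative parameters only).
price, yield_per_trip = 2.0, 2.0   # price per kg, kg collected per trip
wage, speed = 2.0, 3.0             # opportunity cost per hour, walking speed km/h

def net_return(distance_km: float) -> float:
    travel_hours = 2 * distance_km / speed        # round trip
    return price * yield_per_trip - wage * travel_hours

# Maximum profitable collection distance = radius of the degraded ring
d_star = next(d / 10 for d in range(0, 200) if net_return(d / 10) <= 0)
print(f"collection stops beyond ~{d_star:.1f} km from the village")
```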

Relevance:

30.00%

Publisher:

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what resolution, both horizontal and vertical, is necessary in atmospheric and ocean models for more confident predictions at the regional and local level. Current limits on computing power have severely constrained such investigations, which are now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.
