64 results for Advanced application and branching systems


Relevance: 100.00%

Abstract:

During the past 15 years, a number of initiatives have been undertaken at national level to develop ocean forecasting systems operating at regional and/or global scales. The co-ordination between these efforts has been organized internationally through the Global Ocean Data Assimilation Experiment (GODAE). The French MERCATOR project is one of the leading participants in GODAE. The MERCATOR systems routinely assimilate a variety of observations such as multi-satellite altimeter data, sea-surface temperature and in situ temperature and salinity profiles, focusing on high-resolution scales of the ocean dynamics. The assimilation strategy in MERCATOR is based on a hierarchy of methods of increasing sophistication including optimal interpolation, Kalman filtering and variational methods, which are progressively deployed through the Système d’Assimilation MERCATOR (SAM) series. SAM-1 is based on a reduced-order optimal interpolation which can be operated using ‘altimetry-only’ or ‘multi-data’ set-ups; it relies on the concept of separability, assuming that the correlations can be separated into a product of horizontal and vertical contributions. The second release, SAM-2, is being developed to include new features from the singular evolutive extended Kalman (SEEK) filter, such as three-dimensional, multivariate error modes and adaptivity schemes. The third one, SAM-3, considers variational methods such as the incremental four-dimensional variational algorithm. Most operational forecasting systems evaluated during GODAE are based on least-squares statistical estimation assuming Gaussian errors. In the framework of the EU MERSEA (Marine EnviRonment and Security for the European Area) project, research is being conducted to prepare the next-generation operational ocean monitoring and forecasting systems. The research effort will explore nonlinear assimilation formulations to overcome limitations of the current systems. This paper provides an overview of the developments conducted in MERSEA with the SEEK filter, the Ensemble Kalman filter and the sequential importance re-sampling filter.
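The separability assumption underpinning SAM-1 lends itself to a one-line formula. The following is a schematic rendering in notation chosen here for illustration (it is not taken from the paper): the background-error correlation between two points factorizes into a horizontal and a vertical contribution.

```latex
% Separability: the correlation between points (x_1, z_1) and (x_2, z_2),
% with x the horizontal position and z the depth, factorizes into a
% horizontal correlation C_h and a vertical correlation C_v.
C\bigl((x_1, z_1), (x_2, z_2)\bigr) = C_h(x_1, x_2)\, C_v(z_1, z_2)
```

This factorization is what makes the reduced-order optimal interpolation tractable: horizontal and vertical structures can be estimated and stored separately rather than as one full three-dimensional covariance.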

Relevance: 100.00%

Abstract:

A full assessment of para-virtualization is important, because without knowledge of the various overheads users cannot judge whether using virtualization is a good idea. In this paper we are interested in assessing the overheads of running various benchmarks on bare metal as well as under para-virtualization. The idea is to see what the overheads of para-virtualization are, as well as the overheads of turning on monitoring and logging. The knowledge gained from assessing various benchmarks on these different systems will help a range of users understand the use of virtualization systems. In this paper we assess the overheads of using Xen, VMware, KVM and Citrix (see Table 1). These virtualization systems are used extensively by cloud users. We use various Netlib benchmarks, which have been developed by the University of Tennessee at Knoxville (UTK) and Oak Ridge National Laboratory (ORNL). To assess these virtualization systems, we run the benchmarks on bare metal, then under para-virtualization, and finally with monitoring and logging turned on. The latter is important because users are interested in the Service Level Agreements (SLAs) offered by cloud providers, and logging is a means of assessing the services bought and used from commercial providers. We assess the virtualization systems on three different platforms: the Thamesblue supercomputer, the Hactar cluster and an IBM JS20 blade server (see Table 2), all of which are servers available at the University of Reading. A functional virtualization system is multi-layered and is driven by its privileged components. Virtualization systems can host multiple guest operating systems, each of which runs in its own domain, and the system schedules virtual CPUs and memory within each virtual machine (VM) to make the best use of the available resources. The guest operating system schedules each application accordingly. Virtualization can be deployed as full virtualization or para-virtualization. Full virtualization provides a total abstraction of the underlying physical system and creates a new virtual system in which the guest operating systems can run. No modifications are needed in the guest OS or application: the guest OS and application are not aware of the virtualized environment and run normally. Para-virtualization requires modification of the guest operating systems that run on the virtual machines: these guest operating systems are aware that they are running on a virtual machine, and in return provide near-native performance. Both para-virtualization and full virtualization can be deployed across various virtualized systems. Para-virtualization is OS-assisted virtualization, in which some modifications are made to the guest operating system to enable better performance. In this kind of virtualization the guest operating system is aware that it is running on virtualized hardware rather than on bare hardware. In para-virtualization, the device drivers in the guest operating system coordinate with the device drivers of the host operating system, reducing the performance overheads. The use of para-virtualization [0] is intended to avoid the bottleneck associated with slow hardware interrupts that exists when full virtualization is employed.
It has been shown [0] that para-virtualization does not impose significant performance overhead in high performance computing, and this in turn has implications for the use of cloud computing for hosting HPC applications. The "apparent" improvement in virtualization has led us to formulate the hypothesis that certain classes of HPC applications should be able to execute in a cloud environment with minimal performance degradation. In order to support this hypothesis, it is first necessary to define exactly what is meant by a "class" of application, and secondly to observe application performance both within a virtual machine and when executing on bare hardware. A further potential complication is the need for cloud service providers to support Service Level Agreements (SLAs), so that system utilisation can be audited.
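As a concrete illustration of the measurement procedure described above, the sketch below times the same benchmark on bare metal and inside a guest and reports the relative overhead. It is a minimal harness under stated assumptions: the benchmark binary `./linpack_bench` and the host alias `guest-vm` are hypothetical, and the paper's actual methodology is not reproduced here.

```python
import statistics
import subprocess
import time

def time_benchmark(cmd, runs=5):
    """Run a benchmark command several times; return mean wall-clock seconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True, capture_output=True)
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples)

# Hypothetical invocations: the same Netlib benchmark timed on bare metal
# and inside a para-virtualized guest reached over SSH.
bare = time_benchmark(["./linpack_bench"])
virt = time_benchmark(["ssh", "guest-vm", "./linpack_bench"])
print(f"para-virtualization overhead: {100.0 * (virt - bare) / bare:.1f}%")
```

A third run with monitoring and logging enabled, timed the same way, would complete the three-condition comparison the paper describes.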

Relevance: 100.00%

Abstract:

As the ideal method of assessing the nutritive value of a feedstuff, namely offering it to the appropriate class of animal and recording the production response obtained, is neither practical nor cost-effective, a range of feed evaluation techniques has been developed. Each of these balances some degree of compromise with the practical situation against the rate of data generation. However, owing to the impact of animal-feed interactions over and above that of feed composition, the target animal remains the ultimate arbiter of nutritional value. In this review, current in vitro feed evaluation techniques are examined according to the degree of animal-feed interaction. Chemical analysis provides absolute values and therefore differs from the majority of in vitro methods, which simply rank feeds. However, with no host animal involvement, estimates of nutritional value are inferred by statistical association. In addition, given the costs involved, the practical value of many analyses conducted should be reviewed. The in sacco technique has made a substantial contribution both to understanding rumen microbial degradative processes and to the rapid evaluation of feeds, especially in developing countries. However, the numerous shortfalls of the technique (common to many in vitro methods), the desire to eliminate the use of surgically modified animals from routine feed evaluation, and parallel improvements in in vitro techniques will see it increasingly replaced. The majority of in vitro systems use substrate disappearance to assess degradation; however, this provides no information on the quantity of derived end-products available to the host animal. As measurement of volatile fatty acids or microbial biomass production greatly increases analytical costs, fermentation gas release, a simple and non-destructive measurement, has been used as an alternative. However, as gas release alone is of little use, gas-based systems in which degradation and fermentation gas release are measured simultaneously are attracting considerable interest. Alternative microbial inocula are being considered, as is the potential of using multi-enzyme systems to examine degradation dynamics. It is concluded that while chemical analysis will continue to form an indispensable part of feed evaluation, enhanced use will be made of increasingly complex in vitro systems. It is vital, however, that the function and limitations of each methodology are fully understood, and that the temptation to over-interpret the data is avoided, so that appropriate conclusions are drawn. With careful selection and correct application, in vitro systems offer powerful research tools with which to evaluate feedstuffs.
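For orientation, the degradation and gas-release time courses measured by such systems are often summarized with a simple exponential curve. The form below (the widely used model of Ørskov and McDonald) is offered purely as an illustration; the review itself discusses a range of models and their limitations.

```latex
% Cumulative degradation or gas release y(t) after incubation time t:
% a = rapidly soluble fraction, b = insoluble but degradable fraction,
% c = fractional rate constant. Illustrative only.
y(t) = a + b\left(1 - e^{-ct}\right)
```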

Relevance: 100.00%

Abstract:

1. The establishment of grassy strips at the margins of arable fields is an agri-environment scheme that aims to provide resources for native flora and fauna and thus increase farmland biodiversity. These margins can be managed to target certain groups, such as farmland birds and pollinators, but the impact of such management on the soil fauna has been poorly studied. This study assessed the effect of seed mix and management on the biodiversity, conservation and functional value of field margins for soil macrofauna. 2. Experimental margin plots were established in 2001 in a winter wheat field in Cambridgeshire, UK, using a factorial design of three seed mixes and three management practices [spring cut, herbicide application and soil disturbance (scarification)]. In spring and autumn 2005, soil cores taken from the margin plots and the crop were hand-sorted for soil macrofauna. The Lumbricidae, Isopoda, Chilopoda, Diplopoda, Carabidae and Staphylinidae were identified to species and classified according to feeding type. 3. Diversity in the field margins was generally higher than in the crop, with the Lumbricidae, Isopoda and Coleoptera having significantly more species and/or higher abundances in the margins. Within the margins, management had a significant effect on the soil macrofauna, with scarified plots containing lower abundances and fewer species of isopods. The species composition of the scarified plots was similar to that of the crop. 4. Scarification also reduced soil- and litter-feeder abundances and predator species densities, although populations appeared to recover by the autumn, probably as a result of dispersal from neighbouring plots and boundary features. The implications of the responses of these feeding groups for ecosystem services are discussed. 5. Synthesis and applications. This study shows that the management of agri-environment schemes can significantly influence their value for soil macrofauna. In order to encourage the litter-dwelling invertebrates that tend to be missing from arable systems, agri-environment schemes should aim to minimize soil cultivation and develop a substantial surface litter layer. However, this may conflict with other aims of these schemes, such as enhancing floristic and pollinator diversity.

Relevance: 100.00%

Abstract:

Conservation of crop wild relatives (CWRs) is a complex interdisciplinary process that is being addressed by various national and international initiatives, including two Global Environment Facility projects ('In situ Conservation of Crop Wild Relatives through Enhanced Information Management and Field Application' and 'Design, Testing and Evaluation of Best Practices for in situ Conservation of Economically Important Wild Species'), the European Community-funded project 'European Crop Wild Relative Diversity Assessment and Conservation Forum (PGR Forum)' and the European 'In situ and On Farm Network'. The key issues that have arisen are: (1) the definition of what constitutes a CWR, (2) the need for national and regional information systems and a global system, (3) development and application of priority-determining mechanisms, (4) the incorporation of the conservation of CWRs into existing national, regional and international PGR programmes, (5) assessment of the effectiveness of conservation actions, (6) awareness of the importance of CWRs in agricultural development at local, national and international levels for both the scientific and lay communities and (7) policy development and legal framework. The above issues are illustrated by work on the conservation of a group of legumes known as grasspea chicklings, vetchlings, and horticultural ornamental peas (Lathyrus spp.) in their European and Mediterranean centre of diversity.

Relevance: 100.00%

Abstract:

The performance benefit of Grid systems comes from different strategies, among which partitioning applications into parallel tasks is the most important. In most cases, however, the gain from partitioning is diminished by synchronization overhead, mainly due to the high variability of the completion times of the different tasks, which in turn is due to the large heterogeneity of Grid nodes. For this reason it is important to have models that capture the performance of such systems. In this paper we describe a queueing-network-based performance model able to accurately analyze Grid architectures, and we use the model to study a real parallel application executed in a Grid. The proposed model improves on classical modelling techniques and highlights the impact of resource heterogeneity and network latency on application performance.
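The effect attributed above to heterogeneity, namely that a fork-join job finishes only when its slowest task does, can be seen in a toy Monte Carlo estimate. This is an illustration of the phenomenon only, not the authors' queueing-network model; task times are assumed exponential for simplicity.

```python
import random
import statistics

def expected_makespan(mean_times, runs=10_000):
    """Monte Carlo estimate of the expected completion time of a fork-join
    job whose task durations are exponential with the given means."""
    return statistics.mean(
        max(random.expovariate(1.0 / m) for m in mean_times)
        for _ in range(runs)
    )

homogeneous = [10.0] * 8                       # identical nodes
heterogeneous = [5, 5, 8, 8, 12, 12, 15, 15]   # same total work, mixed speeds
print(expected_makespan(homogeneous))    # lower: tasks finish together
print(expected_makespan(heterogeneous))  # higher: slow nodes dominate
```

Even with the same total work, the heterogeneous configuration yields a longer expected makespan, which is exactly the synchronization overhead the model is designed to capture.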

Relevance: 100.00%

Abstract:

Near-isogenic lines (NILs) varying for reduced height (Rht) and photoperiod insensitivity (Ppd-D1) alleles in a cv. Mercia background (rht (tall), Rht-B1b, Rht-D1b, Rht-B1c, Rht8c+Ppd-D1a, Rht-D1c, Rht12) were compared for interception of photosynthetically active radiation (PAR), radiation use efficiency (RUE), above-ground biomass (AGB), harvest index (HI), height, weed prevalence, lodging and grain yield at one field site, but within contrasting (‘organic’ vs ‘conventional’) rotational and agronomic contexts, in each of three years. In the final year, further NILs (rht (tall), Rht-B1b, Rht-D1b, Rht-B1c, Rht-B1b+Rht-D1b, Rht-D1b+Rht-B1c) in Maris Huntsman and Maris Widgeon backgrounds were added, together with 64 lines of a doubled haploid (DH) population [Savannah (Rht-D1b) × Renesansa (Rht8c+Ppd-D1a)]. There were highly significant genotype × system interactions for grain yield, mostly because differences were greater in the conventional system than in the organic system. Quadratic fits of NIL grain yield against height were appropriate for both systems when all NILs and years were included. Extreme dwarfing was associated with reduced PAR, RUE, AGB and HI, and increased weed prevalence. Intermediate dwarfing was often associated with improved HI in the conventional system, but not in the organic system. Heights in excess of the optimum for yield were associated particularly with reduced HI and, in the conventional system, lodging. There was no statistical evidence that optimum height for grain yield varied with system, although fits peaked at 85 cm and 96 cm in the conventional and organic systems, respectively. Amongst the DH lines, the marker for Ppd-D1a was associated with earlier flowering and, in the conventional system only, with reduced PAR, AGB and grain yield. The marker for Rht-D1b was associated with reduced height and, again in the conventional system only, with increased HI and grain yield. The marker for Rht8c reduced height and, in the conventional system only, increased HI. When the system × DH line means were used as observations, grain yield was associated with height and early vegetative growth in the organic system, but not in the conventional system. In the conventional system, PAR interception after anthesis correlated with yield. Savannah was the highest-yielding line in the conventional system, producing significantly more grain than several lines that outyielded it in the organic system.
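A minimal sketch of the quadratic yield-height fit described above, using made-up (height, yield) pairs rather than the study's data: np.polyfit fits the parabola and the vertex formula recovers the height at which fitted yield peaks.

```python
import numpy as np

# Hypothetical (height cm, grain yield t/ha) observations for illustration.
height = np.array([50.0, 65.0, 80.0, 95.0, 110.0, 125.0])
grain_yield = np.array([4.1, 5.6, 6.4, 6.5, 6.0, 5.2])

a, b, c = np.polyfit(height, grain_yield, 2)  # yield = a*h^2 + b*h + c
optimum_height = -b / (2.0 * a)               # vertex of the fitted parabola
print(f"height at peak fitted yield: {optimum_height:.0f} cm")
```

Fitting the same model separately to the conventional and organic observations is what produces the two reported optima (85 cm and 96 cm).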

Relevance: 100.00%

Abstract:

This paper presents the results of the crowd image analysis challenge of the PETS2010 workshop. The evaluation was carried out using a selection of the metrics developed in the Video Analysis and Content Extraction (VACE) program and the CLassification of Events, Activities, and Relationships (CLEAR) consortium. The PETS 2010 evaluation was performed using new ground truth created from each independent two-dimensional view. In addition, the submissions to PETS 2009 and Winter-PETS 2009 were evaluated and included in the results. The evaluation highlights the detection and tracking performance of the authors’ systems in areas such as precision, accuracy and robustness.
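For reference, the CLEAR framework referred to above is usually summarized by two multiple-object tracking scores, MOTP and MOTA. The definitions below are the standard published forms, given for orientation; they are not results from this paper.

```latex
% MOTP: average matching error over all object-hypothesis matches;
% d_{i,t} = distance (or overlap error) of match i in frame t,
% c_t = number of matches in frame t.
\mathrm{MOTP} = \frac{\sum_{i,t} d_{i,t}}{\sum_t c_t}

% MOTA: 1 minus the combined rate of misses m_t, false positives fp_t
% and identity mismatches mme_t, relative to ground-truth objects g_t.
\mathrm{MOTA} = 1 - \frac{\sum_t \left( m_t + fp_t + mme_t \right)}{\sum_t g_t}
```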

Relevance: 100.00%

Abstract:

A novel Neuropredictive Teleoperation (NPT) scheme is presented. The design rests on two key ideas: the exploitation of the measured or estimated neural input to the human arm, or its electromyograph (EMG), as the system input, and the employment of a predictor of the arm movement, based on this neural signal and an arm model, to compensate for time delays in the system. Although a multitude of such models, as well as measuring devices for the neural signals and the EMG, have been proposed, current telemanipulator research has considered only highly simplified arm models. In the present design, the bilateral constraint that the master and slave are simultaneously compliant to each other's state (equal positions and forces) is abandoned, yielding a simple-to-analyze succession of only locally controlled modules, and robustness to time delays of up to 500 ms. The proposed designs were inspired by well-established physiological evidence that the brain, rather than controlling the movement on-line, programs the arm with an action plan of a complete movement, which is then executed largely in open loop, regulated only by local reflex loops. As a model of the human arm, the well-established Stark model is employed, whose mathematical representation is modified to make it suitable for an engineering application. The proposed scheme is, however, valid for any arm model. BIBO-stability and passivity results for a variety of local control laws are reported. Simulation results and comparisons with traditional designs also highlight the advantages of the proposed design.
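The delay-compensation idea can be sketched as forward simulation of an arm model over the round-trip delay, driven by the measured neural/EMG input. The linear state-space arm below is a hypothetical stand-in for the Stark model used in the paper, and the numbers are illustrative only.

```python
import numpy as np

def predict_arm_state(x, u_neural, A, B, delay, dt=0.001):
    """Forward-simulate a linearized arm model x' = A x + B u over the
    round-trip delay so the slave tracks the predicted, rather than the
    delayed, master state. Explicit Euler integration for simplicity."""
    for _ in range(int(delay / dt)):
        x = x + dt * (A @ x + B * u_neural)
    return x

# Toy second-order arm (position, velocity) and a 500 ms round-trip delay.
A = np.array([[0.0, 1.0], [-20.0, -5.0]])
B = np.array([0.0, 20.0])
x_now = np.array([0.1, 0.0])
print(predict_arm_state(x_now, u_neural=1.0, A=A, B=B, delay=0.5))
```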

Relevance: 100.00%

Abstract:

For thousands of years, humans have inhabited locations that are highly vulnerable to the impacts of climate change, earthquakes, and floods. In order to investigate the extent to which Holocene environmental changes may have impacted on cultural evolution, we present new geologic, geomorphic, and chronologic data from the Qazvin Plain in northwest Iran that provide a backdrop of natural environmental changes for the simultaneous cultural dynamics observed on the Central Iranian Plateau. Well-resolved archaeological data from the neighbouring settlements of Zagheh (7170–6300 yr BP), Ghabristan (6215–4950 yr BP) and Sagzabad (4050–2350 yr BP) indicate that Holocene occupation of the Hajiarab alluvial fan was interrupted by a 900 year settlement hiatus. Multiproxy climate data from nearby lakes in northwest Iran suggest a transition from arid early-Holocene conditions to more humid middle-Holocene conditions from c. 7550 to 6750 yr BP, coinciding with the settlement of Zagheh, and a peak in aridity at c. 4550 yr BP during the settlement hiatus. Palaeoseismic investigations indicate that large active fault systems in close proximity to the tell sites incurred a series of large (Mw ~7.1) earthquakes with return periods of ~500–1000 years during human occupation of the tells. Mapping and optically stimulated luminescence (OSL) chronology of the alluvial sequences reveal changes in depositional style from coarse-grained unconfined sheet-flow deposits to proximal channel flow and distally prograding alluvial deposits sometime after c. 8830 yr BP, possibly reflecting an increase in moisture following the early-Holocene arid phase. The coincidence of major climate changes, earthquake activity, and varying sedimentation styles with changing patterns of human occupation on the Hajiarab fan indicates links between environmental and anthropogenic systems. However, temporal coincidence does not necessitate a fundamental causative dependency.

Relevance: 100.00%

Abstract:

Northern Hemisphere cyclone activity is assessed by applying an algorithm for the detection and tracking of synoptic-scale cyclones to mean sea level pressure data. The method, originally developed for the Southern Hemisphere, is adapted for application in the Northern Hemisphere winter season. NCEP reanalysis data from 1958/59 to 1997/98 are used as input. The sensitivities of the results to particular parameters of the algorithm are discussed both for case studies and from a climatological point of view. Results show that the choice of settings is of major relevance, especially for the tracking of smaller-scale and fast-moving systems. With an appropriate setting, the algorithm is capable of automatically tracking different types of cyclones at the same time: both fast-moving and developing systems over the large ocean basins and smaller-scale cyclones over the Mediterranean basin can be assessed. The climatology of cyclone variables, e.g., cyclone track density, cyclone counts, intensification rates, propagation speeds and areas of cyclogenesis and cyclolysis, gives detailed information on typical cyclone life cycles for different regions. Lowering the spatial and temporal resolution of the input data from the full resolution T62/06h to T42/12h decreases the cyclone track density and cyclone counts. Reducing the temporal resolution alone contributes to a decline in the number of fast-moving systems, which is relevant for the cyclone track density. Lowering the spatial resolution alone mainly reduces the number of weak cyclones.
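A toy version of the detection step, locating candidate cyclone centres as local minima of a gridded MSLP field, might look like the following. The neighbourhood size and pressure threshold are illustrative parameters, not the algorithm's actual settings, which the paper shows must be tuned with care.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def detect_cyclone_centres(mslp_hpa, threshold_hpa=1000.0):
    """Return (row, col) indices of grid points that are local MSLP minima
    within a 3x3 neighbourhood and lie below a pressure threshold."""
    is_local_min = mslp_hpa == minimum_filter(mslp_hpa, size=3)
    return np.argwhere(is_local_min & (mslp_hpa < threshold_hpa))
```

A full tracker would then link detected centres between consecutive time steps, which is where the sensitivity to temporal resolution reported above enters.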

Relevance: 100.00%

Abstract:

The study of intuition is an emerging area of research in psychology, the social sciences, and business studies. It is increasingly of interest to the study of management, for example in decision-making as a counterpoint to structured approaches. Recently, work has been undertaken to conceptualize a construct for the intuitive nature of technology. However, to date there is no common understanding of the term intuition in information systems (IS) research. This paper extends the study of intuition in IS research by using exploratory research to categorize the use of the word “intuition” and related terms in papers published in two prominent IS journals over a ten-year period. The entire text of MIS Quarterly and Information Systems Research was reviewed for the years 1999 through 2008 using searchable PDF versions of these publications. As far as could be determined, this is the first application of this approach to the analysis of the text of IS academic journals. The use of the word “intuition” and related terms was categorized using coding consistent with Grounded Theory. The focus of this research was on the first two stages of Grounded Theory analysis: the development of codes and constructs. Saturation of coding was not reached; an extended review of these publications would be required to enable theory development. Over 400 instances of the use of “intuition” and related terms were found in the articles reviewed. The most prominent use of the term “intuition” was coded as “Intuition as Authority”, in which intuition was used to validate a research objective or finding, representing approximately 37 per cent of the codes assigned. The second most common coding occurred in research articles with mathematical analysis, representing about 19 per cent of the codes assigned, for example where a mathematical formulation or result was “intuitive”. The possibly most impactful use of the term “intuition” was “Intuition as Outcome”, representing approximately 7 per cent of all coding, which characterized research results as adding to the intuitive understanding of a research topic or phenomenon. This research contributes to a greater theoretical understanding of intuition, enabling insight into the use of intuition and the eventual development of a theory of the use of intuition in academic IS research publications. It also provides potential benefits to practitioners by providing insight into and validation of the use of intuition in IS management. Research directions include the creation of reflective and/or formative constructs for intuition in information systems research.
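The mechanical first pass of such a review, finding every occurrence of “intuition” and related terms in the extracted text, is easy to automate; the categorization itself was done manually with Grounded Theory coding. The sketch below only illustrates the search step, and the corpus variable is hypothetical.

```python
import re
from collections import Counter

# Matches "intuition", "intuitive", "intuitively", etc.
TERM = re.compile(r"\bintuit\w*", re.IGNORECASE)

def tally_mentions(documents):
    """Count occurrences of intuition-related terms per document,
    where documents maps article names to extracted plain text."""
    return Counter({name: len(TERM.findall(text))
                    for name, text in documents.items()})

# Hypothetical usage with text extracted from searchable PDFs:
# counts = tally_mentions({"MISQ_2005_3_2": "...", "ISR_2008_19_1": "..."})
```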

Relevance: 100.00%

Abstract:

How effective are multi-stakeholder scenario-building processes at bringing diverse actors together and creating a policy-making tool to support sustainable development and promote food security in the developing world under climate change? The effectiveness of a participatory scenario development process hinges on ‘boundary work’ that links actors and organizations involved in generating knowledge on the one hand, and practitioners and policymakers who take actions based on that knowledge on the other. This study reports on the application of criteria for effective boundary work to a multi-stakeholder scenarios process in East Africa that brought together a range of regional agriculture and food systems actors. This analysis has enabled us to evaluate the extent to which these scenarios were seen by the different actors as credible, legitimate and salient, and thus more likely to be useful. The analysis has shown gaps and opportunities for improvement on these criteria, such as the quantification of scenarios, attention to translating and communicating the results through various channels, and new approaches to enable a more inclusive and diverse group of participants. We conclude that applying boundary work criteria to multi-stakeholder scenarios processes can do much to increase the likelihood of developing more appropriate sustainable development and food security policies.

Relevance: 100.00%

Abstract:

A plasma source, sustained by the application of a floating high voltage (±15 kV) to parallel-plate electrodes at 50 Hz, has been achieved in a helium/air mixture at atmospheric pressure (P = 10⁵ Pa) contained in a zip-locked plastic package placed in the electrode gap. Some of the physical and antimicrobial properties of this apparatus were established with a view to ascertaining its performance as a prototype for the disinfection of fresh produce. The current–voltage (I–V) and charge–voltage (Q–V) characteristics of the system were measured as a function of gap distance d, in the range 3 × 10³ ≤ Pd ≤ 1.0 × 10⁴ Pa m. The electrical measurements showed this plasma source to exhibit the characteristic behaviour of a dielectric barrier discharge in the filamentary mode, and its properties could be accurately interpreted by the two-capacitances-in-series model. The power consumed by the discharge and the reduced field strength were found to decrease quadratically from 12.0 W to 4.5 W and linearly from 140 Td to 50 Td, respectively, over the range studied. Emission spectra of the discharge were recorded on a relative intensity scale, and the dominant spectral features could be assigned to strong vibrational bands in the 2+ and 1− systems of N2 and N2+, respectively, with other weak signatures from the NO and OH radicals and the N+, He and O atomic species. Absolute spectral intensities were also recorded and interpreted by comparison with non-equilibrium synthetic spectra generated by the computer code SPECAIR. At an inter-electrode gap of 0.04 m, this comparison yielded typical values for the electron, vibrational and translational (gas) temperatures of (4980 ± 100) K, (2700 ± 200) K and (300 ± 100) K, respectively, and an electron density of 1.0 × 10¹⁷ m⁻³. A Boltzmann plot also provided a value of (3200 ± 200) K for the vibrational temperature. The antimicrobial efficacy was assessed by studying the resistance of both Escherichia coli K12 and its isogenic mutants in soxR, soxS, oxyR, rpoS and dnaK, selected to identify possible cellular responses and targets related to a 5 min exposure to the active gas in proximity to, but not directly in, the path of the discharge filaments. Both the parent strain and the mutant populations were significantly reduced, by more than 1.5 log cycles, under these conditions, showing the potential of the system. Post-treatment storage studies showed that some transcription regulators and specific genes related to oxidative stress play an important role in the E. coli repair mechanism and that plasma exposure affects specific cell regulator systems.
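The two-capacitances-in-series interpretation, and the standard way mean discharge power is extracted from charge-voltage data, can be stated compactly. These are textbook dielectric-barrier-discharge relations, included here for orientation rather than quoted from the paper.

```latex
% Dielectric barrier (C_d) and gas gap (C_g) act as capacitances in series:
C_{\mathrm{cell}} = \frac{C_d \, C_g}{C_d + C_g}

% Mean power dissipated by the discharge, from the area of the Q-V
% Lissajous figure traversed once per cycle at driving frequency f:
P = f \oint V \, \mathrm{d}Q
```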

Relevance: 100.00%

Abstract:

This study compared fat and fatty acids in cooked retail chicken meat from conventional and organic systems. Fat contents were 1.7, 5.2, 7.1 and 12.9 g/100 g cooked weight in skinless breast, breast with skin, skinless leg and leg with skin, respectively, with organic meat containing less fat overall (P < 0.01). Meat was rich in cis-monounsaturated fatty acids, although organic meat contained less than conventional meat (1850 vs. 2538 mg/100 g; P < 0.001). Organic meat was also lower (P < 0.001) in 18:3 n−3 (115 vs. 180 mg/100 g) and, whilst it contained more (P < 0.001) docosahexaenoic acid (30.9 vs. 13.7 mg/100 g), this was due to the large effect of one supermarket. This system-by-supermarket interaction suggests that poultry meat labelled as organic is not a guarantee of higher long-chain n−3 fatty acids. Overall there were few major differences in fatty acid contents/profiles between organic and conventional meat that were consistent across all supermarkets.