865 results for multi-environments experiments


Relevance: 30.00%

Abstract:

Ocean acidification and global warming are occurring concomitantly, yet few studies have investigated how organisms will respond to increases in both temperature and CO2. Intertidal microcosms were used to examine growth, shell mineralogy and survival of post-larvae of two intertidal barnacles, Semibalanus balanoides and Elminius modestus, at two temperatures (14 and 19°C) and two CO2 concentrations (380 and 1,000 ppm), fed a mixed diatom-flagellate diet at 15,000 cells ml-1 with a flow rate of 10 ml min-1. Control growth rates, measured as operculum diameter, were 14 ± 8 µm day-1 and 6 ± 2 µm day-1 for S. balanoides and E. modestus, respectively. Subtle but significant decreases in E. modestus growth rate were observed under high CO2, but neither elevated temperature nor CO2 affected shell calcium content or survival. S. balanoides exhibited no clear change in growth rate but did show a large reduction in shell calcium content and survival under elevated temperature and CO2. These results suggest that a decrease of 0.4 pH(NBS) units alone would not be sufficient to directly impact the survival of barnacles during the first month post-settlement. However, in conjunction with a 4-5°C increase in temperature, significant changes to the biology of these organisms appear likely to ensue.

Relevance: 30.00%

Abstract:

Increased atmospheric CO2 concentrations are causing greater dissolution of CO2 into seawater and are ultimately responsible for today's ongoing ocean acidification. We manipulated seawater acidity by addition of HCl and by increasing the CO2 concentration, and observed that two coastal harpacticoid copepods, Amphiascoides atopus and Schizopera knabeni, were both more sensitive to increased acidity when it was generated by CO2. The present study indicates that copepods living in environments more prone to hypercapnia, such as the mudflats where S. knabeni lives, may be less sensitive to future acidification. Ocean acidification is also expected to alter the toxicity of waterborne metals by influencing their speciation in seawater. CO2 enrichment did not affect the free-ion concentration of Cd but did increase the free-ion concentration of Cu. Antagonistic toxicity was observed between CO2 and Cd, Cu and the Cu free ion in A. atopus. This interaction could be due to competition between H+ and metals for binding sites.

Relevance: 30.00%

Abstract:

Ocean acidification (OA) resulting from anthropogenic emissions of carbon dioxide (CO2) has already lowered, and is predicted to further lower, surface ocean pH. There is a particular need to study the effects of OA on organisms living in cold-water environments because of the higher solubility of CO2 at lower temperatures. Mussel larvae (Mytilus edulis) and shrimp larvae (Pandalus borealis) were kept under an ocean acidification scenario predicted for the year 2100 (pH 7.6) and compared against identical batches of organisms held at the current oceanic pH of 8.1, which acted as a control. The temperature was held constant at 10°C in the mussel experiment and at 5°C in the shrimp experiment. There was no marked effect of the low-pH (pH 7.6) treatment on fertilization success, development time, or abnormality at the D-shell stage, or on feeding of mussel larvae. Mytilus edulis larvae were still able to develop a shell in seawater undersaturated with respect to aragonite (a mineral form of CaCO3), but low-pH larvae were significantly smaller than controls. After 2 mo of exposure the mussels were 28% smaller in the pH 7.6 treatment than in the control. The experiment with Pandalus borealis larvae ran from 1 through 35 days post hatch. Survival of shrimp larvae was not reduced after 5 wk of exposure to pH 7.6, but a significant delay in zoeal progression (development time) was observed.

Relevance: 30.00%

Abstract:

Maerl community respiration, photosynthesis and calcification were measured seasonally in the Bay of Brest (France). The dynamics of oxygen, carbon and carbonate fluxes at the water-sediment interface were assessed using benthic chambers. Community respiration (CR) fluctuated with the seasonal changes in water temperature, from 1.5 mmol C m-2 h-1 in winter to 8.7 mmol C m-2 h-1 in summer. Mean gross community production (GCP) varied significantly among seasons, according to incident irradiance and temperature, from 3.4 mmol C m-2 h-1 in winter to 12.7 mmol C m-2 h-1 in summer. Mean annual Pmax for the P-E curve was estimated at 13.3 mmol C m-2 h-1. Carbonate precipitation occurred only during light incubations and varied seasonally from 0.7 mmol CaCO3 m-2 h-1 in winter to 4.2 mmol CaCO3 m-2 h-1 in summer. Mean annual Pmax was 3.2 mmol CaCO3 m-2 h-1. Annual CR was estimated at 407.4 g C m-2 yr-1 and GCP at 240.9 g C m-2 yr-1. Maerl communities are therefore heterotrophic systems (GCP:CR = 0.6) and a source of CO2 for surrounding environments. In addition, CO2 released by calcification averaged 39.2 g C m-2 yr-1. Maerl community annual carbonate production was estimated at 486.7 g CaCO3 m-2 yr-1, making these communities one of the most important carbonate producers in shallow coastal waters.
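The trophic ratio and net community balance follow directly from the annual figures quoted above; the short sketch below (Python, purely illustrative) reproduces that arithmetic, and the final combined figure is a rough derivation rather than a number stated in the abstract.

```python
# Simple annual-budget arithmetic using the figures reported in the abstract above.
gcp = 240.9                     # gross community production, g C m-2 yr-1
cr = 407.4                      # community respiration, g C m-2 yr-1
co2_from_calcification = 39.2   # CO2 released by calcification, g C m-2 yr-1

trophic_ratio = gcp / cr                 # GCP:CR ~ 0.6 -> net heterotrophy
net_community_production = gcp - cr      # negative -> community respires more C than it fixes
# Rough combined CO2 source (derived here, not stated in the abstract):
total_co2_source = -net_community_production + co2_from_calcification

print(f"GCP:CR = {trophic_ratio:.2f}")
print(f"Net community production = {net_community_production:.1f} g C m-2 yr-1")
print(f"Approximate CO2 source to surroundings = {total_co2_source:.1f} g C m-2 yr-1")
```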

Relevance: 30.00%

Abstract:

The effect of short-term (5 days) exposure to CO2-acidified seawater (year 2100 predicted values, ocean pH = 7.6) on key aspects of the function of the intertidal common limpet Patella vulgata (Gastropoda: Patellidae) was investigated. Changes in extracellular acid-base balance were almost completely compensated by an increase in bicarbonate ions. A concomitant increase in haemolymph Ca2+ and visible shell dissolution implicated passive shell dissolution as the bicarbonate source. Analysis of the radula using SEM revealed that individuals from the hypercapnic treatment showed an increase in the number of damaged teeth, and in the extent to which such teeth were damaged, compared with controls. As radula teeth are composed mainly of chitin, acid dissolution seems unlikely, and so the proximate cause of damage is unknown. There was no hypercapnia-related change in metabolism (O2 uptake) or feeding rate, also discounting the possibility that tooth damage resulted from a CO2-related increase in grazing. We conclude that although the limpet appears to have the physiological capacity to maintain its extracellular acid-base balance, metabolism and feeding rate over a 5-day exposure to acidified seawater, radular damage incurred during this time could still compromise feeding in the longer term, in turn decreasing the top-down ecosystem control that P. vulgata exerts over rocky shore environments.

Relevance: 30.00%

Abstract:

Graphics Processing Units (GPUs) are becoming popular accelerators in modern High-Performance Computing (HPC) clusters. Installing GPUs on every node of a cluster is not efficient, resulting in high costs and power consumption as well as underutilisation of the accelerators. The research reported in this paper is motivated by the use of a few physical GPUs, giving cluster nodes on-demand access to remote GPUs for a financial risk application. We hypothesise that sharing GPUs between several nodes, referred to as multi-tenancy, reduces the execution time and energy consumed by an application. Two data transfer modes between the CPU and the GPUs, namely concurrent and sequential, are explored. The key result from the experiments is that multi-tenancy with a few physical GPUs using sequential data transfers lowers the execution time and the energy consumed, thereby improving the overall performance of the application.
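The abstract contrasts concurrent and sequential CPU-GPU data transfers under multi-tenancy. The toy model below is not the paper's implementation and all timings are invented; it only illustrates why serialising transfers over a shared link can shorten the overall makespan when the tenants' kernels also execute one at a time on the shared GPU.

```python
# Toy comparison of the two data-transfer modes for a GPU shared by several tenants.

def sequential_makespan(n_tenants, transfer_s, compute_s):
    """Transfers take the shared link one at a time; kernels run one at a time on
    the shared GPU, so a tenant's compute can overlap later tenants' copies."""
    link_free, gpu_free = 0.0, 0.0
    for _ in range(n_tenants):
        copy_done = link_free + transfer_s
        link_free = copy_done
        gpu_free = max(gpu_free, copy_done) + compute_s
    return gpu_free

def concurrent_makespan(n_tenants, transfer_s, compute_s):
    """All tenants copy at once and share the link bandwidth, so no kernel can
    start before n_tenants * transfer_s, after which kernels run back to back."""
    copy_done = n_tenants * transfer_s
    return copy_done + n_tenants * compute_s

n, t, c = 4, 2.0, 5.0   # 4 tenants, 2 s per transfer at full bandwidth, 5 s per kernel (made up)
print("sequential transfers:", sequential_makespan(n, t, c))  # 22.0 s
print("concurrent transfers:", concurrent_makespan(n, t, c))  # 28.0 s
```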

Relevance: 30.00%

Abstract:

Collisionless shocks, that is, shocks mediated by electromagnetic processes, are common in space physics and in astrophysics. They are found in a great variety of objects and environments: magnetospheric and heliospheric shocks, supernova remnants, pulsar winds and their nebulæ, active galactic nuclei, gamma-ray bursts and shock waves in clusters of galaxies. Collisionless shock microphysics enters at different stages of shock formation, shock dynamics and particle energization and/or acceleration. The shock phenomenon turns out to be a multi-scale non-linear problem in time and space, further complicated in astrophysical environments by the impact of high-energy cosmic rays. This review addresses the physics of shock formation, shock dynamics and particle acceleration based on a close examination of available multi-wavelength and in situ observations, and of analytical and numerical developments. Particular emphasis is placed on the different instabilities triggered during shock formation and in association with particle acceleration processes, with regard to the properties of the background upstream medium. Among the most important parameters, the background magnetic field, through the magnetization and its obliquity, appears to be the dominant one. The shock velocity, which can reach relativistic values, also has a strong impact on the development of the micro-instabilities and on the fate of particle acceleration. Recent developments in laboratory shock experiments have started to bring new insights into the physics of space plasma and astrophysical shock waves. A special section is dedicated to new laser plasma experiments probing shock physics.

Relevance: 30.00%

Abstract:

Providing good customer service inexpensively is a problem commonly faced by managers of service operations. To tackle this problem, managers must perform four tasks: forecast customer demand for the service; translate these forecasts into employee requirements; develop a labor schedule that provides appropriate numbers of employees at appropriate times; and control the delivery of the service in real time. This paper focuses on the translation of forecasts of customer demand into employee requirements. Specifically, it presents and evaluates two methods for determining desired staffing levels. One is a traditional approach to the task; the other, by using modified customer arrival rates, offers a better means of accounting for the multi-period impact of customer service. To calculate the modified arrival rates, the latter method reduces the actual customer arrival rate for a period to account for customers who arrive in that period but have some of their service performed in subsequent periods, and increases it to account for customers who arrived in earlier periods but have some of their service performed in that period. In an experiment simulating 13,824 service delivery environments, the new method demonstrated its superiority by serving 2.74% more customers within the specified waiting time limit while using 7.57% fewer labor hours.
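A minimal sketch of the modified-arrival-rate idea described above, not the paper's exact formulation: part of each period's workload is shifted into the period in which the service is actually performed before staffing levels are set. The spill-over fractions, service rate and staffing rule below are invented for illustration.

```python
import math

def modified_rates(arrivals, spill):
    """arrivals[p]: actual customer arrivals in period p.
    spill[p]: fraction of period p's work assumed to carry over into period p+1
    (a hypothetical one-period carry-over, for illustration only)."""
    modified, carried_in = [], 0.0
    for p, a in enumerate(arrivals):
        carried_out = a * spill[p]
        modified.append(a - carried_out + carried_in)  # remove spill-out, add spill-in
        carried_in = carried_out
    return modified

def staff_needed(load, service_rate, target_utilisation=0.85):
    """Crude staffing rule: enough servers to keep the offered load per server
    below a target utilisation (a stand-in for an Erlang-C style calculation)."""
    return math.ceil(load / (service_rate * target_utilisation))

arrivals = [30, 50, 40, 20]        # customers arriving per period (invented)
spill    = [0.2, 0.3, 0.25, 0.0]   # fraction of each period's work spilling forward (invented)
loads = modified_rates(arrivals, spill)
print([round(m, 1) for m in loads])                        # modified per-period loads
print([staff_needed(m, service_rate=6.0) for m in loads])  # staff required per period
```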

Relevance: 30.00%

Abstract:

The power of computer game technology is currently being harnessed to produce “serious games”. These “games” are targeted at the education and training marketplace and employ key game-engine components, such as the graphics and physics engines, to produce realistic “digital-world” simulations of the real “physical world”. Many approaches are driven by the technology and often lack a firm pedagogical underpinning. The authors believe that the technological and pedagogical dimensions should be analysed and deployed together, with the pedagogical dimension providing the lead. This chapter explores the relationship between these two dimensions, examining how “pedagogy may inform the use of technology” and how various learning theories may be mapped onto the affordances of computer game engines. Autonomous and collaborative learning approaches are discussed. The design of a serious game is broken down into spatial and temporal elements. The spatial dimension is related to theories of knowledge structures, especially “concept maps”. The temporal dimension is related to “experiential learning”, especially the approach of Kolb. The multi-player aspect of serious games is related to theories of “collaborative learning”, which is explored through a discussion of “discourse” versus “dialogue”. Several general guiding principles are examined, such as the use of “metaphor” (including metaphors of space, embodiment, systems thinking, the internet and emergence). The topological design of a serious game is also highlighted. The discussion of pedagogy is related to various serious games we have recently produced and researched, and is presented in the hope of informing the “serious game community”.

Relevance: 30.00%

Abstract:

One of the core tasks of the virtual-manufacturing environment is to characterise the transformation of the state of material during each of the unit processes. This transformation in shape, material properties, etc. can only be reliably characterised through the use of models in a simulation context. Unfortunately, many manufacturing processes involve the material being treated in both the liquid and solid states, the transformation between which may be driven by heat transfer and/or electro-magnetic fields. The computational modelling of such processes, involving the interactions amongst various phenomena, is a considerable challenge. However, it must be addressed effectively if Virtual Manufacturing Environments are to become a reality! This contribution focuses upon one attempt to develop such a multi-physics computational toolkit. The approach uses a single discretisation procedure and provides for direct interaction amongst the component phenomena. The need to exploit parallel high-performance hardware is addressed so that simulation elapsed times can be brought within the realms of practicality. Examples of multi-physics modelling in relation to shape casting and solder joint formation reinforce the motivation for this work.
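Purely as an illustration of the “single discretisation procedure with direct interaction amongst the component phenomena” idea, and not the toolkit described in the abstract, the sketch below advances heat conduction and solidification on the same 1-D grid inside one time loop, feeding the latent heat of freezing straight back into the temperature field. All material values are invented.

```python
import numpy as np

nx, dx, dt = 50, 1e-3, 1e-3           # grid cells, cell size [m], time step [s]
alpha, latent_over_cp = 1e-5, 30.0    # thermal diffusivity [m^2/s], latent heat / heat capacity [K]
T_melt = 660.0                        # illustrative melting temperature

T = np.full(nx, 700.0)                # temperature field on the shared grid
T[0] = 300.0                          # chilled boundary
fs = np.zeros(nx)                     # solid-fraction field on the same grid

for step in range(2000):
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T_new = T + dt * alpha * lap                      # heat-conduction update
    # Direct interaction: cells cooling below T_melt freeze a little and
    # release latent heat back into the temperature field on the same mesh.
    freezing = (T_new < T_melt) & (fs < 1.0)
    dfs = np.where(freezing, np.minimum(0.01, 1.0 - fs), 0.0)
    fs += dfs
    T = T_new + latent_over_cp * dfs                  # latent-heat feedback
    T[0] = 300.0                                      # hold boundary condition

print("fully solidified cells:", int((fs >= 1.0).sum()))
```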

Relevance: 30.00%

Abstract:

The selection of a subset of requirements from among all the requirements previously defined by customers is an important process, repeated at the beginning of each development step when an incremental or agile software development approach is adopted. The set of selected requirements will be developed during the current iteration. This selection problem can be reformulated as a search problem, allowing its treatment with metaheuristic optimization techniques. This paper studies how to apply Ant Colony Optimization algorithms to select requirements. First, we describe the problem formally, extending an earlier formulation, and introduce a method based on Ant Colony System to find a variety of efficient solutions. The performance achieved by the Ant Colony System is compared with that of a Greedy Randomized Adaptive Search Procedure and a Non-dominated Sorting Genetic Algorithm by means of computational experiments carried out on two instances of the problem constructed from data provided by experts.
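A minimal sketch of how an Ant Colony System can be applied to requirement selection, here reduced to a single-objective "maximise value under a cost budget" form; the paper's actual formulation is richer, and all requirement values, costs and parameters below are invented.

```python
import random

values = [10, 6, 8, 4, 9, 3, 7]     # customer value of each requirement (made up)
costs  = [ 5, 3, 4, 2, 6, 1, 4]     # development cost of each requirement (made up)
budget = 12
n = len(values)

tau = [1.0] * n                      # pheromone per requirement
beta, rho, q0 = 2.0, 0.1, 0.9        # heuristic weight, evaporation, exploitation probability

def build_solution():
    """One ant builds a feasible selection, guided by pheromone and value/cost."""
    selected, spent, candidates = [], 0, set(range(n))
    while True:
        feasible = [r for r in candidates if spent + costs[r] <= budget]
        if not feasible:
            return selected
        scores = [tau[r] * (values[r] / costs[r]) ** beta for r in feasible]
        if random.random() < q0:                         # exploit the best option
            r = feasible[scores.index(max(scores))]
        else:                                            # explore proportionally
            r = random.choices(feasible, weights=scores)[0]
        selected.append(r)
        spent += costs[r]
        candidates.remove(r)

best, best_value = [], 0
for iteration in range(50):
    for ant in range(10):
        sol = build_solution()
        val = sum(values[r] for r in sol)
        if val > best_value:
            best, best_value = sol, val
    # Global pheromone update: evaporate, then reinforce the best-so-far selection.
    for r in range(n):
        tau[r] *= (1 - rho)
        if r in best:
            tau[r] += rho * best_value

print("selected requirements:", sorted(best), "value:", best_value)
```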

Relevance: 30.00%

Abstract:

Quantum sensors based on coherent matter-waves are precise measurement devices whose ultimate accuracy is achieved with Bose-Einstein condensates (BECs) in extended free fall. This is ideally realized in microgravity environments such as drop towers, ballistic rockets and space platforms. However, the transition from lab-based BEC machines to robust and mobile sources with comparable performance is a challenging endeavor. Here we report on the realization of a miniaturized setup, generating a flux of 4 x 10^5 quantum degenerate Rb-87 atoms every 1.6 s. Ensembles of 1 x 10^5 atoms can be produced at a 1 Hz rate. This is achieved by loading a cold atomic beam directly into a multi-layer atom chip that is designed for efficient transfer from laser-cooled to magnetically trapped clouds. The attained flux of degenerate atoms is on a par with current lab-based BEC experiments while offering significantly higher repetition rates. Additionally, the flux approaches that of current interferometers employing Raman-type velocity selection of laser-cooled atoms. The compact and robust design allows for mobile operation in a variety of demanding environments and paves the way for transportable high-precision quantum sensors.

Relevance: 30.00%

Abstract:

Network Virtualization is a key technology for the Future Internet, allowing the deployment of multiple independent virtual networks that use resources of the same underlying infrastructure. An important challenge in the dynamic provisioning of virtual networks resides in the optimal allocation of physical resources (nodes and links) to the requirements of virtual networks. This problem is known as Virtual Network Embedding (VNE). Previous research has focused on designing algorithms based on the optimization of a single objective. In contrast, in this work we present a multi-objective algorithm, called VNE-MO-ILP, for solving the dynamic VNE problem, which calculates an approximation of the Pareto front while simultaneously considering resource utilization and load balancing. Experimental results show evidence that the proposed algorithm is better than, or at least comparable to, a state-of-the-art algorithm. Two performance metrics were evaluated simultaneously: (i) virtual network request acceptance ratio and (ii) revenue/cost ratio. The sizes of the test networks used in the experiments show that the proposed algorithm scales well in execution time for networks of 84 nodes.
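A minimal sketch (not the VNE-MO-ILP algorithm itself) of the multi-objective ingredient described above: keeping the non-dominated set of candidate embeddings when two objectives are tracked at once. The candidate names and scores are invented; here lower is better for both objectives (resource usage and load imbalance).

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated candidates (an approximation of the Pareto front)."""
    front = []
    for name, score in candidates:
        if any(dominates(other, score) for _, other in candidates if other != score):
            continue
        front.append((name, score))
    return front

candidates = [
    ("embedding-A", (0.62, 0.30)),   # (resource usage, load imbalance) - made up
    ("embedding-B", (0.55, 0.45)),
    ("embedding-C", (0.70, 0.25)),
    ("embedding-D", (0.58, 0.50)),   # dominated by embedding-B
]
print(pareto_front(candidates))
```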

Relevance: 30.00%

Abstract:

Intelligent agents offer a new and exciting way of understanding the world of work. Agent-Based Simulation (ABS), one way of using intelligent agents, carries great potential for progressing our understanding of management practices and how they link to retail performance. We have developed simulation models based on research by a multi-disciplinary team of economists, work psychologists and computer scientists. We will discuss our experiences of implementing these concepts working with a well-known retail department store. There is no doubt that management practices are linked to the performance of an organisation (Reynolds et al., 2005; Wall & Wood, 2005). Best practices have been developed, but when it comes down to the actual application of these guidelines considerable ambiguity remains regarding their effectiveness within particular contexts (Siebers et al., forthcoming a). Most Operational Research (OR) methods can only be used as analysis tools once management practices have been implemented. Often they are not very useful for giving answers to speculative ‘what-if’ questions, particularly when one is interested in the development of the system over time rather than just the state of the system at a certain point in time. Simulation can be used to analyse the operation of dynamic and stochastic systems. ABS is particularly useful when complex interactions between system entities exist, such as autonomous decision making or negotiation. In an ABS model the researcher explicitly describes the decision process of simulated actors at the micro level. Structures emerge at the macro level as a result of the actions of the agents and their interactions with other agents and the environment. We will show how ABS experiments can deal with testing and optimising management practices such as training, empowerment or teamwork. Hence, questions such as “will staff setting their own break times improve performance?” can be investigated.
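The break-time question at the end of the paragraph above lends itself to a small illustration. The toy agent-based sketch below is in no way the authors' department-store model and every number in it is invented; it simply compares a fixed, shared break time with breaks chosen by individual staff agents and counts the customers served.

```python
import random

def simulate(self_chosen_breaks, minutes=480, n_staff=5, seed=1):
    """Tiny agent-based model: staff agents serve a queue except during their break."""
    rng = random.Random(seed)
    queue, served = 0, 0
    breaks = []
    for i in range(n_staff):
        if self_chosen_breaks:
            start = rng.randint(120, 300)   # each agent picks its own break time
        else:
            start = 240                     # everyone breaks at the same time
        breaks.append(range(start, start + 30))
    for minute in range(minutes):
        queue += rng.random() < 0.7         # roughly 0.7 customer arrivals per minute
        for i in range(n_staff):
            if minute in breaks[i]:
                continue                    # agent unavailable during its break
            if queue > 0 and rng.random() < 0.2:   # agent completes a service
                queue -= 1
                served += 1
    return served

print("fixed breaks :", simulate(False))
print("own breaks   :", simulate(True))
```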

Relevance: 30.00%

Abstract:

When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but is run, and the changes of system states can be observed at any point in time. This provides an insight into system dynamics rather than just predicting the output of a system based on specific inputs. Simulation is not a decision making tool but a decision support tool, allowing better informed decisions to be made.

Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification. Only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system, or to make predictions about a target system's performance. It can be viewed as an artificial white-room which allows one to gain insight, and to test new theories and practices, without disrupting the daily routine of the focal organisation.

What you can expect to gain from a simulation study is very well summarised by FIRMA (2000). The idea is that if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, this would allow you to answer some of the following questions:
· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?

The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. These predictions involve showing trends, rather than giving precise and absolute predictions of the target system's performance.

The numerical results of a simulation experiment on their own are most often not very useful and need to be rigorously analysed with statistical methods. These results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best practice guidelines. One needs a good working knowledge of the behaviour of the real system to be able to fully exploit the understanding gained from simulation experiments.

The goal of this chapter is to introduce the newcomer to what we think is a valuable asset in the toolset of analysts and decision makers. We will give you a summary of information we have gathered from the literature and of the experience we have gained first hand during the last five years, whilst obtaining a better understanding of this exciting technology. We hope that this will help you to avoid some pitfalls that we have unwittingly encountered.
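As a minimal illustration of "a set of rules that define how a system changes over time, given its current state", the generic toy below (not tied to any model discussed in this chapter) runs such a rule set rather than solving it, so the state can be inspected at every tick.

```python
import random

def step(state, rng):
    """One rule set: a customer arrives at random; one waiting customer is served per tick."""
    state["waiting"] += 1 if rng.random() < 0.6 else 0
    if state["waiting"] > 0:
        state["waiting"] -= 1
        state["served"] += 1
    return state

rng = random.Random(42)
state = {"waiting": 0, "served": 0}
for tick in range(100):            # run the model and observe the state over time
    state = step(state, rng)
print(state)
```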
Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science, with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements to prepare you for Section 4, where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system. Section 6 provides a collection of resources for further study and, finally, Section 7 concludes the chapter with a short summary.