849 results for Exclusion process, Multi-species, Multi-scale modelling
Abstract:
We report on the experimental characterisation of laser-driven ion beams using a Thomson Parabola Spectrometer (TPS) equipped with trapezoidally shaped electric plates, proposed by Gwynne et al. [Rev. Sci. Instrum. 85, 033304 (2014)]. While a pair of extended (30 cm long) electric plates was able to produce a significant increase in the separation between neighbouring ion species at high energies, deploying a trapezoidal design circumvented the spectral clipping at the low-energy end of the ion spectra. The shape of the electric plate was chosen carefully, considering the range of detectable ion energies and species for the given spectrometer configuration. Analytical tracing of the ion parabolas closely matches the experimental data, which suggests a minimal effect of fringe fields on the ions escaping close to the wedged edge of the electrode. The analytical formulae were derived including the relativistic correction required to characterise high-energy ions with such a spectrometer.
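For context, the deflections in a conventional TPS with uniform parallel fields obey the familiar textbook relations below. These are the non-relativistic, straight-plate expressions, not the trapezoidal-plate or relativistically corrected formulae derived in the paper.

```latex
% Ion of charge q, mass m, velocity v; magnetic field B acting over length L_B
% with drift distance D_B to the detector; electric field E over L_E with drift D_E.
x = \frac{q B L_B \left( D_B + L_B/2 \right)}{m v}, \qquad
y = \frac{q E L_E \left( D_E + L_E/2 \right)}{m v^2}
% Eliminating v shows that each species (fixed m/q) falls on a parabola:
y = \frac{E L_E \left( D_E + L_E/2 \right)}{B^2 L_B^2 \left( D_B + L_B/2 \right)^2}
    \, \frac{m}{q} \, x^2
```

The m/q prefactor is why species with equal charge-to-mass ratio trace overlapping parabolas on the detector.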
Abstract:
Ongoing developments in laser-driven ion acceleration warrant appropriate modifications to the standard Thomson Parabola Spectrometer (TPS) arrangement in order to match the diagnostic requirements associated with the particular and distinctive properties of laser-accelerated beams. Here we present an overview of recent developments of the TPS diagnostic by our group, aimed at enhancing the capability to diagnose multi-species, high-energy ion beams. To facilitate discrimination between ions with the same Z/A, a recursive differential filtering technique was implemented at the TPS detector, allowing only one of the overlapping ion species to reach the detector across the entire energy range detectable by the TPS. To mitigate the issue of overlapping ion traces towards the higher-energy part of the spectrum, an extended, trapezoidal electric-plate design was conceived and then demonstrated experimentally. The design achieves high energy resolution at high energies without sacrificing the lower-energy part of the spectrum. Finally, a novel multi-pinhole TPS design is discussed that would allow angularly resolved, complete spectral characterization of high-energy, multi-species ion beams.
Abstract:
This work represents an original contribution to the methodology for ecosystem model development, as well as the first attempt at an end-to-end (E2E) model of the Northern Humboldt Current Ecosystem (NHCE). The main purpose of the developed model is to provide a tool for ecosystem-based management and decision making, which is why the credibility of the model is essential; this credibility can be assessed through confrontation with data. Additionally, the NHCE exhibits high climatic and oceanographic variability at several scales, the major source of interannual variability being the interruption of the upwelling seasonality by the El Niño Southern Oscillation, which has direct effects on larval survival and fish recruitment success. Fishing activity can also be highly variable, depending on the abundance and accessibility of the main fishery resources. This context raises the two main methodological questions addressed in this thesis, through the development of an end-to-end model coupling the high-trophic-level model OSMOSE to the hydrodynamic and biogeochemical model ROMS-PISCES: i) how to calibrate ecosystem models using time series data, and ii) how to incorporate the impact of the interannual variability of the environment and fishing. First, this thesis highlights some issues related to the confrontation of complex ecosystem models with data and proposes a methodology for a sequential, multi-phase calibration of ecosystem models. We propose two criteria to classify the parameters of a model: the model dependency and the time variability of the parameters. These criteria, along with the availability of approximate initial estimates, are then used as decision rules to determine which parameters need to be estimated, and their precedence order in the sequential calibration process.
Additionally, a new Evolutionary Algorithm designed for the calibration of stochastic models (e.g. Individual-Based Models) and optimized for maximum likelihood estimation was developed and applied to the calibration of the OSMOSE model to time series data. The environmental variability is explicit in the model: the ROMS-PISCES model forces the OSMOSE model and drives potential bottom-up effects up the food web through plankton and fish trophic interactions, as well as through changes in the spatial distribution of fish. The latter effect was taken into account using presence/absence species distribution models, which are traditionally assessed through a confusion matrix and the statistical metrics associated with it. However, when considering the prediction of the habitat over time, the variability in the spatial distribution of the habitat can be summarized and validated using the emerging patterns from the shape of the spatial distributions. We modeled the potential habitat of the main species of the Humboldt Current Ecosystem using several sources of information (fisheries, scientific surveys and satellite monitoring of vessels) jointly with environmental data from remote sensing and in situ observations, from 1992 to 2008. The potential habitat was predicted over the study period with monthly resolution, and the model was validated using quantitative and qualitative information on the system with a pattern-oriented approach. The final ROMS-PISCES-OSMOSE E2E ecosystem model for the NHCE was calibrated using our evolutionary algorithm and a likelihood approach to fit monthly time series data of landings, abundance indices and catch-at-length distributions from 1992 to 2008. To conclude, some potential applications of the model for fishery management are presented, and their limitations and perspectives are discussed.
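The calibration idea, an evolutionary search over parameters in which each candidate is scored by a likelihood averaged over several stochastic replicates so that model noise does not mislead selection, can be sketched as follows. This is a minimal illustration with a toy one-parameter stochastic model, not the thesis's algorithm or the OSMOSE parameterisation; every name and setting below is invented.

```python
import random
import statistics

def log_likelihood(params, data, replicates=8):
    """Score a parameter vector against time series data, averaging the
    log-likelihood over stochastic replicates of a toy linear-growth model."""
    rate, = params
    if rate <= 0:
        return float("-inf")
    lls = []
    for _ in range(replicates):
        # toy stochastic model: linear trend plus Gaussian process noise
        sim = [rate * t + random.gauss(0.0, 0.1) for t in range(len(data))]
        # Gaussian log-likelihood up to a constant (negative squared error)
        lls.append(-sum((d - s) ** 2 for d, s in zip(data, sim)))
    return statistics.mean(lls)

def evolve(data, pop_size=20, generations=40, sigma=0.2, seed=1):
    """(mu + lambda)-style evolutionary calibration: truncation selection on
    averaged likelihood, Gaussian mutation of the surviving parents."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0.1, 3.0)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda p: log_likelihood(p, data), reverse=True)
        parents = scored[: pop_size // 4]           # keep the best quarter
        pop = [p[:] for p in parents]
        while len(pop) < pop_size:                  # refill with mutated children
            parent = rng.choice(parents)
            pop.append([max(1e-6, g + rng.gauss(0.0, sigma)) for g in parent])
    return max(pop, key=lambda p: log_likelihood(p, data))
```

Averaging over replicates is the key design choice for stochastic models: a single noisy likelihood evaluation would let lucky runs of a bad parameter survive selection.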
Abstract:
This thesis presents quantitative studies of T cell and dendritic cell (DC) behaviour in mouse lymph nodes (LNs) in the naive state and following immunisation. These processes are of importance and interest in basic immunology, and better understanding could improve both diagnostic capacity and therapeutic manipulations, potentially helping to produce more effective vaccines or to develop treatments for autoimmune diseases. The problem is also interesting conceptually, as it is relevant to other fields where 3D movement of objects is tracked at a discrete scanning interval. A general immunology introduction is presented in chapter 1. In chapter 2, I apply quantitative methods to multi-photon imaging data to measure how T cells and DCs are spatially arranged in LNs. This has previously been studied to describe differences between the naive and immunised states and as an indicator of the magnitude of the immune response in LNs, but previous analyses have been largely descriptive. The quantitative analysis shows that some of the previous conclusions may have been premature. In chapter 3, I use Bayesian state-space models to test hypotheses about the mode of T cell search for DCs. A two-state mode of movement, where T cells are classified as either interacting with a DC or freely migrating, is supported over a model where T cells home in on DCs at a distance through, for example, the action of chemokines. In chapter 4, I study whether T cell migration is linked to the geometric structure of the fibroblast reticular cell (FRC) network. I find support for the hypothesis that movement is constrained to the FRC network over an alternative 'random walk with persistence time' model, where cells would move randomly, with short-term persistence driven by a hypothetical T cell-intrinsic 'clock'. I also present unexpected results on the FRC network geometry.
Finally, a quantitative method is presented for addressing some measurement biases inherent to multi-photon imaging. In all three chapters, novel findings are made, and the methods developed have the potential for further use to address important problems in the field. In chapter 5, I present a summary and synthesis of results from chapters 3-4 and a more speculative discussion of these results and potential future directions.
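The two-state idea from chapter 3, separating slow, DC-engaged steps from fast, freely migrating ones, can be illustrated with a much simpler stand-in than a Bayesian state-space model: a two-centre clustering of track step speeds. The centres, labels and data below are hypothetical and purely for illustration.

```python
def two_state_classify(speeds, iters=50):
    """Cluster step speeds (e.g. um/min from cell tracks) into two states:
    slow ('interacting') vs fast ('migrating'), via 1-D 2-means."""
    c = [min(speeds), max(speeds)]            # initial centres at the extremes
    for _ in range(iters):
        groups = ([], [])
        for s in speeds:
            # bool indexes the tuple: True (1) means closer to the fast centre
            groups[abs(s - c[1]) < abs(s - c[0])].append(s)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    labels = ["interacting" if abs(s - c[0]) <= abs(s - c[1]) else "migrating"
              for s in speeds]
    return c, labels
```

A real analysis would model the full displacement time series with transition probabilities between states, rather than classifying each step independently; this sketch only conveys the two-state intuition.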
Abstract:
This report documents the proceedings of the New Delhi workshop on the SSF Guidelines (Voluntary Guidelines for Securing Sustainable Small-scale Fisheries in the Context of Food Security and Poverty Eradication). The workshop brought together 95 participants from 13 states, representing civil society organizations, governments, FAO, and fishworker organizations from both the marine and inland fisheries sectors. The report will be useful to fishworker organizations, researchers, policy makers, members of civil society, and anyone interested in small-scale fisheries, tenure rights, social development, livelihoods, post-harvest and trade, and disasters and climate change.
Abstract:
The status of five species of commercially exploited sharks within the Great Barrier Reef Marine Park (GBRMP) and south-east Queensland was assessed using a data-limited approach. Annual harvest rate, U, estimated empirically from tagging between 2011 and 2013, was compared with an analytically derived proxy for optimal equilibrium harvest rate, UMSY Lim. Median estimates of U for three principal retained species, Australian blacktip shark, Carcharhinus tilstoni, spot-tail shark, Carcharhinus sorrah, and spinner shark, Carcharhinus brevipinna, were 0.10, 0.06 and 0.07 year-1, respectively. Median U values for two retained, non-target species, pigeye shark, Carcharhinus amboinensis, and Australian sharpnose shark, Rhizoprionodon taylori, were 0.27 and 0.01 year-1, respectively. For all species except the Australian blacktip shark, the median ratio U/UMSY Lim was <1. The high vulnerability of the Australian blacktip shark to fishing, combined with its life history characteristics, meant that UMSY Lim was low (0.04-0.07 year-1) and that U/UMSY Lim was likely to be >1. Harvest of the Australian blacktip shark above UMSY could place this species at greater risk of localised depletion in parts of the GBRMP. Results of the study indicated that the much higher catches, and presumably higher U, during the early 2000s were likely unsustainable. The unexpectedly high level of U on the pigeye shark indicated that output-based management controls may not have been effective in reducing harvest levels on all species, particularly those caught incidentally by other fishing sectors, including the recreational sector. © 2016 Elsevier B.V.
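The comparison at the heart of the assessment, an empirically estimated harvest rate U against the analytical proxy UMSY Lim, reduces to a simple ratio once U is in hand. The sketch below uses a crude tag-return estimate of U, deliberately ignoring tag shedding, reporting rates and natural mortality (which a real assessment must account for), and hypothetical numbers rather than the study's data.

```python
def harvest_ratio(recaptures, releases, u_msy_lim):
    """Crude annual harvest rate from a tagging study, and its ratio to the
    optimal-harvest proxy. Illustrative only: a real estimate of U corrects
    for tag shedding, non-reporting and natural mortality."""
    u = recaptures / releases       # naive Petersen-style harvest rate
    return u, u / u_msy_lim         # ratio > 1 flags harvest above the proxy

# hypothetical numbers, not the study's data
u, ratio = harvest_ratio(10, 100, 0.05)
```

With these made-up inputs the ratio exceeds 1, which in the study's framing would flag the species for management concern.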
Abstract:
When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but is run, and the changes of system states can be observed at any point in time. This provides an insight into system dynamics rather than just predicting the output of a system based on specific inputs. Simulation is not a decision making tool but a decision support tool, allowing better informed decisions to be made. Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification. Only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system, or to make predictions about a target system's performance. It can be viewed as an artificial white room which allows one not only to gain insight but also to test new theories and practices without disrupting the daily routine of the focal organisation. What you can expect to gain from a simulation study is very well summarised by FIRMA (2000): if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, then the model allows you to answer some of the following questions:
· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?
The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. These predictions involve showing trends, rather than giving precise and absolute predictions of the target system's performance. The numerical results of a simulation experiment on their own are most often not very useful and need to be rigorously analysed with statistical methods. These results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best practice guidelines. One needs a good working knowledge of the behaviour of the real system to be able to fully exploit the understanding gained from simulation experiments. The goal of this chapter is to introduce the newcomer to what we think is a valuable addition to the toolset of analysts and decision makers. We will give you a summary of the information we have gathered from the literature and of the first-hand experience we have gained during the last five years while obtaining a better understanding of this exciting technology. We hope that this will help you to avoid some of the pitfalls that we unwittingly encountered. Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science, with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements, to prepare you for Section 4, where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system.
Section 6 provides a collection of resources for further studies and finally in Section 7 we will conclude the chapter with a short summary.
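The definition given above, a set of rules describing how the system state changes over time, which is run rather than solved, can be made concrete with a minimal agent-based sketch (an illustrative toy, not the chapter's example): agents hold a state and a step rule, and an observer records the system state at every tick.

```python
import random

class Agent:
    """A trivially simple agent: its state is a position on a 1-D lattice,
    and its rule is a random step of +/-1 per tick."""
    def __init__(self, pos, rng):
        self.pos = pos
        self.rng = rng

    def step(self):
        self.pos += self.rng.choice((-1, 1))

def run(n_agents=5, n_steps=100, seed=42):
    """Run the model: apply each agent's rule every tick and record the
    full system state, so dynamics can be observed at any point in time."""
    rng = random.Random(seed)
    agents = [Agent(0, rng) for _ in range(n_agents)]
    history = []
    for _ in range(n_steps):
        for a in agents:
            a.step()
        history.append([a.pos for a in agents])
    return history
```

The point of the sketch is structural: the model is "run" tick by tick, and the recorded history, not a closed-form solution, is what gets analysed statistically afterwards.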
Abstract:
Part 18: Optimization in Collaborative Networks
Abstract:
Understanding the fluctuations in population abundance is a central question in fisheries. The sardine fishery is of great importance to Portugal, is data-rich, and is of primary concern to fisheries managers. In Portugal, sub-stocks of Sardina pilchardus (sardine) are found in different regions: the Northwest (IXaCN), Southwest (IXaCS) and South coast (IXaS-Algarve). Each of these sardine sub-stocks is affected differently by a unique set of climate and ocean conditions, mainly during larval development and recruitment, which in turn affect the sardine fishery in the short term. Taking this hypothesis into consideration, we examined the effects of hydrographic (river discharge), sea surface temperature, wind-driven phenomena, upwelling, climatic (North Atlantic Oscillation) and fisheries (fishing effort) variables on S. pilchardus catch rates (landings per unit effort, LPUE, as a proxy for sardine biomass). A 20-year time series (1989-2009) was used for the different subdivisions of the Portuguese coast (sardine sub-stocks). For this analysis a multi-model approach was used, applying different time series models for data fitting (Dynamic Factor Analysis, Generalised Least Squares) and forecasting (Autoregressive Integrated Moving Average), as well as Surplus Production stock assessment models. The different models were evaluated and compared, and the most important variables explaining changes in LPUE were identified. The type of relationship between sardine catch rates and environmental variables varied across regional scales due to region-specific recruitment responses. Seasonality plays an important role in sardine variability within the three study regions. In IXaCN, autumn (the season with minimum spawning activity and minimum larval and egg concentrations) SST, northerly wind and wind magnitude were negatively related to LPUE. In IXaCS, none of the explanatory variables tested was clearly related to LPUE.
In IXaS-Algarve (South Portugal), both spring (the period when large abundances of larvae are found) northerly wind and wind magnitude were negatively related to LPUE, revealing that environmental effects coincide with the regional peak in spawning time. Overall, the results suggest that management of small, short-lived pelagic species, such as sardine quotas/sustainable yields, should be adapted to a regional scale because of regional environmental variability.
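As a flavour of the forecasting side of the multi-model approach, a minimal autoregressive fit can be written in a few lines. This is a bare AR(1) estimated by least squares on a mean-centred series, a much-simplified stand-in for the ARIMA models actually used, with made-up data rather than the LPUE series.

```python
def fit_ar1(series):
    """Least-squares AR(1) coefficient phi on a mean-centred series:
    x[t] ~ phi * x[t-1], with x = series - mean."""
    mean = sum(series) / len(series)
    x = [v - mean for v in series]
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den, mean

def forecast(series, steps=3):
    """Iterate the fitted AR(1) forward from the last observation."""
    phi, mean = fit_ar1(series)
    out, last = [], series[-1] - mean
    for _ in range(steps):
        last = phi * last          # each step shrinks toward the mean
        out.append(last + mean)
    return out
```

An ARIMA model adds differencing and moving-average terms on top of this autoregressive core, which is why it can track trends and seasonality that a plain AR(1) cannot.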