970 results for Location-Allocation Models


Relevance:

30.00%

Publisher:

Abstract:

Methods for fault detection, diagnosis and isolation (Fault Detection and Isolation, FDI) based on analytical redundancy (that is, comparing the current behaviour of the process with the expected behaviour obtained from a mathematical model of it) are widely used for the diagnosis of systems when a mathematical model is available. An algorithm has been implemented to realise this analytical redundancy from the plant model, using the approach known as Structural Analysis.
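As a rough illustration of residual-based analytical redundancy (a generic sketch, not the thesis's Structural Analysis algorithm), the Python snippet below compares measured outputs against a simple discrete-time model prediction and flags a fault when the residual exceeds a threshold; the model matrices, threshold and the injected sensor fault are illustrative placeholders.

```python
import numpy as np

def detect_faults(u, y_meas, A, B, C, x0, threshold):
    """Flag time steps where the residual |y_meas - y_model| exceeds a threshold.

    u: (T, m) inputs, y_meas: (T, p) measured outputs,
    A, B, C: discrete-time state-space model, x0: initial state.
    """
    x = np.asarray(x0, dtype=float)
    flags = []
    for k in range(len(u)):
        y_model = C @ x                      # expected behaviour from the model
        residual = np.abs(y_meas[k] - y_model)
        flags.append(bool(np.any(residual > threshold)))
        x = A @ x + B @ u[k]                 # propagate the model one step
    return flags

# Tiny example with a stable first-order model (all values are illustrative).
A = np.array([[0.9]]); B = np.array([[0.1]]); C = np.array([[1.0]])
u = np.ones((50, 1))
y = np.array([[1 - 0.9**k] for k in range(50)])   # nominal response
y[30:] += 0.5                                      # injected sensor fault
print(detect_faults(u, y, A, B, C, x0=[0.0], threshold=0.2))
```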

Relevance:

30.00%

Publisher:

Abstract:

Task coordination and allocation in distributed environments has been an important focus of research in recent years, and these topics lie at the heart of multi-agent systems. Agents in such systems need to cooperate and to take the other agents into account in their actions and decisions. Moreover, agents must coordinate among themselves to accomplish complex tasks that require more than one agent to be completed. These tasks may be so complex that agents may not know the location of the tasks or the time remaining before the tasks become obsolete. Agents may need to use communication in order to learn about the tasks in the environment; otherwise, they may waste a great deal of time searching for tasks within the scenario. Similarly, the distributed decision-making process can become even more complex if the environment is dynamic, uncertain and real-time. In this dissertation we consider constrained, cooperative multi-agent environments (dynamic, uncertain and real-time), and two approaches are proposed that allow the agents to coordinate. The first is a semi-centralised mechanism based on combinatorial auction techniques, whose main idea is to minimise the cost of the tasks allocated by the central agent to the teams of agents. This algorithm takes into account the agents' preferences over the tasks; these preferences are included in the bid sent by each agent. The second is a fully decentralised scheduling approach, which allows the agents to allocate their tasks taking into account their temporal preferences over the tasks. In this case, system performance depends not only on the maximisation or optimisation criterion, but also on the agents' ability to adapt their allocations efficiently. Additionally, in a dynamic environment, execution failures can occur in any plan because of uncertainty and the failure of individual actions, so an indispensable part of a planning system is the ability to re-plan. This dissertation therefore also provides a re-planning approach whose goal is to allow agents to re-coordinate their plans when problems in the environment prevent the plan from being executed. All these approaches have been developed to allow agents to allocate and coordinate complex tasks efficiently in a cooperative, dynamic and uncertain multi-agent environment, and they have demonstrated their efficiency in experiments carried out in the RoboCup Rescue simulation environment.
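A minimal sketch of the kind of semi-centralised, bid-based allocation described above (not the dissertation's actual combinatorial-auction algorithm): each agent submits a cost bid per task, and a central auctioneer greedily assigns each task to the cheapest available bidder. The agent names, costs and the one-task-per-agent constraint are illustrative assumptions.

```python
def allocate_tasks(bids):
    """Greedy single-item auction: assign each task to the lowest-cost bidder.

    bids: {agent: {task: cost}}  -> returns {task: (agent, cost)}
    """
    # Collect (cost, task, agent) triples and process the cheapest first.
    offers = sorted(
        (cost, task, agent)
        for agent, agent_bids in bids.items()
        for task, cost in agent_bids.items()
    )
    assignment, busy_agents = {}, set()
    for cost, task, agent in offers:
        if task not in assignment and agent not in busy_agents:
            assignment[task] = (agent, cost)   # cheapest remaining bid wins
            busy_agents.add(agent)             # illustrative: one task per agent
    return assignment

bids = {
    "agent1": {"extinguish_fire": 4, "rescue_civilian": 7},
    "agent2": {"extinguish_fire": 6, "rescue_civilian": 3},
}
print(allocate_tasks(bids))
# {'rescue_civilian': ('agent2', 3), 'extinguish_fire': ('agent1', 4)}
```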

Relevance:

30.00%

Publisher:

Abstract:

The Rio Tinto river in SW Spain is a classic example of acid mine drainage and the focus of an increasing amount of research including environmental geochemistry, extremophile microbiology and Mars-analogue studies. Its 5000-year mining legacy has resulted in a wide range of point inputs including spoil heaps and tunnels draining underground workings. The variety of inputs and the importance of the river as a research site make it an ideal location for investigating sulphide oxidation mechanisms at the field scale. Mass balance calculations showed that pyrite oxidation accounts for over 93% of the dissolved sulphate derived from sulphide oxidation in the Rio Tinto point inputs. Oxygen isotopes in water and sulphate were analysed from a variety of drainage sources and displayed δ18O(SO4–H2O) values from 3.9 to 13.6‰, indicating that different oxidation pathways occurred at different sites within the catchment. The most commonly used approach to interpreting field oxygen isotope data applies water and oxygen fractionation factors derived from laboratory experiments. We demonstrate that this approach cannot explain high δ18O(SO4–H2O) values in a manner that is consistent with recent models of pyrite and sulphoxyanion oxidation. In the Rio Tinto, high δ18O(SO4–H2O) values (11.2–13.6‰) occur in concentrated (Fe = 172–829 mM), low pH (0.88–1.4), ferrous iron (68–91% of total Fe) waters and are most simply explained by a mechanism involving a dissolved sulphite intermediate, sulphite–water oxygen equilibrium exchange and finally sulphite oxidation to sulphate with O2. In contrast, drainage from large waste blocks of acid volcanic tuff with pyritiferous veins also had low pH (1.7), but had a low δ18O(SO4–H2O) value of 4.0‰ and high concentrations of ferric iron (Fe(III) = 185 mM, total Fe = 186 mM), suggesting a pathway where ferric iron is the primary oxidant, water is the primary source of oxygen in the sulphate and sulphate is released directly from the pyrite surface. However, problems remain with the sulphite–water oxygen exchange model and recommendations are therefore made for future experiments to refine our understanding of oxygen isotopes in pyrite oxidation. (C) 2009 Elsevier B.V. All rights reserved.
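For context on how such field values are conventionally interpreted, the standard two-source mass-balance relation from the sulphide-oxidation isotope literature (a generic equation, not one quoted from this paper) writes the δ18O of product sulphate as a mixture of water-derived and dissolved-O2-derived oxygen, with X_w the fraction of sulphate oxygen sourced from water and ε the corresponding enrichment factors:

```latex
\delta^{18}\mathrm{O}_{\mathrm{SO_4}} =
  X_w\left(\delta^{18}\mathrm{O}_{\mathrm{H_2O}} + \varepsilon_{\mathrm{H_2O}}\right)
  + (1 - X_w)\left(\delta^{18}\mathrm{O}_{\mathrm{O_2}} + \varepsilon_{\mathrm{O_2}}\right)
```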

Relevance:

30.00%

Publisher:

Abstract:

In April–July 2008, intensive measurements were made of atmospheric composition and chemistry in Sabah, Malaysia, as part of the "Oxidant and particle photochemical processes above a South-East Asian tropical rainforest" (OP3) project. Fluxes and concentrations of trace gases and particles were measured from and above the rainforest canopy at the Bukit Atur Global Atmosphere Watch station and at the nearby Sabahmas oil palm plantation, using both ground-based and airborne measurements. Here, the measurement and modelling strategies used, the characteristics of the sites and an overview of data obtained are described. Composition measurements show that the rainforest site was not significantly impacted by anthropogenic pollution, and this is confirmed by satellite retrievals of NO2 and HCHO. The dominant modulators of atmospheric chemistry at the rainforest site were therefore emissions of BVOCs and soil emissions of reactive nitrogen oxides. At the observed BVOC:NOx volume mixing ratio (~100 pptv/pptv), current chemical models suggest that daytime maximum OH concentrations should be ca. 10⁵ radicals cm⁻³, but observed OH concentrations were an order of magnitude greater than this. We confirm, therefore, previous measurements that suggest that an unexplained source of OH must exist above tropical rainforest and we continue to interrogate the data to find explanations for this.

Relevance:

30.00%

Publisher:

Abstract:

Thirty-three snowpack models of varying complexity and purpose were evaluated across a wide range of hydrometeorological and forest canopy conditions at five Northern Hemisphere locations, for up to two winter snow seasons. Modeled estimates of snow water equivalent (SWE) or depth were compared to observations at forest and open sites at each location. Precipitation phase and duration of above-freezing air temperatures are shown to be major influences on divergence and convergence of modeled estimates of the subcanopy snowpack. When models are considered collectively at all locations, comparisons with observations show that it is harder to model SWE at forested sites than open sites. There is no universal "best" model for all sites or locations, but comparison of the consistency of individual model performances relative to one another at different sites shows that there is less consistency at forest sites than open sites, and even less consistency between forest and open sites in the same year. A good performance by a model at a forest site is therefore unlikely to mean a good performance by the same model at an open site (and vice versa). Calibration of models at forest sites provides lower errors than uncalibrated models at three out of four locations. However, benefits of calibration do not translate to subsequent years, and benefits gained by models calibrated for forest snow processes are not translated to open conditions.
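A minimal sketch of the kind of model-versus-observation comparison described above (illustrative only; the site values and the choice of RMSE as the error metric are assumptions, not the study's actual evaluation protocol):

```python
import numpy as np

def rmse(model, obs):
    """Root-mean-square error between modelled and observed SWE (mm)."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((model - obs) ** 2)))

# Illustrative weekly SWE values (mm) for one model at paired sites.
obs = {"open": [10, 40, 80, 120, 60], "forest": [8, 30, 55, 70, 35]}
mod = {"open": [12, 45, 85, 110, 55], "forest": [15, 50, 90, 100, 60]}

for site in ("open", "forest"):
    print(site, "RMSE =", round(rmse(mod[site], obs[site]), 1), "mm")
# In the study, errors at forest sites were typically larger than at open sites.
```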

Relevance:

30.00%

Publisher:

Abstract:

With the current concern over climate change, descriptions of how rainfall patterns are changing over time can be useful. Observations of daily rainfall data over the last few decades provide information on these trends. Generalized linear models are typically used to model patterns in the occurrence and intensity of rainfall. These models describe rainfall patterns for an average year but are more limited when describing long-term trends, particularly when these are potentially non-linear. Generalized additive models (GAMs) provide a framework for modelling non-linear relationships by fitting smooth functions to the data. This paper describes how GAMs can extend the flexibility of models to describe seasonal patterns and long-term trends in the occurrence and intensity of daily rainfall, using data from Mauritius from 1962 to 2001. Smoothed estimates from the models provide useful graphical descriptions of changing rainfall patterns over the last 40 years at this location. GAMs are particularly helpful when exploring non-linear relationships in the data. Care is needed to ensure the choice of smooth functions is appropriate for the data and modelling objectives. (c) 2008 Elsevier B.V. All rights reserved.
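A minimal sketch of fitting this kind of model in Python, assuming the third-party pygam package and a daily series with day-of-year and year predictors (the column names, spline settings, synthetic data and the logistic occurrence model are illustrative, not the paper's actual specification):

```python
import numpy as np
import pandas as pd
from pygam import LogisticGAM, s   # assumed available: pip install pygam

# Illustrative daily data: rainfall occurrence with day-of-year and year predictors.
rng = np.random.default_rng(0)
days = pd.date_range("1962-01-01", "2001-12-31", freq="D")
df = pd.DataFrame({"doy": days.dayofyear, "year": days.year})
p = 0.3 + 0.2 * np.sin(2 * np.pi * df["doy"] / 365) + 0.002 * (df["year"] - 1962)
df["rain"] = (rng.random(len(df)) < p).astype(int)

# Smooth cyclic seasonal term in day-of-year plus a smooth long-term trend in year.
X = df[["doy", "year"]].to_numpy()
gam = LogisticGAM(s(0, basis="cp", n_splines=12) + s(1, n_splines=8))
gam.fit(X, df["rain"].to_numpy())

# Partial-dependence plots of each term give the graphical descriptions
# of seasonal pattern and long-term trend mentioned in the abstract.
gam.summary()
```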

Relevance:

30.00%

Publisher:

Abstract:

A Bayesian approach to analysing data from family-based association studies is developed. This permits direct assessment of the range of possible values of model parameters, such as the recombination frequency and allelic associations, in the light of the data. In addition, sophisticated comparisons of different models may be handled easily, even when such models are not nested. The methodology is developed in such a way as to allow separate inferences to be made about linkage and association by including theta, the recombination fraction between the marker and disease susceptibility locus under study, explicitly in the model. The method is illustrated by application to a previously published data set. The data analysis raises some interesting issues, notably with regard to the weight of evidence necessary to convince us of linkage between a candidate locus and disease.
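In generic Bayesian terms (a sketch of the standard machinery, not the paper's specific likelihood), the joint posterior over the recombination fraction θ and the allelic-association parameters α is proportional to the likelihood times the priors, and non-nested models M1 and M2 can be compared through a Bayes factor:

```latex
p(\theta, \alpha \mid D) \;\propto\; p(D \mid \theta, \alpha)\, p(\theta)\, p(\alpha),
\qquad
\mathrm{BF}_{12} \;=\; \frac{p(D \mid M_1)}{p(D \mid M_2)}
\;=\; \frac{\int p(D \mid \phi_1, M_1)\, p(\phi_1 \mid M_1)\, d\phi_1}
           {\int p(D \mid \phi_2, M_2)\, p(\phi_2 \mid M_2)\, d\phi_2}
```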

Relevance:

30.00%

Publisher:

Abstract:

A novel Swarm Intelligence method for best-fit search, Stochastic Diffusion Search (SDS), is presented that is capable of rapidly locating the optimal solution in the search space. Population-based search mechanisms employed by Swarm Intelligence methods can suffer from a lack of convergence, resulting in ill-defined stopping criteria and loss of the best solution. Conversely, as a result of its resource allocation mechanism, the solutions SDS discovers enjoy excellent stability.
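A minimal sketch of the standard SDS test/diffusion cycle applied to the classic string-search example (the population size, iteration count and restart rule are illustrative choices, not the paper's exact formulation):

```python
import random

def sds_search(text, pattern, n_agents=50, n_iters=200, seed=0):
    """Minimal Stochastic Diffusion Search for the best-matching position of
    `pattern` in `text`: a test phase and a diffusion phase, repeated n_iters times."""
    rng = random.Random(seed)
    positions = range(len(text) - len(pattern) + 1)
    hyps = [rng.choice(positions) for _ in range(n_agents)]   # initial hypotheses
    active = [False] * n_agents

    for _ in range(n_iters):
        # Test phase: each agent checks one randomly chosen pattern component.
        for i, h in enumerate(hyps):
            j = rng.randrange(len(pattern))
            active[i] = text[h + j] == pattern[j]
        # Diffusion phase: inactive agents recruit from a random agent or restart.
        for i in range(n_agents):
            if not active[i]:
                other = rng.randrange(n_agents)
                hyps[i] = hyps[other] if active[other] else rng.choice(positions)

    # The largest cluster of agents marks the best-fit location.
    return max(set(hyps), key=hyps.count)

print(sds_search("xxxyyswarmzzintelligencexx", "swarm"))   # position of the pattern (5 here)
```

The stable cluster of active agents at the winning hypothesis is the "resource allocation" behaviour the abstract refers to: agents only stay committed to a hypothesis while partial evaluations keep succeeding.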

Relevance:

30.00%

Publisher:

Abstract:

The ability of four operational weather forecast models [ECMWF, Action de Recherche Petite Echelle Grande Echelle model (ARPEGE), Regional Atmospheric Climate Model (RACMO), and Met Office] to generate a cloud at the right location and time (the cloud frequency of occurrence) is assessed in the present paper using a two-year time series of observations collected by profiling ground-based active remote sensors (cloud radar and lidar) located at three different sites in western Europe (Cabauw, Netherlands; Chilbolton, United Kingdom; and Palaiseau, France). Particular attention is given to potential biases that may arise from instrumentation differences (especially sensitivity) from one site to another and intermittent sampling. In a second step the statistical properties of the cloud variables involved in most advanced cloud schemes of numerical weather forecast models (ice water content and cloud fraction) are characterized and compared with their counterparts in the models. The two years of observations are first considered as a whole in order to evaluate the accuracy of the statistical representation of the cloud variables in each model. It is shown that all models tend to produce too many high-level clouds, with too-high cloud fraction and ice water content. The midlevel and low-level cloud occurrence is also generally overestimated, with too-low cloud fraction but a correct ice water content. The dataset is then divided into seasons to evaluate the potential of the models to generate different cloud situations in response to different large-scale forcings. Strong variations in cloud occurrence are found in the observations from one season to the same season the following year as well as in the seasonal cycle. Overall, the model biases observed using the whole dataset are still found at seasonal scale, but the models generally manage to reproduce the observed seasonal variations in cloud occurrence well. Overall, models do not generate the same cloud fraction distributions and these distributions do not agree with the observations. Another general conclusion is that the use of continuous ground-based radar and lidar observations is definitely a powerful tool for evaluating model cloud schemes and for a responsive assessment of the benefit achieved by changing or tuning a model cloud scheme.
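An illustrative sketch of the basic diagnostic involved, computing a profile of cloud frequency of occurrence from a time-height cloud mask and comparing it with a model's counterpart (the array shapes, the 0/1 mask convention and the synthetic data are assumptions, not the paper's processing chain):

```python
import numpy as np

def frequency_of_occurrence(cloud_mask):
    """Fraction of time steps with cloud at each height level.

    cloud_mask: (n_times, n_levels) array of 0/1 flags from radar+lidar or a model.
    """
    return np.asarray(cloud_mask).mean(axis=0)

# Synthetic example: 1000 time steps, 10 height levels.
rng = np.random.default_rng(1)
obs_mask = rng.random((1000, 10)) < np.linspace(0.1, 0.4, 10)   # more cloud aloft
mod_mask = rng.random((1000, 10)) < np.linspace(0.1, 0.55, 10)  # model makes extra high cloud

bias = frequency_of_occurrence(mod_mask) - frequency_of_occurrence(obs_mask)
print(np.round(bias, 2))   # positive values aloft mimic the high-cloud overestimate
```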

Relevance:

30.00%

Publisher:

Abstract:

We analyze the publicly released outputs of the simulations performed by climate models (CMs) in preindustrial (PI) and Special Report on Emissions Scenarios A1B (SRESA1B) conditions. In the PI simulations, most CMs feature biases of the order of 1 W m⁻² for the net global and the net atmospheric, oceanic, and land energy balances. This does not result from transient effects but depends on the imperfect closure of the energy cycle in the fluid components and on inconsistencies over land. Thus, the planetary emission temperature is underestimated, which may explain the CMs' cold bias. In the PI scenario, CMs agree on the meridional atmospheric enthalpy transport's peak location (around 40°N/S), while discrepancies of ∼20% exist on the intensity. Disagreements on the oceanic transport peaks' location and intensity amount to ∼10° and ∼50%, respectively. In the SRESA1B runs, the atmospheric transport's peak shifts poleward, and its intensity increases up to ∼10% in both hemispheres. In most CMs, the Northern Hemispheric oceanic transport decreases, and the peaks shift equatorward in both hemispheres. The Bjerknes compensation mechanism is active both on climatological and interannual time scales. The total meridional transport peaks around 35° in both hemispheres and scenarios, whereas disagreements on the intensity reach ∼20%. With increased CO2 concentration, the total transport increases up to ∼10%, thus contributing to polar amplification of global warming. Advances are needed for achieving a self-consistent representation of climate as a nonequilibrium thermodynamical system. This is crucial for improving the CMs' skill in representing past and future climate changes.
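A rough sketch of how an implied meridional enthalpy transport is usually diagnosed from a zonal-mean net energy flux imbalance, by integrating the area-weighted flux from the North Pole southward (the synthetic flux profile and grid are illustrative; this is the generic diagnostic, not the paper's exact procedure):

```python
import numpy as np

EARTH_RADIUS = 6.371e6  # m

def implied_transport(lat_deg, net_flux):
    """Northward enthalpy transport (W) implied by a zonal-mean net flux (W m^-2).

    Integrates the area-weighted imbalance from the North Pole to each latitude:
    T(phi) = -2*pi*a^2 * integral_{phi}^{90N} F(phi') cos(phi') dphi'
    """
    phi = np.deg2rad(lat_deg)
    integrand = net_flux * np.cos(phi)
    transport = np.zeros_like(phi)
    # Cumulative trapezoidal integral from the northernmost latitude southward.
    for i in range(len(phi) - 2, -1, -1):
        transport[i] = transport[i + 1] - 2 * np.pi * EARTH_RADIUS**2 * 0.5 * (
            integrand[i] + integrand[i + 1]
        ) * (phi[i + 1] - phi[i])
    return transport

# Idealised flux: surplus in the tropics, deficit at high latitudes, zero global mean.
lat = np.linspace(-90, 90, 181)
flux = 40 * ((4 / np.pi) * np.cos(np.deg2rad(lat)) - 1)
T = implied_transport(lat, flux)
i_peak = np.argmax(np.abs(T))
print(f"peak transport {T[i_peak]/1e15:.1f} PW near {lat[i_peak]:.0f} deg")
```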

Relevance:

30.00%

Publisher:

Abstract:

Novel imaging techniques are playing an increasingly important role in drug development, providing insight into the mechanism of action of new chemical entities. The data sets obtained by these methods can be large with complex inter-relationships, but the most appropriate statistical analysis for handling these data is often uncertain, precisely because of the exploratory nature of the way the data are collected. We present an example from a clinical trial using magnetic resonance imaging to assess changes in atherosclerotic plaques following treatment with a tool compound with established clinical benefit. We compared two specific approaches to handle the correlations due to physical location and repeated measurements: two-level and four-level multilevel models. The two methods identified similar structural variables, but higher-level multilevel models had the advantage of explaining a greater proportion of variation, and the modeling assumptions appeared to be better satisfied.
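A minimal sketch of a two-level multilevel (mixed-effects) model of the kind compared above, using statsmodels, with repeated plaque measurements nested within patients; the variable names, formula and synthetic data are illustrative, and the trial's four-level structure would add further grouping levels (for example artery and image slice):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic long-format data: several plaque measurements per patient.
rng = np.random.default_rng(42)
n_patients, n_meas = 30, 6
patient = np.repeat(np.arange(n_patients), n_meas)
treated = np.repeat(rng.integers(0, 2, n_patients), n_meas)
patient_effect = np.repeat(rng.normal(0, 0.5, n_patients), n_meas)
wall_area = 10 + patient_effect - 0.4 * treated + rng.normal(0, 0.3, n_patients * n_meas)
df = pd.DataFrame({"patient": patient, "treated": treated, "wall_area": wall_area})

# Two-level model: fixed treatment effect, random intercept per patient.
model = smf.mixedlm("wall_area ~ treated", df, groups=df["patient"])
result = model.fit()
print(result.summary())
```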

Relevance:

30.00%

Publisher:

Abstract:

The structure of the Arctic stratospheric polar vortex in three chemistry–climate models (CCMs) taken from the CCMVal-2 intercomparison is examined using zonal mean and geometric-based methods. The geometric methods are employed by taking 2D moments of potential vorticity fields that are representative of the polar vortices in each of the models. This allows the vortex area, centroid location and ellipticity to be determined, as well as a measure of vortex filamentation. The first part of the study uses these diagnostics to examine how well the mean state, variability and extreme variability of the polar vortices are represented in CCMs compared to ERA-40 reanalysis data, and in particular for the UMUKCA-METO, NIWA-SOCOL and CCSR/NIES models. The second part of the study assesses how the vortices are predicted to change in terms of the frequency of sudden stratospheric warmings and their general structure over the period 1960–2100. In general, it is found that the vortices are climatologically too far poleward in the CCMs and produce too few large-scale filamentation events. Only a small increase is observed in the frequency of sudden stratospheric warming events from the mean of the CCMVal-2 models, but the distribution of extreme variability throughout the winter period is shown to change towards the end of the twenty-first century.
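A rough numerical sketch of the moment-based diagnostics described above, computed on a gridded 2D field after thresholding it to isolate the vortex: area from the zeroth moment, centroid from the first moments, and an aspect-ratio measure of ellipticity from the eigenvalues of the second-moment (covariance) matrix. The threshold, grid and synthetic field are illustrative, and equal-area weighting of the grid is ignored for simplicity.

```python
import numpy as np

def vortex_moments(field, x, y, threshold):
    """Area, centroid and aspect ratio of the region where field > threshold."""
    X, Y = np.meshgrid(x, y)
    w = np.where(field > threshold, field - threshold, 0.0)   # excess PV as weight
    m0 = w.sum()
    area = float((w > 0).sum())                               # grid cells inside the vortex
    cx, cy = (w * X).sum() / m0, (w * Y).sum() / m0           # centroid (first moments)
    # Second moments -> covariance matrix -> ellipticity as an eigenvalue aspect ratio.
    cov = np.array([
        [(w * (X - cx) ** 2).sum(), (w * (X - cx) * (Y - cy)).sum()],
        [(w * (X - cx) * (Y - cy)).sum(), (w * (Y - cy) ** 2).sum()],
    ]) / m0
    evals = np.sort(np.linalg.eigvalsh(cov))
    aspect_ratio = float(np.sqrt(evals[1] / evals[0]))
    return area, (float(cx), float(cy)), aspect_ratio

# Synthetic elliptical "vortex" displaced from the origin.
x = y = np.linspace(-10, 10, 201)
X, Y = np.meshgrid(x, y)
pv = np.exp(-(((X - 2) / 3) ** 2 + ((Y + 1) / 1.5) ** 2))
print(vortex_moments(pv, x, y, threshold=0.2))
# -> area in grid cells, centroid near (2, -1), aspect ratio near 2
```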

Relevance:

30.00%

Publisher:

Abstract:

This paper arises from a doctoral thesis comparing the impact of alternative installer business models on the rate at which microgeneration is taken up in homes and installation standards across the UK. The paper presents the results of the first large-scale academic survey of businesses certified to install residential microgeneration. The aim is to systematically capture those characteristics which define the business model of each surveyed company, and relate these to the number, location and type of technologies that they install, and the quality of these installations. The methodology comprised a pilot web survey of 235 certified installer businesses, which was carried out in June last year and achieved a response rate of 30%. Following optimisation of the design, the main web survey was emailed to over 2000 businesses between October and December 2011, with 317 valid responses received. The survey is being complemented during summer 2012 by semi-structured interviews with a representative sample of installers who completed the main survey. The survey results are currently being analysed. The early results indicate an emerging and volatile market where solar PV, solar hot water and air source heat pumps are the dominant technologies. Three quarters of respondents are founders of their installer business, while only 22 businesses are owned by another company. Over half of the 317 businesses have five employees or less, while 166 businesses are no more than four years old. In addition, half of the businesses stated that 100% of their employees work on microgeneration-related activities. 85% of the surveyed companies have only one business location in the UK. A third of the businesses are based either in the South West or South East regions of England. This paper outlines the interim results of the survey combined with the outcomes from additional interviews with installers to date. The research identifies some of the business models underpinning microgeneration installers and some of the ways in which installer business models impact on the rate and standards of microgeneration uptake. A tentative conclusion is that installer business models are profoundly dependent on the levels and timing of support from the UK Feed-in Tariffs and Renewable Heat Incentive.

Relevance:

30.00%

Publisher:

Abstract:

This paper demonstrates that, in situations in which a cumulative externality exists, the basic nature and extent of resource misallocation may be substantially less than we imagine. This conclusion stems from deriving consistent conjectures in a unified framework in which congestion is present. Experiments support the conclusion that, when the number of agents is small, when there is little heterogeneity among them, and when they have the opportunity to observe each other during a repeated experiment, the market allocation may be efficient.

Relevance:

30.00%

Publisher:

Abstract:

Summary

1. Agent-based models (ABMs) are widely used to predict how populations respond to changing environments. As the availability of food varies in space and time, individuals should have their own energy budgets, but there is no consensus as to how these should be modelled. Here, we use knowledge of physiological ecology to identify major issues confronting the modeller and to make recommendations about how energy budgets for use in ABMs should be constructed.

2. Our proposal is that modelled animals forage as necessary to supply their energy needs for maintenance, growth and reproduction. If there is sufficient energy intake, an animal allocates the energy obtained in the order: maintenance, growth, reproduction, energy storage, until its energy stores reach an optimal level. If there is a shortfall, the priorities for maintenance and growth/reproduction remain the same until reserves fall to a critical threshold, below which all energy is allocated to maintenance. Rates of ingestion and allocation depend on body mass and temperature. We make suggestions for how each of these processes should be modelled mathematically (a minimal allocation sketch follows this abstract).

3. Mortality rates vary with body mass and temperature according to known relationships, and these can be used to obtain estimates of background mortality rate.

4. If parameter values cannot be obtained directly, then values may provisionally be obtained by parameter borrowing, pattern-oriented modelling, artificial evolution or from allometric equations.

5. The development of ABMs incorporating individual energy budgets is essential for realistic modelling of populations affected by food availability. Such ABMs are already being used to guide conservation planning of nature reserves and shell fisheries, to assess environmental impacts of building proposals including wind farms and highways, and to assess the effects on nontarget organisms of chemicals for the control of agricultural pests.

Keywords: bioenergetics; energy budget; individual-based models; population dynamics.
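A minimal sketch of the priority-ordered allocation rule described in point 2 (the function signature, cost values and reserve thresholds are illustrative assumptions, not parameter values from the paper):

```python
def allocate_energy(intake, reserves, maintenance_cost, growth_demand,
                    reproduction_demand, optimal_reserves, critical_reserves):
    """Allocate energy intake in priority order: maintenance, growth,
    reproduction, then storage up to an optimal reserve level.
    Below a critical reserve threshold, everything goes to maintenance."""
    allocation = {"maintenance": 0.0, "growth": 0.0, "reproduction": 0.0, "storage": 0.0}
    budget = intake

    if reserves < critical_reserves:
        # Emergency mode: all available energy goes to maintenance.
        allocation["maintenance"] = budget
        return allocation

    for process, demand in [("maintenance", maintenance_cost),
                            ("growth", growth_demand),
                            ("reproduction", reproduction_demand)]:
        spend = min(budget, demand)       # shortfalls hit the lowest priorities first
        allocation[process] = spend
        budget -= spend

    # Any surplus is stored, but only until reserves reach the optimal level.
    allocation["storage"] = min(budget, max(0.0, optimal_reserves - reserves))
    return allocation

print(allocate_energy(intake=10, reserves=4, maintenance_cost=3, growth_demand=2,
                      reproduction_demand=4, optimal_reserves=6, critical_reserves=1))
# -> maintenance 3, growth 2, reproduction 4, storage 1 (capped by optimal reserves)
```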