983 results for multi-commodity flow


Relevance: 30.00%

Abstract:

This paper describes a rainfall simulator developed for field and laboratory studies that gives great flexibility in the plot size covered, is highly portable and usable on steep slopes, and is economical in its water use. The simulator uses Veejet 80100 nozzles mounted on a manifold, with the nozzles controlled to sweep to and fro across a plot width of 1.5 m. Effective rainfall intensity is controlled by the frequency with which the nozzles sweep. Spatial uniformity of rainfall on the plots is high, with coefficients of variation (CV) on the body of the plot of 8-10%. Use of the simulator for erosion and infiltration measurements is discussed.
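As a rough illustration of the two quantities the abstract highlights, the sketch below computes an effective intensity from the sweep frequency and the coefficient of variation of gauge catches; the calibration constant and the catch values are invented for the example, not taken from the paper.

```python
import numpy as np

# Hypothetical calibration: each sweep deposits a fixed depth, so effective
# intensity scales linearly with sweep frequency (the constant is assumed).
def effective_intensity(sweeps_per_hour, mm_per_sweep=0.25):
    """Effective rainfall intensity (mm/h) for a given sweep frequency."""
    return sweeps_per_hour * mm_per_sweep

def spatial_cv(catches_mm):
    """Coefficient of variation (%) of rainfall depths caught across the plot."""
    d = np.asarray(catches_mm, dtype=float)
    return 100.0 * d.std(ddof=1) / d.mean()

catches = [10.2, 9.6, 10.8, 9.9, 10.5, 9.4, 10.1, 10.7]  # gauge catches (mm)
print(effective_intensity(400))  # 100 mm/h at 400 sweeps/h under the assumed calibration
print(spatial_cv(catches))       # a low CV indicates high spatial uniformity
```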

Relevance: 30.00%

Abstract:

Regional commodity forecasts are being used increasingly in agricultural industries to enhance their risk management and decision-making processes. These commodity forecasts are probabilistic in nature and are often integrated with a seasonal climate forecast system. The climate forecast system is based on a subset of analogue years drawn from the full climatological distribution. In this study we sought to measure forecast quality for such an integrated system. We investigated the quality of a commodity (i.e. wheat and sugar) forecast based on a subset of analogue years in relation to a standard reference forecast based on the full climatological set. We derived three key dimensions of forecast quality for such probabilistic forecasts: reliability, distribution shift, and change in dispersion. A measure of reliability was required to ensure no bias in the forecast distribution. This was assessed via the slope of the reliability plot, which was derived from examination of probability levels of forecasts and associated frequencies of realizations. The other two dimensions related to changes in features of the forecast distribution relative to the reference distribution. The relationship of 13 published accuracy/skill measures to these dimensions of forecast quality was assessed using principal component analysis in case studies of commodity forecasting using seasonal climate forecasting for the wheat and sugar industries in Australia. There were two orthogonal dimensions of forecast quality: one associated with distribution shift relative to the reference distribution and the other associated with relative distribution dispersion. Although the conventional quality measures aligned with these dimensions, none measured both adequately. We conclude that a multi-dimensional approach to assessment of forecast quality is required and that simple measures of reliability, distribution shift, and change in dispersion provide a means for such assessment. The analysis presented was also relevant to measuring quality of probabilistic seasonal climate forecasting systems. The importance of retaining a focus on the probabilistic nature of the forecast and avoiding simplifying, but erroneous, distortions was discussed in relation to applying this new forecast quality assessment paradigm to seasonal climate forecasts. Copyright © 2003 Royal Meteorological Society.
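To make the distributional dimensions concrete, here is a minimal sketch comparing an analogue-year forecast sample against the full climatological reference; the synthetic samples and the choice of median and inter-quartile range as summary statistics are assumptions of the sketch, not the paper's exact measures.

```python
import numpy as np

rng = np.random.default_rng(0)
climatology = rng.normal(3.0, 1.0, 100)  # reference forecast: full climatological set
analogue = rng.normal(3.4, 0.7, 25)      # forecast: subset of analogue years

# Distribution shift: change in central tendency relative to the reference.
shift = np.median(analogue) - np.median(climatology)

# Change in dispersion: ratio of inter-quartile ranges (forecast / reference).
iqr = lambda x: np.subtract(*np.percentile(x, [75, 25]))
dispersion_ratio = iqr(analogue) / iqr(climatology)

# Reliability would be assessed over a hindcast archive: the fraction of
# realizations falling below the forecast p-quantile, plotted against p,
# should have a slope near 1 for an unbiased forecast distribution.
print(f"shift = {shift:.2f}, dispersion ratio = {dispersion_ratio:.2f}")
```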

Relevance: 30.00%

Abstract:

Master's dissertation in Informatics Engineering

Relevance: 30.00%

Abstract:

Solvent extraction is considered as a multi-criteria optimization problem, since several chemical species with similar extraction kinetic properties are frequently present in the aqueous phase and selective extraction is not practicable. This optimization, applied to mixer–settler units, considers the best parameters and operating conditions as well as the best structure or process flow-sheet. Global process optimization is performed for a specific flow-sheet, and Pareto curves for different flow-sheets are compared. The positive weighted-sum approach, linked to a sequential quadratic programming method, is used to obtain the Pareto set. In all investigated structures, recovery increases with hold-up, residence time and agitation speed, while purity shows the opposite behaviour. For the same treatment capacity, counter-current arrangements are shown to promote recovery without significant impairment of purity. Recycling the aqueous phase is shown to be irrelevant, but organic recycling with as many stages as economically feasible clearly improves the design criteria and reduces the most efficient organic flow-rate.
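The weighted-sum/SQP procedure can be sketched as below for two invented surrogate objectives; `recovery`, `purity` and all coefficients are stand-ins for the mixer-settler model, not the paper's chemistry.

```python
import numpy as np
from scipy.optimize import minimize

# Surrogate objectives over x = (hold-up, residence time, agitation speed),
# all scaled to [0, 1]; these smooth stand-ins are assumed for the sketch.
def recovery(x):
    return x[0] * x[1] * (0.5 + 0.5 * x[2])

def purity(x):
    return 1.0 - 0.6 * x[0] * x[2]    # degrades as the recovery-driving variables grow

pareto = []
for w in np.linspace(0.05, 0.95, 19):               # strictly positive weights
    # Weighted-sum scalarization solved by SQP (SciPy's SLSQP implementation)
    objective = lambda x, w=w: -(w * recovery(x) + (1.0 - w) * purity(x))
    res = minimize(objective, x0=[0.5, 0.5, 0.5], method="SLSQP",
                   bounds=[(0.0, 1.0)] * 3)
    pareto.append((recovery(res.x), purity(res.x)))

for r, p in pareto:
    print(f"recovery = {r:.3f}   purity = {p:.3f}")
```

Sweeping the weight from one extreme to the other traces the Pareto set, exposing the recovery/purity trade-off the abstract describes.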

Relevance: 30.00%

Abstract:

Thiodicarb, a carbamate pesticide widely used on crops, may pose several environmental and health concerns. This study aimed to explore its toxicological profile in male rats using hematological, biochemical, histopathological, and flow cytometry markers. Exposed animals were dosed daily at 10, 20, or 40 mg/kg body weight (groups A, B, and C, respectively) for 30 d. No significant changes were observed in hematological parameters among the groups. After 10 d, a decrease in total cholesterol levels was noted in rats exposed to 40 mg/kg. Aspartate aminotransferase (AST) activity increased (group A at 20 d; groups A and B at 30 d) and alkaline phosphatase (ALP) activity was significantly reduced (group B at 30 d). At 30 d a decrease in some of the other evaluated parameters was observed, with total cholesterol and urea levels reduced in group A and total protein and creatinine levels reduced in groups A and B. Histological results demonstrated multi-organ dose-related damage in thiodicarb-exposed animals, evidenced as hemorrhagic changes and diffuse vacuolation in hepatic tissue; renal histology showed disorganized glomeruli and tubular cell degeneration; the spleen showed ruptured white pulp and clusters of iron deposits within the red pulp; significant cellular loss was noted in the cortex of the thymus; and degenerative changes were observed within the testis. The histopathologic alterations were most prominent in the high-dose group. In the flow cytometry studies, an increase in lymphocyte numbers, especially T lymphocytes, was seen in blood samples from animals exposed to the highest dose. Taken together, these results indicate marked systemic organ toxicity in rats after subacute exposure to thiodicarb.

Relevance: 30.00%

Abstract:

This paper presents a methodology for multi-objective day-ahead energy resource scheduling for smart grids considering intensive use of distributed generation and Vehicle-to-Grid (V2G). The main focus is the application of weighted Pareto to a multi-objective parallel particle swarm approach aiming to solve the dual-objective V2G scheduling: minimizing total operation costs and maximizing V2G income. A realistic mathematical formulation, considering the network constraints and the V2G charging and discharging efficiencies, is presented, and parallel computing is applied to the Pareto weights. AC power flow calculation is included in the metaheuristic approach so that the network constraints can be taken into account. A case study with a 33-bus distribution network and 1800 V2G resources is used to illustrate the performance of the proposed method.
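A minimal sketch of the weighted Pareto scalarization inside a particle swarm loop is given below; the cost and income models, prices, bounds and PSO coefficients are all invented, and the paper's AC power flow and network constraints are deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy objectives over a 24-h schedule x (positive = charge, negative = discharge).
def operation_cost(x):
    # Energy bought at an assumed 0.20/kWh plus a quadratic battery-wear penalty.
    return np.sum(np.maximum(x, 0.0)) * 0.20 + 0.02 * np.sum(x ** 2)

def v2g_income(x):
    # Energy sold back at an assumed 0.15/kWh with 90% discharging efficiency.
    return np.sum(np.maximum(-x, 0.0)) * 0.15 * 0.9

def fitness(x, w):
    # Weighted Pareto scalarization: minimize cost, maximize income.
    return w * operation_cost(x) - (1.0 - w) * v2g_income(x)

def pso(w, n_particles=30, dim=24, iters=200):
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([fitness(p, w) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, -5.0, 5.0)
        f = np.array([fitness(p, w) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

# One run per Pareto weight; in the paper these runs execute in parallel.
for w in (0.2, 0.5, 0.8):
    g = pso(w)
    print(f"w={w}: cost={operation_cost(g):.1f}, income={v2g_income(g):.1f}")
```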

Relevance: 30.00%

Abstract:

Accepted at the 13th IEEE Symposium on Embedded Systems for Real-Time Multimedia (ESTIMedia 2015), Amsterdam, Netherlands.

Relevance: 30.00%

Abstract:

The 30th ACM/SIGAPP Symposium on Applied Computing (SAC 2015), Embedded Systems track, 13-17 April 2015, Salamanca, Spain.

Relevance: 30.00%

Abstract:

Information systems are widespread and used by anyone with a computing device, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are multiple, but among them one can easily identify that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory, but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. As such, we also implemented a prototype typechecker, which can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
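As a loose runtime analogy only (the thesis enforces such policies statically with a typechecker), the sketch below shows a record whose security level depends on the runtime value of another field; the two-point lattice and all names are hypothetical.

```python
# Runtime analogy only: the security level of a record's 'payload' depends on
# the runtime value of its 'owner' field, mirroring the idea that a label can
# depend on data stored in other fields.
LATTICE = {"public": 0, "secret": 1}   # assumed two-point security lattice

def level_of(record):
    # Records owned by "admin" are secret, all others public (illustrative policy).
    return "secret" if record["owner"] == "admin" else "public"

def flow_allowed(src_level, dst_level):
    return LATTICE[src_level] <= LATTICE[dst_level]   # no downward flows

def copy_payload(src, dst):
    if not flow_allowed(level_of(src), level_of(dst)):
        raise PermissionError("information leak: secret -> public")
    dst["payload"] = src["payload"]

a = {"owner": "admin", "payload": "tax records"}
b = {"owner": "guest", "payload": "weather report"}
copy_payload(b, a)          # public -> secret: allowed
try:
    copy_payload(a, b)      # secret -> public: rejected
except PermissionError as e:
    print(e)
```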

Relevance: 30.00%

Abstract:

The usual high cost of commercial codes, and some technical limitations, clearly limit the employment of numerical modelling tools in both industry and academia. Consequently, the number of companies that use numerical codes is limited, and a lot of effort is put into the development and maintenance of in-house academic codes. Having in mind the potential of numerical modelling tools as a design aid for both products and processes, different research teams have been contributing to the development of open-source codes/libraries. In this framework, any individual can take advantage of the available code capabilities and/or implement additional features based on specific needs. These types of codes are usually developed by large communities, which provide improvements and new features in their specific fields of research, thus significantly speeding up the code development process. Among others, the OpenFOAM® multi-physics computational library, developed by a very large and dynamic community, nowadays comprises several features usually only available in its commercial counterparts, e.g. dynamic meshes, a large diversity of complex physical models, parallelization and multiphase models, to name just a few. This computational library is developed in C++ and makes use of most of the language's capabilities to facilitate the implementation of new functionalities. In the field of computational rheology, OpenFOAM® solvers were recently developed to deal with the most relevant differential viscoelastic rheological models, and stabilization techniques are currently being verified. This work describes the implementation of a new solver in the OpenFOAM® library, able to cope with integral viscoelastic models based on the deformation field method. The implemented solver is verified by comparing the predicted results with analytical solutions, with results published in the literature, and by using the Method of Manufactured Solutions.
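The verification step via the Method of Manufactured Solutions can be illustrated as follows: given solver errors on successively refined meshes against a manufactured exact solution, one checks the observed order of accuracy. The mesh spacings and error norms below are made-up stand-ins for solver output.

```python
import numpy as np

# With the Method of Manufactured Solutions, an exact field is postulated, the
# matching source term is added to the solver, and the discretization error is
# measured on successively refined meshes. These L2 errors are illustrative.
h      = np.array([0.08, 0.04, 0.02, 0.01])          # mesh spacings
l2_err = np.array([2.1e-3, 5.4e-4, 1.37e-4, 3.5e-5])

# Observed order of accuracy between successive refinements:
# p = log(e_coarse / e_fine) / log(h_coarse / h_fine)
p = np.log(l2_err[:-1] / l2_err[1:]) / np.log(h[:-1] / h[1:])
print(p)   # values near 2 would confirm second-order behaviour
```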


Relevance: 30.00%

Abstract:

This paper presents an outline of the rationale and theory of the MuSIASEM scheme (Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism). First, three points of the rationale behind the MuSIASEM scheme are discussed: (i) endosomatic and exosomatic metabolism in relation to Georgescu-Roegen's flow-fund scheme; (ii) the bioeconomic analogy of hypercycle and dissipative parts in ecosystems; (iii) the dramatic reallocation of human time and land use patterns across the sectors of a modern economy. Next, a flow-fund representation of the MuSIASEM scheme on three levels (the whole national level, the paid work sectors level, and the agricultural sector level) is illustrated to look at the structure of the human economy in relation to two primary factors: (i) human time, a fund; and (ii) exosomatic energy, a flow. The three-level representation uses extensive and intensive variables simultaneously. Key conceptual tools of the MuSIASEM scheme, mosaic effects and impredicative loop analysis, are explained using the three-level flow-fund representation. Finally, we claim that the MuSIASEM scheme can be seen as a multi-purpose grammar useful for dealing with sustainability issues.
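A toy flow-fund tabulation across the three levels might look like the sketch below; every number is invented for illustration, with human activity (hours, a fund) and exosomatic energy (MJ, a flow) as the extensive variables and the exosomatic metabolic rate as the derived intensive variable.

```python
# All numbers are invented (roughly a population of 1 million people).
human_activity = {             # hours/year of human time (a fund)
    "whole society": 8.76e9,   # level n: 1e6 people x 8760 h
    "paid work":     7.2e8,    # level n-1: paid work sectors
    "agriculture":   2.0e7,    # level n-2: agricultural sector
}
exo_energy = {                 # MJ/year of exosomatic energy throughput (a flow)
    "whole society": 1.8e11,
    "paid work":     1.1e11,
    "agriculture":   4.0e9,
}
for level, hours in human_activity.items():
    emr = exo_energy[level] / hours   # intensive variable: MJ per hour of activity
    print(f"{level}: EMR = {emr:.0f} MJ/h")
```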

Relevance: 30.00%

Abstract:

This study presents a first attempt to extend the “Multi-scale integrated analysis of societal and ecosystem metabolism (MuSIASEM)” approach to a spatial dimension using GIS techniques in the metropolitan area of Barcelona. We use a combination of census and commercial databases, along with a detailed land cover map, to create a layer of Common Geographic Units that we populate with the local values of human time spent in different activities according to the MuSIASEM hierarchical typology. In this way, we mapped the hours of available human time against the working hours spent in different locations, highlighting the gradients in spatial density between the residential locations of workers (generating the work supply) and the places where the working hours actually take place. We found a strong three-modal pattern of clumps of areas with different combinations of time spent on household activities and on paid work. We also measured and mapped the spatial segregation between these two activities and put forward the conjecture that this segregation increases with higher energy throughput, as the size of the functional units must be able to cope with the flow of exosomatic energy. Finally, we discuss the effectiveness of the approach by comparing our geographic representation of exosomatic throughput with one obtained by conventional methods.
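One standard way to quantify such segregation is a dissimilarity index over the geographic units, sketched below; the per-unit hours are invented and the index is a common stand-in, not necessarily the measure used in the study.

```python
import numpy as np

# Invented hours per geographic unit; the dissimilarity index compares the
# spatial distributions of two activities across the units.
household = np.array([120.0, 80.0, 300.0, 40.0, 260.0])   # household activity hours
paid_work = np.array([400.0, 350.0, 30.0, 380.0, 20.0])   # paid work hours

D = 0.5 * np.abs(household / household.sum()
                 - paid_work / paid_work.sum()).sum()
print(f"dissimilarity index D = {D:.2f}")   # 0 = identical patterns, 1 = full segregation
```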

Relevance: 30.00%

Abstract:

Nowadays, service providers in the Cloud offer complex services, ready to be used as if they were a commodity like water or electricity, with no extra effort required from their customers. However, providing these services implies a high management effort requiring a lot of human interaction. Furthermore, an efficient resource management mechanism that considers only the provider's own resources is, though necessary, not enough, because the provider's profit is limited by the amount of resources it owns. Dynamically outsourcing resources to other providers in response to demand variation avoids this problem and lets the provider earn more profit. A key technology for achieving these goals is virtualization, which facilitates the provider's management and provides on-demand virtual environments that are isolated and consolidated in order to achieve better utilization of the provider's resources. Nevertheless, exploiting some virtualization capabilities demands effort from the user. To avoid this problem, we contribute to the research community a virtualized environment manager which aims to provide virtual machines that fulfil the user's requirements. Another challenge is sharing resources among different federated Cloud providers while exploiting the features of virtualization, in a new approach for facilitating providers' management. This project aims to reduce the provider's costs while fulfilling the quality of service agreed with the customers and maximizing the provider's revenue. It considers resource management at several layers, namely locally at each node in the provider, among different nodes in the provider, and among different federated providers. This latter layer supports the novel capabilities of outsourcing when the local resources are not enough to fulfil user demand, and of offering resources to other providers when the local resources are underused.
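A minimal sketch of the federation-layer placement policy described above (serve locally, outsource on demand peaks, offer spare capacity when underused) follows; the `Provider` class, thresholds and capacities are illustrative assumptions, not the project's implementation.

```python
class Provider:
    """Toy federated provider with an outsource/offer policy (all values assumed)."""

    def __init__(self, capacity_cpus, offer_threshold=0.5):
        self.capacity = capacity_cpus
        self.used = 0
        self.offer_threshold = offer_threshold   # offer spare capacity below this load

    def spare(self):
        return self.capacity - self.used

    def is_offering(self):
        # Offer resources to peers only while local resources are underused.
        return self.used / self.capacity < self.offer_threshold

    def place(self, vm_cpus, federation):
        if self.used + vm_cpus <= self.capacity:
            self.used += vm_cpus                 # enough local resources
            return "local"
        for peer in federation:                  # outsource on demand peaks
            if peer.is_offering() and peer.spare() >= vm_cpus:
                peer.used += vm_cpus
                return "outsourced"
        return "rejected"

home = Provider(16)
peers = [Provider(32), Provider(8)]
print([home.place(cpus, peers) for cpus in (8, 8, 8, 40)])
# -> ['local', 'local', 'outsourced', 'rejected']
```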

Relevance: 30.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure. This is followed by relating the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying “true” hydraulic conductivity field. We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.
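A drastically simplified 1-D analogy of the two-step approach is sketched below: a coarse, exhaustive geophysical image is stochastically downscaled, then mapped to hydraulic conductivity through a relation calibrated at a few collocated borehole samples. The Gaussian perturbations and coefficients are invented, not the paper's geostatistical model.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 200
true_log_sigma = np.cumsum(rng.normal(0.0, 0.05, n))   # "true" log electrical conductivity

# Spatially exhaustive but poorly resolved geophysical estimate (block averages).
coarse = np.repeat(true_log_sigma.reshape(-1, 20).mean(axis=1), 20)

# Step 1: stochastic downscaling - simulate plausible fine-scale detail around
# the coarse trend (a crude stand-in for Bayesian sequential simulation).
downscaled = coarse + rng.normal(0.0, 0.1, n)

# Step 2: relate the downscaled geophysical parameter to log hydraulic
# conductivity via a linear petrophysical relation fitted at a few collocated
# borehole samples (sparse but highly resolved measurements).
idx = rng.choice(n, size=10, replace=False)
log_K_obs = 1.5 * true_log_sigma[idx] - 4.0 + rng.normal(0.0, 0.05, 10)
slope, intercept = np.polyfit(downscaled[idx], log_K_obs, 1)
log_K_real = slope * downscaled + intercept + rng.normal(0.0, 0.05, n)  # one realization
print(slope, intercept)
```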