51 results for Effects-Based Approach to Operations
Abstract:
This document presents an integrated analysis of the performance of Catalonia, based on how energy consumption (measured at the societal level for Catalan society) is used within both the productive sectors of the economy and the household sector to generate added value and jobs and to guarantee a given material standard of living for the population. The trends found in Catalonia are compared with those of other European countries to contextualize Catalonia's performance with respect to societies that have followed different paths of economic development. The first part of the document applies the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) approach to provide this integrated analysis of Catalan society across different scales (starting from the specific sectors of the Catalan economy as an Autonomous Community and scaling up to an inter-regional comparison with the European Union 14) and across different dimensions of analysis of energy consumption coupled with added-value generation. Within the scope of this study, we observe the trajectories of change in the metabolic pattern of Catalonia and the EU14 countries in the Paid Work sectors, namely the Agricultural Sector, the Productive Sector, and the Services and Government Sector, in comparison with changes in the household sector. The flow intensities of exosomatic energy and of the added value generated by each sector are defined per hour of human activity, characterized as the Exosomatic Metabolic Rate (MJ/hour) and the Economic Labour Productivity (€/hour) across multiple levels. The second part of the document explores the possible use of the MuSIASEM approach for land-use analysis, based on a multi-level matrix of land-use categories.
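The two flow intensities defined above can be sketched as simple per-hour ratios. This is a minimal illustration of the definitions only; all sector names mirror the abstract, and every numeric value is a hypothetical placeholder, not Catalan data.

```python
# Sketch of the two flow intensities defined in the abstract, per sector.
# All figures below are hypothetical placeholders, not data from the study.

def exosomatic_metabolic_rate(energy_mj, hours):
    """Exosomatic Metabolic Rate (EMR): exosomatic energy per hour of human activity (MJ/h)."""
    return energy_mj / hours

def economic_labour_productivity(added_value_eur, hours):
    """Economic Labour Productivity (ELP): added value per hour of human activity (EUR/h)."""
    return added_value_eur / hours

sectors = {
    # sector: (exosomatic energy in MJ, added value in EUR, hours of human activity)
    "Agricultural":          (4.0e9,  1.0e9,  2.0e7),
    "Productive":            (9.0e10, 6.0e10, 3.0e8),
    "Services & Government": (3.0e10, 9.0e10, 9.0e8),
}

for name, (energy, value, hours) in sectors.items():
    print(f"{name}: EMR = {exosomatic_metabolic_rate(energy, hours):.1f} MJ/h, "
          f"ELP = {economic_labour_productivity(value, hours):.1f} EUR/h")
```

Comparing sectors (and countries) by these two rates per hour of human activity is the core of the multi-level accounting the abstract describes.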
Abstract:
This paper proposes a new methodology to compute Value at Risk (VaR) for quantifying losses in credit portfolios. We approximate the cumulative distribution of the loss function by a finite combination of Haar wavelet basis functions and calculate the coefficients of the approximation by inverting its Laplace transform. The Wavelet Approximation (WA) method is especially suitable for non-smooth distributions, which often arise in small or concentrated portfolios when the hypotheses of the Basel II formulas are violated. To test the methodology we take the Vasicek one-factor portfolio credit loss model as our model framework. WA is an accurate, robust and fast method, estimating VaR much more quickly than a Monte Carlo (MC) method at the same level of accuracy and reliability.
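For context, the asymptotic (infinitely granular) loss quantile of the Vasicek one-factor model has a well-known closed form, which underlies the Basel II formulas the abstract refers to. The sketch below implements only that baseline, not the paper's Wavelet Approximation; the parameter values are illustrative.

```python
from math import sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def vasicek_var(pd, rho, alpha):
    """Asymptotic loss-fraction quantile in the Vasicek one-factor model:
    VaR_alpha = Phi( (Phi^-1(pd) + sqrt(rho) * Phi^-1(alpha)) / sqrt(1 - rho) ),
    where pd is the default probability and rho the asset correlation."""
    x = (N.inv_cdf(pd) + sqrt(rho) * N.inv_cdf(alpha)) / sqrt(1.0 - rho)
    return N.cdf(x)

# Example: 1% default probability, 20% asset correlation, 99.9% confidence.
var999 = vasicek_var(pd=0.01, rho=0.20, alpha=0.999)  # roughly 0.145
```

For finite, concentrated portfolios this smooth asymptotic distribution breaks down, which is exactly the regime where the paper's WA method is claimed to help.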
Abstract:
Web-based hypermedia systems for open distance education are becoming increasingly popular as tools for user-driven access to learning information. Adaptive hypermedia is a new research direction within the area of user-adaptive systems, aimed at increasing their functionality by making them personalized [Eklu 96]. This paper sketches a general agent architecture to add navigational adaptability and user-friendly processes that guide and accompany the student during his/her learning on the PLAN-G hypermedia system (New Generation Telematics Platform to Support Open and Distance Learning), with the aid of computer networks and specifically WWW technology [Marz 98-1] [Marz 98-2]. The current PLAN-G prototype is successfully used in several informatics courses (this version has no agents yet). The proposed multi-agent system contains two different types of adaptive autonomous software agents: Personal Digital Agents (Interface), which interact directly with the student when necessary, and Information Agents (Intermediaries), which filter and discover information for learning and adapt the navigation space to a specific student.
Abstract:
Uncertainties not considered in the analytical model of the plant dramatically degrade the performance of fault detection in practice. To cope with this prevalent problem, in this paper we develop a methodology using Modal Interval Analysis that takes those uncertainties into account in the plant model. A fault detection method based on this model is developed which is robust to uncertainty and produces no false alarms. As soon as a fault is detected, an ANFIS model is trained online to capture the main behaviour of the fault, which can then be used for fault accommodation. The simulation results clearly demonstrate the capability of the proposed method to accomplish both tasks.
Abstract:
An implicitly parallel method for integral-block driven restricted active space self-consistent field (RASSCF) algorithms is presented. The approach is based on a model-space representation of the RAS active orbitals with an efficient expansion of the model subspaces. The applicability of the method is demonstrated with a RASSCF investigation of the first two excited states of indole.
Abstract:
Background: There is growing evidence that traffic-related air pollution reduces birth weight. Improving exposure assessment is a key issue for advancing this research area.
Objective: We investigated the effect of prenatal exposure to traffic-related air pollution, assessed via geographic information system (GIS) models, on birth weight in 570 newborns from the INMA (Environment and Childhood) Sabadell cohort.
Methods: We estimated pregnancy and trimester-specific exposures to nitrogen dioxide (NO2) and aromatic hydrocarbons [benzene, toluene, ethylbenzene, m/p-xylene, and o-xylene (BTEX)] using temporally adjusted land-use regression (LUR) models. We built models for NO2 and BTEX using four and three 1-week measurement campaigns, respectively, at 57 locations. We assessed the relationship between prenatal air pollution exposure and birth weight with linear regression models, and performed sensitivity analyses considering time spent at home and time spent in nonresidential outdoor environments during pregnancy.
Results: In the overall cohort, neither NO2 nor BTEX exposure was significantly associated with birth weight in any of the exposure periods. When considering only women who spent < 2 hr/day in nonresidential outdoor environments, the estimated reductions in birth weight associated with an interquartile-range increase in BTEX exposure were 77 g [95% confidence interval (CI), 7–146 g] and 102 g (95% CI, 28–176 g) for exposure during the whole pregnancy and during the second trimester, respectively. The effects of NO2 exposure were less clear in this subset.
Conclusions: The association of BTEX with reduced birth weight underscores the negative role of vehicle exhaust pollutants in reproductive health. Time–activity patterns during pregnancy complement GIS-based models in exposure assessment.
Abstract:
The Aitchison vector-space structure for the simplex is generalized to a Hilbert-space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified within the algebraic structure of A2(P), together with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, rather elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector-space structure and of compositional data analysis. For example, combining statistical information, as in Bayesian updating, or combining likelihood and robust M-estimation functions, amounts to simple additions/perturbations in A2(Pprior); weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P): regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to the finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback–Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback–Leibler information, so the space A2(P) can be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
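The compositional-data operations named above (centered log-ratio transform, perturbation, Aitchison distance) are concrete and small; here is a minimal sketch of their standard simplex versions, not of the generalized A2(P) construction itself.

```python
from math import log, prod, sqrt

def clr(x):
    """Centered log-ratio transform: clr(x)_i = log(x_i / g(x)), g = geometric mean.
    The clr image of a composition always sums to zero."""
    g = prod(x) ** (1.0 / len(x))
    return [log(xi / g) for xi in x]

def perturb(x, y):
    """Aitchison perturbation, the vector-space 'addition' on the simplex:
    closure (renormalization) of the component-wise product."""
    z = [xi * yi for xi, yi in zip(x, y)]
    s = sum(z)
    return [zi / s for zi in z]

def aitchison_distance(x, y):
    """Aitchison distance = Euclidean distance between the clr images."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(clr(x), clr(y))))
```

The Bayesian-updating remark in the abstract is visible here: a posterior is proportional to prior times likelihood, which is exactly `perturb(prior, likelihood)` once both are written as compositions.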
Abstract:
The achievable region approach seeks solutions to stochastic optimisation problems by: (i) characterising the space of all possible performances (the achievable region) of the system of interest, and (ii) optimising the overall system-wide performance objective over this space. This is radically different from conventional formulations based on dynamic programming. The approach is explained with reference to a simple two-class queueing system. Powerful new methodologies due to the authors and co-workers are deployed to analyse a general multiclass queueing system with parallel servers and then to develop an approach to optimal load distribution across a network of interconnected stations. Finally, the approach is used for the first time to analyse a class of intensity control problems.
Abstract:
In this paper we propose a subsampling estimator for the distribution of statistics diverging at either known or unknown rates when the underlying time series is strictly stationary and strong mixing. Based on our results we provide a detailed discussion of how to estimate extreme order statistics with dependent data, and present two applications to assessing financial market risk. Our method performs well in estimating Value at Risk and provides a superior alternative to Hill's estimator in operationalizing Safety First portfolio selection.
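The Hill estimator that the paper uses as its benchmark has a simple closed form; a minimal sketch follows (the paper's subsampling alternative is not reproduced here).

```python
from math import log

def hill_estimator(data, k):
    """Hill estimator of the tail index gamma from the k largest observations:
        gamma_hat = (1/k) * sum_{i=1..k} log(X_(i) / X_(k+1)),
    where X_(1) >= X_(2) >= ... are the descending order statistics.
    Requires positive data and 0 < k < n."""
    xs = sorted(data, reverse=True)
    if not 0 < k < len(xs):
        raise ValueError("k must satisfy 0 < k < n")
    return sum(log(xs[i] / xs[k]) for i in range(k)) / k
```

For a Pareto tail with exponent alpha, the estimator targets gamma = 1/alpha; its well-known sensitivity to the choice of k and to dependence in the data is what motivates alternatives like the subsampling method above.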
Abstract:
I discuss several lessons regarding the design and conduct of monetary policy that have emerged out of the New Keynesian research program. Those lessons include the benefits of price stability, the gains from commitment about future policies, the importance of natural variables as benchmarks for policy, and the benefits of a credible anti-inflationary stance. I also point to one challenge facing NK modelling efforts: the need to come up with relevant sources of policy tradeoffs. A potentially useful approach to meeting that challenge, based on the introduction of real imperfections, is presented.
Abstract:
We study a retail benchmarking approach to determine access prices for interconnected networks. Instead of considering fixed access charges as in the existing literature, we study access pricing rules that determine the access price that network i pays to network j as a linear function of the marginal costs and the retail prices set by both networks. In the case of competition in linear prices, we show that there is a unique linear rule that implements the Ramsey outcome as the unique equilibrium, independently of the underlying demand conditions. In the case of competition in two-part tariffs, we consider a class of access pricing rules, similar to the optimal one under linear prices but based on average retail prices. We show that firms choose the variable price equal to the marginal cost under this class of rules. Therefore, the regulator (or the competition authority) can choose one among the rules to pursue additional objectives such as consumer surplus, network coverage or investment: for instance, we show that both static and dynamic efficiency can be achieved at the same time.
Abstract:
A common problem in video surveys in very shallow waters is the presence of strong light fluctuations due to sunlight refraction. Refracted sunlight casts fast-moving patterns that can significantly degrade the quality of the acquired data. Motivated by the growing need to improve the quality of shallow-water imagery, we propose a method to remove sunlight patterns in video sequences. The method exploits the fact that video sequences allow several observations of the same area of the sea floor over time. It is based on computing the image difference between a given reference frame and the temporal median of a registered set of neighboring images. A key observation is that this difference has two components with separable spectral content: one related to the illumination field (lower spatial frequencies) and the other to the registration error (higher frequencies). The illumination field, recovered by low-pass filtering, is used to correct the reference image. In addition to removing the sun-flickering patterns, an important advantage of the approach is its ability to preserve sharpness in the corrected image, even in the presence of registration inaccuracies. The effectiveness of the method is illustrated on image sets acquired under strong camera motion and containing non-rigid benthic structures. The results testify to the good performance and generality of the approach.
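The pipeline just described (temporal median, image difference, low-pass filtering, correction) can be sketched in a few functions. This is a schematic pure-Python version on grayscale arrays stored as lists of rows; it assumes the frames are already registered to the reference, uses a plain box filter as the low-pass step, and omits the registration machinery of the actual method.

```python
# Sketch of the de-flickering pipeline, assuming pre-registered grayscale frames.

def temporal_median(frames):
    """Per-pixel median over a registered stack of grayscale frames."""
    h, w = len(frames[0]), len(frames[0][0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = sorted(f[y][x] for f in frames)
            out[y][x] = vals[len(vals) // 2]
    return out

def box_blur(img, radius=1):
    """Box filter standing in for the low-pass step that isolates the
    (low spatial frequency) illumination field."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

def deflicker(reference, frames):
    """Subtract the low-frequency part of (reference - temporal median),
    leaving the high-frequency registration error untouched."""
    h, w = len(reference), len(reference[0])
    med = temporal_median(frames)
    diff = [[reference[y][x] - med[y][x] for x in range(w)] for y in range(h)]
    illum = box_blur(diff)
    return [[reference[y][x] - illum[y][x] for x in range(w)] for y in range(h)]
```

Because only the low-pass component of the difference is subtracted, sharp image detail in the reference frame survives even when registration is imperfect, which is the property the abstract highlights.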
Abstract:
The number of private gardens has increased in recent years, creating a more pleasant urban model, but not without an environmental impact, including increased energy consumption, which is the focus of this study. The estimation of costs and energy consumption for the generic typology of private urban gardens is based on two simplifying assumptions: square geometry with surface areas from 25 to 500 m², and a hydraulic design with a single pipe. In total, eight sprinkler models were considered, along with their possible working pressures, and 31 pumping units grouped into 5 series that adequately cover the range of required flow rates and pressures, resulting in 495 hydraulic designs, repeated for two climatically different locations in the Spanish Mediterranean area (Girona and Elche). Mean total irrigation costs for the locality with lower water needs (Girona) and the one with greater needs (Elche) were €2,974 ha⁻¹ yr⁻¹ and €3,383 ha⁻¹ yr⁻¹, respectively. Energy costs accounted for 11.4% of the total cost in the first location and 23.0% in the second. Although a suitable choice of the hydraulic elements of the setup is essential and may provide average energy savings of 77%, the potential energy savings do not constitute a significant incentive in irrigation system design, because the energy cost is low relative to the cost of installation. The low efficiency of the pumping units used in this type of garden is the biggest obstacle and constraint to achieving a high-quality energy solution.
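The role of pump efficiency highlighted above can be illustrated with the standard hydraulic-power relation P = ρ·g·Q·H / η. This is a generic back-of-the-envelope sketch, not the study's cost model, and every input value below is a hypothetical placeholder.

```python
# Generic pumping-energy sketch: P = rho * g * Q * H / eta.
# All inputs are hypothetical placeholders, not values from the study.

RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def pump_power_kw(flow_m3_s, head_m, efficiency):
    """Electrical power (kW) drawn by a pump delivering a flow (m^3/s)
    against a total head (m) at a given overall efficiency (0-1)."""
    return RHO * G * flow_m3_s * head_m / efficiency / 1000.0

def annual_energy_cost(flow_m3_s, head_m, efficiency, hours_per_year, eur_per_kwh):
    """Yearly pumping-energy cost in EUR."""
    return pump_power_kw(flow_m3_s, head_m, efficiency) * hours_per_year * eur_per_kwh
```

Since efficiency enters as a divisor, halving η doubles the energy bill, which is why the low efficiency of small garden pumping units dominates the energy picture despite the modest share of energy in total cost.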
Abstract:
Most sedimentary modelling programs developed in recent years focus on either terrigenous or carbonate marine sedimentation. Nevertheless, only a few programs have attempted to consider mixed terrigenous-carbonate sedimentation, and most of these are two-dimensional, a major restriction since geological processes take place in 3D. This paper presents the basic concepts of a new 3D mathematical forward simulation model for clastic sediments, developed from SIMSAFADIM, a previous 3D carbonate sedimentation model. The new extended model, SIMSAFADIM-CLASTIC, simulates processes of autochthonous marine carbonate production and accumulation, together with clastic transport and sedimentation of both carbonate and terrigenous sediments in three dimensions. Other models and modelling strategies may also provide realistic and efficient tools for predicting the stratigraphic architecture and facies distribution of sedimentary deposits; however, SIMSAFADIM-CLASTIC is an innovative model that attempts to simulate different sediment types using a process-based approach, making it a useful tool for 3D prediction of stratigraphic architecture and facies distribution in sedimentary basins. The model is applied to the Neogene Vallès-Penedès half-graben (western Mediterranean, NE Spain) to show the capacity of the program in a realistic geological situation involving interactions between terrigenous clastics and carbonate sediments.
Abstract:
Diffeomorphism-induced symmetry transformations and time evolution are distinct operations in generally covariant theories formulated in phase space. Time is not frozen. Diffeomorphism invariants are consequently not necessarily constants of the motion. Time-dependent invariants arise through the choice of an intrinsic time, or equivalently through the imposition of time-dependent gauge-fixing conditions. One example of such a time-dependent gauge fixing is the Komar–Bergmann use of Weyl curvature scalars in general relativity. An analogous gauge fixing is also imposed for the relativistic free particle, and the resulting complete set of time-dependent invariants for this exactly solvable model is displayed. In contrast with the free-particle case, we show that gauge invariants that are simultaneously constants of motion cannot exist in general relativity; they vary with intrinsic time.