980 results for "Integration strategies"
Abstract:
The development of renewable energy sources and Distributed Generation (DG) of electricity is of central importance on the path towards sustainable development. However, managing these technologies at large scale is complicated by the intermittency of the primary resources (wind, sunshine, etc.) and the small scale of some plants. The aggregation of DG plants gives rise to a new concept: the Virtual Power Producer (VPP). VPPs can reinforce the importance of these generation technologies, making them valuable in electricity markets. VPPs can ensure secure, environmentally friendly generation; optimal management of heat, electricity, and cold; and optimal operation and maintenance of electrical equipment, including the sale of electricity on the energy market. To attain these goals, several important issues must be addressed, such as reserve management strategies, strategies for bid formulation, producer remuneration, and producer characterization for coalition formation. This chapter presents the most important concepts related to the integration of renewable-based generation in electricity markets using the VPP paradigm. The presented case studies make use of two main computer applications: ViProd and MASCEM. ViProd simulates VPP operation, including the management of plants in operation. MASCEM is a multi-agent electricity market simulator that supports the inclusion of VPPs in the set of players.
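The aggregation role described above can be sketched in a few lines. The function below is a hypothetical illustration, not part of ViProd or MASCEM, of how a VPP might pool the forecast output of several DG plants into a single market bid while holding back reserve against intermittency; all names and figures are ours.

```python
# Hypothetical sketch of VPP aggregation (not the ViProd/MASCEM API):
# pool the forecast output of several DG plants into one market bid,
# holding back a fraction of capacity as reserve against intermittency.

def aggregate_bid(plant_forecasts_mw, reserve_fraction=0.1):
    """Return (bid_mw, reserve_mw) for the aggregated VPP portfolio."""
    total = sum(plant_forecasts_mw)
    reserve = total * reserve_fraction
    return total - reserve, reserve

# e.g. a wind farm, a PV plant, and a small hydro unit (forecasts in MW)
bid, reserve = aggregate_bid([2.5, 1.0, 4.0])
```

A real VPP would of course replace the flat reserve fraction with the reserve management strategies the chapter discusses.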
Abstract:
Dissertation presented to obtain a Ph.D. degree in Sciences of Engineering and Technology, Cell Technology, at the Instituto de Tecnologia Química e Biológica, Universidade Nova de Lisboa
Abstract:
Business History, Vol. 51, Issue 1, p. 45-58
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA School of Business and Economics
Abstract:
This study examined user-generated (UG) advertising in the context of social media networks. The focus was on how people, whether an expert in the area, a non-expert, or a friend, influence the reader of the advertisement. Furthermore, the study analyzed how the certainty level of the UG advertisement influences the person viewing the ad. The study showed that for the friend source a high-certainty message was more persuasive. However, regarding certainty, no significant results were found for the expert and non-expert sources. Further, the type of source had a considerable impact on persuasion. Someone we personally know (e.g., a friend) was rated most positively on all analyzed variables. This shows that, with the rising usage of social media, there are great opportunities for new, effective advertising strategies that could include a new type of endorser: friends.
Abstract:
Choline supplementation, which improves memory functions in rodents, is assumed to increase the synthesis and release of acetylcholine in the brain. We have found that a combined pre- and postnatal supplementation results in long-lasting facilitation of spatial memory in juvenile rats when training is conducted in the presence of a local salient cue. The present work aimed at analysing the effects of peri- and postnatal choline supplementation on the spatial abilities of naive adult rats. Rats given perinatal choline supplementation were trained in various cued procedures of the Morris navigation task when aged 5 months. The treatment had the specific effect of reducing the escape latency of the rats when the platform was at a fixed position in space and surrounded by a suspended cue. This effect was associated with an increased spatial bias when the cue and platform were removed. In this condition, the control rats showed impaired spatial discrimination following the removal of the target cue, most likely due to an overshadowing of the distant environmental cues. This impairment was not observed in the treated rats. Further training with the suspended cue at unpredictable places in the pool revealed longer escape latencies in the control than in the treated rats, suggesting that this procedure induced a selective perturbation in the normal but not the treated rats. A special probe trial with the cue at an irrelevant position and no escape platform revealed a significant bias of the control rats toward the cue and of the treated rats toward the uncued spatial escape position. This behavioural dissociation suggests that a salient cue associated with the target induces an alternative "non-spatial" guidance strategy in normal rats, with the risk of overshadowing the more distant spatial cues.
In this condition, choline supplementation facilitates a spatial reliance on the cue, that is, an overall facilitation of learning a set of spatial relations among several visual cues. As a consequence, the improved escape in the presence of the cue is associated with a stronger memory of the spatial position following the disappearance of the cue. This and previous observations suggest that a specific spatial attention process relies on the buffering of highly salient visual cues to facilitate the integration of their relative position in the environment.
Abstract:
Knowledge of the spatial distribution of hydraulic conductivity (K) within an aquifer is critical for reliable predictions of solute transport and the development of effective groundwater management and/or remediation strategies. While core analyses and hydraulic logging can provide highly detailed information, such information is inherently localized around boreholes that tend to be sparsely distributed throughout the aquifer volume. Conversely, larger-scale hydraulic experiments like pumping and tracer tests provide relatively low-resolution estimates of K in the investigated subsurface region. As a result, traditional hydrogeological measurement techniques contain a gap in terms of spatial resolution and coverage, and alone they are often inadequate for characterizing heterogeneous aquifers. Geophysical methods have the potential to bridge this gap. The recent increased interest in the application of geophysical methods to hydrogeological problems is clearly evidenced by the formation and rapid growth of the domain of hydrogeophysics over the past decade (e.g., Rubin and Hubbard, 2005).
Abstract:
In this paper we analyse the impact of policy uncertainty on foreign direct investment (FDI) strategies. We also consider the impact of economic integration on FDI decisions. The paper follows the real options approach, which allows us to investigate the value to a firm of waiting to invest and/or disinvest when payoffs are stochastic due to political uncertainty and investments are partially reversible. Across the board, we find that political uncertainty can be very detrimental to FDI decisions, while economic integration leads to an increasing benefit of investing abroad.
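The real-options logic can be illustrated with a deliberately simple two-state, one-period example; the functions and numbers below are ours, not the paper's model. Waiting lets the firm invest only in the good state, so a wider payoff spread (greater political uncertainty) raises the value of waiting relative to committing now.

```python
# Toy real-options comparison (illustrative only): an FDI project costs I
# today and is worth V_up or V_down next period with probability p / (1 - p).

def value_invest_now(V_up, V_down, p, I, r):
    """Expected NPV of committing the (partially irreversible) investment today."""
    return (p * V_up + (1 - p) * V_down) / (1 + r) - I

def value_wait(V_up, V_down, p, I, r):
    """NPV of waiting one period and investing only if the good state occurs."""
    return p * max(V_up - I, 0.0) / (1 + r)

now = value_invest_now(150.0, 50.0, 0.5, 90.0, 0.05)
wait = value_wait(150.0, 50.0, 0.5, 90.0, 0.05)
# With this payoff spread, waiting dominates investing immediately.
```

Narrowing the spread (e.g. V_up = 110, V_down = 90) shrinks the option value of waiting, which is the mechanism behind the paper's finding that political uncertainty deters FDI.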
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
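The gradual-deformation idea mentioned above admits a compact sketch. The form below is the standard gradual-deformation combination of two independent standard Gaussian fields, written by us as an assumed reading of the approach rather than the thesis code; `theta` tunes the perturbation strength.

```python
import numpy as np

# Gradual-deformation proposal sketch: combining two independent standard
# Gaussian fields with cos/sin weights leaves the result standard Gaussian,
# so the prior geostatistics are preserved while theta sets the step size.

def gradual_deformation(m_current, theta, rng):
    """Propose a new field by rotating the current one toward a fresh draw."""
    z = rng.standard_normal(m_current.shape)
    return np.cos(theta) * m_current + np.sin(theta) * z

rng = np.random.default_rng(0)
m = rng.standard_normal((50, 50))
m_small_step = gradual_deformation(m, theta=0.1, rng=rng)        # mild perturbation
m_big_step = gradual_deformation(m, theta=np.pi / 2, rng=rng)    # independent draw
```

Small theta gives highly correlated proposals (high MCMC acceptance, slow mixing); theta near pi/2 gives nearly independent proposals, which is the trade-off the flexibility claim refers to.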
Abstract:
It has been estimated that more than 70% of all medical activity is directly related to information provided by analytical data. Substantial technological advances have taken place recently, which have allowed a previously unimagined number of analytical samples to be processed while offering high-quality results. Concurrently, yet more new diagnostic determinations have been introduced, all of which has led to a significant increase in the prescription of analytical parameters. This increased workload has placed great pressure on the laboratory with respect to health costs. The present manager of the Clinical Laboratory (CL) has had to examine cost control as well as rationing, meaning that the CL's focus has not been strictly metrological, as if it were purely a system producing results, but has instead had to concentrate on its efficiency and efficacy. By applying re-engineering criteria, an emphasis has had to be placed on improved organisation and operating practice within the CL, focusing on the current criteria of the Integrated Management Areas where the technical and human resources are brought together. This re-engineering has been based on the concepts of consolidating and integrating the analytical platforms, while differentiating the production areas (CORE Laboratory) from the information areas. With these concepts in mind, automation and virological processing, along with serology in general, follow the same criteria as the rest of the operating methodology in the Clinical Laboratory.
Abstract:
The problem of Small Island Developing States (SIDS) is quite recent, dating from the late 1980s and the 1990s, and is still in search of theoretical consolidation. SIDS, as small developing states formed by one or several geographically dispersed islands, have small populations, markets, territories, and natural resources, including drinkable water, and, in a great number of cases, low levels of economic activity; together, these factors hinder the achievement of economies of scale. To these diseconomies are added higher transport and communication costs which, combined with lower productivity and the smaller quality and diversification of their production, make their integration in the world economy difficult. In some SIDS these factors are inseparable from scarce investment in infrastructure, in the formation of human resources, and in productive activities, just as happens in most developing countries. In ecological terms, many of them suffer from a shortage of natural resources while hosting ecosystems that are important in national and world terms but highly fragile with respect to pollution, excessive fishing, and the uncontrolled development of tourism; conjugated with the greenhouse effect, these factors condition the climate and the rise of the mean sea level and could therefore put the very survival of some of them at stake. The growing awareness of the international community towards their problems culminated in the United Nations Barbados Conference of 1994, where the right to development was emphasized through the adoption of appropriate strategies and the Programme of Action for the Sustainable Development of SIDS.
The orientation of regional and international cooperation in that direction, sharing technology (namely clean technology and environmental control and management technology) and information, creating capacity-building, supplying means, including financial resources, and creating non-discriminatory and fair trade rules, would lead to the establishment of a more economically equitable world system, in which production, consumption, pollution levels, and demographic policies were guided towards sustainability. The Conference constituted an important step in the recognition by the international community of the specificities of those states; it allowed the definition of a set of norms and policies to be implemented at the national, regional, and international levels, and it was important that these continued in the direction of sustainable development. This Conference had its origin in previous summits: the Rio de Janeiro Summit on Environment and Development, held in 1992, which left an important document, Agenda 21; the Stockholm Conference of 1972; and even the Ramsar Convention on Wetlands of 1971. Later, the Valletta Declaration (Malta, 1998) and the Forum of Small States (2002) drew the international community's attention to the problems of SIDS again, urging action to increase their resilience. If vulnerability is defined as the inability of countries to resist external shocks economically, ecologically, and socially, and resilience as their potential to absorb and minimize the impact of those shocks by presenting a structure that allows them to be little affected, then part of the available studies, dating from the 1990s, indicate that SIDS are more vulnerable than other developing countries.
The vulnerability of SIDS results from the fact that they present a set of characteristics that makes them less capable of resisting external shocks, whether anthropogenic (economic, financial, environmental) or natural, connected with the vicissitudes of nature, unless they advance strategies that allow greater resilience. When these vulnerability factors are combined with the worldwide expansion of the capitalist economic system, economic and financial globalisation, the incessant search for growing profits by multinational enterprises, and accelerated technological evolution, the result is a situation that disfavours the poorest. Building resilience to external shocks and to the process of globalisation demands from SIDS, and from many other developing countries, the endogenous definition of strategies and of solid but flexible programmes of integrated development. These must be assumed by the instituted powers, but also by the other stakeholders, including companies, civil-society organizations, and the population in general. But that demands strong investment in the formation of human resources, in infrastructure, and in research centres; it demands the creation of capacity not only to produce, but also to produce differently and to market internationally. It demands institutional capacity. Cape Verde is on its way to this stage.
Abstract:
Accurate characterization of the spatial distribution of hydrological properties in heterogeneous aquifers at a range of scales is a key prerequisite for reliable modeling of subsurface contaminant transport, and is essential for designing effective and cost-efficient groundwater management and remediation strategies. To this end, high-resolution geophysical methods have shown significant potential to bridge a critical gap in subsurface resolution and coverage between traditional hydrological measurement techniques such as borehole log/core analyses and tracer or pumping tests. An important and still largely unresolved issue, however, is how to best quantitatively integrate geophysical data into a characterization study in order to estimate the spatial distribution of one or more pertinent hydrological parameters, thus improving hydrological predictions. Recognizing the importance of this issue, the aim of the research presented in this thesis was to first develop a strategy for the assimilation of several types of hydrogeophysical data having varying degrees of resolution, subsurface coverage, and sensitivity to the hydrologic parameter of interest. In this regard a novel simulated annealing (SA)-based conditional simulation approach was developed and then tested in its ability to generate realizations of porosity given crosshole ground-penetrating radar (GPR) and neutron porosity log data. This was done successfully for both synthetic and field data sets. A subsequent issue that needed to be addressed involved assessing the potential benefits and implications of the resulting porosity realizations in terms of groundwater flow and contaminant transport. This was investigated synthetically assuming first that the relationship between porosity and hydraulic conductivity was well-defined. Then, the relationship was itself investigated in the context of a calibration procedure using hypothetical tracer test data.
Essentially, the relationship best predicting the observed tracer test measurements was determined given the geophysically derived porosity structure. Both of these investigations showed that the SA-based approach, in general, allows much more reliable hydrological predictions than other more elementary techniques considered. Further, the developed calibration procedure was seen to be very effective, even at the scale of tomographic resolution, for predictions of transport. This also held true at locations within the aquifer where only geophysical data were available. This is significant because the acquisition of hydrological tracer test measurements is clearly more complicated and expensive than the acquisition of geophysical measurements. Although the above methodologies were tested using porosity logs and GPR data, the findings are expected to remain valid for a large number of pertinent combinations of geophysical and borehole log data of comparable resolution and sensitivity to the hydrological target parameter. Moreover, the obtained results allow us to have confidence for future developments in integration methodologies for geophysical and hydrological data to improve the 3-D estimation of hydrological properties.
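A minimal version of the simulated-annealing idea can be sketched as follows. This toy is our construction, not the thesis code: it uses a trivial mismatch objective in place of GPR/neutron-log constraints, perturbs only unconditioned cells, and accepts uphill moves with the Metropolis probability exp(-delta/T) under geometric cooling.

```python
import math
import random

# Toy simulated-annealing conditional simulation: perturb free cells of a
# porosity field, keep conditioning cells fixed, accept moves that lower a
# mismatch objective (or uphill moves with probability exp(-delta / T)).

def objective(field, target_mean):
    mean = sum(field) / len(field)
    return (mean - target_mean) ** 2

def anneal(field, fixed, target_mean, steps=2000, T0=1e-3, cooling=0.995, seed=1):
    rng = random.Random(seed)
    T = T0
    cost = objective(field, target_mean)
    free = [i for i in range(len(field)) if i not in fixed]
    for _ in range(steps):
        i = rng.choice(free)
        old = field[i]
        field[i] = rng.uniform(0.0, 0.4)            # propose a new porosity
        new_cost = objective(field, target_mean)
        delta = new_cost - cost
        if delta < 0 or rng.random() < math.exp(-delta / T):
            cost = new_cost                          # accept the move
        else:
            field[i] = old                           # reject and restore
        T *= cooling                                 # geometric cooling
    return field, cost

field = [0.05] * 20
field[0] = 0.35                                      # conditioning (borehole) value
field, cost = anneal(field, fixed={0}, target_mean=0.2)
```

In the actual approach, the objective would instead measure mismatch with the geophysical data and a geostatistical model, but the accept/reject/cool mechanics are the same.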
Abstract:
This is the report of the first workshop on Incorporating In Vitro Alternative Methods for Developmental Neurotoxicity (DNT) Testing into International Hazard and Risk Assessment Strategies, held in Ispra, Italy, on 19-21 April 2005. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and jointly organized by ECVAM, the European Chemical Industry Council, and the Johns Hopkins University Center for Alternatives to Animal Testing. The primary aim of the workshop was to identify and catalog potential methods that could be used to assess how data from in vitro alternative methods could help to predict and identify DNT hazards. Working groups focused on two different aspects: a) details on the science available in the field of DNT, including discussions on the models available to capture the critical DNT mechanisms and processes, and b) policy and strategy aspects to assess the integration of alternative methods in a regulatory framework. This report summarizes these discussions and details the recommendations and priorities for future work.
Abstract:
Executive Summary
The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory in the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of the measurement of financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify the risk-reward trade-off, the third one can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model to address some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings.
The empirical investigation to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than the realized returns of portfolio strategies that are optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate the ones that result from optimization only with respect to, for example, the Treynor ratio and Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e. the sequence of expected shortfalls for a range of quantiles.
Since the plot of the absolute Lorenz curve for the aggregated performance measures was above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio returns distribution that second-order stochastically dominates the ones obtained from virtually all individual performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests the presence of an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
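The second-order stochastic dominance check via the absolute Lorenz curve can be written compactly. The helper below is our illustration of the construction described (cumulative sums of ascending-sorted realized returns, proportional to the expected-shortfall sequence across quantiles), not the chapter's code; the sample returns are hypothetical.

```python
import numpy as np

# Absolute Lorenz curve: cumulative sums of returns sorted ascending,
# proportional to expected shortfall across quantiles. Portfolio a
# second-order stochastically dominates b (in sample) if its curve lies
# on or above b's curve at every quantile.

def absolute_lorenz(returns):
    return np.cumsum(np.sort(np.asarray(returns, dtype=float)))

def second_order_dominates(a, b):
    return bool(np.all(absolute_lorenz(a) >= absolute_lorenz(b)))

aggregated = [0.01, 0.02, 0.03, 0.04]   # hypothetical realized returns
single = [-0.02, 0.00, 0.03, 0.07]
```

A pointwise comparison like this is exactly the "plot above plot" criterion used in the chapter; the Kolmogorov-Smirnov test would be applied first to establish that the two distributions differ at all.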