21 results for general equilibrium-analysis
at Instituto Politécnico do Porto, Portugal
Abstract:
This paper studies the effects of the diffusion of a General Purpose Technology (GPT) that spreads first within the developed North country of its origin, and then to a developing South country. In the general equilibrium growth model developed here, each final good can be produced by one of two technologies. Each technology is characterized by a specific labor type complemented by a specific set of intermediate goods, which are enhanced periodically by Schumpeterian R&D activities. When quality reaches a threshold level, a GPT arises in one of the technologies and spreads first to the other technology within the North. Then it propagates to the South, following a similar sequence. Since diffusion is uneven, both within and between countries, the GPT produces successive changes in the direction of technological knowledge and in inter- and intra-country wage inequality. Through this mechanism, the different observed paths of wage inequality can be accommodated.
Abstract:
Sulfamethoxazole (SMX) is among the antibiotics employed in aquaculture for prophylactic and therapeutic reasons. Environmental and food spread may be prevented by controlling its levels at several stages of fish farming. The present work proposes for this purpose new SMX-selective electrodes for the potentiometric determination of this sulphonamide in water. The selective membranes were made of polyvinyl chloride (PVC) with tetraphenylporphyrin manganese(III) chloride or cyclodextrin-based compounds acting as ionophores. 2-Nitrophenyl octyl ether was employed as plasticizer, and tetraoctylammonium, dimethyldioctadecylammonium bromide or potassium tetrakis(4-chlorophenyl)borate was used as anionic or cationic additive. The best analytical performance was reported for ISEs of tetraphenylporphyrin manganese(III) chloride with 50 mol% of potassium tetrakis(4-chlorophenyl)borate relative to the ionophore. Nernstian behaviour was observed from 4.0 × 10−5 to 1.0 × 10−2 mol/L (10.0 to 2500 µg/mL), and the limit of detection was 1.2 × 10−5 mol/L (3.0 µg/mL). In general, the electrodes displayed steady potentials in the pH range of 6 to 9. Emf equilibrium was reached within 15 s at all concentration levels. The electrodes revealed good discriminating ability in environmental samples. The analytical application to contaminated waters showed recoveries from 96 to 106%.
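The calibration procedure behind figures such as the slope and detection limit above can be sketched numerically. The snippet below is a minimal illustration, not the authors' data treatment: it fits the usual linear response E = intercept + slope·log10(C) to synthetic calibration points (the −59.2 mV/decade slope and the emf values are hypothetical) and inverts the line to read a sample concentration.

```python
import math

def fit_calibration(concs, emfs_mv):
    """Least-squares fit of E = intercept + slope * log10(C).
    Returns (slope in mV/decade, intercept in mV)."""
    xs = [math.log10(c) for c in concs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(emfs_mv) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, emfs_mv))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def conc_from_emf(emf_mv, slope, intercept):
    """Invert the calibration line to estimate a sample concentration (mol/L)."""
    return 10 ** ((emf_mv - intercept) / slope)

# Synthetic calibration points over the reported linear range,
# generated with a hypothetical anionic-electrode slope of -59.2 mV/decade.
concs = [4.0e-5, 1.0e-4, 1.0e-3, 1.0e-2]             # mol/L
emfs = [20.0 - 59.2 * math.log10(c) for c in concs]  # mV

slope, intercept = fit_calibration(concs, emfs)
```

A slope near the theoretical Nernstian value for a monovalent anion (about −59 mV/decade at 25 °C) is what "Nernstian behaviour" refers to in the abstract.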
Abstract:
Environmental problems such as acid rain, eutrophication and global warming are discussed daily, yet we rarely find any discussion of the phosphorus problem. Phosphorus has been a real problem for years and deserves more attention. In this thesis, a global material flow analysis of phosphorus was carried out, based on data from 2004. The production of phosphate rock in that year was 18.9 million tonnes, almost all of which was applied to the soil as fertilizer; plants, however, can take up on average only about 20% of the fertilizer input, and the remainder is lost to the soil phosphorus pool. In the soil there is an equilibrium between the phosphorus available for plant uptake and the phosphorus associated with other compounds; this equilibrium depends on the type of soil and is related to the soil pH. A reserve inventory was compiled: the reserve, i.e. the amount that is economically available, is 15,000 million tonnes, and the reserve base is estimated at 47,000 million tonnes. The major reserves are found in Morocco and Western Sahara, the United States, China and South Africa. The reserve estimated in 2009 was 15,000 million tonnes of phosphate rock, or 1,963 million tonnes of P. If the phosphate rock mined each year stays around 22 Mt/yr (phosphorus production in 2008, USGS 2009), and consumption keeps increasing with food demand, the reserves of phosphate rock will be exhausted in about 90 years, or maybe even less. For the value/impact assessment, a qualitative analysis was performed: if in the future there is no more phosphate rock from which to produce fertilizers, a drop in crop yields is expected, the extent depending on the type of soil, while the impact on human food supply and animal production will not be a relevant problem.
Phosphorus can be recovered from different waste streams, for example by ploughing crop residues back into the soil, and from food processing plants and food retailers, human and animal excreta, meat and bone meal, manure fibre, sewage sludge and wastewater. Some of these examples are developed in the paper.
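The depletion horizon quoted above can be checked with simple reserve-to-extraction arithmetic. The sketch below is illustrative only: the 167 Mt/yr rock-extraction rate is an assumption introduced here (a USGS-order figure for phosphate rock; the abstract's 22 Mt/yr is labelled as phosphorus production, i.e. P content rather than rock), and the 2% growth rate is hypothetical.

```python
import math

def depletion_years(reserve, rate, growth=0.0):
    """Years until a reserve is exhausted, starting from an annual
    extraction `rate` that grows by a fraction `growth` per year.
    Cumulative extraction after n years with growth g is
    rate * ((1+g)**n - 1) / g, so n = ln(1 + g*reserve/rate) / ln(1+g)."""
    if growth == 0.0:
        return reserve / rate
    return math.log(1.0 + growth * reserve / rate) / math.log(1.0 + growth)

# Static ratio: 15,000 Mt rock reserve over an assumed ~167 Mt/yr
# rock extraction rate gives a horizon of roughly 90 years.
static_horizon = depletion_years(15000, 167)

# Any sustained growth in demand shortens the horizon, consistent with
# the abstract's "or maybe even less" (2%/yr growth is a hypothetical).
growing_horizon = depletion_years(15000, 167, growth=0.02)
```

The closed form follows from summing the geometric series of yearly extraction amounts and solving for the year in which cumulative extraction reaches the reserve.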
Abstract:
The analysis of opiates is of vital interest in drug abuse monitoring and research. This review presents a general overview of the electrochemical methods used for detection and quantification of opiates in a variety of matrices. Emphasis has been placed on the voltammetric methods used for study and determination of morphine, codeine, and heroin. Specific issues that need to be solved and better explained as well as future trends in the use of electrochemical methods in the examination of opiates are also discussed.
Abstract:
Aiming at simple and accurate readings of citric acid (CA) in complex samples, citrate (CIT)-selective electrodes with a tubular configuration and polymeric membranes containing a quaternary ammonium ion exchanger were constructed. Several selective membranes were prepared for this purpose, with distinct mediator solvents (of quite different polarities) and, in some cases, p-tert-octylphenol (TOP) as additive, the latter used with a view to a possible increase in selectivity. The general working characteristics of all prepared electrodes were evaluated in a low-dispersion flow injection analysis (FIA) manifold by injecting 500 µl of citrate standard solutions into an ionic strength (IS) adjuster carrier (10−2 mol l−1) flowing at 3 ml min−1. Good potentiometric response, with an average slope of 61.9 mV per decade and a repeatability of ±0.8%, resulted from selective membranes comprising the additive and bis(2-ethylhexyl)sebacate (bEHS) as mediator solvent. The same membranes also yielded the best selectivity characteristics, assessed by the separate solutions method for several chemical species, such as chloride, nitrate, ascorbate, glucose, fructose and sucrose. Pharmaceutical preparations, soft drinks and beers were analyzed under conditions that enabled simultaneous pH and ionic strength adjustment (pH = 3.2; ionic strength = 10−2 mol l−1), and the results agreed well with those of the reference method (relative error < 4%). These experimental conditions promoted a significant increase in the sensitivity of the potentiometric response, with a supra-Nernstian slope of 80.2 mV per decade, and allowed the analysis of about 90 samples per hour with a relative standard deviation < 1.0%.
Abstract:
Electricity markets are complex environments, involving a large number of different entities with specific characteristics and objectives, making their decisions and interacting in a dynamic scene. Game theory has been widely used to support decisions in competitive environments, so its application to electricity markets can prove to be a high-potential tool. This paper proposes a new scenario analysis algorithm, which includes the application of game theory, to evaluate and preview different scenarios and provide players with the ability to react strategically so as to exhibit the behavior that best fits their objectives. The model includes forecasts of competitor players' actions, used to build models of their behavior and thereby define the most probable scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to be performed. Our use of game theory is intended to support one specific agent, not to achieve equilibrium in the market. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings with a case study based on real data from the Iberian Electricity Market are presented and discussed.
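The single-agent use of game theory described above can be illustrated with a minimal decision sketch: weight each forecast scenario by its estimated probability and pick the action with the highest expected payoff. This is an illustration of the general idea only, not the MASCEM algorithm; the actions, scenarios, and numbers are hypothetical.

```python
def best_action(payoffs, scenario_probs):
    """payoffs[action] -> list of payoffs, one per forecast scenario.
    Returns the action maximizing the probability-weighted payoff."""
    def expected(action):
        return sum(p * v for p, v in zip(scenario_probs, payoffs[action]))
    return max(payoffs, key=expected)

payoffs = {                      # hypothetical profits (EUR) per scenario
    "bid_high": [120.0, -30.0],  # pays off only if demand turns out high
    "bid_low":  [60.0, 40.0],    # safer in both scenarios
}
# Forecasts make the low-demand scenario more likely (p = 0.6).
choice = best_action(payoffs, [0.4, 0.6])
```

Here the expected payoffs are 30 for "bid_high" and 48 for "bid_low", so the agent chooses the safer bid; in the paper, the scenarios themselves come from forecasts of competitors' behavior.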
Abstract:
This paper analyzes Portuguese short-run business cycles over the last 150 years and uses multidimensional scaling (MDS) to visualize the results. The analytical and numerical assessment of this long-run perspective reveals periods with close connections between the macroeconomic variables related to government accounts equilibrium, balance of payments equilibrium, and economic growth. The MDS method is adopted for a quantitative statistical analysis, so that similarity clusters of several historical periods emerge in the MDS maps, identifying the similarities and dissimilarities that distinguish periods of prosperity and crisis, growth and stagnation. Such features are major aspects of collective national achievement, with which can be associated the impact of international problems such as the World Wars, the Great Depression, or the current global financial crisis, as well as national events in the context of broad political blueprints for Portuguese society in the rising globalization process.
Abstract:
This paper addresses the impact of the CO2 opportunity cost on the wholesale electricity price in the context of the Iberian electricity market (MIBEL), namely on the Portuguese system, for the period corresponding to Phase II of the European Union Emission Trading Scheme (EU ETS). In the econometric analysis, a vector error correction model (VECM) is specified to estimate both long-run equilibrium relations and short-run interactions between the electricity price and the fuel (natural gas and coal) and carbon prices. The model is estimated using daily spot market prices, and the four commodity prices are jointly modelled as endogenous variables. Moreover, a set of exogenous variables is incorporated to account for electricity demand conditions (temperature) and the electricity generation mix (quantity of electricity traded according to the technology used). The outcomes for the Portuguese electricity system suggest that the dynamic pass-through of carbon prices into electricity prices is strongly significant, and the estimated long-run elasticity (equilibrium relation) is aligned with studies conducted for other markets.
Abstract:
WorldFIP is standardised as European Norm EN 50170 - General Purpose Field Communication System. Field communication systems (fieldbuses) started to be widely used as the communication support for distributed computer-controlled systems (DCCS), and are being used in all sorts of process control and manufacturing applications within different types of industries. There are several advantages in using fieldbuses as a replacement for the traditional point-to-point links between sensors/actuators and computer-based control systems. Some are economic (cable savings) but, more importantly, fieldbuses allow an increased decentralisation and distribution of processing power over the field. Typically, DCCS have real-time requirements that must be fulfilled: process data must be transferred between network computing nodes within a maximum admissible time span. WorldFIP has very interesting mechanisms to schedule data transfers; it explicitly distinguishes two types of traffic: periodic and aperiodic. In this paper we describe how WorldFIP handles these two types of traffic and, more importantly, we provide a comprehensive analysis for guaranteeing the real-time requirements of both. A major contribution is made in the analysis of the worst-case response time of aperiodic transfer requests.
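The flavor of a worst-case response-time bound for aperiodic traffic can be conveyed with a deliberately simplified model, not the paper's actual analysis: assume the bus arbitrator grants aperiodic transfers only a fixed window at the end of each macrocycle (the cycle and window lengths below are hypothetical), and bound the response time of a request that arrives at the worst possible instant.

```python
import math

def worst_case_response(cycle_ms, window_ms, request_ms):
    """Safe upper bound on the response time of an aperiodic transfer
    needing `request_ms` of bus time, when aperiodic traffic is served
    only in a `window_ms` slot per macrocycle of `cycle_ms`.

    Worst case: the request arrives just after a window closes, then
    consumes one window per cycle until fully served, so it completes
    within ceil(request / window) macrocycles."""
    cycles_needed = math.ceil(request_ms / window_ms)
    return cycles_needed * cycle_ms

# Hypothetical numbers: 10 ms macrocycle, 2 ms aperiodic window,
# 5 ms transfer -> served across 3 windows, bounded by 3 cycles.
bound = worst_case_response(10, 2, 5)
```

Deadlines can then be checked against this bound: the transfer meets a deadline D if the returned value does not exceed D.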
Abstract:
LLF (Least Laxity First) scheduling, which assigns a higher priority to a task with smaller laxity, has been known as an optimal preemptive scheduling algorithm on a single-processor platform. However, its characteristics upon multiprocessor platforms have been little studied until now. Orthogonally, it has remained open how to efficiently schedule general task systems, including constrained-deadline task systems, upon multiprocessors. Recent studies have introduced the zero-laxity (ZL) policy, which assigns a higher priority to a task with zero laxity, as a promising scheduling approach for such systems (e.g., EDZL). Towards understanding the importance of laxity in multiprocessor scheduling, this paper investigates the characteristics of the ZL policy and presents the first ZL schedulability test for any work-conserving scheduling algorithm that employs this policy. It then investigates the characteristics of LLF scheduling, which also employs the ZL policy, and derives the first LLF-specific schedulability test on multiprocessors. It is shown that the proposed LLF test dominates the ZL test as well as the state-of-the-art EDZL test.
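The laxity notion at the heart of both LLF and the ZL policy is simple to state in code. The sketch below shows laxity and an EDZL-style priority ordering at a single instant (zero-laxity tasks promoted above the EDF order); it is an illustrative snapshot, not a schedulability test, and the task set is hypothetical.

```python
def laxity(task, now):
    """Laxity = time until the deadline minus remaining execution time.
    A task with zero laxity must run immediately to meet its deadline."""
    return task["deadline"] - now - task["remaining"]

def edzl_order(tasks, now):
    """EDZL-style ordering at one instant: tasks whose laxity has hit
    zero (or below) get top priority; the rest are ordered by earliest
    deadline first. Python's sort is stable, and False sorts before
    True, so zero-laxity tasks come out first."""
    return sorted(tasks, key=lambda t: (laxity(t, now) > 0, t["deadline"]))

tasks = [
    {"name": "a", "deadline": 10, "remaining": 2},  # laxity 4 at t = 4
    {"name": "b", "deadline": 12, "remaining": 8},  # laxity 0 at t = 4
]
order = edzl_order(tasks, now=4)  # "b" is promoted despite its later deadline
```

Under plain EDF, task "a" would run first here and "b" would miss its deadline; the zero-laxity promotion is exactly what EDZL adds.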
Abstract:
Glioma is the most frequent form of malignant brain tumor in adults and children. There is a global tendency toward a higher incidence of gliomas in highly developed and industrialized countries. Simultaneously, obesity is reaching epidemic proportions in these countries, and it is widely accepted that obesity may play an important role in the biology of several types of cancer. We have developed an in vitro method for understanding the influence of obesity on mouse glioma cells (Gl261). 3T3-L1 mouse pre-adipocytes were induced to maturity, and the conditioned medium was harvested and added to the Gl261 cultures. Two-dimensional electrophoresis was used to analyze the proteome content of Gl261 in the presence of conditioned medium (CGl) and in its absence (NCGl). The differentially expressed spots were collected and analyzed by mass spectrometry (MALDI-TOF-MS). Significant expression-pattern changes were observed in eleven proteins and enzymes. RFC1, KIF5C, ANXA2, N-RAP, RACK1 and citrate synthase were overexpressed or only present in CGl. Conversely, STI1, hnRNPs and phosphoglycerate kinase 1 were significantly underexpressed in CGl. Aldose reductase and carbonic anhydrase were expressed only in NCGl. Our results show that obesity remodels the physiological and metabolic behavior of glioma cancer cells. Moreover, the differentially expressed proteins are implicated in several signaling pathways that control matrix remodeling, proliferation, progression, migration and invasion. In general, our results support the idea that obesity may increase glioma malignancy; however, some interesting paradoxical findings were also observed and are discussed.
Abstract:
“Many-core” systems based on a Network-on-Chip (NoC) architecture offer various opportunities in terms of performance and computing capabilities, but at the same time they pose many challenges for the deployment of real-time systems, which must fulfill specific timing requirements at runtime. It is therefore essential to identify, at design time, the parameters that have an impact on the execution time of the tasks deployed on these systems and the upper bounds on the other key parameters. The focus of this work is to determine an upper bound on the traversal time of a packet when it is transmitted over the NoC infrastructure. Towards this aim, we first identify and explore some limitations in the existing recursive-calculus-based approaches to compute the Worst-Case Traversal Time (WCTT) of a packet. Then, we extend the existing model by integrating the characteristics of the tasks that generate the packets. For this extended model, we propose an algorithm called “Branch and Prune” (BP). Our proposed method provides tighter, yet still safe, estimates compared to the existing recursive-calculus-based approaches. Finally, we introduce a more general approach, namely “Branch, Prune and Collapse” (BPC), which offers a configurable parameter that provides a flexible trade-off between the computational complexity and the tightness of the computed estimate. The recursive-calculus methods and BP are two special cases of BPC, obtained when the trade-off parameter is set to 1 or ∞, respectively. Through simulations, we analyze this trade-off, reason about the implications of certain choices, and also provide some case studies to observe the impact of task parameters on the WCTT estimates.
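The recursive-calculus style of WCTT bound that the paper starts from can be caricatured in a few lines: a packet's traversal time is bounded by its contention-free latency plus, for every flow that shares a link with it, that flow's own worst-case traversal time (during which it may hold the shared link). The sketch below is illustrative only, with hypothetical flows and latencies, and assumes the interference relation is acyclic; its pessimism is precisely what tighter methods like BP/BPC attack.

```python
def wctt_bound(flow, base, interferers):
    """Recursive-calculus-style upper bound on a packet's traversal time.
    base[f]: contention-free latency of flow f (cycles).
    interferers[f]: flows sharing a link with f that can delay it.
    Assumes the interference relation is acyclic (simplification)."""
    return base[flow] + sum(wctt_bound(g, base, interferers)
                            for g in interferers[flow])

# Hypothetical flow set: f1 contends with f2 and f3; f2 contends with f3.
base = {"f1": 5, "f2": 3, "f3": 2}
interferers = {"f1": ["f2", "f3"], "f2": ["f3"], "f3": []}
```

Note how f3's latency is charged to f1 twice, once directly and once through f2: this kind of double counting is one source of the looseness the extended, task-aware model reduces.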
Abstract:
In today’s healthcare paradigm, optimal sedation during anesthesia plays an important role both in patient welfare and in the socio-economic context. For the closed-loop control of general anesthesia, two drugs have proven to have stable, rapid onset times: propofol and remifentanil. Their effect is quantified by the bispectral index, a measure derived from the EEG signal. In this paper, wavelet time-frequency analysis is used to extract useful information from the clinical signals, since they are time-varying and mark important changes in the patient’s response to drug dose. Model-based predictive control algorithms are employed to regulate the depth of sedation by manipulating these two drugs. The results of identification from real data and the simulation of the closed-loop control performance suggest that the proposed approach can bring an improvement of 9% in overall robustness and may be suitable for clinical practice.
Abstract:
The last decade has witnessed a major shift towards the deployment of embedded applications on multi-core platforms. However, real-time applications have not been able to fully benefit from this transition, as the computational gains offered by multi-cores are often offset by performance degradation due to shared resources, such as main memory. To efficiently use multi-core platforms for real-time systems, it is hence essential to tightly bound the interference when accessing shared resources. Although there has been much recent work in this area, a remaining key problem is to address the diversity of memory arbiters in the analysis to make it applicable to a wide range of systems. This work handles diverse arbiters by proposing a general framework to compute the maximum interference caused by the shared memory bus and its impact on the execution time of the tasks running on the cores, considering different bus arbiters. Our novel approach clearly demarcates the arbiter-dependent and independent stages in the analysis of these upper bounds. The arbiter-dependent phase takes the arbiter and the task memory-traffic pattern as inputs and produces a model of the availability of the bus to a given task. Then, based on the availability of the bus, the arbiter-independent phase determines the worst-case request-release scenario that maximizes the interference experienced by the tasks due to the contention for the bus. We show that the framework addresses the diversity problem by applying it to a memory bus shared by a fixed-priority arbiter, a time-division multiplexing (TDM) arbiter, and an unspecified work-conserving arbiter, using applications from the MediaBench test suite. We also experimentally evaluate the quality of the analysis by comparing it with a state-of-the-art TDM analysis approach, consistently showing a considerable reduction in maximum interference.
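For the TDM case mentioned above, the intuition behind a memory-interference bound fits in a few lines. The sketch below is a deliberately coarse illustration, not the paper's arbiter-dependent/independent framework: under round-robin TDM each request that just misses its core's slot waits for all other cores' slots, so per-request delay is bounded by one full TDM round (core count and slot length below are hypothetical).

```python
def tdm_interference_bound(num_cores, slot_len, num_requests):
    """Coarse worst-case memory interference for one task under a
    round-robin TDM bus arbiter: a request arriving just after its
    core's slot ends waits at most one full round (num_cores slots)
    before being served, so total interference is bounded by
    num_requests * num_cores * slot_len time units."""
    per_request_delay = num_cores * slot_len
    return num_requests * per_request_delay

# Hypothetical platform: 4 cores, 10-cycle TDM slots, a task issuing
# 3 memory requests in the analyzed window.
bound = tdm_interference_bound(num_cores=4, slot_len=10, num_requests=3)
```

Such a bound is safe but ignores the task's actual traffic pattern; modeling bus availability per task, as the framework does, is what makes the resulting estimates considerably tighter.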