18 results for Emission permits auctions; double auctions.

in Helda - Digital Repository of the University of Helsinki


Relevance:

20.00%

Publisher:

Abstract:

This study concentrates on the contested concept of pastiche in literary studies. It offers the first detailed examination of the history of the concept from its origins in the seventeenth century to the present, showing how pastiche emerged as a critical concept in interaction with the emerging conception of authorial originality and the copyright laws protecting it. One of the key results of this investigation is the contextualisation of the postmodern debate on pastiche. Even though postmodern critics often emphasise the radical novelty of pastiche, they in fact resuscitate older positions and arguments without necessarily reflecting on their historical conditions. This historical background is then used to analyse the distinction between the primarily French conception of pastiche as the imitation of style and the postmodern notion of it as the compilation of different elements. The latter's vagueness and inclusiveness detracts from its value as a critical concept. The study thus concentrates on the notion of stylistic pastiche, challenging the widespread prejudice that it is merely an indication of lack of talent. Because it is multiply based on repetition, pastiche is in fact a highly ambiguous or double-edged practice that calls into question the distinction between repetition and the original, thereby undermining the received notion of individual, unique authorship as a fundamental aesthetic value. Pastiche does not, however, constitute a radical upheaval of the basic assumptions on which the present institution of literature relies, since, in order to mark its difference, pastiche always refers to a source outside itself against which its difference is measured. Finally, the theoretical analysis of pastiche is applied to literary works. The pastiches written by Marcel Proust demonstrate how it can become an integral part of a writer's poetics: imitation of style is shown to provide Proust with a way of exploring the role of style as a connecting point between inner vision and reality. The pastiches of the Sherlock Holmes stories by Michael Dibdin, Nicholas Meyer and the duo Adrian Conan Doyle and John Dickson Carr illustrate the functions of pastiche within a genre, detective fiction, that is itself fundamentally repetitive. A.S. Byatt's Possession and D.M. Thomas's Charlotte use Victorian pastiches to investigate the conditions of literary creation in the age of postmodern suspicion of creativity and individuality. The study thus argues that the concept of pastiche has valuable insights to offer to literary criticism and theory, and that literary pastiches, though often dismissed in reviews and criticism, are a particularly interesting object of study precisely because of their characteristic ambiguity.

Relevance:

20.00%

Publisher:

Abstract:

The study focuses on the potential role of the brick making industries in Sudan in deforestation and greenhouse gas emissions due to the consumption of biofuels. The results are based on observations of 25 brick making industries from three administrative regions in Sudan, namely Khartoum, Kassala and Gezira. The methodological approach followed the procedures outlined by the Intergovernmental Panel on Climate Change (IPCC). To predict a worst-case deforestation scenario, it was also assumed that all of the wood used for this purpose comes from unsustainable sources. The study revealed that the total annual quantity of fuelwood consumed by the 25 surveyed brick making industries was 2,381 t of dry matter. Accordingly, the total potential volume of deforested wood was 10,624 m3, of which 3,664 m3 was round wood and 6,961 m3 branches. A total of 2,990 t of biomass fuels (fuelwood and dung cake) was consumed annually by the surveyed brick making industries for brick burning. Consequently, the estimated total annual emissions of greenhouse gases were 4,832 t CO2, 21 t CH4, 184 t CO, 0.15 t N2O, 5 t NOx and 3.5 t NO, while the total carbon released into the atmosphere was 1,318 t. Altogether, total annual greenhouse gas emissions from biomass fuel burning were 5,046 t, of which 4,104 t came from fuelwood and 943 t from dung cake burning. According to the results, due to the consumption of fuelwood in the brick making industries of Sudan (3,450 units), the amount of wood lost annually from the total growing stock of wood in forests and trees in Sudan would be 1,466,000 m3, comprising 505,000 m3 of round wood and 961,000 m3 of branches. Considering all categories of biofuels (fuelwood and dung cake), it was estimated that the total annual emissions from all the brick making industries of Sudan would be 663,000 t CO2, 2,900 t CH4, 25,300 t CO, 20 t N2O, 720 t NOx and 470 t NO, while the total carbon released into the atmosphere would be 181,000 t annually.
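
The national figures above follow from scaling the survey totals for the 25 observed industries up to all 3,450 brick making industries. A minimal sketch of that proportional scale-up, assuming simple linear scaling (which approximately reproduces the reported national totals), is shown below; the numbers are the survey totals quoted in the abstract.

```python
# Proportional scale-up of survey totals to the national level, assuming all
# 3,450 brick making industries behave like the 25 surveyed ones.

SURVEYED_UNITS = 25
NATIONAL_UNITS = 3_450
scale = NATIONAL_UNITS / SURVEYED_UNITS  # = 138

survey_totals = {
    "deforested wood (m3)": 10_624,
    "round wood (m3)": 3_664,
    "branches (m3)": 6_961,
    "CO2 (t)": 4_832,
    "CH4 (t)": 21,
    "CO (t)": 184,
}

national_estimates = {name: value * scale for name, value in survey_totals.items()}

for name, value in national_estimates.items():
    print(f"{name}: {value:,.0f} per annum")
# e.g. deforested wood: 1,466,112 m3, close to the 1,466,000 m3 reported above.
```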

Relevance:

20.00%

Publisher:

Abstract:

This thesis studies the informational efficiency of the European Union emission allowance (EUA) market. In an efficient market, the market price is unpredictable and above-average profits are impossible in the long run. The main research question is whether the EUA price follows a random walk. The method is an econometric analysis of the price series, which includes an autocorrelation coefficient test and a variance ratio test. The results reveal that the price series is autocorrelated and therefore does not follow a random walk. In order to determine the extent of the predictability, the price series is modelled with an autoregressive model. The conclusion is that the EUA price is autocorrelated only to a small degree and that the predictability cannot be exploited to make extra profits. The EUA market is therefore considered informationally efficient, although the price series does not fulfill the requirements of a random walk. A market review supports this conclusion, but it is clear that the market is still maturing.
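
As a rough illustration of the kind of diagnostics described, the sketch below computes the lag-1 autocorrelation of log returns and a Lo-MacKinlay-style variance ratio on an arbitrary price series. These are standard textbook formulations, not necessarily the exact test specifications used in the thesis, and the data here is simulated rather than EUA prices.

```python
import numpy as np

def random_walk_diagnostics(prices, q=5):
    """Lag-1 autocorrelation and variance ratio of log returns.

    Under a random walk the autocorrelation is ~0 and VR(q) is ~1.
    """
    r = np.diff(np.log(prices))          # log returns
    r = r - r.mean()
    acf1 = (r[1:] @ r[:-1]) / (r @ r)    # lag-1 autocorrelation

    # Variance of overlapping q-period returns over q times the 1-period variance
    # (Lo-MacKinlay style, without the small-sample bias correction).
    rq = np.convolve(r, np.ones(q), mode="valid")
    vr = rq.var() / (q * r.var())
    return acf1, vr

# Illustration on simulated data only:
rng = np.random.default_rng(0)
fake_prices = np.exp(np.cumsum(rng.normal(0, 0.02, 1000)))
print(random_walk_diagnostics(fake_prices))
```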

Relevance:

20.00%

Publisher:

Abstract:

There exist various proposals for building a functional, fault-tolerant large-scale quantum computer. Topological quantum computation is a more exotic proposal, which makes use of the properties of quasiparticles that manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom, which, in principle, can be used to execute quantum computation with intrinsic fault tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and carry quantum numbers, both of which are of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique, but that the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner that is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the presented general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, none of the other, more complicated fusion subalgebras were considered. Studying their applicability for quantum computation could be a topic of further research.
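
To illustrate what a non-trivial fusion algebra means in practice, the sketch below uses the well-known Fibonacci anyon model (fusion rule tau x tau = 1 + tau) rather than the D(S_3) quantum double model derived in the thesis, which has a larger particle spectrum. The point it demonstrates is general: because fusion outcomes are not unique, the dimension of the fusion space grows with the number of anyons, and that growing space is where quantum information can be stored non-locally.

```python
# Illustrative only: Fibonacci anyons, not the S_3 quantum double model of the thesis.
# Labels: "1" (vacuum) and "t" (tau). Fusion rules:
#   1 x a = a,   t x t = 1 + t   (non-unique outcome -> non-trivial fusion algebra)
FUSION = {
    ("1", "1"): {"1": 1},
    ("1", "t"): {"t": 1},
    ("t", "1"): {"t": 1},
    ("t", "t"): {"1": 1, "t": 1},
}

def fuse_many(n):
    """Multiplicities of the total charge after fusing n tau anyons one by one."""
    state = {"t": 1} if n >= 1 else {"1": 1}
    for _ in range(n - 1):
        new = {}
        for charge, mult in state.items():
            for outcome, m in FUSION[(charge, "t")].items():
                new[outcome] = new.get(outcome, 0) + mult * m
        state = new
    return state

for n in range(1, 8):
    dims = fuse_many(n)
    # dimension of the fusion (computational) space with trivial total charge:
    print(n, dims.get("1", 0))   # 0, 1, 1, 2, 3, 5, 8 -> Fibonacci growth
```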

Relevance:

20.00%

Publisher:

Abstract:

A large share of public services and goods is contracted through procurement auctions, so it is very important to design these auctions in an optimal way. Typically, we are interested in two different objectives. The first objective is efficiency. Efficiency means that the contract is awarded to the bidder that values it the most, which in the procurement setting means the bidder with the lowest cost of providing a service of a given quality. The second objective is to maximize public revenue. Maximizing public revenue means minimizing the costs of procurement. Both of these goals are important from the welfare point of view. In this thesis, I analyze field data from procurement auctions and show how empirical analysis can be used to help design auctions that maximize public revenue. In particular, I concentrate on how competition, meaning the number of bidders, should be taken into account in the design of auctions. In the first chapter, the main policy question is whether the auctioneer should spend resources to induce more competition. The information paradigm is essential in analyzing the effects of competition. We speak of a private values information paradigm when the bidders know their valuations exactly. In a common values information paradigm, the information about the value of the object is dispersed among the bidders. With private values, more competition always increases public revenue, but with common values the effect of competition is uncertain. I study the effects of competition in the City of Helsinki bus transit market by conducting tests for common values. I also extend an existing test by allowing bidder asymmetry. The information paradigm seems to be that of common values. The bus companies that have garages close to the contracted routes are influenced more by the common value elements than those whose garages are further away. Therefore, attracting more bidders does not necessarily lower procurement costs, and thus the City should not implement costly policies to induce more competition. In the second chapter, I ask how the auctioneer can increase its revenue by changing contract characteristics such as contract sizes and durations. I find that the City of Helsinki should shorten the contract duration in the bus transit auctions, because that would decrease the importance of the common value components and cheaply increase entry, which would then have a more beneficial impact on public revenue. Typically, cartels decrease public revenue in a significant way. In the third chapter, I propose a new statistical method for detecting collusion and compare it with an existing test. I argue that my test is robust to unobserved heterogeneity, unlike the existing test. I apply both methods to procurement auctions for snow removal at schools in Helsinki. According to these tests, the bidding behavior of two of the bidders seems consistent with a contract allocation scheme.
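
The reason extra competition can fail to lower procurement costs under common values is the winner's curse. The sketch below is a stylized simulation of that mechanism only; it is not the empirical test used in the thesis, and the cost distribution and signal noise are arbitrary example values. It shows that the lowest signal underestimates the true common cost by more as the number of bidders grows, so rational bidders must mark up their bids more aggressively when facing more rivals.

```python
import numpy as np

# Stylized winner's-curse illustration. Common cost C of serving a route;
# each bidder observes a noisy signal of C. The bidder with the lowest signal
# tends to win, and that signal underestimates the true cost by more as the
# number of bidders grows.
rng = np.random.default_rng(1)

def expected_underestimate(n_bidders, sims=20_000, noise_sd=1.0):
    cost = rng.normal(10.0, 1.0, sims)                        # common cost draws
    signals = cost[:, None] + rng.normal(0, noise_sd, (sims, n_bidders))
    winner_signal = signals.min(axis=1)                       # naive lowest "bid"
    return float(np.mean(cost - winner_signal))               # size of the curse

for n in (2, 4, 8, 16):
    print(n, round(expected_underestimate(n), 2))
# The underestimate grows with n, so extra entry does not mechanically lower
# procurement costs once bidders correct for the winner's curse.
```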

Relevance:

20.00%

Publisher:

Abstract:

The information that economic agents have and regard as relevant to their decision making is often assumed to be exogenous in economics. It is assumed that the agents either possess or can observe the payoff-relevant information without having to exert any effort to acquire it. In this thesis we relax the assumption of an ex-ante fixed information structure and study what happens to equilibrium behavior when the agents must also decide what information to acquire and when to acquire it. The thesis addresses this question in two essays on herding and two essays on auction theory. In the first two essays, which are joint work with Klaus Kultti, we study herding models where it is costly to acquire information on the actions that the preceding agents have taken. In our model the agents have to decide both the action that they take and, additionally, the information that they want to acquire by observing their predecessors. We characterize the equilibrium behavior when the decision to observe preceding agents' actions is endogenous and show how the equilibrium outcome may differ from the standard model, where all preceding agents' actions are assumed to be observable. In the latter part of the thesis we study two dynamic auctions: the English and the Dutch auction. We consider a situation where bidders are uninformed about their valuations for the object that is put up for sale and may acquire this information at a small cost at any point during the auction. We study the case of independent private valuations. In the third essay of the thesis we characterize the equilibrium behavior in an English auction when there are informed and uninformed bidders. We show that the informed bidder may jump bid and signal to the uninformed that he has a high valuation, thus deterring the uninformed from acquiring information and staying in the auction. The uninformed bidder optimally acquires information once the price has passed a particular threshold and the informed bidder has not signalled that his valuation is high. In addition, we provide an example of an information structure in which the informed bidder initially waits and then makes multiple jumps. In the fourth essay of the thesis we study the Dutch auction. We consider two cases where all bidders are initially uninformed. In the first case the information acquisition cost is the same across all bidders, and in the second the cost of information acquisition is also independently distributed and private information to the bidders. We characterize a mixed strategy equilibrium in the first case and a pure strategy equilibrium in the second. In addition, we provide a conjecture of an equilibrium in an asymmetric situation where there is one informed and one uninformed bidder. We compare the revenues that the first price auction and the Dutch auction generate and find that under some circumstances the Dutch auction outperforms the first price sealed bid auction. The usual first price sealed bid auction and the Dutch auction are strategically equivalent; however, this equivalence breaks down when information can be acquired during the auction.
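
The threshold-price logic for the uninformed bidder can be seen in a toy calculation. The assumptions below (a uniform private value on [0,1] and a fixed acquisition cost) are mine for illustration and are not the model of the essay; they only show why paying to learn one's valuation becomes worthwhile once the price has risen far enough.

```python
import numpy as np

# Toy illustration (assumptions are illustrative, not the thesis's model):
# private value v ~ U[0,1]; information can be bought at cost c at current price p.
# Option A: stay uninformed and buy at p       -> payoff E[v] - p
# Option B: walk away                           -> payoff 0
# Option C: pay c, learn v, buy only if v > p   -> payoff E[max(v - p, 0)] - c

def value_of_information(p):
    uninformed_best = max(0.5 - p, 0.0)        # best of options A and B
    informed = 0.5 * (1.0 - p) ** 2            # E[max(v - p, 0)] for v ~ U[0,1]
    return informed - uninformed_best

c = 0.05                                       # hypothetical acquisition cost
for p in np.linspace(0.0, 1.0, 11):
    acquire = value_of_information(p) > c
    print(f"price {p:.1f}: value of info {value_of_information(p):.3f}"
          f" -> {'acquire' if acquire else 'stay uninformed'}")
# With c = 0.05 the bidder only starts acquiring information once the price
# has passed a threshold (around p ~ 0.32 here), echoing the threshold
# behaviour described above.
```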

Relevance:

20.00%

Publisher:

Abstract:

Heart failure is a common and highly challenging medical disorder. The progressive increase in the elderly population is expected to be reflected in heart failure incidence. Recent progress in cell transplantation therapy has provided a conceptual alternative for the treatment of heart failure. Despite improved medical treatment and operative possibilities, end-stage coronary artery disease presents a great medical challenge. It has been estimated that therapeutic angiogenesis would be the next major advance in the treatment of ischaemic heart disease. Gene transfer to augment neovascularization could be beneficial for such patients. We employed a porcine model to evaluate the angiogenic effect of vascular endothelial growth factor (VEGF)-C gene transfer. Ameroid-generated myocardial ischemia was produced, and adenoviral (ad)VEGF-C or β-galactosidase (adLacZ) gene therapy was given intramyocardially during progressive coronary stenosis. Angiography, positron emission tomography (PET), single photon emission computed tomography (SPECT) and histology evidenced beneficial effects of adVEGF-C gene transfer compared to adLacZ. The myocardial deterioration seen in the control group during progressive coronary stenosis was restrained in the treatment group. We observed an uneven occlusion rate of the coronary vessels with the Ameroid constrictor. We therefore developed a simple methodological improvement of the Ameroid model by ligating the Ameroid-stenosed coronary vessel. The improvement was seen as a more reliable occlusion rate of the vessel concerned and the formation of a fairly constant myocardial infarction. We assessed the spontaneous healing of the left ventricle (LV) in this new model by SPECT, PET, MRI and angiography. Significant spontaneous improvement of myocardial perfusion and function was seen, as well as a diminishment of scar volume. Histologically, more microvessels were seen in the border area of the lesion. Double staining of myocytes in mitosis indicated more cardiomyocyte regeneration in the remote area of the lesion. The potential of autologous myoblast transplantation after ischaemia and infarction of the porcine heart was evaluated. After ligation of the stenosed coronary artery, autologous myoblast transplantation or control medium was injected directly into the myocardium at the lesion area. Assessed by MRI, improvement of diastolic function was seen in the myoblast-transplanted animals but not in the control animals. Systolic function remained unchanged in both groups.

Relevance:

20.00%

Publisher:

Abstract:

Interstellar clouds are not featureless, but show quite complex internal structures of filaments and clumps when observed with high enough resolution. These structures have been generated by 1) turbulent motions driven mainly by supernovae, 2) magnetic fields acting on the ions and, through neutral-ion collisions, on the neutral gas as well, and 3) self-gravity pulling a dense clump together to form a new star. The study of cloud structure gives us information on the relative importance of each of these mechanisms and helps us gain a better understanding of the details of the star formation process. Interstellar dust is often used as a tracer for the interstellar gas, which forms the bulk of the interstellar matter. Some of the methods used to derive the column density are summarized in this thesis. A new method, which uses scattered light to map the column density in large fields with high spatial resolution, is introduced. The thesis also examines grain alignment with respect to magnetic fields. The aligned grains give rise to the polarization of starlight and of dust emission, thus revealing the magnetic field. The alignment mechanisms have been debated for the last half century. The strongest candidate at present is the radiative torques mechanism. In the first four papers included in this thesis, the scattered light method of column density estimation is formulated, tested in simulations, and finally used to obtain a column density map from observations. They demonstrate that the scattered light method is a very useful and reliable tool in column density estimation, and that it is able to provide higher resolution than the near-infrared color excess method. These two methods are complementary. The derived column density maps are also used to gain information on the dust emissivity within the observed cloud. The two final papers present simulations of polarized thermal dust emission assuming that the alignment happens by the radiative torques mechanism. We show that radiative torques can explain the observed decline of the polarization degree towards dense cores. Furthermore, the results indicate that the dense cores themselves might not contribute significantly to the polarized signal, and hence one needs to be careful when interpreting the observations and deriving the magnetic field.
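
For context on how dust-based measurements are turned into gas column densities, the snippet below applies the commonly quoted average gas-to-extinction ratio for diffuse Milky Way sightlines. This is a standard literature value used here purely as background; it is not a result of the thesis, and the scattered-light method developed in the thesis is not reproduced.

```python
# Standard dust-to-gas conversion, shown only as context for the column density
# discussion above. N_H / A_V ~ 1.9e21 cm^-2 mag^-1 is a commonly quoted average.

N_H_PER_AV = 1.9e21   # hydrogen nuclei per cm^2 per magnitude of visual extinction

def column_density_from_extinction(a_v_mag):
    """Approximate total hydrogen column density from visual extinction A_V."""
    return N_H_PER_AV * a_v_mag

for a_v in (1.0, 5.0, 10.0):
    print(f"A_V = {a_v:4.1f} mag  ->  N_H ~ {column_density_from_extinction(a_v):.2e} cm^-2")
```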

Relevance:

20.00%

Publisher:

Abstract:

Emissions of coal combustion fly ash through full-scale electrostatic precipitators (ESP) were studied under different coal combustion and operating conditions. Sub-micron fly-ash aerosol emissions from a power plant boiler and from the ESP were determined, and from these the aerosol penetration, based on electrical mobility measurements, giving an estimate of the size range and the maximum extent to which small particles can escape. The experiments indicate a maximum penetration of 4% to 20% of the small particles, counted on a number basis instead of the normally used mass basis, while the ESP is simultaneously operating at nearly 100% collection efficiency on a mass basis. Although the size range itself appears independent of the coal, the boiler and even the device used for emission control, the maximum penetration level on a number basis depends on the ESP operating parameters. The measured emissions were stable during stable boiler operation for a given fired coal, yet differed between coals, indicating that the sub-micron size distribution of the fly ash could be used as a specific characteristic for recognition, for instance for authentication, given an indication of known stable operation. Consequently, the results on the emissions suggest an optimum particle size range for environmental monitoring with respect to the probability of finding traces in the samples. The current work also embodies an authentication system for aerosol samples for post-inspection from any macroscopic sample piece. The system can comprise newly introduced devices, used independently or in combination with each other, arranged to extend the sampling operation length and/or the tag selection diversity. The tag for the samples can be based on naturally occurring and/or added measures of authenticity in a suitable combination. The method involves not only military-related applications but also applications in civil industries. Besides aerosol samples, the system can be applied to ink for banknote printing or other papers of monetary value, and also to filter manufacturing for marking fibrous filters.
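
The distinction between number-based and mass-based penetration is central here. The sketch below, with made-up lognormal size distributions and a hypothetical size-dependent penetration curve (not the measured data), shows how a few percent of particles escaping by number can correspond to a negligible fraction of the mass, since mass scales with diameter cubed.

```python
import numpy as np

# Illustration with synthetic data: why number-based penetration can be several
# percent while mass-based collection efficiency stays near 100%.
rng = np.random.default_rng(2)

# Upstream fly-ash particle diameters in micrometres, lognormal.
d_up = rng.lognormal(mean=np.log(2.0), sigma=0.8, size=200_000)

# Hypothetical size-dependent penetration: sub-micron particles escape the ESP
# far more easily than large ones.
penetration = np.where(d_up < 1.0, 0.10, 0.001)
escaped = rng.random(d_up.size) < penetration
d_down = d_up[escaped]

number_penetration = d_down.size / d_up.size
mass_penetration = (d_down ** 3).sum() / (d_up ** 3).sum()    # mass ~ d^3

print(f"number-based penetration: {number_penetration:.2%}")
print(f"mass-based penetration:   {mass_penetration:.4%}")    # mass collection ~100%
```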

Relevance:

20.00%

Publisher:

Abstract:

Volatile organic compounds emitted from nature, especially from forests, can affect local and regional air quality because they react in the atmosphere. Their reaction products can also participate in the formation and growth of new particles, which can affect the radiation balance of the atmosphere and thereby the climate. Particles absorb and scatter solar radiation and the Earth's thermal radiation, and in addition they affect the radiative properties, amount and lifetime of clouds. On a global scale, hydrocarbon emissions from nature exceed those caused by human activity many times over. Estimating natural emissions is therefore important for developing effective air quality and climate strategies. This study deals with the hydrocarbon emissions of the boreal forest. The boreal forest, the northern coniferous forest, is the largest terrestrial ecosystem, extending as an almost continuous belt around the entire northern hemisphere. It is characterised by a relatively small variety of tree species and by strong seasonal variations in conditions and growth. The work examines the seasonal variation of the hydrocarbon emissions of Scots pine, the most common boreal tree in Finland, as well as the dependence of the emissions on temperature and light. The results were used, together with other emission measurements made on boreal trees, in an emission model developed for Finnish forests. The model is additionally based on land-use data, a classification developed for Finnish forests and meteorological data, with which it calculates the hydrocarbon emissions of the forests during the growing season. Throughout the growing season, the emissions of Finnish forests consist largely of alpha- and beta-pinene and delta-carene. In summer and autumn the emissions also contain a lot of sabinene, which originates especially from deciduous trees. The emissions follow the average variation of temperature, are largest in the southern parts of the country and decrease steadily towards the north. The isoprene emission of the forest is relatively small: the most important isoprene-emitting tree in Finland is the low-emitting Norway spruce, because high-emitting willow and aspen make up only a very small fraction of the forest leaf mass. This work also presents the first estimate of the sesquiterpene emissions of the forest. Sesquiterpene emissions begin after Midsummer and during the growing season are of the same order of magnitude as isoprene emissions. On an annual basis, the hydrocarbon emissions of Finnish forests are about twice as large as those caused by human activity.
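
The emission model described above is of the type in which monoterpene emission is driven by temperature (and isoprene additionally by light). A minimal sketch of the standard temperature-dependent (Guenther-type) algorithm widely used in such inventories is given below; the emission potential and coefficients are generic example values, and the exact formulation fitted in the thesis may differ.

```python
import math

# Standard temperature-dependent monoterpene emission algorithm, commonly used
# in forest emission inventories. E_s, BETA and the standard temperature are
# generic example values, not the thesis's fitted parameters.

BETA = 0.09          # K^-1, typical temperature sensitivity for monoterpenes
T_STANDARD = 303.15  # K (30 degrees C), conventional standard temperature

def monoterpene_emission(e_standard, temperature_k):
    """Emission rate at temperature_k, given the standard-condition rate e_standard."""
    return e_standard * math.exp(BETA * (temperature_k - T_STANDARD))

# Example: a shoot with a standard emission potential of 1.0 ug gdw^-1 h^-1,
# on a cool (283 K) versus a warm (298 K) day.
for t in (283.15, 298.15):
    print(t, round(monoterpene_emission(1.0, t), 3), "ug gdw^-1 h^-1")
```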

Relevance:

20.00%

Publisher:

Abstract:

X-ray Raman scattering and X-ray emission spectroscopies were used to study the electronic properties and phase transitions in several condensed matter systems. The experimental work, carried out at the European Synchrotron Radiation Facility, was complemented by theoretical calculations of the X-ray spectra and of the electronic structure. The electronic structure of MgB2 at the Fermi level is dominated by the boron σ and π bands. The high density of states provided by these bands is the key feature of the electronic structure contributing to the high critical temperature of superconductivity in MgB2. The electronic structure of MgB2 can be modified by atomic substitutions, which introduce extra electrons or holes into the bands. X-ray Raman scattering was used to probe the σ and π band hole states in pure and aluminum-substituted MgB2. A method for determining the final-state density of electron states from experimental X-ray Raman scattering spectra was examined and applied to the experimental data on both pure MgB2 and Mg(0.83)Al(0.17)B2. The extracted final-state density of electron states for the pure and aluminum-substituted samples revealed clear substitution-induced changes in the σ and π bands. The experimental work was supported by theoretical calculations of the electronic structure and the X-ray Raman spectra. X-ray emission at the metal Kβ line was applied to studies of pressure- and temperature-induced spin-state transitions in transition metal oxides. The experimental studies were complemented by cluster multiplet calculations of the electronic structure and the emission spectra. In LaCoO3, evidence for the appearance of an intermediate spin state was found and the presence of a pressure-induced spin transition was confirmed. Pressure-induced changes in the electronic structure of transition metal monoxides were studied experimentally and analyzed using the cluster multiplet approach. The effects of hybridization, bandwidth and crystal field splitting in stabilizing the high-pressure spin state are discussed. Emission spectroscopy at the Kβ line was also applied to FeCO3, and a pressure-induced iron spin-state transition was discovered.

Relevance:

20.00%

Publisher:

Abstract:

The likelihood ratio test of cointegration rank is the most widely used test for cointegration. Many studies have shown that its finite sample distribution is not well approximated by the limiting distribution. This article introduces bootstrap and fast double bootstrap (FDB) algorithms for the likelihood ratio test and evaluates them by Monte Carlo simulation experiments. It finds that the performance of the bootstrap test is very good. The more sophisticated FDB produces a further improvement in cases where the performance of the asymptotic test is very unsatisfactory and the ordinary bootstrap does not work as well as it might. Furthermore, the Monte Carlo simulations provide a number of guidelines on when the bootstrap and FDB tests can be expected to work well. Finally, the tests are applied to US interest rate and international stock price series. It is found that the asymptotic test tends to overestimate the cointegration rank, while the bootstrap and FDB tests choose the correct cointegration rank.
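
A schematic of the fast double bootstrap p-value computation for a right-tailed test is sketched below. The functions compute_statistic and resample_under_null are placeholders introduced here for illustration; in the cointegration rank setting they would correspond to the LR trace statistic and to resampling from a VECM estimated under the null rank, as in the article's algorithms.

```python
import numpy as np

def fdb_p_value(data, compute_statistic, resample_under_null, n_boot=999, seed=0):
    """Schematic fast double bootstrap (FDB) p-value for a right-tailed test.

    compute_statistic(data) and resample_under_null(data, rng) are placeholders
    supplied by the user; they are not defined here.
    """
    rng = np.random.default_rng(seed)
    stat = compute_statistic(data)

    first_level = np.empty(n_boot)
    second_level = np.empty(n_boot)
    for j in range(n_boot):
        boot = resample_under_null(data, rng)        # first-level bootstrap sample
        first_level[j] = compute_statistic(boot)
        boot2 = resample_under_null(boot, rng)       # one second-level sample
        second_level[j] = compute_statistic(boot2)

    p_boot = np.mean(first_level > stat)             # ordinary bootstrap p-value
    # FDB correction: compare first-level statistics with the (1 - p_boot)
    # quantile of the second-level statistics instead of the original statistic.
    q = np.quantile(second_level, 1.0 - p_boot)
    p_fdb = np.mean(first_level > q)
    return p_boot, p_fdb
```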