36 results for Quantum efficiency

in Helda - Digital Repository of University of Helsinki


Relevance: 20.00%

Abstract:

Various reasons, such as ethical issues in maintaining blood resources, growing costs, and strict requirements for safe blood, have increased the pressure for efficient use of resources in blood banking. The competence of blood establishments can be characterized by their ability to predict the volume of blood collection to be able to provide cellular blood components in a timely manner as dictated by hospital demand. The stochastically varying clinical need for platelets (PLTs) sets a specific challenge for balancing supply with requests. Labour has been proven to be a primary cost-driver and should be managed efficiently. International comparisons of blood banking could recognize inefficiencies and allow reallocation of resources. Seventeen blood centres from 10 countries in continental Europe, Great Britain, and Scandinavia participated in this study. The centres were national institutes (5), parts of the local Red Cross organisation (5), or integrated into university hospitals (7). This study focused on the departments of blood component preparation of the centres. The data were obtained retrospectively by computerized questionnaires completed via the Internet for the years 2000-2002. The data were used in four original articles (numbered I through IV) that form the basis of this thesis. Non-parametric data envelopment analysis (DEA, II-IV) was applied to evaluate and compare the relative efficiency of blood component preparation. Several models were created using different input and output combinations. The focus of comparisons was on the technical efficiency (II-III) and the labour efficiency (I, IV). An empirical cost model was tested to evaluate the cost efficiency (IV). Purchasing power parities (PPP, IV) were used to adjust the costs of the working hours and to make the costs comparable among countries. The total annual number of whole blood (WB) collections varied from 8,880 to 290,352 in the centres (I).
Significant variation was also observed in the annual volume of produced red blood cells (RBCs) and PLTs. The annual number of PLTs produced by any method varied from 2,788 to 104,622 units. In 2002, 73% of all PLTs were produced by the buffy coat (BC) method, 23% by apheresis and 4% by the platelet-rich plasma (PRP) method. The annual discard rate of PLTs varied from 3.9% to 31%. The mean discard rate (13%) remained in the same range throughout the study period and demonstrated similar levels and variation in 2003-2004 according to a specific follow-up question (14%, range 3.8%-24%). The annual PLT discard rates were, to some extent, associated with production volumes. The mean RBC discard rate was 4.5% (range 0.2%-7.7%). Technical efficiency showed marked variation (median 60%, range 41%-100%) among the centres (II). Compared to the efficient departments, the inefficient departments used excess labour resources (and probably production equipment) to produce RBCs and PLTs. Technical efficiency tended to be higher when the (theoretical) proportion of lost WB collections (total RBC+PLT loss) from all collections was low (III). The labour efficiency varied remarkably, from 25% to 100% (median 47%) when working hours were the only input (IV). Using the estimated total costs as the input (cost efficiency) revealed an even greater variation (13%-100%) and an overall lower efficiency level compared to labour only as the input. In cost efficiency only, the savings potential (observed inefficiency) was more than 50% in 10 departments, whereas labour and cost savings potentials were both more than 50% in six departments. The association between department size and efficiency (scale efficiency) could not be verified statistically in the small sample. In conclusion, international evaluation of the technical efficiency in component preparation departments revealed remarkable variation.
A suboptimal combination of manpower and production output levels was the major cause of inefficiency, and the efficiency did not directly relate to production volume. Evaluation of the reasons for discarding components may offer a novel approach to study efficiency. DEA was proven applicable in analyses including various factors as inputs and outputs. This study suggests that analytical models can be developed to serve as indicators of technical efficiency and promote improvements in the management of limited resources. The work also demonstrates the importance of integrating efficiency analysis into international comparisons of blood banking.
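The DEA comparisons described above can be summarized by the standard input-oriented CCR model; the abstract does not give the exact model specifications used in articles II-IV, so the generic textbook formulation is shown here for orientation:

```latex
\min_{\theta,\,\lambda}\ \theta
\quad \text{subject to} \quad
\sum_{j=1}^{n}\lambda_j x_{ij} \le \theta\, x_{i0},\ \ i = 1,\dots,m;
\qquad
\sum_{j=1}^{n}\lambda_j y_{rj} \ge y_{r0},\ \ r = 1,\dots,s;
\qquad
\lambda_j \ge 0 .
```

Here $x_{ij}$ are the inputs of department $j$ (e.g. working hours or total costs), $y_{rj}$ its outputs (e.g. RBC and PLT units), and department $0$ is the one being evaluated. Efficient departments obtain $\theta = 1$; for inefficient ones, $1-\theta$ is the savings potential, matching the "more than 50%" figures quoted above.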

Relevance: 20.00%

Abstract:

Nitrogen (N) is one of the main inputs in cereal cultivation and, as more than half of the arable land in Finland is used for cereal production, N has contributed substantially to agricultural pollution through fertilizer leaching and runoff. In response to this global problem, the European Community has launched several directives to reduce agricultural emissions to the environment. Through such measures, and by using economic incentives, it is expected that northern European agricultural practices will, in the future, include reduced N fertilizer application rates. Reduced use of N fertilizer is likely to decrease both production costs and pollution, but could also result in reduced yields and quality if crops experience temporary N deficiency. Therefore, more efficient N use in cereal production, to minimize pollution risks and maximize farmer income, represents a current challenge for agronomic research in the northern growing areas. The main objective of this study was to determine the differences in nitrogen use efficiency (NUE) among spring cereals grown in Finland. Additional aims were to characterize the multiple roles of NUE by analysing the extent of variation in NUE and its component traits among different cultivars, and to understand how other physiological traits, especially radiation use efficiency (RUE) and light interception, affect and interact with the main components of NUE and contribute to differences among cultivars. This study included cultivars of barley (Hordeum vulgare L.), oat (Avena sativa L.) and wheat (Triticum aestivum L.). Field experiments were conducted between 2001 and 2004 at Jokioinen, in Finland. To determine differences in NUE among cultivars and gauge the achievements of plant breeding in NUE, 17-18 cultivars of each of the three cereal species released between 1909 and 2002 were studied.
Responses to nitrogen of landraces, old cultivars and modern cultivars of each cereal species were evaluated under two N regimes (0 and 90 kg N ha-1). Results of the study revealed that modern wheat, oat and barley cultivars had similar NUE values under Finnish growing conditions and only results from a wider range of cultivars indicated that wheat cultivars could have lower NUE than the other species. There was a clear relationship between nitrogen uptake efficiency (UPE) and NUE in all species whereas nitrogen utilization efficiency (UTE) had a strong positive relationship with NUE only for oat. UTE was clearly lower in wheat than in other species. Other traits related to N translocation indicated that wheat also had a lower harvest index, nitrogen harvest index and nitrogen remobilisation efficiency and therefore its N translocation efficiency was confirmed to be very low. On the basis of these results there appears to be potential and also a need for improvement in NUE. These results may help understand the underlying physiological differences in NUE and could help to identify alternative production options, such as the different roles that species can play in crop rotations designed to meet the demands of modern agricultural practices.
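The component traits discussed above (UPE and UTE) are conventionally related to NUE by a standard multiplicative decomposition; the abstract does not state the exact definitions used in the thesis, so the conventional form is given here:

```latex
\mathrm{NUE} \;=\; \frac{G_w}{N_s}
\;=\; \underbrace{\frac{N_t}{N_s}}_{\text{uptake efficiency (UPE)}}
\;\times\;
\underbrace{\frac{G_w}{N_t}}_{\text{utilization efficiency (UTE)}}
```

where $G_w$ is grain yield, $N_s$ the N supplied (soil plus fertilizer) and $N_t$ the total N taken up by the plant. The decomposition makes clear why a strong UPE-NUE relationship in all species can coexist with a UTE-NUE relationship in oat only.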

Relevance: 20.00%

Abstract:

The purpose of this study was to evaluate intensity, productivity and efficiency in agriculture in Finland and to show the implications for N and P fertiliser management. Environmental concerns relating to agricultural production have been, and still are, the focus of arguments about policies that affect agriculture. These policies constrain production while demand for agricultural products such as food, fibre and energy continuously increases. Increasing productivity therefore represents a great challenge to agriculture. Over the last decades producers have experienced several large changes in the production environment, such as the policy reform when Finland joined the EU in 1995. Further market changes occurred with the EU enlargement to neighbouring countries in 2005 and with the decoupling of supports over the 2006-2007 period. Decreasing prices, a declining number of farmers, and reduced profitability in agricultural production have resulted from these changes and constraints, as well as from technological development. It was known that accession to the EU in 1995 would herald changes in agriculture. Of special interest was how sudden changes in commodity prices, especially those of cereals, which decreased by 60%, would influence agricultural production. Knowledge of the properties of the production function consequently increased in importance. Research on economic instruments to regulate production was carried out and combined with earlier studies in paper V. In paper I the objective was to compare two different technologies, conventional farming and organic farming, and to determine differences in productivity and technical efficiency. In addition, input-specific or environmental efficiencies were analysed. The heterogeneity of agricultural soils and its implications were analysed in article II. In study III the determinants of technical inefficiency were analysed.
The aspects and possible effects of instability in policies due to a partial decoupling of production factors and products were studied in paper IV. Consequently, the connection between technical efficiency based on turnover and technical efficiency based on sales returns was analysed in this study. Simple economic instruments such as fertiliser taxes have a direct effect on fertiliser consumption and indirectly increase the value of organic fertilisers. However, fertiliser taxes do not address the N and P management problems adequately and are therefore not suitable for nutrient management improvements in general. Productivity of organic farms is lower on average than that of conventional farms, and the difference increases when looking at selling returns only. The organic sector needs more research and development on productivity. Livestock density in organic farming increases productivity; however, there is an upper limit to livestock densities on organic farms, and therefore nutrients on organic farms are also limited. Soil factors affect phosphorus and nitrogen efficiency. Soils such as sand and silt have lower input-specific overall efficiency for the nutrients N and P. Special attention is needed for management on these soils. Clay soils and soils with moderate clay content have higher efficiency. Soil heterogeneity is a cause of unavoidable inefficiency in agriculture.

Relevance: 20.00%

Abstract:

This thesis studies the informational efficiency of the European Union emission allowance (EUA) market. In an efficient market, the market price is unpredictable and above-average profits are impossible in the long run. The main research problem is whether the EUA price follows a random walk. The method is an econometric analysis of the price series, which includes an autocorrelation coefficient test and a variance ratio test. The results reveal that the price series is autocorrelated and therefore not a random walk. In order to find out the extent of predictability, the price series is modelled with an autoregressive model. The conclusion is that the EUA price is autocorrelated only to a small degree and that the predictability cannot be used to make extra profits. The EUA market is therefore considered informationally efficient, although the price series does not fulfil the requirements of a random walk. A market review supports the conclusion, but it is clear that the maturing of the market is still in progress.
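The two tests named above can be illustrated on simulated data. The sketch below applies a sample lag-1 autocorrelation estimate and a simple Lo-MacKinlay-style variance ratio to an artificial driftless random walk (not the EUA price series itself); for a true random walk the autocorrelation of returns should be near zero and the variance ratio near one, whereas the thesis finds small but significant autocorrelation in the EUA series.

```python
import random

def lag1_autocorrelation(series):
    """Sample lag-1 autocorrelation coefficient of a series."""
    n = len(series)
    mean = sum(series) / n
    num = sum((series[i] - mean) * (series[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in series)
    return num / den

def variance_ratio(prices, q):
    """Variance of q-period returns divided by q times the variance of
    1-period returns; close to 1 for a random walk."""
    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    r1 = [prices[i + 1] - prices[i] for i in range(len(prices) - 1)]
    rq = [prices[i + q] - prices[i] for i in range(len(prices) - q)]
    return sample_var(rq) / (q * sample_var(r1))

# Simulate a driftless random walk standing in for a log-price series.
random.seed(42)
prices = [0.0]
for _ in range(2000):
    prices.append(prices[-1] + random.gauss(0.0, 1.0))

returns = [prices[i + 1] - prices[i] for i in range(len(prices) - 1)]
ac = lag1_autocorrelation(returns)
vr = variance_ratio(prices, 5)
print(f"lag-1 autocorrelation: {ac:+.3f}  variance ratio (q=5): {vr:.3f}")
```

A predictable series would show up here as an autocorrelation well away from zero and a variance ratio drifting away from one as q grows.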

Relevance: 20.00%

Abstract:

There exist various suggestions for building a functional and fault-tolerant large-scale quantum computer. Topological quantum computation is a more exotic suggestion, which makes use of the properties of quasiparticles manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom which, in principle, can be used to execute quantum computation with intrinsic fault-tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems, which are described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and carry quantum numbers which are both of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique, but that the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the presented general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, none of the other, more complicated fusion subalgebras were considered.
Studying their applicability for quantum computation could be a topic of further research.
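The non-uniqueness of fusion described above can be made concrete in code. The sketch below uses the well-known Fibonacci anyon model (labels 1 and tau, with the single non-trivial rule tau x tau = 1 + tau) rather than the quantum double of S_3 derived in the thesis, since its fusion table fits in a few lines; it checks associativity of the fusion algebra and shows that repeatedly fusing tau anyons produces outcome multiplicities that grow as Fibonacci numbers.

```python
from collections import Counter

# Fusion table of the Fibonacci anyon model: two labels, the vacuum '1'
# and the non-Abelian anyon 'tau'; tau x tau = 1 + tau is the only
# non-trivial rule.
FUSION = {
    ('1', '1'): Counter({'1': 1}),
    ('1', 'tau'): Counter({'tau': 1}),
    ('tau', '1'): Counter({'tau': 1}),
    ('tau', 'tau'): Counter({'1': 1, 'tau': 1}),
}

def fuse(a, b):
    """Fuse two anyon labels into a multiset of possible outcomes."""
    return Counter(FUSION[(a, b)])

def fuse_sets(left, right):
    """Fuse every outcome in multiset `left` with every outcome in `right`."""
    out = Counter()
    for a, ma in left.items():
        for b, mb in right.items():
            for res, m in fuse(a, b).items():
                out[res] += ma * mb * m
    return out

labels = ['1', 'tau']

# Associativity: (a x b) x c must give the same outcome multiset as a x (b x c).
for a in labels:
    for b in labels:
        for c in labels:
            lhs = fuse_sets(fuse(a, b), Counter({c: 1}))
            rhs = fuse_sets(Counter({a: 1}), fuse(b, c))
            assert lhs == rhs, (a, b, c)

# Fusing five taus in sequence: the outcome multiplicities grow as
# Fibonacci numbers, reflecting the non-unique "addition" of quantum numbers.
state = Counter({'tau': 1})
for _ in range(4):
    state = fuse_sets(state, Counter({'tau': 1}))
print(dict(state))
```

The multiplicity of fusion outcomes is exactly the degeneracy that topological schemes exploit to encode qubits non-locally.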

Relevance: 20.00%

Abstract:

Our present-day understanding of the fundamental constituents of matter and their interactions is based on the Standard Model of particle physics, which relies on quantum gauge field theories. On the other hand, the large-scale dynamical behaviour of spacetime is understood via Einstein's general theory of relativity. The merging of these two complementary aspects of nature, quantum and gravity, is one of the greatest goals of modern fundamental physics. Its achievement would help us understand the short-distance structure of spacetime, thus shedding light on events in the singular states of general relativity, such as black holes and the Big Bang, where our current models of nature break down. The formulation of quantum field theories in noncommutative spacetime is an attempt to realize the idea of nonlocality at short distances, which our present understanding of these different aspects of nature suggests, and consequently to find testable hints of the underlying quantum behaviour of spacetime. The formulation of noncommutative theories encounters various unprecedented problems, which derive from their peculiar inherent nonlocality. Arguably the most serious of these is the so-called UV/IR mixing, which makes the derivation of observable predictions especially hard by causing new, tedious divergences to which our previous well-developed renormalization methods for quantum field theories do not apply. In the thesis I review the basic mathematical concepts of noncommutative spacetime, different formulations of quantum field theories in this context, and the theoretical understanding of UV/IR mixing. In particular, I put forward new results to be published, which show that the theory of quantum electrodynamics in noncommutative spacetime defined via the Seiberg-Witten map also suffers from UV/IR mixing. Finally, I review some of the most promising ways to overcome the problem. The final solution remains a challenge for the future.
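In the most commonly studied, canonical case (assumed here for orientation; the thesis may consider other noncommutative structures as well), the noncommutativity of the coordinates is implemented as

```latex
[\hat{x}^\mu, \hat{x}^\nu] = i\,\theta^{\mu\nu},
```

with $\theta^{\mu\nu}$ a constant antisymmetric matrix, and field theories on such a spacetime are built by replacing pointwise multiplication of fields with the Moyal star product

```latex
(f \star g)(x) = f(x)\,
\exp\!\Big(\tfrac{i}{2}\,\theta^{\mu\nu}\,
\overleftarrow{\partial}_\mu \overrightarrow{\partial}_\nu\Big)\, g(x).
```

The infinite tower of derivatives in the star product is the source of the inherent nonlocality, and hence of the UV/IR mixing, discussed above.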

Relevance: 20.00%

Abstract:

Efforts to combine quantum theory with general relativity have been great and marked by several successes. One field where progress has lately been made is the study of noncommutative quantum field theories, which arise as a low-energy limit in certain string theories. The idea of noncommutativity comes naturally when combining these two extremes and has profound implications for results widely accepted in traditional, commutative theories. In this work I review the status of one of the most important connections in physics, the spin-statistics relation. The relation is deeply ingrained in our reality in that it gives us the structure of the periodic table and is of crucial importance for the stability of all matter. The dramatic effects of the noncommutativity of space-time coordinates, mainly the loss of Lorentz invariance, call the spin-statistics relation into question. The spin-statistics theorem is first presented in its traditional setting, with a clarifying proof starting from minimal requirements. Next the notion of noncommutativity is introduced and its implications studied. The discussion is essentially based on twisted Poincaré symmetry, the space-time symmetry of noncommutative quantum field theory. The controversial issue of microcausality in noncommutative quantum field theory is settled by showing, for the first time, that the light wedge microcausality condition is compatible with the twisted Poincaré symmetry. The spin-statistics relation is considered both from the point of view of braided statistics and in the traditional Lagrangian formulation of Pauli, with the conclusion that Pauli's age-old theorem withstands even this test, so dramatic for the whole structure of space-time.
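For reference, the twisted Poincaré symmetry mentioned above is conventionally implemented with the Abelian Drinfeld twist (quoted here in its standard form; sign conventions vary between authors):

```latex
\mathcal{F} = \exp\!\Big(\tfrac{i}{2}\,\theta^{\mu\nu}\, P_\mu \otimes P_\nu\Big),
```

which leaves the Poincaré algebra itself untouched but deforms the coproduct, i.e. the way the symmetry acts on products of fields; this is what allows Poincaré-based notions such as spin to survive on noncommutative space-time.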

Relevance: 20.00%

Abstract:

This thesis studies the intermolecular interactions in (i) boron-nitrogen based systems for hydrogen splitting and storage, (ii) endohedral complexes, A@C60, and (iii) aurophilic dimers. We first present an introduction to intermolecular interactions. The theoretical background is then described. The research results are summarized in the following sections. In the boron-nitrogen systems, the electrostatic interaction is found to be the leading contribution, as 'Coulomb Pays for Heitler and London' (CHL). For the endohedral complex, the intermolecular interaction is formulated by a one-center expansion of the Coulomb operator 1/rab. For the aurophilic attraction between two C2v monomers, a London-type formula is derived by fully accounting for the anisotropy and point-group symmetry of the monomers.
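The London-type formula mentioned above generalizes the familiar isotropic London expression for the dispersion attraction between two monomers A and B, given here for orientation (the thesis derivation accounts for anisotropy and point-group symmetry, which this simple form omits):

```latex
E_{\mathrm{disp}} \;\approx\; -\,\frac{3}{2}\,
\frac{I_A I_B}{I_A + I_B}\,
\frac{\alpha_A\, \alpha_B}{R^{6}},
```

where $\alpha_A$, $\alpha_B$ are the monomer polarizabilities, $I_A$, $I_B$ their ionization energies, and $R$ the intermonomer distance; the characteristic $R^{-6}$ dependence is what marks aurophilic attraction as dispersion-like.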

Relevance: 20.00%

Abstract:

In the present work the methods of relativistic quantum chemistry have been applied to a number of small systems containing heavy elements, for which relativistic effects are important. First, a thorough introduction to the methods used is presented. This includes some of the general methods of computational chemistry and a special section dealing with how to include the effects of relativity in quantum chemical calculations. After this introduction the results obtained are presented. Investigations on high-valent mercury compounds are presented and new ways to synthesise such compounds are proposed. The methods described were applied to certain systems containing short Pt-Tl contacts, and it was possible to explain the interesting bonding situation in these compounds. One of the most common actinide compounds, uranium hexafluoride, was investigated and a new picture of the bonding was presented. Furthermore, the rarity of uranium-cyanide compounds was discussed. In a foray into the chemistry of gold, well known for its strong relativistic effects, investigations on different gold systems were performed. Analogies between Au$^+$ and platinum on one hand and oxygen on the other were found. New systems with multiple bonds to gold were proposed to experimentalists; one of the proposed systems was spectroscopically observed shortly afterwards. A very interesting molecule, theoretically predicted a few years ago, is WAu$_{12}$. Some of its properties were calculated and the bonding situation was discussed. In a further study on gold compounds it was possible to explain the substitution pattern in bis[phosphane-gold(I)] thiocyanate complexes. This is of some help to experimentalists, as the systems could not be crystallised and the structure was therefore unknown. Finally, computations on one of the heaviest elements in the periodic table were performed.
Calculations on compounds containing element 110, darmstadtium, showed that it behaves similarly to its lighter homologue platinum. The extreme importance of relativistic effects for these systems was also shown.
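A back-of-the-envelope estimate illustrates why relativistic effects dominate for such heavy elements (this is the standard textbook argument, not a calculation from the thesis): in atomic units the average speed of a 1s electron is roughly $Z$, while $c \approx 137.036$, so for gold ($Z = 79$)

```latex
\frac{m}{m_e} \;=\; \frac{1}{\sqrt{1 - (v/c)^2}}
\;\approx\; \frac{1}{\sqrt{1 - (79/137)^2}}
\;\approx\; 1.22,
```

a relativistic mass increase of over 20% that contracts the 1s orbital correspondingly and cascades into the valence shell; for element 110 the effect is stronger still.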

Relevance: 20.00%

Abstract:

Quantum effects are often of key importance for the function of biological systems at the molecular level. Cellular respiration, where energy is extracted from the reduction of molecular oxygen to water, is no exception. In this work, the end station of the electron transport chain in mitochondria, cytochrome c oxidase, is investigated using quantum chemical methodology. Cytochrome c oxidase contains two haems, haem a and haem a3. Haem a3, with its copper companion, CuB, is involved in the final reduction of oxygen into water. This binuclear centre receives the necessary electrons from haem a. Haem a, in turn, receives its electrons from a copper ion pair in the vicinity, called CuA. Density functional theory (DFT) has been used to clarify the charge and spin distributions of haem a, as well as changes in these during redox activity. Upon reduction, the added electron is shown to be evenly distributed over the entire haem structure, which is important for the accommodation of the prosthetic group within the protein. At the same time, the spin distribution of the open-shell oxidised state is more localised on the central iron. The exact spin density distribution has, however, been disputed in the literature, with different experiments indicating different distributions of the unpaired electron. The apparent contradiction is shown to be due to the false assumption of a unit amount of unpaired electron density; in fact, the oxidised state has about 1.3 unpaired electrons. The validity of the DFT results has been corroborated by wave function based coupled cluster calculations. Point charges, for use in classical force field based simulations, have been parameterised for the four metal centres, using a newly developed methodology. In the procedure, the subsystem for which point charges are to be obtained is surrounded by an outer region, with the purpose of stabilising the inner region, both electronically and structurally.
Finally, the possibility of vibrational promotion of the electron transfer step between haem a and a3 has been investigated. Calculating the full vibrational spectra, at DFT level, of a combined model of the two haems, revealed several normal modes that do shift electron density between the haems. The magnitude of the shift was found to be moderate, at most. The proposed mechanism could have an assisting role in the electron transfer, which still seems to be dominated by electron tunnelling.
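Tunnelling-dominated electron transfer of the kind discussed above is conventionally described by the nonadiabatic Marcus rate expression, quoted here for context (the abstract does not state which rate theory the thesis employs):

```latex
k_{\mathrm{ET}} \;=\; \frac{2\pi}{\hbar}\, |H_{ab}|^{2}\,
\frac{1}{\sqrt{4\pi \lambda k_B T}}\,
\exp\!\left( -\frac{(\Delta G^{\circ} + \lambda)^{2}}{4 \lambda k_B T} \right),
```

where $H_{ab}$ is the electronic coupling between donor and acceptor (here the two haems), $\lambda$ the reorganization energy and $\Delta G^{\circ}$ the driving force. In this picture, a vibrationally promoted mechanism would show up as a modulation of $H_{ab}$ along the electron-density-shifting normal modes identified in the calculations.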

Relevance: 20.00%

Abstract:

The removal of non-coding sequences, introns, is an essential part of messenger RNA processing. In most metazoan organisms, the U12-type spliceosome processes a subset of introns containing highly conserved recognition sequences. U12-type introns constitute less than 0.5% of all introns and reside preferentially in genes related to information processing functions, as opposed to genes encoding metabolic enzymes. It has previously been shown that the excision of U12-type introns is inefficient compared to that of U2-type introns, supporting the model that these introns could provide a rate-limiting control for gene expression. The low efficiency of U12-type splicing is believed to have important consequences for gene expression by limiting the production of mature mRNAs from genes containing U12-type introns. The inefficiency of U12-type splicing has been attributed to the low abundance of the components of the U12-type spliceosome in cells, but this hypothesis had not been proven. The aim of the first part of this work was to study the effect of the abundance of the spliceosomal snRNA components on splicing. Cells with a low abundance of the U12-type spliceosome were found to process U12-type introns encoded by a transfected construct inefficiently, but the expression levels of endogenous genes were not found to be affected by the abundance of the U12-type spliceosome. However, significant levels of endogenous unspliced U12-type intron-containing pre-mRNAs were detected in cells. Together these results support the idea that U12-type splicing may limit gene expression in some situations. The inefficiency of U12-type splicing has also promoted the idea that the U12-type spliceosome may control gene expression, limiting the mRNA levels of some U12-type intron-containing genes.
While the identities of the primary target genes that contain U12-type introns are relatively well known, little has previously been known about the downstream genes and pathways potentially affected by the efficiency of U12-type intron processing. Here, the effects of U12-type splicing efficiency on a whole organism were studied in a Drosophila line with a mutation in an essential U12-type spliceosome component. Genes containing U12-type introns showed variable gene-specific responses to the splicing defect, which points to variation in the susceptibility of different genes to changes in splicing efficiency. Surprisingly, microarray screening revealed that metabolic genes were enriched among the downstream effects, and that the phenotype could largely be attributed to one U12-type intron-containing mitochondrial gene. Gene expression control by the U12-type spliceosome could thus have widespread effects on metabolic functions in the organism. The subcellular localization of the U12-type spliceosome components was studied in response to a recent dispute on the localization of the U12-type spliceosome. All components studied were found to be nuclear, indicating that the processing of U12-type introns occurs within the nucleus, thus clarifying a question central to the field. The results suggest that the U12-type spliceosome can limit the expression of genes that contain U12-type introns in a gene-specific manner. Through its limiting role in pre-mRNA processing, U12-type splicing activity can affect specific genetic pathways, which in the case of Drosophila are involved in metabolic functions.
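As a minimal illustration of how splicing efficiency of the kind compared in this work can be quantified, the sketch below computes the spliced fraction of a given intron from junction-spanning versus intron-retaining read counts; the function name and the counts are hypothetical and purely illustrative, not taken from the thesis.

```python
def splicing_efficiency(spliced_reads, unspliced_reads):
    """Fraction of transcripts from which the intron has been removed,
    estimated from reads spanning the spliced exon-exon junction versus
    reads retaining the intron."""
    total = spliced_reads + unspliced_reads
    if total == 0:
        raise ValueError("no reads cover this junction")
    return spliced_reads / total

# Hypothetical counts illustrating the reported trend: U12-type introns
# are excised less efficiently than U2-type introns.
u2_eff = splicing_efficiency(spliced_reads=950, unspliced_reads=50)
u12_eff = splicing_efficiency(spliced_reads=700, unspliced_reads=300)
print(f"U2-type: {u2_eff:.2f}  U12-type: {u12_eff:.2f}")
```

In practice such per-intron ratios, computed across conditions, are what reveal the gene-specific responses to a splicing defect described above.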