962 results for IT Usage
Abstract:
An extensive electricity transmission network facilitates electricity trading between Finland, Sweden, Norway and Denmark. Currently, most of the area's power generation is traded at NordPool, where trading volumes have increased steadily since the exchange was founded in the early 1990s. The Nordic electricity market is expected to follow the current trend and integrate further with the other European electricity markets. Hydro power supplies roughly half of the Nordic electricity market, and most of that hydro is generated in Norway. This dominant role of hydro power distinguishes the Nordic electricity market from most other marketplaces. Hydro power production varies mainly with hydro reservoir levels and electricity demand. Hydro reservoirs are in turn affected by water inflows, which differ from year to year. Reservoir levels explain much of the behaviour of the Nordic electricity markets. Accordingly, Kauppi and Liski (2008), among others, have developed a model that analyses market behaviour using hydro reservoirs as explanatory factors. Their model produces, for example, the welfare loss due to socially suboptimal hydro reservoir usage, the socially optimal electricity price, hydro reservoir storage and thermal reservoir storage; these are referred to as outcomes. However, the model does not describe the real market condition but rather an ideal situation: the market is controlled by a single agent, i.e. one agent controls all power generation reserves, which is referred to as the socially optimal strategy. The article by Kauppi and Liski (2008) also includes an assumption where an individual agent holds a certain fraction of market power, e.g. 20% or 30%; to maintain the focus of this thesis, that part of their paper is omitted. The goal of this thesis is two-fold. First, we extend the results of the socially optimal strategy to the years 2006-08, as the earlier study ends in 2005.
The second objective is to improve on the methods of the previous study. This thesis produces several outcomes (SPOT price, welfare loss, etc.) arising from socially optimal actions. Welfare loss is interesting because it describes the inefficiency of the market. The SPOT price is an important output for market participants, as it often affects end users' electricity bills. A further task is to modify and try to improve the model by using more accurate input data, e.g. by considering the effect of emissions trading rights on the input data. After the modifications, new welfare losses are calculated and compared with the corresponding results before the modifications. The hydro reservoir has the highest explanatory significance in the model, followed by thermal power. In the Nordic markets, thermal power reserves consist mostly of nuclear power and other thermal sources (coal, natural gas, oil, peat). It can be argued that hydro and thermal reservoirs determine electricity supply. Roughly speaking, the model takes into account electricity demand and supply, together with several related parameters (water inflow, oil price, etc.), and finally yields the socially optimal outcomes. The author of this thesis is not aware of any similar model having been tested before. Some other studies come close to the Kauppi and Liski (2008) model, but they have a somewhat different focus. For example, a specific feature of the model is its focus on long-run capacity usage, which differs from previous studies of short-run market power. The closest study to the model concerns California's wholesale electricity markets but uses a different methodology. The work is structured as follows.
Abstract:
A novel approach is introduced that can more effectively use the structural information provided by traditional imaging modalities in multimodal diffuse optical tomographic imaging. The approach is based on a prior-image-constrained l(1)-minimization scheme and is motivated by recent progress in sparse image reconstruction techniques. It is shown that the proposed framework is more effective at localizing the tumor region and recovering the optical property values, in both numerical and gelatin phantom cases, than traditional methods that use structural information. (C) 2012 Optical Society of America
Abstract:
Energy and sustainability have become among the most critical issues of our generation. While the abundant potential of renewable energy sources such as solar and wind provides a real opportunity for sustainability, their intermittency and uncertainty present a daunting operating challenge. This thesis aims to develop analytical models, deployable algorithms, and real systems to enable efficient integration of renewable energy into complex distributed systems with limited information.
The first thrust of the thesis is to make IT systems more sustainable by facilitating the integration of renewable energy into these systems. IT is among the fastest-growing sectors in energy usage and greenhouse gas pollution. Over the last decade there have been dramatic improvements in the energy efficiency of IT systems, but these efficiency improvements do not necessarily lead to a reduction in energy consumption, because more servers are demanded. Further, little effort has been put into making IT more sustainable, and most of the improvements have come from improved "engineering" rather than improved "algorithms". In contrast, my work focuses on developing algorithms, with rigorous theoretical analysis, that improve the sustainability of IT. In particular, this thesis seeks to exploit the flexibilities of cloud workloads both (i) in time, by scheduling delay-tolerant workloads, and (ii) in space, by routing requests to geographically diverse data centers. These opportunities allow data centers to respond adaptively to renewable availability, varying cooling efficiency, and fluctuating energy prices, while still meeting performance requirements. The design of the enabling algorithms is, however, very challenging because of limited information, non-smooth objective functions and the need for distributed control. Novel distributed algorithms are developed with theoretically provable guarantees to enable "follow the renewables" routing. Moving from theory to practice, I helped HP design and implement the industry's first Net-zero Energy Data Center.
The second thrust of this thesis is to use IT systems to improve the sustainability and efficiency of our energy infrastructure through data center demand response. The main challenges as we integrate more renewable sources to the existing power grid come from the fluctuation and unpredictability of renewable generation. Although energy storage and reserves can potentially solve the issues, they are very costly. One promising alternative is to make the cloud data centers demand responsive. The potential of such an approach is huge.
To realize this potential, we need adaptive and distributed control of cloud data centers and new electricity market designs for distributed electricity resources. My work is progressing in both directions. In particular, I have designed online algorithms with theoretically guaranteed performance for data center operators to deal with uncertainties under popular demand response programs. Based on local control rules of customers, I have further designed new pricing schemes for demand response to align the interests of customers, utility companies, and the society to improve social welfare.
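The "follow the renewables" routing idea can be illustrated with a toy sketch. This is my own simplification, not the thesis's actual algorithm: each request is greedily sent to the data center with the most unused renewable supply, subject to capacity. The site names and numbers below are invented.

```python
# Toy "follow the renewables" routing sketch (illustrative only):
# requests go to the feasible site with the most renewable headroom.

def route_requests(requests, sites):
    """Assign each request (1 unit of load) to the site with the most
    remaining renewable headroom that still has spare capacity."""
    load = {name: 0 for name in sites}
    assignment = []
    for _ in range(requests):
        candidates = [n for n, s in sites.items() if load[n] < s["capacity"]]
        if not candidates:
            raise RuntimeError("all data centers at capacity")
        # prefer the site whose renewable supply exceeds its load the most
        best = max(candidates, key=lambda n: sites[n]["renewable"] - load[n])
        load[best] += 1
        assignment.append(best)
    return load, assignment

sites = {
    "dc_west": {"capacity": 5, "renewable": 4},   # e.g. sunny afternoon
    "dc_east": {"capacity": 5, "renewable": 1},
}
load, _ = route_requests(6, sites)
```

A real system would of course face the complications the abstract names: uncertain future renewable availability, non-smooth costs, and the need for distributed rather than centralized control.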
Abstract:
In recent years, the performance of semi-supervised learning has been theoretically investigated. However, most of this theoretical development has focussed on binary classification problems. In this paper, we take it a step further by extending the work of Castelli and Cover [1], [2] to the multi-class paradigm. In particular, we consider the key problem in semi-supervised learning of classifying an unseen instance x into one of K different classes, using a training dataset sampled from a mixture density distribution and composed of l labelled records and u unlabelled examples. Even under the assumption of identifiability of the mixture and with infinite unlabelled examples, labelled records are needed to determine the K decision regions. Therefore, in this paper, we first investigate the minimum number of labelled examples needed to accomplish that task. Then, we propose an optimal multi-class learning algorithm which is a generalisation of the optimal procedure proposed in the literature for binary problems. Finally, we make use of this generalisation to study the probability of error when the binary class constraint is relaxed.
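The role labelled examples play once the mixture is identified can be sketched in a toy example (my own illustration of the general idea, not the paper's algorithm): the unlabelled data is assumed to have already fixed the K decision regions, and labelled examples are then needed only to attach a class name to each region, so at least one per class is required.

```python
# Toy labelling step: K = 3 decision regions on the real line, assumed
# already recovered from unlabelled data; labelled points name them.

import bisect

boundaries = [0.0, 1.0]          # 3 regions: (-inf,0], (0,1], (1,inf)

def region(x):
    """Index (0, 1 or 2) of the decision region containing x."""
    return bisect.bisect_left(boundaries, x)

def label_regions(labelled):
    """Map each region to the majority label of labelled points in it."""
    votes = {}
    for x, y in labelled:
        votes.setdefault(region(x), []).append(y)
    return {r: max(set(ys), key=ys.count) for r, ys in votes.items()}

def classify(x, region_labels):
    return region_labels[region(x)]

labelled = [(-0.5, "A"), (0.3, "B"), (2.0, "C")]   # one per region
region_labels = label_regions(labelled)
```

With fewer labelled examples than classes, some region would stay unnamed, which is the intuition behind asking for the minimum number of labelled examples.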
Abstract:
Experiments and observations on the phytoplankton of certain lakes in the English Lake District were made from early 1973 to the end of March 1974. They included laboratory and lake bioassays and observations on the quantity and quality of the phytoplankton in six lakes. The introductory sections of the report cover algae, the ecology of phytoplankton and the scope of the contracted work. Laboratory bioassays on water from one lake, Blelham Tarn, showed that phosphorus, silicon (for diatoms) and organic substances forming complexes with iron were the major substances limiting the growth of the algae tested. The growth of the test algae was limited by these substances to different degrees, and to a greater or lesser degree at different times of year. It is suggested that a relatively simple form of bioassay could give valuable information to water undertakings. Lake bioassays and other experiments were carried out using large in situ tubular plastic enclosures; two such investigations are described. The effects of a change in sewerage in two drainage basins on the phytoplankton of three lakes are described, and some data are given about changes since 1945 in three other lakes in the same overall drainage basin. These latter lakes have also been affected by changes in sewerage and by increasing inputs of domestic and agricultural wastes. Throughout, the relevance of the work to practical problems of water usage is kept in mind and discussed. In the last section, special reference is made to the largely unpredictable results of water transfers. The report ends with a note on river phytoplankton.
Abstract:
188 p.
Abstract:
Background: Contrary to what is generally thought, schizophrenia is a very common mental health issue. For this reason, several animal models are used to assess the illness in order to develop a definitive treatment. The most widespread paradigm is the use of pharmacological models. Aim: The aim of this review is to show which insults are most used for the assessment of social-behaviour-related negative symptoms in animal models, and to ascertain which regime is the most adequate. Design: Literature review. Methods: The PubMed database was searched using the indexed descriptors "schizophrenia", "animal models", "social behaviour" and "negative symptoms". With the exception of a single article included for its value, this review is based on articles from the last 10 years. In addition, only clinical trials and reviews written in English or Spanish and having laboratory rodents as the target population were accepted. Results: The studies assessed agree that pharmacological models (especially those based on NMDA receptor antagonists) are a valuable means for the experimental investigation of negative symptoms in schizophrenia, with the caveat that only some negative symptoms (mainly anhedonia and deficits in social interaction) can be experimentally assessed. Conclusions: There is not enough evidence regarding the four aspects of this review. PCP, ketamine or MK-801 in sub-acute dosage regimes are currently the most indicated insults to mimic schizophrenic symptoms in rodents, although other substances are valuable as well and further research is needed. (In English exclusively)
Abstract:
Capacitance-voltage (C-V) characteristics of lead zirconate titanate (PZT) thin films with a thickness of 130 nm were measured between 300 and 533 K. The transition between the ferroelectric and paraelectric phases was revealed to be of second order in our case, with a Curie temperature at around 450 K. A linear relationship was found between the measured capacitance and the inverse square root of the applied voltage. It was shown that such a relationship could be fitted well by a universal expression of C/A = k(V + V0)^(-1/2) and that this expression could be derived by expanding the Landau-Devonshire free energy at an effective equilibrium position of the Ti/Zr ion in a PZT unit cell. By using the equations derived in this work, the free energy parameters for an individual material can be obtained solely from the corresponding C-V data, and the temperature dependences of both remnant polarization and coercive voltage are shown to be in quantitative agreement with the experimental data.
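The fitting step implied by the abstract's expression C/A = k(V + V0)^(-1/2) can be sketched as follows (my own illustration, on synthetic data with arbitrary hypothetical values of A, k and V0): since (A/C)^2 = V/k^2 + V0/k^2, the quantity (A/C)^2 is linear in V, so an ordinary least-squares line gives k from the slope and V0 from the intercept.

```python
# Sketch: recover k and V0 from synthetic C-V data via the
# linearization (A/C)^2 = V/k^2 + V0/k^2 and a least-squares line.

A, k_true, V0_true = 1.0, 2.0, 0.5          # hypothetical values

V = [0.1 * i for i in range(50)]
C = [A * k_true * (v + V0_true) ** -0.5 for v in V]   # noiseless data

y = [(A / c) ** 2 for c in C]                         # linear in V
n = len(V)
mV, my = sum(V) / n, sum(y) / n
slope = (sum((v - mV) * (yi - my) for v, yi in zip(V, y))
         / sum((v - mV) ** 2 for v in V))
intercept = my - slope * mV

k_fit = slope ** -0.5          # slope = 1/k^2
V0_fit = intercept / slope     # intercept = V0/k^2
```

On real C-V data one would fit the same line over the voltage range where the (V + V0)^(-1/2) behaviour holds.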
Abstract:
This paper presents the results of a project aimed at minimising fuel usage while maximising steam availability in the power and steam plant of a large newsprint mill. The approach taken was to utilise the better regulation and plant-wide optimisation capabilities of Advanced Process Control, especially Model Predictive Control (MPC) techniques. These have recently made their appearance in the pulp and paper industry but are better known in the oil and petrochemical industry, where they have been used for nearly 30 years. The issue in the power and steam plant is to ensure that sufficient steam is available when the paper machines require it, and yet not to have to waste too much steam when one or more of the machines suffers an outage. This is a problem for which MPC is well suited: it allows variables to be kept within declared constraint ranges, a feature which has been used, in effect, to increase the steam storage capacity of the existing plant. This has resulted in less steam being condensed when it is not required and in significant reductions in the need for supplementary firing. The incidence of steam being dump-condensed while also supplementary firing the Combined Heat & Power (CHP) plant has been reduced by 95%, and the overall use of supplementary firing is less than 30% of what it was. In addition, the plant runs more smoothly and requires less operator time. The yearly benefit provided by the control system is greater than £200,000, measured in terms of 2005 gas prices.
Abstract:
Urquhart, C. (editor for JUSTEIS team), Spink, S., Thomas, R., Yeoman, A., Durbin, J., Turner, J., Armstrong, A., Lonsdale, R. & Fenton, R. (2003). JUSTEIS (JISC Usage Surveys: Trends in Electronic Information Services) Strand A: survey of end users of all electronic information services (HE and FE), with Action research report. Final report 2002/2003 Cycle Four. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth with Information Automation Ltd (CIQM). Sponsorship: JISC
Abstract:
Ribosome profiling (ribo-seq) is a recently developed technique that provides genome-wide information on protein synthesis (GWIPS) in vivo. The high resolution of ribo-seq is one of the exciting properties of this technique. In Chapter 2, I present a computational method that utilises the sub-codon precision and triplet periodicity of ribosome profiling data to detect transitions in the translated reading frame. Application of this method to ribosome profiling data generated for human HeLa cells allowed us to detect several human genes where the same genomic segment is translated in more than one reading frame. Since the initial publication of the ribosome profiling technique in 2009, there has been a proliferation of studies that have used the technique to explore various questions with respect to translation. A review of the many uses and adaptations of the technique is provided in Chapter 1. Indeed, owing to the increasing popularity of the technique and the growing number of published ribosome profiling datasets, we have developed GWIPS-viz (http://gwips.ucc.ie), a ribo-seq-dedicated genome browser. Details on the development of the browser and its usage are provided in Chapter 3. One of the surprising findings of ribosome profiling of initiating ribosomes, carried out in three independent studies, was the widespread use of non-AUG codons as translation initiation start sites in mammals. Although initiation at non-AUG codons in mammals has been documented for some time, the extent of non-AUG initiation reported by these ribo-seq studies was unexpected. In Chapter 4, I present an approach for estimating the strength of initiating codons based on the leaky scanning model of translation initiation. Application of this approach to ribo-seq data illustrates that initiation at non-AUG codons is inefficient compared to initiation at AUG codons.
In addition, our approach provides a probability of initiation score for each start site that allows its strength of initiation to be evaluated.
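The triplet-periodicity idea behind the frame-transition detection can be sketched in a toy form (my own illustration, not the thesis's actual method): footprint 5' end positions are binned by position mod 3, and a change in the dominant phase upstream versus downstream of a candidate point indicates a frameshift. The read positions below are invented toy data.

```python
# Toy frame-transition detector using triplet periodicity of
# ribosome footprint 5' end positions (illustrative only).

from collections import Counter

def dominant_frame(positions):
    """Most common (position mod 3) phase among footprint 5' ends."""
    counts = Counter(p % 3 for p in positions)
    return counts.most_common(1)[0][0]

def frame_transition(positions, split):
    """Compare the dominant phase upstream vs downstream of `split`."""
    before = [p for p in positions if p < split]
    after = [p for p in positions if p >= split]
    f1, f2 = dominant_frame(before), dominant_frame(after)
    return f1, f2, f1 != f2

# Reads mostly in frame 0 before position 60, frame 1 after (a +1 shift)
reads = [0, 3, 6, 9, 12, 15, 30, 33, 45] + [61, 64, 67, 70, 73, 76]
```

A real method would of course scan over candidate split points and assess statistical significance rather than take the split as given.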
Abstract:
This is a collaborative paper between Juergen Schneider-Schaulies's group and ours. It shows the differential use of the two measles receptors in various strains.
Abstract:
This study evaluates the reliability of self-assessment as a measure of computer competence. The evaluation is carried out in response to recent research which has employed self-reported ratings as the sole indicator of students' computer competence. To evaluate the reliability of self-assessed computer competence, the scores achieved by students in self-assessed computer competence tests are compared with scores achieved in objective tests. The results reveal a statistically significant over-estimation of computer competence among the students surveyed. Furthermore, reported pre-university computer experience, in terms of home and school use and formal IT education, does not affect this result. The findings call into question the validity of using self-assessment as a measure of computer competence. More generally, the study also provides an up-to-date picture of self-reported computer usage and IT experience among pre-university students from New Zealand and South-east Asia and contrasts these findings with those from previous research.
Abstract:
Telematic tools are very important in our lives in the present era, and this is made even more evident if we analyse young people's behaviour. However, it seems that the possibilities these tools offer from a professional point of view, beyond the purely recreational, are still not fully exploited, either by the young people themselves or by the educational institutions where they learn. Our work studies the uses of social media among university students. To this end we designed a study based on quantitative methodology using a survey, administering a questionnaire to students at the University of Murcia. The questionnaire was answered by 487 students in the first half of 2014. The survey results confirm our hypothesis that social networks are among the basic, habitual communication tools of the young people of our university and are used mainly for leisure purposes, and that the tools used for more academic activities are those allowing greater control over privacy.
Abstract:
Strasheela provides a means for the composer to create a symbolic score by formally describing it in a rule-based way. The environment defines a rich music representation for complex polyphonic scores. Strasheela enables the user to define expressive compositional rules and then to apply them to the score. Compositional rules can restrict many aspects of the music - including the rhythmic structure, the melodic structure and the harmonic structure - by constraining the parameters (e.g. duration or pitch) of musical events according to some numerical or logical relation. Strasheela combines this expressivity with efficient search strategies.
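The rule-based idea Strasheela embodies can be sketched in a toy form. Strasheela itself is built on the Oz constraint platform; everything below is my own Python illustration, not its API: compositional rules are predicates over a partial score, and a backtracking search assigns pitches that satisfy all of them.

```python
# Toy rule-based melody search (illustrative only): depth-first
# backtracking over pitches, pruned by compositional rules.

PITCHES = range(60, 72)                     # one MIDI octave, C4-B4

def step_rule(melody):
    """Melodic rule: consecutive steps of at most 2 semitones."""
    return all(abs(a - b) <= 2 for a, b in zip(melody, melody[1:]))

def start_end_rule(melody, length):
    """Begin on C4; once complete, also end on C4."""
    done = len(melody) == length
    return melody[0] == 60 and (not done or melody[-1] == 60)

def compose(length, melody=()):
    """Depth-first search for a melody satisfying all rules."""
    if len(melody) == length:
        return list(melody)
    for p in PITCHES:
        cand = melody + (p,)
        if step_rule(cand) and start_end_rule(cand, length):
            result = compose(length, cand)
            if result:
                return result
    return None

melody = compose(5)
```

Systems like Strasheela replace this naive enumeration with constraint propagation and configurable search strategies, which is what makes rich polyphonic scores tractable.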