852 results for Generation analysis
Abstract:
"September 1988."
Abstract:
"IEPA/WPC/84-003."
Abstract:
Cover title.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
A new methodology is proposed for the analysis of generation capacity investment in a deregulated market environment, in which the investment appraisal is carried out within a probabilistic framework. The probabilistic production costing (PPC) algorithm is used to compute the expected energy generated, taking into account system load variations and plant forced outage rates, while a Monte Carlo approach is applied to model the electricity price variability seen in a realistic network. The model is able to capture the price, and hence profitability, uncertainties facing generator companies. Seasonal variations in electricity prices and system demand are modeled independently. The method is validated on the IEEE RTS system, augmented with realistic market and plant data, by using it to compare the financial viability of several generator investments employing either conventional or directly connected generator (Powerformer) technologies. The significance of the results is assessed using several financial risk measures.
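As a rough illustration of the appraisal described above, the sketch below draws Monte Carlo price scenarios around a seasonal mean, combines them with a PPC-style expected energy figure, and reads a risk measure (a value-at-risk percentile) off the resulting profit distribution. Every number (prices, capacity, costs, outage-derived energy) is an invented placeholder, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical inputs -- none of these figures come from the study.
n_trials = 2000
hours = 8760
base_price = 40.0          # $/MWh, annual mean price
seasonal_amp = 8.0         # $/MWh, seasonal swing
capacity_mw = 100.0
expected_energy_mwh = 0.85 * capacity_mw * hours  # PPC-style expected energy
annual_cost = 25e6         # fixed + variable cost, $

# Monte Carlo over hourly prices with an independent seasonal component.
t = np.arange(hours)
season = seasonal_amp * np.sin(2 * np.pi * t / hours)
profits = np.empty(n_trials)
for i in range(n_trials):
    noise = rng.normal(0.0, 5.0, hours)   # hourly price volatility, $/MWh
    price = base_price + season + noise
    revenue = price.mean() * expected_energy_mwh
    profits[i] = revenue - annual_cost

expected_profit = profits.mean()
var_95 = np.percentile(profits, 5)        # 5th percentile: a VaR-style measure
print(f"expected profit: {expected_profit/1e6:.2f} M$")
print(f"5th-percentile profit (VaR proxy): {var_95/1e6:.2f} M$")
```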
Abstract:
Of the ~1.7 million SINE elements in the human genome, only a tiny number are estimated to be actively transcribed by RNA polymerase (Pol) III. Tracing the individual loci from which SINE transcripts originate is complicated by their highly repetitive nature. By exploiting RNA-seq datasets and unique SINE DNA sequences, we devised a bioinformatic pipeline allowing us to identify Pol III-dependent transcripts of individual SINE elements. When applied to ENCODE transcriptomes of seven human cell lines, this search strategy identified ~1300 Alu loci and ~1100 MIR loci corresponding to detectable transcripts, with ~120 Alu and ~60 MIR loci, respectively, expressed in at least three cell lines. In vitro transcription of selected SINEs did not reflect their in vivo expression properties and required the native 5'-flanking region in addition to the internal promoter. We also identified a cluster of expressed AluYa5-derived transcription units, juxtaposed to snaR genes on chromosome 19, formed by a promoter-containing left monomer fused to an Alu-unrelated downstream moiety. Autonomous Pol III transcription was also revealed for SINEs nested within Pol II-transcribed genes, raising the possibility of an underlying mechanism for Pol II gene regulation by SINE transcriptional units. Moreover, applying our bioinformatic pipeline both to RNA-seq data from cells subjected to an in vitro pro-oncogenic stimulus and to in vivo matched tumor and non-tumor samples allowed us to detect increased Alu RNA expression as well as the source loci of this deregulation. The ability to investigate SINE transcriptomes at single-locus resolution will facilitate both the identification of novel biologically relevant SINE RNAs and the assessment of SINE expression alteration under pathological conditions.
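The locus-level expression call above (a SINE locus counted as recurrently expressed when detected in at least three cell lines) can be sketched as follows. The locus names, read counts and detection threshold are invented for illustration; real input would be counts of uniquely mapped RNA-seq reads over annotated Alu/MIR coordinates.

```python
# Per-locus read counts across cell lines (toy data, not from the study).
counts = {
    "AluYa5_chr19_1": {"K562": 12, "HeLa": 9, "GM12878": 4, "HepG2": 0},
    "MIR_chr2_77":    {"K562": 1,  "HeLa": 0, "GM12878": 0, "HepG2": 2},
}

MIN_READS = 3          # detection threshold per cell line (assumed)
MIN_CELL_LINES = 3     # "expressed in at least three cell lines"

def recurrently_expressed(locus_counts):
    """True if the locus is detected in at least MIN_CELL_LINES cell lines."""
    detected = sum(1 for c in locus_counts.values() if c >= MIN_READS)
    return detected >= MIN_CELL_LINES

for locus, per_line in counts.items():
    print(locus, recurrently_expressed(per_line))
```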
Abstract:
Biomass-to-Liquid (BTL) is one of the most promising low-carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass, the so-called “second generation biofuels” that, unlike first generation biofuels, can make use of a far wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialized. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products, as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and were characterised by different fuel synthesis processes, including Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies had been compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model affects its output (i.e. the production cost). This was the first time that an uncertainty analysis had been included in a published techno-economic assessment of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification, owing to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33%, or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large-scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
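The Monte Carlo uncertainty analysis described above can be sketched as follows: uncertain cost-model inputs are sampled and propagated through a simple production-cost formula to yield a cost distribution. The input distributions, plant scale and capital recovery factor below are assumed placeholders, not values from the authors' Excel model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Hypothetical input distributions (the real cost model is not reproduced here).
capital = rng.triangular(250e6, 300e6, 400e6, n)   # plant capital cost, GBP
biomass_price = rng.normal(60.0, 8.0, n)           # GBP per dry tonne
efficiency = rng.uniform(0.40, 0.50, n)            # biomass-to-fuel (LHV basis)

biomass_tpy = 500_000                 # dry tonnes per year (assumed)
fuel_gj_per_t = 18.0 * efficiency     # GJ of fuel per tonne of biomass
annual_fuel_gj = biomass_tpy * fuel_gj_per_t
crf = 0.13                            # capital recovery factor (assumed)

# Propagate sampled inputs to the production-cost distribution.
cost_per_gj = (crf * capital + biomass_price * biomass_tpy) / annual_fuel_gj
print(f"median production cost: {np.median(cost_per_gj):.1f} GBP/GJ")
print(f"90% interval: {np.percentile(cost_per_gj, 5):.1f}"
      f" - {np.percentile(cost_per_gj, 95):.1f} GBP/GJ")
```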
Abstract:
INTRODUCTION: Bipolar disorder requires long-term treatment, but non-adherence is a common problem. Antipsychotic long-acting injections (LAIs) have been suggested to improve adherence, but none are licensed in the UK for bipolar disorder. However, the use of second-generation antipsychotic (SGA) LAIs in bipolar disorder is not uncommon, although systematic reviews in this area are lacking. This study aims to systematically review the safety and efficacy of SGA LAIs in the maintenance treatment of bipolar disorder. METHODS AND ANALYSIS: The protocol is based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and will include only randomised controlled trials comparing SGA LAIs in bipolar disorder. PubMed, EMBASE, CINAHL, Cochrane Library (CENTRAL), PsycINFO, LiLACS and http://www.clinicaltrials.gov will be searched, with no language restriction, from 2000 to January 2016, as the first SGA LAIs came to market after 2000. Manufacturers of SGA LAIs will also be contacted. The primary efficacy outcome is relapse rate, delayed time to relapse or reduction in hospitalisation, and the primary safety outcomes are drop-out rates, all-cause discontinuation and discontinuation due to adverse events. Qualitative reporting of evidence will be based on the 21 items listed in the standards for reporting qualitative research (SRQR), focusing on study quality (assessed using the Jadad score, allocation concealment and data analysis), risk of bias and effect size. Publication bias will be assessed using funnel plots. If sufficient data are available, meta-analysis will be performed with the primary effect size as relative risk presented with its 95% CI. Sensitivity analyses, conditional on the number of studies and sample size, will be carried out on manic versus depressive symptoms and monotherapy versus adjunctive therapy.
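The planned effect size is a relative risk with a 95% CI. A minimal sketch of that computation, using the standard log-scale (Katz) interval with invented illustrative counts, is:

```python
import math

def relative_risk_ci(events_trt, n_trt, events_ctl, n_ctl, z=1.96):
    """Relative risk with a 95% CI computed on the log scale (Katz method)."""
    rr = (events_trt / n_trt) / (events_ctl / n_ctl)
    se = math.sqrt(1/events_trt - 1/n_trt + 1/events_ctl - 1/n_ctl)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Invented counts: 20/100 relapses on the LAI arm vs 35/100 on the comparator.
rr, lo, hi = relative_risk_ci(20, 100, 35, 100)
print(f"RR = {rr:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```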
Abstract:
The best results in applying computer systems to automatic translation are obtained in word processing when texts pertain to specific thematic areas, with well-defined structures and a concise, limited lexicon. In this article we present a systematic work plan for the analysis and generation of language applied to the field of pharmaceutical leaflets, a type of document characterized by rigid formatting and precision in the use of the lexicon. We propose a solution based on the use of an interlingua as a pivot language between the source and target languages; in this application we consider Spanish and Arabic.
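The interlingua pivot can be sketched at its simplest as a two-stage dictionary lookup: the source sentence is analysed into language-neutral concepts, and the target sentence is generated from those concepts. The lexicon entries below are invented placeholders, not the authors' linguistic resources, and a real system would of course handle morphology and syntax, not word-for-word mapping.

```python
# Toy Spanish -> interlingua and interlingua -> Arabic lexicons (invented).
ES_TO_INTER = {"tomar": "TAKE", "un": "ONE", "comprimido": "TABLET",
               "al": "PER", "día": "DAY"}
INTER_TO_AR = {"TAKE": "تناول", "ONE": "واحد", "TABLET": "قرص",
               "PER": "كل", "DAY": "يوم"}

def analyse(source_tokens):
    """Map source-language tokens to language-neutral interlingua concepts."""
    return [ES_TO_INTER[t] for t in source_tokens]

def generate(concepts, lexicon):
    """Generate target-language tokens from interlingua concepts."""
    return [lexicon[c] for c in concepts]

concepts = analyse(["tomar", "un", "comprimido", "al", "día"])
print(concepts)                               # the language-neutral pivot
print(" ".join(generate(concepts, INTER_TO_AR)))
```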
Abstract:
One of the extraordinary aspects of nonlinear wave evolution is the spontaneous occurrence of astonishing, statistically extraordinary large-amplitude waves, known as rogue waves. We show that the eigenvalues of the equation associated with the nonlinear Schrödinger equation are almost constant in the vicinity of a rogue wave, and we confirm that optical rogue waves are formed by collisions between quasi-solitons in anomalous dispersion fiber exhibiting weak third-order dispersion.
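In the standard formulation (assumed here, since the abstract does not spell it out), the "associated equation" is the Zakharov-Shabat scattering problem of the focusing nonlinear Schrödinger equation, whose eigenvalues $\zeta$ would be the quantities reported as nearly constant near the rogue wave:

```latex
% Focusing nonlinear Schr\"odinger equation (standard normalisation)
i\,\frac{\partial q}{\partial z}
  + \frac{1}{2}\frac{\partial^2 q}{\partial t^2}
  + |q|^2 q = 0
% Associated Zakharov--Shabat eigenvalue problem: the discrete
% eigenvalues \zeta characterise the soliton content of the field q
\frac{\partial}{\partial t}
\begin{pmatrix} \psi_1 \\ \psi_2 \end{pmatrix}
=
\begin{pmatrix} -i\zeta & q \\ -q^{*} & i\zeta \end{pmatrix}
\begin{pmatrix} \psi_1 \\ \psi_2 \end{pmatrix}
```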
Abstract:
The innovation of optical frequency combs (OFCs) generated in passive mode-locked lasers has provided astronomy with unprecedented accuracy for wavelength calibration in high-resolution spectroscopy, in research areas such as the discovery of exoplanets and the measurement of fundamental constants. The unique properties of OFCs, namely a highly dense spectrum of uniformly spaced emission lines of nearly equal intensity over the nominal wavelength range, are not only beneficial for high-resolution spectroscopy: in the low- to medium-resolution domain too, OFCs hold the promise of revolutionising calibration techniques. Here, we present a novel method for the generation of OFCs. As opposed to the mode-locked laser-based approach, which can be complex, costly and difficult to stabilise, we propose an all-optical-fibre-based system that is simple, compact, stable and low-cost. Our system consists of three optical fibres: the first is a conventional single-mode fibre, the second an erbium-doped fibre and the third a highly nonlinear low-dispersion fibre. The system is pumped by two equally intense continuous-wave (CW) lasers. To control the quality and the bandwidth of the OFCs, it is crucial to understand how optical solitons arise out of the initially modulated CW field in the first fibre. Here, we numerically investigate the pulse evolution in the first fibre using the technique of soliton radiation beat analysis. Having applied this technique, we found that the formation of higher-order solitons is supported in the low-energy region, whereas in the high-energy region Kuznetsov-Ma solitons appear.
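Pulse evolution of a dual-CW beat signal in a single fibre is typically simulated with a split-step Fourier integration of the NLSE; the sketch below does exactly that, with invented fibre and pump parameters. It is only a simplified stand-in for the soliton radiation beat analysis used in the paper, which additionally inspects the spectral evolution of the field.

```python
import numpy as np

# Time grid (ps) and angular-frequency grid (rad/ps).
n_t, t_max = 1024, 50.0
t = np.linspace(-t_max/2, t_max/2, n_t, endpoint=False)
dt = t[1] - t[0]
omega = 2 * np.pi * np.fft.fftfreq(n_t, d=dt)

beta2 = -0.02        # ps^2/m, anomalous dispersion (assumed)
gamma = 1.3e-3       # 1/(W m), Kerr nonlinearity (assumed)
dz, n_steps = 1.0, 500   # step size (m) and number of steps

# Two equally intense CW lines -> sinusoidally modulated initial field.
f_mod = 0.2          # beat frequency, THz (assumed)
field = np.sqrt(0.5) * (1 + np.exp(2j * np.pi * f_mod * t))

# Split-step loop: linear (dispersion) step in Fourier space,
# nonlinear (Kerr phase) step in the time domain.
lin = np.exp(0.5j * beta2 * omega**2 * dz)
for _ in range(n_steps):
    field = np.fft.ifft(lin * np.fft.fft(field))
    field = field * np.exp(1j * gamma * np.abs(field)**2 * dz)

print(f"peak power after {n_steps * dz:.0f} m: {np.abs(field).max()**2:.2f} W")
```

Both sub-steps are pure phase multiplications, so the scheme conserves the pulse energy exactly, which is a convenient sanity check on the integration.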
Abstract:
Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these cross-sections is the evolution of the yield curve. This dissertation studies aspects of this evolution.

There are two complementary approaches to the study of yield curve evolution here. The first is principal components analysis; the second is wavelet analysis. In both approaches, both the time and maturity variables are discretized. In principal components analysis, the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized; the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution.

In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market.

Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary.

Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and specially adapted wavelet construction. The result is more robust statistics, which balance the more fragile principal components analysis.
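The principal components step can be sketched as follows: yield-curve shift vectors are collected, their covariance matrix is diagonalized, and the cumulative eigenvalue shares show how few components capture most of the variation. The shift data below are simulated with a level/slope structure purely for illustration; real input would be observed Treasury yield changes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy daily yield-curve shifts: 500 observations across 10 maturities (years).
maturities = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10, 20, 30])
n_obs = 500
level = rng.normal(0, 5, (n_obs, 1)) * np.ones((1, 10))   # parallel shifts
slope = rng.normal(0, 2, (n_obs, 1)) * (maturities / 30)  # tilt shifts
noise = rng.normal(0, 0.5, (n_obs, 10))                   # idiosyncratic noise
shifts = level + slope + noise                            # basis points

# Diagonalize the covariance matrix of the shift vectors.
cov = np.cov(shifts, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)     # eigh returns ascending order
eigvals = eigvals[::-1]                    # descending: largest first
explained = np.cumsum(eigvals) / eigvals.sum()
print(f"variance captured by top 3 components: {explained[2]:.3f}")
```

By construction the simulated shifts have two dominant factors, so the leading components capture nearly all the variance; with real data, the analysis above is what produces statements like "the top six components pick up 99% of the variation".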
Abstract:
A grid-connected DFIG for wind power generation can affect power system small-signal angular stability in two ways: by changing the system load flow condition and by dynamically interacting with synchronous generators (SGs). This paper presents the application of the conventional method of damping torque analysis (DTA) to examine the effect of the DFIG's dynamic interactions with SGs on small-signal angular stability. It shows that the effect is due to the dynamic variation of power exchange between the DFIG and the power system and can be estimated approximately by the DTA. Consequently, if the DFIG is modelled as a constant power source, i.e. assuming zero dynamic interaction, the impact of the change of load flow brought about by the DFIG can be determined. The total effect of the DFIG can thus be estimated by adding the DTA result to that of the constant power source model. Applications of the proposed DTA method are discussed. An example of multi-machine power systems with grid-connected DFIGs is presented to demonstrate and validate the DTA method and the conclusions obtained in the paper.
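Small-signal angular stability of the kind analysed here is ultimately read off from the electromechanical mode eigenvalues and their damping ratios. The sketch below computes these for a linearised classical single-machine swing model with invented parameters; it is not the paper's DTA procedure, only the eigenvalue view that damping torque analysis complements.

```python
import numpy as np

# Linearised classical swing model: states are rotor angle deviation (rad)
# and speed deviation (pu). H, D, Ks are invented illustrative values.
H = 3.5            # inertia constant, s
D = 2.0            # damping torque coefficient, pu
Ks = 1.2           # synchronising torque coefficient, pu
omega_s = 2 * np.pi * 50   # synchronous speed, rad/s (50 Hz system)

A = np.array([[0.0,          omega_s],
              [-Ks / (2*H), -D / (2*H)]])

eigs = np.linalg.eigvals(A)
for lam in eigs:
    if lam.imag > 0:       # report each oscillatory mode once
        zeta = -lam.real / abs(lam)
        print(f"mode {lam:.3f}: f = {lam.imag/(2*np.pi):.2f} Hz, "
              f"damping ratio = {zeta:.3f}")
```

In a DTA study, the contribution of a DFIG's dynamic interaction shows up as a change in the damping torque coefficient, which moves these eigenvalues and hence the damping ratio of the electromechanical mode.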