944 results for allocation and extraction
Abstract:
Mode of access: Internet.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Aim: To rapidly quantify hepatitis B virus (HBV) DNA by real-time PCR using an efficient TaqMan probe and efficient virus DNA extraction methods. Methods: Three standards were prepared by cloning PCR products targeting the S, C and X regions of the HBV genome into the pGEM-T vector, respectively. A pair of primers and a matched TaqMan probe were selected by comparing the copy numbers and Ct values obtained for the same HBV serum sample from the three different standard curves. The efficiency of six HBV DNA extraction methods (guanidinium isothiocyanate, proteinase K, NaI, NaOH lysis, alkaline lysis and simple boiling) was then analyzed in samples A, B and C by real-time PCR. In addition, 8 clinical HBV serum samples were quantified. Results: The copy number of the same HBV serum sample derived from the standard curves of the S, C and X regions was 5.7 × 10⁴/mL, 6.3 × 10²/mL and 1.6 × 10³/mL, respectively; the corresponding Ct values were 26.6, 31.8 and 29.5. Therefore, the primers and matched probe from the S region were chosen for the comparison of the six extraction methods. With NaOH lysis, the most efficient of the six methods, the copy numbers of HBV serum samples A, B and C were 3.49 × 10⁹/mL, 2.08 × 10⁶/mL and 4.40 × 10⁷/mL, with Ct values of 19.9, 30.0 and 26.2, respectively. Simple boiling showed a slightly lower efficiency than NaOH lysis. With guanidinium isothiocyanate, proteinase K and NaI, the copy numbers of samples A, B and C were around 10⁵/mL, with Ct values of about 30. Alkaline lysis failed to quantify the copy number of the three samples. The standard deviation (SD) and coefficient of variation (CV) were very low in all 8 clinical HBV serum samples, showing that quantification of HBV DNA in triplicate was reliable and accurate. Conclusion: Real-time PCR based on optimized primers and a TaqMan probe from the S region, in combination with NaOH lysis, is a simple, rapid and accurate method for quantification of HBV DNA in serum. © 2006 The WJG Press. All rights reserved.
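As a concrete illustration of the standard-curve quantification this abstract describes, the following minimal Python sketch fits a dilution series and inverts the fitted curve to estimate copy number from a measured Ct. The dilution series, Ct values and the example Ct are hypothetical, not data from the paper.

# Minimal sketch of absolute quantification from a qPCR standard curve.
# All values are hypothetical; the paper's standards and Ct data are not given.
import numpy as np

# Hypothetical standard dilution series: known copies/mL and measured Ct values.
std_copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
std_ct = np.array([33.1, 29.8, 26.5, 23.2, 19.9])

# Fit Ct = slope * log10(copies) + intercept (ideal slope ~ -3.32 at 100% efficiency).
slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1  # amplification efficiency

def copies_from_ct(ct):
    """Invert the standard curve to estimate copies/mL from a sample Ct."""
    return 10 ** ((ct - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
print(f"Ct 26.6 -> {copies_from_ct(26.6):.2e} copies/mL")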
Abstract:
Internal quantum efficiency (IQE) of a high-brightness blue LED has been evaluated from the external quantum efficiency measured as a function of current at room temperature. Processing the data with a novel evaluation procedure based on the ABC-model, we have determined separately the IQE of the LED structure and the light extraction efficiency (LEE) of the UX:3 chip. Full text: Nowadays, understanding LED efficiency behavior at high currents is critical for finding ways to further improve III-nitride LED performance [1]. External quantum efficiency ηe (EQE) provides integral information on the recombination and photon emission processes in LEDs. At negligible carrier leakage from the active region, EQE is the product of the IQE ηi and the LEE ηext. Separate determination of IQE and LEE would be much more helpful, providing correlation between these parameters and the specific epi-structure and chip design. In this paper, we extend the approach of [2,3] to the whole range of current/optical power variation, providing an express tool for separate evaluation of IQE and LEE. We studied an InGaN-based LED fabricated by Osram OS. The LED structure, grown by MOCVD on a sapphire substrate, was processed as a UX:3 chip and mounted into the Golden Dragon package without molding. EQE was measured with a Labsphere CDS-600 spectrometer. Plotting EQE versus output power P and finding the power Pm corresponding to the EQE maximum ηm enables comparing the measurements with the analytical relationships ηi = Q/(Q + p^(1/2) + p^(-1/2)), p = P/Pm, and Q = B/(AC)^(1/2), where A, B and C are the recombination constants [4]. As a result, the maximum IQE value, equal to Q/(Q+2), can be found from the ratio ηm/ηe plotted as a function of p^(1/2) + p^(-1/2) (see Fig. 1a), and the LEE then calculated as ηext = ηm(Q+2)/Q. Experimental EQE as a function of normalized optical power p is shown in Fig. 1b along with the analytical approximation based on the ABC-model. The approximation fits the measurements perfectly over eight orders of magnitude of optical power (or operating current) variation. In conclusion, a new express method for separate evaluation of the IQE and LEE of III-nitride LEDs is suggested and applied to the characterization of a high-brightness blue LED. With this method, we obtained an LEE from the free chip surface to the air of 69.8%, and an IQE of 85.7% at the maximum and 65.2% at the operating current of 350 mA. [1] G. Verzellesi, D. Saguatti, M. Meneghini, F. Bertazzi, M. Goano, G. Meneghesso, and E. Zanoni, "Efficiency droop in InGaN/GaN blue light-emitting diodes: Physical mechanisms and remedies," J. Appl. Phys., vol. 114, no. 7, p. 071101, Aug. 2013. [2] C. van Opdorp and G. W. 't Hooft, "Method for determining effective nonradiative lifetime and leakage losses in double-heterostructure lasers," J. Appl. Phys., vol. 52, no. 6, pp. 3827-3839, Feb. 1981. [3] M. Meneghini, N. Trivellin, G. Meneghesso, E. Zanoni, U. Zehnder, and B. Hahn, "A combined electro-optical method for the determination of the recombination parameters in InGaN-based light-emitting diodes," J. Appl. Phys., vol. 106, no. 11, p. 114508, Dec. 2009. [4] Qi Dai, Qifeng Shan, Jing Wang, S. Chhajed, Jaehee Cho, E. F. Schubert, M. H. Crawford, D. D. Koleske, Min-Ho Kim, and Yongjo Park, "Carrier recombination mechanisms and efficiency droop in GaInN/GaN light-emitting diodes," Appl. Phys. Lett., vol. 97, no. 13, p. 133507, Sept. 2010. © 2014 IEEE.
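The ABC-model relationships quoted above can be turned into a short fitting routine. The Python sketch below uses synthetic data in place of the measured EQE curve: it recovers Q from a linear fit of ηm/ηe against p^(1/2) + p^(-1/2) and then derives the maximum IQE and the LEE, as described.

# Sketch of the ABC-model evaluation described above; the EQE data are synthetic.
import numpy as np

def iqe(p, Q):
    """ABC-model internal quantum efficiency at normalized power p = P/Pm."""
    return Q / (Q + np.sqrt(p) + 1.0 / np.sqrt(p))

def fit_abc(p, eqe):
    """p: optical power normalized to the EQE-maximum power; eqe: measured EQE."""
    eta_m = eqe.max()                      # peak EQE
    x = np.sqrt(p) + 1.0 / np.sqrt(p)      # p^(1/2) + p^(-1/2)
    slope, intercept = np.polyfit(x, eta_m / eqe, 1)
    Q = intercept / slope                  # since eta_m/eqe = (Q + x) / (Q + 2)
    iqe_max = Q / (Q + 2.0)
    lee = eta_m * (Q + 2.0) / Q            # eta_ext = eta_m / iqe_max
    return Q, iqe_max, lee

# Synthetic check: generate EQE from the model with Q=4, eta_ext=0.7, then refit.
p = np.logspace(-4, 4, 201)
eqe = 0.7 * iqe(p, 4.0)
Q, iqe_max, lee = fit_abc(p, eqe)
print(f"Q={Q:.2f}, max IQE={iqe_max:.3f}, LEE={lee:.3f}")   # ~4.00, 0.667, 0.700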
Abstract:
From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation, and relief distribution have been identified as directly impacting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reduce suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created, comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases. The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, along with the existence of room for improvement for Mexican organisations in flood management.
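The relief-distribution component described here is, at its core, a transportation-style optimisation. The following toy linear program (Python/SciPy, with invented costs, stocks and demands rather than the thesis's Mexican case-study data) illustrates the structure of such a model.

# Toy transportation LP in the spirit of the relief-distribution models described:
# ship relief from candidate facilities to affected areas at minimum cost.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 9.0],    # cost[i][j]: facility i -> area j
                 [5.0, 3.0, 7.0]])
supply = np.array([120.0, 100.0])    # stock prepositioned at each facility
demand = np.array([60.0, 70.0, 80.0])

n_f, n_a = cost.shape
c = cost.ravel()                     # decision variables x[i, j], row-major

# Supply constraints: sum_j x[i, j] <= supply[i]
A_ub = np.zeros((n_f, n_f * n_a))
for i in range(n_f):
    A_ub[i, i * n_a:(i + 1) * n_a] = 1.0

# Demand constraints: sum_i x[i, j] == demand[j]
A_eq = np.zeros((n_a, n_f * n_a))
for j in range(n_a):
    A_eq[j, j::n_a] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=(0, None), method="highs")
print(res.x.reshape(n_f, n_a))       # optimal shipment plan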
Abstract:
Risk capital allocation in finance is important both theoretically and in practical applications. How can the risk of a portfolio be shared among its sub-portfolios? How should capital reserves be set to cover the outstanding risks, and how should the reserves be assigned to the business units? The study uses an axiomatic approach to analyse risk capital allocation, that is, it works by requiring basic properties. The starting point is the result of Csóka and Pintér (2010), who showed that the axioms of coherent measures of risk are not compatible with certain fairness, incentive-compatibility and stability requirements of capital allocation. This paper examines these requirements using analytical and simulation tools. It also analyses capital allocation methods used in practical applications as well as methods with theoretically interesting properties. The main conclusion is that the problem identified by Csóka and Pintér (2010) is relevant in practical applications as well: it arises not only in theoretical investigations but as a very common practical problem. A further contribution is the characterisation of the capital allocation methods examined, which helps practitioners choose among the different methods available.
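For readers unfamiliar with the methods under discussion, the sketch below implements one classic allocation rule from this literature: the Shapley value under an Expected Shortfall risk measure. The scenario losses are invented, and this is not claimed to be one of the specific methods the paper analyses.

# Sketch: Shapley allocation of risk capital among sub-portfolios, with
# Expected Shortfall (ES) as the risk measure. Scenario data are made up.
from itertools import combinations
from math import factorial
import numpy as np

losses = np.array([[1.0, 2.0, 0.5],   # scenario losses; columns = sub-portfolios
                   [3.0, 0.0, 1.5],
                   [0.5, 1.0, 2.5],
                   [2.0, 2.5, 0.0]])
alpha = 0.75                          # ES confidence level

def es(subset):
    """Expected Shortfall of the combined loss of a subset of sub-portfolios."""
    if not subset:
        return 0.0
    total = losses[:, list(subset)].sum(axis=1)
    tail = np.sort(total)[int(np.ceil(alpha * len(total))):]
    return tail.mean() if len(tail) else np.sort(total)[-1]

n = losses.shape[1]
players = range(n)
shapley = np.zeros(n)
for i in players:
    others = [j for j in players if j != i]
    for k in range(len(others) + 1):
        for S in combinations(others, k):
            w = factorial(k) * factorial(n - k - 1) / factorial(n)
            shapley[i] += w * (es(S + (i,)) - es(S))

print(shapley, shapley.sum(), es(tuple(players)))  # allocations sum to total risk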
Abstract:
Sub-optimal recovery of bacterial DNA from whole blood samples can limit the sensitivity of molecular assays to detect pathogenic bacteria. We compared 3 different pre-lysis protocols (none, mechanical pre-lysis and achromopeptidase pre-lysis) and 5 commercially available DNA extraction platforms for direct detection of Group B Streptococcus (GBS) in spiked whole blood samples, without enrichment culture. DNA was extracted using the QIAamp Blood Mini kit (Qiagen), UCP Pathogen Mini kit (Qiagen), QuickGene DNA Whole Blood kit S (Fuji), Speed Xtract Nucleic Acid Kit 200 (Qiagen) and MagNA Pure Compact Nucleic Acid Isolation Kit I (Roche Diagnostics Corp). Compared with no pre-lysis, mechanical pre-lysis increased yields of bacterial genomic DNA by 51.3-fold (95% confidence interval: 31.6–85.1, p < 0.001) and pre-lysis with achromopeptidase by 6.1-fold (95% CI: 4.2–8.9, p < 0.001). Differences in yield due to pre-lysis were 2–3-fold larger than differences in yield between extraction methods. Including a pre-lysis step can improve the limits of detection of GBS using PCR or other molecular methods without the need for culture.
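Fold-change estimates with confidence intervals like those reported above are typically obtained by comparing yields on a log scale. Here is a minimal sketch of such a calculation in Python, with invented replicate yields rather than the study's data.

# Illustrative fold-change in DNA yield with a 95% CI, using a t-interval on
# log-transformed yields; the replicate values below are invented.
import numpy as np
from scipy import stats

pre_lysis = np.array([8.2e4, 1.1e5, 9.6e4, 7.4e4])   # yields with mechanical pre-lysis
no_lysis = np.array([1.9e3, 2.4e3, 1.6e3, 2.1e3])    # yields without pre-lysis

log_diff = np.log(pre_lysis).mean() - np.log(no_lysis).mean()
se = np.sqrt(np.log(pre_lysis).var(ddof=1) / len(pre_lysis)
             + np.log(no_lysis).var(ddof=1) / len(no_lysis))
t = stats.t.ppf(0.975, df=len(pre_lysis) + len(no_lysis) - 2)

fold = np.exp(log_diff)
ci = np.exp([log_diff - t * se, log_diff + t * se])
print(f"fold change: {fold:.1f} (95% CI {ci[0]:.1f}-{ci[1]:.1f})")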
Abstract:
We investigate the performance of dual-hop two-way amplify-and-forward (AF) relaying in the presence of in-phase and quadrature-phase imbalance (IQI) at the relay node. In particular, the effective signal-to-interference-plus-noise ratio (SINR) at both sources is derived. These SINRs are used to design an instantaneous power allocation scheme, which maximizes the minimum SINR of the two sources under a total transmit power constraint. The solution to this optimization problem is determined analytically and used to evaluate the outage probability (OP) of the considered two-way AF relaying system. Both analytical and numerical results show that IQI can create fundamental performance limits on two-way relaying, which cannot be avoided by simply improving the channel conditions.
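The abstract derives closed-form SINRs that are not reproduced here, but the max-min power-allocation structure can still be illustrated. In the Python sketch below the SINR expressions are placeholders, not the paper's IQI-aware formulas; only the optimisation pattern (balancing the weaker link under a total power budget) reflects the described scheme.

# Generic max-min power allocation between two sources under a total budget.
import numpy as np

g1, g2, noise = 0.9, 0.4, 0.1     # hypothetical channel gains and noise power
P_total = 1.0

def sinr1(p1, p2):                # placeholder effective SINR at source 1
    return g2 * p2 / (noise + 0.05 * g1 * p1)

def sinr2(p1, p2):                # placeholder effective SINR at source 2
    return g1 * p1 / (noise + 0.05 * g2 * p2)

# At the optimum of a max-min problem the two SINRs typically balance, so a
# one-dimensional search over the power split suffices here.
splits = np.linspace(1e-3, 1 - 1e-3, 10_000)
objective = [min(sinr1(s * P_total, (1 - s) * P_total),
                 sinr2(s * P_total, (1 - s) * P_total)) for s in splits]
best = splits[int(np.argmax(objective))]
print(f"P1={best * P_total:.3f}, P2={(1 - best) * P_total:.3f}, "
      f"min-SINR={max(objective):.3f}")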
The neurotoxin β-N-methylamino-L-alanine (BMAA): Sources, bioaccumulation and extraction procedures
Abstract:
β-N-methylamino-L-alanine (BMAA) is a neurotoxin linked to neurodegeneration, which is manifested in the devastating human diseases amyotrophic lateral sclerosis, Alzheimer's disease and Parkinson's disease. This neurotoxin is known to be produced by almost all tested species within the cyanobacterial phylum, including free-living as well as symbiotic strains. The global distribution of BMAA producers ranges from a terrestrial ecosystem on the island of Guam in the Pacific Ocean to an aquatic ecosystem in Northern Europe, the Baltic Sea, where massive surface blooms occur annually. BMAA has been shown to accumulate in the Baltic Sea food web, with the highest levels in bottom-dwelling fish species as well as in mollusks. One of the aims of this thesis was to test the bottom-dwelling bioaccumulation hypothesis by using a larger number of samples, allowing a statistical evaluation. Hence, a large set of fish individuals from Lake Finjasjön were caught, and the BMAA concentrations in different tissues were related to the season of catching, fish gender, total weight and species. The results reveal that fish total weight and fish species were positively correlated with the BMAA concentration in the fish brain. Accordingly, significantly higher concentrations of BMAA were detected in the brains of plankti-benthivorous fish species and of heavier (potentially older) individuals. Another goal was to investigate the potential production of BMAA by other phytoplankton organisms. Diatom cultures were therefore investigated and confirmed to produce BMAA, even at higher concentrations than cyanobacteria. All diatom cultures studied during this thesis work were shown to contain BMAA, as was one dinoflagellate species. This may imply that the environmental spread of BMAA in aquatic ecosystems is even greater than previously thought. Earlier reports on the concentration of BMAA in different organisms have shown highly variable results, and the methods used for quantification have been intensively discussed in the scientific community. In the most recent studies, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has become the instrument of choice, due to its high sensitivity and selectivity. Even so, different studies report quite variable concentrations of BMAA. In this thesis, three of the most common BMAA extraction protocols were evaluated in order to find out whether the extraction step could be one of the sources of variability. The method involving precipitation of proteins using trichloroacetic acid gave the best performance, complying with all in-house validation criteria. However, extraction of diatom and cyanobacteria cultures with this validated method, quantified using LC-MS/MS, still resulted in variable BMAA concentrations, which suggests that biological factors also contribute to the discrepancies. The current knowledge on the environmental factors that can induce or reduce BMAA production is still limited. In cyanobacteria, production of BMAA was earlier shown to be negatively correlated with nitrogen availability, both in laboratory cultures and in natural populations. Based on this observation, it was suggested that in unicellular non-diazotrophic cyanobacteria, BMAA might take part in nitrogen metabolism. In order to find out whether BMAA has a similar role in diatoms, BMAA was added to two diatom species in culture, at concentrations corresponding to those earlier found in the diatoms.
The results suggest that BMAA might induce a nitrogen starvation signal in diatoms, as was earlier observed in cyanobacteria. However, the diatoms recover shortly afterwards through the extracellular presence of excreted ammonia. Thus, in diatoms too, BMAA might be involved in the nitrogen balance of the cell.
Abstract:
Social network sites (SNS), such as Facebook, Google+ and Twitter, have attracted hundreds of millions of daily users since their appearance. Within SNS, users connect to each other, express their identity, disseminate information and form cooperation by interacting with their connected peers. The increasing popularity and ubiquity of SNS usage, together with the invaluable user behaviors and connections they capture, have given rise to many applications and business models. We look into several important problems within the social network ecosystem. The first one is the SNS advertisement allocation problem. The other two are related to trust mechanism design in the social network setting: local trust inference and global trust evaluation. In SNS advertising, we study the problem of advertisement allocation from the ad platform's angle, and discuss its differences from the advertising model in the search engine setting. By leveraging the connection between social networks and hyperbolic geometry, we propose to solve the problem via approximation using hyperbolic embedding and convex optimization. A hyperbolic embedding method, \hcm, is designed for the SNS ad allocation problem, and several components are introduced to realize the optimization formulation. We show the advantages of our new approach in solving the problem compared to the baseline integer programming (IP) formulation. In studying trust mechanisms in social networks, we consider the existence of distrust (i.e., negative trust) relationships, and differentiate between the concepts of local trust and global trust in the social network setting. For the problem of local trust inference, we propose a 2-D trust model. Based on the model, we develop a semiring-based trust inference framework. For global trust evaluation, we consider a general setting with conflicting opinions, and propose a consensus-based approach to solve this complex problem in signed trust networks.
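As background to the hyperbolic-embedding approach mentioned above (and not the thesis's \hcm method itself), the following snippet computes the Poincaré-disk distance on which such embeddings of social networks rely.

# Hyperbolic distance between two points inside the unit disk/ball.
import numpy as np

def poincare_distance(u, v):
    """d(u, v) = arccosh(1 + 2*|u-v|^2 / ((1-|u|^2) * (1-|v|^2)))."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    sq = np.sum((u - v) ** 2)
    denom = (1 - np.sum(u ** 2)) * (1 - np.sum(v ** 2))
    return np.arccosh(1 + 2 * sq / denom)

# Points near the boundary are "far" from each other unless angularly close,
# which is what lets tree-like social graphs embed with low distortion.
print(poincare_distance([0.0, 0.0], [0.5, 0.0]))   # ~1.0986
print(poincare_distance([0.9, 0.0], [0.0, 0.9]))   # much larger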
Abstract:
Libraries, since their inception 4000 years ago, have been in a process of constant change. Although changes happened slowly for centuries, in recent decades academic libraries have been continuously striving to adapt their services to the ever-changing needs of students and academic staff. In addition, the e-content revolution, technological advances, and ever-shrinking budgets have obliged libraries to allocate their limited resources efficiently among collection and services. Unfortunately, this resource allocation is a complex process due to the diversity of data sources and formats that must be analyzed prior to decision-making, as well as the lack of efficient integration methods. The main purpose of this study is to develop an integrated model that supports libraries in making optimal budgeting and resource allocation decisions among their services and collection by means of a holistic analysis. To this end, a combination of several methodologies and structured approaches is applied. Firstly, a holistic structure and the toolset required to assess academic libraries holistically are proposed, in order to collect and organize the data from an economic point of view. A four-pronged theoretical framework is used, in which the library system and collection are analyzed from the perspectives of users and internal stakeholders. The first quadrant corresponds to the internal perspective of the library system: the library's performance, and the costs incurred and resources consumed by library services, are analyzed. The second quadrant evaluates the external perspective of the library system: users' perception of service quality is judged in this quadrant. The third quadrant analyses the external perspective of the library collection, that is, the impact of the current library collection on its users. Finally, the fourth quadrant evaluates the internal perspective of the library collection: the usage patterns of the library collection are analyzed. With a complete framework for data collection in place, these data, coming from multiple sources and therefore in different formats, need to be integrated and stored in a scheme suitable for decision support. Secondly, a data warehousing approach is designed and implemented to integrate, process, and store the holistically collected data. Ultimately, the strategic data stored in the data warehouse are analyzed and applied for different purposes, including the following: 1) Data visualization and reporting, proposed to allow library managers to publish library indicators simply and quickly using online reporting tools. 2) Sophisticated data analysis through data mining tools; three data mining techniques are examined in this research study: regression, clustering and classification. These techniques have been applied to the case study to predict future investment in library development; to find clusters of users who share common interests and similar profiles but belong to different faculties; and to identify library factors that affect student academic performance by analyzing possible correlations between library usage and academic performance. 3) Input for optimization models; early experiences of developing an optimal resource allocation model to distribute resources among the different processes of a library system are documented in this study.
Specifically, the problem of allocating funds for the digital collection among the divisions of an academic library is addressed. An optimization model for the problem is defined, with the objective of maximizing the usage of the digital collection across all library divisions subject to a single collection budget. By proposing this holistic approach, the research study contributes to knowledge by providing an integrated solution to assist library managers in making economic decisions based on an "as realistic as possible" perspective of the library situation.
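A toy version of the stated optimization model, maximizing digital-collection usage across divisions subject to a single budget, can be written as a linear program. All usage rates and funding bounds below are invented for illustration; the thesis's actual model may differ.

# Allocate one digital-collection budget across divisions to maximize usage.
import numpy as np
from scipy.optimize import linprog

usage_per_unit = np.array([3.2, 1.8, 2.5, 4.1])   # downloads per currency unit
lower = np.array([10.0, 10.0, 10.0, 10.0])        # minimum funding per division
upper = np.array([60.0, 50.0, 70.0, 40.0])        # cap per division
budget = 150.0

res = linprog(-usage_per_unit,                    # maximize => minimize negative
              A_ub=np.ones((1, 4)), b_ub=[budget],
              bounds=list(zip(lower, upper)), method="highs")
print(res.x, -res.fun)                            # allocation and total usage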