988 results for MULTIPLE EXCITON GENERATION


Relevance:

30.00%

Publisher:

Abstract:

We experimentally demonstrate a Raman fiber laser based on multiple point-action fiber Bragg grating (FBG) reflectors and distributed feedback via Rayleigh scattering in a ∼22 km long optical fiber. Twenty-two lasing lines with a spacing of ∼100 GHz (close to the ITU grid) are generated in the C-band at watt-level power. In contrast to a normal cavity, in which laser lines compete, the random distributed feedback cavity exhibits highly stable multiwavelength generation with a power-equalized uniform distribution that is almost independent of power. The current setup is capable of generating Raman gain about 100 nm wide, opening the possibility of multiwavelength generation in different bands. © 2011 SPIE.
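As a quick illustration of the channel plan implied above, the Python sketch below lists 22 channels on the 100 GHz ITU grid (anchored at 193.1 THz) and converts them to vacuum wavelengths in the C-band; the 192.1 THz start channel is an assumption for the example, since the experiment's exact channel range is not stated in the abstract.

```python
# Illustrative only: 22 channels on the 100 GHz ITU grid and their vacuum
# wavelengths. The 192.1 THz start frequency is an assumed example value.

C = 299_792_458.0  # speed of light in vacuum, m/s

def itu_grid_channels(n_channels=22, start_thz=192.1, spacing_ghz=100.0):
    """Yield (frequency in THz, vacuum wavelength in nm) for each grid channel."""
    for k in range(n_channels):
        f_thz = start_thz + k * spacing_ghz / 1000.0
        wavelength_nm = C / (f_thz * 1e12) * 1e9
        yield f_thz, wavelength_nm

for f_thz, wl_nm in itu_grid_channels():
    print(f"{f_thz:6.1f} THz  ->  {wl_nm:8.2f} nm")
```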

Relevance:

30.00%

Publisher:

Abstract:

The realisation of an eventual low-voltage (LV) Smart Grid with a complete communication infrastructure is a gradual process. During this evolution, the protection scheme of distribution networks should be continuously adapted and optimised to fit the protection and cost requirements of the time. This paper reviews practices and research around the design of an effective, adaptive and economical distribution network protection scheme. The background of the topic is introduced, and potential problems are defined from the perspective of both conventional protection theories and new Smart Grid technologies. Challenges are identified, and possible solutions are outlined as a pathway towards ultimately flexible and reliable LV protection systems.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this qualitative study was to explore the academic and nonacademic experiences of self-identified first-generation college students who left college before their second year. The study sought to find how these experiences might have affected the students' decision to depart. The case study method was used to investigate these college students, who attended Florida International University. Semi-structured interviews were conducted with six ex-students who identified themselves as first-generation college students. The narrative data from the interviews were transcribed, coded, and analyzed. Analysis was informed by Pascarella, Pierson, Wolniak, and Terenzini's (2004) theoretical framework of important college academic and nonacademic experiences. An audit trail was kept and the data were triangulated by using multiple sources to establish certain findings. The most critical tool for enhancing trustworthiness was the use of member checking. I also received ongoing feedback from my major professor and committee throughout the dissertation process. The participants reported the following academic experiences: (a) patterns of coursework; (b) course-related interactions with peers; (c) relationships with faculty; (d) class size; (e) academic advisement; (f) orientation and peer advisors; and (g) financial aid. The participants reported the following nonacademic experiences: (h) on- or off-campus employment; (i) on- or off-campus residence; (j) participation in extracurricular activities; (k) noncourse-related peer relationships; (l) commuting and parking; and (m) FIU as an HSI. Isolationism and poor fit with the university were the most prevalent reasons for departure. The reported experiences of these first-generation college students shed light on those experiences that contributed to their departure. University administrators should give additional attention to these stories in an effort to improve retention strategies for this population. All but two of the participants went on to enroll in other institutions and reported good experiences with their new institutions. Recommendations are provided for continued research concerning how best to meet the needs of college students like the participants: students who have not learned from their parents about higher-education financial aid, academic advisement, and orientation.

Relevance:

30.00%

Publisher:

Abstract:

The development of third-generation (3G) telecommunication value-added services brings higher requirements on Quality of Service (QoS). Wideband Code Division Multiple Access (WCDMA) is one of the three 3G standards, and enhancing QoS for the WCDMA Core Network (CN) is becoming increasingly important for users and carriers. This dissertation focuses on QoS enhancement for the WCDMA CN, with the goal of realizing the DiffServ (Differentiated Services) QoS model for it. Based on the parallelism characteristic of Network Processors (NPs), NP programming models are classified as Pool of Threads (POTs) and Hyper Task Chaining (HTC). In this study, an integrated programming model combining the two was designed. The model is highly efficient and flexible, and it solves the problems of sharing conflicts and packet ordering. We used it as the programming model to realize DiffServ QoS for the WCDMA CN.

The realization of the DiffServ model mainly consists of buffer management, packet scheduling and packet classification algorithms based on NPs. First, we proposed an adaptive buffer management algorithm called Packet Adaptive Fair Dropping (PAFD), which takes both fairness and throughput into consideration and has smooth service curves. Second, an improved packet scheduling algorithm called Priority-based Weighted Fair Queuing (PWFQ) was introduced to ensure fairness of packet scheduling and to reduce the queuing time of data packets, while keeping delay and jitter within a small range. Third, a multi-dimensional packet classification algorithm called Classification Based on Network Processors (CBNPs) was designed; it effectively reduces memory accesses and storage space, and offers lower time and space complexity.

Lastly, an integrated hardware and software system implementing the DiffServ QoS model for the WCDMA CN was proposed and implemented on the IXP2400 NP. According to the experimental results, the proposed system significantly enhances QoS for the WCDMA CN: it improves response-time consistency, reduces display distortion, improves sound-image synchronization, and thus increases network efficiency and saves network resources.
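For readers unfamiliar with weighted fair queuing, the Python sketch below gives a minimal illustration of a WFQ scheduler with a strict-priority tier, in the spirit of PWFQ; the dissertation's actual PWFQ algorithm is not specified in the abstract, so the queue names, weights, priorities and simplified virtual-time bookkeeping here are illustrative assumptions only.

```python
# A minimal sketch of a weighted fair queuing scheduler with a strict-priority
# tier, in the spirit of PWFQ. Queue names, weights, priorities and the
# simplified virtual-time bookkeeping are illustrative assumptions, not the
# dissertation's actual algorithm.

from collections import deque

class PriorityWFQ:
    def __init__(self, queues):
        # queues: name -> (priority, weight); a lower priority value is served first.
        self.cfg = queues
        self.buffers = {name: deque() for name in queues}
        self.last_finish = {name: 0.0 for name in queues}
        self.vtime = 0.0  # simplified virtual clock

    def enqueue(self, name, size):
        # Virtual finish time grows with packet size and shrinks with queue weight.
        _, weight = self.cfg[name]
        start = max(self.vtime, self.last_finish[name])
        finish = start + size / weight
        self.last_finish[name] = finish
        self.buffers[name].append((finish, size))

    def dequeue(self):
        # Among backlogged queues of the best (lowest) priority class, serve the
        # head packet with the smallest virtual finish time.
        backlogged = [name for name, buf in self.buffers.items() if buf]
        if not backlogged:
            return None
        best_priority = min(self.cfg[name][0] for name in backlogged)
        candidates = [name for name in backlogged if self.cfg[name][0] == best_priority]
        chosen = min(candidates, key=lambda name: self.buffers[name][0][0])
        finish, size = self.buffers[chosen].popleft()
        self.vtime = max(self.vtime, finish)
        return chosen, size

sched = PriorityWFQ({"voice": (0, 4), "video": (1, 3), "data": (1, 1)})
for name, size in [("data", 1500), ("video", 1200), ("voice", 200), ("data", 1500)]:
    sched.enqueue(name, size)
while (served := sched.dequeue()) is not None:
    print(served)
```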

Relevance:

30.00%

Publisher:

Abstract:

We thank the High-Throughput Genomics Group at the Wellcome Trust Centre for Human Genetics and the Wellcome Trust Sanger Institute for the generation of the sequencing data. This work was funded by Wellcome Trust grant 090532/Z/09/Z (J.F.). Primary phenotyping of the mice was supported by the Mary Lyon Centre and Mammalian Genetics Unit (Medical Research Council, UK Hub grant G0900747 91070 and Medical Research Council, UK grant MC U142684172). D.A.B. acknowledges support from NIH R01AR056280. The sleep work was supported by the state of Vaud (Switzerland) and the Swiss National Science Foundation (SNF 14694 and 136201 to P.F.). The ECG work was supported by the Netherlands CardioVascular Research Initiative (Dutch Heart Foundation, Dutch Federation of University Medical Centres, the Netherlands Organization for Health Research and Development, and the Royal Netherlands Academy of Sciences) PREDICT project, InterUniversity Cardiology Institute of the Netherlands (ICIN; 061.02; C.A.R., C.R.B.). Na Cai is supported by the Agency of Science, Technology and Research (A*STAR) Graduate Academy. The authors wish to acknowledge excellent technical assistance from Ayako Kurioka, Leo Swadling, Catherine de Lara, James Ussher, Rachel Townsend, Sima Lionikaite, Ausra S. Lionikiene, Rianne Wolswinkel and Inge van der Made. We would like to thank Thomas M Keane and Anthony G Doran for their help in annotating variants and adding the FVB/NJ strain to the Mouse Genomes Project.

Relevance:

30.00%

Publisher:

Abstract:

Bioscience subjects require a significant amount of training in laboratory techniques to produce highly skilled science graduates. Many techniques currently used in diagnostic, research and industrial laboratories require expensive single-user equipment; examples include next-generation sequencing, quantitative PCR, mass spectrometry and other analytical techniques. The cost of the machines and reagents, and limited access, frequently preclude undergraduate students from using such cutting-edge techniques. In addition to cost and availability, the time taken for analytical runs on equipment such as High Performance Liquid Chromatography (HPLC) does not necessarily fit the limitations of timetabling. Understanding the theory underlying these techniques without the accompanying practical classes can be unexciting for students. One alternative to wet laboratory provision is to use virtual simulations of such practicals, which enable students to see the machines and interact with them to generate data. The Faculty of Science and Technology at the University of Westminster has provided all second- and third-year undergraduate students with iPads, so that these students all have access to a mobile device to assist with learning. We have purchased licences from Labster to access a range of virtual laboratory simulations. These virtual laboratories are fully equipped and require student responses to multiple-answer questions in order to progress through the experiment. In a pilot study of the feasibility of the Labster virtual laboratory simulations on the iPad devices, second-year Biological Science students (n=36) worked through the Labster HPLC simulation on iPads. The virtual HPLC simulation enabled students to optimise the conditions for the separation of drugs. Answers to multiple-choice questions were necessary to progress through the simulation; these focussed on the underlying principles of the HPLC technique. Following the virtual laboratory simulation, students used a real HPLC in the analytical suite to separate aspirin, caffeine and paracetamol. In a survey, 100% of students (n=36) in this cohort agreed that the Labster virtual simulation had helped them to understand HPLC. In free-text responses, one student commented that “the terminology is very clear and I enjoyed using Labster very much”. One member of staff commented that “there was a very good knowledge interaction with the virtual practical”.

Relevance:

30.00%

Publisher:

Abstract:

There has been increasing interest in the development of new methods using Pareto optimality to deal with multi-objective criteria (for example, accuracy and time complexity). Once one has developed an approach to a problem of interest, the question is then how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. The standard tests used for this purpose can jointly consider neither multiple performance measures nor multiple competitors at once. The aim of this paper is to resolve these issues by developing statistical procedures that account for multiple competing measures at the same time and compare multiple algorithms altogether. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend them by discovering conditional independences among measures to reduce the number of parameters of such models, as the number of studied cases is usually very small in such comparisons. Data from a comparison among general-purpose classifiers is used to show a practical application of our tests.
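A minimal sketch of the Bayesian flavour of such a procedure is given below, assuming a multinomial-Dirichlet model over four joint win/loss categories for two algorithms compared on two measures; the category labels, prior, counts and decision rule are assumptions for illustration, not the exact test developed in the paper.

```python
# Illustrative sketch: outcomes of pairwise comparisons between algorithms A
# and B on two measures (accuracy, runtime) are treated as draws from a
# multinomial over four joint categories, with a Dirichlet prior.

import numpy as np

# Counts over data sets: (A wins both, A wins accuracy only,
#                         A wins time only, B wins both) - made-up numbers.
counts = np.array([11, 4, 3, 2])
prior = np.ones(4)  # symmetric Dirichlet(1, 1, 1, 1) prior

rng = np.random.default_rng(0)
posterior_samples = rng.dirichlet(prior + counts, size=100_000)

# Posterior probability that "A wins on both measures" is the single most
# probable joint outcome.
p_a_dominates = np.mean(posterior_samples.argmax(axis=1) == 0)
print(f"P('A wins on both' is the most probable outcome) = {p_a_dominates:.3f}")
```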

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this Master’s Thesis was to study how well the transportation of liquid wastes would fit the portfolio of the case company. After the preliminary study, the waste types were narrowed down to waste oil and oily waste from ports. The thesis was executed by generating a business plan. The qualitative research was carried out as a case study, collecting information from multiple sources. The business plan was developed by first reviewing the literature on business planning, which was then used as a basis for an interview with the customer and interviews with the personnel of the case company. Additionally, internet sources and informal conversational interviews with the company’s personnel were used; these interviews took place during the preliminary study and this thesis. The results describe the requirements that the case company must meet to be able to start operations. The import of waste oil fits perfectly into the company’s portfolio and does not require any large investments. Its success is affected by, among other factors, the price of crude oil, the exchange rate of the ruble and legislation. Transportation of oily waste from ports, in turn, is not a core competence of the case company, so further actions, such as subcontracting with a waste management company, are required before operations can start.

Relevance:

30.00%

Publisher:

Abstract:

It is nowadays recognized that the risk of human co-exposure to multiple mycotoxins is real. In recent years, a number of studies have approached the issue of co-exposure and the best way to develop a more precise and realistic assessment. Likewise, the growing concern about the combined effects of mycotoxins and their potential impact on human health has been reflected in the increasing number of toxicological studies on the combined toxicity of these compounds. Nevertheless, risk assessment of these toxins still follows the conventional paradigm of single exposure and single effects, incorporating only the possibility of additivity and not taking into account the complex dynamics associated with interactions between different mycotoxins or between mycotoxins and other food contaminants. Considering that risk assessment is intimately related to the establishment of regulatory guidelines, once the risk assessment is completed, an effort to reduce or manage the risk should follow in order to protect public health. Risk assessment of combined human exposure to multiple mycotoxins thus poses several challenges to scientists, risk assessors and risk managers, and opens new avenues for research. This presentation aims to give an overview of the different challenges posed by the likelihood of human co-exposure to mycotoxins and the possibility of interactive effects occurring after absorption, towards knowledge generation to support a more accurate human risk assessment and risk management. For this purpose, a physiologically based framework that includes knowledge on the bioaccessibility, toxicokinetics and toxicodynamics of multiple toxins is proposed. Regarding exposure assessment, the need for harmonized food consumption data, the availability of multi-analyte methods for mycotoxin quantification, the management of left-censored data and the use of probabilistic models will be highlighted, in order to develop a more precise and realistic exposure assessment. The application of predictive mathematical models to estimate the combined effects of mycotoxins from in vitro toxicity studies will also be discussed. Results from a recent Portuguese project that explored the toxic effects of mixtures of mycotoxins in infant foods and their potential health impact will be presented as a case study, illustrating the different aspects of risk assessment highlighted in this presentation. Further studies on hazard and exposure assessment of multiple mycotoxins, using harmonized approaches and methodologies, will be crucial for improving data quality and contributing to holistic risk assessment and risk management strategies for multiple mycotoxins in foodstuffs.
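As a toy illustration of the probabilistic exposure models mentioned above, the Python sketch below runs a Monte Carlo co-exposure estimate for two mycotoxins under a simple dose-addition (hazard index) assumption; all distribution parameters and reference doses are invented for the example and are not data from the project described here.

```python
# Illustrative sketch of a probabilistic co-exposure estimate under a simple
# dose-addition assumption. All parameters below are hypothetical.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # simulated consumers

# Contamination (µg/kg food) modelled as lognormal; consumption in kg/day;
# body weight in kg.
conc_toxin_a = rng.lognormal(mean=0.0, sigma=0.8, size=n)
conc_toxin_b = rng.lognormal(mean=-0.5, sigma=1.0, size=n)
consumption = rng.normal(loc=0.05, scale=0.01, size=n).clip(min=0.0)
body_weight = rng.normal(loc=70.0, scale=12.0, size=n).clip(min=30.0)

# Daily exposure in µg per kg body weight per day.
exposure_a = conc_toxin_a * consumption / body_weight
exposure_b = conc_toxin_b * consumption / body_weight

# Hazard index under dose addition: sum of exposure / reference-dose ratios.
tdi_a, tdi_b = 0.002, 0.001  # hypothetical tolerable daily intakes (µg/kg bw/day)
hazard_index = exposure_a / tdi_a + exposure_b / tdi_b

print(f"Median hazard index: {np.median(hazard_index):.2f}")
print(f"Fraction of simulated consumers with HI > 1: {np.mean(hazard_index > 1):.2%}")
```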

Relevance:

30.00%

Publisher:

Abstract:

Hypertension is a major risk factor for cardiovascular disease and mortality, and a growing global public health concern, with up to one-third of the world’s population affected. Despite the vast amount of evidence for the benefits of blood pressure (BP) lowering accumulated to date, elevated BP is still the leading risk factor for disease and disability worldwide. It is well established that hypertension and BP are common complex traits, in which multiple genetic and environmental factors contribute to BP variation. Furthermore, family and twin studies have confirmed the genetic component of BP, with heritability estimates in the range of 30-50%. Contemporary genomic tools, which enable the genotyping of millions of genetic variants across the human genome in an efficient, reliable and cost-effective manner, have transformed hypertension genetics research. This has been accompanied by international consortia offering unprecedentedly large sample sizes for genome-wide association studies (GWASs). While GWASs for hypertension and BP have identified more than 60 loci, variants in these loci have modest effects on BP and in aggregate explain less than 3% of the variance in BP. The aim of this thesis is to study the genetic and environmental factors that influence BP and hypertension traits in the Scottish population through several genetic epidemiological analyses. The first part examines the burden of hypertension in the Scottish population and assesses the familial aggregation and heritability of BP and hypertension traits. The second part validates the association of common SNPs reported in large GWASs and estimates the variance explained by these variants.
Comprehensive genetic epidemiology analyses were performed on Generation Scotland: Scottish Family Health Study (GS:SFHS), one of the largest population-based family design studies. The availability of clinical data, biological samples, self-reported information and medical records for study participants allowed several assessments of the factors that influence BP variation in the Scottish population. Of the 20,753 subjects genotyped in the study, a total of 18,470 individuals (grouped into 7,025 extended families) passed the stringent quality control (QC) criteria and were available for all subsequent analyses. Based on the sources of BP-lowering treatment exposure, subjects were further classified into two groups: first, subjects with both a self-reported medication (SRM) history and electronic prescription records (EPRs; n = 12,347); second, all subjects with at least one medication history source (n = 18,470). In the first group, the analysis showed good concordance between SRMs and EPRs (kappa = 71%), indicating that SRMs can be used as a surrogate to assess exposure to BP-lowering medication in GS:SFHS participants. Although both sources have limitations, SRMs can be considered the best available source for estimating drug exposure history in those without EPRs. The prevalence of hypertension was 40.8%, and was higher in men (46.3%) than in women (35.8%). The prevalence of awareness, treatment and controlled hypertension, as defined in the study, was 25.3%, 31.2% and 54.3%, respectively. These figures are lower than those reported in similar studies in other populations, with the exception of the prevalence of controlled hypertension, which compares favourably with other populations. Odds of hypertension were higher in men, in obese or overweight individuals, in people with a parental history of hypertension, and in those living in the most deprived areas of Scotland. On the other hand, deprivation was associated with higher odds of treatment, awareness and controlled hypertension, suggesting that people living in the most deprived areas may have been receiving better quality of care, or may have higher comorbidity levels requiring greater engagement with doctors. These findings highlight the need for further work to improve hypertension management in Scotland.
The family design of GS:SFHS allowed family-based analyses of the familial aggregation and heritability of BP and hypertension traits. The familial correlation of BP traits ranged from 0.07 to 0.20 for parent-offspring pairs and from 0.18 to 0.34 for sibling pairs, with higher correlations among first-degree relatives than among other types of relative pairs. A variance-component model adjusted for sex, body mass index (BMI), age and age-squared was used to estimate the heritability of BP traits, which ranged from 24% to 32%, with pulse pressure (PP) having the lowest estimate. The genetic correlations between systolic (SBP), diastolic (DBP) and mean arterial pressure (MAP) were high (81% to 94%), but their correlations with PP were lower (22% to 78%). The sibling recurrence risk ratios (λS) for hypertension and treatment were 1.60 and 2.04, respectively. These findings confirm the genetic component of BP traits in GS:SFHS and justify further work to investigate the genetic determinants of BP.
Genetic variants reported in recent large GWASs of BP traits were selected for genotyping in GS:SFHS using a custom-designed TaqMan® OpenArray®. The genotyping plate included 44 single nucleotide polymorphisms (SNPs) previously reported to be associated with BP or hypertension at genome-wide significance. A linear mixed model adjusted for age, age-squared, sex and BMI was used to test the association between the genetic variants and BP traits. Of the 43 variants that passed QC, 11 showed a statistically significant association with at least one BP trait. The phenotypic variance explained by these variants was 1.4%, 1.5%, 1.6% and 0.8% for SBP, DBP, MAP and PP, respectively. A genetic risk score (GRS) constructed from the selected variants was positively associated with BP level and hypertension prevalence, with an average increase of one mmHg for each 0.80-unit increase in the GRS across the different BP traits.
The impact of BP-lowering medication on genetic association studies of BP traits is well established, the typical practice being to add a fixed value (e.g. 15/10 mmHg) to the measured BP of treated individuals. Using the subset of participants with both treatment exposure sources (SRMs and EPRs), the influence of using either source to justify the addition of these fixed values on SNP association signals was analysed. BP phenotypes derived from EPRs were considered the true phenotypes, and those derived from SRMs were considered less accurate, with some phenotypic noise. Comparing SNP association signals for the four BP traits between the two models derived from the different adjustments showed that MAP was least affected by the phenotypic noise: the same significant SNPs overlapped between the two models for MAP, whereas the other BP traits showed some discrepancies between the two sources.
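A minimal sketch of the genetic risk score idea described above is given below, using simulated genotypes, effect sizes and phenotypes; it is not the GS:SFHS analysis pipeline, which used a linear mixed model with adjustment for relatedness, age, age-squared, sex and BMI.

```python
# A minimal sketch of constructing a weighted genetic risk score (GRS) and
# regressing a BP trait on it. All data below are simulated for illustration.

import numpy as np

rng = np.random.default_rng(1)
n_people, n_snps = 5_000, 43

# Simulated risk-allele dosages (0, 1, 2) and per-allele effect sizes (mmHg),
# standing in for published GWAS estimates.
freqs = rng.uniform(0.1, 0.9, n_snps)
genotypes = rng.binomial(2, freqs, size=(n_people, n_snps))
effects = rng.normal(0.3, 0.1, n_snps)

# Weighted GRS: sum over SNPs of dosage * effect size.
grs = genotypes @ effects
grs_std = (grs - grs.mean()) / grs.std()

# Simulated systolic BP with a small GRS contribution plus noise.
sbp = 125 + 1.0 * grs_std + rng.normal(0, 15, n_people)

# Simple least-squares regression of SBP on the standardized GRS.
x = np.column_stack([np.ones(n_people), grs_std])
beta, *_ = np.linalg.lstsq(x, sbp, rcond=None)
print(f"Estimated mmHg change per 1 SD increase in GRS: {beta[1]:.2f}")
```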

Relevance:

30.00%

Publisher:

Abstract:

This thesis charts the stakeholder communities, physical environment and daily life of two little-studied Qādiriyya Sufi shrines associated with Shaikh ʿAbd al-Qādir al-Jīlānī (1077–1165 AD), a 12th-century Ḥanbalī Muslim theologian and the posthumous founder of one of the oldest Sufi orders in Islam. The first shrine is based in Baghdad and houses his burial chamber; the second, on the outskirts of the city of ‘Aqra in the Kurdish region of northern Iraq, is that of his son Shaikh ʿAbd al-ʿAzīz (died 1206 AD). The latter was also known for lecturing in Ḥanbalī theology in the region, and was venerated for this as well as for his association with Shaikh ʿAbd al-Qādir. Driven by the research question "What shapes the identity orientations of these two Qādiriyya Sufi shrines in modern times?", the findings presented here are the result of field research carried out between November 2009 and February 2014. This field research revealed a complex context in which the two shrines existed and interacted, influenced by both Sufi and non-Sufi stakeholders who identified with and accessed these shrines to satisfy a variety of spiritual and practical needs, which in turn influenced the way each considered and viewed the two shrines from a number of orientations. These overlapping orientations include the Qādirī Sufi entity and the resting place of its patron saint; the orthodox Sunnī mosque with its muftī-imams, who are employed by the Iraqi government; the local Shīʿa community's neighbourhood saint's shrine and a destination for spiritual and practical aid; and the local provider of welfare to the poor of the city (soup kitchen, funeral parlour and electricity generation, amongst other services). The research findings also revealed a continuously changing and adapting Qādirī Sufi scene not immune from the national and regional socio-religio-political environments in which the two shrines exist: a non-Sufi national political class vying to influence and manipulate these shrines for their own purposes, and powerful national sectarian factions jostling to do the same. The mixture of stakeholders using and associating with the two shrines was found to be an influential shaper of these entities, both physically and spiritually. Through encountering and interacting with each other, most stakeholders contributed to maintaining and rejuvenating the two shrines, but some also sought to adapt and change them, driven by their particular orientation's perspective.

Relevance:

30.00%

Publisher:

Abstract:

Gastric (GC) and breast (BrC) cancer are two of the most common and deadly tumours. Different lines of evidence suggest a possible causative role of viral infections in both GC and BrC. Whole genome sequencing (WGS) technologies allow searching for viral agents in the tissues of patients with cancer, and have already contributed to establishing virus-cancer associations as well as to discovering new tumour viruses. The objective of this study was to document possible associations of viral infection with GC and BrC in Mexican patients. To gain insight into cost-effective conditions for experimental sequencing, we first carried out an in silico simulation of WGS. The next-generation platform Illumina GAIIx was then used to sequence GC and BrC tumour samples. While we did not find viral sequences in tissues from BrC patients, multiple reads matching Epstein-Barr virus (EBV) sequences were found in GC tissues. An end-point polymerase chain reaction confirmed an enrichment of EBV sequences in one of the sequenced GC samples, validating the next-generation sequencing-bioinformatics pipeline.
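As a toy illustration of the kind of read-screening step such a pipeline performs, the Python sketch below flags reads that share k-mers with a viral reference fragment; the sequences are made up, and the actual study relied on full next-generation sequencing and standard alignment tools rather than this simplified matching.

```python
# Illustrative k-mer screening: flag sequencing reads that share k-mers with a
# viral reference fragment. Reference and reads below are toy strings.

def kmers(seq: str, k: int = 12) -> set:
    """Return the set of all k-mers in a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def screen_reads(reads, reference, k: int = 12, min_shared: int = 3):
    """Return reads sharing at least `min_shared` k-mers with the reference."""
    ref_kmers = kmers(reference, k)
    hits = []
    for read in reads:
        shared = len(kmers(read, k) & ref_kmers)
        if shared >= min_shared:
            hits.append((read, shared))
    return hits

# Toy example: a fake 'viral' reference and three short reads, two of which
# overlap the reference.
reference = "ATGGCCTACGTTAGCCATGGACTTACGGATCCGTTAACGGCTA"
reads = [
    "TACGTTAGCCATGGACTTACGGATCC",   # overlaps the reference
    "GGGGGGTTTTTTAAAAAACCCCCCGG",   # unrelated
    "CATGGACTTACGGATCCGTTAACGGC",   # overlaps the reference
]
for read, shared in screen_reads(reads, reference):
    print(f"{read}  shares {shared} 12-mers with the reference")
```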

Relevance:

30.00%

Publisher:

Abstract:

Non-orthogonal multiple access (NOMA) is emerging as a promising multiple access technology for fifth-generation cellular networks to address the fast-growing mobile data traffic. It applies superposition coding at the transmitter, allowing simultaneous allocation of the same frequency resource to multiple intra-cell users, while successive interference cancellation is used at the receivers to cancel intra-cell interference. User pairing and power allocation (UPPA) is a key design aspect of NOMA. Existing UPPA algorithms are mainly based on exhaustive search, whose extensive computational complexity can severely affect NOMA performance. A fast proportional fairness (PF) scheduling based UPPA algorithm is proposed to address this problem. The novel idea is to form user pairs around the users with the highest PF metrics, with pre-configured fixed power allocation. System-level simulation results show that the proposed algorithm is significantly faster than the existing exhaustive search algorithm (seven times faster for a scenario with 20 users), with negligible throughput loss.
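A minimal sketch of the pairing idea is shown below, assuming a simple rule that pairs the unpaired user with the highest PF metric with the unpaired user whose channel gain differs most, under a pre-configured fixed power split; the exact pairing rule and power values are assumptions for illustration, not the algorithm from the paper.

```python
# Illustrative PF-driven NOMA user pairing with fixed power allocation.
# Channel model, pairing rule and power split are assumptions for this sketch.

import numpy as np

rng = np.random.default_rng(7)
n_users = 20

channel_gain = rng.exponential(1.0, n_users)      # instantaneous channel gains
avg_throughput = rng.uniform(0.5, 2.0, n_users)   # past average rates
inst_rate = np.log2(1.0 + 10.0 * channel_gain)    # single-user rate at 10 dB SNR
pf_metric = inst_rate / avg_throughput            # proportional-fairness metric

unpaired = set(range(n_users))
pairs = []
# Repeatedly take the unpaired user with the highest PF metric and pair it with
# the unpaired user whose channel gain differs most (strong/weak pairing
# favours successive interference cancellation).
while len(unpaired) >= 2:
    anchor = max(unpaired, key=lambda u: pf_metric[u])
    unpaired.remove(anchor)
    partner = max(unpaired, key=lambda u: abs(channel_gain[u] - channel_gain[anchor]))
    unpaired.remove(partner)
    # Pre-configured fixed power split: more power to the weaker user.
    strong, weak = sorted((anchor, partner), key=lambda u: channel_gain[u], reverse=True)
    pairs.append({"strong": strong, "weak": weak, "power_strong": 0.2, "power_weak": 0.8})

for pair in pairs:
    print(pair)
```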

Relevance:

30.00%

Publisher:

Abstract:

The presence of multiple stellar populations in globular clusters (GCs) is now well accepted; however, very little is known about their origin. In this Thesis, I study how multiple populations formed and evolved by means of customized 3D numerical simulations, in light of the most recent spectroscopic and photometric observations of the Local and high-redshift Universe. Numerical simulations are the perfect tool to interpret these data: hydrodynamic simulations are suited to studying the early phases of GC formation, following the gas behavior in great detail, while N-body codes permit tracing the stellar component. First, we study the formation of second-generation stars in a rotating massive GC. We assume that second-generation stars are formed out of the ejecta of asymptotic giant branch (AGB) stars, diluted by external pristine gas. We find that, for low pristine gas density, stars formed mainly out of AGB ejecta rotate faster than stars formed out of more diluted gas, in qualitative agreement with current observations. Then, assuming a similar setup, we explored whether Type Ia supernovae affect second-generation star formation and chemical composition. We show that the evolution depends on the density of the infalling gas but that, in general, an iron spread develops, which may explain the spread observed in some massive GCs. Finally, we focused on the long-term evolution of a GC composed of two populations and orbiting within the Milky Way disk. We derived that, for an extended first population and a low-mass second one, the cluster loses almost 98 percent of its initial first-population mass, and the GC mass can be as much as 20 times lower after a Hubble time. Under these conditions, the derived fraction of second-population stars reproduces the observed value, which is one of the strongest constraints on GC mass loss.
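As a back-of-the-envelope illustration of the dilution scenario described above, the sketch below mixes AGB ejecta with pristine gas at several dilution fractions and reports the resulting helium mass fraction; the abundance values are representative numbers chosen for the example, not results of the thesis simulations.

```python
# Illustrative mixing calculation: second-generation stars form from AGB ejecta
# diluted with pristine gas. Abundances and dilution fractions are
# representative example values only.

def mixed_abundance(x_agb: float, x_pristine: float, f_pristine: float) -> float:
    """Mass-weighted abundance of gas made of AGB ejecta diluted with a mass
    fraction `f_pristine` of pristine gas."""
    return (1.0 - f_pristine) * x_agb + f_pristine * x_pristine

# Helium mass fraction: AGB ejecta are helium-enriched relative to pristine gas.
Y_AGB, Y_PRISTINE = 0.36, 0.25

for f in (0.0, 0.3, 0.6, 0.9):
    y = mixed_abundance(Y_AGB, Y_PRISTINE, f)
    print(f"pristine mass fraction {f:.1f} -> helium mass fraction Y = {y:.3f}")
```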