18 results for second-generation sequencing

in Aston University Research Archive


Relevance: 100.00%

Abstract:

Drying is a major and challenging step in the pre-treatment of biomass for the production of second generation synfuels for transport. Biomass feedstocks are mostly wet and need to be dried from 30-60 wt% moisture content to about 10-15 wt%. The present survey aims to define and evaluate a few of the most promising optimised concepts for biomass pre-treatment schemes in the production of second generation synfuels for transport. The most promising commercially available drying processes were reviewed, focusing on the applications, operational factors and emissions of dryers. The most common dryers now applied to biomass in bio-energy plants are direct rotary dryers, but the use of steam drying techniques is increasing. Steam drying systems enable the integration of the dryer with existing energy sources. In addition to integration, emissions and fire or explosion risks have to be considered when selecting a dryer for a plant. Steam drying produces no gaseous emissions, but the aqueous effluents often need treatment. Concepts for biomass pre-treatment were defined for two different cases: a large-scale wood-based gasification synfuel production and a small-scale pyrolysis process based on wood chips and miscanthus bundles. For the first case a pneumatic conveying steam dryer was suggested; in the second case the flue gas is used as the drying medium in a direct or indirect rotary dryer.
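
As a rough illustration of the drying duty these moisture targets imply, the sketch below computes the evaporation load per kilogram of wet feed from a simple wet-basis mass balance; the 50 wt% inlet figure and the latent-heat value are illustrative assumptions, not numbers from the survey.

```python
# Illustrative mass/energy balance for drying biomass from 50 wt% to 12 wt%
# moisture (wet basis). Values are assumptions for illustration, not figures
# from the survey itself.

def water_removed_per_kg_wet(m_in: float, m_out: float) -> float:
    """kg of water evaporated per kg of wet feed; moistures are wet-basis fractions."""
    dry_solids = 1.0 - m_in                 # kg dry solids per kg wet feed
    mass_out = dry_solids / (1.0 - m_out)   # total product mass at target moisture
    return 1.0 - mass_out                   # evaporated water

H_VAP = 2.26  # MJ/kg, approximate latent heat of vaporisation of water

evap = water_removed_per_kg_wet(0.50, 0.12)
print(f"Water removed: {evap:.3f} kg per kg wet feed")
print(f"Minimum evaporation duty: {evap * H_VAP:.2f} MJ per kg wet feed")
```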

Relevance: 100.00%

Abstract:

Biomass-To-Liquid (BTL) is one of the most promising low carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass: the so-called "second generation biofuels" that, unlike first generation biofuels, can make use of a wider range of biomass feedstocks than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialised. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products, as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and were characterised by different fuel synthesis processes: Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies were compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model could affect its output (i.e. production cost). This was the first time that an uncertainty analysis had been included in a published techno-economic assessment of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification, owing to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33%, or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
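
The Monte Carlo uncertainty analysis described can be illustrated with a minimal sketch: sample the uncertain cost-model inputs from assumed distributions and collect the resulting production-cost distribution. All parameter names, values and distributions below are hypothetical, not taken from the study's cost model.

```python
# Minimal Monte Carlo sketch of propagating cost-model input uncertainty to a
# production-cost estimate. Distributions and values are invented assumptions.
import random

N = 100_000
costs = []
for _ in range(N):
    capital = random.triangular(180, 260, 220)   # M pounds, total investment
    feedstock = random.uniform(45, 75)           # pounds per t dry biomass
    fuel_yield = random.gauss(0.18, 0.02)        # t fuel per t dry biomass
    throughput = 500_000                         # t dry biomass per year (fixed)
    annualised_capex = capital * 1e6 * 0.13      # simple capital charge factor
    opex = feedstock * throughput
    fuel_out = max(fuel_yield, 0.01) * throughput
    costs.append((annualised_capex + opex) / fuel_out)  # pounds per t fuel

costs.sort()
print(f"median production cost: {costs[N // 2]:.0f} pounds/t")
print(f"90% interval: {costs[int(0.05 * N)]:.0f}-{costs[int(0.95 * N)]:.0f} pounds/t")
```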

Relevance: 100.00%

Abstract:

INTRODUCTION: Bipolar disorder requires long-term treatment, but non-adherence is a common problem. Antipsychotic long-acting injections (LAIs) have been suggested to improve adherence, but none is licensed in the UK for bipolar disorder. However, the use of second-generation antipsychotic (SGA) LAIs in bipolar disorder is not uncommon, although there is a lack of systematic review in this area. This study aims to systematically review the safety and efficacy of SGA LAIs in the maintenance treatment of bipolar disorder. METHODS AND ANALYSIS: The protocol is based on the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) and will include only randomised controlled trials comparing SGA LAIs in bipolar disorder. PubMed, EMBASE, CINAHL, Cochrane Library (CENTRAL), PsycINFO, LILACS and http://www.clinicaltrials.gov will be searched, with no language restriction, from 2000 to January 2016, as the first SGA LAIs came to the market after 2000. Manufacturers of SGA LAIs will also be contacted. The primary efficacy outcome is relapse rate, delayed time to relapse or reduction in hospitalisation; the primary safety outcomes are drop-out rates, all-cause discontinuation and discontinuation due to adverse events. Qualitative reporting of evidence will be based on the 21 items listed in the standards for reporting qualitative research (SRQR), focusing on study quality (assessed using the Jadad score, allocation concealment and data analysis), risk of bias and effect size. Publication bias will be assessed using funnel plots. If sufficient data are available, a meta-analysis will be performed with relative risk as the primary effect size, presented with 95% CIs. Sensitivity analyses, conditional on the number of studies and sample size, will be carried out for manic versus depressive symptoms and monotherapy versus adjunctive therapy.
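
As a small illustration of the planned primary effect size, the sketch below computes a relative risk with its 95% CI using the standard log-RR normal approximation; the trial counts are invented, not data from any included study.

```python
# Relative risk (RR) with a 95% CI from a 2x2 trial table, via the standard
# normal approximation on log(RR). Counts below are hypothetical.
import math

def relative_risk(a: int, n1: int, c: int, n2: int) -> tuple[float, float, float]:
    """a/n1 = events/total in treatment arm; c/n2 = events/total in control arm."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)   # SE of log(RR)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# e.g. 18/120 relapses on an SGA LAI vs 34/118 on comparator (hypothetical)
rr, lo, hi = relative_risk(18, 120, 34, 118)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```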

Relevance: 100.00%

Abstract:

We have previously described ProxiMAX, a technology that enables the fabrication of precise, combinatorial gene libraries via codon-by-codon saturation mutagenesis. ProxiMAX was originally performed using manual, enzymatic transfer of codons via blunt-end ligation. Here we present Colibra™: an automated, proprietary version of ProxiMAX used specifically for antibody library generation, in which double-codon hexamers are transferred during the saturation cycling process. The reduction in process complexity, the resulting library quality and an unprecedented saturation of up to 24 contiguous codons are described. Utility of the method is demonstrated via fabrication of complementarity determining regions (CDRs) in antibody fragment libraries and next generation sequencing (NGS) analysis of their quality and diversity.
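
As a minimal illustration of the kind of NGS-based diversity analysis mentioned (not the authors' actual pipeline), the sketch below tallies codon usage at each saturated position across in-frame reads; the toy reads are invented.

```python
# Count codon usage per position across equal-length, in-frame NGS reads
# covering a saturated CDR, to gauge how fully each position is sampled.
from collections import Counter

def codon_usage(reads: list[str], codon_len: int = 3) -> list[Counter]:
    """Per-position codon counts across equal-length in-frame reads."""
    n_codons = len(reads[0]) // codon_len
    usage = [Counter() for _ in range(n_codons)]
    for read in reads:
        for i in range(n_codons):
            usage[i][read[i * codon_len:(i + 1) * codon_len]] += 1
    return usage

# Toy in-frame reads over a 2-codon stretch (illustrative only)
reads = ["GCTTGG", "GATTGG", "TGGGCT", "GCTGAT"]
for pos, counts in enumerate(codon_usage(reads)):
    print(f"codon position {pos}: {len(counts)} distinct codons, {dict(counts)}")
```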

Relevance: 80.00%

Abstract:

The advent of the Integrated Services Digital Network (ISDN) led to the standardisation of the first video codecs for interpersonal video communications, followed closely by the development of standards for the compression, storage and distribution of digital video in the PC environment, mainly targeted at CD-ROM storage. At the same time, the second-generation digital wireless networks, and the third-generation networks being developed, have enough bandwidth to support digital video services. The radio propagation medium is a difficult environment in which to deploy low bit error rate, real time services such as video. The video coding standards designed for ISDN and storage applications were targeted at bit error rates orders of magnitude lower than those typically experienced on wireless networks. This thesis is concerned with the transmission of digital, compressed video over wireless networks. It investigates the behaviour of motion compensated, hybrid interframe DPCM/DCT video coding algorithms, which form the basis of current coding algorithms, in the presence of the high bit error rates commonly found on digital wireless networks. A group of video codecs, based on the ITU-T H.261 standard, is developed which is robust to the burst errors experienced on radio channels. The radio link is simulated at a low level to generate typical error files that closely model real-world situations, in a Rayleigh fading environment perturbed by co-channel interference, and on frequency selective channels which introduce intersymbol interference. Typical anti-multipath techniques, such as antenna diversity, are deployed to mitigate the effects of the channel. Link layer error control techniques are also investigated.
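
To illustrate the bursty error behaviour such error files capture, the sketch below implements a Gilbert-Elliott two-state channel, a standard burst-error model; it is not necessarily the model used in the thesis, and all transition and error probabilities are illustrative assumptions.

```python
# Gilbert-Elliott two-state burst-error channel: a "good" state with a low
# bit error rate and a "bad" (fading) state with a high one, producing the
# bursty error patterns characteristic of radio links.
import random

def gilbert_elliott(n_bits: int, p_gb: float = 0.001, p_bg: float = 0.05,
                    ber_good: float = 1e-5, ber_bad: float = 0.1) -> list[int]:
    """Return a 0/1 error pattern of length n_bits (1 = bit error)."""
    errors, bad = [], False
    for _ in range(n_bits):
        # enter the bad state with prob p_gb; leave it with prob p_bg
        bad = (random.random() < p_gb) if not bad else (random.random() >= p_bg)
        ber = ber_bad if bad else ber_good
        errors.append(1 if random.random() < ber else 0)
    return errors

pattern = gilbert_elliott(1_000_000)
print(f"overall BER: {sum(pattern) / len(pattern):.2e}")
```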

Relevance: 80.00%

Abstract:

Groupe Spécial Mobile (GSM) has been developed as the pan-European second generation of digital mobile systems. GSM operates in the 900 MHz frequency band and employs digital technology instead of the analogue technology of its predecessors. Digital technology enables the GSM system to operate in much smaller zones than the analogue systems. The GSM system offers greater roaming facilities to its subscribers, extending throughout the countries that have installed the system, and could be seen as a further enhancement to European integration. GSM has adopted a contention-based protocol for multipoint-to-point transmission: the slotted-ALOHA medium access protocol is used to coordinate the transmission of channel request messages between the scattered mobile stations. Collisions still happen when more than one mobile station with the same random reference number attempts to transmit in the same time-slot. In this research, a modified version of this protocol has been developed in order to reduce the number of collisions and hence increase the random access channel throughput compared with the existing protocol. The performance of the protocol has been evaluated using simulation methods. Owing to the growing demand for mobile radio telephony as well as for data services, optimal usage of the scarce available radio spectrum is becoming increasingly important. In this research, a protocol has been developed whereby the number of transmitted information packets over the GSM system is increased without any additional increase in the allocated radio spectrum. Simulation results are presented to show the improvements achieved by the proposed protocol. Cellular mobile radio networks commonly respond to an increase in service demand by using smaller coverage areas, and as a result the volume of signalling exchanges increases. In this research, a proposal for interconnecting the various entities of the mobile radio network over future broadband networks based on the IEEE 802.6 Metropolitan Area Network (MAN) is outlined. Simulation results are presented to show the benefits achieved by interconnecting these entities over the broadband networks.
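
The contention behaviour that the modified random access protocol targets can be illustrated with a minimal slotted-ALOHA simulation; the station count and per-slot transmission probability below are arbitrary, not GSM parameters.

```python
# Minimal slotted-ALOHA throughput simulation: a slot succeeds only when
# exactly one station transmits; any overlap is a collision.
import random

def slotted_aloha_throughput(n_stations: int, p_tx: float, n_slots: int) -> float:
    """Fraction of slots carrying exactly one (successful) transmission."""
    successes = 0
    for _ in range(n_slots):
        transmitters = sum(1 for _ in range(n_stations) if random.random() < p_tx)
        if transmitters == 1:
            successes += 1
    return successes / n_slots

# Peak theoretical throughput is 1/e (about 0.368) at offered load G = 1
print(slotted_aloha_throughput(n_stations=50, p_tx=1 / 50, n_slots=100_000))
```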

Relevance: 80.00%

Abstract:

The advent of personal communication systems within the last decade has depended upon the utilisation of advanced digital schemes for source and channel coding and for modulation. The inherent digital nature of the communications processing has allowed the convenient incorporation of cryptographic techniques to implement security in these communications systems. There are various security requirements, of both the service provider and the mobile subscriber, which may be provided for in a personal communications system. Such security provisions include the privacy of user data, the authentication of communicating parties, the provision for data integrity, and the provision for both location confidentiality and party anonymity. This thesis is concerned with an investigation of the private-key and public-key cryptographic techniques pertinent to the security requirements of personal communication systems, and an analysis of the security provisions of second-generation personal communication systems is presented. Particular attention has been paid to the properties of the cryptographic protocols employed in current second-generation systems. It has been found that certain security-related protocols implemented in these systems have specific weaknesses. A theoretical evaluation of these protocols has been performed using formal analysis techniques, and certain assumptions made during the development of the systems are shown to contribute to the security weaknesses. Various attack scenarios which exploit these protocol weaknesses are presented. The Fiat-Shamir zero-knowledge cryptosystem is presented as an example of how asymmetric algorithm cryptography may be employed as part of an improved security solution. Various modifications to this cryptosystem have been evaluated and their critical parameters are shown to be capable of being optimised to suit a particular application. The implementation of such a system using current smart card technology has been evaluated.
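
As an illustration of the Fiat-Shamir scheme discussed, the sketch below runs repeated rounds of the basic zero-knowledge identification protocol; the tiny modulus and secret are for demonstration only, and this is the textbook scheme rather than any of the thesis's modified variants.

```python
# One round of basic Fiat-Shamir zero-knowledge identification. A real
# deployment uses a large RSA-style modulus n = p*q; the values here are toys.
import random

n = 3233                 # toy modulus (61 * 53); far too small in practice
s = 123                  # prover's secret
v = pow(s, 2, n)         # public key: v = s^2 mod n

def prove_round() -> bool:
    r = random.randrange(1, n)
    x = pow(r, 2, n)                  # prover commits x = r^2 mod n
    e = random.randrange(2)           # verifier's challenge bit
    y = (r * pow(s, e, n)) % n        # prover responds y = r * s^e mod n
    return pow(y, 2, n) == (x * pow(v, e, n)) % n   # check y^2 = x * v^e mod n

# Repeating t rounds reduces an impostor's success probability to 2^-t
print(all(prove_round() for _ in range(20)))
```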

Relevance: 80.00%

Abstract:

The use of antibiotics was investigated in twelve acute hospitals in England. Data were collected electronically and by questionnaire for the financial years 2001/2, 2002/3 and 2003/4. Hospitals were selected on the basis of their Medicines Management Self-Assessment Scores (MMAS) and included a cohort of three hospitals with integrated electronic prescribing systems. The total sample size was 6.65% of English NHS activity for 2001/2, based on Finished Consultant Episode (FCE) numbers. Data collected included all antibiotics dispensed (ATC category J01), hospital activity (FCEs and bed-days), Medicines Management Self-Assessment Scores, Antibiotic Medicines Management Scores (AMS), the Primary Care Trust (PCT) of origin of referral populations, PCT antibiotic prescribing rates, and the Index of Multiple Deprivation for each PCT. The DDD/FCE (Defined Daily Dose/FCE) was found to correlate with the DDD/100 bed-days (r = 0.74, p
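
The two usage metrics being correlated can be illustrated with a short sketch; the dispensing and activity figures below are invented, not data from the study.

```python
# The two standard antibiotic-usage denominators compared in the study:
# DDDs per finished consultant episode and DDDs per 100 occupied bed-days.
# All figures below are hypothetical.

def ddd_per_fce(total_ddd: float, fces: int) -> float:
    return total_ddd / fces

def ddd_per_100_beddays(total_ddd: float, beddays: int) -> float:
    return total_ddd * 100 / beddays

total_ddd = 250_000      # defined daily doses dispensed in a year (hypothetical)
fces = 90_000            # finished consultant episodes (hypothetical)
beddays = 320_000        # occupied bed-days (hypothetical)

print(f"DDD/FCE:          {ddd_per_fce(total_ddd, fces):.2f}")
print(f"DDD/100 bed-days: {ddd_per_100_beddays(total_ddd, beddays):.1f}")
```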

Relevance: 80.00%

Abstract:

This thesis examines the ways that libraries have employed computers to assist with housekeeping operations. It considers the relevance of such applications to company libraries in the construction industry, and describes more specifically the development of an integrated cataloguing and loan system. A review of the main features in the development of computerised ordering, cataloguing and circulation control systems shows that fully integrated packages are beginning to be completed, and that some libraries are introducing second generation programs. Cataloguing is the most common activity to be computerised, both at national and company level. Results from a sample of libraries in the construction industry suggest that the only computerised housekeeping system is at Taylor Woodrow. Most of the firms have access to an in-house computer, and some of the libraries, particularly those in firms of consulting engineers, might benefit from computerisation, but there are differing attitudes amongst the librarians towards the computer. A detailed study of the library at Taylor Woodrow resulted in a feasibility report covering all the areas of its activities. One of the main suggestions was the possible use of a computerised loans and cataloguing system. An integrated system to cover these two areas was programmed in Fortran and implemented. This new system provides certain benefits and saves staff time, but at the cost of time on the computer. Some improvements could be made by reprogramming, but it provides a general system for small technical libraries. A general equation comparing costs for manual and computerised operations is progressively simplified to a form where the annual saving from the computerised system is expressed in terms of staff and computer costs and the size of the library. This equation gives any library an indication of the savings or extra cost which would result from using the computerised system.
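
The thesis's final equation is not reproduced in this abstract; purely as an illustration of the stated form, an annual-saving expression in terms of staff costs, computer costs and library size might look like the following, where every symbol is hypothetical.

```latex
% Hypothetical form only; the thesis's actual equation is not given in the
% abstract. S: annual saving; L: library size (e.g. transactions per year);
% c_s, c_c: unit staff and computer costs; alpha, beta: time per transaction.
\[
  S = \alpha \, L \, c_s \;-\; \beta \, L \, c_c
\]
```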

Relevance: 80.00%

Abstract:

Wireless network optimisations are typically designed and evaluated independently of one another, under the assumption that they can be applied jointly or separately. In this paper we analyse rate algorithms in wireless networks. Since IEEE wireless network standards differ in their features, data rate is one of the important parameters on which network performance depends. The optimisation of such a network depends on the behaviour of a particular rate algorithm in a given network scenario. We consider first- and second-generation rate algorithms, whose task is to select an appropriate data rate that any available wireless network can utilise for transmission in order to achieve good performance. We design and analyse a wireless network and present the results obtained for several rate algorithms, including ONOE and AARF.
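
As a concrete illustration of one of the second-generation algorithms named, here is a simplified AARF (Adaptive Auto Rate Fallback) sketch: climb to a higher rate after a run of successes, fall back on failure, and double the success threshold whenever a freshly probed rate fails at once. The rate table, thresholds and single-failure fallback are simplifying assumptions, not the paper's exact configuration.

```python
# Simplified AARF rate-adaptation state machine (802.11-style rates).
RATES = [1, 2, 5.5, 11, 18, 24, 36, 48, 54]   # Mbit/s, illustrative

class AARF:
    def __init__(self) -> None:
        self.idx = 0            # current position in RATES
        self.successes = 0
        self.threshold = 10     # successes needed before probing a higher rate
        self.probing = False    # True immediately after stepping up

    def on_tx_result(self, ok: bool) -> float:
        """Update state after one transmission; return the rate to use next."""
        if ok:
            self.probing = False
            self.successes += 1
            if self.successes >= self.threshold and self.idx < len(RATES) - 1:
                self.idx += 1                 # probe the next higher rate
                self.successes = 0
                self.probing = True
        else:
            if self.probing:                  # probed rate failed straight away
                self.threshold = min(self.threshold * 2, 50)
                self.probing = False
            else:
                self.threshold = 10           # plain failure: reset probe interval
            if self.idx > 0:
                self.idx -= 1                 # fall back
            self.successes = 0
        return RATES[self.idx]

aarf = AARF()
for ok in [True] * 10 + [False] + [True] * 3:
    rate = aarf.on_tx_result(ok)
print(f"rate after sequence: {rate} Mbit/s")
```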

Relevance: 80.00%

Abstract:

To capture the genomic profiles of histone modification, chromatin immunoprecipitation (ChIP) is combined with next generation sequencing, in a technique called ChIP-seq. However, the enriched regions generated from ChIP-seq data are evaluated only against the limited knowledge acquired from manually examining the relevant biological literature. This paper proposes a novel framework which integrates multiple knowledge sources such as biological literature, Gene Ontology, and microarray data. To analyse ChIP-seq data for histone modification precisely, knowledge integration is based on a unified probabilistic model. The model is employed to re-rank the enriched regions generated by peak finding algorithms. By filtering the re-ranked enriched regions with a predefined threshold, more reliable and precise results can be generated. The combination of multiple knowledge sources with the peak finding algorithm produces a new paradigm for ChIP-seq data analysis. © (2012) Trans Tech Publications, Switzerland.
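
The re-ranking step can be illustrated with a minimal sketch (not the paper's unified probabilistic model): combine a peak caller's score with evidence scores from the external knowledge sources in a simple weighted log-linear fashion. Weights, field names and regions below are invented.

```python
# Re-rank ChIP-seq enriched regions by a weighted log-linear combination of a
# peak score and external evidence scores (literature, GO, microarray), each
# normalised to 0..1. Missing evidence defaults to a neutral 0.5.
import math

WEIGHTS = {"peak": 1.0, "literature": 0.5, "go": 0.3, "microarray": 0.4}

def combined_score(region: dict) -> float:
    return sum(w * math.log(max(region.get(k, 0.5), 1e-9))
               for k, w in WEIGHTS.items())

regions = [
    {"name": "chr1:100-600", "peak": 0.9, "literature": 0.2, "go": 0.8},
    {"name": "chr2:400-900", "peak": 0.7, "literature": 0.9, "microarray": 0.9},
    {"name": "chr3:150-500", "peak": 0.95},
]
for r in sorted(regions, key=combined_score, reverse=True):
    print(f"{r['name']}: {combined_score(r):.3f}")
```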

Relevance: 80.00%

Abstract:

Genomics, proteomics and metabolomics are three areas that are routinely applied throughout the drug-development process as well as after a product enters the market. This review discusses all three 'omics, reporting on the key applications, techniques, recent advances and expectations of each. Genomics, mainly through the use of novel and next-generation sequencing techniques, has advanced areas of drug discovery and development through the comparative assessment of normal and diseased-state tissues, transcription and/or expression profiling, side-effect profiling, pharmacogenomics and the identification of biomarkers. Proteomics, through techniques including isotope-coded affinity tags, stable isotopic labeling by amino acids in cell culture, isobaric tags for relative and absolute quantification, multidimensional protein identification technology, activity-based probes, protein/peptide arrays, phage displays and two-hybrid systems, is utilized in multiple areas of the drug development pipeline, including target and lead identification, compound optimization, the clinical trials process and post-market analysis. Metabolomics, although the most recent and least developed of the three 'omics considered in this review, provides a significant contribution to drug development through systems biology approaches. Already implemented to some degree in the drug-discovery industry, and used in applications spanning target identification through to toxicological analysis, metabolic network understanding is essential in generating future discoveries.

Relevance: 80.00%

Abstract:

Dwindling fossil fuel reserves, and growing concerns over CO2 emissions and associated climate change, are driving the quest for renewable feedstocks to provide alternative, sustainable fuel sources. Catalysis has a rich history of facilitating energy efficient, selective molecular transformations, and in a post-petroleum era it will play a pivotal role in overcoming the scientific and engineering barriers to economically viable, and sustainable, biofuels derived from renewable resources. The production of second generation biofuels, derived from biomass sourced from inedible crop components (e.g. agricultural or forestry waste) or from alternative non-food crops such as switchgrass or Jatropha curcas that require minimal cultivation, necessitates new heterogeneous catalysts and processes to transform these polar and viscous feedstocks [1]. Here we show how advances in the rational design of nanoporous solid acids and bases, and their utilisation in novel continuous reactors, can deliver superior performance in the energy-efficient esterification and transesterification of bio-oil components into biodiesel [2-4].

Notes:
[1] K. Wilson, A.F. Lee, Catal. Sci. Technol. 2012, 2, 884.
[2] J. Dhainaut, J.-P. Dacquin, A.F. Lee, K. Wilson, Green Chem. 2010, 12, 296.
[3] C. Pirez, J.-M. Caderon, J.-P. Dacquin, A.F. Lee, K. Wilson, ACS Catal. 2012, 2, 1607.
[4] J.J. Woodford, J.-P. Dacquin, K. Wilson, A.F. Lee, Energy Environ. Sci. 2012, 5, 6145.

Relevance: 80.00%

Abstract:

Two series of novel modified silicas have been prepared in which individual dendritic branches have been attached to aminopropylsilica using standard peptide coupling methodology. The dendritic branches are composed of enantiomerically pure L-lysine building blocks, and hence the modified silicas have the potential to act as chiral stationary phases in chromatography. In one series of modified silicas, the surface of the dendritic branch consists of Boc carbamate groups, whereas the other has benzoyl amide surface groups. Different coupling reagents have been investigated in order to maximise the loading onto the solid phase. The new supported dendritic materials have been fully characterised, with properties of the bulk material determined by elemental analysis, ¹³C NMR and IR spectroscopy, whereas XPS provides important information about the surface of the modified silica exposed to the incident X-rays, the key region in which the potential chromatographic performance of these materials will take place. Although the bulk analyses indicate that loading of the dendritic branch onto silica decreases with increasing dendritic generation (and consequently steric bulk), XPS indicates that the optimum surface coverage is actually obtained at the second generation of dendritic growth.