42 results for Network scale-up method
Abstract:
A literature review of work on batch and continuous chromatographic biochemical reactor-separators has been made. The major part of this work has involved the development of a batch chromatographic reactor-separator for the production of dextran and fructose by the action of the enzyme dextransucrase on sucrose. In this reactor, reaction and separation occur simultaneously, reducing downstream processing and product isolation compared with the existing industrial process. The chromatographic reactor consisted of a glass column packed with a stationary phase of cross-linked polystyrene resin in the calcium form. The mobile phase consisted of dilute dextransucrase in deionised water. Initial experiments were carried out on a reactor-separator with an internal diameter of 0.97 cm and a length of 1.5 m. To study the effect of scale-up, the reactor diameter was doubled to 1.94 cm and the length increased to 1.75 m. The results have shown that the chromatographic reactor uses more enzyme than a conventional batch reactor for a given conversion of sucrose, and that an increase in void volume results in higher conversions of sucrose. The molecular weight distribution of dextran produced by the chromatographic reactor was compared with that from a conventional batch reactor. At 20% w/v sucrose concentration, the chromatographic reactor produces 30% more dextran of molecular weight greater than 150,000 daltons than conventional reactors. This is because some of the fructose molecules are prevented from acting as acceptors in the chromatographic reactor by their removal from the reaction zone. In the conventional reactor this is not possible, and a greater proportion of low molecular weight dextran, which has little clinical use, is therefore produced. A theoretical model was developed to describe the behaviour of the reactor-separator, and this model was simulated by computer. The simulation predictions showed good agreement with experimental results at high eluent flowrates and low conversions.
Abstract:
Covalent attachment of the anticancer drugs temozolomide (Temodal) and mitozolomide to triplex-forming oligonucleotides (TFOs) is a potential way of targeting these alkylating agents to specific gene sequences to maximise site-selectivity. In this work, polypyrimidine TFO conjugates of both drugs were synthesised and targeted to duplex DNA in an attempt to effect site-specific alkylation of guanine residues. Concurrently, in an attempt to enhance the triple helix stability of TFOs at neutral pH, the thermal stabilities of triplexes formed from TFOs containing isoguanine, 2-O-benzyl- and 2-O-allyl-adenine were evaluated. A novel cleavage and deprotection procedure was developed which allowed the solid phase synthesis of the base-sensitive TFO-drug conjugates using a recently developed silyl-linked controlled pore glass (SLCPG) support. Covalent attachment of either temozolomide or mitozolomide at the 5'-end of TFO conjugates caused no destabilisation of the triplexes studied. The synthesis of a phosphoramidite derivative of mitozolomide enabled direct incorporation of this reagent into a model sequence during DNA synthesis. After cleavage and deprotection of the TFO-drug conjugate, the 5'-end mitozolomide residue was found to have decomposed, presumably as a result of ring-opening of the tetrazinone ring. The base-sensitive antibacterial and antitumour agent metronidazole was also successfully incorporated at the 5'-end of the oligonucleotide d(T8) using conventional methods. Two C2-substituted derivatives of 2'-deoxyadenosine containing 2-O-benzyl and 2-O-allyl groups were synthesised. Hydrogenolysis of the 2-O-benzyl analogue provided a useful route, amenable to scale-up, for the synthesis of the rare nucleoside 2'-deoxyisoguanosine (isoG). Both the 2-O-allyl and 2-O-benzyl derivatives were incorporated into TFO sequences using phosphoramidite methodology. Thermal melting experiments showed that the 2-O-allyl and 2-O-benzyl groups caused marked destabilisation of the triple helices studied, in contrast to hexose-DNA duplexes, where aralkyl substituents caused significant stabilisation. TFOs containing isoG were synthesised by Pd(0)-catalysed deallylation of 2-O-allyl adenine residues. These sequences containing isoG, in its N3-H or O2-H tautomeric form, formed triple helices as stable as those containing adenine.
Combinatorial approach to multi-substituted 1,4-benzodiazepines as novel non-peptide CCK antagonists
Abstract:
For the drug discovery process, a library of 168 multisubstituted 1,4-benzodiazepines was prepared by a 5-step solid phase combinatorial approach. Substituents were varied at the 3-, 5-, 7- and 8-positions of the benzodiazepine scaffold. The combinatorial library was evaluated in a radiolabelled CCK binding assay, and CCKA (alimentary) and CCKB (brain) selective lead structures were discovered. The template of CCKA-selective 1,4-benzodiazepin-2-ones bearing the tryptophan moiety was chemically modified by selective alkylation and acylation reactions. These studies provided a series of analogues of the natural product Asperlicin. The fully optimised Asperlicin-related compound possessed CCKA activity similar to that of the naturally occurring compound. 3-Alkylated 1,4-benzodiazepines with selectivity towards the CCKB receptor subtype were optimised with respect to (a) the lipophilic side chain and (b) the 2-aminophenyl ketone moiety, together with some stereochemical changes. A C3 unit in the 3-position of 1,4-benzodiazepines gave CCKB activity in the nanomolar range. Further SAR optimisation at the N1-position by selective alkylation resulted in improved CCKB binding with potentially decreased activity on the GABAA/benzodiazepine receptor complex. The in vivo studies revealed two N1-alkylated compounds, containing unsaturated alkyl groups, with anxiolytic properties. Alternative chemical approaches have been developed, including a route suitable for scale-up of the desired target molecule, in order to provide sufficient quantities for further in vivo evaluation.
Abstract:
Component-based development (CBD) has become an important emerging topic in the software engineering field. It promises long-sought-after benefits such as increased software reuse, reduced development time to market and, hence, reduced software production cost. Despite this huge potential, the lack of reasoning support and development environments for component modeling and verification may hinder its adoption. Methods and tools that can support component model analysis are highly valued by industry. Such tool support should be fully automated as well as efficient. At the same time, the reasoning tool should scale up well, as it may need to handle the hundreds or even thousands of components that a modern software system may have. Furthermore, a distributed environment that can effectively manage and compose components is also desirable. In this paper, we present an approach to the modeling and verification of a newly proposed component model using Semantic Web languages and their reasoning tools. We use the Web Ontology Language and the Semantic Web Rule Language to precisely capture the inter-relationships and constraints among the entities in a component model. Semantic Web reasoning tools are deployed to perform automated analysis of the component models. Moreover, we also propose a service-oriented architecture (SOA)-based Semantic Web environment for CBD. The adoption of Semantic Web services and SOA makes our component environment more reusable, scalable, dynamic and adaptive.
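As a concrete illustration of the kind of constraint such an ontology can capture, the hedged Python sketch below uses rdflib to assert provides/requires relationships between two hypothetical components and then queries for unsatisfied dependencies. The namespace and property names are illustrative assumptions, not the component model or OWL/SWRL axioms actually defined in the paper.

```python
# A minimal sketch, assuming rdflib and an illustrative "cbd" vocabulary:
# components and interfaces are RDF resources, and a SPARQL query flags any
# required interface that no component provides.
from rdflib import Graph, Namespace, RDF

CBD = Namespace("http://example.org/cbd#")   # hypothetical vocabulary
g = Graph()
g.bind("cbd", CBD)

# Two components and one interface: CompA provides it, CompB requires it.
for comp in (CBD.CompA, CBD.CompB):
    g.add((comp, RDF.type, CBD.Component))
g.add((CBD.ILogger, RDF.type, CBD.Interface))
g.add((CBD.CompA, CBD.provides, CBD.ILogger))
g.add((CBD.CompB, CBD.requires, CBD.ILogger))

# Composition check: find required interfaces with no provider.
unsatisfied = g.query("""
    PREFIX cbd: <http://example.org/cbd#>
    SELECT ?c ?i WHERE {
        ?c cbd:requires ?i .
        FILTER NOT EXISTS { ?p cbd:provides ?i . }
    }""")
for row in unsatisfied:
    print(f"{row.c} has an unsatisfied dependency on {row.i}")
```

In the paper's setting, the same relationships would be stated in OWL/SWRL and checked by a Semantic Web reasoner rather than by a hand-written query.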
Abstract:
The use of liposomes as carriers of peptide, protein, and DNA vaccines requires simple, easy-to-scale-up technology capable of high-yield vaccine entrapment. Work from this laboratory has led to the development of techniques that can generate liposomes of various sizes, containing soluble antigens such as proteins and particulate antigens (e.g., killed or attenuated bacteria or viruses), as well as antigen-encoding DNA vaccines. Entrapment of vaccines is carried out by the dehydration-rehydration procedure, which entails freeze-drying of a mixture of "empty" small unilamellar vesicles and free vaccine. On rehydration, the large multilamellar vesicles formed incorporate up to 90% or more of the vaccine used. When such liposomes are microfluidized in the presence of the nonentrapped material, their size is reduced to about 100 nm in diameter, with much of the originally entrapped vaccine still associated with the vesicles. A similar technique applied for the entrapment of particulate antigens (e.g., Bacillus subtilis spores) consists of freeze-drying giant vesicles (4-5 µm in diameter) in the presence of spores. On rehydration and sucrose gradient fractionation of the suspension, up to 30% or more of the spores used are associated with the generated giant liposomes of similar mean size.
Abstract:
While semantic search technologies have been proven to work well in specific domains, they still have to confront two main challenges to scale up to the Web in its entirety. In this work we address this issue with a novel semantic search system that a) provides the user with the capability to query Semantic Web information using natural language, by means of an ontology-based Question Answering (QA) system [14] and b) complements the specific answers retrieved during the QA process with a ranked list of documents from the Web [3]. Our results show that ontology-based semantic search capabilities can be used to complement and enhance keyword search technologies.
Abstract:
In the last few years, significant advances have been made in understanding how a yeast cell responds to the stress of producing a recombinant protein, and how this information can be used to engineer improved host strains. The molecular biology of the expression vector, through the choice of promoter, tag and codon optimization of the target gene, is also a key determinant of a high-yielding protein production experiment. Recombinant Protein Production in Yeast: Methods and Protocols examines the process of preparation of expression vectors, transformation to generate high-yielding clones, optimization of experimental conditions to maximize yields, scale-up to bioreactor formats and disruption of yeast cells to enable the isolation of the recombinant protein prior to purification. Written in the highly successful Methods in Molecular Biology™ series format, chapters include introductions to their respective topics, lists of the necessary materials and reagents, step-by-step, readily reproducible laboratory protocols, and key tips on troubleshooting and avoiding known pitfalls.
Abstract:
Progress in the development of generic molecular devices based on responsive polymers is discussed. Characterisation of specially synthesised polyelectrolyte gels, "grafted from" brushes and triblock copolymers is reported. A Landolt pH-oscillator, based on bromate/sulfite/ferrocyanide, with a room temperature period of 20 min and a range of 3.1
Abstract:
This chapter discusses the engineering design and performance of various types of biomass transformation reactors. These reactors vary in their operating principle depending on the processing capacity and the nature of the desired end product, that is, gas, chemicals or liquid bio-oil. A mass balance around a thermal conversion reactor is usually carried out to identify the degree of conversion and to obtain the amounts of the various components in the product. The energy balance around the reactor is essential for determining the optimum reactor temperature and the amount of heat required to complete the overall reactions. Experimental and pilot-plant testing is essential for proper reactor design; however, it is common practice to use correlations and valid parameter values in determining realistic reactor dimensions and configurations. Despite the recent progress in thermochemical conversion technology, reactor performance and scale-up potential remain subjects of continuing research.
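As a minimal illustration of this mass and energy bookkeeping, the Python sketch below computes a degree of conversion and an approximate reactor heat duty. All stream values and property data are illustrative placeholders, not figures from the chapter.

```python
# A minimal sketch of reactor mass/energy balances; all numbers are
# illustrative assumptions, not data from the chapter.
def conversion(feed_kg, residue_kg):
    """Degree of conversion: fraction of dry feed converted to gas + liquid."""
    return (feed_kg - residue_kg) / feed_kg

def heat_duty(feed_kg, cp_kJ_per_kgK, t_in_C, t_rx_C, dh_rx_kJ_per_kg):
    """Sensible heat to reach reactor temperature plus heat of reaction (kJ)."""
    return feed_kg * (cp_kJ_per_kgK * (t_rx_C - t_in_C) + dh_rx_kJ_per_kg)

# Example: 100 kg/h of biomass pyrolysed at 500 C, leaving 20 kg/h of char.
print(f"conversion = {conversion(100, 20):.0%}")                  # 80%
print(f"heat duty  = {heat_duty(100, 1.5, 25, 500, 300):,.0f} kJ/h")
```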
Abstract:
Desalination is a costly means of providing freshwater. Most desalination plants use either reverse osmosis (RO) or thermal distillation. Both processes have drawbacks: RO is efficient but uses expensive electrical energy; thermal distillation is inefficient but uses less expensive thermal energy. This work aims to provide an efficient RO plant that uses thermal energy. A steam Rankine cycle has been designed to mechanically drive a batch-RO system that achieves high recovery without the high energy penalty typically incurred in a continuous-RO system. The steam may be generated by solar panels, biomass boilers, or as an industrial by-product. A novel mechanical arrangement has been designed for low cost, and a steam-jacketed arrangement has been designed for isothermal expansion and improved thermodynamic efficiency. Based on detailed heat transfer and cost calculations, a gain output ratio of 69-162 is predicted, enabling water to be treated at a cost of 71 Indian Rupees/m3 at small scale. Costs will reduce with scale-up. Plants may be designed for a wide range of outputs, from 5 m3/day up to commercial versions producing 300 m3/day of clean water from brackish groundwater.
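For orientation, the gain output ratio (GOR) quoted above is conventionally the latent-heat equivalent of the distillate produced per unit of thermal energy supplied. The sketch below computes it under that textbook definition; the flow and power figures are illustrative assumptions, not the paper's design values.

```python
# A minimal GOR calculation; H_FG and the example figures are assumptions.
H_FG = 2.33e6  # J/kg, approximate latent heat of vaporisation near 70 C

def gain_output_ratio(permeate_kg_per_day, heat_input_W):
    """GOR = (permeate mass flow * latent heat) / thermal power supplied."""
    permeate_kg_per_s = permeate_kg_per_day / 86400.0
    return permeate_kg_per_s * H_FG / heat_input_W

# Example: 5 m3/day (~5000 kg/day) of product water from a 1.5 kW heat supply.
print(f"GOR = {gain_output_ratio(5000.0, 1500.0):.0f}")   # ~90
```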
Abstract:
Background aims: The selection of medium and associated reagents for human mesenchymal stromal cell (hMSC) culture forms an integral part of manufacturing process development and must be suitable for multiple process scales and expansion technologies. Methods: In this work, we have expanded BM-hMSCs in fetal bovine serum (FBS)- and human platelet lysate (HPL)-containing media in both a monolayer and a suspension-based microcarrier process. Results: The introduction of HPL into the monolayer process increased the BM-hMSC growth rate at the first experimental passage by 0.049/day and 0.127/day for the two BM-hMSC donors compared with the FBS-based monolayer process. This increase in growth rate in HPL-containing medium was associated with an increase in inter-donor consistency, with an inter-donor range of 0.406 cumulative population doublings after 18 days compared with 2.013 in FBS-containing medium. Identity and quality characteristics of the BM-hMSCs were also comparable between conditions in terms of colony-forming potential, osteogenic potential and expression of key genes during monolayer culture and post-harvest from microcarrier expansion. BM-hMSCs cultured on microcarriers in HPL-containing medium demonstrated a reduction in the initial lag phase for both BM-hMSC donors and an increased BM-hMSC yield after 6 days of culture, reaching 1.20 ± 0.17 × 10⁵ and 1.02 ± 0.005 × 10⁵ cells/mL compared with 0.79 ± 0.05 × 10⁵ and 0.36 ± 0.04 × 10⁵ cells/mL in FBS-containing medium. Conclusions: This study has demonstrated that HPL-containing medium, compared with FBS-containing medium, delivers increased growth and comparability across two BM-hMSC donors between monolayer and microcarrier culture, which has key implications for process transfer during scale-up.
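The growth metrics quoted above follow the standard definitions of specific growth rate and (cumulative) population doublings. A minimal sketch, using illustrative cell counts rather than the study's data:

```python
# Standard growth-rate and population-doubling calculations; the cell
# counts below are illustrative, not the study's data.
import math

def growth_rate(n0, nt, t_days):
    """Specific growth rate mu = ln(Nt/N0) / t, in 1/day."""
    return math.log(nt / n0) / t_days

def population_doublings(n0, nt):
    """PD = log2(Nt/N0); summing over passages gives cumulative PD."""
    return math.log2(nt / n0)

# Example passage: seed 2e5 cells, harvest 1.6e6 cells after 6 days.
print(f"mu = {growth_rate(2e5, 1.6e6, 6):.3f} /day")    # ~0.347/day
print(f"PD = {population_doublings(2e5, 1.6e6):.1f}")   # 3.0 doublings
```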
Abstract:
The conventional, geometrically lumped description of the physical processes inside a high shear granulator is not reliable for process design and scale-up. In this study, a compartmental Population Balance Model (PBM) with spatial dependence is developed and validated on two lab-scale high shear granulation processes, using a 1.9 L MiPro granulator and a 4 L DIOSNA granulator. The compartmental structure is built using a heuristic approach based on computational fluid dynamics (CFD) analysis, which includes the overall flow pattern, velocity and solids concentration. The constant-volume Monte Carlo approach is implemented to solve the multi-compartment population balance equations. Different spatially dependent mechanisms are included in the compartmental PBM to describe granule growth. It is concluded that for both cases (low and high liquid content), adjustment of the parameters (e.g. layering, coalescence and breakage rates) can provide a quantitative prediction of the granulation process.
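To make the solution strategy concrete, below is a deliberately simplified constant-volume Monte Carlo for a pure-coalescence population balance with a constant kernel: the simulated suspension volume is fixed, so the number of tracked granules falls as coalescence events occur. The paper's model additionally includes spatially dependent layering and breakage and a multi-compartment structure, all of which this sketch omits.

```python
# A minimal constant-volume Monte Carlo sketch for pure coalescence with a
# constant kernel; kernel_rate and the initial population are assumptions.
import random

def coalescence_mc(volumes, kernel_rate, t_end):
    """Evolve granule volumes by random pairwise coalescence (Gillespie-style)."""
    t = 0.0
    while len(volumes) > 1:
        n = len(volumes)
        total_rate = 0.5 * kernel_rate * n * (n - 1)  # constant kernel
        t += random.expovariate(total_rate)           # waiting time to next event
        if t > t_end:
            break
        i, j = random.sample(range(n), 2)             # pick a random pair
        vj = volumes.pop(j)                           # remove granule j ...
        if i > j:
            i -= 1                                    # ... fixing the shifted index
        volumes[i] += vj                              # ... and merge it into i
    return volumes

random.seed(1)
granules = [1.0] * 2000                     # monodisperse initial population
final = coalescence_mc(granules, kernel_rate=1e-4, t_end=5.0)
print(len(final), sum(final) / len(final))  # fewer, larger granules
```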
Abstract:
Objectives: To conduct an independent evaluation of the first phase of the Health Foundation's Safer Patients Initiative (SPI), and to identify the net additional effect of SPI and any differences in changes in participating and non-participating NHS hospitals. Design: Mixed method evaluation involving five substudies, before and after design. Setting: NHS hospitals in United Kingdom. Participants: Four hospitals (one in each country in the UK) participating in the first phase of the SPI (SPI1); 18 control hospitals. Intervention: The SPI1 was a compound (multicomponent) organisational intervention delivered over 18 months that focused on improving the reliability of specific frontline care processes in designated clinical specialties and promoting organisational and cultural change. Results: Senior staff members were knowledgeable and enthusiastic about SPI1. There was a small (0.08 points on a 5 point scale) but significant (P<0.01) effect in favour of the SPI1 hospitals in one of 11 dimensions of the staff questionnaire (organisational climate). Qualitative evidence showed only modest penetration of SPI1 at medical ward level. Although SPI1 was designed to engage staff from the bottom up, it did not usually feel like this to those working on the wards, and questions about the legitimacy of some aspects of SPI1 were raised. Of the five components to identify patients at risk of deterioration - monitoring of vital signs (14 items); routine tests (three items); evidence based standards specific to certain diseases (three items); prescribing errors (multiple items from the British National Formulary); and medical history taking (11 items) - there was little net difference between control and SPI1 hospitals, except in relation to quality of monitoring of acute medical patients, which improved on average over time across all hospitals. Recording of respiratory rate increased to a greater degree in SPI1 than in control hospitals; in the second six hours after admission, recording increased from 40% (93) to 69% (165) in control hospitals and from 37% (141) to 78% (296) in SPI1 hospitals (odds ratio for "difference in difference" 2.1, 99% confidence interval 1.0 to 4.3; P=0.008). Use of a formal scoring system for patients with pneumonia also increased over time (from 2% (102) to 23% (111) in control hospitals and from 2% (170) to 9% (189) in SPI1 hospitals), which favoured controls and was not significant (0.3, 0.02 to 3.4; P=0.173). There were no improvements in the proportion of prescription errors and no effects that could be attributed to SPI1 in non-targeted generic areas (such as enhanced safety culture). On some measures, the lack of effect could be because compliance was already high at baseline (such as use of steroids in over 85% of cases where indicated), but even where there was more room for improvement (such as in quality of medical history taking), there was no significant additional net effect of SPI1. There were no changes over time or between control and SPI1 hospitals in errors or rates of adverse events in patients in medical wards. Mortality increased from 11% (27) to 16% (39) among controls and decreased from 17% (63) to 13% (49) among SPI1 hospitals, but the risk adjusted difference was not significant (0.5, 0.2 to 1.4; P=0.085). Poor care was a contributing factor in four of the 178 deaths identified by review of case notes. The survey of patients showed no significant differences apart from an increase in perception of cleanliness in favour of SPI1 hospitals.
Conclusions: The introduction of SPI1 was associated with improvements in one of the types of clinical process studied (monitoring of vital signs) and one measure of staff perceptions of organisational climate. There was no additional effect of SPI1 on other targeted issues, nor on other measures of generic organisational strengthening.
Abstract:
Data envelopment analysis (DEA) is among the most widely used methods for measuring the efficiency and productivity of decision-making units (DMUs). For large-scale data sets, especially those with negative measures, DEA inevitably requires substantial computer resources in terms of memory and CPU time. In recent years, a wide range of studies has been conducted on combined artificial neural network and DEA methods. In this study, a supervised feed-forward neural network is proposed to evaluate the efficiency and productivity of large-scale data sets with negative values, and its results are compared with those of the corresponding DEA method. Results indicate that the proposed network has computational advantages over the corresponding DEA models; it can therefore be considered a useful tool for measuring the efficiency of DMUs with (large-scale) negative data.
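A minimal sketch of the supervised set-up, assuming scikit-learn and synthetic data: a feed-forward network is fitted to efficiency scores previously computed by a DEA model that tolerates negative data, after which new DMUs can be scored in a single forward pass instead of one linear program each. The scores below are random placeholders standing in for genuine DEA results.

```python
# Hedged sketch: regress (placeholder) DEA scores on DMU data, some of it
# negative, with a small feed-forward network.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))             # DMU measures, some negative
dea_scores = rng.uniform(0.2, 1.0, 500)   # placeholder for real DEA scores

net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(X[:400], dea_scores[:400])        # train on LP-solved DMUs
approx = net.predict(X[400:])             # score the rest without any LPs
```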
Abstract:
Data Envelopment Analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of Decision Making Units (DMUs). DEA for a large dataset with many inputs/outputs would require huge computer resources in terms of memory and CPU time. This paper proposes a back-propagation neural network DEA to address this problem for the very large-scale datasets now emerging in practice. The neural network's requirements for computer memory and CPU time are far lower than those of conventional DEA methods, and it can therefore be a useful tool for measuring the efficiency of large datasets. Finally, the back-propagation DEA algorithm is applied to five large datasets and the results are compared with those obtained by conventional DEA.
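As a companion to the network-fitting sketch above, the training labels themselves can be generated by solving the conventional input-oriented CCR envelopment linear program for each DMU, e.g. with scipy. The data are synthetic and the formulation is the textbook one, not necessarily the exact DEA variant used in the paper.

```python
# Hedged sketch: input-oriented CCR efficiency via linear programming,
# producing labels a back-propagation network could be trained on.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of DMU o. X: (n, m) inputs, Y: (n, s) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # minimise theta
    # Inputs:  sum_j lam_j x_ij - theta x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # Outputs: -sum_j lam_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[o]])
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

rng = np.random.default_rng(1)
X_in = rng.uniform(1, 10, (50, 2))              # synthetic inputs
Y_out = rng.uniform(1, 10, (50, 2))             # synthetic outputs
scores = np.array([ccr_efficiency(X_in, Y_out, o) for o in range(50)])
print(scores.round(3))                          # labels for the network
```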