59 results for two-stage sequential procedure

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

The EU intends to increase the fraction of fuels from biogenic energy sources from 2% in 2005 to 8% in 2020. This corresponds to a minimum of 30 million TOE/a of fuels from biomass, and it makes the technical-scale generation of syngas from high-grade biomass, e.g. straw, hay, bark, or paper/cardboard waste, and the production of synthetic fuels by Fischer-Tropsch (FT) synthesis highly attractive. The BTL (Biomass to Liquids) concept of the Karlsruhe Research Center, labeled bioliq, addresses this challenge by locally concentrating the biomass energy content by fast pyrolysis into a coke/oil slurry, followed by conversion of the slurry to syngas in a central entrained-flow gasifier at 1200 °C and pressures above 4 MPa. FT synthesis generates intermediate products for synthetic fuels. To prevent the sensitive catalysts from being poisoned, the syngas must be free of tar and particulates, and trace concentrations of H2S, COS, CS2, HCl, NH3, and HCN must be on the order of a few ppb. Moreover, maximum conversion efficiency is achieved by cleaning the gas at or above the synthesis conditions (T > 350 °C, P > 4 MPa). The concept of an innovative dry high-temperature, high-pressure (HTHP) syngas cleaning process is presented. Based on high-temperature particle filtration and suitable sorption and catalysis processes for the relevant contaminants, an overall concept is derived that achieves the syngas quality required for FT synthesis in only two combined stages. Results of filtration experiments on a pilot scale are presented, and the influence of temperature on the separation and conversion, respectively, of particulates and gaseous contaminants is discussed on the basis of experimental results obtained on laboratory and pilot scales. Extensive studies of this concept are performed in a scientific network comprising the Karlsruhe Research Center and five universities; funding is provided by the Helmholtz Association of National Research Centers in Germany.

Relevance:

100.00%

Publisher:

Abstract:

Many automated negotiation models have been developed to resolve conflicts in distributed computational systems. However, the problem of finding win-win outcomes in multiattribute negotiation has not been tackled well. To address this issue, this paper presents a negotiation model, based on an evolutionary method of multiobjective optimization, that can find win-win solutions over multiple attributes without requiring the negotiating agents to reveal their private utility functions to their opponents or to a third-party mediator. Moreover, we equip our agents with a general class of utility functions over interdependent attributes, which captures human intuitions well. In addition, we develop a novel time-dependent concession strategy model that helps both sides settle on a final agreement from among a set of win-win candidates. Finally, extensive experiments confirm that our negotiation model outperforms recently developed models. The experiments also show that our model is stable and efficient in finding fair win-win outcomes, a problem seldom solved by existing models. © 2012 Wiley Periodicals, Inc.
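The abstract does not give the concession model itself; as a hedged illustration of what a time-dependent concession strategy typically looks like, the sketch below uses a Faratin-style polynomial schedule in which a single parameter controls how quickly an agent lowers the utility it demands as the deadline approaches. The function name and all parameter values are assumptions, not the paper's model.

```python
# Hedged sketch of a generic time-dependent concession schedule (not the
# paper's model): the agent's demanded utility decays from u_max towards
# u_min as the deadline approaches, at a rate set by beta.

def concession_target(t: float, deadline: float,
                      u_max: float = 1.0, u_min: float = 0.4,
                      beta: float = 2.0) -> float:
    """Utility the agent demands at time t.

    beta > 1 : conceder (gives ground early)
    beta = 1 : linear concession
    beta < 1 : boulware (holds out until near the deadline)
    """
    frac = min(max(t / deadline, 0.0), 1.0)            # normalised time in [0, 1]
    return u_min + (u_max - u_min) * (1.0 - frac ** (1.0 / beta))

if __name__ == "__main__":
    for t in range(0, 11):
        print(t, round(concession_target(t, deadline=10, beta=0.5), 3))
```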

Relevance:

100.00%

Publisher:

Abstract:

The visual system pools information from local samples to calculate textural properties. We used a novel stimulus to investigate how signals are combined to improve estimates of global orientation. Stimuli were 29 × 29 element arrays of 4 c/deg log Gabors, spaced 1° apart. A proportion of these elements had a coherent orientation (horizontal/vertical), with the remainder assigned random orientations. The observer's task was to identify the global orientation. The spatial configuration of the signal was modulated by a checkerboard pattern of square checks containing potential signal elements. The other locations contained either randomly oriented elements ("noise check") or were blank ("blank check"). The distribution of signal elements was manipulated by varying the size and location of the checks within a fixed-diameter stimulus. An ideal detector would pool responses only from potential signal elements. Humans did this for medium check sizes, and for large check sizes when a signal was presented in the fovea. For small check sizes, however, pooling occurred indiscriminately over relevant and irrelevant locations. For these check sizes, thresholds for the noise-check and blank-check conditions were similar, suggesting that the limiting noise is not induced by the response to the noise elements. The results are described by a model that filters the stimulus at the potential target orientations and then combines the signals over space in two stages. The first is a mandatory integration of local signals over a fixed area, limited by internal noise at each location. The second is a task-dependent combination of the outputs from the first stage. © 2014 ARVO.
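As a hedged sketch (not the authors' fitted model), the two pooling stages described above can be illustrated as follows: local filter responses are summed within fixed regions after internal noise is added at each location, and only a task-dependent subset of the regional sums is then combined. The array shapes, noise level, and helper names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def two_stage_pool(resp, region_size, relevant_mask, noise_sd=1.0):
    """Hedged sketch of the two-stage pooling described above.

    resp          : 2-D array of local filter responses at one candidate orientation
    region_size   : side length (in elements) of the fixed stage-1 integration regions
    relevant_mask : boolean array over regions marking those the task treats as relevant
    """
    h, w = resp.shape
    noisy = resp + rng.normal(0.0, noise_sd, size=resp.shape)      # internal noise per location
    # Stage 1: mandatory summation of local signals within fixed, non-overlapping regions.
    h2, w2 = h - h % region_size, w - w % region_size
    blocks = noisy[:h2, :w2].reshape(h2 // region_size, region_size,
                                     w2 // region_size, region_size).sum(axis=(1, 3))
    # Stage 2: task-dependent combination over the relevant regions only.
    return blocks[relevant_mask].sum()

# A decision rule could compare this pooled value for the horizontal channel
# against the same quantity for the vertical channel and report the larger.
```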

Relevance:

100.00%

Publisher:

Abstract:

Parameter optimization of a two-stage Raman fibre converter (RFC) based on phosphosilicate-core fibre is presented. The optimal operational regime was determined, and the tolerance of the converter to variations in laser parameters was analysed. The converter was pumped by an ytterbium-doped double-clad fibre laser with a maximum output power of 3.8 W at 1061 nm. A phosphosilicate-core RFC with enhanced performance was fabricated using the results of the numerical modelling.
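The abstract gives only the 1061 nm pump wavelength. As a hedged illustration of how a two-stage Raman cascade is chained, the snippet below assumes the large phosphorus-related Stokes shift of roughly 1330 cm⁻¹ often quoted for phosphosilicate fibre (a value not stated in the abstract) and applies it twice in wavenumber space.

```python
# Hedged arithmetic sketch: the ~1330 cm^-1 P2O5-related Stokes shift assumed
# below is not stated in the abstract; it is used only to show how a two-stage
# Raman conversion is chained: 1/lambda_out = 1/lambda_in - shift.

PUMP_NM = 1061.0
STOKES_SHIFT_CM1 = 1330.0          # assumed phosphosilicate Raman shift

def stokes(wavelength_nm: float, shift_cm1: float) -> float:
    """Wavelength after one Stokes shift."""
    wavenumber_cm1 = 1e7 / wavelength_nm
    return 1e7 / (wavenumber_cm1 - shift_cm1)

first_stage = stokes(PUMP_NM, STOKES_SHIFT_CM1)          # ~1235 nm
second_stage = stokes(first_stage, STOKES_SHIFT_CM1)     # ~1478 nm
print(round(first_stage), round(second_stage))
```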

Relevance:

100.00%

Publisher:

Abstract:

Two-stage data envelopment analysis (DEA) efficiency models identify the efficient frontier of a two-stage production process. In some two-stage processes, the inputs to the first stage are also used by the second stage; these are known as shared inputs. This paper proposes a new relational linear DEA model for measuring the efficiency score of two-stage processes with shared inputs under the constant returns-to-scale assumption. Two case studies, one from the banking industry and one from university operations, illustrate the potential applications of the proposed approach.
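The abstract does not state the model, so the following is only a hedged sketch of the general form taken by relational two-stage DEA models with shared inputs under CRS; the paper's actual formulation may differ. For DMU o with inputs x, intermediate measures z, outputs y, and a shared-input split α:

\[
\max_{u,v,w,\alpha}\; E_o=\frac{\sum_r u_r y_{ro}}{\sum_i v_i x_{io}}
\quad\text{s.t.}\quad
\frac{\sum_d w_d z_{dj}}{\sum_i v_i\,\alpha_i x_{ij}}\le 1,\qquad
\frac{\sum_r u_r y_{rj}}{\sum_d w_d z_{dj}+\sum_i v_i (1-\alpha_i) x_{ij}}\le 1\quad\forall j,
\]
\[
u_r,\,v_i,\,w_d\ge 0,\qquad 0\le\alpha_i\le 1 .
\]

The same intermediate weights \(w_d\) appear in both stage constraints (the "relational" feature), and a Charnes-Cooper transformation together with a change of variables for \(v_i\alpha_i\) turns the ratio program into a linear one.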

Relevance:

100.00%

Publisher:

Abstract:

Our understanding of early spatial vision owes much to contrast masking and summation paradigms. In particular, the deep region of facilitation at low mask contrasts is thought to indicate a rapidly accelerating contrast transducer (e.g. a square-law or greater). In experiment 1, we tapped an early stage of this process by measuring monocular and binocular thresholds for patches of 1 cycle deg⁻¹ sine-wave grating. Threshold ratios were around 1.7, implying a nearly linear transducer with an exponent around 1.3. With this form of transducer, two previous models (Legge, 1984 Vision Research 24 385-394; Meese et al, 2004 Perception 33 Supplement, 41) failed to fit the monocular, binocular, and dichoptic masking functions measured in experiment 2. However, a new model with two stages of divisive gain control fits the data very well. Stage 1 incorporates nearly linear monocular transducers (to account for the high level of binocular summation and slight dichoptic facilitation), and monocular and interocular suppression (to fit the profound dichoptic masking). Stage 2 incorporates steeply accelerating transduction (to fit the deep regions of monocular and binocular facilitation), and binocular summation and suppression (to fit the monocular and binocular masking). With all model parameters fixed from the discrimination thresholds, we examined the slopes of the psychometric functions. The monocular and binocular slopes were steep (Weibull β ≈ 3-4) at very low mask contrasts and shallow (β ≈ 1.2) at all higher contrasts, as predicted by all three models. The dichoptic slopes were steep (β ≈ 3-4) at very low contrasts, and very steep (β > 5.5) at high contrasts (confirming Meese et al, loc. cit.). A crucial new result was that intermediate dichoptic mask contrasts produced shallow slopes (β ≈ 2). Only the two-stage model predicted the observed pattern of slope variation, providing good empirical support for a two-stage process of binocular contrast transduction. [Supported by EPSRC GR/S74515/01]
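The link between the observed threshold ratio of about 1.7 and the quoted exponent of about 1.3 can be made explicit. As a hedged simplification of the first stage described above, assume a power-law transducer \(r=c^m\) in each eye, linear summation of the two monocular responses, and a fixed response criterion at threshold. Equal performance for binocular and monocular presentation then requires

\[
2\,c_{\mathrm{bin}}^{\,m}=c_{\mathrm{mon}}^{\,m}
\;\;\Longrightarrow\;\;
\frac{c_{\mathrm{mon}}}{c_{\mathrm{bin}}}=2^{1/m},
\qquad
2^{1/m}=1.7\;\;\Longrightarrow\;\;
m=\frac{\ln 2}{\ln 1.7}\approx 1.3 .
\]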

Relevance:

100.00%

Publisher:

Abstract:

Two key issues defined the focus of this research in manufacturing plasmid DNA for use in human gene therapy. First, the processing of E. coli bacterial cells to effect the separation of therapeutic plasmid DNA from cellular debris and adventitious material. Second, the affinity purification of the plasmid DNA in a simple one-stage process. The need arises from concerns recently voiced by the FDA about the scalability and reproducibility of current manufacturing processes in meeting the quality criteria of purity, potency, efficacy, and safety for a recombinant drug substance for use in humans. To develop a preliminary purification procedure, an EFD cross-flow micro-filtration module was assessed for its ability to effect the 20-fold concentration, 6-fold diafiltration, and final clarification of plasmid DNA from the cell lysate derived from a 1-litre E. coli bacterial cell culture. Historically, the use of cross-flow filtration modules in procedures for harvesting cells from bacterial cultures has failed to reach the standards set by existing continuous centrifuge technologies, frequently resulting in rapid blinding of the membrane with bacterial cells and a substantial reduction in permeate flux. The EFD module, containing six helically wound tubular membranes that promote the centrifugal instabilities known as Dean vortices, was first challenged with distilled water at Dean numbers (Dn) between 187 and 818 and transmembrane pressures (TMP) of 0 to 5 psi. The data demonstrated that the fluid dynamics significantly influenced the permeation rate, which displayed a maximum at Dn = 227 (312 LMH) and a minimum at Dn = 818 (130 LMH) for a transmembrane pressure of 1 psi. Numerical studies indicated that the initial increase and subsequent decrease resulted from a competition between the centrifugal and viscous forces that create the Dean vortices. At Dean numbers between 187 and 227, the forces combine constructively to increase the apparent strength and influence of the Dean vortices. However, as the Dean number increases above 227, the centrifugal force dominates the viscous forces, compressing the Dean vortices into the membrane walls and reducing their influence on the radial transmembrane pressure, i.e. the permeate flux falls. When the action of the Dean vortices in controlling the fouling rate of E. coli bacterial cells was investigated, the optimum cross-flow rate at which to concentrate a bacterial cell culture was shown to be Dn = 579 at 3 psi TMP, processing in excess of 400 LMH for 20 minutes (i.e., concentrating a 1 L culture to 50 ml in 10 minutes at an average of 450 LMH). The data demonstrated a conflict between the Dean number at which the shear rate could control cell fouling and the Dean number at which the optimum flux enhancement was found; hence the internal geometry of the EFD module was shown to be sub-optimal for this application. At Dn = 579 and 3 psi TMP, the 6-fold diafiltration occupied 3.6 minutes of process time at an average flux of 400 LMH. Again at Dn = 579 and 3 psi TMP, the clarification of the plasmid from the resulting freeze-thaw cell lysate was achieved at 120 LMH, passing 83% (2.5 mg) of the plasmid DNA (6.3 ng µl⁻¹), 10.8 mg of genomic DNA (~23,000 bp, 36 ng µl⁻¹), and 7.2 mg of cellular proteins (5-100 kDa, 21.4 ng µl⁻¹) into the post-EFD process stream.
Hence the EFD module was shown to be effective, achieving the desired objectives in approximately 25 minutes. On the basis of its ability to intercalate into the low-molecular-weight dsDNA present in dilute cell lysates and to be electrophoresed through agarose, the fluorophore PicoGreen was selected for the development of a suitable dsDNA assay. It was assessed for its accuracy and reliability in determining the concentration and identity of DNA present in samples electrophoresed through agarose gels. The signal emitted by intercalated PicoGreen was shown to be constant and linear, and the mobility of the PicoGreen-DNA complex was not affected by the intercalation. Concerning the secondary purification procedure, various anion-exchange membranes were assessed for their ability to capture plasmid DNA from the post-EFD process stream. For a commercially available Sartorius Sartobind Q15 membrane, the reduction in the equilibrium binding capacity for ctDNA in buffers of increasing ionic strength demonstrated that DNA was adsorbed by electrostatic interactions only. However, problems with fluid distribution across the membrane demonstrated that the membrane housing was the predominant cause of the erratic breakthrough curves; this would need to be rectified before such a membrane could be integrated into the current system, or indeed scaled beyond the laboratory. Moreover, when challenged with the process material, the data showed that considerable quantities of protein (1150 µg) were adsorbed in preference to the plasmid DNA (44 µg). The same was shown for Pall Gelman UltraBind US450 membranes functionalised with poly-L-lysine and polyethyleneimine ligands of varying molecular weight. Hence the anion-exchange membranes were shown to be ineffective in capturing plasmid DNA from the process stream. Finally, work was performed to integrate a sequence-specific DNA-binding protein into a single-stage DNA chromatography step, isolating plasmid DNA from E. coli cells whilst minimising contamination from genomic DNA and cellular protein. Preliminary work demonstrated that the fusion protein was capable of isolating pUC19 DNA into which the recognition sequence for the fusion protein had been inserted (pTS DNA), even in the presence of the conditioned process material. Although the pTS recognition sequence differs from native pUC19 sequences by only 2 bp, the fusion protein was shown to act as a highly selective affinity ligand for pTS DNA alone. Subsequently, the process was scaled up 25-fold and positioned directly after the EFD system. In conclusion, the integration of the EFD micro-filtration system and the zinc-finger affinity purification technique allowed approximately 1 mg of plasmid DNA to be purified from 1 L of E. coli culture in a simple two-stage process, with complete removal of genomic DNA and removal of 96.7% of cellular protein in less than 1 hour of process time.
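The abstract reports the operating points only as Dean numbers. As a hedged sketch of how such a value is computed for flow in a helically wound tube, the snippet below uses an entirely hypothetical tube and coil geometry (the module's real dimensions are not given above); only the relation Dn = Re·sqrt(d/D_coil) is the point.

```python
import math

# Hedged sketch: geometry and flow values below are hypothetical placeholders,
# used only to show how a Dean number is obtained for flow in a coiled tube:
#   Re = rho * v * d / mu,   Dn = Re * sqrt(d / D_coil)

RHO, MU = 998.0, 1.0e-3          # water at ~20 C: density (kg/m^3), viscosity (Pa.s)
TUBE_ID = 3.0e-3                 # assumed tube inner diameter (m)
COIL_DIAMETER = 30.0e-3          # assumed helix (coil) diameter (m)

def dean_number(flow_rate_m3_s: float) -> float:
    area = math.pi * (TUBE_ID / 2) ** 2
    velocity = flow_rate_m3_s / area
    reynolds = RHO * velocity * TUBE_ID / MU
    return reynolds * math.sqrt(TUBE_ID / COIL_DIAMETER)

# Dean numbers for a few assumed per-tube flow rates (L/min)
for q_lpm in (0.1, 0.25, 0.4):
    q = q_lpm / 1000 / 60        # L/min -> m^3/s
    print(q_lpm, "L/min ->", round(dean_number(q)))
```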

Relevance:

100.00%

Publisher:

Abstract:

The thesis examines and explains the development of occupational exposure limits (OELs) as a means of preventing work-related disease and ill health. The research focuses on the USA and UK and sets the work within its historical and social context. A subsidiary aim of the thesis is to identify any shortcomings in OELs and the methods by which they are set, and to suggest alternatives. The research framework uses Thomas Kuhn's idea of science progressing by means of paradigms, which he describes at one point as "... universally recognised scientific achievements that for a time provide model problems and solutions to a community of practitioners" (Kuhn, 1970). Once learned, individuals in the community "... are committed to the same rules and standards for scientific practice" (ibid.). Kuhn's ideas are adapted by combining them with a view of industrial hygiene as an applied, science-based profession having many of the qualities of non-scientific professions. The great advantage of this approach to OELs is that it keeps the analysis grounded in the behaviour and priorities of the groups which have forged, propounded, used, benefited from, and defended them. The development and use of OELs on a larger scale is shown to be connected to the growth of a new profession in the USA, industrial hygiene, with the assistance of another new profession, industrial toxicology. The origins of these professions, particularly industrial hygiene, are traced. By examining the growth of the professions and the writings of key individuals it is possible to show how technical, economic and social factors became embedded in the OEL paradigm which industrial hygienists and toxicologists forged. The origin, mission and needs of these professions and their clients made such influences almost inevitable. The use of the OEL paradigm in practice is examined through an analysis of the workings of the American Conference of Governmental Industrial Hygienists Threshold Limit Value (ACGIH TLV) Committee via its minutes from 1962-1984. A similar approach is taken with the development of OELs in the UK. Although the form and definition of TLVs have encouraged the belief that they are health-based OELs, the conclusion is that they, and most other OELs, are, and always have been, reasonably practicable limits: the degree of risk posed by a substance is weighed against the feasibility and cost of controlling exposure to that substance. The confusion over the status of TLVs and other OELs is seen to be a confusion at the heart of the OEL paradigm, and the historical perspective explains why this should be. The paradigm has prevented the creation of truly health-based and, conversely, truly reasonably practicable OELs. In the final part of the thesis the analysis of the development of OELs is set in a contemporary context, and a proposal for a two-stage, two-committee procedure for producing sets of OELs is put forward. This approach is set within an alternative OEL paradigm. The advantages, benefits and likely obstacles to these proposals are discussed.

Relevance:

100.00%

Publisher:

Abstract:

Numerical optimization of a 40-Gb/s dispersion-managed (DM) soliton transmission system with in-line synchronous intensity modulation is performed. The stability of DM soliton transmission results from the combined action of dispersion, nonlinearity, in-line filtering, and modulation, through effective periodic bandwidth management of the carrier pulses; analysis of the multiparametric problem is therefore typically required. A two-stage, time-saving numerical optimization procedure is applied. At the first stage, the regions of stable carrier propagation are determined using theoretical models available for DM solitons, and the system parameters are optimized. At the second stage, full numerical simulations are undertaken to verify the tolerance of the optimal transmission regimes. The approach developed demonstrates the feasibility of error-free transmission over 20,000 km at 40 Gb/s in a line composed of standard fibre and dispersion-compensating fibre.
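As a hedged sketch of the structure of such a procedure (not the paper's actual models or parameters), the skeleton below screens a parameter grid with a cheap analytical stability test and then runs full simulations only on the survivors; analytic_dm_soliton_ok and full_split_step_simulation are hypothetical placeholders for the theoretical DM-soliton model and the full propagation code.

```python
# Hedged sketch of the *structure* of a two-stage optimization of this kind:
# a cheap analytical screen of the parameter grid is followed by expensive
# full simulations of the surviving candidates.
from itertools import product

def two_stage_optimize(param_grid, analytic_dm_soliton_ok, full_split_step_simulation):
    # Stage 1: keep only parameter sets the approximate theory marks as stable.
    candidates = [p for p in (dict(zip(param_grid, vals))
                              for vals in product(*param_grid.values()))
                  if analytic_dm_soliton_ok(p)]
    # Stage 2: verify tolerance of the surviving regimes by full simulation.
    results = {tuple(p.items()): full_split_step_simulation(p) for p in candidates}
    return max(results.items(), key=lambda kv: kv[1])   # e.g. best error-free distance

# param_grid might contain map dispersion, filter bandwidth, modulator depth, etc.
```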

Relevance:

100.00%

Publisher:

Abstract:

This study presents a two-stage process to determine suitable areas for growing fuel crops: i) the FAO Agro-Ecological Zones (AEZ) procedure is applied to four Indian states with different geographical characteristics; and ii) the growth of candidate crops is modelled with the GEPIC water and nutrient model, which is used to determine the potential yield of candidate crops in areas where the irrigation water is brackish or the soil is saline. The absence of digital soil maps, the paucity of readily available climate data, and limited knowledge of the detailed requirements of candidate crops are among the major problems; a series of detailed maps addressing them would allow the true potential of biofuels in India to be evaluated.
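As a hedged sketch of the two-stage screening workflow (not the study's implementation), the snippet below applies an AEZ-style suitability test to each grid cell and then runs a process-based yield model only on the cells that pass; aez_suitable, gepic_potential_yield, and the yield cut-off are hypothetical placeholders.

```python
# Hedged sketch of a two-stage land-suitability screen: a coarse AEZ rule set
# first, then a crop growth model on the remaining cells only.

def two_stage_suitability(grid_cells, aez_suitable, gepic_potential_yield,
                          min_yield_t_ha=2.0):
    """grid_cells: iterable of dicts with climate/soil/salinity attributes per cell."""
    stage1 = [c for c in grid_cells if aez_suitable(c)]            # coarse AEZ screen
    stage2 = {c["id"]: gepic_potential_yield(c) for c in stage1}   # process-based yield
    return {cid: y for cid, y in stage2.items() if y >= min_yield_t_ha}
```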

Relevance:

100.00%

Publisher:

Abstract:

Eukaryotic, and especially human, membrane protein overproduction remains a major challenge in biochemistry. Heterologously overproduced and purified proteins provide the starting point for further biochemical, biophysical and structural studies, and the lack of sufficient quantities of functional membrane protein is frequently the bottleneck. Here, we report exceptionally high production levels of a correctly folded and crystallisable recombinant human integral membrane protein in its active form: human aquaporin 1 (hAQP1) has been heterologously produced in the membranes of the methylotrophic yeast Pichia pastoris. After solubilisation and a two-step purification procedure, at least 90 mg of hAQP1 per litre of culture is obtained. The water channel activity of this purified hAQP1 was verified by reconstitution into proteoliposomes and stopped-flow vesicle shrinkage measurements. Mass spectrometry confirmed the identity of hAQP1 in crude membrane preparations and in purified protein reconstituted into proteoliposomes. Furthermore, crystallisation screens yielded diffraction-quality crystals of untagged recombinant hAQP1. This study illustrates the power of the yeast P. pastoris as a host for producing exceptionally high yields of a functionally active human integral membrane protein for subsequent functional and structural characterization. © 2007 Elsevier Inc. All rights reserved.
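The abstract does not describe how the shrinkage traces are analysed; as a hedged sketch, a relation commonly used to extract the osmotic water permeability from stopped-flow vesicle shrinkage is

\[
P_f=\frac{k}{(S/V_0)\,V_w\,\Delta C_{\mathrm{osm}}},
\]

where \(k\) is the exponential rate constant fitted to the initial light-scattering transient, \(S/V_0\) is the initial surface-to-volume ratio of the proteoliposomes, \(V_w \approx 18\ \mathrm{cm^3\,mol^{-1}}\) is the partial molar volume of water, and \(\Delta C_{\mathrm{osm}}\) is the imposed osmotic gradient. Functional hAQP1 then shows up as a large increase in \(P_f\) over protein-free liposomes.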

Relevance:

100.00%

Publisher:

Abstract:

A fundamental problem for any visual system with binocular overlap is the combination of information from the two eyes. Electrophysiology shows that binocular integration of luminance contrast occurs early in visual cortex, but a specific systems architecture has not been established for human vision. Here, we address this by performing binocular summation and monocular, binocular, and dichoptic masking experiments with horizontal 1 cycle per degree test and masking gratings. These data reject three previously published proposals, each of which predicts too little binocular summation and insufficient dichoptic facilitation. However, a simple development of one of the rejected models (the twin summation model) and a completely new model (the two-stage model) provide very good fits to the data. Two features common to both models are gently accelerating (almost linear) contrast transduction prior to binocular summation and suppressive ocular interactions that contribute to contrast gain control. With all model parameters fixed, both models correctly predict (1) systematic variation in psychometric slopes, (2) dichoptic contrast matching, and (3) high levels of binocular summation for various levels of binocular pedestal contrast. A review of evidence from elsewhere leads us to favor the two-stage model. © 2006 ARVO.
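The equations and parameter values are not given in the abstract; the sketch below shows only the generic shape of a two-stage model of this kind (nearly linear transduction with interocular suppression before binocular summation, then a second accelerating gain-control stage). The functional form and every parameter value are illustrative assumptions, not the paper's fitted model.

```python
# Hedged sketch of the generic shape of a "two-stage" binocular contrast model.

def stage1(c_this_eye, c_other_eye, m=1.3, s=1.0):
    """Monocular response with interocular suppression (contrasts in %)."""
    return c_this_eye ** m / (s + c_this_eye + c_other_eye)

def two_stage_response(c_left, c_right, p=3.0, q=2.0, z=0.1):
    """Binocular response: sum the stage-1 outputs, then accelerate and gain-control."""
    b = stage1(c_left, c_right) + stage1(c_right, c_left)   # binocular summation
    return b ** p / (z + b ** q)

# Binocular presentation of the same contrast yields a larger response than
# monocular presentation, which is what produces binocular summation at threshold.
print(two_stage_response(1.0, 0.0), two_stage_response(1.0, 1.0))
```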

Relevance:

100.00%

Publisher:

Abstract:

How do signals from the two eyes combine and interact? Our recent work has challenged earlier schemes in which monocular contrast signals are subject to square-law transduction followed by summation across eyes and binocular gain control. Much more successful was a new 'two-stage' model in which the initial transducer is almost linear and contrast gain control occurs both before and after binocular summation. Here we extend that work by (i) exploring the two-dimensional stimulus space (defined by left- and right-eye contrasts) more thoroughly, and (ii) performing contrast discrimination and contrast matching tasks for the same stimuli. Twenty-five base stimuli, made from 1 c/deg patches of horizontal grating, were defined by the factorial combination of five contrasts for the left eye (0.3-32%) with five contrasts for the right eye (0.3-32%). Other than in contrast, the gratings in the two eyes were identical. In a 2IFC discrimination task, the base stimuli were masks (pedestals), and the contrast increment was presented to one eye only. In a matching task, the base stimuli were standards to which observers matched the contrast of either a monocular or binocular test grating. In the model, discrimination depends on the local gradient of the observer's internal contrast-response function, while matching equates the magnitude (rather than the gradient) of the response to the test and standard. With all model parameters fixed by previous work, the two-stage model successfully predicted both the discrimination and the matching data, and was much more successful than linear or quadratic binocular summation models. These results show that performance measures and perception (contrast discrimination and contrast matching) can be understood within the same theoretical framework for binocular contrast vision. © 2007 VSP.
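The two readouts described above can be written compactly. As a hedged sketch (the notation is ours, not the paper's), let \(R(c_L,c_R)\) be the model's internal contrast-response function and \(\sigma\) the internal noise; then

\[
\Delta c \approx \frac{\sigma}{\bigl|\partial R/\partial c_{\mathrm{test\ eye}}\bigr|}
\quad\text{(discrimination threshold on a pedestal)},
\qquad
R(\hat c,\hat c)=R(c_L,c_R)
\quad\text{(binocular matching contrast }\hat c\text{)},
\]

so a single fitted response function is all that is needed to predict both tasks, which is the point the abstract makes.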

Relevance:

100.00%

Publisher:

Abstract:

National guidance and clinical guidelines recommended multidisciplinary teams (MDTs) for cancer services in order to bring specialists in the relevant disciplines together, to ensure that clinical decisions are fully informed, and to coordinate care effectively. However, the effectiveness of cancer teams had not previously been evaluated systematically. A random sample of 72 breast cancer teams in England was studied (548 members in six core disciplines), stratified by region and caseload. Information about team constitution, processes, effectiveness, clinical performance, and members' mental well-being was gathered using appropriate instruments. Two input variables, team workload (P=0.009) and the proportion of breast care nurses (P=0.003), positively predicted overall clinical performance in a multivariate analysis using a two-stage regression model. There were significant correlations between individual team inputs, team composition variables, and clinical performance. Some disciplines consistently rated their team's effectiveness differently from the mean. Teams with shared leadership of their clinical decision-making were the most effective. The mental well-being of team members appeared significantly better than in previous studies of cancer clinicians, the NHS, and the general population. This study established that team composition, working methods, and workloads are related to measures of effectiveness, including the quality of clinical care. © 2003 Cancer Research UK.