977 results for Scalable Nanofabrication


Relevance: 20.00%

Abstract:

Background: The optimisation and scale-up of process conditions leading to high yields of recombinant proteins is an enduring bottleneck in the post-genomic sciences. Typical experiments rely on varying selected parameters through repeated rounds of trial-and-error optimisation. To rationalise this, several groups have recently adopted the 'design of experiments' (DoE) approach frequently used in industry. Studies have focused on parameters such as medium composition, nutrient feed rates and induction of expression in shake flasks or bioreactors, as well as oxygen transfer rates in micro-well plates. In this study we wanted to generate a predictive model that described small-scale screens and to test its scalability to bioreactors.

Results: Here we demonstrate how the use of a DoE approach in a multi-well mini-bioreactor permitted the rapid establishment of high-yielding production phase conditions that could be transferred to a 7 L bioreactor. Using green fluorescent protein secreted from Pichia pastoris, we derived a predictive model of protein yield as a function of the three most commonly varied process parameters: temperature, pH and the percentage of dissolved oxygen in the culture medium. Importantly, when yield was normalised to culture volume and density, the model was scalable from mL to L working volumes. By increasing pre-induction biomass accumulation, model-predicted yields were further improved. Yield improvement was most significant, however, on varying the fed-batch induction regime to minimise methanol accumulation, so that the productivity of the culture increased throughout the whole induction period. These findings suggest the importance of matching the rate of protein production with the host metabolism.

Conclusion: We demonstrate how a rational, stepwise approach to recombinant protein production screens can reduce process development time.
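A minimal sketch of the modelling step described above: fitting a full quadratic response surface for yield as a function of temperature, pH and dissolved oxygen by least squares. The design points and yield values below are hypothetical placeholders, not data from the study.

```python
import numpy as np

# Hypothetical face-centred central composite design in three factors:
# temperature (deg C), pH, % dissolved oxygen. Yields are invented for
# illustration only.
X = np.array([
    [20, 5.0, 30], [20, 5.0, 60], [20, 7.0, 30], [20, 7.0, 60],
    [30, 5.0, 30], [30, 5.0, 60], [30, 7.0, 30], [30, 7.0, 60],
    [20, 6.0, 45], [30, 6.0, 45], [25, 5.0, 45], [25, 7.0, 45],
    [25, 6.0, 30], [25, 6.0, 60], [25, 6.0, 45], [25, 6.0, 45],
], dtype=float)
y = np.array([0.42, 0.48, 0.55, 0.60, 0.38, 0.45, 0.50, 0.52,
              0.58, 0.49, 0.51, 0.62, 0.57, 0.66, 0.64, 0.63])

def design_matrix(X):
    """Intercept, linear, two-way interaction and squared terms."""
    t, p, d = X.T
    return np.column_stack([np.ones(len(X)), t, p, d,
                            t * p, t * d, p * d, t**2, p**2, d**2])

coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Predict normalised yield at an untested operating point.
print(design_matrix(np.array([[22.0, 6.5, 50.0]])) @ coef)
```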

Relevance: 20.00%

Abstract:

The computer systems of today are characterised by data and program control that are distributed functionally and geographically across a network. A major concern in this environment is the operating system activity of resource management for the different processors in the network. To ensure equity in load distribution and improved system performance, load balancing is often undertaken. The research conducted in this field so far has been primarily concerned with a small set of algorithms operating on tightly-coupled distributed systems. More recent studies have investigated the performance of such algorithms in loosely-coupled architectures, but using a small set of processors. This thesis describes a simulation model developed to study the behaviour and general performance characteristics of a range of dynamic load balancing algorithms. Further, the scalability of these algorithms is discussed and a range of regionalised load balancing algorithms is developed. In particular, we examine the impact of network diameter and delay on the performance of such algorithms across a range of system workloads. The results suggest that simple dynamic policies scale well but lack the load stability of more complex global average algorithms.
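To make the class of algorithms concrete, here is a toy simulation of one simple dynamic policy, a sender-initiated threshold scheme. The node count, threshold and arrival probabilities are assumptions for illustration, not the algorithms evaluated in the thesis.

```python
import random

class Node:
    def __init__(self, name):
        self.name, self.queue = name, 0

def balance(nodes, threshold=4, probe_limit=3):
    """Sender-initiated: an overloaded node probes a few random peers and
    migrates one job to the first peer found below the threshold."""
    for n in nodes:
        if n.queue <= threshold:
            continue
        peers = [m for m in nodes if m is not n]
        for peer in random.sample(peers, probe_limit):
            if peer.queue < threshold:
                n.queue -= 1
                peer.queue += 1
                break

nodes = [Node(f"n{i}") for i in range(8)]
for _ in range(100):
    for n in nodes:                          # random arrivals and departures
        n.queue += random.random() < 0.6     # arrival with probability 0.6
        if n.queue and random.random() < 0.5:
            n.queue -= 1                     # service completion
    balance(nodes)
print({n.name: n.queue for n in nodes})
```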

Relevance: 20.00%

Abstract:

The work described in this thesis focuses on the use of a design-of-experiments approach in a multi-well mini-bioreactor to enable the rapid establishment of high-yielding production phase conditions in yeast, which is an increasingly popular host system in both academic and industrial laboratories. Using green fluorescent protein secreted from the yeast Pichia pastoris, a scalable predictive model of protein yield per cell was derived from 13 sets of conditions, each with three factors (temperature, pH and dissolved oxygen) at three levels, and was directly transferable to a 7 L bioreactor. This was in clear contrast to the situation in shake flasks, where the process parameters cannot be tightly controlled. By further optimising both the accumulation of cell density in the batch phase and the fed-batch induction regime, additional yield improvements were found to be additive to the per-cell yield of the model. A separate study also demonstrated that improving biomass improved product yield in a second yeast species, Saccharomyces cerevisiae. Investigations of cell wall hydrophobicity in high cell density P. pastoris cultures indicated that cell wall hydrophobin (protein) composition changes with growth phase, cells being more hydrophobic in log growth than in lag or stationary phases, possibly due to an increased occurrence of proteins associated with cell division. Finally, the modelling approach was validated in mammalian cells, showing its flexibility and robustness. In summary, the strategy presented in this thesis has the benefit of reducing process development time in recombinant protein production, directly from bench to bioreactor.

Relevance: 20.00%

Abstract:

In this paper a Markov chain based analytical model is proposed to evaluate the slotted CSMA/CA algorithm specified in the MAC layer of the IEEE 802.15.4 standard. The analytical model consists of two two-dimensional Markov chains, used to model the state transitions of an 802.15.4 device during a frame transmission and between two consecutive frame transmissions, respectively. By splitting the model into two chains, only a small number of Markov states is required and the scalability of the analytical model is improved. The analytical model is used to investigate the impact of the CSMA/CA parameters, the number of contending devices, and the data frame size on the network performance in terms of throughput and energy efficiency. It is shown by simulations that the proposed analytical model can accurately predict the performance of the slotted CSMA/CA algorithm for uplink, downlink and bidirectional traffic, in both acknowledged and unacknowledged modes.
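The generic computation underlying such models is solving a Markov chain for its stationary distribution and reading performance metrics off it. The tiny three-state chain below is a placeholder for illustration, not the paper's two-dimensional chains.

```python
import numpy as np

# Placeholder chain: backoff -> clear channel assessment -> transmit.
# Transition probabilities are invented for illustration.
P = np.array([
    [0.2, 0.8, 0.0],   # stay in backoff, or proceed to CCA
    [0.3, 0.0, 0.7],   # channel busy: back off; channel idle: transmit
    [1.0, 0.0, 0.0],   # after transmission, return to backoff
])
assert np.allclose(P.sum(axis=1), 1.0)

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

states = ["backoff", "cca", "transmit"]
print(dict(zip(states, pi)))
print("fraction of slots spent transmitting:", pi[2])
```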

Relevance: 20.00%

Abstract:

In this paper we propose a 2R regeneration scheme based on a nonlinear optical loop mirror (NOLM) and optical filtering. We numerically investigate wavelength-division multiplexing (WDM) operation at a channel bit rate of 40 Gbit/s. In contrast to our previous work, we focus here on the regenerative characteristics and signal quality after a single transmission section, whose length is varied from 200 to 1000 km. © 2003 IEEE.
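Numerical studies of this kind typically rest on split-step Fourier integration of the nonlinear Schrödinger equation over the fibre spans. The sketch below shows that core step for a single pulse; the fibre parameters and input pulse are assumed values, and modelling the NOLM itself would additionally require the loop coupler and the counter-propagating pass.

```python
import numpy as np

# Split-step Fourier integration of the scalar NLSE (one common convention):
#   dA/dz = -i (beta2 / 2) d^2A/dt^2 + i gamma |A|^2 A
N, T = 4096, 400e-12                       # samples, time window (s)
dt = T / N
t = (np.arange(N) - N // 2) * dt
w = 2 * np.pi * np.fft.fftfreq(N, dt)      # angular frequency grid

beta2 = -21.7e-27    # s^2/m, anomalous dispersion of standard fibre (assumed)
gamma = 1.3e-3       # nonlinear coefficient, 1/(W m) (assumed)
dz, nz = 100.0, 2000                       # 100 m steps over a 200 km section

A = np.sqrt(5e-3) / np.cosh(t / 10e-12)    # 10 ps sech pulse, 5 mW peak (assumed)
lin = np.exp(0.5j * beta2 * w**2 * dz)     # dispersion operator per step
for _ in range(nz):
    A = np.fft.ifft(lin * np.fft.fft(A))            # linear (dispersion) step
    A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)  # nonlinear (Kerr) step
print("output peak power (mW):", 1e3 * np.abs(A).max()**2)
```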

Relevance: 20.00%

Abstract:

We propose a 2R regeneration scheme based on a nonlinear optical loop mirror and optical filtering. The feasibility of wavelength-division multiplexing operation at 40 Gbit/s is numerically demonstrated. We examine the characteristics of one-step regeneration and discuss networking applications.

Relevance: 20.00%

Abstract:

To guarantee QoS for multicast transmission, admission control of multicast sessions is needed. The probe-based multicast admission control (PBMAC) scheme is a simple and scalable approach. However, PBMAC suffers from the subsequent request problem, which can significantly reduce the maximum number of multicast sessions that a network can admit. In this letter, we describe the subsequent request problem and propose an enhanced PBMAC scheme to solve it. The enhanced scheme makes use of complementary probing and remarking, which require only minor modifications to the original scheme. Using a fluid-based analytical model, we are able to prove that the enhanced scheme can always admit a larger number of multicast sessions. Furthermore, we validate the analytical model using packet-based simulation. Copyright © 2005 The Institute of Electronics, Information and Communication Engineers.
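The idea behind probe-based admission control can be shown in a few lines: offer probe traffic at the session's rate, measure the loss (or marking) it experiences at the bottleneck, and admit only if it stays under a threshold. Everything below (link capacity, probe length, threshold) is an illustrative assumption; the letter's complementary probing and remarking refinements are not modelled.

```python
CAPACITY = 100      # packets per tick the bottleneck link can forward (assumed)
BACKGROUND = 80     # load from already-admitted sessions (assumed)

def probe(session_rate, n_ticks=200, loss_threshold=0.01):
    """Send probe traffic at the session's rate and admit only if the
    measured probe loss ratio stays below the threshold."""
    lost = sent = 0.0
    for _ in range(n_ticks):
        offered = BACKGROUND + session_rate
        sent += session_rate
        if offered > CAPACITY:
            # drops shared proportionally between probe and background packets
            lost += session_rate * (offered - CAPACITY) / offered
    return lost / sent <= loss_threshold

print("admit 10 pkt/tick session:", probe(10))   # 90 <= 100: admitted
print("admit 30 pkt/tick session:", probe(30))   # 110 > 100: rejected
```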

Relevance: 20.00%

Abstract:

The use of hMSCs for allogeneic therapies requiring lot sizes of billions of cells will necessitate large-scale culture techniques such as the expansion of cells on microcarriers in bioreactors. Whilst much research investigating hMSC culture on microcarriers has focused on growth, much less addresses their harvesting for passaging or as a step towards cryopreservation and storage. A successful new harvesting method has recently been outlined for cells grown on SoloHill microcarriers in a 5 L bioreactor [1]. Here, this new method is set out in detail, harvesting being defined as a two-step process involving cell 'detachment' from the microcarriers' surface followed by the 'separation' of the two entities. The new detachment method is based on theoretical concepts originally developed for secondary nucleation due to agitation. Based on this theory, it is suggested that a short period (here 7 min) of intense agitation in the presence of a suitable enzyme should detach the cells from the relatively large microcarriers. In addition, once detached, the cells should not be damaged because they are smaller than the Kolmogorov microscale. Detachment was then successfully achieved for hMSCs from two different donors using microcarrier/cell suspensions of up to 100 mL in a spinner flask. In both cases, harvesting was completed by separating cells from microcarriers using a Steriflip® vacuum filter. The overall harvesting efficiency was >95% and, after harvesting, the cells maintained all the attributes expected of hMSCs. The underlying theoretical concepts suggest that the method is scalable, and this aspect is also discussed. © 2014 The Authors.
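The size argument in the abstract reduces to a one-line calculation of the Kolmogorov microscale, eta = (nu^3 / epsilon)^(1/4). The dissipation rate used below is an assumed illustrative value, not a figure from the paper.

```python
# Kolmogorov microscale eta = (nu**3 / eps) ** 0.25.
nu = 1e-6    # kinematic viscosity of the medium, m^2/s (roughly water)
eps = 0.1    # mean specific energy dissipation rate, W/kg (assumed value)

eta = (nu**3 / eps) ** 0.25
print(f"Kolmogorov microscale: {eta * 1e6:.0f} um")     # about 56 um here

# A detached hMSC (~15-20 um) is smaller than eta, so it sits inside the
# smallest eddies and escapes damage; microcarriers (~170-250 um) are larger,
# which is what lets agitation drive detachment.
cell_diameter = 18e-6
print("cell smaller than eta:", cell_diameter < eta)
```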

Relevance: 20.00%

Abstract:

Computational and communication complexities call for distributed, robust, and adaptive control. This paper proposes a promising bottom-up design of distributed control in which simple controllers are responsible for individual nodes. The desired overall behavior of the network can be achieved by interconnecting such controlled loops, for example in cascade control, and by enabling the individual nodes to share information with their neighbors, without aiming at an unattainable global solution. The problem is addressed by employing a fully probabilistic design, which can cope with inherent uncertainties, can be implemented adaptively, and provides a systematic and rich way of sharing information. This paper elaborates the overall solution, applies it to the linear-Gaussian case, and provides simulation results.
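For readers unfamiliar with fully probabilistic design, its defining optimisation can be stated in one line. This is the standard formulation of FPD, not the paper's distributed extension.

```latex
% Fully probabilistic design: among admissible randomised controllers c,
% choose the one whose closed-loop joint pdf f(d | c) over the data d is
% closest, in Kullback-Leibler divergence, to a designer-chosen ideal pdf f^I:
\[
  c^{\ast} \;=\; \arg\min_{c}\,
  \mathcal{D}\!\left( f(d \mid c) \,\middle\|\, f^{I}(d) \right)
  \;=\; \arg\min_{c} \int f(d \mid c)\,
  \ln\!\frac{f(d \mid c)}{f^{I}(d)}\,\mathrm{d}d .
\]
```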

Relevance: 20.00%

Abstract:

The development of novel, affordable and efficacious therapeutics will be necessary to ensure continued progression in the standard of global healthcare. With the potential to address previously unmet patient needs as well as tackling the social and economic effects of chronic and age-related conditions, cell therapies will lead the new generation of healthcare products set to improve health and wealth across the globe. However, if the many small to medium enterprises (SMEs) engaged in much of the commercialization effort are to successfully traverse the ‘Valley of Death’ as they progress through clinical trials, a number of challenges must be overcome. These challenges are no longer purely biological: a series of engineering and manufacturing issues must also be considered and addressed.

Relevance: 20.00%

Abstract:

The advent of smart TVs has reshaped the TV-consumer interaction by combining TVs with mobile-like applications and access to the Internet. However, consumers are still unable to seamlessly interact with the content being streamed. An example of such a limitation is TV shopping, in which a consumer purchases a product or item displayed in the current TV show. Currently, consumers can only stop the current show and attempt to find a similar item on the Web or in an actual store. It would be more convenient if the consumer could interact with the TV to purchase interesting items.

Towards the realization of TV shopping, this dissertation proposes a scalable multimedia content processing framework. Two main challenges in TV shopping are addressed: the efficient detection of products in the content stream, and the retrieval of similar products given a consumer-selected product. The proposed framework consists of three components. The first component performs computationally and temporally aware multimedia abstraction to select a reduced number of frames that summarize the important information in the video stream. By both reducing the number of frames and taking into account the computational cost of the subsequent detection phase, this component allows the efficient detection of products in the stream. The second component realizes the detection phase. It executes scalable product detection using multi-cue optimization. Additional information cues are formulated into an optimization problem that allows the detection of complex products, i.e., those that do not have a rigid form and can appear in various poses. After the second component identifies products in the video stream, the consumer can select an interesting one for which similar ones must be located in a product database. To this end, the third component of the framework consists of an efficient, multi-dimensional, tree-based indexing method for multimedia databases. The proposed index mechanism serves as the backbone of the search. Moreover, it is able to efficiently bridge the semantic gap and perception subjectivity issues during the retrieval process to provide more relevant results.
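As a concrete stand-in for the retrieval backbone, the snippet below indexes product feature vectors with a stock kd-tree and answers a nearest-neighbour query. The 64-dimensional descriptors and catalogue are synthetic placeholders, and a plain kd-tree does not provide the semantic-gap handling that the dissertation proposes.

```python
import numpy as np
from scipy.spatial import cKDTree

# Synthetic catalogue: one feature descriptor per product (placeholder data).
rng = np.random.default_rng(0)
catalog = rng.normal(size=(10_000, 64))
tree = cKDTree(catalog)

# A consumer-selected product: a noisy version of catalogue item 42.
query = catalog[42] + 0.05 * rng.normal(size=64)
distances, ids = tree.query(query, k=5)   # five most similar products
print(ids)                                # product 42 should rank first
```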

Relevance: 20.00%

Abstract:

The difluoromethyl-allo-threonyl hydroxamate-based compound LPC-058 is a potent inhibitor of UDP-3-O-(R-3-hydroxymyristoyl)-N-acetylglucosamine deacetylase (LpxC) in Gram-negative bacteria. A scalable synthesis of this compound is described. The key step in the synthetic sequence is a transition metal/base-catalyzed aldol reaction of methyl isocyanoacetate and difluoroacetone, giving rise to a 4-(methoxycarbonyl)-5,5-disubstituted 2-oxazoline. A simple NMR-based determination of enantiomeric purity is also described.
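For context, NMR-based enantiopurity determinations reduce to comparing the integrals of two resolved signals, one per enantiomer-derived species. The worked numbers below are hypothetical; the abstract reports none.

```python
# ee = (major - minor) / (major + minor) from two baseline-resolved signals.
major, minor = 97.2, 2.8        # hypothetical relative NMR integrals

ee = (major - minor) / (major + minor) * 100
print(f"enantiomeric excess: {ee:.1f}% ee")   # 94.4% ee
```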

Relevance: 20.00%

Abstract:

Constant technology advances have caused a data explosion in recent years. Accordingly, modern statistical and machine learning methods must be adapted to deal with complex and heterogeneous data types. This is particularly true for biological data. For example, DNA sequence data can be viewed as categorical variables, with each nucleotide taking one of four categories. Gene expression data, depending on the quantification technology, may be continuous measurements or counts. With the advancement of high-throughput technology, such data have become unprecedentedly rich. Efficient statistical approaches are therefore crucial in this big-data era.

Previous statistical methods for big data often aim to find low-dimensional structures in the observed data. For example, a factor analysis model assumes a latent Gaussian-distributed multivariate vector; with this assumption, a factor model produces a low-rank estimate of the covariance of the observed variables. Another example is the latent Dirichlet allocation model for documents, in which the mixture proportions of topics are represented by a Dirichlet-distributed variable. This dissertation proposes several novel extensions of these statistical methods, developed to address challenges in big data. The novel methods are applied in multiple real-world applications, including the construction of condition-specific gene co-expression networks, estimating shared topics among newsgroups, analysis of promoter sequences, analysis of political-economic risk data, and estimating population structure from genotype data.
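The low-rank structure mentioned above has a compact form: a k-factor model implies cov(x) = L L^T + Psi, with a p-by-k loading matrix L and diagonal noise Psi. The sketch below simulates such data and recovers the structure with scikit-learn; the dimensions and parameters are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
p, k, n = 50, 3, 2000                       # variables, factors, samples (assumed)
L = rng.normal(size=(p, k))                 # true loadings
psi = 0.5 * np.ones(p)                      # diagonal noise variances
X = rng.normal(size=(n, k)) @ L.T + np.sqrt(psi) * rng.normal(size=(n, p))

fa = FactorAnalysis(n_components=k).fit(X)
# Implied covariance: rank-k loading part plus diagonal noise.
Sigma_hat = fa.components_.T @ fa.components_ + np.diag(fa.noise_variance_)
print("loading part rank:", np.linalg.matrix_rank(fa.components_))        # k
print("max |Sigma_hat - sample cov|:", np.abs(Sigma_hat - np.cov(X.T)).max())
```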

Relevance: 20.00%

Abstract:

This thesis explores methods for the fabrication of nanohole arrays, and their integration into a benchtop system for use as sensors or anti-counterfeit labels. Chapter 1 gives an introduction to plasmonics, and more specifically to nanohole arrays and their potential as label-free sensors compared to the current biosensors on the market. Various fabrication methods are explored, including Focused Ion Beam, Electron Beam Lithography, Nanoimprint Lithography, Template Stripping and Phase Shift Lithography. Focused Ion Beam (FIB) was chosen to fabricate the nanohole arrays due to its suitability for rapid prototyping and its relatively low cost. In chapter 2 the fabrication of nanohole arrays using FIB is described, and the samples are characterised. The fabricated nanohole arrays are tested as bulk refractive index sensors, before a bioassay using whole-molecule human IgG antibodies and antigen is developed and performed on the sensor. In chapter 3 the fabricated sensors are integrated into a custom-built system capable of real-time, multiplexed detection of biomolecules. Here, scFv antibodies against two biomolecules relevant to the detection of pancreatic cancer (C1q and C3) are attached to the nanohole arrays, and detection of their complementary proteins is demonstrated both in buffer (10 nM detection of C1q Ag) and in human serum. Chapter 4 explores arrays of anisotropic (elliptical) nanoholes and shows how the shape anisotropy induces polarisation-sensitive transmission spectra, in both simulations and fabricated arrays. The potential use of such samples as visible and NIR tags for anti-counterfeiting applications is demonstrated. Finally, chapter 5 gives a summary of the work completed and discusses potential future work in this area.
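For orientation, the spectral positions of a square nanohole array's transmission peaks are often estimated from the SPP momentum-matching condition, lambda(i, j) ≈ (P / sqrt(i^2 + j^2)) * sqrt(eps_m * eps_d / (eps_m + eps_d)). The period and the single gold permittivity value below are assumptions; in reality eps_m is dispersive and hole shape shifts the peaks, so this is only approximate.

```python
import numpy as np

P = 600e-9        # array period (assumed)
eps_m = -25.0     # Re(eps) of gold near 800 nm (single assumed value)
eps_d = 1.77      # water (n = 1.33), the sensing medium

# Lowest-order grating modes of a square array.
for i, j in [(1, 0), (1, 1)]:
    lam = P / np.hypot(i, j) * np.sqrt(eps_m * eps_d / (eps_m + eps_d))
    print(f"({i},{j}) SPP mode: {lam * 1e9:.0f} nm")
```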

Relevance: 20.00%

Abstract:

Uncertainty quantification (UQ) is both an old and a new concept. The current novelty lies in the interactions and synthesis of mathematical models, computer experiments, statistics, field/real experiments, and probability theory, with a particular emphasis on large-scale simulations by computer models. The challenges come not only from the complexity of the scientific questions, but also from the sheer size of the information. The focus of this thesis is to provide statistical models that are scalable to the massive data produced in computer experiments and real experiments, through fast and robust statistical inference.

Chapter 2 provides a practical approach for simultaneously emulating/approximating a massive number of functions, with an application to hazard quantification of the Soufrière Hills volcano on the island of Montserrat. Chapter 3 discusses another problem with massive data, in which the number of observations of a function is large; an exact algorithm that is linear in time is developed for the interpolation of methylation levels. Chapters 4 and 5 both concern robust inference. Chapter 4 provides a new robustness criterion for parameter estimation, and several inference approaches are shown to satisfy it. Chapter 5 develops a new prior that satisfies additional criteria and is thus proposed for use in practice.
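A minimal sketch of the generic building block behind Chapter 2's emulation: Gaussian-process regression on a handful of simulator runs. The kernel, its hyperparameters and the toy "simulator" are assumptions; the thesis's contribution is scaling this to a massive number of functions, which this sketch does not attempt.

```python
import numpy as np

def kernel(a, b, ls=0.3, var=1.0):
    """Squared-exponential covariance between two 1-d input sets."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

x_train = np.linspace(0, 1, 12)
y_train = np.sin(6 * x_train)              # stand-in for expensive simulator runs
x_test = np.linspace(0, 1, 200)

K = kernel(x_train, x_train) + 1e-8 * np.eye(len(x_train))   # jitter for stability
Ks = kernel(x_test, x_train)

mean = Ks @ np.linalg.solve(K, y_train)                       # posterior mean
cov = kernel(x_test, x_test) - Ks @ np.linalg.solve(K, Ks.T)  # posterior covariance
sd = np.sqrt(np.clip(np.diag(cov), 0.0, None))
print("max predictive sd between runs:", sd.max())
```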