19 results for cosmology: large-scale structure of Universe
in the Aston University Research Archive
Abstract:
The production of recombinant therapeutic proteins is an active area of research in drug development. These bio-therapeutic drugs target nearly 150 disease states and promise to bring better treatments to patients. However, if new bio-therapeutics are to be made more accessible and affordable, improvements in production performance and optimization of processes are necessary. A major challenge lies in controlling the effect of process conditions on the production of intact functional proteins. To achieve this, improved tools are needed for bio-processing. For example, the implementation of process modeling and high-throughput technologies can be used to achieve quality by design, leading to improvements in productivity. Commercially, the most sought-after targets are secreted proteins, due to the ease of handling in downstream procedures. This chapter outlines different approaches for the production and optimization of secreted proteins in the host Pichia pastoris. © 2012 Springer Science+Business Media, LLC.
Abstract:
Some color centers in diamond can serve as quantum bits which can be manipulated with microwave pulses and read out with a laser, even at room temperature. However, the photon collection efficiency of bulk diamond is greatly reduced by refraction at the diamond/air interface. To address this issue, we fabricated arrays of diamond nanostructures, differing in both diameter and top-end shape, with HSQ and Cr as the etching mask materials, aiming toward large-scale fabrication of single-photon sources with enhanced collection efficiency made of nitrogen-vacancy (NV) embedded diamond. With a mixture of O2 and CHF3 gas plasma, diamond pillars with diameters down to 45 nm were obtained. The evolution of the top-end shape has been represented with a simple model. Tests of size-dependent single-photon properties confirmed a single-photon collection efficiency enhancement of more than tenfold, and a mild decrease of decoherence time with decreasing pillar diameter was observed, as expected. These results provide useful information for future applications of nanostructured diamond as a single-photon source.
Abstract:
The atomic-scale structure of sodium borophosphates made by the sol-gel method is compared to that of materials made by the melt-quench method. It is found that although the sol-gel generated materials have a higher tendency towards crystallization, they nevertheless show a qualitatively similar crystallization trend with composition to their melt-quench analogues; the progressive introduction of boron oxide into the phosphate network initially inhibits, then promotes, crystallization. At the composition associated with the most stable amorphous sodium borophosphate (20 mol% boron oxide), it is found that the atomic-scale structure of the sol-gel synthesized network glass is almost identical to that of the corresponding melt-quenched one.
Abstract:
Large-scale massively parallel molecular dynamics (MD) simulations of the human class I major histocompatibility complex (MHC) protein HLA-A*0201 bound to a decameric tumor-specific antigenic peptide GVYDGREHTV were performed using a scalable MD code on high-performance computing platforms. Such computational capabilities put us in reach of simulations of various scales and complexities. The supercomputing resources available for this study allow us to compare directly differences in the behavior of very large molecular models; in this case, the entire extracellular portion of the peptide–MHC complex vs. the isolated peptide binding domain. Comparison of the results from the partial and the whole system simulations indicates that the peptide is less tightly bound in the partial system than in the whole system. From a detailed study of conformations, solvent-accessible surface area, the nature of the water network structure, and the binding energies, we conclude that, when considering the conformation of the α1–α2 domain, the α3 and β2m domains cannot be neglected. © 2004 Wiley Periodicals, Inc. J Comput Chem 25: 1803–1813, 2004
Abstract:
Data Envelopment Analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of Decision Making Units (DMUs). DEA for a large dataset with many inputs/outputs would require huge computer resources in terms of memory and CPU time. This paper proposes a neural network back-propagation Data Envelopment Analysis to address this problem for the very large-scale datasets now emerging in practice. The neural network's requirements for computer memory and CPU time are far lower than those of conventional DEA methods, so it can be a useful tool for measuring the efficiency of large datasets. Finally, the back-propagation DEA algorithm is applied to five large datasets and the results are compared with those obtained by conventional DEA.
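The core idea of a back-propagation DEA can be sketched as follows: train a small feed-forward network on a sample of DMUs with efficiency-style targets, so that the trained network scores further units without solving one linear program per DMU. Everything below is a hypothetical illustration, not the paper's implementation: the toy data, the single-hidden-layer architecture, and the normalized output/input ratio used as a stand-in for a true DEA efficiency score are all assumptions.

```python
import math
import random

random.seed(0)

# Toy DMU data: (input, output) pairs. As a stand-in for true DEA scores,
# the target is each unit's output/input ratio normalized by the best ratio.
dmus = [(random.uniform(1.0, 10.0), random.uniform(1.0, 10.0)) for _ in range(50)]
ratios = [o / i for i, o in dmus]
best = max(ratios)
targets = [r / best for r in ratios]          # efficiency-like scores in (0, 1]
xs = [(i / 10.0, o / 10.0) for i, o in dmus]  # scale features into (0, 1]

# One hidden layer of sigmoid units trained by plain back-propagation.
H, LR, EPOCHS = 4, 0.3, 1500
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(H)]
    return h, sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in zip(xs, targets)) / len(xs)

initial_loss = mse()
for _ in range(EPOCHS):
    for x, t in zip(xs, targets):
        h, y = forward(x)
        dy = (y - t) * y * (1.0 - y)                # output-layer delta
        for j in range(H):
            dh = dy * w2[j] * h[j] * (1.0 - h[j])   # hidden-layer delta
            w2[j] -= LR * dy * h[j]
            w1[j][0] -= LR * dh * x[0]
            w1[j][1] -= LR * dh * x[1]
            b1[j] -= LR * dh
        b2 -= LR * dy
final_loss = mse()
```

Once trained, scoring a new DMU is a single forward pass, which is where the memory and CPU savings over per-unit linear programming would come from.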
Abstract:
In this thesis, patterns of working hours in large-scale grocery retailing in Britain and France are compared. The research uses a cross-national comparative methodology, and the analysis is based on information derived from secondary sources and on empirical research in large-scale grocery retailing involving employers and trade unions at industry level and case studies at outlet level. The thesis begins by comparing national patterns of working hours in Britain and France over the post-war period. Subsequently, a detailed comparison of working hours in large-scale grocery retailing in the two countries is carried out through the analysis of secondary sources and empirical data, with emphasis on part-time working hours. These are compared and contrasted at national level and explained in terms of supply and demand factors. The relationships between the structuring of, and satisfaction with, working hours and the factors determining women's integration in the workforce in Britain and France are investigated. Part-time hours in large-scale grocery retailing are then compared and contrasted in the context of this analysis of working hours. The relationship between the structuring of working hours and satisfaction with them is examined in both countries through research with women part-timers in case-study outlets. The cross-national comparative methodology is used to examine whether dissimilar national contexts in Britain and France have led to different patterns of working hours in large-scale grocery retailing. The principal conclusion is that significant differences are found in the length, organization, and flexibility of working hours, and that these differences can be attributed to dissimilar socio-economic, political, and cultural contexts in the two countries.
Abstract:
This thesis introduces and develops a novel real-time predictive maintenance system to estimate the machine system parameters using the motion current signature. Recently, motion current signature analysis has been addressed as an alternative to the use of sensors for monitoring internal faults of a motor. A maintenance system based upon the analysis of the motion current signature avoids the need for the implementation and maintenance of expensive motion sensing technology. By developing nonlinear dynamical analysis for the motion current signature, the research described in this thesis implements a novel real-time predictive maintenance system for current and future manufacturing machine systems. A crucial concept underpinning this project is that the motion current signature contains information relating to the machine system parameters, and that this information can be extracted using nonlinear mapping techniques, such as neural networks. Towards this end, a proof-of-concept procedure is performed, which substantiates this concept. A simulation model, TuneLearn, is developed to simulate the large amount of training data required by the neural network approach. Statistical validation and verification of the model are performed to ascertain confidence in the simulated motion current signature. The validation experiment concludes that, although the simulation model generates a good macro-dynamical mapping of the motion current signature, it fails to accurately map the micro-dynamical structure, owing to a lack of knowledge regarding the performance of higher-order and nonlinear factors, such as backlash and compliance. The failure of the simulation model to determine the micro-dynamical structure suggests the presence of nonlinearity in the motion current signature, which motivated surrogate data testing for nonlinearity.
Results confirm the presence of nonlinearity in the motion current signature, thereby motivating the use of nonlinear techniques for further analysis. Finally, a linear reverse algorithm, BJEST, is developed and applied to the motion current signature to estimate the machine system parameters. Outcomes of the experiment show that nonlinear noise reduction combined with this linear reverse algorithm offers precise machine system parameter estimation using the motion current signature for the implementation of the real-time predictive maintenance system.
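The surrogate data test mentioned above follows a standard recipe: compute a discriminating statistic on the series, recompute it on an ensemble of surrogates that destroy the structure under test, and check whether the original value falls outside the surrogate distribution. The sketch below is a simplified, hypothetical version: it uses a synthetic logistic-map series rather than a motion current signature, shuffled surrogates rather than the phase-randomized (Fourier) surrogates more usual in nonlinearity testing, and a time-reversal-asymmetry statistic.

```python
import random

random.seed(1)

# Synthetic nonlinear series: the chaotic logistic map.
x = [0.4]
for _ in range(499):
    x.append(4.0 * x[-1] * (1.0 - x[-1]))

def time_reversal_asymmetry(series, lag=1):
    """Third-moment statistic; near zero for time-reversible (e.g. linear Gaussian) data."""
    diffs = [(series[t + lag] - series[t]) ** 3 for t in range(len(series) - lag)]
    return sum(diffs) / len(diffs)

stat = abs(time_reversal_asymmetry(x))

# Shuffled surrogates keep the marginal distribution but destroy all
# temporal structure, so their statistic should collapse toward zero.
surrogate_stats = []
for _ in range(99):
    s = x[:]
    random.shuffle(s)
    surrogate_stats.append(abs(time_reversal_asymmetry(s)))

# Rank-based check: count how many surrogates the original statistic beats.
n_exceeded = sum(stat > v for v in surrogate_stats)
```

If `n_exceeded` is close to the full ensemble size, the null hypothesis that the statistic arises from structureless data is rejected; a phase-randomization scheme would instead test the sharper null of a linear Gaussian process.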
Abstract:
T-cell activation requires interaction of T-cell receptors (TCR) with peptide epitopes bound by major histocompatibility complex (MHC) proteins. This interaction occurs at a special cell-cell junction known as the immune or immunological synapse. Fluorescence microscopy has shown that the interplay among one agonist peptide-MHC (pMHC), one TCR and one CD4 provides the minimum complexity needed to trigger transient calcium signalling. We describe a computational approach to the study of the immune synapse. Using molecular dynamics simulation, we report here on a study of the smallest viable model, a TCR-pMHC-CD4 complex in a membrane environment. The computed structural and thermodynamic properties are in fair agreement with experiment. A number of biomolecules participate in the formation of the immunological synapse. Multi-scale molecular dynamics simulations may be the best opportunity we have to reach a full understanding of this remarkable supra-macromolecular event at a cell-cell junction.
Abstract:
Different procurement decisions taken by relief organizations can have considerably different implications for the transport, storage, and distribution of humanitarian aid, and can ultimately influence the performance of the humanitarian supply chain and the delivery of the humanitarian aid. In this article, we look into what resources are needed and how these resources evolve in the delivery of humanitarian aid. Drawing on the resource-based view of the firm, we develop a framework to categorize the impact of local resources on the configuration of humanitarian supply chains. In contrast to other papers, the importance of localizing the configuration of the humanitarian supply chain is not only conceptually recognized, but empirical investigations are also provided. In terms of methodology, this article is based on the analysis of secondary data from two housing reconstruction projects. Findings indicate that the use of local resources in humanitarian aid has positive effects on a program's overall supply chain performance, and that these effects are not only related to the macroeconomic perspective: the benefits extend to improvements related to the use of knowledge. At the same time, it was found that local sourcing often comes with a number of problems; in one of the cases, for example, significant problems arose from the scarcity of local supplies. Both housing reconstruction projects indicated a continuous need for changes throughout the programs, as a dynamic supply chain configuration is important for the long-term sustainability of reconstruction aid. © 2014 Decision Sciences Institute.
Abstract:
This study presents a computational fluid dynamics (CFD) study of dimethyl ether steam reforming (DME-SR) in a large-scale circulating fluidized bed (CFB) reactor. The CFD model is based on an Eulerian-Eulerian dispersed flow formulation and solved using commercial software (ANSYS FLUENT). The DME-SR reaction scheme and kinetics in the presence of a bifunctional catalyst of CuO/ZnO/Al2O3 + ZSM-5 were incorporated in the model using an in-house developed user-defined function. The model was validated by comparing its predictions with experimental data from the literature. The results revealed for the first time detailed CFB reactor hydrodynamics, gas residence time, temperature distribution, and product gas composition at a selected operating condition of 300 °C and a steam-to-DME mass ratio of 3 (molar ratio of 7.62). The spatial variation in the gas species concentrations suggests the existence of three distinct reaction zones but limited temperature variations. The DME conversion and hydrogen yield were found to be 87% and 59%, respectively, resulting in a product gas consisting of 72 mol% hydrogen. In part II of this study, the model presented here will be used to optimize the reactor design and to study the effect of operating conditions on the reactor performance and products.
Abstract:
Data envelopment analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of decision-making units (DMUs). The need for huge computer resources in terms of memory and CPU time is inevitable in DEA for a large-scale data set, especially one with negative measures. In recent years, a wide range of studies has been conducted in the area of combined artificial neural network and DEA methods. In this study, a supervised feed-forward neural network is proposed to evaluate the efficiency and productivity of large-scale data sets with negative values, in contrast to the corresponding DEA method. Results indicate that the proposed network has some computational advantages over the corresponding DEA models; it can therefore be considered a useful tool for measuring the efficiency of DMUs with (large-scale) negative data.
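One reason negative measures are troublesome is that conventional radial DEA models assume non-negative data, whereas a neural network only needs its features on a sensible numeric scale. A common preprocessing step in that setting is a per-column shift-and-scale translation. The snippet below is a hypothetical preprocessing sketch under that assumption, not the paper's method: it maps every measure, including negative ones, into [0, 1] column by column.

```python
def rescale_columns(data):
    """Min-max rescale each column into [0, 1]; handles negative values.

    `data` is a list of rows (one row per DMU). Constant columns map to 0.0.
    """
    cols = list(zip(*data))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    scaled = []
    for row in data:
        scaled.append([
            (v - l) / (h - l) if h > l else 0.0
            for v, l, h in zip(row, lo, hi)
        ])
    return scaled

# Example: the second measure (e.g. profit) is negative for one DMU.
dmus = [
    [5.0, -2.0],
    [3.0, 4.0],
    [8.0, 1.0],
]
scaled = rescale_columns(dmus)  # → [[0.4, 0.0], [0.0, 1.0], [1.0, 0.5]]
```

After this translation the data can be fed to a feed-forward network directly; note that min-max rescaling is order-preserving within each column, so relative standings of the DMUs on each measure are unchanged.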