854 results for large scale linear system


Relevance:

100.00%

Publisher:

Abstract:

T-cell activation requires interaction of T-cell receptors (TCR) with peptide epitopes bound by major histocompatibility complex (MHC) proteins. This interaction occurs at a special cell-cell junction known as the immune or immunological synapse. Fluorescence microscopy has shown that the interplay among one agonist peptide-MHC (pMHC), one TCR and one CD4 provides the minimum complexity needed to trigger transient calcium signalling. We describe a computational approach to the study of the immune synapse. Using molecular dynamics simulation, we report here on a study of the smallest viable model, a TCR-pMHC-CD4 complex in a membrane environment. The computed structural and thermodynamic properties are in fair agreement with experiment. A number of biomolecules participate in the formation of the immunological synapse. Multi-scale molecular dynamics simulations may be the best opportunity we have to reach a full understanding of this remarkable supra-macromolecular event at a cell-cell junction.
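
As a concrete illustration of one routine structural property computed from such simulations, the sketch below evaluates the RMSD of a trajectory frame against the starting structure. This is a generic analysis step, not the authors' pipeline; the coordinate arrays are random stand-ins and are assumed to be already superposed onto a common frame.

    import numpy as np

    def rmsd(frame, reference):
        # Root-mean-square deviation between two (n_atoms, 3) coordinate
        # arrays; both are assumed already superposed onto a common frame.
        return np.sqrt(np.mean(np.sum((frame - reference) ** 2, axis=1)))

    reference = np.random.rand(500, 3)                        # stand-in coordinates
    frame = reference + np.random.normal(0.0, 0.05, (500, 3))
    print(f"RMSD = {rmsd(frame, reference):.3f}")             # units follow the input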

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we study an area localization problem in large-scale Underwater Wireless Sensor Networks (UWSNs). The limited bandwidth, the severely impaired channel and the cost of underwater equipment all make the underwater localization problem very challenging, and exact localization is very difficult for UWSNs in deep underwater environments. We propose a mobile-DET-based efficient 3D multi-power Area Localization Scheme (3D-MALS) to address this challenging problem. The proposed scheme combines the ideas of the 2D multi-power Area Localization Scheme (2D-ALS) [6] with a Detachable Elevator Transceiver (DET) to achieve simplicity, localization accuracy, scalability and low cost. The DET can rise and descend to broadcast its position, and it is assumed that all underwater nodes have pressure sensors and therefore know their z coordinates. The simulation results show that our proposed scheme is very efficient. © 2009 IEEE.
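
A minimal sketch of the area-localization idea follows: because each node knows its depth, localization reduces to 2D, and hearing a beacon first at a given power level confines the node to an annulus around the broadcast position. The power-level ranges, grid resolution and centroid estimate below are illustrative assumptions, not the 3D-MALS implementation.

    import numpy as np

    POWER_RANGES = [100.0, 200.0, 300.0]   # assumed ranges (m) per power level

    def annulus_mask(gx, gy, beacon, level):
        # Node hears the beacon first at `level`: inside that range,
        # outside the next-lower one.
        d = np.hypot(gx - beacon[0], gy - beacon[1])
        inside = d <= POWER_RANGES[level]
        outside = d > POWER_RANGES[level - 1] if level > 0 else np.ones(d.shape, bool)
        return inside & outside

    def estimate_xy(beacons, levels, extent=1000.0, step=5.0):
        # Intersect the annuli from all DET broadcasts; return the centroid
        # of the feasible area (z is known from the node's pressure sensor).
        xs = np.arange(0.0, extent, step)
        gx, gy = np.meshgrid(xs, xs)
        feasible = np.ones(gx.shape, bool)
        for beacon, level in zip(beacons, levels):
            feasible &= annulus_mask(gx, gy, beacon, level)
        return (gx[feasible].mean(), gy[feasible].mean()) if feasible.any() else None

    print(estimate_xy(beacons=[(300.0, 400.0), (600.0, 400.0)], levels=[1, 2]))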

Relevance:

100.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance:

100.00%

Publisher:

Abstract:

Different procurement decisions taken by relief organizations can have considerably different implications for the transport, storage and distribution of humanitarian aid, and can ultimately influence the performance of the humanitarian supply chain and the delivery of the aid itself. In this article, we look into what resources are needed and how these resources evolve in the delivery of humanitarian aid. Drawing on the resource-based view of the firm, we develop a framework to categorize the impact of local resources on the configuration of humanitarian supply chains. In contrast to other papers, the importance of localizing the configuration of the humanitarian supply chain is not only recognized conceptually but also investigated empirically. In terms of methodology, this article is based on the analysis of secondary data from two housing reconstruction projects. Findings indicate that the use of local resources in humanitarian aid has positive effects on a program's overall supply chain performance, and these effects are not limited to the macroeconomic perspective: the benefits extend to improvements in the use of knowledge. At the same time, local sourcing often comes with problems; in one of the cases, for example, the scarcity of local supplies caused significant difficulties. Both housing reconstruction projects indicated a continuous need for changes throughout the programs, as a dynamic supply chain configuration is important for the long-term sustainability of reconstruction aid. © 2014 Decision Sciences Institute.

Relevance:

100.00%

Publisher:

Abstract:

This study presents a computational fluid dynamics (CFD) study of dimethyl ether steam reforming (DME-SR) in a large-scale Circulating Fluidized Bed (CFB) reactor. The CFD model is based on an Eulerian-Eulerian dispersed flow formulation and solved using commercial software (ANSYS FLUENT). The DME-SR reaction scheme and kinetics in the presence of a bifunctional CuO/ZnO/Al2O3 + ZSM-5 catalyst were incorporated in the model using an in-house developed user-defined function. The model was validated by comparing the predictions with experimental data from the literature. The results reveal, for the first time, detailed CFB reactor hydrodynamics, gas residence time, temperature distribution and product gas composition at a selected operating condition of 300 °C and a steam-to-DME mass ratio of 3 (molar ratio of 7.62). The spatial variation in the gas species concentrations suggests the existence of three distinct reaction zones but limited temperature variations. The DME conversion and hydrogen yield were found to be 87% and 59% respectively, resulting in a product gas consisting of 72 mol% hydrogen. In part II of this study, the model presented here will be used to optimize the reactor design and study the effect of operating conditions on reactor performance and products.
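
For reference, the quoted molar ratio can be recovered from the mass ratio with standard molar masses. The short check below assumes M(H2O) ≈ 18.02 g/mol and M(DME) ≈ 46.07 g/mol and lands near the paper's 7.62; the small difference comes down to the property values used.

    # Convert the steam-to-DME mass ratio of 3 into a molar ratio.
    M_H2O = 18.015        # g/mol
    M_DME = 46.07         # g/mol, dimethyl ether (CH3OCH3)

    mass_ratio = 3.0
    molar_ratio = mass_ratio * M_DME / M_H2O
    print(f"steam:DME molar ratio = {molar_ratio:.2f}")   # ~7.67 with these values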

Relevance:

100.00%

Publisher:

Abstract:

Data envelopment analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of decision-making units (DMUs). For a large-scale data set, especially one with negative measures, DEA inevitably demands substantial computer resources in terms of memory and CPU time. In recent years, a wide range of studies has been conducted on combined artificial neural network and DEA methods. In this study, a supervised feed-forward neural network is proposed to evaluate the efficiency and productivity of large-scale data sets with negative values, in contrast to the corresponding DEA method. Results indicate that the proposed network has some computational advantages over the corresponding DEA models; it can therefore be considered a useful tool for measuring the efficiency of DMUs with (large-scale) negative data.
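
A minimal sketch of the combined idea, under stated assumptions: a small feed-forward network is fitted to efficiency scores that a DEA model is presumed to have produced for a training subset, and can then score further DMUs cheaply. The architecture, the synthetic targets and the training loop are illustrative, not the paper's model.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic DMUs: four (possibly negative) measures each; the targets
    # stand in for efficiency scores precomputed by a DEA model.
    X = rng.normal(size=(200, 4))
    y = 1.0 / (1.0 + np.exp(-X @ np.array([0.5, -0.3, 0.8, 0.1])))

    # One hidden layer; sigmoid output keeps predicted scores in (0, 1).
    W1 = rng.normal(scale=0.5, size=(4, 16)); b1 = np.zeros(16)
    W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

    def forward(X):
        h = np.tanh(X @ W1 + b1)
        return h, 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

    lr = 0.5
    for _ in range(2000):                     # plain batch gradient descent
        h, p = forward(X)
        # Gradient of the squared error (constant factors folded into lr).
        grad_out = (p - y[:, None]) * p * (1.0 - p) / len(X)
        grad_h = (grad_out @ W2.T) * (1.0 - h ** 2)
        W2 -= lr * (h.T @ grad_out); b2 -= lr * grad_out.sum(0)
        W1 -= lr * (X.T @ grad_h);   b1 -= lr * grad_h.sum(0)

    _, p = forward(X)
    print("train MSE:", float(np.mean((p.ravel() - y) ** 2)))

The appeal is purely computational: a forward pass is cheap, whereas classical DEA solves an optimization problem for every DMU.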

Relevance:

100.00%

Publisher:

Abstract:

GraphChi is the first reported disk-based graph engine that can handle billion-scale graphs on a single PC efficiently, and it can execute several advanced data mining, graph mining and machine learning algorithms on very large graphs. With its novel parallel sliding windows (PSW) technique for loading subgraphs from disk to memory to update vertices and edges, it achieves data processing performance close to, and sometimes better than, that of mainstream distributed graph engines. The GraphChi authors note, however, that memory is not utilized effectively on large datasets, which leads to suboptimal computation performance. In this paper, motivated by the concepts of 'pin' from TurboGraph and 'ghost' from GraphLab, we propose a new memory utilization mode for GraphChi, called Part-in-memory mode, to improve performance. The main idea is to pin a fixed part of the data in memory during the whole computing process. Part-in-memory mode was implemented with only about 40 additional lines of code in the original GraphChi engine. Extensive experiments were performed with large real datasets (including a Twitter graph with 1.4 billion edges). The preliminary results show that the Part-in-memory memory management approach reduces GraphChi running time by up to 60% for the PageRank algorithm. Interestingly, a larger portion of data pinned in memory does not always lead to better performance when the whole dataset cannot fit in memory: there exists an optimal portion of data to keep in memory for the best computational performance.
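
The sketch below illustrates the pinning idea with an invented disk-backed vertex-value store: values for a fixed prefix of vertices stay in RAM for the whole run, and everything else goes through a memory-mapped file. The class and file names are hypothetical stand-ins, not GraphChi's API.

    import numpy as np

    class PartInMemoryValues:
        # Vertex values [0, pinned_count) live in RAM for the whole run;
        # the rest go through a disk-backed (memory-mapped) array.

        def __init__(self, path, num_vertices, pinned_fraction=0.3):
            self.pinned_count = int(num_vertices * pinned_fraction)
            self.disk = np.memmap(path, dtype=np.float64, mode="r+",
                                  shape=(num_vertices,))
            self.pinned = np.array(self.disk[:self.pinned_count])  # RAM copy

        def get(self, v):
            return self.pinned[v] if v < self.pinned_count else self.disk[v]

        def set(self, v, value):
            if v < self.pinned_count:
                self.pinned[v] = value          # no disk I/O on the hot path
            else:
                self.disk[v] = value

        def flush(self):
            self.disk[:self.pinned_count] = self.pinned  # write back once
            self.disk.flush()

    # Toy usage: create a small value file, then pin 30% of its entries.
    init = np.memmap("vertex_values.dat", dtype=np.float64, mode="w+", shape=(100,))
    init[:] = 0.0
    init.flush()
    del init
    store = PartInMemoryValues("vertex_values.dat", num_vertices=100)
    store.set(5, 1.0)
    store.set(80, 2.0)
    print(store.get(5), store.get(80))
    store.flush()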

Relevance:

100.00%

Publisher:

Abstract:

Human mesenchymal stem cell (hMSC) therapies have the potential to revolutionise the healthcare industry and replicate the success of the therapeutic protein industry; however, for this to be achieved, there is a need to apply key bioprocess engineering principles and adopt a quantitative approach to large-scale, reproducible hMSC bioprocess development. Here we provide a quantitative analysis of the changes in concentration of glucose, lactate and ammonium with time during hMSC monolayer culture over 4 passages, under 100% and 20% dissolved oxygen (dO2), where either a 100%, 50% or 0% growth medium exchange was performed after 72 h in culture. Yield coefficients, specific growth rates (h-1) and doubling times (h) were calculated for all cases. The 100% dO2 flasks outperformed the 20% dO2 flasks with respect to cumulative cell number, with the latter consuming more glucose and producing more lactate and ammonium. Furthermore, the 100% and 50% medium exchange conditions resulted in similar cumulative cell numbers, whilst those for the 0% condition were significantly lower. Cell immunophenotype and multipotency were not affected by the experimental culture conditions. This study demonstrates the importance of determining optimal culture conditions for hMSC expansion and highlights the potential cost savings of making only a 50% medium exchange, which may prove significant for large-scale bioprocessing. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
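
The quantities named above follow standard exponential-growth bookkeeping; a small sketch with made-up counts (not the study's data):

    import math

    def specific_growth_rate(n0, n1, dt_h):
        # mu (h^-1), assuming exponential growth between the two counts.
        return math.log(n1 / n0) / dt_h

    def doubling_time(mu):
        # Population doubling time (h).
        return math.log(2.0) / mu

    def yield_coefficient(cells_produced, substrate_consumed):
        # Y_X/S: cells produced per unit of substrate consumed.
        return cells_produced / substrate_consumed

    mu = specific_growth_rate(n0=1.0e5, n1=4.0e5, dt_h=72.0)   # hypothetical counts
    print(f"mu = {mu:.4f} 1/h, doubling time = {doubling_time(mu):.1f} h")
    print(f"Y = {yield_coefficient(3.0e5, 0.9):.2e} cells/mmol glucose")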

Relevance:

100.00%

Publisher:

Abstract:

Cell-based therapies, in which living cells and tissues are used as medicinal products, have the potential to contribute to global healthcare. Despite this potential, many challenges remain before the full value of this emerging field can be realized. Characterizing the input material of cell-based therapy bioprocesses across multiple donors is necessary to identify and understand the implications of input variation for process development. In this work, we have characterized bone-marrow-derived human mesenchymal stem cells (BM-hMSCs) from multiple donors and discussed the implications of the measured input variation for the development of autologous and allogeneic cell-based therapy manufacturing processes. The range of cumulative population doublings across the five BM-hMSC lines over 30 days of culture was 5.93, with an 18.2% range in colony-forming efficiency at the end of the culture process and a 55.1% difference in interleukin-6 production between the cell lines. This variation was demonstrated to produce a range of over 13 days in process time between the donor hMSC lines for a hypothetical product, creating potential batch-timing issues when manufacturing products for multiple patients. All BM-hMSC donor lines conformed to the ISCT criteria but differed in cell morphology. Metabolite analysis showed ranges across donors of 26.98 pmol cell−1 day−1 in glucose consumption, 29.45 pmol cell−1 day−1 in lactate production and 1.35 pmol cell−1 day−1 in ammonium production, demonstrating the extent of donor variability throughout the expansion process. Measuring informative product attributes during process development will facilitate progress towards consistent manufacturing processes, a critical step in the translation of cell-based therapies.
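
Two of the reported donor-comparison metrics can be computed as below; the formulas are standard, and the example numbers are illustrative rather than the study's data.

    import math

    def population_doublings(n_start, n_end):
        # Cumulative population doublings over a culture interval.
        return math.log2(n_end / n_start)

    def specific_rate(delta_pmol, avg_cell_number, days):
        # Per-cell metabolite rate (pmol cell^-1 day^-1), using a simple
        # average cell number over the interval.
        return delta_pmol / (avg_cell_number * days)

    # Hypothetical donor line: 1e5 -> 8e5 cells while consuming 20 umol
    # (2.0e7 pmol) of glucose over 3 days.
    pd = population_doublings(1.0e5, 8.0e5)                      # 3.00 doublings
    q_glc = specific_rate(2.0e7, avg_cell_number=4.5e5, days=3)  # ~14.8
    print(f"doublings = {pd:.2f}, q_glucose = {q_glc:.1f} pmol cell^-1 day^-1")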

Relevance:

100.00%

Publisher:

Abstract:

The seminal multiple view stereo benchmark evaluations from Middlebury and by Strecha et al. have played a major role in propelling the development of multi-view stereopsis methodology. Although seminal, these benchmark datasets are limited in scope, with few reference scenes. Here, we take these works a step further by proposing a new multi-view stereo dataset that is an order of magnitude larger in number of scenes and significantly more diverse. Specifically, we propose a dataset containing 80 scenes of large variability. Each scene consists of 49 or 64 accurate camera positions and reference structured-light scans, all acquired by a 6-axis industrial robot. To apply this dataset, we propose an extension of the Middlebury evaluation protocol that reflects the more complex geometry of some of our scenes. The proposed dataset is used to evaluate the state-of-the-art multi-view stereo algorithms of Tola et al., Campbell et al. and Furukawa et al. We thereby demonstrate the usability of the dataset and gain insight into the workings and challenges of multi-view stereopsis. Through these experiments we empirically validate some of the central hypotheses of multi-view stereopsis, and determine and reaffirm some of the central challenges.
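
Such evaluations commonly rest on two nearest-neighbour distance measures: accuracy (reconstruction to reference) and completeness (reference to reconstruction). The sketch below computes their means on stand-in point clouds, omitting the masking and outlier thresholds a full protocol would add.

    import numpy as np
    from scipy.spatial import cKDTree

    def accuracy_completeness(recon_pts, ref_pts):
        # accuracy: mean distance from reconstructed points to the reference
        # scan; completeness: mean distance from the reference scan to the
        # reconstruction.
        d_acc, _ = cKDTree(ref_pts).query(recon_pts)
        d_comp, _ = cKDTree(recon_pts).query(ref_pts)
        return float(d_acc.mean()), float(d_comp.mean())

    recon = np.random.rand(1000, 3)    # stand-in reconstructed point cloud
    ref = np.random.rand(2000, 3)      # stand-in structured-light scan
    acc, comp = accuracy_completeness(recon, ref)
    print(f"accuracy = {acc:.4f}, completeness = {comp:.4f}")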

Relevance:

100.00%

Publisher:

Abstract:

When machining a large-scale aerospace part, the part is normally located and clamped firmly until a set of features has been machined. When the part is released, its size and shape may deform beyond the tolerance limits due to stress release. This paper presents the design of a new fixing method and flexible fixtures that automatically respond to workpiece deformation during machining. Deformation is inspected and monitored on-line, and part location and orientation can be adjusted in a timely manner to ensure that follow-up operations are carried out under low stress and with respect to the related datum defined in the design models.
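
One plausible form of the adjustment step, offered as an illustration rather than the paper's method, is to fit a rigid transform from the nominal datum points in the design model to their measured positions and command the fixture to correct by that transform. The Kabsch/Procrustes fit below does exactly that.

    import numpy as np

    def rigid_fit(nominal, measured):
        # Kabsch/Procrustes: rotation R and translation t minimizing
        # sum ||R @ n_i + t - m_i||^2 over the datum points.
        cn, cm = nominal.mean(axis=0), measured.mean(axis=0)
        H = (nominal - cn).T @ (measured - cm)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        return R, cm - R @ cn

    nominal = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
    measured = nominal + np.array([0.01, -0.02, 0.005])   # a pure shift, to check
    R, t = rigid_fit(nominal, measured)
    print("rotation ~ identity:", np.allclose(R, np.eye(3)), "translation:", t)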

Relevance:

100.00%

Publisher:

Abstract:

This paper presents for the first time the concept of measurement assisted assembly (MAA) and outlines the research priorities for realising this concept in industry. MAA denotes a paradigm shift in the assembly of high-value and complex products and encompasses the development and use of novel metrology processes for the holistic integration and capability enhancement of key assembly and ancillary processes. A complete framework for MAA is detailed, showing how it can facilitate a step change in assembly process capability and efficiency for large and complex products, such as airframes, where traditional assembly processes exhibit the requirement for rectification and rework, use inflexible tooling and are largely manual, resulting in cost and cycle-time pressures. The concept of MAA encompasses a range of innovative measurement-assisted processes which enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved levels of precision across the dimensional scales. A full-scale industrial trial of MAA technologies has been carried out on an experimental aircraft wing, demonstrating the viability of the approach, while studies within 140 smaller companies have highlighted the need for better adoption of existing process capability and quality control standards. The identified research priorities for MAA include the development of both frameless and tooling-embedded automated metrology networks. Other research priorities relate to the development of integrated dimensional variation management, thermal compensation algorithms, and measurement planning and inspection algorithms linking design to measurement and process planning. © Springer-Verlag London 2013.
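
As an example of the thermal-compensation priority, the standard linear-expansion correction rescales a length measured at shop-floor temperature back to the 20 °C reference; the coefficient below assumes an aluminium structure, which is an illustrative choice.

    ALPHA_AL = 23.1e-6     # 1/K, linear expansion coefficient of aluminium (assumed)

    def compensate_length(measured_mm, temp_c, alpha=ALPHA_AL, ref_c=20.0):
        # Length the feature would have at the 20 degC reference temperature.
        return measured_mm / (1.0 + alpha * (temp_c - ref_c))

    # A 5 m aluminium structure measured at 26 degC reads ~0.69 mm long.
    print(f"{compensate_length(5000.693, 26.0):.3f} mm")   # ~5000.000 mm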

Relevance:

100.00%

Publisher:

Abstract:

With ever-more demanding requirements for the accurate manufacture of large components, dimensional measuring techniques are becoming progressively more sophisticated. This review describes some of the more recently developed techniques and the state of the art in the better-known large-scale dimensional metrology methods. In some cases the techniques are described in detail; where relevant specialist review papers exist, these are cited as further reading. The traceability of the measurement data collected is discussed with reference to emerging international standards. In some cases, hybrid measurement techniques are finding specialized applications, and these are referred to where appropriate. © IMechE 2009.

Relevance:

100.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62H15, 62P10.

Relevance:

100.00%

Publisher:

Abstract:

The production of recombinant therapeutic proteins is an active area of research in drug development. These bio-therapeutic drugs target nearly 150 disease states and promise to bring better treatments to patients. However, if new bio-therapeutics are to be made more accessible and affordable, improvements in production performance and optimization of processes are necessary. A major challenge lies in controlling the effect of process conditions on the production of intact functional proteins. To achieve this, improved tools are needed for bio-processing; for example, process modeling and high-throughput technologies can be implemented to achieve quality by design, leading to improvements in productivity. Commercially, the most sought-after targets are secreted proteins, due to the ease of handling in downstream procedures. This chapter outlines different approaches for the production and optimization of secreted proteins in the host Pichia pastoris. © 2012 Springer Science+Business Media, LLC.