852 results for LARGE-SCALE SYNTHESIS


Relevance: 100.00%

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services, with prior arrangement.

Relevance: 100.00%

Abstract:

This article presents a potential method to assist developers of future bioenergy schemes when selecting from available suppliers of biomass materials. The method aims to allow tacit requirements placed on biomass suppliers to be considered at the design stage of new developments. It combines the Analytical Hierarchy Process and Quality Function Deployment methods (AHP-QFD). The output is a ranking and relative weighting of the available suppliers, which could be used as input to optimization algorithms such as linear and goal programming. The paper is at a conceptual stage and no results have yet been obtained. The aim is to use the AHP-QFD method to bridge the gap between the treatment of explicit and tacit requirements of bioenergy schemes, allowing decision makers to identify the most successful supply strategy available.
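To make the AHP step concrete, the sketch below derives criterion weights from a pairwise comparison matrix via the principal eigenvector, which is the standard AHP computation; the 3×3 matrix, the example criteria, and the group sizes are illustrative assumptions, not values from the article.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three supplier criteria
# (e.g. price vs. reliability vs. moisture content); A[i, j] says how
# strongly criterion i is preferred over criterion j on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP derives priority weights from the principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency ratio guards against contradictory judgements
# (random index 0.58 for a 3x3 matrix, from Saaty's tables).
n = A.shape[0]
ci = (eigvals.real[principal] - n) / (n - 1)
cr = ci / 0.58
print("criterion weights:", weights.round(3), "consistency ratio:", round(cr, 3))
```

In the combined AHP-QFD method, weights of this kind would then scale the QFD relationship scores assigned to each supplier to produce the final ranking.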

Relevance: 100.00%

Abstract:

Objectives: To conduct an independent evaluation of the first phase of the Health Foundation's Safer Patients Initiative (SPI), and to identify the net additional effect of SPI and any differences in changes in participating and non-participating NHS hospitals. Design: Mixed method evaluation involving five substudies, before and after design. Setting: NHS hospitals in the United Kingdom. Participants: Four hospitals (one in each country in the UK) participating in the first phase of the SPI (SPI1); 18 control hospitals. Intervention: The SPI1 was a compound (multicomponent) organisational intervention delivered over 18 months that focused on improving the reliability of specific frontline care processes in designated clinical specialties and promoting organisational and cultural change. Results: Senior staff members were knowledgeable and enthusiastic about SPI1. There was a small (0.08 points on a 5 point scale) but significant (P<0.01) effect in favour of the SPI1 hospitals in one of 11 dimensions of the staff questionnaire (organisational climate). Qualitative evidence showed only modest penetration of SPI1 at medical ward level. Although SPI1 was designed to engage staff from the bottom up, it did not usually feel like this to those working on the wards, and questions about the legitimacy of some aspects of SPI1 were raised. Of the five components to identify patients at risk of deterioration - monitoring of vital signs (14 items); routine tests (three items); evidence based standards specific to certain diseases (three items); prescribing errors (multiple items from the British National Formulary); and medical history taking (11 items) - there was little net difference between control and SPI1 hospitals, except in relation to quality of monitoring of acute medical patients, which improved on average over time across all hospitals. Recording of respiratory rate increased to a greater degree in SPI1 than in control hospitals; in the second six hours after admission, recording increased from 40% (93) to 69% (165) in control hospitals and from 37% (141) to 78% (296) in SPI1 hospitals (odds ratio for "difference in difference" 2.1, 99% confidence interval 1.0 to 4.3; P=0.008). Use of a formal scoring system for patients with pneumonia also increased over time (from 2% (102) to 23% (111) in control hospitals and from 2% (170) to 9% (189) in SPI1 hospitals), but this difference favoured controls and was not significant (0.3, 0.02 to 3.4; P=0.173). There were no improvements in the proportion of prescription errors and no effects that could be attributed to SPI1 in non-targeted generic areas (such as enhanced safety culture). On some measures, the lack of effect could be because compliance was already high at baseline (such as use of steroids in over 85% of cases where indicated), but even where there was more room for improvement (such as in quality of medical history taking), there was no significant additional net effect of SPI1. There were no changes over time or between control and SPI1 hospitals in errors or rates of adverse events in patients in medical wards. Mortality increased from 11% (27) to 16% (39) among controls and decreased from 17% (63) to 13% (49) among SPI1 hospitals, but the risk adjusted difference was not significant (0.5, 0.2 to 1.4; P=0.085). Poor care was a contributing factor in four of the 178 deaths identified by review of case notes. The survey of patients showed no significant differences apart from an increase in perception of cleanliness in favour of SPI1 hospitals.
Conclusions: The introduction of SPI1 was associated with improvements in one of the types of clinical process studied (monitoring of vital signs) and in one measure of staff perceptions of organisational climate. There was no additional effect of SPI1 on the other targeted issues, nor on other measures of generic organisational strengthening.
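The "difference in difference" odds ratios quoted above come from comparing the change over time in SPI1 hospitals against the change in controls. A minimal sketch of that comparison, fitted as a logistic regression with a group-by-period interaction; the group sizes here are hypothetical and only the four respiratory-rate proportions are taken from the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a binary outcome (respiratory rate recorded yes/no) at the
# four reported proportions; n per cell is a hypothetical stand-in.
rng = np.random.default_rng(0)
rows = []
for spi, period, p, n in [(0, 0, 0.40, 250), (0, 1, 0.69, 250),
                          (1, 0, 0.37, 250), (1, 1, 0.78, 250)]:
    rows.append(pd.DataFrame({
        "spi": spi, "period": period,
        "recorded": rng.binomial(1, p, n),
    }))
df = pd.concat(rows, ignore_index=True)

# The interaction coefficient is the difference-in-difference estimate
# on the log-odds scale; exponentiating gives the odds ratio.
fit = smf.logit("recorded ~ spi * period", data=df).fit(disp=False)
print("difference-in-difference OR:",
      np.exp(fit.params["spi:period"]).round(2))
```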

Relevance: 100.00%

Abstract:

Different procurement decisions taken by relief organizations can have considerably different implications for the transport, storage, and distribution of humanitarian aid, and can ultimately influence the performance of the humanitarian supply chain and the delivery of the aid itself. In this article, we examine what resources are needed and how these resources evolve in the delivery of humanitarian aid. Drawing on the resource-based view of the firm, we develop a framework to categorize the impact of local resources on the configuration of humanitarian supply chains. In contrast to other papers, the importance of localizing the configuration of the humanitarian supply chain is not only conceptually recognized but also empirically investigated. Methodologically, this article is based on the analysis of secondary data from two housing reconstruction projects. Findings indicate that the use of local resources in humanitarian aid has positive effects on programs' overall supply chain performance, and these effects are not limited to the macroeconomic perspective: benefits extend to improvements in the use of local knowledge. At the same time, local sourcing often comes with problems; in one of the cases, significant difficulties arose from the scarcity of local supplies. Both housing reconstruction projects indicated a continuous need for change throughout the programs, as a dynamic supply chain configuration is important for the long-term sustainability of reconstruction aid. © 2014 Decision Sciences Institute.

Relevance: 100.00%

Abstract:

This study presents a computational fluid dynamics (CFD) study of dimethyl ether steam reforming (DME-SR) in a large-scale circulating fluidized bed (CFB) reactor. The CFD model is based on an Eulerian-Eulerian dispersed flow formulation and solved using commercial software (ANSYS FLUENT). The DME-SR reaction scheme and kinetics in the presence of a bifunctional catalyst of CuO/ZnO/Al2O3 + ZSM-5 were incorporated in the model using an in-house developed user-defined function. The model was validated by comparing the predictions with experimental data from the literature. The results revealed for the first time detailed CFB reactor hydrodynamics, gas residence time, temperature distribution and product gas composition at a selected operating condition of 300 °C and a steam to DME mass ratio of 3 (molar ratio of 7.62). The spatial variation in the gas species concentrations suggests the existence of three distinct reaction zones but limited temperature variation. The DME conversion and hydrogen yield were found to be 87% and 59% respectively, resulting in a product gas consisting of 72 mol% hydrogen. In part II of this study, the model presented here will be used to optimize the reactor design and study the effect of operating conditions on the reactor performance and products.
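The mass-to-molar ratio conversion quoted above follows directly from the molar masses of water and DME; a quick check (the small difference from the quoted 7.62 presumably reflects rounding in the molar masses used):

```python
# Converting the reported steam-to-DME mass ratio to a molar ratio.
# Molar masses: H2O = 18.02 g/mol, DME (CH3OCH3) = 46.07 g/mol.
M_H2O, M_DME = 18.02, 46.07
mass_ratio = 3.0
molar_ratio = mass_ratio * M_DME / M_H2O
print(f"steam/DME molar ratio = {molar_ratio:.2f}")  # ~7.67 vs. the quoted 7.62
```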

Relevance: 100.00%

Abstract:

Large-scale massively parallel molecular dynamics (MD) simulations of the human class I major histocompatibility complex (MHC) protein HLA-A*0201 bound to a decameric tumor-specific antigenic peptide GVYDGREHTV were performed using a scalable MD code on high-performance computing platforms. Such computational capabilities put us in reach of simulations of various scales and complexities. The supercomputing resources available for this study allow us to compare directly differences in the behavior of very large molecular models; in this case, the entire extracellular portion of the peptide–MHC complex vs. the isolated peptide binding domain. Comparison of the results from the partial and the whole system simulations indicates that the peptide is less tightly bound in the partial system than in the whole system. From a detailed study of conformations, solvent-accessible surface area, the nature of the water network structure, and the binding energies, we conclude that, when considering the conformation of the α1–α2 domain, the α3 and β2m domains cannot be neglected. © 2004 Wiley Periodicals, Inc. J Comput Chem 25: 1803–1813, 2004
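One of the quantities compared above is solvent-accessible surface area. As a minimal sketch (not the study's workflow), here is how one might compute it from a trajectory with the mdtraj library; the file names and the assumption that the peptide occupies the last ten residues are hypothetical.

```python
import mdtraj as md

# Hypothetical file names; any peptide-MHC trajectory in a format
# mdtraj can read would do.
traj = md.load("pmhc_traj.xtc", top="pmhc.pdb")

# Shrake-Rupley solvent-accessible surface area per residue (nm^2),
# one row per frame.
sasa = md.shrake_rupley(traj, mode="residue")

# Track the peptide's exposure over time, assuming the decameric
# peptide occupies the last 10 residues of the topology.
peptide_sasa = sasa[:, -10:].sum(axis=1)
print("mean peptide SASA (nm^2):", peptide_sasa.mean())
```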

Relevance: 100.00%

Abstract:

Data envelopment analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of decision-making units (DMUs). For a large-scale data set, especially one with negative measures, DEA inevitably demands substantial computer resources in terms of memory and CPU time. In recent years, a wide range of studies has been conducted on combined artificial neural network and DEA methods. In this study, a supervised feed-forward neural network is proposed to evaluate the efficiency and productivity of large-scale data sets with negative values, in contrast to the corresponding DEA method. Results indicate that the proposed network has some computational advantages over the corresponding DEA models; therefore, it can be considered a useful tool for measuring the efficiency of DMUs with (large-scale) negative data.
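As a rough illustration of the combined approach (not the paper's exact architecture), the sketch below trains a small feed-forward network to reproduce efficiency scores that, in practice, would come from solving the DEA models on a training subset; the data, targets, and layer sizes are all hypothetical.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Hypothetical DMU data: six input/output measures, some negative.
X = rng.normal(size=(5000, 6))

# Placeholder targets standing in for efficiencies obtained by
# actually solving the DEA models on this training subset.
train_scores = 1.0 / (1.0 + np.exp(-(X[:, 0] - X[:, 1])))

mlp = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=1000,
                   random_state=1)
mlp.fit(X, train_scores)

# Once trained, scoring a new DMU is a cheap forward pass instead of
# one linear program per DMU.
new_dmus = rng.normal(size=(10, 6))
print(mlp.predict(new_dmus).round(3))
```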

Relevance: 100.00%

Abstract:

GraphChi is the first reported disk-based graph engine that can handle billion-scale graphs on a single PC efficiently. It can execute several advanced data mining, graph mining, and machine learning algorithms on very large graphs. With the novel technique of parallel sliding windows (PSW), which loads subgraphs from disk to memory for vertex and edge updates, it can achieve data processing performance close to, and sometimes better than, that of mainstream distributed graph engines. The GraphChi authors note, however, that its memory is not effectively utilized with large datasets, which leads to suboptimal computational performance. In this paper, motivated by the concept of 'pin' from TurboGraph and 'ghost' from GraphLab, we propose a new memory utilization mode for GraphChi, called Part-in-memory mode, to improve the performance of GraphChi algorithms. The main idea is to pin a fixed part of the data in memory during the whole computing process. Part-in-memory mode was implemented with only about 40 additional lines of code in the original GraphChi engine. Extensive experiments were performed with large real datasets (including a Twitter graph with 1.4 billion edges). The preliminary results show that the Part-in-memory memory management approach reduces GraphChi running time by up to 60% for the PageRank algorithm. Interestingly, a larger portion of data pinned in memory does not always lead to better performance when the whole dataset cannot fit in memory; there exists an optimal portion of data to keep in memory to achieve the best computational performance.
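A conceptual sketch of the pinning idea (illustrative only; none of these names are GraphChi's actual API): a fixed slice of vertex values stays resident in RAM for the whole run, while the remainder is served from a disk-backed store.

```python
import numpy as np

NUM_VERTICES = 1_000_000
PIN_FRACTION = 0.4                    # fixed portion pinned in RAM

pin_count = int(NUM_VERTICES * PIN_FRACTION)
pinned = np.zeros(pin_count, dtype=np.float32)   # resident for the whole run

# Disk-backed store for the unpinned remainder (hypothetical file).
unpinned = np.memmap("vertex_values.dat", dtype=np.float32, mode="w+",
                     shape=(NUM_VERTICES - pin_count,))

def get_value(v):
    """Pinned vertices are served from RAM; the rest hit the disk store."""
    return pinned[v] if v < pin_count else unpinned[v - pin_count]

def set_value(v, x):
    if v < pin_count:
        pinned[v] = x
    else:
        unpinned[v - pin_count] = x
```

The experiments above suggest tuning the pinned fraction rather than maximizing it: past the point where paging pressure dominates, pinning more data can hurt performance.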

Relevance: 100.00%

Abstract:

Human mesenchymal stem cell (hMSC) therapies have the potential to revolutionise the healthcare industry and replicate the success of the therapeutic protein industry; however, for this to be achieved there is a need to apply key bioprocess engineering principles and adopt a quantitative approach to large-scale, reproducible hMSC bioprocess development. Here we provide a quantitative analysis of the changes in concentration of glucose, lactate and ammonium with time during hMSC monolayer culture over four passages, under 100% and 20% dissolved oxygen (dO2), where either a 100%, 50% or 0% growth medium exchange was performed after 72 h in culture. Yield coefficients, specific growth rates (h⁻¹) and doubling times (h) were calculated for all cases. The 100% dO2 flasks outperformed the 20% dO2 flasks with respect to cumulative cell number, with the latter consuming more glucose and producing more lactate and ammonium. Furthermore, the 100% and 50% medium exchange conditions resulted in similar cumulative cell numbers, whilst the 0% condition was significantly lower. Cell immunophenotype and multipotency were not affected by the experimental culture conditions. This study demonstrates the importance of determining optimal culture conditions for hMSC expansion and highlights the potential cost saving of making only a 50% medium exchange, which may prove significant for large-scale bioprocessing. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
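For reference, the growth parameters named above follow from standard definitions; a minimal sketch with hypothetical cell counts and glucose consumption:

```python
import numpy as np

# Hypothetical cell counts from one 72 h passage.
x0, x1 = 1.0e5, 6.4e5        # viable cells at seeding and at 72 h
t = 72.0                     # culture time (h)

mu = np.log(x1 / x0) / t     # specific growth rate (h^-1)
td = np.log(2) / mu          # doubling time (h)

# Apparent yield of cells on glucose over the same interval
# (hypothetical glucose consumption, in mmol).
glucose_consumed = 0.9
y_x_glc = (x1 - x0) / glucose_consumed   # cells per mmol glucose

print(f"mu = {mu:.4f} 1/h, doubling time = {td:.1f} h, "
      f"Y(x/glc) = {y_x_glc:.2e} cells/mmol")
```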

Relevance: 100.00%

Abstract:

This chapter discusses network protection of high-voltage direct current (HVDC) transmission systems for large-scale offshore wind farms where the HVDC system utilizes voltage-source converters. The multi-terminal HVDC network topology and protection allocation and configuration are discussed, with DC circuit breaker and protection relay configurations studied for different fault conditions. A detailed protection scheme is designed with a solution that does not require relay communication. Advanced understanding of protection system design and operation is necessary for reliable and safe operation of the meshed HVDC system under fault conditions. Meshed HVDC systems are important as they will be used to interconnect large-scale offshore wind generation projects. Offshore wind generation is growing rapidly and offers a means of securing energy supply and addressing emissions targets whilst minimising community impacts. There are ambitious plans for such projects in Europe and in the Asia-Pacific region, all of which will require a reliable yet economic system to generate, collect, and transmit electrical power from renewable resources. Collectively, offshore wind farms are efficient and have potential as a significant low-carbon energy source; however, this requires a reliable collection and transmission system. Offshore wind power generation is a relatively new area and lacks the systematic fault analysis and operational experience needed to support further development. Appropriate fault protection schemes are required, and this chapter highlights the process of developing and assessing such schemes. The chapter illustrates the basic meshed topology, identifies the need for distance evaluation and appropriate cable models, then details the design and operation of the protection scheme, with simulation results used to illustrate operation. © Springer Science+Business Media Singapore 2014.
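On the distance-evaluation point: as a generic illustration (a textbook two-ended travelling-wave estimate, not the scheme the chapter itself develops), fault distance can be recovered from the difference in wavefront arrival times at the two cable ends. The cable length, wave speed, and arrival times below are hypothetical.

```python
# Generic two-ended travelling-wave fault-location estimate for a DC
# cable. If the fault is at distance d from terminal A, then
# t_a = d/v and t_b = (L - d)/v, so d = (L + v*(t_a - t_b)) / 2.
CABLE_LENGTH = 100e3      # m
WAVE_SPEED = 1.8e8        # m/s, typical order for XLPE cable

def fault_distance(t_a, t_b):
    """Distance from terminal A, given wavefront arrival times (s)."""
    return (CABLE_LENGTH + WAVE_SPEED * (t_a - t_b)) / 2

# Wavefront reaches A 0.1 ms before B, so the fault is nearer to A.
print(f"{fault_distance(1.000e-3, 1.100e-3) / 1e3:.1f} km from A")
```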

Relevance: 100.00%

Abstract:

Cell-based therapies have the potential to contribute to global healthcare, whereby living cells and tissues are used as medicinal therapies. Despite this potential, many challenges remain before the full value of this emerging field can be realized. Characterizing input material from multiple donors for cell-based therapy bioprocesses is necessary to identify and understand the potential implications of input variation on process development. In this work, we have characterized bone marrow derived human mesenchymal stem cells (BM-hMSCs) from multiple donors and discussed the implications of the measurable input variation for the development of autologous and allogeneic cell-based therapy manufacturing processes. The range of cumulative population doublings across the five BM-hMSC lines over 30 days of culture was 5.93, with an 18.2% range in colony forming efficiency at the end of the culture process and a 55.1% difference in the production of interleukin-6 between these cell lines. This variation was shown to produce a range of over 13 days in process time between the donor hMSC lines for a hypothetical product, creating potential batch timing issues when manufacturing products from multiple patients. All BM-hMSC donor lines conformed to the ISCT criteria but differed in cell morphology. Metabolite analysis showed that hMSCs from the different donors have a range in glucose consumption of 26.98 pmol cell−1 day−1, lactate production of 29.45 pmol cell−1 day−1 and ammonium production of 1.35 pmol cell−1 day−1, demonstrating the extent of donor variability throughout the expansion process. Measuring informative product attributes during process development will facilitate progress towards consistent manufacturing processes, a critical step in the translation of cell-based therapies.
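The two donor-variability metrics above, cumulative population doublings and per-cell metabolite rates, are computed as follows; all numbers in this sketch are hypothetical and merely on the scale reported.

```python
import numpy as np

# Cumulative population doublings (CPD) across passages.
counts_in  = np.array([2.0e5, 2.0e5, 2.0e5])   # seeded cells per passage
counts_out = np.array([8.1e5, 7.2e5, 6.5e5])   # harvested cells per passage
cpd = np.log2(counts_out / counts_in).sum()
print(f"cumulative population doublings: {cpd:.2f}")

# Specific glucose consumption: metabolite change divided by the
# cell-time integral, approximated here by the mean viable cell number
# over a 3-day passage.
delta_glucose_pmol = 3.6e7                     # glucose consumed (pmol)
mean_cells, days = 4.5e5, 3.0
q_glc = delta_glucose_pmol / (mean_cells * days)
print(f"q_glc = {q_glc:.1f} pmol cell^-1 day^-1")
```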

Relevance: 100.00%

Abstract:

The seminal multiple view stereo benchmark evaluations from Middlebury and by Strecha et al. have played a major role in propelling the development of multi-view stereopsis methodology. Although seminal, these benchmark datasets are limited in scope, with few reference scenes. Here, we take these works a step further by proposing a new multi-view stereo dataset that is an order of magnitude larger in number of scenes and significantly more diverse. Specifically, we propose a dataset containing 80 scenes of large variability. Each scene consists of 49 or 64 accurate camera positions and reference structured light scans, all acquired by a 6-axis industrial robot. For this dataset we propose an extension of the evaluation protocol from the Middlebury evaluation, reflecting the more complex geometry of some of our scenes. The proposed dataset is used to evaluate the state-of-the-art multi-view stereo algorithms of Tola et al., Campbell et al., and Furukawa et al., thereby demonstrating the usability of the dataset and gaining insight into the workings and challenges of multi-view stereopsis. Through these experiments we empirically validate some of the central hypotheses of multi-view stereopsis, and determine and reaffirm some of its central challenges.
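Middlebury-style MVS evaluation rests on two point-to-set distance measures, accuracy and completeness; the sketch below shows the core computation on random stand-in point clouds (this is the standard idea, not necessarily the paper's exact protocol).

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
reference = rng.uniform(size=(50_000, 3))       # structured-light scan points
reconstruction = rng.uniform(size=(40_000, 3))  # MVS output points

# Accuracy: distance from each reconstructed point to the reference.
acc = cKDTree(reference).query(reconstruction)[0]
# Completeness: distance from each reference point to the reconstruction.
comp = cKDTree(reconstruction).query(reference)[0]

# Medians are commonly reported because they are robust to outliers.
print(f"accuracy (median): {np.median(acc):.4f}")
print(f"completeness (median): {np.median(comp):.4f}")
```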

Relevance: 100.00%

Abstract:

When machining a large-scale aerospace part, the part is normally located and clamped firmly until a set of features has been machined. When the part is released, its size and shape may deform beyond the tolerance limits due to stress release. This paper presents the design of a new fixing method and flexible fixtures that automatically respond to workpiece deformation during machining. Deformation is inspected and monitored on-line, and part location and orientation can be adjusted in a timely manner to ensure that follow-up operations are carried out under low stress and with respect to the related datums defined in the design models.

Relevance: 100.00%

Abstract:

This paper presents for the first time the concept of measurement assisted assembly (MAA) and outlines the research priorities for the realisation of this concept in industry. MAA denotes a paradigm shift in assembly for high value and complex products and encompasses the development and use of novel metrology processes for the holistic integration and capability enhancement of key assembly and ancillary processes. A complete framework for MAA is detailed, showing how it can facilitate a step change in assembly process capability and efficiency for large and complex products, such as airframes, where traditional assembly processes exhibit the requirement for rectification and rework, use inflexible tooling, and are largely manual, resulting in cost and cycle time pressures. The concept of MAA encompasses a range of innovative measurement-assisted processes which enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight, and improved levels of precision across the dimensional scales. A full-scale industrial trial of MAA technologies has been carried out on an experimental aircraft wing, demonstrating the viability of the approach, while studies within 140 smaller companies have highlighted the need for better adoption of existing process capability and quality control standards. The identified research priorities for MAA include the development of both frameless and tooling-embedded automated metrology networks. Other research priorities relate to the development of integrated dimensional variation management and thermal compensation algorithms, as well as measurement planning and inspection algorithms linking design to measurement and process planning. © Springer-Verlag London 2013.
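One of the research priorities named above is thermal compensation. A minimal sketch of the underlying idea, scaling a measured length back to the 20 °C reference temperature via linear expansion; the material, coefficient, and readings are hypothetical.

```python
# Linear thermal-expansion compensation for a dimensional measurement.
ALPHA_ALU = 23.1e-6          # 1/K, linear expansion coefficient of aluminium

def compensate_length(measured_m, part_temp_c, ref_temp_c=20.0,
                      alpha=ALPHA_ALU):
    """Scale a measured length back to the 20 degC reference temperature."""
    return measured_m / (1.0 + alpha * (part_temp_c - ref_temp_c))

# A 10 m aluminium rail measured at 26 degC reads about 1.4 mm long;
# compensation recovers the reference-temperature length.
print(f"{compensate_length(10.001386, 26.0):.6f} m")
```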

Relevance: 100.00%

Abstract:

With ever more demanding requirements for the accurate manufacture of large components, dimensional measuring techniques are becoming progressively more sophisticated. This review describes some of the more recently developed techniques and the state of the art in the better-known large-scale dimensional metrology methods. In some cases the techniques are described in detail; where relevant specialist review papers exist, these are cited as further reading. The traceability of the measurement data collected is discussed with reference to new international standards that are emerging. In some cases, hybrid measurement techniques are finding specialized applications and these are referred to where appropriate. © IMechE 2009.