853 results for Calculation methodology


Relevance: 20.00%

Abstract:

This paper examines the life cycle GHG emissions from existing UK pulverized coal (PC) power plants. The life cycle of the electricity generation plant includes construction, operation and decommissioning. The operation phase is extended to upstream and downstream processes. Upstream processes include the mining and transport of coal, including methane leakage, and the production and transport of limestone and ammonia, which are necessary for flue gas clean-up. Downstream processes, on the other hand, include waste disposal and the recovery of land used for surface mining. The methodology used is material-based process analysis, which allows calculation of the total emissions for each process involved. A simple model for predicting the energy and material requirements of the power plant is developed. Preliminary calculations reveal that for a typical UK coal-fired plant, the life cycle emissions amount to 990 g CO2-e/kWh of electricity generated, which compares well with previous UK studies. The majority of these emissions result from direct fuel combustion (882 g/kWh, or 89%), with methane leakage from mining operations accounting for 60% of indirect emissions. In total, mining operations (including methane leakage) account for 67.4% of indirect emissions, while limestone and other material production and transport account for 31.5%. The methodology developed is also applied to a typical IGCC power plant. It is found that IGCC life cycle emissions are 15% less than those from PC power plants. Furthermore, upon investigating the influence of power plant parameters on life cycle emissions, it is determined that, while the effect of changing the load factor is negligible, increasing efficiency from 35% to 38% can reduce emissions by 7.6%. The current study is funded by the UK Natural Environment Research Council (NERC) and is undertaken as part of the UK Carbon Capture and Storage Consortium (UKCCSC). Future work will investigate the life cycle emissions from other power generation technologies with and without carbon capture and storage. The current paper reveals that, when CCS is employed, the emissions during generation may decrease to a level where the emissions from upstream processes (i.e. coal production and transport) become dominant, and so the life cycle efficiency of the CCS system can be significantly reduced. The location of coal, coal composition and mining method are important in determining the overall impacts. In addition to studying the net emissions from CCS systems, future work will also investigate the feasibility and techno-economics of these systems as a means of carbon abatement.
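
The aggregation behind these figures is a straightforward sum of per-process contributions normalised per kWh. The Python sketch below is not from the paper; it simply reproduces that arithmetic from the headline numbers quoted in the abstract (990 g CO2-e/kWh total, 882 g/kWh direct, and the reported 67.4%/31.5% split of indirect emissions), with the residual share treated as an illustrative remainder.

```python
# Minimal sketch of the process-analysis aggregation described in the abstract.
# Only the headline figures quoted there are used; the residual split is illustrative.

def life_cycle_intensity(direct_g_per_kwh, indirect_g_per_kwh):
    """Total life cycle GHG intensity in g CO2-e per kWh of electricity generated."""
    return direct_g_per_kwh + sum(indirect_g_per_kwh.values())

# Indirect contributions split according to the shares reported in the abstract
# (67.4% mining incl. methane leakage, 31.5% limestone and other materials).
indirect_total = 990.0 - 882.0   # g CO2-e/kWh attributed to upstream/downstream processes
indirect = {
    "mining_and_methane_leakage": 0.674 * indirect_total,
    "limestone_and_other_materials": 0.315 * indirect_total,
    "other_upstream_downstream": (1 - 0.674 - 0.315) * indirect_total,
}

total = life_cycle_intensity(882.0, indirect)
print(f"Life cycle intensity: {total:.0f} g CO2-e/kWh")   # ~990 g CO2-e/kWh
```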

Relevance: 20.00%

Abstract:

The management of information in engineering organisations faces a particular challenge from the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information in order to avoid information overload and to retain the right information for reuse. Using a number of the current tools and techniques which attempt to obtain ‘the value’ of information as a starting point, it is proposed that an assessment or filter mechanism for information needs to be developed. This paper addresses this issue firstly by briefly reviewing the information overload problem, the definition of value, and related research work on the value of information in various areas. A “characteristic”-based framework of information evaluation is then introduced, using the key characteristics identified from related work as an example. A Bayesian network method is introduced into the framework to build the linkage between the characteristics and information value, in order to calculate the quality and value of information quantitatively. The training and verification process for the model is then described using 60 real engineering documents as a sample. The model gives reasonably accurate results; the differences between the model calculations and the training judgements are summarised and their potential causes discussed. Finally, several further issues, including the challenges of the framework and the implementation of this evaluation method, are raised.
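
As a rough illustration of the Bayesian network idea (not the paper's actual network), the sketch below marginalises two hypothetical binary characteristic nodes into a probability that a document is of high value; the node names, priors and conditional probability table are placeholders.

```python
# Hand-rolled sketch: characteristic nodes feed a value node through a CPT.
# All names and probabilities are hypothetical, not the paper's trained model.

from itertools import product

# Prior probability that each characteristic is "good" for a given document.
priors = {"up_to_date": 0.7, "reliable_source": 0.8}

# P(value = high | up_to_date, reliable_source); keys follow the order of `priors`.
p_high_given = {
    (True, True): 0.90,
    (True, False): 0.55,
    (False, True): 0.45,
    (False, False): 0.10,
}

def p_value_high(priors, cpt):
    """Marginalise over the characteristic nodes to get P(value = high)."""
    total = 0.0
    for states in product([True, False], repeat=len(priors)):
        weight = 1.0
        for (_name, p_good), state in zip(priors.items(), states):
            weight *= p_good if state else (1.0 - p_good)
        total += weight * cpt[states]
    return total

print(f"P(value = high) = {p_value_high(priors, p_high_given):.3f}")
```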

Relevance: 20.00%

Abstract:

One cubic centimetre potato cubes were blanched, sulfited, dried initially for between 40 and 80 min in air at 90 °C in a cabinet drier, puffed in a high-temperature fluidised bed and then dried for up to 180 min in a cabinet drier. The final moisture content was 0.05 dwb. The resulting product was optimised using response surface methodology, in terms of volume and colour (L*, a* and b* values) of the dry product, as well as rehydration ratio and texture of the rehydrated product. The operating conditions resulting in the optimised product were found to be blanching for 6 min in water at 100 °C, dipping in 400 ppm sodium metabisulfite solution for 10 min, initial drying for 40 min and puffing in air at 200 °C for 40 s, followed by final drying to a moisture content of 0.05 dwb. (C) 2003 Elsevier Ltd. All rights reserved.
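
Response surface methodology of the kind used here boils down to fitting a low-order polynomial to designed experimental points and locating its stationary point. The sketch below illustrates that step for a single factor; the design points and response values are made up, not the potato-drying measurements from the study.

```python
# Response surface methodology in miniature: fit a second-order model to a designed
# experiment and locate the optimum operating condition. Data are illustrative only.

import numpy as np

# Hypothetical design points: initial drying time (min) vs. rehydration ratio.
drying_time = np.array([40.0, 50.0, 60.0, 70.0, 80.0])
rehydration = np.array([4.1, 4.4, 4.3, 3.9, 3.4])

# Second-order (quadratic) response surface in one factor.
a, b, c = np.polyfit(drying_time, rehydration, deg=2)

# Stationary point of the fitted quadratic; a maximum when a < 0.
optimum_time = -b / (2.0 * a)
print(f"Fitted optimum initial drying time: {optimum_time:.1f} min")
```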

Relevance: 20.00%

Abstract:

Objective: To describe the calculations and approaches used to design experimental diets of differing saturated fatty acid (SFA) and monounsaturated fatty acid (MUFA) compositions for use in a long-term dietary intervention study, and to evaluate the degree to which the dietary targets were met. Design, setting and subjects: Fifty-one students living in a university hall of residence consumed a reference (SFA) diet for 8 weeks, followed by either a moderate-MUFA (MM) diet or a high-MUFA (HM) diet for 16 weeks. The three diets were designed to differ only in their proportions of SFA and MUFA, while keeping total fat, polyunsaturated fatty acids (PUFA), trans-fatty acids, and the ratios of palmitic to stearic acid and of n-6 to n-3 PUFA unchanged. Results: Using habitual diet records and a standardised database of food fatty acid compositions, a sequential process of theoretical fat substitutions enabled suitable fat sources for the three diets to be identified, and experimental margarines for baking, spreading and the manufacture of snack foods to be designed. The dietary intervention was largely successful in achieving the fatty acid targets of the three diets, although unintended differences between the original target and the analysed fatty acid composition of the experimental margarines resulted in a lower than anticipated MUFA intake on the HM diet, and a lower ratio of palmitic to stearic acid than on the reference or MM diets. Conclusions: This study has revealed important theoretical considerations that should be taken into account when designing diets of specific fatty acid composition, as well as practical issues of implementation.
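
The substitution calculations described amount to weighting each food's fatty acid composition by its intake and checking the resulting diet profile against the SFA/MUFA/PUFA targets. The sketch below illustrates that arithmetic; the foods, composition fractions and intakes are hypothetical, not the study's database values.

```python
# Sketch of the substitution arithmetic: build a diet's fatty acid profile from a
# food composition table. All foods and numbers below are illustrative placeholders.

def diet_fatty_acid_profile(intakes_g, composition):
    """Sum SFA/MUFA/PUFA contributions (g) across all foods in the diet."""
    profile = {"SFA": 0.0, "MUFA": 0.0, "PUFA": 0.0}
    for food, grams in intakes_g.items():
        for fa, frac in composition[food].items():
            profile[fa] += grams * frac
    return profile

# Fraction of each fatty acid class per gram of food (illustrative values).
composition = {
    "experimental_margarine": {"SFA": 0.15, "MUFA": 0.55, "PUFA": 0.10},
    "snack_foods":            {"SFA": 0.08, "MUFA": 0.12, "PUFA": 0.05},
}
intakes_g = {"experimental_margarine": 40.0, "snack_foods": 100.0}

profile = diet_fatty_acid_profile(intakes_g, composition)
print({fa: round(grams, 1) for fa, grams in profile.items()})
```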

Relevance: 20.00%

Abstract:

Background: Postprandial lipid metabolism in humans has received much attention during the last two decades. Although fasting lipid and lipoprotein parameters reflect body homeostasis to some extent, the transient lipid and lipoprotein accumulation that occurs in the circulation after a fat-containing meal highlights the individual capacity to handle an acute fat input. An exacerbated postprandial accumulation of triglyceride-rich lipoproteins in the circulation has been associated with an increased cardiovascular risk. Methods: The large number of studies published in this field raises the question of the methodology used for such postprandial studies, as reviewed here. Results: Based on our experience, the present review reports and discusses the numerous methodological issues involved, to serve as a basis for further work. These aspects include the aims of the postprandial tests, the size and nutrient composition of the test meals and background diets, pre-test conditions, the characteristics of the subjects involved, the timing of sampling, suitable markers of postprandial lipid metabolism, and the associated calculations. Conclusion: We stress the need for standardization of postprandial tests.
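
Among the calculations discussed, a common one is the area under the postprandial triacylglycerol curve. The sketch below computes total and incremental AUC by the trapezoidal rule; the sampling times and concentrations are made-up example values, not data from the review.

```python
# Trapezoidal-rule AUC for a postprandial response curve (illustrative data only).

import numpy as np

def trapezoid(y, x):
    """Trapezoidal-rule integral of y over x."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Made-up example values: time after the test meal and plasma triacylglycerol.
time_h = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0])
tag_mmol_l = np.array([1.1, 1.4, 1.9, 2.3, 2.1, 1.6, 1.2])

auc = trapezoid(tag_mmol_l, time_h)                     # total AUC (mmol/L x h)
iauc = trapezoid(tag_mmol_l - tag_mmol_l[0], time_h)    # incremental AUC above fasting

print(f"AUC  = {auc:.2f} mmol/L*h")
print(f"iAUC = {iauc:.2f} mmol/L*h")
```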

Relevance: 20.00%

Abstract:

We present molecular dynamics simulations of the photodissociated state of MbNO performed at 300 K using a fluctuating charge model for the nitric oxide (NO) ligand. After dissociation, NO is observed to remain mainly in the centre of the distal haem pocket, although some movement towards the primary docking site and the xenon-4 pocket can be seen. We calculate the NO infrared spectrum for the photodissociated ligand within the haem pocket and find a narrow peak in the range 1915-1922 cm⁻¹. The resulting blue shift of 1 to 8 cm⁻¹ compared to gas-phase NO is much smaller than the red shifts calculated and observed for carbon monoxide (CO) in Mb. A small splitting, due to NO in the xenon-4 pocket, is also observed. At lower temperatures, the spectra and conformational space explored by the ligand remain largely unchanged, but the electrostatic interactions with residue His64 become increasingly significant in determining the details of the ligand orientation within the distal haem pocket. The investigation of the effect of the L29F mutation reveals significant differences between the behaviour of NO and that of CO, and suggests a coupling between the ligand and the protein dynamics due to the different ligand dipole moments.
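
Ligand IR spectra of this kind are commonly obtained from the Fourier transform of the dipole (auto)correlation function sampled along the trajectory. The sketch below is a generic illustration of that step, not the authors' fluctuating-charge workflow; a synthetic damped oscillation near 1920 cm⁻¹ stands in for the NO dipole signal.

```python
# Generic IR line shape from a dipole autocorrelation function (synthetic signal).

import numpy as np

dt_fs = 1.0                                   # sampling interval (fs), illustrative
n_steps = 4096
t = np.arange(n_steps) * dt_fs

# Synthetic dipole signal: a damped oscillation near ~1920 cm^-1 plus weak noise.
freq_cm1 = 1920.0
c_cm_per_fs = 2.99792458e-5                   # speed of light in cm/fs
signal = np.exp(-t / 500.0) * np.cos(2 * np.pi * c_cm_per_fs * freq_cm1 * t)
signal = signal + 0.01 * np.random.default_rng(0).normal(size=n_steps)

# Dipole autocorrelation function, then its Fourier transform -> IR line shape.
acf = np.correlate(signal, signal, mode="full")[n_steps - 1:]
acf /= acf[0]
spectrum = np.abs(np.fft.rfft(acf * np.hanning(n_steps)))
wavenumber = np.fft.rfftfreq(n_steps, d=dt_fs) / c_cm_per_fs

peak = wavenumber[np.argmax(spectrum)]
print(f"Peak near {peak:.0f} cm^-1")          # expected close to 1920 cm^-1
```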

Relevance: 20.00%

Abstract:

Time correlation functions yield profound information about the dynamics of a physical system and hence are frequently calculated in computer simulations. For systems whose dynamics span a wide range of timescales, currently used methods require significant computer time and memory. In this paper, we discuss the multiple-tau correlator method for the efficient calculation of accurate time correlation functions on the fly during computer simulations. The multiple-tau correlator is efficient in terms of computational requirements and can be tuned to the desired level of accuracy. Further, we derive estimates for the error arising from the use of the multiple-tau correlator and extend it for use in the calculation of mean-square particle displacements and dynamic structure factors. The method described here, in hardware implementation, is routinely used in light scattering experiments but has not yet found widespread use in computer simulations.
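
A minimal sketch of the hierarchical idea behind a multiple-tau correlator is given below: each level block-averages its samples to feed the next, so lag spacing grows geometrically while storage and cost stay modest. It follows the general scheme rather than the exact estimator and error analysis described in the paper; parameters such as the number of levels and points per level are illustrative.

```python
# On-the-fly autocorrelation with logarithmically spaced lags (general multi-tau scheme).

import random

class MultiTauCorrelator:
    """Level 0 correlates the signal at lags 0..p-1 at full resolution; each higher
    level works on block averages of m samples from the level below, so the lag
    spacing grows by a factor of m per level."""

    def __init__(self, n_levels=8, p=16, m=2):
        self.p, self.m = p, m
        self.history = [[None] * p for _ in range(n_levels)]   # most recent sample first
        self.corr = [[0.0] * p for _ in range(n_levels)]
        self.counts = [[0] * p for _ in range(n_levels)]
        self.accum = [0.0] * n_levels      # block accumulator feeding the next level
        self.n_accum = [0] * n_levels

    def add(self, value, level=0):
        if level >= len(self.history):
            return
        hist = self.history[level]
        hist.insert(0, value)
        hist.pop()
        # Correlate the newest sample with the stored history at this level.
        for k, old in enumerate(hist):
            if old is not None:
                self.corr[level][k] += value * old
                self.counts[level][k] += 1
        # Coarse-grain: average m samples and push the average to the next level.
        self.accum[level] += value
        self.n_accum[level] += 1
        if self.n_accum[level] == self.m:
            self.add(self.accum[level] / self.m, level + 1)
            self.accum[level] = 0.0
            self.n_accum[level] = 0

    def result(self):
        """(lag, C(lag)) pairs; higher levels skip lags already covered below them."""
        out = []
        for level in range(len(self.corr)):
            spacing = self.m ** level
            start = 0 if level == 0 else self.p // self.m
            for k in range(start, self.p):
                if self.counts[level][k] > 0:
                    out.append((k * spacing, self.corr[level][k] / self.counts[level][k]))
        return out

# Usage: an AR(1) signal with a known correlation time as a quick sanity check.
random.seed(1)
corr = MultiTauCorrelator()
x = 0.0
for _ in range(200000):
    x = 0.95 * x + random.gauss(0.0, 1.0)
    corr.add(x)

for lag, c in sorted(corr.result())[:8]:
    print(lag, round(c, 3))        # should decay roughly as 0.95**lag times the variance
```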

Relevance: 20.00%

Abstract:

This paper discusses how the use of computer-based modelling tools has aided the design of a telemetry unit for use with oil well logging. With the aid of modern computer-based simulation techniques, the new design is capable of operating at data rates 2.5 times faster than previous designs.

Relevance: 20.00%

Abstract:

This paper presents a preface to this Special Issue on the results of the QUEST-GSI (Global Scale Impacts) project on climate change impacts on catchment-scale water resources. A detailed description of the unified methodology, subsequently used in all studies in this issue, is provided. The project method involved running simulations of catchment-scale hydrology using a unified set of past and future climate scenarios, to enable a consistent analysis of the climate impacts around the globe. These scenarios include "policy-relevant" prescribed warming scenarios. This is followed by a synthesis of the key findings. Overall, the studies indicate that in most basins the models project substantial changes to river flow, beyond those observed in the historical record, but that in many cases there is considerable uncertainty in the magnitude and sign of the projected changes. The implications of this for adaptation activities are discussed.
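
The unified methodology amounts to driving the same catchment model with a consistent set of baseline and prescribed-warming scenarios and comparing the simulated flows. The sketch below is a generic illustration of that comparison; the toy water-balance "model" and scenario values are placeholders, not the QUEST-GSI hydrological models or climate forcing.

```python
# Consistent cross-scenario comparison: one model, one baseline, several scenarios.
# The "model" and forcing values are illustrative placeholders only.

def mean_flow(precip_mm, temp_c, runoff_coeff=0.4, et_per_degree=0.01):
    """Toy water balance: runoff falls as warming raises evaporative losses."""
    losses = min(0.9, et_per_degree * temp_c)
    return precip_mm * runoff_coeff * (1.0 - losses)

baseline = {"precip_mm": 1000.0, "temp_c": 10.0}
scenarios = {
    "warming_plus_2C": {"precip_mm": 1020.0, "temp_c": 12.0},
    "warming_plus_4C": {"precip_mm": 980.0, "temp_c": 14.0},
}

q_base = mean_flow(**baseline)
for name, forcing in scenarios.items():
    change = 100.0 * (mean_flow(**forcing) - q_base) / q_base
    print(f"{name}: {change:+.1f}% change in mean flow")
```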