937 results for "Inovation models in nets"
Abstract:
The research examines the deposition of airborne particles containing heavy metals and investigates methods for identifying their sources. It focuses on lead and cadmium, because these two metals are of growing public and scientific concern on environmental-health grounds. The research consists of three distinct parts. The first is the development and evaluation of a new deposition measurement instrument, the deposit canister, designed specifically for large-scale surveys in urban areas. The deposit canister is cheap, robust and versatile, and therefore permits comprehensive high-density urban surveys. The siting policy reduces contamination from locally resuspended surface dust. The second part of the research involved detailed surveys of heavy metal deposition in Walsall, West Midlands, using the new high-density measurement method. The main survey, conducted over a six-week period in November–December 1982, provided 30-day samples of deposition at 250 different sites. The results have been used to examine the magnitude and spatial variability of deposition rates in the case-study area, and to evaluate the performance of the measurement method. The third part of the research was a 'source-identification' exercise. The methods used were receptor models (factor analysis and cluster analysis) and a predictive source-based deposition model. The results indicate that there are six main source processes contributing to deposition of metals in the Walsall area: coal combustion, vehicle emissions, ironfounding, copper refining and two general industrial/urban processes. A source-based deposition model has been calibrated using factor scores for one source factor as the dependent variable, rather than metal deposition rates, thus avoiding problems traditionally encountered in calibrating models in complex multi-source areas.
Empirical evidence supports the hypothesised association of this factor with emissions of metals from the ironfoundry industry.
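The receptor-model step described above can be illustrated with a small sketch: the snippet below runs a factor analysis on a synthetic site-by-metal deposition matrix. All data, source profiles and metal counts here are hypothetical placeholders, not the Walsall measurements; the point is only to show how per-site factor scores, of the kind used as the dependent variable in the model calibration, are obtained.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic deposition matrix: rows = monitoring sites, columns = metals.
# Two hypothetical source profiles (e.g. "combustion" and "foundry") mix
# with site-specific source strengths plus noise, mimicking a receptor-model setup.
rng = np.random.default_rng(1)
profiles = np.array([[5.0, 1.0, 0.2, 0.1],   # metal signature of source 1
                     [0.3, 0.2, 4.0, 2.0]])  # metal signature of source 2
strengths = rng.exponential(1.0, size=(250, 2))
X = strengths @ profiles + rng.normal(0.0, 0.1, size=(250, 4))

fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(X)   # per-site factor scores
loadings = fa.components_      # per-metal loadings, read as source signatures
print(scores.shape, loadings.shape)
```

In a real receptor-model study the loadings would be inspected (often after rotation) to associate each factor with a candidate source process before the scores are used downstream.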
Abstract:
This thesis is a theoretical study of the accuracy and usability of models that attempt to represent the environmental control system of buildings in order to improve environmental design. These models have evolved from crude representations of a building and its environment to an accurate representation of the dynamic characteristics of the environmental stimuli on buildings. Each generation of models has had its own particular influence on built form. This thesis analyses the theory, structure and data of such models in terms of their accuracy of simulation and therefore their validity in influencing built form. The models are also analysed in terms of their compatibility with the design process and hence their ability to aid designers. The conclusions are that such models are unlikely to improve environmental performance since: (a) the models can only be applied to a limited number of building types; (b) they can only be applied to a restricted number of the characteristics of a design; (c) they can only be employed after many major environmental decisions have been made; (d) the data used in the models are inadequate and unrepresentative; (e) the models do not account for occupant interaction in environmental control. It is argued that further improvements in the accuracy of simulation of environmental control will not significantly improve environmental design. This is based on the premise that strategic environmental decisions are made at the conceptual stages of design, whereas models influence the detailed stages of design. It is hypothesised that if models are to improve environmental design, it must be through the analysis of building typologies, which provides a method of feedback between models and the conceptual stages of design.
Field studies are presented to describe a method by which typologies can be analysed and a theoretical framework is described which provides a basis for further research into the implications of the morphology of buildings on environmental design.
Abstract:
The importance of tissue transglutaminase (TG2) in angiogenesis is unclear and contradictory. Here we show that inhibition of extracellular TG2 protein crosslinking or downregulation of TG2 expression leads to inhibition of angiogenesis in cell culture, the aorta ring assay and in vivo models. In a human umbilical vein endothelial cell (HUVEC) co-culture model, inhibition of extracellular TG2 activity can halt the progression of angiogenesis, even when introduced after tubule formation has commenced and after addition of excess vascular endothelial growth factor (VEGF). In both cases, this leads to a significant reduction in tubule branching. Knockdown of TG2 by short hairpin RNA (shRNA) results in inhibition of HUVEC migration and tubule formation, which can be restored by add-back of wild-type (wt) TG2, but not by the transamidation-defective but GTP-binding mutant W241A. TG2 inhibition results in inhibition of fibronectin deposition in HUVEC monocultures with a parallel reduction in matrix-bound VEGFA, leading to a reduction in phosphorylated VEGF receptor 2 (VEGFR2) at Tyr1214 and its downstream effectors Akt and ERK1/2, and importantly its association with β1 integrin. We propose a mechanism for the involvement of matrix-bound VEGFA in angiogenesis that is dependent on extracellular TG2-related activity. © 2013 Macmillan Publishers Limited. All rights reserved.
Abstract:
The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models, and supports more effective software design in terms of understanding, sharing and reuse in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. In this paper, a methodology with tool support is proposed to automatically derive ontological metadata from formal software models and semantically describe them.
Abstract:
This study analyzes the validity of different Q-factor models for BER estimation in RZ-DPSK transmission at a 40 Gb/s channel rate. The impact of the duty cycle of the carrier pulses on the accuracy of the BER estimates obtained through the different models has also been studied.
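As background, the simplest such model is the standard Gaussian Q-factor approximation, which maps a measured Q value to a BER estimate; a minimal sketch (function names are illustrative):

```python
import math

def ber_from_q(q_linear):
    """Gaussian approximation: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q_linear / math.sqrt(2))

def q_db(q_linear):
    """Q-factor expressed in dB, the unit usually quoted in transmission studies."""
    return 20.0 * math.log10(q_linear)

# Q = 6 (linear) is a common reference point, corresponding to BER ~ 1e-9
print(q_db(6.0), ber_from_q(6.0))
```

For phase-modulated formats such as RZ-DPSK the underlying Gaussian noise assumption may be inaccurate, which is why comparing different Q-factor models, and the influence of the carrier-pulse duty cycle on their accuracy, is of interest.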
Abstract:
This study presents a computational fluid dynamics (CFD) investigation of Dimethyl Ether (DME) gas adsorptive separation and steam reforming (DME-SR) in a large-scale Circulating Fluidized Bed (CFB) reactor. The CFD model is based on an Eulerian-Eulerian dispersed-flow formulation and solved using commercial software (ANSYS FLUENT). Hydrogen is currently receiving increasing interest as an alternative source of clean energy and has high-potential applications, including the transportation sector and power generation. CFD modelling has attracted considerable recognition in the engineering sector, consequently leading to its use as a tool for process design and optimisation in many industrial processes. In most cases, these processes are difficult or expensive to study in lab-scale experiments. CFD provides a cost-effective methodology to gain detailed information down to the microscopic level. The main objectives of this project are to: (i) develop a predictive model using the ANSYS FLUENT (CFD) commercial code to simulate the flow hydrodynamics, mass transfer, reactions and heat transfer in a large-scale dual fluidized bed system for combined gas separation and steam reforming processes; (ii) implement a suitable adsorption model in the CFD code, through a user-defined function, to predict selective separation of a gas from a mixture; (iii) develop a model for dimethyl ether steam reforming (DME-SR) to predict hydrogen production; and (iv) carry out a detailed parametric analysis in order to establish ideal operating conditions for future industrial application. The project originated from a real industrial problem in collaboration with the industrial partner Dow Corning (UK) and was jointly funded by the Engineering and Physical Sciences Research Council (UK) and Dow Corning. The research examined gas separation by adsorption in a bubbling bed, as part of a dual fluidized bed system.
The adsorption process was simulated based on kinetics derived from experimental data produced as part of a separate PhD project completed under the same fund. The kinetic model was incorporated in the FLUENT CFD tool as a pseudo-first-order rate equation; some of the parameters for the pseudo-first-order kinetics were obtained using MATLAB. The modelling of DME adsorption in the designed bubbling bed was performed for the first time in this project, which highlights the novelty of the investigation. The simulation results were analysed to provide understanding of the flow hydrodynamics, reactor design and optimum operating conditions for efficient separation. Validation of the bubbling bed, via the estimated bed expansion and the solid and gas distributions from simulation, agreed well with trends seen in the literature. Parametric analysis of the adsorption process demonstrated that increasing the fluidizing velocity reduced the adsorption of DME. This is a result of the reduction in gas residence time, which appears to have a greater effect than the solid residence time. The removal efficiency of DME from the bed was found to be more than 88%. Simulation of the DME-SR in FLUENT CFD was conducted using selected kinetics from the literature, implemented in the model through an in-house-developed user-defined function. The kinetics were validated by simulating a case replicating an experimental study of a laboratory-scale bubbling bed by Vicente et al [1]. Good agreement was achieved in the validation of the models, which were then applied to DME-SR in the large-scale riser section of the dual fluidized bed system. This is the first study to use the selected DME-SR kinetics in a circulating fluidized bed (CFB) system and for the geometry size proposed for the project. As a result, the simulation produced the first detailed data on the spatial variation and final gas product in such an industrial-scale fluidized bed system.
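As an illustration of the kinetic form mentioned above, a pseudo-first-order uptake model can be sketched as follows; the parameter values below are purely hypothetical placeholders, not the constants fitted in this project:

```python
import math

# Pseudo-first-order adsorption: dq/dt = k * (q_e - q),
# with analytical solution q(t) = q_e * (1 - exp(-k * t)).
def uptake(t, q_e, k):
    """Adsorbed amount at time t for equilibrium capacity q_e and rate constant k."""
    return q_e * (1.0 - math.exp(-k * t))

# Illustrative (hypothetical) parameters: q_e in mol/kg, k in 1/s
q_e, k = 2.0, 0.05
for t in (0, 30, 60, 120):
    print(t, round(uptake(t, q_e, k), 3))
```

In the actual study the rate constant and equilibrium capacity were fitted to experimental data (partly in MATLAB) before the rate expression was embedded in a FLUENT user-defined function.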
The simulation results provided insight into the flow hydrodynamics, reactor design and optimum operating conditions. The solid and gas distributions in the CFB showed good agreement with the literature. The parametric analysis showed that increases in temperature and in the steam-to-DME molar ratio increased the production of hydrogen due to increased DME conversion, whereas an increase in the space velocity was found to have an adverse effect. Increasing the temperature from 200 °C to 350 °C increased DME conversion from 47% to 99%, while the hydrogen yield increased substantially from 11% to 100%. The CO2 selectivity decreased from 100% to 91% due to the water-gas shift reaction favouring CO at higher temperatures. The higher conversions observed as the temperature increased were reflected in the quantities of unreacted DME and methanol in the product gas, both of which decreased to very low values of 0.27 mol% and 0.46 mol% respectively at 350 °C. Increasing the steam-to-DME molar ratio from 4 to 7.68 increased the DME conversion from 69% to 87%, while the hydrogen yield increased from 40% to 59%. The CO2 selectivity decreased from 100% to 97%. Decreasing the space velocity from 37104 ml/g/h to 15394 ml/g/h increased the DME conversion from 87% to 100% while increasing the hydrogen yield from 59% to 87%. The parametric analysis suggests that the operating condition for maximum hydrogen yield is in the region of 300 °C and a steam/DME molar ratio of 5. Analysis of the industrial sponsor's case, for the given flow and composition of the gas to be treated, suggests that 88% of the DME can be adsorbed from the bubbling bed, consequently producing 224.4 t/y of hydrogen in the riser section of the dual fluidized bed system. The process also produces 1458.4 t/y of CO2 and 127.9 t/y of CO as part of the product gas.
The models developed and the parametric analysis carried out in this study provide essential guidelines for the future design of DME-SR at an industrial level; in particular, this work has been of great value to the industrial collaborator in drawing conclusions and planning for potential future implementation of the process at an industrial scale.
Abstract:
Most empirical work in economic growth assumes either a Cobb–Douglas production function expressed in logs or a log-approximated constant elasticity of substitution (CES) specification. Estimates from each are likely biased due to logging the model, and the latter can also suffer from approximation bias. We illustrate this with a successful replication of Masanjala and Papageorgiou (The Solow model with CES technology: nonlinearities and parameter heterogeneity, Journal of Applied Econometrics 2004; 19: 171–201) and then estimate both models in levels to avoid these biases. Our estimation in levels gives results in line with conventional wisdom.
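The idea of estimating in levels can be sketched with nonlinear least squares on a CES production function. The snippet below uses synthetic data and illustrative parameter values; it is not the authors' dataset or code, only a sketch of the estimation strategy:

```python
import numpy as np
from scipy.optimize import curve_fit

# CES production function in levels:
# Y = A * (d * K**(-rho) + (1 - d) * L**(-rho))**(-1/rho)
def ces(X, A, d, rho):
    K, L = X
    return A * (d * K**(-rho) + (1 - d) * L**(-rho))**(-1.0 / rho)

# Synthetic data with small multiplicative noise (illustrative values only)
rng = np.random.default_rng(0)
K = rng.uniform(1.0, 10.0, 200)
L = rng.uniform(1.0, 10.0, 200)
A_true, d_true, rho_true = 2.0, 0.4, 0.5
Y = ces((K, L), A_true, d_true, rho_true) * (1 + 0.005 * rng.standard_normal(200))

# Nonlinear least squares in levels, rather than OLS on a logged model
params, _ = curve_fit(ces, (K, L), Y, p0=(1.0, 0.5, 0.1),
                      bounds=([0.1, 0.01, 0.01], [10.0, 0.99, 2.0]))
A_hat, d_hat, rho_hat = params
print(np.round(params, 2))
```

Fitting the CES form directly in levels sidesteps both the bias from logging the model and the approximation error of a log-linearised CES specification.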
Abstract:
Presentation of an abstract
Abstract:
The 6th edition of the workshop Models@run.time was held at the 14th International Conference MODELS. The workshop took place in the city of Wellington, New Zealand, on the 17th of October 2011. The workshop was organised by Nelly Bencomo, Gordon Blair, Robert France, Betty H.C. Cheng, and Cédric Jeanneret. We present a summary of the workshop and a synopsis of the papers presented during the workshop. © 2012 Springer-Verlag Berlin Heidelberg.
Abstract:
The second edition of the workshop Models@run.time was co-located with the ACM/IEEE 10th International Conference on Model Driven Engineering Languages and Systems. The workshop took place in the lively city of Nashville, USA, on the 2nd of October, 2007. The workshop was organised by Nelly Bencomo, Robert France, and Gordon Blair and was attended by at least 25 people from 7 countries. This summary gives an overview of the presentations and lively discussions that took place during the workshop. © 2008 Springer-Verlag Berlin Heidelberg.
Abstract:
The 5th edition of the workshop Models@run.time was held at the 13th International Conference MODELS. The workshop took place in the exciting city of Oslo, Norway, on the 5th of October 2010. The workshop was organised by Nelly Bencomo, Gordon Blair, Franck Fleurey, and Cédric Jeanneret. It was attended by at least 33 people from more than 11 countries. In this summary we present a synopsis of the presentations and discussions that took place during the workshop. © 2011 Springer-Verlag Berlin Heidelberg.
Abstract:
The 4th edition of the workshop Models@run.time was held at the 12th International Conference on Model Driven Engineering Languages and Systems (MODELS). The workshop took place in the city of Denver, Colorado, USA, on the 5th of October 2009. The workshop was organised by Nelly Bencomo, Robert France, Gordon Blair, Freddy Muñoz, and Cédric Jeanneret. It was attended by at least 45 people from more than 10 countries. In this summary we present a synopsis of the presentations and discussions that took place during the 4th International Workshop on Models@run.time. © Springer-Verlag Berlin Heidelberg 2010.
Abstract:
The third edition of the workshop Models@run.time was held at the ACM/IEEE 11th International Conference on Model Driven Engineering Languages and Systems (MODELS). The workshop took place in the beautiful city of Toulouse, France, on the 30th of October, 2008. The workshop was organised by Nelly Bencomo, Robert France, Gordon Blair, Freddy Muñoz, and Cédric Jeanneret. It was attended by at least 44 people from more than 10 countries. In this summary we present an overview of the presentations and fruitful discussions that took place during the 3rd edition of the workshop Models@run.time.
Abstract:
The use of spreadsheets has become routine in all aspects of business, with usage growing across a range of functional areas and a continuing trend towards end-user spreadsheet development. However, several studies have raised concerns about the accuracy of spreadsheet models in general, and of end-user-developed applications in particular, raising the level of risk for users. High error rates have been discovered, even though the users/developers were confident that their spreadsheets were correct. The lack of an easy-to-use, context-sensitive validation methodology has been highlighted as a significant contributor to the problems of accuracy. This paper describes experiences in using a practical, contingency-factor-based methodology for the validation of spreadsheet-based decision support systems (DSS). Because the end user is often both the system developer and a stakeholder, the contingency-factor-based validation methodology may need to be used in more than one way. The methodology can also be extended to encompass other DSS.