813 results for Constraint based modelling


Relevance: 40.00%

Abstract:

The UK government aims to achieve an 80% reduction in CO2 emissions by 2050, which requires collective effort across all UK industry sectors. The housing sector in particular has large potential to contribute, because it alone accounts for 27% of total UK CO2 emissions, and 87% of the housing responsible for that 27% will still be standing in 2050. It is therefore essential to improve the energy efficiency of existing housing stock built to low energy efficiency standards. To do so, the whole house needs to be refurbished in a sustainable way, considering the lifetime financial and environmental impacts of the refurbished house. However, the current refurbishment process struggles to generate financially and environmentally affordable refurbishment solutions, owing to the highly fragmented nature of refurbishment practice and a lack of knowledge and skills about whole-house refurbishment in the construction industry. To generate an affordable refurbishment solution, diverse information on the costs and environmental impacts of refurbishment measures and materials should be collected and integrated, in the right sequence, among key project stakeholders throughout the refurbishment project life cycle. Consequently, researchers are increasingly studying how Building Information Modelling (BIM) can be used to tackle current problems in the construction industry, because BIM can support construction professionals in managing projects collaboratively by integrating diverse information, and in determining the best refurbishment solution among various alternatives by calculating the life cycle costs and lifetime CO2 performance of each solution. Despite this capability, BIM adoption in the housing sector is low, at 25%, and the use of BIM for housing refurbishment projects has rarely been studied. This research therefore aims to develop a BIM framework for formulating a financially and environmentally affordable whole-house refurbishment solution based simultaneously on the Life Cycle Costing (LCC) and Life Cycle Assessment (LCA) methods. To achieve this aim, a BIM feasibility study was conducted as a pilot to examine whether BIM is suitable for housing refurbishment, and a BIM framework was developed on the basis of grounded theory, since there was no precedent research. The framework was then examined through a hypothetical case study using BIM input data collected from a questionnaire survey of homeowners' preferences for housing refurbishment. Finally, the framework was validated with academics and professionals, who were provided with the BIM framework and a refurbishment solution formulated through it on the basis of the LCC and LCA studies. As a result, BIM was identified as suitable for housing refurbishment as a management tool, and the development of the BIM framework is timely. The framework, comprising seven project stages, was developed to formulate an affordable refurbishment solution. Through the case study, the Building Regulation standard was identified as the most affordable energy efficiency standard, yielding the best LCC and LCA results when applied to a whole-house refurbishment solution.

In addition, the Fabric Energy Efficiency Standard (FEES) is recommended when customers are willing to adopt a higher energy standard; up to 60% of CO2 emissions can be reduced through whole-house fabric refurbishment to the FEES. Furthermore, limitations of and challenges to fully utilising the BIM framework for housing refurbishment were revealed, such as a lack of BIM objects with proper cost and environmental information, limited interoperability between different BIM software packages, and the limited LCC and LCA datasets available in BIM systems. Finally, the BIM framework was validated as suitable for housing refurbishment projects, and reviewers commented that the framework would be more practical if a dedicated BIM library for housing refurbishment with proper LCC and LCA datasets were developed. This research is expected to provide a systematic way of formulating a refurbishment solution using BIM and to become a basis for further research on BIM for the housing sector that resolves the current limitations and challenges. Future research should enhance the BIM framework by developing a more detailed process map and BIM objects with proper LCC and LCA information.
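As a minimal sketch of the kind of life cycle cost comparison such a framework relies on, the Python fragment below computes a net-present-value LCC for two refurbishment options; the option names, cash flows and discount rate are invented for illustration and are not figures from the thesis.

```python
# Hypothetical Life Cycle Costing (LCC) sketch: compare refurbishment options
# by the net present value of the capital cost plus discounted annual energy
# costs.  Option names, cash flows and the discount rate are invented.

def life_cycle_cost(capital_cost, annual_energy_cost, years, discount_rate):
    """Net present value of owning the refurbished dwelling over `years`."""
    discounted_energy = sum(
        annual_energy_cost / (1 + discount_rate) ** t for t in range(1, years + 1)
    )
    return capital_cost + discounted_energy

options = {
    "Building Regulation standard": life_cycle_cost(18_000, 900, 30, 0.035),
    "FEES standard": life_cycle_cost(26_000, 600, 30, 0.035),
}
for name, lcc in options.items():
    print(f"{name}: GBP {lcc:,.0f} over 30 years")
```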

Relevance: 40.00%

Abstract:

Reliability modelling and verification are indispensable in modern manufacturing, especially for reducing risk in product development. Following a discussion of the deficiencies of traditional reliability modelling methods with respect to process reliability, a novel modelling method is presented that draws upon a knowledge network of process scenarios based on the analytic network process (ANP). An integration framework for manufacturing process reliability and product quality is presented, together with a product development and reliability verification process. According to their roles in manufacturing processes, key characteristics (KCs) are organised into four clusters, namely product KCs, material KCs, operation KCs and equipment KCs, which represent the process knowledge network of manufacturing processes. A mathematical model and algorithm are developed for calculating the reliability requirements of KCs under different manufacturing process scenarios. A case study on valve-sleeve component manufacturing is provided as an application example of the new reliability modelling and verification procedure, in which the methodology is applied to manage and deploy production resources.
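The sketch below illustrates, under stated assumptions, how ANP-style limit priorities over the four KC clusters could be used to apportion a process reliability requirement; the supermatrix entries and the target reliability are invented, and the allocation rule is a generic weighted one rather than the paper's actual model.

```python
import numpy as np

# Hedged ANP-style sketch: key characteristics (KCs) from the four clusters
# (product, material, operation, equipment) form a network whose local
# priorities are assembled into a column-stochastic supermatrix.  Raising the
# matrix to a high power yields limit (global) priorities, here used with a
# generic weighted rule to apportion a process reliability requirement.
# All numbers are invented; this is not the paper's actual model.

kcs = ["product_KC", "material_KC", "operation_KC", "equipment_KC"]
supermatrix = np.array([
    [0.00, 0.50, 0.40, 0.30],
    [0.40, 0.00, 0.30, 0.30],
    [0.30, 0.30, 0.00, 0.40],
    [0.30, 0.20, 0.30, 0.00],
])
supermatrix /= supermatrix.sum(axis=0)          # ensure columns sum to 1

limit = np.linalg.matrix_power(supermatrix, 100)
global_priorities = limit[:, 0]                 # all columns converge to the same vector

target_reliability = 0.99                       # hypothetical overall process requirement
for kc, w in zip(kcs, global_priorities):
    # the product of the allocated reliabilities equals the overall target
    print(f"{kc}: priority {w:.3f}, allocated reliability {target_reliability ** w:.5f}")
```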

Relevance: 40.00%

Abstract:

In 2004 and 2005 we collected samples of phytoplankton, zooplankton and macroinvertebrates in a small artificial pond in Budapest. We set up a simulation model predicting the abundance of cyclopoids, Eudiaptomus zachariasi and Ischnura pumilio from temperature alone, acting on the previous day's abundance of each population. Phytoplankton abundance was simulated from temperature together with the abundances of the three groups mentioned above. This discrete deterministic model generated patterns similar to those observed, and testing it on historical data was successful. However, because the model overpredicted the abundances of Ischnura pumilio and Cyclopoida at the end of the year, those results were not considered. Running the model with the data series of climate change scenarios allowed us to predict individual numbers for the period around 2050. When the model is run with the UKHI and UKLO scenarios, which predict drastic global warming, we observe a decrease in abundance and a shift in the date at which maximum abundance occurs (except for Ischnura pumilio, whose maximum abundance increases and occurs later), whereas under unchanged climatic conditions (the BASE scenario) the change in abundance is negligible. The GFDL 2535, GFDL 5564 and UKTR scenarios show a transitional response.
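A minimal sketch of a discrete deterministic, temperature-driven abundance model of this kind is given below; the growth-rate response, parameters and synthetic temperature curve are all invented for illustration and are not the fitted values from the study.

```python
import math

# Hypothetical sketch of a discrete deterministic, temperature-driven model:
# tomorrow's abundance depends only on today's abundance and today's water
# temperature.  The response function, parameters and temperature curve are
# invented for illustration; they are not the fitted values from the study.

def growth_rate(temp_c, optimum=20.0, width=6.0, max_rate=0.15):
    """Daily per-capita growth rate, peaking at an optimal temperature."""
    return max_rate * (1.0 - ((temp_c - optimum) / width) ** 2)

def simulate(daily_temps, n0=1.0, capacity=5000.0, floor=1.0):
    """Step abundance forward one day at a time (floor = assumed overwintering stock)."""
    abundance = [n0]
    for temp in daily_temps:
        n = abundance[-1]
        r = growth_rate(temp)
        abundance.append(max(n + r * n * (1.0 - n / capacity), floor))
    return abundance

# A crude sinusoidal temperature curve for one year (degrees Celsius).
temps = [10.0 + 12.0 * math.sin(2.0 * math.pi * (day - 120) / 365.0) for day in range(365)]
series = simulate(temps)
peak = max(series)
print(f"peak abundance {peak:.0f} on day {series.index(peak)}")
```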

Relevance: 40.00%

Abstract:

Climate change is one of the most crucial and far-reaching ecological problems of our age. The seasonal dynamics of aquatic communities are regulated, among other factors, by the climate, especially by temperature. In this case study we attempted to simulate the seasonal dynamics of a copepod species of the zooplankton community, Cyclops vicinus, based on a quantitative database containing ten years of data from the Göd area of the Danube. We set up a simulation model predicting the abundance of Cyclops vicinus from temperature alone, as it affects the abundance of the population. The model was fitted to eight years of daily temperature data observed between 1981 and 1994 and was tested successfully on the data of two further years. The model was then run with data series from climate change scenarios specified for the period around 2070-2100. We also looked for areas geographically analogous to the Göd region, whose present climate most closely resembles the Göd area's projected future climate. Together, these give a view of how the climate of the region will change by the end of the 21st century and of how the seasonal dynamics of a chosen planktonic crustacean species may follow this change. According to our results, the climate of the Göd area will come to resemble that of the northern region of Greece. The maximum abundance of the examined species occurs one to one and a half months earlier, and larger year-to-year variance in abundance is expected, with deviations towards both smaller and significantly larger abundances than observed previously.
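The short post-processing sketch below shows how the timing of the seasonal peak and the year-to-year spread of peak abundance could be summarised from simulated series; the input series are synthetic placeholders, not output of the study's model.

```python
import statistics

# Hedged post-processing sketch: given one simulated abundance series per year
# (e.g. from scenario runs), report when the seasonal peak occurs and how much
# the peak varies between years.  The toy series below are placeholders, not
# output of the study's model.

def peak_index(series):
    """Index (time step) of the maximum simulated abundance."""
    return max(range(len(series)), key=lambda i: series[i])

def summarise(yearly_series):
    peak_times = [peak_index(s) for s in yearly_series]
    peak_values = [max(s) for s in yearly_series]
    return {
        "mean_peak_time": statistics.mean(peak_times),
        "peak_value_stdev": statistics.stdev(peak_values),
    }

# Usage with two toy "years" of weekly abundances:
toy_years = [
    [5, 20, 80, 300, 900, 1200, 700, 200, 50],
    [5, 30, 150, 600, 1400, 800, 300, 90, 20],
]
print(summarise(toy_years))
```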

Relevance: 40.00%

Abstract:

Acknowledgements and funding: We would like to thank the GPs who took part in this study. We would also like to thank Marie Pitkethly and Gail Morrison for their help and support in recruiting GPs to the study. WIME was funded by the Chief Scientist Office, grant number CZH/4/610. The Health Services Research Unit, University of Aberdeen, is core funded by the Chief Scientist Office of the Scottish Government Health Directorates.

Relevance: 40.00%

Abstract:

The application of pharmacokinetic modelling within drug development essentially allows one to develop a quantitative description of the temporal behaviour of a compound of interest at the tissue/organ level, by identifying and defining relationships between a dose of a drug and the dependent variables. To understand and characterise the pharmacokinetics of a drug, it is often helpful to employ pharmacokinetic modelling using empirical or mechanistic approaches. Pharmacokinetic models can be developed in mathematical and statistical commercial software such as MATLAB using traditional mathematical and computational coding, or by using the SimBiology toolbox available within MATLAB for a graphical-user-interface approach to developing physiologically based pharmacokinetic (PBPK) models. For formulations dosed orally, a prerequisite for clinical activity is the entry of the drug into the systemic circulation.
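As an illustration, the sketch below implements a simple one-compartment oral-absorption pharmacokinetic model in Python (an empirical model, far simpler than a full PBPK model); all parameter values are assumed for demonstration only.

```python
import numpy as np
from scipy.integrate import odeint

# Hedged sketch of a one-compartment oral-absorption pharmacokinetic model.
# Parameter values are illustrative assumptions, not data from any study.

ka = 1.2      # first-order absorption rate constant (1/h)
ke = 0.25     # first-order elimination rate constant (1/h)
V = 30.0      # apparent volume of distribution (L)
dose = 200.0  # oral dose (mg)
F = 0.8       # bioavailable fraction reaching the systemic circulation

def model(y, t):
    gut, central = y
    dgut = -ka * gut                     # drug leaving the gut depot
    dcentral = ka * gut - ke * central   # appearance in and loss from plasma
    return [dgut, dcentral]

t = np.linspace(0, 24, 200)
sol = odeint(model, [F * dose, 0.0], t)
concentration = sol[:, 1] / V            # plasma concentration (mg/L)
print(f"Cmax = {concentration.max():.2f} mg/L at t = {t[concentration.argmax()]:.1f} h")
```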

Relevance: 40.00%

Abstract:

Knowledge and its management have been accepted, respectively, as a critical resource and a core business competency. Despite this, the literature shows a gap between the theoretical considerations of Knowledge Management (KM) and their efficient application. This shortcoming, we argue, derives from the missing link between a Knowledge Management framework and the particular methods and guidelines for its implementation. In an attempt to bridge this gap, an original, process-based holistic Knowledge Management framework is proposed, aiming to address the problem of Knowledge Management application and performance by utilising a set of well-accepted Enterprise Modelling (EM) methods and tools.

Relevance: 40.00%

Abstract:

Succinate is a naturally occurring metabolite in the cells of organisms and an industrially important chemical with various applications in the food and pharmaceutical industries. It is also widely used to produce biodegradable plastics, surfactants, detergents and similar products. In recent decades, emphasis has been placed on bio-based chemical production via industrial biotechnology rather than fossil-based production, in view of sustainability and an environmentally friendly economy. In this thesis I present a computational model for the in silico metabolic engineering of Saccharomyces cerevisiae for large-scale production of succinate. For metabolic modelling, I used the OptKnock and OptGene optimization algorithms to identify reactions to delete from the genome-scale metabolic model of S. cerevisiae so as to overproduce succinate by coupling its production with the organism's growth. Both OptKnock and OptGene proposed numerous straightforward and non-intuitive deletion strategies when a number of constraints, including a growth constraint, were applied to the model. The most interesting strategy identified by both algorithms was the combined deletion of the pyruvate decarboxylase and ubiquinol:ferricytochrome c reductase (respiratory enzyme) reactions, which also implies anaerobic fermentation of the organism in glucose medium. Such a strategy has not previously been reported for growth-coupled succinate production in S. cerevisiae.
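A hedged sketch of how such a deletion strategy could be checked with COBRApy is given below; it assumes the yeast genome-scale model iMM904 can be fetched by its BiGG identifier and that the target reactions can be located by name fragments, so the lookups and exchange-reaction id are placeholders rather than the thesis's actual OptKnock/OptGene formulation.

```python
import cobra

# Hedged sketch of checking a reaction-deletion strategy with COBRApy.  Exact
# reaction identifiers and names differ between model versions, so the name
# lookups and the exchange id below are placeholders.

model = cobra.io.load_model("iMM904")        # assumes the BiGG model is reachable

def find_reactions(model, fragment):
    """All reactions whose name contains the (case-insensitive) fragment."""
    return [r for r in model.reactions if fragment.lower() in r.name.lower()]

targets = ["pyruvate decarboxylase", "ferricytochrome c reductase"]
with model:                                  # knockouts are reverted on exit
    for fragment in targets:
        hits = find_reactions(model, fragment)
        if not hits:
            print(f"warning: no reaction matching '{fragment}' in this model")
        for rxn in hits:
            rxn.knock_out()
    solution = model.optimize()              # maximise growth after the knockouts
    succinate = solution.fluxes.get("EX_succ_e", 0.0)   # exchange id is model-specific
    print(f"growth: {solution.objective_value:.3f} 1/h, "
          f"succinate secretion: {succinate:.3f} mmol/gDW/h")
```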

Relevance: 40.00%

Abstract:

This thesis presents quantitative studies of T cell and dendritic cell (DC) behaviour in mouse lymph nodes (LNs) in the naive state and following immunisation. These processes are of importance and interest in basic immunology, and better understanding could improve both diagnostic capacity and therapeutic manipulations, potentially helping to produce more effective vaccines or to develop treatments for autoimmune diseases. The problem is also interesting conceptually, as it is relevant to other fields where the 3D movement of objects is tracked with a discrete scanning interval. A general immunology introduction is presented in chapter 1. In chapter 2, I apply quantitative methods to multi-photon imaging data to measure how T cells and DCs are spatially arranged in LNs. This has previously been studied to describe differences between the naive and immunised states and as an indicator of the magnitude of the immune response in LNs, but previous analyses have been largely descriptive. The quantitative analysis shows that some of the previous conclusions may have been premature. In chapter 3, I use Bayesian state-space models to test hypotheses about the mode of T cell search for DCs. A two-state mode of movement, in which T cells are classified as either interacting with a DC or freely migrating, is supported over a model in which T cells home in on DCs at a distance through, for example, the action of chemokines. In chapter 4, I study whether T cell migration is linked to the geometric structure of the fibroblast reticular cell (FRC) network. I find support for the hypothesis that movement is constrained to the FRC network over an alternative 'random walk with persistence time' model in which cells move randomly, with short-term persistence driven by a hypothetical T cell-intrinsic 'clock'. I also present unexpected results on the FRC network geometry. Finally, a quantitative method is presented for addressing some measurement biases inherent to multi-photon imaging. In all three chapters, novel findings are made, and the methods developed have the potential for further use on important problems in the field. In chapter 5, I present a summary and synthesis of the results of chapters 3-4 and a more speculative discussion of these results and potential future directions.
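To illustrate the two-state idea, the sketch below simulates a T cell track that switches between an 'interacting' (nearly stationary) and a 'migrating' (persistent random walk) state; the transition probabilities, speeds and imaging interval are invented, and this is far simpler than the Bayesian state-space models used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hedged illustration of a two-state movement model: at each imaging frame a
# T cell is either "migrating" (a persistent random walk step) or
# "interacting" (nearly stationary).  All numbers are invented.

P = np.array([[0.90, 0.10],    # migrating -> (migrating, interacting)
              [0.20, 0.80]])   # interacting -> (migrating, interacting)
speeds = [5.0, 0.5]            # mean step length per frame (um) in each state
dt_min = 0.5                   # imaging interval (minutes)

def simulate_track(n_frames=120):
    state, pos, direction = 0, np.zeros(2), rng.normal(size=2)
    track = [pos.copy()]
    for _ in range(n_frames):
        state = rng.choice(2, p=P[state])
        # persistent direction with a small random turn each frame
        direction = direction + 0.4 * rng.normal(size=2)
        direction /= np.linalg.norm(direction)
        pos = pos + speeds[state] * direction
        track.append(pos.copy())
    return np.array(track)

track = simulate_track()
steps = np.linalg.norm(np.diff(track, axis=0), axis=1)
print(f"mean speed: {steps.mean() / dt_min:.1f} um/min")
```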

Relevance: 40.00%

Abstract:

This study compares two simulation techniques: Discrete Event Simulation (DES) and Agent Based Simulation (ABS). DES is one of the best-known simulation techniques in Operational Research. Recently, another technique, ABS, has emerged. One of the qualities of ABS is that it helps to gain a better understanding of complex systems involving the interaction of people with their environment, as it allows one to model concepts such as autonomy and pro-activeness, which are important attributes to consider. Although there is a lot of literature relating to DES and ABS, we have found none that explores the capability of both to tackle the human behaviour issues relating to queuing time and customer satisfaction in the retail sector. The objective of this study is therefore to identify empirically the differences between these simulation techniques by simulating the potential economic benefits of introducing new policies in a department store. To apply the new strategy, the behaviour of consumers in a retail store will be modelled using both the DES and the ABS approach, and the results will be compared. We aim to understand which simulation technique is better suited to human behaviour modelling by investigating the capability of both techniques to predict the best solution for an organisation applying management practices. Our main concern is to maximise customer satisfaction, for example by minimising waiting times for the different services provided.
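As a toy counterpart of the DES side of such a comparison, the sketch below models a single service point with SimPy and records customer waiting times; the arrival and service rates and the number of servers are invented.

```python
import random
import simpy

# Hedged DES sketch of one service point in a department store: customers
# arrive, queue for a server, and their waiting times are recorded.  The
# arrival rate, service rate and number of servers are invented.

random.seed(1)
ARRIVAL_MEAN = 2.0   # minutes between arrivals
SERVICE_MEAN = 3.5   # minutes per customer
N_SERVERS = 2
waits = []

def customer(env, desk):
    arrived = env.now
    with desk.request() as req:
        yield req                      # wait for a free server
        waits.append(env.now - arrived)
        yield env.timeout(random.expovariate(1.0 / SERVICE_MEAN))

def arrivals(env, desk):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(customer(env, desk))

env = simpy.Environment()
desk = simpy.Resource(env, capacity=N_SERVERS)
env.process(arrivals(env, desk))
env.run(until=8 * 60)                  # one 8-hour trading day
print(f"served {len(waits)} customers, mean wait {sum(waits) / len(waits):.1f} min")
```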

Relevance: 40.00%

Abstract:

Part 4: Transition Towards Product-Service Systems

Relevance: 40.00%

Abstract:

Discrete Event Simulation (DES) is a very popular simulation technique in Operational Research. Recently, another technique, Agent Based Simulation (ABS), has emerged. Although there is a lot of literature relating to DES and ABS, we have found little that explores the capabilities of both in tackling human behaviour issues. To understand the gap between these two simulation techniques, our aim is therefore to understand how DES and ABS models differ in representing the real-world phenomenon when modelling and simulating human behaviour. To achieve this aim, we carried out a case study at a department store. The DES and ABS models will be compared on the same problem domain, which concerns management policy for a fitting room. The behaviour of staff while working and customers' satisfaction will be modelled in both, in order to understand the behaviour of the two types of model.
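To complement the DES sketch earlier in this listing, the following is a hedged agent-based sketch in plain Python: customers and a fitting-room attendant are explicit agents updated each minute, and a customer's satisfaction decays while queuing. All rules and numbers are invented; this is not the study's model.

```python
import random

# Hedged agent-based sketch of a fitting room: explicit customer and staff
# agents, a simple admission rule, and satisfaction that decays while waiting.

random.seed(2)

class Customer:
    def __init__(self, arrival_minute):
        self.arrival_minute = arrival_minute
        self.satisfaction = 1.0

    def wait_one_minute(self):
        self.satisfaction = max(0.0, self.satisfaction - 0.05)

class Attendant:
    """Staff agent controlling access to a fixed number of fitting rooms."""
    def __init__(self, rooms=4):
        self.rooms = rooms
        self.free_rooms = rooms

    def admit(self, queue, served):
        while self.free_rooms and queue:
            served.append(queue.pop(0))     # next customer enters a room
            self.free_rooms -= 1

queue, served = [], []
attendant = Attendant()
for minute in range(240):                   # a four-hour trading period
    if random.random() < 0.3:               # a new customer arrives
        queue.append(Customer(minute))
    if random.random() < 0.25 and attendant.free_rooms < attendant.rooms:
        attendant.free_rooms += 1           # someone leaves a fitting room
    attendant.admit(queue, served)
    for customer in queue:
        customer.wait_one_minute()

if served:
    mean_sat = sum(c.satisfaction for c in served) / len(served)
    print(f"served {len(served)} customers, mean satisfaction {mean_sat:.2f}")
```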

Relevance: 40.00%

Abstract:

Seeking better knowledge of the interactions between water and ionic liquids (ILs), a systematic study of the activity coefficients of water in pyridinium-, pyrrolidinium- and piperidinium-based ILs at 298.2 K is presented here, based on water activity measurements. The structural effects of the pyridinium-based cation are also studied. The results show that non-aromatic ILs interact more with water than aromatic ones, and that, among the ortho, meta and para isomers of 1-butyl-methylpyridinium chloride, the ortho position confers a more hydrophilic character on that specific IL. The physical chemistry of the solutions was interpreted on the basis of dissociation constants, natural bond orbitals and excess enthalpies, providing a sound basis for the interpretation of the experimental observations. These results show that hydrogen bonding controls the behavior of these systems, with the anion-water interaction being one of the most relevant, though modulated by the anion-cation interactions.
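As a worked illustration of how an activity coefficient follows from a water activity measurement, the sketch below applies gamma_w = a_w / x_w to an assumed mixture; the IL molar mass, sample masses and measured activity are invented.

```python
# Hedged sketch: activity coefficient of water from a water activity
# measurement, gamma_w = a_w / x_w, with x_w the mole fraction of water in the
# water + IL mixture.  All numerical values are assumptions for illustration.

M_WATER = 18.015   # g/mol
M_IL = 185.69      # g/mol, approximate value assumed for a butylmethylpyridinium chloride

def water_activity_coefficient(a_w, mass_water_g, mass_il_g):
    n_water = mass_water_g / M_WATER
    n_il = mass_il_g / M_IL
    x_water = n_water / (n_water + n_il)   # mole fraction of water
    return a_w / x_water

# Example: 1 g of water in 5 g of IL with a measured water activity of 0.55.
print(f"gamma_w = {water_activity_coefficient(0.55, 1.0, 5.0):.2f}")
```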

Relevance: 40.00%

Abstract:

We analysed the use of microneedle-based electrodes to enhance electroporation of mouse testis with DNA vectors for production of transgenic mice. Different microneedle formats were developed and tested, and we ultimately used electrodes based on arrays of 500 μm tall microneedles. In a series of experiments involving injection of a DNA vector expressing Green Fluorescent Protein (GFP) and electroporation using microneedle electrodes and a commercially available voltage supply, we compared the performance of flat and microneedle electrodes by measuring GFP expression at various timepoints after electroporation. Our main finding, supported by both experimental and simulated data, is that needles significantly enhanced electroporation of testis.

Relevance: 40.00%

Abstract:

A High-Performance Computing (HPC) job dispatcher is a critical piece of software that assigns finite computing resources to submitted jobs. This assignment of resources over time is known as the on-line job dispatching problem in HPC systems. Because the problem is on-line, solutions must be computed in real time, and the time they require cannot exceed a certain threshold without affecting normal system functioning. In addition, a job dispatcher must deal with considerable uncertainty: submission times, the number of requested resources, and job durations. Heuristic techniques have been used broadly in HPC systems; they achieve (sub-)optimal solutions in a short time, but their scheduling and resource allocation components are separated, which produces decoupled decisions that may cause a performance loss. Optimization-based techniques are used less often for this problem, although they can significantly improve the performance of HPC systems at the expense of higher computation time. Nowadays, HPC systems are used for modern applications, such as big data analytics and predictive model building, that in general employ many short jobs. However, job durations are unknown at dispatching time, and job dispatchers need to process large numbers of such jobs quickly while ensuring high Quality-of-Service (QoS) levels. Constraint Programming (CP) has been shown to be an effective approach to job dispatching problems. However, state-of-the-art CP-based job dispatchers are unable to meet the challenges of on-line dispatching, such as generating dispatching decisions within a brief period and integrating current and past information about the hosting system. For these reasons, we propose CP-based dispatchers that are more suitable for HPC systems running modern applications: they generate on-line dispatching decisions within an acceptable time and make effective use of job duration predictions to improve QoS levels, especially for workloads dominated by short jobs.
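The sketch below shows a generic CP formulation of one dispatching step using the OR-Tools CP-SAT solver: queued jobs are placed on a shared pool of cores under a cumulative constraint, with a real-time solving budget; the job data and objective are invented and this is not the dispatcher proposed in the thesis.

```python
from ortools.sat.python import cp_model

# Hedged CP sketch of one on-line dispatching step: schedule the queued jobs
# on a single resource pool (cumulative constraint on cores) while minimising
# total waiting time, under a strict solving-time budget.  Job data invented.

jobs = [  # (requested cores, predicted duration)
    (4, 10), (2, 3), (8, 6), (1, 2), (4, 4),
]
TOTAL_CORES, HORIZON = 16, sum(d for _, d in jobs)

model = cp_model.CpModel()
starts, intervals, demands = [], [], []
for i, (cores, dur) in enumerate(jobs):
    s = model.NewIntVar(0, HORIZON, f"start_{i}")
    e = model.NewIntVar(0, HORIZON, f"end_{i}")
    intervals.append(model.NewIntervalVar(s, dur, e, f"job_{i}"))
    starts.append(s)
    demands.append(cores)

model.AddCumulative(intervals, demands, TOTAL_CORES)   # never exceed the cores
model.Minimize(sum(starts))                            # minimise total waiting

solver = cp_model.CpSolver()
solver.parameters.max_time_in_seconds = 1.0            # real-time budget
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for i, s in enumerate(starts):
        print(f"job {i}: start at t={solver.Value(s)}")
```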