813 results for feature based modelling


Relevance:

40.00%

Publisher:

Abstract:

A novel biosensing system based on a micromachined rectangular silicon membrane is proposed and investigated in this paper. A distributive sensing scheme is designed to monitor the dynamics of the sensing structure. An artificial neural network is used to process the measured data and to identify cell presence and density. Without specifying any particular bio-application, the investigation concentrates mainly on performance testing of this kind of biosensor as a general biosensing platform. The biosensing experiments on the microfabricated membranes involve seeding different cell densities onto the sensing surface of the membrane and measuring the corresponding dynamics of each tested silicon membrane in the form of a series of frequency response functions (FRFs). All of these experiments are carried out in cell culture medium to simulate a practical working environment. The EA.hy 926 endothelial cell line is chosen in this paper for the bio-experiments. This cell line represents a particular class of biological particles that have irregular shapes, non-uniform density and uncertain growth behaviour, which are difficult to monitor using traditional biosensors. The final prediction results reveal that a neural-network-based algorithm performing feature identification of cells from distributive sensory measurements has great potential in biosensing applications.
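
The abstract does not specify the network architecture or the exact FRF features used; the following is a minimal sketch, assuming the magnitudes of each membrane's FRF sampled at fixed frequencies form the input vector and that cell density is binned into a few classes. The data, layer sizes and class count are all illustrative.

```python
# Minimal sketch: classify cell density from frequency response functions (FRFs).
# The synthetic data, layer size and class count are illustrative assumptions;
# the paper's actual network architecture and features are not specified.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

n_samples, n_freq_bins = 200, 64                 # |FRF| sampled at 64 frequencies
X = rng.normal(size=(n_samples, n_freq_bins))    # stand-in for measured FRF vectors
y = rng.integers(0, 3, size=n_samples)           # 3 illustrative cell-density classes

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```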

Relevance:

40.00%

Publisher:

Abstract:

The topic of this thesis is the development of knowledge-based statistical software. The shortcomings of conventional statistical packages are discussed to illustrate the need to develop software that is able to exhibit a greater degree of statistical expertise, thereby reducing the misuse of statistical methods by those not well versed in the art of statistical analysis. Some of the issues involved in the development of knowledge-based software are presented and a review is given of some of the systems that have been developed so far. The majority of these have moved away from conventional architectures by adopting what can be termed an expert-systems approach. The thesis then proposes an approach based upon the concept of semantic modelling. By representing some of the semantic meaning of data, it is conceived that a system could examine a request to apply a statistical technique and check whether the use of the chosen technique is semantically sound, i.e. whether the results obtained will be meaningful. Current systems, in contrast, can only perform what can be considered syntactic checks. The prototype system implemented to explore the feasibility of such an approach is presented; it has been designed as an enhanced variant of a conventional-style statistical package. This involved developing a semantic data model to represent some of the statistically relevant knowledge about data and identifying sets of requirements that should be met for the application of the statistical techniques to be valid. The areas of statistics covered in the prototype are measures of association and tests of location.
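
As a minimal sketch of the semantic check described, assume each variable carries measurement-scale metadata and each technique declares the scales it can meaningfully accept; the prototype's actual semantic data model and requirement sets are considerably richer than this.

```python
# Minimal sketch of a "semantic" validity check on a statistical request,
# assuming variables carry measurement-scale metadata (an assumption; the
# thesis's semantic data model covers more statistically relevant knowledge).
from dataclasses import dataclass

@dataclass
class Variable:
    name: str
    scale: str  # 'nominal', 'ordinal', 'interval' or 'ratio'

# Illustrative requirement sets: scales each technique can meaningfully accept.
REQUIREMENTS = {
    "pearson_correlation": {"interval", "ratio"},
    "chi_squared_association": {"nominal", "ordinal"},
}

def semantically_valid(technique: str, *variables: Variable) -> bool:
    """Syntax aside, would applying this technique yield meaningful results?"""
    allowed = REQUIREMENTS[technique]
    return all(v.scale in allowed for v in variables)

# A semantically unsound request: correlating a ratio variable with a colour.
print(semantically_valid("pearson_correlation",
                         Variable("income", "ratio"),
                         Variable("eye_colour", "nominal")))   # -> False
```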

Relevance:

40.00%

Publisher:

Abstract:

Nearest feature line-based subspace analysis is first proposed in this paper. Compared with conventional methods, the newly proposed one brings better generalization performance and supports incremental analysis. The projection point and the feature line distance are expressed as functions of a subspace, which is obtained by minimizing the mean square feature line distance. Moreover, by adopting a stochastic approximation rule to minimize the objective function in a gradient manner, the new method can be performed in an incremental mode, which makes it work well on future data. Experimental results on the FERET face database and the UCI satellite image database demonstrate its effectiveness.
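
The core quantity is the feature line distance: a query point is projected onto the line through two same-class prototypes and the residual norm is measured. A minimal sketch of that computation follows (the subspace optimisation itself is omitted):

```python
# Minimal sketch of the feature line distance: project query q onto the line
# through two prototype points x1, x2 and return the distance to the projection.
import numpy as np

def feature_line_distance(q, x1, x2):
    d = x2 - x1
    mu = np.dot(q - x1, d) / np.dot(d, d)   # position parameter along the line
    p = x1 + mu * d                          # projection (feature) point
    return np.linalg.norm(q - p), p

q  = np.array([1.0, 2.0])
x1 = np.array([0.0, 0.0])
x2 = np.array([2.0, 0.0])
dist, proj = feature_line_distance(q, x1, x2)
print(dist, proj)   # 2.0 [1. 0.]
```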

Relevance:

40.00%

Publisher:

Abstract:

With new and emerging e-business technologies available to transform business processes, it is important to understand how those technologies will affect the performance of a business. Will the overall business process be cheaper, faster and more accurate, or will a sub-optimal change have been implemented? The use of simulation to model the behaviour of business processes is well established, and it has been applied to e-business processes to understand their performance in terms of measures such as lead time, cost and responsiveness. This paper introduces the concept of simulation components, which enable simulation models of e-business processes to be built quickly from generic e-business templates. The paper demonstrates how these components were devised, as well as the results from their application through case studies.
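
The paper's component library and templates are not reproduced here; as a minimal sketch of the idea, assume each generic e-business step is a reusable component carrying a cycle time and a unit cost, and that a process model is a composition of such components. A real model would add stochastic arrivals and queueing.

```python
# Minimal sketch of composable "simulation components" for an e-business
# process; names, times and costs are invented for illustration.
from dataclasses import dataclass

@dataclass
class ProcessStep:          # generic, reusable simulation component
    name: str
    cycle_time_h: float     # hours per order
    cost_per_order: float   # monetary units per order

def simulate(process, n_orders):
    """Aggregate lead time and cost of pushing n_orders through the process."""
    lead_time = sum(s.cycle_time_h for s in process)
    cost = n_orders * sum(s.cost_per_order for s in process)
    return lead_time, cost

online_ordering = [ProcessStep("web order entry", 0.10, 0.5),
                   ProcessStep("payment gateway", 0.05, 0.2),
                   ProcessStep("fulfilment", 24.0, 3.0)]
print(simulate(online_ordering, n_orders=1000))
```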

Relevance:

40.00%

Publisher:

Abstract:

In this paper, we present syllable-based duration modelling in the context of a prosody model for Standard Yorùbá (SY) text-to-speech (TTS) synthesis applications. Our prosody model is conceptualised around a modular holistic framework, implemented using Relational Tree (R-Tree) techniques. An important feature of our R-Tree framework is its flexibility: it facilitates the independent implementation of the different dimensions of prosody, i.e. duration, intonation, and intensity, using different techniques, and their subsequent integration. We applied the Fuzzy Decision Tree (FDT) technique to model the duration dimension. In order to evaluate the effectiveness of FDT in duration modelling, we also developed a Classification And Regression Tree (CART) based duration model using the same speech data. Each of these models was integrated into our R-Tree based prosody model. We performed both quantitative (i.e. Root Mean Square Error (RMSE) and Correlation (Corr)) and qualitative (i.e. intelligibility and naturalness) evaluations on the two duration models. The results show that CART models the training data more accurately than FDT. The FDT model, however, shows a better ability to extrapolate from the training data, since it achieved better accuracy on the test data set. Our qualitative evaluation results show that the FDT model produces synthesised speech that is perceived to be more natural than that of the CART model. In addition, we observed that the expressiveness of FDT is much better than that of CART, because the representation in FDT is not restricted to a set of piecewise or discrete constant approximations. We therefore conclude that FDT is a practical approach for duration modelling in SY TTS applications. © 2006 Elsevier Ltd. All rights reserved.
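
The quantitative measures named above are standard; as a minimal sketch with invented duration values, RMSE and Corr between predicted and observed syllable durations can be computed as follows:

```python
# Minimal sketch of the quantitative evaluation: RMSE and correlation between
# predicted and observed syllable durations (values invented for illustration).
import numpy as np

observed  = np.array([0.21, 0.18, 0.30, 0.25, 0.40])   # seconds
predicted = np.array([0.20, 0.20, 0.28, 0.27, 0.37])

rmse = np.sqrt(np.mean((predicted - observed) ** 2))
corr = np.corrcoef(predicted, observed)[0, 1]
print(f"RMSE = {rmse:.4f} s, Corr = {corr:.3f}")
```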

Relevance:

40.00%

Publisher:

Abstract:

Parameter optimization of a two-stage Raman fibre converter (RFC) based on phosphosilicate-core fibre is presented. The optimal operational regime is determined and the tolerance of the converter to variations of the laser parameters is analysed. The converter is pumped by an ytterbium-doped double-clad fibre laser with a maximum output power of 3.8 W at 1061 nm. A phosphosilicate-core RFC with enhanced performance was fabricated using the results of the numerical modelling.
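
The numerical modelling is not detailed in the abstract; a minimal sketch of the kind of model involved is the steady-state coupled power evolution of pump and Stokes waves in a single forward-pumped Raman stage (the two-stage phosphosilicate RFC cascades such conversions). All coefficient values below are illustrative assumptions, not fitted data.

```python
# Minimal sketch: steady-state pump/Stokes power evolution along one forward-
# pumped Raman stage, integrated with scipy. Gain and loss coefficients and
# the stage length are assumed values for illustration only.
import numpy as np
from scipy.integrate import solve_ivp

g_r = 1.0                # Raman gain coefficient, 1/(W km), assumed
a_p, a_s = 0.23, 0.20    # pump / Stokes fibre loss, 1/km, assumed
ratio = 1.10             # photon-energy ratio nu_p / nu_s, assumed

def rhs(z, P):
    Pp, Ps = P
    dPp = -a_p * Pp - ratio * g_r * Pp * Ps   # pump loss and depletion
    dPs = -a_s * Ps + g_r * Pp * Ps           # Stokes loss and amplification
    return [dPp, dPs]

# 3.8 W launched pump (as in the abstract), small Stokes seed, 1 km stage.
sol = solve_ivp(rhs, (0.0, 1.0), [3.8, 1e-3], dense_output=True)
print("output pump/Stokes power (W):", sol.y[:, -1])
```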

Relevance:

40.00%

Publisher:

Abstract:

The UK government aims to achieve an 80% reduction in CO2 emissions by 2050, which requires collective effort across all UK industry sectors. In particular, the housing sector has large potential to contribute, because it alone accounts for 27% of total UK CO2 emissions, and 87% of the housing responsible for this figure will still be standing in 2050. It is therefore essential to improve the energy efficiency of the existing housing stock built to low energy-efficiency standards. To do so, the whole house needs to be refurbished in a sustainable way, considering the lifetime financial and environmental impacts of the refurbished house. However, the current refurbishment process struggles to generate financially and environmentally affordable refurbishment solutions, due to the highly fragmented nature of refurbishment practice and a lack of knowledge and skills about whole-house refurbishment in the construction industry. To generate an affordable refurbishment solution, diverse information regarding the costs and environmental impacts of refurbishment measures and materials must be collected and integrated in the right sequence throughout the refurbishment project life cycle among key project stakeholders. Consequently, researchers are increasingly studying ways of utilizing Building Information Modelling (BIM) to tackle current problems in the construction industry, because BIM can support construction professionals in managing projects collaboratively by integrating diverse information, and in determining the best refurbishment solution among various alternatives by calculating the life cycle costs and lifetime CO2 performance of each solution. Despite this capability, BIM adoption in the housing sector is low, at 25%, and the use of BIM for housing refurbishment projects has rarely been studied. This research therefore aims to develop a BIM framework to formulate a financially and environmentally affordable whole-house refurbishment solution based simultaneously on the Life Cycle Costing (LCC) and Life Cycle Assessment (LCA) methods. To achieve this aim, a BIM feasibility study was first conducted as a pilot to examine whether BIM is suitable for housing refurbishment, and a BIM framework was then developed based on grounded theory, because there was no precedent research. The framework was examined through a hypothetical case study using BIM input data collected from a questionnaire survey on homeowners' preferences for housing refurbishment. Finally, the framework was validated among academics and professionals, who were given the framework and a refurbishment solution formulated through it on the basis of the LCC and LCA studies. As a result, BIM was identified as a suitable management tool for housing refurbishment, and the development of the BIM framework was found to be timely. The BIM framework, comprising seven project stages, was developed to formulate an affordable refurbishment solution. Through the case study, the Building Regulation was identified as the most affordable energy-efficiency standard, yielding the best LCC and LCA results when applied to a whole-house refurbishment solution. In addition, the Fabric Energy Efficiency Standard (FEES) is recommended when customers are willing to adopt a higher energy standard, and a maximum of 60% of CO2 emissions can be avoided through whole-house fabric refurbishment to the FEES. Furthermore, limitations and challenges to fully utilizing the BIM framework for housing refurbishment were revealed, such as a lack of BIM objects with proper cost and environmental information, limited interoperability between different BIM software, and limited LCC and LCA datasets in BIM systems. Finally, the BIM framework was validated as suitable for housing refurbishment projects, and reviewers commented that it could be more practical if a specific BIM library for housing refurbishment with proper LCC and LCA datasets were developed. This research is expected to provide a systematic way of formulating a refurbishment solution using BIM, and to become a basis for further research on BIM for the housing sector to resolve the current limitations and challenges. Future research should enhance the BIM framework by developing a more detailed process map and BIM objects with proper LCC and LCA information.
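
The thesis's cost datasets are not published in the abstract; as a minimal sketch of the LCC comparison the framework performs, each option's capital cost and discounted recurring costs are summed over the study period and reported alongside lifetime CO2. All option data and the discount rate below are invented for illustration.

```python
# Minimal sketch of a Life Cycle Costing (LCC) comparison of refurbishment
# options; capital costs, energy costs, CO2 figures and the discount rate are
# illustrative assumptions, not the thesis's data.

def life_cycle_cost(capital, annual_cost, years, rate):
    """Net present value of capital plus recurring costs over `years`."""
    return capital + sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

options = {  # name: (capital GBP, annual energy cost GBP, lifetime CO2 tonnes)
    "Building Regulation": (18_000, 650, 45),
    "FEES":                (26_000, 480, 32),
}
for name, (capex, energy, co2) in options.items():
    lcc = life_cycle_cost(capex, energy, years=30, rate=0.035)
    print(f"{name}: LCC = £{lcc:,.0f}, lifetime CO2 = {co2} t")
```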

Relevance:

40.00%

Publisher:

Abstract:

Reliability modelling and verification are indispensable in modern manufacturing, especially for product development risk reduction. Following a discussion of the deficiencies of traditional reliability modelling methods for process reliability, a novel modelling method is presented that draws upon a knowledge network of process scenarios based on the analytic network process (ANP). An integration framework for manufacturing process reliability and product quality is presented, together with a product development and reliability verification process. According to their roles in manufacturing processes, key characteristics (KCs) are organised into four clusters, namely product KCs, material KCs, operation KCs and equipment KCs, which represent the process knowledge network of manufacturing processes. A mathematical model and algorithm are developed for calculating the reliability requirements of KCs with respect to different manufacturing process scenarios. A case study on valve-sleeve component manufacturing is provided as an application example of the new reliability modelling and verification procedure; the methodology is applied in the valve-sleeve manufacturing processes to manage and deploy production resources.
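
The ANP calculation underlying the method can be sketched minimally as follows: interdependencies among the four KC clusters form a column-stochastic weighted supermatrix, whose powers converge to the limit priorities. The matrix entries here are illustrative, not taken from the valve-sleeve case study.

```python
# Minimal sketch of the ANP limit-priority computation over the four KC
# clusters (product, material, operation, equipment); entries are invented.
import numpy as np

W = np.array([[0.00, 0.40, 0.30, 0.30],
              [0.50, 0.00, 0.30, 0.20],
              [0.30, 0.30, 0.00, 0.50],
              [0.20, 0.30, 0.40, 0.00]])   # each column sums to 1

limit = np.linalg.matrix_power(W, 100)     # powers converge for a primitive matrix
print("limit priorities per KC cluster:", limit[:, 0])
```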

Relevance:

40.00%

Publisher:

Abstract:

A hybrid Molecular Dynamics/Fluctuating Hydrodynamics framework, based on the analogy with two-phase hydrodynamics, has been extended to dynamically track a feature of interest at all-atom resolution. In the model, the hydrodynamic description is used as an effective boundary condition to close the molecular dynamics solution without resorting to standard periodic boundary conditions. The approach is implemented in the popular Molecular Dynamics package GROMACS, and results for two biomolecular systems are reported. A small peptide, dialanine, and the complete capsid of the porcine circovirus 2 virus in water are considered, and are shown to reproduce the structural and dynamic properties obtained from theory, purely atomistic simulations, and experiment.

Relevance:

40.00%

Publisher:

Abstract:

Nowadays, due to regulation and internal motivations, financial institutions attend more intensively to their risks. Besides the previously dominant market and credit risks, the new trend is to handle operational risk systematically. Operational risk is the risk of loss resulting from inadequate or failed internal processes, people and systems, or from external events. First we present the basic features of operational risk and its modelling and regulatory approaches; then we analyse operational risk within a simulation model framework of our own. Our approach is based on the analysis of the latent risk process instead of the manifest risk process that is widely popular in the risk literature. In our model the latent risk process is a stochastic process, the so-called Ornstein-Uhlenbeck process, which is a mean-reverting process. Within this framework we define a catastrophe as a breach of a critical barrier by the process. We analyse the distributions of catastrophe frequency, severity and first hitting time, not only for a single process but for a dual process as well. Based on our first results, we could not falsify the Poisson character of the frequency or the long-tailed character of the severity; the distribution of the first hitting time requires more sophisticated analysis. At the end of the paper we examine the advantages of simulation-based forecasting, and finally we conclude with possible directions for future research.
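
As a minimal sketch of the framework described, the latent Ornstein-Uhlenbeck process can be simulated with an Euler-Maruyama scheme, counting a catastrophe whenever a path first breaches the critical barrier. All parameter values are illustrative assumptions.

```python
# Minimal sketch: Euler-Maruyama simulation of a mean-reverting
# Ornstein-Uhlenbeck latent risk process; a "catastrophe" is a breach of the
# critical barrier. Parameters are illustrative, not calibrated.
import numpy as np

rng = np.random.default_rng(42)
theta, mu, sigma = 2.0, 0.0, 1.0        # reversion speed, long-run mean, volatility
barrier, dt = 1.5, 1 / 250              # critical barrier; daily step in years
n_steps, n_paths = 250, 10_000          # one year, many paths

x = np.full(n_paths, mu)
first_hit = np.full(n_paths, np.nan)
for t in range(n_steps):
    x += theta * (mu - x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    newly_hit = np.isnan(first_hit) & (x > barrier)
    first_hit[newly_hit] = (t + 1) * dt  # first hitting time, in years

hits = ~np.isnan(first_hit)
print(f"catastrophe frequency: {hits.mean():.4f}")
if hits.any():
    print(f"mean first hitting time: {np.nanmean(first_hit):.3f} years")
```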

Relevance:

40.00%

Publisher:

Abstract:

In 2004 and 2005 we collected samples of phytoplankton, zooplankton and macroinvertebrates in a small artificial pond in Budapest. We set up a simulation model predicting the abundance of the cyclopoids, Eudiaptomus zachariasi and Ischnura pumilio, considering only temperature as it affects the previous day's population abundance. Phytoplankton abundance was simulated considering not only temperature but also the abundance of the three groups mentioned. This discrete deterministic model generated patterns similar to those observed, and testing it on historical data was successful. However, because the model overpredicted the abundances of Ischnura pumilio and Cyclopoida at the end of the year, those results were not considered. Running the model with the data series of climate change scenarios gave us an opportunity to predict individual numbers for the period around 2050. If the model is run with the data series of the two scenarios UKHI and UKLO, which predict drastic global warming, we observe a decrease in abundance and a shift in the date at which maximum abundance occurs (except for Ischnura pumilio, whose maximum abundance increases and occurs later), whereas under unchanged climatic conditions (BASE scenario) the change in abundance is negligible. According to the scenarios GFDL 2535, GFDL 5564 and UKTR, a transition could be observed.
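
A minimal sketch of the discrete deterministic scheme described, in which tomorrow's abundance equals today's multiplied by a temperature-dependent growth factor; the response function and all parameters below are invented stand-ins for the fitted ones.

```python
# Minimal sketch: N[t+1] = N[t] * f(T[t]), with an invented unimodal
# temperature response f around an assumed optimum.
import numpy as np

def growth_factor(temp_c, t_opt=20.0, width=8.0, r_max=0.15):
    """Illustrative daily growth multiplier peaking near the optimum."""
    return 0.95 + r_max * np.exp(-((temp_c - t_opt) / width) ** 2)

temps = 12 + 10 * np.sin(np.linspace(0, np.pi, 180))   # stylised season, daily C
n = [5.0]                                              # initial abundance (ind/L)
for temp in temps:
    n.append(n[-1] * growth_factor(temp))
print(f"peak abundance {max(n):.1f} on day {int(np.argmax(n))}")
```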

Relevance:

40.00%

Publisher:

Abstract:

Climate change is one of the most crucial and influential ecological problems of our age. The seasonal dynamics of aquatic communities are regulated, among other factors, by the climate, especially by temperature. In this case study we attempted to model the seasonal dynamics of a copepod species of the zooplankton community, Cyclops vicinus, based on a quantitative database containing ten years of data from the Göd area of the Danube. We set up a simulation model predicting the abundance of Cyclops vicinus by considering only temperature as it affects population abundance. The model was fitted to eight years of daily temperature data observed between 1981 and 1994 and was tested successfully with the data of two further years. The model was then run with the data series of climate change scenarios specified for the period around 2070-2100. We also looked for areas geographically analogous to the Göd region, whose present climate most resembles the future climate of the Göd area. By these means we can get a view of how the climate of the region will change by the end of the 21st century, and of the way the seasonal dynamics of a chosen planktonic crustacean species may follow this change. According to our results, the climate of the Göd area will become similar to that of the northern region of Greece. The maximum abundance of the examined species occurs one to one and a half months earlier, and larger between-year variances in abundance are expected; the deviations are expected in the direction of either smaller or significantly larger abundances than observed earlier.

Relevance:

40.00%

Publisher:

Abstract:

Acknowledgements and funding: We would like to thank the GPs who took part in this study. We would also like to thank Marie Pitkethly and Gail Morrison for their help and support in recruiting GPs to the study. WIME was funded by the Chief Scientist Office, grant number CZH/4/610. The Health Services Research Unit, University of Aberdeen, is core funded by the Chief Scientist Office of the Scottish Government Health Directorates.

Relevance:

40.00%

Publisher:

Abstract:

The application of pharmacokinetic modelling within the drug development field essentially allows one to develop a quantitative description of the temporal behaviour of a compound of interest at the tissue/organ level, by identifying and defining relationships between the dose of a drug and the dependent variables. In order to understand and characterise the pharmacokinetics of a drug, it is often helpful to employ pharmacokinetic modelling using empirical or mechanistic approaches. Pharmacokinetic models can be developed in mathematical and statistical commercial software such as MATLAB using traditional mathematical and computational coding, or by using the SimBiology Toolbox available within MATLAB for a graphical-user-interface approach to developing physiologically based pharmacokinetic (PBPK) models. For formulations dosed orally, a prerequisite for clinical activity is the entry of the drug into the systemic circulation.
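
The abstract points to MATLAB/SimBiology; as a minimal sketch in Python instead, a one-compartment model with first-order oral absorption captures the entry of an orally dosed drug into the systemic circulation. All parameter values are invented for illustration.

```python
# Minimal sketch: one-compartment pharmacokinetic model with first-order oral
# absorption, solved with scipy; ka, ke, V and the dose are assumed values.
import numpy as np
from scipy.integrate import solve_ivp

ka, ke, V = 1.2, 0.2, 30.0   # absorption 1/h, elimination 1/h, volume L
dose_mg = 500.0

def pk(t, y):
    a_gut, conc = y
    return [-ka * a_gut,                  # drug leaving the gut
            ka * a_gut / V - ke * conc]   # entering and leaving plasma

sol = solve_ivp(pk, (0, 24), [dose_mg, 0.0], dense_output=True)
t = np.linspace(0, 24, 7)
print(np.round(sol.sol(t)[1], 2))   # plasma concentration (mg/L) over 24 h
```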

Relevance:

40.00%

Publisher:

Abstract:

Knowledge and its management have been accepted as a critical resource and a core business competency, respectively. Despite this, the literature shows a gap between the theoretical considerations of Knowledge Management (KM) and their efficient application. This shortcoming, we argue, derives from the missing link between a framework for Knowledge Management and the particular methods and guidelines for its implementation. In an attempt to bridge this gap, an original, process-based, holistic Knowledge Management framework is proposed, aiming to address the problem of knowledge management application and performance by utilising a set of well-accepted Enterprise Modelling (EM) methods and tools.