909 results for 010406 Stochastic Analysis and Modelling
Abstract:
Uncertainties that are not captured by the analytical model of a plant can dramatically degrade the performance of fault detection in practice. To cope with this prevalent problem, this paper develops a methodology based on Modal Interval Analysis that takes those uncertainties in the plant model into account. A fault detection method built on this model is robust to uncertainty and produces no false alarms. As soon as a fault is detected, an ANFIS model is trained online to capture the dominant behaviour of the fault, which can then be used for fault accommodation. The simulation results demonstrate that the proposed method accomplishes both tasks appropriately.
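The false-alarm-free idea can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's Modal Interval Analysis: it assumes a hypothetical first-order plant with interval-valued parameters and flags a fault only when a measurement leaves the predicted envelope.

```python
def interval_envelope(x, a_lo, a_hi, b_lo, b_hi, u):
    """Worst-case output interval of a hypothetical first-order plant
    x[k+1] = a * x[k] + b * u[k] whose parameters are only known to lie
    in the intervals a in [a_lo, a_hi] and b in [b_lo, b_hi]."""
    lo = min(a_lo * x, a_hi * x) + min(b_lo * u, b_hi * u)
    hi = max(a_lo * x, a_hi * x) + max(b_lo * u, b_hi * u)
    return lo, hi

def detect_fault(measured, lo, hi):
    # A fault is flagged only when the measurement leaves the envelope,
    # so parameter uncertainty alone cannot raise a false alarm.
    return measured < lo or measured > hi
```

Because every parameter value consistent with the model produces an output inside the envelope, a residual check of this kind trades some detection sensitivity for a zero false-alarm guarantee.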
Abstract:
Breast cancer is the most common cancer in women and accounts for nearly 30% of all new cancer cases in Europe. The number of deaths from breast cancer in Europe is estimated at over 130,000 each year, which underlines the social impact of the disease. The goals of this thesis were, first, to identify the biological features and mechanisms responsible for the establishment of specific breast cancer subtypes; second, to validate them in a human-in-mouse in vivo model; and third, to develop specific treatments for the identified breast cancer subtypes. The first objective was achieved through the analysis of tumour gene expression data produced in our lab. The microarray data were generated from 49 breast tumour biopsies collected from patients enrolled in the clinical trial EORTC 10994/BIG00-01. The data set was rich in information and allowed me to validate data from previous breast cancer gene expression studies and to identify the biological features of a novel breast cancer subtype. In the first part of the thesis I focus on the identification of molecular apocrine breast tumours by microarray analysis and the potential implications of this finding for the clinic. The second objective was attained by producing a human breast cancer model system based on primary human mammary epithelial cells (HMECs) derived from reduction mammoplasties. I chose to adapt a previously described suspension culture system based on mammospheres and expressed selected target genes using lentiviral expression constructs. In the second part of my thesis I focus on the establishment of a cell culture system allowing the quantitative transformation of HMECs. I then established a xenograft model in immunodeficient NOD/SCID mice, which allows human disease to be modelled in the mouse. In the third part of my thesis I describe and discuss the results I obtained while establishing an oestrogen-dependent model of breast cancer by quantitative transformation of HMECs with defined genes identified through breast cancer gene expression data analysis. The transformed cells in our model are oestrogen-dependent for growth and remain diploid and genetically normal even after prolonged cell culture in vitro. In our xenograft model the cells form tumours as well as disseminated peritoneal and liver metastases. In line with the third objective of my thesis, I defined and tested treatment schemes that reduce tumours and metastases. I have generated a genetically defined model of oestrogen receptor alpha positive human breast cancer that makes it possible to model human oestrogen-dependent breast cancer in the mouse and enables the study of the mechanisms involved in tumorigenesis and metastasis.
Abstract:
The paper presents the Multiple Kernel Learning (MKL) approach as a modelling and data exploratory tool and applies it to the problem of wind speed mapping. Support Vector Regression (SVR) is used to predict spatial variations of the mean wind speed from terrain features (slopes, terrain curvature, directional derivatives) generated at different spatial scales. Multiple Kernel Learning is applied to learn kernels for individual features and thematic feature subsets, both in the context of feature selection and optimal parameters determination. An empirical study on real-life data confirms the usefulness of MKL as a tool that enhances the interpretability of data-driven models.
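The weighted sum of per-feature-group kernels that MKL optimises can be sketched as follows. This is an assumption-laden sketch, not the paper's method: kernel ridge regression stands in for SVR, the kernel weights are fixed rather than learned, and the feature groups and parameters are hypothetical.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Gram matrix of a Gaussian RBF kernel.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def combined_kernel(X1, X2, feature_groups, weights, gamma=1.0):
    # One kernel per thematic feature subset (e.g. slopes, curvature,
    # directional derivatives), combined as the weighted sum that MKL
    # optimises; here the weights are fixed rather than learned.
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for cols, w in zip(feature_groups, weights):
        K += w * rbf_kernel(X1[:, cols], X2[:, cols], gamma)
    return K

def kernel_ridge_fit(K, y, lam=1e-3):
    # Kernel ridge regression as a stand-in for SVR: alpha = (K + lam*I)^-1 y;
    # predictions for new points are combined_kernel(X_new, X_train, ...) @ alpha.
    return np.linalg.solve(K + lam * np.eye(len(y)), y)
```

Inspecting the learned weights is what makes MKL useful as an exploratory tool: a small weight on a feature group signals that the group contributes little to the prediction.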
Abstract:
An integrated system of design for manufacturing and assembly (DFMA) and internet-based collaborative design is presented to support product design, manufacturing process planning, and assembly planning for an axial eccentric oil pump. The system manages and schedules group-oriented collaborative activities, and design guidelines for internet-based collaborative design and DFMA are formulated. The components and manufacturing stages of the axial eccentric oil pump are described in detail. The file formats of the system cover the data types for collaborative product design, assembly design, assembly planning, and assembly system design. Under internet-based collaborative design and DFMA, product design and assembly planning can proceed synchronously and intelligently in an integrated fashion. Technologies for collaborative modelling, collaborative manufacturing, and internet-based collaborative assembly are developed for the specific pump construction. A seven-level security scheme is presented to ensure the security of the internet-based collaborative design system.
Abstract:
The results presented in this thesis are based on selected publications from the 2000s. The work was carried out in several national and EC-funded public research projects and in close cooperation with industrial partners. The main objective of the thesis was to study and quantify the most important phenomena of circulating fluidized bed (CFB) combustors by developing and applying appropriate experimental and modelling methods with laboratory-scale equipment. An understanding of these phenomena plays an essential role in the development of the combustion and emission performance, availability and controls of CFB boilers. Experimental procedures for studying fuel combustion behaviour under CFB conditions are presented in the thesis. Steady-state and dynamic measurements under well-controlled conditions were carried out to produce the data needed for the development of high-efficiency, utility-scale CFB technology. The importance of combustion control and furnace dynamics is emphasized when CFB boilers are scaled up with a once-through steam cycle. Qualitative information on fuel combustion characteristics was obtained directly by comparing flue gas oxygen responses during impulse-change experiments with the fuel feed. A one-dimensional, time-dependent model was developed to analyse the measurement data. Emission formation was studied in combination with fuel combustion behaviour, and correlations were developed for NO, N2O, CO and char loading as functions of temperature and oxygen concentration in the bed area. An online method to characterize char loading under CFB conditions was developed and validated with pilot-scale CFB tests. Finally, a new method to control the air and fuel feeds in CFB combustion was introduced, based on models and on an analysis of the fluctuation of the flue gas oxygen concentration. The effect of high oxygen concentrations on fuel combustion behaviour was also studied to evaluate the potential of CFB boilers for applying oxygen-firing technology to CCS. In future studies, it will be necessary to go through the whole scale-up chain, from laboratory phenomena devices through pilot-scale test rigs to large-scale commercial boilers, in order to validate the applicability and scalability of the results. This thesis links the laboratory-scale phenomena test rig (bench scale) and the CFB process test rig (pilot scale). CFB technology has been scaled up successfully from industrial scale to utility scale during the last decade, and the work presented in this thesis has, for its part, supported that development by producing new, detailed information on combustion under CFB conditions.
Abstract:
This doctoral thesis presents a study on the development of a liquid-cooled, frame-mounted, salient-pole, permanent-magnet-excited traction machine for a four-wheel-driven electric car. The emphasis of the thesis is on a radial-flux machine design, with the aim of achieving a lightweight machine structure for traction applications. The design combines electromagnetic and thermal design methods, because a traction machine does not operate at a single fixed operating point: arbitrary load cycles and the flexible supply require special attention in the design process. It is shown that accurate modelling of the magnetic state of the machine is essential for high-performance operation, and that the saturation effect related to cross-saturation has to be taken carefully into account to achieve the desired operation. Two prototype machines were designed and built for testing: a totally enclosed machine with a special magnet-module pole arrangement, and a through-ventilated machine with a more traditional embedded-magnet structure. Both structures are magnetically salient in order to increase the torque production capability through the reluctance torque component. Both machine structures show potential for traction use; however, the traditional embedded-magnet design turns out to be the mechanically more secure of the two options.
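The reluctance torque component mentioned above follows from the standard dq-axis torque equation of a salient permanent-magnet machine. The sketch below uses illustrative parameter values, not the prototype machines' data.

```python
def pmsm_torque(p, psi_pm, L_d, L_q, i_d, i_q):
    """Electromagnetic torque of a salient PM machine in dq coordinates:

        T = 1.5 * p * (psi_pm * i_q + (L_d - L_q) * i_d * i_q)

    The first term is the permanent-magnet torque; the second is the
    reluctance torque made available by magnetic saliency (L_d != L_q)."""
    return 1.5 * p * (psi_pm * i_q + (L_d - L_q) * i_d * i_q)
```

With embedded magnets L_q > L_d, so driving a negative d-axis current adds a positive reluctance torque on top of the magnet torque, which is why the salient structures increase the torque production capability.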
Abstract:
This master's thesis studies the case company's current purchase invoice process and the challenges related to it. Like most master's theses, this study consists of both a theoretical and an empirical part, and its purpose is to combine the two so that the theoretical part brings value to the empirical case study. The case company's main business is frequency converters, covering both low-voltage AC and DC drives and medium-voltage AC drives, which are used across all industries and applications. The main focus of this study is on modelling the current invoice process: when the existing process is modelled with discipline and care, the current challenges can be understood better. The empirical study relies heavily on interviews and on existing, yet fragmented, data. This, along with my own calculations and analysis, forms the foundation for the empirical part of this master's thesis.
Abstract:
To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal-cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal-cutting conditions; however, attaining optimum values every time is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective optimization methods. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes.

In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes are identified. The selection of optimal cutting parameters, such as depth of cut, feed and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirloskar Turnmaster 35 lathe. S/N and ANOVA analyses were performed to find the optimum level and the percentage contribution of each parameter, and the S/N analysis yielded the optimum machining parameters from the experimentation. Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively evaluate new design solutions over the search space in order to reach the true optimum. A mathematical model for surface roughness was developed using response surface analysis, and the model was validated against published results from the literature.

Several optimization methodologies, namely simulated annealing (SA), particle swarm optimization (PSO), a conventional genetic algorithm (CGA) and an improved genetic algorithm (IGA), were applied to optimize the machining parameters for dry turning of SS420. All of these algorithms were tested for efficiency, robustness and accuracy, and it was observed how often they outperform conventional optimization methods on difficult real-world problems. The SA, PSO, CGA and IGA codes were developed in MATLAB, and for each evolutionary method optimum cutting conditions are provided to achieve a better surface finish. The computational results using SA clearly demonstrate that the proposed solution procedure is capable of solving such complicated problems effectively and efficiently. PSO is a relatively recent heuristic search method whose mechanics are inspired by the swarming, collaborative behaviour of biological populations; the results show that PSO provides better results and is also more computationally efficient. Of the genetic approaches, the proposed IGA, which incorporates a stochastic crossover technique and an artificial initial-population scheme to provide a faster search mechanism, gives better results than the conventional GA. Finally, the algorithms were compared on the specific example of dry turning of SS420, arriving at the optimum feed, cutting speed, depth of cut and tool nose radius with minimum surface roughness as the criterion. To summarize, the research work fills conspicuous gaps between research prototypes and industry requirements by simulating the evolutionary procedures by which nature optimizes its own systems.
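The SA approach described above can be sketched as follows. This is an illustrative re-implementation, not the thesis's MATLAB code: the response-surface coefficients and parameter bounds are hypothetical stand-ins for a fitted roughness model.

```python
import math
import random

def roughness(v, f, d):
    # Hypothetical second-order response surface for surface roughness Ra
    # (micrometres) in terms of cutting speed v (m/min), feed f (mm/rev)
    # and depth of cut d (mm); coefficients are illustrative only.
    return 2.0 - 0.01 * v + 8.0 * f + 0.5 * d + 0.00002 * v ** 2 + 4.0 * f ** 2

BOUNDS = {"v": (50.0, 250.0), "f": (0.05, 0.4), "d": (0.5, 2.0)}

def anneal(steps=5000, t0=1.0, cooling=0.999, seed=1):
    rng = random.Random(seed)
    x = {k: rng.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}
    cur = roughness(**x)
    best_x, best = dict(x), cur
    t = t0
    for _ in range(steps):
        cand = dict(x)
        k = rng.choice(list(BOUNDS))
        lo, hi = BOUNDS[k]
        # Perturb one parameter, clamped to its bounds.
        cand[k] = min(hi, max(lo, cand[k] + rng.gauss(0.0, 0.05 * (hi - lo))))
        val = roughness(**cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if val < cur or rng.random() < math.exp((cur - val) / t):
            x, cur = cand, val
            if cur < best:
                best_x, best = dict(x), cur
        t *= cooling
    return best_x, best
```

The cooling schedule makes early iterations exploratory (uphill moves often accepted) and late iterations greedy, which is how SA escapes the local optima that trap conventional gradient-style methods on non-linear machining models.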
Abstract:
The objective of this thesis is to study the time-dependent behaviour of some complex queueing and inventory models, and it contains a detailed analysis of the basic stochastic processes underlying these models. In the theory of queues, the analysis of time-dependent behaviour is an area very little developed compared to steady-state theory. Time dependence certainly seems worth studying from an applications point of view, but unfortunately the analytic difficulties are considerable: closed-form solutions are complicated even for such simple models as M/M/1, and outside M/M/1 time-dependent solutions have been found only in special cases, most often involving double transforms that provide very little insight into the behaviour of the queueing systems themselves. In inventory theory, too, few results are available that give the time-dependent solution for the system size probabilities. Our emphasis is on explicit results free of all types of transforms, and the method used may be of special interest for a wide variety of problems having a regenerative structure.
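The difficulty described here can be made concrete numerically: even without a closed form, transient M/M/1 state probabilities can be computed by uniformization of the birth-death chain on a truncated state space. A sketch with illustrative parameter values (not taken from the thesis):

```python
import math
import numpy as np

def mm1_transient(lam, mu, t, n_max=60, terms=200):
    """State probabilities P(X(t) = n) for an M/M/1 queue started empty,
    computed by uniformization on a state space truncated at n_max."""
    # Generator of the truncated birth-death chain.
    Q = np.zeros((n_max + 1, n_max + 1))
    for n in range(n_max):
        Q[n, n + 1] = lam      # arrival
        Q[n + 1, n] = mu       # service completion
    np.fill_diagonal(Q, -Q.sum(axis=1))
    # Uniformized (stochastic) transition matrix P = I + Q / Lambda.
    Lam = lam + mu
    P = np.eye(n_max + 1) + Q / Lam
    pk = np.zeros(n_max + 1)
    pk[0] = 1.0                # start empty
    out = np.zeros(n_max + 1)
    w = math.exp(-Lam * t)     # Poisson(Lam*t) weight, updated recursively
    for k in range(terms):
        out += w * pk
        pk = pk @ P
        w *= Lam * t / (k + 1)
    return out
```

The truncation and the Poisson series length must both be generous relative to lam, mu and t; this numerical route gives the full transient distribution but, unlike the explicit results sought in the thesis, yields no structural insight.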
Abstract:
The objective of this paper is to introduce a different approach, called the ecological-longitudinal, to carrying out pooled analysis in time-series ecological studies. Because it gives a larger number of data points and hence increases the statistical power of the analysis, this approach, unlike conventional ones, accommodates features such as random-effects models, lags, and interactions between pollutants and between pollutants and meteorological variables, which are hard to implement in conventional approaches. Design: the approach is illustrated by providing quantitative estimates of the short-term effects of air pollution on mortality in three Spanish cities, Barcelona, Valencia and Vigo, for the period 1992-1994. Because the dependent variable was a count, a Poisson generalised linear model was first specified. Several modelling issues are worth mentioning. Firstly, because the relations between mortality and the explanatory variables were nonlinear, cubic splines were used for covariate control, leading to a generalised additive model (GAM). Secondly, the effects of the predictors on the response were allowed to occur with some lag. Thirdly, the residual autocorrelation, due to imperfect control, was accounted for by means of an autoregressive Poisson GAM. Finally, the longitudinal design demanded consideration of individual heterogeneity, requiring the use of mixed models. Main results: the estimates of the relative risks obtained from the individual analyses varied across cities, particularly those associated with sulphur dioxide, and the highest relative risks corresponded to black smoke in Valencia. These estimates were higher than those obtained from the ecological-longitudinal analysis. Relative risks estimated from the latter analysis were practically identical across cities: 1.00638 (95% confidence interval 1.0002, 1.0011) for a black smoke increase of 10 μg/m3 and 1.00415 (95% CI 1.0001, 1.0007) for an increase of 10 μg/m3 of sulphur dioxide. Because the statistical power is higher than in the individual analyses, more interactions were statistically significant, especially those between air pollutants and meteorological variables. Conclusions: air pollutant levels were related to mortality in the three cities of the study, Barcelona, Valencia and Vigo. These results are consistent with similar studies in other cities and with other multicentric studies, and coherent with both the previous individual analyses for each city and the multicentric studies for all three cities.
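The Poisson GLM at the core of this design can be sketched in a few lines. The sketch below fits a log-linear Poisson model by iteratively reweighted least squares and builds a lagged design matrix; it uses synthetic data and deliberately omits the splines, autoregression and random effects described above.

```python
import numpy as np

def poisson_irls(X, y, iters=50):
    """Fit a Poisson GLM with log link by iteratively reweighted least
    squares; X is the design matrix (first column the intercept)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu          # working response
        WX = X * mu[:, None]             # Poisson weights W = mu
        beta = np.linalg.solve(X.T @ WX, X.T @ (mu * z))
    return beta

def lagged_design(pollutant, max_lag):
    # Intercept plus the pollutant series at lags 0..max_lag; the first
    # max_lag rows, where lags are undefined, are dropped.
    n = len(pollutant)
    cols = [np.ones(n - max_lag)]
    for l in range(max_lag + 1):
        cols.append(pollutant[max_lag - l : n - l])
    return np.column_stack(cols)
```

With the log link, a fitted coefficient b for a pollutant translates into a relative risk of exp(b * 10) for a 10 μg/m3 increase, which is how estimates like those quoted above are read off the model.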
Abstract:
Recent interest in the validation of general circulation models (GCMs) has been devoted to objective methods. A small number of authors have used the direct synoptic identification of phenomena, together with a statistical analysis, to perform objective comparisons between various datasets. This paper describes a general method for the synoptic identification of phenomena that can be used for an objective analysis of atmospheric or oceanographic datasets obtained from numerical models and remote sensing. Methods usually associated with image processing are used to segment the scene and to identify suitable feature points to represent the phenomena of interest; this is performed for each time level. A technique from dynamic scene analysis is then used to link the feature points to form trajectories. The method is fully automatic and should be applicable to a wide range of geophysical fields. An example is shown of results obtained by applying the method to data from a run of the Universities Global Atmospheric Modelling Project GCM.
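The trajectory-forming step can be illustrated with a much-simplified greedy nearest-neighbour linker; the actual dynamic scene analysis technique used in the paper is more sophisticated, and the frames and distance threshold here are hypothetical.

```python
import math

def link_trajectories(frames, max_dist=2.0):
    """Link feature points across time levels into trajectories by
    nearest-neighbour matching; `frames` is a list (one entry per time
    level) of lists of (x, y) feature points."""
    trajectories = [[p] for p in frames[0]]
    for points in frames[1:]:
        unused = list(points)
        for traj in trajectories:
            if not unused:
                continue
            last = traj[-1]
            best = min(unused, key=lambda q: math.dist(last, q))
            # Extend the trajectory only if the nearest point is close
            # enough; otherwise the trajectory simply stops growing.
            if math.dist(last, best) <= max_dist:
                traj.append(best)
                unused.remove(best)
        # Feature points not matched to any trajectory start new ones,
        # representing newly appearing phenomena.
        trajectories.extend([[p] for p in unused])
    return trajectories
```

A threshold tied to the expected displacement per time level (e.g. an advection speed estimate) keeps distinct phenomena from being spliced into one trajectory.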
Abstract:
The interpretation of soil water dynamics under drip irrigation systems is relevant for crop production as well as for water use and management. In this study a three-dimensional representation of the flow of water under drip irrigation is presented. The work includes analysis of the water balance at the point scale as well as the area average, exploring the uncertainty in water balance estimates as a function of the number of locations sampled. The water flow was monitored by detailed profile water content measurements taken before irrigation, after irrigation and 24 h later with a dense array of soil moisture access tubes radially distributed around selected drippers. The objective was to develop a methodology that could be used on selected occasions to obtain 'snapshots' of the detailed three-dimensional patterns of soil moisture. Such patterns are likely to be very complex, as spatial variability is induced for a number of reasons, such as strong horizontal gradients in soil moisture, variations between individual sources in the amount of water applied, and spatial variability in soil hydraulic properties. The results are compared with a widely used numerical model, Hydrus-2D. The observed dynamics of the water content distribution are in good agreement with the model simulations, although some discrepancies in the horizontal distribution of the irrigation bulb are noted due to soil heterogeneity.
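The dependence of the area-average water balance uncertainty on the number of sampled locations can be explored with a simple subsampling experiment. This is an illustrative sketch: the storage-change values used below are synthetic, not the study's measurements.

```python
import numpy as np

def area_average_uncertainty(storage, n_locations, n_draws=2000, seed=0):
    """Standard deviation of the area-average storage-change estimate
    when only `n_locations` access tubes are sampled, estimated by
    repeatedly subsampling the full set of point-scale values (mm)."""
    rng = np.random.default_rng(seed)
    means = np.array([
        rng.choice(storage, size=n_locations, replace=False).mean()
        for _ in range(n_draws)
    ])
    return means.std()
```

As expected from sampling theory, the uncertainty of the area average shrinks roughly as one over the square root of the number of access tubes, which is the trade-off the study quantifies when choosing how many locations to monitor.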
Abstract:
Crumpets are made by heating fermented batter on a hot plate at around 230°C. The characteristic structure, dominated by vertical pores, develops rapidly: structure has developed throughout around 75% of the product height within 30 s, which is far faster than might be expected from transient heat conduction through the batter, and cooking is complete within around 3 min. Image analysis based on X-ray tomography shows that the voidage fraction is approximately constant and that there is continual coalescence between the larger pores throughout the product, although there is also a steady population of small bubbles trapped within the solidified batter. We report here experimental studies which shed light on some of the mechanisms responsible for this structure, together with models of the key phenomena. Three aspects are discussed: the role of gas (carbon dioxide and nitrogen) nuclei in initiating structure development; convective heat transfer inside the developing pores; and the kinetics of setting the batter into an elastic solid structure. It is shown conclusively that the small bubbles of carbon dioxide resulting from the fermentation stage play a crucial role as nuclei for pore development: without these nuclei the result is not a porous structure but a solid, elastic, inedible, gelatinized product. These nuclei are also responsible for the tiny bubbles that are set in the final product, and they form the source of the dominant pore structure, which is largely driven by the initially explosive release of water vapour from the batter together with the desorption of dissolved carbon dioxide. It is argued that the rapid evaporation, transport and condensation of steam within the growing pores provides an important mechanism for rapid heat transfer, as in a heat pipe, and models for this process are developed and tested. The setting of the continuous batter phase is essential for final product quality: studies using differential scanning calorimetry and measurements of the kinetics of change in the visco-elastic properties of the batter suggest that this process is driven by the kinetics of gelatinization. Unlike in many thermally driven food processes, the rates of heating are such that gelatinization kinetics cannot be neglected. The implications of these results for modelling and for the development of novel structures are discussed.
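The claim that the structure develops far faster than transient conduction can be checked with the standard order-of-magnitude estimate t ≈ L²/α. The depth and thermal diffusivity below are assumed, illustrative values (a ~10 mm crumpet and a water-like batter), not measurements from the study.

```python
def conduction_time(depth_m, alpha_m2_s):
    # Characteristic transient-conduction (Fourier) time t ~ L^2 / alpha.
    return depth_m ** 2 / alpha_m2_s

# Assumed values: structure observed through 75% of a ~10 mm product
# in 30 s; batter diffusivity taken as ~1.4e-7 m^2/s (water-like).
t_cond = conduction_time(0.75 * 0.010, 1.4e-7)  # ~4e2 s
```

Conduction alone would need on the order of hundreds of seconds to heat that depth, an order of magnitude slower than the observed 30 s, which is what motivates the heat-pipe-like steam transport mechanism proposed above.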
Abstract:
This paper discusses how the use of computer-based modelling tools has aided the design of a telemetry unit for use in oil well logging. With the aid of modern computer-based simulation techniques, the new design is capable of operating at data rates 2.5 times higher than previous designs.