877 results for MODELING SYSTEM


Relevance: 30.00%

Abstract:

A distributed, agent-based intelligent system models and simulates a smart grid using physical players and computationally simulated agents. The proposed system can assess the impact of demand response programs.
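
As an illustration of the kind of agent-based simulation the abstract describes, the following sketch (not the authors' system; the class names, load figures and curtailment rule are all hypothetical) models a population of consumer agents that shed a flexible share of their load when a demand-response event is signalled, and compares the aggregate peak with and without the programme.

```python
import random

class ConsumerAgent:
    """Hypothetical consumer agent with a flexible share of its load."""
    def __init__(self, base_load_kw, flexible_share):
        self.base_load_kw = base_load_kw
        self.flexible_share = flexible_share  # fraction that can be curtailed

    def demand(self, dr_signal_active):
        # Curtail the flexible portion when a demand-response event is signalled.
        if dr_signal_active:
            return self.base_load_kw * (1.0 - self.flexible_share)
        return self.base_load_kw

def simulate(agents, dr_hours, horizon=24):
    """Return total hourly demand with and without the DR programme."""
    baseline, with_dr = [], []
    for hour in range(horizon):
        active = hour in dr_hours
        baseline.append(sum(a.demand(False) for a in agents))
        with_dr.append(sum(a.demand(active) for a in agents))
    return baseline, with_dr

if __name__ == "__main__":
    random.seed(1)
    agents = [ConsumerAgent(random.uniform(2, 10), random.uniform(0.1, 0.4))
              for _ in range(100)]
    base, dr = simulate(agents, dr_hours={18, 19, 20})
    print(f"Peak without DR: {max(base):.1f} kW, with DR: {max(dr):.1f} kW")
```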

Relevance: 30.00%

Abstract:

In this work an adaptive modeling and spectral estimation scheme based on dual Discrete Kalman Filtering (DKF) is proposed for speech enhancement. Both speech and noise signals are modeled by an autoregressive structure, which provides an underlying time-frame dependency and improves time-frequency resolution. The model parameters are arranged to obtain a combined state-space model and are also used to calculate instantaneous power spectral density estimates. The speech enhancement is performed by a dual discrete Kalman filter that simultaneously gives estimates for the models and the signals. This approach is particularly useful as a pre-processing module for parametric speech recognition systems that rely on time-dependent spectral models. The system performance has been evaluated by a set of human listeners and by spectral distances. In both cases the use of this pre-processing module has led to improved results.
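
The building block of such a scheme is a discrete Kalman filter applied to a signal written in autoregressive state-space form. The sketch below is a minimal single-filter illustration of that idea, not the paper's dual DKF (which additionally estimates the AR parameters of both speech and noise); here the AR(2) coefficients and noise variances are assumed known for simplicity.

```python
import numpy as np

def kalman_ar2(y, a1, a2, q, r):
    """Discrete Kalman filter for a signal modeled as AR(2):
    s[k] = a1*s[k-1] + a2*s[k-2] + w[k], observed as y[k] = s[k] + v[k]."""
    A = np.array([[a1, a2],
                  [1.0, 0.0]])            # state transition
    H = np.array([[1.0, 0.0]])            # observation matrix
    Q = np.array([[q, 0.0], [0.0, 0.0]])  # process-noise covariance
    R = np.array([[r]])                   # measurement-noise covariance
    x = np.zeros((2, 1))
    P = np.eye(2)
    estimates = []
    for yk in y:
        # Prediction
        x = A @ x
        P = A @ P @ A.T + Q
        # Update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[yk]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0, 0])
    return np.array(estimates)

# Example: recover a noisy AR(2) signal
rng = np.random.default_rng(0)
n, a1, a2 = 500, 1.6, -0.8
s = np.zeros(n)
for k in range(2, n):
    s[k] = a1 * s[k-1] + a2 * s[k-2] + rng.normal(scale=0.1)
y = s + rng.normal(scale=0.5, size=n)
s_hat = kalman_ar2(y, a1, a2, q=0.01, r=0.25)
```

The same AR coefficients also give the parametric power spectral density estimate mentioned in the abstract, S(f) = sigma_w^2 / |1 - a1 e^{-j2*pi*f} - a2 e^{-j4*pi*f}|^2, with f normalized by the sampling rate.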

Relevance: 30.00%

Abstract:

Transport is an essential sector in modern societies. It connects economic sectors and industries. Next to its contribution to economic development and social interconnection, it also causes adverse impacts on the environment and results in health hazards. Transport is a major source of ground-level air pollution, especially in urban areas, and therefore contributes to health problems such as cardiovascular and respiratory diseases, cancer, and physical injuries. This thesis presents the results of a health risk assessment that quantifies the mortality and disease associated with particulate matter pollution resulting from urban road transport in Hai Phong City, Vietnam. The focus is on the integration of modelling and GIS approaches in the exposure analysis to increase the accuracy of the assessment and to produce timely and consistent assessment results. Modelling was used to estimate traffic conditions and concentrations of particulate matter based on geo-referenced data. A simplified health risk assessment was also carried out for Ha Noi based on monitoring data, which allows a comparison of the results between the two cases. The results of the case studies show that a health risk assessment based on modelling data can provide much more detailed results and allows assessing the health impacts of different mobility development options at the micro level. The use of modelling and GIS as a common platform for the integration of different assessments (environmental, health, socio-economic, etc.) provides various strengths, especially in capitalising on the available data stored in different units and forms, and allows handling large amounts of data. From a decision-making point of view, the use of models and GIS in a health risk assessment can reduce the processing/waiting time while providing a view at different scales, from the micro scale (sections of a city) to the macro scale. It also helps visualise the links between air quality and health outcomes, which is useful when discussing different development options. However, a number of improvements can be made to further advance the integration. An improved data integration programme will facilitate the application of integrated models in policy-making. Data from mobility surveys and environmental monitoring and measurement must be standardised and legalised. Various traffic models, together with emission and dispersion models, should be tested, and more attention should be given to their uncertainty and sensitivity.
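
To make the quantification step concrete, the following sketch shows a commonly used log-linear concentration-response calculation of mortality attributable to particulate matter. It is only an illustration of the type of computation involved; the coefficient, threshold and population figures are hypothetical and not taken from the thesis.

```python
import math

def attributable_deaths(pm25_annual, pm25_threshold, beta,
                        baseline_deaths, population_exposed_fraction=1.0):
    """Attributable mortality from a log-linear concentration-response
    function: RR = exp(beta * (C - C0)) for concentrations C above C0."""
    delta = max(pm25_annual - pm25_threshold, 0.0)
    rr = math.exp(beta * delta)
    attributable_fraction = (rr - 1.0) / rr
    return baseline_deaths * attributable_fraction * population_exposed_fraction

# Hypothetical numbers for one city district (illustration only)
print(attributable_deaths(pm25_annual=45.0,   # µg/m3, e.g. from a dispersion-model grid cell
                          pm25_threshold=10.0,
                          beta=0.008,          # per µg/m3, assumed coefficient
                          baseline_deaths=1200))
```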

Relevance: 30.00%

Abstract:

Neurological disorders are a major concern in modern societies, with increasing prevalence mainly related to higher life expectancy. Most of the currently available therapeutic options can only control and ameliorate the patients' symptoms, often becoming refractory over time. Therapeutic breakthroughs and advances have been hampered by the lack of accurate central nervous system (CNS) models. The development of these models allows the study of disease onset/progression mechanisms and the preclinical evaluation of novel therapeutics. This has traditionally relied on genetically engineered animal models that often diverge considerably from the human phenotype (developmentally, anatomically and physiologically) and on 2D in vitro cell models, which fail to recapitulate the characteristics of the target tissue (cell-cell and cell-matrix interactions, cell polarity). The in vitro recapitulation of CNS phenotypic and functional features requires the implementation of advanced culture strategies that make it possible to mimic the in vivo structural and molecular complexity. Models based on the differentiation of human neural stem cells (hNSC) in 3D cultures have great potential as complementary tools in preclinical research, bridging the gap between human clinical studies and animal models. This thesis aimed at the development of novel human 3D in vitro CNS models by integrating agitation-based culture systems and a wide array of characterization tools. Neural differentiation of hNSC as 3D neurospheres was explored in Chapter 2. Here, it was demonstrated that human midbrain-derived neural progenitor cells of fetal origin (hmNPC) can generate complex tissue-like structures containing functional dopaminergic neurons, as well as astrocytes and oligodendrocytes. Chapter 3 focused on the development of cellular characterization assays for cell aggregates based on light-sheet fluorescence imaging systems, which resulted in increased spatial resolution for both fixed samples and live imaging. The applicability of the developed human 3D cell model for preclinical research was explored in Chapter 4, evaluating the potential of a viral vector candidate for gene therapy. The efficacy and safety of helper-dependent CAV-2 (hd-CAV-2) for gene delivery in human neurons was evaluated, demonstrating increased neuronal tropism, efficient transgene expression and minimal toxicity. The potential of human 3D in vitro CNS models to mimic brain functions was further addressed in Chapter 5. Exploring the use of 13C-labeled substrates and Nuclear Magnetic Resonance (NMR) spectroscopy tools, neural metabolic signatures were evaluated, showing lineage-specific metabolic specialization and the establishment of neuron-astrocytic shuttles upon differentiation. Chapter 6 focused on transferring the knowledge and strategies described in the previous chapters to the implementation of a scalable and robust process for the 3D differentiation of hNSC derived from human induced pluripotent stem cells (hiPSC). Here, software-controlled perfusion stirred-tank bioreactors were used as the technological system to sustain cell aggregation and differentiation. The work developed in this thesis provides practical and versatile new in vitro approaches to model the human brain. Furthermore, the culture strategies described herein can be further extended to other sources of neural phenotypes, including patient-derived hiPSC.
The combination of this 3D culture strategy with the implemented characterization methods represents a powerful complementary tool applicable in drug discovery, toxicology and disease modeling.

Relevance: 30.00%

Abstract:

The Mont Collon mafic complex is one of the best preserved examples of the Early Permian magmatism in the Central Alps, related to the intra-continental collapse of the Variscan belt. It mostly consists (> 95 vol.%) of ol+hy-normative plagioclase-wehrlites, olivine- and cpx-gabbros with cumulitic structures, crosscut by acid dikes. Pegmatitic gabbros, troctolites and anorthosites outcrop locally. A well-preserved cumulative sequence is exposed in the Dents de Bertol area (center of the intrusion). PT-calculations indicate that this layered magma chamber was emplaced at mid-crustal levels, at about 0.5 GPa and 1100 degrees C. The Mont Collon cumulitic rocks record little magmatic differentiation, as illustrated by the restricted range of clinopyroxene mg-number (Mg#(cpx) = 83-89). Whole-rock incompatible trace-element contents (e.g. Nb, Zr, Ba) vary widely and without correlation with major-element composition. These features are characteristic of an in-situ crystallization process with variable amounts of interstitial liquid L trapped between the cumulus mineral phases. LA-ICPMS measurements show that the trace-element distribution in the latter is homogeneous, pointing to subsolidus re-equilibration between crystals and interstitial melts. Quantitative modeling based on Langmuir's in-situ crystallization equation successfully reproduced the REE concentrations in cumulitic minerals of all rock facies of the intrusion. The calculated amounts of interstitial liquid L vary between 0 and 35% for degrees of differentiation F of 0 to 20%, relative to the least evolved facies of the intrusion. L values correlate well with the modal proportions of interstitial amphibole and the whole-rock incompatible trace-element concentrations (e.g. Zr, Nb) of the tested samples. However, the in-situ crystallization model reaches its limitations with rocks containing a high modal content of REE-bearing minerals (i.e. zircon), such as pegmatitic gabbros. Dikes of anorthositic composition, locally crosscutting the layered lithologies, provide evidence that the Mont Collon rocks evolved in an open system, with mixing of intercumulus liquids of different origins and possibly contrasting compositions. The proposed model cannot resolve these complex open systems, but migrating liquids could be partly responsible for the observed dispersion of points in some correlation diagrams. The absence of significant differentiation, with recurrent lithologies in the cumulitic pile of Dents de Bertol, points to an efficiently convective magma chamber with possible periodic replenishment. (c) 2005 Elsevier B.V. All rights reserved.
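
The link between trapped interstitial liquid and whole-rock incompatible-element contents can be illustrated with a simple two-component mass balance. This is only the end-member bookkeeping, not the full Langmuir in-situ crystallization model used in the paper, and all numbers are hypothetical.

```python
def whole_rock_concentration(c_liquid, partition_coeff, trapped_liquid_fraction):
    """Whole-rock trace-element concentration of a cumulate containing a
    fraction L of trapped interstitial liquid:
        C_WR = (1 - L) * D * C_L + L * C_L
    where D is the bulk mineral/melt partition coefficient."""
    L = trapped_liquid_fraction
    return (1.0 - L) * partition_coeff * c_liquid + L * c_liquid

# For an incompatible element (D << 1, e.g. Zr or Nb) the whole-rock content
# is dominated by the trapped-liquid fraction L.
c_l = 100.0  # ppm in the interstitial melt (hypothetical)
for L in (0.0, 0.1, 0.35):
    print(L, whole_rock_concentration(c_l, partition_coeff=0.05,
                                      trapped_liquid_fraction=L))
```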

Relevance: 30.00%

Abstract:

This dissertation is based on four articles dealing with the modeling of ozonation. The literature part considers models for hydrodynamics in bubble column simulation and reviews methods for obtaining mass transfer coefficients. The methods presented for obtaining mass transfer coefficients are general and can be applied to any gas-liquid system. Ozonation reaction models and methods for obtaining stoichiometric coefficients and reaction rate coefficients for ozonation reactions are discussed in the final section of the literature part. In the first article, ozone gas-liquid mass transfer into water in a bubble column was investigated for different pH values. A more general method for the estimation of the mass transfer coefficient and Henry's coefficient was developed from the Beltrán method. The ozone volumetric mass transfer coefficient and the Henry's coefficient were determined simultaneously by parameter estimation using a nonlinear optimization method. A minor dependence of the Henry's law constant on pH was detected in the pH range 4-9. In the second article, a new method using the axial dispersion model for the estimation of ozone self-decomposition kinetics in a semi-batch bubble column reactor was developed. The reaction rate coefficients for literature equations of ozone decomposition and the gas-phase dispersion coefficient were estimated and compared with literature data. In the pH range 7-10, reaction orders of 1.12 with respect to ozone and 0.51 with respect to the hydroxyl ion were obtained, which is in good agreement with the literature. The model parameters were determined by parameter estimation using a nonlinear optimization method. Sensitivity analysis was conducted using the objective function method to obtain information about the reliability and identifiability of the estimated parameters. In the third article, the reaction rate coefficients and the stoichiometric coefficients in the reaction of ozone with the model component p-nitrophenol were estimated at low pH using nonlinear optimization. A novel method for the estimation of multireaction model parameters in ozonation was developed. In this method the concentration of unknown intermediate compounds is represented as a residual COD (chemical oxygen demand), calculated from the measured COD and the theoretical COD of the known species. The decomposition rate of p-nitrophenol on the pathway producing hydroquinone was found to be about two times faster than the p-nitrophenol decomposition rate on the pathway producing 4-nitrocatechol. In the fourth article, the reaction kinetics of p-nitrophenol ozonation was studied in a bubble column at pH 2. Using the new reaction kinetic model presented in the previous article, the reaction kinetic parameters (rate coefficients and stoichiometric coefficients) as well as the mass transfer coefficient were estimated by nonlinear estimation. The decomposition rates of p-nitrophenol were found to be equal on the pathway producing hydroquinone and on the pathway producing 4-nitrocatechol. Comparison of the rate coefficients with the case at initial pH 5 indicates that the p-nitrophenol degradation producing 4-nitrocatechol is more selective towards molecular ozone than the reaction producing hydroquinone. The identifiability and reliability of the estimated parameters were analyzed with the Markov chain Monte Carlo (MCMC) method. © All rights reserved.
No part of the publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of the author.
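
The simultaneous estimation of the volumetric mass transfer coefficient and Henry's coefficient described in the first article can be sketched as a nonlinear least-squares fit of a simple absorption model to dissolved-ozone data. The code below is an illustrative reconstruction under simplifying assumptions (no ozone decomposition, constant gas-phase partial pressure), with synthetic data standing in for the bubble-column measurements.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

P_O3 = 2000.0  # ozone partial pressure in the feed gas, Pa (hypothetical)

def dissolved_ozone(t, kla, henry):
    """Dissolved ozone profile for absorption into initially ozone-free water:
    dC/dt = kLa * (P/H - C), neglecting ozone decomposition."""
    c_sat = P_O3 / henry
    def rhs(c, _t):
        return kla * (c_sat - c)
    return odeint(rhs, 0.0, t)[:, 0]

def residuals(params, t, c_meas):
    kla, henry = params
    return dissolved_ozone(t, kla, henry) - c_meas

# Synthetic "measurements" standing in for bubble-column data
t = np.linspace(0, 600, 40)                   # s
c_true = dissolved_ozone(t, 0.01, 8.0e3)      # kLa = 0.01 1/s, H = 8e3 Pa·m3/mol
c_meas = c_true + np.random.default_rng(0).normal(scale=0.002, size=t.size)

fit = least_squares(residuals, x0=[0.005, 5.0e3], args=(t, c_meas))
print("estimated kLa, H:", fit.x)
```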

Relevance: 30.00%

Abstract:

Data management consists of collecting, storing, and processing data into a format that provides value-adding information for the decision-making process. The development of data management has enabled the design of increasingly effective database management systems to support business needs. Consequently, not only are advanced systems designed for reporting purposes, but operational systems also allow reporting and data analysis. The research method used in the theory part is qualitative research, and the research type in the empirical part is a case study. The objective of this paper is to examine database management system requirements from the perspectives of reporting management and data management. In the theory part these requirements are identified and the appropriateness of the relational data model is evaluated. In addition, key performance indicators applied to the operational monitoring of production are studied. The study has revealed that appropriate operational key performance indicators of production take into account time, quality, flexibility and cost aspects; manufacturing efficiency in particular has been highlighted. In this paper, reporting management is defined as the continuous monitoring of given performance measures. According to the literature review, the data management tool should cover performance, usability, reliability, scalability, and data privacy aspects in order to fulfill reporting management's demands. A framework is created for the system development phase based on these requirements, and it is used in the empirical part of the thesis, where such a system is designed and created for reporting management purposes for a company operating in the manufacturing industry. Relational data modeling and database architectures are utilized when the system is built on a relational database platform.
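
As a concrete example of an operational key performance indicator of the kind discussed, the sketch below computes Overall Equipment Effectiveness (a standard manufacturing-efficiency measure combining availability, performance and quality) directly from a relational table. The schema and figures are hypothetical, not the case company's.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE production_run (
    run_id INTEGER PRIMARY KEY,
    planned_minutes REAL, downtime_minutes REAL,
    ideal_cycle_time_s REAL, units_produced INTEGER, units_good INTEGER
);
INSERT INTO production_run VALUES
    (1, 480, 45, 30, 820, 790),
    (2, 480, 20, 30, 900, 885);
""")

# OEE = availability * performance * quality, computed in SQL
# against the operational table.
query = """
SELECT run_id,
       (planned_minutes - downtime_minutes) / planned_minutes   AS availability,
       (units_produced * ideal_cycle_time_s / 60.0)
           / (planned_minutes - downtime_minutes)               AS performance,
       CAST(units_good AS REAL) / units_produced                AS quality
FROM production_run;
"""
for run_id, a, p, q in conn.execute(query):
    print(f"run {run_id}: OEE = {a * p * q:.2%}")
```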

Relevance: 30.00%

Abstract:

Hardware/software systems are becoming indispensable in every aspect of daily life. The growing presence of these systems in various products and services calls for methods to develop them efficiently. However, efficient design of such systems is limited by several factors, among them: the increasing complexity of applications, rising integration density, the heterogeneous nature of products and services, and shrinking time-to-market. Transaction-level modeling (TLM) is considered a promising paradigm for managing design complexity and for providing means to explore and validate design alternatives at high levels of abstraction. This research proposes a methodology for expressing time in TLM based on an analysis of timing constraints. We propose to combine two development paradigms to accelerate design: TLM on the one hand, and a methodology for expressing time between different transactions on the other. This synergy allows us to combine, within a single environment, efficient simulation methods and formal analytical methods. We propose a new timing verification algorithm based on a linearization procedure for min/max constraints, together with an optimization technique that improves the algorithm's efficiency. We complete the mathematical description of all the constraint types reported in the literature. We develop methods for the exploration and refinement of the communication system that allow the timing verification algorithms to be applied at different TLM levels. Since several definitions of TLM exist, within the scope of this research we define a specification and simulation methodology for hardware/software systems based on the TLM paradigm, in which several modeling concepts can be considered separately. Based on the use of modern software engineering technologies such as XML, XSLT, XSD, object-oriented programming and several others provided by the .Net environment, the proposed methodology presents an approach that makes it possible to reuse intermediate models in order to cope with the time-to-market constraint. It provides a general approach to system modeling that separates different design aspects, such as the models of computation used to describe the system at multiple levels of abstraction. Consequently, the system model clearly exposes the system's functionality without the details tied to the development platforms, which improves the "portability" of the application model.
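
For the linear special case of such timing constraints (bounds of the form t_j - t_i <= c between transaction events), feasibility can be checked with a constraint-graph formulation and Bellman-Ford relaxation, as sketched below. Handling genuine min/max constraints requires the linearization procedure proposed in the thesis, which this illustration does not reproduce; the event names and bounds are hypothetical.

```python
def timing_feasible(n_events, constraints):
    """constraints: list of (i, j, c) meaning t_j - t_i <= c.
    Returns True if some assignment of event times satisfies them all."""
    INF = float("inf")
    # Virtual source node n_events connected to every event with weight 0.
    dist = [INF] * n_events + [0.0]
    edges = [(n_events, j, 0.0) for j in range(n_events)]
    edges += [(i, j, c) for (i, j, c) in constraints]
    for _ in range(n_events):
        for i, j, c in edges:
            if dist[i] + c < dist[j]:
                dist[j] = dist[i] + c
    # One extra relaxation pass: any further improvement means a negative
    # cycle, i.e. the constraint system is inconsistent.
    return all(dist[i] + c >= dist[j] for i, j, c in edges)

# Three transaction events: request (0), grant (1), response (2)
ok = timing_feasible(3, [(0, 1, 10),    # grant at most 10 time units after request
                         (1, 2, 25),    # response at most 25 after grant
                         (2, 0, -20)])  # response at least 20 after request
print(ok)  # True
```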

Relevance: 30.00%

Abstract:

Pain is a perceptual experience with many dimensions. These dimensions of pain are interrelated and recruit neural networks that process the corresponding information. Elucidating the functional architecture that supports the different perceptual aspects of the experience is therefore a fundamental step toward understanding the functional role of the different regions of the cerebral pain matrix within the cortical circuits that underlie the subjective experience of pain. Among the various brain regions involved in processing nociceptive information, the primary and secondary somatosensory cortices (S1 and S2) are the main regions generally associated with processing the sensory-discriminative aspect of pain. However, the functional organization of these somatosensory regions is not completely clear, and relatively few studies have directly examined the integration of information between somatosensory regions. Several questions thus remain concerning the hierarchical relationship between S1 and S2, as well as the functional role of inter-hemispheric connections between homologous somatosensory regions. Likewise, serial versus parallel processing within the somatosensory system is another open question that requires further examination. The goal of the present study was to test a number of hypotheses about causality in the functional interactions between S1 and S2 while subjects received painful electric shocks. We implemented a connectivity modeling method that uses a causal description of the system dynamics to study the interactions between activation sites defined from a functional imaging dataset. Our paradigm consisted of three experimental sessions using electric shocks at three intensity levels: moderately painful (level 3), slightly painful (level 2), and completely non-painful (level 1). This paradigm therefore allowed us to study how stimulus intensity is encoded in our network of interest and how the connectivity of the different regions is modulated across stimulation conditions. Our results favor a serial mode of processing of nociceptive somatosensory information, with a predominant thalamocortical input to S1 contralateral to the stimulation site. Our results imply that information propagates from contralateral S1 through our network of interest, composed of bilateral S1 and S2. Our analysis indicates that the S1→S2 connection is strengthened by pain, suggesting that S2 sits higher in the pain-processing hierarchy than S1, in agreement with previous neurophysiological and magnetoencephalography findings. Finally, our analysis provides evidence for somatosensory input entering the hemisphere contralateral to the stimulation side, with inter-hemispheric connections responsible for transferring the information to the ipsilateral hemisphere.
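
As a toy illustration of directed (causal) connectivity analysis between two regional time series, the sketch below scores the influence of one signal on another by how much its past improves an autoregressive prediction. This Granger-style measure is not the connectivity model used in the study, and the signals here are synthetic.

```python
import numpy as np

def granger_improvement(x, y, lag=1):
    """Crude directed-influence score y -> x: relative reduction in the
    residual variance of an AR(lag) model of x when past values of y are
    added. Illustration of directed connectivity analysis only."""
    n = len(x)
    X_past = np.column_stack([x[lag - k - 1:n - k - 1] for k in range(lag)])
    Y_past = np.column_stack([y[lag - k - 1:n - k - 1] for k in range(lag)])
    target = x[lag:]
    def residual_var(design):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        return np.var(target - design @ beta)
    v_restricted = residual_var(np.column_stack([np.ones(n - lag), X_past]))
    v_full = residual_var(np.column_stack([np.ones(n - lag), X_past, Y_past]))
    return 1.0 - v_full / v_restricted

# Synthetic example: s1 drives s2 with a one-sample delay
rng = np.random.default_rng(0)
s1 = rng.normal(size=2000)
s2 = 0.8 * np.roll(s1, 1) + 0.2 * rng.normal(size=2000)
print("s1 -> s2:", granger_improvement(s2, s1))  # large
print("s2 -> s1:", granger_improvement(s1, s2))  # near zero
```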

Relevance: 30.00%

Abstract:

Objective: To determine scoliosis curve types using non-invasive surface acquisition, without prior knowledge from X-ray data. Methods: Classification of scoliosis deformities according to curve type is used in the clinical management of scoliotic patients. In this work, we propose a robust system that can determine the scoliosis curve type from non-invasive acquisition of the 3D back surface of the patients. The 3D image of the surface of the trunk is divided into patches, and local geometric descriptors characterizing the back surface are computed from each patch and constitute the features. We reduce the dimensionality by using principal component analysis and retain 53 components using an overlap criterion combined with the total variance in the observed variables. In this work, a multi-class classifier is built with least-squares support vector machines (LS-SVM). The original LS-SVM formulation was modified by weighting the positive and negative samples differently, and a new kernel was designed in order to achieve a robust classifier. The proposed system is validated using data from 165 patients with different scoliosis curve types. The results of our non-invasive classification were compared with those obtained by an expert using X-ray images. Results: The average rate of successful classification was computed using a leave-one-out cross-validation procedure. The overall accuracy of the system was 95%. As for the correct classification rates per class, we obtained 96%, 84% and 97% for the thoracic, double major and lumbar/thoracolumbar curve types, respectively. Conclusion: This study shows that it is possible to find a relationship between the internal deformity and the back surface deformity in scoliosis with machine learning methods. The proposed system uses non-invasive surface acquisition, which is safe for the patient as it involves no radiation. Also, the design of a specific kernel improved classification performance.
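
A minimal reconstruction of this pipeline could look as follows, using scikit-learn's SVC with class weighting and an RBF kernel as a stand-in for the weighted LS-SVM with its custom kernel, and synthetic descriptors in place of the patient data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import LeaveOneOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the per-patch geometric descriptors
# (the real study used 165 patients and retained 53 principal components).
X, y = make_classification(n_samples=165, n_features=300, n_informative=40,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

pipeline = make_pipeline(
    StandardScaler(),
    PCA(n_components=53),
    # class_weight="balanced" re-weights the classes, a rough analogue of the
    # differently weighted positive/negative samples in the weighted LS-SVM.
    SVC(kernel="rbf", C=10.0, gamma="scale", class_weight="balanced"),
)

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    pipeline.fit(X[train_idx], y[train_idx])
    correct += int(pipeline.predict(X[test_idx])[0] == y[test_idx][0])
print(f"leave-one-out accuracy: {correct / len(y):.2%}")
```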

Relevance: 30.00%

Abstract:

It is proposed to study the suspended sediment transport characteristics of the river basins of Kerala and to model the suspended sediment discharge mechanism for typical micro-watersheds. The Pamba river basin is selected as a representative hydrologic regime for detailed studies of suspended sediment characteristics and their seasonal variation. The applicability of various erosion models was tested by comparison with the observed event data (obtained by continuous monitoring of rainfall, discharge, and suspended sediment concentration for lower-order streams). Empirical, conceptual and physically based distributed models were used for comparing the performance of the models. Large variations in the discharge and sediment quantities were noticed during a particular year between the river basins investigated, and for an individual river basin across the years for which data were available. In general, the sediment yield pattern follows the seasonal distribution of rainfall, discharge and the physiography of the land. This is consistent with similar studies made for other Indian rivers. It was observed from this study that the quantity of sediment transported downstream shows a decreasing trend over the years, corresponding to an increase in discharge. For sound and sustainable management of coastal zones, it is important to understand the balance between erosion and retention and to quantify the exact amount of sediment reaching this ecosystem. This, of course, necessitates a good length of time series data and more focused research on the behaviour of each river system, both present and past. In this realm of river inputs to the ocean system, each of the 41 rivers of Kerala may have a dominant yet diversified role in influencing the coastal ecosystem, as reflected in this study on the major fraction of transport, namely the suspended sediments.
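
A simple example of the empirical end of this model spectrum is the sediment rating curve Qs = a * Q^b fitted in log space. The sketch below uses synthetic discharge and load data purely to illustrate the fitting procedure; it is not based on the Pamba observations.

```python
import numpy as np

# Sediment rating curve Qs = a * Q**b, fitted in log space.
rng = np.random.default_rng(1)
Q = rng.uniform(5, 400, size=60)                         # discharge, m3/s (synthetic)
Qs = 0.02 * Q**1.7 * rng.lognormal(sigma=0.3, size=60)   # sediment load, kg/s

b, log_a = np.polyfit(np.log(Q), np.log(Qs), 1)
a = np.exp(log_a)
print(f"rating curve: Qs = {a:.3f} * Q^{b:.2f}")
print("predicted load at Q = 250 m3/s:", a * 250**b, "kg/s")
```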

Relevance: 30.00%

Abstract:

This study focuses on the fractionation and quantification of chlorophenols, among the most important potential pollutants in this category; the distribution and seasonal dynamics of MBAS, phenols and chlorophenols; and the development of a model to describe the chemical reactivity of the estuary utilizing the dynamics of boron. The CES is highly influenced by various anthropogenic activities, such as the discharge of agricultural, industrial and urban wastes, the operation of shipyards, oil and other transport activities, fishing, and dredging. Seasonal values of MBAS showed high values in the surface water during the monsoon compared to the premonsoon and postmonsoon. In the Cochin estuary, o-chlorophenol and p-chlorophenol showed low values in the surface water compared to the bottom water in the northern part of the estuary, and higher values in the surface water in the southern part.

Relevance: 30.00%

Abstract:

The main purpose of the thesis is to improve the state of knowledge and understanding of the physical structure of the TMCS and its short-range prediction. The present study principally addresses the fine structure, dynamics and microphysics of severe convective storms. The structure and dynamics of tropical cloud clusters over the Indian region are not well understood. The observational cases discussed in the thesis are limited to temperature and humidity observations. We propose a mesoscale observational network along with all the available Doppler radars and other conventional and non-conventional observations. Simultaneous observations of the same cloud system with DWR, VHF and UHF radars will provide new insight into the dynamics and microphysics of the clouds. More cases have to be studied in detail to obtain a climatology of the storm types passing over the tropical Indian region. These observational data sets provide a wide variety of information to be assimilated into the mesoscale data assimilation system and can be used to force CSRMs. The gravity wave generation and stratosphere-troposphere exchange (STE) processes associated with convection have gained a great deal of attention from modern science and meteorologists. Round-the-clock observations using VHF and UHF radars, along with supplementary data sets such as DWR, satellite, GPS/radiosonde, meteorological rocket and aircraft observations, are needed to explore the role of convection and the associated energetics in detail.

Relevance: 30.00%

Abstract:

Motion instability is an important issue that occurs during the operation of towed underwater vehicles (TUV), and it considerably affects the accuracy of the high-precision acoustic instrumentation housed inside them. Of the various parameters responsible for this, disturbances from the tow-ship are the most significant. The present study focuses on the motion dynamics of an underwater towing system with ship-induced disturbances as the input, and in particular on an innovative system called two-part towing. The methodology involves numerical modeling of the tow system, which consists of modeling the tow-cables and the vehicle formulation. A previous study in this direction used a segmental approach for the modeling of the cable. Even though that model was successful in predicting the heave response of the tow-body, instabilities were observed in the numerical solution. The present study devises a simple approach called the lumped mass spring model (LMSM) for the cable formulation. In this work, the traditional LMSM has been modified in two ways: first, by implementing advanced time integration procedures, and secondly, by using a modified beam model which uses only translational degrees of freedom for solving the beam equation. A number of time integration procedures, such as Euler, Houbolt, Newmark and HHT-α, were implemented in the traditional LMSM, and the strengths and weaknesses of each scheme were numerically estimated. In most of the previous studies, the hydrodynamic forces acting on the tow system, such as drag and lift, are approximated as analytical expressions of the velocities. This approach restricts these models to simple cylindrical towed bodies and may not be applicable to modern tow systems, which are diverse in shape and complexity. Hence, in this particular study, hydrodynamic parameters such as the drag and lift of the tow system are estimated using CFD techniques. To achieve this, a RANS-based CFD code has been developed. Further, a new convection interpolation scheme for CFD simulation, called BNCUS, which is a blend of cell-based and node-based formulations, was proposed in the study and numerically tested. To account for the considerable time taken in solving the fluid dynamic equations, a dedicated parallel computing setup has been developed. Two types of computational parallelism are explored in the current study, viz. a model for shared-memory processors and one for distributed-memory processors. In the present study, the shared-memory model was used for the structural dynamic analysis of the towing system, while the distributed-memory one was devised for solving the fluid dynamic equations.
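
Among the time integration procedures listed, the Newmark scheme is easily illustrated on a single lumped mass. The sketch below integrates a mass-spring-damper equation with the average-acceleration Newmark-beta method under a sinusoidal ship-induced disturbance; all parameter values are hypothetical, and the full LMSM of course couples many such masses along the cable.

```python
import numpy as np

def newmark_sdof(m, c, k, force, dt, n_steps, beta=0.25, gamma=0.5):
    """Newmark-beta time integration for a single-degree-of-freedom
    mass-spring-damper system  m*u'' + c*u' + k*u = f(t).
    The average-acceleration form (beta=1/4, gamma=1/2) is
    unconditionally stable for linear problems."""
    u, v = 0.0, 0.0
    a = (force(0.0) - c * v - k * u) / m
    k_eff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
    history = [u]
    for i in range(1, n_steps + 1):
        t = i * dt
        f_eff = (force(t)
                 + m * (u / (beta * dt**2) + v / (beta * dt)
                        + (1 / (2 * beta) - 1) * a)
                 + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                        + dt * (gamma / (2 * beta) - 1) * a))
        u_new = f_eff / k_eff
        v_new = (gamma / (beta * dt) * (u_new - u)
                 + (1 - gamma / beta) * v
                 + dt * (1 - gamma / (2 * beta)) * a)
        a_new = ((u_new - u) / (beta * dt**2) - v / (beta * dt)
                 - (1 / (2 * beta) - 1) * a)
        u, v, a = u_new, v_new, a_new
        history.append(u)
    return np.array(history)

# Heave-like response of a 500 kg body on a cable idealised as a spring,
# excited by a sinusoidal ship-induced disturbance (values hypothetical).
response = newmark_sdof(m=500.0, c=800.0, k=2.0e4,
                        force=lambda t: 1000.0 * np.sin(0.5 * t),
                        dt=0.01, n_steps=5000)
print("peak displacement:", response.max())
```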

Relevance: 30.00%

Abstract:

In safety-critical software, failure can have a high price. Such software should be free of errors before it is put into operation. The application of formal methods in the Software Development Life Cycle helps to ensure that software for safety-critical missions is ultra-reliable. The PVS theorem prover, a formal methods tool, can be used for the formal verification of software written in the ADA Language for Flight Software Application (ALFA). This paper describes the modeling of ALFA programs for the PVS theorem prover. An ALFA2PVS translator has been developed which automatically converts software in ALFA to a PVS specification. With this approach the software can be verified formally with respect to underflow/overflow errors and divide-by-zero conditions without actual execution of the code.
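
The proof obligations targeted by such verification can be illustrated outside PVS as explicit checks on a bounded integer operation. The sketch below shows the divide-by-zero and overflow conditions for a 16-bit signed division; it illustrates the conditions only, not the ALFA2PVS translation itself.

```python
# Illustration only: the kind of proof obligations (overflow/underflow and
# divide-by-zero freedom) that the translation targets, expressed here as
# explicit runtime checks on a 16-bit signed integer division.
INT16_MIN, INT16_MAX = -2**15, 2**15 - 1

def checked_div(a: int, b: int) -> int:
    assert INT16_MIN <= a <= INT16_MAX and INT16_MIN <= b <= INT16_MAX
    assert b != 0, "divide-by-zero condition violated"
    # The single overflowing case of 16-bit signed division:
    assert not (a == INT16_MIN and b == -1), "overflow condition violated"
    result = int(a / b)  # truncation toward zero, as in Ada integer division
    assert INT16_MIN <= result <= INT16_MAX
    return result

print(checked_div(-32768, 2))   # fine
# checked_div(-32768, -1)       # would violate the overflow obligation
```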