906 results for Complex systems prediction


Relevance: 40.00%

Abstract:

Polyion complex micelles have emerged as promising delivery systems for ionic hydrophilic drugs. The aim of this study was to develop dextran-based polyion complex micelles for the delivery of cationic hydrophilic drugs, using a new family of carboxymethyldextran-poly(ethylene glycol) (CMD-PEG) block copolymers. Four CMD-PEG copolymers were prepared: two identical in the lengths of their CMD and PEG blocks but differing in the charge density of the CMD block, and two others in which the charged blocks are the same but the PEG blocks differ. The encapsulation properties of the CMD-PEG micelles were evaluated with different cationic molecules: diminazene (DIM), a model cationic drug; minocycline hydrochloride (MH), a semi-synthetic tetracycline analogue with promising neuroprotective properties; and various aminoglycoside antibiotics. The cytotoxicity of the CMD-PEG copolymers was evaluated in different cell lines using the MTT and Alamar Blue assays. Micelle formation by the CMD-PEG copolymers was characterized by techniques including 1H NMR spectroscopy, dynamic light scattering (DLS) and isothermal titration calorimetry (ITC). Drug release rates and the pharmacological activity of the drug-loaded micelles were also evaluated. The CMD-PEG copolymers induced no cytotoxicity in human hepatocytes or in murine microglial cells (N9) after 24 h of incubation at concentrations of up to 15 mg/mL. Electrostatic interactions between the CMD-PEG copolymers and the various cationic drugs triggered the formation of polyion complex micelles with a core composed of the CMD-cationic drug complex and a corona composed of PEG.
The properties of the DIM/CMD-PEG micelles depended strongly on the degree of carboxymethylation of the CMD block. CMD-PEG micelles with a CMD carboxymethylation degree ≥ 60% incorporated up to 64 wt% DIM and resisted salt-induced disintegration up to 400 mM NaCl. In contrast, CMD-PEG micelles with a carboxymethylation degree of ~30% had a lower drug content (~40 wt% DIM) and disassembled at lower salt concentrations (∼100 mM NaCl). The CMD-PEG copolymer with the most satisfactory micellar properties was selected as a potential delivery system for minocycline hydrochloride (MH) and aminoglycoside antibiotics. CMD-PEG micelles encapsulating MH or aminoglycosides had a small size (< 200 nm in diameter), a high loading capacity (≥ 50 wt% drug) and sustained drug release. These micelles were stable in aqueous solution for one month, after freeze-drying and in the presence of bovine serum albumin. Moreover, the micelles protected MH against degradation in aqueous solution. The drug-loaded micelles maintained the pharmacological activities of the encapsulated drugs: MH-loaded micelles reduced lipopolysaccharide-induced inflammation in murine microglial cells (N9), while aminoglycoside-loaded micelles were able to kill a test bacterial culture. However, the aminoglycoside/CMD-PEG micelles were unstable under physiological conditions. The micelle properties were considerably improved by hydrophobic modification of CMD-PEG: aminoglycoside/dodecyl-CMD-PEG micelles showed a smaller size and better stability under physiological conditions.
The results of this study show that CMD-PEG copolymers are promising delivery systems for cationic drugs.

Relevance: 40.00%

Abstract:

This thesis reviews recent and more traditional thinking from systems theory, workplace creativity, work organization theories and motivation in order to propose a psychological perspective on how individuals regulate their actions within complex and uncertain work environments. Components of Action Regulation Theory (Frese & Zapf, 1994) and Self-Determination Theory (Deci & Ryan, 2000) are brought together to evaluate a model defining key cognitive schemas associated with individual and collective tasks at work. We propose that these cognitive schemas, organized hierarchically, play a central role in the regulation of effective action within an adaptive social system. Our measures of these cognitive schemas are based on scales proposed in the role-ambiguity literature (e.g. Sawyer, 1992; Breaugh & Colihan, 1994) and are related to measures of psychological need satisfaction (Van den Broeck, Vansteenkiste, De Witte, Soenens & Lens, 2009) and psychological well-being (Goldberg, 1972). Data from 153 full-time employees of a video game company were collected at two measurement times. The results reveal that different types of cognitive schemas associated with individual and collective tasks are linked to the satisfaction of different types of psychological needs, which are in turn linked to psychological well-being. The results also support the hypothesis that cognitive schemas are organized hierarchically according to their level of abstraction and their proximity to the concrete execution of action.
These results provide an initial account of the process by which the different types of cognitive schemas developed at work, and influenced by the work environment, are associated with employees' attitudes and psychological well-being. Practical and theoretical implications for motivation, learning, empowerment, psychological well-being and work organization in complex and uncertain work environments are discussed.

Relevance: 40.00%

Abstract:

Understanding complex biological processes requires sophisticated experimental and computational approaches. Recent advances in functional genomics strategies now provide powerful tools for collecting data on the interconnectivity of genes, proteins and small molecules, with the aim of studying the organizational principles of their cellular networks. Integrating this knowledge within a systems biology framework would allow the prediction of new functions for genes that remain uncharacterized to date. To make such predictions at the genomic scale in the yeast Saccharomyces cerevisiae, we developed an innovative strategy that combines high-throughput interactome screening of protein-protein interactions, in silico prediction of gene function, and validation of these predictions by high-throughput lipidomics. First, we performed a large-scale screen of protein-protein interactions using protein-fragment complementation. This method detects in vivo interactions between proteins expressed from their natural promoters and, unlike other existing techniques for detecting protein-protein interactions, showed no bias against membrane interactions. We thereby discovered several new interactions and increased the coverage of a lipid homeostasis interactome whose understanding is still incomplete. We then applied a machine learning algorithm to identify eight uncharacterized genes with a potential role in lipid metabolism.
Finally, we investigated whether these genes, together with a distinct group of transcriptional regulators not previously linked to lipids, play a role in lipid homeostasis. To this end, we analysed the lipidomes of deletion mutants of the selected genes. To examine a large number of strains, we developed a high-throughput platform for high-content lipidomic screening of yeast mutant libraries. This platform combines high-resolution Orbitrap mass spectrometry with a dedicated data-processing framework supporting lipid phenotyping of hundreds of Saccharomyces cerevisiae mutants. The lipidomics experiments confirmed the functional predictions by revealing differences in the lipid metabolic phenotypes of deletion mutants lacking the genes YBR141C and YJR015W, both implicated in lipid metabolism. An altered lipid phenotype was also observed for a deletion mutant of the transcription factor KAR4, which had not previously been linked to lipid metabolism. Together, these results demonstrate that a process integrating the acquisition of new molecular interactions, computational prediction of gene function and an innovative high-throughput lipidomics platform is an important addition to existing systems biology methodologies. Developments in functional genomics methodologies and lipidomics technologies thus provide new means of studying the biological networks of higher eukaryotes, including mammals. The strategy presented here therefore has the potential to be applied to more complex organisms.
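As a toy illustration of the in silico prediction step, the sketch below scores uncharacterized genes by "guilt by association" over a protein-protein interaction list: a gene's score is the fraction of its interaction partners already annotated to lipid metabolism. This is a minimal stand-in for the machine learning algorithm actually used in the study; the interactions, annotations and the names `geneA`/`geneB` are invented.

```python
from collections import defaultdict

# Toy protein-protein interaction list (hypothetical pairs).
interactions = [
    ("geneA", "ERG6"), ("geneA", "OPI3"), ("geneA", "geneB"),
    ("geneB", "ACT1"), ("geneB", "TUB1"),
]

# Genes annotated to lipid metabolism (illustrative only).
lipid_genes = {"ERG6", "OPI3", "CHO1"}

# Build an undirected neighbour map from the interaction list.
neighbors = defaultdict(set)
for a, b in interactions:
    neighbors[a].add(b)
    neighbors[b].add(a)

def lipid_score(gene):
    """Fraction of a gene's interaction partners annotated as lipid-related."""
    part = neighbors[gene]
    return len(part & lipid_genes) / len(part) if part else 0.0

scores = {g: lipid_score(g) for g in ("geneA", "geneB")}
```

Here `geneA`, two of whose three partners are lipid-annotated, scores 2/3, while `geneB` scores 0; a real pipeline would combine many such features and rank the whole genome.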

Relevance: 40.00%

Abstract:

Identification and control of nonlinear dynamical systems are challenging problems for control engineers. The topic is equally relevant in communication, weather prediction, biomedical systems and even social systems, where nonlinearity is an integral part of system behavior. Most real-world systems are nonlinear in nature, and nonlinear system identification/modeling has wide applications. The basic approach to analyzing nonlinear systems is to build a model from the known behavior manifest in the form of the system output. The modeling problem boils down to computing a suitably parameterized model representing the process; the parameters of the model are adjusted to optimize a performance function based on the error between the given process output and the identified process/model output. While linear system identification is well established, with many classical approaches, most of those methods cannot be applied directly to nonlinear system identification. The problem becomes more complex if the system is completely unknown and only the output time series is available; the blind recognition problem is the direct consequence of such a situation, and the thesis concentrates on such problems. The capability of artificial neural networks to approximate many nonlinear input-output maps makes them predominantly suitable for building a function for the identification of nonlinear systems where only the time series is available. The literature is rich with a variety of algorithms to train the neural network model. A comprehensive study of the computation of the model parameters using the different algorithms, together with a comparison among them to choose the best technique, is still a demanding requirement from practical system designers and is not available in a concise form in the literature. The thesis is thus an attempt to develop and evaluate some of the well-known algorithms, and to propose some new techniques, in the context of blind recognition of nonlinear systems. It also attempts to establish the relative merits and demerits of the different approaches. Comprehensiveness is achieved by utilizing the benefits of well-known evaluation techniques from statistics. The study concludes by providing the results of implementing the currently available and modified versions, together with the newly introduced techniques, for nonlinear blind system modeling, followed by a comparison of their performance. It is expected that such a comprehensive study and comparison process will be of great relevance in many fields, including chemical, electrical, biological, financial and weather data analysis. Furthermore, the results reported will be of immense help to practical system designers and analysts in selecting the most appropriate method based on the goodness of the model for the particular context.
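A minimal sketch of the core idea, identifying a nonlinear system from its output time series alone: a small one-hidden-layer network is trained by stochastic gradient descent to predict the next sample of a logistic-map series from the current one. The architecture, series and learning rate are illustrative choices, not those evaluated in the thesis.

```python
import math
import random

random.seed(0)

# Generate a nonlinear time series (logistic map) - the "unknown" system.
# As in blind identification, only this output series is observed.
x = [0.3]
for _ in range(200):
    x.append(3.7 * x[-1] * (1.0 - x[-1]))

# One-hidden-layer tanh network mapping x[t] -> predicted x[t+1].
H = 4                                               # hidden units (arbitrary)
w1 = [random.uniform(-0.5, 0.5) for _ in range(H)]  # input weights
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]  # output weights
b2 = 0.0

def forward(u):
    h = [math.tanh(w1[j] * u + b1[j]) for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

def mse():
    return sum((forward(x[t])[0] - x[t + 1]) ** 2
               for t in range(len(x) - 1)) / (len(x) - 1)

loss_before = mse()

lr = 0.05                                           # learning rate (arbitrary)
for _ in range(200):                                # epochs of plain SGD
    for t in range(len(x) - 1):
        y, h = forward(x[t])
        err = y - x[t + 1]
        b2 -= lr * err
        for j in range(H):
            grad_h = err * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
            w2[j] -= lr * err * h[j]
            w1[j] -= lr * grad_h * x[t]
            b1[j] -= lr * grad_h

loss_after = mse()
```

Training reduces the one-step prediction error; the algorithms compared in the thesis replace this plain gradient descent with more sophisticated training schemes.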

Relevance: 40.00%

Abstract:

The prediction of extratropical cyclones by the European Centre for Medium-Range Weather Forecasts (ECMWF) and the National Centers for Environmental Prediction (NCEP) Ensemble Prediction Systems (EPS) has been investigated using an objective feature-tracking methodology to identify and track the cyclones along the forecast trajectories. Overall, the results show that the ECMWF EPS has a slightly higher level of skill than the NCEP EPS in the northern hemisphere (NH). However, in the southern hemisphere (SH), NCEP has higher predictive skill than ECMWF for the intensity of the cyclones. The results from both EPSs indicate a higher level of predictive skill for the position of extratropical cyclones than for their intensity, and show that there is a larger spread in intensity than in position. Further analysis shows that the predicted propagation speed of cyclones is generally too slow for the ECMWF EPS, which also shows a slight bias for the intensity of the cyclones to be overpredicted. This is also true for the NCEP EPS in the SH, whereas in the NH the NCEP EPS underpredicts the intensity of the cyclones. There is a small bias in both EPSs for the cyclones to be displaced towards the poles. For each ensemble forecast of each cyclone, the predictive skill of the ensemble member that best predicts the cyclone's position and intensity was computed. The results are very encouraging, showing that the predictive skill of the best ensemble member is significantly higher than that of the control forecast in terms of both the position and intensity of the cyclones. The prediction of cyclones before they are identified as 850 hPa vorticity centers in the analysis cycle was also considered. It is shown that an indication of extratropical cyclones can be given by at least one ensemble member 7 days before they are identified in the analysis.
Further analysis of the ECMWF EPS shows that the ensemble mean has a higher level of skill than the control forecast, particularly for the intensity of the cyclones, from day 3 of the forecast. There is a higher level of skill in the NH than in the SH, and the spread in the SH is correspondingly larger. The difference between the ensemble mean error and the spread is very small for the position of the cyclones, but the spread of the ensemble is smaller than the ensemble mean error for the intensity of the cyclones in both hemispheres. Results also show that the ECMWF control forecast has ½ to 1 day more skill than the perturbed members, for both the position and intensity of the cyclones, throughout the forecast.
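The best-member computation can be sketched for a single case: score each ensemble member's position forecast against the analysed position and compare the best member with the control. All positions below are invented, and a flat-plane distance stands in for the proper great-circle measure used in real verification.

```python
import math

# One cyclone's analysed ("true") position and several ensemble-member
# forecasts of it, control member first. Numbers are invented.
truth = (52.0, -20.0)                      # latitude, longitude in degrees
members = [
    (53.1, -18.5),  # control forecast
    (52.4, -19.6),
    (51.2, -21.9),
    (52.1, -20.3),
]

def position_error(f):
    """Flat-plane distance in degrees - a crude stand-in for great-circle distance."""
    return math.hypot(f[0] - truth[0], f[1] - truth[1])

control_error = position_error(members[0])

# Ensemble-mean forecast: average the predicted positions, then score it.
mean_fc = tuple(sum(m[i] for m in members) / len(members) for i in range(2))
mean_error = position_error(mean_fc)

# Best member: the single member closest to the analysed position.
best_error = min(position_error(m) for m in members)
```

In the study the same comparison is aggregated over hundreds of cyclones and all forecast lead times, and is computed for intensity as well as position.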

Relevance: 40.00%

Abstract:

During the past 15 years, a number of initiatives have been undertaken at national level to develop ocean forecasting systems operating at regional and/or global scales. The co-ordination between these efforts has been organized internationally through the Global Ocean Data Assimilation Experiment (GODAE). The French MERCATOR project is one of the leading participants in GODAE. The MERCATOR systems routinely assimilate a variety of observations such as multi-satellite altimeter data, sea-surface temperature and in situ temperature and salinity profiles, focusing on high-resolution scales of the ocean dynamics. The assimilation strategy in MERCATOR is based on a hierarchy of methods of increasing sophistication, including optimal interpolation, Kalman filtering and variational methods, which are progressively deployed through the Système d’Assimilation MERCATOR (SAM) series. SAM-1 is based on a reduced-order optimal interpolation which can be operated using ‘altimetry-only’ or ‘multi-data’ set-ups; it relies on the concept of separability, assuming that the correlations can be separated into a product of horizontal and vertical contributions. The second release, SAM-2, is being developed to include new features from the singular evolutive extended Kalman (SEEK) filter, such as three-dimensional, multivariate error modes and adaptivity schemes. The third one, SAM-3, considers variational methods such as the incremental four-dimensional variational algorithm. Most operational forecasting systems evaluated during GODAE are based on least-squares statistical estimation assuming Gaussian errors. In the framework of the EU MERSEA (Marine EnviRonment and Security for the European Area) project, research is being conducted to prepare the next-generation operational ocean monitoring and forecasting systems. The research effort will explore nonlinear assimilation formulations to overcome limitations of the current systems.
This paper provides an overview of the developments conducted in MERSEA with the SEEK filter, the Ensemble Kalman filter and the sequential importance re-sampling filter.
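At the heart of the least-squares methods surveyed here is the Kalman analysis step, shown below in its scalar form: the analysis blends a background (model) estimate with an observation, weighted by their respective error variances. Optimal interpolation, the SEEK filter and variational schemes generalize this same weighting to high-dimensional states; the numbers used are illustrative.

```python
# Minimal scalar data-assimilation sketch. All values are illustrative.
xb = 14.0        # background (model) estimate, e.g. SST in degrees C
var_b = 1.0      # background error variance
yo = 15.0        # observation of the same quantity
var_o = 0.25     # observation error variance

# Kalman gain: weights the observation by relative confidence.
gain = var_b / (var_b + var_o)

xa = xb + gain * (yo - xb)       # analysis estimate
var_a = (1.0 - gain) * var_b     # analysis error variance (reduced)
```

With the background four times less certain than the observation, the gain is 0.8 and the analysis moves most of the way toward the observation while its variance drops below both input variances.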

Relevance: 40.00%

Abstract:

The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) is a World Weather Research Programme project. One of its main objectives is to enhance collaboration on the development of ensemble prediction between operational centers and universities by increasing the availability of ensemble prediction system (EPS) data for research. This study analyzes the prediction of Northern Hemisphere extratropical cyclones by nine different EPSs archived as part of the TIGGE project for the 6-month time period of 1 February 2008–31 July 2008, which included a sample of 774 cyclones. An objective feature tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast verification statistics have then been produced [using the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis as the truth] for cyclone position, intensity, and propagation speed, showing large differences between the different EPSs. The results show that the ECMWF ensemble mean and control have the highest level of skill for all cyclone properties. The Japan Meteorological Agency (JMA), the National Centers for Environmental Prediction (NCEP), the Met Office (UKMO), and the Canadian Meteorological Centre (CMC) have 1 day less skill for the position of cyclones throughout the forecast range. The relative performance of the different EPSs remains the same for cyclone intensity except for NCEP, which has larger errors than for position. NCEP, the Centro de Previsão de Tempo e Estudos Climáticos (CPTEC), and the Australian Bureau of Meteorology (BoM) all have faster intensity error growth in the earlier part of the forecast. They are also very underdispersive and significantly underpredict intensities, perhaps due to the comparatively low spatial resolutions of these EPSs not being able to accurately model the tilted structure essential to cyclone growth and decay.
There is very little difference between the levels of skill of the ensemble mean and control for cyclone position, but the ensemble mean provides an advantage over the control for all EPSs except CPTEC in cyclone intensity and there is an advantage for propagation speed for all EPSs. ECMWF and JMA have an excellent spread–skill relationship for cyclone position. The EPSs are all much more underdispersive for cyclone intensity and propagation speed than for position, with ECMWF and CMC performing best for intensity and CMC performing best for propagation speed. ECMWF is the only EPS to consistently overpredict cyclone intensity, although the bias is small. BoM, NCEP, UKMO, and CPTEC significantly underpredict intensity and, interestingly, all the EPSs underpredict the propagation speed, that is, the cyclones move too slowly on average in all EPSs.
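The spread-skill and bias diagnostics behind these conclusions can be sketched for a single toy case: an ensemble whose spread about its mean is much smaller than its ensemble-mean error is underdispersive, and a negative mean-minus-analysis difference indicates underprediction of intensity. All numbers below are invented.

```python
import statistics

# Toy cyclone-intensity verification (invented values, e.g. relative
# vorticity in 1e-5 s^-1): analysed value and five member forecasts.
analysed = 8.0
members = [6.1, 6.5, 6.3, 6.7, 6.4]

ens_mean = statistics.fmean(members)
spread = statistics.stdev(members)       # ensemble spread about its mean
mean_error = abs(ens_mean - analysed)    # ensemble-mean error
bias = ens_mean - analysed               # negative = underprediction

# A well-calibrated ensemble has spread comparable to its mean error;
# here the spread is far smaller, the underdispersive case in the text.
underdispersive = spread < mean_error
```

Operational verification averages these statistics over many cases and lead times; a single case like this one only illustrates the definitions.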

Relevance: 40.00%

Abstract:

The ECMWF ensemble weather forecasts are generated by perturbing the initial conditions of the forecast using a subset of the singular vectors of the linearised propagator. Previous results show that, when creating probabilistic forecasts from this ensemble, better forecasts are obtained if the mean of the spread and the variability of the spread are calibrated separately. We show results from a simple linear model that suggest that this may be a generic property of all singular-vector-based ensemble forecasting systems that use only a subset of the full set of singular vectors.
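A minimal sketch of a two-parameter calibration of the kind described, with separate coefficients for the mean level of the spread and for the variability of the spread about that mean, rather than a single multiplicative factor. The coefficient values and spreads below are invented; in practice both coefficients would be fitted against verification data.

```python
import statistics

# Raw ensemble spreads from successive forecasts (invented values).
raw_spread = [0.6, 0.9, 1.2, 1.5, 1.8]

alpha = 1.3   # corrects the mean level of the spread (hypothetical fit)
beta = 0.5    # damps the variability of the spread (hypothetical fit)

m = statistics.fmean(raw_spread)

# Calibrate the mean and the anomalies about the mean separately.
calibrated = [alpha * m + beta * (s - m) for s in raw_spread]
```

After calibration the mean spread equals `alpha * m` while the standard deviation of the spread is scaled by `beta`, so the two properties are adjusted independently.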

Relevance: 40.00%

Abstract:

The application of prediction theories has been widely practised for many years in industries such as manufacturing, defence and aerospace. Although these theories are not new, their application has not been widespread within the building services industry. Collectively, the building services industry should take a deeper look at these approaches in comparison with the traditional deterministic approaches currently being practised. By extending their application into this industry, this paper seeks to provide an overview of how simplified stochastic modelling, coupled with availability and reliability predictions using historical data compiled from various sources, could enhance the quality of building services systems.
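As an example of the kind of prediction being advocated, the sketch below computes steady-state availability from historical failure figures and checks it with a small Monte Carlo simulation of alternating up and down periods. The MTBF/MTTR values are invented, and the exponential assumption is a modelling choice, not a claim from the paper.

```python
import random

random.seed(1)

# Hypothetical historical figures for one building services system, in hours.
mtbf = 2000.0   # mean time between failures
mttr = 20.0     # mean time to repair

# Analytic steady-state availability.
availability = mtbf / (mtbf + mttr)

# Monte Carlo check: simulate alternating exponentially distributed
# operating and repair periods and measure the fraction of time up.
up = down = 0.0
for _ in range(20000):
    up += random.expovariate(1.0 / mtbf)
    down += random.expovariate(1.0 / mttr)
simulated = up / (up + down)
```

The simulated fraction converges on the analytic value (about 0.990 here); the value of the stochastic version is that it extends naturally to repair-time distributions and maintenance policies that have no closed form.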

Relevance: 40.00%

Abstract:

A fully automated procedure to extract and to image local fibre orientation in biological tissues from scanning X-ray diffraction is presented. The preferred chitin fibre orientation in the flow sensing system of crickets is determined with high spatial resolution by applying synchrotron radiation based X-ray microbeam diffraction in conjunction with advanced sample sectioning using a UV micro-laser. The data analysis is based on an automated detection of azimuthal diffraction maxima after 2D convolution filtering (smoothing) of the 2D diffraction patterns. Under the assumption of crystallographic fibre symmetry around the morphological fibre axis, the evaluation method allows mapping the three-dimensional orientation of the fibre axes in space. The resulting two-dimensional maps of the local fibre orientations - together with the complex shape of the flow sensing system - may be useful for a better understanding of the mechanical optimization of such tissues.
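The azimuthal peak-finding step can be sketched in one dimension: build a synthetic azimuthal intensity profile with the two maxima 180° apart expected for fibre symmetry, smooth it (the 1D analogue of the 2D convolution filtering used in the paper), and read off the preferred orientation. The profile shape, noise term and window size are all invented stand-ins for real diffraction data.

```python
import math

true_angle = 40  # degrees - the fibre orientation the analysis should recover

# Synthetic azimuthal intensity profile over 0..359 degrees: two broad
# diffraction maxima 180 degrees apart plus a small oscillatory "noise" term.
profile = [math.cos(math.radians(a - true_angle)) ** 6
           + 0.02 * math.sin(12.3 * a)
           for a in range(360)]

# Circular moving-average smoothing (the 2D analogue is the convolution
# filter applied to the full diffraction pattern).
w = 5
smoothed = [sum(profile[(a + k) % 360] for k in range(-w, w + 1)) / (2 * w + 1)
            for a in range(360)]

# Detect the azimuthal maximum; fibre symmetry makes maxima repeat every
# 180 degrees, so report the orientation modulo 180.
peak = max(range(360), key=smoothed.__getitem__)
orientation = peak % 180
```

The recovered orientation lands within a few degrees of the true angle; mapping this per scan point produces the kind of orientation image described in the abstract.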

Relevance: 40.00%

Abstract:

It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. ‘Emergent’ properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.

Relevance: 40.00%

Abstract:

Requirements analysis focuses on stakeholders' concerns and their influence on e-government systems. Some characteristics of stakeholders' concerns clearly show their complexity and the conflicts among them. This raises a number of questions for requirements analysis: how are the requirements relevant to stakeholders? What are their needs? How can conflicts among the different stakeholders be resolved? And what coherent requirements can be methodologically produced? This paper describes the problem articulation method in organizational semiotics, which can be used to conduct such complex requirements analysis. The outcomes of the analysis enable e-government systems development and management to meet users' needs. A case study of the Yantai Citizen Card is chosen to illustrate the process of analysing stakeholders in the lifecycle of requirements analysis.

Relevance: 40.00%

Abstract:

A neural-network-enhanced proportional, integral and derivative (PID) controller is presented that combines the attributes of neural network learning with a generalized minimum-variance self-tuning control (STC) strategy. The neuro-PID controller is structured around plant model identification and PID parameter tuning. The plants to be controlled are approximated by an equivalent model composed of a simple linear submodel, which approximates the plant dynamics around operating points, plus an error agent that accommodates the errors induced by the linear submodel's inaccuracy in the presence of non-linearities and other complexities. A generalized recursive least-squares algorithm is used to identify the linear submodel, and a layered neural network is used to model the error agent, with the weights updated on the basis of the error between the plant output and the output of the linear submodel. The controller design procedure is based on the equivalent model, so the error agent is naturally embedded within the control law. In this way the controller can deal not only with a wide range of linear dynamic plants but also with complex plants characterized by severe non-linearity, uncertainties and non-minimum-phase behaviours. Two simulation studies are provided to demonstrate the effectiveness of the controller design procedure.
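The identification of the linear submodel can be illustrated with a standard recursive least-squares (RLS) update on a simulated first-order plant y[t] = a·y[t-1] + b·u[t-1]; in the scheme described above, a neural error agent would absorb whatever such a linear model misses. The plant parameters and input signal are invented for the sketch.

```python
# True parameters of the simulated plant (to be recovered by RLS).
a_true, b_true = 0.8, 0.5

# Square-wave input and the resulting noise-free plant output.
u = [1.0 if (t // 20) % 2 == 0 else -1.0 for t in range(200)]
y = [0.0]
for t in range(1, 200):
    y.append(a_true * y[t - 1] + b_true * u[t - 1])

theta = [0.0, 0.0]                     # estimates of [a, b]
P = [[1000.0, 0.0], [0.0, 1000.0]]     # covariance; large = low confidence

for t in range(1, 200):
    phi = [y[t - 1], u[t - 1]]         # regressor vector
    # Gain k = P*phi / (1 + phi' * P * phi)
    Pphi = [P[0][0] * phi[0] + P[0][1] * phi[1],
            P[1][0] * phi[0] + P[1][1] * phi[1]]
    denom = 1.0 + phi[0] * Pphi[0] + phi[1] * Pphi[1]
    k = [Pphi[0] / denom, Pphi[1] / denom]
    # Update estimates with the prediction error.
    err = y[t] - (theta[0] * phi[0] + theta[1] * phi[1])
    theta = [theta[0] + k[0] * err, theta[1] + k[1] * err]
    # Covariance update: P = P - k * (P*phi)'
    P = [[P[i][j] - k[i] * Pphi[j] for j in range(2)] for i in range(2)]
```

With noise-free data and a persistently exciting input, the estimates converge to the true parameters; the residual left by a real non-linear plant is what the error agent is trained on.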