941 results for Chebyshev And Binomial Distributions


Relevance:

100.00%

Publisher:

Abstract:

The structure and hydration of HNP-3 have been derived from molecular dynamics data using root-mean-square deviations and radial and energy distributions. Three antiparallel beta-sheets were found to be preserved. Fifteen intramolecular hydrogen bonds were identified, together with 36 hydrogen bonds on the backbone and 35 on the side-chain atoms. From the point of view of the hydration dynamics, the analysis shows a high solvent accessibility of the monomer and attractive interactions with water molecules.
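
As a hedged illustration of the kind of trajectory analysis mentioned above, the sketch below computes a per-frame root-mean-square deviation with plain NumPy; the array shapes, toy data, and the omission of rotational superposition are assumptions for illustration, not the authors' actual pipeline.

```python
import numpy as np

def rmsd_trajectory(traj, ref):
    """Per-frame RMSD of (n_frames, n_atoms, 3) coordinates against a
    (n_atoms, 3) reference, after removing the centre-of-geometry offset.
    (No rotational superposition is done here; a full analysis would
    normally align each frame to the reference first.)"""
    ref = ref - ref.mean(axis=0)
    out = np.empty(len(traj))
    for i, frame in enumerate(traj):
        frame = frame - frame.mean(axis=0)
        diff = frame - ref
        out[i] = np.sqrt((diff ** 2).sum() / len(ref))
    return out

# Toy data: 100 frames of a 30-atom fragment fluctuating around a reference.
rng = np.random.default_rng(0)
reference = rng.normal(size=(30, 3))
trajectory = reference + 0.2 * rng.normal(size=(100, 30, 3))
print(rmsd_trajectory(trajectory, reference)[:5])
```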

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to simulate blood flow in the human thoracic aorta and to understand the role of flow dynamics in the initiation and localization of atherosclerotic plaque. Blood flow dynamics were numerically simulated in three idealized and two realistic models of the human thoracic aorta. The idealized models were reconstructed from measurements available in the literature, and the realistic models were constructed by processing Computed Tomography (CT) images made available by South Karelia Central Hospital in Lappeenranta. The reconstruction of the thoracic aorta consisted of operations such as contrast adjustment, image segmentation, and 3D surface rendering, and additional design operations were performed to make the aorta models compatible with the numerical computer codes. The image processing and design operations were performed with specialized medical image processing software. Pulsatile pressure and velocity profiles were applied as inlet boundary conditions, and the blood was assumed to be a homogeneous, incompressible, Newtonian fluid. The simulations with the idealized models were carried out with a Finite Element Method based computer code, while the simulations with the realistic models were carried out with a Finite Volume Method based computer code. Simulations were carried out for four cardiac cycles, and the distributions of flow, pressure, and Wall Shear Stress (WSS) observed during the fourth cycle were analyzed in detail. The aim of the simulations with the idealized models was to obtain an estimate of the flow dynamics in a realistic aorta model, and three aorta models with distinct features were chosen to understand the dependence of the flow dynamics on aortic anatomy. Highly disturbed and non-uniform distributions of velocity and WSS were observed in the aortic arch, near the brachiocephalic, left common carotid, and left subclavian arteries. The WSS profiles at the roots of the branches showed significant differences as the geometry of the aorta and its branches varied, and the comparison of instantaneous WSS profiles revealed that the model with straight branching arteries had relatively lower WSS than the model with curved branches. In addition, significant differences were observed in the spatial and temporal profiles of WSS, flow, and pressure. The study with the idealized models was extended to blood flow in the thoracic aorta under hypertension and hypotension: one of the idealized aorta models, together with its boundary conditions, was modified to mimic these conditions. The simulations with the realistic models extracted from CT scans demonstrated more realistic flow dynamics than the idealized models. During systole, the velocity in the ascending aorta was skewed towards the outer wall of the aortic arch, and the flow developed secondary flow patterns as it moved downstream towards the arch. Unlike in the idealized models, the flow distribution was non-planar and heavily guided by the arterial anatomy. Flow cavitation was observed in the aorta model imaged with longer branches; it could not be properly observed in the model whose imaging covered only a shorter length of the aortic branches.
Flow recirculation was also observed along the inner wall of the aortic arch. During diastole, however, the flow profiles were almost flat and regular due to the acceleration of the flow at the inlet, and the flow was weakly turbulent during flow reversal. The complex flow patterns caused a non-uniform distribution of WSS: high WSS occurred at the junctions of the branches and the aortic arch, low WSS at the proximal part of each junction, and intermediate WSS at the distal part. The pulsatile nature of the inflow caused oscillating WSS at the branch entry regions and along the inner curvature of the aortic arch. Based on the WSS distribution in the realistic model, one of the aorta models was altered to introduce artificial atherosclerotic plaque at the branch entry regions and the inner curvature of the aortic arch. Atherosclerotic plaque causing 50% blockage of the lumen was introduced in the brachiocephalic artery, the common carotid artery, the left subclavian artery, and the aortic arch. The aim of this part of the study was, first, to study the effect of stenosis on the flow and WSS distributions; second, to understand the effect of the shape of the atherosclerotic plaque; and finally, to investigate the effect of the severity of the lumen blockage. The results revealed that the distribution of WSS is significantly affected by plaque causing as little as 50% stenosis, and that an asymmetric stenosis causes higher WSS in the branching arteries than a symmetric plaque. The flow dynamics within the thoracic aorta models have thus been extensively studied and reported here, and the effects of pressure and arterial anatomy on the flow dynamics were investigated. The distribution of complex flow and WSS correlates with the localization of atherosclerosis. With the available results we can conclude that the thoracic aorta, with its complex anatomy, is the artery most vulnerable to the localization and development of atherosclerosis, and that flow dynamics and arterial anatomy play a role in this localization. Patient-specific, image-based models can be used to identify the locations in the aorta vulnerable to the development of arterial diseases such as atherosclerosis.
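
As a rough, hedged sketch of how quantities such as oscillating WSS can be post-processed from a pulsatile simulation, the snippet below estimates a Newtonian wall shear stress from the wall-normal velocity gradient and computes the oscillatory shear index (OSI), a standard derived metric not explicitly named in the abstract. The waveform, viscosity value, and grid spacing are illustrative assumptions, not data from the study.

```python
import numpy as np

MU = 3.5e-3          # dynamic viscosity of blood, Pa*s (Newtonian assumption)
DY = 1.0e-4          # wall-normal distance of the first grid point, m (assumed)

def wall_shear_stress(u_near_wall):
    """Newtonian WSS from a one-sided finite difference: tau_w = mu * du/dy."""
    return MU * u_near_wall / DY

def oscillatory_shear_index(tau, dt):
    """OSI = 0.5 * (1 - |integral(tau)| / integral(|tau|)) over one cycle."""
    num = abs(np.trapz(tau, dx=dt))
    den = np.trapz(np.abs(tau), dx=dt)
    return 0.5 * (1.0 - num / den)

# Illustrative pulsatile near-wall velocity over one 0.8 s cardiac cycle,
# including a period of flow reversal (negative velocity) in diastole.
t = np.linspace(0.0, 0.8, 400)
u = 0.3 * np.sin(2 * np.pi * t / 0.8) + 0.05   # m/s, assumed waveform

tau = wall_shear_stress(u)
print("time-averaged WSS [Pa]:", np.trapz(tau, t) / 0.8)
print("OSI:", oscillatory_shear_index(tau, t[1] - t[0]))
```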

Relevance:

100.00%

Publisher:

Abstract:

This paper aims to describe the osmotic dehydration of radish cut into cylindrical pieces, using one- and two-dimensional analytical solutions of the diffusion equation with boundary conditions of the first and third kind. These solutions were coupled with an optimizer to determine the process parameters from experimental data. Three models were proposed to describe the osmotic dehydration of radish slices in brine at low temperature. The two-dimensional model with a boundary condition of the third kind described the kinetics of mass transfer well, and it enabled prediction of the moisture and solid distributions at any given time.
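
For readers unfamiliar with the analytical solutions mentioned above, the sketch below evaluates the classical one-dimensional series solution (Crank-type) for the average moisture ratio of an infinite cylinder with a boundary condition of the third kind. The diffusivity, Biot number, and radius are placeholder values, not the parameters fitted in the paper, and the one-dimensional form is only a simplified stand-in for the paper's one- and two-dimensional models.

```python
import numpy as np
from scipy.special import j0, j1
from scipy.optimize import brentq

def cylinder_eigenvalues(bi, n_roots=20):
    """First roots of mu*J1(mu) = Bi*J0(mu) (third-kind boundary condition)."""
    f = lambda mu: mu * j1(mu) - bi * j0(mu)
    roots, mu, step = [], 1e-6, 0.05
    while len(roots) < n_roots:
        if f(mu) * f(mu + step) < 0:
            roots.append(brentq(f, mu, mu + step))
        mu += step
    return np.array(roots)

def mean_moisture_ratio(t, D, R, bi, n_roots=20):
    """Average dimensionless moisture ratio of an infinite cylinder."""
    mu = cylinder_eigenvalues(bi, n_roots)
    coef = 4.0 * bi**2 / (mu**2 * (mu**2 + bi**2))
    fo = D * np.asarray(t, dtype=float) / R**2          # Fourier number
    return np.sum(coef[:, None] * np.exp(-mu[:, None]**2 * fo[None, :]), axis=0)

# Placeholder parameters (NOT the fitted values from the paper).
D, R, Bi = 1.0e-9, 5.0e-3, 2.0                # m^2/s, m, dimensionless
times = np.array([0.0, 600.0, 1800.0, 3600.0, 7200.0])   # s
print(mean_moisture_ratio(times, D, R, Bi))
```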

Relevance:

100.00%

Publisher:

Abstract:

This thesis addresses the use of covariant phase space observables in quantum tomography. Necessary and sufficient conditions for the informational completeness of covariant phase space observables are proved, and some state reconstruction formulae are derived. Different schemes for measuring phase space observables are considered. Special emphasis is given to the quantum optical eight-port homodyne detection scheme and, in particular, to the effect of non-unit detector efficiencies on the measured observable; it is shown that the informational completeness of the observable does not depend on the efficiencies. As a related problem, the possibility of reconstructing the position and momentum distributions from the marginal statistics of a phase space observable is considered. It is shown that informational completeness of the phase space observable is neither necessary nor sufficient for this procedure, and two methods for determining the distributions from the marginal statistics are presented. Finally, two alternative methods for determining the state are considered, and some of their shortcomings compared to the phase space method are discussed.
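
To make the objects under discussion concrete, the display below recalls one formulation of covariant phase space observables and of an informational-completeness criterion that is common in the literature; the normalization and sign conventions here are assumptions and may differ from those used in the thesis.

```latex
% One common formulation (conventions may differ from the thesis): for a
% positive operator T with \operatorname{tr}[T]=1, the covariant phase space
% observable generated by T is
\[
  \mathsf{G}^{T}(Z) \;=\; \frac{1}{2\pi}\int_{Z} W(q,p)\, T\, W(q,p)^{*}\,
  \mathrm{d}q\,\mathrm{d}p ,
  \qquad W(q,p) = e^{\,i(pQ - qP)} ,
\]
% and a frequently cited criterion states that \(\mathsf{G}^{T}\) is
% informationally complete if and only if the Weyl transform
% \(\widehat{T}(q,p)=\operatorname{tr}[T\,W(q,p)]\) is nonzero for
% (Lebesgue-)almost all \((q,p)\).
```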

Relevance:

100.00%

Publisher:

Abstract:

The ageing of water and wastewater networks and the deterioration of their condition are problems for most water utilities. Renovations left undone accumulate into a renovation debt, and dealing with it requires effective measures. Efficient targeting of renovations is important because the available financial and operational resources are limited. The aim of this work was to develop a calculation model with which the poor-condition areas of a water utility network can be ranked into a renovation order. The research methods were a literature review for general water supply and sewerage information and a case study for the commissioning utility's network data. The factors most significant for the ranking were selected for the model; the selection was based on how well each factor could be valued and computed. Age and material coefficients for the pipe materials were determined from the research data, and pipe maintenance records are also taken into account in the calculation. As the outcome of the study, a calculation model was developed that meets the stated objective of the work. The calculation yields a numerical score based on the age and material distributions of the networks and on maintenance records; the highest score corresponds to the sites most urgently in need of renovation.
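
The abstract does not publish the model's coefficients, so the sketch below only illustrates the general shape of such a scoring calculation: a hypothetical priority score combining an age coefficient, a material coefficient, and maintenance records for each network area. All names, weights, and thresholds are invented for illustration and are not the thesis's values.

```python
from dataclasses import dataclass

# Hypothetical coefficients; the thesis derives its own age and material
# coefficients from the utility's network data.
MATERIAL_COEF = {"cast iron": 1.4, "PVC": 1.0, "PE": 0.9, "concrete": 1.3}

@dataclass
class PipeArea:
    name: str
    age_years: float
    material: str
    failures_per_km: float     # maintenance record, e.g. bursts or blockages

def priority_score(area: PipeArea) -> float:
    """Larger score = more urgent renovation (illustrative formula only)."""
    age_coef = min(area.age_years / 50.0, 2.0)          # saturating age term
    mat_coef = MATERIAL_COEF.get(area.material, 1.0)
    maint_coef = 1.0 + 0.5 * area.failures_per_km
    return age_coef * mat_coef * maint_coef

areas = [
    PipeArea("district A", 62, "cast iron", 1.8),
    PipeArea("district B", 25, "PVC", 0.2),
]
for a in sorted(areas, key=priority_score, reverse=True):
    print(a.name, round(priority_score(a), 2))
```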

Relevance:

100.00%

Publisher:

Abstract:

Waste has been incinerated for energy utilization for more than a hundred years, but the harmful emissions from incineration plants did not begin to cause concern until the 1980s. Many plants were shut down, and the waste incineration plant in Kyläsaari, Helsinki, was one of them. In later years, new landfill regulations have increased the interest in waste incineration. During the last year, four new plants were taken into operation in Finland, Westenergy in Vaasa among them. The presence of dust has been observed indoors at the Westenergy waste incineration plant. Dust is defined as particles with a diameter above 10 μm, while fine particles have a diameter smaller than 2.5 μm, ultrafine particles smaller than 0.1 μm, and nanoparticles smaller than 0.05 μm. In recent years, the focus of particle health research has shifted to smaller particles: ultrafine particles have been found to be more detrimental to health than larger particles, yet limit values regulating their concentrations have not been determined. The objective of this thesis was to investigate the dust and particles present inside the Westenergy waste incineration facility, to identify potential pollutant sources, and to give recommendations on how to minimize the presence of dust and particles in the power plant. Total particle number concentrations and size distributions were measured at 15 points inside the plant with an Engine Exhaust Particle Sizer (EEPS) spectrometer; the measured particles were mainly in the ultrafine size range. Dust was investigated only visually, since the main purpose was to follow dust accumulation. The measurement points inside the incineration plant were chosen to assess the exposure of visitors and workers, and at some points probable leakage of emissions was investigated. The measurements were carried out over approximately one month in March–April 2013. The results showed that elevated levels of dust and particles are present in the indoor air of the waste incineration plant. The cleanest air was found in the control room, warehouse, and office. The most polluted air was found near the sources investigated for possible leakage and in the bottom ash hall. However, the concentrations were close to background concentrations measured in European cities, and no leakage could be detected. The high concentrations were assumed to result from dust and particles accumulated on surfaces that had not been cleaned for some time. The main source of the dust and particles inside the waste incineration plant was thought to be particles and dust from the outside air; other activities in the surrounding area, such as groundwork, stone crushing, and traffic, are probable sources of particle formation. Filtering the outside air before it enters the facility would probably spare personnel and visitors from nuisance and save cleaning and maintenance costs.
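
The size classes quoted above translate directly into a small helper; the thresholds below are the ones stated in the abstract, while the function itself is only an illustrative convenience, not part of the thesis.

```python
def particle_class(diameter_um: float) -> str:
    """Classify a particle by diameter using the thresholds given above."""
    if diameter_um > 10.0:
        return "dust (> 10 um)"
    if diameter_um < 0.05:
        return "nanoparticle (< 0.05 um)"
    if diameter_um < 0.1:
        return "ultrafine (< 0.1 um)"
    if diameter_um < 2.5:
        return "fine (< 2.5 um)"
    return "2.5-10 um (size class not named in the abstract)"

for d in (15.0, 1.0, 0.08, 0.02):
    print(d, "um ->", particle_class(d))
```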

Relevance:

100.00%

Publisher:

Abstract:

Modern research on literacy, on the scripturalization of administration, and on the interaction between the governing and the governed as part of the political culture has brought to the fore the issue of different kinds of writing and their terms. The present dissertation focuses on the parish scribes in the county of Ostrobothnia during the period 1721–1868. The peasantry had been given the right to pay for parish scribes in 1624. The parish scribes, who were to assist the peasants in connection with the collection of taxes, simultaneously supervised the bailiffs who collected the taxes. Their writing skills made the scribes indispensable in many other contexts as well. In Ostrobothnia, the peasantry had use for parish scribes, who worked as mediators between Swedish and Finnish and between the oral and the written, in both directions. The aim of this dissertation is, on the one hand, to explore the recruitment of parish scribes and, on the other, to examine the parish scribes as a professional and social group. The parish scribes' significance for the peasantry in everyday life, in local decision-making, and in connection with political processes is analyzed by examining their work and professional activities. The recruitment of parish scribes is analyzed as a decision process in which different actors were able to influence the election, and the parish scribes' competence requirements and terms of employment are examined. The parish scribes as a professional body or a social group have not previously been explored; here, the 154 parish scribes are examined as a professional and social group in the form of a collective biography. The parish scribes' tasks originally consisted of the collection of taxes, but their duties within the parish administration increased in the eighteenth and nineteenth centuries. The private writing assignments covered many different documents: bills of sale, probate inventories and estate distributions, wills, land tenancy agreements, life annuity and crofter agreements, promissory notes, auction records, and various survey documents. The interaction with state power is analyzed by examining five political decision-making processes in which the peasants actively participated.

Relevance:

100.00%

Publisher:

Abstract:

Second-rank tensor interactions, such as quadrupolar interactions between the spin-1 deuterium nuclei and the electric field gradients created by chemical bonds, are affected by rapid random molecular motions that modulate the orientation of the molecule with respect to the external magnetic field. In biological and model membrane systems, where a distribution of dynamically averaged anisotropies (quadrupolar splittings, chemical shift anisotropies, etc.) is present and where, in addition, various parts of the sample may undergo a partial magnetic alignment, the numerical analysis of the resulting Nuclear Magnetic Resonance (NMR) spectra is a mathematically ill-posed problem. However, numerical methods (de-Pakeing, Tikhonov regularization) exist that allow for a simultaneous determination of both the anisotropy and orientational distributions. An additional complication arises when relaxation is taken into account. This work presents a method of obtaining the orientation dependence of the relaxation rates that can be used for the analysis of the molecular motions on a broad range of time scales. An arbitrary set of exponential decay rates is described by a three-term truncated Legendre polynomial expansion in the orientation dependence, as appropriate for a second-rank tensor interaction, and a linear approximation to the individual decay rates is made. Thus a severe numerical instability caused by the presence of noise in the experimental data is avoided. At the same time, enough flexibility in the inversion algorithm is retained to achieve a meaningful mapping from raw experimental data to a set of intermediate, model-free
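
As a hedged illustration of the regularized inversion mentioned above, the sketch below solves a generic ill-posed linear problem Ax ≈ b by Tikhonov regularization via an augmented least-squares system; it is a generic numerical recipe, not the thesis's actual de-Pakeing or relaxation-mapping code.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Minimize ||A x - b||^2 + lam^2 ||x||^2 via an augmented least squares."""
    n = A.shape[1]
    A_aug = np.vstack([A, lam * np.eye(n)])
    b_aug = np.concatenate([b, np.zeros(n)])
    x, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
    return x

# Ill-conditioned toy problem: smooth kernel, noisy data.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 80)
s = np.linspace(0, 1, 60)
A = np.exp(-30.0 * (t[:, None] - s[None, :]) ** 2)    # smoothing kernel
x_true = np.exp(-0.5 * ((s - 0.4) / 0.05) ** 2)
b = A @ x_true + 0.01 * rng.normal(size=t.size)

for lam in (0.0, 1e-3, 1e-1):
    x = tikhonov_solve(A, b, lam)
    print(f"lambda={lam:g}  reconstruction error={np.linalg.norm(x - x_true):.3f}")
```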

Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis is to price options on equity index futures with an application to standard options on S&P 500 futures traded on the Chicago Mercantile Exchange. Our methodology is based on stochastic dynamic programming, which can accommodate European as well as American options. The model accommodates dividends from the underlying asset. It also captures the optimal exercise strategy and the fair value of the option. This approach is an alternative to available numerical pricing methods such as binomial trees, finite differences, and ad-hoc numerical approximation techniques. Our numerical and empirical investigations demonstrate convergence, robustness, and efficiency. We use this methodology to value exchange-listed options. The European option premiums thus obtained are compared to Black's closed-form formula. They are accurate to four digits. The American option premiums also have a similar level of accuracy compared to premiums obtained using finite differences and binomial trees with a large number of time steps. The proposed model accounts for deterministic, seasonally varying dividend yield. In pricing futures options, we discover that what matters is the sum of the dividend yields over the life of the futures contract and not their distribution.
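
For reference, the closed-form Black (1976) formula against which the European premiums are compared can be sketched as follows; the parameter values in the example are arbitrary and the code is illustrative, not the authors' implementation.

```python
from math import log, sqrt, exp
from scipy.stats import norm

def black76(F, K, T, r, sigma, call=True):
    """Black (1976) price of a European option on a futures contract."""
    d1 = (log(F / K) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    if call:
        return exp(-r * T) * (F * norm.cdf(d1) - K * norm.cdf(d2))
    return exp(-r * T) * (K * norm.cdf(-d2) - F * norm.cdf(-d1))

# Arbitrary illustrative inputs: futures at 1000, strike 1010, 3 months to
# expiry, 5% risk-free rate, 20% volatility.
print(black76(F=1000.0, K=1010.0, T=0.25, r=0.05, sigma=0.20))
```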

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, I show that the probability distribution of the Greenberger-Horne-Zeilinger (GHZ) quantum state under local, independent von Neumann measurements on each qubit is a convex combination of two distributions. The coefficients of the combination are related to the equatorial parts of the measurements, and the distributions associated with these coefficients are related to the real parts of the measurements. One possible application of this result is that it allows the simulation of the GHZ state to be split into two parts. Simulating, in the worst case or on average, a quantum state such as GHZ with random resources, shared or private, and classical communication resources, or even with exotic resources such as nonlocal boxes, is an important problem in quantum communication complexity. This simulation problem can be thought of as one in which several parties each receive a von Neumann measurement to apply to the subsystem of the GHZ state that they share with the other parties. Each party knows only the data describing its own measurement, and in no way does any party know the data describing another party's measurement. Each party obtains a classical random outcome, and the joint distribution of these classical random outcomes follows the probability distribution derived in this thesis. The goal is to simulate the probability distribution of the GHZ state classically. My result suggests a procedure that consists of first simulating the equatorial parts of the measurements in order to then know which of the distributions associated with the real parts of the measurements must be simulated. Other researchers have found how to simulate the equatorial parts of von Neumann measurements with classical communication in the three-party case, but the simulation of the real parts remains an open problem.
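
To make the setting concrete, the short NumPy sketch below computes the exact joint outcome distribution of the three-qubit GHZ state under independent local von Neumann measurements specified by Bloch vectors. It reproduces the quantum prediction directly and is not the classical simulation protocol discussed in the thesis; the example measurement directions are arbitrary.

```python
import numpy as np
from itertools import product

I2 = np.eye(2, dtype=complex)
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def projector(n, outcome):
    """Projector onto outcome +/-1 of the von Neumann measurement n . sigma."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    return 0.5 * (I2 + outcome * (n[0] * SX + n[1] * SY + n[2] * SZ))

def ghz_distribution(n1, n2, n3):
    """Joint distribution of the 8 outcome triples on |GHZ> = (|000>+|111>)/sqrt(2)."""
    ghz = np.zeros(8, dtype=complex)
    ghz[0] = ghz[7] = 1 / np.sqrt(2)
    dist = {}
    for a, b, c in product((+1, -1), repeat=3):
        P = np.kron(np.kron(projector(n1, a), projector(n2, b)), projector(n3, c))
        dist[(a, b, c)] = float(np.real(ghz.conj() @ P @ ghz))
    return dist

# Example: each party measures in the equatorial (x-y) plane of the Bloch sphere.
d = ghz_distribution((1, 0, 0), (np.cos(0.3), np.sin(0.3), 0), (0, 1, 0))
print(sum(d.values()))    # probabilities sum to 1
for k, v in d.items():
    print(k, round(v, 4))
```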

Relevance:

100.00%

Publisher:

Abstract:

Weight-bearing (WB) asymmetry during the sit-to-stand (STS) transfer in individuals with hemiparesis is a well-known but poorly explained clinical observation. This project therefore aimed to develop knowledge about the factors explaining WB asymmetry in this population, focusing more specifically on the link between the distribution of efforts at the knees during the STS and the observed WB asymmetry, as well as on the perception of these two elements during the task. The general objectives were: 1) to determine whether the spontaneous asymmetric execution of the STS by hemiparetic subjects is explained by a symmetric distribution of efforts at the knees, quantifying these efforts with the electromyographic muscle utilization ratio (TUMEMG), and 2) to determine whether hemiparetic individuals are aware of the motor strategies they use, by assessing their perceptions of WB and of knee efforts during the STS. The first study assessed the ability of hemiparetic individuals to perceive their WB distribution at the lower limbs during the STS. Compared with healthy participants, their WB distribution was more asymmetric and their perception errors were larger. The second study quantified the distribution of knee efforts in healthy and hemiparetic subjects during the spontaneous STS. Both groups showed an association between their WB distribution and their effort distribution; however, the relationship was weaker in the patients. Classifying the hemiparetic participants into subgroups according to the degree of asymmetry of their maximal knee-extensor strength (low, moderate, severe) revealed similar efforts at the paretic and non-paretic knees in the severely impaired group. The third study determined whether the hemiparetic subjects' perception of the distribution of knee efforts was related to their actual effort distribution measured during STS transfers executed with different foot positions. In addition to being unable to perceive the changes in effort distribution induced by the different foot positions, their errors in perceiving effort were larger than their errors in perceiving WB. Using the five-repetition sit-to-stand functional test, the last study determined the influence of the number of STS repetitions on the WB and knee-effort distributions in healthy and hemiparetic subjects. Unlike the controls, the hemiparetic subjects' distributions were more asymmetric at the first repetition of the functional test than during a single spontaneous STS. In sum, the results of this thesis showed that the distribution of efforts at the knees must be considered among the factors explaining the WB asymmetry of hemiparetic individuals during the STS, and that there is a need to better document the perception of hemiparetic individuals as they perform functional tasks.

Relevance:

100.00%

Publisher:

Abstract:

In step with the rapid shrinking of feature sizes in microfabrication, physical processes that are negligible at larger scales become dominant as feature sizes approach the nanometre scale. Identifying and better understanding these processes is essential for improving process control and continuing the downscaling of electronic components. A two-dimensional, feature-scale cellular simulator based on Monte Carlo methods was developed to study profile evolution during microfabrication processes. The etch domain is discretized into square cells representing the initial geometry of the mask-substrate system. Neutral and ionic particles are injected at the boundary of the simulation domain, taking into account the respective energy and angular distribution functions of each species. Particles are transported to the surface, accounting for the reflection probabilities of energetic ions on the walls and for the re-emission of neutral particles. The particle-surface interaction model covers the different dry-etching mechanisms, such as sputtering, reactive chemical etching, and reactive ion etching. The transport of etch products is taken into account, as well as the deposition leading to the growth of a thin film. The validity of the simulator is verified by comparing simulated profiles with experimental observations from sputter etching of platinum with an argon plasma source.
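
As a loose, hedged illustration of the cell-based Monte Carlo approach described above, the toy sketch below etches a 2D grid of cells through a mask opening using straight-line particle trajectories sampled from a simple angular distribution and a fixed etch probability. The real simulator's distribution functions, ion reflection, neutral re-emission, product transport, and deposition are far richer than this; every parameter here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(42)

NX, NY = 60, 40
EMPTY, MASK, SUBSTRATE = 0, 1, 2
ETCH_PROB = 0.6                 # assumed reaction probability per impact

# Initial geometry: substrate topped by a mask with an opening in the middle.
grid = np.full((NY, NX), EMPTY, dtype=int)
grid[25:, :] = SUBSTRATE
grid[23:25, :] = MASK
grid[23:25, 24:36] = EMPTY      # mask opening

def launch_particle():
    """Trace one particle from the top with a narrow angular spread (ion-like)."""
    x = rng.uniform(0, NX)
    theta = rng.normal(0.0, 0.08)           # radians from vertical (assumed)
    dx, dy = np.sin(theta), np.cos(theta)
    y = 0.0
    while 0.0 <= x < NX and y < NY:
        i, j = int(y), int(x)
        if grid[i, j] == SUBSTRATE:
            if rng.random() < ETCH_PROB:
                grid[i, j] = EMPTY          # cell removed by the impact
            return                          # particle absorbed either way
        if grid[i, j] == MASK:
            return                          # mask assumed non-erodible here
        x += dx * 0.5
        y += dy * 0.5

for _ in range(1000):
    launch_particle()

# Crude text rendering of the etched profile.
chars = {EMPTY: ".", MASK: "#", SUBSTRATE: "O"}
for row in grid[20:35]:
    print("".join(chars[c] for c in row))
```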

Relevance:

100.00%

Publisher:

Abstract:

The content of this thesis is organized as follows. After an introductory first chapter, Chapter 2 is devoted to introducing, as simply as possible, some of the theories used in the first two articles. We first discuss the key points in the construction of the stochastic integral with respect to semimartingales with a spatial parameter. We then describe the main results of risk-neutral valuation theory and, finally, give a brief description of an optimization method known as duality. Chapters 3 and 4 deal with modelling illiquidity and constitute two articles. The first proposes a continuous-time model for the structure and behaviour of the limit order book. The behaviour of the portfolio of an investor using market orders is derived, and conditions eliminating arbitrage opportunities are given. Thanks to the generalized Itô formula, the portfolio value can also be written as a stochastic differential equation. A complete example of a market model is presented, together with a calibration method. In the second article, written in collaboration with Bruno Rémillard, we propose a similar model, this time in discrete time. The pricing of derivatives is studied, and explicit solutions for the prices of European put and call options are given. Conditions specific to this model that eliminate arbitrage are also given. Using the duality method, we show that European option prices can also be written as the optimization of an expectation over a set of probability measures. Chapter 5 contains the third article of the thesis and deals with a different subject. In this article, also written in collaboration with Bruno Rémillard, we propose a time-series forecasting method based on multivariate copulas. To better understand the performance gain offered by this method, we study, through numerical experiments, the effect of the strength and structure of dependence on the forecasts. Since copulas make it possible to isolate the dependence structure from the marginal distributions, we study the impact of different marginal distributions on forecasting performance. Finally, we also study the effect of estimation errors on forecasting performance. In all cases, we compare forecasting performance using forecasts from a bivariate series and from a univariate series, which illustrates the advantage of this method. For more practical interest, we present a complete application to financial data.
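
As a small, hedged illustration of the point that copulas separate the dependence structure from the marginals, the snippet below samples from a bivariate Gaussian copula and attaches two different marginal distributions to the same dependence structure. It is a generic textbook construction, not the forecasting method of the article.

```python
import numpy as np
from scipy import stats

def gaussian_copula_sample(n, rho, rng):
    """Uniform pairs (u, v) with a Gaussian copula of correlation rho."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
    return stats.norm.cdf(z)             # probability-integral transform

rng = np.random.default_rng(7)
u = gaussian_copula_sample(20000, rho=0.7, rng=rng)

# Same dependence structure, two different choices of marginals.
x_normal = stats.norm.ppf(u[:, 0], loc=0.0, scale=1.0)
y_student = stats.t.ppf(u[:, 1], df=4)

print("rank (Spearman) correlation:", stats.spearmanr(x_normal, y_student)[0])
```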

Relevance:

100.00%

Publisher:

Abstract:

Industrial pollutants, consisting of heavy metals, petroleum residues, petrochemicals, and a wide spectrum of pesticides, enter the marine environment on a massive scale and pose a very serious threat to all forms of aquatic life. Although earlier efforts were directed towards the identification of pollutants and their major sources, a growing apprehension about the potential harm that pesticides can inflict upon aquatic fauna and flora has caused research on fundamental and applied aspects of pesticides in the aquatic environment to mushroom to a point where it has become difficult even to keep track of the current advances and developments. The Cochin Estuarine System (CES), adjoining the Greater Cochin area, receives considerable amounts of domestic sewage, urban wastes, and agricultural runoff, as well as effluent from the industrial units spread all along its shores. Since preliminary investigations revealed that the most prominent organic pollutants discharged into these estuarine waters were pesticides, the present study was designed to analyse the temporal and spatial distribution profiles of some of the more toxic, persistent pesticides: organochlorines such as DDT and its metabolites; HCH isomers; a cyclodiene compound, Endosulfan; and a widely distributed, easily degradable organophosphorus compound, Malathion; and to investigate their sorptional and toxicological characteristics. Although there were indications of widespread contamination of various regions of the CES with DDT, HCH isomers, and related compounds, the causative factors could not be identified reliably because of inadequacies in the monitoring programmes and a glaring lack of baseline data. Therefore, the seasonal and spatial distributions of some of the more commonly used pesticides in the CES were monitored systematically, employing gas chromatographic techniques, and the results are analysed.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, the residual Kullback–Leibler discrimination information measure is extended to conditionally specified models. The extension is used to characterize some bivariate distributions. These distributions are also characterized in terms of proportional hazard rate models and weighted distributions. Moreover, we also obtain some bounds for this dynamic discrimination function by using the likelihood ratio order and some preceding results.
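
For readers who want the object in front of them, the residual Kullback–Leibler discrimination measure is commonly written as below; this is the standard form from the literature and may differ in minor conventions from the paper.

```latex
% Residual (dynamic) Kullback--Leibler discrimination between lifetimes X and Y
% with densities f, g and survival functions \bar F, \bar G, for a unit
% surviving up to time t:
\[
  I_{X,Y}(t) \;=\; \int_{t}^{\infty}
     \frac{f(x)}{\bar F(t)}\,
     \log\!\left(
        \frac{f(x)/\bar F(t)}{\,g(x)/\bar G(t)\,}
     \right) \mathrm{d}x .
\]
```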