995 results for gradually truncated log-normal
Abstract:
Aging individuals must cope with chronic pain on a daily basis. The aim of this work is to better understand the underlying mechanisms that may contribute to age-related chronic pain and thereby open a path toward new therapeutic perspectives. Diffuse noxious inhibitory controls (DNIC) play a significant role in pain control. Experimental studies examining the analgesic effect of heterotopic noxious counter-stimulation (HNCS), a protocol used to test the efficacy of DNIC, suggest that DNIC recruitment in this population is weaker (i.e., less inhibition) than in a younger population. In contrast, studies examining central sensitization induced by temporal summation (TS) of pain report mixed results. Moreover, an important component influencing the pain experience, cognitive resources, including cognitive inhibition, also declines with age. First, DNIC recruitment was compared between healthy young and older participants using HNCS, and the recruitment of central sensitization mechanisms was compared using TS. Electrical stimulation of the sural nerve was chosen to quantify pain while also providing an index of spinal nociception, the nociceptive spinal reflex (RIII). The subjects also performed a cognitive task (the Stroop) testing cognitive inhibition. Second, the efficacy of DNIC and of cognitive inhibition was tested in young and older adults with magnetic resonance imaging (MRI), in order to examine the relationship between these two psychophysical measures and the cortical thickness of the regions involved, as well as the effect of age on these regions. The results suggest weaker DNIC recruitment in older adults during HNCS. Older subjects also showed weaker cognitive inhibition capacities than younger subjects. In addition, a correlation between cognitive inhibition and the modulation of the RIII reflex by HNCS was found. For the TS experiment, results were comparable across the two groups, suggesting that the mechanisms involved in pain regulation are not affected by age in the same way. The cortical thickness study showed an overall age-related decrease in cortical thickness, as well as a correlation between HNCS analgesia and cognitive inhibition, and a relationship of both with the cortical thickness of the left lateral orbitofrontal cortex (OFC), suggesting the possible existence of an at least partially shared neural network for descending sensory and cognitive inhibitory control. This work shows that the effect of age on the central mechanisms of pain regulation is far from uniform. It also shows a correlation between endogenous pain modulation and cognitive inhibition, two processes that appear to be associated with the same brain region. These findings could help identify other therapeutic methods, opening a new avenue toward additional options for the management of chronic pain in aging individuals.
Abstract:
The present study focuses attention on defining certain measures of income inequality for truncated distributions and on the characterization of probability distributions using the functional form of these measures, the extension of some measures of inequality and stability to higher dimensions, the characterization of bivariate models using the above concepts, and the estimation of some measures of inequality using Bayesian techniques. The thesis defines certain measures of income inequality for truncated distributions and studies the effect of truncation upon these measures. An important measure used in reliability theory to measure the stability of a component is the residual entropy function. This concept can be used advantageously as a measure of inequality of truncated distributions. The geometric mean is a handy tool in the measurement of income inequality. The geometric vitality function, being the geometric mean of the truncated random variable, can likewise be used to measure inequality of truncated distributions. The study also addresses the problem of estimating the Lorenz curve, the Gini index and the variance of logarithms for the Pareto distribution using Bayesian techniques.
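For orientation, the Lorenz curve and Gini index referred to above admit closed forms in the classical (untruncated) Pareto case; the notation below (shape parameter \(\alpha\), minimum \(x_m\)) is assumed here rather than taken from the thesis:

\[
L(p) = \frac{\int_0^{p} F^{-1}(u)\,du}{\int_0^{1} F^{-1}(u)\,du}, \qquad
G = 1 - 2\int_0^1 L(p)\,dp ,
\]

and for the Pareto distribution with survival function \(\bar F(x) = (x_m/x)^{\alpha}\), \(x \ge x_m\), \(\alpha > 1\),

\[
L(p) = 1 - (1-p)^{1-1/\alpha}, \qquad G = \frac{1}{2\alpha - 1}.
\]

Truncation changes these expressions, which is precisely the effect the thesis studies.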
Abstract:
The present work is an attempt to understand the effect of high energy ball milling on the structural, electrical and magnetic properties of some normal spinels in the ultrafine regime. Magnetism and magnetic materials have been a fascinating subject for mankind ever since the discovery of lodestone. Since then, man has been applying the principle of magnetism to build devices for various applications. Magnetism can be classified broadly into five categories: diamagnetic, paramagnetic, ferromagnetic, antiferromagnetic and ferrimagnetic. Of these, ferro- and ferrimagnetic materials assume great commercial importance due to their unique properties, such as appropriate magnetic characteristics, high resistivity and low eddy current losses. The emergence of nanoscience and nanotechnology during the last decade has had its impact on the field of magnetism and magnetic materials too. It is now common knowledge that materials synthesized in the nanoregime exhibit novel and superior properties with respect to their coarser-sized counterparts in the micron regime. These studies reveal that dielectric properties can be varied appreciably by high-energy ball milling in nanosized zinc ferrites produced by the coprecipitation method. A semiconducting behaviour was observed in these materials, with oxygen vacancies, produced during coprecipitation and milling, acting as the main charge carriers for conduction. Thus, through this study, it was possible to investigate the finite size effects on the structural, electrical and magnetic properties of normal spinels in the ultrafine regime.
Abstract:
The question of the stability of black holes was first studied by Regge and Wheeler, who investigated linear perturbations of the exterior Schwarzschild spacetime. Further work on this problem led to the study of quasi-normal modes, which are regarded as a characteristic 'sound' of black holes. Quasi-normal modes (QNMs) describe the damped oscillations of the geometry surrounding a black hole under perturbations, with frequencies and damping times entirely fixed by the black hole parameters. In the present work we study the influence of a cosmic string on the QNMs of various black hole background spacetimes perturbed by a massless Dirac field.
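In the convention assumed here, a quasi-normal mode is a perturbation with harmonic time dependence and a complex frequency,

\[
\psi(t,r) \sim e^{-i\omega t}\,\phi(r), \qquad \omega = \omega_R - i\,\omega_I ,
\]

where \(\omega_R\) sets the oscillation frequency and \(\omega_I > 0\) the damping rate, both determined by the black hole parameters (and, in the present work, by the presence of the cosmic string as well).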
Abstract:
The space constraints of wireless gadgets are a challenge to antenna designers, as the ground plane dimensions of a printed monopole significantly affect the antenna characteristics. Investigations on ground plane truncations have led to the development of an extremely broadband printed monopole antenna. Omnidirectional radiation characteristics with moderate gain make this antenna highly suitable for mobile/wireless applications. This thesis also highlights the development of a UWB printed antenna along with design equations. Optimum ground plane dimensions for compact antenna applications, a folding technique for miniaturization and double folding for dual-band applications are the other highlights of this thesis.
Abstract:
Concrete is a universal material in the construction industry. With natural resources like sand and aggregate fast depleting, it is time to look for alternate materials to substitute these in the process of making concrete. There are situations, such as exposure to solar radiation, fire, furnaces and nuclear reactor vessels, and special applications like missile launching pads, where concrete is exposed to temperature variations. In this research work, an attempt has been made to understand the behaviour of concrete when weathered laterite aggregate is used in both conventional and self-compacting normal strength concrete. The study has been extended to understand the thermal behaviour of both types of laterised concrete and to check their suitability as a fire protection material. A systematic study of laterised concrete considering parameters like the source of laterite aggregate, grades of Ordinary Portland Cement (OPC) and types of supplementary cementitious materials (fly ash and GGBFS) has been carried out to arrive at a feasible combination of the various ingredients in laterised concrete. A mix design methodology has been proposed for making normal strength laterised self-compacting concrete based on trial mixes, and the same has also been validated. The physical and mechanical properties of laterised concretes have been studied with respect to different variables like exposure temperature (200°C, 400°C and 600°C) and cooling environment (air cooled and water cooled). The behaviour of ferrocement elements with laterised self-compacting concrete has also been studied by varying the cover to mesh reinforcement (10 mm to 50 mm at intervals of 10 mm), exposure temperature and cooling environment.
Abstract:
The evolution of the mini warm pool in the Arabian Sea just before the onset of the southwest monsoon, and the behavior of SST in the vicinity of weather systems formed during the pre-monsoon, southwest monsoon and post-monsoon seasons, were studied using TMI SST data. The Arabian Sea mini warm pool forms about three weeks ahead of the onset of the southwest monsoon. Maximum SST is found about one week ahead of monsoon onset, and the warm pool then gradually dissipates. Generally, a low-pressure system forms when the SST exceeds a certain threshold value for the formation of the system. Daily SST values were examined both in the Arabian Sea and the Bay of Bengal to quantify the increase in SST just before the formation of a system, the rapid decrease in SST during its formation, and the number of days required to return to the normal SST. Many cases were examined for the pre-monsoon, southwest monsoon and post-monsoon seasons to understand the behavior of the SST pattern. It is found that the SST increases by about 3°C just before the formation of a system and decreases by about 4°C within 2 to 3 days during its formation, taking about 4 to 6 days to return to the normal SST pattern. However, the SST pattern depends on the weather system.
Abstract:
Several oral vaccination studies have been undertaken to evoke better protection against white spot syndrome virus (WSSV), a major shrimp pathogen. Formalin-inactivated virus and the WSSV envelope protein VP28 have been suggested as candidate vaccine components, but their uptake mechanism upon oral delivery had not been elucidated. In this study the fate of these components and of live WSSV, orally intubated into black tiger shrimp (Penaeus monodon), was investigated by immunohistochemistry, employing antibodies specific for VP28 and haemocytes. The midgut was identified as the most prominent site of WSSV uptake and processing. The truncated recombinant VP28 (rec-VP28), formalin-inactivated virus (IVP) and live WSSV follow an identical uptake route, suggested to be receptor-mediated endocytosis, that starts with adherence of luminal antigens to the apical layers of the gut epithelium. Processing of internalized antigens takes place in endo-lysosomal compartments, leading to the formation of supra-nuclear vacuoles. However, the majority of WSSV antigens escape these compartments and are transported to the intercellular space via transcytosis. Accumulation of the transcytosed antigens in the connective tissue initiates aggregation and degranulation of haemocytes. Finally, the antigens exiting the midgut appear to reach the haemolymph. The nearly identical uptake pattern of the different WSSV antigens suggests that receptors on the apical membrane of shrimp enterocytes recognize rec-VP28 efficiently. Hence the truncated VP28 can be considered suitable for oral vaccination, provided digestion in the foregut can be bypassed.
Abstract:
In this article, we study reliability measures such as the geometric vitality function and conditional Shannon's measures of uncertainty, proposed by Ebrahimi (1996) and Sankaran and Gupta (1999) respectively, for doubly (interval) truncated random variables. In survival analysis and reliability engineering, these measures play a significant role in studying the various characteristics of a system or component when it fails between two time points. The interrelationships among these uncertainty measures for various distributions are derived, and characterization theorems arising out of them are proved.
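For reference, with \(F\) the distribution function, \(f\) the density and \(t_1 < t_2\) the truncation points, the doubly truncated versions of these measures are usually written as (standard forms from the reliability literature; the notation here is assumed rather than quoted from the article)

\[
\log G(t_1,t_2) = E\big[\log X \mid t_1 < X < t_2\big], \qquad
H(t_1,t_2) = -\int_{t_1}^{t_2} \frac{f(x)}{F(t_2)-F(t_1)}\,\log\frac{f(x)}{F(t_2)-F(t_1)}\,dx ,
\]

i.e. the geometric vitality function and the Shannon uncertainty of a lifetime known to have failed between \(t_1\) and \(t_2\).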
Abstract:
In this paper, we study the relationship between the failure rate and the mean residual life of doubly truncated random variables. Accordingly, we develop characterizations for the exponential, Pareto II and beta distributions. Further, we generalize the identities for the Pearson and the exponential family of distributions given respectively in Nair and Sankaran (1991) and Consul (1995). Applications of these measures in the context of length-biased models are also explored.
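As a point of reference, in the untruncated case the two quantities determine each other through a classical identity (notation assumed here): with \(\bar F\) the survival function,

\[
h(x) = \frac{f(x)}{\bar F(x)}, \qquad
m(x) = E[X - x \mid X > x] = \frac{\int_x^{\infty} \bar F(u)\,du}{\bar F(x)}, \qquad
h(x) = \frac{1 + m'(x)}{m(x)} ;
\]

the paper's subject is the analogue of this relationship when the variable is truncated on both sides.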
Abstract:
In this paper, we examine the relationships between the log odds rate and various reliability measures such as the hazard rate and the reversed hazard rate in the context of repairable systems. We also prove characterization theorems for some families of distributions, viz. the Burr, Pearson and log exponential models. We discuss the properties and applications of the log odds rate in weighted models. Further, we extend the concept to the bivariate setup and study its properties.
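Under the standard definitions (assumed here), the connection is immediate: writing the log odds in favour of failure by time \(t\) as \(\mathrm{LO}(t)=\log\!\big(F(t)/\bar F(t)\big)\), its derivative, the log odds rate, decomposes as

\[
\frac{d}{dt}\,\log\frac{F(t)}{\bar F(t)} \;=\; \frac{f(t)}{F(t)} + \frac{f(t)}{\bar F(t)} \;=\; \tilde\lambda(t) + h(t),
\]

the sum of the reversed hazard rate \(\tilde\lambda(t)\) and the hazard rate \(h(t)\).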
Abstract:
Although steel is the most commonly used reinforcing material in concrete due to its competitive cost and favorable mechanical properties, the corrosion of steel rebars reduces the life span of the structure and adds to maintenance costs. Many techniques have been developed in the recent past to reduce corrosion (galvanizing, epoxy coating, etc.), but none of them appears to provide an adequate solution to the corrosion problem. Apart from the use of fiber reinforced polymer (FRP) rebars, hybrid rebars consisting of both FRP and steel are also being tried to overcome the problem of steel corrosion. This paper evaluates the performance of hybrid rebars as longitudinal reinforcement in normal strength concrete beams. The hybrid rebars used in this study essentially consist of glass fiber reinforced polymer (GFRP) strands of 2 mm diameter wound helically on a mild steel core of 6 mm diameter. GFRP stirrups have been used as shear reinforcement. An attempt has been made to evaluate the flexural and shear performance of beams having hybrid rebars in normal strength concrete with and without polypropylene fibers added to the concrete matrix.
Abstract:
In this work, (outlier-)robust estimators and tests for the unknown parameter of a continuous density function are developed using likelihood depth, introduced by Mizera and Müller (2004). The methods developed are then applied to three different distributions. For one-dimensional parameters, the likelihood depth of a parameter in the data set is computed as the minimum of the proportion of observations for which the derivative of the log-likelihood function with respect to the parameter is non-negative and the proportion for which this derivative is non-positive. The parameter for which both proportions are equal therefore has the greatest depth. This parameter is first chosen as the estimator, since likelihood depth is intended to measure how well a parameter fits the data set. Asymptotically, the parameter with the greatest depth is the one for which the probability that the derivative of the log-likelihood function with respect to the parameter is non-negative for an observation equals one half. If this is not the case for the underlying parameter, the estimator based on likelihood depth is biased. This work shows how this bias can be corrected so that the corrected estimators are consistent. To develop tests for the parameter, the simplex likelihood depth introduced by Müller (2005), which is a U-statistic, is used. It turns out that, for the same distributions for which likelihood depth yields biased estimators, the simplex likelihood depth is an unbiased U-statistic. In particular, its asymptotic distribution is known and tests for various hypotheses can be formulated. The shift in depth, however, leads to poor power of the corresponding test for some hypotheses. Corrected tests are therefore introduced, and conditions are given under which they are consistent. The thesis consists of two parts. In the first part, the general theory of the estimators and tests is presented and their consistency is shown. In the second part, the theory is applied to three different distributions: the Weibull distribution, the Gaussian copula and the Gumbel copula. This demonstrates how the methods of the first part can be used to derive (robust) consistent estimators and tests for the unknown parameter of the distribution. Overall, it turns out that robust estimators and tests can be found for the three distributions using likelihood depth. On uncontaminated data, existing standard methods are partly superior, but the advantage of the new methods becomes apparent on contaminated data and data with outliers.
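In symbols (a restatement of the definition just described; the notation is assumed), for observations \(x_1,\dots,x_n\) from a density \(f(\cdot\,;\theta)\) with a one-dimensional parameter \(\theta\), the likelihood depth is

\[
d(\theta) \;=\; \min\!\left\{ \frac{1}{n}\,\#\Big\{ i : \tfrac{\partial}{\partial\theta}\log f(x_i;\theta) \ge 0 \Big\},\;\; \frac{1}{n}\,\#\Big\{ i : \tfrac{\partial}{\partial\theta}\log f(x_i;\theta) \le 0 \Big\} \right\},
\]

and the maximum-depth estimator is the \(\theta\) that balances the two counts; asymptotically the deepest parameter satisfies \(P\big(\tfrac{\partial}{\partial\theta}\log f(X;\theta) \ge 0\big)=\tfrac12\), which is the source of the bias discussed above.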
Abstract:
In this text, we present two stereo-based head tracking techniques along with a fast 3D model acquisition system. The first tracking technique is a robust implementation of stereo-based head tracking designed for interactive environments with uncontrolled lighting. We integrate fast face detection and drift reduction algorithms with a gradient-based stereo rigid motion tracking technique. Our system can automatically segment and track a user's head under large rotation and illumination variations. Precision and usability of this approach are compared with previous tracking methods for cursor control and target selection in both desktop and interactive room environments. The second tracking technique is designed to improve the robustness of head pose tracking for fast movements. Our iterative hybrid tracker combines constraints from the ICP (Iterative Closest Point) algorithm and normal flow constraint. This new technique is more precise for small movements and noisy depth than ICP alone, and more robust for large movements than the normal flow constraint alone. We present experiments which test the accuracy of our approach on sequences of real and synthetic stereo images. The 3D model acquisition system we present quickly aligns intensity and depth images, and reconstructs a textured 3D mesh. 3D views are registered with shape alignment based on our iterative hybrid tracker. We reconstruct the 3D model using a new Cubic Ray Projection merging algorithm which takes advantage of a novel data structure: the linked voxel space. We present experiments to test the accuracy of our approach on 3D face modelling using real-time stereo images.
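To make the hybrid idea concrete, here is a minimal numerical sketch (in Python/NumPy) of one way ICP point-to-plane constraints and normal flow brightness-constancy constraints can be stacked into a single linear least-squares problem for a small rigid motion. This is not the thesis implementation: the function names, the small-angle motion parameterisation, the scalar weighting between the two constraint sets and the synthetic demo data are all assumptions made for illustration.

import numpy as np


def icp_rows(src_pts, dst_pts, dst_normals):
    """Point-to-plane ICP residuals linearised for a small rigid motion (omega, t).

    Per correspondence: n . (p + omega x p + t - q) = 0, which is linear in
    (omega, t):  [ (p x n)^T  n^T ] [omega; t] = -n . (p - q).
    """
    A = np.hstack([np.cross(src_pts, dst_normals), dst_normals])        # (N, 6)
    b = -np.einsum('ij,ij->i', dst_normals, src_pts - dst_pts)          # (N,)
    return A, b


def normal_flow_rows(pts, grads, It, f):
    """Normal-flow (brightness constancy) residuals for 3D points (X, Y, Z),
    image gradients (Ix, Iy), temporal derivatives It and focal length f.

    A point moving with velocity V = omega x P + t projects to an image
    velocity (u, v) = ((f*Vx - x*Vz)/Z, (f*Vy - y*Vz)/Z) with x = f*X/Z,
    y = f*Y/Z, and the constraint Ix*u + Iy*v + It = 0 is linear in (omega, t).
    """
    X, Y, Z = pts.T
    Ix, Iy = grads.T
    x, y = f * X / Z, f * Y / Z
    c = np.stack([Ix * f / Z, Iy * f / Z, -(Ix * x + Iy * y) / Z], axis=1)  # (N, 3)
    A = np.hstack([np.cross(pts, c), c])                                    # (N, 6)
    b = -np.asarray(It, dtype=float)
    return A, b


def hybrid_step(src, dst, normals, pts, grads, It, f=500.0, w_flow=1.0):
    """One least-squares step combining both constraint sets; returns (omega, t)."""
    A1, b1 = icp_rows(src, dst, normals)
    A2, b2 = normal_flow_rows(pts, grads, It, f)
    A = np.vstack([A1, w_flow * A2])
    b = np.concatenate([b1, w_flow * b2])
    xi, *_ = np.linalg.lstsq(A, b, rcond=None)   # xi = [omega; t]
    return xi[:3], xi[3:]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic data consistent with a small known motion, to exercise the solver.
    src = rng.uniform(-1.0, 1.0, (200, 3)) + np.array([0.0, 0.0, 3.0])
    omega_true = np.array([0.01, -0.02, 0.015])
    t_true = np.array([0.02, 0.01, -0.03])
    dst = src + np.cross(omega_true, src) + t_true        # linearised motion
    normals = rng.normal(size=(200, 3))
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)

    pts, grads = src[:50], rng.normal(size=(50, 2))
    xi_true = np.concatenate([omega_true, t_true])
    A2, _ = normal_flow_rows(pts, grads, np.zeros(50), f=500.0)
    It = -(A2 @ xi_true)          # flow measurements consistent with the motion

    omega, t = hybrid_step(src, dst, normals, pts, grads, It)
    print("recovered omega:", omega)
    print("recovered t:    ", t)

In an actual tracker the correspondences, image gradients and depths would come from the stereo data at each frame, the step would be iterated, and the relative weighting of the two constraint sets would be tuned; the sketch only shows how the two kinds of constraints can share one solve.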