943 results for Approximate Bayesian computation, Posterior distribution, Quantile distribution, Response time data
Abstract:
The pharmacokinetics of scorpion venom and its toxins has been investigated in experimental models using adult animals, although severe scorpion accidents are more frequently associated with children. We compared the effect of age on the pharmacokinetics of tityustoxin, one of the most active principles of Tityus serrulatus venom, in young male/female rats (21-22 days old, N = 5-8) and in adult male rats (150-160 days old, N = 5-8). Tityustoxin (6 µg) labeled with technetium-99m (99mTc) was administered subcutaneously to young and adult rats. The plasma concentration vs time data were subjected to non-compartmental pharmacokinetic analysis to obtain estimates of pharmacokinetic parameters such as total body clearance (CL/F), distribution volume (Vd/F), area under the curve (AUC), and mean residence time. The data were analyzed with and without correction for body weight. Without correction for body weight, young rats showed a higher Cmax (62.30 ± 7.07 vs 12.71 ± 2.11 ng/ml, P < 0.05) and AUC (296.49 ± 21.09 vs 55.96 ± 5.41 ng h ml-1, P < 0.05) and a lower Tmax (0.64 ± 0.19 vs 2.44 ± 0.49 h, P < 0.05). Furthermore, Vd/F (0.15 vs 0.42 l/kg) and CL/F (0.02 ± 0.001 vs 0.11 ± 0.01 l h-1 kg-1, P < 0.05) were lower in young rats. However, when the data were reanalyzed taking body weight into consideration, Cmax (40.43 ± 3.25 vs 78.21 ± 11.23 ng kg-1 ml-1, P < 0.05) and AUC (182.27 ± 11.74 vs 344.62 ± 32.11 ng h ml-1, P < 0.05) were lower in young rats, whereas clearance (0.03 ± 0.002 vs 0.02 ± 0.002 l h-1 kg-1, P < 0.05) and Vd/F (0.210 vs 0.067 l/kg) were higher. The raw data (not adjusted for body weight) strongly suggest that age plays a pivotal role in the disposition of tityustoxin. Our results also indicate that the differences in the severity of symptoms observed in children and adults after scorpion envenomation can be explained in part by differences in the pharmacokinetics of the toxin.
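The non-compartmental quantities named above (Cmax, Tmax, AUC, MRT, CL/F) can be computed directly from a concentration-time profile. Below is a minimal Python sketch using trapezoidal integration; the profile and sampling times are entirely hypothetical, not the study's data.

```python
import numpy as np

def nca_params(t, c, dose):
    """Non-compartmental estimates from a plasma concentration-time profile.

    t : sampling times (h); c : concentrations (ng/ml); dose : dose (ng).
    Returns Cmax, Tmax, AUC (trapezoidal), MRT, and apparent clearance CL/F.
    """
    t, c = np.asarray(t, float), np.asarray(c, float)
    auc = np.sum(np.diff(t) * (c[1:] + c[:-1]) / 2)        # ng*h/ml
    aumc = np.sum(np.diff(t) * ((c * t)[1:] + (c * t)[:-1]) / 2)  # ng*h^2/ml
    i = int(np.argmax(c))
    return {
        "Cmax": c[i],
        "Tmax": t[i],
        "AUC": auc,
        "MRT": aumc / auc,      # mean residence time, h
        "CL_F": dose / auc,     # apparent total body clearance, ml/h
    }

# hypothetical profile for a 6000 ng (6 ug) subcutaneous dose
times = [0.25, 0.5, 1, 2, 4, 8, 12, 24]
conc  = [20, 45, 62, 50, 30, 12, 5, 1]
print(nca_params(times, conc, dose=6000))
```

This is only the core of a non-compartmental analysis; a full analysis would also extrapolate the AUC to infinity from the terminal slope.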
Abstract:
Group B Streptococcus (GBS) is an important agent of invasive infection that can lead to death, and it remains the leading cause of neonatal sepsis to this day. Nine serotypes have been officially described based on the composition of the capsular polysaccharide (CPS). Among these serotypes, type III is considered the most virulent and is frequently associated with severe invasive disease, such as meningitis. Although considerable research has addressed the interactions between type III GBS and cells of the innate immune system, no information is available on the regulation of the adaptive immune response directed against this pathogen. In particular, the role of CD4+ T cells in the immunopathogenesis of GBS infection has never been studied. In this study, three different mouse models of infection were developed to evaluate the activation and modulation of CD4+ T cells responding to type III GBS: ex vivo, in vivo, and in vitro. Results from ex vivo infections show that total splenocytes respond to infection by producing pro-inflammatory type 1 cytokines. A strong production of IL-10 accompanies this inflammatory cascade, probably reflecting the host's effort to maintain homeostasis. The results also show that T cells are actively recruited by the responding innate cells through the production of chemotactic factors such as CXCL9, CXCL10, and CCL3. More specifically, results obtained from CD4+ T cells isolated from ex vivo or in vivo infections show that these cells contribute to the production of IFN-γ and TNF-α as well as IL-2, suggesting a Th1 activation profile. Isolated CD4+ T cells were not major contributors of IL-10, indicating that this immunoregulatory cytokine is mainly produced by innate immune cells in the spleen of infected mice.
The Th1 profile of the CD4+ T cells was confirmed using an in vitro model. Our results also show that the GBS CPS plays an immunomodulatory role in the development of the Th1 response. In summary, this study addresses for the first time the contribution of CD4+ T cells to IFN-γ production during GBS infection and thus to the development of a Th1-type response. These results further reinforce the central role of this cytokine in the effective control of infections caused by this pathogen.
Abstract:
Laser ablation of graphite has been carried out using 1.06 µm radiation from a Q-switched Nd:YAG laser, and the time-of-flight distribution of molecular C2 present in the resultant plasma is investigated as a function of distance from the target and of laser fluence, employing a time-resolved spectroscopic technique. At low laser fluences the intensities of the emission lines from C2 exhibit only a single-peak structure, while beyond a threshold laser fluence the emission from C2 shows a twin-peak distribution in time. The occurrence of the faster velocity component at higher laser fluences is attributed to species generated by recombination processes, while the delayed peak is attributed to the dissociation of higher carbon clusters resulting in the generation of the C2 molecule. Analysis of the measured data provides a fairly complete picture of the evolution and dynamics of C2 species in the laser-induced plasma from graphite.
Abstract:
The emission features of a laser-ablated graphite plume generated in a helium ambient atmosphere have been investigated with time- and space-resolved plasma diagnostic techniques. Time-resolved optical emission spectroscopy is employed to reveal the velocity distribution of the different species ejected during ablation. At lower laser fluences only a slowly propagating component of C2 is seen, while at high fluences the emission from C2 shows a twin-peak distribution in time. The formation of an emission peak with diminished time delay, giving an energetic peak at higher laser fluences, is attributed to many-body recombination. It is also observed that these double peaks evolve into a triple-peak time-of-flight distribution at distances greater than 16 mm from the target. The occurrence of multiple peaks in the C2 emission is mainly due to the delays caused by the different formation mechanisms of the C2 species. The velocity distribution of the faster peak exhibits an oscillating character with distance from the target surface.
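At its core, the time-of-flight analysis in these two abstracts converts the delay of each emission peak, observed at a known distance from the target, into a species velocity. A minimal sketch, using a hypothetical twin-peak profile rather than the measured data:

```python
import numpy as np

def peak_velocities(delays_us, intensity, distance_mm, min_rel_height=0.3):
    """Convert the peaks of a time-of-flight emission profile into velocities.

    delays_us   : gate delays after the laser pulse (microseconds)
    intensity   : emission intensity at each delay (arbitrary units)
    distance_mm : distance of the observation point from the target
    Returns the velocity (km/s) of each local maximum above the threshold.
    """
    y = np.asarray(intensity, float)
    t = np.asarray(delays_us, float)
    peaks = [i for i in range(1, len(y) - 1)
             if y[i] > y[i - 1] and y[i] >= y[i + 1]
             and y[i] >= min_rel_height * y.max()]
    # v = d / t; mm/us is numerically equal to km/s
    return [distance_mm / t[i] for i in peaks]

# hypothetical twin-peak profile observed 10 mm from the target
t = np.arange(0.5, 10.5, 0.5)
y = np.exp(-((t - 1.5) / 0.4) ** 2) + 0.8 * np.exp(-((t - 5.0) / 1.0) ** 2)
print(peak_velocities(t, y, distance_mm=10))  # fast and slow components
```

The early peak maps to the fast (recombination) component and the delayed peak to the slow (cluster-dissociation) component described above.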
Abstract:
This thesis, entitled Reliability Modelling and Analysis in Discrete Time, develops some concepts and models useful in the analysis of discrete lifetime data. The present study consists of five chapters. In Chapter II we take up the derivation of some general results useful in reliability modelling that involve two-component mixtures. Expressions for the failure rate, mean residual life and second moment of residual life of the mixture distributions, in terms of the corresponding quantities in the component distributions, are investigated, and some applications of these results are pointed out. The role of the geometric, Waring and negative hypergeometric distributions as models of life lengths in the discrete time domain has been discussed already; while describing various reliability characteristics, it was found that they can often be considered as a class. The applicability of these models to single populations naturally extends to populations composed of sub-populations, making mixtures of these distributions worth investigating. Accordingly, the general properties, various reliability characteristics and characterizations of these models are discussed in Chapter III. Inference of parameters in mixture distributions is usually a difficult problem, because the mass function of the mixture is a linear function of the component masses, which makes manipulation of the likelihood equations, least-squares function, etc., and the resulting computations, very difficult. We show that one of our characterizations helps in inferring the parameters of the geometric mixture without involving computational hazards. As mentioned in the review of results in the previous sections, partial moments have not been studied extensively in the literature, especially in the case of discrete distributions. Chapters IV and V deal with descending and ascending partial factorial moments.
Apart from studying their properties, we prove characterizations of distributions by functional forms of partial moments and establish recurrence relations between successive moments for some well known families. It is further demonstrated that partial moments are equally efficient and convenient compared to many of the conventional tools to resolve practical problems in reliability modelling and analysis. The study concludes by indicating some new problems that surfaced during the course of the present investigation which could be the subject for a future work in this area.
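As a numerical illustration of the two-component mixture results discussed in Chapter II, the following sketch (with arbitrary parameters, not taken from the thesis) computes the failure rate of a two-component geometric mixture. As expected for a mixture of constant-failure-rate components, the mixture failure rate decreases toward the smaller of the two component rates:

```python
def geometric_pmf(x, p):
    """P(X = x) for a geometric distribution on x = 0, 1, 2, ..."""
    return p * (1 - p) ** x

def mixture_failure_rate(x, p1, p2, w):
    """Failure rate h(x) = P(X = x) / P(X >= x) for a two-component
    geometric mixture with mixing weight w on the first component."""
    pmf = w * geometric_pmf(x, p1) + (1 - w) * geometric_pmf(x, p2)
    # survival of a geometric component: P(X >= x) = (1 - p) ** x
    surv = w * (1 - p1) ** x + (1 - w) * (1 - p2) ** x
    return pmf / surv

# the mixture failure rate decreases toward min(p1, p2) as x grows
print([round(mixture_failure_rate(x, 0.5, 0.1, 0.4), 4) for x in (0, 5, 20)])
```

Each geometric component has constant failure rate p, yet the mixture has a decreasing failure rate, which is the kind of behaviour the mixture expressions in Chapter II make explicit.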
Abstract:
The term reliability of an equipment or device is often meant to indicate the probability that it carries out the functions expected of it adequately, without failure and within specified performance limits, at a given age for a desired mission time when put to use under the designated application and operating environmental stress. A broad classification of the approaches employed in reliability studies can be made as probabilistic and deterministic: the main interest in the former is to devise tools and methods to identify the random mechanism governing the failure process through a proper statistical framework, while the latter addresses the question of finding the causes of failure and the steps to reduce individual failures, thereby enhancing reliability. In the probabilistic approach, to which the present study subscribes, the concept of a life distribution, a mathematical idealisation that describes the failure times, is fundamental, and a basic question a reliability analyst has to settle is the form of the life distribution. It is for no other reason that a major share of the literature on the mathematical theory of reliability is focused on methods of arriving at reasonable models of failure times and on exhibiting the failure patterns that induce such models. The application of the methodology of lifetime distributions is not confined to the assessment of the endurance of equipment and systems; it ranges over a wide variety of scientific investigations in which the word lifetime may not refer to the length of life in the literal sense, but can be conceived in its most general form as a non-negative random variable. Thus the tools developed in connection with modelling lifetime data have found applications in other areas of research such as actuarial science, engineering, biomedical sciences, economics and extreme value theory.
Abstract:
There is hardly a case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for interpretation. We need to start by recognizing that there are genuine zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot coexist with nepheline. Another common essential zero is a North azimuth, although we can always replace that zero with the value 360°. These are known as "essential zeros"; but what can we do with "rounded zeros", which result from values below the detection limit of the equipment? Amalgamation, e.g. adding Na2O and K2O as total alkalis, is one solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same occurs if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation). The method we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between copper and molybdenum values, but while copper will always be above the detection limit, many of the molybdenum values will be "rounded zeros". So we take the lower quartile of the real molybdenum values, establish a regression equation with copper, and then estimate the "rounded" zero values of molybdenum from their corresponding copper values.
The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable. Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
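The proposed imputation can be sketched in a few lines. Everything below is hypothetical: the Cu-Mo grades are made up, and the log-log regression is an assumed functional form (chosen because ore grades are roughly lognormal); the sketch only illustrates the idea of predicting below-detection Mo from correlated Cu.

```python
import numpy as np

def impute_rounded_zeros(cu, mo, detection_limit):
    """Replace below-detection Mo values using their correlation with Cu.

    Following the procedure described above: fit a least-squares line on the
    lower quartile of the *detected* Mo values (in log-log space, an
    assumption here), then predict Mo from Cu wherever Mo fell below the
    detection limit.
    """
    cu, mo = np.asarray(cu, float), np.asarray(mo, float)
    detected = mo >= detection_limit
    q1 = np.quantile(mo[detected], 0.25)          # lower quartile of real Mo
    lower = detected & (mo <= q1)
    slope, intercept = np.polyfit(np.log(cu[lower]), np.log(mo[lower]), 1)
    out = mo.copy()
    out[~detected] = np.exp(intercept + slope * np.log(cu[~detected]))
    return out

# hypothetical grades: Cu (%) and Mo (ppm), detection limit 10 ppm for Mo
cu = [0.8, 1.2, 0.5, 2.0, 1.5, 0.3, 0.9, 0.4]
mo = [40, 80, 20, 150, 100, 0, 50, 0]
print(impute_rounded_zeros(cu, mo, detection_limit=10))
```

Note the advantage stated above: the two imputed Mo values differ, because each depends on its sample's Cu grade rather than being a single substituted constant.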
Abstract:
The research project starts from the dynamics of the outsourced distribution model of a mass-consumption company in Colombia, specialized in dairy products, which for this study has been called "Lactosa". Using panel data within a case study, two demand models are built by product category and distributor, and through stochastic simulation the relevant variables affecting their cost structures are identified. The problem is modelled from the income statement of each of the four distributors analyzed in the central region of the country. The cost structure and sales behaviour are analyzed given a logistics distribution margin (%), as a function of the relevant independent variables related to the business, the market and the macroeconomic environment, as described in the object of study. Among other findings, notable gaps stand out in distribution costs and sales-force costs, despite the homogeneity of the segments. The study identifies value drivers and the costs with the greatest individual dispersion, and suggests strategic alliances among some groups of distributors. The panel-data modelling identifies the relevant management variables affecting sales volume by category and distributor, which focuses management's efforts. It is recommended to narrow the gaps and to promote, from the producer's side, strategies focused on standardizing the distributors' internal processes, and to promote and replicate the analysis models without attempting to replace expert knowledge. Scenario building jointly and reliably strengthens the competitive position of the company and its distributors.
Abstract:
More and more LiDAR data are becoming available covering large areas of territory, but the distribution of this type of data has not yet been solved, owing to the large data volumes and to the fact that analysing the information is not trivial for users who are not experts in LiDAR technology. DIELMO is currently carrying out a project to implement different services for the distribution of LiDAR data through an IDE (spatial data infrastructure).
Abstract:
Case study simulations with idealized tracers have been used to determine the relationship between the dynamics and conceptual representations of different midlatitude frontal systems and the amount, distribution, and time scale of boundary layer ventilation by these systems. The key features of ventilation by kata- and ana-cold frontal systems are found to be quantitatively, and often also qualitatively, similar; the main ventilation pathways are the conveyor belts, the cloud head, and other convection. The conveyor belts and cloud head occur within cloud, implying that they can be identified using satellite imagery. Differences in the transport by the two systems can be related to their conceptual representations and include a sensitive dependence on the diurnal cycle for the kata- but not the ana-cold frontal case.
Abstract:
Water table response to rainfall was investigated at six sites in the Upper, Middle and Lower Chalk of southern England. Daily time series of rainfall and borehole water level were cross-correlated to investigate seasonal variations in groundwater-level response times, based on periods of 3-month duration. The time lags (in days) yielding significant correlations were compared with the average unsaturated zone thickness during each 3-month period. In general, when the unsaturated zone was more than 18 m thick, the time lag for a significant water-level response increased rapidly once the depth to the water table exceeded a critical value, which varied from site to site. For shallower water tables, a linear relationship between the depth to the water table and the water-level response time was evident. The observed variations in response time can only be partially accounted for by a diffusive model of propagation through the unsaturated matrix, suggesting that some fissure flow was occurring. The majority of rapid responses were observed during the winter/spring recharge period, when the unsaturated zone is thinnest and its moisture content is highest, and were more likely to occur when the rainfall intensity exceeded 5 mm/day. At some sites, a very rapid response within 24 h of rainfall was observed in addition to the longer-term responses, even when the unsaturated zone was up to 64 m thick; this response was generally associated with the autumn period. The results of the cross-correlation analysis provide statistical support for the presence of fissure flow and for the contribution of multiple pathways through the unsaturated zone to groundwater recharge.
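The lag analysis described above can be illustrated with a short sketch: correlate rainfall with the water-level series at each candidate lag over a 3-month window and pick the lag with the strongest correlation. The data here are synthetic, with a 7-day lag deliberately built in; they are not the Chalk observations.

```python
import numpy as np

def best_lag(rain, level, max_lag=30):
    """Lag (days) at which rainfall correlates most strongly with the
    subsequent water-level series, searching lags 0..max_lag."""
    rain = np.asarray(rain, float)
    level = np.asarray(level, float)
    corrs = []
    for lag in range(max_lag + 1):
        r = rain[: len(rain) - lag] if lag else rain
        l = level[lag:]
        corrs.append(np.corrcoef(r, l)[0, 1])
    corrs = np.array(corrs)
    return int(np.argmax(corrs)), corrs

# synthetic 3-month record: water level echoes rainfall 7 days later
rng = np.random.default_rng(0)
rain = rng.gamma(0.5, 4.0, size=90)
level = np.concatenate([np.zeros(7), rain[:-7]]) + rng.normal(0, 0.1, 90)
lag, corrs = best_lag(rain, level)
print(lag)  # expected near the built-in 7-day lag
```

In practice one would also test the correlation at each lag for significance before interpreting it, as the study does.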
Abstract:
Many time series are measured monthly, either as averages or totals, and such data often exhibit seasonal variability: the values of the series are consistently larger for some months of the year than for others. A typical series of this type is the number of deaths each month attributed to SIDS (Sudden Infant Death Syndrome). Seasonality can be modelled in a number of ways. This paper describes and discusses various methods for modelling seasonality in SIDS data, though much of the discussion is relevant to other seasonally varying data. There are two main approaches, either fitting a circular probability distribution to the data, or using regression-based techniques to model the mean seasonal behaviour. Both are discussed in this paper.
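Of the two approaches mentioned, the regression-based one can be sketched with a single annual harmonic fitted by least squares. The monthly counts below are hypothetical winter-peaking values, not SIDS data:

```python
import numpy as np

def fit_seasonal_harmonic(counts):
    """Least-squares fit of mu + a*cos(2*pi*m/12) + b*sin(2*pi*m/12)
    to one count per calendar month (m = 0..11, 0 = January).

    Returns the fitted mean, the amplitude of the seasonal cycle, and the
    (fractional) month at which the fitted curve peaks.
    """
    m = np.arange(len(counts))
    X = np.column_stack([np.ones_like(m, dtype=float),
                         np.cos(2 * np.pi * m / 12),
                         np.sin(2 * np.pi * m / 12)])
    mu, a, b = np.linalg.lstsq(X, np.asarray(counts, float), rcond=None)[0]
    amplitude = np.hypot(a, b)
    peak_month = (np.arctan2(b, a) * 12 / (2 * np.pi)) % 12
    return mu, amplitude, peak_month

# hypothetical monthly counts peaking in winter (January = month 0)
counts = [30, 28, 24, 18, 14, 11, 10, 12, 16, 21, 26, 29]
print(fit_seasonal_harmonic(counts))
```

The circular-distribution approach instead treats each event's month as an angle on the circle and fits, e.g., a von Mises distribution; both summarise the seasonal peak and its concentration.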
Abstract:
A statistical methodology is proposed and tested for the analysis of extreme values of atmospheric wave activity at mid-latitudes. The adopted methods are the classical block-maximum and peak-over-threshold approaches, respectively based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD). Time series of the ‘Wave Activity Index’ (WAI) and the ‘Baroclinic Activity Index’ (BAI) are computed from simulations of the General Circulation Model ECHAM4.6, which is run under perpetual January conditions. Both the GEV and the GPD analyses indicate that the extremes of WAI and BAI are Weibull distributed; this corresponds to distributions with an upper bound. However, a remarkably large variability is found in the tails of such distributions: distinct simulations carried out under the same experimental setup provide appreciably different estimates of the 200-yr WAI return level. The consequences of this phenomenon for applications of the methodology to climate change studies are discussed. The atmospheric configurations characteristic of the maxima and minima of WAI and BAI are also examined.
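The block-maximum step of the methodology can be sketched with scipy's GEV implementation. The maxima below are synthetic draws from a bounded distribution (the WAI/BAI series are not available here), so the fitted shape should land in the Weibull, upper-bounded domain:

```python
import numpy as np
from scipy.stats import genextreme

# Block-maximum GEV fit on synthetic data.  Bounded "block maxima" are
# drawn from a scaled beta distribution; in scipy's parameterisation the
# Weibull (upper-bounded) domain is shape c > 0, with upper endpoint
# loc + scale / c.
rng = np.random.default_rng(42)
maxima = 10 + 5 * rng.beta(5, 2, size=200)      # 200 synthetic block maxima

c, loc, scale = genextreme.fit(maxima)
r200 = genextreme.ppf(1 - 1 / 200, c, loc, scale)   # 200-yr return level
print(f"shape={c:.3f}  200-yr return level={r200:.2f}")
```

The large tail variability reported above would show up here as spread in `c` and `r200` when the fit is repeated on independent samples of the same size.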
Abstract:
Detailed understanding of the haemodynamic changes that underlie non-invasive neuroimaging techniques such as blood oxygen level dependent functional magnetic resonance imaging is essential if we are to continue to extend the use of these methods for understanding brain function and dysfunction. The use of animal, and in particular rodent, research models has been central to these endeavours, as they allow in vivo experimental techniques that provide measurements of the haemodynamic response function at high temporal and spatial resolution. A limitation of most of this research is the use of anaesthetic agents, which may disrupt or mask important features of neurovascular coupling or the haemodynamic response function. In this study we therefore measured spatiotemporal cortical haemodynamic responses to somatosensory stimulation in awake rats using optical imaging spectroscopy. Trained, restrained animals received non-noxious stimulation of the whisker pad via chronically implanted stimulating microwires whilst optical recordings were made from the contralateral somatosensory cortex through a thin cranial window. The responses we measure from unanaesthetised animals are substantially different from those reported in previous studies which used anaesthetised animals. These differences include biphasic response regions (initial increases in blood volume and oxygenation followed by subsequent decreases) as well as oscillations in the response time series of awake animals. These haemodynamic response features do not reflect concomitant changes in the underlying neuronal activity and therefore reflect neurovascular or cerebrovascular processes. These hitherto unreported hyperaemic response dynamics may have important implications for the use of anaesthetised animal models for research into the haemodynamic response function.
Abstract:
The use of Bayesian inference in the estimation of time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing-spline-based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We demonstrate the algorithm by tracking chirps and through the analysis of musical data.
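Only one ingredient of such a framework lends itself to a short sketch: a recursive, exponentially forgetting non-parametric estimate of the local spectral density, here applied to a chirp as in the demonstrations above. The particle filter and the local Whittle likelihood of the paper are not reproduced, and the window length and forgetting factor below are arbitrary choices.

```python
import numpy as np

class RecursiveSpectrum:
    """Online, exponentially forgetting estimate of a time-varying spectrum.

    Maintains a ring buffer of the last n_freq samples and blends each new
    local periodogram into a running estimate with forgetting factor lam.
    """

    def __init__(self, n_freq=64, lam=0.99):
        self.lam = lam
        self.n = n_freq                      # window length / FFT size
        self.buf = np.zeros(n_freq)          # ring buffer of recent samples
        self.spec = np.zeros(n_freq // 2 + 1)
        self.t = 0

    def update(self, x):
        """Consume one sample; return the current spectral estimate."""
        self.buf = np.roll(self.buf, -1)
        self.buf[-1] = x
        self.t += 1
        if self.t >= self.n:                 # buffer full: update estimate
            pgram = np.abs(np.fft.rfft(self.buf)) ** 2 / self.n
            self.spec = self.lam * self.spec + (1 - self.lam) * pgram
        return self.spec

# track a chirp: spectral energy should migrate to higher bins over time
t = np.arange(4000)
chirp = np.sin(2 * np.pi * (0.05 + 0.05 * t / 4000) * t)
est = RecursiveSpectrum()
early = late = None
for i, x in enumerate(chirp):
    s = est.update(x)
    if i == 1000:
        early = int(np.argmax(s))
    if i == 3999:
        late = int(np.argmax(s))
print(early, late)  # the dominant bin moves upward as the chirp sweeps
```

A fully Bayesian treatment would replace this point estimate with a posterior over the local spectrum, propagated by the particle filter.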