942 results for geometric mean diameter


Relevance: 80.00%

Abstract:

Atazanavir inhibits UDP-glucuronosyltransferase 1A1 (UGT1A1), which metabolizes raltegravir, but the magnitude of steady-state inhibition and the role of the UGT1A1 genotype are unknown. Sufficient inhibition could allow reduced-dose, lower-cost raltegravir regimens. Nineteen healthy volunteers, aged 24 to 51 years, took raltegravir 400 mg twice daily (arm A) and raltegravir 400 mg plus atazanavir 400 mg once daily (arm B), separated by ≥3 days, in a crossover design. After 1 week on each regimen, raltegravir and raltegravir-glucuronide plasma and urine concentrations were measured by liquid chromatography-tandem mass spectrometry in multiple samples obtained over 12 h (arm A) or 24 h (arm B) and analyzed by noncompartmental methods. UGT1A1 promoter variants were detected with a commercially available kit and published primers. The primary outcome was the ratio of the plasma raltegravir Cτ (concentration at the end of the dosing interval) for arm B (24 h) versus arm A (12 h). The arm B-to-arm A geometric mean ratios (95% confidence interval, P value) for plasma raltegravir Cτ, area under the concentration-time curve from 0 to 12 h (AUC0-12), and raltegravir-glucuronide/raltegravir AUC0-12 were 0.38 (0.22 to 0.65, 0.001), 1.32 (0.62 to 2.81, 0.45), and 0.47 (0.38 to 0.59, <0.001), respectively. Nine volunteers were heterozygous and one was homozygous for a UGT1A1 reduction-of-function allele, but these genotypes were not associated with metabolite formation. Although atazanavir significantly reduced formation of the glucuronide metabolite, its steady-state boosting of plasma raltegravir did not render the Cτ of a once-daily 400-mg raltegravir dose similar to the Cτ of the standard twice-daily dose. UGT1A1 promoter variants did not significantly influence this interaction.
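
The primary endpoint here is a geometric mean ratio of Cτ values. The sketch below shows, with invented paired Cτ values and a simple paired t-interval on the log scale (both are assumptions for illustration, not the study's data or its exact analysis), how such a ratio and its 95% confidence interval can be computed.

```python
import math
from statistics import mean, stdev

# Hypothetical paired C_tau values (ng/mL) for illustration only -- not study data.
ctau_arm_a = [120, 95, 210, 60, 150, 80, 175, 100, 130, 90]   # 400 mg twice daily
ctau_arm_b = [55, 40, 70, 20, 65, 35, 60, 45, 50, 30]          # 400 mg + atazanavir once daily

# Work on the log scale: the geometric mean ratio (GMR) is exp(mean of paired log differences).
log_diffs = [math.log(b) - math.log(a) for a, b in zip(ctau_arm_a, ctau_arm_b)]
gmr = math.exp(mean(log_diffs))

# 95% CI from a paired t-interval on the log differences (t critical value for df = 9).
t_crit = 2.262
half_width = t_crit * stdev(log_diffs) / math.sqrt(len(log_diffs))
ci_low = math.exp(mean(log_diffs) - half_width)
ci_high = math.exp(mean(log_diffs) + half_width)

print(f"GMR (arm B / arm A) = {gmr:.2f}, 95% CI {ci_low:.2f} to {ci_high:.2f}")
```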

Relevance: 80.00%

Abstract:

The aims were to characterize ethylbenzene and xylene air concentrations and to explore biological exposure markers (urinary t,t-muconic acid (t,t-MA) and unmetabolized toluene) among offshore petroleum workers. Offshore workers have increased health risks due to simultaneous exposure to several hydrocarbons present in crude oil. We discuss the pooled benzene exposure results from our previous and current studies and possible co-exposure interactions. BTEX air concentrations were measured during three consecutive 12-h work shifts among 10 tank workers, 15 process operators, and 18 controls. Biological samples were collected pre-shift on the first day and post-shift on the third day of the study. The geometric mean exposures over the three work shifts were 0.02 ppm benzene, 0.05 ppm toluene, 0.03 ppm ethylbenzene, and 0.06 ppm xylene. Benzene in air was significantly correlated with unmetabolized benzene in blood (r = 0.69, p < 0.001) and urine (r = 0.64, p < 0.001), but not with urinary t,t-MA (r = 0.27, p = 0.20). Toluene in air was highly correlated with the internal dose of toluene in both blood (r = 0.70, p < 0.001) and urine (r = 0.73, p < 0.001). Co-exposures were present; however, an interaction of metabolism was unlikely at these low benzene and toluene exposures. Urinary benzene, but not t,t-MA, was a reliable biomarker for benzene at low exposure levels. Urinary toluene was a useful biomarker for toluene exposure. Xylene and ethylbenzene air levels were low. Dermal exposure assessment should be performed in future studies among these workers.
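
As a reminder of how such summary statistics are obtained, the sketch below computes a geometric mean from hypothetical shift measurements and a Pearson correlation between hypothetical air and urinary levels; none of the values are the study's data.

```python
import math
from statistics import fmean, correlation  # statistics.correlation needs Python 3.10+

# Hypothetical benzene air measurements (ppm) over three 12-h shifts -- illustration only.
benzene_air = [0.015, 0.022, 0.027]

# The geometric mean is the exponential of the arithmetic mean of the logs.
gm_benzene = math.exp(fmean(math.log(x) for x in benzene_air))
print(f"Geometric mean benzene exposure: {gm_benzene:.3f} ppm")

# Hypothetical paired air (ppm) and urinary benzene values to illustrate the correlation step.
air = [0.01, 0.02, 0.03, 0.05, 0.08]
urine = [0.4, 0.7, 1.1, 1.6, 2.9]
print(f"Pearson r (air vs. urinary benzene): {correlation(air, urine):.2f}")
```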

Relevance: 80.00%

Abstract:

Each item in a given collection is characterized by a set of possible performances. A (ranking) method is a function that assigns an ordering of the items to every performance profile. Ranking by Rating consists in evaluating each item’s performance by using an exogenous rating function, and ranking items according to their performance ratings. Any such method is separable: the ordering of two items does not depend on the performances of the remaining items. We prove that every separable method must be of the ranking-by-rating type if (i) the set of possible performances is the same for all items and the method is anonymous, or (ii) the set of performances of each item is ordered and the method is monotonic. When performances are m-dimensional vectors, a separable, continuous, anonymous, monotonic, and invariant method must rank items according to a weighted geometric mean of their performances along the m dimensions.
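
A minimal sketch of the ranking-by-rating rule characterized in the multidimensional case: each item is rated by a weighted geometric mean of its m performance values and the items are ordered by that rating. The items, performances and weights below are invented for illustration.

```python
import math

def weighted_geometric_mean(performance, weights):
    """Rating = prod_j performance[j] ** weights[j], computed on the log scale."""
    assert len(performance) == len(weights)
    return math.exp(sum(w * math.log(p) for p, w in zip(performance, weights)))

# Hypothetical items with m = 3 positive performance dimensions and weights summing to 1.
weights = [0.5, 0.3, 0.2]
items = {"item1": [4.0, 2.0, 8.0], "item2": [3.0, 3.0, 3.0], "item3": [6.0, 1.0, 5.0]}

ranking = sorted(items, key=lambda k: weighted_geometric_mean(items[k], weights), reverse=True)
print(ranking)  # items ordered from best to worst rating
```

Because each item's rating depends only on its own performance vector, the ordering of any two items is unaffected by the other items' performances, which is the separability property discussed above.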

Relevance: 80.00%

Abstract:

The present study focuses on defining certain measures of income inequality for truncated distributions, characterizing probability distributions using the functional forms of these measures, extending some measures of inequality and stability to higher dimensions, characterizing bivariate models using the above concepts, and estimating some measures of inequality using Bayesian techniques. The thesis defines certain measures of income inequality for truncated distributions and studies the effect of truncation on these measures. An important measure used in reliability theory to assess the stability of a component is the residual entropy function; this concept can advantageously be used as a measure of inequality of truncated distributions. The geometric mean is a handy tool in the measurement of income inequality, and the geometric vitality function, being the geometric mean of the truncated random variable, can be used to measure inequality of truncated distributions. The study also includes the problem of estimating the Lorenz curve, the Gini index and the variance of logarithms for the Pareto distribution using Bayesian techniques.
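
One of the quantities estimated in the thesis, the Gini index of a Pareto distribution, has the standard closed form 1/(2α − 1) for shape α > 1. The sketch below checks that closed form against a simulated sample; it illustrates the target quantity only and is not the Bayesian estimation procedure used in the thesis.

```python
import math
import random

def pareto_gini(alpha):
    """Closed-form Gini index for a Pareto (Type I) distribution with shape alpha > 1."""
    return 1.0 / (2.0 * alpha - 1.0)

def empirical_gini(sample):
    """Gini index of a sample via the sorted-sum formula."""
    xs = sorted(sample)
    n = len(xs)
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * cum) / (n * sum(xs)) - (n + 1.0) / n

alpha = 2.5
random.seed(0)
sample = [random.paretovariate(alpha) for _ in range(100_000)]
print(f"closed form: {pareto_gini(alpha):.4f}  simulated: {empirical_gini(sample):.4f}")
```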

Relevance: 80.00%

Abstract:

R, available from http://www.r-project.org/, is 'GNU S': a language and environment for statistical computing and graphics. Many classical and modern statistical techniques are implemented in the base environment, and many more are supplied as packages; there are 8 standard packages and many more are available through the CRAN family of Internet sites (http://cran.r-project.org). We started to develop a library of functions in R to support the analysis of mixtures. Our goal is a MixeR package for compositional data analysis that provides support for: operations on compositions (perturbation and power multiplication, subcomposition with or without residuals, centering of the data, computing Aitchison, Euclidean and Bhattacharyya distances, compositional Kullback-Leibler divergence, etc.); graphical presentation of compositions in ternary diagrams and tetrahedrons with additional features (barycenter, geometric mean of the data set, percentile lines, marking and coloring of subsets of the data set and their geometric means, annotation of individual data points in the set, …); dealing with zeros and missing values in compositional data sets, with R procedures for simple and multiplicative replacement strategies; and time series analysis of compositional data. We will present the current status of MixeR development and illustrate its use on selected data sets.
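
The core compositional operations listed above are easy to sketch. The snippet below does so in Python (not R) with generic function names, so it is not the MixeR API, only an illustration of closure, perturbation, powering and centring by the closed geometric mean.

```python
import math

def closure(x):
    """Rescale a vector of positive parts so they sum to 1."""
    s = sum(x)
    return [xi / s for xi in x]

def perturb(x, y):
    """Perturbation: componentwise product, re-closed."""
    return closure([xi * yi for xi, yi in zip(x, y)])

def power(x, a):
    """Power transformation: componentwise power, re-closed."""
    return closure([xi ** a for xi in x])

def geometric_centre(data):
    """Closed componentwise geometric mean of a compositional data set."""
    n = len(data)
    gm = [math.exp(sum(math.log(row[j]) for row in data) / n) for j in range(len(data[0]))]
    return closure(gm)

data = [[0.2, 0.3, 0.5], [0.1, 0.4, 0.5], [0.3, 0.3, 0.4]]  # hypothetical compositions
centre = geometric_centre(data)
# Centring the data: perturb each composition by the inverse of the centre.
centred = [perturb(row, [1.0 / c for c in centre]) for row in data]
print(centre, centred[0])
```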

Relevance: 80.00%

Abstract:

Simpson's paradox, also known as the amalgamation or aggregation paradox, appears when dealing with proportions. Proportions are by construction parts of a whole and can be interpreted as compositions if they are assumed to carry only relative information. The Aitchison inner product space structure of the simplex, the sample space of compositions, explains the appearance of the paradox, since amalgamation is a nonlinear operation within that structure. Here we propose using balances, which are specific elements of this structure, to analyse situations in which the paradox might appear. With the proposed approach, the centre of the tables analysed emerges as a natural way to compare them, one that by construction rules out the paradox. Key words: Aitchison geometry, geometric mean, orthogonal projection
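
For readers unfamiliar with the aggregation effect, the illustrative counts below show how one treatment can have the higher success proportion in each group yet the lower proportion after amalgamation; the numbers are for illustration only and are unrelated to the tables analysed in the paper.

```python
# Illustrative 2x2x2 counts: (successes, trials) for treatments A and B in two groups.
group1 = {"A": (81, 87), "B": (234, 270)}
group2 = {"A": (192, 263), "B": (55, 80)}

def rate(successes, trials):
    return successes / trials

for name, table in (("group 1", group1), ("group 2", group2)):
    print(name, {t: round(rate(*sn), 3) for t, sn in table.items()})  # A is higher in both

# Amalgamated (pooled) proportions -- the comparison reverses.
pooled = {t: (group1[t][0] + group2[t][0], group1[t][1] + group2[t][1]) for t in ("A", "B")}
print("pooled", {t: round(rate(*sn), 3) for t, sn in pooled.items()})  # B is higher overall
```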

Relevance: 80.00%

Abstract:

The basic purpose of this research is to identify the non-structural mismatches (deficits or surpluses) between labour supply and demand in Colombia for the years 2020 and 2030, as well as to interpret the findings and propose elements of strategy for companies in order to mitigate the adverse effects of the mismatch on their capacity to attract and retain talent. The central argument of the project is that, within a horizon of no more than 10 years, the supply of qualified labour in Colombia will not be sufficient to (i) balance the market and (ii) meet the aggregate demand for labour, owing to generational changes in the country's demographic reality, the low level of preparation of the available workforce, and the high rates of informality among workers and companies. Among the results, a projection of the behaviour of the labour market is presented, together with the magnitude of the imbalance between the market agents. This applied study is a quantitative approach to the talent crisis reviewed in other studies. It is a solid precedent for examining the future of work in Colombia in greater depth through other approaches.

Relevance: 80.00%

Abstract:

This thesis studies how to estimate the distribution of regionalized variables whose sample space and scale admit a Euclidean space structure. We apply the principle of working in coordinates: choose an orthonormal basis, do statistics on the coordinates of the data, and apply the outputs to the basis in order to recover a result in the original space. Applied to regionalized variables, this yields a single consistent approach that generalizes the well-known properties of kriging techniques to several sample spaces: real, positive or compositional data (vectors of positive components with constant sum) are treated as particular cases. In this way linear geostatistics is generalized, and solutions are offered to well-known problems of non-linear geostatistics, adapting the measure and the criteria of representativeness (i.e., means) to the data being treated. The estimator for positive data coincides with a weighted geometric mean, equivalent to estimating the median, without any of the problems of classical lognormal kriging. The compositional case offers equivalent solutions and, in addition, allows the estimation of multinomial probability vectors. With a preliminary Bayesian approach, kriging of compositions also becomes a consistent alternative to indicator kriging, a technique used to estimate probability functions of arbitrary variables which often yields negative estimates, something the proposed alternative avoids. The usefulness of this set of techniques is assessed by studying ammonia pollution at an automatic water-quality monitoring station in the Tordera basin; the conclusion is that only with the proposed techniques can one detect the moments at which ammonium is transformed into ammonia at a concentration above the legally permitted level.
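
As stated above, the positive-data estimator reduces to a weighted geometric mean: kriging weights are applied to the logarithms of the observations and the result is back-transformed. A minimal sketch with hypothetical weights and observations (the derivation of the weights from a variogram is not shown):

```python
import math

def weighted_geometric_mean(values, weights):
    """exp of the weighted mean of the logs; with kriging weights summing to 1,
    this is the back-transformed estimate for positive (log-coordinate) data."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return math.exp(sum(w * math.log(v) for v, w in zip(values, weights)))

# Hypothetical neighbouring observations (e.g. concentrations) and kriging weights.
obs = [0.8, 1.5, 2.3, 0.6]
weights = [0.4, 0.3, 0.2, 0.1]
print(f"estimate: {weighted_geometric_mean(obs, weights):.3f}")
```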

Relevance: 80.00%

Abstract:

Laser photoacoustic spectra of vapour phase CHDCl2 reveal the presence of an interaction which has been ascribed to interbond coupling between C-H and C-D local modes. The absolute value of the interbond coupling parameter for the CHD group, determined from a fit of a model local mode hamiltonian to the experimental data, is shown to be given approximately by the geometric mean of the interbond coupling parameters of the CH2 and CD2 groups recently derived from similar studies of CH2Cl2 and CD2Cl2. Such behaviour is understood in terms of a simple analysis in which kinetic coupling effects dominate. It is suggested that C-H stretch/bend Fermi resonance is responsible for some weaker features in the spectra and modelling calculations are described which allow an order of magnitude estimate of the size of the coupling parameter involved.
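
Writing λ′ for the interbond coupling parameter (a symbol chosen here for illustration; the paper's own notation may differ), the reported relation is approximately

\[ \lvert \lambda'_{\mathrm{CHD}} \rvert \;\approx\; \sqrt{\lambda'_{\mathrm{CH_2}}\,\lambda'_{\mathrm{CD_2}}} . \]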

Relevance: 80.00%

Abstract:

Background: Transcriptomic techniques are now being applied in ecotoxicology and toxicology to measure the impact of stressors and to develop understanding of mechanisms of toxicity. Microarray technology in particular offers the potential to measure thousands of gene responses simultaneously. However, it is important that microarray responses are validated, at least initially, using real-time quantitative polymerase chain reaction (QPCR). The accurate measurement of target gene expression requires normalisation to an invariant internal control, e.g., total RNA or reference genes. Reference genes are preferable, as they control for variation inherent in the cDNA synthesis and PCR. However, reference gene expression can vary between tissues and experimental conditions, which makes it crucial to validate them prior to application. Results: We evaluated 10 candidate reference genes for QPCR in Daphnia magna following a 24 h exposure to the non-steroidal anti-inflammatory drug (NSAID) ibuprofen (IB) at 0, 20, 40 and 80 mg IB l⁻¹. Six of the 10 candidates appeared suitable for use as reference genes. As a robust approach, we used a combination normalisation factor (NF), calculated using the geNorm application, based on the geometric mean of three selected reference genes: glyceraldehyde-3-phosphate dehydrogenase, ubiquitin conjugating enzyme and actin. The effects of normalisation are illustrated using the target gene leukotriene B4 12-hydroxydehydrogenase (Ltb4dh), which was upregulated following 24 h exposure to 63-81 mg IB l⁻¹. Conclusions: As anticipated, use of the NF clarified the response of Ltb4dh in daphnids exposed to sublethal levels of ibuprofen. Our findings emphasise the importance in toxicogenomics of finding and applying invariant internal QPCR controls relevant to the study conditions.
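
The normalisation step described can be sketched as follows, using invented Ct values and an assumed amplification efficiency of 2. The snippet mirrors the geometric-mean idea behind the geNorm normalisation factor but is not the geNorm code itself.

```python
import math

E = 2.0  # assumed amplification efficiency (100%)

def rel_quantity(ct, ct_min):
    """Relative quantity from a Ct value (sample with the lowest Ct for that gene = 1)."""
    return E ** (ct_min - ct)

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical Ct values for two samples, three reference genes and one target gene.
cts = {
    "gapdh":  {"control": 18.2, "exposed": 18.5},
    "ubc":    {"control": 21.5, "exposed": 21.3},
    "actin":  {"control": 17.9, "exposed": 18.1},
    "ltb4dh": {"control": 26.1, "exposed": 24.0},   # target gene
}

samples = ["control", "exposed"]
rq = {g: {s: rel_quantity(cts[g][s], min(cts[g].values())) for s in samples} for g in cts}

# geNorm-style normalisation factor per sample: geometric mean of the reference-gene RQs.
refs = ["gapdh", "ubc", "actin"]
nf = {s: geometric_mean([rq[g][s] for g in refs]) for s in samples}

# Normalised target expression = target RQ divided by the sample's NF.
norm = {s: rq["ltb4dh"][s] / nf[s] for s in samples}
print({s: round(norm[s], 2) for s in samples})
```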

Relevance: 80.00%

Abstract:

A linear trinuclear Ni-Schiff base complex, [Ni3(salpen)2(PhCH2COO)2(EtOH)], has been synthesized by combining Ni(ClO4)2·6H2O, phenylacetic acid (C6H5CH2COOH), and the Schiff base ligand N,N'-bis(salicylidene)-1,3-pentanediamine (H2salpen). The complex self-assembles through hydrogen bonding and C-H···π interactions in the solid state to generate a sheet-like architecture, while in organic solvent (CH2Cl2) it forms vesicles with a mean diameter of 290 nm, and fused vesicles, depending on the concentration of the solution. These vesicles act as an excellent carrier of dye molecules in CH2Cl2. The morphology of the complex has been determined by scanning electron microscopy and transmission electron microscopy, and the encapsulation of dye has been examined by confocal microscopy and electronic absorption spectra.

Relevance: 80.00%

Abstract:

The Water Framework Directive has caused a paradigm shift towards the integrated management of recreational water quality through the development of drainage basin-wide programmes of measures. This has increased the need for a cost-effective diagnostic tool capable of accurately predicting riverine faecal indicator organism (FIO) concentrations. This paper outlines the application of models developed to fulfil this need, which represent the first transferable generic FIO models developed for the UK that incorporate direct measures of key FIO sources (namely human and livestock population data) as predictor variables. We apply a recently developed transfer methodology, which enables the quantification of geometric mean presumptive faecal coliform and presumptive intestinal enterococci concentrations for base- and high-flow conditions during the summer bathing season in unmonitored UK watercourses, to predict FIO concentrations in the Humber river basin district. Because the FIO models incorporate explanatory variables that allow the effects of policy measures influencing livestock stocking rates to be assessed, we carry out an empirical analysis of the differential effects of seven land use management and policy instruments (fiscal constraint, production constraint, cost intervention, area intervention, demand-side constraint, input constraint, and micro-level land use management), all of which can be used to reduce riverine FIO concentrations. This research provides insights into FIO source apportionment, and explores a selection of pollution remediation strategies and the spatial differentiation of land use policies which could be implemented to deliver river quality improvements. All of the policy tools we model reduce FIO concentrations in rivers, but our research suggests that the installation of streamside fencing in intensive milk-producing areas may be the single most effective land management strategy for reducing riverine microbial pollution.
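
The generic models referred to are fitted to log-transformed FIO concentrations, so their back-transformed predictions are geometric means. A hedged sketch of that structure, with purely hypothetical coefficients and predictors (not the paper's fitted model):

```python
import math

# Purely hypothetical coefficients for a log10-linear model of summer base-flow
# presumptive faecal coliform concentrations (cfu per 100 mL) -- illustration only.
INTERCEPT = 1.2
COEF = {"log10_cattle_density": 0.55, "log10_human_population_density": 0.35}

def predict_geometric_mean_fio(cattle_density, human_density):
    """Back-transforming a prediction made on the log10 scale yields a geometric mean."""
    log10_pred = (INTERCEPT
                  + COEF["log10_cattle_density"] * math.log10(cattle_density)
                  + COEF["log10_human_population_density"] * math.log10(human_density))
    return 10 ** log10_pred

print(f"{predict_geometric_mean_fio(cattle_density=45.0, human_density=120.0):.0f} cfu/100 mL")
```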

Relevance: 80.00%

Abstract:

Nanoscale zerovalent iron (nZVI) has potential for the remediation of organochlorine-contaminated environments. Environmental safety concerns associated with in situ deployment of nZVI include potential negative impacts on indigenous microbes whose biodegradative functions could contribute to contaminant remediation. With respect to a two-step polychlorinated biphenyl remediation scenario comprising nZVI dechlorination followed by aerobic biodegradation, we examined the effect of polyacrylic acid (PAA)-coated nZVI (mean diameter = 12.5 nm) applied at 10 g nZVI kg−1 to Aroclor-1242 contaminated and uncontaminated soil over 28 days. nZVI had a limited effect on Aroclor congener profiles, but, either directly or indirectly via changes to soil physico-chemical conditions (pH, Eh), nZVI addition caused perturbation to soil bacterial community composition, and reduced the activity of chloroaromatic mineralizing microorganisms. We conclude that nZVI addition has the potential to inhibit microbial functions that could be important for PCB remediation strategies combining nZVI treatment and biodegradation.

Relevance: 80.00%

Abstract:

Particulate matter generated during cooking has been identified as one of the major problems for indoor air quality and indoor environmental health. Reliable assessment of exposure to cooking-generated particles requires accurate information on emission characteristics, especially the size distribution. This study characterizes the volume/mass-based size distribution of fume particles at the oil-heating stage of typical Chinese-style cooking in a laboratory kitchen. A laser-diffraction size analyzer is applied to measure the volume frequency of fume particles ranging from 0.1 to 10 μm, which contribute most of the mass in PM2.5 and PM10. Measurements show that particle emissions have little dependence on the type of vegetable oil used but are closely related to the heating temperature. It is found that the volume frequency of fume particles in the range 1.0–4.0 μm accounts for nearly 100% of PM0.1–10, with a mode diameter of 2.7 μm, median diameter of 2.6 μm, Sauter mean diameter of 3.0 μm, De Brouckere mean diameter of 3.2 μm, and distribution span of 0.48. Such information on emission characteristics can be used to improve the assessment of indoor air quality due to PM0.1–10 in kitchens and residential flats.
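
The mean diameters quoted are moment ratios of the size distribution: the De Brouckere mean D[4,3] is the volume-weighted mean diameter, and the Sauter mean D[3,2] is total volume divided by total surface area. The sketch below computes both from an invented binned volume-frequency distribution, not the measured one.

```python
# Hypothetical volume-frequency distribution (fraction of total particle volume per bin),
# with bin mid-point diameters in micrometres -- illustrative, not the measured data.
diameters = [1.2, 1.8, 2.4, 3.0, 3.8]       # micrometres
vol_frac  = [0.10, 0.25, 0.35, 0.20, 0.10]  # sums to 1

# De Brouckere mean D[4,3]: volume-weighted mean diameter.
d43 = sum(v * d for v, d in zip(vol_frac, diameters)) / sum(vol_frac)

# Sauter mean D[3,2]: total volume over total surface area, i.e. sum(v) / sum(v / d).
d32 = sum(vol_frac) / sum(v / d for v, d in zip(vol_frac, diameters))

print(f"D[4,3] = {d43:.2f} um, D[3,2] = {d32:.2f} um")
```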

Relevance: 80.00%

Abstract:

Conclusion. Hyperbaric oxygen treatment (HBOT) promoted an increase in the mean axonal diameter in the group evaluated 2 weeks after lesion induction, which suggests a more advanced regeneration process. However, the number of myelinated nerve fibers of the facial nerve of the rabbits was similar in the control and treatment groups at both evaluation periods. Objective. To evaluate the effect of HBOT on the histological pattern of the facial nerve in rabbits exposed to a nerve crush injury. Materials and methods. Twenty rabbits were exposed to facial nerve crush injury. Ten rabbits received HBOT and 10 rabbits comprised the control group. The rabbits were sacrificed 2 and 4 weeks after the trauma. Qualitative morphological analysis, measurement of the external axonal diameters, and myelinated fiber counts were carried out in an area of 185,000 μm². Results. There was an increase in the area of the axons and thicker myelin in the 2-week treatment group in comparison with the control group. The mean diameter of the axons was 2.34 μm in the control group and 2.81 μm in the HBOT group, a statistically significant difference. The 2-week control group had a mean number of myelinated fibers of 186 ± 5.2664, and the HBOT group had a mean of 2026.3 ± 302; this difference was not statistically significant. The 4-week control group presented a mean of 2495.1 ± 479 fibers and the HBOT group a mean of 2359.9 ± 473; this was not statistically significant.