993 results for Restorable load estimation
Abstract:
Summary: Estimation of soil nitrate nitrogen using a simulation model
Abstract:
OBJECTIVES: An article by the Swiss AIDS Commission states that patients with stably suppressed viraemia [i.e. several successive HIV-1 RNA plasma concentrations (viral loads, VL) below the limits of detection during 6 months or more of highly active antiretroviral therapy (HAART)] are unlikely to be infectious. Questions then arise: how reliable is the undetectability of the VL, given the history of measurements? What factors determine reliability? METHODS: We assessed the probability (henceforth termed reliability) that the (n+1)th VL would exceed 50 or 1000 HIV-1 RNA copies/mL when the nth one had been <50 copies/mL in 6168 patients of the Swiss HIV Cohort Study who were continuing to take HAART between 2003 and 2007. General estimating equations were used to analyse potential factors of reliability. RESULTS: With a cut-off at 50 copies/mL, reliability was 84.5% (n=1), increasing to 94.5% (n=5). Compliance, the current type of HAART and the first antiretroviral therapy (ART) received (HAART or not) were predictive factors of reliability. With a cut-off at 1000 copies/mL, reliability was 97.5% (n=1), increasing to 99.1% (n=4). Chart review revealed that patients had stopped their treatment, admitted to major problems with compliance or were taking non-HAART ART in 72.2% of these cases. Viral escape caused by resistance was found in 5.6%. No explanation was found in the charts of 22.2% of cases. CONCLUSIONS: After several successive VLs at <50 copies/mL, reliability reaches approximately 94% with a cut-off of 50 copies/mL and approximately 99% with a cut-off at 1000 copies/mL. Compliance is the most important factor predicting reliability.
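The reliability measure defined in this abstract can be sketched as a plain frequency count: the fraction of times the measurement following n consecutive suppressed values is itself suppressed. This is not the study's generalised-estimating-equation analysis, and the viral-load series below are invented for illustration.

```python
# Empirical estimate of "reliability": P(VL_(n+1) < cutoff | the n
# previous consecutive VLs were < cutoff), counted over a set of
# viral-load series. A simple frequency count, not the GEE model used
# in the study; the patient series are made-up toy data.

def reliability(series_list, n, cutoff=50):
    """Fraction of times the value following n consecutive suppressed
    values (< cutoff) is itself suppressed."""
    hits = trials = 0
    for series in series_list:
        for i in range(len(series) - n):
            window = series[i:i + n]
            if all(v < cutoff for v in window):
                trials += 1
                if series[i + n] < cutoff:
                    hits += 1
    return hits / trials if trials else float("nan")

patients = [
    [40, 20, 20, 20, 20, 20],    # stably suppressed
    [20, 20, 20, 600, 20, 20],   # one blip above the cutoff
    [20, 20, 20, 20, 20, 20],
]
print(round(reliability(patients, n=1), 3))
```

As in the study, reliability computed this way rises as n (the run of suppressed values conditioned on) grows, since longer suppressed runs exclude erratic series.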
Abstract:
A method is proposed for the estimation of absolute binding free energy of interaction between proteins and ligands. Conformational sampling of the protein-ligand complex is performed by molecular dynamics (MD) in vacuo and the solvent effect is calculated a posteriori by solving the Poisson or the Poisson-Boltzmann equation for selected frames of the trajectory. The binding free energy is written as a linear combination of the buried surface upon complexation, SASbur, the electrostatic interaction energy between the ligand and the protein, Eelec, and the difference of the solvation free energies of the complex and the isolated ligand and protein, deltaGsolv. The method uses the buried surface upon complexation to account for the non-polar contribution to the binding free energy because it is less sensitive to the details of the structure than the van der Waals interaction energy. The parameters of the method are developed for a training set of 16 HIV-1 protease-inhibitor complexes of known 3D structure. A correlation coefficient of 0.91 was obtained with an unsigned mean error of 0.8 kcal/mol. When applied to a set of 25 HIV-1 protease-inhibitor complexes of unknown 3D structures, the method provides a satisfactory correlation between the calculated binding free energy and the experimental pIC50 without reparametrization.
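The scoring step described above, a linear combination of the three descriptors, can be sketched as follows. The coefficients and descriptor values are invented placeholders, not the parameters fitted on the 16-complex training set.

```python
# Binding free energy as a linear combination of three descriptors:
# dG_bind = a*SASbur + b*Eelec + c*dGsolv + const   (kcal/mol).
# Coefficients and descriptor values below are illustrative only.

def delta_g_bind(sas_bur, e_elec, dg_solv, a, b, c, const=0.0):
    """Evaluate the linear binding free energy model."""
    return a * sas_bur + b * e_elec + c * dg_solv + const

# Hypothetical descriptors for one protease-inhibitor complex.
dg = delta_g_bind(sas_bur=850.0, e_elec=-45.0, dg_solv=30.0,
                  a=-0.01, b=0.2, c=0.15)
print(round(dg, 2))
```

In practice the coefficients a, b, c would be obtained by least-squares regression of the descriptors against experimental binding free energies for the training set.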
Abstract:
A major issue in the application of waveform inversion methods to crosshole ground-penetrating radar (GPR) data is the accurate estimation of the source wavelet. Here, we explore the viability and robustness of incorporating this step into a recently published time-domain inversion procedure through an iterative deconvolution approach. Our results indicate that, at least in non-dispersive electrical environments, such an approach provides remarkably accurate and robust estimates of the source wavelet even in the presence of strong heterogeneity of both the dielectric permittivity and electrical conductivity. Our results also indicate that the proposed source wavelet estimation approach is relatively insensitive to ambient noise and to the phase characteristics of the starting wavelet. Finally, there appears to be little to no trade-off between the wavelet estimation and the tomographic imaging procedures.
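A single deconvolution step of the kind underlying such source-wavelet estimation can be illustrated with a water-level-regularised spectral division. The naive DFT and the toy traces below are purely for self-containment; they stand in for the modelled impulse response and observed GPR data, and do not reproduce the published iterative procedure.

```python
import cmath

# One deconvolution step for wavelet estimation: given observed data d
# and a modelled impulse response g, recover the wavelet w via the
# regularised spectral division W = D * conj(G) / (|G|^2 + eps).
# Naive O(N^2) DFT; circular convolution is assumed; toy traces only.

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n)) for j in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n)
                for j in range(n)).real / n for k in range(n)]

def deconvolve(d, g, eps=1e-3):
    D, G = dft(d), dft(g)
    W = [Dj * Gj.conjugate() / (abs(Gj) ** 2 + eps)
         for Dj, Gj in zip(D, G)]
    return idft(W)

# Toy check: with a unit-spike impulse response, the "data" equal the
# wavelet, and deconvolution should return it (up to the water level).
w_true = [0.0, 1.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0]
g = [1.0] + [0.0] * 7
w_est = deconvolve(w_true, g)
```

The water level eps plays the stabilising role that makes such estimates robust to noise; an iterative scheme would alternate this step with updates of the modelled impulse response.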
Abstract:
OBJECTIVES: Toll-like receptors (TLRs) are innate immune sensors that are integral to resisting chronic and opportunistic infections. Mounting evidence implicates TLR polymorphisms in susceptibilities to various infectious diseases, including HIV-1. We investigated the impact of TLR single nucleotide polymorphisms (SNPs) on clinical outcome in a seroincident cohort of HIV-1-infected volunteers. DESIGN: We analyzed TLR SNPs in 201 antiretroviral treatment-naive HIV-1-infected volunteers from a longitudinal seroincident cohort with regular follow-up intervals (median follow-up 4.2 years, interquartile range 4.4). Participants were stratified into two groups according to either disease progression, defined as peripheral blood CD4(+) T-cell decline over time, or peak and setpoint viral load. METHODS: Haplotype tagging SNPs from TLR2, TLR3, TLR4, and TLR9 were detected by mass array genotyping, and CD4(+) T-cell counts and viral load measurements were determined prior to antiretroviral therapy initiation. The association of TLR haplotypes with viral load and rapid progression was assessed by multivariate regression models using age and sex as covariates. RESULTS: Two TLR4 SNPs in strong linkage disequilibrium [1063 A/G (D299G) and 1363 C/T (T399I)] were more frequent among individuals with high peak viral load compared with low/moderate peak viral load (odds ratio 6.65, 95% confidence interval 2.19-20.46, P < 0.001; adjusted P = 0.002 for 1063 A/G). In addition, a TLR9 SNP previously associated with slow progression was found less frequently among individuals with high viral setpoint compared with low/moderate setpoint (odds ratio 0.29, 95% confidence interval 0.13-0.65, P = 0.003, adjusted P = 0.04). CONCLUSION: This study suggests a potentially new role for TLR4 polymorphisms in HIV-1 peak viral load and confirms a role for TLR9 polymorphisms in disease progression.
Abstract:
The physical disector is a method of choice for estimating unbiased neuron numbers; nevertheless, calibration is needed to evaluate each counting method. The validity of this method can be assessed by comparing the estimated cell number with the true number determined by a direct counting method in serial sections. We reconstructed one-fifth of rat lumbar dorsal root ganglia taken from two experimental conditions. From each ganglion, images of 200 adjacent semi-thin sections were used to reconstruct a volumetric dataset (stack of voxels). On these stacks the number of sensory neurons was estimated and counted, respectively, by the physical disector and direct counting methods. Also, using the coordinates of nuclei from the direct counting, we simulated, with a Matlab program, disector pairs separated by increasing distances in a ganglion model. The comparison between the results of these approaches clearly demonstrates that the physical disector method provides a valid and reliable estimate of the number of sensory neurons only when the distance between consecutive disector pairs is 60 µm or smaller. In these conditions the error between the results of the physical disector and direct counting does not exceed 6%. In contrast, when the distance between two pairs is larger than 60 µm (70-200 µm) the error increases rapidly to 27%. We conclude that the physical disector method provides a reliable estimate of the number of rat sensory neurons only when the separating distance between consecutive disector pairs is no larger than 60 µm.
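The spacing experiment can be caricatured in one dimension: nuclei are points along the stack axis, a disector pair of height h is placed every delta units, and the total is estimated from the sampled counts. The geometry and parameters below are invented for illustration and much simpler than the ganglion model in the study.

```python
import random

# Toy 1-D disector calibration: points along the stack axis are counted
# inside windows of height h placed every delta units, and the total is
# estimated as count * delta / h. Illustrative parameters only.

def disector_estimate(z_coords, depth, h, delta, offset=0.0):
    count = 0
    z = offset
    while z + h <= depth:
        count += sum(1 for p in z_coords if z <= p < z + h)
        z += delta
    return count * delta / h

random.seed(1)
depth = 1000.0                 # stack depth, arbitrary units
nuclei = [random.uniform(0, depth) for _ in range(500)]

for delta in (20.0, 60.0, 200.0):
    print(delta, disector_estimate(nuclei, depth, h=10.0, delta=delta))
```

The estimator is unbiased on average for any spacing, but widening delta leaves fewer sampled windows, so single-run estimates scatter further from the true count, which is the qualitative effect the calibration quantifies.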
Abstract:
One-hundred-year extreme daily precipitation was estimated from Gumbel analyses and seven empirical formulas applied to rainfall series at 151 locations in Switzerland for two 50-year periods. These estimates were compared with the maximum daily values measured over the last 100 years (1911-2010) in order to test the performance of the seven formulas. This comparison reveals that the Weibull formula would be the best for estimating 100-year daily precipitation from the 1961-2010 rainfall series, but the worst for the 1911-1960 series; the Hazen formula would be the most effective for the latter period. These differences in performance between the empirical formulas for the two periods studied result from the increase in maximum measured daily precipitation from 1911 to 2010 at 90% of the stations in Switzerland. However, the differences between the extreme rainfall estimates obtained with the seven empirical formulas do not exceed 6% on average.
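The two plotting-position formulas named above assign a return period to the m-th largest value in an n-year record: T = (n+1)/m for Weibull and T = n/(m-0.5) for Hazen. The annual-maximum series below is invented; extrapolating to the 100-year event from a 50-year record additionally requires fitting a distribution such as the Gumbel law, as the abstract describes.

```python
# Weibull and Hazen plotting positions: return period T assigned to the
# m-th largest of n annual maxima. The rainfall series is invented.

def return_periods(annual_maxima, formula="weibull"):
    n = len(annual_maxima)
    ranked = sorted(annual_maxima, reverse=True)
    out = []
    for m, value in enumerate(ranked, start=1):
        t = (n + 1) / m if formula == "weibull" else n / (m - 0.5)
        out.append((value, t))
    return out

maxima = [82, 95, 71, 110, 88, 76, 99, 105, 68, 90]   # mm/day, invented
for value, t in return_periods(maxima, "hazen")[:3]:
    print(value, round(t, 1))
```

Note that Hazen assigns the record maximum a longer return period than Weibull (2n years versus n+1), which is one reason the two formulas rank differently depending on whether the record already contains unusually large maxima.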
Abstract:
Captan and folpet are two fungicides largely used in agriculture, but biomonitoring data are mostly limited to measurements of captan metabolite concentrations in spot urine samples of workers, which complicates interpretation of results in terms of internal dose estimation, daily variations according to tasks performed, and the most plausible routes of exposure. This study aimed at performing repeated biological measurements of exposure to captan and folpet in field workers (i) to better assess internal dose along with main routes-of-entry according to tasks and (ii) to establish the most appropriate sampling and analysis strategies. The detailed urinary excretion time courses of specific and non-specific biomarkers of exposure to captan and folpet were established in tree farmers (n = 2) and grape growers (n = 3) over a typical workweek (seven consecutive days), including spraying and harvest activities. The impact of the expression of urinary measurements [excretion rate values adjusted or not for creatinine, or cumulative amounts over given time periods (8, 12, and 24 h)] was evaluated. Absorbed doses and main routes-of-entry were then estimated from the 24-h cumulative urinary amounts through the use of a kinetic model. The time courses showed that exposure levels were higher during spraying than harvest activities. Model simulations also suggested a limited absorption in the studied workers and an exposure mostly through the dermal route. They further pointed out the advantage of expressing biomarker values in terms of body weight-adjusted amounts in repeated 24-h urine collections as compared to concentrations or excretion rates in spot samples, without the necessity for creatinine corrections.
Abstract:
Human-induced habitat fragmentation constitutes a major threat to biodiversity. Both genetic and demographic factors combine to drive small and isolated populations into extinction vortices. Nevertheless, the deleterious effects of inbreeding and drift load may depend on population structure, migration patterns, and mating systems and are difficult to predict in the absence of crossing experiments. We performed stochastic individual-based simulations aimed at predicting the effects of deleterious mutations on population fitness (offspring viability and median time to extinction) under a variety of settings (landscape configurations, migration models, and mating systems) on the basis of easy-to-collect demographic and genetic information. Pooling all simulations, a large part (70%) of variance in offspring viability was explained by a combination of genetic structure (F(ST)) and within-deme heterozygosity (H(S)). A similar part of variance in median time to extinction was explained by a combination of local population size (N) and heterozygosity (H(S)). In both cases the predictive power increased above 80% when information on mating systems was available. These results provide robust predictive models to evaluate the viability prospects of fragmented populations.
Abstract:
In this report, sixteen secondary and primary bridge standards for two types of bridges are rated for AASHTO HS20-44 vehicle configuration utilizing Load Factor methodology. The ratings apply only to those bridges which: (1) are built according to the applicable bridge standard plans, (2) have no structural deterioration or damage, and (3) have no added wearing surface in excess of one-half inch integral wearing surface.
Abstract:
In this report, 25 secondary bridge standards for three types of bridges are rated for the AASHTO HS20-44 vehicle configuration and five typical Iowa legal vehicles. The ratings apply only to those bridges which: (1) are built according to the applicable bridge standard plans, (2) have no structural deterioration or damage, and (3) have no added wearing surface in excess of 0.5-in. (1.27-cm) integral wearing surface. Appendix A contains the results of the original October 1982 report on load ratings for standard bridges.
Abstract:
Each year several prestressed concrete girder bridges in Iowa and other states are struck and damaged by vehicles with loads too high to pass under the bridge. Whether or not intermediate diaphragms play a significant role in reducing the effect of these unusual loading conditions has often been a topic of discussion. A study of the effects of the type and location of intermediate diaphragms in prestressed concrete girder bridges when the bridge girder flanges were subjected to various levels of vertical and horizontal loading was undertaken. The purpose of the research was to determine whether steel diaphragms of any conventional configuration can provide adequate protection to minimize the damage to prestressed concrete girders caused by lateral loads, similar to the protection provided by the reinforced concrete intermediate diaphragms presently being used by the Iowa Department of Transportation. The research program conducted and described in this report included the following: A comprehensive literature search and survey questionnaire were undertaken to define the state-of-the-art in the use of intermediate diaphragms in prestressed concrete girder bridges. A full-scale, simple-span, prestressed concrete girder bridge model containing three beams was constructed and tested with several types of intermediate diaphragms located at the one-third points of the span or at the mid-span. Analytical studies involving a three-dimensional finite element analysis model were used to provide additional information on the behavior of the experimental bridge. The performance of the bridge with no intermediate diaphragms was quite different from that with intermediate diaphragms in place. All intermediate diaphragms tested had some effect in distributing the loads to the slab and other girders, although some diaphragm types performed better than others.
The research conducted has indicated that the replacement of the reinforced concrete intermediate diaphragms currently being used in Iowa with structural steel diaphragms may be possible.
Abstract:
As a thorough aggregation of probability and graph theory, Bayesian networks currently enjoy widespread interest as a means for studying factors that affect the coherent evaluation of scientific evidence in forensic science. Paper I of this series of papers intends to contribute to the discussion of Bayesian networks as a framework that is helpful for both illustrating and implementing statistical procedures that are commonly employed for the study of uncertainties (e.g. the estimation of unknown quantities). While the respective statistical procedures are widely described in the literature, the primary aim of this paper is to offer an essentially non-technical introduction on how interested readers may use these analytical approaches - with the help of Bayesian networks - for processing their own forensic science data. Attention is mainly drawn to the structure and underlying rationale of a series of basic and context-independent network fragments that users may incorporate as building blocks while constructing larger inference models. As an example of how this may be done, the proposed concepts will be used in a second paper (Part II) for specifying graphical probability networks whose purpose is to assist forensic scientists in the evaluation of scientific evidence encountered in the context of forensic document examination (i.e. results of the analysis of black toners present on printed or copied documents).
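The smallest possible such network fragment has two nodes, a hypothesis H pointing to evidence E, and its inference step is just Bayes' rule. The probabilities and the forensic proposition below are invented placeholders, not values from the paper.

```python
# Minimal two-node fragment (hypothesis H -> evidence E) of the kind
# used as a building block in larger inference networks: the posterior
# P(H | E) follows from the prior and the conditional probability
# table by Bayes' rule. All probabilities here are invented.

def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    num = p_e_given_h * prior_h
    den = num + p_e_given_not_h * (1 - prior_h)
    return num / den

# e.g. H: "the toner on the questioned document came from printer X",
#      E: observed toner analysis result (hypothetical numbers).
p = posterior(prior_h=0.5, p_e_given_h=0.8, p_e_given_not_h=0.1)
print(round(p, 3))
```

Larger networks chain many such fragments, with evidence propagated between them by the same conditioning operation; dedicated libraries automate that propagation, but each local update reduces to this calculation.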