Abstract:
Aims: The adaptive immune response against hepatitis C virus (HCV) is significantly shaped by the host's composition of HLA alleles. Thus, the HLA phenotype is a critical determinant of viral evolution under adaptive immune pressure. Potential associations of HLA class I alleles with polymorphisms of HCV immune escape variants are largely unknown. Methods: We performed direct sequence analysis of the genes encoding the HCV proteins E2, NS3 and NS5B in a cohort of 159 patients with chronic HCV genotype 1 infection who were treated with pegylated interferon alfa-2b and ribavirin for 48 weeks in a prospective controlled trial. HLA class I genotyping was performed by strand-specific reverse hybridization with the INNO-LiPA line probe assays for HLA-A and HLA-B and by sequence-specific primer PCR (PCR-SSP). We analyzed each amino acid position of the HCV proteins for associations with HLA alleles using an extension of Fisher's exact test. In addition, associations of specific HLA alleles with inflammatory activity, liver fibrosis, HCV RNA viral load and virologic treatment outcome were investigated. Results: Separate analyses of HCV subtype 1a and 1b isolates revealed substantially different patterns of HLA-restricted polymorphisms between subtypes. Only one polymorphism within NS5B (V2758x) was significantly associated with HLA-B*15 in HCV genotype 1b infected patients (adjusted p=0.048). However, a number of HLA class I-restricted polymorphisms within novel putative HCV CD8+ T cell epitopes (genotype 1a: HLA-A*11 GTRTIASPK1086-1094 [NS3], HLA-B*07 WPAPQGARSL1111-1120 [NS3]; genotype 1b: HLA-A*24 HYAPRPCGI488-496 [E2], HLA-B*44 GENETDVLL530-538 [E2], HLA-B*15 RVFTEAMTRY2757-2766 [NS5B]) were observed, with high predicted epitope binding scores (>21) assessed by the web-based software SYFPEITHI. Most of the identified putative epitopes overlapped with previously published epitopes, indicating high immunogenicity of the corresponding HCV protein regions.
In addition, certain HLA class I alleles were associated with inflammatory activity, stage of liver fibrosis, and sustained virologic response to antiviral therapy. Conclusions: HLA class I-restricted HCV sequence polymorphisms are rare. The HCV polymorphisms identified within putative HCV CD8+ T cell epitopes in the present study differ in their genomic distribution between genotype 1a and 1b isolates, implying divergent adaptation to the host's immune pressure at the HCV subtype level.
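The per-position association analysis described above reduces, at each amino-acid position, to testing a contingency table of HLA carrier status against residue identity. A minimal sketch of the basic two-sided Fisher's exact test (the study used an extension of it; the function name and example counts here are illustrative):

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]].

    Here rows could be HLA-allele carriers / non-carriers and columns
    variant / consensus residue at one amino-acid position. The p-value
    sums the probabilities of all tables with the same margins that are
    no more likely than the observed one."""
    row1, row2, col1 = a + b, c + d, a + c
    total = comb(row1 + row2, col1)

    def pmf(x):
        # hypergeometric probability that x carriers show the variant
        return comb(row1, x) * comb(row2, col1 - x) / total

    p_obs = pmf(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(pmf(x) for x in range(lo, hi + 1)
               if pmf(x) <= p_obs * 1.0000001)
```

A strongly skewed table such as [[8, 2], [1, 9]] yields a small p-value, while a perfectly balanced table yields p = 1; a real analysis would additionally adjust for testing many positions, as the "adjusted p" above implies.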
Abstract:
Selective pressures related to gene function and chromosomal architecture act on genome sequences and can be revealed, for instance, by appropriate genometric methods. Cumulative nucleotide skew analyses, i.e., GC, TA, and ORF orientation skews, predict the location of the origin of DNA replication for 88 out of 100 completely sequenced bacterial chromosomes. These methods appear fully reliable for proteobacteria, Gram-positives, and spirochetes, as well as for euryarchaeotes. Based on this genome architecture information, coorientation analyses reveal that in prokaryotes, the ribosomal RNA (rRNA) genes encoding the small and large ribosomal subunits are all transcribed in the same direction as DNA replication; that is, they are located along the leading strand. This result offers a simple and reliable method for circumscribing the region containing the origin of DNA replication and reveals a strong selective pressure acting on the orientation of rRNA genes, similar to the weaker one acting on the orientation of ORFs. The rate of coorientation of transfer RNA (tRNA) genes with DNA replication appears to be taxon-specific. Analyzing nucleotide biases such as the GC and TA skews of genes and plotting one against the other reveals a taxonomic clustering of species. All ribosomal RNA genes are enriched in Gs and depleted in Cs, the only exception known so far being the rRNA genes of deuterostomian mitochondria. However, this exception can be explained by the fact that in the chromosome of the human mitochondrion, the model of the deuterostomian organelle genome, DNA replication and rRNA transcription proceed in opposite directions. A general rule is deduced from prokaryotic and mitochondrial genomes: ribosomal RNA genes that are transcribed in the same direction as DNA replication are enriched in Gs, and those transcribed in the opposite direction are depleted in Gs.
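The cumulative skew analysis described above can be sketched in a few lines: walk along the sequence, add +1 for each G and -1 for each C, and take the extremum of the running total as the predicted origin. A minimal illustration (function names are ours; real analyses use complete chromosomes and often combine GC, TA and ORF-orientation skews):

```python
def cumulative_gc_skew(sequence):
    """Cumulative GC skew: +1 for each G, -1 for each C, accumulated along
    the sequence. In many bacterial chromosomes the global minimum of this
    curve marks the origin of replication and the maximum the terminus."""
    total, curve = 0, []
    for base in sequence.upper():
        if base == "G":
            total += 1
        elif base == "C":
            total -= 1
        curve.append(total)
    return curve

def predict_origin(sequence):
    """Index of the cumulative-skew minimum (predicted replication origin)."""
    curve = cumulative_gc_skew(sequence)
    return min(range(len(curve)), key=curve.__getitem__)

# Toy example: C-rich lagging-strand stretch followed by a G-rich
# leading-strand stretch; the skew minimum sits at the turning point.
origin = predict_origin("CCCCGGGG")  # -> index 3
```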
Abstract:
Introduction: Neuroimaging of the self has focused on high-level mechanisms such as language, memory or imagery of the self. Recent evidence suggests that low-level mechanisms of multisensory and sensorimotor integration may play a fundamental role in encoding self-location and the first-person perspective (Blanke and Metzinger, 2009). Neurological patients with out-of-body experiences (OBE) suffer from abnormal self-location and first-person perspective due to damage to the temporo-parietal junction (Blanke et al., 2004). Although self-location and the first-person perspective can be studied experimentally (Lenggenhager et al., 2009), the neural underpinnings of self-location have yet to be investigated. To investigate the brain network involved in self-location and the first-person perspective we used visuo-tactile multisensory conflict, magnetic resonance (MR)-compatible robotics, and fMRI in study 1, and lesion analysis in a sample of 9 patients with OBE due to focal brain damage in study 2. Methods: Twenty-two participants saw a video showing either a person's back or an empty room being stroked (visual stimuli) while the MR-compatible robotic device stroked their back (tactile stimulation). The direction and speed of the seen stroking could either correspond (synchronous) or not (asynchronous) to those of the felt stroking. Each run comprised the four conditions of a 2x2 factorial design with Object (Body, No-Body) and Synchrony (Synchronous, Asynchronous) as main factors. Self-location was estimated using the mental ball dropping task (MBD; Lenggenhager et al., 2009). After the fMRI session participants completed a 6-item questionnaire adapted from the original questionnaire created by Botvinick and Cohen (1998) and based on questions and data obtained by Lenggenhager et al. (2007, 2009). They were also asked to complete a questionnaire to disclose the perspective they adopted during the illusion.
Response times (RTs) for the MBD and fMRI data were analyzed with a 3-way mixed-model ANOVA with the between-subjects factor Perspective (up, down) and the two within-subjects factors Object (body, no-body) and Stroking (synchronous, asynchronous). Quantitative lesion analysis was performed using MRIcron (Rorden et al., 2007). We compared the distribution of brain lesions confirmed by multimodality imaging (Knowlton, 2004) in patients with OBE with that in patients showing complex visual hallucinations involving people or faces, but without any disturbance of self-location and first-person perspective. Nine patients with OBE were investigated. The control group comprised 8 patients. Structural imaging data were available for normalization and co-registration in all patients. Normalization of each patient's lesion into the common MNI (Montreal Neurological Institute) reference space permitted simple, voxel-wise, algebraic comparisons to be made. Results: Although all participants were lying on their backs in the scanner and facing upwards, analysis of perspective showed that half of the participants had the impression of looking down at the virtual human body below them, despite the cues about their actual body position (Down-group). The other participants had the impression of looking up at the virtual body above them (Up-group). Analysis of Q3 ("How strong was the feeling that the body you saw was you?") indicated stronger self-identification with the virtual body during synchronous stroking. RTs in the MBD task confirmed these subjective data (significant 3-way interaction between perspective, object and stroking).
fMRI results showed eight cortical regions where the BOLD signal was significantly different during at least one of the conditions resulting from the combination of Object and Stroking, relative to baseline: right and left temporo-parietal junction, right EBA, left middle occipito-temporal gyrus, left postcentral gyrus, right medial parietal lobe, and bilateral medial occipital lobe (Fig 1). The activation patterns in the right and left temporo-parietal junction and right EBA reflected changes in self-location and perspective, as revealed by statistical analysis performed on the percentage of BOLD change with respect to baseline. Statistical lesion overlap comparison (using nonparametric voxel-based lesion-symptom mapping) with respect to the control group revealed the right temporo-parietal junction, centered at the angular gyrus (Talairach coordinates x = 54, y = -52, z = 26; p<0.05, FDR corrected). Conclusions: The present questionnaire and behavioural results show that, despite the noisy and constraining MR environment, our participants had predictable changes in self-location, self-identification, and first-person perspective when robotic tactile stroking was applied synchronously with the stroking seen in the video. fMRI data in healthy participants and lesion data in patients with abnormal self-location and first-person perspective jointly revealed that the temporo-parietal cortex, especially in the right hemisphere, encodes these conscious experiences. We argue that temporo-parietal activity reflects the experience of the conscious "I" as embodied and localized within bodily space.
Abstract:
Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value depends inversely on the size of the propensities of the different reaction channels and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the reaction channels. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, with counts drawn from a Poisson or binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
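The basic Poisson τ-leap step that the paper generalises can be sketched as follows: for each reaction channel j, draw a Poisson number of firings K_j with mean a_j(x)·τ and apply all the stoichiometric changes at once. A minimal sketch (the reversible isomerization example and its rate constants are illustrative, and Knuth's simple sampler stands in for a library Poisson generator):

```python
import math
import random

rng = random.Random(42)

def poisson(lam):
    """Knuth's simple Poisson sampler (adequate for the small means here)."""
    limit, k, prod = math.exp(-lam), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def tau_leap_step(x, tau, propensities, stoichiometry):
    """One Poisson tau-leap step: channel j fires K_j ~ Poisson(a_j(x)*tau)
    times within the step; the state is updated by the summed changes."""
    fires = [poisson(a(x) * tau) for a in propensities]
    return [xi + sum(k * nu[i] for k, nu in zip(fires, stoichiometry))
            for i, xi in enumerate(x)]

# Illustrative system: reversible isomerization A <-> B with rates c1, c2.
c1, c2 = 1.0, 0.5
propensities = [lambda x: c1 * x[0], lambda x: c2 * x[1]]
stoichiometry = [(-1, +1), (+1, -1)]  # A -> B, and B -> A

state = [100, 0]
for _ in range(200):                  # 200 leaps of tau = 0.01
    state = tau_leap_step(state, 0.01, propensities, stoichiometry)
```

Each leap replaces many single-firing SSA steps; the RK extension in the paper keeps this one-set-of-Poisson-draws-per-step structure while improving the variance behaviour for larger τ.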
Abstract:
Reliable estimates of heavy-truck volumes are important in a number of transportation applications. Estimates of truck volumes are necessary for pavement design and pavement management. Truck volumes are important in traffic safety. The number of trucks on the road also influences roadway capacity and traffic operations. Additionally, heavy vehicles pollute at higher rates than passenger vehicles, so reliable estimates of heavy-truck vehicle miles traveled (VMT) are important in creating accurate inventories of on-road emissions. This research evaluated three different methods to calculate heavy-truck annual average daily traffic (AADT), which can subsequently be used to estimate VMT. Traffic data from continuous count stations provided by the Iowa DOT were used to estimate AADT for two different truck groups (single-unit and multi-unit) using the three methods. The first method developed monthly and daily expansion factors for each truck group. The second and third methods created general expansion factors for all vehicles. The accuracy of the three methods was compared using n-fold cross-validation: the data are split into n partitions, and each partition in turn is used to validate estimates derived from the remaining data. The prediction error was determined by averaging the squared error between the estimated AADT and the actual AADT. Overall, the prediction error was lowest for the method that developed expansion factors separately for the different truck groups, for both single- and multi-unit trucks. This indicates that using expansion factors specific to heavy trucks yields better estimates of AADT, and subsequently VMT, than using aggregate expansion factors and applying a percentage of trucks. Monthly, daily, and weekly traffic patterns were also evaluated.
Significant variation exists in the temporal and seasonal patterns of heavy trucks as compared to passenger vehicles. This suggests that the use of aggregate expansion factors fails to adequately describe truck travel patterns.
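The expansion-factor method evaluated above scales a short-duration count up to AADT by a factor that corrects for when the count was taken. A minimal sketch of monthly factors computed from a continuous count station (function names and the grouping are illustrative; the study also used daily factors and truck-group-specific factors):

```python
def expansion_factors(daily_counts):
    """Monthly expansion factors from a year of daily counts at a continuous
    count station: factor_m = AADT / (mean daily traffic in month m).
    `daily_counts` maps (month, day) -> vehicles counted that day."""
    aadt = sum(daily_counts.values()) / len(daily_counts)
    by_month = {}
    for (month, _day), count in daily_counts.items():
        by_month.setdefault(month, []).append(count)
    return {m: aadt / (sum(c) / len(c)) for m, c in by_month.items()}

def estimate_aadt(short_count, month, factors):
    """Expand a single short-duration daily count into an AADT estimate."""
    return short_count * factors[month]

# Illustrative data: a low-traffic winter month and a busier summer month,
# so the winter factor is > 1 and the summer factor is < 1.
counts = {(1, 1): 80, (1, 2): 80, (7, 1): 120, (7, 2): 120}
factors = expansion_factors(counts)
```

Cross-validation as described above would hold out one station (or partition) at a time, expand its short counts with factors built from the remaining data, and average the squared error against the known AADT.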
Abstract:
1. Entomopathogenic nematodes can function as an indirect defence for plants that are attacked by root herbivores. By releasing volatile organic compounds (VOCs), plants signal the presence of host insects and thereby attract nematodes. 2. Nonetheless, how roots deploy indirect defences, how indirect defences relate to direct defences, and the ecological consequences of root defence allocation for herbivores and plant biomass are essentially unknown. 3. We investigated a natural below-ground tritrophic system, involving common milkweed, a specialist root-boring beetle and entomopathogenic nematodes, and asked whether there is a negative genetic correlation between direct defences (root cardenolides) and indirect defences (emission of volatiles in the roots and nematode attraction), and between constitutive and inducible defences. 4. Root volatiles were analysed using two distinct sampling methods. First, we collected emissions from living Asclepias syriaca roots by dynamic headspace sampling. This method showed that attacked A. syriaca plants emit five times higher levels of volatiles than control plants. Secondly, we used a solid-phase micro-extraction (SPME) method to sample the full pool of volatiles in roots for genetic correlations of volatile biosynthesis. 5. Field experiments showed that entomopathogenic nematodes prevent the loss of biomass to root herbivory. Additionally, suppression of root herbivores was mediated directly by cardenolides and indirectly by the attraction of nematodes. Genetic families of plants with high cardenolides benefited less from nematodes compared to low-cardenolide families, suggesting that direct and indirect defences may be redundant. Although constitutive and induced root defences traded off within each strategy (for both direct and indirect defence, i.e. cardenolides and VOCs, respectively), we found no trade-off between the two strategies. 6. Synthesis.
Constitutive expression and inducibility of defences may trade off because of resource limitation or because they are redundant. Direct and indirect defences do not trade off, likely because they may not share a limiting resource and because independently they may promote defence across the patchiness of herbivore attack and nematode presence in the field. Indeed, some redundancy in strategies may be necessary to increase effective defence, but for each strategy, an economy of deployment reduces overall costs.
Abstract:
This review paper reports the consensus of a technical workshop hosted by the European network, NanoImpactNet (NIN). The workshop aimed to review the collective experience of working at the bench with manufactured nanomaterials (MNMs), and to recommend modifications to existing experimental methods and OECD protocols. Current procedures for cleaning glassware are appropriate for most MNMs, although interference with electrodes may occur. Maintaining exposure is more difficult with MNMs compared to conventional chemicals. A metal salt control is recommended for experiments with metallic MNMs that may release free metal ions. Dispersing agents should be avoided, but if they must be used, then natural or synthetic dispersing agents are possible, and dispersion controls essential. Time constraints and technology gaps indicate that full characterisation of test media during ecotoxicity tests is currently not practical. Details of electron microscopy, dark-field microscopy, a range of spectroscopic methods (EDX, XRD, XANES, EXAFS), light scattering techniques (DLS, SLS) and chromatography are discussed. The development of user-friendly software to predict particle behaviour in test media according to DLVO theory is in progress, and simple optical methods are available to estimate the settling behaviour of suspensions during experiments. However, for soil matrices such simple approaches may not be applicable. Alternatively, a Critical Body Residue approach may be taken in which body concentrations in organisms are related to effects, and toxicity thresholds derived. For microbial assays, the cell wall is a formidable barrier to MNMs and end points that rely on the test substance penetrating the cell may be insensitive. Instead assays based on the cell envelope should be developed for MNMs. In algal growth tests, the abiotic factors that promote particle aggregation in the media (e.g. 
ionic strength) are also important in providing nutrients, and manipulation of the media to control the dispersion may also inhibit growth. Controls to quantify shading effects, and precise details of lighting regimes, shaking or mixing should be reported in algal tests. Photosynthesis may be more sensitive than traditional growth end points for algae and plants. Tests with invertebrates should consider non-chemical toxicity from particle adherence to the organisms. The use of semi-static exposure methods with fish can reduce the logistical issues of waste water disposal and facilitate aspects of animal husbandry relevant to MNMs. There are concerns that the existing bioaccumulation tests are conceptually flawed for MNMs and that new test(s) are required. In vitro testing strategies, as exemplified by genotoxicity assays, can be modified for MNMs, but the risk of false negatives in some assays is highlighted. In conclusion, most protocols will require some modifications, and recommendations are made to aid the researcher at the bench.
Abstract:
Background: The objective of the present study was to compare three different sampling and questionnaire administration methods used in the international KIDSCREEN study in terms of participation, response rates, and external validity. Methods: Children and adolescents aged 8–18 years were surveyed in 13 European countries using either telephone sampling and mail administration, random sampling of school listings followed by classroom or mail administration, or multistage random sampling of communities and households with self-administration of the survey materials at home. Cooperation, completion, and response rates were compared across countries and survey methods. Data on non-respondents were collected in 8 countries. The population fraction (PF, respondents in each sex-age or educational level category, divided by the population in the same category from Eurostat census data) and population fraction ratio (PFR, ratio of PF) and their corresponding 95% confidence intervals were used to analyze differences by country between the KIDSCREEN samples and a reference Eurostat population. Results: Response rates by country ranged from 18.9% to 91.2%. Response rates were highest in the school-based surveys (69.0%–91.2%). Sample proportions by age and gender were similar to the reference Eurostat population in most countries, although boys and adolescents were slightly underrepresented (PFR <1). Parents in lower educational categories were less likely to participate (PFR <1 in 5 countries). Parents in higher educational categories were overrepresented when the school and household sampling strategies were used (PFR = 1.78–2.97). Conclusion: School-based sampling achieved the highest overall response rates but also produced slightly more biased samples than the other methods. The results suggest that the samples were sufficiently representative to provide reference population values for the KIDSCREEN instrument.
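The PF/PFR comparison above can be sketched by treating the PFR for a category as its share among respondents divided by its share in the census population. The log-normal confidence interval shown is a standard approximation for a ratio of proportions and is our assumption, not necessarily the method used in the KIDSCREEN analysis:

```python
import math

def population_fraction_ratio(n_cat, n_total, pop_cat, pop_total):
    """PFR for one category: the category's share among respondents divided
    by its share in the reference census population. PFR < 1 means the
    category is underrepresented in the sample.

    The 95% CI uses a log-normal approximation for a ratio of proportions
    (an assumption; the original study may have used another method)."""
    p_sample = n_cat / n_total
    p_pop = pop_cat / pop_total
    pfr = p_sample / p_pop
    se_log = math.sqrt((1 - p_sample) / n_cat + (1 - p_pop) / pop_cat)
    lo = pfr * math.exp(-1.96 * se_log)
    hi = pfr * math.exp(+1.96 * se_log)
    return pfr, lo, hi

# Illustrative: a category making up 40% of respondents but 50% of the
# census population is underrepresented (PFR = 0.8).
pfr, lo, hi = population_fraction_ratio(400, 1000, 500, 1000)
```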
Abstract:
The pseudo-spectral time-domain (PSTD) method is an alternative time-marching method to classical leapfrog finite difference schemes in the simulation of wave-like propagating phenomena. It is based on the fundamentals of the Fourier transform to compute the spatial derivatives of hyperbolic differential equations. Therefore, it results in an isotropic operator that can be implemented in an efficient way for room acoustics simulations. However, one of the first issues to be solved consists in modeling wall absorption. Unfortunately, there are no references in the technical literature concerning that problem. In this paper, assuming real and constant locally reacting impedances, several proposals to overcome this problem are presented, validated and compared to analytical solutions in different scenarios.
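The core operation of the PSTD method described above is the Fourier-based spatial derivative: transform the sampled field, multiply each coefficient by i·k, and transform back. A minimal one-dimensional sketch using a naive O(N²) DFT for clarity (a real solver would use an FFT and embed this in a time-marching loop with boundary treatment):

```python
import cmath
import math

def spectral_derivative(samples):
    """Differentiate a periodic signal sampled at N uniform points over
    [0, 2*pi) by multiplying its DFT coefficients by i*k, the core step of
    pseudo-spectral (PSTD) schemes. Naive O(N^2) DFT for clarity."""
    n = len(samples)
    # forward DFT (with 1/n normalization)
    coeffs = [sum(samples[j] * cmath.exp(-2j * math.pi * k * j / n)
                  for j in range(n)) / n
              for k in range(n)]
    # signed wavenumbers: 0..n/2, then negative frequencies
    ks = [k if k <= n // 2 else k - n for k in range(n)]
    if n % 2 == 0:
        ks[n // 2] = 0   # zero the unpaired Nyquist mode for a real signal
    deriv_coeffs = [1j * k * c for k, c in zip(ks, coeffs)]
    # inverse DFT, keeping the real part
    return [sum(deriv_coeffs[k] * cmath.exp(2j * math.pi * k * j / n)
                for k in range(n)).real
            for j in range(n)]
```

For band-limited signals this derivative is exact to machine precision, which is the source of the method's isotropy and low dispersion compared with leapfrog finite differences.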
Abstract:
Background: The cooperative interaction between transcription factors has a decisive role in the control of the fate of the eukaryotic cell. Computational approaches for characterizing cooperative transcription factors in yeast, however, are based on different rationales and provide a low overlap between their results. Because the wealth of information contained in protein interaction networks and regulatory networks has proven highly effective in elucidating functional relationships between proteins, we compared different sets of cooperative transcription factor pairs (predicted by four different computational methods) within the frame of those networks. Results: Our results show that the overlap between the sets of cooperative transcription factors predicted by the different methods is low yet significant. Cooperative transcription factors predicted by all methods are closer and more clustered in the protein interaction network than expected by chance. On the other hand, members of a cooperative transcription factor pair neither seemed to regulate each other nor shared similar regulatory inputs, although they do regulate similar groups of target genes. Conclusion: Despite the different definitions of transcriptional cooperativity and the different computational approaches used to characterize cooperativity between transcription factors, the analysis of their roles in the framework of the protein interaction network and the regulatory network indicates a common denominator for the predictions under study. The knowledge of the shared topological properties of cooperative transcription factor pairs in both networks can be useful not only for designing better prediction methods but also for better understanding the complexities of transcriptional control in eukaryotes.
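The finding that predicted cooperative pairs are "closer in the protein interaction network than expected by chance" can be made concrete by comparing shortest-path distances. A minimal sketch with BFS on a toy adjacency-dict graph (the graph and pairs are illustrative; significance would come from comparing against randomly drawn transcription factor pairs):

```python
from collections import deque

def shortest_path_length(graph, start, goal):
    """BFS shortest-path length in an unweighted interaction network given
    as an adjacency dict (node -> set of neighbours); None if unreachable."""
    if start == goal:
        return 0
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, dist = frontier.popleft()
        for neighbour in graph[node]:
            if neighbour == goal:
                return dist + 1
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append((neighbour, dist + 1))
    return None

def mean_pair_distance(graph, pairs):
    """Mean network distance over node pairs, ignoring unreachable ones."""
    dists = [shortest_path_length(graph, a, b) for a, b in pairs]
    dists = [d for d in dists if d is not None]
    return sum(dists) / len(dists)

# Toy protein-interaction network (illustrative, not a real yeast network):
# a simple chain A - B - C - D.
ppi = {"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C"}}
```

Predicted cooperative pairs with a smaller mean distance than randomly sampled pairs would support the clustering result reported above.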
Abstract:
Background: The aim of this report is to describe the main characteristics of the design, including response rates, of the Cornella Health Interview Survey Follow-up Study. Methods: The original cohort consisted of 2,500 subjects (1,263 women and 1,237 men) interviewed as part of the 1994 Cornella Health Interview Study. A record linkage to update the address and vital status of the cohort members was carried out using first a deterministic method and then a probabilistic one based on each subject's first name and surnames. Subsequently, we attempted to locate the cohort members to conduct the phone follow-up interviews. A pilot study was carried out to test overall feasibility and to modify some procedures before the field work began. Results: After record linkage, 2,468 (98.7%) subjects were successfully traced. Of these, 91 (3.6%) were deceased, 259 (10.3%) had moved to other towns, and 50 (2.0%) had neither renewed their last municipal census documents nor declared having moved. After using different strategies to track and retain cohort members, we traced 92% of the CHIS participants, of whom 1,605 subjects answered the follow-up questionnaire. Conclusion: The computerized record linkage maximized the success of the follow-up, which was carried out 7 years after the baseline interview. The pilot study was useful for increasing the efficiency of tracing and interviewing the respondents.
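The two-pass linkage described above (deterministic first, then probabilistic) can be sketched as follows; the string-similarity score and threshold are illustrative stand-ins for a real probabilistic linkage model, and all names are invented:

```python
from difflib import SequenceMatcher

def normalise(name):
    """Lowercase and collapse whitespace before comparing names."""
    return " ".join(name.lower().split())

def link_record(subject, registry, threshold=0.85):
    """Two-pass record linkage: first a deterministic pass (exact match on
    the normalised full name), then a probabilistic pass scoring string
    similarity against every registry entry. Returns (match, score), with
    match = None if no entry clears the threshold."""
    target = normalise(subject)
    for entry in registry:                      # deterministic pass
        if normalise(entry) == target:
            return entry, 1.0
    best, score = None, 0.0                     # probabilistic pass
    for entry in registry:
        s = SequenceMatcher(None, target, normalise(entry)).ratio()
        if s > score:
            best, score = entry, s
    return (best, score) if score >= threshold else (None, score)
```

A deterministic pass resolves the bulk of the cohort cheaply; the probabilistic pass then recovers records with typos or spelling variants, which is how the study achieved its 98.7% trace rate.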
Abstract:
The State of Iowa currently has approximately 69,000 miles of unpaved secondary roads. Due to the low traffic counts on these unpaved roads, paving with asphalt or Portland cement concrete is not economical; dust suppressants have therefore been used for decades to reduce dust production. This study was conducted to evaluate the effectiveness of several widely used dust suppressants through quantitative field testing on two of Iowa's most widely used secondary road surface treatments: crushed limestone rock and alluvial sand/gravel. The commercially available dust suppressants included lignin sulfonate, calcium chloride, and soybean oil soapstock. These suppressants were applied to 1000 ft test sections on four unpaved roads in Story County, Iowa. To duplicate field conditions, the suppressants were applied as a surface spray once in early June and again in late August or early September. The four unpaved roads included two with crushed limestone rock and two with alluvial sand/gravel surface treatments, as well as high and low traffic counts. The effectiveness of the dust suppressants was evaluated by comparing the dust produced on treated and untreated test sections. Dust collection was scheduled for 1, 2, 4, 6, and 8 weeks after each application, for a total testing period of 16 weeks. Results of a cost analysis between annual dust suppressant application and biennial aggregate replacement indicated that the cost of the dust suppressant, its transportation, and application were relatively high when compared to that of the two aggregate types. Therefore, biennial aggregate replacement is considered more economical than annual dust suppressant application, although annual dust suppressant application reduced the cost of road maintenance by 75%.
Results of the dust collection indicated that the lignin sulfonate suppressant outperformed calcium chloride and soybean oil soapstock on all four unpaved roads; that the effect of the suppressants on the alluvial sand/gravel surface treatment was less than that on the crushed limestone rock; that the residual effects of all the products remained reasonably good after blading; and that the combination of alluvial sand/gravel surface treatment and high traffic count caused dust reduction to decrease dramatically.