968 results for hybrid methods
Abstract:
1. Entomopathogenic nematodes can function as an indirect defence for plants attacked by root herbivores. By releasing volatile organic compounds (VOCs), plants signal the presence of host insects and thereby attract nematodes. 2. Nonetheless, how roots deploy indirect defences, how indirect defences relate to direct defences, and the ecological consequences of root defence allocation for herbivores and plant biomass are essentially unknown. 3. We investigated a natural below-ground tritrophic system, involving common milkweed, a specialist root-boring beetle and entomopathogenic nematodes, and asked whether there is a negative genetic correlation between direct defences (root cardenolides) and indirect defences (emission of root volatiles and nematode attraction), and between constitutive and inducible defences. 4. Root volatiles were analysed using two distinct sampling methods. First, we collected emissions from living Asclepias syriaca roots by dynamic headspace sampling. This method showed that attacked A. syriaca plants emit five times higher levels of volatiles than control plants. Secondly, we used a solid-phase micro-extraction (SPME) method to sample the full pool of volatiles in roots for genetic correlations of volatile biosynthesis. 5. Field experiments showed that entomopathogenic nematodes prevent the loss of biomass to root herbivory. Additionally, suppression of root herbivores was mediated directly by cardenolides and indirectly by the attraction of nematodes. Genetic families of plants with high cardenolides benefited less from nematodes than low-cardenolide families, suggesting that direct and indirect defences may be redundant. Although constitutive and induced root defences traded off within each strategy (for both direct and indirect defence, cardenolides and VOCs, respectively), we found no trade-off between the two strategies. 6. Synthesis. Constitutive expression and inducibility of defences may trade off because of resource limitation or because they are redundant. Direct and indirect defences do not trade off, likely because they do not share a limiting resource and because each independently promotes defence across the patchiness of herbivore attack and nematode presence in the field. Indeed, some redundancy in strategies may be necessary for effective defence, but within each strategy an economy of deployment reduces overall costs.
Abstract:
This review paper reports the consensus of a technical workshop hosted by the European network, NanoImpactNet (NIN). The workshop aimed to review the collective experience of working at the bench with manufactured nanomaterials (MNMs), and to recommend modifications to existing experimental methods and OECD protocols. Current procedures for cleaning glassware are appropriate for most MNMs, although interference with electrodes may occur. Maintaining exposure is more difficult with MNMs than with conventional chemicals. A metal salt control is recommended for experiments with metallic MNMs that may release free metal ions. Dispersing agents should be avoided, but if they must be used, then natural or synthetic dispersing agents are possible, and dispersion controls are essential. Time constraints and technology gaps indicate that full characterisation of test media during ecotoxicity tests is currently not practical. Details of electron microscopy, dark-field microscopy, a range of spectroscopic methods (EDX, XRD, XANES, EXAFS), light scattering techniques (DLS, SLS) and chromatography are discussed. The development of user-friendly software to predict particle behaviour in test media according to DLVO theory is in progress, and simple optical methods are available to estimate the settling behaviour of suspensions during experiments. However, for soil matrices such simple approaches may not be applicable. Alternatively, a Critical Body Residue approach may be taken in which body concentrations in organisms are related to effects, and toxicity thresholds derived. For microbial assays, the cell wall is a formidable barrier to MNMs, and end points that rely on the test substance penetrating the cell may be insensitive. Instead, assays based on the cell envelope should be developed for MNMs. In algal growth tests, the abiotic factors that promote particle aggregation in the media (e.g. ionic strength) are also important in providing nutrients, and manipulation of the media to control the dispersion may also inhibit growth. Controls to quantify shading effects, and precise details of lighting regimes, shaking or mixing, should be reported in algal tests. Photosynthesis may be more sensitive than traditional growth end points for algae and plants. Tests with invertebrates should consider non-chemical toxicity from particle adherence to the organisms. The use of semi-static exposure methods with fish can reduce the logistical issues of waste water disposal and facilitate aspects of animal husbandry relevant to MNMs. There are concerns that the existing bioaccumulation tests are conceptually flawed for MNMs and that new test(s) are required. In vitro testing strategies, as exemplified by genotoxicity assays, can be modified for MNMs, but the risk of false negatives in some assays is highlighted. In conclusion, most protocols will require some modifications, and recommendations are made to aid the researcher at the bench. [Authors]
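As background to the settling estimates mentioned above, the terminal velocity of a small particle in suspension can be approximated with Stokes' law. This is a minimal sketch, not taken from the workshop report; the particle density and sizes below are illustrative assumptions.

```python
def stokes_settling_velocity(radius_m, rho_p, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere via Stokes' law:
    v = (2/9) * r^2 * g * (rho_p - rho_f) / mu, valid at low Reynolds number.
    Defaults assume water at room temperature."""
    return (2.0 / 9.0) * radius_m**2 * g * (rho_p - rho_f) / mu

# Illustrative values: a 100 nm primary particle vs. a 2 um aggregate
# of the same (hypothetical) dense metal oxide material.
single = stokes_settling_velocity(50e-9, 4230.0)
agg = stokes_settling_velocity(1e-6, 4230.0)
print(single, agg)  # aggregation raises settling velocity by r^2
```

Because velocity scales with the radius squared, aggregation during a test changes settling behaviour by orders of magnitude, which is why the dispersion controls discussed above matter.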
Abstract:
Background: The objective of the present study was to compare three different sampling and questionnaire administration methods used in the international KIDSCREEN study in terms of participation, response rates, and external validity. Methods: Children and adolescents aged 8–18 years were surveyed in 13 European countries using either telephone sampling and mail administration, random sampling of school listings followed by classroom or mail administration, or multistage random sampling of communities and households with self-administration of the survey materials at home. Cooperation, completion, and response rates were compared across countries and survey methods. Data on non-respondents were collected in 8 countries. The population fraction (PF, respondents in each sex-age or educational-level category, divided by the population in the same category from Eurostat census data) and population fraction ratio (PFR, ratio of PFs) and their corresponding 95% confidence intervals were used to analyze differences by country between the KIDSCREEN samples and a reference Eurostat population. Results: Response rates by country ranged from 18.9% to 91.2%. Response rates were highest in the school-based surveys (69.0%–91.2%). Sample proportions by age and gender were similar to the reference Eurostat population in most countries, although boys and adolescents were slightly underrepresented (PFR <1). Parents in lower educational categories were less likely to participate (PFR <1 in 5 countries). Parents in higher educational categories were overrepresented when the school and household sampling strategies were used (PFR = 1.78–2.97). Conclusion: School-based sampling achieved the highest overall response rates but also produced slightly more biased samples than the other methods. The results suggest that the samples were sufficiently representative to provide reference population values for the KIDSCREEN instrument.
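The PF and PFR statistics described above are straightforward to compute. The sketch below uses invented counts, and the log-scale confidence interval is a standard large-sample approximation for a ratio of proportions; the abstract does not specify the study's exact interval method.

```python
import math

def population_fraction(respondents, census):
    """PF: respondents in a category divided by the census population
    of the same category (as defined in the KIDSCREEN analysis)."""
    return respondents / census

def pfr_with_ci(resp_a, census_a, resp_b, census_b, z=1.96):
    """Ratio of two population fractions with an approximate 95% CI
    computed on the log scale (illustrative approximation only)."""
    pfr = population_fraction(resp_a, census_a) / population_fraction(resp_b, census_b)
    # large-sample variance of log(ratio) for small sampling fractions
    se = math.sqrt(1.0 / resp_a + 1.0 / resp_b)
    return pfr, (pfr * math.exp(-z * se), pfr * math.exp(z * se))

# Hypothetical example: boys vs. girls sampled from equal census pools
pfr, (lo, hi) = pfr_with_ci(180, 10000, 240, 10000)
print(round(pfr, 2))  # 0.75: the first category is underrepresented
```

A PFR below 1 with a confidence interval excluding 1 would flag the kind of underrepresentation reported for boys and adolescents.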
Abstract:
The pseudo-spectral time-domain (PSTD) method is an alternative time-marching method to classical leapfrog finite difference schemes in the simulation of wave-like propagating phenomena. It is based on the Fourier transform to compute the spatial derivatives of hyperbolic differential equations. It therefore yields an isotropic operator that can be implemented efficiently for room acoustics simulations. However, one of the first issues to be solved is the modelling of wall absorption. Unfortunately, there are no references in the technical literature concerning that problem. In this paper, assuming real and constant locally reacting impedances, several proposals to overcome this problem are presented, validated and compared to analytical solutions in different scenarios.
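The core PSTD operation, computing a spatial derivative via the FFT, can be sketched in a few lines. This is a minimal illustration of the spectral-derivative idea on a periodic grid, not the paper's implementation.

```python
import numpy as np

# Spectral derivative on a periodic grid: differentiate in Fourier
# space by multiplying each mode by i*k, then transform back.
N, L = 64, 2 * np.pi
x = np.arange(N) * L / N
u = np.sin(3 * x)                             # field with known derivative
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)    # angular wavenumbers
du = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

# For band-limited fields the error is near machine precision,
# illustrating the isotropic, high-accuracy operator PSTD exploits.
err = np.max(np.abs(du - 3 * np.cos(3 * x)))
print(err)
```

Compared with a second-order leapfrog stencil, whose error decays only as the square of the grid spacing, the spectral operator needs far fewer points per wavelength, which is its main appeal for room acoustics.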
Abstract:
Background: The cooperative interaction between transcription factors has a decisive role in the control of the fate of the eukaryotic cell. Computational approaches for characterizing cooperative transcription factors in yeast, however, are based on different rationales and provide a low overlap between their results. Because the wealth of information contained in protein interaction networks and regulatory networks has proven highly effective in elucidating functional relationships between proteins, we compared different sets of cooperative transcription factor pairs (predicted by four different computational methods) within the frame of those networks. Results: Our results show that the overlap between the sets of cooperative transcription factors predicted by the different methods is low yet significant. Cooperative transcription factors predicted by all methods are closer and more clustered in the protein interaction network than expected by chance. On the other hand, members of a cooperative transcription factor pair neither seemed to regulate each other nor shared similar regulatory inputs, although they do regulate similar groups of target genes. Conclusion: Despite the different definitions of transcriptional cooperativity and the different computational approaches used to characterize cooperativity between transcription factors, the analysis of their roles in the framework of the protein interaction network and the regulatory network indicates a common denominator for the predictions under study. The knowledge of the shared topological properties of cooperative transcription factor pairs in both networks can be useful not only for designing better prediction methods but also for better understanding the complexities of transcriptional control in eukaryotes.
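The network distance underlying the "closer and more clustered than expected by chance" comparison above is the unweighted shortest-path length between two proteins. A minimal breadth-first-search sketch on a toy interaction network (the node names are invented):

```python
def shortest_path_len(adj, a, b):
    """BFS shortest-path length between nodes a and b in an unweighted
    network given as an adjacency dict; None if disconnected."""
    if a == b:
        return 0
    seen, frontier, dist = {a}, [a], 0
    while frontier:
        dist += 1
        nxt = []
        for u in frontier:
            for v in adj.get(u, []):
                if v == b:
                    return dist
                if v not in seen:
                    seen.add(v)
                    nxt.append(v)
        frontier = nxt
    return None  # no path: the nodes lie in different components

# Toy protein interaction network: two TFs bridged by one protein
adj = {"TF1": ["P1"], "P1": ["TF1", "TF2"], "TF2": ["P1"], "TF3": []}
print(shortest_path_len(adj, "TF1", "TF2"))  # 2
```

Comparing such distances for predicted cooperative pairs against randomly drawn pairs is one way to test the clustering claim made in the Results.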
Abstract:
Background: The aim of this report is to describe the main characteristics of the design, including response rates, of the Cornella Health Interview Survey Follow-up Study. Methods: The original cohort consisted of 2,500 subjects (1,263 women and 1,237 men) interviewed as part of the 1994 Cornella Health Interview Study. A record linkage to update the address and vital status of the cohort members was carried out using first a deterministic method and then a probabilistic one, based on each subject's first name and surnames. Subsequently, we attempted to locate the cohort members to conduct the phone follow-up interviews. A pilot study was carried out to test the overall feasibility and to modify some procedures before the field work began. Results: After record linkage, 2,468 (98.7%) subjects were successfully traced. Of these, 91 (3.6%) were deceased, 259 (10.3%) had moved to other towns, and 50 (2.0%) had neither renewed their last municipal census documents nor declared having moved. After using different strategies to track and retain cohort members, we traced 92% of the CHIS participants. Of these, 1,605 subjects answered the follow-up questionnaire. Conclusion: The computerized record linkage maximized the success of the follow-up, which was carried out 7 years after the baseline interview. The pilot study was useful for increasing the efficiency of tracing and interviewing the respondents.
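The two-pass linkage strategy described above, deterministic matching first and a probabilistic fallback on names second, can be sketched as follows. The names, the string-similarity measure, and the threshold are all illustrative assumptions; the study's actual probabilistic model is not detailed in the abstract.

```python
from difflib import SequenceMatcher

def normalize(name):
    """Case-fold and collapse whitespace before comparison."""
    return " ".join(name.lower().split())

def link(record, registry, threshold=0.85):
    """Pass 1: deterministic match on the normalized full name.
    Pass 2: fuzzy fallback using string similarity, accepted only
    above an (illustrative) threshold. Returns (match, score)."""
    key = normalize(record)
    for cand in registry:                    # deterministic pass
        if normalize(cand) == key:
            return cand, 1.0
    best, score = None, 0.0
    for cand in registry:                    # probabilistic-style pass
        s = SequenceMatcher(None, key, normalize(cand)).ratio()
        if s > score:
            best, score = cand, s
    return (best, score) if score >= threshold else (None, score)

registry = ["Maria Garcia Lopez", "Joan Puig Serra"]
print(link("maria  garcia lopez", registry)[0])  # exact after normalization
print(link("Maria Garsia Lopez", registry)[0])   # recovered by fuzzy pass
```

The fallback pass is what lets linkage survive the spelling variants that defeat a purely deterministic join.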
Abstract:
The problem of jointly estimating the number, the identities, and the data of active users in a time-varying multiuser environment was examined in a companion paper (IEEE Trans. Information Theory, vol. 53, no. 9, September 2007), at whose core was the use of the theory of finite random sets on countable spaces. Here we extend that theory to encompass the more general problem of estimating unknown continuous parameters of the active-user signals. This problem is solved by applying the theory of random finite sets constructed on hybrid spaces. We do so by deriving Bayesian recursions that describe the evolution with time of the a posteriori densities of the unknown parameters and data. Unlike in the above-cited paper, wherein one could evaluate the exact multiuser set posterior density, here the continuous-parameter Bayesian recursions do not admit closed-form expressions. To circumvent this difficulty, we develop numerical approximations for the receivers based on Sequential Monte Carlo (SMC) methods ("particle filtering"). Simulation results, referring to a code-division multiple-access (CDMA) system, are presented to illustrate the theory.
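The SMC recursion invoked above can be illustrated with a bootstrap (sampling-importance-resampling) particle filter on a deliberately simple scalar model. The dynamics, noise levels, and observations below are toy assumptions; the paper's actual state space (random finite sets on hybrid spaces) is far richer.

```python
import random, math

def particle_filter(observations, n=2000):
    """Bootstrap particle filter for the toy model
    x_t = 0.9 * x_{t-1} + N(0, 0.3^2),  y_t = x_t + N(0, 0.5^2).
    Returns the posterior-mean estimate of x_t at each step."""
    random.seed(0)  # reproducible toy run
    particles = [random.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for y in observations:
        # prediction: propagate each particle through the dynamics
        particles = [0.9 * p + random.gauss(0.0, 0.3) for p in particles]
        # update: weight particles by the Gaussian measurement likelihood
        w = [math.exp(-0.5 * ((y - p) / 0.5) ** 2) for p in particles]
        total = sum(w)
        w = [wi / total for wi in w]
        estimates.append(sum(wi * p for wi, p in zip(w, particles)))
        # resample to combat weight degeneracy
        particles = random.choices(particles, weights=w, k=n)
    return estimates

est = particle_filter([1.0, 0.9, 0.8, 0.7])
print(est[-1])  # tracks the observations to within the noise level
```

The same predict/weight/resample loop, applied to set-valued states, is what replaces the closed-form recursions that exist only in the countable-space case.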
Abstract:
The State of Iowa currently has approximately 69,000 miles of unpaved secondary roads. Due to the low traffic count on these unpaved roads, paving with asphalt or Portland cement concrete is not economical; therefore, dust suppressants have been used for decades to reduce dust production. This study was conducted to evaluate the effectiveness of several widely used dust suppressants through quantitative field testing on two of Iowa's most widely used secondary road surface treatments: crushed limestone rock and alluvial sand/gravel. The commercially available dust suppressants included lignin sulfonate, calcium chloride, and soybean oil soapstock. These suppressants were applied to 1000 ft test sections on four unpaved roads in Story County, Iowa. To duplicate field conditions, the suppressants were applied as a surface spray once in early June and again in late August or early September. The four unpaved roads included two with crushed limestone rock and two with alluvial sand/gravel surface treatments, as well as high and low traffic counts. The effectiveness of the dust suppressants was evaluated by comparing the dust produced on treated and untreated test sections. Dust collection was scheduled for 1, 2, 4, 6, and 8 weeks after each application, for a total testing period of 16 weeks. Results of a cost analysis between annual dust suppressant application and biennial aggregate replacement indicated that the costs of the dust suppressant, its transportation, and its application were relatively high compared to those of the two aggregate types. Therefore, biennial aggregate replacement is considered more economical than annual dust suppressant application, although annual dust suppressant application reduced the cost of road maintenance by 75%.
Results of the dust collection indicated that the lignin sulfonate suppressant outperformed calcium chloride and soybean oil soapstock on all four unpaved roads; the effect of the suppressants on the alluvial sand/gravel surface treatment was less than that on the crushed limestone rock; the residual effects of all the products remained reasonably good after blading; and the combination of alluvial sand/gravel surface treatment and high traffic count caused dust reduction to decrease dramatically.
Abstract:
The purpose of this research was to summarize existing nondestructive test methods that have the potential to be used to detect materials-related distress (MRD) in concrete pavements. The various nondestructive test methods were then subjected to selection criteria that helped to reduce the size of the list so that specific techniques could be investigated in more detail. The main test methods determined to be applicable to this study included two stress-wave propagation techniques (impact-echo and spectral analysis of surface waves), infrared thermography, ground penetrating radar (GPR), and visual inspection. The GPR technique was selected for a preliminary round of "proof of concept" trials. GPR surveys were carried out over a variety of Portland cement concrete pavements for this study using two different systems. One was a state-of-the-art GPR system that allowed data to be collected at highway speeds. The other was a less sophisticated, commercially available system. Surveys conducted with both sets of equipment produced test results capable of identifying subsurface distress in two of the three sites that exhibited internal cracking due to MRD. Both systems failed to detect distress in a single pavement that exhibited extensive cracking. Both systems correctly indicated that the control pavement exhibited negligible evidence of distress. The initial positive results presented here indicate that a more thorough study (incorporating refinements to the system, data collection, and analysis) is needed. Improvements in the results will depend upon defining the optimum number and arrangement of GPR antennas to detect the most common problems in Iowa pavements. In addition, refining high-frequency antenna response characteristics will be a crucial step toward providing an optimum GPR system for detecting materials-related distress.
Abstract:
Harmonisation of analytical results is investigated in this study as an alternative to the restrictive approach of harmonising analytical methods, which is currently recommended to enable the exchange of information in support of the fight against illicit drug trafficking. Indeed, the main goal of this study is to demonstrate that a common database can be fed by a range of different analytical methods, whatever the differences in analytical parameters between them. For this purpose, a methodology was developed for estimating, and even optimising, the similarity of results coming from different analytical methods. In particular, the possibility of introducing chemical profiles obtained with Fast GC-FID into a GC-MS database is studied in this paper. Using this methodology, the similarity of results from different analytical methods can be objectively assessed, and the practical utility of database sharing across these methods can be evaluated, depending on the profiling purpose (evidential vs. operational perspective). This methodology can be regarded as a relevant approach for feeding a database from different analytical methods, and it calls into question the necessity of analysing all illicit drug seizures in a single laboratory or of harmonising analytical methods in each participating laboratory.
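A similarity score between chemical profiles from two instruments can be computed directly on the peak-area vectors. The abstract does not name the study's exact similarity measure; cosine similarity and the peak areas below are used purely for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two peak-area profiles: insensitive
    to overall scale, so profiles from instruments with different
    absolute responses can still be compared by shape."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical peak areas for the same seizure measured by GC-MS
# and by Fast GC-FID (different absolute scales, similar shape).
gcms   = [120.0, 45.0, 310.0, 18.0]
fastgc = [118.0, 50.0, 290.0, 22.0]
print(round(cosine_similarity(gcms, fastgc), 3))  # close to 1.0
```

A score near 1 for profiles of the same seizure, and lower scores across unrelated seizures, is the kind of behaviour needed before results from both instruments can share one database.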
Abstract:
In this article we present a hybrid approach for automatic summarization of Spanish medical texts. There are many systems for automatic summarization using statistics or linguistics, but only a few that combine both techniques. Our idea is that to produce a good summary we need to use linguistic aspects of the text, but we should also benefit from the advantages of statistical techniques. We have integrated the Cortex (vector space model) and Enertex (statistical physics) systems, coupled with the Yate term extractor, and the Disicosum system (linguistics). We compared these systems and then integrated them in a hybrid approach. Finally, we applied this hybrid system to a corpus of medical articles and evaluated its performance, obtaining good results.
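One simple way to combine statistical and linguistic evidence at the sentence level is weighted score fusion. The fusion rule below is an illustrative assumption, not the paper's actual combination of Cortex, Enertex, Yate and Disicosum; the sentences and scores are invented.

```python
def hybrid_rank(sentences, stat_scores, ling_scores, alpha=0.5, k=2):
    """Min-max normalize each score list, fuse them by a weighted
    average, and return the top-k sentences in document order."""
    def norm(scores):
        lo, hi = min(scores), max(scores)
        return [(s - lo) / (hi - lo) if hi > lo else 0.0 for s in scores]
    stat, ling = norm(stat_scores), norm(ling_scores)
    fused = [alpha * s + (1 - alpha) * l for s, l in zip(stat, ling)]
    top = sorted(range(len(sentences)), key=lambda i: -fused[i])[:k]
    return [sentences[i] for i in sorted(top)]

sents = ["Intro sentence.", "Key finding one.", "Aside.", "Key finding two."]
summary = hybrid_rank(sents, [0.2, 0.9, 0.1, 0.8], [0.3, 0.7, 0.2, 0.9])
print(summary)  # the two "key finding" sentences, in document order
```

Normalizing before fusing matters because statistical and linguistic scorers generally produce values on incompatible scales.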
Abstract:
Voriconazole (VRC) is a broad-spectrum antifungal triazole with nonlinear pharmacokinetics. The utility of measurement of voriconazole blood levels for optimizing therapy is a matter of debate. Available high-performance liquid chromatography (HPLC) and bioassay methods are technically complex, time-consuming, or have a narrow analytical range. Objectives of the present study were to develop new, simple analytical methods and to assess variability of voriconazole blood levels in patients with invasive mycoses. Acetonitrile precipitation, reverse-phase separation, and UV detection were used for HPLC. A voriconazole-hypersusceptible Candida albicans mutant lacking multidrug efflux transporters (cdr1Delta/cdr1Delta, cdr2Delta/cdr2Delta, flu1Delta/flu1Delta, and mdr1Delta/mdr1Delta) and calcineurin subunit A (cnaDelta/cnaDelta) was used for bioassay. Mean intra-/interrun accuracies over the VRC concentration range from 0.25 to 16 mg/liter were 93.7% +/- 5.0%/96.5% +/- 2.4% (HPLC) and 94.9% +/- 6.1%/94.7% +/- 3.3% (bioassay). Mean intra-/interrun coefficients of variation were 5.2% +/- 1.5%/5.4% +/- 0.9% and 6.5% +/- 2.5%/4.0% +/- 1.6% for HPLC and bioassay, respectively. The coefficient of concordance between HPLC and bioassay was 0.96. Sequential measurements in 10 patients with invasive mycoses showed important inter- and intraindividual variations of estimated voriconazole area under the concentration-time curve (AUC): median, 43.9 mg x h/liter (range, 12.9 to 71.1) on the first and 27.4 mg x h/liter (range, 2.9 to 93.1) on the last day of therapy. During therapy, AUC decreased in five patients, increased in three, and remained unchanged in two. A toxic encephalopathy probably related to the increase of the VRC AUC (from 71.1 to 93.1 mg x h/liter) was observed. The VRC AUC decreased (from 12.9 to 2.9 mg x h/liter) in a patient with persistent signs of invasive aspergillosis. 
These preliminary observations suggest that voriconazole over- or underexposure resulting from variability of blood levels might have clinical implications. Simple HPLC and bioassay methods offer new tools for monitoring voriconazole therapy.
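The AUC figures reported above are typically estimated from sparse blood levels with the trapezoidal rule; the abstract does not state the study's exact estimation method, and the times and concentrations below are hypothetical.

```python
def auc_trapezoid(times_h, conc_mg_per_l):
    """Area under the concentration-time curve (mg*h/liter) by the
    linear trapezoidal rule over successive sampling intervals."""
    auc = 0.0
    for (t0, c0), (t1, c1) in zip(zip(times_h, conc_mg_per_l),
                                  zip(times_h[1:], conc_mg_per_l[1:])):
        auc += 0.5 * (c0 + c1) * (t1 - t0)
    return auc

# Hypothetical voriconazole levels over a 12 h dosing interval
times = [0, 2, 6, 12]
levels = [1.0, 4.0, 3.0, 1.5]
print(auc_trapezoid(times, levels))  # 32.5 mg*h/liter
```

Tracking this quantity from one dosing interval to the next is what revealed the wide intra-individual swings (e.g. 71.1 down to 2.9 mg x h/liter) described in the study.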
Abstract:
We evaluated 25 protocol variants of 14 independent computational methods for exon identification, transcript reconstruction and expression-level quantification from RNA-seq data. Our results show that most algorithms are able to identify discrete transcript components with high success rates but that assembly of complete isoform structures poses a major challenge even when all constituent elements are identified. Expression-level estimates also varied widely across methods, even when based on similar transcript models. Consequently, the complexity of higher eukaryotic genomes imposes severe limitations on transcript recall and splice product discrimination that are likely to remain limiting factors for the analysis of current-generation RNA-seq data.
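As background to the expression-level estimates compared above, one widely used expression unit is transcripts per million (TPM). This sketch is generic background, not one of the 14 evaluated methods, and the counts and lengths are invented.

```python
def tpm(counts, lengths_kb):
    """Transcripts per million: normalize read counts by transcript
    length (reads per kilobase), then scale so the values sum to 1e6,
    making expression comparable across samples."""
    rpk = [c / l for c, l in zip(counts, lengths_kb)]
    scale = sum(rpk) / 1e6
    return [r / scale for r in rpk]

# Three hypothetical transcripts: raw counts and lengths in kilobases
vals = tpm([100, 200, 300], [1.0, 2.0, 1.5])
print([round(v) for v in vals])  # [250000, 250000, 500000]
```

Because the unit fixes only the within-sample normalization, methods can still disagree widely when they assign multi-mapping reads to different isoforms, which is one source of the variation the evaluation reports.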