931 results for Rule-based techniques
Abstract:
Asphalt binder is typically modified with block copolymers of the styrene-butadiene-styrene (SBS) type to improve its rheological properties and performance grade. The elastic and principal component of SBS polymers is butadiene. Over the last decade, butadiene prices have fluctuated and increased significantly, leading state highway agencies to search for economically viable alternatives to butadiene-based materials. This project reports recent advances in polymerization techniques that have enabled the synthesis of elastomeric, thermoplastic block copolymers (BCPs) composed of styrene and soybean oil, in which the “B” block of SBS polymers is replaced with polymerized triglycerides derived from soybean oil. This new breed of biopolymers has elastomeric properties comparable to those of well-established butadiene-based styrenic BCPs. In this report, two biopolymer formulations are evaluated for their ability to modify asphalt binder. Laboratory blends of asphalt modified with the biopolymers are tested for their rheological properties and performance grade, and are compared to blends of asphalt modified with two commonly used commercial polymers. The viscoelastic properties of the blends show that the biopolymers improve the performance grade of the asphalt to a similar or even greater extent than the commercial SBS polymers. The results presented in this report indicate excellent potential for these biopolymers as economically and environmentally favorable alternatives to their petrochemically derived analogs.
Abstract:
If single-case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of intervention effect using data with known characteristics. Monte Carlo methods were used to generate AB design data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level and slope change). The results suggest that data features are important for choosing the appropriate procedure; thus, inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the Nonoverlap of All Pairs (NAP) and the Slope and Level Change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data-correction step in NAP renders it unaffected by linear trend, as is also the case for the Percentage of Nonoverlapping Corrected Data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can be readily complemented by both visual and statistical analyses. A flowchart to guide selection of techniques according to the data characteristics identified by visual inspection is provided.
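To make the described simulation and effect-size calculation concrete, the Python sketch below generates one AB data series with serial dependence and a level change (a simplified stand-in for the Monte Carlo generation in the abstract; phase lengths, autocorrelation, and effect sizes are illustrative assumptions) and computes the Nonoverlap of All Pairs index.

```python
import numpy as np

def generate_ab_series(n_a=10, n_b=10, level_change=1.0, slope_change=0.0,
                       rho=0.2, noise_sd=1.0, seed=None):
    """Simulate one AB single-case series with AR(1) errors (lag-1
    autocorrelation rho) and a level and/or slope change in phase B."""
    rng = np.random.default_rng(seed)
    n = n_a + n_b
    e = np.zeros(n)
    e[0] = rng.normal(0.0, noise_sd)
    for t in range(1, n):                       # serially dependent errors
        e[t] = rho * e[t - 1] + rng.normal(0.0, noise_sd)
    y = e.copy()
    y[n_a:] += level_change                               # level change
    y[n_a:] += slope_change * np.arange(n_b)              # slope change
    return y[:n_a], y[n_a:]

def nap(baseline, treatment):
    """Nonoverlap of All Pairs: proportion of (A, B) pairs in which the
    treatment observation exceeds the baseline one (ties count 0.5)."""
    a = np.asarray(baseline)[:, None]
    b = np.asarray(treatment)[None, :]
    return ((b > a).sum() + 0.5 * (b == a).sum()) / (a.size * b.size)

a_phase, b_phase = generate_ab_series(level_change=1.5, seed=42)
print(f"NAP = {nap(a_phase, b_phase):.2f}")
```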
Abstract:
Identifying the geographic distribution of populations is a basic, yet crucial step in many fundamental and applied ecological projects, as it provides key information on which many subsequent analyses depend. However, this task is often costly and time-consuming, especially where rare species are concerned and where most sampling designs generally prove inefficient. At the same time, rare species are those for which distribution data are most needed for their conservation to be effective. To enhance fieldwork sampling, model-based sampling (MBS) uses predictions from species distribution models: when looking for the species in areas of high habitat suitability, the chances of finding them should be higher. We thoroughly tested the efficiency of MBS by conducting a large survey in the Swiss Alps, assessing the detection rate of three rare and five common plant species. For each species, habitat suitability maps were produced following an ensemble modeling framework combining two spatial resolutions and two modeling techniques. We tested the efficiency of MBS and the accuracy of our models by sampling 240 sites in the field (30 sites × 8 species). Across all species, the MBS approach proved effective. In particular, the MBS design strictly led to the discovery of six sites of presence of one rare plant, increasing the chances of finding this species from 0 to 50%. For common species, MBS doubled the new-population discovery rate compared to random sampling. Habitat suitability maps derived from the combination of four individual modeling methods predicted the species' distributions well, and more accurately than the individual models. In conclusion, using MBS for fieldwork could efficiently help increase our knowledge of rare species distributions. More generally, we recommend using habitat suitability models to support conservation plans.
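As an illustration of the model-based sampling idea, the Python sketch below combines suitability predictions from two hypothetical models into a weighted-average ensemble map and selects the highest-suitability cells for a field visit; the array sizes, weights, and random predictions are placeholders, not the study's actual ensemble framework.

```python
import numpy as np

def ensemble_suitability(predictions, weights=None):
    """Combine per-cell habitat-suitability predictions from several models
    into a weighted-average ensemble map."""
    preds = np.stack(predictions)               # shape: (n_models, n_cells)
    if weights is None:
        weights = np.ones(len(predictions))
    return np.average(preds, axis=0, weights=np.asarray(weights, dtype=float))

def model_based_sample(suitability, n_sites=30):
    """Model-based sampling: visit the n_sites cells with the highest
    predicted suitability rather than drawing sites at random."""
    return np.argsort(suitability)[::-1][:n_sites]

rng = np.random.default_rng(0)
model_1 = rng.random(1000)        # placeholder per-cell suitability values
model_2 = rng.random(1000)
ensemble = ensemble_suitability([model_1, model_2], weights=[0.6, 0.4])
sites_to_visit = model_based_sample(ensemble, n_sites=30)
print(sites_to_visit[:10])
```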
Abstract:
To cite this article: Ponvert C, Perrin Y, Bados-Albiero A, Le Bourgeois M, Karila C, Delacourt C, Scheinmann P, De Blic J. Allergy to betalactam antibiotics in children: results of a 20-year study based on clinical history, skin and challenge tests. Pediatr Allergy Immunol 2011; 22: 411-418. ABSTRACT: Studies based on skin and challenge tests have shown that 12-60% of children with suspected betalactam hypersensitivity were allergic to betalactams. Responses in skin and challenge tests were studied in 1865 children with suspected betalactam allergy (i) to confirm or rule out the suspected diagnosis; (ii) to evaluate diagnostic value of immediate and non-immediate responses in skin and challenge tests; (iii) to determine frequency of betalactam allergy in those children, and (iv) to determine potential risk factors for betalactam allergy. The work-up was completed in 1431 children, of whom 227 (15.9%) were diagnosed allergic to betalactams. Betalactam hypersensitivity was diagnosed in 50 of the 162 (30.9%) children reporting immediate reactions and in 177 of the 1087 (16.7%) children reporting non-immediate reactions (p < 0.001). The likelihood of betalactam hypersensitivity was also significantly higher in children reporting anaphylaxis, serum sickness-like reactions, and (potentially) severe skin reactions such as acute generalized exanthematic pustulosis, Stevens-Johnson syndrome, and drug reaction with systemic symptoms than in other children (p < 0.001). Skin tests diagnosed 86% of immediate and 31.6% of non-immediate sensitizations. Cross-reactivity and/or cosensitization among betalactams was diagnosed in 76% and 14.7% of the children with immediate and non-immediate hypersensitivity, respectively. The number of children diagnosed allergic to betalactams decreased with time between the reaction and the work-up, probably because the majority of children with severe and worrying reactions were referred for allergological work-up more promptly than the other children. Sex, age, and atopy were not risk factors for betalactam hypersensitivity. In conclusion, we confirm in numerous children that (i) only a few children with suspected betalactam hypersensitivity are allergic to betalactams; (ii) the likelihood of betalactam allergy increases with earliness and/or severity of the reactions; (iii) although non-immediate-reading skin tests (intradermal and patch tests) may diagnose non-immediate sensitizations in children with non-immediate reactions to betalactams (maculopapular rashes and potentially severe skin reactions especially), the diagnostic value of non-immediate-reading skin tests is far lower than the diagnostic value of immediate-reading skin tests, most non-immediate sensitizations to betalactams being diagnosed by means of challenge tests; (iv) cross-reactivity and/or cosensitizations among betalactams are much more frequent in children reporting immediate and/or anaphylactic reactions than in the other children; (v) age, sex and personal atopy are not significant risk factors for betalactam hypersensitivity; and (vi) the number of children with diagnosed allergy to betalactams (of the immediate-type hypersensitivity especially) decreases with time between the reaction and allergological work-up. Finally, based on our experience, we also propose a practical diagnostic approach in children with suspected betalactam hypersensitivity.
Abstract:
In this paper, we describe several techniques for detecting the tonic pitch value in Indian classical music. In Indian music, the raga is the basic melodic framework, and it is built on the tonic. Tonic detection is therefore fundamental to any melodic analysis of Indian classical music. This work explores detection of the tonic by processing the pitch histograms of Indian classical music. Processing of pitch histograms using group delay functions, and its ability to amplify certain traits of Indian music in the pitch histogram, is discussed. Three different strategies to detect the tonic are proposed, namely the concert method, the template matching method, and the segmented histogram method. The concert method exploits the fact that the tonic is constant over a piece/concert. The template matching and segmented histogram methods use the properties that (i) the tonic is always present in the background and (ii) some notes are less inflected and dominant, to detect the tonic of individual pieces. All three methods yield good results for Carnatic music (90-100% accuracy), while for Hindustani music the template method works best, provided the vādi and samvādi notes for a given piece are known (85%).
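A minimal Python sketch of the histogram-based idea follows: it folds a pitch track into a one-octave histogram and scores candidate tonic positions with a toy tonic-plus-fifth template. It omits the group delay processing and the concert and segmented histogram methods described above, and the synthetic pitch track and template are illustrative assumptions.

```python
import numpy as np

def pitch_histogram(f0_hz, ref_hz=55.0, bins_per_octave=120):
    """Fold a pitch track into a one-octave histogram (cents above an
    arbitrary reference), the starting point for histogram-based tonic
    detection."""
    f0 = np.asarray(f0_hz, dtype=float)
    f0 = f0[f0 > 0]                                 # drop unvoiced frames
    cents = (1200.0 * np.log2(f0 / ref_hz)) % 1200.0
    hist, _ = np.histogram(cents, bins=bins_per_octave, range=(0.0, 1200.0))
    return hist / hist.sum()

def tonic_by_template(hist, template_cents=(0, 702)):
    """Toy template matching: score each candidate tonic bin by the
    histogram mass at the tonic and at the fifth (702 cents) above it,
    and return the best candidate in cents."""
    n = len(hist)
    scores = np.zeros(n)
    for shift in range(n):
        for c in template_cents:
            scores[shift] += hist[(shift + int(round(c * n / 1200.0))) % n]
    return np.argmax(scores) * 1200.0 / n

# Synthetic pitch track biased towards a tonic of 146.8 Hz and its fifth.
rng = np.random.default_rng(1)
semitones = rng.choice([0, 7, 4, 2, 9], size=5000, p=[0.4, 0.3, 0.1, 0.1, 0.1])
f0_track = 146.8 * 2.0 ** (semitones / 12.0)
h = pitch_histogram(f0_track)
print(f"estimated tonic: {tonic_by_template(h):.0f} cents above the reference")
```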
Abstract:
The polycyclic aromatic hydrocarbon (PAH)-degrading strain Burkholderia sp. RP007 served as the host strain for the design of a bacterial biosensor for the detection of phenanthrene. RP007 was transformed with a reporter plasmid containing a transcriptional fusion between the phnS putative promoter/operator region and the gene encoding the enhanced green fluorescent protein (GFP). The resulting bacterial biosensor, Burkholderia sp. strain RP037, produced significant amounts of GFP after batch incubation in the presence of phenanthrene crystals. Co-incubation with acetate did not disturb the phenanthrene-specific response but resulted in a homogeneously responding population of cells. Active metabolism was required for induction with phenanthrene. The magnitude of GFP induction was influenced by physical parameters affecting the phenanthrene flux to the cells, such as the contact surface area between solid phenanthrene and the aqueous phase, addition of surfactant, and slow phenanthrene release from Model Polymer Release System beads or from a water-immiscible oil. These results strongly suggest that the bacterial biosensor can sense different phenanthrene fluxes while maintaining phenanthrene metabolism, thus acting as a genuine sensor for phenanthrene bioavailability. A relationship between GFP production and phenanthrene mass transfer is proposed.
Abstract:
Normal and abnormal brains can be segmented by registering the target image with an atlas. Here, an atlas is defined as the combination of an intensity image (template) and its segmented image (the atlas labels). After registering the atlas template and the target image, the atlas labels are propagated to the target image. We define this process as atlas-based segmentation. In recent years, researchers have investigated registration algorithms to match atlases to query subjects as well as strategies for atlas construction. In this paper we present a review of automated approaches to atlas-based segmentation of magnetic resonance brain images. We aim to point out the strengths and weaknesses of atlas-based methods and to suggest new research directions. We use two different criteria to organize the methods. First, we group the algorithms according to their atlas-based strategy: label propagation, multi-atlas methods, and probabilistic techniques. Subsequently, we classify the methods according to their medical target: the brain and its internal structures, tissue segmentation in healthy subjects, tissue segmentation in fetuses, neonates, and elderly subjects, and segmentation of damaged brains. A quantitative comparison of the results reported in the literature is also presented.
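As a concrete illustration of the label-propagation strategy, the Python sketch below registers an atlas template to a target image with SimpleITK and resamples the atlas labels through the estimated transform. The affine-only registration and the file names are simplifying assumptions; published pipelines typically add a deformable registration stage.

```python
import SimpleITK as sitk

def propagate_atlas_labels(target_path, atlas_template_path, atlas_labels_path):
    """Single-atlas label propagation: register the atlas template to the
    target image, then resample the atlas labels with the resulting
    transform (nearest-neighbour keeps the labels discrete)."""
    target = sitk.ReadImage(target_path, sitk.sitkFloat32)
    template = sitk.ReadImage(atlas_template_path, sitk.sitkFloat32)
    labels = sitk.ReadImage(atlas_labels_path)

    # Affine registration of the atlas template (moving) to the target (fixed).
    initial = sitk.CenteredTransformInitializer(
        target, template, sitk.AffineTransform(3),
        sitk.CenteredTransformInitializerFilter.GEOMETRY)
    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
    reg.SetOptimizerScalesFromPhysicalShift()
    reg.SetInitialTransform(initial, inPlace=False)
    reg.SetInterpolator(sitk.sitkLinear)
    transform = reg.Execute(target, template)

    # Propagate the atlas labels onto the target grid.
    return sitk.Resample(labels, target, transform,
                         sitk.sitkNearestNeighbor, 0, labels.GetPixelID())

# Hypothetical file names for illustration only:
# segmentation = propagate_atlas_labels("subject_T1.nii.gz",
#                                       "atlas_template.nii.gz",
#                                       "atlas_labels.nii.gz")
```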
Abstract:
Background: Johanson-Blizzard syndrome (JBS; OMIM 243800) is an autosomal recessive disorder that includes congenital exocrine pancreatic insufficiency, facial dysmorphism with characteristic nasal wing hypoplasia, multiple malformations, and frequent mental retardation. Our previous work has shown that JBS is caused by mutations in human UBR1, which encodes one of the E3 ubiquitin ligases of the N-end rule pathway. The N-end rule relates the regulation of the in vivo half-life of a protein to the identity of its N-terminal residue. One class of degradation signals (degrons) recognized by UBR1 is destabilizing N-terminal residues of protein substrates. Methodology/Principal Findings: Most JBS-causing alterations of UBR1 are nonsense, frameshift, or splice-site mutations that abolish UBR1 activity. We report here missense mutations of human UBR1 in patients with milder variants of JBS. These single-residue changes, including a previously reported missense mutation, involve positions in the RING-H2 and UBR domains of UBR1 that are conserved among eukaryotes. Taking advantage of this conservation, we constructed alleles of the yeast Saccharomyces cerevisiae UBR1 that were counterparts of missense JBS-UBR1 alleles. Among these yeast Ubr1 mutants, one (H160R) was inactive in yeast-based activity assays, another (Q1224E) had detectable but weak activity, and the third (V146L) exhibited decreased but significant activity, in agreement with the manifestations of JBS in the corresponding patients. Conclusions/Significance: These results, made possible by modeling defects of a human ubiquitin ligase in its yeast counterpart, verified the relevance of specific missense UBR1 alleles to JBS and suggested that a residual activity of a missense allele is causally associated with milder variants of JBS.
Abstract:
Brain perfusion can be assessed by CT and MR. For CT, two major techniques are used. First, xenon CT is an equilibrium technique based on a freely diffusible tracer. First pass of iodinated contrast injected intravenously is a second, more widely available method. Both methods are proven to be robust and quantitative, thanks to the linear relationship between contrast concentration and x-ray attenuation. For the CT methods, concerns regarding the x-ray dose delivered to the patient need to be addressed. MR can also assess brain perfusion using the first pass of a gadolinium-based contrast agent injected intravenously. This method has to be considered semi-quantitative because of the non-linear relationship between contrast concentration and MR signal changes. Arterial spin labelling is another MR method that assesses brain perfusion without injection of contrast. In this case, the blood flowing in the carotids is magnetically labelled by an external radiofrequency pulse and observed during its first pass through the brain. Each of these CT and MR techniques has advantages and limits that will be illustrated and summarised.
Learning objectives:
1. To understand and compare the different techniques for brain perfusion imaging.
2. To learn about the methods of acquisition and post-processing of brain perfusion by first pass of contrast agent for CT and MR.
3. To learn about non-contrast MR methods (arterial spin labelling).
Abstract:
Aim To assess the geographical transferability of niche-based species distribution models fitted with two modelling techniques. Location Two distinct geographical study areas in Switzerland and Austria, in the subalpine and alpine belts. Methods Generalized linear and generalized additive models (GLM and GAM) with a binomial probability distribution and a logit link were fitted for 54 plant species, based on topoclimatic predictor variables. These models were then evaluated quantitatively and used for spatially explicit predictions within (internal evaluation and prediction) and between (external evaluation and prediction) the two regions. Comparisons of evaluations and spatial predictions between regions and models were conducted in order to test if species and methods meet the criteria of full transferability. By full transferability, we mean that: (1) the internal evaluation of models fitted in region A and B must be similar; (2) a model fitted in region A must at least retain a comparable external evaluation when projected into region B, and vice-versa; and (3) internal and external spatial predictions have to match within both regions. Results The measures of model fit are, on average, 24% higher for GAMs than for GLMs in both regions. However, the differences between internal and external evaluations (AUC coefficient) are also higher for GAMs than for GLMs (a difference of 30% for models fitted in Switzerland and 54% for models fitted in Austria). Transferability, as measured with the AUC evaluation, fails for 68% of the species in Switzerland and 55% in Austria for GLMs (respectively for 67% and 53% of the species for GAMs). For both GAMs and GLMs, the agreement between internal and external predictions is rather weak on average (Kulczynski's coefficient in the range 0.3-0.4), but varies widely among individual species. The dominant pattern is an asymmetrical transferability between the two study regions (a mean decrease of 20% for the AUC coefficient when the models are transferred from Switzerland and 13% when they are transferred from Austria). Main conclusions The large inter-specific variability observed among the 54 study species underlines the need to consider more than a few species to test properly the transferability of species distribution models. The pronounced asymmetry in transferability between the two study regions may be due to peculiarities of these regions, such as differences in the ranges of environmental predictors or the varied impact of land-use history, or to species-specific reasons like differential phenotypic plasticity, existence of ecotypes or varied dependence on biotic interactions that are not properly incorporated into niche-based models. The lower variation between internal and external evaluation of GLMs compared to GAMs further suggests that overfitting may reduce transferability. Overall, a limited geographical transferability calls for caution when projecting niche-based models for assessing the fate of species in future environments.
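The following Python sketch illustrates the internal-versus-external evaluation used to measure transferability, with a scikit-learn logistic regression standing in for the binomial GLM with a logit link and synthetic data standing in for the two regions; the GAM counterpart and the real topoclimatic predictors are omitted.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def fit_glm(X, y):
    """Binomial GLM with a logit link (here a plain logistic regression)
    fitted on predictor matrix X and presence/absence vector y."""
    return LogisticRegression(max_iter=1000).fit(X, y)

def transferability(model_a, Xa, ya, Xb, yb):
    """Internal AUC (region the model was fitted in) versus external AUC
    (model projected into the other region)."""
    internal = roc_auc_score(ya, model_a.predict_proba(Xa)[:, 1])
    external = roc_auc_score(yb, model_a.predict_proba(Xb)[:, 1])
    return internal, external

# Synthetic "region A" and "region B" data sharing the same species response.
rng = np.random.default_rng(3)
Xa, Xb = rng.normal(size=(500, 4)), rng.normal(loc=0.5, size=(500, 4))
coef = np.array([1.2, -0.8, 0.5, 0.0])
ya = (Xa @ coef + rng.normal(size=500)) > 0
yb = (Xb @ coef + rng.normal(size=500)) > 0
model = fit_glm(Xa, ya)
auc_int, auc_ext = transferability(model, Xa, ya, Xb, yb)
print(f"internal AUC = {auc_int:.2f}, external AUC = {auc_ext:.2f}")
```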
Abstract:
This special issue aims to cover some problems related to non-linear and non-conventional speech processing. The origin of this volume is the ISCA Tutorial and Research Workshop on Non-Linear Speech Processing, NOLISP’09, held at the Universitat de Vic (Catalonia, Spain) on June 25-27, 2009. The series of NOLISP workshops, started in 2003, has become a biennial event whose aim is to discuss alternative techniques for speech processing that, in a sense, do not fit into mainstream approaches. A selection of papers based on the presentations delivered at NOLISP’09 has given rise to this issue of Cognitive Computation.
Abstract:
Mass spectrometry (MS) is currently the most sensitive and selective analytical technique for routine peptide and protein structure analysis. Top-down proteomics is based on tandem mass spectrometry (MS/MS) of intact proteins, where multiply charged precursor ions are fragmented in the gas phase, typically by electron transfer or electron capture dissociation, to yield sequence-specific fragment ions. This approach is primarily used for the study of protein isoforms, including localization of post-translational modifications and identification of splice variants. Bottom-up proteomics is utilized for routine high-throughput protein identification and quantitation from complex biological samples. The proteins are first enzymatically digested into small (usually less than ca. 3 kDa) peptides, these are identified by MS or MS/MS, usually employing collisional activation techniques. To overcome the limitations of these approaches while combining their benefits, middle-down proteomics has recently emerged. Here, the proteins are digested into long (3-15 kDa) peptides via restricted proteolysis followed by the MS/MS analysis of the obtained digest. With advancements of high-resolution MS and allied techniques, routine implementation of the middle-down approach has been made possible. Herein, we present the liquid chromatography (LC)-MS/MS-based experimental design of our middle-down proteomic workflow coupled with post-LC supercharging.
Abstract:
In this paper we present a method for blind deconvolution of linear channels based on source separation techniques, for real-world signals. Applied to blind deconvolution problems, this technique exploits not the spatial independence between signals but the temporal independence between samples of the signal. Our objective is to minimize the mutual information between samples of the output in order to retrieve the original signal. To make use of this idea, the input signal must be a non-Gaussian i.i.d. signal. Because most real-world signals do not have this i.i.d. nature, we need to preprocess the original signal before transmission into the channel. Likewise, we must ensure that the transmitted signal has non-Gaussian statistics for the algorithm to function correctly. The strategy used for this preprocessing is presented in this paper. If the receiver knows the inverse of the preprocessing, the original signal can be reconstructed without the convolutive distortion.
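The paper's algorithm minimizes mutual information between output samples; as a rough, minimal stand-in, the Python sketch below maximizes the absolute excess kurtosis of the output (a related non-Gaussianity criterion) over the taps of an FIR equalizer, using a binary i.i.d. source and a toy FIR channel. The filter length, channel coefficients, and optimizer settings are illustrative assumptions, not the paper's method.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.signal import lfilter

def excess_kurtosis(y):
    """Normalized fourth moment minus 3 (zero for a Gaussian signal)."""
    z = (y - y.mean()) / (y.std() + 1e-12)
    return np.mean(z ** 4) - 3.0

def blind_deconvolve(received, eq_len=9):
    """Blind equalization sketch: choose FIR equalizer taps that maximize
    the absolute excess kurtosis of the output, a non-Gaussianity proxy
    for the mutual-information criterion described in the abstract."""
    def objective(w):
        return -abs(excess_kurtosis(lfilter(w, [1.0], received)))

    w0 = np.zeros(eq_len)
    w0[eq_len // 2] = 1.0                        # start from a pure delay
    res = minimize(objective, w0, method="Nelder-Mead",
                   options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-9})
    return lfilter(res.x, [1.0], received)

# Toy experiment: a binary (non-Gaussian, i.i.d.) source through an FIR channel.
rng = np.random.default_rng(1)
source = rng.choice([-1.0, 1.0], size=4000)
received = lfilter([1.0, 0.5, -0.3], [1.0], source)   # convolutive distortion
recovered = blind_deconvolve(received)
print(f"excess kurtosis before: {excess_kurtosis(received):.2f}, "
      f"after: {excess_kurtosis(recovered):.2f}")
```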
Abstract:
We propose an in-depth study of tissue modelization and classification techniques on T1-weighted MR images. Three approaches have been taken into account in this validation study. Two of them are based on the Finite Gaussian Mixture (FGM) model: the first uses only pure Gaussian distributions (FGM-EM), while the second uses a different model for partial volume (PV) voxels (FGM-GA). The third is based on a Hidden Markov Random Field (HMRF) model. All methods have been tested on a digital brain phantom image considered as the ground truth. Noise and intensity non-uniformities have been added to simulate real image conditions, and the effect of an anisotropic filter is also considered. Results demonstrate that methods relying on both intensity and spatial information are in general more robust to noise and inhomogeneities. However, in some cases there are no significant differences between the presented methods.
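A minimal Python sketch of the intensity-only FGM-EM variant follows: it fits a finite Gaussian mixture to voxel intensities with EM and labels each voxel by its most probable component. The partial-volume model (FGM-GA) and the HMRF spatial regularization are not covered, and the synthetic intensities are placeholders for real phantom data.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fgm_em_segment(intensities, n_tissues=3, seed=0):
    """Intensity-only tissue classification in the spirit of the FGM-EM
    approach: fit a finite Gaussian mixture (one component per tissue,
    e.g. CSF/GM/WM) to the voxel intensities with EM and assign each
    voxel to its most probable component."""
    x = np.asarray(intensities, dtype=float).reshape(-1, 1)
    gmm = GaussianMixture(n_components=n_tissues, random_state=seed).fit(x)
    labels = gmm.predict(x)
    # Reorder labels so that 0, 1, 2 follow increasing mean intensity.
    order = np.argsort(gmm.means_.ravel())
    remap = np.empty(n_tissues, dtype=int)
    remap[order] = np.arange(n_tissues)
    return remap[labels]

# Toy "T1-weighted" intensities for three tissue classes plus noise.
rng = np.random.default_rng(7)
voxels = np.concatenate([rng.normal(m, 8.0, 4000) for m in (40, 110, 160)])
tissue_map = fgm_em_segment(voxels)
print(np.bincount(tissue_map))
```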