902 results for Application method


Relevance:

30.00%

Publisher:

Abstract:

Recognition by the T-cell receptor (TCR) of immunogenic peptides (p) presented by Class I major histocompatibility complexes (MHC) is the key event in the immune response against virus-infected cells or tumor cells. A study of the 2C TCR/SIYR/H-2K(b) system using computational alanine scanning and a much faster binding free energy decomposition based on the Molecular Mechanics-Generalized Born Surface Area (MM-GBSA) method is presented. The results show that the TCR-p-MHC binding free energy decomposition using this approach and including entropic terms provides a detailed and reliable description of the interactions between the molecules at an atomistic level. Comparison of the decomposition results with experimentally determined activity differences for alanine mutants yields a correlation of 0.67 when the entropy is neglected and 0.72 when the entropy is taken into account. Similarly, comparison of experimental activities with variations in binding free energies determined by computational alanine scanning yields correlations of 0.72 and 0.74 when the entropy is neglected or taken into account, respectively. Some key interactions for TCR-p-MHC binding are analyzed and some possible side-chain replacements are proposed in the context of TCR protein engineering. In addition, a comparison of the two theoretical approaches for estimating the role of each side chain in the complexation is given, and a new ad hoc approach to decompose the vibrational entropy term into atomic contributions, the linear decomposition of the vibrational entropy (LDVE), is introduced. The latter allows the rapid calculation of the entropic contribution of interesting side chains to the binding. This new method is based on the idea that the most important contributions to the vibrational entropy of a molecule originate from residues that contribute most to the vibrational amplitude of the normal modes. The LDVE approach is shown to provide results very similar to those of the exact but highly computationally demanding method.
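
For orientation, the quantity being decomposed has the standard MM-GBSA form sketched below: the binding free energy is approximated by molecular-mechanics, generalized-Born solvation and surface-area terms minus a vibrational entropy contribution, and both the total and the entropy term can be split into per-residue contributions. This is the generic textbook form with assumed notation, not an equation quoted from the study.

```latex
\[
\Delta G_{\text{bind}} \;\approx\;
  \langle \Delta E_{\text{MM}} \rangle
  + \langle \Delta G_{\text{GB}} \rangle
  + \langle \Delta G_{\text{SA}} \rangle
  - T\,\Delta S_{\text{vib}},
\qquad
\Delta G_{\text{bind}} \;\approx\; \sum_{i \,\in\, \text{residues}} \Delta G_i .
\]
```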

Relevance:

30.00%

Publisher:

Abstract:

Given the urgency of a new paradigm in wireless digital transmission that should allow for higher bit rates, lower latency and tighter delay constraints, it has been proposed to investigate the fundamental building blocks that, at the circuit/device level, will drive the change towards a more efficient network architecture with higher capacity, higher bandwidth and a more satisfactory end-user experience. At the core of each transceiver there are inherently analog devices capable of providing the carrier signal: the oscillators. It is strongly believed that many limitations in today's communication protocols could be relieved by permitting high carrier frequency radio transmission and by providing some degree of reconfigurability. This led us to study distributed oscillator architectures which work in the microwave range and possess wideband tuning capability. As microwave oscillators are essentially nonlinear devices, a full nonlinear analysis, synthesis, and optimization had to be considered for their implementation. Consequently, the most widely used nonlinear numerical techniques in commercial EDA software have been reviewed. An application of the aforementioned techniques is shown for a system of three coupled oscillators (a "triple-push" oscillator), in which the stability of the various oscillating modes has been studied. Provided that a certain phase distribution is maintained among the oscillating elements, this topology permits a rise in the output power of the third harmonic; nevertheless, due to circuit symmetry, "unwanted" oscillating modes coexist with the intended one. Starting with the necessary background on distributed amplification and distributed oscillator theory, the design of a four-stage reverse-mode distributed voltage-controlled oscillator (DVCO) using lumped elements is presented. All the design steps are reported and, for the first time, a method for an optimized design with reduced variations in the output power is presented. Ongoing work is devoted to modelling a wideband DVCO and to implementing a frequency divider.
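
As an aside that may help visualize the triple-push principle described above (a generic illustration, not the authors' circuit), the short sketch below combines three identical oscillator waveforms phased 120 degrees apart: the fundamental components cancel by symmetry while the third harmonics add in phase. Waveform shape and harmonic content are assumed purely for demonstration.

```python
import numpy as np

# Illustrative triple-push combining: three oscillators spaced 120 degrees apart.
# The waveform (fundamental plus a weak third harmonic) is assumed, not taken
# from the paper; only the phase arithmetic matters here.
f0 = 1.0                                   # normalized fundamental frequency
t = np.linspace(0.0, 8.0, 4000, endpoint=False)

def osc(phase):
    """One oscillator output: fundamental plus a small third-harmonic component."""
    arg = 2 * np.pi * f0 * t + phase
    return np.cos(arg) + 0.2 * np.cos(3 * arg)

# Sum the three outputs; only components whose phases align (3 * 120 deg = 360 deg)
# survive, so the combined signal oscillates at 3 * f0.
combined = sum(osc(k * 2 * np.pi / 3) for k in range(3))

spectrum = np.abs(np.fft.rfft(combined)) / len(t)
freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
for harmonic in (1, 3):
    idx = np.argmin(np.abs(freqs - harmonic * f0))
    print(f"relative amplitude at {harmonic}*f0: {spectrum[idx]:.3f}")
```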

Relevance:

30.00%

Publisher:

Abstract:

A multiplex polymerase chain reaction (PCR) assay was performed on 167 thermophilic campylobacters isolated from non-human primates. Samples were first identified by phenotypic methods, resulting in 64 Campylobacter jejuni and 103 C. coli strains. Four strains identified biochemically as C. coli were then determined to be C. jejuni by PCR. Comparison of methodologies showed that the main discrepancies were attributed to the hippurate hydrolysis test and to sensitivity to cephalothin and nalidixic acid. Analysis of the data showed that the application of phenotypic methods should be supplemented by a molecular method to offer more reliable Campylobacter identification.
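
As a small illustration of the kind of method comparison reported here, the sketch below cross-tabulates phenotypic against PCR identifications; the per-strain labels are hypothetical placeholders arranged to match the counts quoted in the abstract.

```python
import pandas as pd

# Hypothetical per-strain labels consistent with the abstract: 167 strains,
# 64 C. jejuni + 103 C. coli by phenotype, of which 4 "C. coli" are C. jejuni by PCR.
pheno = ["C. jejuni"] * 64 + ["C. coli"] * 103
pcr = ["C. jejuni"] * 64 + ["C. jejuni"] * 4 + ["C. coli"] * 99

table = pd.crosstab(pd.Series(pheno, name="phenotypic"),
                    pd.Series(pcr, name="multiplex PCR"))
print(table)
print("discrepant strains:", sum(p != q for p, q in zip(pheno, pcr)))
```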

Relevance:

30.00%

Publisher:

Abstract:

Strains of enterotoxigenic Escherichia coli (ETEC) are responsible for significant rates of morbidity and mortality among children, particularly in developing countries. The majority of clinical and public health laboratories are capable of isolating and identifying Salmonella, Shigella, Campylobacter, and Escherichia coli O157:H7 from stool samples, but ETEC cannot be identified by routine methods. The method most often used to identify ETEC is polymerase chain reaction for the heat-stable and heat-labile enterotoxin genes, followed by serotyping, but most clinical and public health laboratories do not have the capacity or resources to perform these tests. In this study, polyclonal rabbit and monoclonal mouse IgG2b antibodies against ETEC heat-labile toxin-I (LT) were characterized and the potential applicability of a capture assay was analyzed. IgG-enriched fractions from the rabbit polyclonal and the IgG2b monoclonal antibodies recognized LT in its conformational shape and proved to be excellent tools for detection of LT-producing strains. These findings indicate that the capture immunoassay could be used as a diagnostic assay for ETEC LT-producing strains in routine diagnosis and in epidemiological studies of diarrhea in developing countries, as enzyme-linked immunosorbent assay techniques remain an effective and economical choice for the detection of specific pathogen antigens in cultures.

Relevance:

30.00%

Publisher:

Abstract:

Quantitatively assessing the importance or criticality of each link in a network is of practical value to operators, as it can help them to increase the network's resilience, provide more efficient services, or improve some other aspect of the service. Betweenness is a graph-theoretical measure of centrality that can be applied to communication networks to evaluate link importance. However, as we illustrate in this paper, the basic definition of betweenness centrality produces inaccurate estimations as it does not take into account some aspects relevant to networking, such as the heterogeneity in link capacity or the difference between node pairs in their contribution to the total traffic. A new algorithm for discovering link centrality in transport networks is proposed in this paper. It requires only static or semi-static network and topology attributes, and yet produces estimates of good accuracy, as verified through extensive simulations. Its potential value is demonstrated by an example application in which the simple shortest-path routing algorithm is improved in such a way that it outperforms other more advanced algorithms in terms of blocking ratio.
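
The kind of adjustment argued for above can be illustrated with a link score that, unlike plain betweenness, weights every node pair by its offered traffic and normalizes the resulting load by link capacity. This is only a sketch of the general idea, not the algorithm proposed in the paper; the graph, traffic matrix and capacities below are invented.

```python
import itertools
import networkx as nx

def traffic_capacity_link_score(G, traffic, capacity):
    """Illustrative link-importance score: for every ordered node pair (s, t), spread
    traffic[(s, t)] evenly over the shortest paths between s and t, accumulate the
    load carried by each link, and normalize by the link capacity."""
    load = {e: 0.0 for e in G.edges()}
    for s, t in itertools.permutations(G.nodes(), 2):
        demand = traffic.get((s, t), 0.0)
        if demand == 0.0:
            continue
        paths = list(nx.all_shortest_paths(G, s, t))
        share = demand / len(paths)
        for path in paths:
            for u, v in zip(path, path[1:]):
                edge = (u, v) if (u, v) in load else (v, u)
                load[edge] += share
    return {e: load[e] / capacity[e] for e in load}

# Toy topology (all values assumed): a 5-node ring with one high-capacity chord.
G = nx.cycle_graph(5)
G.add_edge(0, 2)
capacity = {e: 10.0 for e in G.edges()}
capacity[(0, 2)] = 40.0                          # heterogeneous link capacity
traffic = {(s, t): 1.0 for s in G for t in G if s != t}
traffic[(1, 4)] = 5.0                            # one node pair dominates the traffic

for edge, score in sorted(traffic_capacity_link_score(G, traffic, capacity).items()):
    print(edge, round(score, 3))
```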

Relevance:

30.00%

Publisher:

Abstract:

The use of molecular tools for genotyping Mycobacterium tuberculosis isolates in epidemiological surveys in order to identify clustered and orphan strains requires faster response times than those offered by the reference method, IS6110 restriction fragment length polymorphism (RFLP) genotyping. A method based on PCR, the mycobacterial interspersed repetitive-unit-variable-number tandem-repeat (MIRU-VNTR) genotyping technique, is an option for fast fingerprinting of M. tuberculosis, although precise evaluations of correlation between MIRU-VNTR and RFLP findings in population-based studies in different contexts are required before the methods are switched. In this study, we evaluated MIRU-VNTR genotyping (with a set of 15 loci [MIRU-15]) in parallel to RFLP genotyping in a 39-month universal population-based study in a challenging setting with a high proportion of immigrants. For 81.9% (281/343) of the M. tuberculosis isolates, both RFLP and MIRU-VNTR types were obtained. The percentages of clustered cases were 39.9% (112/281) and 43.1% (121/281) for RFLP and MIRU-15 analyses, and the numbers of clusters identified were 42 and 45, respectively. For 85.4% of the cases, the RFLP and MIRU-15 results were concordant, identifying the same cases as clustered and orphan (kappa, 0.7). However, for the remaining 14.6% of the cases, discrepancies were observed: 16 of the cases clustered by RFLP analysis were identified as orphan by MIRU-15 analysis, and 25 cases identified as orphan by RFLP analysis were clustered by MIRU-15 analysis. When discrepant cases showing subtle genotypic differences were tolerated, the discrepancies fell from 14.6% to 8.6%. Epidemiological links were found for 83.8% of the cases clustered by both RFLP and MIRU-15 analyses, whereas for the cases clustered by RFLP or MIRU-VNTR analysis alone, links were identified for only 30.8% or 38.9% of the cases, respectively. The latter group of cases mainly comprised isolates that could also have been clustered, if subtle genotypic differences had been tolerated. MIRU-15 genotyping seems to be a good alternative to RFLP genotyping for real-time interventional schemes. The correlation between MIRU-15 and IS6110 RFLP findings was reasonable, although some uncertainties as to the assignation of clusters by MIRU-15 analysis were identified.
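
The concordance figure quoted above (kappa of 0.7 between the RFLP and MIRU-15 clustered/orphan calls) is a standard Cohen's kappa on paired binary labels; a minimal sketch with invented labels is shown below, the real study having compared 281 isolates.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-isolate calls ("clustered" / "orphan") from the two typing methods;
# these ten labels are placeholders for illustration only.
rflp = ["clustered", "clustered", "orphan", "orphan", "clustered",
        "orphan", "clustered", "orphan", "clustered", "orphan"]
miru15 = ["clustered", "clustered", "orphan", "clustered", "clustered",
          "orphan", "clustered", "orphan", "orphan", "orphan"]

agreement = sum(a == b for a, b in zip(rflp, miru15)) / len(rflp)
print(f"raw agreement: {agreement:.2f}")
print(f"Cohen's kappa: {cohen_kappa_score(rflp, miru15):.2f}")
```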

Relevance:

30.00%

Publisher:

Abstract:

RATIONALE AND OBJECTIVE: The information assessment method (IAM) permits health professionals to systematically document the relevance, cognitive impact, use and health outcomes of information objects delivered by or retrieved from electronic knowledge resources. The companion review paper (Part 1) critically examined the literature and proposed a 'Push-Pull-Acquisition-Cognition-Application' evaluation framework, which is operationalized by IAM. The purpose of the present paper (Part 2) is to examine the content validity of the IAM cognitive checklist when linked to email alerts. METHODS: A qualitative component of a mixed methods study was conducted with 46 doctors reading and rating research-based synopses sent by email. The unit of analysis was a doctor's explanation of a rating of one item regarding one synopsis. Interviews with participants provided 253 units that were analysed to assess concordance with item definitions. RESULTS AND CONCLUSION: The content relevance of seven items was supported. For three items, revisions were needed. Interviews suggested one new item. This study has yielded a 2008 version of IAM.

Relevance:

30.00%

Publisher:

Abstract:

In this work, a new method is proposed to estimate, in real time, the quality of the final product in batch processes. This method reduces the time needed to obtain the quality results from laboratory analyses. A principal component analysis (PCA) model built with historical data collected under normal operating conditions is used to discern whether a finished batch is normal or not. A fault signature is calculated for the abnormal batches and passed through a classification model for its estimation. The study proposes a method to use the information from contribution plots based on fault signatures, where the indicators represent the behaviour of the variables throughout the process in the different stages. A dataset composed of the fault signatures of historical abnormal batches is built to search for patterns and to train the classification models that estimate the results of future batches. The proposed methodology has been applied to a sequencing batch reactor (SBR). Several classification algorithms are tested to demonstrate the possibilities of the proposed methodology.
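
A minimal sketch of the monitoring scheme described above, using invented data: a PCA model is fitted on batches run under normal operating conditions, a finished batch is flagged by its squared prediction error (Q statistic), and the per-variable contributions of a flagged batch form the fault signature that a classifier would then consume. Thresholds, dimensions and the fault itself are assumptions for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Invented data: 100 normal historical batches described by 20 variables (in practice
# each variable would summarize the trajectory of a measurement across the batch stages).
X_normal = rng.normal(size=(100, 20))
pca = PCA(n_components=5).fit(X_normal)

def q_statistic(X):
    """Squared prediction error (Q) of each batch with respect to the PCA model."""
    residual = X - pca.inverse_transform(pca.transform(X))
    return (residual ** 2).sum(axis=1)

# Control limit from the normal batches (simple empirical percentile, an assumption).
q_limit = np.percentile(q_statistic(X_normal), 99)

# A new, finished batch with a simulated fault on variables 3 and 7.
x_new = rng.normal(size=(1, 20))
x_new[0, [3, 7]] += 6.0

q_new = q_statistic(x_new)[0]
if q_new > q_limit:
    # Fault signature: per-variable contributions to Q, used as input to the classifier.
    residual = x_new - pca.inverse_transform(pca.transform(x_new))
    signature = (residual ** 2)[0]
    top = np.argsort(signature)[::-1][:3]
    print(f"abnormal batch (Q = {q_new:.1f} > limit {q_limit:.1f}); "
          f"top contributing variables: {top.tolist()}")
else:
    print("batch consistent with normal operating conditions")
```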

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the design and implementation of QRP, an open-source proof-of-concept authentication system that provides two-factor authentication by combining a password with a camera-equipped mobile phone acting as an authentication token. QRP is extremely secure, as all the sensitive information stored and transmitted is encrypted, but it is also an easy-to-use and cost-efficient solution. QRP is portable and can be used securely on untrusted computers. Finally, QRP is able to successfully authenticate even when the phone is offline.
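
The abstract does not spell out the protocol, but the offline property suggests a challenge-response scheme in which the phone holds a pre-shared secret and answers a challenge displayed as a QR code. The sketch below illustrates that general idea with a plain HMAC; it is an assumption-laden toy, not the actual QRP design.

```python
import hmac
import hashlib
import os

# Pre-shared secret stored on the phone at enrolment (invented for illustration);
# the server keeps its own copy.
PHONE_SECRET = os.urandom(32)
SERVER_COPY = PHONE_SECRET

def server_make_challenge():
    """Server generates a one-time challenge; in a QRP-like scheme this would be
    rendered as a QR code on the login page."""
    return os.urandom(16)

def phone_respond(secret, challenge):
    """Phone scans the QR code and computes the response fully offline."""
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()[:8]

def server_verify(challenge, response):
    expected = hmac.new(SERVER_COPY, challenge, hashlib.sha256).hexdigest()[:8]
    return hmac.compare_digest(expected, response)

challenge = server_make_challenge()
response = phone_respond(PHONE_SECRET, challenge)   # short code typed back by the user
print("second factor accepted:", server_verify(challenge, response))
```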

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To develop and evaluate a practical method for the quantification of signal-to-noise ratio (SNR) on coronary MR angiograms (MRA) acquired with parallel imaging. Materials and Methods: To quantify the spatially varying noise due to parallel imaging reconstruction, a new method has been implemented incorporating image data acquisition followed by a fast noise scan during which radio-frequency pulses, cardiac triggering and navigator gating are disabled. The performance of this method was evaluated in a phantom study where SNR measurements were compared with those of a reference standard (multiple repetitions). Subsequently, SNR of myocardium and posterior skeletal muscle was determined on in vivo human coronary MRA. Results: In a phantom, the SNR measured using the proposed method deviated less than 10.1% from the reference method for small geometry factors (<= 2). In vivo, the noise scan for a 10 min coronary MRA acquisition was acquired in 30 s. Higher signal and lower SNR, due to spatially varying noise, were found in myocardium compared with posterior skeletal muscle. Conclusion: SNR quantification based on a fast noise scan is a validated and easy-to-use method when applied to three-dimensional coronary MRA obtained with parallel imaging as long as the geometry factor remains low.
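
Numerically, the estimate such a noise scan enables is straightforward: with the RF pulses off, the standard deviation of the noise-scan voxels gives the local noise level, and SNR is the mean signal of a region of interest divided by that value. The sketch below uses synthetic arrays as stand-ins for the image and noise data; the actual parallel-imaging reconstruction and the spatial g-factor variation are outside its scope.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins (values invented): a myocardium-like region of interest from the
# image data and the corresponding voxels from the RF-off noise scan.
roi_signal = 100.0 + 5.0 * rng.normal(size=500)
roi_noise = 4.0 * rng.normal(size=500)

# Local noise level from the noise-only scan, then SNR for the region of interest.
snr = roi_signal.mean() / roi_noise.std(ddof=1)
print(f"estimated SNR in the ROI: {snr:.1f}")
```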

Relevance:

30.00%

Publisher:

Abstract:

In CoDaWork'05, we presented an application of discriminant function analysis (DFA) to 4 different compositional datasets and modelled the first canonical variable using a segmented regression model solely based on an observation about the scatter plots. In this paper, multiple linear regressions are applied to different datasets to confirm the validity of our proposed model. In addition to dating the unknown tephras by calibration as discussed previously, another method of mapping the unknown tephras into samples of the reference set or missing samples in between consecutive reference samples is proposed. The application of these methodologies is demonstrated with both simulated and real datasets. This new proposed methodology provides an alternative, more acceptable approach for geologists, as their focus is on mapping the unknown tephra with relevant eruptive events rather than estimating the age of unknown tephra. Key words: Tephrochronology; Segmented regression
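
Segmented (piecewise linear) regression of the kind referred to above can be sketched as follows: fit two joined line segments by least squares and grid-search the breakpoint. The data, breakpoint grid and noise level are synthetic; this is the generic technique, not the authors' fitted model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: the response changes slope at x = 5 (all values invented).
x = np.linspace(0, 10, 80)
y = np.where(x < 5, 1.0 * x, 5.0 + 3.0 * (x - 5)) + rng.normal(scale=0.4, size=x.size)

def fit_segmented(x, y, breakpoint):
    """Least squares with a hinge term, i.e. two line segments joined at the breakpoint."""
    design = np.column_stack([np.ones_like(x), x, np.clip(x - breakpoint, 0, None)])
    coef, rss, *_ = np.linalg.lstsq(design, y, rcond=None)
    total_rss = float(rss[0]) if rss.size else float(np.sum((design @ coef - y) ** 2))
    return coef, total_rss

# Grid-search the breakpoint by residual sum of squares.
candidates = np.linspace(1, 9, 81)
best_bp = min(candidates, key=lambda b: fit_segmented(x, y, b)[1])
coef, _ = fit_segmented(x, y, best_bp)
print(f"estimated breakpoint: {best_bp:.2f}; segment slopes: "
      f"{coef[1]:.2f} and {coef[1] + coef[2]:.2f}")
```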

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: We describe a retinal endovascular fibrinolysis technique to directly reperfuse experimentally occluded retinal veins using a simple micropipette. METHODS: Retinal vein occlusion was photochemically induced in 12 eyes of 12 minipigs: after intravenous injection of 10% fluorescein (1-mL bolus), the targeted retinal vein segment was exposed to thrombin (50 units) and to Argon laser (100-200 mW) through a pars plana approach. A beveled micropipette with a 30-μm-diameter sharp edge was used for micropuncture of the occluded vein and endovascular microinjection of tissue plasminogen activator (50 μg/mL) in 11 eyes. In one control eye, balanced salt solution was injected. The lesion site was examined histologically. RESULTS: Retinal vein occlusion was achieved in all cases. Endovascular microinjection of tissue plasminogen activator or balanced salt solution led to reperfusion of the occluded retinal vein in all cases. Indicative of successful reperfusion were the following: continuous endovascular flow, unaffected collateral circulation, no optic disk ischemia, and no venous wall bleeding. However, balanced salt solution injection was accompanied by thrombus formation at the punctured site, whereas no thrombus was observed with tissue plasminogen activator injection. CONCLUSION: Retinal endovascular fibrinolysis constitutes an efficient method of micropuncture and reperfusion of an experimentally occluded retinal vein. Thrombus formation at the punctured site can be prevented by injection of tissue plasminogen activator.

Relevance:

30.00%

Publisher:

Abstract:

The 2009-2010 Data Fusion Contest organized by the Data Fusion Technical Committee of the IEEE Geoscience and Remote Sensing Society was focused on the detection of flooded areas using multi-temporal and multi-modal images. Both high spatial resolution optical and synthetic aperture radar data were provided. The goal was not only to identify the best algorithms (in terms of accuracy), but also to investigate the further improvement derived from decision fusion. This paper presents the four awarded algorithms and the conclusions of the contest, investigating both supervised and unsupervised methods and the use of multi-modal data for flood detection. Interestingly, a simple unsupervised change detection method provided accuracy similar to that of the supervised approaches, and a digital elevation model-based predictive method yielded a comparable projected change detection map without using post-event data.
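
In its most basic form, the simple unsupervised change detection mentioned above is image differencing followed by automatic thresholding; the toy sketch below uses synthetic pre- and post-event images and a mean-plus-two-sigma threshold (an assumption standing in for the more principled thresholds used by the contest entries).

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic pre- and post-event intensity images; a block of pixels is darkened in the
# post image to mimic a flooded area. All values are invented.
pre = rng.normal(loc=1.0, scale=0.1, size=(100, 100))
post = pre + rng.normal(scale=0.05, size=pre.shape)
post[30:60, 40:80] -= 0.5                     # "flooded" region

diff = np.abs(post - pre)

# Simple automatic threshold on the difference image (mean + 2 sigma, an assumption).
threshold = diff.mean() + 2 * diff.std()
change_map = diff > threshold
print(f"flagged pixels: {change_map.sum()} ({100 * change_map.mean():.1f}% of the scene)")
```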

Relevance:

30.00%

Publisher:

Abstract:

Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and their discrimination from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity was reduced to 76% and 74%, respectively, for the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% and concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
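
A minimal sketch of the general pipeline implied above, with placeholder data: unmix multichannel EEG segments with ICA, derive simple per-component features, and train a classifier to separate epileptiform events from eye blinks. The data, features and classifier are all assumptions for illustration, not the authors' algorithm.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

# Placeholder "EEG segments": 200 events, 8 channels, 64 samples each.
n_events, n_channels, n_samples = 200, 8, 64
segments = rng.normal(size=(n_events, n_channels, n_samples))
labels = rng.integers(0, 2, size=n_events)                  # 1 = epileptiform, 0 = eye blink
segments[labels == 1, 2, :] += np.hanning(n_samples) * 3.0  # crude "spike" on one channel

# Unmix each segment with ICA and keep sorted peak amplitudes of the components.
ica = FastICA(n_components=4, max_iter=500, random_state=0)
features = []
for seg in segments:
    sources = ica.fit_transform(seg.T)                      # samples x components
    features.append(np.sort(np.abs(sources).max(axis=0)))
features = np.array(features)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, features, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```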

Relevance:

30.00%

Publisher:

Abstract:

Terrestrial laser scanning (TLS) is one of the most promising surveying techniques for rock slope characterization and monitoring. Landslide and rockfall movements can be detected by means of comparison of sequential scans. One of the most pressing challenges of natural hazards is combined temporal and spatial prediction of rockfall. An outdoor experiment was performed to ascertain whether the TLS instrumental error is small enough to enable detection of precursory displacements of millimetric magnitude. This consists of a known displacement of three objects relative to a stable surface. Results show that millimetric changes cannot be detected by the analysis of the unprocessed datasets. Displacement measurements are improved considerably by applying Nearest Neighbour (NN) averaging, which reduces the error (1σ) by up to a factor of 6. This technique was applied to displacements prior to the April 2007 rockfall event at Castellfollit de la Roca, Spain. The maximum precursory displacement measured was 45 mm, approximately 2.5 times the standard deviation of the model comparison, hampering the distinction between actual displacement and instrumental error using conventional methodologies. Encouragingly, the precursory displacement was clearly detected by applying the NN averaging method. These results show that millimetric displacements prior to failure can be detected using TLS.
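
The Nearest Neighbour averaging step can be sketched as follows: each point's measured distance to the reference model is replaced by the mean over its k nearest neighbours, which reduces random instrumental error roughly by the square root of k. The point cloud, noise level and choice of k below are invented; this shows the general idea rather than the paper's exact processing chain.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)

# Synthetic scan: 20 000 points on a planar slope, each carrying a per-point "distance
# to the reference scan" made of a true 3 mm displacement plus instrumental noise.
points = rng.uniform(0, 10, size=(20_000, 2))      # planar coordinates (m)
true_displacement = 0.003                          # 3 mm
noise_sigma = 0.008                                # 8 mm instrumental error (assumed)
measured = true_displacement + rng.normal(scale=noise_sigma, size=len(points))

# Nearest Neighbour averaging: replace each value by the mean over its k neighbours.
k = 36
_, idx = cKDTree(points).query(points, k=k)
averaged = measured[idx].mean(axis=1)

print(f"raw error (1 sigma):      {measured.std():.4f} m")
print(f"averaged error (1 sigma): {averaged.std():.4f} m")   # roughly sigma / sqrt(k)
```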