917 results for post-processing method
Abstract:
Arenaviruses merit interest as clinically important human pathogens and include several causative agents of hemorrhagic fever disease in humans, chiefly Lassa virus (LASV). There are no licensed LASV vaccines, and current antiarenavirus therapy is limited to the use of ribavirin, which is only partially effective and is associated with significant side effects. The arenavirus glycoprotein (GP) precursor GPC is processed by the cellular site 1 protease (S1P) to generate the peripheral virion attachment protein GP1 and the fusion-active transmembrane protein GP2, a step critical for the production of infectious progeny and virus propagation. Therefore, S1P-mediated processing of arenavirus GPC is a promising target for therapeutic intervention. To this end, we have evaluated the antiarenaviral activity of PF-429242, a recently described small-molecule inhibitor of S1P. PF-429242 efficiently prevented the processing of GPC from the prototypic arenavirus lymphocytic choriomeningitis virus (LCMV) and from LASV, which correlated with the compound's potent antiviral activity against LCMV and LASV in cultured cells. In contrast, a recombinant LCMV expressing a GPC whose processing into GP1 and GP2 was mediated by furin, instead of S1P, was highly resistant to PF-429242 treatment. PF-429242 did not affect virus RNA replication or budding but had a modest effect on virus cell entry, indicating that the antiarenaviral activity of PF-429242 was mostly related to its ability to inhibit S1P-mediated processing of arenavirus GPC. Our findings support the feasibility of using small-molecule inhibitors of S1P-mediated processing of arenavirus GPC as a novel antiviral strategy.
Abstract:
The STAR family of proteins links signaling pathways to various aspects of post-transcriptional regulation and processing of RNAs. Sam68 belongs to this class of heterogeneous nuclear ribonucleoprotein particle K (hnRNP K) homology (KH) single-domain RNA-binding proteins, which also contain domains predicted to bind critical components of signal transduction pathways. In response to phosphorylation and other post-translational modifications, Sam68 has been shown to link signal transduction pathways to downstream effects regulating RNA metabolism, including transcription, alternative splicing or RNA transport. In addition to its function as a docking protein in some signaling pathways, this prototypic STAR protein localizes to the nucleus and takes part in the formation of both nuclear and cytosolic multi-molecular complexes such as Sam68 nuclear bodies and stress granules. By coupling with other proteins and RNA targets, Sam68 may play a role in the regulation of differential expression and of mRNA processing and translation according to internal and external signals, thus mediating important physiological functions such as cell death, proliferation or cell differentiation.
Abstract:
Rendering realistic animations is known to be an expensive processing task when physically-based global illumination methods are used to improve illumination details. This paper presents an acceleration technique for computing animations in radiosity environments. The technique is based on an interpolation approach that exploits temporal coherence in radiosity. A fast global Monte Carlo pre-processing step over the whole animated sequence is introduced to select important frames. These are fully computed and used as a basis for interpolating the rest of the sequence. The approach is completely view-independent: once the illumination is computed, it can be visualized by any animated camera. Results show significant speed-ups, indicating that the technique could be an interesting alternative to deterministic methods for computing non-interactive radiosity animations for moderately complex scenarios.
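A minimal Python sketch of the key-frame idea described above, assuming the coarse global Monte Carlo pre-pass yields one per-patch radiosity estimate per frame and that key frames are selected whenever this estimate drifts beyond a relative threshold. The function names, the threshold criterion and the linear interpolation are illustrative, not the paper's actual algorithm.

```python
import numpy as np

def select_key_frames(mc_radiosity, threshold=0.05):
    """Pick frames whose coarse Monte Carlo radiosity estimate differs
    from the last selected key frame by more than `threshold` (relative L2)."""
    keys = [0]
    for i in range(1, len(mc_radiosity)):
        ref = mc_radiosity[keys[-1]]
        change = np.linalg.norm(mc_radiosity[i] - ref) / (np.linalg.norm(ref) + 1e-12)
        if change > threshold:
            keys.append(i)
    if keys[-1] != len(mc_radiosity) - 1:
        keys.append(len(mc_radiosity) - 1)
    return keys

def interpolate_sequence(key_frames, key_solutions, n_frames):
    """Linearly interpolate full radiosity solutions between key frames."""
    full = np.empty((n_frames,) + key_solutions[0].shape)
    for (i0, b0), (i1, b1) in zip(zip(key_frames, key_solutions),
                                  zip(key_frames[1:], key_solutions[1:])):
        for f in range(i0, i1 + 1):
            t = (f - i0) / max(i1 - i0, 1)
            full[f] = (1 - t) * b0 + t * b1
    return full
```

A hypothetical driver would be `keys = select_key_frames(coarse_estimates)` followed by `interpolate_sequence(keys, [solve_radiosity(k) for k in keys], len(coarse_estimates))`, where `solve_radiosity` stands in for the full radiosity solver.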
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and their discrimination from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity fell to 76% and 74%, respectively, at the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% and a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
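The abstract does not detail how the ICA-derived features are built. As a hedged illustration of the general pipeline (ICA decomposition, per-component summary features, supervised classification), here is a small sketch using scikit-learn's FastICA and logistic regression; the epoch layout, feature choices and classifier are assumptions, not the published method.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression

def ica_features(epochs, n_components=10):
    """Project each EEG epoch (channels x samples) onto ICA components fitted
    on the concatenated training data, and summarise each component's
    activation by its peak absolute amplitude and variance.
    Assumes the montage has at least `n_components` channels."""
    n_epochs, n_channels, n_samples = epochs.shape
    ica = FastICA(n_components=n_components, random_state=0)
    # Fit on the concatenated data: samples as observations, channels as variables.
    ica.fit(np.hstack(epochs).T)
    feats = []
    for ep in epochs:
        sources = ica.transform(ep.T).T          # components x samples
        feats.append(np.concatenate([np.abs(sources).max(axis=1),
                                     sources.var(axis=1)]))
    return np.array(feats), ica

def train_detector(epochs, labels):
    """Illustrative training step on labelled epochs
    (1 = epileptiform event, 0 = eye blink / other)."""
    X, ica = ica_features(epochs)
    clf = LogisticRegression(max_iter=1000).fit(X, labels)
    return clf, ica
```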
Abstract:
Platelet-rich plasma (PRP) is a plasma fraction of autologous blood with a platelet concentration above baseline whole-blood values, obtained by processing and concentration. PRP is used in various surgical fields to enhance soft-tissue and bone healing by delivering supra-physiological concentrations of autologous platelets at the site of tissue damage. These preparations may provide a good cellular source of various growth factors and cytokines, and may modulate the tissue response to injury. Commonly available clinical materials for blood preparation, combined with a two-step centrifugation protocol at 280 g per step to ensure cellular component integrity, provided platelet preparations that were concentrated 2-3 fold over whole-blood values. Costs were shown to be lower than those of other methods that require specific equipment and high-cost disposables, while safety and traceability can be increased. PRP can be used for the treatment of wounds of all types, including burns, as well as split-thickness skin graft donor sites, which are frequently used in burn management. The procedure can be standardized and is easy to adopt in clinical settings with minimal infrastructure, thus enabling large numbers of patients to benefit from a form of cellular therapy.
Abstract:
A novel approach to the study of hepatic glycogen kinetics and fractional gluconeogenesis in vivo is described. Ten healthy female subjects were fed an isocaloric diet containing 55% carbohydrate energy with a 13C abundance of 1.083 atom percent for a 3-day baseline period; then a diet of similar composition, but providing carbohydrate with a 13C abundance of 1.093 atom percent, was started and continued for 5 days. Resting respiratory gas exchange, urinary nitrogen excretion, breath 13CO2 and plasma 13C glucose were measured every morning in the fasting state. The 13C enrichment of hepatic glycogen was calculated from these measured data. 13C glycogen enrichment increased after switching to the 13C-enriched carbohydrate diet and became identical to the 13C enrichment of dietary carbohydrate after 3 days. The time required to renew 50% of hepatic glycogen, as determined from the kinetics of 13C glycogen enrichment, was 18.9 +/- 3.6 h. Fractional gluconeogenesis, as determined from the difference between the 13C enrichment of oxidized glucose originating from hepatic glycogen and that of plasma glucose, was 50.8 +/- 5.3%. This non-invasive method will allow the study of hepatic glycogen metabolism in insulin-resistant patients.
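Under two simple assumptions (a single hepatic glycogen pool with first-order turnover, and a two-source mixing model in which gluconeogenic glucose retains the baseline enrichment E_0 while glycogenolytic glucose carries the glycogen enrichment), the two reported quantities can be formalised roughly as below. These are illustrative reconstructions, not necessarily the authors' exact equations.

```latex
% Approach of hepatic glycogen 13C enrichment towards the dietary value
% (assumption: single pool, first-order turnover with rate constant k):
E_{\mathrm{gly}}(t) = E_{\mathrm{diet}} - \bigl(E_{\mathrm{diet}} - E_{0}\bigr)\, e^{-kt},
\qquad t_{1/2} = \frac{\ln 2}{k} \approx 18.9\ \mathrm{h}

% Two-source mixing model for fasting plasma glucose, giving the
% gluconeogenic fraction f:
E_{\mathrm{plasma}} = (1-f)\,E_{\mathrm{gly}} + f\,E_{0}
\quad\Longrightarrow\quad
f = \frac{E_{\mathrm{gly}} - E_{\mathrm{plasma}}}{E_{\mathrm{gly}} - E_{0}} \approx 0.51
```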
Abstract:
Saffaj et al. recently criticized our method for monitoring carbon dioxide in human postmortem cardiac gas samples using headspace gas chromatography-mass spectrometry. According to the authors, their demonstration, based on the latest SFSTP guidelines (established after 2007 [1,2]) and intended for the validation of bioanalytical drug-monitoring methods, highlighted potential errors. However, our validation approach was built using the SFSTP guidelines established before 2007 [3-6]. We justify the use of these guidelines by the post-mortem context of the study (rather than clinical) and the gaseous state of the sample (rather than solid or liquid). Using these guidelines, our validation remains correct.
Abstract:
Lexical Resources are a critical component of Natural Language Processing applications. However, the high cost of comparing and merging different resources has been a bottleneck to obtaining richer resources and a broader range of potential uses for a significant number of languages. With the objective of reducing cost by eliminating human intervention, we present a new method for the automatic merging of resources. This method includes both the automatic mapping of the resources involved to a common format and their merging once in this format. This paper presents how we have addressed the merging of two verb subcategorization frame lexica for Spanish, but our method will be extended to cover other types of Lexical Resources. The results achieved, which almost replicate human work, demonstrate the feasibility of the approach.
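A toy Python illustration of the two-step idea (map each lexicon to a common format, then merge entries by verb lemma, taking the union of subcategorization frames); the field names and data layout are invented for illustration and do not reflect the actual Spanish lexica.

```python
from collections import defaultdict

def to_common_format(entries, lemma_key, frames_key):
    """Map one lexicon's entries to a common representation:
    {lemma: set of subcategorization frames}."""
    common = defaultdict(set)
    for entry in entries:
        common[entry[lemma_key]].update(entry[frames_key])
    return common

def merge_lexica(*lexica):
    """Merge any number of lexica already in the common format,
    taking the union of frames for each verb lemma."""
    merged = defaultdict(set)
    for lexicon in lexica:
        for lemma, frames in lexicon.items():
            merged[lemma] |= frames
    return merged

# Illustrative use with two tiny verb lexica in different source formats.
lex_a = to_common_format(
    [{"lemma": "dar", "frames": ["NP NP", "NP PP_a"]}], "lemma", "frames")
lex_b = to_common_format(
    [{"verb": "dar", "scf": ["NP PP_a", "NP"]}], "verb", "scf")
print(dict(merge_lexica(lex_a, lex_b)))   # {'dar': {'NP', 'NP NP', 'NP PP_a'}} (set order may vary)
```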
Abstract:
BACKGROUND AND PURPOSE: The EORTC 22043-30041 trial investigates the role of adding androgen suppression to post-operative radiotherapy in patients who have undergone radical prostatectomy. As part of the quality assurance of radiotherapy (QART), a Dummy Run (DR) procedure was performed. MATERIALS AND METHODS: The protocol included detailed, published delineation guidelines. Participating institutions digitally submitted radiotherapy treatment volumes and a treatment plan for a standard clinical case. Submissions were centrally reviewed using the VODCA software platform. RESULTS: Thirty-eight submissions from thirty-one institutions were reviewed. Six were accepted without comments. Twenty-three were accepted with comments on one or more items: target volume delineation (22), OAR delineation (23), planning and dosimetry (3) or treatment verification (1). Nine submissions were rejected and required resubmission, seven for target volume delineation reasons alone. An intervention to highlight the importance of the delineation guidelines was made prior to the entry of the first patient into the trial; after this, a lower percentage of resubmissions was required. CONCLUSIONS: The EORTC 22043-30041 Dummy Run highlights the need for timely and effective QART in clinical trials. The variation in target volume and OAR definition demonstrates that clinical guidelines and radiotherapy protocols are not a substitute for QART procedures. Early intervention in response to the Dummy Run improved protocol understanding.
Abstract:
Identification of post-translational modifications of proteins in biological samples often requires access to preanalytical purification and concentration methods. In the purification step, high or low molecular weight substances can be removed by size-exclusion filters, and highly abundant proteins can be removed, or low-abundance proteins enriched, by specific capturing tools. This paper describes the experience and results obtained with a recently emerged, easy-to-use affinity purification kit for enrichment of the low amounts of EPO found in urine and plasma specimens. The kit can be used as a pre-step in the EPO doping control procedure, as an alternative to the commonly used ultrafiltration, for detecting aberrantly glycosylated isoforms. The commercially available affinity purification kit contains small disposable anti-EPO monolith columns (6 µL volume, Ø7 mm, length 0.15 mm) together with all required buffers. A 24-channel vacuum manifold was used for simultaneous processing of samples. The column concentrated EPO from 20 mL of urine down to 55 µL of eluate, a concentration factor of 240 times, while roughly 99.7% of non-relevant urine proteins were removed. The recoveries of Neorecormon (epoetin beta) and of the EPO analogues Aranesp and Mircera applied to buffer were high: 76%, 67% and 57%, respectively. The recovery of endogenous EPO from human urine was 65%. High recoveries were also obtained when purifying human, mouse and equine EPO from serum, and human EPO from cerebrospinal fluid. Evaluation with the accredited EPO doping control method based on isoelectric focusing (IEF) showed that the affinity purification procedure did not change the isoform distribution of rhEPO, Aranesp, Mircera or endogenous EPO. The kit should be particularly useful for applications in which it is essential to avoid carry-over effects, a problem commonly encountered with conventional particle-based affinity columns. The encouraging results with EPO suggest that similar affinity monoliths, with the appropriate antibodies, should constitute useful tools for general applications in sample preparation: not only for doping control of EPO and other hormones such as growth hormone and insulin, but also for the study of post-translational modifications of other low-abundance proteins in biological and clinical research, and for sample preparation prior to in vitro diagnostics.
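As a quick consistency check of the figures quoted above, assuming the reported factor of about 240 is the volume-reduction ratio corrected for the roughly 65% recovery of endogenous EPO from urine (an interpretation, not stated explicitly in the abstract):

```latex
\frac{V_{\mathrm{urine}}}{V_{\mathrm{eluate}}} \times R
  = \frac{20\,000\ \mu\mathrm{L}}{55\ \mu\mathrm{L}} \times 0.65
  \approx 364 \times 0.65 \approx 236 \approx 240
```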
Abstract:
Diffuse flow velocimetry (DFV) is introduced as a new, noninvasive, optical technique for measuring the velocity of diffuse hydrothermal flow. The technique uses images of a motionless, random medium (e.g., rocks) obtained through the lens of a moving refraction-index anomaly (e.g., a hot upwelling). The method works in two stages. First, the changes in apparent background deformation are calculated using particle image velocimetry (PIV); the deformation vectors are determined by cross-correlating pixel intensities across consecutive images. Second, the 2-D velocity field is calculated by cross-correlating the deformation vectors between consecutive PIV calculations. The accuracy of the method is tested with laboratory and numerical experiments of a laminar, axisymmetric plume in fluids with both constant and temperature-dependent viscosity. Results show that average RMS errors are ∼5%–7%, and that the method is most accurate in regions of pervasive apparent background deformation, which are commonly encountered in regions of diffuse hydrothermal flow. The method is applied to a 25 s video sequence of diffuse flow from a small fracture captured during the Bathyluck’09 cruise to the Lucky Strike hydrothermal field (September 2009). The velocities of the ∼10°C–15°C effluent reach ∼5.5 cm/s, in strong agreement with previous measurements of diffuse flow. DFV is found to be most accurate for approximately 2-D flows where background objects have a small spatial scale, such as sand or gravel.
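A compact numpy sketch of the two-stage procedure described above: window-wise FFT cross-correlation of pixel intensities between consecutive images yields apparent background deformation vectors (stage 1), and cross-correlating consecutive deformation fields yields a velocity estimate (stage 2). The window size, the FFT-based correlation and the reduction to a single bulk velocity per frame pair are simplifying assumptions, not the authors' implementation, which produces a full 2-D velocity field.

```python
import numpy as np

def window_displacement(a, b):
    """Displacement (dy, dx) that best aligns window `b` with window `a`,
    estimated from the peak of their FFT-based circular cross-correlation."""
    a = a - a.mean()
    b = b - b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    shift = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    dims = np.array(a.shape, dtype=float)
    # Wrap shifts larger than half the window to negative displacements.
    return np.where(shift > dims / 2, shift - dims, shift)

def displacement_field(img0, img1, win=32):
    """Stage 1 (PIV-like): field of apparent background deformation between
    two consecutive images, one vector per non-overlapping window."""
    ny, nx = img0.shape[0] // win, img0.shape[1] // win
    field = np.zeros((ny, nx, 2))
    for j in range(ny):
        for i in range(nx):
            sl = (slice(j * win, (j + 1) * win), slice(i * win, (i + 1) * win))
            field[j, i] = window_displacement(img0[sl], img1[sl])
    return field

def dfv_velocity(frames, win=32, dt=1.0, px_size=1.0):
    """Stage 2: cross-correlate consecutive deformation fields (here reduced to
    their magnitude maps) to estimate the motion of the refracting anomaly."""
    fields = [displacement_field(frames[k], frames[k + 1], win)
              for k in range(len(frames) - 1)]
    vels = []
    for f0, f1 in zip(fields, fields[1:]):
        d = window_displacement(np.linalg.norm(f0, axis=2),
                                np.linalg.norm(f1, axis=2))
        vels.append(d * win * px_size / dt)   # grid cells -> pixels -> physical units
    return np.array(vels)
```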
A filtering method to correct time-lapse 3D ERT data and improve imaging of natural aquifer dynamics
Abstract:
We have developed a processing methodology that allows crosshole ERT (electrical resistivity tomography) monitoring data to be used to derive temporal fluctuations of groundwater electrical resistivity and thereby characterize the dynamics of groundwater in a gravel aquifer as it is infiltrated by river water. Temporal variations of the raw ERT apparent-resistivity data were mainly sensitive to the resistivity (salinity), temperature and height of the groundwater, with the relative contributions of these effects depending on the time and the electrode configuration. To resolve the changes in groundwater resistivity, we first expressed fluctuations of the temperature-detrended apparent-resistivity data as linear superpositions of (i) time series of river-water resistivity variations convolved with suitable filter functions and (ii) linear and quadratic representations of river-water height variations multiplied by appropriate sensitivity factors; river-water height was determined to be a reliable proxy for groundwater height. Individual filter functions and sensitivity factors were obtained for each electrode configuration via deconvolution over a one-month calibration period, and the predicted contributions related to changes in water height were then removed prior to inversion of the temperature-detrended apparent-resistivity data. Application of the filter functions and sensitivity factors accurately predicted the apparent-resistivity variations (correlation coefficient 0.98). Furthermore, the filtered ERT monitoring data and the resulting time-lapse resistivity models correlated closely with independently measured groundwater electrical resistivity monitoring data and only weakly with the groundwater-height fluctuations. The inversion results based on the filtered ERT data also showed significantly fewer inversion artefacts than the raw-data inversions. We observed resistivity increases of up to 10%, and the arrival-time peaks in the time-lapse resistivity models matched those in the groundwater resistivity monitoring data.
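The superposition described above can be written, for each electrode configuration i, roughly as follows; the symbols are introduced here for illustration and are not necessarily the authors' notation.

```latex
% Temperature-detrended apparent-resistivity fluctuation for electrode
% configuration i: river-water resistivity variation r(t) convolved with a
% configuration-specific filter f_i, plus linear and quadratic terms in
% river-water height h(t), a proxy for groundwater height:
\Delta\rho_{a,i}(t) \;\approx\; (f_i * r)(t) \;+\; \alpha_i\, h(t) \;+\; \beta_i\, h(t)^{2}

% f_i, alpha_i and beta_i are estimated by deconvolution over the one-month
% calibration period; the height-related terms are then removed before inversion:
\Delta\rho_{a,i}^{\mathrm{filt}}(t) \;=\; \Delta\rho_{a,i}(t) \;-\; \alpha_i\, h(t) \;-\; \beta_i\, h(t)^{2}
```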
Abstract:
Optimal behavior relies on flexible adaptation to environmental requirements, notably based on the detection of errors. The impact of error detection on subsequent behavior typically manifests as a slowing of RTs following errors. Precisely how errors impact the processing of subsequent stimuli and in turn shape behavior remains unresolved. To address these questions, we used an auditory spatial go/no-go task in which continual feedback informed participants of whether they were too slow. We contrasted auditory evoked potentials to left-lateralized go and right no-go stimuli as a function of performance on the preceding go stimuli, generating a 2 × 2 design with "preceding performance" (fast hit [FH], slow hit [SH]) and stimulus type (go, no-go) as within-subject factors. SH trials were more often followed by further SH trials than were FHs, supporting our assumption that SHs engaged effects similar to errors. Electrophysiologically, auditory evoked potentials modulated topographically as a function of preceding performance at 80-110 msec post-stimulus onset, and then as a function of stimulus type at 110-140 msec, indicative of changes in the underlying brain networks. Source estimations revealed stronger activity of prefrontal regions to stimuli after successful than after error trials, followed by a stronger response of parietal areas to no-go than to go stimuli. We interpret these results in terms of a shift from a fast automatic to a slow controlled form of inhibitory control induced by the detection of errors, manifesting during low-level integration of task-relevant features of subsequent stimuli, which in turn influences response speed.
Abstract:
OBJECTIVE: Depth of emotional processing has been shown to be related to outcome across approaches to psychotherapy. Moreover, a specific emotional sequence has been postulated and tested in several studies on experiential psychotherapy (Pascual-Leone & Greenberg, 2007). This process-outcome study aims to reproduce the sequential model of emotional processing in psychodynamic psychotherapy for adjustment disorder and to link these variables with ultimate therapeutic outcome. METHOD: In this study, 32 patients underwent short-term dynamic psychotherapy. On the basis of reliable clinical change statistics, one subgroup (n = 16) presented with good outcome and another subgroup (n = 16) with poor outcome at the end of treatment. The strongest-alliance session of each case was rated using the observer-rated system Classification of Affective Meaning States. Reliability coefficients for the measure were excellent (κ = .82). RESULTS: Using 1 min as the fine-grained unit of analysis, results showed that the experience of fundamentally adaptive grief was more common in the in-session process of patients with good outcomes than in that of patients with poor outcomes (χ2 = 6.56, p = .01, d = 1.23). This variable alone predicted 19% of the change in depressive symptoms as measured by the Beck Depression Inventory at the end of treatment. Moreover, sequences of the original model were supported and related to outcome. CONCLUSIONS: These results are discussed within the framework of the sequential model of emotional processing and its possible relevance for psychodynamic psychotherapy.
Abstract:
Achieving a high degree of dependability in complex macro-systems is challenging. Because of the large number of components and the numerous independent teams involved, an overview of global system performance is usually lacking to adequately support both design and operation. A functional failure mode, effects and criticality analysis (FMECA) approach is proposed to address the dependability optimisation of large and complex systems. The basic inductive FMECA model has been enriched to include considerations such as operational procedures, alarm systems, environmental and human factors, as well as operation in degraded mode. Its implementation in a commercial software tool allows active linking between the functional layers of the system and facilitates data processing and retrieval, which makes it possible to contribute actively to system optimisation. The proposed methodology has been applied to optimise dependability in a railway signalling system. Signalling systems are a typical example of large complex systems made of multiple hierarchical layers. The proposed approach appears appropriate for assessing the global risk and availability level of the system as well as for identifying its vulnerabilities. This enriched FMECA approach makes it possible to overcome some of the limitations and pitfalls previously reported with classical FMECA approaches.
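The abstract gives no implementation details for the software tool. Purely as a hypothetical illustration of how a functional FMECA record linking functions, effects, alarms and degraded-mode procedures might be structured and ranked, here is a small Python sketch; the field names and the severity x occurrence x detectability ranking are common conventions, assumed rather than taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class FailureMode:
    """One functional failure mode with its effects and criticality ranking."""
    function: str                 # function / functional layer affected
    mode: str                     # how the function can fail
    effects: list                 # effects propagated to higher layers
    severity: int                 # 1 (negligible) .. 10 (catastrophic)
    occurrence: int               # 1 (rare) .. 10 (frequent)
    detectability: int            # 1 (always detected) .. 10 (undetectable)
    mitigations: list = field(default_factory=list)  # procedures, alarms, degraded modes

    @property
    def rpn(self):
        """Risk priority number, a common (not universal) criticality ranking."""
        return self.severity * self.occurrence * self.detectability

def most_critical(modes, top=5):
    """Rank failure modes to highlight system vulnerabilities."""
    return sorted(modes, key=lambda m: m.rpn, reverse=True)[:top]

# Invented illustrative entry for a signalling sub-function.
signal_loss = FailureMode(
    function="Trackside signal aspect display",
    mode="Lamp failure, no aspect shown",
    effects=["Train driver receives no movement authority"],
    severity=8, occurrence=3, detectability=2,
    mitigations=["Lamp-proving circuit alarm", "Degraded-mode operating procedure"],
)
print(most_critical([signal_loss])[0].rpn)   # 48
```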