992 results for Epstein-Glaser causal method
Abstract:
See the abstract at the beginning of the document in the attached file
Abstract:
In parasites, host specificity may result either from restricted dispersal capacity or from fixed coevolutionary host-parasite adaptations. Knowledge of the proximate mechanisms underlying host specificity is fundamental to understanding host-parasite interactions and the potential coevolution of parasites and hosts. The relative importance of these two mechanisms was quantified through infection and cross-infection experiments using mites and bats as a model. Monospecific pools of parasitic mites (Spinturnix myoti and S. andegavinus) were exposed either to individual bats belonging to their traditional, native bat host species, or to a substitute host species within the same bat genus (Myotis). The two parasite species reacted differently to these treatments. S. myoti exhibited a clear preference for, and had a higher fitness on, its native host, Myotis myotis. In contrast, S. andegavinus showed no host choice, although its fitness was higher on its native host M. daubentoni. The causal mechanisms mediating host specificity can apparently differ within closely related host-parasite systems.
Abstract:
RATIONALE: The aim of the work was to develop and validate a method for the quantification of vitamin D metabolites in serum using ultra-high-pressure liquid chromatography coupled to mass spectrometry (LC/MS), and to validate a high-resolution mass spectrometry (LC/HRMS) approach against a tandem mass spectrometry (LC/MS/MS) approach using a large clinical sample set. METHODS: A fast, accurate and reliable method for the quantification of the vitamin D metabolites 25-hydroxyvitamin D2 (25OH-D2) and 25-hydroxyvitamin D3 (25OH-D3) in human serum was developed and validated. The C3 epimer of 25OH-D3 (3-epi-25OH-D3) was also separated from 25OH-D3. The samples were rapidly prepared via a protein precipitation step followed by solid-phase extraction (SPE) using an HLB μelution plate. Quantification was performed using both LC/MS/MS and LC/HRMS systems. RESULTS: Recovery, matrix effect, and inter- and intra-day reproducibility were assessed. Lower limits of quantification (LLOQs) were determined for both 25OH-D2 and 25OH-D3 for the LC/MS/MS approach (6.2 and 3.4 µg/L, respectively) and the LC/HRMS approach (2.1 and 1.7 µg/L, respectively). A Passing & Bablok fit was determined between the two approaches for 25OH-D3 on 662 clinical samples (y = 1.11 + 1.06x). It was also shown that results can be affected by the inclusion of the isomer 3-epi-25OH-D3. CONCLUSIONS: A method for quantification of the relevant vitamin D metabolites was successfully developed and validated here. It was shown that LC/HRMS is an accurate, powerful and easy-to-use approach for quantification within clinical laboratories. Finally, the results here suggest that it is important to separate 3-epi-25OH-D3 from 25OH-D3. Copyright © 2012 John Wiley & Sons, Ltd.
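The method-comparison fit mentioned above can be sketched with a Theil-Sen estimator: the median of all pairwise slopes. The actual Passing & Bablok procedure additionally shifts the slope ranks by an offset and excludes slopes of -1, so this is a simplified stand-in, and the paired concentration values below are invented, not the study's 662 samples.

```python
# Simplified method-comparison fit in the spirit of Passing & Bablok:
# median of all pairwise slopes (a Theil-Sen estimator). The full
# Passing-Bablok procedure also offsets the slope ranks and excludes
# slopes of -1; those refinements are omitted here.
from itertools import combinations
from statistics import median

def theil_sen(x, y):
    # Median slope over all point pairs, then a median intercept.
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)
              if x[j] != x[i]]
    slope = median(slopes)
    intercept = median(yi - slope * xi for xi, yi in zip(x, y))
    return intercept, slope

# Hypothetical paired results: LC/MS/MS (x) vs LC/HRMS (y), in ug/L.
x = [10.0, 15.0, 20.0, 25.0, 30.0]
y = [11.6, 17.0, 22.3, 27.6, 33.0]
print(theil_sen(x, y))  # intercept and slope of y = a + b*x
```

A fit like this is robust to outliers, which is why rank-based regressions are preferred over ordinary least squares for comparing two measurement methods.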
Abstract:
The objective of traffic engineering is to optimize network resource utilization. Although several works have been published on minimizing network resource utilization, few have focused on LSR (label switched router) label space. This paper proposes an algorithm that takes advantage of the MPLS label stack features in order to reduce the number of labels used in LSPs. Some tunnelling methods and the drawbacks of their MPLS implementations are also discussed. The described algorithm sets up NHLFE (next hop label forwarding entry) tables in each LSR, creating asymmetric tunnels when possible. Experimental results show that the described algorithm achieves a substantial reduction in the label space. The presented work applies to both types of connections: P2MP (point-to-multipoint) and P2P (point-to-point).
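The label-space saving that stacking provides can be illustrated with a toy counting model. This is an illustration of the idea only, not the NHLFE-table algorithm described in the paper, and the topology figures are made up.

```python
# Toy counting model of the label savings from MPLS label stacking
# (hypothetical model, not the paper's NHLFE-construction algorithm).

def labels_without_tunnel(num_lsps, shared_hops):
    # Without stacking, each LSP consumes its own label on every hop
    # of the shared path segment.
    return num_lsps * shared_hops

def labels_with_tunnel(num_lsps, shared_hops):
    # With a tunnel, the shared segment uses one outer label per hop;
    # per-LSP inner labels are examined only at the tunnel endpoints.
    return shared_hops + num_lsps

print(labels_without_tunnel(50, 4), labels_with_tunnel(50, 4))
```

The gap between the two counts grows with the number of LSPs sharing the segment, which is the intuition behind reducing label space via stacked tunnels.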
Abstract:
As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zero is usually understood as "a trace too small to measure", it seems reasonable to replace it by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts (and thus the metric properties) should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved.
As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
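The multiplicative replacement described above can be sketched in a few lines; this is a minimal version assuming a composition closed to a constant and a single replacement value δ for every rounded zero.

```python
# Minimal sketch of the multiplicative rounded-zero replacement of
# Martín-Fernández et al. (2003): zeros become delta, and the
# non-zero parts are rescaled so the composition still closes.

def multiplicative_replacement(x, delta, total=1.0):
    # Total imputed mass going to the rounded zeros.
    zero_mass = sum(delta for v in x if v == 0)
    # Shrink the non-zero parts multiplicatively by the same factor,
    # which preserves the ratios (and hence the covariance structure)
    # of the non-zero subcomposition.
    scale = 1.0 - zero_mass / total
    return [delta if v == 0 else v * scale for v in x]

comp = [0.6, 0.3, 0.1, 0.0]          # closed to 1, one rounded zero
repl = multiplicative_replacement(comp, delta=0.005)
print(repl, sum(repl))               # closes to 1 again
```

Because every non-zero part is multiplied by the same factor, ratios between non-zero parts are unchanged, which is exactly the coherence property the text emphasizes.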
Abstract:
The recommended treatment for latent tuberculosis (TB) infection in adults is a daily dose of isoniazid (INH) 300 mg for six months. In Brazil, INH was formulated as 100 mg tablets. The treatment duration and the high pill burden compromised patient adherence to the treatment. The Brazilian National Programme for Tuberculosis therefore requested a new 300 mg INH formulation. The aim of our study was to compare the bioavailability of the new INH 300 mg formulation with that of three 100 mg tablets of the reference formulation. We conducted a randomised, single-dose, open-label, two-phase crossover bioequivalence study in 28 healthy human volunteers. The 90% confidence intervals for the ratios of the maximum observed plasma concentration of INH and of the area under the plasma concentration vs. time curve from time zero to the last measurable concentration (time t) were 89.61-115.92 and 94.82-119.44, respectively. The main limitation of our study was that neither adherence nor the safety profile of multiple doses was evaluated. To determine the level of INH in human plasma, we developed and validated a sensitive, simple and rapid high-performance liquid chromatography-tandem mass spectrometry method. Our results showed that the new formulation was bioequivalent to the 100 mg reference product. This finding supports a single 300 mg tablet daily strategy to treat latent TB. This new formulation may increase patients' adherence to the treatment and their quality of life.
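The bioequivalence criterion above, a 90% confidence interval on the test/reference ratio, can be sketched as follows. This is a simplified stdlib-only illustration: the log-ratios are invented, and a normal quantile stands in for the t-quantile of the full crossover ANOVA.

```python
# Simplified sketch of a bioequivalence check: 90% CI on the
# geometric-mean ratio (test/reference) from per-subject log-ratios.
# A real analysis uses a crossover ANOVA with t-quantiles.
import math
from statistics import NormalDist

def gmr_90ci(log_ratios):
    # Mean and standard error of the log(test/reference) differences.
    n = len(log_ratios)
    m = sum(log_ratios) / n
    var = sum((d - m) ** 2 for d in log_ratios) / (n - 1)
    se = math.sqrt(var / n)
    z = NormalDist().inv_cdf(0.95)  # two-sided 90% interval
    # Back-transform to the percent scale.
    return 100 * math.exp(m - z * se), 100 * math.exp(m + z * se)

# Hypothetical data: log-ratios near 0 mean the formulations match.
ci = gmr_90ci([0.05, -0.02, 0.10, 0.01, -0.04, 0.06])
print(ci)  # bioequivalence requires the CI inside the 80-125% window
```

Working on the log scale and exponentiating back is what turns the interval into one on the geometric-mean ratio, matching how intervals such as 89.61-115.92 are reported.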
Abstract:
We describe a simple method for detection of Plasmodium vivax and Plasmodium falciparum infection in anophelines using a triplex TaqMan real-time polymerase chain reaction (PCR) assay (18S rRNA). We tested the assay on Anopheles darlingi and Anopheles stephensi colony mosquitoes fed with Plasmodium-infected blood meals and in duplicate on field-collected An. darlingi. We compared the real-time PCR results of colony-infected and field-collected An. darlingi, separately, to a conventional PCR method. We determined that a cytochrome b PCR method was only 3.33% as sensitive and 93.38% as specific as our real-time PCR assay with field-collected samples. We demonstrate that this assay is sensitive, specific and reproducible.
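The sensitivity and specificity comparison above is simple arithmetic on a 2x2 table. The counts below are chosen to reproduce the reported 3.33% and 93.38% figures and are not the study's actual tallies.

```python
# Arithmetic behind a sensitivity/specificity comparison, scoring the
# conventional PCR against the real-time PCR taken as the reference.
# The 2x2 counts are illustrative, not the study's field data.

def sensitivity(tp, fn):
    # Fraction of reference-positive samples the test also calls positive.
    return tp / (tp + fn)

def specificity(tn, fp):
    # Fraction of reference-negative samples the test also calls negative.
    return tn / (tn + fp)

tp, fn = 1, 29     # 30 reference-positives, 1 detected  -> 3.33%
tn, fp = 141, 10   # 151 reference-negatives, 141 clean  -> 93.38%
print(round(100 * sensitivity(tp, fn), 2),
      round(100 * specificity(tn, fp), 2))
```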
Abstract:
We examined drivers of article citations using 776 articles that were published from 1990-2012 in a broad-based and high-impact social sciences journal, The Leadership Quarterly. These articles had 1,191 unique authors having published and received in total (at the time of their most recent article published in our dataset) 16,817 articles and 284,777 citations, respectively. Our models explained 66.6% of the variance in citations and showed that quantitative, review, method, and theory articles were significantly more cited than were qualitative articles or agent-based simulations. As concerns quantitative articles, which constituted the majority of the sample, our model explained 80.3% of the variance in citations; some methods (e.g., use of SEM) and designs (e.g., meta-analysis), as well as theoretical approaches (e.g., use of transformational, charismatic, or visionary type-leadership theories) predicted higher article citations. Regarding the statistical conclusion validity of quantitative articles, articles having endogeneity threats received significantly fewer citations than did those using a more robust design or an estimation procedure that ensured correct causal estimation. We make several general recommendations on how to improve research practice and article citations.
Abstract:
22q11.2 deletion syndrome (22q11.2DS) is a common genetic condition associated with cognitive and learning impairments. In this study, we applied a three-dimensional method for quantifying gyrification at thousands of points over the cortical surface to imaging data from 44 children, adolescents, and young adults with 22q11.2DS (17 males, 27 females; mean age 17y 2mo [SD 9y 1mo], range 6-37y), and 53 healthy participants (21 males, 32 females; mean age 15y 4mo [SD 8y 6mo]; range 6-40y). Several clusters of reduced gyrification were observed, further substantiating the pattern of cerebral alterations presented by children with the syndrome. Comparisons within 22q11.2DS demonstrated an effect of congenital heart disease (CHD) on cortical gyrification, with reduced gyrification at the parieto-temporo-occipital junction in patients with CHD, as compared with patients without CHD. Reductions in gyrification can resemble mild polymicrogyria, suggesting early abnormal neuronal proliferation or migration and providing support for an effect of hemodynamic factors on brain development in 22q11.2DS. The results also shed light on the pathophysiology of acquired brain injury in other populations with CHD.
Abstract:
Gastric (GC) and breast (BrC) cancer are two of the most common and deadly tumours. Different lines of evidence suggest a possible causative role of viral infections for both GC and BrC. Whole genome sequencing (WGS) technologies allow searching for viral agents in tissues of patients with cancer. These technologies have already contributed to establishing virus-cancer associations as well as to discovering new tumour viruses. The objective of this study was to document possible associations of viral infection with GC and BrC in Mexican patients. To gain insight into cost-effective conditions for experimental sequencing, we first carried out an in silico simulation of WGS. The next-generation platform Illumina GAIIx was then used to sequence GC and BrC tumour samples. While we did not find viral sequences in tissues from BrC patients, multiple reads matching Epstein-Barr virus (EBV) sequences were found in GC tissues. An end-point polymerase chain reaction confirmed an enrichment of EBV sequences in one of the GC samples sequenced, validating the next-generation sequencing-bioinformatics pipeline.
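The read-screening step can be caricatured as a k-mer match of sequencing reads against a viral reference. This is purely illustrative: the sequences are invented, and real pipelines use aligners such as BWA or BLAST rather than exact k-mer lookup.

```python
# Toy sketch of screening sequencing reads against a viral reference
# by shared k-mers (invented sequences; not EBV data).

def kmers(seq, k):
    # Set of all length-k substrings of seq.
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def matching_reads(reads, reference, k=8):
    # Keep reads sharing at least one k-mer with the reference.
    ref = kmers(reference, k)
    return [r for r in reads if kmers(r, k) & ref]

reference = "ACGTACGTTAGCCGATCGATCGGATCCA"   # stand-in "viral genome"
reads = ["TAGCCGATCGAT",                     # overlaps the reference
         "TTTTTTTTTTTT"]                     # unrelated read
print(matching_reads(reads, reference))
```

Counting how many reads survive such a filter, per sample, is the basic signal behind "multiple reads matching EBV sequences".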
Abstract:
The work presented in this paper belongs to the power quality knowledge area and deals with voltage sags in power transmission and distribution systems. Propagating throughout the power network, voltage sags can cause numerous problems for domestic and industrial loads, at considerable financial cost. To impose penalties on the responsible party and to improve monitoring and mitigation strategies, sags must be located in the power network. With this objective, this paper presents a new method for associating a sag waveform with its origin in transmission and distribution networks. It solves this problem by developing hybrid methods that employ multiway principal component analysis (MPCA) as a dimension reduction tool. MPCA re-expresses sag waveforms in a new subspace using just a few scores. We train several well-known classifiers with these scores and use them for classification of future sags. The capabilities of the proposed method for dimension reduction and classification are examined using real data gathered from three substations in Catalonia, Spain. The obtained classification rates confirm the effectiveness of the developed hybrid methods as new tools for sag classification.
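The reduce-then-classify pipeline can be sketched as follows, with ordinary PCA (via SVD) standing in for MPCA and a nearest-centroid rule standing in for the trained classifiers; the "waveforms" are synthetic, not substation sag recordings.

```python
# Sketch of dimension reduction followed by classification: PCA via
# SVD (a stand-in for MPCA) plus a nearest-centroid rule (a stand-in
# for the paper's classifiers). Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
# Two classes of noisy 100-sample waveforms (e.g. two sag origins).
X = np.vstack([np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(100)
               for _ in range(20)] +
              [np.cos(2 * np.pi * t) + 0.1 * rng.standard_normal(100)
               for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)

# PCA: project centered waveforms onto the top-2 right singular
# vectors, so each waveform is re-expressed as just 2 scores.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Nearest-centroid classification in the 2-D score space.
centroids = np.array([scores[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(scores[:, None] - centroids, axis=2),
                 axis=1)
print(scores.shape, (pred == y).mean())   # 2 scores per sag; accuracy
```

Reducing each waveform to a handful of scores before training is what makes the classifiers cheap to train and robust, which is the core of the hybrid approach described above.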
Abstract:
Objectives. The goal of this study is to evaluate a T2-mapping sequence by: (i) measuring the intra- and inter-observer variability in healthy volunteers in two separate scanning sessions with a T2 reference phantom; (ii) measuring the mean T2 relaxation times by T2-mapping in infarcted myocardium in patients with subacute MI and comparing them with the gold standard, X-ray coronary angiography, and with healthy volunteer results. Background. Myocardial edema is a consequence of tissue inflammation, as seen in myocardial infarction (MI). It can be visualized by cardiovascular magnetic resonance (CMR) imaging using the T2 relaxation time. T2-mapping is a quantitative methodology that has the potential to address the limitations of conventional T2-weighted (T2W) imaging. Methods. The T2-mapping protocol used for all MRI scans consisted of a radial gradient echo acquisition with a lung-liver navigator for free-breathing acquisition and affine image registration. Mid-basal short axis slices were acquired. T2-map analyses: two observers semi-automatically segmented the left ventricle into 6 segments according to the AHA standards. Eight healthy volunteers (age: 27 ± 4 years; 62.5% male) were scanned in 2 separate sessions. Seventeen patients (age: 61.9 ± 13.9 years; 82.4% male) with subacute STEMI (70.6%) or NSTEMI underwent a T2-mapping scanning session. Results. In healthy volunteers, the mean inter- and intra-observer variability over the entire short axis slice (segments 1 to 6) was 0.1 ms (95% confidence interval (CI): -0.4 to 0.5, p = 0.62) and 0.2 ms (95% CI: -2.8 to 3.2, p = 0.94), respectively. T2 relaxation time measurements with and without the phantom correction yielded an average difference of 3.0 ± 1.1% and 3.1 ± 2.1% (p = 0.828), respectively. In patients, the inter-observer variability over the entire short axis slice (S1-S6) was 0.3 ms (95% CI: -1.8 to 2.4, p = 0.85).
The edema location as determined by T2-mapping and the occluded coronary artery as determined by X-ray coronary angiography agreed in 78.6% of cases, but in only 60% of apical infarcts. All but one of the maximal T2 values in infarct patients were greater than the upper limit of the 95% confidence interval for normal myocardium. Conclusions. The T2-mapping methodology is accurate in detecting infarcted, i.e. edematous, tissue in patients with subacute infarcts. This study further demonstrated that this T2-mapping technique is reproducible and robust enough to be used on a segmental basis for edema detection without the need for a phantom to yield a T2 correction factor. This new quantitative T2-mapping technique is promising and is likely to allow serial follow-up studies in patients to improve our knowledge of infarct pathophysiology and infarct healing, and to assess novel treatment strategies for acute infarctions.
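The observer-variability figures quoted above (a mean paired difference with a 95% CI) can be computed as follows; the T2 values are hypothetical, not the study's measurements, and a normal quantile approximates the usual t-quantile.

```python
# Sketch of an inter-observer variability metric: mean paired
# difference with a 95% CI (normal approximation; hypothetical data).
import math
from statistics import NormalDist, mean, stdev

def mean_diff_ci(obs1, obs2, level=0.95):
    # Paired differences between the two observers' readings.
    diffs = [a - b for a, b in zip(obs1, obs2)]
    m, s, n = mean(diffs), stdev(diffs), len(diffs)
    z = NormalDist().inv_cdf(0.5 + level / 2)
    half = z * s / math.sqrt(n)
    return m, (m - half, m + half)

obs1 = [52.1, 50.8, 53.0, 51.5, 52.7, 50.9]  # T2 (ms), observer 1
obs2 = [51.9, 51.0, 52.6, 51.7, 52.4, 51.2]  # T2 (ms), observer 2
m, (lo, hi) = mean_diff_ci(obs1, obs2)
print(m, lo, hi)  # a CI containing 0 suggests no systematic bias
```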
Abstract:
Synthesis report: Objective: The aim of this work is to study multidetector CT angiography (CTA) in the evaluation of peripheral arterial occlusive disease (PAOD) of the abdominal aorta and lower limbs, using an adaptive acquisition method to optimize arterial enhancement, in particular for the distal arterial bed and the arteries of the feet. Materials and methods: Thirty-four patients with PAOD underwent transcatheter angiography (TCA) and CTA within 15 days of each other. CTA was performed from the celiac trunk to the arteries of the feet in a single acquisition at high spatial resolution (16x0.625 mm). The table speed and rotation time for each examination were chosen according to the contrast transit time obtained after a test bolus. A total of 130 ml of contrast at 4 ml/s was used. The CTA images were analyzed by two observers, and the TCA data were interpreted independently by two other observers. The analysis included image quality and the detection of stenoses of 50% or more, per patient and per arterial segment. The sensitivity and specificity of CTA were calculated with TCA as the reference examination. Interobserver variability was measured using a kappa statistic. Results: TCA was non-conclusive in 0.7% of segments, whereas CTA was conclusive in all segments. In the per-patient analysis, the overall sensitivity and specificity for detecting a significant stenosis of 50% or more were both 100%. The per-segment analysis showed sensitivities and specificities ranging from 91 to 100% and from 81 to 100%, respectively. Analysis of the distal arteries of the feet revealed a sensitivity of 100% and a specificity of 90%.
Conclusion: Multidetector CT angiography using this adaptive acquisition method improves image quality and provides a reliable, non-invasive technique for evaluating PAOD, including the distal arteries of the feet.
Abstract:
In CoDaWork'05, we presented an application of discriminant function analysis (DFA) to 4 different compositional datasets and modelled the first canonical variable using a segmented regression model based solely on an observation about the scatter plots. In this paper, multiple linear regressions are applied to different datasets to confirm the validity of our proposed model. In addition to dating the unknown tephras by calibration as discussed previously, another method is proposed for mapping the unknown tephras onto samples of the reference set, or onto missing samples in between consecutive reference samples. The application of these methodologies is demonstrated with both simulated and real datasets. This newly proposed methodology provides an alternative, more acceptable approach for geologists, as their focus is on mapping the unknown tephra to relevant eruptive events rather than estimating the age of the unknown tephra. Key words: Tephrochronology; Segmented regression
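The segmented regression model mentioned above can be sketched as a grid search over the breakpoint, with an ordinary least-squares line fitted on each side. The data below are synthetic with a known kink, not the tephra compositions from the paper.

```python
# Sketch of segmented (two-piece) regression: grid search over the
# breakpoint, least-squares line on each side (synthetic data).
import numpy as np

def fit_line(x, y):
    # Least-squares [slope, intercept] and the residual sum of squares.
    A = np.vstack([x, np.ones_like(x)]).T
    coef, res, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, float(res[0]) if res.size else 0.0

def segmented_fit(x, y, min_pts=3):
    # Try every breakpoint; keep the one with the smallest total RSS.
    best = None
    for i in range(min_pts, len(x) - min_pts):
        (c1, r1), (c2, r2) = fit_line(x[:i], y[:i]), fit_line(x[i:], y[i:])
        if best is None or r1 + r2 < best[0]:
            best = (r1 + r2, x[i], c1, c2)
    return best

x = np.linspace(0, 10, 50)
y = np.where(x < 5, 1.0 * x, 5.0 + 3.0 * (x - 5))   # kink at x = 5
_, bp, left, right = segmented_fit(x, y)
print(bp, left[0], right[0])  # breakpoint near 5, slopes near 1 and 3
```

With noisy data the same grid search applies unchanged; only the recovered breakpoint and slopes become approximate rather than exact.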