990 results for Quantitative sensory test


Relevance: 30.00%

Abstract:

Objective: To assess quantitative real-time polymerase chain reaction (q-PCR) on sputum smears for the diagnosis of pulmonary tuberculosis (PTB) in patients living with HIV/AIDS with a clinical suspicion of PTB. Method: This was a prospective study of diagnostic test accuracy, conducted on 140 sputum specimens from 140 patients living with HIV/AIDS with a clinical suspicion of PTB, attended at two referral hospitals for people living with HIV/AIDS in the city of Recife, Pernambuco, Brazil. Culture on Löwenstein-Jensen medium and 7H9 broth was used as the gold standard. Results: Of the 140 sputum samples, 47 (33.6%) were positive by the gold standard. q-PCR was positive in 42 (30%) of the 140 patients; only one (0.71%) did not correspond to the culture result. The sensitivity, specificity and accuracy of the q-PCR were 87.2%, 98.9% and 95%, respectively. In 39 (93%) of the 42 q-PCR-positive cases, the Ct (cycle threshold) was equal to or less than 37. Conclusion: q-PCR performed on sputum smears from patients living with HIV/AIDS demonstrated satisfactory sensitivity, specificity and accuracy, and may therefore be recommended as a method for diagnosing PTB.
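For illustration, the reported accuracy figures can be reproduced from the 2x2 table implied by the abstract. The cell counts below (TP = 41, FP = 1, FN = 6, TN = 92) are inferred from the reported totals, not stated explicitly; a minimal check in Python:

```python
# 2x2 counts inferred from the abstract: 47 culture-positive samples,
# 42 q-PCR-positive samples of which one was a false positive.
tp, fp, fn, tn = 41, 1, 6, 92

sensitivity = tp / (tp + fn)                 # 41/47   -> 87.2%
specificity = tn / (tn + fp)                 # 92/93   -> 98.9%
accuracy = (tp + tn) / (tp + fp + fn + tn)   # 133/140 -> 95.0%
print(f"Se={sensitivity:.1%}  Sp={specificity:.1%}  Acc={accuracy:.1%}")
```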

Relevance: 30.00%

Abstract:

The purpose of this work was to test a qualitative cytomegalovirus (CMV) PCR and a semi-quantitative PCR for determining CMV load in leukocytes of bone marrow transplant (BMT) and kidney transplant (RT) patients. Thirty-three BMT and 35 RT patients participated in the study. The DNA was subjected to a qualitative PCR using primers that amplify part of the CMV gB gene. The CMV load of positive samples was determined by a semi-quantitative PCR using quantified plasmids containing part of the CMV gB gene as controls. The sensitivity of the test was determined to be 867 plasmid copies/µg DNA. CMV loads between 2,118 and 72,443 copies/µg DNA were observed in 12.1% of BMT recipients, and between 1,246 and 58,613 copies/µg DNA in 22.9% of RT recipients. Further studies are necessary to confirm the usefulness of this semi-quantitative CMV PCR in transplant patients.
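As a rough illustration of how a semi-quantitative PCR load can be read off plasmid standards, the sketch below interpolates a sample's signal on a log-linear standard curve. It is not the authors' protocol; the copy numbers and band intensities are placeholder values:

```python
import numpy as np

# Quantified plasmid standards (copies/ug DNA) and their normalized band
# intensities; both columns are illustrative placeholders.
std_copies = np.array([8.67e2, 8.67e3, 8.67e4, 8.67e5])
std_signal = np.array([0.12, 0.34, 0.61, 0.95])

def cmv_load(sample_signal):
    """Interpolate log10(copies) from the sample's signal on the standard curve."""
    return 10 ** np.interp(sample_signal, std_signal, np.log10(std_copies))

print(f"{cmv_load(0.50):.0f} copies/ug DNA")
```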

Relevance: 30.00%

Abstract:

Molecular monitoring of BCR/ABL transcripts by real-time quantitative reverse transcription PCR (qRT-PCR) is an essential technique for the clinical management of patients with BCR/ABL-positive CML and ALL. Though quantitative BCR/ABL assays are performed in hundreds of laboratories worldwide, results among these laboratories cannot be reliably compared due to heterogeneity in test methods, data analysis and reporting, and the lack of quantitative standards. Recent efforts towards standardization have been limited in scope. Aliquots of RNA were sent to clinical test centers worldwide in order to evaluate methods and reporting for e1a2, b2a2, and b3a2 transcript levels using their own qRT-PCR assays. Total RNA was isolated from tissue culture cells that expressed each of the different BCR/ABL transcripts. Serial log dilutions were prepared, ranging from 10^0 to 10^-5, in RNA isolated from HL60 cells. Laboratories performed 5 independent qRT-PCR reactions for each sample type at each dilution. In addition, 15 qRT-PCR reactions of the 10^-3 b3a2 RNA dilution were run to assess reproducibility within and between laboratories. Participants were asked to run the samples following their standard protocols and to report cycle threshold (Ct) values, quantitative values for BCR/ABL and housekeeping genes, and ratios of BCR/ABL to housekeeping genes for each sample RNA. Thirty-seven (n=37) participants submitted qRT-PCR results for analysis (36, 37, and 34 labs generated data for b2a2, b3a2, and e1a2, respectively). The limit of detection for this study was defined as the lowest dilution at which a Ct value could be detected for all 5 replicates. For b2a2, 15, 16, 4, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. For b3a2, 20, 13, and 4 labs showed a limit of detection at the 10^-5, 10^-4, and 10^-3 dilutions, respectively. For e1a2, 10, 21, 2, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. Log %BCR/ABL ratio values provided a method for comparing results between the different laboratories for each BCR/ABL dilution series. Linear regression analysis revealed concordance among the majority of participant data over the 10^-1 to 10^-4 dilutions. The overall slope values showed comparable results among the majority of b2a2 (mean = 0.939; median = 0.9627; range 0.399-1.1872), b3a2 (mean = 0.925; median = 0.922; range 0.625-1.140), and e1a2 (mean = 0.897; median = 0.909; range 0.5174-1.138) laboratory results (Figs. 1-3). Thirty-four (n=34) of the 37 laboratories reported Ct values for all 15 replicates, and only those with a complete data set were included in the inter-lab calculations. Eleven laboratories either did not report their copy-number data or used other reporting units such as nanograms or cell numbers; therefore, only 26 laboratories were included in the overall analysis of copy numbers. The median copy number was 348.4, with a range from 15.6 to 547,000 copies (approximately a 4.5-log difference); the median intra-lab %CV was 19.2%, with a range from 4.2% to 82.6%. While our international performance evaluation using serially diluted RNA samples has reinforced the fact that heterogeneity exists among clinical laboratories, it has also demonstrated that performance within a laboratory is overall very consistent. Accordingly, the availability of defined BCR/ABL RNAs may facilitate the validation of all phases of quantitative BCR/ABL analysis and may be extremely useful as a tool for monitoring assay performance. Ongoing analyses of these materials, along with the development of additional control materials, may solidify consensus around their application in routine laboratory testing and possible integration into worldwide efforts to standardize quantitative BCR/ABL testing.
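The two per-laboratory summary statistics used above, the slope of the log %BCR/ABL ratio across the dilution series (ideally 1) and the intra-lab %CV of replicate copy numbers, can be sketched as follows; the numbers are illustrative, not participant data:

```python
import numpy as np

# Slope of log10(%BCR/ABL ratio) vs. log10(dilution) over 10^-1..10^-4
# (a slope of 1 indicates perfect concordance with the dilution series).
log_dilution = np.array([-1.0, -2.0, -3.0, -4.0])
log_pct_ratio = np.array([1.02, 0.05, -0.98, -2.01])   # illustrative
slope, intercept = np.polyfit(log_dilution, log_pct_ratio, 1)

# Intra-lab reproducibility: %CV across replicate copy-number estimates.
replicates = np.array([350.0, 310.0, 402.0, 365.0, 330.0])  # illustrative
cv_pct = 100 * replicates.std(ddof=1) / replicates.mean()

print(f"slope={slope:.3f}  intra-lab CV={cv_pct:.1f}%")
```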

Relevance: 30.00%

Abstract:

Unraveling the effects of selection vs. drift on the evolution of quantitative traits is commonly achieved by one of two methods: either one contrasts population differentiation estimates for genetic markers and quantitative traits (the Qst-Fst contrast), or multivariate methods are used to study the covariance between sets of traits. In particular, many studies have focused on the genetic variance-covariance matrix (the G matrix). However, both drift and selection can cause changes in G. To understand their joint effects, we recently combined the two methods into a single test (accompanying article by Martin et al.), which we apply here to a network of 16 natural populations of the freshwater snail Galba truncatula. Using this new neutrality test, extended to hierarchical population structures, we studied the multivariate equivalent of the Qst-Fst contrast for several life-history traits of G. truncatula. We found strong evidence of selection acting on multivariate phenotypes. Selection was homogeneous among populations within each habitat and heterogeneous between habitats. We found that the G matrices were relatively stable within each habitat, with proportionality between the among-populations (D) and the within-populations (G) covariance matrices. The effect of habitat heterogeneity is to break this proportionality because of selection for habitat-dependent optima. Individual-based simulations mimicking our empirical system confirmed that these patterns are expected under the inferred selective regime. We show that homogenizing selection can mimic some effects of drift on the G matrix (G and D almost proportional), but that incorporating information from molecular markers (multivariate Qst-Fst) allows the two effects to be disentangled.

Relevance: 30.00%

Abstract:

Blood sampling on filter paper is a current practice in seroepidemiological studies using the indirect fluorescent antibody test (IFAT). There is, however, scant comparative information about the use of bloodspot eluates for the detection of malarial IgG antibodies simultaneously by IFAT and enzyme-linked immunosorbent assay (ELISA). Here we report data obtained by both serological methods on 219 bloodspot eluate samples collected in a rural community in the Brazilian Amazon Basin (Alto Paraíso, Ariquemes municipality) where malaria is endemic. Plasmodium falciparum and P. vivax thick-smear antigens were used in the IFAT; a detergent-soluble P. falciparum antigen was prepared for the ELISA. Substantial agreement of results (Kappa coefficient k = 0.686) was observed when the P. falciparum antigen was used in both tests, and IFAT titers were found to be strongly correlated with ELISA antibody units (Spearman correlation coefficient rs = 0.818, p < 0.0001). Only moderate agreement (k = 0.467) was observed between IFAT with the P. vivax antigen and ELISA with the P. falciparum antigen; the Spearman correlation coefficient between quantitative results (IFAT titers and ELISA antibody units) was correspondingly lower (rs = 0.540, p < 0.0001). Our results suggest that, with the P. falciparum antigen, IFAT and ELISA performed on bloodspot eluates are equivalent for seroepidemiological purposes.
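The two agreement statistics reported above can be computed as in the sketch below, with placeholder arrays standing in for the 219 paired results:

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

# Qualitative agreement between the two tests (positive = 1, negative = 0).
ifat_pos = np.array([1, 1, 0, 1, 0, 0, 1, 1, 0, 1])    # illustrative
elisa_pos = np.array([1, 1, 0, 1, 0, 1, 1, 0, 0, 1])   # illustrative
kappa = cohen_kappa_score(ifat_pos, elisa_pos)

# Correlation between quantitative results (IFAT titers vs. ELISA units).
ifat_titer = np.array([40, 160, 20, 640, 20, 80, 320, 160, 20, 1280])
elisa_units = np.array([1.2, 3.5, 0.4, 9.8, 0.5, 2.1, 6.3, 3.0, 0.3, 14.7])
rs, p = spearmanr(ifat_titer, elisa_units)

print(f"kappa={kappa:.3f}  rs={rs:.3f}  p={p:.4f}")
```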

Relevance: 30.00%

Abstract:

Quantitative ultrasound (QUS) appears to be developing into an acceptable, low-cost and readily accessible alternative to dual-energy X-ray absorptiometry (DXA) measurement of bone mineral density (BMD) in the detection and management of osteoporosis. Perhaps the major difficulty with its widespread use is that many different QUS devices exist that differ substantially from each other in the parameters they measure and in the strength of the empirical evidence supporting their use. Another problem is that virtually no data exist outside of Caucasian or Asian populations. In general, heel QUS appears to be the most tested and most effective. Some, but not all, heel QUS devices are effective in assessing fracture risk in some, but not all, populations; the evidence is strongest for Caucasian females over 55 years old, though some evidence exists for Asian females over 55 and for Caucasian and Asian males over 70. Certain devices may allow estimation of the likelihood of osteoporosis, but very limited evidence supports the use of QUS in initiating or monitoring osteoporosis treatment. QUS is likely most effective when combined with an assessment of clinical risk factors (CRF), with DXA reserved for individuals who are not identified as either high or low risk using QUS and CRF. However, monitoring and maintenance of test and instrument accuracy, precision and reproducibility are essential if QUS devices are to be used in clinical practice, and further research in non-Caucasian, non-Asian populations is clearly required to validate this tool for more widespread use.

Relevance: 30.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive, low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges.

In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proved to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.

Relevance: 30.00%

Abstract:

Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst-Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = [2Fst/(1 - Fst)]G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 - Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst-Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions.
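A minimal numerical illustration of the neutral expectation D = [2Fst/(1 - Fst)]G and of the proportionality check; G and Fst below are placeholders, not the article's estimates:

```python
import numpy as np

fst = 0.2
G = np.array([[1.0, 0.3],
              [0.3, 0.8]])                    # within-populations covariance
D_neutral = (2 * fst / (1 - fst)) * G         # neutral among-populations covariance

# Under neutrality, every eigenvalue of inv(G) @ D equals the
# proportionality coefficient 2*Fst/(1 - Fst) = 0.5 here.
print(np.linalg.eigvals(np.linalg.inv(G) @ D_neutral))   # [0.5, 0.5]
```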

Relevance: 30.00%

Abstract:

There are as yet no validated criteria for the diagnosis of sensory neuronopathy (SNN). In a preliminary single-center study, a set of criteria relying on clinical and electrophysiological data showed good sensitivity and specificity for a diagnosis of probable SNN. The aim of this study was to test these criteria in a French multicenter study. 210 patients with sensory neuropathies from 15 francophone reference centers for neuromuscular diseases were included, with an expert diagnosis of non-SNN, SNN or suspected SNN according to the investigations performed in these centers. The diagnosis was reached independently of the set of criteria to be tested, and the expert diagnosis was taken as the reference against which the proposed SNN criteria were evaluated. The criteria relied on clinical and electrophysiological data easily obtainable with routine investigations. 9/61 (16.4%) of non-SNN patients, 23/36 (63.9%) of suspected SNN patients, and 102/113 (90.3%) of SNN patients according to the expert diagnosis were classified as SNN by the criteria. Tested against the expert diagnosis in the SNN and non-SNN groups, the SNN criteria had 90.3% (102/113) sensitivity, 85.2% (52/61) specificity, 91.9% (102/111) positive predictive value, and 82.5% (52/63) negative predictive value. Discordance between the expert diagnosis and the SNN criteria occurred in 20 cases. After analysis of these cases, 11 could be reallocated to a correct diagnosis in accordance with the SNN criteria. The proposed criteria may be useful for the diagnosis of probable SNN in patients with sensory neuropathy, and they can be applied with simple clinical and paraclinical investigations.
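The reported performance figures follow directly from the stated fractions; a quick arithmetic check:

```python
# Counts taken from the fractions reported above (SNN and non-SNN groups).
tp, fn = 102, 11   # of 113 expert-diagnosed SNN patients
fp, tn = 9, 52     # of 61 expert-diagnosed non-SNN patients

sensitivity = tp / (tp + fn)   # 102/113 -> 90.3%
specificity = tn / (tn + fp)   # 52/61   -> 85.2%
ppv = tp / (tp + fp)           # 102/111 -> 91.9%
npv = tn / (tn + fn)           # 52/63   -> 82.5%
print(f"Se={sensitivity:.1%}  Sp={specificity:.1%}  PPV={ppv:.1%}  NPV={npv:.1%}")
```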

Relevance: 30.00%

Abstract:

Human immunodeficiency virus (HIV) infection continues to be a pandemic, and Spain is one of the European countries with the highest incidence of HIV. Within Catalonia, Spain, many projects have been implemented with the intention of improving HIV knowledge and lowering the incidence. HIV knowledge is also known to have a positive effect in lowering the stigma and discrimination experienced by people living with HIV. However, few studies have examined the distribution of HIV knowledge and its association with HIV status, age, sex, geographical zone of origin and level of education within the same study. Objectives: To identify whether HIV knowledge is associated with HIV status, age, sex, geographical zone of origin and level of education. Method: A quantitative, cross-sectional, centre-based study comprising people receiving an HIV test in Catalonia, Spain. Data will be collected from the 11 HIV non-governmental organisations in Catalonia. The Brief HIV Knowledge Scale will be used to assess HIV knowledge; information from the HIV test session will be used to assess HIV status, age, sex, geographical zone of origin and level of education. The association between HIV knowledge and the aforementioned variables will then be calculated.

Relevance: 30.00%

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions that obey a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume on some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data, sampled at one-minute frequency, from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signals they generate pass the test of being a Markov time; that is, if we can tell whether the signal has occurred or not by examining the information up to the current time, or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple; however, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor yet lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration of the strategies' parameters to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done with high-frequency sampling to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of the quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
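As a flavor of the market-neutral side discussed above, the sketch below fits a discretized Ornstein-Uhlenbeck process to a simulated price spread and derives a 2-sigma entry signal. It is a simplified illustration in Python, not the thesis code (which was written in MATLAB), and all parameters are placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a placeholder OU spread: s[t+1] = s[t] + kappa*(mu - s[t]) + noise.
n, kappa_true, mu_true, sigma_true = 500, 0.1, 0.0, 0.2
spread = np.zeros(n)
for t in range(n - 1):
    spread[t + 1] = spread[t] + kappa_true * (mu_true - spread[t]) \
                    + sigma_true * rng.standard_normal()

# Fit the discretized OU dynamics by linear regression of s[t+1] on s[t].
b, a = np.polyfit(spread[:-1], spread[1:], 1)
kappa = 1.0 - b                                  # mean-reversion speed
mu = a / kappa                                   # long-run mean
sigma = np.std(spread[1:] - (b * spread[:-1] + a))

# Enter when the spread is > 2 stationary standard deviations from the mean.
sd_stat = sigma / np.sqrt(2 * kappa)
signal = np.where(spread > mu + 2 * sd_stat, -1,      # short the spread
         np.where(spread < mu - 2 * sd_stat, +1, 0))  # long the spread
print(kappa, mu, sd_stat, int(np.abs(signal).sum()), "entry bars")
```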

Relevance: 30.00%

Abstract:

Human herpesvirus 6 (HHV-6) may cause severe complications after haematopoietic stem cell transplantation (HSCT). Monitoring this virus and providing a precise, rapid and early diagnosis of related clinical disease are essential measures for improving outcomes. A prospective survey of the incidence and clinical features of HHV-6 infections after HSCT has not yet been conducted in Brazilian patients, and the impact of this infection on HSCT outcome remains unclear. A rapid test based on real-time quantitative polymerase chain reaction (qPCR) was optimised to screen clinical samples for HHV-6 and quantify the virus. The detection step was based on reaction with TaqMan® hydrolysis probes. A set of previously described primers and probes was tested to evaluate efficiency, sensitivity and reproducibility. The amplification efficiency for the target was 91.4%, with linearity ranging from 10 to 10^6 copies/reaction and a limit of detection of five copies/reaction, or 250 copies/mL of plasma. The qPCR assay developed in the present study was simple, rapid and sensitive, allowing the detection of a wide range of HHV-6 loads. In conclusion, this test may be useful as a practical tool to help elucidate the clinical relevance of HHV-6 infection and reactivation in different scenarios and to determine the need for surveillance.
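Amplification efficiency of this kind is conventionally derived from the slope of a standard curve of Ct versus log10 input copies, via E = 10^(-1/slope) - 1. The sketch below uses made-up Ct values chosen to land near the reported ~91% efficiency:

```python
import numpy as np

log_copies = np.arange(1, 7)                        # 10^1 .. 10^6 copies/reaction
ct = np.array([33.1, 29.6, 26.0, 22.5, 18.9, 15.4]) # illustrative Ct values

slope, intercept = np.polyfit(log_copies, ct, 1)    # ideal slope = -3.32 (E = 100%)
efficiency = 10 ** (-1 / slope) - 1
print(f"slope={slope:.2f}  efficiency={efficiency:.1%}")
```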

Relevance: 30.00%

Abstract:

BACKGROUND The study of the attentional system remains a challenge for current neuroscience. The Attention Network Test (ANT) was designed to study three different attentional networks (alerting, orienting, and executive) simultaneously, based on subtractions of different experimental conditions. However, some studies recommend caution with these calculations due to the interactions between the attentional networks. In particular, it is highly relevant that several interpretations about attentional impairment in diverse pathologies have arisen from these calculations. Event-related potentials (ERPs) and neural source analysis can be applied to disentangle relationships between these attentional networks that are not specifically shown by behavioral measures. RESULTS This study shows that there is a basic level of alerting (tonic alerting) in the no-cue (NC) condition, represented by a slow negative trend in the ERP trace prior to the onset of the target stimuli. A progressive increase in CNV amplitude related to the amount of information provided by the cue conditions is also shown. Neural source analysis reveals specific modulations of the CNV related to task-related expectancy in the NC condition; a late modulation triggered by the central-cue (CC) condition, probably representing generic motor preparation; and early and late modulations in the spatial-cue (SC) condition, suggesting specific motor and sensory preactivation. Finally, the first component in the information processing of the target stimuli modulated by the interaction between the orienting network and the executive system appears to be the N1. CONCLUSIONS The ANT is useful as a paradigm to study specific attentional mechanisms and their interactions. However, the calculation of network effects is based on subtractions of non-comparable experimental conditions, as evidenced by the present data, which can lead to misinterpretations in the study of attentional capacity in human subjects.
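For reference, the subtraction-based network scores the ANT conventionally computes from mean reaction times are sketched below (the RT values are illustrative); the abstract's caution is precisely that these subtracted conditions are not fully comparable:

```python
# Mean reaction times (ms) per condition; illustrative values only.
rt = {
    "no_cue": 580, "double_cue": 540,        # alerting contrast
    "center_cue": 555, "spatial_cue": 510,   # orienting contrast
    "incongruent": 620, "congruent": 520,    # executive (conflict) contrast
}

alerting = rt["no_cue"] - rt["double_cue"]         # 40 ms
orienting = rt["center_cue"] - rt["spatial_cue"]   # 45 ms
executive = rt["incongruent"] - rt["congruent"]    # 100 ms
print(alerting, orienting, executive)
```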

Relevance: 30.00%

Abstract:

PURPOSE: To objectively compare quantitative parameters related to image quality attained at coronary magnetic resonance (MR) angiography of the right coronary artery (RCA) performed at 7 T and 3 T. MATERIALS AND METHODS: Institutional review board approval was obtained, and volunteers provided signed informed consent. Ten healthy adult volunteers (mean age ± standard deviation, 25 years ± 4; seven men, three women) underwent navigator-gated three-dimensional MR angiography of the RCA at 7 T and 3 T. For 7 T, a custom-built quadrature radiofrequency transmit-receive surface coil was used. At 3 T, a commercial body radiofrequency transmit coil and a cardiac coil array for signal reception were used. Segmented k-space gradient-echo imaging with spectrally selective adiabatic fat suppression was performed, and imaging parameters were similar at both field strengths. Contrast-to-noise ratio between blood and epicardial fat; signal-to-noise ratio of the blood pool; RCA vessel sharpness, diameter, and length; and navigator efficiency were quantified at both field strengths and compared by using a Mann-Whitney U test. RESULTS: The contrast-to-noise ratio between blood and epicardial fat was significantly improved at 7 T when compared with that at 3 T (87 ± 34 versus 52 ± 13; P = .01). Signal-to-noise ratio of the blood pool was increased at 7 T (109 ± 47 versus 67 ± 19; P = .02). Vessel sharpness obtained at 7 T was also higher (58% ± 9 versus 50% ± 5; P = .04). At the same time, RCA vessel diameter and length and navigator efficiency showed no significant field strength-dependent difference. CONCLUSION: In our quantitative and qualitative study comparing in vivo human imaging of the RCA at 7 T and 3 T in young healthy volunteers, parameters related to image quality attained at 7 T equal or surpass those from 3 T.
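A minimal sketch of the nonparametric comparison used above (per-volunteer CNR at the two field strengths; the arrays are illustrative, not the study's measurements):

```python
import numpy as np
from scipy.stats import mannwhitneyu

cnr_7t = np.array([55, 120, 88, 140, 61, 97, 72, 110, 83, 44])  # illustrative
cnr_3t = np.array([40, 61, 52, 70, 38, 55, 47, 66, 58, 33])     # illustrative

u, p = mannwhitneyu(cnr_7t, cnr_3t, alternative="two-sided")
print(f"U={u}  p={p:.3f}")
```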

Relevance: 30.00%

Abstract:

A critical feature of cooperative animal societies is the reproductive skew, a shorthand term for the degree to which a dominant individual monopolizes overall reproduction in the group. Our theoretical analysis of the evolutionarily stable skew in matrifilial (i.e., mother-daughter) societies, in which relatednesses to offspring are asymmetrical, predicts that reproductive skews in such societies should tend to be greater than those of semisocial societies (i.e., societies composed of individuals of the same generation, such as siblings), in which relatednesses to offspring are symmetrical. Quantitative data on reproductive skews in semisocial and matrifilial associations within the same species for 17 eusocial Hymenoptera support this prediction. Likewise, a survey of reproductive partitioning within 20 vertebrate societies demonstrates that complete reproductive monopoly is more likely to occur in matrifilial than in semisocial societies, also as predicted by the optimal skew model.