990 results for Quantitative sensory test
Abstract:
BACKGROUND: Early detection of colorectal cancer through timely follow-up of positive Fecal Occult Blood Tests (FOBTs) remains a challenge. In our previous work, we found 40% of positive FOBT results eligible for colonoscopy had no documented response by a treating clinician at two weeks despite procedures for electronic result notification. We determined if technical and/or workflow-related aspects of automated communication in the electronic health record could lead to the lack of response. METHODS: Using both qualitative and quantitative methods, we evaluated positive FOBT communication in the electronic health record of a large, urban facility between May 2008 and March 2009. We identified the source of test result communication breakdown, and developed an intervention to fix the problem. Explicit medical record reviews measured timely follow-up (defined as response within 30 days of positive FOBT) pre- and post-intervention. RESULTS: Data from 11 interviews and tracking information from 490 FOBT alerts revealed that the software intended to alert primary care practitioners (PCPs) of positive FOBT results was not configured correctly and over a third of positive FOBTs were not transmitted to PCPs. Upon correction of the technical problem, lack of timely follow-up decreased immediately from 29.9% to 5.4% (p<0.01) and was sustained at month 4 following the intervention. CONCLUSION: Electronic communication of positive FOBT results should be monitored to avoid limiting colorectal cancer screening benefits. Robust quality assurance and oversight systems are needed to achieve this. Our methods may be useful for others seeking to improve follow-up of FOBTs in their systems.
Abstract:
BACKGROUND: Methylphenidate (MPD) is a psychostimulant commonly prescribed for attention deficit/hyperactivity disorder. The mode of action of the brain circuitry responsible for initiating the animals' behavior in response to psychostimulants is not well understood. There is some evidence that psychostimulants activate the ventral tegmental area (VTA), nucleus accumbens (NAc), and prefrontal cortex (PFC). METHODS: The present study was designed to investigate the acute dose-response of MPD (0.6, 2.5, and 10.0 mg/kg) on locomotor behavior and sensory evoked potentials recorded from the VTA, NAc, and PFC in freely behaving rats previously implanted with permanent electrodes. For locomotor behavior, adult male Wistar-Kyoto (WKY; n = 39) rats were given saline on experimental day 1 and either saline or an acute injection of MPD (0.6, 2.5, or 10.0 mg/kg, i.p.) on experimental day 2. Locomotor activity was recorded for 2 h post-injection on both days using an automated, computerized activity monitoring system. Electrophysiological recordings were also performed in the adult male WKY rats (n = 10). Five to seven days after the rats had recovered from the implantation of electrodes, each rat was placed in a sound-insulated electrophysiological test chamber where its sensory evoked field potentials were recorded before and after injection of saline and 0.6, 2.5, and 10.0 mg/kg MPD. The time interval between injections was 90 min. RESULTS: Locomotion increased in a dose-dependent manner, while the amplitude of the components of the sensory evoked field responses of VTA, NAc, and PFC neurons decreased in a dose-dependent manner. For example, the P3 component of the sensory evoked field response of the VTA decreased by 19.8% +/- 7.4% from baseline after treatment with 0.6 mg/kg MPD, 37.8% +/- 5.9% after 2.5 mg/kg MPD, and 56.5% +/- 3.9% after 10 mg/kg MPD. Greater attenuation from baseline was observed in the NAc and PFC. 
Differences in the intensity of MPD-induced attenuation were also found among these brain areas. CONCLUSION: These results suggest that an acute treatment of MPD produces electrophysiologically detectable alterations at the neuronal level, as well as observable behavioral responses. The present study is the first to investigate the acute dose-response effects of MPD on locomotor behavior and on the sensory inputs of VTA, NAc, and PFC neurons in intact, non-anesthetized, freely behaving rats previously implanted with permanent electrodes.
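The dose-response attenuation figures above are simple percent changes from baseline. A minimal sketch; the amplitude values are hypothetical, chosen only so the result matches the reported 19.8% figure:

```python
# Sketch: percent attenuation of an evoked-potential component from baseline.
def percent_attenuation(baseline, post):
    """Percent decrease from the baseline amplitude."""
    return (baseline - post) / baseline * 100.0

# Hypothetical amplitudes (arbitrary units), not study data.
baseline_amp = 100.0
post_amp = 80.2
print(round(percent_attenuation(baseline_amp, post_amp), 1))  # 19.8
```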
Quantitative analysis of benign paroxysmal positional vertigo fatigue under canalithiasis conditions
Abstract:
In our daily life, small flows in the semicircular canals (SCCs) of the inner ear displace a sensory structure called the cupula which mediates the transduction of head angular velocities to afferent signals. We consider a dysfunction of the SCCs known as canalithiasis. Under this condition, small debris particles disturb the flow in the SCCs and can cause benign paroxysmal positional vertigo (BPPV), arguably the most common form of vertigo in humans. The diagnosis of BPPV is mainly based on the analysis of typical eye movements (positional nystagmus) following provocative head maneuvers that are known to lead to vertigo in BPPV patients. These eye movements are triggered by the vestibulo-ocular reflex, and their velocity provides an indirect measurement of the cupula displacement. An attenuation of the vertigo and the nystagmus is often observed when the provocative maneuver is repeated. This attenuation is known as BPPV fatigue. It has not been quantitatively described so far, and the mechanisms causing it remain unknown. We quantify fatigue by eye velocity measurements and propose a fluid dynamic interpretation of our results based on a computational model for the fluid–particle dynamics of a SCC with canalithiasis. Our model suggests that the particles may not go back to their initial position after a first head maneuver such that a second head maneuver leads to different particle trajectories causing smaller cupula displacements.
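Fluid–particle models of canalithiasis are built on low-Reynolds-number particle dynamics. As a hedged, hypothetical illustration (the parameter values below are rough assumptions, not those of the study's computational model), the Stokes settling speed of a single canalith in endolymph is:

```python
# Sketch: Stokes settling velocity of a small particle in a viscous fluid,
# v = 2 r^2 (rho_p - rho_f) g / (9 mu). All values are rough assumptions.
g = 9.81          # gravitational acceleration, m/s^2
radius = 10e-6    # 10-micron particle radius, hypothetical
rho_p = 2700.0    # calcite-like particle density, kg/m^3
rho_f = 1000.0    # endolymph approximated as water, kg/m^3
mu = 1e-3         # endolymph viscosity approximated as water, Pa*s

v = 2.0 * radius**2 * (rho_p - rho_f) * g / (9.0 * mu)
print(round(v * 1e6, 1))  # settling speed in micrometers per second
```

A speed of this order sets the time scale on which particles redistribute between repeated maneuvers, which is the mechanism the model invokes for fatigue.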
Abstract:
Linkage disequilibrium methods can be used to find genes influencing quantitative trait variation in humans. Linkage disequilibrium methods can require smaller sample sizes than linkage equilibrium methods, such as the variance component approach, to find loci with a specific effect size. The increase in power is at the expense of requiring more markers to be typed to scan the entire genome. This thesis compares different linkage disequilibrium methods to determine which factors influence the power to detect disequilibrium. The costs of disequilibrium and equilibrium tests were compared to determine whether the savings in phenotyping costs when using disequilibrium methods outweigh the additional genotyping costs. Nine linkage disequilibrium tests were examined by simulation. Five tests involved selecting isolated unrelated individuals, while four involved the selection of parent-child trios (TDT). All nine tests were found to be able to identify disequilibrium with the correct significance level in Hardy-Weinberg populations. Increasing linked genetic variance and trait allele frequency were found to increase the power to detect disequilibrium, while increasing the number of generations and the distance between marker and trait loci decreased the power to detect disequilibrium. Discordant sampling was used for several of the tests. It was found that the more stringent the sampling, the greater the power to detect disequilibrium in a sample of given size. The power to detect disequilibrium was not affected by the presence of polygenic effects. When the trait locus had more than two trait alleles, the power of the tests reached a maximum of less than one. 
For the simulation methods used here, when there were more than two trait alleles there was a probability equal to 1 - heterozygosity of the marker locus that both trait alleles were in disequilibrium with the same marker allele, resulting in the marker being uninformative for disequilibrium. The five tests using isolated unrelated individuals were found to have excess error rates when there was disequilibrium due to population admixture. Increased error rates also resulted from increased unlinked major gene effects, discordant trait allele frequency, and increased disequilibrium. Polygenic effects did not affect the error rates. The Transmission Disequilibrium Test (TDT)-based tests were not subject to any increase in error rates. For all sample ascertainment costs, for recent mutations (<100 generations) linkage disequilibrium tests were less expensive to carry out than the variance component test. Candidate gene scans saved even more money. The use of recently admixed populations also decreased the cost of performing a linkage disequilibrium test.
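The "1 - heterozygosity" probability mentioned above can be sketched directly; the marker allele frequencies below are hypothetical:

```python
# Sketch: expected marker heterozygosity under Hardy-Weinberg,
# H = 1 - sum(p_i^2), and the complementary probability 1 - H that the
# abstract cites for both trait alleles sharing a marker allele.
def heterozygosity(freqs):
    """Expected heterozygosity: 1 minus the sum of squared allele frequencies."""
    return 1.0 - sum(p * p for p in freqs)

marker_freqs = [0.5, 0.5]   # hypothetical biallelic marker
H = heterozygosity(marker_freqs)
print(H)        # 0.5
print(1.0 - H)  # 0.5 -> chance the marker is uninformative in this scenario
```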
Abstract:
AIMS: We conducted a meta-analysis to evaluate the accuracy of quantitative stress myocardial contrast echocardiography (MCE) in coronary artery disease (CAD). METHODS AND RESULTS: A database search was performed through January 2008. We included studies evaluating the accuracy of quantitative stress MCE for detection of CAD compared with coronary angiography or single-photon emission computed tomography (SPECT) and measuring reserve parameters of A, beta, and Abeta. Data from the studies were verified and supplemented by the authors of each study. Using random effects meta-analysis, we estimated weighted mean differences (WMDs), likelihood ratios (LRs), diagnostic odds ratios (DORs), and summary area under the curve (AUC), all with 95% confidence intervals (CIs). Of 1443 studies, 13 including 627 patients (age range, 38-75 years) and comparing MCE with angiography (n = 10), SPECT (n = 1), or both (n = 2) were eligible. WMDs (95% CI) were significantly lower in the CAD group than in the no-CAD group: 0.12 (0.06-0.18) (P < 0.001), 1.38 (1.28-1.52) (P < 0.001), and 1.47 (1.18-1.76) (P < 0.001) for A, beta, and Abeta reserves, respectively. Pooled LRs for a positive test were 1.33 (1.13-1.57), 3.76 (2.43-5.80), and 3.64 (2.87-4.78), and LRs for a negative test were 0.68 (0.55-0.83), 0.30 (0.24-0.38), and 0.27 (0.22-0.34) for A, beta, and Abeta reserves, respectively. Pooled DORs were 2.09 (1.42-3.07), 15.11 (7.90-28.91), and 14.73 (9.61-22.57), and AUCs were 0.637 (0.594-0.677), 0.851 (0.828-0.872), and 0.859 (0.842-0.750) for A, beta, and Abeta reserves, respectively. CONCLUSION: Evidence supports the use of quantitative MCE as a non-invasive test for detection of CAD. Standardizing MCE quantification analysis and adherence to reporting standards for diagnostic tests could enhance the quality of evidence in this field.
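The likelihood ratios and diagnostic odds ratios pooled above derive from sensitivity and specificity. A minimal sketch with hypothetical values, not the meta-analysis estimates:

```python
# Sketch: diagnostic summary measures from sensitivity and specificity.
# LR+ = sens / (1 - spec); LR- = (1 - sens) / spec; DOR = LR+ / LR-.
def diagnostic_measures(sens, spec):
    lr_pos = sens / (1.0 - spec)   # positive likelihood ratio
    lr_neg = (1.0 - sens) / spec   # negative likelihood ratio
    dor = lr_pos / lr_neg          # diagnostic odds ratio
    return lr_pos, lr_neg, dor

# Hypothetical operating point of a stress test.
lr_pos, lr_neg, dor = diagnostic_measures(0.80, 0.75)
print(round(lr_pos, 2), round(lr_neg, 2), round(dor, 1))  # 3.2 0.27 12.0
```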
Abstract:
The previously described Nc5-specific PCR test for the diagnosis of Neospora caninum infections was used to develop a quantitative PCR assay which allows the determination of infection intensities within different experimental and diagnostic sample groups. The quantitative PCR was performed by using a dual fluorescent hybridization probe system and the LightCycler Instrument for online detection of amplified DNA. This assay was successfully applied for demonstrating the parasite proliferation kinetics in organotypic slice cultures of rat brain which were infected in vitro with N. caninum tachyzoites. This PCR-based method of parasite quantitation with organotypic brain tissue samples can be regarded as a novel ex vivo approach for exploring different aspects of cerebral N. caninum infection.
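Real-time PCR quantification of this kind typically inverts a standard curve relating threshold cycle (Ct) to starting copy number. A hedged sketch with hypothetical curve parameters; the study's LightCycler assay would fit its own standards:

```python
# Sketch: absolute quantification from a real-time PCR standard curve,
# Ct = slope * log10(copies) + intercept, inverted to recover copy number.
def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate the starting copy number."""
    return 10 ** ((ct - intercept) / slope)

# A typical curve: slope near -3.32 corresponds to ~100% PCR efficiency.
# Both parameters here are hypothetical.
slope, intercept = -3.32, 38.0
print(round(copies_from_ct(38.0, slope, intercept)))   # 1 copy at Ct = 38
print(round(copies_from_ct(34.68, slope, intercept)))  # 10 copies
```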
Abstract:
PURPOSE Fundus autofluorescence (FAF) can be characterized not only by its intensity or emission spectrum, but also by its lifetime. As the lifetime of a fluorescent molecule is sensitive to its local microenvironment, this technique may provide more information than fundus autofluorescence intensity imaging. We report here the characteristics and repeatability of FAF lifetime measurements of the human macula using a new fluorescence lifetime imaging ophthalmoscope (FLIO). METHODS A total of 31 healthy phakic subjects with an age range from 22 to 61 years were included in this study. For image acquisition, a fluorescence lifetime ophthalmoscope based on a Heidelberg Engineering Spectralis system was used. Fluorescence lifetime maps of the retina were recorded in a short- (498-560 nm) and a long- (560-720 nm) spectral channel. For quantification of fluorescence lifetimes, a standard ETDRS grid was used. RESULTS Mean fluorescence lifetimes were shortest in the fovea: 208 picoseconds for the short-spectral channel and 239 picoseconds for the long-spectral channel. Fluorescence lifetimes increased from the central area to the outer ring of the ETDRS grid. The test-retest reliability of FLIO was very high for all ETDRS areas (Spearman's ρ = 0.80 for the short- and 0.97 for the long-spectral channel, P < 0.0001). Fluorescence lifetimes increased with age. CONCLUSIONS The FLIO allows reproducible measurements of fluorescence lifetimes of the macula in healthy subjects. Using custom-built software, we were able to quantify fluorescence lifetimes within the ETDRS grid. Establishing a clinically accessible standard against which to measure FAF lifetimes within the retina is a prerequisite for future studies in retinal disease.
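As a hedged illustration of lifetime estimation: FLIO analysis in practice fits multi-exponential models to photon-arrival histograms, but a synthetic mono-exponential decay shows the basic idea. The bin width and counts below are made up; 208 ps is the foveal mean quoted above:

```python
# Sketch: recover a fluorescence lifetime from a noiseless mono-exponential
# decay by a log-linear least-squares fit (slope of ln(counts) vs time = -1/tau).
import math

tau_true = 208e-12                       # 208 ps, the abstract's foveal mean
times = [i * 50e-12 for i in range(10)]  # hypothetical 50-ps time bins
counts = [1000.0 * math.exp(-t / tau_true) for t in times]

n = len(times)
mean_t = sum(times) / n
mean_y = sum(math.log(c) for c in counts) / n
slope = (sum((t - mean_t) * (math.log(c) - mean_y)
             for t, c in zip(times, counts))
         / sum((t - mean_t) ** 2 for t in times))
tau_fit = -1.0 / slope
print(round(tau_fit * 1e12))  # 208 (picoseconds)
```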
Abstract:
BACKGROUND: Impaired manual dexterity is frequent and disabling in patients with multiple sclerosis (MS). Therefore, convenient, quick and validated tests for manual dexterity in MS patients are needed. OBJECTIVE: The aim of this study was to validate the Coin Rotation task (CRT) to examine manual dexterity in patients with MS. DESIGN: Cross-sectional study. METHODS: 101 outpatients with MS were assessed with the CRT, the Expanded Disability Status Scale (EDSS), the Scale for the assessment and rating of ataxia (SARA), the Modified Ashworth Scale (MAS), and their muscle strength and sensory deficits of the hands were noted. Concurrent validity and diagnostic accuracy of the CRT were determined by comparison with the Nine Hole Peg Test (9HPT). Construct validity was determined by comparison with a valid dexterity questionnaire. Multiple regression analysis was done to explore correlations of the CRT with the EDSS, SARA, MAS, muscle strength and sensory deficits. RESULTS: The CRT correlated significantly with the 9HPT (r=.73, p<.0001) indicating good concurrent validity. The cut-off values for the CRT relative to the 9HPT were 18.75 seconds for the dominant (sensitivity: 81.5%; specificity 80.0%) and 19.25 seconds for the non-dominant hand (sensitivity: 90.3%; specificity: 81.8%) demonstrating good diagnostic accuracy. Furthermore, the CRT correlated significantly with the dexterity questionnaire (r=-.49, p<.0001) indicating moderate construct validity. Multiple regression analyses revealed that the EDSS was the strongest predictor for impaired dexterity. LIMITATIONS: Mostly relapsing-remitting MS patients with an EDSS up to 7 were examined. CONCLUSIONS: This study validates the CRT as a test that can be used easily and quickly to evaluate manual dexterity in patients with MS.
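Cutoff-based sensitivity and specificity, as reported for the CRT, can be sketched as follows; the completion times and impairment labels below are hypothetical, not study data:

```python
# Sketch: sensitivity and specificity of a timed cutoff (e.g. the CRT's
# 18.75 s for the dominant hand), with impairment defined by a reference
# test such as the 9HPT.
def sens_spec(times, impaired, cutoff):
    tp = sum(t > cutoff and y for t, y in zip(times, impaired))
    fn = sum(t <= cutoff and y for t, y in zip(times, impaired))
    tn = sum(t <= cutoff and not y for t, y in zip(times, impaired))
    fp = sum(t > cutoff and not y for t, y in zip(times, impaired))
    return tp / (tp + fn), tn / (tn + fp)

times = [12.0, 25.0, 17.0, 30.0, 16.0, 22.0]       # hypothetical CRT seconds
impaired = [False, True, False, True, False, True]  # per reference test
sens, spec = sens_spec(times, impaired, 18.75)
print(sens, spec)  # 1.0 1.0
```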
Abstract:
Neuronal outgrowth has been proposed in many systems as a mechanism underlying memory storage. For example, sensory neuron outgrowth is widely accepted as an underlying mechanism of long-term sensitization of defensive withdrawal reflexes in Aplysia. The hypothesis is that learning leads to outgrowth and consequently to the formation of new synapses, which in turn strengthen the neural circuit underlying the behavior. However, key experiments to test this hypothesis have never been performed. Four days of sensitization training leads to outgrowth of siphon sensory neurons mediating the siphon-gill withdrawal response in Aplysia. We found that a similar training protocol produced robust outgrowth in tail sensory neurons mediating the tail siphon withdrawal reflex. In contrast, 1 day of training, which effectively induces long-term behavioral sensitization and synaptic facilitation, was not associated with neuronal outgrowth. Further examination of the effect of behavioral training protocols on sensory neuron outgrowth indicated that this structural modification is associated only with the most persistent forms of sensitization, and that the induction of these changes is dependent on the spacing of the training trials over multiple days. Therefore, we suggest that neuronal outgrowth is not a universal mechanism underlying long-term sensitization, but is involved only in the most persistent forms of the memory. Sensory neuron outgrowth presumably contributes to long-term sensitization through formation of new synapses with follower motor neurons, but this hypothesis has never been directly tested. The contribution of outgrowth to long-term sensitization was assessed using confocal microscopy to examine sites of contact between physiologically connected pairs of sensory and motor neurons. 
Following 4 days of training, the strength of both the behavior and the sensorimotor synapse, as well as the number of appositions with follower neurons, was enhanced only on the trained side of the animal. In contrast, outgrowth was induced on both sides of the animal, indicating that although sensory neuron outgrowth does appear to contribute to sensitization through the formation of new synapses, outgrowth alone is not sufficient to account for the effects of sensitization. This indicates that key regulatory steps lie downstream of outgrowth, possibly in the targeting of new processes and the activation of new synapses.
Abstract:
The intensification of consequential testing situations is associated with an increase in anxiety among American students (Casbarro, 2005). Test anxiety can have negative effects on student test performance (Everson, Millsap, & Rodriguez, 1991). If test anxiety has the potential to decrease students’ test scores, it becomes a factor that can threaten the validity of any inferences drawn between test scores and student progress (Cizek & Burg, 2006). There are several factors that relate closely to test anxiety (Cizek & Burg, 2006). Variables of key influence include gender, socioeconomic status, and teacher-manifested anxiety (Hembree, 1988). Another influence upon test anxiety is students’ participation in academic support programs to prepare them for exit examinations. The purpose of this study was to examine the relationship between 10th grade high school student gender, socioeconomic status, perceived teacher anxiety, and student preparedness with levels of Massachusetts Comprehensive Assessment System (MCAS) test anxiety. Few studies appear to have examined levels of high school test anxiety with regard to this specific high-stakes MCAS exit exam required for high school graduation. A two-phase sequential mixed-methods research design was used to survey 10th grade students (N=156), represented by a sampling of students with low socioeconomic status (n=80) and students with high socioeconomic status (n=76), regarding their levels of test anxiety in relation to upcoming MCAS testing. A multiple regression analysis was used to measure the relationship between the predictor variables (gender, socioeconomic status, perceived teacher anxiety, and student preparedness) and the criterion variable of student test anxiety using the Test Anxiety Inventory (TAI). 
Personal interviews with volunteer students (n=20) provided rich explanations of students’ academic self-efficacy, their perceptions of their performance on the upcoming MCAS exam, and their use of strategies to reduce their levels of test anxiety. Personal interviews with volunteer school administrators and teachers (n=12) provided descriptions of their perceptions of how test anxiety affected their students’ performance. A major quantitative finding of this study was that the variables of student socioeconomic status and student ratings of teacher anxiety accounted for 6% of the variance in students’ levels of surveyed test anxiety (R2 = .06, p = .033, a small to medium effect size). These results indicate that different student populations vary in their readiness skills to successfully participate in consequential testing situations. Consequently, highly test-anxious students would require emotional preparation as well as academic preparation when confronting high-stakes testing. The results have the potential to re-shape the format of schools’ MCAS test preparation efforts.
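The reported R² is the share of criterion variance explained by the predictors. A minimal single-predictor sketch with synthetic data; the variable names are illustrative only, not study data:

```python
# Sketch: R^2 from a simple least-squares regression of an anxiety score
# on one predictor. R^2 = 1 - SS_residual / SS_total.
def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1.0 - ss_res / ss_tot

teacher_anxiety = [1, 2, 3, 4, 5, 2, 3, 4]    # hypothetical ratings
tai_score = [40, 44, 50, 55, 60, 43, 52, 58]  # hypothetical TAI scores
r2 = r_squared(teacher_anxiety, tai_score)
print(round(r2, 2))
```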
Abstract:
Obesity has been on the rise in the United States over the last 30 years for all populations, including preschoolers. The purpose of the project was to develop an observation tool to measure physical activity levels in preschool children and use the tool in a pilot test of the CATCH UP curriculum at two Head Start Centers in Houston. Pretest and posttest interobserver agreements were all above 0.60 for physical activity level and physical activity type. Preschoolers spent the majority of their time in light physical activity (75.33% pretest, 87.77% posttest) and spent little time in moderate to vigorous physical activity (MVPA) (24.67% pretest, 12.23% posttest). Percent time spent in MVPA decreased significantly from pretest to posttest (F=5.738, p=0.043). While the pilot testing of the CATCH UP curriculum did not show an increase in MVPA, the SOFIT-P tool showed promising results as a new method for collecting physical activity level data for preschoolers. Once the new tool has undergone more reliability and validity testing, it could allow for a more convenient method of collecting physical activity levels for preschoolers.
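Momentary time-sampling tools like SOFIT-P yield percent time per activity level as a simple proportion of observed intervals. A sketch with hypothetical counts, chosen only so the output reproduces the pretest figures above:

```python
# Sketch: percent of observed intervals spent at each activity level.
def percent_time(counts):
    """Convert per-level interval counts to percentages of total observations."""
    total = sum(counts.values())
    return {level: 100.0 * c / total for level, c in counts.items()}

# Hypothetical tallies over 300 observation intervals.
intervals = {"sedentary": 0, "light": 226, "mvpa": 74}
pct = percent_time(intervals)
print(round(pct["light"], 2), round(pct["mvpa"], 2))  # 75.33 24.67
```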
Abstract:
Radiation therapy has been used as an effective treatment for malignancies in pediatric patients. However, in many cases, the side effects of radiation diminish these patients’ quality of life. In order to develop strategies to minimize radiogenic complications, one must first quantitatively estimate pediatric patients’ relative risk for radiogenic late effects, which did not become feasible until recently because of the computational complexity. The goals of this work were to calculate the dose delivered to tissues and organs in pediatric patients during contemporary photon and proton radiotherapies; to estimate the corresponding risk of radiogenic second cancer and cardiac toxicity based on the calculated doses and on dose-risk models from the literature; to test for the statistical significance of the difference between predicted risks after photon versus proton radiotherapies; and to provide a prototype of an evidence-based approach to selecting treatment modalities for pediatric patients, taking second cancer and cardiac toxicity into account. The results showed that proton therapy confers a lower predicted risk of radiogenic second cancer, and lower risks of radiogenic cardiac toxicities, compared to photon therapy. An uncertainty analysis revealed that the qualitative findings of this study are insensitive to changes in a wide variety of host and treatment related factors.
Abstract:
It has been hypothesized that results from the short-term bioassays will ultimately provide information that will be useful for human health hazard assessment. Although toxicologic test systems have become increasingly refined, to date, no investigator has been able to provide qualitative or quantitative methods which would support the use of short-term tests in this capacity. Historically, the validity of the short-term tests has been assessed using the framework of the epidemiologic/medical screens. In this context, the results of the carcinogen (long-term) bioassay are generally used as the standard. However, this approach is widely recognized as being biased and, because it employs qualitative data, cannot be used in the setting of priorities. In contrast, the goal of this research was to address the problem of evaluating the utility of the short-term tests for hazard assessment using an alternative method of investigation. Chemical carcinogens were selected from the list of carcinogens published by the International Agency for Research on Cancer (IARC). Tumorigenicity and mutagenicity data on fifty-two chemicals were obtained from the Registry of Toxic Effects of Chemical Substances (RTECS) and were analyzed using a relative potency approach. The relative potency framework allows for the standardization of data "relative" to a reference compound. To avoid any bias associated with the choice of the reference compound, fourteen different compounds were used. The data were evaluated in a format which allowed for a comparison of the ranking of the mutagenic relative potencies of the compounds (as estimated using short-term data) vs. the ranking of the tumorigenic relative potencies (as estimated from the chronic bioassays). The results were statistically significant (p < .05) for data standardized to thirteen of the fourteen reference compounds. 
Although this was a preliminary investigation, it offers evidence that the short-term test systems may be of utility in ranking the hazards represented by chemicals which may be human carcinogens.
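The ranking comparison described above amounts to a rank correlation between the two relative-potency orderings; a minimal sketch with hypothetical potencies (no tie handling):

```python
# Sketch: Spearman's rank correlation between mutagenic and tumorigenic
# relative-potency rankings, rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)).
def spearman_rho(x, y):
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

mutagenic = [0.2, 1.0, 3.5, 0.8, 2.1]    # hypothetical potencies vs. a reference
tumorigenic = [0.3, 1.2, 2.9, 0.6, 2.5]  # hypothetical, same ordering
rho = spearman_rho(mutagenic, tumorigenic)
print(rho)  # 1.0 (identical rankings)
```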
Abstract:
Radiomics is the high-throughput extraction and analysis of quantitative image features. For non-small cell lung cancer (NSCLC) patients, radiomics can be applied to standard of care computed tomography (CT) images to improve tumor diagnosis, staging, and response assessment. The first objective of this work was to show that CT image features extracted from pre-treatment NSCLC tumors could be used to predict tumor shrinkage in response to therapy. This is important since tumor shrinkage is an important cancer treatment endpoint that is correlated with probability of disease progression and overall survival. Accurate prediction of tumor shrinkage could also lead to individually customized treatment plans. To accomplish this objective, 64 stage NSCLC patients with similar treatments were all imaged using the same CT scanner and protocol. Quantitative image features were extracted and principal component regression with simulated annealing subset selection was used to predict shrinkage. Cross validation and permutation tests were used to validate the results. The optimal model gave a strong correlation between the observed and predicted shrinkages. The second objective of this work was to identify sets of NSCLC CT image features that are reproducible, non-redundant, and informative across multiple machines. Feature sets with these qualities are needed for NSCLC radiomics models to be robust to machine variation and spurious correlation. To accomplish this objective, test-retest CT image pairs were obtained from 56 NSCLC patients imaged on three CT machines from two institutions. For each machine, quantitative image features with concordance correlation coefficient values greater than 0.90 were considered reproducible. Multi-machine reproducible feature sets were created by taking the intersection of individual machine reproducible feature sets. Redundant features were removed through hierarchical clustering. 
The findings showed that image feature reproducibility and redundancy depended on both the CT machine and the CT image type (average cine 4D-CT imaging vs. end-exhale cine 4D-CT imaging vs. helical inspiratory breath-hold 3D CT). For each image type, a set of cross-machine reproducible, non-redundant, and informative image features was identified. Compared to end-exhale 4D-CT and breath-hold 3D-CT, average 4D-CT derived image features showed superior multi-machine reproducibility and are the best candidates for clinical correlation.
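The reproducibility criterion above is Lin's concordance correlation coefficient, which penalizes both scatter and systematic shift between test and retest values. A sketch over a hypothetical test-retest feature pair:

```python
# Sketch: Lin's concordance correlation coefficient,
# CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2).
def concordance_cc(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical feature values from a test-retest scan pair.
test_scan = [1.0, 2.0, 3.0, 4.0, 5.0]
retest_scan = [1.1, 2.0, 2.9, 4.2, 5.0]
ccc = concordance_cc(test_scan, retest_scan)
print(ccc > 0.90)  # True -> this feature would count as reproducible
```

Unlike Pearson's r, CCC drops below 1 even for perfectly correlated values if one scan is systematically offset, which is why it suits test-retest reproducibility screening.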