14 results for ATTENTION SHIFT
in DigitalCommons@The Texas Medical Center
Abstract:
Magnetic resonance temperature imaging (MRTI) is recognized as a noninvasive means to provide temperature imaging for guidance in thermal therapies. The most common method of estimating temperature changes in the body using MR is by measuring the water proton resonance frequency (PRF) shift. Calculation of the complex phase difference (CPD) is the method of choice for measuring the PRF indirectly, since it facilitates temperature mapping with high spatiotemporal resolution. Chemical shift imaging (CSI) techniques can provide the PRF directly with high sensitivity to temperature changes while minimizing artifacts commonly seen in CPD techniques. However, CSI techniques are currently limited by poor spatiotemporal resolution. This research develops and validates a CSI-based MRTI technique with intentional spectral undersampling, which allows relaxed acquisition parameters to improve spatiotemporal resolution. An algorithm based on autoregressive moving average (ARMA) modeling is developed and validated to help overcome the limitations of Fourier-based analysis, allowing highly accurate and precise PRF estimates. Using the determined acquisition parameters and ARMA modeling, robust temperature maps are generated with the k-means algorithm and validated in laser treatments in ex vivo tissue. The use of the non-PRF-based measurements provided by the technique is also investigated to aid in the validation of thermal damage predicted by an Arrhenius rate dose model.
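A minimal sketch of the complex phase difference (CPD) temperature calculation follows, purely for illustration; the field strength, echo time, and PRF coefficient below are typical textbook values, not the parameters used in this work.

import numpy as np

# Illustrative constants (assumed, not from this study): 3 T scanner, TE = 10 ms,
# PRF thermal coefficient of about -0.01 ppm/degC.
GAMMA = 2 * np.pi * 42.58e6   # proton gyromagnetic ratio [rad/s/T]
ALPHA = -0.01e-6              # PRF coefficient expressed as a fraction per degC
B0 = 3.0                      # main magnetic field [T]
TE = 0.010                    # echo time [s]

def cpd_temperature_change(phase_img, baseline_phase_img):
    """Estimate a temperature-change map (degC) from two phase images (radians)."""
    # Wrap-safe phase difference between the heated and baseline images.
    delta_phi = np.angle(np.exp(1j * (phase_img - baseline_phase_img)))
    return delta_phi / (GAMMA * ALPHA * B0 * TE)

# Example: a -0.08 rad phase change at these settings is roughly a +1 degC rise.
print(cpd_temperature_change(np.array([0.0]), np.array([0.08])))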
Abstract:
Several studies have shown that children with spina bifida meningomyelocele (SBM) and hydrocephalus have attention problems on parent ratings and difficulties in stimulus orienting associated with a posterior brain attention system. Less is known about response control and inhibition associated with an anterior brain attention system. Using the Gordon Vigilance Task (Gordon, 1983), we studied error rate, reaction time, and performance over time for sustained attention, a key anterior attention function, in 101 children with SBM, 17 with aqueductal stenosis (AS; another condition involving congenital hydrocephalus), and 40 typically developing controls (NC). In the SBM group, we investigated the relation between cognitive attention and parent ratings of inattention and hyperactivity and explored the impact of medical variables. Children with SBM did not differ from the AS or NC groups on measures of sustained attention, but they committed more errors and responded more slowly. Approximately one-third of the SBM group had attention symptoms, although parent attention ratings were not associated with task performance. Hydrocephalus does not account for the attention profile of children with SBM, which also reflects the distinctive brain dysmorphologies associated with this condition.
Abstract:
OBJECTIVE: To examine the relationships between physical growth and medications prescribed for symptoms of attention-deficit hyperactivity disorder in children with HIV. METHODS: Analysis of data from children with perinatally acquired HIV (N = 2251; age 3-19 years), with and without prescriptions for stimulant and nonstimulant medications used to treat attention-deficit hyperactivity disorder, in a long-term observational study. Height and weight measurements were transformed to z scores and compared across medication groups. Changes in z scores during a 2-year interval were compared using multiple linear regression models adjusting for selected covariates. RESULTS: Participants with (n = 215) and without (n = 2036) prescriptions were shorter than expected based on US age and gender norms (p < .001). Children without prescriptions weighed less at baseline than children in the general population (p < .001) but gained height and weight at a faster rate (p < .001). Children prescribed stimulants were similar to population norms in baseline weight; their height and weight growth velocities were comparable with the general population and children without prescriptions (for weight, p = .511 and .100, respectively). Children prescribed nonstimulants had the lowest baseline height but were similar to population norms in baseline weight. Their height and weight growth velocities were comparable with the general population but significantly slower than children without prescriptions (p = .01 and .02, respectively). CONCLUSION: The use of stimulants to treat symptoms of attention-deficit hyperactivity disorder does not significantly exacerbate the potential for growth delay in children with HIV and may afford opportunities for interventions that promote physical growth. Prospective studies are needed to confirm these findings.
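As a rough illustration of the analysis strategy described above (growth measurements expressed as z scores, with changes compared across medication groups by multiple linear regression adjusting for covariates), the sketch below uses invented data and hypothetical variable names; it is not the study's code or dataset.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic example: 2-year change in height-for-age z score ("haz_change")
# modeled as a function of medication group, adjusting for age and sex.
rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "med_group": rng.choice(["none", "stimulant", "nonstimulant"], size=n),
    "age": rng.integers(3, 20, size=n),
    "male": rng.integers(0, 2, size=n),
})
df["haz_change"] = 0.2 - 0.1 * (df["med_group"] == "nonstimulant") + rng.normal(0, 0.5, n)

model = smf.ols("haz_change ~ C(med_group, Treatment('none')) + age + male", data=df).fit()
print(model.params)   # group coefficients = difference in z-score change vs. the "none" group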
Abstract:
Hippocampal place cells in the rat undergo experience-dependent changes when the rat runs stereotyped routes. One such change, the backward shift of the place field center of mass, has been linked by previous modeling efforts to spike-timing-dependent plasticity (STDP). However, these models did not account for the termination of the place field shift and they were based on an abstract implementation of STDP that ignores many of the features found in cortical plasticity. Here, instead of the abstract STDP model, we use a calcium-dependent plasticity (CaDP) learning rule that can account for many of the observed properties of cortical plasticity. We use the CaDP learning rule in combination with a model of metaplasticity to simulate place field dynamics. Without any major changes to the parameters of the original model, the present simulations account both for the initial rapid place field shift and for the subsequent slowing down of this shift. These results suggest that the CaDP model captures the essence of a general cortical mechanism of synaptic plasticity, which may underlie numerous forms of synaptic plasticity observed both in vivo and in vitro.
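The sketch below illustrates the general shape of a calcium-dependent plasticity rule of this kind; the thresholds, the Omega and eta functions, and the update dw/dt = eta(Ca) * (Omega(Ca) - w) are simplified placeholders, not the authors' actual model or parameters.

def omega(ca, theta_d=0.35, theta_p=0.55):
    """Target direction of plasticity as a function of calcium level (assumed thresholds)."""
    if ca < theta_d:
        return 0.5    # low calcium: no net drive
    elif ca < theta_p:
        return 0.25   # intermediate calcium: drive toward depression
    return 1.0        # high calcium: drive toward potentiation

def eta(ca, tau=0.1):
    """Calcium-dependent learning rate: faster weight updates at higher calcium."""
    return ca / (ca + tau)

def cadp_step(w, ca, dt=1.0):
    """One Euler step of dw/dt = eta(Ca) * (Omega(Ca) - w)."""
    return w + dt * eta(ca) * (omega(ca) - w)

w = 0.5
for ca in [0.2, 0.4, 0.7]:   # a rising calcium transient
    w = cadp_step(w, ca)
    print(f"Ca={ca:.1f} -> w={w:.3f}")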
Abstract:
Human trafficking is regarded by Interpol as the second largest and fastest-growing criminal industry in the world. This letter is submitted in response to the topic of Human Trafficking addressed in Volume 2, Issue 1. In response to the ever-increasing attention to this problem, various programs focus on the rescue of survivors in anti-trafficking efforts, sometimes overshadowing efforts to prevent human trafficking and rehabilitate those harmed. A comprehensive, responsible approach requires a system of rescue and rehabilitation with a deliberate eye toward prevention. The basic human rights of survivors are at risk of being violated by so-called "rescue missions," despite the good intentions of would-be rescuers. At the prevention level, a firm human rights approach is needed. When interventions shift their emphasis to prevention and tackle the underlying contributors to inequality, the roots of trafficking and slavery can be firmly extirpated. By taking a thoughtful and vested approach to tackling all areas of trafficking, including prevention, rescue, and rehabilitation, resources can be used more effectively, and communities are likely to have a more extensive impact in the fight against this hideous crime against humanity.
Abstract:
The causes and contexts of food insecurity among children in the U.S. are poorly understood because the prevalence of food insecurity at the child level is low compared to the prevalence of household food insecurity. In addition, caregivers may be reluctant to admit their children may not be getting enough food due to shame or fear that they might lose custody of their children. Based on our ongoing qualitative research with mothers of young children, we suggest that food insecurity among children is related to the adverse childhood experiences of their caregivers. These experiences translate into poor mental and physical health in adolescence and adulthood, which can lead to an inability to secure and maintain meaningful employment that pays a living wage. In this paper we propose that researchers shift the framework for understanding food insecurity in the United States to adopt a life course approach. This demands that we pay greater attention to the lifelong consequences of exposure to trauma or toxic stress during childhood, including exposure to violence, rape, abuse and neglect, and housing, food, and other forms of deprivation. We then present three case studies of women from our ongoing study to illustrate a variety of toxic stress exposures and how they affect a woman's earning potential, her mental health, and her attitudes toward raising children. Each woman describes her exposure to violence and deprivation as a child and adolescent, recounts her experiences with child hunger, and explains how these experiences have shaped her ability to nourish her children. We describe ways in which the nature of research investigations on food insecurity can be shifted, and provide recommendations for policy-oriented solutions regarding income support programs, early intervention programs, child and adult mental health services, and violence prevention programs.
Abstract:
This study of ambulance workers in the City of Houston's emergency medical services examined the factors related to shiftwork tolerance and intolerance. The EMS personnel work a 24-hour shift with rotating days of the week. Workers are assigned to the A, B, C, or D shift, each of which rotates 24 hours on, 24 hours off, 24 hours on, and 4 days off. One hundred seventy-six male EMTs, paramedics, and chauffeurs from stations of varying levels of activity were surveyed. The sample group ranged in age from 20 to 45. The average tenure on the job was 8.2 years. Over 68% of the workers held a second job, and the majority of these worked over 20 hours a week at the second position. The survey instrument was a 20-page questionnaire modeled after the Folkard Standardized Shiftwork Index. In addition to demographic data, the survey tool provided measurements of general job satisfaction, sleep quality, general health complaints, morningness/eveningness, cognitive and somatic anxiety, depression, and circadian types. The survey questionnaire also included an EMS-specific stress scale. A conceptual model of shiftwork tolerance was presented to identify the key factors examined in the study. An extensive list of 265 variables was reduced to 36 key variables relating to: (1) shift schedule and demographic/lifestyle factors, (2) individual differences in traits and characteristics, and (3) tolerance/intolerance effects. Using the general job satisfaction scale as the key measurement of shift tolerance/intolerance, a significant relationship was shown between this dependent variable and stress, number of years working a 24-hour shift, sleep quality, languidness/vigorousness, the usual amount of sleep received during the shift, general health complaints, and flexibility/rigidity (R² = .5073). The sample consisted of a majority of morningness types or extreme-morningness types, few evening types, and no extreme-evening types, replicating the findings of Motohashi's previous study of ambulance workers. The level of activity by station did not have a significant effect on any of the dependent variables examined. However, the shift worked was related to sleep quality, despite the fact that all shifts work the same hours and follow the same rotation schedule.
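For readers unfamiliar with the statistic reported above, the sketch below shows how a multiple-regression R² of this kind is computed (R² = 1 - SS_res/SS_tot); the data and predictors are synthetic and purely illustrative, not the study's variables.

import numpy as np

# Synthetic stand-ins for the seven predictors (stress, years on a 24-hour
# shift, sleep quality, languidness/vigorousness, usual sleep on shift,
# health complaints, flexibility/rigidity) and the job satisfaction score.
rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 7))
beta_true = np.array([0.6, -0.3, 0.4, 0.2, 0.1, -0.2, 0.3])
y = X @ beta_true + rng.normal(scale=1.0, size=n)

X1 = np.column_stack([np.ones(n), X])          # add an intercept column
beta_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)
resid = y - X1 @ beta_hat
r_squared = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print(f"R^2 = {r_squared:.4f}")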
Abstract:
We postulated that neuromuscular disuse deleteriously affects tissue-vascular fluid exchange processes and subsequently damages the important oxidative bioenergetic process of intramuscular lipid metabolism. The in-depth research reported in the literature is somewhat limited by the ex vivo nature and sporadic time-course characterization of disuse atrophy and recovery. Thus, an in vivo controlled, localized animal model of disuse atrophy was developed in one of the hindlimbs of laboratory rabbits (employing a surgically implanted, tetrodotoxin (TTX)-filled mini-osmotic pump for sciatic nerve superfusion) and tested repeatedly with magnetic resonance (MR) throughout the 2-week period of temporarily induced disuse and during the recovery period (following explantation of the TTX-filled pump) for a period of 3 weeks. Controls consisted of saline/"sham"-implanted rabbit hindlimbs. The validity of this model was established with repeated electrophysiologic nerve conduction testing using a clinically appropriate protocol and percutaneously inserted small-needle stimulating and recording electrodes. Evoked responses recorded from proximal (P) and distal (D) sites relative to the sciatic nerve cuff in the TTX-implanted group revealed significantly decreased (p < 0.001) proximal-to-distal (P/D) amplitude ratios (as much as 50-70% below baseline/pre-implanted and sham-implanted group values) and significantly increased (p < 0.01) differential latency (PL-DL) values (as much as 1.5 times the pre- and sham-implanted group values). By Day 21 of recovery, observed P/D and PL-DL levels matched baseline/sham-implanted levels. MRI-determined cross-sectional area (CSA) values of the baseline/pre-implanted, sham- or TTX-implanted, and recovering/explanted tibialis anterior (TA) muscles and the corresponding contralateral hindlimb TA muscles, normalized to tibial bone (TB) CSA (as TA/TB ratios), revealed a significant decline (indicative of an atrophic response) from pre- and sham-implanted controls of as much as 20% (p < 0.01) at Day 7 and 50-55% (p < 0.001) at Day 13 of TTX implantation. In the non-implanted contralaterals, a significant increase (indicative of a hypertrophic response) of as much as 10% (p < 0.025) at Day 7 and 27% (p < 0.001) at Day 13 of TTX implantation was found. The induced atrophic/hypertrophic TA muscles were observed to be fully recovered by Day 21 post-explantation, as evidenced by the image TA/TB ratios. End-point biopsy results from a small group of rabbits revealed comprehensive atrophy of both Type I and Type II fibers, although the heterogeneity of the response supports the use of image-guided, volume-localized proton magnetic resonance spectroscopy (MRS) to noninvasively assess tissue-level metabolic changes. MRS-determined results from a 0.25 cc volume of tissue within implanted-limb TA muscles under resting/pre-ischemic, ischemic-stressed, and post-ischemic conditions at timepoints during and following disuse atrophy/recovery revealed significantly increased intramuscular spectral lipid levels, as much as 2-3 times (p < 0.01) the baseline/pre-implanted values at Day 7 and 6-7 times (p < 0.001) at Day 13 of TTX implantation, which approached normal levels (compared to pre- and sham-implanted groups) by Day 21 of post-explantation recovery. (Abstract shortened by UMI.)
Abstract:
Despite the dramatic increase in U.S. hospital bad debt expense and the general concern it has drawn (AMNews, January 12, 2004; Philadelphia Business Journal, April 30, 2004; WSJ, July 23, 2004), there appears to be little available analysis of the precise sources and causes of its growth. This is particularly true in terms of the potential contribution of insured patients to bad debt expense in light of the recent shift in managed care from health maintenance organization (HMO) plans to preferred provider organization (PPO) plans (Kaiser Annual Survey Report, 2003). This study examines and attempts to explain the recent dramatic growth in bad debt expense by analyzing data from two Houston-area hospital providers within one healthcare system. In contrast to prior studies in which self-pay was found to be the primary source of hospital bad debt expense (Saywell, R. M., et al., 1989; Zollinger, T. W., 1991; Weissman, Joel S., et al., 1999), this study hypothesizes that the growing hospital bad debt expense is mainly due to the shifting trend away from HMOs to PPOs as a conscious decision by employers to share costs with employees. Compared to HMO plans, the structure of PPOs includes higher co-pays, coinsurance, and deductibles for the patient-pay portion of medical bills, creating the potential for an increase in bad debt for hospital providers. In this case study, the bad debt expense had a greater impact in the community hospital than in the Texas Medical Center hospital.
Abstract:
A growing number of studies show strong associations between stress and altered immune function. In vivo studies of chronic and acute stress have demonstrated that cognitive stressors are strongly correlated with high circulating levels of catecholamines (CT) and corticosteroids (CS), which are associated with changes in type-1/type-2 cytokine expression. Although individual pharmacologic doses of CS and CT can inhibit the expression of T-helper 1 (Th1, type-1-like) and promote the production of T-helper 2 (Th2, type-2-like) cytokines in antigen-specific and mitogen-stimulated human leukocyte cultures in vitro, little attention has been focused on the effects of combined physiologic-stress doses of CT and CS, which may be more physiologically relevant. In addition, both in vivo and in vitro studies suggest that the differential expression of the B7 family of costimulatory molecules CD80 and CD86 may promote the expression of type-1 or type-2 cytokines, respectively. Furthermore, corticosteroids can influence the expression of β2-adrenergic receptors in various human tissues. We therefore investigated the combined effects of physiologic-stress doses of in vitro CT and CS on the type-1/type-2 cytokine balance and the expression of B7 costimulatory molecules in human peripheral blood mononuclear cells (PBMC) as a model for studying the immunomodulatory effects of physiologic stress. Results demonstrated a significant decrease in type-1 cytokine expression and a significant increase in type-2 cytokine production in our CS+CT-incubated cultures when compared to either CT or CS alone. In addition, we demonstrated the differential expression of CD80/CD86 in favor of CD86, at both the cellular and population level, as determined by flow cytometry in lipopolysaccharide-stimulated human monocytes. Furthermore, we developed flow cytometry-based assays to detect total β2AR in human CD4+ T-lymphocytes, which demonstrated decreased expression of β2AR in mitogen-stimulated CD4+ T-lymphocytes in the presence of physiologic stress levels of CS and CT as single in vitro agents; however, when both CS and CT were combined, significantly higher expression of β2AR was observed. In summary, our in vitro data suggest that CS and CT work cooperatively to shift immunity towards type-2 responses.
Abstract:
Background. Attention-Deficit/Hyperactivity Disorder (AD/HD) diagnosis in children and adolescents has been on the rise over the last couple of decades, and a multitude of studies have been conducted with the aim of better understanding the disorder. The literature has explored the role of several factors suspected of contributing to development of the disorder, including prenatal smoking exposures, environmental exposures, and low birth weight. However, there is very limited reporting on fetal/infant exposure to antidepressants and prescription medications and the long-term behavioral outcomes, namely development of AD/HD. The purpose of this study was to evaluate the relationship between a mother's exposure to prescription medications and/or antidepressants around the time of conception, during pregnancy, or while breastfeeding and the development of Attention-Deficit/Hyperactivity Disorder in her offspring. Methods. Secondary analysis of data from a case-control study was performed. Exposure histories were collected for the mother and offspring. Data were collected using a secure, confidential, self-report, online survey to evaluate the relationship between antidepressant and/or prescription medication exposure and the development of AD/HD. The period of exposure to these drugs was defined as around the time of conception, during pregnancy, or while breastfeeding. Cases were defined as children who had been diagnosed with AD/HD; controls were defined as children who had not been diagnosed with AD/HD. Results. Prescription medication and antidepressant medication exposures around the time of conception, during pregnancy, or while breastfeeding were not associated with development of AD/HD. However, traumatic brain injury (OR = 2.77 [1.61–4.77]) and preterm birth (OR = 1.48 [1.04–2.12]) were identified as potential risk factors. These results support existing literature on AD/HD, but future work must be undertaken to better evaluate fetal/infant medication exposures and long-term behavioral outcomes.
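The odds ratios quoted above are the standard case-control effect measure; a minimal sketch of how an odds ratio and a Wald-type confidence interval are computed from a 2x2 table follows, using made-up counts rather than the study's data.

import numpy as np

# Hypothetical 2x2 table: exposure (rows) by case/control status (columns).
a, b = 40, 60     # exposed cases, exposed controls
c, d = 30, 125    # unexposed cases, unexposed controls

or_hat = (a * d) / (b * c)                          # odds ratio
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)          # standard error of log(OR)
lo, hi = np.exp(np.log(or_hat) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR = {or_hat:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")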
Abstract:
To investigate the association between allergies and attention deficit hyperactivity disorder (ADHD), a case-control study was conducted using the National Longitudinal Survey of Youth population. Cases were between the ages of 4 and 11 years and were classified either by a maternal-reported diagnosis or by the Behavior Problems Index Hyperactivity Scale. Controls were chosen from the same age group but had a score of less than 14 on the overall Behavior Problems Index. A history of allergies was considered positive if any of the following conditions were reported as requiring treatment by a doctor or other health professional: asthma, allergic conditions, or food allergies. A strong association was observed between allergies and a maternal-reported diagnosis while controlling for demographic, socioeconomic, perinatal, and environmental factors (adjusted odds ratio = 2.85, 95% CI = 1.49-5.42). Other factors found to be important risk factors for a diagnosis of ADHD were gender (male), gestational age (<36 weeks), and maternal education (high school or less). No association was observed between allergies and cases classified as ADHD based on the hyperactivity symptom scale. This study confirms other studies that reported an allergy/ADHD association in diagnosed populations. Further investigation is warranted to confirm the association and to explain the reasons and underlying mechanisms behind it. Such studies should use validated diagnostic criteria for ADHD symptoms and allergies, use adequate sample sizes, and control for confounding.
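Because the association above is reported as an adjusted odds ratio, the underlying computation is a logistic regression with covariates; the sketch below illustrates that approach with invented variable names and random data, not the National Longitudinal Survey of Youth data or the study's actual model.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic case-control data: ADHD status regressed on allergy history,
# adjusting for example covariates (sex, preterm birth, maternal education).
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "adhd": rng.integers(0, 2, n),
    "allergy": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "preterm": rng.integers(0, 2, n),
    "low_maternal_edu": rng.integers(0, 2, n),
})

fit = smf.logit("adhd ~ allergy + male + preterm + low_maternal_edu", data=df).fit(disp=0)
adj_or = np.exp(fit.params["allergy"])              # adjusted odds ratio
ci_lo, ci_hi = np.exp(fit.conf_int().loc["allergy"])
print(f"adjusted OR = {adj_or:.2f}, 95% CI ({ci_lo:.2f}, {ci_hi:.2f})")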
Abstract:
Blood lead levels > 10 µg/dL are known to affect various areas of the brain that influence behavior and to cause many other health problems in children. As a result, the Centers for Disease Control and Prevention (CDC) set the blood lead action level at 10 µg/dL. However, recent research provides evidence that blood lead levels < 10 µg/dL may also lead to behavioral problems in children. With the recent increase in diagnosis of Attention-Deficit Hyperactivity Disorder (ADHD) in children in the U.S., it is important to identify environmental toxins, such as lead, that may play a role in causing ADHD symptoms. The aim of this systematic review of the literature was to identify recently published studies that examine an association between blood lead levels < 10 µg/dL and ADHD symptoms in children, in order to summarize their findings and describe major gaps in the literature. Although the available research is limited, the articles reviewed indicate that blood lead at levels well below the CDC action level of 10 µg/dL may affect a child's level of attention, hyperactivity, impulsivity, and ADHD diagnosis. Additional prospective research is warranted in order to inform the revision of current blood lead action levels as well as to better elucidate the relationship between lead and ADHD diagnoses.
Abstract:
Interruption is a known human factor that contributes to errors and catastrophic events in healthcare as well as in other high-risk industries. The landmark Institute of Medicine (IOM) report, To Err is Human, brought attention to the significance of preventable errors in medicine and suggested that interruptions could be a contributing factor. Previous studies of interruptions in healthcare did not offer a conceptual model by which to study interruptions. Because of the serious consequences of interruptions investigated in other high-risk industries, there is a need to develop a model to describe, understand, explain, and predict interruptions and their consequences in healthcare. Therefore, the purpose of this study was to develop a model grounded in the literature and to use the model to describe and explain interruptions in healthcare. Specifically, this model would be used to describe and explain interruptions occurring in a Level One Trauma Center. A trauma center was chosen because this environment is characterized as intense, unpredictable, and interrupt-driven.
The first step in developing the model was a review of the literature, which revealed that the concept of interruption did not have a consistent definition in either the healthcare or non-healthcare literature. Walker and Avant's method of concept analysis was used to clarify and define the concept. The analysis led to the identification of five defining attributes: (1) a human experience, (2) an intrusion of a secondary, unplanned, and unexpected task, (3) discontinuity, (4) externally or internally initiated, and (5) situated within a context. However, before an interruption can commence, five conditions known as antecedents must occur: (1) an intent to interrupt is formed by the initiator, (2) a physical signal passes a threshold test of detection by the recipient, (3) the sensory system of the recipient is stimulated to respond to the initiator, (4) an interruption task is presented to the recipient, and (5) the interruption task is either accepted or rejected by the recipient. An interruption was determined to be quantifiable by (1) the frequency of occurrence of interruptions, (2) the number of times the primary task has been suspended to perform an interrupting task, (3) the length of time the primary task has been suspended, and (4) the frequency of returning or not returning to the primary task.
As a result of the concept analysis, a definition of an interruption was derived from the literature. An interruption is defined as a break in the performance of a human activity, initiated internal or external to the recipient and occurring within the context of a setting or location. This break results in the suspension of the initial task by initiating the performance of an unplanned task, with the assumption that the initial task will be resumed. The definition is inclusive of all the defining attributes of an interruption and is a standard definition that can be used by the healthcare industry. From the definition, a visual model of an interruption was developed. The model was used to describe and explain the interruptions recorded in an instrumental case study of physicians and registered nurses (RNs) working in a Level One Trauma Center. Five physicians were observed for a total of 29 hours, 31 minutes. Eight registered nurses were observed for a total of 40 hours, 9 minutes.
Observations were made on either the 0700–1500 or the 1500–2300 shift using the shadowing technique and were recorded as field notes. The field notes were analyzed by a hybrid method of categorizing activities and interruptions, developed by combining a deductive a priori classification framework with an inductive process utilizing line-by-line coding and constant comparison as described in grounded theory. The following categories were identified as relevant to this study:
Intended Recipient – the person to be interrupted
Unintended Recipient – not the intended recipient of an interruption; e.g., receiving a phone call that was incorrectly dialed
Indirect Recipient – the incidental recipient of an interruption; e.g., talking with another person, thereby suspending the original activity
Recipient Blocked – the intended recipient does not accept the interruption
Recipient Delayed – the intended recipient postpones an interruption
Self-interruption – a person, independent of another person, suspends one activity to perform another; e.g., while walking, stops abruptly and talks to another person
Distraction – briefly disengaging from a task
Organizational Design – the physical layout of the workspace that causes a disruption in workflow
Artifacts Not Available – supplies and equipment that are not available in the workspace, causing a disruption in workflow
Initiator – a person who initiates an interruption
Interruption by Organizational Design and Artifacts Not Available were identified as two new categories of interruption that had not previously been cited in the literature. Analysis of the observations indicated that physicians performed slightly fewer activities per hour than RNs; this variance may be attributed to differing roles and responsibilities. Physicians had more activities interrupted than RNs, although RNs experienced more interruptions per hour. Other people were the most common medium through which an interruption was delivered; additional mediums included the telephone, pager, and one's self. Both physicians and RNs were observed to resume an original interrupted activity more often than not, and in most cases they performed only one or two interrupting activities before returning to the original interrupted activity. In conclusion, the model was found to explain all interruptions observed during the study. However, the model will require a more comprehensive study in order to establish its predictive value.
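As a small illustration of how the four quantifiable aspects of an interruption identified above (frequency of occurrence, number of suspensions of the primary task, length of suspension, and whether the primary task is resumed) might be recorded, the following sketch defines hypothetical data structures; it is not the study's observation instrument or coding scheme.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Interruption:
    initiator: str        # e.g. "other person", "telephone", "pager", "self"
    duration_s: float     # length of time the primary task was suspended
    resumed: bool         # whether the primary task was returned to

@dataclass
class ObservationSession:
    role: str             # e.g. "physician" or "RN"
    hours_observed: float
    interruptions: List[Interruption] = field(default_factory=list)

    def interruptions_per_hour(self) -> float:
        return len(self.interruptions) / self.hours_observed

    def resumption_rate(self) -> float:
        if not self.interruptions:
            return 1.0
        return sum(i.resumed for i in self.interruptions) / len(self.interruptions)

session = ObservationSession(role="RN", hours_observed=8.0)
session.interruptions.append(Interruption("telephone", 45.0, resumed=True))
session.interruptions.append(Interruption("other person", 120.0, resumed=False))
print(session.interruptions_per_hour(), session.resumption_rate())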