6 results for partial discharges
in DigitalCommons@The Texas Medical Center
Abstract:
PURPOSE: To review our clinical experience and determine whether there are appropriate signs and symptoms that should prompt POLG sequencing prior to valproic acid (VPA) dosing in patients with seizures. METHODS: Four patients who developed VPA-induced hepatotoxicity were examined for POLG sequence variations. A subsequent chart review was used to describe the clinical course prior to and after VPA dosing. RESULTS: Four patients of different ethnicities, aged 3-18 years, developed VPA-induced hepatotoxicity. All were given VPA for intractable partial seizures. Three of the patients had developed epilepsia partialis continua. The time from VPA exposure to liver failure was between 2 and 3 months. Liver failure was reversible in one patient. Molecular studies revealed homozygous p.R597W or p.A467T mutations in two patients. The other two patients showed compound heterozygous mutations, p.A467T/p.Q68X and p.L83P/p.G888S. Clinical findings and POLG mutations were diagnostic of Alpers-Huttenlocher syndrome. CONCLUSION: Our cases underscore several important findings: POLG mutations have been observed in every ethnic group studied to date; early predominance of epileptiform discharges over the occipital region is common in POLG-induced epilepsy; EEG and MRI findings vary between patients and stages of the disease; and VPA dosing at any stage of Alpers-Huttenlocher syndrome can precipitate liver failure. Our data support an emerging proposal that POLG gene testing should be considered in any child or adolescent who presents with or develops intractable seizures, with or without status epilepticus or epilepsia partialis continua, particularly when there is a history of psychomotor regression.
Abstract:
Background. Cypress Creek is one of the main tributaries of Lake Houston, which provides drinking water to 21.4 million customers. The watershed is also used for contact and non-contact recreation, such as canoeing, swimming, hiking, and picnicking. Water along the creek is impacted by numerous wastewater outfalls from both point and non-point sources. As the creek flows into Lake Houston, it carries both organic and inorganic contaminants that may affect the drinking water quality of this important source-water reservoir. Objective. This study was carried out to evaluate the inorganic chemical load of the water in Cypress Creek along its entire length, from the headwaters in Waller County to the drainage into Lake Houston. The purpose was to determine whether there are hazardous concentrations of metals in the water and what the likely sources would be. Method. Samples were collected at 29 sites along the creek and analyzed for 29 metals, 17 of which were on the Environmental Protection Agency priority pollutant list. Public access sites, primarily at bridges, were used for sample collection. Samples were transported on ice to The University of Texas School of Public Health laboratory, spiked with 2 ml HNO3, kept overnight in the refrigerator, and transported the following day to the EPA laboratory for analysis. Analysis was done by EPA Method 200.7-ICP, Method 200.8-ICP/MS, and Method 245.1-CVAAS. Results. Metals were present above the detection limits at 65% of sites. Concentrations of aluminum, iron, sodium, potassium, magnesium, and calcium were particularly high at all sites. Aluminum, sodium, and iron concentrations greatly exceeded the EPA secondary drinking water standards at all sites.
Conclusion. The recreational water along Cypress Creek is impacted by wastewater from both permitted and non-permitted outfalls, which deposit inorganic substances into the water. Although a number of inorganic contaminants were present in the water, toxic metals regulated by the EPA were mostly below the recommended limits. However, the high concentrations of aluminum, sodium, and iron in Cypress Creek raise the issue of unauthorized discharges of salt water from mining, as well as industrial and domestic wastewater.
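The exceedance screen described in the Results reduces to comparing measured concentrations against regulatory thresholds. The Python sketch below illustrates that comparison; the threshold values and site measurements are illustrative assumptions (aluminum and iron use the EPA secondary standards of 0.2 and 0.3 mg/L; sodium has no secondary MCL, so a commonly cited 20 mg/L guidance value stands in), not data from the study.

# Minimal sketch of the exceedance screen: flag metals whose measured
# concentration at a site exceeds a drinking-water threshold.
# Thresholds and sample values below are illustrative placeholders;
# verify against the current EPA secondary drinking water regulations.
THRESHOLDS_MG_L = {"aluminum": 0.2, "iron": 0.3, "sodium": 20.0}

# Hypothetical site measurements (mg/L), one dict per sampling site.
samples = [
    {"site": "CC-01", "aluminum": 1.4, "iron": 0.9, "sodium": 85.0},
    {"site": "CC-17", "aluminum": 0.1, "iron": 0.2, "sodium": 12.0},
]

for s in samples:
    exceedances = {m: c for m, c in s.items()
                   if m in THRESHOLDS_MG_L and c > THRESHOLDS_MG_L[m]}
    print(s["site"], "exceeds:", exceedances or "none")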
Abstract:
To address concerns about the possible effect of drilling mud discharges on shallow, low-energy estuarine ecosystems, a 12-month study was designed to detect alterations in water quality and sediment geochemistry. Each drilling mud used in the study, and sediments from the study site, were analyzed in the laboratory for chemical and physical characteristics. Potential water quality impacts were simulated by the EPA-COE elutriation test procedure. Mud toxicity was measured by acute and chronic bioassays with Mysidopsis bahia, Mercenaria mercenaria, and Nereis virens.
For the field study, a relatively pristine, shallow (1.2 m) estuary (Christmas Bay, TX) without any drilling activity for the last 30 years was chosen as the study site. After a three-month baseline study, three stations were selected. Station 1 was an external control. At each treatment station (2, 3), mesocosms were constructed to enclose a 3.5 m³ water column. Each treatment station also included an internal control site. Each in situ mesocosm, except the controls, was successively dosed at a mesocosm-specific dose (1:100, 1:1,000, or 1:10,000 v/v) with 4 field-collected drilling muds (spud, nondispersed, lightly-treated, and heavily-treated lignosulfonate) in sequential order over 1.5 months. Twenty-four hours after each dose, water exchange was allowed until the next treatment. Station 3 was destroyed by a winter storm. After the last treatment, the enclosures were removed and the remaining sites were monitored for 6 months. One additional site was similarly dosed (1:100 v/v) with clean dredged sediment from Christmas Bay for comparison between dredged sediments and drilling muds.
Results of the analysis of the water samples and field measurements showed that water quality was impacted during the discharges, primarily at the highest dose (1:100 v/v), but that elevated levels of C, Cr (T, F), Cr³⁺ (T, F), N, Pb, and Zn returned to ambient levels before the end of the 24-hour exposure period, or immediately after water exchange was allowed (Al, Ba (T), chlorophyll a/b/c, SS, %T). Barium, from the barite, was used as a geochemical tracer in the sediments to confirm estimated doses by mass balance calculations. Barium reached a maximum of 166x background levels at the high-dose mesocosm. Barium levels returned to ambient or only slightly elevated levels by the end of the 6-month monitoring period due to sediment deposition, resuspension, and bioturbation. QA/QC results using blind samples, consisting of lab standards and spiked samples for both water and sediment matrices, were within acceptable coefficients of variation.
To avoid impacts on water quality and sediment geochemistry in a shallow estuarine ecosystem, this study concluded that a minimum dilution of 1:1,000 (v/v) would be required in addition to existing regulatory constraints.
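The barium mass-balance check mentioned above can be sketched as a back-of-the-envelope calculation: the Ba mass delivered by a dose, spread over the mixed sediment layer, gives the expected enrichment over background. Every input in the Python sketch below is a hypothetical placeholder chosen only to show the arithmetic, not a value reported by the study.

# Illustrative mass-balance check in the spirit of the barium tracer
# calculation: estimate the sediment Ba enrichment expected from one
# mesocosm dose. All inputs are assumed placeholders.
water_volume_l = 3500.0          # 3.5 m^3 mesocosm water column, in liters
dose_ratio = 1.0 / 100.0         # 1:100 v/v drilling-mud dose
mud_density_kg_l = 1.5           # assumed whole-mud density
ba_fraction_mud = 0.10           # assumed Ba mass fraction of mud (from barite)

sediment_area_m2 = 3.0           # assumed mesocosm footprint
mixing_depth_m = 0.02            # assumed mixed sediment layer depth
sediment_bulk_density_kg_m3 = 1300.0
background_ba_mg_kg = 200.0      # assumed ambient sediment Ba

# Ba mass delivered by the dose, assuming it all settles into the mixed layer.
ba_added_mg = (water_volume_l * dose_ratio * mud_density_kg_l
               * ba_fraction_mud * 1e6)

sediment_mass_kg = sediment_area_m2 * mixing_depth_m * sediment_bulk_density_kg_m3
ba_increment_mg_kg = ba_added_mg / sediment_mass_kg

enrichment = (background_ba_mg_kg + ba_increment_mg_kg) / background_ba_mg_kg
print(f"Expected Ba enrichment over background: {enrichment:.0f}x")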
Abstract:
A life table methodology was developed that estimates the expected remaining Army service time and the expected remaining Army sick time by years of service for the United States Army population. A measure of illness impact was defined as the ratio of expected remaining Army sick time to expected remaining Army service time. The variances of the resulting estimators were derived on the basis of current data. The theory of partial and complete competing risks was considered for each type of decrement (death, administrative separation, and medical separation) and for the causes of sick time.
The methodology was applied to worldwide U.S. Army data for calendar year 1978. A total of 669,493 enlisted personnel and 97,704 officers were reported on active duty as of 30 September 1978. During calendar year 1978, the Army Medical Department reported 114,647 inpatient discharges and 1,767,146 sick days. Although the methodology is completely general with respect to the definition of sick time, only sick time associated with an inpatient episode was considered in this study.
Since the temporal measure was years of Army service, an age-adjusting process was applied to the life tables for comparative purposes. Analyses were conducted by rank (enlisted and officer), race, and sex, and were based on the ratio of expected remaining Army sick time to expected remaining Army service time. Seventeen major diagnostic groups, classified by the Eighth Revision, International Classification of Diseases, Adapted for Use in the United States, were ranked according to their cumulative (across years of service) contribution to expected remaining sick time.
The study results indicated that enlisted personnel tend to have more expected hospital-associated sick time relative to their expected Army service time than officers. Non-white officers generally have more expected sick time relative to their expected Army service time than white officers; this racial differential was not supported within the enlisted population. Females tend to have more expected sick time relative to their expected Army service time than males, and this tendency remained after diagnostic groups 580-629 (Genitourinary System) and 630-678 (Pregnancy and Childbirth) were removed. Problems associated with the circulatory system, digestive system, and musculoskeletal system were the three leading causes of cumulative sick time across years of service.
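The core life-table computation described above can be illustrated with a short sketch: given per-year-of-service continuation probabilities and expected sick days, accumulate survival-weighted sums to obtain the expected remaining service time, the expected remaining sick time, and their ratio (the illness-impact measure). The Python below uses invented rates purely for illustration; it is not the 1978 Army data or the study's exact estimator.

# p[t]    = probability of remaining in service from year t to t+1
# sick[t] = expected sick days during year t, given in service at year t
p = [0.80, 0.85, 0.90, 0.92, 0.0]       # final 0.0 closes the table
sick = [4.0, 3.5, 3.0, 3.0, 2.5]

def remaining_expectations(start):
    """Expected remaining service years and sick days from year `start`."""
    surv = 1.0        # probability of still being in service
    e_service = 0.0
    e_sick = 0.0
    for t in range(start, len(p)):
        e_service += surv            # one more year served, survival-weighted
        e_sick += surv * sick[t]
        surv *= p[t]
    return e_service, e_sick

for t in range(len(p)):
    e_srv, e_sck = remaining_expectations(t)
    # Illness-impact ratio: expected remaining sick days per expected
    # remaining service year.
    print(f"year {t}: service={e_srv:.2f} y, sick={e_sck:.1f} d, "
          f"ratio={e_sck / e_srv:.2f} d/y")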
Abstract:
Background. Kidney disease is a growing public health phenomenon in the U.S. and in the world. Downstream interventions, dialysis and renal transplants, covered by Medicare's renal disease entitlement policy in those who are 65 years and over, have been expensive treatments that are not foolproof. The shortage of kidney donors in the U.S. has grown in the last two decades. Therefore, study of the upstream events in kidney disease development and progression is justified to prevent the rising prevalence of kidney disease. Previous studies have documented the biological route by which obesity can drive the progression and acceleration of kidney disease, but health services literature quantifying the effects of overweight and obesity on economic outcomes in the context of renal disease was lacking. Objectives. The specific aims of this study were (1) to determine the likelihood of overweight and obesity in renal disease and in three specific adult renal disease sub-populations: hypertensive, diabetic, and both hypertensive and diabetic; (2) to determine the incremental health service use and spending in overweight and obese renal disease populations; and (3) to determine who financed the cost of healthcare for renal disease in overweight and obese adult populations less than 65 years of age. Methods. This study was a retrospective cross-sectional study of renal disease cases pooled for years 2002 to 2009 from the Medical Expenditure Panel Survey. The likelihood of overweight and obesity was estimated using the chi-square test. Negative binomial regression and a generalized gamma model with log link were used to estimate healthcare utilization and healthcare expenditures for six health event categories. Payments by self/family, public insurance, and private insurance were described for overweight and obese kidney disease sub-populations. Results. The likelihood of overweight and obesity was 0.29 and 0.46, respectively, among the renal disease population, and obesity was common in the hypertensive and diabetic renal disease populations. Among the obese renal disease population, negative binomial regression estimates of healthcare utilization per person per year, as compared to normal-weight renal disease persons, were significant for office-based provider visits and agency home health visits (p=0.001; p=0.005). Among the overweight kidney disease population, health service use was significant for inpatient hospital discharges (p=0.027). Over years 2002 to 2009, overweight and obese renal disease sub-populations had 53% and 63% higher inpatient facility and doctor expenditures, respectively, as compared to the normal-weight renal disease population, and these results were statistically significant (p=0.007; p=0.026). The overweight renal disease population had significant total expenses per person per year for office-based and outpatient-associated care. Overweight and obese renal disease persons paid less out of pocket overall compared to the normal-weight renal disease population. Medicare and Medicaid had the highest mean annual payments for obese renal disease persons, while mean annual payments were highest for private insurance among the normal-weight renal disease population. Conclusion. Overweight and obesity were common in those with acute and chronic kidney disease and resulted in higher healthcare spending and increased utilization of office-based providers, hospital inpatient departments, and agency home healthcare.
Healthcare for overweight and obese renal disease persons younger than 65 years of age was financed more by private and public insurance and less by out-of-pocket payments. With the increasing epidemic of obesity in the U.S. and the aging of the baby boomer population, the findings of the present study have implications for public health and for directing healthcare resources toward preventing, managing, and delaying the onset of overweight and obesity, which can hasten the progression of kidney disease.
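As a rough illustration of the two model families named in the Methods, the sketch below fits a negative binomial GLM for visit counts and, as a simpler stand-in for the generalized gamma model, a gamma GLM with log link for expenditures, using statsmodels. The file name and DataFrame columns (visits, total_exp, bmi_class, age, female) are hypothetical placeholders for MEPS-derived variables, not the study's actual variable names.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical pooled 2002-2009 MEPS extract of renal disease cases.
df = pd.read_csv("meps_renal_2002_2009.csv")

# Utilization: office-based provider visits per person per year,
# modeled as overdispersed counts with a negative binomial GLM.
nb = smf.glm("visits ~ C(bmi_class) + age + female", data=df,
             family=sm.families.NegativeBinomial()).fit()

# Expenditures: positive, right-skewed spending modeled on the log
# scale with a gamma GLM (stand-in for the generalized gamma model).
gamma = smf.glm("total_exp ~ C(bmi_class) + age + female",
                data=df[df["total_exp"] > 0],
                family=sm.families.Gamma(link=sm.families.links.Log())).fit()

print(nb.summary())
print(gamma.summary())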
Abstract:
The main aim of this study was to examine the association between Clostridium difficile infection (CDI) and HIV. A secondary goal was to examine the trend of CDI-related deaths in Texas from 1999 to 2011. To evaluate CDI and HIV coinfection, we examined two datasets provided by CHS-TDSHS for the 13-year study period from 1999 to 2011: 1) Texas death certificate data and 2) Texas hospital discharge data. An ancillary source of data was national-level death data from the CDC. We performed a secondary data analysis and reported the age-adjusted death rates (mortality) and hospital discharge frequencies (morbidity) for CDI, HIV, and CDI+HIV coinfection.
Since the turn of the century, CDI has reemerged as an important public health challenge due to the emergence of hypervirulent epidemic strains. From 1999 to 2011, there was a significant upward trend in CDI-related death rates; in the state of Texas alone, the CDI mortality rate increased 8.7-fold over this period, at a rate of 0.2 deaths per 100,000 individuals per year. On the contrary, mortality due to HIV decreased by 46% and has been trending down. The demographic groups in Texas with the highest CDI mortality rates were the elderly aged 65+, males, whites, and hospital inpatients. The epidemiology of C. difficile has changed such that it is no longer confined to these traditional high-risk groups but is increasingly reported in low-risk populations, such as healthy people in the community (community-acquired C. difficile) and, most recently, immunocompromised patients. Among the latter, HIV can worsen the adverse health outcomes of CDI, and vice versa. In patients with CDI and HIV coinfection, higher mortality and morbidity were found in young and middle-aged adults, blacks, and males, the same demographic groups that are at higher risk for HIV. As with typical CDI, the coinfection was concentrated among hospital inpatients. Of all the CDI-related deaths in the USA from 1999 to 2010 in the 25-44 year age group, 13% had HIV infection. Of all CDI-related inpatient hospital discharges in Texas from 1999 to 2011 in patients 44 years and younger, 17% had concomitant HIV infection. Therefore, HIV is a possible novel emerging risk factor for CDI.
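The age-adjusted death rates reported above come from direct standardization: weight each age-specific rate by a standard-population share and sum. Below is a minimal Python sketch with invented age bands, weights, and counts; the real analysis would standardize to a US standard population using the CHS-TDSHS data.

# (deaths, population) per age band for one hypothetical year of
# Texas CDI data; values are placeholders, not study results.
strata = {
    "0-24":  (3,   9_000_000),
    "25-44": (20,  7_500_000),
    "45-64": (90,  5_500_000),
    "65+":   (400, 2_500_000),
}

# Assumed standard-population weights (must sum to 1.0).
std_weights = {"0-24": 0.35, "25-44": 0.30, "45-64": 0.22, "65+": 0.13}

# Direct adjustment: sum of weighted age-specific rates per 100,000.
age_adjusted = sum(
    std_weights[band] * deaths / pop * 100_000
    for band, (deaths, pop) in strata.items()
)
print(f"Age-adjusted CDI death rate: {age_adjusted:.2f} per 100,000")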