955 results for Log Stackers and Sawmill Yard


Relevance: 30.00%

Abstract:

Research studies on the association between exposures to air contaminants and disease frequently use worn dosimeters to measure the concentration of the contaminant of interest. However, investigation of exposure determinants requires additional knowledge beyond concentration, i.e., knowledge about personal activity such as whether the exposure occurred in a building or outdoors. Current studies frequently depend upon manual activity logging to record location. This study's purpose was to evaluate the use of a worn data logger recording three environmental parameters (temperature, humidity, and light intensity) as well as time of day, to determine indoor or outdoor location, with an ultimate aim of eliminating the need to manually log location, or at least providing a method to verify such logs. For this study, data collection was limited to a single geographical area (the Houston, Texas metropolitan area) during a single season (winter) using a HOBO H8 four-channel data logger. Data for development of a Location Model were collected using the logger for deliberate sampling of programmed activities in outdoor, building, and vehicle locations at various times of day. The Model was developed by analyzing the distributions of environmental parameters by location and time to establish a prioritized set of cut points for assessing locations. The final Model consisted of four "processors" that varied these priorities and cut points. Data to evaluate the Model were collected by wearing the logger during "typical days" while maintaining a location log. The Model was tested by feeding the typical-day data into each processor and generating assessed locations for each record. These assessed locations were then compared with true locations recorded in the manual log to determine accurate versus erroneous assessments. The utility of each processor was evaluated by calculating overall error rates across all times of day and individual error rates by time of day. Unfortunately, the error rates were large, such that there would be no benefit in using the Model. Another analysis, in which assessed locations were classified as either indoor (including both building and vehicle) or outdoor, yielded slightly lower error rates that still precluded any benefit of the Model's use.
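
The Location Model described above is essentially a prioritized set of cut-point rules applied to each logger record. The sketch below illustrates one such "processor" in Python; the cut points, priority order, parameter units, and function name are hypothetical illustrations, not the values derived in the study.

```python
# Minimal sketch of one cut-point "processor" that assigns a location label to a
# single logger record. All thresholds below are hypothetical.

def classify_location(temp_f: float, rh_pct: float, light: float, hour: int) -> str:
    """Return 'outdoor', 'building', or 'vehicle' for one logger record."""
    daytime = 7 <= hour <= 18
    # Priority 1: bright light during daytime hours suggests an outdoor location.
    if daytime and light > 300:                    # hypothetical light cut point
        return "outdoor"
    # Priority 2: a narrow temperature/humidity band suggests a conditioned building.
    if 68 <= temp_f <= 76 and 30 <= rh_pct <= 55:  # hypothetical comfort band
        return "building"
    # Priority 3: everything else defaults to a vehicle in this toy processor.
    return "vehicle"

# Example: a record taken at 2 p.m. in warm, moderately humid, dim conditions.
print(classify_location(temp_f=72, rh_pct=45, light=40, hour=14))  # -> building
```

Evaluating such a processor then amounts to running each typical-day record through the function and tabulating agreement with the manual location log.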

Relevance: 30.00%

Abstract:

Unlike infections occurring during periods of chemotherapy-induced neutropenia, postoperative infections in patients with solid malignancy remain largely understudied. The purpose of this population-based study was to evaluate the clinical and economic burden, as well as the relationship of hospital surgical volume and outcomes, associated with serious postoperative infection (SPI) – i.e., bacteremia/sepsis, pneumonia, and wound infection – following resection of common solid tumors.

From the Texas Discharge Data Research File, we identified all Texas residents who underwent resection of cancer of the lung, esophagus, stomach, pancreas, colon, or rectum between 2002 and 2006. From their billing records, we identified ICD-9 codes indicating SPI and also subsequent SPI-related readmissions occurring within 30 days of surgery. Random-effects logistic regression was used to calculate the impact of SPI on mortality, as well as the association between surgical volume and SPI, adjusting for case-mix, hospital characteristics, and clustering of multiple surgical admissions within the same patient and patients within the same hospital. Excess bed days and costs were calculated by subtracting values for patients without infections from those with infections, computed using a multilevel mixed-effects generalized linear model that fitted a gamma distribution to the data with a log link.

Serious postoperative infection occurred following 9.4% of the 37,582 eligible tumor resections and was independently associated with an 11-fold increase in the odds of in-hospital mortality (95% Confidence Interval [95% CI], 6.7-18.5, P < 0.001). Patients with SPI required 6.3 additional hospital days (95% CI, 6.1 - 6.5) at an incremental cost of $16,396 (95% CI, $15,927–$16,875). There was a significant trend toward lower overall rates of SPI with higher surgical volume (P=0.037).

Due to the substantial morbidity, mortality, and excess costs associated with SPI following solid tumor resections, and given that, under current reimbursement practices, most of this heavy burden is borne by acute care providers, it is imperative for hospitals to identify more effective prophylactic measures so that these potentially preventable infections and their associated expenditures can be averted. Additional volume-outcomes research is also needed to identify infection prevention processes that can be transferred from higher- to lower-volume providers.
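
A minimal sketch of the cost-modelling step, assuming a pandas data frame with one row per resection and hypothetical column and file names ('cost', 'spi', 'age', 'comorbidity_score'); it uses a single-level gamma GLM with a log link as a simplified stand-in for the study's multilevel mixed-effects model.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per eligible tumor resection, with the
# hospitalization cost, an SPI indicator, and case-mix covariates.
df = pd.read_csv("tumor_resections.csv")

# Gamma GLM with a log link for the right-skewed cost data.
cost_model = smf.glm(
    "cost ~ spi + age + comorbidity_score",
    data=df,
    family=sm.families.Gamma(link=sm.families.links.Log()),
).fit()

# Incremental cost of SPI: mean predicted cost with infection minus without.
incremental = (cost_model.predict(df.assign(spi=1))
               - cost_model.predict(df.assign(spi=0))).mean()
print(f"Estimated incremental cost per SPI: ${incremental:,.0f}")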

Relevance: 30.00%

Abstract:

Previous research has shown dietary intake self-monitoring and culturally tailored weight loss interventions to be effective tools for weight loss. Technology can be used to tailor weight loss interventions to better suit adolescents. There is a lack of research to date on the use of personal digital assistants (PDAs) to self-monitor dietary intake among adolescents. The objective of this study was to determine the difference in dietary intake self-monitoring frequency between using a PDA and paper logs as a diet diary in obese adolescent females, and to describe differences in diet adherence, as well as changes in body size and self-efficacy to resist eating. We hypothesized that dietary intake self-monitoring frequency would be greater during PDA use than during paper log use. This study was a randomized crossover trial. Participants recorded their diet for 4 weeks: 2 weeks on a PDA and 2 weeks on paper logs. Thirty-four obese females aged 12-20 were recruited for participation; thirty were included in analyses. Participants recorded more entries/day while using the paper logs (4.10 entries/day ± 0.63) than while using the PDA (3.01 entries/day ± 0.75) (p<0.001). Significantly more meals and snacks were skipped during paper log use (0.81/day ± 0.65) than during PDA use (0.23/day ± 0.22) (p=0.011). Changes in body size (BMI, weight, and waist circumference) and self-efficacy to resist eating did not differ significantly between PDA and paper log use. When compared to paper logs, participants felt the PDA was more convenient (p=0.020), looked forward to using the PDA more (p=0.008), and would rather continue using the PDA than the paper logs (p=0.020). The findings of this study indicate that use of a PDA as a dietary intake self-monitoring tool among adolescents would not result in increased dietary intake self-monitoring to aid in weight loss. Use of paper logs would result in greater data returned to clinicians, though use of PDAs would likely make adolescents more enthusiastic about adhering to recommendations to record their diet. Future research should look at updated communication devices, such as cell phones and other PDAs with additional features, and the role they can play in increasing dietary intake self-monitoring among adolescents.
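
Because each participant contributed both a PDA period and a paper-log period, the primary comparison of entries per day is a paired one. The sketch below shows such a paired test on illustrative, simulated per-participant means; the values are drawn only to echo the reported group means and standard deviations and are not the study data.

```python
import numpy as np
from scipy import stats

# Illustrative per-participant mean entries recorded per day for the paper-log
# and PDA periods (n = 30). Simulated for demonstration only.
rng = np.random.default_rng(0)
paper = rng.normal(4.10, 0.63, 30)
pda = rng.normal(3.01, 0.75, 30)

# Each participant contributes both periods, so the comparison is paired.
t_stat, p_value = stats.ttest_rel(paper, pda)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4g}")
```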

Relevance: 30.00%

Abstract:

This cross-sectional study was performed to quantify the prevalence of symptomatology in residents of mobile homes as a function of indoor formaldehyde concentration. Formaldehyde concentrations were monitored for a seven-hour period with an automated wet-chemical colorimetric analyzer. The health status of family members was ascertained by administration of questionnaires and physical exams. This is the first investigation to perform clinical assessments on residents undergoing concurrent exposure assessment in the home.

Only 22.8% of households eligible for participation chose to cooperate. Monitoring data and health evaluations were obtained from 155 households in four Texas counties. A total of 428 residents (86.1%) were available for examination during the sampling hours. The study population included 45 infants, 126 children, and 257 adults.

Formaldehyde concentration was not found to be significantly associated with increased risks for symptoms and signs of ocular irritation, dermal anomalies, or malaise. Three associations were identified that warrant further investigation. A doubling of formaldehyde concentration was significantly associated with parenchymal rales in adults and children; however, the risk was modified by log respirable suspended particulate concentrations. Due to the presence of modification by a continuous variable, prevalence odds ratios (POR) and 95% confidence intervals (95% CI) for these associations are presented in tables. A doubling of formaldehyde concentration was also associated with an increased risk of perceived tightness in the chest in adults; prevalence odds ratios are presented in a table due to effect modification by the average number of hours spent indoors on weekdays. Furthermore, a doubling of formaldehyde concentration was associated with an increased risk of drowsiness in children (POR = 2.60; 95% CI 1.04-6.51) and adults (POR = 1.94; 95% CI 1.20-3.14).
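
One common way to express risk per doubling of exposure is to enter the exposure on a log2 scale in a logistic model, so the exponentiated coefficient is the prevalence odds ratio associated with a doubling of concentration. A minimal sketch, assuming a data frame with hypothetical columns 'drowsy', 'hcho', and 'age':

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per resident.
df = pd.read_csv("mobile_home_survey.csv")
df["log2_hcho"] = np.log2(df["hcho"])

# exp(coefficient of log2_hcho) is the POR per doubling of formaldehyde.
model = smf.logit("drowsy ~ log2_hcho + age", data=df).fit()
por = np.exp(model.params["log2_hcho"])
ci_low, ci_high = np.exp(model.conf_int().loc["log2_hcho"])
print(f"POR per doubling = {por:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```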

Relevance: 30.00%

Abstract:

The efficacy of waste stabilization lagoons for the treatment of five priority pollutants and two widely used commercial compounds was evaluated in laboratory model ponds. Three ponds were designed to simulate a primary anaerobic lagoon, a secondary facultative lagoon, and a tertiary aerobic lagoon. Biodegradation, volatilization, and sorption losses were quantified for bis(2-chloroethyl) ether, benzene, toluene, naphthalene, phenanthrene, ethylene glycol, and ethylene glycol monoethyl ether. A statistical model using a log-normal transformation indicated biodegradation of bis(2-chloroethyl) ether followed first-order kinetics. Additionally, multiple regression analysis indicated biochemical oxygen demand was the water quality variable most highly correlated with bis(2-chloroethyl) ether effluent concentration.
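
First-order kinetics mean the concentration decays as C(t) = C0·exp(−kt), so a log transform makes the decay linear in time and the rate constant can be recovered by ordinary least squares. A minimal sketch on illustrative concentrations (not the study data):

```python
import numpy as np
import statsmodels.api as sm

# Illustrative effluent concentrations (mg/L) sampled over time (days).
t = np.array([0, 2, 4, 6, 8, 10], dtype=float)
conc = np.array([5.0, 3.4, 2.3, 1.6, 1.1, 0.75])

# ln C = ln C0 - k t, so the OLS slope on log-transformed data estimates -k.
X = sm.add_constant(t)
fit = sm.OLS(np.log(conc), X).fit()
k = -fit.params[1]
print(f"first-order rate constant k = {k:.3f} per day, R^2 = {fit.rsquared:.3f}")
```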

Relevance: 30.00%

Abstract:

Obesity during pregnancy is a serious health concern that has been associated with many adverse health outcomes for both the mother and the infant. In addition, data on the prevalence of obesity and its effects on pregnant women living in the border region are limited. The goal of this study was to examine the prevalence of preconception obesity among women living on each side of the Brownsville-Matamoros border who had just given birth, the relationship between obesity and pregnancy complications for the total population, and these associations by location. Study participants were drawn from a sample (n=947) from the Brownsville-Matamoros Sister City Project, which included women from 10 border region hospitals (6 in Matamoros, 4 in Cameron County) who were recruited based on hospital log records indicating they had given birth to a live infant. De-identified data from verbal questionnaires administered within twenty-four hours after birth were analyzed to determine the prevalence of preconception obesity on both sides of the border and associated pregnancy outcomes for women residing in the United States and those in Mexico. Participants with missing height or weight data were excluded from analyses in this study, resulting in a final sample of 727 women. Significant associations were found between pre-pregnancy obesity and adverse pregnancy outcomes (OR=1.85, CI=1.30–2.64), hypertensive conditions (OR=2.76, CI=1.72–4.43), and macrosomia (OR=6.77, CI=1.13–40.57) using the total sample. Comparisons between the United States and Mexico sides of the border showed differences; associations between preconception obesity and adverse pregnancy outcomes were marginally significant among women in the United States (p=0.05), but failed to reach significance within this group for each individual complication. However, significant associations were found between obesity and preeclampsia (OR=3.61, CI=2.14–6.10), as well as obesity and the presence of one or more adverse pregnancy outcomes (OR=2.29, CI=1.30–4.02), among women in Mexico. The results from this analysis provide new information specific to women on the Texas and Mexico border, a region that had not previously been studied. These significant associations between preconception obesity and adverse birth outcomes indicate that efforts to prevent obesity should focus on women of childbearing age, especially in Mexico.
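
Odds ratios like those above can, in the unadjusted case, be computed directly from a 2x2 table of obesity status against an outcome. A minimal sketch with illustrative counts (not the study data), using SciPy's odds-ratio routine:

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: rows = pre-pregnancy obese / not obese,
# columns = any adverse pregnancy outcome yes / no.
table = np.array([[60, 140],
                  [90, 437]])

# Odds ratio with a 95% confidence interval (conditional MLE by default).
res = stats.contingency.odds_ratio(table)
ci = res.confidence_interval(confidence_level=0.95)
print(f"OR = {res.statistic:.2f}, 95% CI = ({ci.low:.2f}, {ci.high:.2f})")
```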

Relevance: 30.00%

Abstract:

Preventable Hospitalizations (PHs) are hospitalizations that can be avoided with appropriate and timely care in the ambulatory setting and hence are closely associated with primary care access in a community. Increased primary care availability and health insurance coverage may increase primary care access and, consequently, may be significantly associated with the risks and costs of PHs.

Objective. To estimate the risk and cost of preventable hospitalizations (PHs); to determine the association of primary care availability and health insurance coverage with the risk and costs of PHs, first alone and then simultaneously; and finally, to estimate the impact of expansions in primary care availability and health insurance coverage on the burden of PHs among non-elderly adult residents of Harris County.

Methods. The study population was residents of Harris County, age 18 to 64, who had at least one hospital discharge in a Texas hospital in 2008. The primary independent variables were availability of primary care physicians, availability of primary care safety net clinics, and health insurance coverage. The primary dependent variables were PHs and associated hospitalization costs. The Texas Health Care Information Collection (THCIC) Inpatient Discharge data was used to obtain information on the number and costs of PHs in the study population. The risk of PHs in the study population, as well as the average and total costs of PHs, were calculated. Multivariable logistic regression models and two-step Heckman regression models with log-transformed costs were used to determine the association of primary care availability and health insurance coverage with the risk and costs of PHs, respectively, while controlling for individual predisposing, enabling, and need characteristics. Predicted PH risk and cost were used to calculate the predicted burden of PHs in the study population and the impact of expansions in primary care availability and health insurance coverage on the predicted burden.

Results. In 2008, hospitalized non-elderly adults in Harris County had 11,313 PHs and a corresponding PH risk of 8.02%. Congestive heart failure was the most common PH. PHs imposed a total economic burden of $84 million at an average of $7,449 per PH. Higher primary care safety net availability was significantly associated with a lower risk of PHs in the final risk model, but only in the uninsured. A unit increase in safety net availability led to a 23% decline in PH odds in the uninsured, compared to only a 4% decline in the insured. Higher primary care physician availability was associated with increased PH costs in the final cost model (β=0.0020; p<0.05). Lack of health insurance coverage increased the risk of PH, with the uninsured having 30% higher odds of PHs (OR=1.299; p<0.05), but reduced the cost of a PH by 7% (β=-0.0668; p<0.05). Expansions in primary care availability and health insurance coverage were associated with a reduction of about $1.6 million in PH burden at the highest level of expansion.

Conclusions. Availability of primary care resources and health insurance coverage in hospitalized non-elderly adults in Harris County are significantly associated with the risk and costs of PHs. Expansions in these primary care access factors can be expected to produce significant reductions in the burden of PHs in Harris County.
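
A minimal sketch of the two outcome models, assuming a discharge-level data frame with hypothetical column names; a simple logit stands in for the multivariable risk model, and an OLS on log-transformed costs stands in for the two-step Heckman cost model (which additionally corrects for selection into having a PH).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per hospitalized non-elderly adult, with a
# PH indicator, hospitalization cost, insurance status, and primary care
# availability measures.
df = pd.read_csv("thcic_discharges.csv")

# Step 1: risk of a PH as a function of primary care access factors.
risk = smf.logit("ph ~ uninsured + safety_net_clinics_per_100k + age", data=df).fit()
print(np.exp(risk.params))  # odds ratios

# Step 2: log-transformed costs among discharges that were PHs.
cost = smf.ols("np.log(cost) ~ uninsured + pcp_per_100k + age",
               data=df[df["ph"] == 1]).fit()
print(cost.params)
```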

Relevance: 30.00%

Abstract:

Venous thromboembolism (VTE), including deep vein thrombosis (DVT) and pulmonary embolism (PE), is the third most preventable cardiovascular disease and a growing public health problem in the United States. The incidence of VTE remains high, with an annual estimate of more than 600,000 symptomatic events. DVT affects an estimated 2 million Americans each year, with a death toll of 300,000 persons per year from DVT-related PE. Leukemia patients are at high risk for both hemorrhage and thrombosis; however, little is known about thrombosis among acute leukemia patients. The ultimate goal of this dissertation was to obtain a deep understanding of thrombosis among acute leukemia patients. The dissertation was presented in the format of three papers. The first paper examined the distribution and risk factors associated with development of VTE among patients with acute leukemia prior to leukemia treatment. The second paper examined the incidence, risk factors, and impact of VTE on survival of patients with acute lymphoblastic leukemia during treatment. The third paper examined recurrence and risk factors for VTE recurrence among acute leukemia patients with an initial episode of VTE. Descriptive statistics, chi-squared or Fisher's exact tests, the median test, the Mann-Whitney test, logistic regression analysis, and nonparametric Kaplan-Meier estimation with a log-rank test or Cox models were used when appropriate. Results from the analyses indicated that acute leukemia patients had a high prevalence, incidence, and recurrence rate of VTE. Prior history of VTE, obesity, older age, low platelet count, presence of Philadelphia-positive ALL, use of oral contraceptives or hormone replacement therapy, presence of malignancies, and co-morbidities may place leukemia patients at an increased risk for VTE development or recurrence. Interestingly, development of VTE was not associated with a higher risk of death among hospitalized acute leukemia patients.
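
A minimal sketch of the time-to-event comparison, assuming a patient-level data frame with hypothetical columns 'time' (days of follow-up), 'vte_event' (1 = VTE), and 'prior_vte'; it uses the lifelines package for the Kaplan-Meier estimate and log-rank test.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical analysis file: one row per acute leukemia patient.
df = pd.read_csv("leukemia_cohort.csv")
prior = df[df["prior_vte"] == 1]
no_prior = df[df["prior_vte"] == 0]

# Kaplan-Meier estimate of VTE-free time in the prior-VTE group.
kmf = KaplanMeierFitter()
kmf.fit(prior["time"], prior["vte_event"], label="prior VTE")
print(kmf.median_survival_time_)

# Log-rank test comparing the two groups.
res = logrank_test(prior["time"], no_prior["time"],
                   event_observed_A=prior["vte_event"],
                   event_observed_B=no_prior["vte_event"])
print("log-rank p =", res.p_value)
```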

Relevance: 30.00%

Abstract:

Background. End-stage liver disease (ESLD) is an irreversible condition that leads to the imminent complete failure of the liver. Orthotopic liver transplantation (OLT) has been well accepted as the best curative option for patients with ESLD. Despite the progress in liver transplantation, the major limitation nowadays is the discrepancy between donor supply and organ demand. In an effort to alleviate this situation, livers mismatched for donor and recipient gender or race are being used. However, the simultaneous impact of donor and recipient gender and race mismatching on patient survival after OLT remains unclear and relatively challenging to surgeons.

Objective. To examine the impact of donor and recipient gender and race mismatching on patient survival after OLT using the United Network for Organ Sharing (UNOS) database.

Methods. A total of 40,644 recipients who underwent OLT between 2002 and 2011 were included. Kaplan-Meier survival curves and log-rank tests were used to compare the survival rates among different donor-recipient gender and race combinations. Univariate Cox regression analysis was used to assess the association of donor-recipient gender and race mismatching with patient survival after OLT. Multivariable Cox regression analysis was used to model the simultaneous impact of donor-recipient gender and race mismatching on patient survival after OLT, adjusting for a list of other risk factors. Multivariable Cox regression analysis stratifying on recipient hepatitis C virus (HCV) status was also conducted to identify the variables that were differentially associated with patient survival in HCV+ and HCV− recipients.

Results. In the univariate analysis, compared to male donors to male recipients, female donors to male recipients had a higher risk of patient mortality (HR, 1.122; 95% CI, 1.065–1.183), while in the multivariable analysis, male donors to female recipients experienced an increased mortality rate (adjusted HR, 1.114; 95% CI, 1.048–1.184). Compared to white donors to white recipients, Hispanic donors to black recipients had a higher risk of patient mortality (HR, 1.527; 95% CI, 1.293–1.804) in the univariate analysis, and a similar result (adjusted HR, 1.553; 95% CI, 1.314–1.836) was noted in the multivariable analysis. After stratification on recipient HCV status in the multivariable analysis, HCV+ mismatched recipients appeared to be at greater risk of mortality than HCV− mismatched recipients. Female donors to female HCV− recipients (adjusted HR, 0.843; 95% CI, 0.769–0.923) and Hispanic HCV+ recipients receiving livers from black donors (adjusted HR, 0.758; 95% CI, 0.598–0.960) had a protective effect on patient survival after OLT.

Conclusion. Donor-recipient gender and race mismatching adversely affect patient survival after OLT, both independently and after adjustment for other risk factors. Recipient HCV status is an important effect modifier in the association between donor-recipient gender combination and patient survival.
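
A minimal sketch of the multivariable Cox model, assuming a recipient-level data frame with hypothetical dummy variables for donor-recipient combinations and a recipient HCV indicator; stratifying on HCV status here fits separate baseline hazards, whereas the study fitted separate models within each HCV stratum.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis file: one row per transplant recipient, with survival
# time, a death indicator, donor-recipient combination dummies, and HCV status.
df = pd.read_csv("unos_olt.csv")

cph = CoxPHFitter()
# strata= gives each HCV group its own baseline hazard while estimating common
# covariate effects; exp(coef) in the summary are adjusted hazard ratios.
cph.fit(df, duration_col="survival_days", event_col="died",
        formula="female_donor_male_recipient + male_donor_female_recipient + "
                "hispanic_donor_black_recipient + meld_score",
        strata=["hcv_positive"])
cph.print_summary()
```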

Relevance: 30.00%

Abstract:

Background. Kidney disease is a growing public health phenomenon in the U.S. and worldwide. Downstream interventions, namely dialysis and renal transplants covered by Medicare's renal disease entitlement policy for those who are 65 years and over, have been expensive treatments that are not foolproof. The shortage of kidney donors in the U.S. has grown in the last two decades. Therefore, the study of upstream events in kidney disease development and progression is justified to prevent the rising prevalence of kidney disease. Previous studies have documented the biological route by which obesity can progress and accelerate kidney disease, but health services literature quantifying the effects of overweight and obesity on economic outcomes in the context of renal disease was lacking.

Objectives. The specific aims of this study were (1) to determine the likelihood of overweight and obesity in renal disease and in three specific adult renal disease sub-populations: hypertensive, diabetic, and both hypertensive and diabetic; (2) to determine the incremental health service use and spending in overweight and obese renal disease populations; and (3) to determine who financed the cost of healthcare for renal disease in overweight and obese adult populations less than 65 years of age.

Methods. This study was a retrospective cross-sectional study of renal disease cases pooled for years 2002 to 2009 from the Medical Expenditure Panel Survey. The likelihood of overweight and obesity was estimated using the chi-square test. Negative binomial regression and a generalized gamma model with log link were used to estimate healthcare utilization and healthcare expenditures for six health event categories. Payments by self/family, public insurance, and private insurance were described for overweight and obese kidney disease sub-populations.

Results. The likelihood of overweight and obesity was 0.29 and 0.46, respectively, among the renal disease population, and obesity was common in the hypertensive and diabetic renal disease population. Among the obese renal disease population, negative binomial regression estimates of healthcare utilization per person per year, as compared to normal-weight renal disease persons, were significant for office-based provider visits and agency home health visits, respectively (p=0.001; p=0.005). Among the overweight kidney disease population, health service use was significant for inpatient hospital discharges (p=0.027). Over years 2002 to 2009, the overweight and obese renal disease sub-populations had 53% and 63% higher inpatient facility and doctor expenditures as compared to the normal-weight renal disease population, and these results were statistically significant (p=0.007; p=0.026). The overweight renal disease population had significant total expenses per person per year for office-based and outpatient-associated care. Overweight and obese renal disease persons paid less out of pocket overall compared to the normal-weight renal disease population. Medicare and Medicaid had the highest mean annual payments for obese renal disease persons, while mean annual payments were highest for private insurance among the normal-weight renal disease population.

Conclusion. Overweight and obesity were common in those with acute and chronic kidney disease and resulted in higher healthcare spending and increased utilization of office-based providers, hospital inpatient departments, and agency home healthcare. Healthcare for overweight and obese renal disease persons younger than 65 years of age was financed more by private and public insurance and less by out-of-pocket payments. With the increasing epidemic of obesity in the U.S. and the aging of the baby boomer population, the findings of the present study have implications for public health and for greater dissemination of healthcare resources to prevent, manage, and delay the onset of overweight and obesity that can progress and accelerate the course of kidney disease.
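
A minimal sketch of the utilization model, assuming a respondent-level data frame with hypothetical columns for annual office-based visit counts, BMI group, age, and sex; a negative binomial GLM handles over-dispersed counts, and exponentiated coefficients are rate ratios relative to the normal-weight reference group.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical analysis file: one MEPS respondent with renal disease per row.
df = pd.read_csv("meps_renal.csv")

# Negative binomial regression for annual office-based provider visits.
nb = smf.glm("office_visits ~ C(bmi_group, Treatment('normal')) + age + C(sex)",
             data=df, family=sm.families.NegativeBinomial()).fit()
print(np.exp(nb.params))  # rate ratios vs. the normal-weight group
```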

Relevance: 30.00%

Abstract:

The intensity of care for patients at the end of life has been increasing in recent years. Publications have focused on intensity of care for many cancers, but none on melanoma patients. Substantial gaps exist in knowledge about intensive care and its alternative, hospice care, among advanced melanoma patients at the end of life. End-of-life care may be used in quite different patterns and induce both intended and unintended clinical and economic consequences. We used the Surveillance, Epidemiology, and End Results (SEER)-Medicare linked databases to identify patients aged 65 years or older with metastatic melanoma who died between 2000 and 2007. We evaluated trends and associations between sociodemographic and health services characteristics and the use of hospice care, chemotherapy, surgery, and radiation therapy, as well as costs. Survival, end-of-life costs, and the incremental cost-effectiveness ratio were evaluated using propensity score methods. Costs were analyzed from the perspective of Medicare in 2009 dollars.

In the first journal article, we found increasing use of surgery for patients with metastatic melanoma, from 13% in 2000 to 30% in 2007 (P=0.03 for trend), and no significant fluctuation in use of chemotherapy (P=0.43) or radiation therapy (P=0.46). Older patients were less likely to receive radiation therapy or chemotherapy. The use of hospice care increased from 61% in 2000 to 79% in 2007 (P=0.07 for trend). Enrollment in short-term (1-3 days) hospice care increased, while long-term hospice care (≥ 4 days) remained stable. Patients living in the SEER Northeast and South regions were less likely to undergo surgery. Patients enrolled in long-term hospice care used significantly less chemotherapy, surgery, and radiation therapy.

In the second journal article, of 611 patients identified for this study, 358 (59%) received no hospice care after their diagnosis, 168 (27%) received 1 to 3 days of hospice care, and 85 (14%) received 4 or more days of hospice care. The median survival time was 181 days for patients with no hospice care, 196 days for patients enrolled in hospice for 1 to 3 days, and 300 days for patients enrolled for 4 or more days (log-rank test, P < 0.001). The estimated hazard ratios (HR) between 4 or more days of hospice use and survival were similar within the original cohort Cox proportional hazards model (HR, 0.62; 95% CI, 0.49-0.78, P < 0.0001) and the propensity score-matched model (HR, 0.61; 95% CI, 0.47-0.78, P = 0.0001). Patients with ≥ 4 days of hospice care incurred lower end-of-life costs than the other two groups ($14,298 versus $19,380 for the 1- to 3-days hospice care group, and $24,351 for patients with no hospice care; p < 0.0001).

In conclusion, surgery and hospice care use increased over the years of this study, while the use of chemotherapy and radiation therapy remained consistent for patients diagnosed with metastatic melanoma. Patients diagnosed with advanced melanoma who enrolled in ≥ 4 days of hospice care experienced longer survival than those who had 1-3 days of hospice care or no hospice care, and this longer overall survival was accompanied by lower end-of-life costs.
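
A minimal sketch of the propensity-score-matched survival comparison, assuming a patient-level data frame with hypothetical columns for the hospice exposure, survival time, death indicator, and confounders; it uses a simple greedy 1:1 nearest-neighbour match, which may differ from the study's matching algorithm.

```python
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

# Hypothetical analysis file: one melanoma decedent per row.
df = pd.read_csv("seer_medicare_melanoma.csv")

# 1. Propensity score: probability of >= 4 hospice days given observed covariates.
ps_model = smf.logit("hospice_4plus ~ age + C(sex) + C(region) + comorbidity",
                     data=df).fit()
df["ps"] = ps_model.predict(df)

# 2. Greedy 1:1 nearest-neighbour matching on the propensity score.
treated = df[df["hospice_4plus"] == 1]
controls = df[df["hospice_4plus"] == 0].copy()
matched_ids = []
for idx, row in treated.iterrows():
    j = (controls["ps"] - row["ps"]).abs().idxmin()  # closest unmatched control
    matched_ids.extend([idx, j])
    controls = controls.drop(j)                      # match without replacement
matched = df.loc[matched_ids]

# 3. Cox model for survival within the matched sample.
cph = CoxPHFitter().fit(matched[["survival_days", "died", "hospice_4plus"]],
                        duration_col="survival_days", event_col="died")
cph.print_summary()  # exp(coef) is the hazard ratio for >= 4 hospice days
```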

Relevance: 30.00%

Abstract:

The euphotic depth (Zeu) is a key parameter in modelling primary production (PP) using satellite ocean colour. However, evaluations of satellite Zeu products are scarce. The objective of this paper is to investigate existing approaches and sensors to estimate Zeu from satellite and to evaluate how different Zeu products might affect the estimation of PP in the Southern Ocean (SO). Euphotic depth was derived from MODIS and SeaWiFS products of (i) surface chlorophyll-a (Zeu-Chla) and (ii) inherent optical properties (Zeu-IOP). They were compared with in situ measurements of Zeu from different regions of the SO. Both approaches and sensors are robust to retrieve Zeu, although the best results were obtained using the IOP approach and SeaWiFS data, with an average percentage of error (E) of 25.43% and mean absolute error (MAE) of 0.10 m (log scale). Nevertheless, differences in the spatial distribution of Zeu-Chla and Zeu-IOP for both sensors were found as large as 30% over specific regions. These differences were also observed in PP. On average, PP based on Zeu-Chla was 8% higher than PP based on Zeu-IOP, but it was up to 30% higher south of 60°S. Satellite phytoplankton absorption coefficients (aph) derived by the Quasi-Analytical Algorithm at different wavelengths were also validated and the results showed that MODIS aph are generally more robust than SeaWiFS. Thus, MODIS aph should be preferred in PP models based on aph in the SO. Further, we reinforce the importance of investigating the spatial differences between satellite products, which might not be detected by the validation with in situ measurements due to the insufficient amount and uneven distribution of the data.
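
The evaluation statistics quoted above (average percentage of error and MAE on a log scale) can be computed directly from matched satellite and in situ Zeu pairs. A minimal sketch with illustrative values, using common definitions of these metrics that may differ in detail from the paper's:

```python
import numpy as np

# Matched satellite-derived and in situ euphotic depths (metres);
# the values below are illustrative only.
zeu_sat = np.array([52.0, 61.0, 44.0, 70.0])
zeu_insitu = np.array([48.0, 66.0, 40.0, 75.0])

# Average percentage of error (E) relative to the in situ measurements.
E = 100.0 * np.mean(np.abs(zeu_sat - zeu_insitu) / zeu_insitu)
# Mean absolute error in log10 space.
MAE_log = np.mean(np.abs(np.log10(zeu_sat) - np.log10(zeu_insitu)))
print(f"E = {E:.2f}%, MAE (log10) = {MAE_log:.3f}")
```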