865 results for 12-MONTH PREVALENCE


Relevance: 80.00%

Abstract:

This article proposes a combined technique including bone grafting, a connective tissue graft, and a coronally advanced flap to create space for simultaneous bone regrowth and root coverage. A 23-year-old female was referred to our private clinic with a severe Miller class II recession and a lack of attached gingiva. The suggested treatment plan comprised root coverage combined with xenograft bone particles. The grafted area healed well and full coverage was achieved at the 12-month follow-up visit. Bone-added periodontal plastic surgery can be considered a practical procedure for the management of deep gingival recession without a buccal bone plate.

Relevance: 80.00%

Abstract:

Dispersal limitation is often involved when the species composition of a dry abandoned grassland shows a slow response to resumed regular mowing. A seed-addition experiment, using 32 species that do not belong to the local species pool, was performed on Monte San Giorgio (southern Switzerland) to test whether the low recruitment success was due to dispersal limitation or to unfavourable microsite conditions. In October 1997, 20 species were individually sown in six 3 × 4 m blocks of a 2 × 2 factorial “partial” split-plot design with treatments of abandonment vs. mowing and undisturbed vs. root-removed soil, the latter being applied in small naturally degradable pots. Moreover, 12 species were sown only in the treatments on undisturbed soil. Seedlings of sown and spontaneously germinating seeds were observed on 16 occasions over one 12-month period. Seeds of 31 of the 32 species germinated. Twenty-four species showed germination rates higher than 5% and different seasonal germination patterns. Established vegetation, especially the tussocks of Molinia arundinacea, reduced the quality of microsites for germination. Whereas a few species germinated better under the litter of Molinia arundinacea, many more germinated better under the more variable microsite conditions of a mown grassland. Only a few seedlings of 25 of the 31 germinated species survived until October 1998. Seedling survival was negatively affected by litter, unfavourable weather conditions (frost and dry periods followed by heavy rains), and herbivory (slugs and grasshoppers). Tussocks of Molinia arundinacea, however, tended to protect seedlings. The poor establishment success of “new” species observed in abandoned meadows on Monte San Giorgio after resumed mowing is due to both dispersal and microsite limitations.

Relevance: 80.00%

Abstract:

Information about the size of a tumor and its temporal evolution is needed for the diagnosis as well as the treatment of brain tumor patients. The aim of the study was to investigate the potential of a fully automatic segmentation method, called BraTumIA, for longitudinal brain tumor volumetry by comparing the automatically estimated volumes with ground truth data acquired via manual segmentation. Longitudinal magnetic resonance (MR) imaging data of 14 patients with newly diagnosed glioblastoma, encompassing 64 MR acquisitions ranging from preoperative to 12-month follow-up images, were analysed. Manual segmentation was performed by two human raters. Strong correlations (R = 0.83-0.96, p < 0.001) were observed between the volumetric estimates of BraTumIA and those of each of the human raters for the contrast-enhancing (CET) and non-enhancing T2-hyperintense (NCE-T2) tumor compartments. A quantitative analysis of the inter-rater disagreement showed that the disagreement between BraTumIA and each of the human raters was comparable to the disagreement between the human raters. In summary, BraTumIA generated volumetric trend curves of contrast-enhancing and non-enhancing T2-hyperintense tumor compartments comparable to the estimates of human raters. These findings suggest the potential of automated longitudinal tumor segmentation to substitute for manual volumetric follow-up of contrast-enhancing and non-enhancing T2-hyperintense tumor compartments.
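A minimal sketch of the kind of agreement analysis described above: Pearson correlation between automated and manual volumes, plus pairwise absolute disagreement. The volumes and variable names are hypothetical illustrations, not the study's data.

```python
# Hypothetical tumor-compartment volumes (mL); not the study's measurements.
import numpy as np
from scipy.stats import pearsonr

bratumia = np.array([12.4, 30.1, 5.7, 22.8, 44.0, 18.3])   # automated estimates
rater_a  = np.array([13.0, 28.9, 6.1, 21.5, 45.2, 17.6])   # manual rater A
rater_b  = np.array([12.1, 31.4, 5.3, 23.9, 42.8, 19.0])   # manual rater B

for name, manual in [("rater A", rater_a), ("rater B", rater_b)]:
    r, p = pearsonr(bratumia, manual)
    print(f"BraTumIA vs {name}: R = {r:.2f}, p = {p:.3f}")

# Compare automated-vs-human disagreement with human-vs-human disagreement
print("mean |BraTumIA - rater A|:", np.mean(np.abs(bratumia - rater_a)))
print("mean |rater A - rater B| :", np.mean(np.abs(rater_a - rater_b)))
```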

Relevance: 80.00%

Abstract:

BACKGROUND In recent years, the scientific discussion has focused on new strategies to enable a torn anterior cruciate ligament (ACL) to heal into mechanically stable scar tissue. Dynamic intraligamentary stabilization (DIS) was first performed in a pilot study of 10 patients. The purpose of the current study was to evaluate whether DIS would lead to similarly sufficient stability and good clinical function in a larger case series. METHODS Acute ACL ruptures were treated by using an internal stabilizer, combined with anatomical repositioning of torn bundles and microfracturing to promote self-healing. Clinical assessment (Tegner, Lysholm, IKDC, and visual analogue scale [VAS] for patient satisfaction scores) and assessment of knee laxity was performed at 3, 6, 12, and 24 months. A one-sample design with a non-inferiority margin was chosen to compare the preoperative and postoperative IKDS and Lysholm scores. RESULTS 278 patients with a 6:4 male to female ratio were included. Average patient age was 31 years. Preoperative mean IKDC, Lysholm, and Tegner scores were 98.8, 99.3, and 5.1 points, respectively. The mean anteroposterior (AP) translation difference from the healthy contralateral knee was 4.7 mm preoperatively. After DIS treatment, the mean 12-month IKDC, Lysholm, and Tegner scores were 93.6, 96.2, and 4.9 points, respectively, and the mean AP translation difference was 2.3 mm. All these outcomes were significantly non-inferior to the preoperative or healthy contralateral values (p < 0.0001). Mean patient satisfaction was 8.8 (VAS 0-10). Eight ACL reruptures occurred and 3 patients reported insufficient subjective stability of the knee at the end of the study period. CONCLUSIONS Anatomical repositioning, along with DIS and microfracturing, leads to clinically stable healing of the torn ACL in the large majority of patients. Most patients exhibited almost normal knee function, reported excellent satisfaction, and were able to return to their previous levels of sporting activity. Moreover, this strategy resulted in stable healing of all sutured menisci, which could lower the rate of osteoarthritic changes in future. The present findings support the discussion of a new paradigm in ACL treatment based on preservation and self-healing of the torn ligament.
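A minimal sketch of a one-sample non-inferiority comparison of the kind described above. The scores and the margin are hypothetical; the abstract does not report the raw data or the margin that was actually used.

```python
# One-sample non-inferiority test: is the postoperative mean greater than the
# reference score minus a margin? (hypothetical data and margin)
import numpy as np
from scipy.stats import ttest_1samp

postop_lysholm = np.array([95, 99, 92, 100, 97, 94, 98, 96, 93, 99], dtype=float)
reference = 99.3      # preoperative mean reported in the abstract
margin = 5.0          # hypothetical non-inferiority margin

t, p = ttest_1samp(postop_lysholm, popmean=reference - margin, alternative="greater")
print(f"t = {t:.2f}, one-sided p = {p:.4f}")  # small p supports non-inferiority
```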

Relevance: 80.00%

Abstract:

PURPOSE Few studies have used multivariate models to quantify the effect of multiple previous spine surgeries on patient-oriented outcome after spine surgery. This study sought to quantify the effect of prior spine surgery on 12-month postoperative outcomes in patients undergoing surgery for different degenerative disorders of the lumbar spine. METHODS The study included 4940 patients with lumbar degenerative disease documented in the Spine Tango Registry of EUROSPINE, the Spine Society of Europe, from 2004 to 2015. Preoperatively and 12 months postoperatively, patients completed the multidimensional Core Outcome Measures Index (COMI; 0-10 scale). Patients' medical history and surgical details were recorded using the Spine Tango Surgery 2006 and 2011 forms. Multiple linear regression models were used to investigate the relationship between the number of previous surgeries and the 12-month postoperative COMI score, controlling for the baseline COMI score and other potential confounders. RESULTS In the adjusted model including all cases, the 12-month COMI score showed a 0.37-point worse value [95 % confidence intervals (95 % CI) 0.29-0.45; p < 0.001] for each additional prior spine surgery. In the subgroup of patients with lumbar disc herniation, the corresponding effect was 0.52 points (95 % CI 0.27-0.77; p < 0.001) and in lumbar degenerative spondylolisthesis, 0.40 points (95 % CI 0.17-0.64; p = 0.001). CONCLUSIONS We were able to demonstrate a clear "dose-response" effect for previous surgery: the greater the number of prior spine surgeries, the systematically worse the outcome at 12 months' follow-up. The results of this study can be used when considering or consenting a patient for further surgery, to better inform the patient of the likely outcome and to set realistic expectations.
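A minimal sketch of the adjusted regression described above, using simulated data and hypothetical column names rather than the Spine Tango Registry: the 12-month COMI score is regressed on the number of prior surgeries, controlling for the baseline score and another covariate.

```python
# Simulated illustration only; the 0.37-point-per-surgery effect is the value
# reported in the abstract and is built into the simulated outcome.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "comi_baseline": rng.uniform(4, 10, n),
    "n_prior_surgeries": rng.poisson(0.6, n),
    "age": rng.normal(58, 12, n),
})
df["comi_12m"] = (1.5 + 0.45 * df.comi_baseline + 0.37 * df.n_prior_surgeries
                  + 0.01 * df.age + rng.normal(0, 1.5, n))

model = smf.ols("comi_12m ~ comi_baseline + n_prior_surgeries + age", data=df).fit()
print(model.params["n_prior_surgeries"],            # estimated points per prior surgery
      model.conf_int().loc["n_prior_surgeries"].values)
```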

Relevance: 80.00%

Abstract:

OBJECTIVE To illustrate an approach to compare CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and The Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared 3 CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS In 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6 months and 0.82 (0.46 to 1.47) for the 9-12 month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (RNA >200) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54) and 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3). The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
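A minimal sketch of inverse-probability weighting of the kind described above, reduced to a single-time-point comparison of two strategies with simulated data and hypothetical column names; the actual analysis used longitudinal weighting of repeated measurements.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "strategy_6mo": rng.integers(0, 2, n),   # 1 = 6-month monitoring, 0 = 3-month
    "age": rng.normal(40, 10, n),
    "baseline_cd4": rng.normal(350, 100, n),
    "failure": rng.binomial(1, 0.1, n),      # virologic failure by 18 months
})

# Probability of following the 6-month strategy given baseline covariates
ps_model = LogisticRegression().fit(df[["age", "baseline_cd4"]], df["strategy_6mo"])
ps = ps_model.predict_proba(df[["age", "baseline_cd4"]])[:, 1]

# Stabilized inverse-probability weights
p_marginal = df["strategy_6mo"].mean()
w = np.where(df["strategy_6mo"] == 1, p_marginal / ps, (1 - p_marginal) / (1 - ps))

# Weighted 18-month risk of failure in each arm and the resulting risk ratio
treated = df["strategy_6mo"] == 1
risk_6mo = np.average(df.loc[treated, "failure"], weights=w[treated])
risk_3mo = np.average(df.loc[~treated, "failure"], weights=w[~treated])
print("IP-weighted risk ratio (6-month vs 3-month):", risk_6mo / risk_3mo)
```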

Relevance: 80.00%

Abstract:

Advances in medical technology, in genetics, and in clinical research have led to early detection of cancer, precise diagnosis, and effective treatment modalities. The decline in cancer incidence and in mortality due to cancer has led to an increased number of long-term survivors. However, the ethnic minority population has not experienced this decline and continues to carry a disparate proportion of the cancer burden. The majority of clinical research, including survivorship studies, has recruited and continues to recruit convenience samples of middle- to upper-class Caucasian survivors. Thus, minorities are underrepresented in cancer research, in terms of both clinical studies and health-related quality of life (HRQOL) studies. Lifestyle and diet have been associated with increased risk of breast cancer. A high-vegetable, low-fat diet has been shown to reduce recurrence of breast cancer and early death. The Women's Healthy Eating and Living (WHEL) Study is an ongoing multi-site randomized controlled trial evaluating a high-vegetable, low-fat diet for reducing the recurrence of breast cancer and early death. The purpose of this dissertation was to (1) compare the impact of the modified diet on HRQOL during the first 12-month period for specific minorities and matched Caucasians; (2) identify predictors that significantly impact the HRQOL of the study participants; and (3) using structural equation modeling, assess the impact of nutrition on the HRQOL of the intervention group participants. Findings suggest that there are no significant differences in change in HRQOL between minorities and Caucasians; between minorities in the intervention group and those in the comparison group; or between women in the intervention group and those in the comparison group. The minority indicator variable and the intervention/comparison group indicator variable were not found to be good predictors of HRQOL. Although the structural equation models suggested a viable representation of the relationship between the antecedent variables, the mediating variables, and the two outcome variables, the impact of nutrition was not statistically significant and was not included in the model. By analyzing the HRQOL of minorities in the WHEL Study, this dissertation attempted to add to the knowledge base specific to minority cancer survivors.

Relevance: 80.00%

Abstract:

With substance abuse treatment expanding in prisons and jails, understanding how behavior change interacts with a restricted setting becomes more essential. The Transtheoretical Model (TTM) has been used to understand intentional behavior change in unrestricted settings; however, evidence indicates that restricted settings can affect the measurement and structure of the TTM constructs. The present study examined data from problem drinkers at baseline and end-of-treatment from three studies: (1) Project CARE (n = 187) recruited inmates from a large county jail; (2) Project Check-In (n = 116) recruited inmates from a state prison; (3) Project MATCH, a large multi-site alcohol study, had two recruitment arms, aftercare (n = 724 pre-treatment and 650 post-treatment) and outpatient (n = 912 pre-treatment and 844 post-treatment). The analyses used cross-sectional data to test for non-invariance of measures of the TTM constructs (readiness, confidence, temptation, and processes of change) across restricted and unrestricted settings using structural equation modeling (SEM). Two restricted groups (jail and aftercare) and one unrestricted group (outpatient) entering treatment, and one restricted (prison) and two unrestricted groups (aftercare and outpatient) at end-of-treatment, were contrasted. In addition, TTM end-of-treatment profiles were tested as predictors of 12-month drinking outcomes (profile analysis). Although SEM did not indicate structural differences in the overall TTM construct model across setting types, there were factor-structure differences on the confidence and temptation constructs at pre-treatment and in the factor structure of the behavioral processes at end-of-treatment. For pre-treatment temptation and confidence, differences were found in the social-situations factor loadings and in the variance of the confidence and temptation latent factors. For the end-of-treatment behavioral processes, differences across the restricted and unrestricted settings were identified in the counter-conditioning and stimulus-control factor loadings. The TTM end-of-treatment profiles were not predictive of drinking outcomes in the prison sample. Both pre- and post-treatment differences in structure across setting types involved constructs operationalized with behaviors that are limited for those in restricted settings. These studies suggest the TTM is a viable model for explicating addictive behavior change in restricted settings but call for modification of subscale items that refer to specific behaviors and for caution in interpreting mean differences across setting types for problem drinkers.

Relevance: 80.00%

Abstract:

In the Practice Change Model, physicians act as key stakeholders: people who have both an investment in the practice and the capacity to influence how the practice performs. This leadership role is critical to the development and change of the practice, and leadership roles and effectiveness are an important factor in quality improvement in primary care practices. The study involved a comparative case study analysis to identify leadership roles and the relationship between leadership roles and the number and type of quality improvement strategies adopted during a Practice Change Model-based intervention study. The research utilized secondary data from four primary care practices with various leadership styles. The practices are located in the San Antonio region and serve a large Hispanic population. The data were collected by two ABC Project Facilitators from each practice during a 12-month period and included Key Informant Interviews (all staff members), the Multi-method Assessment Process (MAP), and Practice Facilitation field notes. These data were used to evaluate leadership styles, management within the practice, and the intervention tools that were implemented. The chief steps were (1) to analyze whether leader-member relations contribute to the type of quality improvement strategy or strategies selected, (2) to investigate whether leader-position power contributes to the number and type of strategies selected, and (3) to explore whether task structure varies across the four primary care practices. The research found that involving more members of the clinic staff in decision-making, building bridges between organizational staff and clinical staff, and task structure are all directly associated with the number and type of quality improvement strategies implemented in primary care practice. Although this research investigated the leadership styles of only four practices, it offers future guidance on how to establish the priorities and implementation of quality improvement strategies that will have the greatest impact on patient care improvement.

Relevance: 80.00%

Abstract:

Background. EAP programs for airline pilots in companies with a well-developed recovery management program are known to reduce pilot absenteeism following treatment. Given the costs and safety consequences to society, it is important to identify pilots who may be experiencing an alcohol or other drug (AOD) disorder and get them into treatment. Hypotheses. This study investigated the predictive power of workplace absenteeism in identifying AOD disorders. The first hypothesis was that higher absenteeism in a 12-month period is associated with a higher risk that an employee is experiencing an AOD disorder. The second hypothesis was that AOD treatment would reduce subsequent absence rates and the costs of replacing pilots on missed flights. Methods. A case-control design using eight years of monthly archival absence data (53,000 pay records) was conducted with a sample of employees having an AOD diagnosis (cases, N = 76) matched 1:4 with non-diagnosed employees (controls, N = 304) of the same profession and company (male commercial airline pilots). Cases and controls were matched on age, rank, and date of hire. Absence rate was defined as sick-time hours used divided by the sum of the minimum-guarantee pay hours, annualized using the months the pilot worked during the year. Conditional logistic regression was used to determine whether absence predicts employees experiencing an AOD disorder, starting 3 years prior to the cases receiving the AOD diagnosis. A repeated-measures ANOVA, t tests, and rate ratios (with 95% confidence intervals) were used to determine differences between cases and controls in absence usage for 3 years pre- and 5 years post-treatment. Mean replacement costs were calculated for sick leave used 3 years pre- and 5 years post-treatment to estimate the cost of sick leave from the perspective of the company. Results. Sick leave, as measured by absence rate, predicted the risk of being diagnosed with an AOD disorder (OR 1.10, 95% CI 1.06-1.15) during the 12 months prior to receiving the diagnosis. Mean absence rates for diagnosed employees increased over the three years before treatment, particularly in the year before treatment, whereas the controls' did not (three years, mean 6.80 vs. 5.52; two years, 7.81 vs. 6.30; one year, 11.00 for cases vs. 5.51 for controls). In the first year post-treatment compared with the year prior to treatment, rate ratios indicated a significant (60%) post-treatment reduction in absence rates (OR = 0.40, 95% CI 0.28-0.57). Absence rates for cases remained lower than those of controls for the first three years after completion of treatment. Upon discharge from the FAA's and the company's three-year AOD monitoring program, cases' absence rates increased slightly during the fourth year (controls, mean = 0.09, SD = 0.14; cases, mean = 0.12, SD = 0.21). The following year, however, their mean absence rates were again below those of the controls (controls, mean = 0.08, SD = 0.12; cases, mean = 0.06, SD = 0.07). Costs associated with replacing pilots who called in sick were 60% lower in the first year after returning to work than in the year of diagnosis, and the reduction in replacement costs continued over the next two years for the treated employees. Conclusions. This research demonstrates the potential for workplace absences to serve as an active organizational surveillance mechanism to assist managers and supervisors in identifying employees who may be experiencing, or at risk of experiencing, an alcohol/drug disorder.
Currently, many workplaces rely only on performance problems and ignore the employee's absence record. A referral to an EAP or an alcohol/drug evaluation based on the employee's absence/sick-leave record, as incorporated into company policy, can provide another useful indicator that may also carry less stigma, thus reducing barriers to seeking help. This research also confirms two conclusions heretofore based only on cross-sectional studies: (1) higher absence rates are associated with employees experiencing an AOD disorder; (2) treatment is associated with lower costs for replacing absent pilots. Due to the uniqueness of the employee population studied (commercial airline pilots) and the organizational documentation of absence, the generalizability of this study to other professions and occupations should be considered limited. Transition to Practice. The odds ratios for the relationship between absence rates and an AOD diagnosis are precise; the OR for the year of diagnosis indicates that the likelihood of being diagnosed increases 10% for every one-hour change in sick leave taken. In practice, however, a pilot uses approximately 20 hours of sick leave for one trip, because the replacement will have to be paid the guaranteed minimum of 20 hours. Thus, the rate based on hourly changes is precise but not practical. To provide the organization with practical recommendations, the yearly mean absence rates were used. A pilot flies, on average, 90 hours a month, or 1,080 hours annually. Cases used almost twice the mean rate of sick time in the year prior to diagnosis (T-1) compared with controls (cases, mean = 0.11; controls, mean = 0.06). Cases are therefore expected to use, on average, 119 hours annually (total annual hours × mean annual absence rate), while controls will use about 60 hours. The controls' 60 hours could translate to 3 trips of 20 hours each. Management could use a standard of 80 hours or more of sick time claimed in a year as the threshold for unacceptable absence, a 25% increase over the controls (a cost to the company of approximately $4,000). At the 80-hour mark, the Chief Pilot would be able to call the pilot in for a routine check into the nature of the pilot's excessive absence. This management action would be based on a company standard rather than on a behavioral or performance issue. Using absence data in this fashion would make it an active surveillance mechanism.
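A worked version of the "Transition to Practice" arithmetic above, using the figures quoted in the abstract (90 flight hours per month, mean absence rates of 0.11 for cases and 0.06 for controls, 20 replacement hours per trip, an 80-hour threshold).

```python
annual_hours = 90 * 12                       # ~1,080 flight hours per year

case_rate, control_rate = 0.11, 0.06         # mean annual absence rates at T-1
expected_case_hours = annual_hours * case_rate        # ~119 sick-leave hours/year
expected_control_hours = annual_hours * control_rate  # ~65 hours (abstract cites 60)

trip_hours = 20                              # guaranteed minimum replacement pay per trip
trips_for_controls = expected_control_hours // trip_hours   # ~3 trips

threshold = 80                               # proposed company standard (hours/year)
print(expected_case_hours, expected_control_hours, trips_for_controls, threshold)
```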

Relevance: 80.00%

Abstract:

Emergency departments (EDs) have been called the net below the safety net because of their long history of providing care to the uninsured and others lacking access to the healthcare system. In recent years, those with Medicaid and, more recently, those with Medicare have also been utilizing the ED as a medical home for routine primary care. There are many reasons for this, but the costs to the community have become increasingly burdensome. To evaluate how often the ED is being utilized for primary care, we applied a standardized tool, the New York University Algorithm, to over 43,000 ED visits that did not require hospitalization, made by Hardin, Jefferson, and Orange County residents over a 12-month period. We compared our results with Harris County, where studies using the same framework have been performed, and found that sizeable segments of the population in both areas are utilizing the ED for non-emergent primary care that could be treated in a more cost-effective community setting. We also analyzed our dataset for visit-specific characteristics. We found evidence of two possible health care disparities: (1) Blacks had a higher rate of primary care-related ED visits relative to their percentage of the population when compared with other racial/ethnic groups; and (2) when form of payment is considered, the uninsured were more likely to have a primary care-related ED visit than any other group. These findings suggest a lack of community-based primary care services for the medically needy in Southeast Texas. We believe that studies such as this are warranted elsewhere in Texas as well. We plan to present our findings to local policy makers, who should find this information helpful in identifying gaps in the safety net and in better allocating scarce community resources.

Relevance: 80.00%

Abstract:

Introduction. Despite the ban on lead-containing gasoline and paint, childhood lead poisoning remains a public health issue. Furthermore, a Medicaid-eligible child is 8 times more likely to have an elevated blood lead level (EBLL) than a non-Medicaid child, which is the primary reason for the early-detection lead screening mandate at ages 12 and 24 months among the Medicaid population. Based on field observations, there was evidence suggesting a screening compliance issue. Objective. The purpose of this study was to analyze blood lead screening compliance in previously lead-poisoned Medicaid children and to test for an association between timely lead screening and timely childhood immunizations. The mean months between follow-up tests were also examined for a significant difference between non-compliant and compliant lead-screened children. Methods. Access to the surveillance data on all childhood lead poisoning cases in Bexar County was granted by the San Antonio Metropolitan Health District. A database was constructed and analyzed using descriptive statistics, logistic regression methods, and non-parametric tests. Lead screening at 12 months of age was analyzed separately from lead screening at 24 months. The small portion of the population who were related was included in one analysis and removed from a second analysis to check for significance. Gender, ethnicity, age of home, and having a sibling with an EBLL were ruled out as confounders for the association tests, but ethnicity and age of home were adjusted for in the non-parametric tests. Results. There was a strong significant association between lead screening compliance at 12 months and childhood immunization compliance, with or without including related children (p<0.00). However, there was no significant association between the two variables at the age of 24 months. Furthermore, there was no significant difference in the median of the mean months between follow-up blood tests for the non-compliant versus compliant lead-screened populations in the 12-month screening group, but there was a significant difference in the 24-month screening group (p<0.01). Discussion. Descriptive statistics showed that 61% and 56% of the previously lead-poisoned Medicaid population did not receive their mandated 12- and 24-month lead screening on time, respectively. This suggests that their elevated blood lead levels might have been detected earlier in childhood had screening occurred on time. Furthermore, a child who was compliant with lead screening at 12 months of age was 2.36 times more likely to also receive childhood immunizations on time compared with a child who was not compliant with the 12-month screening. Even though no statistically significant association was found for the 24-month group, the public health significance of a screening compliance issue is no less important. The Texas Medicaid program needs to enforce lead screening compliance because it is evident that no monitoring system has been in place. Further recommendations include an increased focus on parental education and on the importance of taking children for wellness exams on time.
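A minimal sketch of how an odds ratio such as the reported 2.36 can be computed from a 2x2 table. The cell counts below are hypothetical and chosen only so that the example reproduces an OR of 2.36; the study's actual counts are not given in the abstract.

```python
import numpy as np

a, b = 59, 25   # screening-compliant at 12 months: immunizations on time / not on time
c, d = 40, 40   # screening non-compliant:          immunizations on time / not on time

odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)                       # Woolf standard error
ci_low, ci_high = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```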

Relevance: 80.00%

Abstract:

Seasonal variation in menarche, menstrual cycle length, and menopause was investigated using Tremin Trust data. In addition, self-reported hot flash data for women with natural and surgically induced menopause were analyzed for rhythms. Menarche data from approximately 600 U.S. women born between 1940 and 1970 revealed a 6-month rhythm (first acrophase in January, double amplitude of 58%M). A notable shift from a December-January peak in menarche for those born in the 1940s and 1950s to an August-September peak for those born in the 1960s was observed. Groups of girls 8-14 and 15-17 yr old at menarche exhibited seasonal patterns of menarche occurrence that differed from each other by about 6 months. Girls experiencing menarche during August-October were statistically significantly younger than those experiencing it at other times. Season of birth was not associated with season of menarche. The lengths of approximately 150,000 menstrual intervals of U.S. women were analyzed for seasonality. Menstrual intervals possibly disturbed by natural events (e.g., childbirth) or other events (e.g., surgery, medication) were excluded. No 6- or 12-month rhythmicities were found for specific interval lengths (14-24, 25-31, and 32-56 days) or for ages in relation to menstrual interval (9-11, 12-13, 15-19, 20-24, 25-39, 40-44, and 44 yr old and older). Hot flash frequency did not differ between the 14 women experiencing natural menopause (NM) and the 11 experiencing surgically induced menopause (SIM). Hot flashes in NM women exhibited 12- and 8-hr, but not 24-hr, rhythmicities. Hot flashes in SIM women exhibited 24- and 12-hr, but not 8-hr, rhythmicities. Regardless of type of menopause, women with a peak frequency of hot flashes in the morning (0400 through 0950) were distinguishable from those with a peak in the evening (1600 through 2159). Data from approximately 200 U.S. women revealed a 6-month rhythm in menopause with the first peak in May. No significant 12-month variation in menopause was detected by Cosinor analysis. Season of birth and age at menopause were not associated with season of menopause. Age at menopause declined significantly over the years for women born between 1907 and 1926, inclusive.
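A minimal sketch of the single-component cosinor (least-squares cosine) fit commonly used for this kind of rhythm detection, applied to simulated monthly counts rather than the Tremin Trust data; the 6-month period, parameter names, and data are illustrative assumptions.

```python
# Fit Y(t) = MESOR + A*cos(2*pi*t/P) + B*sin(2*pi*t/P) by ordinary least squares.
import numpy as np

months = np.arange(1, 13)                       # calendar month of the event
counts = np.array([60, 55, 48, 45, 50, 57, 62, 58, 49, 44, 51, 59], dtype=float)

period = 6.0                                    # months; test a 6-month rhythm
x = 2 * np.pi * months / period
design = np.column_stack([np.ones_like(x), np.cos(x), np.sin(x)])
mesor, A, B = np.linalg.lstsq(design, counts, rcond=None)[0]

amplitude = np.hypot(A, B)                      # rhythm amplitude
acrophase = np.arctan2(-B, A)                   # radians; timing of the peak
print(f"MESOR = {mesor:.1f}, amplitude = {amplitude:.1f}, "
      f"double amplitude = {2 * amplitude / mesor * 100:.0f}% of MESOR")
```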

Relevance: 80.00%

Abstract:

This study demonstrated that accurate, short-term forecasts of Veterans Affairs (VA) hospital utilization can be made using the Patient Treatment File (PTF), the inpatient discharge database of the VA. Accurate short-term forecasts of two years or less can reduce required inventory levels, improve the allocation of resources, and are essential for better financial management; all are necessary achievements in an era of cost containment. Six years of non-psychiatric discharge records were extracted from the PTF and used to calculate four indicators of VA hospital utilization: average length of stay, discharge rate, multi-stay rate (a measure of readmissions), and days of care provided. National and regional levels of these indicators were described and compared for fiscal year 1984 (FY84) through FY89, inclusive. Using the observed levels of utilization for the 48 months between FY84 and FY87, five techniques were used to forecast monthly levels of utilization for FY88 and FY89. Forecasts were compared with the observed levels of utilization for these years. Monthly forecasts were also produced for FY90 and FY91. Forecasts for days of care provided were not produced: current inpatients with very long lengths of stay contribute a substantial amount of this indicator, so it cannot be accurately calculated. During the six-year period between FY84 and FY89, average length of stay declined substantially, nationally and regionally. The discharge rate was relatively stable, while the multi-stay rate increased slightly during this period. FY90 and FY91 forecasts show a continued decline in the average length of stay, while the discharge rate is forecast to decline slightly and the multi-stay rate to increase very slightly. Over a 24-month-ahead period, all three indicators were forecast within a 10 percent average monthly error; the 12-month-ahead forecast errors were slightly lower. Average length of stay was less easily forecast, while the multi-stay rate was the easiest indicator to forecast. No single technique performed significantly better, as determined by the Mean Absolute Percent Error, a standard measure of error. However, Autoregressive Integrated Moving Average (ARIMA) models performed well overall and are recommended for short-term forecasting of VA hospital utilization.
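A minimal sketch of the ARIMA-based short-term forecasting and MAPE evaluation described above, using simulated monthly data rather than the Patient Treatment File; the order (1, 1, 1) and the series itself are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
# 72 simulated months (an FY84-FY89 analogue) of slowly declining average length of stay
months = pd.date_range("1983-10-01", periods=72, freq="MS")
alos = pd.Series(14 - 0.04 * np.arange(72) + rng.normal(0, 0.4, 72), index=months)

train, test = alos[:48], alos[48:]             # fit on 48 months, hold out 24
fit = ARIMA(train, order=(1, 1, 1)).fit()
forecast = fit.forecast(steps=24)

mape = np.mean(np.abs((test.values - forecast.values) / test.values)) * 100
print(f"24-month-ahead MAPE: {mape:.1f}%")
```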

Relevance: 80.00%

Abstract:

To address concerns about the possible effect of drilling mud discharges on shallow, low-energy estuarine ecosystems, a 12-month study was designed to detect alterations in water quality and sediment geochemistry. Each drilling mud used in the study, and sediments from the study site, were analyzed in the laboratory for chemical and physical characteristics. Potential water quality impacts were simulated by the EPA-COE elutriation test procedure. Mud toxicity was measured by acute and chronic bioassays with Mysidopsis bahia, Mercenaria mercenaria, and Nereis virens. For the field study, a relatively pristine, shallow (1.2 m) estuary (Christmas Bay, TX) with no drilling activity in the last 30 years was chosen as the study site. After a three-month baseline study, three stations were selected. Station 1 was an external control. At each treatment station (2, 3), mesocosms were constructed to enclose a 3.5 m³ water column. Each treatment station also included an internal control site. Each in situ mesocosm, except the controls, was successively dosed at a mesocosm-specific dilution (1:100, 1:1,000, or 1:10,000 v/v) with 4 field-collected drilling muds (spud, nondispersed, lightly-treated, and heavily-treated lignosulfonate) in sequential order over 1.5 months. Twenty-four hours after each dose, water exchange was allowed until the next treatment. Station 3 was destroyed by a winter storm. After the last treatment, the enclosures were removed and the remaining sites monitored for 6 months. One additional site was similarly dosed (1:100 v/v) with clean dredged sediment from Christmas Bay for comparison between dredged sediments and drilling muds. Results of the analysis of the water samples and field measurements showed that water quality was impacted during the discharges, primarily at the highest dose (1:100 v/v), but that elevated levels of C, Cr (T, F), Cr⁺³ (T, F), N, Pb, and Zn returned to ambient levels before the end of the 24-hour exposure period, or immediately after water exchange was allowed (Al, Ba (T), Chlorophyll ABC, SS, %T). Barium, from the barite, was used as a geochemical tracer in the sediments to confirm estimated doses by mass balance calculations. Barium reached a maximum of 166× background levels at the high-dose mesocosm. Barium levels returned to ambient or only slightly elevated levels by the end of the 6-month monitoring period owing to sediment deposition, resuspension, and bioturbation. QA/QC results using blind samples consisting of lab standards and spiked samples for both water and sediment matrices were within acceptable coefficients of variation. To avoid impacts on water quality and sediment geochemistry in a shallow estuarine ecosystem, this study concluded that a minimum dilution of 1:1,000 (v/v) would be required, in addition to existing regulatory constraints.
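A small worked example of the nominal dosing implied by the mesocosm volume and dilutions quoted above, treating each dilution as the volume of whole mud added per enclosed water-column volume; this interpretation is an assumption, since the abstract does not spell out the dosing protocol.

```python
# Nominal mud volume per treatment for a 3.5 m^3 enclosed water column.
water_volume_l = 3.5 * 1000                     # 3.5 m^3 expressed in litres

for dilution in (100, 1_000, 10_000):           # 1:100, 1:1,000, 1:10,000 (v/v)
    mud_volume_l = water_volume_l / dilution
    print(f"1:{dilution:,} dose -> {mud_volume_l:.2f} L of whole mud per treatment")
```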