848 results for Duration of hospital stay


Relevance: 100.00%

Abstract:

Background Few studies have been undertaken to understand the employment impact in patients with colorectal cancer, and none in middle-aged individuals with cancer. This study described transitions in, and key factors influencing, work participation during the 12 months following a diagnosis of colorectal cancer. Methods We enrolled 239 adults during 2010 and 2011 who were employed at the time of their colorectal cancer diagnosis and prospectively followed them over 12 months. They were compared to an age- and gender-matched general population group of 717 adults from the Household, Income and Labour Dynamics in Australia (HILDA) Survey. Data were collected using telephone and postal surveys. Primary outcomes included work participation at 12 months, changes in hours worked and time to work re-entry. Multivariable logistic and Cox proportional hazards models were fitted. Results A significantly higher proportion of participants with colorectal cancer (27%) had stopped working at 12 months than participants from the comparison group (8%) (p < 0.001). Participants with cancer who returned to work took a median of 91 days off work (25th–75th percentiles: 14–183 days). For participants with cancer, predictors of not working at 12 months included being older, lower BMI and lower physical well-being. Factors related to delayed work re-entry included not being university-educated, working for an employer with more than 20 employees in a non-professional or managerial role, a longer hospital stay, poorer perceived financial status and having received chemotherapy. Conclusions In middle adulthood, those working and diagnosed with colorectal cancer can expect to take around three months off work. Individuals treated with chemotherapy, without a university degree and from large employers could be targeted for specific assistance towards a more timely work re-entry.
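The time-to-work-re-entry figures above come from censored survival data. As a rough illustration only (not the authors' code; the function names and toy data are my own), a minimal Kaplan-Meier estimator for such data can be sketched in plain Python:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve for right-censored data.
    times: follow-up time (e.g. days off work) per participant.
    events: 1 if the event (work re-entry) was observed, 0 if censored.
    Returns a list of (time, survival probability) at each event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        leaving = sum(1 for tt, _ in data if tt == t)  # all who exit at t
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        if deaths:
            s *= 1 - deaths / at_risk
            curve.append((t, s))
        at_risk -= leaving
        i += leaving  # skip past the tied times just handled
    return curve

def median_time(curve):
    """First time at which the survival probability drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None
```

With the study's data, the 91-day median time off work would be read from such a curve as the first time the "still off work" probability falls to 0.5; a production analysis would use an established survival package, which also handles confidence intervals and the Cox regression reported here.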

Relevance: 100.00%

Abstract:

The efficacy, adverse reactions, and long-term effects of intestinal lavage treatment with a balanced electrolyte solution (Golytely) were evaluated in patients with cystic fibrosis and distal intestinal obstruction syndrome. Twenty-two patients with cystic fibrosis (mean age 21.8 years, range 14 to 34 years, 15 male) who sought medical attention because of abdominal pain and a mass in the right iliac fossa received Golytely, 5.6 ± 1.9 L (mean ± 1 SD), either orally (n = 14) or via nasogastric tube (n = 8), over 5.6 ± 2.4 hours. No serious side effects occurred. Serum electrolyte values remained within normal limits. Body weight did not change significantly. Minor adverse reactions included bloating (n = 12), nausea (n = 8), vomiting (n = 1), and chills (n = 3). All but one patient reported impressive relief of symptoms and remained pain free for an average of 3 months (range 1 to 19 months). Symptoms of abdominal pain and radiologic signs of fecal impaction, assessed before and after lavage, both decreased significantly (P < .0001). During follow-up (mean 15.2 months, range 4 to 26 months), 11 patients required a total of 38 (range one to nine) additional doses of Golytely. Seven patients drank the solution at home (21 treatments); only two patients chose a nasogastric tube. In ten patients with symptoms of recurrent distal intestinal obstruction syndrome prior to institution of therapy, duration of hospitalization was significantly reduced by this treatment (5.1 ± 7.6 vs 2.3 ± 6.3 hospital days per annum, P < .02). It is concluded that intestinal lavage is a well-accepted, safe, and effective therapy for distal intestinal obstruction syndrome in patients with cystic fibrosis.

Relevance: 100.00%

Abstract:

Background: Falls among hospitalised patients impose a considerable burden on health systems globally, and prevention is a priority. Some patient-level interventions have been effective in reducing falls, but others have not. An alternative and promising approach to reducing inpatient falls is modification of the hospital physical environment, and the night lighting of hospital wards is a leading candidate for investigation. In this pilot trial, we will determine the feasibility of conducting a main trial to evaluate the effects of modified night lighting on inpatient ward-level fall rates. We will also test the feasibility of collecting novel forms of patient-level data through a concurrent observational sub-study. Methods/design: A stepped wedge, cluster randomised controlled trial will be conducted in six inpatient wards over 14 months in a metropolitan teaching hospital in Brisbane (Australia). The intervention will consist of supplementary night lighting installed across all patient rooms within study wards. The planned placement of luminaires, configurations and spectral characteristics are based on prior published research and pre-trial testing and modification. We will collect data on rates of falls on study wards (falls per 1000 patient days), the proportion of patients who fall once or more, and average length of stay. We will recruit two patients per ward per month to a concurrent observational sub-study aimed at understanding potential impacts on a range of patient sleep and mobility behaviours. The effect on the environment will be monitored with sensors to detect variation in light levels and night-time room activity. We will also collect data on possible patient-level confounders including demographics, pre-admission sleep quality, reported vision, hearing impairment and functional status.
Discussion: This pragmatic pilot trial will assess the feasibility of conducting a main trial to investigate the effects of modified night lighting on inpatient fall rates using several new methods previously untested in the context of environmental modifications and patient safety. Pilot data collected through both parts of the trial will be utilised to inform sample size calculations, trial design and final data collection methods for a subsequent main trial.
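The stepped wedge design described above can be sketched in code (a hypothetical illustration under my own assumptions, not the trial's actual randomisation procedure): each ward is randomly assigned a crossover step, contributes control data before that step, and has the night lighting in place from that step onward.

```python
import random

def stepped_wedge_schedule(wards, seed=42):
    """Randomly order the wards; the ward in position k of the shuffled
    order crosses over from control to intervention at step k
    (one ward per step, as in a classic stepped wedge)."""
    rng = random.Random(seed)
    order = list(wards)
    rng.shuffle(order)
    return {ward: step for step, ward in enumerate(order, start=1)}

def condition(schedule, ward, step):
    """Condition of a ward at a given step of the trial."""
    return "intervention" if step >= schedule[ward] else "control"

def falls_per_1000_patient_days(falls, patient_days):
    """The trial's primary rate outcome."""
    return 1000 * falls / patient_days
```

For example, `stepped_wedge_schedule(["ward1", ..., "ward6"])` yields one crossover step per ward, and fall rates under each condition can then be compared across wards and steps.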

Relevance: 100.00%

Abstract:

Background Australian mothers consistently rate postnatal care as the poorest aspect of their maternity care, and researchers and policymakers have widely acknowledged the need for improvement in how postnatal care is provided. Aim To identify and analyse mothers’ comments about postnatal care in their free-text responses to an open-ended question in the Having a Baby in Queensland Survey, 2010, and to reflect on their implications for midwifery practice and maternity service policies. Methods The survey assessed mothers’ experiences of maternity care four months after birth. We analysed free-text data from an open-ended question inviting respondents to write ‘anything else you would like to tell us’. Of the final survey sample (N = 7193), 60% (N = 4310) provided comments, 26% (N = 1100) of which pertained to postnatal care. Analysis included the coding and enumeration of issues to identify the most common problems commented on by mothers. Comments were categorised according to whether they related to in-hospital or post-discharge care, and whether they were reported by women birthing in public or private birthing facilities. Results The analysis revealed important differences in maternal experiences according to birthing sector: mothers birthing in public facilities were more likely to raise concerns about the quality and/or duration of their in-hospital stay than those in private facilities. Conversely, mothers who gave birth in private facilities were more likely to raise concerns about inadequate post-discharge care. Regardless of birthing sector, however, a substantial proportion of all mothers spontaneously raised concerns about their experiences of inadequate and/or inconsistent breastfeeding support.
Conclusion Women who gave birth in private facilities were more likely to spontaneously report concerns about their level of post-discharge care than women from public facilities in Queensland, suggesting that publicly provided community-based care is not sufficient to meet women's needs. Inadequate or inconsistent professional breastfeeding support remains a major issue for early parenting women regardless of birthing sector.

Relevance: 100.00%

Abstract:

Background Malnutrition is common in patients with advanced epithelial ovarian cancer (EOC) and is associated with impaired quality of life (QoL), longer hospital stay and a higher risk of treatment-related adverse events. This phase III multi-centre randomised clinical trial tested early enteral feeding versus standard care on postoperative QoL. Methods From 2009 to 2013, 109 moderately to severely malnourished patients requiring surgery for suspected advanced EOC were enrolled at five sites across Queensland and randomised to intervention (n = 53) or control (n = 56) groups. Intervention involved intraoperative nasojejunal tube placement and enteral feeding until adequate oral intake could be maintained. Despite being randomised to intervention, 20 patients did not receive feeds (13 did not receive the feeding tube; 7 had it removed early). Control involved postoperative diet as tolerated. QoL was measured at baseline, 6 weeks postoperatively and 30 days after the third cycle of chemotherapy. The primary outcome measure was the difference in QoL between the intervention and control groups. Secondary endpoints included treatment-related adverse event occurrence, length of stay, postoperative services use, and nutritional status. Results Baseline characteristics were comparable between treatment groups. No significant difference in QoL was found between the groups at any time point. There was a trend towards better nutritional status in patients who received the intervention, but the differences did not reach statistical significance except in the intention-to-treat analysis at 7 days postoperatively (11.8 intervention vs. 13.8 control, p = 0.04). Conclusion Early enteral feeding did not significantly improve patients' QoL compared to standard of care but may improve nutritional status.

Relevance: 100.00%

Abstract:

Dry seeding of aman rice can facilitate timely crop establishment and early harvest and thus help to alleviate the monga (hunger) period in the High Ganges Flood Plain of Bangladesh. Dry seeding also offers many other potential benefits, including reduced cost of crop establishment and improved soil structure for crops grown in rotation with rice. However, the optimum time for seeding in areas where farmers have access to water for supplementary irrigation has not been determined. We hypothesized that earlier sowing is safer, and that increasing the seed rate mitigates the adverse effects of significant rain after sowing on establishment and crop performance. To test these hypotheses, we analyzed long-term rainfall data and conducted field experiments on the effects of sowing date (target dates of 25 May, 10 June, 25 June, and 10 July) and seed rate (20, 40, and 60 kg ha−1) on crop establishment, growth, and yield of dry seeded Binadhan-7 (short duration, 110–120 d) during the 2012 and 2013 rainy seasons. Wet soil as a result of untimely rainfall usually prevented sowing on the last two target dates in both years, but not on the first two dates. Rainfall analysis also suggested a high probability of being able to dry seed in late May/early June, and a low probability of being able to dry seed in late June/early July. Delaying sowing from 25 May/10 June to late June/early July usually resulted in 20–25% lower plant density and lower uniformity of the plant stand as a result of rain shortly after sowing. Delaying sowing also reduced crop duration, and tillering or biomass production when using a low seed rate. For the late June/early July sowings, there was a strong positive relationship between plant density and yield, but this was not the case for earlier sowings. Thus, increasing the seed rate compensated for the adverse effect of untimely rains after sowing on plant density and the shorter growth duration of the late sown crops.
The results indicate that in this region, the optimum date for sowing dry seeded rice is late May to early June with a seed rate of 40 kg ha−1. Planting can be delayed to late June/early July with no yield loss using a seed rate of 60 kg ha−1, but in many years, the soil is simply too wet to be able to dry seed at this time due to rainfall.
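The sowing-window and seed-rate recommendation above can be expressed as a simple decision rule. The helper below is hypothetical (the dates and rates come from the reported results; the function and its cut-off date are my own framing):

```python
from datetime import date

def recommended_seed_rate(sowing_date):
    """Seed rate (kg/ha) for dry seeded Binadhan-7 implied by the trial:
    40 kg/ha within the safe late May / early June window (taken here as
    up to 10 June), 60 kg/ha for delayed (late June / early July) sowing
    to offset lower plant density caused by rain shortly after sowing."""
    early_window_end = date(sowing_date.year, 6, 10)
    return 40 if sowing_date <= early_window_end else 60
```

In practice the late-sowing branch matters only in years when the soil is dry enough to sow at all at that time, which the rainfall analysis suggests is uncommon.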

Relevance: 100.00%

Abstract:

Taiwanese migrants settled in Brisbane, Australia (N = 271) completed a questionnaire battery available in both Mandarin and English. A series of multiple and hierarchical regression analyses was used to investigate the factors associated with these migrants’ acculturation and indicators of psychological wellbeing. Results indicated that various personal factors (age, English language proficiency and duration of stay) were associated with acculturation and indicators of psychological wellbeing. Acculturation was not associated with wellbeing. Social support was associated with the indicators of the participants’ wellbeing. The findings indicate that, although associated with similar personal and environmental factors, acculturation and psychological wellbeing occur separately. The study highlights the significance of certain personal resources and social support.

Relevance: 100.00%

Abstract:

This study is one part of a collaborative depression research project, the Vantaa Depression Study (VDS), involving the Department of Mental and Alcohol Research of the National Public Health Institute, Helsinki, and the Department of Psychiatry of the Peijas Medical Care District (PMCD), Vantaa, Finland. The VDS includes two parts: a record-based study consisting of 803 patients, and a prospective, naturalistic cohort study of 269 patients. Both studies include secondary-level care psychiatric out- and inpatients with a new episode of major depressive disorder (MDD). Data for the record-based part of the study came from a computerised patient database incorporating all outpatient visits as well as treatment periods at the inpatient unit. We included all patients aged 20 to 59 years who had been assigned a clinical diagnosis of depressive episode or recurrent depressive disorder according to the International Classification of Diseases, 10th edition (ICD-10) criteria and who had at least one outpatient visit or day as an inpatient in the PMCD during the study period January 1, 1996, to December 31, 1996. All those with an earlier diagnosis of schizophrenia, other non-affective psychosis, or bipolar disorder were excluded. Patients treated in the somatic departments of Peijas Hospital and those who had consulted but not received treatment from the psychiatric consultation services were also excluded. The study sample comprised 290 male and 513 female patients. All their psychiatric records were reviewed, and a structured 57-item form was completed for each patient. The treatment provided was reviewed up to the end of the depressive episode or to the end of 1997. Most (84%) of the patients received antidepressants, although a minority (11%) were treated with clearly subtherapeutic doses.
During the treatment period, the depressed patients investigated averaged only a few visits to psychiatrists (median two visits), but more to other health professionals (median seven). One-fifth of both genders were inpatients, with a mean of nearly two inpatient treatment periods during the overall treatment period investigated. The median length of a hospital stay was 2 weeks. Use of antidepressants was quite conservative: the first antidepressant had been switched to another compound in only about one-fifth (22%) of patients, and only two patients had received up to five antidepressant trials. Only 7% of those prescribed any antidepressant received two antidepressants simultaneously. None of the patients was prescribed any other augmentation medication. Refusing antidepressant treatment was the most common explanation for receiving no antidepressants. During the treatment period, 19% of those not already receiving a disability pension were granted one due to psychiatric illness. These patients were nearly nine years older than those not pensioned. They were also more severely ill, made significantly more visits to professionals and received significantly more concomitant medications (hypnotics, anxiolytics, and neuroleptics) than did those receiving no pension. In the prospective part of the VDS, 806 adult patients (aged 20-59 years) were screened in the PMCD for a possible new episode of DSM-IV MDD. Of these, 542 patients were interviewed face-to-face with the WHO Schedules for Clinical Assessment in Neuropsychiatry (SCAN), Version 2.0. Exclusion criteria were the same as in the record-based part of the VDS. Of these 542, 269 patients fulfilled the criteria for a DSM-IV MDE. This study investigated factors associated with patients' functional disability, social adjustment, and work disability (being on sick-leave or being granted a disability pension).
At the beginning of treatment, the most important single factor associated with overall social and functional disability was found to be severity of depression, but older age and personality disorders also contributed significantly. Total duration and severity of depression, phobic disorders, alcoholism, and personality disorders all independently contributed to poor social adjustment. Of those who were employed, almost half (43%) were on sick-leave. Besides severity and number of episodes of depression, female gender and age over 50 years strongly and independently predicted being on sick-leave. Factors influencing social and occupational disability and social adjustment among patients with MDD were studied prospectively during an 18-month follow-up period. Patients' functional disability and social adjustment improved during the follow-up concurrently with recovery from depression. The current level of functioning and social adjustment of a patient with depression was predicted by severity of depression, recurrence before baseline and during follow-up, lack of full remission, and time spent depressed. Comorbid psychiatric disorders, personality traits (neuroticism), and perceived social support also had a significant influence. During the 18-month follow-up period, 13 (5%) of the 269 patients switched to bipolar disorder, and 58 (20%) dropped out. Of the remaining 198 patients, 186 (94%) were not pensioned at baseline, and these were investigated. Of them, 21 were granted a disability pension during the follow-up. Those who received a pension were significantly older, less often had vocational education, and were more often on sick-leave than those not pensioned, but did not differ with regard to any other sociodemographic or clinical factors. Patients with MDD mostly received adequate antidepressant treatment, but problems existed in treatment intensity and monitoring.
It is challenging to find those at greatest risk of disability and to provide them with adequate and efficacious treatment. Doing so poses a great challenge to society as a whole in providing sufficient resources.

Relevance: 100.00%

Abstract:

Assessment of the outcome of critical illness is complex. Severity scoring systems and organ dysfunction scores are traditional tools in mortality and morbidity prediction in intensive care. Their ability to explain risk of death is impressive for large cohorts of patients, but insufficient for an individual patient. Although events before intensive care unit (ICU) admission are prognostically important, the prediction models utilize data collected at and just after ICU admission. In addition, several biomarkers have been evaluated to predict mortality, but none has proven entirely useful in clinical practice. New prognostic markers of critical illness are therefore needed when evaluating intensive care outcomes. The aim of this dissertation was to investigate new measures and biological markers of critical illness and to evaluate their predictive value and association with mortality and disease severity. The impact of delay in the emergency department (ED) on intensive care outcome, measured as hospital mortality and health-related quality of life (HRQoL) at 6 months, was assessed in 1537 consecutive patients admitted to a medical ICU. Two new biological markers were investigated in two separate patient populations: 231 ICU patients and 255 patients with severe sepsis or septic shock. Cell-free plasma DNA is a surrogate marker of apoptosis. Its association with disease severity and mortality rate was evaluated in ICU patients. Next, the predictive value of plasma DNA regarding mortality and its association with the degree of organ dysfunction and disease severity was evaluated in severe sepsis or septic shock. Heme oxygenase-1 (HO-1) is a potential regulator of apoptosis. Finally, HO-1 plasma concentrations and HO-1 gene polymorphisms and their association with outcome were evaluated in ICU patients. The length of ED stay was not associated with outcome of intensive care.
The hospital mortality rate was significantly lower in patients admitted to the medical ICU from the ED than from the non-ED, and the HRQoL in the critically ill at 6 months was significantly lower than in the age- and sex-matched general population. In the ICU patient population, the maximum plasma DNA concentration measured during the first 96 hours in intensive care correlated significantly with disease severity and degree of organ failure and was independently associated with hospital mortality. In patients with severe sepsis or septic shock, the cell-free plasma DNA concentrations were significantly higher in ICU and hospital nonsurvivors than in survivors and showed a moderate discriminative power regarding ICU mortality. Plasma DNA was an independent predictor for ICU mortality, but not for hospital mortality. The degree of organ dysfunction correlated independently with plasma DNA concentration in severe sepsis and plasma HO-1 concentration in ICU patients. The HO-1 -413T/GT(L)/+99C haplotype was associated with HO-1 plasma levels and frequency of multiple organ dysfunction. Plasma DNA and HO-1 concentrations may support the assessment of outcome or organ failure development in critically ill patients, although their value is limited and requires further evaluation.
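The "discriminative power" of plasma DNA for ICU mortality would typically be summarised as the area under the ROC curve. As a minimal sketch (my own illustration, not the thesis's analysis code), the AUC equals the Mann-Whitney probability that a randomly chosen nonsurvivor has a higher marker value than a randomly chosen survivor:

```python
def roc_auc(nonsurvivor_values, survivor_values):
    """Mann-Whitney estimate of the ROC AUC: the probability that a
    nonsurvivor's marker value (e.g. plasma DNA concentration) exceeds
    a survivor's, counting ties as half."""
    wins = ties = 0
    for x in nonsurvivor_values:
        for y in survivor_values:
            if x > y:
                wins += 1
            elif x == y:
                ties += 1
    return (wins + 0.5 * ties) / (len(nonsurvivor_values) * len(survivor_values))
```

An AUC of 0.5 means no discrimination and 1.0 perfect discrimination; values around 0.7-0.8 are conventionally described as moderate, which is consistent with the wording above.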

Relevance: 100.00%

Abstract:

Venous thromboembolism (VTE) is the greatest single cause of maternal mortality in pregnant women in developed countries. Pregnancy is a hypercoagulable state and brings about an enhanced risk of deep venous thrombosis (DVT) in otherwise healthy women. Traditionally, unfractionated heparin (UFH) has been used for treatment of DVT during pregnancy. We showed in our observational study that low molecular weight heparin (LMWH) is as effective and safe as UFH in the treatment of DVT during pregnancy. Although DVT during pregnancy is often massive, increasing the risk of developing long-term consequences, namely post-thrombotic syndrome (PTS), only 11% of all patients had confirmed PTS 3–4 years after DVT. In our studies the prevalence of PTS was not dependent on treatment (UFH vs LMWH). Low molecular weight heparin is more easily administered, few laboratory controls are required and the hospital stay is shorter, factors that lower the costs of treatment. Cervical insufficiency is defined as repeated very preterm delivery during the second or early third trimester. Infection is a well-known risk factor for preterm delivery. We found overrepresentation of thrombophilic mutations (FV Leiden, prothrombin G20210A) among 42 patients with cervical insufficiency compared with controls (OR 6.7, CI 2.7–18.4). Thus, thrombophilia might be a risk factor for cervical insufficiency, possibly explained by interaction of coagulation and inflammation processes. The presence of antiphospholipid (aPL) antibodies increases the risk for recurrent miscarriage (RM). Annexins are proteins which all bind to anionic phospholipids (PLs), preventing clotting on vascular phospholipid surfaces. Plasma concentrations of circulating annexin IV and V were investigated in 77 pregnancies at the beginning of pregnancy among women with a history of RM, and in relation to their aPL antibody status. The control group consisted of 25 unselected pregnant patients without a history of adverse pregnancy outcome.
Plasma levels of annexin V were significantly higher at the beginning (≤5th week) of pregnancy in women with aPL antibodies compared with those without aPL antibodies (P = 0.03). Levels of circulating annexin V were also higher at the 6th (P = 0.01) and 8th (P = 0.01) weeks of pregnancy in subjects with aPL antibodies. The results support the hypothesis that aPL could displace annexin from the anionic phospholipid surfaces of syncytiotrophoblasts (STBs) and may exert procoagulant activities on the surfaces of STBs. Recurrent miscarriage (RM) has been suggested to be caused by mutations in genes coding for various coagulation factors, resulting in thrombophilia. In the last study of this thesis, the prevalence of thrombomodulin (TM) and endothelial protein C receptor (EPCR) gene polymorphisms was investigated among 40 couples and six women suffering from RM. This study showed that mutations in the TM or EPCR genes are not a major cause of RM in Finnish patients.

Relevance: 100.00%

Abstract:

The aims of this study were to describe current Finnish day surgery practice and to evaluate quality of care by assessing postdischarge minor morbidity and quality indicators. Potential treatment options were approached by investigating the role of oral dexamethasone as part of multimodal analgesia and the feasibility of day surgery in patients aged 65 years and older. Over a 2-month period, all patient cases at 14 Finnish day surgery or short-stay units were analyzed (Study I). Quality indicators included rates of and reasons for overnight admission, readmission, reoperation, and cancellation, as well as patient satisfaction. Recovery during the first postoperative week was assessed at two units (Study II). Altogether 2732 patients graded the intensity of predefined symptoms daily. Multinomial regression analysis was used to define risk factors for postdischarge symptoms. Sixty patients scheduled to undergo day surgery for hallux valgus were randomized to receive dexamethasone 9 mg or placebo twice perioperatively (Study III). Paracetamol 1 g was administered 3 times daily. Rescue medication (oxycodone) consumption during postoperative days (PODs) 0-3, maximal pain scores and adverse effects were documented. Medically stable patients aged 65 years or older, scheduled for open inguinal hernia repair, were randomized to receive treatment either as day cases or as inpatients (Study IV). Complications, unplanned admissions, healthcare visits, and patients’ acceptance of the type of care provided were assessed during the 2 weeks postoperatively. In Study I, unplanned overnight admissions were reported in 5.9%, return hospital visits during PODs 1-28 in 3.7%, and readmissions in 0.7% of patients. Patient satisfaction was high. In Study II, pain was the most common symptom in adult patients (57%). Postdischarge symptoms were more frequent in adults aged < 40 years, children aged ≥ 7 years, females, and following a longer duration of surgery.
In Study III, the total median (range) oxycodone consumption during the study period was 45 (0–165) mg in the dexamethasone group, compared with 78 (15–175) mg in the placebo group (P < 0.049). On PODs 0-1, patients in the dexamethasone group reported significantly lower pain scores. Following inguinal hernia repair, no significant differences in outcome measures were seen between the study groups. Patient satisfaction was equally high in day cases and inpatients (Study IV). Finnish day surgery units provide good-quality services. Minor postdischarge symptoms are common, and they are influenced by several patient-, surgery-, and anesthesia-related factors. Oral dexamethasone combined with paracetamol improves pain relief and reduces the need for oxycodone rescue medication following correction of hallux valgus. Day surgery for open inguinal hernia repair is safe and well accepted by patients aged 65 years or older and can be recommended as the primary choice of care for medically stable patients.

Relevance: 100.00%

Abstract:

Spirometry is the most widely used lung function test in the world. It is fundamental in the diagnostic and functional evaluation of various pulmonary diseases. In the studies described in this thesis, the spirometric assessment of reversibility of bronchial obstruction, its determinants, and its variation features are described in a general population sample from Helsinki, Finland. This study is a part of the FinEsS study, a collaborative study of the clinical epidemiology of respiratory health between Finland (Fin), Estonia (Es), and Sweden (S). Asthma and chronic obstructive pulmonary disease (COPD) constitute the two major obstructive airways diseases. The prevalence of asthma has increased, with around 6% of the population in Helsinki reporting physician-diagnosed asthma. The main cause of COPD is smoking, with changes in smoking habits in the population affecting its prevalence with a delay. Whereas airway obstruction in asthma is by definition reversible, COPD is characterized by fixed obstruction. Cough and sputum production, the first symptoms of COPD, are often misinterpreted as smoker's cough and not recognized as the first signs of a chronic illness. COPD is therefore widely underdiagnosed. More extensive use of spirometry in primary care is advocated to focus smoking cessation interventions on populations at risk. The use of forced expiratory volume in six seconds (FEV6) instead of forced vital capacity (FVC) has been suggested to enable office spirometry to be used in earlier detection of airflow limitation. Despite spirometry being a widely accepted standard method of assessment of lung function, its methodology and interpretation are constantly developing.
In 2005, the ATS/ERS Task Force issued a joint statement which endorsed the 12% and 200 ml thresholds for significant change in forced expiratory volume in one second (FEV1) or FVC during bronchodilation testing, but included the notion that in cases where only FVC improves, it should be verified that this is not caused by a longer exhalation time in post-bronchodilator spirometry. This elicited new interest in the assessment of forced expiratory time (FET), a spirometric variable not usually reported or used in assessment. In this population sample, we examined FET and found it to average 10.7 (SD 4.3) s and to increase with ageing and airflow limitation in spirometry. The intrasession repeatability of FET was the poorest of the spirometric variables assessed. Based on the intrasession repeatability, a limit of 3 s was suggested for significant change in FET during bronchodilation testing. FEV6 was found to perform equally well as FVC in the population and in a subgroup of subjects with airways obstruction. In the bronchodilation test, decreases were frequently observed in FEV1 and particularly in FVC. The limit of significant increase based on the 95th percentile of the population sample was 9% for FEV1 and 6% for FEV6 and FVC; these are slightly lower than the current limits for single bronchodilation tests (ATS/ERS guidelines). FEV6 proved to be a valid alternative to FVC also in the bronchodilation test and would remove the need to control the duration of exhalation during the spirometric bronchodilation test.
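The ATS/ERS 2005 bronchodilation criterion discussed above is straightforward to encode. The sketch below implements only the stated 12%-and-200-ml rule; it does not capture the exhalation-time caveat or the study's proposed population-derived limits:

```python
def significant_bronchodilator_response(pre_ml, post_ml):
    """ATS/ERS 2005 criterion for FEV1 (or FVC): the post-bronchodilator
    value must exceed baseline by at least 12% of the pre-bronchodilator
    value AND by at least 200 ml."""
    change = post_ml - pre_ml
    return change >= 200 and change >= 0.12 * pre_ml
```

For example, an FEV1 rising from 2000 ml to 2300 ml (a 300 ml, 15% gain) qualifies, whereas a 250 ml gain on a 3000 ml baseline does not, because it falls short of 12%.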

Relevance: 100.00%

Abstract:

Atopy-related allergic diseases, i.e. allergic rhinoconjunctivitis, atopic dermatitis and asthma, have increased in frequency in the industrialized countries. Reversing this trend requires effective preventive strategies, which in turn require a better understanding of the early-life events leading to the expression of the atopic phenotype. The present study aimed to define early-life factors and markers associated with the subsequent development of allergic diseases in a cohort of 200 healthy, unselected Finnish newborns prospectively followed from birth to age 20 years. Their mothers were encouraged to start and maintain exclusive breastfeeding for as long as it was nutritionally sufficient for the infant. Consequently, all infants were exclusively breastfed for some period: 58% for the first 6 months of life and 18% for at least the first 9 months. Of the infants, 42% had a family history of allergy. After the first year of follow-up, the children were re-assessed at ages 5, 11 and 20 years with clinical examination, skin prick testing, and parental and personal interviews. Exclusive breastfeeding for over 9 months was associated with atopic dermatitis and symptoms of food hypersensitivity at age 5 years, and with symptoms of food hypersensitivity at age 11 years in the children with familial allergy. Subjects with allergic symptoms or a positive skin prick test in childhood or adolescence had lower retinol concentrations during infancy and childhood than the others. An elevated cord serum immunoglobulin E (IgE) concentration predicted subsequent atopic manifestations, though with modest sensitivity. Children and adolescents with allergic symptoms, skin prick test positivity and elevated IgE had lower total cholesterol levels in infancy and childhood than the nonatopic subjects.
In conclusion, prolonging strictly exclusive breastfeeding beyond 9 months of age did not help prevent allergic symptoms; instead, it was associated with more atopic dermatitis and food hypersensitivity symptoms in childhood. Owing to its modest sensitivity, cord serum IgE is not an effective screening method for atopic predisposition in the general population. Retinol and cholesterol concentrations in infancy were inversely associated with the subsequent development of allergic symptoms. Based on these findings, it is proposed that there may be differences in the inborn regulation of retinol and cholesterol levels between children with and without a genetic susceptibility to atopy, and that these may play a role in the development of atopic sensitization and allergic diseases.
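The verdict on cord serum IgE rests on screening sensitivity, i.e. the share of eventual cases the screen flags. A minimal sketch of that metric (the counts below are hypothetical, invented for illustration, not taken from the cohort):

```python
def sensitivity(true_pos, false_neg):
    """Fraction of children who later became atopic that the screen flagged:
    TP / (TP + FN)."""
    return true_pos / (true_pos + false_neg)

# Hypothetical counts for illustration only: if 40 children developed
# atopic manifestations but elevated cord IgE had flagged just 10 of them,
# the screen would have missed three quarters of future cases.
print(sensitivity(10, 30))  # 0.25
```

A screen with sensitivity this low misses most future cases regardless of how specific it is, which is why modest sensitivity alone can disqualify it for population screening.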

Resumo:

Increasing antimicrobial resistance in bacteria has led to the need for a better understanding of antimicrobial usage patterns. In 1999, the World Organisation for Animal Health (OIE) recommended that an international ad hoc group be established to address human and animal health risks related to antimicrobial resistance and the contribution of antimicrobial usage in veterinary medicine. In European countries, the need for continuous recording of the usage of veterinary antimicrobials, as well as for animal species-specific and indication-based data on usage, has been acknowledged. Finland has been among the first countries to develop prudent use guidelines in veterinary medicine: the Ministry of Agriculture and Forestry issued the first animal species-specific, indication-based recommendations for antimicrobial use in animals in 1996, and these guidelines were revised in 2003 and 2009. However, surveillance of the species-specific use of antimicrobials in animals has not been performed in Finland. This thesis provides animal species-specific information on indication-based antimicrobial usage, gathered with different data-collection methods in four studies (studies A-D). Material from studies A, B and C has been used in an overlapping manner in the original publications I-IV. Study A (original publications I & IV) presents a retrospective cross-sectional survey of prescriptions for small animals at the Veterinary Teaching Hospital of the University of Helsinki. Prescriptions for antimicrobial agents (n = 2281) were collected and usage patterns, such as the indication and length of treatment, were reviewed. Most of the prescriptions were for dogs (78%), primarily for the treatment of skin and ear infections, most of which were treated with cephalexin for a median period of 14 days.
Prescriptions for cats (18%) were most often for the treatment of urinary tract infections with amoxicillin, for a median length of 10 days. Study B (original publication II) was a retrospective cross-sectional survey in which prescriptions for animals were collected from 17 University Pharmacies nationwide. Antimicrobial prescriptions (n = 1038), mainly for dogs (65%) and cats (19%), were investigated. In this study too, cephalexin and amoxicillin were the most frequently used drugs for dogs and cats, respectively. In study C (original publications III & IV), the indication-based usage of antimicrobials by practicing veterinarians was analyzed using a prospective questionnaire. Randomly selected practicing veterinarians in Finland (n = 262) recorded all their antimicrobial usage during a 7-day study period. Cattle (46%) with mastitis were the most common patients receiving antimicrobial treatment, generally intramuscular penicillin G or intramammary treatment with ampicillin and cloxacillin. The median length of treatment was four days, regardless of the route of administration. Antimicrobial use in horses was evaluated in study D, whose results are previously unpublished. Firstly, data collected with the prospective questionnaire from the practicing veterinarians showed that horses (n = 89) were frequently treated for skin or wound infections with penicillin G or trimethoprim-sulfadiazine; the mean duration of treatment was five to seven days. Secondly, according to retrospective data collected from patient records, horses (n = 74) that underwent colic surgery at the Veterinary Teaching Hospital of the University of Helsinki were generally treated according to national and hospital recommendations: penicillin G and gentamicin were administered preoperatively, and treatment was continued for a median of three days postoperatively. In conclusion, Finnish veterinarians followed the national prudent use guidelines well.
Narrow-spectrum antimicrobials were preferred and, for instance, fluoroquinolones were used sparingly. Prescription studies seemed to give good information on antimicrobial usage, especially when combined with complementary information from patient records. A prospective questionnaire study provided a fair amount of valuable data on several animal species. Electronic surveys are worth exploiting in the future.

Resumo:

Maternal drug abuse during pregnancy endangers the future health and wellbeing of the infant and growing child. With maternal abstinence these problems would never occur, so they are in principle entirely preventable. Buprenorphine is widely used in opioid maintenance treatment as a substitute medication. In Finland, buprenorphine misuse increased steadily during the 2000s; in 2009, almost one third of the clientele of substance treatment units were in treatment because of buprenorphine dependence. At the Helsinki Women's Clinic, the first child with prenatal buprenorphine exposure was born in 2001. During 1992-2001, in the three capital-area maternity hospitals (Women's Clinic, Maternity Hospital, Jorvi Hospital), 524 women were followed at special antenatal clinics due to substance abuse problems. Three control women were drawn from the birth register for each case woman, matched for parity and for the place and date of the index birth. According to register data, the mortality rate was 38-fold higher among cases than controls within 6-15 years after the index birth; in particular, the risk of violent or accidental death was increased. The women with substance misuse problems also had an elevated risk of viral hepatitis and psychiatric morbidity, were more often reimbursed for psychopharmaceuticals, and were more often granted disability pensions and rehabilitation allowances than controls. In total, 626 children were born from these pregnancies. According to register data, 38% of these children were placed in out-of-home care as part of child protection services by the age of two years, and half of them by the age of 12 years; the median follow-up time was 5.8 years. The risk of out-of-home care was associated with factors identifiable during the pre- and perinatal period. In 2002-2005, 67 pregnant women with buprenorphine dependence were followed up at the Helsinki University Hospital, Department of Obstetrics and Gynecology. Their pregnancies were uneventful.
Compared to national statistics, the prematurity rate was similar and major anomalies were no more frequent, but the neonates were lighter. They were also born in good condition, with no perinatal hypoxia as defined by standard clinical parameters or certain biochemical markers in the cord blood: erythropoietin, S100 and cardiac troponin-T. Almost 80% of the newborns developed neonatal abstinence syndrome (NAS), and two-thirds of them needed morphine medication for it. Maternal smoking of over ten cigarettes per day aggravated NAS, while benzodiazepine use attenuated it. An infant's highest urinary norbuprenorphine concentration during the first 3 days of life correlated with the duration of morphine treatment. The average length of the infants' hospital stay was 25 days.