978 results for feedlot receiving


Relevance:

10.00%

Publisher:

Abstract:

Interactions among individuals give rise to both cooperation and conflict. Individuals behave selfishly or altruistically depending on which yields the higher payoff. The reproductive strategies of many animals are flexible, and several alternative tactics may be present from which the most suitable one is applied. Generally, alternative reproductive tactics may be defined as a response to competition from individuals of the same sex. They are means by which individuals may fine-tune their fitness to prevailing circumstances, and they are shaped both by the environment individuals occupy and by the behaviour of other individuals sharing that environment. By employing such alternative ways of achieving reproductive output, individuals may alleviate competition from others. Conspecific brood parasitism (CBP) is an alternative reproductive strategy found in several egg-laying animal groups, and it is especially common among waterfowl. Within this strategy, four reproductive options can be identified, representing a continuum from low reproductive effort coupled with low fitness returns to high reproductive effort and consequently high benefits. It is not self-evident, however, how individuals should allocate reproductive effort between eggs laid in their own nest and eggs laid in the nests of others. Limited fecundity constrains the number of eggs a parasite can donate, but the hosts' tendency to accept parasitic eggs may also affect the allocation decision. Furthermore, kinship, individual quality and the costs of breeding may complicate the allocation decision. In this thesis, I view the seemingly paradoxical effects of kinship on conflict resolution in the context of alternative reproductive tactics, examining the resulting features of cooperation and conflict. Conspecific brood parasitism sets the stage for investigating these questions.
By using both empirical and theoretical approaches, I examine the nature of CBP in a brood-parasitic duck, the Barrow's goldeneye (Bucephala islandica). The theoretical chapter of this thesis yields four main conclusions. Firstly, variation in individual quality plays a central role in shaping breeding strategies. Secondly, kinship plays a central role in the evolution of CBP. Thirdly, egg recognition ability may affect the prevalence of parasitism: if egg recognition is perfect, higher relatedness between host and parasite facilitates CBP. Finally, I show that the relative costs of egg laying and post-laying care play a hitherto underestimated role in determining the prevalence of parasitism; the costs of breeding may outweigh the possible inclusive fitness benefits accrued from receiving eggs from relatives. Several of the patterns brought out by the theoretical work are then confirmed empirically in the following chapters. The findings confirm the central role of relatedness in determining the extent of parasitism, as well as in inducing a counterintuitive host clutch reduction. Furthermore, I demonstrate a cost of CBP inflicted on hosts, and present results suggesting that host age reflects individual quality, affecting the ability to overcome the costs inflicted by CBP. In summary, I demonstrate both theoretically and empirically the presence of cooperation and conflict in the interactions between conspecific parasites and their hosts. The field of CBP research has traditionally been divided between conflict-based and cooperation-based interpretations, but the first steps have now been taken toward reconciling the two views. The theoretical findings of chapter 1, in particular, make it possible to view the seemingly contrasting results of various studies within a single framework, and may direct future research toward more general features underlying differences in the patterns of CBP between populations or species.
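The inclusive-fitness trade-off invoked above can be sketched with Hamilton's classic rule; as a hedged illustration (the thesis's actual model is richer, incorporating individual quality and egg recognition), a host gains from accepting a relative's parasitic egg only when

```latex
r\,b > c
```

where \(r\) is host–parasite relatedness, \(b\) the fitness benefit the parasitic egg gains from the host's care, and \(c\) the cost the extra egg imposes on the host's own brood. The conclusion that costly post-laying care can make receiving eggs unprofitable even among kin corresponds to \(c\) exceeding \(r\,b\).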


- Background Expressed emotion (EE) captures the affective quality of the relationship between family caregivers and their care recipients and is known to increase the risk of poor health outcomes for caregiving dyads. Little is known about expressed emotion in the context of caregiving for persons with dementia, especially in non-Western cultures. The Family Attitude Scale (FAS) is a psychometrically sound self-reporting measure for EE. Its use in the examination of caregiving for patients with dementia has not yet been explored. - Objectives This study was performed to examine the psychometric properties of the Chinese version of the FAS (FAS-C) in Chinese caregivers of relatives with dementia, and its validity in predicting severe depressive symptoms among the caregivers. - Methods The FAS was translated into Chinese using Brislin's model. Two expert panels evaluated the semantic equivalence and content validity of this Chinese version (FAS-C), respectively. A total of 123 Chinese primary caregivers of relatives with dementia were recruited from three elderly community care centers in Hong Kong. The FAS-C was administered with the Chinese versions of the 5-item Mental Health Inventory (MHI-5), the Zarit Burden Interview (ZBI) and the Revised Memory and Behavioral Problem Checklist (RMBPC). - Results The FAS-C had excellent semantic equivalence with the original version and a content validity index of 0.92. Exploratory factor analysis identified a three-factor structure for the FAS-C (hostile acts, criticism and distancing). Cronbach's alpha of the FAS-C was 0.92. Pearson's correlation indicated that there were significant associations between a higher score on the FAS-C and greater caregiver burden (r = 0.66, p < 0.001), poorer mental health of the caregivers (r = −0.65, p < 0.001) and a higher level of dementia-related symptoms (frequency of symptoms: r = 0.45, p < 0.001; symptom disturbance: r = 0.51, p < 0.001), which serves to suggest its construct validity. 
For detecting severe depressive symptoms in the family caregivers, the receiver operating characteristic (ROC) curve had an area under the curve of 0.78 (95% confidence interval (CI) = 0.69–0.87, p < 0.0001). The optimal cut-off score was >47, with a sensitivity of 0.720 (95% CI = 0.506–0.879) and a specificity of 0.742 (95% CI = 0.643–0.826). - Conclusions The FAS-C is a reliable and valid measure for assessing the affective quality of the relationship between Chinese caregivers and their relatives with dementia. It also has acceptable predictive ability for identifying family caregivers with severe depressive symptoms.
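How a cut-off such as >47 translates into a sensitivity and specificity can be sketched as follows; the scores and labels here are purely illustrative, not the FAS-C study's data:

```python
def sens_spec(scores, has_severe_depression, cutoff):
    """Sensitivity and specificity of the rule 'score > cutoff'."""
    tp = sum(1 for s, d in zip(scores, has_severe_depression) if d and s > cutoff)
    fn = sum(1 for s, d in zip(scores, has_severe_depression) if d and s <= cutoff)
    tn = sum(1 for s, d in zip(scores, has_severe_depression) if not d and s <= cutoff)
    fp = sum(1 for s, d in zip(scores, has_severe_depression) if not d and s > cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative caregivers: questionnaire scores and a criterion diagnosis.
scores = [30, 55, 62, 41, 48, 33, 70, 45]
labels = [False, True, True, True, False, False, True, False]
sensitivity, specificity = sens_spec(scores, labels, cutoff=47)  # 0.75, 0.75
```

In the study itself the cut-off was chosen from the ROC curve, which trades sensitivity against specificity across all candidate cut-offs; the sketch evaluates only one.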


With transplant rejection rendered a minor concern and survival rates after liver transplantation (LT) steadily improving, long-term complications are attracting more attention. Current immunosuppressive therapies, together with other factors, are accompanied by considerable long-term toxicity, which clinically manifests as renal dysfunction, high risk for cardiovascular disease, and cancer. This thesis investigates the incidence, causes, and risk factors for such renal dysfunction, cardiovascular risk, and cancer after LT. Long-term effects of LT are further addressed by surveying the quality of life and employment status of LT recipients. The consecutive patients included had undergone LT at Helsinki University Hospital from 1982 onwards. Data regarding renal function – creatinine and estimated glomerular filtration rate (GFR) – were recorded before and repeatedly after LT in 396 patients. The presence of hypertension, dyslipidemia, diabetes, impaired fasting glucose, and overweight/obesity before and 5 years after LT was determined among 77 patients transplanted for acute liver failure. The entire cohort of LT patients (540 patients), including both children and adults, was linked with the Finnish Cancer Registry, and numbers of cancers observed were compared to site-specific expected numbers based on national cancer incidence rates stratified by age, gender, and calendar time. Health-related quality of life (HRQoL), measured by the 15D instrument, and employment status were surveyed among all adult patients alive in 2007 (401 patients). The response rate was 89%. Posttransplant cardiovascular risk factor prevalence and HRQoL were compared with that in the age- and gender-matched Finnish general population. The cumulative risk for chronic kidney disease increased from 10% at 5 years to 16% at 10 years following LT. GFR up to 10 years after LT could be predicted by the GFR at 1 year. 
In patients transplanted for chronic liver disease, a moderate correlation of pretransplant GFR with later GFR was also evident, whereas in acute liver failure patients even severe pretransplant renal dysfunction often recovered after LT. By 5 years after LT, 71% of acute liver failure patients were receiving antihypertensive medications, 61% exhibited dyslipidemia, 10% were diabetic, 32% were overweight, and 13% were obese. Compared with the general population, only hypertension displayed a significantly elevated prevalence among patients – 2.7-fold – whereas patients exhibited 30% less dyslipidemia and 71% less impaired fasting glucose. The cumulative incidence of cancer was 5% at 5 years and 13% at 10 years. Compared with the general population, patients were subject to a 2.6-fold cancer risk, with non-melanoma skin cancer (standardized incidence ratio, SIR, 38.5) and non-Hodgkin lymphoma (SIR 13.9) being the predominant malignancies. Non-Hodgkin lymphoma was associated with male gender, young age, and the immediate posttransplant period, whereas old age and antibody induction therapy raised skin-cancer risk. HRQoL deviated from general-population values by clinically unimportant margins, although significant deficits among patients were evident in some physical domains. HRQoL did not seem to decrease with longer follow-up. Although 87% of patients reported improved working capacity, data on return to working life showed marked age-dependency: among patients aged less than 40 at LT, 70 to 80% returned to work; among those aged 40 to 50, 55%; and among those above 50, 15% to 28%. The most common cause of unemployment was early retirement before LT. Employed patients exhibited better HRQoL than unemployed ones. In conclusion, although renal impairment, hypertension, and cancer are common after LT and increase with time, patients' quality of life remains comparable with that of the general population.
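The standardized incidence ratios quoted above compare observed cancer counts with the counts expected from national rates. A minimal sketch of the calculation; the strata, rates, and counts below are hypothetical, not the registry data:

```python
def expected_cases(strata):
    """Expected count: sum of (reference incidence rate x person-years)
    over strata defined by age, gender, and calendar time."""
    return sum(rate * person_years for rate, person_years in strata)

def standardized_incidence_ratio(observed, strata):
    """SIR: observed cases divided by the expected number."""
    return observed / expected_cases(strata)

# Hypothetical strata: (cases per person-year, person-years at risk).
strata = [(0.001, 5000), (0.002, 2500)]         # expected = 5 + 5 = 10
sir = standardized_incidence_ratio(26, strata)  # 2.6
```

An SIR of 2.6 thus means 2.6 times as many cancers were observed as the age-, gender-, and period-matched general population rates would predict.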


Approximately 75% of postmenopausal women complain of vasomotor hot flushes, but their frequency and severity show great individual variation. Hot flushes have been present in women attending observational studies showing cardiovascular benefit associated with hormone therapy use, whereas they have been absent or very mild in randomized hormone therapy trials showing cardiovascular harm. Therefore, if hot flushes are a factor connected with vascular health, they could be one explanation for the divergence between observational and randomized cardiovascular data. For the present study, 150 healthy, recently postmenopausal women showing a large variation in hot flushes were studied with regard to cardiovascular health by way of pulse wave analysis, ambulatory blood pressure and several biochemical vascular markers. In addition, the possible impact of hot flushes on the outcomes of hormone therapy was studied. This study shows that women with severe hot flushes exhibit greater vasodilatory reactivity, as assessed by pulse wave analysis, than do women without vasomotor symptoms. This can be seen as a hot flush-related vascular benefit. Although severe night-time hot flushes seem to be accompanied by transient increases in blood pressure and heart rate, the diurnal blood pressure and heart rate profiles show no significant differences between women without hot flushes and those with mild, moderate or severe hot flushes. The levels of vascular markers, such as lipids, lipoproteins, C-reactive protein and sex hormone-binding globulin, show no association with hot flush status. In the 6-month hormone therapy trial the women were classified as having either tolerable or intolerable hot flushes. These groups were treated in randomized order with transdermal estradiol gel, oral estradiol alone or in combination with medroxyprogesterone acetate, or with placebo.
In women with only tolerable hot flushes, oral estradiol leads to a reduced vasodilatory response and to increases in 24-hour and daytime blood pressures as compared to women with intolerable hot flushes receiving the same therapy. No such effects were observed with the other treatment regimens or in women with intolerable hot flushes. The responses of vascular biomarkers to hormone therapy are unaffected by hot flush status. In conclusion, hot flush status contributes to cardiovascular health before and during hormone therapy. Severe hot flushes are associated with an increased vasodilatory, and thus beneficial, vascular status. Oral estradiol leads to vasoconstrictive changes and increases in blood pressure, and thus to possible vascular harm, but only in women whose hot flushes are so mild that they would probably not lead to the initiation of hormone therapy in clinical practice. Healthy, recently postmenopausal women with moderate to severe hot flushes should be given the opportunity to use hormone therapy to alleviate hot flushes, and if estrogen is prescribed for indications other than the control of hot flushes, the transdermal route of administration should be favored.


This study is one part of a collaborative depression research project, the Vantaa Depression Study (VDS), involving the Department of Mental and Alcohol Research of the National Public Health Institute, Helsinki, and the Department of Psychiatry of the Peijas Medical Care District (PMCD), Vantaa, Finland. The VDS includes two parts, a record-based study consisting of 803 patients, and a prospective, naturalistic cohort study of 269 patients. Both studies include secondary-level care psychiatric out- and inpatients with a new episode of major depressive disorder (MDD). Data for the record-based part of the study came from a computerised patient database incorporating all outpatient visits as well as treatment periods at the inpatient unit. We included all patients aged 20 to 59 years old who had been assigned a clinical diagnosis of depressive episode or recurrent depressive disorder according to the International Classification of Diseases, 10th edition (ICD-10) criteria and who had at least one outpatient visit or day as an inpatient in the PMCD during the study period January 1, 1996, to December 31, 1996. All those with an earlier diagnosis of schizophrenia, other non-affective psychosis, or bipolar disorder were excluded. Patients treated in the somatic departments of Peijas Hospital and those who had consulted but not received treatment from the psychiatric consultation services were excluded. The study sample comprised 290 male and 513 female patients. All their psychiatric records were reviewed and each patient completed a structured form with 57 items. The treatment provided was reviewed up to the end of the depression episode or to the end of 1997. Most (84%) of the patients received antidepressants, including a minority (11%) on treatment with clearly subtherapeutic low doses. 
During the treatment period the depressed patients investigated averaged only a few visits to psychiatrists (median two visits), but more to other health professionals (median seven). One-fifth of both genders were inpatients, with a mean of nearly two inpatient treatment periods during the overall treatment period investigated. The median length of a hospital stay was 2 weeks. Use of antidepressants was quite conservative: the first antidepressant had been switched to another compound in only about one-fifth (22%) of patients, and only two patients had received up to five antidepressant trials. Only 7% of those prescribed any antidepressant received two antidepressants simultaneously. None of the patients was prescribed any other augmentation medication. Patient refusal was the most common explanation for receiving no antidepressants. During the treatment period, 19% of those not already receiving a disability pension were granted one due to psychiatric illness. These patients were nearly nine years older than those not pensioned. They were also more severely ill, made significantly more visits to professionals, and received significantly more concomitant medications (hypnotics, anxiolytics, and neuroleptics) than did those receiving no pension. In the prospective part of the VDS, 806 adult patients (aged 20-59 years) were screened in the PMCD for a possible new episode of DSM-IV MDD. Of these, 542 patients were interviewed face-to-face with the WHO Schedules for Clinical Assessment in Neuropsychiatry (SCAN), Version 2.0. Exclusion criteria were the same as in the record-based part of the VDS. Of these 542 patients, 269 fulfilled the criteria for a DSM-IV MDE. This study investigated factors associated with patients' functional disability, social adjustment, and work disability (being on sick-leave or being granted a disability pension).
At the beginning of treatment, the most important single factor associated with overall social and functional disability was severity of depression, but older age and personality disorders also contributed significantly. Total duration and severity of depression, phobic disorders, alcoholism, and personality disorders all independently contributed to poor social adjustment. Of those who were employed, almost half (43%) were on sick-leave. Besides severity and number of episodes of depression, female gender and age over 50 years strongly and independently predicted being on sick-leave. Factors influencing social and occupational disability and social adjustment among patients with MDD were studied prospectively during an 18-month follow-up period. Patients' functional disability and social adjustment improved during the follow-up concurrently with recovery from depression. The current level of functioning and social adjustment of a patient with depression was predicted by severity of depression, recurrence before baseline and during follow-up, lack of full remission, and time spent depressed. Comorbid psychiatric disorders, personality traits (neuroticism), and perceived social support also had a significant influence. During the 18-month follow-up period, 13 (5%) of the 269 patients switched to bipolar disorder, and 58 (20%) dropped out. Of the remaining 198 patients, 186 (94%) were not pensioned at baseline, and they were investigated. Of them, 21 were granted a disability pension during the follow-up. Those who received a pension were significantly older, less often had vocational education, and were more often on sick-leave than those not pensioned, but did not differ with regard to any other sociodemographic or clinical factor. Patients with MDD received mostly adequate antidepressant treatment, but problems existed in treatment intensity and monitoring.
It remains challenging to identify those at greatest risk for disability and to provide them with adequate and efficacious treatment; doing so also poses a major challenge to society as a whole in providing sufficient resources.


Background: The fecal neutrophil-derived proteins calprotectin and lactoferrin have proven useful surrogate markers of intestinal inflammation. The aim of this study was to compare fecal calprotectin and lactoferrin concentrations to clinically, endoscopically, and histologically assessed Crohn's disease (CD) activity, and to explore the suitability of these proteins as surrogate markers of mucosal healing during anti-TNFα therapy. Furthermore, we studied changes in the number and expression of effector and regulatory T cells in bowel biopsy specimens during anti-TNFα therapy. Patients and methods: Adult CD patients referred for ileocolonoscopy for various reasons were recruited (106 examinations in 77 patients; Study I). Clinical disease activity was assessed with the Crohn's disease activity index (CDAI) and endoscopic activity with both the Crohn's disease endoscopic index of severity (CDEIS) and the simple endoscopic score for Crohn's disease (SES-CD). Stool samples for measurements of calprotectin and lactoferrin, and blood samples for CRP, were collected. For Study II, biopsy specimens were obtained from the ileum and the colon for histologic activity scoring. In prospective Study III, after baseline ileocolonoscopy, 15 patients received induction with anti-TNFα blocking agents, and endoscopic, histologic, and fecal-marker responses to therapy were evaluated at 12 weeks. For detecting changes in the number and expression of effector and regulatory T cells, biopsy specimens were taken from the most severely diseased lesions in the ileum and the colon (Study IV). Results: Endoscopic scores correlated significantly with fecal calprotectin and lactoferrin (p<0.001). Both fecal markers were significantly lower in patients with endoscopically inactive than with active disease (p<0.001).
In detecting endoscopically active disease, the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) for calprotectin ≥200 μg/g were 70%, 92%, 94%, and 61%; for lactoferrin ≥10 μg/g they were 66%, 92%, 94%, and 59%. The corresponding values for CRP >5 mg/l were 48%, 91%, 91%, and 48%. Fecal markers were significantly higher in active colonic (both p<0.001) or ileocolonic (calprotectin p=0.028, lactoferrin p=0.004) than in ileal disease. In ileocolonic or colonic disease, the colon histology score correlated significantly with fecal calprotectin (r=0.563) and lactoferrin (r=0.543). In patients receiving anti-TNFα therapy, median fecal calprotectin decreased from 1173 μg/g (range 88-15326) to 130 μg/g (13-1419) and lactoferrin from 105.0 μg/g (4.2-1258.9) to 2.7 μg/g (0.0-228.5), both p=0.001. The ratio of ileal IL-17+ cells to CD4+ cells decreased significantly during anti-TNFα treatment (p=0.047). The ratio of IL-17+ cells to Foxp3+ cells was higher in the patients' baseline specimens than in their post-treatment specimens (p=0.038). Conclusions: Judged against endoscopic findings, fecal calprotectin and lactoferrin were more sensitive surrogate markers of CD activity than CDAI and CRP. Fecal calprotectin and lactoferrin were significantly higher in endoscopically active disease than in endoscopic remission. In both ileocolonic and colonic disease, fecal markers correlated closely with histologic disease activity. In CD, these neutrophil-derived proteins thus seem to be useful surrogate markers of endoscopic activity. During anti-TNFα therapy, fecal calprotectin and lactoferrin decreased significantly. The anti-TNFα treatment was also reflected in a decreased IL-17/Foxp3 cell ratio, which may indicate an improved balance between effector and regulatory T cells with treatment.
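The four test characteristics above all derive from a 2×2 table of marker status (e.g. calprotectin ≥200 μg/g) against endoscopic activity. A minimal sketch with hypothetical counts rather than the study's; note that PPV and NPV, unlike sensitivity and specificity, depend on how common active disease is in the sample:

```python
def diagnostics(tp, fp, fn, tn):
    """Test characteristics from a 2x2 table:
    tp/fn = active disease with positive/negative marker,
    fp/tn = inactive disease with positive/negative marker."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts for a calprotectin >= 200 ug/g cutoff.
d = diagnostics(tp=70, fp=8, fn=30, tn=92)  # sensitivity 0.70, specificity 0.92
```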


Background Many guidelines recommend that people with foot complications, or those at risk of them, should attend multiple health professionals for foot care each year. However, few studies have investigated the characteristics of those attending health professionals for foot care, and whether those characteristics match those requiring foot care as per guideline recommendations. The aim of this paper was to determine the characteristics associated with having attended a health professional for foot care in the year prior to hospitalisation. Methods Eligible participants were all adults admitted overnight, for any reason, into five diverse hospitals on one day, excluding maternity, mental health and cognitively impaired patients. Participants underwent a foot examination to clinically diagnose different foot complications, including wounds, infections, deformity, peripheral arterial disease and peripheral neuropathy. They were also surveyed on social determinants, medical history, self-care, foot complication history, and past health professional attendance for foot care in the year prior to hospitalisation. Results Overall, 733 participants consented; mean (±SD) age 62 (±19) years, 408 (55.8%) male, 172 (23.5%) with diabetes. Two hundred and fifty-six (34.9% (95% CI 31.6-38.4)) participants had attended a health professional for foot care, including podiatrists 180 (24.5%), GPs 93 (12.7%), and surgeons 36 (4.9%). In backwards stepwise multivariate analyses, attending any health professional for foot care was independently associated (OR (95% CI)) with diabetes (3.0 (2.1-4.5)), arthritis (1.8 (1.3-2.6)), mobility impairment (2.0 (1.4-2.9)) and previous foot ulcer (5.4 (2.9-10.0)). Attending a podiatrist was independently associated with female gender (2.6 (1.7-3.9)), increasing years of age (1.06 (1.04-1.08)), diabetes (5.0 (3.2-7.9)), arthritis (2.0 (1.3-3.0)), hypertension (1.7 (1.1-2.6)) and previous foot ulcer (4.5 (2.4-8.1)). Attending a GP was independently associated with having a foot ulcer (10.4 (5.6-19.2)). Conclusions Promisingly, these findings indicate that people with a diagnosis of diabetes or arthritis are more likely to attend health professionals for foot care. However, it also appears that those with active foot complications, or significant risk factors, may not be more likely to receive the multi-disciplinary foot care recommended by guidelines. More concerted efforts are required to ensure all people with foot complications receive the recommended foot care.
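The odds ratios above come from multivariate logistic regression. As a simpler hedged illustration, an unadjusted OR with a Woolf 95% CI can be computed from a 2×2 table; the counts below are hypothetical and will not reproduce the paper's adjusted estimates:

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted OR and Woolf 95% CI for a 2x2 table:
    a = exposed & attended,   b = exposed & did not attend,
    c = unexposed & attended, d = unexposed & did not attend."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: attendance for foot care by diabetes status.
or_, (lo, hi) = odds_ratio(a=90, b=82, c=166, d=395)
```

An adjusted OR from a stepwise multivariate model additionally controls for the other covariates retained in the model, so it generally differs from this crude table-based estimate.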


Diffuse large B-cell lymphoma (DLBCL) is the most common of the non-Hodgkin lymphomas. As DLBCL is characterized by heterogeneous clinical and biological features, its prognosis varies. To date, the International Prognostic Index has been the strongest predictor of outcome for DLBCL patients. However, it takes no biological characteristics of the disease into account. Gene expression profiling studies have identified two major cell-of-origin phenotypes in DLBCL with different prognoses: the favourable germinal centre B-cell-like (GCB) and the unfavourable activated B-cell-like (ABC) phenotypes. However, results on the prognostic impact of the immunohistochemically defined GCB versus non-GCB distinction are controversial. Furthermore, since the addition of the CD20 antibody rituximab to chemotherapy has been established as the standard treatment of DLBCL, all molecular markers need to be re-evaluated in the post-rituximab era. In this study, we aimed to evaluate the predictive value of immunohistochemically defined cell-of-origin classification in DLBCL patients. The GCB and non-GCB phenotypes were defined according to the Hans algorithm (CD10, BCL6 and MUM1/IRF4) among 90 immunochemotherapy- and 104 chemotherapy-treated DLBCL patients. In the chemotherapy group, we observed a significant difference in survival between GCB and non-GCB patients, with a good and a poor prognosis, respectively. However, in the rituximab group, no prognostic value of the GCB phenotype was observed. Likewise, among 29 high-risk de novo DLBCL patients receiving high-dose chemotherapy and autologous stem cell transplantation, the survival of non-GCB patients was improved, and no difference in outcome was seen between the GCB and non-GCB subgroups. Since the results suggested that the Hans algorithm was not applicable to immunochemotherapy-treated DLBCL patients, we aimed to focus further on algorithms based on ABC markers.
We examined a modified activated B-cell-like algorithm (based on MUM1/IRF4 and FOXP1), as well as the previously reported Muris algorithm (BCL2, CD10 and MUM1/IRF4), among 88 DLBCL patients uniformly treated with immunochemotherapy. Both algorithms distinguished the unfavourable ABC-like subgroup, with a significantly inferior failure-free survival relative to the GCB-like DLBCL patients. Similarly, the results for the individual predictive molecular markers, the transcription factor FOXP1 and the anti-apoptotic protein BCL2, have been inconsistent and should be assessed in immunochemotherapy-treated DLBCL patients. These markers were evaluated in a cohort of 117 patients treated with rituximab and chemotherapy. FOXP1 expression could not distinguish between patients with favourable and those with poor outcomes. In contrast, BCL2-negative DLBCL patients had significantly superior survival relative to BCL2-positive patients. Our results indicate that the immunohistochemically defined cell-of-origin classification in DLBCL has a prognostic impact in the immunochemotherapy era when the identifying algorithms are based on ABC-associated markers. We also propose that BCL2 negativity is predictive of a favourable outcome. Further investigational efforts are, however, warranted to identify the molecular features of DLBCL that could enable individualized cancer therapy in routine patient care.
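The Hans algorithm referred to above is a short immunohistochemical decision tree. A sketch of the published rule, assuming each marker has already been dichotomized at the usual 30% staining threshold:

```python
def hans_phenotype(cd10_pos, bcl6_pos, mum1_pos):
    """Cell-of-origin call from the Hans decision tree (CD10, BCL6, MUM1)."""
    if cd10_pos:
        return "GCB"        # CD10+ tumours are called GCB outright
    if not bcl6_pos:
        return "non-GCB"    # CD10- BCL6- -> non-GCB
    return "non-GCB" if mum1_pos else "GCB"  # CD10- BCL6+ split on MUM1

# CD10-negative, BCL6-positive, MUM1-positive -> non-GCB (ABC-like)
phenotype = hans_phenotype(False, True, True)  # "non-GCB"
```

The modified ABC-marker algorithms discussed in the abstract replace this tree with rules built on MUM1/IRF4 and FOXP1 (or BCL2), which is what restored prognostic separation in the immunochemotherapy cohort.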


The purpose of this study was to estimate the prevalence and distribution of reduced visual acuity, major chronic eye diseases, and the subsequent need for eye care services in the Finnish adult population, comprising persons aged 30 years and older. In addition, we analyzed the effect of decreased vision on functioning and the need for assistance, using the World Health Organization's (WHO) International Classification of Functioning, Disability, and Health (ICF) as a framework. The study was based on the Health 2000 health examination survey, a nationally representative, population-based comprehensive survey of health and functional capacity carried out in 2000 to 2001 in Finland. The study sample, representing the Finnish population aged 30 years and older, was drawn by two-stage stratified cluster sampling. The Health 2000 survey included a home interview and a comprehensive health examination conducted at a nearby screening center. If the invited participants did not attend, an abridged examination was conducted at home or in an institution. Based on our findings, the great majority (96%) of Finnish adults had at least moderate visual acuity (VA ≥ 0.5) with their current refraction correction, if any. However, in the age group 75–84 years the prevalence decreased to 81%, and after 85 years to 46%. In the population aged 30 years and older, the prevalence of habitual visual impairment (VA ≤ 0.25) was 1.6%, and 0.5% were blind (VA < 0.1). The prevalence of visual impairment increased significantly with age (p < 0.001), and after the age of 65 years the increase was sharp. Visual impairment was equally common in both sexes (OR 1.20, 95% CI 0.82 – 1.74). Based on self-reported and/or register-based data, the estimated total prevalences of cataract, glaucoma, age-related maculopathy (ARM), and diabetic retinopathy (DR) in the study population were 10%, 5%, 4%, and 1%, respectively. The prevalence of all of these chronic eye diseases increased with age (p < 0.001).
Cataract and glaucoma were more common in women than in men (OR 1.55, 95% CI 1.26 – 1.91 and OR 1.57, 95% CI 1.24 – 1.98, respectively). The most prevalent eye diseases in people with visual impairment (VA ≤ 0.25) were ARM (37%), unoperated cataract (27%), glaucoma (22%), and DR (7%). More than half (58%) of visually impaired people had had a vision examination during the past five years, and 79% had received some vision rehabilitation services, mainly in the form of spectacles (70%). Only one-third (31%) had received formal low vision rehabilitation (i.e., fitting of low vision aids, patient education, training for orientation and mobility, training for activities of daily living (ADL), or consultation with a social worker). People with low vision (VA 0.1 – 0.25) were less likely to have received formal low vision rehabilitation, magnifying glasses, or other low vision aids than blind people (VA < 0.1). Furthermore, low cognitive capacity and living in an institution were associated with limited use of vision rehabilitation services. Of the visually impaired living in the community, 71% reported a need for assistance, and 24% had an unmet need for assistance in everyday activities. The prevalence of limitations in ADL, instrumental activities of daily living (IADL), and mobility increased with decreasing VA (p < 0.001). Visually impaired persons (VA ≤ 0.25) were four times more likely to have ADL disabilities than those with good VA (VA ≥ 0.8) after adjustment for sociodemographic and behavioral factors and chronic conditions (OR 4.36, 95% CI 2.44 – 7.78). Limitations in IADL and measured mobility were five times as likely (OR 4.82, 95% CI 2.38 – 9.76 and OR 5.37, 95% CI 2.44 – 7.78, respectively), and self-reported mobility limitations were three times as likely (OR 3.07, 95% CI 1.67 – 9.63), as in persons with good VA.
The high prevalence of age-related eye diseases and subsequent visual impairment in the fastest growing segment of the population will result in a substantial increase in the demand for eye care services in the future. Many of the visually impaired, especially older persons with decreased cognitive capacity or living in an institution, have not had a recent vision examination and lack adequate low vision rehabilitation. This highlights the need for regular evaluation of visual function in the elderly and an active dissemination of information about rehabilitation services. Decreased VA is strongly associated with functional limitations, and even a slight decrease in VA was found to be associated with limited functioning. Thus, continuous efforts are needed to identify and treat eye diseases to maintain patients’ quality of life and to alleviate the social and economic burden of serious eye diseases.


Intensive care should be provided to patients who benefit from it, in an ethical, efficient, effective and cost-effective manner. This implies long-term qualitative and quantitative analysis of intensive care procedures and related resources. The study population consists of 2709 patients treated in the general intensive care unit (ICU) of Helsinki University Hospital. The study investigates intensive care patients' mortality, quality of life (QOL), Quality-Adjusted Life-Years (QALY units), and factors related to severity of illness, length of stay (LOS), patient's age, and evaluation period, as well as experiences and memories connected with the ICU episode. In addition, the study examines the properties of two QOL measures, the RAND 36-Item Health Survey 1.0 (RAND-36) and the five-dimension EuroQol (EQ-5D), and assesses the correlation of their results. Patients treated in 1995 responded to the RAND-36 questionnaire in 1996. All patients treated from 1995 to 2000 received a QOL questionnaire in 2001, when 1–7 years had elapsed since the intensive treatment. The response rate was 79.5%. Main results: 1) Of the patients who died within the first year (n = 1047), 66% died during the intensive care period or within the following month. The non-survivors were older than the survivors, generally had higher than average APACHE II and SOFA scores depicting severity of illness, and their ICU LOS was longer and hospital stay shorter than those of the survivors (p < 0.001). Mortality of patients receiving conservative treatment was higher than that of those receiving surgical treatment. Patients replying to the QOL survey in 2001 (n = 1099) had recovered well: 97% of them lived at home. More than half considered their QOL good or extremely good, 40% satisfactory, and 7% bad. All QOL indices of those of working age were considerably lower (p < 0.001) than the comparable figures for the age- and gender-adjusted Finnish population. 
The 5-year monitoring period made evident that mental recovery was slower than physical recovery. 2) The results of the RAND-36 and EQ-5D correlated well (p < 0.01). The RAND-36 profile measure distinguished more clearly between the different categories of QOL and their levels. The EQ-5D measured the patient group's general QOL well, and its sum index was used to calculate QALY units. 3) QALY units were calculated by multiplying the time the patient survived after the ICU stay, or the expected life-years, by the EQ-5D sum index. Aging automatically lowers the number of QALY units. Patients under the age of 65 receiving conservative treatment benefited from treatment to a greater extent, measured in QALY units, than their peers receiving surgical treatment, but in the age group 65 and over, patients with surgical treatment received higher QALY ratings than recipients of conservative treatment. 4) The intensive care experience and QOL ratings were connected. The QOL indices were significantly highest for patients who remembered intensive care as a positive experience, although their illness requiring intensive care was less severe than average. No statistically significant differences were found in the QOL indices of those with negative memories, no memories, or those who did not describe the quality of their experiences.
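The QALY calculation described above is a simple product; a minimal sketch (the function name and the [0, 1] index range are assumptions here, since some EQ-5D value sets also allow negative indices for states considered worse than death):

```python
def qaly(years_survived, eq5d_index):
    """QALY units as described: time survived after the ICU stay
    (or expected life-years) multiplied by the EQ-5D sum index,
    assumed scaled so that 1.0 = full health and 0.0 = death."""
    if not 0.0 <= eq5d_index <= 1.0:
        raise ValueError("EQ-5D index expected in [0, 1]")
    return years_survived * eq5d_index
```

For example, five years survived at an index of 0.8 yields 4.0 QALY units; fewer expected life-years (i.e., higher age) lowers the result for the same index, which is why aging automatically lowers the QALY count.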


Some perioperative clinical factors related to primary cemented arthroplasty for osteoarthritis of the hip or knee joint are studied and discussed in this thesis. In a randomized, double-blind study, 39 patients were divided into two groups: one receiving tranexamic acid and the other not. Tranexamic acid was given in a dose of 10 mg/kg before the operation and twice thereafter, at 8-hour intervals. Total blood loss was smaller in the tranexamic acid group than in the control group. No thromboembolic complications were noticed. In a prospective, randomized study, 58 patients with hip arthroplasty and 39 patients with knee arthroplasty were divided into groups with and without postoperative closed-suction drainage. There was no difference in wound healing, postoperative blood transfusions, complications, or range of motion. As a result of this study, the use of drains is no longer recommended. In a randomized study, the effectiveness of a femoral nerve block (25 patients) was compared with other methods of pain control (24 patients) on the first postoperative day after total knee arthroplasty. The femoral block consisted of a single injection administered at the patient's bedside during the surgeon's hospital rounds. Femoral block patients reported less pain and required half the amount of oxycodone. Additional femoral block or continued epidural analgesia was required more frequently by the control group patients. Pain management with femoral blocks resulted in less work for the nursing staff. In a retrospective study of 422 total hip and knee arthroplasty cases, the C-reactive protein levels and clinical course were examined. After hip and knee arthroplasty, the maximal C-reactive protein values are seen on the second and third postoperative days, after which the level decreases rapidly. There is no difference between patients with cemented or uncemented prostheses. 
Major postoperative complications may cause a further increase in C-reactive protein levels at one and two weeks. In-hospital and outpatient postoperative control radiographs of 200 hip and knee arthroplasties were reviewed retrospectively. If the postoperative radiographs are of good quality, there seems to be no need for early repetitive radiographs. The quality and safety of follow-up are not compromised by limiting follow-up radiographs to those with clinical indications, and the exposure of patients and staff to radiation is reduced. Reading of the radiographs by the treating orthopaedic surgeon alone is sufficient. These factors may seem separate from each other, but linking them together may help the treating orthopaedic surgeon develop an adequate patient care strategy. Notable savings can be achieved.


The Vantaa Primary Care Depression Study (PC-VDS) is a naturalistic and prospective cohort study concerning primary care patients with depressive disorders. It forms a collaborative research project between the Department of Mental and Alcohol Research of the National Public Health Institute and the Primary Health Care Organization of the City of Vantaa. The aim is to obtain a comprehensive view of clinically significant depression in primary care, and to compare depressive patients in primary care and in secondary-level psychiatric care in terms of clinical characteristics. Consecutive patients (N=1111) in three primary care health centres were screened for depression with the PRIME-MD, and positive cases were interviewed by telephone. Cases with current depressive symptoms were diagnosed face-to-face with the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I/P). A cohort of 137 patients with unipolar depressive disorders, comprising all patients with at least two depressive symptoms and clinically significant distress or disability, was recruited. The Structured Clinical Interview for DSM-IV Axis II Disorders (SCID-II), medical records, rating scales, interview and a retrospective life-chart were used to obtain comprehensive cross-sectional and retrospective longitudinal information. For investigation of suicidal behaviour, the Scale for Suicidal Ideation (SSI), patient records and the interview were used. The methodology was designed to be comparable to The Vantaa Depression Study (VDS) conducted in secondary-level psychiatric care. Major depressive disorder (MDD) patients aged 20-59 from primary care in PC-VDS (N=79) were compared with new psychiatric outpatients (N=223) and inpatients (N=46) in VDS. The PC-VDS cohort was prospectively followed up at 3, 6 and 18 months. Altogether 123 patients (90%) completed the follow-up. Duration of the index episode and the timing of relapses or recurrences were examined using a life-chart. 
The retrospective investigation revealed current MDD in most (66%), and lifetime MDD in nearly all (90%) cases of clinically significant depressive syndromes. Two thirds of the “subsyndromal” cases had a history of major depressive episode (MDE), although they were currently either in partial remission or a potential prodromal phase. Recurrences and chronicity were common. The picture of depression was complicated by Axis I co-morbidity in 59%, Axis II in 52% and chronic Axis III disorders in 47%; only 12% had no co-morbidity. Within their lifetimes, one third (37%) had seriously considered suicide, and one sixth (17%) had attempted it. Suicidal behaviour clustered in patients with moderate to severe MDD, co-morbidity with personality disorders, and a history of treatment in psychiatric care. The majority had received treatment for depression, but suicidal ideation had mostly remained unrecognised. The comparison of patients with MDD in primary care to those in psychiatric care revealed that the majority of suicidal or psychotic patients were receiving psychiatric treatment, and the patients with the most severe symptoms and functional limitations were hospitalized. In other clinical aspects, patients with MDD in primary care were surprisingly similar to psychiatric outpatients. Mental health contacts earlier in the current MDE were common among primary care patients. The 18-month prospective investigation with a life-chart methodology verified the chronic and recurrent nature of depression in primary care. Only one-quarter of patients with MDD achieved and maintained full remission during the follow-up, while another quarter failed to remit at all. The remaining patients suffered either from residual symptoms or recurrences. While severity of depression was the strongest predictor of recovery, presence of co-morbid substance use disorders, chronic medical illness and cluster C personality disorders all contributed to an adverse outcome. 
In clinical decision making, besides severity of depression and co-morbidity, a history of previous MDD should not be ignored by primary care doctors: depression in primary care is usually severe enough to indicate at least follow-up and, for those with residual symptoms, evaluation of their current treatment. Moreover, recognition of suicidal behaviour among depressed patients should be improved. In order to improve the outcome of depression in primary care, its often chronic and recurrent nature should be taken into account in organizing care. According to the literature, chronic disease management programs that enhance the role of case managers and integrate primary and specialist care more closely have been successful. Optimal ways of allocating resources between treatment providers as well as within health centres should be sought.


Technological development of fast multi-sectional, helical computed tomography (CT) scanners has allowed the use of computed tomography perfusion (CTp) and angiography (CTA) in evaluating acute ischemic stroke. This study focuses on new multidetector computed tomography techniques, namely whole-brain and first-pass CT perfusion plus CTA of the carotid arteries. Whole-brain CTp data are acquired during slow infusion of contrast material to achieve constant contrast concentration in the cerebral vasculature. From these data, quantitative maps of perfused cerebral blood volume (pCBV) are constructed. The probability curve of cerebral infarction as a function of normalized pCBV was determined in patients with acute ischemic stroke. Normalized pCBV, expressed as a percentage of contralateral normal brain pCBV, was determined in the infarction core and in regions just inside and outside the boundary between infarcted and noninfarcted brain. Corresponding probabilities of infarction were 0.99, 0.96, and 0.11, R² was 0.73, and differences in perfusion between the core and the inner and outer bands were highly significant. Thus a probability of infarction curve can help predict the likelihood of infarction as a function of percentage normalized pCBV. First-pass CT perfusion is based on continuous cine imaging over a selected brain area during a bolus injection of contrast. During its first passage, contrast material compartmentalizes in the intravascular space, resulting in transient tissue enhancement. Functional maps of cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) are then constructed. We compared the effects of three different iodine concentrations (300, 350, or 400 mg/mL) on peak enhancement of normal brain tissue, artery, and vein, stratified by region-of-interest (ROI) location, in 102 patients within 3 hours of stroke onset. 
Monotonically increasing peak opacification was evident at all ROI locations, suggesting that CTp evaluation of patients with acute stroke is best performed with the highest available concentration of contrast agent. In another study, we investigated whether lesion volumes on CBV, CBF, and MTT maps within 3 hours of stroke onset predict final infarct volume, and whether all these parameters are needed for triage to intravenous recombinant tissue plasminogen activator (IV-rtPA). The effect of IV-rtPA on the affected brain was also investigated by measuring salvaged tissue volume in patients receiving IV-rtPA and in controls. CBV lesion volume did not necessarily represent dead tissue. MTT lesion volume alone can serve to identify the upper size limit of the abnormally perfused brain, and patients receiving IV-rtPA salvaged more brain tissue than did controls. Carotid CTA was compared with carotid digital subtraction angiography (DSA) in grading of stenosis in patients with stroke symptoms. In CTA, the grade of stenosis was determined by means of axial source and maximum intensity projection (MIP) images as well as semiautomatic vessel analysis. CTA provides an adequate, less invasive alternative to conventional DSA, although it tends to underestimate clinically relevant grades of stenosis.
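The pCBV normalization described above (a region's pCBV as a percentage of contralateral normal brain) is straightforward arithmetic; a minimal sketch follows (function and dictionary names are illustrative, and the reported region probabilities are included only for reference, not as a fitted model):

```python
# Probabilities of infarction reported for the three sampled regions.
REPORTED_P_INFARCT = {
    "core": 0.99,
    "inner_boundary": 0.96,
    "outer_boundary": 0.11,
}

def normalized_pcbv(region_pcbv, contralateral_pcbv):
    """Express a region's pCBV as a percentage of the pCBV of
    contralateral normal brain, as in the study."""
    if contralateral_pcbv <= 0:
        raise ValueError("contralateral pCBV must be positive")
    return 100.0 * region_pcbv / contralateral_pcbv
```

The fitted probability-of-infarction curve itself (R² = 0.73) is not reproduced here, since its parameters are not given in the abstract.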


Juvenile idiopathic arthritis (JIA) is a heterogeneous group of childhood chronic arthritides, associated with chronic uveitis in 20% of cases. For JIA patients responding inadequately to conventional disease-modifying anti-rheumatic drugs (DMARDs), biologic therapies, namely anti-tumor necrosis factor (anti-TNF) agents, are available. This retrospective multicenter study included 258 JIA patients refractory to DMARDs who received biologic agents during 1999-2007. Prior to initiation of anti-TNFs, growth velocity was delayed in 75% and normal in 25% of the 71 patients assessed. Those with delayed growth demonstrated a significant increase in growth velocity after initiation of anti-TNFs. The increase in growth rate was unrelated to the pubertal growth spurt. No change was observed in skeletal maturation before and after anti-TNFs. The strongest predictor of change in growth velocity was growth rate prior to anti-TNFs. Change in inflammatory activity remained a significant predictor even after the decrease in glucocorticoids was taken into account. In JIA-associated uveitis, the impact of two first-line biologic agents, etanercept and infliximab, and of a second- or third-line anti-TNF agent, adalimumab, was evaluated. Of 108 refractory JIA patients receiving etanercept or infliximab, uveitis occurred in 45 (42%). Uveitis improved in 14 (31%), no change was observed in 14 (31%), and in 17 (38%) uveitis worsened. Uveitis improved more frequently (p=0.047) and the frequency of annual uveitis flares was lower (p=0.015) in those on infliximab than in those on etanercept. Of 20 patients taking adalimumab, 19 (95%) had previously failed etanercept and/or infliximab. In 7 patients (35%) uveitis improved, in one (5%) it worsened, and in 12 (60%) no change occurred. Those with improved uveitis were younger and had shorter disease duration. Serious adverse events (AEs) or side-effects were not observed. Adalimumab was also effective in arthritis. Long-term drug survival (i.e. 
continuation rate on drug) with etanercept (n=105) vs. infliximab (n=104) was 68% vs. 68% at 24 months, and 61% vs. 48% at 48 months (p=0.194 in log-rank analysis). The first-line anti-TNF agent was discontinued either due to inefficacy (etanercept 28% vs. infliximab 20%, p=0.445), AEs (7% vs. 22%, p=0.002), or inactive disease (10% vs. 16%, p=0.068). Females, patients with systemic JIA (sJIA), and those taking infliximab as the first therapy were at higher risk of treatment discontinuation. One-third switched to a second anti-TNF agent, which was discontinued less often than the first. In conclusion, in refractory JIA, anti-TNFs induced enhanced growth velocity. Four-year treatment survival was comparable between etanercept and infliximab, and switching from a first-line to a second-line agent is a reasonable therapeutic option. During anti-TNF treatment, uveitis improved in one-third of patients with JIA-associated anterior uveitis.
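Drug survival (continuation rate on drug) over time is a product-limit style estimate. As a minimal sketch only (the thesis used standard log-rank analysis, not this code; the function name and the follow-up data in the test are hypothetical), a basic Kaplan-Meier estimator can be written as:

```python
def kaplan_meier(times, events):
    """Product-limit estimate of drug survival.

    times:  follow-up duration for each patient (e.g. months);
    events: True if the drug was discontinued at that time,
            False if the observation was censored.
    Returns a list of (time, survival) points at each event time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        discontinued = 0   # events at time t
        leaving = 0        # everyone leaving the risk set at t
        while i < len(data) and data[i][0] == t:
            if data[i][1]:
                discontinued += 1
            leaving += 1
            i += 1
        if discontinued:
            survival *= 1 - discontinued / at_risk
            curve.append((t, survival))
        at_risk -= leaving
    return curve
```

With such a curve one reads off the continuation rate at fixed horizons (e.g. 24 and 48 months), as reported above for etanercept and infliximab.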


Drugs and surgical techniques may have harmful renal effects during the perioperative period. Traditional biomarkers are often insensitive to minor renal changes, but novel biomarkers may more accurately detect disturbances in glomerular and tubular function and integrity. The purpose of this study was, first, to evaluate the renal effects of ketorolac and clonidine during inhalation anesthesia with sevoflurane and isoflurane, and secondly, to evaluate the effect of tobacco smoking on the production of inorganic fluoride (F-) following enflurane and sevoflurane anesthesia, as well as to determine the effect of F- on renal function and cellular integrity in surgical patients. A total of 143 patients undergoing either conventional (n = 75) or endoscopic (n = 68) inpatient surgery were enrolled in four studies. The ketorolac and clonidine studies were prospective, randomized, placebo-controlled and double-blinded, while the cigarette smoking studies were prospective cohort studies with two parallel groups. As a sign of proximal tubular deterioration, a similar transient increase in urine N-acetyl-beta-D-glucosaminidase/creatinine (U-NAG/crea) was noted in both the ketorolac group and the controls (baseline vs. two hours of anesthesia, p = 0.015) during 3.3 minimum alveolar concentration (MAC) hours of sevoflurane anesthesia. Uncorrected U-NAG increased above the maximum concentration measured in healthy volunteers (6.1 units/L) in 5/15 patients with ketorolac and in none of the controls (p = 0.042). Likewise indicating proximal tubular deterioration, U-glutathione transferase-alpha/crea (U-GST-alpha/crea) increased in both groups at two hours after anesthesia, but a greater increase was noted in the patients with ketorolac. U-GST-alpha/crea increased above the maximum ratio measured in healthy volunteers in 7/15 patients with ketorolac and in 3/15 controls. 
Clonidine diminished the activation of the renin-angiotensin-aldosterone system during pneumoperitoneum; urine output was better preserved in the patients treated with clonidine (1/15 developed oliguria) than in the controls (8/15 developed oliguria; p = 0.005). Most patients with pneumoperitoneum and isoflurane anesthesia developed transient proximal tubular deterioration, as U-NAG increased above 6.1 units/L in 11/15 patients with clonidine and in 7/15 controls. In the patients receiving clonidine, the median U-NAG/crea was higher than in the controls at 60 minutes of pneumoperitoneum (p = 0.01), suggesting that clonidine may worsen proximal tubular deterioration. Smoking induced the metabolism of enflurane, but renal function remained intact in both the smokers and the non-smokers with enflurane anesthesia. In contrast, smoking did not induce sevoflurane metabolism, but glomerular function decreased in 4/25 non-smokers and in 7/25 smokers with sevoflurane anesthesia. All five patients with S-F- ≥ 40 micromol/L, but only 6/45 with S-F- less than 40 micromol/L (p = 0.001), developed an S-tumor-associated trypsin inhibitor concentration above 3 nmol/L as a sign of glomerular dysfunction. As a sign of proximal tubular deterioration, U-beta-2-microglobulin increased in 2/5 patients with S-F- over 40 micromol/L compared with 2/45 patients whose highest S-F- was less than 40 micromol/L (p = 0.005). To conclude, sevoflurane anesthesia may cause transient proximal tubular deterioration, which may be worsened by co-administration of ketorolac. Clonidine premedication prevents the activation of the renin-angiotensin-aldosterone system and preserves normal urine output, but may be harmful to the proximal tubules during pneumoperitoneum. Smoking induces the metabolism of enflurane but not that of sevoflurane. 
Serum F- concentrations of 40 micromol/L or higher may induce glomerular dysfunction and proximal tubular deterioration in patients undergoing sevoflurane anesthesia. The novel renal biomarkers warrant further study to establish reference values for surgical patients undergoing inhalation anesthesia.
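The creatinine-corrected biomarker ratios used throughout (e.g. U-NAG/crea) and the comparisons against a healthy-volunteer maximum amount to simple arithmetic. A hedged sketch (function names and units are illustrative, not from the thesis; the 6.1 units/L threshold is the value quoted above for uncorrected U-NAG):

```python
def creatinine_corrected(biomarker, urine_creatinine):
    """Crude creatinine correction of a urinary biomarker
    (e.g. U-NAG/crea) to adjust for urine dilution.
    Units follow the assay, e.g. units/L over mmol/L."""
    if urine_creatinine <= 0:
        raise ValueError("urine creatinine must be positive")
    return biomarker / urine_creatinine

def exceeds_reference(value, reference_max):
    """Flag a value above the maximum measured in healthy
    volunteers (the study quotes 6.1 units/L for U-NAG)."""
    return value > reference_max
```

Such correction matters because urinary biomarker concentrations vary with urine dilution; dividing by urine creatinine yields a ratio that is more comparable across samples.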