830 results for high risk population

Relevance: 90.00%

Abstract:

This study aimed to examine the incidence of young adult-onset T1DM and T2DM among Finns, and to explore the possible risk factors for young adult-onset T1DM and T2DM that occur during the perinatal period and childhood. In Studies I-II, the incidence of diabetes was examined among 15-39-year-old Finns during the years 1992-2001. Information on new diagnoses of diabetes was collected from four sources: standardized national reports filled in by diabetes nurses, the Hospital Discharge Register, the Drug Reimbursement Register, and the Drug Prescription Register. The type of diabetes was assigned using information obtained from these four data sources. The incidence of T1DM was 18 per 100,000/year, and there was a clear male predominance in the incidence of T1DM. The incidence of T1DM increased on average 3.9% per year during 1992-2001. The incidence of T2DM was 13 per 100,000/year, and it displayed an increase of 4.3% per year. In Studies III-V, the effects of perinatal exposures and childhood growth on the risk for young adult-onset T1DM and T2DM were explored in a case-control setting. Individuals diagnosed with T1DM (n=1,388) and T2DM (n=1,121) during the period 1992-1996 were chosen as the diabetes cases for the study, and two controls were chosen for each case from the National Population Register. Data on the study subjects' parents and siblings were obtained from the National Population Register. The study subjects' original birth records and child welfare clinic records were traced nationwide. The risk for young adult-onset T2DM was lowest among the offspring of mothers aged about 30 years, whereas the risk for T2DM increased towards both younger and older maternal ages. Birth orders second to fourth were found to be protective against T2DM. In addition, the risk for T2DM was observed to decrease with increasing birth weight until 4.2 kg, after which the risk began to increase. 
A high body mass index (BMI) at the BMI rebound between ages 3-11 years substantially increased the risk for T2DM, and the excess weight gain in individuals diagnosed with T2DM began in early childhood. Maternal age, birth order, or body size at birth had no effect on the risk for young adult-onset T1DM. Instead, individuals with T1DM were observed to have a higher maximum BMI before the age of 3 than their control subjects. In conclusion, the increasing trend in the development of both T1DM and T2DM among young Finnish adults is alarming. The high risk for T1DM among the Finnish population extends to at least 40 years of age, and at least 200-300 young Finnish adults are diagnosed with T2DM every year. Growth during the fetal period and childhood notably affects the risk for T2DM. T2DM prevention should also target childhood obesity. Rapid growth during the first years of life may be a risk factor for late-onset T1DM.
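The abstract's headline figures combine an annual incidence rate with an average yearly percent change. A minimal sketch of that arithmetic, using the reported T1DM figures (18 per 100,000/year, +3.9% per year over the 9-year span 1992-2001); the case and person-year counts are hypothetical illustrations, not the study's data:

```python
def incidence_per_100k(new_cases: int, person_years: float) -> float:
    """Crude incidence rate per 100,000 person-years."""
    return new_cases / person_years * 100_000

def project_incidence(base_rate: float, annual_change: float, years: int) -> float:
    """Project a rate forward assuming a constant average annual percent change."""
    return base_rate * (1 + annual_change) ** years

# 270 hypothetical new T1DM cases in 1.5 million person-years -> 18 per 100,000
rate = incidence_per_100k(270, 1_500_000)

# T1DM incidence after the 9 years from 1992 to 2001 at +3.9% per year
projected = project_incidence(18.0, 0.039, 9)
```

Under constant compounding, the 1992 rate of 18 grows to roughly 25 per 100,000/year by 2001, consistent with the "clear increase" the abstract describes.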

Abstract:

The introduction of glyphosate-tolerant cotton has significantly improved the flexibility and management of a number of problem weeds in cotton systems. However, reliance on glyphosate poses risks to the industry in terms of glyphosate resistance and species shift. The aims of this project were to identify these risks, and to determine strategies to prevent and mitigate the potential for resistance evolution. Field surveys identified fleabane as now the most common weed in both irrigated and dryland systems. Sowthistle has also increased in prevalence, and bladder ketmia and peachvine remained common. The continued reliance on glyphosate has favoured small-seeded and glyphosate-tolerant species. Fleabane is both, with populations confirmed resistant in grains systems in Queensland and NSW. When species were assessed for their resistance risk, fleabane, liverseed grass, feathertop Rhodes grass, sowthistle and barnyard grass were determined to have high risk ratings. Management practices were also determined to rely heavily on glyphosate and therefore to be high risk in summer fallows and in dryland glyphosate-tolerant and conventional cotton. Situations where these high-risk species are present in high-risk cropping phases need particular attention. The confirmation of a glyphosate-resistant barnyard grass population in a dryland glyphosate-tolerant cotton system means resistance is now a reality for the cotton industry. However, experiments have shown that resistant populations can be managed with other herbicide options currently available. In contrast, the options for fleabane management in cotton are still limited. Although some selective residual herbicides are showing promise, the majority of fleabane control tactics can only be used in other phases of the cotton rotation. An online glyphosate resistance tool has been developed. 
This tool allows growers to assess their individual glyphosate resistance risks and shows how they can adjust their practices to reduce them. It also provides researchers with current information on the weed species present and the practices used across the industry. This tool will be extremely useful in tailoring future research and extension efforts. Simulations from the expanded glyphosate resistance model have shown that glyphosate resistance can be prevented and managed in glyphosate-tolerant cotton farming systems, although successful strategies require sustained effort. Simulations have shown the importance of controlling survivors of glyphosate applications, using effective glyphosate alternatives in fallows, and combining several effective glyphosate alternatives in crop; these tactics are the key to the prevention and management of glyphosate resistance.
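The kind of dynamic the resistance model simulates can be sketched in a few lines: a tiny resistant fraction of a weed seedbank is enriched by repeated glyphosate selection, and controlling survivors before seed set slows that enrichment. This is a toy model under stated assumptions (germination, carryover, kill and fecundity values are invented), not the project's actual model:

```python
def seasons_to_dominance(r_bank=1e-6, s_bank=1.0, germination=0.5,
                         carryover=0.5, glyphosate_kill=0.95,
                         survivor_control=0.0, fecundity=20,
                         threshold=0.5, max_seasons=50):
    """Seasons until resistant seed dominates the seedbank (> threshold)."""
    for season in range(1, max_seasons + 1):
        # resistant plants survive glyphosate; 5% of susceptibles escape it
        r_plants = r_bank * germination * (1 - survivor_control)
        s_plants = s_bank * germination * (1 - glyphosate_kill) * (1 - survivor_control)
        # next season's seedbank: ungerminated carryover + seed set by survivors
        r_bank = r_bank * (1 - germination) * carryover + r_plants * fecundity
        s_bank = s_bank * (1 - germination) * carryover + s_plants * fecundity
        if r_bank / (r_bank + s_bank) > threshold:
            return season
    return max_seasons

glyphosate_only = seasons_to_dominance(survivor_control=0.0)
with_survivor_control = seasons_to_dominance(survivor_control=0.9)
```

Even this crude model reproduces the abstract's key point: killing survivors of glyphosate applications delays resistance dominance, because survivors are enriched for resistance relative to the seedbank.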

Abstract:

Hereditary nonpolyposis colorectal cancer (HNPCC) and familial adenomatous polyposis (FAP) are characterized by a high risk and early onset of colorectal cancer (CRC). HNPCC is due to a germline mutation in one of the following MMR genes: MLH1, MSH2, MSH6 and PMS2. A majority of FAP and attenuated FAP (AFAP) cases are due to germline mutations of APC, causing the development of multiple colorectal polyps. To date, over 450 MMR gene mutations and over 800 APC mutations have been identified. Most of these mutations lead to a truncated protein, easily detected by conventional mutation detection methods. However, in about 30% of HNPCC and FAP families, and in about 90% of AFAP families, the mutations remain unknown. We aimed to clarify the genetic basis and genotype-phenotype correlation of mutation-negative HNPCC and FAP/AFAP families by advanced mutation detection methods designed to detect large genomic rearrangements, mRNA and protein expression alterations, promoter mutations, phenotype-linked haplotypes, and tumoral loss of heterozygosity. We also aimed to estimate the frequency of HNPCC in Uruguayan CRC patients. Our expression-based analysis of mutation-negative HNPCC divided these families into two categories: 1) 42% of families linked to the MMR genes, with a phenotype resembling that of mutation-positive families, and 2) 58% of families likely to be associated with other susceptibility genes. Unbalanced mRNA expression of MLH1 was observed in two families. Further studies revealed that an MLH1 nonsense mutation, R100X, was associated with aberrant splicing of exons not related to the mutation, and an MLH1 deletion (AGAA) at nucleotide 210 was associated with multiple exon skipping, without an overall increase in the frequency of splice events. APC mutation-negative FAP/AFAP families were divided into four groups according to the genetic basis of their predisposition. 
Four (14%) families displayed a constitutional deletion of APC with profuse polyposis, early age of onset and frequent extracolonic manifestations. Aberrant mRNA expression of one allele was observed in seven (24%) families with later onset and less frequent extracolonic manifestations. In 15 (52%) families the involvement of APC could neither be confirmed nor excluded. In three (10%) of the families a germline mutation was detected in genes other than APC: AXIN2 in one family, and MYH in two families. The families with an undefined genetic basis, and especially those with AXIN2 or MYH mutations, frequently displayed AFAP or atypical polyposis. Of the Uruguayan CRC patients, 2.6% (12/461) fulfilled the diagnostic criteria for HNPCC and 5.6% (26/461) were associated with an increased risk of cancer. The unexpectedly low frequency of molecularly defined HNPCC cases may suggest a different genetic profile in the Uruguayan population and the involvement of novel susceptibility genes. Accurate genetic and clinical characterization of families with hereditary colorectal cancers, and the definition of the genetic basis of "mutation negative" families in particular, facilitate proper clinical management of such families.

Abstract:

BACKGROUND: The ATM gene encoding a putative protein kinase is mutated in ataxia-telangiectasia (A-T), an autosomal recessive disorder with a predisposition for cancer. Studies of A-T families suggest that female heterozygotes have an increased risk of breast cancer compared with noncarriers. However, neither linkage analyses nor mutation studies have provided supporting evidence for a role of ATM in breast cancer predisposition. Nevertheless, two recurrent ATM mutations, T7271G and IVS10-6T-->G, reportedly increase the risk of breast cancer. We examined these two ATM mutations in a population-based case-control series and in multiple-case breast cancer families. METHODS: Five hundred twenty-five or 262 case patients with breast cancer and 381 or 68 control subjects, respectively, were genotyped for the T7271G and IVS10-6T-->G ATM mutations, as were index patients from 76 non-BRCA1/2 multiple-case breast cancer families. Linkage and penetrance were analyzed. ATM protein expression and kinase activity were analyzed in lymphoblastoid cell lines from mutation carriers. All statistical tests were two-sided. RESULTS: In case and control subjects unselected for family history of breast cancer, one case patient had the T7271G mutation, and none had the IVS10-6T-->G mutation. In three multiple-case families, one of these two mutations segregated with breast cancer. The estimated average penetrance of the mutations was 60% (95% confidence interval [CI] = 32% to 90%) to age 70 years, equivalent to a 15.7-fold (95% CI = 6.4-fold to 38.0-fold) increased relative risk compared with that of the general population. Expression and activity analyses of ATM in heterozygous cell lines indicated that both mutations are dominant negative. CONCLUSION: At least two ATM mutations are associated with a sufficiently high risk of breast cancer to be found in multiple-case breast cancer families. 
Full mutation analysis of the ATM gene in such families could help clarify the role of ATM in breast cancer susceptibility.
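The abstract's penetrance and relative-risk figures are linked by simple arithmetic: a 60% cumulative risk to age 70 in carriers, stated to equal a 15.7-fold relative risk, implies a general-population cumulative risk of roughly 3.8%. That baseline is an inference here, not a number reported in the abstract:

```python
def relative_risk(carrier_risk: float, population_risk: float) -> float:
    """Ratio of cumulative risk in carriers to that in the general population."""
    return carrier_risk / population_risk

# implied general-population cumulative risk of breast cancer to age 70 (~3.8%)
implied_population_risk = 0.60 / 15.7

# consistency check: recovering the reported 15.7-fold relative risk
rr = relative_risk(0.60, implied_population_risk)
```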

Abstract:

Given the limited resources available for weed management, a strategic approach is required to give the best bang for your buck. The current study incorporates: (1) a model ensemble approach to identify areas of uncertainty and commonality regarding a species' invasive potential, (2) the current distribution of the invading species, and (3) the connectivity of systems, to identify target regions and focus efforts for more effective management. Uncertainty in the prediction of suitable habitat for H. amplexicaulis (the study species) in Australia was addressed in an ensemble-forecasting approach to compare distributional scenarios from four models (CLIMATCH; CLIMEX; boosted regression trees [BRT]; maximum entropy [Maxent]). Models were built using subsets of occurrence and environmental data. Catchment risk was determined by incorporating habitat suitability, the current abundance and distribution of H. amplexicaulis, and catchment connectivity. Our results indicate geographic differences between the predictions of the different approaches. Despite these differences, a number of catchments in northern, central, and southern Australia were identified by all models as at high risk of invasion or further spread, suggesting they should be given priority for the management of H. amplexicaulis. The study also highlighted the utility of ensemble approaches in identifying areas of uncertainty and commonality regarding the species' invasive potential.
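The ensemble logic above reduces to a consensus rule: catchments every model flags as suitable are high priority, while catchments the models disagree on mark areas of uncertainty. A minimal sketch; the model names match the abstract, but the catchment labels and predictions are invented for illustration:

```python
# model -> {catchment: predicted suitable?}  (hypothetical predictions)
suitability = {
    "CLIMATCH": {"A": True, "B": True,  "C": False},
    "CLIMEX":   {"A": True, "B": False, "C": False},
    "BRT":      {"A": True, "B": True,  "C": True},
    "Maxent":   {"A": True, "B": True,  "C": False},
}

catchments = {"A", "B", "C"}

# flagged suitable by every model: consensus high risk
consensus_high_risk = {c for c in catchments
                       if all(preds[c] for preds in suitability.values())}

# flagged by at least one but not all models: areas of uncertainty
disputed = {c for c in catchments
            if any(preds[c] for preds in suitability.values())
            and c not in consensus_high_risk}
```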

Abstract:

Objective: To investigate the epidemic characteristics of human cutaneous anthrax (CA) in China, detect spatiotemporal clusters at the county level for preemptive public health interventions, and evaluate the differences in the epidemiological characteristics within and outside clusters. Methods: CA cases reported during 2005–2012 from the national surveillance system were evaluated at the county level using a space-time scan statistic. Comparative analysis of the epidemic characteristics within and outside the identified clusters was performed using the χ2 test or Kruskal-Wallis test. Results: The 30–39-year age group had the highest incidence of CA, and the fatality rate increased with age, with persons ≥70 years showing a fatality rate of 4.04%. Seasonality analysis showed that most CA cases occurred between May/June and September/October of each year. The primary spatiotemporal cluster contained 19 counties from June 2006 to May 2010, and it was mainly located straddling the borders of Sichuan, Gansu, and Qinghai provinces. In these high-risk areas, CA cases were predominantly found among younger, local, male shepherds living on agriculture and stockbreeding, and were characterized by high morbidity, low mortality and a shorter period from illness onset to diagnosis. Conclusion: CA was geographically and persistently clustered in Southwestern China during 2005–2012, with notable differences in the epidemic characteristics within and outside the spatiotemporal clusters; this demonstrates the necessity for CA interventions such as enhanced surveillance, health education, and mandatory and standard decontamination or disinfection procedures to be geographically targeted to the areas identified in this study.
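The within/outside-cluster comparison relies on the χ2 test; for a 2x2 table (e.g. a binary characteristic cross-tabulated against cluster membership) the Pearson statistic has a closed form, sketched below. The counts are hypothetical, not the study's data:

```python
def chi2_2x2(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-squared statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# hypothetical counts: male vs female cases, within vs outside clusters
#            male  female
# within      120      40
# outside      80      60
stat = chi2_2x2(120, 40, 80, 60)  # > 3.84 would be significant at alpha = 0.05, 1 df
```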

Abstract:

Objectives: We sought to characterise the demographics, length of admission, final diagnoses, long-term outcome and costs associated with the population who presented to an Australian emergency department (ED) with symptoms of possible acute coronary syndrome (ACS). Design, setting and participants: Prospectively collected data on ED patients presenting with suspected ACS between November 2008 and February 2011 were used, including data at presentation and at 30 days after presentation. Information on patient disposition, length of stay and costs incurred was extracted from hospital administration records. Main outcome measures: Primary outcomes were mean and median cost and length of hospital stay. Secondary outcomes were diagnosis of ACS, other cardiovascular conditions or non-cardiovascular conditions within 30 days of presentation. Results: An ACS was diagnosed in 103 (11.1%) of the 926 patients recruited. 193 patients (20.8%) were diagnosed with other cardiovascular-related conditions and 622 patients (67.2%) had non-cardiac-related chest pain. ACS events occurred in 0 and 11 (1.9%) of the low-risk and intermediate-risk groups, respectively. Ninety-two (28.0%) of the 329 high-risk patients had an ACS event. Patients with a proven ACS, high-grade atrioventricular block, pulmonary embolism and other respiratory conditions had the longest length of stay. The mean cost was highest in the ACS group ($13 509; 95% CI, $11 794–$15 223) followed by other cardiovascular conditions ($7283; 95% CI, $6152–$8415) and non-cardiovascular conditions ($3331; 95% CI, $2976–$3685). Conclusions: Most ED patients with symptoms of possible ACS do not have a cardiac cause for their presentation. The current guideline-based process of assessment is lengthy, costly and consumes significant resources. 
Investigation of strategies to shorten this process or reduce the need for objective cardiac testing in patients at intermediate risk according to the National Heart Foundation and Cardiac Society of Australia and New Zealand guideline is required.
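The cost results above are reported as means with 95% confidence intervals. A minimal sketch of the usual normal-approximation interval (mean ± 1.96 × standard error); the sample costs below are invented, not the study's data:

```python
import math

def mean_ci(values, z=1.96):
    """Sample mean with a normal-approximation 95% confidence interval."""
    n = len(values)
    mean = sum(values) / n
    # unbiased sample variance (n - 1 denominator)
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    half_width = z * math.sqrt(var / n)
    return mean, mean - half_width, mean + half_width

# hypothetical per-admission costs (AUD) for six patients
mean, lo, hi = mean_ci([9000, 12000, 15000, 13000, 18000, 14000])
```

With real study-sized samples (hundreds of patients) the interval narrows considerably, which is how the abstract can separate the ACS, other-cardiovascular and non-cardiovascular cost groups.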

Abstract:

The safety of food has become an issue of increasing interest to consumers and the media. It has also become a source of concern, as the amount of information on the risks related to food safety continues to expand. Today, risk and safety are permanent elements within the concept of food quality. Safety, in particular, is an attribute that consumers find very difficult to assess. The literature review in this study covers three main themes: traceability; consumer behaviour related to quality and safety issues and the perception of risk; and valuation methods. The empirical scope of the study was restricted to beef, because the beef labelling system enables reliable tracing of the origin of beef, as well as of attributes related to safety, environmental friendliness and animal welfare. The purpose of this study was to examine what kind of information flows are required to ensure quality and safety in the food chain for beef, and who should produce that information. Studying consumers' willingness to pay makes it possible to determine whether consumers consider the quantity of information available on the safety and quality of beef sufficient. One of the main findings of this study was that the majority of Finnish consumers (73%) regard increased quality information as beneficial. These benefits were assessed using the contingent valuation method. The results showed that those who were willing to pay for increased information on the quality and safety of beef would accept an average price increase of 24% per kilogram. The results showed that certain risk factors affect consumer willingness to pay: if the respondents considered genetic modification of food or foodborne zoonotic diseases to be harmful or extremely harmful risk factors in food, they were more likely to be willing to pay for quality information. The results produced by the models thus confirmed the premise that certain food-related risks affect willingness to pay for beef quality information. 
The results also showed that safety-related quality cues are significant to consumers. Above all, consumers would like to receive information on the control of zoonotic diseases that are contagious to humans. Other process-control-related information likewise ranked high among the top responses. Information on any potential genetic modification was also considered important, even though genetic modification was not regarded as a high risk factor.
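The abstract's two headline numbers can be combined into a population-average premium: 73% of consumers would pay for added quality information, and those willing accept a 24% price increase per kilogram on average. Treating non-payers as accepting a 0% increase is an assumption of this sketch, not a result from the study:

```python
share_willing = 0.73          # fraction of consumers willing to pay (abstract)
premium_if_willing = 0.24     # average accepted price increase per kg (abstract)

# expected premium averaged over all consumers, assuming non-payers accept 0%
population_average_premium = share_willing * premium_if_willing  # ~17.5%
```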

Abstract:

Trichinella surveillance in wildlife relies on muscle digestion of large samples which are logistically difficult to store and transport in remote and tropical regions as well as labour-intensive to process. Serological methods such as enzyme-linked immunosorbent assays (ELISAs) offer rapid, cost-effective alternatives for surveillance but should be paired with additional tests because of the high false-positive rates encountered in wildlife. We investigated the utility of ELISAs coupled with Western blot (WB) in providing evidence of Trichinella exposure or infection in wild boar. Serum samples were collected from 673 wild boar from a high- and low-risk region for Trichinella introduction within mainland Australia, which is considered Trichinella-free. Sera were examined using both an 'in-house' and a commercially available indirect-ELISA that used excretory secretory (E/S) antigens. Cut-off values for positive results were determined using sera from the low-risk population. All wild boar from the high-risk region (352) and 139/321 (43.3%) of the wild boar from the low-risk region were tested by artificial digestion. Testing by Western blot using E/S antigens, and a Trichinella-specific real-time PCR was also carried out on all ELISA-positive samples. The two ELISAs correctly classified all positive controls as well as one naturally infected wild boar from Gabba Island in the Torres Strait. In both the high- and low-risk populations, the ELISA results showed substantial agreement (k-value = 0.66) that increased to very good (k-value = 0.82) when WB-positive only samples were compared. The results of testing sera collected from the Australian mainland showed the Trichinella seroprevalence was 3.5% (95% C.I. 0.0-8.0) and 2.3% (95% C.I. 0.0-5.6) using the in-house and commercial ELISA coupled with WB respectively. These estimates were significantly higher (P < 0.05) than the artificial digestion estimate of 0.0% (95% C.I. 0.0-1.1). 
Real-time PCR testing of muscle from seropositive animals did not detect Trichinella DNA in any mainland animals, but did reveal the presence of a second larvae-positive wild boar on Gabba Island, supporting its utility as an alternative, highly sensitive method of muscle examination. The serology results suggest Australian wildlife may have been exposed to Trichinella parasites. However, because of the possibility of non-specific reactions with other parasitic infections, more work using well-defined cohorts of positive and negative samples is required. Even if the specificity of the ELISAs is proven to be low, their ability to correctly classify the small number of true positive sera in this study indicates utility in screening wild boar populations for reactive sera, which can be followed up with additional testing.
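Agreement between the two ELISAs is reported above as kappa values (0.66, rising to 0.82). A minimal Cohen's kappa from paired positive/negative calls is sketched here; the counts are invented for illustration, not the study's data:

```python
def cohens_kappa(both_pos: int, a_only: int, b_only: int, both_neg: int) -> float:
    """Cohen's kappa for two binary tests applied to the same samples."""
    n = both_pos + a_only + b_only + both_neg
    observed = (both_pos + both_neg) / n          # raw agreement
    p_a = (both_pos + a_only) / n                 # test A positive rate
    p_b = (both_pos + b_only) / n                 # test B positive rate
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)  # chance agreement
    return (observed - expected) / (1 - expected)

# hypothetical paired ELISA results for 200 sera
kappa = cohens_kappa(both_pos=20, a_only=5, b_only=5, both_neg=170)
```

Kappa discounts the agreement expected by chance, which matters here because most sera are negative and raw agreement alone would look deceptively high.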

Abstract:

Wheat is at peak quality soon after harvest. Subsequently, diverse biota use wheat as a resource in storage, including insects and mycotoxin-producing fungi. Transportation networks for stored grain are crucial to food security and provide a model system for an analysis of the population structure, evolution, and dispersal of biota in networks. We evaluated the structure of rail networks for grain transport in the United States and Eastern Australia to identify the shortest paths for the anthropogenic dispersal of pests and mycotoxins, as well as the major sources, sinks, and bridges for movement. We found important differences in the risk profile in these two countries and identified priority control points for sampling, detection, and management. An understanding of these key locations and roles within the network is a new type of basic research result in postharvest science and will provide insights for the integrated pest management of high-risk subpopulations, such as pesticide-resistant insect pests.
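Finding the shortest paths for anthropogenic dispersal through a rail network is a standard graph problem. A minimal unweighted shortest-path search (breadth-first) over a toy directed network is sketched below; the node names are hypothetical, not locations from the study:

```python
from collections import deque

def shortest_path(graph, start, goal):
    """Fewest-hops path in a directed graph given as {node: [neighbours]}."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # goal unreachable

# toy grain-movement network (hypothetical nodes)
rail = {
    "farm": ["silo_a", "silo_b"],
    "silo_a": ["port"],
    "silo_b": ["mill"],
    "mill": ["port"],
}
route = shortest_path(rail, "farm", "port")
```

On a real network the same idea extends to weighted edges (Dijkstra) and to identifying sources, sinks and bridges via node degree and betweenness, which is how priority control points for sampling and management fall out of the analysis.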

Abstract:

Background: Investigating population changes gives insight into the effectiveness of and need for prevention and rehabilitation services. Incidence rates of amputation are highly varied, making it difficult to meaningfully compare rates between studies and regions or to compare changes over time. Study design: Historical cohort study of transtibial amputation, knee disarticulation, and transfemoral amputation resulting from vascular disease or infection, with/without diabetes, in 2003-2004, in the three northern provinces of the Netherlands. Objectives: To report the incidence of first transtibial amputation, knee disarticulation, or transfemoral amputation in 2003-2004 and the characteristics of this population, and to compare these outcomes with an earlier reported cohort from 1991-1992. Methods: Population-based incidence rates were calculated per 100,000 person-years and compared across the two cohorts. Results: The incidence of amputation was 8.8 (all age groups) and 23.6 (≥45 years) per 100,000 person-years. This was unchanged from the earlier study of 1991-1992. The relative risk of amputation was 12 times greater for people with diabetes than for people without diabetes. Conclusions: Investigation is needed into the reasons for the unchanged incidence with respect to the provision of services from a range of disciplines, including vascular surgery, diabetes care, and multidisciplinary foot clinics. Clinical relevance: This study shows an unchanged incidence of amputation over time and a high risk of amputation related to diabetes. Given the increased prevalence of diabetes and population aging, both of which increase the population at risk of amputation, finding methods for reducing the rate of amputation is important.
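The two quantities the results hinge on are a person-years incidence rate and a relative risk between subgroups; the sketch below shows both, with event and person-year counts chosen as hypothetical illustrations that mirror the abstract's figures (8.8 per 100,000 person-years overall; a 12-fold risk for people with diabetes):

```python
def rate_per_100k(events: int, person_years: float) -> float:
    """Incidence rate per 100,000 person-years of follow-up."""
    return events / person_years * 100_000

# e.g. 150 first amputations over ~1.7 million person-years -> ~8.8 per 100,000
overall = rate_per_100k(150, 1_704_545)

# relative risk: rate in people with diabetes vs. without (hypothetical counts)
risk_ratio = rate_per_100k(60, 100_000) / rate_per_100k(80, 1_600_000)
```

Expressing rates per person-years rather than per head of population is what makes the comparison with the 1991-1992 cohort meaningful despite differing follow-up.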

Abstract:

Recommendations:

1. To identify a person with diabetes at risk for foot ulceration, examine the feet annually to seek evidence for signs or symptoms of peripheral neuropathy and peripheral artery disease. (GRADE strength of recommendation: strong; Quality of evidence: low)
2. In a person with diabetes who has peripheral neuropathy, screen for a history of foot ulceration or lower-extremity amputation, peripheral artery disease, foot deformity, pre-ulcerative signs on the foot, poor foot hygiene and ill-fitting or inadequate footwear. (Strong; Low)
3. Treat any pre-ulcerative sign on the foot of a patient with diabetes. This includes removing callus, protecting blisters and draining when necessary, treating ingrown or thickened toe nails, treating haemorrhage when necessary and prescribing antifungal treatment for fungal infections. (Strong; Low)
4. To protect their feet, instruct an at-risk patient with diabetes not to walk barefoot, in socks only, or in thin-soled standard slippers, whether at home or when outside. (Strong; Low)
5. Instruct an at-risk patient with diabetes to daily inspect their feet and the inside of their shoes, daily wash their feet (with careful drying, particularly between the toes), avoid using chemical agents or plasters to remove callus or corns, use emollients to lubricate dry skin and cut toe nails straight across. (Weak; Low)
6. Instruct an at-risk patient with diabetes to wear properly fitting footwear to prevent a first foot ulcer, either plantar or non-plantar, or a recurrent non-plantar foot ulcer. When a foot deformity or a pre-ulcerative sign is present, consider prescribing therapeutic shoes, custom-made insoles or toe orthosis. (Strong; Low)
7. To prevent a recurrent plantar foot ulcer in an at-risk patient with diabetes, prescribe therapeutic footwear that has a demonstrated plantar pressure-relieving effect during walking (i.e. 30% relief compared with plantar pressure in standard-of-care therapeutic footwear) and encourage the patient to wear this footwear. (Strong; Moderate)
8. To prevent a first foot ulcer in an at-risk patient with diabetes, provide education aimed at improving foot care knowledge and behaviour, as well as encouraging the patient to adhere to this foot care advice. (Weak; Low)
9. To prevent a recurrent foot ulcer in an at-risk patient with diabetes, provide integrated foot care, which includes professional foot treatment, adequate footwear and education. This should be repeated or re-evaluated once every 1 to 3 months as necessary. (Strong; Low)
10. Instruct a high-risk patient with diabetes to monitor foot skin temperature at home to prevent a first or recurrent plantar foot ulcer. This aims at identifying the early signs of inflammation, followed by action taken by the patient and care provider to resolve the cause of inflammation. (Weak; Moderate)
11. Consider digital flexor tenotomy to prevent a toe ulcer when conservative treatment fails in a high-risk patient with diabetes, hammertoes and either a pre-ulcerative sign or an ulcer on the distal toe. (Weak; Low)
12. Consider Achilles tendon lengthening, joint arthroplasty, single or pan metatarsal head resection, or osteotomy to prevent a recurrent foot ulcer when conservative treatment fails in a high-risk patient with diabetes and a plantar forefoot ulcer. (Weak; Low)
13. Do not use a nerve decompression procedure in an effort to prevent a foot ulcer in an at-risk patient with diabetes, in preference to accepted standards of good quality care. (Weak; Low)

Abstract:

The area of intensively managed forests, in which the conditions required by several liverwort species are seldom found, has expanded over the forest landscape during the last century. Liverworts are very sensitive to habitat changes, because they demand a continuously moist microclimate. Consequently, about a third of the forest liverworts have been classified as threatened or near threatened in Finland. The general objective of this thesis is to increase knowledge of the reproductive and dispersal strategies of substrate-specific forest bryophytes. A further aim was to develop recommendations for conservation measures for species inhabiting unstable and stable habitats in the forest landscape. Both population ecological and genetic methods were applied in the research. Anastrophyllum hellerianum inhabits spatially and temporally limited substrate patches, decaying logs, which can be considered unstable habitats. The results show that asexual reproduction by gemmae is the dominant mode of reproduction, whereas sexual reproduction is considerably less frequent. Contrary to previous assumptions, not only spores but also the asexual propagules may contribute to long-distance dispersal. The combination of occasional spore production and practically continuous, massive gemma production facilitates dispersal both on a local scale and over long distances, and it compensates for the great propagule losses that precede successful establishment at suitable sites. However, the establishment probability of spores may be restricted by environmental and biological limitations linked to the low success of sexual reproduction. Long-lasting dry seasons are likely to result in a low success of sexual reproduction and a decreased release rate of gemmae from the shoots, with consequent fluctuations in population sizes. In the long term, substratum limitation is likely to restrict population sizes and cause local extinctions, especially in small-sized remnant populations. 
In contrast, larger forest fragments with more natural disturbance dynamics, to which the species is adapted, are pivotal to species survival. Trichocolea tomentella occupies stable spring and mesic habitats in woodland. The relatively small populations are increasingly fragmented, with a high risk of extinction from extrinsic causes. The results show that T. tomentella mainly invests in population persistence through effective clonal growth, forming independent ramets, and in competitive ability, and considerably less in sexuality and dispersal potential. The populations possess relatively high levels of genetic diversity regardless of population size and degree of isolation. Thus, the small-sized populations inhabiting stable habitats should not be neglected when establishing conservation strategies for the species and when considering the habitat protection of small spring sites. Restricted dispersal capacity, also on a relatively small spatial scale, is likely to prevent successful (re-)colonization of the potential habitat patches of recovering forest landscapes. By contrast, random short-range dispersal of detached vegetative fragments within populations in suitable habitat seems to be frequent. Thus, restoration of spring and streamside habitats close to populations of T. tomentella may contribute to population expansion. That, in turn, decreases the harmful effects of environmental stochasticity.

Abstract:

With transplant rejection rendered a minor concern and survival rates after liver transplantation (LT) steadily improving, long-term complications are attracting more attention. Current immunosuppressive therapies, together with other factors, are accompanied by considerable long-term toxicity, which clinically manifests as renal dysfunction, high risk for cardiovascular disease, and cancer. This thesis investigates the incidence, causes, and risk factors for such renal dysfunction, cardiovascular risk, and cancer after LT. Long-term effects of LT are further addressed by surveying the quality of life and employment status of LT recipients. The consecutive patients included had undergone LT at Helsinki University Hospital from 1982 onwards. Data regarding renal function – creatinine and estimated glomerular filtration rate (GFR) – were recorded before and repeatedly after LT in 396 patients. The presence of hypertension, dyslipidemia, diabetes, impaired fasting glucose, and overweight/obesity before and 5 years after LT was determined among 77 patients transplanted for acute liver failure. The entire cohort of LT patients (540 patients), including both children and adults, was linked with the Finnish Cancer Registry, and numbers of cancers observed were compared to site-specific expected numbers based on national cancer incidence rates stratified by age, gender, and calendar time. Health-related quality of life (HRQoL), measured by the 15D instrument, and employment status were surveyed among all adult patients alive in 2007 (401 patients). The response rate was 89%. Posttransplant cardiovascular risk factor prevalence and HRQoL were compared with that in the age- and gender-matched Finnish general population. The cumulative risk for chronic kidney disease increased from 10% at 5 years to 16% at 10 years following LT. GFR up to 10 years after LT could be predicted by the GFR at 1 year. 
In patients transplanted for chronic liver disease, a moderate correlation of pretransplant GFR with later GFR was also evident, whereas in acute liver failure patients even severe pretransplant renal dysfunction often recovered after LT. By 5 years after LT, 71% of acute liver failure patients were receiving antihypertensive medications, 61% exhibited dyslipidemia, 10% were diabetic, 32% were overweight, and 13% were obese. Compared with the general population, only hypertension displayed a significantly elevated prevalence among patients (2.7-fold), whereas patients exhibited a 30% lower prevalence of dyslipidemia and a 71% lower prevalence of impaired fasting glucose. The cumulative incidence of cancer was 5% at 5 years and 13% at 10 years. Compared with the general population, patients were subject to a 2.6-fold cancer risk, with non-melanoma skin cancer (standardized incidence ratio, SIR, 38.5) and non-Hodgkin lymphoma (SIR 13.9) being the predominant malignancies. Non-Hodgkin lymphoma was associated with male gender, young age, and the immediate posttransplant period, whereas older age and antibody induction therapy raised skin-cancer risk. HRQoL deviated from general-population values to a clinically unimportant degree, although significant deficits among patients were evident in some physical domains; HRQoL did not seem to decrease with longer follow-up. Although 87% of patients reported improved working capacity, return to working life showed marked age dependency: among patients aged less than 40 at LT, 70% to 80% returned to work; among those aged 40 to 50, 55%; and among those above 50, 15% to 28%. The most common cause of unemployment was early retirement before LT, and employed patients exhibited better HRQoL than unemployed ones. In conclusion, although renal impairment, hypertension, and cancer are common after LT and increase with time, patients' quality of life remains comparable with that of the general population.
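The standardized incidence ratios above are, in essence, observed case counts divided by expected counts accumulated over age-, gender-, and calendar-time strata of reference rates. A minimal sketch of that arithmetic follows; all person-years, rates, and counts are hypothetical illustrations, not the study's data:

```python
# Standardized incidence ratio (SIR): observed cases divided by the number
# expected if the cohort experienced the reference (general-population) rates.
# All person-years, rates, and counts below are hypothetical illustrations.

def expected_cases(strata):
    """Expected cases = sum over strata of person-years x reference rate."""
    return sum(person_years * rate for person_years, rate in strata)

def sir(observed, strata):
    """SIR = observed / expected; values > 1 indicate excess incidence."""
    return observed / expected_cases(strata)

# Hypothetical strata: (person-years at risk, reference rate per person-year),
# e.g. split by age band, gender, and calendar period.
strata = [
    (1200.0, 0.002),
    (800.0, 0.005),
    (500.0, 0.008),
]

print(round(expected_cases(strata), 1))  # 10.4 expected cases
print(round(sir(27, strata), 2))         # 2.6, i.e. a 2.6-fold risk
```

In the study itself the expected numbers come from Finnish Cancer Registry rates; that the toy example also yields 2.6 is coincidental and only illustrates the arithmetic.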

Abstract:

Bipolar disorder (BP) is a complex psychiatric disorder characterized by episodes of mania and depression. BP affects approximately 1% of the world's population and shows no difference in lifetime prevalence between males and females. BP arises from complex interactions among genetic, developmental, and environmental factors, and several predisposing genes are likely involved. The genetic background of BP is still poorly understood, although intensive, long-running research has identified several chromosomal regions and genes conferring susceptibility to BP. This thesis work aims to identify genetic variants that influence bipolar disorder in the Finnish population through candidate-gene and genome-wide linkage analyses in families with many BP cases. In addition to diagnosis-based phenotypes, neuropsychological traits that can be seen as potential endophenotypes, or intermediate traits, for BP were analyzed. In the first part of the thesis, we examined the role of allelic variants of the TSNAX/DISC1 gene cluster in psychotic and bipolar spectrum disorders and found association of distinct allelic haplotypes with these two groups of disorders. The haplotype at the 5' end of the Disrupted-in-Schizophrenia-1 gene (DISC1) was over-transmitted to males with psychotic disorder (p = 0.008; for an extended haplotype, p = 0.0007 with both genders), whereas haplotypes at the 3' end of DISC1 were associated with bipolar spectrum disorder (p = 0.0002; for an extended haplotype, p = 0.0001). The variants of these haplotypes also showed association with different cognitive traits: the 5'-end haplotypes were associated with perseverations and auditory attention, while the 3'-end variants were associated with several cognitive traits including verbal fluency and psychomotor processing speed.
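The over-transmission result above is the kind of signal a transmission disequilibrium test (TDT) detects: it compares how often heterozygous parents transmit versus do not transmit the test allele to affected offspring. A minimal sketch, with made-up transmission counts rather than the study's data:

```python
import math

# Transmission disequilibrium test (TDT): under no linkage/association, a
# heterozygous parent transmits the test allele to an affected child with
# probability 1/2, so the statistic below is ~chi-square with 1 df.
# The counts used here are hypothetical, not taken from the study.

def tdt_chi2(transmitted, not_transmitted):
    """McNemar-type TDT statistic (b - c)^2 / (b + c)."""
    b, c = transmitted, not_transmitted
    return (b - c) ** 2 / (b + c)

chi2 = tdt_chi2(62, 38)             # hypothetical transmission counts
p = math.erfc(math.sqrt(chi2 / 2))  # upper tail of chi-square with 1 df
print(round(chi2, 2))  # 5.76
print(round(p, 3))     # 0.016
```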
Second, in our complete set of BP families comprising 723 individuals, we studied six functional candidate genes from three distinct signalling systems: serotonin-related genes (SLC6A4 and TPH2), BDNF-related genes (BDNF, CREB1 and NTRK2), and one gene related to the inflammation and cytokine system (P2RX7). We replicated the association of the functional BDNF variant Val66Met with BP and with better performance in retention. Variants at the 5' end of SLC6A4 also showed some evidence of association among males (p = 0.004), but the widely studied functional variants did not yield any significant results. A protective four-variant haplotype on P2RX7 showed evidence of association with BP and with executive functions, namely semantic and phonemic fluency (p = 0.006 and p = 0.0003, respectively). Third, we analyzed 23 bipolar families originating from the north-eastern region of Finland. A genome-wide scan was performed using a 6K single nucleotide polymorphism (SNP) array. We identified susceptibility loci on chromosomes 7q31, with a LOD score of 3.20, and 9p13.1, with a LOD score of 4.02. We followed up both linkage findings in the complete set of 179 Finnish bipolar families; the finding on chromosome 9p13 was supported (maximum LOD score 3.02), but the susceptibility gene itself remains unidentified. In the fourth part of the thesis, we tested the role of allelic variants that have been associated with bipolar disorder in recent genome-wide association studies (GWAS). We confirmed findings for the DFNB31, SORCS2, SLC39A3, and DGKH genes. The strongest signal in this study came from DFNB31, which remained significant after correction for multiple testing. Two SORCS2 variants were direct allelic replications and showed the same signal as the haplotype analysis. However, no association was detected with the PALB2 gene, which had been the most significantly associated region in the previous GWAS.
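The LOD scores quoted above come from multipoint pedigree-likelihood computations, but the underlying idea can be sketched with the simple two-point case of fully informative meioses; the recombinant counts below are invented for illustration, and real pedigree data require dedicated likelihood software:

```python
import math

# Two-point LOD score: log10 likelihood ratio of linkage at recombination
# fraction theta versus free recombination (theta = 0.5), for r recombinants
# out of n fully informative meioses. Counts here are illustrative only.

def lod(r, n, theta):
    """log10 [ theta^r (1-theta)^(n-r) / 0.5^n ]."""
    if theta == 0.0:
        # theta = 0 is excluded by even a single recombinant
        return n * math.log10(2) if r == 0 else float("-inf")
    likelihood_linked = theta ** r * (1 - theta) ** (n - r)
    return math.log10(likelihood_linked / 0.5 ** n)

# The maximum-likelihood estimate of theta is r/n; a LOD of 3 or more is the
# classical threshold for declaring linkage.
r, n = 2, 20
print(round(lod(r, n, r / n), 2))  # 3.2
```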
Our results indicate that BP is heterogeneous and that its genetic background may accordingly vary between populations. To fully understand the allelic heterogeneity underlying common diseases such as BP, complete genome sequencing of many individuals with and without the disease is required. Identification of the specific risk variants will help us better understand the pathophysiology of BP and will support the development of treatments with specific biochemical targets. In addition, it will facilitate the identification of environmental factors that alter risk, potentially enabling improved occupational, social, and psychological advice for individuals at high risk of BP.