888 results for "Working population"
Abstract:
As populations live longer, healthier lives in countries like Australia, the growing population of older people is increasing the strain on social security and pension systems. Yet many seniors are healthy and want to remain active in later life. Whilst there is significant research on seniors, ageing, and the employment of mature-aged people, there is scant research on seniors creating jobs, as opposed to seeking jobs as employees. This is the first empirical research specifically on senior entrepreneurship in Australia. Seniors often have the skills, financial resources, and time available to contribute to economic activity, which underpins the growing prevalence of senior entrepreneurship. Senior entrepreneurship is the process whereby people aged 50+ participate in business start-ups; however, despite representing the fastest-growing segment of entrepreneurship, little is known about this phenomenon. This research seeks to answer the following questions: What is the scope of senior entrepreneurship in Australia? What are the impacts of senior entrepreneurship in Australia? What perceptions do seniors hold about entrepreneurship as a career option? What policy implications and recommendations can be derived to enhance active ageing and extend working lives through senior entrepreneurship?
Abstract:
Background. Evidence of cognitive dysfunction in depressive and anxiety disorders is growing. However, the neuropsychological profile of young adults has received little systematic investigation, although depressive and anxiety disorders are major public health problems for this age group. Available studies have typically failed to account for psychiatric comorbidity, and samples derived from population-based settings have seldom been investigated. Burnout-related cognitive functioning has previously been investigated in only a few studies, again all using clinical samples and wide age ranges. Aims. A comprehensive review conducted for this study showed that research on cognitive impairment in depressive and anxiety disorders among young adults is rare. The present study examined cognitive functioning in young adults with a history of unipolar depressive or anxiety disorders in comparison with healthy peers, and the associations of current burnout symptoms with cognitive functioning, in a population-based setting. A further aim was to determine whether cognitive deficits vary as a function of disorder characteristics such as severity, psychiatric comorbidity, age at onset, or the treatments received. Methods. Verbal and visual short-term memory, verbal long-term memory and learning, attention, psychomotor processing speed, verbal intelligence, and executive functioning were measured in a population-based sample of 21–35-year-olds. Performance was compared, firstly, between participants with pure non-psychotic depression (n=68) and healthy peers (n=70); secondly, between pure (n=69) and comorbid depression (n=57); and thirdly, between participants with anxiety disorders (n=76) and healthy peers (n=71). The diagnostic procedure was based on the SCID interview.
Fourthly, the associations between current burnout symptoms, measured with the Maslach Burnout Inventory General Survey, and neuropsychological test performance were investigated among working young adults (n=225). Results. Young adults with depressive or anxiety disorders, with or without psychiatric comorbidity, were not found to have major cognitive impairments compared with healthy peers. Only mildly compromised verbal learning was found among depressed participants. The pure and comorbid depression groups did not differ in cognitive functioning either. Among depressed participants, those who had received treatment showed more impaired verbal memory and executive functioning, and earlier onset corresponded with more impaired executive functioning. In anxiety disorders, psychotropic medication and low psychosocial functioning were associated with deficits in executive functioning, psychomotor processing speed, and visual short-term memory. Current burnout symptoms were associated with better performance in verbal working memory and verbal intelligence. However, lower examiner-rated social and occupational functioning was associated with problems in verbal attention, memory, and learning. Conclusions. Depression, anxiety disorders, and burnout symptoms may not be associated with major cognitive deficits among young adults drawn from the general population. Even psychiatric comorbidity may not worsen cognitive functioning in depressive or anxiety disorders among these young adults. However, treatment-seeking in depression was associated with cognitive deficits, suggesting that these deficits relate to increased distress. Additionally, early-onset depression, found to be associated with executive dysfunction, may represent a more severe form of the disorder. In anxiety disorders, those with low symptom-related psychosocial functioning may have cognitive impairment.
No association between self-reported burnout symptoms and cognitive deficits was detected, but individuals with low social and occupational functioning may have impaired cognition.
Abstract:
Background: Irritable bowel syndrome (IBS) is a common functional gastrointestinal (GI) disorder characterised by abdominal pain and abnormal bowel function. It is associated with a high rate of healthcare consumption and significant healthcare costs. The prevalence and economic burden of IBS in Finland had not been studied before. The aims of this study were to assess the prevalence of IBS according to various diagnostic criteria and to study the rates of psychiatric and somatic comorbidity in IBS. In addition, healthcare consumption and the societal costs of IBS were evaluated. Methods: The study was a two-phase postal survey. Questionnaire I, identifying IBS by the Manning 2 (at least two of the six Manning symptoms), Manning 3 (at least three Manning symptoms), Rome I, and Rome II criteria, was mailed to a random sample of 5,000 working-age subjects. It also covered extra-GI symptoms such as headache, back pain, and depression. Questionnaire II, covering rates of physician visits and use of GI medication, was sent to subjects fulfilling the Manning 2 or Rome II IBS criteria in Questionnaire I. Results: The response rates were 73% and 86% for Questionnaires I and II, respectively. The prevalence of IBS was 15.9%, 9.6%, 5.6%, and 5.1% according to the Manning 2, Manning 3, Rome I, and Rome II criteria, respectively. Of those meeting the Rome II criteria, 97% also met the Manning 2 criteria. Severe abdominal pain was reported more often by subjects meeting either of the Rome criteria than by those meeting either of the Manning criteria. Depression, anxiety, and several somatic symptoms were more common among subjects meeting any IBS criterion than among controls. Of subjects with depressive symptoms, 11.6% met the Rome II IBS criteria, compared with 3.7% of those without depressiveness. Subjects meeting any IBS criteria made more physician visits than controls. Intensity of GI symptoms and the presence of dyspeptic symptoms were the strongest predictors of GI consultations.
The presence of dyspeptic symptoms and a history of abdominal pain in childhood also predicted non-GI visits. Annual GI-related individual costs were higher in the Rome II group (€497) than in the Manning 2 group (€295). Direct expenses from GI symptoms and non-GI physician visits ranged between €98 million for the Rome II and €230 million for the Manning 2 criteria. Conclusions: The prevalence of IBS varies substantially depending on the criteria applied. The Rome II criteria are more restrictive than Manning 2, and they identify an IBS population with more severe GI symptoms, more frequent healthcare use, and higher individual healthcare costs. Subjects with IBS demonstrate high rates of psychiatric and somatic comorbidity regardless of healthcare-seeking status. Perceived symptom severity, rather than psychiatric comorbidity, predicts healthcare seeking for GI symptoms. IBS incurs considerable medical costs. The direct GI and non-GI costs are equivalent to up to 5% of outpatient healthcare and medicine costs in Finland. A more integral approach to IBS by physicians, accounting also for comorbid conditions, may produce a more favourable course in IBS patients and reduce healthcare expenditures.
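The Manning-based case definitions used in the survey reduce to a simple symptom count. A minimal sketch in Python, assuming illustrative symptom field names (the study's actual questionnaire items are not reproduced here):

```python
# Hypothetical keys standing in for the six Manning symptoms; the survey's
# real item wording is not shown in the abstract.
MANNING_SYMPTOMS = [
    "pain_relieved_by_defecation",
    "looser_stools_at_pain_onset",
    "more_frequent_stools_at_pain_onset",
    "abdominal_distension",
    "mucus_in_stool",
    "feeling_of_incomplete_evacuation",
]

def manning_count(responses: dict) -> int:
    """Number of the six Manning symptoms the respondent reports."""
    return sum(bool(responses.get(s)) for s in MANNING_SYMPTOMS)

def meets_manning(responses: dict, threshold: int) -> bool:
    """Manning 2 -> threshold=2; Manning 3 -> threshold=3."""
    return manning_count(responses) >= threshold
```

A respondent reporting two symptoms would thus meet Manning 2 but not Manning 3, which is why the Manning 2 prevalence (15.9%) exceeds the Manning 3 prevalence (9.6%).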
Abstract:
The purpose of this study was to compare the neuropsychological performance of two frontal dysexecutive phenotypes, the 'disinhibited' syndrome (DS) and the 'apathetic' syndrome (AS), following traumatic brain injury in a non-Western population, Oman. Methods: The study compared the performance of DS and AS participants on neuropsychological measures, including those tapping verbal reasoning ability, working memory/attention, planning/goal-directed behaviour, and affective range. Results: The present analysis showed that DS and AS participants did not differ on indices measuring working memory/attention and affective range. However, the two cohorts differed significantly on measures of planning/goal-directed behaviour. Conclusion: This study lays the groundwork for further scrutiny in delineating the different characteristics of what has previously been labelled the frontal dysexecutive phenotype. It indicates that DS and AS are marked by specific neuropsychological deficits.
Abstract:
This study presents a population projection for Namibia for the years 2011–2020. In many countries of sub-Saharan Africa, including Namibia, population growth is continuing even though fertility rates have declined. However, many of these countries suffer from a large HIV epidemic that is slowing population growth. In Namibia, the epidemic has been severe. It is therefore important to assess the future effect of HIV/AIDS on the population of Namibia. Demographic research on Namibia has not been extensive, and population data are not widely available. According to the studies made, fertility has generally been declining and mortality has increased significantly due to AIDS. Previous population projections predict population growth for Namibia in the near future, yet HIV/AIDS is affecting future population developments. For the projection constructed in this study, population data are taken from the two most recent censuses, from 1991 and 2001. Data on HIV are available from the HIV Sentinel Surveys 1992–2008, which test pregnant women for HIV in antenatal clinics. Additional data were collected from different sources and recent studies. The projection was made with software (EPP and Spectrum) specially designed for developing countries with scarce data. The projection includes two main scenarios with different assumptions about the development of the HIV epidemic. In addition, two hypothetical scenarios were made: the first considering the case where the HIV epidemic had never existed, and the second the case where HIV treatment had never existed. The results indicate population growth for Namibia. The population in the 2001 census was 1.83 million and is projected to reach 2.38–2.39 million in 2020 in the two main scenarios. Without HIV, the population would be 2.61 million in 2020, and without treatment 2.30 million. The urban population is growing faster than the rural.
Even though AIDS is increasing mortality, past high fertility rates still keep the young adult age groups quite large. The HIV epidemic appears to be slowing down, but it is still increasing mortality in the working-age population. The initiation of HIV treatment in the public sector in 2004 seems to have affected many projected indicators, diminishing the impact of HIV on the population. For example, the rise in mortality is slowing down.
Abstract:
Work has a central role in the lives of a large share of adult Finns, and the meals they eat during the workday are an important factor in their nutrition, health, and well-being. On workdays, lunch is mainly eaten at worksite canteens or, especially among women, as a packed meal in the workplace's break room. No national-level data are available on the nutritional quality of the meals served by canteens, although the Finnish Institute of Occupational Health laid out the first nutrition recommendations for worksite canteens in 1971. The aim of this study was to examine the contribution of various socio-demographic, socioeconomic, and work-related factors to the lunch-eating patterns of Finnish employees during the working day, and how lunch-eating patterns influence dietary intake. Four population-based cross-sectional datasets were used in this thesis. Three of the datasets were collected by the National Institute for Health and Welfare (the Health Behaviour and Health among the Finnish Adult Population survey from 1979 to 2001, n=24,746, and from 2005 to 2007, n=5,585; the National Findiet 2002 Study, n=261), and one by the Finnish Institute of Occupational Health (the Work and Health in Finland survey from 1997, 2000, and 2003, n=6,369). The Health Behaviour and Health among the Finnish Adult Population survey and the Work and Health in Finland survey are nationally representative studies that are conducted repeatedly. Survey information was collected by self-administered questionnaires, dietary recalls, and telephone interviews. The frequency of worksite canteen use has been quite stable for over two decades in Finland. A small decreasing trend can be seen in all socioeconomic groups. Throughout the period studied, those with more years of education ate at worksite canteens more often than others. The size of the workplace was the most important work-related determinant associated with the use of a worksite canteen.
At small workplaces, other work-related determinants, such as occupation, physical strain at work, and job control, were also associated with canteen use, whereas at bigger workplaces these associations were almost nonexistent. The major social determinants of worksite canteen availability were the education and occupational status of employees, and the only work-related determinant was the size of the workplace. A worksite canteen was more commonly available to employees at larger workplaces and to those with higher education and higher occupational status. Even when a canteen was equally available to all employees, its use was nevertheless determined by occupational class and place of residence, especially among female employees. Those with higher occupational status and those living in the Helsinki capital area ate in canteens more frequently than others. Employees who ate at a worksite canteen consumed more vegetables and vegetable and fish dishes at lunch than those who ate packed lunches. The daily consumption of vegetables, and the proportion of daily users of vegetables, were also higher among male employees who ate at a canteen. In conclusion, life possibilities, i.e. the availability of a canteen, education, occupational status, and work-related factors, played an important role in Finnish employees' choice of where to eat lunch. The most basic prerequisite for eating in a canteen was availability, but there were also a number of underlying social determinants. Occupational status and place of residence were the major structural factors behind individuals' choices in their lunch-eating patterns. To ensure the nutrition, health, and well-being of employees, employers should provide them with the option of good-quality meals during working hours. The availability of worksite canteens should be especially supported in lower socioeconomic groups.
In addition, employees should be encouraged to have lunch at a worksite canteen when one is available by removing structural barriers to its use.
Abstract:
Oreochromis niloticus (L.) were caught by beach seining, hook and line, and trawling in the Nyanza Gulf, Lake Victoria (Kenya), in order to study their feeding ecology and population characteristics. The collected fish were weighed and their total length (TL) measured immediately after capture, then dissected and sexed. Stomach contents were removed and preserved in 4% buffered formalin for laboratory analysis. In the laboratory, items were sorted into categories such as three-quarters, half, and quarter, and awarded 20, 15, and 5 points, respectively. The main food items of O. niloticus from November 1998 to March 1999 were insects, algae, fish, and plant material. The increase of insects in the diet of O. niloticus might be attributed to the lake's infestation by water hyacinth, which harbours different species of insects.
Abstract:
EXECUTIVE SUMMARY: At present, the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) criteria used to assess whether a population qualifies for inclusion in the CITES Appendices relate to (A) the size of the population, (B) the area of distribution of the population, and (C) declines in the size of the population. Numeric guidelines are provided as indicators of a small population (fewer than 5,000 individuals), a small subpopulation (fewer than 500 individuals), a restricted area of distribution for a population (less than 10,000 km2), a restricted area of distribution for a subpopulation (less than 500 km2), a high rate of decline (a decrease of 50% or more in total within 5 years or two generations, whichever is longer, or, for a small wild population, a decline of 20% or more in total within ten years or three generations, whichever is longer), large fluctuations (population size or area of distribution varies widely, rapidly, and frequently, with a variation greater than one order of magnitude), and a short-term fluctuation (of two years or less). The Working Group discussed several broad issues of relevance to the CITES criteria and guidelines. These included the importance of the historical extent of decline versus the recent rate of decline; the utility and validity of incorporating relative population productivity into decline criteria; the utility of absolute numbers for defining small populations or small areas; the appropriateness of generation times as time frames for examining declines; the importance of the magnitude and frequency of fluctuations as factors affecting risk of extinction; and the overall utility of numeric thresholds or guidelines.
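As a rough illustration, the size and area guidelines quoted above can be expressed as a screening function. This is a sketch of the numeric thresholds only, not of an actual CITES assessment, which also weighs decline rates, fluctuations, and qualitative factors:

```python
def cites_indicators(pop_size, subpop_size=None, area_km2=None,
                     subpop_area_km2=None):
    """Flag which of the numeric size/area guidelines a (sub)population meets.

    Thresholds follow the guideline values in the summary above:
    population < 5,000 individuals; subpopulation < 500 individuals;
    population range < 10,000 km2; subpopulation range < 500 km2.
    Optional arguments are skipped when data are unavailable.
    """
    flags = {"small_population": pop_size < 5_000}
    if subpop_size is not None:
        flags["small_subpopulation"] = subpop_size < 500
    if area_km2 is not None:
        flags["restricted_area"] = area_km2 < 10_000
    if subpop_area_km2 is not None:
        flags["restricted_subpop_area"] = subpop_area_km2 < 500
    return flags
```

For example, a population of 4,000 individuals occupying 9,000 km2 would trigger both the small-population and restricted-area indicators.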
Abstract:
BACKGROUND: The Framingham Heart Study (FHS), founded in 1948 to examine the epidemiology of cardiovascular disease, is among the most comprehensively characterized multi-generational studies in the world. Many collected phenotypes have substantial genetic contributors, yet most genetic determinants remain to be identified. Using single nucleotide polymorphisms (SNPs) from a 100K genome-wide scan, we examine the associations of common polymorphisms with phenotypic variation in this community-based cohort and provide a full-disclosure, web-based resource of results for future replication studies.
METHODS: Adult participants (n = 1345) of the largest 310 pedigrees in the FHS, many biologically related, were genotyped with the 100K Affymetrix GeneChip. These genotypes were used to assess their contribution to 987 phenotypes collected in the FHS over 56 years of follow-up, including cardiovascular risk factors and biomarkers; subclinical and clinical cardiovascular disease; cancer and longevity traits; and traits in the pulmonary, sleep, neurology, renal, and bone domains. We conducted genome-wide variance-components linkage analyses and population-based and family-based association tests.
RESULTS: The participants were white, of European descent, and from the FHS Original and Offspring Cohorts (examination 1 Offspring mean age 32 ± 9 years, 54% women). This overview summarizes the methods, selected findings, and limitations of the results presented in the accompanying series of 17 manuscripts. The presented association results are based on 70,897 autosomal SNPs meeting the following criteria: minor allele frequency ≥ 10%, genotype call rate ≥ 80%, Hardy-Weinberg equilibrium p-value ≥ 0.001, and Mendelian consistency. Linkage analyses are based on 11,200 SNPs and short-tandem repeats. Results of phenotype-genotype linkages and associations for all autosomal SNPs are posted on the NCBI dbGaP website at http://www.ncbi.nlm.nih.gov/projects/gap/cgi-bin/study.cgi?id=phs000007.
CONCLUSION: We have created a full-disclosure resource of results from a genome-wide association study in the FHS, posted on the dbGaP website. Because we used three analytical approaches to examine the association and linkage of 987 phenotypes with thousands of SNPs, our results must be considered hypothesis-generating and need to be replicated. The results of the FHS 100K project, with NCBI web posting, provide a resource for investigators to identify high-priority findings for replication.
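The stated SNP inclusion criteria amount to three numeric filters plus a Mendelian-consistency check. A minimal sketch of the numeric part, with assumed field names (the actual FHS analysis pipeline is not shown here):

```python
def passes_qc(snp: dict) -> bool:
    """Apply the three numeric inclusion thresholds quoted in the abstract:
    minor allele frequency >= 10%, genotype call rate >= 80%, and
    Hardy-Weinberg equilibrium p-value >= 0.001.
    """
    return (snp["maf"] >= 0.10
            and snp["call_rate"] >= 0.80
            and snp["hwe_p"] >= 0.001)

# Illustrative records; field names and values are hypothetical.
snps = [
    {"id": "rs0001", "maf": 0.25, "call_rate": 0.95, "hwe_p": 0.40},
    {"id": "rs0002", "maf": 0.05, "call_rate": 0.99, "hwe_p": 0.70},  # fails MAF
]
kept = [s["id"] for s in snps if passes_qc(s)]  # -> ["rs0001"]
```

Filtering of this kind is what reduced the genotyped marker set to the 70,897 autosomal SNPs used in the association analyses.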
Abstract:
Background: Children with autism spectrum disorder are increasingly educated in mainstream classrooms in the United Kingdom (Wilkinson & Twist, Autism and Educational Assessment: UK Policy and Practice. NFER, Slough, 2010), and some employers are now specifically seeking out staff on the autism spectrum. Does that mean that we are living in an 'inclusive society' [United Nations Department of Economic and Social Affairs (UNDESA), Creating an Inclusive Society: Practical Strategies to Promote Social Integration, 2008], in the sense that inequalities are reduced and full economic, social and cultural participation is advanced for individuals with autism?
Methods: A general population survey was conducted to assess how close we, as a society, are to an inclusive society for individuals with autism in Northern Ireland. Public attitudes were examined regarding (i) visibility and social interaction, (ii) aetiology, needs and interventions, and (iii) rights and resources.
Results: A stratified, representative sample of 1204 adults took part in the survey; of these, 989 were aware of autism, and their attitudes and behavioural projections reflected a mix of acceptance and denunciation. The level of confusion with regard to interventions reflected the general uncertainty within UK policy regarding meeting the needs of individuals on the autism spectrum (International Journal of Disability, Development and Education 61, 134, 2014a).
Conclusion: It seems, therefore, that inclusion is working to an extent, but more clarity is needed with regard to adequate education, intervention and support for individuals with autism.
Abstract:
Background: More effective treatments have become available for haematological malignancies from the early 2000s, but few large-scale population-based studies have investigated their effect on survival. Using EUROCARE data, and HAEMACARE morphological groupings, we aimed to estimate time trends in population-based survival for 11 lymphoid and myeloid malignancies in 20 European countries, by region and age. Methods: In this retrospective observational study, we included patients (aged 15 years and older) diagnosed with haematological malignancies, diagnosed up to Dec 31, 2007, and followed up to Dec 31, 2008. We used data from the 30 cancer registries (across 20 countries) that provided continuous incidence and good quality data from 1992 to 2007. We used a hybrid approach to estimate age-standardised and age-specific 5-year relative survival, for each malignancy, overall and for five regions (UK, and northern, central, southern, and eastern Europe), and four 3-year periods (1997–99, 2000–02, 2003–05, 2006–08). For each malignancy, we also estimated the relative excess risk of death during the 5 years after diagnosis, by period, age, and region. Findings: We analysed 560 444 cases. From 1997–99 to 2006–08 survival increased for most malignancies: the largest increases were for diffuse large B-cell lymphoma (42·0% [95% CI 40·7–43·4] to 55·4% [54·6–56·2], p<0·0001), follicular lymphoma (58·9% [57·3–60·6] to 74·3% [72·9–75·5], p<0·0001), chronic myeloid leukaemia (32·3% [30·6–33·9] to 54·4% [52·5–56·2], p<0·0001), and acute promyelocytic leukaemia (50·1% [43·7–56·2] to 61·9% [57·0–66·4], p=0·0038, estimate not age-standardised). 
Other survival increases were seen for Hodgkin's lymphoma (75·1% [74·1–76·0] to 79·3% [78·4–80·1], p<0·0001), chronic lymphocytic leukaemia/small lymphocytic lymphoma (66·1% [65·1–67·1] to 69·0% [68·1–69·8], p<0·0001), multiple myeloma/plasmacytoma (29·8% [29·0–30·6] to 39·6% [38·8–40·3], p<0·0001), precursor lymphoblastic leukaemia/lymphoma (29·8% [27·7–32·0] to 41·1% [39·0–43·1], p<0·0001), acute myeloid leukaemia (excluding acute promyelocytic leukaemia, 12·6% [11·9–13·3] to 14·8% [14·2–15·4], p<0·0001), and other myeloproliferative neoplasms (excluding chronic myeloid leukaemia, 70·3% [68·7–71·8] to 74·9% [73·8–75·9], p<0·0001). Survival increased slightly in southern Europe, more in the UK, and conspicuously in northern, central, and eastern Europe. However, eastern European survival was lower than that for other regions. Survival decreased with advancing age, and increased with time only slightly in patients aged 75 years or older, although a 10% increase in survival occurred in elderly patients with follicular lymphoma, diffuse large B-cell lymphoma, and chronic myeloid leukaemia. Interpretation: These trends are encouraging. Widespread use of new and more effective treatment probably explains much of the increased survival. However, the persistent differences in survival across Europe suggest variations in the quality of care and availability of the new treatments. High-resolution studies that collect data about stage at diagnosis and treatments for representative samples of cases could provide further evidence of treatment effectiveness and explain geographic variations in survival.
Abstract:
Objectives. We compared the mental health risk to unpaid caregivers bereaved of a care recipient with the risk to persons otherwise bereaved and to nonbereaved caregivers.
Methods. We linked prescription records for antidepressant and anxiolytic drugs to the characteristics and life-event data of members of the Northern Ireland Longitudinal Study (n = 317,264). Using a case-control design, we fitted logistic regression models, stratified by age, to model the relative likelihood of mental health problems, using mental health–related prescriptions as proxy measures.
Results. Both caregivers and bereaved individuals were estimated to be at between 20% and 50% greater risk for mental health problems than noncaregivers in similar circumstances (for bereaved working-age caregivers, odds ratio = 1.41; 95% confidence interval = 1.27, 1.56). For older people, there was no evidence of additional risk to bereaved caregivers, though there was for working-age people. Older people appeared to recover more quickly from caregiver bereavement.
Conclusions. Caregivers were at risk for mental ill health while providing care and after the death of the care recipient. Targeted caregiver support needs to extend beyond the life of the care recipient.
Abstract:
BACKGROUND: Head and neck (H&N) cancers are a heterogeneous group of malignancies, affecting various sites, with different prognoses. The aims of this study were to analyse survival for patients with H&N cancers in relation to tumour location, to assess differences in survival between European countries, and to investigate whether survival improved over time.
METHODS: We analysed about 250,000 H&N cancer cases from 86 cancer registries (CRs). Relative survival (RS) was estimated by sex, age, country, and stage. We described survival time trends over 1999–2007 using the period approach. Model-based estimates of the relative excess risks (RERs) of death were also provided by country, after adjusting for sex, age, and sub-site.
RESULTS: Five-year RS was poorest for the hypopharynx (25%) and highest for the larynx (59%). Outcomes were significantly better in female than in male patients. In Europe, age-standardised 5-year survival remained stable from 1999–2001 to 2005–2007 for laryngeal cancer, while it increased for all the other H&N cancers. Five-year age-standardised RS was low in Eastern European countries (47% for larynx and 28% for all the other H&N cancers combined) and high in Ireland and the United Kingdom (UK) and in Northern Europe (62% and 46%, respectively). Adjustment for sub-site narrowed the differences between countries. Fifty-four percent of patients were diagnosed at an advanced stage (regional or metastatic). Five-year RS for localised cases ranged between 42% (hypopharynx) and 74% (larynx).
CONCLUSIONS: This study shows that survival improved during the study period. However, slightly more than half of patients had regional or metastatic disease at diagnosis. Early diagnosis and timely start of treatment are crucial to narrowing the gap between European countries and further improving H&N cancer outcomes.
Abstract:
BACKGROUND: Worldwide data for cancer survival are scarce. We aimed to initiate worldwide surveillance of cancer survival by central analysis of population-based registry data, as a metric of the effectiveness of health systems, and to inform global policy on cancer control.
METHODS: Individual tumour records were submitted by 279 population-based cancer registries in 67 countries for 25·7 million adults (age 15-99 years) and 75,000 children (age 0-14 years) diagnosed with cancer during 1995-2009 and followed up to Dec 31, 2009, or later. We looked at cancers of the stomach, colon, rectum, liver, lung, breast (women), cervix, ovary, and prostate in adults, and adult and childhood leukaemia. Standardised quality control procedures were applied; errors were corrected by the registry concerned. We estimated 5-year net survival, adjusted for background mortality in every country or region by age (single year), sex, and calendar year, and by race or ethnic origin in some countries. Estimates were age-standardised with the International Cancer Survival Standard weights.
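Age standardisation with fixed weights, as with the International Cancer Survival Standard mentioned above, is a weighted average of age-specific estimates. A minimal sketch with illustrative numbers; the weights and survival proportions below are placeholders, and the published ICSS tables should be consulted for real values:

```python
def age_standardised_survival(age_specific, weights):
    """Weighted mean of age-specific survival proportions.

    `weights` are fixed standard weights (one per age group) that must sum
    to 1, so that estimates are comparable across populations with
    different age structures.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "standard weights must sum to 1"
    return sum(s * w for s, w in zip(age_specific, weights))

# Five age groups with illustrative 5-year net survival and weights:
survival = [0.80, 0.75, 0.68, 0.55, 0.40]
weights  = [0.07, 0.12, 0.23, 0.29, 0.29]
std = age_standardised_survival(survival, weights)  # -> 0.5779
```

Because the same weights are applied everywhere, two countries' estimates differ only through their age-specific survival, not through their age mix.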
FINDINGS: 5-year survival from colon, rectal, and breast cancers has increased steadily in most developed countries. For patients diagnosed during 2005-09, survival for colon and rectal cancer reached 60% or more in 22 countries around the world; for breast cancer, 5-year survival rose to 85% or higher in 17 countries worldwide. Liver and lung cancer remain lethal in all nations: for both cancers, 5-year survival is below 20% everywhere in Europe, in the range 15-19% in North America, and as low as 7-9% in Mongolia and Thailand. Striking rises in 5-year survival from prostate cancer have occurred in many countries: survival rose by 10-20% between 1995-99 and 2005-09 in 22 countries in South America, Asia, and Europe, but survival still varies widely around the world, from less than 60% in Bulgaria and Thailand to 95% or more in Brazil, Puerto Rico, and the USA. For cervical cancer, national estimates of 5-year survival range from less than 50% to more than 70%; regional variations are much wider, and improvements between 1995-99 and 2005-09 have generally been slight. For women diagnosed with ovarian cancer in 2005-09, 5-year survival was 40% or higher only in Ecuador, the USA, and 17 countries in Asia and Europe. 5-year survival for stomach cancer in 2005-09 was high (54-58%) in Japan and South Korea, compared with less than 40% in other countries. By contrast, 5-year survival from adult leukaemia in Japan and South Korea (18-23%) is lower than in most other countries. 5-year survival from childhood acute lymphoblastic leukaemia is less than 60% in several countries, but as high as 90% in Canada and four European countries, which suggests major deficiencies in the management of a largely curable disease.
INTERPRETATION: International comparison of survival trends reveals very wide differences that are likely to be attributable to differences in access to early diagnosis and optimum treatment. Continuous worldwide surveillance of cancer survival should become an indispensable source of information for cancer patients and researchers and a stimulus for politicians to improve health policy and health-care systems.
Abstract:
The discovery of somatic mutations, primarily JAK2V617F and CALR, in classic BCR-ABL1-negative myeloproliferative neoplasms (MPNs) has generated interest in the development of molecularly targeted therapies, whose accurate assessment requires a standardized framework. A working group comprising members of the European LeukemiaNet (ELN) and the International Working Group for MPN Research and Treatment (IWG-MRT) prepared consensus-based recommendations regarding trial design, patient selection, and the definition of relevant end points. Accordingly, a response able to capture the long-term effect of the drug should be selected as the end point of phase II trials aimed at developing new drugs for MPNs. A time-to-event end point, such as overall survival, progression-free survival, or both as co-primary end points, should measure efficacy in phase III studies. New drugs should be tested for preventing disease progression in myelofibrosis patients with early disease in randomized studies, and a time-to-event end point, such as progression-free or event-free survival, should be the primary end point. Phase III trials aimed at preventing vascular events in polycythemia vera and essential thrombocythemia should select the target population on the basis of new prognostic factors, including JAK2 mutation. In conclusion, we recommend a format for clinical trials in MPNs that facilitates communication between academic investigators, regulatory agencies, and drug companies.