958 results for INTERVALS


Abstract:

Introduction: Thoracoscopic anterior instrumented fusion (TASF) is a safe and viable surgical option for corrective stabilisation of progressive adolescent idiopathic scoliosis (AIS) [1-2]. However, there is a paucity of literature examining optimum methods of analgesia following this type of surgery. The aims of this study were to determine whether local anaesthetic bolus via an intrapleural catheter provides effective analgesia following thoracoscopic scoliosis correction, what pain levels may be expected, and what adverse effects are associated with the use of intermittent intrapleural analgesia at our centre. Methods: A subset of the most recent 80 patients from a large single-centre consecutive series of 201 patients (April 2000 to present) who had undergone TASF had their medical records reviewed. Thirty-two patients met the inclusion criteria for the analysis (i.e. pain scores recorded within the hour before and within two hours after an intrapleural bolus). All patients had an intrapleural catheter inserted during surgery, in addition to patient-controlled opiate analgesia and oral analgesia as required. After surgery, patients received a bolus of 0.25% bupivacaine every four hours via the intrapleural catheter. Visual analogue pain scale scores were recorded before and after each bolus of local anaesthetic, and the quantity and time of day of any other analgesia taken were also recorded. Results and Discussion: Twenty-eight female and four male patients (mean age 14.5 ± 1.5 years) received a total of 230 boluses of local anaesthetic administered intrapleurally, directly onto the spine, in the 96-hour period following surgery. Pain scores decreased significantly following the administration of a bolus (p<0.0001), with the mean pain score falling from 3.66 to 1.83. The quantity of opiates delivered via patient-controlled analgesia decreased steadily across successive 24-hour intervals after an initial increase in the second 24-hour period, when patients were mobilised. One intrapleural catheter required early removal at 26 hours post-operatively owing to leakage; there were no other complications associated with the intermittent intrapleural analgesia method. Post-operative pain following anterior scoliosis correction was decreased significantly by the administration of regular local anaesthetic boluses and can be reduced to ‘mild’ levels by combined analgesia regimens. The intermittent intrapleural analgesia method was not associated with any adverse events or complications in the full cohort of 201 patients.
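
As a concrete illustration of the pre/post comparison reported above, the sketch below runs a paired Wilcoxon signed-rank test on hypothetical VAS scores recorded before and after a bolus; the data and the choice of test are assumptions for illustration only, since the abstract does not state which paired test was used.

```python
# Illustrative only: hypothetical VAS pain scores recorded within the hour before
# and within two hours after an intrapleural bolus (paired by bolus event).
from scipy import stats

pre_bolus = [5, 4, 3, 6, 4, 3, 5, 2, 4, 3]    # hypothetical scores
post_bolus = [2, 2, 1, 3, 2, 1, 2, 1, 2, 1]   # hypothetical scores

# Paired, non-parametric comparison suited to ordinal pain-scale data.
stat, p_value = stats.wilcoxon(pre_bolus, post_bolus)
print(f"Wilcoxon statistic = {stat:.1f}, p = {p_value:.4f}")
print(f"mean pre = {sum(pre_bolus) / len(pre_bolus):.2f}, "
      f"mean post = {sum(post_bolus) / len(post_bolus):.2f}")
```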

Abstract:

Establishing age-at-death for skeletal remains is a vital component of forensic anthropology. The Suchey-Brooks (S-B) method of age estimation has been widely used since 1986 and relies on a visual assessment of the pubic symphyseal surface in comparison with a series of casts. Inter-population studies (Kimmerle et al., 2005; Djuric et al., 2007; Sakaue, 2006) demonstrate limitations of the S-B method; however, no assessment of this technique specific to Australian populations has been published. Aim: This investigation assessed the accuracy and applicability of the S-B method to an adult Australian Caucasian population by quantifying the error rates associated with the technique. Methods: Computed tomography (CT) and contact scans of the S-B casts were performed; each geometrically modelled surface was extracted and quantified for reference purposes. A Queensland skeletal database for Caucasian remains aged 15–70 years was initiated at the Queensland Health Forensic and Scientific Services – Forensic Pathology Mortuary (n=350). Three-dimensional reconstruction of the bone surface was performed using volume visualisation protocols in the Amira® and Rapidform® platforms. Samples were allocated into 11 sub-sets of 5-year age intervals, and changes in surface geometry were quantified in relation to age, gender and asymmetry. Results: Preliminary results indicate that computational analysis was successfully applied to model morphological surface changes. Significant differences between observed and actual ages were noted. Furthermore, initial morphological assessment demonstrates significant bilateral asymmetry of the pubic symphysis, which is unaccounted for in the S-B method. These results suggest that refinements to the S-B method are required when it is applied to Australian casework. Conclusion: This investigation promises to make anthropological analysis more quantitative and less invasive through CT imaging. The overarching goal is to improve skeletal identification and medico-legal death investigation in the coronial process by narrowing the range of age-at-death estimates in the biological profile.
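
The allocation of specimens into 11 five-year age sub-sets could be sketched as follows; the bin edges (15–70 years in 5-year steps) follow the abstract, while the sample records and column names are hypothetical.

```python
# Hypothetical sketch: allocate specimens into 5-year age intervals spanning 15-70 years.
import pandas as pd

specimens = pd.DataFrame({
    "case_id": ["A01", "A02", "A03", "A04"],
    "age_at_death": [17, 34, 52, 69],        # hypothetical ages
})

edges = list(range(15, 75, 5))               # 15, 20, ..., 70 -> 11 intervals
specimens["age_interval"] = pd.cut(specimens["age_at_death"], bins=edges, right=False)
print(specimens.groupby("age_interval", observed=True).size())
```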

Abstract:

The focus of nutrition is often on healthy diets and exercise to minimise the risk of developing lifestyle diseases such as cancer, diabetes and cardiovascular disease. However, as people move into older age, nutrition priorities often shift towards meeting increased nutrient needs within lower energy requirements and minimising loss of lean muscle. There are several causes of general malnutrition in the elderly that lead to depletion of muscle, including starvation (protein-energy malnutrition), sarcopenia and cachexia. The prevalence of protein-energy malnutrition increases with age and with the number of comorbidities. A range of simple and validated screening tools can be used to identify malnutrition in older adults, e.g. MST, MNA-SF and ‘MUST’. Older adults should be screened for nutritional issues at diagnosis, on admission to hospital or care homes, and during follow-up at outpatient or General Practitioner clinics, at regular intervals depending on clinical status. Early identification and treatment of nutrition problems can lead to improved outcomes and better quality of life.

Abstract:

The health of an individual is determined by the interaction of genetic and individual factors with wider social and environmental elements. Public health approaches to improving the health of disadvantaged populations will be most effective if they optimise influences at each of these levels, particularly in the early part of the life course. To better ascertain the relative contribution of these multi-level determinants, there is a need for robust longitudinal, prospective studies that examine individual, familial, social and environmental exposures. This paper describes the study background and methods as implemented in an Australian birth cohort study, Environments for Healthy Living (EFHL): The Griffith Study of Population Health. EFHL is a prospective, multi-level, multi-year longitudinal birth cohort study designed to collect information from before birth through to adulthood across a spectrum of eco-epidemiological factors, ranging from genetic material in cord-blood samples at birth, through individual and familial factors, to spatial data on the living environment. EFHL commenced the pilot phase of recruitment in 2006 and open recruitment in 2007, with a target sample size of 4000 mother/infant dyads. Detailed information on each participant is obtained at birth, 12 months, 3 years, 5 years and at three- to five-year intervals thereafter. The findings of this research will provide detailed evidence on the relative contribution of multi-level determinants of health, which can be used to inform social policy and intervention strategies that facilitate healthy behaviours and choices across sub-populations.

Abstract:

Australia has continued to benefit from the human, social and economic capital contributed by immigrant resettlement over many years. Humanitarian entrants have also made significant economic, social and civic contributions to Australian society. Since 2000, approximately 160,000 people have entered Australia under the refugee and humanitarian resettlement program; around 15% have come from South Sudan, and one third of these are adult males. In response to the 2003 evaluation of the Integrated Humanitarian Settlement Strategy (IHSS), which recommended seeking further opportunities to settle humanitarian entrants in regional Australia, the Department of Immigration and Citizenship (DIAC) has since encouraged regional settlement to “address the demand for less skilled labour in regional economies and to assist humanitarian entrants to achieve early employment”. There is evidence, however, of the many challenges faced by humanitarian arrivals living in regional areas. This chapter focuses on the educational and occupational outcomes of 117 South Sudanese adult men from refugee backgrounds. In particular, the chapter uses both cross-sectional (at first interview) and longitudinal data (four interviews with each participant at six-month intervals) to compare outcomes between men living in Brisbane and those living in the Toowoomba–Gatton region in Southeast Queensland.

Abstract:

A Neutral cluster and Air Ion Spectrometer (NAIS) was used to monitor the concentration of airborne ions on 258 full days between November 2011 and December 2012 in Brisbane, Australia. The air was sampled from outside a window on the sixth floor of a building close to the city centre, approximately 100 m from a busy freeway. The NAIS detects all ions and charged particles smaller than 42 nm. It was operated in a 4-min measurement cycle, with ion data recorded at 10 s intervals over 2 min during each cycle. The data were analysed to derive the diurnal variation of small, large and total ion concentrations in the environment. We adapted the definition of Horrak et al. (2000) and classified small ions as molecular clusters smaller than 1.6 nm and large ions as charged particles larger than this size...
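
A minimal sketch of the small/large ion split and the diurnal averaging is given below; only the 1.6 nm threshold comes from the abstract, while the example records, column names and hourly averaging are assumptions.

```python
# Sketch with hypothetical 10-s NAIS records; only the 1.6 nm size threshold is from the abstract.
import pandas as pd

records = pd.DataFrame({
    "timestamp": pd.to_datetime(["2012-03-01 08:00:00", "2012-03-01 08:00:10",
                                 "2012-03-01 14:30:00", "2012-03-01 14:30:10"]),
    "diameter_nm": [1.2, 8.5, 0.9, 25.0],     # hypothetical mobility diameters
    "concentration": [850, 420, 910, 380],    # hypothetical concentrations (cm^-3)
})

# Small ions: molecular clusters < 1.6 nm; large ions: charged particles >= 1.6 nm.
records["ion_class"] = records["diameter_nm"].map(lambda d: "small" if d < 1.6 else "large")

# Diurnal profile: mean concentration for each hour of day and ion class.
diurnal = (records.assign(hour=records["timestamp"].dt.hour)
                  .groupby(["hour", "ion_class"])["concentration"]
                  .mean()
                  .unstack("ion_class"))
print(diurnal)
```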

Abstract:

Objectives: To examine the effects on monotonous driving of normal sleep versus one night of sleep restriction in continuous positive airway pressure (CPAP)-treated obstructive sleep apnoea (OSA) patients compared with age-matched healthy controls. Methods: Nineteen compliant, CPAP-treated male OSA patients (OPs), aged 50–75 years, and 20 healthy age-matched controls underwent both a normal night’s sleep and sleep restriction to 5 h (OPs remained on CPAP) in a counterbalanced design. All participants completed a 2 h afternoon monotonous drive in a realistic car simulator. Driving was monitored for sleepiness-related minor and major lane deviations, with ‘safe’ driving time defined as the total time driven before the first major lane deviation. EEGs were recorded continuously, and subjective sleepiness ratings were taken at regular intervals throughout the drive. Results: After a normal night’s sleep, OPs and controls did not differ in driving performance or in their ability to assess their own sleepiness, with both groups driving ‘safely’ for approximately 90 min. After sleep restriction, however, OPs had a significantly shorter safe driving time (65 min) and had to apply more compensatory effort to maintain their alertness than controls. They also underestimated their enhanced sleepiness. Nevertheless, apart from this caveat, there were generally close associations between subjective sleepiness, the likelihood of a major lane deviation, and EEG changes indicative of sleepiness. Conclusions: With a normal night’s sleep, effectively treated older men with OSA drive as safely as healthy men of the same age. After restricted sleep, however, their driving impairment is worse than that of controls. This suggests that, although successful CPAP treatment can alleviate the potential detrimental effects of OSA on monotonous driving following normal sleep, these patients remain more vulnerable to sleep restriction.

Abstract:

Purpose: To investigate the influence of monocular hyperopic defocus on the normal diurnal rhythms of axial length and choroidal thickness in young adults. Methods: A series of axial length and choroidal thickness measurements (collected at approximately 3-hourly intervals, with the first measurement at ~9 am and the final measurement at ~9 pm) were obtained from 15 emmetropic young adults over three consecutive days. The natural diurnal rhythms (Day 1, no defocus), the diurnal rhythms with monocular hyperopic defocus (Day 2, −2.00 DS spectacle lens over the right eye), and the recovery from any defocus-induced changes (Day 3, no defocus) were examined. Results: Both axial length and choroidal thickness underwent significant diurnal changes on each of the three measurement days (p<0.0001). The introduction of monocular hyperopic defocus resulted in significant changes in the diurnal variations of both parameters (p<0.05). A significant (p<0.001) increase in the mean amplitude (peak to trough) of change in axial length (mean increase, 0.016 ± 0.005 mm) and choroidal thickness (mean increase, 0.011 ± 0.003 mm) was observed on Day 2 with hyperopic defocus compared with the two ‘no defocus’ days (Days 1 and 3). At the second measurement (mean time 12:10 pm) on the day with hyperopic defocus, the eye was significantly longer, by 0.012 ± 0.002 mm, compared with the other two days (p<0.05). No significant difference was observed in the average timing of the daily peaks in axial length (mean peak time 12:12 pm) and choroidal thickness (mean peak time 9:02 pm) over the three days. Conclusions: The introduction of monocular hyperopic defocus resulted in a significant increase in the amplitude of the diurnal change in axial length and choroidal thickness, which returned to normal the following day after removal of the blur stimulus.
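
The peak-to-trough amplitude referred to above is simply the difference between each day's maximum and minimum measurement; a small sketch with hypothetical readings follows.

```python
# Hypothetical axial-length readings (mm) at the roughly 3-hourly measurement times of one day.
axial_length = {"09:00": 24.115, "12:10": 24.127, "15:05": 24.120,
                "18:00": 24.112, "21:00": 24.109}

amplitude = max(axial_length.values()) - min(axial_length.values())
peak_time = max(axial_length, key=axial_length.get)
print(f"peak-to-trough amplitude = {amplitude:.3f} mm (peak at {peak_time})")
```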

Abstract:

Introduction: The delivery of health care in the 21st century will look like no other in the past. Fast-paced technological advances will need to transition from the information age into clinical practice. e-Health is the over-arching form of information technology in health care, and telehealth is one arm of that phenomenon. The uptake of telehealth, both in Australia and overseas, has changed the face of health service delivery to many rural and remote communities for the better, removing what is known as the tyranny of distance. Many studies have evaluated satisfaction with, and the cost-benefit of, telehealth at the organisational level, as well as various adaptations of clinical pathways, and this has been the predominant focus of studies published to date. However, whilst many researchers have commented on the need to improve and attend to the communication and relationship-building aspects of telehealth, no studies have examined this further. The aim of this study was to identify patient and clinician experiences, concerns, behaviours and perceptions of the telehealth interaction, and to develop a training tool to assist clinicians to improve their interaction skills. Methods: A mixed-methods design combining quantitative (survey analysis and data coding) and qualitative (interview analysis) approaches was adopted. The study comprised four phases: first, a qualitative exploration of the needs of clients (patients) and clinicians within a telehealth consultation; then the design, development, piloting and quantitative and qualitative evaluation of a telehealth communication training program. Qualitative data were collected and analysed during Phase 1 to describe and define the missing 'communication and rapport building' aspects of telehealth. These data were then used in Phase 2 to develop a self-paced, interactive communication training program that built on clinicians' existing skills. Phase 3 evaluated the training program with 26 clinicians, with results recorded pre- and post-training, while Phase 4 piloted the program with a patient group at two rural hospitals within a Queensland Health setting to inform future recommendations. Results: Comparisons of pre- and post-training data on 1) effective communication styles, 2) involvement in the communication training package, 3) satisfaction, and 4) health outcomes indicated improvements in effective communication style and increased satisfaction, but no difference in health outcomes for this patient group. The post-training results revealed that over half of the participants (n = 17, 65%) were more responsive to non-verbal cues and were better able to reflect and respond to looks of anxiousness and confusion from a 'patient' within a telehealth consultation. It was also found that, during post-training evaluations, clinicians had enhanced their therapeutic communication, paying greater attention to their own body posture, eye contact and presentation. More time was spent looking at the 'patient', with intervals of direct eye contact increasing by 35 seconds, and less time was spent looking down at paperwork, which decreased by 20 seconds. Overall, 73% of the clinicians were satisfied with the training program and 61% strongly agreed that they recognised areas of their communication that needed improving during a telehealth consultation. For the patient group there was a significant post-training difference in rapport, with the mean score rising from 42 (SD = 28, n = 27) to 48 (SD = 5.9, n = 24). For the patient group's communication comfort there was a significant difference between pre- and post-training scores, t(10) = 27.9, p = .002, indicating that overall the patients felt less inhibited whilst talking to the clinicians and better understood. Conclusion: The aim of this study was to explore the characteristics of good patient-clinician communication and unmet training needs for telehealth consultations. The study developed a training program that is specific to telehealth consultations and not dependent on a 'trainer' to deliver the content. In light of the existing literature, this is the first of its kind and a valuable contribution to research on this topic. The training program was found to be effective in improving clinicians' communication style and increasing patient satisfaction within an e-health environment. This study also challenges the historical myth that telehealth cannot be part of empathic, patient-centred care because of its technology tag.

Abstract:

Purpose: Hyperactive platelets contribute to the thrombotic response in humans, and exercise transiently increases platelet function. Caffeine is routinely used by athletes as an ergogenic aid, but the combined effect of exercise and caffeine on platelet function has not been investigated. Methods: Twelve healthy males were randomly assigned to one of four groups and undertook four experimental trials of a high-intensity aerobic interval training (AIT) bout or rest, with ingestion of caffeine (3 mg·kg-1) or placebo. AIT comprised 8 × 5-min intervals at approximately 75% peak power output (approximately 80% VO2peak) with 1-min recovery intervals (approximately 40% peak power output, approximately 50% VO2peak). Blood and urine were collected before and at 60 and 90 min after capsule ingestion and analysed for platelet aggregation and activation. Results: AIT increased platelet reactivity to adenosine diphosphate (placebo 30.3%, caffeine 13.4%, P < 0.05) and collagen (placebo 10.8%, caffeine 5.1%, P < 0.05) compared with rest. Exercise with placebo increased adenosine diphosphate-induced aggregation 90 min post-ingestion compared with baseline (40.5%, P < 0.05), but the increase when exercise was combined with caffeine was small (6.6%). During the resting caffeine protocol, collagen-induced aggregation was reduced (-4.3%, P < 0.05). AIT increased expression of the platelet activation marker PAC-1 with exercise placebo (P < 0.05) but not when combined with caffeine. Conclusion: A single bout of AIT increases platelet function, but caffeine ingestion (3 mg·kg-1) does not exacerbate platelet function at rest or in response to AIT. Our results provide new information showing that caffeine, at a dose that can elicit ergogenic effects on performance, has no detrimental effect on platelet function and may have the potential to attenuate increases in platelet activation and aggregation during strenuous exercise.

Abstract:

Background: The pattern of protein intake following exercise may impact whole-body protein turnover and net protein retention. We determined the effects of different protein feeding strategies on protein metabolism in resistance-trained young men. Methods: Participants were randomly assigned to ingest 80 g of whey protein as either 8 × 10 g every 1.5 h (PULSE; n=8), 4 × 20 g every 3 h (intermediate, INT; n=7), or 2 × 40 g every 6 h (BOLUS; n=8) after an acute bout of bilateral knee extension exercise (4 × 10 repetitions at 80% maximal strength). Whole-body protein turnover (Q), synthesis (S), breakdown (B), and net balance (NB) were measured throughout 12 h of recovery by bolus ingestion of [15N]glycine with urinary [15N]ammonia enrichment as the collected end-product. Results: PULSE Q rates were greater than BOLUS (~19%, P<0.05), with a trend towards being greater than INT (~9%, P=0.08). Rates of S were 32% and 19% greater, and rates of B were 51% and 57% greater, for PULSE as compared with INT and BOLUS, respectively (P<0.05), with no difference between INT and BOLUS. There were no statistical differences in NB between groups (P=0.23); however, magnitude-based inferential statistics revealed likely small (mean effect ± 90% CI; 0.59 ± 0.87) and moderate (0.80 ± 0.91) increases in NB for PULSE and INT compared with BOLUS, and a possible small increase (0.42 ± 1.00) for INT vs. PULSE. Conclusion: We conclude that the pattern of ingested protein, and not only the total daily amount, can impact whole-body protein metabolism. Individuals aiming to maximize NB would likely benefit from repeated ingestion of moderate amounts of protein (~20 g) at regular intervals (~3 h) throughout the day.
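
For reference, the turnover quantities named above are conventionally obtained from the single-pool [15N]glycine end-product relations sketched below (whether the authors used exactly this formulation is an assumption): with $d$ the tracer dose, $d_{\mathrm{NH_3}}$ the label recovered in urinary ammonia, $E_{\mathrm{NH_3}}$ the total ammonia nitrogen excreted, $E_{\mathrm{tot}}$ total nitrogen excretion and $I$ nitrogen intake over the collection period,

\[
Q = d\,\frac{E_{\mathrm{NH_3}}}{d_{\mathrm{NH_3}}}, \qquad
S = Q - E_{\mathrm{tot}}, \qquad
B = Q - I, \qquad
NB = S - B = I - E_{\mathrm{tot}}.
\]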

Abstract:

We determined the effect of coingestion of caffeine (Caff) with carbohydrate (CHO) on rates of muscle glycogen resynthesis during recovery from exhaustive exercise in seven trained subjects who completed two experimental trials in a randomized, double-blind crossover design. The evening before an experiment subjects performed intermittent exhaustive cycling and then consumed a low-CHO meal. The next morning subjects rode until volitional fatigue. On completion of this ride subjects consumed either CHO [4 g/kg body mass (BM)] or the same amount of CHO + Caff (8 mg/kg BM) during 4 h of passive recovery. Muscle biopsies and blood samples were taken at regular intervals throughout recovery. Muscle glycogen levels were similar at exhaustion [~75 mmol/kg dry wt (dw)] and increased by a similar amount (~80%) after 1 h of recovery (133 ± 37.8 vs. 149 ± 48 mmol/kg dw for CHO and Caff, respectively). After 4 h of recovery Caff resulted in higher glycogen accumulation (313 ± 69 vs. 234 ± 50 mmol/kg dw, P < 0.001). Accordingly, the overall rate of resynthesis for the 4-h recovery period was 66% higher in Caff compared with CHO (57.7 ± 18.5 vs. 38.0 ± 7.7 mmol·(kg dw)-1·h-1, P < 0.05). After 1 h of recovery plasma Caff levels had increased to 31 ± 11 µM (P < 0.001) and at the end of the recovery reached 77 ± 11 µM (P < 0.001) with Caff. Phosphorylation of CaMK(Thr286) was similar after exercise and after 1 h of recovery, but after 4 h CaMK(Thr286) phosphorylation was higher in Caff than CHO (P < 0.05). Phosphorylation of AMP-activated protein kinase (AMPK)(Thr172) and Akt(Ser473) was similar for both treatments at all time points. We provide the first evidence that in trained subjects coingestion of large amounts of Caff (8 mg/kg BM) with CHO has an additive effect on rates of postexercise muscle glycogen accumulation compared with consumption of CHO alone.
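
As a rough arithmetic check on the quoted rates, dividing the 4-h change in the group-mean glycogen values by the recovery time gives approximately the reported figures (the published rates of 57.7 and 38.0 mmol·(kg dw)-1·h-1 are derived from subject-wise biopsy values, so the rounded means do not match exactly):

\[
\bar r_{\mathrm{Caff}} \approx \frac{313 - 75}{4\ \mathrm{h}} \approx 60\ \mathrm{mmol\,(kg\ dw)^{-1}\,h^{-1}},
\qquad
\bar r_{\mathrm{CHO}} \approx \frac{234 - 75}{4\ \mathrm{h}} \approx 40\ \mathrm{mmol\,(kg\ dw)^{-1}\,h^{-1}}.
\]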

Abstract:

Purpose: The objectives of this study were to examine the effect of 4 weeks of moderate- and high-intensity interval training (MIIT and HIIT) on fat oxidation and on the responses of blood lactate (BLa) and rating of perceived exertion (RPE). Methods: Ten overweight/obese men (age = 29 ± 3.7 years, BMI = 30.7 ± 3.4 kg/m2) participated in a cross-over study of 4-week MIIT and HIIT training. The MIIT sessions consisted of 5-min cycling stages at mechanical workloads 20% above and 20% below 45% VO2peak. The HIIT sessions consisted of intervals of 30-s work at 90% VO2peak and 30-s rest. Pre- and post-training assessments included VO2max using a graded exercise test (GXT) and fat oxidation using a 45-min constant-load test at 45% VO2max. BLa and RPE were also measured during the constant-load exercise test. Results: There were no significant changes in body composition with either intervention. There were significant increases in fat oxidation after MIIT and HIIT (p ≤ 0.01), with no effect of intensity. BLa during the constant-load exercise test decreased significantly after both MIIT and HIIT (p ≤ 0.01), and the difference between MIIT and HIIT was not significant (p = 0.09). RPE decreased significantly more after HIIT than after MIIT (p ≤ 0.05). Conclusion: Interval training can increase fat oxidation with no effect of exercise intensity, but BLa and RPE decreased after HIIT to a greater extent than after MIIT.
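
A sketch of the two interval structures described in the Methods is shown below; "20% above and below" is interpreted here as ±20% of the workload corresponding to 45% VO2peak, and the number of stages and absolute wattages are placeholders, since the abstract gives only relative intensities and stage durations.

```python
# Sketch of the MIIT and HIIT interval structures (durations in seconds, workloads in watts).
def miit_stages(n_stages=6, workload_45=100.0):
    """Alternating 5-min stages at 20% above / 20% below the 45% VO2peak workload (assumed value)."""
    return [(300, workload_45 * (1.2 if i % 2 == 0 else 0.8)) for i in range(n_stages)]

def hiit_intervals(n_intervals=20, workload_90=200.0):
    """30-s work bouts at the 90% VO2peak workload (assumed value) separated by 30-s rest."""
    schedule = []
    for _ in range(n_intervals):
        schedule.append((30, workload_90))  # work
        schedule.append((30, 0.0))          # rest
    return schedule

print(miit_stages()[:4])     # first four MIIT stages
print(hiit_intervals()[:4])  # first two HIIT work/rest pairs
```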

Abstract:

Today, the majority of semiconductor fabrication plants (fabs) conduct equipment preventive maintenance at statistically derived, time- or wafer-count-based intervals. While these practices have had relative success in managing equipment availability and product yield, the cost, in both time and materials, remains high. Condition-based maintenance has been successfully adopted in several industries where the costs associated with equipment downtime range from potential loss of life to unacceptable effects on companies’ bottom lines. In this paper, we present a method for the monitoring of complex systems in the presence of multiple operating regimes. In addition, the new representation of degradation processes is used to define an optimization procedure that facilitates concurrent maintenance and operational decision-making in a manufacturing system. This decision-making procedure metaheuristically maximizes a customizable cost function that reflects the benefits of production uptime and the losses incurred due to deficient quality and downtime. The new degradation monitoring method is illustrated through the monitoring of a deposition tool operating over a prolonged period in a major fab, while the operational decision-making is demonstrated using simulated operation of a generic cluster tool.
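
The abstract does not specify the cost function or the metaheuristic, so the sketch below is only a generic illustration of the idea: a customisable objective that credits production uptime and penalises quality loss and downtime, maximised here by simple random search over a candidate maintenance interval (all coefficients and the quadratic degradation assumption are hypothetical).

```python
# Generic illustration (not the authors' model): pick a maintenance interval that maximises
# a cost function trading production uptime against quality loss and maintenance downtime.
import random

def objective(interval_h, uptime_value=100.0, downtime_cost=400.0,
              quality_cost=2.5, maint_duration_h=4.0):
    """Value of production time minus downtime and degradation-related quality losses
    per maintenance cycle (all coefficients are assumed for illustration)."""
    production = uptime_value * interval_h
    downtime = downtime_cost * maint_duration_h
    quality_loss = quality_cost * interval_h ** 2  # assume quality loss grows as the tool degrades
    return production - downtime - quality_loss

def random_search(n_iter=5000, lo=1.0, hi=200.0, seed=0):
    """Metaheuristic placeholder: random search over the maintenance interval (hours)."""
    rng = random.Random(seed)
    best_interval, best_value = None, float("-inf")
    for _ in range(n_iter):
        candidate = rng.uniform(lo, hi)
        value = objective(candidate)
        if value > best_value:
            best_interval, best_value = candidate, value
    return best_interval, best_value

interval, value = random_search()
print(f"best interval ~ {interval:.1f} h, objective ~ {value:.0f}")
```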

Abstract:

Objective: To determine stage-specific and average disability weights (DWs) for malignant neoplasms and to provide evidence for the study of cancer burden and policy development in Shandong province. Methods: The health status of each cancer patient identified during the 2007 cancer prevalence survey in Shandong was investigated. In line with the GBD methodology for estimating DWs, the extent of disability of every case was classified and evaluated according to the six-class disability classification, and the stage-specific and average DWs with their 95% confidence intervals were calculated using SAS software. Results: A total of 11,757 cancer cases were investigated and evaluated. Across all cancers, the DWs for the therapy, remission, metastasis and terminal stages were 0.310, 0.218, 0.450 and 0.653, respectively. The average DW across all cancers was 0.317 (95% CI: 0.312-0.321). Weights varied significantly by stage and cancer type, while no significant differences were found between males and females. DWs were higher (>0.4) for liver cancer, bone cancer, lymphoma and pancreatic cancer, and lower (<0.3) for breast, cervix uteri, corpus uteri, ovarian, larynx, and mouth and oropharynx cancers. Conclusion: Stage-specific and average DWs for various cancers were estimated from a large survey. The average DW of 0.317 for all cancers indicates that roughly one-third of a healthy year is lost for each year of survival. The differences in DWs between cancers and stages provide scientific evidence for the development of cancer prevention strategies.
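
In outline, the stage-specific and average weights described in the Methods amount to case-level means of a disability value with a confidence interval attached; the sketch below mirrors that procedure with entirely hypothetical numbers and a normal-approximation 95% CI (the survey itself used SAS and the six-class functional grading, neither of which is reproduced here).

```python
# Hypothetical sketch: average the disability value assigned to each case and attach a 95% CI.
import math
from statistics import mean, stdev

def mean_with_ci(values, z=1.96):
    """Mean and normal-approximation 95% confidence interval."""
    m = mean(values)
    half_width = z * stdev(values) / math.sqrt(len(values))
    return m, (m - half_width, m + half_width)

# Disability values implied by each case's six-class functional grading (hypothetical numbers).
stages = {
    "therapy":   [0.28, 0.35, 0.31, 0.27, 0.33, 0.30, 0.32],
    "remission": [0.20, 0.24, 0.19, 0.22, 0.23, 0.21],
}

for stage, cases in stages.items():
    m, (low, high) = mean_with_ci(cases)
    print(f"{stage}: DW = {m:.3f} (95% CI {low:.3f}-{high:.3f})")
```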