950 results for Plant Disease
Abstract:
The somatosensory system plays an important role in balance control and age-related changes to this system have been implicated in falls. Parkinson’s disease (PD) is a chronic and progressive disease of the brain, characterized by postural instability and gait disturbance. Previous research has shown that deficiencies in somatosensory feedback may contribute to the poorer postural control demonstrated by PD individuals. However, few studies have comprehensively explored differences in somatosensory function and postural control between PD participants and healthy older individuals. The soles of the feet contain many cutaneous mechanoreceptors that provide important somatosensory information sources for postural control. Different types of insole devices have been developed to enhance this somatosensory information and improve postural stability, but these devices are often too complex and expensive to integrate into daily life. Textured insoles provide a more passive intervention that may be an inexpensive and accessible means to enhance the somatosensory input from the plantar surface of the feet. However, to date, there has been little work conducted to test the efficacy of enhanced somatosensory input induced by textured insoles in both healthy and PD populations during standing and walking. Therefore, the aims of this thesis were to determine: 1) whether textured insole surfaces can improve postural stability by enhancing somatosensory information in younger and older adults; 2) the differences between healthy older participants and PD participants for measures of physiological function and postural stability during standing and walking; 3) how changes in somatosensory information affect postural stability in both groups during standing and walking; and 4) whether textured insoles can improve postural stability in both groups during standing and walking.
To address these aims, Study 1 recruited seven older individuals and ten healthy young controls to investigate the effects of two textured insole surfaces on postural stability while performing standing balance tests on a force plate. Participants were tested under three insole surface conditions: 1) barefoot; 2) standing on a hard textured insole surface; and 3) standing on a soft textured insole surface. Measurements derived from the centre of pressure displacement included the range of anterior-posterior and medial-lateral displacement, path length and the 90% confidence elliptical area (C90 area). Results of Study 1 revealed a significant Group*Surface*Insole interaction for the four measures. Both textured insole surfaces reduced postural sway for the older group, especially in the eyes closed condition on the foam surface. However, participants reported that the soft textured insole surface was more comfortable and, hence, the soft textured insoles were adopted for Studies 2 and 3. For Study 2, 20 healthy older adults (controls) and 20 participants with Parkinson’s disease were recruited. Participants were evaluated using a series of physiological assessments that included touch sensitivity, vibratory perception, and pain and temperature threshold detection. Furthermore, nerve function and somatosensory evoked potential tests were utilized to provide detailed information regarding peripheral nerve function for these participants. Standing balance and walking were assessed on different surfaces using a force plate and the 3D Vicon motion analysis system, respectively. Data derived from the force plate included the range of anterior-posterior and medial-lateral sway, while measures of stride length, stride period, cadence, double support time, stance phase, velocity and stride timing variability were reported for the walking assessment.
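The centre-of-pressure (COP) measures named above can be computed directly from a force-plate trace. The following is a hedged sketch using synthetic data, not the thesis's processing pipeline: variable names are ours, and the C90 area formula assumes the COP samples are approximately bivariate normal (chi-square quantile 4.605 for 90% coverage with 2 degrees of freedom).

```python
# Sketch: COP sway measures (AP/ML range, path length, C90 ellipse area)
# from a synthetic COP trace. Illustrative only; not the thesis's code.
import numpy as np

rng = np.random.default_rng(2)
ap = np.cumsum(rng.normal(0, 0.05, 3000))   # anterior-posterior COP (cm), random-walk stand-in
ml = np.cumsum(rng.normal(0, 0.05, 3000))   # medial-lateral COP (cm)

ap_range = ap.max() - ap.min()               # range of AP displacement
ml_range = ml.max() - ml.min()               # range of ML displacement
path_length = np.sum(np.hypot(np.diff(ap), np.diff(ml)))  # total COP path

cov = np.cov(np.vstack([ap, ml]))            # 2x2 covariance of the trace
chi2_90 = 4.605                              # chi-square quantile, 2 dof, p = 0.90
c90_area = np.pi * chi2_90 * np.sqrt(np.linalg.det(cov))  # 90% confidence ellipse area

print(ap_range, ml_range, path_length, c90_area)
```

The path length is necessarily at least as long as either displacement range, which gives a quick sanity check on any implementation.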
The results of this study demonstrated that the PD group had decrements in somatosensory function compared to the healthy older control group. For electrodiagnosis, PD participants had poorer nerve function than controls, as evidenced by slower nerve conduction velocities and longer latencies in the sural nerve and a prolonged latency in the P37 somatosensory evoked potential. Furthermore, the PD group displayed more postural sway in both the anterior-posterior and medial-lateral directions relative to controls and these differences were increased when standing on a foam surface. With respect to the gait assessment, the PD group took shorter strides and had a reduced stride period compared with the control group. Furthermore, the PD group spent more time in the stance phase and showed greater cadence and stride timing variability than the controls. Compared with walking on the firm surface, the two groups demonstrated different gait adaptations while walking on the uneven surface. Controls increased their stride length and stride period and decreased their cadence, which resulted in a consistent walking velocity on both surfaces. Conversely, while the PD patients also increased their stride period and decreased their cadence and stance period on the uneven surface, they did not increase their stride length and, hence, walked more slowly on the uneven surface. In the PD group, there was a strong positive association between decreased somatosensory function and decreased clinical balance, as assessed by the Tinetti test. Poorer somatosensory function was also strongly positively correlated with the temporospatial gait parameters, especially shorter stride length. Study 3 evaluated the effects of manipulating the somatosensory information from the plantar surface of the feet using textured insoles in the same populations assessed in Study 2.
For this study, participants performed the standing and walking balance tests under three footwear conditions: 1) barefoot; 2) with smooth insoles; and 3) with textured insoles. Standing balance and walking were evaluated using a force plate and a Vicon motion analysis system and the data were analysed in the same way as outlined for Study 2. The findings showed that the smooth and textured insoles had different effects on postural control during both the standing and walking trials. Both insoles decreased medial-lateral sway to the same level on the firm surface. The greatest benefits were observed in the PD group while wearing the textured insole. When standing under a more challenging condition on the foam surface with eyes closed, only the textured insole decreased medial-lateral sway in the PD group. With respect to the gait trials, both insoles increased walking velocity, stride length and stride time and decreased cadence, but these changes were more pronounced for the textured insoles. In the PD group, the effects of the textured insoles were evident under challenging conditions, increasing walking velocity and stride length while decreasing cadence. Textured insoles were also effective in reducing the time spent in the double support and stance phases of the gait cycle and did not increase stride timing variability, as was the case with the smooth insoles in the PD group. The results of this study suggest that textured insoles, such as those evaluated in this research, may provide a low-cost means of improving postural stability in high-risk groups, such as people with PD, which may act as an important intervention to prevent falls.
Abstract:
Pan et al. claim that our results actually support a strong linear positive relationship between productivity and richness, whereas Fridley et al. contend that the data support a strong humped relationship. These responses illustrate how preoccupation with bivariate patterns distracts from a deeper understanding of the multivariate mechanisms that control these important ecosystem properties.
Abstract:
This paper considers the goal of identifying disease subgroups based on differences in observed symptom profiles. Commonly referred to as phenotype identification, solutions to this task often involve the application of unsupervised clustering techniques. Here, we investigate the application of a Dirichlet Process mixture (DPM) model for this task. This model is defined by the placement of the Dirichlet Process (DP) on the unknown components of a mixture model, allowing for the expression of uncertainty about the partitioning of observed data into homogeneous subgroups. To exemplify this approach, an application to phenotype identification in Parkinson’s disease (PD) is considered, with symptom profiles collected using the Unified Parkinson’s Disease Rating Scale (UPDRS). Keywords: clustering, Dirichlet Process mixture, Parkinson’s disease, UPDRS.
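The DPM idea sketched above can be illustrated with scikit-learn's variational approximation, which places a truncated Dirichlet-process prior on the mixture weights so that unused components are pruned away. The data below are synthetic stand-ins for UPDRS-style symptom profiles (two hypothetical phenotypes), not the study's data, and this variational fit is a stand-in for a full DPM sampler:

```python
# Sketch: Dirichlet Process mixture clustering of symptom profiles via
# scikit-learn's variational truncated-DP approximation. Illustrative only.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Two hypothetical, well-separated phenotypes in a 2-D symptom-score space
group_a = rng.normal(loc=[30, 10], scale=3, size=(50, 2))
group_b = rng.normal(loc=[10, 30], scale=3, size=(50, 2))
profiles = np.vstack([group_a, group_b])

dpm = BayesianGaussianMixture(
    n_components=10,  # truncation level; the DP prior prunes unused components
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(profiles)

labels = dpm.predict(profiles)
print(sorted(set(labels.tolist())))  # effective clusters actually used
```

Note that `n_components` is only an upper bound: the DP prior expresses uncertainty over the number of subgroups, which is the property that makes this model attractive for phenotype identification.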
Abstract:
Background Seasonal changes in cardiovascular disease (CVD) risk factors may be due to exposure to seasonal environmental variables like temperature and acute infections or seasonal behavioural patterns in physical activity and diet. Investigating the seasonal pattern of risk factors should help determine the causes of the seasonal pattern in CVD. Few studies have investigated the seasonal variation in risk factors using repeated measurements from the same individual, which is important as individual and population seasonal patterns may differ. Methods The authors investigated the seasonal pattern in systolic and diastolic blood pressure, heart rate, body weight, total cholesterol, triglycerides, high-density lipoprotein cholesterol, C reactive protein and fibrinogen. Measurements came from 38 037 participants in the population-based cohort, the Tromsø Study, examined up to eight times from 1979 to 2008. Individual and population seasonal patterns were estimated using a cosinor in a mixed model. Results All risk factors had a highly statistically significant seasonal pattern with a peak time in winter, except for triglycerides (peak in autumn), C reactive protein and fibrinogen (peak in spring). The sizes of the seasonal variations were clinically modest. Conclusions Although the authors found highly statistically significant individual seasonal patterns for all risk factors, the sizes of the changes were modest, probably because this subarctic population is well adapted to a harsh climate. Better protection against seasonal risk factors like cold weather could help reduce the winter excess in CVD observed in milder climates.
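The cosinor approach used above re-expresses a sinusoid with unknown phase as a linear regression on sine and cosine terms, from which the amplitude and peak time are recovered. A minimal single-level sketch follows; the study fits the cosinor inside a mixed model to separate individual from population patterns, and the numbers below are synthetic, not Tromsø Study data:

```python
# Sketch: single-level cosinor fit. The model
#   y = M + A*cos(2*pi*(t - t_peak)/365.25)
# expands to y = M + beta*cos(2*pi*t/365.25) + gamma*sin(2*pi*t/365.25),
# which is linear in (M, beta, gamma).
import numpy as np

rng = np.random.default_rng(1)
t = rng.uniform(0, 365.25, size=500)   # day of year at measurement
true_peak_day = 15.0                   # hypothetical mid-winter peak
y = 130 + 3 * np.cos(2 * np.pi * (t - true_peak_day) / 365.25) + rng.normal(0, 1, t.size)

X = np.column_stack([
    np.ones_like(t),
    np.cos(2 * np.pi * t / 365.25),
    np.sin(2 * np.pi * t / 365.25),
])
mesor, beta, gamma = np.linalg.lstsq(X, y, rcond=None)[0]

amplitude = np.hypot(beta, gamma)                              # seasonal swing
peak_day = (np.arctan2(gamma, beta) * 365.25 / (2 * np.pi)) % 365.25
print(round(mesor, 1), round(amplitude, 2), round(peak_day, 1))
```

In the mixed-model version, the `beta` and `gamma` coefficients gain per-participant random effects, which is what allows individual and population seasonal patterns to differ.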
Abstract:
Parkinson’s disease (PD) is a progressive, chronic neurodegenerative disorder for which there is no known cure. Physical exercise programs may be used to assist with the physical management of PD. Several studies have demonstrated that community based physical therapy programs are effective in reducing physical aspects of disability among people with PD. While multidisciplinary therapy interventions may have the potential to reduce disability and improve the quality of life of people with PD, there is very limited clinical trial evidence to support or refute the use of community based multidisciplinary or interdisciplinary programs for people with PD. A two group randomized trial is being undertaken within a community rehabilitation service in Brisbane, Australia. Community dwelling adults with a diagnosis of Idiopathic Parkinson’s disease are being recruited. Eligible participants are randomly allocated to a standard exercise rehabilitation group program or an intervention group which incorporates physical, cognitive and speech activities in a multi-tasking framework. Outcomes will be measured at 6-week intervals for a period of six months. Primary outcome measures are the Montreal Cognitive Assessment (MoCA) and the Timed Up and Go (TUG) cognitive test. Secondary outcomes include changes in health related quality of life, communication, social participation, mobility, strength and balance, and carer burden measures. This study will determine the immediate and long-term effectiveness of a unique multifocal, interdisciplinary, dual-tasking approach to the management of PD as compared to an exercise only program. We anticipate that the results of this study will have implications for the development of cost effective evidence based best practice for the treatment of people with PD living in the community.
Abstract:
Objective The spondylarthritides (SpA), including ankylosing spondylitis (AS), psoriatic arthritis (PsA), reactive arthritis, and arthritis associated with inflammatory bowel disease, cause chronic inflammation of the large peripheral and axial joints, eyes, skin, ileum, and colon. Genetic studies reveal common candidate genes for AS, PsA, and Crohn's disease, including IL23R, IL12B, STAT3, and CARD9, all of which are associated with interleukin-23 (IL-23) signaling downstream of the dectin 1 β-glucan receptor. In autoimmune-prone SKG mice with mutated ZAP-70, which attenuates T cell receptor signaling and increases the autoreactivity of T cells in the peripheral repertoire, IL-17–dependent inflammatory arthritis developed after dectin 1–mediated fungal infection. This study was undertaken to determine whether SKG mice injected with 1,3-β-glucan (curdlan) develop evidence of SpA, and the relationship of innate and adaptive autoimmunity to this process. Methods SKG mice and control BALB/c mice were injected once with curdlan or mannan. Arthritis was scored weekly, and organs were assessed for pathologic features. Anti–IL-23 monoclonal antibodies were injected into curdlan-treated SKG mice. CD4+ T cells were transferred from curdlan-treated mice to SCID mice, and sera were analyzed for autoantibodies. Results After systemic injection of curdlan, SKG mice developed enthesitis, wrist, ankle, and sacroiliac joint arthritis, dactylitis, plantar fasciitis, vertebral inflammation, ileitis resembling Crohn's disease, and unilateral uveitis. Mannan triggered spondylitis and arthritis. Arthritis and spondylitis were T cell– and IL-23–dependent and were transferable to SCID recipients with CD4+ T cells. SpA was associated with collagen- and proteoglycan-specific autoantibodies. Conclusion Our findings indicate that the SKG ZAP-70W163C mutation predisposes BALB/c mice to SpA, resulting from innate and adaptive autoimmunity, after systemic β-glucan or mannan exposure.
Abstract:
Efficient management of domestic wastewater is a primary requirement for human wellbeing. Failure to adequately address issues of wastewater collection, treatment and disposal can lead to adverse public health and environmental impacts. The increasing spread of urbanisation has led to the conversion of previously rural land into urban developments and the more intensive development of semi urban areas. However, the provision of reticulated sewerage facilities has not kept pace with this expansion in urbanisation. This has resulted in a growing dependency on onsite sewage treatment. Though considered only as a temporary measure in the past, these systems are now regarded as the most cost effective option and have become a permanent feature in some urban areas. This report is the first of a series of reports to be produced and is the outcome of a research project initiated by the Brisbane City Council. The primary objective of the research undertaken was to relate the treatment performance of onsite sewage treatment systems to soil conditions at the site, with the emphasis being on septic tanks. This report consists of a ‘state of the art’ review of research undertaken in the arena of onsite sewage treatment. The evaluation of research brings together significant work undertaken locally and overseas. It focuses mainly on septic tanks in keeping with the primary objectives of the project. This report has acted as the springboard for the later field investigations and analysis undertaken as part of the project. Septic tanks still continue to be used widely due to their simplicity and low cost. Generally the treatment performance of septic tanks can be highly variable due to numerous factors, but a properly designed, operated and maintained septic tank can produce effluent of satisfactory quality.
The reduction of hydraulic surges from washing machines and dishwashers, regular removal of accumulated septage and the elimination of harmful chemicals are some of the practices that can improve system performance considerably. The relative advantages of multi chamber over single chamber septic tanks are an issue that needs to be resolved in view of the conflicting research outcomes. In recent years, aerobic wastewater treatment systems (AWTS) have been gaining in popularity. This can be mainly attributed to the desire to avoid subsurface effluent disposal, which is the main cause of septic tank failure. The use of aerobic processes for treatment of wastewater and the disinfection of effluent prior to disposal is capable of producing effluent of a quality suitable for surface disposal. However, the field performance of these systems has been disappointing. A significant number of these systems do not perform to stipulated standards and quality can be highly variable. This is primarily due to houseowner neglect or ignorance of correct operational and maintenance procedures. The other problems include greater susceptibility to shock loadings and sludge bulking. As identified in the literature, a number of design features can also contribute to this wide variation in quality. The other treatment processes in common use are the various types of filter systems. These include intermittent and recirculating sand filters. These systems too have their inherent advantages and disadvantages. Furthermore, as in the case of aerobic systems, their performance is very much dependent on individual houseowner operation and maintenance practices. In recent years the use of biofilters has attracted research interest, particularly the use of peat. High removal rates of various wastewater pollutants have been reported in the research literature. Despite these satisfactory results, leachate from peat has been reported in various studies.
This is an issue that needs further investigation and, as such, biofilters can still be considered to be in the experimental stage. The use of other filter media such as absorbent plastic and bark has also been reported in the literature. The safe and hygienic disposal of treated effluent is a matter of concern in the case of onsite sewage treatment. Subsurface disposal is the most common and the only option in the case of septic tank treatment. Soil is an excellent treatment medium if suitable conditions are present. The processes of sorption, filtration and oxidation can remove the various wastewater pollutants. The subsurface characteristics of the disposal area are among the most important parameters governing process performance. Therefore it is important that the soil and topographic conditions are taken into consideration in the design of the soil absorption system. Seepage trenches and beds are the common systems in use. Seepage pits or chambers can be used where subsurface conditions warrant, whilst above grade mounds have been recommended for a variety of difficult site conditions. All these systems have their inherent advantages and disadvantages and the preferable soil absorption system should be selected based on site characteristics. The use of gravel as in-fill for beds and trenches is open to question. It does not contribute to effluent treatment and has been shown to reduce the effective infiltrative surface area. This is due to physical obstruction and the migration of fines entrained in the gravel, into the soil matrix. The surface application of effluent is coming into increasing use with the advent of aerobic treatment systems. This has the advantage that treatment is undertaken in the upper soil horizons, which are chemically and biologically the most effective in effluent renovation. Numerous research studies have demonstrated the feasibility of this practice. However, the overriding criterion is the quality of the effluent.
It has to be of exceptionally good quality in order to ensure that there are no resulting public health impacts due to aerosol drift. This essentially is the main issue of concern, due to the unreliability of the effluent quality from aerobic systems. Secondly, it has also been found that most householders do not take adequate care in the operation of spray irrigation systems or in the maintenance of the irrigation area. Under these circumstances surface disposal of effluent should be approached with caution and would require appropriate householder education and stringent compliance requirements. However, despite all this, the efficiency with which the process is undertaken will ultimately rest with the individual householder and this is where most concern rests. Greywater, too, requires similar consideration. Surface irrigation of greywater is currently being permitted in a number of local authority jurisdictions in Queensland. Considering the fact that greywater constitutes the largest fraction of the total wastewater generated in a household, it could be considered to be a potential resource. Unfortunately, in most circumstances the only pretreatment that is required to be undertaken prior to reuse is the removal of oil and grease. This is an issue of concern as greywater can be considered a weak to medium strength sewage: it contains primary pollutants such as BOD material and nutrients and may also include microbial contamination. Therefore its use for surface irrigation can pose a potential health risk. This is further compounded by the fact that most householders are unaware of the potential adverse impacts of indiscriminate greywater reuse. As in the case of blackwater effluent reuse, there have been suggestions that greywater should also be subjected to stringent guidelines. Under these circumstances the surface application of any wastewater requires careful consideration.
The other option available for the disposal of effluent is the use of evaporation systems. The use of evapotranspiration systems has been covered in this report. Research has shown that these systems are susceptible to a number of factors and in particular to climatic conditions. As such their applicability is location specific. Also the design of systems based solely on evapotranspiration is questionable. In order to ensure more reliability, the systems should be designed to include soil absorption. The successful use of these systems for intermittent usage has been noted in the literature. Taking into consideration the issues discussed above, subsurface disposal of effluent is the safest under most conditions. This is provided the facility has been designed to accommodate site conditions. The main problem associated with subsurface disposal is the formation of a clogging mat on the infiltrative surfaces. Due to the formation of the clogging mat, the capacity of the soil to handle effluent is no longer governed by the soil’s hydraulic conductivity as measured by the percolation test, but rather by the infiltration rate through the clogged zone. The characteristics of the clogging mat have been shown to be influenced by various soil and effluent characteristics. Secondly, the mechanisms of clogging mat formation have been found to be influenced by various physical, chemical and biological processes. Biological clogging is the most common process taking place and occurs due to bacterial growth or its by-products reducing the soil pore diameters. Biological clogging is generally associated with anaerobic conditions. The formation of the clogging mat provides significant benefits. It acts as an efficient filter for the removal of microorganisms. Also as the clogging mat increases the hydraulic impedance to flow, unsaturated flow conditions will occur below the mat. This permits greater contact between effluent and soil particles thereby enhancing the purification process.
This is particularly important in the case of highly permeable soils. However the adverse impacts of the clogging mat formation cannot be ignored as they can lead to significant reduction in the infiltration rate. This in fact is the most common cause of soil absorption system failure. As the formation of the clogging mat is inevitable, it is important to ensure that it does not impede effluent infiltration beyond tolerable limits. Various strategies have been investigated to either control clogging mat formation or to remediate its severity. Intermittent dosing of effluent is one such strategy that has attracted considerable attention. Research conclusions with regard to short duration rest intervals are contradictory. It has been claimed that the intermittent rest periods would result in the aerobic decomposition of the clogging mat leading to a subsequent increase in the infiltration rate. Contrary to this, it has also been claimed that short duration rest periods are insufficient to completely decompose the clogging mat, and the intermediate by-products that form as a result of aerobic processes would in fact lead to even more severe clogging. It has been further recommended that the rest periods should be much longer and should be in the range of about six months. This entails the provision of a second and alternating seepage bed. The other concepts that have been investigated are the design of the bed to meet the equilibrium infiltration rate that would eventuate after clogging mat formation; improved geometry such as the use of seepage trenches instead of beds; serial instead of parallel effluent distribution and low pressure dosing of effluent. The use of physical measures such as oxidation with hydrogen peroxide and replacement of the infiltration surface have been shown to be only of short-term benefit.
Another issue of importance is the degree of pretreatment that should be provided to the effluent prior to subsurface application and the influence exerted by pollutant loadings on the clogging mat formation. Laboratory studies have shown that the total mass loadings of BOD and suspended solids are important factors in the formation of the clogging mat. The nature of the suspended solids has also been found to be an important factor. The finer particles from extended aeration systems, when compared to those from septic tanks, will penetrate deeper into the soil and hence will ultimately cause a more dense clogging mat. However the importance of improved pretreatment in clogging mat formation may need to be qualified in view of other research studies. It has also been shown that effluent quality may be a factor in the case of highly permeable soils but this may not be the case with fine structured soils. The ultimate test of onsite sewage treatment system efficiency rests with the final disposal of effluent. The implication of system failure as evidenced from the surface ponding of effluent or the seepage of contaminants into the groundwater can be very serious as it can lead to environmental and public health impacts. Significant microbial contamination of surface and groundwater has been attributed to septic tank effluent. There are a number of documented instances of septic tank related waterborne disease outbreaks affecting large numbers of people. In a recent incident, the local authority was found liable for an outbreak of viral hepatitis A and not the individual septic tank owners, as no action had been taken to remedy septic tank failure. This illustrates the responsibility placed on local authorities in terms of ensuring the proper operation of onsite sewage treatment systems. Even a properly functioning soil absorption system is only capable of removing phosphorus and microorganisms.
The nitrogen remaining after plant uptake will not be retained in the soil column, but will instead gradually seep into the groundwater as nitrate. Conditions for nitrogen removal by denitrification are not generally present in a soil absorption bed. Dilution by groundwater is the only treatment available for reducing the nitrogen concentration to specified levels. Therefore based on subsurface conditions, this essentially entails a maximum allowable concentration of septic tanks in a given area. Unfortunately nitrogen is not the only wastewater pollutant of concern. Relatively long survival times and travel distances have been noted for microorganisms originating from soil absorption systems. This is likely to happen if saturated conditions persist under the soil absorption bed or due to surface runoff of effluent as a result of system failure. Soils have a finite capacity for the removal of phosphorus. Once this capacity is exceeded, phosphorus too will seep into the groundwater. The relatively high mobility of phosphorus in sandy soils has been noted in the literature. These issues have serious implications in the design and siting of soil absorption systems. It is not only important to ensure that the system design is based on subsurface conditions but also the density of these systems in given areas is a critical issue. This essentially involves the adoption of a land capability approach to determine the limitations of an individual site for onsite sewage disposal. The most limiting factor at a particular site would determine the overall capability classification for that site which would also dictate the type of effluent disposal method to be adopted.
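The dilution argument above amounts to a steady-state mass balance: the nitrate concentration reaching groundwater depends on the ratio of effluent flow to recharge flow, which is why a maximum septic tank density follows from subsurface conditions. A sketch with hypothetical numbers (the report gives none; flows and concentrations below are invented for illustration):

```python
# Sketch: steady-state nitrate mass balance for effluent diluted by
# groundwater recharge. All figures are hypothetical.
def mixed_nitrate(q_eff, c_eff, q_rech, c_back):
    """Mixed concentration (mg/L) from effluent flow q_eff (m3/day) at
    concentration c_eff, diluted by recharge q_rech at background c_back."""
    return (q_eff * c_eff + q_rech * c_back) / (q_eff + q_rech)

# One tank discharging 1 m3/day at 40 mg/L N into 10 m3/day of recharge
# carrying 1 mg/L background nitrate:
print(round(mixed_nitrate(1.0, 40.0, 10.0, 1.0), 1))  # mixed mg/L N
```

Doubling the tank density halves the recharge available per tank, raising the mixed concentration, which is the quantitative basis for the "maximum allowable concentration of septic tanks" the text describes.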
Abstract:
Objectives In non-alcoholic fatty liver disease (NAFLD), hepatic steatosis is intricately linked with a number of metabolic alterations. We studied substrate utilisation in NAFLD during basal, insulin-stimulated and exercise conditions, and correlated these outcomes with disease severity. Methods 20 patients with NAFLD (mean±SD body mass index (BMI) 34.1±6.7 kg/m2) and 15 healthy controls (BMI 23.4±2.7 kg/m2) were assessed. Respiratory quotient (RQ), whole-body fat (Fatox) and carbohydrate (CHOox) oxidation rates were determined by indirect calorimetry in three conditions: basal (resting and fasted), insulin-stimulated (hyperinsulinaemic–euglycaemic clamp) and exercise (cycling at an intensity to elicit maximal Fatox). Severity of disease and steatosis were determined by liver histology, hepatic Fatox from plasma β-hydroxybutyrate concentrations, aerobic fitness expressed as , and visceral adipose tissue (VAT) measured by computed tomography. Results Within the overweight/obese NAFLD cohort, basal RQ correlated positively with steatosis (r=0.57, p=0.01) and was higher (indicating smaller contribution of Fatox to energy expenditure) in patients with NAFLD activity score (NAS) ≥5 vs <5 (p=0.008). Both results were independent of VAT, % body fat and BMI. Compared with the lean control group, patients with NAFLD had lower basal whole-body Fatox (1.2±0.3 vs 1.5±0.4 mg/kgFFM/min, p=0.024) and lower basal hepatic Fatox (ie, β-hydroxybutyrate, p=0.004). During exercise, they achieved lower maximal Fatox (2.5±1.4 vs 5.8±3.7 mg/kgFFM/min, p=0.002) and lower (p<0.001) than controls. Fatox during exercise was not associated with disease severity (p=0.79). Conclusions Overweight/obese patients with NAFLD had reduced hepatic Fatox and reduced whole-body Fatox under basal and exercise conditions. There was an inverse relationship between ability to oxidise fat in basal conditions and histological features of NAFLD including severity of steatosis and NAS.
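Whole-body fat and carbohydrate oxidation rates from indirect calorimetry are commonly derived with the Frayn (1983) stoichiometric equations, neglecting protein oxidation. The paper does not state which equations it used, so the following is an illustrative sketch only, with invented resting gas-exchange values:

```python
# Sketch: substrate oxidation from indirect calorimetry using the widely
# cited Frayn (1983) equations (protein oxidation neglected). Illustrative
# only; not necessarily the equations used in the study above.
def substrate_oxidation(vo2_l_min: float, vco2_l_min: float) -> dict:
    """Return RQ and oxidation rates (g/min) from gas-exchange volumes (L/min)."""
    rq = vco2_l_min / vo2_l_min                       # respiratory quotient
    fat_ox = 1.67 * vo2_l_min - 1.67 * vco2_l_min     # fat oxidation, g/min
    cho_ox = 4.55 * vco2_l_min - 3.21 * vo2_l_min     # carbohydrate oxidation, g/min
    return {"RQ": rq, "fat_ox_g_min": fat_ox, "cho_ox_g_min": cho_ox}

# Hypothetical resting values: VO2 = 0.30 L/min, VCO2 = 0.24 L/min
print(substrate_oxidation(0.30, 0.24))
```

A higher RQ at a given VO2 shifts these equations toward carbohydrate and away from fat, which is why the elevated basal RQ reported above indicates a smaller contribution of Fatox to energy expenditure.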
Abstract:
After more than 25 years of published investigation, including randomized controlled trials, the role of omega-3 polyunsaturated fatty acids in the treatment of kidney disease remains unclear. In vitro and in vivo experimental studies support the efficacy of omega-3 polyunsaturated fatty acids on inflammatory pathways involved with the progression of kidney disease. Clinical investigations have focused predominantly on immunoglobulin A (IgA) nephropathy. More recently, lupus nephritis, polycystic kidney disease, and other glomerular diseases have been investigated. Clinical trials have shown conflicting results for the efficacy of omega-3 polyunsaturated fatty acids in IgA nephropathy, which may relate to varying doses, proportions of eicosapentaenoic acid and docosahexaenoic acid, duration of therapy, and sample size of the study populations. Meta-analyses of clinical trials using omega-3 polyunsaturated fatty acids in IgA nephropathy have been limited by the quality of available studies. However, guidelines suggest that omega-3 polyunsaturated fatty acids should be considered in progressive IgA nephropathy. Omega-3 polyunsaturated fatty acids decrease blood pressure, a known accelerant of kidney disease progression. Well-designed, adequately powered, randomized, controlled clinical trials are required to further investigate the potential benefits of omega-3 polyunsaturated fatty acids on the progression of kidney disease and patient survival.
Resumo:
The purpose of this paper is to determine and discuss the plant and machinery valuation syllabus for higher education in Malaysia, to ensure the practicality of the subject in the real market. There have been limited studies in the plant and machinery area, by either scholars or practitioners. Most papers have highlighted the methodologies, but few have discussed plant and machinery valuation education. This paper determines inputs for plant and machinery valuation guidance, focussing on syllabus set-up and on references for valuers interested in this area of expertise. A qualitative approach via content analysis is used to compare international and Malaysian plant and machinery valuation syllabi and to suggest improvements to the Malaysian syllabus. It was found that few higher education institutions in the world provide plant and machinery valuation courses as part of their property studies syllabus. Further investigation revealed that on-the-job training, drawing on valuers' experience, is the preferred method of plant and machinery valuation education. The significance of this paper is to increase the level of understanding of plant and machinery valuation criteria and to provide Malaysian stakeholders with suggestions on the relevant elements of a plant and machinery valuation education syllabus.
Resumo:
Herbivory is generally regarded as negatively impacting host plant fitness. Frugivorous insects, which feed directly on plant reproductive tissues, are predicted to be particularly damaging to hosts. We tested this prediction with the fruit fly, Bactrocera tryoni, by recording the impact of larval feeding on two direct (seed number and germination) and two indirect (fruit decay rate and attraction/deterrence of vertebrate frugivores) measures of host plant fitness. Experiments were done in the laboratory, glasshouse and tropical rainforest. We found no negative impact of larval feeding on seed number or germination for three test plants: tomato, capsicum and eggplant. Further, larval feeding accelerated the initiation of decay and increased the final level of fruit decay in tomatoes, apples, pawpaw and pear, a result considered to be beneficial to the fruit. In rainforest studies, native rodents preferred infested apples and pears over uninfested control fruit; however, no differences between treatments were observed for tomato and pawpaw. For our study fruits, these results demonstrate that fruit fly larval infestation has neutral or beneficial impacts on the host plant, an outcome which may be largely influenced by the physical properties of the host. These results may help explain why fruit flies have not evolved the level of host specialisation generally observed in other herbivore groups.
Resumo:
Diabetes is one of the greatest public health challenges to face Australia. It is already Australia’s leading cause of kidney failure, blindness (in those under 60 years) and lower limb amputation, and causes significant cardiovascular disease. Australia’s diabetes amputation rate is one of the worst in the developed world, and appears to have significantly increased in the last decade, whereas some other diabetes complication rates appear to have decreased. This paper aims to compare the national burden of disease for the four major diabetes-related complications and the availability of government funding to combat these complications, in order to determine where diabetes foot disease ranks in Australia. Our review of relevant national literature indicates foot disease ranks second overall in burden of disease and last in evidenced-based government funding to combat these diabetes complications. This suggests public funding to address foot disease in Australia is disproportionately low when compared to funding dedicated to other diabetes complications. There is ample evidence that appropriate government funding of evidence-based care improves all diabetes complication outcomes and reduces overall costs. Numerous diverse Australian peak bodies have now recommended similar diabetes foot evidence-based strategies that have reduced diabetes amputation rates and associated costs in other developed nations. It would seem intuitive that “it’s time” to fund these evidence-based strategies for diabetes foot disease in Australia as well.
Resumo:
Based on a national audit of chronic heart failure (CHF) management programmes (CHF-MPs) conducted in 2006, Driscoll et al identified a disproportionate distribution ranging from 0 to 4.2 programmes/million population in the various states of Australia with many programmes not following best practice.1 We welcome their proposal to develop national benchmarks for CHF management and acknowledge the contributions of the Heart Foundation and health professionals in finalising these recommendations.2 We would like to share the Queensland experience in striving towards best practice with the number of CHF-MPs increasing from four (at the time of the 2006 survey) to 23, equating to 5.0 programmes/million population. Queensland now has a state-wide heart failure service steering committee with a focus on the development of CHF-MPs supported by a central coordinator...
Resumo:
Abstract Background: As low HDL cholesterol levels are a risk factor for cardiovascular disease, raising HDL cholesterol substantially by inhibiting or modulating cholesteryl ester transfer protein (CETP) may be useful in coronary artery disease. The first CETP inhibitor to enter clinical trials, torcetrapib, was shown to increase levels of HDL cholesterol, but it also worsened cardiovascular outcomes, probably due to an increase in blood pressure and aldosterone secretion via an off-target mechanism or mechanisms. Objective/methods: Dalcetrapib is a new CETP modulator that increases levels of HDL cholesterol but does not increase blood pressure or aldosterone secretion. The objective was to evaluate a paper describing the effects of dalcetrapib on carotid and aortic wall thickness in subjects with, or at high risk of, coronary artery disease: the dal-PLAQUE study. Results: dal-PLAQUE showed that dalcetrapib reduced the progression of atherosclerosis, and may also reduce the associated vascular inflammation, in subjects with, or at high risk of, coronary heart disease who were already taking statins. Conclusions: These results suggest that modulating CETP with dalcetrapib may be a beneficial mechanism in cardiovascular disease. The results of the dal-HEART series, which includes dal-PLAQUE 1 and 2, and dal-OUTCOMES, when complete, will provide more definitive information about the benefit, or not, of dalcetrapib in coronary artery disease.
Resumo:
Bananas are one of the world's most important crops, serving as a staple food and an important source of income for millions of people in the subtropics. Pests and diseases are a major constraint to banana production. To prevent the spread of pests and disease, farmers are encouraged to use disease- and insect-free planting material obtained by micropropagation. This option, however, does not always exclude viruses, and concern remains over the quality of planting material. Therefore, there is a demand for effective and reliable virus indexing procedures for tissue culture (TC) material. Reliable diagnostic tests are currently available for all of the economically important viruses of bananas with the exception of Banana streak viruses (BSV, Caulimoviridae, Badnavirus). Development of a reliable diagnostic test for BSV is complicated by the significant serological and genetic variation reported for BSV isolates, and by the presence of endogenous BSV (eBSV). Current PCR- and serological-based diagnostic methods for BSV may not detect all species of BSV, and PCR-based methods may give false positives because of the presence of eBSV. Rolling circle amplification (RCA) has been reported as a technique to detect BSV which can also discriminate between episomal and endogenous BSV sequences. However, the method is too expensive for large-scale screening of samples in developing countries, and little information is available regarding its sensitivity. Therefore, the development of reliable PCR-based assays is still considered the most appropriate option for large-scale screening of banana plants for BSV.
This MSc project aimed to refine and optimise the protocols for BSV detection, with a particular focus on developing reliable PCR-based diagnostics. Initially, the appropriateness and reliability of PCR and RCA as diagnostic tests for BSV detection were assessed by testing 45 field samples of banana collected from nine districts in the Eastern region of Uganda in February 2010. This research also aimed to investigate the diversity of BSV in eastern Uganda, identifying the BSV species present and characterising any new BSV species. Of the 45 samples tested, 38 and 40 were considered positive by PCR and RCA, respectively. Six different species of BSV, namely Banana streak IM virus (BSIMV), Banana streak MY virus (BSMYV), Banana streak OL virus (BSOLV), Banana streak UA virus (BSUAV), Banana streak UL virus (BSULV) and Banana streak UM virus (BSUMV), were detected by PCR and confirmed by RCA and sequencing. No new species were detected, but this was the first report of BSMYV in Uganda. Although RCA was demonstrated to be suitable for broad-range detection of BSV, it proved time-consuming and laborious for identification in field samples. Due to the disadvantages associated with RCA, attempts were made to develop a reliable PCR-based assay for the specific detection of episomal BSOLV, Banana streak GF virus (BSGFV), BSMYV and BSIMV. For BSOLV and BSGFV, the integrated sequences exist as rearranged, repeated and partially inverted portions at their sites of integration. Therefore, for these two viruses, primer sets were designed by mapping previously published sequences of their endogenous counterparts onto published sequences of the episomal genomes. Two primer sets were designed for BSOLV, and a single primer set for BSGFV.
The episomal specificity of these primer sets was assessed by testing 106 plant samples collected during surveys in Kenya and Uganda, and 33 leaf samples from a wide range of banana cultivars maintained in TC at the Maroochy Research Station of the Department of Employment, Economic Development and Innovation (DEEDI), Queensland. All of these samples had previously been tested for episomal BSV by RCA, and for both BSOLV and BSGFV by PCR using published primer sets. These analyses showed that the newly designed primer sets for BSOLV and BSGFV were able to distinguish between episomal BSV and eBSV in most cultivars with some B-genome component. In some samples, however, amplification was observed using the putative episomal-specific primer sets where episomal BSV had not been identified using RCA. This may reflect a difference in the sensitivity of PCR compared to RCA, or possibly the presence of an eBSV sequence of different conformation. Since the sequences of the respective eBSVs for BSMYV and BSIMV in the M. balbisiana genome are not available, a series of random primer combinations was tested in an attempt to find potential episomal-specific primer sets for BSMYV and BSIMV. Of an initial 20 primer combinations screened for BSMYV detection on a small number of control samples, 11 primer sets appeared to be episomal-specific. However, subsequent testing of two of these primer combinations on a larger number of control samples produced some inconsistent results which will require further investigation. Testing of the 25 primer combinations for episomal-specific detection of BSIMV on a number of control samples showed that none was able to discriminate between episomal and endogenous BSIMV. The final component of this research project was the development of an infectious clone of a BSV endemic in Australia, namely BSMYV. This was considered important to enable the generation of the large amounts of diseased plant material needed for further research.
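The way the thesis reasons about discordant PCR and RCA results can be summarised as a small decision helper. This is a hypothetical sketch of that interpretation logic only; the function name and wording are illustrative and not part of the thesis:

```python
def interpret_episomal_status(episomal_pcr_positive, rca_positive):
    """Interpret combined results from a putative episomal-specific PCR
    assay and RCA for one sample (illustrative decision logic only)."""
    if episomal_pcr_positive and rca_positive:
        # Both assays agree: episomal (infectious) BSV is present.
        return "episomal BSV infection confirmed"
    if episomal_pcr_positive and not rca_positive:
        # The discordant case discussed in the thesis: PCR may simply be
        # more sensitive than RCA, or the primers may be amplifying an
        # eBSV integrant of a conformation not yet characterised.
        return "PCR-positive only: higher PCR sensitivity or unknown eBSV conformation"
    if rca_positive and not episomal_pcr_positive:
        # RCA detects episomal virus the primer set missed.
        return "episomal BSV present but missed by this primer set"
    return "no episomal BSV detected"
```

Framing the outcomes this way makes explicit why PCR-positive/RCA-negative samples require follow-up rather than being scored as simple false positives.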
A terminally redundant fragment (~1.3 × the BSMYV genome) was cloned and transformed into Agrobacterium tumefaciens strain AGL1, and used to inoculate 12 healthy banana plants of the cultivar Cavendish (Williams) by three different methods. At 12 weeks post-inoculation, (i) four of the five banana plants inoculated by corm injection showed characteristic BSV symptoms while the remaining plant was wilting/dying, (ii) three of the five banana plants inoculated by needle-pricking of the stem showed BSV symptoms, one plant was symptomless and the remaining plant had died, and (iii) both banana plants inoculated by leaf infiltration were symptomless. When banana leaf samples were tested for BSMYV by PCR and RCA, BSMYV was confirmed in all banana plants showing symptoms, including those that were wilting and/or dying. The results from this research have provided several avenues for further work. By completely sequencing all variants of eBSOLV and eBSGFV, and fully sequencing the eBSIMV and eBSMYV regions, episomal-specific primer sets could potentially be designed for all eBSVs that avoid every integrant of the particular BSV species. Furthermore, the development of an infectious BSV clone will enable large numbers of BSV-infected plants to be generated for further testing of the sensitivity of RCA compared with more established assays such as PCR. The development of infectious clones also opens the possibility for virus-induced gene silencing studies in banana.