966 results for Loss prevention
Abstract:
Mastitis is one of the most economically significant diseases for the dairy industry, affecting both backyard farmers in developing countries and high-producing herds worldwide. Two of the major factors impeding reduction in the incidence of this disease are (a) the lack of an effective vaccine capable of protecting against multiple etiological agents and (b) the propensity of some etiological agents to develop persistent antibiotic resistance in biofilms. This is further complicated by the continuing shift in the predominant etiological agents of mastitis, which depends on a multitude of factors such as variability in hygienic practices on farms, easy access leading to overuse of appropriate or inappropriate antibiotics at suboptimal concentrations (particularly in developing countries), and lack of compliance with recommended treatment schedules. Regardless, Staphylococcus aureus and Streptococcus uberis have become the predominant etiological agents of bovine mastitis, followed by Escherichia coli, Streptococcus agalactiae, Streptococcus dysgalactiae, Klebsiella pneumoniae and the newly emerging Mycoplasma bovis. Current approaches to reducing the negative economic impact of this disease rely on early diagnosis of infection, immediate treatment with an antibiotic found to inhibit or kill the pathogen(s) in vitro using planktonic cultures, and the use of currently marketed vaccines regardless of their demonstrated effectiveness. Given the limitations of breeding programs, including genetic selection to improve resistance against infectious diseases such as mastitis, it is imperative to have an effective broad-spectrum, preferably cross-protective, vaccine against bovine mastitis, both to reduce the incidence of the disease and to interrupt potential cross-species transmission to humans.
This overview highlights the major etiological agents, factors affecting susceptibility to mastitis, and the current status of antibiotic-based therapies and prototype vaccine candidates or commercially available vaccines against bovine mastitis as potential preventative strategies. © 2013 Tiwari JG, et al.
Abstract:
Lemon myrtle, anise myrtle, and Tasmanian pepper leaf are commercial Australian native herbs with a high volatile or essential oil content. Packaging of the herbs in high- or low-density polyethylene (HDPE and LDPE) has proven ineffective in preventing a significant loss of volatile components on storage. This study investigates and compares the effectiveness of alternative high-barrier packaging materials, namely polyvinylidene chloride coated polyethylene terephthalate/cast polypropylene (PVDC coated PET/CPP) and polyethylene terephthalate/polyethylene terephthalate/aluminum foil/linear low-density polyethylene (PET/PET/Foil/LLDPE), in preventing volatile compound loss from the three native herbs stored at ambient temperature for 6 months. Concentrations of major volatiles were monitored using gas chromatography–mass spectrometry (GC-MS). After 6 months of storage, the greatest loss of volatiles from lemon myrtle was observed in traditional LDPE packaging (87% loss), followed by storage in PVDC coated PET/CPP (58% loss) and PET/PET/Foil/LLDPE (23% loss). The volatile loss from anise myrtle and Tasmanian pepper leaf stored in PVDC coated PET/CPP and PET/PET/Foil/LLDPE packaging was <30%. This study clearly indicates the importance of selecting the correct packaging material to retain the quality of herbs with high volatile content.
Abstract:
There is a worldwide trend of deteriorating water quality and light levels in the coastal zone, and this has been linked to declines in seagrass abundance. Localized management of seagrass meadow health requires water quality guidelines that meet seagrass growth requirements. Tropical seagrass meadows are diverse and can be highly dynamic, and we have used this dynamism to identify light thresholds in multi-specific meadows dominated by Halodule uninervis in the northern Great Barrier Reef, Australia. Seagrass cover was measured at ~3-month intervals from 2008 to 2011 at three sites: Magnetic Island (MI), Dunk Island (DI) and Green Island (GI). Photosynthetically active radiation was continuously measured within the seagrass canopy, and three light metrics were derived. Complete seagrass loss occurred at MI and DI, and at these sites changes in seagrass cover were correlated with the three light metrics. Mean daily irradiance (I_d) above 5 and 8.4 mol m⁻² d⁻¹ was associated with gains in seagrass at MI and DI; however, a significant correlation (R = 0.649, p < 0.05) only occurred at MI. The second metric, percent of days below 3 mol m⁻² d⁻¹, correlated the most strongly (MI, R = -0.714, p < 0.01 and DI, R = -0.859, p < 0.001) with change in seagrass cover, with 16-18% of days below 3 mol m⁻² d⁻¹ being associated with more than 50% seagrass loss. The third metric, the number of hours of light-saturated irradiance (H_sat), was calculated using literature-derived data on saturating irradiance (E_k). H_sat correlated well (MI, R = 0.686, p < 0.01; and DI, R = 0.704, p < 0.05) with change in seagrass abundance, and was very consistent between the two sites: 4 H_sat was associated with increases in seagrass abundance at both sites, and less than 4 H_sat with more than 50% loss.
At the third site (GI), small seasonal losses of seagrass recovered quickly during the growth season and the light metrics did not correlate (p > 0.05) with change in percent cover, except for I_d, which was always high but nonetheless correlated with change in seagrass cover. Although distinct light thresholds were observed, the departure from threshold values was also important; for example, light levels well below the thresholds resulted in more severe loss of seagrass than those just below the threshold. Environmental managers aiming to achieve optimal seagrass growth conditions can use these threshold light metrics as guidelines; however, other environmental conditions, including seasonally varying temperature and nutrient availability, will influence seagrass responses above and below these thresholds. © 2012 Published by Elsevier Ltd.
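The three light metrics described in this abstract can be sketched from a within-canopy PAR record as follows. This is an illustrative reconstruction, not the study's own code: the function name, sample data, and the hourly treatment of H_sat are assumptions, with only the 3 mol m⁻² d⁻¹ threshold and the metric definitions taken from the text.

```python
# Sketch of the three light metrics: mean daily irradiance (I_d),
# percent of days below a low-light threshold, and mean daily hours
# at or above saturating irradiance (H_sat). Units follow the abstract:
# daily totals in mol m-2 d-1; hourly values compared against E_k.
from statistics import mean

def light_metrics(daily_par, hourly_par_by_day, e_k, low_threshold=3.0):
    """daily_par: daily PAR totals (mol m-2 d-1);
    hourly_par_by_day: per-day lists of hourly irradiance;
    e_k: literature-derived saturating irradiance (same units as hourly values)."""
    i_d = mean(daily_par)  # metric 1: mean daily irradiance
    # metric 2: percent of days below the low-light threshold
    pct_low = 100 * sum(d < low_threshold for d in daily_par) / len(daily_par)
    # metric 3: mean hours per day at or above saturating irradiance
    h_sat = mean(sum(h >= e_k for h in hours) for hours in hourly_par_by_day)
    return i_d, pct_low, h_sat
```

Under the thresholds reported above, a meadow with pct_low above ~16-18% or h_sat below 4 would be flagged as at risk of major loss.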
Abstract:
Adsorption of CO has been investigated on the surfaces of polycrystalline transition metals as well as alloys by employing electron energy loss spectroscopy (EELS) and ultraviolet photoelectron spectroscopy (UPS). CO adsorbs on polycrystalline transition metal surfaces at a multiplicity of sites, each associated with a characteristic CO stretching frequency; the relative intensities vary with temperature as well as coverage. Whilst at low temperatures (80–120 K) low-coordination sites are stabilized, the higher-coordination sites are stabilized at higher temperatures (270–300 K). Adsorption on surfaces of polycrystalline alloys gives characteristic stretching frequencies due to the constituent metal sites. Alloying, however, causes a shift in the stretching frequencies, indicating the effect of the band structure on the nature of adsorption. The UPS spectra provide confirmatory evidence for the existence of separate metal sites in the alloys as well as for the high-temperature and low-temperature phases of adsorbed CO.
Abstract:
Perfluorinated alkyl acids (PFAAs) have been detected in serum at low concentrations in background populations. Higher concentrations have been observed in adult males compared to females, with a possible explanation that menstruation offers females an additional elimination route. In this study, we examined the significance of blood loss as an elimination route of PFAAs. Pooled serum samples were collected from individuals undergoing a medical procedure involving ongoing blood withdrawal called venesection. Concentrations from male venesection patients were approximately 40% lower than males in the general population for perfluorohexane sulfonate (PFHxS), perfluorooctane sulfonate (PFOS) and perfluorooctanoic acid (PFOA). A simple pharmacokinetic model was used to test the hypothesis that blood loss could explain why adult males have higher concentrations of PFAAs than females, and why males undergoing venesections had lower concentrations compared to males in the general population. The model application generally supported these hypotheses, showing that venesection might reduce blood serum concentrations by 37% (PFOA) and 53% (PFOS) compared to the observed difference of 44% and 37%. Menstruation was modeled to show a 22% reduction in PFOA serum concentrations compared to a 24% difference in concentrations between males and females in the background population. Uncertainties in the modeling and the data are identified and discussed.
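The core of the pharmacokinetic reasoning above can be sketched with a one-compartment model: adding a blood-loss elimination route (menstruation or venesection) lowers the steady-state serum concentration in proportion to that route's share of total clearance. This is a minimal sketch, not the study's model; the function name and all parameter values are illustrative assumptions.

```python
# One-compartment sketch: with constant intake I and first-order
# elimination, steady-state concentration is C_ss = I / (V * k_total).
# Adding a blood-loss route with rate constant k_loss changes C_ss by
# the factor k_elim / (k_elim + k_loss), i.e. a fractional reduction
# of k_loss / (k_elim + k_loss).
import math

def steady_state_reduction(half_life_years, blood_loss_l_per_year,
                           distribution_volume_l):
    """Fractional reduction in steady-state serum concentration when a
    blood-loss elimination route is added to baseline elimination."""
    k_elim = math.log(2) / half_life_years            # baseline rate constant (1/yr)
    k_loss = blood_loss_l_per_year / distribution_volume_l  # extra route (1/yr)
    return k_loss / (k_elim + k_loss)
```

For example, a blood-loss clearance equal to the baseline elimination clearance halves the steady-state concentration, which is the qualitative pattern the abstract reports for venesection patients.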
Abstract:
Extreme vibration has been reported for small, high speed craft in the maritime sector, with performance- and health-threatening effects on boat operators and crew. Musculoskeletal injuries are an enduring problem for high speed craft passengers. Spinal or joint injuries and neurological disorders may occur from repetitive pounding over rough water, continued vibration and single impact events. The risk from whole body vibration (WBV) induced through small vessels depends mainly on time spent on the craft, which cannot be changed in a military scenario, as well as on the number of shocks and jolts and their magnitude and frequency. In the European Union, for example, physical agents directives require all employers to control exposure to a number of physical agents including noise and vibration. The EC Vibration Directive 2002/44/EC then sets out regulations for the control of health and safety risks from the exposure of workers to hand arm vibration (HAV) and WBV in the workplace. Australia has exposure standards relating to WBV, AS 2670.1-2001 – Evaluation of human exposure to whole body vibration. This standard is identical to ISO 2631-1:1997, Mechanical vibration and shock – Evaluation of human exposure to whole-body vibration. Currently, none of the jurisdictions in Australia have specific regulations for vibration exposures in workplaces; however, vibration is mentioned to varying degrees in their general regulations, codes of practice and guidance material. WBV on high speed craft is normally caused by "continuous 'hammering' from short steep seas or wind against tide conditions". Shock on high speed craft is usually caused by random impacts. Military organisations need the knowledge to make informed decisions regarding their marine operations, compliance with legislation and potentially harmful health effects, and to develop and implement appropriate counter-measures.
Marine case studies in the UK such as published MAIB (Marine Accident Investigation Branch) reports show injuries that have occurred in operation, and subsequent MCA (Maritime Coastguard Agency) guidance is provided (MGN 436 (M+F), WHOLE-BODY VIBRATION: Guidance on Mitigating Against the Effects of Shocks and Impacts on Small Vessels. MCA, 2011). This paper proposes a research framework to study the origin, impact and pathways for prevention of WBV in small, high speed craft in a maritime environment.
Abstract:
- Objectives Falls are the most frequent adverse event reported in hospitals. Patient and staff education delivered by trained educators significantly reduced falls and injurious falls in an older rehabilitation population. The purpose of the study was to explore the educators’ perspectives of delivering the education and to conceptualise how the programme worked to prevent falls among older patients who received the education. - Design A qualitative exploratory study. - Methods Data were gathered from three sources: conducting a focus group and an interview (n=10 educators), written educator notes and reflective researcher field notes based on interactions with the educators during the primary study. The educators delivered the programme on eight rehabilitation wards for periods of between 10 and 40 weeks. They provided older patients with individualised education to engage in falls prevention and provided staff with education to support patient actions. Data were thematically analysed and presented using a conceptual framework. - Results Falls prevention education led to mutual understanding between staff and patients which assisted patients to engage in falls prevention behaviours. Mutual understanding was derived from the following observations: the educators perceived that they could facilitate an effective three-way interaction between staff actions, patient actions and the ward environment which led to behaviour change on the wards. This included engaging with staff and patients, and assisting them to reconcile differing perspectives about falls prevention behaviours. - Conclusions Individualised falls prevention education effectively provides patients who receive it with the capability and motivation to develop and undertake behavioural strategies that reduce their falls, if supported by staff and the ward environment.
Abstract:
Reducing crop row spacing and delaying the time of weed emergence may provide crops a competitive edge over weeds. Field experiments were conducted to evaluate the effects of crop row spacing (11, 15, and 23 cm) and weed emergence time (0, 20, 35, 45, 55, and 60 days after wheat emergence; DAWE) on Galium aparine and Lepidium sativum growth and wheat yield losses. Season-long weed-free and crop-free treatments were also established to compare wheat yield and weed growth, respectively. Row spacing and weed emergence time significantly affected the growth of both weed species and wheat grain yields. For both weed species, the maximum plant height, shoot biomass, and seed production were observed in the crop-free plots, and delayed emergence decreased these variables. In weed-crop competition plots, maximum weed growth was observed when weeds emerged simultaneously with the crop in rows spaced 23 cm apart. Less growth of both weed species was observed in narrow row spacing (11 cm) of wheat as compared with wider rows (15 and 23 cm). These weed species produced fewer than 5 seeds plant⁻¹ in 11 cm wheat rows when they emerged at 60 DAWE. The presence of weeds in the crop, especially at early stages, was devastating for wheat yields. Accordingly, maximum grain yield (4.91 t ha⁻¹) was recorded in the weed-free treatment at 11 cm row spacing. Delay in the time of weed emergence and narrow row spacing reduced weed growth and seed production and enhanced wheat grain yield, suggesting that these strategies could contribute to weed management in wheat.
Abstract:
Invasive and noxious weeds are well known as a pervasive problem, imposing significant economic burdens on all areas of agriculture. Whilst there are multiple possible pathways of weed dispersal in this industry, of particular interest to this discussion is the unintended dispersal of weed seeds within fodder. During periods of drought or following natural disasters such as wild fire or flood, there arises the urgent need for 'relief' fodder to ensure survival and recovery of livestock. In emergency situations, relief fodder may be sourced from widely dispersed geographic regions, and some of these regions may be invaded by an extensive variety of weeds that are both exotic and detrimental to the intended destination for the fodder. Pasture hay is a common source of relief fodder and it typically consists of a mixture of grassy and broadleaf species that may include noxious weeds. When required urgently, pasture hay for relief fodder can be cut, baled, and transported over long distances in a short period of time, with little opportunity for prebaling inspection. It appears that, at the present time, there has been little effort towards rapid testing of bales, post-baling, for the presence of noxious weeds, as a measure to prevent dispersal of seeds. Published studies have relied on the analysis of relatively small numbers of bales, tested to destruction, in order to reveal seed species for identification and enumeration. The development of faster, more reliable, and non-destructive sampling methods is essential to increase the fodder industry's capacity to prevent the dispersal of noxious weeds to previously unaffected locales.
Abstract:
- Objectives Decision support tools (DSTs) for invasive species management have had limited success in producing convincing results and meeting users' expectations. The problems could be linked to the functional form of the model that represents the dynamic relationship between the invasive species and crop yield loss in the DSTs. The objectives of this study were: a) to compile and review the models tested in field experiments and applied to DSTs; and b) to empirically evaluate some popular models and alternatives. - Design and methods This study surveyed the literature and documented strengths and weaknesses of the functional forms of yield loss models. Some widely used models (linear, relative yield and hyperbolic models) and two potentially useful models (the double-scaled and density-scaled models) were evaluated for a wide range of weed densities, maximum potential yield loss and maximum yield loss per weed. - Results Popular functional forms include hyperbolic, sigmoid, linear, quadratic and inverse models. Many basic models were modified to account for the effect of important factors (weather, tillage and growth stage of crop at weed emergence) influencing weed–crop interaction and to improve prediction accuracy. This limited their applicability for use in DSTs, as they became less generalized in nature and often were applicable to a much narrower range of conditions than would be encountered in the use of DSTs. These factors' effects could be better accounted for by using other techniques. Among the models empirically assessed, the linear model is a very simple model which appears to work well at sparse weed densities but produces unrealistic behaviour at high densities. The relative-yield model exhibits expected behaviour at high densities and high levels of maximum yield loss per weed but probably underestimates yield loss at low to intermediate densities.
The hyperbolic model demonstrated reasonable behaviour at lower weed densities, but produced biologically unreasonable behaviour at low rates of loss per weed and high yield loss at the maximum weed density. The density-scaled model is insensitive to the yield loss at maximum weed density in terms of the number of weeds that will produce a given proportion of that maximum yield loss. The double-scaled model appeared to produce more robust estimates of the impact of weeds under a wide range of conditions. - Conclusions Previously tested functional forms exhibit problems for use in DSTs for crop yield loss modelling. Of the models evaluated, the double-scaled model exhibits desirable qualitative behaviour under most circumstances.
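Two of the functional forms compared in this abstract can be sketched as follows. The symbols are common in the weed-competition literature rather than taken from this study: d is weed density, i the yield loss per weed at sparse densities, and a the asymptotic maximum percentage yield loss; the function names are assumptions for illustration.

```python
# Sketch of two yield-loss functional forms discussed above.

def linear_loss(d, i):
    """Linear model: loss grows without bound with density d,
    which is the unrealistic high-density behaviour noted above."""
    return i * d

def hyperbolic_loss(d, i, a):
    """Rectangular hyperbola: initial slope i at sparse densities,
    approaching the asymptote a (max percent loss) as d increases."""
    return (i * d) / (1 + i * d / a)
```

The sketch makes the qualitative contrast concrete: both models agree at sparse densities (slope i), but only the hyperbolic form saturates at the biologically bounded maximum loss a.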
Abstract:
- Aim This study aimed (i) to determine the change in the number of government-funded nutrition positions following structural and political reforms and (ii) to describe the remaining workforce available to do nutrition prevention work, including student placements, in Queensland. - Methods Positions funded by the Queensland government were counted using departmental human resource data and compared with data collected 4 years earlier. Positions not funded by the government were identified using formal professional networks and governance group lists. Both groups were sent an online survey that explored their position name, funding source, employer, qualifications, years of experience, work in prevention and ability to supervise students. - Results There was a 90% reduction in the number of nutrition prevention positions funded by the government between 2009 (137 full time equivalents (FTE)) and 2013 (14 FTE). In 2013, 313 specialist (n = 92) and generalist (n = 221) practitioners were identified as potentially working in nutrition prevention throughout Queensland. A total of 30 permanent FTEs indicated over 75% of their work focused on prevention. This included the 14 FTE funded by the Queensland government and an additional 16 FTE from other sectors. Generalists did not consider themselves part of the nutrition workforce. - Conclusions Queensland experienced an extreme reduction in its nutrition prevention workforce as a result of political and structural reforms. This disinvestment by the Queensland government was not compensated for by other sectors, and has left marked deficits in public health nutrition capacity, including student placements.
Abstract:
Reforestation will have important consequences for the global challenges of mitigating climate change, arresting habitat decline and ensuring food security. We examined field-scale trade-offs between the carbon sequestration of tree plantings, their biodiversity potential and the loss of agricultural land. Extensive surveys of reforestation across temperate and tropical Australia (N = 1491 plantings) were used to determine how planting width and species mix affect carbon sequestration during early development (<15 years). Carbon accumulation per area increased significantly with decreasing planting width and with increasing proportion of eucalypts (the predominant over-storey genus). Highest biodiversity potential was achieved through block plantings (width > 40 m) with about 25% of planted individuals being eucalypts. Carbon and biodiversity goals were balanced in mixed-species plantings by establishing narrow belts (width < 20 m) with a high proportion (>75%) of eucalypts, and in monocultures of mallee eucalypt plantings by using the widest belts (ca. 6–20 m). Impacts on agriculture were minimized by planting narrow belts (ca. 4 m) of mallee eucalypt monocultures, which had the highest carbon sequestering efficiency. A plausible scenario in which only 5% of highly-cleared areas (<30% native vegetation cover remaining) of temperate Australia are reforested showed substantial mitigation potential. Total carbon sequestration after 15 years was up to 25 Mt CO₂-e year⁻¹ when carbon and biodiversity goals were balanced, and 13 Mt CO₂-e year⁻¹ if block plantings of highest biodiversity potential were established. Even when reforestation was restricted to marginal agricultural land (<$2000 ha⁻¹ land value, 28% of the land under agriculture in Australia), total mitigation potential after 15 years was 17–26 Mt CO₂-e year⁻¹ using narrow belts of mallee plantings. This work provides guidance on land use to governments and planners.
We show that the multiple benefits of young tree plantings can be balanced by manipulating planting width and species choice at establishment. In highly-cleared areas, such plantings can sequester substantial biomass carbon while improving biodiversity and causing negligible loss of agricultural land.
Abstract:
Background Prevention of foot ulcers in patients with diabetes is extremely important to help reduce the enormous burden of foot ulceration on both patients and health resources. A comprehensive analysis of reported interventions is not currently available, but is needed to better inform caregivers about effective prevention. The aim of this systematic review is to investigate the effectiveness of interventions to prevent first and recurrent foot ulcers in persons with diabetes who are at risk for ulceration. Methods The available medical scientific literature in PubMed, EMBASE, CINAHL and the Cochrane database was searched for original research studies on preventative interventions. Both controlled and non-controlled studies were selected. Data from controlled studies were assessed for methodological quality by two independent reviewers. Results From the identified records, a total of 30 controlled studies (19 of which were RCTs) and another 44 non-controlled studies were assessed and described. Few controlled studies, of generally low to moderate quality, were identified on the prevention of a first foot ulcer. For the prevention of recurrent plantar foot ulcers, multiple RCTs with low risk of bias show the benefit of daily foot skin temperature measurements and consequent preventative actions, as well as of therapeutic footwear demonstrated to relieve plantar pressure, provided it is worn by the patient. To prevent recurrence, some evidence exists for integrated foot care when it includes a combination of professional foot treatment, therapeutic footwear and patient education; for just a single session of patient education, no evidence exists. Surgical interventions can be effective in selected patients, but the evidence base is small.
Conclusion The evidence base to support the use of specific self-management and footwear interventions for the prevention of recurrent plantar foot ulcers is quite strong, but is small for the use of other, sometimes widely applied, interventions and is practically nonexistent for the prevention of a first foot ulcer and non-plantar foot ulcer.
Abstract:
Background Flexor tenotomy is a minimally invasive surgical alternative for the treatment of neuropathic diabetic foot ulcers on the distal end of the toe. The influence of infection on healing and time to heal after flexor tenotomy is unknown. Flexor tenotomy can also be used as a prophylactic treatment, but its effectiveness in that role has not been described before. Methods A retrospective study was performed including all consecutive flexor tenotomies from one hospital between January 2005 and December 2011. Results Of 38 ulcers, 35 healed (92%), with a mean time to heal of 22 ± 26 days. The longest duration of healing was found for infected ulcers that were penetrating to bone (35 days; p = .042). Prophylactic flexor tenotomies (n = 9) did not result in any ulcer or other complications during follow-up. Conclusions The results of this study suggest that flexor tenotomy may be beneficial for neuropathic diabetic foot ulcers on the distal end of the toe, with a high healing percentage and a short mean time to heal. Infected ulcers that penetrated to bone took a significantly longer time to heal. Prospective research should be performed to confirm the results of this retrospective study.