969 results for Standard setting
Abstract:
For many fisheries, there is a need to develop appropriate indicators, methodologies, and rules for sustainably harvesting marine resources. Scientific and financial complexities often prevent these from being addressed, but new methodologies offer significant improvements over current and historical approaches. The Australian spanner crab fishery is used to demonstrate this. Between 1999 and 2006, an empirical management procedure using linear regression of fishery catch rates was used to set the annual total allowable catch (quota). A 6-year increasing trend in catch rates revealed shortcomings in the methodology, with a 68% increase in quota calculated for the 2007 fishing year. This large quota increase was prevented by management decision rules. A revised empirical management procedure was subsequently developed, achieving a better balance between responsiveness and stability. Simulations identified precautionary harvest and catch rate baselines for setting quotas that ensured sustainable crab biomass and favourable performance for management and industry. The management procedure was simple to follow, cost-effective, robust to strong trends and changes in catch rates, and adaptable for use in many fisheries. Application of such “tried-and-tested” empirical systems will allow improved management of both data-limited and data-rich fisheries.
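The trend-following behaviour described in this abstract can be illustrated with a minimal sketch. The numbers and the proportional rule below are hypothetical, not the actual Queensland quota formula: fitting a linear regression to a rising catch-rate series and scaling the quota by the projected rate shows how a sustained trend propagates directly into the quota.

```python
import numpy as np

# Hypothetical catch-rate series (kg per net-lift) with a steady upward trend.
years = np.array([2001, 2002, 2003, 2004, 2005, 2006])
catch_rate = np.array([0.82, 0.88, 0.95, 1.01, 1.10, 1.18])

# Fit a linear trend to the catch rates (degree-1 polyfit: slope, intercept).
slope, intercept = np.polyfit(years, catch_rate, 1)

# Project next year's catch rate and scale the quota proportionally.
# A strong sustained trend feeds straight into the quota, which is
# the instability the revised procedure was designed to damp.
current_quota_t = 1600.0                 # illustrative quota in tonnes
projected = slope * 2007 + intercept
quota_next = current_quota_t * projected / catch_rate[-1]
```

Under a six-year increasing trend like the one described above, any rule of this proportional form keeps ratcheting the quota upward, which motivates the precautionary baselines that the simulations identified.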
Abstract:
In political journalism, the battle over agenda-setting between journalists and their sources has been described using many metaphors and concepts. Herbert Gans saw it as a dance in which the two parties competed for leadership, arguing that sources usually got the lead. We address the question of how social media, in particular Twitter, contribute to media agenda-building and agenda-setting by looking at how tweets are sourced in election campaign coverage in Australia, Norway and Sweden. Our findings show that the popularity of elite political sources is a common characteristic across all countries and media. Sourcing from Twitter reinforces the power of the political elites to set the agenda of the news media: they are indeed “still leading the dance”. Twitter content travels to the news media as opinions, comments, announcements, factual statements, and photos. Still, there are variations that must be explained both by the different political and cultural characteristics of the three countries and by the available resources and journalistic profiles of each media outlet.
Abstract:
Objective: In Australian residential aged care facilities (RACFs), the use of certain classes of high-risk medication such as antipsychotics, potent analgesics, and sedatives is high. Here, we examined the prescribed medications and subsequent changes recommended by geriatricians during comprehensive geriatric consultations provided to residents of RACFs via videoconference. Design: This is a prospective observational study. Setting: Four RACFs in Queensland, Australia, were included. Participants: A total of 153 residents referred by general practitioners for comprehensive assessment by geriatricians delivered by video-consultation. Results: Residents’ mean (standard deviation, SD) age was 83.0 (8.1) years and 64.1% were female. They had multiple comorbidities (mean 6), high levels of dependency, and were prescribed a mean (SD) of 9.6 (4.2) regular medications. Ninety-one percent of patients were taking five or more medications daily. Of the total medications prescribed (n=1,469), geriatricians recommended withdrawal of 9.8% (n=145) and dose alteration of 3.5% (n=51). New medications were initiated in 47.7% (n=73) of patients. Of the 10.3% (n=151) of medications considered high risk, 17.2% were stopped and the dose was altered for 2.6%. Conclusion: There was a moderate prevalence of potentially inappropriate high-risk medications. However, geriatricians made relatively few changes, suggesting either that, on balance, prescription of these medications was appropriate or that, because of other factors, there was a reluctance to adjust medications. A structured medication review using an algorithm for withdrawing medications of high disutility might help optimize medications in frail patients. Further research, including a broader survey, is required to understand these dynamics.
Abstract:
The results of the pilot demonstrated that a pharmacist-delivered vaccination service is feasible in community pharmacy and is safe and effective. The accessibility of the pharmacist across the influenza season provided the opportunity for more people to be vaccinated, particularly those who had never received an influenza vaccine before. Patient satisfaction was extremely high, with nearly all patients happy to recommend the service and to return again next year. Factors critical to the success of the service were: 1. Appropriate facilities 2. Competent pharmacists 3. Practice and decision support tools 4. In-store implementation support. We demonstrated in the pilot that vaccination recipients preferred a private consultation area. As the level of privacy afforded to the patients increased (private room vs. booth), so did the number of patients vaccinated. We would therefore recommend that the minimum standard of a private consultation room or closed-in booth, with adequate space for multiple chairs and a work/consultation table, be considered for provision of any vaccination services. The booth or consultation room should be used exclusively for delivering patient services and should not contain other general office equipment, nor be used as storage for stock. The pilot also demonstrated that a pharmacist-specific training program produced competent and confident vaccinators and that this program can be used to retrofit the profession with these skills. As vaccination is within the scope of pharmacist practice as defined by the Pharmacy Board of Australia, there is potential for universities to train their undergraduates in this skill and provide a pharmacist vaccination workforce in the near future. It is therefore essential to explore appropriate changes to the legislation to facilitate pharmacists’ practice in this area.
Given the level of pharmacology and medicines knowledge of pharmacists, combined with their new competency of providing vaccinations through administering injections, it is reasonable to explore additional vaccines that pharmacists could administer in the community setting. At the time of writing, QPIP had already expanded into Phase 2, to explore pharmacists vaccinating for whooping cough and measles. Looking at the international experience of pharmacist-delivered vaccination, we would recommend considering expansion to other vaccinations in the future, including travel vaccinations, HPV and selected vaccinations for those under the age of 18 years. Overall, the results of the QPIP implementation have demonstrated that an appropriately trained pharmacist can safely and effectively deliver influenza vaccinations to adult patients in the community. The QPIP showed the value that the accessibility of pharmacists brings to public health outcomes through improved access to vaccinations and the ability to increase immunisation rates in the general population. Over time, the expansion of pharmacist vaccination services will help to achieve more effective herd immunity for some of the many diseases which currently have suboptimal immunisation rates.
Abstract:
Memory, time and metaphor are central triggers for artists in exploring and shaping their creative work. This paper examines the place of artists as ‘memory-keepers’ and ‘memory-makers’, in particular through engagement with the time-based art of site-specific performance. Naik Naik (Ascent) was a multi-site performance project in the historic setting of Melaka, Malaysia, and is partially recaptured through the presence and voices of its collaborating artists. Distilled from moments recalled, this paper seeks to uncover the poetics of memory that emerged from the project, one steeped in metaphor rather than narrative. It elicits some of the complex and interdependent layers of experience revealed by the artists in Naik Naik: cultural, ancestral, historical, personal, instinctual and embodied memories connected to sound, smell, touch, sensation and light, in a spatiotemporal context for which site is the catalyst. The liminal nature of memory at the heart of Naik Naik provides a shared experience of past, present and future, performatively interwoven.
Abstract:
Background: Malnutrition is common in patients with advanced epithelial ovarian cancer (EOC) and is associated with impaired quality of life (QoL), longer hospital stay and a higher risk of treatment-related adverse events. This phase III multi-centre randomised clinical trial tested early enteral feeding versus standard care on postoperative QoL. Methods: From 2009 to 2013, 109 moderately to severely malnourished patients requiring surgery for suspected advanced EOC were enrolled at five sites across Queensland and randomised to intervention (n = 53) or control (n = 56) groups. The intervention involved intraoperative nasojejunal tube placement and enteral feeding until adequate oral intake could be maintained. Despite being randomised to the intervention, 20 patients did not receive feeds (13 did not receive the feeding tube; 7 had it removed early). Control involved postoperative diet as tolerated. QoL was measured at baseline, 6 weeks postoperatively and 30 days after the third cycle of chemotherapy. The primary outcome measure was the difference in QoL between the intervention and control groups. Secondary endpoints included treatment-related adverse event occurrence, length of stay, use of postoperative services, and nutritional status. Results: Baseline characteristics were comparable between treatment groups. No significant difference in QoL was found between the groups at any time point. There was a trend towards better nutritional status in patients who received the intervention, but the differences did not reach statistical significance except for the intention-to-treat analysis at 7 days postoperatively (11.8 intervention vs. 13.8 control, p = 0.04). Conclusion: Early enteral feeding did not significantly improve patients' QoL compared with standard of care but may improve nutritional status.
Abstract:
This work deals with the formulation and implementation of finite deformation viscoplasticity within the framework of stress-based hybrid finite element methods. Hybrid elements, which are based on a two-field variational formulation, are much less susceptible to locking than conventional displacement-based elements. The conventional return-mapping scheme cannot be used in the context of hybrid stress methods since the stress is known, and the strain and the internal plastic variables have to be recovered using this known stress field. We discuss the formulation and implementation of the consistent tangent tensor, and the return-mapping algorithm, within the context of the hybrid method. We demonstrate the efficacy of the algorithm on a wide range of problems.
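A one-dimensional sketch may help fix ideas. This is an assumption-laden illustration (scalar stress, linear isotropic hardening), not the paper's tensorial algorithm: with the stress known, as in a hybrid stress element, the plastic variables follow directly from enforcing the yield condition, inverting the direction of the usual strain-driven return mapping.

```python
# Hypothetical 1D sketch: with the stress known (as in a hybrid stress
# element), recover the plastic variables rather than the stress.
# Linear isotropic hardening is assumed; material values are made up.
E = 200e3        # Young's modulus [MPa]
H = 10e3         # hardening modulus [MPa]
sigma_y = 250.0  # initial yield stress [MPa]

def recover_plastic_state(sigma, eps_p_n, alpha_n):
    """Given the known stress, return updated plastic strain and
    hardening variable so that the yield condition is satisfied."""
    f = abs(sigma) - (sigma_y + H * alpha_n)   # trial yield function
    if f <= 0.0:                               # elastic: nothing to update
        return eps_p_n, alpha_n
    dgamma = f / H                             # consistency: f = 0 after update
    sign = 1.0 if sigma > 0 else -1.0
    return eps_p_n + dgamma * sign, alpha_n + dgamma

eps_p, alpha = recover_plastic_state(300.0, 0.0, 0.0)
# Total strain consistent with the known stress field:
eps_total = 300.0 / E + eps_p
```

In the strain-driven case the stress would be unknown and found by radial return; here the known stress determines the plastic multiplier in closed form, which is the inversion the abstract describes.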
Abstract:
Zebu (Bos indicus) crossbred beef cows (Droughtmaster) were maintained long-term (16 months) on standard nutrition (SN) or improved nutrition (IN). Cows on IN had better body condition and greater (P<0.05) circulating concentrations of leptin than cows on SN (0.7±0.1 ng/ml and 1.7±0.1 ng/ml for SN and IN, respectively). There were no outstanding differences between SN and IN cows in the basal number of ovarian follicles (≤4 mm, 5–8 mm, and ≥9 mm), and there were also no differences in the number of oocytes recovered by oocyte pick-up. Cows on IN had a greater (P<0.05) number of total follicles after stimulation with FSH than cows on SN. Oocytes from cows on IN had greater (P<0.05) lipid content than those from cows on SN (−0.23±0.16 and 0.20±0.18 arbitrary units for SN and IN, respectively), and oocytes of the former cows also tended to have more active mitochondria, although this was not significant. Cows on IN showed a positive relationship (R2=0.31, P<0.05) between plasma leptin and oocyte lipid content. Lipids are utilized by oocytes during high-energy consumptive processes including fertilization and early cleavage. The greater lipid content of oocytes from IN cows could therefore confer a reproductive advantage. The present study has shown relationships between nutrition, body condition, circulating leptin, and oocyte lipid content, but a clear cause-and-effect relationship requires further investigation in the cow. © 2013 Elsevier B.V.
Abstract:
- Background Nilotinib and dasatinib are now being considered as alternative treatments to imatinib as a first-line treatment of chronic myeloid leukaemia (CML). - Objective This technology assessment reviews the available evidence for the clinical effectiveness and cost-effectiveness of dasatinib, nilotinib and standard-dose imatinib for the first-line treatment of Philadelphia chromosome-positive CML. - Data sources Databases [including MEDLINE (Ovid), EMBASE, Current Controlled Trials, ClinicalTrials.gov, the US Food and Drug Administration website and the European Medicines Agency website] were searched from the search end date of the last technology appraisal report on this topic (October 2002) to September 2011. - Review methods A systematic review of clinical effectiveness and cost-effectiveness studies; a review of surrogate relationships with survival; a review and critique of manufacturer submissions; and a model-based economic analysis. - Results Two clinical trials (dasatinib vs imatinib and nilotinib vs imatinib) were included in the effectiveness review. Survival was not significantly different for dasatinib or nilotinib compared with imatinib with the 24-month follow-up data available. The rates of complete cytogenetic response (CCyR) and major molecular response (MMR) were higher for patients receiving dasatinib than for those receiving imatinib at 12 months' follow-up (CCyR 83% vs 72%, p < 0.001; MMR 46% vs 28%, p < 0.0001). The rates of CCyR and MMR were higher for patients receiving nilotinib than for those receiving imatinib at 12 months' follow-up (CCyR 80% vs 65%, p < 0.001; MMR 44% vs 22%, p < 0.0001). An indirect comparison analysis showed no difference between dasatinib and nilotinib in CCyR or MMR rates at 12 months' follow-up (CCyR, odds ratio 1.09, 95% CI 0.61 to 1.92; MMR, odds ratio 1.28, 95% CI 0.77 to 2.16).
There is observational association evidence from imatinib studies supporting the use of CCyR and MMR at 12 months as surrogates for overall all-cause survival and progression-free survival in patients with CML in chronic phase. In the cost-effectiveness modelling, scenario analyses were provided to reflect the extensive structural uncertainty and the different approaches to estimating overall survival (OS). First-line dasatinib is predicted to provide very poor value for money compared with first-line imatinib, with deterministic incremental cost-effectiveness ratios (ICERs) of between £256,000 and £450,000 per quality-adjusted life-year (QALY). Conversely, first-line nilotinib provided favourable ICERs at the willingness-to-pay threshold of £20,000-30,000 per QALY. - Limitations The immaturity of empirical trial data relative to life expectancy forces either reliance on surrogate relationships or assumptions about cumulative survival/treatment duration. - Conclusions From the two trials available, dasatinib and nilotinib have a statistically significant advantage compared with imatinib as measured by MMR or CCyR. Taking into account the treatment pathways for patients with CML, i.e. assuming the use of second-line nilotinib, first-line nilotinib appears to be more cost-effective than first-line imatinib. Dasatinib was not cost-effective compared with imatinib and nilotinib if decision thresholds of £20,000 or £30,000 per QALY were used. Uncertainty in the cost-effectiveness analysis would be substantially reduced with better and more UK-specific data on the incidence and cost of stem cell transplantation in patients with chronic CML. - Funding The Health Technology Assessment Programme of the National Institute for Health Research.
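The ICER arithmetic behind figures like these is simple: incremental cost divided by incremental QALYs, compared against the willingness-to-pay threshold. The sketch below uses made-up costs and QALYs, not the report's modelled values.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical therapy costing 60,000 more while adding 0.2 QALYs:
ratio = icer(160_000, 10.2, 100_000, 10.0)
# ratio is about 300,000 per QALY -- far above a 20,000-30,000
# threshold, so the therapy would be judged poor value for money.
```

A therapy can therefore show a clear clinical advantage (as dasatinib does on response rates) yet still fail a cost-effectiveness test when the incremental QALY gain is small relative to the extra cost.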
Abstract:
The family of location and scale mixtures of Gaussians has the ability to generate a number of flexible distributional forms. The family nests as particular cases several important asymmetric distributions, such as the Generalized Hyperbolic distribution, which in turn nests many other well-known distributions such as the Normal Inverse Gaussian. In a multivariate setting, an extension of the standard location and scale mixture concept is proposed into a so-called multiple scaled framework, which has the advantage of allowing different tail and skewness behaviours in each dimension with arbitrary correlation between dimensions. Estimation of the parameters is provided via an EM algorithm and extended to cover the case of mixtures of such multiple scaled distributions for application to clustering. Assessments on simulated and real data confirm the gain in degrees of freedom and flexibility in modelling data of varying tail behaviour and directional shape.
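As a minimal illustration of the estimation machinery only (plain univariate Gaussian components with different locations and scales, not the paper's multiple scaled multivariate family), a two-component mixture can be fitted with a hand-rolled EM loop:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data from two Gaussians with different locations and scales.
x = np.concatenate([rng.normal(-2.0, 0.5, 300), rng.normal(3.0, 1.5, 700)])

# EM for a two-component univariate Gaussian mixture.
w = np.array([0.5, 0.5])          # mixing weights
mu = np.array([-1.0, 1.0])        # component means (locations)
var = np.array([1.0, 1.0])        # component variances (scales)
for _ in range(200):
    # E-step: responsibility of each component for each point.
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted parameter updates.
    n_k = r.sum(axis=0)
    w = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k
```

The multiple scaled extension replaces the single scalar scale factor with dimension-specific scaling, but the alternation between responsibilities and weighted updates follows the same pattern.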
Abstract:
Background People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can cause damage to the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections. Objectives To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes including pain and skin damage. Search methods In June 2015 we searched: The Cochrane Wounds Specialised Register; The Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting. Selection criteria All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections in patients in any healthcare setting. Data collection and analysis We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion and performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate, and otherwise synthesised data descriptively when studies were heterogeneous.
Main results We included five RCTs (2277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval between changes (2 to 5 days; control). In each study, participants were followed up until the CVAD was removed or until discharge from ICU or hospital. - Confirmed catheter-related bloodstream infection (CRBSI) One trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence). - Suspected catheter-related bloodstream infection Two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence). - All cause mortality Three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence). - Catheter-site infection Two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection.
It is unclear whether there is a difference in risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence). - Skin damage One small trial (112 children) and three trials (1475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR for a skin damage score of ≥ 2, 0.33, 95% CI 0.16 to 0.68; data for adults not pooled). - Pain Two studies involving 193 participants measured pain. It is unclear whether there is a difference in pain during dressing removal between long- and short-interval dressing changes (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence). Authors' conclusions The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
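The risk ratios quoted above follow standard 2x2 arithmetic. The sketch below uses hypothetical counts, not the review's data, with the Katz log method for the confidence interval; a CI that spans 1 is what makes these comparisons inconclusive.

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio for event counts a/n1 vs b/n2, with an approximate
    95% CI computed on the log scale (Katz log method)."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)   # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 8/500 infections with long intervals
# vs 6/495 with short intervals.
rr, lo, hi = risk_ratio_ci(8, 500, 6, 495)
# With rare events in modest samples, the CI spans 1 (lo < 1 < hi),
# so the comparison is inconclusive, as with the pooled estimates above.
```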
Abstract:
BACKGROUND The current impetus for developing alcohol and/or other drugs (AODs) workplace policies in Australia is to reduce workplace AOD impairment, improve safety, and prevent AOD-related injury in the workplace. For these policies to be effective, they need to be informed by scientific evidence, yet evidence to inform the development and implementation of effective workplace AOD policies is currently lacking. There does not currently appear to be conclusive evidence for the effectiveness of workplace AOD policies in reducing impairment and preventing AOD-related injury. There is also no apparent evidence regarding which factors facilitate or impede the success of an AOD policy, or whether, for example, unsuccessful policy outcomes were due to poor policy or merely poor implementation of the policy. The aim of this research was to undertake a process, impact, and outcome evaluation of a workplace AOD policy, and to contribute to the body of knowledge on the development and implementation of effective workplace AOD policies. METHODS The research setting was a state-based power-generating industry in Australia between May 2008 and May 2010. Participants for the process evaluation study were individuals who were integral to the development and/or implementation of the workplace AOD policy (key informants), and comprised the majority of individuals involved in those processes. The sample represented the two main groups of interest (management and union delegates/employee representatives) from all three of the participating organisations. For the impact and outcome evaluation studies, the population included all employees from the three participating organisations, and participants were all employees who consented to participate in the study and who completed both the pre- and post-policy implementation questionnaires.
Qualitative methods in the form of interviews with key stakeholders were used to evaluate the process of developing and implementing the workplace AOD policy. To evaluate the impact of the policy with regard to the risk factors for workplace AOD impairment, and the outcome of the policy in terms of reducing workplace AOD impairment, quantitative methods in the form of a non-randomised single-group pre- and post-test design were used. Changes from Time 1 (pre) to Time 2 (post) in the risk factors for workplace AOD impairment, and changes in the behaviour of interest, (self-reported) workplace AOD impairment, were measured. An integration of the findings from the process, impact, and outcome evaluation studies was undertaken using a combination of qualitative and quantitative methods. RESULTS For the process evaluation study, respondents indicated that their policy was developed in the context of comparable industries across Australia developing workplace AOD policies, mainly out of concern for the deleterious health and safety impacts of workplace AOD impairment. Results from the process evaluation study also indicated that in developing and implementing the workplace AOD policy there were mainly 'winners' in terms of health and safety in the workplace. While some components of the development and implementation of the policy were better done than others, and the process was expensive and took a long time, there were, overall, few unanticipated consequences of implementing the policy, and it was reported to be thorough and of a high standard.
Findings also indicated that, overall, the policy was developed and implemented according to best practice, in that: consultation during the policy development phase (with all the main stakeholders) was extensive; the policy was comprehensive; the policy was applied universally to all employees; changes in the workplace (with regard to the policy) were gradual; and the policy was publicised appropriately. Furthermore, study participants' responses indicated that the role of an independent external expert, who was trusted by all stakeholders, was integral to the success of the policy. For the impact and outcome evaluation studies: Notwithstanding the limitations of pre- and post-test study designs with regard to attributing cause to the intervention, the findings from the impact evaluation study indicated that, following policy implementation, statistically significant positive changes with regard to workplace AOD impairment were recorded for the following variables (risk factors for workplace AOD impairment): Knowledge; Attitudes; Perceived Behavioural Control; Perceptions of the Certainty of being punished for coming to work impaired by AODs; Perceptions of the Swiftness of punishment for coming to work impaired by AODs; and Direct and Indirect Experience with Punishment Avoidance for workplace AOD impairment. There were, however, no statistically significant positive changes following policy implementation for Behavioural Intentions, Subjective Norms, and Perceptions of the Severity of punishment for workplace AOD impairment. With regard to the outcome evaluation, there was a statistically significant reduction in self-reported workplace AOD impairment following the implementation of the policy. As with the impact evaluation, these findings need to be interpreted in light of the limitations of the study design in being able to attribute cause to the intervention alone.
The findings from the outcome evaluation study also showed that while a positive change in self-reported workplace AOD impairment following implementation of the policy did not appear to be related to gender, age group, or employment type, it did appear to be related to levels of employee general alcohol use, cannabis use, site type, and employment role. Integration of the process, impact, and outcome evaluation studies: There appeared to be qualitative support for the relationship between the process of developing and implementing the policy and the impact of the policy in changing the risk factors for workplace AOD impairment. That is, overall the workplace AOD policy was developed and implemented well and, following its implementation, there were positive changes in the majority of measured risk factors for workplace AOD impairment. Quantitative findings lend further support for a relationship between the process and impact of the policy, in that there was a statistically significant association between employee-perceived fidelity of the policy (related to the process of the policy) and positive changes in some risk factors for workplace AOD impairment (representing the impact of the policy). Findings also indicated support for the relationship between the impact of the policy in changing the risk factors for workplace AOD impairment and the outcome of the policy in reducing workplace AOD impairment: positive changes in the risk factors for workplace AOD impairment (impact) were related to positive changes in self-reported workplace AOD impairment (representing the main goal and outcome of the policy). CONCLUSIONS The findings from the research support the conclusion that the policy was appropriately implemented and that it achieved its objectives and main goal.
The doctoral research findings also addressed a number of gaps in the literature on workplace AOD impairment, namely: the likely effectiveness of AOD policies for reducing AOD impairment in the workplace; which factors in the development and implementation of a workplace AOD policy are likely to facilitate or impede its effectiveness in reducing workplace AOD impairment; and which employee groups are less likely to respond well to policies of this type. The findings from this research not only represent an example of translational, applied research (through the evaluation of the study industry's policy) but also add to the body of knowledge on workplace AOD policies and provide policy-makers with evidence that may be useful in the development and implementation of effective workplace AOD policies. Importantly, the findings underscore the importance of scientific evidence in the development, implementation, and evaluation of workplace AOD policies.
Abstract:
The Queensland (QLD) fishery for spanner crabs primarily lands live crab for export overseas, with gross landings valued at around A$5 million per year. Quota setting rules are used to assess and adjust the total allowable harvest (quota) around an agreed target harvest of 1631 t, capped at a maximum of 2000 t. The quota varies based on catch rate indicators from the commercial fishery and a fishery-independent survey. Quota management applies only to ‘Managed Area A’, which includes waters between Rockhampton and the New South Wales (NSW) border. This report has been prepared to inform Fisheries Queensland (Department of Agriculture and Fisheries) and stakeholders of catch trends and the estimated quota of spanner crabs in Managed Area A for the forthcoming annual quota periods (1 June 2016–31 May 2018). The quota calculations followed the methodology developed by the crab fishery Scientific Advisory Group (SAG) between November 2007 and March 2008. The QLD total reported spanner crab harvest was 1170 t for the 2015 calendar year. In 2015, a total of 55 vessels were active in the QLD fishery, down from 262 vessels at the fishery’s peak activity in 1994. Recent spanner crab harvests from NSW waters have averaged about 125 t per year but fell to 80 t in 2014–15. The spanner crab Managed Area A commercial standardised catch rate averaged 0.818 kg per net-lift in 2015, 22.5% below the target level of 1.043. Compared to 2014, mean catch rates in 2015 were marginally improved south of Fraser Island. The NSW–QLD survey catch rate in 2015 was 20.541 crabs per ground-line, 33% above the target level of 13.972. This represented an increase of about four crabs per ground-line compared to the 2014 survey. The QLD spanner crab total allowable harvest (quota) was set at 1923 t in the 2012–13 and 2013–14 fishing years, 1777 t in 2014–15 and 1631 t in 2015–16.
The results from the current analysis indicate that the quota for the next two fishing years should be retained at the base quota of 1631 t.
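The quota-setting rule summarised above (a total allowable harvest adjusted around the 1631 t base according to commercial and survey catch-rate indicators relative to their target reference points, and capped at 2000 t) can be sketched as follows. This is an illustrative simplification, not the SAG's published formula: the equal weighting of the two indicators and the proportional scaling are assumptions, and the actual procedure also applies management decision rules (such as retaining the base quota) that are not modelled here.

```python
def recommend_quota(commercial_rate, survey_rate,
                    commercial_target=1.043,   # kg per net-lift (target reference point)
                    survey_target=13.972,      # crabs per ground-line (target reference point)
                    base_quota=1631.0,         # agreed target harvest (t)
                    max_quota=2000.0):         # cap on total allowable harvest (t)
    """Illustrative sketch of a quota rule: scale the base quota by the
    mean of the two catch-rate indicators relative to their targets,
    then apply the cap. The averaging and proportional scaling are
    assumptions, not the published SAG methodology."""
    commercial_ratio = commercial_rate / commercial_target
    survey_ratio = survey_rate / survey_target
    adjustment = (commercial_ratio + survey_ratio) / 2.0
    return min(base_quota * adjustment, max_quota)

# Example call using the 2015 indicators reported above
# (0.818 kg/net-lift commercial, 20.541 crabs/ground-line survey):
quota = recommend_quota(0.818, 20.541)
```

When both indicators sit exactly on their targets, the sketch returns the base quota of 1631 t; the cap prevents large upward revisions of the kind the 2007 review identified as a weakness of the earlier procedure.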
Abstract:
The Australian fishery for spanner crabs is the largest in the world, with the larger Queensland (QLD) sector’s landings primarily exported live overseas and a gross value of production (GVP) of around A$5 million per year. Spanner crabs are unusual in that they may live up to 15 years, significantly longer than blue swimmer crabs (Portunus armatus) and mud crabs (Scylla serrata), the two other important crab species caught in Queensland. Spanner crabs are caught using a flat net called a dilly, in which the crabs become entangled by their swimming legs. Quota-setting rules are used to assess and adjust the total allowable harvest (quota) around an agreed target harvest of 1631 t, capped at a maximum of 2000 t. The quota varies based on catch rate indicators from the commercial fishery and a fishery-independent survey over the previous two years, compared to target reference points. Quota management applies only to ‘Managed Area A’, which includes waters between Rockhampton and the New South Wales (NSW) border. This report has been prepared to inform Fisheries Queensland (Department of Agriculture and Fisheries) and stakeholders of catch trends and the estimated quota of spanner crabs in Managed Area A for the forthcoming quota period (1 June 2015–31 May 2016). The quota calculations followed the methodology developed by the crab fishery Scientific Advisory Group (SAG) between November 2007 and March 2008. The total reported spanner crab harvest was 917 t for the 2014 calendar year, almost all of which was taken from Managed Area A. In 2014, a total of 59 vessels were active in the QLD fishery, the lowest number since the fishery’s 1994 peak of 262 vessels. Recent spanner crab harvests from NSW waters have been about 125 t per year. The spanner crab Managed Area A commercial standardised catch rate averaged 0.739 kg per net-lift in 2014, 24% below the target level of 1.043.
Mean catch rates declined in the commercial fishery in 2014, with the largest decreases in the area north of Fraser Island. The NSW–QLD survey catch rate in 2014 was 16.849 crabs per ground-line, 22% above the target level of 13.972. This represented a decrease in survey catch rates of 0.366 crabs per ground-line, compared to the 2013 survey. The Queensland spanner crab total allowable harvest (quota) was set at 1923 t in 2012 and 2013. In 2014, the quota was calculated at the base level of 1631 t. However, given that the 2012 fishery-independent survey was not undertaken for financial reasons, stakeholders proposed that the total allowable commercial catch (TACC) be decreased to 1777 t, a level halfway between the 2012–13 quota of 1923 t and the recommended base quota of 1631 t. The results from the current analysis indicate that the quota for the 2015–16 fishing year should be decreased from 1777 t to the base quota of 1631 t.
Abstract:
Background Malnutrition and unintentional weight loss are major clinical issues in people with dementia living in residential aged care facilities (RACFs) and are associated with serious adverse outcomes. However, evidence regarding effective interventions is limited, and strategies to improve the nutritional status of this population are required. This presentation describes the implementation and results of a pilot randomised controlled trial of a multi-component intervention for improving the nutritional status of RACF residents with dementia. Method Fifteen residents with moderate-severe dementia living in a secure long-term RACF participated in a five-week pilot study. Participants were randomly allocated to either an Intervention (n=8) or Control group (n=7). The intervention comprised four elements delivered in a separate dining room at lunch and dinner: the systematic reinforcement of residents’ eating behaviours using a specific communication protocol; family-style dining; high-ambience table presentation; and routine Dietary-Nutrition Champion supervision. Control group participants ate their meals according to the facility’s standard practice. Baseline and follow-up assessments of nutritional status, food consumption, and body mass index were obtained by qualified nutritionists. Additional assessments included measures of cognitive functioning, mealtime agitation, depression, wandering status, and multiple measures of intervention fidelity. Results No participant was malnourished at study commencement, and participants in both groups gained weight from baseline to follow-up, with no significant difference between groups (t=0.43; p=0.67). A high degree of treatment fidelity was evident throughout the intervention. Qualitative data from staff indicate the intervention was perceived to be beneficial for residents. Conclusions This multi-component nutritional intervention was well received and was feasible in the RACF setting.
Participants’ sound nutritional status at baseline likely accounts for the absence of an intervention effect. Further research using this protocol in malnourished residents is recommended. For success, a collaborative approach between researchers and facility staff, particularly dietary staff, is essential.