909 results for EVALUATION MODEL
Abstract:
Currently, pathological and illness-centric policy surrounds the evaluation of the health status of a person experiencing disability. In this research, partnerships were built between disability service providers, community development organizations and disability arts organizations to build a translational evaluative methodology prior to the implementation of an arts-based workshop that was embedded in a strengths-based approach to health and well-being. The model consisted of three foci: participation in a pre-designed drama-based workshop program; individualized assessment and evaluation of changing health status; and longitudinal analysis of participants' changing health status in their public lives following the culmination of the workshop series. Participants (n = 15) were recruited through disability service providers and disability arts organizations to complete a 13-week workshop series and public performance. The study developed cumulative qualitative analysis tools and member-checking methods specific to the communication systems used by individual participants. Principal findings included increased confidence for verbal and non-verbal communicators; increased personal drive, ambition and goal-setting; increased arts-based skills, including professional engagements as artists; and demonstrated skills in communicating perceptions of health status to private and public spheres. Tangential positive observations were evident in the changing recreational, vocational and educational activities participants engaged with before and after the workshop series; in participants advocating for autonomous accommodation and health provision; and in changes in the disability service staff's culture. The research is an example of translational health methodologies in disability studies.
Abstract:
This paper details the participation of the Australian e-Health Research Centre (AEHRC) in the ShARe/CLEF 2013 eHealth Evaluation Lab, Task 3. This task aims to evaluate the use of information retrieval (IR) systems to aid consumers (e.g. patients and their relatives) in seeking health advice on the Web. Our submissions to the ShARe/CLEF challenge are based on language models generated from the web corpus provided by the organisers. Our baseline system is a standard Dirichlet-smoothed language model. We enhance the baseline by identifying and correcting spelling mistakes in queries, as well as expanding acronyms, using AEHRC's Medtex medical text analysis platform. We then consider the readability and the authoritativeness of web pages to further enhance the quality of the document ranking. Measures of readability are integrated into the language models used for retrieval via prior probabilities. Prior probabilities are also used to encode authoritativeness information derived from a list of top-100 consumer health websites. Empirical results show that correcting spelling mistakes and expanding acronyms found in queries significantly improves the effectiveness of the language model baseline. Readability priors seem to increase retrieval effectiveness for graded relevance at early ranks (nDCG@5, but not precision), but no improvements are found at later ranks or when considering binary relevance. The authoritativeness prior does not appear to provide retrieval gains over the baseline: this is likely because of the small overlap between websites in the corpus and those in the top-100 consumer-health websites we acquired.
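The retrieval core described above, query-likelihood scoring under Dirichlet smoothing with a log-domain document prior for readability or authoritativeness, can be sketched in a few lines. This is an illustrative toy with an invented two-document corpus, not the AEHRC submission itself:

```python
import math
from collections import Counter

def dirichlet_lm_score(query_terms, doc_terms, collection_freqs,
                       collection_size, mu=2500, log_prior=0.0):
    """Dirichlet-smoothed query likelihood; log_prior can encode a
    document prior such as readability or authoritativeness."""
    doc_counts = Counter(doc_terms)
    doc_len = len(doc_terms)
    score = log_prior
    for t in query_terms:
        p_coll = collection_freqs.get(t, 0) / collection_size
        if p_coll == 0:
            continue  # skip terms unseen in the collection
        p = (doc_counts.get(t, 0) + mu * p_coll) / (doc_len + mu)
        score += math.log(p)
    return score

# toy collection of two "web pages"
docs = {
    "d1": "diabetes diet advice sugar diet".split(),
    "d2": "heart disease exercise advice".split(),
}
coll = Counter(w for d in docs.values() for w in d)
n = sum(coll.values())
query = "diet advice".split()

ranked = sorted(docs, key=lambda d: dirichlet_lm_score(query, docs[d], coll, n),
                reverse=True)
print(ranked[0])  # → d1 (it matches both query terms, "diet" twice)
```

A readability prior would be passed per document via `log_prior`, shifting the ranking without touching the term statistics.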
Abstract:
Responding to the idea of child friendly communities, Play a Part is an innovative program advancing preventative strategies for children and young people to minimise exposure to abuse and neglect. The program was developed following an increase in notifications of suspected child abuse and neglect in 2007. Now completing its second phase, the program is a community engagement strategy that aims to prevent child abuse. Play a Part is described as “a whole of community approach to creating child friendly communities” (NAPCAN, 2012). The Play a Part program was piloted between 2007 and 2010 in five southeast Queensland communities, and is currently operating in parts of the Logan City region and the Redlands region. To assess the merit of the second phase of the program, the Children and Youth Research Centre at Queensland University of Technology was contracted to undertake evaluation research at the beginning of 2013.
Abstract:
The generational approach to conceptualising first year student learning behaviour has made a useful contribution to understanding student engagement. It has an explicit focus on student behaviour, and we suggest that a capability maturity model interpretation may provide a complementary extension of that understanding, as it builds on the generational approach by allowing an assessment of institutional capability to initiate, plan, manage, evaluate and review institutional student engagement practices. The development of a Student Engagement, Success and Retention Maturity Model (SESR-MM) is discussed, along with its application in an Australian higher education institution. In this case study, the model identified first-, second- and third-generation approaches and, in addition, achieved a ‘complementary extension’ of the generational approach, building on it by identifying additional practices not normally considered within the generational concept and indicating the capability of the institution to provide and implement those practices.
Abstract:
An accurate PV module electrical model is presented based on the Shockley diode equation. The simple model has a photo-current source, a single diode junction and a series resistance, and includes temperature dependences. The method of parameter extraction and model evaluation in Matlab is demonstrated for a typical 60 W solar panel. This model is used to investigate the variation of the maximum power point with temperature and insolation levels. A comparison of buck versus boost maximum power point tracker (MPPT) topologies is made, and both are compared with a direct connection to a constant-voltage (battery) load. The boost converter is shown to have a slight advantage over the buck, since it can always track the maximum power point.
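The single-diode model described above can be sketched numerically: solve the implicit Shockley equation I = Iph − I0(exp((V + I·Rs)/(nVt)) − 1) by fixed-point iteration, then sweep the voltage to locate the maximum power point. All parameter values below are hypothetical placeholders sized loosely for a small module, not figures from the paper:

```python
import math

def pv_current(v, i_ph, i_0, r_s, n_vt, iters=50):
    """Terminal current of the single-diode PV model, solved by
    fixed-point iteration (illustrative parameters only)."""
    i = i_ph
    for _ in range(iters):
        i = i_ph - i_0 * math.expm1((v + i * r_s) / n_vt)
    return i

# hypothetical parameters: photocurrent, saturation current,
# series resistance, and n*Vt for 36 series cells at room temperature
i_ph, i_0, r_s, n_vt = 3.8, 2e-7, 0.02, 1.3 * 0.0257 * 36

# sweep terminal voltage in 10 mV steps to find the maximum power point
best_v, best_p = max(
    ((v / 100, (v / 100) * pv_current(v / 100, i_ph, i_0, r_s, n_vt))
     for v in range(0, 2200)),
    key=lambda vp: vp[1])
print(f"MPP ≈ {best_p:.1f} W at {best_v:.2f} V")
```

An MPPT stage, buck or boost, is essentially hill-climbing on this same power curve in hardware; the sweep here just makes the curve's single maximum visible.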
Abstract:
Increasing global competition, rapid technological change, advances in manufacturing and information technology, and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high-quality products at lower cost and in a shorter time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring performance improvements in dynamic lean supply chain situations. Appropriate measurement of lean supply chain performance has therefore become imperative. There are many lean tools available for supply chains; however, the effectiveness of a lean tool depends on the type of product and supply chain. One tool may be highly effective for a supply chain involved in high-volume products but may not be effective for low-volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy. This thesis develops an effective method to measure the performance of a supply chain using both quantitative and qualitative metrics, and investigates the effects of product type and lean tool selection on supply chain performance. Supply chain performance metrics, and the effects of various lean tools on the performance metrics defined in the SCOR framework, have been investigated. A lean supply chain model based on the SCOR metric framework is then developed in which non-lean and lean, as well as quantitative and qualitative, metrics are incorporated.
The values of the metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data were collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy-based method was then applied to measure the performance improvements in its supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative to maximise similarity with the positive ideal solution and minimise similarity with the negative ideal solution, the performance of lean and non-lean supply chain situations for three different apparel products has been evaluated. To address the research questions concerning an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses are investigated. Empirical results show that the implementation of lean tools has significant effects on performance improvement in terms of time, quality and flexibility. The fuzzy TOPSIS-based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates both qualitative and quantitative metrics. It can therefore effectively measure the improvement of a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have a significant effect on lean supply chain performance. Future studies could conduct multiple case studies in different contexts.
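The fuzzy TOPSIS step, ranking alternatives by distance from a fuzzy negative ideal relative to a fuzzy positive ideal, can be sketched as follows. The ratings are invented for illustration, the criteria are treated as benefit criteria, and the vertex distance used here is one common choice for triangular fuzzy numbers, not necessarily the thesis's exact formulation:

```python
import math

def tfn_dist(x, y):
    """Vertex distance between two triangular fuzzy numbers (a, b, c)."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / 3)

def fuzzy_topsis(alternatives):
    """Closeness coefficients for alternatives rated on benefit
    criteria as triangular fuzzy numbers (illustrative sketch)."""
    n_crit = len(next(iter(alternatives.values())))
    # fuzzy positive/negative ideal solutions, per criterion
    fpis = [max((r[j] for r in alternatives.values()), key=lambda t: t[2])
            for j in range(n_crit)]
    fnis = [min((r[j] for r in alternatives.values()), key=lambda t: t[0])
            for j in range(n_crit)]
    cc = {}
    for name, ratings in alternatives.items():
        d_pos = sum(tfn_dist(ratings[j], fpis[j]) for j in range(n_crit))
        d_neg = sum(tfn_dist(ratings[j], fnis[j]) for j in range(n_crit))
        cc[name] = d_neg / (d_pos + d_neg)  # 1 = at the positive ideal
    return cc

# hypothetical ratings on (time, quality, flexibility)
ratings = {
    "non-lean": [(0.2, 0.4, 0.6), (0.3, 0.5, 0.7), (0.2, 0.4, 0.6)],
    "lean":     [(0.5, 0.7, 0.9), (0.6, 0.8, 1.0), (0.5, 0.7, 0.9)],
}
scores = fuzzy_topsis(ratings)
print(max(scores, key=scores.get))  # → lean
```

The closeness coefficient collapses the fuzzy, multi-criteria ratings into the single scalar performance measure the abstract refers to.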
Abstract:
Successful anatomic fitting of a total artificial heart (TAH) is vital to achieve optimal pump hemodynamics after device implantation. Although many anatomic fitting studies have been completed in humans prior to clinical trials, few reports detail the experience in animals used for in vivo device evaluation. Optimal hemodynamics are crucial throughout the in vivo phase to direct design iterations and ultimately validate device performance prior to pivotal human trials. In vivo evaluation in a sheep model provides a realistically sized representation of a smaller patient, whom smaller third-generation TAHs have the potential to treat. Our study aimed to assess the anatomic fit of a single-device rotary TAH in sheep prior to animal trials and to use the data to develop a three-dimensional, computer-aided design (CAD)-operated anatomic fitting tool for future TAH development. Following excision of the native ventricles above the atrio-ventricular groove, a prototype TAH was inserted within the chest cavity of six sheep (28–40 kg). Adjustable rods representing inlet and outlet conduits were oriented toward the center of each atrial chamber and the great vessels, with conduit lengths and angles recorded for later analysis. A three-dimensional, CAD-operated anatomic fitting tool was then developed, based on the results of this study, and used to determine the inflow and outflow conduit orientation of the TAH. The mean diameters of the sheep left atrium, right atrium, aorta, and pulmonary artery were 39, 33, 12, and 11 mm, respectively. The center-to-center distance and outer-edge-to-outer-edge distance between the atria, found to be 39 ± 9 mm and 72 ± 17 mm in this study, were identified as the most critical geometries for successful TAH connection. This geometric constraint restricts the maximum separation allowable between the left and right inlet ports of a TAH to ensure successful alignment within the available atrial circumference.
Abstract:
The synthesis and evaluation of novel resveratrol-based nitroxides have been explored for the potential treatment of hypertension. New methodology for the direct aryl iodination of isoindoline and isoindoline nitroxide using periodic acid and potassium iodide in concentrated sulphuric acid was developed. Diiodinated tetramethyl and tetraethyl isoindolines and a tetramethyl isoindoline nitroxide were prepared in excellent yields (70–82%). A diiodinated tetraethyl isoindoline nitroxide was generated from the corresponding nitroxide in modest yield (37%) alongside iodinated nitrones. The mono-iodinated species were also generated in modest yields (34–48%). Incorporation of the nitroxide unit into the structure of resveratrol was achieved using palladium-catalysed Heck coupling. Use of the previously prepared iodo products 5-iodo-1,1,3,3-tetramethylisoindolin-2-yloyl 18 and 5,6-diiodo-1,1,3,3-tetramethylisoindolin-2-yloyl 22 gave resveratrol nitroxides 12 and 13 in yields of 50% (optimized) and 1.6%, respectively. Preliminary evaluation of the resveratrol analogue 12 as a treatment for hypertension was undertaken in the DOCA-salt rat model. A reduction in systolic blood pressure as well as alleviation of ventricular hypertrophy was observed. A larger study involving the DOCA-salt rats is currently in progress.
Abstract:
Asset service organisations often recognise asset management as a core competence to deliver benefits to their business. But how do organisations know whether their asset management processes are adequate? Asset management maturity models, which combine best practices and competencies, provide a useful approach to test the capacity of organisations to manage their assets. Asset management frameworks are required to meet the dynamic challenges of managing assets in contemporary society. Although existing models are subject to wide variations in their implementation and sophistication, they also display a distinct weakness in that they tend to focus primarily on the operational and technical level and neglect the levels of strategy, policy and governance, as well as the social and human resources – the people elements. Moreover, asset management maturity models have to respond to external environmental factors, such as climate change and sustainability, stakeholders, and community demand management. Drawing on five dimensions of effective asset management – spatial, temporal, organisational, statistical, and evaluation – as identified by Amadi-Echendu et al. [1], this paper carries out a comprehensive comparative analysis of six existing maturity models to identify the gaps in key process areas. Results suggest incorporating these into an integrated approach to assess the maturity of asset-intensive organisations. It is contended that the adoption of an integrated asset management maturity model will enhance effective and efficient delivery of services.
Abstract:
Sector-wide interest in Reframe: QUT’s Evaluation Framework continues, with a number of institutions requesting finer details as QUT embeds the new approach to evaluation across the university in 2013. This interest, both nationally and internationally, has warranted QUT’s collegial response to draw upon its experiences from developing Reframe and to distil and offer Kaleidoscope back to the sector. The word Reframe is a relevant reference for QUT’s specific re-evaluation, reframing and adoption of a new approach to evaluation, whereas Kaleidoscope reflects the unique lens through which any other institution will need to view its own cultural specificity and local context, through an extensive user-led stakeholder engagement approach, when introducing new approaches to learning and teaching evaluation. Kaleidoscope’s objectives are for QUT to develop its research-based stakeholder approach to distil the successful experience of the Reframe Project into a transferable set of guidelines for use by other tertiary institutions across the sector. These guidelines will assist others to design, develop and deploy their own culturally specific widespread organisational change, informed by stakeholder engagement and organisational buy-in. It is intended that these guidelines will promote, support and enable other tertiary institutions to embark on their own evaluation projects and maximise impact. Kaleidoscope offers an institutional case study of widespread organisational change underpinned by Reframe’s (i) evidence-based methodology; (ii) research, including a published environmental scan and literature review (Alderman, et al., 2012), the development of a conceptual model (Alderman, et al., in press 2013), project management principles (Alderman & Melanie, 2012) and national conference peer reviews; and (iii) a year-long strategic project with national outreach to collaboratively engage the development of a draft set of National Guidelines.
Kaleidoscope’s aims are to inform Higher Education evaluation policy development through national stakeholder engagement and the finalisation of the proposed National Guidelines. In conjunction with this conference paper, the authors will present draft Guidelines and a Framework ready for external peer review by evaluation practitioners from the Higher Education sector, as part of Kaleidoscope’s dissemination strategy (Hinton & Gannaway, 2011) applying illuminative evaluation theory (Parlett & Hamilton, 1976), through conference workshops and ongoing discussions (Shapiro, et al., 1983; Jacobs, 2000). The initial National Guidelines will be distilled from the Reframe: QUT’s Evaluation Framework Policy, Protocols and incorporated Business Rules. It is intended that the outcomes of Kaleidoscope are owned by and reflect sectoral engagement, including iterative evaluation through multiple avenues of dissemination and collaboration across the Higher Education sector. The dissemination strategy, with the inclusion of illuminative evaluation methodology, provides an inclusive opportunity for other institutions and stakeholders across the Higher Education sector to give voice through the information-gathering component of evaluating the draft Guidelines, providing a comprehensive understanding of the complex realities experienced across the Higher Education sector and thereby ‘illuminating’ both the shared and unique lenses and contexts. This process will enable any final guidelines developed to have broader applicability, greater acceptance, enhanced sustainability and additional relevance benefiting the Higher Education sector, and to support adoption and adaptation by any single institution for its local context.
Abstract:
Purpose: To develop, using dacarbazine as a model, reliable techniques for measuring DNA damage and repair as pharmacodynamic endpoints for patients receiving chemotherapy. Methods: A group of 39 patients with malignant melanoma were treated with dacarbazine 1 g/m2 i.v. every 21 days. Tamoxifen 20 mg daily was commenced 24 h after the first infusion and continued until 3 weeks after the last cycle of chemotherapy. DNA strand breaks formed during dacarbazine-induced DNA damage and repair were measured in individual cells by the alkaline comet assay. DNA methyl adducts were quantified by measuring urinary 3-methyladenine (3-MeA) excretion using immunoaffinity ELISA. Venous blood was taken on cycles 1 and 2 for separation of peripheral blood lymphocytes (PBLs) for measurement of DNA strand breaks. Results: Wide interpatient variation in PBL DNA strand breaks occurred following chemotherapy, with a peak at 4 h (median 26.6 h, interquartile range 14.75–40.5 h) and incomplete repair by 24 h. Similarly, there was a range of 3-MeA excretion with peak levels 4–10 h after chemotherapy (median 33 nmol/h, interquartile range 20.4–48.65 nmol/h). Peak 3-MeA excretion was positively correlated with DNA strand breaks at 4 h (Spearman's correlation coefficient, r = 0.39, P = 0.036) and 24 h (r = 0.46, P = 0.01). Drug-induced emesis correlated with PBL DNA strand breaks (Mann-Whitney U-test, P = 0.03) but not with peak 3-MeA excretion. Conclusions: DNA damage and repair following cytotoxic chemotherapy can be measured in vivo by the alkaline comet assay and by urinary 3-MeA excretion in patients receiving chemotherapy.
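The correlation statistic reported above is Spearman's rank coefficient, i.e. the Pearson correlation computed on the ranks of the paired measurements. A minimal sketch with invented paired values (not the study's data):

```python
def rank(xs):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1  # extend the tie group
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# hypothetical paired measurements: peak 3-MeA excretion vs strand breaks
adducts = [12, 33, 20, 48, 25]
breaks = [10, 30, 15, 40, 35]
print(round(spearman(adducts, breaks), 2))  # → 0.9
```

Because only ranks enter the calculation, the statistic is robust to the wide, skewed interpatient variation the abstract describes.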
Abstract:
There is a growing trend to offer students learning opportunities that are flexible, innovative and engaging. As educators embrace student-centred agile teaching and learning methodologies, which require continuous reflection and adaptation, the need to evaluate students’ learning in a timely manner has become more pressing. Conventional evaluation surveys currently dominate the evaluation landscape internationally, despite recognition that they are insufficient to effectively evaluate curriculum and teaching quality. Surveys often: (1) fail to address the issues for which educators need feedback, (2) constrain student voice, (3) have low response rates and (4) occur too late to benefit current students. Consequently, this paper explores principles of effective feedback to propose a framework for learner-focused evaluation. We apply a three-stage control model, involving feedforward, concurrent and feedback evaluation, to investigate the intersection of assessment and evaluation in agile learning environments. We conclude that learner-focused evaluation cycles can be used to guide action so that evaluation is not undertaken simply for the benefit of future offerings, but rather to benefit current students by allowing ‘real-time’ learning activities to be adapted in the moment. As a result, students become co-producers of learning and evaluation becomes a meaningful, responsive dialogue between students and their instructors.
Abstract:
Moderation of student assessment is a critical component of teaching and learning in contemporary universities. In Australia, moderation is mandated through university policies and through the new national university accreditation authority, the Tertiary Education Quality and Standards Agency, which began operations in late January 2012 (TEQSA, 2012). The TEQSA requirement to declare details of moderation and any other arrangements used to support consistency and reliability of assessment and grading across each subject in a course of study is a radical step intended to move toward heightened accountability and greater transparency in the tertiary sector, as well as entrenching evidence-based practice in the management of Australian academic programs. In light of this reform, the purpose of this project was to investigate and analyse current moderation practices operating within a faculty of education at a large urban university in Queensland, Australia. This qualitative study involved interviews with the unit coordinators (n=21) and tutors (n=8) of core undergraduate education units and graduate diploma units within the faculty. Four distinct discourses of moderation that academics drew on to discuss their practices were identified in the study: equity, justification, community building, and accountability. These discourses, together with recommendations for changes to moderation practices, are discussed in this paper.
Abstract:
The huge amount of CCTV footage available makes it very burdensome to process these videos manually through human operators. This has made automated processing of video footage through computer vision technologies necessary. During the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task where the system is trained on normal data and is required to detect events which do not fit the learned ‘normal’ model. There is no precise and exact definition of an abnormal activity; it is dependent on the context of the scene. Hence there is a requirement for different feature sets to detect different kinds of abnormal activities. In this work we evaluate the performance of different state-of-the-art features for detecting the presence of abnormal objects in the scene. These include optical flow vectors to detect motion-related anomalies, and textures of optical flow and image textures to detect the presence of abnormal objects. The extracted features, in different combinations, are modelled using state-of-the-art models such as the Gaussian mixture model (GMM) and the semi-2D hidden Markov model (HMM) to analyse their performance. Further, we apply perspective normalization to the extracted features to compensate for perspective distortion due to the distance between the camera and the objects under consideration. The proposed approach is evaluated using the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
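The novelty-detection formulation above can be sketched in miniature: fit a probabilistic model to 'normal' feature vectors, score new frames by log-likelihood, and flag anything below a threshold taken from the training scores. The single diagonal Gaussian and the invented 2-D features here are a minimal stand-in for the GMM/HMM models and optical-flow/texture features the paper actually evaluates:

```python
import math

class GaussianNovelty:
    """Diagonal-covariance Gaussian fit to 'normal' feature vectors;
    frames whose log-likelihood falls below a threshold set from the
    training scores are flagged as abnormal."""

    def fit(self, data, quantile=0.05):
        n, d = len(data), len(data[0])
        self.mean = [sum(row[j] for row in data) / n for j in range(d)]
        self.var = [max(sum((row[j] - self.mean[j]) ** 2 for row in data) / n,
                        1e-6) for j in range(d)]
        scores = sorted(self.log_lik(row) for row in data)
        self.threshold = scores[int(quantile * n)]  # cut-off from training
        return self

    def log_lik(self, x):
        return -0.5 * sum(math.log(2 * math.pi * v) + (xi - m) ** 2 / v
                          for xi, m, v in zip(x, self.mean, self.var))

    def is_abnormal(self, x):
        return self.log_lik(x) < self.threshold

# hypothetical 2-D features, e.g. (mean optical-flow magnitude, texture energy)
normal = [(0.5 + 0.01 * i % 0.2, 1.0 + 0.005 * i % 0.1) for i in range(100)]
model = GaussianNovelty().fit(normal)
print(model.is_abnormal((5.0, 0.0)))  # far from normal motion → True
print(model.is_abnormal(normal[10]))  # a training-like frame → False
```

A GMM generalises this to a weighted sum of such Gaussians, and the semi-2D HMM additionally models spatial/temporal structure; the thresholded-likelihood decision rule is the common core.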