796 results for formative institutional evaluation
Abstract:
Assurance of learning (AoL) is a predominant feature of both quality assurance and quality enhancement in higher education. The process may be used for program development and to inform external accreditation and evaluation bodies. However, there is an obvious challenge in getting academic staff to buy into the benefits of the AoL process. This project conducted an audit across 25 Australian Business Schools. The majority of those interviewed stated that academic staff considered AoL to be extra work and viewed the process as a box-ticking exercise for external bodies rather than sound educational practice. A change management process is required to promote the cultural change needed to embed AoL into practice. This paper showcases some of the educational leadership strategies that have been successfully implemented across Australia to foster staff engagement in the AoL process. These include: strong senior management commitment and leadership, demonstrating a constant, high-level drive for staff engagement until AoL becomes an institutional norm; developing leadership and champions among unit- and program-level staff to share practices and promote the benefits that come from engaging in the process; providing professional development opportunities to discuss and resolve difficulties and tensions around AoL; demonstrating success and effectiveness by showing staff the evidence that AoL makes a difference; and making the process inclusive, with academics collaborating in the development and implementation of the process.
Abstract:
This study is motivated by, and proceeds from, a central interest in the importance of evaluating IS service quality, and adopts the IS ZOT SERVQUAL instrument (Kettinger & Lee, 2005) as its core theory base. The study conceptualises IS service quality as a multidimensional formative construct and seeks to answer the main research question: “Is the IS service quality construct valid as a first-order formative, second-order formative multidimensional construct?” Additionally, with the aim of validating the IS service quality construct within its nomological net, and as in prior service marketing work, Satisfaction was hypothesised as its immediate consequence. To test this research question, IS service quality and Satisfaction were operationalised in a quantitative survey instrument. Partial least squares (PLS) analysis, employing 219 valid responses, largely evidenced the validity of IS service quality as a multidimensional formative construct. The nomological validity of the IS service quality construct was also evidenced by demonstrating that 55% of the variance in Satisfaction was explained by the multidimensional formative IS service quality construct.
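To illustrate how such an explained-variance figure for Satisfaction can be computed, the sketch below uses synthetic data and an ordinary least squares regression as a simplified stand-in for the PLS structural model; the dimension count, weights and sample values are hypothetical and are not drawn from the IS ZOT SERVQUAL data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 219  # number of responses, matching the study's sample size

# Hypothetical indicator scores for four IS service quality dimensions; the
# real study used the IS ZOT SERVQUAL items, not these synthetic values.
dimensions = rng.normal(size=(n, 4))
satisfaction = dimensions @ np.array([0.4, 0.3, 0.2, 0.1]) + rng.normal(scale=0.8, size=n)

# OLS regression of Satisfaction on the formative dimensions (a simplified
# stand-in for the PLS structural model) and the resulting R-squared.
X = np.column_stack([np.ones(n), dimensions])
beta, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
residuals = satisfaction - X @ beta
r2 = 1 - residuals.var() / satisfaction.var()
print(f"Variance in Satisfaction explained (R^2): {r2:.2f}")
```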
Abstract:
Business process management (BPM) is becoming the dominant management paradigm. Business process modelling is central to BPM, and the resultant business process model is the core artefact guiding subsequent process change. Model quality thus sits at the centre, mediating between the modelling effort and the related, growing investment in ultimate process improvements. Nonetheless, although research interest in the properties that differentiate high-quality process models is longstanding, there have been no past reports of a valid, operationalised, holistic measure of business process model quality. To address this gap, this paper reports the validation of a Business Process Model Quality measurement model, conceptualised as a single-order, formative index. Such a measurement model has value as the dependent variable in rigorously researching the drivers of model quality, as an antecedent of ultimate process improvements, and potentially as an economical comparator and diagnostic for practice.
Abstract:
Experts’ views and commentary have been highly respected in every discipline. However, unlike traditional disciplines such as medicine, mathematics and engineering, Information Systems (IS) expertise is difficult to define. This paper attempts to understand the characteristics of IS experts through a comprehensive literature review of analogous disciplines and then derives a formative research model with three main constructs. Further, this research validates the formative model to identify the characteristics of expertise, using data gathered from 220 respondents who use a contemporary Information System. Finally, this research demonstrates how individuals with different levels of expertise differ in their views in relation to system evaluations.
Abstract:
The epithelium of the corneolimbus contains stem cells for regenerating the corneal epithelium. Diseases and injuries affecting the limbus can lead to a condition known as limbal stem cell deficiency (LSCD), which results in loss of the corneal epithelium, and subsequent chronic inflammation and scarring of the ocular surface. Advances in the treatment of LSCD have been achieved through use of cultured human limbal epithelial (HLE) grafts to restore epithelial stem cells of the ocular surface. These epithelial grafts are usually produced by the ex vivo expansion of HLE cells on human donor amniotic membrane (AM), but this is not without limitations. Although AM is the most widely accepted substratum for HLE transplantation, donor variation, risk of disease transfer, and rising costs have led to the search for alternative biomaterials to improve the surgical outcome of LSCD. Recent studies have demonstrated that Bombyx mori silk fibroin (hereafter referred to as fibroin) membranes support the growth of primary HLE cells, and thus this thesis aims to explore the possibility of using fibroin as a biomaterial for ocular surface reconstruction. Optimistically, the grafted sheets of cultured epithelium would provide a replenishing source of epithelial progenitor cells for maintaining the corneal epithelium; however, the HLE cells lose their progenitor cell characteristics once removed from their niche. More severe ocular surface injuries, which result in stromal scarring, damage the epithelial stem cell niche, which subsequently leads to poor corneal re-epithelialisation post-grafting. An ideal solution to repairing the corneal limbus would therefore be to grow and transplant HLE cells on a biomaterial that also provides a means for replacing underlying stromal cells required to better simulate the normal stem cell niche. The recent discovery of limbal mesenchymal stromal cells (L-MSC) provides a possibility for stromal repair and regeneration, and therefore, this thesis presents the use of fibroin as a possible biomaterial to support a three-dimensional tissue-engineered corneolimbus with both an HLE and underlying L-MSC layer. Investigation into optimal scaffold design is necessary, including adequate separation of epithelial and stromal layers, as well as direct cell-cell contact. Firstly, the attachment, morphology and phenotype of HLE cells grown on fibroin were directly compared to that observed on donor AM, the current clinical standard substrate for HLE transplantation. The production, transparency, and permeability of fibroin membranes were also evaluated in this part of the study. Results revealed that fibroin membranes could be routinely produced using a custom-made film casting table and were found to be transparent and permeable. Attachment of HLE cells to fibroin after 4 hours in serum-free medium was similar to that supported by tissue culture plastic but approximately 6-fold less than that observed on AM. While HLE cultured on AM displayed superior stratification, epithelia constructed from HLE on fibroin maintained evidence of corneal phenotype (cytokeratin pair 3/12 expression; CK3/12) and displayed a comparable number and distribution of ΔNp63+ progenitor cells to that seen in cultures grown on AM. These results confirm the suitability of membranes constructed from silk fibroin as a possible substrate for HLE cultivation.
One of the most important aspects in corneolimbal tissue engineering is to consider the reconstruction of the limbal stem cell niche to help form the natural limbus in situ. MSC with similar properties to bone marrow-derived MSC (BM-MSC) have recently been grown from the limbus of the human cornea. This thesis evaluated methods for culturing L-MSC and limbal keratocytes using various serum-free media. The phenotype of resulting cultures was examined using photography, flow cytometry for CD34 (keratocyte marker), CD45 (bone marrow-derived cell marker), CD73, CD90, CD105 (collectively MSC markers), CD141 (epithelial/vascular endothelial marker), and CD271 (neuronal marker), immunocytochemistry (alpha-smooth muscle actin; α-SMA), differentiation assays (osteogenesis, adipogenesis and chondrogenesis), and co-culture experiments with HLE cells. While all techniques supported, to varying degrees, the establishment of keratocyte and L-MSC cultures, sustained growth and serial propagation were only achieved in serum-supplemented medium or the MesenCult-XF® culture system (Stem Cell Technologies). Cultures established in MesenCult-XF® grew faster than those grown in serum-supplemented medium and retained a more optimal MSC phenotype. L-MSC cultivated in MesenCult-XF® were also positive for CD141, rarely expressed α-SMA, and displayed multi-potency. L-MSC supported growth of HLE cells, with the largest epithelial islands being observed in the presence of L-MSC established in MesenCult-XF® medium. All HLE cultures supported by L-MSC widely expressed the progenitor cell marker ΔNp63, along with the corneal differentiation marker CK3/12. We conclude that MesenCult-XF® is a superior culture system for L-MSC, but further studies are required to explore the significance of CD141 expression in these cells. Following on from the findings of the previous two parts, silk fibroin was tested as a novel dual-layer construct containing both an epithelium and underlying stroma for corneolimbal reconstruction. In this section, the growth and phenotype of HLE cells on non-porous versus porous fibroin membranes were compared. Furthermore, the growth of L-MSC in either serum-supplemented medium or the MesenCult-XF® culture system within fibroin fibrous mats was investigated. Lastly, the co-culture of HLE and L-MSC in serum-supplemented medium on and within fibroin dual-layer constructs was also examined. HLE on porous membranes displayed a flattened and squamous monolayer; in contrast, HLE on non-porous fibroin appeared cuboidal and stratified, closer in appearance to a normal corneal epithelium. Both constructs maintained CK3/12 expression and distribution of ΔNp63+ progenitor cells. Dual-layer fibroin scaffolds consisting of HLE cells and L-MSC maintained a phenotype similar to that on the single layers alone. Overall, the present study proposed to create a three-dimensional limbal tissue substitute of HLE cells and L-MSC together, ultimately for safe and beneficial transplantation back into the human eye. The results show that HLE and L-MSC can be cultivated separately and together whilst maintaining a clinically feasible phenotype containing a majority of progenitor cells. In addition, L-MSC were able to be cultivated routinely in the MesenCult-XF® culture system while maintaining high purity for the characteristic MSC phenotype.
However, as a serum-free culture medium was not found to sustain growth of both HLE and L-MSC, the combination scaffold was created in serum-supplemented medium, indicating that further refinement of this cultured limbal scaffold is required. This thesis has also demonstrated a potential novel marker for L-MSC, and has generated knowledge which may impact on the understanding of stromal-epithelial interactions. These results support the feasibility of a dual-layer tissue engineered corneolimbus constructed from silk fibroin, and warrant further studies into the potential benefits it offers to corneolimbal tissue regeneration. Further refinement of this technology should explore the potential benefits of using epithelial-stromal co-cultures with MesenCult-XF® derived L-MSC. Subsequent investigations into the effects of long-term culture on the phenotype and behaviour of the cells in the dual-layer scaffolds are also required. While this project demonstrated the feasibility in vitro for the production of a dual-layer tissue engineered corneolimbus, further studies are required to test the efficacy of the limbal scaffold in vivo. Future in vivo studies are essential to fully understand the integration and degradation of silk fibroin biomaterials in the cornea over time. Subsequent experiments should also investigate the use of both AM and silk fibroin with epithelial and stromal cell co-cultures in an animal model of LSCD. The outcomes of this project have provided a foundation for research into corneolimbal reconstruction using biomaterials and offer a stepping stone for future studies into corneolimbal tissue engineering.
Abstract:
The Monte Carlo DICOM Tool-Kit (MCDTK) is a software suite designed for treatment plan dose verification, using the BEAMnrc and DOSXYZnrc Monte Carlo codes. MCDTK converts DICOM-format treatment plan information into Monte Carlo input files and compares the results of Monte Carlo treatment simulations with conventional treatment planning dose calculations. In this study, a treatment is planned using a commercial treatment planning system, delivered to a pelvis phantom containing ten thermoluminescent dosimeters and simulated using BEAMnrc and DOSXYZnrc using inputs derived from MCDTK. The dosimetric accuracy of the Monte Carlo data is then evaluated via comparisons with the dose distribution obtained from the treatment planning system as well as the in-phantom point dose measurements. The simulated beam arrangement produced by MCDTK is found to be in geometric agreement with the planned treatment. An isodose display generated from the Monte Carlo data by MCDTK shows general agreement with the isodose display obtained from the treatment planning system, except for small regions around density heterogeneities in the phantom, where the pencil-beam dose calculation performed by the treatment planning system is likely to be less accurate. All point dose measurements agree with the Monte Carlo data obtained using MCDTK, within confidence limits, and all except one of these point dose measurements show closer agreement with the Monte Carlo data than with the doses calculated by the treatment planning system. This study provides a simple demonstration of the geometric and dosimetric accuracy of Monte Carlo simulations based on information from MCDTK.
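As an illustration of the kind of DICOM plan information that MCDTK must translate into Monte Carlo inputs, the sketch below reads an RT Plan with pydicom and prints the beam geometry from each beam's first control point; the file name is hypothetical and this is not MCDTK's own conversion code, only a minimal sketch assuming a standard DICOM RT Plan.

```python
import pydicom  # assumes the pydicom package is installed

# Hypothetical RT Plan exported from the treatment planning system;
# MCDTK's actual conversion to BEAMnrc/DOSXYZnrc inputs is far more involved.
plan = pydicom.dcmread("rtplan_pelvis.dcm")

for beam in plan.BeamSequence:
    cp0 = beam.ControlPointSequence[0]  # first control point holds the setup geometry
    print(
        f"Beam {beam.BeamNumber} ({beam.BeamName}): "
        f"gantry {cp0.GantryAngle} deg, "
        f"collimator {cp0.BeamLimitingDeviceAngle} deg, "
        f"couch {cp0.PatientSupportAngle} deg, "
        f"nominal energy {cp0.NominalBeamEnergy} MV"
    )
```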
Abstract:
The presence of air and bone interfaces makes the dose distribution for head and neck cancer treatments difficult to accurately predict. This study compared planning system dose calculations using the collapsed-cone convolution algorithm with EGSnrc Monte Carlo simulation results obtained using the Monte Carlo DICOM ToolKit software, for one oropharynx, two paranasal sinus and three nodal treatment plans. The difference between median doses obtained from the treatment planning and Monte Carlo calculations was found to be greatest in two bilateral treatments: 4.8% for a retropharyngeal node irradiation and 6.7% for an ethmoid paranasal sinus treatment. These deviations in median dose were smaller for two unilateral treatments: 0.8% for an infraclavicular node irradiation and 2.8% for a cervical node treatment. Examination of isodose distributions indicated that the largest deviations between Monte Carlo simulation and collapsed-cone convolution calculations were seen in the bilateral treatments, where the increase in calculated dose beyond air cavities was most significant.
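The median dose deviations quoted above can be reproduced from two dose grids with a few lines of array arithmetic; the sketch below uses synthetic dose arrays and a hypothetical target mask rather than the actual oropharynx, paranasal sinus or nodal plan data.

```python
import numpy as np

# Hypothetical dose grids (Gy) on the same voxel grid; in practice these
# would come from the planning system export and the Monte Carlo dose file.
tps_dose = np.random.default_rng(1).uniform(60, 70, size=(50, 50, 30))
mc_dose = tps_dose * 0.95  # stand-in for a systematic ~5% difference

target = tps_dose > 65  # hypothetical target-volume mask

deviation = 100 * (np.median(tps_dose[target]) - np.median(mc_dose[target])) / np.median(mc_dose[target])
print(f"Median dose deviation (TPS vs Monte Carlo): {deviation:.1f}%")
```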
Abstract:
Background. Vertebral rotation found in structural scoliosis contributes to truncal asymmetry, which is commonly measured with a simple Scoliometer device on a patient's thorax in the forward flexed position. The new generation of mobile 'smartphones' has an integrated accelerometer, making accurate angle measurement possible and providing a potentially useful clinical tool for assessing rib hump deformity. This study aimed to compare rib hump angle measurements performed using a smartphone and a traditional Scoliometer on a set of plaster torsos representing the range of torsional deformities seen in clinical practice. Methods. Nine observers measured the rib hump found on eight plaster torsos moulded from scoliosis patients with both a Scoliometer and an Apple iPhone on separate occasions. Each observer repeated the measurements at least a week after the original measurements and was blinded to previous results. Intra-observer reliability and inter-observer reliability were analysed using the method of Bland and Altman, and 95% confidence intervals were calculated. Intra-Class Correlation Coefficients (ICC) were calculated for repeated measurements of each of the eight plaster torso moulds by the nine observers. Results. The mean absolute difference between pairs of iPhone/Scoliometer measurements was 2.1 degrees, with a small (1 degree) bias toward higher rib hump angles with the iPhone. 95% confidence intervals for intra-observer variability were +/- 1.8 degrees (Scoliometer) and +/- 3.2 degrees (iPhone). 95% confidence intervals for inter-observer variability were +/- 4.9 degrees (iPhone) and +/- 3.8 degrees (Scoliometer). The measurement errors and confidence intervals found were similar to or better than those reported in previously published thoracic rib hump measurement studies. Conclusions. The iPhone is a rib hump measurement tool clinically equivalent to the Scoliometer in spinal deformity patients. The novel use of plaster torsos as rib hump models avoids the variables of patient fatigue and discomfort, inconsistent positioning and deformity progression that arise when using human subjects in single or multiple measurement sessions.
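As a pointer to how the agreement statistics above are typically derived, the sketch below computes the Bland and Altman bias and 95% limits of agreement for a set of paired angle measurements; the values are invented for illustration and do not reproduce the study's nine-observer, eight-torso data set.

```python
import numpy as np

# Hypothetical paired rib hump angles (degrees) from the two devices.
scoliometer = np.array([4.0, 7.5, 10.0, 12.5, 15.0, 18.0, 21.0, 25.0])
iphone = np.array([5.0, 8.0, 11.5, 13.0, 16.5, 19.0, 22.0, 26.5])

diff = iphone - scoliometer
bias = diff.mean()                 # systematic offset between the devices
half_width = 1.96 * diff.std(ddof=1)  # half-width of the 95% limits of agreement
print(f"Bias: {bias:.1f} deg; 95% limits of agreement: "
      f"{bias - half_width:.1f} to {bias + half_width:.1f} deg")
```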
Abstract:
Queensland University of Technology (QUT) was one of the first universities in Australia to establish an institutional repository. Launched in November 2003, the repository (QUT ePrints) uses the EPrints open source repository software (from Southampton) and has enjoyed the benefit of an institutional deposit mandate since January 2004. Currently (April 2012), the repository holds over 36,000 records, including 17,909 open access publications, with another 2,434 publications embargoed but with mediated access enabled via the ‘Request a copy’ button, which is a feature of the EPrints software. At QUT, the repository (http://eprints.qut.edu.au) is managed by the Library. The repository is embedded into a number of other systems at QUT, including the staff profile system and the University’s research information system. It has also been integrated into a number of critical processes related to Government reporting and research assessment. Internally, senior research administrators often look to the repository for information to assist with decision-making and planning. While some statistics could be drawn from the advanced search feature and the existing download statistics feature, they were rarely at the level of granularity or aggregation required. Getting the information from the ‘back end’ of the repository was very time-consuming for Library staff. In 2011, the Library funded a project to enhance the range of statistics available from the public interface of QUT ePrints. The repository team conducted a series of focus groups and individual interviews to identify and prioritise functionality requirements for a new statistics ‘dashboard’. The participants included a mix of research administrators, early career researchers and senior researchers. The repository team identified a number of business criteria (e.g. extensibility, support available, skills required) and then gave each a weighting. After considering all the known options available, five software packages (IRStats, ePrintsStats, AWStats, BIRT and Google Urchin/Analytics) were thoroughly evaluated against a list of 69 criteria to determine which would be most suitable. The evaluation revealed that IRStats was the best fit for our requirements; it was deemed capable of meeting 21 of the 31 high-priority criteria. Consequently, IRStats was implemented as the basis for QUT ePrints’ new statistics dashboards, which were launched in Open Access Week, October 2011. Statistics dashboards are now available at four levels: whole-of-repository, organisational unit, individual author and individual item. The data available includes cumulative total deposits, time-series deposits, deposits by item type, percentage of full texts, percentage of open access, cumulative downloads, time-series downloads, downloads by item type, author ranking, paper ranking (by downloads), downloader geographic location, domains, internal versus external downloads, citation data (from Scopus and Web of Science), most popular search terms, and non-search referring websites. The data is displayed in chart, map and table formats. The new statistics dashboards have been a great success. Feedback received from staff and students has been very positive. Individual researchers have said that they have found the information to be very useful when compiling a track record. It is now very easy for senior administrators (including the Deputy Vice Chancellor-Research) to compare the full-text deposit rates (i.e. 
mandate compliance rates) across organisational units. This has led to increased ‘encouragement’ from Heads of School and Deans in relation to the provision of full-text versions.
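A minimal sketch of the weighted-criteria comparison described above follows; the criteria, weights and scores are invented for illustration and do not correspond to the Library's actual 69-criterion evaluation or its results.

```python
# Illustrative weighted scoring of candidate statistics packages.
candidates = {
    "IRStats": {"extensible": 4, "support": 4, "skills_required": 3},
    "AWStats": {"extensible": 3, "support": 3, "skills_required": 4},
    "Google Urchin/Analytics": {"extensible": 2, "support": 4, "skills_required": 5},
}
weights = {"extensible": 3, "support": 2, "skills_required": 1}  # hypothetical weights

for name, scores in candidates.items():
    total = sum(weights[criterion] * score for criterion, score in scores.items())
    print(f"{name}: weighted score {total}")
```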
Abstract:
Recent studies suggest that meta-evaluation can be valuable in developing new approaches to evaluation, building evaluation capacities, and enhancing organizational learning. These new extensions of the concept of meta-evaluation are significant, given the growing emphasis on improving the quality and effectiveness of evaluation practices in the South Asian region. Following a review of the literature, this paper presents a case study of the use of concurrent meta-evaluation in the four-year project Assessing Communication for Social Change, which developed and trialled a participatory impact assessment methodology in collaboration with a development communication non-government organization (NGO) in Nepal. Key objectives of the meta-evaluation were to: continuously develop, adapt and improve the impact assessment methodology, the Monitoring and Evaluation (M&E) systems and processes, and other project activities; identify impacts of the project; and build capacities in critical reflection and review. Our analysis indicates that this meta-evaluation was essential to understanding various constraints related to the organizational context that affected the success of the project and the development of improved M&E systems and capacities within the NGO. We identified several limitations of our meta-evaluation methods, which were balanced by the strengths of other methods. Our case study suggests that, as well as assessing the quality, credibility and value of evaluation practices, meta-evaluations need to focus on important contextual issues that can have significant impacts on the outcomes of participatory evaluation projects. These include hierarchical organizational cultures, communication barriers, power/knowledge relations, and the time and resources available. Meta-evaluations also need to consider wider issues such as the sustainability of evaluation systems and approaches.
Abstract:
A software tool (DRONE) has been developed to evaluate road traffic noise across a large area, taking into account dynamic network traffic flow and buildings. For more precise estimation of noise in urban networks, where vehicles are mainly in stop-and-go running conditions, vehicle sound power levels (for accelerating/decelerating, cruising and idling vehicles) are incorporated in DRONE. The computational performance of DRONE is improved by evaluating the noise in two steps: first estimating a unit noise database and then integrating it with the traffic simulation. Details of the process from traffic simulation to contour maps are discussed in the paper, and the implementation of DRONE for Tsukuba city is presented.
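The integration step described above ultimately combines many per-vehicle noise contributions at each receiver. A standard way to do this is energy summation of the individual sound levels, as in the minimal sketch below; the contribution values are hypothetical and DRONE's propagation and building-screening model is not reproduced here.

```python
import numpy as np

def combine_levels(levels_db):
    """Energy-sum individual sound levels (dB) into a total level."""
    return 10 * np.log10(np.sum(10 ** (np.asarray(levels_db) / 10)))

# Hypothetical per-vehicle contributions at a receiver for one time step,
# e.g. looked up from a unit noise database for accelerating, cruising and
# idling vehicles.
contributions_db = [62.0, 58.5, 55.0, 60.2]
print(f"Combined level: {combine_levels(contributions_db):.1f} dB")
```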
Abstract:
There is a need for an accurate real-time quantitative system that would enhance decision-making in the treatment of osteoarthritis. To achieve this objective, significant research is required that will enable articular cartilage properties to be measured and categorized for health and functionality without the need for laboratory tests involving biopsies for pathological evaluation. Such a system would provide access to the internal condition of the cartilage matrix and thus extend the vision-based arthroscopy currently used beyond the subjective evaluation of surgeons. The system required must be able to non-destructively probe the entire thickness of the cartilage and its immediate subchondral bone layer. In this thesis, near infrared spectroscopy is investigated for this purpose. The aim is to relate the structure and load-bearing properties of the cartilage matrix to the near infrared absorption spectrum and to establish functional relationships that provide objective, quantitative and repeatable categorization of cartilage condition outside the area of visible degradation in a joint. Based on results from traditional mechanical testing, their innovative interpretation and their relationship with spectroscopic data, new parameters were developed. These were then evaluated for their consistency in discriminating between healthy, viable cartilage and degraded cartilage. The mechanical and physico-chemical properties were related to specific regions of the near infrared absorption spectrum that were identified as part of the research conducted for this thesis. The relationships between the tissue's near infrared spectral response and the new parameters were modeled using multivariate statistical techniques based on partial least squares regression (PLSR). With high levels of statistical correlation, the modeled relationships were demonstrated to possess considerable potential for predicting the properties of unknown tissue samples in a quick and non-destructive manner. In order to adapt near infrared spectroscopy for clinical applications, a balance between probe diameter and the number of active transmit-receive optic fibres must be optimized. This was achieved in the course of this research, resulting in an optimal probe configuration that could be adapted for joint tissue evaluation. Furthermore, as a proof-of-concept, a protocol for obtaining the new parameters from the near infrared absorption spectra of cartilage was developed and implemented in graphical user interface (GUI)-based software, and used to assess cartilage-on-bone samples in vitro. This conceptual implementation has demonstrated, in part through the individual parametric relationships with the near infrared absorption spectrum, the capacity of the proposed system to facilitate real-time, non-destructive evaluation of cartilage matrix integrity. In summary, the potential of optical near infrared spectroscopy for evaluating the articular cartilage and bone laminate has been demonstrated in this thesis. The approach could have a spin-off for other soft tissues and organs of the body. It builds on the earlier work of the group at QUT, enhancing the near infrared component of the ongoing research on developing a tool for cartilage evaluation that goes beyond visual and subjective methods.
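For readers unfamiliar with PLSR modelling of spectra, the sketch below shows the general pattern using scikit-learn's PLSRegression on synthetic data; the spectra, the "stiffness" target and the number of components are hypothetical and are not taken from the thesis.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical NIR absorption spectra (samples x wavelengths) and a mechanical
# parameter per sample; the thesis related specific spectral regions to
# cartilage properties, which this toy data does not reproduce.
spectra = rng.normal(size=(40, 300))
stiffness = spectra[:, 50:60].mean(axis=1) + rng.normal(scale=0.1, size=40)

pls = PLSRegression(n_components=5)
scores = cross_val_score(pls, spectra, stiffness, cv=5, scoring="r2")
print(f"Cross-validated R^2: {scores.mean():.2f}")
```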
Abstract:
Many substation applications require accurate time-stamping. The performance of systems such as Network Time Protocol (NTP), IRIG-B and one pulse per second (1-PPS) has been sufficient to date. However, new applications, including IEC 61850-9-2 process bus and phasor measurement, require accuracy of one microsecond or better. Furthermore, process bus applications are taking time synchronisation out into high voltage switchyards, where cable lengths may have an impact on timing accuracy. IEEE Std 1588, Precision Time Protocol (PTP), is the means of achieving this higher level of performance preferred by the smart grid standardisation roadmaps (from both the IEC and the US National Institute of Standards and Technology), and it integrates well into Ethernet-based substation automation systems. Significant benefits of PTP include automatic path length compensation, support for redundant time sources and the cabling efficiency of a shared network. This paper benchmarks the performance of established IRIG-B and 1-PPS synchronisation methods over a range of path lengths representative of a transmission substation. The performance of PTP using the same distribution system is then evaluated and compared to the existing methods to determine whether the performance justifies the additional complexity. Experimental results show that a PTP timing system maintains the synchronising performance of 1-PPS and IRIG-B timing systems when using the same fibre optic cables, and further meets the needs of process buses in large substations.
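The automatic path length compensation mentioned above relies on PTP's delay request-response exchange, in which the slave's clock offset and the mean path delay are recovered from four timestamps; the sketch below shows that standard calculation with purely illustrative nanosecond values.

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: Sync sent (master), t2: Sync received (slave),
    t3: Delay_Req sent (slave), t4: Delay_Req received (master)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2          # slave clock offset from master
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2  # one-way propagation delay
    return offset, mean_path_delay

# Illustrative timestamps in nanoseconds.
offset, delay = ptp_offset_and_delay(t1=1_000, t2=1_650, t3=2_000, t4=2_550)
print(f"Offset: {offset} ns, mean path delay: {delay} ns")
```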
Abstract:
This chapter charts the theories and methods adopted in an investigation of the 'micro-politics' of teacher education policy reception at a site of higher education in Queensland from 1980 to 1990. The chapter combines insights and methods from critical ethnography with those from the institutional ethnography of feminist sociologist Dorothy Smith to link local policy activity at the institutional site to broader social structures and processes. In this way, enquiry begins with, and takes into account, the experiences of those groups normally excluded from mainstream and even critical policy analysis.
Abstract:
Divergence from a random baseline is a technique for the evaluation of document clustering. It ensures that cluster quality measures are doing useful work, by preventing ineffective clusterings that provide no useful result from receiving high scores. These concepts are defined and analysed using intrinsic and extrinsic approaches to the evaluation of document cluster quality. These include the classical clusters-to-categories approach and a novel approach that uses ad hoc information retrieval. The divergence from a random baseline approach is able to differentiate ineffective clusterings encountered in the INEX XML Mining track. It also appears to perform a normalisation similar to the Normalised Mutual Information (NMI) measure, but it can be applied to any measure of cluster quality. When applied to the intrinsic measure of distortion, as measured by RMSE, subtraction of a random baseline provides a clear optimum that is not apparent otherwise. This approach can be applied to any clustering evaluation. This paper describes its use in the context of document clustering evaluation.
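A minimal sketch of the divergence-from-a-random-baseline idea, using scikit-learn's NMI measure and a shuffled-assignment baseline that preserves cluster sizes, is shown below; the labels are synthetic and the INEX XML Mining data is not reproduced.

```python
import numpy as np
from sklearn.metrics import normalized_mutual_info_score

rng = np.random.default_rng(0)

# Hypothetical category labels and cluster assignments for 1000 documents.
categories = rng.integers(0, 5, size=1000)
clusters = np.where(rng.random(1000) < 0.7, categories, rng.integers(0, 5, size=1000))

score = normalized_mutual_info_score(categories, clusters)

# Random baseline: same cluster-size distribution, assignments shuffled.
baseline = np.mean([
    normalized_mutual_info_score(categories, rng.permutation(clusters))
    for _ in range(20)
])

print(f"NMI: {score:.3f}, random baseline: {baseline:.3f}, divergence: {score - baseline:.3f}")
```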