971 results for Evaluation metrics
Abstract:
Introduction and aims: For a scaffold material to be considered effective and efficient for tissue engineering, it must be biocompatible as well as bioinductive. Silk fiber is a natural biocompatible material suitable for scaffold fabrication; however, silk is tissue-conductive and lacks tissue-inductive properties. One proposed method to make the scaffold tissue-inductive is to introduce plasmids or viruses encoding a specific growth factor into the scaffold. In this study, we constructed adenoviruses encoding bone morphogenetic protein-7 (BMP-7) and incorporated these into silk scaffolds. The osteo-inductive and new bone formation properties of these constructs were assessed in vivo in a critical-sized skull defect animal model. Materials and methods: Silk fibroin scaffolds containing adenovirus particles encoding BMP-7 were prepared. The release of the adenovirus particles from the scaffolds was quantified by tissue-culture infective dose (TCID50), and the bioactivity of the released viruses was evaluated on human bone marrow mesenchymal stromal cells (BMSCs). To demonstrate the in vivo bone-forming ability of the virus-carrying silk fibroin scaffold, the scaffold constructs were implanted into calvarial defects in SCID mice. Results: In vitro studies demonstrated that the virus-carrying silk fibroin scaffold released virus particles over a 3-week period while preserving their bioactivity. In vivo testing of the scaffold constructs in critical-sized skull defect areas revealed that silk scaffolds were capable of delivering the adenovirus encoding BMP-7, resulting in significantly enhanced new bone formation. Conclusions: Silk scaffolds carrying BMP-7-encoding adenoviruses can effectively transfect cells and enhance both in vitro and in vivo osteogenesis. The findings of this study indicate that silk fibroin is a promising biomaterial for gene delivery to repair critical-sized bone defects.
Abstract:
Smart matrices are required in bone tissue-engineered grafts that provide an optimal environment for cells and retain osteo-inductive factors for sustained biological activity. We hypothesized that a slow-degrading heparin-incorporated hyaluronan (HA) hydrogel can preserve BMP-2, while an arterio-venous (A–V) loop can support axial vascularization to provide nutrition for a bioartificial bone graft. HA was evaluated for osteoblast growth and BMP-2 release. Porous PLDLLA–TCP–PCL scaffolds were produced by rapid prototyping technology and applied in vivo along with HA hydrogel loaded with either primary osteoblasts or BMP-2. A microsurgically created A–V loop was placed around the scaffold, encased in an isolation chamber, in Lewis rats. The HA hydrogel supported growth of osteoblasts over 8 weeks and allowed sustained release of BMP-2 over 35 days. The A–V loop provided an angiogenic stimulus, with the formation of vascularized tissue in the scaffolds. Bone-specific genes were detected by real-time RT-PCR after 8 weeks. However, no significant amount of bone was observed histologically. The heterotopic isolation chamber, in combination with absent biomechanical stimulation, might explain the insufficient bone formation despite adequate expression of bone-related genes. Optimization of the interplay of osteogenic cells and osteo-inductive factors might eventually generate sufficient amounts of axially vascularized bone grafts for reconstructive surgery.
Abstract:
This research report documents work conducted by the Center for Transportation Research (CTR) at The University of Texas at Austin in analyzing the Joint Analysis using the Combined Knowledge (J.A.C.K.) program. This program was developed by the Texas Department of Transportation (TxDOT) to make projections of revenues and expenditures. The research effort was to span from September 2008 to August 2009, but the bulk of the work was completed and presented by December 2008. J.A.C.K. was subsequently renamed TRENDS, but for consistency with the scope of work, the original name is used throughout this report.
Abstract:
This report provides an evaluation of the Capalaba Youth Space. The evaluation included elements of process and impact evaluation and used a participatory action research approach informed by engagement processes, focus groups, a community survey, interviews and secondary analysis of existing data.
Abstract:
The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome is one of the most commonly reported eye health problems. This syndrome is caused by abnormalities in the properties of the tear film. Current clinical tools to assess tear film properties have shown certain limitations. The traditional invasive methods for assessing tear film quality, which are used by most clinicians, have been criticized for lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence, no “gold standard” test is currently available to assess tear film integrity. Improving techniques for the assessment of tear film quality is therefore of clinical significance and is the main motivation for the work described in this thesis. In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea, and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection of the light is produced in the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well-structured pattern. In contrast, when the tear film surface presents irregularities, the pattern also becomes irregular due to light scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for evaluating all the dynamic phases of the tear film.
However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics. A set of novel routines was purposely developed to quantify changes in the reflected pattern and to extract a time-series estimate of TFSQ from the video recording. The routine extracts from each frame of the video recording a maximized area of analysis, within which a metric of TFSQ is calculated. Initially, two metrics, based on Gabor filter and Gaussian gradient-based techniques, were used to quantify the consistency of the pattern’s local orientation as a measure of TFSQ. These metrics helped to demonstrate the applicability of HSV to assessing the tear film and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear. It was also able to clearly show a difference between bare-eye and contact-lens-wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these non-invasive techniques, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing tear build-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated. Receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome.
The LSI technique gave the best results under both natural and suppressed blinking conditions, followed closely by HSV. The DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique identified during this clinical study was its lack of sensitivity in quantifying the build-up/formation phase of the tear film cycle. For that reason, an extra metric based on image transformation and block processing was proposed. In this metric, the area of analysis was transformed from Cartesian to polar coordinates, converting the concentric-circle pattern into an image of quasi-straight lines from which a block statistics value was extracted. This metric showed better sensitivity under low pattern disturbance and improved the performance of the ROC curves. Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully comprehend the HSV measurement and the instrument’s potential limitations. Of special interest was the assessment of the instrument’s sensitivity to subtle topographic changes. The theoretical simulations helped to provide some understanding of tear film dynamics; for instance, the model derived for the build-up phase offered insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model the time series and to extract the key clinical parameters (i.e., timing). Unfortunately, those techniques for modeling tear film time series do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria.
Special attention was given to a commonly used fit, the polynomial function, and to the considerations for selecting the appropriate model order so that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of high-speed videokeratoscopy for assessing tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV method has shown good performance across a broad range of conditions (i.e., contact lens wearers, normal subjects and dry eye subjects). As a result, this technique could become a useful clinical tool for assessing tear film surface quality.
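The polar-transform block-statistics metric described in this abstract can be sketched in plain Python. This is an illustrative reconstruction, not the thesis code: the nearest-neighbour sampling scheme, the block size, and the use of per-block standard deviation as the TFSQ statistic are assumptions.

```python
import math

def to_polar(image, cx, cy, n_r, n_theta):
    """Resample a square image (list of lists) from Cartesian to polar
    coordinates, so concentric rings become quasi-straight rows."""
    h, w = len(image), len(image[0])
    max_r = min(cx, cy, w - 1 - cx, h - 1 - cy)
    polar = []
    for i in range(n_r):
        r = max_r * (i + 0.5) / n_r
        row = []
        for j in range(n_theta):
            theta = 2 * math.pi * j / n_theta
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            row.append(image[y][x])  # nearest-neighbour sampling
        polar.append(row)
    return polar

def block_std(polar, block):
    """Mean per-block standard deviation: low for a regular ring
    pattern, higher when the reflected pattern is disturbed."""
    stds = []
    for i in range(0, len(polar) - block + 1, block):
        for j in range(0, len(polar[0]) - block + 1, block):
            vals = [polar[i + a][j + b]
                    for a in range(block) for b in range(block)]
            m = sum(vals) / len(vals)
            stds.append(math.sqrt(sum((v - m) ** 2 for v in vals) / len(vals)))
    return sum(stds) / len(stds)
```

A perfectly regular reflected pattern yields near-zero block variability, while disturbances of the rings raise the statistic, which is the behaviour the thesis exploits for low-disturbance sensitivity.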
Abstract:
In vector space based approaches to natural language processing, similarity is commonly measured by taking the angle between two vectors representing words or documents in a semantic space. This is natural from a mathematical point of view, as the angle between unit vectors is, up to constant scaling, the only unitarily invariant metric on the unit sphere. However, similarity judgement tasks reveal that human subjects fail to produce data which satisfy the symmetry and triangle inequality requirements for a metric space. A possible conclusion, reached in particular by Tversky et al., is that some of the most basic assumptions of geometric models are unwarranted in the case of psychological similarity, a result which would impose strong limits on the validity and applicability of vector space based (and hence also quantum inspired) approaches to the modelling of cognitive processes. This paper proposes a resolution to this fundamental criticism of the applicability of vector space models of cognition. We argue that pairs of words imply a context which in turn induces a point of view, allowing a subject to estimate semantic similarity. Context is here introduced as a point of view vector (POVV), and the expected similarity is derived as a measure over the POVVs. Different pairs of words will invoke different contexts and different POVVs. Hence the triangle inequality ceases to be a valid constraint on the angles. We test the proposal on a few triples of words and outline further research.
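The point-of-view idea can be illustrated with a toy computation. The word vectors, the choice of coordinate-wise re-weighting as the POVV operation, and the numbers below are illustrative assumptions, not the paper's actual model.

```python
import math

def angle(u, v):
    """Angular distance between two vectors: the unitarily invariant
    metric on the unit sphere used by standard vector space models."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = lambda w: math.sqrt(sum(c * c for c in w))
    return math.acos(max(-1.0, min(1.0, dot / (norm(u) * norm(v)))))

def angle_in_context(u, v, pov):
    """Re-weight each semantic dimension by a point-of-view vector
    before measuring the angle: different contexts yield different
    similarity judgements for the same pair of words."""
    uw = [a * w for a, w in zip(u, pov)]
    vw = [b * w for b, w in zip(v, pov)]
    return angle(uw, vw)

u, v = (1.0, 1.0, 0.0), (0.0, 1.0, 1.0)  # two toy word vectors
pov = (0.1, 1.0, 0.1)                    # context emphasising dimension 2
plain = angle(u, v)                      # 60 degrees out of context
in_ctx = angle_in_context(u, v, pov)     # much smaller under this context
```

Because each pair of words can invoke its own POVV, angles measured under different contexts need not jointly satisfy the triangle inequality, which is the paper's route around the Tversky objection.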
Abstract:
Suburbanisation has been a major international phenomenon in recent decades. Suburb-to-suburb routes are now the most widespread road journeys, and this has resulted in an increase in distances travelled, particularly on faster suburban highways. The design of highways tends to over-simplify the driving task, which can result in decreased alertness. Driving behaviour is consequently impaired, and drivers are then more likely to be involved in road crashes. This is particularly dangerous on highways, where the speed limit is high. While effective countermeasures to this decrement in alertness do not currently exist, the development of in-vehicle sensors opens avenues for monitoring driving behaviour in real time. The aim of this study is to evaluate the driver's level of alertness in real time through surrogate measures that can be collected from in-vehicle sensors. Slow EEG activity is used as a reference to evaluate driver alertness. Data were collected in a driving simulator instrumented with an eye-tracking system, a heart rate monitor and an electrodermal activity device (N = 25 participants). Four different types of highway (driving scenarios of 40 minutes each) were implemented through variation of the road design (amount of curves and hills) and the roadside environment (amount of buildings and traffic). We show with neural networks that reduced alertness can be detected in real time with an accuracy of 92% using lane positioning, steering wheel movement, head rotation, blink frequency, heart rate variability and skin conductance level. These results show that it is possible to assess driver alertness with surrogate measures. This methodology could be used to warn drivers of their alertness level through an in-vehicle device that monitors driver behaviour on highways in real time, and could therefore result in improved road safety.
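As a rough sketch of this kind of classification from surrogate measures, a linear classifier can be trained on synthetic data. The study used neural networks over six measures; the logistic regression, the two features, and all numeric values below are simplifying assumptions for illustration only.

```python
import math
import random

random.seed(42)

# Synthetic surrogate measures: [blink frequency, heart-rate variability],
# label 1 = reduced alertness. The values and their separation are
# illustrative, not taken from the study's simulator data.
def make_sample(drowsy):
    blink = random.gauss(0.7 if drowsy else 0.3, 0.1)
    hrv = random.gauss(0.3 if drowsy else 0.7, 0.1)
    return [blink, hrv], drowsy

data = [make_sample(i % 2) for i in range(200)]

# Logistic regression trained by stochastic gradient descent, standing
# in for the neural network used in the study.
w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(500):
    for x, y in data:
        p = 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
        g = p - y  # gradient of the log-loss w.r.t. the logit
        w[0] -= lr * g * x[0]
        w[1] -= lr * g * x[1]
        b -= lr * g

correct = sum(((1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))) > 0.5) == y
              for x, y in data)
accuracy = correct / len(data)
```

On well-separated synthetic data the classifier reaches high training accuracy; the study's 92% figure on real simulator data reflects a much harder problem with six measures and EEG-derived labels.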
Abstract:
In 2008, a three-year pilot ‘pay for performance’ (P4P) program, known as the ‘Clinical Practice Improvement Payment’ (CPIP), was introduced into Queensland Health (QHealth). QHealth is a large public health sector provider of acute, community, and public health services in Queensland, Australia. The organisation has recently embarked on a significant reform agenda, including a review of existing funding arrangements (Duckett et al., 2008). Partly in response to this reform agenda, a casemix funding model has been implemented to reconnect health care funding with outcomes. CPIP was conceptualised as a performance-based scheme that rewarded quality with financial incentives. This is the first time such a scheme has been implemented in the public health sector in Australia with a focus on rewarding quality, and it is unique in its large state-wide focus encompassing 15 Districts. CPIP initially targeted five acute and community clinical areas: Mental Health, Discharge Medication, Emergency Department, Chronic Obstructive Pulmonary Disease, and Stroke. The CPIP scheme was designed around key concepts including the identification of clinical indicators that met the set criteria of: high disease burden, a well-defined single diagnostic group or intervention, significant variations in clinical outcomes and/or practices, a good evidence base, and clinician control and support (Ward, Daniels, Walker & Duckett, 2007). This evaluative research targeted Phase One of the implementation of the CPIP scheme, from January 2008 to March 2009. A formative evaluation utilising a mixed methodology and complementarity analysis was undertaken. The research involved three research questions and aimed to determine the knowledge, understanding, and attitudes of clinicians; identify improvements to the design, administration, and monitoring of CPIP; and determine the financial and economic costs of the scheme.
Three key studies were undertaken to address the research questions. Firstly, a survey of clinicians examined their levels of knowledge and understanding and their attitudes to the scheme. Secondly, the study sought to apply Statistical Process Control (SPC) to the process indicators to assess whether this enhanced the scheme, and a third study comprised a simple economic cost analysis. The CPIP survey elicited 192 clinician respondents, over 70% of whom supported the continuation of the CPIP scheme. This finding was also supported by the results of a quantitative attitude survey, which identified positive attitudes in 6 of the 7 domains (including impact, awareness and understanding, and clinical relevance), all scored positive across the combined respondent group. SPC as a trending tool may play an important role in the early identification of indicator weakness in the CPIP scheme. This evaluative research supports a previously identified need in the literature for a phased introduction of pay-for-performance (P4P) programs. It further highlights the value of undertaking a formal risk assessment of clinician, management, and systemic levels of literacy and competency with measurement and monitoring of quality prior to a phased implementation. This phasing can then be guided by a P4P Design Variable Matrix, which provides a selection of program design options such as indicator targets and payment mechanisms. It became evident that a clear process is required to standardise how clinical indicators evolve over time and to direct movement towards more rigorous pay-for-performance targets and the development of an optimal funding model. Use of this matrix will enable the scheme to mature and build the literacy and competency of clinicians and the organisation as implementation progresses.
Furthermore, the research identified that CPIP created a spotlight on clinical indicators, and incentive payments of over five million dollars, from a potential ten million, were secured across the five clinical areas in the first 15 months of the scheme. This indicates that quality was rewarded in the new QHealth funding model and that, despite issues identified with the payment mechanism, funding was distributed. The economic model used identified a relatively low cost of reporting (under $8,000), as opposed to funds secured of over $300,000 in the case of mental health, for example. Movement to a full cost-effectiveness study of CPIP is supported. Overall, the introduction of the CPIP scheme into QHealth has been a positive and effective strategy for engaging clinicians in quality and has been the catalyst for the identification and monitoring of valuable clinical process indicators. This research has highlighted that clinicians are supportive of the scheme in general; however, there are some significant risks, including the functioning of the CPIP payment mechanism. Given clinician support for the use of a pay-for-performance methodology in QHealth, the CPIP scheme has the potential to be a powerful addition to a multi-faceted suite of quality improvement initiatives within QHealth.
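The SPC trending of a CPIP indicator can be sketched as a Shewhart individuals chart: establish control limits from a baseline period, then flag observations that fall outside them. The compliance figures and the 3-sigma rule below are illustrative assumptions, not CPIP data.

```python
import math

def control_limits(samples):
    """Shewhart-style individual control limits (mean +/- 3 sigma),
    a common basis for spotting indicator weakness early."""
    n = len(samples)
    mean = sum(samples) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
    return mean - 3 * sd, mean, mean + 3 * sd

def out_of_control(samples, lcl, ucl):
    """Indices of observations falling outside the control limits."""
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

# Hypothetical monthly compliance rates for one clinical indicator.
baseline = [0.82, 0.85, 0.84, 0.83, 0.86, 0.84, 0.85, 0.83]
lcl, centre, ucl = control_limits(baseline)

# A subsequent month at 0.60 would be flagged for early investigation.
flags = out_of_control(baseline + [0.60], lcl, ucl)
```

In a phased P4P rollout this kind of chart separates ordinary month-to-month variation from a genuine deterioration in an indicator, before payment decisions are affected.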
Abstract:
This paper reviews the current status of the application of optical non-destructive methods, particularly infrared (IR) and near infrared (NIR), in the evaluation of the physiological integrity of articular cartilage. It is concluded that a significant amount of work is still required in order to achieve specificity and clinical applicability of these methods in the assessment and treatment of dysfunctional articular joints.
Abstract:
Background Significant ongoing learning needs for nurses have arisen as a direct result of the continuous introduction of technological innovations and research developments in the healthcare environment. Despite an increased worldwide emphasis on the importance of continuing education, there continues to be an absence of empirical evidence of program and session effectiveness. Few studies determine whether continuing education enhances or develops practice, or the relative cost benefits of health professionals’ participation in professional development. The implications for future clinical practice, and the associated educational approaches needed to meet the needs of an increasingly diverse multigenerational and multicultural workforce, are also not well documented. There is minimal research confirming that continuing education programs contribute to improved patient outcomes or nurses’ earlier detection of patient deterioration, that standards of continuing competence are maintained, that evidence-based practice is demonstrated, or, crucially, that international quality and safety benchmarks are adhered to. An integrated clinical learning model was developed to inform ongoing education for acute care nurses. Educational strategies included the use of integrated learning approaches, interactive teaching concepts and learner-centred pedagogies. A Respiratory Skills Update education (ReSKU) program was used as the content for the educational intervention to inform surgical nurses’ clinical practice in the area of respiratory assessment. The aim of the research was to evaluate the effectiveness of implementing the ReSKU program, using these teaching and learning strategies and in the context of organisational utility, in improving surgical nurses’ practice in the area of respiratory assessment. The education program aimed to facilitate better awareness, knowledge and understanding of respiratory dysfunction in the postoperative clinical environment.
This research was guided by the work of Forneris (2004), who developed a theoretical framework to operationalise a critical thinking process incorporating the complexities of the clinical context. The framework used educational strategies that are learner-centred and participatory. These strategies aimed to engage clinicians in dynamic thinking processes in clinical practice situations, guided by coaches and educators. Methods A quasi-experimental pre-test, post-test non-equivalent control group design was used to evaluate the impact of the ReSKU program on the clinical practice of surgical nurses. The research tested the hypothesis that participation in the ReSKU program improves the reported beliefs and attitudes of surgical nurses and increases their knowledge and reported use of respiratory assessment skills. The study was conducted in a 400-bed regional referral public hospital, the central hub of three smaller hospitals, in a health district servicing the coastal and hinterland areas north of Brisbane. The sample comprised 90 nurses working in the three surgical wards eligible for inclusion in the study. The experimental group consisted of 36 surgical nurses who had chosen to attend the ReSKU program and consented to be part of the study intervention group. The comparison group included the 39 surgical nurses who elected not to attend the ReSKU program but agreed to participate in the study. Findings One of the most notable findings was that nurses choosing not to participate were older, more experienced and less well educated. The data demonstrated a barrier to training which impacted on educational strategies, as this mature-aged cohort was less likely to take up educational opportunities. The study demonstrated statistically significant differences between groups in the reported use of respiratory skills three months after ReSKU program attendance.
Between-group data analysis indicated that the intervention group’s reported beliefs and attitudes showed statistically significant differences in three of the six subscales following attendance at the ReSKU program: influence on nursing care, educational preparation and clinical development. The findings suggest that the use of an integrated educational model underpinned by a robust theoretical framework was a strong factor in shaping perceptions of the ReSKU program relating to attitudes and behaviour. There were minimal differences in knowledge between groups across time. Conclusions This study was consistent with contemporary educational approaches, using multi-modal, interactive teaching strategies and a robust overarching theoretical framework to support the study concepts. The construct of critical thinking in the clinical context, combined with clinical reasoning and purposeful, collective reflection, was a powerful educational strategy to enhance competency and capability in clinicians.
Abstract:
Protecting slow sand filters (SSFs) from high-turbidity waters by pretreatment using pebble matrix filtration (PMF) has previously been studied in the laboratory at University College London, followed by pilot field trials in Papua New Guinea and Serbia. The first full-scale PMF plant was completed at a water-treatment plant in Sri Lanka in 2008, and during its construction, problems were encountered in sourcing the required size of pebbles and sand as filter media. Because sourcing of uniform-sized pebbles may be problematic in many countries, the performance of alternative media has been investigated to support the sustainability of the PMF system. Hand-formed clay balls made at a 100-year-old brick factory in the United Kingdom appear to have satisfied the role of pebbles, and a laboratory filter column was operated using these clay balls together with recycled crushed glass as an alternative to sand media in the PMF. Results showed that in countries where uniform-sized pebbles are difficult to obtain, clay balls are an effective and feasible alternative to natural pebbles. Also, recycled crushed glass performed as well as or better than silica sand as an alternative fine medium in the clarification process, although cleaning by drainage was more effective with sand media. In the tested filtration velocity range of 0.72–1.33 m/h and inlet turbidity range of 78–589 NTU, both sand and glass produced above 95% removal efficiencies. The head loss development during clogging was about 30% higher in sand than in glass media.
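The removal efficiencies quoted above follow directly from inlet and outlet turbidity readings; a one-line helper makes the arithmetic explicit (an illustration, not code from the paper, and the outlet value below is hypothetical).

```python
def removal_efficiency(inlet_ntu, outlet_ntu):
    """Percentage of turbidity removed across the filter column."""
    return (1.0 - outlet_ntu / inlet_ntu) * 100.0

# e.g. an inlet of 589 NTU clarified to a hypothetical 20 NTU
# would exceed the 95% removal figure reported for both media
efficiency = removal_efficiency(589.0, 20.0)
```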
Abstract:
Maximum-likelihood estimates of the parameters of stochastic differential equations are consistent and asymptotically efficient, but unfortunately difficult to obtain if a closed-form expression for the transitional probability density function of the process is not available. As a result, a large number of competing estimation procedures have been proposed. This article provides a critical evaluation of these estimation techniques. Special attention is given to the ease of implementation and comparative performance of the procedures when estimating the parameters of the Cox–Ingersoll–Ross and Ornstein–Uhlenbeck equations, respectively.
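For the Ornstein–Uhlenbeck case the transitional density is Gaussian, so the conditional MLE has a closed form via the exact AR(1) discretization. A minimal sketch (the parameter values and simulation settings are illustrative, and this is not the article's code):

```python
import math
import random

def simulate_ou(theta, mu, sigma, dt, n, x0, rng):
    """Exact simulation of the Ornstein-Uhlenbeck process
    dX = theta*(mu - X) dt + sigma dW, sampled at step dt."""
    a = math.exp(-theta * dt)
    sd = sigma * math.sqrt((1 - a * a) / (2 * theta))
    xs = [x0]
    for _ in range(n):
        xs.append(mu + (xs[-1] - mu) * a + rng.gauss(0, sd))
    return xs

def ou_mle(xs, dt):
    """Closed-form conditional MLE: the exact discretization is an
    AR(1), so least squares on (X_t, X_{t+dt}) recovers the parameters."""
    x, y = xs[:-1], xs[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                 # AR(1) slope = exp(-theta*dt)
    a = my - b * mx               # AR(1) intercept = mu*(1 - b)
    theta = -math.log(b) / dt
    mu = a / (1 - b)
    resid = sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y)) / n
    sigma = math.sqrt(resid * 2 * theta / (1 - b * b))
    return theta, mu, sigma
```

The Cox–Ingersoll–Ross transitional density (a noncentral chi-squared) is far less convenient, which is precisely what motivates the approximate procedures the article compares.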