929 results for Statistical evaluation
Abstract:
Recent studies suggest that meta-evaluation can be valuable in developing new approaches to evaluation, building evaluation capacities, and enhancing organizational learning. These new extensions of the concept of meta-evaluation are significant, given the growing emphasis on improving the quality and effectiveness of evaluation practices in the South Asian region. Following a review of the literature, this paper presents a case study of the use of concurrent meta-evaluation in the four-year project Assessing Communication for Social Change, which developed and trialled a participatory impact assessment methodology in collaboration with a development communication non-governmental organization (NGO) in Nepal. Key objectives of the meta-evaluation were to: continuously develop, adapt and improve the impact assessment methodology, the Monitoring and Evaluation (M&E) systems and processes, and other project activities; identify impacts of the project; and build capacities in critical reflection and review. Our analysis indicates that this meta-evaluation was essential to understanding various constraints related to the organizational context that affected the success of the project and the development of improved M&E systems and capacities within the NGO. We identified several limitations of our meta-evaluation methods, which were balanced by the strengths of other methods. Our case study suggests that, as well as assessing the quality, credibility and value of evaluation practices, meta-evaluations need to focus on important contextual issues that can have significant impacts on the outcomes of participatory evaluation projects. These include hierarchical organizational cultures, communication barriers, power/knowledge relations, and the time and resources available. Meta-evaluations also need to consider wider issues such as the sustainability of evaluation systems and approaches.
Abstract:
A software tool (DRONE) has been developed to evaluate road traffic noise over a large area, taking into account dynamic network traffic flow and the surrounding buildings. For more precise estimation of noise in urban networks, where vehicles mainly run in stop-and-go conditions, vehicle sound power levels (for accelerating, decelerating, cruising and idling vehicles) are incorporated in DRONE. The computational performance of DRONE is improved by evaluating noise in two steps: first estimating a unit noise database and then integrating it with the traffic simulation. Details of the process from traffic simulation to contour maps are discussed in the paper, and the implementation of DRONE for Tsukuba city is presented.
Abstract:
Effective, statistically robust sampling and surveillance strategies form an integral component of large agricultural industries such as the grains industry. Intensive in-storage sampling is essential for pest detection, Integrated Pest Management (IPM), determining grain quality and satisfying importing nations' biosecurity requirements, while surveillance over broad geographic regions ensures that biosecurity risks can be excluded, monitored, eradicated or contained within an area. In the grains industry, a number of qualitative and quantitative methodologies for surveillance and in-storage sampling have been considered. Research has primarily focussed on developing statistical methodologies for in-storage sampling strategies concentrating on the detection of pest insects within a grain bulk; however, the need for effective and statistically defensible surveillance strategies has also been recognised. Interestingly, although surveillance and in-storage sampling have typically been considered independently, many techniques and concepts are common to the two fields of research. This review considers the development of statistically based in-storage sampling and surveillance strategies and identifies methods that may be useful for both. We discuss the utility of new quantitative and qualitative approaches, such as Bayesian statistics and fault trees, alongside more traditional probabilistic methods, and show how these methods may be used in both surveillance and in-storage sampling systems.
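The probabilistic detection concepts discussed in this abstract can be illustrated with a simple binomial detection model, a standard building block in sampling design (a generic sketch, not the specific methodology of the review; the prevalence and confidence values are illustrative):

```python
import math

def detection_probability(n, prevalence):
    """P(at least one infested unit among n sampled), assuming
    independent random sampling from a large grain bulk."""
    return 1 - (1 - prevalence) ** n

def sample_size(prevalence, confidence=0.95):
    """Smallest n whose detection probability meets the target confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

# Example: detect a 0.5% infestation with 95% confidence.
n = sample_size(0.005)
```

Under these assumptions, 598 sampled units are needed; the same calculation underlies the trade-off between sampling effort and the prevalence a scheme can credibly exclude.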
Abstract:
Many substation applications require accurate time-stamping. The performance of systems such as Network Time Protocol (NTP), IRIG-B and one pulse per second (1-PPS) has been sufficient to date. However, new applications, including IEC 61850-9-2 process bus and phasor measurement, require accuracy of one microsecond or better. Furthermore, process bus applications are taking time synchronisation out into high voltage switchyards, where cable lengths may have an impact on timing accuracy. IEEE Std 1588, Precision Time Protocol (PTP), is the method preferred by the smart grid standardisation roadmaps (from both the IEC and the US National Institute of Standards and Technology) for achieving this higher level of performance, and it integrates well into Ethernet-based substation automation systems. Significant benefits of PTP include automatic path length compensation, support for redundant time sources and the cabling efficiency of a shared network. This paper benchmarks the performance of established IRIG-B and 1-PPS synchronisation methods over a range of path lengths representative of a transmission substation. The performance of PTP using the same distribution system is then evaluated and compared with the existing methods to determine whether the performance justifies the additional complexity. Experimental results show that a PTP timing system maintains the synchronising performance of 1-PPS and IRIG-B timing systems when using the same fibre optic cables, and further meets the needs of process buses in large substations.
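The automatic path length compensation mentioned in this abstract rests on PTP's delay request-response exchange (IEEE 1588). As a rough illustration with made-up timestamps, the standard offset and mean-path-delay calculation looks like:

```python
# t1: master sends Sync; t2: slave receives it;
# t3: slave sends Delay_Req; t4: master receives it.
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Return (slave clock offset, mean path delay) in seconds,
    assuming a symmetric network path."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Illustrative numbers: slave clock 5 us ahead, 2 us one-way path delay.
offset, delay = ptp_offset_and_delay(t1=0.0, t2=7e-6, t3=10e-6, t4=7e-6)
```

Because the path delay cancels out of the offset term, longer fibre runs do not bias the synchronisation, which is why cable length compensation is automatic.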
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing paradigms and the characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; then a data capturing algorithm using Bayesian decision making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are then considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiography, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. The approach yields highly informative estimates of the change point parameters, since the results are based on full probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and demonstrated in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared with a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then extended to healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention that is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by the patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, together with the empirical results, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed model are also considered. The advantages of the Bayesian approach seen in the general quality control context may also extend to the industrial and business domains where quality monitoring was initially developed.
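The change point problem described in this abstract can be sketched in simplified form. The thesis uses Bayesian hierarchical models fitted by MCMC; the sketch below substitutes a profile-likelihood grid with a uniform prior over the change time, and the rates and change time are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated adverse-event counts: Poisson rate steps from 5 to 9 at time 60.
counts = np.concatenate([rng.poisson(5, 60), rng.poisson(9, 40)])
n = len(counts)

def step_loglik(y, tau):
    """Poisson log-likelihood (up to a constant) of a step change at tau,
    with segment rates profiled out at their MLEs."""
    l1, l2 = y[:tau].mean(), y[tau:].mean()
    return (y[:tau].sum() * np.log(l1) - tau * l1
            + y[tau:].sum() * np.log(l2) - (n - tau) * l2)

taus = np.arange(5, n - 5)  # avoid degenerate tiny segments
ll = np.array([step_loglik(counts, int(t)) for t in taus])
post = np.exp(ll - ll.max())
post /= post.sum()          # pseudo-posterior over the change time
tau_hat = int(taus[np.argmax(post)])
```

The posterior-mode estimate `tau_hat` recovers the neighbourhood of the true change time (60 here), and the normalised `post` vector gives the probability-based uncertainty quantification that the abstract highlights as an advantage over a point signal from a control chart.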
Abstract:
Divergence from a random baseline is a technique for the evaluation of document clustering. It ensures that cluster quality measures are performing useful work, by preventing ineffective clusterings that provide no useful result from receiving high scores. These concepts are defined and analysed using intrinsic and extrinsic approaches to the evaluation of document cluster quality, including the classical clusters-to-categories approach and a novel approach that uses ad hoc information retrieval. The divergence from a random baseline approach is able to differentiate ineffective clusterings encountered in the INEX XML Mining track. It also appears to perform a normalisation similar to the Normalised Mutual Information (NMI) measure, but it can be applied to any measure of cluster quality. When it is applied to the intrinsic measure of distortion as measured by RMSE, subtraction from a random baseline provides a clear optimum that is not apparent otherwise. The approach can be applied to any clustering evaluation; this paper describes its use in the context of document clustering evaluation.
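The idea of subtracting a random baseline can be sketched as follows, using purity as the example quality measure on toy data (illustrative only; the paper applies the approach to any cluster quality measure). Randomly permuting the cluster assignments preserves the cluster size distribution while destroying any real structure, so the baseline captures the score an uninformative clustering of the same shape would get:

```python
import numpy as np

rng = np.random.default_rng(0)

def purity(labels, clusters):
    """Fraction of documents in the majority category of their cluster."""
    score = 0
    for c in np.unique(clusters):
        members = labels[clusters == c]
        score += np.bincount(members).max()
    return score / len(labels)

labels = np.array([0] * 50 + [1] * 50)   # ground-truth categories (toy data)
clusters = labels.copy()                 # a perfect clustering of them

# Random baseline: same cluster sizes, randomly permuted assignments.
baseline = np.mean([purity(labels, rng.permutation(clusters))
                    for _ in range(200)])
adjusted = purity(labels, clusters) - baseline
```

The adjusted score rewards only the quality above what random assignment achieves, which is how the technique exposes degenerate clusterings that score well on the raw measure.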
Abstract:
Objective: Radiation safety principles dictate that imaging procedures should minimise the radiation risks involved, without compromising diagnostic performance. This study aims to define a core set of views that maximises clinical information yield for minimum radiation risk. Angiographers would supplement these views as clinically indicated. Methods: An algorithm was developed to combine published data detailing the quality of information derived for the major coronary artery segments through the use of a common set of views in angiography with data relating to the dose–area product and scatter radiation associated with these views. Results: The optimum view set for the left coronary system comprised four views: left anterior oblique (LAO) with cranial (Cr) tilt, shallow right anterior oblique (AP-RAO) with caudal (Ca) tilt, RAO with Ca tilt and AP-RAO with Cr tilt. For the right coronary system three views were identified: LAO with Cr tilt, RAO and AP-RAO with Cr tilt. An alternative left coronary view set including a left lateral achieved minimally superior efficiency (~5%), but with a ~8% higher radiation dose to the patient and a 40% higher cardiologist dose. Conclusion: This algorithm identifies a core set of angiographic views that optimises the information yield and minimises radiation risk. This basic data set would be supplemented by additional clinically determined views selected by the angiographer for each case. The decision to use additional views for diagnostic angiography and interventions would be assisted by referencing a table of relative radiation doses for the views being considered.
Abstract:
This paper reports on the development and implementation of a self-report risk assessment tool developed in an attempt to increase the efficacy of crash prediction within Australian fleet settings. The study forms part of a broader program of research into work-related road safety and the identification of driving risk. The first phase involved a series of focus groups conducted with 217 professional drivers, which revealed that the following factors were proposed to influence driving performance: fatigue, knowledge of risk, mood, impatience and frustration, speed limits, experience, other road users, passengers, health, and culture. The second phase involved piloting the newly developed 38-item Driving Risk Assessment Scale - Work Version (DRAS-WV) with 546 professional drivers. Factor analytic techniques identified a nine-factor solution comprising speeding, aggression, time pressure, distraction, casualness, awareness, maintenance, fatigue and minor damage. Speeding and aggressive driving manoeuvres were identified as the most frequent aberrant driving behaviours engaged in by the sample. However, a series of logistic regression analyses undertaken to determine the DRAS-WV scale's ability to predict self-reported crashes revealed limited predictive efficacy (e.g., 10% of crashes). This paper outlines proposed reasons for the limited predictive ability of the DRAS-WV and provides suggestions for future research aimed at developing methods to identify "at risk" drivers.
Abstract:
Crop simulation models have the potential to assess the risk associated with the selection of a specific N fertilizer rate by integrating the effects of soil-crop interactions on crop growth under different pedo-climatic and management conditions. The objective of this study was to simulate the environmental and economic impact (nitrate leaching and N2O emissions) of a spatially variable N fertilizer application in an irrigated maize field in Italy. The validated SALUS model was run with five nitrogen rate scenarios (50, 100, 150, 200, and 250 kg N ha−1), the latter being the N fertilization rate adopted by the farmer. The long-term (25-year) simulations were performed on two previously identified spatially and temporally stable zones, a high-yielding and a low-yielding zone. The simulation results showed that the N fertilizer rate can be reduced without affecting yield and net return. The marginal net return was on average higher for the high-yield zone, with values ranging from 1550 to 2650 € ha−1 for the 200 N rate and 1485 to 2875 € ha−1 for the 250 N rate. N leaching varied between 16.4 and 19.3 kg N ha−1 for the 200 N and 250 N rates in the high-yield zone; in the low-yield zone, the 250 N rate had significantly higher N leaching. N2O emissions varied from 0.28 kg N2O ha−1 for the 50 kg N ha−1 rate to a maximum of 1.41 kg N2O ha−1 for the 250 kg N ha−1 rate.
Abstract:
A sound knowledge of pathological disease processes is required for professional practice within health professions. The project described in this paper reviewed the resources currently available for the delivery of systematic pathology tutorials. Additional complementary resources were developed and the inclusion of these additional learning resources in practical tutorial sessions was evaluated for their impact on student learning. Student evaluation of the learning resources was undertaken across one semester with two different cohorts of health profession students using questionnaires and focus group discussion. Both cohorts reported an enhancement to their understanding of pathological disease processes through the use of the additional resources. Results indicate student perception of the value of the resources correlates with staff perception and is independent of prior experiences.
Abstract:
This paper presents an evaluation of an instrument to measure teachers’ attitudes towards reporting child sexual abuse and discusses the instrument’s merit for research into reporting practice. Based on responses from 444 Australian teachers, the Teachers’ Reporting Attitude Scale for Child Sexual Abuse (TRAS-CSA) was evaluated using exploratory factor analysis. The scale isolated three dimensions: commitment to the reporting role; confidence in the system’s response to reports; and concerns about reporting. These three factors accounted for 37.5% of the variance in the 14-item measure. Alpha coefficients for the subscales were 0.769 (commitment), 0.617 (confidence), and 0.661 (concerns). The findings provide insights into the complexity of studying teachers’ attitudes towards reporting of child sexual abuse, and have implications for future research.
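The alpha coefficients reported for the subscales are Cronbach's alpha, an internal-consistency statistic. A minimal sketch of the computation on toy data (not the TRAS-CSA items):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy example: three perfectly correlated items give alpha = 1.
x = np.arange(10.0)
data = np.column_stack([x, x, x])
alpha = cronbach_alpha(data)
```

Values such as the reported 0.617 and 0.661 sit below the conventional 0.7 benchmark, which is part of why the authors discuss the scale's merit cautiously.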
Abstract:
Background: Mesenchymal stromal cells (MSC) with similar properties to bone marrow-derived mesenchymal stromal cells (BM-MSC) have recently been grown from the limbus of the human cornea. We contribute to this novel area of research by evaluating methods for culturing human limbal MSC (L-MSC). Methods: Four basic strategies were compared: serum-supplemented medium (10% foetal bovine serum; FBS); standard serum-free medium supplemented with B-27, epidermal growth factor, and fibroblast growth factor 2; and two commercial serum-free media, Defined Keratinocyte Serum Free Medium (Invitrogen) and MesenCult-XF (Stem Cell Technologies). The phenotype of the resulting cultures was examined using photography, flow cytometry (for CD34, CD45, CD73, CD90, CD105, CD141 and CD271), immunocytochemistry (α-sma), differentiation assays (osteogenesis, adipogenesis, chondrogenesis), and co-culture experiments with human limbal epithelial (HLE) cells. Results: While all techniques supported establishment of cultures to varying degrees, sustained growth and serial propagation were only achieved in 10% FBS medium or MesenCult-XF medium. Cultures established in 10% FBS medium were 70-80% CD34-/CD45-/CD90+/CD73+/CD105+, approximately 25% α-sma+, and displayed multi-potency. Cultures established in MesenCult-XF were >95% CD34-/CD45-/CD90+/CD73+/CD105+, 40% CD141+, rarely expressed α-sma, and displayed multi-potency. L-MSC supported growth of HLE cells, with the largest epithelial islands observed in the presence of MesenCult-XF-grown L-MSC. All HLE cultures supported by L-MSC widely expressed the progenitor cell marker ∆Np63, along with the corneal differentiation marker cytokeratin 3. Conclusions: We conclude that MesenCult-XF is a superior culture system for L-MSC, but further studies are required to explore the significance of CD141 expression in these cells.
Abstract:
Traffic safety studies demand more than existing micro-simulation models can offer, as these models postulate that every driver exhibits safe behaviour. All microscopic traffic simulation models incorporate a car-following model, and the Gazis–Herman–Rothery (GHR) car-following model is among the most widely used. This paper highlights the limitations of the GHR car-following model's capability to model longitudinal driving behaviour for safety study purposes. The study reviews and compares different versions of the GHR model. To enable the GHR model to reproduce safety metrics precisely, a new set of car-following model parameters is proposed for simulating unsafe vehicle conflicts. NGSIM vehicle trajectory data are used to evaluate the new model, and short following headways and time to collision are employed to assess critical safety events within the traffic flow. Risky events are extracted from the available NGSIM data to evaluate the modified model against the generic versions of the GHR model. The simulation results show that the proposed model predicts the safety metrics better than the generic GHR model, and that it can potentially facilitate assessing and predicting the safety of traffic facilities using microscopic simulation. The new model can also predict near-miss rear-end crashes.
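The GHR model referred to in this abstract relates follower acceleration to the follower's speed, the relative speed, and the spacing to the leader. A minimal sketch of the stimulus-response law with illustrative, uncalibrated parameters (not the paper's fitted values):

```python
def ghr_acceleration(v_follower, dv, dx, c=1.1, m=0.9, l=1.0):
    """GHR stimulus-response law: a = c * v^m * dv / dx^l,
    where dv = leader speed - follower speed (m/s) and dx = spacing (m).
    c, m, l are sensitivity parameters; the values here are illustrative."""
    return c * (v_follower ** m) * dv / (dx ** l)

# Closing in on a slower leader (dv < 0) produces deceleration.
a = ghr_acceleration(v_follower=15.0, dv=-3.0, dx=20.0)
```

Because the response scales with the speed difference divided by a power of the spacing, recalibrating c, m, and l changes how aggressively short headways are closed, which is the lever the paper exploits to reproduce unsafe conflicts.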
Abstract:
Chlamydial infections represent a major threat to the long-term survival of the koala, and a successful vaccine would provide a valuable management tool. Vaccination, however, has the potential to enhance inflammatory disease in animals exposed to a natural infection prior to vaccination, a finding from early human and primate trials of whole-cell vaccines to prevent trachoma. In the present study, we vaccinated both healthy koalas and clinically diseased koalas with a multi-subunit vaccine consisting of Chlamydia pecorum MOMP and NrdB mixed with immune stimulating complex as adjuvant. Following vaccination, there was no increase in inflammatory pathological changes in animals previously infected with Chlamydia. Strong antibody (including neutralizing antibody) and lymphocyte proliferation responses were recorded in all vaccinated koalas, both healthy and clinically diseased. Vaccine-induced antibodies specific for both vaccine antigens were observed not only in plasma but also in ocular secretions. Our data show that an experimental chlamydial vaccine is safe to use in previously infected koalas, in that it does not worsen infection-associated lesions. Furthermore, the prototype vaccine is effective, as demonstrated by strong neutralizing antibody and lymphocyte proliferation responses in both healthy and clinically diseased koalas. Collectively, this work illustrates the feasibility of developing a safe and effective Chlamydia vaccine as a tool for the management of disease in wild koalas.