998 results for chick quality


Relevance:

20.00%

Publisher:

Abstract:

In today’s electronic world, vast amounts of knowledge are stored within many datasets and databases. Often the default format of this data means that the knowledge within is not immediately accessible; rather, it has to be mined and extracted, which requires automated tools that are effective and efficient. Association rule mining is one approach to obtaining the knowledge stored within datasets / databases: it discovers frequent patterns and association rules between the items / attributes of a dataset, with varying levels of strength. However, this is also association rule mining’s downside: the number of rules that can be found is usually very large. To use association rules (and the knowledge within them) effectively, the number of rules needs to be kept manageable, so a method is needed to reduce their number without losing knowledge in the process. Thus the idea of non-redundant association rule mining was born. A second issue with association rule mining is determining which rules are interesting. The standard approach has been to use support and confidence, but these have their limitations. Approaches that use information about the dataset’s structure to measure association rules are few, but could yield useful association rules if tapped. Finally, while it is important to obtain interesting association rules from a dataset in a manageable quantity, it is equally important to be able to apply them in a practical way, where the knowledge they contain can be taken advantage of. Association rules show items / attributes that appear together frequently. Recommendation systems also look at patterns and items / attributes that occur together frequently in order to make a recommendation to a person. It should therefore be possible to bring the two together. In this thesis we look at these three issues and propose approaches to address them.
For discovering non-redundant rules we propose enhanced approaches to rule mining in multi-level datasets that allow hierarchically redundant association rules to be identified and removed without information loss. For discovering interesting association rules based on the dataset’s structure, we propose three measures for use in multi-level datasets. Lastly, we propose and demonstrate an approach that allows association rules to be used practically and effectively in a recommender system, while at the same time improving the recommender system’s performance. This becomes especially evident when looking at the user cold-start problem for a recommender system; in fact, our proposal helps to solve this serious problem facing recommender systems.
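The support and confidence measures discussed above can be illustrated with a minimal sketch. The toy transaction data and helper functions below are hypothetical, invented for illustration and not drawn from the thesis:

```python
# Toy transaction database (hypothetical items, not from the thesis)
transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"milk", "bread", "butter"},
]

def support(itemset, transactions):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Estimated P(consequent | antecedent): support of the union
    divided by support of the antecedent alone."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

# The rule {milk} -> {bread} holds in 3 of 5 transactions (support 0.6)
# and in 3 of the 4 transactions containing milk (confidence 0.75).
rule_support = support({"milk", "bread"}, transactions)
rule_conf = confidence({"milk"}, {"bread"}, transactions)
```

A rule pruning step would then discard rules whose support or confidence falls below chosen thresholds; the redundancy removal the thesis proposes goes further, using the dataset's hierarchy rather than thresholds alone.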

Relevance:

20.00%

Publisher:

Abstract:

The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome, one of the most commonly reported eye health problems, is caused by abnormalities in the properties of the tear film. Current clinical tools to assess the tear film properties have shown certain limitations. The traditional invasive methods for the assessment of tear film quality, which are used by most clinicians, have been criticized for a lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence, no “gold standard” test is currently available to assess tear film integrity. Improving techniques for the assessment of tear film quality is therefore of clinical significance and is the main motivation for the work described in this thesis. In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea, and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection of the light is produced at the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well-structured pattern; when the tear film surface presents irregularities, the pattern also becomes irregular due to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for the evaluation of all the dynamic phases of the tear film.
However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics. A set of novel routines was purposely developed to quantify changes in the reflected pattern and to extract a time series estimate of TFSQ from the video recording. The routine extracts from each frame of the video recording a maximized area of analysis, within which a metric of TFSQ is calculated. Initially, two metrics, based on Gabor filter and Gaussian gradient-based techniques, were used to quantify the consistency of the pattern’s local orientation as a measure of TFSQ. These metrics helped to demonstrate the applicability of HSV to tear film assessment and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear. It was also able to clearly show a difference between bare-eye and contact lens wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these non-invasive techniques, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing tear break-up time (TBUT). The capability of each non-invasive method to discriminate dry eye from normal subjects was also investigated, and receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome.
The LSI technique gave the best results under both natural and suppressed blinking conditions, followed closely by HSV; the DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique identified during this clinical study was a lack of sensitivity to quantify the build-up/formation phase of the tear film cycle. For that reason, an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-circle pattern into an image of quasi-straight lines from which a block statistics value is extracted. This metric showed better sensitivity under low pattern disturbance and improved the performance of the ROC curves. Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully understand the HSV measurement and the instrument’s potential limitations. Of special interest was the assessment of the instrument’s sensitivity to subtle topographic changes. The theoretical simulations helped to provide some understanding of tear film dynamics; for instance, the model extracted for the build-up phase provided some insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model such time series and to extract the key clinical parameters (i.e., timing). Unfortunately, those techniques do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria.
Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order to ensure the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV method has shown good performance in a broad range of conditions (i.e., contact lens, normal and dry eye subjects). As a result, this technique could be a useful clinical tool for assessing tear film surface quality in the future.
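The Cartesian-to-polar block metric described above can be sketched in outline. The numpy code below is an illustrative reconstruction under simple assumptions (nearest-neighbour resampling, per-block standard deviation as the block statistic); it is not the thesis's actual routine:

```python
import numpy as np

def to_polar(img, n_r=64, n_theta=128):
    """Resample a square image onto an (r, theta) grid centred on the image
    centre, so that concentric rings map to quasi-straight horizontal lines."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.linspace(0.0, min(cy, cx), n_r)
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    ys = np.clip(np.rint(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip(np.rint(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return img[ys, xs]  # nearest-neighbour lookup

def block_metric(polar_img, block=8):
    """Mean of per-block standard deviations over the polar image:
    lower for a regular ring pattern, higher when the pattern is disturbed."""
    h, w = polar_img.shape
    h, w = h - h % block, w - w % block  # crop to a multiple of the block size
    blocks = polar_img[:h, :w].reshape(h // block, block, w // block, block)
    return float(blocks.std(axis=(1, 3)).mean())
```

On a perfectly radially symmetric pattern each polar row is nearly constant, so the metric is small; a disturbed tear film breaks that regularity and raises the block statistics.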

Relevance:

20.00%

Publisher:

Abstract:

In 2008, a three-year pilot ‘pay for performance’ (P4P) program, known as the ‘Clinical Practice Improvement Payment’ (CPIP), was introduced into Queensland Health (QHealth). QHealth is a large public-sector provider of acute, community, and public health services in Queensland, Australia. The organisation has recently embarked on a significant reform agenda, including a review of existing funding arrangements (Duckett et al., 2008). Partly in response to this reform agenda, a casemix funding model has been implemented to reconnect health care funding with outcomes. CPIP was conceptualised as a performance-based scheme that rewarded quality with financial incentives. This is the first time such a scheme has been implemented in the Australian public health sector with a focus on rewarding quality, and it is unique in its large state-wide focus across 15 Districts. CPIP initially targeted five acute and community clinical areas: Mental Health, Discharge Medication, Emergency Department, Chronic Obstructive Pulmonary Disease, and Stroke. The CPIP scheme was designed around clinical indicators that met set criteria: high disease burden; a well-defined single diagnostic group or intervention; significant variation in clinical outcomes and/or practices; a good evidence base; and clinician control and support (Ward, Daniels, Walker & Duckett, 2007). This evaluative research targeted Phase One of the implementation of the CPIP scheme, from January 2008 to March 2009. A formative evaluation utilising a mixed methodology and complementarity analysis was undertaken. The research involved three research questions and aimed to determine the knowledge, understanding, and attitudes of clinicians; identify improvements to the design, administration, and monitoring of CPIP; and determine the financial and economic costs of the scheme.
Three key studies were undertaken to address the research questions. First, a survey of clinicians examined their levels of knowledge and understanding and their attitudes to the scheme. Second, Statistical Process Control (SPC) was applied to the process indicators to assess whether it enhanced the scheme, and a third study examined a simple economic cost analysis. The CPIP survey elicited 192 clinician respondents, over 70% of whom supported continuation of the CPIP scheme. This finding was also supported by the results of a quantitative attitude survey, which identified positive attitudes in 6 of the 7 domains, including impact, awareness and understanding, and clinical relevance, all scored positively across the combined respondent group. SPC as a trending tool may play an important role in the early identification of indicator weakness in the CPIP scheme. This evaluative research supports a previously identified need in the literature for a phased introduction of P4P-type programs. It further highlights the value of undertaking a formal risk assessment of clinician, management, and systemic levels of literacy and competency with measurement and monitoring of quality prior to a phased implementation. This phasing can then be guided by a P4P Design Variable Matrix, which provides a selection of program design options such as indicator targets and payment mechanisms. It became evident that a clear process is required to standardise how clinical indicators evolve over time and to direct movement towards more rigorous ‘pay for performance’ targets and the development of an optimal funding model. Use of this matrix will enable the scheme to mature and build the literacy and competency of clinicians and the organisation as implementation progresses.
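As an illustration of how SPC can trend a process indicator, here is a minimal p-chart sketch. The indicator and monthly counts are invented for illustration, not CPIP data:

```python
import numpy as np

# Hypothetical monthly counts: patients meeting a discharge-medication
# indicator out of those eligible (illustrative numbers only).
met = np.array([42, 45, 40, 47, 44, 30, 43, 46])
eligible = np.array([50, 52, 48, 55, 50, 50, 49, 53])

p = met / eligible                                # monthly proportions
p_bar = met.sum() / eligible.sum()                # centre line
sigma = np.sqrt(p_bar * (1 - p_bar) / eligible)   # per-month standard error
ucl = np.clip(p_bar + 3 * sigma, 0, 1)            # upper control limit
lcl = np.clip(p_bar - 3 * sigma, 0, 1)            # lower control limit

# Months falling outside the 3-sigma limits signal a possible
# special cause, flagging indicator weakness early.
out_of_control = np.where((p < lcl) | (p > ucl))[0]
```

In this toy series the sixth month (index 5, 30 of 50) drops below the lower limit and would prompt investigation before the indicator deteriorates further.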
Furthermore, the research identified that CPIP created a spotlight on clinical indicators, and incentive payments of over five million dollars, from a potential ten million, were secured across the five clinical areas in the first 15 months of the scheme. This indicates that quality was rewarded in the new QHealth funding model and that, despite issues identified with the payment mechanism, funding was distributed. The economic model used identified a relatively low cost of reporting (under $8,000), as opposed to funds secured of over $300,000 for mental health, as an example. Movement to a full cost-effectiveness study of CPIP is supported. Overall, the introduction of the CPIP scheme into QHealth has been a positive and effective strategy for engaging clinicians in quality and has been the catalyst for the identification and monitoring of valuable clinical process indicators. This research has highlighted that clinicians are generally supportive of the scheme; however, there are some significant risks, including the functioning of the CPIP payment mechanism. Given clinician support for the use of a pay-for-performance methodology in QHealth, the CPIP scheme has the potential to be a powerful addition to a multi-faceted suite of quality improvement initiatives within QHealth.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we study both the level of Value-at-Risk (VaR) disclosure and the accuracy of the disclosed VaR figures for a sample of US and international commercial banks. To measure the level of VaR disclosures, we develop a VaR Disclosure Index that captures many different facets of market risk disclosure. Using panel data over the period 1996–2005, we find an overall upward trend in the quantity of information released to the public. We also find that Historical Simulation is by far the most popular VaR method. We assess the accuracy of VaR figures by studying the number of VaR exceedances and whether actual daily VaRs contain information about the volatility of subsequent trading revenues. Unlike the level of VaR disclosure, the quality of VaR disclosure shows no sign of improvement over time. We find that VaR computed using Historical Simulation contains very little information about future volatility.
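A minimal sketch of the Historical Simulation method assessed in the paper: the one-day VaR at a given confidence level is simply an empirical quantile of a window of past returns, and backtesting counts out-of-sample exceedances. The data below are simulated; the paper uses banks' reported figures:

```python
import numpy as np

def historical_var(returns, level=0.99):
    """1-day Historical Simulation VaR: the empirical (1 - level) quantile
    of past returns, reported as a positive loss figure."""
    return -float(np.quantile(returns, 1.0 - level))

rng = np.random.default_rng(0)
revenues = rng.normal(0.0, 1e6, 500)  # simulated daily trading revenues

# Estimate 99% VaR on the first 250 trading days, then backtest on the next 250:
var_99 = historical_var(revenues[:250])
exceedances = int((-revenues[250:] > var_99).sum())
# With a well-calibrated 99% VaR we expect roughly 1% of days to exceed it.
```

Because the method is purely backward-looking, a quiet estimation window produces a low VaR regardless of what volatility does next, which is consistent with the paper's finding that Historical Simulation VaR carries little information about future volatility.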

Relevance:

20.00%

Publisher:

Abstract:

Inadequate air quality and the inhalation of airborne pollutants pose many risks to human health and wellbeing, and are listed among the top environmental risks worldwide. The importance of outdoor air quality was recognised in the 1950s; indoor air quality emerged as an issue some time later and was soon recognised as having an equal, if not greater, importance. Identification of ambient air pollution as a health hazard was followed by steps, undertaken by a broad range of national and international professional and government organisations, aimed at reducing or eliminating the hazard. However, the process of achieving better air quality is still in progress. The last 10 years or so have seen an unprecedented increase in the interest in, and attention to, airborne particles, with a special focus on their finer size fractions, including ultrafine particles (< 0.1 µm) and their subset, nanoparticles (< 0.05 µm). This paper discusses the current status of scientific knowledge on the links between air quality and health, with a particular focus on airborne particulate matter, and the directions taken by national and international bodies to improve air quality.

Relevance:

20.00%

Publisher:

Abstract:

Background/aim: In response to the high burden of disease associated with chronic heart failure (CHF), in particular the high rates of hospital admissions, dedicated CHF management programs (CHF-MPs) have been developed, and over the past five years there has been rapid growth of CHF-MPs in Australia. Given the apparent mismatch between the demand for, and availability of, CHF-MPs, this paper discusses the accessibility and quality of current CHF-MPs in Australia. Methods: The data presented in this report are combined from the research of the co-authors, in particular a review of inequities in access to CHF management that utilised geographic information systems (GIS), and a survey of heterogeneity in quality and service provision in Australia. Results: Of the 62 CHF-MPs surveyed, 93% (58 centres) were located in areas rated as Highly Accessible, indicating that most CHF-MPs are located in capital cities or large regional cities. Six percent (4 CHF-MPs) were located in Accessible areas (country towns or cities). No CHF-MPs had been established outside of cities to service the estimated 72,000 individuals with CHF living in rural and remote areas. Sixteen percent of programs recruited NYHA Class I patients, and of these, 20% lacked confirmation (echocardiogram) of their diagnosis. Conclusion: Overall, these data highlight the urgent need to provide equitable access to CHF-MPs. When establishing CHF-MPs, consideration should be given to current evidence-based models to ensure quality in practice.

Relevance:

20.00%

Publisher:

Abstract:

Quality, as well as project success, in construction projects should be regarded as the fulfillment of the expectations of the contributors and stakeholders involved in such projects. Although a significant number of quality practices have been introduced within the industry, establishing and attaining reasonable levels of quality in construction projects internationally continues to be an ongoing problem. To date, some investigation into the introduction and improvement of quality practices and stakeholder management in the construction industry has been accomplished independently, but no major studies have comprehensively examined how quality management practices that concentrate on the stakeholders’ perspective of quality can contribute to final project quality outcomes. This paper examines the process for developing a framework for better involvement of stakeholders in quality planning and practices, and subsequently for contributing to higher quality outcomes within construction projects. Through an extensive literature review, it highlights various perceptions of quality, categorizes quality issues with particular focus on benefits and shortcomings, and examines stakeholders’ viewpoints of project quality in order to promote the improvement of outcomes throughout a project’s lifecycle. It proposes a set of organised information as a basis for the development of the prospective framework, which ultimately aims to improve project quality outcomes. The framework developed from this research will provide project managers and owners with the information and strategic direction required to achieve their own and their stakeholders’ targets for the implementation of quality practices and the achievement of high-quality outcomes on their future projects.

Relevance:

20.00%

Publisher:

Abstract:

This study explores organizational capability and culture change through a project developing an assurance of learning program in a business school. In order to compete internationally for high-quality faculty, students, strategic partnerships and research collaborations, it is essential for universities to develop and maintain an international focus and a quality product that predicts excellence in the student experience and graduate outcomes that meet industry needs. Developing, marketing and delivering that quality product requires an organizational strategy to which all members of the organization contribute and adhere. The ability to acquire, share and utilize knowledge has become a critical organizational capability in academia as well as in other industries. Traditionally, the functional approach to business school structures and the disparate nature of social networks and work contact limit the sharing of knowledge between academics working in different disciplines. In this project, a community of practice program was established to include academics in the development of an embedded assurance of learning program affecting more than 5000 undergraduate students and 250 academics from nine different disciplines across four schools. The primary outcome of the fully developed and implemented assurance of learning program was the five-year accreditation of the business school’s programs by two international accrediting bodies, EQUIS and AACSB. However, this study explores a different outcome, namely the change in organizational culture and individual capabilities as academics worked together in teaching and learning teams. The study uses a survey of and interviews with the academics involved, through a retrospective panel design containing an experimental group and a control group. Results offer insights into communities of practice as a means of addressing organizational capability and changes in organizational culture.
Knowledge management and shared learning can achieve strategic and operational benefits within academia just as within other industrial enterprises, but they come at a cost. Traditional structures, academics who act like individual contractors, and deep divides across research, teaching and service interests served a different master and required fewer resources. Collaborative structures, fewer master categories of discrete knowledge areas, specific strategic goals, greater links between academics and industry, and the means to share learned insights will require a different approach to resourcing both the individual and the team.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: This study provides a simple method for improving the precision of x-ray computed tomography (CT) scans in irradiated polymer gel dosimetry. The noise affecting CT scans of irradiated gels has been an impediment to the use of clinical CT scanners for gel dosimetry studies. Method: In this study, it is shown that multiple scans of a single PAGAT gel dosimeter can be used to extrapolate a ‘zero-scan’ image, which displays a level of precision similar to that of an image obtained by averaging multiple CT images, without the compromised dose measurement resulting from exposure of the gel to radiation from the CT scanner. Results: When extrapolating the zero-scan image, it is shown that exponential and simple linear fits to the relationship between Hounsfield unit and scan number, for each pixel in the image, provide an accurate indication of gel density. Conclusions: It is expected that this work will be utilised in the analysis of three-dimensional gel volumes irradiated using complex radiotherapy treatments.
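The zero-scan idea can be sketched as a per-pixel linear fit of Hounsfield units against scan number, extrapolated back to scan zero. The numpy example below uses a simulated scan stack under a simple linear-drift noise model; all numbers are illustrative and not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stack of repeated CT scans of one gel slice. Each scan adds a
# small dose, drifting HU upward, plus independent noise (illustrative model).
n_scans, h, w = 20, 32, 32
true_hu = rng.normal(20.0, 2.0, (h, w))        # underlying gel density map
drift = 0.05                                   # HU change per additional scan
scan_idx = np.arange(1, n_scans + 1)
scans = (true_hu[None] + drift * scan_idx[:, None, None]
         + rng.normal(0.0, 3.0, (n_scans, h, w)))

# Linear fit HU vs. scan number for every pixel at once:
# np.polyfit accepts a 2-D y with one column per pixel.
coeffs = np.polyfit(scan_idx, scans.reshape(n_scans, -1), deg=1)
zero_scan = coeffs[1].reshape(h, w)            # intercepts = 'zero-scan' image

# Compare noise: a single raw scan vs. the extrapolated zero-scan image.
noise_raw = float(np.abs(scans[0] - true_hu).mean())
noise_zero = float(np.abs(zero_scan - true_hu).mean())
```

Under this model the intercept image is both less noisy than any single scan and free of the accumulated-dose drift, which is the point of extrapolating rather than averaging.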

Relevance:

20.00%

Publisher:

Abstract:

This paper is based on the premise that universities have an obligation to provide adequate student support services, such as learning assistance (that is, assistance with academic writing and other study skills), and that to be effective such services must be responsive to the wider policy and social implications of student attrition and retention. The paper briefly outlines some of the factors that have influenced the development of learning assistance practices in Australia and America. This is followed by an account of experiences at one Australian metropolitan university where learning assistance service provision shifted from a decentralised, faculty-based model to a centralised model of service delivery. The shift was a response to concerns about a lack of quality and consistency in a support model dependent upon faculty resources, yet a follow-up study identified other problems in the centralised delivery of learning assistance services. These problems, clustered under the heading of contextualised versus decontextualised learning assistance, include the relevance of generic learning assistance services to students struggling with specific course-related demands; the apparent tension between challenging students and assisting students at risk of failure; and variations in the level of collaboration between learning advisers and academic staff in supporting students in the learning environment. These problems are analysed using theoretical modelling derived from the tools made available through cultural historical activity theory and expansive visibilisation (Engeström & Miettinen, 1999).

Relevance:

20.00%

Publisher:

Abstract:

Background: Not all cancer patients receive state-of-the-art care, and providing regular feedback to clinicians might reduce this problem. The purpose of this study was to assess the utility of various data sources in providing feedback on the quality of cancer care. Methods: Published clinical practice guidelines were used to obtain a list of processes-of-care of interest to clinicians. These were assigned to one of four data categories according to their availability and the marginal cost of using them for feedback. Results: Only 8 (3%) of 243 processes-of-care could be measured using population-based registry or administrative inpatient data (lowest cost). A further 119 (49%) could be measured using a core clinical registry, which contains information on important prognostic factors (e.g., clinical stage, physiological reserve, hormone-receptor status). Another 88 (36%) required an expanded clinical registry or medical record review, mainly because they concerned long-term management of disease progression (recurrences and metastases), and 28 (11.5%) required patient interview or audio-taping of consultations because they involved information sharing between clinician and patient. Conclusion: The advantages of population-based cancer registries and administrative inpatient data are wide coverage and low cost; the disadvantage is that they currently contain information on only a few processes-of-care. In most jurisdictions, clinical cancer registries, which can be used to report on many more processes-of-care, do not cover smaller hospitals. If we are to provide feedback about all patients, not just those in larger academic hospitals with the most developed data systems, then we need to develop sustainable population-based data systems that capture information on prognostic factors at the time of initial diagnosis and on management of disease progression.