979 results for "drug quality"


Relevance: 20.00%

Abstract:

Poly(lactide-co-glycolide) (PLGA) beads have been widely studied as potential drug/protein carriers. Their main shortcomings are that they lack bioactivity and controllable drug-delivery ability, and that their acidic degradation by-products can lower the pH in the vicinity of implants. Akermanite (AK) (Ca2MgSi2O7) is a novel bioactive ceramic that has shown excellent bioactivity and degradation in vivo. This study aimed to incorporate AK into PLGA beads to improve their physicochemical, drug-delivery, and biological properties. The microstructure of the beads was characterized by SEM. The effect of incorporating AK into PLGA beads on mechanical strength, apatite-formation ability, the loading and release of bovine serum albumin (BSA), and the proliferation and differentiation of bone marrow stromal cells (BMSCs) was investigated. The results showed that incorporating AK into PLGA beads changed the anisotropic microporous structure into a homogeneous one and improved the beads' compressive strength and apatite-formation ability in simulated body fluid (SBF). AK neutralized the acidic degradation products of PLGA, maintaining a stable pH of 7.4 in the biological environment, and led to sustained, controllable release of BSA from the beads. Incorporating AK into PLGA beads also enhanced the proliferation and alkaline phosphatase activity of BMSCs. These findings imply that incorporating AK into PLGA beads is a promising way to enhance their physicochemical and biological properties, and that AK/PLGA composite beads are a potential bioactive drug-delivery system for bone tissue repair.

Relevance: 20.00%

Abstract:

Ensuring that construction projects achieve high-quality outcomes continues to be an important consideration for key project stakeholders. Although many quality practices have been adopted within the industry, establishing and achieving reasonable levels of quality in construction projects remains a problem. While some studies of the introduction and development of quality practices, and of stakeholder management, in the construction industry have been undertaken separately, no major studies have so far examined in depth how quality management practices that specifically address stakeholders' perspectives of quality can contribute to the final constructed quality of projects. This paper summarizes a review of the literature on previous research into quality within the industry, focusing on its benefits and shortcomings, and examines the concept of integrating stakeholder perspectives of project quality to improve outcomes throughout the project lifecycle. The findings reveal a pressing need to investigate, develop, and test a framework that facilitates better implementation of quality management practices and thus the achievement of better quality outcomes within the construction industry. The framework will incorporate and integrate stakeholders' views on what constitutes final project quality, to be used in developing better quality management planning and systems aimed ultimately at better project quality delivery.

Relevance: 20.00%

Abstract:

A significant proportion of the cost of software development is due to software testing and maintenance. This is in part the result of the inevitable imperfections caused by human error, a lack of quality during the design and coding of software, and the increasing need to reduce faults to improve customer satisfaction in a competitive marketplace. Given the cost and importance of removing errors, improvements in fault detection and removal can be of significant benefit. The earlier in the development process faults are found, the less it costs to correct them and the less likely other faults are to develop. This research aims to make the testing process more efficient and effective by identifying the software modules most likely to contain faults, allowing testing efforts to be carefully targeted. This is done using machine learning algorithms that learn predictive models of quality from examples of fault-prone and not fault-prone modules. To learn the numerical mapping between a module and its classification, each module is represented in terms of software metrics. A difficulty in this sort of problem is sourcing software engineering data of adequate quality; in this work, data is obtained from two sources, the NASA Metrics Data Program and the open-source Eclipse project. Feature selection is applied before learning, and a number of different feature selection methods are compared to find which works best. Two machine learning algorithms are applied to the data, Naive Bayes and the Support Vector Machine, and the predictive results are compared to those of previous efforts, proving superior on selected data sets and comparable on others.

In addition, a new classification method, Rank Sum, is proposed, in which a ranking abstraction is laid over bin densities for each class, and a classification is determined from the sum of ranks over the features. A novel extension of this method is also described, based on an observed polarising of points by class when Rank Sum is applied to training data to transform it into a 2D rank-sum space. An SVM is applied to this transformed data to produce models whose parameters can be set according to trade-off curves to obtain a particular performance trade-off.
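The core classification step described above can be sketched with a minimal Gaussian Naive Bayes classifier over software metrics. The metric values (lines of code, cyclomatic complexity) and module labels below are hypothetical, and the thesis uses real NASA and Eclipse data with many more features; this only illustrates how a module's metric vector is mapped to a fault-proneness class.

```python
import math

# Hypothetical training data: each module is a vector of software metrics
# (lines of code, cyclomatic complexity); one list per class label.
faulty = [[320, 14], [410, 22], [280, 17], [500, 25]]
clean = [[80, 3], [120, 5], [60, 2], [150, 6]]

def stats(rows):
    """Per-feature mean and variance for one class."""
    n = len(rows)
    means = [sum(col) / n for col in zip(*rows)]
    variances = [sum((x - m) ** 2 for x in col) / n
                 for col, m in zip(zip(*rows), means)]
    return means, variances

def log_gauss(x, mean, var):
    """Log of the Gaussian density (variance floored to avoid div-by-zero)."""
    var = max(var, 1e-9)
    return -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)

def predict(module, classes):
    """Return the label of the class with the highest naive-Bayes log-likelihood."""
    best = None
    for label, (means, variances) in classes.items():
        ll = sum(log_gauss(x, m, v)
                 for x, m, v in zip(module, means, variances))
        if best is None or ll > best[1]:
            best = (label, ll)
    return best[0]

classes = {"fault-prone": stats(faulty), "not-fault-prone": stats(clean)}
print(predict([450, 20], classes))  # a large, complex module
print(predict([90, 4], classes))    # a small, simple module
```

In practice the class priors and a proper feature-selection step would precede this, as the thesis describes; the sketch assumes equal priors.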

Relevance: 20.00%

Abstract:

In today's electronic world, vast amounts of knowledge are stored within many datasets and databases. The default format of this data often means that the knowledge within is not immediately accessible but must instead be mined and extracted, which requires automated tools that are both effective and efficient. Association rule mining is one approach to obtaining the knowledge stored in datasets and databases; it discovers frequent patterns and association rules, of varying strength, between the items/attributes of a dataset. However, this is also its downside: the number of rules that can be found is usually very large. To use association rules (and the knowledge within them) effectively, the number of rules must be kept manageable, so a method is needed to reduce their number without losing knowledge in the process. Thus the idea of non-redundant association rule mining was born. A second issue with association rule mining is determining which rules are interesting. The standard approach has been to use support and confidence, but these have their limitations. Approaches that use information about a dataset's structure to measure association rules are few, but could yield useful rules if tapped. Finally, while it is important to obtain a manageable number of interesting association rules from a dataset, it is equally important to be able to apply them practically, so that the knowledge they contain can be exploited. Association rules show items/attributes that frequently appear together; recommendation systems likewise look at patterns and items/attributes that frequently occur together in order to make a recommendation. It should therefore be possible to bring the two together. In this thesis we examine these three issues and propose approaches to address them.

For discovering non-redundant rules, we propose enhanced approaches to rule mining in multi-level datasets that allow hierarchically redundant association rules to be identified and removed without information loss. For discovering interesting association rules based on a dataset's structure, we propose three measures for use in multi-level datasets. Lastly, we propose and demonstrate an approach that allows association rules to be used practically and effectively in a recommender system while also improving the recommender system's performance. This becomes especially evident for the user cold-start problem, and our proposal helps to solve this serious problem facing recommender systems.
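The support/confidence framework that this work builds on can be illustrated with a minimal sketch. The transactions and thresholds below are hypothetical, and the enumeration is restricted to 2-item sets for brevity; real miners such as Apriori prune the search space rather than enumerate pairs.

```python
from itertools import combinations

# Toy transaction dataset (hypothetical); each transaction is a set of items.
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def rules(min_support=0.4, min_confidence=0.6):
    """Enumerate rules A -> B over 2-item sets passing both thresholds."""
    items = sorted(set().union(*transactions))
    found = []
    for a, b in combinations(items, 2):
        s = support({a, b})
        if s < min_support:
            continue  # infrequent pair: no rule in either direction
        for lhs, rhs in ((a, b), (b, a)):
            conf = s / support({lhs})  # confidence = supp(A,B) / supp(A)
            if conf >= min_confidence:
                found.append((lhs, rhs, s, round(conf, 2)))
    return found

for lhs, rhs, s, conf in rules():
    print(f"{lhs} -> {rhs}  support={s:.2f} confidence={conf}")
```

Note how symmetric pairs generate two rules with the same support but possibly different confidence; it is this combinatorial blow-up, already visible on five transactions, that motivates redundancy removal.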

Relevance: 20.00%

Abstract:

The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome, one of the most commonly reported eye health problems, is caused by abnormalities in the properties of the tear film. Current clinical tools for assessing tear film properties have shown certain limitations. The traditional invasive methods for assessing tear film quality, used by most clinicians, have been criticized for a lack of reliability and/or repeatability, and the range of non-invasive methods that has been investigated also presents limitations. Hence no "gold standard" test is currently available to assess tear film integrity. Improving techniques for assessing tear film quality is therefore of clinical significance and is the main motivation for the work described in this thesis.

In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea, and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The light is reflected from the outermost layer of the cornea, the tear film: when the tear film is smooth, the reflected image presents a well-structured pattern, whereas when the tear film surface presents irregularities, the pattern becomes irregular due to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for evaluating all the dynamic phases of the tear film; the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics.

A set of novel routines was purposely developed to quantify changes in the reflected pattern and to extract a time-series estimate of TFSQ from the video recording. The routines extract a maximized area of analysis from each frame of the recording, and a TFSQ metric is calculated within this area. Initially, two metrics, based on Gabor filters and Gaussian gradient-based techniques, were used to quantify the consistency of the pattern's local orientation as a measure of TFSQ. These metrics helped to demonstrate the applicability of HSV to tear film assessment and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear, and to clearly show a difference between bare-eye and contact-lens-wearing conditions. Thus, HSV appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ.

Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing the tear break-up time (TBUT). The capability of each non-invasive method to discriminate dry eye from normal subjects was also investigated, with receiver operating characteristic (ROC) curves calculated to assess each method's ability to predict dry eye syndrome. The LSI technique gave the best results under both natural and suppressed blinking conditions, closely followed by HSV; DWS did not perform as well as LSI or HSV.

The main limitation of the HSV technique identified in this clinical study was insufficient sensitivity for quantifying the build-up/formation phase of the tear film cycle. For that reason, an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric circular pattern into an image of quasi-straight lines from which a block statistics value is extracted. This metric has shown better sensitivity under low pattern disturbance and has improved the performance of the ROC curves.

Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully comprehend the HSV measurement and the instrument's potential limitations. Of special interest was the assessment of the instrument's sensitivity to subtle topographic changes. The theoretical simulations helped to provide some understanding of tear film dynamics; for instance, the model derived for the build-up phase offered some insight into the dynamics of this initial phase.

Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model such time series and to extract the key clinical parameters (i.e., timing); unfortunately, these techniques do not simultaneously consider the underlying physiological mechanism and the parameter-extraction method. A set of guidelines is proposed to meet both criteria, with special attention given to a commonly used fit, the polynomial function, and to selecting an appropriate model order so that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of high-speed videokeratoscopy for assessing tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis, and modeling. The dynamic-area HSV method has shown good performance in a broad range of conditions (i.e., contact lens wearers, normal subjects, and dry eye subjects) and could become a useful clinical tool for assessing tear film surface quality.
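The Cartesian-to-polar block-processing idea can be sketched as follows. The image size, sampling grid, and synthetic ring patterns are hypothetical stand-ins for real Placido disk frames; the point is only that after the transform an intact ring pattern yields near-constant constant-radius columns, while a disturbed pattern does not, so a column-variability statistic can serve as a TFSQ-style metric.

```python
import math

def to_polar(image, cx, cy, n_radii, n_angles):
    """Resample a square image onto an (angle, radius) grid by nearest
    neighbour, so that concentric rings become near-constant columns."""
    h = len(image)
    r_max = min(cx, cy, h - 1 - cx, h - 1 - cy)
    polar = []
    for ai in range(n_angles):
        theta = 2 * math.pi * ai / n_angles
        row = []
        for ri in range(n_radii):
            r = r_max * (ri + 1) / n_radii
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            row.append(image[y][x])
        polar.append(row)
    return polar

def tfsq_metric(polar):
    """Mean standard deviation of each constant-radius column: near zero
    for an intact ring pattern, larger when the pattern breaks up."""
    stds = []
    for col in zip(*polar):
        m = sum(col) / len(col)
        stds.append(math.sqrt(sum((v - m) ** 2 for v in col) / len(col)))
    return sum(stds) / len(stds)

# Synthetic 32x32 images (hypothetical data): a smooth film gives intensity
# depending only on radius; a degraded film adds an angle-dependent term.
size, c = 32, 15.5
smooth = [[math.hypot(x - c, y - c) / c for x in range(size)]
          for y in range(size)]
disturbed = [[smooth[y][x] + 0.5 * math.sin(4 * math.atan2(y - c, x - c))
              for x in range(size)] for y in range(size)]

print(tfsq_metric(to_polar(smooth, c, c, 8, 16)))     # small
print(tfsq_metric(to_polar(disturbed, c, c, 8, 16)))  # noticeably larger
```

A production implementation would use sub-pixel interpolation and block statistics over the polar image, as the thesis describes; nearest-neighbour sampling keeps the sketch short.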

Relevance: 20.00%

Abstract:

Spatially offset Raman spectroscopy (SORS) is a powerful new technique for the non-invasive detection and identification of concealed substances and drugs. Here, we demonstrate the SORS technique in several scenarios relevant to customs screening, postal screening, drug detection, and forensics. The examples include analysis of a multi-layered postal package to identify a concealed substance; identification of an antibiotic capsule inside its plastic blister pack; analysis of an envelope containing a powder; and identification of a drug dissolved in a clear solvent inside a non-transparent plastic bottle. As well as providing practical examples of SORS, the results highlight several considerations for using SORS in the field, including the advantages of different analysis geometries and the ability to tailor instrument parameters and optics to different types of packages and samples. We also discuss the features and benefits of SORS relative to existing Raman techniques, including confocal microscopy, wide-area illumination, and conventional backscattered Raman spectroscopy. The results will contribute to the recognition of SORS as a promising method for the rapid, chemically specific analysis and detection of drugs and pharmaceuticals.

Relevance: 20.00%

Abstract:

In 2008, a three-year pilot 'pay for performance' (P4P) program, known as the 'Clinical Practice Improvement Payment' (CPIP), was introduced into Queensland Health (QHealth). QHealth is a large public-sector provider of acute, community, and public health services in Queensland, Australia. The organisation has recently embarked on a significant reform agenda, including a review of existing funding arrangements (Duckett et al., 2008). Partly in response to this agenda, a casemix funding model has been implemented to reconnect health care funding with outcomes. CPIP was conceptualised as a performance-based scheme that rewarded quality with financial incentives. This is the first time such a scheme has been implemented in the Australian public health sector with a focus on rewarding quality, and it is unique in its large state-wide scope, covering 15 Districts. CPIP initially targeted five acute and community clinical areas: Mental Health, Discharge Medication, Emergency Department, Chronic Obstructive Pulmonary Disease, and Stroke. The scheme was designed around the identification of clinical indicators meeting set criteria: high disease burden; a well-defined single diagnostic group or intervention; significant variation in clinical outcomes and/or practices; a good evidence base; and clinician control and support (Ward, Daniels, Walker & Duckett, 2007).

This evaluative research targeted Phase One of the implementation of the CPIP scheme, from January 2008 to March 2009. A formative evaluation utilising a mixed methodology and complementarity analysis was undertaken. The research addressed three research questions and aimed to determine the knowledge, understanding, and attitudes of clinicians; to identify improvements to the design, administration, and monitoring of CPIP; and to determine the financial and economic costs of the scheme. Three key studies were undertaken to answer these questions: a survey of clinicians examining their levels of knowledge and understanding and their attitudes to the scheme; a study applying Statistical Process Control (SPC) to the process indicators to assess whether it enhanced the scheme; and a simple economic cost analysis.

The CPIP survey elicited 192 clinician respondents, over 70% of whom supported continuation of the CPIP scheme. This finding was reinforced by a quantitative attitude survey that identified positive attitudes in 6 of the 7 domains, including impact, awareness and understanding, and clinical relevance, all scored positively across the combined respondent group. SPC, as a trending tool, may play an important role in the early identification of indicator weakness in the CPIP scheme. This evaluative research supports a previously identified need in the literature for a phased introduction of P4P programs. It further highlights the value of undertaking a formal risk assessment of clinician, management, and systemic levels of literacy and competency with the measurement and monitoring of quality prior to a phased implementation. This phasing can then be guided by a P4P Design Variable Matrix, which provides a selection of program design options such as indicator targets and payment mechanisms. It became evident that a clear process is required to standardise how clinical indicators evolve over time, to direct movement towards more rigorous pay-for-performance targets, and to develop an optimal funding model. Use of this matrix will enable the scheme to mature and will build the literacy and competency of clinicians and the organisation as implementation progresses.

Furthermore, the research found that CPIP created a spotlight on clinical indicators, and that incentive payments of over $5 million, from a potential $10 million, were secured across the five clinical areas in the first 15 months of the scheme. This indicates that quality was rewarded in the new QHealth funding model and that, despite identified issues with the payment mechanism, funding was distributed. The economic model identified a relatively low cost of reporting (under $8,000) compared with the funds secured (over $300,000 for Mental Health, for example); a move to a full cost-effectiveness study of CPIP is supported. Overall, the introduction of the CPIP scheme into QHealth has been a positive and effective strategy for engaging clinicians in quality and has been the catalyst for the identification and monitoring of valuable clinical process indicators. This research has highlighted that clinicians are generally supportive of the scheme; however, there are some significant risks, including the functioning of the CPIP payment mechanism. Given clinician support for the use of a pay-for-performance methodology in QHealth, the CPIP scheme has the potential to be a powerful addition to a multi-faceted suite of quality improvement initiatives within QHealth.
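The SPC approach applied to the process indicators can be sketched with a p-chart, a standard control chart for proportions. The monthly counts below are hypothetical, not CPIP data; the chart flags any month whose indicator proportion falls outside three-sigma limits around the overall rate, which is the kind of early indicator-weakness signal described above.

```python
import math

# Hypothetical monthly indicator data: (eligible cases, cases meeting the
# clinical process indicator), e.g. discharge medication completed on time.
months = [(120, 98), (135, 110), (110, 88), (140, 121), (125, 99), (130, 112)]

total_n = sum(n for n, _ in months)
p_bar = sum(x for _, x in months) / total_n  # centre line: overall proportion

for n, x in months:
    # Binomial three-sigma limits; the limits widen for smaller samples.
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    lcl, ucl = p_bar - 3 * sigma, p_bar + 3 * sigma
    p = x / n
    flag = "investigate" if not (lcl <= p <= ucl) else "in control"
    print(f"p={p:.3f}  limits=({lcl:.3f}, {ucl:.3f})  {flag}")
```

Run-rule extensions (e.g. eight consecutive points on one side of the centre line) would normally supplement the simple out-of-limits test shown here.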

Relevance: 20.00%

Abstract:

An existing model for solvent penetration and drug release from a spherically shaped polymeric drug-delivery device is revisited. The model has two moving boundaries: one describing the interface between the glassy and rubbery states of the polymer, and another defining the interface between the polymer ball and the pool of solvent. The model is extended so that the nonlinear diffusion coefficient of the drug depends explicitly on the concentration of solvent, and the resulting equations are solved numerically using a front-fixing transformation together with a finite-difference spatial discretisation and the method of lines. We present evidence that our scheme is considerably more accurate than a previous one. Asymptotic results in the small-time limit show how the use of a kinetic law as a boundary condition on the innermost moving boundary dictates the qualitative behaviour, the scalings being very different from those of the similar moving-boundary problem that arises from modelling the melting of an ice ball. The implication is that the model considered here exhibits what is referred to as "non-Fickian" or Case II diffusion, which, together with the initially constant rate of drug release, has certain appeal from a pharmaceutical perspective.
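The method-of-lines strategy mentioned above can be illustrated, under heavy simplification, on fixed-domain linear diffusion: discretise space with central differences and step the resulting ODE system in time (here with forward Euler). The grid size, time step, and boundary values are illustrative only; the actual model couples two moving boundaries, a concentration-dependent diffusivity, and a front-fixing transformation, none of which appear in this sketch.

```python
N = 50                     # interior grid points
D = 1.0                    # constant diffusivity (the paper's D depends on solvent concentration)
dx = 1.0 / (N + 1)
dt = 0.25 * dx * dx / D    # stable explicit step (dt <= dx^2 / (2 D))

# Initial condition: solvent concentration 1 at the outer surface, 0 inside.
c = [0.0] * (N + 2)
c[-1] = 1.0

for _ in range(2000):
    new = c[:]
    for i in range(1, N + 1):
        # Central-difference Laplacian, forward Euler in time.
        new[i] = c[i] + D * dt / dx**2 * (c[i - 1] - 2 * c[i] + c[i + 1])
    new[0] = new[1]        # zero-flux condition at the centre
    new[-1] = 1.0          # solvent reservoir held at the outer surface
    c = new

print(c[N // 2])           # solvent has partially penetrated to the midpoint
```

The front-fixing transformation in the paper maps each moving subdomain onto a fixed interval so that exactly this kind of fixed-grid discretisation can be applied, with extra ODEs tracking the boundary positions.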

Relevance: 20.00%

Abstract:

The use of mesoporous bioactive glasses (MBG) for drug delivery and bone tissue regeneration has grown significantly over the past five years. In this review, we highlight recent advances in the preparation of MBG particles, spheres, fibers, and scaffolds. The advantages of MBG for drug delivery and bone scaffold applications stem from this material's well-ordered mesopore channel structure, its superior bioactivity, and its suitability for delivering both hydrophilic and hydrophobic drugs. A brief forward-looking perspective on the potential clinical applications of MBG in regenerative medicine is also given.

Relevance: 20.00%

Abstract:

Background: The aim of this study was to seek an association between markers of metastatic potential, a drug resistance-related protein, and monocarboxylate transporters in prostate cancer (CaP). Methods: We evaluated the expression of the invasive markers CD147 and CD44v3-10, the drug-resistance protein MDR1, and the monocarboxylate transporters MCT1 and MCT4 in metastatic CaP cell lines and in CaP tissue microarrays (n=140) by immunostaining. Co-expression of CD147 and CD44v3-10 with MDR1, MCT1, and MCT4 in CaP cell lines was evaluated using confocal microscopy. The relationship between CD147 and CD44v3-10 expression and sensitivity (IC50) to docetaxel in CaP cell lines was assessed using the MTT assay, and the relationship between the expression of CD44v3-10, MDR1, and MCT4 and various clinicopathological parameters of CaP progression was examined. Results: CD147 and CD44v3-10 were co-expressed with MDR1, MCT1, and MCT4 in primary and metastatic CaP cells. Both CD147 and CD44v3-10 expression levels were inversely related to docetaxel sensitivity (IC50) in metastatic CaP cell lines. Overexpression of CD44v3-10, MDR1, and MCT4 was found in most primary CaP tissues and was significantly associated with CaP progression. Conclusions: Our results suggest that overexpression of CD147, CD44v3-10, MDR1, and MCT4 is associated with CaP progression. Expression of CD147 and CD44v3-10 correlates with drug resistance during CaP metastasis, making these markers potential therapeutic targets in advanced disease.

Relevance: 20.00%

Abstract:

In this paper we study both the level of Value-at-Risk (VaR) disclosure and the accuracy of the disclosed VaR figures for a sample of US and international commercial banks. To measure the level of VaR disclosures, we develop a VaR Disclosure Index that captures many different facets of market risk disclosure. Using panel data over the period 1996–2005, we find an overall upward trend in the quantity of information released to the public. We also find that Historical Simulation is by far the most popular VaR method. We assess the accuracy of VaR figures by studying the number of VaR exceedances and whether actual daily VaRs contain information about the volatility of subsequent trading revenues. Unlike the level of VaR disclosure, the quality of VaR disclosure shows no sign of improvement over time. We find that VaR computed using Historical Simulation contains very little information about future volatility.
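The Historical Simulation method and exceedance counting studied above can be sketched as follows. The P&L series is simulated, not real bank data; the 99% VaR is read off the empirical 1% quantile of past outcomes, and an exceedance is a day whose loss is worse than the reported VaR.

```python
import random

# Hypothetical daily trading revenues (one-day P&L) over a 500-day window.
random.seed(7)
pnl = [random.gauss(0, 1_000_000) for _ in range(500)]

def historical_var(returns, confidence=0.99):
    """Historical Simulation VaR: the loss exceeded on only (1 - confidence)
    of past days, reported as a positive number."""
    ordered = sorted(returns)
    k = int((1 - confidence) * len(ordered))  # index of the cut-off quantile
    return -ordered[k]

var99 = historical_var(pnl)
# Backtest in-sample: count days whose loss exceeded the VaR figure.
exceedances = sum(1 for x in pnl if x < -var99)
print(f"99% one-day VaR: {var99:,.0f}; exceedances in sample: {exceedances}")
```

By construction the in-sample exceedance count matches the chosen quantile; the paper's backtests instead compare each day's loss against the VaR disclosed *before* that day, which is where Historical Simulation's sluggish response to changing volatility shows up.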