367 results for Risk based Maintenance
Abstract:
Background India has a large and evolving HIV epidemic. Little is known about cancer risk in Indian persons with HIV/AIDS (PHA), but risk is thought to be low. Methods To describe the state of knowledge about cancer patterns in Indian PHA, we reviewed reports from the international and Indian literature. Results As elsewhere, non-Hodgkin lymphomas dominate the profile of recognized cancers, with immunoblastic/large cell diffuse lymphoma being the most common type. Hodgkin lymphoma is proportionally increased, perhaps because survival with AIDS is truncated by fatal infections. In contrast, Kaposi sarcoma is rare, in association with an apparently low prevalence of Kaposi sarcoma-associated herpesvirus. If confirmed, the reasons for the low prevalence need to be understood. Cervical, anal, vulva/vaginal and penile cancers all appear to be increased in PHA, based on limited data. The association may be confounded by sexual behaviors that transmit both HIV and human papillomavirus. Head and neck tumor incidence may also be increased, an important concern since these tumors are among the most common in India. Based on limited evidence, the increase is at buccal/palatal sites, which are associated with tobacco and betel nut chewing rather than human papillomavirus. Conclusion With improving care of HIV and better management of infections, especially tuberculosis, the longer survival of PHA in India will likely increase the importance of cancer as a clinical problem. With the population's geographic and social diversity, India presents unique research opportunities that can be embedded in programs targeting HIV/AIDS and other public health priorities.
Abstract:
Many governments world-wide are increasingly encouraging the involvement of interested individuals, groups and organisations in their public infrastructure and construction (PIC) projects as a means of improving the openness, transparency and accountability of the decision-making process and helping improve the projects' long-term viability and benefits to the community. In China, however, the current participatory mechanism at the project level exists only as part of the environmental impact assessment (EIA) process. With an increasing demand for PIC projects and social equality in China, this suggests a need to bring the participatory process into line with international practice. The aim of this paper, therefore, is to identify the weaknesses of EIA-based public participation in China and the means by which it may be improved for the whole lifecycle of PIC schemes. To do this, the results of a series of interviews with a diverse group of experts are reported, analysing the nature and extent of existing problems of public participation in EIA and suggestions for improvement. These indicate that the current level of participation in PIC projects is quite limited, particularly in the crucial earlier stages, primarily due to traditional culture and values, uneven progress in the adoption of participatory mechanisms, the risk of not meeting targets and lack of confidence in public competence. Finally, a process flowchart is proposed to guide construction practitioners and the community in general.
Abstract:
An increase in the likelihood of navigational collisions in port waters has put focus on the collision avoidance process in port traffic safety. The most widely used on-board collision avoidance system is the automatic radar plotting aid, a passive warning system that triggers an alert based on the pilot's pre-defined indicators of distance and time proximities at the closest point of approach in encounters with nearby vessels. To better help pilots with decision making in close-quarters situations, collision risk should be considered as a continuous monotonic function of the proximities, and risk perception should be treated probabilistically. This paper derives an ordered probit regression model to study perceived collision risks. To illustrate the procedure, the risks perceived by Singapore port pilots were obtained to calibrate the regression model. The results demonstrate that a framework based on the probabilistic risk assessment model can be used to give a better understanding of collision risk and to define a more appropriate level of evasive actions.
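As an illustration of the ordered probit approach described in this abstract, the sketch below fits such a model in Python with statsmodels. The predictor names (dcpa, tcpa for distance and time to the closest point of approach), the five-level risk rating and the synthetic data are hypothetical assumptions for demonstration only; they are not the study's variables or pilot survey data.

```python
# Minimal sketch of an ordered probit model for perceived collision risk.
# Variable names and synthetic data are illustrative assumptions.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
# Hypothetical proximity indicators for a vessel encounter:
# distance (nautical miles) and time (minutes) to the closest point of approach.
dcpa = rng.uniform(0.1, 2.0, n)
tcpa = rng.uniform(1.0, 30.0, n)
# Assume latent risk falls as the proximities grow; discretise into 5 ordered levels.
latent = -1.5 * dcpa - 0.1 * tcpa + rng.normal(size=n)
risk_rating = pd.cut(latent, bins=5, labels=False)  # 0 = negligible ... 4 = severe

X = pd.DataFrame({"dcpa": dcpa, "tcpa": tcpa})
model = OrderedModel(risk_rating, X, distr="probit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())  # coefficients and estimated thresholds between levels
```

In a framework of this kind, the fitted thresholds and coefficients give the probability of each perceived-risk level for a given encounter, which is the basis for grading evasive actions.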
Abstract:
Motor vehicle crashes are a leading cause of death among young people. Fourteen percent of adolescents aged 13-14 report passenger-related injuries within a three-month period. Intervention programs typically focus on young drivers and overlook passengers as potential protective influences. Graduated Driver Licensing restricts passenger numbers, and this study focuses on a complementary school-based intervention to increase passengers' personal- and peer-protective behavior. The aim of this research was to assess the impact of the curriculum-based injury prevention program, Skills for Preventing Injury in Youth (SPIY), on passenger-related risk-taking and injuries, and intentions to intervene in friends' risky road behavior. SPIY was implemented in Grade 8 Health classes and evaluated using survey and focus group data from 843 students across 10 Australian secondary schools. Intervention students reported less passenger-related risk-taking six months following the program. Their intention to protect friends from underage driving also increased. The results of this study show that a comprehensive, school-based program targeting individual and social changes can increase adolescent passenger safety.
Abstract:
The purpose of this study was to challenge the broadly based focus of injury prevention strategies and direct attention towards the needs of young adolescents who engage in multiple anti-social and delinquent behaviours. Five hundred and forty 13-14 year olds reported on injuries and on truancy, violence, illegal road behaviours, and drug and alcohol use. Engagement in these behaviours was found to contribute to the likelihood of an injury. Those engaging in the most anti-social and delinquent behaviours were around five times more likely to report medically-treated injuries in the past three months. They were also 1.8 times more likely to report a further injury when followed up three months later. Engagement in multiple delinquent and illegal behaviours thus significantly increased the likelihood of injury and identified a particularly vulnerable group. The findings also suggest that reaching these young people represents a key target for change strategies in injury prevention programs.
Abstract:
Background and significance: Older adults with chronic diseases are at increasing risk of hospital admission and readmission. Approximately 75% of adults have at least one chronic condition, and the odds of developing a chronic condition increase with age. Chronic diseases consume about 70% of the total Australian health expenditure, and about 59% of hospital events for chronic conditions are potentially preventable. These figures have brought to light the importance of the management of chronic disease among the growing older population. Many studies have endeavoured to develop effective chronic disease management programs by applying social cognitive theory. However, few studies have focused on chronic disease self-management in older adults at high risk of hospital readmission. Moreover, although the majority of studies have covered wide and valuable outcome measures, there is scant evidence examining fundamental health outcomes such as nutritional status, functional status and health-related quality of life. Aim: The aim of this research was to test social cognitive theory in relation to self-efficacy in managing chronic disease and three health outcomes, namely nutritional status, functional status, and health-related quality of life, in older adults at high risk of hospital readmission. Methods: A cross-sectional study design was employed for this research. Three studies were undertaken. Study One examined nutritional status and the validation of a nutritional screening tool; Study Two explored the relationships between participants' characteristics, self-efficacy beliefs, and health outcomes based on the study's hypothesized model; Study Three tested a theoretical model based on social cognitive theory, examining potential mechanisms of the mediation effects of social support and self-efficacy beliefs. One hundred and fifty-seven patients aged 65 years and older with a medical admission and at least one risk factor for readmission were recruited. Data were collected from medical records on demographics and medical history, and from self-report questionnaires. The nutrition data were collected by two registered nurses. For Study One, a contingency table and the kappa statistic were used to determine the validity of the Malnutrition Screening Tool. In Study Two, standard multiple regression, hierarchical multiple regression and logistic regression were undertaken to determine the significant predictors for the three health outcome measures. For Study Three, a structural equation modelling approach was taken to test the hypothesized self-efficacy model. Results: The findings of Study One suggested that malnutrition continues to be a concern in older adults, with a prevalence of 20.6% according to the Subjective Global Assessment. Additionally, the findings confirmed that the Malnutrition Screening Tool is a valid nutritional screening tool for hospitalized older adults at risk of readmission when compared to the Subjective Global Assessment, with high sensitivity (94%) and specificity (89%) and substantial agreement between the two methods (κ = .74, p < .001; 95% CI .62-.86). Analysis of the Study Two data, using hierarchical multiple regression, found that depressive symptoms and perceived social support were the two strongest predictors of self-efficacy in managing chronic disease.
Results of multivariable regression models suggested that advancing age, depressive symptoms and less tangible support were three important predictors of malnutrition. In terms of functional status, a standard regression model found that social support was the strongest predictor of the Instrumental Activities of Daily Living, followed by self-efficacy in managing chronic disease. The results of standard multiple regression revealed that the number of hospital readmission risk factors adversely affected the physical component score, while depressive symptoms and self-efficacy beliefs were two significant predictors of the mental component score. In Study Three, the results of the structural equation modelling showed that self-efficacy partially mediated the effect of health characteristics and depression on health-related quality of life. Health characteristics had strong direct effects on functional status and body mass index. The results also indicated that social support partially mediated the relationship between health characteristics and functional status. With regard to the joint effects of social support and self-efficacy, social support fully mediated the effect of health characteristics on self-efficacy, and self-efficacy partially mediated the effect of social support on functional status and health-related quality of life. The models fitted the data well, with relatively high variance explained, implying that the hypothesized constructs were highly relevant; hence the application of social cognitive theory in this context was supported. Conclusion: This thesis highlights the applicability of social cognitive theory to chronic disease self-management in older adults at risk of hospital readmission. Further studies are recommended to validate and extend the application of social cognitive theory to chronic disease self-management in older adults, in order to improve their nutritional and functional status and health-related quality of life.
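As a small illustration of the Study One validation reported in this abstract, the sketch below shows how sensitivity, specificity and Cohen's kappa follow from a 2x2 screening-versus-reference contingency table. The cell counts are hypothetical, chosen only to be roughly consistent with the reported figures (94% sensitivity, 89% specificity, κ = .74); they are not the thesis data.

```python
# Illustrative computation of screening-validity metrics from a 2x2 table.
# Counts below are hypothetical, not the thesis data.
# Rows: Malnutrition Screening Tool (at risk / not at risk)
# Columns: Subjective Global Assessment (malnourished / well nourished)
tp, fn = 30, 2      # correctly flagged / missed cases (hypothetical)
fp, tn = 14, 111    # false alarms / correctly screened out (hypothetical)

n = tp + fp + fn + tn
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

observed_agreement = (tp + tn) / n
# Chance agreement is computed from the marginal totals of both methods.
p_pos = ((tp + fp) / n) * ((tp + fn) / n)
p_neg = ((fn + tn) / n) * ((fp + tn) / n)
kappa = (observed_agreement - (p_pos + p_neg)) / (1 - (p_pos + p_neg))

print(f"n={n}  sensitivity={sensitivity:.2f}  "
      f"specificity={specificity:.2f}  kappa={kappa:.2f}")
```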
Abstract:
Information and communication technology (ICT) systems are almost ubiquitous in the modern world. It is hard to identify any industry, or for that matter any part of society, that is not in some way dependent on these systems and their continued secure operation. The security of information infrastructures, at both an organisational and a societal level, is therefore of critical importance. Information security risk assessment is an essential part of ensuring that these systems are appropriately protected and positioned to deal with a rapidly changing threat environment. The complexity of these systems and their inter-dependencies, however, introduces a similar complexity to the information security risk assessment task. This complexity suggests that information security risk assessment cannot optimally be undertaken manually. Information security risk assessment for individual components of the information infrastructure can be aided by a software tool, a type of simulation, which concentrates on modelling failure rather than normal operation. Avoiding modelling of the operational system again reduces the complexity of the assessment task. Such a tool provides the opportunity to reuse information in many different ways by developing a repository of relevant information to aid in risk assessment and management as well as governance and compliance activities. Widespread use of such a tool allows the risk models developed for individual information infrastructure components to be connected in order to build a model of information security exposures across the entire information infrastructure. In this thesis, conceptual and practical aspects of risk and its underlying epistemology are analysed to produce a model suitable for application to information security risk assessment. Based on this work, prototype software has been developed to explore these concepts for information security risk assessment. Initial work has been carried out to investigate the use of this software for information security compliance and governance activities. Finally, an initial concept for extending the use of this approach across an information infrastructure is presented.
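Purely as an illustration of the kind of reusable, connectable risk models this abstract describes, the Python sketch below represents components, their failure scenarios and their dependencies, and aggregates exposure across linked components. The class names, scoring scheme and aggregation rule are hypothetical assumptions and are not drawn from the thesis's prototype.

```python
# Hypothetical sketch of a reusable risk repository: each infrastructure
# component carries failure scenarios, and dependency links let component
# models be connected into an infrastructure-wide exposure view.
from dataclasses import dataclass, field

@dataclass
class Risk:
    threat: str
    vulnerability: str
    likelihood: float   # 0..1, assessed probability of the failure scenario
    impact: float       # 0..1, relative consequence if it occurs

    @property
    def exposure(self) -> float:
        return self.likelihood * self.impact

@dataclass
class Component:
    name: str
    risks: list[Risk] = field(default_factory=list)
    depends_on: list["Component"] = field(default_factory=list)

    def local_exposure(self) -> float:
        return max((r.exposure for r in self.risks), default=0.0)

    def aggregated_exposure(self) -> float:
        # A component inherits exposure from the components it depends on,
        # so connected models yield an infrastructure-wide view.
        upstream = max((c.aggregated_exposure() for c in self.depends_on), default=0.0)
        return max(self.local_exposure(), upstream)

db = Component("customer-db", risks=[Risk("SQL injection", "unpatched ORM", 0.3, 0.9)])
web = Component("web-portal", risks=[Risk("DDoS", "single ingress point", 0.5, 0.4)],
                depends_on=[db])
print(f"web-portal aggregated exposure: {web.aggregated_exposure():.2f}")
```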
Abstract:
Aim: Whilst motorcycle rider training is commonly incorporated into licensing programs in many developed nations, little empirical support has been found in previous research to prescribe it as an effective road safety countermeasure. It has been posited that the lack of effect of motorcycle rider training on crash reduction may, in part, be due to the predominant focus on skills-based training, with little attention devoted to addressing the attitudes and motives that influence subsequent risky riding. However, little past research has actually endeavoured to measure attitudinal and motivational factors as a function of rider training. Accordingly, this study was undertaken to assess the effect of a commercial motorcycle rider training program on psychosocial factors that have been shown to influence risk taking by motorcyclists. Method: Four hundred and thirty-eight motorcycle riders attending a competency-based licence training course in Brisbane, Australia, voluntarily participated in the study. A self-report questionnaire adapted from the Rider Risk Assessment Measure (RRAM) was administered to participants at the commencement of training, then again at the conclusion of training. Participants were informed of the independent nature of the research and that their responses would in no way affect their chance of obtaining a licence. To minimise potential demand characteristics, participants were instructed to seal completed questionnaires in envelopes and place them in a sealed box accessible only by the research team (i.e. not able to be viewed by instructors). Results: Significant reductions in the propensity for thrill seeking and in intentions to engage in risky riding in the next 12 months were found at the end of training. In addition, a significant improvement in attitudes towards safety was found. Conclusions: These findings indicate that rider training may have a positive short-term influence on riders' propensity for risk taking. However, such findings must be interpreted with caution in regard to the subsequent safety of riders, as these factors may be subject to further influence once riders are licensed and actively engage with peers during on-road riding. This highlights a challenge for road safety education/training programs in regard to the adoption of safety practices and the need for behavioural follow-up over time to ascertain long-term effects. This study was the initial phase of an ongoing program of research into rider training and risk taking framed around Theory of Planned Behaviour concepts. A subsequent 12-month follow-up of the study participants has been undertaken, with data analysis pending.
Abstract:
I have been invited to discuss Risk and Responsibility in Women's Prisons, a task which is slightly intimidating for one such as I who, having never worked in a prison, has never experienced the risks and responsibilities that working in a prison entails. However, this discussion is based on what prison staff have told me as they have ruminated on the complexities of their jobs in women's prisons, and many of the examples which I will be using are taken from cross-national research which I did in 2000 and 2001 and which set out to analyse the fortunes of some innovatory programmes in relation to women's prisons in England, Scotland, North America, Australia and Israel (Carlen 2002). The discussion draws in particular on the imaginative way in which the Scottish women's prison, Cornton Vale, responded to the spate of suicides which it had in the late 1990s and which resulted in far-reaching organizational change.
Abstract:
Background Physiotherapy and occupational therapy are two professions at high risk of work-related musculoskeletal disorders (WRMD). This investigation aimed to identify risk factors for WRMD as perceived by the health professionals working in these roles (Aim 1), as well as current and future strategies they perceive will allow them to continue to work in physically demanding clinical roles (Aim 2). Methods A two-phase exploratory investigation was undertaken. The first phase included a survey administered via a web-based platform with qualitative open-response items. The second phase involved four focus group sessions which explored topics obtained from the survey. Thematic analysis of qualitative data from the survey and focus groups was undertaken. Results Overall, 112 (34.3%) of the invited health professionals completed the survey; 66 (58.9%) were physiotherapists and 46 (41.1%) were occupational therapists. Twenty-four health professionals participated in one of four focus groups. The risk factors most frequently perceived by health professionals included: work postures and movements, lifting or carrying, patient-related factors and repetitive tasks. The six primary themes for strategies to allow therapists to continue to work in physically demanding clinical roles included: organisational strategies, workload or work allocation, work practices, work environment and equipment, physical condition and capacity, and education and training. Conclusions Risk factors as well as current and potential strategies for reducing WRMD amongst these health professionals working in clinically demanding roles have been identified and discussed. Further investigation regarding the relative effectiveness of these strategies is warranted.
Abstract:
Objective: This study aims to describe how patients perceive the threat of falls in hospitals, to identify patient characteristics that are associated with greater or lesser perceptions of the threat of falls, and to examine whether there is a discord between the risk that patients perceive in general and the risk that they perceive for themselves personally. Method: A cross-sectional survey amongst geriatric rehabilitation inpatients in Brisbane, Australia, was implemented. The first component of the survey dealt with the ‘general’ nature of in-hospital falls and falls related risks while the second component of the survey was directed at identifying whether the patient held the same belief for themselves. Results: A total of 21 out of 125 participants (17%) indicated that they felt that they were at risk of falling during their hospitalisation and 28 (22%) felt that they would injure themselves if they were to fall. Self-perceived risk of falls was associated with decreasing age and lower cognitive function (Functional Independence Measure Cognitive score). A majority of patients felt that falls most commonly occur in the bathroom [n=67 (54%)] and that if they were to fall, they would fall in the bathroom [n=56 (45%)]. Discussion: Patients generally do not think they are at risk of falling while in hospital and this may contribute to poor adherence to falls prevention strategies. It is possible that raising patient perception of the risk of falls and injury from falls in hospitals may help improve adherence to falls prevention strategies in this setting.
Abstract:
Background Older people have higher rates of hospital admission than the general population and higher rates of readmission due to complications and falls. During hospitalisation, older people experience significant functional decline which impairs their future independence and quality of life. Acute hospital services comprise the largest section of health expenditure in Australia and prevention or delay of disease is known to produce more effective use of services. Current models of discharge planning and follow-up care, however, do not address the need to prevent deconditioning or functional decline. This paper describes the protocol of a randomised controlled trial which aims to evaluate innovative transitional care strategies to reduce unplanned readmissions and improve functional status, independence, and psycho-social well-being of community-based older people at risk of readmission. Methods/Design The study is a randomised controlled trial. Within 72 hours of hospital admission, a sample of older adults fitting the inclusion/exclusion criteria (aged 65 years and over, admitted with a medical diagnosis, able to walk independently for 3 meters, and at least one risk factor for readmission) are randomised into one of four groups: 1) the usual care control group, 2) the exercise and in-home/telephone follow-up intervention group, 3) the exercise only intervention group, or 4) the in-home/telephone follow-up only intervention group. The usual care control group receive usual discharge planning provided by the health service. In addition to usual care, the exercise and in-home/telephone follow-up intervention group receive an intervention consisting of a tailored exercise program, in-home visit and 24 week telephone follow-up by a gerontic nurse. The exercise only and in-home/telephone follow-up only intervention groups, in addition to usual care, receive only the exercise or gerontic nurse components of the intervention, respectively. Data collection is undertaken at baseline within 72 hours of hospital admission, 4 weeks following hospital discharge, 12 weeks following hospital discharge, and 24 weeks following hospital discharge. Outcome assessors are blinded to group allocation. Primary outcomes are emergency hospital readmissions and health service use, functional status, psychosocial well-being and cost effectiveness. Discussion The acute hospital sector comprises the largest component of health care system expenditure in developed countries, and older adults are the most frequent consumers. There are few trials to demonstrate effective models of transitional care to prevent emergency readmissions, loss of functional ability and independence in this population following an acute hospital admission. This study aims to address that gap and provide information for future health service planning which meets client needs and lowers the use of acute care services.
Abstract:
Background Coronary heart disease (CHD) and depression are leading causes of disease burden globally and the two often co-exist. Depression is common after Myocardial Infarction (MI) and it has been estimated that 15-35% of patients experience depressive symptoms. Co-morbid depression can impair health related quality of life (HRQOL), decrease medication adherence and appropriate utilisation of health services, lead to increased morbidity and suicide risk, and is associated with poorer CHD risk factor profiles and reduced survival. We aim to determine the feasibility of conducting a randomised, multi-centre trial designed to compare a tele-health program (MoodCare) for depression and CHD secondary prevention, with Usual Care (UC). Methods Over 1600 patients admitted after index admission for Acute Coronary Syndrome (ACS) are being screened for depression at six metropolitan hospitals in the Australian states of Victoria and Queensland. Consenting participants are then contacted at two weeks post-discharge for baseline assessment. One hundred eligible participants are to be randomised to an intervention or a usual medical care control group (50 per group). The intervention consists of up to 10 × 30-40 minute structured telephone sessions, delivered by registered psychologists, commencing within two weeks of baseline screening. The intervention focuses on depression management, lifestyle factors (physical activity, healthy eating, smoking cessation, alcohol intake), medication adherence and managing co-morbidities. Data collection occurs at baseline (Time 1), 6 months (post-intervention) (Time 2), 12 months (Time 3) and 24 months follow-up for longer term effects (Time 4). We are comparing depression (Cardiac Depression Scale [CDS]) and HRQOL (Short Form-12 [SF-12]) scores between treatment and UC groups, assessing the feasibility of the program through patient acceptability and exploring long term maintenance effects. A cost-effectiveness analysis of the costs and outcomes for patients in the intervention and control groups is being conducted from the perspective of health care costs to the government. Discussion This manuscript presents the protocol for a randomised, multi-centre trial to evaluate the feasibility of a tele-based depression management and CHD secondary prevention program for ACS patients. The results of this trial will provide valuable new information about potential psychological and wellbeing benefits, cost-effectiveness and acceptability of an innovative tele-based depression management and secondary prevention program for CHD patients experiencing depression.
Abstract:
The epithelium of the corneolimbus contains stem cells for regenerating the corneal epithelium. Diseases and injuries affecting the limbus can lead to a condition known as limbal stem cell deficiency (LSCD), which results in loss of the corneal epithelium, and subsequent chronic inflammation and scarring of the ocular surface. Advances in the treatment of LSCD have been achieved through use of cultured human limbal epithelial (HLE) grafts to restore epithelial stem cells of the ocular surface. These epithelial grafts are usually produced by the ex vivo expansion of HLE cells on human donor amniotic membrane (AM), but this is not without limitations. Although AM is the most widely accepted substratum for HLE transplantation, donor variation, risk of disease transfer, and rising costs have led to the search for alternative biomaterials to improve the surgical outcome of LSCD. Recent studies have demonstrated that Bombyx mori silk fibroin (hereafter referred to as fibroin) membranes support the growth of primary HLE cells, and thus this thesis aims to explore the possibility of using fibroin as a biomaterial for ocular surface reconstruction. Optimistically, the grafted sheets of cultured epithelium would provide a replenishing source of epithelial progenitor cells for maintaining the corneal epithelium; however, the HLE cells lose their progenitor cell characteristics once removed from their niche. More severe ocular surface injuries, which result in stromal scarring, damage the epithelial stem cell niche, which subsequently leads to poor corneal re-epithelialisation post-grafting. An ideal solution to repairing the corneal limbus would therefore be to grow and transplant HLE cells on a biomaterial that also provides a means for replacing underlying stromal cells required to better simulate the normal stem cell niche. The recent discovery of limbal mesenchymal stromal cells (L-MSC) provides a possibility for stromal repair and regeneration, and therefore, this thesis presents the use of fibroin as a possible biomaterial to support a three dimensional tissue engineered corneolimbus with both an HLE and underlying L-MSC layer. Investigation into optimal scaffold design is necessary, including adequate separation of epithelial and stromal layers, as well as direct cell-cell contact. Firstly, the attachment, morphology and phenotype of HLE cells grown on fibroin were directly compared to that observed on donor AM, the current clinical standard substrate for HLE transplantation. The production, transparency, and permeability of fibroin membranes were also evaluated in this part of the study. Results revealed that fibroin membranes could be routinely produced using a custom-made film casting table and were found to be transparent and permeable. Attachment of HLE cells to fibroin after 4 hours in serum-free medium was similar to that supported by tissue culture plastic but approximately 6-fold less than that observed on AM. While HLE cultured on AM displayed superior stratification, epithelia constructed from HLE on fibroin maintained evidence of corneal phenotype (cytokeratin pair 3/12 expression; CK3/12) and displayed a comparable number and distribution of ΔNp63+ progenitor cells to that seen in cultures grown on AM. These results confirm the suitability of membranes constructed from silk fibroin as a possible substrate for HLE cultivation.
One of the most important aspects in corneolimbal tissue engineering is to consider the reconstruction of the limbal stem cell niche to help form the natural limbus in situ. MSC with similar properties to bone marrow derived-MSC (BM-MSC) have recently been grown from the limbus of the human cornea. This thesis evaluated methods for culturing L-MSC and limbal keratocytes using various serum-free media. The phenotype of resulting cultures was examined using photography, flow cytometry for CD34 (keratocyte marker), CD45 (bone marrow-derived cell marker), CD73, CD90, CD105 (collectively MSC markers), CD141 (epithelial/vascular endothelial marker), and CD271 (neuronal marker), immunocytochemistry (alpha-smooth muscle actin; α-SMA), differentiation assays (osteogenesis, adipogenesis and chondrogenesis), and co-culture experiments with HLE cells. While all techniques supported, to varying degrees, the establishment of keratocyte and L-MSC cultures, sustained growth and serial propagation was only achieved in serum-supplemented medium or the MesenCult-XF® culture system (Stem Cell Technologies). Cultures established in MesenCult-XF® grew faster than those grown in serum-supplemented medium and retained a more optimal MSC phenotype. L-MSC cultivated in MesenCult-XF® were also positive for CD141, rarely expressed α-SMA, and displayed multi-potency. L-MSC supported growth of HLE cells, with the largest epithelial islands being observed in the presence of L-MSC established in MesenCult-XF® medium. All HLE cultures supported by L-MSC widely expressed the progenitor cell marker ΔNp63, along with the corneal differentiation marker CK3/12. These findings indicate that MesenCult-XF® is a superior culture system for L-MSC, but further studies are required to explore the significance of CD141 expression in these cells. Following on from the findings of the previous two parts, silk fibroin was tested as a novel dual-layer construct containing both an epithelium and underlying stroma for corneolimbal reconstruction. In this section, the growth and phenotype of HLE cells on non-porous versus porous fibroin membranes were compared. Furthermore, the growth of L-MSC in either serum-supplemented medium or the MesenCult-XF® culture system within fibroin fibrous mats was investigated. Lastly, the co-culture of HLE and L-MSC in serum-supplemented medium on and within fibroin dual-layer constructs was also examined. HLE on porous membranes displayed a flattened and squamous monolayer; in contrast, HLE on non-porous fibroin appeared cuboidal and stratified, closer in appearance to a normal corneal epithelium. Both constructs maintained CK3/12 expression and distribution of ΔNp63+ progenitor cells. Dual-layer fibroin scaffolds consisting of HLE cells and L-MSC maintained a similar phenotype to that on the single layers alone. Overall, the present study proposed to create a three dimensional limbal tissue substitute of HLE cells and L-MSC together, ultimately for safe and beneficial transplantation back into the human eye. The results show that HLE and L-MSC can be cultivated separately and together whilst maintaining a clinically feasible phenotype containing a majority of progenitor cells. In addition, L-MSC were able to be cultivated routinely in the MesenCult-XF® culture system while maintaining a high purity for the MSC characteristic phenotype.
However, as a serum-free culture medium was not found to sustain growth of both HLE and L-MSC, the combination scaffold was created in serum-supplemented medium, indicating that further refinement of this cultured limbal scaffold is required. This thesis has also demonstrated a potential novel marker for L-MSC, and has generated knowledge which may impact on the understanding of stromal-epithelial interactions. These results support the feasibility of a dual-layer tissue engineered corneolimbus constructed from silk fibroin, and warrant further studies into the potential benefits it offers to corneolimbal tissue regeneration. Further refinement of this technology should explore the potential benefits of using epithelial-stromal co-cultures with MesenCult-XF® derived L-MSC. Subsequent investigations into the effects of long-term culture on the phenotype and behaviour of the cells in the dual-layer scaffolds are also required. While this project demonstrated the feasibility in vitro for the production of a dual-layer tissue engineered corneolimbus, further studies are required to test the efficacy of the limbal scaffold in vivo. Future in vivo studies are essential to fully understand the integration and degradation of silk fibroin biomaterials in the cornea over time. Subsequent experiments should also investigate the use of both AM and silk fibroin with epithelial and stromal cell co-cultures in an animal model of LSCD. The outcomes of this project have provided a foundation for research into corneolimbal reconstruction using biomaterials and offer a stepping stone for future studies into corneolimbal tissue engineering.
Abstract:
Despite a general belief that incentive mechanisms can improve value for money during procurement and performance during project execution, empirical research on their actual effects is nascent. This research focuses on the design and implementation of incentive mechanisms in four different infrastructure projects: two road reconstructions in the Netherlands and two building constructions in Australia. Based on an analytical framework of key motivation drivers, a cross-case analysis is conducted in view of performance on the contract assumptions, the selection phase, the execution phase and project contract performance. The results indicate that, despite significant differences in project characteristics, the cases experience similar contextual drivers of incentive effectiveness. High value was placed on risk allocation and relationship building in the selection and construction phases. The differences can be explained by both contextual and project-related characteristics. Although there are limitations with this research in drawing generalizations across two sets of case projects, the results provide a strong base for exploring the nature of incentive systems across different geographical and contextual boundaries in future research.