Abstract:
Background: Alcohol craving is associated with greater alcohol-related problems and less favorable treatment prognosis. The Obsessive Compulsive Drinking Scale (OCDS) is the most widely used alcohol craving instrument. The OCDS has been validated in adults with alcohol use disorders (AUDs), which typically emerge in early adulthood. This study examines the validity of the OCDS in a nonclinical sample of young adults. Methods: Three hundred and nine college students (mean age of 21.8 years, SD = 4.6 years) completed the OCDS, Alcohol Use Disorders Identification Test (AUDIT), and measures of alcohol consumption. Subjects were randomly allocated to 2 samples. Construct validity was examined via exploratory factor analysis (n = 155) and confirmatory factor analysis (n = 154). Concurrent validity was assessed using the AUDIT and measures of alcohol consumption. A second, alcohol-dependent sample (mean age 42 years, SD 12 years) from a previously published study (n = 370) was used to assess discriminant validity. Results: A unique young adult OCDS factor structure was validated, consisting of Interference/Control, Frequency of Obsessions, Alcohol Consumption, and Resisting Obsessions/Compulsions. The young adult 4-factor structure was significantly associated with the AUDIT and alcohol consumption. The 4-factor OCDS successfully classified nonclinical subjects in 96.9% of cases and the older alcohol-dependent patients in 83.7% of cases. Although the OCDS was able to classify college nonproblem drinkers (AUDIT <13, n = 224) with 83.2% accuracy, it was no better than chance (49.4%) in classifying potential college problem drinkers (AUDIT score ≥13, n = 85). Conclusions: Using the 4-factor structure, the OCDS is a valid measure of alcohol craving in young adult populations. In this nonclinical set of students, the OCDS classified nonproblem drinkers well but not problem drinkers. Studies need to further examine the utility of the OCDS in young people with alcohol misuse.
Abstract:
Purpose: To examine the relationship between visual impairment and functional status in a community-dwelling sample of older adults with glaucoma. Methods: This study included 74 community-dwelling older adults with open-angle glaucoma (aged 74 ± 6 years). Assessment of central vision included high-contrast visual acuity and Pelli-Robson contrast sensitivity. Binocular integrated visual fields were derived from merged monocular Humphrey Field Analyser visual field plots. Functional status outcome measures included physical performance tests (6-min walk test, timed up and go test and lower limb strength), a physical activity questionnaire (Physical Activity Scale for the Elderly) and an overall functional status score. Correlation and linear regression analyses, adjusting for age and gender, examined the association between visual impairment and functional status outcomes. Results: Greater levels of visual impairment were significantly associated with lower levels of functional status among community-dwelling older adults with glaucoma, independent of age and gender. Specifically, lower levels of visual function were associated with slower timed up and go performance, weaker lower limb strength, lower self-reported physical activity, and lower overall functional status scores. Of the components of vision examined, the inferior visual field and contrast factors were the strongest predictors of these functional outcomes, whereas the superior visual field factor was not related to functional status. Conclusions: Greater visual impairment, particularly in the inferior visual field and loss of contrast sensitivity, was associated with poorer functional status among older adults with glaucoma. The findings of this study highlight the potential links between visual impairment and the onset of functional decline. 
Interventions which promote physical activity among older adults with glaucoma may assist in preventing functional decline, frailty and falls, and improve overall health and well-being.
Abstract:
Background This economic evaluation reports the results of a detailed study of the cost of major trauma treated at Princess Alexandra Hospital (PAH), Australia. Methods A bottom-up approach was used to collect and aggregate the direct and indirect costs generated by a sample of 30 inpatients treated for major trauma at PAH in 2004. Major trauma was defined as an admission for Multiple Significant Trauma with an Injury Severity Score >15. Direct and indirect costs were amalgamated from three sources: (1) PAH inpatient costs, (2) Medicare Australia, and (3) a survey instrument. Inpatient costs included the initial episode of inpatient care, including clinical and outpatient services, and any subsequent re-presentations for ongoing related medical treatment. Medicare Australia provided an itemized list of pharmaceutical and ambulatory goods and services. The survey instrument collected out-of-pocket expenses and the opportunity cost of employment forgone. Inpatient data obtained from a publicly funded trauma registry were used to control for any potential bias in our sample. Costs are reported in Australian dollars for 2004 and 2008. Results The average direct and indirect costs of major trauma incurred up to 1-year postdischarge were estimated to be A$78,577 and A$24,273, respectively. The aggregate costs, for the State of Queensland, were estimated to range from A$86.1 million to A$106.4 million in 2004 and from A$135 million to A$166.4 million in 2008. Conclusion These results demonstrate that (1) the costs of major trauma are significantly higher than previously reported estimates and (2) the cost of readmissions increased inpatient costs by 38.1%.
Abstract:
This paper presents a modified approach to evaluating access control policy similarity and dissimilarity based on the proposal by Lin et al. (2007). Lin et al.'s policy similarity approach is intended as a filter stage which identifies similar XACML policies that can be analysed further using more computationally demanding techniques based on model checking or logical reasoning. This paper improves Lin et al.'s method for computing similarity and also proposes a mechanism to calculate a dissimilarity score by identifying related policies that are likely to produce different access decisions. Departing from the original algorithm, the modifications take into account the policy obligation, the rule or policy combining algorithm, and the operators between attribute names and values. The algorithms are useful in activities involving parties from multiple security domains, such as secured collaboration or secured task distribution. The algorithms allow various comparison options for evaluating policies while retaining control over the restriction level via a number of thresholds and weight factors.
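The weighted, threshold-controlled comparison described above can be illustrated with a small sketch. This is an illustrative simplification, not Lin et al.'s actual XACML algorithm: the policy fields, weights, and threshold below are invented for the example.

```python
# Toy sketch of weighted policy similarity scoring (hypothetical fields).
# Each policy is a dict of attribute sets plus a combining-algorithm name.

def jaccard(a, b):
    """Jaccard similarity of two sets (1.0 when both are empty)."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def policy_similarity(p1, p2, weights):
    """Combine per-component similarities using user-chosen weights."""
    score = 0.0
    for key in ("subjects", "resources", "actions"):
        score += weights[key] * jaccard(p1[key], p2[key])
    # An exact match on the combining algorithm adds a fixed component.
    score += weights["combining_alg"] * (p1["combining_alg"] == p2["combining_alg"])
    return score

p1 = {"subjects": {"doctor", "nurse"}, "resources": {"record"},
      "actions": {"read", "write"}, "combining_alg": "deny-overrides"}
p2 = {"subjects": {"doctor"}, "resources": {"record"},
      "actions": {"read"}, "combining_alg": "deny-overrides"}
w = {"subjects": 0.3, "resources": 0.3, "actions": 0.3, "combining_alg": 0.1}

sim = policy_similarity(p1, p2, w)   # 0.15 + 0.3 + 0.15 + 0.1 = 0.7
similar = sim >= 0.6                 # threshold turns the score into a filter decision
```

Adjusting the weights and the threshold corresponds to the "restriction level" control mentioned in the abstract: a higher threshold passes fewer policy pairs on to the expensive model-checking stage.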
Abstract:
"How do you film a punch?" This question can be posed by actors, make-up artists, directors and cameramen. Though they can all ask the same question, they are not all seeking the same answer. Within a given domain, based on the roles they play, agents of the domain have different perspectives and they want the answers to their question from their perspective. In this example, an actor wants to know how to act when filming a scene involving a punch. A make-up artist is interested in how to do the make-up of the actor to show bruises that may result from the punch. Likewise, a director wants to know how to direct such a scene and a cameraman is seeking guidance on how best to film such a scene. This role-based difference in perspective is the underpinning of the Loculus framework for information management for the Motion Picture Industry. The Loculus framework exploits the perspective of agent for information extraction and classification within a given domain. The framework uses the positioning of the agent’s role within the domain ontology and its relatedness to other concepts in the ontology to determine the perspective of the agent. Domain ontology had to be developed for the motion picture industry as the domain lacked one. A rule-based relatedness score was developed to calculate the relative relatedness of concepts with the ontology, which were then used in the Loculus system for information exploitation and classification. The evaluation undertaken to date have yielded promising results and have indicated that exploiting perspective can lead to novel methods of information extraction and classifications.
Abstract:
This paper reports a study investigating the effect of individual cognitive styles on learning through computer-based instruction. The study adopted a quasi-experimental design involving four groups which were presented with instructional material that either matched or mismatched with their preferred cognitive styles. Cognitive styles were measured by cognitive style assessment software (Riding, 1991). The instructional material was designed to cater for the four cognitive styles identified by Riding. Students' learning outcomes were measured by the time taken to perform test tasks and the number of marks scored. The results indicate no significant difference between the matched and mismatched groups on both time taken and scores on test tasks. However, there was a significant difference between the four cognitive styles on test score. The Wholist/Verbaliser group performed better than all other groups. There was no significant difference between the other groups. An analysis of the performance on test tasks by each cognitive style showed significant differences between the groups on recall, labelling and explanation. Differences between the cognitive style groups did not reach significance for problem-solving tasks. The findings of the study indicate a potential for cognitive style to influence learning outcomes measured by performance on test tasks.
Abstract:
Different international plant protection organisations advocate different schemes for conducting pest risk assessments. Most of these schemes use structured questionnaires in which experts are asked to score several items using an ordinal scale. The scores are then combined using a range of procedures, such as simple arithmetic means, weighted averages, multiplication of scores, and cumulative sums. The most useful schemes will correctly identify harmful pests and correctly identify those that are not harmful. As the quality of a pest risk assessment can depend on the characteristics of the scoring system used by the risk assessors (i.e., on the number of points of the scale and on the method used for combining the component scores), it is important to assess and compare the performance of different scoring systems. In this article, we propose a new method for assessing scoring systems. Its principle is to simulate virtual data using a stochastic model and then to estimate sensitivity and specificity values from these data for different scoring systems. The interest of our approach was illustrated in a case study where several scoring systems were compared. Data for this analysis were generated using a probabilistic model describing the pest introduction process. The generated data were then used to simulate the outcome of scoring systems and to assess the accuracy of the decisions about positive and negative introduction. The results showed that ordinal scales with at most 5 or 6 points were sufficient and that the multiplication-based scoring systems performed better than their sum-based counterparts. The proposed method could be used in the future to assess a great diversity of scoring systems.
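The general simulation idea (generate virtual pests with known outcomes, score them, and estimate sensitivity and specificity for each scoring system) can be sketched as follows. The score distributions, the 1-5 scale weighting, and the decision thresholds here are illustrative assumptions, not the authors' stochastic model.

```python
import random

random.seed(42)

def simulate_item(introduced):
    """Simulate one expert item score on a 1-5 ordinal scale.
    Introduced pests are assumed to tend toward higher scores."""
    weights = [1, 2, 4, 8, 10] if introduced else [10, 8, 4, 2, 1]
    return random.choices([1, 2, 3, 4, 5], weights=weights)[0]

def assess(introduced, n_items=4, combine="sum"):
    """Combine the item scores of one virtual pest by sum or by product."""
    scores = [simulate_item(introduced) for _ in range(n_items)]
    if combine == "sum":
        return sum(scores)
    value = 1
    for s in scores:
        value *= s
    return value

def sens_spec(combine, threshold, n=2000):
    """Estimate sensitivity and specificity of a scoring system by simulation."""
    tp = sum(assess(True, combine=combine) >= threshold for _ in range(n))
    tn = sum(assess(False, combine=combine) < threshold for _ in range(n))
    return tp / n, tn / n

sens_sum, spec_sum = sens_spec("sum", threshold=14)
sens_prod, spec_prod = sens_spec("product", threshold=120)
```

Comparing scoring systems then reduces to comparing these estimated (sensitivity, specificity) pairs across combination rules, scale lengths, and thresholds.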
Abstract:
Airports are a place of transition, empty halls of fleeting comings, goings and waitings. 'Gate 38' follows the experience of four groups of young people trapped at this point of departure. As contact with the outside world is cut off, the focus is placed squarely on what they’re doing, and where they’re going. A non-traditional musical set at the end of the world. Commissioned by MacGregor State High School's Centre of Artistic Development, script development included workshops with the CAD class of 2007. No musical score required.
Abstract:
Introduction. Surgical treatment of scoliosis is assessed in the spine clinic by the surgeon making numerous measurements on X-rays as well as of the rib hump. But it is important to understand which of these measures correlate with self-reported improvements in patients' quality of life following surgery. The objective of this study was to examine the relationship between patient satisfaction after thoracoscopic (keyhole) anterior scoliosis surgery and standard deformity correction measures using the Scoliosis Research Society (SRS) adolescent questionnaire. Methods. A series of 100 consecutive adolescent idiopathic scoliosis patients received a single anterior rod via a keyhole approach at the Mater Children's Hospital, Brisbane. Patients completed SRS outcomes questionnaires before surgery and again at 24 months after surgery. Multiple regression and t-tests were used to investigate the relationship between SRS scores and deformity correction achieved after surgery. Results. There were 94 females and 6 males with a mean age of 16.1 years. The mean Cobb angle improved from 52° pre-operatively to 21° for the instrumented levels post-operatively (59% correction) and the mean rib hump improved from 16° to 8° (51% correction). The mean total SRS score for the cohort was 99.4/120, which indicated a high level of satisfaction with the results of their scoliosis surgery. None of the deformity related parameters in the multiple regressions were significant. However, the twenty patients with the smallest Cobb angles after surgery reported significantly higher SRS scores than the twenty patients with the largest Cobb angles after surgery, but there was no difference on the basis of rib hump correction. Discussion. Patients undergoing thoracoscopic (keyhole) anterior scoliosis correction report good SRS scores which are comparable to those in previous studies.
We suggest that the absence of any statistically significant difference in SRS scores between patients with and without rod or screw complications is because these complications are not associated with any clinically significant loss of correction in our patient group. The Cobb angle after surgery was the only significant predictor of patient satisfaction when comparing subgroups of patients with the largest and smallest Cobb angles after surgery.
Abstract:
Background Up to one-third of people affected by cancer experience ongoing psychological distress and would benefit from screening followed by an appropriate level of psychological intervention. This rarely occurs in routine clinical practice due to barriers such as lack of time and experience. This study investigated the feasibility of community-based telephone helpline operators screening callers affected by cancer for their level of distress using a brief screening tool (Distress Thermometer), and triaging to the appropriate level of care using a tiered model. Methods Consecutive cancer patients and carers who contacted the helpline from September-December 2006 (n = 341) were invited to participate. Routine screening and triage was conducted by helpline operators at this time. Additional socio-demographic and psychosocial adjustment data were collected by telephone interview by research staff following the initial call. Results The Distress Thermometer had good overall accuracy in detecting general psychosocial morbidity (Hospital Anxiety and Depression Scale cut-off score ≥ 15) for cancer patients (AUC = 0.73) and carers (AUC = 0.70). We found 73% of participants met the Distress Thermometer cut-off for distress caseness according to the Hospital Anxiety and Depression Scale (a score ≥ 4), and optimal sensitivity (83%, 77%) and specificity (51%, 48%) were obtained with cut-offs of ≥ 4 and ≥ 6 in the patient and carer groups respectively. Distress was significantly associated with the Hospital Anxiety and Depression Scale scores (total, as well as anxiety and depression subscales) and level of care in cancer patients, as well as with the Hospital Anxiety and Depression Scale anxiety subscale for carers. There was a trend for more highly distressed callers to be triaged to more intensive care, with patients with distress scores ≥ 4 more likely to receive extended or specialist care. 
Conclusions Our data suggest that it was feasible for community-based cancer helpline operators to screen callers for distress using a brief screening tool, the Distress Thermometer, and to triage callers to an appropriate level of care using a tiered model. The Distress Thermometer is a rapid and non-invasive alternative to longer psychometric instruments, and may provide part of the solution in ensuring distressed patients and carers affected by cancer are identified and supported appropriately.
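The trade-off the study reports between cut-offs (≥4 versus ≥6) can be made concrete with a minimal sketch of how sensitivity and specificity are computed against a reference standard. The scores and case labels below are made-up toy data, not the study's data.

```python
# Toy illustration: sensitivity/specificity of a Distress Thermometer-style
# 0-10 screening score at different cut-offs, against a reference standard
# (e.g. HADS caseness). All numbers below are invented for the example.

def sensitivity_specificity(scores, is_case, cutoff):
    """Return (sensitivity, specificity) of `score >= cutoff` as a test."""
    tp = sum(s >= cutoff and c for s, c in zip(scores, is_case))
    fn = sum(s < cutoff and c for s, c in zip(scores, is_case))
    tn = sum(s < cutoff and not c for s, c in zip(scores, is_case))
    fp = sum(s >= cutoff and not c for s, c in zip(scores, is_case))
    return tp / (tp + fn), tn / (tn + fp)

scores  = [2, 3, 5, 6, 7, 8, 1, 4, 9, 2]
is_case = [False, False, True, True, True, True, False, False, True, False]

sens4, spec4 = sensitivity_specificity(scores, is_case, cutoff=4)  # (1.0, 0.8)
sens6, spec6 = sensitivity_specificity(scores, is_case, cutoff=6)  # (0.8, 1.0)
```

Raising the cut-off lowers sensitivity and raises specificity, which is why the study reports different optimal cut-offs for patients and carers.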
Abstract:
The emergence of Twenty20 cricket at the elite level has been marketed on the excitement of the big hitter, where it seems that winning is a result of the muscular batter hitting boundaries at will. This version of the game has captured the imagination of many young players who all want to score runs with “big hits”. However, in junior cricket, boundary hitting is often more difficult due to size limitations of children and games played on outfields where the ball does not travel quickly. As a result, winning is often achieved via a less spectacular route – by scoring more singles than your opponents. However, most standard coaching texts only describe how to play boundary scoring shots (e.g. the drives, pulls, cuts and sweeps) and defensive shots to protect the wicket. Learning to bat appears to have been reduced to extremes of force production, i.e. maximal force production to hit boundaries or minimal force production to stop the ball from hitting the wicket. Initially, this is not a problem because the typical innings of a young player (<12 years) would be based on the concept of “block” or “bash” – they “block” the good balls and “bash” the short balls. This approach works because there are many opportunities to hit boundaries off the numerous inaccurate deliveries of novice bowlers. Most runs are scored behind the wicket by using the pace of the bowler’s delivery to re-direct the ball, because the intrinsic dynamics (i.e. lack of strength) of most children means that they can only create sufficient power by playing shots where the whole body can contribute to force production. This method works well until the novice player comes up against more accurate bowling when they find they have no way of scoring runs. Once batters begin to face “good” bowlers, batters have to learn to score runs via singles. In cricket coaching manuals (e.g. 
ECB, n.d.), running between the wickets is treated as a separate task to batting, and the "basics" of running, such as how to "back up", carry the bat, calling and turning and sliding the bat into the crease are "drilled" into players. This task decomposition strategy focussing on techniques is a common approach to skill acquisition in many highly traditional sports, typified in cricket by activities where players hit balls off tees and receive "throw-downs" from coaches. However, the relative usefulness of these approaches in the acquisition of sporting skills is increasingly being questioned (Pinder, Renshaw & Davids, 2009). We will discuss why this is the case in the next section.
Abstract:
Venous leg ulceration is a serious condition affecting 1-3% of the population. Decline in the function of the calf muscle pump is correlated with venous ulceration. Many previous studies have reported an improvement in the function of the calf muscle pump, endurance of the calf muscle and increased range of ankle motion after structured exercise programs. However, there is a paucity of published research assessing whether these improvements result in an improvement in the healing rates of venous ulcers. The primary purpose of this pilot study was to establish the feasibility of a home-based progressive resistance exercise program and examine whether there was any clinical significance or trend toward healing. The secondary aims were to examine the benefit of a home-based progressive resistance exercise program on calf muscle pump function and physical parameters. The methodology used was a randomised controlled trial in which eleven participants were randomised into an intervention (n = 6) or control group (n = 5). Participants who were randomised to receive a 12-week home-based progressive resistance exercise program were instructed through weekly face-to-face consultations during their wound clinic appointment by the author. Control group participants received standard wound care and compression therapy. Changes in ulcer parameters were measured fortnightly at the clinic (number healed at 12 weeks, percentage change in area and pressure ulcer healing score). An air plethysmography test was performed at baseline and following the 12 weeks of training to determine changes in calf muscle pump function. Functional measures included maximum number of heel raises (endurance), maximal isometric plantar flexion (strength) and range of ankle motion (ROAM); these tests were conducted at baseline, week 6 and week 12. The sample for the study was drawn from the Princess Alexandra Hospital in Brisbane, Australia.
Participants with venous leg ulceration who met the inclusion criteria were recruited. The participants were screened via duplex scanning and ankle brachial pressure index (ABPI) to ensure they did not have any arterial complications. Participants were excluded if there was evidence of cellulitis. Demographic data were obtained from each participant, and details regarding medical history, quality of life and geriatric depression scores were collected at baseline. Both the intervention and control group were required to complete a weekly exercise diary to monitor activity levels between groups. To test for the effect of the intervention over time, a repeated measures analysis of variance was conducted on the major outcome variables. Group (intervention versus control) was the between-subject factor and time (baseline, week 6, week 12) was the within-subject or repeated measures factor. Due to the small sample size, further tests were conducted to check the assumptions of the statistical test to be used. Mauchly's test showed that the sphericity assumption of the repeated measures ANOVA was met. Further tests also confirmed that the homogeneity of variance assumption was met. Data analysis was conducted using the software package SPSS for Windows Release 17.0. The pilot study proved feasible, with all of the intervention (n = 6) participants continuing with the resistance program for the 12-week duration and no deleterious effects noted. Clinical significance was observed in the intervention group, with a 32% greater change in ulcer size (p = 0.26) than the control group, and a 10% (p = 0.74) greater difference in the numbers healed compared to the control group. Statistical significance was observed for the ejection fraction (p = 0.05), residual volume fraction (p = 0.04) and ROAM (p = 0.01), which all improved significantly in the intervention group over time.
These results are encouraging; nevertheless, further investigations seem warranted to examine the effect exercise has on the healing rates of venous leg ulcers, with multiple study sites, a larger sample size and a longer follow-up period.
Abstract:
Older adults, especially those acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been reported extensively. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and its impact on clinical outcomes. In the rapidly ageing Singapore population, all this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in Asian older adults and none has been validated in Singapore. Due to the ethnic, cultural, and language differences among Singaporean older adults, the results from non-Asian validation studies may not be applicable. Therefore, it is important to identify validated, population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore.
The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)] that were administered in no particular order. The total scores were not computed during the screening process so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward. The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression.
The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using a single question, "Do you often feel sad or depressed?"), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (refer to Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. The SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutrition screening tool, with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over a median LOS of 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to experiencing decline in nutritional status (defined by weight loss >1% per week).
Those with reduced nutritional status were more likely to be discharged to higher level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in their nutritional status during admission; and (c) evaluate the validity of nutrition screening and assessment tools amongst hospitalised older adults in an Asian population. The results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to the risks and consequences of malnutrition, it is important that they are systematically screened so that timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the current nutrition screening and assessment tools used among hospitalised older adults in Singapore. As older adults may have developed malnutrition prior to hospital admission, or experienced clinically significant weight loss of >1% per week of hospitalisation, screening of the elderly should be initiated in the community and continuous nutritional monitoring should extend beyond hospitalisation.
Abstract:
Objectives The p38 mitogen-activated protein kinase (MAPK) signal transduction pathway is involved in a variety of inflammatory responses, including cytokine generation, cell differentiation, proliferation and apoptosis. Here, we examined the effects of systemic p38 MAPK inhibition on cartilage cells and osteoarthritis (OA) disease progression by both in vitro and in vivo approaches. Methods p38 kinase activity was evaluated in normal and OA cartilage cells by measuring the amount of phosphorylated protein. To examine the function of the p38 signaling pathway in vitro, normal chondrocytes were isolated and differentiated in the presence or absence of the p38 inhibitor SB203580, and analysed for chondrogenic phenotype. The effect of systemic p38 MAPK inhibition in normal and OA (induced by meniscectomy) rats was analysed by treating animals with vehicle alone (DMSO) or the p38 inhibitor (SB203580). Damage to the femur and tibial plateau was evaluated by modified Mankin score, histology and immunohistochemistry. Results Our in vitro studies revealed that a down-regulation of chondrogenic and an increase of hypertrophic gene expression occur in normal chondrocytes when p38 is neutralized by a pharmacological inhibitor. We further observed that the basal levels of p38 phosphorylation were decreased in OA chondrocytes compared with normal chondrocytes. These findings together indicate the importance of this pathway in the regulation of cartilage physiology and its relevance to OA pathogenesis. In vivo, systemic administration of a specific p38 MAPK inhibitor, SB203580, continuously for over a month led to a significant loss of proteoglycan (aggrecan) and cartilage thickness. Moreover, SB203580-treated normal rats showed a significant increase in TUNEL-positive cells and in cartilage hypertrophy markers such as type X collagen, Runt-related transcription factor and matrix metalloproteinase-13, and OA-like phenotypic changes were substantially induced in the normal rats.
In addition, meniscectomy-induced OA rat models that were treated with the p38 inhibitor showed aggravation of cartilage damage. Conclusions In summary, this study has provided evidence that the p38 MAPK pathway is important for maintaining cartilage health and that its inhibition can lead to severe cartilage degenerative changes. The observations in this study highlight the possibility of using activators of the p38 pathway as an alternative approach in the treatment of OA.
Abstract:
Causative genetic variants have to date been identified for only a small proportion of familial colorectal cancer (CRC). While conditions such as Familial Adenomatous Polyposis and Lynch syndrome have well-defined genetic causes, the search for variants underlying the remainder of familial CRC is plagued by genetic heterogeneity. The recent identification of families with a heritable predisposition to malignancies arising through the serrated pathway (familial serrated neoplasia or Jass syndrome) provides an opportunity to study a subset of familial CRC in which heterogeneity may be greatly reduced. A genome-wide linkage screen was performed on a large family displaying a dominantly inherited predisposition to serrated neoplasia, genotyped using the Affymetrix GeneChip Human Mapping 10K SNP Array. Parametric and nonparametric analyses were performed, and the resulting regions of interest, as well as previously reported CRC susceptibility loci at 3q22, 7q31 and 9q22, were followed up by fine-mapping in 10 serrated neoplasia families. Genome-wide linkage analysis revealed regions of interest at 2p25.2-p25.1, 2q24.3-q37.1 and 8p21.2-q12.1. Fine-mapping linkage and haplotype analyses identified 2q32.2-q33.3 as the region most likely to harbour linkage, with a heterogeneity logarithm of the odds (HLOD) score of 2.09 and a nonparametric linkage (NPL) score of 2.36 (P = 0.004). Five primary candidate genes (CFLAR, CASP10, CASP8, FZD7 and BMPR2) were sequenced and no segregating variants identified. There was no evidence of linkage to the previously reported loci on chromosomes 3, 7 and 9.