221 results for tooth injuries


Relevance:

10.00%

Publisher:

Abstract:

Driving and using prescription medicines that have the potential to impair driving is an emerging research area. To date it is characterised by a limited (although growing) number of studies and methodological complexities that make generalisations about impairment due to medications difficult. Consistent evidence has been found for the impairing effects of hypnotics, sedative antidepressants and antihistamines, and narcotic analgesics, although it has been estimated that as many as nine medication classes have the potential to impair driving (Alvarez & del Rio, 2000; Walsh, de Gier, Christopherson, & Verstraete, 2004). There is also evidence for increased negative effects related to concomitant use of other medications and alcohol (Movig et al., 2004; Pringle, Ahern, Heller, Gold, & Brown, 2005). Statistics on the high levels of Australian prescription medication use suggest that consumer awareness of driving impairment due to medicines should be examined. One web-based study has found a low level of awareness, knowledge and risk perceptions among Australian drivers about the impairing effects of various medications on driving (Mallick, Johnston, Goren, & Kennedy, 2007). The lack of awareness and knowledge brings into question the effectiveness of the existing countermeasures. In Australia these consist of the use of ancillary warning labels administered under mandatory regulation and professional guidelines, advice to patients, and the use of Consumer Medicines Information (CMI) with medications that are known to cause impairment. The responsibility for the use of the warnings and related counsel to patients primarily lies with the pharmacist when dispensing relevant medication. A review by the Therapeutic Goods Administration (TGA) noted that in practice, advice to patients may not occur and that CMI is not always available (TGA, 2002). 
Researchers have also found that patients' recall of verbal counsel is very low (Houts, Bachrach, Witmer, Tringali, Bucher, & Localio, 1998). With healthcare increasingly being provided in outpatient settings (Davis et al., 2006; Vingilis & MacDonald, 2000), establishing the effectiveness of the warning labels as a countermeasure is especially important. There have been recent international developments in medication categorisation systems and associated medication warning labels. In 2005, France implemented a four-tier medication categorisation and warning system to improve patients' and health professionals' awareness and knowledge of related road safety issues (AFSSAPS, 2005). This warning system uses a pictogram and indicates the level of potential impairment in relation to driving performance through the use of colour, together with advice on the recommended behaviour to adopt towards driving. The comparable Australian system does not indicate the severity level of potential effects, and does not provide specific guidelines on the attitude or actions that the individual should adopt towards driving. It relies upon the patient to be vigilant in self-monitoring effects, to understand the potential ways in which they may be affected and how serious these effects may be, and to adopt the appropriate protective actions. This thesis investigates the responses of a sample of Australian hospital outpatients who receive appropriate labelling and counselling advice about potential driving impairment due to prescribed medicines. It aims to provide baseline data on the understanding and use of relevant medications by a Queensland public hospital outpatient sample recruited through the hospital pharmacy.
It includes an exploration and comparison of the effect of the Australian and French medication warning systems on medication user knowledge, attitudes, beliefs and behaviour, and explores whether there are areas in which the Australian system may be improved by including any beneficial elements of the French system. A total of 358 outpatients were surveyed, and a follow-up telephone survey was conducted with a subgroup of consenting participants who were taking at least one medication that required an ancillary warning label about driving impairment. A complementary study of 75 French hospital outpatients was also conducted to further investigate the performance of the warnings. Not surprisingly, medication use among the Australian outpatient sample was high. The ancillary warning labels required to appear on medications that can impair driving were prevalent. A subgroup of participants was identified as being potentially at-risk of driving impaired, based on their reported recent use of medications requiring an ancillary warning label and level of driving activity. The sample reported previous behaviour and held future intentions that were consistent with warning label advice and health protective action. Participants did not express a particular need for being advised by a health professional regarding fitness to drive in relation to their medication. However, it was also apparent from the analysis that the participants would be significantly more likely to follow advice from a doctor than a pharmacist. High levels of knowledge in terms of general principles about effects of alcohol, illicit drugs and combinations of substances, and related health and crash risks were revealed. This may reflect a sample specific effect. 
The professional guidelines for hospital pharmacists emphasise that advisory labels must be applied to medicines where applicable and that warning advice must be given to all patients on medication which may affect driving (SHPA, 2006, p. 221). The research program applied selected theoretical constructs from Schwarzer's (1992) Health Action Process Approach, which has extended constructs from existing health theories such as the Theory of Planned Behavior (Ajzen, 1991) to better account for the intention-behaviour gap often observed when predicting behaviour. This was undertaken to explore the utility of the constructs in understanding and predicting compliance intentions and behaviour with respect to the mandatory medication warning about driving impairment. This investigation revealed that the theoretical constructs related to intention and planning to avoid driving if an effect from the medication was noticed were useful. Not all the theoretical model constructs that had been demonstrated to be significant predictors in previous research on different health behaviours were significant in the present analyses. Positive outcome expectancies from avoiding driving were found to be important influences on forming the intention to avoid driving if an effect due to medication was noticed. In turn, intention was found to be a significant predictor of planning. Other selected theoretical constructs failed to predict compliance with the Australian warning label advice. It is possible that the limited predictive power of a number of constructs, including risk perceptions, is due to the small sample size obtained at follow-up, on which the evaluation is based. Alternatively, it is possible that the theoretical constructs failed to sufficiently account for issues of particular relevance to the driving situation.
The responses of the Australian hospital outpatient sample towards the Australian and French medication warning labels, which differed according to visual characteristics and warning message, were examined. In addition, a complementary study with a sample of French hospital outpatients was undertaken in order to allow general comparisons concerning the performance of the warnings. While a large amount of research exists concerning warning effectiveness, little research has specifically investigated medication warnings relating to driving impairment. General established principles concerning factors that have been demonstrated to enhance warning noticeability and behavioural compliance were extrapolated and investigated in the present study. The extent to which there is a need for education and improved health messages on this issue was a core issue of investigation in this thesis. Among the Australian sample, the size of the warning label and text, and the red colour, were the most visually important characteristics. The pictogram used in the French labels was also rated highly, and was salient for a large proportion of the sample. According to the study of French hospital outpatients, the pictogram was perceived to be the most important visual characteristic. Overall, the findings suggest that the Australian approach of using a combination of visual characteristics was important for the majority of the sample, but that the use of a pictogram could enhance effects. A high rate of warning recall was found overall, and a further important finding was that higher warning label recall was associated with an increased number of medication classes taken. These results suggest that increased vigilance and care are associated with the number of medications taken and the associated repetition of the warning message.
Significantly higher levels of risk perception were found for the French Level 3 (highest severity) label compared with the comparable mandatory Australian ancillary Label 1 warning. Participants' intentions related to the warning labels indicated that they would be more cautious while taking potentially impairing medication displaying the French Level 3 label than the Australian Label 1. These are potentially important findings for the Australian context regarding the current driving impairment warnings displayed on medication. The findings raise other important implications for the Australian labelling context. An underlying factor may be the differences in the wording of the warning messages that appear on the Australian and French labels. The French label explicitly states "do not drive" while the Australian label states "if affected, do not drive", and the difference in responses may reflect that less severity is perceived where the situation involves the consumer's self-assessment of their impairment. The differences in the assignment of responsibility by the Australian (the consumer assesses and decides) and French (the doctor assesses and decides) approaches for the decision to drive while taking medication raise the core question of who is best able to assess driving impairment due to medication: the consumer, or the health professional? There are pros and cons related to knowledge, expertise and practicalities with either option. However, if the safety of the consumer is the primary aim, then the trend towards stronger risk perceptions and more consistent and cautious behavioural intentions in relation to the French label suggests that this approach may be more beneficial for consumer safety. The observations from the follow-up survey, although based on a small sample size and descriptive in nature, revealed that just over half of the sample recalled seeing a warning label about driving impairment on at least one of their medications.
The majority of these respondents reported compliance with the warning advice. However, the results indicated variation in responses concerning alcohol intake and modifying the dose of medication or driving habits so that they could continue to drive, which suggests that the warning advice may not be having the desired impact. The findings of this research have implications for current countermeasures in this area. These include enhancing the role that prescribing doctors have in providing warnings and advice to patients about the impact that their medication can have on driving, increasing consumer perceptions of the authority of pharmacists on this issue, and reinforcing the warning message. More broadly, it is suggested that there would be benefit in wider dissemination of research-based information on increased crash risk, and in systematic monitoring and publicity concerning the representation of medications in crashes resulting in injuries and fatalities. Suggestions for future research concern the continued investigation of the effects of medications, and of their interactions with existing medical conditions and other substances, on driving skills; the effects of variations in warning label design; individual behaviours and characteristics (particularly among those groups who are dependent upon prescription medication); and validation of consumer self-assessment of impairment.

Relevance:

10.00%

Publisher:

Abstract:

Most crash severity studies have ignored severity correlations between driver-vehicle units involved in the same crash. Models that do not account for these within-crash correlations will produce biased estimates of factor effects. This study developed a Bayesian hierarchical binomial logistic model to identify the significant factors affecting the severity of driver injury and vehicle damage in traffic crashes at signalized intersections. Crash data from Singapore were employed to calibrate the model. Model fitness assessment and comparison using the Intra-class Correlation Coefficient (ICC) and the Deviance Information Criterion (DIC) confirmed the suitability of introducing crash-level random effects. Crashes occurring at peak times, in good street lighting conditions, or involving pedestrian injuries are associated with lower severity, while those occurring at night, at T/Y-type intersections, on the right-most lane, or at intersections equipped with a red light camera have larger odds of being severe. Moreover, heavy vehicles offer better protection against severe outcomes, while crashes involving two-wheeled vehicles, young or aged drivers, or an offending party are more likely to result in severe injuries.
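The crash-level random effect and the ICC it motivates can be sketched in a few lines. The model form follows the standard latent-variable formulation of a random-intercept logistic model; every parameter value below is an illustrative assumption, not a figure from the study.

```python
import math
import random

# Latent-variable ICC for a logistic model with a crash-level random effect:
# ICC = sigma_u^2 / (sigma_u^2 + pi^2 / 3), where pi^2/3 is the variance of
# the standard logistic error term.
def icc_logistic(sigma_u2):
    return sigma_u2 / (sigma_u2 + math.pi ** 2 / 3)

# Simulate binary severity outcomes for driver-vehicle units nested in
# crashes: units in the same crash share one random intercept u.
def simulate(n_crashes=1000, units_per_crash=2, beta0=-1.0, sigma_u=1.0, seed=7):
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_crashes):
        u = rng.gauss(0.0, sigma_u)  # shared crash-level effect
        for _ in range(units_per_crash):
            p = 1.0 / (1.0 + math.exp(-(beta0 + u)))
            outcomes.append(1 if rng.random() < p else 0)
    return outcomes

# With sigma_u^2 = 1, about 23% of the latent variance lies between crashes.
print(round(icc_logistic(1.0), 3))  # → 0.233
```

A non-trivial ICC like this is exactly what justifies the hierarchical specification over a pooled logistic model.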

Relevance:

10.00%

Publisher:

Abstract:

Introduction: In Singapore, motorcycle crashes account for 50% of traffic fatalities and 53% of injuries. While extensive research efforts have been devoted to improving motorcycle safety, the relationship between rider behavior and crash risk is still not well understood. The objective of this study is to evaluate how behavioral factors influence crash risk and to identify the most vulnerable group of motorcyclists. Methods: To explore rider behavior, a 61-item questionnaire examining sensation seeking (Zuckerman et al., 1978), impulsiveness (Eysenck et al., 1985), aggressiveness (Buss & Perry, 1992), and risk-taking behavior (Weber et al., 2002) was developed. A total of 240 respondents with at least one year of riding experience formed the sample, relating behavior to crash history, traffic penalty awareness, and demographic characteristics. After clustering crash risk using a medoid partitioning algorithm, a log-linear model relating rider behavior to crash risk was developed. Results and Discussion: Crash-involved motorcyclists scored higher on impulsive sensation seeking, aggression, and risk-taking behavior. Aggressive and high risk-taking motorcyclists were respectively 1.30 and 2.21 times more likely to fall into the high crash involvement group, while impulsive sensation seeking was not found to be significant. Based on their scores on risk-taking and aggression, the motorcyclists were clustered into four distinct personality combinations, namely extrovert (aggressive, impulsive risk-takers), leader (cautious, aggressive risk-takers), follower (agreeable, ignorant risk-takers), and introvert (self-conscious, fainthearted risk-takers). "Extrovert" motorcyclists were most prone to crashes, being 3.34 times more likely to be involved in a crash and 8.29 times more vulnerable than the "introvert" group.
Mediating factors such as demographic characteristics, riding experience, and traffic penalty awareness were not found to be significant in reducing crash risk. Conclusion: The findings of this study will help road safety campaign planners to better target the high-risk group, as well as those who employ motorcyclists for their delivery businesses.
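The medoid partitioning step in the Methods can be illustrated with a minimal PAM-style swap search. The crash counts and the two-group split below are invented for illustration; they are not the study's data, and the study's exact algorithm and distance measure may differ.

```python
import random

def k_medoids(points, k, n_iter=50, seed=1):
    """Basic 1-D medoid partitioning via a PAM-style swap search."""
    rng = random.Random(seed)
    medoids = rng.sample(points, k)

    def cost(meds):
        # Total absolute distance of every point to its nearest medoid.
        return sum(min(abs(p - m) for m in meds) for p in points)

    best = cost(medoids)
    for _ in range(n_iter):
        improved = False
        for i in range(k):
            for p in points:
                if p in medoids:
                    continue
                trial = medoids[:i] + [p] + medoids[i + 1:]
                c = cost(trial)
                if c < best:
                    medoids, best, improved = trial, c, True
        if not improved:
            break

    medoids = sorted(medoids)
    # Label each point with the index of its nearest medoid.
    labels = [min(range(k), key=lambda j: abs(p - medoids[j])) for p in points]
    return medoids, labels

# Illustrative self-reported crash counts: a low-risk and a high-risk group.
crashes = [0, 0, 1, 0, 1, 5, 6, 7, 0, 6]
medoids, labels = k_medoids(crashes, k=2)
print(medoids)  # → [0, 6]
```

Unlike k-means, the cluster centres here are actual data points, which is why medoid methods suit discrete, skewed counts such as crash histories.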

Relevance:

10.00%

Publisher:

Abstract:

Navigational collisions are one of the major safety concerns for many seaports. Continuing growth of shipping traffic in number and size is likely to result in an increased number of traffic movements, which consequently could result in a higher risk of collisions in these restricted waters. This continually increasing safety concern warrants a comprehensive technique for modeling collision risk in port waters, particularly for modeling the probability of collision events and the associated consequences (i.e., injuries and fatalities). A number of techniques have been utilized for modeling this risk qualitatively, semi-quantitatively and quantitatively. These traditional techniques mostly rely on historical collision data, often in conjunction with expert judgments. However, they are hampered by several shortcomings: the randomness and rarity of collision occurrence yield too few collision counts for sound statistical analysis, collision causation is insufficiently explained, and the approach to safety is reactive. A promising alternative that overcomes these shortcomings is the navigational traffic conflict technique (NTCT), which uses traffic conflicts as an alternative to collisions for modeling the probability of collision events quantitatively. This article explores the existing techniques for modeling collision risk in port waters. In particular, it identifies the advantages and limitations of the traditional techniques and highlights the potential of the NTCT to overcome those limitations. In view of the principles of the NTCT, a structured method for managing collision risk is proposed. This risk management method allows safety analysts to diagnose safety deficiencies proactively, and consequently has great potential for managing collision risk in a fast, reliable and efficient manner.
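The rarity argument against collision-count models can be made concrete with a standard Poisson property: for a count with mean m, the relative standard error is 1/sqrt(m). The yearly counts below are hypothetical, chosen only to contrast rare collisions with far more frequent traffic conflicts.

```python
import math

# For a Poisson count with mean m, the relative standard error is 1/sqrt(m):
# small expected counts give very imprecise rate estimates.
def relative_se(mean_count):
    return 1.0 / math.sqrt(mean_count)

# A port recording ~2 collisions/year needs decades of data for a tight
# estimate, whereas conflicts might be observed hundreds of times per year.
for m in (2, 20, 200):
    print(m, round(relative_se(m), 2))
# → 2 0.71
#   20 0.22
#   200 0.07
```

This is the statistical core of the NTCT's appeal: substituting frequent conflicts for rare collisions shrinks the relative uncertainty by an order of magnitude for the same observation period.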

Relevance:

10.00%

Publisher:

Abstract:

The epithelium of the corneolimbus contains stem cells for regenerating the corneal epithelium. Diseases and injuries affecting the limbus can lead to a condition known as limbal stem cell deficiency (LSCD), which results in loss of the corneal epithelium and subsequent chronic inflammation and scarring of the ocular surface. Advances in the treatment of LSCD have been achieved through the use of cultured human limbal epithelial (HLE) grafts to restore epithelial stem cells of the ocular surface. These epithelial grafts are usually produced by the ex vivo expansion of HLE cells on human donor amniotic membrane (AM), but this approach is not without limitations. Although AM is the most widely accepted substratum for HLE transplantation, donor variation, risk of disease transfer, and rising costs have led to the search for alternative biomaterials to improve the surgical outcome of LSCD. Recent studies have demonstrated that Bombyx mori silk fibroin (hereafter referred to as fibroin) membranes support the growth of primary HLE cells, and thus this thesis aims to explore the possibility of using fibroin as a biomaterial for ocular surface reconstruction. Ideally, the grafted sheets of cultured epithelium would provide a replenishing source of epithelial progenitor cells for maintaining the corneal epithelium; however, HLE cells lose their progenitor cell characteristics once removed from their niche. More severe ocular surface injuries, which result in stromal scarring, damage the epithelial stem cell niche, which subsequently leads to poor corneal re-epithelialisation post-grafting. An ideal solution to repairing the corneal limbus would therefore be to grow and transplant HLE cells on a biomaterial that also provides a means of replacing the underlying stromal cells required to better simulate the normal stem cell niche.
The recent discovery of limbal mesenchymal stromal cells (L-MSC) provides a possibility for stromal repair and regeneration, and therefore, this thesis presents the use of fibroin as a possible biomaterial to support a three-dimensional tissue engineered corneolimbus with both an HLE and underlying L-MSC layer. Investigation into optimal scaffold design is necessary, including adequate separation of epithelial and stromal layers, as well as direct cell-cell contact. Firstly, the attachment, morphology and phenotype of HLE cells grown on fibroin were directly compared to those observed on donor AM, the current clinical standard substrate for HLE transplantation. The production, transparency, and permeability of fibroin membranes were also evaluated in this part of the study. Results revealed that fibroin membranes could be routinely produced using a custom-made film casting table and were found to be transparent and permeable. Attachment of HLE cells to fibroin after 4 hours in serum-free medium was similar to that supported by tissue culture plastic but approximately 6-fold less than that observed on AM. While HLE cultured on AM displayed superior stratification, epithelia constructed from HLE on fibroin maintained evidence of corneal phenotype (cytokeratin pair 3/12 expression; CK3/12) and displayed a comparable number and distribution of ΔNp63+ progenitor cells to that seen in cultures grown on AM. These results confirm the suitability of membranes constructed from silk fibroin as a possible substrate for HLE cultivation. One of the most important aspects in corneolimbal tissue engineering is to consider the reconstruction of the limbal stem cell niche to help form the natural limbus in situ. MSC with similar properties to bone marrow-derived MSC (BM-MSC) have recently been grown from the limbus of the human cornea. This thesis evaluated methods for culturing L-MSC and limbal keratocytes using various serum-free media.
The phenotype of resulting cultures was examined using photography, flow cytometry for CD34 (keratocyte marker), CD45 (bone marrow-derived cell marker), CD73, CD90 and CD105 (collectively MSC markers), CD141 (epithelial/vascular endothelial marker), and CD271 (neuronal marker), immunocytochemistry (alpha-smooth muscle actin; α-SMA), differentiation assays (osteogenesis, adipogenesis and chondrogenesis), and co-culture experiments with HLE cells. While all techniques supported, to varying degrees, the establishment of keratocyte and L-MSC cultures, sustained growth and serial propagation were only achieved in serum-supplemented medium or the MesenCult-XF® culture system (Stem Cell Technologies). Cultures established in MesenCult-XF® grew faster than those grown in serum-supplemented medium and retained a more optimal MSC phenotype. L-MSC cultivated in MesenCult-XF® were also positive for CD141, rarely expressed α-SMA, and displayed multi-potency. L-MSC supported growth of HLE cells, with the largest epithelial islands being observed in the presence of L-MSC established in MesenCult-XF® medium. All HLE cultures supported by L-MSC widely expressed the progenitor cell marker ΔNp63, along with the corneal differentiation marker CK3/12. Our findings conclude that MesenCult-XF® is a superior culture system for L-MSC, but further studies are required to explore the significance of CD141 expression in these cells. Following on from the findings of the previous two parts, silk fibroin was tested as a novel dual-layer construct containing both an epithelium and underlying stroma for corneolimbal reconstruction. In this section, the growth and phenotype of HLE cells on non-porous versus porous fibroin membranes were compared. Furthermore, the growth of L-MSC in either serum-supplemented medium or the MesenCult-XF® culture system within fibroin fibrous mats was investigated.
Lastly, the co-culture of HLE and L-MSC in serum-supplemented medium on and within fibroin dual-layer constructs was also examined. HLE on porous membranes displayed a flattened and squamous monolayer; in contrast, HLE on non-porous fibroin appeared cuboidal and stratified, closer in appearance to a normal corneal epithelium. Both constructs maintained CK3/12 expression and the distribution of ΔNp63+ progenitor cells. Dual-layer fibroin scaffolds consisting of HLE cells and L-MSC maintained a phenotype similar to that observed on the single layers alone. Overall, the present study proposed to create a three-dimensional limbal tissue substitute of HLE cells and L-MSC together, ultimately for safe and beneficial transplantation back into the human eye. The results show that HLE and L-MSC can be cultivated separately and together whilst maintaining a clinically feasible phenotype containing a majority of progenitor cells. In addition, L-MSC were able to be cultivated routinely in the MesenCult-XF® culture system while maintaining a high purity for the MSC characteristic phenotype. However, as a serum-free culture medium was not found to sustain growth of both HLE and L-MSC, the combination scaffold was created in serum-supplemented medium, indicating that further refinement of this cultured limbal scaffold is required. This thesis has also demonstrated a potential novel marker for L-MSC, and has generated knowledge which may impact on the understanding of stromal-epithelial interactions. These results support the feasibility of a dual-layer tissue engineered corneolimbus constructed from silk fibroin, and warrant further studies into the potential benefits it offers to corneolimbal tissue regeneration. Further refinement of this technology should explore the potential benefits of using epithelial-stromal co-cultures with MesenCult-XF®-derived L-MSC.
Subsequent investigations into the effects of long-term culture on the phenotype and behaviour of the cells in the dual-layer scaffolds are also required. While this project demonstrated the feasibility in vitro for the production of a dual-layer tissue engineered corneolimbus, further studies are required to test the efficacy of the limbal scaffold in vivo. Future in vivo studies are essential to fully understand the integration and degradation of silk fibroin biomaterials in the cornea over time. Subsequent experiments should also investigate the use of both AM and silk fibroin with epithelial and stromal cell co-cultures in an animal model of LSCD. The outcomes of this project have provided a foundation for research into corneolimbal reconstruction using biomaterials and offer a stepping stone for future studies into corneolimbal tissue engineering.

Relevance:

10.00%

Publisher:

Abstract:

The aetiology behind overuse injuries such as stress fractures is complex and multi-factorial. In sporting events where the loading is likely to be uneven (e.g. hurdling and jumps), research has suggested that the frequency of stress fractures seems to favour the athlete's dominant limb. The tendency for an individual to have a preferred limb for voluntary motor acts makes limb selection a possible factor behind the development of unilateral overuse injuries, particularly when that limb is repeatedly used during high loading activities. The event of sprint hurdling is well suited to the study of loading asymmetry, as the hurdling technique is repetitive and the limb movement asymmetrical. Of relevance to this study is the high incidence of navicular stress fractures (NSF) in hurdlers, with suggestions that there is a tendency for the fracture to develop in the trail leg foot, although this is not fully accepted. The ground reaction force (GRF) with each foot contact is influenced by the hurdle action, with research finding step-to-step loading variations. However, it is unknown whether this loading asymmetry extends to individual forefoot joints, thereby influencing stress fracture development. The first part of the study involved a series of investigations using a commercially available matrix-style in-shoe sensor system (Fscan™, Tekscan Inc.). The suitability of insole sensor systems and custom-made discrete sensors for use in hurdling-related training activities was assessed, and the methodology used to analyse foot loading with each technology was investigated. The insole and discrete sensor systems tested proved to be unsuitable for use during full pace hurdling. Instead, a running barrier task designed to replicate the four repetitive foot contacts present during hurdling was assessed. This involved the clearance of a series of 6 barriers (low training hurdles), placed in a straight line, using 4 strides between each.
The second part of the study involved the analysis of inter-limb and within-foot loading asymmetries using stance duration as well as vertical GRF under the hallux (T1), the first metatarsal head (M1) and the central forefoot peak pressure site (M2), during walking, running, and running with barrier clearances. The contribution to loading asymmetry that each of the four repetitive foot contacts made during a series of barrier clearances was also assessed. Inter-limb asymmetry in forefoot loading occurred at discrete forefoot sites in a non-uniform manner across the three gait conditions. When the individual barrier foot contacts were compared, the stance duration was asymmetrical, as was the proportion of total forefoot load at M2. There were no significant differences between the proportions of forefoot load at M1 and M2 for any of the steps involved in the barrier clearance. A case study testing experimental (discrete) sensors during full pace sprinting and hurdling found that during both gait conditions, the trail limb experienced the greater vertical GRF at M1 and M2. During full pace hurdling, increased stance duration and vertical loading were characteristic of the trail limb hurdle foot contacts. Commercially available in-shoe systems are not suitable for on-field assessment of full pace hurdling. For the use of discrete sensor technology to become commonplace in the field, more robust sensors need to be developed.
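The thesis's exact asymmetry measure is not reproduced here; as a sketch, a commonly used percentage symmetry index can quantify the trail-vs-lead limb loading difference described above. The GRF values below are hypothetical.

```python
def symmetry_index(dominant, non_dominant):
    """Percentage symmetry index; 0 means perfect symmetry.

    A commonly used form: SI = (Xd - Xnd) / (0.5 * (Xd + Xnd)) * 100,
    i.e. the difference expressed relative to the mean of the two sides.
    """
    return (dominant - non_dominant) / (0.5 * (dominant + non_dominant)) * 100.0

# Illustrative vertical GRF peaks (in body weights) at the M1 site for the
# trail vs lead limb during a barrier clearance step.
trail, lead = 1.8, 1.5
print(round(symmetry_index(trail, lead), 1))  # → 18.2
```

Normalising by the mean of the two limbs keeps the index comparable across participants with different absolute loads, which matters when pooling walking, running, and barrier conditions.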

Relevance:

10.00%

Publisher:

Abstract:

Bicycle riding can be a positive experience for children and young people that builds confidence and independence and promotes healthy recreation. However, these benefits depend upon safe bicycle riding practices. Between 1 January 2004 and 31 December 2011, 12 children and young people under the age of 18 years died in bicycle incidents in Queensland. An additional 1736 bicycle-related injuries requiring emergency department attendance are estimated to have occurred between 2008 and 2009 in Queensland among children and young people under the age of 18 years. Of the twelve bicycle-related deaths between 2004 and 2011 in Queensland, two children were aged 5-9 years, five young people were aged 10-14 years, and five young people were aged 15-17 years. The two children aged 5-9 years were riding their bikes for recreation. Children aged 10-14 years were most likely to have been killed in an incident while riding to school in the morning, while teenagers aged 15-17 years were most likely to be killed in incidents occurring after school and in the evening. Bicycle riders are vulnerable road users, particularly children and young people. This is due to several factors that can be grouped into: 1) developmental characteristics such as body size and proportions, perceptual and attentional issues, road safety awareness and risk-taking behaviours, and 2) environmental factors such as supervision and shared road use with vehicles. This paper examines safety issues for children and young people who have died in bicycle-related incidents in Queensland, and outlines areas of focus for injury prevention practitioners.

Relevance:

10.00%

Publisher:

Abstract:

Background We have previously demonstrated that human kidney proximal tubule epithelial cells (PTEC) are able to modulate autologous T and B lymphocyte responses. It is well established that dendritic cells (DC) are responsible for the initiation and direction of adaptive immune responses and that these cells occur in the renal interstitium in close apposition to PTEC under inflammatory disease settings. However, there is no information regarding the interaction of PTEC with DC in an autologous human context. Methods Human monocytes were differentiated into monocyte-derived DC (MoDC) in the absence or presence of primary autologous activated PTEC and matured with polyinosinic:polycytidylic acid [poly(I:C)], while purified, pre-formed myeloid blood DC (CD1c+ BDC) were cultured with autologous activated PTEC in the absence or presence of poly(I:C) stimulation. DC responses were monitored by surface antigen expression, cytokine secretion, antigen uptake capacity and allogeneic T-cell-stimulatory ability. Results The presence of autologous activated PTEC inhibited the differentiation of monocytes to MoDC. Furthermore, MoDC differentiated in the presence of PTEC displayed an immature surface phenotype, efficient phagocytic capacity and, upon poly(I:C) stimulation, secreted low levels of pro-inflammatory cytokine interleukin (IL)-12p70, high levels of anti-inflammatory cytokine IL-10 and induced weak Th1 responses. Similarly, pre-formed CD1c+ BDC matured in the presence of PTEC exhibited an immature tolerogenic surface phenotype, strong endocytic and phagocytic ability and stimulated significantly attenuated T-cell proliferative responses. Conclusions Our data suggest that activated PTEC regulate human autologous immunity via complex interactions with DC. The ability of PTEC to modulate autologous DC function has important implications for the dampening of pro-inflammatory immune responses within the tubulointerstitium in renal injuries. 
Further dissection of the mechanisms of PTEC modulation of autologous immune responses may offer targets for therapeutic intervention in renal medicine.


In dentinogenesis, certain growth factors, matrix proteoglycans, and proteins are directly or indirectly dependent on growth hormone. The hypothesis that growth hormone up-regulates the expression of enzymes, sialoproteins, and other extracellular matrix proteins implicated in the formation and mineralization of tooth and bone matrices was tested by treating Lewis dwarf rats with growth hormone over 5 days. The molar teeth were processed for immunohistochemical demonstration of bone alkaline phosphatase, bone morphogenetic proteins-2 and -4, osteocalcin, osteopontin, bone sialoprotein, and E11 protein. Odontoblasts responded to growth hormone with an increased number of cells expressing bone morphogenetic protein, alkaline phosphatase, osteocalcin, and osteopontin. No changes were found in bone sialoprotein or E11 protein expression. Thus, growth hormone may stimulate odontoblasts to express several growth factors and matrix proteins associated with dentin matrix biosynthesis in mature rat molars.


Background and Aims: Falls and fall-related injuries result in reduced functioning, loss of independence, premature nursing home admission and mortality. Malnutrition is associated with falls in the acute setting, but little is known about malnutrition and falls risk in the community. The aim of this study was to assess the association between malnutrition risk, falls risk and falls over a one-year period in community-dwelling older adults. Methods: Two hundred and fifty-four subjects aged >65 years were recruited to a study designed to identify risk factors for falls. Malnutrition risk was determined using the Mini Nutritional Assessment-Short Form. Results: 28.6% of subjects had experienced a fall, and 3.9% (n=10) were at risk of malnutrition according to the Mini Nutritional Assessment-Short Form. There was no association between malnutrition risk and either falls risk or actual falls in these healthy community-dwelling older adults. Conclusions: There was a low prevalence of malnutrition risk in this sample of community-dwelling older adults and no association between nutritional risk and falls. Falls prevention programs should nevertheless screen for the risk of developing malnutrition, as malnutrition is associated with falls in the acute setting.
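The prevalence figures reported above can be cross-checked in a few lines of Python. This is a minimal sketch: the counts are taken from the abstract, and the number of fallers (which the abstract does not report) is back-calculated from the stated 28.6%.

```python
# Cross-check of the prevalence figures reported in the abstract.
n_total = 254              # subjects recruited (aged >65 years)
n_malnutrition_risk = 10   # at risk per MNA-SF (reported as 3.9%)

malnutrition_prevalence = 100 * n_malnutrition_risk / n_total
print(f"Malnutrition risk prevalence: {malnutrition_prevalence:.1f}%")  # 3.9%

# Back-calculate the approximate number of fallers from the reported 28.6%
# (an estimate for illustration; the abstract gives only the percentage).
n_fallers = round(0.286 * n_total)
print(f"Approximate number of fallers: {n_fallers}")
```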


General risky behaviour is explored for correlation with risky driving behaviour in light of two theories, self-control and cross-situational consistency. Identification of the general risky behaviours associated with risky driving behaviour, and of the theory that best predicts those behaviours, will enable better targeting of intervention and education strategies to reduce driving-related fatalities and injuries. A correlational study surveyed the lifestyle and behaviours of participants (N = 152) drawn from first-year university undergraduates and the general public. Relationships were found between risky driving behaviours and other risky behaviours such as alcohol consumption, cannabis use and unlawful activities. No significant differences were found between genders, with the exceptions that males were more likely to believe they were at risk of injury from their employment, χ2 (1, N = 152) = 4.49, p = .03, were more likely to have committed an unlawful offence, χ2 (1, N = 152) = 11.77, p = .001, and were more likely to drink drive, t (55.41) = -3.87, p < .001, mean difference = -0.63, 95% CI (-0.9, -0.37). People who engaged in risky driving behaviours were more likely to engage in other risky behaviours. Neither theory accurately predicted an association between general risky behaviour and driving without a licence or while disqualified. Cross-situational consistency explained 20% (R2adj = .16) of the variance in engagement in risky driving, with low self-control theory explaining an additional 0.3% of variance (R2change = .003), F (8,143) = 6.92, p < .001. Driving under the influence of alcohol could be predicted by risky behaviours in lifestyle, health, smoking, cannabis use and alcohol consumption, F (8,143) = 6.92, p < .001. The addition of self-control was not significant.
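The gender comparisons above are χ²(1) tests on 2×2 contingency tables. For a 2×2 table the Pearson statistic has a closed form, sketched below in pure Python. The cell counts are hypothetical, chosen only to illustrate the calculation; the abstract reports the test statistics but not the underlying frequencies.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (df = 1) for the 2x2 contingency
    table [[a, b], [c, d]] -- the same form of test as the reported
    gender comparisons, chi2 (1, N = 152)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical cell counts (NOT from the study): rows = male/female,
# columns = offence committed yes/no, totalling N = 152.
chi2 = chi_square_2x2(30, 40, 15, 67)
print(f"chi2(1) = {chi2:.2f}")
```

With equal proportions in both rows the statistic is zero, as expected; larger imbalances between rows push it up towards values like those reported.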


Ethnicity is rarely considered in the development of injury prevention programs, despite its known impact on participation in risk behaviour. This study sought to understand engagement in transport-related risk behaviours, patterns of injury and perceptions of risk among early adolescents who self-identify as being from a Pacific Islander background. In total, five high schools throughout Queensland, Australia were recruited, from which 498 Year 9 students (13-14 years) completed questionnaires relating to their perceptions of risk and recent injury experience (specifically, transport-related injuries that were medically treated and those that were not). The transport-related risk behaviours captured in the survey were bicycle use, motorcycle use and passenger safety (riding with a drink driver and riding with a dangerous driver). The results are explored in terms of the prevalence of engagement in risky transport-related behaviour among adolescents of Pacific Islander background compared with others of the same age. The results of this study provide an initial insight into the target participants' perspective of risk in a road safety context as well as their experience of such behaviour and related injuries. This information may benefit future intervention programs specific to adolescents of Pacific Islander background.


Background: Hamstring strain injuries are prevalent in sport and re-injury rates have been high for many years. Whilst much focus has centred on the impact of previous hamstring strain injury on maximal eccentric strength, a high rate of torque development is also of interest, given the important role of the hamstrings during the terminal swing phase of running. The impact of prior strain injury on myoelectrical activity of the hamstrings during tasks requiring high rates of torque development has received little attention. Purpose: To determine whether recreational athletes with a history of unilateral hamstring strain injury, who have returned to training and competition, exhibit lower levels of myoelectrical activity during eccentric contraction, rate of torque development and impulse 30, 50 and 100ms after the onset of myoelectrical activity or torque development in the previously injured limb compared to the uninjured limb. Study design: Case-control study. Methods: Twenty-six recreational athletes were recruited. Of these, 13 athletes had a history of unilateral hamstring strain injury (all confined to biceps femoris long head) and 13 had no history of hamstring strain injury. Following familiarisation, all athletes undertook isokinetic dynamometry testing and surface electromyography assessment of the biceps femoris long head and medial hamstrings during eccentric contractions at -60 and -180°.s-1. Results: In the injured limb of the injured group, compared to the contralateral uninjured limb, rate of torque development and impulse were lower during -60°.s-1 eccentric contractions at 50ms (RTD, injured limb = 312.27 ± 191.78Nm.s-1 vs. uninjured limb = 518.54 ± 172.81Nm.s-1, p=0.008; IMP, injured limb = 0.73 ± 0.30Nm.s vs. uninjured limb = 0.97 ± 0.23Nm.s, p=0.005) and 100ms (RTD, injured limb = 280.03 ± 131.42Nm.s-1 vs. uninjured limb = 460.54 ± 152.94Nm.s-1, p=0.001; IMP, injured limb = 2.15 ± 0.89Nm.s vs. uninjured limb = 3.07 ± 0.63Nm.s, p<0.001) after the onset of contraction. Biceps femoris long head muscle activation was lower at 100ms at both contraction speeds (-60°.s-1, normalised iEMG activity (x1000), injured limb = 26.25 ± 10.11 vs. uninjured limb = 33.57 ± 8.29, p=0.009; -180°.s-1, normalised iEMG activity (x1000), injured limb = 31.16 ± 10.01 vs. uninjured limb = 39.64 ± 8.36, p=0.009). Medial hamstring activation did not differ between limbs in the injured group. Comparisons in the uninjured group showed no significant between-limb differences for any variable. Conclusion: Previously injured hamstrings displayed lower rates of torque development and impulse during slow maximal eccentric contraction compared to the contralateral uninjured limb. Lower myoelectrical activity was confined to the biceps femoris long head. Regardless of whether these deficits are the cause or the result of injury, these findings could have important implications for hamstring strain injury and re-injury. In particular, given the importance of high levels of muscle activity in bringing about specific muscular adaptations, lower levels of myoelectrical activity may limit the adaptive response to rehabilitation interventions, suggesting greater attention be given to neural function of the knee flexors following hamstring strain injury.
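Rate of torque development (RTD, Nm.s-1) and impulse (Nm.s) over a fixed window after contraction onset can be computed from a sampled torque-time trace roughly as follows. This is a sketch only: the synthetic linear torque ramp, the 1 kHz sampling rate and the onset-detection convention (onset at index 0) are assumptions for illustration, not details from the study.

```python
def rtd_and_impulse(torque, dt, window_s):
    """Average rate of torque development (Nm/s) and impulse (Nm.s)
    over `window_s` seconds from contraction onset (index 0).
    `torque` is a list of torque samples (Nm) at spacing `dt` (s)."""
    n = round(window_s / dt)  # round() avoids float-division truncation
    rtd = (torque[n] - torque[0]) / (n * dt)
    # Trapezoidal integration of the torque-time curve gives the impulse.
    impulse = sum((torque[i] + torque[i + 1]) / 2 * dt for i in range(n))
    return rtd, impulse

# Synthetic trace: torque rising linearly at 500 Nm/s, sampled at 1 kHz
dt = 0.001
torque = [500 * i * dt for i in range(201)]
rtd50, imp50 = rtd_and_impulse(torque, dt, 0.05)    # 0-50 ms window
rtd100, imp100 = rtd_and_impulse(torque, dt, 0.10)  # 0-100 ms window
print(rtd50, imp50, rtd100, imp100)
```

For the linear ramp the RTD is the ramp slope (500 Nm/s) in both windows, while the impulse grows with the square of the window length, which is why the reported between-limb impulse gaps widen from the 50 ms to the 100 ms window even when RTD deficits are similar.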


Hamstring strain injuries are amongst the most common and problematic injuries in a wide range of sports that involve high speed running. The comparatively high rate of hamstring injury recurrence is arguably the most concerning aspect of these injuries. A number of modifiable and nonmodifiable risk factors are proposed to predispose athletes to hamstring strains. Potentially, the persistence of risk factors and the development of maladaptations following injury may explain injury recurrence. Here, the role of neuromuscular inhibition following injury is discussed as a potential mechanism for several maladaptations associated with hamstring re-injury. These maladaptations include eccentric hamstring weakness, selective hamstring atrophy and shifts in the knee flexor torque-joint angle relationship. Current evidence indicates that athletes return to competition after hamstring injury having developed maladaptations that predispose them to further injury. When rehabilitating athletes to return to competition following hamstring strain injury, the role of neuromuscular inhibition in re-injury should be considered.


INTRODUCTION: Hamstring strain injuries (HSI) are the predominant non-contact injury in many sports. Eccentric hamstring muscle weakness following intermittent running has been implicated in the aetiology of HSI. This weakness following intermittent running is often greater eccentrically than concentrically; however, the cause of this unique, contraction-mode-specific phenomenon is unknown. AIM: To determine whether this preferential eccentric decline in strength is caused by declines in voluntary hamstring muscle activation. METHODS: Fifteen recreationally active males completed 18 × 20m overground sprints. Maximal strength (concentric and eccentric knee flexor and concentric knee extensor) was determined isokinetically at velocities of ±180°.s-1 and ±60°.s-1, while hamstring muscle activation was assessed using surface electromyography, before and 15 minutes after the running protocol. RESULTS: Overground intermittent running caused greater eccentric (27.2 Nm; 95% CI = 11.2 to 43.3; P=0.0001) than concentric knee flexor weakness (9.3 Nm; 95% CI = -6.7 to 25.3; P=0.6361). Following the overground running, voluntary activation levels of the lateral hamstrings showed a significant decline (0.08; 95% CI = 0.045 to 0.120; P<0.0001). In comparison, medial hamstring activation showed no change following intermittent running. CONCLUSION: Eccentric hamstring strength is decreased significantly following intermittent overground running. Voluntary activation deficits in the biceps femoris muscle are responsible for some portion of this weakness. The implications of this finding are significant because the biceps femoris is the most frequently strained of all the hamstring muscles and because fatigue appears to play an important part in injury occurrence.
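Surface-EMG activation measures like those used above are commonly summarised as integrated EMG (iEMG): the rectified signal integrated over time, often normalised to a maximal reference contraction. A minimal sketch follows; the synthetic 50 Hz sinusoidal burst and the MVC-normalisation convention are assumptions for illustration, not the study's processing pipeline.

```python
import math

def iemg(signal, dt):
    """Integrated EMG: full-wave rectify the signal, then integrate
    over time with the trapezoidal rule. Units: signal units x seconds."""
    rectified = [abs(x) for x in signal]
    return sum((rectified[i] + rectified[i + 1]) / 2 * dt
               for i in range(len(rectified) - 1))

# Synthetic 1 s EMG burst sampled at 1 kHz: a 50 Hz sinusoid, 1 mV amplitude
dt = 0.001
signal = [math.sin(2 * math.pi * 50 * i * dt) for i in range(1001)]

raw = iemg(signal, dt)          # close to the analytic value 2/pi mV.s
mvc_iemg = raw                  # here the same burst stands in for the MVC
normalised = raw / mvc_iemg     # so the normalised value is exactly 1.0
print(f"iEMG = {raw:.4f} mV.s, normalised = {normalised:.2f}")
```

In practice the normalising denominator would come from a separate maximal contraction, so between-limb comparisons (as in the RESULTS above) are made on the normalised values rather than raw microvolt-second integrals.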