593 results for Driving without a license.


Relevance: 20.00%

Abstract:

The repair of bone defects that result from periodontal diseases remains a clinical challenge for periodontal therapy. β-tricalcium phosphate (β-TCP) ceramics are biodegradable inorganic bone substitutes with inorganic components that are similar to those of bone. Demineralized bone matrix (DBM) is an acid-extracted organic matrix derived from bone sources that consists of the collagen and matrix proteins of bone. A few studies have documented the effects of DBM on the proliferation and osteogenic differentiation of human periodontal ligament cells (hPDLCs). The aim of the present study was to investigate the effects of inorganic and organic elements of bone on the proliferation and osteogenic differentiation of hPDLCs using three-dimensional porous β-TCP ceramics and DBM with or without osteogenic inducers. Primary hPDLCs were isolated from human periodontal ligaments. The proliferation of the hPDLCs on the scaffolds in the growth culture medium was examined using a Cell Counting Kit-8 (CCK-8) and scanning electron microscopy (SEM). Alkaline phosphatase (ALP) activity and the osteogenic differentiation of the hPDLCs cultured on the β-TCP ceramics and DBM were examined in both the growth culture medium and osteogenic culture medium. Specific osteogenic differentiation markers were examined using reverse transcription-quantitative polymerase chain reaction (RT-qPCR). SEM images revealed that the cells on the β-TCP were spindle-shaped and much more spread out compared with the cells on the DBM surfaces. There were no significant differences observed in cell proliferation between the β-TCP ceramics and the DBM scaffolds. Compared with the cells that were cultured on β-TCP ceramics, the ALP activity, as well as the Runx2 and osteocalcin (OCN) mRNA levels, in the hPDLCs cultured on DBM were significantly enhanced both in the growth culture medium and the osteogenic culture medium.
The organic elements of bone may exhibit greater osteogenic differentiation effects on hPDLCs than the inorganic elements.
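Relative marker expression from RT-qPCR, as used here for Runx2 and OCN, is commonly quantified with the 2^(−ΔΔCt) method. A minimal sketch with invented Ct values, assuming GAPDH as the reference gene (the abstract does not name one):

```python
def fold_change(ct_target_treat, ct_ref_treat, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^(-ddCt) (Livak) method."""
    d_ct_treat = ct_target_treat - ct_ref_treat   # delta-Ct, treated sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl      # delta-Ct, control sample
    dd_ct = d_ct_treat - d_ct_ctrl                # delta-delta-Ct
    return 2 ** (-dd_ct)

# Hypothetical Ct values: OCN vs. an assumed GAPDH reference, DBM vs. beta-TCP
print(fold_change(22.0, 18.0, 25.0, 18.0))  # 8.0, i.e. eight-fold higher expression
```

Lower Ct means earlier amplification, so a ΔΔCt of −3 corresponds to an eight-fold increase.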

Relevance: 20.00%

Abstract:

Seeking new biomarkers for epithelial ovarian cancer, the fifth most common cause of death from all cancers in women and the leading cause of death from gynaecological malignancies, we performed a meta-analysis of three independent studies and compared the results with regard to clinicopathological parameters. This analysis revealed that GAS6 was highly expressed in ovarian cancer and was therefore selected as our candidate of choice. GAS6 encodes a secreted protein involved in physiological processes including cell proliferation, chemotaxis, and cell survival. We performed immunohistochemistry on various ovarian cancer tissues and found that GAS6 expression was elevated in tumour tissue samples compared to healthy control samples (P < 0.0001). GAS6 expression was also higher in tumours from patients with residual disease compared to those without. Our data identify GAS6 as an independent predictor of poor survival, suggesting GAS6, at both the mRNA and the protein level, as a potential biomarker for ovarian cancer. In clinical practice, the staining of a tumour biopsy for GAS6 may be useful to assess cancer prognosis and/or to monitor disease progression.

Relevance: 20.00%

Abstract:

Introduction: In this study, we report on initial efforts to discover putative biomarkers for the differential diagnosis of systemic inflammatory response syndrome (SIRS) versus sepsis, and of different stages of sepsis. In addition, we investigated whether there are proteins that can discriminate between patients who survived sepsis and those who did not. Materials and Methods: Our study group consisted of 16 patients, of whom 6 died and 10 survived. We measured 28 plasma proteins daily throughout each patient's stay in the ICU. Results: We observed that metalloproteinases and sE-selectin play a role in the distinction between SIRS and sepsis, and that IL-1, IP-10, sTNF-R2 and sFas appear to be indicative of the progression from sepsis to septic shock. A combined measurement of MMP-3, -10, IL-1, IP-10, sIL-2R, sFas, sTNF-R1, sRAGE, GM-CSF, IL-1 and Eotaxin allows for a good separation of patients who survived from those who died (mortality prediction with a sensitivity of 79% and specificity of 86%). Correlation analysis suggests a novel interaction between IL-1a and IP-10. Conclusion: The marker panel is ready to be verified in a validation study with or without therapeutic intervention.
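The reported mortality-prediction performance (sensitivity 79%, specificity 86%) reduces to standard confusion-matrix ratios. A minimal sketch with illustrative counts (not the study's actual classifications):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts only, sized like the cohort (6 non-survivors, 10 survivors)
sens, spec = sens_spec(tp=5, fn=1, tn=9, fp=1)
print(round(sens, 2), round(spec, 2))  # 0.83 0.9
```

With only 16 patients, each misclassification shifts these ratios substantially, which is one reason the authors call for a larger validation study.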

Relevance: 20.00%

Abstract:

Background Supine imaging modalities provide valuable 3D information on scoliotic anatomy, but the altered spine geometry between the supine and standing positions affects the Cobb angle measurement. Previous studies report a mean 7°-10° Cobb angle increase from supine to standing, but none have reported the effect of endplate pre-selection or whether other parameters affect this Cobb angle difference. Methods Cobb angles from existing coronal radiographs were compared to those on existing low-dose CT scans taken within three months of the reference radiograph for a group of females with adolescent idiopathic scoliosis. Reformatted coronal CT images were used to measure supine Cobb angles with and without endplate pre-selection (endplates selected from the radiographs) by two observers on three separate occasions. Inter- and intra-observer measurement variability was assessed. Multiple linear regression was used to investigate whether there was a relationship between the supine to standing Cobb angle change and eight variables: patient age, mass, standing Cobb angle, Risser sign, ligament laxity, Lenke type, fulcrum flexibility and time delay between radiograph and CT scan. Results Fifty-two patients with right thoracic Lenke Type 1 curves and mean age 14.6 years (SD 1.8) were included. The mean Cobb angle on standing radiographs was 51.9° (SD 6.7). The mean Cobb angle on supine CT images without pre-selection of endplates was 41.1° (SD 6.4). The mean Cobb angle on supine CT images with endplate pre-selection was 40.5° (SD 6.6). Pre-selecting vertebral endplates increased the mean Cobb change by 0.6° (SD 2.3, range −9° to 6°). When free to do so, observers chose different levels for the end vertebrae in 39% of cases. Multiple linear regression revealed a statistically significant relationship between supine to standing Cobb change and fulcrum flexibility (p = 0.001), age (p = 0.027) and standing Cobb angle (p < 0.001).
The 95% confidence intervals for intra-observer and inter-observer measurement variability were 3.1° and 3.6°, respectively. Conclusions Pre-selecting vertebral endplates causes only minor changes to the mean supine to standing Cobb change. There is a statistically significant relationship between the supine to standing Cobb change and fulcrum flexibility, suggesting that this difference could serve as an alternative measure of spinal flexibility.
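Measurement-variability limits of the kind reported above are commonly computed as ±1.96 times the standard deviation of repeated-measurement differences, assuming normally distributed error. A pure-Python sketch with made-up difference data (not the study's measurements):

```python
import statistics

def variability_95(differences):
    """Half-width of a 95% interval for repeated-measurement differences,
    assuming normally distributed measurement error: 1.96 * sample SD."""
    return 1.96 * statistics.stdev(differences)

# Hypothetical repeat-measurement Cobb angle differences (degrees), one observer
diffs = [-2.0, 1.5, 0.5, -1.0, 2.0, -0.5, 1.0, -1.5]
print(round(variability_95(diffs), 1))  # 2.9
```

A half-width around 3°, as in the study, means two Cobb readings must differ by more than that before the change can be attributed to the patient rather than the measurement.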

Relevance: 20.00%

Abstract:

Deoxyribonucleic acid (DNA) extraction has considerably evolved since it was initially performed back in 1869. It is the first step required for many of the available downstream applications used in the field of molecular biology. Whole blood samples are one of the main sources used to obtain DNA, and there are many different protocols available to perform nucleic acid extraction on such samples. These methods vary from very basic manual protocols to more sophisticated methods included in automated DNA extraction protocols. Based on the wide range of available options, it would be ideal to determine the ones that perform best in terms of cost-effectiveness and time efficiency. We have reviewed DNA extraction history and the most commonly used methods for DNA extraction from whole blood samples, highlighting their individual advantages and disadvantages. We also searched current scientific literature to find studies comparing different nucleic acid extraction methods, to determine the best available choice. Based on our research, we have determined that there is not enough scientific evidence to support one particular DNA extraction method from whole blood samples. Choosing a suitable method is still a process that requires consideration of many different factors, and more research is needed to validate choices made at facilities around the world.

Relevance: 20.00%

Abstract:

Introduction Australia is contributing to the global problem of antimicrobial resistance, with one of the highest rates of antibiotic use amongst OECD countries. Data from the Australian primary healthcare sector suggest that unnecessary antibiotics are prescribed for conditions that will resolve without them. If left unchecked, this will result in more resistant micro-organisms, against which antibiotics will be useless. There is a lack of understanding about what influences decisions to use antibiotics – what factors influence general practitioners (GPs) to prescribe antibiotics, consumers to seek antibiotics, and pharmacists to fill old antibiotic prescriptions? It is also not clear how these individuals trade off the possible benefits that antibiotics may provide in the immediate/short term against the longer term societal risk of antimicrobial resistance. Method This project will investigate (a) what factors drive decisions to use antibiotics for GPs, pharmacists and consumers, and (b) how these individuals discount the future. Factors will be gleaned from the published literature and from a qualitative phase using semi-structured interviews, to inform the development of Discrete Choice Experiments (DCEs). Three DCEs will be constructed – one for each group of interest – to allow investigation of which factors are more important in influencing (a) GPs to prescribe antibiotics, (b) consumers to seek antibiotics, and (c) pharmacists to fill legally valid but old or repeat prescriptions of antibiotics. Regression analysis will be conducted to understand the relative importance of these factors. A Time Trade Off exercise will be developed to investigate how these individuals discount the future, and whether GPs and pharmacists discount the future to the same extent as consumers.
Expected Results Findings from the DCEs will provide insight into which factors are more important in driving decision making in antibiotic use for GPs, pharmacists and consumers. Findings from the Time Trade Off exercise will show what individuals are willing to trade to preserve the effectiveness of antibiotics. Conclusion The emergence of antibiotic resistance is inevitable. This research will expand on what is currently known about influencing desired behaviour change in antibiotic use in the fight against antibiotic resistance. Real World Implications Research findings will contribute to existing national programs to bring about a reduction in the inappropriate use of antibiotics in Australia. Specifically, they will inform (1) how key messages and public health campaigns are crafted to increase health literacy, and (2) clinical education and empowerment of GPs and pharmacists to play a more responsive role as stewards of antibiotic use in the community.
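The "discounting the future" question is often modelled with a standard exponential discount function; a sketch under that assumption, with invented rates and values:

```python
def discounted_value(value, annual_rate, years):
    """Present value under exponential discounting: V / (1 + r)^t."""
    return value / (1 + annual_rate) ** years

# A societal benefit worth 100 (arbitrary units) of preserved antibiotic
# efficacy, realised 10 years away, at two hypothetical discount rates
print(round(discounted_value(100, 0.03, 10), 1))  # 74.4
print(round(discounted_value(100, 0.10, 10), 1))  # 38.6
```

The gap between the two outputs illustrates the study's core question: an individual who discounts heavily values the distant resistance risk at barely a third of its face value, making immediate antibiotic use look comparatively attractive.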

Relevance: 20.00%

Abstract:

Impaired driver alertness increases the likelihood of drivers making mistakes and reacting too late to unexpected events while driving. This is a particular concern on monotonous roads, where a driver's attention can decrease rapidly. While effective countermeasures do not currently exist, the development of in-vehicle sensors opens avenues for monitoring driving behavior in real time. The aim of this study is to predict drivers' level of alertness through surrogate measures collected from in-vehicle sensors. Electroencephalographic activity is used as a reference to evaluate alertness. Based on a sample of 25 drivers, data were collected in a driving simulator instrumented with an eye tracking system, a heart rate monitor and an electrodermal activity device. Various classification models were tested, from linear regression to Bayesian and data mining techniques. Results indicated that neural networks were the most efficient model in detecting lapses in alertness. Findings also show that reduced alertness can be predicted up to 5 minutes in advance with 90% accuracy, using surrogate measures such as time to line crossing, blink frequency and skin conductance level. Such a method could be used to warn drivers of their alertness level through an in-vehicle device that monitors drivers' behavior on highways in real time.
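The study's neural-network model is not specified in the abstract; purely to illustrate how the three named surrogate measures could feed a detector, here is a hypothetical majority-vote threshold rule (all thresholds invented, not the study's model):

```python
def low_alertness(time_to_line_crossing_s, blink_rate_hz, skin_conductance_uS,
                  ttlc_thresh=1.5, blink_thresh=0.5, scl_thresh=2.0):
    """Flag reduced alertness when at least two of three surrogate measures
    cross hypothetical thresholds: short time to line crossing, frequent
    blinking, and low skin conductance."""
    flags = [time_to_line_crossing_s < ttlc_thresh,
             blink_rate_hz > blink_thresh,
             skin_conductance_uS < scl_thresh]
    return sum(flags) >= 2

print(low_alertness(1.2, 0.7, 3.0))  # True: two of three indicators fire
print(low_alertness(2.5, 0.3, 3.0))  # False: no indicator fires
```

A trained classifier replaces these hand-set thresholds with boundaries learned against the EEG reference, which is what lets the reported model reach 90% accuracy five minutes ahead.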

Relevance: 20.00%

Abstract:

Driver training is one of the interventions aimed at mitigating the number of crashes that involve novice drivers. Our failure to understand what is really important for learners, in terms of risky driving, is one of the many drawbacks preventing us from building better training programs. Currently, there is a need to develop and evaluate Advanced Driving Assistance Systems that can comprehensively assess driving competencies. The aim of this paper is to present a novel Intelligent Driver Training System (IDTS) that analyses crash risks for a given driving situation, providing avenues for the improvement and personalisation of driver training programs. The analysis takes into account numerous variables acquired synchronously from the Driver, the Vehicle and the Environment (DVE). The system then segments out the manoeuvres within a drive. This paper further presents the use of fuzzy set theory to develop the safety inference rules for each manoeuvre executed during the drive. Finally, it presents a framework and associated prototype that can be used to view and assess complex driving manoeuvres and to provide a comprehensive analysis of the drive as feedback to novice drivers.
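The fuzzy safety-inference idea can be sketched with triangular membership functions and a Mamdani-style minimum for AND; the memberships and the single rule below are illustrative only, not the IDTS rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 when x = b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def manoeuvre_risk(speed_kmh, headway_s):
    """Toy fuzzy rule: risk is high IF speed is 'fast' AND headway is 'short'.
    Membership ranges are invented for illustration."""
    fast = tri(speed_kmh, 80, 110, 140)    # 'fast speed' membership
    short = tri(headway_s, 0.0, 0.5, 2.0)  # 'short headway' membership
    return min(fast, short)                # Mamdani-style AND via min

print(round(manoeuvre_risk(100, 1.0), 2))  # 0.67
```

In a full system, many such rules would fire in parallel for each segmented manoeuvre and their outputs would be aggregated and defuzzified into a single safety score used for feedback.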

Relevance: 20.00%

Abstract:

Background Multi attribute utility instruments (MAUIs) are preference-based measures that comprise a health state classification system (HSCS) and a scoring algorithm that assigns a utility value to each health state in the HSCS. When developing a MAUI from a health-related quality of life (HRQOL) questionnaire, a HSCS must first be derived. This typically involves selecting a subset of domains and items, because HRQOL questionnaires typically have too many items to be amenable to the valuation task required to develop the scoring algorithm for a MAUI. Currently, exploratory factor analysis (EFA) followed by Rasch analysis is recommended for deriving a MAUI from a HRQOL measure. Aim To determine whether confirmatory factor analysis (CFA) is more appropriate and efficient than EFA for deriving a HSCS from the European Organisation for Research and Treatment of Cancer's core HRQOL questionnaire, the Quality of Life Questionnaire (QLQ-C30), given its well-established domain structure. Methods QLQ-C30 (Version 3) data were collected from 356 patients receiving palliative radiotherapy for recurrent/metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter informed by the established QLQ-C30 structure and the views of both patients and clinicians on which items are most relevant. Dimensions determined by EFA or CFA were then subjected to Rasch analysis. Results CFA results generally supported the proposed QLQ-C30 structure (comparative fit index = 0.99, Tucker–Lewis index = 0.99, root mean square error of approximation = 0.04). EFA revealed fewer factors, and some items cross-loaded on multiple factors. Further assessment of dimensionality with Rasch analysis allowed better alignment of the EFA dimensions with those detected by CFA. Conclusion CFA was more appropriate and efficient than EFA in producing clinically interpretable results for the HSCS for a proposed new cancer-specific MAUI.
Our findings suggest that CFA should be recommended generally when deriving a preference-based measure from a HRQOL measure that has an established domain structure.
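Fit indices such as the RMSEA reported above have closed-form expressions in terms of the model chi-square, its degrees of freedom and the sample size. A sketch using one common formulation; the chi-square and df values are hypothetical, with only n = 356 taken from the abstract:

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation from a model chi-square.
    One common form: sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Hypothetical model fit: chi-square 450 on 300 df with the study's n = 356
print(round(rmsea(450.0, 300, 356), 3))  # 0.038, near the reported 0.04
```

Values below roughly 0.05 are conventionally read as close fit, which is why an RMSEA of 0.04 supports the proposed QLQ-C30 structure.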

Relevance: 20.00%

Abstract:

Aggressive driving has been associated with engagement in other risky driving behaviours, such as speeding, while drivers using their mobile phones have an increased crash risk despite a tendency to reduce their speed. Research has amassed separately for mobile phone use and aggressive driving among younger drivers; however, little is known about the extent to which these behaviours may function independently and in combination to influence speed selection behaviour. The main aim of the current study was to investigate the effect of driver aggression (measured by the Driving Anger Expression Inventory) and mobile phone use on speed selection by young drivers. The CARRS-Q advanced driving simulator was used to test the speed selection of drivers aged 18 to 26 years (N = 32) in a suburban (60 km/h zone) driving context. A 2 (level of driving anger expression: low, high) × 3 (mobile phone use condition: baseline, hands-free, hand-held) mixed factorial ANOVA was conducted with speed selection as the dependent variable. Results revealed a significant main effect for mobile phone use condition, such that speed selection was lowest for the hand-held condition and highest for the baseline condition. Speed selection, however, was not significantly different across the levels of driving anger expression; nor was there a significant interaction effect between mobile phone use and driving anger expression. As young drivers are over-represented in road crash statistics, future research should further investigate the combined impact of driver aggression and mobile phone use on speed selection.

Relevance: 20.00%

Abstract:

Conventional voltage-driven gate drive circuits utilise a resistor to control the switching speed of power MOSFETs. The gate resistance is adjusted to provide a controlled rate of change of load current and voltage. The cascode gate drive configuration has been proposed as an alternative to the conventional resistor-fed gate drive circuit. While the cascode drive is broadly understood in the literature, the switching characteristics of this topology are not well documented. This paper explores, through both simulation and experimentation, the gate drive parameter space of the cascode gate drive configuration and provides a comparison with the switching characteristics of the conventional gate drive.
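As context for the resistor-fed drive, a first-order estimate of switching speed follows the R·Ciss time constant of the gate circuit. A sketch with hypothetical device values; this simplification deliberately ignores the Miller plateau, which dominates real switching transitions:

```python
import math

def gate_rise_time(r_gate_ohm, c_iss_f, v_drive, v_threshold):
    """First-order RC estimate of the time to charge the gate to Vth:
    t = R * Ciss * ln(Vdrive / (Vdrive - Vth)). Ignores the Miller plateau."""
    return r_gate_ohm * c_iss_f * math.log(v_drive / (v_drive - v_threshold))

# Hypothetical MOSFET: 10 ohm gate resistor, 2 nF input capacitance,
# 12 V gate drive, 4 V threshold
t = gate_rise_time(10.0, 2e-9, 12.0, 4.0)
print(f"{t * 1e9:.1f} ns")  # 8.1 ns
```

Doubling the gate resistance doubles this estimate, which is exactly the dv/dt and di/dt control knob the abstract describes, and the trade-off the cascode configuration tries to improve on.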

Relevance: 20.00%

Abstract:

Overarching Research Questions Are ACT motorists aware of roadside saliva-based drug testing operations? What is the perceived deterrent impact of the operations? What factors are predictive of future intentions to drug drive? What are the differences between key subgroups?

Relevance: 20.00%

Abstract:

Purpose We designed a visual field test focused on the field utilized while driving to examine associations between field impairment and motor vehicle collision involvement in 2,000 drivers ≥70 years old. Methods The "driving visual field test" involved measuring light sensitivity for 20 targets in each eye, extending 15° superiorly, 30° inferiorly, 60° temporally and 30° nasally. The target locations were selected on the basis that they fell within the field region utilized when viewing through the windshield of a vehicle or viewing the dashboard while driving. Monocular fields were combined into a binocular field based on the more sensitive point from each eye. Severe impairment in the overall field or a region was defined as average sensitivity in the lowest quartile of sensitivity. At-fault collision involvement for five years prior to enrollment was obtained from state records. Poisson regression was used to calculate crude and adjusted rate ratios examining the association between field impairment and at-fault collision involvement. Results Drivers with severe binocular field impairment in the overall driving visual field had a 40% increased rate of at-fault collision involvement (RR 1.40, 95%CI 1.07-1.83). Impairment in the lower and left fields was associated with elevated collision rates (RR 1.40 95%CI 1.07-1.82 and RR 1.49, 95%CI 1.15-1.92, respectively), whereas impairment in the upper and right field regions was not. Conclusions Results suggest that older drivers with severe impairment in the lower or left region of the driving visual field are more likely to have a history of at-fault collision involvement.
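Poisson-regression rate ratios and their Wald confidence intervals, as reported above, come directly from the fitted coefficient and its standard error. A sketch in which the standard error is back-calculated for illustration so the output roughly reproduces the reported overall-field result (RR 1.40, 95% CI 1.07–1.83):

```python
import math

def rate_ratio_ci(beta, se, z=1.96):
    """Rate ratio and Wald 95% CI from a Poisson regression coefficient:
    RR = exp(beta), CI = exp(beta +/- z * SE)."""
    rr = math.exp(beta)
    return rr, math.exp(beta - z * se), math.exp(beta + z * se)

# Coefficient chosen to match RR 1.40; SE of 0.137 is back-calculated
# for illustration, not taken from the study
rr, lo, hi = rate_ratio_ci(beta=math.log(1.40), se=0.137)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # 1.4 1.07 1.83
```

Because the interval is symmetric on the log scale, the CI is asymmetric around the rate ratio itself, which is why published intervals like 1.07–1.83 are not centred on 1.40.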

Relevance: 20.00%

Abstract:

Purpose To investigate the effect of different levels of refractive blur on real-world driving performance measured under day and nighttime conditions. Methods Participants included 12 visually normal, young adults (mean age = 25.8 ± 5.2 years) who drove an instrumented research vehicle around a 4 km closed road circuit with three different levels of binocular spherical refractive blur (+0.50 diopter sphere [DS], +1.00 DS, +2.00 DS) compared with a baseline condition. The subjects wore optimal spherocylinder correction and the additional blur lenses were mounted in modified full-field goggles; the order of testing of the blur conditions was randomized. Driving performance was assessed in two different sessions under day and nighttime conditions and included measures of road signs recognized, hazard detection and avoidance, gap detection, lane-keeping, sign recognition distance, speed, and time to complete the course. Results Refractive blur and time of day had significant effects on driving performance (P < 0.05), where increasing blur and nighttime driving reduced performance on all driving tasks except gap judgment and lane keeping. There was also a significant interaction between blur and time of day (P < 0.05), such that the effects of blur were exacerbated under nighttime driving conditions; performance differences were evident even for +0.50 DS blur relative to baseline for some measures. Conclusions The effects of blur were greatest under nighttime conditions, even for levels of binocular refractive blur as low as +0.50 DS. These results emphasize the importance of accurate and up-to-date refractive correction of even low levels of refractive error when driving at night.