188 results for Hazard Rate
Abstract:
This study used a video-based hazard perception dual task to compare the hazard perception skills of young drivers with those of middle-aged, more experienced drivers and to determine whether these skills can be improved with video-based road commentary training. The primary task required participants to detect and verbally identify immediate hazards in video-based traffic scenarios while concurrently performing a secondary tracking task simulating the steering of real driving. The results showed that the young drivers perceived fewer immediate hazards (mean = 75.2%, n = 24, 19 females) than the more experienced drivers (mean = 87.5%, n = 8, all females) and had longer hazard perception times, but performed better on the secondary tracking task. After the road commentary training, the mean percentage of hazards detected and identified by the young drivers improved to the level of the experienced drivers and was significantly higher than that of an age- and driving-experience-matched control group. The results are discussed in the context of psychological theories of hazard perception and in relation to road commentary as an evidence-based training intervention that appears to improve many aspects of unsafe driving behaviour in young drivers.
Abstract:
Aims: This paper describes the development of a risk adjustment (RA) model predictive of individual lesion treatment failure in percutaneous coronary interventions (PCI) for use in a quality monitoring and improvement program. Methods and results: Prospectively collected data for 3972 consecutive revascularisation procedures (5601 lesions) performed between January 2003 and September 2011 were studied. Data on procedures to September 2009 (n = 3100) were used to identify factors predictive of lesion treatment failure. Factors identified included lesion risk class (p < 0.001), occlusion type (p < 0.001), patient age (p = 0.001), vessel system (p < 0.04), vessel diameter (p < 0.001), unstable angina (p = 0.003) and presence of major cardiac risk factors (p = 0.01). A Bayesian RA model was built using these factors, and its predictive performance was tested on the remaining procedures (area under the receiver operating characteristic curve: 0.765; Hosmer–Lemeshow p value: 0.11). Cumulative sum, exponentially weighted moving average and funnel plots were constructed using the RA model and subjectively evaluated. Conclusion: An RA model was developed and applied to statistical process control (SPC) monitoring for lesion failure in a PCI database. If linked to appropriate quality improvement governance response protocols, SPC using this RA tool might improve quality control and risk management by identifying variation in performance based on a comparison of observed and expected outcomes.
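The abstract does not give the chart construction details; the following is a minimal sketch of a generic risk-adjusted CUSUM of the kind referred to above, driven by per-lesion failure probabilities from an RA model. The doubling-of-odds alternative (odds_ratio = 2), the signal threshold and the example data are illustrative assumptions, not values from the study.

# Minimal sketch: risk-adjusted CUSUM for lesion treatment failure.
# Parameters and data below are illustrative, not the study's values.
import math

def ra_cusum(predicted_probs, outcomes, odds_ratio=2.0, threshold=4.5):
    """predicted_probs: expected failure probability per lesion (from the RA model).
    outcomes: 1 = treatment failure, 0 = success.
    Returns the CUSUM path and the indices at which the chart signals."""
    s, path, signals = 0.0, [], []
    for i, (p, y) in enumerate(zip(predicted_probs, outcomes)):
        # Log-likelihood ratio weight: H1 multiplies the failure odds by odds_ratio.
        denom = 1.0 - p + odds_ratio * p
        w = math.log(odds_ratio / denom) if y == 1 else math.log(1.0 / denom)
        s = max(0.0, s + w)          # reset-at-zero upper CUSUM
        path.append(s)
        if s > threshold:
            signals.append(i)        # flag for governance review
    return path, signals

# Example with made-up data: five lesions with expected risks of 3-10%.
probs = [0.05, 0.08, 0.03, 0.10, 0.05]
outs  = [0,    1,    0,    1,    0]
print(ra_cusum(probs, outs))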
Abstract:
Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which delivers centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control ambiguity resolution quality. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational test method is to determine the criterion according to the underlying model and the user requirement. Missed detection of incorrect integers leads to a hazardous result, which should be strictly controlled; in ambiguity resolution the missed-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied to the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, a criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking it up in the table. This method has been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis test theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the fixed failure rate ratio test threshold are discussed on the basis of extensive data simulation. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method when a proper stochastic model is used.
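For orientation, the ratio test compares how well the best and second-best integer candidates fit the float ambiguity solution under the float covariance matrix; a minimal sketch follows. The threshold value stands in for one taken from a fixed-failure-rate look-up table, and the numbers used are placeholders, not the paper's criteria table.

# Minimal sketch of the ratio test for ambiguity validation.
import numpy as np

def quadratic_form(a_float, a_int, Q):
    """Squared distance ||a_float - a_int||^2 weighted by inv(Q),
    where Q is the float ambiguity covariance matrix."""
    d = a_float - a_int
    return float(d @ np.linalg.solve(Q, d))

def ratio_test(a_float, best, second_best, Q, threshold):
    """Accept the fixed solution only if the second-best candidate fits
    sufficiently worse than the best one (ratio >= threshold)."""
    r1 = quadratic_form(a_float, best, Q)
    r2 = quadratic_form(a_float, second_best, Q)
    ratio = r2 / r1
    return ratio >= threshold, ratio

# Illustrative use with a 2-D ambiguity vector and a placeholder threshold.
a_hat = np.array([3.2, -1.9])
Q = np.array([[0.04, 0.01], [0.01, 0.03]])
ok, ratio = ratio_test(a_hat, np.array([3.0, -2.0]), np.array([4.0, -2.0]), Q, threshold=2.5)
print(ok, round(ratio, 2))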
Abstract:
To date, the formation of deposits on heat exchanger surfaces is the least understood problem in the design of heat exchangers for processing industries. Dr East has related the structure of the deposits to solution composition and has developed predictive models for composite fouling of calcium oxalate and silica in sugar factory evaporators.
Abstract:
Three native freshwater crayfish Cherax species are farmed in Australia: redclaw (Cherax quadricarinatus), marron (C. tenuimanus) and yabby (C. destructor). A lack of appropriate data on the specific nutrient requirements of each species, however, has constrained the development of species-specific formulated diets, and the current use of over-formulated feeds or expensive marine shrimp feeds limits profitability. A number of studies of nutritional requirements in redclaw have focused on replacing expensive fish meal in formulated feeds with less expensive, non-protein substitutes, including plant-based ingredients. Confirmation that freshwater crayfish possess endogenous cellulase genes suggests a potential ability to utilize complex carbohydrates such as cellulose as dietary nutrient sources. To date, studies have been limited to C. quadricarinatus and C. destructor, and no studies have compared the relative ability of each species to utilize soluble cellulose in their diets. Individual feeding trials of late juveniles of each species were conducted separately in an automated recirculating culture system over 12-week cycles. Animals were fed either a test diet (TD) that contained 20% soluble cellulose or a reference diet (RD) substituted with the same amount of corn starch. Water temperature, conductivity and pH were maintained at constant, optimal levels for each species. Animals were fed at 3% of their body weight twice daily, and wet body weight was recorded bi-weekly. At the end of the experiment, all animals were harvested and measured, and midgut gland extracts were assayed for alpha-amylase, total protease and cellulase activity levels. After the trial period, redclaw fed the RD showed a significantly higher (p<0.05) specific growth rate (SGR) than animals fed the TD, while the SGRs of marron and yabby fed the two diets were not significantly different. Cellulase expression levels in redclaw were not significantly different between diets. Marron and yabby showed significantly higher cellulase activity when fed the RD. Amylase and protease activity in all three species were significantly higher in animals fed the RD (Table 1). These results indicate that all three species can utilize starch better than soluble dietary cellulose, and that inclusion of 20% soluble cellulose in the diet does not appear to have any significant negative effect on growth rate; survival, however, was affected in C. quadricarinatus but not in C. tenuimanus or C. destructor.
Abstract:
The current study evaluated the effect of soluble dietary cellulose on growth, survival and digestive enzyme activity in three endemic, Australian freshwater crayfish species (redclaw: Cherax quadricarinatus, marron: C. tenuimanus, yabby: C. destructor). Separate individual feeding trials were conducted for late-stage juveniles from each species in an automated recirculating freshwater culture system. Animals were fed either a test diet (TD) that contained 20% soluble cellulose or a reference diet (RD) substituted with the same amount of corn starch, over a 12 week period. Redclaw fed with RD showed significantly higher (p<0.05) specific growth rates (SGR) compared with animals fed the TD, while SGR of marron and yabby fed the two diets were not significantly different. Expressed cellulase activity levels in redclaw were not significantly different between diets. Marron and yabby showed significantly higher cellulase activity when fed the RD (p<0.05). Amylase and protease activity in all three species were significantly higher in the animals fed with RD (p<0.05). These results indicate that test animals of all three species appear to utilize starch more efficiently than soluble dietary cellulose. The inclusion of 20% soluble cellulose in diets did not appear, however, to have a significant negative effect on growth rates.
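For reference, specific growth rate is conventionally calculated from initial and final wet weights over the trial length; the abstracts do not state the exact formulation used in these trials, but the standard form is

\mathrm{SGR}\ (\%\ \mathrm{day}^{-1}) = \frac{\ln W_{\mathrm{final}} - \ln W_{\mathrm{initial}}}{t} \times 100,

where W is wet body weight and t is the trial duration in days.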
Abstract:
The problem of estimating pseudobearing rate information of an airborne target based on measurements from a vision sensor is considered. Novel image speed and heading angle estimators are presented that exploit image morphology, hidden Markov model (HMM) filtering, and relative entropy rate (RER) concepts to allow pseudobearing rate information to be determined before (or whilst) the target track is being estimated from vision information.
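The abstract gives no algorithmic detail, but the HMM filtering it refers to follows the standard discrete-state predict/update recursion sketched below. The state grid, transition matrix and measurement likelihoods here are illustrative assumptions, not the paper's image-based models.

# Generic discrete-state HMM filter step (predict + update + normalise).
import numpy as np

def hmm_filter_step(prior, transition, likelihood):
    """One HMM filtering step.
    prior: (N,) probability over discrete states at time k-1.
    transition: (N, N) matrix, transition[i, j] = P(x_k = j | x_{k-1} = i).
    likelihood: (N,) P(measurement_k | x_k = j) for each state j.
    Returns the normalised posterior over states at time k."""
    predicted = prior @ transition           # Chapman-Kolmogorov prediction
    unnormalised = predicted * likelihood    # measurement update
    return unnormalised / unnormalised.sum()

# Tiny example: 3 candidate image-speed states, a sticky transition model
# and a measurement that favours the middle state.
prior = np.array([1/3, 1/3, 1/3])
A = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
lik = np.array([0.1, 0.7, 0.2])
print(hmm_filter_step(prior, A, lik))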
Abstract:
Introduction: Within the context of road safety it is important that workload (the portion of a driver’s resources expended to perform a task) remains at a manageable level, preventing overload and consequent performance decrements. Motorcyclists are over-represented in crash statistics where the vehicle operator has a positive, low blood alcohol concentration (BAC) (e.g., 0.05%). The NASA task load index (NASA-TLX) comprises sub-scales that purportedly assess different aspects of subjective workload. It was hypothesized that, compared with a zero-BAC condition, low BACs would be associated with increases in workload ratings and decrements in riding performance. Method: Forty participants (20 novice, 20 experienced) completed simulated motorcycle rides in urban and rural scenarios under low-dose BAC conditions (0.00%, 0.02%, 0.05% BAC) while completing a safety-relevant peripheral detection task (PDT). Six sub-scales of the NASA-TLX were completed after each ride. Riding performance was assessed using the standard deviation of lateral position (SDLP). Hazard perception was assessed by response time to the PDT. Results: Riding performance and hazard perception were affected by alcohol. There was a significant increase in SDLP in the urban scenario and in PDT reaction time in the rural scenario under 0.05% BAC compared with 0.00% BAC. Overall NASA-TLX score increased at 0.02% and 0.05% BAC in the urban environment only, with a trend for novices to rate workload higher than experienced riders. There was a significant main effect of sub-scale on workload ratings in both the urban and rural scenarios. Discussion: A BAC of 0.05% was associated with decrements in riding performance in the urban environment, decrements in hazard perception in the rural environment, and increases in overall ratings of subjective workload in the urban environment. The workload sub-scales of the NASA-TLX appear to measure distinct aspects of motorcycle riding-related workload. Issues of workload and alcohol-impaired riding performance are discussed.
Abstract:
In this paper we explore the relationship between monthly random breath testing (RBT) rates (per 1000 licensed drivers) and alcohol-related traffic crash (ARTC) rates over time across two Australian states: Queensland and Western Australia. We analyse the RBT, ARTC and licensed driver rates across 12 years; however, due to administrative restrictions, we model ARTC rates against RBT rates for the period July 2004 to June 2009. The Queensland data reveal that the monthly ARTC rate is almost flat over the five-year period, with an average of 5.5 ARTCs per 100,000 licensed drivers observed across the study period. For the same period, the monthly rate of RBTs per 1000 licensed drivers decreases across the study, although the analysis reveals no significant variation in the data. The comparison between Western Australia and Queensland shows that Queensland's ARTC monthly percent change (MPC) is 0.014, compared with an MPC of 0.47 for Western Australia. While Queensland maintains a relatively flat ARTC rate, the ARTC rate in Western Australia is increasing. Our analysis reveals an inverse relationship between ARTC and RBT rates: for every 10% increase in the ratio of RBTs to licensed drivers there is a 0.15 decrease in the rate of ARTCs per 100,000 licensed drivers. Moreover, in Western Australia, if the 2011 ratio of 1:2 (RBTs to annual number of licensed drivers) were to double to a ratio of 1:1, we estimate the number of monthly ARTCs would fall by approximately 15. Based on these findings we believe that as the number of RBTs conducted increases, the number of drivers willing to risk being detected for drink driving decreases, because the perceived risk of detection is greater. This in turn results in the number of ARTCs diminishing. The results of this study provide an important evidence base for policy decisions on RBT operations.
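As a back-of-envelope check on the reported association, the sketch below converts the stated coefficient into an expected change in monthly crash counts. It assumes the "10% increase" refers to a 10-percentage-point rise in the RBT-to-licensed-driver ratio, and the licensed-driver count is an illustrative assumption, not a figure from the paper.

# Back-of-envelope application of the reported association: a 0.15 drop in
# the monthly ARTC rate (per 100,000 licensed drivers) for every assumed
# 10-percentage-point rise in the RBT-to-licensed-driver ratio.
def predicted_artc_change(ratio_increase_pct_points, licensed_drivers):
    rate_change = -0.15 * (ratio_increase_pct_points / 10.0)   # per 100,000 drivers
    return rate_change * licensed_drivers / 100_000            # expected monthly ARTCs

# Moving from a 1:2 ratio (50%) to 1:1 (100%) is a 50-point increase;
# the driver count below is illustrative only.
print(predicted_artc_change(50, licensed_drivers=1_600_000))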
Abstract:
Purpose: The objectives of this study were to examine the effect of 4-week moderate- and high-intensity interval training (MIIT and HIIT) on fat oxidation and on the responses of blood lactate (BLa) and rating of perceived exertion (RPE). Methods: Ten overweight/obese men (age = 29 ± 3.7 years, BMI = 30.7 ± 3.4 kg/m²) participated in a cross-over study of 4-week MIIT and HIIT training. The MIIT training sessions consisted of 5-min cycling stages at mechanical workloads 20% above and 20% below 45% VO2peak. The HIIT sessions consisted of intervals of 30-s work at 90% VO2peak and 30-s rest. Pre- and post-training assessments included VO2max using a graded exercise test (GXT) and fat oxidation using a 45-min constant-load test at 45% VO2max. BLa and RPE were also measured during the constant-load exercise test. Results: There were no significant changes in body composition with either intervention. There were significant increases in fat oxidation after MIIT and HIIT (p ≤ 0.01), with no effect of intensity. BLa during the constant-load exercise test significantly decreased after MIIT and HIIT (p ≤ 0.01), and the difference between MIIT and HIIT was not significant (p = 0.09). RPE decreased significantly more after HIIT than after MIIT (p ≤ 0.05). Conclusion: Interval training can increase fat oxidation with no effect of exercise intensity, while BLa and RPE decreased after HIIT to a greater extent than after MIIT.
Abstract:
Recent developments in wearable ECG technology have seen renewed interest in the use of heart rate variability (HRV) feedback for stress management, yet little is known about the efficacy of such interventions. Positive reappraisal is an emotion regulation strategy that involves changing the way a situation is construed in order to decrease its emotional impact. We sought to test the effectiveness of an intervention that used feedback on HRV data to prompt positive reappraisal during a stressful work task. Participants (N = 122) completed two 20-minute trials of an inbox activity. Between the first and second trials, participants were assigned to a waitlist control condition, a positive reappraisal via psycho-education condition, or a positive reappraisal via HRV feedback condition. Results revealed that using HRV data to frame a positive reappraisal message is more effective than using psycho-education (or no intervention), especially for increasing positive mood and reducing arousal.
Abstract:
Background: The randomised phase 3 First-Line Erbitux in Lung Cancer (FLEX) study showed that the addition of cetuximab to cisplatin and vinorelbine significantly improved overall survival compared with chemotherapy alone in the first-line treatment of advanced non-small-cell lung cancer (NSCLC). The main cetuximab-related side-effect was acne-like rash. Here, we assessed the association of this acne-like rash with clinical benefit. Methods: We did a subgroup analysis of patients in the FLEX study, which enrolled patients with advanced NSCLC whose tumours expressed epidermal growth factor receptor. Our landmark analysis assessed whether the development of acne-like rash in the first 21 days of treatment (first-cycle rash) was associated with clinical outcome, on the basis of patients in the intention-to-treat population alive on day 21. The FLEX study is registered with ClinicalTrials.gov, number NCT00148798. Findings: 518 patients in the chemotherapy plus cetuximab group (290 of whom had first-cycle rash) and 540 patients in the chemotherapy alone group were alive on day 21. Patients in the chemotherapy plus cetuximab group with first-cycle rash had significantly prolonged overall survival compared with patients in the same treatment group without first-cycle rash (median 15·0 months [95% CI 12·8-16·4] vs 8·8 months [7·6-11·1]; hazard ratio [HR] 0·631 [0·515-0·774]; p<0·0001). Corresponding significant associations were also noted for progression-free survival (median 5·4 months [5·2-5·7] vs 4·3 months [4·1-5·3]; HR 0·741 [0·607-0·905]; p=0·0031) and response (rate 44·8% [39·0-50·8] vs 32·0% [26·0-38·5]; odds ratio 1·703 [1·186-2·448]; p=0·0039). Overall survival for patients without first-cycle rash was similar to that of patients who received chemotherapy alone (median 8·8 months [7·6-11·1] vs 10·3 months [9·6-11·3]; HR 1·085 [0·910-1·293]; p=0·36). The significant overall survival benefit for patients with first-cycle rash versus those without was seen in all histology subgroups: adenocarcinoma (median 16·9 months [14·1-20·6] vs 9·3 months [7·7-13·2]; HR 0·614 [0·453-0·832]; p=0·0015), squamous-cell carcinoma (median 13·2 months [10·6-16·0] vs 8·1 months [6·7-12·6]; HR 0·659 [0·472-0·921]; p=0·014), and carcinomas of other histology (median 12·6 months [9·2-16·4] vs 6·9 months [5·2-11·0]; HR 0·616 [0·392-0·966]; p=0·033). Interpretation: First-cycle rash was associated with a better outcome in patients with advanced NSCLC who received cisplatin and vinorelbine plus cetuximab as a first-line treatment. First-cycle rash might be a surrogate clinical marker that could be used to tailor cetuximab treatment for advanced NSCLC to those patients who would be most likely to derive a significant benefit. Funding: Merck KGaA. © 2011 Elsevier Ltd.
Abstract:
INTRODUCTION: Performance status (PS) 2 patients with non-small cell lung cancer (NSCLC) experience more toxicity, lower response rates, and shorter survival times than healthier patients treated with standard chemotherapy. Paclitaxel poliglumex (PPX), a macromolecule drug conjugate of paclitaxel and polyglutamic acid, reduces systemic exposure to peak concentrations of free paclitaxel and may lead to increased concentrations in tumors due to enhanced vascular permeability. METHODS: Chemotherapy-naive PS 2 patients with advanced NSCLC were randomized to receive carboplatin (area under the curve = 6) and either PPX (210 mg/m² over 10 min without routine steroid premedication) or paclitaxel (225 mg/m² over 3 h with standard premedication) every 3 weeks. The primary end point was overall survival. RESULTS: A total of 400 patients were enrolled. Alopecia, arthralgias/myalgias, and cardiac events were significantly less frequent with PPX/carboplatin, whereas grade ≥3 neutropenia and grade 3 neuropathy showed a trend toward worsening. There was no significant difference in the incidence of hypersensitivity reactions despite the absence of routine premedication in the PPX arm. Overall survival was similar between treatment arms (hazard ratio, 0.97; log rank p = 0.769). Median survival and 1-year survival rates were 7.9 months and 31% for PPX versus 8 months and 31% for paclitaxel. Disease control rates were 64% and 69% for PPX and paclitaxel, respectively. Time to progression was similar: 3.9 months for PPX/carboplatin versus 4.6 months for paclitaxel/carboplatin (p = 0.210). CONCLUSION: PPX/carboplatin failed to provide superior survival compared with paclitaxel/carboplatin in the first-line treatment of PS 2 patients with NSCLC, but the results with respect to progression-free survival and overall survival were comparable and the PPX regimen was more convenient. © 2008 International Association for the Study of Lung Cancer.
Abstract:
We aimed to evaluate the effect of the appointment of a dedicated specialist thoracic surgeon on a lung cancer surgical practice previously served by cardio-thoracic surgeons. Outcomes were compared for the 240 patients undergoing surgical resection for lung cancer in two distinct 3-year periods: Group A: 65 patients, 1994-1996 (pre-specialist); Group B: 175 patients, 1997-1999 (post-specialist). The changes implemented resulted in a significant increase in the resection rate (from 12.2% to 23.4%, P<0.001), in operations in the elderly (over 75 years) and in extended resections. There were no significant differences in stage distribution, in-hospital mortality or stage-specific survival after surgery. Lung cancer surgery provided by specialists within a multidisciplinary team resulted in increased surgical resection rates without compromising outcome. Our results strengthen the case for disease-specific specialists in the treatment of lung cancer. © 2004 Published by Elsevier Ireland Ltd.