37 results for INSUFFICIENCY
Abstract:
Significance: Chronic wounds represent a major burden on global healthcare systems and reduce the quality of life of those affected. Significant advances have been made in our understanding of the biochemistry of wound healing progression. However, knowledge regarding the specific molecular processes influencing chronic wound formation and persistence remains limited. Recent Advances: Generally, healing of acute wounds begins with hemostasis and the deposition of a plasma-derived provisional matrix into the wound. The deposition of plasma matrix proteins is known to occur around the microvasculature of the lower limb as a result of venous insufficiency. This appears to alter limb cutaneous tissue physiology and consequently drives the tissue into a ‘preconditioned’ state that negatively influences the response to wounding. Critical Issues: Processes such as oxygen and nutrient suppression, edema, inflammatory cell trapping/extravasation, diffuse inflammation, and tissue necrosis are thought to contribute to the advent of a chronic wound. Healing of the wound then becomes difficult in the context of an internally injured limb. Thus, interventions and therapies for promoting healing of the limb are a growing area of interest. For venous ulcers, treatment using compression bandaging encourages venous return and improves healing processes within the limb; critically, however, once treatment concludes, ulcers often recur. Future Directions: Improved understanding of the composition and role of pericapillary matrix deposits in facilitating internal limb injury and subsequent development of chronic wounds will be critical for informing and enhancing current best practice therapies and preventative action in the wound care field.
Abstract:
We evaluated three acid-resistant pancreatic enzyme preparations by in vitro assays, and by comparing degree of steatorrhea, creatorrhea, fecal wet weight, and stool energy losses in a randomized crossover study of patients with pancreatic insufficient cystic fibrosis. Aims of the study were to assess (a) the most practicable and reliable indicator of malabsorption; (b) the variation in enzyme batch potency; (c) the decline in enzyme batch potency with prolonged shelf life; and (d) the relative bio-efficacy of the different preparations. In the in vivo study, absorption of energy, nitrogen, and fat did not differ when comparing the three preparations at roughly pharmaceutically equivalent doses, but when expressed per capsule of pancreatic supplement ingested, absorption reflected relative enzyme content, favoring the higher potency preparations. Although steatorrhea was reasonably controlled by these preparations, stool energy losses varied from 800 to 1,100 kJ per day, suggesting greater attention be paid to overall energy absorption rather than absorption of individual nutrients. In addition, fecal energy loss correlated more closely with fecal wet weight (r = 0.81; p < 0.05) than with steatorrhea (r = 0.40; ns), such that 1 g wet feces = 8.37 kJ (± 0.14). In vitro enzyme potency varied markedly between batches of the same brand, and also a decline of up to 20% in amylase, lipase, and trypsin activity was noted over an 8-month period for each batch. Both observations have clinical implications at times of represcription. Finally, the higher potency preparations were more effective per capsule and reduced capsule dosage is therefore attainable. © 1993 Raven Press, Ltd., New York.
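The reported conversion factor (1 g wet feces ≈ 8.37 kJ) means daily stool energy loss can be estimated from fecal wet weight alone. A minimal arithmetic sketch of that estimate, using hypothetical wet weights rather than data from the study:

```python
# Approximate daily stool energy loss from fecal wet weight using the
# conversion factor reported above (~8.37 kJ per gram of wet feces).
KJ_PER_GRAM_WET_FECES = 8.37  # +/- 0.14 kJ/g

def stool_energy_loss_kj(wet_weight_g: float) -> float:
    """Estimate daily stool energy loss (kJ) from fecal wet weight (g)."""
    return wet_weight_g * KJ_PER_GRAM_WET_FECES

# Hypothetical daily fecal wet weights (g/day), for illustration only.
for wet_weight_g in (100, 130):
    print(f"{wet_weight_g} g/day -> ~{stool_energy_loss_kj(wet_weight_g):.0f} kJ/day")
# 100 g/day -> ~837 kJ/day; 130 g/day -> ~1088 kJ/day, i.e. within the
# 800-1,100 kJ/day range of stool energy losses reported above.
```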
Abstract:
The bentiromide test was evaluated using plasma p-aminobenzoic acid as an indirect test of pancreatic insufficiency in young children between 2 months and 4 years of age. To determine the optimal test method, the following were examined: (a) the best dose of bentiromide (15 mg/kg or 30 mg/kg); (b) the optimal sampling time for plasma p-aminobenzoic acid; and (c) the effect of coadministration of a liquid meal. Sixty-nine children (1.6 ± 1.0 years) were studied, including 34 controls with normal fat absorption and 35 patients (34 with cystic fibrosis) with fat maldigestion due to pancreatic insufficiency. Control and pancreatic insufficient subjects were studied in three age-matched groups: (a) low-dose bentiromide (15 mg/kg) with clear fluids; (b) high-dose bentiromide (30 mg/kg) with clear fluids; and (c) high-dose bentiromide with a liquid meal. Plasma p-aminobenzoic acid was determined at 0, 30, 60, and 90 minutes, then hourly for 6 hours. The dose effect of bentiromide with clear liquids was evaluated. High-dose bentiromide best discriminated control and pancreatic insufficient subjects, due to a higher peak plasma p-aminobenzoic acid level in controls, but sensitivity and specificity remained poor. High-dose bentiromide with a liquid meal produced a delayed increase in plasma p-aminobenzoic acid in the control subjects, probably caused by retarded gastric emptying. However, in the pancreatic insufficient subjects, use of a liquid meal resulted in significantly lower plasma p-aminobenzoic acid levels at all time points; plasma p-aminobenzoic acid at 2 and 3 hours completely discriminated between control and pancreatic insufficient patients. Evaluation of the data by area under the time-concentration curve failed to improve test results. In conclusion, the bentiromide test is a simple, clinically useful means of detecting pancreatic insufficiency in young children, but a higher dose administered with a liquid meal is recommended.
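Area under the time-concentration curve, mentioned above, is conventionally computed with the trapezoidal rule over the sampling schedule. The sketch below assumes the sampling times described in the protocol and uses hypothetical plasma p-aminobenzoic acid concentrations, not study values:

```python
import numpy as np

# Sampling schedule from the protocol above: 0, 30, 60 and 90 minutes,
# then hourly out to 6 hours (one plausible reading of "hourly for 6 hours").
times_min = np.array([0, 30, 60, 90, 120, 180, 240, 300, 360])

# Hypothetical plasma p-aminobenzoic acid concentrations (arbitrary units);
# these are not values from the study.
paba = np.array([0.0, 1.2, 2.8, 3.5, 3.1, 2.4, 1.6, 1.0, 0.6])

# Area under the time-concentration curve by the trapezoidal rule.
auc = np.trapz(paba, times_min)
print(f"AUC(0-360 min) ~ {auc:.0f} concentration x minutes")
```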
Abstract:
A 17-year-old white adolescent had a history of chronic diarrhea, delayed puberty, and growth failure. Investigations excluded cystic fibrosis, Shwachman syndrome, and endocrine causes of growth failure. Severe steatorrhea was diagnosed from fecal fat studies, and a jejunal suction biopsy showed total villus atrophy, consistent with a diagnosis of celiac disease. Following introduction of a gluten-free diet, his appetite and growth improved, but he continued to have abdominal discomfort and loose offensive bowel motions. One year later, severe steatorrhea was present. A repeat jejunal biopsy showed partial recovery of villus architecture. The serum immunoreactive trypsinogen level was low, which was highly suggestive of exocrine pancreatic failure. Results of a quantitative pancreatic stimulation test confirmed the presence of primary pancreatic insufficiency. After introduction of oral pancreatic enzyme supplements with meals, his gastrointestinal symptoms resolved and growth velocity accelerated. Previously, primary pancreatic insufficiency had only been described in elderly patients with long-standing untreated celiac disease. This case, however, emphasizes that pancreatic failure can occur with celiac disease at any age. Determination of the serum immunoreactive trypsinogen level should be considered a useful screening tool for pancreatic insufficiency in patients with celiac disease who have not responded to a gluten-free diet.
Abstract:
Serum immunoreactive cationic trypsinogen levels were determined in 99 control subjects and 381 cystic fibrosis (CF) patients. To evaluate the status of the exocrine pancreas, all CF patients had previously undergone fecal fat balance studies and/or pancreatic stimulation tests. Three hundred fourteen CF patients had fat malabsorption and/or inadequate pancreatic enzyme secretion (pancreatic insufficiency) requiring oral pancreatic enzyme supplements with meals. Sixty-seven CF patients did not have fat malabsorption and/or had adequate enzyme secretion (pancreatic sufficiency) and were not receiving pancreatic enzyme supplements with meals. Mean serum trypsinogen in the 99 control subjects was 31.4 ± 14.8 µg/liter (± 2 SD), and levels did not vary with age or sex. In CF infants (< 2 yr) with pancreatic insufficiency, mean serum trypsinogen was significantly above the non-CF values (p < 0.001). Ninety-one percent of the CF infants had elevated levels. Serum trypsinogen values in the pancreatic insufficient group declined steeply up to 5 years, reaching subnormal values by age 6. An equation was developed which described these age-related changes very accurately. Only six CF patients with pancreatic insufficiency had serum trypsinogen levels above the 95% confidence limits of this equation. In contrast, there was no age-related decline in serum trypsinogen among the CF group with pancreatic sufficiency. Under 7 yr of age, serum trypsinogen failed to distinguish the two groups. In those over 7 yr of age, however, serum trypsinogen was significantly higher than in the CF group with pancreatic insufficiency (p < 0.001), and 93% had values within or above the control range. In conclusion, serum trypsinogen appears to be a useful screening test for CF in infancy. Between 2 and 7 yr of age this test is of little diagnostic value. After 7 yr of age, serum trypsinogen can reliably distinguish between CF patients with and without pancreatic insufficiency.
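The abstract refers to an equation describing the age-related decline in serum trypsinogen but does not give its form. Purely as an illustration of how such an age-decline curve might be fitted, the sketch below assumes an exponential decay toward a plateau, which is an assumption and not necessarily the study's equation; the data points are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical (age, serum trypsinogen) pairs for CF patients with
# pancreatic insufficiency -- illustrative only, not the study's data.
age_yr = np.array([0.5, 1, 2, 3, 4, 5, 6, 8, 10])
trypsinogen_ug_l = np.array([120, 95, 60, 35, 20, 12, 8, 5, 4])

def decline(age, a, k, c):
    """Exponential decline toward a low plateau: a * exp(-k * age) + c."""
    return a * np.exp(-k * age) + c

params, _ = curve_fit(decline, age_yr, trypsinogen_ug_l, p0=(120.0, 0.5, 5.0))
a, k, c = params
print(f"fitted curve: trypsinogen ~ {a:.0f} * exp(-{k:.2f} * age) + {c:.1f}")
```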
Abstract:
Vitamin D deficiency and insufficiency are now seen as a contemporary health problem in Australia with possible widespread health effects not limited to bone health1. Despite this, the Vitamin D status (measured as serum 25-hydroxyvitamin D (25(OH)D)) of ambulatory adults has been overlooked in this country. Serum 25(OH)D status is especially important among this group as studies have shown a link between Vitamin D and fall risk in older adults2. Limited data also exist on the contributions of sun exposure via ultraviolet radiation and dietary intake to serum 25(OH)D status in this population. The aims of this project were: to assess the serum 25(OH)D status of a group of older ambulatory adults in South East Queensland; to assess the association between their serum 25(OH)D status and functional measures as possible indicators of fall risk; to obtain data on the sources of Vitamin D in this population and assess whether this intake was related to serum 25(OH)D status; and to describe sun protection and exposure behaviors in this group and investigate whether a relationship existed between these and serum 25(OH)D status. The collection of these data assists in addressing key gaps identified in the literature with regard to this population group and their Vitamin D status in Australia. A representative convenience sample of participants (N=47) over 55 years of age was recruited for this cross-sectional, exploratory study, which was undertaken in December 2007 in south-east Queensland (Brisbane and Sunshine Coast). Participants were required to complete a sun exposure questionnaire in addition to a Calcium and Vitamin D food frequency questionnaire. Timed Up and Go and handgrip dynamometry tests were used to examine functional capacity. Serum 25(OH)D status and blood measures of Calcium, Phosphorus and Albumin were determined through blood tests. The mean and median serum 25-hydroxyvitamin D (25(OH)D) for all participants in this study were 85.8 nmol/L (standard deviation 29.7 nmol/L) and 81.0 nmol/L (range 22-158 nmol/L), respectively. Analysis at the bivariate level revealed a statistically significant relationship between serum 25(OH)D status and location, with participants living on the Sunshine Coast having a mean serum 25(OH)D status 21.3 nmol/L higher than participants living in Brisbane (p=0.014). While at the descriptive level there was an apparent trend towards higher outdoor exposure and increasing levels of serum 25(OH)D, no statistically significant associations between the sun measures of outdoor exposure, sun protection behaviors and phenotypic characteristics and serum 25(OH)D status were observed. Intake of both Calcium and Vitamin D was low in this sample, with sixty-eight percent (68%) of participants not meeting the Estimated Average Requirement (EAR) for Calcium (median = 771.0 mg; range = 218.0-2616.0 mg), while eighty-seven percent (87%) did not meet the Adequate Intake for Vitamin D (median = 4.46 µg; range = 0.13-30.0 µg). This raises the question of how realistic meeting the new Adequate Intakes for Vitamin D is when there is such a low level of Vitamin D fortification in this country. However, participants meeting the Adequate Intake (AI) for Vitamin D were observed to have a significantly higher serum 25(OH)D status compared to those not meeting the AI for Vitamin D (p=0.036), suggesting that meeting the AI for Vitamin D may play a significant role in determining Vitamin D status in this population.
When the data were stratified by categories of outdoor exposure time, a trend was observed towards increased importance of Vitamin D dietary intake as a possible determinant of serum 25(OH)D status in participants with lower outdoor exposure. While a trend towards higher Timed Up and Go scores in participants with higher 25(OH)D status was seen, this was only significant for females (p=0.014). Handgrip strength showed no statistically significant association with serum 25(OH)D status. The high serum 25(OH)D status in our sample almost certainly explains the limited relationship between functional measures and serum 25(OH)D. However, the observation of an association between slower Timed Up and Go speeds and lower serum 25(OH)D levels, even with a small sample size, is significant, as slower Timed Up and Go speeds have been associated with increased fall risk in older adults3. Multivariable regression analysis revealed location as the only significant determinant of serum 25(OH)D status (p=0.014), with trends (p>0.1) towards higher serum 25(OH)D for participants who met the AI for Vitamin D and who rated themselves as having a higher health status. The results of this exploratory study show that 93.6% of participants had adequate 25(OH)D status, possibly due to measurements being taken in the summer season and the convenience nature of the sample. However, many participants did not meet their dietary Calcium and Vitamin D requirements, which may indicate inadequate intake of these nutrients in older Australians and a higher risk of osteoporosis. The relationship between serum 25(OH)D and functional measures in this population also requires further study, especially in older adults displaying Vitamin D insufficiency or deficiency.
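The bivariate comparisons described above (for example, serum 25(OH)D in participants meeting versus not meeting the AI for Vitamin D) are the kind of analysis sketched below; the group values are hypothetical, not the study's data:

```python
from scipy import stats

# Hypothetical serum 25(OH)D values (nmol/L) for participants meeting and
# not meeting the Adequate Intake for Vitamin D -- illustrative only.
met_ai = [95, 110, 88, 102, 97, 120]
not_met_ai = [70, 82, 90, 65, 78, 85, 74]

# Welch's t-test (does not assume equal group variances).
t_stat, p_value = stats.ttest_ind(met_ai, not_met_ai, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```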
Abstract:
The explosive growth of the World-Wide-Web and the emergence of ecommerce are the two major factors that have led to the development of recommender systems (Resnick and Varian, 1997). The main task of recommender systems is to learn from users and recommend items (e.g. information, products or books) that match the users’ personal preferences. Recommender systems have been an active research area for more than a decade. Many different techniques and systems with distinct strengths have been developed to generate better quality recommendations. One of the main factors that affect recommenders’ recommendation quality is the amount of information resources that are available to the recommenders. The main feature of recommender systems is their ability to make personalised recommendations for different individuals. However, for many ecommerce sites, it is difficult to obtain sufficient knowledge about their users. Hence, the recommendations they provide to their users are often poor and not personalised. This information insufficiency problem is commonly referred to as the cold-start problem. Most existing research on recommender systems focuses on developing techniques to better utilise the available information resources to achieve better recommendation quality. However, while the amount of available data and information remains insufficient, these techniques can only provide limited improvements to the overall recommendation quality. In this thesis, a novel and intuitive approach towards improving recommendation quality and alleviating the cold-start problem is attempted: enriching the information resources. It can be easily observed that when there is a sufficient information and knowledge base to support recommendation making, even the simplest recommender systems can outperform the sophisticated ones with limited information resources. Two possible strategies are suggested in this thesis to achieve the proposed information enrichment for recommenders:
• The first strategy suggests that information resources can be enriched by considering other information or data facets. Specifically, a taxonomy-based recommender, Hybrid Taxonomy Recommender (HTR), is presented in this thesis. HTR exploits the relationship between users’ taxonomic preferences and item preferences from the combination of the widely available product taxonomic information and the existing user rating data, and it then utilises this taxonomic-preference-to-item-preference relation to generate high quality recommendations (a simplified sketch of this idea is given after this list).
• The second strategy suggests that information resources can be enriched simply by obtaining information resources from other parties. In this thesis, a distributed recommender framework, Ecommerce-oriented Distributed Recommender System (EDRS), is proposed. The proposed EDRS allows multiple recommenders from different parties (i.e. organisations or ecommerce sites) to share recommendations and information resources with each other in order to improve their recommendation quality.
Based on the results obtained from the experiments conducted in this thesis, the proposed systems and techniques achieved substantial improvements in both recommendation quality and alleviation of the cold-start problem.
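A minimal sketch of the kind of taxonomy-driven scoring HTR describes: ratings are rolled up into category-level (taxonomic) preferences, which are then used to score unrated items. The taxonomy, ratings and scoring rule here are hypothetical simplifications, not the thesis's actual HTR algorithm:

```python
from collections import defaultdict

# Hypothetical product taxonomy: item -> categories it belongs to.
taxonomy = {
    "item1": ["books/scifi"],
    "item2": ["books/scifi", "books/classics"],
    "item3": ["music/jazz"],
    "item4": ["books/classics"],
}

# Sparse existing user rating data (1-5), as in a cold-start setting.
ratings = {"item1": 5, "item3": 2}

# Step 1: derive the user's taxonomic (category-level) preferences
# from the items they have rated.
category_ratings = defaultdict(list)
for item, rating in ratings.items():
    for category in taxonomy[item]:
        category_ratings[category].append(rating)
category_pref = {c: sum(r) / len(r) for c, r in category_ratings.items()}

# Step 2: score unrated items by the mean preference of their categories,
# falling back to a neutral prior (3.0) for categories never seen.
def score(item: str) -> float:
    prefs = [category_pref.get(c, 3.0) for c in taxonomy[item]]
    return sum(prefs) / len(prefs)

candidates = [item for item in taxonomy if item not in ratings]
for item in sorted(candidates, key=score, reverse=True):
    print(item, round(score(item), 2))
```

Even this crude roll-up lets the one positive rating on item1 inform a score for the unrated item2 through their shared category, which is the intuition behind using taxonomic information to ease the cold-start problem.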
Abstract:
Objective: To determine whether there are clinical and public health dilemmas resulting from the reproducibility of routine vitamin D assays. Methods: Blinded agreement studies were conducted in eight clinical laboratories using two assays commonly used in Australasia and Canada to measure serum 25-hydroxyvitamin D (25(OH)D) levels (the DiaSorin Radioimmunoassay (RIA) and the DiaSorin LIAISON® assay). Results: Only one laboratory measured 25(OH)D with excellent precision. Replicate 25(OH)D measurements varied by up to 97%, and 15% of paired results differed by more than 50%. Thirteen percent of subjects received one result indicating insufficiency (25-50 nmol/L) and another suggesting adequacy (>50 nmol/L). Agreement ranged from poor to excellent for laboratories using the manual RIA, while the precision of the semi-automated LIAISON® system was consistently poor. Conclusions: Recent interest in the relevance of vitamin D to human health has increased demand for 25(OH)D testing and associated costs. Our results suggest clinicians and public health authorities are making decisions about treatment or changes to public health policy based on imprecise data. Clinicians, researchers and policy makers should be made aware of the imprecision of current 25(OH)D testing so that they exercise caution when using these assays for clinical practice, and when interpreting the findings of epidemiological studies based on vitamin D levels measured using these assays. Development of a rapid, reproducible, accurate and robust assay should be a priority due to interest in population-based screening programs and research to inform public health policy about the amount of sun exposure required for human health. In the interim, 25(OH)D results should routinely include a statement of measurement uncertainty.
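A minimal sketch of the kind of replicate-agreement check described above: the percent difference between paired 25(OH)D results and whether the pair straddles the 50 nmol/L insufficiency/adequacy cut-off. The paired values are hypothetical:

```python
# Paired replicate 25(OH)D results (nmol/L) for the same specimens;
# hypothetical values, not the study's data.
pairs = [(48, 62), (30, 41), (75, 72), (52, 49)]

CUTOFF_NMOL_L = 50  # <=50 nmol/L treated as insufficient, >50 as adequate

for first, second in pairs:
    pct_diff = abs(first - second) / ((first + second) / 2) * 100
    discordant = (first <= CUTOFF_NMOL_L) != (second <= CUTOFF_NMOL_L)
    flag = ", discordant classification" if discordant else ""
    print(f"{first} vs {second} nmol/L: {pct_diff:.0f}% difference{flag}")
```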
Abstract:
This study is the first to investigate the effect of prolonged reading on reading performance and visual functions in students with low vision. The study focuses on one of the most common modes of achieving adequate magnification for reading by students with low vision, their close reading distance (proximal or relative distance magnification). Close reading distances impose high demands on near visual functions, such as accommodation and convergence. Previous research on accommodation in children with low vision shows that their accommodative responses are reduced compared to normal vision. In addition, there is an increased lag of accommodation for higher stimulus levels as may occur at close reading distances. Reduced accommodative responses in low vision and a higher lag of accommodation at close reading distances together could impact on the reading performance of students with low vision, especially during prolonged reading tasks. The presence of convergence anomalies could further affect reading performance. Therefore, the aims of the present study were: 1) to investigate the effect of prolonged reading on reading performance in students with low vision; and 2) to investigate the effect of prolonged reading on visual functions in students with low vision. This study was conducted as cross-sectional research on 42 students with low vision and a comparison group of 20 students with normal vision, aged 7 to 20 years. The students with low vision had vision impairments arising from a range of causes and represented a typical group of students with low vision, with no significant developmental delays, attending school in Brisbane, Australia. All participants underwent a battery of clinical tests before and after a prolonged reading task. An initial reading-specific history and pre-task measurements that included Bailey-Lovie distance and near visual acuities, Pelli-Robson contrast sensitivity, ocular deviations, sensory fusion, ocular motility, near point of accommodation (pull-away method), accuracy of accommodation (Monocular Estimation Method (MEM) retinoscopy) and Near Point of Convergence (NPC) (push-up method) were recorded for all participants. Reading performance measures were Maximum Oral Reading Rates (MORR), Near Text Visual Acuity (NTVA) and acuity reserves using Bailey-Lovie text charts. Symptoms of visual fatigue were assessed using the Convergence Insufficiency Symptom Survey (CISS) for all participants. Pre-task measurements of reading performance, accuracy of accommodation and NPC were compared with post-task measurements to test for any effects of prolonged reading. The prolonged reading task involved reading a storybook silently for at least 30 minutes. The task was controlled for print size, contrast, difficulty level and content of the reading material. Silent Reading Rate (SRR) was recorded every 2 minutes during prolonged reading. Symptom scores and visual fatigue scores were also obtained for all participants. A visual fatigue analogue scale (VAS) was used to assess visual fatigue during the task, once at the beginning, once in the middle and once at the end of the task. In addition to the subjective assessments of visual fatigue, tonic accommodation was monitored using a photorefractor (PlusoptiX CR03™) every 6 minutes during the task, as an objective assessment of visual fatigue. Reading measures were taken at the habitual reading distance of students with low vision and at 25 cm for students with normal vision.
The initial history showed that the students with low vision read for significantly shorter periods at home compared to the students with normal vision. The working distances of participants with low vision ranged from 3 to 25 cm, and half of them were not using any optical devices for magnification. Nearly half of the participants with low vision were able to resolve 8-point print (1M) at 25 cm. Half of the participants in the low vision group had ocular deviations and suppression at near. Reading rates were significantly reduced in students with low vision compared to those of students with normal vision. In addition, there was a significantly larger number of participants in the low vision group who could not sustain the 30-minute task compared to the normal vision group. However, there were no significant changes in reading rates during or following prolonged reading in either the low vision or normal vision groups. Individual changes in reading rates were independent of baseline reading rates, indicating that changes in reading rates during prolonged reading cannot be predicted from a typical clinical assessment of reading using brief reading tasks. Contrary to previous reports, the silent reading rates of the students with low vision were significantly lower than their oral reading rates, although oral and silent reading were assessed using different methods. Although visual acuity, contrast sensitivity, near point of convergence and accuracy of accommodation were significantly poorer for the low vision group compared to those of the normal vision group, there were no significant changes in any of these visual functions following prolonged reading in either group. Interestingly, a few students with low vision (n = 10) were found to be reading at a distance closer than their near point of accommodation. This suggests a decreased sensitivity to blur. Further evaluation revealed that the equivalent intrinsic refractive errors (an estimate of the spherical dioptric defocus which would be expected to yield a patient’s visual acuity in normal subjects) were significantly larger for the low vision group compared to those of the normal vision group. As expected, accommodative responses were significantly reduced for the low vision group compared to the expected norms, which is consistent with their close reading distances, reduced visual acuity and contrast sensitivity. For those in the low vision group who had an accommodative error exceeding their equivalent intrinsic refractive errors, a significant decrease in MORR was found following prolonged reading. The silent reading rates, however, were not significantly affected by accommodative errors in the present study. Suppression also had a significant impact on the changes in reading rates during prolonged reading. The participants who did not have suppression at near showed significant decreases in silent reading rates during and following prolonged reading. This impact of binocular vision at near on prolonged reading was possibly due to the high demands on convergence. The significant predictors of MORR in the low vision group were age, NTVA, reading interest and reading comprehension, accounting for 61.7% of the variance in MORR. SRR was not significantly influenced by any factors, except for the duration of the reading task sustained; participants with higher reading rates were able to sustain a longer reading duration. In students with normal vision, age was the only predictor of MORR.
Participants with low vision also reported significantly greater visual fatigue compared to the normal vision group. Measures of tonic accommodation, however, were little influenced by visual fatigue in the present study. Visual fatigue analogue scores were found to be significantly associated with reading rates in students with low vision and normal vision. However, the patterns of association between visual fatigue and reading rates were different for SRR and MORR. Participants with low vision who had higher symptom scores had lower SRRs, and participants with higher visual fatigue had lower MORRs. As hypothesized, visual functions such as accuracy of accommodation and convergence did have an impact on prolonged reading in students with low vision, for students whose accommodative errors were greater than their equivalent intrinsic refractive errors, and for those who did not suppress one eye. Those students with low vision who have accommodative errors higher than their equivalent intrinsic refractive errors might benefit significantly from reading glasses. Similarly, considering prisms or occlusion for those without suppression might reduce the convergence demands in these students while using their close reading distances. The impact of these prescriptions on reading rates, reading interest and visual fatigue is a promising area for future research. Most importantly, it is evident from the present study that a combination of factors such as accommodative errors, near point of convergence and suppression should be considered when prescribing reading devices for students with low vision. Considering these factors would also assist rehabilitation specialists in identifying those students who are likely to experience difficulty in prolonged reading, which is otherwise not reflected during typical clinical reading assessments.
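Acuity reserve, one of the reading performance measures recorded above, is conventionally expressed as the ratio of the print size being read to the reader's near acuity threshold (often also reported in log units). A minimal sketch of that calculation with hypothetical values:

```python
import math

def acuity_reserve(print_size_m: float, acuity_threshold_m: float):
    """Return acuity reserve as a ratio and in log10 units.

    print_size_m: size of the text being read, in M units.
    acuity_threshold_m: smallest print the reader can resolve, in M units.
    """
    ratio = print_size_m / acuity_threshold_m
    return ratio, math.log10(ratio)

# Hypothetical example: a student reading 2M print whose near acuity is 1M.
ratio, log_units = acuity_reserve(2.0, 1.0)
print(f"acuity reserve = {ratio:.1f}:1 ({log_units:.2f} log units)")
```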
Abstract:
Aims: To identify self-care activities undertaken and determine relationships between self-efficacy, depression, quality of life, social support and adherence to compression therapy in a sample of patients with chronic venous insufficiency. Background: Up to 70% of venous leg ulcers recur after healing. Compression hosiery is a primary strategy to prevent recurrence; however, problems with adherence to this strategy are well documented, and an improved understanding of how psychosocial factors influence patients with chronic venous insufficiency will help guide effective preventive strategies. Design: Cross-sectional survey and retrospective medical record review. Method: All patients previously diagnosed with a venous leg ulcer that had healed between 12–36 months prior to the study were invited to participate. Data on health, psychosocial variables and self-care activities were obtained from a self-report survey, and data on medical and previous ulcer history were obtained from medical records. Multiple linear regression modelling was used to determine the independent influences of psychosocial factors on adherence to compression therapy. Results: In a sample of 122 participants, the most frequently identified self-care activities were application of topical skin treatments, wearing compression hosiery and covering legs to prevent trauma. Compression hosiery was worn for a median of 4 days/week (range 0–7). After adjustment for all variables and potential confounders in a multivariable regression model, wearing compression hosiery was found to be significantly positively associated with participants’ knowledge of the cause of their condition (p=0.002), higher self-efficacy scores (p=0.026) and lower depression scores (p=0.009). Conclusion: In this sample, depression, self-efficacy and knowledge were found to be significantly related to adherence to compression therapy. Relevance to clinical practice: These findings support the need to screen for and treat depression in this population. In addition, strategies to improve patient knowledge and self-efficacy may positively influence adherence to compression therapy.
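A minimal sketch of the kind of multiple linear regression modelling described in the Method above, with adherence (days/week of compression hosiery wear) regressed on knowledge, self-efficacy and depression scores. The data frame and variable names are hypothetical, not the study's data set:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey data -- illustrative only, not the study's data set.
df = pd.DataFrame({
    "hosiery_days_per_week": [7, 4, 0, 6, 3, 5, 2, 7, 1, 6],
    "knows_cause":           [1, 1, 0, 1, 0, 1, 0, 1, 0, 1],  # 1 = yes
    "self_efficacy":         [32, 28, 18, 30, 22, 27, 20, 34, 19, 29],
    "depression_score":      [3, 6, 14, 4, 10, 5, 12, 2, 15, 4],
})

# Multiple linear regression of adherence on knowledge, self-efficacy and
# depression, mirroring the adjusted model described above.
model = smf.ols(
    "hosiery_days_per_week ~ knows_cause + self_efficacy + depression_score",
    data=df,
).fit()
print(model.summary())
```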
Abstract:
Navigational collisions are one of the major safety concerns in many seaports. To address this safety concern, a comprehensive and structured method of collision risk management is necessary. Traditionally, management of port water collision risks has relied on historical collision data. However, this collision-data-based approach is hampered by several shortcomings, such as the randomness and rarity of collision occurrence, which leads to an insufficient number of samples for sound statistical analysis, insufficiency in explaining collision causation, and a reactive approach to safety. A promising alternative approach that overcomes these shortcomings is the navigational traffic conflict technique, which uses traffic conflicts as an alternative to collision data. This paper proposes a collision risk management method utilizing the principles of this technique. This risk management method allows safety analysts to diagnose safety deficiencies in a proactive manner and consequently has great potential for managing collision risks in a fast, reliable and efficient manner.
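A minimal sketch of the basic idea behind a conflict-based risk indicator: because collisions are too rare to count reliably, observed conflicts per unit of traffic exposure stand in as the measure of risk. The counts and exposure figures below are hypothetical:

```python
# Hypothetical observed traffic conflicts and traffic exposure for three
# port waterway segments -- illustrative only.
segments = {
    "fairway_A":      {"conflicts": 18, "vessel_movements": 5200},
    "intersection_B": {"conflicts": 31, "vessel_movements": 4100},
    "anchorage_C":    {"conflicts": 9,  "vessel_movements": 2600},
}

# Conflict rate per 1,000 movements used as a proactive risk indicator
# in place of (rare) collision counts.
for name, seg in segments.items():
    rate = seg["conflicts"] / seg["vessel_movements"] * 1000
    print(f"{name}: {rate:.1f} conflicts per 1,000 movements")
```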
Abstract:
Navigational collisions are one of the major safety concerns for many seaports. Continuing growth of shipping traffic in number and size is likely to result in an increased number of traffic movements, which consequently could result in a higher risk of collisions in these restricted waters. This continually increasing safety concern warrants a comprehensive technique for modeling collision risk in port waters, particularly for modeling the probability of collision events and the associated consequences (i.e., injuries and fatalities). A number of techniques have been utilized for modeling the risk qualitatively, semi-quantitatively and quantitatively. These traditional techniques mostly rely on historical collision data, often in conjunction with expert judgments. However, these techniques are hampered by several shortcomings, such as the randomness and rarity of collision occurrence, which leads to an insufficient number of collision counts for sound statistical analysis, insufficiency in explaining collision causation, and a reactive approach to safety. A promising alternative approach that overcomes these shortcomings is the navigational traffic conflict technique (NTCT), which uses traffic conflicts as an alternative to collisions for modeling the probability of collision events quantitatively. This article explores the existing techniques for modeling collision risk in port waters. In particular, it identifies the advantages and limitations of the traditional techniques and highlights the potential of the NTCT in overcoming these limitations. In view of the principles of the NTCT, a structured method for managing collision risk is proposed. This risk management method allows safety analysts to diagnose safety deficiencies in a proactive manner and consequently has great potential for managing collision risk in a fast, reliable and efficient manner.
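One common way to use conflicts for quantitative collision-probability modeling, in the spirit of the NTCT described above, is to scale the observed conflict rate by an assumed probability that a conflict escalates into a collision; both figures below are hypothetical:

```python
# Hypothetical decomposition of collision probability per vessel movement:
# P(collision) = P(conflict per movement) * P(collision | conflict).
p_conflict_per_movement = 31 / 4100    # observed conflict rate (hypothetical)
p_collision_given_conflict = 1 / 2000  # assumed escalation probability

p_collision_per_movement = p_conflict_per_movement * p_collision_given_conflict

annual_movements = 4100
expected_collisions_per_year = p_collision_per_movement * annual_movements

print(f"P(collision per movement) ~ {p_collision_per_movement:.2e}")
print(f"expected collisions per year ~ {expected_collisions_per_year:.4f}")
```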
Abstract:
Traditionally, navigational safety analyses rely on historical collision data, an approach that is often hampered by low collision counts, insufficiency in explaining collision causation, and a reactive approach to safety. A promising alternative approach that overcomes these problems is to use navigational traffic conflicts, or near-misses, as an alternative to collision data. This book discusses how traffic conflicts can be used effectively in modeling port water collision risks. Techniques for measuring and predicting collision risks in fairways, intersections, and anchorages, utilizing advanced statistical models, are discussed. Risk measurement models, which quantitatively measure collision risks in waterways, are presented, and to predict risks, a hierarchical statistical modeling technique that identifies the factors influencing the risks is described. The modeling techniques are illustrated using Singapore port data. Results showed that traffic conflicts are an ethically appealing alternative to collision data for fast, reliable and effective safety assessment, and thus possess great potential for managing collision risks in port waters.
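A simplified (non-hierarchical) sketch of the kind of count-regression model used to relate conflict frequency to influencing factors, here a Poisson GLM with traffic volume as exposure; a negative binomial family is a common alternative when counts are overdispersed. The data and covariates are hypothetical, not the Singapore port data:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical conflict counts per waterway segment with candidate risk
# factors -- illustrative only, not the Singapore data.
df = pd.DataFrame({
    "conflicts":     [18, 31, 9, 25, 14, 40, 7, 22],
    "movements":     [5200, 4100, 2600, 4800, 3000, 6100, 1900, 4400],
    "channel_width": [250, 180, 400, 210, 320, 160, 450, 230],   # metres
    "night_share":   [0.35, 0.45, 0.30, 0.40, 0.33, 0.50, 0.28, 0.38],
})

# Poisson count regression with traffic volume as exposure (log offset).
X = sm.add_constant(df[["channel_width", "night_share"]])
model = sm.GLM(
    df["conflicts"], X,
    family=sm.families.Poisson(),
    exposure=df["movements"],
).fit()
print(model.summary())
```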
Abstract:
Critically ill patients receiving extracorporeal membrane oxygenation (ECMO) are often noted to have increased sedation requirements. However, data related to sedation in this complex group of patients are limited. The aim of our study was to characterise the sedation requirements in adult patients receiving ECMO for cardiorespiratory failure. A retrospective chart review was performed to collect sedation data for 30 consecutive patients who received venovenous or venoarterial ECMO between April 2009 and March 2011. To test for a difference in doses over time, we used a regression model. The dose of midazolam received on ECMO support increased by an average of 18 mg per day (95% confidence interval 8, 29 mg, P=0.001), while the dose of morphine increased by 29 mg per day (95% confidence interval 4, 53 mg, P=0.021). The venovenous group received a daily midazolam dose that was 157 mg higher than the venoarterial group (95% confidence interval 53, 261 mg, P=0.005). We did not observe any significant increase in fentanyl doses over time (95% confidence interval 1269, 4337 µg, P=0.94). There was a significant increase in dose requirement for morphine and midazolam during ECMO. Patients on venovenous ECMO received higher sedative doses compared with patients on venoarterial ECMO. Future research should focus on mechanisms behind these changes and also identify drugs that are most suitable for sedation during ECMO.
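A minimal sketch of the kind of regression used to test for a change in dose over time on ECMO: daily midazolam dose regressed on ECMO day, with the slope read as the average change in mg per day. The data are hypothetical, and the study's actual model (which pooled repeated measures across 30 patients) would be more involved:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical daily sedation data for a single ECMO run -- illustrative only.
df = pd.DataFrame({
    "ecmo_day":     [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
    "midazolam_mg": [60, 85, 95, 130, 150, 160, 200, 210, 240, 255],
})

# The slope is the average change in daily midazolam dose per day on ECMO.
fit = smf.ols("midazolam_mg ~ ecmo_day", data=df).fit()
slope = fit.params["ecmo_day"]
low, high = fit.conf_int().loc["ecmo_day"]
print(f"dose change ~ {slope:.1f} mg/day (95% CI {low:.1f}, {high:.1f})")
```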