42 results for Street signs
Abstract:
BACKGROUND Heart failure with preserved ejection fraction (HFpEF) represents a growing health burden associated with substantial mortality and morbidity. Consequently, risk prediction is of the utmost importance. Endothelial dysfunction has recently been shown to play an important role in the complex pathophysiology of HFpEF. We therefore aimed to assess von Willebrand factor (vWF), a marker of endothelial damage, as a potential biomarker for risk assessment in patients with HFpEF. METHODS AND RESULTS Concentrations of vWF were assessed in 457 patients with HFpEF enrolled as part of the LUdwigshafen Risk and Cardiovascular Health (LURIC) study. All-cause mortality was observed in 40% of patients during a median follow-up of 9.7 years. vWF significantly predicted mortality, with a hazard ratio (HR) per 1-SD increase of 1.45 (95% confidence interval, 1.26-1.68; P<0.001), and remained a significant predictor after adjustment for age, sex, body mass index, N-terminal pro-B-type natriuretic peptide (NT-proBNP), renal function, and frequent HFpEF-related comorbidities (adjusted HR per 1 SD, 1.22; 95% confidence interval, 1.05-1.42; P=0.001). Most notably, vWF showed additional prognostic value beyond that achievable with NT-proBNP, as indicated by improvements in the C-statistic (vWF×NT-proBNP: 0.65 versus NT-proBNP: 0.63; P for comparison, 0.004) and the category-free net reclassification index (37.6%; P<0.001). CONCLUSIONS vWF is an independent predictor of long-term outcome in patients with HFpEF, which is in line with endothelial dysfunction acting as a potential mediator in the pathophysiology of HFpEF. In particular, combined assessment of vWF and NT-proBNP improved risk prediction in this vulnerable group of patients.
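As a rough illustration of the statistical approach described above (a Cox model yielding a hazard ratio per 1-SD increase of a biomarker), here is a minimal sketch in Python using the lifelines library. The column names (vwf, time_years, died, and the covariates) are hypothetical placeholders, not the LURIC variables.

```python
# Minimal sketch: hazard ratio per 1-SD increase of a biomarker from a Cox
# proportional hazards model. Column names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

def hr_per_sd(df: pd.DataFrame, biomarker: str, covariates: list[str]) -> float:
    """Return the hazard ratio per 1-SD increase of `biomarker`, adjusted for `covariates`."""
    data = df.copy()
    # Standardize the biomarker so the fitted coefficient is per 1 SD
    data[biomarker] = (data[biomarker] - data[biomarker].mean()) / data[biomarker].std()
    cph = CoxPHFitter()
    cph.fit(data[[biomarker, *covariates, "time_years", "died"]],
            duration_col="time_years", event_col="died")
    return float(cph.hazard_ratios_[biomarker])

# Example usage with a hypothetical data frame `hfpef`:
# hr = hr_per_sd(hfpef, "vwf", ["age", "sex", "bmi", "nt_probnp", "egfr"])
```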
Abstract:
Sleeping disease is a contagious disease mainly of freshwater-farmed rainbow trout, caused by salmonid alphavirus (SAV) Subtype 2. Here we describe the first case in Switzerland. Pathological changes ranged from acute pancreas necrosis to more chronic lesions with complete loss of the exocrine pancreas and simultaneous degenerative, inflammatory and regenerative heart and muscle lesions. Partial sequencing of the SAV E2 and nsp3 genes placed the Swiss SAV variant within Subtype 2, clustering together with freshwater isolates from the UK and continental Europe. Although mortality stayed low, growth rates were significantly reduced, making the disease economically relevant.
Abstract:
BACKGROUND: Both central and peripheral vision are needed for object detection. Previous research has shown that visual target detection is affected by age. In addition, light conditions influence visual exploration. The aim of the study was to investigate the effects of age and different light conditions on visual exploration behavior and on driving performance during simulated driving. METHODS: A fixed-base simulator with a 180-degree field of view was used to simulate a motorway route under daylight and night conditions in 29 young subjects (25-40 years) and 27 older subjects (65-78 years). Drivers' eye fixations were analyzed and assigned to regions of interest (ROIs) such as the street, road signs, the car ahead, the environment, the rear-view mirror, the left and right side mirrors, oncoming cars, parked cars, and road repairs. In addition, lane-keeping and driving speed were analyzed as measures of driving performance. RESULTS: Older drivers had longer fixations on the task-relevant ROIs but checked the mirrors less frequently than younger drivers. In both age groups, night driving led to fewer fixations on the mirrors. At the performance level, older drivers showed more variation in driving speed and lane-keeping behavior, which was especially prominent at night. In younger drivers, night driving had no impact on driving speed or lane-keeping behavior. CONCLUSIONS: Older drivers' visual exploration behavior is more fixed on the task-relevant ROIs, especially at night, when their driving performance becomes more heterogeneous than that of younger drivers.
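To make concrete how eye fixations can be assigned to regions of interest of the kind listed above, here is a minimal sketch. The ROI rectangles, screen coordinates and field names are illustrative assumptions, not the simulator's actual configuration.

```python
# Minimal sketch: assigning gaze fixations to rectangular ROIs and summing
# fixation duration per ROI. Coordinates and names are illustrative only.
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # horizontal gaze position (screen coordinates)
    y: float          # vertical gaze position
    duration_ms: float

# Each ROI is a named rectangle: (x_min, y_min, x_max, y_max)
ROIS = {
    "street":       (400, 300, 1520, 800),
    "road_signs":   (0, 0, 400, 300),
    "rear_mirror":  (860, 0, 1060, 120),
    "left_mirror":  (0, 700, 200, 900),
    "right_mirror": (1720, 700, 1920, 900),
}

def assign_roi(fix: Fixation) -> str:
    """Return the first ROI containing the fixation, or 'environment'."""
    for name, (x0, y0, x1, y1) in ROIS.items():
        if x0 <= fix.x <= x1 and y0 <= fix.y <= y1:
            return name
    return "environment"

def dwell_time_per_roi(fixations: list[Fixation]) -> dict[str, float]:
    """Total fixation duration (ms) accumulated per ROI."""
    totals: dict[str, float] = {}
    for fix in fixations:
        roi = assign_roi(fix)
        totals[roi] = totals.get(roi, 0.0) + fix.duration_ms
    return totals
```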
Abstract:
BACKGROUND: Crossing a street can be a very difficult task for older pedestrians. With increased age and potential cognitive decline, older people base the decision to cross a street primarily on the distance of approaching vehicles rather than on their speed. Furthermore, older pedestrians tend to overestimate their own walking speed and cannot easily adapt it to the traffic conditions. Pedestrians' behavior is often tested using virtual reality, which has the advantage of being safe and cost-effective and of allowing standardized test conditions. METHODS: This paper describes an observational study with older and younger adults. Street-crossing behavior was investigated in 18 healthy younger and 18 healthy older subjects using a virtual reality setting. The aim of the study was to measure behavioral data (such as eye and head movements) and to assess how the two age groups differ in terms of the number of safe street crossings, virtual crashes, and missed crossing opportunities. Street-crossing behavior and eye and head movements in older and younger subjects were compared with non-parametric tests. RESULTS: The results showed that younger pedestrians crossed the street in a more secure manner than older people. The analysis of eye and head movements revealed that older people looked more at the ground and less at the other side of the street they intended to cross. CONCLUSIONS: The less secure street-crossing behavior found in older pedestrians could be explained by their reduced cognitive and visual abilities, which, in turn, result in difficulties in the decision-making process, especially under time pressure. For both groups, decisions to cross a street were based on the distance of the oncoming cars rather than their speed. Older pedestrians look more at their feet, probably because they need more time to plan precise stepping movements and, in turn, pay less attention to the traffic. These findings might help to establish guidelines for improving senior pedestrians' safety in terms of speed limits, road design, and combined physical-cognitive training.
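The non-parametric group comparison mentioned above can be illustrated with a short sketch. The counts below are invented placeholders, not the study's data, and scipy's Mann-Whitney U test stands in for whichever specific non-parametric test the authors used.

```python
# Minimal sketch: comparing a street-crossing outcome (e.g. virtual crashes)
# between two age groups with a Mann-Whitney U test. Placeholder data only.
from scipy.stats import mannwhitneyu

young_crashes = [0, 1, 0, 0, 2, 0, 1, 0, 0, 1]   # hypothetical counts
older_crashes = [2, 3, 1, 4, 2, 0, 3, 2, 1, 2]   # hypothetical counts

stat, p_value = mannwhitneyu(young_crashes, older_crashes, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.3f}")
```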
Abstract:
Deciding when to cross a street safely is a challenging task that places high demands on perception and cognition. Both can be affected by normal aging, neurodegenerative disorders, and brain injury, and there is increasing interest in studying street-crossing decisions. In this article, we describe how driving simulators can be modified to study pedestrians' street-crossing decisions. The driving simulator's projection system and virtual driving environment were used to present street-crossing scenarios to the participants, and new sensors were added to measure when the test person starts to cross the street. Outcome measures were feasibility, usability, task performance, and visual exploration behavior, assessed in 15 younger persons, 15 older persons, and 5 post-stroke patients. The experiments showed that the test is feasible and usable and that the selected difficulty level was appropriate. Significant differences in the number of crashes were found between young participants and patients (p = .001) as well as between healthy older participants and patients (p = .003). When the approaching vehicle's speed was high, significant differences between younger and older participants were also found (p = .038). Overall, the new test setup was well accepted, and we demonstrated that driving simulators can be used to study pedestrians' street-crossing decisions.
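A minimal sketch of how a step-off sensor event could be combined with the approaching vehicle's distance and speed to compute the available time gap and flag a virtual crash. The class names, thresholds and example numbers are assumptions for illustration, not the actual simulator code.

```python
# Minimal sketch: time gap at crossing onset and virtual-crash flag.
# All names and numbers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class VehicleState:
    distance_m: float   # distance from the crossing point
    speed_mps: float    # approach speed in metres per second

def time_gap_at_step_off(vehicle: VehicleState) -> float:
    """Seconds until the vehicle reaches the crossing point."""
    if vehicle.speed_mps <= 0:
        return float("inf")
    return vehicle.distance_m / vehicle.speed_mps

def is_virtual_crash(vehicle: VehicleState, crossing_time_s: float) -> bool:
    """Crash if the pedestrian's crossing time exceeds the available gap."""
    return crossing_time_s > time_gap_at_step_off(vehicle)

# Example: vehicle 35 m away at ~50 km/h (13.9 m/s), pedestrian needs 4 s
print(is_virtual_crash(VehicleState(35.0, 13.9), crossing_time_s=4.0))  # True: gap is about 2.5 s
```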
Abstract:
The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).
Class I: blood loss up to 750 ml (up to 15% of blood volume); pulse rate <100 bpm; systolic blood pressure normal; pulse pressure normal or increased; respiratory rate 14–20; urine output >30 ml/h; CNS/mental status slightly anxious; initial fluid replacement crystalloid.
Class II: blood loss 750–1500 ml (15–30%); pulse rate 100–120 bpm; systolic blood pressure normal; pulse pressure decreased; respiratory rate 20–30; urine output 20–30 ml/h; mildly anxious; crystalloid.
Class III: blood loss 1500–2000 ml (30–40%); pulse rate 120–140 bpm; systolic blood pressure decreased; pulse pressure decreased; respiratory rate 30–40; urine output 5–15 ml/h; anxious, confused; crystalloid and blood.
Class IV: blood loss >2000 ml (>40%); pulse rate >140 bpm; systolic blood pressure decreased; pulse pressure decreased; respiratory rate >35; urine output negligible; confused, lethargic; crystalloid and blood.

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 bpm postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel, as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, requirement of any blood transfusion, or >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
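As a purely illustrative aside, the blood-loss thresholds of Table 1 can be expressed as a simple lookup. The sketch below only restates the table's percentage ranges and is not a clinical decision tool.

```python
# Minimal sketch: mapping an estimated blood-loss percentage to the ATLS
# shock class of Table 1 above. Illustrative only, not for clinical use.
def atls_class_from_blood_loss(percent_blood_volume: float) -> str:
    if percent_blood_volume <= 15:
        return "Class I"
    elif percent_blood_volume <= 30:
        return "Class II"
    elif percent_blood_volume <= 40:
        return "Class III"
    else:
        return "Class IV"

print(atls_class_from_blood_loss(25))  # Class II
```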
As in the retrospective studies, Lawton did not find a statistical difference in heart rate and blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation based on the shock classification. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].
Normal (no haemorrhage): vitals normal; response to 1000 ml fluid bolus not applicable; estimated blood loss none.
Class I (mild): vitals normal; responds to fluid bolus, no further fluid required; estimated blood loss up to 750 ml.
Class II (moderate): HR >100 with SBP >90 mmHg; responds to fluid bolus, no further fluid required; estimated blood loss 750–1500 ml.
Class III (severe): SBP <90 mmHg; requires repeated fluid boluses; estimated blood loss 1500–2000 ml.
Class IV (moribund): SBP <90 mmHg or imminent arrest; declining SBP despite fluid boluses; estimated blood loss >2000 ml.

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first, with hypotension and bradycardia occurring only later. An increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the OR. We will therefore continue to teach this typical pattern but will also continue to mention the exceptions and pitfalls at a second stage. The ATLS shock classification is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage (advanced age, athletes, pregnancy, medications and pacemakers) and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss together with clinical examination and laboratory findings (e.g. base deficit and lactate) but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest: none declared. Member of the Swiss national ATLS core faculty.
Abstract:
The majority of first-episode psychoses are preceded by a prodromal phase that lasts several years on average, frequently leads to some decline in psychosocial functioning, and offers the opportunity for early detection within the framework of indicated prevention. To this end, two main approaches are currently followed. The ultra-high-risk (UHR) criteria were explicitly developed to predict first-episode psychosis within 12 months, and indeed the majority of conversions in clinical UHR samples seem to occur within the first 12 months after initial assessment. Their main criterion, the attenuated psychotic symptoms criterion, captures symptoms that resemble the positive symptoms of psychosis (i.e. delusions, hallucinations and formal thought disorders) except that some level of insight is still maintained; these symptoms frequently compromise functioning already. In contrast, the basic symptom criteria try to identify patients at increased risk of psychosis at the earliest possible time, i.e. ideally when only the first subtle disturbances in information processing have developed, which are experienced with full insight, do not yet overload the person's coping abilities, and thus have not yet resulted in any functional decline. First results from prospective studies not only support this view but indicate that the combination of both approaches might be a more favorable way to increase sensitivity and detect risk earlier, as well as to establish a change-sensitive risk stratification approach.
Abstract:
In this study we investigate the relative frequencies of female and male terms in early reading material for children, using the Children's Printed Word Database as a resource. As the roles of females and males have changed over time, it is of interest to see whether there has been a corresponding change in the representation of females and males in children's books. We carried out analyses of different word groups related to gender. Except for nouns referring to relatives, we found a preponderance of male terms in all word groups. The imbalance of male and female pronouns is equivalent to that reported by Carroll, Davies, and Richman (1971) in a frequency count of printed words in children's books in the USA conducted some 40 years ago. The results are discussed in terms of gender inequality in reading materials and the development of social mores and stereotypical ideas.
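A minimal sketch of the kind of frequency tally described above. The CSV file and column names are hypothetical and do not reflect the actual format of the Children's Printed Word Database.

```python
# Minimal sketch: ratio of summed male to female pronoun frequencies from a
# word-frequency table. File and column names are hypothetical assumptions.
import pandas as pd

MALE_PRONOUNS = {"he", "him", "his", "himself"}
FEMALE_PRONOUNS = {"she", "her", "hers", "herself"}

def pronoun_ratio(freq_table: pd.DataFrame) -> float:
    """Ratio of male to female pronoun frequencies.
    Expects columns 'word' and 'frequency' (hypothetical)."""
    words = freq_table.assign(word=freq_table["word"].str.lower())
    male = words.loc[words["word"].isin(MALE_PRONOUNS), "frequency"].sum()
    female = words.loc[words["word"].isin(FEMALE_PRONOUNS), "frequency"].sum()
    return male / female if female else float("inf")

# Example usage with a hypothetical file:
# table = pd.read_csv("word_frequencies.csv")
# print(pronoun_ratio(table))
```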
Abstract:
South Africa is one of the countries most affected by HIV/AIDS: according to 2014 UNAIDS data, 6.8 million South Africans live with HIV/AIDS, corresponding to an 18.9% prevalence rate among adults (15-49 years old). Despite this strong presence of HIV/AIDS in South African society, the illness remains relatively stigmatized and is not openly talked about. The silence about HIV/AIDS maintained in everyday conversations and the superstitions associated with the illness have led to the creation of a taboo language. This study aims at shedding light on how South African users resort to specific emoticons and graphic signs to talk about HIV/AIDS online. For this purpose, 368 Facebook status updates and comments concerning HIV/AIDS and its side effects were analysed. All participants, aged 14-48, lived in Cape Town, in the Cape Flats area, at the time of data collection. The online conversations investigated are mainly in English mixed with Afrikaans and/or Xhosa. In most cases the emoticons and graphic signs display a graphic depiction of the physical (and mental) effects of the illness. These linguistic and semiotic practices employed on Facebook provide insight into how Capetonian users, on the one hand, express solidarity and sympathy with people suffering from HIV/AIDS. On the other hand, the emoticons and graphic signs are used to label and position people affected by HIV/AIDS. Thus, in the South African context, social network sites have become an important space and means for communicating about HIV/AIDS.
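A minimal sketch of how emoticons might be extracted and counted from short posts. The regular expression is a simplified assumption and will miss many of the graphic signs used in the actual corpus.

```python
# Minimal sketch: extracting ASCII emoticons from short posts and counting
# their frequencies. The pattern is a simplified, illustrative assumption.
import re
from collections import Counter

EMOTICON_PATTERN = re.compile(r"[:;=8][\-o\*']?[\)\]\(\[dDpP/\\oO]")

def count_emoticons(posts: list[str]) -> Counter:
    """Count emoticon occurrences across a list of posts."""
    counts: Counter = Counter()
    for post in posts:
        counts.update(EMOTICON_PATTERN.findall(post))
    return counts

sample_posts = [
    "stay strong my friend :(",
    "we are with you :-) :-)",
]
print(count_emoticons(sample_posts))  # Counter({':-)': 2, ':(': 1})
```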