912 results for Veracity Judgment
Abstract:
Over the last decade, increasing evidence has accumulated for cognitive functions of the cerebellum during development and learning. Posterior fossa malformations such as cerebellar hypoplasia or Joubert syndrome are known to be related to developmental problems to a moderate to marked extent. More detailed analyses reveal specific deficits in attention, processing speed, visuospatial functions, and language. A study of Dandy-Walker syndrome reports a relationship between abnormalities of vermis lobulation and developmental problems. Further lobulation or volume abnormalities of the cerebellum and/or vermis can be detected in disorders such as fragile X syndrome, Down syndrome, Williams syndrome, and autism. Neuropsychological studies reveal a relation between dyslexia and attention deficit disorder and cerebellar function. These functional studies are supported by structural abnormalities found on neuroimaging in these disorders. Acquired cerebellar or vermis atrophy has been found in groups of children with developmental problems related to prenatal alcohol exposure or extreme prematurity. Focal lesions during childhood or adolescence, such as cerebellar tumor or stroke, are also related to neuropsychological abnormalities, which are most pronounced in visuospatial, language, and memory functions. In addition, cerebellar atrophy was shown to be a poor prognostic factor for cognitive outcome in children after brain trauma and leukemia. In ataxia telangiectasia, a neurodegenerative disorder affecting primarily the cerebellar cortex, a reduced verbal intelligence quotient and impaired judgment of duration hint at the importance of the cerebellum in cognition. In conclusion, the cerebellum seems to play an important role in many higher cognitive functions, especially in learning, and the evidence suggests that the earlier the adverse influence, the more pronounced the problems.
Abstract:
The present study was conducted to determine the effects of different variables on the perception of vehicle speed in a driving simulator. The motivations of the study include validating the Michigan Technological University Human Factors and Systems Lab driving simulator, obtaining a better understanding of what influences speed perception in a virtual environment, and identifying how to improve speed perception in future simulations involving driver performance measures. Using a fixed-base driving simulator, two experiments were conducted: the first evaluated the effects of subject gender, roadway orientation, field of view, barriers along the roadway, opposing traffic speed, and subject speed-judgment strategies on speed estimation; the second evaluated all of these variables as well as feedback training through use of the speedometer during a practice run. A mixed-model procedure (mixed-model ANOVA) in SAS® 9.2 was used to determine the significance of these variables in relation to subject speed estimates, as both between- and within-subject variables were analyzed. Subject gender, roadway orientation, feedback training, and the type of judgment strategy were all found to significantly affect speed perception. Speed perception in a driving simulator was significantly improved by using curved roadways, feedback training, and speed-judgment strategies based on road lines and speed-limit experience.
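As a rough illustration of the analysis described above, the sketch below fits a comparable mixed model in Python with statsmodels rather than SAS; the data file, column names, and model formula are assumptions for illustration and are not taken from the study.

```python
# Hypothetical sketch of a mixed-model analysis of simulator speed estimates,
# analogous to the SAS mixed-model ANOVA described above. File and column
# names are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("speed_estimates.csv")  # one row per trial (assumed layout)

# Between-subject factor (gender) and within-subject factors (orientation,
# field of view, feedback training, judgment strategy) as fixed effects;
# a random intercept per subject accounts for repeated measures.
model = smf.mixedlm(
    "speed_estimate ~ gender + orientation + field_of_view + feedback + strategy",
    data,
    groups=data["subject"],
)
result = model.fit()
print(result.summary())
```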
Abstract:
BACKGROUND: We surveyed Swiss headhunters to determine the influence of strabismus on the ability to obtain employment. METHODS: Of 31 randomly selected Swiss headhunters, 20 could be interviewed using a validated questionnaire. RESULTS: Forty-seven percent of the headhunters judged that strabismic subjects have more difficulty obtaining a job. Gender had no influence on discrimination (p > 0.1). When asked about six facial disfigurements, strabismus was found to have the second largest negative impact on employment, directly after acne. Strabismus was estimated to decrease the attractiveness of job applicants (p < 0.0001) and to have a negative impact on the overall judgment of a potential employer (p < 0.05). CONCLUSIONS: Visible strabismus negatively influences the ability to obtain a job. Because of its impact on employability, we believe that strabismus surgery in adults cannot be considered to be merely a cosmetic procedure.
Abstract:
BACKGROUND: Bleeding is a frequent complication during surgery. The intraoperative administration of blood products, including packed red blood cells, platelets, and fresh frozen plasma (FFP), is often life saving. Complications of blood transfusions contribute considerably to perioperative costs, and blood product resources are limited. Consequently, strategies to optimize the decision to transfuse are needed. Bleeding during surgery is a dynamic process and may result in major blood loss and coagulopathy due to dilution and consumption. The indication for transfusion should be based on reliable coagulation studies. While hemoglobin levels and platelet counts are available within 15 minutes, standard coagulation studies require one hour. Therefore, the decision to administer FFP often has to be made in the absence of current coagulation data. Point-of-care testing of prothrombin time ensures that one major parameter of coagulation is available in the operating theatre within minutes. It is fast, easy to perform, and inexpensive, and it may enable physicians to rationally determine the need for FFP. METHODS/DESIGN: The objective of the POC-OP trial is to determine the effectiveness of point-of-care prothrombin time testing in reducing the administration of FFP. It is a patient- and assessor-blinded, single-center, randomized controlled, parallel-group trial in 220 patients aged between 18 and 90 years undergoing major surgery (any type, except cardiac surgery and liver transplantation) with an estimated intraoperative blood loss exceeding 20% of the calculated total blood volume or a requirement for FFP according to the judgment of the physicians in charge. Patients are randomized to usual care plus point-of-care prothrombin time testing or to usual care alone without point-of-care testing. The primary outcome is the relative risk of receiving any FFP perioperatively. The inclusion of 110 patients per group will yield more than 80% power to detect a clinically relevant relative risk of 0.60 for receiving FFP in the experimental group compared with the control group. DISCUSSION: Point-of-care prothrombin time testing in the operating theatre may reduce the administration of FFP considerably, which in turn may decrease the costs and complications usually associated with the administration of blood products. TRIAL REGISTRATION: NCT00656396.
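As a rough, hedged check of the quoted sample size, the sketch below runs a standard two-proportion power calculation; the control-group FFP rate is not stated in the abstract, so the 50% used here is purely an assumed value for illustration.

```python
# Illustrative power check for 110 patients per group and a target relative
# risk of 0.60 for receiving FFP. The control-group FFP rate (50%) and the
# two-sided alpha of 0.05 are assumptions, not values from the protocol.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

p_control = 0.50                     # assumed FFP rate under usual care
p_experimental = 0.60 * p_control    # relative risk of 0.60

effect_size = proportion_effectsize(p_control, p_experimental)
power = NormalIndPower().power(effect_size=effect_size, nobs1=110,
                               alpha=0.05, ratio=1.0,
                               alternative="two-sided")
print(f"approximate power: {power:.2f}")  # ~0.86 under these assumptions
```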
Abstract:
There is no accepted way of measuring prothrombin time without delay for patients undergoing major surgery who are at risk of intraoperative dilution and consumption coagulopathy due to bleeding and volume replacement with crystalloids or colloids. In these situations, decisions to administer fresh frozen plasma and procoagulant drugs have to rely on clinical judgment. Point-of-care devices are considerably faster than standard laboratory methods. In this study we assessed the accuracy of a point-of-care (PoC) device for measuring prothrombin time compared with the standard laboratory method. Patients undergoing major surgery and intensive care unit patients were included. PoC prothrombin time was measured with the CoaguChek XS Plus (Roche Diagnostics, Switzerland). PoC and reference tests were performed independently and interpreted under blinded conditions. Using a cut-off prothrombin time of 50%, we calculated diagnostic accuracy measures, plotted a receiver operating characteristic (ROC) curve, and tested for equivalence between the two methods. PoC sensitivity and specificity were 95% (95% CI 77%, 100%) and 95% (95% CI 91%, 98%), respectively. The negative likelihood ratio was 0.05 (95% CI 0.01, 0.32). The positive likelihood ratio was 19.57 (95% CI 10.62, 36.06). The area under the ROC curve was 0.988. Equivalence between the two methods was confirmed. The CoaguChek XS Plus is a rapid and highly accurate test compared with the reference test. These findings suggest that PoC testing will be useful for monitoring intraoperative prothrombin time when coagulopathy is suspected, and it could lead to a more rational use of expensive and limited blood bank resources.
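The reported likelihood ratios follow directly from sensitivity and specificity; the small check below uses the rounded 95% figures (the paper's exact values, 19.57 and 0.05, come from the unrounded underlying counts, which are not given in the abstract).

```python
# Worked check: likelihood ratios implied by the rounded sensitivity and
# specificity reported above (95% each).
sensitivity = 0.95
specificity = 0.95

lr_positive = sensitivity / (1 - specificity)   # ~19, reported as 19.57
lr_negative = (1 - sensitivity) / specificity   # ~0.05, as reported

print(f"LR+ = {lr_positive:.1f}, LR- = {lr_negative:.3f}")
```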
Abstract:
A general theory of violence may only be possible in the sense of a meta-theoretical framework. As such, it should comprise a parsimonious set of general mechanisms that operate across various manifestations of violence. In order to identify such mechanisms, a general theory of violence needs to consider equally all manifestations of violence, in all societies, and at all times. Starting from this assumption, this paper argues that three theoretical approaches may be combined in a non-contradictory way to understand violence as goal-directed instrumental behaviour: a theory of the judgment and decision-making processes operating in the situations that give rise to violence; a theory of the evolutionary processes that have resulted in universal cognitive and emotional mechanisms associated with violence; and a theory of the way in which social institutions structure violence by selectively enhancing its effectiveness for some purposes (i.e., the legitimate use of force) and controlling other types of violence (i.e., crime). To illustrate the potential of such a perspective, the paper then examines some general mechanisms that may explain many different types of violence. In particular, it examines how the mechanisms of moralistic aggression (Trivers) and moral disengagement (Bandura) may account for many different types of violence.
Abstract:
Increasing demand for marketing accountability requires an efficient allocation of marketing expenditures. Managers who know the elasticity of their marketing instruments can allocate their budgets optimally. Meta-analyses offer a basis for deriving benchmark elasticities for advertising. Although they provide a variety of valuable insights, a major shortcoming of prior meta-analyses is that they report only generalized results, as the disaggregated raw data are not made available. This problem is highly relevant because the coding of empirical studies, at least to a certain extent, involves subjective judgment. For this reason, meta-studies would be more valuable if researchers and practitioners had access to disaggregated data allowing them to conduct further analyses of individual interest, e.g., at the product level. We are the first to address this gap by providing (1) an advertising elasticity database (AED) and (2) empirical generalizations about advertising elasticities and their determinants. Our findings indicate that the average current-period advertising elasticity is 0.09, which is substantially smaller than the value of 0.12 recently reported by Sethuraman, Tellis, and Briesch (2011). Furthermore, our meta-analysis reveals a wide range of significant determinants of advertising elasticity. For example, we find that advertising elasticities are higher (i) for hedonic and experience goods than for other goods; (ii) for new than for established goods; (iii) when advertising is measured in gross rating points (GRP) instead of absolute terms; and (iv) when the lagged dependent or lagged advertising variable is omitted.
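To make the headline figure concrete: under a constant-elasticity (log-log) response, an elasticity of 0.09 implies that a 10% increase in advertising spend is associated with roughly a 0.9% increase in sales. The toy calculation below uses invented baseline numbers purely for illustration.

```python
# Toy illustration of a current-period advertising elasticity of 0.09 under a
# constant-elasticity (log-log) sales response; all baseline figures are invented.
elasticity = 0.09
baseline_sales = 1_000_000.0   # assumed baseline sales
ad_multiplier = 1.10           # assumed 10% increase in advertising spend

new_sales = baseline_sales * ad_multiplier ** elasticity
lift_pct = (new_sales / baseline_sales - 1) * 100
print(f"expected sales lift: {lift_pct:.2f}%")  # about 0.9% for a 10% ad increase
```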
Abstract:
On October 10, 2013, the Chamber of the European Court of Human Rights (ECtHR) handed down a judgment (Delfi v. Estonia) upholding Estonia's application of a law which, as interpreted, held a news portal liable for the defamatory comments of its users. Among the considerations that led the Court to find no violation of freedom of expression in this particular case were, above all, the inadequacy of the automatic screening system adopted by the website and the users' option to post their comments anonymously (i.e., without prior registration via email), which in the Court's view rendered ineffective the protection conferred on the injured party via direct legal action against the authors of the comments. Drawing on the implications of this (not yet final) ruling, this paper discusses a few questions that the tension between the risk of wrongful use of information and the right to anonymity generates for the development of Internet communication, and examines the role that intermediary liability legislation can play in managing this tension.
Abstract:
New tools for editing digital images, music, and films have opened up new possibilities for wider circles of society to engage in ’artistic’ activities of varying quality. User-generated content has produced a plethora of new forms of artistic expression. One type of user-generated content is the mashup. Mashups are compositions that combine existing works, often protected by copyright, and transform them into new original creations. The European legislative framework has not yet reacted to the copyright problems provoked by mashups. Neither under the US fair use doctrine, nor under the strict corset of limitations and exceptions in Art. 5(2)-(3) of the Copyright Directive (2001/29/EC), have mashups found room to develop in a safe legal environment. This contribution analyzes the current European legal framework and identifies its insufficiencies with regard to enabling a legal mashup culture. For the comparison with the US fair use approach, in particular the parody defense, a recent CJEU judgment serves as a comparative example. Finally, an attempt is made to suggest solutions for the European legislator, based on the policy proposals of the EU Commission’s “Digital Agenda” and more recent policy documents (e.g., “On Content in the Digital Market”, “Licenses for Europe”). In this context, a distinction is made between non-commercial mashup artists and the emerging commercial mashup scene.
Abstract:
BACKGROUND: The objective of this study was to compare transtelephonic ECG every 2 days and serial 7-day Holter monitoring as two methods of follow-up for judging ablation success after catheter ablation of atrial fibrillation (AF). Patients with highly symptomatic AF are increasingly treated with catheter ablation. Several methods of follow-up have been described, and judgment of ablation success often relies on patients' symptoms. However, the optimal follow-up strategy for objectively detecting most AF recurrences is still unclear. METHODS: Thirty patients with highly symptomatic AF were selected for circumferential pulmonary vein ablation. During follow-up, a transtelephonic ECG was transmitted once every 2 days for half a year. Additionally, a 7-day Holter was recorded before ablation, after ablation, and after 3 and 6 months, respectively. With both procedures, symptoms and the actual rhythm were correlated thoroughly. RESULTS: A total of 2,600 transtelephonic ECGs were collected, 216 of which showed AF; 25% of those episodes were asymptomatic. On Kaplan-Meier analysis, 45% of the patients with paroxysmal AF were still in continuous sinus rhythm (SR) after 6 months. Simulating a follow-up based on symptomatic recurrences only, that number would have increased to 70%. Using serial 7-day ECG, 113 Holter recordings with over 18,900 hours of ECG data were acquired. After 6 months, the percentage of patients classified as free from AF was 50%. Of the patients with recurrences, 30-40% were completely asymptomatic. The percentage of asymptomatic AF episodes increased stepwise from 11% before ablation to 53% at 6 months after ablation. CONCLUSIONS: The success rate in terms of freedom from AF was 70% on a symptom-only-based follow-up; using serial 7-day Holter it decreased to 50%, and on transtelephonic monitoring to 45%, respectively. Transtelephonic ECG and serial 7-day Holter were equally effective for objectively determining long-term success and detecting asymptomatic patients.
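For readers unfamiliar with the method, the sketch below shows the kind of Kaplan-Meier estimate of freedom from AF used above, here with the lifelines package and entirely invented follow-up data (the study's patient-level data are not reproduced in the abstract).

```python
# Minimal Kaplan-Meier sketch of "freedom from AF" over a 6-month follow-up,
# analogous to the analysis described above; the durations and events below
# are invented example data, not the study's data.
from lifelines import KaplanMeierFitter

# Days to first documented AF recurrence, or 180 if none was observed.
durations = [30, 75, 180, 180, 120, 180, 45, 180, 90, 180]
# 1 = recurrence documented, 0 = censored (still in sinus rhythm at 6 months).
events = [1, 1, 0, 0, 1, 0, 1, 0, 1, 0]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="freedom from AF")
print(kmf.survival_function_)   # step-wise estimate over follow-up
print(kmf.predict(180))         # estimated freedom from AF at 6 months
```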
Abstract:
Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands particularly to gain from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O’Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008, among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and allowing access to the relevant methodological processes in related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars to choose between making a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) and making no judgment at all (the unweighted phylogenetic approach). Some basis for judging the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for a statistical, empirical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have been different from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analyzing one or more stemma hypotheses against the variation model. We apply this method to three ‘artificial traditions’ (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced to varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding ‘trivial’ variation such as orthographic and spelling changes from stemmatic analysis.
Abstract:
We investigated the role of horizontal body motion in the processing of numbers. We hypothesized that leftward self-motion leads to shifts in spatial attention and therefore facilitates the processing of small numbers, and, conversely, that rightward self-motion facilitates the processing of large numbers. Participants were displaced by means of a motion platform during a parity judgment task. We found a systematic influence of self-motion direction on number processing, suggesting that the processing of numbers is intertwined with the perception of self-motion. The results differed from known spatial-numerical compatibility effects in that self-motion exerted a differential influence on the inner and outer numbers of the given interval. The results highlight the involvement of sensory body-motion information in higher-order spatial cognition.
Abstract:
Purpose: Skeletal-related events represent a substantial burden for patients with advanced cancer. Randomized controlled studies suggested superiority of denosumab over zoledronic acid in the prevention of skeletal-related events in patients with metastatic cancer, with a favorable safety profile. Experts gathered at the 2012 Skeletal Care Academy in Istanbul to put forward practical recommendations, based on current evidence, for the use of denosumab in patients with bone metastases of lung cancer. Recommendations: Based on current evidence, the use of denosumab in lung cancer patients with confirmed bone metastases is recommended. It is important to note that clinical judgment should take into consideration the patient’s general performance status, overall prognosis, and life expectancy. Currently, the adverse event profile reported for denosumab includes hypocalcemia and the infrequent occurrence of osteonecrosis of the jaw. Therefore, routine calcium and vitamin D supplementation, along with a dental examination prior to denosumab initiation, are recommended. There is no evidence of renal function impairment due to denosumab administration. At present, there is no rationale to discourage concomitant use of denosumab and surgery or radiotherapy.
Drug-related emergency department visits by elderly patients presenting with non-specific complaints
Abstract:
BACKGROUND: Since drug-related emergency department (ED) visits are common among older adults, the objectives of our study were to identify the frequency of drug-related problems (DRPs) among patients presenting to the ED with non-specific complaints (NSC), such as generalized weakness, and to evaluate the responsible drug classes. METHODS: Delayed-type cross-sectional diagnostic study with a prospective 30-day follow-up in the ED of the University Hospital Basel, Switzerland. From May 2007 until April 2009, all non-trauma patients presenting to the ED with an Emergency Severity Index (ESI) of 2 or 3 were screened and included if they presented with non-specific complaints. After complete 30-day follow-up had been obtained, two outcome assessors reviewed all available information, judged whether the initial presentation was a DRP, and compared their judgment with the initial ED diagnosis. Acute morbidity ("serious condition") was allocated to individual cases according to predefined criteria. RESULTS: The study population consisted of 633 patients with NSC. Median age was 81 years (IQR 72-87), and the mean Charlson comorbidity index was 2.5 (IQR 1-4). DRPs were identified in 77 of the 633 cases (12.2%). At the initial assessment, only 40% of the DRPs were correctly identified. Sixty-four of the 77 identified DRPs (83%) fulfilled the criteria for a "serious condition". Polypharmacy and certain drug classes (thiazides, antidepressants, benzodiazepines, anticonvulsants) were associated with DRPs. CONCLUSION: Elderly patients with non-specific complaints need to be screened systematically for drug-related problems. TRIAL REGISTRATION: ClinicalTrials.gov: NCT00920491.