84 results for Practical Knowledge
at Université de Lausanne, Switzerland
Abstract:
OBJECTIVE: To assess the theoretical and practical knowledge of the Glasgow Coma Scale (GCS) among trained air-rescue physicians in Switzerland. METHODS: Prospective anonymous observational study with a specially designed questionnaire. General knowledge of the GCS and its use in a clinical case were assessed. RESULTS: Of the 130 questionnaires sent out, 103 were returned and analyzed (response rate of 79.2%). Theoretical knowledge of the GCS was consistent among registrars, fellows, consultants and private practitioners active in physician-staffed helicopters. The clinical case was scored incorrectly by 38 participants (36.9%). The motor component was evaluated incorrectly in 28 questionnaires (27.2%), and 19 errors were made on the verbal score (18.5%). Errors were made most frequently by registrars (47.5%, p = 0.09), followed by fellows (31.6%, p = 0.67) and private practitioners (18.4%, p = 1.00). Consultants made significantly fewer errors than the other participating physicians (0%, p < 0.05). No statistically significant differences were found between anesthetists, general practitioners, internal medicine trainees or others. CONCLUSION: Although the theoretical knowledge of the GCS by out-of-hospital physicians is correct, significant errors were made in scoring a clinical case. Less experienced physicians had a higher rate of errors. Further emphasis on teaching the GCS is mandatory.
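For reference, the GCS total is the sum of three components: eye opening (1-4), verbal response (1-5) and motor response (1-6), giving a total of 3-15. The sketch below is a minimal, illustrative scoring helper, not material from the study; it simply shows how component ranges can be checked before summing, the step on which the surveyed physicians most often erred (motor and verbal scores).

```python
# Minimal illustrative GCS scoring helper (not part of the study).
# Component ranges follow the standard Glasgow Coma Scale definition.

GCS_RANGES = {"eye": (1, 4), "verbal": (1, 5), "motor": (1, 6)}

def gcs_total(eye: int, verbal: int, motor: int) -> int:
    """Return the GCS sum (3-15) after checking each component's range."""
    for name, value in (("eye", eye), ("verbal", verbal), ("motor", motor)):
        lo, hi = GCS_RANGES[name]
        if not lo <= value <= hi:
            raise ValueError(f"{name} score {value} outside valid range {lo}-{hi}")
    return eye + verbal + motor

if __name__ == "__main__":
    # Example: eyes open to voice (3), confused speech (4), localises pain (5)
    print(gcs_total(eye=3, verbal=4, motor=5))  # -> 12
```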
Abstract:
In my thesis, I defend the idea that Aristotle's notion of phronêsis (practical wisdom) is best understood as a kind of practical knowledge. I interpret phronêsis as the knowledge we display when we make the correct decision to act. In a particular situation that demands a specific response, we have practical knowledge of what to do when we make the best decision possible. This interpretation of phronêsis implies that it is possible to evaluate our decisions epistemically, that is, to evaluate whether we really know what to do or not. Aristotle provides a tool for the evaluation of our decisions, a specific kind of argument that the tradition has called the 'practical syllogism'. The practical syllogism stands as the explanation of our decisions or actions. We invoke it when we want to explain or justify why we act as we do. My claim is that the components of the practical syllogism enable one to evaluate not only the moral character of our actions, but also the epistemic strength of our decisions. Correspondingly, a decision is morally right, i.e. virtuous, if the agent considers the right moral principle to apply, and if he is aware of the relevant circumstances of the situation (moral evaluation). Moreover, a decision displays practical knowledge if the agent meets three conditions (epistemic evaluation): he must desire the moral principle for its own sake; he must have experience in spotting the relevant circumstances of the situation; and he must be able to closely connect these circumstances with the moral principle. This interpretation of phronêsis differs from other, more traditional interpretations in the emphasis it puts on phronêsis as knowledge. Other interpretations focus more on the moral dimension of phronêsis, without taking its epistemic value seriously. By contrast, I take seriously the question of what it takes to genuinely know what one should do in a particular situation.
Abstract:
A criminal investigation requires searching for and interpreting the vestiges of a past criminal act. In this context, the forensic investigator acts as a critical reader of the scene, in search of physical traces that should enable her to tell the story of the offence or crime that allegedly occurred. The challenge of any investigator is to detect and recognise relevant physical traces in order to provide forensic clues for investigation and intelligence purposes. Inspired by this observation, the current research focuses on the following questions: What is a relevant physical trace? And how does the forensic investigator know she is facing one? The interest of such questions lies in providing a definition of a dimension often used in forensic science but never studied in its implications and operations. This doctoral research investigates scientific paths not often explored in forensic science, using semiotic and sociological tools combined with statistical data analysis. The results follow a semiotic track, strongly influenced by Peirce's work, and a second, empirical track, in which investigation data were analysed and forensic investigators were interviewed about their work practices in the field. The semiotic track gives a macroscopic view of the signification process running from the physical trace discovered at the scene to what the investigator evaluates as relevant. The physical trace is perceived in the form of several signs, whose meaning is culturally codified. The reasoning consists of three main steps: 1- What kind of source does the discovered physical trace refer to? 2- What cause/activity is at the origin of this source in the specific context of the case? 3- What story can be told from these observations? Step 3 requires reasoning by creating hypotheses that explain the presence of the discovered trace as the result of an activity, specifically the activity related to the investigated case. Validating these hypotheses depends on their ability to satisfy a rule of relevancy. The last step is the symbolisation of relevancy. The rule consists of two points: the recognition of factual/circumstantial relevancy (Is the link between the trace and the case recognised by the formulated hypothesis?) and of appropriate relevancy (What investment is required to collect and analyse the discovered trace, considering the expected outcome at the investigation/intelligence level?). This meaning-making process is based on observations and on conjectural reasoning subject to many influences. In this study, relevancy in forensic science is presented as a conventional dimension that is symbolised and conditioned by the context, the forensic investigator's practice and her workplace environment (the culture of the place). In short, the current research holds that relevancy results from interactions between parameters of a situational, structural (or organisational) and individual order. The detection, collection and analysis of relevant physical traces at scenes depend on the knowledge and culture mastered by the forensic investigator. In studying the relation between the relevant trace and the forensic investigator, this research introduces the KEE model as a conceptual map that illustrates three major areas of forensic knowledge and culture acquisition involved in the search for and evaluation of the relevant physical trace.
Through the analysis of the investigation data and the interviews, the relationship between these three parameters and relevancy was highlighted. K, for knowing, embodies a relationship to immediate knowledge that gives an overview of the reality at a specific moment; an important point, since relevancy is signified in a context. E, for education, is considered through its relationship with relevancy via a culture that tends to become institutionalised; it represents theoretical knowledge. The parameter E, for experience, relates to relevancy through the adjustment of intervention strategies (i.e. practical knowledge) by each practitioner, who has modulated her work in the light of successes and setbacks, case after case. The two E parameters constitute the library of resources for the semiotic recognition process, and the K parameter ensures the contextualisation required to set up the reasoning and to formulate explanatory hypotheses for the discovered physical traces questioned in their relevancy. This research demonstrates that relevancy is not absolute. It is temporal and contextual; it is a conventional and relative dimension that must be discussed. This is where the whole issue of the meaning of what is relevant to each stakeholder of the investigation process rests. By proposing a step-by-step approach to the meaning-making process from the physical trace to the forensic clue, this study aims to provide a more advanced understanding of the reasoning and its operation, in order to strengthen forensic investigators' training. This doctoral research presents a set of tools relevant to both the pedagogical and the practical aspects of crime scene management, while identifying key influences of individual, structural and situational dimensions.
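The thesis states the relevancy rule conceptually rather than computationally. Purely as an illustration of the two questions it contains (factual/circumstantial relevancy and appropriate relevancy), the sketch below encodes them as a small decision helper; every name, field and threshold is hypothetical and not taken from the thesis.

```python
# Purely illustrative encoding of the two-part relevancy rule described above.
# The thesis presents this rule conceptually; names and values here are hypothetical.
from dataclasses import dataclass

@dataclass
class TraceAssessment:
    hypothesis_links_trace_to_case: bool   # factual/circumstantial relevancy
    expected_investigative_value: float    # anticipated benefit for investigation/intelligence
    collection_and_analysis_cost: float    # investment required to collect and analyse the trace

def is_relevant(assessment: TraceAssessment) -> bool:
    """Apply the two questions of the relevancy rule in sequence."""
    if not assessment.hypothesis_links_trace_to_case:
        return False  # no recognised link between the trace and the case
    # 'Appropriate relevancy': is the expected outcome worth the investment?
    return assessment.expected_investigative_value >= assessment.collection_and_analysis_cost

if __name__ == "__main__":
    shoe_mark = TraceAssessment(True, expected_investigative_value=0.8,
                                collection_and_analysis_cost=0.3)
    print(is_relevant(shoe_mark))  # -> True
```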
Abstract:
Recent advances in CT technologies have significantly improved the clinical utility of cardiac CT. Major efforts have been made to optimize image quality, standardize protocols and limit radiation exposure. Rapid progress in post-processing tools dedicated not only to coronary artery assessment but also to the cardiac cavities, valves and veins has extended the applications of cardiac CT. However, this potential can only be used optimally by taking into account the current appropriate indications for use as well as the current technical limitations. Coronary artery disease and related ischemic cardiomyopathy remain the major applications of cardiac CT and, at the same time, the most complex ones. Integration of specific knowledge is mandatory for optimal use in this area, for asymptomatic as well as symptomatic patients, with specific regard to patients with acute chest pain. This review aims to propose a practical approach to implementing the appropriate indications in routine practice. Emerging indications and future directions are also discussed. Adequate preparation of the patient, training of physicians, and multidisciplinary interaction between the actors involved are the keys to successful implementation of cardiac CT in daily practice.
Abstract:
Since its creation, the Internet has permeated our daily life. The web is omnipresent for communication, research and organization. This use has driven the rapid development of the Internet. Nowadays, the Internet is the biggest container of resources. Information databases such as Wikipedia, Dmoz and the open data available on the net represent a great informational potential for mankind. Easy and free web access is one of the major features characterizing Internet culture. Ten years ago, the web was completely dominated by English. Today, the web community is no longer only English-speaking but is becoming a genuinely multilingual community. The availability of content is intertwined with the availability of logical organizations (ontologies), for which multilinguality plays a fundamental role. In this work we introduce a very high-level logical organization fully based on semiotic assumptions. We thus present the theoretical foundations as well as the ontology itself, named Linguistic Meta-Model. The most important feature of the Linguistic Meta-Model is its ability to support the representation of different knowledge sources developed according to different underlying semiotic theories. This is possible because most knowledge representation schemata, either formal or informal, can be put into the context of the so-called semiotic triangle. In order to show the main characteristics of the Linguistic Meta-Model from a practical point of view, we developed VIKI (Virtual Intelligence for Knowledge Induction). VIKI is a work-in-progress system aiming at exploiting the Linguistic Meta-Model structure for knowledge expansion. It is a modular system in which each module accomplishes a natural language processing task, from terminology extraction to knowledge retrieval. VIKI is a supporting system to the Linguistic Meta-Model; its main task is to give some empirical evidence regarding the use of the Linguistic Meta-Model, without claiming to be exhaustive.
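The abstract does not define the Linguistic Meta-Model formally. As a hedged illustration of the idea it builds on, the sketch below models the classic semiotic triangle (symbol, concept, referent) onto which, according to the abstract, most knowledge representation schemata can be mapped; class and function names such as SemioticTriangle and align are hypothetical and not taken from LMM or VIKI.

```python
# Illustrative sketch of the semiotic triangle used as a common frame for
# heterogeneous knowledge sources; class and field names are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class SemioticTriangle:
    symbol: str     # the linguistic sign, e.g. a term extracted from text
    concept: str    # the thought/reference the symbol evokes
    referent: str   # the thing in the world the concept stands for

def align(entries: list[SemioticTriangle], concept: str) -> list[str]:
    """Collect the symbols from different sources that share the same concept."""
    return [e.symbol for e in entries if e.concept == concept]

if __name__ == "__main__":
    triangles = [
        SemioticTriangle("dog", "domestic canine", "Canis familiaris"),
        SemioticTriangle("cane", "domestic canine", "Canis familiaris"),  # Italian term
    ]
    print(align(triangles, "domestic canine"))  # -> ['dog', 'cane']
```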
Abstract:
Despite the limited research on the effects of altitude (or hypoxic) training interventions on team-sport performance, players from all around the world engaged in these sports are now using altitude training more than ever before. In March 2013, an Altitude Training and Team Sports conference was held in Doha, Qatar, to establish a forum of research and practical insights into this rapidly growing field. The conference concluded with a round-table meeting in which the panellists engaged in focused discussions. This has resulted in the present position statement, designed to highlight some key issues raised during the debates and to integrate the ideas into a shared conceptual framework. The present signposting document has been developed for use by support teams (coaches, performance scientists, physicians, strength and conditioning staff) and other professionals who have an interest in the practical application of altitude training for team sports. After more than four decades of research, there is still no consensus on the optimal strategies to elicit the best results from altitude training in a team-sport population. However, this position statement discusses some recommended strategies for improving the acclimatisation process when training or competing at altitude and for potentially enhancing sea-level performance. It is our hope that this information will be intriguing, balanced and, more importantly, stimulating to the point that it promotes constructive discussion and serves as a guide for future research aimed at advancing the burgeoning body of knowledge in the area of altitude training for team sports.
Abstract:
The liver segmentation system, described by Couinaud, is based on the identification of the three hepatic veins and the plane passing by the portal vein bifurcation. Nowadays, Couinaud's description is the most widely used classification since it is better suited for surgery and more accurate for the localisation and monitoring of intra-parenchymal lesions. Knowledge of the anatomy of the portal and venous system is therefore essential, as is knowledge of the variants resulting from changes occurring during the embryological development of the vitelline and umbilical veins. In this paper, the authors propose a straightforward systematisation of the liver in six steps using several additional anatomical points of reference. These points of reference are simple and quickly identifiable in any radiological examination with section imaging, in order to avoid any mistakes in daily practice. In fact, accurate description impacts on many diagnostic and therapeutic applications in interventional radiology and surgery. This description will allow better preparation for biopsy, portal vein embolisation, transjugular intrahepatic portosystemic shunt, tumour resection or partial hepatectomy for transplantation. Such advance planning will reduce intra- and postoperative difficulties and complications.
Abstract:
Many classifiers achieve high levels of accuracy but have limited applicability in real-world situations because they do not lead to a greater understanding of, or insight into, the way features influence the classification. In areas such as health informatics, a classifier that clearly identifies the influences on classification can be used to direct research and formulate interventions. This research investigates the practical applications of Automated Weighted Sum (AWSum), a classifier that provides accuracy comparable to other techniques whilst providing insight into the data. This is achieved by calculating a weight for each feature value that represents its influence on the class value. The merits of this approach in classification and insight are evaluated on Cystic Fibrosis and Diabetes datasets, with positive results.
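The abstract does not give AWSum's weighting formula, so the following is only a minimal sketch of the general idea: assign each feature value a weight reflecting its association with the class, here a smoothed class-conditional log-odds estimate chosen purely for illustration, and classify a sample by summing the weights of its observed feature values.

```python
# Minimal illustration of a weighted-sum style classifier.
# The weighting scheme (smoothed class log-odds per feature value) is an
# assumption for illustration, not the AWSum formula from the paper.
import math
from collections import defaultdict

def train(samples, labels, positive_label):
    """Return a weight for every (feature_index, value) pair."""
    pos = defaultdict(int)
    neg = defaultdict(int)
    for features, label in zip(samples, labels):
        counts = pos if label == positive_label else neg
        for i, value in enumerate(features):
            counts[(i, value)] += 1
    weights = {}
    for key in set(pos) | set(neg):
        # +1 smoothing keeps weights finite when a value occurs in only one class
        weights[key] = math.log((pos[key] + 1) / (neg[key] + 1))
    return weights

def score(weights, features):
    """Sum the weights of the observed feature values; >0 suggests the positive class."""
    return sum(weights.get((i, v), 0.0) for i, v in enumerate(features))

if __name__ == "__main__":
    X = [("high", "yes"), ("high", "no"), ("low", "no"), ("low", "no")]
    y = ["diabetic", "diabetic", "healthy", "healthy"]
    w = train(X, y, positive_label="diabetic")
    print(score(w, ("high", "yes")))  # positive score -> leans 'diabetic'
```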
Abstract:
Gestures are the first forms of conventional communication that young children develop in order to intentionally convey a specific message. However, at first, infants rarely communicate successfully with their gestures, prompting caregivers to interpret them. Although the role of caregivers in early communication development has been examined, little is known about how caregivers attribute a specific communicative function to infants' gestures. In this study, we argue that caregivers rely on the knowledge about the referent that is shared with infants in order to interpret what communicative function infants wish to convey with their gestures. We videotaped interactions from six caregiver-infant dyads playing with toys when infants were 8, 10, 12, 14, and 16 months old. We coded infants' gesture production and we determined whether caregivers interpreted those gestures as conveying a clear communicative function or not; we also coded whether infants used objects according to their conventions of use as a measure of shared knowledge about the referent. Results revealed an association between infants' increasing knowledge of object use and maternal interpretations of infants' gestures as conveying a clear communicative function. Our findings emphasize the importance of shared knowledge in shaping infants' emergent communicative skills.
Abstract:
Since 1 January 2011, new conditions govern the coverage of weight-loss surgery by basic insurance. These are very significant changes compared with the old criteria. On the one hand, patients with a BMI >= 35 kg/m2 may, without age limit and in the absence of comorbidities, benefit from surgery without a prior request to the medical adviser of the health insurance company concerned. On the other hand, the notion of a minimum caseload is introduced for the first time for centers performing this type of intervention. In addition, certified centers are required to follow standard procedures for patient education and follow-up.
Abstract:
Intrarenal neurotransmission implies the co-release of neuropeptides at the neuro-effector junction with direct influence on parameters of kidney function. The presence of an angiotensin (Ang) II-containing phenotype in catecholaminergic postganglionic and sensory fibers of the kidney, based on immunocytological investigations, has only recently been reported. These angiotensinergic fibers display a distinct morphology and intrarenal distribution, suggesting anatomical and functional subspecialization linked to neuronal Ang II-expression. This review discusses the present knowledge concerning these fibers, and their significance for renal physiology and the pathogenesis of hypertension in light of established mechanisms. The data suggest a new role of Ang II as a co-transmitter stimulating renal target cells or modulating nerve traffic from or to the kidney. Neuronal Ang II is likely to be an independent source of intrarenal Ang II. Further physiological experimentation will have to explore the role of the angiotensinergic renal innervation and integrate it into existing concepts.
Abstract:
The recent developments in high magnetic field 13C magnetic resonance spectroscopy, with improved localization and shimming techniques, have led to important gains in sensitivity and spectral resolution of 13C in vivo spectra in the rodent brain, enabling the separation of several 13C isotopomers of glutamate and glutamine. In this context, the assumptions used in spectral quantification might have a significant impact on the determination of the 13C concentrations and the related metabolic fluxes. In this study, the time-domain spectral quantification algorithm AMARES (advanced method for accurate, robust and efficient spectral fitting) was applied to 13C magnetic resonance spectroscopy spectra acquired in the rat brain at 9.4 T following infusion of [1,6-13C2] glucose. Using both Monte Carlo simulations and in vivo data, the goals of this work were: (1) to validate the quantification of in vivo 13C isotopomers using AMARES; (2) to assess the impact of the prior knowledge on the quantification of in vivo 13C isotopomers using AMARES; (3) to compare AMARES and LCModel (linear combination of model spectra) for the quantification of in vivo 13C spectra. AMARES led to accurate and reliable 13C spectral quantification, similar to that obtained using LCModel, when the frequency shifts, J-coupling constants and phase patterns of the different 13C isotopomers were included as prior knowledge in the analysis.
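AMARES fits the time-domain signal as a sum of exponentially damped sinusoids, with prior knowledge supplied as constraints (for example, fixed frequency shifts, J-coupling patterns and phases). The sketch below is a simplified, hypothetical illustration of that idea rather than the actual AMARES implementation: two peaks with known, fixed frequencies are fitted for amplitude and a common damping only.

```python
# Simplified illustration of time-domain fitting with prior knowledge,
# in the spirit of AMARES (not the actual algorithm). Frequencies are
# fixed as "prior knowledge"; only amplitudes and a common damping are fitted.
import numpy as np
from scipy.optimize import least_squares

t = np.arange(2048) * 2.5e-4            # time axis (s), hypothetical dwell time
freqs_hz = np.array([120.0, 155.0])     # known frequency shifts (prior knowledge)

def model(params):
    a1, a2, damping = params
    signal = np.zeros_like(t, dtype=complex)
    for amp, f in zip((a1, a2), freqs_hz):
        signal += amp * np.exp((-damping + 2j * np.pi * f) * t)
    return signal

def residuals(params, data):
    diff = model(params) - data
    return np.concatenate([diff.real, diff.imag])  # least_squares needs real residuals

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = model([1.0, 0.4, 15.0])
    noisy = truth + 0.05 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))
    fit = least_squares(residuals, x0=[0.5, 0.5, 10.0], args=(noisy,))
    print(fit.x)  # fitted amplitudes and damping, close to [1.0, 0.4, 15.0]
```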
Abstract:
Vaccines have been used as a successful tool in medicine to control many major diseases. In spite of this, vaccines today cover only a handful of all infectious diseases. There is therefore a pressing demand for improvements to existing vaccines, with particular reference to higher efficacy and undisputed safety profiles. To this effect, as an alternative to available vaccine technologies, there has been a drive to develop vaccine candidate polypeptides by chemical synthesis. In our laboratory, we have recently developed a technology to manufacture long synthetic peptides of up to 130 residues that are correctly folded and biologically active. This paper discusses the advantages of the molecularly defined, long synthetic peptide approach in the context of vaccine design, development and use in human vaccination.
Abstract:
Acute cardiovascular dysfunction occurs perioperatively in more than 20% of cardiosurgical patients, yet current acute heart failure (HF) classification is not applicable to this period. Indicators of major perioperative risk include unstable coronary syndromes, decompensated HF, significant arrhythmias and valvular disease. Clinical risk factors include history of heart disease, compensated HF, cerebrovascular disease, presence of diabetes mellitus, renal insufficiency and high-risk surgery. EuroSCORE reliably predicts perioperative cardiovascular alteration in patients aged less than 80 years. Preoperative B-type natriuretic peptide level is an additional risk stratification factor. Aggressively preserving heart function during cardiosurgery is a major goal. Volatile anaesthetics and levosimendan seem to be promising cardioprotective agents, but large trials are still needed to assess the best cardioprotective agent(s) and optimal protocol(s). The aim of monitoring is early detection and assessment of mechanisms of perioperative cardiovascular dysfunction. Ideally, volume status should be assessed by 'dynamic' measurement of haemodynamic parameters. Assess heart function first by echocardiography, then using a pulmonary artery catheter (especially in right heart dysfunction). If volaemia and heart function are in the normal range, cardiovascular dysfunction is very likely related to vascular dysfunction. In treating myocardial dysfunction, consider the following options, either alone or in combination: low-to-moderate doses of dobutamine and epinephrine, milrinone or levosimendan. In vasoplegia-induced hypotension, use norepinephrine to maintain adequate perfusion pressure. Exclude hypovolaemia in patients under vasopressors, through repeated volume assessments. Optimal perioperative use of inotropes/vasopressors in cardiosurgery remains controversial, and further large multinational studies are needed. Cardiosurgical perioperative classification of cardiac impairment should be based on time of occurrence (precardiotomy, failure to wean, postcardiotomy) and haemodynamic severity of the patient's condition (crash and burn, deteriorating fast, stable but inotrope dependent). In heart dysfunction with suspected coronary hypoperfusion, an intra-aortic balloon pump is highly recommended. A ventricular assist device should be considered before end organ dysfunction becomes evident. Extra-corporeal membrane oxygenation is an elegant solution as a bridge to recovery and/or decision making. This paper offers practical recommendations for management of perioperative HF in cardiosurgery based on European experts' opinion. It also emphasizes the need for large surveys and studies to assess the optimal way to manage perioperative HF in cardiac surgery.