140 results for CLASSICAL-THEORY


Relevance: 20.00%

Abstract:

Purpose: To evaluate the safety and efficacy of Gamma Knife surgery (GKS) as a second treatment for classical trigeminal neuralgia (CTN), and the influence of prior microvascular decompression (MVD). Methods: Between July 1992 and November 2010, 737 patients were operated on with GKS for trigeminal neuralgia and prospectively evaluated at Timone University Hospital in Marseille, France. Among these, 54 patients had a previous history of MVD. Radiosurgery using a Gamma Knife (model B, C or Perfexion) was performed on the basis of both MR and CT targeting. A single 4 mm isocenter was positioned in the cisternal portion of the trigeminal nerve, at a median distance of 7.6 mm (range 3.9-11.9) anterior to the emergence of the nerve (retrogasserian target). A median maximum dose of 85 Gy (range 70-90) was delivered. Here, the 45 patients with previous MVD and a follow-up longer than one year are evaluated (patients with megadolichobasilar artery compression or multiple sclerosis were excluded). Results: The median age in this series was 56.75 years (range 28.09-82.39). The median follow-up period was 39.48 months (range 14.10-144.65). All patients had a past history of surgery, with at least one previously failed MVD, as well as radiofrequency lesioning (RFL) in 16 patients (35.6%), balloon microcompression in 7 (15.6%) and glycerol rhizotomy in 1 (2.2%). Thirty-five patients (77.8%) were initially pain free after GKS, within a median time of 14 days (range 0-180). Patients in this group had a lower probability of being pain free than our overall population of patients with essential trigeminal neuralgia and no previous MVD (p=0.010, hazard ratio 0.64). Their probability of remaining pain free at 3, 5, 7 and 10 years was 66.5%, 59.1%, 59.1% and 44.3%, respectively. Twelve (34.3%) of the initially pain-free patients experienced a recurrence, with a median delay of 31.21 months (range 3.40-89.93). The actuarial rate of hypoesthesia at 1 year was 9.1% and remained stable up to 12 years, with a median delay of onset of 8 months (range 8-8). Conclusions: Retrogasserian GKS proved to be safe and effective in the long term, even after previously failed MVD. Although the initial pain-free rate was only 77.8%, toxicity was low, with only 9.1% hypoesthesia. No patient reported bothersome hypoesthesia. The long-term probability of maintaining pain relief was 44.3% at 10 years.
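The actuarial figures quoted above (probability of remaining pain free at 3, 5, 7 and 10 years, actuarial hypoesthesia rate) are the kind of estimates produced by a Kaplan-Meier analysis of time-to-event data. A minimal illustrative sketch of such an estimate, assuming the lifelines package and entirely invented follow-up data rather than the study's own, might look like this:

```python
# Illustrative only: Kaplan-Meier estimate of the probability of remaining
# pain free after GKS, using the lifelines package and made-up follow-up data.
from lifelines import KaplanMeierFitter

# Months from GKS to pain recurrence, or to last follow-up if no recurrence
durations = [3.4, 12.0, 31.2, 45.0, 60.0, 89.9, 120.0, 144.6]   # hypothetical
recurred  = [1,   0,    1,    0,    0,    1,    0,     0]        # 1 = recurrence observed

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=recurred, label="pain free after GKS")

# Actuarial probability of remaining pain free at 3, 5, 7 and 10 years
print(kmf.predict([36, 60, 84, 120]))
```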

Relevance: 20.00%

Abstract:

Validation is arguably the bottleneck in the diffusion magnetic resonance imaging (MRI) community. This paper evaluates and compares 20 algorithms for recovering the local intra-voxel fiber structure from diffusion MRI data, based on the results of the "HARDI reconstruction challenge" organized in the context of the ISBI 2012 conference. The evaluated methods encompass a mixture of classical techniques well known in the literature, such as diffusion tensor, Q-Ball and diffusion spectrum imaging, algorithms inspired by the recent theory of compressed sensing, as well as brand-new approaches proposed for the first time at this contest. To quantitatively compare the methods under controlled conditions, two datasets with known ground truth were synthetically generated, and two main criteria were used to evaluate the quality of the reconstructions in every voxel: correct assessment of the number of fiber populations and angular accuracy of their orientations. This comparative study investigates the behavior of every algorithm under varying experimental conditions and highlights the strengths and weaknesses of each approach. This information can be useful not only for enhancing current algorithms and developing the next generation of reconstruction methods, but also for assisting physicians in choosing the most suitable technique for their studies.
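One of the two evaluation criteria above, angular accuracy, reduces in each voxel to the angle between an estimated fiber orientation and the ground-truth orientation (fiber orientations being sign-invariant). A minimal sketch of that computation, as an assumed formulation rather than the challenge's official scoring code:

```python
# Illustrative sketch: angular error between an estimated fiber direction and
# the ground-truth direction in a voxel (directions are sign-invariant).
import numpy as np

def angular_error_deg(est, truth):
    """Angle in degrees between two fiber orientations, ignoring sign."""
    est = np.asarray(est, dtype=float)
    truth = np.asarray(truth, dtype=float)
    cos_angle = abs(np.dot(est, truth)) / (np.linalg.norm(est) * np.linalg.norm(truth))
    return np.degrees(np.arccos(np.clip(cos_angle, 0.0, 1.0)))

# Example: estimated direction 10 degrees off the ground truth
truth = np.array([1.0, 0.0, 0.0])
est = np.array([np.cos(np.radians(10)), np.sin(np.radians(10)), 0.0])
print(angular_error_deg(est, truth))  # ~10.0
```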

Relevance: 20.00%

Abstract:

Occupational exposure assessment is an important stage in the management of chemical exposures. Few direct measurements are carried out in workplaces, and exposures are often estimated on the basis of expert judgement. There is therefore a major need for simple, transparent tools to help occupational health specialists define exposure levels. The aim of the present research is to develop and improve modelling tools for predicting exposure levels. In a first step, a survey was carried out among occupational hygienists in Switzerland to define their expectations of modelling tools (types of results, models and potential observable parameters). It was found that exposure models are rarely used in practice in Switzerland and that exposures are mainly estimated from the expert's past experience. Moreover, chemical emissions and their dispersion near the source were considered key parameters. Experimental and modelling studies were also performed in specific cases to test the flexibility and limitations of existing tools. In particular, models were applied to assess occupational exposure to carbon monoxide in different situations, and the results were compared with exposure levels reported in the literature for similar situations. Exposure to waterproofing sprays was also studied as part of an epidemiological study on a Swiss cohort; here, laboratory investigations were undertaken to characterize the emission rate of waterproofing overspray, and a classical two-zone model was used to assess aerosol dispersion in the near and far field during spraying. Further experiments were carried out to better understand the processes of emission and dispersion of tracer compounds, focusing on the characterization of near-field exposure. An experimental set-up was developed to perform simultaneous measurements at several points of an exposure chamber with direct-reading instruments. It was found that, from a statistical point of view, the compartmental theory makes sense, although the attribution to a given compartment could not be made on the basis of simple geometric considerations. In a further step, the experimental data were complemented by observations made in about 100 different workplaces, combining exposure measurements with observations of predefined determinants. These data were used to improve an existing two-compartment exposure model: a tool was developed to include specific determinants in the choice of compartment, thus improving the reliability of the predictions. All these investigations helped improve our understanding of modelling tools and identify their limitations. The integration of determinants that are more accessible and better suited to experts' needs should encourage the use of such tools in field practice. Moreover, by increasing the quality of modelling tools, this research will not only encourage their systematic use, but may also improve exposure assessments based on expert judgement and, therefore, the protection of workers' health.
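The classical two-zone model mentioned above treats the workplace as a small, well-mixed near field around the source that exchanges air with the surrounding far field. As a rough illustration of the standard steady-state form of that model (the parameter values below are assumptions, not data from this work):

```python
# Illustrative sketch of the classical two-zone (near-field / far-field) model
# at steady state. Parameter values are assumptions for illustration only.
def two_zone_steady_state(G, Q, beta):
    """
    G    : contaminant emission rate (mg/min)
    Q    : room supply/exhaust airflow rate (m^3/min)
    beta : airflow exchanged between near field and far field (m^3/min)
    Returns (C_near, C_far) steady-state concentrations in mg/m^3.
    """
    c_far = G / Q                 # far-field concentration
    c_near = G / Q + G / beta     # near field adds the inter-zone term
    return c_near, c_far

# Hypothetical example: 100 mg/min source, 20 m^3/min room ventilation,
# 5 m^3/min exchange between the two zones
print(two_zone_steady_state(G=100.0, Q=20.0, beta=5.0))  # (25.0, 5.0) mg/m^3
```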

Relevance: 20.00%

Abstract:

The ancient Greek medical theory based on the balance or imbalance of humors disappeared in the Western world, but survives elsewhere. Is this survival related to a certain degree of health care efficiency? We explored this hypothesis through a study of classical Greco-Arab medicine in Mauritania. Modern general practitioners evaluated the safety and effectiveness of classical Arabic medicine in a Mauritanian traditional clinic, using a prognosis/follow-up method allowing the following comparisons: (i) actual patient progress (clinical outcome) compared with what the traditional 'tabib' had anticipated (prognostic ability), and (ii) patient progress compared with what could be hoped for if the patient were treated by a modern physician in the same neighborhood. The practice appeared fairly safe and, on average, clinical outcome was similar to what could be expected with modern medicine. In some cases, patient progress was better than expected. The ability to correctly predict an individual's clinical outcome did not seem to be better under either the modern or the Greco-Arab theory. Weekly joint meetings of modern and traditional practitioners were spontaneously organized with a modern health centre in the neighborhood. Practitioners of a different medical system can thus predict patient progress. For the patient, avoiding false expectations of health care and ensuring appropriate referral may be what matters most. Prognosis and outcome studies such as the one presented here may help to develop institutions where patients find support in making their choices, not only among several treatment options, but also among several medical systems.
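The prognosis/follow-up method described above essentially measures agreement between the outcome the practitioner anticipated and the outcome actually observed at follow-up. A minimal sketch of such an agreement computation, with hypothetical outcome categories and data rather than the study's:

```python
# Illustrative sketch: agreement between anticipated and observed outcomes,
# using hypothetical three-level categories (worse / same / better).
from sklearn.metrics import cohen_kappa_score

anticipated = ["better", "same", "better", "worse", "same", "better"]
observed    = ["better", "same", "same",   "worse", "same", "better"]

exact_agreement = sum(a == o for a, o in zip(anticipated, observed)) / len(observed)
print(f"exact agreement: {exact_agreement:.2f}")
print(f"Cohen's kappa:   {cohen_kappa_score(anticipated, observed):.2f}")
```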

Relevance: 20.00%

Abstract:

Introduction. There is some cross-sectional evidence that theory of mind ability is associated with social functioning in those with psychosis, but the direction of this relationship is unknown. This study investigates the longitudinal association between both theory of mind and psychotic symptoms and social functioning outcome in first-episode psychosis. Methods. Fifty-four people with first-episode psychosis were followed up at 6 and 12 months. Random effects regression models were used to estimate the stability of theory of mind over time and the association between baseline theory of mind and psychotic symptoms and social functioning outcome. Results. Neither baseline theory of mind ability (regression coefficients: Hinting test 1.07, 95% CI 0.74, 2.88; Visual Cartoon test 2.91, 95% CI 7.32, 1.51) nor baseline symptoms (regression coefficients: positive symptoms 0.04, 95% CI 1.24, 1.16; selected negative symptoms 0.15, 95% CI 2.63, 2.32) were associated with social functioning outcome. There was evidence that theory of mind ability was stable over time (regression coefficients: Hinting test 5.92, 95% CI 6.66, 8.92; Visual Cartoon test score 0.13, 95% CI 0.17, 0.44). Conclusions. Neither baseline theory of mind ability nor psychotic symptoms are associated with social functioning outcome. Further longitudinal work is needed to understand the origin of social functioning deficits in psychosis.
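The random effects regression referred to above can be sketched as a linear mixed model with a random intercept per participant. The example below is purely illustrative: the column names, scores and model specification are assumptions, not the study's actual variables or analysis.

```python
# Illustrative sketch: random-intercept regression of social functioning at
# 6 and 12 months on a baseline theory-of-mind score (all data hypothetical).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "subject":      [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "month":        [6, 12] * 6,
    "hinting_base": [17, 17, 12, 12, 15, 15, 9, 9, 14, 14, 11, 11],
    "social_funct": [55, 58, 40, 42, 50, 49, 35, 38, 47, 51, 39, 41],
})

# Random intercept per subject; fixed effects for baseline theory of mind and time point
model = smf.mixedlm("social_funct ~ hinting_base + month", data=df, groups=df["subject"])
result = model.fit()
print(result.summary())
```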

Relevance: 20.00%

Abstract:

Human imaging studies examining fear conditioning have mainly focused on the neural responses to conditioned cues. In contrast, the neural basis of the unconditioned response and the mechanisms by which fear modulates inter-regional functional coupling have received limited attention. We examined the neural responses to an unconditioned stimulus using a partial-reinforcement fear conditioning paradigm and functional MRI. The analysis focused on (1) the effects of an unconditioned stimulus (an electric shock) that was either expected and actually delivered, or expected but not delivered, (2) how the related brain activity changed across conditioning trials, and (3) how shock expectation influenced inter-regional coupling within the fear network. We found that (1) delivery of the shock engaged the red nucleus, amygdala, dorsal striatum, insula, and somatosensory and cingulate cortices, (2) when the shock was expected but not delivered, only the red nucleus, anterior insular and dorsal anterior cingulate cortices showed activity increases that were sustained across trials, and (3) a psycho-physiological interaction analysis demonstrated that fear led to increased coupling of the red nucleus to the insula but decreased coupling of the hippocampus to the red nucleus, thalamus and cerebellum. The hippocampus and the anterior insula may serve as hubs facilitating the switch between engagement of a defensive immediate-fear network and a resting network.
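Psycho-physiological interaction (PPI) analysis, mentioned in point (3), tests whether the coupling between a seed region's time series and activity elsewhere changes with the psychological context (here, shock expectation). A minimal sketch of building a PPI regressor and fitting a simple least-squares model, using synthetic time series rather than the study's pipeline:

```python
# Illustrative PPI sketch: does seed-to-voxel coupling change with condition?
# All time series here are synthetic; in practice they come from preprocessed fMRI.
import numpy as np

n_scans = 200
rng = np.random.default_rng(0)

seed = rng.standard_normal(n_scans)                     # seed (e.g., red nucleus) time series
psych = np.repeat([0, 1], n_scans // 2).astype(float)   # 0 = no shock expected, 1 = shock expected
ppi = seed * (psych - psych.mean())                     # interaction regressor (mean-centred context)

# Design matrix: intercept, seed, psychological, and PPI regressors
X = np.column_stack([np.ones(n_scans), seed, psych, ppi])

# One target region whose coupling to the seed increases under threat (simulated)
voxel = 0.5 * seed + 0.5 * seed * psych + 0.1 * rng.standard_normal(n_scans)

beta, *_ = np.linalg.lstsq(X, voxel, rcond=None)
print(f"PPI beta (change in coupling with expectation): {beta[3]:.2f}")
```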

Relevance: 20.00%

Abstract:

In a retrospective study conducted in a rehabilitation unit, we examined the degree of adherence to clinical practice guidelines (CPGs) on the long-term pharmacological treatment of schizophrenia by physicians who had only indirect knowledge of them. The Expert Consensus Guideline for the treatment of schizophrenia (ECGTS) was chosen as the reference on the basis of a comparison with five other major CPGs. In a sample of 20 patients, the ECGTS recommendations were fully followed in 65% of cases, partially followed in 10% and not followed in 25%, showing that clinical practice clearly leaves room for improvement (mainly in the treatment of psychotic and depressive symptoms). However, adherence to CPGs does not necessarily guarantee that all clinical problems encountered are resolved: 12 of the 20 patients presented side effects at clinical evaluation, and for eight of them the relevant recommendations were being followed. Our study also shows that choosing and applying a CPG is not straightforward. Current CPGs provide few or no measurement instruments, nor precise criteria, for assessing the clinical problems they address. The future therefore belongs to CPGs that provide, in addition to the clinical recommendations themselves, the means to verify and apply them in the field.

Relevance: 20.00%

Abstract:

The dissertation investigates some relevant metaphysical issues arising in the context of spacetime theories. In particular, the inquiry focuses on general relativity and canonical quantum gravity. A formal definition of spacetime theory is proposed and, against this framework, an analysis of the notions of general covariance, symmetry and background independence is performed. It is argued that many conceptual issues in general relativity and canonical quantum gravity derive from putting excessive emphasis on general covariance as an ontological principle. An original metaphysical position grounded in scientific essentialism and causal realism (weak essentialism) is developed and defended. It is argued that, in the context of general relativity, weak essentialism supports spacetime substantivalism. It is also shown that weak essentialism escapes arguments from metaphysical underdetermination by positing a particular kind of causation, dubbed geometric. The proposed interpretive framework is then applied to Bohmian mechanics, pointing out that weak essentialism nicely fits into this theory. In the end, a possible Bohmian implementation of loop quantum gravity is considered, and such a Bohmian approach is interpreted in a geometric causal fashion. Under this interpretation, Bohmian loop quantum gravity straightforwardly commits us to an ontology of elementary extensions of space whose evolution is described by a non-local law. The causal mechanism underlying this evolution clarifies many conceptual issues related to the emergence of classical spacetime from the quantum regime. Although there is as yet no fully worked out physical theory of quantum gravity, it is argued that the proposed approach sets up a standard that proposals for a serious ontology in this field should meet.