111 results for user testing, usability testing, system integration, thinking aloud, card sorting

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

The integration of specific institutions for teacher education into the higher education system represents a milestone in Swiss educational policy and has broad implications. This thesis explores the organizational and institutional change resulting from this policy reform and attempts to assess structural change in terms of differentiation and convergence within the higher education system. Key issues are, on the one hand, the adoption of a research function by the newly conceptualized institutions of teacher education and, on the other, the positioning of the new institutions within the higher education system. Drawing on actor-centred approaches to differentiation, this dissertation discusses system-level specificities of tertiarized teacher education and asks how these affect institutional configurations and actor constellations. On the basis of qualitative and quantitative empirical data, a comparative analysis was carried out, including case studies of four universities of teacher education as well as multivariate regression analysis of micro-level data on students' educational choices. The study finds that the process of system integration and the adaptation to the research function have unfolded differently across institutions, depending on the institutional setting and the specific actor constellations. The new institutions have clearly made a strong push to position themselves as a new institutional type and to find an identity beyond the traditional binary divide that assigns the universities of teacher education to the college sector. Potential conflicts have been identified in the divergent cognitive and normative orientations and perceptions of researchers, teacher educators, policy-makers, teachers, and students as to the mission and role of the new type of higher education institution.

Relevance:

100.00%

Publisher:

Abstract:

There is an increasing need to develop improved systems for predicting the safety of xenobiotics. However, to move beyond hazard identification, the available concentration of the test compounds needs to be incorporated. In this study, cyclosporine A (CsA) was used as a model compound to assess the kinetic profiles in two rodent brain cell cultures after single and repeated exposures. CsA-induced cyclophilin B (Cyp-B) secretion was also determined as a CsA-specific pharmacodynamic endpoint. Since CsA is a potent p-glycoprotein substrate, the ability of this compound to cross the blood-brain barrier (BBB) was also investigated using an in vitro bovine model with repeated exposures up to 14 days. Finally, CsA uptake mechanisms were studied using a parallel artificial membrane permeability assay (PAMPA) in combination with a Caco-2 model. Kinetic results indicate a low intracellular CsA uptake, with no marked bioaccumulation or biotransformation. In addition, only low CsA amounts crossed the BBB. PAMPA and Caco-2 experiments revealed that CsA is mostly trapped in lipophilic compartments and exits the cell apically via active transport. Thus, although CsA is unlikely to enter the brain at cytotoxic concentrations, it may cause alterations in electrical activity and is likely to increase the CNS concentration of other compounds by occupying the BBB's extrusion capacity. Such an integrated testing system, incorporating BBB and brain culture models together with kinetics, could be applied for assessing the neurotoxicity potential of compounds.

Relevance:

100.00%

Publisher:

Abstract:

A traditional photonic-force microscope (PFM) produces huge data sets that require tedious numerical analysis. In this paper, we propose instead an analog signal processor to attain real-time capabilities while retaining the richness of the traditional PFM data. Our system is devoted to intracellular measurements and is fully interactive through the use of a haptic joystick. Using our specialized analog hardware along with a dedicated algorithm, we can extract the full 3D stiffness matrix of the optical trap in real time, including the off-diagonal cross-terms. Our system is also capable of simultaneously recording data for subsequent offline analysis, which allows us to check that a good correlation exists between the classical analysis of stiffness and our real-time measurements. We monitor the PFM beads using an optical microscope. The force-feedback mechanism of the haptic joystick helps us interactively guide the bead inside living cells and collect information from its (possibly anisotropic) environment. The instantaneous stiffness measurements are also displayed in real time on a graphical user interface. The whole system has been built and is operational; here we present early results that confirm the consistency of the real-time measurements with offline computations.
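The classical offline cross-check mentioned above rests on a standard relation: for a bead in an approximately harmonic trap, the covariance matrix C of its position fluctuations satisfies C = kB·T·K⁻¹, so the full stiffness matrix, including the off-diagonal cross-terms, can be recovered as K = kB·T·C⁻¹. The sketch below illustrates this equipartition-based offline computation on a recorded trajectory; it is a hedged illustration of the classical analysis, not the authors' analog processor, and the position array and temperature are assumed inputs.

```python
import numpy as np

KB = 1.380649e-23  # Boltzmann constant, J/K


def stiffness_matrix(positions_m, temperature_k=298.0):
    """Estimate the 3x3 optical-trap stiffness matrix (N/m) from bead positions.

    positions_m: array of shape (n_samples, 3) with x, y, z in metres.
    For a harmonic trap, the position covariance C obeys C = kB*T * K^-1,
    so K = kB*T * C^-1; the off-diagonal entries are the cross-terms.
    """
    cov = np.cov(positions_m, rowvar=False)   # np.cov removes the mean itself
    return KB * temperature_k * np.linalg.inv(cov)


if __name__ == "__main__":
    # Synthetic check: draw positions from an anisotropic, tilted harmonic trap.
    rng = np.random.default_rng(0)
    true_k = np.array([[2.0e-6, 0.2e-6, 0.0],
                       [0.2e-6, 1.0e-6, 0.0],
                       [0.0,    0.0,    0.5e-6]])        # stiffness in N/m
    cov = KB * 298.0 * np.linalg.inv(true_k)             # implied covariance
    positions = rng.multivariate_normal(np.zeros(3), cov, size=100_000)
    print(stiffness_matrix(positions))                   # close to true_k
```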

Relevance:

60.00%

Publisher:

Abstract:

1. Harsh environmental conditions experienced during development can reduce the performance of the same individuals in adulthood. However, the 'predictive adaptive response' hypothesis postulates that if individuals adapt their phenotype during development to the environments where they are likely to live in the future, individuals exposed to harsh conditions in early life perform better when encountering the same harsh conditions in adulthood than those never exposed to these conditions before. 2. Using the common vole (Microtus arvalis) as study organism, we tested how exposure to flea parasitism during the juvenile stage affects the physiology (haematocrit, resistance to oxidative stress, resting metabolism, spleen mass, and testosterone), morphology (body mass, testis mass) and motor performance (open-field activity and swimming speed) of the same individuals when infested with fleas in adulthood. According to the 'predictive adaptive response' hypothesis, we predicted that voles parasitized at the adult stage would perform better if they had already been parasitized with fleas at the juvenile stage. 3. We found that voles exposed to fleas in adulthood had a higher metabolic rate if already exposed to fleas as juveniles, compared to voles free of fleas as juveniles and voles free of fleas in adulthood. Independently of juvenile parasitism, adult parasitism impaired adult haematocrit and motor performance. Independently of adult parasitism, juvenile parasitism slowed down crawling speed in adult female voles. 4. Our results suggest that juvenile parasitism has long-term effects that do not protect against the detrimental effects of adult parasitism. On the contrary, experiencing parasitism in early life incurs additional costs upon adult parasitism, measured in terms of higher energy expenditure, rather than inducing an adaptive shift in the developmental trajectory. 5. Hence, our study provides experimental evidence for long-term costs of parasitism. We found no support for a predictive adaptive response in this host-parasite system.

Relevance:

60.00%

Publisher:

Abstract:

An ammonium chloride procedure was used to prepare a bacterial pellet from positive blood cultures, which was used for direct inoculation of VITEK 2 cards. Correct identification reached 99% for Enterobacteriaceae and 74% for staphylococci. For antibiotic susceptibility testing, very major and major errors were 0.1 and 0.3% for Enterobacteriaceae, and 0.7 and 0.1% for staphylococci, respectively. Thus, bacterial pellets prepared with ammonium chloride allow direct inoculation of VITEK cards with excellent accuracy for Enterobacteriaceae and a lower accuracy for staphylococci.
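For readers unfamiliar with the terminology used above: in antimicrobial susceptibility testing, a "very major" error usually denotes a resistant isolate falsely reported as susceptible, and a "major" error a susceptible isolate falsely reported as resistant. The sketch below shows how such percentages can be tallied from paired reference/test results; the counts are invented for illustration, and the rates are expressed here relative to all organism-antibiotic combinations tested, which is one common convention (another expresses very major errors relative to resistant isolates only).

```python
# Hypothetical paired results (reference method vs. direct inoculation);
# the counts are invented for illustration only.
results = [
    # (reference_category, test_category, count)
    ("R", "S", 2),    # very major error: resistant reported as susceptible
    ("S", "R", 5),    # major error: susceptible reported as resistant
    ("S", "S", 650),  # agreement
    ("R", "R", 343),  # agreement
]

total = sum(count for _, _, count in results)
very_major = sum(c for ref, test, c in results if ref == "R" and test == "S")
major = sum(c for ref, test, c in results if ref == "S" and test == "R")

# Rates expressed relative to all organism-antibiotic combinations tested.
print(f"very major errors: {100 * very_major / total:.1f}%")
print(f"major errors:      {100 * major / total:.1f}%")
```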

Relevance:

60.00%

Publisher:

Abstract:

Finding out whether Plasmodium spp. are coevolving with their vertebrate hosts is of both theoretical and applied interest and can influence our understanding of the effects and dynamics of malaria infection. In this study, we tested for local adaptation as a signature of coevolution between malaria blood parasites (Plasmodium spp.) and their host, the great tit (Parus major). We conducted a reciprocal transplant experiment of birds in the field, in which we exposed birds from two populations to Plasmodium parasites. This experimental set-up also provided a unique opportunity to study the natural history of malaria infection in the wild and to assess the effects of primary malaria infection on juvenile birds. We present three main findings: (i) there was no support for local adaptation; (ii) there was a male-biased infection rate; (iii) infection occurred towards the end of the summer and differed between sites. There were also site-specific effects of malaria infection on the hosts. Taken together, we present one of the few experimental studies of parasite-host local adaptation in a natural malaria system, and our results shed light on the effects of avian malaria infection in the wild.

Relevance:

50.00%

Publisher:

Abstract:

Big sports events like the 2008 European Football Championship are a challenge for anti-doping activities, particularly when the event is hosted by two different countries and there are two laboratories accredited by the World Anti-Doping Agency. This challenges the logistics of sample collection as well as the chemical analyses, which must be carried out in a timely manner. This paper discusses the handling of whereabouts information for each athlete and the therapeutic use exemption system, experiences in the collection and transportation of blood and urine samples, and the results of the chemical analyses in the two accredited laboratories. An overview of the analytical results of blood profiling and growth hormone testing in comparison with the distribution in the normal population is also presented.

Relevance:

50.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches beyond the local scale still represents a major challenge, yet it is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of the method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust, and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges.

In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proved to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
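For a zero-mean Gaussian prior, the gradual-deformation idea summarised above can be written as m_proposed = m_current·cos(θ) + m_independent·sin(θ), where m_independent is a fresh realization from the prior; the cos/sin weights preserve the prior covariance for any θ, and θ directly controls the perturbation strength. Under these assumptions the proposal is reversible with respect to the prior, so the Metropolis acceptance test reduces to a likelihood ratio. The following is a minimal, generic sketch of one such MCMC step, not the thesis' actual implementation; the prior sampler and log-likelihood are assumed to be supplied by the caller.

```python
import numpy as np


def gradual_deformation_step(m_current, sample_prior, log_likelihood, theta, rng):
    """One Metropolis step with a gradual-deformation (cos/sin mixing) proposal.

    m_current      : current model realization (1-D NumPy array)
    sample_prior   : callable returning an independent zero-mean realization
                     with the same spatial covariance as the prior
    log_likelihood : callable returning the log-likelihood of a model
    theta          : perturbation strength in (0, pi/2]; small values give
                     small steps and high acceptance rates
    rng            : numpy.random.Generator
    """
    m_indep = sample_prior()
    # cos/sin mixing preserves the Gaussian prior covariance for any theta.
    m_prop = np.cos(theta) * m_current + np.sin(theta) * m_indep
    # Because the proposal is reversible w.r.t. the prior, the prior terms
    # cancel and only the likelihood ratio enters the acceptance probability.
    log_alpha = log_likelihood(m_prop) - log_likelihood(m_current)
    if np.log(rng.uniform()) < log_alpha:
        return m_prop, True
    return m_current, False
```

In practice θ would be tuned, for instance towards an acceptance rate of roughly 20-40%, which is how the perturbation strength mentioned above is controlled.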

Relevance:

50.00%

Publisher:

Abstract:

Sex allocation data in eusocial Hymenoptera (ants, bees and wasps) provide an excellent opportunity to assess the effectiveness of kin selection, because queens and workers differ in their relatedness to females and males. The first studies on sex allocation in eusocial Hymenoptera compared population sex investment ratios across species. Female-biased investment in monogyne (= with single-queen colonies) populations of ants suggested that workers manipulate sex allocation according to their higher relatedness to females than males (relatedness asymmetry). However, several factors may confound these comparisons across species. First, variation in relatedness asymmetry is typically associated with major changes in breeding system and life history that may also affect sex allocation. Secondly, the relative cost of females and males is difficult to estimate across sexually dimorphic taxa, such as ants. Thirdly, each species in the comparison may not represent an independent data point, because of phylogenetic relationships among species. Recently, stronger evidence that workers control sex allocation has been provided by intraspecific studies of sex ratio variation across colonies. In several species of eusocial Hymenoptera, colonies with high relatedness asymmetry produced mostly females, in contrast to colonies with low relatedness asymmetry which produced mostly males. Additional signs of worker control were found by investigating proximate mechanisms of sex ratio manipulation in ants and wasps. However, worker control is not always effective, and further manipulative experiments will be needed to disentangle the multiple evolutionary factors and processes affecting sex allocation in eusocial Hymenoptera.

Relevance:

50.00%

Publisher:

Abstract:

The development of new medical devices, such as aortic valves, requires numerous preliminary studies on animals and training of personnel on cadavers before the devices can be used in patients. Postmortem circulation, a technique used for postmortem angiography, allows the vascular system to be reperfused in a way similar to that in living persons. This technique is used for postmortem investigations to visualize the human vascular system and to make vascular diagnoses. Specific material for reperfusing a human body was developed recently. Our aim was to investigate whether postmortem circulation that imitates in vivo conditions allows for the testing of medical materials on cadavers. We did this by delivering an aortic valve using minimally invasive methods. Postmortem circulation was established in eight corpses to recreate an environment as close as possible to in vivo conditions. Mobile fluoroscopy and a percutaneous catheterization technique were used to deliver the material to the correct place. Once the valve was implanted, the heart and primary vessels were extracted to confirm its position. Postmortem circulation proved to be essential in several of the cadavers because it helped the clinicians to deliver the material and improve their implantation techniques. Due to the intravascular circulation, sites with substantial arteriosclerotic stenosis could be bypassed, which would have been impossible without perfusion. Although originally developed for postmortem investigations, this reperfusion technique could be useful for testing new medical devices intended for living patients.

Relevance:

50.00%

Publisher:

Abstract:

In the first part of this research, three stages were defined for a program to increase the information extracted from ink evidence and maximise its usefulness to the criminal and civil justice system. These stages are (a) to develop a standard methodology for analysing ink samples by high-performance thin-layer chromatography (HPTLC) in a reproducible way, even when samples are analysed at different times, in different locations and by different examiners; (b) to compare ink samples automatically and objectively; and (c) to define and evaluate a theoretical framework for the use of ink evidence in a forensic context. This report focuses on the second of the three stages. Using the calibration and acquisition process described in the previous report, mathematical algorithms are proposed to compare ink samples automatically and objectively. The performance of these algorithms is systematically studied for various chemical and forensic conditions using standard performance tests commonly employed in biometric studies. The results show that different algorithms are best suited to different tasks. Finally, this report demonstrates how modern analytical and computer technology can be used in the field of ink examination, and how tools developed and successfully applied in other fields of forensic science can help maximise its impact within the field of questioned documents.
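As a concrete, hypothetical illustration of what an automatic and objective comparison can look like, the sketch below scores the similarity of two HPTLC densitometric profiles with a Pearson correlation after baseline subtraction and peak normalisation, mapping the result to [0, 1]. This is a generic metric invented for orientation; the report's actual algorithms and their biometric-style performance evaluation are not reproduced here.

```python
import numpy as np


def similarity_score(profile_a, profile_b):
    """Similarity in [0, 1] between two HPTLC densitometric profiles.

    Both profiles are assumed to be sampled on the same retention-factor axis.
    Each is baseline-corrected (minimum subtracted) and scaled to unit maximum
    before a Pearson correlation is computed and mapped from [-1, 1] to [0, 1].
    """
    def normalise(p):
        p = np.asarray(p, dtype=float)
        p = p - p.min()
        return p / p.max() if p.max() > 0 else p

    a, b = normalise(profile_a), normalise(profile_b)
    r = np.corrcoef(a, b)[0, 1]
    return 0.5 * (r + 1.0)


# Example with two toy profiles sharing one of two peaks.
x = np.linspace(0, 1, 500)
ink_1 = np.exp(-((x - 0.30) / 0.02) ** 2) + 0.6 * np.exp(-((x - 0.70) / 0.03) ** 2)
ink_2 = np.exp(-((x - 0.30) / 0.02) ** 2) + 0.6 * np.exp(-((x - 0.55) / 0.03) ** 2)
print(round(similarity_score(ink_1, ink_2), 3))
```

Thresholding such scores over many pairs of same-source and different-source inks then yields false-match and false-non-match rates, the kind of performance figures borrowed from biometrics that the report refers to.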

Relevance:

50.00%

Publisher:

Abstract:

British mammalogists have used two different systems for surveying the common dormouse Muscardinus avellanarius: a modified bird nest box with the entrance facing the tree trunk, and a smaller, cheaper model called a "nest tube". However, few data comparing different nest box systems are currently available. To determine which system is more efficient, we compared the use of large (GB-type) and small nest boxes (DE-type, a commercial wooden mouse trap without a door) in three Swiss forests. The presence of Muscardinus, the presence of potential competitors, and any evidence of occupation were examined in 60 pairs of nest boxes on the basis of 2,280 nest box checks conducted over 5 years. Mean annual occupation and the cumulative number of Muscardinus present were both significantly higher for the DE than for the GB boxes (64.6% versus 32.1%, and 149 versus 67 dormice, respectively). In contrast, annual occupation by competitors, including Glis glis, Apodemus spp. and hole-nesting birds, was significantly higher in the GB than in the DE boxes in all forests (19-68% versus 0-16%, depending on the species and the forest). These results suggest that smaller nest boxes are preferred by the common dormouse and are rarely occupied by competitors. They therefore appear preferable for studying Muscardinus populations.

Relevance:

50.00%

Publisher:

Abstract:

Introduction: According to guidelines, patients with coronary artery disease (CAD) should undergo revascularization if myocardial ischemia is present. While coronary angiography (CXA) allows the morphological assessment of CAD, the fractional flow reserve (FFR) has proved to be a complementary invasive test to assess the functional significance of CAD, i.e. to detect ischemia. Perfusion cardiac magnetic resonance (CMR) has turned out to be a robust non-invasive technique to assess myocardial ischemia. The objective is to compare the cost-effectiveness ratio - defined as the cost per patient correctly diagnosed - of two algorithms used to diagnose hemodynamically significant CAD in relation to the pretest likelihood of CAD: 1) CMR to assess ischemia before referring positive patients to CXA (CMR + CXA); 2) CXA in all patients combined with an FFR test in patients with angiographically positive stenoses (CXA + FFR). Methods: The costs, evaluated from the health care system perspective in the Swiss, German, United Kingdom (UK) and United States (US) contexts, included the public prices of the different tests considered as outpatient procedures, the costs of complications, and the costs induced by diagnostic errors (false negatives). The effectiveness criterion was the ability to accurately identify a patient with significant CAD. Test performances used in the model were based on the clinical literature. Using a mathematical model, we compared the cost-effectiveness ratios of both algorithms for hypothetical patient cohorts with different pretest likelihoods of CAD. Results: The cost-effectiveness ratio decreased hyperbolically with increasing pretest likelihood of CAD for both strategies. CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 67% in Germany, 83% in the UK and 84% in the US, with costs of CHF 5'794, EUR 1'472, £ 2'685 and $ 2'126 per patient correctly diagnosed. Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. Implications for the health care system/professionals/patients/society: These results facilitate decision making for the clinical use of new generations of imaging procedures to detect ischemia. They show to what extent the cost-effectiveness of diagnosing CAD depends on the prevalence of the disease.
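The hyperbolic shape of the cost-effectiveness curves follows from the structure of the model: the expected cost per patient of a two-step strategy is divided by the expected number of patients correctly identified with significant CAD, which scales with the pretest likelihood. The sketch below reproduces this logic with invented test characteristics and prices, under the simplifying assumption that the confirmatory second test is perfect; it illustrates the mechanism, not the published model or its threshold values.

```python
def cost_per_correct_diagnosis(prevalence, sens, spec,
                               cost_first, cost_second, cost_false_neg):
    """Expected cost per patient correctly identified with significant CAD.

    Two-step strategy: everyone gets the first (gatekeeper) test; only
    patients testing positive go on to the second test, assumed perfect here.
    sens/spec refer to the gatekeeper test; all numeric inputs are
    illustrative assumptions, not the published values.
    """
    referred = prevalence * sens + (1 - prevalence) * (1 - spec)   # sent to test 2
    expected_cost = (cost_first
                     + referred * cost_second
                     + prevalence * (1 - sens) * cost_false_neg)   # missed CAD
    true_positives = prevalence * sens   # patients correctly identified with CAD
    return expected_cost / true_positives


if __name__ == "__main__":
    # As the pretest likelihood of CAD rises, the denominator grows and the
    # cost per correctly diagnosed patient falls hyperbolically.
    for p in (0.2, 0.4, 0.6, 0.8):
        ratio = cost_per_correct_diagnosis(prevalence=p, sens=0.90, spec=0.85,
                                           cost_first=800.0, cost_second=1500.0,
                                           cost_false_neg=5000.0)
        print(f"pretest likelihood {p:.0%}: {ratio:,.0f} per correct diagnosis")
```

Evaluating such a function for two strategies with different cost and accuracy profiles, and locating the prevalence at which their curves cross, reproduces the kind of threshold comparison reported above.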

Relevance:

50.00%

Publisher:

Abstract:

The McIsaac scoring system is a tool designed to predict the probability of streptococcal pharyngitis in children aged 3 to 17 years who present with a sore throat. Although it does not allow the physician to make the diagnosis of streptococcal pharyngitis, it makes it possible to identify those children with a sore throat in whom rapid antigen detection tests have a good predictive value.
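For orientation, the commonly cited version of the McIsaac (modified Centor) score adds one point each for a temperature above 38 °C, absence of cough, tender anterior cervical adenopathy, and tonsillar swelling or exudate, plus an age adjustment (+1 for ages 3-14, 0 for 15-44, -1 for 45 and older); higher totals correspond to a higher probability of group A streptococcal pharyngitis and thus identify the children in whom a rapid antigen detection test is worthwhile. The sketch below encodes these criteria as a schematic aid; it is not a substitute for the original validated rule and should be checked against the original publication before any use.

```python
def mcisaac_score(temp_above_38: bool, no_cough: bool,
                  tender_anterior_nodes: bool,
                  tonsillar_swelling_or_exudate: bool,
                  age_years: int) -> int:
    """Compute the McIsaac (modified Centor) score.

    One point per clinical criterion plus an age adjustment; the criteria
    follow the commonly cited version of the rule (an assumption to verify
    against the original publication).
    """
    score = sum([temp_above_38, no_cough,
                 tender_anterior_nodes, tonsillar_swelling_or_exudate])
    if 3 <= age_years <= 14:
        score += 1
    elif age_years >= 45:
        score -= 1
    return score


# Example: a 7-year-old with fever, no cough, tender nodes and exudate scores 5,
# a presentation in which a rapid antigen detection test is clearly indicated.
assert mcisaac_score(True, True, True, True, 7) == 5
```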