943 results for usability
Abstract:
3D video-fluoroscopy is an accurate but cumbersome technique for estimating natural or prosthetic human joint kinematics. This dissertation proposes innovative methodologies to improve the reliability and usability of 3D fluoroscopic analysis. Being based on direct radiographic imaging of the joint, and thus avoiding the soft tissue artefact that limits the accuracy of skin-marker-based techniques, fluoroscopic analysis has a potential accuracy of the order of mm/deg or better. It can provide fundamental information for clinical and methodological applications, but, notwithstanding the number of methodological protocols proposed in the literature, time-consuming user interaction is required to obtain consistent results. This user-dependency has prevented a reliable quantification of the actual accuracy and precision of the methods and, consequently, has slowed down their translation to clinical practice. The objective of the present work was to speed up this process by introducing methodological improvements in the analysis. In the thesis, the fluoroscopic analysis was characterized in depth in order to evaluate its pros and cons and to provide reliable solutions to overcome its limitations. To this aim, an analytical approach was followed. The major sources of error were isolated with preliminary in-silico studies as: (a) geometric distortion and calibration errors, (b) 2D image and 3D model resolutions, (c) incorrect contour extraction, (d) bone model symmetries, (e) optimization algorithm limitations, (f) user errors. The effect of each criticality was quantified and verified with a preliminary in-vivo study on the elbow joint. The dominant source of error was identified as the limited extent of the convergence domain of the local optimization algorithms, which forced the user to manually specify the starting pose for the estimation process. To solve this problem, two different approaches were followed: to enlarge the convergence basin around the optimal pose, the local approach used sequential alignments of the 6 degrees of freedom in order of sensitivity, or a geometrical feature-based estimation of the initial conditions for the optimization; the global approach used an unsupervised memetic algorithm to optimally explore the search domain. The performance of the technique was evaluated with a series of in-silico studies and validated in-vitro with a phantom-based comparison against a radiostereometric gold standard. The accuracy of the method is joint-dependent; for the intact knee joint, the new unsupervised algorithm guaranteed a maximum error lower than 0.5 mm for in-plane translations, 10 mm for the out-of-plane translation, and 3 deg for rotations in a mono-planar setup, and lower than 0.5 mm for translations and 1 deg for rotations in a bi-planar setup. The bi-planar setup is best suited when accurate results are needed, such as in methodological research studies; the mono-planar analysis may be sufficient for clinical applications where analysis time and cost are an issue. A further reduction of the user interaction was obtained for prosthetic joint kinematics: a mixed region-growing and level-set segmentation method was proposed that halved the analysis time, delegating the computational burden to the machine. In-silico and in-vivo studies demonstrated that the reliability of the new semi-automatic method was comparable to a user-defined manual gold standard. The improved fluoroscopic analysis was finally applied to a first in-vivo methodological study on foot kinematics.
Preliminary evaluations showed that the presented methodology represents a feasible gold standard for the validation of skin-marker-based foot kinematics protocols.
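As a purely illustrative sketch of the "sequential alignment of the 6 degrees of freedom in order of sensitivity" idea mentioned above (not the dissertation's code), the following optimizes one pose parameter at a time against a placeholder image-similarity cost; the cost function, weights, bounds and alignment order are invented for the example.

```python
# Hedged sketch: coordinate-wise alignment of the 6 pose parameters
# (tx, ty, tz, rx, ry, rz), most sensitive first, as one way to enlarge the
# convergence basin of a local 2D/3D registration. `projection_mismatch`
# is a synthetic placeholder, not a real fluoroscopic similarity metric.
import numpy as np
from scipy.optimize import minimize_scalar

def projection_mismatch(pose):
    """Placeholder cost: distance from a synthetic 'true' pose, with the
    out-of-plane translation (tz) deliberately weighted weakly."""
    target = np.array([1.0, -2.0, 50.0, 0.1, -0.05, 0.2])
    weights = np.array([1.0, 1.0, 0.02, 5.0, 5.0, 5.0])
    return float(np.sum(weights * (np.asarray(pose) - target) ** 2))

def sequential_alignment(initial_pose, order, bounds, sweeps=3):
    """Optimize one degree of freedom at a time, in the given order."""
    pose = np.array(initial_pose, dtype=float)
    for _ in range(sweeps):
        for i in order:
            def cost_1d(v, i=i):
                trial = pose.copy()
                trial[i] = v
                return projection_mismatch(trial)
            res = minimize_scalar(cost_1d, bounds=bounds[i], method="bounded")
            pose[i] = res.x
    return pose

# in-plane translations and rotations first, out-of-plane translation last
order = [0, 1, 3, 4, 5, 2]
bounds = [(-20, 20), (-20, 20), (0, 200),
          (-0.5, 0.5), (-0.5, 0.5), (-0.5, 0.5)]
print(sequential_alignment(np.zeros(6), order, bounds))
```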
Abstract:
The field of "computer security" is often considered something between art and science. This is partly due to the lack of widely agreed and standardized methodologies for evaluating the degree of security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that can be applied to different threat scenarios with the same degree of effectiveness. Security testing methodologies are the first step towards standardized security evaluation processes and towards understanding how security threats evolve over time. The dissertation analyzes some of the most widely used methodologies, identifying differences and commonalities that are useful for comparing them and assessing their quality. It then proposes a new, enhanced methodology built by keeping the best of every analyzed methodology. The designed methodology is tested on different systems with very effective results, which is the main evidence that it can really be applied in practical cases. Most of the dissertation discusses and demonstrates how the presented testing methodology can be applied to such different systems, and even used to evade security measures by inverting goals and scopes. Real cases are rarely found in methodology documents; by contrast, this dissertation aims to show real and practical cases, offering technical details about how to apply the methodology. Electronic voting systems are the first field test considered, with Pvote and Scantegrity as the two tested electronic voting systems. The usability and effectiveness of the designed methodology for electronic voting systems is demonstrated through the analysis of these field cases. Furthermore, reputation and antivirus engines have also been analyzed, with similar results. The dissertation concludes by presenting some general guidelines for building a coordination-based approach to electronic voting systems that improves security without decreasing system modularity.
Abstract:
This thesis is concerned with the role played by software tools in the analysis and dissemination of linguistic corpora and with their contribution to a more widespread adoption of corpora in different fields. Chapter 1 contains an overview of some of the most relevant corpus analysis tools available today, presenting their most interesting features and some of their drawbacks. Chapter 2 begins with an explanation of the reasons why none of the available tools appears to satisfy the requirements of the user community and then continues with a technical overview of the current status of the new system developed as part of this work. This presentation is followed by highlights of the features that make the system appealing to users and corpus builders (i.e. scholars willing to make their corpora available to the public). The chapter concludes with an indication of future directions for the project and information on the current availability of the software. Chapter 3 describes the design of an experiment devised to evaluate the usability of the new system in comparison with another corpus tool. Usage of the tool was tested in the context of a documentation task performed on a real assignment during a translation class in a master's degree course. In Chapter 4 the findings of the experiment are presented on two levels of analysis: first, a discussion of how participants interacted with and evaluated the two corpus tools in terms of interface and interaction design, usability and perceived ease of use; then, an analysis of how users interacted with the corpora to complete the task and what kind of queries they submitted. Finally, some general conclusions are drawn and areas for future work are outlined.
Abstract:
Asset allocation choices are a recurring problem for every investor, who is continuously engaged in combining different asset classes to arrive at an investment consistent with his or her preferences. The need to support asset managers in carrying out their tasks has fed, over time, a vast literature proposing numerous portfolio construction strategies and models. This thesis attempts to provide a review of some innovative forecasting models and of some strategies in the field of tactical asset allocation, and then to evaluate their practical implications. First, we verify whether relationships exist between the dynamics of certain macroeconomic variables and financial markets, with the aim of identifying an econometric model capable of guiding managers' strategies in constructing their investment portfolios. The analysis considers the US market over a period characterized by rapid economic transformations and high volatility of stock prices. Second, the validity of momentum and contrarian trading strategies is examined in futures markets, in particular those of the Eurozone, which lend themselves well to their implementation thanks to the absence of constraints on shorting and to low transaction costs. The investigation shows that both anomalies are stable; the abnormal returns persist even when traditional asset pricing models are used, such as the CAPM, the Fama-French model and the Carhart model. Finally, using the EGARCH-M approach, forecasts are produced for the volatility of the returns of the stocks belonging to the Dow Jones; these forecasts are then used as inputs to determine the views to be fed into the Black-Litterman model. The results show, for different values of the scalar tau, average excess returns of the new combined vector higher than the vector of market equilibrium excess returns, albeit with higher levels of risk.
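A minimal sketch (not the thesis code) of the Black-Litterman combination step described above, using the standard posterior formula; the prior, covariance, view matrix, view uncertainty and tau values below are illustrative placeholders, with the view variance standing in for an EGARCH-M-style forecast.

```python
# Hedged sketch of the Black-Litterman "combined" expected-return vector:
#   E[R] = [(tau*Sigma)^-1 + P' Omega^-1 P]^-1 [(tau*Sigma)^-1 pi + P' Omega^-1 Q]
# All inputs are made-up example data, not the thesis dataset.
import numpy as np

def black_litterman(pi, sigma, P, Q, omega, tau):
    """Return the posterior (combined) expected excess returns."""
    ts_inv = np.linalg.inv(tau * sigma)
    om_inv = np.linalg.inv(omega)
    posterior_cov = np.linalg.inv(ts_inv + P.T @ om_inv @ P)
    return posterior_cov @ (ts_inv @ pi + P.T @ om_inv @ Q)

pi = np.array([0.03, 0.05, 0.07])                 # equilibrium excess returns
sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])            # return covariance matrix
P = np.array([[1.0, -1.0, 0.0]])                  # one relative view: asset 1 outperforms asset 2
Q = np.array([0.02])                              # ... by 2%
omega = np.array([[0.0005]])                      # view uncertainty (e.g. from a volatility forecast)

for tau in (0.025, 0.05, 0.1):                    # sensitivity to the scalar tau
    print(tau, black_litterman(pi, sigma, P, Q, omega, tau).round(4))
```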
Abstract:
The relationships and phylogeny of the western Palearctic harvestman family Trogulidae are investigated. The traditional system of seven genera and approximately 40 species appears to be artificial, and a phylogenetic approach combined with a comprehensive revision has long been sought. Species are poorly characterized owing to their uniform morphology, and species evaluation is further complicated by the variability of the few characters used for species delineation. To meet these demands, a molecular genetic analysis was carried out using the nuclear 28S rRNA gene and the mitochondrial cytochrome b gene. This analysis incorporates most genera and species of Trogulidae as well as a comprehensive set of Nemastomatidae and Dicranolasmatidae as outgroup taxa. Phylogenetic results of Bayesian analysis, Maximum Parsimony, Maximum Likelihood and Neighbor Joining are compared with distributional data, morphological characters and the results of canonical discriminant analysis of morphometric characters, and the general congruence of these data sets is shown. To demonstrate the applicability of this method, the revision of two species-groups within Trogulus is set out in detail: the Trogulus hirtus species-group and the Trogulus coriziformis species-group. The former is restricted to the central and north-western Balkan Peninsula; T. tricarinatus ssp. hirtus is raised to species level and four new species are described (T. karamanorum [man.n.], T. melitensis [man.n.], T. pharensis [man.n.], T. thaleri [man.n.]). The Trogulus coriziformis species-group is confined to the western Mediterranean area; T. coriziformis and T. aquaticus are re-described, T. cristatus and T. lusitanicus are re-established, and four species are described as new (T. balearicus, T. huberi, T. prietoi, T. pyrenaicus). In both species-groups two further cryptic species probably exist but were not described. The species-groups are shown to represent different phylogenetic levels, and this information is used for the revisionary work on the genus Trogulus as well as for the generic system of Trogulidae. Family status of Dicranolasmatidae is rejected, and Dicranolasma is shown to be best placed within Trogulidae. Calathocratus, Platybessobius and Trogulocratus appear to be polyphyletic and are best united within Calathocratus, the oldest name of this set. The cryptic diversity within Trogulidae, especially in Trogulus and the composite genus Calathocratus, amounts to 150-235% and is thus remarkably high for a group of the generally well-researched European fauna. Genetic features of the group, such as heteroplasmy, the possibility of major gene rearrangements, and the usability of the cytochrome b gene for phylogenetic studies in Opiliones, are outlined.
Abstract:
The prospect of the continuous multiplication of lifestyles, the obsolescence of traditional typological diagrams, and the usability of spaces on different territorial scales impose on contemporary architecture the search for new models of living. Low-density urban development has produced the erosion of territory and an increase in harmful emissions and energy consumption. High-density housing cannot ignore the social emergency of ensuring high-quality, low-cost dwellings for a new target population: students, temporary workers, key workers, foreigners, young couples without children, large families and, in general, people who carry out public services. Social housing strategies have become particularly relevant in regenerating high-density urban outskirts. The choice of this research topic derives from the desire to address the recent housing emergency from different perspectives, with a view to contributing to the current literature by proposing some tools for the correct design of social housing, ensuring good-quality, cost-effective and eco-sustainable solutions from the concept phase, through management and maintenance, until the end of the building's life cycle. The purpose of the thesis is to define a framework of guidelines that can become effective instruments for designing social housing. They are intended to complement the existing regulations and are mainly aimed at those who work in this sector; they also aim to support students who have to cope with this particular residential theme, as well as the users themselves. From the scientific evidence of both the recent specialized literature and the solutions adopted in case studies within the selected metropolitan areas of Milan, London and São Paulo, it is possible to identify the principles of this new design approach, in which the connection between typology, morphology and technology pursues the goal of a high living standard.
Abstract:
This work investigated the suitability and benefit of the "Objective therapy Compliance Measurement" (OtCM™) system, an innovative further development in the field of electronic compliance measurement. Under experimental conditions, the functionality and reliability of the electronic OtCM™ blister packs were tested in order to demonstrate their suitability for clinical use. Functionality (≥90% readable blisters), accuracy (≤2% errors) and robustness were achieved with version 3 of the OtCM™ blisters, after the errors of versions 1 and 2 had been identified and eliminated in cooperation with the manufacturer TCG. The OtCM™ e-dispenser, developed as an alternative to the electronic blisters for packaging clinical trial supplies, was examined for functionality and user-friendliness in a pilot study, which revealed a need for optimization. In a clinical study, the OtCM™ system was compared with MEMS®, which is regarded as the gold standard. The comparison criteria were data quality, acceptance and user-friendliness, the time required for preparing the medication and evaluating the data, and validity. A total of 40 patients treated with Rekawan® retard 600 mg took part in the open, randomized, prospective study. The OtCM™ system proved comparable to MEMS® in terms of validity, acceptance and user-friendliness. The expected time saving over MEMS® was not achieved with the OtCM™ system. Advantages of the OtCM™ system are higher data quality and the possibility of use in telemedicine.
Abstract:
Testing and analysis of the usability problems that could arise if two systems were integrated into a single new system.
Abstract:
In the healthcare sector, training is regarded as a major lever for shaping behaviour, but the traditional lecture-based methodology is not the most effective, especially in continuing ("long-life") education. The primary objective of the thesis is to verify whether the "case study" methodology, normally used in empirical research, can help healthcare staff learn organizational and managerial methods and tools, starting from the description of processes, decisions and results achieved in real contexts. Four case studies were designed and carried out with a descriptive methodology, three at the Azienda USL of Piacenza and one at the Azienda USL of Bologna, with different objects of study: continuity of care in a cohort of stroke patients and the use of tools for monitoring their level of autonomy; the adoption of a patient-centred approach in the home care of a person with COPD and their caregiver; the perception that caregivers and general practitioners or other professionals have of the local health authority's Dementia and Alzheimer network; and the impact of the training of primary-care paediatricians (Pediatri di Libera Scelta) on clinical activity. The case studies were accompanied by teaching notes for instructors and were submitted to four referees for evaluation of their content and methodology. The second case was administered to 130 health professionals within a competence and potential assessment programme carried out at the AUSL of Bologna. The referees commented on the cases and on the organizational-analysis tools, highlighting their usability and endorsing the methodology used, the combination of clinical-care and organizational dimensions, and the teaching notes. Each case ends with the evaluation of each referee.
Abstract:
This work addresses the causes of the unintelligibility, difficult comprehensibility and misinterpretability of operating instructions and user manuals, both theoretically, through a review of the relevant specialist literature, and practically, through an empirical investigation of three information products. To illustrate the consequences of dysfunctional instructions, the work first presents the legal framework governing operating instructions. It then explains the thematically relevant communication theories, the fundamental communication models, and the central theories of cognitive science on text processing and text comprehension, as a basis for the reading and user tests that were conducted. The practical investigation illustrates the manifold and omnipresent causes of a dysfunctional reception of instructions and, in view of the potentially dangerous consequences, argues for the use of user tests to retrospectively avoid communication breakdowns and to prospectively raise awareness of the problem when writing operating instructions.
Abstract:
This thesis presents a possible method for calculating sea level variation using geodetic-quality Global Navigation Satellite System (GNSS) receivers. Three antennas are used, two small antennas and a choke-ring antenna, analyzing only Global Positioning System signals. The main goal of the thesis is to test a modified antenna set-up: in particular, measurements obtained by tilting one antenna to face the horizon are compared with measurements obtained from antennas looking upward. The experiment is located in a coastal environment near the Onsala Space Observatory in Sweden. Sea level variations are obtained using periodogram analysis of the SNR signal and compared to a synthetic gauge generated from two independent tide gauges. The choke-ring antenna provides poor results, with an RMS around 6 cm and a correlation coefficient of 0.89. The smaller antennas provide correlation coefficients around 0.93; the antenna pointing upward presents an RMS of 4.3 cm and the one pointing toward the horizon an RMS of 6.7 cm. Notable variation in the statistical parameters is found when modifying the length of the interval analyzed; in particular, doubts are raised about the reliability of certain scattered data. No relation is found between the accuracy of the method and weather conditions. Possible methods to enhance the available data are investigated, and correlation coefficients above 0.97 can be obtained with the small antennas when sacrificing data points. Hence, the results provide evidence of the suitability of SNR signal analysis for sea level monitoring in coastal environments, even under adverse weather conditions; in particular, the tilted configuration provides results comparable with those of upward-looking geodetic antennas. An SNR signal simulator is also tested to investigate its performance and usability. Various configurations are analyzed in combination with the periodogram procedure used to calculate the height of the reflectors. Consistency between the simulated and the received data is found, and the overall accuracy of the height calculation program is found to be around 5 mm for input heights below 5 m. The procedure is thus found to be suitable for analyzing the data provided by the GNSS antennas at Onsala.
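A minimal sketch, under stated assumptions, of the SNR periodogram idea this abstract relies on: the interference pattern in detrended SNR oscillates in sin(elevation) with angular frequency 4*pi*h/lambda, so a Lomb-Scargle peak yields the reflector height h. The synthetic data and the GPS L1 wavelength below are illustrative, not the Onsala dataset.

```python
# Hedged sketch: recover reflector height h from a detrended SNR arc via a
# Lomb-Scargle periodogram, using SNR ~ A*cos(4*pi*h/lambda * sin(e) + phi).
# Synthetic data only; the thesis analysis uses measured SNR arcs.
import numpy as np
from scipy.signal import lombscargle

lam = 0.1903            # GPS L1 wavelength [m]
h_true = 3.2            # synthetic antenna height above the sea surface [m]

elev = np.deg2rad(np.linspace(5, 25, 400))   # satellite elevation arc
x = np.sin(elev)                             # independent variable of the oscillation
snr = np.cos(4 * np.pi * h_true / lam * x) + 0.2 * np.random.randn(x.size)

heights = np.linspace(0.5, 6.0, 2000)        # candidate reflector heights [m]
omegas = 4 * np.pi * heights / lam           # corresponding angular frequencies
power = lombscargle(x, snr - snr.mean(), omegas)

h_est = heights[np.argmax(power)]
print(f"estimated reflector height: {h_est:.2f} m")
```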
Abstract:
Plasmonic nanoparticles are great candidates for sensing applications with optical read-out. Plasmon sensing is based on the interaction of the nanoparticle with electromagnetic waves, where the particle scatters light at its resonance wavelength. This wavelength depends on several intrinsic factors, like the material, shape and size of the nanoparticle, as well as extrinsic factors, like the refractive index of the surrounding medium. The latter allows the nanoparticle to be used as a sensor: changes in the proximate environment can be directly monitored via the wavelength of the emitted light. Their minuscule size and high sensitivity allow individual nanoparticles to report on changes in particle coverage.
To use this single-particle plasmon sensor in future sensing applications, it has to meet the demand for detecting events at the single-molecule level, such as single-molecule binding or even conformational changes of a single molecule. Therefore, time resolution and sensitivity have to be enhanced, as today's measurement methods for signal read-out are too slow and not sensitive enough to resolve these processes. This thesis presents a new experimental setup, the 'Plasmon Fluctuation Setup', that leads to tremendous improvements in time resolution and sensitivity, achieved by implementing a stronger light source and a more sensitive detector. The new setup has a time resolution in the microsecond regime, an advance of 4-6 orders of magnitude over previous setups. Its resonance wavelength stability of 0.03 nm, measured with an exposure time of 10 ms, is an improvement by a factor of 20, even though the exposure time is 3000 times shorter than in previous reports. Thus, previously unresolvable wavelength changes of the plasmon sensor induced by minor local environmental alterations can be monitored with extremely high temporal resolution.
Using the 'Plasmon Fluctuation Setup', I can resolve adsorption events of single unlabeled proteins on an individual nanorod. Additionally, I monitored the dynamic evolution of a single protein binding event on a millisecond time scale. This capability is of high interest, as the role of certain protein domains can be probed by studying modified analytes without labels, which might introduce conformational or other changes to the target. The technique also resolves equilibrium fluctuations in the coverage, opening a window into observing the Brownian dynamics of unlabeled macromolecules.
A further topic addressed in this thesis is the usability of the nanoruler, two nanospheres connected by a spacer molecule, as a stiffness sensor for the interparticle linker under strong illumination. Here, I discover a light-induced collapse of the nanoruler. Furthermore, I exploit the sensing volume of a fixed nanorod to study unlabeled analytes diffusing around the nanorod at concentrations that are too high for fluorescence correlation spectroscopy but realistic for biological systems. Additionally, local pH sensing with nanoparticles is achieved.
Abstract:
The main objective of the research is to reconstruct the state of the art in e-health and the Electronic Health Record (Fascicolo Sanitario Elettronico), with particular attention to personal data protection and interoperability. To this end, binding and non-binding European Union documents were examined, together with selected European and national projects (such as "Smart Open Services for European Patients" (EU); "Elektronische Gesundheitsakte" (Austria); "MedCom" (Denmark); and, for Italy, "Infrastruttura tecnologica del Fascicolo Sanitario Elettronico", "OpenInFSE: Realizzazione di un'infrastruttura operativa a supporto dell'interoperabilità delle soluzioni territoriali di fascicolo sanitario elettronico nel contesto del sistema pubblico di connettività", "Evoluzione e interoperabilità tecnologica del Fascicolo Sanitario Elettronico", and "IPSE - Sperimentazione di un sistema per l'interoperabilità europea e nazionale delle soluzioni di Fascicolo Sanitario Elettronico: componenti Patient Summary e ePrescription"). The legal and technical analyses show the urgent need to define models that encourage the use of health data and implement effective strategies for the secondary use of digital health data, such as Open Data and Linked Open Data. Legal and technological harmonization is seen as a strategic means of reducing both the conflicts in personal data protection existing among Member States and the lack of interoperability among European Electronic Health Record information systems. To this end, three guidelines were identified: (1) harmonization of legislation, (2) harmonization of rules, (3) harmonization of information-system design. The principles of Privacy by Design ("proactive" and "win-win"), as well as Semantic Web standards, are considered key to achieving this change.
Abstract:
The aim of this work is to investigate the influence of blister design and foil quality on the functionality of blister packs. To this end, analytical methods based on interferometry, IR spectroscopy, beta backscatter, eddy-current measurement and impedance spectroscopy are developed that are suitable for the quantitative determination of heat-seal lacquers and laminate coatings on aluminium blister foils. A comparison of the methods shows that beta backscatter, interferometry and IR measurements are suitable for determining the heat-seal lacquer, while interferometry and the eddy-current method are suitable for determining polymer laminates. The second part of the work investigates the influence of the heat-seal lacquer coating weight of lidding foils on the quality of blister packs. As the coating weight increases, the seal strength increases, but so does the water-vapour permeability of the blisters. The heat-seal lacquers investigated show permeation coefficients comparable to polyvinyl chloride. In investigations of the validity of the sealing process, the heat-seal lacquer coating weight shows only minor effects. The third part of the work examines the influence of blister design on the user-friendliness of blister packs in a handling study. Variations in the opening forces of push-through blisters clearly affect how the test subjects rate the blisters. While most subjects were able to open all tested push-through blisters within the four-minute test period (>84%), considerably more handling problems occurred with the peel blister and the peel-off-push-through blister. The handling problems correlate with the subjects' age, living situation, state of health and visual ability.
Abstract:
The use of linear programming in various areas has increased with the significant improvement of specialized solvers. Linear programs are used as such to model practical problems, or as subroutines in algorithms such as formal proofs or branch-and-cut frameworks. In many situations a certified answer is needed, for example the guarantee that the linear program is feasible or infeasible, or a provably safe bound on its objective value. Most of the available solvers work with floating-point arithmetic and are thus subject to its shortcomings, such as rounding errors or underflow, and can therefore deliver incorrect answers. While adequate for some applications, this is unacceptable for critical applications like flight control or nuclear plant management, due to the potentially catastrophic consequences. We propose a method that gives a certified answer whether a linear program is feasible or infeasible, or returns 'unknown'. The advantage of our method is that it is reasonably fast and rarely answers 'unknown'. It works by computing a safe solution that is, in a certain sense, the best possible in the relative interior of the feasible set. To certify the relative interior, we employ exact arithmetic, whose use is nevertheless in general limited to critical places, allowing us to remain computationally efficient. Moreover, when certain conditions are fulfilled, our method is able to deliver a provable bound on the objective value of the linear program. We test our algorithm on typical benchmark sets and obtain higher success rates than previous approaches to this problem, while keeping the running times acceptably small. The computed objective value bounds are in most cases very close to the known exact objective values. We prove the usability of the method we developed by additionally employing a variant of it in a different scenario, namely to improve the results of a Satisfiability Modulo Theories solver. Our method is used as a black box in the nodes of a branch-and-bound tree to implement conflict learning based on the certificate of infeasibility for linear programs consisting of subsets of linear constraints. The generated conflict clauses are in general small and offer good prospects for reducing the search space. Compared to other methods, we obtain significant improvements in running time, especially on the large instances.
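A toy sketch, not the thesis implementation, of the "exact arithmetic only at critical places" idea: a candidate point from any floating-point LP solver is rationalized and the constraints A x <= b are then re-verified exactly with Python fractions, so the feasibility claim does not depend on rounding behaviour. The instance and the candidate point below are made up for illustration.

```python
# Hedged sketch: certify feasibility of a candidate point for A x <= b by
# re-checking the constraints in exact rational arithmetic. Only this final
# check is exact; the candidate may come from an ordinary floating-point
# solver (the thesis additionally pushes it into the relative interior).
from fractions import Fraction

def certify_feasible(A, b, x):
    """Return True iff A x <= b holds exactly for the rationalized point x."""
    xq = [Fraction(v).limit_denominator(10**12) for v in x]
    for row, bi in zip(A, b):
        lhs = sum(Fraction(a) * xi for a, xi in zip(row, xq))
        if lhs > Fraction(bi):
            return False          # constraint violated: no certificate
    return True                   # every constraint verified exactly

A = [[1, 1], [-1, 0], [0, -1]]    # x1 + x2 <= 4, x1 >= 0, x2 >= 0
b = [4, 0, 0]
x_float = [1.25, 2.5]             # e.g. an approximate solver output
print(certify_feasible(A, b, x_float))   # True: certifiably feasible
```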