80 results for Portable interactive devices
at Université de Lausanne, Switzerland
Abstract:
The occupational health risk involved with handling nanoparticles is the probability that a worker will experience an adverse health effect: this is calculated as a function of the worker's exposure relative to the potential biological hazard of the material. Addressing the risks of nanoparticles therefore requires knowledge of occupational exposure and of the release of nanoparticles into the environment, as well as toxicological data. However, information on exposure is currently not systematically collected; this risk assessment therefore lacks quantitative data. This thesis aimed, first, at creating the fundamental data necessary for a quantitative assessment and, second, at evaluating methods to measure occupational nanoparticle exposure. The first goal was to determine what is being used, and where, in Swiss industries. This was followed by an evaluation of the adequacy of existing measurement methods for assessing workplace nanoparticle exposure to complex size distributions and concentration gradients. The study was conceived as a series of methodological evaluations aimed at better understanding nanoparticle measurement devices and methods. It focused on inhalation exposure to airborne particles, as respiration is considered to be the most important entry pathway for nanoparticles into the body in terms of risk. The targeted survey (pilot study) was conducted as a feasibility study for a later nationwide survey on the handling of nanoparticles and the application of specific protective measures in industry. The study consisted of targeted phone interviews with health and safety officers of Swiss companies that were believed to use or produce nanoparticles. This was followed by a representative survey on the level of nanoparticle usage in Switzerland, designed on the basis of the results of the pilot study. The survey was conducted among a representative selection of clients of the Swiss National Accident Insurance Fund (SUVA), covering about 85% of Swiss production companies. The third part of this thesis focused on the methods to measure nanoparticles. Several pre-studies were conducted to study the limits of commonly used measurement devices in the presence of nanoparticle agglomerates. This focus was chosen because several discussions with users and producers of the measurement devices raised questions about their accuracy when measuring nanoparticle agglomerates and because, at the same time, the two survey studies revealed that such powders are frequently used in industry. The first preparatory experiment focused on the accuracy of the scanning mobility particle sizer (SMPS), which showed an improbable size distribution when measuring powders of nanoparticle agglomerates. Furthermore, the thesis includes a series of smaller experiments that took a closer look at problems encountered with other measurement devices in the presence of nanoparticle agglomerates: condensation particle counters (CPC), the portable aerosol spectrometer (PAS), a device to estimate the aerodynamic diameter, as well as diffusion size classifiers. Some initial feasibility tests of the efficiency of filter-based sampling and subsequent counting of carbon nanotubes (CNT) were conducted last. The pilot study provided a detailed picture of the types and amounts of nanoparticles used and of the knowledge of the health and safety experts in the companies.
Considerable maximal quantities (> 1'000 kg/year per company) of Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO (mainly first-generation particles) were declared by the contacted Swiss companies. The median quantity of handled nanoparticles, however, was 100 kg/year. The representative survey was conducted by contacting by post a representative selection of 1'626 clients of the Swiss National Accident Insurance Fund (SUVA). It allowed an estimation of the number of companies and workers dealing with nanoparticles in Switzerland. The extrapolation from the surveyed companies to all companies of the Swiss production sector suggested that 1'309 workers (95% confidence interval 1'073 to 1'545) of the Swiss production sector are potentially exposed to nanoparticles in 586 companies (145 to 1'027). These numbers correspond to 0.08% (0.06% to 0.09%) of all workers and to 0.6% (0.2% to 1.1%) of companies in the Swiss production sector. To measure airborne concentrations of sub-micrometre-sized particles, a few well-known methods exist. However, it was unclear how well the different instruments perform in the presence of the often quite large agglomerates of nanostructured materials. The evaluation of devices and methods focused on nanoparticle agglomerate powders. It allowed the identification of the following potential sources of inaccurate measurements at workplaces with considerably high concentrations of airborne agglomerates:
- A standard SMPS showed bi-modal particle size distributions when measuring large nanoparticle agglomerates.
- Differences in the range of a factor of a thousand were shown between diffusion size classifiers and CPC/SMPS.
- The comparison between CPC/SMPS and the portable aerosol spectrometer (PAS) was much better, but depending on the concentration, size or type of the powders measured, the differences could still amount to an order of magnitude.
- Specific difficulties and uncertainties in the assessment of workplaces were identified: background particles can interact with particles created by a process, which makes the handling of the background concentration difficult.
- Electric motors produce high numbers of nanoparticles and confound the measurement of the process-related exposure.
Conclusion: The surveys showed that nanoparticle applications exist in many industrial sectors in Switzerland and that some companies already use large quantities of them. The representative survey demonstrated a low prevalence of nanoparticle usage in most branches of Swiss industry and led to the conclusion that the introduction of applications using nanoparticles (especially outside industrial chemistry) is only beginning. Even though the number of potentially exposed workers was reportedly rather small, it nevertheless underscores the need for exposure assessments. Understanding exposure and how to measure it correctly is very important because the potential health effects of nanomaterials are not yet fully understood. The evaluation showed that many devices and methods of measuring nanoparticles need to be validated for nanoparticle agglomerates before large exposure assessment studies can begin.
Summary: The occupational health risk of nanoparticles at the workplace is the probability that a worker will suffer an adverse health effect when exposed to this material; it is usually calculated as the product of hazard and exposure.
A thorough assessment of the possible risks of nanomaterials therefore requires information on the release of such materials into the environment on the one hand and on the exposure of workers on the other. Much of this information is not yet collected systematically and is therefore missing from risk analyses. The aim of this doctoral thesis was to create the basis for a quantitative estimate of occupational exposure to nanoparticles and to evaluate the methods needed to measure such exposure. The study was to investigate to what extent nanoparticles are already used in Swiss industry, how many workers potentially come into contact with them, and whether the available measurement technology is adequate for the necessary workplace exposure measurements. The study focused on exposure to airborne particles, because respiration is regarded as the main entry route for particles into the body. The thesis is built on three phases: a qualitative survey (pilot study), a representative Swiss survey, and several technical studies serving the specific understanding of the possibilities and limits of individual measurement devices and techniques. The qualitative telephone survey was conducted as a preliminary study for a national, representative survey of Swiss industry. It aimed at information on the occurrence of nanoparticles and on the protective measures applied. The study consisted of targeted telephone interviews with occupational health and safety specialists of Swiss companies. The companies were selected on the basis of publicly available information indicating that they handle nanoparticles. The second part of the thesis was the representative study evaluating the prevalence of nanoparticle applications in Swiss industry. The study built on information from the pilot study and was conducted with a representative selection of companies insured by the Swiss National Accident Insurance Fund (SUVA). The majority of Swiss companies in the industrial sector were thereby covered. The third part of the thesis focused on the methodology for measuring nanoparticles. Several preliminary studies were conducted to explore the limits of commonly used nanoparticle measurement devices when they have to measure in the presence of larger quantities of nanoparticle agglomerates. This focus was chosen for two reasons: because several discussions with users and also with the producer of the measurement devices suggested a weak point there, raising doubts about the accuracy of the devices, and because the two survey studies showed that such nanoparticle agglomerates occur frequently. First, a preliminary study addressed the accuracy of the scanning mobility particle sizer (SMPS). In the presence of nanoparticle agglomerates, this device displayed an implausible bimodal particle size distribution. A series of short experiments followed, focusing on other measurement devices and their problems when measuring nanoparticle agglomerates. The condensation particle counter (CPC), the portable aerosol spectrometer (PAS), a device for estimating the aerodynamic diameter of particles, as well as the diffusion size classifier were tested.
Finally, some initial feasibility tests were conducted to determine the efficiency of filter-based measurement of airborne carbon nanotubes (CNT). The pilot study provided a detailed picture of the types and quantities of nanoparticles used in Swiss companies and documented the state of knowledge of the interviewed health and safety specialists. The following types of nanoparticles were declared by the contacted companies in maximal quantities (> 1'000 kg per year per company): Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO (mainly first-generation nanoparticles). The quantities of nanoparticles used varied widely, with a median of 100 kg per year. In the quantitative questionnaire study, 1'626 companies were contacted by mail, all of them clients of the Swiss National Accident Insurance Fund (SUVA). The results of the survey allowed an estimate of the number of companies and workers using nanoparticles in Switzerland. The extrapolation to the Swiss industrial sector yielded the following picture: in 586 companies (95% confidence interval: 145 to 1'027 companies), 1'309 workers are potentially exposed to nanoparticles (95% CI: 1'073 to 1'545). These figures correspond to 0.6% of Swiss companies (95% CI: 0.2% to 1.1%) and 0.08% of the workforce (95% CI: 0.06% to 0.09%). There are some well-established technologies for measuring the airborne concentration of sub-micrometre particles. However, there are doubts as to what extent these technologies can also be used to measure engineered nanoparticles. For this reason, the preparatory studies for the workplace assessments focused on the measurement of powders containing nanoparticle agglomerates. They allowed the identification of the following possible sources of erroneous measurements at workplaces with elevated airborne concentrations of nanoparticle agglomerates:
- A standard SMPS showed an implausible bimodal particle size distribution when measuring larger nanoparticle agglomerates.
- Large differences, in the range of a factor of a thousand, were found between a diffusion size classifier and several CPCs (or the SMPS, respectively).
- The differences between CPC/SMPS and the PAS were smaller, but depending on the size or type of the powder measured they could still amount to about an order of magnitude.
- Specific difficulties and uncertainties in workplace measurements were identified: background particles can interact with particles released during a work process. Such interactions make it difficult to correctly account for the background particle concentration in the measurement data.
- Electric motors produce large quantities of nanoparticles and can thus interfere with the measurement of process-related exposure.
Conclusion: The surveys showed that nanoparticles are already a reality in Swiss industry and that some companies already use large quantities of them. The representative survey, however, put this striking finding into perspective by showing that the number of such companies across Swiss industry as a whole is relatively small. In most branches (especially outside the chemical industry), few or no applications were found, which suggests that the introduction of this new technology is only at the beginning of its development.
Even though the number of potentially exposed workers is still relatively small, the study nevertheless underscores the need for exposure measurements at these workplaces. Knowledge of the exposure and of how to measure such exposure correctly is very important, above all because the possible health effects are not yet fully understood. The evaluation of several devices and methods showed, however, that there is still some catching up to do: before larger measurement studies can be conducted, the devices and methods must be validated for use with nanoparticle agglomerates.
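The abstract above estimates national figures by extrapolating from a sample of SUVA clients to the whole Swiss production sector, with 95% confidence intervals. The sketch below illustrates that kind of extrapolation under the simplifying assumptions of simple random sampling and a normal approximation; the survey's actual stratification and weighting are not reproduced, and the input numbers are placeholders, not study data.

```python
# Illustrative survey-to-sector extrapolation with a 95% confidence interval,
# assuming simple random sampling and a normal approximation. The thesis used
# a stratified SUVA client sample; weights and strata are omitted here.
import math

def extrapolate(n_respondents, n_users, sector_total, z=1.96):
    """Estimate how many units in the whole sector use nanoparticles."""
    p_hat = n_users / n_respondents                      # observed share of users
    se = math.sqrt(p_hat * (1 - p_hat) / n_respondents)  # standard error of the share
    low, high = max(p_hat - z * se, 0.0), p_hat + z * se
    return p_hat * sector_total, low * sector_total, high * sector_total

# Hypothetical inputs: 1'626 contacted companies, a handful reporting use,
# scaled to a sector of roughly 100'000 production companies.
est, lo, hi = extrapolate(n_respondents=1626, n_users=10, sector_total=100_000)
print(f"estimated companies: {est:.0f} (95% CI {lo:.0f} to {hi:.0f})")
```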
Abstract:
RATIONALE: Limited-channel portable monitors (PMs) are increasingly used as an alternative to polysomnography (PSG) for the diagnosis of obstructive sleep apnoea (OSA). However, recommendations for the scoring of PM recordings are still lacking. Pulse-wave amplitude (PWA) drops, considered as surrogates for EEG arousals, may increase the detection sensitivity for respiratory events in PM recordings. OBJECTIVES: To investigate the performance of four different hypopnoea scoring criteria, using 3% or 4% oxygen desaturation levels and including or excluding PWA drops as surrogates for EEG arousals, and to determine the impact of measured versus reported sleep time on OSA diagnosis. METHODS: Subjects drawn from a population-based cohort underwent a complete home PSG. The PSG recordings were scored using the 2012 American Academy of Sleep Medicine criteria to determine the apnoea-hypopnoea index (AHI). Recordings were then rescored using only parameters available on type 3 PM devices, according to the different hypopnoea criteria and patient-reported sleep duration, to determine the 'portable monitor AHIs' (PM-AHIs). MAIN RESULTS: 312 subjects were included. Overall, the PM-AHIs showed good concordance with the PSG-based AHI, although they tended to slightly underestimate it. The PM-AHI using 3% desaturation without PWA drops showed the best diagnostic accuracy for AHI thresholds of ≥5/h and ≥15/h (correctly classifying 94.55% and 93.27% of subjects, respectively, vs 80.13% and 87.50% with PWA drops). There was a significant but modest correlation between PWA drops and EEG arousals (r=0.20, p=0.0004). CONCLUSION: Interpretation of PM recordings using hypopnoea criteria that include 3% desaturation without PWA drops as an EEG arousal surrogate showed the best diagnostic accuracy compared with full PSG.
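One methodological point above is that type 3 portable monitors do not record sleep itself, so a PM-derived index divides scored events by reported sleep duration rather than PSG-measured total sleep time. A minimal sketch of that arithmetic, with hypothetical event counts and durations, shows why the PM-AHI tends to underestimate the PSG AHI when reported time exceeds actual sleep time.

```python
# Minimal sketch: the same scored events divided by measured sleep time (PSG)
# versus reported time in bed (portable monitor). All numbers are hypothetical.
def ahi(events, hours):
    """Respiratory events per hour."""
    return events / hours

scored_events = 60        # apnoeas + hypopnoeas meeting the chosen criteria
measured_sleep_h = 6.0    # total sleep time from PSG
reported_sleep_h = 7.5    # patient-reported sleep duration used for PM scoring

print(f"PSG AHI:  {ahi(scored_events, measured_sleep_h):.1f}/h")
print(f"PM 'AHI': {ahi(scored_events, reported_sleep_h):.1f}/h  (underestimated)")
```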
Abstract:
The PulseCath iVAC 3L left ventricular assist device is an option to treat transitory left heart failure or dysfunction after cardiac surgery. Assisted blood flow should reach up to 3 l/min. In the present in vitro model, the exact pump flow was examined as a function of various frequencies and afterloads. Optimal flow was achieved with inflation/deflation frequencies of about 70-80/min. The maximal flow rate of about 2.5 l/min was achieved with a minimal afterload of 22 mmHg. Handling of the device was easy owing to its connection to a standard intra-aortic balloon pump console. With increasing afterload (up to a simulated mean systemic pressure of 66 mmHg), flow rate and cardiac support are to some extent limited.
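The flow figures above reflect the basic relation that pulsatile pump output is roughly stroke volume times drive frequency, reduced as afterload rises. The sketch below only illustrates that relation; the stroke volume and the linear afterload derating are hypothetical placeholders, not device specifications.

```python
# Back-of-the-envelope model of pulsatile pump output: stroke volume times
# drive rate, with a crude linear penalty for afterload. Parameter values are
# illustrative assumptions only; the abstract reports the measured optimum
# (about 2.5 l/min at 70-80/min against a 22 mmHg afterload).
def pump_flow_l_min(stroke_volume_ml, rate_per_min, afterload_mmHg,
                    derate_per_mmHg=0.004):
    nominal = stroke_volume_ml * rate_per_min / 1000.0          # l/min with ideal filling
    factor = max(0.0, 1.0 - derate_per_mmHg * afterload_mmHg)   # assumed afterload penalty
    return nominal * factor

for afterload in (22, 66):                    # the two pressures mentioned above
    for rate in (60, 70, 80, 90):
        flow = pump_flow_l_min(35, rate, afterload)
        print(f"afterload {afterload} mmHg, rate {rate}/min -> {flow:.2f} l/min")
```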
Abstract:
BACKGROUND: Screening for obstructive sleep apnea (OSA) is recommended as part of the preoperative assessment of obese patients scheduled for bariatric surgery. The objective of this study was to compare the sensitivity of oximetry alone versus portable polygraphy in the preoperative screening for OSA. METHODS: Polygraphy (type III portable monitor) and oximetry data recorded from 68 consecutive patients as part of the preoperative assessment before bariatric surgery were reviewed. We compared the sensitivity of the 3% or 4% desaturation index (oximetry alone) with the apnea-hypopnea index (AHI; polygraphy) to diagnose OSA and classify the patients as normal (<10 events per hour), mild to moderate (10-30 events per hour), or severe (>30 events per hour). RESULTS: Using AHI, the prevalence of OSA (AHI > 10 per hour) was 57.4%: 16.2% of the patients were classified as severe, 41.2% as mild to moderate, and 42.6% as normal. Using the 3% desaturation index, 22.1% were classified as severe, 47.1% as mild to moderate, and 30.9% as normal. With the 4% desaturation index, 17.6% were classified as severe, 32.4% as mild to moderate, and 50% as normal. Overall, the 3% desaturation index compared to AHI yielded a 95% negative predictive value to rule out OSA (AHI > 10 per hour) and a 100% sensitivity (0.73 positive predictive value) to detect severe OSA (AHI > 30 per hour). CONCLUSIONS: Using oximetry with the 3% desaturation index as a screening tool for OSA could allow us to rule out significant OSA in almost a third of the patients and to detect patients with severe OSA. This cheap and widely available technique could accelerate the preoperative work-up of these patients.
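The sensitivity, negative predictive value and positive predictive value quoted above follow from a 2x2 cross-classification of the oximetry result against the polygraphy-based diagnosis. The sketch below shows that computation; the counts are illustrative values chosen to be consistent with the percentages reported for severe OSA in a cohort of 68, not figures taken from the study's tables.

```python
# Screening statistics from a 2x2 table of screen result vs. reference
# diagnosis. The counts below are hypothetical, for illustration only.
def screening_stats(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # diseased patients flagged by the screen
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)           # probability of disease when screen positive
    npv = tn / (tn + fn)           # probability of no disease when screen negative
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for "severe OSA" (AHI > 30/h) vs. 3% desaturation index
sens, spec, ppv, npv = screening_stats(tp=11, fp=4, fn=0, tn=53)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} ppv={ppv:.2f} npv={npv:.2f}")
```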
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually made between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. Indeed, a dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices made so far. However, in the case of imperfect information some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept that is capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states on which players base their decisions are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness.
Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some implications for games considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
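Backward induction, whose epistemic foundations Chapters 1 and 3 address, can be stated operationally as a recursion over a finite perfect-information game tree: at each decision node the moving player picks the action whose subgame outcome is best for her. A minimal sketch with a toy two-player tree (not an example from the thesis):

```python
# Backward induction on a finite perfect-information game tree.
# The tree and payoffs below are a toy example, not taken from the thesis.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Node:
    player: Optional[int] = None                  # None marks a terminal node
    payoffs: tuple = ()                           # payoff vector at terminal nodes
    children: dict = field(default_factory=dict)  # action label -> successor Node

def backward_induction(node):
    """Return the (payoff vector, plan) selected by backward induction."""
    if node.player is None:                       # terminal history: nothing to choose
        return node.payoffs, {}
    best = None
    for action, child in node.children.items():
        payoffs, plan = backward_induction(child)
        if best is None or payoffs[node.player] > best[0][node.player]:
            best = (payoffs, {**plan, id(node): action})
    return best

# Toy two-stage game: player 0 chooses "out" (payoffs (1, 2)) or "in",
# after which player 1 chooses between (2, 1) and (0, 3).
game = Node(player=0, children={
    "in":  Node(player=1, children={"L": Node(payoffs=(2, 1)),
                                    "R": Node(payoffs=(0, 3))}),
    "out": Node(payoffs=(1, 2)),
})
print(backward_induction(game)[0])                # -> (1, 2): player 0 stays out
```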
Abstract:
Background: Temporary percutaneous left ventricular assist devices (TPLVAD) can be inserted and removed in awake patients. They substitute left ventricular function for a period of up to a few weeks and provide an excellent backup and bridge to recovery or decision. Methods: Retrospective analysis of 75 patients who received a TPLVAD to treat cardiogenic shock (n = 49) or to facilitate high-risk percutaneous coronary intervention (PCI) (n = 26). Forty-two patients with cardiogenic shock and 16 patients with high-risk PCI received a TandemHeart, and 7 and 10 patients, respectively, received an Impella Recover LP 2.5. Outcome and related complications up to 1 month are reported with reference to device-dependent function. Results: One-month survival was 53% in patients with shock and 96% in patients with PCI. Conclusion: TPLVADs can support the failing heart with acceptable risk. Outcome is better in prophylactic use than in patients with cardiogenic shock. (C) 2011 Wiley-Liss, Inc.
Abstract:
Hypoglycemia, if recurrent, may have severe consequences for the cognitive and psychomotor development of neonates. Therefore, screening for hypoglycemia is a daily routine in every facility taking care of newborn infants. Point-of-care testing (POCT) devices are interesting for neonatal use, as their handling is easy, measurements can be performed at the bedside, the required blood volume is small and results are readily available. However, such whole-blood measurements are challenged by a wide variation of hematocrit in neonates and a spectrum of normal glucose concentration at the lower end of the test range. We conducted a prospective trial to check the precision and accuracy of the most suitable POCT device for neonatal use from each of three leading companies in Europe. Of the three devices tested (Precision Xceed, Abbott; Elite XL, Bayer; Aviva Nano, Roche), the Aviva Nano exhibited the best precision. None completely fulfilled the ISO 15197 accuracy criteria (2003 or 2011). The Aviva Nano fulfilled these criteria in 92% of cases while the others were <87%. The Precision Xceed reached the 95% limit of the 2003 ISO criteria for values ≤4.2 mmol/L, but not for the higher range (71%). Although validated for adults, new POCT devices need to be specifically evaluated on newborn infants before their routine use is adopted in neonatology.
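For context, the ISO 15197:2003 acceptance rule referred to above requires, as far as I recall it, that at least 95% of meter readings fall within ±0.83 mmol/L of the reference value below 4.2 mmol/L, and within ±20% at or above it. A sketch of that check on hypothetical paired readings follows; the thresholds should be verified against the standard itself before any real use.

```python
# Sketch of an ISO 15197:2003-style accuracy check. Thresholds are quoted from
# memory of the 2003 standard and should be verified; the paired readings are
# hypothetical examples, not trial data.
def within_iso_2003(meter, reference):
    if reference < 4.2:                              # low range: absolute limit
        return abs(meter - reference) <= 0.83
    return abs(meter - reference) <= 0.20 * reference  # otherwise: relative limit

def iso_pass_rate(pairs):
    hits = sum(within_iso_2003(m, r) for m, r in pairs)
    return hits / len(pairs)

# Hypothetical (meter, reference) pairs in mmol/L:
pairs = [(2.4, 2.8), (3.6, 3.3), (4.0, 4.4), (5.2, 4.9), (6.3, 5.6)]
rate = iso_pass_rate(pairs)
print(f"{rate:.0%} within limits -> {'meets' if rate >= 0.95 else 'fails'} the 95% requirement")
```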
Abstract:
Hypoglycaemia is a major cause of neonatal morbidity and may induce long-term developmental sequelae. Clinical signs of hypoglycaemia in neonatal infants are unspecific or even absent; therefore, precise and accurate methods for the assessment of glycaemia are needed. Glycaemia measurement in newborns has some particularities, such as a very low limit of normal glucose concentration compared to adults and a large range of normal haematocrit values. Many bedside point-of-care testing (POCT) systems are available, but literature about their accuracy in newborn infants is scarce and not very convincing. In this retrospective study, we identified 1,324 paired glycaemia results over a 1-year study period, one obtained at the bedside with one of three different POCT systems (Elite XL, Ascensia Contour and ABL 735) and the other in the central laboratory of the hospital with the hexokinase reference method. All three POCT systems tended to overestimate glycaemia values, and none of them fulfilled the ISO 15197 accuracy criteria. The Elite XL appeared to be more appropriate than the Contour for detecting hypoglycaemia, albeit with low specificity. The Contour additionally showed substantial inaccuracy with increasing haematocrit. The bench analyzer ABL 735 was the most accurate of the three tested POCT systems. Both of the tested handheld glucometers have important drawbacks in their use as screening tools for hypoglycaemia in newborn infants. The ABL 735 could be a valuable alternative, but the blood volume needed is more than 15 times higher than for handheld glucometers. Before daily use in the newborn population, careful clinical evaluation of each new POCT system for glucose measurement is of utmost importance.
Abstract:
The MyHits web server (http://myhits.isb-sib.ch) is a new integrated service dedicated to the annotation of protein sequences and to the analysis of their domains and signatures. Guest users can use the system anonymously, with full access to (i) standard bioinformatics programs (e.g. PSI-BLAST, ClustalW, T-Coffee, Jalview); (ii) a large number of protein sequence databases, including standard (Swiss-Prot, TrEMBL) and locally developed databases (splice variants); (iii) databases of protein motifs (Prosite, Interpro); (iv) a precomputed list of matches ('hits') between the sequence and motif databases. All databases are updated on a weekly basis and the hit list is kept up to date incrementally. The MyHits server also includes a new collection of tools to generate graphical representations of pairwise and multiple sequence alignments including their annotated features. Free registration enables users to upload their own sequences and motifs to private databases. These are then made available through the same web interface and the same set of analytical tools. Registered users can manage their own sequences and annotations using only web tools and freeze their data in their private database for publication purposes.
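MyHits exposes these tools through its web interface; the sketch below merely illustrates the kind of search the service automates, namely a PSI-BLAST run against Swiss-Prot, assuming a local NCBI BLAST+ installation and a locally formatted protein database named "swissprot" (both assumptions, not part of the MyHits service itself).

```python
# Illustrative local equivalent of one MyHits step: an iterated PSI-BLAST
# search of a protein sequence against Swiss-Prot. Assumes NCBI BLAST+ is
# installed and a database called "swissprot" has been formatted locally.
import subprocess

def psiblast_search(query_fasta, db="swissprot", iterations=3, out="hits.tsv"):
    cmd = [
        "psiblast",
        "-query", query_fasta,            # protein sequence in FASTA format
        "-db", db,                        # pre-formatted protein database
        "-num_iterations", str(iterations),
        "-evalue", "0.001",
        "-outfmt", "6",                   # tabular hit list
        "-out", out,
    ]
    subprocess.run(cmd, check=True)
    return out

# psiblast_search("my_protein.fasta")    # uncomment once real paths are in place
```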
Abstract:
While mobile technologies can provide great personalized services for mobile users, they also threaten their privacy. This personalization-privacy paradox is particularly salient for context-aware mobile applications, where a user's behaviors, movements and habits can be associated with his or her personal identity. In this thesis, I studied privacy issues in the mobile context, focusing in particular on the design of an adaptive privacy management system for context-aware mobile devices, and explored the role of personalization and of control over the user's personal data. This allowed me to make multiple contributions, both theoretical and practical. On the theoretical side, I propose and prototype an adaptive single sign-on solution that uses the user's context information to protect private information on smartphones. To validate this solution, I first demonstrated that a user's context is a unique user identifier and that context-awareness technology can increase the user's perceived ease of use of the system and the service provider's authentication security. I then followed a design science research paradigm and implemented this solution in a mobile application called "Privacy Manager". I evaluated its utility through several focus group interviews; overall, the proposed solution fulfilled the expected functions and users expressed their intention to use the application. To better understand the personalization-privacy paradox, I built on the theoretical foundations of privacy calculus and the technology acceptance model to conceptualize a theory of users' mobile privacy management. I also examined the role of personalization and control ability in my model and how these two elements interact with the privacy calculus and the mobile technology model. In the practical realm, this thesis contributes to the understanding of the trade-off between the benefits of personalized services and the privacy concerns they may cause. By pointing out new opportunities to rethink how the user's context information can protect private data, it also suggests new elements for privacy-related business models.
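The adaptive single sign-on idea described above conditions authentication on how familiar the current context is. The sketch below conveys the general mechanism only: a weighted match between the current context and a stored habit profile decides whether single sign-on suffices or a step-up credential is required. The features, weights and threshold are illustrative assumptions, not the design of the thesis prototype.

```python
# Illustrative context-adaptive authentication policy. All feature names,
# weights and the threshold are hypothetical placeholders.
WEIGHTS = {"location": 0.5, "wifi_ssid": 0.3, "hour_band": 0.2}

def context_score(current, profile):
    """Weighted share of context features matching the stored habit profile."""
    return sum(w for k, w in WEIGHTS.items() if current.get(k) in profile.get(k, set()))

def required_auth(current, profile, threshold=0.6):
    return "single_sign_on" if context_score(current, profile) >= threshold else "step_up_password"

profile = {"location": {"home", "office"}, "wifi_ssid": {"office-wlan"}, "hour_band": {"day"}}
print(required_auth({"location": "office", "wifi_ssid": "office-wlan", "hour_band": "day"}, profile))
print(required_auth({"location": "airport", "wifi_ssid": "public", "hour_band": "night"}, profile))
```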
Abstract:
CONTEXT: Infection of implantable cardiac devices is an emerging disease with significant morbidity, mortality, and health care costs. OBJECTIVES: To describe the clinical characteristics and outcome of cardiac device infective endocarditis (CDIE) with attention to its health care association and to evaluate the association between device removal during index hospitalization and outcome. DESIGN, SETTING, AND PATIENTS: Prospective cohort study using data from the International Collaboration on Endocarditis-Prospective Cohort Study (ICE-PCS), conducted June 2000 through August 2006 in 61 centers in 28 countries. Patients were hospitalized adults with definite endocarditis as defined by modified Duke endocarditis criteria. MAIN OUTCOME MEASURES: In-hospital and 1-year mortality. RESULTS: CDIE was diagnosed in 177 (6.4% [95% CI, 5.5%-7.4%]) of a total cohort of 2760 patients with definite infective endocarditis. The clinical profile of CDIE included advanced patient age (median, 71.2 years [interquartile range, 59.8-77.6]); causation by staphylococci (62 [35.0% {95% CI, 28.0%-42.5%}] Staphylococcus aureus and 56 [31.6% {95% CI, 24.9%-39.0%}] coagulase-negative staphylococci); and a high prevalence of health care-associated infection (81 [45.8% {95% CI, 38.3%-53.4%}]). There was coexisting valve involvement in 66 (37.3% [95% CI, 30.2%-44.9%]) patients, predominantly tricuspid valve infection (43/177 [24.3%]), with associated higher mortality. In-hospital and 1-year mortality rates were 14.7% (26/177 [95% CI, 9.8%-20.8%]) and 23.2% (41/177 [95% CI, 17.2%-30.1%]), respectively. Proportional hazards regression analysis showed a survival benefit at 1 year for device removal during the initial hospitalization (28/141 patients [19.9%] who underwent device removal during the index hospitalization had died at 1 year, vs 13/34 [38.2%] who did not undergo device removal; hazard ratio, 0.42 [95% CI, 0.22-0.82]). CONCLUSIONS: Among patients with CDIE, the rate of concomitant valve infection is high, as is mortality, particularly if there is valve involvement. Early device removal is associated with improved survival at 1 year.
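The survival benefit reported above (hazard ratio 0.42 for device removal during the index hospitalization) comes from proportional hazards regression. The sketch below shows how such a hazard ratio can be estimated with the open-source lifelines package on a toy data frame; the rows are fabricated for illustration, do not reproduce ICE-PCS data, and the covariate adjustment of the original analysis is omitted.

```python
# Cox proportional hazards sketch with the lifelines package.
# Toy time-to-event data: follow-up in days, death indicator, and whether the
# device was removed during the index hospitalization (all values invented).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days":    [365, 120, 365, 30, 365, 200, 365, 90],
    "died":    [0,   1,   0,   1,  0,   1,   0,   1],
    "removed": [1,   0,   1,   0,  1,   1,   1,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days", event_col="died")
cph.print_summary()    # exp(coef) for "removed" is the estimated hazard ratio
```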
Abstract:
The paper describes how to integrate audience measurement and site visibility, the main research approaches in outdoor advertising research, into a single concept. It details how GPS is used on a large scale in Switzerland for mobility analysis and audience measurement. Furthermore, the development of a software solution is introduced that allows the integration of all mobility data and poster location information. Finally, a model and its results are presented for the calculation of the coverage of individual poster campaigns and of the number of contacts generated by each billboard.
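A minimal sketch of the two output quantities named above: "contacts" counted when GPS trajectory points fall inside a poster's visibility area, and "coverage" as the share of surveyed persons with at least one contact. The plain distance-threshold visibility test and the sample trajectories are simplifying assumptions; the Swiss model's detailed visibility and mobility weighting is not reproduced here.

```python
# Simplified contacts/coverage calculation from GPS trajectories and one
# poster location. Counting every in-range GPS point as a contact conflates
# sampling rate with passages; real models count distinct passages and use
# proper visibility areas. Inputs are hypothetical.
from math import hypot

def contacts_and_coverage(trajectories, poster_xy, visibility_radius_m=25.0):
    """trajectories: {person_id: [(x, y), ...]} in metres."""
    contacts = 0
    reached = set()
    for person, points in trajectories.items():
        hits = sum(hypot(x - poster_xy[0], y - poster_xy[1]) <= visibility_radius_m
                   for x, y in points)
        contacts += hits
        if hits:
            reached.add(person)
    coverage = len(reached) / len(trajectories)   # share of persons with >= 1 contact
    return contacts, coverage

traj = {"p1": [(0, 0), (10, 5), (300, 300)], "p2": [(400, 400)], "p3": [(5, 20), (8, 18)]}
print(contacts_and_coverage(traj, poster_xy=(0, 0)))
```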