78 results for Measurement-based quantum computing
Abstract:
Background: The TID ratio indirectly reflects myocardial ischemia and is correlated with cardiac prognosis. We aimed to compare the influence of three different software packages on the assessment of TID using Rb-82 cardiac PET/CT. Methods: In total, data from 30 patients were used, based on normal myocardial perfusion (SSS<3 and SRS<3) and stress myocardial blood flow (2 mL/min/g) assessed by Rb-82 cardiac PET/CT. After reconstruction using 2D OSEM (2 iterations, 28 subsets) and 3-D filtering (Butterworth, order=10, ωc=0.5), data were processed automatically, and then manually to define identical basal and apical limits on both stress and rest images. TID ratios were determined with the Myometrix®, ECToolbox® and QGS® software packages. Comparisons used ANOVA, Student t-tests and the Lin concordance test (ρc). Results: All 90 processings were performed successfully. TID ratios were not statistically different between software packages when data were processed automatically (P=0.2) or manually (P=0.17). There was a slight but significant relative overestimation of TID with automatic processing compared to manual processing using ECToolbox® (1.07 ± 0.13 vs 1.00 ± 0.13, P=0.001) and Myometrix® (1.07 ± 0.15 vs 1.01 ± 0.11, P=0.003), but not using QGS® (1.02 ± 0.12 vs 1.05 ± 0.11, P=0.16). The best concordance was achieved between ECToolbox® and Myometrix® manual processing (ρc=0.67). Conclusion: Whether in automatic or manual mode, TID estimation was not significantly influenced by software type. With Myometrix® or ECToolbox®, TID differed significantly between automatic and manual processing, but not with QGS®. The software package should be accounted for when defining TID normal reference limits, as well as in multicenter studies. QGS® appeared to be the most operator-independent software package, while ECToolbox® and Myometrix® produced the closest results.
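The Lin concordance coefficient (ρc) used for the software comparison has a closed form; a minimal sketch in Python, with hypothetical TID ratios standing in for the study's data:

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient (rho_c) between two raters."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()            # population variances
    sxy = ((x - mx) * (y - my)).mean()   # population covariance
    return 2 * sxy / (vx + vy + (mx - my) ** 2)

# Hypothetical TID ratios from two software packages (not the study's data)
a = [1.00, 1.05, 1.10, 0.95, 1.02]
b = [1.02, 1.06, 1.08, 0.97, 1.01]
rho_c = lin_ccc(a, b)
```

Unlike Pearson's r, ρc penalises both location and scale shifts between the two packages, which is why it suits agreement studies.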
Abstract:
PURPOSE: Awareness of being monitored can influence participants' habitual physical activity (PA) behavior. This reactivity effect may threaten the validity of PA assessment. Reports on reactivity when measuring the PA of children and adolescents have been inconsistent. The aim of this study was to investigate whether PA outcomes measured by accelerometer devices differ from measurement day to measurement day and whether the day of the week and the day on which measurement started influence these differences. METHODS: Accelerometer data (counts per minute [cpm]) of children and adolescents (n = 2081) pooled from eight studies in Switzerland with at least 10 h of daily valid recording were investigated for effects of measurement day, day of the week, and start day using mixed linear regression. RESULTS: The first measurement day was the most active day. Counts per minute were significantly higher than on the second to the sixth day, but not on the seventh day. Differences in the age-adjusted means between the first and consecutive days ranged from 23 to 45 cpm (3.6%-7.1%). In preschool children, the differences almost reached 10%. The start day significantly influenced PA outcome measures. CONCLUSIONS: Reactivity to accelerometer measurement of PA is likely to be present to an extent of approximately 5% on the first day and may introduce a relevant bias to accelerometer-based studies. In preschool children, the effects are larger than those in elementary and secondary schoolchildren. As the day of the week and the start day significantly influence PA estimates, researchers should plan for at least one familiarization day in school-age children and randomly assign start days.
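The size of the first-day reactivity effect can be illustrated on synthetic accelerometer counts; the sample size, cpm levels and the 5% first-day inflation below are all assumed for illustration, not taken from the pooled studies:

```python
import numpy as np

rng = np.random.default_rng(0)
n_children, n_days = 50, 7
# Each child has a habitual activity level; daily cpm fluctuate around it
base = rng.normal(600, 60, size=(n_children, 1))
cpm = base + rng.normal(0, 30, size=(n_children, n_days))
cpm[:, 0] *= 1.05  # simulate ~5% reactivity on the first measurement day

day_means = cpm.mean(axis=0)
# Relative excess of day 1 over the mean of the remaining days, in percent
excess_pct = 100 * (day_means[0] - day_means[1:].mean()) / day_means[1:].mean()
```

Dropping or down-weighting the first (familiarization) day, as the authors recommend, removes most of this bias.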
Abstract:
Abstract: The occupational health risk involved with handling nanoparticles is the probability that a worker will experience an adverse health effect: this is calculated as a function of the worker's exposure relative to the potential biological hazard of the material. Addressing the risks of nanoparticles therefore requires knowledge of occupational exposure and the release of nanoparticles into the environment, as well as toxicological data. However, information on exposure is currently not systematically collected; this risk assessment therefore lacks quantitative data. This thesis aimed, first, at creating the fundamental data necessary for a quantitative assessment and, second, at evaluating methods to measure occupational nanoparticle exposure. The first goal was to determine what is being used, and where, in Swiss industries. This was followed by an evaluation of the adequacy of existing measurement methods for assessing workplace nanoparticle exposure to complex size distributions and concentration gradients. The study was conceived as a series of methodological evaluations aimed at better understanding nanoparticle measurement devices and methods. It focused on inhalation exposure to airborne particles, as respiration is considered the most important entrance pathway for nanoparticles into the body in terms of risk. The targeted survey (pilot study) was conducted as a feasibility study for a later nationwide survey on the handling of nanoparticles and the application of specific protection means in industry. The study consisted of targeted phone interviews with health and safety officers of Swiss companies believed to use or produce nanoparticles. This was followed by a representative survey on the level of nanoparticle usage in Switzerland, designed based on the results of the pilot study.
The study was conducted among a representative selection of clients of the Swiss National Accident Insurance Fund (SUVA), covering about 85% of Swiss production companies. The third part of this thesis focused on the methods to measure nanoparticles. Several pre-studies were conducted to study the limits of commonly used measurement devices in the presence of nanoparticle agglomerates. This focus was chosen because several discussions with users and producers of the measurement devices raised questions about their accuracy in measuring nanoparticle agglomerates, and because, at the same time, the two survey studies revealed that such powders are frequently used in industry. The first preparatory experiment focused on the accuracy of the scanning mobility particle sizer (SMPS), which showed an improbable size distribution when measuring powders of nanoparticle agglomerates. Furthermore, the thesis includes a series of smaller experiments that took a closer look at problems encountered with other measurement devices in the presence of nanoparticle agglomerates: condensation particle counters (CPC), the portable aerosol spectrometer (PAS, a device to estimate the aerodynamic diameter), as well as diffusion size classifiers. Some initial feasibility tests of the efficiency of filter-based sampling and subsequent counting of carbon nanotubes (CNT) were conducted last. The pilot study provided a detailed picture of the types and amounts of nanoparticles used and of the knowledge of the health and safety experts in the companies. Considerable maximal quantities (>1'000 kg/year per company) of Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO (mainly first-generation particles) were declared by the contacted Swiss companies. The median quantity of handled nanoparticles, however, was 100 kg/year. The representative survey was conducted by contacting a representative selection of 1'626 SUVA clients by post.
It allowed estimation of the number of companies and workers dealing with nanoparticles in Switzerland. The extrapolation from the surveyed companies to all companies of the Swiss production sector suggested that 1'309 workers (95% confidence interval: 1'073 to 1'545) of the Swiss production sector are potentially exposed to nanoparticles in 586 companies (145 to 1'027). These numbers correspond to 0.08% (0.06% to 0.09%) of all workers and to 0.6% (0.2% to 1.1%) of companies in the Swiss production sector. A few well-known methods exist to measure airborne concentrations of sub-micrometre-sized particles. However, it was unclear how well the different instruments perform in the presence of the often quite large agglomerates of nanostructured materials. The evaluation of devices and methods focused on nanoparticle agglomerate powders. It allowed the identification of the following potential sources of inaccurate measurements at workplaces with considerably high concentrations of airborne agglomerates: - A standard SMPS showed bi-modal particle size distributions when measuring large nanoparticle agglomerates. - Differences in the range of a factor of a thousand were shown between diffusion size classifiers and CPC/SMPS. - The comparison between CPC/SMPS and the portable aerosol spectrometer (PAS) was much better but, depending on the concentration, size or type of the powders measured, the differences were still of a high order of magnitude. - Specific difficulties and uncertainties in the assessment of workplaces were identified: background particles can interact with particles created by a process, which makes the handling of the background concentration difficult. - Electric motors produce high numbers of nanoparticles and confound the measurement of process-related exposure. Conclusion: The surveys showed that nanoparticle applications exist in many industrial sectors in Switzerland and that some companies already use high quantities of them.
The representative survey demonstrated a low prevalence of nanoparticle usage in most branches of Swiss industry and led to the conclusion that the introduction of applications using nanoparticles (especially outside industrial chemistry) is only beginning. Even though the number of potentially exposed workers was reportedly rather small, it nevertheless underscores the need for exposure assessments. Understanding exposure and how to measure it correctly is very important because the potential health effects of nanomaterials are not yet fully understood. The evaluation showed that many devices and methods for measuring nanoparticles need to be validated for nanoparticle agglomerates before large exposure assessment studies can begin.
Abstract:
Measurements and simulations were performed to assess workers' exposure to solvent vapors and aerosols during the waterproofing of a tiled surface. This investigation followed two recent incidents in the same company where workers experienced acute respiratory illness after spraying a stain-repellent resin containing fluorinated polymers on stone-tiled walls and floors. Because the waterproofing activity had been done for years at the tile company without encountering any exposure problems prior to these cases, it was strongly suspected that the incidents were linked to a recent change in the composition of the coating mixture. Experimental measurements and simulations indicated that the emission rate of particles smaller than 10 µm may be estimated at 0.66 mg/sec (SD 0.10) for the old resin and at 0.37 mg/sec (SD 0.04) for the new one. The measurement of the solvent emission rate from surfaces coated with the two resins indicated that shortly after spraying, the emission was in the range of 18 to 20 mg/sec per m² and was similar for both products. Solvent and overspray emission rates were introduced into a two-zone compartment model. The results obtained in the near field indicate significant exposure to overspray mist (7 and 34 mg/m³ for the new resin) and solvent vapors (80 to 350 ppm for the new resin). It was also shown that the introduction of the new resin tended to significantly decrease the levels of solvents and particulates in the workers' breathing zone. These results strongly suggest that the cases of acute respiratory illness are related to the specific toxicity of the fluorinated polymer itself. The fact that the same polymer is used in various commercial products raises concern regarding other possible occupational and domestic exposures.
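The two-zone compartment model mentioned above has a simple steady-state solution: the far-field concentration is G/Q and the near-field concentration adds G/β, where G is the emission rate, Q the room ventilation and β the near-field/far-field airflow. A textbook sketch; only the solvent emission rate is loosely based on the abstract, while Q and β are assumed values:

```python
# Steady-state two-zone (near-field/far-field) model, a textbook sketch.
# G is a lumped source term; Q and beta are assumed, not from the study.
G = 19.0      # mg/s, emission rate (mid-range of the reported 18-20 mg/s per m2, 1 m2 assumed)
Q = 0.5       # m^3/s, room ventilation rate (assumed)
beta = 0.1    # m^3/s, near-field/far-field air exchange (assumed)

C_far = G / Q                # far-field concentration, mg/m^3
C_near = C_far + G / beta    # near-field concentration, mg/m^3
```

The extra G/β term is why breathing-zone (near-field) levels can far exceed room-average estimates, as the abstract's near-field results show.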
Abstract:
This paper presents the segmentation of the bilateral parotid glands in Head and Neck (H&N) CT images using an active-contour-based atlas registration. We compare segmentation results from three atlas selection strategies: (i) selection of the "single-most-similar" atlas for each image to be segmented, (ii) fusion of segmentation results from multiple atlases using STAPLE, and (iii) fusion of segmentation results using majority voting. Among these three approaches, fusion using majority voting provided the best results. Finally, we present a detailed evaluation on a dataset of eight images (provided as part of the H&N auto-segmentation challenge conducted in conjunction with the MICCAI 2010 conference) using the majority voting strategy.
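Majority-voting fusion, the best-performing strategy in this comparison, assigns each voxel the label chosen by most atlas-propagated segmentations. A sketch assuming binary masks, with 1-D arrays standing in for images:

```python
import numpy as np

def majority_vote(label_maps):
    """Fuse binary segmentations by per-voxel strict majority voting."""
    stack = np.stack(label_maps)   # shape: (n_atlases, ...) binary masks
    votes = stack.sum(axis=0)      # number of atlases labeling each voxel 1
    return (votes * 2 > stack.shape[0]).astype(np.uint8)

# Three toy "segmentations" standing in for atlas-propagated masks
seg1 = np.array([1, 1, 0, 0, 1])
seg2 = np.array([1, 0, 0, 1, 1])
seg3 = np.array([1, 1, 1, 0, 0])
fused = majority_vote([seg1, seg2, seg3])
```

Unlike STAPLE, this fusion needs no performance model for each atlas, which partly explains its robustness on small datasets.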
Abstract:
Objectives: The AMS 800 is the current artificial urinary sphincter (AUS) for incontinence due to intrinsic sphincter deficiency. Despite good clinical results, technical failures inherent to the hydraulic mechanism or urethral ischemic injury contribute to revision rates of up to 60%. We are developing an electronic AUS, called ARTUS, to overcome the shortcomings of the AMS. The objective of this study was to evaluate the technical efficacy and tissue tolerance of the ARTUS system in an animal model. Methods: The ARTUS is composed of three parts: the contractile unit, a series of rings and an integrated microprocessor. The contractile unit is made of Nitinol fibers. The rings are placed around the urethra to control the flow of urine by squeezing the urethra. They work in a sequential alternating mode and are controlled by a microprocessor. In the first phase a three-ring device was used, while in the second phase a two-ring ARTUS was used. The device was implanted in 14 sheep divided into two groups of six and eight animals. The first group aimed at bladder leak point pressure (BLPP) measurement and validation of the animal model; the second group aimed at verifying midterm tissue tolerance by explants at twelve weeks. General animal tolerance was also evaluated. Results: The ARTUS system implantation was uneventful. When the system was activated, the BLPP was measured at 1.038 ± 0.044 bar (mean ± SD). Urethral tissue analysis did not show significant morphological changes. No infection and no sign of discomfort were noted in the animals at 12 weeks. Conclusions: The ARTUS proved to be effective in achieving continence in this study. Histological results support our idea that a sequential alternating mode can avoid urethral atrophy and ischemia. Further technical developments are needed to verify long-term outcome and permit human use.
Abstract:
Ubiquitous computing is the emerging trend in computing systems. Based on this observation, this thesis proposes an analysis of the hardware and environmental constraints that rule pervasive platforms. These constraints have a strong impact on the programming of such platforms, so solutions are proposed to facilitate this programming at both the platform and node levels. The first contribution presented in this document combines agent-oriented programming with the principles of bio-inspiration (Phylogenesis, Ontogenesis and Epigenesis) to program pervasive platforms such as the PERvasive computing framework for modeling comPLEX virtually Unbounded Systems platform. The second contribution proposes a method to efficiently program parallelizable applications on each computing node of this platform.
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available in some programs. Eight programs offer the ability to add new drug models based on population PK data. Ten computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks.
The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be considered with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and efforts have been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability and automated report generation.
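The Bayesian a posteriori step these tools automate can be sketched as a maximum a posteriori (MAP) estimate balancing a log-normal population prior against a measured level. Everything below (a deliberately minimal one-compartment IV bolus model, the prior, and the observation) is illustrative and not taken from any of the benchmarked programs:

```python
import numpy as np

# MAP estimate of clearance for a 1-compartment IV bolus model (sketch).
dose, V = 500.0, 30.0     # mg, L (volume fixed for simplicity)
t_obs, c_obs = 8.0, 7.5   # h, mg/L: one measured level
cl_pop, omega = 5.0, 0.3  # prior: log-normal, median 5 L/h, ~30% CV
sigma = 0.5               # residual SD of the assay, mg/L

# Grid search over clearance: minimize residual + prior penalty
cl_grid = np.linspace(1.0, 15.0, 2801)
pred = dose / V * np.exp(-cl_grid / V * t_obs)
neg_log_post = ((c_obs - pred) / sigma) ** 2 + (np.log(cl_grid / cl_pop) / omega) ** 2
cl_map = cl_grid[np.argmin(neg_log_post)]
```

Real TDM software does this over full population PK models and several covariates; the grid search here just makes the prior/likelihood trade-off visible: the measured level pulls the estimate below the population median.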
Abstract:
The relaxivity of commercially available gadolinium (Gd)-based contrast agents was studied for X-nuclei resonances with long intrinsic relaxation times, ranging from 6 s to several hundred seconds. Omniscan in pure 13C formic acid had a relaxivity of 2.9 mM⁻¹ s⁻¹, whereas its relaxivity on glutamate C1 and C5 in aqueous solution was approximately 0.5 mM⁻¹ s⁻¹. Both relaxivities allow the preparation of solutions with a predetermined short T1 and suggest that substantial sensitivity gains in their measurement can be achieved in vitro. 6Li has a long intrinsic relaxation time, on the order of several minutes, which was strongly affected by the contrast agents. Relaxivity ranged from approximately 0.1 mM⁻¹ s⁻¹ for Omniscan to 0.3 mM⁻¹ s⁻¹ for Magnevist, whereas the relaxivity of Gd-DOTP was 11 mM⁻¹ s⁻¹, two orders of magnitude higher. Overall, these experiments suggest that the presence of 0.1 to 10 µM contrast agents should be detectable, provided sufficient sensitivity is available, such as that afforded by hyperpolarization, recently introduced to in vivo imaging.
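These relaxivities set the achievable T1 through the standard linear relation 1/T1_obs = 1/T1_intrinsic + r1·[Gd]; a small sketch using the Omniscan value from the abstract (the intrinsic T1 and the concentration are illustrative choices):

```python
# Longitudinal relaxivity relation: 1/T1_obs = 1/T1_intrinsic + r1 * [Gd]
r1 = 2.9             # mM^-1 s^-1, Omniscan in 13C formic acid (from the abstract)
T1_intrinsic = 60.0  # s, long X-nucleus T1 (illustrative)
conc_mM = 0.5        # mM of contrast agent (illustrative)

T1_obs = 1.0 / (1.0 / T1_intrinsic + r1 * conc_mM)  # resulting shortened T1, s
```

Even a sub-millimolar dose collapses a minute-scale T1 to well under a second, which is the "predetermined short T1" the abstract exploits.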
Abstract:
Usually the measurement of multi-segment foot and ankle complex kinematics is done with stationary motion capture devices which are limited to use in a gait laboratory. This study aimed to propose and validate a wearable system to measure the foot and ankle complex joint angles during gait in daily conditions, and then to investigate its suitability for clinical evaluations. The foot and ankle complex consisted of four segments (shank, hindfoot, forefoot, and toes), with an inertial measurement unit (3D gyroscopes and 3D accelerometers) attached to each segment. The angles between the four segments were calculated in the sagittal, coronal, and transverse planes using a new algorithm combining strap-down integration and detection of low-acceleration instants. To validate the joint angles measured by the wearable system, three subjects walked on a treadmill for five minutes at three different speeds. A camera-based stationary system that used a cluster of markers on each segment was used as a reference. To test the suitability of the system for clinical evaluation, the joint angle ranges were compared between a group of 10 healthy subjects and a group of 12 patients with ankle osteoarthritis, during two 50-m walking trials where the wearable system was attached to each subject. On average, over all joints and walking speeds, the RMS differences and correlation coefficients between the angular curves obtained using the wearable system and the stationary system were 1 deg and 0.93, respectively. Moreover, this system was able to detect significant alteration of foot and ankle function between the group of patients with ankle osteoarthritis and the group of healthy subjects. In conclusion, this wearable system was accurate and suitable for clinical evaluation when used to measure the multi-segment foot and ankle complex kinematics during long-distance walks in daily life conditions.
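The strap-down-plus-still-instant idea can be shown in one dimension: integrating a biased angular-rate signal drifts linearly, and anchoring the angle at two detected low-acceleration ("still") instants removes that drift. A toy illustration, not the paper's algorithm:

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 2.0, dt)
true_rate = np.sin(2 * np.pi * t)  # rad/s, synthetic sagittal-plane angular rate
gyro = true_rate + 0.05            # constant gyroscope bias of 0.05 rad/s

# Naive strap-down integration: the bias accumulates as linear drift
angle = np.cumsum(gyro) * dt

# Assume the recording starts and ends at still instants (angle = 0 there)
# and subtract the linear drift between those two anchors
drift_free = angle - np.linspace(0.0, angle[-1], len(angle))
```

In the wearable system the still instants come from the accelerometers (low-acceleration detection), and the correction is done in 3-D; the principle of resetting integration drift between anchors is the same.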
Abstract:
This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted from the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the results of state-of-the-art targeting methods and at the same time to reduce the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting expert's variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.
Abstract:
This paper presents automated segmentation of structures in the Head and Neck (H&N) region, using an active-contour-based joint registration and segmentation model. A new atlas selection strategy is also used. Segmentation is performed based on the dense deformation field computed from the registration of selected structures in the atlas image that have distinct boundaries, onto the patient's image. This approach results in robust segmentation of the structures of interest, even in the presence of tumors or anatomical differences between the atlas and the patient image. For each patient, an atlas image is selected from the available atlas database, based on the similarity metric value computed after performing an affine registration between each image in the atlas database and the patient's image. Unlike many previous approaches in the literature, the similarity metric is not computed over the entire image region; rather, it is computed only in the regions of the soft tissue structures to be segmented. Qualitative and quantitative evaluation of the results is presented.
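Restricting the similarity metric to the soft-tissue regions amounts to masking before correlating; a sketch using normalised cross-correlation on synthetic images (the specific metric and all values are assumptions for illustration, not the paper's choices):

```python
import numpy as np

def masked_similarity(img_a, img_b, mask):
    """Normalised cross-correlation computed only inside a region of interest."""
    x = img_a[mask].astype(float)
    y = img_b[mask].astype(float)
    x = x - x.mean()
    y = y - y.mean()
    return float((x * y).sum() / (np.linalg.norm(x) * np.linalg.norm(y)))

rng = np.random.default_rng(1)
atlas = rng.normal(size=(32, 32))
patient = atlas + 0.1 * rng.normal(size=(32, 32))  # a similar "patient" image
mask = np.zeros((32, 32), dtype=bool)
mask[8:24, 8:24] = True                            # hypothetical soft-tissue ROI
score = masked_similarity(atlas, patient, mask)
```

Scoring only inside the ROI keeps bone, air, and tumor regions from dominating the atlas ranking, which is the motivation given in the abstract.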
Abstract:
Tumour immunologists strive to develop efficient tumour vaccination and adoptive transfer therapies that enlarge the pool of tumour-specific and -reactive effector T-cells in vivo. To assess the efficiency of the various strategies, ex vivo assays are needed for the longitudinal monitoring of the patient's specific immune responses, providing both quantitative and qualitative data. In particular, since tumour cell cytolysis is the end goal of tumour immunotherapy, routine immune monitoring protocols need to include a read-out for the cytolytic efficiency of Ag-specific cells. We propose to combine current immune monitoring techniques in a highly sensitive and reproducible multi-parametric flow cytometry-based cytotoxicity assay that has been optimised to require low numbers of Ag-specific T-cells. The possibility of re-analysing those T-cells that have undergone lytic activity is illustrated by the concomitant detection of CD107a upregulation on the surface of degranulated T-cells. To date, the LiveCount Assay provides the only possibility of assessing the ex vivo cytolytic activity of low-frequency Ag-specific cytotoxic T-lymphocytes from patient material.
Abstract:
BACKGROUND: Chronic hepatitis C infection is a major cause of end-stage liver disease. Therapy outcome is influenced by 25-OH vitamin D deficiency. To further address this observation, our study investigates the impact of the vitamin D receptor (NR1I1) haplotype and the combined effects of plasma vitamin D levels in a well-described cohort of hepatitis C patients. METHODS: A total of 155 chronic hepatitis C patients were recruited from the Swiss Hepatitis C Cohort Study for NR1I1 genotyping and plasma 25-OH vitamin D level measurement. NR1I1 genotype data and combined effects of plasma 25-OH vitamin D level were analysed with respect to therapy response (sustained virological response). RESULTS: A strong association was observed between therapy non-response and the NR1I1 CCA (bAt) haplotype consisting of the rs1544410 (BsmI) C, rs7975232 (ApaI) C and rs731236 (TaqI) A alleles. Of the HCV patients carrying the CCA haplotype, 50.3% were non-responders (odds ratio [OR] 1.69, 95% CI 1.07, 2.67; P=0.028). A similar association was observed for the combinational CCCCAA genotype (OR 2.94, 95% CI 1.36, 6.37; P=0.007). The combinational CCCCAA genotype was confirmed as an independent risk factor for non-response in multivariate analysis (OR 2.50, 95% CI 1.07, 5.87; P=0.034). Analysing combined effects, a significant impact of low 25-OH vitamin D levels on sustained virological response was seen only in patients with the unfavourable NR1I1 CCA (bAt) haplotype (OR for non-SVR 3.55; 95% CI 1.005, 12.57; P=0.049). CONCLUSIONS: NR1I1 vitamin D receptor polymorphisms influence response to pegylated-interferon/ribavirin-based therapy in chronic hepatitis C and act additively with the previously described effect of low 25-OH vitamin D serum levels.
Abstract:
AIM: Total imatinib concentrations are currently measured for the therapeutic drug monitoring of imatinib, whereas only the free drug equilibrates with cells for pharmacological action. Owing to technical and cost limitations, routine measurement of free concentrations is generally not performed. In this study, free and total imatinib concentrations were measured to establish a model allowing the confident prediction of free imatinib concentrations from total concentrations and plasma protein measurements. METHODS: One hundred and fifty total and free plasma concentrations of imatinib were measured in 49 patients with gastrointestinal stromal tumours. A population pharmacokinetic model was built to characterize mean total and free concentrations with inter-patient and intra-patient variability, while taking into account α1-acid glycoprotein (AGP) and human serum albumin (HSA) concentrations, in addition to other demographic and environmental covariates. RESULTS: A one-compartment model with first-order absorption was used to characterize total and free imatinib concentrations. Only AGP influenced imatinib total clearance. Imatinib free concentrations were best predicted using a non-linear binding model to AGP, with a dissociation constant Kd of 319 ng/mL, assuming a 1:1 molar binding ratio. Adding HSA to the equation did not improve the prediction of imatinib unbound concentrations. CONCLUSION: Although monitoring free concentrations is probably more appropriate than monitoring total concentrations, it requires an additional ultrafiltration step and sensitive analytical technology that are not always available in clinical laboratories. The proposed model might represent a convenient approach to estimating imatinib free concentrations. However, therapeutic ranges for free imatinib concentrations remain to be established before this approach enters routine practice.
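The non-linear AGP binding model lends itself to a worked sketch. With a single saturable 1:1 site, mass balance gives total = free + B·free/(Kd + free), where B is the AGP binding capacity expressed in drug-equivalent units; this is a quadratic that can be solved for the free concentration. Only Kd = 319 ng/mL comes from the abstract; the molecular weights used here to convert AGP (g/L) into a capacity B (ng/mL) are approximate textbook values, not parameters from the study.

```python
import math

def free_imatinib(total, agp, kd=319.0, mw_imatinib=493.6, mw_agp=40000.0):
    """Estimate free imatinib (ng/mL) from total imatinib (ng/mL) and
    plasma AGP (g/L), assuming one saturable 1:1 binding site per AGP
    molecule. Molecular weights are approximate, illustrative values.

    Mass balance: total = free + B * free / (kd + free)
    => free**2 + (kd + B - total) * free - kd * total = 0
    """
    # AGP (g/L) -> molar (mol/L) -> drug-equivalent capacity B in ng/mL
    b = agp / mw_agp * mw_imatinib * 1e6
    a = kd + b - total
    # Positive root of the quadratic in the free concentration
    return (-a + math.sqrt(a * a + 4.0 * kd * total)) / 2.0
```

For a typical total concentration of 1000 ng/mL and AGP of 1 g/L, this yields a free fraction of a few percent, consistent with imatinib being highly protein-bound; the model's clinical use would of course require the validated parameters from the study rather than these placeholder conversions.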