836 results for Text retrieval


Relevance: 20.00%

Abstract:

PURPOSE: To determine the incidence of venous thromboembolism (VTE) after removal of retrievable inferior vena cava (IVC) filters. MATERIALS AND METHODS: A retrospective study was conducted of 67 patients who underwent 72 consecutive filter retrievals at a single institution. Data collected included VTE status at the time of filter placement, anticoagulant medications at the time of filter retrieval and afterward, new or recurrent VTE after filter removal, and insertion of subsequent filters. Patient questionnaires were completed in 50 cases, and chart review was performed for all patients. RESULTS: At the time of filter placement, 30 patients had documented VTE, 19 had a history of treated VTE, and 23 were at risk for but had neither previous nor present VTE. Mean duration of follow-up after filter removal was 20.6 months ± 10.9. A total of 52 patients (57 filters) received anticoagulation and/or antiplatelet medications after filter removal. There were two documented episodes of recurrent deep vein thrombosis (2.8% of filters removed), both in patients who had VTE at the time of filter placement and were receiving therapeutic anticoagulation at the time of filter removal. One of these patients (1.4% of filters removed) also experienced pulmonary embolism. Of the 23 patients without VTE when the filter was placed, none developed VTE after filter removal. Four patients (5.5% of filters removed) required subsequent permanent filters: three for complications of anticoagulation and one for failure of anticoagulation. CONCLUSIONS: VTE was rare after removal of IVC filters but was most likely to occur in patients who had VTE at the time of filter placement.

Relevance: 20.00%

Abstract:

BACKGROUND: High intercoder reliability (ICR) is required in qualitative content analysis to assure quality when more than one coder is involved in data analysis. The literature offers few standardized procedures for ICR assessment in qualitative content analysis. OBJECTIVE: To illustrate how ICR assessment can be used to improve codings in qualitative content analysis. METHODS: Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain. RESULTS: First, a coding scheme was developed using a comprehensive inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. The resulting kappa value of .67 can be regarded as satisfactory to solid. Moreover, varying agreement rates helped to identify problems in the coding scheme; low agreement rates, for instance, indicated that the respective codes were defined too broadly and needed clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results. DISCUSSION: The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase the quality of codings. This approach facilitates good coding practice and enhances the credibility of the analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
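The kappa statistic referenced above can be computed directly from the two coders' category assignments. Below is a minimal sketch of Cohen's kappa (one common choice of ICR statistic) for two coders, assuming each transcript segment has been assigned exactly one code by each coder; the function name and example codes are illustrative and are not taken from the study.

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders assigning one code per segment."""
    assert len(codes_a) == len(codes_b), "coders must rate the same segments"
    n = len(codes_a)

    # Observed agreement: fraction of segments coded identically.
    p_observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n

    # Chance agreement, from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_expected = sum(
        (freq_a[c] / n) * (freq_b[c] / n) for c in set(codes_a) | set(codes_b)
    )

    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical codings of ten interview segments by two researchers.
coder_1 = ["pain", "coping", "pain", "work", "coping", "pain", "work", "pain", "coping", "work"]
coder_2 = ["pain", "coping", "work", "work", "coping", "pain", "work", "pain", "pain", "work"]
print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")
```

In this toy example the coders agree on 8 of 10 segments, and the kappa value discounts the agreement expected by chance from the coders' marginal code frequencies.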

Relevance: 20.00%

Abstract:

An optional inferior vena cava (IVC) filter prototype was evaluated for safety and long-term retrievability in an initial feasibility study in an animal model. The filter has four centering struts that can disengage from the filtering cone portion, allowing the legs to slide out of endothelial growth. Retrieval of six filters in three animals was successful at up to 27 weeks. There was no substantial filter tilt, migration, or IVC damage. In conclusion, this filter design may help overcome some of the shortcomings of currently approved optional IVC filters, including difficulty with long-term retrieval, tilting, and migration.

Relevance: 20.00%

Abstract:

Obesity is becoming an epidemic phenomenon in most developed countries. The fundamental cause of obesity and overweight is an energy imbalance between calories consumed and calories expended, so monitoring everyday food intake is essential for obesity prevention and management. Existing dietary assessment methods usually require manual recording and recall of food types and portions, and their accuracy depends on uncertain factors such as the user's memory, food knowledge, and portion estimates; as a result, accuracy is often compromised. Accurate and convenient dietary assessment methods are still lacking in both the general population and the research community.

In this thesis, an automatic food intake assessment method using the cameras and inertial measurement units (IMUs) of smart phones was developed to help people foster a healthy lifestyle. With this method, users capture images or videos of the meal with their smart phones before and after eating. The phone recognizes the food items, calculates the volume of the food consumed, and reports the results to the user. The technical objective is to explore the feasibility of image-based food recognition and image-based volume estimation. This thesis comprises five publications that address four specific goals: (1) develop a prototype system with existing methods in order to review the literature, identify its drawbacks, and explore the feasibility of developing novel methods; (2) building on the prototype, investigate new food classification methods that raise recognition accuracy to a level suitable for field application; (3) design indexing methods for large-scale image databases to facilitate the development of new food image recognition and retrieval algorithms; (4) develop novel, convenient, and accurate food volume estimation methods using only smart phones with cameras and IMUs.

A prototype system was implemented to review existing methods. An image feature detector and descriptor were developed, and a nearest-neighbor classifier was implemented to classify food items. A credit card marker method was introduced for metric-scale 3D reconstruction and volume calculation. To increase recognition accuracy, novel multi-view food recognition algorithms were developed to recognize regularly shaped food items. To further increase accuracy and make the algorithms applicable to arbitrary food items, new food features and new classifiers were designed. The efficiency of the algorithms was increased by developing a novel image indexing method for large-scale image databases. Finally, the volume calculation was enhanced by reducing reliance on the marker and introducing IMUs: sensor fusion techniques combining measurements from cameras and IMUs were explored to infer the metric scale of the 3D model and to reduce noise from these sensors.
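The credit card marker works because the card's physical dimensions are standardized, so its apparent size in an image fixes the metric scale of an otherwise scale-free reconstruction. The sketch below illustrates that idea under simplifying assumptions (the card lies beside the food, and the footprint and height measurements come from views where the card appears at roughly the same distance); the function names and measurements are illustrative and are not taken from the thesis.

```python
# Standard ID-1 (credit card) dimensions in millimetres.
CARD_WIDTH_MM = 85.60
CARD_HEIGHT_MM = 53.98

def mm_per_pixel(card_width_px: float) -> float:
    """Metric scale factor derived from the card's measured width in the image."""
    return CARD_WIDTH_MM / card_width_px

def estimate_volume_ml(footprint_area_px: float, height_px: float,
                       card_width_px: float) -> float:
    """Rough food volume from a footprint area and a height, both measured in
    pixels and converted to metric units with the card-derived scale factor."""
    scale = mm_per_pixel(card_width_px)          # mm per pixel
    area_mm2 = footprint_area_px * scale ** 2    # pixel area  -> mm^2
    height_mm = height_px * scale                # pixel length -> mm
    return area_mm2 * height_mm / 1000.0         # mm^3 -> millilitres

# Hypothetical measurements taken from a meal photograph.
volume = estimate_volume_ml(footprint_area_px=52_000, height_px=110, card_width_px=430)
print(f"estimated volume: {volume:.0f} ml")
```

Replacing the marker with camera-IMU sensor fusion, as the thesis proposes, serves the same purpose: recovering the missing metric scale without requiring a reference object in every photograph.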

Relevance: 20.00%

Abstract:

This work opens a series of volumes devoted to publishing the most important materials on the Swiss Civil Code (Schweizerisches Zivilgesetzbuch, ZGB). The series is intended to add a codification-history dimension to the Berner Kommentar on Swiss private law and to serve as the starting point for commentary on the law in force. The first volume was completed specifically for the centenary celebration of the ZGB on 10 December 2007. Containing the explanatory report on the 1900 preliminary draft prepared by the Federal Department of Justice and Police (Eidg. Justiz- und Polizeidepartement), it presents what is perhaps the most beautiful and stimulating, and certainly the most monumental, work among these materials. It testifies to Eugen Huber's masterful command of the substance of private law and to a far-sightedness that should still serve as a model for legislative work today. For the first time in decades, it is again made accessible to a broader public in a form faithful to the original. Margin notes have been added to allow precise citation, and a subject index ensures that passages can be located quickly.