976 results for Diagnostic techniques and procedures


Relevance:

100.00%

Publisher:

Abstract:

X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, nearly 85 million of them in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to grow, driven mainly by advanced technologies such as multi-energy [4], photon-counting [5], and cone-beam CT [6].

Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.

As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus the community's responsibility to optimize the radiation dose delivered by CT examinations. The key to dose optimization is to determine the minimum radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics that characterize the radiation dose and image quality of a CT exam. Moreover, if accurate predictions of radiation dose and image quality were possible before the initiation of the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models to prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to implement the theoretical models in clinical practice by developing an organ-based dose monitoring system and an image-based noise addition software tool for protocol optimization.

More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current conditions. The study effectively modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions, and further evaluated the dependence of organ dose coefficients on patient size and scanner model. Distinct from prior work, this study uses the largest number of patient models to date, with representative age, weight percentile, and body mass index (BMI) ranges.
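The size dependence of the organ dose coefficients can be illustrated with a small fit. Prior CT dosimetry studies have reported an approximately exponential decrease of CTDIvol-normalized organ dose with patient effective diameter; the sketch below fits that form to made-up illustrative numbers, not the thesis data:

```python
import numpy as np

# Illustrative only: organ dose normalized by CTDIvol (mGy/mGy) versus
# patient effective diameter (cm). Values are invented for the sketch.
diameter = np.array([18.0, 22.0, 26.0, 30.0, 34.0])
dose_coeff = np.array([1.60, 1.25, 0.98, 0.77, 0.60])

# Model h(d) = a * exp(-b * d) by fitting a straight line in log space.
slope, intercept = np.polyfit(diameter, np.log(dose_coeff), 1)
a, b = np.exp(intercept), -slope

def predict_coeff(d):
    """Predicted dose coefficient for a patient of effective diameter d (cm)."""
    return a * np.exp(-b * d)
```

Multiplying the predicted coefficient by the exam CTDIvol would then give a prospective organ dose estimate for a patient of known size.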

With organ dose effectively quantified under constant tube current conditions, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model was validated by comparing the predicted organ dose with dose estimates from Monte Carlo simulations in which the TCM function was explicitly modeled.

Chapter 5 aims to implement the organ dose-estimation framework in clinical practice by developing an organ dose-monitoring program based on commercial software (Dose Watch, GE Healthcare, Waukesha, WI). The first phase of the study focused on body CT examinations, so the patient's major body landmark information was extracted from the patient scout image in order to match clinical patients against a computational phantom in the library. The organ dose coefficients were estimated based on the CT protocol and patient size as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.

With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.

Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
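Step (1), the image-based noise addition, rests on the fact that quantum noise scales inversely with the square root of dose, so the noise to add for a target dose fraction follows from variance addition. A minimal white-Gaussian sketch is shown below; a clinical tool would additionally shape the noise by the scanner's noise power spectrum and local attenuation:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_reduced_dose(image, sigma_full, dose_fraction):
    """Add zero-mean Gaussian noise so total quantum noise matches a
    reduced-dose acquisition. Since noise ~ 1/sqrt(dose), the target
    standard deviation is sigma_full / sqrt(dose_fraction), and the
    noise to add has sigma_add = sigma_full * sqrt(1/dose_fraction - 1).
    """
    sigma_add = sigma_full * np.sqrt(1.0 / dose_fraction - 1.0)
    return image + rng.normal(0.0, sigma_add, image.shape)

# Toy "full-dose" image with sigma = 10 HU of quantum noise; simulate 50% dose.
full = rng.normal(50.0, 10.0, (256, 256))
half = simulate_reduced_dose(full, sigma_full=10.0, dose_fraction=0.5)
```

For 50% dose the total noise rises by a factor of sqrt(2), i.e. from 10 HU to about 14.1 HU in this sketch.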

Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.


Thesis (Ph.D.)--University of Washington, 2016-08


Leishmaniasis, caused by Leishmania infantum, is a vector-borne zoonotic disease that is endemic to the Mediterranean basin. The potential of rabbits and hares to serve as competent reservoirs for the disease has recently been demonstrated, although assessment of their role in disease dynamics is hampered by the absence of quantitative knowledge of the accuracy of diagnostic techniques in these species. A Bayesian latent-class model was used here to estimate the sensitivity and specificity of the immunofluorescence antibody test (IFAT) in serum and a Leishmania-nested PCR (Ln-PCR) in skin for samples collected from 217 rabbits and 70 hares from two different populations in the region of Madrid, Spain. A two-population model was used, assuming conditional independence between test results and incorporating prior information on the performance of the tests in other animal species obtained from the literature. Two alternative cut-off values were assumed for the interpretation of the IFAT results: 1/50 for the conservative and 1/25 for the sensitive interpretation. Results suggest that the sensitivity and specificity of the IFAT were around 70–80%, whereas the Ln-PCR was highly specific (96%) but had limited sensitivity (28.9% under the conservative interpretation and 21.3% under the sensitive one). Prevalence was higher in the rabbit population (50.5% and 72.6% for the conservative and sensitive interpretations, respectively) than in hares (6.7% and 13.2%). Our results demonstrate that the IFAT may be a useful screening tool for the diagnosis of leishmaniasis in rabbits and hares. These results will help in the design and implementation of surveillance programmes in wild species, with the ultimate objective of early detection and prevention of incursions of the disease into domestic and human populations.
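The Bayesian latent-class model estimates sensitivity, specificity, and prevalence jointly, but the misclassification identity underlying it can be shown with the simple Rogan–Gladen correction; the numbers below are illustrative, using Se and Sp in the ~70–80% range reported for the IFAT, not the study's posterior estimates:

```python
def rogan_gladen(apparent_prev, se, sp):
    """Misclassification-corrected ("true") prevalence:
    p = (AP + Sp - 1) / (Se + Sp - 1), valid when Se + Sp > 1."""
    return (apparent_prev + sp - 1.0) / (se + sp - 1.0)

# With Se = Sp = 0.75, an apparent seroprevalence of 40% corrects to 30%.
p = rogan_gladen(0.40, se=0.75, sp=0.75)
```

The latent-class approach generalizes this: with two tests and two populations (the Hui–Walter design used here), Se, Sp, and the prevalences become jointly identifiable without a gold standard.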


Thyroid nodules are frequent findings, especially when sensitive imaging methods are used. Although thyroid cancer is relatively rare, its incidence is increasing, particularly in terms of small tumors, which have an uncertain clinical relevance. Most patients with differentiated thyroid cancer exhibit satisfactory clinical outcomes when treatment is appropriate, and their mortality rate is similar to that of the overall population. However, relapse occurs in a considerable fraction of these patients, and some patients stop responding to conventional treatment and eventually die from their disease. Therefore, the challenge is how to identify the individuals who require more aggressive disease management while sparing the majority of patients from unnecessary treatments and procedures. We have updated the Brazilian Consensus that was published in 2007, emphasizing the diagnostic and therapeutic advances that the participants, representing several Brazilian university centers, consider most relevant in clinical practice. The formulation of the present guidelines was based on the participants' experience and a review of the relevant literature.


PURPOSE: To evaluate the sensitivity and specificity of machine learning classifiers (MLCs) for glaucoma diagnosis using spectral-domain OCT (SD-OCT) and standard automated perimetry (SAP). METHODS: Observational cross-sectional study. Sixty-two glaucoma patients and 48 healthy individuals were included. All patients underwent a complete ophthalmologic examination, achromatic standard automated perimetry (SAP), and retinal nerve fiber layer (RNFL) imaging with SD-OCT (Cirrus HD-OCT; Carl Zeiss Meditec Inc., Dublin, California). Receiver operating characteristic (ROC) curves were obtained for all SD-OCT parameters and global indices of SAP. Subsequently, the following MLCs were tested using parameters from the SD-OCT and SAP: Bagging (BAG), Naive Bayes (NB), Multilayer Perceptron (MLP), Radial Basis Function (RBF), Random Forest (RAN), Ensemble Selection (ENS), Classification Tree (CTREE), AdaBoost M1 (ADA), Support Vector Machine Linear (SVML), and Support Vector Machine Gaussian (SVMG). Areas under the receiver operating characteristic curve (aROC) obtained for isolated SAP and OCT parameters were compared with those of MLCs using OCT+SAP data. RESULTS: Combining OCT and SAP data, the MLCs' aROCs varied from 0.777 (CTREE) to 0.946 (RAN). The best OCT+SAP aROC, obtained with RAN (0.946), was significantly larger than that of the best single OCT parameter (p<0.05), but was not significantly different from the aROC obtained with the best single SAP parameter (p=0.19). CONCLUSION: Machine learning classifiers trained on OCT and SAP data can successfully discriminate between healthy and glaucomatous eyes. The combination of OCT and SAP measurements improved diagnostic accuracy compared with OCT data alone.
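The aROC comparison can be reproduced in miniature: the area under the ROC curve equals the probability that a randomly chosen diseased eye receives a higher classifier score than a healthy one (the Mann–Whitney statistic). The sketch below uses synthetic scores with the study's group sizes, not the actual OCT/SAP data:

```python
import numpy as np

rng = np.random.default_rng(42)

def auc_mann_whitney(scores_pos, scores_neg):
    """AUC as P(score_diseased > score_healthy), counting ties as 1/2."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    return ((pos > neg).sum() + 0.5 * (pos == neg).sum()) / (pos.size * neg.size)

# Toy classifier scores standing in for a combined OCT+SAP feature:
# 62 "glaucoma" vs. 48 "healthy" eyes, matching the study's sample sizes.
glaucoma = rng.normal(1.0, 1.0, 62)   # higher score = more disease-like
healthy = rng.normal(0.0, 1.0, 48)
auc = auc_mann_whitney(glaucoma, healthy)
```

The same pairwise-rank computation underlies the aROC values reported for the ten classifiers in the study.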


A retrospective survey was designed to identify diagnostic subgroups and clinical factors associated with odontogenic pain and discomfort in dental urgency patients. A consecutive sample of 1,765 patients seeking treatment for dental pain at the Urgency Service of the Dental School of the Federal University of Goiás, Brazil, was selected. The inclusion criterion was pulpal or periapical pain that occurred before dental treatment (minimum 6 months after the last dental appointment), and the exclusion criteria were teeth with odontogenic developmental anomalies and missing information or incomplete records. Clinical and radiographic examinations were performed to assess the clinical presentation of pain complaints, including origin, duration, frequency and location of pain, palpation, percussion and vitality tests, radiographic features, endodontic diagnosis, and characteristics of teeth. The chi-square test and multiple logistic regression were used to analyze associations between pulpal and periapical pain and the independent variables. The most frequent endodontic diagnoses of pulpal pain were symptomatic pulpitis (28.3%) and hyperreactive pulpalgia (14.4%), and the most frequent diagnosis of periapical pain was symptomatic apical periodontitis of infectious origin (26.4%). Regression analysis revealed that a closed pulp chamber and caries were highly associated with pulpal pain and, conversely, an open pulp chamber was associated with periapical pain (p<0.001). The endodontic diagnoses and local factors associated with pulpal and periapical pain suggest that the important clinical factors for pulpal pain were a closed pulp chamber and caries, and for periapical pain an open pulp chamber.
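The chi-square test of association used in the analysis can be sketched on a hypothetical 2x2 table of pulp-chamber status versus pain type; the counts below are invented for illustration and are not the study's data:

```python
# Pearson chi-square for a 2x2 contingency table:
# rows = pulp-chamber status, columns = pain type. Hypothetical counts.
table = [[420, 130],   # closed chamber: pulpal, periapical
         [ 90, 260]]   # open chamber:   pulpal, periapical

row = [sum(r) for r in table]            # row totals
col = [sum(c) for c in zip(*table)]      # column totals
n = sum(row)                             # grand total

# chi2 = sum over cells of (observed - expected)^2 / expected,
# with expected_ij = row_i * col_j / n.
chi2 = sum((table[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
           for i in range(2) for j in range(2))
```

For a 2x2 table there is one degree of freedom, so chi2 above the 3.84 critical value indicates association at the 5% level.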


Aims: Surgical staple line dehiscence usually leads to severe complications. Several techniques and materials have been used to reinforce stapling and thus reduce the related complications. The objective was to compare the safety of two types of anastomotic reinforcement in open gastric bypass. Methods: A prospective, randomized study comparing an extraluminal suture, fibrin glue, and a nonpermanent buttressing material, Seamguard (R), for staple line reinforcement. Fibrin glue was excluded from the study and analysis after two leaks requiring surgical reintervention, antibiotic therapy, and prolonged hospitalization. Results: Twenty patients were assigned to the suture and Seamguard reinforcement groups. The groups were similar in terms of preoperative characteristics. No staple line dehiscence occurred in these two groups, whereas two cases of dehiscence occurred in the fibrin glue group. No mortality occurred, and surgical time was statistically similar for both techniques. Seamguard made the surgery more expensive. Conclusion: In our service, staple line reinforcement in open bariatric surgery with oversewing or Seamguard was considered safe. Seamguard application was considered easier than oversewing, but more expensive.


This paper describes the application of two relatively new diagnostic techniques for determining the insulation condition of aged transformers. The techniques are (a) measurement of interfacial polarization spectra by a DC method and (b) measurement of molecular weight and its distribution by gel permeation chromatography. Several other electrical properties of the cellulose polymer were also investigated. Samples obtained from a retired power transformer were analysed using the developed techniques. Six distribution transformers were also tested with the interfacial polarization spectra measurement technique, and the molecular weight of paper/pressboard samples from these transformers was measured by gel permeation chromatography. The variation of the results across different locations in a power transformer is discussed, and possible correlations between the different measured properties are investigated.


This paper describes the analysis of accelerated-aged insulation samples to investigate the degradation processes observed in insulation from aged transformers. Short-term accelerated ageing experiments were performed on paper-wrapped insulated conductors and on pressboard samples. The condition of the aged insulation samples was investigated by two relatively new diagnostic techniques: (a) measurement of interfacial polarization spectra by a DC method, and (b) measurement of molecular weight and its distribution by gel permeation chromatography (GPC). Several other electrical properties of the paper/pressboard samples were also studied, and possible correlations among the different measured properties were investigated. The GPC results were used to predict how molecular weight changes with temperature and time.
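A common way to model such predictions, assuming the paper degrades by the standard Ekenstam chain-scission kinetics with an Arrhenius rate constant, is sketched below; the pre-exponential factor and activation energy are placeholder values of a plausible order of magnitude for paper in oil, not constants fitted in this study:

```python
import math

def dp_after_ageing(dp0, t_hours, temp_c, A=2.0e8, Ea=111e3):
    """Ekenstam kinetics for cellulose chain scission:
    1/DP(t) - 1/DP0 = k * t, with Arrhenius rate k = A * exp(-Ea / (R*T)).
    A (1/h) and Ea (J/mol) are illustrative placeholders, not fitted values.
    """
    R = 8.314  # gas constant, J mol^-1 K^-1
    k = A * math.exp(-Ea / (R * (temp_c + 273.15)))
    return 1.0 / (1.0 / dp0 + k * t_hours)

# Degree of polymerization after 1000 h of ageing at 140 C, starting at 1000.
dp = dp_after_ageing(1000.0, t_hours=1000.0, temp_c=140.0)
```

Degree of polymerization tracks the GPC-measured molecular weight, so a fitted version of this model lets temperature-time pairs be mapped to expected insulation condition.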


Background: Recently, there has been an increase in the incidence of cutaneous leishmaniasis (CL), which represents an important health problem. This increase may be related to the epidemiologic expansion of the infective agent and the increase in tourism in tropical areas. The difficulty of clinical diagnosis, mainly in areas in which CL is not the first consideration of local physicians, has intensified efforts to describe diagnostic tests, which should be specific, sensitive, and practical. Among the new tests described are those based on nucleic acid amplification (polymerase chain reaction, PCR) and immunohistochemistry (IHC). Methods: In this study, we evaluated the sensitivity of a PCR based on small-subunit (SSU) ribosomal DNA, in comparison with IHC using Leishmania spp. antibodies, in paraffin-embedded biopsies. Results: The results indicated a total sensitivity of 96% (90.9% with PCR and 68.8% with IHC), showing the possibility of using paraffin-embedded biopsies to diagnose CL. Conclusion: We propose the use of the two tests together as a routine protocol for diagnosis. This would require local medical services to be equipped to perform molecular biology techniques and to have adequate Leishmania antibodies available.
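As an illustrative check, if the two tests' errors were independent given disease status, interpreting them in parallel (positive if either is positive) would give a combined sensitivity close to the reported 96%; the small difference suggests the tests' failures overlap somewhat in practice:

```python
# Sensitivity of two tests interpreted in parallel, assuming the tests
# err independently given disease status:
#   Se_parallel = 1 - (1 - Se1) * (1 - Se2)
se_pcr, se_ihc = 0.909, 0.688
se_parallel = 1 - (1 - se_pcr) * (1 - se_ihc)   # about 0.972
```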


Introduction. In view of the high prevalence of headache in the general population, the availability of well-defined criteria regarding the request of complementary exams is highly desirable. Aim. To analyze the criteria that guide physicians' requests for complementary exams during the investigation of headache. Patients and methods. The data were obtained by reviewing the medical records of all patients scheduled to be seen in a tertiary headache outpatient clinic in 2004. Results. The exam most frequently requested was computed tomography of the head, and the exams that most contributed to a change in clinical diagnosis or medical conduct were computed tomography of the paranasal sinuses, simple radiography of the paranasal sinuses, and magnetic resonance imaging of the brain. The exams that did not contribute to a change in diagnosis or medical conduct were computed tomography and simple radiography of the cervical spine. As expected, the most expensive exams for the institution were computed tomography and magnetic resonance imaging. Conclusion. The importance of complementary exams in the investigation of headache is indisputable in many cases. However, more studies evaluating the request of complementary exams for headache patients are needed. [REV NEUROL 2009; 48: 183-7]


There is great demand for simpler and less costly laboratory techniques and for procedures more accessible to orchid breeders who lack the theoretical background to use the traditional in vitro seed and clone production methods for orchids. The aim of this study was to assess the use of sodium hypochlorite (NaClO) as a decontaminant in the process of inoculating adult orchid explants of Arundina bambusifolia and Epidendrum ibaguenses. Solutions of NaClO (1,200, 2,400, 3,600, 4,800, and 6,000 mg L-1, equivalent to 50, 100, 150, 200, and 250 mL L-1 of commercial bleach, CB) were sprayed on the explants (1.0 mL) and the culture medium (GB5), in the presence or absence of activated charcoal (2 g L-1). The explants used were nodal segments of field-grown adult plants. The procedures for inoculating the explants were conducted outside the laminar flow chamber (LFC), except for the control treatment (autoclaved medium and explant inoculation inside the LFC). The best results for fresh weight yield, height, and number of shoots were obtained using NaClO in solution at 1,200 mg L-1 (equivalent to 50 mL L-1 of commercial bleach) with activated charcoal in the culture medium. Fresh weight figures were 1.10 g/jar for Arundina bambusifolia and 0.16 g/jar for Epidendrum ibaguenses. Spraying the NaClO solutions controlled the contamination of culture medium already inoculated with the explants.
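The stated equivalence between NaClO concentration and commercial bleach volume implies a bleach strength of about 2.4% (w/v) NaClO, a common household concentration; a quick arithmetic check under that assumption:

```python
# Assumed bleach strength: 2.4% w/v NaClO = 24 mg of NaClO per mL of bleach.
naclo_mg_per_ml = 24.0

# Each bleach dose (mL per litre of solution) maps to mg/L of NaClO:
# 50 mL/L -> 1,200 mg/L, ..., 250 mL/L -> 6,000 mg/L, as in the study.
for ml_per_l in (50, 100, 150, 200, 250):
    print(ml_per_l, "mL/L ->", ml_per_l * naclo_mg_per_ml, "mg/L NaClO")
```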


OBJECTIVE To analyze the regional governance of the health system in relation to management strategies and disputes.METHODOLOGICAL PROCEDURES A qualitative study with health managers from 19 municipalities in the health region of Bahia, Northeastern Brazil. Data were drawn from 17 semi-structured interviews of state, regional, and municipal health policymakers and managers; a focus group; observations of the regional interagency committee; and documents in 2012. The political-institutional and the organizational components were analyzed in the light of dialectical hermeneutics.RESULTS The regional interagency committee is the chief regional governance strategy/component and functions as a strategic tool for strengthening governance. It brings together a diversity of members responsible for decision making in the healthcare territories, who need to negotiate the allocation of funding and the distribution of facilities for common use in the region. The high turnover of health secretaries, their lack of autonomy from local executive decisions, inadequate technical training to exercise their function, and the influence of party politics on decision making stand as obstacles to the regional interagency committee's permeability to social demands. Funding is insufficient to enable the fulfillment of the officially integrated agreed-upon program or to boost public supply by the system, requiring that public managers procure services from the private market at values higher than the national health service price schedule (Brazilian Unified Health System Table).
The study determined that “facilitators” under contract to health departments accelerated access to specialized (diagnostic, therapeutic and/or surgical) services in other municipalities by direct payment to physicians for procedure costs already covered by the Brazilian Unified Health System.CONCLUSIONS The characteristics identified a regionalized system with a conflictive pattern of governance and intermediate institutionalism. The regional interagency committee’s managerial routine needs to incorporate more democratic devices for connecting with educational institutions, devices that are more permeable to social demands relating to regional policy making.


The main purpose of this work was the development of procedures for the simulation of atmospheric flows over complex terrain using OpenFOAM. To this end, tools and procedures for preprocessing and data extraction were developed apart from this code and thereafter applied in the simulation of a real case. For the generation of the computational domain, a systematic method was developed to translate the terrain elevation model into a native OpenFOAM format (blockMeshDict). The outcome was a structured mesh in which the user can define the number of control volumes and their dimensions. With this procedure, the difficulties of case setup and the high computational effort reported in the literature for snappyHexMesh, the OpenFOAM resource previously used for this task, were considered to be overcome. The procedures developed for the generation of boundary conditions allowed the automatic creation of idealized inlet vertical profiles, the definition of wall-function boundary conditions, and the calculation of internal-field first guesses for the iterative solution process, taking experimental data supplied by the user as input. The applicability of the generated boundary conditions was limited to the simulation of turbulent, steady-state, incompressible, and neutrally stratified atmospheric flows, always using RANS (Reynolds-averaged Navier-Stokes) models. For the modelling of terrain roughness, the developed procedure allowed the user to define idealized conditions, such as a uniform aerodynamic roughness length or a roughness value varying as a function of characteristic topography values, or to use real site data; it was complemented by techniques for the visual inspection of the generated roughness maps. The absence of a forest canopy model limited the applicability of this procedure to low aerodynamic roughness lengths.
The developed tools and procedures were then applied to the simulation of a neutrally stratified atmospheric flow over the Askervein hill. In the simulations performed, the sensitivity of the solution to different convection schemes, mesh dimensions, ground roughness, and formulations of the k-ε and k-ω models was evaluated. When compared with experimental data, the calculated values showed good agreement of the speed-up at the hill top and on the lee side, with a relative error of less than 10% at a height of 10 m above ground level. Turbulent kinetic energy was considered to be well simulated on the windward slope and hill top, and grossly predicted on the lee side, where a zone of flow separation was also identified. Despite the need for further work to evaluate the importance of the downstream recirculation zone for the quality of the results, the agreement between calculated and experimental values and the sensitivity of OpenFOAM to the tested parameters were considered to be generally in line with the simulations reported in the reviewed bibliography.
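The idealized inlet profiles mentioned above are commonly built from the neutral logarithmic law with k and epsilon made consistent with the k-epsilon wall functions (a Richards–Hoxey-type inlet). A sketch with illustrative values, not the Askervein case setup:

```python
import numpy as np

# Neutral ABL inlet: U(z) = (u*/kappa) * ln((z + z0) / z0), with
# k and epsilon consistent with standard k-epsilon wall functions.
kappa, Cmu = 0.41, 0.09
z0 = 0.03                   # aerodynamic roughness length (m), illustrative
U_ref, z_ref = 10.0, 10.0   # reference wind speed (m/s) at 10 m AGL

# Friction velocity from the reference point on the log law.
u_star = kappa * U_ref / np.log((z_ref + z0) / z0)

z = np.linspace(0.5, 200.0, 100)                # heights above ground (m)
U = (u_star / kappa) * np.log((z + z0) / z0)    # velocity profile
k = u_star ** 2 / np.sqrt(Cmu)                  # turbulent kinetic energy
eps = u_star ** 3 / (kappa * (z + z0))          # dissipation rate
```

Profiles of this form can be written directly into the 0/U, 0/k, and 0/epsilon fields of an OpenFOAM case.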


Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behavior of soil-related processes and properties, as well as to generate new hypotheses for future experimentation. A good model and analysis of soil property variation, permitting suitable conclusions and the estimation of spatially correlated variables at unsampled locations, clearly depends on the amount and quality of the data and on the robustness of the techniques and estimators. The quality of data, in turn, obviously depends on a competent data-collection procedure and on capable laboratory analytical work. Following the standard soil sampling protocols available, soil samples should be collected according to key points such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land color, soil texture, land slope, and solar exposure. Obtaining good-quality data from forest soils is predictably expensive, as it is labor intensive and demands much manpower and equipment, both in field work and in laboratory analysis. Moreover, the sampling scheme to be used in a forest-field data collection is not simple to design, as the sampling strategies chosen depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the intended collection depth, if no soil at all is found, or if large trees bar the collection. Consequently, a proficient design of a soil sampling campaign in forest terrain is not always a simple process and sometimes represents a considerable challenge. In this work, we present difficulties encountered during two experiments on forest soil conducted to study the spatial variation of some physical-chemical soil properties.
Two different sampling protocols were considered for monitoring two types of forest soils located in NW Portugal: umbric regosol and lithosol. Two different sampling tools were also used: a manual auger and a shovel. Both scenarios were analyzed, and the results indicate that monitoring forest soil for mathematical and statistical investigation requires a data-collection procedure compatible with established protocols, but a pre-defined grid often fails when the variability of the soil property is not spatially uniform. In this case, the sampling grid should be conveniently adapted from one part of the landscape to another, and this should be taken into account in the mathematical procedure.
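The estimation of spatially correlated variables at unsampled locations typically starts from an empirical semivariogram of the sampled data; below is a minimal sketch using Matheron's classical estimator on a toy data set, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(7)

def empirical_semivariogram(coords, values, bins):
    """Matheron's classical estimator: for each lag bin,
    gamma(h) = mean of 0.5 * (z_i - z_j)^2 over pairs with distance in the bin."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    i, j = np.triu_indices(len(values), k=1)          # all unordered pairs
    d = np.linalg.norm(coords[i] - coords[j], axis=1) # pair distances
    sq = 0.5 * (values[i] - values[j]) ** 2           # semivariances
    which = np.digitize(d, bins)
    return np.array([sq[which == b].mean() if np.any(which == b) else np.nan
                     for b in range(1, len(bins))])

# Toy field: 100 random sample points of a hypothetical soil property
# with a west-east trend plus noise (illustrative only).
xy = rng.uniform(0, 100, (100, 2))
z = 0.05 * xy[:, 0] + rng.normal(0, 1, 100)
gamma = empirical_semivariogram(xy, z, bins=np.arange(0, 60, 10))
```

A variogram model fitted to these points would then feed a kriging predictor; an irregular, terrain-adapted sampling grid of the kind discussed above changes the pair distances but not the estimator itself.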