940 results for Printing in three-dimensional imaging


Relevance: 100.00%

Abstract:

The application of contrast media in post-mortem radiology differs from clinical approaches in living patients. Post-mortem changes in the vascular system and the absence of blood flow lead to specific problems that have to be considered when performing post-mortem angiography. In addition, interpreting the images is challenging because of technique-related and post-mortem artefacts that are specific to each applied technique and must be recognised. Although the idea of injecting contrast media is old, classic methods are not simply transferable to modern radiological techniques in forensic medicine, as they are mostly dedicated to single-organ studies or applicable only shortly after death. With the introduction of modern imaging techniques, such as post-mortem computed tomography (PMCT) and post-mortem magnetic resonance (PMMR), to forensic death investigations, intensive research started to explore their advantages and limitations compared to conventional autopsy. PMCT has already become a routine investigation in several centres, and different techniques have been developed to better visualise the vascular system and organ parenchyma in PMCT. In contrast, the use of PMMR is still limited by practical issues, and research into PMMR angiography is only now starting. This article gives an overview of the problems in post-mortem contrast media application, the various classic and modern techniques, and the issues to consider when using different media.

Relevance: 100.00%

Abstract:

Objective: To construct a Portuguese-language index of information on the practice of diagnostic radiology, in order to improve the standardization of medical language and terminology. Materials and Methods: A total of 61,461 definitive reports were collected from the database of the Radiology Information System at Hospital das Clínicas – Faculdade de Medicina de Ribeirão Preto (RIS/HCFMRP), as follows: 30,000 chest x-ray reports, 27,000 mammography reports, and 4,461 thyroid ultrasonography reports. The text mining technique was applied for the selection of terms, and the ANSI/NISO Z39.19-2005 standard was utilized to construct the index based on a thesaurus structure. The system was created in HTML. Results: The text mining yielded a set of 358,236 (n = 100%) words. Out of this total, 76,347 (n = 21%) terms were selected to form the index. Such terms refer to anatomical and pathological descriptions, imaging techniques, equipment, type of study and some other composite terms. The index system was built with 78,538 HTML web pages. Conclusion: The utilization of text mining on a database of radiology reports has allowed the construction of a Portuguese-language lexical system consistent with clinical practice in radiology.
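
As a rough illustration of the term-selection step described in this abstract, the hedged Python sketch below counts word frequencies across a small set of report texts and keeps the most frequent candidates; the function name, thresholds, tokenization and sample reports are assumptions for illustration, not the RIS/HCFMRP pipeline.

```python
# Hypothetical sketch of frequency-based term selection from report texts;
# thresholds, tokenization and the sample reports are illustrative only.
from collections import Counter
import re

def extract_candidate_terms(reports, min_count=2, stopwords=frozenset({"sem", "com", "para"})):
    """Return candidate index terms ordered by frequency across the reports."""
    counts = Counter()
    for text in reports:
        tokens = re.findall(r"[a-zà-ÿ]+", text.lower())
        counts.update(t for t in tokens if t not in stopwords and len(t) > 2)
    return [(term, n) for term, n in counts.most_common() if n >= min_count]

reports = [
    "Radiografia de tórax sem alterações pleuropulmonares.",
    "Radiografia de tórax com opacidade em lobo superior direito.",
]
print(extract_candidate_terms(reports))
```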

Relevance: 100.00%

Abstract:

Objective: The present study aims to contribute to the identification of the most appropriate OSEM parameters for generating myocardial perfusion imaging reconstructions of the best diagnostic quality, correlating them with the patients' body mass index. Materials and Methods: The study included 28 adult patients submitted to myocardial perfusion imaging in a public hospital. The OSEM method was used for image reconstruction with six different combinations of numbers of iterations and subsets. The images were analyzed by nuclear cardiology specialists, who took their diagnostic value into consideration and indicated the most appropriate images in terms of diagnostic quality. Results: An overall scoring analysis demonstrated that the combination of four iterations and four subsets generated the most appropriate images in terms of diagnostic quality for all body mass index classes; however, the combination of six iterations and four subsets stood out for the higher body mass index classes. Conclusion: The use of optimized parameters seems to play a relevant role in generating images of better diagnostic quality, supporting the diagnosis and, consequently, appropriate and effective treatment for the patient.
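
To make the role of the iteration and subset numbers concrete, the hedged sketch below shows a minimal ordered-subsets expectation-maximization (OSEM) update loop; the toy system matrix, the default of four iterations and four subsets, and the function name are assumptions for illustration and do not reproduce the clinical reconstruction software used in the study.

```python
# Minimal OSEM sketch: for each subset of projections, the image estimate is
# multiplied by the back-projected ratio of measured to modelled counts.
# The random system matrix and data below are placeholders, not SPECT data.
import numpy as np

def osem(A, y, n_iterations=4, n_subsets=4, eps=1e-12):
    """A: (n_proj, n_vox) system matrix; y: measured projection counts."""
    n_proj, n_vox = A.shape
    x = np.ones(n_vox)                                   # uniform initial image
    subsets = [np.arange(s, n_proj, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iterations):
        for idx in subsets:
            As = A[idx]
            ratio = y[idx] / (As @ x + eps)              # measured / modelled
            x *= (As.T @ ratio) / (As.sum(axis=0) + eps) # multiplicative update
    return x

rng = np.random.default_rng(0)
A = rng.random((16, 8))
y = A @ rng.random(8)
print(osem(A, y, n_iterations=4, n_subsets=4))
```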

Relevance: 100.00%

Abstract:

In image processing, segmentation algorithms constitute one of the main focuses of research. In this paper, new image segmentation algorithms based on a hard version of the information bottleneck method are presented. The objective of this method is to extract a compact representation of a variable, considered the input, with minimal loss of mutual information with respect to another variable, considered the output. First, we introduce a split-and-merge algorithm based on the definition of an information channel between a set of regions (input) of the image and the intensity histogram bins (output). From this channel, the maximization of the mutual information gain is used to optimize the image partitioning. Then, the merging of the regions obtained in the previous phase is carried out by minimizing the loss of mutual information. From the inversion of the above channel, we also present a new histogram clustering algorithm based on the minimization of the mutual information loss, where now the input variable represents the histogram bins and the output is given by the set of regions obtained from the above split-and-merge algorithm. Finally, we introduce two new clustering algorithms which show how the information bottleneck method can be applied to the registration channel obtained when two multimodal images are correctly aligned. Different experiments on 2-D and 3-D images show the behavior of the proposed algorithms.
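
The split-and-merge criterion in this abstract is driven by the mutual information of a channel between image regions and intensity histogram bins; the hedged sketch below only shows how that quantity I(R;B) can be estimated from pixel counts for a toy labelled image (the bin count, labels and image are assumptions), not the authors' full partitioning algorithm.

```python
# Hedged sketch: estimate the joint distribution p(region, bin) from pixel
# counts and evaluate the mutual information I(R;B) of the region/intensity
# channel. A split that increases I(R;B) is what the partitioning favours.
import numpy as np

def mutual_information(labels, image, n_bins=8):
    """I(R;B) in bits for integer region labels and intensities in [0, 1)."""
    bins = np.minimum((image * n_bins).astype(int), n_bins - 1)
    joint = np.zeros((labels.max() + 1, n_bins))
    np.add.at(joint, (labels.ravel(), bins.ravel()), 1.0)
    joint /= joint.sum()
    pr = joint.sum(axis=1, keepdims=True)                # p(region)
    pb = joint.sum(axis=0, keepdims=True)                # p(bin)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (pr @ pb)[nz])))

image = np.random.default_rng(1).random((32, 32))
labels = np.arange(32)[:, None] // 16 * 2 + np.arange(32)[None, :] // 16  # 4 blocks
print(mutual_information(labels, image))
```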

Relevance: 100.00%

Abstract:

In this paper we address the problem of extracting representative point samples from polygonal models. The goal of such a sampling algorithm is to find points that are evenly distributed. We propose star-discrepancy as a measure of sampling quality and present new sampling methods based on global line distributions. We investigate several line generation algorithms, including an efficient hardware-based sampling method. Our method contributes to the area of point-based graphics by extracting points that are more evenly distributed than those obtained with current sampling algorithms.
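
As a rough idea of the star-discrepancy measure proposed here as a sampling-quality criterion, the hedged sketch below computes a brute-force lower-bound estimate for a 2-D point set by testing axis-aligned boxes anchored at the origin; this is an illustrative estimator on toy points, not the paper's hardware-based line-sampling method.

```python
# Hedged sketch of a star-discrepancy estimate: for boxes [0, b) anchored at
# the origin, compare the fraction of points inside against the box volume.
# Brute-force lower bound on toy points in the unit square, illustration only.
import numpy as np

def star_discrepancy_estimate(points):
    """Lower bound on D*(points), using the points themselves as box corners."""
    worst = 0.0
    for corner in points:
        volume = float(np.prod(corner))                  # measure of [0, corner)
        inside = float(np.all(points < corner, axis=1).mean())
        worst = max(worst, abs(inside - volume))
    return worst

rng = np.random.default_rng(2)
print("random points:", star_discrepancy_estimate(rng.random((256, 2))))
```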

Relevance: 100.00%

Abstract:

We introduce a method for surface reconstruction from point sets that is able to cope with noise and outliers. First, a splat-based representation is computed from the point set: a robust local 3D RANSAC-based procedure is used to filter the point set for outliers, then a local jet surface, a low-degree surface approximation, is fitted to the inliers. Second, we extract the reconstructed surface in the form of a surface triangle mesh through Delaunay refinement. The Delaunay refinement meshing approach requires computing intersections between line segment queries and the surface to be meshed. In the present case, intersection queries are solved from the set of splats through a 1D RANSAC procedure.
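
The outlier-filtering step mentioned in this abstract relies on a local RANSAC fit; the hedged sketch below shows the general idea with a plane model on a toy point cloud (the thresholds, trial count and plane model, rather than the paper's jet surfaces, are simplifying assumptions).

```python
# Hedged sketch of RANSAC-style outlier filtering: repeatedly fit a plane to
# three random points and keep the plane with the most points within a
# distance threshold. Parameters and the toy point cloud are illustrative.
import numpy as np

def ransac_plane_inliers(points, n_trials=200, threshold=0.01, seed=0):
    """Return a boolean mask of points close to the best-supported plane."""
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(points), dtype=bool)
    for _ in range(n_trials):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(normal) < 1e-12:
            continue                                     # degenerate sample
        normal /= np.linalg.norm(normal)
        mask = np.abs((points - p0) @ normal) < threshold
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask

rng = np.random.default_rng(3)
pts = rng.random((200, 3))
pts[:160, 2] = 0.5 + 0.002 * rng.standard_normal(160)    # near-planar patch + outliers
print(int(ransac_plane_inliers(pts).sum()), "inliers of", len(pts))
```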

Relevance: 100.00%

Abstract:

The aim of this thesis is to investigate the thermal loading of the semiconductor IGCT switches of a medium-voltage three-level NPC inverter at different operating points. The objective is to arrive at both a fairly accurate off-line simulation program and a simulation model simple enough that its implementation in an embedded system is practical and real-time use becomes feasible. Active load limitation of the inverter can be realized with a thermal model that is practical for real-time use. Determining the component heating is divided into two parts: calculating the component losses and establishing the structure of the thermal network. The basics of both parts are explained. The simulation environment is Matlab-Simulink. Two different models are constructed, a more accurate one and a simplified one. Potential simplifications are identified with the help of the first model; they are then included in the latter model, and the functionality of the two models is compared. When the calculation time step is increased, the simplified model can use a reduced number of components and thermal-network time constants. The heating of a switching component depends on its topological position and on the inverter's operating point. The output frequency of the converter largely determines which switching component, because of its losses and heating, limits the performance of the converter. Comparison of results from the different thermal models demonstrates that, with larger time steps, describing fast switching losses becomes difficult. Articles and papers on this subject are generally written for two-level inverters, and inverters applying direct torque control (DTC) are rarely investigated from the heating point of view. Hence, this thesis complements the existing material.
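
The thesis itself builds its thermal models in Matlab-Simulink; purely to illustrate the loss-to-temperature step it describes, the hedged Python sketch below drives a small Foster-type RC thermal network with a power-loss time series and integrates the junction temperature rise (the R and tau values, loss profile and time step are arbitrary placeholders, not IGCT data).

```python
# Hedged sketch: Foster RC thermal network, dT_i/dt = (R_i * P - T_i) / tau_i
# per branch, integrated with forward Euler. All numbers are placeholders.
import numpy as np

def junction_temperature(p_loss, dt, r_th, tau, t_ambient=40.0):
    """Junction temperature over time for a power-loss series p_loss [W]."""
    r_th, tau = np.asarray(r_th, float), np.asarray(tau, float)
    branches = np.zeros_like(r_th)                       # temperature rise per branch
    temps = np.empty(len(p_loss))
    for k, p in enumerate(p_loss):
        branches += dt * (r_th * p - branches) / tau
        temps[k] = t_ambient + branches.sum()
    return temps

dt = 1e-3                                                # 1 ms step
t = np.arange(0.0, 2.0, dt)
p_loss = 800.0 + 400.0 * np.sign(np.sin(2 * np.pi * 5.0 * t))  # pulsed losses
temps = junction_temperature(p_loss, dt, r_th=[0.01, 0.02], tau=[0.1, 1.0])
print(f"peak junction temperature: {temps.max():.1f} degC")
```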

Relevance: 100.00%

Abstract:

Some of the most interesting phenomena in present-day materials physics arise from an intricate interplay between myriads of electrons. High-temperature superconductors are the most famous example. Neither classical theories nor models in which the electrons are independent of one another can explain the astonishing effects in strongly correlated electron systems. In certain copper oxides, for example La2CuO4, the valence electrons are known to localize one by one on the copper atoms in the compound's CuO2 planes as a result of a strong mutual interaction. The intrinsic magnetic moment of the charges, the spin, then plays a decisive role for the material's electrical and magnetic properties, which in this example can be described by the Heisenberg model, the fundamental theoretical model of microscopic magnetism. Exactly why these compounds become superconducting when doped with excess charges is, however, still an open question. My thesis investigates the influence of impurities on the magnetic properties of the Heisenberg model, a problem of both experimental and theoretical relevance. An established numerical method, a quantum Monte Carlo technique, has been used to perform extensive computer simulations of the mathematical model on two dedicated Linux computer clusters. The work belongs to the field of computational physics. The theoretical models for strongly correlated electron systems, the Heisenberg model among them, are mathematically extremely involved and cannot be solved exactly. Analytical treatments mostly rely on assumptions and simplifications whose effects on the final result are often unclear. In that respect numerical studies can be exact, that is, they can treat the models as they are. Usually both approaches are needed. The common thread of this work has been to numerically test certain topical analytical predictions concerning the effects of impurities in the Heisenberg model. Some of them we have been able to confirm on the basis of very accurate data. But our results have also revealed errors in the analytical predictions, which have since been partly revised. Some of the numerical findings of the thesis have in turn stimulated entirely new theoretical studies.
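
For readers unfamiliar with the Heisenberg model referred to above, the hedged sketch below builds the spin-1/2 Hamiltonian H = J Σ_i S_i · S_{i+1} for a tiny open chain by exact diagonalization; the thesis itself uses quantum Monte Carlo on far larger lattices, and the chain length and coupling J here are arbitrary illustrative choices.

```python
# Hedged sketch: exact diagonalization of a small spin-1/2 Heisenberg chain,
# H = J * sum_i S_i . S_{i+1}, just to illustrate the model (not the QMC
# method used in the thesis). Chain length and J are arbitrary.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
sz = np.array([[0.5, 0], [0, -0.5]], dtype=complex)
identity = np.eye(2, dtype=complex)

def site_operator(op, site, n_sites):
    """Embed a single-site spin operator into the 2**n_sites Hilbert space."""
    out = np.array([[1.0 + 0j]])
    for i in range(n_sites):
        out = np.kron(out, op if i == site else identity)
    return out

def heisenberg_chain(n_sites, J=1.0):
    dim = 2 ** n_sites
    H = np.zeros((dim, dim), dtype=complex)
    for i in range(n_sites - 1):                         # open chain, nearest neighbours
        for op in (sx, sy, sz):
            H += J * site_operator(op, i, n_sites) @ site_operator(op, i + 1, n_sites)
    return H

print("ground-state energy:", np.linalg.eigvalsh(heisenberg_chain(6))[0])
```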

Relevance: 100.00%

Abstract:

The Standard Model of particle physics is currently the best description of fundamental particles and their interactions. All particles save the Higgs boson have been observed in particle accelerator experiments over the years. Despite the predictive power of the Standard Model, there are many phenomena that the scenario does not predict or explain. Among the most prominent dilemmas is the matter-antimatter asymmetry, and much effort has been made in formulating scenarios that accurately predict the correct amount of matter-antimatter asymmetry in the universe. One of the most appealing explanations is baryogenesis via leptogenesis, which not only serves as a mechanism for producing an excess of matter over antimatter but can also explain why neutrinos have very small non-zero masses. Interesting leptogenesis scenarios arise when other candidate theories beyond the Standard Model are brought into the picture. In this thesis, we have studied leptogenesis in an extra-dimensional framework and in a modified version of the supersymmetric Standard Model. The first chapters of this thesis introduce the standard cosmological model, observations of the photon-to-baryon ratio, and the necessary preconditions for successful baryogenesis. Baryogenesis via leptogenesis is then introduced and its connection to neutrino physics is illuminated. The final chapters concentrate on extra-dimensional theories and supersymmetric models and their ability to accommodate leptogenesis; there, the results of our research are also presented.

Relevance: 100.00%

Abstract:

Seedling production is an essential part of vegetable production. Thus, this study aimed to evaluate the effects of protected environment, substrate and container on the development of tomato seedlings, cv. Santa Cruz Kada Gigante, in the region of Aquidauana, MS, Brazil, from October to November 2008. Polystyrene trays with 72, 128 and 200 cells, filled with four substrates (soil, Plantmax®, coconut fiber and vermiculite), were tested in three protected environments (greenhouse, screenhouse with Sombrite®, and screenhouse with Aluminet®). The experimental design was completely randomized in a 3 x 4 factorial scheme (three trays x four substrates), with four replications; individual analyses of variance were carried out for each environment, followed by a joint analysis across environments. The screened environments (Sombrite® and Aluminet®), the trays with 72 cells and the vermiculite substrate produced the best results.
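
A minimal sketch of the 3 x 4 factorial analysis of variance described above is given below, assuming the statsmodels formula interface and simulated placeholder responses (the response variable name and values are not the experiment's data).

```python
# Hedged sketch of a 3 x 4 factorial ANOVA (trays x substrates, 4 replications)
# with simulated placeholder responses; assumes pandas and statsmodels.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(5)
rows = [
    {"tray": tray, "substrate": sub, "height": rng.normal(10.0, 1.0)}
    for tray in ("72", "128", "200")
    for sub in ("soil", "Plantmax", "coconut_fiber", "vermiculite")
    for _ in range(4)                                    # four replications
]
data = pd.DataFrame(rows)

model = ols("height ~ C(tray) * C(substrate)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))                   # main effects + interaction
```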