866 results for the Fuzzy Colour Segmentation Algorithm
Abstract:
This paper proposes a method to evaluate hierarchical image segmentation procedures, enabling comparisons between different hierarchical algorithms, as well as between these and other (non-hierarchical) segmentation techniques and edge detectors. The proposed method builds on the edge-based segmentation evaluation approach by considering a set of reference human segmentations as a sample drawn from the population of levels of detail that may be used in segmenting an image. Our main point is that, since a hierarchical sequence of segmentations approximates such a population, the segmentations in the sequence that best capture each human segmentation's level of detail should provide the basis for evaluating the hierarchical sequence as a whole. A small computational experiment is carried out to show the feasibility of our approach.
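The evaluation idea above can be sketched in a few lines: for each human reference, keep the hierarchy level that matches it best, then average those per-reference bests. The Rand index below is a stand-in for the paper's edge-based measure (an assumption for illustration), and segmentations are flat label lists:

```python
from itertools import combinations

def rand_index(seg_a, seg_b):
    """Fraction of pixel pairs on which two labelings agree
    (same-region vs different-region)."""
    pairs = list(combinations(range(len(seg_a)), 2))
    agree = sum(
        (seg_a[i] == seg_a[j]) == (seg_b[i] == seg_b[j])
        for i, j in pairs
    )
    return agree / len(pairs)

def score_hierarchy(hierarchy, references):
    """For each human reference, keep the best-matching level of the
    hierarchy; the mean of those per-reference bests scores the
    hierarchy as a whole."""
    best = [max(rand_index(level, ref) for level in hierarchy)
            for ref in references]
    return sum(best) / len(best)
```

If every reference's level of detail appears somewhere in the hierarchy, the score reaches its maximum, which is exactly the intuition the paper argues for.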
Abstract:
Purpose: Computed Tomography (CT) is one of the standard diagnostic imaging modalities for the evaluation of a patient's medical condition. In comparison to other imaging modalities such as Magnetic Resonance Imaging (MRI), CT is a fast-acquisition imaging device with higher spatial resolution and a higher contrast-to-noise ratio (CNR) for bony structures. CT images are presented through a gray scale of independent values in Hounsfield units (HU), where higher HU values represent denser materials. High-density materials, such as metal, tend to erroneously increase the HU values around them due to reconstruction software limitations. This problem of increased HU values due to the presence of metal is referred to as metal artefacts. Hip prostheses, dental fillings, aneurysm clips, and spinal clips are a few examples of metal objects that are of clinical relevance. These implants create artefacts such as beam hardening and photon starvation that distort CT images and degrade image quality. This is of great significance because the distortions may cause improper evaluation of images and inaccurate dose calculation in the treatment planning system. Different algorithms are being developed to reduce these artefacts and improve image quality for both diagnostic and therapeutic purposes. However, very limited information is available about the effect of artefact correction on dose calculation accuracy. This study evaluates the dosimetric effect of metal artefact reduction algorithms on CT images with severe artefacts, using the Gemstone Spectral Imaging (GSI)-based MAR algorithm, the projection-based Metal Artefact Reduction (MAR) algorithm, and the Dual-Energy method.
Materials and Methods: The Gemstone Spectral Imaging (GSI)-based and SMART Metal Artefact Reduction (MAR) algorithms are metal artefact reduction protocols embedded in two different CT scanner models by General Electric (GE), while the Dual-Energy imaging method was developed at Duke University. All three approaches were applied in this research for dosimetric evaluation on CT images with severe metal artefacts. The first part of the research used a water phantom with four iodine syringes. Two sets of plans, multi-arc and single-arc, using the Volumetric Modulated Arc Therapy (VMAT) technique were designed to avoid or minimize influences from high-density objects. The second part of the research used the projection-based MAR algorithm and the Dual-Energy method. Calculated doses (mean, minimum, and maximum) to the planning target volume (PTV) were compared and the homogeneity index (HI) was calculated.
Results: (1) Without the GSI-based MAR algorithm, a percent error between the mean dose and the absolute dose ranging from 3.4% to 5.7% per fraction was observed. In contrast, the error decreased to a range of 0.09-2.3% per fraction with the GSI-based MAR algorithm. There was a percent difference ranging from 1.7% to 4.2% per fraction between plans with and without the GSI-based MAR algorithm. (2) A difference of 0.1-3.2% was observed for the maximum dose values, 1.5-10.4% for the minimum doses, and 1.4-1.7% for the mean doses. Homogeneity indices (HI) ranging from 0.065 to 0.068 for the Dual-Energy method and from 0.063 to 0.141 with the projection-based MAR algorithm were also calculated.
Conclusion: (1) The percent error without the GSI-based MAR algorithm may be as high as 5.7%. Such an error undermines the goal of radiation therapy to provide precise treatment; the GSI-based MAR algorithm is therefore desirable for its better dose calculation accuracy. (2) Based on direct numerical observation, there was no apparent deviation between the mean doses of the different techniques, but deviation was evident in the maximum and minimum doses. The HI for the Dual-Energy method almost achieved the desirable null value. In conclusion, the Dual-Energy method gave better dose calculation accuracy to the planning target volume (PTV) for images with metal artefacts than either the GE MAR algorithm or no correction.
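The two figures of merit used throughout the results above can be written out explicitly. The HI definition below, (Dmax - Dmin)/Dmean, is one common formulation and an assumption here, since the abstract does not state which definition the thesis uses:

```python
def percent_error(calculated_dose, measured_dose):
    """Per-fraction percent error of a calculated dose against the
    measured (absolute) dose."""
    return abs(calculated_dose - measured_dose) / measured_dose * 100.0

def homogeneity_index(d_max, d_min, d_mean):
    """One common HI definition: (Dmax - Dmin) / Dmean over the PTV.
    A value near 0 means a near-uniform dose, the 'desirable null
    value' mentioned in the conclusion."""
    return (d_max - d_min) / d_mean
```

For example, a calculated 105.7 cGy against a measured 100 cGy reproduces the 5.7% worst-case error cited above.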
Abstract:
This thesis describes the development of an open-source system for virtual bronchoscopy used in combination with electromagnetic instrument tracking. The end application is virtual navigation of the lung for biopsy of early-stage cancer nodules. The open-source platform 3D Slicer was used to create freely available algorithms for virtual bronchoscopy. Firstly, the development of an open-source semi-automatic algorithm for predicting the malignancy of solitary pulmonary nodules is presented. This approach may help the physician decide whether to proceed with biopsy of the nodule. The user-selected nodule is segmented in order to extract radiological characteristics (i.e., size, location, edge smoothness, calcification presence, cavity wall thickness), which are combined with patient information to calculate the likelihood of malignancy. The overall accuracy of the algorithm is shown to be high compared to independent experts' assessment of malignancy. The algorithm is also compared with two different predictors, and our approach is shown to provide the best overall prediction accuracy. The development of an airway segmentation algorithm, which extracts the airway tree from surrounding structures on chest Computed Tomography (CT) images, is then described. This represents the first fundamental step toward the creation of a virtual bronchoscopy system. Clinical and ex-vivo images are used to evaluate the performance of the algorithm. Different CT scan parameters are investigated, and parameters for successful airway segmentation are optimized. Slice thickness is the most influential parameter, while variation of reconstruction kernel and radiation dose is shown to be less critical. Airway segmentation is used to create a 3D rendered model of the airway tree for virtual navigation. Finally, the first open-source virtual bronchoscopy system was combined with electromagnetic tracking of the bronchoscope for the development of a GPS-like system for navigating within the lungs.
Tools for pre-procedural planning and for helping with navigation are provided. Registration between the lungs of the patient and the virtually reconstructed airway tree is achieved using a landmark-based approach. In an attempt to reduce difficulties with registration errors, we also implemented a landmark-free registration method based on a balanced airway survey. In-vitro and in-vivo testing showed good accuracy for this registration approach. The centreline of the 3D airway model is extracted and used to compensate for possible registration errors. Tools are provided to select a target for biopsy on the patient CT image, and pathways from the trachea towards the selected targets are created automatically. The pathways guide the physician during navigation, while distance-to-target information is updated in real time and presented to the user. During navigation, video from the bronchoscope is streamed and presented to the physician next to the 3D rendered image. The electromagnetic tracking is implemented with 5-DOF sensing, which does not provide roll rotation information. An intensity-based image registration approach is implemented to rotate the virtual image according to the bronchoscope's rotations. The virtual bronchoscopy system is shown to be easy to use and accurate in replicating the clinical setting, as demonstrated in the pre-clinical environment of a breathing lung model. Animal studies were performed to evaluate the overall system performance.
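The malignancy predictor described above combines radiological characteristics with patient information into a single likelihood; a logistic (sigmoid) combination is one standard way to do this. The weights and bias below are placeholders for illustration, not the thesis's fitted values:

```python
import math

def malignancy_likelihood(features, weights, bias):
    """Logistic combination of nodule and patient features into a
    probability-like malignancy score in (0, 1). Weights and bias are
    hypothetical; a real model would fit them to labelled cases."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))
```

With zero evidence (all features zero and zero bias) the score is 0.5, i.e. maximal uncertainty; positive weighted evidence pushes the score toward 1.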
Abstract:
Tourmaline from a gem-quality deposit in the Grenville province has been studied with X-ray diffraction, visible-near infrared spectroscopy, Fourier transform infrared spectroscopy, scanning electron microscopy, electron microprobe and optical measurements. The tourmaline is found within tremolite-rich calc-silicate pods hosted in marble of the Central Metasedimentary Belt. The crystals are greenish-greyish-brown and have yielded facetable material up to 2.09 carats in size. Using the classification of Henry et al. (2011), the tourmaline is classified as a dravite, with a representative formula of (Na0.73Ca0.238□0.032)(Mg2+2.913Fe2+0.057Ti4+0.030)(Al3+5.787Fe3+0.017Mg2+0.14)(Si6.013O18)(BO3)3(OH)3((OH,O)0.907F0.093), where □ denotes an X-site vacancy. Rietveld analysis of powder diffraction data gives a = 15.9436(8) Å, c = 7.2126(7) Å and a unit cell volume of 1587.8 Å3. A polished thin section was cut perpendicular to the c-axis of one tourmaline crystal, which showed zoning from a dark brown core through a lighter zone into a thin darker rim and back into a lighter zone. The geochemical data reveal three key stages of crystal growth within this thin section. The first is the core stage, which occurs from the dark core to the first colourless zone; the second runs from this colourless zone, increasing in brown colour, to the outer limit, where a sudden absence of colour is noted; the third begins with a sharp change from the end of the second and is entirely colourless. These events are the result of metamorphism and hydrothermal fluids associated with nearby felsic intrusive plutons. Scanning electron microscope and electron microprobe traverses across this cross-section revealed that the green colour is the result of iron present throughout the system, while the brown colour is correlated with titanium content.
Crystal inclusions of chlorapatite and zircon in the tourmaline were identified by petrographic analysis, confirmed using scanning electron microscope data, and occur within the third stage of formation.
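The reported cell volume follows directly from the refined lattice parameters via the standard hexagonal-axes formula; a quick consistency check:

```python
import math

def hexagonal_cell_volume(a, c):
    """Unit-cell volume V = (sqrt(3)/2) * a^2 * c for a cell with
    hexagonal axes (tourmaline, hexagonal setting), a and c in
    angstroms, V in cubic angstroms."""
    return math.sqrt(3.0) / 2.0 * a * a * c
```

With a = 15.9436 Å and c = 7.2126 Å this gives approximately 1587.8 Å3, matching the Rietveld result quoted above.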
Abstract:
Sensitive detection of pathogens is critical to ensure the safety of food supplies and to prevent bacterial disease infection and outbreak at the first onset. While conventional techniques such as cell culture, ELISA and PCR have been the predominant detection workhorses, they are limited by time-consuming procedures, complicated sample pre-treatment, expensive analysis and operation, or unsuitability for point-of-care testing. Here, we present our recently developed assay exploiting enzyme-induced aggregation of plasmonic gold nanoparticles (AuNPs) for label-free and ultrasensitive detection of bacterial DNA. In the experiments, AuNPs are first functionalized with specific single-stranded RNA probes so that they exhibit high stability in solution, even under highly electrolytic conditions, and thus display a red colour. When bacterial DNA is present in a sample, a DNA-RNA heteroduplex is formed, which is then prone to RNase H cleavage of the RNA probe, allowing the DNA to be liberated and hybridize with another RNA strand. This cycle continues until all of the RNA strands are cleaved, leaving the nanoparticles 'unprotected'. The addition of NaCl then causes the 'unprotected' nanoparticles to aggregate, initiating a colour change from red to blue. The reaction is performed in a multi-well plate format, and the distinct colour signal can be discriminated by the naked eye or by simple optical spectroscopy. As a result, bacterial DNA at concentrations as low as picomolar could be unambiguously detected. The enzyme-induced AuNP aggregation assay is very easy to perform and sensitive, and it should significantly benefit the development of fast and ultrasensitive methods for disease detection and diagnosis.
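The red-to-blue readout described above can be quantified spectroscopically with a simple absorbance ratio: dispersed AuNPs absorb near 520 nm, while aggregation shifts absorbance toward longer wavelengths. The specific wavelengths and cutoff below are illustrative assumptions, not parameters from the published assay:

```python
def aggregation_readout(a520, a650, cutoff=1.0):
    """Classify a well from two absorbance readings: a high 520/650 nm
    ratio means dispersed (red) particles, i.e. intact RNA probes and
    no target; a low ratio means salt-induced aggregation (blue),
    i.e. target DNA present. Wavelengths and cutoff are assumed."""
    ratio = a520 / a650
    if ratio > cutoff:
        return 'red (dispersed): no target above cutoff', ratio
    return 'blue (aggregated): target DNA present', ratio
```

In a multi-well plate, applying this per well turns the naked-eye colour call into a reproducible numeric threshold.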
Abstract:
PURPOSE:
To evaluate the combination of a pressure-indicating sensor film with hydrogel-forming microneedle (MN) arrays as a method of feedback to confirm MN insertion in vivo.
METHODS:
Pilot in vitro insertion studies were conducted using a Texture Analyser to insert MN arrays, coupled with a pressure-indicating sensor film, at varying forces into excised neonatal porcine skin. In vivo studies involved twenty human volunteers, who self-applied two hydrogel-forming MN arrays, one with a pressure-indicating sensor film incorporated and one without. Optical coherence tomography was employed to measure the resulting penetration depth and colorimetric analysis to investigate the associated colour change of the pressure-indicating sensor film.
RESULTS:
Microneedle insertion was achieved in vitro at three different forces, demonstrating the colour change of the pressure-indicating sensor film upon application of increasing pressure. When self-applied in vivo, there was no significant difference in the microneedle penetration depth resulting from each type of array, with a mean depth of 237 μm recorded. When the pressure-indicating sensor film was present, a colour change occurred upon each application, providing evidence of insertion.
CONCLUSIONS:
For the first time, this study shows how the incorporation of a simple, low-cost pressure-indicating sensor film can indicate microneedle insertion in vitro and in vivo, providing visual feedback to assure the user of correct application. Such a strategy may enhance usability of a microneedle device and, hence, assist in the future translation of the technology to widespread clinical use.
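The colorimetric analysis above boils down to quantifying a colour change. The CIE76 colour difference between CIELAB readings taken before and after application is one standard metric; using it here is an assumption, since the abstract does not name the metric employed:

```python
import math

def delta_e_ab(lab_before, lab_after):
    """CIE76 colour difference between two CIELAB readings (L*, a*, b*):
    the Euclidean distance in Lab space. Larger values mean a more
    pronounced colour change of the sensor film."""
    return math.sqrt(sum((p - q) ** 2
                         for p, q in zip(lab_before, lab_after)))
```

A threshold on this value could serve as the objective pass/fail criterion for confirming that sufficient application pressure was reached.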
Abstract:
Essay exploring the giallo genre using Dario Argento's Profondo Rosso/Deep Red as an example. Produced for Arrow Video's Blu-ray release of Deep Red and included as part of the full-colour booklet.
Abstract:
The thesis begins with classical cooperation and transfers it to the digital world. This work gives a detailed overview of the young research fields of smart city, shareconomy and crowdsourcing and links these fields with entrepreneurship. The core research aim is to find connections between the research fields of smart city, shareconomy and crowdsourcing and entrepreneurial activities, along with the specific fields of application, success factors and conditions for entrepreneurs. The thesis consists of seven peer-reviewed publications. Based on primary and secondary data, the existence of entrepreneurial opportunities in the fields of smart city, shareconomy and crowdsourcing could be confirmed. The first part of the thesis (publications 1-3) consists of literature reviews to secure the fundamental base for further research. This part provides newly created definitions and a substantial sharpening of the research fields for the near future. In the second part of the thesis (publications 4-7), empirical field work (in-depth interviews with entrepreneurs) and quantitative analyses (fuzzy-set/qualitative comparative analysis and binary logistic regression analysis) contribute additional new insights to the field of research. In summary, the insights are multi-layered: theoretical (e.g. new definitions, sharpening of the research field), methodical (e.g. first application of fuzzy-set/qualitative comparative analysis in the field of crowdfunding) and qualitative (first application of in-depth interviews with entrepreneurs in the fields of smart city and shareconomy). The global research question could be answered: the link between entrepreneurship and smart city, shareconomy and crowdfunding could be confirmed, concrete fields of application could be identified, and further developments could be touched upon.
This work contributes strongly to these young research fields through much-needed foundational work, new qualitative approaches, innovative methods and new insights, and it offers opportunities for discussion, criticism and support for further research.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Research in ubiquitous and pervasive technologies has made it possible to recognise activities of daily living through non-intrusive sensors. The data captured from these sensors must be classified using various machine learning or knowledge-driven techniques to infer and recognise activities. The process of discovering activities and activity-object patterns from sensors tagged to objects as they are used is critical to recognising the activities. In this paper, we propose a topic-model process for discovering activities and activity-object patterns from the interactions of low-level state-change sensors. We also develop a recognition and segmentation algorithm to recognise activities and identify activity boundaries. The experimental results we present validate our framework and show that it is comparable to existing approaches.
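A key preprocessing step for any such topic model is turning the raw state-change stream into "documents" of sensor "words" over which activity topics can be discovered. A minimal time-window sketch (the fixed window length is an assumed choice, not the paper's stated method):

```python
def sensor_events_to_documents(events, window_s=60):
    """Group a stream of (timestamp_s, sensor_id) state-change events
    into fixed-length time windows; each window becomes one 'document'
    of sensor 'words' for a downstream topic model."""
    docs = {}
    for t, sensor in events:
        docs.setdefault(t // window_s, []).append(sensor)
    return [docs[k] for k in sorted(docs)]
```

Each resulting document can then be fed to a standard topic model (e.g. LDA), whose topics correspond to activity-object usage patterns.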
Abstract:
In this work, we present results from teleseismic P-wave receiver functions (PRFs) obtained in Portugal, Western Iberia. A dense seismic station deployment conducted between 2010 and 2012, within the scope of the WILAS project and covering the entire country, allowed the most spatially extensive probing of the bulk crustal seismic properties of Portugal to date. The application of the H-kappa stacking algorithm to the PRFs enabled us to estimate the crustal thickness (H) and the average crustal ratio of P- and S-wave velocities, Vp/Vs (kappa), for the region. Observations of Moho conversions indicate that this interface is relatively smooth, with the crustal thickness ranging between 24 and 34 km and averaging 30 km. The highest Vp/Vs values are found in the Mesozoic-Cenozoic crust beneath the western and southern coastal domain of Portugal, whereas the lowest values correspond to the Palaeozoic crust underlying the remaining part of the study area. The average Vp/Vs is found to be 1.72, ranging from 1.63 to 1.86 across the study area, indicating a predominantly felsic composition. Overall, we systematically observe a decrease of Vp/Vs with increasing crustal thickness. Taken as a whole, our results indicate a clear distinction between the geological zones of the Variscan Iberian Massif in Portugal, with the overall shape of the anomalies conditioned by the shape of the Ibero-Armorican Arc and the associated Late Paleozoic suture zones, and the Meso-Cenozoic basin associated with the Atlantic rifting stages. Thickened crust (30-34 km) across the studied region may be inherited from continental collision during the Paleozoic Variscan orogeny. An anomalous crustal thinning to around 28 km is observed beneath the central part of the Central Iberian Zone and the eastern part of the South Portuguese Zone.
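H-kappa stacking rests on the travel-time prediction for the Moho Ps conversion: the algorithm grid-searches (H, kappa) to maximize the summed PRF amplitude at the predicted arrival times. A sketch of the core relation for the Ps phase (the crustal Vp used in the example is an assumed value):

```python
import math

def t_ps(h_km, kappa, vp, p):
    """Predicted delay (s) of the Moho Ps conversion behind the direct
    P arrival, for crustal thickness h_km (km), Vp/Vs ratio kappa,
    average crustal P velocity vp (km/s) and ray parameter p (s/km)."""
    vs = vp / kappa
    return h_km * (math.sqrt(1.0 / vs**2 - p**2)
                   - math.sqrt(1.0 / vp**2 - p**2))
```

At vertical incidence (p = 0) this reduces to H(kappa - 1)/Vp; for example H = 30 km and kappa = 1.72 with an assumed Vp = 6.5 km/s give a Ps delay of about 3.3 s.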
Abstract:
With the development of electronic devices, more and more mobile clients are connected to the Internet, generating massive amounts of data every day. We live in an age of "Big Data", producing data on the order of hundreds of millions of records daily. By analyzing these data and making predictions, better development plans can be devised. Unfortunately, traditional computation frameworks cannot meet this demand, which is why Hadoop was put forward. First, this paper introduces the background and development status of Hadoop, compares MapReduce in Hadoop 1.0 with YARN in Hadoop 2.0, and analyzes their advantages and disadvantages. Because resource management is the core role of YARN, the paper then investigates the resource allocation module, including resource management, the resource allocation algorithm, the resource preemption model, and the whole resource scheduling process from requesting resources to completing allocation. It also introduces and compares the FIFO Scheduler, Capacity Scheduler, and Fair Scheduler. The main work of this paper is researching and analyzing the Dominant Resource Fairness (DRF) algorithm of YARN and putting forward a maximum-resource-utilization algorithm based on it. The paper also suggests improvements to unreasonable aspects of the resource preemption model. Emphasizing "fairness" during resource allocation is the core concept of the DRF algorithm in YARN. Because a cluster serves multiple users and offers multiple resources, each user's resource request is also multi-dimensional. The DRF algorithm divides a user's resources into the dominant resource and normal resources: for a given user, the dominant resource is the requested resource whose share of the cluster is highest, and all others are normal resources. The DRF algorithm requires the dominant resource shares of all users to be equal.
However, in cases where different users' dominant resource amounts differ greatly, emphasizing "fairness" is not suitable and cannot improve the resource utilization of the cluster. By analyzing these cases, this thesis puts forward a new allocation algorithm based on DRF. The new algorithm still takes "fairness" into consideration, but it is no longer the main principle: maximizing resource utilization is the main principle and goal. Comparing the results of DRF and the new DRF-based algorithm shows that the new algorithm achieves higher resource utilization than DRF. The last part of the thesis installs a YARN environment and uses the Scheduler Load Simulator (SLS) to simulate the cluster environment.
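The DRF principle described above can be sketched with the progressive-filling loop from the original DRF formulation: always grant the next task to the user with the smallest dominant share. The cluster capacities and per-task demands in the example are illustrative, not the thesis's experimental setup:

```python
def drf_allocate(capacity, demands, rounds=1000):
    """Progressive-filling sketch of Dominant Resource Fairness:
    repeatedly grant one demand bundle (one task) to the user with the
    lowest dominant share, stopping when the next grant would exceed
    some resource's capacity. Simplification: stops at the first user
    whose bundle no longer fits."""
    n_res = len(capacity)
    used = [0.0] * n_res
    alloc = [[0.0] * n_res for _ in demands]
    shares = [0.0] * len(demands)       # each user's dominant share
    for _ in range(rounds):
        u = min(range(len(demands)), key=lambda i: shares[i])
        d = demands[u]
        if any(used[r] + d[r] > capacity[r] for r in range(n_res)):
            break                       # a resource would be saturated
        for r in range(n_res):
            used[r] += d[r]
            alloc[u][r] += d[r]
        shares[u] = max(alloc[u][r] / capacity[r] for r in range(n_res))
    return alloc
```

For a cluster of (9 CPU, 18 GB) with user A demanding (1 CPU, 4 GB) per task and user B (3 CPU, 1 GB), this loop gives A three tasks and B two, equalizing their dominant shares at 2/3 each, which is exactly the equal-dominant-share condition stated above.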
Abstract:
This work links the theories proposed in two earlier studies: the transformation of crisp values into fuzzy values, and the construction of fuzzy control charts. The result of this link is a fuzzy control chart that was applied to a yoghurt production process, where the analysed variables were colour, aroma, consistency, flavour and acidity. Since these characteristics depend on individual perception, sensory analysis was used to collect information about them. In the analyses, a panel of judges individually assigned scores from 0 to 10 to each yoghurt sample. These crisp values, the judges' scores, were then transformed into fuzzy values in the form of triangular fuzzy numbers. From the fuzzy numbers, fuzzy control charts for the mean and range were constructed. From the crisp values, Shewhart control charts for the mean and range, well established in the literature, were constructed. Finally, the results from the traditional charts were compared with those from the fuzzy control charts. The fuzzy control chart appears to reflect the reality of the process significantly better, because the construction of the fuzzy number takes process variability into account. In addition, it characterises the production process in levels, where the process is not always either fully in control or fully out of control. This agrees with fuzzy theory: if certain results cannot be predicted exactly, it is better to have a margin of acceptance, which leads to a reduction of errors.
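The crisp-to-fuzzy step can be sketched as follows. Representing each sample's panel scores as a (min, mean, max) triangular fuzzy number is one common construction and an assumption here, since the abstract does not give the exact transformation used:

```python
def to_triangular(scores):
    """Turn a panel's crisp scores for one sample into a triangular
    fuzzy number (lower, mode, upper) = (min, mean, max), so the
    spread of the judges' opinions is preserved."""
    return (min(scores), sum(scores) / len(scores), max(scores))

def fuzzy_mean(tfns):
    """Component-wise mean of triangular fuzzy numbers: the fuzzy
    centre line of the mean control chart."""
    n = len(tfns)
    return tuple(sum(t[i] for t in tfns) / n for i in range(3))
```

Fuzzy control limits are then built from these triangular numbers in the same spirit as Shewhart limits, with each sample judged "in control" to a degree rather than by a hard yes/no.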
Abstract:
Electric power systems are becoming more complex and covering larger areas day by day. This fact has contributed to the development of monitoring techniques that aim to help with the analysis, control and planning of power systems, such as Supervisory Control and Data Acquisition (SCADA) systems, Wide Area Measurement Systems (WAMS) and disturbance recording systems. Unlike SCADA and WAMS, disturbance recording systems are mainly used for offline analysis of occurrences in which a fault resulted in the tripping of an apparatus such as a transmission line, transformer or generator. The device responsible for recording the disturbances is called a Digital Fault Recorder (DFR); it records electrical quantities such as voltages and currents, along with digital information from protection system devices. Generally, in power plants, all the DFR data are centralized in the utility's data centre, resulting in an excess of data that complicates the analysis task of the specialist engineers. This dissertation presents a new methodology for the automated analysis of disturbances in power plants. A fuzzy reasoning system is proposed to deal with the data from the DFRs. The objective of the system is to help the engineer responsible for analysing the DFRs' information by means of a pre-classification of the data. To that end, the fuzzy system is responsible for generating unit operational state diagnoses and fault classifications.
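A fuzzy reasoning system of this kind is built from membership functions and IF-THEN rules over the recorded quantities. The rule and membership functions below are an illustrative Mamdani-style fragment with assumed shapes and thresholds, not the dissertation's actual rule base:

```python
def trimf(x, a, b, c):
    """Triangular membership function rising from a to peak b,
    falling to c; returns a membership degree in [0, 1]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fault_degree(voltage_pu, current_pu):
    """One illustrative rule on per-unit DFR quantities:
    IF voltage is low AND current is high THEN fault evidence is high.
    AND is taken as min, as in Mamdani inference; the membership
    function breakpoints are hypothetical."""
    low_voltage = trimf(voltage_pu, 0.0, 0.5, 0.9)
    high_current = trimf(current_pu, 1.2, 3.0, 6.0)
    return min(low_voltage, high_current)
```

A full system would aggregate many such rules over the DFR channels and defuzzify the result into an operational-state diagnosis and fault class.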
Abstract:
Food irradiation is a treatment that involves subjecting in-bulk or packaged food to a controlled dose of ionizing radiation with a clearly defined goal. It has been used for disinfestation and sanitization of food commodities and to retard postharvest ripening and senescence processes, being a sustainable alternative to chemical agents [1]. Doses up to 10 kGy are approved by several international authorities as having no negative effects on food from a nutritional and toxicological point of view [2]. However, the adoption of this technology for food applications has been a slow process, due to some misunderstandings by consumers, who often choose non-irradiated foods. In this study, the effects of ionizing radiation treatment on the physical, chemical and bioactive properties of dried herbs, and its suitability for preserving the quality attributes of fresh vegetables during cold storage, were evaluated. The studied herbs, perennial spotted rockrose (Tuberaria lignosa (Sweet) Samp.) and common mallow (Malva neglecta Wallr.), were freeze-dried and then irradiated up to 10 kGy in a Cobalt-60 chamber. The selected vegetables, watercress (Nasturtium officinale R. Br.) and buckler sorrel (Rumex induratus Boiss. & Reut.), were rinsed in tap water, packaged in polyethylene bags, submitted to irradiation doses up to 6 kGy and then stored at 4 °C for a period of up to 12 days. Physical, chemical and bioactive parameters of irradiated and non-irradiated samples were evaluated using different methodologies: the colour was measured with a colorimeter, individual chemical compounds were analyzed by chromatographic techniques, antioxidant properties were evaluated using in vitro assays based on different reaction mechanisms, and other quality analyses were performed following official methods of analysis. The irradiation treatment did not significantly affect the colour of the perennial spotted rockrose samples, or their phenolic composition and antioxidant activity [3].
Medium doses preserved the colour of common mallow, and a low dose did not induce any adverse effect on the organic acid profile. The green colour of the irradiated vegetables was maintained during cold storage, but the treatment had pros and cons for other quality attributes. The 2 kGy dose preserved free sugars and favoured polyunsaturated fatty acids (PUFA), while the 5 kGy dose favoured tocopherols and preserved the antioxidant properties in watercress samples. The 6 kGy dose was a suitable option for preserving PUFA and the ω-6/ω-3 fatty acid ratio in buckler sorrel samples. This comprehensive experimental work allowed the selection of appropriate processing doses for the studied plant foods in order to preserve their quality attributes and edibility.