912 results for Efficient image processing
Abstract:
Many industrial machine vision and pattern recognition problems are very similar, so largely the same components could be reused when designing prototype applications. Object-oriented application frameworks offer an excellent way to speed up software development by improving reusability. This both enables wider use of machine vision applications and saves costs. This thesis presents a machine vision application framework whose basic architecture is pipeline-like. The top-level structure consists of a sensor, data processing operations, a feature extractor, and a classifier. In addition to the framework itself, a set of image processing and pattern recognition operations has been implemented. The framework clearly speeds up programming work and makes it easy to add new image processing operations.
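A minimal Python sketch of such a pipeline architecture is shown below; the class and stage names are illustrative only and do not reproduce the framework's actual API.

```python
# Minimal sketch of a pipeline-style machine vision framework.
# All names are illustrative; the thesis's actual API is not shown.
from typing import Any, Callable, List

class VisionPipeline:
    """Chains a sensor, processing stages, a feature extractor, and a classifier."""

    def __init__(self, sensor: Callable[[], Any],
                 stages: List[Callable[[Any], Any]],
                 extractor: Callable[[Any], Any],
                 classifier: Callable[[Any], Any]):
        self.sensor = sensor
        self.stages = stages
        self.extractor = extractor
        self.classifier = classifier

    def run(self) -> Any:
        data = self.sensor()                  # acquire an image
        for stage in self.stages:             # apply processing operations in order
            data = stage(data)
        features = self.extractor(data)       # compute a feature vector
        return self.classifier(features)      # final decision

# New image processing operations are added by appending callables to `stages`.
```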
Abstract:
The problem of understanding how humans perceive the quality of a reproduced image is of interest to researchers in many fields related to vision science and engineering: optics and material physics, image processing (compression and transfer), printing and media technology, and psychology. A measure of visual quality cannot be defined without ambiguity because it is ultimately the subjective opinion of an “end-user” observing the product. The purpose of this thesis is to devise computational methods to estimate the overall visual quality of prints, i.e. a numerical value that combines all the relevant attributes of perceived image quality. The problem is limited to the perceived quality of printed photographs from the viewpoint of a consumer, and moreover, the study focuses only on digital printing methods, such as inkjet and electrophotography. The main contributions of this thesis are two novel methods for estimating the overall visual quality of prints. In the first method, the quality is computed as a visible difference between the reproduced image and the original digital (reference) image, which is assumed to have ideal quality. The second method utilises instrumental print quality measures, such as colour densities, measured from printed technical test fields, and connects the instrumental measures to the overall quality via subjective attributes, i.e. attributes that directly contribute to the perceived quality, using a Bayesian network. Both approaches were evaluated and verified with real data and shown to predict the subjective evaluation results well.
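As a rough illustration of the first, full-reference approach: the thesis uses a perceptual visible-difference model, whereas the sketch below substitutes plain luminance RMSE as a stand-in for the difference computation.

```python
# Sketch of a full-reference quality score: difference between the print
# (scanned back to digital) and the ideal reference image. Plain RMSE on a
# crude luminance channel stands in for the thesis's perceptual model.
import numpy as np

def quality_score(reference: np.ndarray, reproduction: np.ndarray) -> float:
    ref = reference.astype(float).mean(axis=2)    # crude luminance
    rep = reproduction.astype(float).mean(axis=2)
    rmse = np.sqrt(np.mean((ref - rep) ** 2))
    return 1.0 / (1.0 + rmse)                     # higher = closer to reference

ref = np.random.rand(64, 64, 3)
print(quality_score(ref, ref * 0.9))
```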
Abstract:
Tässä tutkimuksessa toteutettiin uusi versio aikaisemmin tuotetusta työkalusta merkintöjen tekemiseksi pääasiassa silmänpohjakuviin. Tarkoituksena oli toteuttaa kuvankäsittelyyn perustuvia aputoimintoja kuvien valaistuksenkorjaamiseksi, sekä korostaa lääkärille mahdollisia diabeettiseen retinopatiaan kuuluvia löydöksiä. Kuvien annotoinnin helpottamiseksi toteutettiin kaksi menetelmää valaistuksenkorjaamiseksi: yksiulotteinen käyrämenetelmä sekä värikanavien ominaisuuksia hyödyntävä menetelmä. Kuvien annotoinin helpottamiseksi toteutettiin kuvan vihreän kanavan jakaumaan perustuva aputoiminto, joka pyrkii korostamaan mahdollisia diabeettiseen retinopatiaan kuuluvia löydöksiä.
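A minimal sketch of a green-channel helper of this general kind (histogram equalization of the green channel, which typically carries the best lesion contrast in fundus images) is given below; the thesis's actual distribution-based method is not reproduced.

```python
# Sketch: equalize the green channel of a fundus image to emphasize contrast.
# Illustrative only; the thesis's actual helper function may differ.
import numpy as np

def enhance_green(rgb: np.ndarray) -> np.ndarray:
    g = rgb[..., 1]
    hist, _ = np.histogram(g.ravel(), bins=256, range=(0, 256))
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1) * 255
    return cdf[g.astype(np.uint8)].astype(np.uint8)  # equalized green channel

img = (np.random.rand(128, 128, 3) * 255).astype(np.uint8)
enhanced = enhance_green(img)
```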
Abstract:
Blood flow in the human aorta is an unsteady and complex phenomenon. The complex patterns are related to geometrical features such as curvature, bends, and branching, and to the pulsatile nature of the flow from the left ventricle of the heart. The aim of this work was to understand the effect of aorta geometry on the flow dynamics. To achieve this, realistic and idealized 3D models of the descending aorta were reconstructed from Computed Tomography (CT) images of a female patient. The geometries were reconstructed using a medical image processing code. The blood flow in the aorta was assumed to be laminar and incompressible, and the blood was assumed to be a Newtonian fluid. A time-dependent pulsatile and parabolic boundary condition was applied at the inlet. Steady and unsteady blood flow simulations were performed in the real and idealized geometries of the descending aorta using a Finite Volume Method (FVM) code. Analyses of the Wall Shear Stress (WSS) distribution, pressure distribution, and axial velocity profiles were carried out in both geometries under steady and unsteady conditions. The results obtained in this thesis work reveal that the idealization of the geometry underestimates the values of WSS, especially near regions with a sudden change of diameter. However, the resultant pressure and velocity in the idealized geometry are close to those in the real geometry.
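For reference, the wall shear stress of a Newtonian fluid is conventionally defined from the wall-normal velocity gradient; a standard textbook form (not taken from the thesis itself) is

```latex
\tau_w = \mu \left.\frac{\partial u_t}{\partial n}\right|_{\text{wall}},
```

where \mu is the dynamic viscosity, u_t the velocity component tangential to the wall, and n the wall-normal coordinate.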
Abstract:
Print quality and the printability of paper are very important attributes when modern printing applications are considered. In prints containing images, high print quality is a basic requirement. Tone unevenness and non-uniform glossiness of printed products are the most disturbing factors influencing overall print quality. These defects are caused by non-ideal interactions of paper, ink, and printing devices in high-speed printing processes. Since print quality is a perceptive characteristic, measuring unevenness in accordance with human vision is a significant problem. In this thesis, the mottling phenomenon is studied. Mottling is a printing defect characterized by a spotty, non-uniform appearance in solid printed areas. Print mottle is usually the result of uneven ink laydown or non-uniform ink absorption across the paper surface, and is especially visible in mid-tone imagery or areas of uniform color, such as solids and continuous-tone screen builds. Using existing knowledge of visual perception and known methods to quantify print tone variation, a new method for print unevenness evaluation is introduced. The method is compared to previous results in the field and is supported by psychometric experiments. Pilot studies were made to estimate the effect of optical paper characteristics, measured prior to printing, on the unevenness of the printed area after printing. Instrumental methods for print unevenness evaluation have been compared, and the results of the comparison indicate that the proposed method produces better results in terms of correspondence with visual evaluation. The method has been successfully implemented as an industrial application and has proved to be a reliable substitute for visual expertise.
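A minimal sketch of a band-pass unevenness index of this general family follows: tone variation is measured in a few spatial-frequency bands and summed into one number. This is illustrative only; the thesis's method additionally weights bands according to human contrast sensitivity.

```python
# Sketch of a band-pass mottling index: standard deviation of the print
# reflectance in several difference-of-Gaussians bands, summed together.
import numpy as np
from scipy.ndimage import gaussian_filter

def mottling_index(gray: np.ndarray, sigmas=(1, 2, 4, 8, 16)) -> float:
    img = gray.astype(float)
    index = 0.0
    for s_fine, s_coarse in zip(sigmas[:-1], sigmas[1:]):
        band = gaussian_filter(img, s_fine) - gaussian_filter(img, s_coarse)
        index += band.std()                 # tone variation within this band
    return index

print(mottling_index(np.random.rand(256, 256)))
```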
Abstract:
The aim of this study was to simulate blood flow in the human thoracic aorta and to understand the role of flow dynamics in the initiation and localization of atherosclerotic plaque. Blood flow dynamics were numerically simulated in three idealized and two realistic models of the human thoracic aorta. The idealized models were reconstructed from measurements available in the literature, and the realistic models were constructed by image processing of Computed Tomography (CT) images. The CT images were made available by South Karelia Central Hospital in Lappeenranta. The reconstruction of the thoracic aorta consisted of operations such as contrast adjustment, image segmentation, and 3D surface rendering. Additional design operations were performed to make the aorta model compatible with the numerical-method-based computer code. The image processing and design operations were performed with specialized medical image processing software. Pulsatile pressure and velocity boundary conditions were applied at the inlet. The blood flow was assumed homogeneous and incompressible, and the blood was assumed to be a Newtonian fluid. The simulations with the idealized models were carried out with a Finite Element Method based computer code, while the simulations with the realistic models were carried out with a Finite Volume Method based computer code. Simulations were carried out for four cardiac cycles, and the distributions of flow, pressure, and Wall Shear Stress (WSS) observed during the fourth cycle were extensively analyzed. The aim of carrying out the simulations with the idealized models was to obtain an estimate of the flow dynamics in a realistic aorta model, and the motive behind choosing three aorta models with distinct features was to understand the dependence of flow dynamics on aorta anatomy. A highly disturbed and non-uniform distribution of velocity and WSS was observed in the aortic arch, near the brachiocephalic, left common carotid, and left subclavian arteries. Moreover, the WSS profiles at the roots of the branches showed significant differences as the geometry of the aorta and its branches varied. The comparison of instantaneous WSS profiles revealed that the model with straight branching arteries had relatively lower WSS than the aorta model with curved branches. In addition, significant differences were observed in the spatial and temporal profiles of WSS, flow, and pressure. The study with the idealized models was extended to blood flow in the thoracic aorta under the effects of hypertension and hypotension: one of the idealized aorta models was modified, along with its boundary conditions, to mimic these conditions. The simulations with the realistic models extracted from CT scans demonstrated more realistic flow dynamics than the idealized models. During systole, the velocity in the ascending aorta was skewed towards the outer wall of the aortic arch, and the flow developed secondary flow patterns as it moved downstream towards the arch. Unlike in the idealized models, the distribution of flow was non-planar and heavily guided by the artery anatomy. Flow cavitation was observed in the aorta model whose imaging captured longer branches; it could not be properly observed in the model whose imaging covered only a shorter length of the aortic branches.
Flow circulation was also observed at the inner wall of the aortic arch. During diastole, however, the flow profiles were almost flat and regular due to the acceleration of the flow at the inlet, and they were weakly turbulent during flow reversal. The complex flow patterns caused a non-uniform distribution of WSS: high WSS occurred at the junctions of the branches and the aortic arch, low WSS at the proximal part of each junction, and intermediate WSS at the distal part. The pulsatile nature of the inflow caused oscillating WSS at the branch entry regions and the inner curvature of the aortic arch. Based on the WSS distribution in the realistic model, one of the aorta models was altered to introduce artificial atherosclerotic plaque at the branch entry regions and the inner curvature of the aortic arch. Atherosclerotic plaque causing 50% blockage of the lumen was introduced in the brachiocephalic artery, the common carotid artery, the left subclavian artery, and the aortic arch. The aims of this part of the study were, first, to study the effect of stenosis on the flow and WSS distributions, then to understand the effect of the shape of the atherosclerotic plaque, and finally to investigate the effect of the severity of lumen blockage. The results revealed that the distribution of WSS is significantly affected by a plaque with a mere 50% stenosis, and that an asymmetric stenosis causes higher WSS in the branching arteries than a symmetric plaque does. The flow dynamics within the thoracic aorta models have been extensively studied and reported here, and the effects of pressure and arterial anatomy on the flow dynamics were investigated. The distribution of complex flow and WSS correlates with the localization of atherosclerosis. From the available results we can conclude that the thoracic aorta, with its complex anatomy, is the artery most vulnerable to the localization and development of atherosclerosis, and that flow dynamics and arterial anatomy play a role in this localization. Patient-specific image-based models can be used to identify the locations in the aorta vulnerable to the development of arterial diseases such as atherosclerosis.
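To illustrate the kind of pulsatile parabolic inlet condition described here, the sketch below builds a one-harmonic waveform scaling a parabolic profile; the waveform coefficients are made up for illustration and are not the patient data used in the thesis.

```python
# Sketch of a pulsatile parabolic inlet velocity profile of the kind used
# as an inlet boundary condition in these simulations (coefficients invented).
import numpy as np

def inlet_velocity(r: np.ndarray, R: float, t: float, T: float = 1.0) -> np.ndarray:
    # One-harmonic approximation of a cardiac waveform (mean + first harmonic).
    u_mean = 0.3 + 0.2 * np.sin(2 * np.pi * t / T)        # m/s, illustrative
    return 2.0 * u_mean * (1.0 - (r / R) ** 2)            # parabolic profile

r = np.linspace(0, 0.01, 50)      # radial positions across a 1 cm radius inlet
print(inlet_velocity(r, R=0.01, t=0.2))
```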
Abstract:
The study aimed to evaluate a methodology for quantifying soil porosity using computed tomography in areas under no-tillage, conventional tillage, and native forest. Three soil management systems were selected for the study: forest, conventional tillage, and no-tillage. In each soil management system, undisturbed soil samples were collected in the surface layer (0.0 to 0.10 m). The tomographic images were obtained using an X-ray microtomograph. After acquisition, the images were processed, and a methodology for converting the images into numerical values was evaluated. The statistical method that provided the greatest accuracy was the percentile method. The methodology used to analyze the tomographic images allowed the porosity of the soil under the different management systems to be quantified, and enabled the characterization of soil porosity in a non-invasive and non-destructive way.
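A minimal sketch of percentile-based thresholding for porosity estimation follows; the percentile cut-off here is illustrative, since the study calibrates its own value.

```python
# Sketch: threshold a CT image at a chosen percentile of its gray values;
# voxels darker than the threshold are counted as pore (air) space.
import numpy as np

def porosity(ct_slice: np.ndarray, percentile: float = 30.0) -> float:
    threshold = np.percentile(ct_slice, percentile)
    pores = ct_slice < threshold            # low attenuation = pore space
    return pores.mean()                     # pore fraction of the slice

slice_ = np.random.normal(1000, 200, size=(256, 256))   # synthetic CT values
print(f"porosity = {porosity(slice_):.2f}")
```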
Abstract:
The authors thoroughly report the development, technical aspects, and performance of the first navigated liver resections in Brazil, performed by laparotomy and laparoscopy at the National Cancer Institute, Ministry of Health, using a surgical navigator.
Abstract:
The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains become more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end part of the purchasing process: the handling of invoices. Efficient invoice processing can have an impact on an organization's working capital management and thus provide companies with better readiness to face the challenges related to cash management. Leveraging a process mining solution, the aim of this research was to examine the automated invoice handling process of four different organizations. The invoice data was collected from each organization's invoice processing system, and the sample included all the invoices the organizations had processed during 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (after entry into the invoice processing solution). Other objectives included examining the longest lead times between process steps and the impact of manual process steps on cycle time. The processing of invoices from maverick purchases was also examined. Based on the results of the research and previous literature on the subject, suggestions for improving the process were proposed. The results indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices; it should be noted, however, that the manual work of turning a paper invoice into electronic format through scanning is ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automation steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that these steps prolonged the process. Regarding invoices from maverick purchases, the evidence shows that these invoices were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions for improving the process included increasing invoice matching, reducing manual steps, and leveraging different value-added services such as an invoice validation service, mobile solutions, and supply chain financing services. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
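The core process-mining computation here, lead time between consecutive process steps, can be sketched in a few lines of pandas; the column names and events below are hypothetical, not taken from the studied systems.

```python
# Sketch: compute lead times between consecutive invoice process steps
# from an event log, as done in process mining. Column names hypothetical.
import pandas as pd

log = pd.DataFrame({
    "invoice_id": [1, 1, 1, 2, 2],
    "step": ["received", "matched", "approved", "received", "approved"],
    "timestamp": pd.to_datetime([
        "2012-03-01 09:00", "2012-03-01 09:05", "2012-03-04 14:00",
        "2012-03-02 10:00", "2012-03-09 16:00",
    ]),
})

log = log.sort_values(["invoice_id", "timestamp"])
log["lead_time"] = log.groupby("invoice_id")["timestamp"].diff()
print(log.dropna())                 # time spent between consecutive steps
```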
Abstract:
The papermaking industry has been continuously developing intelligent solutions to characterize the raw materials it uses, to control the manufacturing process in a robust way, and to guarantee the desired quality of the end product. With much-improved imaging techniques and image-based analysis methods, it has become possible to look inside the manufacturing pipeline and propose more effective alternatives to human expertise. This study focuses on the development of image analysis methods for the pulping process of papermaking. Pulping starts with wood disintegration and the forming of a fiber suspension that is subsequently bleached, mixed with additives and chemicals, and finally dried and shipped to the papermaking mills. At each stage of the process it is important to analyze the properties of the raw material to guarantee the product quality. To evaluate the properties of fibers, the main component of the pulp suspension, a framework for fiber characterization based on microscopic images is proposed in this thesis as the first contribution. The framework allows the computation of fiber length and curl index, which correlate well with the ground truth values. The bubble detection method, the second contribution, was developed to estimate the gas volume at the delignification stage of the pulping process based on high-resolution in-line imaging. The gas volume was estimated accurately, and the solution enabled just-in-time process termination, whereas the accurate estimation of bubble size categories remained challenging. As the third contribution, optical flow computation was studied, and the methods were successfully applied to pulp flow velocity estimation based on double-exposed images. Finally, a framework for classifying dirt particles in dried pulp sheets, including semisynthetic ground truth generation, feature selection, and a performance comparison of state-of-the-art classification techniques, was proposed as the fourth contribution. The framework was successfully tested on semisynthetic and real-world pulp sheet images. Together, these four contributions assist in developing integrated, factory-level, vision-based process control.
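One commonly used definition of the fiber curl index is the contour length of the traced fiber divided by its end-to-end distance, minus one; the sketch below computes it under that assumption (the thesis may use a different variant).

```python
# Sketch: curl index of a fiber centerline = contour length / end-to-end - 1.
import numpy as np

def curl_index(path: np.ndarray) -> float:
    segments = np.diff(path, axis=0)
    contour_len = np.linalg.norm(segments, axis=1).sum()
    end_to_end = np.linalg.norm(path[-1] - path[0])
    return contour_len / end_to_end - 1.0

# A gently curved fiber traced as (x, y) points along its centerline.
t = np.linspace(0, np.pi / 2, 50)
fiber = np.column_stack([np.cos(t), np.sin(t)])
print(f"curl index = {curl_index(fiber):.3f}")   # about 0.111 for this arc
```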
Abstract:
Lignocellulosic biomasses (e.g., wood and straws) are a potential renewable source for the production of a wide variety of chemicals that could replace those currently produced by the petrochemical industry. This would lead to lower greenhouse gas emissions and waste amounts, and to economic savings. Many pathways are available for manufacturing chemicals from lignocellulosic biomasses. One option is to hydrolyze the cellulose and hemicelluloses of these biomasses into monosaccharides using concentrated sulfuric acid as the catalyst. This process is an efficient method for producing monosaccharides, which are valuable platform chemicals, and other valuable products are also formed in the hydrolysis. Unfortunately, concentrated acid hydrolysis has been deemed unfeasible, mainly due to the high chemical consumption resulting from the need to remove sulfuric acid from the obtained hydrolysates prior to the downstream processing of the monosaccharides. Traditionally, this has been done by neutralization with lime, which results in high chemical consumption; in addition, the by-products formed in the hydrolysis are not removed and may thus hinder the monosaccharide processing. To improve the feasibility of concentrated acid hydrolysis, the chemical consumption should be decreased by recycling the sulfuric acid without neutralization. Furthermore, the monosaccharides and the other products formed in the hydrolysis should be recovered selectively for efficient downstream processing; the selective recovery of the hydrolysis by-products would bring additional economic benefits to the process due to their high value. In this work, the use of chromatographic fractionation for recycling the sulfuric acid and selectively recovering the main components of the hydrolysates formed in concentrated acid hydrolysis was investigated. Chromatographic fractionation based on electrolyte exclusion, with gel-type strong acid cation exchange resins in acid (H+) form as the stationary phase, was studied. A systematic experimental and model-based study of the separation task at hand was conducted: the phenomena affecting the separation were determined and their effects elucidated, and mathematical models that accurately take these phenomena into account were derived and used in the simulation of the fractionation process. The main components of the concentrated acid hydrolysates (sulfuric acid, monosaccharides, and acetic acid) were included in this model. The performance of the fractionation process was investigated experimentally and by simulations, and the use of different process options was also studied. Sulfuric acid was found to have a significant co-operative effect on the sorption of the other components, which brings about interesting and beneficial effects in the column operations and is especially beneficial for the separation of sulfuric acid and the monosaccharides. Two different approaches to modelling the sorption equilibria were investigated in this work: a simple empirical approach and a thermodynamically consistent approach (the Adsorbed Solution theory). Accurate modelling of the phenomena observed in this work proved possible with the simple empirical models, whereas the use of the Adsorbed Solution theory is complicated by the nature of the theory and the complexity of the studied system.
In addition to the sorption models, a dynamic column model that takes into account the volume changes of the gel-type resins as changes in resin bed porosity was also derived. Using chromatography, all the main components of the hydrolysates can be recovered selectively, and the sulfuric acid consumption of the hydrolysis process can be lowered considerably. Investigation of the performance of the chromatographic fractionation showed that the highest separation efficiency in this separation task is obtained with a gel-type resin with a high crosslinking degree (8 wt. %), especially when the hydrolysates contain high amounts of acetic acid. In addition, the concentrated acid hydrolysis should be done with as low a sulfuric acid concentration as possible to obtain good separation performance. The column loading and flow rate also have large effects on the performance. It was also demonstrated that when the fractions obtained in the chromatographic fractionation are recycled to preceding unit operations, these unit operations should be included in the performance evaluation of the fractionation; when this was done, the separation performance and the feasibility of the concentrated acid hydrolysis process were found to improve considerably. The use of multi-column chromatographic fractionation processes, the Japan Organo process and the Multi-Column Recycling Chromatography process, was also investigated. In the studied case, neither of these processes could compete with the single-column batch process in productivity. However, due to its internal recycling steps, Multi-Column Recycling Chromatography was found to be superior to the batch process when the product yield and the eluent consumption were taken into account.
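For orientation, a standard equilibrium-dispersive column model of the kind involved here (a textbook formulation, not the thesis's exact model, which additionally lets the bed porosity vary with resin swelling) reads

```latex
\frac{\partial c_i}{\partial t}
  + F \frac{\partial q_i}{\partial t}
  + u \frac{\partial c_i}{\partial z}
  = D_{\mathrm{ax}} \frac{\partial^2 c_i}{\partial z^2},
\qquad F = \frac{1 - \varepsilon}{\varepsilon},
```

where c_i and q_i are the liquid- and sorbed-phase concentrations of component i (sulfuric acid, a monosaccharide, or acetic acid), u the interstitial velocity, \varepsilon the bed porosity, and D_ax the axial dispersion coefficient.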
Abstract:
Weed mapping is a useful tool for site-specific herbicide applications. The objectives of this study were (1) to determine the percentage of land area covered by weeds in no-till and conventionally tilled fields of common bean using digital image processing and geostatistics, and (2) to compare two types of cameras. Two digital cameras (color and infrared) and a differential GPS were affixed to a center pivot structure for image acquisition. Sample field images were acquired in a regular grid pattern, and the images were processed to estimate the percentage of weed cover. After calculating the georeferenced weed percentage values, maps were constructed using geostatistical techniques. Based on the results, color images are recommended for mapping the percentage of weed cover in no-till systems, while infrared images are recommended for weed mapping in conventional tillage systems.
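A minimal sketch of estimating percentage weed cover from a color image with the excess-green index (ExG = 2g − r − b, a common choice for vegetation segmentation) follows; the study's own processing chain is not detailed in the abstract.

```python
# Sketch: threshold the excess-green index to estimate percentage weed cover.
import numpy as np

def weed_cover_percent(rgb: np.ndarray, threshold: float = 0.1) -> float:
    norm = rgb.astype(float) / max(rgb.max(), 1)
    r, g, b = norm[..., 0], norm[..., 1], norm[..., 2]
    exg = 2 * g - r - b                      # excess-green vegetation index
    return 100.0 * (exg > threshold).mean()  # fraction of "green" pixels

img = (np.random.rand(100, 100, 3) * 255).astype(np.uint8)
print(f"weed cover = {weed_cover_percent(img):.1f}%")
```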
Abstract:
The purpose of steganography is to hide a secret message among other information. Based on the literature, this thesis examines steganography and the digital watermarking of images. The thesis also includes an experimental part, which presents a test system developed for detecting watermarked images and the results of the test runs. In the test runs, image series were watermarked with selected watermarking methods while varying the parameters. Feature extraction is performed on the images to be detected, and the extracted features are passed as input to a classifier, which makes the final detection decision. The study produced working software for adding a watermark and for detecting watermarked images within a set of images. Based on the results, a suitable feature extractor combined with a support vector machine classifier achieves a detection accuracy of over 95 percent.
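A minimal sketch of the detection step follows: extracted image features go to a support vector machine that decides watermarked versus clean. The features and labels here are random stand-ins; the thesis's feature extractor is not reproduced.

```python
# Sketch: SVM-based watermark detection on (synthetic) image feature vectors.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))            # 16-dim feature vector per image
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # synthetic watermarked/clean labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print(f"detection accuracy: {clf.score(X_te, y_te):.2%}")
```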
Abstract:
This thesis presents a framework for the segmentation of clustered, overlapping convex objects. The proposed approach is based on a three-step framework addressing the tasks of seed point extraction, contour evidence extraction, and contour estimation. The state-of-the-art techniques for each step were studied and evaluated using synthetic and real microscopic image data, and based on the obtained evaluation results, a method combining the best performer in each step is presented. In the proposed method, the Fast Radial Symmetry transform, an edge-to-marker association algorithm, and ellipse fitting are employed for seed point extraction, contour evidence extraction, and contour estimation, respectively. Using synthetic and real image data, the proposed method was evaluated and compared with two competing methods, and the results showed a promising improvement over the competitors, with high segmentation and size distribution estimation accuracy.
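The final contour-estimation step can be sketched with OpenCV's ellipse fitting, as below; the seed-point and contour-evidence steps of the framework are not reproduced, and the arc data is synthetic.

```python
# Sketch: fit an ellipse to recovered contour evidence (a partial, noisy arc).
import numpy as np
import cv2

t = np.linspace(0, 1.5 * np.pi, 60)
pts = np.column_stack([80 + 40 * np.cos(t), 60 + 25 * np.sin(t)])
pts += np.random.normal(scale=1.0, size=pts.shape)   # contour noise

(cx, cy), (major, minor), angle = cv2.fitEllipse(pts.astype(np.float32))
print(f"center=({cx:.1f}, {cy:.1f}), axes=({major:.1f}, {minor:.1f}), "
      f"angle={angle:.1f}")
```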
Influence of surface functionalization on the behavior of silica nanoparticles in biological systems
Abstract:
Personalized nanomedicine has been shown to provide advantages over traditional clinical imaging, diagnosis, and conventional medical treatment. Using nanoparticles can enhance and sharpen clinical targeting and imaging, guiding agents exactly to the place in the body that is the goal of treatment, while reducing the side effects that usually occur in parts of the body that are not targets of the treatment. Nanoparticles are of a size that can penetrate into cells. Their surface functionalization offers a way to increase their sensitivity when detecting target molecules, and it increases the potential for flexibility in particle design, their therapeutic function, and variation possibilities in diagnostics. Mesoporous nanoparticles of amorphous silica have attractive physical and chemical characteristics, such as particle morphology, controllable pore size, and high surface area and pore volume. Additionally, the surface functionalization of silica nanoparticles is relatively straightforward, which enables optimization of the interaction between the particles and the biological system. The main goal of this study was to prepare traceable and targetable silica nanoparticles for medical applications, with a special focus on particle dispersion stability, biocompatibility, and targeting capabilities. Nanoparticle properties are highly particle-size dependent, and good dispersion stability is a prerequisite for active therapeutic and diagnostic agents. The study showed that traceable streptavidin-conjugated silica nanoparticles exhibiting good dispersibility could be obtained through the choice of a proper surface functionalization route. Theranostic nanoparticles should exhibit sufficient hydrolytic stability to effectively carry the medicine to the target cells, after which they should disintegrate and dissolve; furthermore, the surface groups should stay at the particle surface until the particle has been internalized by the cell, in order to optimize cell specificity. Model particles with fluorescently labeled regions were tested in vitro using light microscopy and image processing, which allowed a detailed study of the disintegration and dissolution process; the study showed that nanoparticles degrade more slowly outside the cell than inside it. The main advantage of theranostic agents is their successful targeting in vitro and in vivo. Non-porous nanoparticles using monoclonal antibodies as guiding ligands were tested in vitro to follow their targeting ability and internalization; in addition to successful targeting, a specific internalization route for the particles could be detected. In the last part of the study, the objective was to clarify the feasibility of traceable mesoporous silica nanoparticles, loaded with a hydrophobic cancer drug, for targeted drug delivery in vitro and in vivo. The particles were provided with a small-molecule targeting ligand. A significantly higher therapeutic effect could be achieved with the nanoparticles than with the free drug; the nanoparticles were biocompatible and stayed in the tumor for a longer time than the free drug did before being eliminated by renal excretion. Overall, the results showed that mesoporous silica nanoparticles are biocompatible, biodegradable drug carriers and that cell specificity can be achieved both in vitro and in vivo.