867 results for image processing and analysis
Abstract:
This article presents the principal results of the doctoral thesis “Recognition of neume notation in historical documents” by Lasko Laskov (Institute of Mathematics and Informatics at Bulgarian Academy of Sciences), successfully defended before the Specialized Academic Council for Informatics and Mathematical Modelling on 07 June 2010.
Abstract:
Congenital nystagmus (CN) is an ocular-motor disorder characterised by involuntary, conjugate ocular oscillations that can arise in the first months of life. The pathogenesis of congenital nystagmus is still under investigation. In general, CN patients show a considerable decrease in visual acuity: image fixation on the retina is disturbed by the continuous, mainly horizontal, oscillations of the nystagmus. However, image stabilisation is still achieved during the short periods in which eye velocity slows down while the target image is placed on the fovea (called foveation intervals). To quantify the extent of nystagmus, eye-movement recordings are routinely employed, allowing physicians to extract and analyse its main features, such as shape, amplitude and frequency. Using eye-movement recordings, it is also possible to compute estimated visual acuity predictors: analytical functions that estimate expected visual acuity from signal features such as foveation time and foveation-position variability. These functions add information to typical visual acuity measurements (e.g. the Landolt C test) and could support therapy planning or monitoring. This study focuses on robust detection of CN patients' foveations. Specifically, it proposes a method to recognise the exact signal tracts in which a subject foveates, and it also analyses foveation sequences. About 50 eye-movement recordings, either infrared-oculographic or electro-oculographic, from different CN subjects were acquired. Results suggest that an exponential interpolation of the slow phases of nystagmus could improve the computation of foveation time and reduce the influence of braking saccades and data noise. Moreover, a concise description of foveation-sequence variability can be achieved using non-fitting splines. © 2009 Springer Berlin Heidelberg.
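The key idea of the abstract, exponential interpolation of the slow phases, can be sketched as a simple least-squares fit. The snippet below is a minimal illustration rather than the authors' implementation; the sampling rate, the velocity threshold, and the model x(t) = x0 + A(1 - exp(-t/tau)) are assumptions made for the example.

```python
# Minimal sketch: fit an exponential to one nystagmus slow-phase tract and
# estimate foveation time as the span where fitted eye velocity stays low.
# Model, sampling rate and threshold are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

FS = 500.0          # sampling rate in Hz (assumed)
VEL_THRESH = 4.0    # deg/s: velocity below this counts as foveation (assumed)

def slow_phase_model(t, x0, amp, tau):
    """Exponential slow-phase position: x0 + amp * (1 - exp(-t / tau))."""
    return x0 + amp * (1.0 - np.exp(-t / tau))

def foveation_time(position_deg):
    """Fit the model to one slow-phase tract; return the time (s) during
    which the fitted eye velocity is below VEL_THRESH."""
    t = np.arange(len(position_deg)) / FS
    (x0, amp, tau), _ = curve_fit(slow_phase_model, t, position_deg,
                                  p0=(position_deg[0], 1.0, 0.1))
    velocity = (amp / tau) * np.exp(-t / tau)   # analytic derivative
    return np.count_nonzero(np.abs(velocity) < VEL_THRESH) / FS

# Example on synthetic data: a noisy exponential slow phase.
rng = np.random.default_rng(0)
t = np.arange(0, 0.3, 1.0 / FS)
segment = 2.0 + 1.5 * (1 - np.exp(-t / 0.05)) + rng.normal(0, 0.02, t.size)
print(f"estimated foveation time: {foveation_time(segment):.3f} s")
```

Fitting the analytic model, rather than differentiating the raw signal, is what makes the estimate robust to braking saccades and noise, as the abstract suggests.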
Abstract:
The term oxylipin refers to oxygenated products of polyunsaturated fatty acids that can arise through either non-enzymatic or enzymatic processes, generating a complex array of products including alcohols, aldehydes, ketones, acids and hydrocarbon gases. Study of the biosynthetic origin of these products has revealed an array of enzymes involved in their formation and, more recently, a radical pathway. These range from the lipoxygenases and α-dioxygenase that insert both oxygen atoms into the acyl chain to initiate the pathways, to the specialised P450 monooxygenases responsible for downstream processing. The latter group includes enzymes at branch points such as allene oxide synthase, leading to jasmonate signalling; hydroperoxide lyase, responsible for generating pathogen/pest defensive volatiles; and the divinyl ether synthases and peroxygenases involved in the formation of antimicrobial compounds. The complexity of the products generated raises significant challenges for their rapid identification and quantification using metabolic screening methods. Here, current developments in oxylipin metabolism are reviewed together with the emerging technologies required to expand this important field of research, which underpins advances in plant-pest/pathogen interactions.
Abstract:
This research pursued the conceptualization and real-time verification of a system that allows a computer user to control the cursor of a computer interface without using his/her hands. The target user groups for this system are individuals who are unable to use their hands due to spinal dysfunction or other afflictions, and individuals who must use their hands for higher-priority tasks while still requiring interaction with a computer.

The system receives two forms of input from the user: electromyogram (EMG) signals from muscles in the face, and point-of-gaze coordinates produced by an Eye Gaze Tracking (EGT) system. In order to produce reliable cursor control from the two forms of user input, the development of this EMG/EGT system addressed three key requirements: an algorithm was created to accurately translate EMG signals due to facial movements into cursor actions; a separate algorithm was created to recognize an eye-gaze fixation and provide an estimate of the associated eye-gaze position; and an information fusion protocol was devised to efficiently integrate the outputs of these algorithms.

Experiments were conducted to compare the performance of EMG/EGT cursor control to EGT-only control and mouse control. These experiments took the form of two different types of point-and-click trials. The data produced by these experiments were evaluated using statistical analysis, Fitts' law analysis and target re-entry (TRE) analysis.

The experimental results revealed that though EMG/EGT control was slower than EGT-only and mouse control, it provided effective hands-free control of the cursor without a spatial accuracy limitation, and it also facilitated a reliable click operation. This combination of qualities is not possessed by either EGT-only or mouse control, making EMG/EGT cursor control a unique and practical alternative for a user's cursor-control needs.
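The Fitts' law analysis mentioned above scores point-and-click trials by an index of difficulty. The sketch below computes the standard Shannon formulation, ID = log2(D/W + 1), and throughput ID/MT; it is a generic illustration of this kind of analysis, not the dissertation's code, and the sample numbers are invented.

```python
# Minimal sketch of a Fitts' law analysis for point-and-click trials.
# Shannon formulation: ID = log2(D / W + 1); throughput = ID / MT.
# Distances, widths and movement times below are invented sample data.
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Index of difficulty in bits (Shannon formulation)."""
    return math.log2(distance / width + 1.0)

def throughput(distance: float, width: float, movement_time: float) -> float:
    """Throughput in bits/s for one trial."""
    return index_of_difficulty(distance, width) / movement_time

trials = [  # (distance px, target width px, movement time s)
    (400, 40, 1.9),   # e.g. a slower hands-free trial
    (400, 40, 1.1),   # e.g. a mouse trial
]
for d, w, mt in trials:
    print(f"D={d} W={w} MT={mt}s  ID={index_of_difficulty(d, w):.2f} bits  "
          f"TP={throughput(d, w, mt):.2f} bits/s")
```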
Abstract:
Microarray technology provides a high-throughput technique to study gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data; effective data processing and analysis are therefore critical for making reliable inferences from the data.

The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene-selection methods (GS, GS-NR, and GS-PCA) were developed to identify a set of genes that best differentiate between samples. A comparative study of different classification tools was performed, and the best combinations of gene selection and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene-selection method proposed in this dissertation, GS, outperformed the other gene-selection methods. Classifiers based on Random Forests, neural network ensembles, and K-nearest neighbor (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior.

The same biological problem may be studied at different research labs and/or performed using different lab protocols or samples. In such situations, it is important to combine results from these efforts. The second part of the dissertation addresses the problem of pooling the results from different independent experiments to obtain improved results. Four statistical pooling techniques (the Fisher inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species; as a result, improved sets of cell cycle-regulated genes were identified.

The last part of the dissertation explores the effectiveness of wavelet data transforms for the task of clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
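Two of the pooling techniques named above, Fisher's inverse chi-square method and the (weighted) Stouffer Z-transform, are available directly in SciPy. The sketch below combines per-experiment p-values for one gene under both schemes; the p-values and weights are invented for illustration.

```python
# Minimal sketch: pooling p-values from independent microarray experiments.
# Fisher: X = -2 * sum(ln p_i), chi-square with 2k degrees of freedom.
# Liptak-Stouffer: Z = sum(w_i * Phi^-1(1 - p_i)) / sqrt(sum(w_i^2)).
import numpy as np
from scipy.stats import combine_pvalues

# Invented p-values for one gene across four independent experiments,
# with weights proportional to (assumed) sample sizes.
pvals = np.array([0.04, 0.10, 0.03, 0.20])
weights = np.array([24, 12, 30, 8], dtype=float)

stat_f, p_fisher = combine_pvalues(pvals, method='fisher')
stat_s, p_stouffer = combine_pvalues(pvals, method='stouffer', weights=weights)

print(f"Fisher:           stat={stat_f:.2f}  pooled p={p_fisher:.4f}")
print(f"Liptak-Stouffer:  stat={stat_s:.2f}  pooled p={p_stouffer:.4f}")
```

Weighting by sample size lets a large, well-powered experiment count for more than a small one, which is the rationale behind the Liptak-Stouffer variant.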
Abstract:
Issues of body image and ability to achieve intimacy are connected to body weight, yet remain largely unexplored and have not been evaluated by gender. The underlying purpose of this research was to determine whether avoidant attitudes toward and perceptions of one's body may hold implications for its use in intimate interactions, and whether an above-average body weight would tend to increase this avoidance. The National Health and Nutrition Examination Survey (NHANES, 1999-2002) finds that 64.5% of US adults are overweight: 61.9% of women and 67.2% of men. The increasing prevalence of overweight and obesity in men and women shows no sign of reversing, nor have prevention and treatment proven effective in the long term. The researcher gathered self-reported age, gender, height and weight data from 55 male and 58 female subjects (a sample size determined by a prospective power analysis with a desired medium effect size, r = .30) to compute body mass index (BMI); the sample had a mean age of 21.6 years and a mean BMI of 25.6. Survey instruments consisted of two scales germane to the variables being examined: (1) the Fear-of-Intimacy Scale of Descutner and Thelen (1991, University of Missouri); and (2) the Body Image Avoidance Questionnaire of Rosen, Srebnik, Saltzberg, and Wendt (1991). Results indicated that as body mass index increases, fear of intimacy increases (p<0.05), and that as body mass index increases, body image avoidance increases (p<0.05). The hypothesis that fear of intimacy increases as body image avoidance increases was not supported, but approached significance (p<0.07). No differences in these relationships were found between gender groups. For age, the only observed relationship was a difference between scores for age groups [18 to 22 (group 1) and 23 to 34 (group 2)] in the relationship of body image avoidance and fear of intimacy (p<0.02). The results suggest that the relationship between body image avoidance and fear of intimacy, as well as age, warrants consideration in light of the escalating prevalence of overweight and obesity. An integrative approach to body weight that addresses issues of body image and intimacy may prove effective in prevention and treatment.
Abstract:
The purpose of this study was to determine racial and ethnic differences in body image perceptions and weight concerns of fourth-grade girls. A purposive sample of 182 fourth-grade girls was eligible to participate; 166 were included in the data analysis. The Children's Eating Attitude Test (ChEAT) and a Dieting and Demographic Questionnaire (DDQ) were used to determine the girls' eating attitudes. A pictorial instrument modified from the original was used to assess body image. Anthropometric data were collected, and body mass index (BMI) values were used to classify subjects into percentiles. Results revealed that 56% of all fourth-grade girls studied wanted to be thinner and 53% had tried to lose weight. Significantly more non-Hispanic white (NHW) girls reported wanting to be thinner than non-Hispanic black (NHB) and Hispanic (H) girls (65.5% vs. 32% and 47%, respectively, P=0.005). No significant racial/ethnic differences were revealed for the ChEAT scores; however, 19% of all subjects studied fell into the category indicative of anorexia nervosa. H girls below the 85th percentile for BMI chose significantly smaller figures as their perceived body image (3.5±0.7) than both NHB and NHW girls (4.0±0.6 and 3.9±0.5, respectively, P<0.01). These findings demonstrate that weight concerns were prevalent among girls ages 9-11 years. NHW and H girls may have more concerns about their body size and shape than their NHB counterparts. Implementing intervention programs at an early age may prevent eating disorders in adolescence and adulthood.
Abstract:
This paper presents an image-processing-based method for detecting pitting corrosion in steel structures. High Dynamic Range (HDR) imaging has been carried out to demonstrate the effectiveness of such relatively inexpensive techniques, which are of immense benefit to the Non-Destructive Testing (NDT) community. The pitting corrosion of a steel sample in a marine environment is successfully detected using the proposed methodology. It is observed that the proposed method has definite potential to be applied to a wider range of applications.
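A minimal sketch of the general approach, not the paper's pipeline, is given below: bracketed exposures are merged into an HDR image with OpenCV, then small dark regions are flagged as candidate pits. The file names, exposure times and threshold parameters are assumptions made for the example.

```python
# Minimal sketch: HDR merge of bracketed exposures followed by a simple
# dark-spot threshold as a stand-in for pitting-corrosion detection.
# File names, exposure times and thresholds are illustrative assumptions.
import cv2
import numpy as np

# Bracketed exposures of the steel sample (hypothetical file names).
files = ["steel_1_60s.jpg", "steel_1_15s.jpg", "steel_1_4s.jpg"]
times = np.array([1 / 60, 1 / 15, 1 / 4], dtype=np.float32)
images = [cv2.imread(f) for f in files]

# Merge to HDR (Debevec) and tone-map back to a displayable range.
hdr = cv2.createMergeDebevec().process(images, times=times)
ldr = cv2.createTonemap(gamma=2.2).process(hdr)
gray = cv2.cvtColor(np.clip(ldr * 255, 0, 255).astype(np.uint8),
                    cv2.COLOR_BGR2GRAY)

# Pits appear as small dark regions: adaptive threshold + contour count.
mask = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                             cv2.THRESH_BINARY_INV, 31, 10)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
pits = [c for c in contours if cv2.contourArea(c) > 5]
print(f"candidate pits detected: {len(pits)}")
```

The point of the HDR step is that specular steel surfaces blow out single exposures; merging several exposures preserves detail in both bright and shadowed regions before any thresholding is attempted.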
Abstract:
Current state-of-the-art techniques for landmine detection in ground-penetrating radar (GPR) use statistical methods to identify characteristics of a landmine response. This research makes use of 2-D slices of data in which subsurface landmine responses have hyperbolic shapes. Various methods from the field of visual image processing are adapted to the 2-D GPR data, producing superior landmine detection results. The research then develops a physics-based GPR augmentation method motivated by current advances in visual object detection; this GPR-specific augmentation mitigates issues caused by insufficient training sets, and the work shows that augmentation improves detection performance under training conditions that are normally very difficult. Finally, this work introduces convolutional neural networks as a method to learn feature-extraction parameters; these learned convolutional features outperform hand-designed features in GPR detection tasks. In sum, this work presents a number of methods, both borrowed from and motivated by the substantial work in visual image processing, that improve overall detection performance and the robustness of statistical classification.
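The convolutional feature learning described above can be sketched with a small network over 2-D GPR patches. The following is a minimal PyTorch illustration; the architecture, patch size and class labels are invented for the example and are not the dissertation's network.

```python
# Minimal sketch: a small CNN that learns features from 2-D GPR patches
# (hyperbolic landmine response vs. background clutter). Architecture
# and patch size are illustrative assumptions.
import torch
import torch.nn as nn

class GPRPatchNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(          # learned feature extraction
            nn.Conv2d(1, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 14 * 14, 64), nn.ReLU(),
            nn.Linear(64, 2),                   # landmine vs. background
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# One forward pass on a batch of 64x64 depth-vs-crosstrack patches.
net = GPRPatchNet()
patches = torch.randn(8, 1, 64, 64)             # random stand-in data
print(net(patches).shape)                       # torch.Size([8, 2])
```

Here the convolution kernels play the role of the learned feature extractors that the dissertation reports outperforming hand-designed features.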
Abstract:
Economic policy-making has long been more integrated than social policy-making, in part because the statistics and much of the analysis that support economic policy are based on a common conceptual framework: the system of national accounts. People interested in economic analysis and economic policy share a common language of communication, one that includes both concepts and numbers. This paper examines early attempts to develop a system of social statistics that would mirror the system of national accounts, particularly the work on social accounts that took place mainly in the 1960s and 1970s. It explores the reasons why these early initiatives failed, but argues that the preconditions now exist to develop a new conceptual framework to support integrated social statistics, and hence a more coherent, effective social policy. Optimism is warranted for two reasons. First, we can make use of the radical transformation that has taken place in information technology, both in processing data and in providing wide access to the knowledge that can flow from the data. Second, the conditions exist to begin to shift away from the straitjacket of government-centric social statistics, with its implicit assumption that governments must be the primary actors in finding solutions to social problems. By supporting the decision-making of all the players (particularly individual citizens) who affect social trends and outcomes, we can start to move beyond the sterile, ideological discussions that have dominated much social discourse in the past and begin to build social systems and structures that evolve, almost automatically, based on empirical evidence of 'what works best for whom'. The paper describes a Canadian approach to developing a framework, or common language, to support the evolution of an integrated, citizen-centric system of social statistics and social analysis. This language supports the traditional social policy that we have today; nothing is lost. However, it also supports a quite different social policy world, one where individual citizens and families (not governments) are seen as the central players: a more empirically-driven world that we have referred to as the 'enabling society'.
Abstract:
A small-scale sample nuclear waste package, consisting of a 28 mm diameter uranium penny encased in grout, was imaged by absorption-contrast radiography using a single-pulse exposure from an X-ray source driven by a high-power laser. The Vulcan laser was used to deliver a focused pulse of photons to a tantalum foil, in order to generate a bright burst of highly penetrating X-rays (with energy >500 keV) from a source smaller than 0.5 mm. BAS-TR and BAS-SR image plates were used for image capture, alongside a newly developed thallium-doped caesium iodide scintillator-based detector coupled to CCD chips. The uranium penny was clearly resolved to sub-mm accuracy over a 30 cm² scan area from a single-shot acquisition. In addition, neutron generation was demonstrated in situ with the X-ray beam in a single shot, demonstrating the potential for multi-modal criticality testing of waste materials. This feasibility study successfully demonstrated non-destructive radiography of encapsulated, high-density nuclear material. With recent developments of high-power laser systems towards 10 Hz operation, a laser-driven multi-modal beamline for waste-monitoring applications is envisioned.
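Absorption-contrast radiography rests on Beer-Lambert attenuation, I/I0 = exp(-mu x). The sketch below estimates the transmitted fraction through grout alone versus grout plus the uranium penny; the attenuation coefficients are rough illustrative values near 500 keV and the penny thickness is an assumption, none of which come from the study.

```python
# Minimal sketch: Beer-Lambert absorption contrast, I/I0 = exp(-mu * x).
# Attenuation coefficients are rough illustrative values at ~500 keV,
# and the penny thickness is an assumption; none are from the study.
import math

MU_URANIUM = 3.3   # 1/cm, approx. linear attenuation of uranium near 500 keV
MU_GROUT = 0.2     # 1/cm, approx. value for a concrete-like grout

def transmission(layers):
    """Transmitted fraction I/I0 through a stack of (mu, thickness_cm)."""
    return math.exp(-sum(mu * x for mu, x in layers))

grout_only = transmission([(MU_GROUT, 10.0)])                    # 10 cm grout
through_penny = transmission([(MU_GROUT, 9.7), (MU_URANIUM, 0.3)])

print(f"I/I0 through grout alone:   {grout_only:.3f}")
print(f"I/I0 through grout + penny: {through_penny:.4f}")
print(f"contrast ratio:             {grout_only / through_penny:.1f}x")
```

The large difference in attenuation between uranium and grout is what makes the penny stand out in a single-shot radiograph, provided the source is bright and hard enough to penetrate the package at all.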
Abstract:
With security and surveillance, there is an increasing need to process image data efficiently and effectively, either at source or in a large data network. Whilst the Field-Programmable Gate Array (FPGA) has been seen as a key technology for enabling this, the design process has been viewed as problematic in terms of the time and effort needed for implementation and verification. The work here proposes a different approach: optimised FPGA-based soft-core processors that allow the user to exploit task- and data-level parallelism to achieve the quality of dedicated FPGA implementations whilst reducing design time. The paper also reports some preliminary progress on the design flow to program the structure. An implementation of a Histogram of Gradients algorithm is also reported, showing that a performance of 328 fps can be achieved with this design approach, whilst avoiding the long design, verification and debugging steps associated with conventional FPGA implementations.
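For reference, the computation the paper accelerates in hardware can be sketched in software via scikit-image's HOG (Histogram of Oriented Gradients) implementation, a close software analogue. The cell/block parameters below are common defaults, not the paper's configuration.

```python
# Minimal software sketch of the Histogram of Gradients computation that
# the paper maps onto FPGA soft-cores, via scikit-image.
# Cell/block parameters are common defaults, not the paper's configuration.
import numpy as np
from skimage.feature import hog

frame = np.random.rand(128, 64)        # stand-in for one grayscale frame
descriptor = hog(
    frame,
    orientations=9,                    # gradient-angle bins per cell
    pixels_per_cell=(8, 8),
    cells_per_block=(2, 2),            # local contrast normalisation
)
print(descriptor.shape)                # (3780,) for a 128x64 frame
```

The per-cell histograms and per-block normalisations are independent of one another, which is exactly the task- and data-level parallelism the soft-core approach exploits.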
Abstract:
Over the past decades, work on infrared-sensor applications has progressed substantially worldwide. One difficulty remains: objects are not always clear, or cannot always be easily distinguished, in the image obtained of the observed scene. Infrared image enhancement has played an important role in the development of infrared computer vision, image processing, non-destructive testing, and related technologies. This thesis addresses infrared image enhancement in two respects: the processing of a single infrared image in the hybrid spatial-frequency domain, and the fusion of infrared and visible images using the non-subsampled contourlet transform (NSCT). Image fusion can be seen as a continuation of single-image infrared enhancement, since it combines infrared and visible images into a single image that represents and enhances all the useful information and features of the source images; no single image can contain all the relevant or available information, because of the restrictions inherent in any single imaging sensor. We review the development of infrared image enhancement techniques, then focus on single-image infrared enhancement and propose a hybrid-domain enhancement scheme with an improved fuzzy threshold evaluation method, which yields higher image quality and improves human visual perception. Infrared-visible fusion techniques are built on an accurate registration of the source images acquired by the different sensors. The SURF-RANSAC algorithm is applied for registration throughout this research, leading to very accurately registered images and increased benefits for the fusion processing. For infrared-visible image fusion, a series of advanced and effective approaches are proposed. A standard multi-channel NSCT-based fusion method is presented as a reference for the fusion approaches that follow. A joint fusion approach combining the Adaptive-Gaussian NSCT and the wavelet transform (WT) is proposed, leading to fusion results better than those obtained with general non-adaptive methods. An NSCT-based fusion approach employing compressed sensing (CS) and total variation (TV) to sample coefficients sparsely and reconstruct the fused coefficients accurately is proposed; it achieves much better fusion results through pre-enhancement of the infrared image and by reducing redundant information in the fusion coefficients. Finally, an NSCT-based fusion procedure using a fast iterative-shrinking compressed sensing (FISCS) technique is proposed to compress the decomposed coefficients and reconstruct the fused coefficients during the fusion process, leading to better results more rapidly and efficiently.
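Since the NSCT has no widely available Python implementation, the sketch below illustrates the same multi-scale fusion idea with a wavelet decomposition (PyWavelets) standing in for it: average the coarse approximation bands, keep the detail coefficients with larger magnitude. It is a generic illustration of multi-scale fusion, not the thesis's NSCT method.

```python
# Minimal sketch of multi-scale infrared/visible image fusion. A wavelet
# decomposition (PyWavelets) stands in for the NSCT: average the coarse
# bands, select detail coefficients by maximum magnitude.
import numpy as np
import pywt

def fuse_multiscale(ir, vis, wavelet="db2", level=3):
    """Fuse two registered grayscale images of the same shape."""
    c_ir = pywt.wavedec2(ir, wavelet, level=level)
    c_vis = pywt.wavedec2(vis, wavelet, level=level)

    fused = [(c_ir[0] + c_vis[0]) / 2.0]            # average approximation
    for d_ir, d_vis in zip(c_ir[1:], c_vis[1:]):    # per-level detail tuples
        fused.append(tuple(
            np.where(np.abs(a) >= np.abs(b), a, b)  # max-abs selection
            for a, b in zip(d_ir, d_vis)))
    return pywt.waverec2(fused, wavelet)

# Example with random stand-ins for registered IR and visible images.
rng = np.random.default_rng(1)
ir = rng.random((128, 128))
vis = rng.random((128, 128))
print(fuse_multiscale(ir, vis).shape)               # (128, 128)
```

The max-abs rule keeps the strongest edges and textures from either modality, which is the basic rationale shared by the thesis's NSCT-based fusion schemes; accurate registration of the inputs, as the abstract stresses, is a precondition for any such coefficient-level fusion.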