977 results for Pattern Informatics Method
Abstract:
The aim of this study was to compare two methods of tear sampling for protein quantification. Tear samples were collected from 29 healthy dogs (58 eyes) using Schirmer tear test (STT) strips and microcapillary tubes. The samples were frozen at -80°C and analyzed by the Bradford method. Results were compared using Student's t test. The mean protein concentration and standard deviation of tears collected with microcapillary tubes were 4.45 mg/mL ±0.35 and 4.52 mg/mL ±0.29 for right and left eyes, respectively. The mean protein concentration and standard deviation of tears collected with STT strips were 54.5 mg/mL ±0.63 and 54.15 mg/mL ±0.65 for right and left eyes, respectively. A statistically significant difference (p<0.001) was found between the methods. Under the conditions of this study, the mean protein concentration obtained with the Bradford test from tear samples collected by STT strip was higher than that obtained with microcapillary tubes. Tear protein concentration values should therefore be interpreted according to the method used to collect the samples.
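The comparison described above is a straightforward two-sample comparison; a minimal sketch of such an analysis in Python follows. The arrays hold hypothetical protein concentrations (mg/mL) generated around the reported means, not the study's raw data, and `scipy.stats.ttest_ind` stands in for whichever t-test variant the authors used.

```python
# Minimal sketch: comparing tear protein concentrations from two
# sampling methods with Student's t test (hypothetical data only).
import numpy as np
from scipy import stats

# Hypothetical per-eye protein concentrations (mg/mL) for each method.
rng = np.random.default_rng(0)
microcapillary = rng.normal(loc=4.45, scale=0.35, size=58)
stt_strip = rng.normal(loc=54.5, scale=0.63, size=58)

# Independent two-sample t test (equal-variance Student's t test).
t_stat, p_value = stats.ttest_ind(microcapillary, stt_strip)
print(f"mean (microcapillary) = {microcapillary.mean():.2f} mg/mL")
print(f"mean (STT strip)      = {stt_strip.mean():.2f} mg/mL")
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```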
Abstract:
We studied the distribution of NADPH-diaphorase activity in the visual cortex of normal adult New World monkeys (Saimiri sciureus) using the malic enzyme "indirect" method. NADPH-diaphorase neuropil activity had a heterogeneous distribution. In coronal sections, it showed a clear laminar pattern coincident with Nissl-stained layers. In tangential sections, we observed blobs in the supragranular layers of V1 and stripes throughout the entire V2. We quantified and compared the tangential distribution of NADPH-diaphorase and cytochrome oxidase blobs in adjacent sections of the supragranular layers of V1. Although their spatial distributions were rather similar, the two enzymes did not always overlap. The histochemical reaction also revealed two different types of stained cells: a slightly stained subpopulation and a subgroup of deeply stained neurons resembling a Golgi impregnation. These neurons were sparsely spined non-pyramidal cells. Their dendritic arbors were very well stained, but their axons were not always evident. In the gray matter, heavily stained neurons showed different dendritic arbor morphologies. However, most of the strongly reactive cells lay in the subjacent white matter, where they presented a more homogeneous morphology. Our results demonstrate that the pattern of NADPH-diaphorase activity is similar to that previously described in Old World monkeys.
Abstract:
Optical microscopy is experiencing a renaissance. The diffraction limit, although still physically real, plays a minor role in the achievable resolution of far-field fluorescence microscopy. Super-resolution techniques enable fluorescence microscopy at nearly molecular resolution. Modern (super-resolution) microscopy methods rely strongly on software. Software tools are needed all the way from data acquisition, data storage, image reconstruction, restoration and alignment to quantitative image analysis and image visualization. These tools play a key role in all aspects of microscopy today, and their importance will only increase in the coming years as microscopy transitions, little by little, from single cells to more complex and even living model systems. In this thesis, a series of bioimage informatics software tools is introduced for STED super-resolution microscopy. Tomographic reconstruction software, coupled with a novel image acquisition method (STED<), is shown to enable axial (3D) super-resolution imaging in a standard 2D-STED microscope. Software tools are introduced for STED super-resolution correlative imaging with transmission electron microscopes or atomic force microscopes. A novel method for automatically ranking image quality within microscope image datasets is introduced and is used, for example, to select the best images in a STED microscope image dataset.
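The automatic image-quality ranking mentioned above lends itself to a simple illustration. The sketch below ranks a set of images by a basic sharpness score (variance of the gradient magnitude); this metric and the function names are illustrative assumptions, not the thesis's actual algorithm.

```python
# Minimal sketch: ranking images by a simple sharpness score
# (variance of the gradient magnitude). Illustrative only; the
# thesis's actual quality metric may differ.
import numpy as np

def sharpness_score(image: np.ndarray) -> float:
    """Higher scores indicate sharper (higher-frequency) content."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.var(np.hypot(gx, gy)))

def rank_images(images: list[np.ndarray]) -> list[int]:
    """Return image indices ordered from best to worst quality."""
    scores = [sharpness_score(img) for img in images]
    return sorted(range(len(images)), key=lambda i: scores[i], reverse=True)

# Usage with random stand-in images:
images = [np.random.rand(64, 64) for _ in range(5)]
print(rank_images(images))
```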
Abstract:
This is a Self-study about my role as a teacher, driven by the question "How do I improve my practice?" (Whitehead, 1989). In this study, I explored the discomfort that I had with the way that I had been teaching. Specifically, I worked to uncover the reasons behind my obsessive (mis)management of my students. I wrote of how I came to give my Self permission for this critique: how I came to know that all knowledge is a construction, and that my practice, too, is a construction. I grounded this journey within my experiences. I constructed these experiences in narrative form in order to reach a greater understanding of how I came to be the teacher I initially was. I explored metaphors that impacted my practice, re-constructed them, and saw more clearly the assumptions and influences that have guided my teaching. I centred my inquiry into my teaching within an Action Reflection methodology, borrowing Jack Whitehead's (1989) term to describe my version of Action Research. I relied upon the embedded cyclical pattern of Action Reflection to understand my teaching Self: beginning from a critical moment, reflecting upon it, and then taking appropriate action, and continuing in this way, working to improve my practice. To understand these critical moments, I developed a personal definition of critical literacy. I then turned this definition inward. In treating my practice as a textual production, I applied critical literacy as a framework in coming to know and understand the construction that is my teaching. I grounded my thesis journey within my Self, positioning my study within my experiences of being a grade 1 teacher struggling to teach critical literacy. I then repositioned my journey to that of a grade 1 teacher struggling to use critical literacy to improve my practice. This journey, then, is about the transition from critical literacy-as-subject to critical literacy-as-instructional-method in improving my practice. I journeyed inwards, using a critical moment to build new understandings, leading me to the next critical moment, and continued in this cyclical way. I worked in this meandering yet deliberate way to reach a new place in my teaching: one that is more inclusive of all the voices in my room. I concluded my journey with a beginning: a beginning of re-visioning my practice. In telling the stories of my journey, of my teaching, of my experiences, I changed into the teacher that I am more comfortable with. "I've come to the frightening conclusion that I am the decisive element in the classroom. It's my personal approach that creates the climate. It's my daily mood that makes the weather. As a teacher, I possess a tremendous power to make a person's life miserable or joyous. I can be a tool of torture or an instrument of inspiration. I can humiliate or humour, hurt or heal. In all situations, it is my response that decides whether a crisis will be escalated or de-escalated and a person humanized or de-humanized." (Ginott, as cited in Buscaglia, 2002, p. 22)
Abstract:
In this paper, the effectiveness of a novel method of computer-assisted pedicle screw insertion was studied using hypothesis testing with a sample size of 48. Pattern recognition based on geometric features of markers on the drill was performed on real-time optical video obtained from orthogonally placed CCD cameras. The study reveals the accuracy of the calculated drill position using navigation based on a CT image of the vertebra and real-time optical video of the drill. The significance value is 0.424 at the 95% confidence level, which indicates good precision, with a standard error of the mean of only 0.00724. The virtual vision method is less hazardous to both the patient and the surgeon.
Abstract:
Consumers are becoming increasingly concerned about food quality, especially regarding how, when and where foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi et al., 2006). Therefore, in recent years there has been growing interest in methods for food quality assessment, especially in picture-developing methods as a complement to traditional chemical analysis of single compounds (Kahl et al., 2006). Biocrystallization, one of the picture-developing methods, is based on the crystallographic phenomenon that, when aqueous solutions of CuCl2 dihydrate are crystallized with the addition of organic solutions originating, e.g., from crop samples, biocrystallograms with reproducible crystal patterns are generated (Kleber & Steinike-Hartung, 1959). Its output is a crystal pattern on glass plates from which different variables (numbers) can be calculated using image analysis. However, there is a lack of a standardized evaluation method to quantify the morphological features of the biocrystallogram image. Therefore, the main aims of this research are (1) to optimize an existing statistical model in order to describe all the effects that contribute to the experiment; (2) to investigate the effect of image parameters on the texture analysis of the biocrystallogram images, i.e., region of interest (ROI), color transformation and histogram matching, on samples from the project 020E170/F financed by the Federal Ministry of Food, Agriculture and Consumer Protection (BMELV); the samples are wheat and carrots from controlled field and farm trials; (3) to relate the strongest texture-parameter effect to the visual evaluation criteria developed by a group of researchers (University of Kassel, Germany; Louis Bolk Institute (LBI), Netherlands; and Biodynamic Research Association Denmark (BRAD), Denmark), in order to clarify the relation between texture parameters and the visual characteristics of an image. The refined statistical model was implemented as an lme model with repeated measurements via crossed effects, programmed in R (version 2.1.0). The validity of the F and P values was checked against the SAS program. While the ANOVA yields the same F values, the P values are larger in R because of its more conservative approach. The refined model calculates more significant P values. The optimization of the image analysis deals with the following parameters: ROI (region of interest, the area around the geometrical center), color transformation (calculation of a one-dimensional gray-level value from the three-dimensional color information of the scanned picture, which is necessary for the texture analysis), and histogram matching (normalization of the histogram of the picture to enhance contrast and minimize errors from lighting conditions). The samples were wheat from the DOC trial with 4 field replicates for the years 2003 and 2005, "market samples" (organic and conventional neighbors with the same variety) for 2004 and 2005, carrots obtained from the University of Kassel (2 varieties, 2 nitrogen treatments) for the years 2004, 2005 and 2006, and "market samples" of carrots for the years 2004 and 2005. The criterion for the optimization was the repeatability of the differentiation of the samples over the different harvests (years). Different ROIs were found for different samples, reflecting the different pictures.
The color transformation that shows differentiation most efficiently is based on gray scale, i.e., an equal-weight color transformation. The second dimension of the color transformation appeared only in some years, as an effect of color wavelength (hue) for carrots treated with different nitrate fertilizer levels. The best histogram matching is to the Gaussian distribution. The approach was then to find a connection between the variables from textural image analysis and the different visual criteria. The relation between the texture parameters and visual evaluation criteria was limited to the carrot samples, as these could be well differentiated by the texture analysis. It was possible to connect groups of variables from the texture analysis with groups of criteria from the visual evaluation. These selected variables were able to differentiate the samples but not to classify them according to the treatment. In contrast, with the visual criteria, which describe the picture as a whole, classification was possible in 80% of the sample cases. This clearly shows the limits of the single-variable approach of the image analysis (texture analysis).
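Two of the image-analysis steps discussed above, color transformation to gray levels and histogram matching, can be sketched compactly. The snippet below uses an equal-weight gray transformation and matches the image histogram to a Gaussian reference; the equal weights and Gaussian target follow the abstract, while the function names and parameter values are illustrative.

```python
# Minimal sketch: equal-weight gray transformation and histogram
# matching to a Gaussian reference, two of the preprocessing steps
# named above. Function names and parameters are illustrative.
import numpy as np
from scipy.special import erfinv

def to_gray_equal(rgb: np.ndarray) -> np.ndarray:
    """Collapse 3D color information to 1D gray levels (equal weights)."""
    return rgb.mean(axis=-1)

def match_to_gaussian(gray: np.ndarray, mean: float = 0.5, std: float = 0.15) -> np.ndarray:
    """Remap gray levels so their histogram approximates a Gaussian."""
    flat = gray.ravel()
    ranks = flat.argsort().argsort() / (flat.size - 1)   # empirical CDF in [0, 1]
    ranks = np.clip(ranks, 1e-6, 1.0 - 1e-6)             # keep erfinv finite
    target = mean + std * np.sqrt(2.0) * erfinv(2.0 * ranks - 1.0)
    return target.reshape(gray.shape)

rgb_image = np.random.rand(128, 128, 3)   # stand-in for a scanned biocrystallogram
matched = match_to_gaussian(to_gray_equal(rgb_image))
```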
Abstract:
The absolute necessity of obtaining 3D information of structured and unknown environments in autonomous navigation considerably reduces the set of sensors that can be used. The need to know, at each instant, the position of the mobile robot with respect to the scene is indispensable. Furthermore, this information must be obtained in the least computing time. Stereo vision is an attractive and widely used method, but it is rather limited for making fast 3D surface maps due to the correspondence problem. The spatial and temporal correspondence among images can be alleviated using a method based on structured light. This relationship can be found directly by codifying the projected light; each imaged region of the projected pattern then carries the information needed to solve the correspondence problem. We present the most significant techniques of recent years concerning the coded structured light method.
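Binary Gray-code projection is one classic way of codifying the projected light so that each pixel's stripe index can be decoded directly; the sketch below illustrates this general technique under that assumption (it is not a specific method from the survey itself).

```python
# Minimal sketch: binary Gray-code structured light. Each of the
# n_bits projected patterns contributes one bit; decoding the bits
# observed at a pixel yields its projector-column index, which
# solves the correspondence problem directly.
import numpy as np

def gray_code_patterns(width: int, n_bits: int) -> np.ndarray:
    """Return n_bits binary stripe patterns of length `width`."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)  # binary-reflected Gray code of each column
    return np.array([(gray >> b) & 1 for b in reversed(range(n_bits))])

def decode_column(bits: np.ndarray) -> int:
    """Recover the projector column from the observed bit sequence (MSB first)."""
    gray = 0
    for b in bits:
        gray = (gray << 1) | int(b)
    col, shift = gray, 1       # Gray-to-binary: XOR of all right shifts
    while (gray >> shift) > 0:
        col ^= gray >> shift
        shift += 1
    return col

patterns = gray_code_patterns(width=1024, n_bits=10)
observed = patterns[:, 300]    # bits a camera would see at column 300
assert decode_column(observed) == 300
```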
Abstract:
Knowledge of the potential energy surface (PES) has been essential in theoretical chemistry for discussing chemical reactivity as well as molecular structure and spectroscopy. In the field of chemical reactivity, we propose to continue the development of new methodology within the framework of conceptual density functional theory. In particular, this thesis focuses on the following points: a) The number and nature of the stationary points of the PES can change radically with the level of calculation used, so that very high levels of calculation are needed to be sure of their nature. Hardness is a measure of the resistance of a chemical system to changing its electronic configuration, and according to the maximum hardness principle, wherever there is an energy minimum or maximum we will find a hardness maximum or minimum, respectively. Choosing a set of reactions that are problematic in terms of spurious stationary points, we observed that the hardness profiles are more independent of the basis set and method used and, moreover, always present the correct profile. b) We developed new expressions based on the integration of hardness kernels in order to determine the global hardness of a molecule more precisely than the commonly used approach, which is based on the numerical calculation of the second derivative of the energy with respect to the number of electrons. c) We studied the validity of the maximum hardness and minimum polarizability principles for asymmetric vibrations in aromatic systems. We found that for these systems some vibrational modes violate these principles, and we analyzed the relation of this violation to the pseudo-Jahn-Teller coupling effect. Furthermore, we postulated a set of very simple rules that make it possible to deduce whether a molecule will obey these principles without performing any prior calculation. All this information was essential to determine exactly the causes of compliance or non-compliance with the MHP and MPP. d) Finally, we performed an expansion of the energy functional in terms of the number of electrons and the normal coordinates within the canonical ensemble. By comparing this expansion with the expansion of the energy in terms of the number of electrons and the external potential, we were able to recover in a different way a set of known relations among well-known density functional reactivity descriptors, and to establish a set of new relations and new descriptors. Within the framework of molecular properties, we propose to generalize and improve the methodology for calculating the vibrational contribution (Pvib) to nonlinear optical (NLO) properties. Although Pvib has not been taken into account in most published theoretical studies of NLO properties, it has recently been shown that the Pvib of several organic polymers with large nonlinear optical properties is even larger than the electronic contribution. Therefore, taking Pvib into account is essential in the design of the new nonlinear optical materials used in computing, telecommunications and laser technology.
The main lines of this thesis on this topic are: a) We calculated for the first time the higher-order terms of Pvib for several organic polymers, with the aim of evaluating their importance and the convergence of the Taylor series that define these vibrational contributions. b) We evaluated the electronic and vibrational contributions for a series of representative organic molecules using different methodologies, in order to determine the simplest way to calculate NLO properties with semiquantitative accuracy.
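For reference, the global hardness mentioned in point (b) is conventionally defined as the curvature of the energy with respect to the number of electrons at fixed external potential; the lines below state this standard conceptual-DFT definition and the finite-difference approximation it is usually computed with (standard relations from the literature, in one common normalization, not formulas quoted from the thesis).

```latex
% Global hardness as the curvature of E(N) at fixed external potential v(r),
% and its common finite-difference approximation in terms of the
% ionization potential I and the electron affinity A.
\eta = \left( \frac{\partial^2 E}{\partial N^2} \right)_{v(\mathbf{r})}
\qquad
\eta \approx I - A = E(N-1) - 2E(N) + E(N+1)
```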
Abstract:
In all biological processes, protein molecules and other small molecules interact to function, forming transient macromolecular complexes. This interaction of two or more molecules can be described as a docking event. Docking is an important phase of structure-based drug design strategies, as it can be used as a method to simulate protein-ligand interactions. Various docking programs exist that allow automated docking, but most of them offer limited visualization and user interaction. It would be advantageous if scientists could visualize the molecules participating in the docking process, manipulate their structures and manually dock them in an immersive environment before submitting the new conformations to an automated docking process, which can help stimulate the design/docking process. This could also greatly reduce docking time and resources. To achieve this, we propose a new virtual modelling/docking program in which the advantages of virtual modelling programs and the efficiency of the algorithms in existing docking programs are merged.
Abstract:
We use the point-source method (PSM) to reconstruct a scattered field from its associated far field pattern. The reconstruction scheme is described, and numerical results are presented for three-dimensional acoustic and electromagnetic scattering problems. We give new proofs of the algorithms, based on the Green and Stratton-Chu formulae, which are more general than the earlier proofs based on the reciprocity relation. This allows us to handle the case of limited-aperture data and arbitrary incident fields. For both 3D acoustics and electromagnetics, numerical reconstructions of the field for different settings and with noisy data are shown. For shape reconstruction in acoustics, we develop an appropriate strategy to identify areas of good reconstruction quality and combine different such regions into one joint function. We then show how the shapes of unknown sound-soft scatterers are found as level curves of the total reconstructed field.
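For orientation, the core of the PSM can be sketched in two standard formulas (stated from the general literature on the method, with sign and normalization conventions omitted, not quoted from this paper): a density g_z is chosen so that the point source Φ(·, z) is approximated by a superposition of plane waves on an approximation domain, and the scattered field at z is then recovered by backprojecting the far field pattern with the same density.

```latex
% Point-source method, sketched (conventions vary between references):
% approximate the point source by a Herglotz wave function with density
% g_z, then evaluate the scattered field from the far field pattern
% u_\infty using that same density.
\Phi(y, z) \approx \int_{\mathbb{S}^2} e^{-\mathrm{i} k \,\hat{x} \cdot y}\, g_z(\hat{x})\, ds(\hat{x}),
\qquad
u^s(z) \approx \int_{\mathbb{S}^2} u_\infty(\hat{x})\, g_z(\hat{x})\, ds(\hat{x})
```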
Abstract:
We present a method to enhance fault localization for software systems based on a frequent pattern mining algorithm. Our method relies on a large set of test cases for a given set of programs in which faults can be detected. The test executions are recorded as function call trees. Based on test oracles, the tests can be classified as successful or failing. A frequent pattern mining algorithm is used to identify frequent subtrees in successful and failing test executions. This information is used to rank functions according to their likelihood of containing a fault. The ranking suggests an order in which to examine the functions during fault analysis. We validate our approach experimentally using a subset of the Siemens benchmark programs.
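A drastically simplified sketch of the ranking idea follows: instead of mining frequent subtrees, it merely counts how often each function occurs in failing versus passing executions and ranks by a suspiciousness score. The score and all names are illustrative assumptions, not the paper's algorithm.

```python
# Minimal sketch: rank functions by how much more often they appear
# in failing than in passing test executions. A stand-in for the
# paper's frequent-subtree mining; names and the score are illustrative.
from collections import Counter

def rank_functions(passing: list[set[str]], failing: list[set[str]]) -> list[str]:
    """Each execution is represented as the set of functions it called."""
    pass_counts = Counter(f for run in passing for f in run)
    fail_counts = Counter(f for run in failing for f in run)
    all_funcs = set(pass_counts) | set(fail_counts)

    def suspiciousness(func: str) -> float:
        fail_rate = fail_counts[func] / max(len(failing), 1)
        pass_rate = pass_counts[func] / max(len(passing), 1)
        total = fail_rate + pass_rate
        return fail_rate / total if total else 0.0

    return sorted(all_funcs, key=suspiciousness, reverse=True)

# Usage: 'parse' appears in every failing run but few passing runs.
passing = [{"main", "parse", "emit"}, {"main", "emit"}, {"main", "emit"}]
failing = [{"main", "parse"}, {"main", "parse", "emit"}]
print(rank_functions(passing, failing))  # 'parse' ranked first
```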
Abstract:
We describe a general likelihood-based 'mixture model' for inferring phylogenetic trees from gene-sequence or other character-state data. The model accommodates cases in which different sites in the alignment evolve in qualitatively distinct ways, but does not require prior knowledge of these patterns or partitioning of the data. We call this qualitative variability in the pattern of evolution across sites "pattern-heterogeneity" to distinguish it both from a homogeneous process of evolution and from one characterized principally by differences in rates of evolution. We present studies showing that the model correctly retrieves the signals of pattern-heterogeneity from simulated gene-sequence data, and we apply the method to protein-coding genes and to a ribosomal 12S data set. The mixture model outperforms conventional partitioning in both of these data sets. We implement the mixture model such that it can simultaneously detect rate- and pattern-heterogeneity. The model simplifies to a homogeneous model or a rate-variability model as special cases, and therefore always performs at least as well as these two approaches, and often considerably improves upon them. We make the model available within a Bayesian Markov chain Monte Carlo framework for phylogenetic inference, as an easy-to-use computer program.
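The mixture structure can be stated compactly: the likelihood of each alignment site is a weighted sum over the component processes, so no prior partitioning is needed. The lines below give the generic form of such a mixture likelihood, using illustrative symbols rather than the paper's own notation.

```latex
% Site likelihood under a K-component mixture: each site i is a weighted
% sum over component substitution processes Q_k with weights w_k,
% evaluated on the same tree T; the total log-likelihood sums over sites.
L_i = \sum_{k=1}^{K} w_k \, \Pr\!\left(D_i \mid Q_k, T\right),
\qquad
\ln L = \sum_{i=1}^{n} \ln L_i
```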
Abstract:
This paper identifies the major challenges in the area of pattern formation. The work is also motivated by the need to develop a single framework to surmount these challenges. A framework based on the control of macroscopic parameters is proposed. The issue of transformation of patterns is specifically considered. A definition of transformation and four special cases, namely elementary and geometrical transformations obtained by repositioning all or some robots in the pattern, are provided. Two feasible tools for pattern transformation are introduced: a macroscopic-parameter method and a mathematical tool, the Moebius transformation, also known as the linear fractional transformation. The realization of the unifying framework, considering planning and communication, is reported.
Abstract:
The work reported in this paper is motivated by the need to investigate general methods for pattern transformation. A formal definition of pattern transformation is provided, and four special cases, namely elementary and geometric transformations based on repositioning all or some agents in the pattern, are introduced. The need for a mathematical tool and for simulations to visualize the behavior of a transformation method is highlighted. A mathematical method based on the Moebius transformation is proposed. The transformation method involves discretization of events for planning the paths of individual robots in a pattern. Simulations on a particle physics simulator are used to validate the feasibility of the proposed method.
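Since both of the preceding abstracts rest on the Moebius (linear fractional) transformation, a small sketch may help: treating each robot's planar position as a complex number z, the map f(z) = (az + b)/(cz + d) with ad - bc ≠ 0 sends one pattern to another. This illustrates the general transformation with arbitrarily chosen coefficients; it is not the papers' planner.

```python
# Minimal sketch: transforming a robot pattern with a Moebius
# (linear fractional) transformation f(z) = (a z + b) / (c z + d),
# treating each robot's 2D position as a complex number.
# Coefficients are arbitrary illustrative choices.
import numpy as np

def moebius(z: np.ndarray, a: complex, b: complex, c: complex, d: complex) -> np.ndarray:
    if abs(a * d - b * c) < 1e-12:
        raise ValueError("degenerate transformation: ad - bc must be nonzero")
    return (a * z + b) / (c * z + d)

# A circular pattern of 8 robots, radius 2, centered at 3+0j
# (kept away from the pole at z = -d/c).
angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
pattern = 3.0 + 2.0 * np.exp(1j * angles)

# Moebius maps send circles/lines to circles/lines, so the image
# here is another circular pattern of robot positions.
new_pattern = moebius(pattern, a=1 + 1j, b=0.5, c=0.1j, d=1.0)
print(np.round(new_pattern, 3))
```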