856 results for backward mapping
Abstract:
The thesis work was carried out at the System Ceramics division of System Group S.p.A. in Fiorano Modenese (MO), which develops solutions for the ceramics industry, including tile decoration. In ceramic plants, pieces are typically moved on a conveyor belt and may shift slightly during transport. If a piece is not aligned with the printer before the decoration stage, the print ends up misaligned and some areas along the edges of the piece may remain unprinted. It is therefore essential to correct the misalignment before decorating. The most common solution is to install guides at the entrance of the decoration system. Besides offering limited precision, this solution proves unsuitable when the decoration is applied in successive stages by different printers. The research and development department of System Ceramics therefore devised a different, innovative solution based on the inverse approach: aligning the graphic to each piece in software, according to its position, instead of physically intervening to correct that position. The new printing process based on software alignment of the graphic first determines the position of each tile using a computer vision system placed on the belt upstream of the printer. The graphic is then processed according to the position of the piece and applied once the piece reaches the printing area. The thesis work focused on the graphic rotation stage and consisted of studying and optimizing the existing application prototype in order to reduce its execution time. Although functional, the prototype has an execution time so high that it is incompatible with the production speeds adopted by ceramic plants.
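As a rough illustration of the software-alignment idea described above (this is not the System Ceramics prototype; the OpenCV-based function and all parameter names are hypothetical), the decoration graphic can be rotated and shifted by an affine warp driven by the tile pose measured by the vision system:

import cv2

def align_graphic_to_tile(graphic, angle_deg, offset_xy):
    # Rotate the graphic about its centre by the measured tile angle and
    # add the measured translation of the tile on the belt (illustrative).
    h, w = graphic.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)
    m[0, 2] += offset_xy[0]
    m[1, 2] += offset_xy[1]
    return cv2.warpAffine(graphic, m, (w, h), flags=cv2.INTER_LINEAR)

# Example: the vision system reports a 1.5 degree skew and a 4 px lateral shift.
# aligned = align_graphic_to_tile(graphic, angle_deg=1.5, offset_xy=(4, 0))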
Abstract:
In December 1980, following increasing congressional and constituent interest in problems associated with hazardous waste, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) was passed. During its development, the legislative initiative was seriously compromised, resulting in a less exhaustive approach than had originally been sought. Still, CERCLA (Superfund), which established, among other things, the authority to clean up abandoned waste dumps and to respond to emergencies caused by releases of hazardous substances, was welcomed by many as an important initial law critical to the cleanup of the nation's hazardous waste. Expectations raised by passage of this bill went tragically unmet. By the end of four years, only six sites had been declared clean by the EPA, and even those determinations were seemingly liberal: two of the six sites were subsequently identified as requiring further cleanup. This analysis focuses on the implementation failure of Superfund. In light of that focus, the discussion develops linkages between flaws in the legislative language and the foreclosure of chances for implementation success. These linkages are specified through examination of the legislative initiative, identification of its flaws, and characterization of the attendant deficits in implementation ability. Subsequent analysis addresses how such legislative frailties might have been avoided, and the attendant regulatory weaknesses that have contributed to implementation failure. Each of these analyses is accomplished through an expanded application of the backward mapping analytic technique presented by Elmore. Results and recommendations follow. Consideration is also devoted to a variety of regulatory issues, as well as to those pertinent to legislative and implementation analysis. Problems in assessing legal liability associated with hazardous waste management are presented, as is a detailed review of the legislative development of Superfund and its initial implementation by Gorsuch's EPA.
Abstract:
The evolution of smartphones, equipped with digital cameras, is driving a growing demand for ever more complex applications that need to rely on real-time computer vision algorithms. However, video signals are only increasing in size, whereas the performance of single-core processors has stagnated in recent years. Consequently, new computer vision algorithms need to be parallel, so that they can run on multiple processors, and computationally scalable. One of the most promising classes of processors nowadays can be found in graphics processing units (GPUs): devices offering a high degree of parallelism, excellent numerical performance and increasing versatility, which makes them attractive for scientific computing. In this thesis, we explore two computer vision applications whose high computational complexity precludes them from running in real time on traditional uniprocessors. We show that by parallelizing the subtasks and implementing them on a GPU, both applications attain their goal of running at interactive frame rates. In addition, we propose a technique for the fast evaluation of arbitrarily complex functions, specially designed for GPU implementation. First, we explore the application of depth-image-based rendering techniques to the unusual configuration of two convergent, wide-baseline cameras with colour and depth information, in contrast to the narrow-baseline, parallel configuration usual in 3D TV. Using a backward mapping approach with a depth inpainting scheme based on median filters, we show that these techniques are adequate for free viewpoint video applications. We also show that referring depth information to a global reference system is ill-advised and should be avoided. Next, we propose a background subtraction system based on kernel density estimation techniques. These techniques are well suited to modelling complex scenes with multimodal backgrounds, but have seen little use because of their high computational and memory cost. The proposed system, implemented in real time on a GPU, features dynamic kernel bandwidth estimation for the background model, selective update of the background model, update of the positions of the reference samples of the foreground model using a multi-region particle filter, and automatic selection of regions of interest to reduce the computational cost. The results, evaluated on several databases and compared with other state-of-the-art algorithms, demonstrate the high quality and versatility of the proposal. Finally, we propose a general method for approximating arbitrarily complex functions with continuous piecewise linear functions, specially formulated for GPU implementation by leveraging the texture filtering units, which are normally unused for numerical computation. The proposal includes a rigorous mathematical analysis of the approximation error as a function of the number of samples, as well as a method to obtain a near-optimal partition of the domain of the function that minimizes this error.
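The final contribution above lends itself to a small CPU sketch (NumPy; all names are illustrative, not taken from the thesis): the function is sampled at the knots of a partition and then evaluated by linear interpolation, which is exactly the operation that a GPU texture unit with linear filtering performs in hardware.

import numpy as np

def build_lut(f, x_min, x_max, n_samples):
    # Sample f on a uniform partition; on a GPU these samples would be
    # stored in a 1D texture.
    xs = np.linspace(x_min, x_max, n_samples)
    return xs, f(xs)

def eval_piecewise_linear(xs, ys, x):
    # Continuous piecewise linear evaluation, i.e. what a linearly
    # filtered texture fetch returns for a 1D texture.
    return np.interp(x, xs, ys)

# Example: approximate exp(-x**2) with 64 samples on [-4, 4].
xs, ys = build_lut(lambda x: np.exp(-x ** 2), -4.0, 4.0, 64)
value = eval_piecewise_linear(xs, ys, 0.3)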
Abstract:
The aim of this work is to propose a new method for estimating the backward flow directly from the optical flow. We assume that the optical flow has already been computed and we need to estimate the inverse mapping. This mapping is not bijective due to the presence of occlusions and disocclusions, so it is not possible to estimate the inverse function over the whole domain; values in these regions have to be guessed from the available information. We propose an accurate algorithm to calculate the backward flow solely from the optical flow, using a simple relation. Occlusions are filled by selecting the maximum motion, and disocclusions are filled with two different strategies: a min-fill strategy, which fills each disoccluded region with the minimum value around the region, and a restricted min-fill approach, which selects the minimum value in a close neighbourhood. In the experimental results, we show the accuracy of the method and compare the results obtained with these two strategies.
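A minimal sketch of the kind of computation involved (an illustrative NumPy implementation, not the authors' code; the restricted min-fill variant is approximated here by a 3x3 neighbourhood): each pixel is pushed along its forward flow and the negated motion is stored at the target; collisions keep the largest motion (occlusions), and pixels that are never reached (disocclusions) are filled afterwards.

import numpy as np

def backward_flow(u, v):
    # Push each pixel (x, y) to (x + u, y + v) and store the negated motion
    # there; on collisions keep the maximum motion. NaNs mark disocclusions.
    h, w = u.shape
    bu = np.full((h, w), np.nan)
    bv = np.full((h, w), np.nan)
    best = np.full((h, w), -1.0)
    for y in range(h):
        for x in range(w):
            tx = int(round(x + u[y, x]))
            ty = int(round(y + v[y, x]))
            if 0 <= tx < w and 0 <= ty < h:
                m = np.hypot(u[y, x], v[y, x])
                if m > best[ty, tx]:
                    best[ty, tx] = m
                    bu[ty, tx] = -u[y, x]
                    bv[ty, tx] = -v[y, x]
    return bu, bv

def min_fill(b):
    # Replace every disoccluded (NaN) pixel with the minimum valid value in
    # its 3x3 neighbourhood, iterating until nothing more can be filled.
    b = b.copy()
    while np.isnan(b).any():
        filled = False
        for y, x in zip(*np.where(np.isnan(b))):
            win = b[max(0, y - 1):y + 2, max(0, x - 1):x + 2]
            vals = win[~np.isnan(win)]
            if vals.size:
                b[y, x] = vals.min()
                filled = True
        if not filled:
            break
    return b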
Abstract:
Dulce de leche samples available in the Brazilian market were submitted to sensory profiling by quantitative descriptive analysis and an acceptance test, as well as sensory evaluation using the just-about-right scale and purchase intent. External preference mapping and the ideal sensory characteristics of dulce de leche were determined. The results were also evaluated by principal component analysis, hierarchical cluster analysis, partial least squares regression, artificial neural networks, and logistic regression. Overall, significant product acceptance was related to intermediate scores of the sensory attributes in the descriptive test, and this trend was observed even after consumer segmentation. The results obtained by sensometric techniques showed that optimizing an ideal dulce de leche from a sensory standpoint is a multidimensional process, requiring adjustments to the appearance, aroma, taste, and texture attributes of the product for better consumer acceptance and purchase intent. The optimum dulce de leche was characterized by high scores for the attributes sweet taste, caramel taste, brightness, color, and caramel aroma, in accordance with the preference mapping findings. In industrial terms, this means changing the parameters used in the thermal treatment and making quantitative changes to the ingredients used in the formulations.
Abstract:
The evolution and population dynamics of avian coronaviruses (AvCoVs) remain underexplored. In the present study, in-depth phylogenetic and Bayesian phylogeographic analyses were conducted to investigate the evolutionary dynamics of AvCoVs detected in wild and synanthropic birds. A total of 500 samples, including tracheal and cloacal swabs collected from 312 wild birds belonging to 42 species, were analysed using molecular assays. Sixty-five samples (13%) from 22 bird species were positive for AvCoV. Molecular evolution analyses revealed that the sequences from samples collected in Brazil did not cluster with any of the AvCoV S1 gene sequences deposited in the GenBank database. Bayesian framework analysis estimated an AvCoV strain from Sweden (1999) as the most recent common ancestor of the AvCoVs detected in this study. Furthermore, the analysis inferred a demographic expansion of the AvCoV population across different wild and synanthropic bird species, suggesting that these birds may be potential new hosts responsible for spreading the virus.
Abstract:
Mapping of elements in biological tissue by laser-induced mass spectrometry is a fast-growing analytical methodology in the life sciences. This method provides a wealth of useful information on metal, nonmetal, metalloid and isotopic distributions at major, minor and trace concentration ranges, usually with a lateral resolution of 12-160 µm. Selected applications in medical research require an improved lateral resolution of laser-induced mass spectrometric techniques, at the low micrometre scale and below. The present work demonstrates the applicability of a recently developed analytical methodology - laser microdissection coupled to inductively coupled plasma mass spectrometry (LMD ICP-MS) - for obtaining elemental images of different solid biological samples at high lateral resolution. LMD ICP-MS images of mouse brain tissue samples, both uranium-stained and native, are shown, and a direct comparison of the LMD and laser ablation (LA) ICP-MS imaging methodologies, in terms of elemental quantification, is performed.
Abstract:
QTL mapping provides useful information for breeding programs, since it allows the estimation of the genomic locations and genetic effects of chromosomal regions related to the expression of quantitative traits. The objective of this study was to map QTL related to several agronomically important traits associated with grain yield: ear weight (EW), prolificacy (PROL), ear number (NE), ear length (EL) and diameter (ED), number of rows on the ear (NRE), and number of kernels per row on the ear (NKPR). Four hundred F2:3 tropical maize progenies were evaluated in five environments in Piracicaba, Sao Paulo, Brazil. The genetic map had been estimated previously and comprised 117 microsatellite loci with an average distance of 14 cM. Data were analysed using composite interval mapping for each trait. Thirty-six QTL were mapped and related to the expression of EW (2), PROL (3), NE (2), EL (5), ED (5), NRE (10), and NKPR (5). Relatively few QTL were mapped because of the high genotype × environment interaction. The traits EW, PROL and NE showed high genetic correlation with grain yield, and several QTL mapped to similar genomic regions, which could cause the observed correlation. However, further analyses using appropriate statistical models are required to separate linked from pleiotropic QTL. Five QTL (named Ew1, Ne1, Ed3, Nre3 and Nre10) had high genetic effects, explaining from 10.8% (Nre3) to 16.9% (Nre10) of the phenotypic variance, and could be considered in further studies.
Abstract:
The identification of alternatively spliced transcripts has contributed to a better comprehension of developmental mechanisms, tissue-specific physiological processes and human diseases. Polymerase chain reaction amplification of alternatively spliced variants commonly leads to the formation of heteroduplexes as a result of base pairing involving the exons common to the two variants. S1 nuclease cleaves single-stranded loops of heteroduplexes and also nicks the opposite DNA strand. In order to establish a strategy for mapping alternative splice-prone sites across the whole transcriptome, we developed a method combining the formation of heteroduplexes between two distinct splicing variants and S1 nuclease digestion. Of the 20 consensuses identified here using this methodology, 5 revealed a conserved splice site upon inspection of the cDNA alignment against the human genome (exact splice sites). For 8 other consensuses, conserved splice sites were mapped 2 to 30 bp from the border (proximal splice sites); for the remaining 7 consensuses, conserved splice sites were mapped 40 to 800 bp away (distal splice sites). These latter cases reflect a nonspecific activity of S1 nuclease in digesting double-stranded DNA. Of the 20 consensuses identified here, 5 were selected for reverse transcription-polymerase chain reaction validation, which confirmed the splice sites. These data show the potential of the strategy for mapping splice sites. However, the lack of specificity of the S1 nuclease enzyme is a significant obstacle that impedes the use of this strategy in large-scale studies.
Abstract:
Forward-backward multiplicity correlation strengths have been measured with the STAR detector for Au + Au and p + p collisions at √(s_NN) = 200 GeV. Strong short- and long-range correlations (LRC) are seen in central Au + Au collisions. The magnitude of these correlations decreases with decreasing centrality, until only short-range correlations are observed in peripheral Au + Au collisions. Both the dual parton model (DPM) and the color glass condensate (CGC) predict the existence of long-range correlations. In the DPM, the fluctuation in the number of elementary (parton) inelastic collisions produces the LRC. In the CGC, longitudinal color flux tubes generate the LRC. The data are in qualitative agreement with the predictions of the DPM and indicate the presence of multiple parton interactions.
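For context, the forward-backward correlation strength referred to above is commonly quantified by the slope b of a linear fit of the mean backward multiplicity against the forward multiplicity (a standard definition, not quoted from this paper):

\[
\langle N_b \rangle(N_f) = a + b\,N_f,
\qquad
b = \frac{\langle N_f N_b \rangle - \langle N_f \rangle \langle N_b \rangle}{\langle N_f^2 \rangle - \langle N_f \rangle^2},
\]

so that b measures how strongly the multiplicities in the forward and backward hemispheres fluctuate together.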
Abstract:
A combined analytical and numerical study is performed of the mapping between strongly interacting fermions and weakly interacting spins, in the framework of the Hubbard, t-J, and Heisenberg models. While for spatially homogeneous models in the thermodynamic limit the mapping is thoroughly understood, we here focus on aspects that become relevant in spatially inhomogeneous situations, such as the effect of boundaries, impurities, superlattices, and interfaces. We consider parameter regimes that are relevant for traditional applications of these models, such as electrons in cuprates and manganites, and for more recent applications to atoms in optical lattices. The rate of the mapping as a function of the interaction strength is determined from the Bethe ansatz for infinite systems and from numerical diagonalization for finite systems. We show analytically that, if translational symmetry is broken through the presence of impurities, the mapping persists and is, in a certain sense, as local as possible, provided the spin-spin interaction between two sites of the Heisenberg model is calculated from the harmonic mean of the on-site Coulomb interaction on adjacent sites of the Hubbard model. Numerical calculations corroborate these findings also for interfaces and superlattices, where analytical calculations are more complicated.
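For orientation, the homogeneous strong-coupling result behind this mapping is standard, while the inhomogeneous form below is only a plausible reading of the harmonic-mean prescription described above, stated here as an assumption rather than quoted from the paper:

\[
J = \frac{4t^2}{U} \quad (U \gg t),
\qquad
J_{ij} = \frac{4t^2}{U^{H}_{ij}},
\qquad
U^{H}_{ij} = \frac{2\,U_i U_j}{U_i + U_j},
\]

where t is the hopping amplitude and \(U^{H}_{ij}\) is the harmonic mean of the on-site Coulomb interactions on the adjacent sites i and j.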
Abstract:
This paper reports the use of a non-destructive, continuous magnetic Barkhausen noise (CMBN) technique to investigate the size and thickness of volumetric defects in a 1070 steel. The magnetic behavior of the probe used was analyzed by numerical simulation, using the finite element method (FEM). The results indicated that the presence of a ferrite coil core in the probe favors MBN emissions. The samples were scanned at different speeds and with different probe configurations to determine the effect of the flaw on the CMBN signal amplitude. A smoothing moving window, based on a second-order statistical moment, was used to analyze the time signal. The results show the technique's good repeatability and high capacity for detecting this type of defect.
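A minimal sketch of a moving window based on the second-order statistical moment, as applied to a CMBN time signal (an illustrative NumPy implementation; the window length and variable names are arbitrary, not taken from the paper):

import numpy as np

def moving_second_moment(signal, window):
    # Slide a window over the signal and compute the second central moment
    # (variance) at each position; peaks in this envelope indicate regions
    # where the CMBN amplitude is perturbed, e.g. by a defect.
    pad = window // 2
    padded = np.pad(signal, pad, mode="edge")
    out = np.empty(signal.size, dtype=float)
    for i in range(signal.size):
        seg = padded[i:i + window]
        out[i] = np.mean((seg - seg.mean()) ** 2)
    return out

# Example: envelope = moving_second_moment(cmbn_signal, window=256)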
Abstract:
Competition among companies depends on how quickly and efficiently they can create and commercialize knowledge. In this context, collaboration emerges as a reaction to environmental change. Although strategic alliances and networks have been explored in the strategy literature for decades, the complexity and continued use of these cooperation structures in a world of growing competition justify the continuing interest in both themes. This article presents a survey of contemporary academic production on strategic alliances and networks, covering the period from January 1997 to August 2007, based on the top five journals ranked by the 2006 Journal Citation Report in both the business and management categories. The results point to a retraction in publications about strategic alliances and significant growth in the area of strategic networks. The joint view of strategic alliances and networks, cited by some authors as the evolutionary path of the field, did not yet appear salient. The most cited topics in the alliance literature are governance structure, cooperation, knowledge transfer, culture, control, trust, alliance formation, previous experience, resources, competition and partner selection. The network theme focuses mainly on structure, knowledge transfer and social networks, while the joint view is highly concentrated on the subjects of alliance formation and governance choice.
Abstract:
Phaeosphaeria leaf spot (PLS) is an important disease in tropical and subtropical maize (Zea mays L.) growing areas, but there is limited information on its inheritance. This research was therefore conducted to study the inheritance of PLS disease in tropical maize using QTL mapping and to assess the feasibility of marker-assisted selection aimed at developing genotypes resistant to this disease. The highly susceptible inbred line L14-04B and the highly resistant inbred line L08-05F were crossed to develop an F2 population. Two hundred fifty-six F2 plants were genotyped with 143 microsatellite markers, and their F2:3 progenies were evaluated in seven environments. Ten plants per plot were evaluated 30 days after silk emergence using a rating scale, and the plot means were used for the analyses. The heritability coefficient on a progeny-mean basis was high (91.37%), and six QTL were mapped: one each on chromosomes 1, 3, 4, and 6, and two on chromosome 8. The gene action of the QTL ranged from additive to partial dominance, and the average level of dominance was partial dominance; a dominance × dominance epistatic effect was also detected between the QTL mapped on chromosome 8. The phenotypic variance explained by each QTL ranged from 2.91 to 11.86%, and the joint QTL effects explained 41.62% of the phenotypic variance. For all mapped QTL, the alleles conditioning resistance to PLS disease came from the resistant parental inbred L08-05F. Thus, these alleles could be transferred to other elite maize inbreds by marker-assisted backcross selection to develop hybrids resistant to PLS disease.
Abstract:
Despite its importance to agriculture, the genetic basis of heterosis is still not well understood. The main competing hypotheses include dominance, overdominance, and epistasis. NC design III is an experimental design that has been used for estimating the average degree of dominance of quantitative trait loci (QTL) and also for studying heterosis. In this study, we first develop a multiple-interval mapping (MIM) model for design III that provides a platform to estimate the number, genomic positions, augmented additive and dominance effects, and epistatic interactions of QTL. The model can be used for parents of any generation of selfing. We apply the method to two data sets, one for maize and one for rice. Our results show that heterosis in maize is mainly due to dominant gene action, although overdominance of individual QTL could not be completely ruled out owing to the mapping resolution and the limitations of NC design III. For rice, the estimated QTL dominance effects could not explain the observed heterosis. There is evidence that additive × additive epistatic effects of QTL could be the main cause of heterosis in rice. The difference in the genetic basis of heterosis seems to be related to the open or self pollination of the two species. The MIM model for NC design III is implemented in Windows QTL Cartographer, a freely distributed software package.