Abstract:
Several positioning techniques have been developed to exploit the capability of GPS to provide precise coordinates in real time. A significant problem common to all of these techniques, however, is the effect of the ionosphere and of tropospheric refraction. Recent research at São Paulo State University (UNESP), Brazil, has been tackling these problems. To address ionospheric effects, a model named Mod_Ion has been developed. For tropospheric refraction, a Numerical Weather Prediction (NWP) model has been used to compute the zenith tropospheric delay (ZTD). These two models have been integrated with two positioning methods under investigation at UNESP: DGPS (Differential GPS) and network RTK (Real Time Kinematic). The in-house DGPS software has already been finalized and has provided very good results. The network RTK software is still under development, so only preliminary results from this method, using the VRS (Virtual Reference Station) concept, are presented.
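The abstract does not give the delay model itself; as a hedged illustration, the sketch below computes only the zenith hydrostatic component of the tropospheric delay with the standard Saastamoinen formula, driven by surface pressure such as an NWP field would supply. The function name and inputs are illustrative assumptions, not taken from the Mod_Ion/UNESP software.

```python
import math

def zenith_hydrostatic_delay(pressure_hpa: float, lat_deg: float, height_m: float) -> float:
    """Saastamoinen zenith hydrostatic delay in meters.

    pressure_hpa: surface pressure (hPa), e.g. interpolated from an NWP field
    lat_deg:      station latitude (degrees)
    height_m:     station height (m)
    """
    lat = math.radians(lat_deg)
    # Gravity correction term for latitude and height
    f = 1.0 - 0.00266 * math.cos(2.0 * lat) - 0.28e-6 * height_m
    return 0.0022768 * pressure_hpa / f

# Example: a station at latitude -22 deg, 600 m height, standard pressure
print(zenith_hydrostatic_delay(1013.25, -22.0, 600.0))  # roughly 2.3 m
```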
Abstract:
The purpose of this study was to evaluate the influence of intrapulpal pressure and dentin depth on bond strengths of an etch-and-rinse and a self-etching bonding agent to dentin in vitro and in vivo. Twenty-four pairs of premolars were randomly divided into four groups (n = 6) according to the dentin bonding agent, Single Bond or Clearfil SE Bond, and intrapulpal pressure, null or positive. Each tooth of the pair was further designated to be treated in vivo or in vitro. The intrapulpal pressure was controlled in vivo by the delivery of local anesthetics with or without a vasoconstrictor, while in vitro it was achieved by keeping the teeth under hydrostatic pressure. Class I cavities were prepared and the dentin bonding agents were applied, followed by incremental resin restoration. For the teeth treated in vitro, the same restorative procedures were performed after a 6-month storage period. Beams with a 1 mm² cross-sectional area were prepared and microtensile tested. Clearfil SE Bond was not influenced by any of the variables of the study, while bond strengths produced in vitro were significantly higher for Single Bond. Overall, lower bond strengths were produced in deep dentin, reaching statistical significance when Single Bond was applied under physiological or simulated intrapulpal pressure. In conclusion, in vitro bonding may overestimate the immediate adhesive performance of more technique-sensitive dentin bonding systems. The impact of intrapulpal pressure on bond strength seems to depend more on the adhesive than on the dentin morphological characteristics related to depth. (C) 2007 Wiley Periodicals, Inc.
Abstract:
Nitrous oxide (N2O) is involved in both ozone destruction and global warming. In agricultural soils it is produced by nitrification and denitrification, mainly after fertilization. Nitrification inhibitors have been proposed as one management tool for reducing the potential hazards of fertilizer-derived N2O. Adding nitrification inhibitors to fertilizers keeps soil N in ammonium form, so gaseous N losses through nitrification and denitrification are less likely to occur and N utilization by the sward increases. We present a study aimed at evaluating the effectiveness of the nitrification inhibitor dicyandiamide (DCD) and of the slurry additive Actilith F2 on N2O emissions following application of calcium ammonium nitrate or cattle slurry to a mixed clover/ryegrass sward in the Basque Country. The results indicate that large differences in N2O emission occur depending on the fertilizer type and the presence or absence of a nitrification inhibitor. There is considerable scope for an immediate reduction of emissions by applying DCD with calcium ammonium nitrate or cattle slurry. DCD, applied at 25 kg ha-1, reduced the amount of N lost as N2O by 60% and 42% when applied with cattle slurry and calcium ammonium nitrate, respectively. Actilith F2 did not reduce N2O emissions and produced a long-lasting mineralization of previously immobilized added N.
Abstract:
This thesis aims at building and discussing applications of mathematical models to Energy problems, on both the thermal and the electrical side. The objective is to show how mathematical programming techniques developed within Operational Research can give useful answers in the Energy Sector, how they can provide tools to support the decision-making processes of companies operating in energy production and distribution, and how they can be successfully used in simulations and sensitivity analyses to better understand the state of the art and the convenience of a particular technology by comparing it with the available alternatives. The first part discusses the fundamental mathematical background, followed by a comprehensive literature review on mathematical modelling in the Energy Sector. The second part presents mathematical models for District Heating strategic network design and incremental network design. The objective is the selection of an optimal set of new users to be connected to an existing thermal network, maximizing revenues, minimizing infrastructure and operational costs, and taking into account the main technical requirements of the real-world application. Results on real and randomly generated benchmark networks are discussed, with particular attention to instances characterized by large network dimensions. The third part is devoted to the development of linear programming models for optimal battery operation in off-grid solar power schemes, with consideration of battery degradation. The key contribution of this work is the inclusion of battery degradation costs in the optimisation models. As the available data relating degradation costs to the nature of charge/discharge cycles are limited, we concentrate on investigating the sensitivity of operational patterns to the degradation cost structure. The objective is to investigate the combination of battery costs and performance at which such systems become economic. We also investigate how the system design should change when battery degradation is taken into account.
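The thesis models themselves are not reproduced in the abstract; the sketch below is a minimal illustration of the kind of linear program described, assuming a toy off-grid system in which a battery shifts solar energy to hours of demand and each kWh of throughput incurs a degradation cost. All names, prices, and the simple throughput-proportional degradation term are assumptions for illustration, not the thesis formulation.

```python
import pulp

# Toy data: hourly solar availability and demand (kWh)
solar  = [0, 0, 3, 5, 5, 3, 0, 0]
demand = [2, 2, 1, 1, 1, 1, 3, 3]
T = range(len(solar))
CAPACITY, EFF = 6.0, 0.9                     # battery size (kWh), charge efficiency
UNSERVED_COST, DEGRADATION_COST = 1.0, 0.05  # $/kWh, assumed values

m = pulp.LpProblem("battery_operation", pulp.LpMinimize)
charge    = pulp.LpVariable.dicts("charge", T, lowBound=0)
discharge = pulp.LpVariable.dicts("discharge", T, lowBound=0)
soc       = pulp.LpVariable.dicts("soc", T, lowBound=0, upBound=CAPACITY)
unserved  = pulp.LpVariable.dicts("unserved", T, lowBound=0)

# Objective: penalise unmet demand plus battery wear (proportional to throughput)
m += pulp.lpSum(UNSERVED_COST * unserved[t]
                + DEGRADATION_COST * (charge[t] + discharge[t]) for t in T)

for t in T:
    # State-of-charge dynamics (battery starts empty)
    prev = soc[t - 1] if t > 0 else 0
    m += soc[t] == prev + EFF * charge[t] - discharge[t]
    # Charging only from solar; demand met by solar surplus plus discharge
    m += charge[t] <= solar[t]
    m += discharge[t] + (solar[t] - charge[t]) + unserved[t] >= demand[t]

m.solve(pulp.PULP_CBC_CMD(msg=False))
print([round(soc[t].value(), 2) for t in T])
```

Raising DEGRADATION_COST in this toy model discourages cycling, which is exactly the sensitivity the thesis investigates.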
Abstract:
OBJECTIVES AND METHODS: This study investigated the sealing ability of a currently available unfilled fissure sealant applied over sound (n=80), artificially created carious (n=80) and naturally carious fissures (n=80) under different humidity conditions (90±2% and 45±2% relative humidity) and etching times (40 and 60 s). All samples were subjected to 5000 thermal cycles and examined by light microscopy after sectioning. Microleakage, penetration ability, fissure type, fissure entrance angle, sealant occlusal length, caries location and caries depth were assessed. RESULTS: The significantly longer sealant occlusal length and larger entrance angle exhibited by shallow fissures contributed to their higher microleakage and smaller amounts of unfilled areas compared with deep fissures. Sealant microleakage was significantly influenced by the condition of the enamel (sound, artificial and natural caries) and the caries location in the fissures, but not by enamel caries depth (D1 and D2), etching time, or humidity condition. Natural caries exhibited significantly higher microleakage than sound or artificially created carious fissures. CONCLUSIONS: Based on the results of this study, it can be concluded that the location of caries in the fissure, rather than its depth, should be taken into account when applying a fissure sealant. When the borders of the fissure sealant lie on carious enamel, significantly higher microleakage must be expected. The artificial caries model was not a suitable method for assessing the behavior of natural fissure caries.
Abstract:
Acute vascular rejection represents a formidable barrier to clinical xenotransplantation, and it is known that this type of rejection can also be initiated by xenoreactive antibodies that have limited complement-activating ability. Using a sophisticated mouse model, a recent study has provided in vivo evidence for the existence of an IgG1-mediated vascular rejection, which uniquely depends on both the activation of complement and interactions with FcγRIII on natural killer (NK) cells.
Abstract:
Synthetic seismograms provide a crucial link between lithologic variations within a drill hole and reflectors on seismic profiles crossing the site. In essence, they provide ground truth for the interpretation of seismic data. Using a combination of core and logging data, we created synthetic seismograms for Ocean Drilling Program Sites 1165 and 1166, drilled during Leg 188, and Site 742, drilled during Leg 119, all in Prydz Bay, Antarctica. Results from Site 1165 suggest that coring penetrated a target reflector initially thought to represent the onset of drift sedimentation, but the lithologic change across the boundary does not show a change from predrift to drift sediments. The origin of a shallow reflector packet in the seismic line across Site 1166, and in a line connecting Sites 1166 and 742, was resolved into its constituent sources, as this reflector occurs in a region of large-scale, narrowly spaced impedance changes. Furthermore, Site 1166 was situated in a fluvio-deltaic system with widely variable geology, and bed-thickness changes were estimated between the site and both seismic lines.
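As a hedged illustration of what building such a synthetic involves (not the Leg 188/119 workflow itself), the sketch below converts a velocity/density log into reflection coefficients and convolves them with a Ricker wavelet; all values are made up.

```python
import numpy as np

def ricker(f0: float, dt: float, length: float = 0.128) -> np.ndarray:
    """Ricker (Mexican-hat) wavelet with peak frequency f0 (Hz)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f0 * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

# Made-up layered model: velocity (m/s) and density (kg/m^3) per depth sample
velocity = np.array([1600.0, 1800.0, 2100.0, 2000.0, 2400.0])
density  = np.array([1900.0, 2000.0, 2200.0, 2150.0, 2350.0])

impedance = velocity * density
# Reflection coefficient at each interface: (Z2 - Z1) / (Z2 + Z1)
rc = np.diff(impedance) / (impedance[1:] + impedance[:-1])

# Convolving the reflectivity series with the wavelet gives the synthetic trace
synthetic = np.convolve(rc, ricker(f0=30.0, dt=0.002), mode="full")
print(rc.round(3))
```

Comparing such a trace against the seismic profile at the drill site is what ties each reflector to a logged impedance change.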
Abstract:
This paper presents the detection and identification of hydrocarbons through fluoro-sensing, by developing a simple and inexpensive detector for inland waters, in contrast to current systems, which are designed for marine waters at long ranges and are extremely costly. To validate the proposed system, three test benches were assembled with various UV-light sources. The main application of this system would be to detect hydrocarbon pollution in rivers, lakes or dams, which is of growing interest to public administrations.
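The abstract does not describe the identification step in detail; one common approach, sketched below as an assumption rather than the authors' method, is to correlate a measured fluorescence spectrum against reference signatures of known hydrocarbons and report the best match.

```python
import numpy as np

def identify(spectrum: np.ndarray, references: dict) -> str:
    """Return the reference whose normalized shape best correlates with the measurement."""
    s = (spectrum - spectrum.mean()) / spectrum.std()
    best, best_r = None, -np.inf
    for name, ref in references.items():
        r = np.corrcoef(s, (ref - ref.mean()) / ref.std())[0, 1]
        if r > best_r:
            best, best_r = name, r
    return best

# Made-up emission signatures on a common wavelength grid (nm)
grid = np.linspace(300, 600, 64)
refs = {
    "diesel":    np.exp(-((grid - 440) / 40) ** 2),
    "crude oil": np.exp(-((grid - 500) / 60) ** 2),
}
measured = np.exp(-((grid - 445) / 42) ** 2) + 0.05 * np.random.rand(64)
print(identify(measured, refs))  # expected: "diesel"
```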
Abstract:
This project is based on the technologies used in object detection and recognition, especially of leaves and chromosomes. Accordingly, this document contains the typical parts of a scientific paper, since that is what it is: an Abstract, an Introduction, sections covering the area under investigation, future work, conclusions, and the references used in its preparation. The Abstract summarizes what this paper covers, namely the technologies employed in pattern detection and recognition for leaves and chromosomes, and the existing work on cataloguing these objects. The Introduction explains the meanings of detection and recognition. This is necessary because many papers confuse the two terms, especially those dealing with chromosomes. Detecting an object means gathering the parts of the image that are useful and discarding the useless parts; in short, detection amounts to recognizing the object's borders. Recognition, by contrast, refers to the process by which the computer or machine determines what kind of object it is handling. Next comes a survey of the technologies most used in object detection in general. There are two main groups in this category: those based on image derivatives and those based on ASIFT points. The methods based on image derivatives have in common that the image is processed by convolving it with a previously created matrix. This is done to detect borders in the image, which are changes in pixel intensity. Within these technologies there are two groups, contrasted in the sketch after this paragraph: gradient-based methods, which search for maxima and minima in pixel intensity since they use only the first derivative, and Laplacian-based methods, which search for zero crossings since they use the second derivative. The choice depends on the level of detail required in the final result: gradient-based methods consume fewer resources and less time, since they involve fewer operations, but the quality is worse; Laplacian-based methods need more time and resources, since they require more operations, but they yield a much better quality result. After explaining all the derivative-based methods, the algorithms available for both groups are reviewed. The other large group of technologies for object recognition is based on ASIFT points, which rely on six image parameters and compare images taking those parameters into consideration. The disadvantage of these methods, for our future purposes, is that they are only valid for a single specific object: if we were to recognize two different leaves, even of the same species, this method would not recognize both. It is still important to mention these technologies, since recognition methods are being discussed in general. The chapter ends with a comparison of the pros and cons of all the technologies employed, first separately and then all together, in light of our purposes. The next chapter, on recognition techniques, is not very extensive because, even though there are general steps for object recognition, each object to be recognized is different and requires its own method, so no general method can be specified in that chapter.
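As a hedged sketch of the two kernel families the abstract contrasts (not code from the project itself), the example below convolves an image with a first-derivative Sobel kernel and a second-derivative Laplacian kernel; edges appear as intensity extrema in the former and as zero crossings in the latter.

```python
import numpy as np
from scipy.signal import convolve2d

# First-derivative (gradient) kernel: Sobel, responds to vertical edges
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

# Second-derivative kernel: discrete Laplacian
laplacian = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

# Toy image: dark left half, bright right half -> one vertical edge
image = np.zeros((8, 8))
image[:, 4:] = 1.0

grad = convolve2d(image, sobel_x, mode="same", boundary="symm")
lap  = convolve2d(image, laplacian, mode="same", boundary="symm")

# The gradient response peaks at the edge; the Laplacian changes sign across it
print(grad[4])
print(lap[4])
```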
We then move on to computer-based leaf detection techniques, using the derivative-based technique explained above. The next step is to turn the leaf into several parameters. Depending on the document consulted, there are more or fewer parameters. Some papers recommend dividing the leaf into 3 main features (shape, dentation and vein) and, by performing mathematical operations on them, deriving up to 16 secondary features. Another proposal divides the leaf into 5 main features (diameter, physiological length, physiological width, area and perimeter) and extracts 12 secondary features from those. This second alternative is the most widely used, so it is taken as the reference (a sketch of these measurements appears after this abstract). Moving on to leaf recognition, we rely on a paper that provides source code which, after the user clicks on both ends of the leaf, automatically reports the species to which the leaf belongs. To do so, it only requires a database. In the tests reported in that document, the authors claim 90.312% accuracy over 320 total tests (32 plants in the database and 10 tests per species). The next chapter deals with chromosome detection, where the metaphase plate, in which the chromosomes are disorganized, must be converted into the karyotype, the usual view of the 23 chromosomes ordered by number. There are two types of techniques for this step: the skeletonization process and angle sweeping. The skeletonization process consists of suppressing the interior pixels of the chromosome so that only the silhouette remains. This method is very similar to those based on image derivatives, with the difference that it detects not the borders but the interior of the chromosome. The second technique consists of sweeping angles from the beginning of the chromosome and, taking into consideration that a single chromosome cannot bend by more than some angle X, detecting the various regions of the chromosome. Once the karyotype is defined, we continue with chromosome recognition. For this there is a technique based on the banding pattern (grey-scale bands) that makes each chromosome unique. The program detects the longitudinal axis of the chromosome and reconstructs the band profiles, after which the computer is able to recognize the chromosome. Concerning future work, we currently have two independent sets of techniques that do not combine detection and recognition, so our main focus would be to prepare a program that brings both together. On the leaf side we have seen that detection and recognition are linked, since both share the option of dividing the leaf into 5 main features. The work to be done is to create an algorithm linking both methods, since in the program that recognizes leaves both leaf ends must be clicked, so it is not an automatic algorithm. On the chromosome side, we should create an algorithm that searches for the beginning of the chromosome and then starts to sweep angles, later passing the parameters to the program that searches for the band profiles. Finally, the summary explains why this type of research is needed: with global warming, many species (animals and plants) are beginning to go extinct, which is why a large database gathering all possible species is needed. For recognizing an animal species, we only need its 23 chromosomes.
When recognizing a plant there are several ways of doing it, but the easiest way to input it into a computer is to scan a leaf of the plant.
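As a hedged illustration of the 5-main-feature description (diameter, physiological length, physiological width, area, perimeter), the sketch below measures comparable quantities from a binary leaf mask with scikit-image; it is a plausible reconstruction, not the code of the paper being summarized, and the "diameter" here is the circle-equivalent diameter rather than whatever definition that paper uses.

```python
import numpy as np
from skimage.measure import label, regionprops

def leaf_features(mask: np.ndarray) -> dict:
    """Measure the 5 main features from a binary leaf mask (True = leaf)."""
    region = max(regionprops(label(mask)), key=lambda r: r.area)
    return {
        "area": region.area,                              # pixel count
        "perimeter": region.perimeter,                    # boundary length
        "physiological_length": region.major_axis_length, # along main axis
        "physiological_width": region.minor_axis_length,  # across main axis
        "diameter": region.equivalent_diameter,           # circle-equivalent
    }

# Toy elliptical "leaf" mask
yy, xx = np.mgrid[0:64, 0:64]
mask = ((yy - 32) / 28) ** 2 + ((xx - 32) / 12) ** 2 <= 1.0
print(leaf_features(mask))
```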
Abstract:
The purpose of this research was to apply direct-ablation plasma spectroscopic techniques, including spark-induced breakdown spectroscopy (SIBS) and laser-induced breakdown spectroscopy (LIBS), to a variety of environmental matrices, addressing two different analytical problems. SIBS instrumentation was adapted in order to develop a fieldable monitor for the measurement of carbon in soil. SIBS spectra of several soils were collected in the 200 nm to 400 nm region, and the neutral carbon line (247.85 nm) was compared to the total carbon concentration determined by standard dry combustion analysis. Additionally, Fe and Si were evaluated in a multivariate model in order to determine their impact on the model's predictive power for total carbon concentrations. The results indicate that SIBS is a viable method to quantify total carbon levels in soils, with a good correlation obtained between measured and predicted carbon. These results indicate that multivariate analysis can be used to construct a calibration model for SIBS soil spectra, and that SIBS is a promising method for the determination of total soil carbon. SIBS was also applied to the study of biological-warfare-agent simulants. Independently determined elemental compositions of bioaerosol samples were compared to the SIBS atomic (Ca, Al, Fe and Si) and molecular (CN, N2 and OH) emission signals. The results indicate a linear relationship between the temporally integrated emission strength and the concentration of the associated element. Finally, LIBS signals of hematite were analyzed under low pressures of pure CO2 and compared with signals acquired in a mixture of CO2, N2 and Ar representative of the Martian atmosphere. This research was in response to the potential use of LIBS instrumentation on the Martian surface and to the challenges associated with these measurements. Changes in Ca, Fe and Al lineshapes observed in the LIBS spectra at different gas compositions and pressures were studied. The size of the plasma formed on the hematite was observed to change non-linearly as a function of decreasing pressure in both a CO2 atmosphere and a simulated Martian atmosphere.
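The abstract mentions a multivariate calibration relating emission intensities to total carbon; as a hedged sketch with made-up numbers (not the study's data or exact model), the example below fits an ordinary least-squares model from C, Fe and Si line intensities to carbon concentration and reports the goodness of fit.

```python
import numpy as np

# Made-up training data: rows = soil samples,
# columns = emission intensities of the C (247.85 nm), Fe, and Si lines
X = np.array([
    [0.8, 0.30, 0.5],
    [1.6, 0.25, 0.6],
    [2.1, 0.40, 0.4],
    [3.0, 0.35, 0.7],
    [3.9, 0.20, 0.5],
])
y = np.array([0.9, 1.8, 2.4, 3.3, 4.1])  # total carbon (%, dry combustion)

# Ordinary least squares with an intercept column
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

predicted = A @ coef
r2 = 1 - ((y - predicted) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(coef.round(3), round(r2, 3))
```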