873 results for Coarse-to-fine processing
Abstract:
Currently, remote sensors and high-performance computers are the main instruments for collecting and producing oceanographic data. With these data, it is possible to carry out studies that simulate and predict ocean behavior through regional numerical models. Among the important factors in oceanographic studies are those related to environmental impacts, anthropogenic contamination, the use of renewable energy, and port operations. However, due to the large volume of data generated by environmental institutions, in the form of results from global models such as HYCOM (Hybrid Coordinate Ocean Model) and from the Reanalysis programs of NOAA (National Oceanic and Atmospheric Administration), it becomes necessary to create computational routines to prepare initial and boundary conditions so that they can be applied to regional models such as TELEMAC3D (www.opentelemac.org). Problems related to low resolution, missing data, and the need for interpolation onto different meshes or vertical coordinate systems require a computational mechanism that performs this processing properly. To this end, routines were developed in the Python programming language, employing nearest-neighbor interpolators, so that initial and boundary conditions for a test numerical simulation were prepared from raw data of the HYCOM model and the NOAA Reanalysis program. These results were compared against another numerical run whose conditions were built with a more sophisticated interpolation method, written in another language, which was already in use at the laboratory. The analysis of the results showed that the routine developed in this work performs adequately for generating initial and boundary conditions for the TELEMAC3D model. Nevertheless, a more sophisticated interpolator should be developed in order to improve interpolation quality, optimize computational cost, and produce more realistic conditions for use with the TELEMAC3D model.
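The nearest-neighbor preparation step described in this abstract can be sketched with SciPy. Everything below is an illustrative assumption (synthetic grid, fake temperature field, random target mesh), not the thesis's actual routines or file formats:

```python
import numpy as np
from scipy.interpolate import NearestNDInterpolator

# Hypothetical coarse global-model grid (HYCOM-like), with gaps marked as NaN
src_lon, src_lat = np.meshgrid(np.linspace(-54, -48, 25), np.linspace(-34, -28, 25))
src_temp = 20 + 2 * np.sin(src_lon) * np.cos(src_lat)   # fake temperature field
src_temp[::7, ::5] = np.nan                             # simulate missing data

# Keep only valid points, so gaps are filled from the nearest valid neighbor
valid = ~np.isnan(src_temp)
interp = NearestNDInterpolator(
    np.column_stack([src_lon[valid], src_lat[valid]]), src_temp[valid]
)

# Unstructured target points, standing in for a TELEMAC3D regional mesh
rng = np.random.default_rng(0)
tgt_lon = rng.uniform(-53, -49, 500)
tgt_lat = rng.uniform(-33, -29, 500)
tgt_temp = interp(tgt_lon, tgt_lat)   # one value per mesh node, no NaN left
```

The same call pattern extends to boundary nodes and to each vertical level in turn; the appeal of the nearest-neighbor approach, as the abstract notes, is robustness to missing data rather than accuracy.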
Abstract:
As time has passed, the general-purpose programming paradigm has evolved, producing different hardware architectures whose characteristics differ widely. In this work we demonstrate, through different applications belonging to the field of image processing, the existing differences between three Nvidia hardware platforms: two of them belong to the GeForce graphics card series, the GTX 480 and the GTX 980, and one is a low-consumption platform whose purpose is to allow the execution of embedded applications while providing extreme efficiency: the Jetson TK1. As test applications we use five examples from the Nvidia CUDA Samples. These applications are directly related to image processing, as the algorithms they use are similar to those from the field of medical image registration. The tests prove that the GTX 980 is both the device with the highest computational power and the one with the greatest consumption, that the Jetson TK1 is the most efficient platform, and that the GTX 480 produces more heat than the others; they also reveal other effects produced by the architectural differences between the devices.
Abstract:
Resuspension of the top few sediment layers of tidal mud flats is known to enhance planktonic biomass of microbiota (benthic diatoms and bacteria). This process is mainly controlled by tidal shear stress and cohesiveness of mud, and is also influenced by bioturbation activities. Laboratory experiments in a race track flume were performed to test the interactive effects of these factors on both the critical entrainment and resuspension kinetics of microbiota from silt-clay sediments from the Marennes-Oleron Bay, France. The marine snail Hydrobia ulvae was used to mimic surface bioturbation activities. As expected, the kinetics of microbial resuspension versus shear stress were largely controlled by the cohesiveness of silt-clay sediments. However, our results indicate that the effect of surface tracking by H. ulvae on microbial resuspension was clearly dependent on the interaction between sediment cohesiveness and shear velocity. Evidence was also found that microphytobenthos and bacteria are not simultaneously resuspended from silt-clay bioturbated sediments. This supports the theory that diatoms within the easily eroded mucus matrix behave actively and bacteria adhering to fine silt particles eroded at higher critical shear velocities behave passively.
Abstract:
The Office of the State Auditor audited the financial statements of the Town of Cheraw Municipal Court using agreed-upon procedures. The audit covered the following topics: violations of state laws, adherence to fine guidelines, and an opinion on the supplementary schedule. A response from the Town of Cheraw is included.
Abstract:
Bimetallic alloys are increasingly used in heterogeneous catalysis. This interest is explained by the emergence of new features that are absent in the parent single metals: synergistic effects between the two combined elements create a more efficient catalyst. One of the most challenging aspects of multicomponent materials in catalysis is the ability to fine-tune the catalytic properties of an alloy by controlling the nature and composition of the surface [1]. For example, the gold/silver alloy combines high activity and large selectivity for a broad range of oxidation reactions. It is well established that the surface composition of alloys may deviate from that of the bulk phase. Surface enrichment also has important consequences for some applications of heterogeneous catalysis. In some cases, thermal and chemical treatments can lead to opposite trends regarding which metal is prone to surface enrichment. Using atom probe tomography, we aim to link the physicochemical conditions to the composition of the very first atomic layers of bimetallic catalysts, and eventually to fine-tune the catalytic features of the latter.
Abstract:
The state of Rio Grande do Norte has significant potential in the shrimp farming supply chain. In the larviculture step, the state accounts for more than half of the national production. In the farming step, it is the second largest producer. In the industrial step, its industries hold almost 40% of the shrimp processing capacity of the northeast of Brazil. However, Brazil has the highest tax rate among the main shrimp producing countries. Considering the influence of taxes on competition among companies, the main goal of this research is to analyze the impact of indirect taxes on the above steps of the supply chain. To achieve this, the data of the 2011 Census of the Shrimp Farming are used, and the Herfindahl-Hirschman Index is applied to identify the market form of each step. To contribute to the characterization of the supply chain, CEOs of farms and industries are interviewed. The price elasticities of shrimp larvae, in natura shrimp, and processed shrimp are analyzed in order to verify the possibility each of those three steps has to pass through the onus of the end of the ICMS benefit. The data analysis shows that the larviculture step functions as a duopoly and, facing the end of that benefit, will be able to pass through most of its onus to the farming step. The farming step, on the other hand, functions similarly to a perfectly competitive market, which diminishes its capacity to pass that onus through to the processing step. The processing step operates as an oligopoly with lower concentration than the larviculture step but, because it faces an oligopsony, it will end up assuming most of that onus, which will cause a decrease in the amount of processed shrimp. It is concluded that the end of that benefit would negatively impact the entire supply chain in this state, but mainly the farming and industrial steps.
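The Herfindahl-Hirschman Index used above to classify market forms is simply the sum of squared market shares. A minimal sketch, with made-up shares rather than the census figures:

```python
def hhi(shares):
    """Herfindahl-Hirschman Index: sum of squared market shares.

    `shares` are fractions summing to 1; the index ranges from 1/n
    (n equally sized firms) up to 1 (monopoly). On the 0-10,000 scale
    used by antitrust agencies, multiply by 10,000.
    """
    return sum(s ** 2 for s in shares)

# Illustrative figures only, not the 2011 Shrimp Farming Census data:
duopoly = hhi([0.55, 0.45])        # larviculture-like: two dominant firms
competitive = hhi([0.01] * 100)    # farming-like: many small producers

print(round(duopoly, 4))      # 0.505
print(round(competitive, 4))  # 0.01
```

A high index (here 0.505) signals the duopoly-like concentration attributed to larviculture, while a low index signals the near-perfect competition attributed to the farming step.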
Abstract:
The demand for foods of the highest quality, in terms of taste and the preservation of their properties without the use of additives, is constantly increasing. Consequently, new approaches to food processing have been developed, such as high-pressure technology, which has proven very valuable because it maintains desirable properties of food, like some vitamins, while reducing some undesirable bacteria. This technology avoids the use of high temperatures during the process (unlike pasteurization), which may adversely affect some nutritional properties of the food, its flavour, etc. Models for some enzymatic inactivations, which depend on the pressure and temperature profiles, are presented. This work deals with the optimization of the inactivation of certain enzymes when high-pressure treatment is applied in food processing. The optimization algorithms minimize the inactivation not only of a certain isolated enzyme but also of several enzymes that can be involved simultaneously in the high-pressure process.
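Inactivation models of the kind described are often first-order in the remaining enzyme activity, with a rate that depends on the pressure-temperature profile. The sketch below integrates such a model with explicit Euler; the rate law, parameter values, and profiles are generic illustrations, not the paper's fitted model:

```python
import numpy as np

def enzyme_activity(times, pressure, temperature, k):
    """Integrate dA/dt = -k(P(t), T(t)) * A with A(0) = 1 (explicit Euler).

    `pressure` and `temperature` are callables giving the process profile;
    `k` maps (P, T) to a first-order inactivation rate. All functional
    forms here are illustrative assumptions.
    """
    A = np.ones(len(times))
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        rate = k(pressure(times[i - 1]), temperature(times[i - 1]))
        A[i] = A[i - 1] * (1.0 - rate * dt)
    return A

# Hypothetical process: constant 500 MPa hold at 35 degrees C for 10 minutes,
# with a rate that grows with both pressure and temperature
times = np.linspace(0.0, 10.0, 1001)
A = enzyme_activity(
    times,
    pressure=lambda t: 500.0,      # MPa, constant hold
    temperature=lambda t: 35.0,    # degrees C, constant
    k=lambda P, T: 0.2 * (P / 500.0) * (1 + 0.02 * (T - 20.0)),
)
# A decays monotonically from 1 toward 0 under this constant profile
```

An optimizer would then adjust the pressure and temperature profiles so that the terminal activities of several such enzymes jointly meet the desired targets.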
Abstract:
Doctoral thesis, Universidade de Brasília, Faculdade de Direito, Programa de Pós-Graduação em Direito, 2016.
Abstract:
The United States of America is making great efforts to transform its renewable and abundant biomass resources into cost-competitive, high-performance biofuels, bioproducts, and biopower. This is key to increasing domestic production of transportation fuels and renewable energy, and to reducing greenhouse gas and other pollutant emissions. This dissertation focuses specifically on assessing the life cycle environmental impacts of biofuels and bioenergy produced from renewable feedstocks, such as lignocellulosic biomass, renewable oils, and fats. The first part of the dissertation presents the life cycle greenhouse gas (GHG) emissions and energy demands of renewable diesel (RD) and hydroprocessed jet fuels (HRJ). The feedstocks include soybean, camelina, field pennycress, jatropha, algae, and tallow, among others. Results show that RD and HRJ produced from these feedstocks reduce GHG emissions by over 50% compared to comparably performing petroleum fuels. Fossil energy requirements are also significantly reduced. The second part of this dissertation discusses the life cycle GHG emissions, energy demands, and other environmental aspects of pyrolysis oil as well as pyrolysis oil derived biofuels and bioenergy. The feedstocks include waste materials such as sawmill residues, logging residues, sugarcane bagasse, and corn stover, and short-rotation forestry feedstocks such as hybrid poplar and willow. These LCA results show that GHG emission savings of as much as 98% are possible relative to a petroleum heavy fuel oil. Life cycle GHG savings of 77 to 99% were estimated for power generation from pyrolysis oil combustion relative to fossil fuel combustion for electricity, depending on the biomass feedstock and combustion technologies used. Transportation fuels hydroprocessed from pyrolysis oil show over 60% GHG reductions compared to petroleum gasoline and diesel. The energy required to produce pyrolysis oil and pyrolysis oil derived biofuels and bioelectricity comes mainly from renewable biomass, as opposed to fossil energy. Other environmental benefits include human health, ecosystem quality, and fossil resources. The third part of the dissertation addresses the direct land use change (dLUC) impact of forest-based biofuels and bioenergy. An intensive harvest of aspen in Michigan is investigated to understand the GHG mitigation achievable with biofuels and bioenergy production. The study shows that intensive harvest of aspen in Michigan, compared to business-as-usual (BAU) harvesting, can produce 18.5 billion gallons of ethanol to blend with gasoline for the transport sector over the next 250 years, or 32.2 billion gallons of bio-oil by the fast pyrolysis process, which can be combusted to generate electricity or upgraded to gasoline and diesel. Intensive harvesting of these forests results in an initial carbon loss in the aspen forest, but eventually more carbon accumulates in the ecosystem, which translates to a CO2 credit from the dLUC impact. The time required for the forest-based biofuels to reach carbon neutrality is approximately 60 years. The last part of the dissertation describes the use of a depolymerization model as a tool to understand the kinetic behavior of hemicellulose hydrolysis under dilute acid conditions. Experiments were carried out to measure the concentrations of xylose and xylooligomers during dilute acid hydrolysis of aspen, and the experimental data are used to fine-tune the parameters of the depolymerization model. The results show that the depolymerization model successfully predicts the xylose monomer profile in the reaction; however, it overestimates the concentrations of xylooligomers.
Abstract:
Chloroperoxidase (CPO) is a heme-containing glycoprotein secreted by the marine fungus Caldariomyces fumago. Chloroperoxidase contains one ferriprotoporphyrin IX prosthetic group per molecule and catalyzes a variety of reactions, such as halogenation, peroxidation and epoxidation. The versatile catalytic activities of CPO coupled with the increasing demands for chiral synthesis have attracted an escalating interest in understanding the mechanistic and structural properties of this enzyme. In order to better understand the mechanisms of CPO-catalyzed enantioselective reactions and to fine-tune the catalytic properties of chloroperoxidase, asparagine 74 (N74) located in the narrow substrate access channel of CPO was replaced by a bulky, nonpolar valine and a polar glutamine using site-directed mutagenesis. The CPO N74 mutants displayed significantly enhanced activity toward nonpolar substrates compared to wild-type CPO as a result of changes in space and polarity of the heme distal environment. More interestingly, N74 mutants showed dramatically decreased chlorination and catalase activity but significantly enhanced epoxidation activity as a consequence of improved kinetic perfection introduced by the mutation as reflected by the favorable changes in kcat and kcat/KM of these reactions. It is also noted that the N74V mutant is capable of decomposing cyanide, the most notorious poison for many hemoproteins, as judged by the unique binding behavior of N74V with potassium cyanide. Histidine 105 (H105) was replaced by a nonpolar amino acid alanine using site-directed mutagenesis. The CPO H105 mutant (H105A) displayed dramatically decreased chlorination and catalase activity possibly because of the decreased polarity in the heme distal environment and loss of the hydrogen bonds between histidine 105 and glutamic acid 183. However, significantly increased enantioselectivity was observed for the epoxidation of bulky styrene derivatives. 
Furthermore, my study provides strong evidence for the proposed histidine/cysteine ligand switch in chloroperoxidase, offering experimental support for the structural origin of the 420-nm absorption maximum of a number of carbon monoxide complexes of heme-thiolate proteins. For the NMR study, [dCPO(heme)] was produced using a 90% deuterated growth medium with excess heme precursors, and [dCPO(Phe)] was grown in the same highly deuterated medium supplemented with excess natural phenylalanine. NMR spectroscopy was performed for high-resolution structural characterization of [dCPO(heme)] and [dCPO(Phe)], achieving unambiguous and complete heme proton assignments, which also allow important amino acids close to the heme active center to be determined.
Abstract:
Severe plastic deformation (SPD) techniques are known to refine grain sizes down to the submicron level. This leads to conventional Hall-Petch strengthening of the as-processed materials. In addition, the microstructures of SPD-processed materials are characterized by relatively lower dislocation density compared to conventionally processed materials subjected to the same amount of strain. These two aspects taken together lead to many important attributes; some examples are ultra-high yield and fracture strengths, superplastic formability at lower temperatures and higher strain rates, superior wear resistance, and improved high-cycle fatigue life. Since these processes involve large amounts of strain, characteristic crystallographic textures develop depending on the strain path. In the present paper, a detailed account of the underlying mechanisms during SPD is given, and the processing-microstructure-texture-property relationship is presented with reference to the varieties of steels that have been investigated to date.
Abstract:
Cu samples were subjected to high-pressure torsion (HPT) for up to 6 turns at room temperature (RT) and liquid nitrogen temperature (LNT), respectively. The effects of temperature on grain refinement and microhardness variation were investigated. For the samples processed by HPT at RT, the grain size was reduced from 43 μm to 265 nm, and the Vickers microhardness increased from HV52 to HV140. However, for the samples processed by HPT at LNT, the microhardness reached its maximum of HV150 near the center of the sample and decreased to HV80 in the peripheral region. Microstructure observations revealed that HPT straining at LNT induced lamellar structures with thickness less than 100 nm near the central region of the sample, but further deformation induced an inhomogeneous distribution of grain sizes, with submicrometer-sized grains embedded among micrometer-sized grains. The high dislocation density of the submicrometer-sized grains indicated their nonequilibrium nature. In contrast, the micrometer-sized grains were nearly free of dislocations, with no obvious deformation trace remaining in them. These observations demonstrate that the appearance of micrometer-sized grains is the result of abnormal growth of the deformed fine grains.
Abstract:
Two depositional models to account for Holocene gravel-dominated beach ridges covered by dunes, occurring on the northern coast of Ireland, are considered in the light of infrared-stimulated luminescence ages of sand units within beach ridges, and 14C ages from organic horizons in dunes. A new chronostratigraphy obtained from prograded beach ridges with covering dunes at Murlough, north-east Ireland, supports a model of mesoscale alternating sediment decoupling (ASD) on the upper beach, rather than macroscale sequential sediment sourcing to account for prograded beach ridges and covering dunes. The ASD model specifies storm or fair-weather sand beach ridges forming at high-tide positions (on an annual basis at minimum), which acted as deflationary sources for landward foredune development. Only a limited number of such late-Holocene beach ridges survive in the observed prograded series. Beach ridges only survive when capped by storm-generated gravel beaches that are deposited on a mesoscale time spacing of 50–130 years. The morphodynamic shift from a dissipative beach face for dune formation to a reflective beach face for gravel capping appears to be controlled by the beach sand volume falling to a level where reflective conditions can prevail. Sediment volume entering the beach is thought to have fluctuated as a function of a forced regression associated with the falling sea level from the mid-Holocene highstand (ca. 6000 cal. yr BP) identified in north-east Ireland. The prograded beach ridges dated at ca. 3000 to 2000 cal. yr BP indicate that the Holocene highstand’s regressive phase may have lasted longer than previously specified.
Abstract:
The authors demonstrate four real-time reactive responses to movement in everyday scenes using an active head/eye platform. They first describe the design and realization of a high-bandwidth four-degree-of-freedom head/eye platform and a visual feedback loop for the exploration of motion processing within active vision. The vision system divides processing into two scales and two broad functions. At a coarse, quasi-peripheral scale, detection and segmentation of new motion occur across the whole image; at a fine scale, tracking of already detected motion takes place within a foveal region. Several simple coarse-scale motion sensors, which run concurrently at 25 Hz with latencies around 100 ms, are detailed. The use of these sensors to drive the following real-time responses is discussed: (1) head/eye saccades to moving regions of interest; (2) a panic response to looming motion; (3) an opto-kinetic response to continuous motion across the image; and (4) smooth pursuit of a moving target using motion alone.
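The coarse-detection / fine-tracking split described in this abstract can be illustrated with a toy two-scale frame-differencing sketch. The block sizes, thresholds, and synthetic frames below are assumptions for illustration only, not the authors' 25 Hz hardware pipeline:

```python
import numpy as np

def coarse_motion_regions(prev, curr, block=8, thresh=10.0):
    """Coarse scale: flag blocks over the whole image where the mean frame
    difference is large, a cheap segmentation of new motion."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    h, w = diff.shape
    d = diff[: h // block * block, : w // block * block]
    d = d.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    return np.argwhere(d > thresh)            # (row, col) indices of moving blocks

def fine_track(prev, curr, center, fovea=16):
    """Fine scale: within a small foveal window around `center`, return the
    centroid of the per-pixel difference as the updated target position."""
    r, c = center
    win_p = prev[r - fovea : r + fovea, c - fovea : c + fovea].astype(float)
    win_c = curr[r - fovea : r + fovea, c - fovea : c + fovea].astype(float)
    d = np.abs(win_c - win_p)
    if d.sum() == 0:
        return center
    rows, cols = np.indices(d.shape)
    return (int(r - fovea + (rows * d).sum() / d.sum()),
            int(c - fovea + (cols * d).sum() / d.sum()))

# Synthetic frames: a bright square shifts by a few pixels between frames
prev = np.zeros((128, 128), dtype=np.uint8)
curr = np.zeros((128, 128), dtype=np.uint8)
prev[40:56, 40:56] = 255
curr[44:60, 44:60] = 255

moving = coarse_motion_regions(prev, curr)    # coarse: which blocks moved
pos = fine_track(prev, curr, (48, 48))        # fine: refine within the fovea
```

The coarse pass runs over the whole image to trigger responses such as saccades, while the fine pass only ever touches the small foveal window, which is what keeps the latency of the tracking loop low.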
Abstract:
Human beings perceive images through their properties, like colour, shape, size, and texture. Texture is a fertile source of information about the physical environment. Images of low-density crowds tend to present coarse textures, while images of dense crowds tend to present fine textures. This paper describes a new technique for automatic estimation of crowd density, part of the problem of automatic crowd monitoring, using texture information based on grey-level transition probabilities in digitised images. Crowd density feature vectors are extracted from such images and used by a self-organising neural network responsible for the crowd density estimation. Results obtained for the estimation of the number of people in a specific area of Liverpool Street Railway Station in London (UK) are presented.
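Grey-level transition probabilities of the kind this abstract describes are closely related to co-occurrence matrices. A minimal NumPy sketch follows; the quantisation level, displacement, toy textures, and the single scalar feature chosen are illustrative assumptions, not the paper's feature set:

```python
import numpy as np

def transition_probs(img, levels=8, offset=(0, 1)):
    """Joint probability P(i, j) that a pixel has grey level i and its
    neighbour at `offset` (default: immediately to the right) has level j."""
    q = (img.astype(int) * levels // 256).clip(0, levels - 1)
    dr, dc = offset
    a = q[: q.shape[0] - dr, : q.shape[1] - dc].ravel()
    b = q[dr:, dc:].ravel()
    counts = np.zeros((levels, levels))
    np.add.at(counts, (a, b), 1)      # accumulate transition counts
    return counts / counts.sum()

# Toy textures: a smooth horizontal ramp (coarse) vs uniform noise (fine)
ramp = np.tile(np.linspace(0, 255, 64), (64, 1)).astype(np.uint8)
noise = np.random.default_rng(1).integers(0, 256, (64, 64), dtype=np.uint8)

# Diagonal mass = probability that neighbouring pixels share a grey level;
# it is high for smooth (coarse) texture and low for noisy (fine) texture
coarse_feature = np.trace(transition_probs(ramp))
fine_feature = np.trace(transition_probs(noise))
```

Statistics computed from such a matrix (diagonal mass, contrast, entropy, and so on) form the kind of feature vector that a density estimator, such as the self-organising network mentioned above, can consume.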