991 results for graphics processing unit (GPU)
Abstract:
Traditionally, braided river research has considered flow, sediment transport processes and, more recently, vegetation dynamics in relation to river morphodynamics. However, when considering the development of woody vegetated patches over a time scale of decades, we must also consider the extent to which soil-forming processes, particularly those related to soil organic matter, impact the alluvial geomorphic-vegetation system. Here we quantify the soil organic matter processing (humification) that occurs on young alluvial landforms. We sampled different geomorphic units, ranging from the active river channel to established river terraces, in a braided river system. For each geomorphic unit, soil pits were used to sample sediment/soil layers that were analysed in terms of grain size (<2 mm) and organic matter quantity and quality (Rock-Eval method). A principal components analysis was used to identify patterns in the dataset. Results suggest that during the succession from bare river gravels to a terrace soil, there is a transition from small amounts of external organic matter supplied by sedimentation processes (e.g. organic matter transported in suspension and deposited on bars) to large amounts of autogenic in situ organic matter production due to plant colonisation. This appears to change the time scale and pathways of alluvial succession (bio-geomorphic succession). However, this process is complicated by the ongoing possibility of local sedimentation, which through aggradation can isolate surface layers from the exogenic supply, and by erosion, which tends to create fresh deposits upon which organic matter processing must restart. The result is a complex pattern of organic matter states as well as a general lack of any clear chronosequence within the active river corridor. This state reflects the continual battle between deposition events that can isolate organic matter from the surface, erosion events that can destroy accumulating organic matter, and the early ecosystem processes necessary to assist the co-evolution of soil and vegetation. A key question emerges over the extent to which the fresh organic matter deposited in the active zone is capable of transforming the local geochemical environment sufficiently to accelerate soil development.
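As a hedged illustration of the analysis step mentioned above, the sketch below runs a principal components analysis over a toy layer-by-variable table; the variable layout and values are invented stand-ins, not the study's data.

```python
# Illustrative PCA over hypothetical soil-pit layers (not the study's data).
# Columns are assumed stand-ins: fine fraction, organic carbon, a Rock-Eval-
# style quality index.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = np.array([
    [0.10, 1.5, 0.3],   # active channel bar
    [0.14, 2.0, 0.5],   # active channel bar
    [0.33, 3.2, 0.9],   # vegetated island
    [0.38, 3.6, 1.1],   # vegetated island
    [0.58, 7.0, 2.0],   # established terrace
    [0.64, 7.9, 2.4],   # established terrace
])

X_std = StandardScaler().fit_transform(X)   # put variables on comparable scales
pca = PCA(n_components=2).fit(X_std)
scores = pca.transform(X_std)               # layer positions in PC space
print(pca.explained_variance_ratio_)
print(scores.round(2))
```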
Abstract:
This thesis gives an overview of the use of level set methods in the field of image science. The related fast marching method is discussed for comparison, and the narrow band and particle level set methods are also introduced. The level set method is a numerical scheme for representing, deforming and recovering structures in arbitrary dimensions. It approximates and tracks moving interfaces, dynamic curves and surfaces. The level set method does not define how and why a boundary is advancing the way it is, but simply represents and tracks the boundary. The principal idea of the level set method is to represent an N-dimensional boundary in N+1 dimensions, which gives the generality to represent even complex boundaries. Level set methods can be powerful tools for representing dynamic boundaries, but they can require a lot of computing power. In particular, the basic level set method carries a considerable computational burden. This burden can be alleviated with more sophisticated versions of the level set algorithm, such as the narrow band level set method, or with programmable hardware implementations. A parallel approach can also be used in suitable applications. It is concluded that these methods can be used in a broad range of image applications, such as computer vision and graphics and scientific visualization, and also to solve problems in computational physics. Level set methods, and methods derived from and inspired by them, will remain at the front line of image processing in the future.
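To make the N-to-(N+1)-dimensional idea concrete, here is a minimal sketch (not from the thesis) in which a 1-D boundary, a circle, is represented as the zero level set of a 2-D function and advanced with constant speed; the grid size, speed and plain central-difference scheme are illustrative choices.

```python
# Minimal level set sketch: the boundary is implicit in phi and is moved by
# integrating phi_t + F * |grad phi| = 0. Production codes use upwind schemes;
# central differences are kept here only for brevity.
import numpy as np

n, F = 128, 1.0
x = np.linspace(-1.0, 1.0, n)
h = x[1] - x[0]
dt = 0.5 * h / abs(F)                    # CFL-style time step
X, Y = np.meshgrid(x, x)
phi = np.sqrt(X**2 + Y**2) - 0.3         # signed distance to a circle, r = 0.3

for _ in range(30):
    gy, gx = np.gradient(phi, h)         # central differences
    phi -= dt * F * np.sqrt(gx**2 + gy**2)

print((phi < 0).sum(), "grid cells now inside the advanced front")
```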
Abstract:
In the context of autonomous sensors powered by small-size photovoltaic (PV) panels, this work analyses how the efficiency of DC/DC-converter-based power processing circuits can be improved by an appropriate selection of the inductor current that transfers the energy from the PV panel to a storage unit. Each component of the power losses (fixed, conduction and switching losses) in the DC/DC converter depends in a specific way on the average inductor current, so there is an optimal value of this current that causes minimal losses and, hence, maximum efficiency. This idea has been tested experimentally using two commercial DC/DC converters whose average inductor current is adjustable. Experimental results show that the efficiency can be improved by up to 12% by selecting an optimal value of that current, which is around 300-350 mA for these DC/DC converters.
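A rough way to see why an optimum exists: fixed losses dominate at low current while conduction losses grow quadratically, so efficiency peaks in between. The toy model below sketches this trade-off; the coefficients are invented purely so the optimum lands near the reported 300-350 mA range and are not the paper's measured parameters.

```python
# Toy efficiency-vs-current model (illustrative coefficients, not measured).
import numpy as np

V = 3.3                # storage-side voltage [V] (assumed)
p_fixed = 20e-3        # fixed losses [W]
k_sw = 10e-3           # switching-loss coefficient [W/A]
r_cond = 0.2           # conduction-loss coefficient [W/A^2]

I = np.linspace(0.01, 1.0, 1000)             # average inductor current [A]
p_out = V * I                                # power transferred to storage
p_loss = p_fixed + k_sw * I + r_cond * I**2
eta = p_out / (p_out + p_loss)

i_opt = I[np.argmax(eta)]
print(f"optimal current ~ {i_opt * 1e3:.0f} mA, peak efficiency {eta.max():.1%}")
```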
Abstract:
The effects of pulp processing on softwood fiber properties strongly influence the properties of wet and dry paper webs. Pulp strength delivery studies have shown that much of the strength potential of long-fibered pulp is lost during brown stock fiber line operations, where the pulp is merely washed and transferred to the subsequent processing stages. The objective of this work was to study the intrinsic mechanisms which may cause fiber damage in the different unit operations of modern softwood brown stock processing. The work was conducted by studying the effects of industrial machinery on pulp properties, with some actions of the unit operations simulated in laboratory-scale devices under controlled conditions. An optical imaging system was created and used to study the orientation of fibers in the internal flows during pulp fluidization in mixers, and the passage of fibers through the screen openings during screening. The qualitative changes in fibers were evaluated with existing and standardized techniques. The results showed that each process stage has its characteristic effects on fiber properties. Pulp washing and mat formation in displacement washers introduced fiber deformations, especially if the fibers entering the stage were intact, but did not decrease the pulp strength properties. However, storage chests and pulp transfer after displacement washers contributed to strength deterioration. Pulp screening proved to be quite gentle, having the potential of slightly evening out fiber deformations in very deformed pulps and, conversely, inflicting a marginal increase in the deformation indices if the fibers were previously intact. Pulp mixing in fluidizing industrial mixers did not have detrimental effects on pulp strength and had the potential of slightly evening out the deformations, provided that the intensity of fluidization was high enough to allow fiber orientation with the flow and that the time of mixing was short. The chemical and mechanical actions of oxygen delignification had two distinct effects on pulp properties: chemical treatment clearly reduced pulp strength both with and without mechanical treatment, while the mechanical actions of the process machinery introduced more conformability to the pulp fibers but did not clearly contribute to a further decrease in pulp strength. The chemical composition of fibers entering the oxygen stage was also found to affect the susceptibility of fibers to damage during oxygen delignification. Fibers with the smallest content of xylan were found to be more prone to irreversible deformations, accompanied by a lower tensile strength of the pulp. Fibers poor in glucomannan exhibited a lower fiber strength while wet after oxygen delignification, as compared to the reference pulp. Pulps with the smallest lignin content, on the other hand, exhibited improved strength properties as compared to the references.
Abstract:
The purpose of this project was to identify, in a subject group of engineers and technicians (N = 62), a preferred mode of representation for facilitating correct recall of information from complex graphics. The modes of representation were black and white (b&w) block, b&w icon, color block, and color icon. The researcher's test instrument included twelve complex graphics (six b&w and six color; three per mode). Each graphics presentation was followed by two multiple-choice questions. Recall performance was better using b&w block mode graphics and color icon mode graphics. A standardized test, the Group Embedded Figures Test (GEFT), was used to identify a cognitive style preference (field dependence). Although the engineers and technicians in the sample were strongly field-independent, they were not significantly more field-independent than the normative group in the Witkin, Oltman, Raskin, and Karp study (1971). Tests were also employed to look for any significant difference in cognitive style preference due to gender; none was found. Implications of the project results for the design of visuals and their use in technical training are discussed.
Abstract:
The source code of the developed library accompanies this deposit in the state it was in at that time. A more up-to-date version can be found on GitHub (http://github.com/abergeron).
Abstract:
The present studies make clear that Bacillus pumilus xylanase has the characteristics suited to an industrial enzyme (xylanases that are active and stable at elevated temperatures and alkaline pH are needed). SSF production of xylanases and its application appears to be an innovative technology in which the fermented substrate is the enzyme source, used directly in the bleaching process without prior downstream processing. The direct use of SSF enzymes in bleaching is a relatively new biobleaching approach. This can certainly benefit the bleaching process by lowering xylanase production costs and improving the economics and viability of the biobleaching technology. The application of enzymes to the bleaching process has been considered an environmentally friendly approach that can reduce the negative impact on the environment exerted by the use of chlorine-based bleaching agents. It has been demonstrated that pretreatment of kraft pulp with xylanase prior to bleaching (biobleaching) can facilitate the subsequent removal of lignin by bleaching chemicals, thereby reducing the demand for elemental chlorine or improving final paper brightness. This xylanase pre-treatment resulted in an increase in brightness (8.5 units) when compared to non-enzymatically treated bleached pulp prepared under identical conditions. A reduction in the consumption of active chlorine can be achieved, which results in a decrease in the toxicity, colour, chloride and absorbable organic halogen (AOX) levels of bleaching effluents. The xylanase treatment improves the drainage, strength properties and fragility of pulps, and also increases their brightness. This positive result shows that enzyme pre-treatment facilitates the removal of chromophore fragments from the pulp, thereby making the process more environmentally friendly.
Abstract:
In the present work, the author has designed and developed both types of solar air heaters, namely porous and nonporous collectors. The developed solar air heaters were subjected to different air mass flow rates in order to standardize the flow per unit area of the collector. Much attention was given to investigating the performance of solar air heaters fitted with baffles. The output obtained from the experiments on pilot models also helped the installation of a solar air heating system for industrial drying applications. Apart from these, various types of solar dryers for small- and medium-scale drying applications were also built. The feasibility of a latent heat thermal energy storage system based on a phase change material was also studied. The application of a solar greenhouse for drying industrial effluent was analyzed in the present study, and a solar greenhouse was developed. The effectiveness of Computational Fluid Dynamics (CFD) in the field of solar air heaters was also analyzed. The thesis is divided into eight chapters.
Abstract:
This thesis investigated the potential use of Linear Predictive Coding in speech communication applications. A Modified Block Adaptive Predictive Coder is developed, which reduces the computational burden and complexity without sacrificing speech quality, as compared to the conventional adaptive predictive coding (APC) system. For this, changes in the evaluation methods were developed. The method differs from the usual APC system in that the difference between the true and the predicted value is not transmitted. This allows the replacement of the high-order predictor in the transmitter section of a predictive coding system by a simple delay unit, which makes the transmitter quite simple. Also, the block length used in processing the speech signal is adjusted relative to the pitch period of the signal being processed, rather than being a constant length as hitherto chosen by other researchers. The efficiency of the newly proposed coder has been supported with results of computer simulation using real speech data. Three methods for voiced/unvoiced/silent/transition classification are presented. The first is based on energy, zero-crossing rate and the periodicity of the waveform. The second method uses the normalised correlation coefficient as the main parameter, while the third utilizes a pitch-dependent correlation factor. The third algorithm, which gives the minimum error probability, was chosen in a later chapter to design the modified coder. The thesis also presents a comparative study between the autocorrelation and the covariance methods used in the evaluation of the predictor parameters. It has been proved that the autocorrelation method is superior to the covariance method with respect to filter stability and also in an SNR sense, though the increase in gain is only small. The Modified Block Adaptive Coder applies a switching from pitch prediction to spectrum prediction when the speech segment changes from a voiced or transition region to an unvoiced region. The experiments conducted in coding, transmission and simulation used speech samples from Malayalam and English phrases. Proposals for a speaker recognition system and a phoneme identification system are also outlined towards the end of the thesis.
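For context, a minimal sketch of the standard autocorrelation method (Levinson-Durbin recursion) that the thesis compares against the covariance method is given below; the frame length, predictor order and synthetic test signal are assumptions, and this is not the thesis's coder itself.

```python
# Autocorrelation-method LPC via the Levinson-Durbin recursion (sketch).
import numpy as np

def lpc_autocorrelation(frame, order):
    """LPC polynomial A(z) = 1 + a1*z^-1 + ... minimizing prediction error."""
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]                                     # prediction-error energy
    for i in range(1, order + 1):
        k = -(r[i] + a[1:i] @ r[1:i][::-1]) / err  # reflection coefficient
        a[1:i] += k * a[1:i][::-1]
        a[i] = k
        err *= 1.0 - k * k
    return a, err

fs = 8000
t = np.arange(240) / fs                  # one 30 ms frame; the thesis instead
                                         # ties the block length to the pitch period
frame = np.sin(2 * np.pi * 120 * t)
frame += 0.01 * np.random.default_rng(0).standard_normal(t.size)
a, err = lpc_autocorrelation(frame, order=10)
print(a.round(3), err)
```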
Abstract:
The National Housing and Planning Advice Unit commissioned Professor Michael Ball of Reading University to undertake empirical research into how long it was taking to obtain planning consent for major housing sites in England. The focus on sites as opposed to planning applications is important because it is sites that generate housing.
Abstract:
The technique of constructing a transformation, or regrading, of a discrete data set such that the histogram of the transformed data matches a given reference histogram is commonly known as histogram modification. The technique is widely used for image enhancement and normalization. A method which has been previously derived for producing such a regrading is shown to be “best” in the sense that it minimizes the error between the cumulative histogram of the transformed data and that of the given reference function, over all single-valued, monotone, discrete transformations of the data. Techniques for smoothed regrading, which provide a means of balancing the error in matching a given reference histogram against the information lost with respect to a linear transformation are also examined. The smoothed regradings are shown to optimize certain cost functionals. Numerical algorithms for generating the smoothed regradings, which are simple and efficient to implement, are described, and practical applications to the processing of LANDSAT image data are discussed.
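A minimal sketch of the regrading idea follows. It maps each source gray level to the reference level whose cumulative histogram value is nearest above, which keeps the transform single-valued and monotone; the paper's exact error-minimizing regrading may differ in tie-breaking, and the image and reference here are synthetic stand-ins, not LANDSAT data.

```python
# Histogram modification sketch: match a data histogram to a reference one.
import numpy as np

def match_histogram(data, reference, levels=256):
    src_hist, _ = np.histogram(data, bins=levels, range=(0, levels))
    ref_hist, _ = np.histogram(reference, bins=levels, range=(0, levels))
    src_cdf = np.cumsum(src_hist) / data.size
    ref_cdf = np.cumsum(ref_hist) / reference.size
    # Monotone lookup: for each source level, find the reference level whose
    # cumulative value first reaches the source's cumulative value.
    mapping = np.searchsorted(ref_cdf, src_cdf).clip(0, levels - 1)
    return mapping[data]

rng = np.random.default_rng(0)
img = rng.normal(80, 20, (64, 64)).clip(0, 255).astype(int)
ref = rng.integers(0, 256, (64, 64))     # flat (uniform) reference histogram
out = match_histogram(img, ref)
print(img.mean(), out.mean())            # regraded data spreads toward uniform
```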
Abstract:
Most multidimensional projection techniques rely on distance (dissimilarity) information between data instances to embed high-dimensional data into a visual space. When data are endowed with Cartesian coordinates, an extra computational effort is necessary to compute the needed distances, making multidimensional projection prohibitive in applications dealing with interactivity and massive data. The novel multidimensional projection technique proposed in this work, called Part-Linear Multidimensional Projection (PLMP), has been tailored to handle multivariate data represented in Cartesian high-dimensional spaces, requiring only distance information between pairs of representative samples. This characteristic renders PLMP faster than previous methods when processing large data sets while still being competitive in terms of precision. Moreover, knowing the range of variation for data instances in the high-dimensional space, we can make PLMP a truly streaming data projection technique, a trait absent in previous methods.
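The part-linear idea can be sketched as follows, under the assumption that a distance-based method places only a small set of representative samples and a least-squares linear map then projects everything else; classical MDS is used here as a stand-in sample embedder and is not necessarily PLMP's own choice.

```python
# Sketch of a part-linear projection: costly embedding on few samples,
# cheap linear mapping for the rest.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 20))          # data given in Cartesian coordinates

idx = rng.choice(len(X), size=100, replace=False)
Xs = X[idx]                              # representative samples
Ys = MDS(n_components=2, random_state=0).fit_transform(Xs)  # distance-based
                                         # step, run on 100 points only

P, *_ = np.linalg.lstsq(Xs, Ys, rcond=None)  # minimizes ||Xs @ P - Ys||_F
Y = X @ P                                # fast linear projection of all data
print(Y.shape)
```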
Abstract:
Large-scale simulations of parts of the brain using detailed neuronal models, aimed at improving our understanding of brain functions, are becoming a reality with the use of supercomputers and large clusters. However, the high acquisition and maintenance cost of these computers, including the physical space, air conditioning, and electrical power, limits the number of simulations of this kind that scientists can perform. Modern commodity graphics cards, based on the CUDA platform, contain graphics processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model each neuron. Communication among neurons located on different GPUs is coordinated by the CPU. We obtained speedups of 40 for the simulation of 200k neurons that received random external input, and speedups of 9 for a network with 200k neurons and 20M neuronal connections, on a single computer with two graphics boards with two GPUs each, when compared with a modern quad-core CPU. Copyright (C) 2010 John Wiley & Sons, Ltd.
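As an illustration of the thread-per-neuron decomposition, the sketch below writes the per-neuron Hodgkin-Huxley update vectorized in NumPy, with each array element playing the role of one CUDA thread; the parameters are the classic HH values, while the step size and input statistics are illustrative and not taken from the paper.

```python
# Thread-per-neuron idea, vectorized: element i integrates neuron i's
# Hodgkin-Huxley equations (forward Euler, classic squid-axon parameters).
import numpy as np

N, dt, steps = 1000, 0.01, 5000          # neurons, step [ms], iterations (50 ms)
rng = np.random.default_rng(0)

v = np.full(N, -65.0)                    # membrane potential [mV]
m, h, n = 0.05 * np.ones(N), 0.6 * np.ones(N), 0.32 * np.ones(N)
C, gna, gk, gl = 1.0, 120.0, 36.0, 0.3
Ena, Ek, El = 50.0, -77.0, -54.4

def rates(v):
    am = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
    an = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(v + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

for _ in range(steps):
    I_ext = rng.normal(10.0, 2.0, N)     # random external input, as in the paper
    am, bm, ah, bh, an, bn = rates(v)
    m += dt * (am * (1 - m) - bm * m)
    h += dt * (ah * (1 - h) - bh * h)
    n += dt * (an * (1 - n) - bn * n)
    I_ion = gna * m**3 * h * (v - Ena) + gk * n**4 * (v - Ek) + gl * (v - El)
    v += dt * (I_ext - I_ion) / C        # each "thread" updates its own neuron

print(v[:5].round(2))
```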