992 results for Graphical processing unit
Abstract:
The introduction of time-series graphs into British economics in the 19th century depended on the "timing" of history. This involved reconceptualizing history into events that were comparable, measurable and standardized by time unit. Yet classical economists in Britain in the early 19th century viewed history as a set of heterogeneous and complex events, and statistical tables as giving unrelated facts. Both these attitudes had to be broken down before time-series graphs could be brought into use, by the century's end, for revealing regularities in economic events.
Abstract:
In the context of autonomous sensors powered by small-size photovoltaic (PV) panels, this work analyses how the efficiency of DC/DC-converter-based power processing circuits can be improved by an appropriate selection of the inductor current that transfers the energy from the PV panel to a storage unit. Each component of power losses (fixed, conduction and switching losses) involved in the DC/DC converter depends specifically on the average inductor current, so there is an optimal value of this current that causes minimal losses and, hence, maximum efficiency. This idea has been tested experimentally using two commercial DC/DC converters whose average inductor current is adjustable. Experimental results show that the efficiency can be improved by up to 12% by selecting an optimal value of that current, which is around 300-350 mA for these DC/DC converters.
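As a rough illustration of the idea that each loss component scales differently with the average inductor current, the sketch below evaluates a made-up loss model over a range of currents and picks the efficiency-maximising value. All coefficients and the assumed loss shapes are arbitrary placeholders for illustration, not figures or models from the paper.

import numpy as np

# Illustrative (made-up) loss model for a DC/DC converter transferring a fixed
# PV power P_pv, parameterised by the average inductor current i_avg:
#   - fixed losses:      independent of i_avg (quiescent consumption)
#   - conduction losses: grow with i_avg (I^2*R during conduction intervals)
#   - switching losses:  fall with i_avg (fewer switching events per unit of
#                        transferred energy when more energy moves per cycle)
P_pv = 0.5        # W, power delivered by the PV panel (placeholder)
P_fixed = 0.005   # W (placeholder)
k_cond = 0.08     # W/A^2 (placeholder)
k_sw = 0.004      # W*A (placeholder)

def losses(i_avg):
    return P_fixed + k_cond * i_avg**2 + k_sw / i_avg

i = np.linspace(0.05, 1.0, 500)      # candidate average inductor currents (A)
eta = P_pv / (P_pv + losses(i))      # resulting conversion efficiency
i_opt = i[np.argmax(eta)]
print(f"optimal average inductor current ~ {i_opt*1e3:.0f} mA, "
      f"peak efficiency ~ {eta.max()*100:.1f} %")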
Abstract:
The effects of pulp processing on softwood fiber properties strongly influence the properties of wet and dry paper webs. Pulp strength delivery studies have shown that much of the strength potential of long-fibered pulp is lost during brown stock fiber line operations in which the pulp is merely washed and transferred to the subsequent processing stages. The objective of this work was to study the intrinsic mechanisms which may cause fiber damage in the different unit operations of modern softwood brown stock processing. The work was conducted by studying the effects of industrial machinery on pulp properties, with some actions of unit operations simulated in laboratory-scale devices under controlled conditions. An optical imaging system was created and used to study the orientation of fibers in the internal flows during pulp fluidization in mixers and the passage of fibers through the screen openings during screening. The qualitative changes in fibers were evaluated with existing and standardized techniques. The results showed that each process stage has its characteristic effects on fiber properties. Pulp washing and mat formation in displacement washers introduced fiber deformations, especially if the fibers entering the stage were intact, but did not decrease the pulp strength properties. However, storage chests and pulp transfer after displacement washers contributed to strength deterioration. Pulp screening proved to be quite gentle, slightly evening out fiber deformations in very deformed pulps and, conversely, inflicting a marginal increase in the deformation indices if the fibers were previously intact. Pulp mixing in fluidizing industrial mixers did not have detrimental effects on pulp strength and had the potential to slightly even out deformations, provided that the intensity of fluidization was high enough to allow fiber orientation with the flow and that the mixing time was short. The chemical and mechanical actions of oxygen delignification had two distinct effects on pulp properties: chemical treatment clearly reduced pulp strength with and without mechanical treatment, while the mechanical actions of process machinery introduced more conformability to pulp fibers but did not clearly contribute to a further decrease in pulp strength. The chemical composition of fibers entering the oxygen stage was also found to affect the susceptibility of fibers to damage during oxygen delignification. Fibers with the smallest content of xylan were found to be more prone to irreversible deformations, accompanied by a lower tensile strength of the pulp. Fibers poor in glucomannan exhibited a lower wet fiber strength after oxygen delignification compared to the reference pulp. Pulps with the smallest lignin content, on the other hand, exhibited improved strength properties compared to the references.
Abstract:
The present studies make it clear that Bacillus pumilus xylanase has the characteristics required of an industrial enzyme (xylanases that are active and stable at elevated temperatures and alkaline pH are needed). SSF production of xylanases and its application appears to be an innovative technology in which the fermented substrate is the enzyme source used directly in the bleaching process without prior downstream processing. The direct use of SSF enzymes in bleaching is a relatively new biobleaching approach. This can certainly benefit the bleaching process by lowering xylanase production costs and improving the economics and viability of the biobleaching technology. The application of enzymes to the bleaching process has been considered an environmentally friendly approach that can reduce the negative environmental impact of chlorine-based bleaching agents. It has been demonstrated that pretreatment of kraft pulp with xylanase prior to bleaching (biobleaching) can facilitate the subsequent removal of lignin by bleaching chemicals, thereby reducing the demand for elemental chlorine or improving final paper brightness. This xylanase pre-treatment resulted in a brightness increase of 8.5 units compared with non-enzymatically treated bleached pulp prepared under identical conditions. A reduction in the consumption of active chlorine can be achieved, which results in a decrease in the toxicity, colour, chloride and adsorbable organic halogen (AOX) levels of bleaching effluents. The xylanase treatment improves drainage, strength properties and the fragility of pulps, and also increases the brightness of pulps. This positive result shows that enzyme pre-treatment facilitates the removal of chromophore fragments from the pulp, thereby making the process more environmentally friendly.
Abstract:
In the present work, the author has designed and developed both types of solar air heaters, namely porous and non-porous collectors. The developed solar air heaters were subjected to different air mass flow rates in order to standardize the flow per unit area of the collector. Particular attention was given to investigating the performance of solar air heaters fitted with baffles. The output obtained from the experiments on pilot models also guided the installation of a solar air heating system for industrial drying applications. In addition, various types of solar dryers for small and medium scale drying applications were built. The feasibility of a latent-heat thermal energy storage system based on a phase change material was also investigated. The application of a solar greenhouse for drying industrial effluent was analyzed in the present study, and a solar greenhouse was developed. The effectiveness of Computational Fluid Dynamics (CFD) in the field of solar air heaters was also analyzed. The thesis is divided into eight chapters.
Abstract:
This thesis investigated the potential use of Linear Predictive Coding in speech communication applications. A Modified Block Adaptive Predictive Coder is developed which reduces the computational burden and complexity without sacrificing speech quality, as compared to the conventional adaptive predictive coding (APC) system. For this, changes in the evaluation methods have been evolved. This method differs from the usual APC system in that the difference between the true and the predicted value is not transmitted. This allows the high-order predictor in the transmitter section of a predictive coding system to be replaced by a simple delay unit, which makes the transmitter quite simple. Also, the block length used in the processing of the speech signal is adjusted relative to the pitch period of the signal being processed, rather than choosing a constant length as hitherto done by other researchers. The efficiency of the newly proposed coder has been supported with results of computer simulation using real speech data. Three methods for voiced/unvoiced/silent/transition classification have been presented. The first is based on energy, zero-crossing rate and the periodicity of the waveform. The second method uses the normalised correlation coefficient as the main parameter, while the third method utilizes a pitch-dependent correlation factor. The third algorithm, which gives the minimum error probability, has been chosen in a later chapter to design the modified coder. The thesis also presents a comparative study between the autocorrelation and the covariance methods used in the evaluation of the predictor parameters. It has been shown that the autocorrelation method is superior to the covariance method with respect to filter stability and also in an SNR sense, though the increase in gain is only small. The Modified Block Adaptive Coder switches from pitch prediction to spectrum prediction when the speech segment changes from a voiced or transition region to an unvoiced region. The experiments conducted in coding, transmission and simulation used speech samples from Malayalam and English phrases. Proposals for a speaker recognition system and a phoneme identification system have also been outlined towards the end of the thesis.
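As a minimal sketch of the autocorrelation method referred to above (not the thesis's own implementation), the following Python code estimates short-term predictor coefficients for a single frame with the Levinson-Durbin recursion; the frame length, predictor order and synthetic test signal are illustrative assumptions.

import numpy as np

def lpc_autocorrelation(frame, order):
    """Estimate LPC coefficients for one frame with the autocorrelation
    (Levinson-Durbin) method. The predictor is
    x[n] ~ -sum_{k=1..order} a[k] * x[n-k]; a[0] is fixed at 1."""
    n = len(frame)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(frame[:n - k], frame[k:]) for k in range(order + 1)])
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + sum(a[j] * r[i - j] for j in range(1, i))
        k = -acc / err                 # reflection coefficient for this step
        a_prev = a.copy()
        for j in range(1, i):
            a[j] = a_prev[j] + k * a_prev[i - j]
        a[i] = k
        err *= (1.0 - k * k)           # remaining prediction-error power
    return a, err

# Illustrative usage on a synthetic voiced-like frame (values are placeholders)
fs = 8000
t = np.arange(240) / fs                # 30 ms frame at 8 kHz
frame = np.sin(2 * np.pi * 120 * t) + 0.3 * np.sin(2 * np.pi * 360 * t)
a, err = lpc_autocorrelation(frame * np.hamming(len(frame)), order=10)
print("LPC coefficients:", np.round(a, 3), " residual power:", round(err, 3))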
Abstract:
This paper discusses the implementation details of a child-friendly, good-quality, English text-to-speech (TTS) system that is phoneme-based, concatenative, easy to set up and use, and requires little memory. Direct waveform concatenation and linear prediction coding (LPC) are used. Most existing TTS systems are unit-selection based and use standard speech databases available in neutral adult voices. Here, reduced memory is achieved by the concatenation of phonemes and by replacing phonetic wave files with their LPC coefficients. Linguistic analysis was used to reduce the algorithmic complexity instead of signal processing techniques. A sufficient degree of customization and generalization catering to the needs of the child user has been included through the provision for vocabulary and voice selection to suit the requisites of the child. Prosody has also been incorporated. This inexpensive TTS system was implemented in MATLAB, with the synthesis presented by means of a graphical user interface (GUI), thus making it child friendly. It can be used not only as an interesting language-learning aid for the normal child but also as a speech aid for the vocally disabled child. The quality of the synthesized speech was evaluated using the mean opinion score (MOS).
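The direct waveform concatenation step could look roughly like the sketch below, which joins pre-recorded mono phoneme waveforms with a short linear crossfade at each joint. The file names and crossfade length are hypothetical, and the published system was implemented in MATLAB; this sketch uses Python/NumPy purely for illustration.

import numpy as np
from scipy.io import wavfile

def concatenate_phonemes(paths, crossfade_ms=10):
    """Naively concatenate mono phoneme recordings, overlap-adding a short
    linear crossfade at each joint to reduce audible discontinuities."""
    rate, out = wavfile.read(paths[0])
    out = out.astype(np.float32)
    for p in paths[1:]:
        r, nxt = wavfile.read(p)
        assert r == rate, "all phoneme recordings must share one sample rate"
        nxt = nxt.astype(np.float32)
        n = min(int(rate * crossfade_ms / 1000), len(out), len(nxt))
        fade = np.linspace(0.0, 1.0, n)
        out[-n:] = out[-n:] * (1 - fade) + nxt[:n] * fade   # crossfaded joint
        out = np.concatenate([out, nxt[n:]])
    return rate, out

# Hypothetical phoneme recordings for the word "cat": /k/, /ae/, /t/
rate, wav = concatenate_phonemes(["k.wav", "ae.wav", "t.wav"])
wavfile.write("cat.wav", rate, wav.astype(np.int16))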
Abstract:
Almost everyone sketches. People use sketches day in and day out in many different and heterogeneous fields, for example to share their thoughts and clarify ambiguous interpretations. The media used to sketch vary from analog tools like flipcharts to digital tools like smartboards. Whereas analog tools usually suffer from insufficient editing capabilities such as cut/copy/paste, digital tools support these scenarios well. Digital tools can be grouped into informal and formal tools. Informal tools can be understood as simple drawing environments, whereas formal tools offer sophisticated support to create, optimize and validate diagrams of a certain application domain. Most digital formal tools force users to stick to a concrete syntax and editing workflow, limiting the user's creativity. For that reason, many people first sketch their ideas using the flexibility of analog or digital informal tools. Subsequently, the sketch is "portrayed" in an appropriate digital formal tool. This work presents Scribble, a highly configurable and extensible sketching framework which allows sketching features to be dynamically injected into existing graphical diagram editors based on Eclipse GEF. This makes it possible to combine the flexibility of informal tools with the power of formal tools without any effort: no additional code is required to augment a GEF editor with sophisticated sketching features. Scribble recognizes drawn elements as well as handwritten text and automatically generates the corresponding domain elements. A local training data library is created dynamically by incrementally learning shapes drawn by the user. Training data can be shared with others using the WebScribble web application, which has been created as part of this work.
Abstract:
A long development time is needed from the design to the implementation of an AUV. During the first steps, simulation plays an important role, since it allows preliminary versions of the control system to be developed and integrated. Once the robot is ready, the control systems are implemented, tuned and tested. The use of a real-time simulator can help close the gap between off-line simulation and real testing with the already implemented robot. When properly interfaced with the robot hardware, a real-time graphical simulation with a "hardware in the loop" configuration allows the implemented control system to be tested while running on the actual robot hardware. Hence, the development time is drastically reduced. This paper overviews the field of graphical simulators used for AUV development and proposes a classification. It also presents NEPTUNE, a multi-vehicle, real-time, graphical simulator based on OpenGL that allows hardware-in-the-loop simulations.
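A hardware-in-the-loop arrangement of the kind described can be pictured as a fixed-timestep, real-time loop that exchanges sensor data and actuator commands with the external controller on every cycle. The sketch below is a generic illustration, not NEPTUNE's API; the toy surge model and the in-process "controller" stand in for the real vehicle dynamics and the hardware link (serial, UDP, etc.).

import time

DT = 0.05  # simulation step (s); real-time means each step consumes DT of wall clock

class AUVModel:
    """Toy 1-D surge model standing in for the full vehicle dynamics."""
    def __init__(self):
        self.x, self.v = 0.0, 0.0
    def step(self, thrust, dt):
        self.v += (thrust - 0.5 * self.v) * dt   # crude linear drag
        self.x += self.v * dt

def exchange_with_controller(sensor_packet):
    """Placeholder for the link to the real control hardware.
    Here it just runs a trivial proportional controller in-process."""
    return {"thrust": 0.8 * (10.0 - sensor_packet["x"])}

model = AUVModel()
next_tick = time.monotonic()
for _ in range(200):                          # 10 s of simulated (and wall-clock) time
    cmd = exchange_with_controller({"x": model.x, "v": model.v})
    model.step(cmd["thrust"], DT)
    next_tick += DT
    time.sleep(max(0.0, next_tick - time.monotonic()))  # keep the loop real-time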
Abstract:
The paper describes the implementation of an offline, low-cost Brain Computer Interface (BCI) alternative to more expensive commercial models. Using inexpensive general-purpose clinical EEG acquisition hardware (Truscan32, Deymed Diagnostic) as the base unit, a synchronisation module was constructed so that the EEG hardware could be operated precisely in time, allowing automatically time-stamped EEG signals to be recorded. The synchronising module allows the EEG recordings to be aligned in a stimulus time-locked fashion for further processing by the classifier, which establishes the class of the stimulus sample by sample. This allows the acquisition of signals from the subject's brain for a goal-oriented BCI application based on the oddball paradigm. An appropriate graphical user interface (GUI) was constructed and implemented as the method to elicit the required responses (in this case, Event Related Potentials or ERPs) from the subject.
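Stimulus time-locked alignment of this kind is essentially epoching: cutting a fixed window around each time-stamped stimulus marker and baseline-correcting it before classification. The sketch below is a generic illustration with made-up channel counts, sampling rate and marker positions, not the authors' processing chain.

import numpy as np

def epoch_eeg(eeg, stim_samples, fs, t_min=-0.2, t_max=0.8):
    """Cut stimulus time-locked epochs out of a continuous recording.

    eeg:          array of shape (n_channels, n_samples)
    stim_samples: sample indices of the stimulus markers
    Returns an array of shape (n_epochs, n_channels, n_epoch_samples)."""
    pre, post = int(-t_min * fs), int(t_max * fs)
    epochs = []
    for s in stim_samples:
        if s - pre < 0 or s + post > eeg.shape[1]:
            continue                                 # skip epochs that run off the record
        ep = eeg[:, s - pre:s + post].copy()
        ep -= ep[:, :pre].mean(axis=1, keepdims=True)  # baseline-correct on pre-stimulus
        epochs.append(ep)
    return np.stack(epochs)

# Toy example: 8 channels, 10 s at 256 Hz, three stimulus markers
fs = 256
eeg = np.random.randn(8, 10 * fs)
epochs = epoch_eeg(eeg, stim_samples=[512, 1536, 2048], fs=fs)
print(epochs.shape)   # (3, 8, 255): three epochs of roughly one second each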
Abstract:
The National Housing and Planning Advice Unit commissioned Professor Michael Ball of Reading University to undertake empirical research into how long it was taking to obtain planning consent for major housing sites in England. The focus on sites as opposed to planning applications is important because it is sites that generate housing.
Abstract:
The Perspex Machine arose from the unification of computation with geometry. We now report significant redevelopment of both a partial C compiler that generates perspex programs and a Graphical User Interface (GUI). The compiler is constructed with standard compiler-generator tools and produces both an explicit parse tree for C and an Abstract Syntax Tree (AST) that is better suited to code generation. The GUI uses a hash table and a simpler software architecture to achieve an order-of-magnitude speed-up in processing and, consequently, an order-of-magnitude increase in the number of perspexes that can be manipulated in real time (now 6,000). Two perspex-machine simulators are provided, one using trans-floating-point arithmetic and the other using transrational arithmetic. All of the software described here is available on the World Wide Web. The compiler generates code in the neural model of the perspex. At each branch point it uses a jumper to return control to the main fibre. This has the effect of pruning out an exponentially increasing number of branching fibres, thereby greatly increasing the efficiency of perspex programs as measured by the number of neurons required to implement an algorithm. The jumpers are placed at unit distance from the main fibre and form a geometrical structure analogous to the myelin sheath of a biological neuron. Both the perspex jumper-sheath and the biological myelin sheath share the computational function of preventing cross-over of signals to neurons that lie close to an axon. This is an example of convergence driven by similar geometrical and computational constraints in perspex and biological neurons.
Abstract:
When performing data fusion, one often measures where targets were and then wishes to deduce where targets currently are. There has been recent research on the processing of such out-of-sequence data. This research has culminated in the development of a number of algorithms for solving the associated tracking problem. This paper reviews these different approaches in a common Bayesian framework and proposes an architecture that orthogonalises the data association and out-of-sequence problems such that any combination of solutions to these two problems can be used together. The emphasis is not on advocating one approach over another on the basis of computational expense, but rather on understanding the relationships among the algorithms so that any approximations made are explicit. Results for a multi-sensor scenario involving out-of-sequence data association are used to illustrate the utility of this approach in a specific context.
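As a point of reference for the algorithms being compared, the simplest way to handle an out-of-sequence measurement is to buffer all measurements, insert the late one at its true time, and re-filter from the start; the more sophisticated algorithms reviewed in the paper avoid this cost. The sketch below shows that brute-force baseline for a 1-D constant-velocity Kalman filter; the motion model and noise parameters are arbitrary assumptions, not taken from the paper.

import numpy as np

class RefilterTracker:
    """Minimal 1-D constant-velocity Kalman filter that handles an
    out-of-sequence measurement by inserting it into a sorted buffer and
    re-filtering. This is the brute-force baseline, not one of the specific
    algorithms reviewed in the paper."""

    def __init__(self, q=0.1, r=1.0):
        self.q, self.r = q, r
        self.meas = []                       # buffer of (time, value), kept sorted

    def add(self, t, z):
        self.meas.append((t, z))
        self.meas.sort(key=lambda m: m[0])   # a late measurement slots into place
        return self._refilter()

    def _refilter(self):
        x = np.zeros(2)                      # state: [position, velocity]
        P = np.eye(2) * 1e3                  # vague prior
        H = np.array([[1.0, 0.0]])           # we observe position only
        t_prev = self.meas[0][0]
        for t, z in self.meas:
            dt = t - t_prev
            F = np.array([[1.0, dt], [0.0, 1.0]])
            Q = self.q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
            x, P = F @ x, F @ P @ F.T + Q                  # predict to measurement time
            S = H @ P @ H.T + self.r
            K = P @ H.T / S
            x = x + (K * (z - H @ x)).ravel()              # measurement update
            P = (np.eye(2) - K @ H) @ P
            t_prev = t
        return x, P

tracker = RefilterTracker()
tracker.add(0.0, 0.1)
tracker.add(2.0, 2.1)
x, P = tracker.add(1.0, 0.9)   # arrives late, out of sequence
print(x)                        # estimate after incorporating all three measurements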