977 results for Short Loadlength, Fast Algorithms
Abstract:
Image segmentation is one of the most computationally intensive operations in image processing and computer vision. This is because a large volume of data is involved and many different features have to be extracted from the image data. This thesis is concerned with the investigation of practical issues related to the implementation of several classes of image segmentation algorithms on parallel architectures. The Transputer is used as the basic building block of hardware architectures and Occam is used as the programming language. The segmentation methods chosen for implementation are convolution, for edge-based segmentation; the Split and Merge algorithm, for segmenting non-textured regions; and the Granlund method, for segmentation of textured images. Three different convolution methods have been implemented. The direct method of convolution, carried out in the spatial domain, uses the array architecture. The other two methods, based on convolution in the frequency domain, require the use of the two-dimensional Fourier transform. Parallel implementations of two different Fast Fourier Transform algorithms have been developed, incorporating original solutions. For the Row-Column method the array architecture has been adopted, and for the Vector-Radix method, the pyramid architecture. The texture segmentation algorithm, for which a system-level design is given, demonstrates a further application of the Vector-Radix Fourier transform. A novel concurrent version of the quad-tree based Split and Merge algorithm has been implemented on the pyramid architecture. The performance of the developed parallel implementations is analysed. Many of the obtained speed-up and efficiency measures show values close to their respective theoretical maxima. Where appropriate, comparisons are drawn between different implementations.
The thesis concludes with comments on general issues related to the use of the Transputer system as a development tool for image processing applications; and on the issues related to the engineering of concurrent image processing applications.
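The direct spatial-domain convolution mentioned above is the simplest of the three methods. As a minimal, hedged illustration (in Python rather than the Occam used in the thesis; the function name and the 'valid'-region behaviour are assumptions of this sketch, not the thesis's implementation):

```python
def convolve2d(image, kernel):
    """Direct (spatial-domain) 2-D convolution over the 'valid' region.

    image and kernel are lists of lists of numbers.  The kernel is
    flipped in both axes, as true convolution requires.
    """
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            acc = 0
            for j in range(kh):
                for i in range(kw):
                    # flipped kernel indices implement convolution
                    acc += image[y + j][x + i] * kernel[kh - 1 - j][kw - 1 - i]
            row.append(acc)
        out.append(row)
    return out
```

On an array architecture, each processing element can compute a tile of output rows independently, since every output pixel depends only on its local neighbourhood.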
Abstract:
This thesis investigates the cost of electricity generation using bio-oil produced by the fast pyrolysis of UK energy crops. The study covers costs from the farm to the generator’s terminals. The use of short rotation coppice willow and miscanthus as feedstocks was investigated. All cost and performance data have been taken from published papers, reports or web sites. Generation technologies are compared at scales where they have proved economic burning other fuels, rather than at a given size. A pyrolysis yield model was developed for a bubbling fluidised bed fast pyrolysis reactor from published data to predict bio-oil yields and pyrolysis plant energy demands. Generation using diesel engines, gas turbines in open and combined cycle (CCGT) operation and steam cycle plants was considered. The use of bio-oil storage to allow the pyrolysis and generation plants to operate independently of each other was investigated. The option of using diesel generators and open cycle gas turbines for combined heat and power was examined. The possible cost reductions that could be expected through learning, if the technology is widely implemented, were considered. It was found that none of the systems analysed would be viable without subsidy, but that under the current Renewable Obligation Scheme, CCGT plants in the 200 to 350 MWe range, super-critical coal fired boilers co-fired with bio-oil, and groups of diesel engine based CHP schemes supplied by a central pyrolysis plant would be viable. It was found that costs would fall with implementation and the planting of more energy crops, but some subsidy would still be needed to make the plants viable.
Abstract:
Dedicated Short Range Communication (DSRC) is a promising technique for vehicular ad-hoc networks (VANETs) and collaborative road safety applications. As road safety applications require strict quality of service (QoS) from the VANET, it is crucial for DSRC to provide timely and reliable communications to make safety applications successful. In this paper we propose two adaptive message rate control algorithms for low priority safety messages, in order to provide a highly available channel for high priority emergency messages while improving channel utilization. In these algorithms each vehicle monitors the channel load and independently controls its message rate by a modified additive increase and multiplicative decrease (AIMD) method. Simulation results demonstrate the effectiveness of the proposed rate control algorithms in adapting to dynamic traffic load.
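The modified AIMD rule itself is not given in the abstract; a minimal sketch of a plain (unmodified) AIMD rate controller, with illustrative parameter names and values rather than the paper's, might look like:

```python
def aimd_update(rate, channel_load, threshold, alpha=1.0, beta=0.5,
                min_rate=1.0, max_rate=10.0):
    """One AIMD step: increase the message rate additively while the
    measured channel load stays below the congestion threshold,
    otherwise back off multiplicatively.

    All values are illustrative (messages per second), not the
    parameters used in the paper.
    """
    if channel_load < threshold:
        rate += alpha        # additive increase
    else:
        rate *= beta         # multiplicative decrease
    return max(min_rate, min(max_rate, rate))
```

Because every vehicle runs the same rule on its own channel-load measurement, no coordination messages are needed, which suits the distributed VANET setting described above.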
Abstract:
The aim of this study is to characterise and compare fast pyrolysis product yields from straw, high yielding perennial grasses and hardwoods. Feedstocks selected for this study include: wheat straw (Triticum aestivum), switch grass (Panicum virgatum), miscanthus (Miscanthus x giganteus), willow short rotation coppice (Salix viminalis) and beech wood (Fagus sylvatica). The experimental work is divided into two sections: analytical (TGA and Py-GC-MS) and laboratory scale processing using a continuously fed bubbling fluidized bed reactor with a capacity of up to 1 kg/h. Pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS) has been used to quantify pyrolysis products and simulate fast pyrolysis heating rates, in order to study potential key light and medium volatile decomposition products found in these feedstocks. Py-GC-MS quantification results show that the highest yields of furfural (0.57 wt.%), 2-furanmethanol (0.18 wt.%), levoglucosan (0.73 wt.%), 1,2-benzenediol (0.27 wt.%) and 2-methoxy-4-vinylphenol (0.38 wt.%) were found in switch grass, and that willow SRC produced the highest yield of phenol (0.33 wt.%). The bio-oil higher heating value was highest for switch grass (22.3 MJ/kg). Water content within the bio-oil is highest in the straw and perennial grasses and lowest in the hardwood willow SRC. The high bio-oil and char heating values and low water content found in willow SRC make this crop an attractive energy feedstock for fast pyrolysis processing, if the associated production costs and harvest yields can be maintained at currently reported values. The bio-oil from switch grass has the highest potential for the production of high value chemicals.
Abstract:
The focus of this study is the development of a parallelised version of severely sequential and iterative numerical algorithms on a multi-threaded parallel platform such as a graphics processing unit. This requires the design and development of a platform-specific numerical solution that can benefit from the parallel capabilities of the chosen platform. A graphics processing unit was chosen as the parallel platform for the design and development of a numerical solution for a specific physical model in non-linear optics. This problem appears in describing ultra-short pulse propagation in bulk transparent media, which has recently been the subject of several theoretical and numerical studies. The mathematical model describing this phenomenon is a challenging and complex problem, and its numerical modeling is limited on current workstations. Numerical modeling of this problem requires the parallelisation of essentially serial algorithms and the elimination of numerical bottlenecks. The main challenge to overcome is the parallelisation of the globally non-local mathematical model. This thesis presents a numerical solution that eliminates the numerical bottleneck associated with the non-local nature of the mathematical model. The accuracy and performance of the parallel code are verified by back-to-back testing against a similar serial version.
Thermochemical characterisation of various biomass feedstock and bio-oil generated by fast pyrolysis
Abstract:
The projected decline in fossil fuel availability, environmental concerns, and concerns over security of supply are attracting increased interest in renewable energy derived from biomass. Fast pyrolysis is a promising thermochemical conversion route for the production of bio-oil. The purpose of the experiments reported in this thesis was to extend our understanding of the fast pyrolysis process for straw, perennial grasses and hardwoods, and of the implications of selective pyrolysis, crop harvest and storage for the thermal decomposition products. To this end, characterisation and laboratory-scale fast pyrolysis were conducted on the available feedstocks, and their products were compared. The variation in light and medium volatile decomposition products was investigated at different pyrolysis temperatures and heating rates, and a comparison of fast and slow pyrolysis products was conducted. Feedstocks from different harvests, storage durations and locations were characterised and compared in terms of their fuel and chemical properties. A range of analytical equipment (e.g. Py-GC-MS and TGA) and processing equipment (0.3 kg/h and 1.0 kg/h fast pyrolysis reactors and a 0.15 kg slow pyrolysis reactor) was used. Findings show that the high bio-oil and char heating value and low water content of willow short rotation coppice (SRC) make this crop attractive for fast pyrolysis processing compared to the other feedstocks investigated in this project. From the analytical sequential investigation of willow SRC, it was found that the volatile product distribution can be tailored to achieve a better final product by varying the heating rate and temperature. Time of harvest was most influential on the fuel properties of miscanthus; overall, the late harvest produced the best fuel properties (high HHV, low moisture content, high volatile content, low ash content), and storage of the feedstock reduced its moisture and acid content.
Abstract:
As microblog services such as Twitter become a fast and convenient communication approach, identification of trendy topics in microblog services has great academic and business value. However, detecting trendy topics is very challenging due to the huge number of users and short-text posts in microblog diffusion networks. In this paper we introduce a trendy topic detection system that operates under computation and communication resource constraints. In stark contrast to retrieving and processing the whole microblog content stream, we develop the idea of selecting a small set of microblog users and processing only their posts, to achieve an overall acceptable trendy topic coverage without exceeding the resource budget for detection. We formulate the selection of this subset of users as mixed-integer optimization problems, and develop heuristic algorithms to compute approximate solutions. The proposed system is evaluated with real-time test data retrieved from Sina Weibo, the dominant microblog service provider in China. It is shown that by monitoring 500 out of 1.6 million microblog users and tracking their microposts (about 15,000 daily) with our system, nearly 65% of trendy topics can be detected, on average 5 hours before they appear in Sina Weibo's official trends.
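The abstract does not specify the heuristics used; one standard approach to this kind of budgeted coverage problem is greedy selection, sketched below under the assumption that each user is summarized by the set of topics their posts have historically covered (all names and the data layout are illustrative):

```python
def greedy_user_selection(user_topics, budget):
    """Greedily pick up to `budget` users to maximize topic coverage.

    user_topics: dict mapping user id -> set of topic ids that user's
    posts have covered.  Returns the chosen users and covered topics.
    """
    covered = set()
    chosen = []
    remaining = dict(user_topics)
    for _ in range(budget):
        # pick the user adding the most not-yet-covered topics
        best = max(remaining,
                   key=lambda u: len(remaining[u] - covered),
                   default=None)
        if best is None or not (remaining[best] - covered):
            break  # budget unused: no user adds new coverage
        chosen.append(best)
        covered |= remaining.pop(best)
    return chosen, covered
```

Greedy maximization of a coverage objective carries the classic (1 − 1/e) approximation guarantee, which makes it a natural baseline for the mixed-integer formulation mentioned above.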
Abstract:
A new generalized sphere decoding algorithm is proposed for underdetermined MIMO systems with fewer receive antennas N than transmit antennas M. The proposed algorithm is significantly faster than the existing generalized sphere decoding algorithms. The basic idea is to partition the transmitted signal vector into two subvectors, x1 and x2, with N - 1 and M - N + 1 elements respectively. After some simple transformations, an outer-layer sphere decoder (SD) can be used to choose a candidate x2, and an inner-layer SD is then used to decide x1, so that the whole transmitted signal vector is obtained. Simulation results show that the Double Layer Sphere Decoding (DLSD) algorithm has far lower complexity than the existing generalized sphere decoders (GSDs).
Abstract:
This paper presents an assessment of the technical and economic performance of thermal processes to generate electricity from a wood chip feedstock by combustion, gasification and fast pyrolysis. The scope of the work begins with the delivery of a wood chip feedstock at a conversion plant and ends with the supply of electricity to the grid, incorporating wood chip preparation, thermal conversion, and electricity generation in dual fuel diesel engines. Net generating capacities of 1–20 MWe are evaluated. The techno-economic assessment is achieved through the development of a suite of models that are combined to give cost and performance data for the integrated system. The models include feed pretreatment, combustion, atmospheric and pressure gasification, fast pyrolysis with pyrolysis liquid storage and transport (an optional step in de-coupled systems) and diesel engine or turbine power generation. The models calculate system efficiencies, capital costs and production costs. An identical methodology is applied in the development of all the models so that all of the results are directly comparable. The electricity production costs have been calculated for 10th-plant systems, indicating the costs that are achievable in the medium term, after the high initial costs associated with novel technologies have fallen. The costs converge at the larger scale with the mean electricity price paid in the EU by a large consumer, and there is therefore potential for fast pyrolysis and diesel engine systems to sell electricity directly to large consumers or for on-site generation. However, competition will be fierce at all capacities since electricity production costs vary only slightly between the four biomass to electricity systems that are evaluated. Systems de-coupling is one way that the fast pyrolysis and diesel engine system can distinguish itself from the other conversion technologies.
Evaluations in this work show that situations requiring several remote generators are much better served by a large fast pyrolysis plant that supplies fuel to de-coupled diesel engines than by constructing an entire close-coupled system at each generating site. Another advantage of de-coupling is that the fast pyrolysis conversion step and the diesel engine generation step can operate independently, with intermediate storage of the fast pyrolysis liquid fuel, increasing overall reliability. Peak load or seasonal power requirements would also benefit from de-coupling, since a small fast pyrolysis plant could operate continuously to produce fuel that is stored for use in the engine on demand. Current electricity production costs for a fast pyrolysis and diesel engine system are 0.091/kWh at 1 MWe when learning effects are included. These systems are handicapped by the typical characteristics of a novel technology: high capital cost, high labour requirements, and low reliability. As such, the more established combustion and steam cycle produces lower cost electricity under current conditions. The fast pyrolysis and diesel engine system is a low capital cost option, but it also suffers from relatively low system efficiency, particularly at high capacities. This low efficiency is the result of a low conversion efficiency of feed energy into the pyrolysis liquid, because of the energy in the char by-product. A sensitivity analysis has highlighted the high impact of the fast pyrolysis liquids yield on electricity production costs. The liquids yield should be set realistically during design, and it should be maintained in practice by careful attention to plant operation and feed quality. Another problem is the high power consumption during feedstock grinding. Efficiencies may be enhanced in ablative fast pyrolysis, which can tolerate a chipped feedstock, but this has yet to be demonstrated at commercial scale.
In summary, the fast pyrolysis and diesel engine system has great potential to generate electricity at a profit in the long term, and at a lower cost than any other biomass to electricity system at small scale. This future viability can only be achieved through the construction of early plant that could, in the short term, be more expensive than the combustion alternative. Profitability in the short term can best be achieved by exploiting niches in the market place and specific features of fast pyrolysis. These include:
• countries or regions with fiscal incentives for renewable energy such as premium electricity prices or capital grants;
• locations with high electricity prices so that electricity can be sold direct to large consumers or generated on-site by companies who wish to reduce their consumption from the grid;
• waste disposal opportunities where feedstocks can attract a gate fee rather than incur a cost;
• the ability to store fast pyrolysis liquids as a buffer against shutdowns or as a fuel for peak-load generating plant;
• de-coupling opportunities where a large, single pyrolysis plant supplies fuel to several small and remote generators;
• small-scale combined heat and power opportunities;
• sales of the excess char, although a market has yet to be established for this by-product; and
• potential co-production of speciality chemicals and fuel for power generation in fast pyrolysis systems.
Abstract:
In this paper, we demonstrate a fast switching dual polarization DDQPSK packet switched receiver with very short waiting times. The system employs mth power DDQPSK decoding for high frequency offset tolerance, and Stokes parameter estimation for robust polarization demultiplexing.
Abstract:
Transition P systems are computational models based on the basic features of biological membranes and the observation of biochemical processes. In these models, each membrane contains a multiset of objects, which evolve according to given evolution rules. In the field of Transition P system implementation, the need has been identified to determine how long the application of active evolution rules in membranes will take. In addition, having time estimates for rule application makes it possible to take important decisions related to the design of hardware/software architectures. In this paper we propose a new evolution rule application algorithm oriented towards the implementation of Transition P systems. The developed algorithm is sequential and has linear time complexity in the number of evolution rules. Moreover, it achieves the shortest execution times compared with the preceding algorithms. The algorithm is therefore very appropriate for the implementation of Transition P systems on sequential devices.
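The paper's algorithm is not reproduced in the abstract; as a much-simplified sketch of sequential rule application that is linear in the number of rules (the data representation, the greedy per-rule order and all names are assumptions of this illustration, not the authors' algorithm):

```python
def apply_rules_sequential(multiset, rules):
    """Apply each evolution rule greedily in one linear pass.

    multiset: dict object -> count; rules: list of (lhs, rhs) pairs,
    each a dict object -> count.  Real P systems apply rules in a
    maximally parallel, nondeterministic way; this sketch fixes one
    deterministic order purely for illustration.
    """
    for lhs, rhs in rules:
        # maximum number of applications the current multiset permits
        times = min(multiset.get(obj, 0) // n for obj, n in lhs.items())
        if times <= 0:
            continue
        for obj, n in lhs.items():      # consume left-hand side
            multiset[obj] -= n * times
        for obj, n in rhs.items():      # produce right-hand side
            multiset[obj] = multiset.get(obj, 0) + n * times
    return multiset
```

Each rule is visited exactly once, which is where the linear dependence on the number of rules comes from.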
Abstract:
In this paper a genetic algorithm (GA) is applied to the Maximum Betweenness Problem (MBP). The maximum of the objective function is obtained by finding a permutation that satisfies the maximal number of betweenness constraints. Every permutation considered is genetically coded with an integer representation, and standard operators are used in the GA. The instances in the experimental results are randomly generated. For smaller dimensions, optimal solutions of MBP are obtained by total enumeration; for those instances, the GA reached all optimal solutions except one. The GA also obtained results for larger instances of up to 50 elements and 1000 triples. The running times, both overall and for finding optimal results, are quite short.
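The fitness function implied above counts satisfied betweenness constraints: a triple (a, b, c) is satisfied by a permutation when b appears between a and c. A minimal sketch of such a fitness evaluation (names and data layout are illustrative assumptions):

```python
def satisfied_betweenness(perm, triples):
    """Count triples (a, b, c) for which b lies between a and c in perm.

    perm: list giving the integer-coded permutation of elements;
    triples: iterable of (a, b, c) betweenness constraints.
    """
    pos = {elem: i for i, elem in enumerate(perm)}  # element -> position
    count = 0
    for a, b, c in triples:
        if pos[a] < pos[b] < pos[c] or pos[c] < pos[b] < pos[a]:
            count += 1
    return count
```

In a GA this would serve directly as the fitness of a candidate permutation, evaluated in O(|triples|) time after one O(n) pass to build the position map.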
Abstract:
This research is motivated by a practical application observed at a printed circuit board (PCB) manufacturing facility. After assembly, the PCBs (or jobs) are tested in environmental stress screening (ESS) chambers (or batch processing machines) to detect early failures. Several PCBs can be tested simultaneously as long as the total size of all the PCBs in the batch does not violate the chamber capacity. PCBs from different production lines arrive dynamically to a queue in front of a set of identical ESS chambers, where they are grouped into batches for testing. Each line delivers PCBs that vary in size and require different testing (or processing) times. Once a batch is formed, its processing time is the longest processing time among the PCBs in the batch, and its ready time is given by the PCB arriving last to the batch. ESS chambers are expensive and a bottleneck; consequently, the makespan has to be minimized. A mixed-integer formulation is proposed for the problem under study and compared to a recently published formulation. The proposed formulation is better in terms of the number of decision variables, linear constraints and run time. A procedure to compute a lower bound is proposed. For sparse problems (i.e. when job ready times are widely dispersed), the lower bounds are close to the optimum. The problem under study is NP-hard. Consequently, five heuristics, two metaheuristics (simulated annealing (SA) and a greedy randomized adaptive search procedure (GRASP)), and a decomposition approach (column generation) are proposed, in particular to solve problem instances that require prohibitively long run times when a commercial solver is used. An extensive experimental study was conducted to evaluate the different solution approaches based on solution quality and run time. The decomposition approach improved the lower bounds (i.e. the linear relaxation solution) of the mixed-integer formulation.
At least one of the proposed heuristics outperforms the Modified Delay heuristic from the literature. For sparse problems, almost all the heuristics report a solution close to the optimum. GRASP outperforms SA at a higher computational cost. The proposed approaches are viable to implement, as the run time is very short.
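The batching rules stated above (capacity check, batch processing time equal to the longest job time, batch ready time equal to the latest arrival) are easy to pin down in code; a minimal sketch with assumed names and data layout, not any of the paper's heuristics:

```python
def batch_attributes(batch, capacity):
    """Check capacity and derive a batch's processing and ready times.

    batch: list of (size, proc_time, ready_time) tuples, one per PCB.
    Returns (batch processing time, batch ready time): the batch runs
    as long as its slowest PCB and is ready when its last PCB arrives.
    """
    total_size = sum(size for size, _, _ in batch)
    if total_size > capacity:
        raise ValueError("batch exceeds chamber capacity")
    proc_time = max(p for _, p, _ in batch)
    ready_time = max(r for _, _, r in batch)
    return proc_time, ready_time
```

Any batching heuristic for this problem must call such a routine when evaluating candidate batches, since the makespan depends on these two derived quantities.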
Abstract:
Respiratory gating in lung PET imaging to compensate for respiratory motion artifacts is a current research issue with broad potential impact on the quantitation, diagnosis and clinical management of lung tumors. However, PET images collected in discrete gated bins can be significantly affected by noise, as there are lower activity counts in each gated bin unless the total PET acquisition time is prolonged, so gating methods should be combined with image-based motion correction and registration methods. The aim of this study was to develop and validate a fast and practical solution to the problem of respiratory motion for the detection and accurate quantitation of lung tumors in PET images. This included: (1) developing a computer-assisted algorithm for PET/CT images that automatically segments lung regions in CT images and identifies and localizes lung tumors in PET images; (2) developing and comparing different registration algorithms which process all the information within the entire respiratory cycle and integrate the tumor data from the different gated bins into a single reference bin. Four registration/integration algorithms were compared: centroid-based, intensity-based, rigid-body and optical-flow registration, as well as two registration schemes: the Direct Scheme and the Successive Scheme. Validation was demonstrated by conducting experiments with the computerized 4D NCAT phantom and with a dynamic lung-chest phantom imaged using a GE PET/CT system. Iterations were conducted on simulated tumors of different sizes and at different noise levels. Static tumors without respiratory motion were used as the gold standard; quantitative results were compared with respect to tumor activity concentration, cross-correlation coefficient, relative noise level and computation time. Comparing the results for the tumors before and after correction, the tumor activity values and tumor volumes were closer to those of the static tumors (gold standard).
Higher correlation values and lower noise were also achieved after applying the correction algorithms. With this method the compromise between short PET scan time and reduced image noise can be achieved, while quantification and clinical analysis become fast and precise.
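Of the four registration approaches compared above, centroid-based registration is conceptually the simplest: shift each gated bin so that its tumor centroid coincides with the centroid in the reference bin. A minimal point-based sketch (the study itself registers image volumes; the names and point representation here are illustrative assumptions):

```python
def centroid(points):
    """Mean position of a list of equal-dimension coordinate tuples."""
    n = len(points)
    return tuple(sum(coords) / n for coords in zip(*points))

def centroid_align(moving, reference):
    """Translate `moving` points so their centroid matches that of
    `reference` -- the essence of centroid-based registration."""
    cm, cr = centroid(moving), centroid(reference)
    shift = tuple(r - m for m, r in zip(cm, cr))
    return [tuple(p + s for p, s in zip(pt, shift)) for pt in moving]
```

This pure translation cannot model rotation or deformation, which is why the intensity-based, rigid-body and optical-flow alternatives were also evaluated in the study.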