973 results for Short Loadlength, Fast Algorithms


Relevance:

30.00%

Publisher:

Abstract:

A new generalized sphere decoding algorithm is proposed for underdetermined MIMO systems with fewer receive antennas N than transmit antennas M. The proposed algorithm is significantly faster than existing generalized sphere decoding algorithms. The basic idea is to partition the transmitted signal vector into two subvectors, x₁ with N − 1 elements and x₂ with M − N + 1 elements. After some simple transformations, an outer-layer sphere decoder (SD) chooses a candidate x₂, and an inner-layer SD then decides x₁, so that the whole transmitted signal vector is obtained. Simulation results show that this Double Layer Sphere Decoding (DLSD) has far lower complexity than the existing generalized sphere decoders (GSDs).
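
A hedged sketch of the two-layer decomposition, with exhaustive enumeration over a BPSK-style alphabet standing in for both sphere decoders (a real SD would prune candidates against a shrinking radius); all names are illustrative, not the paper's implementation:

```python
import itertools
import numpy as np

def dlsd_sketch(H, y, symbols=(-1.0, 1.0)):
    """ML detection via the outer/inner partition described above.
    H: N x M channel matrix (N < M), y: length-N observation."""
    N, M = H.shape
    k_inner, k_outer = N - 1, M - N + 1        # sizes of x1 and x2
    H1, H2 = H[:, :k_inner], H[:, k_inner:]    # matching column split
    best, best_cost = None, np.inf
    # Outer layer: enumerate candidates for x2 (an outer SD would
    # restrict this loop to points inside the current sphere).
    for x2 in itertools.product(symbols, repeat=k_outer):
        r = y - H2 @ np.array(x2)
        # Inner layer: given x2, the problem in x1 is no longer
        # underdetermined; an inner SD would solve it efficiently.
        for x1 in itertools.product(symbols, repeat=k_inner):
            cost = np.linalg.norm(r - H1 @ np.array(x1)) ** 2
            if cost < best_cost:
                best_cost, best = cost, np.array(x1 + x2)
    return best
```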

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an assessment of the technical and economic performance of thermal processes to generate electricity from a wood chip feedstock by combustion, gasification and fast pyrolysis. The scope of the work begins with the delivery of a wood chip feedstock at a conversion plant and ends with the supply of electricity to the grid, incorporating wood chip preparation, thermal conversion, and electricity generation in dual fuel diesel engines. Net generating capacities of 1–20 MWe are evaluated. The techno-economic assessment is achieved through the development of a suite of models that are combined to give cost and performance data for the integrated system. The models include feed pretreatment, combustion, atmospheric and pressure gasification, fast pyrolysis with pyrolysis liquid storage and transport (an optional step in de-coupled systems) and diesel engine or turbine power generation. The models calculate system efficiencies, capital costs and production costs. An identical methodology is applied in the development of all the models so that all of the results are directly comparable. The electricity production costs have been calculated for 10th plant systems, indicating the costs that are achievable in the medium term after the high initial costs associated with novel technologies have fallen. The costs converge at the larger scale with the mean electricity price paid in the EU by a large consumer, and there is therefore potential for fast pyrolysis and diesel engine systems to sell electricity directly to large consumers or for on-site generation. However, competition will be fierce at all capacities since electricity production costs vary only slightly between the four biomass to electricity systems that are evaluated. Systems de-coupling is one way that the fast pyrolysis and diesel engine system can distinguish itself from the other conversion technologies. Evaluations in this work show that situations requiring several remote generators are much better served by a large fast pyrolysis plant that supplies fuel to de-coupled diesel engines than by constructing an entire close-coupled system at each generating site. Another advantage of de-coupling is that the fast pyrolysis conversion step and the diesel engine generation step can operate independently, with intermediate storage of the fast pyrolysis liquid fuel, increasing overall reliability. Peak load or seasonal power requirements would also benefit from de-coupling since a small fast pyrolysis plant could operate continuously to produce fuel that is stored for use in the engine on demand. Current electricity production costs for a fast pyrolysis and diesel engine system are 0.091/kWh at 1 MWe when learning effects are included. These systems are handicapped by the typical characteristics of a novel technology: high capital cost, high labour requirements, and low reliability. As such, the more established combustion and steam cycle produces lower cost electricity under current conditions. The fast pyrolysis and diesel engine system is a low capital cost option but it also suffers from relatively low system efficiency, particularly at high capacities. This low efficiency is the result of a low conversion efficiency of feed energy into the pyrolysis liquid, because of the energy in the char by-product. A sensitivity analysis has highlighted the high impact of the fast pyrolysis liquids yield on electricity production costs.
The liquids yield should be set realistically during design, and it should be maintained in practice by careful attention to plant operation and feed quality. Another problem is the high power consumption during feedstock grinding. Efficiencies may be enhanced in ablative fast pyrolysis, which can tolerate a chipped feedstock. This has yet to be demonstrated at commercial scale. In summary, the fast pyrolysis and diesel engine system has great potential to generate electricity at a profit in the long term, and at a lower cost than any other biomass to electricity system at small scale. This future viability can only be achieved through the construction of early plant that could, in the short term, be more expensive than the combustion alternative. Profitability in the short term can best be achieved by exploiting niches in the market place and specific features of fast pyrolysis. These include:
• countries or regions with fiscal incentives for renewable energy such as premium electricity prices or capital grants;
• locations with high electricity prices so that electricity can be sold direct to large consumers or generated on-site by companies who wish to reduce their consumption from the grid;
• waste disposal opportunities where feedstocks can attract a gate fee rather than incur a cost;
• the ability to store fast pyrolysis liquids as a buffer against shutdowns or as a fuel for peak-load generating plant;
• de-coupling opportunities where a large, single pyrolysis plant supplies fuel to several small and remote generators;
• small-scale combined heat and power opportunities;
• sales of the excess char, although a market has yet to be established for this by-product; and
• potential co-production of speciality chemicals and fuel for power generation in fast pyrolysis systems.
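
The "10th plant" costing mentioned above relies on learning effects. The paper's cost models are not reproduced here; as a labelled assumption, a common power-law learning curve illustrates how capital costs might fall with successive plants:

```python
from math import log2

def nth_plant_cost(first_plant_cost, n, progress_ratio=0.8):
    """Capital cost of the n-th plant built, assuming cost falls to
    `progress_ratio` of its previous value each time cumulative output
    doubles. The progress ratio is an illustrative assumption, not a
    value taken from the paper."""
    b = log2(progress_ratio)        # learning exponent (negative)
    return first_plant_cost * n ** b

# Example: with an 80% progress ratio, the 10th plant costs
# roughly 48% of the first plant.
print(nth_plant_cost(1.0, 10))      # ~0.476
```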

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we demonstrate a fast-switching, dual-polarization DDQPSK packet-switched receiver with very short waiting times. The system employs mth-power DDQPSK decoding for high frequency-offset tolerance, and Stokes-parameter estimation for robust polarization demultiplexing.
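
The paper's receiver is not reproduced here, but the generic mth-power idea it builds on can be sketched: raising (D)QPSK samples to the 4th power strips the modulation, exposing the frequency offset. A minimal, assumed-standard feedforward estimator in Python:

```python
import numpy as np

def mth_power_fo_estimate(r, symbol_rate, m=4):
    """Frequency-offset estimate from received samples r (one complex
    sample per symbol, symbol_rate in baud). For QPSK, m = 4 removes
    the data phase, since all symbols satisfy a**4 = const."""
    z = r ** m                                   # strip modulation
    # average phase increment per symbol of the mth-power signal
    dphi = np.angle(np.sum(z[1:] * np.conj(z[:-1])))
    return dphi * symbol_rate / (2 * np.pi * m)  # Hz
```

Note the usual mth-power ambiguity: offsets are only resolvable within ±symbol_rate/(2m), which is one reason high offset tolerance is worth reporting.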

Relevance:

30.00%

Publisher:

Abstract:

Transition P systems are computational models based on basic features of biological membranes and the observation of biochemical processes. In these models, a membrane contains multisets of objects, which evolve according to given evolution rules. In the field of Transition P system implementation, the need has been identified to determine how long the application of active evolution rules in membranes will take. In addition, having time estimates for rule application makes it possible to take important decisions related to the design of hardware/software architectures. In this paper we propose a new evolution rule application algorithm oriented towards the implementation of Transition P systems. The developed algorithm is sequential and has linear complexity in the number of evolution rules. Moreover, it achieves the smallest execution times compared with the preceding algorithms. The algorithm is therefore very appropriate for the implementation of Transition P systems on sequential devices.
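
As an illustration only (a hypothetical data layout, not the paper's algorithm), one sequential pass that applies each active rule a maximal number of times is linear in the number of rules:

```python
from collections import Counter

def apply_rules_once(membrane, rules):
    """membrane: Counter of objects; rules: list of (lhs, rhs) Counters
    with non-empty lhs. Each rule is applied the maximal number of times
    the current objects allow, in one sequential pass over the rule
    list, so the loop is linear in the number of rules."""
    produced = Counter()
    for lhs, rhs in rules:
        # maximal applicability given the remaining objects
        k = min(membrane[obj] // need for obj, need in lhs.items())
        if k > 0:
            for obj, need in lhs.items():
                membrane[obj] -= need * k
            for obj, out in rhs.items():
                produced[obj] += out * k
    membrane.update(produced)   # products become visible afterwards
    return membrane

# Example: rule "2a -> b" applied maximally to {a: 5}.
print(apply_rules_once(Counter(a=5), [(Counter(a=2), Counter(b=1))]))
# Counter({'b': 2, 'a': 1})
```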

Relevance:

30.00%

Publisher:

Abstract:

In this paper a genetic algorithm (GA) is applied to the Maximum Betweenness Problem (MBP). The maximum of the objective function is obtained by finding a permutation which satisfies a maximal number of betweenness constraints. Every permutation considered is genetically coded with an integer representation. Standard operators are used in the GA. The instances in the experimental results are randomly generated. For smaller dimensions, optimal solutions of MBP are obtained by total enumeration; on those instances, the GA reached all optimal solutions except one. The GA also obtained results for larger instances of up to 50 elements and 1000 triples. The running times, both overall and to reach optimal results, are quite short.
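
The GA's objective can be made concrete with a small sketch: counting how many betweenness triples (a, b, c), read as "b lies between a and c", a permutation satisfies. The representation and names are illustrative:

```python
def betweenness_fitness(perm, triples):
    """perm: tuple of elements (integer-coded chromosome);
    triples: iterable of (a, b, c) betweenness constraints."""
    pos = {e: i for i, e in enumerate(perm)}   # element -> position
    return sum(
        1 for a, b, c in triples
        if pos[a] < pos[b] < pos[c] or pos[c] < pos[b] < pos[a]
    )

# Example: (0, 1, 2, 3) satisfies (0, 1, 3) but not (1, 0, 2).
print(betweenness_fitness((0, 1, 2, 3), [(0, 1, 3), (1, 0, 2)]))  # 1
```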

Relevance:

30.00%

Publisher:

Abstract:

This research is motivated by a practical application observed at a printed circuit board (PCB) manufacturing facility. After assembly, the PCBs (or jobs) are tested in environmental stress screening (ESS) chambers (or batch processing machines) to detect early failures. Several PCBs can be tested simultaneously as long as the total size of all the PCBs in the batch does not violate the chamber capacity. PCBs from different production lines arrive dynamically to a queue in front of a set of identical ESS chambers, where they are grouped into batches for testing. Each line delivers PCBs that vary in size and require different testing (or processing) times. Once a batch is formed, its processing time is the longest processing time among the PCBs in the batch, and its ready time is given by the PCB arriving last to the batch. ESS chambers are expensive and a bottleneck; consequently, the makespan has to be minimized. A mixed-integer formulation is proposed for the problem under study and compared to a recently published formulation. The proposed formulation is better in terms of the number of decision variables, linear constraints and run time. A procedure to compute the lower bound is proposed. For sparse problems (i.e. when job ready times are widely dispersed), the lower bounds are close to optimum. The problem under study is NP-hard. Consequently, five heuristics, two metaheuristics (simulated annealing (SA) and a greedy randomized adaptive search procedure (GRASP)), and a decomposition approach (column generation) are proposed, especially to solve problem instances which require prohibitively long run times when a commercial solver is used. An extensive experimental study was conducted to evaluate the different solution approaches based on solution quality and run time. The decomposition approach improved the lower bounds (or linear relaxation solution) of the mixed-integer formulation. At least one of the proposed heuristics outperforms the Modified Delay heuristic from the literature. For sparse problems, almost all the heuristics report a solution close to optimum. GRASP outperforms SA at a higher computational cost. The proposed approaches are viable to implement as the run time is very short.
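
A hedged sketch of the batching model described above; this first-fit grouping is an assumption for illustration, not one of the paper's five heuristics:

```python
def greedy_batches(jobs, capacity):
    """jobs: list of (ready, ptime, size) tuples. Packs jobs into
    capacity-limited batches; a batch's processing time is its longest
    job and its ready time is its latest arrival, as described above."""
    batches = []
    for ready, ptime, size in sorted(jobs):        # by ready time
        for b in batches:
            if b["used"] + size <= capacity:       # first batch that fits
                b["used"] += size
                b["ready"] = max(b["ready"], ready)
                b["ptime"] = max(b["ptime"], ptime)
                break
        else:
            batches.append({"used": size, "ready": ready, "ptime": ptime})
    return batches

def makespan(batches):
    """Completion time of the last batch on a single chamber,
    processing batches in order of readiness."""
    t = 0
    for b in sorted(batches, key=lambda b: b["ready"]):
        t = max(t, b["ready"]) + b["ptime"]
    return t
```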

Relevance:

30.00%

Publisher:

Abstract:

Respiratory gating in lung PET imaging to compensate for respiratory motion artifacts is a current research issue with broad potential impact on quantitation, diagnosis and clinical management of lung tumors. However, PET images collected at discrete bins can be significantly affected by noise, as there are lower activity counts in each gated bin unless the total PET acquisition time is prolonged, so gating methods should be combined with imaging-based motion correction and registration methods. The aim of this study was to develop and validate a fast and practical solution to the problem of respiratory motion for the detection and accurate quantitation of lung tumors in PET images. This included: (1) developing a computer-assisted algorithm for PET/CT images that automatically segments lung regions in CT images and identifies and localizes lung tumors in PET images; (2) developing and comparing different registration algorithms which process all the information within the entire respiratory cycle and integrate the tumor data from the different gated bins into a single reference bin. Four registration/integration algorithms were compared: Centroid Based, Intensity Based, Rigid Body and Optical Flow registration, as well as two registration schemes: Direct Scheme and Successive Scheme. Validation was demonstrated by conducting experiments with the computerized 4D NCAT phantom and with a dynamic lung-chest phantom imaged using a GE PET/CT system. Experiments were repeated for simulated tumors of different sizes and for different noise levels. Static tumors without respiratory motion were used as the gold standard; quantitative results were compared with respect to tumor activity concentration, cross-correlation coefficient, relative noise level and computation time. Comparing the tumors before and after correction, the corrected tumor activity values and tumor volumes were closer to those of the static tumors (gold standard). Higher correlation values and lower noise were also achieved after applying the correction algorithms. With this method the compromise between short PET scan time and reduced image noise can be achieved, while quantification and clinical analysis become fast and precise.
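
A minimal sketch of the Centroid Based integration idea (illustrative, not the study's implementation): each gated bin is translated so its intensity centroid matches the reference bin, then the aligned bins are summed to recover counts:

```python
import numpy as np
from scipy.ndimage import center_of_mass, shift

def integrate_gated_bins(bins, ref=0):
    """bins: list of 3-D activity arrays, one per respiratory bin.
    Returns the sum of all bins after translating each onto the
    centroid of the reference bin."""
    target = np.array(center_of_mass(bins[ref]))
    out = np.zeros_like(bins[ref], dtype=float)
    for vol in bins:
        offset = target - np.array(center_of_mass(vol))
        out += shift(vol.astype(float), offset, order=1)  # translate
    return out
```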

Relevance:

30.00%

Publisher:

Abstract:

Buffered crossbar switches have recently attracted considerable attention as the next generation of high speed interconnects. They are a special type of crossbar switch with an exclusive buffer at each crosspoint of the crossbar. They demonstrate unique advantages over traditional unbuffered crossbar switches, such as high throughput, low latency, and asynchronous packet scheduling. However, since crosspoint buffers are expensive on-chip memories, it is desired that each crosspoint has only a small buffer. This dissertation proposes a series of practical algorithms and techniques for efficient packet scheduling for buffered crossbar switches. To reduce the hardware cost of such switches and make them scalable, we considered partially buffered crossbars, whose crosspoint buffers can be of an arbitrarily small size. Firstly, we introduced a hybrid scheme called Packet-mode Asynchronous Scheduling Algorithm (PASA) to schedule best effort traffic. PASA combines the features of both distributed and centralized scheduling algorithms and can directly handle variable length packets without Segmentation And Reassembly (SAR). We showed by theoretical analysis that it achieves 100% throughput for any admissible traffic in a crossbar with a speedup of two. Moreover, outputs in PASA have a high probability of avoiding the more time-consuming centralized scheduling process, and can thus make fast scheduling decisions. Secondly, we proposed the Fair Asynchronous Segment Scheduling (FASS) algorithm to handle guaranteed performance traffic with explicit flow rates. FASS reduces the crosspoint buffer size by dividing packets into shorter segments before transmission. It also provides tight constant performance guarantees by emulating the ideal Generalized Processor Sharing (GPS) model. Furthermore, FASS requires no speedup for the crossbar, lowering the hardware cost and improving the switch capacity. Thirdly, we presented a bandwidth allocation scheme called Queue Length Proportional (QLP) to apply FASS to best effort traffic. QLP dynamically obtains a feasible bandwidth allocation matrix based on the queue length information, and thus helps the crossbar switch to be more work-conserving. The feasibility and stability of QLP were proved for both uniform and non-uniform traffic distributions. Hence, based on the bandwidth allocation of QLP, FASS can also achieve 100% throughput for best effort traffic in a crossbar without speedup.
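
A hedged sketch of a QLP-style allocation, assuming the simplest normalization that keeps the rate matrix feasible (all row and column sums at most 1); the dissertation's exact scheme may differ:

```python
import numpy as np

def qlp_allocation(q):
    """q: N x N matrix of crosspoint queue lengths. Returns a rate
    matrix proportional to q, scaled by the largest row or column sum
    so that no input (row) or output (column) exceeds capacity."""
    q = np.asarray(q, dtype=float)
    s = max(q.sum(axis=1).max(), q.sum(axis=0).max())
    return q / s if s > 0 else q

# Example: longer queues get proportionally more bandwidth.
print(qlp_allocation([[3, 1], [0, 2]]))
```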

Relevance:

30.00%

Publisher:

Abstract:

Personalized recommender systems aim to assist users in retrieving and accessing interesting items by automatically acquiring user preferences from the historical data and matching items with the preferences. In the last decade, recommendation services have gained great attention due to the problem of information overload. However, despite recent advances of personalization techniques, several critical issues in modern recommender systems have not been well studied. These issues include: (1) understanding the accessing patterns of users (i.e., how to effectively model users' accessing behaviors); (2) understanding the relations between users and other objects (i.e., how to comprehensively assess the complex correlations between users and entities in recommender systems); and (3) understanding the interest change of users (i.e., how to adaptively capture users' preference drift over time). To meet the needs of users in modern recommender systems, it is imperative to provide solutions to address the aforementioned issues and apply the solutions to real-world applications. The major goal of this dissertation is to provide integrated recommendation approaches to tackle the challenges of the current generation of recommender systems. In particular, three user-oriented aspects of recommendation techniques were studied, including understanding accessing patterns, understanding complex relations and understanding temporal dynamics. To this end, we made three research contributions. First, we presented various personalized user profiling algorithms to capture click behaviors of users from both coarse- and fine-grained granularities; second, we proposed graph-based recommendation models to describe the complex correlations in a recommender system; third, we studied temporal recommendation approaches in order to capture the preference changes of users, by considering both long-term and short-term user profiles. In addition, a versatile recommendation framework was proposed, in which the proposed recommendation techniques were seamlessly integrated. Different evaluation criteria were implemented in this framework for evaluating recommendation techniques in real-world recommendation applications. In summary, the frequent changes of user interests and item repository lead to a series of user-centric challenges that are not well addressed in the current generation of recommender systems. My work proposed reasonable solutions to these challenges and provided insights on how to address these challenges using a simple yet effective recommendation framework.
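
As a labelled assumption (not the dissertation's model), combining long- and short-term profiles can be sketched as an exponentially decayed preference vector blended with a recent-window one:

```python
from collections import Counter

def blended_profile(events, now, half_life=30.0, window=7.0, alpha=0.7):
    """events: list of (timestamp_days, item) interactions.
    Returns item -> score. half_life, window and alpha are
    illustrative parameters, not values from the dissertation."""
    long_term, short_term = Counter(), Counter()
    for t, item in events:
        age = now - t
        long_term[item] += 0.5 ** (age / half_life)   # decayed weight
        if age <= window:
            short_term[item] += 1.0                   # recent interest
    return {item: alpha * short_term[item] + (1 - alpha) * long_term[item]
            for item in set(long_term) | set(short_term)}
```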

Relevance:

30.00%

Publisher:

Abstract:

This paper describes a fast integer sorting algorithm, herein referred to as Bit-index sort, which does not use comparisons and is intended for sorting partial permutations. Experimental results exhibit linear-order complexity in execution time. Bit-index sort uses a bit array to classify input sequences of distinct integers, and exploits built-in bit functions in C compilers, supported by machine hardware, to retrieve the ordered output sequence. Results show that Bit-index sort outperforms the quicksort and counting sort algorithms in execution time. A parallel approach for Bit-index sort using two simultaneous threads is also included, which obtains further speedups of up to 1.6 over the sequential case.
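
The classification step can be sketched in Python, with integer bit operations standing in for the C compiler built-ins (e.g. count-trailing-zeros) that the paper exploits:

```python
def bit_index_sort(keys):
    """Sort distinct non-negative integers without comparisons."""
    bitmap = 0
    for k in keys:
        bitmap |= 1 << k                  # classification: one bit per key
    out = []
    while bitmap:
        low = bitmap & -bitmap            # isolate lowest set bit
        out.append(low.bit_length() - 1)  # its index is the next key
        bitmap ^= low                     # clear it and continue
    return out

print(bit_index_sort([5, 1, 9, 3]))       # [1, 3, 5, 9]
```

In C, the retrieval loop would use a hardware instruction such as count-trailing-zeros per output element, which is what makes the scan over the bit array fast in practice.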

Relevance:

30.00%

Publisher:

Abstract:

In visual surveillance, face detection can be an important cue for initializing tracking algorithms. Recent work in psychophysics hints at the importance of the local context of a face, such as head contours and torso, for robust detection. This paper describes a detector that actively utilizes the idea of local context. The promise is to gain robustness that goes beyond the capabilities of traditional face detection, making it particularly interesting for surveillance. The performance of the proposed detector in terms of accuracy and speed is evaluated on data sets from PETS 2000 and PETS 2003 and compared to the object-centered approach. Particular attention is paid to the role of available image resolution.
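
A hedged sketch of the local-context idea, with purely illustrative expansion factors (not the paper's trained detector): the candidate face box is enlarged to take in head contour and upper body before classification:

```python
def local_context_window(face_box, img_w, img_h,
                         scale_x=2.0, scale_up=1.0, scale_down=4.0):
    """face_box: (x, y, w, h). Returns a larger window around the face
    covering head contour and torso; the expansion factors are
    assumptions for illustration."""
    x, y, w, h = face_box
    cx = x + w / 2
    nx = max(0, int(cx - scale_x * w / 2))              # widen sideways
    ny = max(0, int(y - scale_up * h))                  # extend upward
    nw = min(img_w, int(cx + scale_x * w / 2)) - nx
    nh = min(img_h, int(y + scale_down * h)) - ny       # include torso
    return nx, ny, nw, nh   # crop passed to the context classifier
```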

Relevance:

30.00%

Publisher:

Abstract:

This paper describes an Active Vision System whose design assumes a distinction between fast, or reactive, and slow, or background, processes. Fast processes need to operate in cycles with critical timeouts that may affect system stability, while slow processes, though necessary, do not compromise system stability if their execution is delayed. Based on this simple taxonomy, a control architecture has been proposed, and a prototype implemented that is able to track people in real time with a robotic head while trying to identify the target. In this system, the tracking module is considered the reactive part of the system, while person identification is considered a background task.
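
The fast/slow split can be sketched with two threads (function names such as track_step and identify_person are hypothetical): the reactive loop keeps a hard cycle deadline, while identification runs in the background and may lag without harming stability:

```python
import queue
import threading
import time

frame_q = queue.Queue(maxsize=1)    # latest frame offered to slow task

def reactive_loop(get_frame, track_step, period=0.04):
    """Time-critical cycle: grab a frame, update tracking, keep the
    cycle period. Missing this deadline affects head stability."""
    while True:
        t0 = time.monotonic()
        frame = get_frame()
        track_step(frame)                    # moves the robotic head
        if not frame_q.full():
            frame_q.put_nowait(frame)        # non-blocking hand-off
        time.sleep(max(0.0, period - (time.monotonic() - t0)))

def background_loop(identify_person):
    """Slow task: may take arbitrarily long without destabilizing
    the reactive loop."""
    while True:
        identify_person(frame_q.get())       # blocks until a frame arrives

# threading.Thread(target=background_loop, args=(identify_person,),
#                  daemon=True).start(); reactive_loop(get_frame, track_step)
```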

Relevance:

30.00%

Publisher:

Abstract:

Understanding how biodiversity is spatially distributed over both the short term and the long term, and what factors affect that distribution, is critical for modeling the spatial pattern of biodiversity as well as for promoting effective conservation planning and practice. This dissertation examines factors that influence short-term and long-term avian distributions from the perspective of the geographical sciences. The research develops landscape-level habitat metrics to characterize forest height heterogeneity and examines their efficacy in modelling avian richness at the continental scale. Two types of novel vegetation-height-structured habitat metrics are created, based on second-order texture algorithms and the concepts of patch-based habitat metrics. I correlate the height-structured metrics with the richness of different forest guilds, and also examine their efficacy in multivariate richness models. The results suggest that height heterogeneity, beyond canopy height alone, supplements habitat characterization and richness models of two forest bird guilds. The metrics and models derived in this study demonstrate practical examples of utilizing three-dimensional vegetation data for improved characterization of spatial patterns in species richness. The second and third projects focus on analyzing the centroids of avian distributions and testing hypotheses regarding the direction and speed of their shifts. I first showcase the usefulness of centroid analysis for characterizing the distribution changes of a few case-study species. Applying the centroid method to 57 permanent-resident bird species, I show that multi-directional distribution shifts occurred in a large number of the studied species. I also demonstrate that plains birds are not shifting their distributions faster than mountain birds, contrary to the prediction based on the climate change velocity hypothesis. By modelling the abundance change rate at the regional level, I show that extreme climate events and precipitation measures associate closely with some of the long-term distribution shifts. This dissertation improves our understanding of bird habitat characterization for species richness modelling, and expands our knowledge of how avian populations have shifted their ranges in North America in response to changing environments over the past four decades. The results provide an important scientific foundation for more accurate predictive species distribution modeling in the future.
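
The centroid analysis can be sketched as follows (the names and the flat-earth distance approximation are assumptions; a real analysis would use projected coordinates):

```python
import math

def abundance_centroid(records):
    """records: list of (lat, lon, abundance) survey observations.
    Returns the abundance-weighted centroid of the distribution."""
    w = sum(a for _, _, a in records)
    lat = sum(la * a for la, _, a in records) / w
    lon = sum(lo * a for _, lo, a in records) / w
    return lat, lon

def centroid_shift(c_early, c_late, years):
    """Direction (degrees clockwise from north) and speed (km/year)
    of the centroid shift between two periods, using a small-angle
    approximation of ~111 km per degree."""
    dlat = c_late[0] - c_early[0]
    dlon = c_late[1] - c_early[1]
    bearing = math.degrees(math.atan2(dlon, dlat)) % 360
    dist_km = math.hypot(dlat * 111.0,
                         dlon * 111.0 * math.cos(math.radians(c_early[0])))
    return bearing, dist_km / years
```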

Relevance:

30.00%

Publisher:

Abstract:

Seafood fraud, the misrepresentation of seafood products, has been discovered all around the world in different forms, such as false labeling, species substitution, short-weighting or over-glazing, in order to hide the correct identity, origin or weight of the products. Given the value of seafood products such as canned tuna, swordfish or grouper, the commercial fraud involving these species consists mainly of the replacement of valuable species with species of little or no value. A similar situation occurs with shelled shrimp or shellfish that are cut into pieces for commercialization. Food fraud by species substitution is an emerging risk given the increasingly global food supply chain and the potential food safety issues. Economic food fraud is committed when food is deliberately placed on the market for financial gain, deceiving consumers (Woolfe & Primrose, 2004). As a result of the increased demand and the globalization of the seafood supply, more fish species are encountered in the market. In this scenario, it becomes essential to identify species unequivocally. Traditional taxonomy, based primarily on species identification keys, has shown a number of limitations in the use of distinctive features in many animal taxa, limitations that are amplified when fish, crustaceans or shellfish are commercially processed. Many fish species show a similar texture, so the certification of fish products is particularly important when the fish have undergone procedures which affect the overall anatomical structure, such as heading, slicing or filleting (Marko et al., 2004). The absence of morphological traits, the main characteristic usually used to identify animal species, represents a challenge, and molecular identification methods are required. Among them, DNA-based methods are the most frequently employed for food authentication (Lockley & Bardsley, 2000). In addition to food authentication and traceability, studies of taxonomy, population and conservation genetics, as well as analyses of dietary habits and prey selection, also rely on genetic analyses, including DNA barcoding technology (Arroyave & Stiassny, 2014; Galimberti et al., 2013; Mafra, Ferreira, & Oliveira, 2008; Nicolé et al., 2012; Rasmussen & Morrissey, 2008), which consists of PCR amplification and sequencing of a specific region of the mitochondrial COI gene. The system proposed by Hebert et al. (2003) locates within the mitochondrial COI gene (cytochrome oxidase subunit I) a bio-identification marker useful for the taxonomic identification of species (Lo Brutto et al., 2007). The COI region used for genetic identification, the DNA barcode, is short enough that, with current technology, its sequence (the nucleotide base pairs) can be decoded in a single step. Although this region represents only a tiny fraction of the mitochondrial DNA content of each cell, it has sufficient variability to distinguish the majority of species (Biondo et al., 2016). This technique has already been employed to assess the actual identity and/or provenance of marketed products, as well as to unmask mislabelling and fraudulent substitutions, which are difficult to detect especially in manufactured seafood (Barbuto et al., 2010; Galimberti et al., 2013; Filonzi, Chiesa, Vaghi, & Nonnis Marzano, 2010).
Nowadays, research concerns the use of genetic markers to identify not only the species and/or varieties of fish, but also molecular characters able to trace their origin, providing an effective control tool for producers and consumers along the supply chain, in agreement with local regulations.
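
A toy, alignment-free sketch of barcode-based identification (real pipelines align the query against curated references such as BOLD): a query COI sequence is matched to the reference with the highest pairwise identity, with the threshold an illustrative assumption:

```python
def pairwise_identity(a, b):
    """Fraction of matching positions over the shorter sequence
    (toy metric; real pipelines use proper alignment)."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a[:n], b[:n])) / n

def identify_species(query, references, threshold=0.97):
    """references: dict of species_name -> COI reference sequence.
    Returns (best_match, identity), or (None, identity) below the
    threshold, i.e. no confident species-level assignment."""
    best = max(references,
               key=lambda s: pairwise_identity(query, references[s]))
    score = pairwise_identity(query, references[best])
    return (best, score) if score >= threshold else (None, score)
```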