949 results for Processing Time


Relevance:

60.00%

Publisher:

Abstract:

Parallel computing offers a number of advantages for running large-scale applications, and the effective use of parallel computational resources is a relevant aspect of high-performance computing. This work presents a methodology that provides the automated execution of parallel applications based on the BSP model with heterogeneous tasks. The adopted model assumes that the computation time of each secondary task does not vary greatly from one iteration to the next. The methodology, called ASE, comprises three stages: Acquisition, Scheduling and Execution. In the Acquisition stage, the processing times of the tasks are obtained; in the Scheduling stage, the methodology seeks the task distribution that maximizes the execution speed of the parallel application while minimizing resource usage, using an algorithm developed in this work; finally, the Execution stage runs the parallel application with the distribution defined in the previous stage. Tools applied in the methodology were implemented. A set of tests applying the methodology was carried out, and the results show that the objectives of the proposal were achieved.
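To make the Scheduling stage concrete, the sketch below shows one way measured task times could be distributed: a longest-processing-time greedy assignment, plus a stopping rule that only adds workers while the makespan still improves noticeably. This is a generic illustration under assumed names and thresholds, not the algorithm developed in the thesis.

# Illustrative sketch only: LPT-style greedy assignment of measured task times
# to workers; names and the stopping rule are assumptions, not ASE's algorithm.
import heapq

def lpt_schedule(task_times, n_workers):
    """Assign tasks to n_workers, placing the longest task on the least-loaded worker."""
    loads = [(0.0, w, []) for w in range(n_workers)]
    heapq.heapify(loads)
    for t in sorted(task_times, reverse=True):
        load, w, tasks = heapq.heappop(loads)
        heapq.heappush(loads, (load + t, w, tasks + [t]))
    makespan = max(load for load, _, _ in loads)
    return makespan, loads

def choose_workers(task_times, max_workers, min_gain=0.05):
    """Add workers only while the makespan improves by at least min_gain (resource-aware)."""
    best_n, best_makespan = 1, sum(task_times)
    for n in range(2, max_workers + 1):
        makespan, _ = lpt_schedule(task_times, n)
        if (best_makespan - makespan) / best_makespan < min_gain:
            break
        best_n, best_makespan = n, makespan
    return best_n, best_makespan

if __name__ == "__main__":
    measured = [4.2, 3.9, 7.5, 1.1, 2.8, 6.0, 0.9, 5.3]  # times from the Acquisition stage
    print(choose_workers(measured, max_workers=8))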

Relevance:

60.00%

Publisher:

Abstract:

The rapid evolution of hardware demands a continuous evolution of compilers. Compiler designers must carry out a tuning process to ensure that the code generated by the compiler maintains a certain quality, whether in terms of processing time or of another predefined characteristic. This work aimed to automate the compiler tuning process using machine learning techniques. As a result, the compilation plans obtained using machine learning with the proposed features produced code whose execution times approached those obtained with the default plan used by LLVM.
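As a point of reference for what "tuning" means here, the sketch below times one benchmark compiled at the standard clang/LLVM optimization levels and reports the fastest plan; the machine learning approach described above is meant to replace this kind of exhaustive search with a learned prediction. The benchmark file name is an assumption.

# Illustrative sketch only: brute-force timing of a benchmark at standard clang
# optimization levels; "benchmark.c" is a placeholder, not from the work above.
import subprocess, time

CANDIDATE_PLANS = ["-O0", "-O1", "-O2", "-O3", "-Os"]  # standard clang/LLVM levels

def time_plan(source, plan, runs=3):
    subprocess.run(["clang", plan, source, "-o", "bench"], check=True)
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(["./bench"], check=True)
        best = min(best, time.perf_counter() - start)
    return best

if __name__ == "__main__":
    results = {plan: time_plan("benchmark.c", plan) for plan in CANDIDATE_PLANS}
    print(min(results.items(), key=lambda kv: kv[1]))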

Relevance:

60.00%

Publisher:

Abstract:

Biotic indices have been developed to summarise the information provided by benthic macroinvertebrates, but their use can require specialised taxonomic expertise as well as time-consuming sample processing. Using a higher taxonomic level in biotic indices reduces sample processing time, but this should be applied with caution, since assigning tolerance levels to higher taxonomic levels may introduce uncertainty. A methodology for family-level tolerance categorisation, based on the affinity of each family with disturbed or undisturbed conditions, was employed. This family tolerance classification approach was tested in two different areas of the Mediterranean Sea affected by sewage discharges. Biotic indices employed at the family level responded correctly to the presence of sewage. However, in areas with different communities among stations and a high diversity of species within each family, assigning the same tolerance level to a whole family could lead to errors. Thus, the use of higher taxonomic levels in biotic indices should be restricted to areas where the community is homogeneous and families have a similar species composition across sites.
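For readers unfamiliar with how such indices are computed, the sketch below evaluates an AMBI-style biotic coefficient from family-level abundances once each family has been assigned to an ecological group; the group assignments shown are illustrative assumptions rather than the paper's classification.

# Illustrative sketch only: an AMBI-style biotic coefficient from family-level
# abundances, with families assigned to five ecological groups (I = sensitive
# ... V = first-order opportunist). Assignments below are hypothetical examples.
GROUP_WEIGHTS = {"I": 0.0, "II": 1.5, "III": 3.0, "IV": 4.5, "V": 6.0}

def biotic_coefficient(abundances, family_groups):
    """abundances: {family: count}; family_groups: {family: ecological group}."""
    total = sum(abundances.values())
    if total == 0:
        return None  # azoic sample, handled separately in AMBI-type indices
    score = sum(GROUP_WEIGHTS[family_groups[f]] * n for f, n in abundances.items())
    return score / total

sample = {"Capitellidae": 120, "Spionidae": 40, "Ampeliscidae": 15}
groups = {"Capitellidae": "V", "Spionidae": "III", "Ampeliscidae": "I"}
print(round(biotic_coefficient(sample, groups), 2))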

Relevance:

60.00%

Publisher:

Abstract:

Paper submitted to the 43rd International Symposium on Robotics (ISR), Taipei, Taiwan, August 29-31, 2012.

Relevance:

60.00%

Publisher:

Abstract:

The delineation of functional economic areas, or market areas, is a problem of high practical relevance, since the delineation of functional sets such as economic areas in the US, Travel-to-Work Areas in the United Kingdom, and their counterparts in other OECD countries is the basis of many statistical operations and policy-making decisions at the local level. This is a combinatorial optimisation problem, defined as the partition of a given set of indivisible spatial units (covering a territory) into regions characterised by being (a) self-contained and (b) cohesive in terms of spatial interaction data (flows, relationships). Usually, each region must reach a minimum size and self-containment level, and must be continuous. Although these optimisation problems have typically been solved through greedy methods, a recent strand of the literature in this field has been concerned with the use of evolutionary algorithms with ad hoc operators. Although these algorithms have proved successful in improving the results of some of the more widely applied official procedures, they are so time-consuming that they cannot be applied directly to solve real-world problems. In this paper we propose a new set of group-based mutation operators, featuring general operations over disjoint groups, tailored to ensure that all the constraints are respected during the operation, which improves efficiency. A comparative analysis of our results with those from previous approaches shows that the proposed algorithm systematically improves on them in terms of both quality and processing time, which is of crucial relevance since it allows most large, real-world problems to be dealt with in reasonable time.
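As an illustration of what a constraint-aware, group-based mutation can look like, the sketch below reassigns a boundary spatial unit to a neighbouring region only if the donor region keeps its minimum size and remains connected. It is a generic operator in the spirit of the paper, not its exact operator set.

# Illustrative sketch only: one mutation for a regionalisation chromosome encoded
# as {unit: region}, rejecting moves that break size or contiguity constraints.
import random
from collections import deque

def is_connected(units, adjacency):
    units = set(units)
    if not units:
        return False
    start = next(iter(units))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in adjacency[u]:
            if v in units and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen == units

def boundary_move_mutation(assignment, adjacency, min_size=2):
    units = list(assignment)
    random.shuffle(units)
    for u in units:
        donor = assignment[u]
        neighbour_regions = {assignment[v] for v in adjacency[u]} - {donor}
        if not neighbour_regions:
            continue  # u is interior to its region
        donor_rest = [w for w, r in assignment.items() if r == donor and w != u]
        if len(donor_rest) < min_size or not is_connected(donor_rest, adjacency):
            continue  # move would violate the size or contiguity constraint
        child = dict(assignment)
        child[u] = random.choice(sorted(neighbour_regions))
        return child
    return dict(assignment)  # no feasible move found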

Relevance:

60.00%

Publisher:

Abstract:

Event-based visual servoing is a recently presented approach that performs the positioning of a robot using visual information only when it is required. Building on the classical image-based visual servoing control law, the scheme proposed in this paper can reduce the processing time of each loop iteration under certain conditions. The proposed control method comes into action when an event deactivates the classical image-based controller (i.e., when no image is available to track the visual features). A virtual camera is then moved along a straight-line path towards the desired position. The virtual path used to guide the robot improves the behavior of the previous event-based visual servoing proposal.
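For context, the classical image-based law that the event-based scheme builds on computes the camera velocity from the feature error through the pseudo-inverse of the interaction matrix; a minimal sketch for point features follows (the event-based switching and virtual-camera path are not reproduced here).

# Illustrative sketch only: classical IBVS law v = -lambda * L^+ (s - s*), with the
# standard interaction matrix for point features at estimated depths Z.
import numpy as np

def interaction_matrix(points, depths):
    """Stack the 2x6 interaction matrix of each normalised image point (x, y) at depth Z."""
    rows = []
    for (x, y), Z in zip(points, depths):
        rows.append([-1 / Z, 0, x / Z, x * y, -(1 + x**2), y])
        rows.append([0, -1 / Z, y / Z, 1 + y**2, -x * y, -x])
    return np.array(rows)

def ibvs_velocity(s, s_star, depths, gain=0.5):
    """Camera twist (vx, vy, vz, wx, wy, wz) driving the feature error to zero."""
    L = interaction_matrix(s, depths)
    error = (np.asarray(s) - np.asarray(s_star)).reshape(-1)
    return -gain * np.linalg.pinv(L) @ error

s      = [(0.10, 0.05), (-0.12, 0.04), (0.08, -0.11), (-0.09, -0.07)]
s_star = [(0.05, 0.05), (-0.05, 0.05), (0.05, -0.05), (-0.05, -0.05)]
print(ibvs_velocity(s, s_star, depths=[1.0] * 4))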

Relevance:

60.00%

Publisher:

Abstract:

There are a large number of image processing applications that work with different performance requirements and available resources. Recent advances in image compression focus on reducing image size and processing time, but offer no real-time solutions for adjusting the time/quality trade-off of the resulting image, as is needed, for example, when transmitting the image content of web pages. In this paper we propose a method for encoding still images, based on the JPEG standard, that allows the compression/decompression time cost and image quality to be adjusted to the needs of each application and to the bandwidth conditions of the network. The real-time control is based on a collection of adjustable parameters relating both to aspects of the implementation and to the hardware on which the algorithm runs. The proposed encoding system is evaluated in terms of compression ratio, processing delay and quality of the compressed image, and compared with the standard method.
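A minimal illustration of this kind of adjustability, using Pillow and assuming a hypothetical file name and byte budget, is to lower the JPEG quality factor until the encoded image fits the budget; the paper's parameter set is richer, covering time cost and hardware aspects as well.

# Illustrative sketch only: lower the JPEG quality factor until the output fits a
# byte budget. "page_image.png" and the budget are placeholders, not from the paper.
import io
from PIL import Image

def encode_within_budget(path, max_bytes):
    img = Image.open(path).convert("RGB")
    for q in range(95, 9, -5):          # try progressively cheaper settings
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=q)
        if buf.tell() <= max_bytes:
            break
    return q, buf.getvalue()

quality, data = encode_within_budget("page_image.png", max_bytes=40_000)
print(quality, len(data))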

Relevance:

60.00%

Publisher:

Abstract:

Since the early days of 3D computer vision, it has been necessary to use techniques that reduce the data to a tractable size while preserving the important aspects of the scene. Currently, with the new low-cost RGB-D sensors, which provide a stream of color and 3D data at approximately 30 frames per second, this is becoming even more relevant. Many applications make use of these sensors and need a preprocessing step that downsamples the data in order to either reduce the processing time or improve the data (e.g., by reducing noise or enhancing the important features). In this paper, we present a comparison of downsampling techniques based on different principles. Concretely, five downsampling methods are included: a bilinear-based method, a normal-based method, a color-based method, a combination of the normal- and color-based samplings, and a growing neural gas (GNG)-based approach. For the comparison, two different models acquired with the Blensor software have been used. Moreover, to evaluate the effect of the downsampling in a real application, a 3D non-rigid registration is performed with the sampled data. From the experimentation we can conclude that, depending on the purpose of the application, some of the sampling kernels can drastically improve the results. The bilinear- and GNG-based methods provide homogeneous point clouds, whereas the color- and normal-based methods provide datasets with a higher density of points in areas with specific features. In the non-rigid registration, if a color-based sampled point cloud is used, it is possible to properly register two datasets in cases where intensity data are relevant in the model, and to outperform the results obtained with a purely homogeneous sampling.
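As a baseline for what "homogeneous" downsampling means here, the sketch below implements a simple voxel-grid reduction with NumPy; the paper's own methods additionally exploit normals, colour and a GNG network.

# Illustrative sketch only: voxel-grid downsampling of an N x 3 point cloud, keeping
# the centroid of each occupied voxel. A generic baseline, not the paper's methods.
import numpy as np

def voxel_downsample(points, voxel_size):
    points = np.asarray(points, dtype=float)
    keys = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel key and average each group.
    _, inverse, counts = np.unique(keys, axis=0, return_inverse=True, return_counts=True)
    inverse = inverse.ravel()
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, points)
    return sums / counts[:, None]

cloud = np.random.rand(10_000, 3)
print(voxel_downsample(cloud, voxel_size=0.1).shape)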

Relevance:

60.00%

Publisher:

Abstract:

Many variables of interest in social science research are nominal variables with two or more categories, such as employment status, occupation, political preference, or self-reported health status. With longitudinal survey data it is possible to analyse the transitions of individuals between different employment states or occupations, for example. In the statistical literature, models for analysing categorical dependent variables with repeated observations belong to the family of models known as generalized linear mixed models (GLMMs). The specific GLMM for a dependent variable with three or more categories is the multinomial logit random effects model. For these models, the marginal distribution of the response does not have a closed-form solution, and hence numerical integration must be used to obtain maximum likelihood estimates of the model parameters. Techniques for implementing the numerical integration are available, but they are computationally intensive, requiring an amount of processing time that increases with the number of clusters (or individuals) in the data, and they are not always readily accessible to the practitioner in standard software. For the purpose of analysing categorical response data from a longitudinal social survey, there is clearly a need to evaluate the existing procedures for estimating multinomial logit random effects models in terms of accuracy, efficiency and computing time. The computational time has significant implications for the approach preferred by researchers. In this paper we evaluate statistical software procedures that utilise adaptive Gaussian quadrature and MCMC methods, with specific application to modeling the employment status of women over three waves of the HILDA survey using a GLMM.
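To make the integration problem concrete, the sketch below approximates the marginal likelihood of a single cluster in a random-intercept logit model with ordinary Gauss-Hermite quadrature; adaptive quadrature refines the node placement, and extending the linear predictor to several categories gives the multinomial case. All data and parameter values are made up.

# Illustrative sketch only: non-adaptive Gauss-Hermite quadrature for one cluster's
# marginal likelihood under a N(0, sigma^2) random intercept; synthetic data.
import numpy as np
from numpy.polynomial.hermite import hermgauss

def cluster_likelihood(y, x, beta, sigma, n_nodes=15):
    """Integrate the cluster's conditional likelihood over the random intercept."""
    nodes, weights = hermgauss(n_nodes)            # nodes/weights for exp(-t^2)
    b = np.sqrt(2.0) * sigma * nodes               # change of variable t -> b
    eta = x @ beta + b[:, None]                    # n_nodes x n_obs linear predictors
    p = 1.0 / (1.0 + np.exp(-eta))
    cond = np.prod(np.where(y == 1, p, 1.0 - p), axis=1)
    return (weights @ cond) / np.sqrt(np.pi)

rng = np.random.default_rng(0)
x = np.column_stack([np.ones(6), rng.normal(size=6)])   # intercept + one covariate
y = rng.integers(0, 2, size=6)
print(cluster_likelihood(y, x, beta=np.array([0.2, -0.5]), sigma=1.0))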

Relevance:

60.00%

Publisher:

Abstract:

Digital image processing is exploited in many diverse applications, but the size of digital images places excessive demands on current storage and transmission technology. Image data compression is required to permit further use of digital image processing. Conventional image compression techniques based on statistical analysis have reached a saturation level, so it is necessary to explore more radical methods. This thesis is concerned with novel methods, based on the use of fractals, for achieving significant compression of image data within reasonable processing time without introducing excessive distortion. Images are modelled as fractal data and this model is exploited directly by compression schemes. The validity of this approach is demonstrated by showing that the fractal complexity measure of fractal dimension is an excellent predictor of image compressibility. A method of fractal waveform coding is developed which has low computational demands and performs better than conventional waveform coding methods such as PCM and DPCM. Fractal techniques based on the use of space-filling curves are developed as a mechanism for the hierarchical application of conventional techniques. Two particular applications are highlighted: the re-ordering of data during image scanning and the mapping of multi-dimensional data to one dimension. It is shown that there are many possible space-filling curves that may be used to scan images and that the selection of an optimum curve leads to significantly improved data compression. The multi-dimensional mapping property of space-filling curves is used to substantially speed up the lookup process in vector quantisation. Iterated function systems are compared with vector quantisers, and the computational complexity of iterated function system encoding is also reduced by using the efficient matching algorithms identified for vector quantisers.
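For the image-scanning application, the sketch below gives the standard Hilbert-curve index-to-coordinate mapping, i.e. one of the space-filling scan orders of the kind investigated in the thesis (assuming a power-of-two image side).

# Illustrative sketch only: standard Hilbert-curve distance-to-coordinate mapping,
# usable to re-order pixels into a space-filling scan; n must be a power of two.
def hilbert_d2xy(n, d):
    """Convert a distance d along the Hilbert curve into (x, y) on an n x n grid."""
    x = y = 0
    s, t = 1, d
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate/flip the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

scan_order = [hilbert_d2xy(8, d) for d in range(8 * 8)]  # pixel visiting order for an 8x8 image
print(scan_order[:8])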

Relevance:

60.00%

Publisher:

Abstract:

Tonal, textural and contextual properties are used in the manual photointerpretation of remotely sensed data. This study has used these three attributes to produce a lithological map of semi-arid northwest Argentina by semi-automatic computer classification of remotely sensed data. Three different types of satellite data were investigated: LANDSAT MSS, TM and SIR-A imagery. Supervised classification procedures using tonal features only produced poor results. LANDSAT MSS gave classification accuracies in the range of 40 to 60%, while accuracies of 50 to 70% were achieved using LANDSAT TM data. The addition of SIR-A data increased the classification accuracy. The improved accuracy of TM over MSS is due to the better discrimination of geological materials afforded by the mid-infrared bands of the TM sensor. The maximum likelihood classifier consistently produced classification accuracies 10 to 15% higher than either the minimum-distance-to-means or the decision tree classifier; this improved accuracy was obtained at the cost of greatly increased processing time. A new type of classifier, the spectral shape classifier, which is computationally as fast as a minimum-distance-to-means classifier, is described. However, the results for this classifier were disappointing, being lower in most cases than those of the minimum distance or decision tree procedures. The classification results using only tonal features were felt to be unacceptably poor, so textural attributes were investigated. Texture is an important attribute used by photogeologists to discriminate lithology. In the case of TM data, texture measures were found to increase the classification accuracy by up to 15%; in the case of the LANDSAT MSS data, however, texture measures did not provide any significant increase in accuracy. For TM data, second-order texture, especially the SGLDM-based measures, produced the highest classification accuracy. Contextual post-processing was found to increase classification accuracy and to improve the visual appearance of the classified output by removing isolated misclassified pixels, which tend to clutter classified images. Simple contextual features, such as mode filters, were found to outperform more complex features such as gravitational filters or minimal-area-replacement methods. Generally, the larger the filter, the greater the increase in accuracy. Production rules were used to build a knowledge-based system which used tonal and textural features to identify sedimentary lithologies in each of the two test sites. The knowledge-based system was able to identify six out of ten lithologies correctly.
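As a reminder of how the best-performing classifier above works, the sketch below fits a per-class Gaussian model to training pixels and assigns each new pixel to the class with the highest log-likelihood; the training data here are synthetic placeholders, not the Argentine imagery.

# Illustrative sketch only: a Gaussian maximum likelihood classifier for
# multispectral pixels; class names and training samples are synthetic.
import numpy as np

def fit_gaussians(X_by_class):
    """Estimate mean and covariance for each class from its training pixels."""
    return {c: (X.mean(axis=0), np.cov(X, rowvar=False)) for c, X in X_by_class.items()}

def classify(pixels, params):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    scores = []
    for c, (mu, cov) in sorted(params.items()):
        inv, (_, logdet) = np.linalg.inv(cov), np.linalg.slogdet(cov)
        diff = pixels - mu
        scores.append(-0.5 * (logdet + np.einsum("ij,jk,ik->i", diff, inv, diff)))
    labels = sorted(params)
    return [labels[i] for i in np.argmax(np.stack(scores, axis=1), axis=1)]

rng = np.random.default_rng(1)
train = {"shale": rng.normal(50, 5, (200, 4)), "sandstone": rng.normal(80, 8, (200, 4))}
params = fit_gaussians(train)
print(classify(rng.normal(78, 8, (5, 4)), params))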

Relevance:

60.00%

Publisher:

Abstract:

Conventionally, biometric resources such as face, gait silhouette, footprint, and pressure have been utilized in gender recognition systems. However, the acquisition and processing time of these biometric data makes the analysis difficult. This letter demonstrates for the first time how effective footwear appearance is as a biometric resource for gender recognition. A footwear database with representative shoes is also established. Preliminary experimental results suggest that footwear appearance is a promising resource for gender recognition. Moreover, it also has the potential to be used jointly with other previously developed biometric resources to boost performance.

Relevance:

60.00%

Publisher:

Abstract:

We address the important bioinformatics problem of predicting protein function from a protein's primary sequence. We consider the functional classification of G-Protein-Coupled Receptors (GPCRs), whose functions are specified in a class hierarchy. We tackle this task using a novel top-down hierarchical classification system in which, for each node in the class hierarchy, the predictor attributes to be used at that node and the classifier to be applied to the selected attributes are chosen in a data-driven manner. Compared with a previous hierarchical classification system that selects classifiers only, our new system significantly reduced processing time without significantly sacrificing predictive accuracy.
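The sketch below illustrates, for a single node of a class hierarchy, a data-driven choice of both the number of attributes to keep and the classifier to apply, using cross-validation with scikit-learn; the candidate classifiers and feature counts are assumptions rather than the paper's configuration.

# Illustrative sketch only: per-node selection of (feature count, classifier) by
# cross-validated accuracy; candidates and data are placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

def select_for_node(X, y, ks=(10, 20, 40), classifiers=(GaussianNB(), KNeighborsClassifier())):
    """Return the (k, classifier) pair with the best cross-validated accuracy at this node."""
    best = (None, None, -np.inf)
    for k in ks:
        for clf in classifiers:
            pipe = make_pipeline(SelectKBest(f_classif, k=k), clf)
            score = cross_val_score(pipe, X, y, cv=5).mean()
            if score > best[2]:
                best = (k, clf, score)
    return best

X, y = make_classification(n_samples=300, n_features=60, n_informative=15, random_state=0)
print(select_for_node(X, y))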

Relevance:

60.00%

Publisher:

Abstract:

The production of agricultural and horticultural products requires the use of nitrogenous fertiliser, which can cause pollution of surface and ground water and has a large carbon footprint, as it is mainly produced from fossil fuels. The overall objective of this research project was to investigate fast pyrolysis and in-situ nitrogenolysis of biomass and biogenic residues as an alternative route to produce a sustainable solid slow-release fertiliser, mitigating the problems stated above. A variety of biomasses and biogenic residues were characterised by proximate analysis, ultimate analysis, thermogravimetric analysis (TGA) and pyrolysis–gas chromatography–mass spectrometry (Py-GC-MS) for their potential use as feedstocks, using beech wood as a reference material. Beech wood was virtually nitrogen free and therefore suitable as a reference material, since added nitrogen can be identified as such, while Dried Distillers Grains with Solubles (DDGS) and rape meal had nitrogen contents between 5.5 wt.% and 6.1 wt.%, qualifying them as high-nitrogen feedstocks. Fast pyrolysis and in-situ nitrogenolysis experiments were carried out in a continuously fed 1 kg/h bubbling fluidised bed reactor at around 500°C, quenching the pyrolysis vapours with isoparaffin. In-situ nitrogenolysis experiments were performed by adding ammonia gas to the fast pyrolysis reactor at nominal nitrogen addition rates between 5 wt.%C and 20 wt.%C on a dry-feedstock carbon content basis. Mass balances were established for the processing experiments. The fast pyrolysis and in-situ nitrogenolysis products were characterised by proximate analysis, ultimate analysis and GC-MS. High liquid yields and good mass balance closures of over 92% were obtained. The most suitable nitrogen addition rate for the in-situ nitrogenolysis experiments was determined to be 12 wt.%C on a dry-feedstock carbon content basis. However, only a few of the nitrogen compounds formed during in-situ nitrogenolysis could be identified by GC-MS. A batch reactor process was developed to thermally solidify the fast pyrolysis and in-situ nitrogenolysis liquids of beech wood and Barley DDGS, producing a brittle solid product. This was obtained at 150°C, with an addition of 2.5 wt.% char (as catalyst), after a processing time of 1 h. The batch reactor was also used for modifying and solidifying fast pyrolysis liquids derived from beech wood by adding urea or ammonium phosphate as a post-processing nitrogenolysis. The results showed that this type of combined approach was not suitable for producing a slow-release fertiliser, because the solid product contained up to 65 wt.% of highly water-soluble nitrogen compounds that would be released instantly by rain. To complement the processing experiments, a comparative study via Py-GC-MS with inert and reactive gas was performed with cellulose, hemicellulose, lignin and beech wood. This revealed that the presence of ammonia gas during analytical pyrolysis did not appear to have any direct impact on the decomposition products of the tested materials. The chromatograms obtained showed almost no differences between the inert and ammonia gas experiments, indicating that the reaction between ammonia and the pyrolysis vapours does not occur instantly. A comparative study via Fourier transform infrared spectroscopy of the solidified fast pyrolysis and in-situ nitrogenolysis products showed some alterations in the spectra obtained: a shift from C=O stretch frequencies typically associated with carboxylic acids to C=O stretches associated with amides was observed, and no double- or triple-bonded nitrogen was detected. This indicates that organic acids reacted with ammonia and that no potentially harmful or non-biodegradable triple-bonded nitrogen compounds were formed. The impact of the solid slow-release fertiliser (SRF) derived from pyrolysis and in-situ nitrogenolysis products of beech wood and Barley DDGS on microbial life in soils and on plant growth was tested in cooperation with Rothamsted Research. The microbial incubation tests indicated that microbes can thrive on the SRFs produced, although some microbial species seem to have reduced activity at very high concentrations of beech wood and Barley DDGS derived SRF. The plant tests (pot trials) showed that the application of SRF derived from beech wood and Barley DDGS had no negative impact on the germination or growth of rye grass. The fertilising effect was demonstrated by the dry matter yields in three harvests after 47, 89 and 131 days. The findings of this research indicate that, in general, a slow-release fertiliser can be produced from biomass and biogenic residues by in-situ nitrogenolysis. Nevertheless, the findings also show that additional research is necessary to identify which compounds are formed during this process.

Relevance:

60.00%

Publisher:

Abstract:

Contradiction is a cornerstone of human rationality, essential for everyday life and communication. We investigated electroencephalographic (EEG) and functional magnetic resonance imaging (fMRI) responses, recorded in separate sessions, during contradictory judgments, using a logical structure based on categorical propositions of the Aristotelian Square of Opposition (ASoO). The use of ASoO propositions, while controlling for potential linguistic or semantic confounds, enabled us to observe the spatio-temporal unfolding of this contradictory reasoning. The processing started with the inversion of the logical operators, corresponding to right middle frontal gyrus (rMFG, BA11) activation, followed by the identification of the contradictory statement, associated with right inferior frontal gyrus (rIFG, BA47) activation. The right medial frontal gyrus (rMeFG, BA10) and the anterior cingulate cortex (ACC, BA32) contributed to the later stages of the process. We observed a correlation between the delayed latency of the rBA11 response and the reaction time delay during inductive vs. deductive reasoning. This supports the notion that rBA11 is crucial for manipulating the logical operators. Slower processing times and stronger brain responses for inductive logic suggest that examples are easier to process than general principles and are more likely to simplify communication. © 2014 Porcaro et al.