949 results for Processing time
Abstract:
We develop efficient techniques for the non-rigid registration of medical images by using representations that adapt to the anatomy found in such images. Images of anatomical structures typically have uniform intensity interiors and smooth boundaries. We create methods to represent such regions compactly using tetrahedra. Unlike voxel-based representations, tetrahedra can accurately describe the expected smooth surfaces of medical objects. Furthermore, the interior of such objects can be represented using a small number of tetrahedra. Rather than describing a medical object using tens of thousands of voxels, our representations generally contain only a few thousand elements. Tetrahedra facilitate the creation of efficient non-rigid registration algorithms based on finite element methods (FEM). We create a fast, FEM-based method to non-rigidly register segmented anatomical structures from two subjects. Using our compact tetrahedral representations, this method generally requires less than one minute of processing time on a desktop PC. We also create a novel method for the non-rigid registration of gray scale images. To facilitate a fast method, we create a tetrahedral representation of a displacement field that automatically adapts to both the anatomy in an image and to the displacement field. The resulting algorithm has a computational cost that is dominated by the number of nodes in the mesh (about 10,000), rather than the number of voxels in an image (nearly 10,000,000). For many non-rigid registration problems, we can find a transformation from one image to another in five minutes. This speed is important as it allows use of the algorithm during surgery. We apply our algorithms to find correlations between the shape of anatomical structures and the presence of schizophrenia. We show that a study based on our representations outperforms studies based on other representations. 
We also use the results of our non-rigid registration algorithm as the basis of a segmentation algorithm. That algorithm also outperforms other methods in our tests, producing smoother segmentations and more accurately reproducing manual segmentations.
Abstract:
Biological sludge is produced in urban and industrial wastewater treatment plants. Sludge treatment and management is one of the most important problems in the field of wastewater treatment. This situation is expected to worsen in the future, owing to an increase in the volume of sludge produced, driven both by the demand for higher levels of purification and by the growing number of treatment plants in operation. Moreover, the limitations of the traditional sludge management options make it necessary to seek innovative and effective solutions to the problem posed by the management of this biological sludge. Biological sludge is carbonaceous in nature and has a high organic matter content. These characteristics allow the sludge to be converted into a carbonaceous adsorbent solid. This conversion offers the double benefit of reducing the volume of sludge to be managed while producing an adsorbent at a lower cost than conventional adsorbents (commercial activated carbons). To date, high-temperature treatments have proven effective in transforming surplus biological sludge into a carbonaceous adsorbent solid (activated carbon). As an alternative to these high-temperature processes, a new process is proposed for obtaining a carbonaceous adsorbent solid from surplus biological sludge by means of a low-temperature treatment that combines microwave treatment with the addition of a chemical reagent (H2SO4). This thesis analyses the treatment of surplus biological sludge using microwave treatment and the addition of sulphuric acid (H2SO4), and at the same time examines the possibility of using the adsorbent solids obtained to improve the quality of wastewater. 
Operating parameters such as the amount of sulphuric acid added to the sludge, the power level of the microwave oven and the treatment time were varied in order to determine their influence on the quality of the adsorbent solid. Once the quality of the different adsorbent solids had been determined, their capacity to remove dye and metals in the liquid phase was evaluated. The results obtained are compared with those of a sludge-derived activated carbon and a commercial activated carbon.
Abstract:
The accurate prediction of storms is vital to the oil and gas sector for the management of its operations. An overview of research exploring the prediction of storms by ensemble prediction systems is presented and its application to the oil and gas sector is discussed. The analysis method used requires larger amounts of data storage and computer processing time than other, more conventional analysis methods. To overcome these difficulties, eScience techniques have been utilised. These techniques potentially have applications in the oil and gas sector, helping to incorporate environmental data into its information systems.
Abstract:
Uncertainties associated with the representation of various physical processes in global climate models (GCMs) mean that, when projections from GCMs are used in climate change impact studies, the uncertainty propagates through to the impact estimates. A complete treatment of this ‘climate model structural uncertainty’ is necessary so that decision-makers are presented with an uncertainty range around the impact estimates. This uncertainty is often underexplored owing to the human and computer processing time required to perform the numerous simulations. Here, we present a 189-member ensemble of global river runoff and water resource stress simulations that adequately address this uncertainty. Following several adaptations and modifications, the ensemble creation time has been reduced from 750 h on a typical single-processor personal computer to 9 h of high-throughput computing on the University of Reading Campus Grid. Here, we outline the changes that had to be made to the hydrological impacts model and to the Campus Grid, and present the main results. We show that, although there is considerable uncertainty in both the magnitude and the sign of regional runoff changes across different GCMs with climate change, there is much less uncertainty in runoff changes for regions that experience large runoff increases (e.g. the high northern latitudes and Central Asia) and large runoff decreases (e.g. the Mediterranean). Furthermore, there is consensus that the percentage of the global population at risk of water resource stress will increase with climate change.
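The ensemble members are mutually independent, which is what makes high-throughput computing effective here: wall-clock time scales roughly as the member count divided by the number of workers. A minimal Python sketch of the pattern (the function names and the placeholder workload are illustrative, not the paper's hydrological model):

```python
from concurrent.futures import ThreadPoolExecutor

def run_member(member_id):
    """Stand-in for one global hydrological model run; the real run is
    hours long, while this placeholder just returns a dummy statistic."""
    return member_id, sum(i * i for i in range(1000)) % 97

def run_ensemble(n_members=189, workers=4):
    """Run all ensemble members independently. Because members share no
    state, they can be farmed out to any pool of workers -- the same
    embarrassingly parallel structure the Campus Grid exploits."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(run_member, range(n_members)))
```

Threads are used here only for brevity; a campus grid or a process pool applies the identical map-over-members structure.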
Abstract:
Pull pipelining, a pipeline technique in which data is pulled by successor stages from predecessor stages, is proposed. Control circuits using a synchronous, a semi-synchronous and an asynchronous approach are given. Simulation examples for a DLX generic RISC datapath show that the common control-pipeline circuit overhead is avoided using the proposal. Applications to linear systolic arrays, in cases where computation finishes at early stages in the array, are foreseen. This would allow run-time, data-driven digital frequency modulation of synchronous pipelined designs, which has applications in implementing algorithms that exhibit average-case processing time using a synchronous approach.
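The pull discipline can be illustrated in software with Python generators, where each stage draws items from its predecessor only when its own consumer demands output; this is only an analogy for the hardware control circuits above, with illustrative stage names:

```python
def source(data):
    """First stage: yields raw items only when asked."""
    for item in data:
        yield item

def double(prev):
    """Middle stage: pulls one item from its predecessor per request."""
    for item in prev:
        yield item * 2

def limit(prev, n):
    """Final stage: pulls exactly n items, so predecessor stages never
    compute more than the successors actually demand."""
    for _, item in zip(range(n), prev):
        yield item

# Build the pipeline back to front; nothing runs until items are pulled.
result = list(limit(double(source(range(10**9))), 3))
```

Because nothing flows until `limit` pulls, the source never produces more than the three items actually consumed, mirroring how a pull pipeline stalls predecessor stages when no successor requests data.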
Abstract:
This paper presents an improved Two-Pass Hexagonal (TPA) algorithm, constituted by the Linear Hashtable Motion Estimation Algorithm (LHMEA) and Hexagonal Search (HEXBS), for motion estimation. In the TPA, Motion Vectors (MVs) are generated by the first-pass LHMEA and are used as predictors for the second-pass HEXBS motion estimation, which searches only a small number of Macroblocks (MBs). The hashtable structure of LHMEA is improved compared with the original TPA and LHMEA. The evaluation of the algorithm considers three important metrics: processing time, compression rate and PSNR. The performance of the algorithm is evaluated on standard video sequences and the results are compared with current algorithms.
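The second pass can be sketched as follows: starting from a predictor motion vector (as LHMEA would supply), recentre a six-point hexagon until no neighbour improves the block-matching cost, then refine with a four-point pattern. This is a hedged, self-contained sketch of generic HEXBS; the block size, SAD cost and frame handling are illustrative, not the paper's implementation:

```python
import numpy as np

# Standard HEXBS patterns: six-point large hexagon, four-point refinement.
HEX = [(2, 0), (1, 2), (-1, 2), (-2, 0), (-1, -2), (1, -2)]
SMALL = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def sad(cur, ref, bx, by, dx, dy, B=8):
    """Sum of absolute differences between the BxB block at (bx, by) in
    the current frame and its candidate displaced by (dx, dy) in the
    reference frame; out-of-frame candidates cost infinity."""
    h, w = ref.shape
    x, y = bx + dx, by + dy
    if x < 0 or y < 0 or x + B > w or y + B > h:
        return float("inf")
    return int(np.abs(cur[by:by + B, bx:bx + B].astype(np.int64)
                      - ref[y:y + B, x:x + B].astype(np.int64)).sum())

def hexbs(cur, ref, bx, by, pred=(0, 0), B=8):
    """Hexagon-based search started from a predictor MV, as the
    second TPA pass would run it with the predictor from LHMEA."""
    best = pred
    best_cost = sad(cur, ref, bx, by, best[0], best[1], B)
    while True:  # recentre the large hexagon until no neighbour improves
        cand = min(((best[0] + dx, best[1] + dy) for dx, dy in HEX),
                   key=lambda mv: sad(cur, ref, bx, by, mv[0], mv[1], B))
        cost = sad(cur, ref, bx, by, cand[0], cand[1], B)
        if cost >= best_cost:
            break
        best, best_cost = cand, cost
    # final refinement with the small four-point pattern
    cand = min(((best[0] + dx, best[1] + dy) for dx, dy in SMALL),
               key=lambda mv: sad(cur, ref, bx, by, mv[0], mv[1], B))
    if sad(cur, ref, bx, by, cand[0], cand[1], B) < best_cost:
        best = cand
    return best
```

A good predictor shrinks the number of hexagon recentring steps, which is why the first-pass predictors reduce the number of macroblocks the second pass must examine.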
Abstract:
This paper presents a clocking pipeline technique referred to as a single-pulse pipeline (PP-Pipeline) and applies it to the problem of mapping pipelined circuits to a Field Programmable Gate Array (FPGA). A PP-pipeline replicates the operation of asynchronous micropipelined control mechanisms using synchronous-orientated logic resources commonly found in FPGA devices. Consequently, circuits with an asynchronous-like pipeline operation can be efficiently synthesized using a synchronous design methodology. The technique can be extended to include data-completion circuitry to take advantage of variable data-completion processing time in synchronous pipelined designs. It is also shown that the PP-pipeline reduces the clock tree power consumption of pipelined circuits. These potential applications are demonstrated by post-synthesis simulation of FPGA circuits. (C) 2004 Elsevier B.V. All rights reserved.
Abstract:
We know that from mid-childhood onwards most new words are learned implicitly via reading; however, most word learning studies have taught novel items explicitly. We examined incidental word learning during reading by focusing on the well-documented finding that words which are acquired early in life are processed more quickly than those acquired later. Novel words were embedded in meaningful sentences and were presented to adult readers early (day 1) or later (day 2) during a five-day exposure phase. At test adults read the novel words in semantically neutral sentences. Participants’ eye movements were monitored throughout exposure and test. Adults also completed a surprise memory test in which they had to match each novel word with its definition. Results showed a decrease in reading times for all novel words over exposure, and significantly longer total reading times at test for early than late novel words. Early-presented novel words were also remembered better in the offline test. Our results show that order of presentation influences processing time early in the course of acquiring a new word, consistent with partial and incremental growth in knowledge occurring as a function of an individual’s experience with each word.
Abstract:
Barium molybdate (BaMoO(4)) powders were synthesized by the co-precipitation method and processed under microwave-hydrothermal conditions at 140 degrees C for different times. These powders were characterized by X-ray diffraction (XRD), Fourier transform Raman (FT-Raman), Fourier transform infrared (FT-IR) and ultraviolet-visible (UV-vis) absorption spectroscopies and by photoluminescence (PL) measurements. XRD patterns and FT-Raman spectra showed that these powders present a scheelite-type tetragonal structure without the presence of deleterious phases. FT-IR spectra exhibited a large absorption band situated at around 850.4 cm(-1), which is associated with the Mo-O antisymmetric stretching vibrations within the [MoO(4)] clusters. UV-vis absorption spectra indicated a reduction in the intermediary energy levels within the band gap as the processing time evolved. First-principles quantum mechanical calculations based on density functional theory were employed in order to understand the electronic structure (band structure and density of states) of this material. The powders presented variations in their PL emissions when excited at different wavelengths (350 nm and 488 nm). This phenomenon was explained through a model based on the presence of intermediary energy levels (deep and shallow holes) within the band gap. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
In order to achieve high performance, we need an efficient scheduling of a parallel program onto the processors of a multiprocessor system that minimizes the entire execution time. This multiprocessor scheduling problem can be stated as finding a schedule for a general task graph to be executed on a multiprocessor system so that the schedule length is minimized [10]. This scheduling problem is known to be NP-hard. In multiprocessor task scheduling, we have a number of CPUs on which a number of tasks are to be scheduled so that the program's execution time is minimized. According to [10], the task scheduling problem is a key factor for a parallel multiprocessor system to gain better performance. A task can be partitioned into a group of subtasks and represented as a DAG (Directed Acyclic Graph), so the problem can be stated as finding a schedule for a DAG to be executed on a parallel multiprocessor system so that the schedule length is minimized. This helps to reduce processing time and increase processor utilization. The aim of this thesis work is to check and compare the results obtained by the Bee Colony algorithm with the best known results in the multiprocessor task scheduling domain.
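The DAG formulation lends itself to simple baseline heuristics against which metaheuristics such as Bee Colony are compared. The following is a hedged Python sketch of greedy list scheduling on identical processors (not the thesis's Bee Colony algorithm; task names and costs are illustrative, and the DAG is assumed acyclic):

```python
def list_schedule(tasks, deps, cost, m):
    """Greedy list scheduling of a task DAG on m identical processors.

    tasks: task ids; deps: {task: set of predecessor tasks};
    cost: {task: execution time}. At each step the ready task / processor
    pair with the earliest finish time is scheduled. Returns the
    makespan (schedule length) and the {task: (processor, start)} map.
    """
    finish = {}               # task -> finish time
    proc_free = [0.0] * m     # next free time of each processor
    assignment = {}
    remaining = set(tasks)
    while remaining:
        # a task is ready once all of its predecessors have finished
        ready = [t for t in remaining
                 if all(p in finish for p in deps.get(t, ()))]
        best = None           # (finish_time, task, processor, start)
        for t in ready:
            est = max((finish[p] for p in deps.get(t, ())), default=0.0)
            for proc in range(m):
                start = max(est, proc_free[proc])
                cand = (start + cost[t], t, proc, start)
                if best is None or cand < best:
                    best = cand
        end, t, proc, start = best
        finish[t] = end
        proc_free[proc] = end
        assignment[t] = (proc, start)
        remaining.remove(t)
    return max(finish.values()), assignment
```

Metaheuristics explore many orderings of this kind and keep the one with the shortest schedule length, which is how improved results over greedy baselines are obtained.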
Abstract:
Colour segmentation is the most commonly used method in road sign detection. Road signs contain several basic colours, such as red, yellow, blue and white, which depend on the country. The objective of this thesis is to evaluate four colour segmentation algorithms: the Dynamic Threshold Algorithm, a modification of de la Escalera's algorithm, the Fuzzy Colour Segmentation Algorithm, and the Shadow and Highlight Invariant Algorithm. Processing time and segmentation success rate are used as criteria to compare the performance of the four algorithms, with red selected as the target colour for the comparison. All the test images were selected randomly, by category, from the Traffic Signs Database of Dalarna University [1]; these road sign images were taken with a digital camera mounted in a moving car in Sweden. Experiments show that the Fuzzy Colour Segmentation Algorithm and the Shadow and Highlight Invariant Algorithm are more accurate and stable in detecting the red colour of road signs, and the method could also be used in research analysing other colours. For the yellow colour, also chosen to evaluate the performance of the four algorithms, refer to the Master's thesis of Yumei Liu.
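As a minimal illustration of threshold-based colour segmentation for the red target colour, the sketch below marks pixels whose red channel is strong and dominates the other channels. The thresholds are illustrative defaults, not the parameters of any of the four algorithms compared:

```python
import numpy as np

def segment_red(img, r_min=0.5, dominance=1.5):
    """Return a boolean mask of candidate red pixels.

    img: H x W x 3 float array in [0, 1]. A pixel is flagged when its
    red value exceeds r_min and exceeds both green and blue by the
    given dominance ratio (illustrative thresholds only).
    """
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    eps = 1e-6  # avoid division by zero on black pixels
    return (r >= r_min) & (r / (g + eps) >= dominance) & (r / (b + eps) >= dominance)
```

White pixels fail the dominance test and dark pixels fail `r_min`, which is roughly why fixed RGB thresholds struggle under shadows and highlights and why invariant and fuzzy formulations perform better in the comparison.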
Abstract:
Delineation of commuting regions has always been based on statistical units, often municipalities or wards. However, using these units has certain disadvantages, as their land areas differ considerably. Much information is lost in the larger spatial base units, and distortions in self-containment values, the main criterion in rule-based delineation procedures, occur. Alternatively, one can start from relatively small standard-size units such as hexagons. In this way, much greater detail in spatial patterns is obtained. In this paper, regions are built by means of intrazonal maximization (Intramax) on the basis of hexagons. The use of geoprocessing tools, specifically developed for the processing of commuting data, speeds up processing time considerably. The results of the Intramax analysis are evaluated with travel-to-work area constraints, and comparisons are made with commuting fields, accessibility to employment, commuting flow density and network commuting flow size. From selected steps in the regionalization process, a hierarchy of nested commuting regions emerges, revealing the complexity of commuting patterns.
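Each Intramax aggregation step merges the pair of zones with the strongest interaction relative to their flow totals. A simplified Python sketch of one such step under that reading of the objective (the exact functional form, tie handling and hexagon geometry of the paper are not reproduced; zone labels and flows are illustrative):

```python
def intramax_step(flows):
    """Pick the pair of zones to merge in one Intramax step.

    flows: {(i, j): commuting flow from zone i to zone j}, i != j.
    Scores each pair by its two-way flow normalized by the zones'
    origin and destination totals, and returns the best pair --
    a simplified stand-in for the Intramax objective.
    """
    zones = sorted({z for pair in flows for z in pair})
    out_tot = {z: sum(f for (i, j), f in flows.items() if i == z) for z in zones}
    in_tot = {z: sum(f for (i, j), f in flows.items() if j == z) for z in zones}
    best, best_score = None, -1.0
    for a in zones:
        for b in zones:
            if a >= b:
                continue
            score = 0.0
            if out_tot[a] and in_tot[b]:
                score += flows.get((a, b), 0) / (out_tot[a] * in_tot[b])
            if out_tot[b] and in_tot[a]:
                score += flows.get((b, a), 0) / (out_tot[b] * in_tot[a])
            if score > best_score:
                best, best_score = (a, b), score
    return best
```

Repeating the step, with merged zones treated as single units, yields the nested hierarchy of commuting regions described above; with tens of thousands of hexagons, this pairwise scan is exactly the part that dedicated geoprocessing tools must accelerate.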
Abstract:
FERNANDES, Fabiano A. N. et al. Optimization of Osmotic Dehydration of Papaya Followed by Air-Drying. Food Research International, v. 39, p. 492-498, 2006.
Abstract:
Water injection is the most widely used method for supplementary recovery in many oil fields, for several reasons: water is an effective displacing agent for low-viscosity oils, water injection projects are relatively simple to establish, and water is available at relatively low cost. Designing water injection projects requires reservoir studies in order to define the various parameters needed to increase the effectiveness of the method. For this kind of study, several mathematical models can be used, classified into two general categories: analytical or numerical. The present work carries out a comparative analysis of the results produced by a streamline simulator and by a conventional finite-difference simulator; both types of simulator are based on numerical methods designed to model light-oil reservoirs subjected to water injection. To this end, two reservoir models were defined: the first was a heterogeneous model whose petrophysical properties vary along the reservoir, and the second was created using average petrophysical properties obtained from the first. Comparisons were made with the two models always under the same operational conditions. Some rock and fluid parameters were then changed in both models and the results compared again. From the factorial design carried out for the sensitivity analysis of the reservoir parameters, a few cases were chosen to study the role of the water injection rate and of the vertical position of the well perforations in the production forecast. The results from the two simulators were quite similar in most cases; differences were found only in those cases with an increase in the gas solubility ratio of the model. Thus, it was concluded that, in flow simulation of reservoirs analogous to those studied here, especially when the gas solubility ratio is low, the conventional finite-difference simulator may be replaced by the streamline simulator: the production forecasts are compatible and the computational processing time is lower.
Genetic analysis of visual appraisal scores of cattle using Bayesian threshold and linear models
Abstract:
The objective of this work was to compare the genetic parameter estimates obtained from single-trait and two-trait Bayesian analyses, under linear and threshold animal models, considering categorical morphological traits of Nellore cattle. Data on musculature, physical structure and conformation were collected between 2000 and 2005 from 3,864 animals on 13 farms participating in the Nelore Brasil Program. Single-trait and two-trait Bayesian analyses were performed under threshold and linear models. In general, both the threshold and the linear models were efficient in estimating genetic parameters for visual scores in the single-trait Bayesian analyses. In the two-trait analyses, it was observed that, when continuous and categorical data were used, the threshold model provided genetic correlation estimates of greater magnitude than those of the linear model, and, when categorical data were used, the heritability estimates were similar. The advantage of the linear model was the shorter time spent processing the analyses. In the genetic evaluation of animals for visual scores, the use of the threshold or the linear model did not influence the ranking of animals by predicted breeding values, which indicates that both models can be used in breeding programs.