11 results for HDFS bottleneck
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
An anomalously long transient is needed to achieve steady pressurization of a fluid forced to flow through micro-narrowed channels under constant mechanical driving. This phenomenon, known as the "bottleneck effect," is revisited here from a different perspective, using confined displacements of interfacial fluids. Compared with standard microfluidics, in this case the effect admits a neat quantitative characterization, which reveals intrinsic material characteristics of flowing monolayers and makes it possible to envisage strategies for their controlled micromanipulation.
Abstract:
In image processing, segmentation algorithms constitute one of the main focuses of research. In this paper, new image segmentation algorithms based on a hard version of the information bottleneck method are presented. The objective of this method is to extract a compact representation of a variable, considered the input, with minimal loss of mutual information with respect to another variable, considered the output. First, we introduce a split-and-merge algorithm based on the definition of an information channel between a set of regions (input) of the image and the intensity histogram bins (output). From this channel, the maximization of the mutual information gain is used to optimize the image partitioning. Then, the merging process of the regions obtained in the previous phase is carried out by minimizing the loss of mutual information. From the inversion of the above channel, we also present a new histogram clustering algorithm based on the minimization of the mutual information loss, where the input variable now represents the histogram bins and the output is given by the set of regions obtained from the above split-and-merge algorithm. Finally, we introduce two new clustering algorithms which show how the information bottleneck method can be applied to the registration channel obtained when two multimodal images are correctly aligned. Different experiments on 2-D and 3-D images show the behavior of the proposed algorithms.
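The split-and-merge criterion above reduces to computing the mutual information of the channel between the region partition and the intensity-histogram bins. The following is a minimal sketch of that quantity only, not the authors' algorithm; the flat-array layout and the bin count are assumptions.

```python
import numpy as np

def mutual_information(regions, intensities, n_bins=64):
    """Mutual information I(R; B) between a region partition and the
    intensity-histogram bins, estimated from their joint histogram.
    `regions` holds one integer region label per pixel and
    `intensities` one grey level per pixel (flat arrays)."""
    joint, _, _ = np.histogram2d(regions, intensities,
                                 bins=(int(regions.max()) + 1, n_bins))
    p_rb = joint / joint.sum()                 # joint distribution p(r, b)
    p_r = p_rb.sum(axis=1, keepdims=True)      # marginal over regions
    p_b = p_rb.sum(axis=0, keepdims=True)      # marginal over bins
    nz = p_rb > 0                              # avoid log(0)
    return float((p_rb[nz] * np.log2(p_rb[nz] / (p_r @ p_b)[nz])).sum())
```

A split is kept when it increases this value (information gain); a merge is chosen so as to minimize the decrease (information loss).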
Abstract:
During the fellowship period, from 9 March 2007 to 8 March 2010, several types of experiments were carried out with two-dimensional systems, namely Langmuir monolayers. The work began with the study and characterization of these experimental systems, both at rest and under dynamic conditions: for example, the study of the collective molecular response of domains of a photosensitive azo-derivative when the polarization plane is rotated under constant illumination, and studies of two-dimensional systems at collapse, which can be related to the viscoplastic properties of solids. Another line of study is the rheology of these two-dimensional systems when they flow through channels. Starting from the simplest experimental system, a monolayer flowing through a channel, the bottleneck effect was observed and studied. Once this simplest system had been established and studied, more complex photolithographic fabrication techniques were applied to make Langmuir monolayers flow through circuits with a large wetting contrast. Once these circuits had been successfully implemented in a system for controlling two-dimensional flows, the possible future applications of such systems for the study and development of two-dimensional microfluidics became apparent.
Abstract:
Parallel I/O is a research area of growing importance in High Performance Computing. Although for years it has been the bottleneck of parallel computers, today, owing to the great increase in computing power, the I/O problem has grown, and the High Performance Computing community considers that work is needed to improve the I/O system of parallel computers in order to meet the demands of the scientific applications that use HPC. The configuration of parallel Input/Output (I/O) has a strong influence on performance and availability, so it is important to "analyze parallel I/O configurations to identify the key factors that influence the performance and availability of the I/O of scientific applications running on a cluster." To carry out this analysis we propose a methodology that makes it possible to identify the I/O factors and evaluate their influence for different I/O configurations, consisting of three phases: Characterization, Configuration, and Evaluation. The methodology analyzes the parallel computer at the level of the scientific application, the I/O libraries, and the I/O architecture, always from the I/O point of view. The experiments performed for different I/O configurations and the results obtained indicate the complexity of analyzing the I/O factors and their differing degrees of influence on I/O system performance. Finally, future work is outlined: the design of a model to support the configuration process of the parallel I/O system for scientific applications. In addition, to identify and evaluate the I/O factors associated with availability at the data level, we intend to use the RADIC fault-tolerant architecture.
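As an illustration of the kind of measurement the Characterization phase implies, the sketch below times sequential writes for several block sizes, one of the I/O factors a configuration study might vary. The function name, path, and sizes are hypothetical; the actual instrumentation used in the work is not described here.

```python
import os
import time

def probe_write_throughput(path, total_mb=256, block_kb=(64, 256, 1024)):
    """Sequential write throughput (MB/s) for several block sizes --
    a simple probe a Characterization phase could run per I/O setup."""
    results = {}
    for kb in block_kb:
        block = b"\0" * (kb * 1024)
        n_blocks = (total_mb * 1024) // kb
        start = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(n_blocks):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())      # include device time, not just cache
        elapsed = time.perf_counter() - start
        os.remove(path)
        results[kb] = total_mb / elapsed
    return results
```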
Abstract:
TCP flows from applications such as the web or FTP are well supported by a Guaranteed Minimum Throughput Service (GMTS), which provides a minimum network throughput to the flow and, if possible, an extra throughput. We propose a scheme for a GMTS using Admission Control (AC) that is able to provide different minimum throughputs to different users and that is suitable for "standard" TCP flows. Moreover, we consider a multidomain scenario where the scheme is used in one of the domains, and we propose some mechanisms for the interconnection with neighbor domains. The whole scheme uses a small set of packet classes in a core-stateless network, where each class has a different discarding priority in the queues assigned to it. The AC method involves only edge nodes and uses a special probing packet flow (marked with the highest discarding priority class) that is sent continuously from ingress to egress through a path. The available throughput in the path is obtained at the egress using measurements of flow aggregates and is then sent back to the ingress. At the ingress, each flow is detected implicitly and then admission controlled. If it is accepted, it receives the GMTS and its packets are marked with the lowest discarding priority classes; otherwise, it receives a best-effort service. The scheme is evaluated through simulation in a simple "bottleneck" topology using different traffic loads consisting of "standard" TCP flows that carry files of varying sizes.
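The ingress-side decision can be pictured with a toy model: admit a new flow only if its guaranteed minimum still fits within the available throughput last reported by the egress. This is a hedged sketch; the class names and the exact admission test are assumptions, not the paper's specification.

```python
from dataclasses import dataclass

@dataclass
class Flow:
    flow_id: int
    min_throughput: float              # requested guaranteed minimum (Mb/s)

class IngressAdmissionControl:
    """Toy model of the edge-node logic: the egress measures available
    path throughput via the probing flow (highest discarding priority)
    and reports it back; the ingress admits a flow only if its minimum
    still fits, otherwise the flow gets best-effort service."""

    def __init__(self):
        self.reserved = 0.0            # sum of admitted minimums (Mb/s)
        self.available = 0.0           # last report from the egress

    def on_egress_report(self, measured: float) -> None:
        self.available = measured

    def admit(self, flow: Flow) -> bool:
        if self.reserved + flow.min_throughput <= self.available:
            self.reserved += flow.min_throughput   # marked lowest-drop class
            return True
        return False                               # best-effort fallback
```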
Abstract:
Photo-mosaicing techniques have become popular for seafloor mapping in various marine science applications. However, the common methods cannot accurately map regions with high relief and topographical variations. Ortho-mosaicing, borrowed from photogrammetry, is an alternative technique that takes into account the 3-D shape of the terrain. A serious bottleneck is the volume of elevation information that needs to be estimated from the video data, fused, and processed to generate a composite ortho-photo that covers a relatively large seafloor area. We present a framework that combines the advantages of dense depth-map and 3-D feature estimation techniques based on visual motion cues. The main goal is to identify and reconstruct certain key terrain feature points that adequately represent the surface with minimal complexity in the form of piecewise planar patches. The proposed implementation utilizes local depth maps for feature selection, while tracking over several views enables 3-D reconstruction by bundle adjustment. Experimental results with synthetic and real data validate the effectiveness of the proposed approach.
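One building block of the piecewise planar representation is fitting a plane to a small set of reconstructed 3-D terrain points. The sketch below shows that step in isolation, under a simple z = ax + by + c parameterization chosen here for illustration; it is not the authors' pipeline.

```python
import numpy as np

def fit_planar_patch(points):
    """Least-squares plane z = a*x + b*y + c through an (N, 3) array of
    reconstructed 3-D points -- one patch of a piecewise planar surface."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), residuals, *_ = np.linalg.lstsq(A, z, rcond=None)
    return (a, b, c), residuals   # large residuals suggest splitting the patch
```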
Abstract:
Language Resources are a critical component for Natural Language Processing applications. Throughout the years many resources have been manually created for the same task, but with different granularity and coverage information. To create richer resources for a broad range of potential reuses, the information from all resources has to be joined into one. The high cost of comparing and merging different resources by hand has been a bottleneck for merging existing resources. With the objective of reducing human intervention, we present a new method for automating the merging of resources. We have addressed the merging of two verb subcategorization frame (SCF) lexica for Spanish. The results achieved, a new lexicon with enriched information and conflicting information signalled, reinforce our idea that this approach can be applied to other NLP tasks.
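To make the merging concrete, the sketch below assumes each SCF lexicon maps a verb lemma to a set of frame labels (a representation chosen here for illustration, not taken from the paper) and shows how entries can be merged while conflicting information is signalled rather than dropped.

```python
def merge_lexica(lex_a, lex_b):
    """Merge two SCF lexica already in a common format: each lexicon
    maps a verb lemma to a set of frame labels. The union enriches the
    entry; frames attested in only one resource are signalled."""
    merged, conflicts = {}, {}
    for lemma in set(lex_a) | set(lex_b):
        frames_a = lex_a.get(lemma, set())
        frames_b = lex_b.get(lemma, set())
        merged[lemma] = frames_a | frames_b        # enriched entry
        disagreement = frames_a ^ frames_b         # attested in only one
        if frames_a and frames_b and disagreement:
            conflicts[lemma] = disagreement        # signalled, not dropped
    return merged, conflicts

# e.g. merge_lexica({"comer": {"NP"}}, {"comer": {"NP", "NP-PP"}})
# -> merged entry {"NP", "NP-PP"}, with {"NP-PP"} flagged for review
```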
Abstract:
Lexical Resources are a critical component for Natural Language Processing applications. However, the high cost of comparing and merging different resources has been a bottleneck to obtaining richer resources with a broad range of potential uses for a significant number of languages. With the objective of reducing cost by eliminating human intervention, we present a new method for automating the merging of resources, with special emphasis on what we call the mapping step. This mapping step, which converts the resources into a common format that later allows the merging, is usually performed with huge manual effort and thus makes the whole process very costly. We therefore propose a method to perform this mapping fully automatically. To test our method, we have addressed the merging of two verb subcategorization frame lexica for Spanish. The results achieved, which almost replicate human work, demonstrate the feasibility of the approach.
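The mapping step itself can be pictured as a tag-by-tag rewrite into a shared tagset. The dictionary `tagset_mapping` below is hypothetical; the paper derives the correspondence automatically, which is precisely the hard part it addresses.

```python
def map_to_common_format(lemma, frames, tagset_mapping):
    """Rewrite one lexicon entry from resource-specific frame tags into
    a common tagset so two resources become directly comparable. Tags
    with no known correspondence are kept aside rather than lost."""
    common, unmapped = set(), set()
    for tag in frames:
        if tag in tagset_mapping:
            common.add(tagset_mapping[tag])
        else:
            unmapped.add(tag)
    return lemma, common, unmapped
```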
Abstract:
Lexical Resources are a critical component for Natural Language Processing applications. However, the high cost of comparing and merging different resources has been a bottleneck to obtaining richer resources and a broader range of potential uses for a significant number of languages. With the objective of reducing cost by eliminating human intervention, we present a new method towards the automatic merging of resources. This method includes both the automatic mapping of the resources involved to a common format and their merging once in this format. This paper presents how we have addressed the merging of two verb subcategorization frame lexica for Spanish, but our method will be extended to cover other types of Lexical Resources. The results achieved, which almost replicate human work, demonstrate the feasibility of the approach.
Abstract:
Process variations are a major bottleneck for the manufacturability and yield of digital CMOS integrated circuits. That is why regular techniques with different degrees of regularity are emerging as possible solutions. Our proposal is a new regular layout design technique called Via-Configurable Transistors Array (VCTA) that pushes circuit layout regularity for devices and interconnects to the limit in order to maximize the benefits of regularity. VCTA is predicted to perform worse than Standard Cell designs for a given technology node, but it will allow a future technology node to be used at an earlier time. Our objective is to optimize VCTA so that it is comparable to a Standard Cell design in an older technology. We present simulations of delay and energy consumption for a Full Adder circuit in the 90 nm technology node for the first unoptimized version of our VCTA, together with the extrapolation to Carry-Ripple Adders from 4 bits to 64 bits.
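The extrapolation from a 4-bit to a 64-bit Carry-Ripple Adder follows from the structure of the circuit: the carry traverses one Full Adder per bit, so delay grows roughly linearly with operand width. A first-order sketch, with an illustrative (not simulated) per-stage delay:

```python
def ripple_carry_delay(n_bits, t_fa_ps):
    """First-order carry-ripple delay model: one full-adder carry
    propagation per bit, hence linear growth with operand width."""
    return n_bits * t_fa_ps

t_fa_ps = 42.0                         # per-stage delay, illustrative only
for n in (4, 8, 16, 32, 64):
    print(f"{n:2d}-bit adder: ~{ripple_carry_delay(n, t_fa_ps):7.1f} ps")
```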
Abstract:
A series of nitrogen fertilization trials was carried out in different counties of inland Catalonia. Across these trials, three different methods considered promising for improving nitrogen fertilization were tested: the nitrogen balance method, the mineral nitrogen method, and the method based on the nitrate content of the sap at the base of the stems (CNSBT). The soils on which the trials were conducted presented no particular limitation for wheat growing: they were deep, well drained, non-saline, and of medium texture, the only exception being one trial on a moderately deep soil. Therefore, and also with respect to chemical fertility, the soils must be considered of medium-to-high productive potential. The nitrogen balance method proved very promising for deciding whether, and in what amount, top-dressing fertilization is needed under the conditions studied. The mineral nitrogen method was also effective in this respect, whereas the CNSBT method turned out not to be applicable under the conditions tested, where in many cases water is also a limiting factor. Throughout the trials, a series of factors that prevent fine-tuning of nitrogen fertilization were identified. These include poor estimation of the target yield, the difficulty of predicting the N available from organic fertilizers, difficulties in sampling for nitric nitrogen, and the critical effect of erratic water availability, which greatly complicates the nitrogen fertilization strategy to be adopted.