947 results for Which-way experiments


Relevance:

30.00%

Publisher:

Abstract:

The human aging process is complex and varies greatly from one person to another. Despite the extent of research on the subject, much remains to be explored and understood. This thesis presents three experiments that improve our understanding of the changes that occur in visual working memory and visuospatial attention with advancing age. The first experiment examines changes in visual working memory capacity between young adults, healthy older adults, and people with mild cognitive impairment (MCI). Moreover, through a follow-up with the MCI participants, we were able to examine whether behavioural differences existed between those older adults who declined towards a type of dementia and those whose condition remained stable. Several techniques can be used to study the effects of aging on the brain. The neuropsychological tests and behavioural tasks presented in the first experiment are one example. Neuroimaging can also prove particularly useful. Indeed, certain electrophysiological measures, known as event-related potentials (ERPs), are associated with specific cognitive functions. These components allow us to track those processes and to observe the modulations caused, for example, by stimulus characteristics or by age. This is the case for the N2pc (posterior contralateral N2) and the SPCN (sustained posterior contralateral negativity), electrophysiological components linked respectively to visuospatial attention and to visual working memory. These two components and the factors that modulate them are well known, yet they have seldom been used to study the changes that occur in attention and visual working memory during the aging process. The second and third experiments use a visual search task (number of coloured items and identification of a spatial relation between two coloured items) to explore the changes observable in these electrophysiological components. The second experiment examines the effectiveness of a multiple-frame paradigm for measuring the N2pc and the SPCN in young adults. The third experiment examines the effects of normal aging on the amplitude and latency of the N2pc and the SPCN using the same type of visual search task.
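
For reference, both components are conventionally isolated as contralateral-minus-ipsilateral difference waves at posterior electrodes such as PO7/PO8; below is a minimal numpy sketch, with the array shapes and analysis windows being our assumptions rather than the thesis's exact choices:

```python
import numpy as np

def contra_ipsi_difference(left_target, right_target, po7, po8):
    """Contralateral-minus-ipsilateral difference wave.

    left_target / right_target: ERP epochs of shape (trials, channels, times)
    for targets in the left / right visual hemifield; po7 / po8 are the
    channel indices of the left / right posterior electrodes."""
    contra = 0.5 * (left_target[:, po8, :].mean(0) + right_target[:, po7, :].mean(0))
    ipsi = 0.5 * (left_target[:, po7, :].mean(0) + right_target[:, po8, :].mean(0))
    return contra - ipsi

def mean_amplitude(diff_wave, times, t_min, t_max):
    """Mean amplitude of the difference wave in a time window, e.g. roughly
    180-280 ms for the N2pc and a later sustained window for the SPCN
    (typical values; the thesis's exact windows may differ)."""
    window = (times >= t_min) & (times <= t_max)
    return diff_wave[window].mean()
```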

Relevance:

30.00%

Publisher:

Abstract:

The present work studies the induced maturation of the pearl oyster for induced spawning experiments. The work on larval development was done with a view to developing techniques for the artificial rearing of the commercially important pearl oyster P. fucata, and also to elucidating the principles and problems of tropical bivalve larvae in general for detailed investigations in the future. The study is designed to probe into the basic aspects of the biology underlying the hatchery technology of Pinctada fucata and to further the understanding of the factors which influence the induction of maturation, spawning, larval rearing and spat settlement. This would go a long way towards upgrading the hatchery technology of the Indian pearl oyster Pinctada fucata for commercial-level seed production.

Relevance:

30.00%

Publisher:

Abstract:

The focus of self-assembly as a strategy for synthesis has been confined largely to molecules, because of the importance of manipulating the structure of matter at the molecular scale. We have investigated the influence of temperature and pH, in addition to the concentration of the capping agent used, on the formation of the nano-bio conjugates. For example, a narrower size distribution of the nanoparticles was observed with increasing protein concentration, which supports the view that γ-globulin acts both as a controller of nucleation and as a stabiliser. As analysed through various photophysical, biophysical and microscopic techniques such as TEM, AFM, C-AFM, SEM, DLS, OPM, CD and FTIR, we observed that the initial photoactivation of γ-globulin at pH 12 for 3 h resulted in small protein fibres. Further irradiation for 24 h led to the formation of self-assembled long fibres of the protein of ca. 5-6 nm and to the observation of a surface plasmon resonance band at around 520 nm, with concomitant quenching of the luminescence intensity at 680 nm. The observed light-triggered self-assembly of the protein and its effect on controlling the fate of the anchored nanoparticles can be compared with naturally occurring processes such as photomorphogenesis. Furthermore, our approach offers a way to understand the role played by the self-assembly of the protein in the ordering and knock-out of the metal nanoparticles, and also in the design of nano-biohybrid materials for medicinal and optoelectronic applications. Investigation of the potential applications of the NIR-absorbing and water-soluble squaraine dyes 1-3 for protein labelling and as anti-amyloid agents forms the subject matter of the third chapter of the thesis. The study of their interactions with various proteins revealed that the dyes 1-3 showed unique interactions towards serum albumins as well as lysozyme. Binding produced changes of 69%, 71% and 49% in the absorption spectra, as well as significant quenching of the fluorescence intensity of the dyes 1-3, respectively. Half-reciprocal analysis of the absorption data and isothermal titration calorimetric (ITC) analysis of the titration experiments gave a 1:1 stoichiometry for the complexes formed between lysozyme and the squaraine dyes, with association constants (Kass) in the range 10^4-10^5 M^-1. We determined the changes in free energy (ΔG) for complex formation; the values are -30.78, -32.31 and -28.58 kJ mol^-1 for the dyes 1, 2 and 3, respectively. Furthermore, we observed a strong induced CD (ICD) signal corresponding to the squaraine chromophore in the case of the halogenated squaraine dyes 2 and 3, at 636 and 637 nm, confirming complex formation in these cases. To understand the nature of the interaction of the squaraine dyes 1-3 with lysozyme, we investigated the interaction of the dyes with different amino acids. These results indicated that the dyes 1-3 interact significantly with cysteine and glutamic acid, which are present in the side chains of lysozyme. In addition, temperature-dependent studies revealed that the interaction of the dye with lysozyme is irreversible. Furthermore, we investigated the interactions of these NIR dyes 1-3 with β-amyloid fibres derived from lysozyme, to evaluate their potential as inhibitors of this biologically important protein aggregation.
These β-amyloid fibrils are insoluble protein aggregates that have been associated with a range of neurodegenerative diseases, including Huntington's, Alzheimer's, Parkinson's and Creutzfeldt-Jakob diseases. We synthesised amyloid fibres from lysozyme by incubating it in acidic solution below pH 4 and allowing it to form amyloid fibres at elevated temperature. To quantify the binding affinities of the squaraine dyes 1-3 for the β-amyloids, we carried out isothermal titration calorimetric (ITC) measurements. The association constants were found to be 1.2 × 10^5, 3.6 × 10^5 and 3.2 × 10^5 M^-1 for the dyes 1-3, respectively. To gain more insight into the amyloid-inhibiting nature of the squaraine dyes under investigation, we carried out a thioflavin assay, CD, isothermal titration calorimetry and microscopic analyses. The addition of the dyes 1-3 (5 μM) led to complete quenching of the apparent thioflavin fluorescence, indicating the destabilisation of the β-amyloid fibres in the presence of the squaraine dyes. Further, the inhibition of the amyloid fibres by the squaraine dyes 1-3 was evidenced through DLS, TEM, AFM and SAED, wherein we observed complete destabilisation of the amyloid fibres and their transformation into spherical particles. These results demonstrate that the squaraine dyes 1-3 can act as protein-labelling agents as well as inhibitors of protein amyloidogenesis. The last chapter of the thesis describes the synthesis and investigation of the self-assembly, as well as the bio-imaging aspects, of a few novel tetraphenylethene conjugates 4-6. As expected, these conjugates showed significant solvatochromism and exhibited a hypsochromic shift (negative solvatochromism) as the solvent polarity increased; these observations were rationalised through theoretical studies employing the B3LYP/6-31G method. We investigated the self-assembly properties of these D-A conjugates through variation of the percentage of water in acetonitrile solution, owing to the formation of nanoaggregates. Further, the contour map of the observed fluorescence intensity as a function of the fluorescence excitation and emission wavelengths confirmed the formation of J-type aggregates in these cases. To better understand the type of self-assemblies formed from the TPE conjugates 4-6, we carried out morphological analyses through various microscopic techniques such as DLS, SEM and TEM. At a water fraction of ca. 70%, we observed rod-shaped architectures ~780 nm in diameter and ~12 μm in length, as evidenced through TEM and SEM analysis. We made similar observations with the dodecyl conjugate 5. In ca. 70% and 50% water/acetonitrile mixtures, the aggregates formed from 4 and 5 were found to be highly crystalline, and such structures were transformed to an amorphous nature as the water fraction was increased to 99%. To evaluate the potential of the conjugates as bio-imaging agents, we carried out in vitro cytotoxicity and cellular uptake studies through the MTT assay, flow cytometry and confocal laser scanning microscopy. Thus, nanoparticles of these conjugates, which exhibited efficient emission, a large Stokes shift, good stability, biocompatibility and excellent cellular imaging properties, can have potential applications for tracking cells as well as in cell-based therapies.
In summary, we have synthesised novel functional organic chromophores and systematically investigated the self-assembly of these synthetic and biological building blocks under a variety of conditions. The investigation of the interaction of the water-soluble NIR squaraine dyes with lysozyme indicates that these dyes can act as protein-labelling agents, and their efficiency in inhibiting β-amyloid indicates their potential as anti-amyloid agents.
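
As a consistency check on the thermodynamics reported above (our arithmetic, not the thesis's; the temperature of 298 K is our assumption), the ΔG values follow from the association constants via the standard relation, e.g. for dye 3 with Kass ≈ 10^5 M^-1:

```latex
\Delta G = -RT \ln K_{\mathrm{ass}}
         \approx -\left(8.314\ \mathrm{J\,mol^{-1}\,K^{-1}}\right)\left(298\ \mathrm{K}\right)\ln\!\left(10^{5}\right)
         \approx -28.5\ \mathrm{kJ\,mol^{-1}},
```

which agrees with the reported value of -28.58 kJ mol^-1.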

Relevance:

30.00%

Publisher:

Abstract:

This work describes the evaluation process of a three-dimensional visualisation technique developed at the Institut für periphere Mikroelektronik of the Universität Kassel. Three-dimensional display by means of a lenticular lens sheet opens up a new dimension of interaction with the computer. In contrast to conventional three-dimensional renderings, in which a 3D object is projected onto a 2D surface and thus still cannot leave the screen plane, stereoscopic display visualises the objects truly three-dimensionally: the objects appear in front of, or behind, the display plane. Since the lenticular sheet has not yet been studied from the standpoint of perceptual psychology, and since few quantitative evaluation results are available for 3D systems in general (Vollbracht, 1997), there is a central research interest here. In order to evaluate this 3D system, the theoretical part of the work first defines the concept of evaluation. Furthermore, the perceptual-psychological foundations of monocular and binocular depth perception are discussed. Techniques for producing depth in pictures and on screens are then explained, and the differences between technically produced and natural depth perception are examined more closely. After a presentation of various stereoscopic systems, the autostereoscopic lenticular sheet is considered in detail. The theoretical part of the work concludes with the theory behind the mood questionnaire employed. The empirical part of the work addresses two central questions. First, it investigates whether the higher information content can positively influence basic perceptual performance in certain areas. Second, it investigates whether the greater visual naturalness and the novelty of the image presentation also affect the subjective mood of the participants. These hypotheses are tested empirically by means of three experiments. The first two experiments focus on basic perceptual performance, while the third study measures subjective mood. Finally, the results of the studies are presented and discussed. In addition, concrete applications for the lenticular sheet are identified and possible follow-up experiments are outlined.

Relevance:

30.00%

Publisher:

Abstract:

The progress in microsystem technology and nanotechnology places extended requirements on fabrication processes. The trend is moving towards structuring at the nanometre scale on the one hand, and towards fabricating structures with high aspect ratio (the ratio of vertical to lateral dimensions) and large depths on the 100 µm scale on the other. Current procedures for the microstructuring of silicon are wet chemical etching and dry or plasma etching. A modern plasma etching technique for structuring silicon is the so-called "gas chopping" etching technique (also called "time-multiplexed etching"). In this technique, passivation cycles, which prevent lateral underetching of the sidewalls, and etching cycles, which etch preferentially in the vertical direction because of the sidewall passivation, alternate continuously throughout the etching process. To this end, a CHF3/CH4 plasma, which generates CF monomers, is employed during the passivation cycle, and an SF6/Ar plasma, which generates fluorine radicals and ions, is employed during the etching cycle. Depending on the requirements on the etched profile, the durations of the individual passivation and etching cycles range from a few seconds up to several minutes. The profiles achieved with this etching process depend crucially on the flux of reactants, i.e. of CF monomers during the passivation cycle and of ions and fluorine radicals during the etching cycle, to the bottom of the profile, especially for profiles with high aspect ratio. With regard to the predictability of the etching process, knowledge of the fundamental effects taking place during a gas chopping etching process, and of their impact on the resulting profile, is required. For this purpose, this work proposes a model for describing the profile evolution of such etching processes, which treats the reactions (etching or deposition) at the sample surface on a phenomenological basis. Furthermore, the reactant transport inside the etching trench is modelled, based on angular distribution functions and on absorption probabilities at the sidewalls and the bottom of the trench. A comparison of the simulated profiles with corresponding experimental profiles reveals that the proposed model reproduces the experimental profiles if the angular distribution functions and absorption probabilities employed in the model are in agreement with data found in the literature. Therefore the model developed in this work is an adequate description of the effects taking place during a gas chopping plasma etching process.
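
To illustrate the kind of transport calculation involved, here is a minimal 2-D Monte Carlo sketch (Lambertian entry distribution, diffuse re-emission, a single sticking probability per sidewall hit); this is an illustration of the general idea, not the thesis's model, and all names and parameter values are ours:

```python
import math
import random

def flux_fraction_to_bottom(aspect_ratio, sticking_prob, n_particles=50_000):
    """Estimate the fraction of reactant flux reaching the bottom of a 2-D
    trench whose width is normalised to 1, so depth = aspect_ratio.
    Particles enter the mouth with a cosine (Lambertian) angular
    distribution; each sidewall hit absorbs the particle with probability
    `sticking_prob`, otherwise it is re-emitted diffusely."""
    depth, reached = aspect_ratio, 0
    for _ in range(n_particles):
        x, y = random.random(), 0.0
        phi = math.asin(2 * random.random() - 1)       # cosine-law angle
        dx, dy = math.sin(phi), math.cos(phi)
        for _ in range(1000):                          # bounce cap
            hits = []
            if dx > 0: hits.append(((1.0 - x) / dx, 'wall'))
            if dx < 0: hits.append((-x / dx, 'wall'))
            if dy > 0: hits.append(((depth - y) / dy, 'bottom'))
            if dy < 0: hits.append((-y / dy, 'mouth'))
            t, surface = min(hits)
            x, y = x + t * dx, y + t * dy
            if surface == 'bottom':
                reached += 1
                break
            if surface == 'mouth':
                break                                  # escaped the trench
            if random.random() < sticking_prob:
                break                                  # absorbed on sidewall
            phi = math.asin(2 * random.random() - 1)   # diffuse re-emission
            nx = 1.0 if x < 0.5 else -1.0              # inward wall normal
            dx, dy = nx * math.cos(phi), math.sin(phi)
    return reached / n_particles
```

For example, flux_fraction_to_bottom(10, 0.1) estimates how much of the passivating or etching species survives to the bottom of an aspect-ratio-10 trench when 10% of sidewall hits stick, which is exactly the quantity that governs profile evolution in such a model.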

Relevance:

30.00%

Publisher:

Abstract:

The focus of this work was the determination of microbial biomass and microbial residues at the root surface, in the rhizosphere soil and in the surrounding bulk soil. By comparing different methods for determining microbial biomass, the contents of fungal and bacterial carbon at the rhizoplane and in the rhizosphere were quantified. The fumigation-extraction method was used to determine microbial biomass. Ergosterol served as an indicator of fungal biomass, while the amino sugars glucosamine and muramic acid were to provide information on the bacterial and fungal biomass and residues in the three sample fractions. For this purpose, conversion factors were established for calculating bacterial and fungal carbon from the contents of muramic acid and fungal glucosamine. The determination of amino sugars was modified so that glucosamine, galactosamine, muramic acid and mannosamine could be measured simultaneously in both soil and root hydrolysates as an automated standard procedure using HPLC. Three pot experiments were carried out. In the first experiment, the influence of plant species on the microbial colonisation of root surfaces was examined by comparing the roots and rhizosphere soil of 15 different plant species. The second experiment focused on the influence of the microbial biomass of a soil on the microbial colonisation of root surfaces. Perennial ryegrass (Lolium perenne L.) was grown on seven different soils: six topsoils, which differed in soil type and management, and one subsoil. In the third experiment, the microbial colonisation of roots was observed after partial and complete removal of the above-ground biomass. Italian ryegrass (Lolium multiflorum Lam.) was cut 24 days after sowing, and the microbial colonisation of the roots and the soil fractions was then determined over an experimental period of eight days. It was confirmed that the individual plant species is of decisive importance for the microbial colonisation of roots. For almost all plants, the microbial biomass at the roots was dominated by fungi; the ratio of fungal to bacterial carbon at the roots of the 15 plant species averaged 2.6. The comparison of different soils showed that microbial colonisation in deeper soil layers is significantly lower than in topsoils, with the fungal share of the microbial biomass markedly increased in the subsoil. Comparing the topsoils with one another showed that both soil type and management exert a significant influence on microbial colonisation. Partial or complete removal of the above-ground biomass changed the microbial colonisation of the roots: the ratio of fungal to bacterial carbon fell from 2.5 to 1.4 over the experimental period, and the promotion of fungi was relatively greater in the treatment with partially removed above-ground biomass than in the treatment with completely removed above-ground biomass.
Contrary to the widespread assumption that bacteria dominate over fungi among root-colonising microorganisms, the results showed the opposite picture: all three experiments consistently showed that fungi dominate over bacteria both in the soil and at the roots.
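
A small sketch of the amino-sugar-to-carbon conversion step described above, with placeholder conversion factors of the order reported in the literature (the thesis establishes its own factors, which may differ):

```python
GLUCOSAMINE_MW = 179.17   # g/mol
MURAMIC_MW = 251.23       # g/mol

# Illustrative conversion factors (placeholders, not the thesis's values):
FUNGAL_C_PER_GLCN = 9.0   # µg fungal C per µg fungal glucosamine
BACT_C_PER_MURA = 45.0    # µg bacterial C per µg muramic acid

def microbial_c(glucosamine, muramic):
    """Fungal and bacterial C (µg/g) from amino-sugar contents (µg/g).

    Fungal glucosamine is total glucosamine minus the bacterial share,
    here assumed to be 2 mol glucosamine per mol muramic acid."""
    fungal_glcn = glucosamine - 2.0 * muramic * GLUCOSAMINE_MW / MURAMIC_MW
    fungal_c = FUNGAL_C_PER_GLCN * fungal_glcn
    bacterial_c = BACT_C_PER_MURA * muramic
    return fungal_c, bacterial_c, fungal_c / bacterial_c
```

The third return value is the fungal-to-bacterial carbon ratio reported in the abstract (e.g. 2.6 averaged over the 15 plant species).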

Relevance:

30.00%

Publisher:

Abstract:

As the number of resources on the web exceeds by far the number of documents one can track, it becomes increasingly difficult to remain up to date on one's own areas of interest. The problem becomes more severe with the increasing fraction of multimedia data, from which it is difficult to extract a conceptual description of the contents. One way to overcome this problem is social bookmarking tools, which are rapidly emerging on the web. In such systems, users set up lightweight conceptual structures called folksonomies, thus overcoming the knowledge acquisition bottleneck. As more and more people participate in the effort, the use of a common vocabulary becomes more and more stable. We present an approach for discovering topic-specific trends within folksonomies. It is based on a differential adaptation of the PageRank algorithm to the triadic hypergraph structure of a folksonomy. The approach allows for any kind of data, as it does not rely on the internal structure of the documents; in particular, this makes it possible to consider different data types in the same analysis step. We run experiments on a large-scale real-world snapshot of a social bookmarking system.
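
As an illustration of the idea, here is a minimal Python sketch of PageRank-style weight spreading on the tripartite folksonomy graph; it follows the published FolkRank scheme only in spirit, and all function names and parameter values are ours:

```python
import numpy as np
from collections import defaultdict

def folkrank(assignments, preference=None, d=0.7, iters=50):
    """Weight spreading on the tripartite (user, tag, resource) graph.

    assignments: iterable of (user, tag, resource) triples; preference:
    optional dict mapping namespaced nodes, e.g. ('tag', 'python'), to
    extra preference weight for the topic of interest."""
    # each tag assignment links its three nodes pairwise (undirected)
    weight = defaultdict(float)
    nodes = set()
    for u, t, r in assignments:
        u, t, r = ('user', u), ('tag', t), ('res', r)
        nodes.update((u, t, r))
        for a, b in ((u, t), (u, r), (t, r)):
            weight[a, b] += 1.0
            weight[b, a] += 1.0
    idx = {node: i for i, node in enumerate(sorted(nodes))}
    A = np.zeros((len(idx), len(idx)))
    for (a, b), w in weight.items():
        A[idx[a], idx[b]] = w
    A /= A.sum(axis=0, keepdims=True)          # column-stochastic
    p = np.full(len(idx), 1.0 / len(idx))      # base preference vector
    if preference:
        for node, w in preference.items():
            p[idx[node]] += w
        p /= p.sum()
    rank = np.full(len(idx), 1.0 / len(idx))
    for _ in range(iters):                     # damped power iteration
        rank = d * A @ rank + (1 - d) * p
    return {node: rank[i] for node, i in idx.items()}
```

The topic-specific ("differential") ranking is then obtained by subtracting the rank vector of an unbiased run (preference=None) from that of a run biased towards the topic of interest.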

Relevance:

30.00%

Publisher:

Abstract:

Distributed systems are one of the most vital components of the economy. The most prominent example is probably the internet, a constituent element of our knowledge society. During recent years, the number of novel network types has steadily increased. Amongst others, sensor networks, distributed systems composed of tiny computational devices with scarce resources, have emerged. The further development and heterogeneous connection of such systems imposes new requirements on the software development process. Mobile and wireless networks, for instance, have to organize themselves autonomously and must be able to react to changes in the environment and to failing nodes alike. Researching new approaches for the design of distributed algorithms may lead to methods with which these requirements can be met efficiently. In this thesis, one such method is developed, tested, and discussed with respect to its practical utility. Our new design approach for distributed algorithms is based on Genetic Programming, a member of the family of evolutionary algorithms. Evolutionary algorithms are metaheuristic optimization methods which copy principles from natural evolution. They use a population of solution candidates which they try to refine step by step in order to attain optimal values for predefined objective functions. The synthesis of an algorithm with our approach starts with an analysis step in which the wanted global behavior of the distributed system is specified. From this specification, objective functions are derived which steer a Genetic Programming process in which the solution candidates are distributed programs. The objective functions rate how closely these programs approximate the goal behavior in multiple randomized network simulations. The evolutionary process step by step selects the most promising solution candidates and modifies and combines them with mutation and crossover operators. This way, a description of the global behavior of a distributed system is translated automatically into programs which, if executed locally on the nodes of the system, exhibit this behavior. In our work, we test six different ways of representing distributed programs, comprising adaptations and extensions of well-known Genetic Programming methods (SGP, eSGP, and LGP), one bio-inspired approach (Fraglets), and two new program representations designed by us, called Rule-based Genetic Programming (RBGP, eRBGP). We breed programs in these representations for three well-known example problems in distributed systems: election algorithms, distributed mutual exclusion at a critical section, and the distributed computation of the greatest common divisor of a set of numbers. Synthesizing distributed programs the evolutionary way does not necessarily lead to the envisaged results. In a detailed analysis, we discuss the problematic features which make this form of Genetic Programming particularly hard. The two Rule-based Genetic Programming approaches were developed especially to mitigate these difficulties. In our experiments, at least one of them (eRBGP) turned out to be a very efficient approach and, in most cases, was superior to the other representations.
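
A minimal sketch of the evolutionary loop described above, assuming problem-specific callbacks (random_program, evaluate, mutate, crossover) that the user supplies; evaluate is where the randomized network simulations would run. This illustrates the general scheme, not the thesis's implementation:

```python
import random

def evolve(random_program, evaluate, mutate, crossover,
           pop_size=200, generations=100, k=4, p_mut=0.2):
    """Generic evolutionary loop. evaluate(program) is assumed to run the
    candidate on the nodes of several randomized network simulations and
    return a fitness value (lower = closer to the specified global
    behavior); the four callbacks are problem- and representation-specific."""
    population = [random_program() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=evaluate)       # fittest first
        elite = ranked[: pop_size // 10]                # elitism

        def pick():                                     # tournament selection
            return ranked[min(random.randrange(pop_size) for _ in range(k))]

        children = []
        while len(children) < pop_size - len(elite):
            child = crossover(pick(), pick())
            if random.random() < p_mut:
                child = mutate(child)
            children.append(child)
        population = elite + children
    return min(population, key=evaluate)                # best final program
```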

Relevance:

30.00%

Publisher:

Abstract:

Scheduling tasks to efficiently use the available processor resources is crucial to minimizing the runtime of applications on shared-memory parallel processors. One factor that contributes to poor processor utilization is the idle time caused by long latency operations, such as remote memory references or processor synchronization operations. One way of tolerating this latency is to use a processor with multiple hardware contexts that can rapidly switch to executing another thread of computation whenever a long latency operation occurs, thus increasing processor utilization by overlapping computation with communication. Although multiple contexts are effective for tolerating latency, this effectiveness can be limited by memory and network bandwidth, by cache interference effects among the multiple contexts, and by critical tasks sharing processor resources with less critical tasks. This thesis presents techniques that increase the effectiveness of multiple contexts by intelligently scheduling threads to make more efficient use of processor pipeline, bandwidth, and cache resources. The thesis proposes thread prioritization as a fundamental mechanism for directing the thread schedule on a multiple-context processor. A priority is assigned to each thread either statically or dynamically and is used by the thread scheduler to decide which threads to load in the contexts, and which context to switch to on a context switch. We develop a multiple-context model that integrates both cache and network effects and shows how thread prioritization can both maintain high processor utilization and limit increases in critical path runtime caused by multithreading. The model also shows that, in order to be effective in bandwidth-limited applications, thread prioritization must be extended to prioritize memory requests. We show how simple hardware can prioritize the running of threads in the multiple contexts, and the issuing of requests to both the local memory and the network. Simulation experiments show how thread prioritization is used in a variety of applications. Thread prioritization can improve the performance of synchronization primitives by minimizing the number of processor cycles wasted in spinning and devoting more cycles to critical threads. It can be used in combination with other techniques to improve cache performance and minimize cache interference between different working sets in the cache. For applications that are critical-path limited, thread prioritization can improve performance by allowing processor resources to be devoted preferentially to critical threads. These experimental results show that thread prioritization is a mechanism that can be used to implement a wide range of scheduling policies.
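
A toy sketch of the prioritization mechanism, modelling the hardware contexts in software; the class and method names are ours, not the thesis's simulator:

```python
import heapq

class MultiContextScheduler:
    """Priority-directed thread scheduling on a multiple-context processor:
    the loader fills free hardware contexts with the highest-priority
    waiting threads, and on a long-latency operation the processor switches
    to the highest-priority context that is still ready."""

    def __init__(self, n_contexts):
        self.n_contexts = n_contexts
        self.loaded = []            # [priority, thread, ready?] per context
        self.pool = []              # max-heap of unloaded runnable threads

    def add_thread(self, thread, priority):
        heapq.heappush(self.pool, (-priority, id(thread), thread))

    def fill_contexts(self):
        # load the highest-priority waiting threads into free contexts
        while len(self.loaded) < self.n_contexts and self.pool:
            neg_prio, _, thread = heapq.heappop(self.pool)
            self.loaded.append([-neg_prio, thread, True])

    def on_long_latency(self, stalled):
        """Called when the running thread issues a remote reference or a
        synchronization operation: mark it not-ready and switch to the
        highest-priority ready context (None if all are stalled)."""
        for entry in self.loaded:
            if entry[1] is stalled:
                entry[2] = False
        ready = [e for e in self.loaded if e[2]]
        return max(ready, key=lambda e: e[0])[1] if ready else None
```

The thesis extends the same priority ordering to memory and network requests; in this sketch that would amount to tagging each request with the issuing thread's priority before it enters the memory queue.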

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a perceptual system for a humanoid robot that integrates abilities such as object localization and recognition with the deeper developmental machinery required to forge those competences out of raw physical experiences. It shows that a robotic platform can build up and maintain a system for object localization, segmentation, and recognition, starting from very little. What the robot starts with is a direct solution to achieving figure/ground separation: it simply 'pokes around' in a region of visual ambiguity and watches what happens. If the arm passes through an area, that area is recognized as free space. If the arm collides with an object, causing it to move, the robot can use that motion to segment the object from the background. Once the robot can acquire reliable segmented views of objects, it learns from them, and from then on recognizes and segments those objects without further contact. Both low-level and high-level visual features can also be learned in this way, and examples are presented for both: orientation detection and affordance recognition, respectively. The motivation for this work is simple. Training on large corpora of annotated real-world data has proven crucial for creating robust solutions to perceptual problems such as speech recognition and face detection. But the powerful tools used during training of such systems are typically stripped away at deployment. Ideally they should remain, particularly for unstable tasks such as object detection, where the set of objects needed in a task tomorrow might be different from the set of objects needed today. The key limiting factor is access to training data, but as this thesis shows, that need not be a problem on a robotic platform that can actively probe its environment, and carry out experiments to resolve ambiguity. This work is an instance of a general approach to learning a new perceptual judgment: find special situations in which the perceptual judgment is easy and study these situations to find correlated features that can be observed more generally.
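
A toy sketch of the motion-based segmentation step, under strong simplifying assumptions (the arm's own motion is not discounted, and thresholds are arbitrary); function names are ours:

```python
import numpy as np
from scipy import ndimage

def segment_by_poke(frames, contact_idx, thresh=25):
    """Toy figure/ground separation after a poke: difference the frames
    just before and after the arm makes the object move, and keep the
    largest connected blob of changed pixels as the object mask. A real
    implementation must also discount the arm's own motion; this
    illustrative version does not."""
    before = frames[contact_idx - 1].astype(np.int16)
    after = frames[contact_idx + 1].astype(np.int16)
    mask = np.abs(after - before) > thresh          # pixels that changed
    labels, n = ndimage.label(mask)                 # connected components
    if n == 0:
        return mask                                 # nothing moved
    sizes = np.bincount(labels.ravel())[1:]         # blob sizes, labels 1..n
    return labels == (np.argmax(sizes) + 1)         # largest moving blob
```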

Relevance:

30.00%

Publisher:

Abstract:

In a recent experiment, Freedman et al. recorded from inferotemporal (IT) and prefrontal cortices (PFC) of monkeys performing a "cat/dog" categorization task (Freedman 2001 and Freedman, Riesenhuber, Poggio, Miller 2001). In this paper we analyze the tuning properties of view-tuned units in our HMAX model of object recognition in cortex (Riesenhuber 1999) using the same paradigm and stimuli as in the experiment. We then compare the simulation results to the monkey inferotemporal neuron population data. We find that view-tuned model IT units that were trained without any explicit category information can show category-related tuning as observed in the experiment. This suggests that the tuning properties of experimental IT neurons might primarily be shaped by bottom-up stimulus-space statistics, with little influence of top-down task-specific information. The population of experimental PFC neurons, on the other hand, shows tuning properties that cannot be explained just by stimulus tuning. These analyses are compatible with a model of object recognition in cortex (Riesenhuber 2000) in which a population of shape-tuned neurons provides a general basis for neurons tuned to different recognition tasks.

Relevance:

30.00%

Publisher:

Abstract:

A major obstacle to processing images of the ocean floor comes from the absorption and scattering effects of light in the aquatic environment. Due to the absorption of natural light, underwater vehicles often require artificial light sources attached to them to provide adequate illumination. Unfortunately, these lights tend to illuminate the scene in a nonuniform fashion and, as the vehicle moves, induce shadows in the scene. For this reason, the first step towards applying standard computer vision techniques to underwater imaging requires dealing with these lighting problems. This paper analyses and compares existing methodologies for dealing with low-contrast, nonuniform illumination in underwater image sequences. The reviewed techniques include: (i) study of the illumination-reflectance model, (ii) local histogram equalization, (iii) homomorphic filtering, and (iv) subtraction of the illumination field. Several experiments on real data have been conducted to compare the different approaches.
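
Of the four reviewed techniques, homomorphic filtering lends itself to a compact sketch: in the log domain the illumination field becomes additive and low-frequency, so a high-emphasis filter can attenuate it while boosting reflectance detail. Parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

def homomorphic_filter(img, cutoff=0.1, gamma_low=0.5, gamma_high=1.5):
    """Homomorphic filtering of a single-channel image: log transform,
    Gaussian high-emphasis filtering in the frequency domain, then the
    inverse transforms. gamma_low < 1 attenuates the (low-frequency)
    illumination; gamma_high > 1 boosts the (high-frequency) reflectance."""
    log_img = np.log1p(img.astype(np.float64))
    spec = np.fft.fftshift(np.fft.fft2(log_img))
    rows, cols = img.shape
    u = np.arange(rows) - rows / 2
    v = np.arange(cols) - cols / 2
    d2 = (u[:, None] / rows) ** 2 + (v[None, :] / cols) ** 2
    # high-emphasis transfer function (normalised frequency, Gaussian shape)
    H = gamma_low + (gamma_high - gamma_low) * (1 - np.exp(-d2 / (2 * cutoff ** 2)))
    out = np.fft.ifft2(np.fft.ifftshift(spec * H)).real
    return np.expm1(out)
```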

Relevance:

30.00%

Publisher:

Abstract:

Glyphosate, N-(phosphonomethyl)glycine, is one of the most widely used herbicides in the world because of its low toxicity and its broad spectrum of application. As a consequence of its extensive use, it is necessary to monitor this compound and its main metabolite, aminomethylphosphonic acid (AMPA), in the environment. Several instrumental methods based on gas chromatography (GC) and liquid chromatography (HPLC) have been described, the latter being the more favourable option because of the polar character of the analytes. To reach low concentration levels, however, preconcentration of the analytes is required, and this work studies different alternatives for that purpose. The supported liquid membrane (SLM) technique was evaluated, in which the membrane consists of an organic solution containing a carrier (in our case a commercial anion exchanger, Aliquat 336) that impregnates a microporous polymer support placed between two aqueous solutions: the feed solution, which initially contains the analytes, and the receiving solution, where the analytes are retained after transport through the membrane. The most suitable extraction conditions are obtained working in basic medium with NaOH, where the analytes are in anionic form, and the highest recoveries are obtained with 0.1 M HCl or 0.5 M NaCl, indicating that chloride ion is the driving force of the transport. Once the system was designed, preconcentration experiments were carried out with two different geometries: a laminar membrane system (LSLM) with recirculation of the receiving phase, and a hollow-fibre system (HFSLM). The best results are obtained with the hollow-fibre module, with concentration factors of 25 and 3 for glyphosate and AMPA, respectively, recirculating 100 ml of feed solution and 4 ml of receiving solution for 24 hours. A more selective technique was also applied: immobilised metal-ion affinity chromatography (IMAC), based on the interaction between the analytes and a metal immobilised on a resin through one of its functional groups. In this study, palladium was immobilised on the 8-hydroxyquinoline functional group of the acrylic-matrix resin Spheron Oxine 1000, which was evaluated for the extraction and preconcentration of glyphosate and AMPA. For both analytes the adsorption is 100%, and the recoveries are above 80% and 60% for glyphosate and AMPA, respectively, using 0.1 M HCl + 1 M NaCl as eluent. These results are compared with those obtained with two other palladium-loaded resins: Iontosorb Oxin 100, which has the same functional group but a cellulose matrix, and Spheron Thiol 1000, where the functional group is a thiol and the matrix is also acrylic. For glyphosate the results are similar with all the resins, but for AMPA the Spheron Thiol resin is the only one that gives recoveries above 93%. Finally, another option studied is the coupling of two liquid chromatography columns (LC-LC). Here the objective is to improve the existing method for glyphosate and AMPA in natural waters, whose LOD was 0.25 µg/l. The method consists of precolumn derivatisation with the fluorescent reagent FMOC and analysis by LC-LC coupled with fluorescence detection. By slightly varying the derivatisation conditions, 0.1 µg/l of glyphosate and AMPA can be quantified. Natural waters were spiked with 0.1, 1 and 10 µg/l of the analytes to validate the method; recoveries between 85% and 100% were obtained, with relative standard deviations below 8%.
By applying a preconcentration step prior to derivatisation and analysis, using an anion-exchange resin, Amberlite IRA-900, the sensitivity of the method is improved and an LOD of 0.02 µg/l is reached for glyphosate.
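
A note on the reported concentration factors (our arithmetic, from the volumes given above): with 100 ml of feed solution and 4 ml of receiving solution, the maximum attainable enrichment is

```latex
E_{\max} = \frac{V_{\mathrm{feed}}}{V_{\mathrm{rec}}} = \frac{100\ \mathrm{ml}}{4\ \mathrm{ml}} = 25,
```

so the factor of 25 for glyphosate corresponds to essentially quantitative transport across the membrane, whereas the factor of 3 for AMPA does not.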

Relevance:

30.00%

Publisher:

Abstract:

The growth of databases containing ever more difficult images with an ever larger number of categories is forcing the development of image representations that remain discriminative when working with multiple classes, and of algorithms that are efficient in learning and classification. This thesis explores the problem of classifying images by the object they contain when a large number of categories is available. We first investigate how a hybrid system composed of a generative model and a discriminative model can benefit image classification when the level of human annotation is minimal. For this task we introduce a new vocabulary using a dense representation of colour-SIFT descriptors, and then investigate how the different parameters affect the final classification. We then propose a method for incorporating spatial information into the hybrid system, showing that context information is of great help for image classification. Next we introduce a new shape descriptor that represents the image by its local shape and its spatial shape, together with a kernel that incorporates this spatial information in a pyramidal form. The shape is represented by a compact vector, yielding a descriptor well suited to kernel-based learning algorithms. The experiments show that this shape information achieves results similar to, and sometimes better than, appearance-based descriptors. We also investigate how different features can be combined for image classification, and show that the proposed shape descriptor together with an appearance descriptor improves classification substantially. Finally, we describe an algorithm that detects regions of interest automatically during training and classification. This provides a way of inhibiting the image background and adds invariance to the position of objects within images. We show that using shape and appearance over this region of interest, together with random forests classifiers, improves both classification and computation time. We compare our results with results from the literature, using the same databases and the same training and classification protocols as the respective authors, and show that all the innovations introduced increase the final classification performance.
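
A minimal sketch of a pyramid-weighted histogram-intersection kernel of the kind referred to above (the weighting follows Lazebnik et al.'s spatial pyramid matching; the thesis's own shape kernel differs in detail):

```python
import numpy as np

def spatial_pyramid_kernel(h1, h2):
    """Pyramid-weighted histogram intersection. h1 and h2 are lists of
    per-level concatenated cell histograms for the two images, level 0
    being the whole image and higher levels increasingly fine grids."""
    L = len(h1) - 1
    # finer levels are weighted more strongly, as in spatial pyramid matching
    weights = [1.0 / 2 ** L] + [1.0 / 2 ** (L - l + 1) for l in range(1, L + 1)]
    return sum(w * np.minimum(a, b).sum()
               for w, a, b in zip(weights, h1, h2))
```

With a kernel-based learner such as an SVM, the Gram matrix K[i, j] = spatial_pyramid_kernel(H[i], H[j]) over all image pairs can be passed in as a precomputed kernel.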

Relevance:

30.00%

Publisher:

Abstract:

Network management is a very broad field that includes many different aspects. This doctoral thesis focuses on resource management in broadband networks that provide mechanisms for resource reservation, such as Asynchronous Transfer Mode (ATM) or Multi-Protocol Label Switching (MPLS). Logical networks can be established using ATM Virtual Paths (VPs) or MPLS Label Switched Paths (LSPs), which we generically call logical paths. Network users then use these logical paths, which can have resources assigned to them, to establish their communications. Moreover, logical paths are very flexible, and their characteristics can be changed dynamically. This work focuses, in particular, on the dynamic management of this logical network in order to maximise its performance and adapt it to the offered connections. In this scenario, several mechanisms can affect and modify the characteristics of the logical paths (bandwidth, route, etc.). These include load-balancing mechanisms (bandwidth reallocation and rerouting) and fault-restoration mechanisms (use of backup logical paths). Both kinds of mechanism can modify the logical network and manage the resources (bandwidth) of the physical links; there is therefore a need to coordinate them in order to avoid possible interferences. Conventional resource management based on a logical network recalculates the entire logical network periodically (for instance every hour or every day) in a centralised way. This introduces the problem that adjustments of the logical network are not performed at the moment when problems actually occur, and it also introduces the need to maintain a centralised view of the whole network. This thesis proposes a distributed architecture based on a multi-agent system. The main goal of this architecture is to perform joint, coordinated resource management at the logical network level, integrating the bandwidth reallocation mechanisms with the preplanned restoration mechanisms, including the management of the bandwidth reserved for restoration. We propose that this management be carried out continuously, not periodically, acting when a problem is detected (when a logical path is congested, i.e. when it is rejecting user connection requests because it is saturated), and in a completely distributed way, i.e. without maintaining a global view of the network. The proposed architecture thus performs small rearrangements of the logical network, continuously adapting it to user demand. The architecture also takes other goals into consideration, such as scalability, modularity, robustness, flexibility and simplicity. The proposed multi-agent system is structured in two layers of agents: monitoring (M) agents and performance (P) agents. These agents are situated on the different nodes of the network: there is one P agent and several M agents on each node, the latter subordinated to the P agent, so the proposed architecture can be seen as a hierarchy of agents. Each agent is responsible for monitoring and controlling the resources to which it is assigned. Different experiments were performed using a connection-level distributed simulator of our own design.
The results show that the proposed architecture is able to perform the assigned tasks of congestion detection, dynamic bandwidth reallocation and rerouting in a way that is coordinated with the preplanned restoration mechanisms and with the management of the bandwidth reserved for restoration. The distributed architecture offers acceptable scalability and robustness thanks to its flexibility and modularity.
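
A toy sketch of the two-layer agent idea, with an M agent detecting congestion from rejected connection requests and its P agent first trying to claim spare link bandwidth (never touching the bandwidth reserved for restoration) before falling back to rerouting. All interfaces, thresholds and names are illustrative, not the thesis's design:

```python
class MonitoringAgent:
    """M agent watching one logical path: raises an alarm to its P agent
    when the recent connection-rejection ratio exceeds a threshold."""

    def __init__(self, path, p_agent, threshold=0.05, window=100):
        self.path, self.p_agent = path, p_agent
        self.threshold, self.window = threshold, window
        self.outcomes = []          # True = connection request accepted

    def observe(self, accepted):
        self.outcomes = (self.outcomes + [accepted])[-self.window:]
        if (len(self.outcomes) == self.window and
                self.outcomes.count(False) / self.window > self.threshold):
            self.p_agent.congestion_alarm(self.path)


class PerformanceAgent:
    """P agent on a node: on a congestion alarm it first tries to grab
    spare bandwidth on the path's physical links, keeping the bandwidth
    reserved for preplanned restoration untouched, and only falls back
    to rerouting when no spare bandwidth exists."""

    def __init__(self, links):
        self.links = links          # link id -> {'capacity', 'used', 'reserved'}

    def congestion_alarm(self, path):
        spare = min(l['capacity'] - l['used'] - l['reserved']
                    for l in (self.links[e] for e in path.route))
        if spare > 0:
            grant = min(spare, path.increment)
            for e in path.route:
                self.links[e]['used'] += grant
            path.bandwidth += grant
        else:
            path.reroute()          # hypothetical hand-off to rerouting
```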