397 results for GHOST PROPAGATORS
Abstract:
The implementation of boundary conditions is one of the points where the SPH methodology still has some work to do. The aim of the present work is to provide an in-depth analysis of the most representative mirroring techniques used in SPH to enforce boundary conditions (BC) along solid profiles. We specifically refer to dummy particles, ghost particles, and Takeda et al. [1] boundary integrals. A Poiseuille flow has been used as an example to gradually evaluate the accuracy of the different implementations. Our goal is to test the behavior of the second-order differential operator with the proposed boundary extensions when the smoothing length h and other discretization parameters such as dx/h tend simultaneously to zero. First, using a smoothed continuous approximation of the unidirectional Poiseuille problem, the evolution of the velocity profile has been studied focusing on the values of the velocity and the viscous shear at the boundaries, where the exact solution should be approximated as h decreases. Second, to evaluate the impact of the discretization of the problem, an Eulerian SPH discrete version of the former problem has been implemented and similar results have been monitored. Finally, for the sake of completeness, a 2D Lagrangian SPH implementation of the problem has also been studied to compare the consequences of the particle movement.
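As a toy illustration of the kind of test described in this abstract (a sketch under simplifying assumptions, not the authors' exact setup), the following Python snippet evaluates a standard SPH second-derivative operator on a Poiseuille-like profile u(y) = y(1 - y), whose exact Laplacian is -2, using antisymmetric ghost particles to enforce the no-slip condition at the wall y = 0. The kernel, discretization values, and the Brookshaw-style operator are common choices, not necessarily those of the paper:

```python
import numpy as np

def gaussian_w_prime(r, h):
    """Derivative dW/dr of a 1D Gaussian kernel W(r,h) = exp(-(r/h)^2)/(h*sqrt(pi))."""
    w = np.exp(-(r / h) ** 2) / (h * np.sqrt(np.pi))
    return -2.0 * r / h**2 * w

def sph_laplacian(y_i, u_i, y, u, dx, h):
    """Brookshaw-style SPH Laplacian: 2 * sum_j V_j (u_i - u_j) W'(r)/r."""
    r = np.abs(y - y_i)
    mask = r > 1e-12                      # skip the particle itself
    return 2.0 * np.sum(dx * (u_i - u[mask]) * gaussian_w_prime(r[mask], h) / r[mask])

dx, h = 0.005, 0.01                       # particle spacing and smoothing length
y = np.arange(dx / 2, 1.0, dx)            # fluid particles in the channel [0, 1]
u = y * (1.0 - y)                         # steady Poiseuille-like profile

# Ghost particles: mirror positions across y = 0 with antisymmetric
# velocities (u_ghost = -u_mirror), which enforces u = 0 at the wall.
y_g = -y[y < 4 * h]
u_g = -u[y < 4 * h]
y_all = np.concatenate([y, y_g])
u_all = np.concatenate([u, u_g])

i_mid = len(y) // 2                       # mid-channel point with full kernel support
lap_mid = sph_laplacian(y[i_mid], u[i_mid], y_all, u_all, dx, h)
print(lap_mid)                            # close to the exact value -2
```

Repeating the evaluation at particles progressively closer to the wall, and for decreasing h at fixed dx/h, is the kind of convergence study the abstract describes.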
Abstract:
Synthetic Aperture Radar (SAR) images a target region reflectivity function in the multi-dimensional spatial domain of range and cross-range. SAR synthesizes a large aperture radar in order to achieve finer azimuth resolution than the one provided by any on-board real antenna. Conventional SAR techniques assume a single reflection of transmitted waveforms from targets. Nevertheless, today¿s new scenes force SAR systems to work in urban environments. Consequently, multiple-bounce returns are added to direct-scatter echoes. We refer to these as ghost images, since they obscure true target image and lead to poor resolution. By analyzing the quadratic phase error (QPE), this paper demonstrates that Earth¿s curvature influences the defocusing degree of multipath returns. In addition to the QPE, other parameters such as integrated sidelobe ratio (ISLR), peak sidelobe ratio (PSLR), contrast and entropy provide us with the tools to identify direct-scatter echoes in images containing undesired returns coming from multipath.
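Two of the focus indicators mentioned here, contrast and entropy, can be sketched in a few lines. The definitions below are one widely used form (the SAR literature has several variants, so treat these as illustrative rather than the paper's exact formulas): contrast as the standard deviation over the mean of the intensity image, and entropy as the Shannon entropy of the normalized intensity. A well-focused image concentrates energy in few pixels, giving higher contrast and lower entropy:

```python
import numpy as np

def contrast(intensity):
    """Image contrast C = std(I)/mean(I) on the intensity image I = |s|^2."""
    return intensity.std() / intensity.mean()

def entropy(intensity):
    """Shannon entropy E = -sum p*log(p) with p = I / sum(I)."""
    p = intensity / intensity.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
focused = np.full((64, 64), 1e-3)
focused[32, 32] = 100.0                        # a single bright point scatterer
defocused = np.abs(rng.normal(size=(64, 64)))  # energy smeared everywhere

print(contrast(focused) > contrast(defocused))  # True
print(entropy(focused) < entropy(defocused))    # True
```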
Abstract:
Conventional SAR (Synthetic Aperture Radar) techniques only consider a single reflection of transmitted waveforms from targets. Nevertheless, today's new applications force SAR systems to work in much more complex scenes such as urban environments. As a result, multiple-bounce returns are additionally superposed to direct echoes. We refer to these as ghost images, since they obscure the true target image and lead to poor resolution. By applying the Time Reversal concept to SAR imaging (TR-SAR), it is possible to considerably reduce, or almost completely mitigate, ghosting artifacts, recovering the resolution lost to multipath effects. Furthermore, some focusing indicators such as entropy (E), contrast (C) and Rényi entropy (RE) provide us with a good focusing criterion when using TR-SAR.
Abstract:
Synthetic Aperture Radar (SAR) images a target region reflectivity function in the multi-dimensional spatial domain of range and cross-range with a finer azimuth resolution than the one provided by any on-board real antenna. Conventional SAR techniques assume a single reflection of transmitted waveforms from targets. Nevertheless, new uses of Unmanned Aerial Vehicles (UAVs) for civilian-security applications force SAR systems to work in much more complex scenes such as urban environments. Consequently, multiple-bounce returns are additionally superposed to direct-scatter echoes. They are known as ghost images, since they obscure the true target image and lead to poor resolution. All this may pose a significant problem in applications related to surveillance and security. In this work, an innovative multipath mitigation technique is presented in which the Time Reversal (TR) concept is applied to SAR images when the target is concealed in clutter, leading to the TR-SAR technique. This way, the effect of multipath is considerably reduced, or even removed, recovering the resolution lost to multipath propagation. Furthermore, some focusing indicators such as entropy (E), contrast (C) and Rényi entropy (RE) provide us with a good focusing criterion when using TR-SAR.
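The core Time Reversal idea behind TR-SAR can be sketched with a one-dimensional toy (illustrative only, not the paper's algorithm): a pulse propagates through a multipath channel h, and re-emitting the time-reversed received signal makes the channel act as its own matched filter, so energy refocuses into one sharp peak instead of being smeared over ghost echoes. The channel taps below are made-up values:

```python
import numpy as np

h = np.zeros(64)
h[0] = 1.0                                          # direct path
h[[7, 19, 26, 41, 55]] = [0.6, 0.8, 0.5, 0.7, 0.4]  # multiple-bounce ghost paths

transmitted = np.zeros(16)
transmitted[0] = 1.0                         # short probing pulse
received = np.convolve(h, transmitted)       # echo smeared by multipath
refocused = np.convolve(h, received[::-1])   # retransmit the time-reversed echo

def peak_fraction(x):
    """Fraction of total energy concentrated in the strongest sample."""
    return np.max(x**2) / np.sum(x**2)

# The refocused return concentrates its energy much better than the raw echo.
print(peak_fraction(refocused) > peak_fraction(received))   # True
```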
Abstract:
Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions on their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework on object-oriented programs existed before this thesis, advanced aspects to improve the efficiency, the accuracy and the reliability of the results of the analysis still need to be further investigated. This thesis tackles this need from the following four different perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first extension is to consider array accesses (in addition to object fields), while the second extension focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to (re-)analyze fragments of code that are not affected by the changes. During software development, programs are permanently modified, but most analyzers still read and analyze the entire program at once in a non-incremental way.
This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods in an incremental way. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy, unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on the development of a formal framework for the verification of the resource guarantees obtained by the analyzers, instead of verifying the tools. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code. COSTA is able to derive upper bounds of Java programs while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that the proposed tool cooperation can be used for automatically producing verified resource guarantees. (4) Distribution and concurrency are mainstream today. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units that communicate via asynchronous method calls. Distribution suggests that analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, can keep the cost of the diverse distributed components separate.
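The incremental idea (re-analyze only what a change can affect) can be sketched with a plain worklist pass over the call graph: the methods to re-analyze are the changed ones plus their transitive callers, while every other method keeps its previous cost summary. The names and data model below are illustrative, not COSTA's actual representation:

```python
def affected_methods(call_graph, changed):
    """call_graph: dict method -> set of callees. Returns the set of methods
    to re-analyze: the changed ones plus all their transitive callers."""
    # Invert the graph: who calls whom.
    callers = {m: set() for m in call_graph}
    for m, callees in call_graph.items():
        for c in callees:
            callers.setdefault(c, set()).add(m)
    # Standard worklist propagation up the caller chains.
    affected, worklist = set(changed), list(changed)
    while worklist:
        m = worklist.pop()
        for caller in callers.get(m, ()):
            if caller not in affected:
                affected.add(caller)
                worklist.append(caller)
    return affected

g = {"main": {"sort", "log"}, "sort": {"swap"}, "swap": set(), "log": set()}
print(affected_methods(g, {"swap"}))     # {'swap', 'sort', 'main'}; 'log' untouched
```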
Abstract:
The city is a total interior. Like a new kind of nature, urban mediums proliferate indefinitely, pervading all available space. Their atmospheric qualities are not only defined by the weather, but by a growing range of substances and forces that form a vast ocean of chemicals, energy and information. The ghost architecture that inadvertently penetrates the objects and assemblages present in our daily lives is as important for the qualification of our environment as traditional, solid and visible architecture. Consequently, there is virtually no context (social, political, professional) where material practices can simply ignore such environmental densification.
We live immersed in a complex, active and largely artificial atmosphere that we voluntarily or involuntarily incorporate into our bodies, in a process where ultimately the subject and the environment form a common substance. This is the starting point of our research: through a series of essentially architectural processes that can be called 'urban enchantments,' individuals and objects come to share a reciprocal impregnation. This spatial erotica consists of the relationship, sometimes unnoticed and violent, sometimes playful and hedonistic, between subject and medium. This research aims at the construction of an expanded and multifaceted idea of environment through the analysis of its quintessential components and physical stimuli, here called ambient effects. Such effects are inevitably accompanied by two key elements. On the one hand there is Structure, which refers to the objects or devices that produce them. On the other hand there is the Affect experienced by the subject; namely, the emotional and physiological consequences involved in effect assimilation. The resulting three interlinked concepts, Structure-Effect-Affect, provide the overall conceptual structure of this study. Three main sections are proposed. The first one investigates the concept of ambient effect in different ways: as an artistic figure, as the origin of a new spatial paradigm originated within scientific practices and, finally, as an aesthetic category. The middle section deals with the relationship between structure and effect, and focuses on the construction of certain artifacts and assemblages whose sole purpose is to characterize space by environmental emissions only. Finally, the third part investigates architecture's quest for ultimate ambient materiality, that is, a space where structure, atmosphere and psyche finally converge.
The strategies, from the epistemological to the technical, leading to the production of all kinds of effects, be they ornamental, emotional or physiological, and the practices that focus on effects rather than on the objects from which they come, will be studied. All of them will open new windows onto a contemporary notion of environment and will contribute to the construction of new living habitats.
Abstract:
Bead models are used in the dynamical simulation of tethers. These models discretize a cable using beads distributed along its length, and the time evolution is obtained numerically. Typically the number of particles ranges between 5 and 50, depending on the required accuracy. Sometimes the simulation is extended over long periods (several years). The complex interactions between the cable and its spatial environment require optimizing, both in runtime and precision, the propagators that constitute the central core of the process. The special perturbation method treated in this article combines simplicity of computer implementation, speed and precision, and is capable of propagating the orbit of any material particle. The paper describes the evolution of some orbital elements, which are constants in a non-perturbed problem but which evolve on the time scale imposed by the perturbation. The method can be used with any kind of orbit and is free of singularities related to small inclination and/or small eccentricity. The use of Euler parameters makes it robust.
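For context, the baseline that special perturbation methods improve upon is plain Cowell-style integration of the two-body equations. The sketch below is only that baseline in normalized units (mu = 1), with a fixed-step RK4 integrator and an energy check; the element-based, Euler-parameter formulation of the article is considerably more elaborate:

```python
import numpy as np

def deriv(state):
    """Two-body dynamics in normalized units: state = (r, v), mu = 1."""
    r, v = state[:3], state[3:]
    return np.concatenate([v, -r / np.linalg.norm(r) ** 3])

def rk4_step(state, dt):
    """Classical fixed-step fourth-order Runge-Kutta update."""
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def energy(state):
    """Specific orbital energy, a constant of the unperturbed problem."""
    r, v = state[:3], state[3:]
    return 0.5 * v @ v - 1.0 / np.linalg.norm(r)

state = np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0])   # circular orbit, radius 1
e0 = energy(state)                                  # -0.5 for this orbit
for _ in range(1000):
    state = rk4_step(state, 2 * np.pi / 1000)       # propagate one full period
print(abs(energy(state) - e0))                      # small: energy nearly conserved
```

Monitoring conserved quantities such as the energy above is a standard sanity check when validating any orbit propagator.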
Abstract:
The subject of the present thesis is the interaction between a viscous fluid and a solid body in the presence of a free surface. The problem is first expressed theoretically, with a particular focus on energy conservation and the fluid-body interaction.
The problem is considered 2D and monophasic, and some mathematical development allows for a decomposition of the energy dissipation into terms related to the free surface and others related to the enstrophy. The numerical model used in the thesis is based on Smoothed Particle Hydrodynamics (SPH): a computational method that works by dividing the fluid into particles. Analogously to what is done at the continuum level, the conservation properties are studied on the discrete system of particles. Additionally, the boundary conditions for a moving body in a viscous flow are treated and discussed using the ghost-fluid method. An explicit algorithm for handling fluid-body coupling is also developed. Following these theoretical developments on the numerical model, some test cases are devised in order to test the ability of the model to correctly reproduce the energy dissipation and the motion of the body. The attenuation of a standing wave is used to compare what is numerically simulated with what is theoretically predicted. Further tests are done in order to monitor the energy dissipation in the case of more violent flows involving the fragmentation of the free surface. The amount of energy dissipated by the different terms is assessed with the numerical model. Other numerical tests are performed in order to test the fluid/body interaction method: forces exerted by waves on simple shapes, and the equilibrium of a floating body with a complex shape. Once the numerical model has been validated, numerical tests are performed in order to get a more complete understanding of the physics involved in (almost) realistic cases. First, a study is performed on the flow past a cylinder under the free surface. The study is performed at moderate Reynolds numbers, for various cylinder submergences and various Froude numbers. The capacity of the numerical solver allows for an investigation of the complex patterns which occur.
The wake from the cylinder interacts with the free surface, and some characteristic flow mechanisms are identified. The second study concerns the sloshing problem, both experimentally and numerically. The analysis is restricted to shallow water and horizontal excitation, but a large number of conditions are studied, leading to a quite complete understanding of the wave systems involved. The last part of the thesis also involves a sloshing problem, but this time the tank is rolling and there is coupling with a mechanical system. The system is named pendulum-TLD (Tuned Liquid Damper). This kind of system is normally used for the damping of civil structures. The analysis is performed analytically, numerically and experimentally, using liquids with different viscosities and focusing on non-linear features and dissipation mechanisms.
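The standing-wave attenuation test mentioned in this abstract is usually benchmarked against the classical linear-theory result that, for small viscosity, a deep-water standing wave of wavenumber k decays in amplitude as a(t) = a0 * exp(-2 * nu * k^2 * t) (Lamb's result; the thesis' exact comparison may differ). A minimal worked example with illustrative values:

```python
import math

def amplitude(a0, nu, k, t):
    """Linear-theory amplitude decay of a viscous deep-water standing wave."""
    return a0 * math.exp(-2.0 * nu * k**2 * t)

a0 = 0.1                      # initial amplitude, m (illustrative)
nu = 1e-4                     # kinematic viscosity, m^2/s (illustrative)
k = 2 * math.pi / 1.0         # wavenumber for a 1 m wavelength

# Time for the amplitude to halve: log(2) / (2 nu k^2).
half_life = math.log(2) / (2.0 * nu * k**2)
print(half_life)              # roughly 88 s for these values
```

Comparing the decay rate fitted from an SPH simulation against this exponential is the kind of validation the abstract refers to.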
Abstract:
Electric probes are objects immersed in the plasma with sharp boundaries which collect or emit charged particles. Consequently, the nearby plasma evolves under abruptly imposed and/or naturally emerging conditions. There can be localized currents, different time scales for the evolution of the plasma species, charge separation and absorbing-emitting walls. Traditional numerical schemes based on differences often transform these disparate boundary conditions into computational singularities. This is the case of models using advection-diffusion differential equations with source-sink terms (also called Fokker-Planck equations). These equations are used, in both fluid and kinetic descriptions, to obtain the distribution functions or the density of each plasma species close to the boundaries. We present a resolution method grounded on an integral advancing scheme that uses approximate Green's functions, also called short-time propagators. All the integrals, as a path-integration process, are calculated numerically, which yields a robust grid-free computational integral method that is unconditionally stable for any time step. Hence, sharp boundary conditions, such as current emission from a wall, can be treated during the short-time regime, providing solutions that work as if they were known analytically at each time step. The form of the propagator (typically a multivariate Gaussian) is not unique, and it can be adjusted during the advancing scheme to preserve the conserved quantities of the problem. The effects of electric or magnetic fields can be incorporated into the iterative algorithm. The method allows smooth transitions of the evolving solutions even when abrupt discontinuities are present. This work proposes a procedure to incorporate, for the very first time, the boundary conditions into the numerical integral scheme.
This numerical scheme is applied to model the interaction of the plasma bulk with a charge-emitting electrode, dealing with fluid diffusion equations combined self-consistently with the Poisson equation. The stability of this computational method has been checked for any number of iterations, even when advancing in time electrons and ions that have different time scales. This work establishes the basis for dealing, in future work, with problems related to plasma thrusters or emissive probes in electromagnetic fields.
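The essence of a short-time-propagator scheme can be shown on the simplest case, the 1D diffusion equation u_t = D u_xx: each step convolves the solution with the exact Gaussian propagator G(x, dt), so the update is stable for arbitrarily large dt. This minimal sketch omits everything that makes the work above non-trivial (boundary conditions, drift and source terms, self-consistent fields):

```python
import numpy as np

def propagate(u, x, D, dt):
    """One integral advancing step: u(., t+dt) = G(., dt) * u(., t)."""
    g = np.exp(-(x - x[len(x) // 2]) ** 2 / (4.0 * D * dt))  # Gaussian propagator
    g /= g.sum()                          # discrete normalization of G
    return np.convolve(u, g, mode="same")

x = np.linspace(-10.0, 10.0, 401)
u = np.exp(-x**2)                         # initial Gaussian, variance 1/2
for _ in range(4):
    u = propagate(u, x, D=1.0, dt=0.5)    # 4 large steps, still stable

# After t = 2 the variance should grow from 1/2 to 1/2 + 2*D*t = 4.5.
var = np.sum(u * x**2) / np.sum(u)
print(var)                                # close to 4.5
```

Note that dt = 0.5 here is far beyond the explicit finite-difference stability limit for any reasonable grid spacing, which is the point of the integral formulation.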
Abstract:
Praying mantids use binocular cues to judge whether their prey is in striking distance. When there are several moving targets within their binocular visual field, mantids need to solve the correspondence problem. They must select between the possible pairings of retinal images in the two eyes so that they can strike at a single real target. In this study, mantids were presented with two targets in various configurations, and the resulting fixating saccades that precede the strike were analyzed. The distributions of saccades show that mantids consistently prefer one out of several possible matches. Selection is in part guided by the position and the spatiotemporal features of the target image in each eye. Selection also depends upon the binocular disparity of the images, suggesting that insects can perform local binocular computations. The pairing rules ensure that mantids tend to aim at real targets and not at “ghost” targets arising from false matches.
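The correspondence problem described above can be caricatured in a few lines (a toy model with made-up geometry, not the study's analysis): each eye reports the azimuths of candidate targets, and pairings are ranked by how close their binocular disparity is to the disparity expected for prey at striking distance; false pairings correspond to "ghost" targets:

```python
def best_match(left_angles, right_angles, preferred_disparity):
    """Pick the (left, right) pairing whose disparity (left - right, in
    degrees) is closest to the disparity expected at striking distance."""
    pairs = [(l, r) for l in left_angles for r in right_angles]
    return min(pairs, key=lambda p: abs((p[0] - p[1]) - preferred_disparity))

# Two targets seen by both eyes: four possible pairings, two of them ghosts.
left, right = [10.0, 20.0], [5.0, 15.0]
print(best_match(left, right, preferred_disparity=5.0))   # (10.0, 5.0)
```

The actual selection rules inferred in the study also weight image position and spatiotemporal features, not disparity alone.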
Abstract:
Plasma membrane ghosts form when plant protoplasts attached to a substrate are lysed to leave a small patch of plasma membrane. We have identified several factors, including the use of a mildly acidic actin stabilization buffer and the inclusion of glutaraldehyde in the fixative, that allow immunofluorescent visualization of extensive cortical actin arrays retained on membrane ghosts made from tobacco (Nicotiana tabacum L.) suspension-cultured cells (line Bright Yellow 2). Normal microtubule arrays were also retained using these conditions. Membrane-associated actin is random; it exhibits only limited coalignment with the microtubules, and microtubule depolymerization in whole cells before wall digestion and ghost formation has little effect on actin retention. Actin and microtubules also exhibit different sensitivities to the pH and K+ and Ca2+ concentrations of the lysis buffer. There is, however, strong evidence for interactions between actin and the microtubules at or near the plasma membrane, because both ghosts and protoplasts prepared from taxol-pretreated cells have microtubules arranged in parallel arrays and an increased amount of actin coaligned with the microtubules. These experiments suggest that the organization of the cortical actin arrays may be dependent on the localization and organization of the microtubules.
Abstract:
The primary goal of this thesis was to determine if spaced synaptic stimulation induced the differential expression of microRNAs (miRNAs) in the Drosophila melanogaster central nervous system (CNS). Prior to attaining this goal, we needed to identify and validate a spaced stimulation paradigm that could induce the formation of new synaptic growth at a model synapse, the larval neuromuscular junction (NMJ). Both Channelrhodopsin- and high-potassium-based stimulation paradigms adapted from Ataman et al. (2008) were tested. Once validation of these paradigms was complete, we sought to characterize the miRNA expression profile of the larval CNS by miRNA array. Having obtained these data, we used quantitative real-time PCR (RT-qPCR) to determine if acute synaptic stimulation caused the differential expression of neuronal miRNAs. We found that upon high-potassium spaced training in a wild-type (Canton S) genotype, 5 miRNAs showed significant differential expression when normalized to a validated reference gene, the U1 snRNA. Moreover, absolute quantification in our RT-qPCR study implicated one miRNA, miR-958, as being significantly regulated by activity. Investigation into potential targets for miR-958 revealed it to be a potential regulator of Dlar, a protein tyrosine phosphatase implicated in synapse development. This investigation provides the foundation to directly test our underlying hypothesis that, following spaced training, differential expression of miRNAs alters the translation of proteins required to induce and maintain these structural changes at the synapse.
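Relative quantification against a reference gene, as in the U1 snRNA normalization above, is conventionally done with the 2^-ddCt method. The sketch below uses made-up Ct values purely to illustrate the arithmetic (it is the standard textbook calculation, not this thesis' actual data):

```python
def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Standard 2^-ddCt relative quantification of RT-qPCR data."""
    dct_treated = ct_target_treated - ct_ref_treated    # normalize to reference gene
    dct_control = ct_target_control - ct_ref_control
    ddct = dct_treated - dct_control
    return 2.0 ** (-ddct)

# A miRNA whose Ct drops by one cycle after stimulation, while the
# reference gene stays flat, is about 2-fold up-regulated.
print(fold_change(24.0, 18.0, 25.0, 18.0))   # 2.0
```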
Resumo:
The purpose of this doctoral paper was to use the “Ghosts in the Nursery” theory (Fraiberg, Adelson, & Shapiro, 1975) as a framework for understanding clinicians’ perceptions of women’s experience of miscarriage. Specific attention was paid to the experience of becoming pregnant with a subsequent child. Professionals who work in the field of infant mental health were asked to explore the theory’s utility in conceptualizing the experience of becoming pregnant after a miscarriage. Results indicated that the perceptions of women’s experiences of miscarriage and subsequent pregnancy are congruent with previous research findings. Further elaboration and information are provided to illustrate the experience of having a child and being a parent after a loss, and to explore the idea of understanding miscarriage as a “ghost”. This study extends the “Ghosts in the Nursery” theory (Fraiberg et al., 1975), applying a new perspective to children born after a loss.
Resumo:
The Quaternary climate of southern Europe (southern Italy and Greece) is investigated by pollen analysis of the sapropels that were deposited in the deep eastern Mediterranean Sea during the last 1 million years (1 Ma). The time-scale of core KC01b in the Ionian Sea has been established by tuning its oxygen isotopic record to the ice volume model of Imbrie and Imbrie (1980, doi:10.1126/science.207.4434.943). For the last 250,000 years (250 ka), the previous pollen studies and astronomical tuning have been confirmed. Sapropels were deposited under a wide range of Mediterranean climates: fully interglacial, fully glacial, and intermediate, as revealed mainly by the balance between the respective pollen abundances of oak (Quercus) and sagebrush (Artemisia). High oak values indicate the warm and wet climate of an interglacial; high sagebrush values indicate the dry and cold climate of a glacial. Whereas the Mediterranean climate is directly related to the variation of the high-latitude ice sheets, the deposition of sapropels is not. Despite the wide climatic range, sapropels were deposited only when summer insolation at low latitudes reached its highest peaks. Between 250 ka and 1 Ma, however, that stable pattern is not yet established. Only six sapropels are observed; many expected ones do not appear, even as ghosts signalled by peaks of barium abundance, which remain after post-depositional oxidation of organic matter. The pattern of sapropel formation in a stable, direct relationship to highest insolation does not seem to apply. For five of these sapropels, no climatic extreme is observed; they formed mainly during intermediate types of Mediterranean climate. In contrast, one sapropel (and one ghost) relates to a relatively low peak of insolation, and its climate is of a unique, composite type not seen later.
This might suggest an unsuspected, more complex pattern linking the formation of Mediterranean sapropels to the astronomical configuration.