736 results for Bubble
Abstract:
The dynamics of a gas-filled microbubble encapsulated by a viscoelastic fluid shell, immersed in a Newtonian liquid, and subject to an external pressure field are theoretically studied. The problem is formulated by considering a nonlinear Oldroyd-type constitutive equation to model the rheological behavior of the fluid shell. Heat and mass transfer across the bubble surface have been neglected, but radiation losses due to the compressibility of the surrounding liquid have been taken into account. Bubble collapse under a sudden increase of the external pressure, as well as nonlinear radial oscillations under ultrasound fields, are investigated. The numerical results show that the elasticity of the fluid coating intensifies oscillatory collapse and produces a strong increase in the amplitude of the radial oscillations, which may become chaotic even for moderate driving pressure amplitudes. The role played by the elongational viscosity has also been analyzed, and its influence on both bubble collapse and radial oscillations has been identified. According to the theoretical predictions of the present work, a microbubble coated by a viscoelastic fluid shell is an oscillating system that, under acoustic driving, may experience volume oscillations of large amplitude while remaining more stable than a free bubble. It could therefore be expected that such a system would behave suitably as an echogenic agent.
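The full shelled-bubble model of the abstract is considerably more elaborate, but the underlying radial-dynamics problem can be sketched with the classical Rayleigh-Plesset equation for a free (uncoated) bubble. The minimal Python sketch below integrates it with a fixed-step fourth-order Runge-Kutta scheme; the viscoelastic shell (Oldroyd model) and acoustic radiation losses are deliberately omitted, and every parameter value is an illustrative assumption, not taken from the paper.

```python
import math

# Classical Rayleigh-Plesset dynamics of a FREE (uncoated) gas bubble,
# integrated with fixed-step RK4. Minimal sketch only: the shell rheology
# and radiation damping of the abstract are omitted; all parameter values
# below are illustrative assumptions.
RHO   = 1000.0     # liquid density [kg/m^3]
MU    = 1.0e-3     # liquid shear viscosity [Pa s]
SIGMA = 0.072      # surface tension [N/m]
KAPPA = 1.4        # polytropic exponent of the gas
P0    = 101325.0   # ambient pressure [Pa]
R0    = 2.0e-6     # equilibrium radius [m]
PG0   = P0 + 2.0 * SIGMA / R0      # equilibrium gas pressure
PA, F = 20.0e3, 1.0e6              # acoustic amplitude [Pa], frequency [Hz]

def p_inf(t):
    """Far-field pressure: ambient plus a sinusoidal ultrasound field."""
    return P0 - PA * math.sin(2.0 * math.pi * F * t)

def rhs(t, y):
    """State y = (R, Rdot); returns (Rdot, Rddot) from Rayleigh-Plesset."""
    R, Rdot = y
    pg = PG0 * (R0 / R) ** (3.0 * KAPPA)          # polytropic gas pressure
    acc = (pg - p_inf(t) - 2.0 * SIGMA / R - 4.0 * MU * Rdot / R) / (RHO * R) \
          - 1.5 * Rdot * Rdot / R
    return (Rdot, acc)

def rk4_step(y, t, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = rhs(t, y)
    k2 = rhs(t + 0.5 * dt, (y[0] + 0.5 * dt * k1[0], y[1] + 0.5 * dt * k1[1]))
    k3 = rhs(t + 0.5 * dt, (y[0] + 0.5 * dt * k2[0], y[1] + 0.5 * dt * k2[1]))
    k4 = rhs(t + dt, (y[0] + dt * k3[0], y[1] + dt * k3[1]))
    return (y[0] + dt / 6.0 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            y[1] + dt / 6.0 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

y, t, dt = (R0, 0.0), 0.0, 1.0e-10
rmax = R0
for _ in range(20000):             # two acoustic cycles at 1 MHz
    y = rk4_step(y, t, dt)
    t += dt
    rmax = max(rmax, y[0])
print(rmax > R0)                   # the bubble expands beyond equilibrium
```

At this modest driving amplitude the response stays nearly linear; the chaotic large-amplitude regimes discussed in the abstract require the shell terms and stronger forcing.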
Abstract:
The thesis decodes a selection of twenty representative Sejima-SANAA projects, from the first one built, the Platform I House (1987), to the Rolex Center (2010), the year in which Sejima and Nishizawa (SANAA) received the Pritzker Prize. Eleven projects are by Sejima: Platform I, Platform II, Saishunkan Seiyaku Women's Dormitory, N House, Pachinko Parlor I, Villa in the Forest, Police Box at Chofu Station, Y House, Gifu Kitagata Apartments, World City Expo '96 Facilities Building and Pachinko Parlor III; and nine are by SANAA: the Multimedia Workshop in Ogaki, the Metropolitan Housing Studies, the Park Café in Koga, De Kunstlinie in Almere, the Kanazawa Museum, the Glass Pavilion at the Toledo Museum of Art, the Zollverein School, the Flower House and the Rolex Center. The decoding reads Sejima-SANAA's projects in reverse, aiming to 'reconstruct', in a fictitious simulation exercise, a plausible and coherent version of what their projectual processes could have been; 'could', because the true ones are impossible to ascertain. The processes proposed here claim only to be plausible and reasonable. In doing so, the thesis seeks to contribute to the understanding of Sejima-SANAA's architecture and, tangentially and to a lesser extent, to the theory of architectural design practice. The decoding centers on two specific aspects: architectural form and the projectual role of the load-bearing structure. Both decodings inevitably extend to other related aspects, such as the nature of architectural space.
The research procedure began with an objective and detailed description of the formal and structural signifiers of each project, viewed from its own physical and geometric configuration. Taken to the limit, these 'objective' descriptions allowed the conceptual structures and underlying logics of each project to surface. Combined with critical interpretation, which related and confronted them with other architectures and other well-known ways of working, it became possible to outline the fictitious reconstruction that the decoding pursues. The descriptive and analytical work materialized in twenty critical essays, accompanied by a set of further essays on subjects suggested or demanded by the research process. Together, all of these texts constitute the working material of the thesis. From there, taking an overall view, the thesis identifies two related trajectories: a trajectory of formal strategies and a trajectory of projectual strategies concerning the load-bearing structure. Together they constitute the bulk of the thesis, presented in the four central chapters. They are preceded by an introductory chapter outlining the biographical path of Kazuyo Sejima and the professional trajectory of Sejima-SANAA, and followed by a chapter of transversal texts on form, place and space. The thesis ends with a synthesis of its conclusions. The formal strategies are presented in three chapters. The first, 'Early formal strategies', groups the projects of Sejima's first phase. The second, 'Formal strategies of Gifu's paradigm', is entirely dedicated to the Gifu apartments project (1994-98), which according to this thesis marked an important inflection point in Sejima's trajectory; so much so that the third chapter is named 'Formal strategies after Gifu' and gathers the selected projects that followed it.
The 'Early formal strategies', diverse and tentative, revolve in general around two well-known methods of composition: composition by parts and systematic composition. The latter introduces into Sejima-SANAA's trajectory an aspect that will remain relevant from here on: understanding the project as a specific instance of a generic proposal in which, beneath and beyond its tangible reality, lies a logic (each project has its own) that could be extrapolated to other places, other dimensions, even other programs; each project could give rise to other projects of the same family. The projects using systematic composition include, among others, the Platform II House, based on the definition of a constructive element and of rules governing its repetition and possible groupings. They also include the Saishunkan Seiyaku Women's Dormitory, the project that launched Sejima to international fame, which can also be seen as a system, but of a different kind: one based on the regular repetition of a series of elements along a directrix, generating a hypothetical infinite container of which the project would be only a fragment. The formal strategy of the Gifu apartment building pushes further toward the generic, adopting the logic of a game. The project would be one play of the game, but not the only possible one; others could be played. The thesis verifies this game hypothesis in 'The Game of Gifu', which, after formulating the game and identifying its elements (board and pieces), rules and procedures, plays one round: the one that would have given rise to the building designed by Sejima.
Gifu extends the concept of repeating a constructive element to that of repeating a spatial pattern, with all that this implies: the decoupling of form and function, and a new concept of flexibility that no longer refers to the flexible use of the constructed building but to the projectual moment at which specific functions are assigned to the spatial patterns. This thesis proposes that this allocation of functions would be one of the last steps of the projectual process, quite the opposite of the modern premise that 'form follows function'. The formal strategies after Gifu also follow a game logic, but, as their names reveal, each strategy responds to a different game: 'Game Boards', present with different degrees of maturity in several projects; 'Elements from a Catalogue', in the Kanazawa Museum; 'Aprioristic Form', in the Flower House; and 'Repetition of a topological situation', in the Rolex Center. All of these strategies, or games, share aspects of architectural form that were already present in Gifu: repetition applied to the spatial pattern, the decoupling of form and function, and the new meaning of flexibility. --'Game Boards' consists of setting up a base geometry (each project has its own, generally reticular) and giving form to each project system (structure, enclosure, partitions and furniture) by choosing elements it offers: intersections, lines, modules. Each system is configured, in principle, without subordination to any of the others; when subordination is unavoidable, the rules of the game determine that the load-bearing system may not be the one to materialize the base geometric order, which means it does not play the dominant role. 'Game Boards' thus transgresses the logic of the modern free plan: the structure neither reflects nor reveals the base order, and the systems do not respect the hierarchic, chained relations of subordination that the free plan called for.
'Game Boards' leads to very different solutions and formal projects: the Ogaki and Park Café projects show incipient game boards; De Kunstlinie in Almere and the Zollverein School present consolidations of the strategy; and the Toledo Glass Pavilion results from its subversion. The Toledo project, moreover, takes the concept of repetition beyond the constructive element and the spatial pattern (here bubble-shaped) until it affects the experience of the spectator, who, wherever he stands, always has the sensation of being in the same place. This thesis calls that repetitive space 'mantra space'. --'Elements from a Catalogue' is illustrated by the Kanazawa Museum. Its logic starts from the definition of a very small set of elements and relies on the huge number of possible combinations among them. Gifu had already announced the catalogue of elements in the characterization of its spatial patterns. --'Aprioristic Form' is illustrated by the Flower House. The decision about the type of form, in this case that of an amoeba, stands at the beginning of the projectual process; this does not mean the form is arbitrary: the amoeba form implies the repetition of a spatial pattern (the pseudopod) and an apotheosis of the concept of repetition that, reaching the spatial experience, gives rise to a repetitive, or mantra, space. 'Mantra space' is one of the leitmotifs used as an argument in the last formal strategy the thesis decodes: the Rolex Center. With respect to the projectual strategies of the load-bearing structure, the thesis identifies and traces a trajectory of five strategies: preeminence, concealment, dissolution, disappearance and distortion ('desvirtuación'). --Preeminence is present in Sejima's first works, in which structures play a dominant role insofar as they embody the greater scale and/or materialize the base geometric order.
In later works that preeminence is inverted, the projects aiming at its opposite: lighter, slighter, smaller structures. --Concealment reduces the dominant role of the structure. At the outset the concealment is literal, almost a covering of the structural elements, as in Gifu; later it becomes more sophisticated, as in the concealment by camouflage, or the paradoxical concealment by multiplication of the Koga Park Café. --Dissolution diminishes the dominant condition of the structure: instead of being configured as a unitary or homogeneous system, it fragments into several subsystems. --Disappearance refers to structures that fade away as autonomous, self-contained systems; projects in which the load-bearing function is carried out by other systems, such as the partitions. Disappearance reaches its zenith in the Flower House, whose perimeter functions structurally while also being transparent, dematerialized: the structure has become invisible, it has disappeared. --Distortion ('desvirtuación') refers to structures that do appear as autonomous systems but no longer play a preeminent role, inasmuch as they do not materialize the base order; this strategy correlates with the 'Game Boards' formal strategy. The conclusions of the thesis lie in its very organization: the identification of the strategies. Even so, six conclusions are set out as epilogues. The first two underline the leading thread of the work, rooted in the generic quality of Sejima-SANAA's projectual strategies. The following four examine to what extent their projects show formal and/or structural features or signifiers that can also be read as characteristic signals of the contemporary architectural panorama, and raise the key question: looking further, can some of them be taken as original contributions?
--As original contributions the conclusions highlight: the identification of the generic ideal with the concrete project; and the proposal of a new, hybrid space, a kind of intermediate stage between the subdivided, compartmented space of tradition and the modern continuum. --As symptoms of contemporaneity they highlight: with respect to form, the transfer of formal specificity from the part to the whole; and with respect to structure, the contemporary tendency toward ever lighter and slimmer structures, tending to the evanescent. This last tendency, toward structural evanescence, could qualify as an original contribution: not for nothing does the disappearance of the structure carry evanescence to its ultimate consequences, and, in the case of structures with a physical presence, it makes them cease to be the ordering system orchestrating the projectual process.
Abstract:
We present and analyze a subgrid viscosity Lagrange-Galerkin method that combines the subgrid eddy viscosity method proposed in W. Layton, A connection between subgrid scale eddy viscosity and mixed methods, Appl. Math. Comput., 133:147-157, 2002, with a conventional Lagrange-Galerkin method in the framework of P1 ⊕ cubic bubble finite elements. This results in an efficient and easy-to-implement stabilized method for convection-dominated convection-diffusion-reaction problems. Numerical experiments support the results of the numerical analysis and show that the new method is more accurate than the conventional Lagrange-Galerkin one.
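The stabilization idea can be illustrated far more simply than with the paper's P1 ⊕ bubble Lagrange-Galerkin scheme. The hedged Python toy below solves a 1D steady convection-diffusion problem with plain P1 Galerkin elements (which, on a uniform mesh, coincide with central differences) and then again with an added eddy-viscosity-style artificial viscosity of size bh/2; the problem, mesh and coefficients are illustrative assumptions, not taken from the paper.

```python
# Plain P1 Galerkin on a coarse mesh for  -eps*u'' + b*u' = 0 on (0,1),
# u(0)=0, u(1)=1, versus the same scheme with extra artificial viscosity
# b*h/2 of eddy-viscosity type. A hedged 1D toy, NOT the paper's method.

def solve(eps, b, n):
    """P1 Galerkin on n uniform elements (== central differences); Thomas solve."""
    h = 1.0 / n
    m = n - 1                          # interior unknowns u_1 .. u_{n-1}
    lo = [-eps / h - b / 2.0] * m      # sub-diagonal
    di = [2.0 * eps / h] * m           # diagonal
    up = [-eps / h + b / 2.0] * m      # super-diagonal
    rhs = [0.0] * m
    rhs[-1] -= up[-1] * 1.0            # Dirichlet u(1)=1 (u(0)=0 adds nothing)
    for i in range(1, m):              # Thomas algorithm: forward elimination
        w = lo[i] / di[i - 1]
        di[i] -= w * up[i - 1]
        rhs[i] -= w * rhs[i - 1]
    u = [0.0] * m                      # back substitution
    u[-1] = rhs[-1] / di[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (rhs[i] - up[i] * u[i + 1]) / di[i]
    return u

eps, b, n = 0.005, 1.0, 20            # cell Peclet number b*h/(2*eps) = 5
plain = solve(eps, b, n)
stab  = solve(eps + b * (1.0 / n) / 2.0, b, n)   # add the extra viscosity

print(min(plain) < 0.0)                          # plain Galerkin oscillates
print(min(stab) > 0.0 and max(stab) <= 1.0)      # stabilized stays monotone
```

With a cell Péclet number of 5, the plain Galerkin solution oscillates around the boundary layer (values well outside [0, 1]), while the extra viscosity restores monotonicity at the price of smearing the layer, which is exactly the accuracy/stability trade-off that subgrid methods refine.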
Abstract:
Several models have been proposed for the mechanism of transcript termination by Escherichia coli RNA polymerase at rho-independent terminators. Yager and von Hippel (Yager, T. D. & von Hippel, P. H. (1991) Biochemistry 30, 1097–1118) postulated that the transcription complex is stabilized by enzyme–nucleic acid interactions and the favorable free energy of a 12-bp RNA–DNA hybrid but is destabilized by the free energy required to maintain an extended transcription bubble. Termination, by their model, is viewed simply as displacement of the RNA transcript from the hybrid helix by reformation of the DNA helix. We have proposed an alternative model where the RNA transcript is stably bound to RNA polymerase primarily through interactions with two single-strand specific RNA-binding sites; termination is triggered by formation of an RNA hairpin that reduces binding of the RNA to one RNA-binding site and, ultimately, leads to its ejection from the complex. To distinguish between these models, we have tested whether E. coli RNA polymerase can terminate transcription at rho-independent terminators on single-stranded DNA. RNA polymerase cannot form a transcription bubble on these templates; thus, the Yager–von Hippel model predicts that intrinsic termination will not occur. We find that transcript elongation on single-stranded DNA templates is hindered somewhat by DNA secondary structure. However, E. coli RNA polymerase efficiently terminates and releases transcripts at several rho-independent terminators on such templates at the same positions as termination occurs on duplex DNAs. Therefore, neither the nontranscribed DNA strand nor the transcription bubble is essential for rho-independent termination by E. coli RNA polymerase.
Abstract:
Studies of recombination-dependent replication (RDR) in the T4 system have revealed the critical roles played by mediator proteins in the timely and productive loading of specific enzymes onto single-stranded DNA (ssDNA) during phage RDR processes. The T4 recombination mediator protein, UvsY, is necessary for the proper assembly of the T4 presynaptic filament (UvsX recombinase cooperatively bound to ssDNA), leading to the recombination-primed initiation of leading strand DNA synthesis. In the lagging strand synthesis component of RDR, replication mediator protein gp59 is required for the assembly of gp41, the DNA helicase component of the T4 primosome, onto lagging strand ssDNA. Together, UvsY and gp59 mediate the productive coupling of homologous recombination events to the initiation of T4 RDR. UvsY promotes presynaptic filament formation on 3′ ssDNA-tailed chromosomes, the physiological primers for T4 RDR, and recent results suggest that UvsY may also serve as a coupling factor between presynapsis and the nucleolytic resection of double-stranded DNA ends. Other results indicate that UvsY stabilizes UvsX bound to the invading strand, effectively preventing primosome assembly there. Instead, gp59 directs primosome assembly to the displaced strand of the D loop/replication fork. This partitioning mechanism enforced by the T4 recombination/replication mediator proteins guards against the antirecombination activity of the helicase component and ensures that recombination intermediates formed by UvsX/UvsY will be efficiently converted into semiconservative DNA replication forks. Although the major mode of T4 RDR is semiconservative, we present biochemical evidence that a conservative “bubble migration” mode of RDR could play a role in lesion bypass by the T4 replication machinery.
Abstract:
In recent years VAR models have become the main econometric tool for testing whether a relation between variables may exist and for evaluating the effects of economic policies. This thesis studies three different identification approaches starting from reduced-form VAR models (including the sampling period, the set of endogenous variables and the deterministic terms). For VAR models we use the Granger causality test to verify the ability of one variable to predict another; in the case of cointegration we use VECM models to jointly estimate the long-run and short-run coefficients; and in the case of small data sets and overfitting problems we use Bayesian VAR models, with impulse response functions and variance decomposition, to analyze the effect of shocks on macroeconomic variables. To this end, the empirical studies are carried out using specific time series and formulating different hypotheses. Three VAR models are used: first, to study monetary policy decisions and to discriminate among the various post-Keynesian theories of monetary policy, in particular the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015) and the nominal GDP rule in the Euro Area (paper 1); second, to extend the evidence on the money endogeneity hypothesis by evaluating the effects of bank securitization on the monetary policy transmission mechanism in the United States (paper 2); third, to evaluate the effects of ageing on health expenditure in Italy in terms of economic policy implications (paper 3). The thesis opens with chapter 1, which outlines the context, motivation and purpose of this research, while its structure and synthesis, as well as the main results, are described in the remaining chapters.
Chapter 2 examines, using a VAR model in first differences with quarterly Euro-area data, whether monetary policy decisions can be interpreted in terms of a "monetary policy rule", with specific reference to the so-called "nominal GDP targeting rule" (McCallum 1988; Hall and Mankiw 1994; Woodford 2012). The results show a causal relation running from the gap between the growth rates of nominal GDP and target GDP to changes in the three-month market interest rate. The same analysis does not seem to confirm the existence of a significant causal relation in the opposite direction, from changes in the market interest rate to the gap between the growth rates of nominal GDP and target GDP. Similar results were obtained by replacing the market interest rate with the ECB refinancing rate. This confirmation of only one of the two directions of causality does not support an interpretation of monetary policy based on the nominal GDP targeting rule, and raises doubts, in more general terms, about the applicability of the Taylor rule and of all conventional monetary policy rules to the case at hand. The results appear instead to be more in line with other possible approaches, such as those based on certain post-Keynesian and Marxist analyses of monetary theory, and more specifically with the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015). These lines of research dispute the simplistic thesis that the scope of monetary policy consists in stabilizing inflation, real GDP or nominal income around a "natural equilibrium" level. Rather, they suggest that central banks actually pursue a more complex aim: the regulation of the financial system, with particular reference to the relations between creditors and debtors and the relative solvency of economic units.
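A Granger-causality test of the kind used here compares a restricted autoregression of the effect variable with an unrestricted one that also includes lags of the candidate cause, via an F statistic. The pure-Python sketch below runs the lag-1 version on simulated data in which x drives y by construction; the series and all coefficients are illustrative assumptions, not the thesis data.

```python
import random

# Toy lag-1 Granger-causality check in pure Python: does x help predict y?
# Simulated data only; series and coefficients are illustrative assumptions.

def ols_rss(X, yv):
    """Residual sum of squares of OLS via normal equations (Gaussian elim.)."""
    k, n = len(X[0]), len(yv)
    A = [[sum(X[t][i] * X[t][j] for t in range(n)) for j in range(k)]
         for i in range(k)]
    b = [sum(X[t][i] * yv[t] for t in range(n)) for i in range(k)]
    for col in range(k):                       # elimination w/ partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        s = sum(A[r][c] * beta[c] for c in range(r + 1, k))
        beta[r] = (b[r] - s) / A[r][r]
    return sum((yv[t] - sum(X[t][i] * beta[i] for i in range(k))) ** 2
               for t in range(n))

def granger_F(cause, effect):
    """F statistic for 'cause' Granger-causing 'effect' at lag 1."""
    n = len(effect) - 1
    yv = effect[1:]
    Xr = [[1.0, effect[t]] for t in range(n)]            # restricted model
    Xu = [[1.0, effect[t], cause[t]] for t in range(n)]  # + lagged cause
    rss_r, rss_u = ols_rss(Xr, yv), ols_rss(Xu, yv)
    return (rss_r - rss_u) / (rss_u / (n - 3))

random.seed(0)
x, y = [0.0], [0.0]
for _ in range(400):              # x drives y, not the other way round
    x.append(0.5 * x[-1] + random.gauss(0, 1))
    y.append(0.3 * y[-1] + 0.8 * x[-2] + random.gauss(0, 1))

F_xy, F_yx = granger_F(x, y), granger_F(y, x)
print(F_xy > F_yx)                # causality detected only from x to y
```

In practice one would use more lags, information criteria for lag selection, and a library implementation; the point here is only the restricted-versus-unrestricted comparison behind the one-directional causality findings reported above.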
Chapter 3 analyses the supply of loans by considering the endogeneity of money arising from banks' securitization activity over the period 1999-2012. Although much of the literature investigates the endogeneity of the money supply, this approach has rarely been adopted to investigate money endogeneity in the short and long run through a study of the United States during its two major crises: the bursting of the dot-com bubble (1998-1999) and the sub-prime mortgage crisis (2008-2009). In particular, we consider the effects of financial innovation on the bank lending channel using the securitization-adjusted loan series, in order to test whether the US banking system is encouraged to seek cheaper sources of funding, such as securitization, under restrictive monetary policy (Altunbas et al., 2009). The analysis is based on the monetary aggregates M1 and M2. Using VECM models, we examine a long-run relationship between the variables in levels and evaluate the effects of the money supply by analysing how much monetary policy affects short-run deviations from the long-run relationship. The results show that securitization influences the impact of loans on M1 and M2. This implies that the money supply is endogenous, confirming the structuralist approach and highlighting that economic agents are motivated to increase securitization as a precautionary hedge against monetary policy shocks. Chapter 4 investigates the relationship between per capita health expenditure, per capita GDP, the ageing index, and life expectancy in Italy over the period 1990-2013, using Bayesian VAR models and annual data drawn from the OECD and Eurostat databases.
The impulse response functions and variance decomposition highlight a positive relationship: from per capita GDP to per capita health expenditure, from life expectancy to health expenditure, and from the ageing index to per capita health expenditure. The impact of ageing on health expenditure is more significant than that of the other variables. Overall, our results suggest that disabilities closely connected with ageing may be the main driver of health expenditure in the short-to-medium run. Good healthcare management helps improve patient wellbeing without increasing total health expenditure. However, policies that improve the health status of older people may be needed to lower the per capita demand for health and social services.
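Chapter 3's VECM machinery rests on the error-correction idea: when two series are cointegrated, deviations from their long-run relation are gradually corrected, and the VECM's adjustment coefficient measures how fast. A minimal single-equation sketch on synthetic data (hypothetical series, not the US data used in the chapter):

```python
# Error-correction sketch: y drifts with x (a common stochastic trend) and
# corrects toward it, so the OLS slope of the change in y on the lagged
# disequilibrium (y - x) should be negative, mimicking a VECM alpha.
import random

random.seed(1)
x, y = [0.0], [0.0]
for t in range(1, 500):
    x.append(x[-1] + random.gauss(0, 1))                             # random-walk trend
    y.append(y[-1] + 0.5 * (x[-1] - y[-1]) + random.gauss(0, 0.2))   # y corrects toward x

dy  = [y[t] - y[t - 1] for t in range(1, 500)]      # short-run changes
ecm = [y[t - 1] - x[t - 1] for t in range(1, 500)]  # lagged disequilibrium
me, md = sum(ecm) / len(ecm), sum(dy) / len(dy)
alpha = (sum((e - me) * (d - md) for e, d in zip(ecm, dy))
         / sum((e - me) ** 2 for e in ecm))
print(alpha < 0)  # True: deviations from the long-run relation shrink over time
```

A full VECM estimates the cointegrating vector and the adjustment coefficients jointly across all equations; this one-equation version only illustrates the sign and interpretation of the adjustment term.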
Abstract:
The quality of gas dispersion in flotation cells is commonly characterised by parameters such as superficial gas velocity (Jg), gas hold-up (εg), bubble size distribution (db or D3,2), and bubble surface area flux (Sb). As a mineral separation process that depends on the interaction (collision + attachment) between hydrophobic particles and air bubbles, flotation has a performance that depends on adequate gas dispersion in the ore pulp. This work therefore aimed to characterise the state of gas dispersion in two cells of a bank of four 42.5 m³ Wemco (self-aerated) cells operating in series at the Vale Fertilizantes plant (Cajati-SP). Three measurement campaigns were carried out under different operating conditions: a) rotor diameter (D) of 1.09 m and rotation speed (N) between 145 RPM and 175 RPM; b) D = 0.99 m and N between 110 RPM and 190 RPM; c) D = 0.99 m and N of 120 RPM and 130 RPM. The following gas dispersion values were observed: 0.7 <= Jg <= 5.4 cm/s, 7 <= εg <= 15%, 1.6 <= D3,2 <= 2.4 mm, and Sb in the range of 24 to 162 s-1. The magnitude of Jg measured in the 1st and 2nd campaigns was above the values reported in the literature, indicating the need to modify the operating conditions of the equipment as well as careful maintenance. The 3rd campaign subsequently showed better agreement of the gas dispersion parameters with the literature, with a considerable improvement in the performance of the flotation process.
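As a quick consistency check on the reported ranges, the relation commonly used to link these parameters, Sb = 6 Jg / d32 (assumed here; the thesis may use a slightly different definition), can be evaluated at the endpoints:

```python
# Bubble surface area flux from superficial gas velocity and Sauter mean
# bubble diameter: Sb = 6 * Jg / d32 (Jg in cm/s, d32 in cm -> Sb in 1/s).
def bubble_surface_area_flux(jg_cm_s: float, d32_mm: float) -> float:
    """Sb = 6*Jg/d32, with d32 converted from mm to cm."""
    return 6.0 * jg_cm_s / (d32_mm / 10.0)

# Endpoints of the reported ranges (Jg 0.7-5.4 cm/s, D3,2 1.6-2.4 mm):
low  = bubble_surface_area_flux(0.7, 2.4)   # ~17.5 s^-1
high = bubble_surface_area_flux(5.4, 1.6)   # ~202.5 s^-1
print(round(low, 1), round(high, 1))
```

The reported Sb range of 24 to 162 s-1 falls inside this envelope, so the measured parameters are mutually consistent in order of magnitude.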
Abstract:
Iodisation of table salt is considered the most efficient way to control Iodine Deficiency Disorders. In tropical countries, the element can be added as KIO3. To ensure that optimal levels of the anion are available to the consumer, quality control of salt is a fundamental strategy. Multicommuted flow systems are a versatile alternative for developing simple, fast, and clean procedures, minimising reagent consumption and waste generation. In this context, an analytical procedure using a multicommuted flow system with spectrophotometric detection was developed for the determination of iodate in table salt. The reaction employed was based on the formation of a purple compound (540 nm) between iodate (IO3-) and p-aminophenol (PAP) in acidic medium. The residence time of the sample zone in the analytical path was exploited to favour the slow reaction, and the sampling rate was optimised to improve analytical performance. The selected conditions were 2 pulses for sample insertion, 3 pulses for reagent (0.25% (m/v) PAP in 0.025 mol L-1 HCl), 7 sampling cycles, 200 pulses of carrier (water), a 1 s air bubble (40 µL), a 70 cm reactor (3 mm i.d.), and a 480 s stopped flow. A linear response was observed between 2.28x10-5 and 3.65x10-4 mol L-1, described by the equation A = 0.2443 + 2030 C, r = 0.997. The detection limit (99.7% confidence), coefficient of variation (n = 20), and sampling rate were estimated at 8.2x10-6 mol L-1, 0.42%, and 70 determinations per hour, respectively. Each determination consumed 1.05 mg of PAP and generated 0.70 mL of waste. The main concomitant species present in the sample did not interfere with the determination of iodate at concentrations up to 8 times higher than those usually found. Iodate addition and recovery studies were carried out with the proposed procedure, yielding recovery percentages between 88 and 104%.
The analytical procedure developed shows adequate sensitivity for the determination of iodate in table salt samples and a high sampling rate compared with procedures described in the literature.
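For illustration, the reported calibration line can be inverted to convert a measured absorbance into an iodate concentration. This is a hypothetical use of the published fit, not code from the paper:

```python
# Hypothetical post-processing of the reported calibration line
# A = 0.2443 + 2030*C (A: absorbance at 540 nm, C: iodate in mol/L, r = 0.997).
INTERCEPT, SLOPE = 0.2443, 2030.0

def iodate_concentration(absorbance: float) -> float:
    """Invert the calibration line: C = (A - intercept) / slope."""
    return (absorbance - INTERCEPT) / SLOPE

# A mid-range standard (C = 1.0e-4 mol/L) is recovered from its absorbance:
a = INTERCEPT + SLOPE * 1.0e-4
print(iodate_concentration(a))  # ~1.0e-04 mol/L
```

Readings below the detection limit (8.2x10-6 mol L-1) or above the upper end of the linear range (3.65x10-4 mol L-1) would of course not be valid inputs to this inversion.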
Abstract:
One of the fundamental regulatory aspects of the Brazilian real estate market is the set of limits on obtaining financing under the Housing Finance System (Sistema Financeiro de Habitação). These limits can be set so as to increase or reduce the supply of credit in this market, changing the behaviour of its agents and, with it, the market price of properties. In this work, we propose a price formation model for the Brazilian real estate market based on the behaviour of the agents that compose it. Sellers behave heterogeneously and are influenced by historical demand, while buyers' behaviour is determined by the availability of credit. This credit availability, in turn, is defined by the limits on granting financing under the Housing Finance System. We show that the Markov process describing the market price converges to a deterministic dynamical system as the number of agents grows, and we analyse the behaviour of this dynamical system. We identify the family of random variables representing sellers' behaviour for which the system exhibits a non-trivial equilibrium price, consistent with reality. We also find that the equilibrium price depends not only on the financing rules of the Housing Finance System, but also on buyers' reserve price and on sellers' memory of, and sensitivity to, changes in demand. Sellers' memory and sensitivity can lead to price oscillations above or below the equilibrium price (typical of bubble formation processes), or even to a Neimark-Sacker bifurcation, where the system exhibits stable oscillatory dynamics.
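The qualitative finding, that seller sensitivity to demand can destabilise the equilibrium price, can be illustrated with a deliberately simplified toy map (not the thesis's Markovian model): sellers adjust the price in proportion to excess demand, and buyers' effective demand is capped by a credit limit and their reserve price.

```python
# Toy price map, a simplified stand-in for the agent-based model: sellers move
# the price in proportion to excess demand; demand is capped by credit/reserve.
def simulate(sensitivity, credit_limit=100.0, reserve=80.0, steps=200, p0=50.0):
    prices = [p0]
    for _ in range(steps):
        p = prices[-1]
        demand = max(0.0, min(credit_limit, reserve) - p)  # no demand above the caps
        supply = 0.5 * p                                   # more sellers at higher prices
        prices.append(max(0.0, p + sensitivity * (demand - supply)))
    return prices

p_star = 2 * 80.0 / 3          # equilibrium of the toy map: demand = supply
calm = simulate(0.2)           # low sensitivity: price converges to p_star
wild = simulate(1.5)           # high sensitivity: persistent price oscillations
print(abs(calm[-1] - p_star) < 1e-6, max(wild[-20:]) - min(wild[-20:]) > 50)
```

In this toy map the fixed point is stable only while |1 - 1.5*sensitivity| < 1; beyond that threshold the price settles into a persistent cycle, a one-dimensional caricature of the oscillatory regimes (and the Neimark-Sacker bifurcation) analysed in the thesis.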
Abstract:
The catalytic activity and durability of 2 wt.% Pd/Al2O3, both in powder form and washcoated on cordierite monoliths, were examined for the liquid-phase hydrodechlorination (LPHDC) of polychlorinated dibenzo-p-dioxins/polychlorinated dibenzofurans (PCDD/Fs), also known as dioxins. NaOH was employed as a neutralizing agent, and 2-propanol was used as both a hydrogen donor and a solvent. Fresh and spent powder and monolith samples were characterized by elemental analysis, surface area, hydrogen chemisorption, scanning electron microscopy/energy-dispersive X-ray spectroscopy (SEM/EDX), and transmission electron microscopy/energy-dispersive X-ray spectroscopy (TEM/EDX). Three reactor configurations were compared, namely slurry and monolith batch reactors and a bubble loop column, resulting in 100, 70, and 72% sample toxicity reduction, respectively, after 5 h of reaction. However, the slurry and monolith batch reactors led to catalyst sample loss, via filtration (slurry) and washcoat erosion (monolith batch), as well as rapid deactivation of the powder catalyst samples. The monolith employed in the bubble loop column remained stable and active after four reaction runs. Three preemptive regeneration methods were evaluated on the spent monolith catalyst: 2-propanol washing, oxidation/reduction, and reduction. All three procedures reactivated the spent catalyst samples, but the combustion-based methods proved more efficient at eliminating the more stable poisons.
Abstract:
Context. The discovery of several clusters of red supergiants towards l = 24°−30° has triggered interest in this area of the Galactic plane, where lines of sight are very complex and previous explorations of the stellar content were very preliminary. Aims. We attempt to characterise the stellar population associated with the H ii region RCW 173 (=Sh2-60), a population that previous studies have suggested could lie beyond the Sagittarius arm. Methods. We obtained UBV photometry of a stellar field to the south of the brightest part of RCW 173, as well as spectroscopy of about twenty stars in the area. We combined our new data with archival 2MASS near-infrared photometry and Spitzer/GLIMPSE imaging and photometry to achieve a more accurate characterisation of the stellar sources and the associated cloud. Results. We find a significant population of early-type stars located at d = 3.0 kpc, in good agreement with the "near" dynamical distance to the H ii region. This population should be located at the near intersection of the line of sight with the Scutum-Crux arm. A luminous O7 II star is likely to be the main source of ionisation. Many stars are concentrated around the bright nebulosity, where GLIMPSE images in the mid-infrared show the presence of a bubble of excited material surrounding a cavity that coincides spatially with a number of B0-1 V stars. We interpret this as an emerging cluster, perhaps triggered by the nearby O7 II star. We also find a number of B-type giants. Some of them are located at approximately the same distance and may be part of an older population in the same area, characterised by much lower reddening. A few have shorter distance moduli and are likely to be located in the Sagittarius arm. Conclusions. The line of sight in this direction is very complex. Optically visible tracers delineate two spiral arms, but seem to be absent beyond d ≈ 3 kpc.
Several H ii regions in this area suggest that the Scutum-Crux arm contains thick clouds actively forming stars. All these populations are projected on top of the major stellar complex signposted by the clusters of red supergiants.
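As a back-of-the-envelope check on the distance scale quoted above (not the paper's own calculation), the distance modulus implied by d = 3.0 kpc, ignoring extinction, is:

```python
import math

# Distance modulus for a population at d = 3.0 kpc (extinction ignored).
def distance_modulus(d_pc: float) -> float:
    """Apparent minus absolute magnitude, mu = 5 * log10(d / 10 pc)."""
    return 5.0 * math.log10(d_pc / 10.0)

print(round(distance_modulus(3000.0), 2))  # 12.39; interstellar extinction
                                           # adds to the observed modulus
```

The B-type giants with "shorter distance moduli" mentioned in the abstract would thus show moduli noticeably below this value, placing them in the foreground Sagittarius arm.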
Abstract:
The current economic crisis has had an impact on practically every socioeconomic sector, with education among the hardest hit. At this moment of uncertainty about the viability of traditional education and the supposed bubble in higher education (aggravated by youth unemployment rates), MOOCs have emerged in response to the need to bring training closer to the knowledge society. But whether massive open online courses shape the future of education will depend largely on their economic viability. Although no definitive business model has yet been found, MOOC platforms are experimenting with different alternatives which, combined with the scalability of these projects, can be expected to yield substantial results.
Abstract:
Purpose: To compare outcomes of big-bubble deep anterior lamellar keratoplasty (DALK) and penetrating keratoplasty (PK) for macular corneal dystrophy. Design: Prospective, randomized, interventional case series. Methods: Setting: Single hospital. Patients: Eighty-two eyes of 54 patients requiring keratoplasty for the treatment of macular corneal dystrophy without endothelial involvement were included. Main outcome measures: Operative complications, uncorrected visual acuity, best-corrected visual acuity, contrast sensitivity function, higher-order aberrations, and endothelial cell density were evaluated. Results: The DALK and PK groups consisted of 35 and 41 eyes, respectively. Best-corrected visual acuity after surgery was 20/40 or better in 68.5% and 70.7% of the eyes in the DALK and PK groups, respectively (P > .05). No statistically significant differences between groups were found in contrast sensitivity function with or without glare for any spatial frequency (P > .05). Significantly higher levels of higher-order aberrations were found in the DALK group (P < .01). In both groups, a progressive and statistically significant reduction in endothelial cell density was found (P < .01). At the last follow-up, the mean endothelial cell loss was 18.1% and 26.9% in the DALK and PK groups, respectively (P = .03). Graft rejection episodes were seen in 5 eyes (12.1%) in the PK group, and regrafting was necessary in 3 eyes (7.3%). Recurrence of the disease was documented in 5.7% and 4.8% of the eyes in the DALK and PK groups, respectively. Conclusions: Deep anterior lamellar keratoplasty with the big-bubble technique provided visual and optical results comparable to those of PK, resulted in less endothelial damage, and eliminated endothelial rejection in macular corneal dystrophy. Deep anterior lamellar keratoplasty is a viable option for macular corneal dystrophy without endothelial involvement.
Abstract:
The decomposition of azodicarbonamide, used as a foaming agent in PVC-plasticizer (1/1) plastisols, was studied by DSC. Nineteen different plasticizers, all belonging to the ester family and two of them polymeric (polyadipates), were compared. The temperature of maximum decomposition rate (in anisothermal regime at a 5 K min−1 scanning rate) ranges between 434 and 452 K. The heat of decomposition ranges between 8.7 and 12.5 J g−1. Some trends in the variation of these parameters appear significant and are discussed in terms of solvent (matrix) and viscosity effects on the decomposition reactions. The shear modulus at 1 Hz was determined at the temperature of maximum foaming-agent decomposition rate, and it differs significantly from one sample to another. The foam density was determined at ambient temperature, and the volume fraction of bubbles was used as the criterion for judging the efficiency of the foaming process. The results reveal the existence of an optimal shear modulus of the order of 2 kPa, corresponding roughly to plasticizer molar masses of the order of 450 ± 50 g mol−1. Heavier plasticizers, especially polymeric ones, are too difficult to deform. Lighter plasticizers such as diethyl phthalate (DEP) deform too easily and presumably facilitate bubble collapse.
Abstract:
Introducing teaching about healthy solutions in buildings and BIM has been a challenge for the University of Alicante. In the past, teaching tied to very tight study plans constrained the methods that could be used. The worldwide crisis, which hit Spain especially hard, and the bursting of the housing bubble caused an employment slump that reached universities, where construction-related degrees, Architecture and Architectural Technology, suffered a huge drop in the number of students enrolled. At the University of Alicante, enrolment in Architectural Technology fell by 80%. The need to react to this situation pushed the teaching staff to innovate and to use the new Bologna-adapted study plans to develop new teaching experiences built around two concepts: people's wellbeing in buildings and BIM. Working with healthy solutions in buildings provided new approaches to building design and construction as an alternative to sustainability. For many years, sustainability was the concept that, applied to housing, gave buildings added value and a chance of viability in a very complex scenario. But after many experiences, the accepted methodologies for obtaining sustainable housing proved ambiguous, and in the end investors, designers, constructors, and purchasers could not find real, validated criteria for obtaining an effectively sustainable house. It was time to work with new ideas and concepts and to start approaching buildings from the users' point of view. At the same time, the development of new tools, BIM, has opened a wide range of innovative and suggestive opportunities that allow the simulation and evaluation of many building factors.
This paper describes the teaching research carried out by the University of Alicante to adapt the current study plans by introducing work with healthy solutions in buildings and the use of BIM, with the aim of attracting students by improving their future employability. Pilot experiences have been carried out in different subjects, based on project work and case studies, within an international framework and in cooperation with several European partner universities. The use of BIM tools, introduced in 2014, solved problems that had appeared in some subjects, mainly building construction, and helped with the evaluation of healthy-building concepts that had until then been difficult to assess, since the knowledge acquired by the students was hard to evaluate. The introduction of BIM tools such as Vasari, FormIt, Revit, and Light Control, among others, allowed the study of precise healthy-building concepts and gave the students a real understanding of how these different parameters can condition a healthy architectural space. The analysis of the results showed clear acceptance by the students and gave teachers the possibility of opening new research lines. At the same time, working with BIM tools to obtain healthy solutions in buildings has proved a good way to improve students' employability, as the building market in Spain demands a growing number of BIM specialists with broader knowledge.