23 results for strong consistency


Relevance: 70.00%

Abstract:

In just a few years cloud computing has become a very popular paradigm and a business success story, with storage being one of its key features. To achieve high data availability, cloud storage services rely on replication. In this context, one major challenge is data consistency. In contrast to traditional approaches, which are mostly based on strong consistency, many cloud storage services opt for weaker consistency models in order to achieve better availability and performance. This comes at the cost of a high probability of stale data being read, as the replicas involved in a read may not always hold the most recent write. In this paper, we propose a novel approach, named Harmony, which adaptively tunes the consistency level at run time according to application requirements. The key idea behind Harmony is an intelligent estimation model of stale reads, which allows the number of replicas involved in read operations to be elastically scaled up or down so as to maintain a low (possibly zero) tolerable fraction of stale reads. As a result, Harmony can meet the desired consistency of the applications while achieving good performance. We have implemented Harmony and performed extensive evaluations with the Cassandra cloud storage system on the Grid'5000 testbed and on Amazon EC2. The results show that Harmony can achieve good performance without exceeding the tolerated number of stale reads. For instance, in contrast to the static eventual consistency used in Cassandra, Harmony reduces stale reads by almost 80% while adding only minimal latency. Meanwhile, it improves the throughput of the system by 45% compared to the strong consistency model in Cassandra, while maintaining the desired consistency requirements of the applications.
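For illustration only, the sketch below captures the general idea of such adaptive tuning under a deliberately simplified staleness model (Poisson write arrivals and a fixed replication lag); Harmony's actual estimator, names, and parameters are not reproduced here.

```python
import math

def stale_prob_single(write_rate_hz: float, repl_lag_s: float) -> float:
    """Probability that one replica has not yet applied the latest write.

    Simplified model (not Harmony's estimator): writes arrive as a Poisson
    process; a replica is stale if a write occurred within the replication
    lag window.
    """
    return 1.0 - math.exp(-write_rate_hz * repl_lag_s)

def choose_read_replicas(n_replicas: int, write_rate_hz: float,
                         repl_lag_s: float, tolerated_stale: float) -> int:
    """Smallest number of replicas to involve in a read so that the
    estimated stale-read probability stays within the tolerated fraction."""
    p = stale_prob_single(write_rate_hz, repl_lag_s)
    for r in range(1, n_replicas + 1):
        # A read is stale only if every contacted replica misses the write.
        if p ** r <= tolerated_stale:
            return r
    return n_replicas  # fall back to strong-consistency-like reads

# Illustrative numbers: 5 replicas, 50 writes/s, 2 ms lag, 1% tolerance.
print(choose_read_replicas(5, 50.0, 0.002, 0.01))  # -> 2
```

With these numbers the function settles on two replicas per read; as the write rate or the tolerated stale fraction changes, the read quorum grows or shrinks accordingly, which is the elastic behaviour the abstract describes.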

Relevance: 20.00%

Abstract:

The theoretical formulation of the smoothed particle hydrodynamics (SPH) method deserves great care, because of inconsistencies that arise when considering free-surface inviscid flows. In practice, SPH formulations usually assume that (i) surface integral terms on the boundary of the interpolation kernel support can be neglected, and (ii) free-surface conditions are implicitly verified. These assumptions are studied in detail in the present work for free-surface Newtonian viscous flow. The consistency of classical weakly compressible viscous SPH formulations is investigated. In particular, the principle of virtual work is used to study the verification of the free-surface boundary conditions in a weak sense. The latter can be related to the global energy dissipation induced by the viscous term formulations and to their consistency. Numerical verification of this theoretical analysis is provided on three free-surface test cases, including a standing wave, for the three viscous term formulations investigated.
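For context, assumption (i) can be made concrete with the textbook SPH identities below (standard results, not formulas taken from this paper): integrating the smoothed gradient by parts exposes a boundary term that vanishes only when the kernel support lies entirely inside the fluid domain, which fails near a free surface.

```latex
% Kernel interpolation of a field f over the fluid domain \Omega:
\langle f \rangle(\mathbf{r}) = \int_{\Omega} f(\mathbf{r}')\,
    W(\mathbf{r}-\mathbf{r}',h)\,\mathrm{d}V'
% Integration by parts of the smoothed gradient exposes the surface term
% neglected under assumption (i):
\langle \nabla f \rangle(\mathbf{r})
  = \oint_{\partial\Omega} f(\mathbf{r}')\, W(\mathbf{r}-\mathbf{r}',h)\,
      \mathbf{n}'\,\mathrm{d}S'
  + \int_{\Omega} f(\mathbf{r}')\, \nabla W(\mathbf{r}-\mathbf{r}',h)\,
      \mathrm{d}V'
```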

Relevance: 20.00%

Abstract:

Madrid's Metropolitan Region (MMR) underwent a major urban transformation in the period 1985-2007: the population grew, the built environment grew strongly, but above all its cost and consumption grew, which means the region became more unsustainable. To try to understand this asymmetric evolution, successive models are tested that attempt to explain the transformation of reality through the articulation of forms of power and their associated policies in the local-metropolitan context. Comparing the urban transformation of 1985-2007 with that recorded during the developmentalism preceding the present democratic period, the two show similarities, such as extensive land consumption, but the developmentalist model belongs to a different logic and context: it was congruent, since the last decades of the Franco regime were characterized by a very significant population increase corresponding to the strong industrial growth of the MMR. This relative congruence is lost in the period under study, even though the 1985 Master Plan of Madrid (Plan General de Ordenación Urbana) focused on the existing city and on contained growth; it can be considered an aborted model. After numerous political, economic, social and urban transformations, the situation reached is the opposite of that foreseen in the Plan. More than twenty years later, in 2007, the model finally adopted shows not only symptoms of exhaustion but its dramatic collapse, in both its real-estate/financial dimension and the space of welfare. The urban transformation under analysis relaunched the hegemony of the sectors that drive the growth of the MMR, supported by the decision-making and financing of the different administrations, with the passivity of social stakeholders and citizens.
It is precisely the removal of regulatory mechanisms that has characterized the evolution of urban models, in correspondence with the deregulation of economic activities and capital flows typical of the "neoliberal" model. The current international financial crisis, especially in some peripheral European countries such as Spain, has shown how economic policies carried out outside any regulation have proven unsustainable. But this is not only an economic crisis. In the Spanish case, of all the dimensions of the crisis, the urban dimension stands out: the rise and fall of the real-estate cycle, due to the intensive urbanization of the territory in relation to the secondary circuit of capital accumulation, with a particular impact on territories such as the MMR. The MMR is now in a situation of urban crisis, caused mainly by the divorce between needs and the production of city space: growth has not been based on the creation of new households or other demographic factors, but on capital accumulation through the growth of the city. Furthermore, this growth has taken the form of uncontrolled urban sprawl, with higher energy requirements than the traditional compact and complex model, which, together with the scale of the processes, yields a progressively inefficient urban system.
The case of the MMR is paradigmatic: the region has played a role as a laboratory for new forms of government and planning that have given greater prominence to space, which has entered the core dynamics mainly through support for physical growth, while specific circumstances have converged, such as a new impulse towards centralism, reinforcing policies such as treating the city as an engine of economic growth and competitiveness in the European and world hierarchy of cities. The study of the role of planning and its crises in the succession of models shows its nuclear function in their very constitution (it is a fundamental part of their regulatory apparatus) and its value not only for understanding the period but for projecting another urban future. This approach leads to relating planning to the different economic crises of the study period, distinguishing three moments in that relationship: austere urban planning under the influence of the Fordist crisis; the exit from the crisis through the imposition of an urban model based on the overproduction of urban space; and the entry into a real-estate and financialization crisis linked to the adoption of a multidimensionally unsustainable model. The analysis of this period is the basis for outlining prospects for transforming urban governance towards a more desirable urban model or, better still, other possible futures, framed within the main alternative of sustainability.

Relevance: 20.00%

Abstract:

Electric propulsion is today a very competitive technology with great future prospects. Among the various existing plasma thrusters, the Hall effect thruster has acquired considerable maturity and constitutes an ideal means of propulsion for a wide range of missions. In this Thesis, only Hall thrusters with conventional geometry and dielectric walls are studied. The complex interaction between multiple physical phenomena makes plasma simulation in these engines difficult. Hybrid models represent the best compromise between precision and computational cost: they use a fluid model for the electrons and Particle-In-Cell (PIC) algorithms for the ions and neutrals. The hypothesis of plasma quasineutrality is invoked, which requires solving separately the sheaths formed around the chamber walls. Starting from an existing hybrid code, called HPHall-2, the aim of this doctoral Thesis has been to develop an advanced hybrid code that better simulates the plasma discharge in a Hall effect thruster. The updates and improvements to the code cover both theoretical and numerical issues. The extensive revision of the algorithms of HPHall-2 has reduced the accuracy errors by one order of magnitude and notably increased the consistency and robustness of the code, allowing the thruster to be simulated over a wide range of conditions.
The most relevant achievements in the particle subcode are: the implementation of a new weighting algorithm that determines the plasma flux magnitudes more accurately; the implementation of a new population-control algorithm that ensures a sufficient number of particles near the chamber walls, where the gradients are strongest and the computation conditions most critical; improvements in the mass and energy balances; and a new algorithm to compute the electric field on a non-uniform mesh. The fulfilment of the Bohm condition at the sheath edge deserves special attention: in hybrid codes it is a boundary condition needed to match the solution consistently with the plasma-wall interaction model, and it had not been satisfactorily resolved in HPHall-2. In this Thesis, the kinetic Bohm criterion has been implemented for an ion population with different electric charges and a large velocity dispersion. In the code, fulfilment of the kinetic Bohm condition is accomplished by an algorithm that introduces a thin collisionless acceleration layer adjacent to the sheath and measures the particle flux properly in space and time. The improvements in the electron subcode increase the simulation capabilities of the code, especially in the region downstream of the thruster, where the neutralization of the plasma jet is simulated by means of a volumetric cathode model. Without addressing a detailed study of plasma turbulence, simple models for a parametric adjustment of the anomalous Bohm diffusion are implemented; they make it possible to reproduce the experimental values of the plasma potential and the electron temperature, as well as the discharge current of the thruster.
Regarding the theoretical issues, special emphasis is placed on the plasma-wall interaction and on the dynamics of free secondary electrons within the plasma, questions that remain open problems in the simulation of Hall thrusters. The newly developed models seek results closer to reality, such as the partial thermalization sheath model, which assumes a non-Maxwellian distribution function for the primary electrons and computes the energy losses at the walls more realistically. The confinement of secondary electrons within the chamber is assessed through a simplified kinetic study, and a collisionless fluid model is used to determine the densities and energies of the free secondary electrons, as well as their possible effect on ionization. Simulations show that secondary electrons are quickly lost at the walls, with a negligible effect on the bulk of the plasma, although they determine the potential fall in the sheaths. Finally, the numerical simulation and theoretical work is complemented by experimental work carried out at the Princeton Plasma Physics Laboratory, devoted to analyzing the interesting transient regime the thruster experiences during startup. It is concluded that residual gases adhered to the thruster walls play a relevant role in this transient and, as a general recommendation, a complete purge of the thruster before its normal mode of operation is suggested. The final result of the research shows that the developed hybrid code is a good tool for the simulation of Hall thrusters: it reproduces the physics of the thruster properly, provides results similar to the experimental ones, and proves to be a good numerical laboratory to study the plasma inside the thruster.
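As background, the kinetic Bohm condition mentioned above admits a compact marginal form; the version below is a standard textbook statement (Harrison-Thompson type) extended to several charge states, given here as a sketch for orientation and not necessarily the exact formulation implemented in the code.

```latex
% Kinetic Bohm criterion at the sheath edge, for Boltzmann electrons at
% temperature T_e; f_s is the distribution of the wall-normal velocity of
% ion species s with charge Z_s e and mass m_s:
\sum_s \frac{Z_s^{2}}{m_s}\int \frac{f_s(v)}{v^{2}}\,\mathrm{d}v
  \;\le\; \frac{n_e}{k_B T_e},
\qquad n_s=\int f_s(v)\,\mathrm{d}v .
% For a single cold, singly charged species drifting at speed u this
% reduces to the classical condition u >= sqrt(k_B T_e / m_i).
```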

Relevance: 20.00%

Abstract:

The application of sustainability criteria must be understood as the essential procedure for the necessary restructuring of the construction sector, which mobilizes 10% of the world economy and accounts for more than one third of the consumption of the world's resources, around 30-40% of energy consumption and greenhouse-gas emissions, 30-40% of waste generation, and 12% of all fresh-water use on the planet. This research is part of an overall strategy to promote the sustainability assessment of buildings in the Spanish context, taking a first step focused on environmental performance assessment. The thread of the research sets out from the need to establish a theoretical framework of sustainability that clarifies concepts and defines appropriate assessment criteria. As a next step, the research reviews the international panorama of regulations and voluntary instruments, with the aim of clarifying the fuzzy picture that currently characterizes sustainability in the building sector and of framing the research in the context of existing policies and programmes. The main objective is to propose a methodology for assessing the environmental impacts associated with the life cycle of a building, applicable to the Spanish context, as one of the three dimensions that constitute the pillars of sustainability. The assessment of social and economic aspects, for which a sufficiently consistent degree of methodological definition does not yet exist, is additionally examined in order to provide a holistic view of the assessment. Before developing the proposal, the basic features and limitations of the Life Cycle Assessment (LCA) methodology are described, and the state of the art of LCA applied to buildings is then examined in depth through a critical review of the research work of recent years. This review allows conclusions to be drawn about its degree of consistency with the future regulatory environment and identifies two priority needs for action: the need for harmonization, given the strong methodological inconsistencies detected, which prevent comparison of the results obtained in assessment works; and the need for simplification, given the inherent complexity of the assessment, so that, while maintaining the utmost rigor, its practical application in the Spanish context becomes feasible. Participation in standardization work at the European level has provided a critical view of the methodological implications of the standards under definition, identifying the roadmap that will shape the European scene in the coming years.
The proposed methodology integrates the general principles of LCA with the methodological protocol established in the European standard, additionally considering the regulatory references of construction practice in the Spanish context. In formulating the proposal, the applicable simplifications have been analyzed in order to make its implementation feasible, focusing efforts on systematizing the concept of the functional equivalent, establishing recommendations on the type of data according to their availability, and critically reviewing the calculation models for environmental impacts. The methodological implications of the proposal are described through a series of case studies that illustrate its feasibility and the basic characteristics of its application. Finally, the aspects identified as priorities in shaping the scenario of future prospects, research lines and lines of action are outlined.
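As a reference point for what such calculation models compute, the characterization step of LCA is conventionally written as follows (the generic ISO 14040-family formula, not a formula specific to this proposal):

```latex
% Category indicator I_c for impact category c (e.g. global warming),
% obtained from inventory flows m_i and characterization factors CF_{c,i}:
I_c = \sum_i CF_{c,i}\, m_i
% e.g. I_{GWP} in kg CO2-eq, with CF_{GWP,CH4} \approx 28 kg CO2-eq per kg
% of CH4 (GWP100, illustrative value).
```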

Relevance: 20.00%

Abstract:

Pru p 3 has been suggested to be the primary sensitizing allergen in patients with peanut allergy in the Mediterranean area. We aimed to confirm this hypothesis by studying 79 subjects.

Relevance: 20.00%

Abstract:

The changes perceived towards the end of the twentieth century and at the beginning of the new millennium have shown us that the cultural crisis in which we participate also reflects a crisis of universal models. The difference between our contemporary situation and the typical situations of modern orthodoxy and post-modern fragmentation seems to indicate that it is no longer possible to formulate an aesthetic system and attribute to it a universal, timeless validity beyond its strictly punctual effectiveness; such validity is further questioned by the continuous transformations taking place in time and in the sensibility of the subject itself each time it takes over a place. The organised, delimited, invariable and specific reference offered by any site, as a pre-existence, reflected a hierarchy of the formal system based on the extensive: measure, standards, movement, time, modulation, codes and rules. Authors such as Marshall McLuhan, Paul Virilio and Marc Augé anticipated a reality in which the conventional system no longer seemed to answer the new architectural demands, in which information, speed, disappearance and the virtual had blurred the traditional limits of place; pre-existence no longer possessed a specific delimitation and aspired, on the contrary, to a global scale.
Currently, some aspects that remained latent in the constructed emerge under intensive connotations, transgressing simple visual and expressive manifestation to focus on the behaviour of matter and energy as determinants of a process of adaptation to the surroundings. Throughout the entire twentieth century, the relationship of the project to the constructed was addressed almost exclusively through actions of preservation or intervention. Both perspectives represented efforts to articulate a line of thought that would give theoretical consistency as a support for the production of the additive action. Nevertheless, in the last decades of the century, architectural theory ended up including thought from other fields that seem to contaminate the biased vision through which the constructed was presented to us. Ecology, planning, philosophy, global economy and other fields suggest new approaches to the construction of the contemporary city, this time with a decided idea of change and continuous transformation, which enriches the panorama of architectural thought and practice while, according to some, putting disciplinary specificity at risk, given that there is no architecture without destruction: the constructed organism requires mutation in order to adjust to changes of form. All of this prior conceptual framework gathered valuable attempts to give content to a theory that could be understood from a single argumentative position. Thus, in 1979 Ignasi Solá-Morales integrated all the imprecisions referring to an action upon existing architecture under the term "intervention", which he argued in two senses. The first refers to any type of action that can be carried out on a building (defence, preservation, conservation, reuse, and so on), a field in which the sense of intensity remains latent as a common factor of understanding of a single action. The second, more restricted, erects the idea of intervention as the critical act towards the previous ideas (restoration, conservation, reuse, etc.). Both ultimately represent forms of interpretation of a new discourse: "An intervention is, in effect, an attempt to make the building say something again, or to say it in a certain direction."
In mid-1985, motivated by the current of historiographical revision and the concern for the deterioration of historic centres running through Europe, Solá-Morales set out to reflect on "the relationship" between an intervention of new architecture and the previously existing architecture, a relationship conditioned strictly by linguistic considerations and, in his view, in tune with the architectural production of the entire twentieth century. From Contrast to Analogy would summarise the transformations in the discursive conception of architectural intervention, as a phenomenon changing with cultural values while showing a clear dialogical tendency between two formal categories: Contrast, emphasising the possibilities of novelty and difference; and the emerging Analogy, a new sensibility of interpretation of the old building in which similarity and diversity are manifested simultaneously. For Solá-Morales the analogical procedure is not based on the visible simultaneity of formal orders, but on the associations the subject establishes over time. Through analogy, the aim is to overcome the simple visual relationship with the old and to focus on its spatial, physical and geographical nature. If the analogical attempt opens the way to a new continuity, it still persists in connecting dimensional, typological and figurative factors subordinated to the formal hierarchy of the pre-existing.
The reflective contribution of Solá-Morales' writings might have been definitive had certain changes not been perceived, in the last decades before the end of the century, in the continuity of the linguistic expression fostered by architecture, towards a kind of figurative hypertrophy. Among many arguments, three moments are of interest here: the dissolution of compositional consistency and unitary style; the volumetric incorporation of the project as a reactive device; and the change of vision from the retrospective to the prospective suggested by the new conservation. The recourse to the history of architecture and its recognisable forms, as a way of perpetuating memory and establishing a reference, dissolved any instinct of compositional unity and style, so that permanent relationships tended to disappear. Composition and coherence gave way to a kind of discontinuity of isolated objects in which only possible relationships could appear, no longer as an order of fixed formal and compositional rules, but as a particular way of setting elements in a specific work. The new globalised field demanded new forms of consistency between the project and the pre-existing, motivated among other things by the faster pace of market evolution, the rise of consumption and the level of information and competition between different locations, aspects that finally rendered stylistic consistency inefficient.
In this context of disintegration, the project, as an incorporation or addition to a constructed building, ceases to be considered a volumetric appendix subordinated to the compositional and formal rules of the old, and comes to be considered an organism of a reactive order that produces in the existing support an alteration of its structural and systemic configuration. Extension, previously spatial, is now understood as a sensorial and morphological extension, through the implementation of technology and hyper-information, marked at the same time by a strong tendency towards energy optimization in its operational role, in view of the emergence of the ecological factor in contemporary production. The technological world becomes a new nature, a nature that must be analysed in ecological terms, that is, as an event of transition in the continuous redistribution of energy. In this domain, effectiveness is determined not only by the capacity to adapt to changing conditions, but also by the capacity to transform "expressly" in order to change an environment.
In a society such as ours, which is modernising intensively, it is difficult to share an adequate attunement with the forms of the past. Since 1790, the date of the first French convention for the conservation of monuments, the scale of what is to be preserved has grown ever more ambitious, to the point that today the repertoire of what is conserved includes practically every typology of the built environment. For Koolhaas, the interval between an object and the moment at which its conservation is decided has shrunk from two millennia in 1882 to a few decades today. Soon this lapse will disappear, demonstrating a radical change from the retrospective to the prospective; before long we will have to decide what to conserve before building. The shapes of cities are the result of the continuous incorporation of architecture, and perhaps only through architecture can the response to the universe, the continuity of what has already been constructed, be understood. Our work is also understood within that system: modifying the field of action and leaving the road ready for the next move of those who will follow us. Continuity does not mean conservatism; continuity means being conscious of the transitory value of our answers to specific needs and accepting the change we have received. What has been constructed to remain and last should cause future interventions to integrate with it. It is necessary to accept continuity as a rule.
Solá-Morales, in his day, distinguished the relationship between the new and the old, between contrast and analogy. Today, almost three decades later, the objective is to evaluate whether that model of architectural intervention on the constructed has held since then, or whether new ways of positioning the project with respect to the constructed have appeared. Our work aims to demonstrate the change in the projectual approach to pre-existence, and that this change is closely related to the incorporation of new concepts, techniques, tools and needs stamped by the cultural context produced by the turn of the century. This assumption leads us to establish an architectural parallel between the modes of relationship in which the new manifests itself: a commonly assumed position (Topical), generic and orthodox, grounded in the visual and expressive character of the last decades of the twentieth century; and an emerging reality (Heterotopical), extraordinary and heterodox, which stimulates the immaterial and seems to emerge with growing intensity in the twenty-first century.
If throughout the twentieth century the project of architectural intervention was debated between the continuity and discontinuity of the formal categories marked by the expression of the pre-existing building, the new contemporary intervention, as a reactive device in the landscape and the territory, demands an absolute continuity that is no longer visual, expressive or functional, but a physiological continuity of adaptation and change with the territory's own dynamics, under new rules of the game, deploying operative (projective) plans and strategies from its own logic and contingency. The aim of this research is to determine the new modes of continuity and the possible logics of production manifested within Architectural Intervention, attempting to go beyond the apparentness of its physical and visual relationship, as a result of the incorporation of the operative factor deployed by the new contemporary device. We believe it is right to maintain the connotative path marked by the term "architectural intervention", since it binds together prior concepts and theoretical approaches that have evolved over time. Although the term has lacked operative reach since its formulation, a quality that our contemporary logics infer, the reformulation and consolidation of a concept of intervention better suited to our times could follow, giving precedence to a logical procedure arising from its own necessity and contingency. It seems that time now shapes the topics: it is no longer a matter of materialising a certain time, but of expressing the changes its new temporality generates. Finally, our initial approach aspires to constitute a new form of reflection that allows us to understand the complex implications of the new architecture upon pre-existence, motivated by the incorporation of factors external to the simply formal and expressive judgement that prevailed at the end of the twentieth century. Likewise, the path we propose, as an alternative, makes it possible to project avenues of exploration, by considering the pre-existing as a field embracing the whole territory, with emerging dynamics of change and, with them, their logics of intervention.

Relevance: 20.00%

Abstract:

It is clear that in the near future much broader transmissions in the HF band will replace part of the current narrowband links. Our personal view is that a truly wideband signal is infeasible in this environment, because usage is typically very intensive and may suffer interference from all over the world. Therefore, we envision that dynamic multiband transmissions may provide more satisfactory performance. From the very beginning, we observed that real links with our broadband transceiver suffered interference outside our multiband signal but within the acquisition bandwidth, degrading the expected performance. We therefore concluded that a mitigation structure is required that operates on severely saturated signals, as the interference may be of much higher power. In this paper we present a procedure based on Higher Order Crossings (HOC) statistics, which extract most of the signal structure even when the amplitude is severely distorted, and which allow the interference carrier frequency to be estimated in order to command a variable notch filter that mitigates its effect in the analog domain.
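As a minimal sketch of why crossing statistics survive saturation (this is the simplest first-order crossing count, not the paper's full HOC estimator, and all numeric values are illustrative), the dominant interferer's frequency can be recovered from the sign sequence of a hard-clipped signal and used to tune a notch:

```python
import numpy as np
from scipy.signal import iirnotch, lfilter

# Zero crossings survive hard amplitude clipping, so the dominant
# narrowband interferer's frequency remains observable after saturation.
fs = 48_000.0                                     # sample rate (Hz), assumed
t = np.arange(4096) / fs
desired = 0.1 * np.random.randn(t.size)           # weak wideband signal
interferer = 5.0 * np.sin(2 * np.pi * 9_300 * t)  # strong narrowband tone
x = np.clip(desired + interferer, -1.0, 1.0)      # severe front-end saturation

# First-order crossing count: for a dominant sinusoid at f0, the
# zero-crossing rate approaches 2*f0/fs crossings per sample.
crossings = np.count_nonzero(np.diff(np.sign(x)))
f0_hat = crossings * fs / (2 * x.size)
print(f"estimated interferer frequency: {f0_hat:.0f} Hz")  # ~9300 Hz

# Command a notch at the estimate (digital here; the paper drives an
# analog notch filter instead).
b, a = iirnotch(f0_hat, Q=30.0, fs=fs)
y = lfilter(b, a, x)  # signal with the interferer attenuated
```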

Relevance: 20.00%

Abstract:

We envision that dynamic multiband transmissions taking advantage of receiver diversity (even for collocated antennas with different polarization or radiation patterns) will create a new paradigm for these links, guaranteeing high quality and reliability. However, there are many challenges to face regarding broadband reception, where several strong interferers outside the multiband transmission but still within the acquisition band may dramatically limit the expected performance. In this paper we address this problem by introducing a specific capability of the communication system that mitigates these interferences using analog beamforming principles. Indeed, joint Higher Order Crossings (HOC) statistics of the Single Input Multiple Output (SIMO) system are shown to effectively determine the angle of arrival of the wavefront, even when operating on highly distorted signals.
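The two-antenna sketch below illustrates the underlying principle only (an assumed geometry and a plain sign-sequence cross-correlation, not the paper's joint-HOC method): the wavefront delay survives hard limiting, so the angle of arrival can still be estimated from saturated channels.

```python
import numpy as np

# Estimate the angle of arrival from two hard-limited antenna outputs by
# locating the peak of the cross-correlation of their sign sequences.
c = 3e8                        # propagation speed (m/s)
fs = 200e6                     # acquisition sample rate (Hz), assumed
d = 60.0                       # antenna separation (m), assumed
f0 = 5e6                       # narrowband wavefront frequency (Hz), assumed
theta_true = np.deg2rad(20.0)

t = np.arange(16384) / fs
delay = d * np.sin(theta_true) / c                 # inter-antenna delay (s)
x1 = np.sign(np.sin(2*np.pi*f0*t) + 0.1*np.random.randn(t.size))
x2 = np.sign(np.sin(2*np.pi*f0*(t - delay)) + 0.1*np.random.randn(t.size))

# Search lags within half a carrier period to avoid phase ambiguity.
max_lag = int(fs / (2 * f0)) - 1
lags = np.arange(-max_lag, max_lag + 1)
score = [np.mean(x1[max_lag:-max_lag] *
                 x2[max_lag + k: x2.size - max_lag + k]) for k in lags]

k_hat = lags[int(np.argmax(score))]                # delay in samples
theta_hat = np.degrees(np.arcsin(k_hat * c / (fs * d)))
print(f"estimated angle of arrival: {theta_hat:.1f} deg (true 20.0)")
```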

Relevance: 20.00%

Abstract:

An asymptotic analysis of the Langmuir-probe problem in a quiescent, fully ionized plasma in a strong magnetic field is performed for electron cyclotron radius and Debye length much smaller than the probe radius, with the probe radius not larger than either the ion cyclotron radius or the mean free path. It is found that the electric potential, which is not confined to a sheath, controls the diffusion far from the probe; inside the magnetic tube bounded by the probe cross section, the potential overshoots to a large value before decaying to its value in the body of the plasma. The electron current is independent of the shape of the body along the field and increases with ion temperature; due to the overshoot in the potential, (1) the current at negative voltages does not vary exponentially, (2) its magnitude is strongly reduced by the field, and (3) the usual sharp knee at space potential disappears. In the regions of the C-V diagram studied, the ion current is negligible or unaffected by the field. Some numerical results are presented. The theory, which fails beyond a certain positive voltage, yields useful results for weak fields, too.
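In compact form, the asymptotic ordering assumed above can be written as follows (notation introduced here for illustration: $r_{ce}$ and $r_{ci}$ the electron and ion cyclotron radii, $\lambda_D$ the Debye length, $R_p$ the probe radius, $\lambda$ the mean free path):

```latex
r_{ce} \ll R_p, \qquad \lambda_D \ll R_p,
\qquad R_p \lesssim \min\left(r_{ci},\ \lambda\right)
```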

Relevance: 20.00%

Abstract:

We introduce adequate concepts of expansion of a digraph to obtain a sequential construction of minimal strong digraphs. We obtain a characterization of the class of minimal strong digraphs whose expansion preserves the property of minimality. We prove that every minimal strong digraph of order n ≥ 2 is the expansion of a minimal strong digraph of order n-1, and we give sequentially generative procedures for the constructive characterization of the classes of minimal strong digraphs. Finally, we describe algorithms to compute unlabeled minimal strong digraphs and their isospectral classes.
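To fix the underlying notion, the sketch below checks the defining property directly (a brute-force checker written for this summary, not one of the paper's generative algorithms): a digraph is minimal strong when it is strongly connected and deleting any single arc destroys strong connectivity.

```python
def is_strong(n: int, arcs: set[tuple[int, int]]) -> bool:
    """Strong connectivity: every vertex reachable from vertex 0 in both
    the digraph and its reverse (vertices are 0..n-1)."""
    def reaches_all(adj):
        seen, stack = {0}, [0]
        while stack:
            u = stack.pop()
            for v in adj.get(u, ()):
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        return len(seen) == n

    fwd, bwd = {}, {}
    for u, v in arcs:
        fwd.setdefault(u, []).append(v)
        bwd.setdefault(v, []).append(u)
    return reaches_all(fwd) and reaches_all(bwd)

def is_minimal_strong(n: int, arcs: set[tuple[int, int]]) -> bool:
    """Minimal strong digraph: strong, and deleting any arc breaks it."""
    return is_strong(n, arcs) and all(
        not is_strong(n, arcs - {a}) for a in arcs)

# The directed 4-cycle is minimal strong; adding a chord breaks minimality.
cycle = {(0, 1), (1, 2), (2, 3), (3, 0)}
print(is_minimal_strong(4, cycle))             # True
print(is_minimal_strong(4, cycle | {(0, 2)}))  # False
```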

Relevance: 20.00%

Abstract:

A theory is developed of an electrostatic probe in a fully ionized plasma in the presence of a strong magnetic field. The ratio of electron Larmor radius to probe transverse dimension is assumed to be small. Poisson's equation, together with kinetic equations for ions and electrons, is considered. An asymptotic perturbation method of multiple scales is used, based on the characteristic lengths appearing in the problem, and the leading behavior of the solution is found. The results obtained appear to apply to weaker fields also, agreeing with the solutions known in the limit of no magnetic field. The range of potentials for which results are presented is limited. The basic effects produced by the field are a depletion of the plasma near the probe and a non-monotonic potential surrounding the probe. The ion saturation current is not changed, but changes appear in both the floating potential Vf and the slope of the current-voltage diagram at Vf. The transition region extends beyond the space potential Vs, at which point the current is largely reduced. The diagram does not have an exponential form in this region, as commonly assumed. There exists saturation in electron collection. The extent to which the plasma is disturbed is determined. A cylindrical probe has no solution because of a logarithmic singularity at infinity. Extensions of the theory are considered.

Relevância:

20.00% 20.00%

Publicador:

Resumo:

A kinetic approach is used to develop a theory of electrostatic probes in a fully ionized plasma in the presence of a magnetic field. A consistent asymptotic expansion is obtained assuming that the electron Larmor radius is small compared to the radius of the probe. The order of magnitude of the neglected terms is given. It is found that the electric potential within the tube of force defined by the cross section of the probe decays non-monotonically away from the probe; this bump disappears at a certain probe voltage, and the theory is valid up to this voltage. The transition region, which extends beyond the plasma potential, is not exponential. The possible saturation of the electron current is discussed. Restricted numerical results are given; they appear to be useful for weaker magnetic fields down to the zero-field limit. Extensions of the theory are considered.
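Schematically (the notation is ours and the precise scalings are in the paper), the expansion proceeds in the small parameter given by the ratio of electron Larmor radius to probe radius,

    \epsilon = \frac{r_{Le}}{R_p} \ll 1, \qquad
    \phi = \phi_0 + \epsilon\,\phi_1 + O(\epsilon^2),

with the successive terms determined on the separate spatial scales of the multiple-scales procedure, which is what makes the order of magnitude of the neglected terms explicit.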

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Total hip arthroplasty is considered one of the greatest surgical advances in medicine. The application of this technique in traumatology has increased significantly in recent years, mainly due to the progressive increase in life expectancy. Indeed, advanced age increases the incidence of osteoarthritis and osteoporosis, typical diseases of the joints and bones that in many cases require total or partial prosthetic replacement of the joint. The functional behavior of a prosthesis depends largely on its primary stability, that is, on the correct anchoring of the prosthesis at the time of implantation. Uncemented prostheses base their long-term success on the osseointegration between the prosthetic material and the bone tissue, and good primary stability is essential to achieve it. Aseptic loosening is the main cause of failure of total hip arthroplasty: through complex interactions of mechanical and biological factors, relative micromovements occur that compromise the functionality of the implant. Minimizing the resulting damage depends largely on the early detection of loosening. For this purpose, various techniques have been tested, both in vivo and in vitro: numerical analyses and experimental techniques based on sensors of the movements caused by naturally or artificially transmitted loads, such as impacts or vibrations at different frequencies. The setups and procedures applied are heterogeneous and, in many cases, complex and expensive, and there is no agreement on a simple and effective technique of general applicability. Likewise, the current regulations governing the conditions a prosthesis must fulfill before commercialization contain no section specifically devoted to evaluating the quality of a femoral stem design with respect to primary stability.
The aim of this thesis is to develop an in vitro methodology for analyzing the stability of an implanted femoral stem, in order to assess implantation techniques and different prosthesis designs before they are offered on the market. A fundamental requirement is that the method be simple, reversible, repeatable and non-destructive, with rigorous control of parameters (boundary conditions of loads and displacements) and with a fast, reliable and affordable system for recording and interpreting results. As a preliminary step, a qualitative analysis of the contact problem at the bone-stem interface was performed using a continuous-field optomechanical technique (photoelasticity). Three 2D models of the bone-stem assembly were built, simulating three types of interface contact: unbonded contact with clearance, unbonded contact without clearance, and homogeneous bonded contact. Applying the same load to each model and using the stress-freezing technique, the corresponding stress states were visualized; as expected, they are most severe in the unbonded model. In any case, the results illustrate the complexity of the contact problem and confirm the need for an experimental approach to its study.
A free-oscillation dynamic test instrumented with resistive strain-gauge sensors was then devised. The test samples were femurs in all possible variants: simplified models, standardized synthetic bone, and dry and fresh cadaveric bone. A clamping system for the distal end of the sample (femur) with rigorous control of the anchoring conditions was designed. The free oscillation of the sample is obtained by the instantaneous release of a predetermined static load applied beforehand, either with a testing machine or by gravity. Each sample was instrumented with conventional strain gauges whose signals were recorded with commercial dynamic acquisition equipment. A signal-processing procedure was applied to window, filter and present the sensor responses in the time and frequency domains. The interpretation of results is comparative: the test is applied to an intact femur sample, taken as a reference, and then repeated on the same sample with an implanted prosthesis; comparing the two results yields immediate conclusions about the effects of implanting the prosthesis. Implantation was performed by an orthopedic surgeon using the same techniques and instruments employed in the operating room during real clinical practice, working with three commercial femoral stems.
From the time- and frequency-domain results of the various applications, conclusions have been established on the following aspects: feasibility of the different types of synthetic samples (simplified models and standardized synthetic femur); repeatability, linearity and reversibility of the test; consistency of the results with the theoretical values deduced from the theory of free oscillations of bars; effects of the implantation of femoral stems on the oscillation amplitude, damping and frequencies; and detection of harmonics associated with micromobility. The methodology has proved suitable for incorporation into prosthesis standards, is universally applicable, and opens avenues for detecting and characterizing the micromobility of a prosthesis under service loads.
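As an illustration of the comparative time/frequency analysis described above, the following Python sketch (assuming an exponentially decaying single-mode response; function names are illustrative, not from the thesis) extracts the natural frequency via the FFT peak and the damping ratio via the logarithmic decrement from a free-decay strain record:

    import numpy as np

    def modal_parameters(signal, fs):
        """Natural frequency (FFT peak) and damping ratio (logarithmic
        decrement over successive positive peaks) of a free-decay record."""
        x = np.asarray(signal, float) - np.mean(signal)
        # Dominant frequency from the windowed amplitude spectrum.
        spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        f_n = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
        # Successive positive peaks (needs at least two oscillation cycles).
        peaks = [i for i in range(1, len(x) - 1)
                 if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > 0]
        delta = np.log(x[peaks[0]] / x[peaks[-1]]) / (len(peaks) - 1)
        zeta = delta / np.sqrt(4 * np.pi**2 + delta**2)
        return f_n, zeta

Running it on the intact-femur record and again on the implanted-femur record, and comparing the resulting frequency/damping pairs, mirrors the comparative protocol described above.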

Relevância:

20.00% 20.00%

Publicador:

Resumo:

The determination of the plasma potential Vpl of unmagnetized plasmas by using the floating potential of emissive Langmuir probes operated in the strong emission regime is investigated. The experiments show that, in most cases, the electron thermionic emission is orders of magnitude larger than the plasma thermal electron current. The temperature-dependent floating potentials of negatively biased (Vp < Vpl) emissive probes are in agreement with the predictions of a simple phenomenological model that considers, in addition to the plasma electrons, an additional electron group that contributes to the probe current. The latter would be constituted by a fraction of the repelled electron thermionic current, which might return to the probe with a different energy spectrum. Its origin would be a potential well formed in the plasma sheath around the probe, acting as a virtual cathode, or collisions and electron thermalization processes. These results suggest that, for probe bias voltages close to the plasma potential (Vp ≈ Vpl), two electron populations coexist, i.e., the electrons from the plasma with temperature Te and a large group of returned thermionic electrons. These results question the theoretical possibility of measuring the electron temperature by using emissive probes biased to potentials Vp ≲ Vpl.
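Schematically, such a phenomenological model amounts to a current balance at the floating potential Vf (the notation here is ours, not the paper's; all terms are positive magnitudes):

    I_e(V_f) + I_{ret}(V_f) = I_i(V_f) + I_{em}(V_f),

where I_e is the collected plasma electron current, I_i the ion current, I_em the escaping thermionic current, and I_ret the contribution of the returned thermionic electrons. The available emission is bounded by the standard Richardson-Dushman law,

    J_{em} = A_R\, T_w^{2} \exp\!\left(-\frac{W}{k_B T_w}\right),

with T_w the emitter temperature and W its work function; in the strong emission regime I_em dominates the plasma thermal electron current, as the experiments indicate.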