25 results for Catastrophic Cognitions

at Universidad Politécnica de Madrid


Relevance: 10.00%

Abstract:

Specific tests are needed to assess the reliability of high-luminosity AlInGaP LEDs for outdoor applications. In this paper, tests have been carried out to develop a model involving three parameters: temperature, humidity, and current. This temperature-humidity-current accelerated model has been proposed to evaluate the reliability of this type of LED. Degradation and catastrophic failure mechanisms have been analyzed. Finally, we analyze the effect of series resistance on the degradation of luminous power.
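
The abstract does not give the functional form of the combined model. A common choice for temperature-humidity acceleration is Peck's model with an Arrhenius temperature term, optionally extended with a power-law current term; the sketch below is a minimal illustration under that assumption. The activation energy and exponents are hypothetical placeholders, not values fitted in the paper.

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_use_c, t_stress_c, rh_use, rh_stress,
                        i_use, i_stress, ea=0.7, n=2.7, m=1.0):
    """Combined temperature-humidity-current acceleration factor.

    Arrhenius term for temperature, Peck-style power law for relative
    humidity, and a power-law term for drive current. ea (eV), n and m
    are hypothetical placeholders; a real test fits its own values.
    """
    t_use = t_use_c + 273.15        # convert to kelvin
    t_stress = t_stress_c + 273.15
    af_temp = math.exp(ea / K_BOLTZMANN_EV * (1 / t_use - 1 / t_stress))
    af_hum = (rh_stress / rh_use) ** n
    af_cur = (i_stress / i_use) ** m
    return af_temp * af_hum * af_cur

# Example: 85 C / 85 % RH / 30 mA stress vs. 40 C / 60 % RH / 20 mA use
print(acceleration_factor(40, 85, 60, 85, 20e-3, 30e-3))
```

One hour at the stress condition then counts as roughly AF hours at the use condition, which is what lets a moderate-length test bound long-term outdoor reliability.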

Relevance: 10.00%

Abstract:

A quantitative temperature accelerated life test on sixty GaInP/GaInAs/Ge triple-junction commercial concentrator solar cells is being carried out. The final objective of this experiment is to evaluate the reliability, warranty period, and failure mechanism of high concentration solar cells within a moderate period of time. The degradation is accelerated by subjecting the solar cells to temperatures markedly higher than the nominal working temperature under a concentrator. Three experiments at three different temperatures are necessary in order to obtain the acceleration factor, which relates the time at the stress level to the time at nominal working conditions. However, up to now only the test at the highest temperature has finished. Therefore, we cannot yet provide complete reliability information, but we have analyzed the life data and the failure mode of the solar cells inside the climatic chamber at the highest temperature. All of the failures have been catastrophic: the solar cells have turned into short circuits. We have fitted the failure distribution to a two-parameter Weibull function. The failures are of the wear-out type. We have observed that the busbar and the surrounding fingers are completely deteriorated.
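
A two-parameter Weibull fit of this kind can be reproduced from a set of failure times. The sketch below uses invented failure times for illustration and fixes the location parameter at zero so that only the shape and scale are fitted; a shape beta > 1 indicates the wear-out behavior the abstract describes.

```python
import numpy as np
from scipy import stats

# Hypothetical failure times (hours) at the highest stress temperature
failure_times = np.array([310., 420., 485., 530., 610., 660., 720., 805.])

# Two-parameter Weibull: fix the location at zero, fit shape and scale
shape, loc, scale = stats.weibull_min.fit(failure_times, floc=0)
print(f"Weibull shape (beta) = {shape:.2f}")    # beta > 1 -> wear-out failures
print(f"Weibull scale (eta)  = {scale:.1f} h")  # characteristic life

# Reliability at t = 500 h under the fitted distribution
print("R(500 h) =", stats.weibull_min.sf(500., shape, loc=0, scale=scale))
```

Fixing the location at zero is what makes this a two-parameter fit; leaving it free would fit the three-parameter variant instead.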

Relevance: 10.00%

Abstract:

Telecommunications networks have always been expanding and, thanks to this, new services have appeared. The old mechanisms for carrying packets have become obsolete due to the requirements of the new services, which have begun working in real time. Real-time traffic requires strict service guarantees. When this traffic is sent through the network, enough resources must be provided in order to avoid delays and information losses. When browsing the Internet and requesting web pages, data must be sent from a server to the user. If any packet is dropped during the transmission, it is sent again. For the end user, it does not matter if the web page takes one or two seconds longer to load. But if the user is holding a conversation with a VoIP program such as Skype, one or two seconds of delay in the conversation may be catastrophic, and neither party can understand the other. In order to support these new services, the networks have to evolve, and for this purpose MPLS and QoS were developed. MPLS is a packet-carrying mechanism used in high-performance telecommunication networks which directs and carries data along pre-established paths. Packets are forwarded on the basis of labels, making this process faster than routing packets by their IP addresses. MPLS also supports Traffic Engineering (TE), which refers to the process of selecting the best paths for data traffic in order to balance the traffic load between the different links. In a network with multiple paths, routing algorithms calculate the shortest one, and most of the time all traffic is directed through it, causing overload and packet drops, while the other paths the network offers carry no traffic at all. But this is not enough to provide real-time traffic with the guarantees it needs. These mechanisms improve the network, but they do not change how the traffic is treated. That is why Quality of Service (QoS) was developed. Quality of Service is the ability to provide different priorities to different applications, users, or data flows, or to guarantee a certain level of performance to a data flow. Traffic is divided into different classes, and each of them is treated differently according to its Service Level Agreement (SLA). Traffic with the highest priority has preference over lower classes, but this does not mean it monopolizes all the resources. In order to achieve this goal, a set of policies is defined to control and shape how the traffic flows. The possibilities are endless, and they depend on how the network must be structured. By using these mechanisms, it is possible to provide the necessary guarantees to real-time traffic, distributing it between categories inside the network and offering the best service for both real-time and non-real-time data.
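
The class-based treatment the abstract describes can be illustrated with a toy strict-priority scheduler: each traffic class gets its own queue, and the dequeue step always serves the highest-priority non-empty queue first. This is only a conceptual sketch of the QoS idea, not MPLS or any real router implementation; the class names and priorities are assumptions for the example.

```python
from collections import deque

# Priority 0 = highest (e.g., VoIP), 2 = lowest (e.g., bulk web traffic)
CLASSES = {"voip": 0, "video": 1, "web": 2}

class StrictPriorityScheduler:
    """Toy strict-priority queueing: always serve the highest class first."""

    def __init__(self):
        self.queues = {prio: deque() for prio in sorted(CLASSES.values())}

    def enqueue(self, packet, traffic_class):
        self.queues[CLASSES[traffic_class]].append(packet)

    def dequeue(self):
        for prio in sorted(self.queues):          # highest priority first
            if self.queues[prio]:
                return self.queues[prio].popleft()
        return None                               # all queues empty

sched = StrictPriorityScheduler()
sched.enqueue("web-1", "web")
sched.enqueue("voip-1", "voip")
print(sched.dequeue())  # -> 'voip-1': VoIP jumps ahead of web traffic
```

Real deployments bound the share of the highest class to avoid starving the lower ones, which is exactly the kind of constraint the SLA-based policies mentioned above encode.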

Relevance: 10.00%

Abstract:

A temperature accelerated life test on commercial concentrator lattice-matched GaInP/GaInAs/Ge triple-junction solar cells has been carried out. The solar cells have been tested at three different temperatures: 119, 126 and 164 °C and the nominal photo-current condition (820 X) has been emulated by injecting current in darkness. All the solar cells have presented catastrophic failures. The failure distributions at the three tested temperatures have been fitted to an Arrhenius-Weibull model. An Arrhenius activation energy of 1.58 eV was determined from the fit. The main reliability functions and parameters (reliability function, instantaneous failure rate, mean time to failure, warranty time) of these solar cells at the nominal working temperature (80 °C) have been obtained. The warranty time obtained for a failure population of 5 % has been 69 years. Thus, a long-term warranty could be offered for these particular solar cells working at 820 X, 8 hours per day at 80 °C.
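
The quantities listed here follow from the standard Arrhenius-Weibull relations. As a reference sketch (not the paper's own derivation), with beta and eta the fitted Weibull shape and scale at working conditions, k Boltzmann's constant, and F the admissible failure fraction:

```latex
\begin{align}
  AF &= \exp\!\left[\frac{E_a}{k}\left(\frac{1}{T_\text{work}} - \frac{1}{T_\text{test}}\right)\right]
     && \text{acceleration factor} \\
  R(t) &= \exp\!\left[-\left(t/\eta\right)^{\beta}\right]
     && \text{reliability function} \\
  \lambda(t) &= \frac{\beta}{\eta}\left(\frac{t}{\eta}\right)^{\beta-1}
     && \text{instantaneous failure rate} \\
  \mathrm{MTTF} &= \eta\,\Gamma\!\left(1 + \frac{1}{\beta}\right)
     && \text{mean time to failure} \\
  t_W &= \eta\left[-\ln(1-F)\right]^{1/\beta}
     && \text{warranty time at failure fraction } F
\end{align}
```

With $E_a = 1.58$ eV and $F = 0.05$, the 69-year figure quoted above follows once the fitted beta and eta at 80 °C are inserted into $t_W$.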

Relevance: 10.00%

Abstract:

Evaluating the reliability, warranty period, and power degradation of high concentration solar cells is crucial to introducing this new technology to the market. The reliability of high concentration GaAs solar cells, as measured in temperature accelerated life tests, is described in this paper. GaAs cells were tested under highly accelerated thermal conditions that emulated operation under 700 or 1050 suns over a period exceeding 10 000 h. Progressive power degradation was observed, although no catastrophic failures occurred. An Arrhenius activation energy of 1.02 eV was determined from these tests. The solar cell reliability [R(t)] under working conditions of 65°C was evaluated for different failure limits (1–10% power loss). From this reliability function, the mean time to failure and the warranty time were evaluated. Solar cell temperature appeared to be the primary determinant of reliability and warranty period, with concentration being the secondary determinant. A 30-year warranty for these 1 mm²-sized GaAs cells (manufactured according to a light-emitting-diode-like approach) may be offered at both concentrations (700 and 1050 suns) if the solar cell is operated at a working temperature of 65°C.
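
Because these cells degraded progressively rather than failing catastrophically, a "failure" here is the time at which a cell's power crosses a chosen loss threshold. A minimal sketch of extracting such failure times from degradation curves might look as follows; the measurement data are invented for illustration.

```python
import numpy as np

def time_to_threshold(times, power_norm, loss_limit):
    """First time the normalized power falls below (1 - loss_limit).

    times: measurement instants (h); power_norm: power normalized to its
    initial value. Uses linear interpolation between samples; returns
    None if the threshold is never crossed.
    """
    threshold = 1.0 - loss_limit
    below = np.where(power_norm < threshold)[0]
    if below.size == 0:
        return None
    i = below[0]
    if i == 0:
        return times[0]
    # Linear interpolation between the last sample above and first below
    t0, t1 = times[i - 1], times[i]
    p0, p1 = power_norm[i - 1], power_norm[i]
    return t0 + (p0 - threshold) * (t1 - t0) / (p0 - p1)

t = np.array([0., 2000., 4000., 6000., 8000., 10000.])   # hours
p = np.array([1.0, 0.99, 0.975, 0.96, 0.945, 0.93])      # invented curve
for limit in (0.01, 0.05, 0.10):
    print(f"{limit:.0%} loss crossed at", time_to_threshold(t, p, limit), "h")
```

The resulting per-cell crossing times can then be fed into the same Arrhenius-Weibull machinery used for catastrophic failures, one distribution per failure limit.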

Relevance: 10.00%

Abstract:

A bare tether with thin-tape cross-section is both i) the most effective electrodynamic tether for given length and mass, and ii) capable of effective design for an arbitrary mission through its three disparate dimensions. It handily beats the fully insulated tether that exchanges current at both ends, a result resting on the advantages of 2D current collection as against 3D collection; it has a much greater perimeter than the round bare tether and a much lower fatal debris-impact rate, leading to greatly faster de-orbiting and a greatly higher probability of survival; and it only allows multi-line tethers reaching a few hundred lines to stand competitive. In selecting the disparate values of length L, width w, and thickness h for a de-orbit mission, performance involves three criteria: a) the tether-to-spacecraft mass ratio must be small; b) the probability of survival against the debris environment must be high; and c) de-orbiting must be fast, to reduce the manoeuvres needed to avoid catastrophic collisions with the big active/passive satellites around. Beyond determining tether mass through the product Lwh, the main dimension parameters affecting performance are L/h^(2/3), characterizing ohmic effects, and w, determining electron collection. An algorithm for optimal selection of tape dimensions is elaborated.
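
The abstract does not detail the selection algorithm. A brute-force version of the idea, picking the lightest (L, w, h) grid point that meets de-orbit-time and survival-probability constraints, could be sketched as below; deorbit_time_days and survival_probability are invented toy scalings standing in for the mission models the paper develops, and all constants are placeholders.

```python
import itertools

DENSITY = 2700.0  # kg/m^3, aluminium tape (assumed material)

def deorbit_time_days(L, w, h):
    """Toy stand-in for the de-orbit model: longer, wider tapes collect
    more current and de-orbit faster. Invented scaling, not the paper's."""
    return 2.0e5 / (L * w * 1.0e3)

def survival_probability(L, w, h, t_days):
    """Toy stand-in for the debris model: longer exposure and a narrower
    tape mean a higher cut probability. Invented scaling."""
    return max(0.0, 1.0 - 1.0e-7 * L * t_days / w)

def best_tape(lengths, widths, thicknesses,
              max_days=30.0, min_survival=0.90):
    """Lightest (L, w, h) grid point meeting de-orbit and survival limits."""
    best = None
    for L, w, h in itertools.product(lengths, widths, thicknesses):
        t = deorbit_time_days(L, w, h)
        if t > max_days or survival_probability(L, w, h, t) < min_survival:
            continue
        mass = DENSITY * L * w * h  # tape mass enters through the product Lwh
        if best is None or mass < best[0]:
            best = (mass, (L, w, h))
    return best

print(best_tape(lengths=[2000., 4000., 6000.],      # m
                widths=[0.01, 0.02, 0.03],          # m
                thicknesses=[20e-6, 50e-6]))        # m
```

In this toy version the thickness h only enters through the mass; in the paper's model it also affects performance through the ohmic parameter L/h^(2/3).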

Relevance: 10.00%

Abstract:

The objective of this paper is the development of a building cost estimation model whose purpose is to quickly and precisely evaluate rebuilding costs for historic heritage buildings affected by catastrophic events. Specifically, this study will be applied to the monumental buildings owned by the Catholic Church that were affected by the two earthquakes of May 11, 2011 in the town of Lorca. To estimate the initial total replacement cost, a new calculation model will be applied which, on the one hand, uses two-dimensional metric exterior parameters and, on the other, three-dimensional interior cubic parameters. Based on the total of the analyzed buildings, and considering the damage caused by the seismic event, the final reconstruction cost for the building units ruined by the earthquakes can be estimated. The proposed calculation model can also be applied to other emergency scenarios and situations for the quick estimation of the construction costs necessary for rebuilding historic heritage buildings affected by catastrophic events that deteriorate or ruin their structural or constructive configuration.
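
As a rough illustration of the two kinds of parameters the model combines, surface-based exterior terms (cost per m²) and volume-based interior terms (cost per m³), one could compute a replacement cost as below. The unit rates and the damage factor are hypothetical, not values from the study.

```python
def replacement_cost(facade_area_m2, interior_volume_m3,
                     rate_per_m2=950.0, rate_per_m3=220.0,
                     damage_factor=1.0):
    """Replacement cost from 2D exterior and 3D interior parameters.

    rate_per_m2 / rate_per_m3 are hypothetical unit costs (EUR);
    damage_factor in [0, 1] scales the cost to the damaged fraction.
    """
    exterior = facade_area_m2 * rate_per_m2      # 2D metric exterior term
    interior = interior_volume_m3 * rate_per_m3  # 3D cubic interior term
    return damage_factor * (exterior + interior)

# A church with 1200 m2 of facades and 9000 m3 of interior volume,
# 40 % of which was ruined by the earthquakes (all numbers invented):
print(f"{replacement_cost(1200, 9000, damage_factor=0.4):,.0f} EUR")
```

Keeping the exterior and interior terms separate is what lets the model be re-rated quickly for other emergency scenarios, since only the unit costs change.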

Relevance: 10.00%

Abstract:

In many engineering fields, the integrity and reliability of structures are extremely important and are controlled through adequate knowledge of existing damage. Typically, achieving the level of knowledge necessary to characterize the structural integrity involves the use of non-destructive testing techniques, which are often expensive and time-consuming. Nowadays, many industries seek to increase the reliability of the structures they employ. By using leading-edge techniques it is possible to monitor these structures and, in some cases, to detect incipient damage that could trigger catastrophic failures. Unfortunately, as the complexity of structures, components, and systems increases, the risk of damage and failures also increases, and at the same time the detection of such failures and defects becomes more difficult. In recent years, the aerospace industry has made great efforts to integrate sensors within structures and to develop algorithms for determining structural integrity in real time. This philosophy has been called Structural Health Monitoring, and such structures have been named "smart structures". These new types of structures integrate materials, sensors, actuators, and algorithms to detect, quantify, and locate damage within themselves. A novel methodology for damage detection in structures is proposed in this work. The methodology is based on strain measurements and consists in developing pattern recognition techniques for the strain field, based on PCA (Principal Component Analysis) and other dimensionality reduction techniques. The use of fiber Bragg gratings and distributed sensing as strain sensors is proposed. The methodology has been validated using laboratory-scale tests and real-scale tests on complex structures. The effects of variable load conditions were studied, and several experiments were performed under static and dynamic load conditions, demonstrating that the methodology is robust under unknown load conditions.
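
A minimal sketch of the PCA-based idea: fit a principal subspace on strain fields recorded from the healthy structure, then flag measurements whose reconstruction error (the Q statistic) exceeds a baseline threshold. The synthetic data, the number of components, and the threshold rule are illustrative assumptions, not the thesis's actual configuration.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Baseline: 200 strain snapshots from 30 sensors on the healthy structure
# (synthetic data: two dominant load patterns plus measurement noise)
patterns = rng.normal(size=(2, 30))
baseline = (rng.normal(size=(200, 2)) @ patterns
            + 0.01 * rng.normal(size=(200, 30)))

pca = PCA(n_components=2).fit(baseline)

def q_statistic(X):
    """Squared residual after projecting onto the healthy subspace."""
    reconstructed = pca.inverse_transform(pca.transform(X))
    return np.sum((X - reconstructed) ** 2, axis=1)

threshold = np.percentile(q_statistic(baseline), 99)  # simple baseline cut

# A 'damaged' snapshot: a local stiffness change perturbs a few sensors
damaged = baseline[0].copy()
damaged[10:13] += 0.3
print(q_statistic(damaged[None, :])[0] > threshold)   # -> True: flagged
```

The key property exploited here is that damage changes the strain field in directions the healthy-load subspace cannot reproduce, which is why the method can remain robust when the load itself is unknown.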

Relevance: 10.00%

Abstract:

Markerless video-based human pose estimation algorithms face a high-dimensional problem that is frequently broken down into several lower-dimensional ones by estimating the pose of each limb separately. However, in order to do so they need to reliably locate the torso, for which they typically rely on time coherence and tracking algorithms. Loss of track usually results in catastrophic failure of the process, requiring human intervention and thus precluding their use in real-time applications. We propose a very fast rough pose estimation scheme based on global shape descriptors built on 3D Zernike moments. Using an articulated model that we configure in many poses, a large database of descriptor/pose pairs can be computed off-line. Thus, the only steps that must be done on-line are the extraction of the descriptors for each input volume and a search against the database to get the most likely poses. While the result of such a process is not a fine pose estimation, it can be useful for helping more sophisticated algorithms to regain track, or for making more educated guesses when creating new particles in particle-filter-based tracking schemes. We have achieved a performance of about ten fps on a single computer using a database of about one million entries.
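
The offline/online split described here can be sketched with a nearest-neighbour index over descriptor/pose pairs. scipy's cKDTree stands in for whatever search structure the authors actually used, and the random vectors below are placeholders rather than real 3D Zernike moment descriptors.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Off-line step: database of descriptor/pose pairs rendered from the
# articulated model. Random 32-D placeholders stand in for the Zernike
# descriptors; the paper's database holds about one million entries.
n_entries, descriptor_dim, n_joints = 100_000, 32, 20
descriptors = rng.normal(size=(n_entries, descriptor_dim))
poses = rng.uniform(size=(n_entries, n_joints))  # joint angles per entry

index = cKDTree(descriptors)

# On-line step: extract the descriptor of the input volume (placeholder)
# and fetch the k nearest entries as rough pose hypotheses, e.g. to
# reseed particles in a particle-filter tracker after a loss of track.
query_descriptor = rng.normal(size=descriptor_dim)
distances, ids = index.query(query_descriptor, k=5)
print(poses[ids].shape)  # -> (5, 20): five candidate rough poses
```

Exact KD-trees lose efficiency as descriptor dimension grows, so at million-entry scale an approximate nearest-neighbour index is a common substitute; the split between a heavy offline stage and a cheap online lookup is what makes the ten-fps figure plausible.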

Relevance: 10.00%

Abstract:

In this work, the failure analysis carried out on III-V concentrator multijunction solar cells after a temperature accelerated life test is presented. All of the failures have been catastrophic, since all the solar cells turned into low shunt resistances. A case study in failure analysis based on characterization by optical microscopy, SEM, EDX, EQE and XPS is presented, revealing metal deterioration in the bus bar and fingers as well as cracks in the semiconductor structure beneath or next to the bus bar. In fact, in regions far from the bus bar the semiconductor structure does not seem to be damaged. SEM images have ruled out the presence of metal spikes inside the solar cell structure. Therefore, we think that for these particular solar cells, failures appear mainly as a consequence of deficient electrolytic growth of the front metallization, which also results in failures in the semiconductor structure close to the bus bars.

Relevance: 10.00%

Abstract:

On May 11, 2011 at 1705 hours, a small earthquake of magnitude 4.5 Mw struck the town of Lorca in south-eastern Spain. Other than alarming citizens, this quake caused only minor damage to buildings. Unfortunately, at 1847 hours a second, very shallow shock (just around 2 km under the city) registering a magnitude of 5.1 Mw produced the largest seismic catastrophe registered in Spain in the last 120 years. This second shock is commonly referred to as "Lorca's earthquake", and the following papers describe the context, circumstances and consequences of the event. Spain is a country of moderate seismic hazard in a global context. Before the Lorca earthquake, the most destructive earthquake in modern times was the so-called "Andalusian earthquake" (25 December 1884), which resulted in 750 fatalities and more than 1,500 injuries, reaching intensity X on the Mercalli scale. Despite the lack of catastrophic events in the last 120 years, Spain has always had a scientific interest in seismic ...

Relevance: 10.00%

Abstract:

Driven by the latest discoveries enabled by recent technological advances and space missions, the study of asteroids has awakened the interest of the scientific community. In fact, asteroid missions have become very popular in recent years (Hayabusa, Dawn, OSIRIS-REx, ARM, AIM-DART, ...), motivated by their outstanding scientific interest. Asteroids are fundamental constituents in the evolution of the Solar System, can be seen as vast concentrations of valuable natural resources, and are also considered strategic targets for the future of space exploration. It has long been hypothesized that small near-Earth objects (NEOs) could be captured and delivered to the vicinity of the Earth, allowing affordable access to them for in-situ science, resource utilization, and other purposes. On the other side of the balance, asteroids are often seen as potential planetary hazards, since impacts with the Earth happen all the time, and an asteroid large enough could eventually trigger catastrophic events. In spite of the severity of such occurrences, they are also utterly hard to predict. In fact, the rich dynamical aspects of asteroids, their complex modeling, and observational uncertainties make it exceptionally challenging to predict their future position accurately enough. This becomes particularly relevant when asteroids exhibit close encounters with the Earth, and more so when these happen recurrently. In such situations, where mitigation measures may need to be taken, it is of paramount importance to be able to accurately estimate their trajectories and collision probabilities. As a consequence, advanced tools are needed to model their dynamics and accurately predict their orbits, as well as new technological concepts to manipulate their orbits if necessary. The goal of this Thesis is to provide new methods, techniques, and solutions to address these challenges. The contributions of this Thesis fall into two areas: one devoted to the numerical propagation of asteroids, and another to asteroid deflection and capture concepts. Hence, the first part of the dissertation presents novel advances applicable to the high-accuracy dynamical propagation of near-Earth asteroids using regularization and perturbation techniques, with special emphasis on the DROMO method, whereas the second part exposes pioneering ideas for asteroid retrieval missions and discusses the use of an "ion beam shepherd" (IBS) for asteroid deflection purposes.

Relevance: 10.00%

Abstract:

This paper shows the importance of a holistic comprehension of the Earth as a living planet, which man inhabits and where he is exposed to environmental incidences of different natures. The aim of the paper summarized here is a reflection on all these concepts and scientific considerations related to the important role of man in the handling of natural hazards. Our planet is an unstable and dynamical system highly sensitive to initial conditions, as proposed by chaos theory (González-Miranda 2004); it is a complex organic whole which responds to minimal variations that can affect several natural phenomena such as plate tectonics, solar flares, fluid turbulence, landscape formation, forest fires, growth and migration of populations, and biological evolution. This is known as the "butterfly effect" (Lorenz 1972), meaning that a small change in the system causes a chain of events leading to large-scale, unpredictable consequences. This work dwells on the importance of knowledge of these natural and catastrophic geological, biological and human systems, so sensitive to equilibrium conditions, in order to prevent, avoid and mend their effects, and to face them in a resilient way.

Relevance: 10.00%

Abstract:

Civil buildings are not specifically designed to support blast loads, but it is important to take these potential scenarios into account because of their catastrophic effects on both people and structures. A practical way to consider explosions on reinforced concrete structures is necessary. With this objective, we propose a methodology to evaluate blast loads on large concrete buildings, using the LS-DYNA code for calculation, with Lagrangian finite elements and explicit time integration. The methodology has three steps. First, individual structural elements of the building, like columns and slabs, are studied using continuum 3D element models subjected to blast loads. In these models reinforced concrete is represented with high precision, using advanced material models such as the CSCM_CONCRETE model and segregated rebars constrained within the continuum mesh. Regrettably, this approach cannot be used for large structures because of its excessive computational cost. Second, models based on structural elements are developed, using shell and beam elements. In these models concrete is represented using the CONCRETE_EC2 model and segregated rebars with an offset formulation, and they are calibrated against the continuum element models from step one to obtain the same structural response: displacement, velocity, acceleration, damage and erosion. Third, models based on structural elements are used to develop large models of complete buildings, which serve to study the global response of buildings subjected to blast loads and progressive collapse. This article describes the different techniques needed to properly calibrate the models based on structural elements, using shell and beam elements, in order to provide results of sufficient accuracy at moderate computational cost.
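
The calibration step compares response histories from the detailed continuum model and the lighter shell/beam model. One simple way to quantify the match, a normalized RMS error per response quantity, is sketched below; the metric, the synthetic displacement histories, and the acceptance threshold are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

def nrmse(reference, candidate):
    """Normalized RMS error between two equally sampled time histories."""
    reference = np.asarray(reference, dtype=float)
    candidate = np.asarray(candidate, dtype=float)
    rms_error = np.sqrt(np.mean((candidate - reference) ** 2))
    return rms_error / (np.max(reference) - np.min(reference))

# Hypothetical mid-span displacement histories (mm) under the same blast:
t = np.linspace(0.0, 0.05, 200)                       # 50 ms, 200 samples
continuum = 12.0 * np.exp(-60 * t) * np.sin(2 * np.pi * 80 * t)
shell_beam = 11.4 * np.exp(-55 * t) * np.sin(2 * np.pi * 82 * t)

error = nrmse(continuum, shell_beam)
print(f"NRMSE = {error:.1%}")   # accept the calibration if, say, < 10 %
```

In practice one such check per quantity (displacement, velocity, acceleration, damage, erosion) gives a compact acceptance criterion for when the cheap model may replace the continuum one in the full-building runs.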

Relevance: 10.00%

Abstract:

The fast and continuous urban transformations that today's major cities experience have generated important changes in the way in which we live, experience, and perceive life in them. These transformation processes, sometimes the result of the rapid growth of urban environments, of their heterogeneity, of their economic dynamics, of the confrontation of social differences among their inhabitants, or simply the product of new ways in which environments are used, have led to a growing interest in understanding the effect that a given urban context has on the individual and vice versa; in other words, in understanding the relationship between human beings and the environment. Thus, more recently, the attention of many studies has also focused on the patterns of use of space, which are accompanied by a multitude of cognitions and valuations arising in residents as a product of their experiences with the built environment. Within these investigations, a great variety of concepts have been identified for the importance they have in the relationship between the individual and the context, among which are the concepts of identity, appropriation of place, sense of community, and residential satisfaction. This work, based on these four concepts, is concerned with understanding how they relate to a series of socio-demographic, physical-urban, and cognitive variables in the urban dimension, in order to assess the power of these constructs, and the variables linked to them, in urban transformation processes. The purpose of this analysis is to identify how these factors are taken into consideration in planning, so as to recognize different options for mediating or influencing these variables through public policies, urban planning, or urban projects. This starts from the understanding that planning practice has the option to intercede in a number of issues that directly affect people's welfare and quality of life, so a thorough understanding of the effects that urban phenomena like these have on environments is of great importance to the discipline. Therefore, this research has collected and analyzed, using different statistical tools, a large amount of data from a survey conducted in a neighborhood of the city of Madrid, which made it possible to identify the variables that carry the most weight in the urban processes that lead to greater identification, appropriation, sense of community, and residential satisfaction in the individual. This first stage subsequently made it possible to investigate, with the help of a panel of expert planners, the possibilities of the discipline to mediate or intercede in these aspects, which finally allowed us to reach some results and conclusions for assessing the options at the different scales of urban intervention and the challenges for future work on this subject.