53 results for FITS

at Universidad Politécnica de Madrid


Relevance: 20.00%

Abstract:

Publishing Linked Data is a process that involves several design decisions and technologies. Although some initial guidelines have already been provided by Linked Data publishers, these are still far from covering all the necessary steps (from data source selection to publication) or from giving enough detail about those steps, technologies, intermediate products, etc. Furthermore, given the variety of data sources from which Linked Data can be generated, we believe that it is possible to have a single, unified method for publishing Linked Data, but one that relies on different techniques, technologies and tools for particular datasets of a given domain. In this paper we present a general method for publishing Linked Data and the application of the method to sources from different domains.

Relevance: 20.00%

Abstract:

Experimental methods based on single particle tracking (SPT) are being increasingly employed in the physical and biological sciences, where nanoscale objects are visualized with high temporal and spatial resolution. SPT can probe interactions between a particle and its environment but the price to be paid is the absence of ensemble averaging and a consequent lack of statistics. Here we address the benchmark question of how to accurately extract the diffusion constant of one single Brownian trajectory. We analyze a class of estimators based on weighted functionals of the square displacement. For a certain choice of the weight function these functionals provide the true ensemble averaged diffusion coefficient, with a precision that increases with the trajectory resolution.
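As a rough illustration of the estimation problem described above (the uniform single-step weighting below is a deliberate simplification, not the authors' optimal weight function), one can simulate a single 1-D Brownian trajectory and recover its diffusion constant from the time-averaged squared displacement:

```python
import random

def brownian_trajectory(n_steps, D, dt, seed=0):
    """Simulate a 1-D Brownian trajectory with diffusion constant D."""
    rng = random.Random(seed)
    x, traj = 0.0, [0.0]
    sigma = (2 * D * dt) ** 0.5  # standard deviation of one step: <dx^2> = 2 D dt
    for _ in range(n_steps):
        x += rng.gauss(0.0, sigma)
        traj.append(x)
    return traj

def estimate_D(traj, dt):
    """Estimate D from single-step squared displacements, uniformly weighted.
    This uniform weight is illustrative only; the paper studies how the choice
    of weight function controls the precision of such estimators."""
    sq = [(traj[i + 1] - traj[i]) ** 2 for i in range(len(traj) - 1)]
    msd = sum(sq) / len(sq)      # time-averaged squared displacement per step
    return msd / (2 * dt)        # invert <dx^2> = 2 D dt

traj = brownian_trajectory(n_steps=20000, D=1.5, dt=0.01)
D_hat = estimate_D(traj, dt=0.01)  # should be close to the true value 1.5
```

With 2×10⁴ steps the estimate typically lands within a few percent of the true value; the point of the estimator class above is precisely how the weighting controls that precision for a single trajectory.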

Relevance: 20.00%

Abstract:

Una Idea Bien Cabe en una Mano. Sobre las maquetas pequeñas como síntesis del espacio arquitectónico = An Idea Fits in the Palm of a Hand

Relevance: 20.00%

Abstract:

Jewish laws prescribed that when a first-born son was presented in the temple shortly after birth, the offering consisted of two turtle doves or pigeons. And if the family were very poor, a handful of wheat would suffice: the wheat that would fit in the palm of one’s hand. That wonderful Jewish custom, which I learnt about when writing this text, moved me deeply on account of what it shares with my proposal of making models capable of fitting into the palm of one’s hand.

Relevance: 10.00%

Abstract:

Linear regression is a technique widely used in digital signal processing. It consists of finding the linear function that best fits a given set of samples. This paper proposes different hardware architectures for implementing the linear regression method on FPGAs, especially targeting area-restricted systems. The approach saves area at the cost of constraining the length of the input signal to certain fixed values. We have implemented the proposed scheme in an Automatic Modulation Classifier, meeting the hard real-time constraints of this kind of system.
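As a software reference for what such fixed-point hardware modules compute (a sketch only; the paper's actual architectures, word lengths and signal-length constraints are not reproduced here), this is the textbook closed-form ordinary least-squares fit:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = a*x + b in closed form.
    The accumulations (sums and sums of products) are the operations a
    hardware implementation would map onto multiply-accumulate units."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    b = (sy - a * sx) / n                          # intercept
    return a, b

# Noise-free samples of y = 2x + 1 recover the exact coefficients.
a, b = linear_fit([0, 1, 2, 3], [1, 3, 5, 7])  # → (2.0, 1.0)
```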

Relevance: 10.00%

Abstract:

The objective of this thesis fits into the research programme developed in recent years at the Laboratorio Central of CEDEX on the state of dam concrete, which confirms that one of the most important causes of deterioration of hydraulic structures in Spain is the alkali-silica reaction. The thesis aims to contribute to a better understanding of the alkali-silica reaction, for preventive regulatory purposes, addressing the aspects related to the identification of reactive aggregates in concrete. Knowledge of Spanish reactive aggregates (the origin of their reactivity, the types of reaction and their behaviour, and the tools available to detect and characterize them) is essential to prevent this pathology from appearing in new structures, either by avoiding the use of reactive aggregates or by taking the necessary preventive measures when their use is unavoidable. The literature review carried out revealed several gaps in the identification and characterization of rapidly reacting aggregates, whose main characteristic is that they are reactive at very low concentrations of certain reactive components. To address these gaps, an experimental programme was designed, consisting of the analysis of aggregates whose reactivity is known because they were used in structures affected by the alkali-silica reaction. A series of standard tests was carried out on the coarse aggregate extracted from these structures (petrographic examination, accelerated mortar bar test, gel pat test and chemical tests). The analysis of the experimental results revealed the actual limitations of the existing techniques when applied to Spanish reactive aggregates, seeking to minimize those limitations for aggregates whose reactivity is due to minority components (rapidly reacting aggregates). In addition, the use of X-ray diffraction (not standardized) was evaluated and a new test, the Modified Gel Pat Test, was developed. Finally, the experimental programme made it possible to define a test methodology for the study of aggregates that are reactive because of their content of minority components (rapidly reacting aggregates).

Relevance: 10.00%

Abstract:

The twentieth century brought a new sensibility characterized by the discrediting of Cartesian rationality and the weakening of universal truths, and with them of aesthetic values such as order, proportion and harmony. In the middle of the century, theorists such as Theodor Adorno, Rudolf Arnheim and Anton Ehrenzweig warned of the transformation under way in the artistic field. Contemporary aesthetics seemed to have a new goal: to deny the idea of art as an organized, finished and coherent structure. Order had lost its privileged position. Disorder, probability, arbitrariness, accidentality, randomness, chaos, fragmentation, indeterminacy... Gradually, new terms were coined by aesthetic criticism to explain what had been happening since the beginning of the century. The first essays on the matter sought to provide new interpretative models based on, among other arguments, the phenomenology of perception, the recent discoveries of quantum mechanics, the deeper layers of the psyche and information theory. Overall, these were worthy attempts to give theoretical content to a situation as obvious as it was devoid of a founding charter. Finally, in 1962, Umberto Eco brought all these efforts together by proposing a single theoretical frame in his book Opera Aperta. In his view, all the aesthetic production of the twentieth century had one characteristic in common: its capacity to express multiplicity. For this reason, he considered that the nature of contemporary art was, above all, ambiguous. The aim of this research is to clarify the consequences of the incorporation of ambiguity into architectural theoretical discourse. We should start by making an accurate analysis of this concept. However, this task is quite difficult, because ambiguity does not allow itself to be clearly defined: its signifier is as imprecise as its signified.
In addition, the negative connotations that ambiguity still carries outside the aesthetic field stigmatize the term and make its use problematic. Another problem is that the contemporary subject is able to locate ambiguity in all situations: besides recognizing it in contemporary productions, the subject also finds it in works belonging to remote ages and styles. For that reason, it could be said that everything is ambiguous. And that is correct, because ambiguity is somehow present in any creation of the imperfect human being. However, as Eco, Arnheim and Ehrenzweig pointed out, there are two major differences between the current and past contexts, one affecting the subject and the other the object. First, it is the contemporary subject, and no other, who has acquired the ability to value and assimilate ambiguity. Secondly, ambiguity was an unexpected aesthetic result in former periods, while in the contemporary object it has been codified and is deliberately present. In any case, like Eco, we consider the term ambiguity appropriate to refer to the contemporary aesthetic field. Any other term with a more specific meaning would only show partial and limited aspects of a situation that is complex and difficult to diagnose. Contrary to what might normally be expected, in this case ambiguity is the term that fits best precisely because of its lack of specificity. In fact, this lack of specificity is what allows a dynamic condition to be assigned to the idea of ambiguity, something that other terms could hardly sustain. Thus, instead of trying to define the idea of ambiguity, we will analyze how it has evolved and what its consequences have been for the architectural discipline. Instead of trying to define what it is, we will examine what its presence has meant at each moment. We will deal with ambiguity as a constant presence that has always been latent in architectural production but whose nature has been modified over time.
Eco, in the mid-twentieth century, distinguished between classical ambiguity and contemporary ambiguity. Now, half a century later, the challenge is to discern whether the idea of ambiguity has remained unchanged or has undergone a new transformation. What this research will demonstrate is that a new transformation can indeed be detected, one that has much to do with the cultural and aesthetic context of recent decades: the transition from modernism to postmodernism. This assumption leads us to establish two different levels of contemporary ambiguity, each related to one of these periods. The first level of ambiguity has been widely known for many years. Its main characteristics are a codified multiplicity, interpretative freedom and an active subject who brings to conclusion an object that is incomplete or indefinite. This level of ambiguity is related to the idea of indeterminacy, a concept successfully introduced into contemporary aesthetic language. The second level of ambiguity has gone almost unnoticed by architectural criticism, although it has been identified and studied in other theoretical disciplines. Much of the work of Fredric Jameson and Jean-François Lyotard shows reasonable evidence that the aesthetic production of postmodernism has transcended modern ambiguity to reach a new level in which, despite the existence of multiplicity, the interpretative freedom and the active subject have been questioned and, at last, denied. In this period ambiguity seems to have reached a stage at which it is no longer possible to obtain a conclusive and complete interpretation of the object, because the object has become an unreadable device. Postmodern production offers a kind of inaccessible multiplicity, and its nature is deeply contradictory. This hypothetical transformation of the idea of ambiguity has an outstanding analogy in the poetic analysis made by William Empson in his Seven Types of Ambiguity.
Empson established different levels of ambiguity and classified them according to their poetic effect, in an arrangement ascending towards incoherence. At the seventh level, where ambiguity is at its height, he located the contradiction between irreconcilable opposites. It could be said that contradiction, once it undermines the coherence of the object, was the best way contemporary aesthetics found to confirm the Hegelian judgment according to which art would ultimately reject its capacity to express truth. Much of the transformation of architecture over the last century is related to the active involvement of ambiguity in its theoretical discourse. In modern architecture ambiguity appears after the fact, in the critical review carried out by theoreticians such as Colin Rowe, Manfredo Tafuri and Bruno Zevi. The publication of several studies on Mannerism in the forties and fifties rescued certain virtues of a historical style that had been undervalued because of its deviation from the Renaissance canon. Rowe, Tafuri and Zevi, among others, pointed out the similarities between Mannerism and certain qualities of modern architecture, both devoted to breaking previous dogmas. The recovery of Mannerism made it possible to join ambiguity and modernity for the first time in the same sentence. In postmodernism, on the other hand, ambiguity is present ex professo, playing a prominent role in the theoretical discourse of the period. The distance between its analytical identification and its operational use quickly disappeared thanks to structuralism, an analytical methodology that aspired to become a modus operandi. Under its influence, architecture began to be identified and studied as a language. Thus, the postmodern theoretical project distinguished the components of architectural language and developed them separately. Consequently, there is not one but three projects related to postmodern contradiction: the semantic project, the syntactic project and the pragmatic project.
Leading these projects are those prominent architects whose work manifested a special interest in exploring and developing the potential of contradiction in architecture. Thus, it was Robert Venturi, Peter Eisenman and Rem Koolhaas who established the main features through which architecture developed the dialectics of ambiguity, at its last and most extreme level, as a theoretical project within each component of architectural language: Venturi developed a new interpretation of architecture based on its semantic component, Eisenman did the same with its syntactic component, and Koolhaas with its pragmatic component. With this approach, this research aims to establish a new reflection on the architectural transformation from modernity to postmodernity. It may also serve to shed light on certain still unnoticed aspects that have shaped the architectural heritage of recent decades, the consequence of a fruitful relationship between architecture and ambiguity and its provocative consummation in a contradictio in terminis.
This research focuses primarily on the repercussions of the incorporation of ambiguity, in the form of contradiction, into postmodern architectural discourse, through each of its three theoretical projects. It is therefore structured around a main chapter entitled "Dialectics of ambiguity as a postmodern theoretical project", divided into three parts: "Semantic project. Robert Venturi"; "Syntactic project. Peter Eisenman"; and "Pragmatic project. Rem Koolhaas". The central chapter is complemented by two others placed at the beginning. The first, entitled "Dialectics of contemporary ambiguity. An approach", offers a chronological analysis of the evolution of the idea of ambiguity in twentieth-century aesthetic theory, without yet entering into architectural questions. The second, entitled "Dialectics of ambiguity as a critique of the modern project", examines the gradual incorporation of ambiguity into the critical review of modernity, which would prove vital in enabling its later operational introduction into postmodernity. A final chapter proposes a series of "Projections" which, in the light of the preceding analysis, attempt a rereading of the current architectural context and its possible evolution, on the understanding that reflection on ambiguity still allows new discursive horizons to be glimpsed. Each double page of the thesis synthesizes the tripartite structure of the central chapter and, broadly speaking, the main methodological tool used in the research. The semantic, syntactic and pragmatic threads with which the postmodern theoretical project has been identified are thus reproduced in a specific layout of images, footnotes and main text. The images accompanying the main text are placed in the left-hand column; their arrangement follows aesthetic and compositional criteria, qualifying, as far as possible, their semantic condition. To their right are the footnotes, set in a column, each placed at the same height as its call in the main text; their regulated distribution, their value as notation and their possible equation with a deep structure allude to their syntactic condition. Finally, the main body of the text fills the right half of each double page; conceived as a continuous narrative, almost without interruptions, its role in meeting the discursive demands of a doctoral investigation corresponds to its pragmatic condition.

Relevance: 10.00%

Abstract:

Organizations are social systems or units composed of people who interact with each other to achieve common goals. One such objective is productivity, a multidimensional construct influenced by technological, economic, organizational and human aspects. Several studies support the influence on productivity of personal motivation, of the skills and abilities of individuals, of their talent for the job, and of the work environment present in the organization. The overall objective of this research is therefore to analyze the influence of human factors on productivity. The emphasis is on the individual as a key productive factor, in order to answer the research questions concerning which human variables affect productivity, whether a productivity model can be proposed that considers the impact of the human factor, and whether a method can be found for measuring productivity that includes the perception of the human factor. To address these questions, this research seeks to establish the relationships between human variables and productivity, seen from the perspective of three different units of analysis: the individual, the group and the organization, in order to formulate a model of human productivity and to design an instrument for its measurement. A major source for choosing the human variables, formulating the model and defining the method of measuring productivity was the review of the available literature on productivity and the human factor in organizations, which guided the design of the theoretical and conceptual framework. Another source was the opinion of experts and specialists directly involved in the Venezuelan electricity sector, which made it possible to obtain a model whose variables reflect the reality of the area under study. To provide an explanatory interpretation of the phenomenon, the Human Factors vs. Productivity Model (HFPM) was proposed. The model was analyzed from the perspective of causal analysis and is composed of three exogenous latent variables, denominated individual, group and organizational factors, related to an endogenous latent variable denominated productivity. The HFPM was formulated using the methodology of structural equation modeling (SEM). The relationships initially proposed between the latent variables were confirmed by the global fits of the model, and the relationships between the latent variables and their associated indicators enabled the statement of 26 hypotheses, of which 24 were confirmed. The model was validated using the rival-models strategy, which compares several SEM models and selects the one with the best fit and theoretical support; it was accepted through the joint evaluation of the global goodness-of-fit indices. Additionally, to develop the productivity measurement instrument, an exploratory factor analysis was performed prior to a confirmatory factor analysis, using SEM. The review of the concepts of productivity, the impact of the human factor and the measurement methods led to subjective methods that incorporate the perception of the main actors in the production process, both for the selection of variables and for the formulation of the productivity model and the design of the measurement instrument. The methodological contribution of this research has been the use of SEM to relate variables concerning human behavior in the organization to productivity, which opens new possibilities for research in this area.

Relevance: 10.00%

Abstract:

This project consists of the dimensioning of the liquefaction process of an offshore plant for the production of liquefied natural gas, using only N2 as refrigerant in the cooling cycles, thereby avoiding the potential hazards of mixed hydrocarbon refrigerants. The process was designed to accommodate 35.23 kg/s (roughly 1 MTPA) of dry natural gas feed, without separation of liquefied petroleum gases (LPG), and fits within all the parameters required in the process specifications. The plant was designed with the computer tool Aspen Plus. The floating production, storage and offloading system for liquefied natural gas (LNG-FPSO) is a new conceptual unit and an effective and realistic way to exploit, recover, store, transport and bring to end use marginal gas fields and offshore associated-gas resources. The report details the process, the equipment needed, estimated costs, approximate power requirements, and a brief economic analysis.

Relevance: 10.00%

Abstract:

In this paper, we present the use of D-higraphs to perform HAZOP studies. D-higraphs is a formalism that includes in a single model both the functional and the structural (ontological) components of any given system. A tool to perform a semi-automatic, guided HAZOP study on a process plant is presented. The diagnostic system uses an expert system to predict the behavior modeled with D-higraphs. The approach is applied to an industrial case and its results are compared with similar approaches proposed in previous studies. The analysis shows that the proposed methodology fits its purpose, enabling causal reasoning that explains the causes and consequences derived from deviations; it also fills some of the gaps and addresses drawbacks of previously reported HAZOP assistant tools.
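The kind of causal reasoning such an assistant automates can be caricatured with a tiny deviation table (every process variable, guide word and rule entry below is invented for illustration; a D-higraphs model is far richer than a lookup):

```python
# Minimal deviation -> (causes, consequences) lookup in the spirit of a
# guided HAZOP study. All entries are illustrative, not taken from the paper.
HAZOP_RULES = {
    ("flow", "NO"):          (["pump failure", "blocked line"],
                              ["loss of feed to downstream unit"]),
    ("flow", "MORE"):        (["control valve stuck open"],
                              ["overfilling of downstream vessel"]),
    ("temperature", "MORE"): (["cooling water failure"],
                              ["risk of runaway reaction"]),
}

def analyze(variable, guide_word):
    """Return the causes and consequences recorded for a deviation,
    or empty lists when no rule covers it."""
    causes, consequences = HAZOP_RULES.get((variable, guide_word), ([], []))
    return {"deviation": f"{guide_word} {variable}",
            "causes": causes,
            "consequences": consequences}

result = analyze("flow", "NO")
# result["causes"] → ["pump failure", "blocked line"]
```

A real D-higraphs-based tool derives such cause/consequence chains from the functional model itself rather than from a hand-written table; the sketch only shows the shape of the output a guided study produces.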

Relevance: 10.00%

Abstract:

Traffic noise generated in the city has become one of the main environmental problems, significantly affecting the quality of life of its citizens. Currently, the problem of acoustic noise pollution is mainly addressed with corrective measures applied retrospectively, once the problem already exists. The noise problem should also be tackled with preventive measures applied in the design phase of the city. However, there are few acoustic studies that can provide concrete conclusions on how urban planning decisions affect the noise problem, or on how those decisions could be optimized. This work studies noise propagation in several representative streets of the city of Madrid, selected from different urban typologies. The study shows that a direct relation exists between urban typological characteristics and noise propagation. It represents the basis for acoustic research on multiple urban aspects and fits into a new area of research within acoustics that could be placed at the service of urban planning, providing the tools needed to optimize the design of cities with the noise problem taken into account.

Relevance: 10.00%

Abstract:

The principal risks in the railway industry are mainly associated with collisions, derailments and level-crossing accidents. An understanding of the nature of previous accidents on the railway network is required to identify potential causes, develop safety systems and deploy safety procedures. Risk assessment is a process for determining the magnitude of risk to assist with decision-making. We propose a three-step methodology to predict the mean number of fatalities in railway accidents. The first step is to predict the mean number of accidents by analyzing generalized linear models and selecting the one that best fits the available historical data on the basis of goodness-of-fit statistics. The second is to compute the mean number of fatalities per accident, and the third is to estimate the mean number of fatalities. The methodology is illustrated on the Spanish railway system. Statistical models accounting for annual and grouped data for the 1992-2009 period have been analyzed. After identifying the models for broad and narrow gauges, we predicted the mean number of accidents and the number of fatalities for the 2010-18 period.
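Once an accident model is selected, the three steps reduce to simple arithmetic; a toy sketch with invented figures (the intercept-only Poisson simplification and all numbers are illustrative, not the Spanish data or the paper's chosen GLM):

```python
def predict_accident_rate(annual_counts):
    """Step 1 (simplified): for an intercept-only Poisson GLM the maximum-
    likelihood rate is just the sample mean of the annual accident counts.
    The paper instead compares several GLMs by goodness-of-fit statistics."""
    return sum(annual_counts) / len(annual_counts)

def mean_fatalities(rate_per_year, fatalities_per_accident, years):
    """Steps 2-3: combine the predicted accident rate with the per-accident
    fatality rate over the prediction horizon."""
    return rate_per_year * fatalities_per_accident * years

rate = predict_accident_rate([10, 12, 14])  # illustrative accident history
expected = mean_fatalities(rate, 0.4, 9)    # 9-year horizon, e.g. 2010-18
```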

Relevance: 10.00%

Abstract:

In this paper we present a tool to perform guided HAZOP studies using a functional modeling framework: D-higraphs. This formalism gathers in a single model both structural (ontological) and functional information about the process considered. It is applied here to an industrial case, showing that the proposed methodology fits its purpose and fills some of the gaps and drawbacks of previously reported HAZOP assistant tools.

Relevance: 10.00%

Abstract:

A generic, sudden transition to chaos has been experimentally verified using electronic circuits. The particular system studied involves the near resonance of two coupled oscillators at a 2:1 frequency ratio when the damping of the first oscillator becomes negative. We identified in the experiment all the types of orbits described by theory. We also found that a theoretical 1D limit map closely fits a map of the experimental attractor, which, however, could be strongly disturbed by noise. In particular, we found noisy periodic orbits, in good agreement with noise theory.