31 results for Build-Up Back To Back LSB, Cold-Formed Steel Structures, Lateral Distortional Buckling


Relevance:

100.00%

Publisher:

Abstract:

The Nintendo DS portable console has received great interest within the independent developers' community, with a huge homebrew scene. The 2D capabilities of this console are well known and widely used, since most efforts of amateur creators have been focused on this point. However, its 3D engine (which handles the representation of three-dimensional models) is not equally exploited. Therefore, the main objective of this project is to assess the graphic capabilities of the Nintendo DS. For this purpose, a library of functions has been coded in the C programming language. This library allows the programmer to take advantage of the possibilities the console offers in the 3D area, and it can serve the homebrew community as a tool to create 3D applications in an easy way, since it has been designed as a modular and accessible system. Regarding the render process, some conclusions have been drawn. First, it is possible to assign several colour components to the same vertex (material colour reactive to the illumination, direct colour per vertex and texture colour), both independently and simultaneously. This feature can be used to apply various effects to the model, such as pre-calculated illumination or the simulation of a texture using colour per vertex, saving video memory. Moreover, a multi-layer render system has been implemented. This system allows the programmer to issue several render passes on the same model, making it possible to apply a second texture blended with the main one or to simulate a spherical reflection effect. One of the main advances of this tool over existing ones is its animation system. The renderer includes, on the one hand, transform animation, which consists of animating meshes or groups of vertices of the model through the movement of an associated joint that determines the position and rotation of the mesh at each frame of the animation. On the other hand, the tool also implements animation by vertex sampling, where the position of the vertices is determined at every instant of the animation, generating frame by frame the poses that build up the movement (the latter method is necessary when a mesh cannot be animated by transform). A model can contain multiple skeletons, animated independently of each other, and each of them can define several animation cycles corresponding to different contextual movements (walk, run, jump, etc.). Besides, the system allows any joint to be extracted in order to associate its transform to a static external object, which will then follow the movement of the animation; in this way an object can, for example, be equipped in the hand of a character. Finally, several effects useful in the creation of three-dimensional scenes have been implemented. These include billboarding (both spherical and cylindrical), which constrains the rotation of a model so that it always faces the camera. This makes it possible to emulate the appearance of a three-dimensional object with a flat image, saving geometry, and is also helpful for particle effects. Moreover, a texture animation system based on sub-images has been implemented, which generates movement effects through images alone, without transforming geometry.
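The billboarding effect described above reduces to building a rotation matrix from the camera and object positions. The following sketch shows the underlying math in Python with NumPy; it is an illustration of the technique only, not the API of the C library developed in the project, and the function name and axis conventions are assumptions.

```python
import numpy as np

def billboard_rotation(obj_pos, cam_pos, spherical=True):
    """Rotation matrix turning a model at obj_pos toward the camera.

    Spherical billboarding aligns the model's forward axis with the full
    camera direction; cylindrical billboarding only rotates about the
    vertical (y) axis. Illustrative only: the DS library itself is written
    in C and works on the console's fixed-point matrix stack. Assumes the
    camera is not directly above the object.
    """
    to_cam = np.asarray(cam_pos, float) - np.asarray(obj_pos, float)
    if not spherical:
        to_cam[1] = 0.0                      # ignore height: yaw only
    f = to_cam / np.linalg.norm(to_cam)      # forward axis (+z to camera)
    up = np.array([0.0, 1.0, 0.0])
    r = np.cross(up, f)
    r /= np.linalg.norm(r)                   # right axis
    u = np.cross(f, r)                       # corrected up axis
    return np.column_stack([r, u, f])        # columns = local axes

# A vertex of the flat sprite is then drawn at obj_pos + R @ local_vertex.
R = billboard_rotation([0, 0, 0], [3, 2, 5], spherical=False)
```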

Relevance:

100.00%

Publisher:

Abstract:

This paper presents the rationale to build up a Telematics Engineering curriculum. Telematics is a strongly computing-oriented area; hence, the authors initially intended to apply the common requirements described in the computing curricula elaborated by the ACM/IEEE-CS Joint Curriculum Task Force. This experience has revealed some problematic aspects in the ACM/IEEE-CS proposal. From the analysis of these problems, a model to guide the selection and especially the approach of the Telematics curriculum contents is proposed. This model can easily be generalized to other strongly computing-oriented curricula, whose number is growing every day.

Relevance:

100.00%

Publisher:

Abstract:

Information Technologies are complex, and this is true even in the smallest piece of equipment. But this kind of complexity is nothing compared with the one that arises when this technology interacts with society. Office Automation has traditionally been considered a technical field, but there is no way to find solutions from a technical point of view when the problems are primarily social in origin. Technology management has to change its focus from a purely technical perspective to a sociotechnical point of view. To facilitate this change, we propose a model that allows a better understanding between the managerial and the technical world, offering a coherent, complete and integrated perspective of both. The basis for this model is an unfolding of the complexity found in Information Technologies and a matching of these complexities with several levels considered within the Office, Office Automation and Human Factors dimensions. Each of these domains is studied through a set of distinctions that create a new and powerful understanding of its reality. Using this model we build up a map of Office Automation to be used not only by managers but also by technicians, because the primary advantage of such a framework is that it allows a comprehensive evaluation of technology without requiring extensive technical knowledge. Thus, the model can be seen as a principle for the design and diagnosis of Office Automation and as a common reference for managers and specialists, avoiding the severe limitations arising from the language used by the latter.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study was to build up a data set including productive performance and production factor data of growing-finishing (GF) pigs in Spain, in order to provide a representative and reliable description of the Spanish growing-finishing pig industry. Data from 764 batches from 452 farms belonging to nine companies (1,157,212 pigs) were collected between 2008 and 2010 through a survey comprising five parts: general, facilities, feeding, health status and performance. Most of the studied farms had only GF pigs in their facilities (94.7%), produced 'industrial' pigs (86.7%), had entire males and females (59.5%) and Pietrain-sired pigs (70.0%), housed 13-20 pigs per pen (87.2%), and had ≥ 50% slatted floor (70%), single-space dry feeders (54.0%), nipple drinkers (88.7%) and automatic ventilation systems (71.2%). 75.0% of the farms used three feeding phases, mainly with pelleted diets (91.0%); 61.3% performed three or more antibiotic treatments and 36.5% obtained water from the public supply. The continuous variables studied had the following average values: number of pigs placed per batch, 1,515 pigs; initial and final body weight, 19.0 and 108 kg; length of the GF period, 136 days; culling rate, 1.4%; barn occupation, 99.7%; feed intake per pig and fattening cycle, 244 kg; daily gain, 0.657 kg; feed conversion ratio, 2.77 kg/kg; and mortality rate, 4.3%. These data reflect the practical situation of Spanish growing-finishing pig production and may contribute to developing new strategies to improve the productive and economic efficiency of GF pig units.
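The reported averages can be cross-checked against one another; the short sketch below recomputes daily gain and feed conversion ratio from the weights, period length and feed intake quoted in the abstract.

```python
# Cross-check of the reported averages (all values from the abstract).
initial_bw, final_bw = 19.0, 108.0   # kg
period = 136                         # days
feed_per_pig = 244.0                 # kg per fattening cycle

gain = final_bw - initial_bw                 # 89 kg per pig
daily_gain = gain / period                   # ~0.654 kg/d vs 0.657 reported
fcr = feed_per_pig / gain                    # ~2.74 kg/kg vs 2.77 reported

print(f"daily gain {daily_gain:.3f} kg/d, FCR {fcr:.2f} kg/kg")
# The small gaps with the reported 0.657 and 2.77 are expected, since the
# published figures are batch-level means rather than ratios of means.
```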

Relevance:

100.00%

Publisher:

Abstract:

If reinforced concrete structures are to be safe under extreme impulsive loadings such as explosions, a broad understanding of the fracture mechanics of concrete under such events is needed. Most buildings and infrastructures likely to be subjected to terrorist attacks are supported by a reinforced concrete (RC) structure. Until a few years ago, the traditional way to study the ability of RC structures to withstand explosions was a choice between hand calculations, affordable but inaccurate and unreliable, and full-scale experimental tests involving explosions, expensive and not available to many civil institutions. In this context, numerical simulation has arisen in recent years as the most effective method to analyze structures under such events. However, accurate numerical simulations require reliable constitutive models. Assuming that the failure of concrete elements subjected to blast is primarily governed by their tensile behavior, a constitutive model has been built that accounts only for failure under tension, while the material behaves elastically, without failure, under compression. Failure under tension is based on the Cohesive Crack Model. The constitutive model has been used to simulate the experimental structural response of reinforced concrete slabs subjected to blast. The results of the numerical simulations show that the model accurately represents the structural response of the RC elements under study. The simplicity of the model, which does not account for failure under compression, confirms that the ability of reinforced concrete structures to withstand blast loads is primarily governed by their tensile strength.
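The abstract names the Cohesive Crack Model as the basis for tensile failure but does not state the softening curve; the sketch below assumes the common linear softening law, in which the stress transmitted across the crack decays from the tensile strength f_t to zero at a critical opening fixed by the fracture energy G_F.

```python
def cohesive_stress(w, f_t=3.0e6, G_F=100.0):
    """Tensile stress (Pa) across a cohesive crack of opening w (m).

    Linear softening sketch: sigma = f_t * (1 - w / w_c), with
    w_c = 2 * G_F / f_t, so the area under the curve equals the fracture
    energy G_F (J/m^2). The paper uses the Cohesive Crack Model but does
    not specify the softening shape; linear softening is assumed here.
    Compression remains elastic, as stated in the abstract.
    """
    w_c = 2.0 * G_F / f_t            # critical opening: full stress release
    if w <= 0.0:
        return f_t                   # crack not yet open: strength limit
    return max(0.0, f_t * (1.0 - w / w_c))
```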

Relevance:

100.00%

Publisher:

Abstract:

Wide research is nowadays available on the characterization of hydraulic fills in terms of either static or dynamic behavior. However, reported comprehensive analyses of these soils when meant for port or mining works are scarce. Moreover, the semi-empirical procedures for assessing the silo effect on cells in floating caissons, and the liquefaction potential of these soils under sudden loads or earthquakes, are based on studies where the underlying influence parameters are not well known, yielding results with significant scatter. This is the case, for instance, of the hazards reported by the Port of Barcelona research group, with the failure of port caissons at the Port of Barcelona in 2007. By virtue of this, a combined approach has been undertaken to evaluate the problem through a proposed numerical and laboratory methodology. Within a theoretical and numerical scope, the study focuses on the theoretical framework and the numerical tools capable of facing the different challenges of this problem. The complexity is manifold: the highly non-linear behavior of loose, barely confined soils consolidating under self-weight; their potentially liquefiable nature; the significance of the hydromechanics of the soil-structure contact, whose discontinuities act as preferential paths for water flow and lateral consolidation; and initial conditions with practically negligible effective stresses. Within an experimental scope, a straightforward laboratory methodology is introduced for the hydromechanical characterization of the soil and the interfaces, without the need for complex laboratory devices or cumbersome procedures. The study therefore includes a brief overview of hydraulic fill execution, its main uses (land reclamation, filled cells, tailings dams, etc.) and the underlying phenomena (self-weight consolidation, silo effect, liquefaction, etc.), as a starting point for the work. It ranges from the evolution of the traditional consolidation equations (Terzaghi, 1943; Gibson, English & Hussey, 1967) and solving methodologies (Townsend & McVay, 1990; Fredlund, Donaldson & Gitirana, 2009) to the contributions on the silo effect (Janssen, 1895; Ravenet, 1977) and on liquefaction phenomena (Casagrande, 1936; Castro, 1969; Been & Jefferies, 1985; Pastor & Zienkiewicz, 1986). The novelty of the study lies in the development of a Finite Element Method (FEM) code in MATLAB, formulated exclusively for this problem. A theoretical (Biot, 1941; Zienkiewicz & Shiomi, 1984; Segura & Carol, 2004) and numerical framework (Zienkiewicz & Taylor, 1989; Huerta & Rodríguez, 1992; Segura & Carol, 2008) is introduced for multidimensional consolidation problems with frictional boundary conditions, together with the corresponding constitutive models (Pastor & Zienkiewicz, 1986; Fu & Liu, 2011). An experimental methodology is presented for the laboratory tests, the calibration of the constitutive models and the characterization of index and flow parameters (Castro, 1969; Bahda, 1997; Been & Jefferies, 2006), using Hostun sand as the reference hydraulic fill. As a main contribution, a series of new direct shear tests for the hydromechanical characterization of the interface between the soil and concrete structures, for different formwork types and roughnesses, is included. Finally, specific algorithms designed to solve the set of governing differential equations of the problem are presented. These algorithms are essential for handling the transient consolidation of hydraulic fills and the related effects of their placement into caisson cells, such as the silo effect and self-induced liquefaction. To this end, a 2D axisymmetric model with a coupled u-p formulation has been implemented, with continuum elements and zero-thickness interface elements, aimed at simulating the conditions of hydraulic fills once placed into floating caisson cells or close to retaining structures. This case concerns a very loose granular soil with a negligible initial effective stress level, that is, with practically all the overburden carried as excess pore pressure at the onset of self-weight consolidation, which requires specific numerical algorithms as well as particular constitutive models for both the continuum and the interface elements. The simulation of different fill placement procedures has required modifying the algorithms so that these procedures can be represented numerically and their results compared. Furthermore, the continuous updating of the soil parameters makes the algorithm a powerful tool that provides an insightful set of variable profiles, such as density, void ratio, solid fraction, excess pore pressure, and stresses and strains. This leads to a better understanding of the silo effect, the term commonly used for the transient gradient of lateral pressures on silo-like retention structures. Comparisons between the model results and different studies from the technical literature are included, both for self-weight consolidation (Fredlund, Donaldson & Gitirana, 2009) and for the silo effect (Puertos del Estado, 2006; Eurocode, 2006; Japanese Technical Standards, 2009). The study closes with the design of a decantation column prototype with frictional walls, proposed as the main future line of research.
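The full model is a 2D axisymmetric coupled u-p finite element code; as a minimal illustration of the transient pore pressure dissipation it simulates, the sketch below solves the classical 1D Terzaghi limit for a freshly placed fill whose self-weight is initially carried entirely by the pore water. All parameter values are illustrative.

```python
import numpy as np

# Minimal 1D sketch of excess pore pressure dissipation (Terzaghi, 1943),
# the small-strain limit of the self-weight consolidation problem treated
# in the thesis; the actual model is a 2D axisymmetric coupled u-p FEM
# with zero-thickness interface elements. Parameters are illustrative.
H, n = 2.0, 101                      # fill height (m), grid points
cv = 1e-6                            # coefficient of consolidation (m^2/s)
z = np.linspace(0.0, H, n)           # depth below the drained surface
dz = z[1] - z[0]
dt = 0.4 * dz**2 / cv                # explicit stability limit
gamma_sub = 9.0e3                    # submerged unit weight (N/m^3)

u = gamma_sub * z                    # t=0: self-weight carried by water
u[0] = 0.0                           # free-draining top surface
for _ in range(20000):               # explicit FTCS time stepping
    lap = (u[2:] - 2 * u[1:-1] + u[:-2]) / dz**2
    u[1:-1] += dt * cv * lap
    u[0] = 0.0                       # drained boundary
    u[-1] = u[-2]                    # impervious base: du/dz = 0
# Effective stress builds up as u dissipates: sigma' = gamma_sub*z - u.
```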

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we have studied in depth the development of support models for distance collaborative learning and, on that basis, devised an architecture founded on the principles of the CSCL (Computer Supported Collaborative Learning) paradigm. The proposed architecture addresses a specific problem, coordinating groups of students performing collaborative distance learning activities, and uses techniques drawn from Cooperative Work, Artificial Intelligence and Human-Computer Interaction, as well as ideas from Pedagogy and Psychology. We have designed a complete, open and generic solution that exploits the new information technologies to achieve an effective distance education support system. It is organised into four levels: Configuration, Experience, Organisation and Analysis. The architecture has been implemented in a system called DEGREE. In DEGREE, each level of the architecture gives rise to an independent subsystem related to the others. The application benefits from the use of shared structured workspaces. The Experience Configurator subsystem allows defining the elements of a workspace and an experience and adapting them to each type of user. The Experience Manager subsystem gathers the users' contributions to build a joint solution to a given problem. The students' interventions are structured on a generic conversation graph. Moreover, all user actions are logged in order to represent explicitly the complete process leading to the solution. These data are also stored in a common memory, which constitutes the subsystem called the Organisational Memory of Experiences. The Analyser subsystem studies the users' interventions. This analysis allows inferring conclusions about the way groups work and their attitudes towards collaboration, also taking into account the observer's subjective knowledge. The process of developing the architecture and the system in parallel has followed a five-phase refinement cycle with successive stages of prototyping and formative evaluation. Each phase of this process was carried out with real users, and the users' feedback was considered to improve the functionalities of the architecture as well as the system interface. This approach has also allowed us to verify the practical usefulness and validity of the proposals underpinning this work.
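As a rough illustration of how a generic conversation graph can structure contributions, the sketch below encodes a set of contribution types and allowed transitions, logging every action as DEGREE's organisational memory does. The node types shown are hypothetical examples, since the thesis defines the graph generically so that each experience can supply its own.

```python
# Minimal sketch of a conversation graph structuring contributions.
# The contribution types and transitions are hypothetical examples.
ALLOWED = {
    "proposal": {"question", "agreement", "counter-proposal"},
    "question": {"clarification", "proposal"},
    "clarification": {"agreement", "question"},
    "counter-proposal": {"question", "agreement"},
    "agreement": set(),              # closes a thread
}

log = []                             # shared memory of all user actions

def contribute(kind, parent_kind, author, text):
    """Append a contribution if the conversation graph permits it."""
    if parent_kind is not None and kind not in ALLOWED[parent_kind]:
        raise ValueError(f"{kind!r} may not follow {parent_kind!r}")
    log.append({"kind": kind, "author": author, "text": text})

contribute("proposal", None, "ana", "Use a shared glossary first.")
contribute("question", "proposal", "luis", "Who maintains it?")
```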

Relevance:

100.00%

Publisher:

Abstract:

Although previous studies report on the effect of street washing on ambient particulate matter levels, there is a lack of studies investigating the effect of street washing on the emission strength of road dust. A sampling campaign was conducted in the Madrid urban area during July 2009, in which road dust samples were collected at two sites, namely the Reference site (where the road surface was not washed) and the Pelayo site (where street washing was performed daily during the night). Following the chemical characterization of the road dust particles, the emission sources were resolved by means of Positive Matrix Factorization, PMF (Multilinear Engine scripting), and the mass contribution of each source was calculated for the two sites. Mineral dust, brake wear, tire wear, carbonaceous emissions and construction dust were the main sources of road dust, with mineral and construction dust being the major contributors to the inhalable road dust load. To evaluate the effectiveness of street washing on the emission sources, the source mass contributions at the two sites were compared. Although brake wear and tire wear had lower concentrations at the site where street washing was performed, these mass differences were not statistically significant, and the temporal variation did not show the expected build-up after dust removal. It was concluded that the washing activities merely moistened the road dust, without effective removal, and that mobilization of particles took place in the few hours between washing and sampling. The results also indicate that it is worth paying attention to the dust dispersed from construction sites, as it affects the emission strength in nearby streets.
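The shape of the source-apportionment computation can be sketched as a non-negative factorization of the samples-by-species matrix. The study uses PMF through Multilinear Engine scripting; plain NMF from scikit-learn, used below on synthetic data, omits PMF's per-sample uncertainty weighting and serves only as an illustration.

```python
import numpy as np
from sklearn.decomposition import NMF

# Illustrative stand-in for PMF: factor a samples x species matrix X into
# non-negative contributions G and profiles F (X ~ G @ F). Synthetic data.
rng = np.random.default_rng(0)
X = rng.random((60, 15))             # 60 samples x 15 chemical species

model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)           # source contributions per sample
F = model.components_                # source profiles (species signatures)

# Relative mass contribution of each resolved source, the quantity the
# study compares between the washed and the reference site:
print(G.sum(axis=0) / G.sum())
```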

Relevance:

100.00%

Publisher:

Abstract:

Civil buildings are not specifically designed to support blast loads, but it is important to take these potential scenarios into account because of their catastrophic effects on people and structures. A practical way to consider explosions on reinforced concrete structures is necessary. With this objective we propose a methodology to evaluate blast loads on large concrete buildings, using the LS-DYNA code for calculation, with Lagrangian finite elements and explicit time integration. The methodology has three steps. First, individual structural elements of the building, such as columns and slabs, are studied using continuum 3D element models subjected to blast loads. In these models reinforced concrete is represented with high precision, using advanced material models such as the CSCM_CONCRETE model and segregated rebars constrained within the continuum mesh. Unfortunately this approach cannot be used for large structures because of its excessive computational cost. Second, models based on structural elements are developed, using shell and beam elements. In these models concrete is represented using the CONCRETE_EC2 model and segregated rebars with an offset formulation, and the models are calibrated against the continuum element models from the first step to obtain the same structural response: displacement, velocity, acceleration, damage and erosion. Third, the models based on structural elements are used to develop large models of complete buildings, which serve to study the global response of buildings subjected to blast loads and progressive collapse. This article describes the different techniques needed to properly calibrate the models based on structural elements, using shell and beam elements, in order to provide results of sufficient accuracy at moderate computational cost.
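The calibration in the second step can be expressed as minimising the mismatch between the response histories of the continuum and structural-element models. The sketch below shows one plausible mismatch metric; it is an assumption for illustration, since the paper compares displacement, velocity, acceleration, damage and erosion rather than prescribing a single error measure.

```python
import numpy as np

def response_mismatch(t_ref, u_ref, t_mod, u_mod):
    """Calibration error between the continuum-element reference response
    and a candidate shell/beam model: RMS difference of the displacement
    histories on a common time base. Illustrative metric only.
    """
    u_i = np.interp(t_ref, t_mod, u_mod)   # resample the candidate model
    return float(np.sqrt(np.mean((u_i - u_ref) ** 2)))

# Hypothetical usage: sweep a parameter of the CONCRETE_EC2 shell model
# and keep the value that best reproduces the CSCM continuum run's
# mid-span displacement history.
```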

Relevance:

100.00%

Publisher:

Abstract:

The European Union has been promoting linguistic diversity for many years as one of its main educational goals. This is an element that facilitates student mobility and student exchanges between different universities and countries and enriches the education of young undergraduates. In particular, a higher degree of competence in the English language is becoming essential for engineers, architects and researchers in general, as English has become the lingua franca that opens up horizons to internationalisation and the transfer of knowledge in today's world. Many experts point to the Integrated Approach to Contents and Foreign Languages system as an option that has certain benefits over the traditional method of teaching a second language as a stand-alone subject. This system advocates teaching the different subjects in the syllabus in a language other than one's mother tongue, without prioritising knowledge of the language over the subject. This idea gave rise, in the 2009/10 academic year, to the Second Language Integration Programme (SLI Programme) at the Escuela de Arquitectura Técnica of the Universidad Politécnica de Madrid (EUATM-UPM), just at the beginning of the tuition of the new Building Engineering Degree, which had been adapted to the European Higher Education Area (EHEA) model. This programme is an interdisciplinary initiative spanning the set of subjects taught during the semester and is coordinated through the Assistant Director's Office for Educational Innovation. The SLI Programme has a dual goal: to familiarise students with the specific English terminology of the subjects being taught, and at the same time to improve their communication skills in English. A total of thirty lecturers are taking part in the teaching of eleven first-year subjects and twelve second-year subjects, with around 120 students who have voluntarily enrolled in a special group each semester. During the 2010/2011 academic year the degree of acceptance and the results of the SLI Programme are being monitored. Tools such as coordination records and surveys have been designed to aid interdisciplinary coordination and to analyse satisfaction. The results currently available refer to the first semester of the year and are divided into specific aspects of the different subjects involved and general aspects of the ongoing experience.

Relevance:

100.00%

Publisher:

Abstract:

The general objective of this work is to explore the potential interactions between hydrologic alteration and the degradation of riparian vegetation, mainly in the Mediterranean environment. Most of the study area lies within the Spanish Mediterranean region, characterized by a singular climatological, ecological and socio-economic behaviour. The basins analysed in this work are the Guadiana, Guadalquivir, Tagus and Douro. To complete the results and compare them with other climatic regions in Spain, two Atlantic districts were also selected: Cantábrico and Miño-Sil. River functioning in Mediterranean areas presents great variations in the flow regime, which is one of the main drivers of the structure, composition and distribution of riparian vegetation. To explore the interactions stated above, a new index is presented: the Riparian Forest Evaluation (RFV) index. This index is aimed at assessing the status of riparian vegetation based on the main hydromorphological drivers responsible for river dynamics and thus related to the development of the riparian corridor. RFV splits the evaluation of the riparian forest into four components: longitudinal, transversal and vertical continuity (the spatial dimensions) and regeneration (the temporal dimension). The final classification is based on the same five classes established by the European Water Framework Directive (WFD) (2000/60/EC) to assess the ecological status of water bodies. The application of this index to 187 water bodies has shown it to be easily applicable and consistent from both a regulatory and a scientific-technical perspective. In parallel with the design of RFV, a new tool has been developed to support the evaluation of the riparian forest and the extraction of hydromorphological variables at the water body scale rather than at the local (field survey) scale. The Riparian Characterisation by LiDAR (RiC-DAR) tool allows assessing the status of the riparian forest semi-automatically, in a quasi-continuous way, using high-resolution LiDAR. This significantly improves the quantity and quality of information compared with field sampling methods, reducing resources, particularly when working at the water body scale. One of the potential major causes of degradation of the riparian forest is hydrologic alteration; hence, once the evaluation of the riparian forest had been performed based on hydromorphological features linked to the hydrological regime (RFV), this work identified potential relationships between hydrologic alteration and the degradation of riparian vegetation. Doing so requires reliable flow records of appropriate length describing both reference and altered regimes. To satisfy this need, a new tool has been created: SEDAH, a data server for the study of hydrologic alteration, which completes gaps in daily and monthly flow series to build up an improved database for assessing hydrologic alteration. The application is available online (http://ambiental.cedex.es/Sedah). Making use of the tools and methods developed previously, indicators of hydrologic alteration were computed for 87 water bodies and analysed against the status of the riparian vegetation. The statistical results show significant relationships: the degradation of riparian vegetation, particularly of its regeneration, appears to be strongly associated with the alteration of the hydrologic regime in dry years and with droughts in Mediterranean environments. Furthermore, the joint analysis of the ecological status and the status of the riparian forest shows no significant influence of the riparian vegetation on the final ecological status under the WFD approach applied in Spain. This and other potential relationships are discussed in this work. The results allow guidance to be given on the management of riparian vegetation and of environmental flow regimes in water bodies affected by flow regulation.
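As a sketch of how four component scores might be aggregated into one of the five WFD classes, the snippet below assumes components scored on [0, 1] and equal-width class cut-offs; the thesis defines its own scoring rules per component, so both assumptions are illustrative only.

```python
# Sketch of an RFV-style aggregation: four continuity/regeneration
# component scores mapped to one of the five WFD status classes.
# The 0-1 scoring and the equal-width cut-offs are assumptions.
WFD_CLASSES = ["bad", "poor", "moderate", "good", "very good"]

def rfv_class(longitudinal, transversal, vertical, regeneration):
    scores = (longitudinal, transversal, vertical, regeneration)
    if not all(0.0 <= s <= 1.0 for s in scores):
        raise ValueError("component scores must lie in [0, 1]")
    mean = sum(scores) / 4.0
    return WFD_CLASSES[min(4, int(mean * 5))]

print(rfv_class(0.9, 0.7, 0.8, 0.4))   # -> "good"
```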

Relevance:

100.00%

Publisher:

Abstract:

The aim of this project is to create a website useful to both employees and students of a university: employees can add information after logging in with a username and password, and students can view this information. Employees may edit and display information such as their title, office, or faculty (chosen from a list defined by the administrator) and, most importantly, their schedule, whether classes, tutoring, free time, or any of the task types the administrator defines. There is an administrator, responsible for managing the employees, the available faculties and the types of tasks that employees can use on their schedules. Students can see the employees' schedules and offices on the homepage. They can tell the different employee tasks apart because each is shown in a different colour, and they can also filter the information by faculty, employee or day. To achieve our goal, we decided to program in Java using Servlets, which generate the responses to the requests coming from the website's users. We also use JSP files, which allow us to create the different web pages. We use JSP rather than plain HTML because the pages need to be dynamic: we do not want to show fixed information only, but information that changes depending on user requests. A JSP file lets us generate HTML while also using the Java language, which is necessary for our purpose. Since the information we store is not fixed and must be modifiable at any time by the employees and the administrator, we need a database that can be accessed from the web application. We chose SQLite because it integrates well into our application and offers fast responses. To access the database from our program, we simply connect to it, and with very few lines of code we can add, delete or modify entries in its different tables. To ease the initial creation of the database and its first tables, we used a Mozilla Firefox browser plugin called SQLite Manager, which allows doing so from a friendlier interface. Finally, we need a server that supports and implements the Servlet and JSP specifications. We chose the Tomcat server, a Servlet container, because it is free, easy to use and compatible with our program. The whole project was developed in the Eclipse environment, also a free program, which allows integrating the database and the server and programming the JSPs and Servlets. Having presented all the tools used, the first step is to organise the structure of the web, relating each Servlet to its JSP files. Next, we create the database and the different Servlets and adjust the database accesses, checking that they work correctly. From there it is simply a matter of building up the site step by step, showing in each place what is needed and redirecting between the different pages. In this way we can build a complex website, for free, and without being an expert in the field.
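The database access pattern described above, connect and then add, delete or modify rows with a few statements, can be sketched compactly. The project does this from Java Servlets; the sketch below uses Python's standard sqlite3 module for brevity, and the table and column names are hypothetical.

```python
import sqlite3

# Compact illustration of the SQLite access pattern the abstract
# describes. The project itself uses Java Servlets; table and column
# names here are hypothetical.
con = sqlite3.connect("university.db")
con.execute("""CREATE TABLE IF NOT EXISTS schedule (
                 employee TEXT, day TEXT, hour INTEGER, task TEXT)""")

# An employee edits their timetable (parameterised queries avoid SQL
# injection from form input):
con.execute("INSERT INTO schedule VALUES (?, ?, ?, ?)",
            ("jgarcia", "Mon", 10, "tutoring"))
con.commit()

# The student view, filtered by employee as on the site's homepage:
for row in con.execute(
        "SELECT day, hour, task FROM schedule WHERE employee = ?",
        ("jgarcia",)):
    print(row)
con.close()
```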

Relevance:

100.00%

Publisher:

Abstract:

One of the aims of the COST C14 action is the assessment and evaluation of pedestrian wind comfort. At present there is no general rule applied across Europe; several criteria have been developed and applied in different countries. These criteria are based on the definition of two independent parameters: a threshold effective wind speed and a probability of exceedance of this threshold speed. The difficulty of comparing criteria arises from the two-dimensional character of the criteria definition. An effort is being made to compare these criteria, trying both to find commonalities and to clearly identify differences, in order to build up the basis for the next step: to try to define common criteria (perhaps with regional and seasonal variations). The first point is to define clearly the threshold effective wind speed (mean velocity definition parameters: averaging interval and reference height) and the equivalence between different ways of defining the threshold effective wind speed (mean wind speed, gust equivalent mean, etc.) in comparable terms, as far as possible. It can be shown that if the wind speed at a given location is defined in terms of a probability distribution, e.g. a Weibull function, a given criterion is satisfied by an infinite set of wind conditions, that is, of probability distributions. The criterion parameters and the Weibull function parameters are linked to each other, establishing a set called iso-criteria lines (the locus of the Weibull function parameters that fulfil a given criterion). The relative position of iso-criteria lines, when displayed in a suitable two-dimensional plane, facilitates the comparison of comfort criteria. The comparison of several wind comfort criteria from several institutes is performed, showing the feasibility and limitations of the method.
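The iso-criteria lines can be written in closed form for a Weibull wind climate. The sketch below derives the locus of shape and scale parameters (k, c) that satisfy a criterion with threshold v_thr and allowed exceedance probability p_max; the numerical values are illustrative, not any institute's actual criterion.

```python
import numpy as np

# Iso-criteria line sketch. With wind speed Weibull-distributed,
#   P(V > v_thr) = exp(-(v_thr / c)**k),
# a comfort criterion (threshold v_thr, allowed exceedance p_max) is met
# with equality along a curve in the (k, c) parameter plane:
#   c(k) = v_thr / (-ln p_max)**(1/k).
# The values below are illustrative, not an actual criterion.
v_thr, p_max = 5.0, 0.05             # m/s threshold, 5 % exceedance

k = np.linspace(1.2, 2.5, 50)                # Weibull shape parameter
c = v_thr / (-np.log(p_max)) ** (1.0 / k)    # scale parameter on the line

# Plotting several criteria's (k, c) curves on one plane makes their
# relative severity directly comparable, as the paper proposes.
```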

Relevance:

100.00%

Publisher:

Abstract:

A mathematical model of the process employed by a sonic anemometer to build up the measured wind vector in a steady flow is presented, to illustrate how the geometry of these sensors, as well as the characteristics of the aerodynamic disturbance on the acoustic paths, can lead to singularities in the transformation function that relates the measured (disturbed) wind vector to the real (corrected) wind vector, impeding the application of correction/calibration functions for some wind conditions. The implicit function theorem allows identifying those combinations of real wind conditions and design parameters that lead to undefined correction/calibration functions. In general, orthogonal-path sensors do not show problematic combinations of parameters. However, some sonic sensor geometries available on the market, with paths forming smaller angles, can lead to undefined correction functions for some levels of aerodynamic disturbance and for certain wind directions. The parameters studied have a strong influence on the existence of singularities in the correction/calibration function, as well as on their number for some combinations of parameters. Some conclusions concerning good design practices are included.
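The singularity condition can be checked numerically: wherever the Jacobian of the measured-versus-real transformation is singular, the implicit function theorem gives no correction function. The sketch below uses a deliberately simplified 2D two-path sensor with a hypothetical angle-dependent shadow attenuation, only to show the shape of the check.

```python
import numpy as np

# Simplified 2D sketch of the singularity check. Two acoustic paths
# measure the wind projection attenuated by a transducer-shadow factor
# depending on the wind-to-path angle; the attenuation law below is
# hypothetical. A correction function exists only where the Jacobian of
# measured(real) is non-singular (implicit function theorem).
paths = [np.array([1.0, 0.0]),
         np.array([np.cos(1.0), np.sin(1.0)])]   # non-orthogonal, ~57 deg

def measured(v):
    out = []
    for p in paths:
        proj = v @ p
        theta = np.arctan2(v[1], v[0]) - np.arctan2(p[1], p[0])
        out.append(proj * (1.0 - 0.2 * np.cos(2.0 * theta)))  # shadow model
    return np.array(out)

def jacobian_det(v, h=1e-6):
    """Central-difference Jacobian determinant of measured() at v."""
    J = np.empty((2, 2))
    for j in range(2):
        dv = np.zeros(2)
        dv[j] = h
        J[:, j] = (measured(v + dv) - measured(v - dv)) / (2.0 * h)
    return float(np.linalg.det(J))

# Scanning jacobian_det over wind speeds and directions flags conditions
# where no correction/calibration function can be defined (det ~ 0).
```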

Relevance:

100.00%

Publisher:

Abstract:

The main goal of creating a website for the Informatics Historical Museum ("Museo Histórico de la Informática", MHI), located in the Higher Technical School of Informatics Engineers ("Escuela Técnica Superior de Ingenieros Informáticos", ETSIINF) of the Universidad Politécnica de Madrid (UPM), is to disseminate the history of computing among the general public. While it is true that a consulting website for the MHI already exists, with some pictures and information about the items displayed in the building, it is outdated, its content is scarce, and it is extremely complicated to manage and to update regularly, which is essential for informative media; it was therefore indispensable to renew the design of the web space, its contents and their management system. Nowadays there are much friendlier ways for users to browse a website, and likewise for an administrator to manage its content and keep users well informed about what is on offer at any moment. This is possible thanks to content management systems (CMS), which are discussed throughout this document. These systems make it much easier for those in charge of keeping a website up to date to do so without any knowledge of programming, languages or computing in general, since they incorporate intuitive, easy-to-use control panels, an advantage for both managers and users. For this reason, organisations such as IKEA, Ubuntu or, especially relevant to this work, the Louvre Museum use content management systems for their websites. The advantages and facilities offered by a CMS are genuinely interesting, and the document covers all of them, together with the choice of the CMS that best fits the museum's requirements, the deployment restrictions within the ETSIINF environment, and how all of this improves the visual and informative quality of the MHI. This work is organised into 11 chapters, showing how to build up a website, the available options and the final choice for this case. The first chapter gives a brief introduction to the project and specifies its goals, motivation and scope. The second chapter presents the information gathered in the research work done prior to development: the different types of websites, the technologies and languages that can be used to build them, a comparison with entities similar to the MHI, the limitations of the environment and the final choice considered most appropriate for this case. Chapter three begins to develop the solution through design: the low-level design presented to the client to lay the groundwork, the high-level design with a greater degree of realism, and the preparation of the test plans. Chapter four describes everything used in the implementation and integration of the website: tools, technologies, design templates and modules providing different functionalities. Chapter five contains detailed documentation of the results of the usability and accessibility tests performed, and the conclusions drawn from them. Once the implementation of the MHI website was finished, chapter six undertakes a consultancy exercise, showing prices and budgets for the different tasks carried out in this project. Chapter seven draws the conclusions from the previous chapters, and chapter eight outlines possible future work that could be carried out on the basis of what the institution already has, including this work. To ease the reader's understanding and satisfy their curiosity, chapter nine includes the bibliography with all the documentation consulted, and chapter ten a glossary clarifying the more technical terms and acronyms. Finally, chapter eleven annexes both the document used for the usability tests and an administrator's manual for the website, which makes the environment friendlier for the people who will have to maintain it in the future.