82 results for Environment degradation
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The circumstances that drove Europe's economic growth from the 19th century onwards are diverse and not easily prioritized. Until the 1970s, Economics and Economic History focused attention on different institutional and technological variables, and various regularities were proposed. Nevertheless, new studies also underlined that the evolution of economic activity could not be understood by considering only the new production possibilities offered by market economies. As a result, it is now also accepted that those processes cannot be explained without considering two additional circumstances: the energy flows that sustained them, and the changes undergone in their transformation. In this context, a question of special importance arises: what was the influence of biological change on economic growth? Part of those energy flows must be turned into food, and this transformation can only happen with the participation.
Abstract:
The aeronautical environment is, to this day, one of the most challenging scenarios in which to establish reliable communication links. This is mainly due to the high speeds at which aircraft travel, which cause severe degradation of system performance if the channel is not estimated continuously. In addition, the aeronautical environment is prone to many other effects that degrade the signal, such as diffraction, reflection, etc. For this reason, this project studies two typical flight scenarios: arrival (landing) and en route (cruise flight). In the en-route scenario, aircraft travel at more than twice the speed of the arrival scenario, which makes it possible to observe the effect of a larger Doppler shift. The study uses a multicarrier system with overlapping subchannels, OFDM, initially taking typical parameters of the WiMAX technology, which are then varied with the aim of improving system performance.
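The impact of aircraft speed on channel estimation can be made concrete by comparing the maximum Doppler shift with the OFDM subcarrier spacing. The Python sketch below is only an illustration: the carrier frequency, aircraft speeds and WiMAX-like numerology are assumed values, not parameters taken from the project.

```python
# Rough Doppler-vs-subcarrier-spacing check for an aeronautical OFDM link.
# All numbers below are illustrative assumptions, not project parameters.
C = 3e8            # speed of light [m/s]
FC = 5.8e9         # assumed WiMAX-like carrier frequency [Hz]
DF = 10.94e3       # assumed WiMAX-like subcarrier spacing [Hz]

def max_doppler(speed_ms, fc=FC):
    """Maximum Doppler shift for a given relative speed [m/s]."""
    return speed_ms * fc / C

scenarios = {
    "arrival (landing)": 250 * 1000 / 3600,   # ~250 km/h, assumed
    "en route (cruise)": 800 * 1000 / 3600,   # ~800 km/h, assumed
}

for name, v in scenarios.items():
    fd = max_doppler(v)
    # The ratio f_d / Δf indicates how fast the channel changes relative to
    # the OFDM grid: the larger it is, the more often re-estimation is needed.
    print(f"{name:20s} f_d = {fd:7.1f} Hz, f_d/Δf = {fd / DF:.2%}")
```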
Development of an optimized methodology for tensile testing of carbon steels in hydrogen environment
Abstract:
The study was performed at OCAS, the Steel Research Centre of ArcelorMittal for the Industry market. The major aim of this research was to obtain an optimized tensile testing methodology with in-situ H-charging to reveal hydrogen embrittlement in various high strength steels. The second aim of this study was the mechanical characterization of the hydrogen effect on high strength carbon steels with varying microstructure, i.e. ferrite-martensite and ferrite-bainite grades. The optimal parameters for H-charging, which influence the tensile test results (sample geometry, type of electrolyte, charging methods, effect of steel type, etc.), were defined and applied to Slow Strain Rate testing, Incremental Step Loading and Constant Load Testing. To better understand the initiation and propagation of cracks during tensile testing with in-situ H-charging, and to correlate them with crystallographic orientation, some materials were analyzed in the SEM in combination with the EBSD technique. Introducing a notch on the tensile samples makes it possible to achieve significantly improved reproducibility of the results. Comparing the various steel grades reveals that Dual Phase (ferrite-martensite) steels are more sensitive to hydrogen induced cracking than the FB (ferritic-bainitic) ones. This higher sensitivity to hydrogen was reflected in the reduced failure times, increased creep rates and enhanced crack initiation (SEM) of the Dual Phase steels in comparison with the FB steels.
Abstract:
Fault tolerance is a line of research that has gained major importance with the increase in computing capacity of today's supercomputers. This is because the growth in processing power comes with an increase in the number of components, which in turn brings a larger number of failures. Most current fault-tolerance strategies are centralized and do not scale when a large number of processes is used, since synchronization among all of them is required to carry out the fault-tolerance tasks. Moreover, the need to maintain performance in parallel programs is crucial, both in the presence and in the absence of failures. With this in mind, this work focuses on a decentralized fault-tolerant architecture (RADIC, Redundant Array of Distributed and Independent Controllers) that aims to maintain the initial performance and to guarantee the smallest possible overhead for reconfiguring the system in case of failure. This architecture has been implemented in the Open MPI message-passing library, currently one of the most widely used in the scientific community for running parallel programs on message-passing platforms. Initial tests show that the system introduces minimal overhead to carry out the fault-tolerance tasks. MPI is by default a fail-stop standard, and in the implementations that add some level of tolerance, the most common strategies are coordinated. In RADIC, when a failure occurs, the affected process is recovered on another node by rolling back to a previous state stored earlier through uncoordinated checkpoints, and by replaying messages from the event log. During recovery, communications with that process must be delayed and redirected to its new location. Restoring processes on a node that already hosts processes overloads the execution and reduces performance, so this work proposes the use of spare nodes on which to recover failed processes, thereby avoiding overloading nodes that already have work. This work presents a design for managing recovery on spare nodes automatically and in a decentralized way in an Open MPI environment, together with an analysis of the performance impact of this design. Initial results show a significant degradation when several failures occur during execution and no spares are used, whereas using spares restores the initial configuration and maintains performance.
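As a toy illustration of the spare-node idea described above (a hypothetical simplification, not the actual RADIC/Open MPI implementation), the Python sketch below restarts the processes of a failed node on an available spare when possible, and otherwise co-locates them on an already loaded node, which is precisely the case that degrades performance.

```python
# Toy sketch of a spare-node recovery policy; names and structure are invented.
class Cluster:
    def __init__(self, workers, spares):
        self.load = {n: 1 for n in workers}        # one process per worker node
        self.load.update({n: 0 for n in spares})   # spares start empty
        self.spares = set(spares)

    def recover(self, failed_node):
        """Restart the processes of a failed node from their last
        (uncoordinated) checkpoint, preferably on a spare node."""
        procs = self.load.pop(failed_node)
        if self.spares:
            target = self.spares.pop()             # any free spare will do
        else:
            # No spare left: co-locate on the least loaded surviving node,
            # which overloads it and lowers overall performance.
            target = min(self.load, key=self.load.get)
        self.load[target] += procs
        return target

cluster = Cluster(workers=["n0", "n1", "n2", "n3"], spares=["s0"])
print(cluster.recover("n2"))   # -> 's0': initial configuration is preserved
print(cluster.recover("n3"))   # -> a worker node: execution becomes overloaded
```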
Abstract:
Over the past year, the Open University of Catalonia Library has been designing its new website with this question in mind. Our main concern has been how to integrate the library into the students' day-to-day study routine, so that it is not merely a satellite tool. We present the design of the website which, in a virtual library like ours, is not only a website but the library itself. The central point of the website is My Library, a space that associates the library resources with the students' curriculum and their course subjects. There, students can save resources as favourites, comment on them or share them. They also have access to all the services the library offers. The resources are imported from multiple tools, such as Millennium, SFX, Metalib and DSpace, into the Drupal CMS. The resources' metadata can then be enriched with contextual information from other sources, for example the course subjects. Finally, they can be exported in standard, open data formats, making them available to linked data applications.
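As a hedged illustration of that last export step, the sketch below maps a hypothetical resource record, as it might look after being enriched with course-subject context, to schema.org JSON-LD; the field names and values are invented for the example and are not the library's actual data model.

```python
import json

# Hypothetical enriched resource record (illustrative fields only).
record = {
    "title": "Introduction to Statistics",
    "author": "Jane Doe",
    "isbn": "978-0-00-000000-0",
    "course_subject": "Research Methods",
}

# Minimal schema.org JSON-LD export, one possible open-data serialization.
jsonld = {
    "@context": "https://schema.org",
    "@type": "Book",
    "name": record["title"],
    "author": {"@type": "Person", "name": record["author"]},
    "isbn": record["isbn"],
    "about": record["course_subject"],
}

print(json.dumps(jsonld, indent=2))
```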
Abstract:
This paper presents a vision-based localization approach for an underwater robot in a structured environment. The system is based on a coded pattern placed on the bottom of a water tank and an onboard, down-looking camera. Its main features are absolute, map-based localization, landmark detection and tracking, and real-time computation (12.5 Hz). The proposed system provides the three-dimensional position and orientation of the vehicle along with its velocity. The accuracy of the drift-free estimates is very high, allowing them to be used as feedback measurements for a velocity-based low-level controller. The paper details the localization algorithm, presents some graphical results, and discusses the accuracy of the system.
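The underlying idea, recovering the camera (and hence vehicle) pose from known landmarks of a coded pattern, can be sketched with OpenCV's PnP solver. The pattern geometry, intrinsics and detections below are placeholder values; this is a minimal sketch of the principle, not the system described in the paper.

```python
import numpy as np
import cv2

# Known 3D positions of four pattern landmarks on the tank bottom [m]
# and their detected image projections [px]; all values are placeholders.
object_pts = np.array([[0, 0, 0], [0.5, 0, 0], [0.5, 0.5, 0], [0, 0.5, 0]],
                      dtype=np.float64)
image_pts = np.array([[320, 240], [420, 238], [422, 338], [318, 340]],
                     dtype=np.float64)

K = np.array([[800, 0, 320],      # assumed pinhole intrinsics
              [0, 800, 240],
              [0, 0, 1]], dtype=np.float64)
dist = np.zeros(5)                # assume no lens distortion

# Solve the perspective-n-point problem: pattern-to-camera rotation/translation.
ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)

# Camera (vehicle) position expressed in the pattern/world frame.
cam_pos = -R.T @ tvec
print("camera position in world frame [m]:", cam_pos.ravel())
```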
Abstract:
Recently, White (2007) analysed the international inequalities in Ecological Footprints per capita (EF hereafter) based on a two-factor decomposition of an index from the Atkinson family (Atkinson, 1970). Specifically, that paper evaluated the separate roles of environment intensity (EF/GDP) and average income as explanatory factors for these global inequalities. However, in addition to other comments on its appeal, this decomposition suffers from the serious limitation of omitting the role exerted by the probable correlation between the factors (York et al., 2005). This paper proposes, as an alternative, a decomposition of a conceptually similar index, Theil's (Theil, 1967), which permits a clear decomposition in terms of the roles of both factors plus an inter-factor correlation, in line with Duro and Padilla (2006). This decomposition can, in turn, be extended to group inequality components (Shorrocks, 1980), an analysis that cannot be conducted with the Atkinson indices. The proposed methodology is applied empirically to analyse the international inequalities in EF per capita for the 1980-2007 period and, amongst other results, we find that the interactive component explains, to a significant extent, the apparent pattern of stability observed in overall international inequalities.
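For reference, one standard way of writing such a two-factor Theil-type decomposition is sketched below, using the population-weighted mean logarithmic deviation; the notation is ours and the exact index used by the authors may differ.

```latex
% Sketch of a two-factor decomposition of a Theil-type index (our notation).
% e_i = per-capita footprint, t_i = EF/GDP intensity, y_i = per-capita GDP,
% p_i = population share, \mu_x = \sum_i p_i x_i.
\begin{align*}
e_i &= t_i\, y_i, \qquad
T(x) = \sum_i p_i \ln\frac{\mu_x}{x_i}, \\[4pt]
T(e) &= \sum_i p_i \ln\frac{\mu_t}{t_i}
      + \sum_i p_i \ln\frac{\mu_y}{y_i}
      + \ln\frac{\mu_e}{\mu_t\,\mu_y}
      = T(t) + T(y) + \text{interaction}.
\end{align*}
```

The interaction term reflects the population-weighted correlation between intensity and income and vanishes when $\mu_e = \mu_t\,\mu_y$, i.e. when the two factors are uncorrelated across countries.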
Abstract:
Recently, White (2007) analysed the international inequalities in Ecological Footprints per capita (EF hereafter) based on a two-factor decomposition of an index from the Atkinson family (Atkinson, 1970). Specifically, that paper evaluated the separate roles of environment intensity (EF/GDP) and average income as explanatory factors for these global inequalities. However, in addition to other comments on its appeal, this decomposition suffers from the serious limitation of omitting the role exerted by the probable correlation between the factors (York et al., 2005). This paper proposes, as an alternative, a decomposition of a conceptually similar index, Theil's (Theil, 1967), which permits a clear decomposition in terms of the roles of both factors plus an inter-factor correlation, in line with Duro and Padilla (2006). This decomposition can, in turn, be extended to group inequality components (Shorrocks, 1980), an analysis that cannot be conducted with the Atkinson indices. The proposed methodology is applied empirically to analyse the international inequalities in EF per capita for the 1980-2007 period and, amongst other results, we find that the interactive component explains, to a significant extent, the apparent pattern of stability observed in overall international inequalities. Keywords: ecological footprint; international environmental distribution; inequality decomposition
Abstract:
This paper describes a navigation system for autonomous underwater vehicles (AUVs) in partially structured environments, such as dams, harbors, marinas or marine platforms. A mechanical scanning imaging sonar is used to obtain information about the location of the planar structures present in such environments. A modified version of the Hough transform has been developed to extract line features, together with their uncertainty, from the continuous sonar dataflow. The information obtained is incorporated into a feature-based SLAM algorithm running an Extended Kalman Filter (EKF). Simultaneously, the AUV's position estimate is provided to the feature extraction algorithm to correct the distortions that the vehicle motion produces in the acoustic images. Experiments carried out in a marina located on the Costa Brava (Spain) with the Ictineu AUV show the viability of the proposed approach.
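To make the line-extraction step concrete, here is a minimal Hough-voting sketch over sonar returns converted to Cartesian points; the bin sizes and data are invented for illustration, and the paper's modified transform (with uncertainty propagation and motion-distortion correction) is considerably richer.

```python
import numpy as np

def hough_lines(points, rho_res=0.1, theta_res=np.deg2rad(1.0), rho_max=50.0):
    """Accumulate votes for line parameters (rho, theta) with
    rho = x*cos(theta) + y*sin(theta); return the best-voted line."""
    thetas = np.arange(0.0, np.pi, theta_res)
    rhos = np.arange(-rho_max, rho_max, rho_res)
    acc = np.zeros((len(rhos), len(thetas)), dtype=int)

    for x, y in points:
        rho_vals = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.clip(((rho_vals + rho_max) / rho_res).astype(int), 0, len(rhos) - 1)
        acc[idx, np.arange(len(thetas))] += 1          # one vote per theta bin

    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return rhos[i], thetas[j], acc[i, j]

# Synthetic sonar hits roughly on a wall at x = 10 m (illustrative data).
pts = [(10.0 + np.random.normal(0, 0.05), y) for y in np.linspace(-5, 5, 40)]
rho, theta, votes = hough_lines(pts)
print(f"line: rho = {rho:.2f} m, theta = {np.degrees(theta):.1f} deg, votes = {votes}")
```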
Abstract:
Engineering of negotiation models makes it possible to develop effective heuristics for business intelligence. Digital ecosystems demand open negotiation models, and defining effective heuristics in advance is not compatible with this requirement of openness. The new challenge is therefore to develop business intelligence by exploiting an adaptive approach. The idea is to learn business strategies as new negotiation models arise in the e-market arena. In this paper we present how recommendation technology may be deployed in an open negotiation environment where the interaction protocol models are not known in advance. The solution we propose is delivered as part of the ONE Platform, open source software that implements a fully distributed open environment for business negotiation.
Abstract:
This paper presents the distributed environment for virtual and/or real experiments for underwater robots (DEVRE). This environment is composed of a set of processes running on a local area network spanning three sites: 1) the onboard AUV computer; 2) a surface computer used as the human-machine interface (HMI); and 3) a computer used to simulate the vehicle dynamics and represent the virtual world. The HMI can be transparently linked to the real sensors and actuators when dealing with a real mission, or to virtual sensors and actuators when dealing with a virtual mission. The aim of DEVRE is to assist engineers during software development and testing in the lab prior to real experiments.
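The key architectural idea, exposing real and virtual sensors and actuators behind a single interface so that the HMI and control code stay unchanged between real and virtual missions, can be sketched as follows; the class and method names are invented for this illustration and do not come from DEVRE.

```python
from abc import ABC, abstractmethod
import random

class DepthSensor(ABC):
    """Common interface seen by the HMI and the control software."""
    @abstractmethod
    def read(self) -> float: ...

class RealDepthSensor(DepthSensor):
    def read(self) -> float:
        # Placeholder for reading the onboard pressure sensor over the LAN.
        raise NotImplementedError("hardware access not shown in this sketch")

class VirtualDepthSensor(DepthSensor):
    """Served by the dynamics-simulation computer during virtual missions."""
    def __init__(self, simulated_depth: float = 3.0):
        self.depth = simulated_depth
    def read(self) -> float:
        return self.depth + random.gauss(0.0, 0.01)   # simulated sensor noise

def control_step(sensor: DepthSensor, target_depth: float) -> float:
    """Toy proportional depth controller; identical for real or virtual missions."""
    return 0.5 * (target_depth - sensor.read())

print(control_step(VirtualDepthSensor(), target_depth=5.0))
```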
Abstract:
This paper highlights both the new functions taken on by UOC research librarians and the new skills that this professional profile requires, based on the experience of the UOC Virtual Library. By setting up a series of bibliometric units, the Library has been able to integrate itself into the University through bibliometric studies and other research support services. A group of research librarians provides support to researchers from the start of the research process to the assessment of their scientific output. They also provide support for the University's strategic decision-making through the analysis of bibliometric data.
Abstract:
This poster highlights both the new functions taken on by UOC research librarians and the new skills that this professional profile requires, based on the experience of the UOC Virtual Library. By setting up a series of bibliometric units, the Library has been able to integrate itself into the University through bibliometric studies and other research support services. A group of research librarians provides support to researchers from the start of the research process to the assessment of their scientific output. They also provide support for the University's strategic decision-making through the analysis of bibliometric data.
Abstract:
The explosive growth of the Internet in recent years has been reflected in the ever-increasing diversity and heterogeneity of user preferences and of the types and features of devices and access networks. Usually, the heterogeneity in the context of the users who request Web content is not taken into account by the servers that deliver it, meaning that this content will not always suit their needs. In the particular case of e-learning platforms this issue is especially critical, because it puts at stake the knowledge acquired by their users. In this paper we present a system that aims to provide the dotLRN e-learning platform with the capability to adapt to its users' context. By integrating dotLRN with a multi-agent hypermedia system, the online courses being undertaken by students, as well as their learning environment, are adapted in real time.
Abstract:
The problem of jointly estimating the number, the identities, and the data of active users in a time-varying multiuser environment was examined in a companion paper (IEEE Trans. Information Theory, vol. 53, no. 9, September 2007), at whose core was the use of the theory of finite random sets on countable spaces. Here we extend that theory to encompass the more general problem of estimating unknown continuous parameters of the active-user signals. This problem is solved by applying the theory of random finite sets constructed on hybrid spaces. We do so by deriving Bayesian recursions that describe the evolution with time of the a posteriori densities of the unknown parameters and data. Unlike in the above cited paper, wherein the exact multiuser set posterior density could be evaluated, here the continuous-parameter Bayesian recursions do not admit closed-form expressions. To circumvent this difficulty, we develop numerical approximations for the receivers based on Sequential Monte Carlo (SMC) methods ("particle filtering"). Simulation results, referring to a code-division multiple-access (CDMA) system, are presented to illustrate the theory.
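Since the continuous-parameter recursions admit no closed form, the receivers rely on SMC approximations. The generic bootstrap particle filter below (a scalar toy state-space model, not the paper's random-finite-set receiver) shows the propagate/weight/resample cycle on which such approximations are built.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf(observations, n_particles=500, sigma_x=0.5, sigma_y=0.3):
    """Bootstrap particle filter for a toy random-walk state observed in noise:
    x_t = x_{t-1} + N(0, sigma_x^2),  y_t = x_t + N(0, sigma_y^2)."""
    particles = rng.normal(0.0, 1.0, n_particles)       # draw from the prior
    estimates = []
    for y in observations:
        # 1) Propagate particles through the state transition.
        particles = particles + rng.normal(0.0, sigma_x, n_particles)
        # 2) Weight by the likelihood of the new observation.
        weights = np.exp(-0.5 * ((y - particles) / sigma_y) ** 2)
        weights /= weights.sum()
        # 3) Posterior-mean estimate, then multinomial resampling.
        estimates.append(np.sum(weights * particles))
        particles = rng.choice(particles, size=n_particles, p=weights)
    return np.array(estimates)

# Simulate a short trajectory and filter it.
true_x = np.cumsum(rng.normal(0.0, 0.5, 30))
obs = true_x + rng.normal(0.0, 0.3, 30)
est = bootstrap_pf(obs)
print("mean absolute error:", np.mean(np.abs(est - true_x)))
```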