6 results for code of ethics

at Universidad Politécnica de Madrid


Relevance:

100.00%

Publisher:

Abstract:

The important developments in technology in all areas of human life have generated high expectations and hopes with regard to the health sector. Science and technology have favored the development of remarkable therapeutic treatments that help resolve numerous problems relating to illness and disability. Nonetheless, many developments in the therapeutic realm have given rise to discussions over whether this same scientific and technological progress could also benefit those who are not sick. One may ask: why not apply the same knowledge and technology used for the treatment of illness to conditions where therapy is not necessary, but where there is a desire to care for, improve and enhance the human person? These new horizons offered by biomedical technologies undoubtedly express a deep desire of every person for health, happiness and a long life. In order to respond to these questions, current biomedical technologies and those in development offer a wide range of possibilities. In this investigation we therefore attempt to identify and define four areas of non-therapeutic treatment: illness prevention, health promotion, improvement of human nature, and human enhancement. These four areas, which do not directly concern illness, give rise to a series of questions, ranging from the meaning of health and illness to anthropological questions, such as the situations and conditions that must be met for human dignity to be respected. The treatment, improvement and enhancement of the human being imply clarifying, in scientific and technological terms, the truth and meaning of the human person as such. This research identifies and examines the relationship between the four anthropological cornerstones on which non-therapeutic biomedical technologies should be based, and presents the anthropological boundaries they should take into consideration, so as not to alter or violate the dignity of the human person. At the same time, it proposes an anthropological foundation on which to build a code of ethics for non-therapeutic biomedical technologies.

Relevance:

90.00%

Publisher:

Abstract:

The problem is general: modern architects and engineers are trying to understand historic structures using the wrong theoretical frame, the classic (elastic) theory of structures developed in the 19th century for iron and steel, and in the 20th century for reinforced concrete, disguised with "modern" computer packages, mainly FEM, but also others. Masonry is an essentially different material, and the structural equations must be adapted accordingly. It is not a matter of "taste" or "opinion", and the consequences are before us. Since, say, the 1920s, historic monuments have suffered the aggression of generations of architects and engineers trying to transform masonry into reinforced concrete or steel. The damage to the monuments and the expense have been, and are, enormous. However, as we have an adequate theory (modern limit analysis of masonry structures, Heyman 1966) which encompasses the "old theory" used successfully by the 18th and 19th century practical engineers (from Perronet to Sejourné), it is a matter of "Ethics" not to use the wrong approach. It is also "contra natura" to modify the material masonry with indiscriminate injections, stitchings, etc. It is insane to consider that buildings which are centuries or millennia old are suddenly in danger of collapse. Maintenance is necessary, but not the actual destruction of the constructive essence of the monument. A cocktail of "ignorance, fear and greed" is acting under the best of intentions.
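A back-of-the-envelope illustration of the limit-analysis approach (a hypothetical sketch, not part of the abstract): Heyman-style analysis gives a minimum thickness-to-radius ratio of roughly 0.106 for a complete semicircular arch under its own weight; comparing the actual ring thickness with that minimum gives a "geometric factor of safety". The constant and the numerical values below are assumptions for illustration only.

// Minimal sketch (assumed values): geometric safety factor of a semicircular masonry arch
// in the spirit of Heyman's limit analysis. The minimum thickness ratio t_min/R ~ 0.106 for
// a complete semicircular arch under self-weight is taken as an assumed constant here.
public class ArchSafety {

    // Assumed minimum thickness-to-radius ratio for a semicircular arch (Heyman-type result).
    static final double MIN_THICKNESS_RATIO = 0.106;

    /** Geometric safety factor = actual ring thickness / minimum thickness. */
    static double geometricSafetyFactor(double thickness, double centerlineRadius) {
        double minThickness = MIN_THICKNESS_RATIO * centerlineRadius;
        return thickness / minThickness;
    }

    public static void main(String[] args) {
        double radius = 5.0;     // centerline radius in metres (example value)
        double thickness = 0.60; // ring thickness in metres (example value)
        double gsf = geometricSafetyFactor(thickness, radius);
        System.out.printf("Geometric safety factor: %.2f%n", gsf);
        System.out.println(gsf >= 1.0
            ? "A thrust line fits within the ring: stable by the safe theorem."
            : "The ring is thinner than the minimum: no admissible thrust line exists.");
    }
}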

Relevance:

90.00%

Publisher:

Abstract:

This paper presents the results of part of the research carried out by the committee in charge of drafting the new Spanish Code of Actions in Railway Bridges. Following the work developed by the European Rail Research Institute (ERRI), the dynamic effects caused by the Spanish high-speed train TALGO have been studied and compared with those of other European trains. A simplified envelope of the impact coefficient is also presented. Finally, train-bridge interaction has been analysed and the results compared with those obtained from simple models based on moving loads.
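The "simple models based on moving loads" mentioned above can be sketched from standard beam dynamics. The following is a minimal, self-contained illustration (all bridge and train parameters are assumed values, not data from the Code committee): a simply supported Euler-Bernoulli beam is represented by its first few modes, each modal equation is forced by the axle loads currently on the span and integrated in time, and the maximum midspan deflection is recorded.

// Illustrative sketch: dynamic response of a simply supported bridge to a train of
// moving constant loads, using modal superposition (assumed parameters throughout).
public class MovingLoads {

    public static void main(String[] args) {
        // Assumed bridge and train data (illustration only).
        double L = 30.0;          // span [m]
        double EI = 7.0e10;       // flexural rigidity [N*m^2]
        double m = 20_000.0;      // mass per unit length [kg/m]
        double zeta = 0.01;       // modal damping ratio
        double v = 300.0 / 3.6;   // train speed [m/s]
        double[] axleLoad = {170e3, 170e3, 170e3, 170e3};  // axle loads [N]
        double[] axleOffset = {0.0, 13.0, 26.0, 39.0};     // distance behind first axle [m]
        int nModes = 5;

        // Natural circular frequencies of a simply supported beam.
        double[] omega = new double[nModes];
        for (int n = 1; n <= nModes; n++) {
            omega[n - 1] = Math.pow(n * Math.PI / L, 2) * Math.sqrt(EI / m);
        }

        double dt = 1e-4;
        double tEnd = (L + axleOffset[axleOffset.length - 1]) / v + 1.0; // until train leaves, plus 1 s
        double[] q = new double[nModes], qd = new double[nModes];
        double maxMidspan = 0.0;

        for (double t = 0.0; t < tEnd; t += dt) {
            for (int n = 1; n <= nModes; n++) {
                // Modal force: sum over the axles currently on the bridge.
                double f = 0.0;
                for (int k = 0; k < axleLoad.length; k++) {
                    double x = v * t - axleOffset[k];
                    if (x >= 0.0 && x <= L) {
                        f += axleLoad[k] * Math.sin(n * Math.PI * x / L);
                    }
                }
                f *= 2.0 / (m * L); // divide by modal mass m*L/2
                int i = n - 1;
                double acc = f - 2.0 * zeta * omega[i] * qd[i] - omega[i] * omega[i] * q[i];
                qd[i] += acc * dt;  // semi-implicit Euler step
                q[i] += qd[i] * dt;
            }
            // Midspan deflection from modal superposition.
            double w = 0.0;
            for (int n = 1; n <= nModes; n++) {
                w += q[n - 1] * Math.sin(n * Math.PI / 2.0);
            }
            maxMidspan = Math.max(maxMidspan, Math.abs(w));
        }
        System.out.printf("Max midspan deflection: %.4f mm%n", maxMidspan * 1000.0);
    }
}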

Relevance:

90.00%

Publisher:

Abstract:

EPICS (Experimental Physics and Industrial Control System) is a set of software tools and applications which provide a software infrastructure for building distributed data acquisition and control systems. There is currently an increasing use of such systems in large physics experiments like ITER, ESS and FREIA. In these experiments, advanced data acquisition systems using FPGA-based technology such as FlexRIO are being used more and more frequently. In the particular case of ITER (International Thermonuclear Experimental Reactor), the instrumentation and control system is supported by CCS (CODAC Core System), based on the RHEL (Red Hat Enterprise Linux) operating system, and by the plant design specifications, in which every CCS element is defined, whether hardware, firmware or software. In this final degree project, the methodology proposed in "Implementation of Intelligent Data Acquisition Systems for Fusion Experiments using EPICS and FlexRIO Technology", Sanz et al. [1], is used. The aim is to produce a set of examples covering this complete design cycle, to be proposed as use cases of these technologies. The final objective is to provide a document describing the process carried out, together with the source code of the resulting data acquisition system. The proposed methodology involves two distinct stages. The first one consists of modelling the hardware with graphical design tools such as LabVIEW FPGA; this design is later implemented on the FlexRIO device. In the second stage the design cycle is completed by creating an EPICS controller that manages the device through a generic device support layer named NDS (Nominal Device Support). This layer integrates the developed data acquisition system into CCS as an EPICS interface to the system. The use of FlexRIO technology entails the use of LabVIEW and LabVIEW FPGA.
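For context, variables exposed on the EPICS side are described as records in a database file. The snippet below is a hypothetical minimal example of the kind of waveform record a FlexRIO acquisition channel could expose; the record name, description and element count are assumptions, and the device-support binding (DTYP) supplied by the NDS driver is omitted because it depends on the actual configuration.

# Hypothetical EPICS database record for one acquired channel (names and values assumed).
record(waveform, "FLEXRIO:CH0:DATA") {
    # Short description of the record
    field(DESC, "FlexRIO channel 0 samples")
    # Element type and count of the value array
    field(FTVL, "DOUBLE")
    field(NELM, "1024")
    # Process the record when the driver posts new data
    field(SCAN, "I/O Intr")
}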

Relevance:

90.00%

Publisher:

Abstract:

This document is the result of a web development process to create a tool that will allow Cracow University of Technology to consult, create and manage timetables. The technologies chosen for this purpose are Apache Tomcat Server, MySQL Community Server, the JDBC driver, Java Servlets and JSPs for the server side. The client side relies on JavaScript, jQuery, AJAX and CSS to provide the dynamic behaviour. The document justifies the choice of these technologies and explains some development tools that help in the integration and development of all these elements: specifically, NetBeans IDE and MySQL Workbench have been used as helpful tools. After explaining all the elements involved in the development of the web application, the architecture and the code developed are explained through UML diagrams. Some implementation details related to security are also explained in more depth through sequence diagrams. As the source code of the application is provided, an installation manual has been written to run the project. In addition, as the platform is intended to be a beta that will grow, some unimplemented ideas for future development are also presented. Finally, some annexes with important files and scripts related to the initialization of the platform are attached.

This project started from an existing tool that needed to be expanded. Throughout its development, the main purpose of the project has been to lay the roots of a whole new platform that will replace the existing one. To this end, a thorough review of existing web technologies was needed: a web server and an SQL database had to be chosen. Although there were many alternatives, Java technology was finally selected for the server because of the large community behind it, the ease of modelling the language through UML diagrams, and the fact that it is free-licence software. Apache Tomcat is the open-source server that can use Java Servlet and JSP technology. Regarding the SQL database, MySQL Community Server is the most popular open-source SQL server, with a large community behind it and plenty of tools to manage the server. JDBC is the driver needed to connect Java and MySQL. Once the technologies that would be part of the platform were chosen, the development process started. After a detailed explanation of the installation of the development environment, UML use case diagrams were used to set the main tasks of the platform; UML class diagrams served to establish the relations between the classes generated; the architecture of the platform was represented through UML deployment diagrams; and Enhanced Entity-Relationship (EER) models were used to define the tables of the database and their relationships. Apart from the previous diagrams, some implementation issues are explained to give a better understanding of the developed code; UML sequence diagrams help to explain this. Once the whole platform was properly defined and developed, the performance of the application has been shown: it has been demonstrated that, in the current state of the code, the platform covers the use cases that were set as the main target. Nevertheless, some requirements needed for the proper working of the platform have been specified. As the project is meant to grow, some ideas that could not be added to this beta have been documented so that they are not lost for future development.

Finally, some annexes containing important configuration issues for the platform have been added after proper explanation, as well as an installation guide that will let a new developer get the project ready. In addition to this document, some other files related to the project are provided:
- Javadoc. The Javadoc containing the information of every Java class created, necessary for a better understanding of the source code.
- database_model.mwb. This file contains the model of the database for MySQL Workbench. Among other things, this model allows the MySQL script for the creation of the tables to be generated.
- ScheduleManager.war. The WAR file that allows the developed application to be loaded into Tomcat Server without using NetBeans.
- ScheduleManager.zip. The source code exported from the NetBeans project, containing all Java packages, JSPs, JavaScript files and CSS files that are part of the platform.
- config.properties. The configuration file used to obtain the names and credentials needed to use the database, also explained in Annex II (Example of config.properties file).
- db_init_script.sql. The SQL script that initializes the database, explained in Annex III (SQL statements for MySQL initialization).
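As an illustration of how the server side might read the database credentials and open a JDBC connection, here is a minimal, hypothetical sketch (the property keys, table and column names are assumptions; the real ones are defined in config.properties and in the database model):

import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.Properties;

// Requires the MySQL Connector/J driver on the classpath.
public class TimetableQueryExample {
    public static void main(String[] args) throws Exception {
        // Load database name and credentials from the configuration file.
        Properties cfg = new Properties();
        try (FileInputStream in = new FileInputStream("config.properties")) {
            cfg.load(in);
        }
        // Property keys are hypothetical; the real keys are documented in Annex II.
        String url = "jdbc:mysql://localhost:3306/" + cfg.getProperty("db.name");
        try (Connection conn = DriverManager.getConnection(
                     url, cfg.getProperty("db.user"), cfg.getProperty("db.password"));
             // Table and column names are assumptions for illustration only.
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT name FROM timetable WHERE academic_year = ?")) {
            ps.setString(1, "2014/2015");
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("name"));
                }
            }
        }
    }
}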

Relevance:

90.00%

Publisher:

Abstract:

The PAMELA (Phased Array Monitoring for Enhanced Life Assessment) SHM system is an integrated embedded system based on ultrasonic guided waves, consisting of several electronic devices and one system manager controller. The data collected by all PAMELA devices in the system must be transmitted to the controller, which is responsible for carrying out the advanced signal processing to obtain SHM maps. PAMELA devices consist of hardware based on a Virtex 5 FPGA with a PowerPC 440 running an embedded Linux distribution. Therefore, PAMELA devices, in addition to performing tests and transmitting the collected data to the controller, are capable of local data processing or pre-processing (reduction, normalization, pattern recognition, feature extraction, etc.). Local data processing reduces the data traffic over the network and the CPU load of the external computer. PAMELA devices can even run autonomously, performing scheduled tests and communicating with the controller only when structural damage is detected or when programmed to do so. Each PAMELA device integrates a software management application (SMA) that allows the developer to download their own algorithm code and add new data processing algorithms to the device. The development of the SMA is done in a virtual machine with an Ubuntu Linux distribution that includes all the software tools necessary to perform the entire development cycle. The Eclipse IDE (Integrated Development Environment) is used to develop the SMA project and to write the code of each data processing algorithm. This paper presents the developed software architecture and describes the necessary steps to add new data processing algorithms to the SMA in order to increase the processing capabilities of PAMELA devices. An example of basic damage index estimation using the delay-and-sum algorithm is provided.
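As a rough illustration of the kind of algorithm the SMA can host, the sketch below (assumed data layout, geometry and group velocity; not the actual SMA code) computes a basic delay-and-sum damage map from baseline-subtracted guided-wave signals: for each grid point, the actuator-point-sensor time of flight of every path is converted into a sample index, the corresponding residual samples are summed, and the absolute value of the sum is used as a simple damage index.

// Minimal delay-and-sum sketch (assumed data layout, velocity and geometry).
public class DelayAndSum {

    /**
     * residuals[p][k]: baseline-subtracted signal of path p, sample k.
     * actuator[p], sensor[p]: {x, y} coordinates of the path's transducers [m].
     */
    static double[][] damageMap(double[][] residuals, double[][] actuator, double[][] sensor,
                                double groupVelocity, double sampleRate,
                                double[] xGrid, double[] yGrid) {
        double[][] map = new double[xGrid.length][yGrid.length];
        for (int i = 0; i < xGrid.length; i++) {
            for (int j = 0; j < yGrid.length; j++) {
                double sum = 0.0;
                for (int p = 0; p < residuals.length; p++) {
                    // Time of flight: actuator -> grid point -> sensor, converted to a sample index.
                    double d = dist(actuator[p], xGrid[i], yGrid[j])
                             + dist(sensor[p], xGrid[i], yGrid[j]);
                    int k = (int) Math.round(d / groupVelocity * sampleRate);
                    if (k < residuals[p].length) {
                        sum += residuals[p][k];
                    }
                }
                map[i][j] = Math.abs(sum); // basic damage index for this pixel
            }
        }
        return map;
    }

    static double dist(double[] point, double x, double y) {
        return Math.hypot(point[0] - x, point[1] - y);
    }

    public static void main(String[] args) {
        // Tiny synthetic example: two paths with a single hypothetical scattered-wave arrival each.
        double[][] residuals = new double[2][2000];
        residuals[0][400] = 1.0;
        residuals[1][500] = 1.0;
        double[][] actuator = {{0.0, 0.0}, {0.0, 0.5}};
        double[][] sensor   = {{0.5, 0.0}, {0.5, 0.5}};
        double[] xs = {0.1, 0.2, 0.3, 0.4};
        double[] ys = {0.1, 0.2, 0.3, 0.4};
        double[][] map = damageMap(residuals, actuator, sensor, 5000.0, 1.0e6, xs, ys);
        System.out.println("Damage index at (0.2, 0.2): " + map[1][1]);
    }
}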