779 results for code of ethics
Abstract:
While forms of ethics based upon authenticity and recognition hold sway in contemporary philosophical debates (Ferrara, Honneth, Fraser, etc.), many of the implications of both processes – conceptual, moral, political – are still insufficiently reflected upon. The talk will offer a "critique" (in the Kantian sense) of both, based upon an analysis of the "semiotics" of authenticity and the resulting perpetuation of a regime of expert authority, and will also comment upon the striking absence of the realm of literature and the arts from this debate, except in some references to a rather abstract notion of Aesthetics. It will also critically re-evaluate the concept of agency implicit in an ethics of authenticity and recognition.
Abstract:
We provide new insights into the geochemistry of serpentinites from mid-ocean ridges (Mid-Atlantic Ridge and Hess Deep), passive margins (Iberia Abyssal Plain and Newfoundland) and fore-arcs (Mariana and Guatemala) based on bulk-rock and in situ mineral major and trace element compositional data collected on drill cores from the Deep Sea Drilling Project and Ocean Drilling Program. These data are important for constraining the serpentinite-hosted trace element inventory of subduction zones. Bulk serpentinites show up to several orders of magnitude enrichments in Cl, B, Sr, U, Sb, Pb, Rb, Cs and Li relative to elements of similar compatibility during mantle melting, which correspond to the highest primitive mantle-normalized B/Nb, B/Th, U/Th, Sb/Ce, Sr/Nd and Li/Y among subducted lithologies of the oceanic lithosphere (serpentinites, sediments and altered igneous oceanic crust). Among the elements showing relative enrichment, Cl and B are by far the most abundant with bulk concentrations mostly above 1000 µg/g and 30 µg/g, respectively. All other trace elements showing relative enrichments are generally present in low concentrations (µg/g level), except Sr in carbonate-bearing serpentinites (thousands of µg/g). In situ data indicate that concentrations of Cl, B, Sr, U, Sb, Rb and Cs are, and that of Li can be, increased by serpentinization. These elements are largely hosted in serpentine (lizardite and chrysotile, but not antigorite). Aragonite precipitation leads to significant enrichments in Sr, U and B, whereas calcite is important only as an Sr host. Commonly observed brucite is trace element-poor. The overall enrichment patterns are comparable among serpentinites from mid-ocean ridges, passive margins and fore-arcs, whereas the extents of enrichments are often specific to the geodynamic setting. Variability in relative trace element enrichments within a specific setting (and locality) can be several orders of magnitude. Mid-ocean ridge serpentinites often show pronounced bulk-rock U enrichment in addition to ubiquitous Cl, B and Sr enrichment. They also exhibit positive Eu anomalies on chondrite-normalized rare earth element plots. Passive margin serpentinites tend to have higher overall incompatible trace element contents than mid-ocean ridge and fore-arc serpentinites and show the highest B enrichment among all the studied serpentinites. Fore-arc serpentinites are characterized by low overall trace element contents and show the lowest Cl, but the highest Rb, Cs and Sr enrichments. Based on our data, subducted dehydrating serpentinites are likely to release fluids with high B/Nb, B/Th, U/Th, Sb/Ce and Sr/Nd, rendering them one of the potential sources of some of the characteristic trace element fingerprints of arc magmas (e.g. high B/Nb, high Sr/Nd, high Sb/Ce). However, although serpentinites are a substantial part of global subduction zone chemical cycling, owing to their low overall trace element contents (except for B and Cl) their geochemical imprint on arc magma sources (apart from addition of H2O, B and Cl) can be masked considerably by the trace element signal from subducted crustal components.
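As a small illustration of the normalization arithmetic behind ratios such as B/Nb above (our sketch, not the paper's code; the primitive mantle values below are placeholders, and a real calculation should use a published composition such as McDonough & Sun, 1995):

```python
# Illustrative sketch: primitive-mantle-normalized trace element ratios,
# as used to compare enrichments among serpentinites.
PM = {"B": 0.3, "Nb": 0.66, "Th": 0.08}  # assumed placeholder values, ug/g

def pm_normalized_ratio(sample: dict, num: str, den: str) -> float:
    """(num/den) of a bulk-rock analysis, normalized to primitive mantle."""
    return (sample[num] / PM[num]) / (sample[den] / PM[den])

# Hypothetical bulk serpentinite with strong B enrichment (ug/g):
serpentinite = {"B": 30.0, "Nb": 0.05, "Th": 0.01}
print(pm_normalized_ratio(serpentinite, "B", "Nb"))  # >> 1: relative B enrichment
```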
Abstract:
The problem is general: modern architects and engineers are trying to understand historic structures using the wrong theoretical frame, the classic (elastic) theory of structures developed in the 19th century for iron and steel, and in the 20th century for reinforced concrete, disguised with "modern" computer packages, mainly FEM, but also others. Masonry is an essentially different material, and the structural equations must be adapted accordingly. It is not a matter of "taste" or "opinion", and the consequences are before us. Since, say, the 1920s, historic monuments have suffered the aggression of generations of architects and engineers trying to transform masonry into reinforced concrete or steel. The damage to the monuments and the expense has been, and is, enormous. However, as we have an adequate theory (modern limit analysis of masonry structures, Heyman 1966), which encompasses the "old theory" used successfully by the 18th and 19th century practical engineers (from Perronet to Séjourné), it is a matter of "Ethics" not to use the wrong approach. It is also "contra natura" to modify the material masonry with indiscriminate injections, stitchings, etc. It is insane to consider that buildings which are centuries or millennia old are suddenly in danger of collapse. Maintenance is necessary, but not the actual destruction of the constructive essence of the monument. A cocktail of "ignorance, fear and greed" is acting under the best of intentions.
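For reference, a minimal formal statement of the limit-analysis framework the abstract invokes (Heyman's assumptions and the safe theorem), as commonly given in the literature; the notation is ours, not the abstract's:

```latex
% Heyman's assumptions: masonry has no tensile strength, effectively infinite
% compressive strength, and sliding failure does not occur. The safe
% (lower-bound) theorem then reads: a structure is safe if at least one
% thrust line in equilibrium with the loads lies wholly inside the masonry.
% For an arch of local section depth h(s), with e(s) the eccentricity of the
% thrust line from the section centroid at abscissa s:
\exists \, e(s) \text{ in equilibrium with the loads such that }
|e(s)| \le \tfrac{h(s)}{2} \;\; \forall s
\;\Longrightarrow\; \text{the arch is safe.}
```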
Abstract:
This paper presents the results of part of the research carried out by a committee in charge of the elaboration of the new Spanish Code of Actions in Railway Bridges. Following the work developed by the European Rail Research Institute (ERRI), the dynamic effects caused by the Spanish high-speed train TALGO have been studied and compared with other European trains. A simplified envelope of the impact coefficient is also presented. Finally, the train-bridge interaction has been analysed and the results compared with those obtained from simple models based on moving loads.
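As background for the quantities named above, a standard definition of the impact coefficient and of the resonance condition for regularly spaced axle groups, in our notation (not taken from the paper):

```latex
% Impact (dynamic amplification) coefficient: ratio of the maximum dynamic
% to the maximum static response of the bridge under the train,
\Phi = \frac{\max_t \, \delta_{\mathrm{dyn}}(t)}{\delta_{\mathrm{sta}}}
% Resonance occurs when the passing frequency of axle groups with regular
% spacing d at speed v matches a bridge natural frequency n_0:
v_{\mathrm{res}} = \frac{n_0 \, d}{i}, \qquad i = 1, 2, 3, \dots
```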
Abstract:
EPICS (Experimental Physics and Industrial Control System) is a set of software tools and applications which provide a software infrastructure for building distributed data acquisition and control systems. The use of such systems is growing in large physics experiments like ITER, ESS and FREIA, where advanced data acquisition systems using FPGA-based technology like FlexRIO are increasingly being used. In the particular case of ITER (International Thermonuclear Experimental Reactor), the instrumentation and control system is supported by CCS (CODAC Core System), based on the RHEL (Red Hat Enterprise Linux) operating system, and by the plant design specifications, which define every CCS element, whether hardware, firmware or software. This final degree project applies the methodology proposed in Implementation of Intelligent Data Acquisition Systems for Fusion Experiments using EPICS and FlexRIO Technology, Sanz et al. [1], with the aim of producing a set of examples that cover the complete design cycle and can serve as use cases for these technologies. The final objective is to provide a document describing the process followed, together with the source code of the data acquisition system produced. The proposed methodology comprises two distinct stages. The first consists of modelling the hardware with graphical design tools such as LabVIEW FPGA and synthesizing it on the FlexRIO device. In the second stage the design cycle is completed by creating an EPICS controller that manages the device through a generic device support layer named NDS (Nominal Device Support). This layer integrates the developed data acquisition system into CCS as an EPICS interface to the system. The use of FlexRIO technology entails the use of LabVIEW and LabVIEW FPGA.
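As a minimal illustration of the EPICS client side of such a system (not code from the project): once the NDS-based device support exposes process variables (PVs), any EPICS client can operate the digitizer through them. The sketch below uses the pyepics client library; the PV names are invented placeholders, not the project's actual PVs.

```python
# Minimal EPICS client sketch using pyepics (pip install pyepics).
# PV names below are hypothetical placeholders.
from epics import caget, caput

caput("FLEXRIO:ACQ:SamplingRate", 1_000_000)  # configure acquisition (S/s)
caput("FLEXRIO:ACQ:Start", 1)                 # start the acquisition state machine

waveform = caget("FLEXRIO:ACQ:Data")          # read back the acquired block
print(len(waveform), "samples acquired")
```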
Abstract:
This document is the result of a web development process to create a tool that will allow Cracow University of Technology to consult, create and manage timetables. The technologies chosen for this purpose are Apache Tomcat Server, MySQL Community Server, the JDBC driver, Java Servlets and JSPs on the server side. The client side relies on JavaScript, jQuery, AJAX and CSS to provide the dynamic behaviour. The document justifies the choice of these technologies and explains the development tools that helped in the integration and development of all these elements: specifically, NetBeans IDE and MySQL Workbench were used. After explaining all the elements involved in the development of the web application, the architecture and the code developed are explained through UML diagrams. Some implementation details related to security are explained in more depth through sequence diagrams. As the source code of the application is provided, an installation manual has been written to get the project running. In addition, as the platform is intended to be a beta that will grow, some unimplemented ideas for future development are also set out. Finally, some annexes with important files and scripts related to the initialization of the platform are attached. This project started from an existing tool that needed to be expanded. The main purpose of the project throughout its development has been to set the roots of a whole new platform that will replace the existing one. To this end, a thorough inspection of existing web technologies was needed: a web server and an SQL database had to be chosen. Although there were many alternatives, Java technology was finally selected for the server because of the large community behind it, the ease of modelling the language through UML diagrams and the fact that it is free-license software. Apache Tomcat is the open-source server that can use Java Servlet and JSP technology. As for the SQL database, MySQL Community Server is the most popular open-source SQL server, with a large community behind it and a good number of tools to manage the server. JDBC is the driver needed to connect Java and MySQL. Once the technologies that would form the platform were chosen, the development process started. After a detailed explanation of the development environment installation, UML use case diagrams were used to set out the main tasks of the platform; UML class diagrams served to establish the relations between the classes created; the architecture of the platform was represented through UML deployment diagrams; and Enhanced entity–relationship (EER) models were used to define the tables of the database and their relationships. Apart from these diagrams, some implementation issues were explained for a better understanding of the developed code; UML sequence diagrams helped to explain this. Once the whole platform was properly defined and developed, the performance of the application was demonstrated: it was shown that, in the current state of the code, the platform covers the use cases that were set as the main target. Nevertheless, some requirements needed for the proper working of the platform have been specified. As the project is intended to grow, some ideas that could not be added to this beta have been described so that they are not lost for future development.
Finally, some annexes containing important configuration issues for the platform have been added after proper explanation, as well as an installation guide that will let a new developer get the project ready. In addition to this document, some other files related to the project are provided: - Javadoc. The Javadoc containing the information on every Java class created, necessary for a better understanding of the source code. - database_model.mwb. This file contains the model of the database for MySQL Workbench. This model allows, among other things, generating the MySQL script for the creation of the tables. - ScheduleManager.war. The WAR file that allows loading the developed application into Tomcat Server without using NetBeans. - ScheduleManager.zip. The source code exported from the NetBeans project, containing all the Java packages, JSPs, JavaScript files and CSS files that are part of the platform. - config.properties. The configuration file providing the names and credentials used to access the database, also explained in Annex II, Example of config.properties file. - db_init_script.sql. The SQL script that initializes the database, explained in Annex III, SQL statements for MySQL initialization.
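Purely to illustrate the config-driven connection flow described above (the project itself is Java/JDBC; this is a language-agnostic sketch in Python using mysql-connector-python, with hypothetical property keys and table name):

```python
# Sketch: reading a Java-style config.properties and opening the MySQL
# connection it describes. Key names are hypothetical; the real file is
# documented in Annex II of the project.
import mysql.connector  # pip install mysql-connector-python

def load_properties(path: str) -> dict:
    """Parse simple key=value lines, skipping blanks and # comments."""
    props = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                props[key.strip()] = value.strip()
    return props

cfg = load_properties("config.properties")
cnx = mysql.connector.connect(
    host=cfg["db.host"], user=cfg["db.user"],
    password=cfg["db.password"], database=cfg["db.name"],
)
cur = cnx.cursor()
cur.execute("SELECT COUNT(*) FROM timetable")  # hypothetical table name
print(cur.fetchone()[0])
cnx.close()
```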
Abstract:
PAMELA (Phased Array Monitoring for Enhanced Life Assessment) SHM™ System is an integrated embedded system based on ultrasonic guided waves, consisting of several electronic devices and one system manager controller. The data collected by all PAMELA devices in the system must be transmitted to the controller, which is responsible for carrying out the advanced signal processing to obtain SHM maps. PAMELA devices consist of hardware based on a Virtex 5 FPGA with a PowerPC 440 running an embedded Linux distribution. Therefore, PAMELA devices, in addition to being able to perform tests and transmit the collected data to the controller, are capable of local data processing or pre-processing (reduction, normalization, pattern recognition, feature extraction, etc.). Local data processing decreases the data traffic over the network and reduces the CPU load of the external computer. PAMELA devices can even run autonomously, performing scheduled tests and communicating with the controller only when structural damage is detected or when programmed to do so. Each PAMELA device integrates a software management application (SMA) that allows the developer to download his or her own algorithm code and add new data processing algorithms to the device. The development of the SMA is done in a virtual machine with an Ubuntu Linux distribution including all the software tools necessary to carry out the entire development cycle. The Eclipse IDE (Integrated Development Environment) is used to develop the SMA project and to write the code of each data processing algorithm. This paper presents the developed software architecture and describes the steps necessary to add new data processing algorithms to the SMA in order to increase the processing capabilities of PAMELA devices. An example of basic damage index estimation using the delay-and-sum algorithm is provided.
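For orientation, a minimal sketch of delay-and-sum damage imaging as commonly formulated for guided waves (our illustration, not the paper's code; all names and the velocity are assumptions): for each image pixel, each actuator-sensor residual signal is sampled at the actuator-pixel-sensor time of flight, and the contributions are summed.

```python
# Minimal delay-and-sum imaging sketch. signals[k] is the residual
# (baseline-subtracted) waveform for transducer pair k, sampled at fs;
# c is an assumed group velocity of the guided-wave mode.
import numpy as np

def delay_and_sum(signals, pairs, grid, c, fs):
    """Return a damage-index map over the pixel grid.

    signals: (n_pairs, n_samples) array of residual signals
    pairs:   list of ((xa, ya), (xs, ys)) actuator/sensor coordinates per signal
    grid:    (n_pix, 2) array of pixel coordinates
    """
    image = np.zeros(len(grid))
    for sig, (act, sen) in zip(signals, pairs):
        # time of flight actuator -> pixel -> sensor, for every pixel
        d = (np.linalg.norm(grid - np.asarray(act), axis=1) +
             np.linalg.norm(grid - np.asarray(sen), axis=1))
        idx = np.clip((d / c * fs).astype(int), 0, sig.size - 1)
        image += sig[idx]          # delay each pair's signal, then sum
    return np.abs(image)           # larger values = likely damage locations
```

In a system like the one described, such a map could be computed either locally on the device as a pre-processing step or centrally on the controller.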
Abstract:
Design of hydroxyproline (Hyp)-rich glycoproteins (HRGPs) offers an approach for the structural and functional analysis of these wall components, which are broadly implicated in plant growth and development. HRGPs consist of multiple small repetitive “glycomodules” extensively O-glycosylated through the Hyp residues. The patterns of Hyp-O-glycosylation are putatively coded by the primary sequence as described by the Hyp contiguity hypothesis, which predicts contiguous Hyp residues to be attachment sites of small arabinooligosaccharides (1–5 Ara residues/Hyp); while clustered, noncontiguous Hyp residues are sites of arabinogalactan polysaccharide attachment. As a test, we designed two simple HRGPs as fusion proteins with green fluorescent protein. The first was a repetitive Ser-Hyp motif that encoded only clustered noncontiguous Hyp residues, predicted polysaccharide addition sites. The resulting glycoprotein had arabinogalactan polysaccharide O-linked to all Hyp residues. The second construct, based on the consensus sequence of a gum arabic HRGP, contained both arabinogalactan and arabinooligosaccharide addition sites and, as predicted, gave a product that contained both saccharide types. These results identify an O-glycosylation code of plants.
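As a toy rendering of the Hyp contiguity hypothesis described above (our sketch, not the authors' code, and a deliberate simplification: here any run of two or more contiguous Hyp is treated as an arabinosylation site, and any isolated Hyp, as in repetitive Ser-Hyp motifs, as an arabinogalactan site):

```python
# Toy predictor for the Hyp contiguity hypothesis. "O" denotes hydroxyproline.
import re

def predict_glycosylation(seq: str):
    predictions = []
    for match in re.finditer(r"O+", seq):
        kind = ("arabinooligosaccharide" if len(match.group()) >= 2
                else "arabinogalactan polysaccharide")
        predictions.append((match.start(), match.group(), kind))
    return predictions

print(predict_glycosylation("SOSOSOSO"))    # (Ser-Hyp)4: all predicted AG sites
print(predict_glycosylation("SOOOOTPVYK"))  # contiguous Hyp: arabinosylation
```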
Abstract:
M2 is a double-stranded RNA (dsRNA) element occurring in the hypovirulent isolate Rhs 1A1 of the plant pathogenic basidiomycete Rhizoctonia solani. Rhs 1A1 originated as a sector of the virulent field isolate Rhs 1AP, which contains no detectable amount of the M2 dsRNA. The complete sequence (3,570 bp) of the M2 dsRNA has been determined. A 6.9-kbp segment of total DNA from either Rhs 1A1 or Rhs 1AP hybridizes with an M2-specific cDNA probe. The sequences of M2 dsRNA and of PCR products generated from Rhs 1A1 total DNA were found to be identical. Thus this report describes a fungal host containing full-length DNA copies of a dsRNA element. A major portion of the M2 dsRNA is located in the cytoplasm, whereas a smaller amount is found in mitochondria. Based on either the universal or the mitochondrial genetic code of filamentous fungi, one strand of M2 encodes a putative protein of 754 amino acids. The resulting polypeptide has all four motifs of a dsRNA viral RNA-dependent RNA polymerase (RDRP) and is phylogenetically related to the RDRP of a mitochondrial dsRNA associated with hypovirulence in strain NB631 of Cryphonectria parasitica, incitant of chestnut blight. This polypeptide also has significant sequence similarity with two domains of a pentafunctional polypeptide, which catalyzes the five central steps of the shikimate pathway in yeast and filamentous fungi.
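To make the coding-capacity point concrete, a sketch of translating an ORF under the standard versus the fungal mitochondrial genetic code using Biopython (NCBI table 4 covers mold mitochondria; the sequence below is an arbitrary placeholder, not the actual M2 sequence):

```python
# Translating the same ORF under two genetic codes with Biopython.
from Bio.Seq import Seq

orf = Seq("ATGGCTTGAAAATAA")          # placeholder ORF, not the real M2 dsRNA
print(orf.translate(table=1))         # standard code: TGA is a stop -> MA*K*
print(orf.translate(table=4))         # mold mitochondrial code: TGA -> Trp -> MAWK*
```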
Abstract:
The ability to predict macromolecular conformations from sequence and thermodynamic principles has long been coveted but generally has not been achieved. We show that differences in the hydration of DNA surfaces can be used to distinguish between sequences that form A- and B-DNA. From this, a "triplet code" of A-DNA propensities was derived as energetic rules for predicting A-DNA formation. This code correctly predicted > 90% of A- and B-DNA sequences in crystals and correlates with A-DNA formation in solution. Thus, with our previous studies on Z-DNA, we now have a single method to predict the relative stability of sequences in the three standard DNA duplex conformations.
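A schematic of how such a triplet-propensity code can be applied (our toy sketch; the energy values are invented placeholders, not the published ones): slide a window of overlapping triplets along the sequence, sum their A-propensity energies, and call the duplex A-DNA if the total favors the A form.

```python
# Toy application of a triplet propensity code.
TRIPLET_ENERGY = {"GGC": -1.0, "GCC": -0.8, "AAT": +0.9}  # hypothetical kcal/mol

def a_dna_score(seq: str) -> float:
    """Sum energies of all overlapping triplets; unknown triplets contribute 0."""
    return sum(TRIPLET_ENERGY.get(seq[i:i+3], 0.0) for i in range(len(seq) - 2))

seq = "GGGCCC"
print(a_dna_score(seq), "-> A-DNA" if a_dna_score(seq) < 0 else "-> B-DNA")
```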
Abstract:
This project examines what an ethical code of conduct would look like in Afghanistan through analysis of historical, cultural and linguistic aspects found within its regions, as well as an examination of ethical codes of conduct for translators and interpreters in other countries. While numerous ethical guidelines and codes of conduct for translators and interpreters exist throughout global communities, it seems that creating a successful standardized ethical code of conduct in Afghanistan may be difficult to achieve given cultural and linguistic complexities. An ethical code of conduct for translators and interpreters in Afghanistan should give particular attention to cultural sensitivity and courtesy, impartiality, conflict of interest and options for withdrawal, and should reflect the importance of Pashtunwali.
Abstract:
From the perspective of the sociology of professions, every professional activity should have its own clearly circumscribed and regulated sphere of action. Such an articulation facilitates the regulation of the production of a given profession as well as the way in which it is practiced. The purpose of the research reported here was to provide a comprehensive review and evaluation of the regulatory framework governing the advertising sector in Spain. To this end, the authors analysed external regulatory legislation and self-regulatory codes extracted from the database of the Asociación para la Autorregulación de la Comunicación Comercial (Autocontrol) that had been enacted or adopted between 1988, the year that Law 11/1998 on General Telecommunications entered into force, and 2003, as well as other relevant documents retrieved from the Boletín Oficial del Estado (BOE) pertaining to the same period. Findings indicate that although there has been a groundswell of legislation governing advertising practices in Spain since 1988, especially at the regional level, lawmakers have focused on the content of advertising messages and shown very little interest in regulating the professions of advertising and public relations. Furthermore, Spanish legislation enacted in 2003 and EU policies appear to have encouraged the adoption of voluntary codes of ethics. Sectors traditionally subject to mandatory advertising regulation, whether due to the vulnerability of their target audiences or to the potential impact of their commercial messages on public health or the environment, are more likely than others to develop self-regulatory codes of conduct.
Abstract:
This paper demonstrates a mixed approach to the theme of the instrumentality of law by analysing both the goal of a legal transformation and the techniques adapted to achieve it. The correct recognition of a certain practical necessity led the Swiss Federal Tribunal to the intriguing "Fussballclub Lohn-Fall" judgement of 1997. The legal remedies provided for cases of unfair advantage were then creatively modified praeter legem. The adaptation was strongly influenced by foreign legal patterns. The Swiss Code of Obligations of 1911 provides a norm in art. 21 on unfair advantage (unconscionable contract), prescribing that if one party takes unjustified advantage of the weaknesses of another in order to receive an excessive benefit, such a contract is avoidable. Its wording was shaped over a hundred years ago and remains intact. However, over the course of the 20th century the necessity for more efficient protection arose. Legal doctrine and jurisprudence constantly pointed out the incompleteness of the remedies provided by art. 21 of the Code of Obligations. In the "Fussballclub Lohn-Fall" (BGE 123 III 292) the Swiss Federal Tribunal finally introduced the possibility to modify the contract. Its decision has been described as "a sign of the zeitgeist, spirit of the time". It was Swiss legal doctrine that introduced the new measure under the influence of the German "quantitative Teilnichtigkeit" (quantitative partial nullity). The historical heritage of the Roman laesio enormis has also played its role.
Abstract:
"Reprint from the Code of Virginia of 1950 and the 1952 cumulative supplement."