238 results for compiling
Abstract:
The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied to decision support in e-Manufacturing environments. Its application improves the necessary interoperability in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and a PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.
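To make the idea of a homogeneous PM information exchange concrete, the minimal Python sketch below serializes an equipment-maintenance performance measurement into XML for exchange with a PM-Web service. All element names and fields are illustrative assumptions; the abstract does not reproduce the framework's actual data model.

    import xml.etree.ElementTree as ET
    from dataclasses import dataclass

    @dataclass
    class PMeasurement:
        # Hypothetical fields for an equipment-maintenance performance measure
        equipment_id: str
        indicator: str      # e.g. MTBF, MTTR, availability
        value: float
        unit: str
        period: str         # reporting period

    def to_pm_xml(m: PMeasurement) -> bytes:
        # Serialize the measurement into an assumed exchange format
        root = ET.Element("PMRecord")
        ET.SubElement(root, "EquipmentId").text = m.equipment_id
        ET.SubElement(root, "Indicator").text = m.indicator
        ET.SubElement(root, "Value", unit=m.unit).text = str(m.value)
        ET.SubElement(root, "Period").text = m.period
        return ET.tostring(root, encoding="utf-8")

    print(to_pm_xml(PMeasurement("PUMP-07", "MTBF", 412.5, "h", "2024-01")).decode())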
Abstract:
Citizen science arises from involving all kinds of people in scientific research; they can take part in a given experiment by analysing or compiling data. No scientific training is needed to participate: anyone can contribute their bit. Citizen science has become a factor to take into account when carrying out scientific tasks that demand a great deal of dedication, or whose sheer volume of work makes them nearly impossible for a single person or a small working group to complete. The GLORIA project (GLObal Robotic-telescopes Intelligent Array) is the world's first free-access network of robotic telescopes; it allows users to participate in astronomical research by observing with robotic telescopes and/or by analysing data that other users have acquired through GLORIA or that come from other free-access databases. As a contribution to this initiative, a web platform has been proposed that will become part of the GLORIA project and on which astronomical experiments can be carried out. To promote science and collaborative learning, the proposal is to build a web application that runs on the Facebook platform. The experiments are provided by the GLORIA telescope network through web services and are defined in XML. The web application receives the XML description of an experiment, interprets it, and renders it on the Facebook platform so that potential users can perform the experiments. The results of the experiments are sent to a free-access database managed by the GLORIA project for later analysis by experts.
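As a sketch of the interpret-and-render step, the following Python fragment parses a hypothetical experiment description in XML and emits a plain-text rendering in place of the widgets the application would draw on the Facebook page; the tag names are assumptions, since GLORIA's actual experiment schema is not given here.

    import xml.etree.ElementTree as ET

    # Hypothetical XML description of a GLORIA-style experiment
    EXPERIMENT_XML = """
    <experiment name="solar-activity">
      <description>Count sunspots on a solar image</description>
      <input type="image" source="telescope"/>
      <task type="count" label="Number of sunspots"/>
    </experiment>
    """

    def render_experiment(xml_text: str) -> None:
        # Interpret the XML and print a simple textual rendering
        exp = ET.fromstring(xml_text)
        print(f"Experiment: {exp.get('name')}")
        print(f"  {exp.findtext('description')}")
        for task in exp.iter('task'):
            print(f"  Task ({task.get('type')}): {task.get('label')}")

    render_experiment(EXPERIMENT_XML)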
Abstract:
Multigroup diffusion codes for three-dimensional LWR core analysis use as input data pre-generated homogenized few-group cross sections and discontinuity factors for certain combinations of state variables, such as temperatures or densities. The simplest way of compiling those data is tabulated libraries, where a grid covering the domain of the state variables is defined and the homogenized cross sections are computed at the grid points. Then, during the core calculation, an interpolation algorithm is used to compute the cross sections from the table values. Since interpolation errors depend on the distance between grid points, a certain refinement of the mesh is required to reach a target accuracy, which can lead to large data storage volumes and a large number of lattice transport calculations. In this paper, a simple and effective procedure to optimize the distribution of grid points for tabulated libraries is presented. Optimality is considered in the sense of building a non-uniform point distribution with the minimum number of grid points for each state variable that satisfies a given target accuracy in k-effective. The procedure consists of determining the sensitivity coefficients of k-effective to the cross sections using perturbation theory, and estimating the interpolation errors committed with different mesh steps for each state variable. These results make it possible to evaluate the influence of the interpolation error of each cross section on k-effective for any combination of state variables, and to estimate the optimal distance between grid points.
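The core of the optimization can be sketched as follows, assuming piecewise-linear interpolation, whose error is bounded by h²/8 times the maximum second derivative, and first-order propagation of cross-section errors to k-effective through a sensitivity coefficient; all numbers are illustrative, not real lattice data.

    import numpy as np

    def optimal_step(d2sigma_max, sigma_ref, sensitivity, keff_target):
        """Largest mesh step h whose linear-interpolation error on a cross
        section keeps the induced k-effective error within the target.

        The interpolation error is bounded by h**2 / 8 * max|sigma''|, and
        its effect on k-eff is taken as sensitivity * delta_sigma / sigma.
        """
        allowed_rel_err = keff_target / abs(sensitivity)
        allowed_abs_err = allowed_rel_err * sigma_ref
        return np.sqrt(8.0 * allowed_abs_err / d2sigma_max)

    # Illustrative numbers for the fuel-temperature dependence of an
    # absorption cross section (assumed, not from a lattice code)
    h = optimal_step(d2sigma_max=1.2e-7,  # barn/K**2, assumed curvature
                     sigma_ref=2.0,       # barn, reference value
                     sensitivity=0.15,    # (dk/k)/(dsigma/sigma), assumed
                     keff_target=2e-4)    # 20 pcm accuracy target
    print(f"Suggested fuel-temperature grid step: {h:.0f} K")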
Abstract:
After the 2010 Haiti earthquake, which hit Port-au-Prince, the capital city of Haiti, a multidisciplinary working group of specialists (seismologists, geologists, engineers and architects) from different Spanish universities and from Haiti joined efforts under the SISMO-HAITI project (financed by the Universidad Politecnica de Madrid), with one objective: the evaluation of seismic hazard and risk in Haiti and its application to seismic design, urban planning, and emergency and resource management. In this paper, as a first step towards estimating the structural damage caused by future earthquakes in the country, a calibration of damage functions has been carried out by means of a two-stage procedure. After compiling a database of the damage observed in the city after the earthquake, the exposure model (building stock) was classified and, through an iterative two-step calibration process, a specific set of damage functions for the country has been proposed. Additionally, Next Generation Attenuation (NGA) models and Vs30 models have been analysed to choose the most appropriate ones for the seismic risk estimation in the city. Finally, in a subsequent paper, these functions will be used to estimate a seismic risk scenario for a future earthquake.
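As an illustration of the calibration step, the sketch below fits a lognormal fragility curve to observed damage ratios by least squares. The lognormal form and all numbers are assumptions for illustration; the abstract does not specify the damage-function family or the Haiti data.

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def fragility(pga, median, beta):
        # Lognormal fragility curve: P(damage >= state | PGA)
        return norm.cdf(np.log(pga / median) / beta)

    # Illustrative observed damage ratios per PGA bin (not the Haiti data)
    pga_obs = np.array([0.05, 0.10, 0.20, 0.30, 0.45])   # g
    p_damage = np.array([0.02, 0.10, 0.38, 0.62, 0.85])  # fraction damaged

    # Fit the curve parameters to the observed damage
    (median, beta), _ = curve_fit(fragility, pga_obs, p_damage, p0=(0.25, 0.6))
    print(f"median PGA = {median:.2f} g, dispersion beta = {beta:.2f}")

In a two-step calibration, the building stock would then be re-classified with the fitted curves and the fit repeated until the predicted and observed damage distributions agree.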
Abstract:
This research analyses the cause of the collapse of the fourth compartment of the Third Reservoir of Canal de Isabel II in Madrid on 8 April 1905, one of the most serious accidents in the history of Spanish construction: 30 people died and another 60 were injured. The design and construction supervision were carried out by José Eugenio Ribera, one of the great figures of civil engineering in our country, whose career could have been cut short as a result of the disaster. Given the time elapsed since the accident, the investigation began by compiling the information relating to the design and construction of the structure, and then reviewed the available information about the collapse. Regarding the construction of the roof, its daring structural configuration is worth highlighting: an immense area of 74,000 m² was covered by a succession of reinforced concrete vaults only 5 cm thick, with a rise of 1/10th over a 6 m span, supported on frames of the same material with very slender columns: 0.25 m square for a height of 8 m. All of this at a time when the technology and knowledge of structures built with this "new" material were largely based on the development of patents. As for the information about the collapse, the first thing that stands out is the standing of the experts and lawyers who took part in the trial and in the subsequent administrative procedure, which shows the importance the accident had at the time, even though it has not come down to the present day. Examples are the role of Echegaray, the leading intellectual figure of the period, as an expert witness for Ribera's defence; Melquiades Álvarez, future president of the Congress, as defence counsel; General Marvá, one of the greatest exponents of the role of military engineers in the introduction of reinforced concrete in our country, who chaired the Commission appointed by the court to investigate the causes; and the opinions of internationally recognized authorities on the "new" material such as Dr. von Emperger and Hennebique. The most striking aspect of this information, however, is the lack of agreement on what might have caused the collapse: material defects, construction flaws, errors in the design of the structure, the load tests performed when construction was finished, and so on. The cause that prevailed at the trial and in the subsequent reports was the dilatation of the structure as a consequence of the high temperatures of that spring, even though the collapse occurred at 7 a.m... On the basis of this information, the structural behaviour of the roof has been analysed, making it possible to assess the role that various factors may have played in the initiation of the collapse and in its propagation to the entire built area, and thus to conclude what caused the disaster.
Special attention is paid to the lessons that emerge from the collapse, emphasizing the relevance of history, and in particular of historical cases of failure, for the continuing education that should exist in engineering. In the case of the Third Reservoir some of these lessons remain fully current, such as the importance of construction details for the "robustness" of structures, the design of "integral" structures, and the supervision of the construction process. Finally, the investigation has served to recover, once again, the figure of José Eugenio Ribera, whose role in the introduction of reinforced concrete in Spain was decisive. In the Third Reservoir he took too many risks and caused a disaster that accelerated the transition to a new era in structural concrete, underpinned by greater scientific knowledge and the first codes. In this new period he would also play a leading role.
Abstract:
New cloud-oriented technologies, the Internet of Things and "as a service" trends are based on storing and processing data on remote servers. To guarantee the security of those data, both while they are communicated to the remote server and while they are handled there, different cryptographic schemes are used. Traditionally, these schemes focus on encrypting the data whenever they do not need to be processed, that is, during communication and storage. However, once the encrypted data have to be processed on the remote server, they must first be decrypted, at which point an intruder on that server could access sensitive user data. Moreover, this traditional approach requires the server to be able to decrypt the data, so its integrity must be trusted not to compromise them. Fully homomorphic encryption (FHE) schemes arise as a possible solution to these problems. A fully homomorphic scheme does not require decrypting the data in order to operate on them; instead, it performs the operations on the encrypted data, maintaining a homomorphism between the ciphertext and the plaintext. In this way, an intruder in the system could steal nothing more than ciphertexts, making theft of the sensitive data impossible without theft of the encryption keys. However, homomorphic encryption schemes are currently drastically slow compared with classical encryption schemes: one operation in the plaintext ring can entail numerous operations in the ciphertext ring. For this reason, different approaches are emerging for accelerating these schemes towards practical use. One of the proposals is the use of High-Performance Computing (HPC) with FPGAs (Field Programmable Gate Arrays). An FPGA is a semiconductor device containing logic blocks whose interconnection and functionality can be reprogrammed after manufacturing, hence "field-programmable". Compiling for an FPGA generates a hardware circuit specific to the given algorithm, instead of executing a set of instructions on a universal machine, which is a great advantage over CPUs. FPGAs therefore have clear differences with respect to CPUs: a pipelined architecture, which allows successive outputs to be obtained in constant time, and the possibility of multiple pipes for concurrent/parallel computation. In this project, we present different implementations of FHE schemes on FPGA-based systems, analyse and study the advantages and drawbacks of the implemented schemes, and compare the implementations with related work.
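To make the notion of a homomorphism between ciphertexts and plaintexts concrete, here is a toy additively homomorphic (Paillier-style) sketch with insecurely small, hard-coded primes. It is not a fully homomorphic scheme and is unrelated to the FPGA implementations above; it only shows that operating on ciphertexts can act on the underlying plaintexts without decryption.

    import math, random

    # Toy Paillier-style scheme (insecure parameters, illustration only)
    p, q = 101, 113
    n, n2 = p * q, (p * q) ** 2
    lam = math.lcm(p - 1, q - 1)   # Carmichael-style exponent
    g = n + 1                      # standard choice; simplifies decryption
    mu = pow(lam, -1, n)           # modular inverse of lam mod n

    def encrypt(m: int) -> int:
        r = random.randrange(2, n)
        while math.gcd(r, n) != 1:
            r = random.randrange(2, n)
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c: int) -> int:
        u = pow(c, lam, n2)
        return ((u - 1) // n * mu) % n

    # The homomorphism: multiplying ciphertexts adds the plaintexts
    c1, c2 = encrypt(17), encrypt(25)
    assert decrypt((c1 * c2) % n2) == 17 + 25
    print("Enc(17) * Enc(25) decrypts to", decrypt((c1 * c2) % n2))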
Abstract:
The search for a common cause of species richness gradients has spawned more than 100 explanatory hypotheses in just the past two decades. Despite recent conceptual advances, further refinement of the most plausible models has been stifled by the difficulty of compiling high-resolution databases at continental scales. We used a database of the geographic ranges of 2,869 species of birds breeding in South America (nearly a third of the world's living avian species) to explore the influence of climate, quadrat area, ecosystem diversity, and topography on species richness gradients at 10 spatial scales (quadrat area, ≈12,300 to ≈1,225,000 km²). Topography, precipitation, topography × latitude, ecosystem diversity, and cloud cover emerged as the most important predictors of regional variability of species richness in regression models incorporating 16 independent variables, although ranking of variables depended on spatial scale. Direct measures of ambient energy such as mean and maximum temperature were of ancillary importance. Species richness values for 1° × 1° latitude-longitude quadrats in the Andes (peaking at 845 species) were ≈30–250% greater than those recorded at equivalent latitudes in the central Amazon basin. These findings reflect the extraordinary abundance of species associated with humid montane regions at equatorial latitudes and the importance of orography in avian speciation. In a broader context, our data reinforce the hypothesis that terrestrial species richness from the equator to the poles is ultimately governed by a synergism between climate and coarse-scale topographic heterogeneity.
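The kind of analysis described, ranking environmental predictors of richness by their standardized regression coefficients, can be sketched as follows; the predictors and data below are synthetic placeholders, not the study's 16-variable avian dataset.

    import numpy as np

    rng = np.random.default_rng(0)

    # Placeholder predictors per quadrat, standing in for topography,
    # precipitation, cloud cover, ecosystem diversity, etc.
    names = ["topography", "precipitation", "cloud_cover", "ecosystem_div"]
    X = rng.normal(size=(200, len(names)))
    richness = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 0.5 * X[:, 2] \
               + rng.normal(size=200)

    # Standardize so coefficients are comparable, then fit least squares
    Xs = (X - X.mean(0)) / X.std(0)
    y = (richness - richness.mean()) / richness.std()
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)

    # Rank predictors by absolute standardized coefficient
    for name, c in sorted(zip(names, coef), key=lambda t: -abs(t[1])):
        print(f"{name:15s} beta = {c:+.2f}")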
Abstract:
RNA viruses evolve rapidly. One source of this ability to change rapidly is the apparently high mutation frequency in RNA virus populations. A high mutation frequency is a central tenet of the quasispecies theory. A corollary of the quasispecies theory postulates that, given their high mutation frequency, animal RNA viruses may be susceptible to error catastrophe, where they undergo a sharp drop in viability after a modest increase in mutation frequency. We recently showed that the important broad-spectrum antiviral drug ribavirin (currently used to treat hepatitis C virus infections, among others) is an RNA virus mutagen, and we proposed that ribavirin exerts its antiviral effect by forcing RNA viruses into error catastrophe. However, a direct demonstration of error catastrophe has not been made for ribavirin or any other RNA virus mutagen. Here we describe a direct demonstration of error catastrophe using ribavirin as the mutagen and poliovirus as a model RNA virus. We demonstrate that ribavirin's antiviral activity is exerted directly through lethal mutagenesis of the viral genetic material. A 99.3% loss in viral genome infectivity is observed after a single round of virus infection at ribavirin concentrations sufficient to cause a 9.7-fold increase in mutagenesis. Compiling data on both the mutation levels and the specific infectivities of poliovirus genomes produced in the presence of ribavirin, we have constructed a graph of error catastrophe showing that normal poliovirus indeed exists at the edge of viability. These data suggest that RNA virus mutagens may represent a promising new class of antiviral drugs.
Abstract:
Increased productivity, improvements in product quality, and reduced costs and environmental impacts are essential to companies' competitiveness. The execution of the façade is part of the critical path of a construction project, since it is a subsystem that combines the functions of enclosure, finishing, lighting and ventilation and also incorporates building services; for this reason it also carries a high direct cost compared with the other subsystems of the building. The construction technology of façades made of thin boards on Light Steel Framing (LSF) structures is a viable alternative for increasing productivity and shortening construction schedules, with quality and performance, and can bring benefits compared with labour-intensive activities such as masonry enclosures and their renderings. The present work aims to systematize and analyse the knowledge concerning this façade construction technology. The method adopted comprises a literature review. As a contribution, the work gathers an organized body of information on the main systems available on the market, covering the characterization of the façade system, its layers and the light steel profiles, and the systematization of the main technical assessments of systems existing in other countries, bringing together technical standards for products and execution. It is believed that gathering and organizing this information, previously scattered across many references, has the potential to support practitioners in deciding on the proper use of the new technology.
Abstract:
This series contains one small leaf with handwritten calculations related to the number of volumes in the Harvard College Library. The verso has the note: "No. of Vol: in Harvard College 11465 vol. making one line 15380 miles long." The document is in the hand of Loammi Baldwin Sr., and may have been created in 1789 while the Library was compiling a catalog of its holdings.
Abstract:
This hardcover volume contains manuscript copies of Charles Morton's "A System of Ethicks," "Pneumatics. Or a treatise of the Rev'd Mr. Charles Morton about ye Nature of Spirit," "Appendix of the Souls of Brutes," "Some Theological Questions Answd," and a one-page list "Texts of Scripture to prove if ye head of Christ &c." copied by Harvard student Ebenezer Williams in February 1707/8.
Abstract:
Manuscript copy of Charles Morton’s Compendium Physicae prepared by copyist Robert Ward in 1714. The leather-bound volume includes text and drawings, and there is an index to the chapters of the book at the end of the volume. "Thomas Greaves's book Octob 1 Anno Salutis 1714" inscribed on flyleaf. Thomas Greaves may refer to the Charlestown physician and judge and member of the Harvard Class of 1703.
Abstract:
This leather-bound volume contains a manuscript copy of Charles Morton’s Compendium Physicae copied by Harvard student Obadiah Ayer in 1708. The volume has text and drawings (including one large foldout drawing), and there is an index to the chapters at the end of the volume. Mather Byles (Harvard Class of 1725) also used the book.