14 results for Explanation of the reasoning
at Universidad Politécnica de Madrid
Abstract:
The debate on network neutrality has reached sufficient notoriety to eliminate the need for detailed explanation. A simple definition will suffice: "network neutrality" is understood as the principle by which the owners of broadband networks would not be allowed to establish any type of discrimination or preference over the traffic transmitted through them.
Abstract:
The early 18th-century treatise writer Tomás Vicente Tosca includes in his Tratado de la montea y cortes de cantería [On Masonry Design and Stone Cutting] what is an important documentary source on the lantern of Valencia Cathedral. Tosca presents this lantern as an example of vaulting over cross arches without the need of buttresses. A geometrical description is followed by an explanation of the structural behavior which manifests his deep understanding of the mechanics of masonry structures. He tries to demonstrate the absence of buttresses by supporting his thesis on the appropriate distribution of loads, which reduces the "empujos" [horizontal thrusts] to the point of requiring no more than the thickness of the walls to stand (Tosca [1727] 1992, 227-230). The present article assesses Tosca's appreciation by studying how loads, and the thrusts they generate, are transmitted through the different masonry elements that constitute this ciborium. To do so, we first present a geometrical analysis and make considerations regarding its materials and construction methods, and subsequently analyze its stability, adopting an equilibrium approach within the theoretical framework of lower-bound limit analysis.
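As context for the equilibrium approach named in the abstract, the lower-bound (safe) theorem and the classical thrust expression it builds on can be stated; these are standard results of masonry limit analysis, not formulas quoted from the article. The safe theorem asserts that if any system of internal forces in equilibrium with the loads can be found that lies everywhere within the masonry, the structure will not collapse. For a parabolic thrust line carrying a uniform load \(w\) per unit span over a span \(L\) with rise \(d\), the horizontal thrust is

\[
H = \frac{w L^{2}}{8 d},
\]

so a load distribution that increases the effective rise \(d\) lowers the thrust the walls must resist, which is precisely the effect Tosca attributes to the lantern's geometry.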
Abstract:
This paper describes the first five SEALS Evaluation Campaigns over the semantic technologies covered by the SEALS project (ontology engineering tools, ontology reasoning tools, ontology matching tools, semantic search tools, and semantic web service tools). It presents the evaluations and test data used in these campaigns and the tools that participated in them along with a comparative analysis of their results. It also presents some lessons learnt after the execution of the evaluation campaigns and draws some final conclusions.
Abstract:
Ciao is a logic-based, multi-paradigm programming system. One of its most distinguishing features is that it supports a large number of semantic and syntactic language features which can be selectively activated or deactivated for each program module. As a result, a module can be written in, for example, ISO-Prolog plus constraints and higher order, while another can be a pure logic module with a different control rule such as iterative deepening and/or tabling, and perhaps using constructive negation. A powerful and modular extension mechanism allows user-level design and implementation of such features and sub-languages. Another distinguishing feature of Ciao is its powerful assertion language, which allows expressing many kinds of program properties (ranging from, e.g., moded types to resource consumption), as well as tests and documentation. The compiler is capable of statically finding violations of these properties or verifying that programs comply with them, and issuing certificates of this compliance. The compiler also performs many types of optimizations, including automatic parallelization. It offers very competitive performance, while retaining the flexibility and interactive development of a dynamic language. We will present a hands-on overview of the system, through small examples which emphasize the novel aspects and the motivations which lie behind Ciao's design and implementation.
Abstract:
Our intention in this note is not to provide a listing of the many features of the Ciao system: this can be found in part, for example, in the brochures announcing upcoming versions, on the Ciao website, or in more feature-oriented descriptions such as. Instead, in this document we would like to describe the objectives and reasoning followed in our design, as well as the fundamental characteristics that in our opinion make Ciao quite unique and, hopefully, really useful to you as a Ciao user.
Abstract:
The engineer must have sufficient theoretical knowledge to apply to specific problems, together with the capacity to simplify these approaches, taking into account factors such as speed, simplicity, quality and economy. In Geology, the ultimate goal is to explore the history of geological events through observation, deduction and reasoning and, in exceptional cases, through direct underground exploration or experimentation. Experimentation is very limited in Geology: reproducing certain phenomena or geological processes in the laboratory is difficult because both time and space occur on a large scale. For this reason, some Earth Sciences remain at a nearly descriptive stage, whereas others closer to experimentation, such as Geophysics and Geochemistry, have assimilated the progress experienced by physics and chemistry. Thus, Anglo-Saxon countries clearly separate Engineering Geology from Geological Engineering, i.e., Geology applied to the concepts of Geological Engineering. Although there is a large professional overlap, the former corresponds to a scientific approach, while the latter corresponds to a technological one. Geology applied to engineering could be defined as the science and field discipline concerned with the design, construction and performance of engineering infrastructures. There has been much discussion on the primacy of theory over practice. Today an exaggerated emphasis on practice prevails, which produces good routine workers but mediocre teachers; this idea forgets that the teaching problem is one of the right balance. The approach of the action lines of the European Higher Education Area (EHEA) framework provides for such balance. The subject of Applied Geology represents the first real contact with the physical environment, with professional practice and with engineering works. Besides, the position of the subject in the first stage of the study plans links it, for many students, to other subjects and topics of the degree (tunnels, dams, groundwater, roads, etc.). This work analyses in depth the justification for such practical field trips, showing the criteria and methods used to plan them and the results they produce in the students. Once the field-trip programme had been developed, the objective of this work was to learn about its results and the changes in the students' motivation from a learning perspective, regardless of the outcome of their knowledge achievements, which were assessed separately and are not the subject of this work. For this objective, a survey about student motivation before and after the trip was designed by the Unidad Docente de Geología Aplicada of the Departamento de Ingeniería y Morfología del Terreno (Escuela Técnica Superior de Ingenieros de Caminos, Canales y Puertos, Universidad Politécnica de Madrid). It was completely anonymous, and its objective was to collect the opinion of the student as a key agent in the learning and teaching of the subject. All the work takes place under the new teaching/learning approach of the European framework for Higher Education. The results are exceptionally good, with 90% student participation and very high scores for a number of questions, such as the itineraries, the teachers and the places visited (4.2 to 4.5 on a 5-point scale). The majority of students are very satisfied (average of 4.5 on a 5-point scale).
Abstract:
In a crosswind scenario, the risk of high-speed trains overturning increases when they run on viaducts, since the aerodynamic loads are higher than on the ground. In order to increase safety, vehicles are sheltered by fences that are installed on the viaduct to reduce the loads experienced by the train. Windbreaks can be designed with different heights, and with or without eaves on the top. In this paper, a parametric study with a total of 12 fence designs was carried out using a two-dimensional model of a train standing on a viaduct. To assess the relative effectiveness of the sheltering devices, tests were done in a wind tunnel with a scaled model at a Reynolds number of 1 × 10⁵, and the train's aerodynamic coefficients were measured. Experimental results were compared with those predicted by Unsteady Reynolds-averaged Navier-Stokes (URANS) simulations of the flow, showing that the computational model is able to satisfactorily predict the trend of the aerodynamic coefficients. In a second set of tests, the Reynolds number was increased to 12 × 10⁶ (at a free-flow air velocity of 30 m/s) in order to simulate strong wind conditions. The aerodynamic coefficients showed a similar trend for both Reynolds numbers; however, their numerical values changed enough to indicate that simulations at the lower Reynolds number do not provide all the required information. Furthermore, the variation of the coefficients in the simulations made it possible to propose an explanation of how the fences modify the flow around the vehicle. This made it clear why increasing the fence height reduced all the coefficients, whereas adding an eave affected mainly the lift force coefficient. Finally, by analysing the time signals it was possible to clarify the influence of the Reynolds number on the peak-to-peak amplitude, the time period and the Strouhal number.
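For reference, the nondimensional quantities named above follow the standard definitions of vehicle aerodynamics; the reference area \(A\) and length \(L\) are conventions of each study and are assumptions here:

\[
C_L = \frac{F_L}{\tfrac{1}{2}\,\rho\, U^{2} A}, \qquad
Re = \frac{U L}{\nu}, \qquad
St = \frac{f L}{U},
\]

where \(F_L\) is the lift force, \(\rho\) the air density, \(U\) the free-stream velocity, \(\nu\) the kinematic viscosity and \(f\) the dominant frequency extracted from the time signals; the side force and rolling moment coefficients are formed analogously.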
Abstract:
This paper presents the knowledge model of a distributed decision support system that has been designed for the management of a national network in Ukraine. It shows how advanced Artificial Intelligence techniques (multiagent systems and knowledge modelling) have been applied to solve this real-world decision support problem: on the one hand, its distributed nature, implied by the different loci of decision-making at the network nodes, suggested applying a multiagent solution; on the other, the complexity of problem-solving for local network administration made it useful to apply knowledge modelling techniques in order to structure the different knowledge types and reasoning processes involved. The paper sets out with a description of our particular management problem. Subsequently, our agent model is described, pointing out the local problem-solving and coordination knowledge models. Finally, the dynamics of the approach are illustrated by an example.
Abstract:
This document is the result of a web development process to create a tool that will allow Cracow University of Technology to consult, create and manage timetables. The technologies chosen for this purpose are Apache Tomcat Server, MySQL Community Server, the JDBC driver, and Java Servlets and JSPs on the server side. The client side relies on Javascript, jQuery, AJAX and CSS to provide the dynamic behaviour. The document justifies the choice of these technologies and explains some development tools that helped in the integration and development of all these elements: specifically, NetBeans IDE and MySQL Workbench were used. After explaining all the elements involved in the development of the web application, the architecture and the code developed are explained through UML diagrams. Some implementation details related to security are explained in more depth through sequence diagrams. As the source code of the application is provided, an installation manual has been written to run the project. In addition, as the platform is intended to be a beta that will grow, some unimplemented ideas for future development are also presented. Finally, some annexes with important files and scripts related to the initialization of the platform are attached. This project started from an existing tool that needed to be expanded. The main purpose of the project throughout its development has been to set the roots for a whole new platform that will replace the existing one. To this end, a deep inspection of existing web technologies was needed: a web server and a SQL database had to be chosen. Although there were many alternatives, Java technology was finally selected for the server because of the large community behind it, the ease of modelling the language through UML diagrams and the fact that it is free-license software. Apache Tomcat is the open-source server that can use Java Servlet and JSP technology. As for the SQL database, MySQL Community Server is the most popular open-source SQL server, with a large community behind it and quite a few tools to manage the server. JDBC is the driver needed to connect Java and MySQL. Once we chose the technologies that would be part of the platform, the development process started. After a detailed explanation of the development environment installation, we used UML use case diagrams to set the main tasks of the platform; UML class diagrams served to establish the relations between the classes generated; the architecture of the platform was represented through UML deployment diagrams; and Enhanced Entity-Relationship (EER) models were used to define the tables of the database and their relationships. Apart from the previous diagrams, some implementation issues were explained to give a better understanding of the developed code; UML sequence diagrams helped to explain this. Once the whole platform was properly defined and developed, the performance of the application was demonstrated: it has been shown that, in the current state of the code, the platform covers the use cases that were set as the main target. Nevertheless, some prerequisites for the proper working of the platform have been specified. As the project is meant to grow, some ideas that could not be added to this beta have been described so that they are not lost for future development.
Finally, some annexes containing important configuration issues for the platform have been added after proper explanation, as well as an installation guide that will let a new developer get the project ready. In addition to this document, some other files related to the project are provided:
- Javadoc. The Javadoc containing the information on every Java class created, necessary for a better understanding of the source code.
- database_model.mwb. This file contains the model of the database for MySQL Workbench. This model allows, among other things, generating the MySQL script for the creation of the tables.
- ScheduleManager.war. The WAR file that allows loading the developed application into Tomcat Server without using NetBeans.
- ScheduleManager.zip. The source code exported from the NetBeans project, containing all Java packages, JSPs, Javascript files and CSS files that are part of the platform.
- config.properties. The configuration file to properly get the names and credentials to use the database, also explained in Annex II, Example of config.properties file.
- db_init_script.sql. The SQL query to initialize the database, explained in Annex III, SQL statements for MySQL initialization.
Abstract:
A validation of the burn-up simulation system EVOLCODE 2.0 is presented here, involving the experimental measurement of U and Pu isotopes and some fission fragment production ratios after a burn-up of around 30 GWd/tU in a Pressurized Light Water Reactor (PWR). This work provides an in-depth analysis of the validation results, including the possible sources of the uncertainties. An uncertainty analysis based on the sensitivity methodology has also been performed, providing the uncertainties in the isotopic content propagated from the cross section uncertainties. An improvement of the classical Sensitivity/Uncertainty (S/U) model has been developed to take into account the implicit dependence of the neutron flux normalization, that is, the effect of the constant power of the reactor. The improved S/U methodology, usually neglected in this kind of study, has proven to be an important contribution to the explanation of some simulation-experiment discrepancies. In general, for the most relevant actinides, the cross section uncertainties are an important contributor to the simulation uncertainties, of the same order of magnitude as, and sometimes even larger than, the experimental uncertainties and the experiment-simulation differences. Additionally, some hints for the improvement of the JEFF3.1.1 fission yield library and for the correction of some errata in the experimental data are presented.
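For reference, the classical S/U propagation that the article extends is the standard "sandwich rule"; the statement below is the generic textbook form, not the paper's improved formulation. For a response \(R\) (e.g., an isotopic concentration) depending on cross sections \(\sigma_i\) with covariance matrix \(C_{\sigma}\),

\[
\operatorname{var}(R) = S^{\mathsf{T}} C_{\sigma}\, S, \qquad S_i = \frac{\partial R}{\partial \sigma_i},
\]

and the improvement described above amounts to including in the sensitivities \(S_i\) the implicit variation of the flux normalization imposed by operating the reactor at constant power.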
Abstract:
The period between 1570 and 1620 has left a remarkable amount of documents related to shipbuilding in the Iberian Peninsula. Among them, the Instrucción náutica written by Diego García de Palacio in 1587 is widely recognized as the first published book that includes an extensive discussion of ship design and construction. García de Palacio centres his discussion on a 400 toneladas nao; a series of woodcuts illustrating the shape and dimensions of the ship accompanies the explanation. In the late 16th century, ship hulls were designed following procedures based upon an old shipwrightry tradition born in the Mediterranean: by simple rules, the master shipwright plots the central frame and tail frames and completes the hull body using wooden ribbands. Computer software for 3D modelling using NURBS surfaces helps to recreate ship hulls. In this work, the 400 toneladas nao is reconstructed and her hydrostatic parameters are compared with those of other ships.
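As an aside, the kind of hydrostatic parameter compared in the study can be illustrated with a short sketch; this is not the paper's NURBS-based workflow, and the station areas, spacing and density below are hypothetical numbers chosen only to make the example run.

# Minimal, illustrative sketch (not from the article): estimating the displaced
# volume of a hull from immersed cross-sectional areas at evenly spaced
# stations, using Simpson's rule. All numbers below are hypothetical.

import numpy as np

def displaced_volume(station_areas, spacing):
    """Integrate immersed section areas [m^2] over the station spacing [m]."""
    # Simpson's rule needs an odd number of stations (even number of intervals).
    if len(station_areas) % 2 == 0:
        raise ValueError("use an odd number of stations for Simpson's rule")
    weights = np.ones(len(station_areas))
    weights[1:-1:2] = 4.0   # odd interior stations
    weights[2:-1:2] = 2.0   # even interior stations
    return spacing / 3.0 * float(np.dot(weights, station_areas))

# Hypothetical immersed areas at 11 stations of a hull, 3 m apart.
areas = [0.0, 8.0, 16.0, 22.0, 25.0, 26.0, 25.0, 22.0, 16.0, 8.0, 0.0]  # m^2
volume = displaced_volume(areas, spacing=3.0)   # m^3
displacement_t = 1.025 * volume                 # seawater density ~1.025 t/m^3
print(f"volume = {volume:.1f} m^3, displacement = {displacement_t:.1f} t")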
Abstract:
Neuronal morphology is hugely variable across brain regions and species, and strategies for classifying neurons are a matter of intense debate in neuroscience. GABAergic cortical interneurons have been a challenge because it is difficult to find a set of morphological properties which clearly define neuronal types. A group of 48 neuroscience experts around the world was asked to classify a set of 320 cortical GABAergic interneurons according to the main features of their three-dimensional morphological reconstructions. A methodology for building a model which captures the opinions of all the experts was proposed. First, one Bayesian network was learned for each expert, and we proposed an algorithm for clustering Bayesian networks corresponding to experts with similar behaviors. Then, a Bayesian network which represents the opinions of each group of experts was induced. Finally, a consensus Bayesian multinet which models the opinions of the whole group of experts was built. A thorough analysis of the consensus model identified different behaviors between the experts when classifying the interneurons in the experiment. A set of characterizing morphological traits for the neuronal types was defined by performing inference in the Bayesian multinet. These findings were used to validate the model and to gain some insights into neuron morphology.
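The clustering and consensus steps can be illustrated with a deliberately simplified stand-in for the paper's method: represent each expert's learned network by its directed adjacency matrix, group experts whose structures are close in Hamming distance, and keep the majority edges of each group. The networks, the greedy single-link clustering and the distance threshold below are all hypothetical simplifications; the actual methodology learns full Bayesian networks and induces a consensus Bayesian multinet.

# Simplified illustration (not the paper's algorithm): cluster experts by the
# structural similarity of their networks and build a majority-edge consensus
# graph per cluster. Networks and threshold are hypothetical.

import numpy as np

def hamming(a, b):
    """Structural distance: number of differing directed edges."""
    return int(np.sum(a != b))

def cluster_experts(adj_matrices, threshold):
    """Greedy single-link clustering on structural distance."""
    clusters = []
    for idx, adj in enumerate(adj_matrices):
        for cluster in clusters:
            if any(hamming(adj, adj_matrices[j]) <= threshold for j in cluster):
                cluster.append(idx)
                break
        else:
            clusters.append([idx])
    return clusters

def consensus_graph(adj_matrices, members):
    """Keep the edges present in a strict majority of a cluster's networks."""
    stack = np.stack([adj_matrices[j] for j in members])
    return (stack.mean(axis=0) > 0.5).astype(int)

# Three hypothetical experts over 4 morphological variables (directed edges).
rng = np.random.default_rng(0)
experts = []
for _ in range(3):
    m = rng.integers(0, 2, size=(4, 4))
    np.fill_diagonal(m, 0)   # no self-loops in an adjacency matrix of a DAG
    experts.append(m)

for members in cluster_experts(experts, threshold=6):
    print("cluster:", members)
    print(consensus_graph(experts, members))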
Abstract:
The aim of this study is to explain the changes in real estate prices as well as in real estate stock market prices, using some macro-economic explanatory variables, such as the gross domestic product (GDP), the real interest rate (IR) and the unemployment rate (UE). Several regressions have been carried out in order to express several types of incremental and absolute deflated real estate stock market indexes in terms of the macro-economic variables. The analyses are applied to the Swedish economy. The period under study is 1984-1994. Monthly time series are used, i.e., the number of data points is 132. If time leads/lags are introduced in the regressions, significant improvements in the already high correlations are achieved. The signs of the coefficients for IR, UE and GDP are all what one would expect from an economic point of view: those for GDP are all positive, those for both IR and UE are negative. All the regressions have high R² values. Both markets anticipate changes in the unemployment rate by 6 to 9 months, which seems reasonable because such changes can be forecast quite reliably. But, on the contrary, there is no reason why they should anticipate by 3-6 months changes in the interest rate, which can hardly be reliably forecast so far in advance.
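The lead/lag search described above can be sketched as follows; the series are synthetic random walks, not the Swedish data, and the specification (an OLS fit per candidate lead, keeping the one with the highest R²) is a generic illustration rather than the study's exact regressions.

# Illustrative sketch (synthetic data, not the study's series): regress a
# deflated index on GDP, IR and UE, shifting the regressors forward by a lead
# of k months and keeping the lead that maximizes R^2.

import numpy as np

def r_squared(y, X):
    """OLS R^2 of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(1)
n = 132                                    # monthly data, 1984-1994
gdp = np.cumsum(rng.normal(0.2, 1.0, n))   # hypothetical random-walk series
ir = np.cumsum(rng.normal(0.0, 0.5, n))
ue = np.cumsum(rng.normal(0.0, 0.5, n))

# Build a synthetic index that anticipates the macro variables by 6 months:
# index[t] depends on the regressors at month t + 6.
true_lead = 6
X_full = np.column_stack([gdp, ir, ue])
signal = 1.5 * gdp - 1.0 * ir - 0.8 * ue   # +GDP, -IR, -UE, as in the study
index = signal[true_lead:] + rng.normal(0.0, 1.0, n - true_lead)

# Search leads k = 0..12: align index[t] with the regressors at t + k.
scores = {}
for k in range(13):
    m = min(len(index), n - k)             # overlapping sample size
    scores[k] = r_squared(index[:m], X_full[k:k + m])
best_k = max(scores, key=scores.get)
print(f"best R^2 = {scores[best_k]:.3f} at a lead of {best_k} months")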