937 results for Carrying Loads
Abstract:
Proof-Carrying Code (PCC) is a general approach to mobile code safety in which the code supplier augments the program with a certificate (or proof). The intended benefit is that the program consumer can locally validate the certificate w.r.t. the "untrusted" program by means of a certificate checker—a process which should be much simpler, more efficient, and more automatic than generating the original proof. Abstraction-Carrying Code (ACC) is an enabling technology for PCC in which an abstract model of the program plays the role of certificate. The generation of the certificate, i.e., the abstraction, is automatically carried out by an abstract interpretation-based analysis engine, which is parametric w.r.t. different abstract domains. While the analyzer on the producer side typically has to compute a semantic fixpoint in a complex, iterative process, on the receiver side it is only necessary to check that the certificate is indeed a fixpoint of the abstract semantics equations representing the program. This is done in a single pass in a much more efficient process. ACC addresses the fundamental issues in PCC and opens the door to the applicability of the large body of frameworks and domains based on abstract interpretation as enabling technology for PCC. We present an overview of ACC and we describe in a tutorial fashion an application to the problem of resource-aware security in mobile code. Essentially, the information computed by a cost analyzer is used to generate cost certificates which attest to a safe and efficient use of the mobile code. A receiving side can then reject code which comes with cost certificates which it cannot validate or which have too large cost requirements in terms of computing resources (in time and/or space), and accept mobile code which meets the established requirements.
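To make the producer/consumer asymmetry concrete, here is a minimal sketch, assuming a toy abstract domain where each program point maps to a set of abstract values and the transfer function is a monotone set transformer (all names are illustrative, not taken from the ACC papers):

    def analyze(program_points, transfer):
        """Producer side: iterate the abstract semantics to a fixpoint."""
        state = {p: set() for p in program_points}   # bottom everywhere
        changed = True
        while changed:                               # costly iterative process
            changed = False
            for p in program_points:
                new = transfer(p, state)             # abstract transfer step
                if not new <= state[p]:              # <= is set inclusion here
                    state[p] |= new
                    changed = True
        return state                                 # this is the certificate

    def check(certificate, program_points, transfer):
        """Consumer side: a single pass verifying the fixpoint property."""
        return all(transfer(p, certificate) <= certificate[p]
                   for p in program_points)

A certificate produced by analyze is accepted by check without any iteration, which is the efficiency gain the abstract describes.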
Abstract:
Abstraction-Carrying Code (ACC) has recently been proposed as a framework for mobile code safety in which the code supplier provides a program together with an abstraction whose validity entails compliance with a predefined safety policy. The abstraction thus plays the role of safety certificate and its generation is carried out automatically by a fixed-point analyzer. The advantage of providing a (fixed-point) abstraction to the code consumer is that its validity is checked in a single pass of an abstract interpretation-based checker. A main challenge is to reduce the size of certificates as much as possible while at the same time not increasing checking time. In this paper, we first introduce the notion of reduced certificate, which characterizes the subset of the abstraction which a checker needs in order to validate (and re-construct) the full certificate in a single pass. Based on this notion, we then instrument a generic analysis algorithm with the necessary extensions in order to identify the information relevant to the checker.
Abstract:
Abstraction-Carrying Code (ACC) has recently been proposed as a framework for mobile code safety in which the code supplier provides a program together with an abstraction (or abstract model of the program) whose validity entails compliance with a predefined safety policy. The abstraction thus plays the role of safety certificate and its generation is carried out automatically by a fixed-point analyzer. The advantage of providing a (fixed-point) abstraction to the code consumer is that its validity is checked in a single pass (i.e., one iteration) of an abstract interpretation-based checker. A main challenge in making ACC useful in practice is to reduce the size of certificates as much as possible while at the same time not increasing checking time. The intuitive idea is to only include in the certificate information that the checker is unable to reproduce without iterating. We introduce the notion of reduced certificate, which characterizes the subset of the abstraction which a checker needs in order to validate (and re-construct) the full certificate in a single pass. Based on this notion, we instrument a generic analysis algorithm with the necessary extensions in order to identify information which can be reconstructed by the single-pass checker. Finally, we study the effects of reduced certificates on the correctness and completeness of the checking process. We provide a correct checking algorithm together with sufficient conditions for ensuring its completeness. Our ideas are illustrated through a running example, implemented in the context of constraint logic programs, which shows that our approach improves on state-of-the-art techniques for reducing the size of certificates.
Abstract:
Abstraction-Carrying Code (ACC) has recently been proposed as a framework for mobile code safety in which the code supplier provides a program together with an abstraction (or abstract model of the program) whose validity entails compliance with a predefined safety policy. The abstraction thus plays the role of safety certificate and its generation is carried out automatically by a fixpoint analyzer. The advantage of providing a (fixpoint) abstraction to the code consumer is that its validity is checked in a single pass (i.e., one iteration) of an abstract interpretation-based checker. A main challenge in making ACC useful in practice is to reduce the size of certificates as much as possible while at the same time not increasing checking time. The intuitive idea is to only include in the certificate information that the checker is unable to reproduce without iterating. We introduce the notion of reduced certificate, which characterizes the subset of the abstraction which a checker needs in order to validate (and re-construct) the full certificate in a single pass. Based on this notion, we instrument a generic analysis algorithm with the necessary extensions in order to identify the information relevant to the checker. Interestingly, the fact that the reduced certificate omits (parts of) the abstraction has implications in the design of the checker. We provide sufficient conditions which allow us to ensure that 1) if the checker succeeds in validating the certificate, then the certificate is valid for the program (correctness) and 2) the checker will succeed for any reduced certificate which is valid (completeness). Our approach has been implemented and benchmarked within the CiaoPP system. The experimental results show that our proposal is able to greatly reduce the size of certificates in practice. To appear in Theory and Practice of Logic Programming (TPLP).
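As a rough illustration of the reduced-certificate idea (a minimal sketch over the same kind of toy set-based domain as above, not the actual algorithm implemented in CiaoPP), the producer can omit every entry that a single application of the transfer function already reproduces, and the single-pass checker rebuilds the omitted entries as it goes:

    def reduce_certificate(certificate, program_points, transfer):
        # Keep only the entries the checker could not rebuild in one pass.
        return {p: certificate[p] for p in program_points
                if not transfer(p, certificate) >= certificate[p]}

    def check_reduced(reduced, program_points, transfer):
        # Assumes transfer treats absent entries as bottom (the empty set).
        state = dict(reduced)
        for p in program_points:              # one pass, no iteration
            value = transfer(p, state)
            if p in state:
                if not value <= state[p]:     # supplied entry not a fixpoint
                    return False
            else:
                state[p] = value              # re-construct the omitted entry
        return True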
Abstract:
In this paper, we propose a system for authenticating local bee pollen against fraudulent samples using image processing and classification techniques. Our system is based on the colour properties of bee pollen loads and the use of one-class classifiers to reject unknown pollen samples. The latter classification techniques allow us to tackle the major difficulty of the problem, the existence of many possible fraudulent pollen types. Also presented is a multi-classifier model with an ambiguity discovery process to fuse the output of the one-class classifiers. The method is validated by authenticating Spanish bee pollen types, with the overall accuracy of the final system being 94%. Therefore, the system is able to rapidly reject the non-local pollen samples with inexpensive hardware and without the need to send the product to the laboratory.
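As a hedged sketch of the one-class rejection idea (using scikit-learn's OneClassSVM; the colour features and parameters below are illustrative placeholders, not the paper's actual setup):

    import numpy as np
    from sklearn.svm import OneClassSVM

    # Rows: colour features (e.g. mean L*a*b* values) of individual pollen loads.
    rng = np.random.default_rng(0)
    local_pollen = rng.random((200, 3))   # training data: one known local type
    queries = rng.random((5, 3))          # incoming, possibly fraudulent loads

    # One classifier per known local pollen type (only one shown here);
    # nu bounds the fraction of training outliers the model tolerates.
    clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(local_pollen)

    # +1 = accepted as this local type, -1 = rejected as unknown/fraudulent.
    print(clf.predict(queries))

In the multi-classifier model the abstract mentions, each known type would contribute one such classifier, with the ambiguity-discovery step fusing their accept/reject outputs.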
Abstract:
The flexural vibration of a homogeneous isotropic linearly elastic cylinder of any aspect ratio is analysed in this paper. Natural frequencies of a cylinder under uniformly distributed axial loads acting on its bases are calculated numerically by the Ritz method with terms of power series in the coordinate directions as approximating functions. The effect of axial loads on the flexural vibration cannot be described by applying infinitesimal strain theory; therefore, geometrically nonlinear strain–displacement relations with second-order terms are considered here. The natural frequencies of free–free, clamped–clamped, and sliding–sliding cylinders subjected to axial loads are calculated using the proposed three-dimensional Ritz approach and are compared with those obtained with the finite element method and the Bernoulli–Euler theory. Different experiments with cylinders axially compressed by a hydraulic press are carried out and the experimental results for the lowest flexural frequency are compared with the numerical results. An approach based on the Ritz formulation is proposed for the flexural vibration of a cylinder between the platens of the press with constraints varying with the intensity of the compression. The results show that for low compressions the cylinder behaves similarly to a sliding–sliding cylinder, whereas for high compressions the cylinder vibrates as a clamped–clamped one.
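For readers unfamiliar with the procedure, the Ritz method reduces the vibration problem to a generalized eigenvalue problem; in generic form (a standard statement, leaving the paper's power-series basis and load-dependent terms abstract):

    \mathbf{u}(\mathbf{x}) \approx \sum_{i=1}^{n} c_i\,\boldsymbol{\phi}_i(\mathbf{x}),
    \qquad
    \left[\mathbf{K} + \mathbf{K}_G(N) - \omega^2\,\mathbf{M}\right]\mathbf{c} = \mathbf{0},

where \mathbf{K} is the elastic stiffness matrix, \mathbf{K}_G(N) collects the geometric stiffness terms induced by the axial load N through the second-order strain–displacement relations, \mathbf{M} is the mass matrix, and \omega a natural frequency.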
Abstract:
Tests used to simulate the separation of the lower stage of the Ariane Vehicle Equipment Bay (VEB) were carried out on a flat full-scale model. Theoretical studies carried out prior to testing are described. Three different mathematical methods, finite element, component element, and wave propagation, were used. Comparison of the predicted theoretical results with the actual test results is planned.
Abstract:
This paper presents the results of part of the research carried out by a committee in charge of the elaboration of the new Spanish Code of Actions in Railway Bridges. Following the work developed by the European Rail Research Institute (ERRI), the dynamic effects caused by the Spanish high-speed train TALGO have been studied and compared with those of other European trains. A simplified envelope of the impact coefficient is also presented. Finally, the train–bridge interaction has been analysed and the results compared with those obtained from simple models based on moving loads.
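For context, the simple moving-load models mentioned above are classically of the form (a textbook idealization of a single axle load P crossing a simply supported span of bending stiffness EI and mass per unit length m at speed V, not the committee's specific model):

    E I\,\frac{\partial^4 v(x,t)}{\partial x^4} + m\,\frac{\partial^2 v(x,t)}{\partial t^2} = P\,\delta(x - V t),

the impact coefficient then being the ratio of the maximum dynamic to the maximum static response.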
Abstract:
Telecommunications networks have always been expanding, and thanks to this, new services have appeared. The old mechanisms for carrying packets have become obsolete due to the new service requirements, which have begun working in real time. Real-time traffic requires strict service guarantees. When this traffic is sent through the network, enough resources must be provided in order to avoid delays and information losses. When browsing the Internet and requesting web pages, data must be sent from a server to the user. If any packet is dropped during the transmission, the packet is sent again. For the end user, it does not matter if the webpage takes one or two seconds longer to load. But if the user is holding a conversation with a VoIP program, such as Skype, one or two seconds of delay in the conversation may be catastrophic, and neither party can understand the other. In order to provide support for these new services, the networks have to evolve. For this purpose MPLS and QoS were developed. MPLS is a packet-carrying mechanism used in high-performance telecommunication networks which directs and carries data using pre-established paths. Packets are forwarded on the basis of labels, making this process faster than routing the packets with IP addresses. MPLS also supports Traffic Engineering (TE), which refers to the process of selecting the best paths for data traffic in order to balance the traffic load between the different links. In a network with multiple paths, routing algorithms calculate the shortest one, and most of the time all traffic is directed through it, causing overload and packet drops, while the other paths that the network offers carry no traffic at all. But this is not enough to give real-time traffic the guarantees it needs. In fact, those mechanisms improve the network, but they do not change how the traffic is treated. That is why Quality of Service (QoS) was developed. Quality of service is the ability to provide different priorities to different applications, users, or data flows, or to guarantee a certain level of performance to a data flow. Traffic is distributed into different classes and each of them is treated differently, according to its Service Level Agreement (SLA). Traffic with the highest priority will have preference over lower classes, but this does not mean it will monopolize all the resources. In order to achieve this goal, a set of policies is defined to control and alter how the traffic flows. The possibilities are endless and depend on how the network must be structured. By using those mechanisms it is possible to provide the necessary guarantees to real-time traffic, distributing it between categories inside the network and offering the best service for both real-time and non-real-time data.
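As a toy illustration of the class-based treatment described above (a minimal sketch of a strict-priority scheduler; the class names and priorities are made up, not from any MPLS/QoS standard):

    import heapq

    PRIORITY = {"voice": 0, "video": 1, "best_effort": 2}  # lower = served first

    queue, seq = [], 0

    def enqueue(packet, traffic_class):
        """Classify the packet and queue it under its class priority."""
        global seq
        heapq.heappush(queue, (PRIORITY[traffic_class], seq, packet))
        seq += 1                      # FIFO tie-break within the same class

    def dequeue():
        """Serve the highest-priority packet currently waiting."""
        return heapq.heappop(queue)[2] if queue else None

    enqueue("web page chunk", "best_effort")
    enqueue("VoIP frame", "voice")
    print(dequeue())                  # -> 'VoIP frame': real-time traffic first

A real deployment would add policing and bandwidth limits so the highest class cannot monopolize the link, matching the SLA behaviour the abstract describes.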
Abstract:
Many buildings of the built heritage have structures composed of timber elements. The economic volume involved in the maintenance and renewal of this built heritage is considerable; therefore, the study of the different reinforcement techniques applicable to this type of structure is of special interest. Timber structures have traditionally been reinforced either with steel or with pieces of the same material, increasing the section of the damaged parts.
The emergence of polymer composites reinforced with fibers, and their progressive use in construction, led to this material being applied as reinforcement in timber structures at the beginning of the 1990s (Sins Bridge, 1992). Wood is a natural material with an excellent ratio between its mechanical characteristics and its weight. This feature is maintained with the use of composites as reinforcement. In terms of its constitutive model, linear elastic behavior parallel to the fiber up to fracture is assumed under tensile stress, while under compression an initial linear elastic behavior, followed by a plastic branch, is considered. In sawn timber beams subjected to bending, the predominant failure mode is due to tensile stress, and the fracture is frequently located at the lower face of the beam. FRPs have a linear elastic behavior in tension until fracture and excellent mechanical properties in relation to their weight and volume. If the beam is reinforced at its lower face, its capacity to absorb tensile stresses will increase and, therefore, an increase in its load-carrying capacity, as well as in its ductility, can be expected. This work analyzes the benefits provided by different composite reinforcement systems, with the aim of contributing to the knowledge of this technique for recovering or increasing the strength properties of timber elements subjected to bending loads. The study is based on data obtained experimentally through bending tests of Scots pine timber beams reinforced with composite materials. The fibers in the fabrics used for the reinforcements are basalt and carbon. Basalt fiber composites are applied in different grammages, whereas carbon composites are used in unidirectional and bidirectional fabrics. The behavior of the beams is analyzed with respect to the reinforcement variables applied, and the results are compared with those of beams tested without reinforcement. Furthermore, the fit of the nonlinear calculation model applied to predict the failure load of each reinforced beam is verified. This work demonstrates the good performance of basalt-fiber FRP applied to the reinforcement of timber beams, and of bidirectional carbon fabrics with respect to unidirectional ones.
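The bilinear timber model described above can be written generically as (a standard idealization with tension positive; the paper's exact parameters are not reproduced here):

    \sigma(\varepsilon) =
    \begin{cases}
    E\,\varepsilon, & 0 \le \varepsilon \le \varepsilon_{t,u} & \text{(tension, linear elastic up to failure)}\\
    E\,\varepsilon, & \varepsilon_{c,y} \le \varepsilon < 0 & \text{(compression, initial elastic branch)}\\
    \sigma_{c,y}, & \varepsilon < \varepsilon_{c,y} & \text{(compression, plastic branch)}
    \end{cases}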
Abstract:
The objectives of this work are to revisit the experimental measurements on dam break flow over a dry horizontal bed and to provide a detailed insight into the dynamics of the dam break wave impacting a vertical wall downstream of the dam, with emphasis on the pressure loads. The measured data are statistically analyzed and critically discussed. As a result, an extensive set of data for validation of computational tools is provided.
Abstract:
Scientific missions constitute fundamental cornerstones of space agencies such as ESA and NASA. Modern astronomy could not be understood without the data provided by these missions. Scientists need to design onboard instruments very carefully. Payloads have to survive the crucial launch moment and later perform well in the really harsh space environment. It is very important that the instrument conceptual idea can be engineered to sustain all those loads.
Abstract:
During the last two decades the topic of human-induced vibration has attracted a lot of attention among civil engineering practitioners and academics alike. Usually this type of problem may be encountered in pedestrian footbridges or floors of paperless offices. Slender designs are becoming increasingly popular and, as a consequence, the importance of paying attention to vibration serviceability also increases. This paper summarizes the results obtained from measurements taken at different points of an aluminium catwalk which is 6 m in length by 0.6 m in width. Measurements were carried out when subjecting the structure to different actions: 1) static test: a steel cylinder of 35 kg was placed in the middle of the catwalk; 2) dynamic test: the structure was excited with single impulses; 3) dynamic test: people walking on the catwalk. Identification of the mechanical properties of the structure is an achievement of the paper. Indirect methods were used to estimate properties including the support stiffness, the beam bending stiffness, the mass of the structure (using the Rayleigh method and an iterative matrix method), the natural frequency (using time-domain and frequency-domain analysis) and the damping ratio (by calculating the logarithmic decrement). Experimental results and numerical predictions for the response of the aluminium catwalk subjected to walking loads have been compared. The damping of this lightweight structure depends on the amplitude of vibration, which complicates the tuning of a structural model. In the light of the results obtained, it seems that the walking load model used is not appropriate, as the predicted transient vibration values (TVVs) are much higher than the measured ones.
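The damping identification mentioned above relies on the logarithmic decrement; the standard textbook relations (not values from the paper) are

    \delta = \frac{1}{n}\,\ln\frac{x(t)}{x(t+nT)},
    \qquad
    \zeta = \frac{\delta}{\sqrt{4\pi^2 + \delta^2}},

where x(t) and x(t+nT) are peak amplitudes n cycles apart, T is the damped period, and \zeta the damping ratio.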
Discussion of “Initial Pore Pressure from Vertical Surface Loads” by Jacobo Bielak (September, 1982)
Abstract:
The author presents a very interesting application of the ideas developed by Scott to determine the initial pore pressure in excess of the hydrostatic pore pressure in a linear, elastic, homogeneous, and isotropic soil skeleton. Scott demonstrates that under vertical surface loads the problem is governed by Laplace's equation. Nevertheless, the writers think that it could be interesting to state clearly the conditions under which this analogy can be applied.
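For reference, the governing relation referred to is Laplace's equation for the excess pore pressure u_e (a generic statement of Scott's result; the boundary conditions depend on the applied surface load):

    \nabla^2 u_e = \frac{\partial^2 u_e}{\partial x^2} + \frac{\partial^2 u_e}{\partial y^2} + \frac{\partial^2 u_e}{\partial z^2} = 0.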
Abstract:
Different methods to reduce the high suction caused by conical vortices have been reported in the literature: vertical parapets, either solid or porous, placed at the roof edges being the most analysed configuration. Another method for alleviating the high suction peaks due to conical vortices is to round the roof edges. Very recently, the use of some non-standard parapet configurations, like cantilever parapets, has been suggested. In this paper, their efficiency in reducing suction loads on curved roofs is experimentally checked by testing the pressure distribution on the curved roof of a low-rise building model in a wind tunnel. Very high suction loads have been measured on this model, the magnitude of these high suction loads being significantly decreased when cantilever...
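Suction loads in such wind-tunnel tests are conventionally reported through the pressure coefficient (a standard definition, not taken from the truncated abstract):

    C_p = \frac{p - p_\infty}{\tfrac{1}{2}\,\rho\,U_\infty^2},

where p is the local roof pressure, p_\infty and U_\infty are the free-stream static pressure and velocity, and \rho the air density; strongly negative C_p values correspond to the high suction peaks measured under the conical vortices.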