860 results for "Vibration analysis techniques"


Relevance: 90.00%

Abstract:

Counter systems are a well-known and powerful modeling notation for specifying infinite-state systems. In this paper we target the problem of checking liveness properties in counter systems. We propose two semi-decision techniques for this problem, both of which return a formula that encodes the set of reachable states of the system that satisfy a given liveness property. One novel aspect of our techniques is that they use reachability analysis techniques, which are well studied in the literature, as black boxes, and are hence able to compute precise answers on a much wider class of systems than previous approaches to the same problem. Another is that they compute their results by iterative expansion or contraction, and hence permit an approximate solution to be obtained at any point. We state the formal properties of our techniques and provide experimental results on standard benchmarks to show the usefulness of our approaches. Finally, we sketch an extension of our liveness checking approach to check general CTL properties.
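To make the contraction idea concrete, here is a minimal finite-state sketch (my illustration, not the paper's algorithm, which works symbolically over infinite-state counter systems): the states satisfying the liveness property "p holds infinitely often" are computed as a greatest fixpoint that repeatedly shrinks a candidate set, invoking reachability only as a black box.

```python
def backward_reach(targets, region, succ):
    """Black-box reachability: the states of `region` that can reach
    `targets` along paths staying inside `region`."""
    reached = set(targets)
    while True:
        frontier = {s for s in region - reached
                    if any(t in reached for t in succ(s))}
        if not frontier:
            return reached
        reached |= frontier

def satisfying_gf(states, succ, p):
    """Greatest-fixpoint contraction for "infinitely often p" on a
    finite transition graph (succ maps a state to its successors)."""
    X = set(states)
    while True:
        # p-states inside X from which a run can continue inside X
        good = {s for s in X if p(s) and any(t in X for t in succ(s))}
        new_X = backward_reach(good, X, succ)
        if new_X == X:
            return X        # fixpoint reached
        X = new_X           # contract the candidate set and iterate
```

Because the candidate set only ever shrinks, stopping the loop early yields an over-approximation, mirroring the abstract's point that an approximate answer is available at any iteration.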

Relevance: 90.00%

Abstract:

The vibration analysis of an adhered S-shaped microbeam under an alternating sinusoidal voltage is presented. The shaking force is the electrical force due to the sinusoidal voltage. During vibration, both the microbeam deflection and the adhesion length keep changing; they are determined numerically by an iteration method. As the adhesion length changes, the domain of the equation of motion for the unadhered part of the microbeam changes correspondingly, which changes the structure's natural frequencies. For this reason, the system can never reach a steady state. The transient behaviors of the microbeam under different shaking frequencies are compared, and the initial conditions are deliberately chosen so that the dynamic results can be compared with the existing static theory. The paper also analyzes the changing behavior of the adhesion length during vibration, and an asymmetric pattern of adhesion length change is revealed, which may be used to guide the dynamic de-adhering process. The abnormal behavior of the adhered microbeam vibrating at almost the same frequency under two quite different shaking frequencies is also shown. The Galerkin method is used to discretize the equation of motion, and a convergence study for it is presented. The model is applicable only when the peel number equals 1; some other model limitations are also discussed.
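For concreteness, a plausible form of the governing equation is sketched below, assuming a standard Euler-Bernoulli beam with parallel-plate electrostatic forcing over the unadhered length L(t); the symbols and the exact force model are illustrative assumptions, not taken from the paper.

```latex
EI\,\frac{\partial^{4} w}{\partial x^{4}}
  + \rho A\,\frac{\partial^{2} w}{\partial t^{2}}
  = \frac{\varepsilon_{0}\, b\, V_{0}^{2}\sin^{2}(\Omega t)}{2\,\bigl[g - w(x,t)\bigr]^{2}},
\qquad 0 \le x \le L(t),
\qquad
w(x,t) \approx \sum_{i=1}^{N} q_{i}(t)\,\phi_{i}(x).
```

The moving boundary L(t) is what keeps the natural frequencies of the unadhered part drifting, so no steady state exists; the Galerkin expansion on the right reduces the PDE to ordinary differential equations in the modal coordinates q_i(t).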

Relevance: 90.00%

Abstract:

STEEL, the Caltech-created nonlinear large-displacement analysis software, is currently used by a large number of researchers at Caltech. However, due to its complexity and its lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models with this software was difficult. SteelConverter was created to facilitate model creation through the industry-standard finite element solver ETABS. This software allows users to create models in ETABS and intelligently convert model information such as geometry, loading, releases, and fixity into a format that STEEL understands. Models that would take several days to create and verify now take several hours or less. The productivity of the researcher, as well as the level of confidence in the model being analyzed, is greatly increased.

It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL, it was difficult for researchers or engineers from other universities to conduct analyses. While SteelConverter did help researchers at Caltech improve their research, sending SteelConverter and its documentation to other universities was less than ideal: issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferable. This is where the idea for Caltech VirtualShaker was born. Through the creation of a centralized website where users can log in, submit, analyze, and process models in the cloud, all of the major concerns associated with the use of SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles in which defaults associated with their most commonly run models are saved, and to submit multiple jobs to an online virtual server to be analyzed and post-processed. The creation of this website not only allowed more rapid distribution of the tool, but also gave engineers and researchers with no access to powerful computer clusters a means to run computationally intensive analyses without the excessive cost of building and maintaining a cluster.

In order to increase confidence in the use of STEEL as an analysis system, as well as to verify the conversion tools, a series of comparisons was made between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties, such as elastic stiffness and damping through a free vibration analysis, as well as more complex structural properties, such as overall structural capacity through a pushover analysis. These analyses showed very strong agreement between the two programs on every aspect of each analysis. However, they also showed the ability of the STEEL analysis algorithm to converge at significantly larger drifts than ETABS when using the more computationally expensive and structurally realistic fiber hinges. Following the ETABS analysis, it was decided to repeat the comparisons in a program more capable of conducting highly nonlinear analysis, called Perform. These analyses again showed very strong agreement between the two programs in every aspect of each analysis through instability. However, due to some limitations in Perform, free vibration analyses could not be conducted for the three-story one-bay chevron-braced frame, the two-bay chevron-braced frame, or the twenty-story moment frame. With the current trend toward ultimate capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building's behavior under these extreme load scenarios.
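For reference, the standard textbook reduction from a free vibration record to the quantities compared here, assuming a lightly damped single-mode response (generic practice, not a procedure quoted from the thesis): with u_0 and u_n successive displacement peaks n cycles apart,

```latex
\delta = \frac{1}{n}\,\ln\frac{u_{0}}{u_{n}},
\qquad
\zeta = \frac{\delta}{\sqrt{4\pi^{2} + \delta^{2}}},
\qquad
\omega_{n} = \frac{\omega_{d}}{\sqrt{1-\zeta^{2}}},
\qquad
k = m\,\omega_{n}^{2},
```

where ω_d is the damped frequency read off the decay, ζ the damping ratio, and k the elastic stiffness for modal mass m.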

Following this, a final study was conducted on Hall's U20 structure [1], in which the structure was analyzed in all three programs and the results compared. The pushover curves from each program were compared and the differences caused by variations in software implementation were explained. From this, conclusions can be drawn on the effectiveness of each analysis tool when attempting to analyze structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, the analysis tool failed to converge following the onset of inelastic behavior. However, for the small number of time steps over which the ETABS analysis was converging, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate analysis package for analyzing a structure through the point of collapse when using fiber elements throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in the material model resulted in a pushover curve that did not exactly match that of STEEL, particularly post-collapse. Such problems could, however, be alleviated by choosing a simpler material model.

Relevance: 90.00%

Abstract:

The concept of image, in its different aspects, is very important in today's society as well as in the field of business management. Some authors report that most studies that measure image take into account neither prior theoretical and conceptual models nor other possible empirical alternatives. Given this need, a study of the concept of brand image applied to shopping malls was conducted, based on the conceptual model of the consumer cognitive response, in order to explore and contrast it empirically. To this end, a survey was administered to 420 consumers in five shopping malls in Bogotá, yielding a database of 3,749 cases. The results, obtained by applying lexicometric and multivariate analysis techniques, show attribute-shopping mall associations expressed in unique, differentiated, and salient vocabulary, such as "spacious", "good location", "good variety of stores", and "movie theaters". Finally, this research aims to improve the management of shopping malls and increase their attractiveness and customer loyalty through the development of service quality systems, integral communication, segmentation, and positioning.

Relevance: 90.00%

Abstract:

This work aims to assess the dynamic behaviour of composite (steel-concrete) floors under loads induced by rhythmic human activities, specifically aerobic gymnastics, from the standpoint of human comfort. Such an assessment has become necessary because of growing structural problems associated with excessive vibrations, which stem from the design of structural systems with low damping levels and ever-lower natural frequencies that lie very close to the frequency ranges of the excitations produced by rhythmic human activities. The structural model investigated is based on a composite (steel-concrete) floor subjected to aerobics classes. The numerical modelling of the floor was carried out in the ANSYS program, using discretization techniques based on the finite element method (FEM). The loads applied to the floor by the aerobic activities are simulated through two distinct dynamic loading models. An extensive parametric analysis was developed on the structural model, and the dynamic response of the system was obtained in terms of displacements and accelerations and compared with the limits recommended by design codes and criteria. The dynamic response of the floor violates the design criteria for human comfort and indicates excessive vibration levels for the dynamic loading cases analysed in this dissertation.
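The abstract does not reproduce the two dynamic loading models; a common form for rhythmic crowd loads, given here purely as an illustration (the coefficients and notation are generic, not the dissertation's), represents the force as the participants' weight plus Fourier harmonics of the activity rate f_p:

```latex
F(t) = P\left[\,1 + \sum_{i=1}^{n} \alpha_{i}\cos\bigl(2\pi\, i\, f_{p}\, t + \phi_{i}\bigr)\right],
```

where P is the static weight of the participants, α_i the dynamic coefficients of the harmonics, and φ_i their phase lags. Resonance problems arise precisely because a floor's natural frequency can fall on f_p or one of its first few multiples.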

Relevance: 90.00%

Abstract:

The chorioamnion is the membrane that surrounds the fetus during gestation. Normally, it must remain intact for the duration of pregnancy, 37-42 weeks, and rupture only during or just before labour and delivery of the fetus. In a significant number (3%) of all births this does not happen, and the membranes rupture before term, resulting in preterm birth and significant perinatal morbidity. The material properties of the chorioamnion are known to play a major role in mechanical rupture, and a number of studies have been undertaken to characterise the physical nature of the chorioamnion and examine factors that may predispose it to rupture. However, the existing literature is inconsistent in its choice of both physical testing methods and data analysis techniques, motivating the current review. Experimental data from a large number of chorioamnion mechanical studies were collated and converted to standard engineering quantities. The failure strength of the chorioamnion membrane was consistently found to be approximately 0.9 MPa. It is hoped that past and future studies of membrane mechanics can provide insight into the role of the chorioamnion in labour and delivery. In addition, biomechanical approaches can help elucidate the potential causes of early rupture and suggest future protocols or treatments that could both diagnose and prevent its occurrence. © 2009 Elsevier Ireland Ltd.
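Collating the literature required converting heterogeneous test data to standard engineering quantities; for a uniaxial strip test, for example, the failure stress follows from the peak force and the specimen cross-section (a generic conversion, not a formula quoted from the review):

```latex
\sigma_{f} = \frac{F_{\max}}{w\,t} \approx 0.9\ \mathrm{MPa},
```

where w is the strip width and t the membrane thickness.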

Relevance: 90.00%

Abstract:

The anatomical and morphometric (shape indices, contour descriptors, and otolith weight) characterization of sagittal otoliths was investigated in 13 species of Lutjanus spp. inhabiting the Persian Gulf. This is the first study to compare the efficiency of three different image analysis techniques for discriminating species based on the shape of the outer otolith contour: elliptical Fourier descriptors (EFD), the fast Fourier transform (FFT), and the wavelet transform (WT). Sagittal otoliths of snappers are morphologically similar, with small species-specific variations. The otolith contour based on wavelets (WT) provided the best results in comparison with the two other methods based on Fourier descriptors, but only the combination of all three methods (EFD, FFT, and WT) yielded a robust classification of species. Species prediction improved when otolith weight was included. Among the shape indices, only the aspect ratio provided a clear grouping of species. A further study was carried out to test the applicability of shape analysis by comparing the otolith contours of Lutjanus johnii from the Persian Gulf and the Sea of Oman to identify potential stocks. The results showed that the otoliths differ in contour shape and can be attributed to two different stocks.
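As an illustration of the FFT branch of the comparison (a minimal sketch; the study's actual EFD and wavelet pipelines are not reproduced here), contour descriptors can be computed from the complex-valued otolith boundary:

```python
import numpy as np

def fft_contour_descriptors(contour_xy, n_coeffs=20):
    """Shape descriptors from a closed contour (N x 2 array of boundary
    points), invariant to translation, scale, rotation and start point."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]  # boundary as complex signal
    z = z - z.mean()                              # remove translation
    spectrum = np.fft.fft(z)
    mags = np.abs(spectrum[1:n_coeffs + 1])       # magnitude drops rotation/start point
    return mags / mags[0]                         # normalise out scale
```

Descriptor vectors of this kind can then be fed to any multivariate classifier to discriminate species or stocks.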

Relevance: 90.00%

Abstract:

The dynamical behaviour of the sidewall has an important influence on tyre vibration characteristics. Nonetheless, it remains crudely represented in many existing models. The current work considers a geometrically accurate, two-dimensional sidewall description, with a view to identifying potential shortcomings in the approximate formulations and the physical characteristics that must be accounted for. First, the mean stress state under pressurisation and centrifugal loading is investigated. Finite-element calculations show that, while the loaded sidewall shape remains close to a toroid, its in-plane tensions differ appreciably from the associated analytical solution. This is largely due to the inability of the anisotropic sidewall material to sustain significant azimuthal stress. An approximate analysis, based on the meridional tension alone, is therefore developed and shown to yield accurate predictions. In conjunction with a set of formulae for the 'engineering constants' of the sidewall material, the approximate solutions provide a straightforward and efficient means of determining the base state for the vibration analysis. The latter is implemented via a 'waveguide' discretisation of a variational formulation. Its results show that, while the full geometrical description is necessary for a complete and reliable characterisation of the sidewall's vibrational properties, a one-dimensional approximation will often be satisfactory in practice. Meridional thickness variations only become important at higher frequencies (above 500 Hz for the example considered here), and rotational inertia effects appear to be minor at practical vehicle speeds. © 2013 Elsevier Ltd. All rights reserved.
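The associated analytical solution is presumably the classical membrane solution for a pressurised torus; from standard shell theory (stated here from textbook results, not from the paper), for tube radius a, centreline radius R, and inflation pressure p, the stress resultants are

```latex
N_{\theta} = \frac{p\,a}{2}\,\frac{2R + a\sin\theta}{R + a\sin\theta},
\qquad
N_{\phi} = \frac{p\,a}{2},
```

with θ the meridional angle around the tube section. The FE results show the real sidewall departing from this because its anisotropic construction cannot sustain significant azimuthal stress, which motivates the paper's meridional-tension-only approximation.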

Relevance: 90.00%

Abstract:

An aim of proactive risk management strategies is the timely identification of safety-related risks. One way to achieve this is by deploying early warning systems, which aim to provide timely, useful information on the presence of potential threats to the system, on the system's level of vulnerability, or on both. This information can then be used to take proactive safety measures. The United Nations has recommended that any early warning system have four essential elements: the risk knowledge element, a monitoring and warning service, dissemination and communication, and a response capability. This research deals with the risk knowledge element of an early warning system, which contains models of possible accident scenarios. These accident scenarios are created using hazard analysis techniques, which can be categorised as traditional or contemporary. Traditional hazard analysis techniques assume that accidents occur due to a sequence of events, whereas contemporary techniques assume that safety is an emergent property of complex systems. The problem is that no software editor is available that lets analysts create models of accident scenarios based on contemporary hazard analysis techniques and, at the same time, generate computer code representing those models. This research aims to enhance the process of generating computer code from graphical models that associate early warning signs and causal factors with a hazard, based on contemporary hazard analysis techniques. For this purpose, the thesis investigates the use of Domain Specific Modeling (DSM) technologies. The contribution of this thesis is the design and development of a set of three graphical Domain Specific Modeling Languages (DSMLs) that, when combined, provide all of the constructs necessary for safety experts and practitioners to conduct hazard and early warning analysis based on a contemporary hazard analysis approach. The languages represent the elements and relations necessary to define accident scenarios and their associated early warning signs. The three DSMLs were incorporated into a prototype software editor that enables safety scientists and practitioners to create and edit hazard and early warning analysis models in a usable manner and, as a result, to generate executable code automatically. This research demonstrates that DSM technologies can be used to develop a set of three DSMLs that allow users to conduct hazard and early warning analysis in a more usable manner. Furthermore, the three DSMLs and their dedicated editor, presented in this thesis, may significantly enhance the process of creating the risk knowledge element of computer-based early warning systems.
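To make the model-to-code idea concrete, here is a deliberately simplified, hypothetical sketch (the metamodel names and the generated code are invented for illustration; they are not the thesis' three DSMLs) of how a hazard model linking causal factors and early warning signs can drive code generation:

```python
from dataclasses import dataclass, field

@dataclass
class EarlyWarningSign:
    name: str
    threshold: float              # monitored value that triggers the warning

@dataclass
class CausalFactor:
    description: str
    signs: list = field(default_factory=list)

@dataclass
class Hazard:
    name: str
    factors: list = field(default_factory=list)

def generate_monitor(hazard):
    """Toy code generator: emit one runtime check per early warning sign
    (the generated code assumes an alert() callback exists at runtime)."""
    lines = [f"def monitor_{hazard.name}(readings):"]
    for factor in hazard.factors:
        for sign in factor.signs:
            lines.append(f"    if readings[{sign.name!r}] > {sign.threshold}:")
            lines.append(f"        alert({sign.name!r})  # {factor.description}")
    return "\n".join(lines)
```

Printing generate_monitor(...) for a populated Hazard emits executable monitoring code directly from the model, which is the step the thesis automates for its contemporary hazard analysis approach.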

Relevance: 90.00%

Abstract:

It is estimated that the quantity of digital data being transferred, processed, or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids, and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked here is: can a generic solution be identified for the monitoring and analysis of data that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses, and processes data in a manner free of the constraints of any particular analysis technique, while at the same time facilitating those techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation, and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To realise and illustrate this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example was chosen: the near-real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU). A system was engineered to gather raw data, analyse it using different analysis techniques, uncover information, incorporate that information into the system, and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need for a solution in this domain.
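One way to picture the provenance record (an assumed structure, invented for illustration; it is not the platform's actual API) is a step wrapper that hashes the inputs and outputs of every analysis step, so an independent third party can re-run a step and verify that the derivation still holds:

```python
import hashlib
import json
import time

def _digest(obj):
    """Stable content hash of a JSON-serialisable object."""
    return hashlib.sha256(
        json.dumps(obj, sort_keys=True, default=str).encode()
    ).hexdigest()

def run_step(name, func, inputs, provenance):
    """Apply one analysis step and append a verifiable provenance record."""
    result = func(inputs)
    provenance.append({
        "step": name,
        "time": time.time(),
        "input_hash": _digest(inputs),
        "output_hash": _digest(result),
    })
    return result
```

A workflow is then a chain of run_step calls, and the provenance list documents the production, interpretation, and consumption of the data independently of whichever analysis technique produced each result.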

Relevance: 90.00%

Abstract:

The requirement for a very accurate dependence analysis to underpin software tools that aid the generation of efficient parallel implementations of scalar code is argued. The current status of dependence analysis is shown to be inadequate for the generation of efficient parallel code, causing too many conservative assumptions to be made. This paper summarises the limitations of conventional dependence analysis techniques, and then describes a series of extensions that enable the production of a much more accurate dependence graph. The extensions include analysis of symbolic variables; the development of a symbolic inequality disproof algorithm and its exploitation in a symbolic Banerjee inequality test; the use of inference engine proofs; the exploitation of exact dependence and dependence pre-domination attributes; interprocedural array analysis; conditional variable definition tracing; and integer array tracing and division calculations. Case studies on typical numerical code show that the total number of dependencies estimated by conventional analysis is reduced by up to 50%. The techniques described in this paper have been embedded within a suite of tools, CAPTools, which combines analysis with user knowledge to produce efficient parallel implementations of numerical mesh-based codes.
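The numeric core of the Banerjee inequality is sketched minimally below for a single-index dependence equation; the paper's contribution is a symbolic version able to handle symbolic coefficients and loop bounds, which this toy version does not attempt.

```python
def banerjee_may_depend(a, b, c, lo, hi):
    """Banerjee inequality test for the dependence equation
        a*i - b*j = c,   lo <= i, j <= hi.
    Returns False only when dependence is provably impossible."""
    pos = lambda x: max(x, 0)
    neg = lambda x: max(-x, 0)
    # Extreme values of a*i - b*j over the iteration space
    lower = (pos(a) * lo - neg(a) * hi) - (pos(b) * hi - neg(b) * lo)
    upper = (pos(a) * hi - neg(a) * lo) - (pos(b) * lo - neg(b) * hi)
    return lower <= c <= upper

# e.g. A(i) written vs A(i+200) read in a loop over 1..100:
# 1*i - 1*j = 200 is infeasible, so the dependence is disproved.
assert banerjee_may_depend(1, 1, 200, 1, 100) is False
```

When the equation is feasible the test conservatively answers True, which is exactly the kind of conservative assumption the paper's symbolic extensions aim to discharge more often.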

Relevance: 90.00%

Abstract:

In this paper, a method for the integration of several numerical analytical techniques used in microsystems design and failure analysis is presented. The analytical techniques are categorized into four groups: high-fidelity analytical tools, i.e. the finite element (FE) method; fast analytical tools, referring to reduced order modeling (ROM); optimization tools; and probability-based analytical tools. The characteristics of these four tools are investigated, the interactions between them are discussed, and a methodology for coupling them is offered. This methodology consists of three stages: reduced order modeling, deterministic optimization, and probabilistic optimization. Using this methodology, a case study on the optimization of a solder joint is conducted. It is shown that these analysis techniques interact with and complement one another, and that their combined application can fully exploit the advantages of each technique and satisfy various design requirements. The case study shows that the coupling method provided by this paper is effective and efficient, and that it is highly relevant to the design and reliability analysis of microsystems.
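A compact sketch of the three-stage coupling follows; all names and the toy objective are invented for illustration, standing in for the paper's FE analysis of a solder joint.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def fe_solver(x):
    """Stand-in for an expensive high-fidelity FE evaluation."""
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2

# Stage 1 - reduced order modeling: fit a quadratic response surface
# to a small number of FE runs.
X = rng.uniform(-2.0, 2.0, size=(30, 2))
y = np.array([fe_solver(x) for x in X])
A = np.column_stack([np.ones(len(X)), X, X**2, X[:, :1] * X[:, 1:]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def rom(x):
    return coef @ np.array([1.0, x[0], x[1], x[0]**2, x[1]**2, x[0]*x[1]])

# Stage 2 - deterministic optimization on the cheap surrogate.
opt = minimize(rom, x0=np.zeros(2))

# Stage 3 - probabilistic step: estimate failure probability under
# scatter (e.g. manufacturing tolerances) around the optimum.
samples = opt.x + rng.normal(0.0, 0.1, size=(10_000, 2))
p_fail = np.mean([fe_solver(s) > 0.5 for s in samples])
print(f"optimum {opt.x}, estimated P(failure) = {p_fail:.3f}")
```

The ROM makes the optimizer affordable, while the probabilistic stage reuses the high-fidelity model only where it matters, illustrating the mutual complementation the paper describes.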

Relevance: 90.00%

Abstract:

OBJECTIVE - The aim of the study was to investigate whether children born to older mothers have an increased risk of type 1 diabetes, by performing a pooled analysis of previous studies using individual patient data to adjust for recognized confounders.
RESEARCH DESIGN AND METHODS - Relevant studies published before June 2009 were identified from MEDLINE, Web of Science, and EMBASE. Authors of studies were contacted and asked to provide individual patient data or conduct prespecified analyses. Risk estimates of type 1 diabetes by maternal age were calculated for each study, before and after adjustment for potential confounders. Meta-analysis techniques were used to derive combined odds ratios and to investigate heterogeneity among studies.
RESULTS - Data were available for 5 cohort and 25 case-control studies, including 14,724 cases of type 1 diabetes. Overall, there was, on average, a 5% (95% CI 2-9) increase in childhood type 1 diabetes odds per 5-year increase in maternal age (P = 0.006), but there was heterogeneity among studies (I² = 70%). In studies with a low risk of bias, there was a more marked increase in diabetes odds of 10% per 5-year increase in maternal age. Adjustment for potential confounders did little to alter these estimates.
CONCLUSIONS - There was evidence of a weak but significant linear increase in the risk of childhood type 1 diabetes across the range of maternal ages, but the magnitude of the association varied between studies. Only a very small percentage of the increase in the incidence of childhood type 1 diabetes in recent years could be explained by increases in maternal age.
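As a minimal illustration of the pooling step (a generic fixed-effect sketch with made-up numbers; the paper's analysis additionally adjusts for confounders and uses individual patient data):

```python
import numpy as np

def pooled_or(log_ors, ses):
    """Inverse-variance pooling of per-study log odds ratios, with
    Cochran's Q and the I^2 heterogeneity statistic."""
    log_ors, ses = np.asarray(log_ors, float), np.asarray(ses, float)
    w = 1.0 / ses**2
    pooled = np.sum(w * log_ors) / np.sum(w)
    q = np.sum(w * (log_ors - pooled) ** 2)   # Cochran's Q
    df = len(log_ors) - 1
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return np.exp(pooled), i2

# three hypothetical studies, OR per 5-year increase in maternal age
or_hat, i2 = pooled_or(np.log([1.04, 1.08, 1.02]), [0.02, 0.03, 0.04])
```

An I² of 70%, as reported above, indicates that most of the between-study variation reflects genuine heterogeneity rather than sampling error, which is why the combined 5% estimate is best read as an average across studies.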