926 results for METHOD OF ANALYSIS
Abstract:
This article presents a new and computationally efficient method of analysis of a railway track modelled as a continuous beam of 2N spans supported by elastic vertical springs. The main feature of this method is the important reduction in computational effort it achieves with respect to standard matrix methods of structural analysis. In this article, the whole structure is considered to be a repetition of a single unit. The analysis presented is applied to a simple railway track model, i.e. a repetitive beam supported on vertical springs (the sleepers). The proposed method of analysis is based on the general theory of spatially periodic structures. The main feature of this theory is the possibility of applying the Discrete Fourier Transform (DFT) in order to reduce a large system of q(2N + 1) linear stiffness equilibrium equations to a set of 2N + 1 uncoupled systems of q equations each. In this way, a dramatic reduction of the computational effort of solving the large system of equations is achieved. This fact is particularly important in the analysis of railway track structures, in which N is very large (several thousand) and q, the number of degrees of freedom per node (here q = 2: vertical displacement and rotation), is very small. The proposed method easily recovers the exact solution given by Samartín [1], i.e. the continuous-beam railway track response. The comparison between the proposed method and other methods of analysis of railway tracks, such as those of Lorente de Nó and Zimmermann-Timoshenko, clearly shows the accuracy of the results obtained with the proposed method, even for low values of N. In addition, identical results have been found between the proposed method and the Lorente method, although the proposed method appears to be simpler to apply and computationally more efficient than the Lorente one. Small but significant differences occur between these two methods and the one developed by Zimmermann-Timoshenko.
This article also presents a detailed sensitivity analysis of the vertical displacement of the sleepers. Although standard matrix methods of structural analysis can handle this railway model, one of the objectives of this article is to show the efficiency of the DFT method with respect to standard matrix structural analysis. A comparative analysis between standard matrix structural analysis and the proposed DFT method, in terms of computational time, input, output and software programming, is carried out. Finally, a URL link to a MATLAB program listing, based on the proposed method, is given.
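The DFT reduction described in this abstract exploits the fact that the stiffness matrix of a spatially periodic structure is block-circulant. The following is only an illustrative NumPy sketch of that general idea (not the authors' MATLAB program): the DFT block-diagonalizes a block-circulant system, so one small q x q solve per Fourier harmonic replaces one large (M*q) x (M*q) solve.

```python
import numpy as np

def solve_block_circulant(blocks, b):
    """Solve A x = b where A is block-circulant with M blocks of size
    q x q: blocks[j] is the j-th block of the first block column,
    i.e. the (i, j) block of A equals blocks[(i - j) mod M].

    The DFT block-diagonalizes A, so each Fourier harmonic is solved
    independently with a small q x q system.
    """
    M = len(blocks)
    q = blocks[0].shape[0]
    # DFT along the circulant axis: A_hat[k] is the q x q stiffness
    # matrix seen by the k-th Fourier harmonic.
    A_hat = np.fft.fft(np.stack(blocks, axis=0), axis=0)
    b_hat = np.fft.fft(b.reshape(M, q), axis=0)
    # 2N + 1 (here: M) uncoupled small systems instead of one big one.
    x_hat = np.stack([np.linalg.solve(A_hat[k], b_hat[k]) for k in range(M)])
    return np.real(np.fft.ifft(x_hat, axis=0)).reshape(-1)
```

For a track model with M = 2N + 1 nodes and q = 2 degrees of freedom per node (vertical displacement and rotation), this reduces the cost from O((Mq)^3) to O(M log M + M q^3).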
Abstract:
In this paper a consistent analysis of reinforced concrete (RC) two-dimensional (2-D) structures, namely slab structures subjected to in-plane and out-of-plane forces, is presented. By using this method of analysis, the well-established methodology for dimensioning and verifying RC sections of beam structures is extended to 2-D structures. The validity of the results of the proposed analysis is checked by comparing them with published experimental test results. Several examples illustrate features of the proposed analysis, such as the influence of the reinforcement layout on the service and ultimate behavior of a slab structure and the non-straightforward problem of optimal dimensioning at a slab point subjected to several loading cases. In these examples, applications of the method to design situations such as multiple steel families and non-orthogonal reinforcement layouts are also discussed.
Abstract:
In the present work, a three-dimensional (3D) formulation based on the method of fundamental solutions (MFS) is applied to the study of acoustic horns. The implemented model follows and extends previous works that only considered two-dimensional and axisymmetric horn configurations. The more realistic case of 3D acoustic horns with symmetry regarding two orthogonal planes is addressed. The use of the domain decomposition technique with two interconnected sub-regions along a continuity boundary is proposed, allowing for the computation of the sound pressure generated by an acoustic horn installed on a rigid screen. In order to reduce the model discretization requirements for these cases, Green’s functions derived with the image source methodology are adopted, automatically accounting for the presence of symmetry conditions. A strategy for the calculation of an optimal position of the virtual sources used by the MFS to define the solution is also used, leading to improved reliability and flexibility of the proposed method. The responses obtained by the developed model are compared to reference solutions, computed by well-established models based on the boundary element method. Additionally, numerically calculated acoustic parameters, such as directivity and beamwidth, are compared with those evaluated experimentally.
Abstract:
"(This is being submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Mathematics, June 1959.)"
Abstract:
Folates and their derivatives occur as polyglutamates in nature. The multiplicity of forms and the generally low levels in foods make quantitative analysis of folate a difficult task. The assay of folates from foods generally involves three steps: liberation of folates from the cellular matrix; deconjugation from the polyglutamate to the mono- and di-glutamate forms; and detection of the biological activity or chemical concentration of the resulting folates. The detection methods used are the microbiological assay, relying on the turbidimetric bacterial growth of Lactobacillus rhamnosus, which is by far the most commonly used method; HPLC and LC/MS techniques; and bio-specific procedures. This review describes these methods along with the merits and demerits of each.
Abstract:
Analysis of variance (ANOVA) is the most efficient method available for the analysis of experimental data. Analysis of variance is a method of considerable complexity and subtlety, with many different variations, each of which applies in a particular experimental context. Hence, it is possible to apply the wrong type of ANOVA to data and, therefore, to draw an erroneous conclusion from an experiment. This article reviews the types of ANOVA most likely to arise in clinical experiments in optometry, including the one-way ANOVA ('fixed' and 'random effect' models), two-way ANOVA in randomised blocks, three-way ANOVA, and factorial experimental designs (including the varieties known as 'split-plot' and 'repeated measures'). For each ANOVA, the appropriate experimental design is described, a statistical model is formulated, and the advantages and limitations of each type of design are discussed. In addition, the problems of non-conformity to the statistical model and determination of the number of replications are considered. © 2002 The College of Optometrists.
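The simplest case reviewed above, the one-way fixed-effects ANOVA, can be sketched in a few lines of NumPy (an illustrative sketch, not code from the article): the total variation is partitioned into between-group and within-group sums of squares, and their mean squares form the F statistic.

```python
import numpy as np

def one_way_anova(groups):
    """One-way fixed-effects ANOVA: partition total variation into
    between-group and within-group sums of squares and return the
    F statistic together with its degrees of freedom."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total number of observations
    grand = np.mean(np.concatenate([np.asarray(g, float) for g in groups]))
    # Between-group sum of squares: spread of group means around the grand mean.
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    # Within-group (error) sum of squares: spread inside each group.
    ss_within = sum(((np.asarray(g, float) - np.mean(g)) ** 2).sum() for g in groups)
    df_between, df_within = k - 1, n - k
    F = (ss_between / df_between) / (ss_within / df_within)
    return F, df_between, df_within
```

For example, `one_way_anova([[1, 2, 3], [2, 3, 4], [3, 4, 5]])` gives F = 3.0 on (2, 6) degrees of freedom; the F value is then compared with the critical value of the F distribution for those degrees of freedom.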
Abstract:
Experiments combining different groups or factors and analysed by ANOVA are a powerful method of investigation in applied microbiology. ANOVA enables not only the effect of individual factors to be estimated but also their interactions; information which cannot be obtained readily when factors are investigated separately. In addition, combining different treatments or factors in a single experiment is more efficient and often reduces the number of replications required to estimate treatment effects adequately. Because of the treatment combinations used in a factorial experiment, the degrees of freedom (DF) of the error term in the ANOVA are a more important indicator of the 'power' of the experiment than the number of replicates. A good rule is to ensure, where possible, that sufficient replication is present to achieve 15 DF for each error term of the ANOVA. Finally, it is important to consider the design of the experiment, because this determines the appropriate ANOVA to use. Some of the most common experimental designs used in the biosciences and their relevant ANOVAs are discussed. If there is doubt about which ANOVA to use, the researcher should seek advice from a statistician with experience of research in applied microbiology.
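The "15 DF for the error term" rule of thumb above is easy to check in advance for a fully randomised factorial design, where the error DF is the number of treatment cells times (replicates - 1). A minimal sketch (an illustration of the arithmetic, not code from the article):

```python
def error_df(levels, replicates):
    """Error (residual) degrees of freedom for a fully randomised
    factorial experiment: with prod(levels) treatment cells and r
    replicates per cell, total DF is cells*r - 1, the treatment
    structure (all main effects and interactions) uses cells - 1,
    leaving cells * (r - 1) for the error term."""
    cells = 1
    for n_levels in levels:
        cells *= n_levels
    return cells * (replicates - 1)

# e.g. a 2 x 3 factorial needs 6 * (r - 1) >= 15, i.e. r >= 4
# replicates per cell (error DF = 18).
```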
Abstract:
In this paper we investigate an application of the method of fundamental solutions (MFS) to transient heat conduction. In almost all of the previously proposed MFS for time-dependent heat conduction the fictitious sources are located outside the time-interval of interest. In our case, however, these sources are instead placed outside the space domain of interest in the same manner as is done for stationary heat conduction. A denseness result for this method is discussed and the method is numerically tested showing that accurate numerical results can be obtained. Furthermore, a test example with boundary singularities shows that it is advisable to remove such singularities before applying the MFS.
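The MFS idea the abstract builds on is most easily seen in the stationary case it refers to: the solution is written as a combination of fundamental solutions with (fictitious) source points placed outside the domain of interest, and the coefficients are fitted by collocating the boundary data. Below is a minimal stationary sketch for the 2-D Laplace equation on the unit disk (assumed geometry and test data for illustration only; the paper's transient variant replaces the Laplace fundamental solution with the heat kernel):

```python
import numpy as np

# Collocation points on the boundary of the unit disk, and source
# points on a larger concentric circle, i.e. outside the domain.
n = 30
theta = 2 * np.pi * np.arange(n) / n
bx, by = np.cos(theta), np.sin(theta)              # boundary points
sx, sy = 1.8 * np.cos(theta), 1.8 * np.sin(theta)  # fictitious sources

def G(px, py, qx, qy):
    """Fundamental solution of the 2-D Laplace equation."""
    return -np.log(np.hypot(px - qx, py - qy)) / (2 * np.pi)

# Dirichlet data taken from the exact harmonic function x^2 - y^2.
g = bx**2 - by**2

# Collocation system: sum_j c_j G(x_i, y_j) = g(x_i) on the boundary.
A = G(bx[:, None], by[:, None], sx[None, :], sy[None, :])
c, *_ = np.linalg.lstsq(A, g, rcond=None)

# Evaluate the MFS approximation at an interior point (0.3, 0.2);
# it should be close to the exact value 0.3**2 - 0.2**2 = 0.05.
u = G(0.3, 0.2, sx, sy) @ c
```

Because the fundamental solutions satisfy the governing equation exactly, only the boundary needs to be discretized; the source positions (here a circle of radius 1.8) are a free parameter of the method.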
Abstract:
In this paper we investigate an application of the method of fundamental solutions (MFS) to transient heat conduction in layered materials, where the thermal diffusivity is piecewise constant. Recently, in Johansson and Lesnic [A method of fundamental solutions for transient heat conduction. Eng Anal Boundary Elem 2008;32:697–703], a MFS was proposed with the sources placed outside the space domain of interest, and we extend that technique to numerically approximate the heat flow in layered materials. Theoretical properties of the method, as well as numerical investigations are included.
Abstract:
In this paper, free surface problems of Stefan-type for the parabolic heat equation are investigated using the method of fundamental solutions. The additional measurement necessary to determine the free surface could be a boundary temperature, a heat flux or an energy measurement. Both one- and two-phase flows are investigated. Numerical results are presented and discussed.
Abstract:
The purpose of this paper is to explain the notion of clustering and a concrete clustering method: the agglomerative hierarchical clustering algorithm. It shows how a data-mining method such as clustering can be applied to the analysis of stocks traded on the Bulgarian Stock Exchange in order to identify similar temporal behavior of the traded stocks. The problem is solved with the aid of a data-mining tool called XLMiner™ for Microsoft Excel.
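The agglomerative algorithm named above can be sketched directly (an illustrative sketch with made-up toy data, not the paper's XLMiner workflow): start with every item in its own cluster and repeatedly merge the two closest clusters. For stock series, a common distance is 1 minus the return correlation, so strongly co-moving stocks end up in the same cluster.

```python
import numpy as np

def agglomerative(dist, k):
    """Agglomerative hierarchical clustering with single linkage:
    begin with singleton clusters and merge the pair of clusters
    at minimum distance until only k clusters remain."""
    clusters = [{i} for i in range(len(dist))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # Single linkage: distance between closest members.
                d = min(dist[i][j] for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] |= clusters[b]   # merge b into a
        del clusters[b]
    return clusters

# Toy "daily returns" for four stocks: two correlated pairs.
rng = np.random.default_rng(1)
base1, base2 = rng.normal(size=50), rng.normal(size=50)
returns = np.array([base1 + 0.1 * rng.normal(size=50),
                    base1 + 0.1 * rng.normal(size=50),
                    base2 + 0.1 * rng.normal(size=50),
                    base2 + 0.1 * rng.normal(size=50)])
D = 1 - np.corrcoef(returns)          # correlation distance
clusters = agglomerative(D, 2)        # recovers the two pairs
```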
Abstract:
The paper describes a decision-support system for forecasting the inflation level, based on a multifactor dependence represented by a decision-making 'tree'. The interrelation of factors affecting the inflation level (economic, financial, political and socio-demographic ones) is considered. Perspectives for developing the decision-tree method are defined, in particular the identification of so-called 'narrow' places (bottlenecks) and the further analysis of possible scenarios for forecasting the inflation level.
Abstract:
Environmental impacts usually extend beyond the boundaries of a single company, which is why purchasing decisions play an important role when environmental criteria are enforced in a supply chain context. Many examples could be cited in which an alternative is environmentally advantageous by one criterion but burdens the environment when the supply chain as a whole is considered. Measuring environmental impacts at the supply chain level, however, poses serious challenges, and this has inspired considerable research and development. One area with substantial research results is the incorporation of environmental criteria into supplier evaluation. Joining this stream of research, the authors investigate how to determine, in one of the most widely used supplier evaluation methods, the weighted scoring system, the weight at which a given criterion becomes a decision-influencing factor. To this end they apply the DEA (Data Envelopment Analysis) method of composite indicators (CI), and they use the theory of linear programming to establish the importance of the criteria's common weights.
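The "threshold weight" question in this abstract can be illustrated in its simplest form with two suppliers and two criteria (a toy sketch with invented scores, not the authors' DEA/linear-programming model): find the smallest weight on the environmental criterion at which the environmentally better supplier overtakes in the weighted score.

```python
def decisive_weight(trad_a, env_a, trad_b, env_b):
    """Smallest weight w on the environmental criterion at which
    supplier B (worse on the traditional composite, better
    environmentally) overtakes supplier A in the weighted score
    (1 - w) * traditional + w * environmental.

    Setting the two scores equal and solving for w gives
    w* = d_trad / (d_trad + d_env)."""
    d_trad = trad_a - trad_b   # A's traditional advantage (assumed > 0)
    d_env = env_b - env_a      # B's environmental advantage (assumed > 0)
    return d_trad / (d_trad + d_env)

# With illustrative normalized scores A = (0.9, 0.3), B = (0.7, 0.8),
# the environmental criterion becomes decisive at w* = 0.2 / 0.7.
```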
Abstract:
Using the toolkit of microeconomics and 2013 data from the Hungarian car market, the study presents a new method for price determination. The central question of the research is where to find the point at which the consumer is satisfied with the quality and price offered, preferably at the right time, and the company is also satisfied with the profit obtained. In setting prices, therefore, quality and time play a central role as value-creating functions. One of the main conclusions of the analysis is that the optimal price, derived from the profit maximum, can be determined for various parameters of quality and time. With the help of this method and of economic tools, companies gain a new perspective for setting their operating parameters and competitive priorities (price, cost, quality level, time).
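The "optimal price derived from the profit maximum" can be illustrated with the textbook linear-demand case (a purely illustrative sketch; the functional form and the quality/time shift below are assumptions, not the study's estimated model): with demand q(p) = a - b*p and unit cost c, profit (p - c)(a - b*p) is a downward parabola in p, maximised at p* = (a + b*c) / (2*b).

```python
def optimal_price(a, b, c):
    """Profit-maximising price for linear demand q(p) = a - b*p and
    constant unit cost c. Setting d/dp [(p - c)(a - b*p)] = 0
    gives p* = (a + b*c) / (2*b)."""
    return (a + b * c) / (2 * b)

def demand_intercept(base, quality, time_, alpha=2.0, beta=1.0):
    """Illustrative assumption: quality raises willingness to pay and
    longer delivery time lowers it, shifting the demand intercept."""
    return base + alpha * quality - beta * time_

# e.g. base = 10, quality = 2, time = 1 gives a = 13; with b = 2 and
# c = 1 the optimal price is (13 + 2) / 4 = 3.75.
```

Varying the quality and time parameters shifts the intercept a and hence the optimal price, which is the sensitivity the study exploits when linking price to competitive priorities.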