955 results for Movable bed models (Hydraulic engineering)


Relevance:

30.00%

Publisher:

Abstract:

A novel technique for selecting the poles of orthonormal basis functions (OBF) in Volterra models of any order is presented. It is well known that the usually large number of parameters required to describe the Volterra kernels can be significantly reduced by representing each kernel using an appropriate basis of orthonormal functions. Such a representation results in the so-called OBF Volterra model, which has a Wiener structure consisting of a linear dynamic part generated by the orthonormal basis followed by a nonlinear static mapping given by the Volterra polynomial series. Aiming at optimizing the poles that fully parameterize the orthonormal bases, the exact gradients of the outputs of the orthonormal filters with respect to their poles are computed analytically by using a back-propagation-through-time technique. The expressions relative to the Kautz basis and to generalized orthonormal bases of functions (GOBF) are addressed; the ones related to the Laguerre basis follow straightforwardly as a particular case. The main innovation here is that the dynamic nature of the OBF filters is fully considered in the gradient computations. These gradients provide exact search directions for optimizing the poles of a given orthonormal basis. Such search directions can, in turn, be used as part of an optimization procedure to locate the minimum of a cost function that takes into account the estimation error of the system output. The Levenberg-Marquardt algorithm is adopted here as the optimization procedure. Unlike previous related work, the proposed approach relies solely on input-output data measured from the system to be modeled, i.e., no information about the Volterra kernels is required. Examples are presented to illustrate the application of this approach to the modeling of dynamic systems, including a real magnetic levitation system with nonlinear oscillatory behavior.
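As an illustration of the Wiener structure described above, the following minimal sketch builds a discrete Laguerre filter bank (the particular case mentioned in the abstract) and feeds its outputs through a static polynomial nonlinearity. The pole value and polynomial coefficients are hypothetical, chosen only for demonstration; this is not the paper's implementation.

```python
import numpy as np
from scipy.signal import lfilter

def laguerre_filter_bank(u, pole, n_filters):
    """Outputs of a discrete Laguerre filter bank driven by input u.

    First section: sqrt(1 - a^2) / (1 - a z^-1); each subsequent
    section cascades the all-pass (z^-1 - a) / (1 - a z^-1).
    """
    a = pole
    outputs = []
    x = lfilter([np.sqrt(1.0 - a**2)], [1.0, -a], u)
    outputs.append(x)
    for _ in range(n_filters - 1):
        x = lfilter([-a, 1.0], [1.0, -a], x)
        outputs.append(x)
    return np.column_stack(outputs)

# Wiener structure: linear Laguerre dynamics followed by a static
# polynomial nonlinearity with hypothetical coefficients c1.
rng = np.random.default_rng(0)
u = rng.standard_normal(500)
X = laguerre_filter_bank(u, pole=0.6, n_filters=3)  # N x 3 regressors
c1 = np.array([0.5, -0.2, 0.1])                     # linear kernel coeffs
y = X @ c1 + 0.05 * (X @ c1) ** 2                   # static nonlinearity
```

The pole optimization discussed in the abstract would adjust `pole` (and the coefficients) to minimize the output estimation error.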

Relevance:

30.00%

Publisher:

Abstract:

Increasing efforts exist in integrating different levels of detail in models of the cardiovascular system. For instance, one-dimensional representations are employed to model the systemic circulation. In this context, effective and black-box-type decomposition strategies for one-dimensional networks are needed, so as to: (i) employ domain decomposition strategies for large systemic models (1D-1D coupling) and (ii) provide the conceptual basis for dimensionally-heterogeneous representations (1D-3D coupling, among various possibilities). The strategy proposed in this article works for both of these two scenarios, though the several applications shown to illustrate its performance focus on the 1D-1D coupling case. A one-dimensional network is decomposed in such a way that each coupling point connects two (and not more) of the sub-networks. At each of the M connection points two unknowns are defined: the flow rate and pressure. These 2M unknowns are determined by 2M equations, since each sub-network provides one (non-linear) equation per coupling point. It is shown how to build the 2M x 2M non-linear system with arbitrary and independent choice of boundary conditions for each of the sub-networks. The idea is then to solve this non-linear system until convergence, which guarantees strong coupling of the complete network. In other words, if the non-linear solver converges at each time step, the solution coincides with what would be obtained by monolithically modeling the whole network. The decomposition thus imposes no stability restriction on the choice of the time step size. Effective iterative strategies for the non-linear system that preserve the black-box character of the decomposition are then explored. Several variants of matrix-free Broyden's and Newton-GMRES algorithms are assessed as numerical solvers by comparing their performance on sub-critical wave propagation problems which range from academic test cases to realistic cardiovascular applications. A specific variant of Broyden's algorithm is identified and recommended on the basis of its computer cost and reliability. (C) 2010 Elsevier B.V. All rights reserved.
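The coupling scheme above can be sketched for a toy case with a single coupling point (M = 1), where the two interface unknowns are the flow rate Q and pressure P, and each sub-network contributes one non-linear equation. The resistance-type relations and parameter values below are purely illustrative, not taken from the paper; the matrix-free solve uses SciPy's Broyden method.

```python
import numpy as np
from scipy.optimize import broyden1

# Hypothetical sub-network parameters (illustrative only).
P_in, P_out = 100.0, 10.0
R1, R2, k1, k2 = 2.0, 3.0, 0.5, 0.2

def residual(x):
    """Interface residual: one black-box equation per sub-network."""
    Q, P = x
    f1 = P_in - P - R1 * Q - k1 * Q * abs(Q)   # sub-network 1 equation
    f2 = P - P_out - R2 * Q - k2 * Q * abs(Q)  # sub-network 2 equation
    return np.array([f1, f2])

# Matrix-free Broyden iteration on the 2M x 2M interface system;
# convergence at each time step yields the strongly coupled solution.
Q_P = broyden1(residual, [1.0, 50.0], f_tol=1e-9)
```

Each call to `residual` stands in for one round of independent sub-network solves, which is what preserves the black-box character of the decomposition.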

Relevance:

30.00%

Publisher:

Abstract:

In this article we address decomposition strategies especially tailored to perform strong coupling of dimensionally heterogeneous models, under the hypothesis that one wants to solve each submodel separately and implement the interaction between subdomains by boundary conditions alone. The novel methodology takes full advantage of the small number of interface unknowns in this kind of problem. Existing algorithms can be viewed as variants of the "natural" staggered algorithm in which each domain transfers function values to the other and receives fluxes (or forces), and vice versa. This natural algorithm is known as Dirichlet-to-Neumann in the domain decomposition literature. Essentially, we propose a framework in which this algorithm is equivalent to applying Gauss-Seidel iterations to a suitably defined (linear or nonlinear) system of equations. It is then immediate to switch to other iterative solvers such as GMRES or other Krylov-based methods, which we assess through numerical experiments showing the significant gain that can be achieved. Indeed, the benefit is that an extremely flexible, automatic coupling strategy can be developed, which in addition leads to iterative procedures that are parameter-free and rapidly converging. Further, in linear problems they have the finite termination property. Copyright (C) 2009 John Wiley & Sons, Ltd.
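The switch from a staggered scheme to a Krylov solver can be sketched on a toy interface system T x = b, where T is available only through black-box matvecs (each application standing in for one round of subdomain solves). The matrix entries below are illustrative random data, not a real coupled model.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

# Toy interface operator; in practice T x is evaluated by solving
# each subdomain with interface data x and collecting the mismatch.
rng = np.random.default_rng(1)
n = 8
T = np.eye(n) + 0.2 * rng.standard_normal((n, n)) / np.sqrt(n)
b = rng.standard_normal(n)

def apply_T(x):          # black-box matvec (one round of subdomain solves)
    return T @ x

# (i) the "natural" staggered scheme viewed as a relaxed fixed-point
# iteration on the interface equations
x_fp = np.zeros(n)
for _ in range(200):
    x_fp = x_fp - 0.5 * (apply_T(x_fp) - b)

# (ii) the same black-box matvec handed to a Krylov solver
x_gm, info = gmres(LinearOperator((n, n), matvec=apply_T), b)
```

Because GMRES only needs `apply_T`, the subdomain solvers remain untouched black boxes, which is the flexibility the abstract emphasizes.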

Relevance:

30.00%

Publisher:

Abstract:

In many data sets from clinical studies there are patients insusceptible to the occurrence of the event of interest. Survival models which ignore this fact are generally inadequate. The main goal of this paper is to describe an application of the generalized additive models for location, scale, and shape (GAMLSS) framework to the fitting of long-term survival models. In this work the number of competing causes of the event of interest follows the negative binomial distribution. In this way, some well-known models found in the literature are characterized as particular cases of our proposal. The model is conveniently parameterized in terms of the cured fraction, which is then linked to covariates. We explore the use of the gamlss package in R as a powerful tool for inference in long-term survival models. The procedure is illustrated with a numerical example. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
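A minimal numerical sketch of this class of model: taking the negative binomial cure-rate form of the population survival, S_pop(t) = (1 + phi*theta*F(t))^(-1/phi), with an exponential time-to-event distribution F. All parameter values are hypothetical, and the exponential choice for F is an assumption made only for illustration.

```python
import numpy as np

def pop_survival(t, theta, phi, lam):
    """Population survival for a negative-binomial number of competing
    causes (mean theta, dispersion phi) with exponential cause times."""
    F = 1.0 - np.exp(-lam * t)            # cdf of each competing cause
    return (1.0 + phi * theta * F) ** (-1.0 / phi)

def cure_fraction(theta, phi):
    # limit of the population survival as t -> infinity
    return (1.0 + phi * theta) ** (-1.0 / phi)

theta, phi, lam = 1.5, 0.8, 0.4           # hypothetical parameters
t = np.linspace(0.0, 50.0, 200)
S = pop_survival(t, theta, phi, lam)
```

The survival curve plateaus at the cure fraction rather than decaying to zero, which is the defining feature of long-term survival models; in the paper this cured fraction is further linked to covariates.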

Relevance:

30.00%

Publisher:

Abstract:

Surface roughness is an important geomorphological variable which has been used in the Earth and planetary sciences to infer material properties, current/past processes, and the time elapsed since formation. No single definition exists; however, within the context of geomorphometry, we use surface roughness as an expression of the variability of a topographic surface at a given scale, where the scale of analysis is determined by the size of the landforms or geomorphic features of interest. Six techniques for the calculation of surface roughness were selected for an assessment of the parameter's behavior at different spatial scales and data-set resolutions. Area ratio operated independently of scale, providing consistent results across spatial resolutions. Vector dispersion produced results with increasing roughness and homogenization of terrain at coarser resolutions and larger window sizes. Standard deviation of residual topography highlighted local features and did not detect regional relief. Standard deviation of elevation correctly identified breaks of slope and was good at detecting regional relief. Standard deviation of slope (SD(slope)) also correctly identified smooth sloping areas and breaks of slope, providing the best results for geomorphological analysis. Standard deviation of profile curvature identified the breaks of slope, although not as strongly as SD(slope), and it is sensitive to noise and spurious data. In general, SD(slope) offered good performance at a variety of scales, while the simplicity of calculation is perhaps its single greatest benefit.
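The simplicity of SD(slope) noted above can be seen in a short sketch: slope from finite differences of a DEM, then a moving-window standard deviation. The window size, cell size, and synthetic test surfaces are illustrative.

```python
import numpy as np
from scipy.ndimage import generic_filter

def sd_slope(dem, cellsize=1.0, window=5):
    """SD(slope) roughness: windowed std of the slope (in degrees)."""
    gy, gx = np.gradient(dem, cellsize)
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))
    return generic_filter(slope, np.std, size=window)

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 64)
plane = 10.0 * x[None, :] * np.ones((64, 1))   # uniform slope
rough = plane + rng.standard_normal((64, 64))  # noisy terrain
```

A uniformly sloping plane has zero SD(slope) everywhere, while added noise raises it sharply, matching the abstract's point that the measure identifies smooth sloping areas as smooth rather than flagging them as rough.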

Relevance:

30.00%

Publisher:

Abstract:

Natural rivers consist of networks of junctions and streams, and sediment deposition and erosion occur under specific flow conditions. During flood season, large discharges flow through the river and the river bed is reshaped by high flow velocities. Flow characteristics in junction areas are especially complex. The purpose of this study is to analyze the flow characteristics at a channel junction, which are most strongly influenced by large discharges such as floods and inflow from tributaries. We investigate the flow characteristics using the hydrodynamics and transport modules of MIKE 3 FM, a helpful tool for analyzing 3D hydrodynamics and erosion and sedimentation of the channel bed. We analyze the flow characteristics at the channel junction and also consider hydraulic structures, such as bridge piers, that influence flow velocity, water level, erosion, and scour depth in the channel bed. In the model, we controlled the discharge condition according to the Froude number and varied the grain diameter and the flow ratio between the main stream and the tributary. From the results, flow velocity, water level, erosion, and sediment depth are analyzed, and we propose equations relating these results. This study will help in understanding flow characteristics and the influence of hydraulic structures at channel junctions. Acknowledgments: This research was supported by a grant (12-TI-C01) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
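The discharge conditions above are set according to the Froude number; a minimal sketch of its computation for a rectangular channel section follows. The channel dimensions and discharge are hypothetical.

```python
import math

def froude_number(Q, width, depth, g=9.81):
    """Froude number Fr = V / sqrt(g*h) for a rectangular section."""
    v = Q / (width * depth)          # mean velocity from discharge
    return v / math.sqrt(g * depth)

Fr = froude_number(Q=120.0, width=30.0, depth=2.5)  # illustrative values
regime = "subcritical" if Fr < 1.0 else "supercritical"
```

Fr < 1 indicates subcritical flow and Fr > 1 supercritical flow, which is why the Froude number is a natural control parameter for junction-flow experiments.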

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, more than half of computer development projects fail to meet the end users' expectations. One of the main causes is insufficient knowledge about the organization of the enterprise to be supported by the respective information system. The DEMO methodology (Design and Engineering Methodology for Organizations) has proven to be a well-defined method for specifying, through models and diagrams, the essence of any organization at a high level of abstraction. However, this methodology is platform-implementation independent, and lacks the possibility of saving and propagating changes from the organization models to the implemented software in a runtime environment. The Universal Enterprise Adaptive Object Model (UEAOM) is a conceptual schema used as a basis for a wiki system that allows the modeling of any organization, independently of its implementation, as well as the previously mentioned change propagation in a runtime environment. Based on DEMO and UEAOM, this project aims to develop efficient and standardized methods to enable an automatic conversion of DEMO ontological models, based on the UEAOM specification, into BPMN (Business Process Model and Notation) process models with clear, unambiguous semantics, in order to facilitate the creation of processes that are almost ready to be executed on workflow systems that support BPMN.

Relevance:

30.00%

Publisher:

Abstract:

This article presents a detailed study of the application of different additive manufacturing technologies (sintering, three-dimensional printing, extrusion, and stereolithography) in the design process of a complex geometry model and its moving parts. The fabrication sequence was evaluated in terms of pre-processing conditions (model generation and conversion to the STL and SLI formats), generation strategy, and physical model post-processing operations. Dimensional verification of the obtained models was undertaken by structured light projection (optical scanning), a relatively new technology of major importance for metrology and reverse engineering. Manufacturing time and production costs were also studied, which allowed the definition of a more comprehensive evaluation matrix of additive technologies.

Relevance:

30.00%

Publisher:

Abstract:

The search for better performance in structural systems has led to more refined models, involving the analysis of a growing number of details, which must be correctly formulated in order to define a representative model of the real system. Representative models demand a detailed description of the project and call for new techniques of evaluation and analysis. Model updating is one of these techniques; it can be used to improve the predictive capabilities of computer-based models. This paper presents an FRF-based finite element model updating procedure in which the updating variables are physical parameters of the model. It includes damping effects in the updating procedure, assuming proportional and non-proportional damping mechanisms. The updating parameters are defined at the element level or over macro regions of the model, so the parameters are adjusted locally, facilitating the physical interpretation of the model adjustment. Different tests on simulated and experimental data are discussed with the aim of evaluating the characteristics and potential of the methodology.
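The FRFs at the heart of such a procedure can be sketched for a 2-DOF system with proportional damping C = alpha*M + beta*K; FRF-based updating compares predicted curves like this against measured ones while adjusting the physical parameters (here m, k, alpha, beta). All values below are illustrative.

```python
import numpy as np

# Hypothetical 2-DOF mass/stiffness model with proportional damping.
m, k = 1.0, 1.0e4
M = np.diag([m, m])
K = np.array([[2.0 * k, -k], [-k, 2.0 * k]])
alpha, beta = 0.5, 1.0e-5
C = alpha * M + beta * K                 # proportional damping matrix

omega = np.linspace(1.0, 250.0, 2000)    # rad/s
# receptance H(w) = (K - w^2 M + i w C)^-1; keep the (0, 0) entry
H00 = np.array([np.linalg.inv(K - w**2 * M + 1j * w * C)[0, 0]
                for w in omega])
```

For non-proportional damping one simply replaces `C` with a general matrix; the receptance formula is unchanged, which is what lets the updating procedure treat both damping mechanisms uniformly.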

Relevance:

30.00%

Publisher:

Abstract:

Research in Requirements Engineering has been growing in recent years. Researchers are concerned with a set of open issues such as: communication between the several user profiles involved in software engineering; scope definition; and volatility and traceability issues. To cope with these issues, a number of works concentrate on (i) defining processes to collect clients' specifications in order to solve scope issues; (ii) defining models to represent requirements in order to address communication and traceability issues; and (iii) working on mechanisms and processes to be applied to requirements modeling in order to facilitate requirements evolution and maintenance, addressing volatility and traceability issues. We propose an iterative model-driven process to solve these issues, based on a double-layered CIM to communicate requirements-related knowledge to a wider range of stakeholders. We also present a tool to help the requirements engineer through the RE process. Finally, we present a case study to illustrate the benefits and usage of the process and tool.

Relevance:

30.00%

Publisher:

Abstract:

This work is concerned with the computation of incompressible axisymmetric and full three-dimensional free-surface flows. In particular, the circular hydraulic jump is simulated and compared with approximate analytic solutions. However, the principal thrust of this paper is to provide a real problem as a test bed for comparing the many existing convective approximations. Their performance is compared; SMART, HLPA and VONOS emerge as acceptable upwinding methods for this problem. Copyright (C) 2002 John Wiley & Sons, Ltd.
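For context on the convective approximations compared above: SMART, HLPA and VONOS are higher-order bounded refinements of the basic upwind treatment of the convective term. A minimal sketch of that baseline, first-order upwinding for 1D advection u_t + a u_x = 0 on a periodic grid (grid and CFL values illustrative):

```python
import numpy as np

a, dx, dt = 1.0, 0.01, 0.005            # CFL = a*dt/dx = 0.5
x = np.arange(0.0, 1.0, dx)
u = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)   # square pulse

for _ in range(80):
    # first-order upwind difference of the convective term (a > 0)
    u = u - a * dt / dx * (u - np.roll(u, 1))
```

First-order upwinding is bounded and conservative but strongly diffusive; the schemes assessed in the paper aim to keep the boundedness while recovering higher-order accuracy.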

Relevance:

30.00%

Publisher:

Abstract:

This work presents experimental information relevant to the combustion of biomass in a bubbling fluidized bed. The biomass distribution in a fluidized bed was studied through tests performed in a cold bed, while the volatiles released in the biomass pyrolysis, the burning rate of the resulting charcoal, and the combustion control regime were studied through tests performed in a high-temperature bed. Visual examination of photographs taken of a transparent-walled bed with a rectangular cross-section showed that the large fuel particles, typical of biomass processing, were distributed in the bubbles, in the splash zone, and in the emulsion phase. The occurrence of biomass in the emulsion phase was favored by burning biomass particles of greater density and smaller size, experimentally determined in each case. Decreasing the fuel particle size improved the biomass distribution inside the bed. The same was accomplished by increasing the superficial gas velocity as much as possible, compatibly with acceptable elutriation. Burning tests showed that biomass fuels have the advantage of reaching the diffusional regime at temperatures that can be lower than 1000 K, which ensures that they burn in a stable regime. (C) 2007 Elsevier B.V. All rights reserved.
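The superficial gas velocity mentioned above is bounded by elutriation of fine particles; a rough screening check (not from the paper) compares it with the terminal velocity of the finest particles using Stokes' law, valid for particle Reynolds numbers below about 1. The gas and particle property values are illustrative estimates for a hot bed.

```python
def stokes_terminal_velocity(d_p, rho_p, rho_g=0.4, mu=4.0e-5, g=9.81):
    """Stokes terminal velocity u_t = g*d^2*(rho_p - rho_g)/(18*mu)."""
    return g * d_p**2 * (rho_p - rho_g) / (18.0 * mu)

# Hypothetical fine char particle: 100 um, density 450 kg/m^3.
u_t = stokes_terminal_velocity(d_p=100e-6, rho_p=450.0)   # m/s
u_superficial = 0.8                                       # m/s, illustrative
elutriated = u_superficial > u_t   # fines this small would be carried out
```

Particles whose terminal velocity falls below the superficial velocity tend to be entrained, which is what limits how far the velocity can be raised to improve mixing.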

Relevance:

30.00%

Publisher:

Abstract:

Rare earth (RE) metals are essential for the manufacturing of high-technology products. The separation of RE is complex and expensive; biosorption is an alternative to conventional processes. This work focuses on the biosorption of monocomponent and bicomponent solutions of lanthanum(III) and neodymium(III) in fixed-bed columns using Sargassum sp. biomass. The desorption of metals from the loaded biomass with 0.10 mol L-1 HCl is also carried out with the objective of increasing the efficiency of metal separation. Simple models (i.e., the Thomas, Bohart-Adams, and Yoon-Nelson equations) have been successfully used to model the breakthrough curves for the biosorption of monocomponent solutions. From biosorption and desorption experiments in both monocomponent and bicomponent solutions, a slight selectivity of the biomass for Nd(III) over La(III) is observed. The experiments did not achieve an effective separation of the RE studied, but their results indicate a possible partition between the metals, which is the fundamental condition for separation prospects. (C) 2012 American Institute of Chemical Engineers Biotechnol. Prog., 2012
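Two of the breakthrough-curve models cited above have simple closed forms; the sketch below evaluates them with purely illustrative parameters (not fitted to the paper's data).

```python
import numpy as np

def thomas(t, k_th, q0, m, c0, Q):
    """Thomas model: C/C0 = 1 / (1 + exp(k_th*q0*m/Q - k_th*c0*t))."""
    return 1.0 / (1.0 + np.exp(k_th * q0 * m / Q - k_th * c0 * t))

def yoon_nelson(t, k_yn, tau):
    """Yoon-Nelson model: C/C0 = 1 / (1 + exp(k_yn*(tau - t)))."""
    return 1.0 / (1.0 + np.exp(k_yn * (tau - t)))

t = np.linspace(0.0, 600.0, 300)   # min, illustrative time grid
bt_thomas = thomas(t, k_th=0.02, q0=50.0, m=5.0, c0=1.0, Q=1.0)
bt_yn = yoon_nelson(t, k_yn=0.02, tau=250.0)
```

Both produce the characteristic S-shaped breakthrough curve rising from ~0 to 1; in the Yoon-Nelson form, tau is the time at which the outlet concentration reaches 50% of the feed.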

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)