8 results for "Simulação por computador" (computer simulation)

in the Repositório Institucional da Universidade Tecnológica Federal do Paraná (RIUT)


Relevance:

60.00%

Publisher:

Abstract:

Reinforced concrete creep is a phenomenon of great importance. Although it is identified as the main cause of several pathologies, its effects are still accounted for in a simplified way by structural designers. In addition to studying the phenomenon in reinforced concrete structures and the way it is currently considered in structural analysis, this work compares creep strains in simply supported reinforced concrete beams, obtained analytically and experimentally, with finite element method (FEM) simulation results. The strains and deflections obtained analytically were calculated following the recommendations of the Brazilian code NBR 6118 (2014) and the simplified method of CEB-FIP 90, while the experimental results were extracted from tests available in the literature. The finite element simulations are performed with the ANSYS Workbench software, using 3D SOLID186 elements and exploiting the symmetry of the structure. Convergence analyses using 2D PLANE183 elements are also carried out. It is concluded that FEM analyses are quantitatively and qualitatively efficient for estimating this non-linearity and that the method used to obtain the creep coefficient values is sufficiently accurate.
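
A minimal sketch of how a creep coefficient translates into creep strain, assuming the usual relation ε_cc(t, t₀) = φ(t, t₀)·σ_c/E_ci adopted by codes such as NBR 6118 and CEB-FIP 90; the stress, modulus and coefficient values below are illustrative only, not those of the cited tests or simulations.

```python
# Illustrative sketch: long-term creep strain from a creep coefficient,
# using the usual relation eps_cc(t, t0) = phi(t, t0) * sigma_c / E_ci.
# The concrete properties and the coefficient below are assumed values,
# not the ones used in the study.

def creep_strain(sigma_c_mpa: float, e_ci_mpa: float, phi: float) -> float:
    """Creep strain for a constant sustained stress sigma_c."""
    elastic_strain = sigma_c_mpa / e_ci_mpa
    return phi * elastic_strain

def total_strain(sigma_c_mpa: float, e_ci_mpa: float, phi: float) -> float:
    """Instantaneous plus creep strain."""
    return (1.0 + phi) * sigma_c_mpa / e_ci_mpa

if __name__ == "__main__":
    sigma_c = 10.0      # sustained compressive stress [MPa], assumed
    e_ci = 28000.0      # tangent modulus of elasticity [MPa], assumed
    phi = 2.5           # creep coefficient phi(t, t0), assumed
    print(f"creep strain: {creep_strain(sigma_c, e_ci, phi):.6f}")
    print(f"total strain: {total_strain(sigma_c, e_ci, phi):.6f}")
```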

Relevance:

20.00%

Publisher:

Abstract:

Power generation from alternative sources is currently the subject of numerous research and development efforts in science and industry. Wind energy stands out in this scenario as one of the most prominent alternatives for electricity generation, owing to its numerous advantages. In research, computational reproduction of the experimental behavior of a wind turbine is a very suitable tool for the development and study of new technologies and for exploiting the wind potential of a given region. Such tools are generally expected to include the simulation of the mechanical and electrical parameters that directly affect the energy conversion. This work presents the energy conversion process in wind power generation systems and develops a tool for experimental wind turbine emulation using the LabVIEW® software. The purpose of this tool is to emulate the torque developed on the shaft of a wind turbine. The physical setup consists of a three-phase induction motor and a permanent magnet synchronous generator, which are evaluated under different wind speed conditions. The tool is designed to be flexible with respect to other laboratory arrangements and can be used in other wind power generation structures in real time. A model of the wind power system, from the turbine to the electrical generator, is presented. A simulation tool is developed in Matlab/Simulink® to pre-validate the experimental setup. Finally, the design is implemented in a laboratory setup.
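
A minimal sketch, in Python, of the aerodynamic torque such an emulator reproduces on the shaft, using the standard relations P = ½ρA·Cp(λ, β)·v³ and T = P/ω; the Cp approximation and the turbine parameters below are generic textbook assumptions, not the values of the LabVIEW® or Matlab/Simulink® implementations.

```python
import math

# Illustrative sketch of the aerodynamic torque emulated on the shaft:
# P = 0.5 * rho * A * Cp(lambda, beta) * v^3 and T = P / omega.
# Rotor size and Cp approximation are generic assumptions.

RHO = 1.225          # air density [kg/m^3]
RADIUS = 2.0         # rotor radius [m], assumed
AREA = math.pi * RADIUS ** 2

def power_coefficient(tip_speed_ratio: float, pitch_deg: float = 0.0) -> float:
    """Generic Cp(lambda, beta) approximation commonly used in simulations."""
    lam, beta = tip_speed_ratio, pitch_deg
    lam_i = 1.0 / (1.0 / (lam + 0.08 * beta) - 0.035 / (beta ** 3 + 1.0))
    cp = 0.22 * (116.0 / lam_i - 0.4 * beta - 5.0) * math.exp(-12.5 / lam_i)
    return max(cp, 0.0)

def shaft_torque(wind_speed: float, rotor_speed: float) -> float:
    """Torque [N*m] developed on the turbine shaft at a given wind speed."""
    lam = rotor_speed * RADIUS / wind_speed
    power = 0.5 * RHO * AREA * power_coefficient(lam) * wind_speed ** 3
    return power / rotor_speed

if __name__ == "__main__":
    for v in (6.0, 9.0, 12.0):   # wind speeds [m/s], omega fixed at 15 rad/s
        print(f"v = {v} m/s -> T = {shaft_torque(v, rotor_speed=15.0):.1f} N*m")
```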

Relevance:

20.00%

Publisher:

Abstract:

Centrifugal pumps are widely used in many industrial applications. Knowledge of how these components behave under different circumstances is crucial for the development of more efficient and, therefore, less expensive pumping installations. The combination of multiple impellers, vaned diffusers and a volute may introduce several complex flow characteristics that deviate largely from classical inviscid pump flow theory. Computational Fluid Dynamics can be very helpful to extract information about which physical phenomena are involved in such flows. In this sense, this work performs a numerical study of the flow in a two-stage centrifugal pump (Imbil ITAP 65-330/2) with a vaned diffuser and a volute. The flow in the pump is modeled with the Ansys CFX software, by means of a multi-block, transient rotor-stator technique, with structured grids for all pump parts. The simulations were performed using water and water-glycerin mixtures as working fluids; several viscosities were considered, in a range between 87 and 720 cP. Comparisons between the experimental data obtained by Amaral (2007) and the numerical head curves showed good agreement, with an average deviation of 6.8% for water. The behavior of the velocity, pressure and turbulence kinetic energy fields was evaluated for several operating conditions. In general, the results obtained in this work achieved the proposed goals and are a significant contribution to the understanding of the flow studied.
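
A minimal sketch of how a head-curve point is obtained from a computed pressure rise, H = Δp/(ρg), together with the relative deviation used for comparison against experimental data; the sample values below are placeholders, not CFX or test results.

```python
# Illustrative sketch: pump head from the total pressure rise, H = dp / (rho * g),
# and the relative deviation used when comparing numerical and experimental
# head curves. The sample values are placeholders, not CFX or test results.

G = 9.81  # gravitational acceleration [m/s^2]

def head(delta_p_pa: float, density: float) -> float:
    """Pump head [m] from the total pressure rise [Pa]."""
    return delta_p_pa / (density * G)

def relative_deviation(numerical: float, experimental: float) -> float:
    """Relative deviation [%] between numerical and experimental head."""
    return 100.0 * abs(numerical - experimental) / experimental

if __name__ == "__main__":
    rho_water = 998.0                  # [kg/m^3]
    h_num = head(6.0e5, rho_water)     # assumed total pressure rise of 6 bar
    h_exp = 58.0                       # assumed experimental head [m]
    print(f"numerical head: {h_num:.1f} m")
    print(f"deviation: {relative_deviation(h_num, h_exp):.1f} %")
```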

Relevance:

20.00%

Publisher:

Abstract:

In this research work, a new routing protocol for opportunistic networks is presented. The proposed protocol is called PSONET (PSO for Opportunistic Networks), since it uses a hybrid system based on a Particle Swarm Optimization (PSO) algorithm. The main motivation for using PSO is to take advantage of its population-based search and learning adaptation. PSONET uses the Particle Swarm Optimization technique to drive the network traffic through a good subset of message forwarders. PSONET analyzes the network communication conditions, detecting whether each node has sparse or dense connections, and thus makes better decisions about routing messages. The PSONET protocol is compared with the Epidemic and PROPHET protocols in three different mobility scenarios: an activity-based mobility model, which simulates people's everyday life in their work, leisure and rest activities; a community-based mobility model, which simulates groups of people in their communities who eventually contact other people, who may or may not be part of their community, to exchange information; and a random mobility pattern, which simulates a scenario divided into communities where people choose a destination at random and, based on a restriction map, move to this destination using the shortest path. The simulation results, obtained with the ONE simulator, show that in the community-based and random mobility scenarios the PSONET protocol achieves a higher message delivery rate and lower message replication than the Epidemic and PROPHET protocols.
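
A minimal sketch of the canonical PSO velocity and position update that drives this kind of search; the fitness function below is a placeholder (sphere function), not PSONET's routing metric, and in the actual protocol the particle positions would encode forwarding decisions.

```python
import random

# Illustrative sketch of the canonical PSO update:
# v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x); x = x + v.
# The fitness function is a placeholder, not PSONET's routing metric.

W, C1, C2 = 0.7, 1.5, 1.5       # inertia and acceleration coefficients (assumed)
DIM, SWARM, ITERS = 2, 20, 100

def fitness(x):
    """Placeholder objective (sphere function); lower is better."""
    return sum(v * v for v in x)

particles = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
velocities = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in particles]
gbest = min(pbest, key=fitness)

for _ in range(ITERS):
    for i, x in enumerate(particles):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            velocities[i][d] = (W * velocities[i][d]
                                + C1 * r1 * (pbest[i][d] - x[d])
                                + C2 * r2 * (gbest[d] - x[d]))
            x[d] += velocities[i][d]
        if fitness(x) < fitness(pbest[i]):
            pbest[i] = x[:]
    gbest = min(pbest, key=fitness)

print("best position:", gbest, "fitness:", fitness(gbest))
```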

Relevance:

20.00%

Publisher:

Abstract:

One of the challenges posed to biomedical engineers by neuroscience researchers is brain-machine interaction. The nervous system communicates through electrochemical signals, and implantable circuits make decisions in order to interact with the biological environment. It is well known that Parkinson's disease is related to a deficit of dopamine (DA). Different methods have been employed to control dopamine concentration, such as magnetic or electrical stimulators or drugs. In this work, the neurotransmitter concentration was controlled automatically, something not currently done. To that end, four systems were designed and developed: deep brain stimulation (DBS), transcranial magnetic stimulation (TMS), infusion pump control (IPC) for drug delivery, and fast-scan cyclic voltammetry (FSCV), a sensing circuit that detects the varying concentrations of neurotransmitters such as dopamine caused by these stimulations. Software was also developed for data display and analysis in synchrony with the ongoing events of the experiments. This allows the use of infusion pumps, and the flexibility is such that DBS or TMS can be used alone or combined with other stimuli such as lights, sounds, etc. The developed system automatically controls the DA concentration. The resolution of the system is around 0.4 µmol/L, with a concentration correction time adjustable between 1 and 90 seconds. The system controls DA concentrations between 1 and 10 µmol/L, with an error of about ±0.8 µmol/L. Although designed to control the DA concentration, the system can be used to control the concentration of other substances. The continuation of the closed-loop development with FSCV and DBS (or TMS, or infusion) using parkinsonian animal models is proposed.
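
A minimal sketch of a closed-loop concentration controller of the kind described, assuming a simple proportional law that adjusts the infusion rate from an FSCV-like measurement; the plant model, gain and correction interval below are illustrative assumptions, since the abstract does not specify the actual control law.

```python
# Illustrative sketch of a closed-loop concentration controller: an FSCV-like
# measurement is compared with a setpoint and a simple proportional law adjusts
# the infusion rate. The plant model, the gain and the correction interval are
# assumptions; the actual control law is not given in the abstract.

SETPOINT = 5.0      # target DA concentration [umol/L]
KP = 0.1            # proportional gain (assumed)
DT = 5.0            # correction interval [s] (within the 1-90 s range cited)
CLEARANCE = 0.02    # first-order washout rate [1/s] (assumed plant model)

def plant_step(concentration: float, infusion_rate: float) -> float:
    """Toy plant: infusion raises DA, first-order clearance removes it."""
    return concentration + DT * (infusion_rate - CLEARANCE * concentration)

if __name__ == "__main__":
    concentration = 1.0                           # initial DA [umol/L]
    for step in range(20):
        error = SETPOINT - concentration          # measured by FSCV in the real system
        infusion_rate = max(0.0, KP * error)      # actuator command, never negative
        concentration = plant_step(concentration, infusion_rate)
        print(f"t = {(step + 1) * DT:5.1f} s   DA = {concentration:4.2f} umol/L")
```

A proportional-only loop like this settles slightly below the setpoint because of the clearance term; an integral action, or whatever law the actual system uses, would remove that offset.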

Relevance:

20.00%

Publisher:

Abstract:

In this work, mixed convection heat transfer inside a lid-driven cavity heated from below and filled with a heterogeneous or homogeneous porous medium is presented. In the heterogeneous approach, the solid domain is represented by heat-conducting, equally spaced blocks; the fluid phase surrounds the blocks and is limited by the cavity walls. The homogeneous, or pore-continuum, approach is characterized by the cavity porosity and permeability. Generalized mass, momentum and energy conservation equations are obtained in dimensionless form to represent both the continuum and the pore-continuum models. The numerical solution is obtained via the finite volume method. The QUICK interpolation scheme is used for the numerical treatment of the advection terms and the SIMPLE algorithm is applied for pressure-velocity coupling. Targeting the laminar regime, the flow parameters are kept in the ranges 10² ≤ Re ≤ 10³ and 10³ ≤ Ra ≤ 10⁶ for both the heterogeneous and homogeneous approaches. In the tested configurations for the continuum model, 9, 16, 36 and 64 blocks are considered for each combination of Re and Ra, with the microscopic porosity kept constant at φ = 0.64. For the pore-continuum model, the Darcy number (Da) is set according to the number of blocks in the heterogeneous cavity and to φ. Numerical results of the comparative study between the microscopic and macroscopic approaches are presented. As a result, average Nusselt number correlations for the continuum and pore-continuum models as functions of Ra and Re are obtained.
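
A minimal sketch of the dimensionless groups that govern the problem, Re = ρUL/μ, Ra = gβΔT·L³/(να) and Da = K/L²; the fluid properties and cavity dimensions below are generic assumptions, not those of the study.

```python
# Illustrative sketch of the dimensionless groups governing the problem:
# Re = rho*U*L/mu, Ra = g*beta*dT*L^3/(nu*alpha), Da = K/L^2.
# The fluid properties and cavity dimensions are generic assumptions.

def reynolds(rho: float, lid_velocity: float, length: float, mu: float) -> float:
    return rho * lid_velocity * length / mu

def rayleigh(g: float, beta: float, delta_t: float, length: float,
             nu: float, alpha: float) -> float:
    return g * beta * delta_t * length ** 3 / (nu * alpha)

def darcy(permeability: float, length: float) -> float:
    return permeability / length ** 2

if __name__ == "__main__":
    # Air-like properties and a 0.1 m cavity, assumed for illustration.
    rho, mu = 1.18, 1.85e-5
    nu, alpha, beta, g = mu / rho, 2.2e-5, 3.4e-3, 9.81
    print(f"Re = {reynolds(rho, 0.1, 0.1, mu):.0f}")
    print(f"Ra = {rayleigh(g, beta, 20.0, 0.1, nu, alpha):.2e}")
    print(f"Da = {darcy(1.0e-7, 0.1):.1e}")
```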

Relevance:

20.00%

Publisher:

Abstract:

In this work, a platform for conditioning, digitizing, visualizing and recording EMG signals was developed. After acquisition, the analysis can be carried out with signal processing techniques. The platform consists of two modules which acquire electromyography (EMG) signals through surface electrodes, limit the frequency band of interest, filter out the power grid interference and digitize the signals with the analog-to-digital converter of the modules' microcontroller. The data are then sent to the computer over a USB interface using the HID specification, displayed in real time in graphical form and stored in files. As processing resources, the operations of signal absolute value, effective (RMS) value, Fourier analysis, digital (IIR) filtering and adaptive filtering were implemented. Initial tests of the platform were performed with signals from the lower and upper limbs, with the aim of comparing EMG signal laterality. The open platform is intended for educational activities and academic research, allowing the addition of other processing methods that a researcher may want to evaluate, or other required analyses.
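
A minimal sketch, using NumPy/SciPy, of two of the processing steps mentioned: a power-line notch (IIR) filter and a sliding-window RMS envelope; the sampling rate, notch frequency and window length are assumptions, not the platform's actual parameters.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

# Illustrative sketch of two processing steps mentioned above: a power-line
# notch (IIR) filter and a sliding-window RMS envelope. Sampling rate, notch
# frequency and window length are assumptions, not the platform's parameters.

FS = 2000.0          # sampling rate [Hz], assumed
NOTCH_HZ = 60.0      # power grid frequency [Hz], assumed
WINDOW = 200         # RMS window length in samples (100 ms at 2 kHz)

def remove_powerline(emg: np.ndarray) -> np.ndarray:
    """Zero-phase notch filtering of the raw EMG signal at the grid frequency."""
    b, a = iirnotch(NOTCH_HZ, Q=30.0, fs=FS)
    return filtfilt(b, a, emg)

def rms_envelope(emg: np.ndarray, window: int = WINDOW) -> np.ndarray:
    """Sliding-window RMS (effective value) of the EMG signal."""
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(emg ** 2, kernel, mode="same"))

if __name__ == "__main__":
    t = np.arange(0, 2.0, 1.0 / FS)
    synthetic = np.random.randn(t.size) * (1 + np.sin(2 * np.pi * 1.0 * t))
    synthetic += 0.5 * np.sin(2 * np.pi * NOTCH_HZ * t)   # injected interference
    clean = remove_powerline(synthetic)
    print("peak RMS:", rms_envelope(clean).max())
```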

Relevance:

20.00%

Publisher:

Abstract:

This work proposes to adjust the Notification Oriented Paradigm (NOP) so that it supports fuzzy concepts. NOP is inspired by elements of the imperative and declarative paradigms and seeks to solve some of the drawbacks of both. By decomposing an application into a network of smaller computational entities that are executed only when necessary, NOP eliminates the need to perform unnecessary computations and helps to achieve better logical-causal decoupling, facilitating code reuse and application distribution over multiple processors or machines. In addition, NOP allows logical-causal knowledge to be expressed at a high level of abstraction, through rules in IF-THEN format. Fuzzy systems, in turn, perform logical inferences on causal knowledge bases (IF-THEN rules) that can deal with problems involving uncertainty. Since NOP uses IF-THEN rules in an alternative way, reducing redundant evaluations and providing better decoupling, this research was carried out to identify, propose and evaluate the changes needed for NOP to be used in the development of fuzzy systems. Two fully usable materializations were then created: a C++ framework and a complete programming language (LingPONFuzzy) that provide support for fuzzy inference systems. From there, case studies were created and several tests were conducted in order to validate the proposed solution. The test results show a significant reduction in the number of rules evaluated in comparison with a fuzzy system developed using conventional tools (frameworks), which may represent an improvement in the performance of the applications.
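
A minimal sketch of the kind of fuzzy IF-THEN rule evaluation such a system performs, with triangular membership functions, min as the AND operator and weighted-average defuzzification; this is a generic Python illustration, not the C++ framework or LingPONFuzzy.

```python
# Illustrative sketch of fuzzy IF-THEN rule evaluation: triangular membership
# functions, min as the AND operator and a weighted average for defuzzification.
# Generic illustration only, not the C++ framework or LingPONFuzzy.

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def evaluate(temperature: float, humidity: float) -> float:
    """Two rules: IF temp is high AND humidity is low THEN fan fast (1.0);
    IF temp is low THEN fan slow (0.2). Output by weighted average."""
    temp_high = tri(temperature, 25.0, 35.0, 45.0)
    temp_low = tri(temperature, 5.0, 15.0, 25.0)
    hum_low = tri(humidity, 0.0, 20.0, 40.0)

    rules = [
        (min(temp_high, hum_low), 1.0),   # (firing strength, consequent value)
        (temp_low, 0.2),
    ]
    total = sum(strength for strength, _ in rules)
    if total == 0.0:
        return 0.0
    return sum(strength * out for strength, out in rules) / total

if __name__ == "__main__":
    print(evaluate(temperature=32.0, humidity=15.0))   # mostly "fan fast"
    print(evaluate(temperature=12.0, humidity=50.0))   # mostly "fan slow"
```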