899 results for the SIMPLE algorithm


Relevance: 90.00%

Abstract:

In the present work, the benefits of using graphics processing units (GPU) to aid the design of complex geometry profile extrusion dies are studied. For that purpose, a 3D finite volume based code that employs unstructured meshes to solve and couple the continuity, momentum and energy conservation equations governing the fluid flow, together with a constitutive equation, was used. To evaluate the possibility of reducing the time spent on the numerical calculations, the code was parallelized on the GPU using a simple programming approach without complex memory manipulations. For verification purposes, simulations were performed for three benchmark problems: Poiseuille flow, lid-driven cavity flow and flow around a cylinder. Subsequently, the code was used in the design of two real-life extrusion dies for the production of a medical catheter and a wood-plastic composite decking profile. To evaluate the benefits, the results obtained with the GPU-parallelized code were compared, in terms of speedup, with a serial implementation of the same code that traditionally runs on the central processing unit (CPU). The results show that, even with the simple parallelization approach employed, a significant reduction in computation times was obtained.
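The reported speedups come from moving the heavy, data-parallel sweeps of the flow solver onto the GPU with a simple programming style. As a rough illustration of that idea only (not the authors' unstructured-mesh finite volume code), the sketch below writes a Jacobi-style pressure solve on a structured grid as whole-array operations; swapping the `numpy` import for `cupy` would run the same sweeps on a GPU without any explicit memory management.

```python
import numpy as np  # replace with "import cupy as np" to offload the same sweeps to a GPU

def jacobi_poisson(rhs, h, iterations=500):
    """Illustrative Jacobi solve of a 2D Poisson equation on a structured grid.

    The real solver in the paper is an unstructured finite volume code; this
    stand-in only shows the whole-array, data-parallel style that ports
    directly to a GPU array library.
    """
    p = np.zeros_like(rhs)
    for _ in range(iterations):
        p_new = p.copy()
        # Average of the four neighbours minus the source term (interior cells only).
        p_new[1:-1, 1:-1] = 0.25 * (
            p[2:, 1:-1] + p[:-2, 1:-1] + p[1:-1, 2:] + p[1:-1, :-2]
            - h * h * rhs[1:-1, 1:-1]
        )
        p = p_new
    return p

# Toy usage: a point source in the middle of a 64x64 grid.
rhs = np.zeros((64, 64))
rhs[32, 32] = 1.0
pressure = jacobi_poisson(rhs, h=1.0 / 63)
print(float(pressure.max()))
```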

Relevance: 90.00%

Abstract:

The monitoring data collected during tunnel excavation can be used in inverse analysis procedures to identify more realistic geomechanical parameters and thus improve knowledge of the formations of interest. These more realistic parameters can be used in real time to adapt the design to the actual in situ behaviour of the structure. However, monitoring plans are normally designed for safety assessment and not specifically for the purpose of inverse analysis. In fact, there is a lack of knowledge about what types and quantities of measurements are needed for the parameters of interest to be identified successfully. The optimisation algorithm chosen for the identification procedure may also be important in this respect. In this work, the problem is addressed using a theoretical case for which a thorough parametric study was carried out with two optimisation algorithms based on different calculation paradigms, namely a conventional gradient-based algorithm and an evolution strategy algorithm. Calculations were carried out for different sets of parameters to be identified and for several combinations of types and amounts of monitoring data. The results clearly show the high importance of the available monitoring data and of the chosen algorithm for the success rate of the inverse analysis process.
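To make the comparison of the two optimisation paradigms concrete, the sketch below identifies two parameters of a hypothetical forward model from synthetic "monitoring" data, once with a gradient-based optimiser (SciPy's L-BFGS-B) and once with a bare-bones (mu + lambda) evolution strategy. The forward model, parameter names and noise level are placeholders, not the geomechanical model used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def forward(params, x):
    """Hypothetical forward model standing in for the tunnel simulation."""
    E, k = params
    return k * np.exp(-x / E)

x_obs = np.linspace(1.0, 10.0, 8)                       # measurement locations
y_obs = forward((5.0, 2.0), x_obs) + rng.normal(0, 0.01, x_obs.size)

def misfit(params):
    return float(np.sum((forward(params, x_obs) - y_obs) ** 2))

# 1) Gradient-based identification.
grad_fit = minimize(misfit, x0=[1.0, 1.0], method="L-BFGS-B",
                    bounds=[(0.1, 20.0), (0.1, 10.0)])

# 2) Minimal (mu + lambda) evolution strategy.
def evolution_strategy(mu=5, lam=20, sigma=0.5, generations=100):
    pop = rng.uniform(0.1, 10.0, size=(mu, 2))
    for _ in range(generations):
        parents = pop[rng.integers(0, mu, lam)]
        children = np.clip(parents + rng.normal(0, sigma, parents.shape), 0.1, 20.0)
        both = np.vstack([pop, children])
        both = both[np.argsort([misfit(p) for p in both])]   # keep the mu best
        pop = both[:mu]
    return pop[0]

print("gradient-based:     ", grad_fit.x)
print("evolution strategy: ", evolution_strategy())
```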

Relevance: 90.00%

Abstract:

The building sector is one of Europe's main energy consumers, making buildings an important target for wiser energy use, improving indoor comfort conditions and reducing energy consumption. To achieve the European Union targets for energy consumption and carbon reduction, it is crucial to act not only on new buildings but also on existing ones, which constitute the majority of the building stock. In existing buildings, significantly improving efficiency requires substantial investment. Costs are therefore a major concern in the decision-making process, and analysing the cost-effectiveness of the interventions is an important guide for selecting among the different renovation scenarios. The Portuguese thermal legislation adopts the simple payback method to calculate the time to return on the investment. However, unlike a life cycle cost analysis, this method does not take into account inflation, cash flows, the cost of capital, future energy costs or the lifetime of building elements. In order to understand the impact of the economic analysis method on the choice of renovation measures, a case study was analysed using both simple payback calculations and life cycle cost analysis. Overall, the results show that the simple payback calculations point to less far-reaching renovation measures, which may lead to solutions that are less cost-effective in the long run.
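The difference between the two methods is easy to show numerically. The sketch below compares the simple payback period with a discounted (life-cycle-cost style) payback for an illustrative renovation measure; the investment, annual saving, discount rate and energy price escalation are made-up figures, not values from the case study.

```python
def simple_payback(investment, annual_saving):
    """Years to recover the investment, ignoring discounting and price changes."""
    return investment / annual_saving

def discounted_payback(investment, annual_saving, discount_rate,
                       energy_escalation, horizon=50):
    """Years until cumulative discounted savings cover the investment.

    Savings grow with the energy price escalation rate and are discounted
    back to present value, as in a life cycle cost analysis.
    """
    cumulative = 0.0
    for year in range(1, horizon + 1):
        saving = annual_saving * (1 + energy_escalation) ** (year - 1)
        cumulative += saving / (1 + discount_rate) ** year
        if cumulative >= investment:
            return year
    return None  # not recovered within the horizon

# Illustrative renovation measure: 10 000 EUR invested, 800 EUR/year saved.
print(simple_payback(10_000, 800))                               # 12.5 years
print(discounted_payback(10_000, 800, discount_rate=0.04,
                         energy_escalation=0.02))                # later than 12.5 years
```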

Relevance: 90.00%

Abstract:

In this paper, we propose an extension of the firefly algorithm (FA) to multi-objective optimization. FA is a swarm intelligence optimization algorithm, inspired by the flashing behavior of fireflies at night, that is capable of computing global solutions to continuous optimization problems. Our proposal relies on a fitness assignment scheme that gives lower fitness values to the positions of fireflies that correspond to non-dominated points with a smaller aggregation of objective function distances to the minimum values. Furthermore, FA randomness is based on the spread metric to reduce the gaps between consecutive non-dominated solutions. Results from the preliminary computational experiments show that our proposal gives a dense and well-distributed approximate Pareto front with a large number of points.
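A core ingredient of the proposal is ranking firefly positions by Pareto dominance and, among non-dominated points, by how close their objective values sit to the per-objective minima. The sketch below is a loose reading of that idea on a toy bi-objective problem; it is not the authors' exact fitness assignment or their spread-based randomisation, and the penalty constant and test functions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def objectives(x):
    """Toy bi-objective minimisation problem (a small ZDT-like pair of functions)."""
    f1 = x[:, 0]
    f2 = 1.0 + x[:, 1] - np.sqrt(np.clip(x[:, 0], 0.0, None))
    return np.column_stack([f1, f2])

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return np.all(a <= b) and np.any(a < b)

def assign_fitness(F):
    """Lower fitness is better: points are scored by the aggregated distance of
    their objectives to the per-objective minima; dominated points get a
    constant penalty on top of that."""
    mins = F.min(axis=0)
    n = len(F)
    fitness = np.empty(n)
    for i in range(n):
        dominated = any(dominates(F[j], F[i]) for j in range(n) if j != i)
        fitness[i] = np.sum(F[i] - mins) + (10.0 if dominated else 0.0)
    return fitness

positions = rng.uniform(0.0, 1.0, size=(30, 2))   # firefly positions
fit = assign_fitness(objectives(positions))
print(positions[np.argsort(fit)[:5]])             # the five best-ranked fireflies
```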

Relevance: 90.00%

Abstract:

This paper presents a single-phase Series Active Power Filter (Series APF) for mitigation of the load voltage harmonic content while keeping the DC-side voltage regulated without the support of a voltage source. The proposed series active power filter control algorithm eliminates the additional voltage source for regulating the DC voltage, and the adopted topology does not use a coupling transformer to interface the series active power filter with the electrical power grid. The paper describes the control strategy, which encompasses the grid synchronization scheme, the compensation voltage calculation, the damping algorithm and the dead-time compensation. The topology and control strategy of the series active power filter were evaluated in simulation software, and simulation results are presented. Experimental results, obtained with a laboratory prototype, validate the theoretical assumptions and are within the harmonic spectrum limits imposed by the international recommendations of the IEEE 519 standard.
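The central calculation in a series APF is the compensation voltage: the converter injects, in series with the grid, whatever is needed so that the load sees only the fundamental component. As a rough, offline illustration of that one step (not the paper's synchronisation, damping or dead-time compensation logic), the sketch below extracts the fundamental of a distorted grid voltage with an FFT and takes the harmonic residue as the compensation reference; the waveform and harmonic amplitudes are invented.

```python
import numpy as np

f_grid = 50.0                      # fundamental frequency [Hz]
fs = 10_000.0                      # sampling frequency [Hz]
t = np.arange(0, 0.2, 1 / fs)      # ten fundamental cycles

# Distorted grid voltage: fundamental plus 5th and 7th harmonics (per-unit).
v_grid = (np.sin(2 * np.pi * f_grid * t)
          + 0.08 * np.sin(2 * np.pi * 5 * f_grid * t)
          + 0.05 * np.sin(2 * np.pi * 7 * f_grid * t))

# Extract the fundamental with an FFT (keep only the 50 Hz bin).
spectrum = np.fft.rfft(v_grid)
freqs = np.fft.rfftfreq(v_grid.size, d=1 / fs)
fundamental_only = np.zeros_like(spectrum)
k = np.argmin(np.abs(freqs - f_grid))
fundamental_only[k] = spectrum[k]
v_fundamental = np.fft.irfft(fundamental_only, n=v_grid.size)

# Compensation reference: the series converter injects the harmonic residue
# so that the load voltage equals the fundamental component.
v_compensation = v_fundamental - v_grid
v_load = v_grid + v_compensation   # what the load would see: the fundamental only

print("compensation reference RMS (p.u.):",
      round(float(np.sqrt(np.mean(v_compensation ** 2))), 4))
```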

Relevance: 90.00%

Abstract:

Integrated Master's dissertation in Civil Engineering

Relevance: 90.00%

Abstract:

Integrated Master's dissertation in Civil Engineering

Relevance: 90.00%

Abstract:

Background: The equations for predicting maximal oxygen uptake (VO2max or VO2peak) currently used in cardiopulmonary exercise testing (CPET) software in Brazil have not been adequately validated. These equations are very important for the diagnostic capacity of the method. Objective: To build and validate a Brazilian Equation (BE) for the prediction of VO2peak and compare it with the equation cited by Jones (JE) and the Wasserman algorithm (WA). Methods: Treadmill CPET (breath-by-breath) was performed on 3,119 individuals. The equation construction group (CG) consisted of 2,495 healthy participants; the other 624 individuals were allocated to the external validation group (EVG). The BE, derived from a multivariate regression model, considered age, gender, body mass index (BMI) and physical activity level. The same equation was also tested in the EVG. Scatter plots and Bland-Altman analyses were produced. Results: In the CG, the mean age was 42.6 years, 51.5% were male, the average BMI was 27.2, and the distribution of physical activity levels was 51.3% sedentary, 44.4% active and 4.3% athletes. A strong correlation between the BE and the CPET-measured VO2peak was observed (0.807). In contrast, the average VO2peak predicted by the JE and the WA differed both from the CPET-measured VO2peak and from the BE prediction (p = 0.001). Conclusion: The BE yields VO2peak values close to those directly measured by CPET, whereas the Jones and Wasserman predictions differ significantly from the measured VO2peak.
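The abstract does not list the BE coefficients, so the sketch below only reproduces the kind of agreement analysis reported: a Bland-Altman comparison between directly measured VO2peak values and predictions from a hypothetical linear equation in age, sex, BMI and activity level. All data and coefficients are synthetic placeholders, not the study's cohort or equation.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200

# Synthetic cohort (placeholders, not the study data).
age = rng.uniform(20, 70, n)
male = rng.integers(0, 2, n)            # 1 = male, 0 = female
bmi = rng.uniform(19, 35, n)
active = rng.integers(0, 2, n)          # 1 = physically active

# "Measured" VO2peak and a hypothetical linear predictor in the same variables.
vo2_measured = 50 - 0.3 * age + 6 * male - 0.4 * bmi + 4 * active + rng.normal(0, 3, n)
vo2_predicted = 49 - 0.29 * age + 6.2 * male - 0.38 * bmi + 3.8 * active

# Bland-Altman statistics: bias and 95% limits of agreement.
diff = vo2_predicted - vo2_measured
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.2f} mL/kg/min, "
      f"limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]")
```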

Relevance: 90.00%

Abstract:

Background: Vascular remodeling, the dynamic dimensional change of the vessel in the face of stress, can assume different directions as well as magnitudes in atherosclerotic disease. Classical measurements rely on reference segments at a distance, risking inappropriate comparison between dissimilar vessel portions. Objective: To explore a new method for quantifying vessel remodeling, based on the comparison between a given target segment and its inferred normal dimensions. Methods: Geometric parameters and plaque composition were determined in 67 patients using three-vessel intravascular ultrasound with virtual histology (IVUS-VH). Coronary vessel remodeling at the cross-section (n = 27,639) and lesion (n = 618) levels was assessed using classical metrics and a novel analytic algorithm based on the fractional vessel remodeling index (FVRI), which quantifies the total change in arterial wall dimensions relative to the estimated normal dimension of the vessel. A prediction model was built to estimate the normal dimension of the vessel for calculation of the FVRI. Results: According to the new algorithm, the “Ectatic” remodeling pattern was least common, “Complete compensatory” remodeling was present in approximately half of the instances, and “Negative” and “Incomplete compensatory” remodeling types were detected in the remaining cases. Compared to a traditional diagnostic scheme, the FVRI-based classification appeared to better discriminate plaque composition by IVUS-VH. Conclusion: Quantitative assessment of coronary remodeling using target segment dimensions offers a promising approach to evaluating the vessel response to plaque growth or regression.
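The abstract describes the FVRI only qualitatively, as the change in vessel dimensions relative to the estimated normal dimension at the same cross-section. The sketch below encodes that reading as a simple ratio and contrasts it with the classical remodeling index taken against a reference segment; the ratio definition, the classification cut-offs and the example areas are assumptions for illustration, not the authors' published formulas.

```python
def classical_remodeling_index(lesion_eem_area, reference_eem_area):
    """Classical metric: lesion vessel (EEM) area over a reference segment's area."""
    return lesion_eem_area / reference_eem_area

def fractional_vessel_remodeling_index(lesion_eem_area, predicted_normal_area):
    """FVRI-style metric (illustrative reading): fractional change of the vessel
    area relative to its estimated normal dimension at the same cross-section."""
    return (lesion_eem_area - predicted_normal_area) / predicted_normal_area

def classify(fvri, expected_plaque_fraction):
    """Toy classification into the four patterns named in the abstract.

    The cut-offs are placeholders; the paper derives them from its own model.
    """
    if fvri >= expected_plaque_fraction:
        return "Ectatic" if fvri > 2 * expected_plaque_fraction else "Complete compensatory"
    return "Negative" if fvri < 0 else "Incomplete compensatory"

# Example cross-section: 16 mm^2 vessel, 14 mm^2 predicted normal, plaque burden 10%.
fvri = fractional_vessel_remodeling_index(16.0, 14.0)
print(round(fvri, 3), classify(fvri, expected_plaque_fraction=0.10))
```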

Relevance: 90.00%

Abstract:

In general terms, key sector analysis aims at identifying the role, or impact, that the existence of a productive sector has in the economy. Quite a few measures, indicators and methodologies of varied complexity have been proposed in the literature, from multiplier sums to extraction methods, though not without debate about their properties and information content. All of them, to our knowledge, focus exclusively on the interdependence effects that result from the input-output structure of the economy. In doing so, the simple input-output approach misses critical links beyond the interindustry ones. A productive sector's role is that of producing, but also that of generating and distributing income among primary factors as a result of production. Thus, when measuring a sector's role, the income-generating process cannot and should not be omitted if we want to better elucidate the sector's economic role. A simple way to make the missing income link explicit is to use the SAM (Social Accounting Matrix).
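The point about missing income links can be shown with a toy example: output multipliers from the Leontief inverse of a small interindustry matrix versus multipliers from a SAM that also endogenises the household account, so that factor income and household spending feed back into demand. The two-sector coefficients below are invented for illustration.

```python
import numpy as np

# Toy 2-sector technical coefficients matrix A (interindustry links only).
A = np.array([[0.20, 0.30],
              [0.25, 0.10]])

# Value-added (factor income) coefficients and household spending shares,
# used to endogenise the household account in the SAM.
value_added = np.array([0.40, 0.45])      # income generated per unit of output
consumption = np.array([0.50, 0.30])      # household spending per unit of income

# Input-output multipliers: column sums of the Leontief inverse (I - A)^-1.
leontief_multipliers = np.linalg.inv(np.eye(2) - A).sum(axis=0)

# SAM coefficient matrix with endogenous accounts [sector 1, sector 2, households].
S = np.zeros((3, 3))
S[:2, :2] = A
S[2, :2] = value_added          # sectors pay income to households
S[:2, 2] = consumption          # households spend income on sector output
sam_multipliers = np.linalg.inv(np.eye(3) - S).sum(axis=0)[:2]

print("input-output multipliers:", leontief_multipliers.round(3))
print("SAM multipliers:         ", sam_multipliers.round(3))   # larger: income feedback included
```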

Relevance: 90.00%

Abstract:

In this paper we explore the effect of bounded rationality on the convergence of individual behavior toward equilibrium. In the context of a Cournot game with a unique and symmetric Nash equilibrium, firms are modeled as adaptive economic agents through a genetic algorithm. Computational experiments show that (1) there is remarkable heterogeneity across identical but boundedly rational agents; (2) such individual heterogeneity is not simply a consequence of the random elements contained in the genetic algorithm; (3) the more rational agents are, in terms of memory abilities and pre-play evaluation of strategies, the less heterogeneous they are in their actions. In the limiting case of full rationality, the outcome converges to the standard result of uniform individual behavior.
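As a minimal illustration of the modelling device (not the paper's GA, whose memory and pre-play evaluation features are richer), the sketch below lets two Cournot firms evolve output choices with a basic genetic algorithm on a linear-demand, constant-cost market, and prints how the surviving quantities scatter around the symmetric Nash output. The demand and cost parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
a, b, c = 100.0, 1.0, 10.0            # inverse demand P = a - b*Q, constant unit cost c
q_nash = (a - c) / (3 * b)            # symmetric Cournot-Nash quantity (here 30)

def profit(q_own, q_rival):
    price = max(a - b * (q_own + q_rival), 0.0)
    return (price - c) * q_own

def evolve(generations=200, pop_size=20, mutation=1.0):
    # Each firm carries a population of candidate quantities (its "ideas").
    pops = [rng.uniform(0, a / b, pop_size) for _ in range(2)]
    for _ in range(generations):
        played = [pop.mean() for pop in pops]   # each firm plays an aggregate of its ideas
        for i in range(2):
            rival = played[1 - i]
            fitness = np.array([profit(q, rival) for q in pops[i]])
            # Selection: keep the better half, refill with mutated copies.
            keep = pops[i][np.argsort(fitness)[::-1][: pop_size // 2]]
            children = keep + rng.normal(0, mutation, keep.size)
            pops[i] = np.clip(np.concatenate([keep, children]), 0, a / b)
    return pops

for i, pop in enumerate(evolve()):
    print(f"firm {i}: mean quantity {pop.mean():.2f} "
          f"(Nash {q_nash:.1f}), within-firm spread {pop.std():.2f}")
```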

Relevance: 90.00%

Abstract:

"Es tracta d'un projecte dividit en dues parts independents però complementàries, realitzades per autors diferents. Aquest document conté originàriament altre material i/o programari només consultable a la Biblioteca de Ciència i Tecnologia"

Relevance: 90.00%

Abstract:

An aqueous solution of the latex of Euphorbia tirucalli collected at sites receiving large amounts of sunlight showed molluscicidal action against Biomphalaria glabrata, with an LD50 at a concentration of 28.0 ppm and an LD90 at a concentration of 85.0 ppm. The toxicity of the product to fish was similar to that of Bayluscide and of copper sulfate, used for comparison. However, the wide distribution of the plant, its easy propagation and the simple procedure for extracting the active substance, which is biodegradable, favor "avelós" as a promising agent in the control of schistosomiasis.

Relevance: 90.00%

Abstract:

This project aims at the performance analysis of low-cost RISC processors and the design of a simple RISC processor for general-purpose applications related to data acquisition and simple data processing. As a result, the 32-bit SR3C processor with a RISC architecture is presented. The processor was described and simulated using the VHDL hardware description language and synthesised on an FPGA. It is ready to be used in real SoCs thanks to its compliance with the Wishbone bus standard. It can also be used as an educational platform thanks to the assembler and simulator that were developed.
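The SR3C instruction set is not detailed in the abstract, so the sketch below is only a generic, hypothetical illustration of the kind of instruction-set simulator that accompanies such a processor: a register file, a tiny three-operand encoding and a fetch-decode-execute loop. None of the opcodes or field widths are taken from the SR3C.

```python
# Hypothetical 32-bit encoding: opcode (8 bits) | rd (8) | rs1 (8) | rs2/imm (8).
ADD, ADDI, HALT = 0x01, 0x02, 0xFF

def encode(op, rd=0, rs1=0, rs2=0):
    return (op << 24) | (rd << 16) | (rs1 << 8) | rs2

def run(program):
    regs = [0] * 16          # register file
    pc = 0                   # program counter (word-addressed)
    while True:
        instr = program[pc]
        op = (instr >> 24) & 0xFF
        rd = (instr >> 16) & 0xFF
        rs1 = (instr >> 8) & 0xFF
        rs2 = instr & 0xFF
        pc += 1
        if op == ADD:
            regs[rd] = (regs[rs1] + regs[rs2]) & 0xFFFFFFFF
        elif op == ADDI:
            regs[rd] = (regs[rs1] + rs2) & 0xFFFFFFFF
        elif op == HALT:
            return regs
        else:
            raise ValueError(f"unknown opcode {op:#x}")

# r1 = 5, r2 = 7, r3 = r1 + r2
program = [encode(ADDI, 1, 0, 5), encode(ADDI, 2, 0, 7),
           encode(ADD, 3, 1, 2), encode(HALT)]
print(run(program)[3])   # 12
```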

Relevance: 90.00%

Abstract:

Waveform tomographic imaging of crosshole georadar data is a powerful method to investigate the shallow subsurface because of its ability to provide images of pertinent petrophysical parameters with extremely high spatial resolution. All current crosshole georadar waveform inversion strategies are based on the assumption of frequency-independent electromagnetic constitutive parameters. In reality, however, these parameters are known to be frequency-dependent and complex, and thus recorded georadar data may show significant dispersive behavior. In this paper, we synthetically evaluate the reconstruction limits of a recently published crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. Our results indicate that, when combined with a source wavelet estimation procedure that provides a means of partially accounting for the frequency-dependent effects through an "effective" wavelet, the inversion algorithm performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
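Dielectric dispersion here means that the effective permittivity, and hence the wave velocity and attenuation, of the subsurface varies with frequency over the georadar band. As one common way to picture that behaviour (illustrative only; the parameter values are not taken from the paper's synthetic tests), the sketch below evaluates a single-relaxation Debye model across a typical 10-200 MHz band.

```python
import numpy as np

eps_inf = 5.0               # high-frequency relative permittivity
eps_s = 25.0                # static relative permittivity
tau = 5e-9                  # relaxation time [s]

freq = np.linspace(10e6, 200e6, 5)          # typical crosshole georadar band [Hz]
omega = 2 * np.pi * freq

# Single-relaxation Debye model of a dispersive dielectric.
eps_r = eps_inf + (eps_s - eps_inf) / (1 + 1j * omega * tau)

# Phase velocity of the electromagnetic wave (lossless approximation using Re(eps)).
c0 = 299_792_458.0
velocity = c0 / np.sqrt(eps_r.real)

for f, e, v in zip(freq, eps_r, velocity):
    print(f"{f / 1e6:6.1f} MHz  eps' = {e.real:5.2f}  "
          f"eps'' = {-e.imag:5.2f}  v = {v / 1e6:6.1f} m/us")
```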