862 results for Computer simulation software


Relevance:

90.00%

Publisher:

Abstract:

The problem of projecting multidimensional data into lower dimensions has been pursued by many researchers due to its potential application to data analyses of various kinds. This paper presents a novel multidimensional projection technique based on least square approximations. The approximations compute the coordinates of a set of projected points based on the coordinates of a reduced number of control points with defined geometry. We name the technique Least Square Projections (LSP). From an initial projection of the control points, LSP defines the positioning of their neighboring points through a numerical solution that aims at preserving a similarity relationship between the points, given by a metric in the original multidimensional (mD) space. In order to perform the projection, a small number of distance calculations are necessary, and no repositioning of the points is required to obtain a final solution with satisfactory precision. The results show the capability of the technique to form groups of points by degree of similarity in 2D. We illustrate that capability through its application to mapping collections of textual documents from varied sources, a strategic yet difficult application. LSP is faster and more accurate than other existing high-quality methods, particularly where it was mostly tested, that is, for mapping text sets.
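
To make the least squares idea concrete, the sketch below positions every projected point at the weighted average of its neighbors in 2D while pinning a few control points, and solves the resulting system in a least squares sense. It is a simplification, not the authors' exact formulation; the function name `lsp_project` and the row-normalized weight matrix are assumptions.

```python
import numpy as np

def lsp_project(weights, control_idx, control_pos):
    """Sketch of an LSP-style least squares projection (illustrative only).

    weights      : (n, n) row-normalized neighborhood weights derived in mD
    control_idx  : indices of control points with known 2D coordinates
    control_pos  : (len(control_idx), 2) fixed 2D positions of control points
    """
    n = weights.shape[0]
    # Laplacian-like block: each point should sit at the weighted
    # average of its neighbors in the projected (2D) space.
    L = np.eye(n) - weights
    # Extra rows that pin the control points to their given positions.
    C = np.zeros((len(control_idx), n))
    C[np.arange(len(control_idx)), control_idx] = 1.0
    A = np.vstack([L, C])
    b = np.vstack([np.zeros((n, 2)), control_pos])
    # One least squares solve covers both projected coordinates.
    coords, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coords
```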

Relevance:

90.00%

Publisher:

Abstract:

Existence of positive solutions for a fourth order equation with nonlinear boundary conditions, which models deformations of beams on elastic supports, is considered using fixed point theorems in cones of ordered Banach spaces. Iterative and numerical solutions are also considered. (C) 2010 IMACS. Published by Elsevier B.V. All rights reserved.
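
As an illustration of the iterative numerical solutions mentioned, the sketch below applies a Picard-type fixed-point iteration to a model problem u'''' = f(x, u) on [0, 1] with simply supported ends; the paper's nonlinear boundary conditions and cone-theoretic arguments are not reproduced, and the discretization choices are assumptions.

```python
import numpy as np

def solve_beam_fixed_point(f, n=200, tol=1e-10, max_iter=100):
    """Illustrative Picard iteration for u'''' = f(x, u) on [0, 1] with
    simply supported ends (u = u'' = 0). Not the authors' scheme; the
    nonlinear boundary conditions of the paper would replace these rows.
    """
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)            # interior nodes only
    # Standard 5-point stencil for the fourth derivative.
    D4 = (np.diag(6 * np.ones(n))
          + np.diag(-4 * np.ones(n - 1), 1) + np.diag(-4 * np.ones(n - 1), -1)
          + np.diag(np.ones(n - 2), 2) + np.diag(np.ones(n - 2), -2)) / h**4
    # Simply supported ends: the ghost value satisfies u[-1] = -u[1],
    # which turns the first/last diagonal entry into 5/h^4.
    D4[0, 0] = 5 / h**4
    D4[-1, -1] = 5 / h**4
    u = np.zeros(n)
    for _ in range(max_iter):
        u_new = np.linalg.solve(D4, f(x, u))   # linear solve with u frozen
        if np.max(np.abs(u_new - u)) < tol:
            break
        u = u_new
    return x, u
```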

Relevance:

90.00%

Publisher:

Abstract:

In the context of the normalized variable formulation (NVF) of Leonard and the total variation diminishing (TVD) constraints of Harten, this paper presents an extension of a previous work by the authors for solving unsteady incompressible flow problems. The main contributions of the paper are threefold. First, it presents the results of the development and implementation of a bounded high order upwind adaptive QUICKEST scheme in the 3D robust code (Freeflow), for the numerical solution of the full incompressible Navier-Stokes equations. Second, it reports numerical simulation results for the 1D shock tube problem, a 2D impinging jet and 2D/3D broken dam flows, and compares these results with existing analytical and experimental data. Third, it presents the application of the numerical method to solving 3D free surface flow problems. (C) 2007 IMACS. Published by Elsevier B.V. All rights reserved.
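
For readers unfamiliar with bounded upwind schemes, the sketch below shows a minimal 1D TVD update for linear advection using a minmod-limited, upwind-biased reconstruction. It only illustrates the NVF/TVD boundedness idea, not the adaptive QUICKEST scheme implemented in Freeflow, and it assumes a positive constant velocity with periodic boundaries.

```python
import numpy as np

def advect_tvd(u, c):
    """One explicit step of 1D linear advection u_t + a u_x = 0 (a > 0) with a
    minmod-limited, upwind-biased TVD scheme; c = a*dt/dx is the CFL number
    (stable for 0 < c <= 1). Illustrative only.
    """
    def minmod(a, b):
        return np.where(a * b > 0,
                        np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

    up = np.roll(u, 1)        # u[i-1]
    upp = np.roll(u, 2)       # u[i-2]
    un = np.roll(u, -1)       # u[i+1]
    # Limited slopes (periodic boundaries for simplicity).
    slope = minmod(u - up, un - u)            # slope in cell i
    slope_up = minmod(up - upp, u - up)       # slope in cell i-1
    # Face values reconstructed from the upwind cell.
    face_minus = up + 0.5 * (1 - c) * slope_up   # at i-1/2
    face_plus = u + 0.5 * (1 - c) * slope        # at i+1/2
    return u - c * (face_plus - face_minus)
```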

Relevance:

90.00%

Publisher:

Abstract:

Reusable and evolvable Software Engineering Environments (SEEs) are essential to software production and have increasingly become a need. From another perspective, software architectures and reference architectures have played a significant role in determining the success of software systems. In this paper we present a reference architecture for SEEs, named RefASSET, which is based on concepts coming from the aspect-oriented approach. This architecture is specialized to the software testing domain and the development of tools for that domain is discussed. This and other case studies have pointed out that the use of aspects in RefASSET provides a better separation of concerns, resulting in reusable and evolvable SEEs. (C) 2011 Elsevier Inc. All rights reserved.
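
As a toy illustration of the separation-of-concerns argument (not RefASSET itself, which is an architecture rather than code), the sketch below factors a crosscutting logging concern out of a testing tool, with a Python decorator playing the role of an aspect.

```python
import functools

def logging_aspect(func):
    """Toy 'aspect': a crosscutting logging concern woven around tool
    operations instead of being scattered through every tool. Illustrative
    of the separation-of-concerns idea only.
    """
    @functools.wraps(func)
    def advice(*args, **kwargs):
        print(f"[aspect] entering {func.__name__}")
        result = func(*args, **kwargs)
        print(f"[aspect] leaving {func.__name__}")
        return result
    return advice

@logging_aspect
def run_test_suite(suite_name):
    # Core testing-tool concern stays free of logging code.
    return f"executed {suite_name}"

print(run_test_suite("unit-tests"))
```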

Relevance:

90.00%

Publisher:

Abstract:

Component-based software engineering has recently emerged as a promising solution to the development of system-level software. Unfortunately, current approaches are limited to specific platforms and domains. This lack of generality is particularly problematic as it prevents knowledge sharing and generally drives development costs up. In the past, we developed a generic approach to component-based software engineering for system-level software called OpenCom. In this paper, we present OpenComL, an instantiation of OpenCom for Linux environments, and show how it can be profiled to meet the needs of a range of system-level software in Linux environments. For this, we demonstrate its application to constructing a programmable router platform and a middleware for parallel environments.
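
The sketch below illustrates, in very reduced form, the component/interface/receptacle/binding vocabulary that component models of this kind build on; the class and function names are illustrative and do not correspond to the OpenComL API.

```python
class Component:
    """Toy component with named interfaces (provided) and receptacles
    (required). Illustrates the general component/binding idea behind
    models such as OpenCom; it is not the OpenComL API.
    """
    def __init__(self, name):
        self.name = name
        self.interfaces = {}     # name -> provided implementation
        self.receptacles = {}    # name -> bound interface (or None)

    def provide(self, iface_name, impl):
        self.interfaces[iface_name] = impl

    def require(self, recept_name):
        self.receptacles[recept_name] = None

def bind(user, recept_name, provider, iface_name):
    """Connect a receptacle of `user` to an interface of `provider`."""
    user.receptacles[recept_name] = provider.interfaces[iface_name]

# Example: a forwarding component of a toy router bound to a packet filter.
fwd = Component("forwarder")
fwd.require("classify")
flt = Component("filter")
flt.provide("classify", lambda pkt: pkt.get("ok", False))
bind(fwd, "classify", flt, "classify")
print(fwd.receptacles["classify"]({"ok": True}))   # True
```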

Relevance:

90.00%

Publisher:

Abstract:

Large-scale simulations of parts of the brain using detailed neuronal models to improve our understanding of brain functions are becoming a reality with the use of supercomputers and large clusters. However, the high acquisition and maintenance cost of these computers, including the physical space, air conditioning, and electrical power, limits the number of simulations of this kind that scientists can perform. Modern commodity graphics cards, based on the CUDA platform, contain graphical processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model each neuron. Communication among neurons located in different GPUs is coordinated by the CPU. We obtained speedups of 40 for the simulation of 200k neurons that received random external input, and speedups of 9 for a network with 200k neurons and 20M neuronal connections, on a single computer with two graphics boards with two GPUs each, when compared with a modern quad-core CPU. Copyright (C) 2010 John Wiley & Sons, Ltd.
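
The per-neuron work assigned to each CUDA thread amounts to integrating the Hodgkin-Huxley equations. The sketch below shows one explicit-Euler step for a whole batch of neurons with the standard squid-axon parameters; it illustrates the model being solved, not the authors' CUDA kernel, and the step size and integration scheme are assumptions.

```python
import numpy as np

def hh_step(V, m, h, n, I_ext, dt=0.01):
    """One explicit-Euler step of the classic Hodgkin-Huxley equations for a
    batch of neurons (each array element plays the role of one CUDA thread in
    the paper's algorithm). Units: mV, ms, uA/cm^2; standard parameters.
    """
    gNa, ENa = 120.0, 50.0       # mS/cm^2, mV
    gK,  EK  = 36.0, -77.0
    gL,  EL  = 0.3,  -54.387
    Cm = 1.0                     # uF/cm^2

    # Gating-variable rate functions (classic formulation).
    a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
    a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)

    I_ion = (gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL))
    V = V + dt * (I_ext - I_ion) / Cm
    m = m + dt * (a_m * (1 - m) - b_m * m)
    h = h + dt * (a_h * (1 - h) - b_h * h)
    n = n + dt * (a_n * (1 - n) - b_n * n)
    return V, m, h, n
```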

Relevance:

90.00%

Publisher:

Abstract:

In this paper, a novel statistical test is introduced to compare two locally stationary time series. The proposed approach is a Wald test considering time-varying autoregressive modeling and function projections in adequate spaces. The covariance structure of the innovations may also be time-varying. In order to obtain function estimators for the time-varying autoregressive parameters, we consider function expansions in spline and wavelet bases. Simulation studies provide evidence that the proposed test has a good performance. We also assess its usefulness when applied to a financial time series.
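
A reduced version of the test idea is sketched below: the time-varying AR(1) coefficient of each series is expanded in a basis (a simple polynomial basis here, instead of the splines and wavelets used in the paper) and a Wald statistic compares the two coefficient vectors. All modeling choices in the sketch are simplifying assumptions, not the paper's estimator.

```python
import numpy as np

def tvar1_fit(x, degree=3):
    """Fit a time-varying AR(1) model x_t = a(t/T) x_{t-1} + e_t, with a(.)
    expanded in a polynomial basis. Returns the basis coefficients and their
    covariance. Illustrative sketch only.
    """
    x = np.asarray(x, dtype=float)
    T = len(x)
    u = np.arange(1, T) / T                              # rescaled time
    basis = np.vander(u, degree + 1, increasing=True)    # 1, u, u^2, ...
    X = basis * x[:-1, None]                             # B_k(u_t) * x_{t-1}
    y = x[1:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta, cov

def wald_compare(x1, x2, degree=3):
    """Wald statistic for H0: both series share the same a(.) coefficients
    (approximately chi-square with degree+1 degrees of freedom under H0)."""
    b1, V1 = tvar1_fit(x1, degree)
    b2, V2 = tvar1_fit(x2, degree)
    d = b1 - b2
    return d @ np.linalg.inv(V1 + V2) @ d
```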

Relevance:

90.00%

Publisher:

Abstract:

This final paper investigates the effect of generating intermediate inventories on the main indicators used in the Theory of Constraints (Throughput, Operating Expense and Inventory) in an industrial unit with a continuous production process, which uses packaging, raw materials obtained on a large scale and long-haul logistics chains. This type of industry produces goods for immediate consumption, with little variability, in a "push" mode. The main consequence is the loss of synchronism in the logistics chain, resulting in a large amount of intermediate inventory and rising costs, related mainly to the cost of holding these stocks. Using the five focusing steps and the logical tools of the Theory of Constraints, a management alternative is proposed, which includes the Drum-Buffer-Rope algorithm and places the organization in a continuous improvement process, whose impacts are evaluated by computer simulation. Using appropriate statistical techniques and software, a computer simulation model is built based on real data from a cement plant. From this model, different scenarios are tested and the optimal condition is found. A conclusion is reached considering the change in the policy for generating intermediate inventories and its impact on reducing costs and risks.
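
The sketch below is a toy caricature of the push versus Drum-Buffer-Rope comparison (an assumption-laden illustration, not the thesis's simulation model): a single constraint sets the pace, and the rope releases material only to keep a fixed buffer, which holds throughput while shrinking intermediate inventory.

```python
import random

def simulate_line(days=200, push=True, buffer_target=300.0, seed=1):
    """Toy two-stage line (upstream stage feeding a constraint) comparing a
    push policy with a Drum-Buffer-Rope release rule. All numbers are
    arbitrary placeholders.
    """
    random.seed(seed)
    wip = 0.0            # intermediate inventory in front of the constraint
    throughput = 0.0
    for _ in range(days):
        drum_capacity = random.uniform(80, 100)   # constraint's daily pace
        if push:
            released = 100.0                      # upstream runs flat out
        else:
            # Rope: release only what keeps the buffer at its target.
            released = max(0.0, drum_capacity + buffer_target - wip)
        wip += released
        produced = min(wip, drum_capacity)        # drum sets the pace
        wip -= produced
        throughput += produced
    return throughput, wip

print("push:", simulate_line(push=True))   # similar throughput, large WIP
print("DBR: ", simulate_line(push=False))  # similar throughput, bounded WIP
```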

Relevance:

90.00%

Publisher:

Abstract:

The assessment of building thermal performance is often carried out using HVAC energy consumption data, when available, or measurements of thermal comfort variables, for free-running buildings. Both types of data can be obtained by monitoring or by computer simulation. The assessment based on thermal comfort variables is the most complex because it depends on the determination of the thermal comfort zone. For these reasons, this master's thesis explores methods of building thermal performance assessment using thermal comfort variables simulated with the DesignBuilder software. The main objective is to contribute to the development of methods to support architectural decisions during the design process, as well as energy and sustainability rating systems. The research method consists of selecting thermal comfort methods and modeling them in electronic spreadsheets, with output charts developed to optimize the analyses, which are then used to assess the simulation results of low-cost house configurations. The house models consist of a base case, which is already built, and variations in thermal transmittance, absorptance and shading. The simulation results are assessed using each thermal comfort method in order to identify their sensitivity. The final results show the limitations of the methods, the importance of a method that considers thermal radiation and wind speed, and the contribution of the proposed chart.
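
As one example of an assessment based on thermal comfort variables, the sketch below counts degree-hours outside an adaptive comfort band, using the ASHRAE 55 adaptive relation T_comf = 0.31·T_out + 17.8; this is a generic illustration, not one of the specific methods compared in the thesis, and the band width is an assumption.

```python
import numpy as np

def discomfort_degree_hours(t_op, t_out_mean, band=3.5):
    """Degree-hours of discomfort for a free-running building using the
    ASHRAE 55 adaptive comfort relation (80% acceptability band of about
    +/- 3.5 C). Simplified sketch of a comfort-based assessment.

    t_op        : hourly operative temperatures from simulation (deg C)
    t_out_mean  : prevailing mean outdoor temperature (deg C)
    """
    t_op = np.asarray(t_op, dtype=float)
    t_comf = 0.31 * t_out_mean + 17.8
    upper, lower = t_comf + band, t_comf - band
    hot = np.clip(t_op - upper, 0, None).sum()    # degree-hours above the zone
    cold = np.clip(lower - t_op, 0, None).sum()   # degree-hours below the zone
    return hot, cold
```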

Relevance:

90.00%

Publisher:

Abstract:

The building envelope is the principal means of interaction between the indoor space and the environment, with direct influence on the thermal and energy performance of the building. By intervening in the envelope, with the proposal of specific architectural elements, it is possible to promote the use of passive conditioning strategies, such as natural ventilation. Cross ventilation is recommended by NBR 15220-3 as the main bioclimatic strategy for the hot and humid climate of Natal/RN, offering, among other benefits, the thermal comfort of occupants. The analysis tools for natural ventilation, on the other hand, cover a variety of techniques, from simplified calculation methods to computational fluid dynamics, whose limitations are discussed in several papers, but without detailing the problems encountered. In this sense, the present study aims to evaluate the potential of wind catchers, envelope elements used to increase natural ventilation in the building, through simplified CFD simulation, and to quantify the limitations encountered during the analysis. For this, the procedure adopted to evaluate the implementation and efficiency of the elements was CFD (Computational Fluid Dynamics) simulation with the DesignBuilder CFD software. A base case was defined, to which wind catchers with various settings were added, in order to compare them with each other and assess the differences in flows and air speeds encountered. Initially, sensitivity tests were performed to gain familiarity with the software and observe simulation patterns, mapping the settings used and the simulation time for each case. The results show the limitations encountered during the simulation process, as well as an overview of the efficiency and potential of wind catchers: an increase in ventilation with the use of catchers, differences in air flow patterns and a significant increase in indoor air speeds, besides changes due to different element geometries. It is considered that the software used can help designers during preliminary analyses in the early stages of design.
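
For comparison with CFD, a typical simplified calculation method is the orifice equation for wind-driven flow through an opening. The sketch below shows it with typical coefficient values, purely to indicate the order of magnitude such methods give; the numbers are assumptions, not results of this study.

```python
import math

def wind_driven_flow(area, u_ref, cp_in, cp_out, cd=0.61, rho=1.2):
    """Simplified (orifice-equation) estimate of wind-driven airflow through
    an opening such as a wind catcher: Q = Cd*A*sqrt(2*dP/rho), with the
    pressure difference taken from wind pressure coefficients.
    """
    dp = 0.5 * rho * u_ref**2 * (cp_in - cp_out)    # Pa
    if dp <= 0:
        return 0.0
    return cd * area * math.sqrt(2.0 * dp / rho)    # m^3/s

# e.g. a 0.25 m^2 catcher opening, 3 m/s wind, Cp difference of 0.9
print(round(wind_driven_flow(0.25, 3.0, 0.7, -0.2), 3), "m3/s")
```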

Relevance:

90.00%

Publisher:

Abstract:

An unfolding method for linear intercept and section area distributions was implemented for structures with spherical grains. Although the unfolding routine depends on the grain shape, structures with spheroidal grains can also be treated by this routine; grains of non-spheroidal shape can be treated only as an approximation. A software package was developed in two parts: the first part calculates the probability matrix, and the second part uses this matrix and minimizes the chi-square. The results are presented with any number of size classes, as required. The probability matrix was determined by means of linear intercept and section area distributions created by computer simulation. Using curve fitting, the probability matrix for spheres of any size could be determined. Two kinds of tests were carried out to prove the efficiency of the technique. The theoretical tests represent ideal cases, and the software was able to find exactly the proposed grain size distribution. In the second test, a structure was simulated in the computer and images of its slices were used to produce the corresponding linear intercept and section area distributions, which were then unfolded. This test is closer to reality, and the results show deviations from the real size distribution caused by statistical fluctuations. The unfolding of the linear intercept distribution works perfectly, but the unfolding of the section area distribution does not, due to a failure in the chi-square minimization: the minimization method uses a matrix inversion routine, and the matrix generated by this procedure cannot be inverted. Another minimization method must be used.
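
The unfolding step can also be posed without an explicit matrix inversion. The sketch below uses non-negative least squares, one possible alternative minimization (it is not the thesis software), to recover the grain-size distribution from a measured histogram and the probability matrix.

```python
import numpy as np
from scipy.optimize import nnls

def unfold(prob_matrix, measured_hist):
    """Recover a grain-size class distribution from a measured linear-intercept
    (or section-area) histogram, given the probability matrix relating the two.
    Solves the least squares problem with a non-negativity constraint (NNLS),
    avoiding the explicit matrix inversion that failed in the chi-square
    minimization described above. Illustrative sketch.

    prob_matrix   : (n_measured_classes, n_grain_classes); column j is the
                    intercept distribution generated by grains of class j
    measured_hist : (n_measured_classes,) measured relative frequencies
    """
    sizes, residual = nnls(prob_matrix, measured_hist)
    if sizes.sum() > 0:
        sizes = sizes / sizes.sum()     # normalize to a distribution
    return sizes, residual
```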

Relevance:

90.00%

Publisher:

Abstract:

Oil wells subjected to cyclic steam injection present important challenges for the development of well cementing systems, mainly due to tensile stresses caused by thermal gradients during their useful life. Cement sheath failures in wells using conventional high compressive strength systems led to the use of cement systems that are more flexible and/or ductile, with emphasis on Portland cement systems with latex addition. Recent research efforts have presented geopolymeric systems as alternatives. These cementing systems are based on alkaline activation of amorphous aluminosilicates such as metakaolin or fly ash and display advantageous properties such as high compressive strength, fast setting and thermal stability. Basic geopolymeric formulations which meet basic oil industry specifications, such as rheology, compressive strength and thickening time, can be found in the literature. In this work, new geopolymeric formulations were developed, based on metakaolin, potassium silicate, potassium hydroxide, silica fume and mineral fiber, using the state of the art in chemical composition, mixture modeling and additivation to optimize the most relevant properties for oil well cementing. Starting from molar ratios considered ideal in the literature (SiO2/Al2O3 = 3.8 and K2O/Al2O3 = 1.0), a study of dry mixtures was performed, based on the compressible packing model, resulting in an optimal volume of 6% for the added solid material. This material (silica fume and mineral fiber) works both as an additional silica source (in the case of silica fume) and as mechanical reinforcement, especially in the case of the mineral fiber, which increased the tensile strength. The first triaxial mechanical study of this class of materials was performed. For comparison, a mechanical study of conventional latex-based cementing systems was also carried out. Despite differences in the failure mode (brittle for geopolymers, ductile for latex-based systems), the superior uniaxial compressive strength (37 MPa for the geopolymeric slurry P5 versus 18 MPa for the conventional slurry P2), similar triaxial behavior (friction angle of 21° for both P5 and P2) and lower stiffness in the elastic region (5.1 GPa for P5 versus 6.8 GPa for P2) of the geopolymeric systems allowed them to withstand a similar amount of mechanical energy (155 kJ/m3 for P5 versus 208 kJ/m3 for P2), noting that the geopolymers work in the elastic regime, without the microcracking present in latex-based systems. Therefore, the geopolymers studied in this work must be designed for application in the elastic region to avoid brittle failure. Finally, the tensile strength of the geopolymers is originally poor (1.3 MPa for the geopolymeric slurry P3) due to their brittle structure; after additivation with mineral fiber, however, the tensile strength became equivalent to that of the latex-based systems (2.3 MPa for P5 and 2.1 MPa for P2). The technical viability of the conventional and proposed formulations was evaluated over the whole well life, including stresses due to cyclic steam injection, using finite element-based simulation software. It was verified that conventional slurries are viable up to 400°F (204°C), while geopolymeric slurries remain viable above 500°F (260°C).
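
The mechanical energy figures quoted (in kJ/m³) correspond to an energy absorbed per unit volume. The sketch below computes it as the area under a stress-strain curve, a common definition used here only for illustration; the input values are made up and the thesis may derive its figures from its triaxial criterion instead.

```python
import numpy as np

def strain_energy_density(strain, stress_mpa):
    """Mechanical energy absorbed per unit volume (area under the
    stress-strain curve), returned in kJ/m^3. Illustrative definition only.
    """
    # 1 kPa integrated over dimensionless strain equals 1 kJ/m^3.
    stress_kpa = np.asarray(stress_mpa, dtype=float) * 1e3
    return np.trapz(stress_kpa, np.asarray(strain, dtype=float))

# e.g. a roughly linear-elastic response with E = 5.1 GPa up to ~0.7% strain
strain = np.linspace(0.0, 0.007, 50)
stress = 5.1e3 * strain                  # stress in MPa
print(round(strain_energy_density(strain, stress), 1), "kJ/m3")
```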

Relevance:

90.00%

Publisher:

Abstract:

This work focuses on the creation and application of a dynamic simulation software for studying the hard metal (WC-Co) structure. The technology used to increase the hardware capacity was a GeForce 9600 GT GPU along with PhysX, created to make games more realistic. The software simulates the three-dimensional carbide structure inside a cubic box, where the tungsten carbide (WC) grains are modeled as triangular prisms and truncated triangular prisms. The program proved effective in verification tests, ranging from calculations of parameter measures to its capacity to increase the number of particles simulated dynamically. It was possible to investigate both the mean values and the distributions of the stereological parameters used to characterize the carbide structure through cutting planes. Based on the cutting planes of the analyzed structures, we investigated the linear intercepts, the area intercepts and the perimeter of the sections of the intercepted grains, as well as the binder phase of the structure, by calculating the mean value and distribution of the free path. Since the literature shows almost consensually that the distribution of linear intercepts is lognormal, this suggests that the grain size distribution is also lognormal; thus, a routine was developed in the program that made a more detailed investigation of this issue possible. We observed that, for certain values of the parameters which define the shape and size of the prismatic grains, it is possible to find distributions of the linear intercepts that approach the lognormal shape. For a number of simulations, we observed that the distribution curves of the linear and area intercepts, as well as of the section perimeters, are consistent with studies on static computer simulation of these parameters.
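
The lognormal question can be checked numerically. The sketch below fits a lognormal to a set of simulated linear intercepts and reports a Kolmogorov-Smirnov goodness-of-fit measure; it is an illustrative routine, not the one developed in the work, which generates the intercepts from its own prism model.

```python
import numpy as np
from scipy import stats

def lognormal_fit_check(intercepts):
    """Fit a lognormal to a set of simulated linear intercepts and report a
    Kolmogorov-Smirnov goodness-of-fit statistic, i.e. how closely the
    intercept distribution 'approaches the lognormal shape'.
    """
    intercepts = np.asarray(intercepts, dtype=float)
    shape, loc, scale = stats.lognorm.fit(intercepts, floc=0.0)
    ks = stats.kstest(intercepts, 'lognorm', args=(shape, loc, scale))
    return shape, scale, ks.statistic, ks.pvalue

# Quick self-test with synthetic lognormal data.
rng = np.random.default_rng(0)
print(lognormal_fit_check(rng.lognormal(mean=0.0, sigma=0.4, size=5000)))
```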

Relevance:

90.00%

Publisher:

Abstract:

The method "toe-to-heel air injection" (THAITM) is a process of enhanced oil recovery, which is the integration of in-situ combustion with technological advances in drilling horizontal wells. This method uses horizontal wells as producers of oil, keeping vertical injection wells to inject air. This process has not yet been applied in Brazil, making it necessary, evaluation of these new technologies applied to local realities, therefore, this study aimed to perform a parametric study of the combustion process with in-situ oil production in horizontal wells, using a semi synthetic reservoir, with characteristics of the Brazilian Northeast basin. The simulations were performed in a commercial software "STARS" (Steam, Thermal, and Advanced Processes Reservoir Simulator), from CMG (Computer Modelling Group). The following operating parameters were analyzed: air rate, configuration of producer wells and oxygen concentration. A sensitivity study on cumulative oil (Np) was performed with the technique of experimental design, with a mixed model of two and three levels (32x22), a total of 36 runs. Also, it was done a technical economic estimative for each model of fluid. The results showed that injection rate was the most influence parameter on oil recovery, for both studied models, well arrangement depends on fluid model, and oxygen concentration favors recovery oil. The process can be profitable depends on air rate

Relevance:

90.00%

Publisher:

Abstract:

Modern wireless systems employ adaptive techniques to provide high throughput while observing desired coverage, Quality of Service (QoS) and capacity requirements. An alternative to further enhance the data rate is to apply cognitive radio concepts, where a system is able to exploit unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques like Automatic Modulation Classification (AMC) can help or even be vital in such scenarios. Usually, AMC implementations rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal which may not hold (e.g. Gaussianity of the noise). This work proposes a new method to perform AMC which uses a similarity measure from the Information Theoretic Learning (ITL) framework, known as the correntropy coefficient. It is capable of extracting similarity measurements over a pair of random processes using higher order statistics, yielding better similarity estimations than, for example, the correlation coefficient. Experiments carried out by means of computer simulation show that the technique proposed in this paper presents a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN).
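
A plain NumPy sketch of the centered correntropy coefficient with a Gaussian kernel is shown below; the kernel size and estimator details are assumptions and not necessarily the paper's choices.

```python
import numpy as np

def correntropy_coefficient(x, y, sigma=1.0):
    """Centered correntropy coefficient between two signals using a Gaussian
    kernel, as defined in the ITL framework. Illustrative implementation of
    the similarity measure; not the paper's full AMC classifier.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    def kernel(d):
        return np.exp(-d**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))

    def centered_correntropy(a, b):
        v = kernel(a - b).mean()                        # correntropy estimate
        cip = kernel(a[:, None] - b[None, :]).mean()    # cross information potential
        return v - cip

    uxy = centered_correntropy(x, y)
    uxx = centered_correntropy(x, x)
    uyy = centered_correntropy(y, y)
    return uxy / np.sqrt(uxx * uyy)
```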