853 results for Tutorial on Computing
Abstract:
A MATHEMATICA notebook to compute the elements of the matrices that arise in the solution of the Helmholtz equation by the finite element method (nodal approximation) for tetrahedral elements of any approximation order is presented. The results of the notebook enable a fast computational implementation of finite element codes for high-order simplex 3D elements, reducing the overhead of implementing and testing the complex mathematical expressions obtained from the analytical integrations. These matrices can be used in a large number of applications related to physical phenomena described by the Poisson, Laplace and Schrödinger equations with anisotropic physical properties.
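The analytical integrations mentioned above rest on a classical closed-form result for simplex elements: the integral of a monomial in barycentric (volume) coordinates over a tetrahedron. The following Python sketch is illustrative only (the function name and interface are not taken from the notebook):

```python
from math import factorial

def tet_monomial_integral(a, b, c, d, volume=1.0):
    """Integral of l1^a * l2^b * l3^c * l4^d over a tetrahedron of the
    given volume, where l1..l4 are barycentric coordinates, using the
    classical formula 6V * a! b! c! d! / (a + b + c + d + 3)!."""
    num = factorial(a) * factorial(b) * factorial(c) * factorial(d)
    return 6.0 * volume * num / factorial(a + b + c + d + 3)

# Mass-matrix entries for linear (P1) shape functions N_i = l_i:
print(tet_monomial_integral(2, 0, 0, 0))  # diagonal: V/10 = 0.1 for V = 1
print(tet_monomial_integral(1, 1, 0, 0))  # off-diagonal: V/20 = 0.05
```

Higher-order elements lead to the same formula with larger exponents, which is exactly where automating the algebra pays off.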
Abstract:
There is a remarkable connection between the number of quantum states of conformal theories and the sequence of dimensions of Lie algebras. In this paper, we explore this connection by computing the asymptotic expansion of the elliptic genus and the microscopic entropy of black holes associated with (supersymmetric) sigma models. The new features of these results are the appearance of correct prefactors in the state density expansion and in the coefficient of the logarithmic correction to the entropy.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Although cluster environments have an enormous potential processing power, real applications that take advantage of this power remain an elusive goal. This is due, in part, to the lack of understanding of the characteristics of the applications best suited for these environments. This paper focuses on Master/Slave applications for large heterogeneous clusters. It defines application, cluster and execution models to derive an analytic expression for the execution time. It defines speedup and derives speedup bounds based on the inherent parallelism of the application and the aggregated computing power of the cluster. The paper derives an analytical expression for efficiency and uses it to define the scalability of the algorithm-cluster combination based on the isoefficiency metric. Furthermore, the paper establishes necessary and sufficient conditions for an algorithm-cluster combination to be scalable, which are easy to verify and use in practice. Finally, it covers the impact of network contention as the number of processors grows. (C) 2007 Elsevier B.V. All rights reserved.
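The flavour of such an analysis can be conveyed by a deliberately simplified model. This is an illustrative sketch under assumed conditions (perfectly divisible work, heterogeneous slave speeds, a per-slave communication cost paid serially by the master), not the paper's actual expressions:

```python
def execution_time(work, speeds, comm_per_slave):
    """Parallel execution time under the toy model: computation spread
    over slaves with the given speeds, plus serial communication."""
    return work / sum(speeds) + comm_per_slave * len(speeds)

def speedup(work, speeds, comm_per_slave, ref_speed):
    """Speedup relative to a single reference processor."""
    return (work / ref_speed) / execution_time(work, speeds, comm_per_slave)

def efficiency(work, speeds, comm_per_slave, ref_speed):
    """Normalise by aggregated computing power (sum of speeds) rather
    than by processor count, as is natural for heterogeneous clusters."""
    return speedup(work, speeds, comm_per_slave, ref_speed) / (sum(speeds) / ref_speed)
```

With zero communication cost the speedup equals the aggregated relative power of the cluster and the efficiency is 1; the growing `comm_per_slave * len(speeds)` term is the model's stand-in for the contention that eventually limits scalability.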
Abstract:
A semi-analytical approach is proposed to study the rotational motion of an artificial satellite under the influence of the torque due to the solar radiation pressure and taking into account the influence of Earth's shadow. The Earth's shadow is introduced in the equations for the rotational motion as a function depending on the longitude of the Sun, on the ecliptic's obliquity and on the orbital parameters of the satellite. By mapping and computing this function, we can get the periods in which the satellite is not illuminated and the torque due to the solar radiation pressure is zero. When the satellite is illuminated, a known analytical solution is used to predict the satellite's attitude. This analytical solution is expressed in terms of Andoyer's variables and depends on the physical and geometrical properties of the satellite and on the direction of the Sun radiation flux. By simulating a hypothetical circular cylindrical type satellite, an example is exhibited and the results agree quite well when compared with a numerical integration. © 1997 COSPAR. Published by Elsevier Science Ltd.
Abstract:
A simple algorithm for computing the propagator for higher derivative gravity theories based on the Barnes-Rivers operators is presented. The prescription is used, among other things, to obtain the propagator for quadratic gravity in an unconventional gauge. We also find the propagator for both gravity and quadratic gravity in an interesting gauge recently baptized the Einstein gauge [Hitzer and Dehnen, Int. J. Theor. Phys. 36 (1997), 559].
Abstract:
Research on the Blindsight, Neglect/Extinction and Phantom limb syndromes, as well as electrical measurements of mammalian brain activity, has suggested the dependence of vivid perception on both incoming sensory information at the primary sensory cortex and reentrant information from the associative cortex. Coherence between incoming and reentrant signals seems to be a necessary condition for (conscious) perception. The general reticular activating system and local electrical synchronization are some of the tools used by the brain to establish coarse coherence at the sensory cortex, upon which biochemical processes are coordinated. Besides electrical synchrony and chemical modulation at the synapse, a central mechanism supporting such coherence is the N-methyl-D-aspartate channel, working as a 'coincidence detector': the incoming signal causes the depolarization necessary to remove Mg²⁺, while reentrant information releases the glutamate that finally prompts Ca²⁺ entry. We propose that a signal transduction pathway activated by Ca²⁺ entry into cortical neurons is in charge of triggering a quantum computational process that accelerates inter-neuronal communication, thus solving systemic conflict and supporting the unity of consciousness. © 2001 Elsevier Science Ltd.
Abstract:
A combined theoretical and experimental study has been carried out to elucidate the molecular mechanism of the Grob fragmentation of different (N-halo)-2-amino cyclocarboxylates with the nitrogen atom in an exocyclic position: (N-Cl)-2-amino cyclopropanecarboxylate (1), (N-Cl)-2-amino cyclobutanecarboxylate (2), (N-Cl)-2-amino cyclopentanecarboxylate (3) and (N-Cl)-2-amino cyclohexanecarboxylate (4), together with the corresponding acyclic compounds, (N-Cl)-2-amino isobutyric acid (A) and (N-Cl)-2-amino butyric acid (B). The kinetics of decomposition of these compounds and related bromine derivatives were determined experimentally by conventional and stopped-flow UV spectrophotometry. The reaction products were analyzed by GC and spectrophotometry. The theoretical analysis is based on the localization of stationary points (reactants and transition structures) on the potential energy surface. Calculations were carried out with the B3LYP/6-31+G* and MP2/6-31+G* methods in the gas phase, while solvent effects were included by means of self-consistent reaction field theory (PCM continuum model) at the MP2/6-31+G* and MP4/6-31+G*//MP2/6-31+G* levels. Based on both experimental and theoretical results, the different Grob fragmentation processes show a global synchronicity index close to 0.9, corresponding to a nearly concerted process. At the TSs, the N-Cl bond breaking is more advanced than the C-C cleavage. An antiperiplanar configuration of these bonds is reached at the TSs, and this geometrical arrangement is the key factor governing the decomposition. In the case of 1 and 2, ring strain prevents this spatial disposition, leading to a larger activation barrier.
Natural population analysis shows that the polarization of the N-Cl and C-C bonds along the bond-breaking process can be considered the driving force for the decomposition, and that a negative charge flows from the carboxylate group to the chlorine atom to assist the reaction pathway. A comparison of theoretical and experimental results shows the relevance of the calculation level and of the inclusion of solvent effects for determining accurate unimolecular rate coefficients for the decomposition process. © 2002 Published by Elsevier Science B.V.
Abstract:
It is commonly assumed that the equivalence principle can coexist without conflict with quantum mechanics. We argue here that, contrary to popular belief, this principle does not hold in quantum mechanics. We illustrate this point by computing the second-order correction for the scattering of a massive scalar boson by a weak gravitational field, treated as an external field. The resulting cross-section turns out to be mass-dependent. A way out of this dilemma would be, perhaps, to consider gravitation without the equivalence principle. At first sight, this seems too drastic an attitude toward general relativity. Fortunately, the teleparallel version of general relativity - a description of the gravitational interaction by a force similar to the Lorentz force of electromagnetism, which of course dispenses with the equivalence principle - is equivalent to general relativity, thus providing a consistent theory of gravitation in the absence of the aforementioned principle. © World Scientific Publishing Company.
Abstract:
We establish the conditions under which it is possible to construct signal sets satisfying the properties of being geometrically uniform and matched to additive quotient groups. Such signal sets consist of subsets of signal spaces identified with the integer rings ℤ[i] and ℤ[ω] in ℤ². © 2008 KSCAM and Springer-Verlag.
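For the ℤ[i] case, the cosets of an additive quotient group can be enumerated directly: a Gaussian integer α of norm N(α) partitions ℤ[i] into N(α) residue classes, giving one signal point per class. The following Python sketch is illustrative only; the paper's construction is more general and also covers ℤ[ω]:

```python
from math import floor

def gauss_mod(z, alpha):
    """Reduce the Gaussian integer z = (x, y) ~ x + yi modulo alpha in Z[i],
    by nearest-quotient division (floor(x + 0.5) is shift-invariant, so the
    representative depends only on the residue class of z)."""
    zx, zy = z
    ax, ay = alpha
    n = ax * ax + ay * ay                       # norm N(alpha)
    # z * conj(alpha) = (zx*ax + zy*ay) + (zy*ax - zx*ay) i
    qx = floor((zx * ax + zy * ay) / n + 0.5)
    qy = floor((zy * ax - zx * ay) / n + 0.5)
    return (zx - (qx * ax - qy * ay), zy - (qy * ax + qx * ay))

# alpha = 2 + i has norm 5, so Z[i]/(alpha) yields a 5-point signal set.
alpha = (2, 1)
signal_set = {gauss_mod((x, y), alpha) for x in range(-3, 4) for y in range(-3, 4)}
print(len(signal_set))  # 5
```

The resulting constellation inherits its geometric uniformity from the translation symmetry of the lattice: every coset looks the same up to an isometry.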
Abstract:
This paper studies the use of different population structures in a Genetic Algorithm (GA) applied to lot sizing and scheduling problems. The population approaches are divided into two types: single-population and multi-population. The first type has a non-structured single population. The multi-population type comprises non-structured and structured populations organized in binary and ternary trees. Each population approach is tested on lot sizing and scheduling problems found in soft drink companies. These problems have two interdependent levels, with decisions concerning raw material storage and soft drink bottling. The challenge is to simultaneously determine the lot sizing and scheduling of raw materials in tanks and of products in lines. Computational results are reported, allowing the best population structure for the set of problem instances evaluated to be determined. Copyright 2008 ACM.
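The ternary-tree organisation can be pictured as a hierarchy in which better individuals migrate toward the root. A minimal sketch of that migration step, assuming a minimisation problem and an array-encoded ternary tree (the names and encoding are illustrative, not the paper's implementation):

```python
import random

def bubble_up(tree, fitness, i):
    """Move the individual at index i toward the root while it beats its
    parent; in an array-encoded ternary tree, parent(i) = (i - 1) // 3."""
    while i > 0:
        parent = (i - 1) // 3
        if fitness(tree[i]) < fitness(tree[parent]):   # better (lower) fitness
            tree[i], tree[parent] = tree[parent], tree[i]
            i = parent
        else:
            break

# Toy usage: individuals are numbers and fitness is the identity function.
random.seed(0)
pop = [random.uniform(0, 100) for _ in range(13)]   # 1 + 3 + 9 nodes
for i in range(1, len(pop)):
    bubble_up(pop, lambda x: x, i)
print(pop[0] == min(pop))  # True: the best individual reaches the root
```

In a structured GA, recombination would then preferentially pair nodes with their tree neighbours, so good genetic material spreads through the hierarchy rather than through a panmictic population.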
Abstract:
In this work we discuss the Hamilton-Jacobi formalism for fields on the null plane. The real scalar field in (1+1) dimensions is studied, since it exhibits crucial points that also appear in more structured fields such as the electromagnetic case. The Hamilton-Jacobi formalism leads to the equations of motion for these systems after computing their respective generalized brackets. Copyright © owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike Licence.
Abstract:
The classical way to manage product development processes for mass production seems to be changing: high pressure for cost reduction, higher quality standards and markets demanding innovation lead to the need for new development-control tools. In this context, learning from the automotive and aerospace industries, factories from other segments are starting to understand and apply manufacturing- and assembly-oriented design to ease the task of generating goods and thereby obtain at least part of the expected results. This paper demonstrates the applicability of the concepts of Concurrent Engineering and DFM/DFA (Design for Manufacturing and Assembly) in the development of products and parts for the White Goods industry in Brazil (major appliances such as refrigerators, cookers and washing machines), presenting one case concerning the development and release of a component. Finally, it shows briefly how a solution was reached that could provide cost savings and a reduction in time to delivery using those techniques.
Abstract:
In this article we explore the computational power of NVIDIA graphics processing units (GPUs) for cryptography using the CUDA (Compute Unified Device Architecture) technology. CUDA eases general-purpose computing by exploiting the parallel processing available in GPUs. To this end, the NVIDIA GPU architectures and CUDA are presented, along with cryptography concepts. Furthermore, we compare the CPU versions of the cryptographic algorithms Advanced Encryption Standard (AES) and Message-Digest Algorithm 5 (MD5) with their parallel versions written in CUDA. © 2011 AISTI.
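The CPU-versus-GPU comparison hinges on the fact that both AES and MD5 over independent inputs are embarrassingly data-parallel: one digest per input block, which is the one-thread-per-block mapping a CUDA kernel would use. A Python standard-library sketch of that decomposition (illustrative of the layout only, not of the article's CUDA code):

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

def md5_serial(blocks):
    """Baseline CPU version: one MD5 digest per independent input block."""
    return [hashlib.md5(b).hexdigest() for b in blocks]

def md5_parallel(blocks, workers=4):
    """Same data-parallel decomposition, one task per block; on a GPU
    each task would become a CUDA thread."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda b: hashlib.md5(b).hexdigest(), blocks))

blocks = [b"abc", b"message digest", b"CUDA"]
assert md5_parallel(blocks) == md5_serial(blocks)   # identical results
print(md5_serial([b"abc"])[0])  # 900150983cd24fb0d6963f7d28e17f72 (RFC 1321 test vector)
```

Python threads will not actually speed up this loop because of the interpreter lock; the point of the sketch is the decomposition, which on a GPU scales with thousands of threads.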
Abstract:
The widespread availability of wirelessly connected portable computers, smartphones and other mobile devices, and the pervasive presence of computer services in our everyday environment, have brought Mark Weiser's prediction of future ubiquitous computer systems closer to reality. Some of these - ever-present, anywhere, anytime - ubiquitous computer services mean easier and more pleasant lifestyles for many people, but the generalized availability of some classes of this software and these computer services, known as virtual disguisers and Virtual Robots, can pose new ethical problems in a world of explosive growth of social networking sites. The objective of the present article is to investigate some of these problems from an interdisciplinary philosophical perspective. Special emphasis is given to the potential impact on human conduct caused by disguisers and Virtual Robots. © 2011 IEEE.