986 results for Computer Structure


Relevance:

20.00%

Publisher:

Abstract:

The variation of the crystallite structure of several coal chars during gasification in air and carbon dioxide was studied by high-resolution transmission electron microscopy (HRTEM) and X-ray diffraction (XRD) techniques. The XRD analysis of the partially gasified coal chars, based on two approaches, Scherrer's equation and Alexander and Sommer's method, shows contradictory trends for the variation of the crystallite height with carbon conversion, despite giving a similar trend for the change in crystallite width. The HRTEM fringe images of the partially gasified coal chars indicate that large and highly ordered crystallites exist at conversion levels as high as 86%. It is also demonstrated that the crystalline structures of chars can be very different even though their pore structures are similar, suggesting that crystalline structure analysis should be combined with pore structure analysis in studies of carbon gasification.
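For context, the crystallite height obtained from XRD line broadening is conventionally estimated with Scherrer's equation; the particular reflection and shape factor used in the study are not stated in the abstract, so the following is only the standard form:

\[
L_c = \frac{K\,\lambda}{\beta \cos\theta}
\]

where \(\lambda\) is the X-ray wavelength, \(\beta\) the full width at half maximum of the diffraction peak (in radians), \(\theta\) the Bragg angle, and \(K\) a dimensionless shape factor (typically about 0.9 for the carbon (002) band).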

Relevance:

20.00%

Publisher:

Abstract:

Regiospecific bromination of 2,4,4-trimethyl-cyclohex-2-enone was achieved and the X-ray crystal structure of 6-bromo-2,4,4-trimethyl-cyclohex-2-enone is presented.

Relevance:

20.00%

Publisher:

Abstract:

9-Carboxyhexahydro-7-methoxy-4a,7-ethano-benzopyran-5-en-1-one (1) was prepared and examined by X-ray crystallography to probe its potential as a new peptide scaffold/template. The crystal structure of the anhydride precursor 7-(2-acetoxyethyl)-4-methoxy-3a,4,7,7a-tetrahydro-4,7-ethanoisobenzofuran-1,3-dione (6) is also reported.

Relevance:

20.00%

Publisher:

Abstract:

The variation of the pore structure of several coal chars during gasification in air and carbon dioxide was studied by argon adsorption at 87 K and CO2 adsorption at 273 K. It is found that the surface area and volume of the small pores (<10 Å) […] for air gasification is constant over a wide range of conversion (>20%), while for CO2 gasification similar results are obtained using the total surface area. However, in the early stages of gasification […]

Relevance:

20.00%

Publisher:

Abstract:

José Plínio Baptista School of Cosmology (1st : 2012 : Anchieta, ES). Seminar held from 14 to 19 October 2012.

Relevance:

20.00%

Publisher:

Abstract:

Experimental scratch resistance testing provides two numbers: the penetration depth Rp and the healing depth Rh. In molecular dynamics computer simulations, we create a material consisting of N statistical chain segments by polymerization; a reinforcing phase can be included. Then we simulate the movement of an indenter and the response of the segments during X time steps. Each segment at each time step has three Cartesian coordinates of position and three of momentum. We describe methods for visualizing the results based on a record of 6NX coordinates. We obtain the positions of each segment along the path of the indenter as a continuous function of time t. Scratch resistance at a given location can thus be connected to the spatial structures of individual polymeric chains.
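As an illustration of the 6NX record described above, here is a minimal sketch of how per-segment positions and momenta over X time steps might be stored and queried; the array layout and names are illustrative, not taken from the paper:

```python
import numpy as np

# Hypothetical sizes: N chain segments tracked over X time steps.
N, X = 1000, 5000

# One record per time step: 3 position + 3 momentum coordinates per segment,
# i.e. 6*N*X numbers in total, as described in the abstract.
trajectory = np.zeros((X, N, 6))

def segment_positions(traj, i):
    """Positions (x, y, z) of segment i as a function of the time step."""
    return traj[:, i, :3]

def segment_momenta(traj, i):
    """Momenta (px, py, pz) of segment i as a function of the time step."""
    return traj[:, i, 3:]

# Example: displacement of segment 0 between the first and last time step.
displacement = segment_positions(trajectory, 0)[-1] - segment_positions(trajectory, 0)[0]
```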

Relevance:

20.00%

Publisher:

Abstract:

Indentation tests, e.g., Rockwell, Vickers, or Knoop, are used to determine the hardness of a material. The indentation process is observed empirically in the laboratory during these tests, but the mechanics of indentation is insufficiently understood. We have performed the first molecular dynamics computer simulations of the indentation resistance of polymers with a chain structure similar to that of high-density polyethylene (HDPE). A coarse-grained model of HDPE is used to simulate how the interconnected segments respond to an external force imposed by an indenter. Results include the time-dependent measurement of penetration depth, recovery depth, and recovery percentage with respect to indenter force, indenter size, and indentation time. The simulations provide results that are inaccessible experimentally, including the continuous evolution of the pertinent tribological parameters during the entire indentation process.
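A minimal sketch of how the reported quantities might be derived from a depth-time trace; the exact definitions used in the paper are not given in the abstract, so the formulas below (recovery expressed as the fraction of the maximum penetration that is recovered) are assumptions:

```python
def indentation_metrics(depth):
    """depth: sequence of indenter penetration depths over time (same units throughout).

    Returns (penetration_depth, recovery_depth, recovery_percent) under the assumed
    definitions: penetration depth is the maximum depth reached, recovery depth is the
    residual depth at the end of the run, and recovery percentage is the fraction of
    the maximum depth that was recovered.
    """
    penetration_depth = max(depth)
    recovery_depth = depth[-1]
    recovery_percent = 100.0 * (penetration_depth - recovery_depth) / penetration_depth
    return penetration_depth, recovery_depth, recovery_percent

# Example with an arbitrary, made-up trace:
print(indentation_metrics([0.0, 2.0, 5.0, 8.0, 10.0, 6.0, 4.0, 3.5]))
# -> (10.0, 3.5, 65.0)
```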

Relevance:

20.00%

Publisher:

Abstract:

The radial undistortion model proposed by Fitzgibbon and the radial fundamental matrix were early steps to extend classical epipolar geometry to distorted cameras. Later, minimal solvers were proposed to find relative pose and radial distortion, given point correspondences between images. However, a major drawback of all these approaches is that they require the distortion center to be exactly known. In this paper we show how the distortion center can be absorbed into a new radial fundamental matrix. This new formulation is much more practical in reality, as it also allows for digital zoom, cropped images, and camera-lens systems where the distortion center does not exactly coincide with the image center. In particular, we start from the setting where only one of the two images contains radial distortion, analyze the structure of the particular radial fundamental matrix, and show that the technique also generalizes to other linear multi-view relationships such as the trifocal tensor and the homography. For the new radial fundamental matrix we propose different estimation algorithms from 9, 10, and 11 points. We show how to extract the epipoles and prove the practical applicability on several epipolar geometry image pairs with strong distortion that, to the best of our knowledge, no other existing algorithm can handle properly.
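For orientation, here is a minimal numpy sketch of Fitzgibbon's one-parameter division model on which the radial fundamental matrix builds; the point is that "absorbing the distortion center" amounts to no longer assuming the offset c is (0, 0). Variable names are illustrative and this is not the paper's estimation algorithm:

```python
import numpy as np

def lift_division_model(p, lam, c=(0.0, 0.0)):
    """Lift a distorted image point p = (x, y) to the homogeneous 3-vector
    (x - cx, y - cy, 1 + lam * r^2) used in radial epipolar constraints,
    where r is the distance of p from the distortion center c.

    With c = (0, 0) this is the classical setting; the paper's contribution is,
    roughly, that an unknown c can be folded into the fundamental matrix itself.
    """
    x, y = p[0] - c[0], p[1] - c[1]
    r2 = x * x + y * y
    return np.array([x, y, 1.0 + lam * r2])

def undistort(p, lam, c=(0.0, 0.0)):
    """Undistorted point under the division model: p_u = (p - c) / (1 + lam * r^2) + c."""
    x, y, w = lift_division_model(p, lam, c)
    return np.array([x / w + c[0], y / w + c[1]])

# With a negative lam (barrel distortion), undistortion pushes a point away from the centre.
print(undistort((600.0, 400.0), lam=-1e-6, c=(320.0, 240.0)))
```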

Relevance:

20.00%

Publisher:

Abstract:

Over the last decade, software architecture emerged as a critical issue in Software Engineering. This encompassed a shift from traditional programming towards software development based on the deployment and assembly of independent components. The specification of both the overall system structure and the interaction patterns between components became a major concern for the working developer. Although a number of formalisms are available to express behaviour and to supply the indispensable calculational power to reason about designs, the task of deriving architectural designs on top of popular component platforms has remained largely informal. This paper introduces a systematic approach to derive, from CCS behavioural specifications, the corresponding architectural skeletons in the Microsoft .Net framework, in the form of executable C# and Cω code. The prototyping process is fully supported by a specific tool developed in Haskell.
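Purely to illustrate the flavour of deriving skeletons from behavioural specifications (this is a toy, not the paper's actual CCS-to-C#/Cω translation scheme), a sketch that turns a small CCS-like prefix/choice term into a component skeleton with one stub per observable action:

```python
from dataclasses import dataclass

# Toy CCS-like terms: Nil, action prefix a.P, and choice P + Q.
@dataclass
class Nil:
    pass

@dataclass
class Prefix:
    action: str
    cont: "object"

@dataclass
class Choice:
    left: "object"
    right: "object"

def actions(term):
    """Collect the action names occurring in a term."""
    if isinstance(term, Nil):
        return []
    if isinstance(term, Prefix):
        return [term.action] + actions(term.cont)
    if isinstance(term, Choice):
        return actions(term.left) + actions(term.right)
    raise TypeError(term)

def skeleton(name, term):
    """Emit a component skeleton with one stub method per observable action."""
    methods = "\n".join(f"    def {a}(self):\n        pass  # behaviour to be filled in"
                        for a in dict.fromkeys(actions(term)))
    return f"class {name}:\n{methods or '    pass'}"

# Example: the term req.ack.0 + err.0 yields stubs for req, ack and err.
print(skeleton("Buffer", Choice(Prefix("req", Prefix("ack", Nil())), Prefix("err", Nil()))))
```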

Relevance:

20.00%

Publisher:

Abstract:

Program slicing is a well-known family of techniques used to identify code fragments which depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing, and software maintenance. Most slicing methods, usually oriented towards the imperative or object paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or as a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests that this approach may be an interesting, even if not completely general, alternative for slicing functional programs.
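A tiny sketch of the idea of a slicing criterion given as a projection composed with the original program; this is illustrative only, since the paper works in the Bird-Meertens calculus rather than in Python:

```python
def compose(f, g):
    """Function composition f . g"""
    return lambda x: f(g(x))

# A toy "program" computing several results at once.
def stats(xs):
    total = sum(xs)
    return {"total": total, "mean": total / len(xs), "maximum": max(xs)}

# Slicing criterion: a projection that keeps only the component of interest.
def project_mean(result):
    return result["mean"]

# The slice is the composition of the criterion with the original program;
# program calculation would then simplify it to code that computes only the mean.
mean_slice = compose(project_mean, stats)

print(mean_slice([1, 2, 3, 4]))  # 2.5
```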

Relevance:

20.00%

Publisher:

Abstract:

A large and growing number of software systems rely on non-trivial coordination logic for making use of third-party services or components. Therefore, it is of utmost importance to understand and capture rigorously this continuously growing layer of coordination, as this will ease not only the verification of such systems with respect to their original specifications, but also maintenance, further development, testing, deployment, and integration. This paper introduces a method based on several program analysis techniques (namely, dependence graphs, program slicing, and graph pattern analysis) to extract coordination logic from the source code of legacy systems. This process is driven by a series of pre-defined coordination patterns and captured by a special-purpose graph structure from which coordination specifications can be generated in a number of different formalisms.
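A minimal sketch of the kind of graph-pattern search involved: a dependence graph is represented as an adjacency map, nodes matching a (hypothetical) coordination pattern, here calls into external services, are used as seeds, and everything that depends on them is collected. The graph and the pattern are made up for illustration and are not the paper's pre-defined patterns:

```python
# Toy program dependence graph: node -> set of nodes it depends on.
dependence = {
    "parse_request":  set(),
    "call_billing":   {"parse_request"},          # external service call
    "call_shipping":  {"parse_request"},          # external service call
    "merge_replies":  {"call_billing", "call_shipping"},
    "render_page":    {"merge_replies"},
}

# Hypothetical coordination pattern: any node whose name marks a service call.
def is_service_call(node):
    return node.startswith("call_")

def coordination_slice(graph):
    """Service-call nodes plus every node that (transitively) depends on one."""
    seeds = {n for n in graph if is_service_call(n)}
    slice_nodes = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, deps in graph.items():
            if node not in slice_nodes and deps & slice_nodes:
                slice_nodes.add(node)
                changed = True
    return slice_nodes

print(sorted(coordination_slice(dependence)))
# ['call_billing', 'call_shipping', 'merge_replies', 'render_page']
```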

Relevance:

20.00%

Publisher:

Abstract:

Current software development often relies on non-trivial coordination logic for combining autonomous services, possibly running on different platforms. As a rule, however, such a coordination layer is strongly woven into the application at the source code level. Therefore, its precise identification becomes a major methodological (and technical) problem and a challenge to any program understanding or refactoring process. The approach introduced in this paper resorts to slicing techniques to extract coordination data from source code. Such data are captured in a specific dependency graph structure from which a coordination model can be recovered, either in the form of an Orc specification or as a collection of code fragments corresponding to typical coordination patterns identified in the system. Tool support is also discussed.

Relevance:

20.00%

Publisher:

Abstract:

Abstract: in Portugal, as in much of the legal systems of Europe, «legal persons» can also be held criminally liable for cybercrimes, such as the following: «false information»; «damage to programs or other computer data»; «computer-software sabotage»; «illegitimate access»; «unlawful interception»; and «illegitimate reproduction of a protected program». However, in Portugal there are many exceptions to the «question of criminal liability» of «legal persons»: some «legal persons» cannot be blamed for cybercrime. The legislature does not allow it! These «legal persons» are, e.g., the following («public entities»): legal persons under public law, which include public business entities; entities operating public-service concessions, regardless of their ownership; and other legal persons exercising public powers. In other words, and again as an example, a Portuguese public university or a private concessionaire of a public service in Portugal cannot commit (in Portugal) any of the cybercrimes listed above. Fair? Unfair. All laws should provide that all legal persons can commit cybercrimes. PS: abstract of the article in English.

Relevance:

20.00%

Publisher:

Abstract:

Program slicing is a well-known family of techniques used to identify code fragments which depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing, and software maintenance. Most slicing methods, usually targeting either the imperative or the object-oriented paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or as a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests that this approach may be an interesting, even if not completely general, alternative for slicing functional programs.

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, despite improvements in usability and intuitiveness, users have to adapt to the proposed systems to satisfy their needs. For instance, they must learn how to achieve tasks, how to interact with the system, and how to fulfil the system's specifications. This paper proposes an approach to improve this situation by enabling graphical user interface redefinition through virtualization and computer vision, with the aim of increasing the system's usability. To achieve this goal, the approach is based on enriched task models, virtualization, and picture-driven computing.
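To make the picture-driven idea concrete, here is a minimal sketch that locates a widget in a screenshot by its picture so that an interaction can be redirected to it; the use of OpenCV template matching is an assumption for illustration, not the mechanism prescribed by the paper:

```python
import cv2

def locate_widget(screenshot_path, widget_path, threshold=0.8):
    """Return the (x, y) centre of the best match of the widget image inside the
    screenshot, or None if the match score falls below the threshold."""
    screen = cv2.imread(screenshot_path, cv2.IMREAD_GRAYSCALE)
    widget = cv2.imread(widget_path, cv2.IMREAD_GRAYSCALE)
    scores = cv2.matchTemplate(screen, widget, cv2.TM_CCOEFF_NORMED)
    _, best, _, best_loc = cv2.minMaxLoc(scores)
    if best < threshold:
        return None
    h, w = widget.shape
    return (best_loc[0] + w // 2, best_loc[1] + h // 2)

# Hypothetical usage: forward a click to wherever the "OK" button currently is.
# pos = locate_widget("screen.png", "ok_button.png")
```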