911 results for Legacy codes
Abstract:
More and more current software systems rely on non-trivial coordination logic to combine autonomous services, typically running on different platforms and often owned by different organizations. Often, however, coordination data is deeply entangled in the code and therefore difficult to isolate and analyse separately. COORDINSPECTOR is a software tool which combines slicing and program analysis techniques to isolate all coordination elements from the source code of an existing application. Such a reverse engineering process provides a clear view of the services actually invoked as well as of the orchestration patterns which bind them together. The tool analyses Common Intermediate Language (CIL) code, the native language of the Microsoft .Net Framework; the scope of application of COORDINSPECTOR is therefore quite large: potentially any piece of code developed in any of the programming languages which compile to the .Net Framework. The tool generates graphical representations of the coordination layer and identifies the underlying business process orchestrations, rendering them as Orc specifications.
Abstract:
Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, which takes advantage of the computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks the question whether a new generation of models is needed to tackle these problems.
Abstract:
This paper describes the process of wrapping existing scientific codes in the domain of plasma physics simulations through the use of Sun's Java Native Interface. We have created a Java front-end for particular functionality offered by legacy native libraries, in order to achieve reusability and interoperability without having to rewrite these libraries. The technique introduced in this paper includes two approaches: one-to-one mapping for wrapping individual native functions, and peer classes for wrapping native data structures.
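A minimal sketch of the two wrapping approaches described above; the library, class and routine names are hypothetical, since the abstract does not give the actual API:

```java
// Hypothetical names throughout; the real plasma library and routines differ.
public class PlasmaSolver {
    static {
        // Expects libplasma.so / plasma.dll built from the legacy C/Fortran code.
        System.loadLibrary("plasma");
    }

    // One-to-one mapping: each exported native routine gets a matching Java
    // declaration. The C stub header generated with `javac -h` would contain:
    //   JNIEXPORT jdouble JNICALL
    //   Java_PlasmaSolver_electronDensity(JNIEnv *env, jobject obj, jdouble r);
    public native double electronDensity(double radius);

    // Peer-class approach: a native data structure is mirrored by a Java
    // object holding an opaque handle (pointer) to the native instance.
    public static class GridPeer {
        private long handle;                      // pointer kept on the C side
        public GridPeer(int nx, int ny) { handle = allocate(nx, ny); }
        private static native long allocate(int nx, int ny);
        public native void free();                // releases the native struct
    }
}
```

The Java class compiles on its own; running it of course requires the compiled native library to be present on the library path.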
Abstract:
This paper concerns the application of recent information technologies to the creation of a software system for numerical simulations in the domain of plasma physics, in particular metal vapor lasers. The work presented involves the modernization of legacy physics software for reuse on the web and within a Service-Oriented Architecture environment. The creation of Java front-ends to legacy C++ and FORTRAN codes is applied and described, followed by the transformation of some of the scientific components into web services and the creation of a web interface to the legacy application. The use of the BPEL language for managing scientific workflows is also considered.
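As an illustration of the web-service step, here is a minimal JAX-WS sketch; the service and operation names are hypothetical, and the real components would delegate to the JNI front-ends of the legacy solvers:

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical service wrapping a legacy metal vapor laser computation.
@WebService
public class LaserSimulationService {

    @WebMethod
    public double runGainComputation(double temperature, double current) {
        // In the real system this would delegate to the JNI front-end of
        // the legacy C++/FORTRAN solver; a constant stands in here.
        return 0.0;
    }

    public static void main(String[] args) {
        // Publishes the service with the JAX-WS runtime (bundled with the
        // JDK up to Java 10; a separate dependency in later versions).
        Endpoint.publish("http://localhost:8080/laser", new LaserSimulationService());
    }
}
```

Once published, the WSDL description of such a service is what a BPEL engine composes into a scientific workflow.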
Abstract:
We show that commutative group spherical codes in R^n, as introduced by D. Slepian, are directly related to flat tori and quotients of lattices. As a consequence of this view, we derive new results on the geometry of these codes and an upper bound for their cardinality in terms of the minimum distance and the maximum center density of lattices and general spherical packings in the half dimension of the code. This bound is tight in the sense that it can be arbitrarily approached in any dimension. Examples of this approach and a comparison of this bound with the union and Rankin bounds for general spherical codes are also presented.
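For orientation, the flat-torus construction behind such codes can be sketched in a few lines; the notation below is the standard one from the flat tori literature and is an assumption, not a quotation from the paper:

```latex
% Given c = (c_1, ..., c_L) with c_i > 0 and c_1^2 + ... + c_L^2 = 1, the map
\[
\Phi_c(u) = \Bigl( c_1 \cos\tfrac{u_1}{c_1},\; c_1 \sin\tfrac{u_1}{c_1},\;
\ldots,\; c_L \cos\tfrac{u_L}{c_L},\; c_L \sin\tfrac{u_L}{c_L} \Bigr)
\]
% carries R^L onto a flat torus lying on the unit sphere S^{2L-1} in
% R^{2L} = R^n. The image of a lattice quotient in R^L under \Phi_c is a
% commutative group code, which is why lattice center density in the half
% dimension L = n/2 governs how many codewords fit at a given minimum distance.
```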
Abstract:
Surface heat treatment of glasses and ceramics using CO2 lasers has attracted the attention of several researchers around the world due to its impact on technological applications such as lab-on-a-chip devices, diffraction gratings and microlenses. Microlens fabrication on a glass surface has been studied mainly due to its importance in optical devices (fiber coupling, CCD signal enhancement, etc.). The goal of this work is to present a systematic study of the conditions for microlens fabrication, along with the viability of using microlens arrays, recorded on the glass surface, as bidimensional codes for product identification. This would allow the production of codes without any residues (like the fine powder generated by laser ablation) that are resistant to aggressive environments, such as sterilization processes. The microlens arrays were fabricated using a continuous-wave CO2 laser focused on the surface of flat commercial soda-lime silicate glass substrates. The fabrication conditions were studied as a function of laser power, heating time and microlens profile. A He-Ne laser was used as a light source in a qualitative experiment to test the viability of using the microlenses as bidimensional codes.
Abstract:
We describe a one-time signature scheme based on the hardness of the syndrome decoding problem, and prove it secure in the random oracle model. Our proposal can be instantiated on general linear error-correcting codes, rather than restricted families like alternant codes for which a decoding trapdoor is known to exist.
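The scheme itself is not reproduced here, but the one-way operation underlying the hardness assumption, computing a syndrome s = He^T over GF(2) from a low-weight error vector e, is easy to sketch with hypothetical toy parameters:

```java
// Toy illustration of the syndrome decoding primitive, not the scheme itself:
// recovering a low-weight e from H and s = H e^T is the hard problem.
public class Syndrome {

    // Computes s = H e^T over GF(2); H is r x n, e has length n.
    static int[] syndrome(int[][] H, int[] e) {
        int r = H.length, n = e.length;
        int[] s = new int[r];
        for (int i = 0; i < r; i++) {
            int acc = 0;
            for (int j = 0; j < n; j++) acc ^= H[i][j] & e[j]; // GF(2) dot product
            s[i] = acc;
        }
        return s;
    }

    public static void main(String[] args) {
        int[][] H = { {1,0,1,0,1}, {0,1,1,1,0}, {1,1,0,0,1} }; // toy parity-check matrix
        int[] e = {0, 1, 0, 0, 1};                              // low-weight secret
        System.out.println(java.util.Arrays.toString(syndrome(H, e)));
    }
}
```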
Abstract:
This work studies the turbo decoding of Reed-Solomon codes in QAM modulation schemes for additive white Gaussian noise (AWGN) channels using a geometric approach. Considering the relations between the Galois field elements of the Reed-Solomon code and the symbols, combined with their geometric dispositions in the QAM constellation, a turbo decoding algorithm based on the work of Chase and Pyndiah is developed. Simulation results show that the performance achieved is similar to that obtained with the pragmatic approach based on binary decomposition and analysis.
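The Chase stage of a Chase-Pyndiah decoder enumerates test patterns over the least reliable positions. Below is a hedged sketch of that stage alone, with toy sizes and without the algebraic Reed-Solomon decoder that each pattern would normally be handed to:

```java
import java.util.*;

// Chase-II test-pattern generation: flip every combination of the p least
// reliable bit positions of the hard-decision word.
public class ChaseII {

    static List<int[]> testPatterns(int[] hardBits, double[] reliability, int p) {
        Integer[] idx = new Integer[hardBits.length];
        for (int i = 0; i < idx.length; i++) idx[i] = i;
        // Least reliable positions = smallest |LLR|.
        Arrays.sort(idx, Comparator.comparingDouble(i -> Math.abs(reliability[i])));
        List<int[]> patterns = new ArrayList<>();
        for (int mask = 0; mask < (1 << p); mask++) {
            int[] t = hardBits.clone();
            for (int b = 0; b < p; b++)
                if ((mask & (1 << b)) != 0) t[idx[b]] ^= 1; // flip an unreliable bit
            patterns.add(t); // each pattern would be decoded algebraically
        }
        return patterns;
    }

    public static void main(String[] args) {
        int[] hard = {1, 0, 1, 1, 0, 0, 1};
        double[] llr = {2.3, -0.1, 1.7, 0.4, -3.0, 0.2, 1.1};
        for (int[] t : testPatterns(hard, llr, 2))
            System.out.println(Arrays.toString(t));
    }
}
```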
Abstract:
The question raised by researchers in the field of mathematical biology regarding the existence of error-correcting codes in the structure of DNA sequences is answered positively. It is shown, for the first time, that DNA sequences such as protein-coding sequences, targeting sequences and internal sequences are identified as codewords of BCH codes over Galois fields.
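Codeword membership in a BCH code reduces to divisibility of the word's polynomial by the code's generator polynomial. The sketch below illustrates such a check over GF(4); the nucleotide labeling and the generator polynomial are hypothetical stand-ins, not the ones identified in the paper:

```java
// Hypothetical labeling and generator polynomial, for illustration only.
public class BchCheck {

    // GF(4) = {0, 1, a, a^2} encoded as 0..3; addition is bitwise XOR.
    static final int[][] MUL = {
        {0, 0, 0, 0},
        {0, 1, 2, 3},
        {0, 2, 3, 1},   // a * a = a^2, a * a^2 = 1
        {0, 3, 1, 2}    // a^2 * a^2 = a
    };

    // A word is a codeword iff its polynomial is divisible by the monic
    // generator g(x). Coefficients are listed from the highest degree down.
    static boolean isCodeword(int[] word, int[] g) {
        int[] r = word.clone();
        for (int i = 0; i + g.length <= r.length; i++) {
            int q = r[i];                    // g is monic, so the quotient term is r[i]
            if (q == 0) continue;
            for (int j = 0; j < g.length; j++)
                r[i + j] ^= MUL[q][g[j]];    // subtraction = addition in characteristic 2
        }
        for (int i = r.length - g.length + 1; i < r.length; i++)
            if (r[i] != 0) return false;     // nonzero remainder: not a codeword
        return true;
    }

    public static void main(String[] args) {
        int[] g = {1, 1, 2};                 // hypothetical g(x) = x^2 + x + a
        // Map e.g. A->0, C->1, G->2, T->3 (one of several possible labelings),
        // then test the resulting GF(4) word:
        int[] word = {1, 0, 2, 3, 1, 2, 0};  // stand-in sequence
        System.out.println(isCodeword(word, g));
    }
}
```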
Abstract:
The Anglo cluster comprises Australia, Canada, England, Ireland, New Zealand, South Africa (White sample), and the United States of America. These countries are all developed nations, predominantly English speaking, and were all once British colonies. Today, they are amongst the wealthiest countries in the world. The GLOBE results show that the Anglo cluster is characterized by an individualistic performance orientation. Further, although they value gender equality, the Anglo cluster countries tend to be male-dominated in practice. Effective leadership in the Anglo cultures is affected by a combination of charismatic inspiration and a participative style.
Abstract:
The debate about the dynamics and potential policy responses to asset inflation has intensified in recent years. Some analysts, notably Borio and Lowe, have called for 'subtle' changes to existing monetary targeting frameworks to try to deal with the problems of asset inflation and have attempted to develop indicators of financial vulnerability to aid this process. In contrast, this paper argues that the uncertainties involved in understanding financial market developments and their potential impact on the real economy are likely to remain too high to embolden policy makers. The political and institutional risks associated with policy errors are also significant. The fundamental premise that a liberalised financial system is based on 'efficient' market allocation cannot be overlooked. The corollary is that any serious attempt to stabilize financial market outcomes must involve at least a partial reversal of deregulation.
Abstract:
The influence of initial perturbation geometry and material properties on final fold geometry has been investigated using finite-difference (FLAC) and finite-element (MARC) numerical models. Previous studies using these two different codes reported very different folding behaviour although the material properties, boundary conditions and initial perturbation geometries were similar. The current results establish that the discrepancy was not due to the different computer codes but to the different strain rates employed in the two previous studies (i.e. 10^-6 s^-1 in the FLAC models and 10^-14 s^-1 in the MARC models). As a result, different parts of the elasto-viscous rheological field were being investigated. For the same material properties, strain rate and boundary conditions, the present results using the two different codes are consistent. A transition in folding behaviour, from a situation where the geometry of the initial perturbation determines final fold shape to a situation where material properties control the final geometry, is produced using both models. This transition takes place with increasing strain rate, decreasing elastic moduli or increasing viscosity (reflecting in each case the increasing influence of the elastic component in the Maxwell elasto-viscous rheology). The transition described here is mechanically feasible but is associated with very high stresses in the competent layer (on the order of GPa), which is improbable under natural conditions.
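The role of strain rate can be made explicit with the standard one-dimensional Maxwell relations (textbook background, not taken from the paper):

```latex
% Maxwell elasto-viscous element: elastic and viscous strain rates add.
\[
\dot{\varepsilon} = \frac{\dot{\sigma}}{E} + \frac{\sigma}{\eta},
\qquad
\tau = \frac{\eta}{E},
\qquad
De = \tau \, \dot{\varepsilon} = \frac{\eta \, \dot{\varepsilon}}{E}.
\]
% The dimensionless group De measures the weight of the elastic term:
% raising the strain rate or the viscosity, or lowering the elastic
% modulus E, increases De. The eight orders of magnitude separating the
% strain rates of the two earlier studies therefore place them in very
% different parts of the elasto-viscous field, as the abstract reports.
```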
Abstract:
Examples from the Murray-Darling basin in Australia are used to illustrate different methods of disaggregation of reconnaissance-scale maps. One approach to disaggregation revolves around the de-convolution of the soil-landscape paradigm elaborated during a soil survey. The descriptions of soil map units and block diagrams in a soil survey report detail soil-landscape relationships, or soil toposequences, that can be used to disaggregate map units into component landscape elements. Toposequences can be visualised on a computer by combining soil maps with digital elevation data. Expert knowledge or statistics can be used to implement the disaggregation; the use of a restructuring element and k-means clustering is illustrated. Another approach to disaggregation uses training areas to develop rules to extrapolate detailed mapping into other, larger areas where detailed mapping is unavailable. A two-level decision tree example is presented: at one level, the decision tree method is used to capture mapping rules from the training area; at the other, it is used to define the domain over which those rules can be extrapolated.
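As an illustration of the clustering route to disaggregation, here is a self-contained k-means sketch over per-pixel terrain attributes; the toy data and attribute choice (elevation and slope) are hypothetical, since the abstract does not specify the covariates or implementation:

```java
import java.util.*;

// Illustrative k-means over per-pixel terrain attributes, e.g. elevation
// and slope from a DEM, splitting a reconnaissance map unit into
// candidate landscape elements.
public class SoilKMeans {

    static int[] kmeans(double[][] x, int k, int iters, long seed) {
        Random rnd = new Random(seed);
        int n = x.length, d = x[0].length;
        double[][] c = new double[k][];
        for (int j = 0; j < k; j++) c[j] = x[rnd.nextInt(n)].clone(); // init centroids
        int[] label = new int[n];
        for (int it = 0; it < iters; it++) {
            for (int i = 0; i < n; i++) {                  // assignment step
                double best = Double.MAX_VALUE;
                for (int j = 0; j < k; j++) {
                    double dist = 0;
                    for (int a = 0; a < d; a++)
                        dist += (x[i][a] - c[j][a]) * (x[i][a] - c[j][a]);
                    if (dist < best) { best = dist; label[i] = j; }
                }
            }
            double[][] sum = new double[k][d];             // update step
            int[] cnt = new int[k];
            for (int i = 0; i < n; i++) {
                cnt[label[i]]++;
                for (int a = 0; a < d; a++) sum[label[i]][a] += x[i][a];
            }
            for (int j = 0; j < k; j++)
                if (cnt[j] > 0)
                    for (int a = 0; a < d; a++) c[j][a] = sum[j][a] / cnt[j];
        }
        return label;
    }

    public static void main(String[] args) {
        // Toy pixels as {elevation (m), slope (%)}; two landscape elements expected.
        double[][] terrain = { {120, 2}, {125, 3}, {300, 25}, {310, 22}, {118, 1} };
        System.out.println(Arrays.toString(kmeans(terrain, 2, 20, 42L)));
    }
}
```

In practice the attributes would be standardized before clustering so that elevation does not dominate the distance computation.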