948 results for execution trace


Relevance: 60.00%

Publisher:

Abstract:

Performance optimization of a complex computer system requires understanding the system's runtime behavior. As software grows in size and complexity, performance optimization becomes an increasingly important part of the product development process. With the use of more powerful processors, energy consumption and heat generation have also become ever greater problems, especially in small, portable devices. To limit heat and energy problems, performance scaling methods have been developed, which further increase system complexity and the need for performance optimization. In this work, a visualization and analysis tool was developed to make runtime behavior easier to understand. In addition, a performance metric was developed that allows different scaling methods to be compared and evaluated independently of the execution environment, based either on an execution trace or on theoretical analysis. The tool presents a trace collected at run time in an easily understandable way. Using three-dimensional graphics, it shows, among other things, the processes, the processor load, the operation of the scaling methods, and the energy consumption. The tool also produces, for a user-selected part of the execution view, numerical data that includes several relevant performance figures and statistics. The applicability of the tool was examined by analyzing an execution trace obtained from a real device as well as a simulation of performance scaling. The effect of the scaling mechanism's parameters on the performance of the simulated device was analyzed.
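
As a rough, hypothetical illustration (not taken from the thesis) of the kind of numerical figure such a tool derives from a trace, the sketch below computes per-process CPU share from scheduler switch events; all names and values are invented.

    # Hypothetical sketch: derive per-process CPU load from a scheduling trace
    # given as (timestamp_s, process) switch events, the kind of figure the tool
    # described above reports for a user-selected interval.
    from collections import defaultdict

    def cpu_share(switch_events, end_time):
        busy = defaultdict(float)
        pairs = zip(switch_events, switch_events[1:] + [(end_time, None)])
        for (t, proc), (t_next, _) in pairs:
            busy[proc] += t_next - t
        total = end_time - switch_events[0][0]
        return {proc: round(100 * dt / total, 1) for proc, dt in busy.items()}

    trace = [(0.0, "idle"), (0.2, "ui"), (0.5, "decoder"), (0.9, "idle")]
    print(cpu_share(trace, end_time=1.0))
    # {'idle': 30.0, 'ui': 30.0, 'decoder': 40.0}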

Relevance: 60.00%

Publisher:

Abstract:

We propose a semi-automatic approach for the reverse engineering of UML sequence diagrams. Our approach starts from a set of execution traces that are automatically aligned to determine the common behavior of the system. Sequence diagrams are then extracted with the help of an interactive visualization, which allows navigation through the execution traces and the production of extraction operations. We provide a concrete illustration of our approach with a case study, and we show in particular that the sequence diagrams we generate are more meaningful and more compact than those obtained by fully automated methods.
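
The abstract gives no implementation details; as a loose sketch of the alignment step it mentions, the hypothetical code below aligns two call traces with a longest-common-subsequence pass to expose the behavior they share. The actual approach is interactive and works on richer event data.

    # Hypothetical sketch: align two execution traces (lists of call events)
    # with a classic LCS dynamic program, keeping the calls common to both.
    def align_traces(trace_a, trace_b):
        n, m = len(trace_a), len(trace_b)
        # lcs[i][j] = length of the LCS of trace_a[:i] and trace_b[:j]
        lcs = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                if trace_a[i - 1] == trace_b[j - 1]:
                    lcs[i][j] = lcs[i - 1][j - 1] + 1
                else:
                    lcs[i][j] = max(lcs[i - 1][j], lcs[i][j - 1])
        # Walk back through the table to recover the shared call sequence.
        common, i, j = [], n, m
        while i > 0 and j > 0:
            if trace_a[i - 1] == trace_b[j - 1]:
                common.append(trace_a[i - 1])
                i, j = i - 1, j - 1
            elif lcs[i - 1][j] >= lcs[i][j - 1]:
                i -= 1
            else:
                j -= 1
        return list(reversed(common))

    # Example: two runs of the same scenario with slightly different noise.
    t1 = ["Shop.login", "Cart.add", "Cart.add", "Shop.checkout"]
    t2 = ["Shop.login", "Cart.add", "Shop.browse", "Shop.checkout"]
    print(align_traces(t1, t2))  # ['Shop.login', 'Cart.add', 'Shop.checkout']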

Relevance: 60.00%

Publisher:

Abstract:

Understanding the objects in object-oriented programs is an important task in program comprehension. JavaScript (JS) is a dynamic object-oriented language, and its dynamism makes understanding source code very difficult. In this thesis, we focus on object analysis for JS programs. Our approach automatically builds an object graph, inspired by the UML class diagram, from a concrete execution of a JS program. The resulting graph shows the structure of the objects as well as the interactions between them. Our approach uses a source code transformation to produce this information during execution. This transformation makes it possible to collect complete information about the objects that are created and to intercept all modifications of these objects. From this information, we apply several abstractions aimed at producing a more compact and intuitive representation of the objects. This approach is implemented in the tool JSTI. To evaluate the usefulness of the approach, we measured its performance as well as the degree of reduction due to the abstractions. We used the ten V8 reference programs for this comparison. The results show that JSTI is efficient enough to be used in practice, with an average slowdown of 14x. Moreover, for 9 of the 10 programs, the graphs are compact enough to be visualized. We also validated the approach qualitatively by manually inspecting the generated graphs. These graphs generally correspond very well to the expected result. Keywords: program analysis, dynamic analysis, JavaScript, profiling.
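
As a loose illustration of the interception idea (written in Python rather than JavaScript and unrelated to JSTI's actual instrumentation), the hypothetical sketch below records object creations and object-valued attribute writes during a concrete run and keeps them as a small object graph.

    # Hypothetical sketch: record object creation and attribute writes at run
    # time, then summarize them as an object graph (one node per object, one
    # edge per object-valued attribute). JSTI instruments JavaScript source;
    # this only mimics the general idea.
    class Traced:
        graph = {}  # id(obj) -> {attr_name: id(target)} for object-valued attrs

        def __init__(self):
            Traced.graph[id(self)] = {}

        def __setattr__(self, name, value):
            if isinstance(value, Traced):
                Traced.graph[id(self)][name] = id(value)
            object.__setattr__(self, name, value)

    class Engine(Traced): pass
    class Car(Traced): pass

    car, engine = Car(), Engine()
    car.engine = engine          # recorded as an edge Car -> Engine
    car.plate = "ABC-1234"       # scalar attribute, not part of the graph
    print(Traced.graph)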

Relevance: 60.00%

Publisher:

Abstract:

Malicious programs (malware) can cause severe damage to computer systems and data. The mechanism that the human immune system uses to detect and protect against organisms that threaten the human body is efficient and can be adapted to detect malware attacks. In this paper we propose a system to perform distributed malware collection, analysis and detection, the last of these inspired by the human immune system. After malware samples are collected from the Internet, they are dynamically analyzed to provide execution traces at the operating system level and network flows, which are used to create a behavioral model and to generate a detection signature. These signatures serve as input to a malware detector, acting as the antibodies in the antigen detection process. This allows us to understand the malware attack and aids in the infection removal procedures. © 2012 Springer-Verlag.
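
A minimal, hypothetical sketch (not the system described in the paper) of how a behavioral signature distilled from execution traces can act as an "antibody": the detector flags a new trace whose observed behavior covers enough of the signature. All names, calls and thresholds below are invented.

    # Hypothetical sketch: a behavioral signature is a set of (action, argument)
    # pairs distilled from traces of known malware; a new execution trace is
    # flagged when it matches a sufficient fraction of the signature.
    def matches(signature, trace, threshold=0.8):
        observed = {(action, arg) for action, arg in trace}
        hits = sum(1 for item in signature if item in observed)
        return hits / len(signature) >= threshold

    signature = {("CreateFile", "autorun.inf"),
                 ("RegSetValue", "HKLM Run key"),
                 ("connect", "198.51.100.7:6667")}

    trace = [("CreateFile", "autorun.inf"),
             ("RegSetValue", "HKLM Run key"),
             ("connect", "198.51.100.7:6667"),
             ("WriteFile", "report.doc")]

    print(matches(signature, trace))  # True: the behavior covers the signature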

Relevance: 40.00%

Publisher:

Abstract:

The big data era has dramatically transformed our lives; however, security incidents such as data breaches can put sensitive data (e.g., photos, identities, genomes) at risk. To protect users' data privacy, there is a growing interest in building secure cloud computing systems, which keep sensitive data inputs hidden, even from computation providers. Conceptually, secure cloud computing systems leverage cryptographic techniques (e.g., secure multiparty computation) and trusted hardware (e.g., secure processors) to instantiate a “secure” abstract machine consisting of a CPU and encrypted memory, so that an adversary cannot learn information through either the computation within the CPU or the data in the memory. Unfortunately, evidence has shown that side channels (e.g., memory accesses, timing, and termination) in such a “secure” abstract machine can leak highly sensitive information, including cryptographic keys that form the root of trust for the secure systems. This thesis broadly expands the investigation of a research direction called trace oblivious computation, where programming language techniques are employed to prevent side-channel information leakage. We demonstrate the feasibility of trace oblivious computation by formalizing and building several systems, including GhostRider, a hardware-software co-design that provides a hardware-based trace oblivious computing solution; SCVM, an automatic RAM-model secure computation system; and ObliVM, a programming framework that helps programmers develop applications. All of these systems enjoy formal security guarantees while demonstrating better performance than prior systems, by one to several orders of magnitude.
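
To make the memory-access side channel concrete, here is a toy sketch, unrelated to the actual GhostRider/SCVM/ObliVM designs: a naive lookup whose access pattern reveals a secret index versus a linear-scan version whose trace is the same for every index. The real systems use far more efficient constructions (such as oblivious RAM); the scan only illustrates why hiding the access pattern has a cost.

    # Hypothetical sketch of trace obliviousness: the naive lookup touches one
    # address that depends on the secret index, so the memory trace leaks it;
    # the oblivious version touches every element in the same order regardless
    # of the secret, at linear cost.
    def leaky_lookup(table, secret_index):
        return table[secret_index]            # access pattern reveals the index

    def oblivious_lookup(table, secret_index):
        result = 0
        for i, value in enumerate(table):     # same access pattern for any index
            take = int(i == secret_index)     # hardened designs also avoid this
            result = take * value + (1 - take) * result
        return result

    table = [7, 11, 13, 17]
    assert oblivious_lookup(table, 2) == leaky_lookup(table, 2) == 13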

Relevance: 30.00%

Publisher:

Abstract:

Dynamic Conditional Execution (DCE) is an alternative for reducing the costs associated with mispredicted branches. The basic idea is to fetch all flows produced by a branch that satisfy certain restrictions on complexity and size. As a consequence, fewer predictions are made and, thus, fewer branches are mispredicted. However, like other multi-flow solutions, DCE requires a more complex control structure. In the DCE architecture, it is observed that several replicas of the same instruction are dispatched to the functional units, blocking resources that could be used by other instructions. These replicas are generated after the convergence point of the various flows in execution and are needed to guarantee correct semantics between data-dependent instructions. Moreover, DCE keeps producing replicas until the branch that generated the flows is resolved. Thus, a whole section of code may be replicated, reducing performance. A natural alternative to this problem is to reuse the sections (or traces) that are replicated. The goal of this work is to analyze and evaluate the effectiveness of value reuse in the DCE architecture. As will be shown, the reuse principle, at different granularities, can effectively reduce the replica problem and lead to performance gains.
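
As a loose software analogy for value reuse at trace granularity (not the DCE hardware mechanism itself), the hypothetical sketch below keeps a reuse table keyed by a trace's input operands, so a replicated trace dispatched again with identical inputs returns recorded outputs instead of re-executing.

    # Hypothetical sketch: a reuse table keyed by (trace id, input operands);
    # on a hit, the recorded outputs are reused instead of re-executing the
    # replicated instructions.
    reuse_table = {}

    def execute_trace(trace_id, inputs, compute):
        key = (trace_id, inputs)
        if key in reuse_table:            # hit: skip the replicated instructions
            return reuse_table[key], True
        outputs = compute(*inputs)        # miss: execute and record
        reuse_table[key] = outputs
        return outputs, False

    def addr(base, offset):
        return base + 4 * offset

    # The same post-convergence trace replicated on two speculative flows.
    print(execute_trace("T17", (0x1000, 3), addr))   # (4108, False) -> computed
    print(execute_trace("T17", (0x1000, 3), addr))   # (4108, True)  -> reused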

Relevance: 30.00%

Publisher:

Abstract:

When reengineering legacy systems, it is crucial to assess whether the legacy behavior has been preserved or how it changed due to the reengineering effort. Ideally, if a legacy system is covered by tests, running the tests on the new version can identify potential differences or discrepancies. However, writing tests for an unknown and large system is difficult because of the lack of internal knowledge; it is especially difficult to bring the system to an appropriate state. Our solution is based on the acknowledgment that one of the few trustworthy pieces of information available when approaching a legacy system is the running system itself. Our approach reifies the execution traces and uses logic programming to express tests on them. Thereby it eliminates the need to programmatically bring the system into a particular state, and it hands the test writer a high-level abstraction mechanism for querying the trace. The resulting system, called TESTLOG, was used on several real-world case studies to validate our claims.
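
As an informal illustration of querying a reified trace (in Python, not TESTLOG's actual logic-programming syntax), the hypothetical sketch below treats trace events as plain data and expresses a check over them declaratively instead of driving the live system into a particular state.

    # Hypothetical sketch: a reified trace is just data (sender, message,
    # receiver, return value), and a "test" is a query over that data.
    trace = [
        ("Account", "deposit", "Ledger", None),
        ("Account", "balance", "Ledger", 150),
        ("Report",  "render",  "Printer", "ok"),
    ]

    def query(trace, sender=None, message=None):
        return [event for event in trace
                if (sender is None or event[0] == sender)
                and (message is None or event[1] == message)]

    # "Every balance request answered through the Ledger returned a number."
    assert all(isinstance(ret, int)
               for _, _, recv, ret in query(trace, message="balance")
               if recv == "Ledger")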

Relevance: 20.00%

Publisher:

Abstract:

This study is a reflection on the similarities between uptake and trace, and on translation taken as an event - at once possible and impossible - that sets off and constitutes meaning through the language game played by the subjects of communication: text and translator. Both Austin and Derrida, each in his own way, show that meaning is part of the human language process. Uptake, in Austin's view, guarantees the existence of human language, assured by a process of recognition between the subjects of communication, a process through which the production of meaning takes place. The trace, according to Derrida, sets off, through human language, the crashing of meaning and destroys the possibility of anyone reaching the origin. In this study, taking into consideration the similarities between uptake and trace, I try to disclose translation taken as an event which at once contaminates the languages and is contaminated by them.

Relevance: 20.00%

Publisher:

Abstract:

We solve the operator ordering problem for the quantum continuous integrable su(1,1) Landau-Lifshitz model, and give a prescription to obtain the quantum trace identities, and the spectrum for the higher-order local charges. We also show that this method, based on operator regularization and renormalization, which guarantees quantum integrability, as well as the construction of self-adjoint extensions, can be used as an alternative to the discretization procedure, and unlike the latter, is based only on integrable representations. (C) 2010 American Institute of Physics. [doi:10.1063/1.3509374]

Relevance: 20.00%

Publisher:

Abstract:

The biogeochemical processes affecting the transport and cycling of terrestrial organic carbon in coastal and transition areas are still not fully understood. One means of distinguishing between the sources of organic materials contributing to particulate organic matter (POM) in Babitonga Bay waters and sediments is the direct measurement of delta(13)C of dissolved inorganic carbon (DIC) and of delta(13)C and delta(15)N in the organic constituents. An isotopic survey was taken from samples collected in the Bay in late spring of 2004. The results indicate that the delta(13)C and delta(15)N compositions of OM varied from -21.7 parts per thousand to -26.2 parts per thousand, and from +9.2 parts per thousand to -0.1 parts per thousand, respectively. delta(13)C from DIC ranges from +0.04 parts per thousand to -12.7 parts per thousand. The difference in the isotope compositions enables the determination of three distinct end-members: terrestrial, marine and urban. Moreover, the evaluation of source contributions to the particulate organic matter (POM) in the Bay enables assessment of the anthropogenic impact. Comparing the depleted values of delta(13)C(DIC) and delta(13)C(POC), it is possible to further understand the carbon dynamics within Babitonga Bay. (C) 2010 Elsevier BV. All rights reserved.
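
Source apportionment with isotopic end-members is usually done with a linear isotope mass balance; the sketch below is a minimal illustration with invented end-member signatures and sample values, not the study's data.

    # Hypothetical sketch of a three end-member mixing model: the measured
    # delta13C and delta15N of POM are written as weighted averages of the
    # terrestrial, marine and urban end-member signatures, with the fractions
    # summing to 1, and the 3x3 linear system is solved for the fractions.
    import numpy as np

    # rows: delta13C balance, delta15N balance, fractions sum to 1
    # columns: terrestrial, marine, urban end-members (illustrative signatures)
    A = np.array([[-28.0, -20.0, -23.0],
                  [  2.0,   6.0,  12.0],
                  [  1.0,   1.0,   1.0]])
    measured = np.array([-24.6, 5.2, 1.0])   # sample delta13C, delta15N, total

    fractions = np.linalg.solve(A, measured)
    print(dict(zip(["terrestrial", "marine", "urban"], fractions.round(3))))
    # {'terrestrial': 0.5, 'marine': 0.3, 'urban': 0.2}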

Relevance: 20.00%

Publisher:

Abstract:

The development of cancer is a complex, multistage process during which a normal cell undergoes genetic changes that result in phenotypic alterations and in the acquisition of the ability to invade other sites. Inductively coupled plasma optical emission spectroscopy was used to estimate the contents of Al, Ca, Cd, Cr, Cu, Fe, K, Mg, Mn, Na, P, Pb, and Zn in healthy kidney and renal cell carcinoma (RCC), and significant differences were found for all elements. Along with the progression of the malignant disease, a progressive decrease of Cd and K was observed. In fact, for Cd, the concentration in stage T4 was 263.9 times lower than in stage T1, and for K, the concentration in stage T4 was 1.73 times lower than in stage T1. Progressive accumulation was detected for P, Pb, and Zn in stage T4. For P, the concentration in stage T4 was 11.1 times higher than in stage T1; for Pb, the concentration in stage T4 was 232.7 times higher than in T1; and for Zn, the concentration in T4 was 8.452 times higher than in T1. This study highlights the marked differences in the concentrations of selected trace metals in different malignant tumor stages. These findings indicate that some trace metals may play important roles in the pathogenesis of RCC.

Relevance: 20.00%

Publisher:

Abstract:

The degree of homogeneity is normally assessed from the variability of the results of independent analyses of several (e.g., 15) normal-scale replicates. Large sample instrumental neutron activation analysis (LS-INAA) with a collimated Ge detector allows inspecting the degree of homogeneity of the initial batch material using a kilogram-size sample. The test is based on the spatial distribution of induced radioactivity. Such a test was applied to samples of approximately 1 kg of Brazilian whole (green) coffee beans (Coffea arabica and Coffea canephora) in the frame of the development of a coffee reference material. Results indicated that the material does not contain significant elemental composition inhomogeneities between batches of approximately 30-50 g, the masses typically forming the starting base of a reference material.
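
A small, hypothetical illustration of the conventional replicate-based test mentioned in the first sentence (not the spatial LS-INAA procedure): compare the relative standard deviation of replicate results against a chosen acceptance limit. The concentrations and the 5 % limit below are invented.

    # Hypothetical sketch: classical homogeneity check from n replicate analyses
    # of normal-size test portions; the element is considered sufficiently
    # homogeneous if the relative standard deviation stays below the limit.
    from statistics import mean, stdev

    def homogeneity_ok(replicates, max_rsd_percent=5.0):
        rsd = 100.0 * stdev(replicates) / mean(replicates)
        return round(rsd, 2), rsd <= max_rsd_percent

    # 15 illustrative K concentrations (mg/kg) in coffee test portions
    k_results = [14800, 15200, 15050, 14900, 15100, 14950, 15000, 15150,
                 14850, 15080, 14980, 15020, 15120, 14890, 15060]
    print(homogeneity_ok(k_results))   # about 0.8 % RSD -> passes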

Relevance: 20.00%

Publisher:

Abstract:

In 2003-2004, several food items were purchased from large commercial outlets in Coimbra, Portugal. These items included meats (chicken, pork, beef), eggs, rice, beans and vegetables (tomato, carrot, potato, cabbage, broccoli, lettuce). Elemental analysis was carried out through INAA at the Technological and Nuclear Institute (ITN, Portugal), the Nuclear Energy Centre for Agriculture (CENA, Brazil), and the Nuclear Engineering Teaching Lab of the University of Texas at Austin (NETL, USA). At the latter two, INAA was also combined with Compton suppression. It can be concluded that by applying Compton suppression (1) the detection limits for arsenic, copper and potassium improved; (2) the counting-statistics error for molybdenum diminished; and (3) the long-lived zinc had its 1115-keV photopeak better defined. In general, the improvement sought by introducing Compton suppression in foodstuff analysis was not significant. Lettuce, cabbage and chicken (liver, stomach, heart) are the richest diets in terms of human nutrients.

Relevance: 20.00%

Publisher:

Abstract:

This paper presents possible alternative options for the removal of trace elements from drinking water supplies. Arsenic and chromium are two of the most toxic pollutants, introduced into natural waters from a variety of sources and causing various adverse effects on living organisms. The performance of three filter bed methods was evaluated in the laboratory. Experiments were conducted to investigate the sorption of arsenic and chromium on carbon steel and the removal of trace elements from drinking water with a household filtration process. The affinity of the arsenic and chromium species for Fe / Fe3C (iron / iron carbide) sites is the key factor controlling the removal of these elements. The method is based on the use of powdered block carbon, powdered carbon steel and ceramic spheres in ion-sorption columns as a cleaning process. The modified powdered block carbon is a satisfactory and economical sorbent for trace elements (arsenite and chromate) dissolved in water, owing to its low unit cost of about $23 and its compatibility with the traditional household filtration system.

Relevance: 20.00%

Publisher:

Abstract:

This paper presents the theoretical background, the architecture (using the "4+1" model), and the use of AdapLib, a library for the execution of adaptive devices. The library was created with the aim of staying faithful to the theory of adaptive devices and of allowing easy extension to the specific details of solutions that employ this kind of device. As an example, a case study is presented in which the library was used to build a proof of concept for monitoring and diagnosing problems in an online news portal.
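
As a rough Python sketch of the adaptive-device idea the library implements (AdapLib's actual API and language differ, and every name below is hypothetical), a rule-driven automaton can carry adaptive actions that rewrite its own rule set while input is being consumed.

    # Hypothetical sketch: an adaptive device as a rule-driven automaton whose
    # rules may carry an adaptive action that edits the rule set at run time.
    class AdaptiveDevice:
        def __init__(self, start, accepting):
            self.rules = {}          # (state, symbol) -> (next_state, action)
            self.state, self.accepting = start, set(accepting)

        def add_rule(self, state, symbol, next_state, action=None):
            self.rules[(state, symbol)] = (next_state, action)

        def step(self, symbol):
            next_state, action = self.rules[(self.state, symbol)]
            if action:               # adaptive action may rewrite the rule set
                action(self)
            self.state = next_state

        def run(self, symbols):
            for s in symbols:
                self.step(s)
            return self.state in self.accepting

    # Example: after the first 'a' is seen, an adaptive action adds a rule that
    # lets the device also consume 'b' events from then on.
    def enable_b(device):
        device.add_rule("armed", "b", "armed")

    d = AdaptiveDevice(start="idle", accepting={"armed"})
    d.add_rule("idle", "a", "armed", action=enable_b)
    print(d.run(["a", "b", "b"]))   # True; 'b' rules exist only after the 'a'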