929 results for Program Analysis
Abstract:
Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems, or anomalies, arise from rare program behavior caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent the anomalous situations that arise from buffer overflows. The current state of the art in anomaly detection systems is relatively primitive and depends mainly on static code checking to handle buffer overflow attacks. For protection, stack guards and heap guards are also widely used. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by the starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behavior of programs under normal conditions of usage. This captured behavior allows the system to detect anomalies with a low rate of false positives. Data are presented which show that a Bayesian network over frequency variations responds effectively to induced buffer overflows. It can also help administrators detect deviations in program flow introduced by errors.
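The following is a minimal Python sketch of the frequency-profiling idea described above: a normal profile of per-trace system call frequencies and an anomaly score measured as the total deviation from that profile. The function names and the deviation metric are illustrative assumptions, not the dissertation's actual sequence-set or Bayesian formulation.

```python
from collections import Counter

def build_profile(traces):
    """Average per-call frequency over a set of normal system call traces."""
    totals = Counter()
    for trace in traces:
        counts = Counter(trace)
        n = len(trace)
        for call, c in counts.items():
            totals[call] += c / n
    return {call: s / len(traces) for call, s in totals.items()}

def anomaly_score(trace, profile):
    """Sum of absolute deviations between observed and profiled call frequencies."""
    counts = Counter(trace)
    n = len(trace)
    calls = set(profile) | set(counts)
    return sum(abs(counts.get(c, 0) / n - profile.get(c, 0.0)) for c in calls)

normal = [["open", "read", "read", "close"], ["open", "read", "close"]]
profile = build_profile(normal)
print(anomaly_score(["open", "execve", "mmap", "mmap"], profile))
```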
Abstract:
Code clones are fragments of source code that duplicate or closely resemble other fragments of the program code. The presence of code clones is considered a bad feature of software because it makes software maintenance difficult. Methods for code clone detection have gained immense significance in the last few years, as they play a significant role in engineering applications such as analysis of program code, program understanding, plagiarism detection, error detection, code compaction and many similar tasks. Despite these drawbacks, several features of code clones, if properly utilized, can make the software development process easier. In this work, we point out one such feature of code clones, which highlights the relevance of code clones in test sequence identification. Here, program slicing is used in code clone detection; see the sketch below. In addition, a classification of code clones is presented and the benefit of using program slicing in code clone detection is also discussed in this work.
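As an illustration of how program slicing can feed clone detection, the Python sketch below computes backward slices over a toy straight-line program and compares them with a Jaccard similarity; fragments whose slices largely overlap are clone candidates. This is a simplified stand-in under stated assumptions, not the detection or classification procedure of the work itself.

```python
def backward_slice(stmts, target):
    """Backward slice over straight-line code: indices of statements that
    (transitively) affect `target`. Each statement is (defined_var, set_of_used_vars)."""
    needed, sliced = {target}, set()
    for i in range(len(stmts) - 1, -1, -1):
        var, uses = stmts[i]
        if var in needed:
            sliced.add(i)
            needed |= uses
    return sliced

def slice_similarity(s1, s2):
    """Jaccard similarity of two slices; high overlap suggests a clone candidate."""
    return len(s1 & s2) / len(s1 | s2) if (s1 | s2) else 1.0

prog = [("a", {"x"}), ("b", {"a"}), ("c", {"y"}), ("d", {"a", "b"})]
print(slice_similarity(backward_slice(prog, "d"), backward_slice(prog, "b")))
```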
Abstract:
This study reports the details of the finite element analysis of eleven shear-critical partially prestressed concrete T-beams having steel fibers over partial or full depth. Prestressed T-beams having shear span-to-depth ratios of 2.65 and 1.59 that failed in shear have been analyzed using the ‘ANSYS’ program. The ‘ANSYS’ model accounts for nonlinearities such as bond-slip of longitudinal reinforcement, post-cracking tensile stiffness of the concrete, stress transfer across the cracked blocks of the concrete, and load sustenance through the bridging action of steel fibers at the crack interface. The concrete is modeled using ‘SOLID65’, an eight-node brick element capable of simulating the cracking and crushing behavior of brittle materials. The reinforcement, such as deformed bars, prestressing wires and steel fibers, has been modeled discretely using ‘LINK8’, a 3D spar element. The slip between the reinforcement (rebars, fibers) and the concrete has been modeled using ‘COMBIN39’, a nonlinear spring element connecting the nodes of the ‘LINK8’ elements representing the reinforcement and the nodes of the ‘SOLID65’ elements representing the concrete. The ‘ANSYS’ model correctly predicted the diagonal tension failure and shear compression failure of prestressed concrete beams observed in the experiment. The capability of the model to capture the critical crack regions, loads and deflections for various types of shear failures in prestressed concrete beams has been illustrated.
Abstract:
Consumers are becoming more concerned about food quality, especially regarding how, when and where foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi et al., 2006). Therefore, during recent years there has been a growing interest in methods for food quality assessment, especially in picture-development methods as a complement to traditional chemical analysis of single compounds (Kahl et al., 2006). Biocrystallization, one of the picture-developing methods, is based on the crystallographic phenomenon that, when aqueous solutions of CuCl2 dihydrate are crystallized with the addition of organic solutions originating, e.g., from crop samples, biocrystallograms with reproducible crystal patterns are generated (Kleber & Steinike-Hartung, 1959). Its output is a crystal pattern on glass plates from which different variables (numbers) can be calculated using image analysis. However, there is a lack of a standardized evaluation method to quantify the morphological features of the biocrystallogram image. Therefore, the main aims of this research are (1) to optimize an existing statistical model in order to describe all the effects that contribute to the experiment; (2) to investigate the effect of image parameters on the texture analysis of the biocrystallogram images, i.e., region of interest (ROI), color transformation and histogram matching, on samples from the project 020E170/F financed by the Federal Ministry of Food, Agriculture and Consumer Protection (BMELV), where the samples are wheat and carrots from controlled field and farm trials; and (3) to relate the texture parameter with the strongest effect to the visual evaluation criteria that have been developed by a group of researchers (University of Kassel, Germany; Louis Bolk Institute (LBI), Netherlands; and Biodynamic Research Association Denmark (BRAD), Denmark), in order to clarify the relation between the texture parameters and the visual characteristics of an image. The refined statistical model was accomplished using an lme model with repeated measurements via crossed effects, programmed in R (version 2.1.0). The validity of the F and P values was checked against the SAS program: while the ANOVA yields the same F values, the P values are larger in R because of its more conservative approach. The refined model calculates more significant P values. The optimization of the image analysis deals with the following parameters: ROI (the region of interest, i.e., the area around the geometrical center), color transformation (calculation of a one-dimensional gray-level value from the three-dimensional color information of the scanned picture, which is necessary for the texture analysis), and histogram matching (normalization of the histogram of the picture to enhance the contrast and to minimize errors from lighting conditions). The samples were wheat from the DOC trial with 4 field replicates for the years 2003 and 2005, "market samples" (organic and conventional neighbors with the same variety) for 2004 and 2005, carrot samples obtained from the University of Kassel (2 varieties, 2 nitrogen treatments) for the years 2004, 2005 and 2006, and "market samples" of carrot for the years 2004 and 2005. The criterion for the optimization was repeatability of the differentiation of the samples over the different harvest years. For different samples, different ROIs were found, which reflects the different pictures.
The color transformation that shows differentiation most efficiently relies on the gray scale, i.e., the equal-weight color transformation. The second dimension of the color transformation, the color wavelength (hue), appeared as an effect only in some years, for carrots treated with different nitrate fertilizer levels. The best histogram matching is to the Gaussian distribution. The approach was then to find a connection between the variables from the textural image analysis and the different visual criteria. The relation between the texture parameters and the visual evaluation criteria was limited to the carrot samples, especially since these could be well differentiated by the texture analysis. It was possible to connect groups of variables of the texture analysis with groups of criteria from the visual evaluation. These selected variables were able to differentiate the samples but not to classify them according to the treatment. In contrast, with visual criteria that describe the picture as a whole, classification is possible in 80% of the sample cases. This clearly shows the limits of the single-variable approach of the image analysis (texture analysis).
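To make the preprocessing steps concrete, here is a minimal Python sketch of the three image parameters discussed above: a square ROI around the geometric centre, an equal-weight gray-level transformation, and histogram matching to a Gaussian target. The function names and parameter values are hypothetical and do not reproduce the study's actual texture-analysis pipeline.

```python
import numpy as np
from scipy.stats import norm

def to_gray(rgb):
    """Equal-weight color transformation: average of the three channels."""
    return rgb.mean(axis=2)

def crop_roi(img, frac=0.5):
    """Square region of interest centred on the geometric centre of the image."""
    h, w = img.shape
    rh, rw = int(h * frac), int(w * frac)
    top, left = (h - rh) // 2, (w - rw) // 2
    return img[top:top + rh, left:left + rw]

def match_to_gaussian(img, mean=128.0, std=30.0):
    """Histogram matching to a Gaussian target via the empirical CDF (quantile mapping)."""
    flat = img.ravel()
    ranks = flat.argsort().argsort() / (flat.size - 1)   # empirical CDF in [0, 1]
    matched = norm.ppf(np.clip(ranks, 1e-6, 1 - 1e-6), loc=mean, scale=std)
    return matched.reshape(img.shape)

rgb = np.random.rand(64, 64, 3) * 255   # stand-in for a scanned biocrystallogram
gray = crop_roi(to_gray(rgb))
print(match_to_gaussian(gray).mean(), match_to_gaussian(gray).std())
```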
Abstract:
During recent years, quantum information processing and the study of N-qubit quantum systems have attracted a lot of interest, both in theory and experiment. Apart from the promise of performing efficient quantum information protocols, such as quantum key distribution, teleportation or quantum computation, these investigations have also revealed a great deal of difficulties which still need to be resolved in practice. Quantum information protocols rely on the application of unitary and non-unitary quantum operations that act on a given set of quantum mechanical two-state systems (qubits) to form (entangled) states, in which the information is encoded. The overall system of qubits is often referred to as a quantum register. Today the entanglement in a quantum register is known as the key resource for many protocols of quantum computation and quantum information theory. However, despite the successful demonstration of several protocols, such as teleportation or quantum key distribution, there are still many open questions of how entanglement affects the efficiency of quantum algorithms or how it can be protected against noisy environments. To facilitate the simulation of such N-qubit quantum systems and the analysis of their entanglement properties, we have developed the Feynman program. The program package provides all necessary tools in order to define and to deal with quantum registers, quantum gates and quantum operations. Using an interactive and easily extendible design within the framework of the computer algebra system Maple, the Feynman program is a powerful toolbox not only for teaching the basic and more advanced concepts of quantum information but also for studying their physical realization in the future. To this end, the Feynman program implements a selection of algebraic separability criteria for bipartite and multipartite mixed states as well as the most frequently used entanglement measures from the literature. Additionally, the program supports the work with quantum operations and their associated (Jamiolkowski) dual states. Based on the implementation of several popular decoherence models, we provide tools especially for the quantitative analysis of quantum operations. As an application of the developed tools, we further present two case studies in which the entanglement in two atomic processes is investigated. In particular, we have studied the change of the electron-ion spin entanglement in atomic photoionization and the photon-photon polarization entanglement in the two-photon decay of hydrogen. The results show that both processes are, in principle, suitable for the creation and control of entanglement. Apart from process-specific parameters like initial atom polarization, it is mainly the process geometry which offers a simple and effective instrument to adjust the final-state entanglement. Finally, for the case of the two-photon decay of hydrogen-like systems, we study the difference between nonlocal quantum correlations, as given by the violation of the Bell inequality, and the concurrence as a true entanglement measure.
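The Feynman program itself is a Maple package; as a language-neutral illustration of one of the entanglement measures it implements, the following Python sketch computes the Wootters concurrence of a two-qubit density matrix. It is a minimal stand-alone example, not code from the Feynman program.

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix."""
    sy = np.array([[0, -1j], [1j, 0]])
    yy = np.kron(sy, sy)
    rho_tilde = yy @ rho.conj() @ yy
    # eigenvalues of rho * rho_tilde are real and non-negative up to numerical noise
    eigs = np.linalg.eigvals(rho @ rho_tilde)
    lam = np.sort(np.sqrt(np.clip(eigs.real, 0, None)))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Bell state |Phi+> = (|00> + |11>)/sqrt(2): maximally entangled, concurrence ~ 1
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
print(concurrence(np.outer(phi, phi.conj())))
```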
Abstract:
The identification of chemical mechanisms that can exhibit oscillatory phenomena in reaction networks is currently of intense interest. In particular, the parametric question of the existence of Hopf bifurcations has gained increasing popularity due to its relation to the oscillatory behavior around fixed points. However, the detection of oscillations in high-dimensional systems and systems with constraints by the available symbolic methods has proven to be difficult. The development of new, efficient methods is therefore required to tackle the complexity caused by the high dimensionality and non-linearity of these systems. In this thesis, we present efficient algorithmic methods to detect Hopf bifurcation fixed points in (bio)chemical reaction networks with symbolic rate constants, thereby yielding information about the oscillatory behavior of the networks. The methods use representations of the systems in convex coordinates that arise from stoichiometric network analysis. One of the methods, called HoCoQ, reduces the problem of determining the existence of Hopf bifurcation fixed points to a first-order formula over the ordered field of the reals that can then be solved using computational-logic packages. The second method, called HoCaT, uses ideas from tropical geometry to formulate a more efficient method that is incomplete in theory but worked very well for the attempted high-dimensional models involving more than 20 chemical species. Since the instability of reaction networks may lead to oscillatory behaviour, we also investigate some criteria for their stability using convex coordinates and quantifier elimination techniques. We further study Muldowney's extension of the classical Bendixson-Dulac criterion for excluding periodic orbits to higher dimensions for polynomial vector fields, and we discuss the use of simple conservation constraints and of parametric constraints for describing simple convex polytopes on which periodic orbits can be excluded by Muldowney's criteria. All developed algorithms have been integrated into a common software framework called PoCaB (platform to explore biochemical reaction networks by algebraic methods), allowing for automated computation workflows from the problem descriptions. PoCaB also contains a database for the algebraic entities computed from the models of chemical reaction networks.
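As a minimal illustration of the kind of symbolic Hopf condition such methods decide, the sympy sketch below analyzes a textbook two-species system (the Brusselator) with symbolic rate parameters: in two dimensions, a Hopf bifurcation at a fixed point requires the Jacobian trace to vanish while its determinant stays positive. This toy example does not use the thesis's convex (stoichiometric) coordinates, quantifier elimination or tropical methods; it only shows the underlying criterion.

```python
import sympy as sp

a, b, x, y = sp.symbols("a b x y", positive=True)

# Brusselator as a toy reaction system with symbolic parameters a, b
f = sp.Matrix([a - (b + 1) * x + x**2 * y,
               b * x - x**2 * y])

# Steady state and Jacobian evaluated there
ss = {x: a, y: b / a}
J = f.jacobian([x, y]).subs(ss)

# In 2-D, a Hopf bifurcation requires trace(J) = 0 with det(J) > 0
trace, det = sp.simplify(J.trace()), sp.simplify(J.det())
hopf_b = sp.solve(sp.Eq(trace, 0), b)[0]
print(trace, det, hopf_b)   # b - a**2 - 1, a**2, a**2 + 1
```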
Abstract:
This paper describes a system for the computer understanding of English. The system answers questions, executes commands, and accepts information in normal English dialog. It uses semantic information and context to understand discourse and to disambiguate sentences. It combines a complete syntactic analysis of each sentence with a "heuristic understander" which uses different kinds of information about a sentence, other parts of the discourse, and general information about the world in deciding what the sentence means. It is based on the belief that a computer cannot deal reasonably with language unless it can "understand" the subject it is discussing. The program is given a detailed model of the knowledge needed by a simple robot having only a hand and an eye. We can give it instructions to manipulate toy objects, interrogate it about the scene, and give it information it will use in deduction. In addition to knowing the properties of toy objects, the program has a simple model of its own mentality. It can remember and discuss its plans and actions as well as carry them out. It enters into a dialog with a person, responding to English sentences with actions and English replies, and asking for clarification when its heuristic programs cannot understand a sentence through use of context and physical knowledge.
Abstract:
It is widely known that a significant portion of the bits in a program are useless or even unused during execution. Bit-width analysis aims at finding the minimum number of bits needed for each variable in the program, which ensures execution correctness while saving resources. In this paper, we propose a static analysis method for bit-widths in general applications, which approximates conservatively at compile time and is independent of runtime conditions. While most related work focuses on integer applications, our method is also tailored to and applicable to floating-point variables, and it could be extended, together with precision analysis, to transform floating-point numbers into fixed-point numbers. We use more precise representations for the data value ranges of both scalar and array variables, with element-level analysis carried out for arrays. We also suggest an alternative to the standard fixed-point iterations in bi-directional range analysis. These techniques are implemented on the Trimaran compiler infrastructure and tested on a set of benchmarks to show the results.
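A minimal Python sketch of the value-range idea behind bit-width analysis: propagate intervals through arithmetic and convert the resulting range into the minimum number of two's-complement bits. The interval rules and helper names are illustrative assumptions and do not reproduce the paper's Trimaran-based analysis or its bi-directional iteration scheme.

```python
import math

def bits_for_range(lo, hi):
    """Minimum bits for a two's-complement integer known to lie in [lo, hi]."""
    if lo >= 0:
        return max(1, math.ceil(math.log2(hi + 1)))            # unsigned magnitude
    return 1 + max(1, math.ceil(math.log2(max(-lo, hi + 1))))  # sign bit + magnitude

def add_range(a, b):
    """Interval addition."""
    return (a[0] + b[0], a[1] + b[1])

def mul_range(a, b):
    """Interval multiplication."""
    prods = [x * y for x in a for y in b]
    return (min(prods), max(prods))

x = (0, 255)   # e.g., a byte loaded from input
y = (-8, 7)    # a small signed constant range
z = add_range(mul_range(x, y), (0, 100))
print(z, bits_for_range(*z))
```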
Abstract:
The Human Leukocyte Antigen (HLA) has been described in many cases as a prognostic factor for cancer. The main feature of the HLA genes, located on chromosome 6 (6p21.3), is their numerous polymorphisms. Nucleotide sequence analyses show that the variation is predominantly restricted to the exons encoding the peptide-binding domains of the protein. The HLA polymorphism therefore defines the repertoire of peptides that bind to HLA allotypes, and this in turn defines an individual's ability to respond to exposure to many infectious agents over a lifetime. HLA typing has become an important clinical analysis. Formalin-fixed, paraffin-embedded (FFPE) tissue samples are routinely collected in oncology. This procedure could serve as a good source of DNA, since in the past DNA collection assays were not normally carried out on tissues or samples obtained during regular clinical procedures. Bearing in mind that the most important problem with DNA from FFPE samples is fragmentation, we proposed a new method for typing the HLA-A allele from FFPE samples based on the sequences of exons 2, 3 and 4. We designed a set of 12 primers: four for HLA-A exon 2, three for HLA-A exon 3 and five for HLA-A exon 4, each according to the flanking sequences of its respective exon and the sequence variation between different alleles. Seventeen FFPE samples collected at the Karolinska University Hospital in Stockholm, Sweden, were subjected to PCR and the products were sequenced. Finally, all the obtained sequences were analyzed and compared with the IMGT-HLA database. The FFPE samples had previously been HLA typed, and those results were compared with the ones obtained with this method. According to our results, the samples could be correctly sequenced. With this procedure, we can conclude that our study is the first sequence-based typing method that allows the analysis of old DNA samples for which no other source is available. This study also opens the possibility of developing analyses to establish new relationships between HLA and different diseases, such as cancer.
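To illustrate the final, computational step (comparing the obtained exon sequences with a reference database), here is a deliberately naive Python sketch that scores a sample's exon 2/3/4 sequences against a dictionary of reference alleles by mismatch count. The allele names are real HLA-A designations, but the sequences shown are short made-up placeholders; actual typing uses full IMGT-HLA reference sequences and proper alignment.

```python
def call_allele(sample_exons, reference):
    """Naive sequence-based typing: count mismatches of the sample's exon sequences
    against each reference allele and return the closest allele."""
    def mismatches(a, b):
        return sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))
    scores = {allele: sum(mismatches(sample_exons[e], exons[e])
                          for e in ("exon2", "exon3", "exon4"))
              for allele, exons in reference.items()}
    return min(scores, key=scores.get), scores

# Placeholder sequences only; real typing uses full IMGT-HLA exon sequences.
reference = {
    "A*01:01": {"exon2": "GCTCCCACTCC", "exon3": "ACGGAATGTGA", "exon4": "GACCCTCCTT"},
    "A*02:01": {"exon2": "GCTCCCACTGC", "exon3": "ACGGAATGTGA", "exon4": "GACCATCCTT"},
}
sample = {"exon2": "GCTCCCACTGC", "exon3": "ACGGAATGTGA", "exon4": "GACCATCCTT"}
print(call_allele(sample, reference))
```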
Abstract:
The bibliographic profile of 125 undergraduate (licentiate) theses was analyzed, describing absolute quantities of several bibliometric variables, as well as within-document indexes and average lags of the references. The results show a consistent pattern across the 6 cohorts included in the sample (2001-2007), with variations that fall within the robust confidence intervals for the global central tendency. The median number of references per document was 52 (99% CI 47-55); the median percentage of journal articles cited was 55%, with a median age for journal references of 9 years. Other highlights of the bibliographic profile were the use of foreign-language references (median 61%) and the low reliance on open web documents (median 2%). A cluster analysis of the bibliometric indexes resulted in a typology of 2 main profiles, almost evenly distributed, one with the makeup of a natural-science bibliographic profile and the other in the style of the humanities. In general, the number of references, the proportion of papers, and the age of the references are close to those of PhD dissertations and Master's theses, setting a rather high standard for undergraduate theses.
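The abstract reports medians with robust confidence intervals; as an illustration of one common way to obtain such intervals, the Python sketch below computes a percentile-bootstrap confidence interval for the median number of references per thesis. The bootstrap method and the sample counts are illustrative assumptions, not the study's actual data or procedure.

```python
import random
import statistics

def bootstrap_median_ci(values, level=0.99, n_boot=10_000, seed=0):
    """Percentile bootstrap confidence interval for the median."""
    rng = random.Random(seed)
    meds = sorted(
        statistics.median(rng.choices(values, k=len(values))) for _ in range(n_boot)
    )
    lo = meds[int((1 - level) / 2 * n_boot)]
    hi = meds[int((1 + level) / 2 * n_boot) - 1]
    return statistics.median(values), (lo, hi)

refs_per_thesis = [48, 52, 55, 61, 47, 53, 50, 58, 44, 66]   # illustrative counts only
print(bootstrap_median_ci(refs_per_thesis))
```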
Abstract:
Evaluation processes in clinical practice have not been well studied, with existing work focused on the technical issues concerning these processes. This study attempted an approach to the evaluation processes through the analysis of the perceptions of teachers and students about the evaluation methodology, considering the teaching-learning processes performed in a clinical practice of the Medicine Program at Universidad El Bosque in Bogotá. With this purpose, we conducted interviews with teachers and students, examining the manner in which the evaluation, learning and teaching processes are carried out; we then analyzed the perceptions of both groups concerning the way these processes are related. The interviews were categorized both deductively and inductively, and then contrasted with some current theories of learning, teaching and evaluation in medicine. The study showed that nowadays the evaluation and, in general, the educative processes are affected by several factors associated with the manner in which professional practice is developed and with the educational background of the current teachers. We concluded that there is no congruence between the approach to evaluation, which is mainly behaviourist, and the learning and teaching strategies, which are mainly constructivist. This fact causes dissent among teachers and students.
Abstract:
The article describes the privatization program in the real sector of the Colombian economy during the 1990s and places this policy in a context of market deregulation and promotion of private investment in the provision of public infrastructure and household public utilities. The article evaluates the privatization program in the manufacturing and electric power generation sectors. Ex-post measurements and econometric analysis of the performance of the privatized firms are carried out. In the manufacturing sector, the analyzed sample comprises 30 large manufacturing firms in which the Instituto de Fomento Industrial was a founding partner. The main results suggest that these firms maintained a pro-cyclical behavior relative to their main private competitor and dismissed drastic operational restructuring plans. For the group of power generation firms, the article studies the impact of regulatory reform on market entry, ownership structure, market competition and productive efficiency. The measurement of productive efficiency uses the Data Envelopment Analysis technique for 33 plants representing 85% of the installed capacity in thermal power generation. The sample comprises plants that were operating before the reform and entrants that began commercial operation after the reform. The results suggest that efficiency levels in thermal generation have improved after the reform and that regulatory policy has had a positive effect on productive efficiency.
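Data Envelopment Analysis reduces, for each plant, to a small linear program; the Python sketch below implements a generic input-oriented, constant-returns-to-scale (CCR) formulation with scipy. The orientation, the returns-to-scale assumption and the toy input/output figures are illustrative assumptions, since the abstract does not specify the exact DEA model used.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR DEA efficiency scores.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                    # minimize theta
        # inputs:  sum_j lam_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # outputs: -sum_j lam_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Toy data: 4 plants, inputs = (fuel, capacity), output = generation (illustrative only)
X = np.array([[2.0, 3.0], [3.0, 3.0], [3.0, 5.0], [4.0, 6.0]])
Y = np.array([[1.0], [1.0], [2.0], [2.0]])
print(dea_ccr_input(X, Y))
```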
Abstract:
In the past 2009/10 academic year, we took steps towards the introduction of active methodologies, from a multidisciplinary approach, into a conventional lecture-based Dental Education program. We consolidated these practices in the current 2010/11 year, already within a new Bologna-adapted scheme. The transition involved (i) critical assessment of the limitations of traditional teaching, (ii) identification of specific learning topics allowing for integration of contents, (iii) implementation of student-centred learning activities in the old curricular plans, (iv) assessment of students' satisfaction and perceived learning outcomes, and (v) implementation of these changes in the new Bologna-adapted curricula.
Abstract:
One of the main stated objectives of the Favela Bairro program was to allow squatter settlements to become part of the formal city. The focus of the study is to assess how different project solutions in the Favela Bairro program have used urban design tools to achieve this objective. A subsequent question concerns the extent to which urban design factors helped to improve the social integration of the settlements, and specifically to overcome conditions of poverty and exclusion. This brings us to another group of problems, related to indicators for measuring the attainment of these objectives from four perspectives: spatial, social, economic and political (citizenship and participation). The analysis highlights the major difficulties confronted by the design teams and allows pinpointing the positive and negative impacts of the interventions on the communities and the city.
Abstract:
The purpose of this study was to evaluate the effectiveness of a hearing screening program, particularly focusing on hit and false positive rates in the NICU and WBN at a top-rated birthing hospital in Saint Louis, MO. Additionally, the study examined how these rates may be influenced by risk factors for hearing loss.
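For reference, the two rates named above are simple ratios over screening outcomes; the Python sketch below computes them from outcome counts. The counts shown are purely illustrative and are not the study's data.

```python
def screening_rates(tp, fn, fp, tn):
    """Hit rate (sensitivity) and false-positive rate from screening outcome counts:
    tp = infants with hearing loss who were referred, fn = missed cases,
    fp = normal-hearing infants referred, tn = normal-hearing infants passed."""
    hit_rate = tp / (tp + fn) if (tp + fn) else float("nan")
    false_positive_rate = fp / (fp + tn) if (fp + tn) else float("nan")
    return hit_rate, false_positive_rate

# Illustrative counts only (not the study's data)
print(screening_rates(tp=18, fn=2, fp=120, tn=2860))
```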