851 results for Object Oriented Analysis


Relevance: 100.00%

Abstract:

A model for representing music scores in a form suitable for general processing by a music-analyst-programmer is proposed and implemented. Typical input to the model consists of one or more pieces of music encoded in a file-based score representation. File-based representations are unsuited for general processing, as they do not provide a suitable level of abstraction for a programmer-analyst. Instead, a representation is created that gives a programmer's view of the score. This frees the analyst-programmer from implementation details that would otherwise form a substantial barrier to progress. The score representation uses an object-oriented approach to create a natural and robust software environment for the musicologist. The system is used to explore ways in which it could benefit musicologists. Methodologies for analysing music corpora are presented in a series of analytic examples which illustrate some of the potential of this model. Proving hypotheses or performing analysis on corpora involves the construction of algorithms. Some unique aspects of using this score model for corpus-based musicology are:
- Algorithms impose a discipline which arises from the necessity for formalism.
- Automatic analysis enables musicologists to complete tasks that would otherwise be infeasible because of limitations of their energy, attentiveness, accuracy and time.
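
To make the idea of a programmer's view of the score concrete, here is a minimal Python sketch of an object-oriented score representation and a corpus-style query; the class names and the interval-counting method are illustrative assumptions of this summary, not the model actually proposed in the thesis.

```python
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class Note:
    pitch: int       # MIDI note number
    duration: float  # in quarter notes

@dataclass
class Score:
    title: str
    notes: list = field(default_factory=list)

    def melodic_intervals(self):
        """Signed semitone intervals between consecutive notes."""
        return [b.pitch - a.pitch for a, b in zip(self.notes, self.notes[1:])]

# A corpus-style query: how often does each melodic interval occur?
score = Score("Example", [Note(60, 1), Note(62, 1), Note(64, 2), Note(62, 1)])
print(Counter(score.melodic_intervals()))   # Counter({2: 2, -2: 1})
```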

Relevance: 100.00%

Abstract:

"compositions" is a new R package for the analysis of compositional and positive data. It contains four classes corresponding to the four different types of compositional and positive geometry (including the Aitchison geometry). It provides means for computation, plotting and high-level multivariate statistical analysis in all four geometries. These geometries are treated in a fully analogous way, based on the principle of working in coordinates and on the object-oriented programming paradigm of R. In this way, the called functions automatically select the most appropriate type of analysis as a function of the geometry. The graphical capabilities include ternary diagrams and tetrahedrons, various compositional plots (boxplots, barplots, piecharts) and extensive graphical tools for principal components. Afterwards, portion and proportion lines, straight lines and ellipses in all geometries can be added to plots. The package is accompanied by a hands-on introduction, documentation for every function, demos of the graphical capabilities and plenty of usage examples. It allows direct and parallel computation in all four vector spaces and provides the beginner with a copy-and-paste style of data analysis, while letting advanced users keep the functionality and customizability they demand of R, as well as all necessary tools to add their own analysis routines. A complete example is included in the appendix.
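
The package itself is written in R; purely as an illustration of the "working in coordinates" principle behind the Aitchison geometry, here is a language-neutral Python sketch (the function names are mine, not the package's API): compositions are mapped to real coordinates with the centred log-ratio (clr) transform, manipulated with ordinary Euclidean arithmetic, and mapped back.

```python
import numpy as np

def closure(x):
    """Scale a positive vector so that its parts sum to 1 (a composition)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def clr(x):
    """Centred log-ratio transform: map a composition to real coordinates."""
    lx = np.log(closure(x))
    return lx - lx.mean()

def clr_inv(y):
    """Back-transform clr coordinates to a composition."""
    return closure(np.exp(y))

# Operations are performed in coordinates and then mapped back:
a, b = closure([1, 2, 7]), closure([2, 2, 6])
perturbation = clr_inv(clr(a) + clr(b))   # Aitchison "sum" of a and b
print(perturbation)
```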

Relevance: 100.00%

Abstract:

According to the literature, electronic systems design largely employs a top-down methodology, which is vital for the successful synthesis and implementation of electronic systems. In this context, this paper presents a new computational tool, named BD2XML, to support electronic systems design. From a mixed-signal block diagram it generates object code in the XML markup language. XML is attractive because of its great flexibility and readability. BD2XML was developed following the object-oriented paradigm. The AD7528 converter, modeled in MATLAB/Simulink, was used as a case study; MATLAB/Simulink was chosen as the target because of its wide adoption in academia and industry. This case study demonstrates the functionality of BD2XML and prompts a reflection on the design challenges. An automatic tool for electronic systems design therefore reduces design time and cost.
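
The paper does not give BD2XML's schema, so the following Python sketch only illustrates the general idea of serialising a block-diagram object model to XML; every element and attribute name here is an assumption made for the example, not BD2XML's actual output format.

```python
import xml.etree.ElementTree as ET

class Block:
    def __init__(self, name, kind, params=None):
        self.name, self.kind, self.params = name, kind, params or {}

class Diagram:
    def __init__(self, name):
        self.name, self.blocks, self.links = name, [], []

    def to_xml(self):
        """Serialise the block diagram to an XML string."""
        root = ET.Element("diagram", name=self.name)
        for b in self.blocks:
            e = ET.SubElement(root, "block", name=b.name, kind=b.kind)
            for k, v in b.params.items():
                ET.SubElement(e, "param", name=k, value=str(v))
        for src, dst in self.links:
            ET.SubElement(root, "link", src=src, dst=dst)
        return ET.tostring(root, encoding="unicode")

d = Diagram("dac_testbench")
d.blocks += [Block("vin", "source", {"amplitude": 1.0}), Block("dac", "AD7528")]
d.links.append(("vin", "dac"))
print(d.to_xml())
```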

Relevance: 100.00%

Abstract:

This thesis deals with the study of optimal control problems for the incompressible Magnetohydrodynamics (MHD) equations. Particular attention to these problems arises from several applications in science and engineering, such as nuclear fission reactors with liquid metal coolant and aluminum casting in metallurgy. In such applications it is of great interest to achieve control of the fluid state variables through the action of the magnetic Lorentz force. In this thesis we investigate a class of boundary optimal control problems, in which the flow is controlled through the boundary conditions of the magnetic field. Due to their complexity, these problems present various challenges in the definition of an adequate solution approach, both from a theoretical and from a computational point of view. In this thesis we propose a new boundary control approach, based on lifting functions of the boundary conditions, which yields both theoretical and numerical advantages. With the introduction of lifting functions, boundary control problems can be formulated as extended distributed problems. We consider a systematic mathematical formulation of these problems in terms of the minimization of a cost functional constrained by the MHD equations. The existence of a solution to the flow equations and to the optimal control problem is shown. The Lagrange multiplier technique is used to derive an optimality system from which candidate solutions for the control problem can be obtained. In order to achieve the numerical solution of this system, a finite element approximation is considered for the discretization, together with an appropriate gradient-type algorithm. A finite element object-oriented library has been developed to obtain a parallel and multigrid computational implementation of the optimality system, based on a multiphysics approach. Numerical results of two- and three-dimensional computations show that a possible minimum for the control problem can be computed in a robust and accurate manner.
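
The thesis's optimality system is specific to the MHD equations and is solved with a parallel multigrid finite element library; as a purely illustrative stand-in, the sketch below reproduces the same solve-state / solve-adjoint / update-control structure of a gradient-type algorithm on a toy linear-quadratic problem. All operators and parameter values are invented for the example.

```python
import numpy as np

# Toy linear-quadratic analogue of an adjoint-based, gradient-type control loop:
# minimise J(u) = 0.5*||y - y_d||^2 + 0.5*alpha*||u||^2  subject to  A y = B u.
rng = np.random.default_rng(0)
n, m, alpha = 20, 5, 1.0
A = np.eye(n) + 0.05 * rng.standard_normal((n, n))   # stand-in "state operator"
B = rng.standard_normal((n, m))                       # control-to-state coupling
y_d = rng.standard_normal(n)                          # target state

S = np.linalg.solve(A, B)                             # solution operator applied to B
step = 1.0 / (np.linalg.norm(S, 2) ** 2 + alpha)      # safe step size for descent
u = np.zeros(m)
for _ in range(5000):
    y = np.linalg.solve(A, B @ u)      # 1. solve the state equation
    p = np.linalg.solve(A.T, y - y_d)  # 2. solve the adjoint equation
    grad = B.T @ p + alpha * u         # 3. assemble the reduced gradient
    if np.linalg.norm(grad) < 1e-6:
        break
    u -= step * grad                   # 4. gradient-type control update
print("J =", 0.5 * np.linalg.norm(y - y_d) ** 2 + 0.5 * alpha * u @ u)
```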

Relevance: 100.00%

Abstract:

In this thesis we present and study an object-oriented language, characterized by two different types of objects, passive and active, for which we define the operational syntax and semantics. For this language we also define the type system, which is used for type checking and for the extraction of behavioral types; these are abstract descriptions of the behavior of methods, used in deadlock analysis. Programs can manifest deadlocks due to programmer errors. To statically identify such possible unintended behaviors, we studied and implemented a deadlock analysis technique based on behavioral types.
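
The behavioral types in the thesis are a formal calculus; the toy Python sketch below only illustrates the final step such an analysis ultimately reduces to, namely looking for a cycle in the dependencies abstracted from method behaviors (here simplified to "object A waits on object B" edges of my own invention).

```python
# Toy sketch: behavioural types abstracted to "object A, while busy, waits on a
# result from object B" edges; a cycle in this graph signals a potential deadlock.
from collections import defaultdict

def has_cycle(edges):
    graph = defaultdict(list)
    for a, b in edges:
        graph[a].append(b)
    WHITE, GREY, BLACK = 0, 1, 2
    colour = defaultdict(int)                # every node starts WHITE

    def visit(node):
        colour[node] = GREY
        for nxt in graph[node]:
            if colour[nxt] == GREY:          # back edge: dependency cycle
                return True
            if colour[nxt] == WHITE and visit(nxt):
                return True
        colour[node] = BLACK
        return False

    return any(colour[n] == WHITE and visit(n) for n in list(graph))

# a waits on b, b waits on c, c waits on a  ->  possible deadlock
print(has_cycle([("a", "b"), ("b", "c"), ("c", "a")]))   # True
print(has_cycle([("a", "b"), ("b", "c")]))               # False
```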

Relevance: 100.00%

Abstract:

Light-frame wood buildings are widely built in the United States (U.S.). Natural hazards cause huge losses to light-frame wood construction. This study proposes methodologies and a framework to evaluate the performance and risk of light-frame wood construction. Performance-based engineering (PBE) aims to ensure that a building achieves the desired performance objectives when subjected to hazard loads. In this study, the collapse risk of a typical one-story light-frame wood building is determined using the Incremental Dynamic Analysis method. The collapse risks of buildings at four sites in the Eastern, Western, and Central regions of U.S. are evaluated. Various sources of uncertainties are considered in the collapse risk assessment so that the influence of uncertainties on the collapse risk of lightframe wood construction is evaluated. The collapse risks of the same building subjected to maximum considered earthquakes at different seismic zones are found to be non-uniform. In certain areas in the U.S., the snow accumulation is significant and causes huge economic losses and threatens life safety. Limited study has been performed to investigate the snow hazard when combined with a seismic hazard. A Filtered Poisson Process (FPP) model is developed in this study, overcoming the shortcomings of the typically used Bernoulli model. The FPP model is validated by comparing the simulation results to weather records obtained from the National Climatic Data Center. The FPP model is applied in the proposed framework to assess the risk of a light-frame wood building subjected to combined snow and earthquake loads. The snow accumulation has a significant influence on the seismic losses of the building. The Bernoulli snow model underestimates the seismic loss of buildings in areas with snow accumulation. An object-oriented framework is proposed in this study to performrisk assessment for lightframe wood construction. For home owners and stake holders, risks in terms of economic losses is much easier to understand than engineering parameters (e.g., inter story drift). The proposed framework is used in two applications. One is to assess the loss of the building subjected to mainshock-aftershock sequences. Aftershock and downtime costs are found to be important factors in the assessment of seismic losses. The framework is also applied to a wood building in the state of Washington to assess the loss of the building subjected to combined earthquake and snow loads. The proposed framework is proven to be an appropriate tool for risk assessment of buildings subjected to multiple hazards. Limitations and future works are also identified.
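
The study's FPP snow model is calibrated against National Climatic Data Center records; the Monte Carlo sketch below only illustrates what a filtered Poisson process looks like in code (Poisson storm arrivals, random snowfall marks, an exponentially decaying response standing in for melt and compaction). Every parameter value here is invented for the illustration.

```python
import numpy as np

def simulate_fpp(rate, mean_size, decay, horizon, dt=0.1, rng=None):
    """Filtered Poisson process: Poisson arrivals, random marks, and an
    exponentially decaying response filter (all parameters illustrative)."""
    rng = rng or np.random.default_rng()
    n_events = rng.poisson(rate * horizon)
    times = np.sort(rng.uniform(0.0, horizon, n_events))   # storm arrival times
    sizes = rng.exponential(mean_size, n_events)            # snowfall per storm
    t = np.arange(0.0, horizon, dt)
    x = np.zeros_like(t)
    for ti, yi in zip(times, sizes):
        active = t >= ti
        x[active] += yi * np.exp(-decay * (t[active] - ti))  # melt/compaction
    return t, x

t, depth = simulate_fpp(rate=0.1, mean_size=0.2, decay=0.05, horizon=0.5 * 365)
print("peak load over the season:", depth.max())
```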

Relevance: 100.00%

Abstract:

Abstract interpretation has been widely used for the analysis of object-oriented languages and, in particular, Java source and bytecode. However, while most existing work deals with the problem of finding expressive abstract domains that track accurately the characteristics of a particular concrete property, the underlying fixpoint algorithms have received comparatively less attention. In fact, many existing (abstract interpretation based) fixpoint algorithms rely on relatively inefficient techniques for solving inter-procedural call graphs or are specific and tied to particular analyses. We also argue that the design of an efficient fixpoint algorithm is pivotal to supporting the analysis of large programs. In this paper we introduce a novel algorithm for analysis of Java bytecode which includes a number of optimizations in order to reduce the number of iterations. The algorithm is parametric, in the sense that it is independent of the abstract domain used and can be applied to different domains as "plug-ins", multivariant, and flow-sensitive. It is also based on a program transformation, prior to the analysis, that results in a highly uniform representation of all the features in the language and therefore simplifies analysis. Detailed descriptions of decompilation solutions are given and discussed with an example. We also provide some performance data from a preliminary implementation of the analysis.
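
The paper's algorithm adds multivariance, flow-sensitivity and several iteration-reducing optimizations on top of the basic idea; the Python sketch below shows only that basic, domain-parametric worklist fixpoint, with a trivial call-graph reachability "domain" plugged in as the example. It is my own simplification, not the paper's algorithm.

```python
def fixpoint(nodes, deps, transfer, bottom, join):
    """Generic worklist fixpoint: `transfer`, `join` and `bottom` are the plug-in domain."""
    value = {n: bottom for n in nodes}
    work = list(nodes)
    while work:
        n = work.pop()
        incoming = join([value[d] for d in deps.get(n, [])] or [bottom])
        new = transfer(n, incoming)
        if new != value[n]:
            value[n] = new
            # re-schedule every node whose input depends on n
            work.extend(m for m in nodes if n in deps.get(m, []))
    return value

# Illustrative "domain": call-graph reachability (bottom = False, join = any).
nodes = ["main", "f", "g", "h"]
deps = {"f": ["main"], "g": ["f"], "h": ["h"]}   # callers of each node
transfer = lambda n, incoming: n == "main" or incoming
print(fixpoint(nodes, deps, transfer, bottom=False, join=any))
# {'main': True, 'f': True, 'g': True, 'h': False}
```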

Relevance: 100.00%

Abstract:

Finding useful sharing information between instances in object-oriented programs has recently been the focus of much research. The applications of such static analysis are multiple: by knowing which variables definitely do not share in memory we can apply conventional compiler optimizations, find coarse-grained parallelism opportunities, or, more importantly, verify certain correctness aspects of programs even in the absence of annotations. In this paper we introduce a framework for deriving precise sharing information based on abstract interpretation for a Java-like language. Our analysis achieves precision in various ways, including supporting multivariance, which allows separating different contexts. We propose a combined Set Sharing + Nullity + Classes domain which captures which instances do not share and which ones are definitely null, and which uses the classes to refine the static information when inheritance is present. The use of a set sharing abstraction allows a more precise representation of the existing sharings and is crucial in achieving precision during interprocedural analysis. Carrying the domains in a combined way facilitates the interaction among them in the presence of multivariance in the analysis. We show through examples and experimentally that both the set sharing part of the domain as well as the combined domain provide more accurate information than previous work based on pair sharing domains, at reasonable cost.
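
The domain in the paper is defined over a formal Java-like semantics; as a rough illustration of the shape of the information it tracks, the Python sketch below encodes an abstract state as a set of variable groups that may share (set sharing, rather than pairs) plus a nullity map, and computes the join of two such states. The encoding and the example program points are my own simplification.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AbsState:
    sharing: frozenset      # set of variable groups that may share memory
    nullity: tuple          # (var, 'null' | 'nonnull' | 'maybe') pairs

def join(s1, s2):
    """Least upper bound of two abstract states (toy version)."""
    n1, n2 = dict(s1.nullity), dict(s2.nullity)
    nullity = {v: (n1[v] if n1.get(v) == n2.get(v) else 'maybe')
               for v in set(n1) | set(n2)}
    return AbsState(s1.sharing | s2.sharing, tuple(sorted(nullity.items())))

# After "x = new C(); y = x":    x and y may share, both are non-null.
s_then = AbsState(frozenset({frozenset({'x', 'y'})}),
                  (('x', 'nonnull'), ('y', 'nonnull')))
# After "x = new C(); y = null": nothing shares, y is definitely null.
s_else = AbsState(frozenset({frozenset({'x'})}),
                  (('x', 'nonnull'), ('y', 'null')))
print(join(s_then, s_else))
# sharing: {{x, y}, {x}}   nullity: x nonnull, y maybe
```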

Relevance: 100.00%

Abstract:

Finding useful sharing information between instances in object-oriented programs has recently been the focus of much research. The applications of such static analysis are multiple: by knowing which variables share in memory we can apply conventional compiler optimizations, find coarse-grained parallelism opportunities, or, more importantly, verify certain correctness aspects of programs even in the absence of annotations. In this paper we introduce a framework for deriving precise sharing information based on abstract interpretation for a Java-like language. Our analysis achieves precision in various ways. The analysis is multivariant, which allows separating different contexts. We propose a combined Set Sharing + Nullity + Classes domain which captures which instances share and which ones do not or are definitely null, and which uses the classes to refine the static information when inheritance is present. Carrying the domains in a combined way facilitates the interaction among the domains in the presence of multivariance in the analysis. We show that both the set sharing part of the domain as well as the combined domain provide more accurate information than previous work based on pair sharing domains, at reasonable cost.

Relevance: 100.00%

Abstract:

Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions of their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects that improve the efficiency, the accuracy and the reliability of the results of the analysis still need to be investigated further. This thesis tackles this need from the following four different perspectives.

(1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first extension is to consider array accesses (in addition to object fields), while the second extension focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible.

(2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to (re-)analyze fragments of code that are not affected by the changes. During software development, programs are permanently modified, but most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods in an incremental way. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change.

(3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on the development of a formal framework for the verification of the resource guarantees obtained by the analyzers, instead of verifying the tools. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code. COSTA is able to derive upper bounds for Java programs while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that the proposed cooperation between the tools can be used for automatically producing verified resource guarantees.

(4) Distribution and concurrency are mainstream today. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units and communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, can keep the cost of the diverse distributed components separate.
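
As a rough illustration of the incremental idea in point (2), the Python sketch below keeps per-method results plus a call-graph dependency relation and, after a change, re-analyses only the transitive callers of the modified method; the "analysis" itself is a trivial cost-summation stand-in, not the thesis's multi-domain algorithm.

```python
# Toy sketch of incremental re-analysis: after a change, only methods whose
# results may depend on the changed one are re-analysed.
local_cost = {"main": 2, "f": 3, "g": 1}          # per-method "statement cost"
calls = {"main": ["f"], "f": ["g"], "g": []}      # call graph
results = {}                                       # cached per-method results

def analyse(m):
    """Trivial stand-in analysis: local cost plus the cost of all callees."""
    results[m] = local_cost[m] + sum(
        results[c] if c in results else analyse(c) for c in calls[m])
    return results[m]

def affected(changed):
    """Transitive callers of the changed method: they must be re-analysed."""
    out, frontier = {changed}, {changed}
    while frontier:
        frontier = {m for m, cs in calls.items() if set(cs) & frontier} - out
        out |= frontier
    return out

for m in calls:                     # initial, whole-program analysis
    analyse(m)
print("initial:", results)          # main: 6, f: 4, g: 1

local_cost["g"] = 5                 # a change inside method g
dirty = affected("g")               # {'g', 'f', 'main'}: only these are redone
for m in dirty:
    results.pop(m, None)
for m in dirty:
    analyse(m)
print("updated:", results)          # main: 10, f: 8, g: 5
```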

Relevance: 100.00%

Abstract:

Abstract interpretation has been widely used for the analysis of object-oriented languages and, more precisely, Java source and bytecode. However, while most of the existing work deals with the problem of finding expressive abstract domains that track accurately the characteristics of a particular concrete property, the underlying fixpoint algorithms have received comparatively less attention. In fact, many existing (abstract interpretation based) fixpoint algorithms rely on relatively inefficient techniques to solve inter-procedural call graphs or are specific and tied to particular analyses. We argue that the design of an efficient fixpoint algorithm is pivotal to support the analysis of large programs. In this paper we introduce a novel algorithm for analysis of Java bytecode which includes a number of optimizations in order to reduce the number of iterations. Also, the algorithm is parametric in the sense that it is independent of the abstract domain used and can be applied to different domains as "plug-ins". It is also incremental in the sense that, if desired, analysis data can be saved so that only a reduced amount of reanalysis is needed after a small program change, which can be instrumental for large programs. The algorithm is also multivariant and flow-sensitive. Finally, another interesting characteristic of the algorithm is that it is based on a program transformation, prior to the analysis, that results in a highly uniform representation of all the features in the language and therefore simplifies analysis. Detailed descriptions of decompilation solutions are provided and discussed with an example.

Relevance: 100.00%

Abstract:

The problem of conceptualisation is the first step towards the identification of the functional requirements of a system. This article proposes two extensions of well-known object-oriented techniques: the UER (User-Environment-Responsibility) technique and enhanced CRC (Class-Responsibility-Collaboration) cards. The UER technique consists of (a) looking for the users of the system and describing the ways the system is used; (b) looking for the objects of the environment and describing the possible interactions; and (c) looking for the general requirements or goals of the system, the actions that it should carry out without explicit interaction. The enhanced CRC cards, together with the internal use cases technique, are used for defining collaborations between agents. These techniques can be easily integrated in UML (Unified Modelling Language) [2], defining the new notation symbols as stereotypes.
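
CRC cards are normally a paper or whiteboard artifact; just to make the recorded information concrete, here is a minimal Python data-structure sketch of a card with a slot for collaborations, which the article's enhanced cards emphasise. The fields and the example class are illustrative assumptions, not the article's notation.

```python
from dataclasses import dataclass, field

@dataclass
class CRCCard:
    """Class-Responsibility-Collaboration card (toy, illustrative fields)."""
    class_name: str
    responsibilities: list = field(default_factory=list)
    collaborators: list = field(default_factory=list)

card = CRCCard(
    "OrderManager",
    responsibilities=["validate incoming orders", "schedule shipment"],
    collaborators=["Inventory", "ShippingAgent"],
)
print(f"{card.class_name} collaborates with: {', '.join(card.collaborators)}")
```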

Relevance: 100.00%

Abstract:

We present new tools for the segmentation and analysis of musical scores in the OpenMusic computer-aided composition environment. A modular object-oriented framework enables the creation of segmentations on score objects and the implementation of automatic or semi-automatic analysis processes. The analyses can be performed and displayed thanks to customizable classes and callbacks. Concrete examples are given, in particular with the implementation of a semi-automatic harmonic analysis system and a framework for rhythmic transcription.
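
OpenMusic itself is a Common Lisp/CLOS environment; the Python sketch below only mirrors the pattern the abstract describes, namely creating segmentations on a score object and running user-supplied analysis callbacks on each segment. All names and the boundary rule are invented for the illustration.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    start: int          # index of the first note in the segment
    end: int            # index one past the last note
    label: str = ""

def segment_score(pitches, boundary_test):
    """Create segments wherever the user-supplied callback reports a boundary."""
    segments, start = [], 0
    for i in range(1, len(pitches)):
        if boundary_test(pitches[i - 1], pitches[i]):
            segments.append(Segment(start, i))
            start = i
    segments.append(Segment(start, len(pitches)))
    return segments

def label_by_register(seg, pitches):
    """A toy analysis callback attaching a label to each segment."""
    mean = sum(pitches[seg.start:seg.end]) / (seg.end - seg.start)
    seg.label = "high" if mean >= 66 else "low"
    return seg

pitches = [60, 62, 64, 72, 74, 71, 60, 59]
segs = [label_by_register(s, pitches)
        for s in segment_score(pitches, lambda a, b: abs(b - a) > 5)]
print([(s.start, s.end, s.label) for s in segs])
# [(0, 3, 'low'), (3, 6, 'high'), (6, 8, 'low')]
```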

Relevance: 100.00%

Abstract:

Projected air and ground temperatures are expected to be higher in Arctic and sub-Arctic latitudes, and with temperatures already close to the limit where permafrost can exist, resistance against degradation is low. With thawing permafrost, the landscape is modified with depressions in which thermokarst lakes emerge. Permafrost soils store a considerable amount of soil organic carbon, with the potential of altering climate even further if expansion and formation of new thermokarst lakes occur, as decay releases greenhouse gases (CO2 and CH4) to the atmosphere. Analyzing the spatial distribution and morphometry of thermokarst lakes and other water bodies over time is important for accurately predicting carbon budgets and feedback mechanisms, as well as for assessing the future landscape layout and the interaction of these features. Different types of high-spatial-resolution aerial and satellite imagery from 1963, 1975, 2003, 2010 and 2015 were used in both pre- and post-classification change detection analyses. Object-oriented segmentation in eCognition, combined with manual adjustments, produced digitalized water bodies >28 m2, from which direction of change and morphometric values were extracted. The number of thermokarst lakes and other water bodies was n=92 in 1963 and, as a trend, decreased in succeeding years until 2010-2015, when eleven water bodies were added in 2015 (n=74 to n=85). In 1963-2003 the area of these water bodies decreased by 50 651 m2 (from 189 446 to 138 795 m2) and continued to decrease in 2003-2015, ending at 129 337 m2. Limnicity decreased from 19.9% in 1963 to 14.6% in 2003 (-5.3%), and stood at 13.7-13.6% in 2010 and 2015. The late increase in water bodies differs from an earlier hypothesis that sporadic permafrost regions experience decreases in both area and quantity of thermokarst lakes and water bodies. During 1963-2015, land gain dominated the ratio between the two competing processes of expansion and drainage: 55/45% in 1963-1975, followed by 90/10% in 1975-2003. After major drainage events, land loss increased, to 62/38% in 2010-2015. Drainage and infilling rates, calculated for 15 shorelines, varied across both the landscape and parts of shorelines, averaging 0.17/0.15/0.14 m/yr, except for 1963-1975, when the average rate of change was in the opposite direction (-0.09 m/yr), likely due to the evident expansion of a large thermokarst lake. Using a square grid, the distribution of water bodies was determined, with an indistinct cluster located in the NE and central parts, especially for water bodies <250 m2, which is the dominant area class throughout 1963-2015, ranging from n=39-51. With a heterogeneous composition of both small and large thermokarst lakes, and with both expansion and drainage altering the landscape in Tavvavuoma, both positive and negative climate feedback mechanisms are in play, given that sporadic permafrost still exists.