972 results for object-oriented paradigm
Abstract:
A common mode whereby destruction of coastal lowlands occurs is frontal erosion. Edge cliffing, nonetheless, is also an inherent aspect of salt marsh development in many northwest European tidal marshes. Quite a few geomorphologists in the earlier half of the past century recognized such edge erosion as a definite repetitive stage within an autocyclic mode of marsh growth. A shift in research priorities during the past decades (primarily because of coastal management concerns) has resulted in an enhanced focus on sediment-flux measurement campaigns on salt marshes. This somewhat "object-oriented" strategy hindered any further development of the once-established autocyclic growth concept, which has virtually fallen into oblivion in recent times. This work attempts to resurrect the notion of autocyclicity by employing its premises to address edge erosion in tidal marshes. The underlying framework for autocyclicity is envisaged through a review of intertidal morphosedimentology. The phenomenon is demonstrated in the Holocene salt marsh plain of the Moricambe basin in NW England, which displays several distinct phases of marsh retreat in the form of abandoned clifflets. The suite of abandoned shorelines and terraces was identified through detailed field mapping that followed analysis of topographic maps and aerial photographs. Vertical trends in marsh plain sediments were recorded in trenches for signs of past marsh-front movements. The characteristic sea-level history of the area offers an opportunity to differentiate the morphodynamic variability induced in the autocyclic growth of the marsh plain under scenarios of rising and falling sea level and the accompanying change in sediment budget. These ideas are incorporated into a conceptual model that links the temporal extent of marsh erosion to the inner tidal flat sediment budget and the sea-level tendency. The review leads to recognition of the need to adopt a holistic approach in morphodynamic investigations, in which marshes should be treated as a component within the "marsh-mudflat system", since each element apparently modulates the evolution of the other, with an eventual linkage to subtidal channels. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
The SystemVerilog implementation of the Open Verification Methodology (OVM) is exercised on an 8b/10b RTL open core design, intended as a simple yet complete exercise that exposes the key features of OVM. Emphasis is placed on the actual usage of the verification components rather than on a complete verification flow, with the aim of helping readers unfamiliar with OVM who seek to apply the methodology to their own designs. A link to the complete code is given to reinforce this aim. We found the methodology easy to use, although intimidating at first glance, especially for someone with little experience in object-oriented programming. However, the flexibility, portability and reusability of the verification code become clear once the first steps have been taken.
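As a language-neutral illustration of one property such a testbench checks (the paper's own components are written in SystemVerilog OVM and are not reproduced here), the sketch below verifies two basic 8b/10b invariants: every 10-bit code group carries 4, 5 or 6 ones, and the running disparity stays at +1 or -1. The class and method names are hypothetical.

```java
/** Hypothetical checker for two 8b/10b invariants; not the paper's OVM scoreboard. */
public final class DisparityChecker {
    private int runningDisparity = -1;                 // conventional start value

    /** @param symbol a 10-bit encoder output (bits 9..0) */
    public void check(int symbol) {
        int ones = Integer.bitCount(symbol & 0x3FF);
        if (ones < 4 || ones > 6)
            throw new AssertionError("illegal code group: " + ones + " ones");
        if (ones == 6) runningDisparity += 2;          // more ones than zeros
        if (ones == 4) runningDisparity -= 2;          // more zeros than ones
        if (runningDisparity != 1 && runningDisparity != -1)
            throw new AssertionError("running disparity out of range");
    }
}
```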
Abstract:
Simple Adaptive Momentum [1] was introduced as a simple means of speeding up the training of multi-layer perceptrons (MLPs) by changing the momentum term depending on the angle between the current and previous changes in the weights of the MLP. In the original paper, the weight changes of the whole network are used in determining this angle. This paper considers adapting the momentum term using certain subsets of these weights. The idea was inspired by the author's object-oriented approach to programming MLPs, successfully used in teaching students; this approach is also described. It is concluded that the angle is best determined using the weight changes in each layer separately.
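A minimal sketch of the per-layer variant discussed above, assuming the commonly used scaling of the momentum coefficient by (1 + cos θ)/2, where θ is the angle between the current and previous weight-change vectors of one layer; the exact scaling used in [1] may differ.

```java
/** Sketch of per-layer Simple Adaptive Momentum: the momentum coefficient is scaled
 *  by a factor derived from the angle between the current and previous weight-change
 *  vectors of this layer.  The (1+cos)/2 scaling is an assumption, not a quote from [1]. */
public final class LayerSam {
    private final double baseMomentum;
    private double[] prevDelta;                        // previous weight changes of this layer

    LayerSam(double baseMomentum, int nWeights) {
        this.baseMomentum = baseMomentum;
        this.prevDelta = new double[nWeights];
    }

    /** Returns the momentum-adjusted update and remembers it for the next call. */
    double[] update(double[] gradStep) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < gradStep.length; i++) {
            dot += gradStep[i] * prevDelta[i];
            na  += gradStep[i] * gradStep[i];
            nb  += prevDelta[i] * prevDelta[i];
        }
        double cos = (na == 0 || nb == 0) ? 0 : dot / Math.sqrt(na * nb);
        double momentum = baseMomentum * 0.5 * (1.0 + cos);   // shrink when directions oppose
        double[] delta = new double[gradStep.length];
        for (int i = 0; i < delta.length; i++)
            delta[i] = gradStep[i] + momentum * prevDelta[i];
        prevDelta = delta;
        return delta;
    }
}
```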
Abstract:
A multi-layered architecture of self-organizing neural networks is being developed as part of an intelligent alarm processor to analyse a stream of power grid fault messages and provide a suggested diagnosis of the fault location. Feedback concerning the accuracy of the diagnosis is provided by an object-oriented grid simulator which acts as an external supervisor to the learning system. The utilization of artificial neural networks within this environment should result in a powerful generic alarm processor which will not require extensive training by a human expert to produce accurate results.
Abstract:
The recent celebrations of the centenary of the publication of the Futurist manifesto led to a renewed discussion of the ideas and artworks of the Italian artists' group. Jacques Rancière related the Futurist ethos to the modernist project of liberating art from representation. Franco 'Bifo' Berardi, in his post-Futurist manifesto, also identified a historical irony at play in the emptying out of Futurism's promise: a liberated mechanical humanity did indeed materialize, in a global economic system premised on financial servitude to the future via debt. However, these models continue to assess Futurism against an unchallenged humanism, finding it either supporting ideals of freedom and human rights despite itself, or else lacking in these areas. But Futurism is potentially more relevant than ever not in spite of its anti-humanist agenda but precisely because of it. Tom McCarthy annexes not Futurist art but Futurist writing to an emerging object-oriented ontology that seeks to challenge the primacy of the human. If Futurism is to be repurposed as a critical concept, it can only do so by countering the humanist myth of the liberal subject that underlies the current cultural and political hegemony of neo-liberalism.
Abstract:
The authors discuss an implementation of an object oriented (OO) fault simulator and its use within an adaptive fault diagnostic system. The simulator models the flow of faults around a power network, reporting switchgear indications and protection messages that would be expected in a real fault scenario. The simulator has been used to train an adaptive fault diagnostic system; results and implications are discussed.
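A hypothetical sketch (class and message names invented) of the kind of object-oriented decomposition such a simulator implies: each plant item is an object that reacts to a propagating fault by emitting the switchgear indications and protection messages a diagnostic system would observe.

```java
import java.util.ArrayList;
import java.util.List;

/** Hypothetical OO fault-simulator core: plant items respond to a fault by emitting
 *  the messages a real SCADA system would report; that list is what a diagnostic
 *  system could be trained on. */
interface PlantItem {
    List<String> reactTo(String faultedSection);
}

class CircuitBreaker implements PlantItem {
    private final String id, protectedSection;
    CircuitBreaker(String id, String protectedSection) {
        this.id = id; this.protectedSection = protectedSection;
    }
    @Override public List<String> reactTo(String faultedSection) {
        List<String> msgs = new ArrayList<>();
        if (protectedSection.equals(faultedSection)) {
            msgs.add("PROTECTION OPERATED " + id);
            msgs.add("BREAKER OPEN " + id);
        }
        return msgs;
    }
}

public class FaultSimulator {
    public static void main(String[] args) {
        List<PlantItem> network = List.of(
                new CircuitBreaker("CB-101", "FEEDER-A"),
                new CircuitBreaker("CB-102", "FEEDER-B"));
        network.forEach(item -> item.reactTo("FEEDER-A").forEach(System.out::println));
    }
}
```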
Abstract:
Security administrators face the challenge of designing, deploying and maintaining a variety of configuration files related to security systems, especially in large-scale networks. These files have heterogeneous syntaxes and follow differing semantic concepts. Nevertheless, they are interdependent, since security services have to cooperate and their configurations have to be consistent with each other so that global security policies are completely and correctly enforced. To tackle this problem, our approach supports a comfortable definition of an abstract high-level security policy and provides an automated derivation of the desired configuration files. It is an extension of policy-based management and policy hierarchies, combining model-based management (MBM) with system modularization. MBM employs an object-oriented model of the managed system to obtain the details needed for automated policy refinement. The modularization into abstract subsystems (ASs) segments the system, and the model, into units which more closely encapsulate related system components and provide focused abstract views. As a result, scalability is achieved and even comprehensive IT systems can be modelled in a unified manner. The associated tool MoBaSeC (Model-Based-Service-Configuration) supports interactive graphical modelling, automated model analysis and policy refinement with the derivation of configuration files. We describe the MBM and AS approaches, outline the tool functions and exemplify their applications and the results obtained. Copyright (C) 2010 John Wiley & Sons, Ltd.
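A minimal sketch of the model-based refinement idea, assuming a toy object-oriented system model and an invented packet-filter syntax: an abstract policy is refined against the model into a concrete configuration line. None of this reproduces MoBaSeC's actual model or output format.

```java
import java.util.List;

/** Toy refinement of an abstract policy ("subject may use service") against an
 *  object-oriented system model into a concrete rule; all names and the generated
 *  syntax are illustrative only. */
record Host(String name, String ip) {}
record Service(String name, Host host, int port) {}
record AbstractPolicy(String subjectNet, Service service) {}

public class PolicyRefiner {
    /** Derives a packet-filter rule from the abstract policy and the model. */
    static String refine(AbstractPolicy p) {
        return String.format("permit tcp %s -> %s:%d   # %s",
                p.subjectNet(), p.service().host().ip(), p.service().port(),
                p.service().name());
    }

    public static void main(String[] args) {
        Host web = new Host("www1", "10.0.0.8");
        List<AbstractPolicy> policies =
                List.of(new AbstractPolicy("192.168.1.0/24", new Service("https", web, 443)));
        policies.forEach(p -> System.out.println(refine(p)));
    }
}
```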
Abstract:
A decision support system (DSS) based on a fuzzy logic inference system (FIS) was implemented to provide assistance in dose alteration of Duodopa infusion in patients with advanced Parkinson's disease, using data from motor state assessments and dosage. A three-tier architecture with an object-oriented approach was used. The DSS has a web-enabled graphical user interface that presents alerts indicating non-optimal dosage and states, new recommendations (typical advice with a typical dose) and statistical measurements. One data set was used for design and tuning of the FIS and another data set was used for evaluating performance compared with the actual given dose. Overall goodness-of-fit was 0.65 for the new patients (design data) and 0.98 for the ongoing patients (evaluation data). User evaluation is now ongoing. The system could work as an assistant to clinical staff for Duodopa treatment in advanced Parkinson's disease.
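A minimal fuzzy-inference sketch in the spirit of the system described; the membership functions, the rules and all numbers are invented for illustration and are not taken from the paper's FIS.

```java
/** Toy fuzzy inference for a dose-change suggestion; everything here is illustrative. */
public class DoseAdvisor {
    // Triangular membership function with feet a, c and peak b.
    static double tri(double x, double a, double b, double c) {
        if (x <= a || x >= c) return 0;
        return x < b ? (x - a) / (b - a) : (c - x) / (c - b);
    }

    /** Suggests a flow-rate change (mL/h) from a motor-state score in [-3, 3]. */
    static double suggestChange(double motorState) {
        double underMedicated = tri(motorState, -3.5, -2, 0);   // bradykinetic side
        double overMedicated  = tri(motorState,  0,  2, 3.5);   // dyskinetic side
        // Weighted-average (Sugeno-style) defuzzification of two constant consequents.
        double num = underMedicated * (+0.4) + overMedicated * (-0.4);
        double den = underMedicated + overMedicated;
        return den == 0 ? 0 : num / den;
    }

    public static void main(String[] args) {
        System.out.printf("motor state -2.5 -> change %.2f mL/h%n", suggestChange(-2.5));
    }
}
```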
Abstract:
Despite several examples of deployed agent systems, there remain barriers to the large-scale adoption of agent technologies. In order to understand these barriers, this paper considers aspects of marketing theory which deal with diffusion of innovations and their relevance to the agents domain and the current state of diffusion of agent technologies. In particular, the paper examines the role of standards in the adoption of new technologies, describes the agent standards landscape, and compares the development and diffusion of agent technologies with that of object-oriented programming. The paper also reports on a simulation model developed in order to consider different trajectories for the adoption of agent technologies, with trajectories based on various assumptions regarding industry structure and the existence of competing technology standards. We present details of the simulation model and its assumptions, along with the results of the simulation exercises.
Abstract:
Agent-oriented software engineering and software product lines are two promising software engineering techniques. Recent research work has been exploring their integration, namely multi-agent systems product lines (MAS-PLs), to promote reuse and variability management in the context of complex software systems. However, current product derivation approaches do not provide specific mechanisms to deal with MAS-PLs. This is essential because they typically encompass several concerns (e.g., trust, coordination, transaction, state persistence) that are constructed on the basis of heterogeneous technologies (e.g., object-oriented frameworks and platforms). In this paper, we propose the use of multi-level models to support the configuration knowledge specification and automatic product derivation of MAS-PLs. Our approach provides an agent-specific architecture model that uses abstractions and instantiation rules that are relevant to this application domain. In order to evaluate the feasibility and effectiveness of the proposed approach, we have implemented it as an extension of an existing product derivation tool, called GenArch. The approach has also been evaluated through the automatic instantiation of two MAS-PLs, demonstrating its potential and benefits to product derivation and configuration knowledge specification.
Abstract:
The development of software artifacts is an engineering process and, like every engineering process, involves a series of steps that must be conducted through an appropriate methodology. For a given piece of software to reach its objectives, its conceptual and architectural characteristics must be well defined before implementation. Hyperdocument-based applications have a specific characteristic, namely the definition of their navigational aspects. Navigation is a critical step in the process of defining hyperdocument-based software, since it guides the user during a visit to the content of a site. A failure in the navigation specification process causes a loss of context, disorienting the user in the application space. There are several methodologies for handling the navigational characteristics of hyperdocument-based applications. The main methodologies found in the literature were studied and analysed in this work, and a comparative analysis was carried out, outlining their approaches and steps. This study of hyperdocument specification approaches was a preliminary step, serving as a basis for the objective of this work: the construction of a graphical tool for the conceptual specification of hyperdocuments, following a methodology for modelling hyperdocument-based software. The method adopted was OOHDM (Object-Oriented Hypermedia Design Model), because it covers all the steps of an application development process, with particular attention to navigation. The tool implements a graphical interface in which the user can model the application by creating models. The specification process comprises three models: conceptual modelling, navigational modelling and interface modelling. The characteristics of the application are defined in an incremental process that starts with the conceptual definition and ends with the interface characteristics. The tool generates a prototype of the application in XML. For presenting the pages in a Web browser, XSLT is used to convert the information from XML to HTML. The models created through the abstract specification steps of the application are exported in OOHDM-ML. A case study was implemented to validate the tool. The main contributions of this work are the construction of a graphical environment for the abstract specification of hyperdocuments and an environment for prototype implementation and model export. The intention is thereby to guide, conduct and discipline the user's work during the application specification process.
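The XML-to-HTML step mentioned above can be performed with the standard JAXP transformation API; the sketch below assumes placeholder file names rather than the tool's actual paths.

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.File;

/** Applies an XSLT stylesheet to the generated XML prototype to produce HTML.
 *  File names are placeholders, not the tool's actual paths. */
public class PrototypeRenderer {
    public static void main(String[] args) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new File("prototype.xsl")));
        t.transform(new StreamSource(new File("prototype.xml")),
                    new StreamResult(new File("prototype.html")));
    }
}
```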
Abstract:
This paper presents the results of a pricing system to compute the option-adjusted spread ("OAS") of Eurobonds issued by Brazilian firms. The system computes the OAS over US Treasury rates taking into account the embedded options present in these bonds. These options can be calls ("callable bond"), puts ("putable bond") or combinations ("callable and putable bond"). The pricing model takes into account the evolution of the term structure over time, is compatible with the observable market term structure, and is able to compute risk measures such as duration and convexity, as well as to price and hedge options on these bonds. Examples show the effects of the embedded options on the spread and risk measures, as well as the effects on the spread of variations in the volatility parameters of the short rate.
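A minimal sketch of the spread-solving step only, assuming a flat benchmark curve and ignoring the embedded options (a real OAS calculation discounts option-adjusted cash flows on an interest-rate lattice); all numbers are illustrative.

```java
import java.util.function.DoubleUnaryOperator;

/** Finds the constant spread that, added to the benchmark zero rates, reprices a bond. */
public class SpreadSolver {
    static double priceWithSpread(double[] cashFlows, double[] zeroRates, double s) {
        double pv = 0;
        for (int t = 1; t <= cashFlows.length; t++)
            pv += cashFlows[t - 1] / Math.pow(1 + zeroRates[t - 1] + s, t);
        return pv;
    }

    /** Bisection on the spread; assumes price decreases as the spread rises. */
    static double solve(DoubleUnaryOperator priceOf, double target) {
        double lo = -0.05, hi = 0.20;
        for (int i = 0; i < 100; i++) {
            double mid = (lo + hi) / 2;
            if (priceOf.applyAsDouble(mid) > target) lo = mid; else hi = mid;
        }
        return (lo + hi) / 2;
    }

    public static void main(String[] args) {
        double[] cf   = {6, 6, 106};            // annual coupons, redemption at par
        double[] zero = {0.05, 0.05, 0.05};     // flat benchmark curve (illustrative)
        double spread = solve(s -> priceWithSpread(cf, zero, s), 98.0);
        System.out.printf("spread over benchmark: %.2f bp%n", spread * 1e4);
    }
}
```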
Abstract:
This thesis presents a JML-based strategy that incorporates formal specifications into the software development process of object-oriented programs. The strategy evolves functional requirements into a "semi-formal" requirements form and then expresses them as JML formal specifications. The strategy is implemented as a formal-specification pseudo-phase that runs in parallel with the other phases of software development. What makes our strategy different from other software development strategies in the literature is the particular use we make of JML specifications all along the way from requirements to validation and verification.
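A tiny illustration of the kind of JML specification such a strategy produces from a functional requirement (here, "a withdrawal never overdraws the account"); the class is invented, while the requires/ensures/invariant notation is standard JML.

```java
/** Illustrative class; only the JML annotations in comments are the point. */
public class Account {
    private /*@ spec_public @*/ int balance;   // cents

    //@ public invariant balance >= 0;

    //@ requires amount > 0 && amount <= balance;
    //@ ensures  balance == \old(balance) - amount;
    public void withdraw(int amount) {
        balance -= amount;
    }
}
```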
Abstract:
VALENTIM, R. A. M.; SOUZA NETO, Plácido Antônio de. O impacto da utilização de design patterns nas métricas e estimativas de projetos de software: a utilização de padrões tem alguma influência nas estimativas?. Revista da FARN, Natal, v. 4, p. 63-74, 2006.
Abstract:
In the last two decades of the past century, following the consolidation of the Internet as the world-wide computer network, applications generating more robust data flows started to appear. The increasing use of videoconferencing stimulated the creation of a new form of point-to-multipoint transmission called IP Multicast. All companies working in the area of software and hardware development for network videoconferencing have adjusted their products and developed new solutions for the use of multicast. However, the configuration of such different solutions is not easily done, especially when changes in the operating system are also required. Besides, the existing free tools have limited functions, and the current commercial solutions are heavily dependent on specific platforms. Along with the maturity of IP Multicast technology and its inclusion in all current operating systems, object-oriented programming languages have developed classes able to handle multicast traffic. So, with the help of Java APIs for networking, databases and hypertext, it became possible to develop an Integrated Environment able to handle multicast traffic, which is the major objective of this work. This document describes the implementation of the above-mentioned environment, which provides many functions for using and managing multicast traffic, functions which previously existed only in a limited way and in just a few tools, normally commercial ones. The environment is useful to different kinds of users: common users who want to join multimedia Internet sessions, as well as more advanced users such as engineers and network administrators who may need to monitor and handle multicast traffic.
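A minimal sketch of the Java multicast API such an environment builds on: join a group and print whatever datagrams arrive. The group address and port are placeholders for whatever session the user selects.

```java
import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;

/** Joins a multicast group and reports incoming datagrams; address and port are placeholders. */
public class MulticastListener {
    public static void main(String[] args) throws Exception {
        InetAddress group = InetAddress.getByName("239.1.2.3");   // example group address
        try (MulticastSocket socket = new MulticastSocket(5004)) {
            socket.joinGroup(group);                // classic API; deprecated since Java 14 but functional
            byte[] buf = new byte[1500];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet);
                System.out.printf("%d bytes from %s%n",
                        packet.getLength(), packet.getAddress());
            }
        }
    }
}
```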