930 results for object-oriented software framework


Relevance:

40.00%

Publisher:

Abstract:

Strategic sourcing has increased in importance in recent years and now plays an important role in companies' planning. The current volatility in supply markets means companies face multiple challenges involving lock-in situations, supplier bankruptcies or supply security issues. In addition, their exposure can increase due to natural disasters, as witnessed recently in the form of bird flu, volcanic ash and tsunamis. Therefore, the primary focus of this study is risk management in the context of strategic sourcing. The study presents a literature review on sourcing covering the 15 years from 1998 to 2012 and considers 131 academic articles. The literature describes strategic sourcing as a strategic, holistic process for managing supplier relationships, with a long-term focus on adding value to the company and realising competitive advantage. Few studies have examined the actual risk impact and the status of risk management in strategic sourcing, and evaluation across countries and industries has been limited, with the construction sector particularly under-researched. The methodology is founded on a qualitative study of twenty cases from the construction sector and electronics manufacturing industries across Germany and the United Kingdom. In considering risk management in the context of strategic sourcing, the thesis takes into account six dimensions that cover trends in strategic sourcing, theoretical and practical sourcing models, risk management, supply and demand management, critical success factors and strategic supplier evaluation. The study contributes in several ways. First, recent trends are traced and future needs identified across the research dimensions of countries, industries and companies. Second, it evaluates critical success factors in contemporary strategic sourcing. Third, it explores the application of theoretical and practical sourcing models in terms of effectiveness and sustainability.
Fourth, based on the case study findings, a risk-oriented strategic sourcing framework and a model for strategic sourcing are developed. These are based on the validation of contemporary requirements and a critical evaluation of the existing situation. The framework reflects the empirical findings and leads to a structured process for managing risk in strategic sourcing. The risk-oriented framework considers areas such as trends, corporate and sourcing strategy, critical success factors, strategic supplier selection criteria, risk assessment, reporting and strategy alignment. The proposed model highlights the essential dimensions of strategic sourcing and leads to a new definition of strategic sourcing supported by this empirical study.

Relevance:

40.00%

Publisher:

Abstract:

Presented is webComputing, a general framework of mathematically oriented services including remote access to hardware and software resources for mathematical computations, and a web interface for dynamic interactive computations and visualization in a diversity of contexts: mathematical research and engineering, computer-aided mathematical/technical education and distance learning. webComputing builds on the innovative webMathematica technology, which connects the technical computing system Mathematica to a web server and provides tools for building a dynamic and interactive web interface to Mathematica-based functionality. Discussed are the conception and some of the major components of the webComputing service: Scientific Visualization, Domain-Specific Computations, Interactive Education, and Authoring of Interactive Pages.
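The request/compute/respond cycle such a service provides can be sketched in miniature. The handler below is a hypothetical, dependency-free stand-in (none of these names come from webComputing or webMathematica): a tiny safe arithmetic evaluator plays the role of the computing kernel behind a web endpoint.

```python
import ast
import operator

# Supported operators for the toy evaluator (illustrative only).
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def evaluate(expr: str):
    """Evaluate a plain arithmetic expression without calling eval()."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError("unsupported syntax")
    return walk(ast.parse(expr, mode="eval"))

def handle_request(params: dict) -> dict:
    """Server-side handler: compute and package a response for the web page."""
    try:
        return {"status": "ok", "result": evaluate(params["input"])}
    except (ValueError, SyntaxError, KeyError) as exc:
        return {"status": "error", "message": str(exc)}

print(handle_request({"input": "2**10 + 1"}))   # {'status': 'ok', 'result': 1025}
```

Rejecting anything but arithmetic nodes is what keeps a remote-computation endpoint of this kind from executing arbitrary code.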

Relevance:

40.00%

Publisher:

Abstract:

Software applications created on top of the service-oriented architecture (SOA) are increasingly popular, but testing them remains a challenge. In this paper a framework named TASSA for testing the functional and non-functional behaviour of service-based applications is presented. The paper focuses on the concept of design-time testing, the corresponding testing approach and the architectural integration of the constituent TASSA tools. The individual TASSA tools, with sample validation scenarios, were presented previously with a general view of their relation. This paper's contribution is the structured testing approach, based on the integral use of the tools and their architectural integration. The framework is based on SOA principles and is composable depending on user requirements.
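As a rough illustration of design-time testing of a service-based application (names and the fallback policy are invented, not the TASSA API): a stub replaces a partner service that is not yet available, faults are injected into the stub, and both a functional and a non-functional (response-time) requirement are checked.

```python
import time

class ServiceStub:
    """Stands in for a partner service that is not yet implemented."""
    def __init__(self, reply, delay=0.0, fail=False):
        self.reply, self.delay, self.fail = reply, delay, fail

    def invoke(self, request):
        time.sleep(self.delay)
        if self.fail:
            raise ConnectionError("injected fault")
        return self.reply

def client_with_fallback(service, request, timeout=0.1):
    """Application logic under test: fall back to a cached answer on failure."""
    start = time.monotonic()
    try:
        reply = service.invoke(request)
    except ConnectionError:
        return "cached-answer"
    if time.monotonic() - start > timeout:
        return "cached-answer"   # too slow: SLA violated, use fallback
    return reply

# Functional check: the happy path returns the real reply.
assert client_with_fallback(ServiceStub("fresh"), "q") == "fresh"
# Fault injection: a failing dependency triggers the fallback.
assert client_with_fallback(ServiceStub("fresh", fail=True), "q") == "cached-answer"
# Non-functional check: a slow dependency also triggers the fallback.
assert client_with_fallback(ServiceStub("fresh", delay=0.2), "q") == "cached-answer"
```

The point of design-time testing is exactly this: the application's behaviour can be validated before the real services it composes exist.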

Relevance:

40.00%

Publisher:

Abstract:

ACM Computing Classification System (1998): D.2.5, D.2.9, D.2.11.

Relevance:

40.00%

Publisher:

Abstract:

Todor P. Cholakov, Dimitar Y. Birov - This article presents a comprehensive model for the automated reengineering of legacy systems. It describes in detail the processes of software translation and refactoring and the extent to which these processes can be automated. With respect to code translation, a model is presented for the automated translation of code that contains pointers and address arithmetic. A framework for the reengineering process is also defined, and opportunities are outlined for the further development of concepts, tools and algorithms.
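The pointer-translation idea can be sketched as follows. This is a hypothetical illustration, not the authors' algorithm: a C pointer with address arithmetic is mapped to a (buffer, index) pair in a managed target language, so that `p + n` becomes index adjustment and `*p` becomes an indexed read or write.

```python
class Pointer:
    """Target-language stand-in for a C pointer into an array."""
    def __init__(self, buffer, index=0):
        self.buffer, self.index = buffer, index

    def __add__(self, n):            # translates  p + n
        return Pointer(self.buffer, self.index + n)

    def deref(self):                 # translates  *p
        return self.buffer[self.index]

    def store(self, value):          # translates  *p = value
        self.buffer[self.index] = value

# Original C fragment:   int a[4] = {10, 20, 30, 40};
#                        int *p = a + 1;  *(p + 2) = 99;  return *p;
a = [10, 20, 30, 40]
p = Pointer(a) + 1
(p + 2).store(99)
print(p.deref(), a)   # 20 [10, 20, 30, 99]
```

Representing a pointer as an explicit (buffer, index) pair is what makes the translated code safe in a language without raw addresses, at the cost of losing arithmetic across unrelated allocations.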

Relevance:

40.00%

Publisher:

Abstract:

Software architecture is the abstract design of a software system. It plays a key role as a bridge between requirements and implementation, and is a blueprint for development. The architecture represents a set of early design decisions that are crucial to a system. Mistakes in those decisions are very costly if they remain undetected until the system is implemented and deployed. This is where formal specification and analysis fit in. Formal specification ensures that an architecture design is represented in a rigorous and unambiguous way. Furthermore, a formally specified model allows the use of different analysis techniques for verifying the correctness of those crucial design decisions. This dissertation presents a framework, called SAM, for the formal specification and analysis of software architectures. In terms of specification, formalisms and mechanisms were identified and chosen to specify software architecture based on different analysis needs. Formalisms for specifying properties were also explored, especially in the case of non-functional properties. In terms of analysis, the dissertation explores both the verification of functional properties and the evaluation of non-functional properties of software architecture. For the verification of functional properties, methodologies are presented on how to apply existing model checking techniques to a SAM model. For the evaluation of non-functional properties, the dissertation first shows how to incorporate stochastic information into a SAM model, and then explains how to translate the model to existing tools and conduct the analysis using those tools. To alleviate the analysis work, a tool is also provided to automatically translate a SAM model for model checking. All the techniques and methods described in the dissertation are illustrated by examples or case studies, which also serve the purpose of advocating the use of formal methods in practice.
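What "verifying a functional property of a formal architecture model" amounts to can be shown with a toy example. The two-process lock model and property below are invented for illustration; a SAM model would be far richer, but the exhaustive state-space search is the same idea a model checker mechanizes.

```python
from collections import deque

def transitions(state):
    """Yield successor states of (pc0, pc1, lock); pc: 0=idle, 1=critical."""
    pcs, lock = list(state[:2]), state[2]
    for i in (0, 1):
        if pcs[i] == 0 and not lock:           # acquire lock, enter critical
            nxt = pcs.copy(); nxt[i] = 1
            yield (nxt[0], nxt[1], True)
        elif pcs[i] == 1:                      # leave critical, release lock
            nxt = pcs.copy(); nxt[i] = 0
            yield (nxt[0], nxt[1], False)

def check_safety(initial, safe):
    """Breadth-first search of the state space; return a violating state or None."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not safe(state):
            return state
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None

mutual_exclusion = lambda s: not (s[0] == 1 and s[1] == 1)
print(check_safety((0, 0, False), mutual_exclusion))   # None: property holds
```

A real model checker adds symbolic state representation and temporal-logic properties, but the payoff is the same: a design error surfaces as a concrete violating state long before any code is deployed.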

Relevance:

40.00%

Publisher:

Abstract:

Software development is an extremely complex process, during which human errors are introduced and result in faulty software systems. It is highly desirable and important that these errors be prevented and detected as early as possible. Software architecture design is a high-level system description, which embodies many system features and properties that are eventually implemented in the final operational system. Therefore, methods for modeling and analyzing software architecture descriptions can help prevent and reveal human errors and thus improve software quality. Furthermore, if an analyzed software architecture description can be used to derive a partial software implementation, especially when the derivation can be automated, significant benefits can be gained with regard to both system quality and productivity. This dissertation proposes a framework for an integrated analysis of both the design and the implementation. To ensure the desirable properties of the architecture model, we apply formal verification using the model checking technique. To ensure the desirable properties of the implementation, we develop a methodology and an associated tool to translate an architecture specification into an implementation written in a combination of the ArchJava, Java and AspectJ programming languages. The translation is semi-automatic, so that many manual programming errors can be prevented. Furthermore, the translation inserts monitoring code into the implementation so that runtime verification can be performed, which provides additional assurance of the quality of the implementation. Moreover, validations for the translations from architecture model to program are provided. Finally, several case studies are conducted and presented.

Relevance:

40.00%

Publisher:

Abstract:

In the past two decades, multi-agent systems (MAS) have emerged as a new paradigm for conceptualizing large and complex distributed software systems. A multi-agent system view provides a natural abstraction for both the structure and the behavior of modern-day software systems. Although there were many conceptual frameworks for using multi-agent systems, there was no well-established and widely accepted method for modeling them. This dissertation research addresses the representation and analysis of multi-agent systems based on model-oriented formal methods. The objective is to provide a systematic approach for studying MAS at an early stage of system development to ensure the quality of design. Given that there was no well-defined formal model directly supporting agent-oriented modeling, this study centers on three main topics: (1) adapting a well-known formal model, predicate transition nets (PrT nets), to support MAS modeling; (2) formulating a modeling methodology to ease the construction of formal MAS models; and (3) developing a technique to support machine analysis of formal MAS models using model checking technology. PrT nets are extended to include the notions of dynamic structure, agent communication and coordination to support agent-oriented modeling. An aspect-oriented technique is developed to address the modularity of agent models and the compositionality of incremental analysis. A set of translation rules is defined to systematically translate formal MAS models into concrete models that can be verified with the model checker SPIN (Simple Promela Interpreter). This dissertation presents the framework developed for modeling and analyzing MAS, including a well-defined process model based on nested PrT nets, and a comprehensive methodology to guide the construction and analysis of formal MAS models.
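The token-game semantics underlying such nets can be sketched briefly. The following is a simplification (a plain place/transition net rather than a predicate transition net, with invented place names): a transition is enabled when every input place holds enough tokens, and firing moves tokens from input to output places.

```python
def enabled(marking, transition):
    """A transition is enabled when each input place holds enough tokens."""
    inputs, _ = transition
    return all(marking.get(p, 0) >= n for p, n in inputs.items())

def fire(marking, transition):
    """Firing consumes input tokens and produces output tokens."""
    inputs, outputs = transition
    m = dict(marking)
    for p, n in inputs.items():
        m[p] -= n
    for p, n in outputs.items():
        m[p] = m.get(p, 0) + n
    return m

# Tiny producer/consumer net: produce adds a token to the buffer,
# consume removes one.  transition = (input places, output places).
produce = ({"ready": 1}, {"buffer": 1, "ready": 1})
consume = ({"buffer": 1}, {"done": 1})

m = {"ready": 1}
assert not enabled(m, consume)        # empty buffer: consumer blocked
m = fire(m, produce)
assert enabled(m, consume)
m = fire(m, consume)
print(m)   # {'ready': 1, 'buffer': 0, 'done': 1}
```

PrT nets enrich this picture with predicates and structured tokens, which is what lets a net model agent communication; the translation to SPIN then explores the reachable markings exhaustively.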

Relevance:

40.00%

Publisher:

Abstract:

The spread of wireless networks and the growing proliferation of mobile devices require the development of mobility control mechanisms to support the different demands of traffic under different network conditions. A major obstacle to developing this kind of technology is the complexity involved in handling all the information about the large number of Moving Objects (MO), as well as the signaling overhead required to manage these procedures in the network. Although several initiatives have been proposed by the scientific community to address this issue, they have not proved effective, since they depend on a particular request from the MO, which is responsible for triggering the mobility process. Moreover, they are often guided only by wireless medium statistics, such as the Received Signal Strength Indicator (RSSI) of the candidate Point of Attachment (PoA). Thus, this work seeks to develop, evaluate and validate a sophisticated communication infrastructure for Wireless Networking for Moving Objects (WiNeMO) systems by making use of the flexibility provided by the Software-Defined Networking (SDN) paradigm, where network functions are easily and efficiently deployed by integrating the OpenFlow and IEEE 802.21 standards. For benchmarking purposes, the analysis was conducted on both the control and data planes, demonstrating that the proposal significantly outperforms typical IP-based SDN and QoS-enabled capabilities by allowing the network to handle multimedia traffic with optimal Quality of Service (QoS) transport and acceptable Quality of Experience (QoE) over time.
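The network-side selection step such an infrastructure enables can be sketched as follows. The scoring function, field names and weights are invented for illustration; the point, as the abstract argues, is that the handover decision should weigh more than the RSSI of each candidate PoA.

```python
def score(poa, w_rssi=0.6, w_load=0.4):
    """Higher is better: strong signal on a lightly loaded attachment point."""
    rssi_norm = (poa["rssi_dbm"] + 100) / 70      # map roughly [-100,-30] dBm to [0,1]
    return w_rssi * rssi_norm + w_load * (1.0 - poa["load"])

def select_poa(candidates):
    """Controller-side choice of the next Point of Attachment."""
    return max(candidates, key=score)["name"]

candidates = [
    {"name": "ap-1", "rssi_dbm": -45, "load": 0.95},  # strong but congested
    {"name": "ap-2", "rssi_dbm": -60, "load": 0.20},  # weaker, mostly idle
]
print(select_poa(candidates))   # ap-2
```

An RSSI-only policy would pick ap-1 here; folding in load (or any QoS metric the controller can observe) steers the flow to the attachment point that can actually carry it.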

Relevance:

40.00%

Publisher:

Abstract:

Variability management is one of the major challenges in software product line adoption, since variability needs to be efficiently managed at various levels of the software product line development process (e.g., requirements analysis, design, implementation). One of the main challenges within variability management is the handling and effective visualization of large-scale (industry-size) models, which in many projects can reach the order of thousands of variability points, along with the dependency relationships that exist among them. These have raised many concerns regarding the scalability of current variability management tools and techniques and their lack of industrial adoption. To address the scalability issues, this work employed a combination of quantitative and qualitative research methods to identify the reasons behind the limited scalability of existing variability management tools and techniques. In addition to producing a comprehensive catalogue of existing tools, the outcome from this stage helped in understanding the major limitations of existing tools. Based on the findings, a novel approach was created for managing variability that employs two main principles for supporting scalability. First, the separation-of-concerns principle is employed by creating multiple views of variability models to alleviate information overload. Second, hyperbolic trees are used to visualise models (compared to the Euclidean-space trees traditionally used). The result is an approach that can represent models encompassing hundreds of variability points and complex relationships. These concepts were demonstrated by implementing them in an existing variability management tool and using it to model a real-life product line with over a thousand variability points. Finally, in order to assess the work, an evaluation framework was designed based on various established usability assessment best practices and standards.
The framework was then used with several case studies to benchmark the performance of this work against other existing tools.
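The kind of variability model and dependency relationships discussed above can be illustrated with a minimal sketch (feature names and constraints are invented; real industry-size models hold thousands of such variability points): features carry requires/excludes constraints, and a configuration is valid only if none are violated.

```python
# Cross-tree constraints of a toy feature model (illustrative names).
REQUIRES = {"video_call": {"camera"}, "gps_logging": {"gps"}}
EXCLUDES = {("basic_ui", "touch_ui")}

def violations(selected):
    """Return human-readable constraint violations for a configuration."""
    problems = []
    for feature, needed in REQUIRES.items():
        if feature in selected:
            for dep in needed - selected:
                problems.append(f"{feature} requires {dep}")
    for a, b in EXCLUDES:
        if a in selected and b in selected:
            problems.append(f"{a} excludes {b}")
    return problems

print(violations({"video_call", "basic_ui", "touch_ui"}))
```

Checking constraints is the easy part; the scalability problem the abstract targets is letting a human *see* thousands of such features and edges at once, which is where the multiple-view and hyperbolic-tree ideas come in.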

Relevance:

40.00%

Publisher:

Abstract:

The scientific success of the LHC experiments at CERN depends highly on the availability of computing resources that efficiently store, process and analyse the amount of data collected every year. This is ensured by the Worldwide LHC Computing Grid infrastructure, which connects computing centres distributed all over the world via high-performance networks. The LHC has an ambitious experimental programme for the coming years, which includes large investments in and improvements to both the hardware of the detectors and the software and computing systems, in order to deal with the huge increase in the event rate expected in the High Luminosity LHC (HL-LHC) phase and, consequently, with the huge amount of data that will be produced. In recent years, the role of Artificial Intelligence has become increasingly relevant in the High Energy Physics (HEP) world. Machine Learning (ML) and Deep Learning algorithms have been successfully used in many areas of HEP, such as online and offline reconstruction programs, detector simulation, object reconstruction and identification, and Monte Carlo generation, and they will certainly be crucial in the HL-LHC phase. This thesis aims to contribute to a CMS R&D project regarding an ML "as a Service" solution for HEP needs (MLaaS4HEP). It consists of a data service able to perform an entire ML pipeline (reading data, processing data, training ML models, serving predictions) in a completely model-agnostic fashion, directly using ROOT files of arbitrary size from local or distributed data sources. This framework has been updated with new features in the data preprocessing phase, allowing more flexibility for the user. Since the MLaaS4HEP framework is experiment-agnostic, the ATLAS Higgs Boson ML challenge was chosen as the physics use case, with the aim of testing MLaaS4HEP and the contributions made in this work.
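The model-agnostic pipeline idea can be sketched without any ML backend at all. Everything below is illustrative, not the MLaaS4HEP API: an in-memory reader stands in for chunked ROOT I/O, and a trivial threshold "model" stands in for a real framework; the pipeline only assumes the model object exposes fit/predict.

```python
def read_events(source):
    """Stand-in for chunked ROOT-file reading: yield event dictionaries."""
    yield from source

def preprocess(event, branches):
    """Select the requested branches and flatten them into a feature row."""
    return [float(event[b]) for b in branches]

class ThresholdModel:
    """Any object with fit/predict can be plugged into the pipeline."""
    def fit(self, rows, labels):
        positives = [r[0] for r, y in zip(rows, labels) if y == 1]
        self.cut = min(positives)          # crudest possible "training"
        return self

    def predict(self, rows):
        return [1 if r[0] >= self.cut else 0 for r in rows]

def run_pipeline(source, branches, model, labels):
    """Read -> preprocess -> train, independent of the model's backend."""
    rows = [preprocess(e, branches) for e in read_events(source)]
    return model.fit(rows, labels)

events = [{"pt": 12.0}, {"pt": 48.0}, {"pt": 55.0}]
model = run_pipeline(events, ["pt"], ThresholdModel(), labels=[0, 1, 1])
print(model.predict([[30.0], [50.0]]))   # [0, 1]
```

Because the pipeline touches the model only through fit/predict, the same reading and preprocessing stages can serve any framework the user supplies, which is the "model-agnostic" property the thesis builds on.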

Relevance:

30.00%

Publisher:

Abstract:

The purpose is to present the research that led to the modeling of an information system aimed at the maintenance of traceability data in the Brazilian wine industry, according to the principles of a service-oriented architecture (SOA). Since 2005, traceability data maintenance has been an obligation for all producers that intend to export to any European Union country. In addition, final customers, including Brazilian ones, have been asking for information about food products. A solution that addressed the industry collectively was sought, so that consortiums of producer associations could share the costs and benefits of such a solution. Following an extensive bibliographic review, a series of interviews conducted with Brazilian researchers and wine producers in Bento Goncalves - RS, Brazil, elucidated many aspects of the wine production process. Information technology issues related to the theme were also researched. The software was modeled with the Unified Modeling Language (UML) and uses web services for data exchange. A model for the wine production process was also proposed. A functional prototype showed that the adopted model is able to fulfill the demands of wine producers. The good results obtained lead us to consider the use of this model in other domains.
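A minimal sketch of the traceability data model such a system maintains (field names and the endpoint are invented, not taken from the paper): each production step appends an event to a lot's history, which a SOA endpoint could then serve as JSON to the final customer.

```python
import json

class TraceabilityService:
    def __init__(self):
        self.events = {}            # lot id -> ordered list of events

    def record(self, lot_id, step, **details):
        """Each production step appends an event to the lot's trace."""
        self.events.setdefault(lot_id, []).append({"step": step, **details})

    def trace_as_json(self, lot_id):
        """What a web-service endpoint like GET /trace/<lot_id> might return."""
        return json.dumps({"lot": lot_id, "history": self.events.get(lot_id, [])})

svc = TraceabilityService()
svc.record("L-2024-07", "harvest", vineyard="Vale dos Vinhedos", grape="Merlot")
svc.record("L-2024-07", "fermentation", tank="T3")
svc.record("L-2024-07", "bottling", batch="B112")

trace = json.loads(svc.trace_as_json("L-2024-07"))
print([e["step"] for e in trace["history"]])   # ['harvest', 'fermentation', 'bottling']
```

Exposing the trace through a service interface rather than a shared database is what lets a consortium of producers share one system while keeping each producer's records behind its own endpoint.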

Relevance:

30.00%

Publisher:

Abstract:

Results of two experiments are reported that examined how people respond to rectangular targets of different sizes in simple hitting tasks. If a target moves in a straight line and a person is constrained to move along a linear track oriented perpendicular to the target's motion, then the length of the target along its direction of motion constrains the temporal accuracy and precision required to make the interception. The dimensions of the target perpendicular to its direction of motion place no constraints on performance in such a task. In contrast, if the person is not constrained to move along a straight track, the target's dimensions may constrain the spatial as well as the temporal accuracy and precision. The experiments reported here examined how people responded to targets of different vertical extent (height): the task was to strike targets that moved along a straight, horizontal path. In experiment 1 participants were constrained to move along a horizontal linear track to strike targets and so target height did not constrain performance. Target height, length and speed were co-varied. Movement time (MT) was unaffected by target height but was systematically affected by length (briefer movements to smaller targets) and speed (briefer movements to faster targets). Peak movement speed (Vmax) was influenced by all three independent variables: participants struck shorter, narrower and faster targets harder. In experiment 2, participants were constrained to move in a vertical plane normal to the target's direction of motion. In this task target height constrains the spatial accuracy required to contact the target. Three groups of eight participants struck targets of different height but of constant length and speed, hence constant temporal accuracy demand (different for each group; one group struck stationary targets = no temporal accuracy demand).
On average, participants showed little or no systematic response to changes in spatial accuracy demand on any dependent measure (MT, Vmax, spatial variable error). The results are interpreted in relation to previous results on movements aimed at stationary targets in the absence of visual feedback.

Relevance:

30.00%

Publisher:

Abstract:

We explore the feasibility of the computationally oriented institutional agency framework proposed by Governatori and Rotolo by testing it against an industrial-strength scenario. In particular, we show how to encode in defeasible logic the dispute resolution policy described in Article 67 of FIDIC.
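In drastically simplified form, defeasible reasoning of this kind can be sketched as follows. This is not the Governatori-Rotolo logic or the actual Article 67 policy; the rules, facts and superiority relation are invented to show the mechanism: a defeasible conclusion stands unless an applicable conflicting rule that is not inferior to its support attacks it.

```python
# rule = (name, antecedents, conclusion); "p" and "-p" are conflicting literals.
RULES = [
    ("r1", {"claim_filed"}, "refer_to_adjudicator"),
    ("r2", {"claim_filed", "deadline_missed"}, "-refer_to_adjudicator"),
]
SUPERIOR = {("r2", "r1")}    # r2 overrides r1 when both apply

def negate(lit):
    return lit[1:] if lit.startswith("-") else "-" + lit

def defeasibly_holds(lit, facts):
    """lit holds if some applicable supporting rule beats every applicable attacker."""
    support = [r for r in RULES if r[2] == lit and r[1] <= facts]
    attack = [r for r in RULES if r[2] == negate(lit) and r[1] <= facts]
    return any(all((s[0], a[0]) in SUPERIOR for a in attack) for s in support)

# With only a claim filed, referral is defeasibly concluded.
assert defeasibly_holds("refer_to_adjudicator", {"claim_filed"})
# A missed deadline activates the superior exception and blocks it.
assert not defeasibly_holds("refer_to_adjudicator", {"claim_filed", "deadline_missed"})
assert defeasibly_holds("-refer_to_adjudicator", {"claim_filed", "deadline_missed"})
```

This defeat-by-superiority pattern is what makes defeasible logic a natural fit for contract clauses, which are typically general rules hedged by explicit exceptions.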

Relevance:

30.00%

Publisher:

Abstract:

A major challenge in successfully implementing transit-oriented development (TOD) is having a robust process that ensures the effective appraisal, initiation and delivery of multi-stakeholder TOD projects. A step-by-step project development process can assist in the methodical design, evaluation and initiation of TOD projects. Successful TOD requires attention to transit, mixed-use development and public space. Brisbane, Australia provides a case study where recent planning policies and infrastructure documents have laid a foundation for TOD, but where barriers lie in precinct-level planning and project implementation. In this context, and perhaps in others, the research effort needs to shift toward the identification of appropriate project processes and strategies. This paper presents the outcomes of the research conducted to date. Drawing on the mainstream approach to project development and financial evaluation for property projects, key steps for potential use in the successful delivery of TOD projects have been identified, including: establishing the framework; location selection; precinct context review; preliminary precinct design; the initial financial viability study; the decision stage; establishment of the project structure; land acquisition; development application; and project delivery. The appropriateness of this mainstream development and appraisal process will be tested through stakeholder research, and the proposed process will then be refined for adoption in TOD projects. It is suggested that the criteria for successful TOD should be broadened beyond financial concerns in order to secure public sector support for project initiation.
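The "initial financial viability study" step typically rests on a discounted-cash-flow calculation of the following kind (all figures and the discount rate are invented; the paper's argument is precisely that such a test alone is too narrow for TOD):

```python
def npv(rate, cash_flows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: land acquisition and construction outlay; years 1-4: net income.
flows = [-10_000_000, 2_500_000, 3_000_000, 3_500_000, 4_000_000]
value = npv(0.08, flows)
print(f"NPV at 8%: {value:,.0f}")    # positive -> passes the financial test
```

A project can pass this test yet fail as TOD (or vice versa), which is why the paper argues for broadening the decision criteria beyond financial viability before seeking public sector support.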