925 results for Service-oriented Architecture


Relevance: 30.00%

Abstract:

Owing to the high vulnerability of liquid retaining structures to corrosion problems, their design is subject to stringent requirements against cracking. In this paper, a prototype knowledge-based system for the design of liquid retaining structures is developed and implemented based on the blackboard architecture. A commercially available expert system shell, VISUAL RULE STUDIO, working as an ActiveX Designer under the VISUAL BASIC programming environment, is employed. A hybrid knowledge representation approach, combining production rules with procedural methods under object-oriented programming, is used to represent the engineering heuristics and design knowledge of this domain. It is demonstrated that the blackboard architecture is capable of integrating different types of knowledge in an effective manner. The system is tailored to give advice to users regarding preliminary design, loading specification and optimized configuration selection for this type of structure. An example application is given to illustrate the capabilities of the prototype system in transferring knowledge on liquid retaining structures to novice engineers. (C) 2004 Elsevier Ltd. All rights reserved.
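
The blackboard pattern the paper relies on can be summarised with a minimal sketch, given here in Python as a stand-in for the VISUAL RULE STUDIO / Visual Basic implementation; the class names, rules and numeric limits below are illustrative assumptions, not taken from the paper.

    # Minimal, hypothetical sketch of a blackboard architecture: independent
    # knowledge sources read and post design facts on a shared workspace.

    class Blackboard:
        """Shared workspace where knowledge sources read and post design facts."""
        def __init__(self):
            self.facts = {}

        def post(self, key, value):
            self.facts[key] = value

        def get(self, key):
            return self.facts.get(key)

    class CrackWidthRule:
        """Production-rule-style knowledge source: flags sections exceeding an
        assumed crack-width limit (0.2 mm here, purely illustrative)."""
        def contribute(self, bb):
            if bb.get("crack_width_mm") is not None:
                bb.post("crack_ok", bb.get("crack_width_mm") <= 0.2)

    class PreliminarySizingRule:
        """Procedural knowledge source: proposes a wall thickness from the
        retained liquid depth using a simple invented heuristic."""
        def contribute(self, bb):
            depth = bb.get("liquid_depth_m")
            if depth is not None:
                bb.post("wall_thickness_mm", max(200, int(depth * 100)))

    # Control loop: let each knowledge source act on the shared blackboard.
    bb = Blackboard()
    bb.post("liquid_depth_m", 4.0)
    bb.post("crack_width_mm", 0.15)
    for source in (CrackWidthRule(), PreliminarySizingRule()):
        source.contribute(bb)
    print(bb.facts)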

Relevance: 30.00%

Abstract:

The roiling financial markets, constantly changing tax law and the increasing complexity of planning transactions increase the demand for aggregated family wealth management (FWM) services. However, the current trend in developing such advisory systems focuses mainly on the financial or investment side. In addition, existing systems lack flexibility and are hard to integrate with other organizational information systems, such as CRM systems. In this paper, a novel architecture for Web-service-agents-based FWM systems is proposed. Multiple intelligent agents are wrapped as Web services and can communicate with each other via Web service protocols. On the one hand, these agents can collaborate with each other to provide comprehensive FWM advice. On the other hand, each service can work independently to accomplish its own tasks. A prototype system for supporting financial advice is also presented to demonstrate the advantages of the proposed Web-service-agents-based FWM system architecture.
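
As a rough illustration of the agent composition described above, here is a minimal Python sketch; the agent classes (TaxAgent, InvestmentAgent), the coordinator and the advice rules are invented, and the real system wraps each agent as a Web service communicating via Web service protocols rather than direct method calls.

    # Hypothetical sketch: each agent can answer requests on its own, or be
    # composed by a coordinator into aggregated family wealth management advice.

    class TaxAgent:
        def advise(self, profile):
            return {"tax": "defer gains" if profile["income"] > 100_000 else "standard filing"}

    class InvestmentAgent:
        def advise(self, profile):
            return {"investment": "60/40 portfolio" if profile["risk"] == "moderate" else "review risk"}

    class FWMCoordinator:
        """Aggregates advice from independently usable agent services."""
        def __init__(self, agents):
            self.agents = agents

        def advise(self, profile):
            combined = {}
            for agent in self.agents:
                combined.update(agent.advise(profile))  # in practice: a Web service call
            return combined

    coordinator = FWMCoordinator([TaxAgent(), InvestmentAgent()])
    print(coordinator.advise({"income": 150_000, "risk": "moderate"}))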

Relevance: 30.00%

Abstract:

We propose an asymmetric multi-processor SoC architecture, featuring a master CPU running uClinux and multiple loosely coupled slave CPUs running real-time threads assigned by the master CPU. Real-time SoC architectures often demand a compromise between a generic platform for different applications and application-specific customizations to achieve performance requirements. Our proposed architecture offers a generic platform running a conventional embedded operating system, providing a traditional software-oriented development approach, while multiple slave CPUs act as dedicated, independent execution units for real-time threads, running in parallel with the master CPU to meet performance requirements. In this paper, the architecture is described, including the application/threading development environment. The performance of the architecture with several standard benchmark routines is also analysed.
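
As a loose analogy for the master/slave dispatch described above (not the paper's hardware or software), the sketch below uses ordinary Python threads and queues to stand in for the slave CPUs and their mailboxes; the task names are invented.

    # Illustrative only: a master assigns work items to loosely coupled slaves,
    # each of which executes its assigned work independently.

    import queue
    import threading

    def slave_cpu(slave_id, mailbox):
        """Runs work items assigned by the master until told to stop."""
        while True:
            work = mailbox.get()
            if work is None:
                break
            print(f"slave {slave_id} executing {work}")

    mailboxes = [queue.Queue() for _ in range(3)]
    slaves = [threading.Thread(target=slave_cpu, args=(i, mb)) for i, mb in enumerate(mailboxes)]
    for s in slaves:
        s.start()

    # Master: assign work items round-robin to the slaves.
    for i, task in enumerate(["sample_adc", "filter_block", "pwm_update", "log_frame"]):
        mailboxes[i % len(mailboxes)].put(task)
    for mb in mailboxes:
        mb.put(None)  # shutdown signal
    for s in slaves:
        s.join()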

Relevance: 30.00%

Abstract:

This paper asks two questions. First, what types of linkages make firms in the service sector innovate? And second, what is the link between innovation and firms' productivity and export performance? Using survey data from Northern Ireland, we find that intra-regional links (i.e. within Northern Ireland) to customers, suppliers and universities have little effect on innovation, but external links (i.e. outside Northern Ireland) help to boost innovation. The relationships between innovation, exporting and productivity prove complex, but suggest that innovation by itself is not sufficient to generate productivity improvements. Only when innovation is combined with increased export activity are productivity gains produced. This suggests that regional innovation policy should be oriented towards helping firms to innovate only where it helps them to enter export markets or to expand their existing export market presence.

Relevance: 30.00%

Abstract:

An interoperable Web Processing Service (WPS) for the automatic interpolation of environmental data has been developed within the framework of the INTAMAP project. In order to assess the performance of the interpolation method implemented, a validation WPS has also been developed. This validation WPS can be used to perform leave-one-out and K-fold cross validation: a full dataset is submitted and a range of validation statistics and diagnostic plots (e.g. histograms, variogram of residuals, mean errors) is received in return. This paper presents the architecture of the validation WPS and uses a case study to briefly illustrate its use in practice. We conclude with a discussion of the current limitations of the system and proposals for further developments.
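
To make the cross-validation idea concrete, here is a minimal leave-one-out sketch in Python, assuming a simple inverse-distance-weighting interpolator and made-up observations as stand-ins for the INTAMAP interpolation methods and real environmental data.

    # Each observation is withheld in turn, predicted from the remaining points,
    # and the prediction errors are summarised (mean error, RMSE).

    def idw_predict(x, y, points, power=2):
        """Inverse-distance-weighted prediction at (x, y) from (px, py, value) tuples."""
        num = den = 0.0
        for (px, py, value) in points:
            d2 = (x - px) ** 2 + (y - py) ** 2
            if d2 == 0:
                return value
            w = 1.0 / d2 ** (power / 2)
            num += w * value
            den += w
        return num / den

    data = [(0, 0, 1.2), (1, 0, 1.5), (0, 1, 1.1), (1, 1, 1.8), (0.5, 0.5, 1.4)]

    errors = []
    for i, (x, y, observed) in enumerate(data):
        training = data[:i] + data[i + 1:]          # leave one observation out
        errors.append(idw_predict(x, y, training) - observed)

    mean_error = sum(errors) / len(errors)
    rmse = (sum(e * e for e in errors) / len(errors)) ** 0.5
    print(f"mean error: {mean_error:.3f}, RMSE: {rmse:.3f}")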

Relevance: 30.00%

Abstract:

This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical, self-contained model of concurrency, which enables a simplified second model for implementing the compilation process. A set of principles is also presented that, if followed, maximises the potential level of parallelism.

Model of Concurrency. The concurrency model is designed to be a straightforward target onto which sequential programs can be mapped, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronization of objects. Further, the model is sufficiently complete that a compiler can be, and has been, practically built.

Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute-grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase.

Programming Principles. The principles presented are based upon information hiding, sharing and containment of objects, and the division of methods on the basis of a command/query split. When followed, the level of potential parallelism within the presented concurrency model is maximised. Further, these principles arise naturally from good programming practice.

Summary. In summary, this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: no parallel primitives are added, and the parallel program is modelled to execute with semantics equivalent to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
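
As a language-agnostic illustration of the command/query division mentioned above, the short Python sketch below separates state-changing commands from side-effect-free queries; it is an assumption-laden analogy, not code from the thesis, which works with a subset of Eiffel.

    # Queries return values without changing state; commands change state and
    # return nothing. Keeping them separate is what allows queries on an object
    # to be overlapped safely while commands are serialised.

    class Account:
        def __init__(self, balance):
            self._balance = balance

        # Query: no side effects, so concurrent callers need no exclusive access.
        def balance(self):
            return self._balance

        # Command: mutates state and returns nothing; must be serialised.
        def deposit(self, amount):
            self._balance += amount

    acct = Account(100)
    acct.deposit(50)        # command
    print(acct.balance())   # query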

Relevance: 30.00%

Abstract:

Jackson System Development (JSD) is an operational software development method which addresses most of the software lifecycle, either directly or by providing a framework into which more specialised techniques can fit. The method has two major phases: first, an abstract specification is derived that is in principle executable; second, the specification is implemented using a variety of transformations. The object-oriented paradigm is based on data abstraction and encapsulation coupled to an inheritance architecture that is able to support software reuse. Its claims of improved programmer productivity and easier program maintenance make it an important technology to be considered for building complex software systems. The mapping of JSD specifications into procedural languages typified by Cobol, Ada, etc. involves techniques such as inversion and state vector separation to produce executable systems of acceptable performance. However, at present, no strategy exists to map JSD specifications into object-oriented languages. The aim of this research is to investigate the relationship between JSD and the object-oriented paradigm, and to identify and implement transformations capable of mapping JSD specifications into an object-oriented language typified by Smalltalk-80. The transformational strategy followed is one whereby the concurrency of a specification is removed. Two approaches to implementing inversion - an architectural transformation that generates a simulated coroutine mechanism - are described in detail. The first approach realises inversion directly by manipulating Smalltalk-80 system contexts. This is possible in Smalltalk-80 because contexts are first-class objects and are accessible to the user like any other system object. However, problems associated with this approach are expounded. The second approach realises coroutine-like behaviour in a structure called a 'followmap'. A followmap is the result of a transformation on a JSD process in which a collection of followsets is generated. Each followset represents all possible state transitions a process can undergo from its current state. Followsets, together with exploitation of the class/instance mechanism for implementing state vector separation, form the basis for mapping JSD specifications into Smalltalk-80. A tool, itself built in Smalltalk-80, supports these derived transformations and enables a user to generate Smalltalk-80 prototypes of JSD specifications.
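
The followmap idea can be pictured with a small Python sketch, assuming an invented three-state process and invented event names; the thesis's actual followmaps are Smalltalk-80 structures generated from JSD processes, not hand-written tables like this.

    # For each state, record the followset of allowed transitions, and use that
    # table to advance the process one step at a time (coroutine-like behaviour
    # without language-level coroutines).

    followmap = {
        "idle":    {"start": "reading"},
        "reading": {"record": "reading", "eof": "closing"},
        "closing": {"done": "idle"},
    }

    class Process:
        def __init__(self, followmap, state="idle"):
            self.followmap = followmap
            self.state = state

        def step(self, event):
            """Advance one transition if the event is in the current followset."""
            followset = self.followmap[self.state]
            if event not in followset:
                raise ValueError(f"{event!r} not allowed in state {self.state!r}")
            self.state = followset[event]
            return self.state

    p = Process(followmap)
    for event in ["start", "record", "record", "eof", "done"]:
        print(event, "->", p.step(event))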

Relevance: 30.00%

Abstract:

This thesis has been concerned with obtaining evidence to explore the proposition that the provision of occupational health services, as arranged at the present time, represents a misallocation of resources. The research has been undertaken within the occupational health service of a large Midlands food factory. As the research progressed, it became evident that questions were being raised about the nature and scope of occupational health, as well as the contribution that occupational health services can make to the health and safety team in combating danger at work. These questions have been scrutinized in depth, as they are clearly important, and a resolution of the problem of defining occupational health has been proposed. I have taken the approach of attempting to identify specific objectives or benefits of occupational health activities, so that it is possible to assess how far these objectives are being achieved. I have looked at three aspects of occupational health - audiometry, physiotherapy and pre-employment medical examinations - as these activities embody crucial concepts which are common to all activities in an occupational health programme. A three-category classification of occupational health activities is proposed, such that the three activities provide examples within each category. These categories are called personnel therapy, personnel input screening and personnel throughput screening. I conclude that I have not shown audiometry to be cost-effective. My observations of the physiotherapy service lead me to support the suggestion that physiotherapy in industry reduces sickness absence rates. With pre-employment medical examinations, I have shown that the service is product-safety oriented and that benefits are extremely difficult to identify. In regard to the three services studied, in the one factory investigated, and because of the immeasurability of certain activities, I find support for the proposition that the mix of occupational health services as provided at the present time represents a misallocation of resources.

Relevance: 30.00%

Abstract:

Purpose – The main purpose of this paper is to analyze knowledge management in service networks. It analyzes the knowledge management process and identifies related challenges. The authors take a strategic management approach instead of a more technology-oriented approach, since it is believed that managerial problems still remain after technological problems are solved.

Design/methodology/approach – The paper explores the literature on knowledge management as well as the resource-based (or knowledge-based) view of the firm. It offers conceptual insights and provides possible solutions for knowledge management problems.

Findings – The paper discusses several possible solutions for managing knowledge processes in knowledge-intensive service networks. Solutions for knowledge identification/generation, knowledge application, knowledge combination/transfer and supporting the evolution of tacit network knowledge include personal and technological aspects, as well as organizational and cultural elements.

Practical implications – In a complex environment, knowledge management and network management become crucial for business success. It is the task of network management to establish routines, and to build and regularly refresh meta-knowledge about the competencies and abilities that exist within the network. It is suggested that each network partner be rated according to its contribution to the network knowledge base. Based on this rating, a network partner is a member of a certain knowledge club, meaning that the partner has access to a particular level of network knowledge. Such an established routine provides strong incentives to add knowledge to the network's knowledge base.

Originality/value – This paper is a first attempt to outline the problems of knowledge management in knowledge-intensive service networks and, by so doing, to introduce strategic management reasoning to the discussion.

Relevance: 30.00%

Abstract:

The number of interoperable research infrastructures has increased significantly with the growing awareness of the efforts made by the Global Earth Observation System of Systems (GEOSS). One of the Societal Benefit Areas (SBA) benefiting most from GEOSS is biodiversity, given the costs of monitoring the environment and managing complex information, from space observations to species records including their genetic characteristics. But GEOSS goes beyond simple data sharing to encourage the publishing and combination of models, an approach which can ease the handling of complex multi-disciplinary questions. The purpose of this paper is to illustrate these concepts by presenting eHabitat, a basic Web Processing Service (WPS) for computing the likelihood of finding ecosystems with properties equal to those specified by a user. When chained with other services providing data on climate change, eHabitat can be used for ecological forecasting and becomes a useful tool for decision-makers assessing different strategies when selecting new areas to protect. eHabitat can use virtually any kind of thematic data considered useful for defining ecosystems and their future persistence under different climatic or development scenarios. The paper presents the architecture and illustrates the concepts through case studies which forecast the impact of climate change on protected areas or on the ecological niche of an African bird.
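
The core computation can be imagined along the lines of the toy Python sketch below, which scores candidate cells by how close their thematic variables are to a reference ecosystem in standardised units; the variable names, spreads and scoring rule are invented and do not reflect eHabitat's actual algorithm or interface.

    # Score candidate cells against a reference ecosystem using a squared
    # standardised distance, converted to a crude 0..1 similarity.

    reference = {"rainfall_mm": 1200, "elevation_m": 800, "tree_cover_pct": 65}
    spread    = {"rainfall_mm": 300,  "elevation_m": 250, "tree_cover_pct": 15}

    candidates = {
        "cell_A": {"rainfall_mm": 1150, "elevation_m": 760, "tree_cover_pct": 60},
        "cell_B": {"rainfall_mm": 600,  "elevation_m": 150, "tree_cover_pct": 20},
    }

    def similarity(cell):
        # Squared standardised distance across all thematic variables.
        d2 = sum(((cell[k] - reference[k]) / spread[k]) ** 2 for k in reference)
        return 1.0 / (1.0 + d2)   # 1.0 = identical; tends to 0 as properties diverge

    for name, cell in candidates.items():
        print(name, round(similarity(cell), 3))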

Relevance: 30.00%

Abstract:

This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architecture: pull architectures and push architectures. Little data exists on the performance of CDC architectures in a real-time environment, yet such data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field. We create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
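
As a purely illustrative Python sketch (not the thesis's DAS or TAAR implementation), the snippet below shows the push-CDC shape and the capture-latency measurement described above: the data access layer applies the write and publishes the change in the same request, and latency is taken as capture time minus commit time.

    import time

    change_log = []   # stands in for the change data consumer / warehouse feed

    def execute_transaction(database, key, value):
        """Apply an OLTP write and push the change as part of the same request."""
        committed_at = time.monotonic()
        database[key] = value                      # the OLTP write
        captured_at = time.monotonic()
        change_log.append({
            "key": key,
            "value": value,
            "capture_latency_s": captured_at - committed_at,
        })

    db = {}
    execute_transaction(db, "order:42", {"status": "shipped"})
    print(change_log[0])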

Relevance: 30.00%

Abstract:

Constructing and executing distributed systems that can adapt to their operating context in order to sustain the services they provide, and the qualities of those services, are complex tasks. Managing the adaptation of multiple, interacting services is particularly difficult, since these services tend to be distributed across the system, interdependent and sometimes tangled with other services. Furthermore, the exponential growth in the number of potential system configurations, derived from the variabilities of each service, needs to be handled. Current practices of writing low-level reconfiguration scripts as part of the system code to handle run-time adaptation are both error-prone and time-consuming, and make adaptive systems difficult to validate and evolve. In this paper, we propose to combine model-driven and aspect-oriented techniques to better cope with the complexities of constructing and executing adaptive systems, and to handle the exponential growth in the number of possible configurations. Combining these techniques allows us to use high-level domain abstractions, simplify the representation of variants and limit the combinatorial explosion of possible configurations. In our approach we also use models at runtime to generate the adaptation logic, by comparing the current configuration of the system to a composed model representing the configuration we want to reach. © 2008 Springer-Verlag Berlin Heidelberg.
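
A minimal sketch of that comparison step, assuming invented component names and an invented action vocabulary (plain Python rather than the model-driven and aspect-oriented tooling the paper uses):

    # Compare the model of the current configuration with the target model and
    # derive reconfiguration actions, instead of hand-writing low-level scripts.

    current_model = {"logging": "basic", "cache": "in_memory", "transport": "http"}
    target_model  = {"logging": "verbose", "cache": "in_memory", "transport": "https", "metrics": "enabled"}

    def derive_adaptation(current, target):
        """Generate a list of adaptation actions from the model diff."""
        actions = []
        for component in current.keys() - target.keys():
            actions.append(("remove", component))
        for component, variant in target.items():
            if component not in current:
                actions.append(("add", component, variant))
            elif current[component] != variant:
                actions.append(("reconfigure", component, variant))
        return actions

    for action in derive_adaptation(current_model, target_model):
        print(action)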

Relevance: 30.00%

Abstract:

Background - Problems of quality and safety persist in health systems worldwide. We conducted a large research programme to examine culture and behaviour in the English National Health Service (NHS).

Methods - Mixed-methods study involving the collection and triangulation of data from multiple sources, including interviews, surveys, ethnographic case studies, board minutes and publicly available datasets. We narratively synthesised data across the studies to produce a holistic picture, and in this paper present a high-level summary.

Results - We found an almost universal desire to provide the best quality of care. We identified many 'bright spots' of excellent caring and practice and high-quality innovation across the NHS, but also considerable inconsistency. Consistent achievement of high-quality care was challenged by unclear goals, overlapping priorities that distracted attention, and compliance-oriented, bureaucratised management. The institutional and regulatory environment was populated by multiple external bodies serving different but overlapping functions. Some organisations found it difficult to obtain valid insights into the quality of the care they provided. Poor organisational and information systems sometimes left staff struggling to deliver care effectively and disempowered them from initiating improvement. Good staff support and management were also highly variable, though they were fundamental to culture and directly related to patient experience, safety and quality of care.

Conclusions - Our results highlight the importance of clear, challenging goals for high-quality care. Organisations need to put the patient at the centre of all they do, get smart intelligence, focus on improving organisational systems, and nurture caring cultures by ensuring that staff feel valued, respected, engaged and supported.