978 results for process architecture
Abstract:
The time to process each of the W/B processing blocks of a median calculation method on a set of N W-bit integers is improved here by a factor of three compared to the literature. Parallelism uncovered in blocks containing B-bit slices is exploited by independent accumulative parallel counters, so that the median is calculated faster than by any previously known method for any values of N and W. The improvements to the method are discussed in the context of calculating the median of a moving set of N integers, for which a pipelined architecture is developed. An additional benefit of smaller area for the architecture is also reported.
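For illustration only, the sketch below shows the general bit-slice counting idea behind such median methods in software: the median of N W-bit integers is built one bit at a time, from the most significant slice down, by counting how many candidates carry a 1 in the current slice. It is a minimal sketch with B = 1 and sequential counting, not the paper's W/B-block parallel-counter architecture.

```python
# Minimal software sketch of median selection by bit-slice counting.
# The hardware method in the abstract processes W/B blocks of B-bit slices
# with parallel counters; here B = 1 and the counting is sequential.

def bit_slice_median(values, width):
    """Return the lower median of `values` (unsigned, `width`-bit integers)."""
    rank = (len(values) - 1) // 2            # index of the lower median
    candidates = list(values)                # values still matching the bit prefix
    median = 0
    for bit in range(width - 1, -1, -1):     # scan slices from MSB to LSB
        mask = 1 << bit
        ones = [v for v in candidates if v & mask]
        zeros = [v for v in candidates if not v & mask]
        if rank < len(zeros):                # median lies in the 0-bit group
            candidates = zeros
        else:                                # median lies in the 1-bit group
            median |= mask
            rank -= len(zeros)
            candidates = ones
    return median

assert bit_slice_median([5, 1, 9, 3, 7], width=4) == 5
```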
Abstract:
Changes in users' requirements drive the evolution of an information system. Consequently, such evolution affects the atomic services that provide functional operations, moving their composition from one state to another. A challenging issue associated with such evolution of the state of a service composition is to ensure that the resulting composition remains rational. This paper presents the Service Composition Atomic-Operation Set (SCAOS) method. SCAOS defines 2 classes of atomic operations and 13 kinds of basic service compositions to support the state change process using Workflow Nets. The workflow net has the algorithmic capability to compose the required services rationally and to keep any change that moves the services into a different composition rational as well. This method can improve the adaptability of information systems to ever-changing business requirements in a dynamic environment.
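As a loose illustration of the idea only (not the SCAOS method itself, whose operations are defined over Workflow Nets), the sketch below models a composition as a small successor graph, applies hypothetical atomic add/remove operations, and re-checks a simple notion of rationality: every service must remain reachable from the start and able to reach the end of the composition.

```python
# Illustrative sketch, not the SCAOS algorithm: atomic operations on a
# service composition graph plus a simple rationality check.

class Composition:
    def __init__(self):
        self.edges = {"start": set(), "end": set()}   # service -> successors

    def add_service(self, name, before, after):
        """Atomic operation: insert `name` between `before` and `after`."""
        self.edges.setdefault(name, set())
        self.edges[before].add(name)
        self.edges[name].add(after)

    def remove_service(self, name):
        """Atomic operation: drop `name`, reconnecting its neighbours."""
        succs = self.edges.pop(name, set())
        for out in self.edges.values():
            if name in out:
                out.discard(name)
                out.update(succs)

    def is_rational(self):
        """Every node reachable from 'start' and able to reach 'end'."""
        def reach(src, graph):
            seen, stack = set(), [src]
            while stack:
                n = stack.pop()
                if n not in seen:
                    seen.add(n)
                    stack.extend(graph.get(n, ()))
            return seen
        forward = reach("start", self.edges)
        reverse = {}
        for n, out in self.edges.items():
            for m in out:
                reverse.setdefault(m, set()).add(n)
        backward = reach("end", reverse)
        return all(n in forward and n in backward for n in self.edges)

c = Composition()
c.add_service("invoice", "start", "end")
c.add_service("payment", "invoice", "end")
assert c.is_rational()
c.remove_service("invoice")
assert c.is_rational()
```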
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on each core's location within the system. Heterogeneity further increases with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend for shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, with interpolation between results as necessary.
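The sketch below, using entirely hypothetical benchmark numbers, illustrates the kind of benchmark-driven model described: compute cost per local grid size and halo-exchange cost per message size are measured separately, then interpolated to predict the timestep cost of a chosen domain decomposition.

```python
# Sketch of a benchmark-driven performance model (hypothetical numbers):
# interpolate measured compute and halo-exchange costs to predict the
# per-timestep cost of a given domain decomposition.
import numpy as np

# Hypothetical benchmark results: local grid cells -> seconds per timestep.
compute_cells = np.array([64**2, 128**2, 256**2, 512**2])
compute_time  = np.array([0.4e-3, 1.7e-3, 7.1e-3, 29.0e-3])

# Hypothetical halo-exchange benchmark: message size (bytes) -> seconds.
halo_bytes = np.array([1e3, 1e4, 1e5, 1e6])
halo_time  = np.array([8e-6, 2.2e-5, 1.4e-4, 1.2e-3])

def predict_step_time(global_nx, global_ny, px, py, word_bytes=8):
    """Predict one timestep for a px-by-py decomposition of a global grid."""
    local_cells = (global_nx / px) * (global_ny / py)
    t_compute = np.interp(local_cells, compute_cells, compute_time)
    # Two halo exchanges (x and y), each one row/column of doubles.
    msg = max(global_nx / px, global_ny / py) * word_bytes
    t_halo = 2 * np.interp(msg, halo_bytes, halo_time)
    return t_compute + t_halo

print(predict_step_time(1024, 1024, px=4, py=4))
```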
Abstract:
During the last few years Enterprise Architecture (EA) has received increasing attention in industry and academia. By adopting EA, organisations may gain a number of benefits such as better decision making, increased revenues, cost reduction, and alignment of business and IT. However, EA adoption has been found to be difficult. In this paper a model to explain resistance during the EA adoption process (REAP) is introduced and validated. The model reveals relationships between the strategic level of EA, the resulting organisational changes, and sources of resistance. By utilising the REAP model, organisations may anticipate and prepare for organisational change resistance during EA adoption.
Abstract:
The resource-based view of strategy suggests that competitiveness in part derives from a firm's ability to collaborate with a subset of its supply network to co-create highly valued products and services. This relational capability relies on a foundational intra- and inter-organisational architecture, the manifestation of strategic, people, and process decisions facilitating the interface between the firm and its strategic suppliers. Using covariance-based structural equation modelling, we examine the relationships between internal and external features of relational architecture and their relationship with relational capability and relational quality. This is undertaken on data collected by a mail survey. We find significant relationships between both internal and external relational architecture and relational capability, and between relational capability and relational quality. Novel constructs for internal and external elements of relational architecture are specified to demonstrate their positive influence on relational capability and relationship quality.
Abstract:
Process scheduling techniques consider the current load situation to allocate computing resources. Those techniques make approximations, such as averaging communication, processing, and memory access, to improve process scheduling, although processes may present different behaviors during their execution: they may start with high communication requirements and later shift to mostly processing. By discovering how processes behave over time, we believe it is possible to improve the resource allocation. This has motivated this paper, which adopts chaos theory concepts and nonlinear prediction techniques in order to model and predict process behavior. Results confirm that the radial basis function technique provides good predictions with low processing demands, which is essential in a real distributed environment.
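A minimal sketch of this style of nonlinear prediction is given below, on a synthetic chaotic series standing in for measured process load: the series is delay-embedded and fitted with a Gaussian radial basis function model for one-step-ahead prediction. The data and parameters are illustrative, not those of the paper.

```python
# Sketch: delay embedding + Gaussian RBF regression for one-step-ahead
# prediction of a (synthetic) chaotic load series.
import numpy as np

def delay_embed(series, dim, tau=1):
    """Build the delay-coordinate matrix X and one-step-ahead targets y."""
    rows = len(series) - dim * tau
    X = np.array([series[i:i + dim * tau:tau] for i in range(rows)])
    y = series[dim * tau:]
    return X, y

def rbf_fit_predict(X, y, X_new, gamma=10.0, ridge=1e-8):
    """Gaussian RBF regression using the training points as centres."""
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    K = kernel(X, X) + ridge * np.eye(len(X))
    w = np.linalg.solve(K, y)
    return kernel(X_new, X) @ w

# Synthetic chaotic series (logistic map) standing in for measured load.
s = [0.4]
for _ in range(300):
    s.append(3.9 * s[-1] * (1 - s[-1]))
s = np.array(s)

X, y = delay_embed(s[:250], dim=3)
X_test, y_test = delay_embed(s[250:], dim=3)
pred = rbf_fit_predict(X, y, X_test)
print("mean one-step prediction error:", np.abs(pred - y_test).mean())
```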
Abstract:
The discovery of an alternative route to convert poly(xylylidene tetrahydrothiophenium chloride) (PTHT) into poly(p-phenylene vinylene) (PPV) using dodecylbenzenesulfonate (DBS) has allowed the formation of ultrathin films with unprecedented control of architecture and emission properties. In this work, we show that this route may be performed with several sulfonated compounds where RSO(3)(-) replaces the counter-ion (Cl(-)) of PTHT, some of which are even more efficient than DBS. Spin-coated films were produced from PTHT and azo-dye molecules, an azo-polymer and organic salts as counter-ions of PTHT. The effects of the thermal annealing step of PTHT/RSO(3)(-) films at 110 and 230 degrees C were monitored by measuring the absorption and emission spectra. The results indicate that the exchange of the counter-ion Cl(-) of PTHT by a linear long chain with an RSO(3)(-) group is a general procedure to obtain PPV polymer at a lower conversion temperature (ca. 110 degrees C) with a significant increase in emission efficiency, regardless of the chemical position and the number of sulfonate groups. With the enhanced emission caused by Congo Red and Tinopal as counter-ions, it is demonstrated that the new synthetic route is entirely generic, which may allow accurate control of conversion and emission properties. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
With the rapid advancement of web technology, more and more educational resources, including software applications for teaching/learning methods, are available across the web, which enables learners to access learning materials and use various ways of learning at any time and any place. Moreover, various web-based teaching/learning approaches have been developed during the last decade to enhance the capability of both educators and learners. Particularly, researchers from both computer science and education are working together, collaboratively focusing on the development of pedagogically enabling technologies which are believed to improve the infrastructure of education systems and processes, including curriculum development models, teaching/learning methods, management of educational resources, systematic organization of communication, and dissemination of knowledge and skills required by and adapted to users. Despite this fast development, however, there are still great gaps between learning intentions, organization of supporting resources, management of educational structures, knowledge points to be learned and inter-knowledge-point relationships such as prerequisites, assessment of learning outcomes, and technical and pedagogic approaches. More concretely, the issues that have been widely addressed in the literature include a) availability and usefulness of resources, b) smooth integration of various resources and their presentation, c) learners' requirements and expected learning outcomes, d) automation of the learning process in terms of its schedule and interaction, and e) customization of the resources and agile management of the learning services for delivery, as well as necessary human interventions. Considering these problems, and bearing in mind the advanced web technology of which we should make full use, in this report we address the following two aspects of the systematic architecture of learning/teaching systems: 1) learning objects, a semantic description and organization of learning resources using web service models and methods, and 2) learning services discovery and learning goals matching for educational coordination and learning service planning.
Abstract:
Agent-oriented software engineering and software product lines are two promising software engineering techniques. Recent research work has been exploring their integration, namely multi-agent systems product lines (MAS-PLs), to promote reuse and variability management in the context of complex software systems. However, current product derivation approaches do not provide specific mechanisms to deal with MAS-PLs. This is essential because they typically encompass several concerns (e.g., trust, coordination, transaction, state persistence) that are constructed on the basis of heterogeneous technologies (e.g., object-oriented frameworks and platforms). In this paper, we propose the use of multi-level models to support the configuration knowledge specification and automatic product derivation of MAS-PLs. Our approach provides an agent-specific architecture model that uses abstractions and instantiation rules that are relevant to this application domain. In order to evaluate the feasibility and effectiveness of the proposed approach, we have implemented it as an extension of an existing product derivation tool, called GenArch. The approach has also been evaluated through the automatic instantiation of two MAS-PLs, demonstrating its potential and benefits to product derivation and configuration knowledge specification.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Architecture and relevance of several strongly adhered biofilms over a polyester imide (PEI) surface
Abstract:
A microscopic study was carried out to analyse the possible adhesion of fungi to a polyester-imide surface present on enamelled copper wires. Scanning electron microscopy revealed, in these adhered biofilms, a high quantity of pigments, hyphae and an enzymatic arsenal possibly acting on the surface of this macromolecule. Given the highly aromatic nature of this material and traces of phenolic derivatives used as solvents, which remain present in the already cross-linked polymer, some antifungal activity could be expected; however, no changes were observed in the growth of the microorganisms or in the fungal adhesion process. In addition, the enamelled wires showed a complete loss of their insulating properties. These studies aim to understand and evaluate the great potential shown by the fungi, which could in the future be exploited in biodeterioration and biodegradation processes.
Abstract:
The aim of this study was to evaluate the effects of the autogenous demineralized dentin matrix (ADDM) on the third molar socket wound healing process in humans, using the guided bone regeneration technique and a polytetrafluoroethylene barrier (PTFE). Twenty-seven dental sockets were divided into three groups: dental socket (Control), dental socket with PTFE barrier (PTFE), and dental socket with ADDM slices associated to PTFE barrier (ADDM + PTFE). The dental sockets were submitted to radiographic bone densitometry analysis and statistical analysis on the 15th, 30th, 60th and 90th days using analysis of variance (ANOVA) and Tukey's test (p ≤ 0.05). The radiographic analysis of the ADDM + PTFE group showed greater homogeneity of bone radiopacity than the Control group and the PTFE group, during all the observation times. The dentin matrix gradually disappeared from the dental socket during the course of the repair process, suggesting its resorption during the bone remodeling process. It was concluded that the radiographic bone density of the dental sockets treated with ADDM was similar to that of the surrounding normal bone on the 90th day. The ADDM was biocompatible with the bone tissue of the surgical wounds of human dental sockets. The radiographic analysis revealed that the repair process was discreetly faster in the ADDM + PTFE group than in the Control and PTFE groups, although the difference was not statistically significant. In addition, the radiographic image of the ADDM + PTFE group suggested that its bone architecture was better than that of the Control and PTFE groups.
Abstract:
Supervising and controlling the many processes involved in petroleum production is both dangerous and complex. Herein, we propose a multiagent supervisory and control system for handling continuous processes like those in the chemical and petroleum industries. In its architecture, there are agents responsible for managing data production and analysis, as well as the production equipment. Fuzzy controllers were used as control agents. The application of a fuzzy control system to manage an offshore petroleum production installation with a submarine separation process is described. © 2008 IEEE.
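The sketch below is a minimal, hypothetical example of a fuzzy control agent of this kind: a single-input Mamdani-style rule base mapping a separator level error to a valve opening. The variables, membership functions and rule consequents are assumptions made for illustration, not the controllers actually deployed.

```python
# Sketch of a single-input fuzzy control agent (hypothetical variables):
# adjust a valve opening from the liquid-level error in a separator vessel.

def tri(x, a, b, c):
    """Triangular membership function with vertices a, b, c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_valve_command(level_error):
    """Map a level error (m) to a valve opening in [0, 1]."""
    # Fuzzify: degree to which the error is negative, roughly zero, or positive.
    neg  = tri(level_error, -2.0, -1.0, 0.0)
    zero = tri(level_error, -1.0,  0.0, 1.0)
    pos  = tri(level_error,  0.0,  1.0, 2.0)
    # Rule consequents as crisp singletons: close, hold, open the valve.
    outputs = {0.1: neg, 0.5: zero, 0.9: pos}
    num = sum(v * w for v, w in outputs.items())
    den = sum(outputs.values())
    return num / den if den else 0.5   # weighted-average defuzzification

print(fuzzy_valve_command(0.4))
```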
Abstract:
The increasing use of mobile devices and wireless communication technologies has improved access to web information systems. However, the development of these systems imposes new challenges, mainly due to the heterogeneity of mobile devices, the management of context information, and the complexity of the adaptation process. In particular, these systems should be able to run on a large number of mobile device models. In this article, we describe a context-aware architecture that provides solutions to the challenges presented above for the development of education administration systems. Copyright 2009 ACM.
Abstract:
The present study introduces a multi-agent architecture designed to automate the process of data integration and intelligent data analysis. Unlike other approaches, the multi-agent architecture was itself designed using an agent-based methodology, Tropos. Based on the proposed architecture, we describe a web-based application in which the agents are responsible for analysing petroleum well drilling data to identify the possible occurrence of abnormalities. The intelligent data analysis method used was a neural network.
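As a rough illustration only (on synthetic data, with hypothetical feature names), the sketch below shows the kind of analysis such an agent could perform: a small neural network trained to flag possible abnormalities from drilling-sensor features.

```python
# Sketch of an analysis step on synthetic drilling data: a small neural
# network classifier that flags suspected abnormalities. Feature names and
# distributions are hypothetical, not the original well-drilling data.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical features: [hook load, standpipe pressure, torque], normalised.
normal   = rng.normal(loc=[0.5, 0.5, 0.5], scale=0.05, size=(200, 3))
abnormal = rng.normal(loc=[0.5, 0.8, 0.7], scale=0.10, size=(40, 3))
X = np.vstack([normal, abnormal])
y = np.array([0] * len(normal) + [1] * len(abnormal))   # 1 = abnormality

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)

# An analysis agent would apply something like this to each incoming sample.
new_sample = [[0.52, 0.82, 0.71]]
print("abnormality suspected:", bool(clf.predict(new_sample)[0]))
```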