535 results for Computer Architecture
Abstract:
The tertiary sector is an important employer and its growth is well above average. The Texo project aims to support this development by making services tradable. The composition of new or value-added services is a cornerstone of the proposed architecture; it is, however, intended to cater for build-time. At run-time, unforeseen exceptions may occur and users' requirements may change. Varying circumstances require immediate sensemaking of the situation's context and call for prompt extensions of existing services. Lightweight composition technology provided by the RoofTop project enables domain experts to create simple widget-like applications, also termed enterprise mashups, without extensive methodological skills. In this way RoofTop can assist and extend the idea of service delivery through the Texo platform and is a further step towards a next-generation Internet of Services.
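A minimal sketch of the widget-style composition idea described above, using hypothetical widget names and a simple output-to-input wiring mechanism that is not taken from the RoofTop project itself:

```python
class Widget:
    """A widget-like building block: consumes input data, produces an output."""
    def __init__(self, name, transform):
        self.name = name
        self.transform = transform
        self.listeners = []

    def wire_to(self, other):
        """Connect this widget's output to another widget's input."""
        self.listeners.append(other)

    def receive(self, data):
        result = self.transform(data)
        for listener in self.listeners:
            listener.receive(result)

# Hypothetical composition: filter incoming orders, then show them on a dashboard.
orders = Widget("order-feed", lambda data: data)
urgent = Widget("urgent-filter", lambda data: [o for o in data if o["priority"] == "high"])
dashboard = Widget("dashboard", lambda data: print("show:", data))

orders.wire_to(urgent)
urgent.wire_to(dashboard)
orders.receive([{"id": 1, "priority": "high"}, {"id": 2, "priority": "low"}])
```

The point of such lightweight composition is that the domain expert only decides which widgets to wire together; no programming or methodological skill beyond that is assumed.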
Abstract:
Enterprise Systems (ES) provide standardized, off-the-shelf support for operations and management within organizations. With the advent of ES based on a service-oriented architecture (SOA) and an increasing demand for IT-supported interorganizational collaboration, implementation projects face paradigmatically new challenges. The configuration of ES is costly and error-prone. Dependencies between business processes and business documents are rarely explicit, which fosters component proliferation instead of reuse. Configurative modeling can address the problem in two ways: first, conceptual modeling abstracts from technical details and provides more intuitive access and overview; second, configuration allows variants to be projected from master models, providing manageable variants with controlled flexibility. We aim to tackle the problem by proposing an integrated model-based framework for configuring both processes and business documents on an equal basis, as together they constitute the core business components of an ES.
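A minimal sketch of the variant-projection idea, assuming a simple annotation scheme in which each master-model element lists the configurations for which it is active; the element and configuration names are illustrative, not taken from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """A process step or business-document field in the master model."""
    name: str
    configurations: set  # configurations in which this element is active

@dataclass
class MasterModel:
    elements: list = field(default_factory=list)

    def project(self, configuration: str) -> list:
        """Project a variant: keep only the elements active for the configuration."""
        return [e for e in self.elements if configuration in e.configurations]

# One master model, two variants obtained purely by projection.
master = MasterModel([
    Element("Create purchase order", {"retail", "manufacturing"}),
    Element("Check production capacity", {"manufacturing"}),
    Element("Attach customs declaration", {"retail"}),
])
print([e.name for e in master.project("manufacturing")])
print([e.name for e in master.project("retail")])
```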
Abstract:
In recent years, enterprise architecture (EA) has captured increasing interest as a means to systematically consolidate and manage various enterprise artefacts in order to provide holistic decision support for business/IT alignment and business/IT landscape management. To provide a holistic perspective on the enterprise over time, EA frameworks need to co-evolve with changes in the enterprise and its IT. In this paper we focus on the emergence of Service-Oriented Architecture (SOA). There is a need to integrate SOA with EA to keep EA relevant and to use EA products to help drive successful SOA. This paper investigates and compares the integration of SOA elements in five widely used EA frameworks: ArchiMate, The Open Group Architecture Framework (TOGAF), the Federal Enterprise Architecture Framework (FEAF), the Department of Defense Architecture Framework (DoDAF) and the Ministry of Defence Architecture Framework (MODAF). It identifies which SOA elements are considered and their relative position in the overall structure. The results show that services and related elements are far from being well-integrated constructs in current EA frameworks and that the different EA frameworks integrate SOA elements in substantially different ways. Our results can support the academic EA and SOA communities in achieving a closer and more consistent integration of EA and SOA, and support practitioners in identifying an EA framework that provides the SOA support that matches their requirements.
Abstract:
Previously, expected satiety (ES) has been measured using software and two-dimensional pictures presented on a computer screen. In this context, ES is an excellent predictor of self-selected portions when quantified using similar images and similar software. In the present study we sought to establish the veracity of ES as a predictor of behaviours associated with real foods. Participants (N = 30) used computer software to assess their ES and ideal portion of three familiar foods. A real bowl of one food (pasta and sauce) was then presented and participants self-selected an ideal portion size. They then consumed the portion ad libitum. Additional measures of appetite, expected and actual liking, novelty, and reward were also taken. Importantly, our screen-based measures of expected satiety and ideal portion size were both significantly related to intake (p < .05). By contrast, measures of liking were relatively poor predictors (p > .05). In addition, consistent with previous studies, the majority (90%) of participants engaged in plate cleaning. Of these, 29.6% consumed more when prompted by the experimenter. Together, these findings further validate the use of screen-based measures to explore determinants of portion-size selection and energy intake in humans.
Abstract:
Crowds of non-combatants play a large and increasingly recognized role in modern military operations, and often create substantial difficulties for the combatant forces involved. However, realistic models of crowds are essentially absent from current military simulations. To address this problem we are developing a crowd simulation capable of generating crowds of non-combatant civilians that exhibit a variety of realistic individual and group behaviours at differing levels of fidelity. The crowd simulation is interoperable with existing military simulations using a standard distributed simulation architecture. Commercial game technology is utilized in the crowd simulation to model both urban terrain and the physical behaviours of the human characters that make up the crowd. The objective of this paper is to present the process involved in the design and development of a simulation that integrates commercially available game technology with current military simulations in order to generate realistic and believable crowd behaviour.
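A minimal agent-based sketch of how individual crowd members might be updated at differing levels of fidelity, with high-fidelity agents adding local collision avoidance; the behaviour model and parameters are illustrative assumptions, not the paper's actual simulation:

```python
import math

class Civilian:
    """A non-combatant crowd member with a per-agent fidelity level."""
    def __init__(self, x, y, fidelity="low"):
        self.x, self.y = x, y
        self.fidelity = fidelity

def step(agents, goal, dt=0.1, speed=1.4):
    """Advance the crowd one tick; only high-fidelity agents avoid neighbours."""
    for a in agents:
        dx, dy = goal[0] - a.x, goal[1] - a.y
        dist = math.hypot(dx, dy) or 1.0
        vx, vy = speed * dx / dist, speed * dy / dist
        if a.fidelity == "high":
            # Simple separation force: push away from agents closer than 1 m.
            for b in agents:
                if b is not a and math.hypot(b.x - a.x, b.y - a.y) < 1.0:
                    vx += a.x - b.x
                    vy += a.y - b.y
        a.x += vx * dt
        a.y += vy * dt

crowd = [Civilian(i % 5, i // 5, "high" if i < 3 else "low") for i in range(10)]
for _ in range(100):
    step(crowd, goal=(50.0, 0.0))
```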
Abstract:
Background: Optimal adherence to antiretroviral therapy (ART) is necessary for people living with HIV/AIDS (PLHIV). There have been relatively few systematic analyses of factors that promote or inhibit adherence to antiretroviral therapy among PLHIV in Asia. This study assessed ART adherence and examined factors associated with suboptimal adherence in northern Viet Nam. Methods: Data from 615 PLHIV on ART in two urban and three rural outpatient clinics were collected by medical record extraction and from patient interviews using audio computer-assisted self-interview (ACASI). Results: The prevalence of suboptimal adherence was estimated to be 24.9% via a visual analogue scale (VAS) of past-month dose-missing and 29.1% using a modified Adult AIDS Clinical Trial Group scale for on-time dose-taking in the past 4 days. Factors significantly associated with the more conservative VAS score were: depression (p < 0.001), side-effect experiences (p < 0.001), heavy alcohol use (p = 0.001), chance health locus of control (p = 0.003), low perceived quality of information from care providers (p = 0.04) and low social connectedness (p = 0.03). Illicit drug use alone was not significantly associated with suboptimal adherence, but interacted with heavy alcohol use to reduce adherence (p < 0.001). Conclusions: This is the largest survey of ART adherence yet reported from Asia and the first in a developing country to use the ACASI method in this context. The evidence strongly indicates that ART services in Viet Nam should include screening and treatment for depression, linkage with alcohol and/or drug dependence treatment, and counselling to address the belief that chance or luck determines health outcomes.
Abstract:
Scaffolds for bone tissue engineering should be designed to optimize cell migration, enhance new bone formation and give mechanical support. In the present study, we used polycaprolactone/tricalcium phosphate (PCL/TCP) scaffolds with two different fibre lay-down patterns, coated with hydroxyapatite and gelatine, as an approach to optimizing bone regeneration in a critical-sized calvarial defect. After 12 weeks, bone regeneration was quantified using micro-CT analysis, biomechanical testing and histological evaluation. Notably, the experimental groups containing coated scaffolds showed lower bone formation and lower biomechanical properties within the defect compared to the uncoated scaffolds. Surprisingly, the different lay-down patterns of the fibres resulted in different bone formation and biomechanical properties; namely, the 0/60/120° scaffolds revealed lower bone formation and biomechanical properties compared to the 0/90° scaffolds in all the experimental groups. The different architecture of the scaffold fibres may have an effect on nutrient supply as well as the attachment of the newly formed matrix to the scaffold. Therefore, future bone regeneration strategies utilising scaffolds should consider scaffold architecture as an important factor during the scaffold optimisation stages in order to move closer to a clinical application.
Abstract:
This paper treats the blast response of a pile foundation in saturated sand using explicit nonlinear finite element analysis, considering the complex material behavior of the soil and soil–pile interaction. Blast wave propagation in the soil is studied, and the horizontal deformation of the pile and the effective stresses in the pile are presented. Results indicate that the upper part of the pile is vulnerable and that the pile response decays with distance from the explosive. The findings of this research provide valuable information on the effects of underground explosions on pile foundations and will guide the future development, validation and application of computer models.
Abstract:
Simulation has been widely used to estimate the benefits of Cooperative Systems (CS) based on Inter-Vehicular Communications (IVC). This paper presents a new architecture built with the SiVIC simulator and the RTMaps™ multisensor prototyping platform. We introduce several improvements over a previous similar architecture, regarding IVC modelling and vehicle control. The architecture has been tuned with on-road measurements to improve fidelity. We discuss the results of a freeway emergency braking (EEBL, Emergency Electronic Brake Light) scenario implemented to validate our architecture's capabilities.
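A minimal sketch of the kind of IVC channel model such an architecture might include, with distance-dependent packet loss and latency; the loss and latency parameters are illustrative placeholders, not values from the SiVIC/RTMaps platform or from the paper's on-road calibration:

```python
import random

def transmit(message, distance_m, max_range_m=300.0, base_loss=0.05):
    """Crude IVC channel model: distance-dependent packet loss plus latency.

    All numeric parameters are illustrative assumptions.
    """
    if distance_m > max_range_m:
        return None  # out of radio range
    loss_probability = base_loss + 0.9 * (distance_m / max_range_m) ** 2
    if random.random() < loss_probability:
        return None  # packet dropped
    latency_s = 0.002 + random.uniform(0.0, 0.01)
    return {"payload": message, "latency_s": latency_s}

# A braking vehicle broadcasts an EEBL-style message to a follower 120 m behind.
received = transmit({"type": "EEBL", "decel_mps2": -6.0}, distance_m=120.0)
print(received)
```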
Abstract:
Cooperative Systems provide, through the multiplication of information sources over the road, considerable potential to improve the safety of road users, especially drivers. However, developing cooperative ITS applications requires additional resources compared to non-cooperative applications, which makes development both time-consuming and expensive. In this paper, we present a simulation architecture aimed at prototyping cooperative ITS applications in an accurate, detailed, close-to-reality environment; the architecture is designed to be modular and general-purpose. It can be used to simulate any type of CS application as well as augmented perception. We then discuss the results of two applications deployed with our architecture, using a common freeway emergency braking scenario. The first application is Emergency Electronic Brake Light (EEBL); we discuss improvements in safety in terms of the number and severity of crashes. The second application compares the performance of a cooperative risk assessment using an augmented map against a non-cooperative approach based on local perception only. Our results show a systematic improvement of the forward warning time for most vehicles in the string when using the augmented-map-based risk assessment.
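A minimal sketch of why an augmented, IVC-fed view can improve forward warning time compared with local perception only: a locally perceiving vehicle cannot react before the hazard enters its sensor range, whereas a broadcast braking event (as in EEBL) is available immediately. The gap, closing speed and sensor range below are hypothetical values, not the paper's experimental settings:

```python
def local_warning_time(gap_m, closing_speed_mps, sensor_range_m=80.0):
    """Warning horizon when the hazard is detected by on-board sensors only."""
    visible_gap = min(gap_m, sensor_range_m)
    return visible_gap / closing_speed_mps if closing_speed_mps > 0 else float("inf")

def cooperative_warning_time(gap_m, closing_speed_mps):
    """Warning horizon when the braking event is broadcast over IVC (EEBL-style),
    so the hazard is known regardless of line of sight or sensor range."""
    return gap_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")

gap, closing = 150.0, 15.0  # hypothetical following gap (m) and closing speed (m/s)
print(local_warning_time(gap, closing))        # limited by sensor range
print(cooperative_warning_time(gap, closing))  # earlier warning via the augmented view
```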
Abstract:
Computer games have become a commonplace but engaging activity among students. They enjoy playing computer games as they can perform larger-than-life activities virtually, such as jumping from great heights, flying planes, and racing cars; actions that are otherwise not possible in real life. Computer games also offer user interactivity, which gives them a certain appeal. Considering this appeal, educators should consider integrating computer games into student learning and encouraging students to author computer games of their own. It is thought that students can be engaged in learning by authoring and using computer games and can also gain essential skills such as collaboration, teamwork, problem solving and deductive reasoning. The research in this study revolves around building student engagement through the task of authoring computer games. The study aims to demonstrate how the creation and sharing of student-authored educational games might facilitate student engagement and how ICT (information and communication technology) plays a supportive role in student learning. Results from this study may lead to the broader integration of computer games into student learning and contribute to similar studies. In this qualitative case study, based in a state school in a low socio-economic area west of Brisbane, Australia, students who authored computer games as part of their ICT learning were selected from both junior and senior secondary classes. Senior secondary students (Year 12 ICT) were given the task of programming the games, which were to be based on Mathematics learning topics, while the junior secondary students (Year 8 ICT) were given the task of creating multimedia elements for the games. A Mathematics teacher volunteered to assist in the project and provided guidance on the inclusion of suitable Mathematics curricular content in these computer games. The student-authored computer games were then used to support another group of Year 8 Mathematics students in learning the topics of Area, Volume and Time. Data were collected through interviews, classroom observations and artefacts. The teacher researcher, acting in the role of ICT teacher, coordinated with the students and the Mathematics teacher to conduct this study. An instrumental case study was applied as the research methodology and Third Generation Activity Theory served as the theoretical framework. Data were analysed using qualitative coding procedures. Findings of this study indicate that having students author and play computer games promoted student engagement and that ICT played a supportive role in learning and allowed students to gain certain essential skills. Although this study suggests integrating computer games to support classroom learning, it cannot be presumed that computer games are an immediate solution for promoting student engagement.
Abstract:
An onboard payload may be seen in most instances as the raison d'être for a UAV. It defines the UAV's capabilities, usability and hence market value. Large and medium UAV payloads exhibit significant differences in size and computing capability when compared with small UAVs. The latter have stringent size, weight, and power requirements, typically referred to as SWaP, while the former still exhibit an endless appetite for compute capability. The tendency for this type of UAV (Global Hawk, Hunter, Fire Scout, etc.) is to increase payload density and hence processing capability. An example of this approach is the Northrop Grumman MQ-8 Fire Scout helicopter, which has a modular payload architecture that incorporates off-the-shelf components. Regardless of UAV size and capabilities, advances in the miniaturization of electronics are enabling the replacement of multiprocessing, power-hungry general-purpose processors with more integrated and compact electronics (e.g., FPGAs). Payloads play a significant role in the quality of ISR (intelligence, surveillance, and reconnaissance) data, and also in how quickly that information can be delivered to the end user. At a high level, payloads are important enablers of greater mission autonomy, which is the ultimate aim for every UAV. This section describes common payload sensors and introduces two example cases in which onboard payloads were used to solve real-world problems. A collision avoidance payload based on electro-optical (EO) sensors is first introduced, followed by a remote sensing application for power line inspection and vegetation management.
Abstract:
This article describes the architecture of a monitoring component for the YAWL system. The proposed architecture is based on sensors and is realized as a YAWL service so that it integrates seamlessly with the YAWL system. The architecture is generic and applicable in different business process monitoring contexts. Finally, it was tested and evaluated in the context of risk monitoring for business processes.
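A minimal, hypothetical sketch of the sensor idea: each sensor evaluates a condition over process events and raises an alert for the monitoring component. The event fields and sensor names are illustrative assumptions, not the actual YAWL service interface:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Sensor:
    """Watches a stream of process events and flags risk conditions."""
    name: str
    condition: Callable[[dict], bool]

    def check(self, event: dict):
        if self.condition(event):
            print(f"[{self.name}] alert for case {event['case_id']}")

# Illustrative sensor: flag work items that exceed an allowed duration.
overdue = Sensor("overdue-task", lambda e: e["duration_min"] > 60)
for event in [{"case_id": "42", "duration_min": 75},
              {"case_id": "43", "duration_min": 12}]:
    overdue.check(event)
```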
Abstract:
Deterministic computer simulation of physical experiments is now a common technique in science and engineering. Physical experiments are often too time-consuming, expensive or impossible to conduct, so complex computer models or codes are run in their place; this has led to the study of computer experiments, which are used to investigate many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This paper discusses some practical issues that arise when designing a computer simulation and/or experiment for manufacturing systems. A case study approach is reviewed and presented.
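A minimal sketch of the core ingredient of such a computer experiment: a space-filling design whose rows are the input settings at which the simulation code would be run, here generated as a Latin hypercube with scipy. The two factors, their ranges, and the run count are illustrative assumptions about a manufacturing simulation, not inputs from the paper's case study:

```python
import numpy as np
from scipy.stats import qmc

# Two illustrative manufacturing-simulation inputs: buffer size and machine rate.
bounds_low, bounds_high = [1.0, 0.5], [20.0, 2.0]

sampler = qmc.LatinHypercube(d=2, seed=0)
design = qmc.scale(sampler.random(n=10), bounds_low, bounds_high)

for run, (buffer_size, machine_rate) in enumerate(design, start=1):
    # Each row is one run of the deterministic simulation code.
    print(f"run {run}: buffer={buffer_size:.1f}, rate={machine_rate:.2f}")
```

Because the simulator is deterministic, replication adds no information; the design instead spreads the limited number of runs evenly across the input space.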