827 results for systems-based simulation
Abstract:
Models are central tools for modern scientists and decision makers, and there are many existing frameworks to support their creation, execution and composition. Many frameworks are based on proprietary interfaces and do not lend themselves to the integration of models from diverse disciplines. Web-based systems, or systems based on web services, such as Taverna and Kepler, allow composition of models based on standard web service technologies. At the same time, the Open Geospatial Consortium has been developing its own service stack, which includes the Web Processing Service, designed to facilitate the execution of geospatial processing, including complex environmental models. The current Open Geospatial Consortium service stack employs Extensible Markup Language as a default data exchange standard, and widely used encodings such as JavaScript Object Notation can often only be used when incorporated with Extensible Markup Language. Similarly, no successful engagement of the Web Processing Service standard with the well-supported technologies of Simple Object Access Protocol and Web Services Description Language has been seen. In this paper we propose a pure Simple Object Access Protocol/Web Services Description Language processing service which addresses some of the issues with the Web Processing Service specification and brings us closer to achieving a degree of interoperability between geospatial models, and thus realising the vision of a useful 'model web'.
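As a rough illustration of what a request to such a pure SOAP processing service might look like, the sketch below builds a SOAP 1.1 envelope for a hypothetical `ExecuteProcess` operation. The operation name, namespace and parameters are invented for illustration and are not part of the WPS standard or the paper's proposed service.

```python
# Minimal sketch: constructing a SOAP 1.1 request envelope for a hypothetical
# geoprocessing operation ("ExecuteProcess"); all names are illustrative.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(operation: str, params: dict, service_ns: str) -> bytes:
    """Build a SOAP Envelope/Body wrapping one operation with simple parameters."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{service_ns}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{service_ns}}}{name}")
        child.text = str(value)
    return ET.tostring(envelope, encoding="utf-8", xml_declaration=True)

xml_bytes = build_request(
    "ExecuteProcess",
    {"modelId": "hydrology-v2", "resolution": 30},   # hypothetical parameters
    "http://example.org/geoprocessing",
)
```

In a real deployment the operation and its parameter types would be declared in the service's WSDL, which is what allows generic SOAP toolkits to generate such envelopes automatically.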
Abstract:
This article considers the basic problems of client-server electronic learning systems based on mobile platforms. Questions such as the relational model of a learning course and the prediction of a student's transitions through the course items are considered. In addition, technical aspects of the implementation of the electronic learning system "E-Learning Suite" and of developing portable applications using the .NET Framework are discussed.
Abstract:
This paper describes the use of the Business Process Execution Language for Web Services (BPEL4WS/BPEL) for managing scientific workflows. This work is the result of our attempt to adopt a Service Oriented Architecture in order to perform Web-services-based simulation of metal vapor lasers. Scientific workflows can be more demanding in their requirements than business processes. In the context of addressing these requirements, the features of the BPEL4WS specification, widely regarded as the de facto standard for orchestrating Web services in business workflows, are discussed. A typical use case, the calculation of the electric field potential and intensity distributions, is discussed as an example of building a BPEL process that performs a distributed simulation composed of loosely coupled services.
Abstract:
* This paper was prepared under Program No. 14 of fundamental scientific research of the Presidium of the Russian Academy of Sciences, project "Intellectual Systems Based on Multilevel Domain Models".
Abstract:
The possibility of automatic graphical user interface construction is described. It is based on building the user interface as a reflection of the logical definition of the data domain. The presented approach to developing information system user interfaces enables dynamic adaptation of a system during its operation. The approach is used for the creation of information systems based on the CASE-system METAS.
Abstract:
Recommender systems are now widely used in e-commerce applications to assist customers in finding relevant products among the many that are frequently available. Collaborative filtering (CF) is a key component of many of these systems, in which recommendations are made to a user based on the opinions of similar users in the system. This paper presents a model-based approach to CF using supervised ARTMAP neural networks (NN). The approach deploys the formation of reference vectors, which enables a CF recommendation system to classify user profile patterns into classes of similar profiles. Empirical results show that the proposed approach performs better than similar CF systems based on unsupervised ART2 NN or a neighbourhood-based algorithm.
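For context, the neighbourhood-based baseline that such model-based approaches are compared against can be sketched as follows: a user's unknown rating is predicted as a similarity-weighted average of other users' ratings for the item. The data, function names and the use of cosine similarity here are illustrative assumptions, not details taken from the paper.

```python
# Sketch of a neighbourhood-based CF baseline; ratings data is invented.
from math import sqrt

def cosine_sim(a, b):
    """Cosine similarity over the items both users have rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[i] * b[i] for i in common)
    den = sqrt(sum(a[i] ** 2 for i in common)) * sqrt(sum(b[i] ** 2 for i in common))
    return num / den if den else 0.0

def predict(ratings, user, item):
    """Similarity-weighted average of the other users' ratings for `item`."""
    num = den = 0.0
    for other, profile in ratings.items():
        if other == user or item not in profile:
            continue
        s = cosine_sim(ratings[user], profile)
        num += s * profile[item]
        den += abs(s)
    return num / den if den else None

ratings = {
    "u1": {"a": 5, "b": 3},
    "u2": {"a": 4, "b": 3, "c": 4},
    "u3": {"a": 1, "c": 2},
}
pred = predict(ratings, "u1", "c")   # prediction for an unrated item
```

A model-based method such as the paper's ARTMAP classifier replaces this on-the-fly neighbour search with classes of similar profiles learned offline.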
Abstract:
Last mile relief distribution is the final stage of humanitarian logistics. It refers to the supply of relief items from local distribution centers to the disaster-affected people (Balcik et al., 2008). In the last mile relief distribution literature, researchers have focused on the use of optimisation techniques for determining the exact optimal solution (Liberatore et al., 2014), but behavioural factors need to be included alongside those optimisation techniques in order to obtain better predictive results. This paper explains how improving the coordination factor increases the effectiveness of the last mile relief distribution process. A two-stage methodology was used to achieve this goal. Interviews: the authors conducted interviews with the Indian Government and with South Asian NGOs to identify the critical factors for final relief distribution; thematic and content analysis of the interviews and the reports revealed several behavioural factors that affect final relief distribution. Model building: last mile relief distribution in India follows a specific framework described in the Indian Government's disaster management handbook; we modelled this framework using agent-based simulation and investigated the impact of coordination on effectiveness, which we define as the speed and accuracy with which aid is delivered to affected people. We tested through simulation modelling whether coordination improves effectiveness.
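A toy sketch of the kind of agent-based comparison described above, under heavy simplification: uncoordinated agents pick delivery targets at random (risking duplicated deliveries), while coordinated agents each pick a still-unserved target. All names and numbers are invented; this is not the authors' model of the Indian framework.

```python
# Toy agent-based comparison of coordinated vs uncoordinated relief delivery.
import random

def coverage(n_agents, n_villages, coordinated, seed=0):
    """Fraction of villages served after one delivery round."""
    rng = random.Random(seed)          # seeded for reproducibility
    served = set()
    for _ in range(n_agents):
        if coordinated:
            # coordinated agents avoid villages already served
            unserved = [v for v in range(n_villages) if v not in served]
            choice = unserved[0] if unserved else rng.randrange(n_villages)
        else:
            # uncoordinated agents choose independently at random
            choice = rng.randrange(n_villages)
        served.add(choice)
    return len(served) / n_villages

uncoord = coverage(8, 10, coordinated=False)
coord = coverage(8, 10, coordinated=True)
```

Even in this trivial setting, coordination removes duplicated effort, so coordinated coverage is never worse than the uncoordinated run; the paper's simulation measures the analogous effect on speed and accuracy of delivery.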
Abstract:
Science and art are considered as two distinct areas in the spectrum of human activities. Many scientists are inspired by art and many artists embed science in their work. This paper presents a one-year experiment, which started with benchmark tests of a compiler, passed through dynamic systems based on complex numbers and ended as a scientific art exhibition. The paper demonstrates that it is possible to blend science and art in a mutually beneficial way. It also shows how science can inspire the creation of artistic works, as well as how these works can inspire further scientific research.
Abstract:
Acute life-threatening events such as cardiac or respiratory arrest are often predictable in adults and children. However, critical events such as unplanned extubations are considered unpredictable. This paper evaluates the ability of automated prediction systems based on feature-space embedding and time-series methods to predict unplanned extubations in paediatric intensive care patients. We exploit trends in physiological signals such as heart rate, respiratory rate, systolic blood pressure and blood oxygen saturation, using a frame-based signal-processing approach that expands the signals over a nonorthogonal basis derived from the data. We investigate the significance of these trends in a computerised prediction system and compare the results with clinical observations of predictability. We conclude by investigating whether the system's prediction capability could be exploited to prevent future unplanned extubations. © 2014 IEEE.
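One simple way to capture such trends, sketched here as an assumption rather than the paper's frame-based expansion over a nonorthogonal basis, is a sliding-window least-squares slope computed over a physiological signal:

```python
# Illustrative trend feature: least-squares slope of each sliding window
# over a physiological signal (example values invented).
def window_slopes(signal, width):
    """Least-squares slope of every window of `width` consecutive samples."""
    xs = range(width)
    x_mean = (width - 1) / 2
    denom = sum((x - x_mean) ** 2 for x in xs)
    slopes = []
    for start in range(len(signal) - width + 1):
        window = signal[start:start + width]
        y_mean = sum(window) / width
        num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, window))
        slopes.append(num / denom)
    return slopes

heart_rate = [80, 82, 85, 90, 96, 103]      # a steadily rising trend
slopes = window_slopes(heart_rate, 3)
```

A run of strongly positive (or negative) slopes across several vital signs is the kind of trend signature a prediction system could flag ahead of a critical event.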
Abstract:
In this paper, we present an innovative topic segmentation system based on a new informative similarity measure that takes word co-occurrence into account in order to avoid dependence on existing linguistic resources, such as electronic dictionaries or lexico-semantic databases (thesauri or ontologies). Topic segmentation is the task of breaking documents into topically coherent multi-paragraph subparts, and it has been used extensively in information retrieval and text summarization. In particular, our architecture proposes a language-independent topic segmentation system that addresses three main problems evidenced by previous research: systems based solely on lexical repetition, which show reliability problems; systems based on lexical cohesion using existing linguistic resources, which are usually available only for dominant languages and consequently do not apply to less favoured languages; and systems that require previously harvested training data. For this purpose, we use only statistics on words and word sequences computed over a set of texts. This provides a flexible solution that may narrow the gap between dominant and less favoured languages, thus allowing equivalent access to information.
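The word-statistics idea can be sketched in the spirit of TextTiling-style segmentation (an assumption for illustration; the paper's informative similarity measure differs): score each candidate boundary by the cosine similarity of the word-count vectors on either side, and treat low-similarity gaps as topic shifts.

```python
# TextTiling-style boundary scoring sketch; sentences are invented examples.
from collections import Counter
from math import sqrt

def cosine(c1, c2):
    """Cosine similarity between two word-count vectors."""
    num = sum(c1[w] * c2[w] for w in c1)
    den = sqrt(sum(v * v for v in c1.values())) * sqrt(sum(v * v for v in c2.values()))
    return num / den if den else 0.0

def boundary_scores(sentences, window=2):
    """Lexical similarity across each gap, using `window` sentences per side."""
    scores = []
    for gap in range(1, len(sentences)):
        left = Counter(w for s in sentences[max(0, gap - window):gap]
                       for w in s.lower().split())
        right = Counter(w for s in sentences[gap:gap + window]
                        for w in s.lower().split())
        scores.append(cosine(left, right))
    return scores

sents = [
    "the model runs on the server",
    "the server executes the model",
    "cats and dogs are pets",
    "pets such as cats need care",
]
scores = boundary_scores(sents)   # lowest score marks the topic shift
```

Because the scoring uses only word counts from the texts themselves, the same code applies unchanged to any language, which is the property the abstract emphasises.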
Abstract:
Purpose - Enterprise resource planning (ERP) systems are limited due to their operation around a fixed design production process and a fixed lead time to production plan and purchasing plan. The purpose of this paper is to define the concept of informality and to describe the notion of a system combining informality and ERP systems, based on empirical research from four manufacturing case studies. Design/methodology/approach - The case studies present a range of applications of ERP and are analysed in terms of the three characteristics of informality, namely, organisation structure, communication method and leadership approach. Findings - The findings suggest that systems consisting of informality in combination with ERP systems can elicit knowledge from frontline workers, leading to timely improvements in the system. This is achieved by allowing users to modify work procedures or production orders, and to support collaborative working among all employees. However, it was found that informality is not required for manufacturers in a relatively stable environment who can deal with uncertainty through a proactive strategy. Research limitations/implications - This study was carried out in China, with four companies as the units of analysis. Future work can help to extend this study across countries. Originality/value - Four dimensions of informality relating to manufacturers implementing ERP are defined: "technology in practice", "user flexibility", "trusted human networks" and "positive reaction to uncertainty". This is a new construct not applied before to ERP implementations.
Abstract:
This paper contributes a new methodology called Waste And Source-matter ANalyses (WASAN) which supports a group in building agreeable actions for safely minimising avoidable waste. WASAN integrates influences from the Operational Research (OR) methodologies/philosophies of Problem Structuring Methods, Systems Thinking, simulation modelling and sensitivity analysis as well as industry approaches of Waste Management Hierarchy, Hazard Operability (HAZOP) Studies and As Low As Reasonably Practicable (ALARP). The paper shows how these influences are compiled into facilitative structures that support managers in developing recommendations on how to reduce avoidable waste production. WASAN is being designed as Health and Safety Executive Guidance on what constitutes good decision making practice for the companies that manage nuclear sites. In this paper we report and reflect on its use in two soft OR/problem structuring workshops conducted on radioactive waste in the nuclear industry. Crown Copyright © 2010.
Abstract:
Compact and tunable semiconductor terahertz sources providing direct electrical control, efficient operation at room temperatures and device integration opportunities are of great interest at the present time. One of the most well-established techniques for terahertz generation utilises photoconductive antennas driven by ultrafast pulsed or dual wavelength continuous wave laser systems, though some limitations, such as confined optical wavelength pumping range and thermal breakdown, still exist. The use of quantum dot-based semiconductor materials, having unique carrier dynamics and material properties, can help to overcome limitations and enable efficient optical-to-terahertz signal conversion at room temperatures. Here we discuss the construction of novel and versatile terahertz transceiver systems based on quantum dot semiconductor devices. Configurable, energy-dependent optical and electronic characteristics of quantum-dot-based semiconductors are described, and the resonant response to optical pump wavelength is revealed. Terahertz signal generation and detection at energies that resonantly excite only the implanted quantum dots opens the potential for using compact quantum dot-based semiconductor lasers as pump sources. Proof-of-concept experiments are demonstrated here that show quantum dot-based samples to have higher optical pump damage thresholds and reduced carrier lifetime with increasing pump power.
Abstract:
The ability to use Software Defined Radio (SDR) in civilian mobile applications will make it possible for the next generation of mobile devices to handle multi-standard personal wireless devices and ubiquitous wireless devices. The original military standard created many beneficial characteristics for SDR, but resulted in a number of disadvantages as well. Many challenges in commercializing SDR are still the subject of interest in the software radio research community. Four main issues that have already been addressed are performance, size, weight, and power. This investigation presents an in-depth study of SDR inter-component communications in terms of total link delay as a function of the number of components and the packet size in systems based on the Software Communication Architecture (SCA). The study is based on the investigation of a controlled-environment platform. Results suggest that the total link delay does not increase linearly with the number of components and the packet size. A closed-form expression for the delay was modeled using a logistic function of the number of components and the packet size; the model performed well when the number of components was large. For mobile applications, energy consumption has become one of the most crucial limitations. SDR will not only provide the flexibility of multi-protocol support, but this desirable feature will also bring a choice of mobile protocols. Having such a variety of choices available creates the problem of selecting the most appropriate protocol for transmission. A real-time algorithm to optimize energy efficiency was also investigated: communication energy models, including switching estimation, were used to develop a waveform selection algorithm, and simulations were performed to validate the concept.
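The closed-form logistic model of delay can be sketched as below; the saturation level, steepness and midpoint values are placeholders, not the parameters fitted in the study.

```python
# Sketch of a logistic delay model: delay grows with the number of components
# but saturates toward a ceiling L. Parameter values are purely illustrative.
from math import exp

def logistic_delay(n_components, L=120.0, k=0.35, n0=10.0):
    """Logistic curve L / (1 + exp(-k * (n - n0))) for total link delay in ms."""
    return L / (1.0 + exp(-k * (n_components - n0)))

delays = [logistic_delay(n) for n in (1, 10, 50)]
# small systems sit on the lower tail, large ones flatten toward the ceiling
```

This shape matches the abstract's observation that delay does not increase linearly: growth is steep around the midpoint and levels off for large component counts, which is where the fitted model performed well.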
Abstract:
There are authentication models that use passwords, keys, or personal identifiers (cards, tags, etc.) to authenticate a particular user in the authentication/identification process. Other systems, however, can use biometric data, such as signature, fingerprint or voice, to authenticate an individual. On the other hand, storing biometric data brings risks, such as consistency and protection problems. It is therefore necessary to protect biometric databases to ensure the integrity and reliability of the system. For this purpose there are models for securing biometric identification, for example the Fuzzy Vault and Fuzzy Commitment schemes. These models are currently the most used for the protection of biometric data, but they have fragile elements in the protection process. Increasing the level of security of these methods, by changing their structure or by inserting new layers of protection, is therefore one of the goals of this thesis. In other words, this work proposes the simultaneous use of encryption (the Papilio encryption algorithm) together with template protection models (Fuzzy Vault and Fuzzy Commitment) in biometric identification systems. The objective is to improve two aspects of biometric systems: security and accuracy. Furthermore, a reasonable level of efficiency must be maintained for these data through the use of more elaborate classification structures, known as committees. We therefore propose a model for safer biometric identification systems.
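As background, a heavily simplified Fuzzy Commitment sketch (using a 3x repetition code in place of a realistic error-correcting code, and without the proposed Papilio encryption layer) shows how a secret is XOR-bound to a biometric template and recovered from a noisy probe:

```python
# Simplified Fuzzy Commitment: bind a secret to a biometric bit template.
import hashlib

def encode(bits):
    """Rate-1/3 repetition code: each bit repeated three times."""
    return [b for b in bits for _ in range(3)]

def decode(bits):
    """Majority vote per 3-bit group, correcting one flip per group."""
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

def commit(secret_bits, template_bits):
    """Store only the XOR helper data and a hash of the secret."""
    helper = [c ^ t for c, t in zip(encode(secret_bits), template_bits)]
    digest = hashlib.sha256(bytes(secret_bits)).hexdigest()
    return helper, digest

def open_commitment(helper, digest, probe_bits):
    """Unbind with a probe; success only if the decoded secret hashes correctly."""
    recovered = decode([h ^ p for h, p in zip(helper, probe_bits)])
    return hashlib.sha256(bytes(recovered)).hexdigest() == digest, recovered

secret = [1, 0, 1, 1]
template = [0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1]   # invented biometric bits
helper, digest = commit(secret, template)
noisy = template[:]
noisy[4] ^= 1                                     # one flipped bit in the probe
ok, recovered = open_commitment(helper, digest, noisy)
```

The stored helper data and hash reveal neither the template nor the secret on their own; the thesis's contribution is to harden exactly this kind of scheme with an additional encryption layer.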