899 results for computer modelling
Abstract:
IT has proven to be a key factor in gaining maturity in Business Process Management (BPM). This book presents a worldwide survey conducted among companies from the ‘Forbes Global 2000’ list to explore the current usage of software throughout the BPM life cycle and to identify the companies’ requirements concerning process modelling. The responses from 130 companies indicate that, at present, it is mainly software for process description and analysis that is required, while process execution is supported by general-purpose software such as databases, ERP systems and office tools. The resulting complex system landscapes give rise to distinct requirements for BPM software, while the process modelling requirements can be equally well satisfied by the most common languages (BPMN, UML, EPC).
Abstract:
Energy consumption modelling by state-based approaches often assumes a constant energy consumption value in each state. However, in certain situations the energy consumption is not constant and fluctuates, during state transitions or even within a state. This paper discusses these issues by presenting examples of such cases from wireless sensor and wireless local area networks, together with possible solutions.
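The fluctuation issue this abstract raises can be sketched as follows: instead of charging a fixed draw per state, the model integrates a draw that varies around a nominal value. The state names, currents and jitter magnitudes below are illustrative assumptions, not values from the paper.

```python
import random

# Hypothetical state table for a wireless sensor node. Each state has a
# nominal current plus a fluctuation band (all values are placeholders).
STATES = {
    "sleep":    {"nominal_mA": 0.02, "jitter_mA": 0.005},
    "idle":     {"nominal_mA": 1.5,  "jitter_mA": 0.2},
    "transmit": {"nominal_mA": 17.0, "jitter_mA": 2.0},
}

def energy_mj(state, duration_s, voltage_v=3.0, dt_s=0.01,
              rng=random.Random(0)):
    """Integrate energy over a state dwell, letting the draw fluctuate.

    A constant-consumption model would just compute V * I_nominal * t;
    here the current is resampled every dt_s to mimic fluctuation.
    """
    params = STATES[state]
    total_mj = 0.0
    t = 0.0
    while t < duration_s:
        current_ma = (params["nominal_mA"]
                      + rng.uniform(-1.0, 1.0) * params["jitter_mA"])
        total_mj += voltage_v * current_ma * dt_s  # mW * s = mJ
        t += dt_s
    return total_mj
```

Replacing `rng.uniform` with a measured trace would turn this toy model into a trace-driven estimator for the transition effects the paper describes.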
Abstract:
A modelling effort for Territorial Intelligence Community Systems (TICS) began in 2009 at the end of the CaEnti project. It has several objectives: - establish a set of documents understandable both by the computer specialists in charge of software development and by territorial intelligence specialists; - lay the foundation of a vocabulary describing the main notions of the TICS domain; - ensure the evolution and sustainability of tools and systems in a highly scalable research context. The definition of models representing the data manipulated by the tools of the Catalyse suitcase is not sufficient to describe the TICS domain completely. We established a correspondence between this computer vocabulary and the thematic vocabulary to allow communication between computer scientists and territorial intelligence specialists. Furthermore, it is necessary to describe the roles of TICS, for which other kinds of computing models are useful. In this communication we present the modelling of the TICS project with business process
Abstract:
In computer science, different types of reusable components for building software applications have been proposed as a direct consequence of the emergence of new software programming paradigms. The success of these components depends on factors such as the flexibility of their combination and the ease of their selection in centralised or distributed environments such as the internet. In this article, we propose a general type of reusable component, called a primitive of representation, inspired by a knowledge-based approach that can promote reusability. The proposal can be understood as a generalisation of existing partial solutions, applicable to both software and knowledge engineering for the development of hybrid applications that integrate conventional and knowledge-based techniques. The article presents the structure and use of the component and describes our recent experience in the development of real-world applications based on this approach.
Abstract:
Coupled device and process simulation tools, collectively known as technology computer-aided design (TCAD), have been used in the integrated circuit industry for over 30 years. These tools allow researchers to quickly converge on optimized device designs and manufacturing processes with minimal experimental expenditure. The PV industry has been slower to adopt these tools, but is quickly developing competency in using them. This paper introduces a predictive defect engineering paradigm and simulation tool, while demonstrating its effectiveness at increasing the performance and throughput of current industrial processes. The impurity-to-efficiency (I2E) simulator is a coupled process and device simulation tool that links wafer material purity, processing parameters and cell design to device performance. The tool has been validated with experimental data and used successfully with partners in industry. The simulator has also been deployed as a free web-accessible applet, available for use by the industrial and academic communities.
Abstract:
A recent application of computer simulation is its use for the human body, which resembles a mechanism complemented by torques in the joints caused by the action of muscles and tendons. Among other uses, such simulation can provide training in surgical procedures or teach how the body works. Other applications include making a biped walk upright, building robots modelled on the human body, and making prostheses or robot arms to perform specific tasks. One use of simulation is to optimise the movement of the human body by examining which muscles are activated and which should or should not be activated in order to improve a person's movements. This work presents a model of the elbow joint; by analysing the constraint equations using classical methods, we go on to model the bones, muscles and tendons, as well as the logic linking them to the force they develop for a specific movement. To do this, we analyse the reference bibliography and the software available to perform the validation.
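The kind of muscle-force reasoning described above can be illustrated with a minimal static torque balance at the elbow: a single flexor holding a load on a planar forearm. The lever-arm lengths below are hypothetical placeholders, not measured anatomy, and the model ignores the forearm's own weight and co-contraction.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def flexor_force(load_kg, forearm_angle_deg,
                 d_load_m=0.35, d_muscle_m=0.04):
    """Muscle force needed to hold a load statically at the hand.

    Torque balance about the elbow joint:
        F_muscle * d_muscle = m * g * cos(theta) * d_load
    where theta is the forearm angle from horizontal, d_load is the
    joint-to-hand distance and d_muscle the muscle's moment arm
    (both distances here are illustrative assumptions).
    """
    theta = math.radians(forearm_angle_deg)
    load_torque = load_kg * G * math.cos(theta) * d_load_m
    return load_torque / d_muscle_m
```

The small moment arm of the flexor relative to the load explains the large force ratio such models typically report: with the placeholder geometry, holding 2 kg horizontally requires well over 150 N of muscle force.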
Abstract:
Carbon (C) and nitrogen (N) process-based models are important tools for estimating and reporting greenhouse gas emissions and changes in soil C stocks. There is a need for continuous evaluation, development and adaptation of these models to improve scientific understanding, national inventories and assessment of mitigation options across the world. To date, much of the information needed to describe processes such as transpiration, photosynthesis, plant growth and maintenance, above- and below-ground carbon dynamics, decomposition and nitrogen mineralization in ecosystem models remains inaccessible to the wider community, being stored within model source code or held internally by modelling teams. Here we describe the Global Research Alliance Modelling Platform (GRAMP), a web-based modelling platform to link researchers with appropriate datasets, models and training material. It will provide access to model source code and an interactive platform for researchers to form a consensus on existing methods and to synthesize new ideas, which will help to advance progress in this area. The platform will eventually support a variety of models, but to trial the platform and test the architecture and functionality, it was piloted with variants of the DNDC model. The intention is to form a worldwide collaborative network (a virtual laboratory) via an interactive website with access to models and best practice guidelines; appropriate datasets for testing, calibrating and evaluating models; on-line tutorials; and links to modelling and data-provider research groups and their associated publications. A graphical user interface has been designed to view the model development tree and access all of the above functions.
Abstract:
The banking industry is observing how new competitors threaten its long-established business model by targeting unbanked people, offering new financial services to its customer base, and even enabling new channels for existing services and customers. Knowledge of users, their behaviour and their expectations becomes a key asset in this new context. Well aware of this situation, the Center for Open Middleware, a joint technology center created by Santander Bank and Universidad Politécnica de Madrid, has launched a set of initiatives to allow the experimental analysis and management of socio-economic information. One of them is the PosdataP2P service, which seeks to model the economic ties between holders of university smart cards, leveraging the social networks the holders are subscribed to. In this paper we describe the design principles guiding the development of the system, its architecture and some implementation details.
Abstract:
In the last decade, multi-sensor data fusion has become a broadly demanded discipline for achieving advanced solutions that can be applied in many real-world situations, either civil or military. In Defence, accurate detection of all target objects is fundamental to maintaining situational awareness, locating threats in the battlefield, and identifying and protecting one's own forces. Civil applications, such as traffic monitoring, have similar requirements in terms of object detection and reliable identification of incidents in order to ensure the safety of road users. With the appropriate data fusion technique, such systems can automatically exploit all relevant information from multiple sources to meet, for instance, mission needs or support daily supervision operations. This paper focuses on the application of data fusion to active vehicle monitoring in a particular area of high-density traffic, and on how it is redirecting the research activities carried out in the computer vision, signal processing and machine learning fields to improve the effectiveness of detection and tracking in ground surveillance scenarios in general. Specifically, our system fuses data at the feature level, extracted from a video camera and a laser scanner. In addition, a stochastic tracker, which introduces particle filters into the model to deal with uncertainty due to occlusions and to improve the previous detection output, is presented. This computer vision tracker has been shown to contribute to detecting objects even under poor visual information. Finally, in the same way that humans are able to analyse both temporal and spatial relations among items in a scene to associate a meaning with them, once the target objects have been correctly detected and tracked it is desirable that machines can provide a trustworthy description of what is happening in the scene under surveillance. Accomplishing such an ambitious task requires a machine learning-based hierarchical architecture able to extract and analyse behaviours at different abstraction levels. A real experimental testbed has been implemented for the evaluation of the proposed modular system: a closed circuit where real traffic situations can be simulated. First results have shown the strength of the proposed system.
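The particle-filter idea mentioned in this abstract can be sketched with a minimal bootstrap filter for a one-dimensional target position. The motion model (random walk), noise levels and particle count below are illustrative assumptions, not the paper's configuration, and a real vehicle tracker would use a multi-dimensional state and fused camera/laser measurements.

```python
import math
import random

def particle_filter(measurements, n=500, motion_std=1.0, meas_std=2.0,
                    rng=None):
    """Bootstrap particle filter tracking a scalar position.

    Each step: predict (propagate particles through the motion model),
    update (weight by Gaussian measurement likelihood), resample.
    """
    rng = rng or random.Random(42)
    # Initialise particles around the first measurement.
    particles = [rng.gauss(measurements[0], meas_std) for _ in range(n)]
    estimates = []
    for z in measurements:
        # Predict: random-walk motion model.
        particles = [p + rng.gauss(0.0, motion_std) for p in particles]
        # Update: weight particles by how well they explain measurement z.
        weights = [math.exp(-0.5 * ((z - p) / meas_std) ** 2)
                   for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Resample to concentrate particles on likely states; this is what
        # lets the filter ride out occlusions (steps with weak evidence).
        particles = rng.choices(particles, weights=weights, k=n)
        estimates.append(sum(particles) / n)
    return estimates
```

During an occlusion one would skip the update and resampling steps, letting the prediction alone carry the particles until the target reappears.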
Abstract:
The urban microclimate plays an important role in building energy consumption and thermal comfort in outdoor spaces. Nowadays, cities need to increase energy efficiency, reduce pollutant emissions and mitigate an evident lack of sustainability. In light of this, attention has focused on the use of bioclimatic concepts in urban development. However, the speculative unsustainability of the growth model highlights the need to redirect the construction sector towards urban renovation using a bioclimatic approach. Public space plays a key role in improving the quality of today’s cities, especially in terms of providing places for citizens to meet and socialize in adequate thermal conditions. Thermal comfort affects perception of the environment, so microclimate conditions can be decisive for the success or failure of outdoor urban spaces and the activities held in them. For these reasons, the main focus of this work is the definition of bioclimatic strategies for existing urban spaces, based on morpho-typological components, urban microclimate conditions and comfort requirements for all kinds of citizens. Two case studies were selected in Madrid, in a social housing neighbourhood constructed in the 1970s in the Rational Architecture style. Several renovation scenarios were simulated using a computer simulation process based on ENVI-met, and the resulting microclimate conditions were compared. In addition, a thermal comfort evaluation was carried out using the Universal Thermal Climate Index (UTCI) in order to investigate the relationship between microclimate conditions and thermal comfort perception. This paper introduces microclimate computer simulation as a valuable support for decision-making in neighbourhood renovation projects, providing new and better solutions for the thermal quality of public spaces and reducing energy consumption by creating and selecting better microclimate areas.