985 results for enterprise modelling
Abstract:
All over the world, organizations are becoming more and more complex, and there is a growing need to capture that complexity. This is where the DEMO methodology (Design and Engineering Methodology for Organizations), created and developed by Jan L. G. Dietz, reaches its potential: capturing the structure of business processes in a coherent and consistent set of diagrams with their respective grammatical rules. The creation of the WAMM (Wiki Aided Meta Modeling) platform was the main focus of this thesis, and its principal precursor was the idea of creating a Meta-Editor that supports semantic data and uses MediaWiki. This prototype Meta-Editor uses MediaWiki as a data repository and draws on the ideas of the Universal Enterprise Adaptive Object Model and the concept of the Semantic Web to create a platform that suits our needs, through Semantic MediaWiki, which helps the computer interconnect information and people more comprehensively by giving meaning to the content of the pages. The proposed Meta-Modeling platform allows the specification of the abstract syntax, i.e., the grammar, and the concrete syntax, e.g., symbols and connectors, of any language, as well as their model types and diagram types. We use the DEMO language as a proof-of-concept and example. All such specifications are made in a coherent and formal way by creating semantic wiki pages and semantic properties connecting them.
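As a rough illustration of the kind of structure such a meta-modelling platform specifies, the sketch below represents a language's abstract syntax (concept and relation types) together with its concrete syntax (symbols and connectors), using a DEMO-flavoured example. The class and attribute names are invented for illustration and are not WAMM's actual wiki schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a meta-modelling specification: abstract syntax
# (concept and relation types) plus concrete syntax (symbols, connectors).
# Names are illustrative, not WAMM's actual schema of semantic wiki pages.

@dataclass
class ConceptType:             # abstract syntax: a node type of the language
    name: str
    symbol: str                # concrete syntax: graphical symbol

@dataclass
class RelationType:            # abstract syntax: an allowed relation
    name: str
    source: str
    target: str
    connector: str             # concrete syntax: line/arrow style

@dataclass
class DiagramType:             # a diagram type groups the constructs it may show
    name: str
    concepts: list = field(default_factory=list)
    relations: list = field(default_factory=list)

# DEMO-flavoured example: transaction kinds linked to actor roles.
actor = ConceptType("ActorRole", symbol="box")
transaction = ConceptType("TransactionKind", symbol="diamond-in-disk")
initiator = RelationType("initiator", "ActorRole", "TransactionKind", connector="solid line")
executor = RelationType("executor", "ActorRole", "TransactionKind", connector="solid line, T-end")

ocd = DiagramType("OrganizationConstructionDiagram",
                  concepts=[actor, transaction],
                  relations=[initiator, executor])

for r in ocd.relations:
    print(f"{r.source} --{r.name}--> {r.target} ({r.connector})")
```

In the platform described above, each of these types would correspond to a semantic wiki page, with semantic properties playing the role of the object fields shown here.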
Abstract:
The concept of the IoT (Internet of Things) is accepted in academia and industry as the future direction of the Internet. It will enable people and things to be connected at any time and in any place, with anything and anyone. The IoT has been proposed for many areas, such as healthcare, transportation, logistics and smart environments. This thesis, however, focuses on home healthcare, a promising model for addressing problems that the traditional healthcare model cannot, such as limited medical resources and the increasing demand for care from elderly and chronically ill patients. A notable feature of the semantic-oriented vision of the IoT is that vast numbers of sensors and devices are involved, generating enormous amounts of data, so methods to manage that data, including acquiring, interpreting, processing and storing it, need to be implemented. Beyond this, other capabilities the IoT still lacks are identified, namely interoperation, context awareness, and security and privacy. Context awareness is an emerging technology for managing and exploiting context to enable any type of system to provide personalized services. The aim of this thesis is to explore ways to facilitate context awareness in the IoT. To realize this objective, preliminary research is carried out. The most basic premise of context awareness is to collect, model, understand, reason over and make use of context. A literature review of existing context modelling and context reasoning techniques is conducted, concluding that ontology-based context modelling and ontology-based context reasoning are the most promising and efficient techniques for managing context. To bring ontologies into the IoT, a specific ontology-based context awareness framework is proposed for IoT applications. The framework is composed of eight components: hardware, UI (User Interface), context modelling, context fusion, context reasoning, context repository, security unit and context dissemination. Moreover, on the basis of TOVE (Toronto Virtual Enterprise), a formal ontology development methodology is proposed and illustrated, consisting of four stages: Specification & Conceptualization, Competency Formulation, Implementation, and Validation & Documentation. In addition, a home healthcare scenario is elaborated by listing its well-defined functionalities. To represent this specific scenario, the proposed ontology development methodology is applied and the ontology-based model is developed in Protégé, a free and open-source ontology editor. Finally, the accuracy and completeness of the proposed ontology are validated to show that it can accurately represent the scenario of interest.
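The thesis builds its ontology in Protégé; purely as a hedged illustration of what ontology-based context modelling and a simple rule-like reasoning step look like in code, the sketch below uses Python's rdflib. The class and property names for the home healthcare scenario are invented here and are not taken from the thesis's ontology.

```python
from rdflib import Graph, Namespace, Literal, RDF, RDFS
from rdflib.namespace import XSD

# Hypothetical home-healthcare context ontology fragment; names are
# illustrative only, not the ontology developed in the thesis.
EX = Namespace("http://example.org/homecare#")
g = Graph()
g.bind("ex", EX)

# Terminology: sensor and patient classes.
g.add((EX.BodyTemperatureSensor, RDFS.subClassOf, EX.Sensor))
g.add((EX.Patient, RDFS.subClassOf, EX.Person))

# Context acquisition: one sensor reading modelled as an observation.
g.add((EX.sensor1, RDF.type, EX.BodyTemperatureSensor))
g.add((EX.alice, RDF.type, EX.Patient))
g.add((EX.obs1, RDF.type, EX.Observation))
g.add((EX.obs1, EX.observedBy, EX.sensor1))
g.add((EX.obs1, EX.about, EX.alice))
g.add((EX.obs1, EX.celsius, Literal(38.7, datatype=XSD.decimal)))

# Context reasoning: a rule-like SPARQL query derives a "fever" context
# that a dissemination component could forward to a caregiver.
q = """
SELECT ?patient ?t WHERE {
  ?obs a ex:Observation ;
       ex:about ?patient ;
       ex:celsius ?t .
  FILTER (?t > 38.0)
}
"""
for patient, t in g.query(q, initNs={"ex": EX}):
    print(f"Fever context inferred for {patient} ({t} °C)")
```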
Abstract:
Enterprise systems interoperability (ESI) is currently an important topic for business. This situation is evidenced, at least in part, by the number and extent of potential candidate protocols for such process interoperation, viz., ebXML, BPML, BPEL and WSCI. Wide-ranging support for each of these candidate standards already exists. However, despite broad acceptance, a sound theoretical evaluation of these approaches has not yet been provided. We use the Bunge-Wand-Weber (BWW) models, in particular the representation model, to provide the basis for such a theoretical evaluation. We, and other researchers, have shown the usefulness of the representation model for analyzing, evaluating, and engineering techniques in the areas of traditional and structured systems analysis, object-oriented modeling, and process modeling. In this work, we address the question: what are the potential semantic weaknesses of using ebXML alone for process interoperation between enterprise systems? We find that users will lack important implementation information because of representational deficiencies; that ontological redundancy unnecessarily increases the complexity of the specification; and that users of the specification will have to bring in extra-model knowledge to understand its constructs, due to instances of ontological excess.
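Representational analysis of this kind amounts to checking the mapping between the grammar's constructs and the ontological constructs of the BWW representation model. The toy sketch below shows how deficit, redundancy and excess fall out of such a mapping; the construct names and mapping entries are illustrative placeholders, not the paper's actual analysis of ebXML.

```python
from collections import Counter

# Hypothetical mapping from grammar constructs to BWW ontological constructs;
# None marks a construct with no ontological meaning.  Placeholder entries only.
mapping = {
    "BusinessTransaction": "event",
    "BusinessDocument":    "thing",
    "DocumentEnvelope":    "thing",   # second grammar construct for the same concept
    "Signal":              None,      # no ontological counterpart
}
bww_constructs = {"thing", "property", "state", "event", "transformation", "lawful state space"}

mapped = [m for m in mapping.values() if m is not None]
counts = Counter(mapped)

deficit    = bww_constructs - set(mapped)                  # ontological construct never represented
redundancy = {b for b, n in counts.items() if n > 1}       # ontological construct represented more than once
excess     = {g for g, m in mapping.items() if m is None}  # grammar construct without ontological meaning

print("construct deficit:   ", sorted(deficit))
print("construct redundancy:", sorted(redundancy))
print("construct excess:    ", sorted(excess))
```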
Abstract:
The application of any e-Solution promises significant returns. In particular, using Internet technologies both within enterprises and across the supply (value) chain provides real opportunity, not only for operational improvement but also for innovative strategic positioning. However, significant questions obscure potential investment: it is not clear how any value will actually be created or, importantly, how this value will be shared across the value chain. This paper describes a programme of research that is developing an enterprise simulator to provide a more fundamental understanding of the impact of e-Solutions across operational supply chains, in terms of both standard operational and financial measures of performance. An efficient supply chain reduces the total costs of operations by sharing accurate real-time information and coordinating inter-organizational business processes. This form of electronic link between organizations is known as business-to-business (B2B) e-Business. The financial measures go beyond simple cost calculations to real bottom-line performance by modelling the financial transactions that business processes generate. The paper shows how this enterprise simulator allows a complete supply chain to be modelled in this way across four key applications: control system design, virtual enterprises, pan-supply-chain performance metrics and supporting an e-Supply-chain design methodology.
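The central idea, that each operational event in a supply chain simulation also generates financial transactions, so that operational and bottom-line measures come out of the same run, can be sketched in a few lines. The toy model below (a single retailer replenishing from one supplier, with all figures invented) is only an illustration of that idea, not the enterprise simulator described in the paper.

```python
import random

random.seed(1)

# Minimal two-tier supply chain: each sale and replenishment posts a
# financial transaction, so fill rate and profit come from one simulation.
# All parameters are illustrative, not taken from the paper's simulator.
PRICE, UNIT_COST, HOLDING_COST = 10.0, 6.0, 0.05    # $/unit, $/unit, $/unit-day
REORDER_POINT, ORDER_QTY, LEAD_TIME = 20, 40, 3     # units, units, days

inventory, pipeline = 40, []     # on-hand stock and in-transit orders (arrival_day, qty)
ledger = []                      # (day, account, amount)
demand_total = demand_filled = 0

for day in range(1, 91):
    # receive replenishments arriving today, paying the supplier
    for arrival, qty in [p for p in pipeline if p[0] == day]:
        inventory += qty
        ledger.append((day, "purchases", -UNIT_COST * qty))
        pipeline.remove((arrival, qty))

    # daily demand: sales create revenue, shortfalls become lost sales
    demand = random.randint(3, 12)
    sold = min(demand, inventory)
    inventory -= sold
    demand_total += demand
    demand_filled += sold
    ledger.append((day, "revenue", PRICE * sold))
    ledger.append((day, "holding", -HOLDING_COST * inventory))

    # reorder when the inventory position drops to the reorder point
    position = inventory + sum(q for _, q in pipeline)
    if position <= REORDER_POINT:
        pipeline.append((day + LEAD_TIME, ORDER_QTY))

fill_rate = demand_filled / demand_total
profit = sum(amount for _, _, amount in ledger)
print(f"fill rate: {fill_rate:.1%}, 90-day profit: {profit:.2f}")
```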
Abstract:
Supply chains are widely advocated as the new units of commercial competition, and developments have made the sharing of supply-chain-wide information increasingly common. Most organisations, however, still make operational decisions intended to maximise local organisational performance. With improved information sharing, a holistic focus for operational decisions should now be possible. The development of a pan-supply-chain performance framework requires an examination of the conditions under which holistic decisions benefit either the individual enterprise or the complete supply chain. This paper presents the background and supporting methodology for a study of the impact of an overall supply chain performance metric framework upon local logistics decisions, and of the conditions under which such a framework would improve overall supply chain performance. The methodology concludes that a simulation approach using a functionally extended Gensym e-SCOR model, together with case-based triangulation, is optimal. Copyright © 2007 Inderscience Enterprises Ltd.
Abstract:
Risk and knowledge are two concepts and components of business management that have so far been studied almost independently. This is especially true where risk management is conceived mainly in financial terms, as, for example, in the banking sector. The banking sector has sophisticated methodologies for managing risk, such as mathematical risk modeling. However, the methodologies for analyzing risk do not explicitly include knowledge management for risk knowledge creation and risk knowledge transfer. Banks are affected by internal and external changes, with the consequent accommodation to new business models, new regulations and the competition of big players around the world. Thus, banks have different levels of risk appetite and different policies for risk management. This paper takes into consideration that business models are changing and that management is looking across the organization to identify the influence of strategic planning, information systems theory, risk management and knowledge management. These disciplines can handle the risks affecting banking that arise from different areas, but only if they work together, which creates a need to view them in an integrated way. This article sees enterprise risk management as a specific application of knowledge aimed at controlling deviation from strategic objectives, shareholders' values and stakeholders' relationships. Before and after a modeling process, it is necessary to gain insight into how the application of knowledge management processes can improve the understanding of risk and the implementation of enterprise risk management. The article presents a proposed methodology that contributes a guide for developing risk-modeling knowledge and reducing knowledge silos, in order to improve the quality and quantity of solutions to risk inquiries across the organization.
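As one generic example of the mathematical risk modelling the abstract refers to (shown only for context, and not anything proposed in the article), a historical-simulation Value-at-Risk calculation fits in a few lines; the portfolio value and return series below are invented.

```python
import random

random.seed(42)

# Generic historical-simulation Value-at-Risk sketch; the portfolio value
# and simulated return history are invented, purely for illustration.
portfolio_value = 1_000_000.0
daily_returns = [random.gauss(0.0003, 0.012) for _ in range(500)]  # stand-in for observed history

def historical_var(returns, value, confidence=0.99):
    """Loss threshold not exceeded on `confidence` of historical days."""
    losses = sorted(-r * value for r in returns)       # losses, ascending
    index = int(confidence * len(losses)) - 1
    return losses[index]

print(f"1-day 99% VaR: {historical_var(daily_returns, portfolio_value):,.0f}")
```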
Abstract:
With its low-power operation and flexible networking capabilities, IEEE 802.15.4 has been widely regarded as a strong candidate communication technology for wireless sensor networks (WSNs). It is expected that, with an increasing number of deployments of 802.15.4-based WSNs, multiple WSNs could coexist with full or partial overlap in residential or enterprise areas. As WSNs are usually deployed without coordination, communication can suffer significant degradation under the 802.15.4 channel access scheme, which has a large impact on system performance. In this thesis we investigate the effectiveness of 802.15.4 networks supporting WSN applications in various environments, especially when hidden terminals are present due to the uncoordinated coexistence problem. Both analytical models and system-level simulators are developed to analyse the performance of the random access scheme specified by the IEEE 802.15.4 medium access control (MAC) standard for several network scenarios. The first part of the thesis investigates the effectiveness of a single 802.15.4 network supporting WSN applications. A Markov chain based analytical model of the MAC behaviour of the IEEE 802.15.4 standard is applied, and a discrete event simulator is developed to analyse the performance and verify the proposed model. It is observed that 802.15.4 networks, with their various functionalities, can adequately support most WSN applications. After the investigation of a single network, the uncoordinated coexistence problem of multiple 802.15.4 networks deployed with fully or partially overlapping communication ranges is investigated in the next part of the thesis. Both non-sleep and sleep modes are studied under different channel conditions, by analytical and simulation methods, to obtain a comprehensive performance evaluation. It is found that the uncoordinated coexistence problem can significantly degrade the performance of 802.15.4 networks, which is then unlikely to satisfy the QoS requirements of many WSN applications. The proposed analytical model is validated by simulations and can be used to obtain optimal parameter settings before WSN deployment to mitigate interference risks.
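To give a feel for why uncoordinated, mutually hidden transmitters degrade a CSMA/CA channel, the Monte Carlo sketch below is deliberately simplified (a single random backoff followed by an immediate transmission, no acknowledgements, retransmissions or real 802.15.4 slot timing); it is an illustration only, not the Markov-chain model or system-level simulator developed in the thesis.

```python
import random

random.seed(0)

# Simplified hidden-terminal sketch: two uncoordinated transmitters each
# back off a random number of slots and, unable to sense one another,
# transmit regardless; frames collide at the receiver when they overlap.
FRAME_SLOTS = 6           # frame duration in backoff slots (illustrative)
MAX_BACKOFF = 2 ** 3 - 1  # backoff drawn uniformly from 0..2^BE - 1, BE = 3
TRIALS = 100_000

def collides():
    start_a = random.randint(0, MAX_BACKOFF)
    start_b = random.randint(0, MAX_BACKOFF)
    # hidden nodes: clear-channel assessment never sees the other transmitter,
    # so both frames are sent; they collide if they overlap in time
    return abs(start_a - start_b) < FRAME_SLOTS

collision_rate = sum(collides() for _ in range(TRIALS)) / TRIALS
print(f"estimated collision probability: {collision_rate:.2%}")
```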
Abstract:
This Proceedings volume contains selected papers from the Fourth International CIRP-sponsored Conference on Digital Enterprise Technology (DET2007), which was held at the University of Bath, UK, 19–21 September 2007. All selected papers have been suitably enhanced for publication in the Journal and have undergone full review. Digital enterprise technology (DET) is ‘the collection of systems and methods for the digital modelling and analysis of the global product development and realization process, in the context of lifecycle management.’ The principal aim of the DET concept is to provide a coherent context for the development and integration of the various digital technologies that underpin modern design and manufacturing. These technologies can be classified according to the following five key areas: 1. distributed and collaborative design; 2. process modelling and process planning; 3. advanced factory design and modelling; 4. physical-to-digital environment integrators and verification; 5. enterprise integration technologies. This special issue is representative of the wide breadth of the DET concept, including a comprehensive review of digital engineering, design processes, digital modelling of machine tools, forming, robotics and machining processes, verification and metrology, and dynamic networks. It is particularly pleasing to see the development of metrology as a key aspect of modern manufacturing technology, linking design intent to process capability. The papers published herein will facilitate the exploration of new and evolving research concepts by the international research community and will influence the development of international standards for the application of DET technologies.
Abstract:
A bio-economic modelling framework (GRASP-ENTERPRISE) was used to assess the implications of retaining woody regrowth for carbon sequestration on a case-study beef grazing property in northern Australia. Five carbon farming scenarios, ranging from 0% to 100% of the property's regrowth retained for carbon sequestration, were simulated over a 20-year period (1993–2012). Dedicating regrowth on the property to carbon sequestration reduced pasture production (by up to 40%) and herd productivity (by up to 20%), and resulted in financial losses (up to a 24% reduction in total gross margin). A net carbon income (income after grazing management expenses are removed) of $2–4 per t CO2-e was required to offset the economic losses of retaining regrowth on a moderately productive (~8 ha per adult equivalent) property where income came from the sale of weaners. A higher opportunity cost ($ per t CO2-e) of retaining woody regrowth is likely for feeder-steer or finishing operations, with improved cattle prices, and where the substantial transaction and reporting costs are included. Although uncertainty remains around the price received for carbon farming activities, this study demonstrated that a conservatively stocked breeding operation can achieve positive production, environmental and economic outcomes, including a net gain in carbon stock. This study was based on a beef enterprise in central Queensland’s grazing lands; however, the approach and learnings are expected to be applicable across northern Australia wherever regrowth is present.
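The break-even figure quoted above is, at heart, simple arithmetic: the net carbon income per tonne that exactly offsets the cattle gross margin forgone by retaining regrowth. The sketch below shows that calculation with placeholder numbers chosen only for illustration, not with GRASP-ENTERPRISE outputs.

```python
# Break-even carbon price: the net income per tonne of CO2-e that exactly
# offsets the cattle gross margin forgone by retaining regrowth.
# All figures are illustrative placeholders, not outputs of GRASP-ENTERPRISE.
gross_margin_loss = 24_000.0   # $/year forgone by running fewer cattle
regrowth_area = 2_000.0        # ha of regrowth retained
sequestration_rate = 0.5       # t CO2-e per ha per year (assumed)
annual_abatement = regrowth_area * sequestration_rate

breakeven_price = gross_margin_loss / annual_abatement
print(f"break-even net carbon income: ${breakeven_price:.2f} per t CO2-e")
```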
Abstract:
Agricultural land has been identified as a potential source of greenhouse gas emissions offsets through biosequestration in vegetation and soil. In the extensive grazing lands of Australia, landholders may participate in the Australian Government’s Emissions Reduction Fund and create offsets by reducing woody vegetation clearing and allowing native woody plant regrowth to grow. This study used bioeconomic modelling to evaluate the trade-offs between an existing central Queensland grazing operation, which has used repeated tree clearing to maintain pasture growth, and an alternative carbon-and-grazing enterprise in which tree clearing is reduced and the additional carbon sequestered in trees is sold. The results showed that ceasing clearing in favour of producing offsets yields a higher net present value over 20 years than the existing cattle enterprise at carbon prices close to current (2015) market levels (~$13 per t CO2-e). However, modifying key variables did change relative profitability. Sensitivity analysis evaluated the key variables that determine the relative profitability of carbon and cattle; in order of importance these were the carbon price, the gross margin of cattle production, the severity of the tree-grass relationship, the area of regrowth retained, the age of regrowth at the start of the project and, to a lesser extent, the cost of carbon project administration, compliance and monitoring. Based on the analysis, retaining regrowth to generate carbon income may be worthwhile for cattle producers in Australia, but careful consideration needs to be given to the opportunity cost of reduced cattle income.
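The core comparison is a discounted-cash-flow one: the 20-year net present value of continuing to clear and run cattle versus reducing clearing and selling the extra sequestration. The sketch below shows that calculation and a simple break-even check on the carbon price; every parameter value is invented for illustration and is not taken from the study.

```python
# NPV comparison of "clear and graze" versus "retain regrowth and sell carbon".
# Every parameter below is an illustrative placeholder, not a value from the study.
YEARS, DISCOUNT_RATE = 20, 0.07

def npv(cashflows, rate=DISCOUNT_RATE):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

# Option A: existing cattle enterprise with periodic re-clearing costs.
cattle_gm = 150_000.0                    # $/year gross margin
clearing = {5: 30_000.0, 15: 30_000.0}   # re-clearing outlays in years 5 and 15
option_a = [cattle_gm - clearing.get(t, 0.0) for t in range(1, YEARS + 1)]

# Option B: reduced cattle income plus carbon income, minus project costs.
reduced_gm = 110_000.0                   # $/year with regrowth retained
abatement = 4_000.0                      # t CO2-e sequestered per year
carbon_price = 13.0                      # $/t CO2-e (near 2015 market levels)
project_costs = 8_000.0                  # $/year administration, compliance, monitoring
option_b = [reduced_gm + abatement * carbon_price - project_costs
            for _ in range(1, YEARS + 1)]

print(f"NPV cattle only:     ${npv(option_a):,.0f}")
print(f"NPV carbon + cattle: ${npv(option_b):,.0f}")

# Simple sensitivity: the carbon price at which annual cash flows break even
# (clearing costs spread evenly; ignores their exact timing under discounting).
breakeven = (cattle_gm - sum(clearing.values()) / YEARS
             - reduced_gm + project_costs) / abatement
print(f"approximate break-even carbon price: ${breakeven:.2f} per t CO2-e")
```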
Abstract:
Softeam has over 20 years of experience providing UML-based modelling solutions, such as its Modelio modelling tool, and its Constellation enterprise model management and collaboration environment. Due to the increasing number and size of the models used by Softeam’s clients, Softeam joined the MONDO FP7 EU research project, which worked on solutions for these scalability challenges and produced the Hawk model indexer among other results. This paper presents the technical details and several case studies on the integration of Hawk into Softeam’s toolset. The first case study measured the performance of Hawk’s Modelio support using varying amounts of memory for the Neo4j backend. In another case study, Hawk was integrated into Constellation to provide scalable global querying of model repositories. Finally, the combination of Hawk and the Epsilon Generation Language was compared against Modelio for document generation: for the largest model, Hawk was two orders of magnitude faster.
Abstract:
This document presents an Enterprise Application Integration (EAI) based proposal for the management of research outcomes and technological information. The proposal addresses the management of national and international science and research outcomes information, and the corresponding information systems. Information systems interoperability problems, approaches, technologies and integration tools are presented and applied to the research outcomes information management case. Both a business and a technological perspective are provided, including conceptual analysis and modelling, an integration solution based on a Domain-Specific Language (DSL), and the integration platform that executes the proposed solution. For illustrative purposes, the role and information system needs of a research unit are taken as the representative case.
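As a hedged illustration of the kind of declarative mapping an integration DSL expresses, the sketch below translates a record from a research unit's local system into a notional national registry schema. The systems, field names and transformation rules are invented for illustration and are not the DSL or platform defined in the document.

```python
# Hypothetical integration mapping between a research unit's local system and
# a national research-outcomes registry; field names and rules are invented.
mapping_spec = [
    # (source field,   target field,        transformation)
    ("title",          "outputTitle",       str.strip),
    ("doi",            "identifier",        lambda v: f"doi:{v.lower()}"),
    ("year",           "publicationYear",   int),
    ("authors",        "creators",          lambda v: [a.strip() for a in v.split(";")]),
]

def integrate(record: dict) -> dict:
    """Translate one source record into the target system's schema."""
    return {target: transform(record[source])
            for source, target, transform in mapping_spec
            if source in record}

local_record = {"title": " Enterprise Integration in Practice ",
                "doi": "10.1234/ABC", "year": "2014",
                "authors": "Silva, A.; Costa, B."}
print(integrate(local_record))
```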
Abstract:
This paper presents an application that composes formal poetry in Spanish in a semi-automatic, interactive fashion. JASPER is a forward-reasoning, rule-based system that obtains from the user an intended message, the desired metre, a choice of vocabulary and a corpus of verses and, by intelligently adapting selected examples from this corpus using the given words, carries out a prose-to-poetry translation of the given message. In the composition process, JASPER combines natural language generation with a set of construction heuristics obtained from the formal literature on Spanish poetry.
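One of the construction heuristics such a system relies on is checking candidate lines against the desired metre. The sketch below is a very rough Spanish syllable counter (vowel groups only, ignoring hiatus, synalepha and the final-stress adjustment) used to filter candidates; it is an approximation for illustration, not JASPER's actual implementation.

```python
import re
import unicodedata

# Very rough Spanish syllable counter: counts vowel groups only, ignoring
# hiatus, synalepha and the final-stress adjustment, so it is a crude
# approximation of the metre checks a prose-to-poetry system would need.
def strip_accents(text: str) -> str:
    return "".join(c for c in unicodedata.normalize("NFD", text)
                   if unicodedata.category(c) != "Mn")

def approx_syllables(line: str) -> int:
    plain = strip_accents(line.lower()).replace("y", "i")
    return len(re.findall(r"[aeiou]+", plain))

def fits_metre(line: str, target: int, tolerance: int = 1) -> bool:
    return abs(approx_syllables(line) - target) <= tolerance

candidates = ["entre la sombra y el silencio canto",
              "la luz del alba"]
for verse in candidates:
    verdict = "accepted" if fits_metre(verse, 11) else "rejected"
    print(f"{approx_syllables(verse):2d} syllables | {verse} | {verdict} for a hendecasyllable")
```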