881 results for DDM Data Distribution Management testbed benchmark design implementation instance generator
Abstract:
Simulation is defined as the representation of the behaviour of a system or process through the operation of another or, following the etymology of the verb "to simulate", as the reproduction of something fictitious and unreal as if it were real. Simulation lets us model reality, explore alternative solutions, evaluate systems that cannot be built for various reasons, and carry out evaluations that remain dynamic with respect to changing conditions. Simulation models can reach an extremely high degree of expressiveness, and a single computer can rarely produce the expected results within acceptable time. One possible solution, given current technological trends, is to increase computational capacity through a distributed architecture (exploiting, for example, the possibilities offered by cloud computing). This thesis focuses on that setting and relates it to another topic that is gaining relevance day by day: online anonymity. Recent news has shown that a public, intrinsically insecure network such as today's Internet is ill suited to preserving the confidentiality, integrity and, in some cases, availability of the assets we use: when distributing interacting computational resources, we cannot ignore the many concrete risks; in sensitive simulation contexts (e.g., military simulation, scientific research, etc.) we cannot afford the uncontrolled disclosure of our data or, worse, an attack on the availability of the resources involved. Being anonymous carries an extremely relevant implication: being less attackable, because not identifiable.
Abstract:
In distribution system operations, dispatchers at the control center closely monitor system operating limits to ensure system reliability and adequacy. This reliability is partly due to the provision of remote-controllable tie and sectionalizing switches. While the stochastic nature of wind generation can affect the level of wind energy penetration in the network, an estimate of hourly wind output can be extremely useful. Under any operating condition, switching actions require human intervention and can be an extremely stressful task. To date, no approach has handled a set of switching combinations with the uncertainty of distributed wind generation among the decision variables. This thesis proposes a three-fold online management framework: (1) prediction of wind speed, (2) estimation of wind generation capacity, and (3) enumeration of feasible switching combinations. The proposed methodology is evaluated on a 29-node test system with 8 remote-controllable switches and two wind farms of 18 MW and 9 MW nameplate capacity, respectively, generating the sequence of system reconfiguration states during normal and emergency conditions.
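As a rough illustration of the three steps, the sketch below chains a placeholder wind-speed forecast into a simplified power-curve estimate and a brute-force enumeration of switch states. The power-curve constants, the persistence forecast, and the feasibility rule are all assumptions made for illustration; the thesis's actual models are not specified in this abstract.

```python
from itertools import product

# Assumed cut-in/rated/cut-out speeds for a generic turbine power curve;
# the thesis's actual wind model and feasibility rules are not given here.
CUT_IN, RATED, CUT_OUT = 3.0, 12.0, 25.0  # m/s

def predict_wind_speed(history):
    """Step 1 (placeholder): persistence forecast -- next hour equals last hour."""
    return history[-1]

def wind_power_mw(speed, nameplate_mw):
    """Step 2: estimate farm output from speed via a simplified power curve."""
    if speed < CUT_IN or speed >= CUT_OUT:
        return 0.0
    if speed >= RATED:
        return nameplate_mw
    # Cubic rise between cut-in and rated speed
    return nameplate_mw * ((speed - CUT_IN) / (RATED - CUT_IN)) ** 3

def feasible_configurations(n_switches, is_feasible):
    """Step 3: enumerate all switch states, keeping only the feasible ones."""
    return [s for s in product((0, 1), repeat=n_switches) if is_feasible(s)]

# Example: two farms (18 MW and 9 MW) and 8 switches, as in the test system.
speed = predict_wind_speed([7.2, 8.1, 8.4])
generation = wind_power_mw(speed, 18.0) + wind_power_mw(speed, 9.0)
configs = feasible_configurations(8, lambda s: sum(s) >= 4)  # toy feasibility proxy
print(f"forecast {speed} m/s -> {generation:.1f} MW, {len(configs)} feasible states")
```

In a real dispatch setting the feasibility test would encode radiality and loading constraints rather than the toy switch-count rule used above.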
Abstract:
Quality data are not only relevant for successful Data Warehousing or Business Intelligence applications; they are also a precondition for efficient and effective use of Enterprise Resource Planning (ERP) systems. ERP professionals in all kinds of businesses are concerned with data quality issues, as a survey conducted by the Institute of Information Systems at the University of Bern has shown. Using the results of this survey, this paper demonstrates why data quality problems can occur in modern ERP systems and suggests how ERP researchers and practitioners can handle data quality issues in an ERP software environment.
Abstract:
In this paper, the authors introduce a novel mechanism for data management in a middleware for smart home control, where a relational database and semantic ontology storage are used side by side in a Data Warehouse. An annotation system has been designed to specify the storage format and location, to register new ontology concepts and, most importantly, to guarantee data consistency between the two storage methods. To ease the data persistence process, the Data Access Object (DAO) pattern is applied and optimized to strengthen the consistency guarantee. This mechanism also simplifies the development of applications and their integration with BATMP. Finally, an application named "Parameter Monitoring Service" is given as an example to assess the feasibility of the system.
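BATMP's actual annotation system and APIs are not public, so the following is only a minimal Python analogue of the idea: an annotation (here a decorator) marks which back-ends an entity uses, and the DAO writes both stores together so the relational and ontology views stay consistent. All names are invented for illustration.

```python
relational_db = {}   # stand-in for the relational database
ontology_store = {}  # stand-in for the semantic ontology storage

def persist_to(*targets):
    """Annotation: record which storage back-ends an entity class uses."""
    def wrap(cls):
        cls._targets = targets
        return cls
    return wrap

@persist_to("relational", "ontology")
class DeviceReading:
    def __init__(self, device_id, parameter, value):
        self.device_id, self.parameter, self.value = device_id, parameter, value

class DeviceReadingDAO:
    def save(self, reading):
        if "relational" in reading._targets:
            relational_db[reading.device_id] = vars(reading)  # flat row
        if "ontology" in reading._targets:
            # store the same fact as subject-predicate-object triples
            ontology_store[reading.device_id] = [
                (reading.device_id, "hasParameter", reading.parameter),
                (reading.device_id, "hasValue", reading.value),
            ]
        # crude consistency check: dual-target entities must land in both stores
        if {"relational", "ontology"} <= set(reading._targets):
            assert (reading.device_id in relational_db
                    and reading.device_id in ontology_store)

DeviceReadingDAO().save(DeviceReading("thermostat-1", "temperature", 21.5))
```

Centralizing the dual write in one DAO method is what makes the consistency guarantee enforceable: application code never touches either store directly.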
Abstract:
Personal data about users (customers) is a key asset for enterprises and large organizations. Analyzed and processed correctly, it can yield knowledge relevant to a range of business goals. For example, monetising this data has become a valuable line of business for companies such as Google, Facebook and Twitter, which obtain huge profits mainly from targeted advertising.
Abstract:
Concurrent engineering and design-for-manufacture-and-assembly strategies have become pervasive across a wide array of industrial settings. These strategies have generally focused on product and process design issues driven by capability concerns, and they have historically been justified with cost-savings calculations centred on easily quantifiable items such as raw material savings or manufacturing and assembly operations no longer required. It is argued herein that neither the focus of the strategies nor the means of justification is adequate: product and process design strategies should include both capability and capacity concerns, and justification procedures should include the financial effects that the product and process changes would have on the entire company. The authors take this more holistic view of the problem and examine an innovative new design strategy using a comprehensive enterprise simulation tool. The results indicate that both the design strategy and the simulator show promise for further industrial use.
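The contrast the authors draw can be made concrete with a little arithmetic. The sketch below is illustrative only (all figures and the enterprise-level terms are invented; the paper's simulator models the company far more comprehensively): it compares a justification based solely on easily quantifiable savings with one that also prices company-wide effects such as freed capacity and leaner inventory.

```python
def traditional_savings(material_saved, ops_eliminated, cost_per_op):
    """Easily quantifiable costs only: materials plus removed operations."""
    return material_saved + ops_eliminated * cost_per_op

def enterprise_effect(traditional, freed_capacity_hours, margin_per_hour,
                      inventory_reduction, carrying_rate):
    """Add hypothetical company-wide effects: resold capacity, lower carrying cost."""
    return (traditional
            + freed_capacity_hours * margin_per_hour
            + inventory_reduction * carrying_rate)

narrow = traditional_savings(50_000, 12_000, 1.5)                 # $68,000
holistic = enterprise_effect(narrow, 4_000, 20.0, 250_000, 0.18)  # $193,000
print(f"narrow: ${narrow:,.0f}  enterprise-wide: ${holistic:,.0f}")
```

Even with made-up numbers, the enterprise-wide figure can dwarf the narrow one, which is the paper's argument for simulating the whole company rather than tallying local savings.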
Abstract:
This paper describes the work undertaken in the Scholarly Ontologies project. The aim of the project has been to develop a computational approach to support scholarly sensemaking through interpretation and argumentation, enabling researchers to make claims: to describe and debate their view of a document's key contributions and its relationships to the literature. The project has investigated the technicalities and practicalities of capturing conceptual relations, within and between conventional documents, in terms of abstract ontological structures. In this way, we have developed a new kind of index to distributed digital library systems. This paper reports a case study undertaken to test the sensemaking tools developed by the project: ClaiMapper, which allows the user to sketch argument maps of individual papers and their connections; ClaiMaker, a server on which such maps can be stored and which provides interpretative services to assist the querying of argument maps across multiple papers; and ClaimFinder, a novice interface to the search services in ClaiMaker.
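The core data structure behind tools like ClaiMaker is a set of typed links (claims) between documents that can be queried across papers. The minimal sketch below assumes an invented three-relation vocabulary; the project's actual ontology of discourse relations is considerably richer.

```python
# Claims as (source, relation, target) triples over document identifiers.
claims = [
    ("paperA", "supports", "paperB"),
    ("paperC", "challenges", "paperB"),
    ("paperB", "uses-method-of", "paperD"),
]

def incoming(doc, relation=None):
    """Find claims made about `doc`, optionally filtered by relation type."""
    return [(src, rel) for src, rel, tgt in claims
            if tgt == doc and (relation is None or rel == relation)]

# Who has taken a position on paperB, and who specifically challenges it?
print(incoming("paperB"))                # [('paperA', 'supports'), ('paperC', 'challenges')]
print(incoming("paperB", "challenges"))  # [('paperC', 'challenges')]
```

Indexing claims rather than keywords is what allows queries such as "which papers challenge this result?" to span the whole collection.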
Abstract:
The aim of this paper is to give an analytical overview of the operational risks of supply chains, especially those deriving from the outsourcing of distribution management. Based on a literature review, the first part explores the reasons behind the growing risk exposure of global supply chains and briefly presents the possible steps of corporate risk management in this area. Drawing on semi-structured qualitative interviews, the second part summarizes and systematizes the risks associated with outsourcing distribution, surveys the related risk management options, and presents the risk prevention alternatives applied by the companies interviewed.