897 results for Abstraction.
Abstract:
In this article, we review the means for visualizing the syntax, semantics, and source code of programming languages that support the procedural and/or object-oriented paradigm. We examine how the structure of source code in the structured and object-oriented programming styles has influenced different approaches to teaching them. We maintain a thesis, valid for the object-oriented programming paradigm, that the activities of designing and programming classes are carried out by the same specialist, and that the training of this specialist should include design as well as programming skills and knowledge of modeling abstract data structures. We pose the question of how the high level of abstraction in the object-oriented paradigm can be presented as a simple model at the design stage, so that the complexity at the programming stage stays low and is easily learnable. We answer this question by building models in UML notation, taking a concrete example from teaching practice that includes programming techniques for inheritance and polymorphism.
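The article's concrete teaching example is not reproduced in the abstract; as an illustration of the kind of inheritance and polymorphism exercise it refers to, here is a minimal sketch in Python (the class and method names are ours, not the paper's):

```python
from abc import ABC, abstractmethod
import math

class Shape(ABC):
    """Abstract base class: the design-stage model fixes this interface."""
    @abstractmethod
    def area(self) -> float: ...

class Circle(Shape):
    def __init__(self, r: float):
        self.r = r
    def area(self) -> float:
        return math.pi * self.r ** 2

class Rectangle(Shape):
    def __init__(self, w: float, h: float):
        self.w, self.h = w, h
    def area(self) -> float:
        return self.w * self.h

# Polymorphism: one loop handles every concrete subclass uniformly.
shapes = [Circle(1.0), Rectangle(2.0, 3.0)]
total = sum(s.area() for s in shapes)
```

The design-stage model (the abstract class) stays simple, while each subclass keeps its programming-stage complexity local, which mirrors the separation the abstract argues for.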
Abstract:
The sharing of near real-time traceability knowledge in supply chains plays a central role in coordinating business operations and is a key driver of their success. However, before traceability datasets received from external partners can be integrated with datasets generated internally within an organisation, they need to be validated against information recorded for the physical goods received, as well as against bespoke rules defined to ensure uniformity, consistency, and completeness within the supply chain. In this paper, we present a knowledge-driven framework for the runtime validation of critical constraints on incoming traceability datasets encapsulated as EPCIS event-based linked pedigrees. Our constraints are defined using SPARQL queries and SPIN rules. We present a novel validation architecture based on the integration of the Apache Storm framework for real-time, distributed computation with popular Semantic Web/Linked Data libraries, and exemplify our methodology on an abstraction of the pharmaceutical supply chain.
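The paper's constraints are expressed as SPARQL queries and SPIN rules over RDF; purely as a sketch of what a completeness constraint on an incoming event checks, here is a toy Python version (the field names are illustrative assumptions, not the paper's vocabulary):

```python
# A completeness constraint: every incoming traceability event must carry
# these fields before it can be integrated with internal datasets.
REQUIRED = {"epc", "eventTime", "bizStep", "readPoint"}

def validate_event(event: dict) -> list:
    """Return a list of constraint violations for one traceability event."""
    missing = REQUIRED - event.keys()
    return [f"missing field: {f}" for f in sorted(missing)]

ok = {"epc": "urn:epc:id:sgtin:1", "eventTime": "2014-01-01T00:00:00Z",
      "bizStep": "shipping", "readPoint": "warehouse-1"}
bad = {"epc": "urn:epc:id:sgtin:2", "bizStep": "receiving"}
```

In the paper's framework this kind of check runs at runtime inside a distributed topology rather than on in-memory dictionaries.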
Abstract:
In this paper we show how event processing over semantically annotated streams of events can be exploited to implement tracing and tracking of products in supply chains through the automated generation of linked pedigrees. In our abstraction, events are encoded as spatially and temporally oriented named graphs, while linked pedigrees, as RDF datasets, are their specific compositions. We propose an algorithm that operates over streams of RDF-annotated EPCIS events to generate linked pedigrees. We exemplify our approach using the pharmaceutical supply chain and show how counterfeit detection is an implicit part of our pedigree generation. Our evaluation results show that for fast-moving supply chains, smaller window sizes on event streams provide significantly higher efficiency in the generation of pedigrees and enable earlier counterfeit detection.
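The window-size result can be illustrated with a toy sketch, assuming tumbling (fixed-size, non-overlapping) windows over timestamped events; the paper's actual algorithm operates over RDF-annotated EPCIS event streams:

```python
from collections import defaultdict

def window_events(events, window_size):
    """Group (timestamp, event) pairs into tumbling windows.

    A smaller window_size emits groups (stand-ins for pedigrees) earlier,
    which is the mechanism behind earlier counterfeit detection, at the
    cost of more frequent processing.
    """
    windows = defaultdict(list)
    for ts, event in events:
        windows[ts // window_size].append(event)
    return [windows[k] for k in sorted(windows)]

stream = [(0, "commission"), (3, "pack"), (7, "ship"), (9, "receive")]
```

With `window_size=5` the stream above yields two groups; with `window_size=10` everything waits for a single window.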
Abstract:
The EPCIS specification provides an event oriented mechanism to record product movement information across stakeholders in supply chain business processes. Besides enabling the sharing of event-based traceability datasets, track and trace implementations must also be equipped with the capabilities to validate integrity constraints and detect runtime exceptions without compromising the time-to-deliver schedule of the shipping and receiving parties. In this paper we present a methodology for detecting exceptions arising during the processing of EPCIS event datasets. We propose an extension to the EEM ontology for modelling EPCIS exceptions and show how runtime exceptions can be detected and reported. We exemplify and evaluate our approach on an abstraction of pharmaceutical supply chains.
Abstract:
Pavel Azalov - Recursion is a powerful technique for producing simple algorithms. It is a main topic in almost every introductory programming course. However, educators often refer to difficulties in learning recursion and suggest methods for teaching it. This paper offers a possible solution to the problem by (1) expressing recursive definitions through base operations, which have been predefined as a set of base functions, and (2) practising recursion by solving sequences of problems. The base operations are specific to each sequence of problems, resulting in a smooth transition from recursive definitions to recursive functions. Base functions hide the particularities of the concrete programming language and allow the students to focus solely on the formulation of recursive definitions.
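A minimal sketch of the idea in Python, with `head`, `tail`, and `is_empty` as hypothetical base functions (the paper's base operations are specific to each problem sequence):

```python
def head(s):
    """Base operation: first element of a sequence."""
    return s[0]

def tail(s):
    """Base operation: the sequence without its first element."""
    return s[1:]

def is_empty(s):
    """Base operation: emptiness test."""
    return len(s) == 0

# With language particulars hidden behind base functions, the recursive
# definition reads almost like its mathematical formulation:
def length(s):
    return 0 if is_empty(s) else 1 + length(tail(s))

def count(x, s):
    if is_empty(s):
        return 0
    return (1 if head(s) == x else 0) + count(x, tail(s))
```

The student only formulates the recursive case and the base case; slicing, indexing, and other language details stay inside the predefined base functions.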
Abstract:
Graph-based representations have been used with considerable success in computer vision for the abstraction and recognition of object shape and scene structure. Despite this, the methodology available for learning structural representations from sets of training examples is relatively limited. In this paper we take a simple yet effective Bayesian approach to attributed graph learning. We present a naïve node-observation model, where we make the important assumption that the observation of each node and each edge is independent of the others; we then propose an EM-like approach to learn a mixture of these models and a Minimum Message Length criterion for component selection. Moreover, in order to avoid the bias that could arise from a single estimation of the node correspondences, we estimate the sampling probability over all possible matches. Finally we show the utility of the proposed approach on popular computer vision tasks such as 2D and 3D shape recognition. © 2011 Springer-Verlag.
Abstract:
Supply chains comprise complex processes spanning multiple trading partners. The various operations involved generate a large number of events that need to be integrated in order to enable internal and external traceability. Further, provenance of the artifacts and agents involved in supply chain operations is now a key traceability requirement. In this paper we propose a Semantic Web/Linked Data powered framework for the event-based representation and analysis of supply chain activities governed by the EPCIS specification. We specifically show how a new EPCIS event type called "Transformation Event" can be semantically annotated using EEM - the EPCIS Event Model - to generate linked data that can be exploited for internal event-based traceability in supply chains involving transformation of products. For integrating provenance with traceability, we propose a mapping from EEM to PROV-O. We exemplify our approach on an abstraction of the production processes that are part of the wine supply chain.
Abstract:
Gasoline oxygenates (MTBE, methyl tert-butyl ether; DIPE, di-isopropyl ether; ETBE, ethyl tert-butyl ether; TAME, tert-amyl methyl ether) are added to gasoline to boost octane and enhance combustion. The combination of large-scale use, high water solubility, and only minor biodegradability has now resulted in significant gasoline oxygenate contamination occurring in surface, ground, and drinking water systems. The combination of hydroxyl radical formation and the pyrolytic environment generated by ultrasonic irradiation (665 kHz) leads to the rapid degradation of MTBE and other gasoline oxygenates in aqueous media.

The presence of oxygen promotes the degradation processes by rapid reaction with carbon-centered radicals, indicating that radical processes involving O2 are significant pathways. A number of the oxidation products were identified. The formation of products (alcohols, ketones, aldehydes, esters, peroxides, etc.) could be rationalized by mechanisms which involve hydrogen abstraction by OH radical and/or pyrolysis to form carbon-centered radicals which react with oxygen and follow standard oxidation chain processes.

The reactions of N-substituted R-triazolinediones (RTAD; R = CH3 or phenyl) have attracted considerable interest because they exhibit a number of unusual mechanistic characteristics that are analogous to the reactions of singlet oxygen (1O2) and offer an easy route to C-N bond formation. The reactions of triazolinediones with olefins have been widely studied, and aziridinium imides are generally accepted to be the reactive intermediates.

We observed the rapid formation of an unusual intermediate upon mixing tetracyclopropylethylene with 4-methyl-1,2,4-triazoline-3,5-dione in CDCl3. Detailed characterization by NMR (proton, 13C, 2-D NMR) indicates the intermediate is 5,5,6,6-tetracyclopropyl-3-methyl-5,6-dihydro-oxazolo[3,2-b][1,2,4]-triazolium-2-olate. Such products are extremely rare and have not been studied. Upon warming, the intermediate is converted to a 2+2 diazetidine (major) and an ene product (minor).

To further explore the kinetics and dynamics of the reaction, activation energies were obtained using Arrhenius plots. The activation energies for the formation of the intermediate from the reactants, and of the 2+2 adduct from the intermediate, were determined as 7.48 kcal mol⁻¹ and 19.8 kcal mol⁻¹, with pre-exponential factors of 2.24 × 10⁵ dm³ mol⁻¹ sec⁻¹ and 2.75 × 10⁸ sec⁻¹, respectively, indicating net slow reactions because of the low pre-exponential values caused by steric hindrance.
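The activation energies and pre-exponential factors above come from Arrhenius plots; the standard relation underlying such plots (textbook kinetics, not restated in the source) is:

```latex
k = A\, e^{-E_a / RT}
\qquad\Longrightarrow\qquad
\ln k = \ln A - \frac{E_a}{R}\,\frac{1}{T}
```

Plotting ln k against 1/T gives a straight line whose slope is −Ea/R and whose intercept is ln A, which is how both the reported Ea values and the pre-exponential factors are extracted from rate measurements at several temperatures.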
Abstract:
Math storybooks are picture books in which the understanding of mathematical concepts is central to the comprehension of the story. Math stories have provided useful opportunities for children to expand their skills in the language arts area and to talk about mathematical factors that are related to their real lives. The purpose of this study was to examine bilingual children's reading and math comprehension of math storybooks.

The participants were randomly selected from two Korean schools and two public elementary schools in Miami, Florida. The sample consisted of 63 Hispanic American and 43 Korean American children from ages five to seven. A 2 x 3 x (2) mixed-model design with two between-subjects variables and one within-subjects variable was used to conduct this study. The two between-subjects variables were ethnicity and age, and the within-subjects variable was the subject area of comprehension. Subjects were read the three math stories individually, and then they were asked questions related to reading and math comprehension.

The overall ANOVA using multivariate tests was conducted to evaluate the factor of subject area for age and ethnicity. As follow-up tests for a significant main effect and a significant interaction effect, pairwise comparisons and simple main effect tests were conducted, respectively.

The results showed that there were significant ethnicity and age differences in total comprehension scores. There were also age differences in reading and math comprehension, but no significant differences were found in reading and math by ethnicity. Korean American children had higher total comprehension scores than Hispanic American children, and they showed greater changes in their comprehension skills at the younger ages, from five to six, whereas Hispanic American children showed greater changes at the older ages, from six to seven. Children at ages five and six showed higher scores in reading than in math, but no significant differences between math and reading comprehension scores were found at age seven.

Through schooling with integrated instruction, young bilingual children can move into higher levels of abstraction and concepts. This study highlighted bilingual children's general nature of thinking and showed how they developed reading and mathematics comprehension in an integrated process.
Abstract:
Today, the development of domain-specific communication applications is both time-consuming and error-prone because the low-level communication services provided by existing systems and networks are primitive and often heterogeneous. Multimedia communication applications are typically built on top of low-level network abstractions such as TCP/UDP sockets, SIP (Session Initiation Protocol), and RTP (Real-time Transport Protocol) APIs. The User-centric Communication Middleware (UCM) is proposed to encapsulate the networking complexity and heterogeneity of basic multimedia and multi-party communication for upper-layer communication applications. UCM provides a unified user-centric communication service to diverse communication applications, ranging from simple phone calls and video conferencing to specialized applications like disaster management and telemedicine, thereby easing the development of domain-specific communication applications. The UCM abstraction and API are proposed to achieve these goals. The dissertation also integrates formal methods into the UCM development process. A formal model of UCM is created using the SAM methodology; several design errors were found during model creation because the formal method forces a precise description of UCM. Using the SAM tool, the formal UCM model is translated into a Promela model. In the dissertation, system properties are defined as temporal logic formulas, which are manually translated into Promela formulas, individually integrated with the Promela model of UCM, and verified using the SPIN tool. The formal analysis helps verify system properties (for example, the multiparty multimedia protocol) and uncover system bugs.
Abstract:
The total time a customer spends in the business process system, called the customer cycle-time, is a major contributor to overall customer satisfaction. Business process analysts and designers are frequently asked to design process solutions with optimal performance. Simulation models have been very popular for quantitatively evaluating business processes; however, simulation is time-consuming and requires extensive modeling experience. Moreover, simulation models neither provide recommendations nor yield optimal solutions for business process design. A queueing network model is a good analytical approach to business process analysis and design, and can provide a useful abstraction of a business process. However, existing queueing network models were developed based on telephone systems or applied to manufacturing processes in which machine servers dominate the system. In a business process, the servers are usually people, and the characteristics of human servers, i.e. specialization and coordination, should be taken into account by the queueing model.

The research described in this dissertation develops an open queueing network model for quick analysis of business processes. Additionally, optimization models are developed to provide optimal business process designs. The queueing network model extends and improves upon existing multi-class open queueing network models (MOQN) so that customer flow in human-server oriented processes can be modeled. The optimization models help business process designers find the optimal design of a business process with consideration of specialization and coordination.

The main findings of the research are, first, that parallelization can reduce the cycle-time for those customer classes that require more than one parallel activity; however, the coordination time due to parallelization overwhelms the savings from parallelization under high server utilization, since the waiting time increases significantly and thus the cycle-time increases. Third, the level of industrial technology employed by a company and the coordination time to manage the tasks have the strongest impact on business process design: when the level of industrial technology employed by the company is high, more division is required to improve the cycle-time; when the coordination time required is high, consolidation is required to improve the cycle-time.
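The role of waiting time in the first finding rests on the nonlinear growth of queueing delay with utilization; a toy sketch using the single-server M/M/1 formula (a deliberate simplification, since the dissertation develops a multi-class open queueing network model):

```python
def mm1_time_in_system(lam, mu):
    """Mean time in an M/M/1 system (waiting + service): W = 1/(mu - lam).

    lam: arrival rate, mu: service rate; only valid while lam < mu.
    """
    assert lam < mu, "utilization rho = lam/mu must be below 1"
    return 1.0 / (mu - lam)

# Delay explodes as utilization rho = lam/mu approaches 1, which is why a
# coordination step introduced by parallelization, itself a queue, can
# overwhelm the parallelization savings under heavily utilized servers.
low = mm1_time_in_system(0.5, 1.0)    # rho = 0.5
high = mm1_time_in_system(0.95, 1.0)  # rho = 0.95
```

Here a tenfold increase in time-in-system (from 2 to 20 time units) comes from raising utilization from 0.5 to 0.95 with the same server.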
Abstract:
In an overcapacity world, where customers can choose from many similar products to satisfy their needs, enterprises are looking for new approaches and tools that can help them not only maintain but also increase their competitive edge. Innovation, flexibility, quality, and service excellence are required to, at the very least, survive the ongoing transition that industry is experiencing from mass production to mass customization. In order to help these enterprises, this research develops a Supply Chain Capability Maturity Model named S(CM)2. The Supply Chain Capability Maturity Model is intended to model, analyze, and improve the supply chain management operations of an enterprise. It provides a clear roadmap for enterprise improvement, covering multiple views and abstraction levels of the supply chain, and supplies tools to aid the firm in making improvements. The principal research tool applied is the Delphi method, which systematically gathered the knowledge and experience of eighty-eight experts in Mexico. The model is validated using a case study and interviews with experts in supply chain management. The resulting contribution is a holistic model of the supply chain integrating multiple perspectives and providing a systematic procedure for the improvement of a company's supply chain operations.
Abstract:
In the past two decades, multi-agent systems (MAS) have emerged as a new paradigm for conceptualizing large and complex distributed software systems. A multi-agent system view provides a natural abstraction for both the structure and the behavior of modern-day software systems. Although there were many conceptual frameworks for using multi-agent systems, there was no well-established and widely accepted method for modeling them. This dissertation research addressed the representation and analysis of multi-agent systems based on model-oriented formal methods. The objective was to provide a systematic approach for studying MAS at an early stage of system development to ensure the quality of design.

Given that there was no well-defined formal model directly supporting agent-oriented modeling, this study was centered on three main topics: (1) adapting a well-known formal model, predicate transition nets (PrT nets), to support MAS modeling; (2) formulating a modeling methodology to ease the construction of formal MAS models; and (3) developing a technique to support machine analysis of formal MAS models using model checking technology. PrT nets were extended to include the notions of dynamic structure, agent communication, and coordination to support agent-oriented modeling. An aspect-oriented technique was developed to address the modularity of agent models and the compositionality of incremental analysis. A set of translation rules was defined to systematically translate formal MAS models to concrete models that can be verified through the model checker SPIN (Simple Promela Interpreter).

This dissertation presents the framework developed for modeling and analyzing MAS, including a well-defined process model based on nested PrT nets, and a comprehensive methodology to guide the construction and analysis of formal MAS models.
Abstract:
Over the past few decades, we have been enjoying tremendous benefits thanks to the revolutionary advancement of computing systems, driven mainly by remarkable semiconductor technology scaling and increasingly complicated processor architectures. However, the exponentially increased transistor density has directly led to exponentially increased power consumption and dramatically elevated system temperature, which not only adversely impacts the system's cost, performance, and reliability, but also increases leakage and thus the overall power consumption. Today, power and thermal issues pose enormous challenges and threaten to slow down the continuous evolution of computer technology. Effective power/thermal-aware design techniques are urgently needed at all design abstraction levels, from the circuit level and logic level to the architectural level and system level.

In this dissertation, we present our research efforts to employ real-time scheduling techniques to solve resource-constrained power/thermal-aware design-optimization problems. In our research, we developed a set of simple yet accurate system-level models to capture the processor's thermal dynamics as well as the interdependency of leakage power consumption, temperature, and supply voltage. Based on these models, we investigated the fundamental principles of power/thermal-aware scheduling and developed real-time scheduling techniques targeting a variety of design objectives, including peak temperature minimization, overall energy reduction, and performance maximization.

The novelty of this work is that we integrate cutting-edge research on power and thermal behavior at the circuit and architectural levels into a set of accurate yet simplified system-level models, and are able to conduct system-level analysis and design based on these models. The theoretical study in this work serves as a solid foundation for guiding the development of power/thermal-aware scheduling algorithms in practical computing systems.
Abstract:
In "Service Management Concepts: Implications for Hospitality Management", a study by K. Michael Haywood, Associate Professor, School of Hotel and Food Administration, University of Guelph, Ontario, Canada, Haywood initially proffers: "The study and application of hospitality management has progressed on its own for many years; however, managers are not immune to the knowledge gained from study of other service industries. The author synthesizes what is happening in the area of service management, looks at its relevance to hospitality management, and identifies a few important implications of service management for hospitality managers." The author draws a distinction between service management in general and service management as it applies to the hospitality industry. This is done to make the comparison apparent, as many people would assume the two are one and the same. They are not, and the contrast works well here. "While much of what we already know about effective management applies to service industries, some of the traditional concepts of management are inadequate in solving the problems faced by service businesses," Haywood points out. "If a body of knowledge to be known as service management already exists, or is being developed, where does it fit relative to hospitality management?" Haywood asks. According to John Bateson, in Testing a Conceptual Framework for Consumer Service Marketing, there are four criteria used to judge service management; Haywood details these for the reader by way of citation. Haywood points to the difficulty of pinpointing the intangibles that underpin the service industry. Since service is a concept rather than a tangible good, such as inventory, problems arise for both the organization and the client. Haywood points to a classic study of four service industries in France to illustrate the problems, although no realistic suggestions address the issues.
“Over the past few years a variety of system models have been developed to explain the service process, that is, how the service is designed, produced, delivered, and consumed,” Haywood offers. These models are depicted in Appendices A-E. In offering perspectives on how the hospitality industry can gain from the experiences of service management, Haywood observes: “Service management places particular emphasis on a strategic outlook. Hospitality firms would be wise to carefully examine how they are perceived in the marketplace vis-a-vis their service concept, position, competitive situation, and management’s leadership abilities.” “Learning from the experiences of other service firms can help keep a company on track, that is, providing needed and valued services,” he closes the thought.