982 results for Agent Oriented Modeling
Abstract:
WiMAX has been introduced as a competitive alternative for metropolitan broadband wireless access technologies. It is connection oriented and can provide very high data rates, large service coverage, and flexible quality of service (QoS). Due to the large number of connections and flexible QoS supported by WiMAX, uplink access in WiMAX networks is very challenging, since the medium access control (MAC) protocol must efficiently manage the bandwidth and related channel allocations. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed bandwidth management scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. The reason for using MEAM is that it can efficiently model a large-scale system in which the number of stations or connections is generally very high, whereas traditional simulation and analytical approaches (e.g., Markov models) do not perform well due to their high computational complexity. We model the bandwidth management scheme as a queuing network model (QNM) that consists of interacting multiclass queues for different service classes. Closed-form expressions for the state and blocking probability distributions are derived for both schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
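The closed-form expressions themselves are not reproduced in the abstract. Purely as orientation, the Python sketch below computes the textbook state and blocking probabilities of a single finite-capacity M/M/1/N queue, the kind of building block such a queueing-network analysis composes; the arrival rate, service rate, and buffer size are invented, and this is not the paper's maximum-entropy model.

```python
# Illustrative sketch only: the standard closed-form state/blocking
# distribution of a single M/M/1/N queue, a typical ingredient of QNM analyses.

def mm1n_state_probs(lam: float, mu: float, capacity: int) -> list[float]:
    """Return state probabilities p_0..p_N of an M/M/1/N queue."""
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:                 # degenerate case: uniform
        return [1.0 / (capacity + 1)] * (capacity + 1)
    norm = (1 - rho) / (1 - rho ** (capacity + 1))
    return [norm * rho ** n for n in range(capacity + 1)]

lam, mu, N = 8.0, 10.0, 20     # hypothetical arrival/service rates and buffer
probs = mm1n_state_probs(lam, mu, N)
print(f"blocking probability p_N = {probs[-1]:.6f}")  # arrivals to a full queue are lost
```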
Abstract:
Dance videos are interesting and semantics-intensive. At the same time, they are among the most complex types of video compared to other types such as sports, news, and movie videos. In fact, dance video is one of the genres least explored by researchers across the globe. Dance videos exhibit rich semantics, such as macro features and micro features, and can be classified into several types. Hence, the conceptual modeling of the expressive semantics of dance videos is both crucial and complex. This paper presents a generic Dance Video Semantics Model (DVSM) to represent the semantics of dance videos at different granularity levels, identified by the components of the accompanying song. The model incorporates both syntactic and semantic features of the videos and introduces a new entity type, called Agent, to specify the micro features of dance videos. Instantiations of the model are expressed as graphs. The model is implemented as a tool using J2SE and JMF to annotate the macro and micro features of dance videos. Finally, examples and evaluation results are provided to demonstrate the effectiveness of the proposed dance video model.
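The abstract names the DVSM ingredients (granularity levels tied to song components, an Agent entity for micro features, graph instantiations) without defining a schema, so the following Python sketch is hypothetical: every node type, edge label, and attribute is invented, merely to suggest how one graph instantiation of such a model could look.

```python
# Hypothetical DVSM-style graph instantiation; all names are invented.
import networkx as nx

g = nx.DiGraph()
g.add_node("song1", type="Song", title="example song")
g.add_node("seg1", type="Segment", level="stanza")        # granularity level
g.add_node("dancer1", type="Agent", role="lead dancer")   # micro-feature entity
g.add_edge("song1", "seg1", relation="has_segment")       # macro structure
g.add_edge("dancer1", "seg1", relation="performs_in")
g.nodes["dancer1"]["gesture"] = "spin"                    # annotated micro feature

for u, v, data in g.edges(data=True):
    print(u, f"--{data['relation']}->", v)
```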
Abstract:
The purpose of this paper is to explore the possibility of applying existing formal theories of the description and design of distributed and concurrent systems to interaction protocols for real-time multi-agent systems. In particular, it is shown how the language PRALU, proposed for the description of parallel logical control algorithms and rooted in the Petri net formalism, can be used to model complex concurrent conversations between agents in a multi-agent system. This is demonstrated with the well-known example of an English auction, showing how to specify an agent interaction protocol using the considered means.
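PRALU's notation is not shown in the abstract, so the sketch below is not PRALU: it renders the gist of the English-auction example as a token-passing state machine in Python, loosely in the spirit of the underlying Petri net formalism, with invented place and transition names.

```python
# English auction as a toy place/transition net: a transition fires only when
# its input place holds a token, mimicking Petri net enabling rules.
places = {"announced": 1, "bidding": 0, "closed": 0}      # initial marking
transitions = {
    "open_bidding": ("announced", "bidding"),
    "close_auction": ("bidding", "closed"),
}

def fire(name: str) -> None:
    """Move a token from the transition's input place to its output place."""
    src, dst = transitions[name]
    if places[src] == 0:
        raise RuntimeError(f"transition {name} is not enabled")
    places[src] -= 1
    places[dst] += 1

fire("open_bidding")      # auctioneer opens the bidding round
fire("close_auction")     # no higher bid arrives: auctioneer closes
print(places)             # {'announced': 0, 'bidding': 0, 'closed': 1}
```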
Abstract:
Certain theoretical and methodological problems of designing real-time dynamic expert systems, which belong to the class of the most complex integrated expert systems, are discussed. Primary attention is given to the problems of designing subsystems for modeling the external environment in the case where the environment is represented by complex engineering systems. A specific approach to designing simulation models for complex engineering systems is proposed, and examples of the application of this approach based on the G2 (Gensym Corp.) tool system are described.
Abstract:
In this article, we review the means for visualizing the syntax, semantics, and source code of programming languages that support the procedural and/or object-oriented paradigm. We examine how the structure of source code in the structured and object-oriented programming styles has influenced the different approaches to teaching them. We maintain a thesis, valid for the object-oriented programming paradigm, that class design and class programming are carried out by the same specialist, and that the training of this specialist should therefore include design as well as programming skills and knowledge of modeling abstract data structures. We pose the question of how the high level of abstraction in the object-oriented paradigm can be presented as a simple model at the design stage, so that complexity at the programming stage stays low and is easily learnable. We answer this question by building models in UML notation, taking a concrete example from teaching practice that involves programming techniques for inheritance and polymorphism.
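The article's own classroom example is not included in the abstract; as a generic stand-in, here is a minimal Python rendering of the two techniques it names, inheritance and polymorphism, of the sort one would first model as a UML class diagram and then implement.

```python
# Generic teaching example (not the article's own): a base class, two
# subclasses that override its operation, and polymorphic dispatch.
import math

class Shape:
    def area(self) -> float:                  # base operation to be overridden
        raise NotImplementedError

class Rectangle(Shape):                       # inheritance
    def __init__(self, w: float, h: float):
        self.w, self.h = w, h
    def area(self) -> float:                  # overriding
        return self.w * self.h

class Circle(Shape):
    def __init__(self, r: float):
        self.r = r
    def area(self) -> float:
        return math.pi * self.r ** 2

shapes: list[Shape] = [Rectangle(2, 3), Circle(1)]
for s in shapes:                              # polymorphism: late binding
    print(type(s).__name__, s.area())
```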
Abstract:
The aim of this paper is to present the University of Economics – Varna using a 3D model built with 3Ds MAX. Founded on May 14, 1920, the University of Economics – Varna is a cultural institution with a place and style of its own. With the emergence of three-dimensional modeling, we entered a new stage in the evolution of computer graphics. The main goal is to preserve the historical vision while demonstrating forward thinking and the use of future-oriented approaches.
Abstract:
Against a backdrop of ongoing educational reforms that seek to introduce Communicative Language Teaching (CLT) in Albanian primary and secondary state schools, Albanian teachers, among others, are officially required to use communication-based textbooks in their classes. Authorities in a growing number of countries that are seeking to improve and westernise their educational systems are also using communication-based textbooks as agents of change. Behind these actions lies the commonly held belief that textbooks can support teacher learning by providing a visible framework teachers can follow. Communication-based textbooks are used in thousands of EFL classrooms around the world to help teachers to “fully understand and routinize change” (Hutchinson and Torres, 1994:323). However, empirical research on the role materials play in the classroom, and in particular on the role of the textbook as an agent of change, is still scarce, and what does exist is rather inconclusive. This study aims to fill this gap. It is a predominantly qualitative investigation into how and why four Albanian EFL teachers use Western teaching resources in their classes. By investigating the decision-making processes that teachers go through in their teaching, and specifically the relationship between Western-published textbooks, teachers’ decision making, and teachers’ classroom delivery, the current study contributes to an extensive discussion on the development of communicative L2 teaching concepts and methods and on teacher decision making, as well as to a growing discussion on how best to make institutional reforms effective, particularly in East-European ex-communist countries and in other developing countries. Findings from this research indicate that, prompted by the content of Western-published textbooks, the four research participants, who had received little formal training in CLT, incorporated some communicative teaching behaviours into their teaching. The use of communicative textbooks, however, does not seem to account for radical methodological changes in teachers’ practices. Teacher cognitions based on teachers’ previous learning experience are likely to act as a lens through which teachers judge classroom realities. As such, they shape, to a great degree, the decisions teachers make regarding the use of Western-published textbooks in their classes.
Abstract:
There has been increasing interest in the use of agent-based simulation and some discussion of the relative merits of this approach compared to discrete-event simulation. There are differing views on whether agent-based simulation offers capabilities that discrete-event simulation cannot provide, or whether all agent-based applications can, at least in theory, be undertaken using a discrete-event approach. This paper presents a simple agent-based NetLogo model and corresponding discrete-event versions implemented in the widely used ARENA software. The two discrete-event versions presented use, respectively, the traditional process-flow approach normally adopted in discrete-event simulation software and an agent-based approach to the model build. In addition, a real-time spatial visual display facility is provided, using a spreadsheet platform controlled by VBA code embedded within the ARENA model. Initial findings from this investigation are that discrete-event simulation can indeed be used to implement agent-based models and, with suitable integration elements such as VBA, can provide the spatial displays associated with agent-based software.
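Neither the NetLogo model nor its ARENA counterparts are reproduced in the abstract. The Python toy below only illustrates the paper's central claim in miniature: agent behaviour (here, invented random-walking agents on a grid) can be driven entirely by a discrete-event scheduler rather than by synchronous agent-based time steps.

```python
# Agents whose moves are scheduled as discrete events on a priority queue.
import heapq
import random

events: list[tuple[float, int]] = []          # (event time, agent id)
positions = {0: (0, 0), 1: (5, 5)}            # two agents on a grid

for agent_id in positions:
    heapq.heappush(events, (random.random(), agent_id))

HORIZON = 10.0                                # simulation end time
while events:
    t, agent_id = heapq.heappop(events)
    if t > HORIZON:
        break
    x, y = positions[agent_id]
    dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
    positions[agent_id] = (x + dx, y + dy)    # execute the agent's "move" event
    heapq.heappush(events, (t + random.expovariate(1.0), agent_id))

print(positions)
```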
Abstract:
Systems-of-systems (SoS) are systems that result from the interaction of independent constituent systems collaborating to offer new functionalities in order to accomplish global missions. Each constituent system accomplishes its own individual missions and is able to contribute to the achievement of the global missions of the SoS, both being viewed as sets of associated goals. From the perspective of self-aware systems, SoS need to exhibit goal-awareness, i.e., they need to be aware of their own goals and of how their constituent systems contribute to their accomplishment. In this paper, we revisit goal-oriented concepts with the aim of identifying and modeling goals at both the SoS level and the constituent-system level. Moreover, we take advantage of such goal-oriented models to express the relationships among goals at these levels, as well as to define how each constituent system can contribute to the accomplishment of the global goals of an SoS. In addition, we shed light on important issues related to goal modeling in self-aware SoS to be addressed in future research.
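The abstract names the concepts (goals at two levels, contribution links) without fixing a notation, so the Python sketch below is hypothetical: the goal names, levels, and the contributes-to relation are invented purely to show the shape of such a goal model.

```python
# Hypothetical goal model: constituent-system goals contribute to SoS goals.
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    level: str                                   # "SoS" or "constituent"
    contributes_to: list["Goal"] = field(default_factory=list)

global_goal = Goal("monitor flood risk", "SoS")
sensing = Goal("collect river levels", "constituent", [global_goal])
alerting = Goal("broadcast warnings", "constituent", [global_goal])

# Goal-awareness, minimally: trace which goals feed each global goal.
for g in (sensing, alerting):
    for target in g.contributes_to:
        print(f"{g.name} ({g.level}) -> {target.name} ({target.level})")
```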
Abstract:
Access control (AC) limits access to the resources of a system to authorized entities only. Given that information systems today are increasingly interconnected, AC is extremely important. The implementation of an AC service is a complicated task, yet the requirements for an AC service vary widely. Accordingly, the design of an AC service should be flexible and extensible in order to save development effort and time. Unfortunately, with conventional object-oriented techniques, when an extension has not been anticipated at design time, the modification incurred by the extension is often invasive. Invasive changes destroy design modularity, further deteriorate design extensibility, and, even worse, reduce product reliability. A concern is crosscutting if it spans multiple object-oriented classes. It was identified that invasive changes were due to the crosscutting nature of most unplanned extensions. To overcome this problem, an aspect-oriented design approach for AC services was proposed, since aspect-oriented techniques can effectively encapsulate crosscutting concerns. The proposed approach was applied to develop an AC framework that supported the role-based access control model. In the framework, the core role-based access control mechanism is given in an object-oriented design, while each extension is captured as an aspect. The resulting framework is well modularized, flexible, and, most importantly, supports noninvasive adaptation. In addition, a process to formalize the aspect-oriented design was described, the purpose being to provide high assurance for AC services. Object-Z was used to specify the static structure, and Predicate/Transition nets were used to model the dynamic behavior; Object-Z was extended to facilitate specification in an aspect-oriented style. The process of formal modeling helps designers enhance their understanding of the design and hence detect problems. Furthermore, the specification can be mathematically verified, which provides confidence that the design is correct. It was illustrated through an example that the model was ready for formal analysis.
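The dissertation's framework (and its Object-Z/Petri net formalization) is not reproduced in the abstract. As a loose illustration of the design idea, the Python sketch below uses a decorator in the role of an aspect: the role-based check is woven around a business method instead of being written into the class, so the AC concern stays modular. All names and the role table are invented.

```python
# A decorator standing in for an "around" aspect that enforces RBAC.
from functools import wraps

role_assignments = {"alice": {"admin"}, "bob": {"viewer"}}   # hypothetical

def requires_role(role: str):
    def aspect(method):
        @wraps(method)
        def advice(self, user: str, *args, **kwargs):        # around advice
            if role not in role_assignments.get(user, set()):
                raise PermissionError(f"{user} lacks role {role!r}")
            return method(self, user, *args, **kwargs)
        return advice
    return aspect

class Document:
    @requires_role("admin")                  # AC concern kept out of the body
    def delete(self, user: str) -> None:
        print(f"{user} deleted the document")

Document().delete("alice")                   # permitted
# Document().delete("bob")                   # would raise PermissionError
```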
Abstract:
A two-phase, three-dimensional computational model of an intermediate-temperature (120–190 °C) proton exchange membrane (PEM) fuel cell is presented. This represents the first attempt to model PEM fuel cells employing intermediate-temperature membranes, in this case phosphoric-acid-doped polybenzimidazole (PBI). To date, mathematical modeling of PEM fuel cells has been restricted to low-temperature operation, especially to cells employing Nafion® membranes, while research on PBI as an intermediate-temperature membrane has been solely at the experimental level. This work therefore advances the state of the art in both fields of research. With a growing trend toward higher-temperature operation of PEM fuel cells, mathematical modeling of such systems is necessary to help hasten the development of the technology and to highlight areas where research should be focused. The mathematical model accounted for all the major transport and polarization processes occurring inside the fuel cell, including the two-phase phenomenon of gas dissolution in the polymer electrolyte. Results were presented for polarization performance, flux distributions, concentration variations in both the gaseous and aqueous phases, and temperature variations under various heat management strategies. The model predictions matched well with published experimental data and were self-consistent. The major finding of this research was that, due to the transport limitations imposed by the use of phosphoric acid as a doping agent, namely the low solubility and diffusivity of dissolved gases and anion adsorption onto catalyst sites, catalyst utilization is very low (∼1–2%). Significant cost savings were predicted with the use of advanced catalyst deposition techniques that would greatly reduce the thickness of the catalyst layer and subsequently improve catalyst utilization. The model also predicted that an increase in power output on the order of 50% can be expected if alternative doping agents to phosphoric acid can be found that afford better transport properties for dissolved gases, reduce anion adsorption onto catalyst sites, and maintain stability and conductivity at elevated temperatures.
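The dissertation's three-dimensional two-phase model is far beyond an abstract-level sketch; purely as orientation, the snippet below evaluates a zero-dimensional polarization relation (open-circuit voltage minus Tafel activation and ohmic losses), the standard kind of curve such models are benchmarked against. Every parameter value here is invented, not taken from the work.

```python
# Zero-dimensional polarization sketch: V(i) = E_oc - b*log10(i/i0) - i*R.
import math

E_OC = 0.95      # open-circuit voltage, V            (invented value)
B = 0.11         # Tafel slope, V/decade              (invented value)
I0 = 1e-4        # exchange current density, A/cm^2   (invented value)
R = 0.25         # area-specific resistance, ohm*cm^2 (invented value)

def cell_voltage(i: float) -> float:
    """Cell voltage at current density i (A/cm^2)."""
    return E_OC - B * math.log10(i / I0) - i * R

for i in (0.01, 0.1, 0.5, 1.0):
    print(f"i = {i:4.2f} A/cm^2 -> V = {cell_voltage(i):.3f} V")
```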
Abstract:
The Unified Modeling Language (UML) has quickly become the industry standard for object-oriented software development and is widely used in organizations and institutions around the world. However, UML is often found to be too complex for novice systems analysts. Although prior research has identified difficulties novice analysts encounter in learning UML, no viable solution has been proposed to address them. Sequence-diagram modeling, in particular, has largely been overlooked. The sequence diagram models the behavioral aspects of an object-oriented software system in terms of interactions among its building blocks, i.e., objects and classes, and is one of the most commonly used UML diagrams in practice. However, there has been little research on sequence-diagram modeling, and the current literature scarcely provides effective guidelines for developing a sequence diagram. Such guidelines would greatly benefit novice analysts who, unlike experienced systems analysts, do not possess the relevant prior experience needed to easily learn how to develop a sequence diagram. There is thus a need for an effective sequence-diagram modeling technique for novices. This dissertation reports a research study that identified novice difficulties in modeling a sequence diagram and proposed a technique called CHOP (CHunking, Ordering, Patterning), designed to reduce cognitive load by addressing the cognitive complexity of sequence-diagram modeling. The CHOP technique was evaluated in a controlled experiment against a technique recommended in a well-known textbook, which was found to be representative of the approaches provided in many textbooks as well as in the practitioner literature. The results indicated that novice analysts were able to perform better using the CHOP technique; this outcome seems to have been enabled by the pattern-based heuristics the technique provides. Meanwhile, novice analysts rated the CHOP technique as more useful, although not significantly easier to use, than the control technique. The study established that CHOP is an effective sequence-diagram modeling technique for novice analysts.
Abstract:
This research is based on the premises that teams can be designed to optimize their performance and that appropriate team coordination is a significant factor in team outcome performance. Contingency theory argues that the effectiveness of a team depends on the right fit of the team design factors to the particular job at hand. Therefore, organizations need computational tools capable of predicting the performance of different team configurations. This research created an agent-based model of teams called the Team Coordination Model (TCM). The TCM estimates the coordination load and performance of a team based on its composition, its coordination mechanisms, and the job’s structural characteristics. The TCM can be used to determine the team design characteristics most likely to lead the team to optimal performance. The TCM is implemented as an agent-based discrete-event simulation application built using JAVA and the Cybele Pro agent architecture. The model implements the effects of individual team design factors on team processes, but the resulting performance emerges from the behavior of the agents. The team member agents use decision making as well as explicit and implicit mechanisms to coordinate the job. Model validation included comparing the TCM’s results with statistics from a real team and with the results predicted by the team performance literature. An illustrative 2^(6-1) fractional factorial experimental design demonstrates the application of the simulation model to the design of a team. The results of the ANOVA analysis were used to recommend the combination of levels of the experimental factors that optimizes the completion time for a team that runs sailboat races. The main contribution of this research to the team modeling literature is a model capable of simulating teams working in complex job environments. The TCM implements a stochastic job structure model capable of capturing some of the complexity not captured by current models: in a stochastic job structure, the tasks required to complete the job change during the team’s execution of the job. This research proposes three new types of dependencies between tasks required to model a job as a stochastic structure: conditional sequential, single-conditional sequential, and merge dependencies.
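The TCM itself (JAVA, Cybele Pro) is not reproduced in the abstract; the Python sketch below only illustrates the notion of a stochastic job structure with a conditional-sequential dependency, in which the task that follows a finished task is resolved at run time. The task names and branch probabilities are invented.

```python
# A stochastic job structure: successors are sampled while the job runs.
import random

# Successor lists with branch probabilities (conditional sequential dependency).
job = {
    "plan_route":  [("trim_sails", 1.0)],
    "trim_sails":  [("tack", 0.6), ("hold_course", 0.4)],  # one branch fires
    "tack":        [],
    "hold_course": [],
}

def run_job(start: str) -> list[str]:
    """Walk the job structure, resolving stochastic branches as we go."""
    done, frontier = [], [start]
    while frontier:
        task = frontier.pop(0)
        done.append(task)
        if job[task]:
            names, weights = zip(*job[task])
            frontier.append(random.choices(names, weights=weights)[0])
    return done

print(run_job("plan_route"))   # e.g. ['plan_route', 'trim_sails', 'tack']
```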
Abstract:
Taylor Slough is one of the natural freshwater contributors to Florida Bay through a network of microtidal creeks crossing the Everglades Mangrove Ecotone Region (EMER). The ecological function of the EMER is critical, since it mediates freshwater and nutrient inputs and controls the water quality in eastern Florida Bay. Furthermore, this region is vulnerable to changing hydrodynamics and nutrient loadings resulting from the upstream freshwater management practices proposed by the Comprehensive Everglades Restoration Plan (CERP), currently the largest wetland restoration project in the USA. Despite the hydrological importance of Taylor Slough in the water budget of Florida Bay, there are no fine-scale (∼1 km²) hydrodynamic models of this system that can be used as a tool to evaluate potential changes in water flow, salinity, and water quality. Taylor River is one of the major creeks draining Taylor Slough freshwater into Florida Bay. We performed a water budget analysis for the Taylor River area based on long-term hydrologic data (1999–2007), supplemented by hydrodynamic modeling using a MIKE FLOOD (DHI, http://dhigroup.com/) model to evaluate groundwater and overland water discharges. The seasonal hydrologic characteristics are very distinctive (the ratio of average Taylor River wet-season to dry-season outflow was 6 to 1 during 1999–2006), with pronounced interannual variability of flow. The water budget shows a net dominance of through-flow in the tidal mixing zone, while local precipitation and evapotranspiration play only a secondary role, at least in the wet season. During the dry season, the tidal flood reaches the upstream boundary of the study area on approximately 80 days per year on average. The groundwater field measurements indicate mostly upward-oriented leakage, which possibly equals the evapotranspiration term. The model results suggest that groundwater contributes substantially to water salinity in the EMER. Model performance is satisfactory during the dry season, when surface flow in the area is confined to the Taylor River channel. The model also provided guidance on the importance of capturing the overland flow component, which enters the area as sheet flow during the rainy season. Overall, the modeling approach is suitable for reaching a better understanding of the water budget in the mangrove region. However, more detailed field data are needed to confirm model predictions by further calibrating the overland flow parameters.
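The paper's budget terms are summarized rather than written out in the abstract; for orientation, a water budget analysis of this kind rests on the standard storage balance over the control volume, shown here with conventional symbols (not the paper's notation):

```latex
% Change in storage = precipitation - evapotranspiration
%                     + net surface through-flow + net groundwater exchange
\frac{dS}{dt} = P - ET + \left(Q_{\mathrm{in}} - Q_{\mathrm{out}}\right) + G
```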
Abstract:
Over the past five years, XML has been embraced by both the research and industrial communities due to its promising prospects as a new data representation and exchange format on the Internet. The widespread popularity of XML creates an increasing need to store XML data in persistent storage systems and to enable sophisticated XML queries over the data. The currently available approaches to XML storage and retrieval are either not yet mature enough (e.g., native approaches) or cause inflexibility, heavy fragmentation, and excessive join operations (e.g., non-native approaches such as the relational database approach). In this dissertation, I studied the issue of storing and retrieving XML data using the Semantic Binary Object-Oriented Database System (Sem-ODB), to leverage the advanced Sem-ODB technology for the emerging XML data model. First, a meta-schema-based approach was implemented to address the data model mismatch inherent in non-native approaches. The meta-schema-based approach captures the metadata of both Document Type Definitions (DTDs) and Sem-ODB Semantic Schemas, thus enabling a dynamic and flexible mapping scheme. Second, a formal framework was presented to ensure precise and concise mappings; in this framework, both the schemas and the conversions between them are formally defined and described. Third, after the major features of the XML query language XQuery were analyzed, a high-level XQuery-to-Semantic-SQL (Sem-SQL) query translation scheme was described. This translation scheme takes advantage of the navigation-oriented query paradigm of Sem-SQL, thus avoiding the excessive-join problem of relational approaches. Finally, the modeling capability of the Semantic Binary Object-Oriented Data Model (Sem-ODM) was explored from the perspective of conceptually modeling an XML Schema using a Semantic Schema. It was revealed that the advanced features of the Sem-ODB, such as multi-valued attributes, surrogates, and the navigation-oriented query paradigm, among others, are indeed beneficial in coping with the XML storage and retrieval issue using a non-XML approach. Furthermore, extensions to the Sem-ODB to make it work more effectively with XML data were also proposed.
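The dissertation's meta-schema is not reproduced in the abstract. The Python toy below merely suggests the kind of schema-level correspondence such an approach records: DTD element declarations are mapped onto semantic-schema categories whose attributes absorb XML attributes and leaf children. The element names and the mapping rule are invented.

```python
# Toy DTD -> semantic-schema mapping; not the dissertation's algorithm.
dtd = {
    "book":   {"children": ["title", "author"], "attrs": ["isbn"]},
    "title":  {"children": [], "attrs": []},
    "author": {"children": [], "attrs": []},
}

def dtd_to_semantic_schema(dtd: dict) -> dict:
    """Each element becomes a category; XML attributes and leaf children
    become (possibly multi-valued) attributes of that category."""
    schema = {}
    for elem, info in dtd.items():
        attrs = list(info["attrs"])
        for child in info["children"]:
            if not dtd[child]["children"]:       # leaf child -> attribute
                attrs.append(child)
        schema[elem] = {"attributes": attrs}
    return schema

print(dtd_to_semantic_schema(dtd))
# -> {'book': {'attributes': ['isbn', 'title', 'author']},
#     'title': {'attributes': []}, 'author': {'attributes': []}}
```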