12 results for Simulation Theory

in Deakin Research Online - Australia


Relevance: 60.00%

Abstract:

In this paper I argue that many of the core phenomenological insights, including the emphasis on direct perception, are a necessary but not sufficient condition for an adequate account of inter-subjectivity today. I take it that an adequate account of inter-subjectivity must involve substantial interaction with empirical studies, notwithstanding the putative methodological differences between phenomenological description and scientific explanation. As such, I will need to explicate what kind of phenomenology survives, and indeed, thrives, in a milieu that necessitates engagement with the relevant sciences, albeit not necessarily deference to them. There will be two central aims to this paper: 1. to defend the centrality and vitality of phenomenological treatments of inter-subjectivity via a consideration of some remarks in Sartre - which I do think possess a non-trivial unity amongst the various interlocutors - and the manner in which they in fact serve to provide the basis for a better explanation of an array of empirical data than existing inferentialist or mindreading accounts of social cognition (notably Theory Theory, Simulation Theory, and hybrid versions); 2. to offer the methodological resources for renewing phenomenology in a manner that acknowledges ostensibly non-phenomenological moments in theory production - which involve explanation, inference to the best explanation, etc. - but does not abandon phenomenology for all that, allowing it to be simply absorbed into empirical explanation or other forms of philosophical analysis without remainder.

Relevance: 40.00%

Abstract:

A practical teaching difficulty provided the opportunity to turn a problem into a useful case study with generic implications for the pedagogical effectiveness of simulation games in teaching entrepreneurship. Students playing the simulation game submitted written assessments that became the units of analysis for a single-case research project. Analysis produced a grounded theory consisting of four attribute categories and associated properties required of a simulation game to make it an effective teaching device in entrepreneurship contexts. The theory provides at the very least a useful checklist for teachers of entrepreneurship and, potentially, a basis for developing a quality standard for educational simulation games.

Relevance: 40.00%

Abstract:

A practical teaching difficulty arising in a conducive environment provided an opportunity to turn the particular problem into a case study with generic implications. Little research has been conducted on the use or effectiveness of simulation games for teaching entrepreneurship. In the context of established literature critiquing the effectiveness of simulation games as teaching devices in managerial contexts, and at a point where problems in using a simulation game as part of an entrepreneurship course became evident, the authors designed and executed a single-case research project to generate initial theoretical propositions about the pedagogical effectiveness of simulation games for teaching various concepts and aspects of entrepreneurship. The case analysed the perceived learning environment created when the Sky-High simulation was played by 41 MBA students taking an elective entrepreneurship course at INSEAD. The research design embodied the established methodological principles specified by Yin (1989) for effective case research. Theory building was based upon the grounded-theorizing procedures articulated by Glaser and Strauss (1967). Analysis and synthesis produced a grounded theory, in the form of a normative argument, containing four attribute categories and associated properties required of a simulation game to make it an efficacious teaching device in entrepreneurship contexts. The establishment of this grounded theory has made it both desirable and feasible to contemplate the creation of an ISO quality standard for educational simulation games.

Relevance: 30.00%

Abstract:

Kanban Control Systems (KCS) have become a widely accepted form of inventory and production control. The creation of realistic Discrete Event Simulation (DES) models of KCS requires specification of both information and material flow. Several commercially available simulation packages are able to model these systems, although the use of an application-specific modelling language provides a means for rapid model development. A new Kanban-specific simulation language, together with a high-speed execution engine, is verified in this paper through the simulation of a single-stage, single-part-type production line. A single-stage, single-part KCS is modelled with exhaustive enumeration of the decision variables: container size and number of kanbans. Several performance measures were used to determine the robustness of the control system: the 95% Confidence Interval (CI) of container Flow Time (FT), mean line throughput, and the Coefficient of Variation (CV) of FT and Cycle Time.
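
The exhaustive enumeration the abstract describes can be sketched with a generic DES tool. The Python fragment below, written with the simpy library, is an illustrative single-stage, single-part-type kanban loop with made-up parameters (PROC_TIME, DEMAND_RATE) and a simplified flow-time definition (production start to withdrawal); it is not the Kanban-specific language or execution engine the paper verifies, only a minimal sketch of how the two decision variables can be swept and the CI/CV measures collected.

```python
import random
import statistics
import simpy  # generic discrete-event simulation library, not the paper's engine

PROC_TIME = 1.0    # assumed mean processing time per container
DEMAND_RATE = 0.8  # assumed container withdrawal rate

def stage(env, kanbans, finished):
    """Single production stage: each free kanban card authorises one container."""
    while True:
        card = yield kanbans.get()                 # wait for a free kanban card
        start = env.now
        yield env.timeout(random.expovariate(1.0 / PROC_TIME))
        yield finished.put((card, start))          # container joins the output buffer

def demand(env, kanbans, finished, flow_times):
    """Downstream demand withdraws whole containers and frees their kanban cards."""
    while True:
        yield env.timeout(random.expovariate(DEMAND_RATE))
        card, start = yield finished.get()
        flow_times.append(env.now - start)         # container flow time (simplified)
        yield kanbans.put(card)                    # card returns to the stage

def run(num_kanbans, container_size, sim_time=20_000):
    env = simpy.Environment()
    kanbans = simpy.Store(env, capacity=num_kanbans)
    kanbans.items = [f"card-{i}" for i in range(num_kanbans)]
    finished = simpy.Store(env)
    flow_times = []
    env.process(stage(env, kanbans, finished))
    env.process(demand(env, kanbans, finished, flow_times))
    env.run(until=sim_time)
    mean_ft = statistics.mean(flow_times)
    half_ci = 1.96 * statistics.stdev(flow_times) / len(flow_times) ** 0.5  # normal approx.
    return {
        "throughput (parts/time)": len(flow_times) * container_size / sim_time,
        "FT 95% CI": (round(mean_ft - half_ci, 2), round(mean_ft + half_ci, 2)),
        "FT CV": round(statistics.stdev(flow_times) / mean_ft, 2),
    }

# Exhaustive enumeration of the two decision variables
for k in (1, 2, 3):
    for c in (1, 5, 10):
        print(f"kanbans={k}, container size={c}:", run(k, c))
```

Sweeping the number of kanbans and the container size exhaustively mirrors the enumeration described in the abstract; the confidence interval here uses a normal approximation purely for brevity.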

Relevance: 30.00%

Abstract:

This thesis examines the use of a structured design methodology in the design of asynchronous circuits, so that high-level constructs can be specified purely in terms of signal exchanges and without the intrusion of lower-level concepts. Trace theory is used to specify a multi-processor Forth machine at a high level; part of the design is then further elaborated using trace theory operations to ensure that the behaviours of the lower-level constructs combine to give the high-level specified behaviour without locking or other hazards. A novel form of threaded language is developed to take advantage of the machine architecture. At suitable points the design is tested by simulation. The stack element that is designed is reduced to an electric circuit, which is itself tested by simulation to verify the design.
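
As a toy illustration only of the trace-theory style of composition the abstract mentions (and not the Forth machine design itself), the Python sketch below checks whether a candidate behaviour of a composed circuit lies in the weave of two component trace structures, i.e. whether its projection onto each component's alphabet is a trace of that component. The handshake and latch-control trace sets are invented examples.

```python
def project(trace, alphabet):
    """Restrict a trace (sequence of signal events) to the given alphabet."""
    return tuple(e for e in trace if e in alphabet)

def in_weave(trace, spec_a, alpha_a, spec_b, alpha_b):
    """A composed-system trace is allowed iff its projection onto each component's
    alphabet is a trace of that component (the 'weave' of trace theory)."""
    return (set(trace) <= alpha_a | alpha_b
            and project(trace, alpha_a) in spec_a
            and project(trace, alpha_b) in spec_b)

# Invented example: a request/acknowledge handshake composed with a latch control.
alpha_hs = {"req", "ack"}
alpha_latch = {"ack", "latch"}
handshake = {(), ("req",), ("req", "ack")}        # prefix-closed trace sets
latch_ctl = {(), ("ack",), ("ack", "latch")}

print(in_weave(("req", "ack", "latch"), handshake, alpha_hs, latch_ctl, alpha_latch))  # True
print(in_weave(("req", "latch"), handshake, alpha_hs, latch_ctl, alpha_latch))         # False: latch before ack
```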

Relevance: 30.00%

Abstract:

Any attempt to model an economy requires foundational assumptions about the relations between prices, values and the distribution of wealth. These assumptions exert a profound influence over the results of any model. Unfortunately, there are few areas in economics as vexed as the theory of value. I argue in this paper that the fundamental problem with past theories of value is that it is simply not possible to model the determination of value, the formation of prices and the distribution of income in a real economy with analytic mathematical models. All such attempts leave out crucial processes or make unrealistic assumptions which significantly affect the results.

There have been two primary approaches to the theory of value. The first, associated with classical economists such as Ricardo and Marx, comprises substance theories of value, which view value as a substance inherent in an object and conserved in exchange. For Marxists, the value of a commodity derives solely from the value of the labour power used to produce it, and therefore any profit is due to the exploitation of the workers. The labour theory of value has been discredited because of its assumption that labour was the only ‘factor’ that contributed to the creation of value, and because of its fundamentally circular argument. Neoclassical theorists argued that price was identical with value and was determined purely by the interaction of supply and demand; value, then, was completely subjective. Returns to labour (wages) and capital (profits) were determined solely by their marginal contribution to production, so that each factor received its just reward by definition. Problems with the neoclassical approach include assumptions concerning representative agents, perfect competition, perfect and costless information and contract enforcement, complete markets for credit and risk, aggregate production functions and infinite, smooth substitution between factors, distribution according to marginal products, firms always on the production possibility frontier, firms’ pricing decisions, ignoring money and credit, and perfectly rational agents with infinite computational capacity.

Two critical areas stand out: firstly, the underappreciated Sonnenschein-Mantel-Debreu results, which showed that the foundational assumptions of the Walrasian general-equilibrium model imply arbitrary excess demand functions and therefore arbitrary equilibrium price sets; secondly, the fact that in real economies there is no equilibrium, only continuous change. Equilibrium is never reached because of constant changes in preferences and tastes; technological and organisational innovations; discoveries of new resources and new markets; inaccurate and evolving expectations of businesses, consumers, governments and speculators; changing demand for credit; the entry and exit of firms; the birth, learning, and death of citizens; changes in laws and government policies; imperfect information; generalized increasing returns to scale; random acts of impulse; weather and climate events; changes in disease patterns, and so on.

The problem is not the use of mathematical modelling, but the kind of mathematical modelling used. Agent-based models (ABMs), object-oriented programming and greatly increased computer power, however, are opening up a new frontier. Here a dynamic bargaining ABM is outlined as a basis for an alternative theory of value. A large but finite number of heterogeneous commodities and agents with differing degrees of market power are set in a spatial network. Returns to buyers and sellers are decided at each step in the value chain, and in each factor market, through the process of bargaining. Market power and its potential abuse against the poor and vulnerable are fundamental to how the bargaining dynamics play out. Ethics therefore lie at the very heart of economic analysis, the determination of prices and the distribution of wealth. The neoclassicals are right, then, that price is the enumeration of value at a particular time and place, but wrong to downplay the critical roles of bargaining, power and ethics in determining those same prices.
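As an illustration of the direction sketched in the abstract, and not the authors' model, the following minimal Python agent-based sketch places a few heterogeneous buyers and sellers on a small network and sets each price by a power-weighted split of the bargaining surplus; the network, the splitting rule and all parameter values are assumptions made for the example.

```python
import random

random.seed(1)

class Agent:
    def __init__(self, name, power, reservation):
        self.name = name
        self.power = power              # bargaining power (illustrative, in (0, 1])
        self.reservation = reservation  # seller's cost or buyer's willingness to pay
        self.wealth = 0.0

def bargain(seller, buyer):
    """Split the surplus between cost and willingness-to-pay by relative power."""
    surplus = buyer.reservation - seller.reservation
    if surplus <= 0:
        return None                                  # no mutually beneficial trade
    share = seller.power / (seller.power + buyer.power)
    price = seller.reservation + share * surplus     # powerful sellers push the price up
    seller.wealth += price - seller.reservation
    buyer.wealth += buyer.reservation - price
    return price

# A tiny spatial network: each buyer can only reach some sellers.
sellers = [Agent("s1", 0.8, 4.0), Agent("s2", 0.2, 5.0)]
buyers = [Agent("b1", 0.5, 9.0), Agent("b2", 0.1, 7.0)]
links = {("b1", "s1"), ("b1", "s2"), ("b2", "s2")}   # b2 cannot reach s1

for step in range(100):                              # repeated bargaining rounds
    buyer = random.choice(buyers)
    reachable = [s for s in sellers if (buyer.name, s.name) in links]
    bargain(random.choice(reachable), buyer)

for a in sellers + buyers:
    print(a.name, "power", a.power, "accumulated surplus", round(a.wealth, 1))
```

By construction, the agent with greater bargaining power captures the larger share of each trade's surplus, which is the distributional point the abstract emphasises.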

Relevance: 30.00%

Abstract:

This article presents a sociomaterial account of simulation in higher education. Sociomaterial approaches change the ontological and epistemological bases for understanding learning and offer valuable tools for addressing important questions about relationships between university education and professional practices. Simulation has grown in many disciplines as a means to bring the two closer together. However, the theoretical underpinnings of simulation pedagogy are limited. This paper extends the wider work of applying sociomaterial approaches to educational phenomena, taking up Schatzki’s practice theory as a distinctive basis for doing so. The question ‘What is being simulated?’ is posed, prompting discussion of multiple bodies, performances and experiences. The potential of adopting such a framework for understanding simulation as a pedagogic practice that brings the classroom and workplace together is illustrated with reference to clinical education in nursing.

Relevance: 30.00%

Abstract:

Ordinary differential equations are used for modelling a wide range of dynamic systems. Even though there are many graphical software applications for this purpose, a fully customisable solution for all problems is still code-level programming of the model and solver. In this project, a free and open-source C++ framework is designed to facilitate modelling in a native code environment and to fulfil the common simulation needs of control and many other engineering and science applications. The solvers of this project are obtained from ODEINT and specialised for the Armadillo matrix library to provide an easy syntax and fast execution. The solver code is minimised, making it easier for users to modify. Several features are added to the solvers, such as controlling the maximum step size, informing the solver about sudden input changes, and forcing custom times into the results and calling a custom method at these points. Ease of use for the model designer, code readability, extensibility and model isolation were considered in the structure of this framework. The application manages the output results, exporting and plotting them. Modifying the model has become more practical, and a portion of the corresponding code is updated automatically. A set of libraries is provided for generation of output figures, matrix hashing, control system functions, profiling, etc. In this paper, an example of using this framework for a classical washout filter model is explained.
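
The framework described is C++ (ODEINT solvers specialised for Armadillo); purely as an illustration of the same workflow — a model written in code, a bounded step size, custom output times forced into the results, and a solver restart at a sudden input change — the sketch below uses Python and SciPy, with an assumed first-order washout filter and made-up constants. It is an analogue of the workflow, not the paper's framework or model.

```python
import numpy as np
from scipy.integrate import solve_ivp

A = 1.0          # assumed washout break frequency (rad/s)
T_SWITCH = 2.0   # assumed instant where the input steps from 0 to 1

def u(t):
    """Piecewise-constant input with a sudden change at T_SWITCH."""
    return 0.0 if t < T_SWITCH else 1.0

def washout(t, x):
    """First-order washout (high-pass) filter: dx/dt = -A*x + u, output y = u - A*x."""
    return [-A * x[0] + u(t)]

def integrate(t0, t1, x0, forced_times):
    """Integrate one smooth segment with a bounded step size, forcing extra output times."""
    t_eval = np.union1d(np.linspace(t0, t1, 50), forced_times)
    return solve_ivp(washout, (t0, t1), x0, max_step=0.05, t_eval=t_eval)

# Restart the solver at the discontinuity so the step change is not stepped over silently.
seg1 = integrate(0.0, T_SWITCH, [0.0], [1.25])
seg2 = integrate(T_SWITCH, 6.0, [seg1.y[0, -1]], [2.5, 3.75])

t = np.concatenate([seg1.t, seg2.t])
x = np.concatenate([seg1.y[0], seg2.y[0]])
y = np.array([u(ti) for ti in t]) - A * x
print(round(float(y[-1]), 3))  # the washed-out step decays back towards zero
```

Restarting the integration at T_SWITCH plays the role of "informing the solver about a sudden input change", so the adaptive stepper does not smear the discontinuity.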

Relevance: 30.00%

Abstract:

The preparation of future professionals for practice is a key focus of higher education institutions. Among a range of approaches is the use of simulation pedagogies. While simulation is often justified as a direct bridge between higher education and professional practice, this paper questions this easy assumption. It develops a conceptually driven argument to cast new light on simulation and its unarticulated potential in professional formation. The argument unfolds in, and is illustrated via, three accounts of a simulation event in an Australian undergraduate nursing program. It begins with a familiar approach, moves to one that problematises this through a focus on disruption, and culminates in a third that draws on socio-material theorisations. Here, simulation is conceived as emergent, challenging stable notions of fidelity common in the simulation literature. New possibilities of simulation in the production of agile practitioners and learners in practice are surfaced. This paper extends and enriches thinking by providing distinctive new ways of understanding simulation and the relationship it affords between education and professional practice, and by illuminating the untapped potential of simulation for producing agile practitioners.

Relevance: 30.00%

Abstract:

With multimedia dominating digital content, Device-to-Device (D2D) communication has been proposed as a promising data-offloading solution in the big data era. As quality of experience (QoE) is a major determining factor in the success of new multimedia applications, we propose a QoE-driven cooperative content dissemination (QeCS) scheme in this work. Specifically, all users predict the QoE of the potential connections, characterized by the mean opinion score (MOS), and send the results to the content provider (CP). The CP then formulates a weighted directed graph according to the network topology and the MOS of each potential connection. In order to stimulate cooperation among the users, the content dissemination mechanism is designed by seeking a 1-factor of the weighted directed graph with maximum weight, thus achieving the maximum total user MOS. Additionally, a debt mechanism is adopted to combat cheating attacks. Furthermore, we extend the proposed QeCS scheme by adding a constraint to the optimization problem to improve fairness. Extensive simulation results demonstrate that the proposed QeCS scheme achieves both efficiency and fairness, especially in large-scale, dense networks.
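
The 1-factor selection step can be illustrated numerically. In a weighted directed graph, a maximum-weight 1-factor (every user with exactly one outgoing and one incoming content link) corresponds to a maximum-weight assignment of senders to receivers, so it can be computed with a standard assignment solver. The MOS matrix below is invented, and the sketch covers only this selection step, not the paper's debt mechanism or fairness-constrained extension.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical predicted MOS for each potential directed D2D link i -> j
# (rows: content senders, columns: receivers; NEG marks missing links and self-loops).
NEG = -1e9
mos = np.array([
    [NEG, 4.1, 3.2, NEG],
    [3.8, NEG, NEG, 2.9],
    [NEG, 3.5, NEG, 4.4],
    [2.7, NEG, 4.0, NEG],
])

# A maximum-weight 1-factor gives every node one outgoing and one incoming link,
# i.e. a maximum-weight assignment of senders to receivers.
rows, cols = linear_sum_assignment(mos, maximize=True)

total = mos[rows, cols].sum()
if total < 0:                       # a NEG edge was forced in: no feasible 1-factor
    print("no 1-factor exists for this topology")
else:
    for i, j in zip(rows, cols):
        print(f"user {i} serves user {j} (predicted MOS {mos[i, j]:.1f})")
    print("total MOS:", round(float(total), 1))
```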