944 results for Complexity theory
Abstract:
A salient but rarely explicitly studied characteristic of interfirm relationships is that they can intentionally be formed for finite periods of time. What determines firms' intertemporal choices between different alliance time horizons? Shadow of the future theorists suggest that when an alliance has an explicitly set short-term time frame, there is an increased risk that partners may behave opportunistically. This does not readily explain the high incidence of time-bound alliances being formed. Reconciling insights from the shadow of the future perspective with nascent research on the flexibility of temporary organizations, and shifting the focus from the level of individual transactions to that of strategic alliance portfolios, we argue that firms may be willing to accept a higher risk of opportunism when there are offsetting gains in strategic flexibility in managing their strategic alliance portfolio. Consequently, we hypothesize that environmental factors that increase the need for strategic flexibility—namely, dynamism and complexity in the environment—are likely to increase the relative share of time-bound alliances in strategic alliance portfolios. Our analysis of longitudinal data on the intertemporal alliance choices of a large sample of small and medium-sized enterprises provides support for this argument. Our findings fill an important gap in theory about time horizons in interfirm relationships and temporary organizations and show the importance of separating planned terminations from duration-based performance measures.
Abstract:
This paper presents the results of a qualitative action-research inquiry into how a highly diverse cohort of post-graduate students could develop significant capacity in sustainable development within a single unit (course), in this case a compulsory component of four built environment master's programs. The method comprised applying threshold learning theory within the technical discipline of sustainable development, to transform student understanding of sustainable business practice in the built environment. This involved identifying a number of key threshold concepts which, once learned, would provide a pathway to a transformational learning experience. The curriculum was then revised to focus on stepping through these targeted concepts using a scaffolded, problem-based learning approach. Challenges included a large class size of 120 students, a majority of international students, and a wide span of disciplinary backgrounds across the spectrum of built environment professionals. Five ‘key’ threshold learning concepts were identified and the renewed curriculum was piloted in Semester 2 of 2011. The paper presents details of the study and findings from a mixed-method evaluation approach through the semester. The outcomes of this study will be used to inform further review of the course in 2012, including further consideration of the threshold concepts. In future, it is anticipated that this case study will inform a framework for rapidly embedding sustainability within curriculum.
Abstract:
Business processes are an important instrument for understanding and improving how companies provide goods and services to customers. Therefore, many companies have documented their business processes well, often in the form of Event-driven Process Chains (EPCs). Unfortunately, in many cases the resulting EPCs are rather complex, so that the overall process logic is hidden in low-level process details. This paper proposes abstraction mechanisms for process models that aim to reduce their complexity while keeping the overall process structure. We assume that functions are marked with efforts and splits are marked with probabilities. This information is used to separate important process parts from less important ones. Real-world process models are used to validate the approach.
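The aggregation idea in this abstract (functions annotated with efforts, splits with probabilities) can be sketched as follows. The aggregation rules and numbers are illustrative, not the paper's exact abstraction mechanism:

```python
# Hypothetical sketch: when a fragment is abstracted into a single function,
# its effort annotation must be aggregated from its parts.

def sequence_effort(efforts):
    """Effort of an abstracted sequence: the efforts simply add up."""
    return sum(efforts)

def xor_split_effort(branches):
    """Expected effort of an abstracted XOR split.

    branches: list of (probability, effort) pairs; probabilities sum to 1.
    """
    return sum(p * e for p, e in branches)

seq = sequence_effort([2.0, 3.0, 5.0])              # -> 10.0
xor = xor_split_effort([(0.7, 4.0), (0.3, 10.0)])   # 0.7*4 + 0.3*10 = 5.8
```

Collapsing fragments this way preserves the overall expected effort of the process while hiding the low-level detail.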
Abstract:
Behavioral models capture operational principles of real-world or designed systems. Formally, each behavioral model defines the state space of a system, i.e., its states and the principles of state transitions. Such a model is the basis for analysis of the system’s properties. In practice, state spaces of systems are immense, which results in huge computational complexity for their analysis. Behavioral models are typically described as executable graphs, whose execution semantics encodes a state space. The structure theory of behavioral models studies the relations between the structure of a model and the properties of its state space. In this article, we use the connectivity property of graphs to achieve an efficient and extensive discovery of the compositional structure of behavioral models; behavioral models get stepwise decomposed into components with clear structural characteristics and inter-component relations. At each decomposition step, the discovered compositional structure of a model is used for reasoning on properties of the whole state space of the system. The approach is exemplified by means of a concrete behavioral model and verification criterion. That is, we analyze workflow nets, a well-established tool for modeling behavior of distributed systems, with respect to the soundness property, a basic correctness property of workflow nets. Stepwise verification allows the detection of violations of the soundness property by inspecting small portions of a model, thereby considerably reducing the amount of work to be done to perform soundness checks. Besides formal results, we also report on findings from applying our approach to an industry model collection.
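As a loose illustration of connectivity-based decomposition (using a plain undirected graph rather than a workflow net, and standard biconnected components rather than the paper's own decomposition), the following sketch splits a model's graph into components that can then be inspected separately:

```python
# Illustrative sketch: decompose a behavioral model's underlying graph into
# biconnected components via DFS, so each fragment can be analyzed in
# isolation instead of exploring the whole state space at once.

from collections import defaultdict

def biconnected_components(edges):
    """Return the biconnected components of an undirected graph as edge sets."""
    graph = defaultdict(list)
    for u, v in edges:
        graph[u].append(v)
        graph[v].append(u)

    disc, low = {}, {}
    stack, comps = [], []
    timer = [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        for v in graph[u]:
            if v == parent:
                continue
            if v not in disc:
                stack.append((u, v))
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if low[v] >= disc[u]:       # u separates the component below
                    comp = set()
                    while True:
                        e = stack.pop()
                        comp.add(frozenset(e))
                        if e == (u, v):
                            break
                    comps.append(comp)
            elif disc[v] < disc[u]:
                stack.append((u, v))
                low[u] = min(low[u], disc[v])

    for node in list(graph):
        if node not in disc:
            dfs(node, None)
    return comps

# Two "diamonds" sharing node c: each is one independently checkable component.
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("c", "e"), ("d", "e")]
comps = biconnected_components(edges)       # two components, joined at c
```

A correctness check that holds component-by-component then only has to inspect small fragments, which mirrors the stepwise verification idea in the abstract.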
Abstract:
The television quiz program Letters and Numbers, broadcast on the SBS network, has recently become quite popular in Australia. This paper explores the potential of this game to illustrate and engage student interest in a range of fundamental concepts of computer science and mathematics. The Numbers Game in particular has a rich mathematical structure whose analysis and solution involves concepts of counting and problem size, discrete (tree) structures, language theory, recurrences, computational complexity, and even advanced memory management. This paper presents an analysis of these games and their teaching applications, and presents some initial results of use in student assignments.
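For readers unfamiliar with the Numbers Game, a minimal brute-force solver conveys why counting, tree structures, and problem size arise. This exponential search is an illustrative sketch under the usual game rules (no negative intermediate results, only exact division), not the paper's analysis:

```python
def solve(numbers, target):
    """Try to build `target` from `numbers` with +, -, *, /.

    Game rules assumed: no negative intermediates, division must be exact.
    Returns an expression string, or None if the target is unreachable.
    """
    def search(items):
        for value, expr in items:
            if value == target:
                return expr
        n = len(items)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                (a, ea), (b, eb) = items[i], items[j]
                rest = [items[k] for k in range(n) if k not in (i, j)]
                steps = [(a + b, f"({ea}+{eb})"), (a * b, f"({ea}*{eb})")]
                if a > b:
                    steps.append((a - b, f"({ea}-{eb})"))
                if b > 1 and a % b == 0:
                    steps.append((a // b, f"({ea}/{eb})"))
                for step in steps:
                    found = search(rest + [step])
                    if found:
                        return found
        return None

    return search([(n, str(n)) for n in numbers])

solve([2, 3], 6)  # -> "(2*3)"
```

Every recursive call combines two remaining numbers into one, so the search tree's size grows explosively with the number of tiles, which is exactly what makes the game a natural vehicle for teaching computational complexity.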
Abstract:
This paper presents a control design for tracking of attitude and speed of an underactuated slender-hull unmanned underwater vehicle (UUV). The control design is based on Port-Hamiltonian theory. The target dynamics (desired dynamic response) is shaped with particular attention to the target mass matrix so that the influence of the unactuated dynamics on the controlled system is suppressed. This results in achievable dynamics independent of uncontrolled states. Throughout the design, insight of the physical phenomena involved is used to propose the desired target dynamics. The performance of the design is demonstrated through simulation with a high-fidelity model.
Abstract:
The interest in utilising multiple heterogeneous Unmanned Aerial Vehicles (UAVs) in close proximity is growing rapidly. As such, many challenges are presented in the effective coordination and management of these UAVs; converting the current n-to-1 paradigm (n operators operating a single UAV) to the 1-to-n paradigm (one operator managing n UAVs). This paper introduces an Information Abstraction methodology used to produce the functional capability framework initially proposed by Chen et al. and its Level Of Detail (LOD) indexing scale. This framework was validated by comparing the operator workload and Situation Awareness (SA) of three experiment scenarios involving multiple heterogeneous autonomous UAVs. The first scenario was set in a high LOD configuration with highly abstracted UAV functional information; the second scenario was set in a mixed LOD configuration; and the final scenario was set in a low LOD configuration with maximal UAV functional information. Results show a statistically significant decrease in operator workload when a UAV’s functional information is displayed in its physical form (low LOD - maximal information) compared with the mixed LOD configuration.
Abstract:
This paper gives an overview of an ongoing project endeavouring to advance theory-based production and project management, and the rationale for this approach is briefly justified. The status of the theoretical foundation of production management, project management and allied disciplines is discussed, with emphasis on the metaphysical grounding of theories, as well as the nature of the heuristic solution method commonly used in these disciplines. Then, ongoing work related to different aspects of production and project management is reviewed from both theoretical and practical orientations. Next, agile project management in information systems is explored with a view to its re-use in generic project management. In production management, the consequences and implementation of a new, wider theoretical basis are analyzed. The theoretical implications and negative symptoms of the peculiarities of the construction industry for supply chains and supply chain management in construction are observed. Theoretical paths for improvement of inter-organisational relationships in construction, which are fundamental for improvement of construction supply chains, are described. To conclude, the observations made in this paper vis-à-vis production, project and supply chain management are related again to the theoretical basis of this paper, and finally directions for theory development and future research are given and discussed.
Abstract:
This paper examines the use of connectionism (neural networks) in modelling legal reasoning. I discuss how the implementations of neural networks have failed to account for legal theoretical perspectives on adjudication. I criticise the use of neural networks in law, not because connectionism is inherently unsuitable in law, but rather because it has been done so poorly to date. The paper reviews a number of legal theories which provide a grounding for the use of neural networks in law. It then examines some implementations undertaken in law and criticises their legal theoretical naïveté. It then presents lessons from these implementations which researchers must bear in mind if they wish to build neural networks that are justified by legal theories.
A low-complexity flight controller for Unmanned Aircraft Systems with constrained control allocation
Abstract:
In this paper, we propose a framework for joint allocation and constrained control design of flight controllers for Unmanned Aircraft Systems (UAS). The actuator configuration is used to map the actuator constraint set into the space of the aircraft generalised forces. By constraining the demanded generalised forces, we ensure that the allocation problem is always feasible and can therefore be solved without constraints. This leads to an allocation problem that does not require on-line numerical optimisation. Furthermore, since the controller handles the constraints, there is no need to implement heuristics to inform the controller about actuator saturation. The latter is fundamental for avoiding Pilot Induced Oscillations (PIO) in remotely operated UAS due to the rate limit on the aircraft control surfaces.
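A much-simplified sketch of the feasibility idea follows: if the demanded generalised forces are shrunk (direction-preserving) whenever they would exceed what the actuators can deliver, the allocation can be solved by plain matrix inversion, with no on-line optimisation. The 2x2 effectiveness matrix and limits below are invented for illustration, not taken from the paper:

```python
def allocate(B, tau, u_max):
    """Solve the 2x2 allocation B @ u = tau, then scale tau and u together
    (direction-preserving) so every actuator stays within its limits."""
    (a, b), (c, d) = B
    det = a * d - b * c                         # assumed nonzero (B invertible)
    u = [(d * tau[0] - b * tau[1]) / det,       # unconstrained allocation
         (a * tau[1] - c * tau[0]) / det]
    # Shrink only if some actuator limit is violated.
    scale = min([1.0] + [um / abs(ui) for ui, um in zip(u, u_max) if ui])
    return [scale * ui for ui in u], [scale * t for t in tau]

B = [[1.0, 1.0], [1.0, -1.0]]                   # toy effectiveness matrix
u, tau = allocate(B, [2.0, 0.0], [0.5, 0.5])    # demanded force is infeasible
# u == [0.5, 0.5], tau == [1.0, 0.0]: limits respected, direction preserved
```

Because the scaled demand is feasible by construction, no heuristic saturation handling is needed downstream, which is the point the abstract makes about avoiding PIO-inducing surprises.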
Abstract:
Urban agriculture plays an increasingly vital role in supplying food to urban populations. Changes in Information and Communications Technology (ICT) are already driving widespread change in diverse food-related industries such as retail, hospitality and marketing. It is reasonable to suspect that the fields of ubiquitous technology, urban informatics and social media equally have a lot to offer the evolution of core urban food systems. We use communicative ecology theory to describe emerging innovations in urban food systems according to their technical, discursive and social components. We conclude that social media in particular accentuate fundamental social interconnections normally effaced by conventional industrialised approaches to food production and consumption.
Abstract:
The planning of IMRT treatments requires a compromise between dose conformity (complexity) and deliverability. This study investigates established and novel treatment complexity metrics for 122 IMRT beams from prostate treatment plans. The Treatment and Dose Assessor software was used to extract the necessary data from exported treatment plan files and calculate the metrics. For most of the metrics, there was strong overlap between the calculated values for plans that passed and failed their quality assurance (QA) tests. However, statistically significant variation between plans that passed and failed QA measurements was found for the established modulation index and for a novel metric describing the proportion of small apertures in each beam. The ‘small aperture score’ provided threshold values which successfully distinguished deliverable treatment plans from plans that did not pass QA, with a low false negative rate.
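As an illustration of the kind of metric the study describes, a "proportion of small apertures" score can be computed as below. The threshold, units, and data are invented for illustration; the study's exact definition may differ:

```python
def small_aperture_score(aperture_widths_mm, threshold_mm=10.0):
    """Fraction of a beam's apertures narrower than threshold_mm."""
    if not aperture_widths_mm:
        return 0.0
    small = sum(1 for w in aperture_widths_mm if w < threshold_mm)
    return small / len(aperture_widths_mm)

score = small_aperture_score([4.0, 12.0, 8.0, 25.0])  # 2 of 4 -> 0.5
```

A per-beam score like this lends itself to the thresholding described in the abstract: beams above a chosen cut-off are flagged as likely to fail QA.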
Abstract:
This study explored early career academics' experiences in using information to learn while building their networks for professional development. A 'knowledge ecosystem' model was developed consisting of informal learning interactions such as relating to information to create knowledge and engaging in mutually supportive relationships. Findings from this study present an alternative interpretation of information use for learning, one focused on processes that manifest as human interactions with informing entities within the contexts of reciprocal human relationships.