766 results for Algebraic thinking
Abstract:
Big Data and Learning Analytics’ promise to revolutionise educational institutions, endeavours, and actions through more and better data is now compelling. Multiple, and continually updating, data sets produce a new sense of ‘personalised learning’. A crucial attribute of the datafication, and subsequent profiling, of learner behaviour and engagement is the continual modification of the learning environment to induce greater levels of investment on the parts of each learner. The assumption is that more and better data, gathered faster and fed into ever-updating algorithms, provide more complete tools to understand, and therefore improve, learning experiences through adaptive personalisation. The argument in this paper is that Learning Personalisation names a new logistics of investment as the common ‘sense’ of the school, in which disciplinary education is ‘both disappearing and giving way to frightful continual training, to continual monitoring'.
Abstract:
A large and growing body of literature has explored corporate environmental sustainability initiatives and their impacts locally, regionally and internationally. While these initiatives provide examples of environmental stewardship and cleaner production, a large proportion of the organisations considered in this literature have ‘sustainable practice’, ‘environmental stewardship’ or similar goals as add-ons to their core business strategy. Furthermore, there is limited evidence of organisations embracing and internalising sustainability principles throughout their activities, products or services. Many challenges and barriers impede outcomes such as whole-system design or a holistic approach to addressing environmental issues, with some evidence to suggest that targeted initiatives could be useful in making progress. ‘Lean management’ and other lean thinking strategies are often put forward as part of such targeted approaches. Within this context, the authors have drawn on current literature to undertake a review of lean thinking practices and how these influence sustainable business practice, considering the balance of the environmental and economic aspects of the triple bottom line in sustainability. The review methodology comprised firstly identifying the theoretical constructs to be studied, developing criteria for categorising the literature, evaluating the findings within each category, and considering the implications of the findings for areas of future research. The evaluation revealed two main areas of consideration: (a) lean manufacturing tools and environmental performance, and (b) integrated lean and green models and approaches. However, the review highlighted the ad hoc use of lean thinking within corporate sustainability initiatives, and established a knowledge gap: the lack of a system for considering different categories of environmental impacts in different industries and choosing the best lean tools or models for a particular problem in a way that ensures holistic exploration. The findings included a specific typology of lean tools for different environmental impacts, drawing on multiple case studies. Within this research context, this paper presents the findings of the review, namely the emerging consensus on the relationships between lean thinking and sustainable business practice. The paper begins with an overview of the current literature regarding lean thinking and its documented role in sustainable business practice. The paper then includes an analysis of lean and green paradigms in different industries, and describes the typology of lean tools used to reduce specific environmental impacts as well as integrated lean and green models and approaches. The paper intends to encourage industrial practitioners to consider the merits and potential risks of using specific lean tools to reduce context-specific environmental impacts. It also aims to highlight the potential for further investigation with regard to comparing different industries and conceptualising a generalisable system for ensuring lean thinking initiatives build towards sustainable business practice.
Abstract:
This paper describes an algorithm for "direct numerical integration" of initial value Differential-Algebraic Inequalities (DAI) in a time-stepping fashion, using a sequential quadratic programming (SQP) solver to detect and satisfy active path constraints at each time step. The activation of a path constraint generally increases the condition number of the Jacobian of the active discretized differential-algebraic equations (DAE), and this difficulty is addressed by a regularization property of the alpha method. The algorithm is locally stable when index-1 and index-2 path constraints and bounds are active. Subject to available regularization, it is seen in the numerical examples to be stable when index-3 path constraints are active. For high-index active path constraints, the algorithm uses a user-selectable parameter to perturb the smaller singular values of the Jacobian, with a view to reducing the condition number so that the simulation can proceed. The algorithm can be used as a relatively cheap estimation tool for trajectory and control planning and in the context of model predictive control solutions. It can also be used to generate initial guesses for the optimization variables of inequality path-constrained dynamic optimization problems. The method is illustrated with examples from space vehicle trajectory and robot path planning.
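As a rough illustration of the singular-value perturbation idea (a minimal NumPy sketch, not the paper's algorithm; the Jacobian `J`, right-hand side `r`, and threshold `eps` below are invented for the example), small singular values of an ill-conditioned Jacobian can be floored at a user-selectable level before the linear solve, bounding the condition number so the step can proceed:

```python
import numpy as np

def solve_with_sv_floor(J, r, eps=1e-8):
    """Solve J x = r after flooring small singular values of J.

    Mimics the idea of perturbing the smaller singular values of an
    ill-conditioned Jacobian so the condition number stays bounded and the
    time step can proceed; eps plays the role of the user-selectable parameter.
    """
    U, s, Vt = np.linalg.svd(J)
    s_reg = np.maximum(s, eps * s.max())      # floor the small singular values
    x = Vt.T @ ((U.T @ r) / s_reg)            # x = V diag(1/s_reg) U^T r
    return x, s_reg.max() / s_reg.min()       # solution and reduced condition number

# Invented, nearly singular Jacobian standing in for an active high-index constraint.
J = np.array([[1.0, 1.0],
              [1.0, 1.0 + 1e-12]])
r = np.array([2.0, 2.0])
x, cond = solve_with_sv_floor(J, r, eps=1e-8)
print(x, cond)
```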
Abstract:
In the design studio, sketching or visual thinking is part of the processes that assist students to achieve final design solutions. In QUT’s First and Third Year industrial design studio classes we engage in a variety of teaching pedagogies, from which we identify ‘Concept Bombs’ as instrumental in the development of students’ visual thinking and reflective design process, and also as a vehicle to foster positive student engagement. In First Year studios our Concept Bombs consist of 20-minute individual design tasks focusing on rapid development of initial concept designs and free-hand sketching. In Third Year studios we adopt a variety of formats and different timing, combining individual and team-based tasks. Our experience and surveys tell us that students value intensive studio activities, especially when combined with timely assessment and feedback. While conventional longer-duration design projects are essential for allowing students to engage with the full depth and complexity of the design process, short and intensive design activities introduce variety to the learning experience and enhance student engagement. This paper presents a comparative analysis of First and Third Year students’ Concept Bomb sketches to describe the types of design knowledge embedded in them, a discussion of the limitations and opportunities of this pedagogical technique, as well as considerations for the future development of studio-based tasks of this kind as design pedagogies in the midst of current university education trends.
Abstract:
CTRU, a public-key cryptosystem, was proposed by Gaborit, Ohler and Sole. It is an analogue of NTRU with the ring of integers replaced by the ring of polynomials $\mathbb{F}_2[T]$. It attracted attention because attacks based on either the LLL algorithm or the Chinese Remainder Theorem, which are the most common attacks on NTRU, do not apply to it. In this paper we present a polynomial-time algorithm that breaks CTRU for all recommended parameter choices that were derived to make CTRU secure against the Popov normal form attack. The paper shows that, once the constraints required for perfect decryption are imposed, either the plaintext or the private key can be recovered by a polynomial-time linear algebra attack.
Abstract:
In this paper, we present an algebraic method to study and design spatial parallel manipulators that demonstrate isotropy in the force and moment distributions. We use the force and moment transformation matrices separately, and derive conditions for their isotropy individually as well as in combination. The isotropy conditions are derived in closed-form in terms of the invariants of the quadratic forms associated with these matrices. The formulation is applied to a class of Stewart platform manipulator, and a multi-parameter family of isotropic manipulators is identified analytically. We show that it is impossible to obtain a spatially isotropic configuration within this family. We also compute the isotropic configurations of an existing manipulator and demonstrate a procedure for designing the manipulator for isotropy at a given configuration.
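As a rough numerical illustration of what such an isotropy condition means operationally (a minimal NumPy sketch with a made-up placeholder matrix, not an actual Stewart platform geometry from the paper), a force or moment transformation matrix T is isotropic when the quadratic form T Tᵀ is a scalar multiple of the identity, i.e. its invariants (eigenvalues) are all equal:

```python
import numpy as np

def is_isotropic(T, tol=1e-9):
    """True if T @ T.T is a scalar multiple of the identity, i.e. all
    eigenvalues (the invariants of the associated quadratic form) coincide,
    so the mapping treats every spatial direction identically."""
    Q = T @ T.T
    eig = np.linalg.eigvalsh(Q)
    return np.allclose(eig, eig.mean(), atol=tol * max(1.0, abs(eig.mean())))

# Placeholder 3x6 "force transformation" matrix with orthogonal rows of equal
# norm -- isotropic by construction, not a real Stewart platform geometry.
F = np.array([
    [1.0, 0.0, 0.0, 1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0, 0.0, 0.0, 1.0],
])
print(is_isotropic(F))   # True: F @ F.T == 2 * I
```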
Abstract:
"Extended Clifford algebras" are introduced as a means to obtain low ML decoding complexity space-time block codes. Using left regular matrix representations of two specific classes of extended Clifford algebras, two systematic algebraic constructions of full diversity Distributed Space-Time Codes (DSTCs) are provided for any power of two number of relays. The left regular matrix representation has been shown to naturally result in space-time codes meeting the additional constraints required for DSTCs. The DSTCs so constructed have the salient feature of reduced Maximum Likelihood (ML) decoding complexity. In particular, the ML decoding of these codes can be performed by applying the lattice decoder algorithm on a lattice of four times lesser dimension than what is required in general. Moreover these codes have a uniform distribution of power among the relays and in time, thus leading to a low Peak to Average Power Ratio at the relays.
Abstract:
In the world today there are many ways in which we measure, count and determine whether something is worth the effort or not. In Australia and many other countries, new government legislation is requiring government-funded entities to become more transparent in their practice and to develop a more cohesive narrative about the worth, or impact, for the betterment of society. This places the executives of such entities in a position of needing evaluative thinking and practice to guide how they may build the narrative that documents and demonstrates this type of impact. In thinking about where to start, executives, project and program managers may consider this workshop as a professional development opportunity to explore both the intended and unintended consequences of performance models as tools of evaluation. This workshop will offer participants an opportunity to unpack the place of performance models as an evaluative tool through the following:
· What shape does an ethical, sound and valid performance measure for an organization or personnel take?
· What role does cultural specificity play in the design and development of a performance model for an organization or for personnel?
· How are stakeholders able to identify risk during the design and development of such models?
· When and where will dissemination strategies be required?
· And so what? How can you determine that your performance model implementation has made a difference now or in the future?
Abstract:
In the wake of an almost decade-long economic downturn and increasing competition from developing economies, a new agenda in the Australian Government for science, technology, engineering, and mathematics (STEM) education and research has emerged as a national priority. However, to art and design educators, the pervasiveness and apparent exclusivity of STEM can be viewed as another instance of art and design education being relegated to the margins of curriculum (Greene, 1995). In the spirit of interdisciplinarity, there have been some recent calls to expand STEM education to include the arts and design, transforming STEM into STEAM in education (Maeda, 2013). As with STEM, STEAM education emphasises the connections between previously disparate disciplines, and it has been conceptualised in different ways, such as focusing on the creative design thinking process that is fundamental to engineering and art (Bequette & Bequette, 2012). In this article, we discuss the divergent creative design thinking process and metacognitive skills, and how and why they may enhance learning in STEM and STEAM.
Abstract:
We offer a technique, motivated by feedback control and specifically sliding mode control, for the simulation of differential-algebraic equations (DAEs) that describe common engineering systems such as constrained multibody mechanical structures and electric networks. Our algorithm exploits basic results from sliding mode control theory to establish a simulation environment that then requires only the most primitive of numerical solvers. We circumvent the most important requisite for the conventional simulation of DAEs: the calculation of a set of consistent initial conditions. Our algorithm, which relies on the enforcement and occurrence of sliding mode, ensures that the algebraic equation is satisfied by the dynamic system even for inconsistent initial conditions and for all time thereafter. [DOI: 10.1115/1.4001904]
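A toy sketch of the underlying idea (Python, with invented dynamics, gain and step size; this illustrates the sliding-mode principle, not the authors' algorithm): a semi-explicit DAE is integrated with plain forward Euler while a discontinuous, high-gain update drives the algebraic variable onto the constraint surface, even from an inconsistent initial value:

```python
import numpy as np

# Semi-explicit DAE:  x' = -2*x + z,   0 = g(x, z) = z - x**2
# Rather than solving g = 0 at each step, the algebraic variable is driven
# onto the sliding surface g = 0 by a discontinuous high-gain update, so an
# inconsistent z(0) is still pulled onto the constraint and kept there.
K, dt, T = 50.0, 1e-4, 2.0        # gain, step size, horizon (illustrative values)
x, z = 1.0, 3.0                   # consistent value would be z(0) = x(0)**2 = 1

for _ in range(int(T / dt)):
    g = z - x**2                  # constraint residual = sliding surface
    x += dt * (-2.0 * x + z)      # primitive explicit Euler for the ODE part
    z += dt * (-K * np.sign(g))   # sliding-mode correction drives g toward 0

# After a short reaching phase, |g| chatters within a band of order K*dt.
print(f"final constraint residual |g| = {abs(z - x**2):.1e}")
```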
Abstract:
Ecology and evolutionary biology is the study of life on this planet. One of the many methods applied to answering the great diversity of questions regarding the lives and characteristics of individual organisms is the use of mathematical models. Such models are used in a wide variety of ways. Some help us to reason, functioning as aids to, or substitutes for, our own fallible logic, thus making argumentation and thinking clearer. Models which help our reasoning can lead to conceptual clarification; by expressing ideas in algebraic terms, the relationships between different concepts become clearer. Other mathematical models are used to better understand yet more complicated models, or to develop mathematical tools for their analysis. Though they help us to reason and serve as tools in the craftsmanship of science, many models do not tell us much about the real biological phenomena we are, at least initially, interested in. The main reason for this is that any mathematical model is a simplification of the real world, reducing the complexity and variety of interactions and idiosyncrasies of individual organisms. What such models can tell us, however, both is and has been very valuable throughout the history of ecology and evolution. Minimally, a model simplifying the complex world can tell us that, in principle, the patterns produced in the model could also be produced in the real world. We can never know how different a simplified mathematical representation is from the real world, but the similarity that models strive for gives us confidence that their results could apply.
This thesis deals with a variety of different models, used for different purposes. One model deals with how one can measure and analyse invasions: the expanding phase of invasive species. Earlier analyses claim to have shown that such invasions can be a regulated phenomenon, in that higher invasion speeds at a given point in time lead to a subsequent reduction in speed. Two simple mathematical models show that analyses of this particular measure of invasion speed need not be evidence of regulation. In the context of dispersal evolution, two models acting as proofs of principle are presented. Parent-offspring conflict emerges when there are different evolutionary optima for adaptive behaviour for parents and offspring. We show that the evolution of dispersal distances can entail such a conflict, and that under parental control of dispersal (as, for example, in higher plants) wider dispersal kernels are optimal. We also show that dispersal homeostasis can be optimal: in a setting where dispersal decisions (to leave or stay in a natal patch) are made, strategies that divide their seeds or eggs into fixed fractions that disperse or not, as opposed to randomizing the decision for each seed, can prevail.
We also present a model of the evolution of bet-hedging strategies: evolutionary adaptations that occur despite their fitness, on average, being lower than that of a competing strategy. Such strategies can win in the long run because their reduced variance in fitness is coupled with the reduction in mean fitness, and fitness is multiplicative across generations, and therefore sensitive to variability. This model is used for conceptual clarification: by developing a population-genetic model with uncertain fitness and expressing genotypic variance in fitness as a product of individual-level variance and correlations between individuals of a genotype, we arrive at expressions that intuitively reflect two of the main categorizations of bet-hedging strategies: conservative vs. diversifying, and within- vs. between-generation bet-hedging. In addition, this model shows that these divisions are in fact false dichotomies.
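A small simulation sketch (Python/NumPy, with invented fitness values rather than numbers from the thesis) of the multiplicative-fitness argument: a bet-hedging strategy with lower arithmetic-mean fitness but no variance outgrows a riskier strategy in the long run, because long-run growth is governed by the geometric mean:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10_000                                  # number of generations

# "Risky" strategy: fitness 2.0 or 0.5 each generation (arithmetic mean 1.25,
# geometric mean 1.0). "Bet-hedger": constant fitness 1.1 (lower mean, zero
# variance, geometric mean 1.1). Values are illustrative only.
risky = rng.choice([2.0, 0.5], size=T)
hedger = np.full(T, 1.1)

# Fitness multiplies across generations, so compare per-generation log growth.
print("risky  long-run growth factor:", np.exp(np.mean(np.log(risky))))
print("hedger long-run growth factor:", np.exp(np.mean(np.log(hedger))))
# Despite its higher arithmetic mean, the risky strategy's long-run growth
# hovers around 1.0, below the bet-hedger's 1.1.
```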
Abstract:
In this paper we study the representation of KL-divergence minimization, in the cases where integer sufficient statistics exist, using tools from polynomial algebra. We show that the estimation of parametric statistical models in this case can be transformed into solving a system of polynomial equations. In particular, we also study the case of the Kullback-Csiszar iteration scheme. We present implicit descriptions of these models and show that implicitization preserves specialization of the prior distribution. This result leads us to a Gröbner basis method to compute an implicit representation of minimum KL-divergence models.
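As a toy illustration of implicitization via Gröbner bases (a SymPy sketch using the 2x2 independence model rather than any model from the paper; all symbol names are invented for the example), eliminating the parameters from the parametrization with a lex Gröbner basis recovers an implicit description of the model in the observable coordinates alone:

```python
import sympy as sp

# Toy example (not from the paper): the 2x2 independence model p_ij = a_i * b_j.
# Eliminating the parameters a1, a2, b1, b2 yields an implicit description of
# the model purely in terms of the cell probabilities p_ij.
a1, a2, b1, b2, p11, p12, p21, p22 = sp.symbols('a1 a2 b1 b2 p11 p12 p21 p22')

ideal = [p11 - a1*b1, p12 - a1*b2, p21 - a2*b1, p22 - a2*b2]

# With the parameters ordered first, a lex Groebner basis contains a basis of
# the elimination ideal: generators free of a's and b's describe the model.
G = sp.groebner(ideal, a1, a2, b1, b2, p11, p12, p21, p22, order='lex')
params = {a1, a2, b1, b2}
implicit = [g for g in G.exprs if not (params & g.free_symbols)]
print(implicit)   # expect something equivalent to p11*p22 - p12*p21
```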