306 results for Individual-based modeling


Relevance: 30.00%

Abstract:

In this paper we discuss our current efforts to develop and implement an exploratory, discovery-mode assessment item into the total learning and assessment profile for a target group of about 100 second-level engineering mathematics students. The assessment item under development is composed of two parts, namely, a set of "pre-lab" homework problems (which focus on relevant prior mathematical knowledge, concepts and skills), and complementary computing laboratory exercises which are undertaken within a fixed (one-hour) time frame. In particular, the computing exercises exploit the algebraic manipulation and visualisation capabilities of the symbolic algebra package MAPLE, with the aim of promoting understanding of certain mathematical concepts and skills via visual and intuitive reasoning, rather than a formal or rigorous approach. The assessment task we are developing is aimed at providing students with a significant learning experience, in addition to providing feedback on their individual knowledge and skills. To this end, a noteworthy feature of the scheme is that marks awarded for the laboratory work are primarily based on the extent to which reflective, critical thinking is demonstrated, rather than the number of CBE-style tasks completed by the student within the allowed time. With regard to student learning outcomes, a novel and potentially critical feature of our scheme is that the assessment task is designed to be intimately linked to the overall course content, in that it aims to introduce important concepts and skills (via individual student exploration) which will be revisited somewhat later in the pedagogically more restrictive formal lecture component of the course (typically a large-group plenary format).
Furthermore, the time delay involved, or "incubation period", is also a deliberate design feature: it is intended to allow students the opportunity to undergo potentially important internal re-adjustments in their understanding, before being exposed to lectures on related course content, which are invariably delivered in a more condensed, formal and mathematically rigorous manner. In our presentation, we will discuss in more detail our motivation and rationale for trialling such a scheme for the targeted student group. Some of the advantages and disadvantages of our approach (as we perceived them at the initial stages) will also be enumerated. In a companion paper, the theoretical framework for our approach will be more fully elaborated, and measures of student learning outcomes (as obtained, e.g., from student-provided feedback) will be discussed.

Relevance: 30.00%

Abstract:

INTRODUCTION In their target article, Yuri Hanin and Muza Hanina outlined a novel multidisciplinary approach to performance optimisation for sport psychologists called the Identification-Control-Correction (ICC) programme. According to the authors, this empirically verified, psycho-pedagogical strategy is designed to improve the quality of coaching and consistency of performance in highly skilled athletes and involves a number of steps, including: (i) identifying and increasing self-awareness of ‘optimal’ and ‘non-optimal’ movement patterns for individual athletes; (ii) learning to deliberately control the process of task execution; and (iii) correcting habitual and random errors and managing radical changes of movement patterns. Although no specific examples were provided, the ICC programme has apparently been successful in enhancing the performance of Olympic-level athletes. In this commentary, we address what we consider to be some important issues arising from the target article. We specifically focus attention on the contentious topic of optimisation in neurobiological movement systems, the role of constraints in shaping emergent movement patterns, and the functional role of movement variability in producing stable performance outcomes. In our view, the target article and, indeed, the proposed ICC programme would benefit from a dynamical systems theoretical backdrop rather than the cognitive scientific approach that appears to be advocated. Although Hanin and Hanina made reference to, and attempted to integrate, constructs typically associated with dynamical systems theoretical accounts of motor control and learning (e.g., Bernstein’s problem, movement variability, etc.), these ideas required more detailed elaboration, which we provide in this commentary.

Relevance: 30.00%

Abstract:

Innovation Management (IM) in most knowledge-based firms is used on an ad hoc basis, where senior managers use the term to leverage competitive edge without understanding its true meaning or how its robust application impacts organisational performance. There have been attempts in the manufacturing industry to harness the innovative potential of the business and apprehend its use as a point of difference to improve financial and non-financial outcomes. However, further work is required to innovatively extrapolate the lessons learnt to introduce incremental and/or radical innovation to knowledge-based firms. An international structural engineering firm has been proactive in exploring and implementing this idea and has forged an alliance with the Queensland University of Technology to start the Innovation Management Program (IMP). The aim was to develop a permanent and sustainable program through which innovation can be woven through the fabric of the organisation. There was an intention to reinforce the firm's vision, reinvigorate ideas and create new options that help in its realisation. This paper outlines the need for innovation in knowledge-based firms and how this consulting engineering firm reacted to this exigency. The development of the Innovation Management Program, its different themes (and associated projects) and how they integrate to form a holistic model is also discussed. The model is designed around the need to provide professional qualification improvement opportunities for staff, set up organised, structured and easily accessible knowledge repositories to capture tacit and explicit knowledge, and implement efficient project management strategies with a view to enhancing client satisfaction. A Delphi-type workshop is used to confirm the themes and projects. Some of the individual projects and their expected outcomes are also discussed.
A questionnaire and interviews were used to collect data to select appropriate candidates responsible for leading these projects. Following an in-depth analysis of preliminary research results, some recommendations on the selection process will also be presented.

Relevance: 30.00%

Abstract:

Investment in residential property in Australia is not dominated by the major investment institutions to the same degree as the commercial, industrial and retail property markets. As at December 2001, the Property Council of Australia Investment Performance Index contained residential property with a total value of $235 million, which represents only 0.3% of the total PCA Performance Index value. The majority of investment in the Australian residential property market is by small investment companies and individual investors. The limited exposure of residential property in institutional investment portfolios has also limited the research undertaken in relation to residential property performance. However, individual investment in residential property continues to gain importance, as individuals are now taking control of their own superannuation portfolios and the various State Governments of Australia are decreasing their involvement in the construction of public housing by subsidizing low-income families into the private residential property market. This paper will:
• provide a comparison of the cost to initially purchase residential property in the various capital-city residential property markets in Australia, and
• analyse the true cost and investment performance of residential property in the main residential property markets in Australia, based on a standard investment portfolio in each of the State capital cities, and relate these results to real estate marketing and agency practice.
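The acquisition-cost and holding-period comparison described in the abstract can be sketched in a few lines. All figures below (prices, stamp duty rates, rents, growth rates, city names) are hypothetical placeholders for illustration, not data from the paper:

```python
def acquisition_cost(price, stamp_duty_rate, legal_fees):
    """Total initial outlay: purchase price plus transaction costs."""
    return price + price * stamp_duty_rate + legal_fees

def total_return(price, stamp_duty_rate, legal_fees,
                 annual_rent, annual_expenses, growth_rate, years):
    """Simple holding-period return: net rent plus capital growth,
    expressed over the full initial outlay (the 'true cost' of entry)."""
    outlay = acquisition_cost(price, stamp_duty_rate, legal_fees)
    net_income = (annual_rent - annual_expenses) * years
    capital_gain = price * ((1 + growth_rate) ** years - 1)
    return (net_income + capital_gain) / outlay

# Hypothetical figures for two capital-city markets
sydney = total_return(450_000, 0.04, 2_000, 20_800, 6_000, 0.05, 5)
brisbane = total_return(180_000, 0.035, 1_500, 10_400, 3_500, 0.04, 5)
print(f"5-year returns -- Sydney: {sydney:.2%}, Brisbane: {brisbane:.2%}")
```

A fuller analysis would add holding costs (rates, insurance, management fees) and selling costs, but even this sketch shows why transaction costs at entry matter for comparing city markets.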

Relevance: 30.00%

Abstract:

Which social perceptions and structures shape coworker reliance and contributions to team products? When people form an intercultural team, they launch a set of working relationships that may be affected by social perceptions and social structures. Social perceptions include beliefs about interpersonal similarity and also expectations of behavior based on professional and national memberships. Social structures include dyadic relationships and the patterns they form. In this study, graduate students from three cohorts were consistently more likely to rely on others with whom they had a professional relationship, while structural equivalence in the professional network had no effect. In only one of the cohorts, people were more likely to rely on others who were professionally similar to themselves. Expectations regarding professional or national groups had no effect on willingness to rely on members of those groups, but expectations regarding teammates' nations positively influenced individual contributions. Willingness to rely on one's teammates did not significantly influence individual contributions to the team. Number of professional ties to teammates increased individual contributions, and number of external ties decreased contributions. Finally, people whose professional networks included a mixture of brokerage and closure (higher ego network variance) made greater contributions to their teams.
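The "ego network variance" finding can be made concrete with a toy calculation. The measure below is an illustrative proxy (variance, across the ego's contacts, of how many of the ego's other contacts each one is tied to), not necessarily the exact operationalisation used in the study:

```python
from statistics import pvariance

def ego_network_variance(ego, edges):
    """Illustrative proxy for 'ego network variance': a neighbourhood that is
    uniformly closed (or uniformly open) gives low variance; a mixture of
    brokerage and closure gives high variance."""
    und = {frozenset((a, b)) for a, b in edges}
    alters = {v for e in und if ego in e for v in e} - {ego}
    counts = []
    for alter in alters:
        # ties this alter has to the ego's OTHER contacts
        ties = sum(1 for e in und if alter in e and e <= alters)
        counts.append(ties)
    return pvariance(counts) if counts else 0.0

# Ego tied to four colleagues: a-b closed, c-d open -> mixed neighbourhood
mixed = [("ego", x) for x in "abcd"] + [("a", "b")]
closed = [("ego", x) for x in "abcd"] + [("a", "b"), ("c", "d")]
print(ego_network_variance("ego", mixed), ego_network_variance("ego", closed))
```

Under this proxy, the mixed neighbourhood scores higher than the uniformly closed one, matching the abstract's claim that a mixture of brokerage and closure predicted greater contributions.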

Relevance: 30.00%

Abstract:

This paper reports on the research and development of an ICT tool to facilitate the learning of ratio and fractions by adult prisoners. The design of the ICT tool was informed by a semiotic framework for mathematical meaning-making. The ICT tool thus employed multiple semiotic resources, including topological, typological, and social-actional resources. The results showed that each individual semiotic resource could represent only part of the mathematical concept, while at the same time it might signify something else and create a misconception. When multiple semiotic resources were utilised, the mathematical ideas could be better learnt.

Relevance: 30.00%

Abstract:

To understand the diffusion of high technology products such as PCs, digital cameras and DVD players it is necessary to consider the dynamics of successive generations of technology. From the consumer’s perspective, these technology changes may manifest themselves as either a new generation product substituting for the old (for instance digital cameras) or as multiple generations of a single product (for example PCs). To date, research has been confined to aggregate-level sales models. These models consider the demand relationship between one generation of a product and a successor generation. However, they do not give insights into the disaggregate-level decisions by individual households – whether to adopt the newer generation, and if so, when. This paper makes two contributions. It is the first large-scale empirical study to collect household data for successive generations of technologies in an effort to understand the drivers of adoption. Second, in contrast to traditional analysis in diffusion research that conceptualizes technology substitution as an “adoption of innovation” type process, we propose that from a consumer’s perspective, technology substitution combines elements of both adoption (adopting the new generation technology) and replacement (replacing the generation I product with a generation II product).

Key Propositions

In some cases, successive generations are clear “substitutes” for the earlier generation (e.g., PCs: Pentium I to II to III). More commonly, the new generation II technology is a “partial substitute” for the existing generation I technology (e.g., DVD players and VCRs). Some consumers will purchase generation II products as substitutes for their generation I product, while other consumers will purchase generation II products as additional products to be used alongside their generation I product. We propose that substitute generation II purchases combine elements of both adoption and replacement, whereas additional generation II purchases are a purely adoption-driven process. Moreover, drawing on adoption theory, consumer innovativeness is the most important consumer characteristic for the adoption timing of new products. Hence, we hypothesize that consumer innovativeness influences the timing of both additional and substitute generation II purchases, but has a stronger impact on additional generation II purchases. We further propose that substitute generation II purchases act partially as a replacement purchase for the generation I product. Thus, we hypothesize that households with older generation I products will make substitute generation II purchases earlier.

Methods

We employ Cox hazard modeling to study factors influencing the timing of a household’s adoption of generation II products. A separate hazard model is conducted for additional and substitute purchases. The age of the generation I product is calculated based on the most recent household purchase of that product. Control variables include size and income of the household, and age and education of the decision-maker.

Results and Implications

Our preliminary results confirm both our hypotheses. Consumer innovativeness has a strong influence on both additional purchases and substitute purchases. Also consistent with our hypotheses, the age of the generation I product has a dramatic influence on substitute purchases of VCR/DVD players and a strong influence for PCs/notebooks. Yet, as hypothesized, there was no influence on additional purchases. This implies that there is a clear distinction between additional and substitute purchases of generation II products, each with different drivers. For substitute purchases, product age is a key driver. Therefore, marketers of high-technology products can utilize data on generation I product age (e.g., from warranty or loyalty programs) to target customers who are more likely to make a purchase.
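The two hypothesized hazard structures can be illustrated with a small stdlib simulation: generation-I product age enters the substitute-purchase hazard but not the additional-purchase hazard. The rate coefficients below are invented for illustration and do not reproduce the study's estimates:

```python
import random
from statistics import median

random.seed(42)

def substitute_time(gen1_age, innovativeness):
    # Hypothesized: substitute purchases combine adoption and replacement,
    # so both innovativeness and generation-I product age raise the hazard.
    rate = 0.1 + 0.05 * gen1_age + 0.2 * innovativeness
    return random.expovariate(rate)

def additional_time(gen1_age, innovativeness):
    # Hypothesized: additional purchases are purely adoption-driven,
    # so generation-I product age does not enter the hazard at all.
    rate = 0.1 + 0.3 * innovativeness
    return random.expovariate(rate)

n = 2000
sub_old = [substitute_time(8, 0.5) for _ in range(n)]   # old generation-I product
sub_new = [substitute_time(1, 0.5) for _ in range(n)]   # recent generation-I product
add_old = [additional_time(8, 0.5) for _ in range(n)]
add_new = [additional_time(1, 0.5) for _ in range(n)]

print(f"substitute: old {median(sub_old):.2f}y vs new {median(sub_new):.2f}y")
print(f"additional: old {median(add_old):.2f}y vs new {median(add_new):.2f}y")
```

With these made-up rates, households with older generation-I products make substitute purchases markedly earlier, while additional-purchase timing is unaffected by product age, mirroring the pattern the preliminary results report.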

Relevance: 30.00%

Abstract:

Privacy enhancing protocols (PEPs) are a family of protocols that allow secure exchange and management of sensitive user information. They are important in preserving users’ privacy in today’s open environment. Proof of the correctness of PEPs is necessary before they can be deployed. However, the traditional provable security approach, though well established for verifying cryptographic primitives, is not applicable to PEPs. We apply the formal method of Coloured Petri Nets (CPNs) to construct an executable specification of a representative PEP, namely the Private Information Escrow Bound to Multiple Conditions Protocol (PIEMCP). Formal semantics of the CPN specification allow us to reason about various security properties of PIEMCP using state space analysis techniques. This investigation provides us with preliminary insights for modeling and verification of PEPs in general, demonstrating the benefit of applying the CPN-based formal approach to proving the correctness of PEPs.
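The core of the state space analysis mentioned above can be sketched as a plain breadth-first exploration that enumerates reachable states and flags terminal ones (candidate deadlocks). The three-step escrow exchange below is a hypothetical stand-in, not the actual PIEMCP transition system:

```python
from collections import deque

def explore(initial, transitions):
    """Enumerate all reachable states; collect terminal states
    (no enabled transition), which are candidate deadlocks."""
    seen, frontier, terminal = {initial}, deque([initial]), []
    while frontier:
        state = frontier.popleft()
        successors = [s for s in (t(state) for t in transitions) if s is not None]
        if not successors:
            terminal.append(state)
        for s in successors:
            if s not in seen:
                seen.add(s)
                frontier.append(s)
    return seen, terminal

# Hypothetical three-step exchange; each transition fires only when enabled.
def request(s): return "requested" if s == "idle" else None
def escrow(s):  return "escrowed" if s == "requested" else None
def release(s): return "released" if s == "escrowed" else None

states, terminal = explore("idle", [request, escrow, release])
print(states, terminal)  # the only terminal state should be the intended final one
```

CPN tools perform this exploration over coloured markings rather than plain strings, which is what makes the executable specification amenable to checking security properties over every reachable protocol state.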

Relevance: 30.00%

Abstract:

Definition of disease phenotype is a necessary preliminary to research into genetic causes of a complex disease. Clinical diagnosis of migraine is currently based on diagnostic criteria developed by the International Headache Society. Previously, we examined the natural clustering of these diagnostic symptoms using latent class analysis (LCA) and found that a four-class model was preferred. However, the classes can be ordered such that all symptoms progressively intensify, suggesting that a single continuous variable representing disease severity may provide a better model. Here, we compare two models: item response theory and LCA, each constructed within a Bayesian context. A deviance information criterion is used to assess model fit. We phenotyped our population sample using these models, estimated heritability and conducted genome-wide linkage analysis using Merlin-qtl. LCA with four classes was again preferred. After transformation, phenotypic trait values derived from both models are highly correlated (correlation = 0.99) and consequently results from subsequent genetic analyses were similar. Heritability was estimated at 0.37, while multipoint linkage analysis produced genome-wide significant linkage to chromosome 7q31-q33 and suggestive linkage to chromosomes 1 and 2. We argue that such continuous measures are a powerful tool for identifying genes contributing to migraine susceptibility.
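The model comparison above rests on the deviance information criterion, which is computed directly from posterior deviance samples. A minimal sketch, with invented deviance traces rather than the study's actual values:

```python
from statistics import mean

def dic(deviance_samples, deviance_at_posterior_mean):
    """DIC = mean posterior deviance + effective number of parameters,
    where pD = mean deviance - deviance at the posterior mean."""
    d_bar = mean(deviance_samples)
    p_d = d_bar - deviance_at_posterior_mean
    return d_bar + p_d

# Invented deviance traces for two candidate phenotype models:
dic_lca = dic([210.0, 214.0, 206.0], 205.0)  # four-class latent class model
dic_irt = dic([220.0, 224.0, 216.0], 218.0)  # item response theory model
print(dic_lca, dic_irt)  # the model with the lower DIC is preferred
```

In the study this comparison favoured the four-class LCA model, though the near-perfect correlation between the two models' trait values meant the downstream linkage results barely differed.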

Relevance: 30.00%

Abstract:

The inquiry documented in this thesis is located at the nexus of technological innovation and traditional schooling. As we enter the second decade of a new century, few would argue against the increasingly urgent need to integrate digital literacies with traditional academic knowledge. Yet, despite substantial investments from governments and businesses, the adoption and diffusion of contemporary digital tools in formal schooling remain sluggish. To date, research on technology adoption in schools tends to take a deficit perspective of schools and teachers, with the lack of resources and teacher ‘technophobia’ most commonly cited as barriers to digital uptake. Corresponding interventions that focus on increasing funding and upskilling teachers, however, have made little difference to adoption trends in the last decade. Empirical evidence that explicates the cultural and pedagogical complexities of innovation diffusion within long-established conventions of mainstream schooling, particularly from the standpoint of students, is wanting. To address this knowledge gap, this thesis inquires into how students evaluate and account for the constraints and affordances of contemporary digital tools when they engage with them as part of their conventional schooling. It documents the attempted integration of a student-led Web 2.0 learning initiative, known as the Student Media Centre (SMC), into the schooling practices of a long-established, high-performing independent senior boys’ school in urban Australia. The study employed an ‘explanatory’ two-phase research design (Creswell, 2003) that combined complementary quantitative and qualitative methods to achieve both breadth of measurement and richness of characterisation. In the initial quantitative phase, a self-reported questionnaire was administered to the senior school student population to determine adoption trends and predictors of SMC usage (N=481). 
Measurement constructs included individual learning dispositions (learning and performance goals, cognitive playfulness and personal innovativeness), as well as social and technological variables (peer support, perceived usefulness and ease of use). Incremental predictive models of SMC usage were conducted using Classification and Regression Tree (CART) modelling: (i) individual-level predictors, (ii) individual and social predictors, and (iii) individual, social and technological predictors. Peer support emerged as the best predictor of SMC usage. Other salient predictors include perceived ease of use and usefulness, cognitive playfulness and learning goals. On the whole, an overwhelming proportion of students reported low usage levels, low perceived usefulness and a lack of peer support for engaging with the digital learning initiative. The small minority of frequent users reported having high levels of peer support and robust learning goal orientations, rather than being predominantly driven by performance goals. These findings indicate that tensions around social validation, digital learning and academic performance pressures influence students’ engagement with the Web 2.0 learning initiative. The qualitative phase that followed provided insights into these tensions by shifting the analytics from individual attitudes and behaviours to shared social and cultural reasoning practices that explain students’ engagement with the innovation. Six in-depth focus groups, comprising 60 students with different levels of SMC usage, were conducted, audio-recorded and transcribed. Textual data were analysed using Membership Categorisation Analysis. Students’ accounts converged around a key proposition. The Web 2.0 learning initiative was useful-in-principle but useless-in-practice.
While students endorsed the usefulness of the SMC for enhancing multimodal engagement, extending peer-to-peer networks and acquiring real-world skills, they also called attention to a number of constraints that impeded the realisation of these design affordances in practice. These constraints were cast in terms of three binary formulations of social and cultural imperatives at play within the school: (i) ‘cool/uncool’, (ii) ‘dominant staff/compliant student’, and (iii) ‘digital learning/academic performance’. The first formulation foregrounds the social stigma of the SMC among peers and its resultant lack of positive network benefits. The second relates to students’ perception of the school culture as authoritarian and punitive with adverse effects on the very student agency required to drive the innovation. The third points to academic performance pressures in a crowded curriculum with tight timelines. Taken together, findings from both phases of the study provide the following key insights. First, students endorsed the learning affordances of contemporary digital tools such as the SMC for enhancing their current schooling practices. For the majority of students, however, these learning affordances were overshadowed by the performative demands of schooling, both social and academic. The student participants saw engagement with the SMC in-school as distinct from, even oppositional to, the conventional social and academic performance indicators of schooling, namely (i) being ‘cool’ (or at least ‘not uncool’), (ii) sufficiently ‘compliant’, and (iii) achieving good academic grades. Their reasoned response therefore, was simply to resist engagement with the digital learning innovation. Second, a small minority of students seemed dispositionally inclined to negotiate the learning affordances and performance constraints of digital learning and traditional schooling more effectively than others.
These students were able to engage more frequently and meaningfully with the SMC in school. Their ability to adapt and traverse seemingly incommensurate social and institutional identities and norms is theorised as cultural agility – a dispositional construct that comprises personal innovativeness, cognitive playfulness and learning goals orientation. The logic then is ‘both and’ rather than ‘either or’ for these individuals with a capacity to accommodate both learning and performance in school, whether in terms of digital engagement and academic excellence, or successful brokerage across multiple social identities and institutional affiliations within the school. In sum, this study takes us beyond the familiar terrain of deficit discourses that tend to blame institutional conservatism, lack of resourcing and teacher resistance for low uptake of digital technologies in schools. It does so by providing an empirical base for the development of a ‘third way’ of theorising technological and pedagogical innovation in schools, one which is more informed by students as critical stakeholders and thus more relevant to the lived culture within the school, and its complex relationship to students’ lives outside of school. It is in this relationship that we find an explanation for how these individuals can, at the one time, be digital kids and analogue students.
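The CART modelling in the quantitative phase selects predictors by the impurity reduction their splits achieve. A minimal sketch with invented respondent data shows why peer support would emerge as the best predictor if it separated frequent users cleanly; the feature names and rows are hypothetical:

```python
def gini(labels):
    """Gini impurity of a set of binary class labels (0 = pure)."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def split_impurity(rows, feature):
    """Weighted impurity after splitting on a binary feature: lower is better.
    rows are (features_dict, usage_label) pairs."""
    left = [y for x, y in rows if x[feature]]
    right = [y for x, y in rows if not x[feature]]
    n = len(rows)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

# Invented data: peer support separates SMC users perfectly, playfulness does not.
rows = [({"peer_support": 1, "playfulness": 1}, 1),
        ({"peer_support": 1, "playfulness": 0}, 1),
        ({"peer_support": 0, "playfulness": 1}, 0),
        ({"peer_support": 0, "playfulness": 0}, 0),
        ({"peer_support": 1, "playfulness": 0}, 1),
        ({"peer_support": 0, "playfulness": 1}, 0)]
print(split_impurity(rows, "peer_support"), split_impurity(rows, "playfulness"))
```

CART greedily chooses the lowest-impurity split at each node, so a predictor like peer support that cleanly partitions users from non-users is selected first.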

Relevance: 30.00%

Abstract:

Over recent years, many scholars have studied the conceptual modeling of information systems based on a theory of ontological expressiveness. This theory offers four constructs that inform properties of modeling grammars in the form of ontological deficiencies, and their implications for development and use of conceptual modeling in IS practice. In this paper we report on the development of a valid and reliable instrument for measuring the perceptions that individuals have of the ontological deficiencies of conceptual modeling grammars. We describe a multi-stage approach for instrument development that incorporates feedback from expert and user panels. We also report on a field test of the instrument with 590 modeling practitioners. We further study how different levels of modeling experience influence user perceptions of ontological deficiencies of modeling grammars. We provide implications for practice and future research.
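Instrument reliability of the kind claimed above is commonly assessed with Cronbach's alpha across the items of each perception construct. A minimal stdlib sketch with illustrative scores (not the study's data), assuming alpha is among the reliability statistics used:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for k items: k/(k-1) * (1 - sum of item variances
    over the variance of respondents' total scores).
    item_scores: one list of scores per item, respondents in the same order."""
    k = len(item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    item_var = sum(pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Illustrative responses: three perfectly consistent items vs two noisy ones
consistent = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
noisy = [[1, 2, 3, 4], [2, 1, 4, 3]]
print(cronbach_alpha(consistent), cronbach_alpha(noisy))
```

Items that covary strongly push alpha toward 1, which is the evidence usually offered that a multi-item perception scale measures a single underlying deficiency construct reliably.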

Relevance: 30.00%

Abstract:

A configurable process model describes a family of similar process models in a given domain. Such a model can be configured to obtain a specific process model that is subsequently used to handle individual cases, for instance, to process customer orders. Process configuration is notoriously difficult as there may be all kinds of interdependencies between configuration decisions. In fact, an incorrect configuration may lead to behavioral issues such as deadlocks and livelocks. To address this problem, we present a novel verification approach inspired by the "operating guidelines" used for partner synthesis. We view the configuration process as an external service, and compute a characterization of all such services which meet particular requirements using the notion of configuration guideline. As a result, we can characterize all feasible configurations (i.e., configurations without behavioral problems) at design time, instead of repeatedly checking each individual configuration while configuring a process model.
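The design-time characterization can be illustrated by enumerating configuration decisions against interdependency constraints, with the constraint set playing the role of the configuration guideline. The ordering-process decisions and the single constraint below are hypothetical examples, not taken from the paper:

```python
from itertools import product

def feasible_configurations(options, constraints):
    """Characterize all feasible configurations at design time: enumerate
    assignments of configuration decisions and keep those satisfying every
    interdependency constraint (the role of the configuration guideline)."""
    names = list(options)
    for values in product(*(options[n] for n in names)):
        config = dict(zip(names, values))
        if all(check(config) for check in constraints):
            yield config

# Hypothetical order process: shipping without approval would deadlock.
options = {"approval": ["on", "off"], "shipping": ["on", "off"]}
constraints = [lambda c: not (c["shipping"] == "on" and c["approval"] == "off")]
guideline = list(feasible_configurations(options, constraints))
print(guideline)
```

The actual approach derives the guideline from the process model's behavior via partner synthesis rather than from hand-written predicates, but the payoff is the same: each candidate configuration is checked against the precomputed characterization instead of being verified behaviorally from scratch.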