894 results for Combinational Logic
Abstract:
In service interaction modelling, it is customary to distinguish between two types of models: choreographies and orchestrations. A choreography describes interactions within a collection of services from a global perspective, where no service plays a privileged role. Instead, services interact in a peer-to-peer manner. In contrast, an orchestration describes the interactions between one particular service, the orchestrator, and a number of partner services. The main proposition of this work is an approach to bridge these two modelling viewpoints by synthesising orchestrators from choreographies. To start with, choreographies are defined using a simple behaviour description language based on communicating finite state machines. From such a model, orchestrators are initially synthesised in the form of state machines. It turns out that state machines are not suitable for orchestration modelling, because orchestrators generally need to engage in concurrent interactions. To address this issue, a technique is proposed to transform state machines into process models in the Business Process Modelling Notation (BPMN). Orchestrations represented in BPMN can then be augmented with additional business logic to achieve value-adding mediation. In addition, techniques exist for refining BPMN models into executable process definitions. The transformation from state machines to BPMN relies on Petri nets as an intermediary representation and leverages techniques from the theory of regions to identify concurrency in the initial Petri net. Once concurrency has been identified, the resulting Petri net is transformed into a BPMN model. The original contributions of this work are: an algorithm to synthesise orchestrators from choreographies, and a rules-based transformation from Petri nets into BPMN.
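As a minimal illustration of the synthesis idea (an assumed encoding, not the thesis' actual algorithm): a choreography can be given as a global state machine whose transitions are interactions (sender, receiver, message), and a mediating orchestrator derived by splitting each interaction into a receive from the sender followed by a send to the receiver:

```python
# Hedged sketch: choreography as a global FSM; transitions are labelled
# (sender, receiver, message). The orchestrator mediates every interaction,
# so each choreography transition becomes a receive step then a send step.
# All names here are illustrative.

def synthesise_orchestrator(choreo):
    """choreo: dict state -> list of ((sender, receiver, msg), next_state)."""
    orch = {}
    for state, transitions in choreo.items():
        for (sender, receiver, msg), nxt in transitions:
            mid = (state, msg)  # intermediate orchestrator state per interaction
            orch.setdefault(state, []).append((("recv", sender, msg), mid))
            orch.setdefault(mid, []).append((("send", receiver, msg), nxt))
    return orch

choreo = {
    "s0": [(("Buyer", "Seller", "order"), "s1")],
    "s1": [(("Seller", "Buyer", "invoice"), "s2")],
    "s2": [],
}
orch = synthesise_orchestrator(choreo)
```

Each global interaction thus yields two orchestrator transitions; detecting where these sequential fragments can actually run concurrently is exactly what the Petri-net/theory-of-regions step in the thesis addresses.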
Abstract:
Privacy enhancing protocols (PEPs) are a family of protocols that allow secure exchange and management of sensitive user information. They are important in preserving users’ privacy in today’s open environment. Proof of the correctness of PEPs is necessary before they can be deployed. However, the traditional provable security approach, though well established for verifying cryptographic primitives, is not applicable to PEPs. We apply the formal method of Coloured Petri Nets (CPNs) to construct an executable specification of a representative PEP, namely the Private Information Escrow Bound to Multiple Conditions Protocol (PIEMCP). Formal semantics of the CPN specification allow us to reason about various security properties of PIEMCP using state space analysis techniques. This investigation provides us with preliminary insights for modeling and verification of PEPs in general, demonstrating the benefit of applying the CPN-based formal approach to proving the correctness of PEPs.
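The kind of state-space analysis referred to above can be sketched generically (a toy transition system, not the PIEMCP model): exhaustively explore every reachable state and report any state that violates a security predicate:

```python
from collections import deque

# Illustrative sketch only: breadth-first exploration of a transition system,
# the core of CPN state-space verification. A security property holds if no
# reachable state violates it; a violating state is a counterexample.

def explore(initial, successors, violates):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if violates(state):
            return state              # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None                       # property holds on all reachable states

# Toy "protocol": a counter modulo 4 must never reach state 4 -- and cannot.
result = explore(0, lambda s: [(s + 1) % 4], lambda s: s == 4)
```

For a real CPN model the states are coloured markings and the successor relation is given by transition firings, but the exhaustive-reachability logic is the same.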
Abstract:
The inquiry documented in this thesis is located at the nexus of technological innovation and traditional schooling. As we enter the second decade of a new century, few would argue against the increasingly urgent need to integrate digital literacies with traditional academic knowledge. Yet, despite substantial investments from governments and businesses, the adoption and diffusion of contemporary digital tools in formal schooling remain sluggish. To date, research on technology adoption in schools tends to take a deficit perspective of schools and teachers, with the lack of resources and teacher ‘technophobia’ most commonly cited as barriers to digital uptake. Corresponding interventions that focus on increasing funding and upskilling teachers, however, have made little difference to adoption trends in the last decade. Empirical evidence that explicates the cultural and pedagogical complexities of innovation diffusion within long-established conventions of mainstream schooling, particularly from the standpoint of students, is wanting. To address this knowledge gap, this thesis inquires into how students evaluate and account for the constraints and affordances of contemporary digital tools when they engage with them as part of their conventional schooling. It documents the attempted integration of a student-led Web 2.0 learning initiative, known as the Student Media Centre (SMC), into the schooling practices of a long-established, high-performing independent senior boys’ school in urban Australia. The study employed an ‘explanatory’ two-phase research design (Creswell, 2003) that combined complementary quantitative and qualitative methods to achieve both breadth of measurement and richness of characterisation. In the initial quantitative phase, a self-reported questionnaire was administered to the senior school student population to determine adoption trends and predictors of SMC usage (N=481). 
Measurement constructs included individual learning dispositions (learning and performance goals, cognitive playfulness and personal innovativeness), as well as social and technological variables (peer support, perceived usefulness and ease of use). Incremental predictive models of SMC usage were built using Classification and Regression Tree (CART) modelling: (i) individual-level predictors, (ii) individual and social predictors, and (iii) individual, social and technological predictors. Peer support emerged as the best predictor of SMC usage. Other salient predictors included perceived ease of use and usefulness, cognitive playfulness and learning goals. On the whole, an overwhelming proportion of students reported low usage levels, low perceived usefulness and a lack of peer support for engaging with the digital learning initiative. The small minority of frequent users reported having high levels of peer support and robust learning goal orientations, rather than being predominantly driven by performance goals. These findings indicate that tensions around social validation, digital learning and academic performance pressures influence students’ engagement with the Web 2.0 learning initiative. The qualitative phase that followed provided insights into these tensions by shifting the analytic focus from individual attitudes and behaviours to the shared social and cultural reasoning practices that explain students’ engagement with the innovation. Six in-depth focus groups, comprising 60 students with different levels of SMC usage, were conducted, audio-recorded and transcribed. Textual data were analysed using Membership Categorisation Analysis. Students’ accounts converged around a key proposition: the Web 2.0 learning initiative was useful-in-principle but useless-in-practice.
While students endorsed the usefulness of the SMC for enhancing multimodal engagement, extending peer-to-peer networks and acquiring real-world skills, they also called attention to a number of constraints that hindered the realisation of these design affordances in practice. These constraints were cast in terms of three binary formulations of social and cultural imperatives at play within the school: (i) ‘cool/uncool’, (ii) ‘dominant staff/compliant student’, and (iii) ‘digital learning/academic performance’. The first formulation foregrounds the social stigma of the SMC among peers and its resultant lack of positive network benefits. The second relates to students’ perception of the school culture as authoritarian and punitive, with adverse effects on the very student agency required to drive the innovation. The third points to academic performance pressures in a crowded curriculum with tight timelines. Taken together, findings from both phases of the study provide the following key insights. First, students endorsed the learning affordances of contemporary digital tools such as the SMC for enhancing their current schooling practices. For the majority of students, however, these learning affordances were overshadowed by the performative demands of schooling, both social and academic. The student participants saw engagement with the SMC in school as distinct from, even oppositional to, the conventional social and academic performance indicators of schooling, namely (i) being ‘cool’ (or at least ‘not uncool’), (ii) being sufficiently ‘compliant’, and (iii) achieving good academic grades. Their reasoned response, therefore, was simply to resist engagement with the digital learning innovation. Second, a small minority of students seemed dispositionally inclined to negotiate the learning affordances and performance constraints of digital learning and traditional schooling more effectively than others.
These students were able to engage more frequently and meaningfully with the SMC in school. Their ability to adapt and traverse seemingly incommensurate social and institutional identities and norms is theorised as cultural agility – a dispositional construct that comprises personal innovativeness, cognitive playfulness and learning goals orientation. The logic, then, is ‘both-and’ rather than ‘either-or’ for these individuals, who have a capacity to accommodate both learning and performance in school, whether in terms of digital engagement and academic excellence, or successful brokerage across multiple social identities and institutional affiliations within the school. In sum, this study takes us beyond the familiar terrain of deficit discourses that tend to blame institutional conservatism, lack of resourcing and teacher resistance for the low uptake of digital technologies in schools. It does so by providing an empirical base for the development of a ‘third way’ of theorising technological and pedagogical innovation in schools, one which is more informed by students as critical stakeholders and thus more relevant to the lived culture within the school, and to its complex relationship with students’ lives outside of school. It is in this relationship that we find an explanation for how these individuals can, at one and the same time, be digital kids and analogue students.
Abstract:
The MDG deadline is fast approaching and the climate within the United Nations remains positive but skeptical. A common feeling is that a great deal of work and headway has been made, but the MDG goals will not be achieved in full by 2015. The largest problem facing the MDGs is, and unless mitigated may remain, mismanaged governance. This argument is supported by a strong line of publications stemming from the United Nations that target methods (depending on the regional or country context) for improving governance, such as combating corruption and instituting accountability, peace, stability and transparency. Furthermore, a logical assessment of the framework in which the MDGs operate (i.e. international pressure and local civil socio-economic and/or political initiatives pushing governments to progress with the MDGs) identifies the State's governing apparatus as the key to the success of the MDGs. It is argued that a new analytic framework and grounded theory of democracy (the Element of Democracy) is needed in order to improve governance and enhance democracy. By looking beyond the confines of the MDGs and focusing on properly rectifying poor governance, the progress of the MDGs can be accelerated, as societies and their governments will be - at minimum - held more accountable for the success of programs in their respective countries. The paper demonstrates the logic of this argument - especially highlighting a new way of viewing democracy - and certain early practices which can accelerate the MDGs in the short to medium term.
Abstract:
Near-infrared spectroscopy is a somewhat underutilised technique for the study of minerals. The technique has the ability to determine water content, hydroxyl groups and transition metals. In this paper we show the application of NIR spectroscopy to the study of selected minerals. The structure and spectral properties of two Cu-tellurite minerals, graemite and teineite, are compared with the bismuth-containing tellurite mineral smirnite by the application of NIR and IR spectroscopy. The position of the Cu2+ bands and their splitting in the electronic spectra of tellurites are in conformity with octahedral geometry distortion. The spectral pattern of smirnite resembles graemite, and the observed band at 10855 cm-1 with a weak shoulder at 7920 cm-1 is identified as due to the Cu2+ ion. Any transition metal impurities may be identified by their bands in this spectral region. Three prominent bands observed in the region of 7200-6500 cm-1 are the overtones of water, whilst the weak bands observed near 6200 cm-1 in tellurites may be attributed to hydrogen bonding between (TeO3)2- and H2O. The observation of a number of bands centred at around 7200 cm-1 confirms molecular water in tellurite minerals. A number of overlapping bands at low wavenumbers (4500-4000 cm-1) is the result of combination modes of the (TeO3)2- ion. The appearance of the most intense peak at 5200 cm-1 with a pair of weak bands near 6000 cm-1 is a common feature in all the spectra and is related to combinations of the OH stretching vibrations of water molecules and the bending vibration ν2 (δ H2O). The bending vibration δ H2O observed in the IR spectra shows a single band for smirnite at 1610 cm-1. The resolution of this band into a number of components is evidence of non-equivalent types of molecular water in graemite and teineite. (TeO3)2- stretching vibrations are characterized by three main absorptions at 1080, 780 and 695 cm-1.
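The overtone and combination assignments above can be sanity-checked against textbook gas-phase H2O fundamentals (values assumed here, not taken from the paper); hydrogen bonding in the mineral red-shifts these, so only rough agreement with the observed 5200 cm-1 peak and 7200-6500 cm-1 region is expected:

```python
# Textbook gas-phase H2O fundamentals (cm^-1) -- assumed reference values;
# mineral-bound water is red-shifted by hydrogen bonding.
nu1, nu2, nu3 = 3657, 1595, 3756   # symmetric stretch, bend, asymmetric stretch

combination = nu2 + nu3            # (nu2 + nu3) combination band
overtone = 2 * nu3                 # first overtone of the asymmetric stretch

print(combination)                 # 5351: compare the observed ~5200 cm^-1 peak
print(overtone)                    # 7512: compare the 7200-6500 cm^-1 overtone region
```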
Abstract:
This article is concerned with the repercussions of societal change on transnational media. It offers a new understanding of multilingual programming strategies by examining “Radio MultiKulti” (RM), a public service radio station discontinued on 1 January 2009 by Rundfunk Berlin-Brandenburg. In its fourteen years of existence, “RM” had to implement a well-intended and politically motivated logic of a ‘multiethnic, intercultural service station’. However, as we demonstrate, this direction, despite some achievements, resulted in constraints on RM’s journalistic activities and language policy, and drew criticism over the station’s economic viability. This paper proposes that multilingual media services be framed by the concept of practical hybridity, which allows the necessary responsiveness towards an ever-changing media environment, currently shaped by digital culture. Our approach draws on Mikhail Bakhtin’s and Yuri Lotman’s theoretical approaches to hybridity, as well as on in-depth interviews conducted with “RM” staff from 2005 onwards, further interviews with key agents outside RM, and continuous monitoring of the public debate which culminated at the end of 2008 in the controversial decision to close the radio station. Against this background, the concluding remarks are meant to contribute to the scholarly debate on hybridization as well as to inform multilingual media policy in the 21st century.
Abstract:
Multilevel inverters provide an attractive solution for power electronics when both reduced harmonic content and high voltages are required. In this paper, a novel predictive current control technique is proposed for a three-phase multilevel inverter, which controls the capacitor voltages and load currents with low switching losses. The advantage of this contribution is that the technique can be applied to more voltage levels without significantly changing the control circuit. A three-phase three-level inverter with a purely inductive load has been implemented to track reference currents using analogue circuits and a programmable logic device.
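The one-step-ahead prediction at the heart of such a scheme can be sketched as follows for a purely inductive load (all values illustrative; the paper's multilevel topology, capacitor-voltage balancing and switching-loss terms are not modelled):

```python
# Minimal predictive current control sketch for a pure inductance L:
# Euler prediction  i(k+1) = i(k) + (Ts / L) * v(k),
# then pick the inverter voltage level whose predicted current is closest
# to the reference. Ts, L and the level voltages are illustrative values.

def best_level(i_now, i_ref, levels, Ts=1e-4, L=10e-3):
    def cost(v):
        i_pred = i_now + Ts / L * v       # one-step-ahead current prediction
        return abs(i_ref - i_pred)        # tracking error for this level
    return min(levels, key=cost)

levels = [-200, -100, 0, 100, 200]        # example three-level line voltages (V)
v = best_level(i_now=0.0, i_ref=1.0, levels=levels)
```

In a real multilevel controller the cost function would also penalise capacitor-voltage imbalance and switching transitions, which is how the low switching losses mentioned above are obtained.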
Abstract:
Modern computer graphics systems are able to construct renderings of such high quality that viewers are deceived into regarding the images as coming from a photographic source. Large amounts of computing resources are expended in this rendering process, using complex mathematical models of lighting and shading. However, psychophysical experiments have revealed that viewers only regard certain informative regions within a presented image. Furthermore, it has been shown that these visually important regions contain low-level visual feature differences that attract the attention of the viewer. This thesis will present a new approach to image synthesis that exploits these experimental findings by modulating the spatial quality of image regions by their visual importance. Efficiency gains are therefore reaped, without sacrificing much of the perceived quality of the image. Two tasks must be undertaken to achieve this goal. Firstly, the design of an appropriate region-based model of visual importance, and secondly, the modification of progressive rendering techniques to effect an importance-based rendering approach. A rule-based fuzzy logic model is presented that computes, using spatial feature differences, the relative visual importance of regions in an image. This model improves upon previous work by incorporating threshold effects induced by global feature difference distributions and by using texture concentration measures. A modified approach to progressive ray-tracing is also presented. This new approach uses the visual importance model to guide the progressive refinement of an image. In addition, this concept of visual importance has been incorporated into supersampling, texture mapping and computer animation techniques. Experimental results are presented, illustrating the efficiency gains reaped from using this method of progressive rendering. 
This visual importance-based rendering approach is expected to have applications in the entertainment industry, where image fidelity may be sacrificed for efficiency purposes, as long as the overall visual impression of the scene is maintained. Different aspects of the approach should find many other applications in image compression, image retrieval, progressive data transmission and active robotic vision.
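As a rough illustration of the core idea (a deliberately crisp stand-in for the thesis' rule-based fuzzy model, with made-up feature values): region importance can be scored by feature difference from the global mean, and the progressive-rendering budget allocated proportionally:

```python
# Simplified stand-in for importance-based rendering: regions whose features
# differ most from the global mean are deemed most visually important and
# receive proportionally more rays. Feature values are illustrative.

def importance(region_features, global_mean):
    return sum(abs(f - m) for f, m in zip(region_features, global_mean))

def allocate_rays(regions, total_rays):
    n_feats = len(regions[0])
    mean = [sum(r[i] for r in regions) / len(regions) for i in range(n_feats)]
    scores = [importance(r, mean) for r in regions]
    total = sum(scores) or 1.0            # guard against an all-uniform image
    return [round(total_rays * s / total) for s in scores]

regions = [(0.9, 0.1), (0.5, 0.5), (0.1, 0.9)]   # e.g. (contrast, texture) per region
budget = allocate_rays(regions, total_rays=1000)
```

The thesis' model additionally applies fuzzy rules and global threshold effects; this sketch only shows how a per-region score can steer progressive refinement.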
Abstract:
SCAPE is an interactive simulation that allows teachers and students to experiment with sustainable urban design. The project is based on the Kelvin Grove Urban Village, Brisbane. Groups of students role-play as political, retail, elderly, student, council and builder characters to negotiate game decisions around land use, density, housing types and transport in order to design a sustainable urban community. As they do so, the 3D simulation reacts in real time to illustrate what the village would look like, as well as provide statistical information about the community they are creating. SCAPE brings together education, urban-professional and technology expertise, helping it achieve educational outcomes, reflect real-world scenarios and include sophisticated logic and decision-making processes and effects.

The research methodology was primarily practice-led, underpinned by action research methods, resulting in innovative approaches and techniques for adapting digital games and simulation technologies to create dynamic and engaging experiences in pedagogical contexts. It also illustrates the possibilities for urban designers to engage a variety of communities in the processes, complexities and possibilities of urban development and sustainability.
Abstract:
We propose to design a Custom Learning System that responds to the unique needs and potentials of individual students, regardless of their location, abilities, attitudes, and circumstances. This project is intentionally provocative and future-looking but it is not unrealistic or unfeasible. We propose that by combining complex learning databases with a learner’s personal data, we could provide all students with a personal, customizable, and flexible education. This paper presents the initial research undertaken for this project of which the main challenges were to broadly map the complex web of data available, to identify what logic models are required to make the data meaningful for learning, and to translate this knowledge into simple and easy-to-use interfaces. The ultimate outcome of this research will be a series of candidate user interfaces and a broad system logic model for a new smart system for personalized learning. This project is student-centered, not techno-centric, aiming to deliver innovative solutions for learners and schools. It is deliberately future-looking, allowing us to ask questions that take us beyond the limitations of today to motivate new demands on technology.
Abstract:
A configurable process model describes a family of similar process models in a given domain. Such a model can be configured to obtain a specific process model that is subsequently used to handle individual cases, for instance, to process customer orders. Process configuration is notoriously difficult as there may be all kinds of interdependencies between configuration decisions. In fact, an incorrect configuration may lead to behavioral issues such as deadlocks and livelocks. To address this problem, we present a novel verification approach inspired by the "operating guidelines" used for partner synthesis. We view the configuration process as an external service, and compute a characterization of all such services which meet particular requirements using the notion of a configuration guideline. As a result, we can characterize all feasible configurations (i.e., configurations without behavioral problems) at design time, instead of repeatedly checking each individual configuration while configuring a process model.
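A toy illustration of why such checking matters (a simple reachability test on a marking graph, not the operating-guideline construction itself): blocking a transition during configuration can make a non-final marking reachable from which nothing is enabled, i.e., a deadlock:

```python
# Toy deadlock check on an explicit marking graph. A configuration that
# removes (blocks) transitions can strand the process in a marking that is
# neither final nor able to fire anything. Markings and names are illustrative.

def has_deadlock(initial, transitions, finals):
    """transitions: dict marking -> list of next markings (after configuration)."""
    seen, stack = {initial}, [initial]
    while stack:
        m = stack.pop()
        if not transitions.get(m) and m not in finals:
            return True                   # reachable non-final dead marking
        for nxt in transitions.get(m, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return False

full = {"m0": ["m1"], "m1": ["m2"], "m2": []}   # original model, sound: m2 is final
configured = {"m0": ["m1"], "m1": []}           # configuration blocked m1 -> m2
```

The configuration-guideline approach avoids running such a check per configuration by characterising all deadlock-free configurations once, at design time.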
Abstract:
Local governments are service driven rather than asset driven. Understanding this distinction is critical to ensuring that community needs are appropriately addressed. Translating community needs and desires into infrastructure is a complex yet little understood process. In this paper, we look at two case studies that explore the interface between service outcomes and the specification of performance requirements for the assets. The two case studies are: a public health issue resulting from inadequate public amenities in a beach resort, and the prioritisation of maintenance work in a world of increasing service demands and declining funding. Both case studies use the same investment logic mapping framework to establish clear drivers for the problem that councils are responding to in delivering their services. The key to the framework is the separation of concerns between service management and asset management.
Abstract:
In the quest for shorter time-to-market, higher quality and reduced cost, model-driven software development has emerged as a promising approach to software engineering. The central idea is to promote models to first-class citizens in the development process. Starting from a set of very abstract models in the early stage of the development, they are refined into more concrete models and finally, as a last step, into code. As early phases of development focus on different concepts compared to later stages, various modelling languages are employed to most accurately capture the concepts and relations under discussion. In light of this refinement process, translating between modelling languages becomes a time-consuming and error-prone necessity. This is remedied by model transformations providing support for reusing and automating recurring translation efforts. These transformations typically can only be used to translate a source model into a target model, but not vice versa. This poses a problem if the target model is subject to change. In this case the models get out of sync and therefore do not constitute a coherent description of the software system anymore, leading to erroneous results in later stages. This is a serious threat to the promised benefits of quality, cost-saving, and time-to-market. Therefore, providing a means to restore synchronisation after changes to models is crucial if the model-driven vision is to be realised. This process of reflecting changes made to a target model back to the source model is commonly known as Round-Trip Engineering (RTE). While there are a number of approaches to this problem, they impose restrictions on the nature of the model transformation. Typically, in order for a transformation to be reversed, for every change to the target model there must be exactly one change to the source model. 
While this makes synchronisation relatively “easy”, it is ill-suited for many practically relevant transformations as they do not have this one-to-one character. To overcome these issues and to provide a more general approach to RTE, this thesis puts forward an approach in two stages. First, a formal understanding of model synchronisation on the basis of non-injective transformations (where a number of different source models can correspond to the same target model) is established. Second, detailed techniques are devised that allow the implementation of this understanding of synchronisation. A formal underpinning for these techniques is drawn from abductive logic reasoning, which allows the inference of explanations from an observation in the context of a background theory. As non-injective transformations are the subject of this research, there might be a number of changes to the source model that all equally reflect a certain target model change. To help guide the procedure in finding “good” source changes, model metrics and heuristics are investigated. Combining abductive reasoning with best-first search and a “suitable” heuristic enables efficient computation of a number of “good” source changes. With this procedure Round-Trip Engineering of non-injective transformations can be supported.
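The search component described above can be sketched as follows (a toy stand-in: the abductive inference itself is not modelled, and all names are illustrative). Candidate source changes are popped from a priority queue ordered by a heuristic cost and accepted when their translation reproduces the observed target change:

```python
import heapq

# Hedged sketch of best-first search over candidate source-model changes.
# Because the transformation is non-injective, several source changes may
# map to the same target change; the heuristic cost ranks the candidates
# so the "best" explanations are found first.

def best_source_changes(candidates, transform, target_change, cost, k=1):
    heap = [(cost(c), i, c) for i, c in enumerate(candidates)]  # i breaks ties
    heapq.heapify(heap)
    found = []
    while heap and len(found) < k:
        _, _, c = heapq.heappop(heap)
        if transform(c) == target_change:   # candidate explains the observation
            found.append(c)
    return found

# Toy: three distinct source edits all translate to the same target edit.
candidates = ["rename+move", "rename", "delete+recreate"]
best = best_source_changes(
    candidates,
    transform=lambda c: "rename",           # non-injective forward transformation
    target_change="rename",                 # observed target-model change
    cost=len,                               # heuristic: prefer the smallest edit
)
```

In the thesis the candidates come from abductive reasoning over the transformation rules and the cost from model metrics; the queue discipline shown here is what makes the exploration best-first.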
Abstract:
The OED informs us that “gender” has at its root the Latin genus, meaning “race, kind,” and emerges as early as the fifth century as a term for differentiating between types of (especially) people and words. In the following 1500 years, gender appears in linguistic and biological contexts to distinguish types of words and bodies from one another, as when words in Indo-European languages were identified as masculine, feminine, or neuter, and humans were identified as male or female. It is telling that gender has historically (whether overtly or covertly) been a tool of negotiation between our understandings of bodies, and meanings derived from and attributed to them. Within the field of children’s literature studies, as in other disciplines, gender in and of itself is rarely the object of critique. Rather, specific constructions of gender structure understandings of subjectivity; allow or disallow certain behaviors or experiences on the basis of biological sex; and dictate a specific vision of social relations and organization. Critical approaches to gender in children’s literature have included linguistic analysis (Turner-Bowker; Sunderland); analysis of visual representations (Bradford; Moebius); cultural images of females (Grauerholz and Pescosolido); consideration of gender and genre (Christian-Smith; Stephens); ideological (Nodelman and Reimer); psychoanalytic (Coats); discourse analysis (Stephens); and masculinity studies (Nodelman) among others. In the adjacent fields of education and literacy studies, gender has been a sustained point of investigation, often deriving from perceived gendering of pedagogical practices (Lehr) or of reading preferences and competencies, and in recent years, perceptions of boys as “reluctant readers” (Moss). 
The ideology of patriarchy has primarily come under critical scrutiny because it has been used to locate characters and readers within the specific binary logic of gender relations that historically subordinated the feminine to the masculine. Just as feminism might be broadly defined as resistance to existing power structures, a gendered reading might be broadly defined as a “resistant reading” in that it most often reveals or contests that which a text assumes to be the norm.