78 results for Slavic Languages and Societies
Abstract:
The present paper motivates the study of mind change complexity for learning minimal models of length-bounded logic programs. It establishes ordinal mind change complexity bounds for learnability of these classes both from positive facts and from positive and negative facts. Building on Angluin’s notion of finite thickness and Wright’s work on finite elasticity, Shinohara defined the property of bounded finite thickness to give a sufficient condition for learnability of indexed families of computable languages from positive data. This paper shows that an effective version of Shinohara’s notion of bounded finite thickness gives sufficient conditions for learnability with ordinal mind change bound, both in the context of learnability from positive data and for learnability from complete (both positive and negative) data. Let Omega be a notation for the first limit ordinal. Then, it is shown that if a language defining framework yields a uniformly decidable family of languages and has effective bounded finite thickness, then for each natural number m > 0, the class of languages defined by formal systems of length <= m:
• is identifiable in the limit from positive data with a mind change bound of Omega^m;
• is identifiable in the limit from both positive and negative data with an ordinal mind change bound of Omega × m.
The above sufficient conditions are employed to give an ordinal mind change bound for learnability of minimal models of various classes of length-bounded Prolog programs, including Shapiro’s linear programs, Arimura and Shinohara’s depth-bounded linearly covering programs, and Krishna Rao’s depth-bounded linearly moded programs. It is also noted that the bound for learning from positive data is tight for the example classes considered.
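Stated compactly, and assuming the standard notation of the inductive inference literature (the abstract itself stays in prose), where TxtEx_α and InfEx_α denote identification in the limit from positive data and from complete data, respectively, with an ordinal mind change bound of α, the two results read:
\mathcal{L}_m = \{\, L(\Gamma) : \Gamma \text{ a formal system of length} \le m \,\}, \qquad \mathcal{L}_m \in \mathrm{TxtEx}_{\Omega^{m}}, \qquad \mathcal{L}_m \in \mathrm{InfEx}_{\Omega \times m}.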
Abstract:
In the era of the knowledge economy, cities and regions have started increasingly investing in their physical, social and knowledge infrastructures so as to foster, attract and retain global talent and investment. Knowledge-based urban development as a new paradigm in urban planning and development is being implemented across the globe in order to increase the competitiveness of cities and regions. This chapter provides an overview of the lessons from Multimedia Super Corridor, Malaysia, as one of the first large scale manifestations of knowledge-based urban development in South East Asia. The chapter investigates the application of the knowledge-based urban development concept within the Malaysian context, and, particularly, scrutinises the development and evolution of Multimedia Super Corridor by focusing on strategies, implementation policies, infrastructural implications, and agencies involved in the development and management of the corridor. In the light of the literature and case findings, the chapter provides generic recommendations, on the orchestration of knowledge-based urban development, for other cities and regions seeking such development.
Abstract:
While Business Process Management (BPM) is an established discipline, the increased adoption of BPM technology in recent years has introduced new challenges. One challenge concerns dealing with the ever-growing complexity of business process models. Mechanisms for dealing with this complexity can be classified into two categories: 1) those that are solely concerned with the visual representation of the model and 2) those that change its inner structure. While significant attention is paid to the latter category in the BPM literature, this paper focuses on the former category. It presents a collection of patterns that generalize and conceptualize various existing mechanisms to change the visual representation of a process model. Next, it provides a detailed analysis of the degree of support for these patterns in a number of state-of-the-art languages and tools. This paper concludes with the results of a usability evaluation of the patterns conducted with BPM practitioners.
Abstract:
I commence this opinion piece with specific reference to the Gillard Government's decision to cut funding for the Australian Learning and Teaching Council (ALTC) in 2011. I then consider the impact of this decision on quality teaching in higher education with specific reference to Studies of Asia. In particular, I reflect on the teaching of Asian languages and cultures in Australia since the 1970 Auchmuty report, and conclude that despite the efforts of policy makers, not much has really changed. In doing so, I emphasise the importance of quality teaching in higher education for inspiring students to challenge their cultural assumptions and to prompt them to develop new views of the world.
Abstract:
The final shape of the "Internet of Things" that ubiquitous computing promises relies on a cybernetic system of inputs (in the form of sensory information), computation or decision making (based on the prefiguration of rules, contexts, and user-generated or defined metadata), and outputs (associated action from ubiquitous computing devices). My interest in this paper lies in the computational intelligences that suture these positions together, and how positioning these intelligences as autonomous agents extends the dialogue between human-users and ubiquitous computing technology. Drawing specifically on the scenarios surrounding the employment of ubiquitous computing within aged care, I argue that agency is something that cannot be traded without serious consideration of the associated ethics.
Abstract:
Technologies and languages for integrated processes are a relatively recent innovation. Over this period many divergent waves of innovation have transformed process integration. Like sockets and distributed objects, early workflow systems offered programming interfaces that connected the process modelling layer to any middleware. BPM systems emerged later, connecting the modelling world to middleware through components. While BPM systems increased ease of use (modelling convenience), long-standing and complex interactions involving many process instances remained difficult to model. Enterprise Service Buses (ESBs) followed, connecting process models to heterogeneous forms of middleware. ESBs, however, generally forced modellers to choose a particular underlying middleware and to stick to it, despite their ability to connect with many forms of middleware. Furthermore, ESBs encourage process integrations to be modelled on their own, logically separate from the process model. This can lead to the inability to reason about long-standing conversations at the process layer. Technologies and languages for process integration generally lack formality. This has led to arbitrariness in the underlying language building blocks. Conceptual holes exist in a range of technologies and languages for process integration, and this can lead to customer dissatisfaction and the failure of integration projects to reach their potential. Standards for process integration share similar fundamental flaws to languages and technologies. Standards are also in direct competition with other standards, causing a lack of clarity. Thus the area of greatest risk in a BPM project remains process integration, despite major advancements in the technology base. This research examines some fundamental aspects of communication middleware and how these fundamental building blocks of integration can be brought to the process modelling layer in a technology-agnostic manner. In this way process modelling can be conceptually complete without becoming stuck in a particular middleware technology. Coloured Petri nets are used to define a formal semantics for the fundamental aspects of communication middleware. They provide the means to define and model the dynamic aspects of various integration middleware. Process integration patterns are used as a tool to codify common problems to be solved. Object Role Modelling is a formal modelling technique that was used to define the syntax of a proposed process integration language. This thesis provides several contributions to the field of process integration. It proposes a framework defining the key notions of integration middleware. This framework provides a conceptual foundation upon which a process integration language could be built. The thesis defines an architecture that allows various forms of middleware to be aggregated and reasoned about at the process layer. This thesis provides a comprehensive set of process integration patterns. These constitute a benchmark for the kinds of problems a process integration language must support. The thesis proposes a process integration modelling language and a partial implementation that is able to enact the language. A process integration pilot project in a German hospital is briefly described at the end of the thesis. The pilot is based on ideas in this thesis.
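To make concrete the idea of bringing middleware building blocks to the process layer in a technology-agnostic way, here is a minimal Java sketch (the Channel interface and all other names below are illustrative assumptions, not the modelling language or architecture defined in the thesis): a channel abstraction that a process model could reason about regardless of which middleware backs it.

// Illustrative sketch only: names and interfaces are hypothetical, not from the thesis.
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

/** A middleware-agnostic, one-way channel as seen from the process layer. */
interface Channel<M> {
    void send(M message);                 // could be backed by JMS, an ESB, a socket, ...
    void subscribe(Consumer<M> handler);  // delivery semantics come from the concrete binding
}

/** A trivial in-memory binding, standing in for any concrete middleware. */
final class InMemoryChannel<M> implements Channel<M> {
    private final List<Consumer<M>> handlers = new ArrayList<>();
    public void send(M message) { handlers.forEach(h -> h.accept(message)); }
    public void subscribe(Consumer<M> handler) { handlers.add(handler); }
}

public class ChannelDemo {
    public static void main(String[] args) {
        Channel<String> orders = new InMemoryChannel<>();
        orders.subscribe(msg -> System.out.println("process instance received: " + msg));
        orders.send("order-42");  // the model reasons about the channel, not the middleware
    }
}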
Abstract:
Flow-oriented process modeling languages have a long tradition in the area of Business Process Management and are widely used for capturing activities with their behavioral and data dependencies. Individual events were introduced for triggering process instantiation and activities. However, real-world business cases drive the need for also covering complex event patterns as they are known in the field of Complex Event Processing. Therefore, this paper puts forward a catalog of requirements for handling complex events in process models, which can be used as a reference framework for assessing process definition languages and systems. An assessment of BPEL and BPMN is provided.
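As a rough illustration of what a complex event pattern adds over the individual-event triggers mentioned above (a sketch only; the event types, correlation key and instantiation call are invented for the example), the following Java fragment starts a process instance only once two different event types have been observed for the same correlation key, a conjunction that a single-event trigger cannot express.

// Sketch only: event types, keys, and the instantiation step are hypothetical.
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class ConjunctionTrigger {
    // Which event types have been seen for each correlation key (e.g. an order id).
    private final Map<String, Set<String>> seen = new HashMap<>();

    /** Called for every incoming event; fires once both "payment" and "shipment" arrived. */
    public void onEvent(String correlationKey, String eventType) {
        Set<String> types = seen.computeIfAbsent(correlationKey, k -> new HashSet<>());
        types.add(eventType);
        if (types.contains("payment") && types.contains("shipment")) {
            seen.remove(correlationKey);
            System.out.println("instantiate process for " + correlationKey);
        }
    }

    public static void main(String[] args) {
        ConjunctionTrigger trigger = new ConjunctionTrigger();
        trigger.onEvent("order-7", "payment");
        trigger.onEvent("order-7", "shipment"); // conjunction complete: instance started
    }
}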
Abstract:
Purpose: The purpose of this paper is to clarify how end-users’ tacit knowledge can be captured and integrated into an overall business process management (BPM) approach. Current approaches to support stakeholders’ collaboration in the modelling of business processes envision an egalitarian environment where stakeholders interact in the same context, using the same languages and sharing the same perspectives on the business process. Therefore, such stakeholders have to collaborate in the context of process modelling using a language that some of them do not master, and have to integrate their various perspectives. Design/methodology/approach: The paper applies the SECI knowledge management process to analyse the problems of traditional top-down BPM approaches and BPM collaborative modelling tools. The SECI model is also applied to Wikipedia, a successful Web 2.0-based knowledge management environment, to identify how tacit knowledge is captured in a bottom-up approach. Findings: The paper identifies a set of requirements for a hybrid BPM approach, both top-down and bottom-up, and describes a new BPM method based on a stepwise discovery of knowledge. Originality/value: This new approach, Processpedia, enhances collaborative modelling among stakeholders without enforcing egalitarianism. In Processpedia, tacit knowledge is captured and standardised into the organisation’s business processes by fostering an ecological participation of all the stakeholders and capitalising on stakeholders’ distinctive characteristics.
Abstract:
Language-use has proven to be the most complex and complicating of all Internet features, yet people and institutions invest enormously in language and cross-language features because they are fundamental to the success of the Internet’s past, present and future. The thesis focuses on the development of the latter – features that facilitate and signify linking between or across languages – in both their historical and current contexts. In the theoretical analysis, the conceptual platform of inter-language linking is developed both to accommodate efforts towards a new social complexity model for the co-evolution of languages and language content and to create an open analytical space for language and cross-language related features of the Internet and beyond. The practiced uses of inter-language linking have changed over the last decades. Before and during the first years of the WWW, mechanisms of inter-language linking were at best important elements used to create new institutional or content arrangements, but on a large scale they were insignificant. This has changed with the emergence of the WWW and its development into a web in which content in different languages co-evolves. The thesis traces the inter-language linking mechanisms that facilitated these dynamic changes by analysing what these linking mechanisms are, how their historical as well as current contexts can be understood, and what kinds of cultural-economic innovation they enable and impede. The study discusses this alongside four empirical cases of bilingual or multilingual media use, ranging from television and web services for the languages of smaller populations to large-scale web ventures involving multiple languages by the British Broadcasting Corporation, the Special Broadcasting Service Australia, Wikipedia and Google. To sum up, the thesis introduces the concepts of ‘inter-language linking’ and the ‘lateral web’ to model the social complexity and co-evolution of languages online. The resulting model reconsiders existing social complexity models in that it is the first that can explain the emergence of large-scale, networked co-evolution of languages and language content facilitated by the Internet and the WWW. Finally, the thesis argues that the Internet enables an open space for language and cross-language related features and investigates how far this process is facilitated by (1) amateurs and (2) human-algorithmic interaction cultures.
Abstract:
Human survival depends on human ingenuity in using resources at hand to sustain human life. The historical record – in writings and archaeological artefacts – provides evidence of the growth and collapse of political organisations and societies. In the institutions of Western civilisation, some traditions have endured over millennia where the roles of monarchs and public officials have been organised in perpetual succession. These roles were developed as conventions in the British Parliament after 1295 and provided the models of corporate governance in both public and private enterprise that have been continuously refined to the present day. In 2011, the Queensland Parliament legislated to introduce a new and more open system of scrutiny of legislation through a system of portfolio-based parliamentary committees. The committees began to function more actively in July 2012 and have been inviting submissions from stakeholders and experts in a structured way to consider the government’s priorities in its legislative programme. The question now is whether the Surveying and Spatial Sciences can respond expertly to the terms of reference and meet the timetables of the various parliamentary committees. This paper discusses some of the more important and urgent issues that deserve debate and that the profession needs to address in becoming more responsive to matters of public policy.
Abstract:
This article provides an overview of some of the key aspects that relate to the co-evolution of languages and their associated content in the Internet environment. A focus on such a co-evolution is pertinent as the evolution of languages in the Internet environment can be better understood if the development of their existing and emerging content, that is, the content in the respective language, is taken into consideration. In doing so, this article examines two related aspects. The first is the governance of languages at critical sites of the Internet environment, including ICANN, Wikipedia and Google Translate. Following on from this examination, the second part outlines how the co-evolution of languages and associated content in the Internet environment extends policy-making related to linguistic pluralism. It is argued that policies which centre on language availability in the Internet environment must shift their focus to the dynamics of available content instead. The notion of language pairs as a new regime of intersection for both languages and content is discussed to introduce an extended understanding of the uses of linguistic pluralism in the Internet environment. The ultimate extrapolation of such an enhanced approach, it is argued, centres less on 6,000 languages and more on 36 million language pairs. This article describes how such a powerful resource evolves in the Internet environment.
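A quick sanity check of the 36 million figure, assuming it counts ordered pairs of distinct languages drawn from roughly 6,000 languages:
6{,}000 \times 5{,}999 = 35{,}994{,}000 \approx 3.6 \times 10^{7} \text{ language pairs}.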
Abstract:
The act of computer programming is generally considered to be temporally removed from a computer program's execution. In this paper we discuss the idea of programming as an activity that takes place within the temporal bounds of a real-time computational process and its interactions with the physical world. We ground these ideas within the context of livecoding -- a live audiovisual performance practice. We then describe how the development of the programming environment "Impromptu" has addressed our ideas of programming with time and the notion of the programmer as an agent in a cyber-physical system.
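The core idea of programming within the temporal bounds of a running process can be sketched in a few lines of Java (this is not Impromptu, which is Scheme-based, and not the authors' code; all names are invented for the example): a task that, while running, schedules its own next execution, so the program keeps extending itself through real time.

// Illustrative sketch of temporal recursion; names are hypothetical.
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class TemporalRecursionDemo {
    private static final ScheduledExecutorService clock =
            Executors.newSingleThreadScheduledExecutor();

    /** Plays one "beat", then reschedules itself half a second into the future. */
    static void beat(int count) {
        System.out.println("beat " + count + " at " + System.currentTimeMillis());
        if (count < 8) {  // stop after a few iterations
            clock.schedule(() -> beat(count + 1), 500, TimeUnit.MILLISECONDS);
        } else {
            clock.shutdown();
        }
    }

    public static void main(String[] args) {
        beat(1); // the running program keeps scheduling its own future in real time
    }
}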
Abstract:
To comprehend other places through the lens of the ‘shifting’ interior – examining interior histories and theories, and by association, the constructed and performed practices of a range of interior conditions – is to enable speculation on the production and occupation of interior space in other territories and societies. Forms of inspiration are, as both keynote papers acknowledge, often overlooked, whether this is a result of western-centric approaches to aesthetics or their methods of enquiry. Evidently, as the Symposium papers demonstrate, the discussion is more complicated than it might at first appear, and the observation of interior decoration/design as critical practice offers one way to engage.
Abstract:
This chapter is a tutorial that teaches you how to design extended finite state machine (EFSM) test models for a system that you want to test. EFSM models are more powerful and expressive than simple finite state machine (FSM) models, and are one of the most commonly used styles of models for model-based testing, especially for embedded systems. There are many languages and notations in use for writing EFSM models, but in this tutorial we write our EFSM models in the familiar Java programming language. To generate tests from these EFSM models we use ModelJUnit, which is an open-source tool that supports several stochastic test generation algorithms, and we also show how to write your own model-based testing tool. We show how EFSM models can be used for unit testing and system testing of embedded systems, and for offline testing as well as online testing.
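To give a flavour of an EFSM written directly in Java, here is a generic sketch (not code from the tutorial; with ModelJUnit one would instead implement its model interface and let one of its built-in testers walk the model): a bounded counter with guarded increment and reset actions, an extended state variable, and a trivial random walk standing in for a stochastic test generation algorithm.

// Generic EFSM sketch; class and method names are illustrative, not from the tutorial.
import java.util.Random;

public class CounterEfsm {
    private int count = 0;          // extended state variable
    private static final int MAX = 3;

    // Guards: an action may fire only when its guard holds.
    boolean incrementGuard() { return count < MAX; }
    boolean resetGuard()     { return count > 0; }

    // Actions: transitions of the EFSM, updating the extended state.
    void increment() { count++; }
    void reset()     { count = 0; }

    String state() { return "count=" + count; }

    /** A trivial online test generator: a guarded random walk over the model. */
    public static void main(String[] args) {
        CounterEfsm model = new CounterEfsm();
        Random rng = new Random(42);
        for (int step = 0; step < 10; step++) {
            boolean tryIncrement = rng.nextBoolean();
            if (tryIncrement && model.incrementGuard())   model.increment();
            else if (!tryIncrement && model.resetGuard()) model.reset();
            System.out.println("step " + step + ": " + model.state());
            // In a real harness each action would also drive the system under test
            // and compare its observable behaviour against the model's state.
        }
    }
}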