Abstract:
Modified montmorillonite (MMT) was prepared at different surfactant (HDTMA) loadings through ion exchange. The conformational arrangement of the loaded surfactants within the interlayer space of MMT was obtained by computational modelling. The conformational change of the surfactant molecules enhances the visual understanding of the results obtained from characterization methods such as XRD and surface analysis of the organoclays. Batch experiments were carried out for the adsorption of p-chlorophenol (PCP), and different conditions (pH and temperature) were used in order to determine the optimum sorption. For comparison purposes, the experiments were repeated under the same conditions for p-nitrophenol (PNP). The Langmuir and Freundlich equations were applied to the adsorption isotherms of PCP and PNP. The Freundlich isotherm model was found to give the best fit for both phenolic compounds, indicating that multilayer adsorption is involved in the adsorption process. In particular, the binding affinity value of PNP was higher than that of PCP, which is attributable to their respective hydrophobicities. The adsorption of the phenolic compounds by organoclays intercalated with high surfactant loadings was markedly improved, possibly because the intercalated surfactant molecules within the interlayer space contribute to the partition phases, resulting in greater adsorption of the organic pollutants.
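For reference, the standard forms of the two isotherm models named in this abstract are reproduced below in conventional textbook notation; the symbols and their meanings are the usual ones and are not taken from this study.

```latex
% Standard isotherm equations in conventional notation (not from the paper):
% q_e: amount adsorbed at equilibrium, C_e: equilibrium concentration,
% q_m, K_L: Langmuir monolayer capacity and constant,
% K_F, 1/n: Freundlich constant and intensity parameter.
\begin{align}
  q_e &= \frac{q_m K_L C_e}{1 + K_L C_e} && \text{(Langmuir)}\\
  q_e &= K_F\, C_e^{1/n}                 && \text{(Freundlich)}
\end{align}
```

A better Freundlich fit, as reported here, is conventionally read as adsorption on a heterogeneous surface that is not limited to a single monolayer.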
Abstract:
Norms regulate the behaviour of their subjects and define what is legal and what is illegal. Norms typically describe the conditions under which they are applicable and the normative effects that result from their application. Process models, on the other hand, specify how a business operation or service is to be carried out to achieve a desired outcome. Norms can have a significant impact on how business operations are conducted, and they can apply to the whole or part of a business process. For example, they may impose conditions on different aspects of a process (e.g., performing tasks in a specific sequence (control-flow), at a specific time or within a certain time frame (temporal aspect), or by specific people (resources)). We propose a framework that provides the formal semantics of the normative requirements for determining whether a business process complies with a normative document (where a normative document can be understood in a very broad sense, ranging from internal policies to best practice policies to statutory acts). We also present a classification of normative requirements based on the notion of different types of obligations and the effects of violating these obligations.
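To make the idea of classifying normative requirements by obligation type a little more concrete, the following minimal Python sketch shows one way such requirements might be encoded for compliance checking. The type names, fields and example strings are assumptions made for illustration only and are not the classification defined in the paper.

```python
# Illustrative sketch only: the obligation types, fields and example names below
# are assumptions for illustration, not the classification defined in the paper.
from dataclasses import dataclass
from enum import Enum, auto

class ObligationType(Enum):
    ACHIEVEMENT = auto()   # must be fulfilled at least once before its deadline
    MAINTENANCE = auto()   # must hold continuously over an interval
    PROHIBITION = auto()   # the condition must never be brought about

@dataclass
class Obligation:
    condition: str             # state to be (or not to be) brought about
    trigger: str               # event that brings the obligation into force
    deadline: str              # event by which it must be fulfilled
    otype: ObligationType
    compensable: bool = False  # whether a violation can be compensated

def achievement_violated(ob: Obligation, fulfilled_before_deadline: bool) -> bool:
    """An achievement obligation is violated if it was never fulfilled before
    its deadline; maintenance/prohibition need interval reasoning, omitted here."""
    assert ob.otype is ObligationType.ACHIEVEMENT
    return not fulfilled_before_deadline

ob = Obligation("confirmation_sent", "order_received", "shipment",
                ObligationType.ACHIEVEMENT, compensable=True)
print(achievement_violated(ob, fulfilled_before_deadline=False))  # True
```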
Abstract:
This paper presents a study in which a series of tests was undertaken using a naturally aspirated 4-cylinder, 2.216-litre Perkins diesel engine fitted with a piston having an undersized skirt. This experimental simulation resulted in engine running conditions that included abnormally high levels of piston slap occurring in one of the cylinders. The detectability of the resultant diesel engine piston slap was investigated using acoustic emission signals. Data corresponding to both normal and piston slap engine running conditions were captured using acoustic emission transducers along with both in-cylinder pressure and top-dead-centre (TDC) reference signals. Using these signals it was possible to demonstrate that the increased piston slap running conditions were distinguishable by monitoring the piston slap events occurring near the piston mid-stroke positions. However, when monitoring the piston slap events occurring near the TDC/BDC piston stroke positions, the normal and excessive piston slap engine running conditions were not clearly distinguishable.
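A hedged sketch of the kind of crank-angle-resolved comparison described above is given below; the sampling rate, window limits and signal arrays are placeholders, not the values or processing chain used in the study.

```python
# Illustrative sketch only: placeholder signals and assumed crank-angle windows,
# not the processing chain or window limits used in the study.
import numpy as np

def crank_angles(t, tdc_times):
    """Assign each sample time a crank angle in [0, 360) degrees by linear
    interpolation between successive TDC reference pulses."""
    idx = np.clip(np.searchsorted(tdc_times, t, side="right") - 1,
                  0, len(tdc_times) - 2)
    frac = (t - tdc_times[idx]) / (tdc_times[idx + 1] - tdc_times[idx])
    return 360.0 * frac

def window_rms(ae, angles, lo, hi):
    """RMS of the AE signal over samples whose crank angle lies in [lo, hi)."""
    mask = (angles >= lo) & (angles < hi)
    return float(np.sqrt(np.mean(ae[mask] ** 2)))

# Placeholder data standing in for measured AE and TDC reference signals.
fs = 100_000.0                                      # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
ae = np.random.default_rng(1).normal(size=t.size)   # stand-in AE trace
tdc_times = np.arange(0, 1.0 + 0.05, 0.05)          # 20 rev/s TDC pulses (assumed)

angles = crank_angles(t, tdc_times)
print("mid-stroke RMS:", window_rms(ae, angles, 70, 110))   # assumed window
print("near-TDC RMS:  ", window_rms(ae, angles, 350, 360))  # assumed window
```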
Abstract:
Existing compliance management frameworks (CMFs) offer a multitude of compliance management capabilities, which makes it difficult for enterprises to decide on the suitability of a framework. Making a decision on suitability requires a deep understanding of the functionalities of a framework. Gaining such an understanding is a difficult task which, in turn, requires specialised tools and methodologies for evaluation. Current compliance research lacks such tools and methodologies for evaluating CMFs. This paper reports a methodological evaluation of existing CMFs based on pre-defined evaluation criteria. Our evaluation highlights what existing CMFs offer and what they cannot. It also identifies various open questions and discusses the challenges in this direction.
Abstract:
Computer experiments, consisting of a number of runs of a computer model with different inputs, are now commonplace in scientific research. Using a simple fire model for illustration, some guidelines are given for the size of a computer experiment. A graph relating the error of prediction to the sample size is provided, which should be of use when designing computer experiments. Methods for augmenting computer experiments with extra runs are also described and illustrated. The simplest method involves adding one point at a time, choosing the point with the maximum prediction variance. Another method that appears to work well is to choose points from a candidate set so as to maximise the determinant of the variance-covariance matrix of the predictions.
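As a concrete illustration of the simplest augmentation method, the toy sketch below (an assumption-laden illustration, not the authors' implementation) fits a Gaussian process to an initial design and then adds runs one at a time at the candidate point with the maximum prediction variance. A cheap stand-in function is used in place of the fire model, and scikit-learn is assumed to be available.

```python
# Toy sketch of sequential augmentation: add the candidate point with maximum
# prediction variance. The simulator, design sizes and kernel are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def computer_model(x):
    # Cheap stand-in for an expensive simulator such as a simple fire model.
    return np.sin(6 * x[:, 0]) + 0.5 * x[:, 1] ** 2

X = rng.uniform(size=(10, 2))            # initial design of 10 runs
y = computer_model(X)
candidates = rng.uniform(size=(500, 2))  # candidate set for augmentation

for _ in range(5):                       # add five extra runs, one at a time
    gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.2, 0.2]),
                                  normalize_y=True).fit(X, y)
    _, sd = gp.predict(candidates, return_std=True)
    best = int(np.argmax(sd))            # candidate with maximum prediction variance
    X = np.vstack([X, candidates[best:best + 1]])
    y = np.append(y, computer_model(candidates[best:best + 1]))
    candidates = np.delete(candidates, best, axis=0)

print(X.shape)  # (15, 2): the original 10 runs plus the 5 added runs
```

The batch alternative mentioned in the abstract would instead score candidate subsets by the determinant of the predictive variance-covariance matrix and add the best subset in one step.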
Abstract:
This book presents readers with the opportunity to fundamentally re-evaluate the processes of innovation and entrepreneurship, and to rethink how they might best be stimulated and fostered within our organizations and communities. The fundamental thesis of the book is that the entrepreneurial process is not a linear progression from novel idea to successful innovation, but an iterative series of experiments, where progress depends on the persistence and resilience of the individuals involved, and on their ability to learn from failure as well as success. From this premise, the authors argue that the ideal environment for new venture creation is a form of “experimental laboratory,” a community of innovators where ideas are generated, shared, and refined; where experiments are encouraged; and which in itself serves as a test environment for those ideas and experiments. This environment is quite different from the traditional “incubator,” which may impose the disciplines of the established firm too early in the development of the new venture.
Abstract:
This chapter investigates the relationship between technical and operational skills and the development of conceptual knowledge and literacy in Media Arts learning. It argues that there is a relationship between the stories, expressions and ideas that students aim to produce with communications media, and their ability to realise these in material form through technical processes in specific material contexts. Our claim is that there is a relationship between the technical and the operational, along with material relations and the development of conceptual knowledge and literacy in media arts learning. We place more emphasis on the material aspects of literacy than is usually the case in socio-cultural accounts of media literacy. We provide examples from a current project to demonstrate that it is just as important to address the material as it is the discursive and conceptual when considering how students develop media literacy in classroom spaces.
Abstract:
In Social Science (Organization Studies, Economics, Management Science, Strategy, International Relations, Political Science…) the quest for addressing the question “what is a good practitioner?” has been around for centuries, with the underlying assumption that good practitioners should lead organizations to higher levels of performance. Hence to ask “what is a good ‘captain’?” is not a new question, we should add! (e.g. Tsoukas & Cummings, 1997, p. 670; Söderlund, 2004, p. 190). This interrogation leads us to consider problems such as the relations between the dichotomies of Theory and Practice, rigor and relevance of research, and ways of knowing and forms of knowledge. On the one hand we face the “Enlightenment” assumptions underlying modern positivist Social science, grounded in the “unity-of-science dream of transforming and reducing all kinds of knowledge to one basic form and level” and in cause-effect relationships (Eikeland, 2012, p. 20), and on the other, the postmodern interpretivist proposal, with its “tendency to make all kinds of knowing equivalent” (Eikeland, 2012, p. 20). In the project management space, this aims at addressing one of the fundamental problems in the field: projects still do not deliver their expected benefits and promises, and therefore the expected socio-economic good (Hodgson & Cicmil, 2007; Bredillet, 2010; Lalonde et al., 2012). The Cartesian tradition supporting project research and practice for the last 60 years (Bredillet, 2010, p. 4) has led to the lack of relevance to practice of the current conceptual base of project management, despite the sum of research, the development of standards, best and good practices, and the related development of project management bodies of knowledge (Packendorff, 1995, p. 319–323; Cicmil & Hodgson, 2006, p. 2–6; Hodgson & Cicmil, 2007, p. 436–7; Winter et al., 2006, p. 638). Referring to both Hodgson (2002) and Giddens (1993), we could say that those who expect a “social-scientific Newton” to revolutionize this young field “are not only waiting for a train that will not arrive, but are in the wrong station altogether” (Hodgson, 2002, p. 809; Giddens, 1993, p. 18). Meanwhile, in the postmodern stream, mainly rooted in the “practice turn” (e.g. Hällgren & Lindahl, 2012), the shift from methodological individualism to social viscosity, together with the advocated pluralism, tends to reinforce the very “functional stupidity” (Alvesson & Spicer, 2012, p. 1194) that this postmodern stream aims to overcome. We suggest here that addressing the question “what is a good PM?” requires a philosophy of practice perspective to complement the “usual” philosophy of science perspective. The questioning of the modern Cartesian tradition mirrors a similar one made within Social science (Say, 1964; Koontz, 1961, 1980; Menger, 1985; Warry, 1992; Rothbard, 1997a; Tsoukas & Cummings, 1997; Flyvbjerg, 2001; Boisot & McKelvey, 2010), calling for new thinking. In order to get outside the rationalist ‘box’, Toulmin (1990, p. 11), along with Tsoukas & Cummings (1997, p. 655), suggests a possible path, summarizing the thoughts of many authors: “It can cling to the discredited research program of the purely theoretical (i.e.
“modern”) philosophy, which will end up by driving it out of business; it can look for new and less exclusively theoretical ways of working, and develop the methods needed for a more practical (“post-modern”) agenda; or it can return to its pre-17th century traditions, and try to recover the lost (“pre-modern”) topics that were side-tracked by Descartes, but can be usefully taken up for the future” (Toulmin, 1990, p. 11). Thus, paradoxically and interestingly, in their quest for the so-called post-modernism, many authors build on “pre-modern” philosophies such as the Aristotelian one (e.g. MacIntyre, 1985, 2007; Tsoukas & Cummings, 1997; Flyvbjerg, 2001; Blomquist et al., 2010; Lalonde et al., 2012). This is perhaps because the post-modern stream emphasizes a dialogic process restricted to reliance on voice and textual representation; it thereby limits the meaning of communicative praxis and weakens practice, turning attention away from more fundamental issues associated with problem-definition and knowledge-for-use in action (Tedlock, 1983, p. 332–4; Schrag, 1986, p. 30, 46–7; Warry, 1992, p. 157). Eikeland suggests that the Aristotelian “gnoseology allows for reconsidering and reintegrating ways of knowing: traditional, practical, tacit, emotional, experiential, intuitive, etc., marginalised and considered insufficient by modernist [and post-modernist] thinking” (Eikeland, 2012, p. 20–21). By contrast with modernist one-dimensional thinking and with relativist, pluralistic post-modernism, we suggest, in a turn to an Aristotelian pre-modern lens, re-conceptualising (“re” involving here a “re”-turn to pre-modern thinking) the “do”, and shifting the perspective from what a good PM is (philosophy of science lens) to what a good PM does (philosophy of practice lens) (Aristotle, 1926a). As Tsoukas & Cummings put it: “In the Aristotelian tradition to call something good is to make a factual statement. To ask, for example, ‘what is a good captain?’ is not to come up with a list of attributes that good captains share (as modern contingency theorists would have it), but to point out the things that those who are recognized as good captains do.” (Tsoukas & Cummings, 1997, p. 670) Thus, this conversation offers a dialogue and deliberation about a central question: What does a good project manager do? The conversation is organized around a critique of the underlying assumptions supporting the modern, post-modern and pre-modern relations to ways of knowing, forms of knowledge and “practice”.
Abstract:
"This book examines the growing trend of recognition and practices of CSR in private enterprises in developing countries. It identifies the challenges and deficiencies in these practices and proposes means for improvement. Based on a sound theoretical foundation, this book focusses on the case of Bangladesh and the ready-made garment industry to exemplify the described developments. After a brief introduction the book outlines the standards of Corporate Social Responsibility. It compares the trends in CSR practices both in developed and developing countries and then embarks on CSR practices in the private sector in Bangladesh to finally present a detailed analysis of CSR and its practices in the ready-made garment industry. The book not only compares developing countries with developed, but as well provides an assessment and analysis of different stages of CSR within the South Asian area."--published website
Abstract:
The representation of business process models has been a continuing research topic for many years now. However, many process model representations have not developed beyond minimally interactive 2D icon-based representations of directed graphs and networks, with little or no annotation for information overlays. In addition, very few of these representations have undergone a thorough analysis or design process with reference to psychological theories on data and process visualization. This dearth of visualization research, we believe, has led to problems with BPM uptake in some organizations, as the representations can be difficult for stakeholders to understand, and it thus remains an open research question for the BPM community. In addition, business analysts and process modeling experts themselves need visual representations that are able to assist with key BPM life cycle tasks in the process of generating optimal solutions. With the rise of desktop computers and commodity mobile devices capable of supporting rich interactive 3D environments, we believe that much of the research performed in human-computer interaction, virtual reality, games and interactive entertainment has much potential in areas of BPM: to engage, provide insight, and promote collaboration amongst analysts and stakeholders alike. We believe this is a timely topic, with research relevant to this workshop emerging in a number of places around the globe. This is the second TAProViz workshop being run at BPM. The intention this year is to build on the results of last year's successful workshop by further developing this important topic and identifying the key research topics of interest to the BPM visualization community.
Abstract:
Time plays an important role in norms. In this paper we start from our previously proposed classification of obligations and point out some shortcomings of the Event Calculus (EC) in representing obligations. We propose an extension of EC that avoids these shortcomings and show how to use it to model the various types of obligations.
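As a rough illustration of the interplay between obligations, deadlines and violations that motivates such an extension, the sketch below encodes a single achievement obligation in a naive, discrete EC style; the event names, deadline and rules are assumptions made for illustration, not the formalisation proposed in the paper.

```python
# Naive, discrete Event-Calculus-style sketch of one achievement obligation.
# Event names, deadline and rules are illustrative assumptions only.

# Narrative of events, as (time, event) pairs.
events = [(1, "order_received"),   # initiates the obligation to pay the invoice
          (6, "invoice_paid")]     # fulfilment arrives only at time 6

DEADLINE = 5                       # the obligation must be fulfilled by time 5

def obligation_holds(t, events):
    """The obligation holds at t if it has been initiated at some time <= t
    and no fulfilling event has happened by t (no re-initiation handled)."""
    initiated = any(e == "order_received" and te <= t for te, e in events)
    fulfilled = any(e == "invoice_paid" and te <= t for te, e in events)
    return initiated and not fulfilled

# Violation of an achievement obligation: it still holds when the deadline passes.
print("violated:", obligation_holds(DEADLINE, events))  # True (paid at 6 > 5)
```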
Abstract:
The design of concurrent software systems, in particular process-aware information systems, involves behavioral modeling at various stages. Recently, approaches to the behavioral analysis of such systems have been based on declarative abstractions defined as sets of behavioral relations. However, these relations are typically defined in an ad hoc manner. In this paper, we address the lack of a systematic exploration of the fundamental relations that can be used to capture the behavior of concurrent systems, i.e., co-occurrence, conflict, causality, and concurrency. Besides defining this spectrum of behavioral relations, which we refer to as the 4C spectrum, we show that our relations give rise to implication lattices. We further provide operationalizations of the proposed relations, starting with techniques for computing relations in unlabeled systems, which are then lifted to become applicable in the context of labeled systems, i.e., systems in which state transitions have semantic annotations. Finally, we report on experimental results on the efficiency of the proposed computations.
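To give a flavour of the four relation families, the toy sketch below approximates them over a small set of example traces; the paper defines the relations over the state spaces of (un)labelled systems, so this log-based approximation and its trace data are illustrative assumptions only.

```python
# Toy, trace-based approximation of co-occurrence, conflict, causality and
# concurrency for pairs of labels. Illustrative only: the paper's relations are
# defined over the state spaces of (un)labelled systems, not over event logs.
from itertools import combinations

traces = [("a", "b", "c"), ("a", "c", "b"), ("a", "d")]   # assumed example runs
labels = sorted({x for tr in traces for x in tr})

def relation(x, y):
    shared = [tr for tr in traces if x in tr and y in tr]
    if not shared:
        return "conflict"                      # never observed in the same run
    either = [tr for tr in traces if x in tr or y in tr]
    cooc = "co-occurrence" if len(shared) == len(either) else "partial co-occurrence"
    same_order = (all(tr.index(x) < tr.index(y) for tr in shared)
                  or all(tr.index(y) < tr.index(x) for tr in shared))
    return cooc + (" + causality" if same_order else " + concurrency")

for x, y in combinations(labels, 2):
    print(f"{x},{y} -> {relation(x, y)}")
```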
Abstract:
It was demonstrated recently that dramatic changes in the redox behaviour of gold/aqueous solution interfaces may be observed following either cathodic or thermal electrode pretreatment. Further work on the cathodic pretreatment of gold in acid solution revealed that as the activity of the gold surface was increased, its performance as a substrate for hydrogen gas evolution under constant potential conditions deteriorated. The change in activity of the gold atoms at the interface, which was attributed to a hydrogen embrittlement process (the occurrence of the latter was subsequently checked by surface microscopy), was confirmed, as in earlier work, by the appearance of a substantial anodic peak at ca. 0.5 V (RHE) in a post-activation positive sweep. Changes in the catalytic activity of a metal surface reflect the fact that the structure (or topography), thermodynamic activity and electronic properties of a surface not only depend on pretreatment but also, in the case of the hydrogen evolution reaction, vary with time during the course of the reaction. As will be reported shortly, similar (and often more dramatic) time-dependent behaviour was observed for hydrogen gas evolution on other metal electrodes.
Abstract:
In this paper I will explore some experience-based perspectives on information literacy research and practice. The research-based understanding of what information literacy looks like to those experiencing it is very different from the standard interpretations of information literacy as involving largely text-based information searching, interpretation, evaluation and use. It also involves particular understandings of the interrelation between information and learning experiences. In following this thread of the history of information literacy I will reflect on aspects of the past, present and future of information literacy research. In each of these areas I explore experiential, especially phenomenographic, approaches to information literacy and information literacy education, to reveal the unfolding understanding of people's experience of information literacy stemming from this orientation. In addressing the past I will look in particular at the contribution of the seven faces of information literacy and some lessons learned from attending to variation in experience. I will explore important directions and insights that this history may help us to retain, including the value of understanding people's information literacy experience. In addressing the present, I will introduce more recent work that adopts the key ideas of informed learning by attending to both information and learning experiences in specific contexts. I will look at some contemporary directions and key issues, including the reinvention of the phenomenographic, or relational, approach to information literacy as informed learning, or using information to learn. I will also provide some examples of the contribution of experiential approaches to information literacy research and practice. The evolution and development of the phenomenographic approach to information literacy, and the growing attention to a dual focus on information and learning experiences in this approach, will be highlighted. Finally, in addressing the future I will return to advocacy, the recognition and pursuit of the transforming and empowering heart of information literacy, and suggest that for information literacy research, including the experiential, a turn towards the emancipatory has much to offer.