745 results for Change-based collaboration


Relevance:

40.00%

Publisher:

Abstract:

Based on Goffman’s definition that frames are general ‘schemata of interpretation’ that people use to ‘locate, perceive, identify, and label’, other scholars have used the concept in a more specific way to analyze media coverage. Frames are used in the sense of organizing devices that allow journalists to select and emphasise topics, to decide ‘what matters’ (Gitlin 1980). Gamson and Modigliani (1989) consider frames as being embedded within ‘media packages’ that can be seen as ‘giving meaning’ to an issue. According to Entman (1993), framing comprises a combination of different activities such as: problem definition, causal interpretation, moral evaluation, and/or treatment recommendation for the item described. Previous research has analysed climate change with the purpose of testing Downs’s model of the issue attention cycle (Trumbo 1996), to uncover media biases in the US press (Boykoff and Boykoff 2004), to highlight differences between nations (Brossard et al. 2004; Grundmann 2007) or to analyze cultural reconstructions of scientific knowledge (Carvalho and Burgess 2005). In this paper we shall present data from a corpus linguistics-based approach. We will be drawing on results of a pilot study conducted in Spring 2008 based on the Nexis news media archive. Based on comparative data from the US, the UK, France and Germany, we aim to show how the climate change issue has been framed differently in these countries and how this framing indicates differences in national climate change policies.
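A corpus-based framing comparison of this kind can be sketched very simply: count how often each frame's keywords occur in each national corpus and compare their relative frequencies. Everything below (the mini-corpora and the frame lexicons) is an illustrative assumption, not data or categories from the study.

```python
from collections import Counter

# Hypothetical mini-corpora; in the study these would be Nexis articles
# from the US, UK, French and German press.
corpora = {
    "US": "climate change debate scientists dispute uncertainty hoax",
    "UK": "climate change policy targets emissions policy action",
}

# Illustrative frame lexicons (assumed, not taken from the paper).
frames = {
    "scientific_uncertainty": {"dispute", "uncertainty", "hoax"},
    "policy_action": {"policy", "targets", "emissions", "action"},
}

def frame_profile(text, frames):
    """Relative frequency of each frame's keywords in a text."""
    tokens = text.lower().split()
    counts = Counter(tokens)
    total = len(tokens)
    return {name: sum(counts[w] for w in words) / total
            for name, words in frames.items()}

profiles = {country: frame_profile(text, frames)
            for country, text in corpora.items()}
```

On these toy inputs the US text scores higher on the uncertainty frame and the UK text on the policy frame, which is the kind of cross-national contrast the paper investigates on real corpora.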

Relevance:

40.00%

Publisher:

Abstract:

The International Cooperation Agency working in Colombia (identified in this article as IDEA) is one of the most important agencies in Colombian society, with programs that support gender rights, human rights, justice and peace, scholarships, the aboriginal population, youth, the Afro-descendant population, economic development in communities, and environmental development. The problem identified stems from this diversified offer of services, collaboration and social intervention, which requires diverse groups of people with multiple agendas, ways of supporting their mandates, disciplines, and professional competences. Knowledge creation, and the growth and sustainability of the organization, can be endangered by a silo culture and the resulting reduced leverage of the separate groups' capabilities. Organizational memory is formed largely by the tacit knowledge of the organization's members, given the value of the accumulated experience that this kind of social work implies. Its loss is therefore a strategic and operational risk when most problem interventions rely on direct work in the socio-economic field and on lived experience with communities. The knowledge management solution presented in this article starts, first, with the identification of the people and groups concerned and the creation of a knowledge map as a means to strengthen the ties between organizational members; second, with the introduction of a content management system designed to support the documentation and knowledge-sharing processes; and third, with a methodology for adapting a Balanced Scorecard to the knowledge management processes. These three steps lead to a knowledge management "solution" implemented in the organization, comprising three components: a knowledge management system, training support and the promotion of cultural change.

Relevance:

40.00%

Publisher:

Abstract:

A frequency-modulated continuous-wave technique is used to detect the presence of frequency shifts in the Rayleigh-backscattered light in a single-mode optical fiber as a result of a changing temperature. The system is able to detect a rate of temperature change of 0.014 K/s, when a 20-cm length of fiber is heated. The system is also able to demonstrate a spatial resolution of better than 15 cm.
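As a rough sanity check on the 15 cm figure, the two-point resolution of an FMCW reflectometer is commonly taken as dz = c / (2 n B), where B is the swept frequency range and n the fibre group index. The sketch below uses assumed nominal values; the paper's actual sweep parameters are not given here.

```python
# Illustrative FMCW range-resolution arithmetic (assumed values,
# not parameters from the paper).
C = 299_792_458.0   # speed of light in vacuum, m/s
N_GROUP = 1.468     # typical group index of silica fibre (assumed)

def spatial_resolution(sweep_bandwidth_hz):
    """Two-point resolution (m) of an FMCW system with sweep B (Hz)."""
    return C / (2 * N_GROUP * sweep_bandwidth_hz)

def bandwidth_for_resolution(dz_m):
    """Sweep bandwidth (Hz) needed for a target resolution dz (m)."""
    return C / (2 * N_GROUP * dz_m)

# Bandwidth needed for the 15 cm resolution reported in the abstract:
b_needed = bandwidth_for_resolution(0.15)   # on the order of 0.7 GHz
```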

Relevance:

40.00%

Publisher:

Abstract:

Swarm intelligence is a popular paradigm for algorithm design. Frequently drawing inspiration from natural systems, it assigns simple rules to a set of agents with the aim that, through local interactions, they collectively solve some global problem. Current variants of a popular swarm-based optimization algorithm, particle swarm optimization (PSO), are investigated with a focus on premature convergence. A novel variant, dispersive PSO, is proposed to address this problem and is shown to lead to increased robustness and performance compared to current PSO algorithms. A nature-inspired decentralised multi-agent algorithm is proposed to solve a constrained problem of distributed task allocation. Agents must collect and process batches of mail without global knowledge of their environment or communication between agents. New rules for specialisation are proposed and are shown to exhibit improved efficiency and flexibility compared to existing ones. These new rules are compared with a market-based approach to agent control. The efficiency (average number of tasks performed), the flexibility (ability to react to changes in the environment), and the sensitivity to load (ability to cope with differing demands) are investigated in both static and dynamic environments. A hybrid algorithm combining both approaches is shown to exhibit improved efficiency and robustness. Evolutionary algorithms are employed, both to optimize parameters and to allow the various rules to evolve and compete; we also observe extinction and speciation. In order to interpret algorithm performance we analyse the causes of efficiency loss, derive theoretical upper bounds for the efficiency as well as a complete theoretical description of a non-trivial case, and compare these with the experimental results.
Motivated by this work we introduce agent "memory" (the possibility for agents to develop preferences for certain cities) and show that not only does it lead to emergent cooperation between agents, but also to a significant increase in efficiency.
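The premature-convergence problem and the idea of dispersing crowded particles can be illustrated with a minimal PSO. The dispersion rule below (re-seeding any particle that collapses onto the global best) is a sketch of the general idea only, not the published dispersive PSO update.

```python
import random

def pso(f, dim=2, n=20, iters=300, disperse_radius=1e-3, seed=1):
    """Minimal PSO on [-5, 5]^dim with an illustrative dispersion step."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.72, 1.49, 1.49   # standard constriction-style weights
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            # Dispersion: a particle that collapses onto the global best
            # is scattered to preserve diversity (illustrative rule).
            if sum((pos[i][d] - gbest[d]) ** 2 for d in range(dim)) < disperse_radius ** 2:
                pos[i] = [rng.uniform(-5, 5) for _ in range(dim)]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimise the 2-D sphere function as a toy objective.
best, best_val = pso(lambda p: sum(x * x for x in p))
```

The re-seeding threshold trades fine convergence for diversity, which is the essential tension any dispersive variant must manage.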

Relevance:

40.00%

Publisher:

Abstract:

Background: The Respiratory Health Network in Western Australia developed the Asthma Model of Care in 2010, incorporating best-practice guidelines. At the same time, short-acting beta agonist (SABA) guidelines were developed by stakeholder consensus at the University of Western Australia (UWA) and incorporated the use of an Asthma Action Plan Card. Objective: To report on the implementation of a key component of the WA Asthma Model of Care, the SABA guidelines incorporating the Asthma Action Plan Card. Methods: Implementation strategies included lectures, direct pharmacy detailing, media releases, and information packs (postal and electronic). Groups targeted included pharmacists, consumers and medical practitioners. Results: State-based (n=18) and national (n=6) professional organisations were informed about the launch of the guidelines into practice in WA. In the four-month implementation period, more than 47,000 Asthma Action Plan Cards were distributed, primarily to community pharmacies. More than 500 pharmacies were provided with information packs or individual detailing. More than 10,000 consumers were provided with information about the guidelines. Conclusions and implications: The collaboration of stakeholders in this project allowed for widespread access to various portals, which in turn enabled a multifaceted approach to disseminating information. Ongoing maintenance programs are required to sustain and build on the momentum of the implementation program and ultimately to address patient outcomes and practice change, the longer-term goals of such a project. Future research will seek to ascertain the impact of the card on patient outcomes in WA.

Relevance:

40.00%

Publisher:

Abstract:

This thesis contributes to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architecture: pull architectures and push architectures. There is little data on the performance of CDC architectures in a real-time environment, yet such data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution that addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesise the DAS literature and develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement.
The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
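The capture-latency definition above (time from a committed change to its capture by the CDC mechanism) can be illustrated with a toy pull-based poller. The in-memory "log", thread layout and intervals are assumptions for the sketch, not the thesis's TPC-C instrumentation.

```python
import queue
import threading
import time

committed = queue.Queue()      # stands in for the OLTP transaction log

def oltp_writer(n_changes):
    """Commit a stream of changes, stamping each with its commit time."""
    for i in range(n_changes):
        committed.put((i, time.monotonic()))
        time.sleep(0.001)

def pull_capture(n_changes, poll_interval=0.01):
    """Poll the log on a fixed interval, as a pull CDC mechanism does.

    Capture latency = capture time - commit time, per change.
    """
    latencies = []
    while len(latencies) < n_changes:
        time.sleep(poll_interval)
        while not committed.empty():
            _, t_commit = committed.get()
            latencies.append(time.monotonic() - t_commit)
    return latencies

writer = threading.Thread(target=oltp_writer, args=(20,))
writer.start()
latencies = pull_capture(20)
writer.join()
mean_latency = sum(latencies) / len(latencies)
```

The poll interval puts a floor under pull-CDC latency, which is one intuition behind preferring push architectures for real-time capture.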

Relevance:

40.00%

Publisher:

Abstract:

This study presents quantitative evidence from a number of simulation experiments on the accuracy of the productivity-growth estimates derived from growth-accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional-form misspecification, but their accuracy diminishes greatly otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error; when measurement error becomes larger, the accuracy of all approaches (including stochastic approaches) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
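The GA baseline used in such comparisons is the Solow residual: output growth minus cost-share-weighted input growth, with the residual attributed to total factor productivity (TFP). A minimal sketch with made-up numbers:

```python
def tfp_growth(output_growth, input_growths, cost_shares):
    """Solow-residual TFP growth from a growth-accounting decomposition:
    tfp_growth = dY/Y - sum_i s_i * dX_i/X_i
    """
    assert abs(sum(cost_shares) - 1.0) < 1e-9, "shares must sum to one"
    weighted = sum(s * g for s, g in zip(cost_shares, input_growths))
    return output_growth - weighted

# 3% output growth; capital grows 4% (share 0.3), labour 1% (share 0.7).
residual = tfp_growth(0.03, [0.04, 0.01], [0.3, 0.7])   # -> 0.011
```

The frontier-based methods the study evaluates decompose the same productivity change differently (into efficiency change and frontier shift), which is why their accuracy can diverge from GA under misspecification.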

Relevance:

40.00%

Publisher:

Abstract:

* By "Knowledge Infrastructure" we mean all the means that enable effective knowledge management within an organization, i.e. support for its knowledge processes.

Relevance:

40.00%

Publisher:

Abstract:

Engineering education in the United Kingdom is at the point of embarking upon an interesting journey into uncharted waters. At no point in the past have there been so many drivers for change and so many opportunities for the development of engineering pedagogy. This paper will look at how Engineering Education Research (EER) has developed within the UK and what differentiates it from the many small-scale practitioner interventions, perhaps without a clear research question or with little evaluation, which are presented at numerous staff development sessions, workshops and conferences. From this position, some examples of current projects will be described, outcomes of funding opportunities will be summarised and the benefits of collaboration with other disciplines illustrated.

In this study, I will account for how the design of the task structure according to variation theory, as well as the probe-ware technology, makes the laws of force and motion visible and learnable and, especially, in the lab studied, makes Newton's third law visible and learnable. I will also, as a comparison, include data from a mechanics lab that uses the same probe-ware technology and deals with the same topics in mechanics, but uses a differently designed task structure. I will argue that the lower achievement on the FMCE-test in this latter case can be attributed to these differences in the task structure of the lab instructions: according to my analysis, the necessary pattern of variation is not included in the design. I will also present a microanalysis of 15 hours of video recordings of engineering students' activities in a lab about impulse and collisions. The important object of learning in this lab is the development of an understanding of Newton's third law. The approach to analysing student interaction using video data is inspired by ethnomethodology and conversation analysis, i.e. I will focus on students' practical, contingent and embodied inquiry in the setting of the lab. I argue that my results corroborate variation theory and show that this theory can be used as a 'tool' for designing labs as well as for analysing labs and lab instructions. Thus my results have implications outside the domain of this study, in particular for understanding critical features of student learning in labs.

Engineering higher education is well used to change. As technology develops, the abilities expected of graduates by employers expand, yet our understanding of how to make informed decisions about learning and teaching strategies does not, without a conscious effort to develop it. With the numerous demands of academic life, we often fail to acknowledge our incomplete understanding of how our students learn within our discipline. The journey facing engineering education in the UK is being driven by two classes of driver: first, those we have been working to understand better, such as retention and employability; and second, new challenges such as substantial changes to funding systems allied with an increase in student expectations. Only through continued research can priorities be identified and addressed, and a coherent and strong voice for informed change be heard within the wider engineering education community. This new position makes it even more important that through EER we acquire the knowledge and understanding needed to make informed decisions regarding approaches to teaching, curriculum design and measures to promote effective student learning. This raises the question: how does EER function within a diverse academic community? Within an existing community of academics interested in taking meaningful steps towards understanding the ongoing challenges of engineering education, a Special Interest Group (SIG) has formed in the UK.

The formation of this group has itself been part of the rapidly changing environment, through its facilitation by the Higher Education Academy's Engineering Subject Centre, an entity which, through the Academy's current restructuring, will no longer exist as a discrete Centre dedicated to supporting engineering academics. The aims of this group, the activities it is currently undertaking and how it expects to network and collaborate with the global EER community are reported in this paper. This includes an explanation of how the group has identified barriers to the progress of EER and how it is seeking, through a series of activities, to facilitate the recognition and growth of EER both within the UK and with our valued international colleagues.

Relevance:

40.00%

Publisher:

Abstract:

In this paper we examine some questions concerning the intensity-based modelling of credit derivatives. We show that, with a suitable change of measure, the Laplace transform of the distribution of the compound loss and default process can be computed not only for doubly stochastic processes but also for point processes with arbitrary intensity. _____ The paper addresses questions concerning the use of intensity-based modelling in the pricing of credit derivatives. As the specification of the distribution of the loss process is a non-trivial exercise, the well-known technique for this task utilises the inversion of the Laplace transform. A popular choice of model is the class of doubly stochastic processes, given that their Laplace transforms can be determined easily. Unfortunately these processes lack several key features supported by empirical observations, e.g. they cannot replicate the self-exciting nature of defaults. The aim of the paper is to show that by using an appropriate change of measure the Laplace transform can be calculated not only for a doubly stochastic process, but for an arbitrary point process with intensity as well. To support the application of the technique, we investigate the effect of the change of measure on the stochastic nature of the underlying process.
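For the doubly stochastic baseline case that the paper generalises, the Laplace transform is indeed available in closed form. In standard notation (mine, not the paper's): with compound loss $L_T=\sum_{i=1}^{N_T}\xi_i$, i.i.d. losses with transform $\varphi(u)=\mathbb{E}[e^{-u\xi}]$, and $N$ a Cox process with intensity $(\lambda_t)$ independent of the $\xi_i$, the standard result is

```latex
% Laplace transform of the compound loss for a doubly stochastic (Cox)
% counting process N with intensity (\lambda_t):
\mathbb{E}\!\left[e^{-u L_T}\right]
  = \mathbb{E}\!\left[\exp\!\left((\varphi(u)-1)\int_0^T \lambda_s\,\mathrm{d}s\right)\right]
```

Tractability here rests on conditioning on the intensity path; the paper's point is that a suitable change of measure removes the need for this doubly stochastic structure.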

Relevance:

40.00%

Publisher:

Abstract:

Our aim was to approach an important and well-investigable phenomenon, connected to a relatively simple but real field situation, in such a way that the results of field observations could be directly compared with the predictions of a simulation model-system using a simple mathematical apparatus, and simultaneously to gain a hypothesis-system that creates the theoretical opportunity for a later experimental series of studies. As the phenomenon of the study, we chose the seasonal coenological changes of an aquatic and semiaquatic Heteroptera community. Based on the observed data, we developed an ecological model-system that is suitable for generating realistic patterns closely resembling the observed temporal patterns, and with the help of which predictions can be given for alternative climatic circumstances not experienced before (e.g. climate change); furthermore, it can simulate experimental circumstances. The stable coenological state-plane, constructed on the principle of indirect ordination, is suitable for the unified handling of monitoring and simulation data series, and also for their comparison. On the state-plane, deviations between empirical and model-generated data can be observed and analysed that could otherwise remain hidden.

Relevance:

40.00%

Publisher:

Abstract:

Climate change affects insect populations in many ways: it can cause a shift in geographical range, abundance, or diversity; it can change the location, timing and magnitude of pest outbreaks; and it can shape the phenological or even the genetic properties of a species. Long-term investigations of particular insect populations, simulation models and scenario studies give us very important information about the responses of insects both far from and near to our century. Knowing the potential responses of insect populations to climate change makes it possible for us to evaluate the adaptation of pest management alternatives as well as to formulate our future management policy. In this paper we apply two simple models in order to introduce a complex case study of a Sycamore lace bug population. We test how the model works when the weather conditions are very different from those of the present day. Thus, besides understanding the processes happening at present, we can also analyse the effects of a possible climate change.
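One family of simple models commonly used for such phenology questions is degree-day accumulation: development proceeds in proportion to heat above a lower threshold. The threshold, degree-day sum and temperatures below are illustrative assumptions, not the parameters of this study.

```python
# Minimal degree-day phenology sketch (all parameters assumed).
LOWER_THRESHOLD = 10.0    # development threshold, deg C (assumed)
GENERATION_SUM = 400.0    # degree-days per generation (assumed)

def degree_days(daily_mean_temps, lower=LOWER_THRESHOLD):
    """Accumulated degree-days above the development threshold."""
    return sum(max(0.0, t - lower) for t in daily_mean_temps)

def generations(daily_mean_temps):
    """Completed generations in a season under the assumed parameters."""
    return int(degree_days(daily_mean_temps) // GENERATION_SUM)

# A uniformly 2 deg C warmer season adds an extra generation here:
season = [18.0] * 120                  # 120-day season at 18 deg C
warmer = [t + 2.0 for t in season]
g_now, g_warm = generations(season), generations(warmer)
```

Running such a model under altered weather series is exactly the kind of "what if" scenario analysis the abstract describes for conditions very different from today's.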