50 results for Expanded critical incident approach


Relevance: 40.00%

Abstract:

This chapter offers a framework for combining critical language policy with critical discourse studies (CDS) to analyse language policy as a process in the context of minority language policy in Wales. I propose a discursive approach to language policy, which starts from the premise that language policy is constituted, enacted, interpreted and (re)contextualised in and through language. This approach extends the critical language policy framework provided by Shohamy (Language policy: hidden agendas and new approaches. Routledge, London, 2006) and integrates perspectives from the context-sensitive discourse-historical approach in CDS. It incorporates discourse as an essential lens through which policy mechanisms, ideologies and practices are constituted and de facto language policy materialises. This chapter argues that conceptualising and analysing language policy as a discursive phenomenon enables a better understanding of the multi-layered nature of language policy that shapes the management and experience of corporate bilingualism in Wales.

Relevance: 30.00%

Abstract:

We determine the critical noise level for decoding low density parity check error correcting codes based on the magnetization enumerator, rather than on the weight enumerator employed in the information theory literature. The interpretation of our method is appealingly simple, and the relation between the different decoding schemes such as typical pairs decoding, MAP, and finite temperature decoding (MPM) becomes clear. In addition, our analysis provides an explanation for the difference in performance between MN and Gallager codes. Our results are more optimistic than those derived via the methods of information theory and are in excellent agreement with recent results from another statistical physics approach.
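
The information-theoretic benchmark referred to here is the Shannon limit of the binary symmetric channel. As a point of reference only (this is not the authors' statistical-physics calculation), the following minimal sketch computes the threshold flip probability p_c solving R = 1 - H2(p_c) for a given code rate R by bisection:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def shannon_critical_flip_prob(rate: float, tol: float = 1e-12) -> float:
    """Largest flip probability p_c of a binary symmetric channel at which
    reliable decoding at the given code rate remains possible, i.e. the root
    of rate = 1 - H2(p_c) on [0, 1/2], found by bisection."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if 1.0 - binary_entropy(mid) > rate:
            lo = mid  # the channel still supports this rate; noise can grow
        else:
            hi = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    # Example: a rate-1/2 code; the Shannon threshold is p_c ~ 0.110.
    print(f"Shannon threshold for R = 1/2: p_c = {shannon_critical_flip_prob(0.5):.4f}")
```

Critical noise levels obtained for specific code constructions, whether via the weight or the magnetization enumerator, are then judged against this channel-capacity bound.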

Relevance: 30.00%

Abstract:

We propose a method based on the magnetization enumerator to determine the critical noise level for Gallager-type low density parity check (LDPC) error correcting codes. Our method provides an appealingly simple interpretation of the relation between different decoding schemes, and yields more optimistic critical noise levels than those reported in the information theory literature.
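
For readers unfamiliar with the code family, the sketch below gives a generic, textbook-style construction of a (j, k)-regular Gallager-type parity-check matrix (stacked, column-permuted blocks); it is purely illustrative and not the specific ensembles analysed in the paper:

```python
import numpy as np

def gallager_parity_check(n: int, j: int = 3, k: int = 6, seed: int = 0) -> np.ndarray:
    """Gallager-style (j, k)-regular parity-check matrix: j stacked blocks of
    n/k rows each.  The first block covers the columns in consecutive runs of
    k ones; the remaining blocks are random column permutations of it."""
    assert n % k == 0, "block length must be a multiple of k"
    rng = np.random.default_rng(seed)
    rows_per_block = n // k
    base = np.zeros((rows_per_block, n), dtype=np.uint8)
    for r in range(rows_per_block):
        base[r, r * k:(r + 1) * k] = 1          # run of k consecutive ones
    blocks = [base] + [base[:, rng.permutation(n)] for _ in range(j - 1)]
    return np.vstack(blocks)

if __name__ == "__main__":
    H = gallager_parity_check(n=24, j=3, k=6)
    print(H.shape)        # (12, 24): design rate 1 - j/k = 1/2
    print(H.sum(axis=0))  # every column has weight j = 3
    print(H.sum(axis=1))  # every row has weight k = 6
```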

Relevance: 30.00%

Abstract:

This paper departs from this point to consider whether and how crisis thinking contributes to practices of affirmative critique and transformative social action in late-capitalist societies. I argue that different deployments of crisis thinking have different ‘affect-effects’ and consequences for ethical and political practice. Some work to mobilize political action through articulating a politics of fear, assuming that people take most responsibility for the future when they fear the alternatives. Other forms of crisis thinking work to heighten critical awareness by disrupting existential certainty, asserting an ‘ethics of ambiguity’ which assumes that the continuous production of uncertain futures is a fundamental part of the human condition (de Beauvoir, 2000). In this paper, I hope to illustrate that the first deployment of crisis thinking can easily justify the closing down of political debate, discouraging radical experimentation and critique for the sake of resolving problems in a timely and decisive way. The second approach to crisis thinking, on the other hand, has greater potential to enable intellectual and political alterity in everyday life—but one that poses considerable challenges for our understandings of and responses to climate change...

Relevance: 30.00%

Abstract:

In the last two decades there have been substantial developments in the mathematical theory of inverse optimization problems, and their applications have expanded greatly. In parallel, time series analysis and forecasting have become increasingly important in various fields of research such as data mining, economics, business, engineering, medicine, politics, and many others. Despite the extensive use of linear programming in forecasting models, not a single application of inverse optimization has been reported in the forecasting literature for cases where time series data are available. The goal of this paper is therefore to introduce inverse optimization into the forecasting field, and to provide a streamlined approach to time series analysis and forecasting using inverse linear programming. An application is used to demonstrate the inverse forecasting approach developed in this study. © 2007 Elsevier Ltd. All rights reserved.
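
To make the inverse-optimization idea concrete, here is a hedged sketch (under generic assumptions, not the paper's forecasting model) of the classical L1-norm inverse linear program: given a forward LP and an observed solution x0, it finds the smallest adjustment to the cost vector under which x0 becomes optimal, with optimality imposed through LP duality. The toy data at the bottom are purely illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def inverse_lp_l1(A, b, c, x0):
    """Forward LP:  min c^T x  s.t.  A x >= b, x >= 0.
    Given an observed (feasible) solution x0, find the cost vector d that
    minimises ||d - c||_1 while making x0 optimal.  Optimality is imposed via
    duality: there must exist y >= 0 with A^T y <= d and b^T y = d^T x0.
    Decision variables: z = [y, dplus, dminus] with d = c + dplus - dminus."""
    A, b, c, x0 = map(np.asarray, (A, b, c, x0))
    m, n = A.shape
    obj = np.concatenate([np.zeros(m), np.ones(n), np.ones(n)])   # sum |d - c|
    A_ub = np.hstack([A.T, -np.eye(n), np.eye(n)])                # dual feasibility: A^T y <= d
    b_ub = c
    A_eq = np.concatenate([b, -x0, x0]).reshape(1, -1)            # strong duality: b^T y = d^T x0
    b_eq = np.array([c @ x0])
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
    if not res.success:
        raise RuntimeError(res.message)
    dplus, dminus = res.x[m:m + n], res.x[m + n:]
    return c + dplus - dminus

if __name__ == "__main__":
    # Toy forward model: min 2*x1 + 3*x2  s.t.  x1 + x2 >= 10, x >= 0.
    A, b, c = [[1.0, 1.0]], [10.0], [2.0, 3.0]
    x0 = [4.0, 6.0]   # the solution we observe and want to rationalise
    print("adjusted costs:", inverse_lp_l1(A, b, c, x0))
```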

Relevance: 30.00%

Abstract:

Based on Goffman's definition of frames as general 'schemata of interpretation' that people use to 'locate, perceive, identify, and label', other scholars have used the concept in a more specific way to analyze media coverage. Frames are used in the sense of organizing devices that allow journalists to select and emphasise topics, to decide 'what matters' (Gitlin 1980). Gamson and Modigliani (1989) consider frames as being embedded within 'media packages' that can be seen as 'giving meaning' to an issue. According to Entman (1993), framing comprises a combination of different activities such as problem definition, causal interpretation, moral evaluation, and/or treatment recommendation for the item described. Previous research has analysed climate change coverage to test Downs's model of the issue attention cycle (Trumbo 1996), to uncover media biases in the US press (Boykoff and Boykoff 2004), to highlight differences between nations (Brossard et al. 2004; Grundmann 2007), or to analyze cultural reconstructions of scientific knowledge (Carvalho and Burgess 2005). In this paper we present data from a corpus-linguistics-based approach, drawing on the results of a pilot study conducted in Spring 2008 using the Nexis news media archive. On the basis of comparative data from the US, the UK, France and Germany, we aim to show how the climate change issue has been framed differently in these countries and how this framing indicates differences in national climate change policies.
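
As an illustration of what a corpus-linguistic operationalisation of frames can look like (a hedged sketch only: the keyword sets below are hypothetical and do not reproduce the pilot study's coding scheme or Nexis queries), relative keyword frequencies can be compared across national subcorpora:

```python
import re
from collections import Counter

# Hypothetical frame-indicative keywords (English only; per-language lists
# would be needed for the French and German subcorpora).
FRAME_KEYWORDS = {
    "science": {"ipcc", "scientists", "evidence", "research"},
    "policy": {"kyoto", "targets", "emissions", "regulation"},
    "economy": {"cost", "industry", "jobs", "growth"},
}

def frame_profile(text: str) -> dict:
    """Relative frequency (per 1,000 tokens) of each keyword set in one subcorpus."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(tokens)
    total = max(len(tokens), 1)
    return {frame: 1000 * sum(counts[w] for w in words) / total
            for frame, words in FRAME_KEYWORDS.items()}

if __name__ == "__main__":
    # Tiny inline stand-ins; in the pilot study these would be Nexis exports.
    subcorpora = {
        "UK": "Emissions targets and the Kyoto framework dominate coverage as scientists press for regulation.",
        "US": "Industry groups warn about the cost to jobs and growth while research into the evidence continues.",
    }
    for country, text in subcorpora.items():
        print(country, frame_profile(text))
```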

Relevance: 30.00%

Abstract:

Challenges in returnable transport equipment (RTE) management continue to grow as its use becomes more widespread. Logistics companies are investigating the implementation of radio-frequency identification (RFID) technology to alleviate problems such as loss prevention and stock reduction. However, research within this field is limited and fails to explore in depth the wider network improvements that can be made to optimize the supply chain through efficient RTE management. This paper investigates the nature of RTE network management, building on current research and practices and filling a gap in the literature, through the investigation of a product-centric approach in which the paradigms of “intelligent products” and “autonomous objects” are explored. A network-optimizing approach to RTE management is explored, encouraging advanced research development of the RTE paradigm to align academic research with problematic areas in industry. Further research continues with the development of an agent-based software system, ready for application to a real-case-study distribution network, producing quantitative results for further analysis. This is pivotal in the endeavor to develop agile support systems, fully utilizing an information-centric environment and encouraging RTE to be viewed as critical network-optimizing tools rather than costly waste.
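
As a hedged illustration of the product-centric, “intelligent products” view (a minimal sketch, not the agent-based system under development in the paper), each RTE asset can carry its own identity, location and trip history, updated from RFID read events:

```python
from dataclasses import dataclass, field

@dataclass
class ReturnableAsset:
    """Product-centric view of one returnable transport item: the asset itself
    holds its identity, current location and trip history, updated whenever an
    RFID reader sees its tag."""
    tag_id: str
    location: str
    trips: int = 0
    history: list = field(default_factory=list)

    def on_rfid_read(self, reader_location: str) -> None:
        self.history.append((self.location, reader_location))
        if reader_location != self.location:
            self.trips += 1
            self.location = reader_location

def missing_assets(fleet, expected_location: str):
    """Simple loss-prevention query: assets last seen away from the expected depot."""
    return [a.tag_id for a in fleet if a.location != expected_location]

if __name__ == "__main__":
    fleet = [ReturnableAsset(f"RTE-{i:03d}", "depot_A") for i in range(3)]
    fleet[0].on_rfid_read("customer_X")
    fleet[1].on_rfid_read("depot_A")          # re-read at the same site: no new trip
    print(missing_assets(fleet, "depot_A"))   # ['RTE-000']
```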

Relevance: 30.00%

Abstract:

This chapter serves three very important functions within this collection. First, it aims to make the existence of FPDA better known to both gender and language researchers and to the wider community of discourse analysts, by outlining FPDA's own theoretical and methodological approaches. This involves locating and positioning FPDA in relation to, yet in contradistinction to, the fields of discourse analysis to which it is most often compared: Critical Discourse Analysis (CDA) and, to a lesser extent, Conversation Analysis (CA). Secondly, the chapter serves a vital symbolic function. It aims to contest the authority of the more established theoretical and methodological approaches represented in this collection, which currently dominate the field of discourse analysis. FPDA considers that an established field like gender and language study will only thrive and develop if it is receptive to new ways of thinking, divergent methods of study, and approaches that question and contest received wisdoms or established methods. Thirdly, the chapter aims to introduce some new, experimental and ground-breaking FPDA work, including that by Harold Castañeda-Peña and Laurel Kamada (this volume). I indicate the different ways in which a number of young scholars are imaginatively developing the possibilities of an FPDA approach to their specific gender and language projects.

Relevance: 30.00%

Abstract:

A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage, arising from the absence of charge-exchange complications, is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies of the interaction between low energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer (ToFFASS), has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowing samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions and providing the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
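
For orientation, the quantitative backbone of both LEISS and ToF analysis is the elastic binary-collision kinematic factor together with the time-of-flight-to-energy conversion. The sketch below is a generic illustration of these two standard relations, not the instrument's own data-reduction software:

```python
import math

AMU = 1.66053906660e-27   # atomic mass unit in kg
EV = 1.602176634e-19      # electron volt in J

def kinematic_factor(m_projectile_amu: float, m_target_amu: float,
                     theta_deg: float) -> float:
    """Elastic binary-collision kinematic factor E1/E0 for a projectile of mass
    M1 scattered through laboratory angle theta by a surface atom of mass M2
    (standard ion-scattering relation, written here for M2 > M1)."""
    mu = m_target_amu / m_projectile_amu
    theta = math.radians(theta_deg)
    return ((math.cos(theta) + math.sqrt(mu**2 - math.sin(theta)**2)) / (1 + mu))**2

def tof_to_energy_ev(mass_amu: float, flight_path_m: float, tof_s: float) -> float:
    """Convert a measured time of flight over a known drift length into the
    (non-relativistic) kinetic energy of the scattered particle, in eV."""
    v = flight_path_m / tof_s
    return 0.5 * mass_amu * AMU * v**2 / EV

if __name__ == "__main__":
    # Example: 3 keV He (4 u) scattered through 135 degrees by a Cu atom (63.5 u).
    k = kinematic_factor(4.0, 63.5, 135.0)
    print(f"kinematic factor {k:.3f}, scattered energy about {3000 * k:.0f} eV")
```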

Relevance: 30.00%

Abstract:

This thesis challenges the consensual scholarly expectation of low EU impact in Central Asia. In particular, it claims that by focusing predominantly on narrow, micro-level factors, the prevailing theoretical perspectives risk overlooking less obvious aspects of the EU's power, including structural aspects, and thus tend to underestimate the EU's leverage in the region. Therefore, the thesis argues that a more structurally integrative and holistic approach is needed to understand the EU's power in the region. In responding to this need, the thesis introduces a conceptual tool, which it terms 'transnational power over' (TNPO). Inspired by debates in IPE, in particular new realist and critical IPE perspectives, and combining these views with insights from neorealist, neo-institutionalist and constructivist approaches to EU external relations, the concept of TNPO is an analytically eclectic notion, which helps to assess the degree to which, in today's globalised and interdependent world, the EU's power over third countries derives from its control over a combination of material, institutional and ideational structures, making it difficult for the EU's partners to resist the EU's initiatives or to reject its offers. In order to trace and assess the mechanisms of EU impact across these three structures, the thesis constructs a toolbox, which centres on four analytical distinctions: (i) EU-driven versus domestically driven mechanisms, (ii) mechanisms based on rationalist logics of action versus mechanisms following constructivist logics of action, (iii) agent-based versus purely structural mechanisms of TNPO, and (iv) transnational and intergovernmental mechanisms of EU impact. Using qualitative research methodology, the thesis then applies the conceptual model to the case of EU-Central Asia. It finds that the EU's power over Central Asia effectively derives from its control over a combination of material, institutional and ideational structures, including its position as a leader in trade and investment in the region, its (geo)strategic and security-related capabilities vis-à-vis Central Asia, as well as the relatively dense level of institutionalisation of its relations with the five countries and the positive image of the EU in Central Asia as a more neutral actor.

Relevance: 30.00%

Abstract:

This work explores the relevance of semantic and linguistic description to translation theory and practice. It is aimed towards a practical model of approach to texts to translate. As literary texts [poetry mainly] are the focus of attention, so are stylistic matters. Note, however, that 'style', and, to some extent, the conclusions of the work, are not limited to so-called literary texts. The study of semantic description reveals that most translation problems do not stem from the cognitive (langue-related), but rather from the contextual (parole-related) aspects of meaning. Thus, any linguistic model that fails to account for the latter is bound to fall short. T.G.G. does, whereas Systemics, concerned with both the 'langue' and 'parole' (stylistic and sociolinguistic mainly) aspects of meaning, provides a useful framework of approach to texts to translate. Two essential semantic principles for translation are: that meaning is the property of a language (Firth); and the 'relativity of meaning assignments' (Tymoczko). Both imply that meaning can only be assessed, correctly, in the relevant socio-cultural background. Translation is seen as a restricted creation, and the translator's approach as a three-dimensional critical one. To encompass the most technical to the most literary text, and account for variations in emphasis in any text, translation theory must be based on a typology of functions: Halliday's ideational, interpersonal and textual, or Buhler's symbol, signal and symptom functions. Function [overall and specific] will dictate aims and method, and also provide the critic with criteria to assess translation faithfulness. Translation can never be reduced to purely objective methods, however. Intuitive procedures intervene, in textual interpretation and analysis, in the choice of equivalents, and in the reception of a translation. Ultimately, translation, both theory and practice, may perhaps constitute the touchstone as regards the validity of linguistic and semantic theories.

Relevance: 30.00%

Abstract:

Mathematics is highly structured and also underpins most of science and engineering. For this reason, it has proved a very suitable domain for Intelligent Tutoring System (ITS) research, with the result that probably more tutoring systems have been constructed for the domain than any other. However, the literature reveals that there still exists no consensus on a credible approach or approaches for the design of such systems, despite numerous documented efforts. Current approaches to the construction of ITSs leave much to be desired. Consequently, existing ITSs in the domain suffer from a considerable number of shortcomings which render them 'unintelligent'. The thesis examines some of the reasons why this is the case. Following a critical review of existing ITSs in the domain, and some pilot studies, an alternative approach to their construction is proposed (the 'iterative-style' approach); this supports an iterative style, and also improves on at least some of the shortcomings of existing approaches. The thesis also presents an ITS for fractions which has been developed using this approach, and which has been evaluated in various ways. It has, demonstrably, improved on many of the limitations of existing ITSs; furthermore, it has been shown to be largely 'intelligent', at least more so than current tutors for the domain. Perhaps more significantly, the tutor has also been evaluated against real students with, so far, very encouraging results. The thesis thus concludes that the novel iterative-style approach is a more credible approach to the construction of ITSs in mathematics than existing techniques.

Relevance: 30.00%

Abstract:

The purpose of this study is to develop econometric models to better understand the economic factors affecting inbound tourist flows from each of six origin countries that contribute to Hong Kong's international tourism demand. To this end, we test alternative cointegration and error correction approaches to examine the economic determinants of tourist flows to Hong Kong, and to produce accurate econometric forecasts of inbound tourism demand. Our empirical findings show that permanent income is the most significant determinant of tourism demand in all models. The variables of own price, weighted substitute prices, trade volume, the share price index (as an indicator of changes in wealth in origin countries), and a dummy variable representing the Beijing incident (1989) are also found to be important determinants for some origin countries. The average long-run income and own-price elasticities were measured at 2.66 and -1.02, respectively. It was hypothesised that permanent income is a better explanatory variable of long-haul tourism demand than current income. A novel approach (a grid search process) has been used to empirically derive the weights to be attached to the lagged income variable for estimating permanent income. The results indicate that permanent income, estimated with empirically determined, relatively small weighting factors, was capable of producing better results than the current income variable in explaining long-haul tourism demand. This finding suggests that the use of current income in previous empirical tourism demand studies may have produced inaccurate results. The share price index, as a measure of wealth, was also found to be significant in two models; studies of tourism demand rarely include wealth as an explanatory variable when forecasting long-haul tourism demand, although finding a satisfactory proxy for wealth common to different countries is problematic. This study indicates that error correction models (ECMs) based on the Engle-Granger (1987) approach produce more accurate forecasts than ECMs based on the Pesaran and Shin (1998) and Johansen (1988, 1991, 1995) approaches for all of the long-haul markets and Japan. Overall, the ECMs produce better forecasts than the OLS, ARIMA and naïve models, indicating the superiority of the application of a cointegration approach for tourism demand forecasting. The results show that permanent income is the most important explanatory variable for tourism demand from all countries, but there are substantial variations between countries, with the long-run elasticity ranging between 1.1 for the U.S. and 5.3 for the U.K. Price is the next most important variable, with long-run elasticities ranging between -0.8 for Japan and -1.3 for Germany and short-run elasticities ranging between -0.14 for Germany and -0.7 for Taiwan. The fastest growing market is Mainland China. The findings have implications for policies and strategies on investment, marketing promotion and pricing.
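
For readers unfamiliar with the procedure, the Engle-Granger (1987) approach referred to above is a two-step, residual-based method. The sketch below illustrates its mechanics on synthetic data (the variable names, numbers and two-variable setup are illustrative only, not the study's specification or estimates):

```python
import numpy as np
import statsmodels.api as sm

def engle_granger_ecm(y: np.ndarray, x: np.ndarray):
    """Two-step Engle-Granger procedure.
    Step 1: long-run (cointegrating) regression  y_t = a + b*x_t + u_t.
    Step 2: short-run error correction model
            dy_t = c + g*dx_t + alpha*u_{t-1} + e_t,
    where alpha is the speed-of-adjustment coefficient."""
    long_run = sm.OLS(y, sm.add_constant(x)).fit()
    u = long_run.resid
    dy, dx, u_lag = np.diff(y), np.diff(x), u[:-1]
    ecm = sm.OLS(dy, sm.add_constant(np.column_stack([dx, u_lag]))).fit()
    return long_run, ecm

if __name__ == "__main__":
    # Synthetic cointegrated pair, purely to show the mechanics: a random-walk
    # income proxy and a demand series tied to it by a stationary disturbance.
    rng = np.random.default_rng(1)
    income = np.cumsum(rng.normal(0.02, 0.05, 200))
    arrivals = 1.0 + 2.5 * income + rng.normal(0.0, 0.1, 200)
    long_run, ecm = engle_granger_ecm(arrivals, income)
    print("long-run coefficient:", round(long_run.params[1], 2))   # ~2.5
    print("speed of adjustment:", round(ecm.params[2], 2))         # negative
```

In practice the first-step residuals are also tested for stationarity (for example with an ADF-type test using Engle-Granger critical values) before the second step is estimated.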

Relevance: 30.00%

Abstract:

The present dissertation is concerned with the determination of the magnetic field distribution in magnetic electron lenses by means of the finite element method. In the differential form of this method a Poisson-type equation is solved by numerical methods over a finite boundary. Previous methods of adapting this procedure to the requirements of digital computers have restricted its use to computers of extremely large core size. It is shown that by reformulating the boundary conditions, a considerable reduction in core store can be achieved for a given accuracy of field distribution. The magnetic field distribution of a lens may also be calculated by the integral form of the finite element method. This eliminates the boundary problems mentioned but introduces other difficulties. After a careful analysis of both methods it has proved possible to combine the advantages of both in a new approach to the problem, which may be called the 'differential-integral' finite element method. The application of this method to the determination of the magnetic field distribution of some new types of magnetic lenses is described. In the course of the work considerable re-programming of standard programs was necessary in order to reduce the core store requirements to a minimum.
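
As a generic illustration of the differential (Galerkin) finite element idea for a Poisson-type equation (a one-dimensional sketch with Dirichlet boundary values and piecewise-linear elements; this is not the axisymmetric lens formulation developed in the dissertation):

```python
import numpy as np

def fem_poisson_1d(f, n_elements: int, u_left: float = 0.0, u_right: float = 0.0):
    """Galerkin finite element solution of -u'' = f on [0, 1] with Dirichlet
    boundary values, using piecewise-linear elements on a uniform mesh.
    Assembles the tridiagonal stiffness matrix and a midpoint-rule load vector."""
    h = 1.0 / n_elements
    nodes = np.linspace(0.0, 1.0, n_elements + 1)
    K = np.zeros((n_elements + 1, n_elements + 1))
    F = np.zeros(n_elements + 1)
    for e in range(n_elements):                 # element-by-element assembly
        i, j = e, e + 1
        ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
        fe = f(0.5 * (nodes[i] + nodes[j])) * (h / 2.0) * np.array([1.0, 1.0])
        K[np.ix_([i, j], [i, j])] += ke
        F[[i, j]] += fe
    # impose the Dirichlet values by restricting the system to interior nodes
    interior = np.arange(1, n_elements)
    F_int = F[interior] - K[interior, 0] * u_left - K[interior, -1] * u_right
    u = np.empty(n_elements + 1)
    u[0], u[-1] = u_left, u_right
    u[interior] = np.linalg.solve(K[np.ix_(interior, interior)], F_int)
    return nodes, u

if __name__ == "__main__":
    # -u'' = pi^2 sin(pi x) with u(0) = u(1) = 0 has exact solution sin(pi x).
    nodes, u = fem_poisson_1d(lambda x: np.pi**2 * np.sin(np.pi * x), 32)
    print("max nodal error:", np.max(np.abs(u - np.sin(np.pi * nodes))))
```

In this toy setting the storage concern the dissertation raises shows up as the fact that only the banded (here tridiagonal) part of the stiffness matrix actually needs to be stored and factorised.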

Relevance: 30.00%

Abstract:

This research began with an attempt to solve a practical problem, namely, the prediction of the rate at which an operator will learn a task. From a review of the literature, communications with researchers in this area and the study of psychomotor learning in factories, it was concluded that a more fundamental approach was required which included the development of a task taxonomy. This latter objective had been researched for over twenty years by E. A. Fleishman, and his approach was adopted. Three studies were carried out to develop and extend Fleishman's approach to the industrial area. However, the results of these studies were not in accord with Fleishman's conclusions and suggested that a critical re-assessment was required of the arguments, methods and procedures used by Fleishman and his co-workers. It was concluded that Fleishman's findings were to some extent an artifact of the approximate methods and procedures which he used in the original factor analyses, and that using more modern computerised factor analytic methods a reliable ability taxonomy could be developed to describe the abilities involved in the learning of psychomotor tasks. The implications for a changing-task or changing-subject model were drawn, and it was concluded that a changing-task and changing-subject model needs to be developed.
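
As an illustration of the kind of computerised factor-analytic methods the thesis advocates (the data, loadings and number of factors below are synthetic and purely illustrative; they are not Fleishman's or the author's datasets), a library routine can recover an ability structure from a matrix of task scores:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic example: rows are trainees, columns are six psychomotor task scores
# generated from two latent abilities plus noise.
rng = np.random.default_rng(42)
abilities = rng.normal(size=(300, 2))
true_loadings = np.array([[0.9, 0.1], [0.8, 0.2], [0.7, 0.0],
                          [0.1, 0.9], [0.2, 0.8], [0.0, 0.7]])
scores = abilities @ true_loadings.T + rng.normal(scale=0.3, size=(300, 6))

# Factor analysis with varimax rotation (the rotation option assumes a
# reasonably recent scikit-learn release).
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(scores)
print(np.round(fa.components_, 2))   # recovered loading pattern, one row per factor
```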