834 results for use value
Abstract:
Traditionally, language speakers are categorised as mono-lingual, bilingual, or multilingual. It is traditionally assumed in English language education that the ‘lingual’ is something that can be ‘fixed’ in form, written down to be learnt and taught. Accordingly, the ‘mono’-lingual will have a ‘fixed’ linguistic form. Such a ‘form’ differs according to a number of criteria or influences, including region or ‘type’ of English (for example, World Englishes), but is nevertheless assumed to be a ‘form’. ‘Mono’-lingualism is traditionally defined and believed to be ‘speaking one language’, wherever or whatever that language may be. In this chapter, grounded in an individual subjective philosophy of language, we question this traditional definition. Viewing language from philosophical perspectives such as those of Bakhtin and Voloshinov, we argue that the prominence of ‘context’ and ‘consciousness’ in language means that to ‘fix’ the form of a language goes against the very spirit of how it is formed and used. We thus challenge the categorisation of ‘mono’-lingualism, proposing that it is actually a category error, or a case ‘in which a property is ascribed to a thing that could not possibly have that property’ (Restivo, 2013, p. 175), in this case the property of ‘mono’. Using this proposition as a starting point, we suggest that more time be devoted to language in its context and in its genuine use as a vehicle for consciousness. We theorise that this can be done through a ‘literacy’-based approach which fronts the context of language use rather than the language itself. We outline how we envision this working for teachers, students and materials developers of English Language Education materials in a global setting. To do this we consider Scotland’s Curriculum for Excellence as an exemplar to promote conscious language use in context.
Abstract:
Ramlogan, R., & Tedd, L. (2006). Use and non-use of electronic information sources by undergraduates at the University of the West Indies. Online Information Review, 30(1), 24-42.
Abstract:
Tedd, L.A. (2006). Use of library and information science journals by Master's students in their dissertations: experiences at the University of Wales Aberystwyth. Aslib Proceedings: New Information Perspectives, 58(6), 570-581.
Abstract:
Attributing a dollar value to a keyword is an essential part of running any profitable search engine advertising campaign. When an advertiser has complete control over the interaction with and monetization of each user arriving on a given keyword, the value of that term can be accurately tracked. However, in many instances, the advertiser may monetize arrivals indirectly through one or more third parties. In such cases, it is typical for the third party to provide only coarse-grained reporting: rather than report each monetization event, users are aggregated into larger channels and the third party reports aggregate information such as total daily revenue for each channel. Examples of third parties that use channels include Amazon and Google AdSense. In such scenarios, the number of channels is generally much smaller than the number of keywords whose value per click (VPC) we wish to learn. However, the advertiser has flexibility as to how to assign keywords to channels over time. We introduce the channelization problem: how do we adaptively assign keywords to channels over the course of multiple days to quickly obtain accurate VPC estimates of all keywords? We relate this problem to classical results in weighing design, devise new adaptive algorithms for this problem, and quantify the performance of these algorithms experimentally. Our results demonstrate that adaptive weighing designs that exploit statistics of term frequency, variability in VPCs across keywords, and flexible channel assignments over time provide the best estimators of keyword VPCs.
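The aggregation structure described above can be illustrated with a toy sketch (all numbers, names and the two-channel layout are invented for illustration, not taken from the paper): if a channel's daily revenue is the click-weighted sum of the unknown per-keyword VPCs, then varying the keyword-to-channel assignment across days yields a linear system that determines the VPCs. In practice reports are noisy, which is where least-squares estimation and the weighing designs the paper studies come in; the noiseless square case below just shows the mechanism.

```python
# Toy channelization sketch (hypothetical data): recover per-keyword
# value-per-click (VPC) from aggregate per-channel daily revenue by
# varying the keyword-to-channel assignment across days.

def solve(A, b):
    """Solve the square system A x = b by Gaussian elimination with pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

true_vpc = [0.50, 1.25, 0.80, 2.00]   # unknown to the advertiser in practice

# One row per (day, channel) report: entry k holds the clicks keyword k
# sent to that channel that day (0 if it was assigned to the other channel).
observations = [
    [120,  80,   0,  0],   # day 1, channel A: keywords 1 and 2
    [  0,   0, 200, 40],   # day 1, channel B: keywords 3 and 4
    [ 90,   0, 150,  0],   # day 2, channel A: keywords 1 and 3
    [  0, 110,   0, 60],   # day 2, channel B: keywords 2 and 4
]
# Aggregate revenue the third party would report for each (day, channel).
revenue = [sum(c * v for c, v in zip(row, true_vpc)) for row in observations]

estimated_vpc = solve(observations, revenue)
```

Note that the clicks must differ across days (or the assignments must overlap channels differently) for the system to be nonsingular: two complete partitions of identical click vectors sum to the same total row and are linearly dependent.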
Abstract:
This thesis investigates the optimisation of Coarse-Fine (CF) spectrum sensing architectures under a distribution of SNRs for Dynamic Spectrum Access (DSA). Three detector architectures are investigated: the Coarse-Sorting Fine Detector (CSFD), the Coarse-Deciding Fine Detector (CDFD) and the Hybrid Coarse-Fine Detector (HCFD). To date, the majority of work on coarse-fine spectrum sensing for cognitive radio has focused on a single SNR value. This approach overlooks the key advantage that CF sensing offers, namely that high-powered signals can be detected easily without extra signal processing. By considering a range of SNR values, the detector can be optimised more effectively and greater performance gains realised. This work considers the optimisation of CF spectrum sensing schemes in which security and performance are treated separately. Instead of optimising system performance at a single, constant, low SNR value, the system is optimised for the average operating conditions; security is still provided so that the safety specifications are met at low SNR values. By decoupling security and performance, the system's average performance increases whilst maintaining the protection of licensed users from harmful interference. The different architectures considered in this thesis are investigated in theory, simulation and physical implementation to provide a complete overview of the performance of each system. This thesis provides a method for estimating SNR distributions which is quick, accurate and relatively low cost. The CSFD is modelled and the characteristic equations are found for the CDFD scheme. The HCFD is introduced and optimisation schemes for all three architectures are proposed. Finally, using the Implementing Radio In Software (IRIS) test-bed to confirm simulation results, CF spectrum sensing is shown to be significantly quicker than naive methods, whilst still meeting the required interference probability rates and without requiring a substantial increase in receiver complexity.
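The coarse-fine idea can be sketched in a few lines (a toy illustration only: the thresholds, sample counts and signal model below are invented, and none of this reproduces the thesis's CSFD/CDFD/HCFD detectors): a short coarse stage with a high energy threshold catches strong signals cheaply, and only ambiguous bands fall through to the longer fine stage.

```python
import random

# Toy two-stage energy detector. Noise has unit variance, so the
# per-sample energy of a signal at linear SNR s is roughly 1 + s.

def energy(samples):
    return sum(x * x for x in samples) / len(samples)

def coarse_fine_detect(samples, n_coarse=32, thr_coarse=4.0, thr_fine=1.1):
    # Coarse stage: few samples, high threshold -> strong signals are
    # declared immediately, with no further processing.
    if energy(samples[:n_coarse]) > thr_coarse:
        return True, "coarse"
    # Fine stage: full window, lower threshold, for weak signals.
    return energy(samples) > thr_fine, "fine"

def rx(snr_linear, n, rng):
    # Unit-variance Gaussian noise plus a constant signal of power snr_linear.
    a = snr_linear ** 0.5
    return [a + rng.gauss(0.0, 1.0) for _ in range(n)]

rng = random.Random(0)
strong = coarse_fine_detect(rx(10.0, 8192, rng))   # high SNR: coarse hit
weak   = coarse_fine_detect(rx(0.2, 8192, rng))    # low SNR: needs fine stage
idle   = coarse_fine_detect(rx(0.0, 8192, rng))    # noise only: no detection
```

The sketch shows the decoupling argument in miniature: the coarse threshold governs how often the expensive fine stage runs (average performance), while the fine threshold is what must be calibrated against the low-SNR safety specification.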
Abstract:
The pervasive use of mobile technologies has provided new opportunities for organisations to achieve competitive advantage by using a value network of partners to create value for multiple users. The delivery of a mobile payment (m-payment) system is an example of a value network, as it requires the collaboration of multiple partners from diverse industries, each bringing their own expertise, motivations and expectations. Consequently, managing partnerships has been identified as a core competence required by organisations to form viable partnerships in an m-payment value network and an important factor in determining the sustainability of an m-payment business model. However, there is evidence that organisations lack this competence, a deficiency witnessed in the m-payment domain, where it has been cited as a contributing factor in a number of failed m-payment initiatives since 2000. In response to this organisational deficiency, this research project leverages the use of design thinking and visualisation tools to enhance communication and understanding between managers who are responsible for managing partnerships within the m-payment domain. By adopting a design science research approach, which is a problem-solving paradigm, the research builds and evaluates a visualisation tool in the form of a Partnership Management Canvas. In doing so, this study demonstrates that when organisations encourage their managers to adopt design thinking, as a way to balance their analytical thinking and intuitive thinking, communication and understanding between the partners increases. This can lead to a shared understanding and a shared commitment between the partners. In addition, the research identifies a number of key business model design issues that need to be considered by researchers and practitioners when designing an m-payment business model. As an applied research project, the study makes valuable contributions to the knowledge base and to the practice of management.
Abstract:
The past few years have witnessed an exponential increase in studies trying to identify molecular markers in patients with breast tumours that might predict for the success or failure of hormonal therapy or chemotherapy. HER2, a tyrosine kinase membrane receptor of the epidermal growth factor receptor family, has been the most widely studied marker in this respect. This paper attempts to critically review to what extent HER2 may improve 'treatment individualisation' for the breast cancer patient.
Abstract:
Creativity is often defined as developing something novel or new that fits its context and has value. To achieve this, the creative process itself has gained increasing attention as organizational leaders seek competitive advantages through developing new products, services, processes, or business models. In this paper, we explore the notion that the creative process includes a series of “filters”, or ways of processing information, as a critical component. We use the metaphor of coffee making and filters because many of our examples come from Vietnam, which is one of the world’s top coffee exporters and which has created a coffee culture rivaling many other countries. We begin with a brief review of the creative process and its connection to information processing, propose a tentative framework for integrating the two ideas, and provide examples of how it might work. We close with implications for further practical and theoretical directions for this idea.
Abstract:
OBJECTIVE: The diagnosis of Alzheimer's disease (AD) remains difficult. Lack of diagnostic certainty or possible distress related to a positive result from diagnostic testing could limit the application of new testing technologies. The objective of this paper is to quantify respondents' preferences for obtaining AD diagnostic tests and to estimate the perceived value of AD test information. METHODS: Discrete-choice experiment and contingent-valuation questions were administered to respondents in Germany and the United Kingdom. Choice data were analyzed by using random-parameters logit. A probit model characterized respondents who were not willing to take a test. RESULTS: Most respondents indicated a positive value for AD diagnostic test information. Respondents who indicated an interest in testing preferred brain imaging without the use of radioactive markers. German respondents had relatively lower money-equivalent values for test features compared with respondents in the United Kingdom. CONCLUSIONS: Respondents preferred less invasive diagnostic procedures and tests with higher accuracy and expressed a willingness to pay up to €700 to receive a less invasive test with the highest accuracy.
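The logit machinery behind such choice experiments can be sketched briefly (the coefficients below are hypothetical placeholders, not the paper's estimates, and this is the plain multinomial logit rather than the random-parameters version used in the study): attribute coefficients map each test profile to a utility, utilities map to choice probabilities, and willingness to pay for an attribute is the ratio of its coefficient to the price coefficient.

```python
import math

def choice_probs(utilities):
    # Multinomial logit: P_i = exp(V_i) / sum_j exp(V_j).
    m = max(utilities)                       # subtract max for stability
    exps = [math.exp(u - m) for u in utilities]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical attribute coefficients (illustration only):
b_accuracy = 0.04      # per percentage point of diagnostic accuracy
b_invasive = -0.90     # penalty if the procedure is invasive
b_price    = -0.002    # per euro of out-of-pocket cost

def utility(accuracy, invasive, price):
    return b_accuracy * accuracy + b_invasive * invasive + b_price * price

# Three hypothetical test profiles: (accuracy %, invasive?, price in euros).
alts = [utility(95, 1, 300), utility(85, 0, 100), utility(80, 0, 0)]
probs = choice_probs(alts)

# Money-equivalent value of one extra point of accuracy: -b_accuracy / b_price.
wtp_accuracy = -b_accuracy / b_price       # 20 euros per accuracy point
```

With these made-up numbers the invasive high-accuracy test is chosen less often than the cheaper non-invasive one, mirroring the paper's qualitative finding that invasiveness carries a large disutility.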
Abstract:
In this paper, we consider the problem of providing flexibility to solutions of two-machine shop scheduling problems. We use the concept of group-scheduling to characterize a whole set of schedules so as to provide more choice to the decision-maker at any decision point. A group-schedule is a sequence of groups of permutable operations defined on each machine where each group is such that any permutation of the operations inside the group leads to a feasible schedule. Flexibility of a solution and its makespan are often conflicting, thus we search for a compromise between a low number of groups and a small value of makespan. We resolve the complexity status of the relevant problems for the two-machine flow shop, job shop and open shop. A number of approximation algorithms are developed and their worst-case performance is analyzed. For the flow shop, an effective heuristic algorithm is proposed and the results of computational experiments are reported.
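As background to the makespan side of this trade-off (a classical result, not the paper's group-scheduling algorithms; the processing times below are invented), the ordinary two-machine flow shop is solved exactly by Johnson's rule: jobs faster on machine 1 go first in increasing order of their machine-1 time, the rest go last in decreasing order of their machine-2 time.

```python
# Johnson's rule for the two-machine flow shop, with an exhaustive check.
from itertools import permutations

def johnson_order(jobs):
    """Jobs are (p1, p2) processing times on machines 1 and 2."""
    first = sorted((j for j in jobs if j[0] < j[1]), key=lambda j: j[0])
    second = sorted((j for j in jobs if j[0] >= j[1]),
                    key=lambda j: j[1], reverse=True)
    return first + second

def makespan(seq):
    t1 = t2 = 0
    for p1, p2 in seq:
        t1 += p1                   # machine 1 finishes this job at t1
        t2 = max(t2, t1) + p2      # machine 2 starts when both are free
    return t2

jobs = [(3, 6), (5, 2), (1, 2), (6, 6), (7, 5)]
seq = johnson_order(jobs)
best = min(makespan(list(p)) for p in permutations(jobs))  # brute force
```

A group-schedule constrains which of the orderings above remain available (any permutation within a group must stay feasible), so its best achievable makespan can only match or exceed this unconstrained optimum; the paper's compromise between few groups and small makespan lives in that gap.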
Abstract:
Economic analysis treats technology as given exogenously, even while it is determined endogenously. This paper examines that conceptual conflict and outlines an alternative conceptual framework, which uses a 'General Vertical Division of Labour' into conceptual and executive parts to facilitate a coherent political-economic explanation of technological change. The paper suggests that we may acquire, rather than impose, an understanding of technological change. It also suggests that we may re-define and reassess the efficiency of technological change through the values inculcated into it.
Abstract:
The results presented in this deliverable depict relevant aspects of the EU-based Applied Game industry and its competitive landscape. This preliminary overview of the primary target market for the RAGE ecosystem identifies some of the key issues to be investigated further by the RAGE WP7 team through stakeholder/market consultations commencing in year 2 of the project. These findings will form an integral part of the baseline needed to formulate a sustainable exploitation strategy for the RAGE assets and ecosystem.
Abstract:
The Continuous Plankton Recorder (CPR) survey was conceived from the outset as a programme of applied research designed to assist the fishing industry. Its survival and continuing vigour after 70 years is a testament to its utility, which has been achieved in spite of great changes in our understanding of the marine environment and in our concerns over how to manage it. The CPR has been superseded in several respects by other technologies, such as acoustics and remote sensing, but it continues to provide unrivalled seasonal and geographic information about a wide range of zooplankton and phytoplankton taxa. The value of this coverage increases with time and provides the basis for placing recent observations into the context of long-term, large-scale variability and thus suggesting what the causes are likely to be. Information from the CPR is used extensively in judging environmental impacts and producing quality status reports (QSR); it has shown the distributions of fish stocks, which had not previously been exploited; it has pointed to the extent of ungrazed phytoplankton production in the North Atlantic, which was a vital element in establishing the importance of carbon sequestration by phytoplankton. The CPR continues to be the principal source of large-scale, long-term information about the plankton ecosystem of the North Atlantic. It has recently provided extensive information about the biodiversity of the plankton and about the distribution of introduced species. It serves as a valuable example for the design of future monitoring of the marine environment and it has been essential to the design and implementation of most North Atlantic plankton research.
Abstract:
Ecosystem services provided by the marine environment are fundamental to human health and well-being. Despite this, many marine systems are being degraded to an extent that may reduce their capacity to provide these ecosystem services. The ecosystem approach is a strategy for the integrated management of land, water and living resources that promotes conservation and sustainable use in an equitable way (UN Convention on Biological Diversity, 2000). Its application to marine management and spatial planning has been proposed as a means of maintaining the economic and social value of the oceans, not only in the present but for generations to come. Characterising the susceptibility of services (and combinations of services) to particular human activities based on knowledge of impacts on biodiversity and ecosystem functioning (as described in preceding chapters) is a challenge for future management of the oceans. In this chapter, we highlight the existing, but limited knowledge of how ecosystem services may be impacted by different human activities. We discuss how impacts on one service can impact multiple services and explore how the impacts on services can vary both spatially and temporally and according to context. We focus particularly on the effects on ecosystem services of activities whose impacts on biodiversity and ecosystem functioning have already been considered in previous chapters. Some of these activities are associated with poor management of ecosystem benefits, for example, from provisioning services (aquaculture and fisheries), or with excessive input of wastes, fertilisers and contaminants into the system overburdening the waste treatment and assimilation services. Other impacts are associated with the construction of structures or use of space designed to generate benefits from environmental services such as the presence of water as a carrier for shipping, or sources of wind, wave and tidal power. 
We discuss the trade-offs, made consciously or otherwise, between different ecosystem services that arise when human activities are directed at optimising or managing specific ecosystem services.