828 results for expected value of information
Abstract:
Abstract taken from the publication.
Abstract:
This paper presents the findings of a podcasting trial held in 2007-2008 within the Faculty of Economics and Business at the University of Sydney, Australia. The trial investigates the value of using short-format podcasts to support assessment for postgraduate and undergraduate students. A multi-method approach is taken in investigating perceptions of the benefits of podcasting, incorporating surveys, focus groups and interviews. The results show that a majority of students believe they gained learning benefits from the podcasts and appreciated the flexibility of the medium to support their learning, and that the lecturers felt the innovation helped diversify their pedagogical approach and support a diverse student population. Three primary conclusions are presented: (1) most students reject the mobile potential of podcasting in favour of their traditional study space at home; (2) what students value about this podcasting design overlaps with what lecturers value; (3) the assessment-focussed, short-format podcast design may be considered a successful podcasting model. The paper finishes by identifying areas for future research on the effective use of podcasting in learning and teaching.
Abstract:
Coursework 2 for INFO2009 by Group 23. This resource contains a poster and a web-based questionnaire. Please access the following website for the questionnaire: http://users.ecs.soton.ac.uk/rrs4g10/info2009
Abstract:
Link to various resources appropriate for revising the FOI
Abstract:
Abstract: In this talk, I'll focus on the work we've been doing on evaluating the cognitive side of dealing with information resources and increasingly complex user interfaces. While we can build increasingly powerful user interfaces, they often come at the cost of simple design and ease of use. I'll describe two specific studies: 1) work on the ORCHID project, focused on measuring mental workload during tasks using fNIRS (a blood-oxygen-based brain scanner), and 2) an evaluation metric for measuring how much people learn during tasks. Together these provide advances towards understanding the cognitive side of information interaction, working towards building better tools for users.
Abstract:
A project to identify metrics for assessing the quality of open data, based on the needs of small voluntary-sector organisations in the UK and India. For this project we assumed that the purpose of open data metrics is to determine the value of a group of open datasets to a defined community of users. We adopted a much more user-centred approach than most open data research, using small structured workshops to identify users' key problems and then working from those problems to understand how open data can help address them, and the key attributes the data must have if it is to be successful. We then piloted different metrics that might be used to measure the presence of those attributes. The result was six metrics, which we assessed for validity, reliability, discrimination, transferability and comparability. This user-centred approach to open data research highlighted some fundamental issues with expanding the use of open data beyond its enthusiast base.
Abstract:
Abstract 1: Social networks such as Twitter are often used for disseminating and collecting information during natural disasters, and their potential for use in disaster management has been acknowledged. However, a more nuanced understanding of the communications that take place on social networks is required to integrate this information more effectively into disaster-management processes. The type and value of information shared should be assessed, determining the benefits and issues, with credibility and reliability as known concerns. Mapping tweets onto the modelled stages of a disaster can be a useful evaluation for determining the benefits and drawbacks of using data from social networks, such as Twitter, in disaster management. A thematic analysis of tweets' content, language and tone during the UK storms and floods of 2013/14 was conducted. Manual scripting was used to determine the official sequence of events and to classify the stages of the disaster into the phases of the Disaster Management Lifecycle, producing a timeline. Twenty-five topics discussed on Twitter emerged, and three key types of tweets, based on language and tone, were identified. The timeline represents the events of the disaster, according to the Met Office reports, classified into B. Faulkner's Disaster Management Lifecycle framework. Context is provided when the analysed tweets are observed against the timeline, illustrating a potential basis and benefit for mapping tweets into the Disaster Management Lifecycle phases. Comparing the number of tweets submitted each month with the timeline suggests that users tweet more as an event heightens and persists. Furthermore, users generally express greater emotion and urgency in their tweets. This paper concludes that thematic analysis of content on social networks, such as Twitter, can be useful in gaining additional perspectives for disaster management.
It demonstrates that mapping tweets into the phases of a Disaster Management Lifecycle model can have benefits in the recovery phase, not just the response phase, potentially improving future policies and activities. Abstract 2: The current execution of privacy policies, as a mode of communicating information to users, is unsatisfactory. Social networking sites (SNS) exemplify this issue, attracting growing concerns regarding their use of personal data and its effect on user privacy. This demonstrates the need for more informative policies. However, SNS lack the incentives required to improve policies, which is exacerbated by the difficulties of creating a policy that is both concise and compliant. Standardization addresses many of these issues, providing benefits for users and SNS, although it is only possible if policies share attributes which can be standardized. This investigation used thematic analysis and cross-document structure theory to assess the similarity of attributes between the privacy policies (as available in August 2014) of the six most frequently visited SNS globally. Using the Jaccard similarity coefficient, two types of attribute were measured: the clauses used by SNS, and the coverage of forty recommendations made by the UK Information Commissioner's Office. Analysis showed that whilst similarity in the clauses used was low, similarity in the recommendations covered was high, indicating that SNS use different clauses to convey similar information. The analysis also showed that the low similarity in clauses was largely due to differences in semantics, elaboration and functionality between SNS. This paper therefore proposes that the policies of SNS already share attributes, indicating the feasibility of standardization, and makes five recommendations, based on the findings of the investigation, to begin facilitating it.
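The Jaccard similarity coefficient used in the second study is simply the size of the intersection of two attribute sets divided by the size of their union. A minimal sketch (the clause labels below are hypothetical illustrations, not taken from the actual policies analysed):

```python
def jaccard(a, b):
    """Jaccard similarity coefficient: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # convention: two empty sets are identical
    return len(a & b) / len(a | b)

# Hypothetical clause attributes extracted from two privacy policies:
policy_a = {"data-collection", "third-party-sharing", "cookies", "retention"}
policy_b = {"data-collection", "cookies", "advertising"}
print(jaccard(policy_a, policy_b))  # 2 shared of 5 total attributes = 0.4
```

A low coefficient on clause sets alongside a high coefficient on covered recommendations is exactly the pattern the study reports: different wording, similar informational coverage.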
Abstract:
The theoretical and empirical context of this research on intonation falls within the linguistic philosophy of systemic-functional theory. The methodological model employed is based on Corpus Linguistics. The description of foreign-language acquisition and learning is grounded in the theoretical framework of interlanguage theory and the acquisition of second and foreign languages. The main objective of this study of a comparative, longitudinal corpus of learners and native speakers of English is to investigate the intonation patterns produced by both groups of speakers. It aims to show that the difference in intonation between these two groups not only results in non-native speakers having a foreign accent, but can also affect the message conveyed, in terms of the structure and organisation of information within spoken discourse in the textual and interpersonal metafunctions. The existence of an interlanguage intonation system is assumed; accordingly, this analysis aims to capture not only the learners' errors but also their possible approximative systems.
Abstract:
Abstract based on the one in the publication.
Abstract:
Abstract taken from the publication.
Abstract:
The author studies the error and complexity of the discrete random-walk Monte Carlo technique for radiosity, using both the shooting and gathering methods, and shows that the shooting method exhibits lower complexity than the gathering one and, under some constraints, has linear complexity. This is an improvement over a previous result that pointed to an O(n log n) complexity. Three unbiased estimators are given and compared for each method, and closed forms and bounds are obtained for their variances. The expected value of the mean square error (MSE) is also bounded. Some of the results obtained are also shown.
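The paper's estimators are specific to radiosity, but the key relationship it exploits, that for an unbiased estimator the expected MSE equals the estimator's variance, can be illustrated generically. A minimal sketch (plain Python with a toy integrand, not the paper's shooting/gathering setting):

```python
import random

def mc_estimate(f, sampler, n):
    """Unbiased Monte Carlo estimator of E[f(X)]: the mean of n i.i.d. samples.

    Returns the estimate and the estimated variance of that mean; since the
    estimator is unbiased, this variance is also its expected MSE.
    """
    samples = [f(sampler()) for _ in range(n)]
    mean = sum(samples) / n
    sample_var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, sample_var / n  # variance of the mean shrinks as O(1/n)

random.seed(0)
# E[X^2] = 1/3 for X ~ Uniform(0, 1); the estimate converges at rate O(1/n)
est, var_of_mean = mc_estimate(lambda x: x * x, random.random, 10_000)
```

Bounding this variance in closed form, as the paper does for each shooting and gathering estimator, directly bounds the expected MSE.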
Abstract:
This dissertation studies the effects of Information and Communication Technologies (ICT) on the banking sector and the payments system. It provides insight into how technology-induced changes occur, by exploring both the nature and scope of the main technology innovations and evidencing their economic implications for banks and payment systems. Some parts of the dissertation are descriptive. They summarise the main technological developments in the field of finance and link them to economic policies. These parts are complemented with sections of the study that focus on assessing the extent of technology application to banking and payment activities. Finally, it also includes some work which borrows from the economic literature on banking. The need for an interdisciplinary approach arises from the complexity of the topic and the rapid pace of change to which it is subject. The first chapter provides an overview of the influence of developments in ICT on the evolution of financial services and international capital flows. We include main indicators and discuss innovation in the financial sector, exchange rates and international capital flows. The chapter concludes with impact analysis and policy options regarding the international financial architecture, some monetary policy issues and the role of international institutions. The second chapter is a technology assessment study that focuses on the relationship between technology and money. The application of technology to payments systems is transforming the way we use money and, in some instances, is blurring the definition of what constitutes money. This chapter surveys the developments in electronic forms of payment and their relationship to the banking system. It also analyses the challenges posed by electronic money for regulators and policy makers, and in particular the opportunities created by two simultaneous processes: the Economic and Monetary Union and the increasing use of electronic payment instruments.
The third chapter deals with the implications of developments in ICT on relationship banking. The financial intermediation literature explains relationship banking as a type of financial intermediation characterised by proprietary information and multiple interactions with customers. This form of banking is important for the financing of small and medium-sized enterprises. We discuss the effects of ICT on the banking sector as a whole and then apply these developments to the case of relationship banking. The fourth chapter is an empirical study of the effects of technology on the banking business, using a sample of data from the Spanish banking industry. The design of the study is based on some of the events described in the previous chapters, and also draws from the economic literature on banking. The study shows that developments in information management have differential effects on wholesale and retail banking activities. Finally, the last chapter is a technology assessment study on electronic payments systems in Spain and the European Union. It contains an analysis of existing payment systems and ongoing or planned initiatives in Spain. It forms part of a broader project comprising a series of country-specific analyses covering ten European countries. The main issues raised across the countries serve as the starting point to discuss implications of the development of electronic money for regulation and policies, and in particular, for monetary-policy making.