662 results for Microsoft Excel ®
Abstract:
There are a variety of guidelines and methods available to measure and assess survey quality. Most of these are based on qualitative descriptions; in practice they are not easy to implement, and it is very difficult to make comparisons between surveys. Hence there is both a theoretical and a practical demand for a mainly quantitative survey assessment tool. This research aimed to meet this need and to contribute to the evaluation and improvement of survey quality. Acknowledging the critical importance of measurement issues in survey research, this thesis starts with a comprehensive introduction to measurement theory and identifies the types of measurement errors associated with measurement procedures through three experiments. It then describes the concepts, guidelines and methods available for measuring and assessing survey quality. Combining these with measurement principles leads to the development of a quantitative, statistically based holistic tool for measuring and assessing survey quality. The criteria, weights and subweights for the assessment tool are determined using Multi-Criteria Decision-Making (MCDM) and a survey questionnaire based on the Delphi method. Finally, the model is applied to a database of surveys constructed to develop methods for the classification, assessment and improvement of survey quality. The model developed in this thesis enables survey researchers and/or commissioners to make a holistic assessment of the value of particular surveys. It is an Excel-based audit which takes a holistic approach, following all stages of the survey from inception through design, construction, execution, analysis and dissemination. At each stage a set of criteria is applied to assess quality. Scores attained against these assessments are weighted by the importance of the criteria and summed to give an overall assessment of the stage. The total score for a survey is obtained by combining the scores for every stage, weighted again by the importance of each stage. The result is a means of survey assessment which can be used in a diagnostic manner to assess and improve survey quality.
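As an illustration of the scoring scheme just described, the following sketch computes each stage score as a weighted sum of criterion scores and then combines the stage scores, weighted again by stage importance. The stage names, criteria, weights and scores below are invented for illustration; the actual criteria and weights in the thesis come from the MCDM and Delphi work.

```python
# Two-level weighted scoring, as described in the abstract.
# All names, weights and scores here are illustrative placeholders.

def stage_score(criteria):
    """Weighted sum of criterion scores for one survey stage."""
    return sum(weight * score for weight, score in criteria.values())

def survey_score(stages, stage_weights):
    """Combine stage scores, weighted again by the importance of each stage."""
    return sum(stage_weights[name] * stage_score(criteria)
               for name, criteria in stages.items())

# criterion -> (weight, score on a 0-10 scale)
stages = {
    "design":    {"sampling plan": (0.6, 7), "question wording": (0.4, 8)},
    "execution": {"response rate": (0.5, 6), "interviewer training": (0.5, 9)},
}
stage_weights = {"design": 0.55, "execution": 0.45}

print(survey_score(stages, stage_weights))   # overall quality score for the survey
```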
Abstract:
This survey was undertaken by the film crew accompanying Cary Grant when making the film "Charade" in 1963.
Abstract:
Collaborative projects between Industry and Academia provide excellent opportunities for learning. Throughout the academic year 2014-2015 undergraduates from the School of Arts, Media and Computer Games at Abertay University worked with academics from the Infection Group at the University of St Andrews and industry partners Microsoft and DeltaDNA. The result was a serious game prototype that utilized game design techniques and technology to demystify and educate players about the diagnosis and treatment of one of the world's oldest and deadliest diseases, Tuberculosis (TB). Project Sanitarium is a game incorporating a mathematical model that is based on data from real-world drug trials. This paper discusses the project design and development, demonstrating how the project builds on the successful collaborative pedagogical model developed by academic staff at Abertay University. The aim of the model is to provide undergraduates with workplace simulation, wider industry collaboration and access to academic expertise to solve challenging and complex problems.
Abstract:
Doctoral thesis presented to Universidade Fernando Pessoa as part of the requirements for obtaining the degree of Doctor in Earth Sciences.
Abstract:
Implementations are presented of two common algorithms for integer factorization, Pollard’s “p – 1” method and the SQUFOF method. The algorithms are implemented in the F# language, a functional programming language developed by Microsoft and officially released for the first time in 2010. The algorithms are thoroughly tested on a set of large integers (up to 64 bits in size), running both on a physical machine and a Windows Azure machine instance. Analysis of the relative performance between the two environments indicates comparable performance when taking into account the difference in computing power. Further analysis reveals that the relative performance of the Azure implementation tends to improve as the magnitudes of the integers increase, indicating that such an approach may be suitable for larger, more complex factorization tasks. Finally, several questions are presented for future research, including the performance of F# and related languages for more efficient, parallelizable algorithms, and the relative cost and performance of factorization algorithms in various environments, including physical hardware and commercial cloud computing offerings from the various vendors in the industry.
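Pollard's p − 1 method is compact enough to sketch. The Python version below (not the F# implementation evaluated here, and with an arbitrary smoothness bound) shows the basic idea; the test composite is chosen purely for illustration.

```python
from math import gcd

def pollard_p_minus_1(n, bound=100_000):
    """Pollard's p - 1 method: return a nontrivial factor of n,
    or None if none is found with this smoothness bound."""
    a = 2
    for k in range(2, bound):
        a = pow(a, k, n)        # after step k, a = 2^(k!) mod n
        d = gcd(a - 1, n)
        if 1 < d < n:
            return d            # succeeds when p - 1 is smooth for some prime p dividing n
        if d == n:
            return None         # all factors found at once; retry with a smaller bound
    return None

# Illustrative composite: 65537 * 1000000007. Because 65537 - 1 = 2^16 is very
# smooth, the method recovers the factor 65537 after a handful of iterations.
print(pollard_p_minus_1(65537 * 1000000007))
```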
Abstract:
Understanding the nature of the workloads and system demands created by users of the World Wide Web is crucial to properly designing and provisioning Web services. Previous measurements of Web client workloads have been shown to exhibit a number of characteristic features; however, it is not clear how those features may be changing with time. In this study we compare two measurements of Web client workloads separated in time by three years, both captured from the same computing facility at Boston University. The older dataset, obtained in 1995, is well known in the research literature and has been the basis for a wide variety of studies. The newer dataset was captured in 1998 and is comparable in size to the older dataset. The new dataset has the drawback that the collection of users measured may no longer be representative of general Web users; however, using it has the advantage that many comparisons can be drawn more clearly than would be possible using a new, different source of measurement. Our results fall into two categories. First, we compare the statistical and distributional properties of Web requests across the two datasets. This serves to reinforce and deepen our understanding of the characteristic statistical properties of Web client requests. We find that the kinds of distributions that best describe document sizes have not changed between 1995 and 1998, although specific values of the distributional parameters are different. Second, we explore the question of how the observed differences in the properties of Web client requests, particularly the popularity and temporal locality properties, affect the potential for Web file caching in the network. We find that, for the computing facility represented by our traces between 1995 and 1998, (1) the benefits of using size-based caching policies have diminished; and (2) the potential for caching requested files in the network has declined.
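The caching results refer to size-based replacement policies. As a rough illustration only (the specific policies studied in the paper may differ), the sketch below replays a request trace through a toy cache that refuses files larger than its capacity and evicts the largest stored file first.

```python
class SizeBasedCache:
    """Toy web cache illustrating one size-based replacement policy:
    evict the largest cached file first."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.files = {}                      # url -> size in bytes
        self.hits = self.misses = 0

    def request(self, url, size):
        if url in self.files:
            self.hits += 1
            return
        self.misses += 1
        if size > self.capacity:
            return                           # too large to cache at all
        while sum(self.files.values()) + size > self.capacity:
            largest = max(self.files, key=self.files.get)
            del self.files[largest]          # evict the largest stored file
        self.files[url] = size

# Replay a (url, size) request trace and report the hit rate.
trace = [("/a.html", 2_000), ("/b.gif", 50_000), ("/a.html", 2_000)]
cache = SizeBasedCache(capacity_bytes=40_000)
for url, size in trace:
    cache.request(url, size)
print(cache.hits / len(trace))
```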
Abstract:
In this paper, we propose and evaluate an implementation of a prototype scalable web server. The prototype consists of a load-balanced cluster of hosts that collectively accept and service TCP connections. The host IP addresses are advertised using the Round Robin DNS technique, allowing any host to receive requests from any client. Once a client attempts to establish a TCP connection with one of the hosts, a decision is made as to whether the connection should be redirected to a different host, namely the host with the lowest number of established connections. We use the low-overhead Distributed Packet Rewriting (DPR) technique to redirect TCP connections. In our prototype, each host keeps information about connections in hash tables and linked lists. Every time a packet arrives, it is examined to determine whether it should be redirected. Load information is maintained using periodic broadcasts amongst the cluster hosts.
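The redirect decision described above can be sketched as follows; the host names and connection counts are illustrative, and the DPR packet rewriting and broadcast plumbing are omitted.

```python
# Least-connections redirect decision, as described in the abstract.
# Host names and load values are placeholders; DPR itself is not shown.

class RedirectDecider:
    def __init__(self, my_host, cluster_loads):
        self.my_host = my_host
        self.loads = dict(cluster_loads)     # host -> established connection count

    def on_load_broadcast(self, host, connections):
        """Update load information received via a periodic broadcast."""
        self.loads[host] = connections

    def decide(self):
        """Return (redirect?, target host) for a newly arriving connection."""
        target = min(self.loads, key=self.loads.get)
        return target != self.my_host, target

decider = RedirectDecider("host-a", {"host-a": 12, "host-b": 7, "host-c": 9})
print(decider.decide())                      # (True, 'host-b'): redirect to the least-loaded host
```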
Abstract:
In a recent paper (Changes in Web Client Access Patterns: Characteristics and Caching Implications, by Barford, Bestavros, Bradley, and Crovella) we performed a variety of analyses upon user traces collected in the Boston University Computer Science department in 1995 and 1998. A sanitized version of the 1995 trace has been publicly available for some time; the 1998 trace has now been sanitized and is available from http://www.cs.bu.edu/techreports/1999-011-usertrace-98.gz or ftp://ftp.cs.bu.edu/techreports/1999-011-usertrace-98.gz. This memo discusses the format of this public version of the log and includes additional discussion of how the data were collected, how the log was sanitized, what this log is and is not useful for, and areas of potential future research interest.
Abstract:
This book originally accompanied a 2-day course on using the LaTeX typesetting system. It has been extensively revised and updated and can now be used for self-study or in the classroom. It is aimed at users of Linux, Macintosh, or Microsoft Windows, but it can be used with LaTeX systems on any platform, including other Unix workstations, mainframes, and even your Personal Digital Assistant (PDA).
Abstract:
Team NAVIGATE aims to create a robust, portable navigational aid for the blind. Our prototype uses depth data from the Microsoft Kinect to perform real-time obstacle avoidance in unfamiliar indoor environments. The device augments the white cane by performing two significant functions: detecting overhanging objects and identifying stairs. Based on interviews with blind individuals, we found a combined audio and haptic feedback system best for communicating environmental information. Our prototype uses vibration motors to indicate the presence of an obstacle and an auditory command to alert the user to stairs ahead. Through multiple trials with sighted and blind participants, the device was successful in detecting overhanging objects and approaching stairs. The device increased user competency and adaptability across all trials.
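As a rough illustration of an overhang check on Kinect-style depth data (not the team's actual algorithm, and without the Kinect SDK), the sketch below flags an obstacle when anything in the upper part of a depth frame comes closer than a chosen threshold.

```python
import numpy as np

# Toy overhang detector for a depth frame given in millimetres.
# The band and threshold are illustrative assumptions only.

def detect_overhang(depth_mm, threshold_mm=1200):
    """Return True if anything in the upper half of the frame
    (roughly head height) is closer than the threshold."""
    upper_band = depth_mm[: depth_mm.shape[0] // 2, :]
    valid = upper_band[upper_band > 0]        # zero readings mean "no data"
    return valid.size > 0 and valid.min() < threshold_mm

frame = np.full((480, 640), 3000, dtype=np.uint16)   # empty corridor, about 3 m away
frame[100:150, 300:340] = 900                         # simulated overhanging object
print(detect_overhang(frame))                         # True -> trigger the vibration motors
```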
Abstract:
Through several simple experiments that are easy to carry out in the classroom, students will be led to recognize how two quantities vary, directly and inversely, so that they can characterize that variation; then, using the data obtained from the practical work and with the help of computer programs (Excel, Geogebra and TI-NspireCas), the trend of the data will be found, bringing the students closer to the concept of mathematical modelling.
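For illustration only (the classroom activity itself uses Excel, Geogebra and TI-NspireCas rather than code), the sketch below checks whether paired measurements vary directly or inversely by testing which of y/x or y*x stays roughly constant; the sample data are invented.

```python
# Classify paired data as direct (y = k*x) or inverse (y = k/x) variation
# by testing which of y/x or y*x is roughly constant. Data are illustrative.

def variation_type(xs, ys, tolerance=0.05):
    ratios = [y / x for x, y in zip(xs, ys)]      # constant if variation is direct
    products = [y * x for x, y in zip(xs, ys)]    # constant if variation is inverse

    def relative_spread(values):
        mean = sum(values) / len(values)
        return max(abs(v - mean) for v in values) / mean

    if relative_spread(ratios) < tolerance:
        return "direct: y = %.2f * x" % (sum(ratios) / len(ratios))
    if relative_spread(products) < tolerance:
        return "inverse: y = %.2f / x" % (sum(products) / len(products))
    return "neither (try another model)"

# Example: minutes to fill a tank versus number of identical taps (inverse variation).
taps = [1, 2, 3, 4]
minutes = [12.1, 6.0, 4.05, 2.98]
print(variation_type(taps, minutes))
```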
Abstract:
Computer simulation of probability problems makes it possible to obtain their solutions through the relative frequency of the number of successes obtained over the n experiments performed. The law of large numbers supports a good approximation of the theoretical probability of an event through the successive repetition of experiments. A series of probability problems is presented below, each with a possible simulation carried out in the Fathom and Excel packages. The theoretical solution of these problems requires basic knowledge of probability, so the simulations aim to offer a way of solving them without resorting to mathematical formalism.
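A minimal sketch of the relative-frequency approach, written in Python rather than Fathom or Excel; the birthday problem is chosen here only as an illustrative event.

```python
import random

# Estimate a probability as the relative frequency of successes over n trials,
# relying on the law of large numbers, as described above.

def estimate_probability(experiment, n=100_000):
    successes = sum(experiment() for _ in range(n))
    return successes / n                      # relative frequency approximates P(event)

def shared_birthday(group_size=23):
    """Success if at least two people in the group share a birthday."""
    birthdays = [random.randrange(365) for _ in range(group_size)]
    return len(set(birthdays)) < group_size

print(estimate_probability(shared_birthday))  # close to the theoretical value of about 0.507
```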
Abstract:
The miniaturization and dissemination of audiovisual media into small, mobile assemblages of cameras, screens and microphones has brought "database cinema" (Manovich) into pockets and handbags. In turn, this micro-portability of video production calls for a reconsideration of database cinema, not as an aesthetic but rather as a media ecology that makes certain experiences and forms of interaction possible. In this context the clip and the fragment become a social currency (showing, trading online, etc.), and the enjoyment of a moment or "occasion" becomes an opportunity for recording, extending, preserving and displaying. If we are now the documentarists of our lives (as so many mobile phone adverts imply), it follows that we are our own archivists as well. From the folksonomies of Flickr and YouTube to the slick "media centres" of Sony, Apple and Microsoft, the audiovisual home archive is a prized territory of struggle among platforms and brands. The database is emerging as the dominant (screen) medium of popular creativity and distribution – but it also brings the categories of "home" and "person" closer to that of the archive.
Abstract:
This is the first report from ALT's new Annual Survey, launched in December 2014. The survey was primarily for ALT members (individual, or at an organisation which is an organisational member); however, it could also be completed by others, perhaps those interested in taking out membership. The report and data highlight emerging work areas that are important to the survey respondents. Analysis of the survey responses indicates a number of areas ALT should continue to support and develop. Priorities for the membership are ‘Intelligent use of learning technology’ and ‘Research and practice’; aligned with this is the value respondents place on communication via the ALT Newsletter/News, social media and Research in Learning Technology. The survey also reveals that ‘Data and Analytics’ and ‘Open Education’ are areas which the majority of respondents find are becoming increasingly important, so our community may benefit from development opportunities ALT can provide. The survey is also a reminder that ALT has an essential role in enabling members to develop research and practice in areas which might be considered minority interests. For example, whilst the majority of respondents did not indicate areas such as ‘Digital and Open Badges’ and ‘Game Based Learning’ as important, there are still members who consider these areas very significant and increasingly valuable, and ALT will continue to better support these groups within our community. Whilst ALT has conducted previous surveys of its membership, this is the first iteration in this form. ALT has committed to surveying the sector on an annual basis, refining the core question set while trying to preserve the opportunity for longitudinal analysis.