771 results for user requirements
Abstract:
Typeface design: a series of collaborative projects commissioned by Adobe, Inc. and Brill to develop extensive polytonic Greek typefaces. The two Adobe typefaces can be seen as extensions of previous research for the Garamond Premier Pro family (2005), and conclude a research theme started in 1998 with work for Adobe’s Minion Pro Greek. Together these typefaces define the state of the art for text-intensive Greek typesetting across wide character set texts (from classical texts, to poetry, to essays, to prose). They serve both as exemplars for other developers, and as vehicles for developing the potential of Greek text typography, for example through the parallel inclusion of monotonic and polytonic characters, detailed localised punctuation options, fluid handling of case-conversion issues, and innovative options such as accented small caps (originally requested by bibliographers, and subsequently rolled out to a general user base). The Brill typeface (for the established academic publisher) has an exceptionally wide character set to cover several academic disciplines, and is intended to differentiate sufficiently from its partner Latin typeface, while maintaining a clear texture in both offset and low-resolution print-on-demand reproduction. This work involved substantial amounts of testing and modifying the design, especially of diacritics, to maintain the clarity and readability of unfamiliar words. Altogether these typefaces form a study in how Greek typesetting meets contemporary typographic requirements, while resonating with historically accurate styles, where these are present. Significant research in printing archives helped to identify appropriate styles, as well as originate variants that are coherent stylistically, even when historical equivalents were absent.
Abstract:
Health care provision is significantly impacted by the ability of health providers to engineer a viable healthcare space that supports care stakeholders’ needs. In this paper we discuss and propose the use of organisational semiotics as a set of methods to link stakeholders to systems, which allows us to capture clinician activity, information transfer, and building use; this in turn allows us to define the value of specific systems in the care environment to specific stakeholders, and the dependence between systems in a care space. We suggest the use of a semantically enhanced building information model (BIM) to support the linking of clinician activity to physical resource objects and space, and to facilitate the capture of quantifiable data, over time, concerning resource use by key stakeholders. Finally, we argue for the inclusion of appropriate stakeholder feedback and persuasive mechanisms, to incentivise building user behaviour in support of organisational-level sustainability policy.
Abstract:
We discuss the potential of using THz spectrometry for the direct observation of phase transitions in foodstuffs, with the aim of quantifying consumer perception. Experimental results from phase transitions using a continuous wave dispersive Fourier transform spectrometer and a cyclotron enhanced liquid helium cooled bolometric detector are reported.
Abstract:
The increasing use of social media, applications or platforms that allow users to interact online, ensures that this environment will provide a useful source of evidence for the forensics examiner. Current tools for the examination of digital evidence find this data problematic as they are not designed for the collection and analysis of online data. Therefore, this paper presents a framework for the forensic analysis of user interaction with social media. In particular, it presents an inter-disciplinary approach for the quantitative analysis of user engagement to identify relational and temporal dimensions of evidence relevant to an investigation. This framework enables the analysis of large data sets from which a (much smaller) group of individuals of interest can be identified. In this way, it may be used to support the identification of individuals who might be ‘instigators’ of a criminal event orchestrated via social media, or a means of potentially identifying those who might be involved in the ‘peaks’ of activity. In order to demonstrate the applicability of the framework, this paper applies it to a case study of actors posting to a social media Web site.
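The framework above identifies ‘peaks’ of activity as a way of narrowing a large dataset down to individuals of interest. As an illustrative sketch only (the paper’s own method is not given here; the bucketing window and threshold are assumptions), peaks in posting activity can be flagged by binning timestamps into fixed windows and marking windows whose counts are unusually high:

```python
from collections import Counter

def activity_peaks(timestamps, window=3600, threshold_sigma=2.0):
    """Bucket posting timestamps (seconds) into fixed windows and flag
    windows whose post count exceeds mean + threshold_sigma * std dev.
    Returns the start time (seconds) of each flagged window."""
    counts = Counter(int(t // window) for t in timestamps)
    values = list(counts.values())
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [w * window for w, c in counts.items()
            if c > mean + threshold_sigma * std]
```

Windows flagged this way would then be examined manually for likely instigators, in the spirit of the two-stage narrowing the framework describes.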
Abstract:
There is increasing pressure to capture video within Higher Education. Although much research has looked at how communication technologies enhance information transfer during playback of video, consideration of technical issues seems incongruous if we do not consider how presentation mode affects the information assimilated by, and the satisfaction of, learners with a range of individual differences, and from a range of different backgrounds. This paper considers whether a relationship exists between the media and presentation mode used in recorded content, and the level of information assimilation and satisfaction perceived by learners with a range of individual differences. Results aim to inform learning practitioners whether generic delivery is justified, or whether tailoring content delivery enhances the experience of specific learner groups.
Abstract:
The discrete Fourier transform spread OFDM (DFTS-OFDM) based single-carrier frequency division multiple access (SC-FDMA) has been widely adopted due to its lower peak-to-average power ratio (PAPR) of transmit signals compared with OFDM. However, offset modulation, which has lower PAPR than general modulation, cannot be directly applied to the existing SC-FDMA. When pulse-shaping filters are employed to further reduce the envelope fluctuation of SC-FDMA transmit signals, the spectral efficiency degrades as well. In order to overcome such limitations of conventional SC-FDMA, this paper investigates, for the first time, cyclic-prefixed OQAM-OFDM (CP-OQAM-OFDM) based SC-FDMA transmission with adjustable user bandwidth and space-time coding. Firstly, we propose CP-OQAM-OFDM transmission with unequally-spaced subbands. We then apply it to SC-FDMA transmission and propose an SC-FDMA scheme with the following features: a) the transmit signal of each user is offset-modulated single-carrier with frequency-domain pulse-shaping; b) the bandwidth of each user is adjustable; c) the spectral efficiency does not decrease with increasing roll-off factors. To combat both inter-symbol interference and multiple access interference in frequency-selective fading channels, a joint linear minimum mean square error frequency-domain equalization using a priori information with low complexity is developed. Subsequently, we construct space-time codes for the proposed SC-FDMA. Simulation results confirm the effectiveness of the proposed CP-OQAM-OFDM scheme (i.e., effective yet with low complexity).
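The PAPR metric at the heart of this abstract is easy to reproduce. The following sketch (not the paper’s proposed scheme; the QPSK constellation, 256 subcarriers, and seed are arbitrary assumptions) computes PAPR for a plain OFDM signal versus a constant-envelope single-carrier QPSK signal, illustrating why single-carrier schemes are preferred at the transmitter:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(0)
n = 256
# Random QPSK symbols, unit average power
sym = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
ofdm = np.fft.ifft(sym) * np.sqrt(n)   # multicarrier: large envelope peaks
single_carrier = sym                   # plain QPSK: constant envelope, 0 dB PAPR
print(papr_db(ofdm), papr_db(single_carrier))
```

The OFDM signal’s PAPR is typically several dB above the single-carrier signal’s 0 dB, which is the gap that DFT-spreading and offset modulation aim to close.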
Abstract:
Structured abstract: Purpose: LibraryThing is a Web 2.0 tool allowing users to catalogue books using data drawn from sources such as Amazon and the Library of Congress, and has facilities such as tagging and interest groups. This study evaluates whether LibraryThing is a valuable tool for libraries to use for promotional and user engagement purposes. Methodology: This study used a three-phase sequential mixed-methods design: (1) the identification of LibraryThing features for user engagement or promotional purposes, (2) exploratory semi-structured interviews, and (3) a questionnaire. Findings: Several uses of LibraryThing for promotional and user engagement purposes were identified. The most popular reason libraries used LibraryThing was to promote the library or library stock, with most respondents using it specifically to highlight collections of books. Monitoring of patron usage was low and many respondents had not received any feedback. LibraryThing was commonly reported as being easy to use, remotely accessible, and having low cost, whilst its main drawbacks were the 200-book limit for free accounts, and it being a third-party site. The majority of respondents felt LibraryThing was a useful tool for libraries. Practical implications: LibraryThing has most value as a promotional tool for libraries. Libraries should actively monitor patron usage of their LibraryThing account or request user feedback to ensure that LibraryThing provides a truly valuable service for their library. Originality: There is little research on the value of LibraryThing for libraries, or librarians’ perceptions of LibraryThing as a Web 2.0 tool.
Abstract:
An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here we use the Urban Zones to characterize Energy partitioning (UZE) as the basis to assign default SLUCM parameter values. A methodology (FRAISE) to assign sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of the model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in the surface characteristics and intra-urban heterogeneities in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling application using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) that the parameter values identified should be adopted as default values in WRF.
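The RMSE comparisons mentioned above are straightforward to compute; one practical wrinkle with flux-tower records is gap-filling. A minimal sketch (not the paper’s evaluation code; the NaN convention for gaps is an assumption) of a gap-aware RMSE for observed versus modelled flux series:

```python
import numpy as np

def rmse(observed, modelled):
    """Root-mean-square error between observed and modelled flux series,
    skipping time steps where either record is missing (NaN), as is
    common in eddy-covariance observations."""
    obs = np.asarray(observed, dtype=float)
    mod = np.asarray(modelled, dtype=float)
    mask = ~(np.isnan(obs) | np.isnan(mod))
    return float(np.sqrt(np.mean((obs[mask] - mod[mask]) ** 2)))
```

Applied per flux (net radiation, sensible and latent heat), per site and per season, this gives the kind of comparison table used to rank urban land-surface schemes.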
Abstract:
This paper introduces a novel approach for free-text keystroke dynamics authentication which incorporates the keyboard’s key layout. The method extracts timing features from specific key-pairs. The Euclidean distance is then utilized to find the level of similarity between a user’s profile data and his/her test data. The results obtained from this method are reasonable for free-text authentication while maintaining the maximum level of user relaxation. Moreover, this study shows that flight time yields better authentication results than dwell time. In particular, the results were obtained with only one training sample, for practicality and ease of real-life application.
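The core pipeline described above — per-key-pair flight times compared by Euclidean distance — can be sketched as follows. This is an illustrative reconstruction, not the paper’s implementation; the event format `(key, down_ms, up_ms)` and the restriction to key pairs shared by both samples are assumptions:

```python
import math

def digraph_features(events):
    """Mean flight time (previous key-up to next key-down) per key pair,
    from a list of (key, down_ms, up_ms) events in typing order."""
    pairs = {}
    for (k1, _, up1), (k2, down2, _) in zip(events, events[1:]):
        pairs.setdefault((k1, k2), []).append(down2 - up1)
    return {p: sum(v) / len(v) for p, v in pairs.items()}

def euclidean_distance(profile, test):
    """Euclidean distance over key pairs present in both samples;
    a lower distance means the test sample is closer to the profile."""
    shared = profile.keys() & test.keys()
    return math.sqrt(sum((profile[p] - test[p]) ** 2 for p in shared))
```

Authentication then reduces to accepting the test sample when the distance falls below a tuned threshold.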
Abstract:
The quality control, validation and verification of the European Flood Alert System (EFAS) are described. EFAS is designed as a flood early warning system at pan-European scale, to complement national systems and provide flood warnings more than 2 days before a flood. On average 20–30 alerts per year are sent out to the EFAS partner network, which consists of 24 national hydrological authorities responsible for transnational river basins. Quality control of the system includes the evaluation of hits, misses and false alarms, showing that EFAS produces hits more than 50% of the time. Furthermore, the skill of both the meteorological and the hydrological forecasts is evaluated, and is reported here for a 10-year period. Next, end-user needs and feedback are systematically analysed. Suggested improvements, such as real-time river discharge updating, are currently being implemented.
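The hit/miss/false-alarm evaluation mentioned above corresponds to standard contingency-table verification scores. As a small sketch (the score names are standard forecast-verification terms, not taken from this abstract):

```python
def forecast_skill(hits, misses, false_alarms):
    """Standard contingency-table scores for event-based warning systems."""
    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    csi = hits / (hits + misses + false_alarms)   # critical success index
    return pod, far, csi
```

A statement such as “more than 50% hits” corresponds to a probability of detection above 0.5 in this framework.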
Abstract:
This paper demonstrates that the use of GARCH-type models for the calculation of minimum capital risk requirements (MCRRs) may lead to the production of inaccurate and therefore inefficient capital requirements. We show that this inaccuracy stems from the fact that GARCH models typically overstate the degree of persistence in return volatility. A simple modification to the model is found to improve the accuracy of MCRR estimates in both back- and out-of-sample tests. Given that internal risk management models are currently in widespread use in some parts of the world (most notably the USA), and will soon be permitted for EC banks and investment firms, we believe that our paper should serve as a valuable caution to risk management practitioners who are using, or intend to use, this popular class of models.
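The “persistence” that the abstract says GARCH models tend to overstate is, for a GARCH(1,1), the sum α + β; the closer it is to 1, the slower volatility shocks decay. A minimal sketch (illustrative only; the parameter values are hypothetical, not estimates from the paper):

```python
import numpy as np

def garch11_path(omega, alpha, beta, n, seed=0):
    """Simulate a GARCH(1,1) return series:
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    var = omega / (1 - alpha - beta)  # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(var) * rng.standard_normal()
        var = omega + alpha * r[t] ** 2 + beta * var
    return r

def half_life(alpha, beta):
    """Periods for a variance shock to decay by half; persistence = alpha + beta."""
    return np.log(0.5) / np.log(alpha + beta)
```

With α + β = 0.99 the half-life is roughly 69 periods, so an overstated persistence inflates multi-day risk forecasts — which is exactly how biased MCRRs arise.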
Abstract:
Runoff generation processes and pathways vary widely between catchments. Credible simulations of solute and pollutant transport in surface waters are dependent on models which facilitate appropriate, catchment-specific representations of perceptual models of the runoff generation process. Here, we present a flexible, semi-distributed landscape-scale rainfall-runoff modelling toolkit suitable for simulating a broad range of user-specified perceptual models of runoff generation and stream flow occurring in different climatic regions and landscape types. PERSiST (the Precipitation, Evapotranspiration and Runoff Simulator for Solute Transport) is designed for simulating present-day hydrology; projecting possible future effects of climate or land use change on runoff and catchment water storage; and generating hydrologic inputs for the Integrated Catchments (INCA) family of models. PERSiST has limited data requirements and is calibrated using observed time series of precipitation, air temperature and runoff at one or more points in a river network. Here, we apply PERSiST to the river Thames in the UK and describe a Monte Carlo tool for model calibration, sensitivity and uncertainty analysis.
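The perceptual models PERSiST encodes are typically built from connected storage “buckets”. As a deliberately minimal single-bucket sketch (not PERSiST itself; the capacity, drainage coefficient and initial storage are hypothetical), illustrating saturation-excess spill plus linear-reservoir drainage driven by precipitation and evapotranspiration:

```python
def bucket_model(precip, pet, capacity=100.0, k=0.1):
    """Minimal single-bucket rainfall-runoff sketch. Storage gains
    precipitation, loses potential evapotranspiration, spills any
    volume above capacity (saturation excess), and drains linearly
    (runoff = k * storage). All quantities in mm per time step."""
    storage, runoff = capacity / 2, []
    for p, e in zip(precip, pet):
        storage = max(storage + p - e, 0.0)
        spill = max(storage - capacity, 0.0)  # saturation-excess overflow
        storage -= spill
        q = k * storage                       # linear reservoir drainage
        storage -= q
        runoff.append(q + spill)
    return runoff
```

A semi-distributed toolkit chains many such buckets per landscape unit and routes their outputs down the river network; Monte Carlo calibration then samples parameters like `capacity` and `k` against observed discharge.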