927 results for Time complexity
Abstract:
This research paper explores the impact product personalisation has on product attachment and aims to develop a deeper understanding of whether, why, and how consumers choose to personalise. The current research in this field is mainly based on attachment theories and is predominantly product-specific. This paper investigates the link between product attachment and personalisation through in-depth, semi-structured interviews; the data were thematically analysed and broken down into three themes and nine sub-themes. It was found that participants did become more attached to products once they were personalised, and the reasons why this occurred varied. The most common reasons that led to personalisation were functionality and usability, the expression of personality through a product, and the complexity of personalisation. The reasons why participants felt connected to their products included strong emotions/memories, the amount of time and effort invested in the personalisation, and a sense of achievement. Reasons behind the desire for personalisation included co-designing, expression of uniqueness/individualism, and having choice in personalisation. Through theme and inter-theme relationships, many correlations were identified, which created the basis for design recommendations. These recommendations demonstrate how a designer could incorporate the emotions and reasoning behind personalisation into the design process.
Abstract:
Due to the increased complexity, scale, and functionality of information and telecommunication (IT) infrastructures, new exploits and vulnerabilities are discovered every day. These vulnerabilities are mostly exploited by malicious people to penetrate IT infrastructures, mainly to disrupt business or steal intellectual property. Current incidents prove that it is no longer sufficient to perform manual security tests of the IT infrastructure based on sporadic security audits. Instead, networks should be continuously tested against possible attacks. In this paper we present current results and challenges towards realizing automated and scalable solutions to identify possible attack scenarios in an IT infrastructure. Namely, we define an extensible framework which uses public vulnerability databases to identify probable multi-step attacks in an IT infrastructure, and provide recommendations in the form of patching strategies, topology changes, and configuration updates.
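The multi-step attack identification described above can be sketched as a path search over a small attack graph. The hosts and CVE labels below are illustrative placeholders, not drawn from a real vulnerability database, and the graph construction stands in for the framework's database-driven step:

```python
from collections import defaultdict

# Hypothetical attack graph: an edge (src, dst, vuln) means a vulnerability
# on dst is exploitable from src. Host names and CVE labels are made up.
edges = [
    ("internet", "webserver", "CVE-A"),
    ("webserver", "appserver", "CVE-B"),
    ("appserver", "database", "CVE-C"),
    ("webserver", "database", "CVE-D"),
]

graph = defaultdict(list)
for src, dst, vuln in edges:
    graph[src].append((dst, vuln))

def attack_paths(graph, start, target, hosts=None, vulns=None):
    """Enumerate simple multi-step attack paths from start to target."""
    hosts = hosts or [start]
    vulns = vulns or []
    if start == target:
        yield hosts, vulns
        return
    for dst, vuln in graph.get(start, []):
        if dst not in hosts:  # simple paths only: never revisit a host
            yield from attack_paths(graph, dst, target,
                                    hosts + [dst], vulns + [vuln])

paths = list(attack_paths(graph, "internet", "database"))
# Each path is a candidate attack scenario; patching at least one
# vulnerability on every path would block the attacker from the target.
```

Patching recommendations then reduce to choosing a cheap cut set over the enumerated paths.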
Abstract:
Student performance on examinations is influenced by the level of difficulty of the questions. It seems reasonable to propose therefore that assessment of the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries, and find substantial variation across the exams for all measures. Most exams include a mix of questions of low, medium, and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
Abstract:
Conspicuity limitations make bicycling at night dangerous. This experiment quantified bicyclists’ estimates of the distance at which approaching drivers would first recognize them. Twenty-five participants (including 13 bicyclists who rode at least once per week, and 12 who rode once per month or less) cycled in place on a closed-road circuit at night-time and indicated when they were confident that an approaching driver would first recognize that a bicyclist was present. Participants wore black clothing alone or together with a fluorescent bicycling vest, a fluorescent bicycling vest with additional retroreflective tape, or the fluorescent retroreflective vest plus ankle and knee reflectors in a modified ‘biomotion’ configuration. The bicycle had a light mounted on the handlebars which was either static, flashing or off. Participants judged that black clothing made them least visible, that retroreflective strips on the legs in addition to a retroreflective vest made them most visible, and that adding retroreflective materials to a fluorescent vest provides no conspicuity benefits. Flashing bicycle lights were associated with higher conspicuity than static lights. Additionally, occasional bicyclists judged themselves to be more visible than did frequent bicyclists. Overall, bicyclists overestimated their conspicuity compared to previously collected recognition distances and underestimated the conspicuity benefits of retroreflective markings on their ankles and knees. Participants mistakenly judged that a fluorescent vest that did not include retroreflective material would enhance their night-time conspicuity. These findings suggest that bicyclists have dangerous misconceptions concerning the magnitude of the night-time conspicuity problem and the potential value of conspicuity treatments.
Abstract:
BACKGROUND: Hot and cold temperatures have been associated with childhood asthma. However, the relationship between daily temperature variation and childhood asthma is not well understood. This study aimed to examine the relationship between diurnal temperature range (DTR) and childhood asthma. METHODS: A Poisson generalized linear model combined with a distributed lag non-linear model was used to examine the relationship between DTR and emergency department admissions for childhood asthma in Brisbane, from January 1st 2003 to December 31st 2009. RESULTS: There was a statistically significant relationship between DTR and childhood asthma. The DTR effect on childhood asthma increased above a DTR of 10°C. The effect of DTR on childhood asthma was the greatest for lag 0–9 days, with a 31% (95% confidence interval: 11%–58%) increase of emergency department admissions per 5°C increment of DTR. Male children and children aged 5–9 years appeared to be more vulnerable to the DTR effect than others. CONCLUSIONS: Large DTR may trigger childhood asthma. Future measures to control and prevent childhood asthma should take temperature variability into account. More protective measures should be taken after a day with DTR above 10°C.
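As a rough sketch of the distributed-lag setup (not the authors' fitted model), the exposure series is expanded into a lag 0–9 design matrix before the Poisson regression is applied; the DTR series below is synthetic:

```python
import numpy as np

def lag_matrix(dtr, max_lag=9):
    """Column l holds the DTR value l days before each row's date, so each
    day's admission count can be regressed on DTR at lags 0..max_lag."""
    x = np.asarray(dtr, dtype=float)
    n = len(x) - max_lag          # only days with a full lag history are usable
    return np.column_stack(
        [x[max_lag - l : max_lag - l + n] for l in range(max_lag + 1)]
    )

dtr = np.arange(20.0)             # synthetic daily diurnal temperature ranges
X = lag_matrix(dtr)               # shape (11, 10): 11 usable days, lags 0..9

# On a log link, the reported 31% rise in admissions per 5 degC of DTR
# corresponds to a coefficient of log(1.31) / 5 per degC.
beta_per_degC = np.log(1.31) / 5
```

The non-linear lag shape in a DLNM comes from placing a spline basis over these lag columns; the plain matrix above is the first step.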
Abstract:
Topic recommendation can help users deal with the information overload issue in micro-blogging communities. This paper proposes to use the implicit information network formed by the multiple relationships among users, topics and micro-blogs, together with the temporal information of micro-blogs, to find semantically and temporally relevant topics for each topic, and to profile users' time-drifting topic interests. Content-based, nearest-neighborhood-based and matrix factorization models are used to make personalized recommendations. The effectiveness of the proposed approaches is demonstrated in experiments conducted on a real-world dataset collected from Twitter.com.
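A minimal version of the matrix-factorization component might look like the sketch below. The interaction matrix, rank, and hyperparameters are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy user x topic interest matrix; 0 means "not observed".
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def factorize(R, k=2, steps=2000, lr=0.01, reg=0.02):
    """SGD matrix factorization: learn user factors P and topic factors Q
    so that P @ Q.T approximates R on the observed entries."""
    n, m = R.shape
    P = rng.normal(scale=0.1, size=(n, k))
    Q = rng.normal(scale=0.1, size=(m, k))
    observed = [(i, j) for i in range(n) for j in range(m) if R[i, j] > 0]
    for _ in range(steps):
        for i, j in observed:
            err = R[i, j] - P[i] @ Q[j]
            P[i] += lr * (err * Q[j] - reg * P[i])
            Q[j] += lr * (err * P[i] - reg * Q[j])
    return P, Q

P, Q = factorize(R)
pred = P @ Q.T   # scores for the zero cells rank candidate topics per user
```

The predicted scores for unobserved cells are what drive the personalized topic ranking; time-drifting interests would additionally down-weight old interactions.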
Abstract:
Many older people have difficulties using modern consumer products due to increased product complexity, both in terms of functionality and interface design. Previous research has shown that older people have more difficulty than younger people in using complex devices intuitively. Furthermore, increased life expectancy and a falling birth rate have been catalysts for changes in world demographics over the past two decades. This trend also suggests a proportional increase of older people in the workforce. This realisation has led to research on the effective use of technology by older populations, in an effort to engage them more productively and to assist them in leading independent lives. Ironically, not enough attention has been paid to the development of interaction design strategies that would actually enable older users to better exploit new technologies. Previous research suggests that if products are designed to reflect people's prior knowledge, they will appear intuitive to use. Since intuitive interfaces utilise domain-specific prior knowledge of users, they require minimal learning for effective interaction. However, older people are very diverse in their capabilities and domain-specific prior knowledge. In addition, ageing also slows down the process of acquiring new knowledge. Keeping these suggestions and limitations in view, the aim of this study was to investigate possible approaches to developing interfaces that facilitate intuitive use by older people. In this quest to develop intuitive interfaces for older people, two experiments were conducted that systematically investigated redundancy (the use of both text and icons) in interface design, complexity of interface structure (nested versus flat), and personal user factors such as cognitive abilities, perceived self-efficacy and technology anxiety. All of these factors could interfere with intuitive use.
The results from the first experiment suggest that, contrary to what was hypothesised, older people (65+ years) completed the tasks faster on the text-only interface design than on the redundant interface design. The outcome of the second experiment showed that, as expected, older people took more time on a nested interface. However, they did not make significantly more errors compared with younger age groups. Contrary to what was expected, older age groups also did better under anxious conditions. The findings of this study also suggest that older age groups are more heterogeneous in their capabilities, and that their intuitive use of contemporary technological devices is mediated more by domain-specific prior knowledge of technology and by their cognitive abilities than by chronological age. This makes it extremely difficult to develop product interfaces that are entirely intuitive to use. However, keeping in view the cognitive limitations of older people when interfaces are developed, and using simple text-based interfaces with a flat interface structure, would help them intuitively learn and successfully use complex technological products during early encounters with a product. These findings indicate that it might be more pragmatic if interfaces are designed for intuitive learning rather than for intuitive use. Based on this research and the existing literature, a model for adaptable interface design is proposed as a strategy for developing intuitively learnable product interfaces. An adaptable interface can initially use a simple text-only interface to help older users learn and successfully use the new system. Over time, this can be progressively changed to a symbol-based nested interface for more efficient and intuitive use.
Abstract:
Goethite and Al-substituted goethite were synthesized from the reaction between ferric nitrate and/or aluminum nitrate and potassium hydroxide. XRF, XRD, and TEM with EDS were used to characterize the chemical composition, phase and lattice parameters, and morphology of the synthesized products. The results show that d(020) decreases from 4.953 Å to 4.949 Å and the b dimension decreases from 9.951 Å to 9.906 Å when the aging time increases from 6 days to 42 days for 9.09 mol% Al-substituted goethite. A sample with 9.09 mol% Al substitution was also prepared by a rapid co-precipitation method; in this sample, crystals with 13.45 mol%, 12.31 mol%, and 5.85 mol% Al substitution and corresponding sizes of 163, 131, and 45 nm are observed in the TEM images and EDS. The crystal size of goethite is thus positively related to the degree of Al substitution according to the TEM and EDS results, and this methodology proves effective for distinguishing the morphology of goethite and Al-substituted goethite.
Abstract:
Current diagnostic methods for assessing the severity of articular cartilage degenerative conditions, such as osteoarthritis, are inadequate. There is also a lack of techniques that can be used for real-time evaluation of the tissue during surgery to inform treatment decisions and eliminate subjectivity. This book, derived from Dr Afara's doctoral research, presents a scientific framework based on near infrared (NIR) spectroscopy for facilitating the non-destructive evaluation of articular cartilage health relative to its structural, functional, and mechanical properties. This development is a component of the ongoing research on advanced endoscopic diagnostic techniques in the Articular Cartilage Biomechanics Research Laboratory of Professor Adekunle Oloyede at Queensland University of Technology (QUT), Brisbane, Australia.
Abstract:
The GameFlow model strives to be a general model of player enjoyment, applicable to all game genres and platforms. Derived from a general set of heuristics for creating enjoyable player experiences, the GameFlow model has been widely used in evaluating many types of games, as well as non-game applications. However, we recognize that more specific, low-level, and implementable criteria are potentially more useful for designing and evaluating video games. Consequently, the research reported in this paper aims to provide detailed heuristics for designing and evaluating one specific game genre, real-time strategy games. In order to develop these heuristics, we conducted a grounded theoretical analysis on a set of professional game reviews and structured the resulting heuristics using the GameFlow model. The resulting 165 heuristics for designing and evaluating real-time strategy games are presented and discussed in this paper.
Abstract:
Internet services are an important part of daily activities for most of us. These services come with sophisticated authentication requirements which may not be handled by average Internet users. The management of secure passwords, for example, creates extra overhead which is often neglected for usability reasons. Furthermore, password-based approaches apply only to initial logins and do not protect against unlocked-workstation attacks. In this paper, we provide a non-intrusive identity verification scheme based on behavioural biometrics, where keystroke dynamics on free text is used continuously to verify the identity of a user in real time. We improve existing keystroke-dynamics-based verification schemes in four aspects. First, we improve scalability by using a constant number of users instead of the whole user space to verify the identity of the target user. Second, we provide an adaptive user model which enables our solution to take changes in user behavior into consideration in the verification decision. Third, we identify a new distance measure which enables us to verify the identity of a user with shorter text. Fourth, we decrease the number of false results. Our solution is evaluated on a data set collected from users while they interacted with their mail-boxes during their daily activities.
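One plausible distance measure of the kind the abstract alludes to (the paper's exact measure is not reproduced here) compares a fresh typing sample against a stored per-digraph timing profile; all timing values below are illustrative:

```python
def keystroke_distance(profile, sample):
    """Mean scaled deviation between a stored profile and a fresh sample.
    profile: digraph -> (mean latency ms, std dev); sample: digraph -> latency.
    Only digraphs present in both are compared, which is what lets the
    measure work on short free text; scaling by the stored std dev gives
    more weight to digraphs the user types with stable timing."""
    shared = [d for d in sample if d in profile]
    if not shared:
        return float("inf")      # nothing to compare: cannot verify
    total = 0.0
    for d in shared:
        mean, std = profile[d]
        total += abs(sample[d] - mean) / max(std, 1.0)
    return total / len(shared)

profile = {"th": (100.0, 10.0), "he": (120.0, 20.0)}   # illustrative values
genuine = keystroke_distance(profile, {"th": 110.0, "he": 120.0})
impostor = keystroke_distance(profile, {"th": 160.0, "he": 180.0})
```

Continuous verification then reduces to comparing this distance against a per-user threshold as new text arrives, with the profile's means and deviations updated over time to track behavior drift.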
Abstract:
This paper presents a methodology for real-time estimation of exit movement-specific average travel time on urban routes by integrating real-time cumulative plots, probe vehicles, and historic cumulative plots. Two approaches, component based and extreme based, are discussed for route travel time estimation. The methodology is tested with simulation and is validated with real data from Lucerne, Switzerland, that demonstrate its potential for accurate estimation. Both approaches provide similar results. The component-based approach is more reliable, with a greater chance of obtaining a probe vehicle in each interval, although additional data from each component is required. The extreme-based approach is simple and requires only data from upstream and downstream of the route, but the chances of obtaining a probe that traverses the entire route might be low. The performance of the methodology is also compared with a probe-only method. The proposed methodology requires only a few probes for accurate estimation; the probe-only method requires significantly more probes.
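The core of the cumulative-plot idea can be sketched as follows: under first-in-first-out, the n-th vehicle to cross the upstream detector is the n-th to cross downstream, so per-vehicle travel time is the horizontal gap between the two cumulative curves. The crossing times below are synthetic:

```python
import numpy as np

def route_travel_times(up_times, down_times):
    """Per-vehicle travel time from upstream and downstream cumulative plots.
    up_times[n] / down_times[n]: time at which the cumulative count reaches
    n + 1 at the upstream / downstream detector. Assumes FIFO and equal
    counts (no mid-route sources or sinks); in practice probe vehicles and
    historic plots are what resolve counting drift between detectors."""
    n = min(len(up_times), len(down_times))
    return np.asarray(down_times[:n], float) - np.asarray(up_times[:n], float)

tt = route_travel_times([0, 10, 20], [30, 45, 60])   # seconds, synthetic
```

Averaging these per-vehicle gaps over a departure interval gives the interval's estimated average route travel time.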
Abstract:
This article proposes an approach for real-time monitoring of risks in executable business process models. The approach considers risks in all phases of the business process management lifecycle, from process design, where risks are defined on top of process models, through to process diagnosis, where risks are detected during process execution. The approach has been realized via a distributed, sensor-based architecture. At design time, sensors are defined to specify risk conditions which, when fulfilled, indicate that negative process states (faults) are likely to eventuate. Both historical and current process execution data can be used to compose such conditions. At run time, each sensor independently notifies a sensor manager when a risk is detected. In turn, the sensor manager interacts with the monitoring component of a business process management system to report the results to process administrators, who may take remedial actions. The proposed architecture has been implemented on top of the YAWL system and evaluated through performance measurements and usability tests with students. The results show that risk conditions can be computed efficiently and that the approach is perceived as useful by the participants in the tests.
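A minimal sketch of the sensor idea follows. The class names and the sample risk condition are illustrative, not the YAWL implementation: a risk condition is just a predicate over process execution data, and each sensor checks its own condition independently, notifying a manager on detection:

```python
class SensorManager:
    """Collects risk notifications; a real system would forward them to the
    BPM monitoring component for display to process administrators."""
    def __init__(self):
        self.alerts = []

    def notify(self, sensor_name, case_id):
        self.alerts.append((sensor_name, case_id))

class Sensor:
    """Evaluates one risk condition against a process case."""
    def __init__(self, name, condition, manager):
        self.name, self.condition, self.manager = name, condition, manager

    def check(self, case):
        if self.condition(case):
            self.manager.notify(self.name, case["id"])

manager = SensorManager()
# Illustrative risk condition: a case open longer than 48 hours that still
# has a pending approval task.
overdue = Sensor(
    "overdue-approval",
    lambda c: c["hours_open"] > 48 and "approve" in c["pending_tasks"],
    manager,
)
for case in [
    {"id": 1, "hours_open": 50, "pending_tasks": ["approve"]},
    {"id": 2, "hours_open": 10, "pending_tasks": ["approve"]},
]:
    overdue.check(case)
```

Conditions over historical data fit the same shape: the predicate simply also queries past execution logs.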
Abstract:
Theoretical foundations of higher order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are considered as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can be estimated reliably with known levels at which it is significantly different from zero and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization at different regions of the brain as the neural response develops. The methodology is applied to analyze evoked EEG responses to flash visual stimuli presented to the left and right eye separately. The EEG electrode array is segmented based on bicoherence evolution with time using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha band frequency of 8 Hz as well as 4 and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG, which is less invasive and less expensive than magnetic resonance imaging.
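The ensemble bicoherence estimator at the heart of the method can be sketched as below. The synthetic stimulus-locked ensemble is illustrative (the multi-channel segmentation step is not reproduced): each realization contains two components whose phases sum at the third bin, i.e. quadratic phase coupling:

```python
import numpy as np

def bicoherence(segments, f1, f2):
    """Ensemble bicoherence at FFT-bin pair (f1, f2): magnitude of the
    averaged triple product X(f1) X(f2) X*(f1+f2), normalized to [0, 1].
    Values near 1 indicate consistent phase coupling across realizations."""
    X = np.fft.fft(segments, axis=1)
    triple = X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2])
    num = abs(triple.mean())
    den = np.sqrt((abs(X[:, f1] * X[:, f2]) ** 2).mean()
                  * (abs(X[:, f1 + f2]) ** 2).mean())
    return num / den

# Synthetic ensemble: components at bins 5 and 8 with random phases, plus a
# component at bin 13 whose phase is their sum (coupled), plus mild noise.
rng = np.random.default_rng(1)
N, reps, f1, f2 = 64, 50, 5, 8
t = np.arange(N)
segs = []
for _ in range(reps):
    p1, p2 = rng.uniform(0, 2 * np.pi, 2)
    segs.append(np.cos(2 * np.pi * f1 * t / N + p1)
                + np.cos(2 * np.pi * f2 * t / N + p2)
                + np.cos(2 * np.pi * (f1 + f2) * t / N + p1 + p2)
                + 0.1 * rng.standard_normal(N))
b = bicoherence(np.array(segs), f1, f2)
```

Tracking this estimate over windows at increasing offsets from the stimulus gives the time-varying bicoherence the abstract describes.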
Abstract:
An optical system which performs the multiplication of binary numbers is described and proof-of-principle experiments are performed. The simultaneous generation of all partial products, optical regrouping of bit products, and optical carry look-ahead addition are novel features of the proposed scheme which takes advantage of the parallel operations capability of optical computers. The proposed processor uses liquid crystal light valves (LCLVs). By space-sharing the LCLVs one such system could function as an array of multipliers. Together with the optical carry look-ahead adders described, this would constitute an optical matrix-vector multiplier.
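A digital analogue may help to see the structure of the scheme: all partial products are formed independently (one AND per bit pair, which is what the optical system generates simultaneously), and the carries of each addition come from the generate/propagate recurrence of a carry look-ahead adder. The bit widths below are arbitrary, and the loop is only a software stand-in for what the optics evaluates in parallel:

```python
def cla_add(a, b, width=16):
    """Addition via carry look-ahead terms: generate g = a AND b,
    propagate p = a XOR b, carries from c[i+1] = g[i] OR (p[i] AND c[i]).
    Hardware evaluates this recurrence in parallel; the loop here
    just unrolls it for clarity. Result is masked to `width` bits."""
    g, p = a & b, a ^ b
    carries, c = 0, 0
    for i in range(width):
        carries |= c << i
        c = ((g >> i) | ((p >> i) & c)) & 1
    return (p ^ carries) & ((1 << width) - 1)   # sum bit: p[i] XOR c[i]

def multiply(a, b, nbits=8):
    """Binary multiplication: generate all partial products first (shifted
    ANDs), then sum them with the carry look-ahead adder."""
    partials = [(a << i) if (b >> i) & 1 else 0 for i in range(nbits)]
    result = 0
    for pp in partials:
        result = cla_add(result, pp, width=2 * nbits)
    return result
```

Space-sharing the light valves, as the abstract notes, amounts to running many independent copies of `multiply` side by side, which is the matrix-vector case.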