889 results for Man-Machine Perceptual Performance.
Abstract:
An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by user profiles. Learning to use user profiles effectively is one of the most challenging tasks in developing an IF system: when the document selection criteria are better aligned with users' needs, filtering large streams of information becomes more efficient and effective. Term-based approaches to learning user profiles have been widely used in the IF community because of their simplicity and directness, and they are relatively well established. However, they struggle with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches also have to deal with low-frequency patterns. The measures used by the data mining techniques to learn the profile (for example, "support" and "confidence") have turned out to be unsuitable for filtering and can lead to a mismatch problem. This thesis combines rough set-based (term-based) reasoning and pattern mining in a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: topic filtering and pattern mining. The topic filtering stage minimizes information overload by filtering out the most likely irrelevant information based on the user profiles; a novel user-profile learning method and a theoretical model of threshold setting have been developed using rough set decision theory. The second stage (pattern mining) is precision-oriented and aims to solve the information mismatch problem: a new document-ranking function derived from the patterns in the pattern taxonomy assigns higher scores to the most likely relevant documents. Because relatively few documents remain after the first stage, the computational cost is markedly reduced, and at the same time pattern discovery yields more accurate results, so the overall performance of the system improves significantly. The new two-stage information filtering model has been evaluated by extensive experiments based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms the other IF systems, including the traditional Rocchio IF model, state-of-the-art term-based models such as BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
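As a minimal illustration of the two-stage idea described above, the sketch below applies a term-overlap topic filter and then re-ranks the survivors by weighted pattern matches. The threshold, scoring functions and example data are invented for illustration; they are not the thesis's rough set profile learning or pattern taxonomy ranking.

    # Illustrative two-stage filtering sketch (hypothetical threshold and toy
    # scoring functions; the actual rough set and PTM methods are far richer).
    from collections import Counter

    def topic_filter(documents, profile_terms, threshold):
        """Stage 1: drop documents whose term overlap with the user profile
        falls below the threshold (reduces information overload)."""
        kept = []
        for doc in documents:
            tokens = Counter(doc.lower().split())
            score = sum(tokens[t] for t in profile_terms)
            if score >= threshold:
                kept.append(doc)
        return kept

    def pattern_rank(documents, patterns):
        """Stage 2: precision-oriented re-ranking of the surviving documents
        by weighted pattern matches, highest score first."""
        def score(doc):
            text = doc.lower()
            return sum(weight for p, weight in patterns.items() if p in text)
        return sorted(documents, key=score, reverse=True)

    docs = [
        "stock markets rallied on strong earnings reports",
        "the cup final drew a record television audience",
        "earnings guidance lifted technology stock prices",
    ]
    profile = {"stock", "earnings", "markets"}
    candidates = topic_filter(docs, profile, threshold=2)
    for d in pattern_rank(candidates, {"earnings reports": 2.0, "stock prices": 1.5}):
        print(d)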
Abstract:
Censorship and Performance, edited by Tom Sellar, examines the politics of censorship, and continuing contests over the 'right' to claim theatrical and cultural stages for controversial forms of social and self-representation, at the start of the twenty-first century. In bringing this collection together, Sellar has taken a broad-based approach to the concept of censorship in theatrical performance and, indeed, to the concept of theatrical performance itself. Sellar and his contributors clearly accept that surveillance, suppression and restriction of specific forms of representation is a complex, culturally specific phenomenon. In this sense, Censorship and Performance addresses direct political control over content, as well as thornier arguments about media controversy, moral panic, and the politics of self-censorship amongst artists and arts organisations.
Abstract:
This study investigated the effects of visual status, driver age and the presence of secondary distracter tasks on driving performance. Twenty young (M = 26.8 years) and 19 older (M = 70.2 years) participants drove around a closed-road circuit under three visual conditions (normal, simulated cataracts, blur) and three distracter conditions (none, visual, auditory). Simulated visual impairment, increased driver age and the presence of a distracter task detrimentally affected all measures of driving performance except gap judgments and lane keeping. Significant interaction effects were evident between visual status, age and distracters: simulated cataracts had the most negative impact on performance in the presence of visual distracters, and a more negative impact for older drivers. The implications of these findings for driving behaviour and the acquisition of driving-related information by people with common visual impairments are discussed.
Abstract:
Older drivers represent the fastest growing segment of the road user population. Cognitive and physiological capabilities diminish with age, so the design of future in-vehicle interfaces has to take into account older drivers' needs and capabilities. Older drivers have different capabilities which affect their driving patterns and, subsequently, road crash patterns. New in-vehicle technology could improve safety and comfort and maintain elderly people's mobility for longer. Existing research has focused on the ergonomic and Human Machine Interface (HMI) aspects of in-vehicle technology to assist the elderly. However, there is a lack of comprehensive research identifying the most relevant technology and associated functionalities that could improve older drivers' road safety. To identify future research priorities for older drivers, this paper presents: (i) a review of age-related functional impairments, (ii) a brief description of some key characteristics of older driver crashes and (iii) a conceptualisation of the most relevant technology interventions based on traffic psychology theory and crash data.
Seismic performance of brick infilled RC frame structures in low and medium rise buildings in Bhutan
Abstract:
The construction of reinforced concrete buildings with unreinforced infill is common practice even in a seismically active country such as Bhutan, which is located in the high-seismicity region of the Eastern Himalaya. All buildings constructed prior to 1998 were built without seismic provisions, while those constructed after this period adopted the seismic codes of the neighbouring country, India. However, these codes contain limited information on the design of infilled structures, and differences in architectural requirements may compound the structural problems. Although the influence of infill on reinforced concrete framed structures is known, the present seismic codes do not account for it owing to the lack of sufficient information. Time history analyses were performed to study the influence of infill on the performance of concrete framed structures. Important parameters were considered and the results are presented in a manner that can be used by practitioners. The results show that the influence of infill on structural performance is significant: structural responses such as the fundamental period, roof displacement, inter-storey drift ratio, stresses in the infill walls, and the member forces of beams and columns generally reduce with the incorporation of infill walls. Structures designed and constructed with or without seismic provisions perform in a similar manner if high-strength infills are used.
Abstract:
It is often postulated that an increased hip-to-shoulder differential angle (the 'X-Factor') during the early downswing better utilises the stretch-shorten cycle and improves golf performance. The current study aims to examine the potential relationship between the X-Factor and performance during the tee-shot. Seven golfers with handicaps between 0 and 10 strokes comprised the low-handicap group, whilst the high-handicap group consisted of eight golfers with handicaps between 11 and 20 strokes. The golfers performed 20 drives, and three-dimensional kinematic data were used to quantify hip and shoulder rotation and the resulting X-Factor. Compared with the low-handicap group, the high-handicap golfers tended to demonstrate greater hip rotation at the top of the backswing and recorded reduced maximum X-Factor values. The inconsistencies evident in the literature may suggest that a universal method of measuring rotational angles during the golf swing would be beneficial for future studies, particularly when considering potential injury.
Abstract:
The ability to forecast machinery failure is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models for forecasting machinery health based on condition data. Although these models have aided the advancement of the discipline, they have made only a limited contribution to developing an effective machinery health prognostic system. The literature review indicates that there is not yet a prognostic model that directly models and fully utilises suspended condition histories (which are very common in practice since organisations rarely allow their assets to run to failure); that effectively integrates population characteristics into prognostics for longer-range prediction in a probabilistic sense; that deduces the non-linear relationship between measured condition data and actual asset health; and that involves minimal assumptions and requirements. This work presents a novel approach to addressing these challenges. The proposed model consists of a feed-forward neural network whose training targets are asset survival probabilities estimated using a variation of the Kaplan-Meier estimator and a degradation-based failure probability density estimator. The adapted Kaplan-Meier estimator models the actual survival status of individual failed units and estimates the survival probability of individual suspended units. The degradation-based failure probability density estimator, on the other hand, extracts population characteristics and computes conditional reliability from available condition histories rather than from reliability data. The estimated survival probability and the relevant condition histories are presented as the "training target" and "training input", respectively, to the neural network. The trained network can estimate the future survival curve of a unit when a series of condition indices is input. Although the proposed concept may be applied to the prognosis of various machine components, rolling element bearings were chosen as the research object because rolling element bearing failure is one of the foremost causes of machinery breakdowns. Computer-simulated and industry case study data were used to compare the prognostic performance of the proposed model with four control models, namely: two feed-forward neural networks with the same training function and structure as the proposed model but neglecting suspended histories; a time series prediction recurrent neural network; and a traditional Weibull distribution model. The results support the assertion that the proposed model performs better than the other four models and that it produces adaptive prediction outputs with a useful representation of survival probabilities. This work presents a compelling concept for non-parametric, data-driven prognosis and for utilising available asset condition information more fully and accurately. It demonstrates that machinery health can indeed be forecasted. The proposed prognostic technique, together with ongoing advances in sensors and data-fusion techniques, and increasingly comprehensive databases of asset condition data, holds the promise of increased asset availability, maintenance cost effectiveness, operational safety and, ultimately, organisational competitiveness.
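A minimal sketch of the central idea, under strong simplifying assumptions: a Kaplan-Meier style estimator turns failed and suspended (censored) lifetimes into survival probabilities, which then serve as training targets for a small feed-forward network fed with condition indices. The data, network size and the use of numpy/scikit-learn are placeholders, not the thesis's actual estimators or case-study data.

    # Survival-probability targets from failed and suspended histories,
    # then a feed-forward network mapping condition indices to survival.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def kaplan_meier(times, failed):
        """Return (event_times, survival); failed=False marks a suspended
        (censored) history, which reduces the risk set but not survival."""
        order = np.argsort(times)
        times = np.asarray(times)[order]
        failed = np.asarray(failed)[order]
        n_at_risk, surv = len(times), 1.0
        event_t, event_s = [], []
        for t, f in zip(times, failed):
            if f:
                surv *= 1.0 - 1.0 / n_at_risk
                event_t.append(t)
                event_s.append(surv)
            n_at_risk -= 1
        return np.array(event_t), np.array(event_s)

    # Hypothetical bearing lifetimes (hours) and failure/suspension flags.
    lifetimes = [120, 150, 180, 200, 240, 260, 300, 320]
    failures = [True, False, True, True, False, True, False, True]
    event_t, surv = kaplan_meier(lifetimes, failures)

    # Hypothetical condition indices (e.g. vibration RMS) at the event times
    # are the network inputs; the survival probabilities are the targets.
    condition = np.linspace(0.2, 1.0, len(event_t)).reshape(-1, 1)
    net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    net.fit(condition, surv)
    print(net.predict(np.array([[0.4], [0.9]])))  # predicted survival probabilities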
Abstract:
Most tropical fruit flies only lay into mature fruit, but a small number can also oviposit into unripe fruit. Little is known about the link between adult oviposition preference and offspring performance in such situations. In this study we examine the influence of different ripening stages of two mango Mangifera indica L. (Anacardiaceae) varieties on the preference and performance of the Oriental fruit fly, Bactrocera dorsalis (Hendel) (Diptera: Tephritidae), a fly known to be able to develop in unripe fruit. Work was carried out as a series of laboratory-based choice and no-choice oviposition experiments and larval growth trials. In oviposition choice trials, female B. dorsalis demonstrated a preference for ripe fruit of mango variety Namdorkmai over variety Oakrong, but the variable most influencing oviposition results was generally fruit ripening stage. Ripe and fully-ripe mangoes were most preferred for oviposition by B. dorsalis. In contrast, unripe mango was infrequently used by ovipositing females, particularly in choice trials. Consistent with the oviposition preference results, ripe and fully-ripe mangoes were also best for offspring survival, with a higher percentage of larval survival to pupation and shorter development times in comparison to unripe mango. Changes in Total Soluble Solids (TSS) and skin toughness correlate with changing host use across the ripening stages. Regardless of the mango variety or ripeness stage, B. dorsalis had difficulty penetrating the pericarp of our experimental fruit, and larval survival was also often poor. We discuss the possibility that there may be differences in the ability of laboratory and wild flies to penetrate fruit for oviposition, or that in the field flies more regularly utilize natural fruit wounds as oviposition sites.
Abstract:
In this paper, the train scheduling problem is modelled as a blocking parallel-machine job shop scheduling (BPMJSS) problem. In the model, trains, single-track sections and multiple-track sections are synonymous with jobs, single machines and parallel machines, respectively, and an operation is regarded as the movement/traversal of a train across a section. Owing to the lack of buffer space, the real-life case must consider blocking or hold-while-wait constraints, which means that a track section cannot release a train, and must hold it, until the next section on its routing becomes available. Based on a literature review and our analysis, it is very hard to find a feasible complete schedule directly for BPMJSS problems. Therefore, a parallel-machine job-shop-scheduling (PMJSS) problem is first solved by an improved shifting bottleneck procedure (SBP) algorithm without considering the blocking conditions. Inspired by the proposed SBP algorithm, a feasibility satisfaction procedure (FSP) algorithm is then developed to solve and analyse the BPMJSS problem, using an alternative graph model that extends the classical disjunctive graph models. The proposed algorithms have been implemented and validated using real-world data from Queensland Rail. Sensitivity analysis has been applied by considering train length, upgrading track sections, increasing train speed and changing bottleneck sections. The outcomes show that the proposed methodology would be a very useful tool for real-life train scheduling problems.
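To make the blocking (hold-while-wait) constraint concrete, the toy simulation below advances trains one time step at a time: a train keeps occupying its current section until the next section on its route has spare capacity, and a multiple-track section is simply a section with capacity greater than one. The routes, running times and capacities are invented, and the step-by-step simulation only illustrates the constraint; it is not the SBP or FSP algorithm.

    # Toy illustration of blocking/hold-while-wait in a small rail network.
    def simulate(routes, run_time, capacity):
        """Advance trains one time step at a time under blocking constraints
        and return each train's completion time."""
        position = {tr: 0 for tr in routes}                  # index into the route
        remaining = {tr: run_time[(tr, routes[tr][0])] for tr in routes}
        occupancy = {sec: 0 for sec in capacity}
        for tr in routes:
            occupancy[routes[tr][0]] += 1
        done, t = {}, 0
        while len(done) < len(routes):
            t += 1
            for tr in routes:
                if tr in done:
                    continue
                remaining[tr] -= 1
                if remaining[tr] > 0:
                    continue
                i, route = position[tr], routes[tr]
                if i + 1 == len(route):                      # route finished
                    occupancy[route[i]] -= 1
                    done[tr] = t
                elif occupancy[route[i + 1]] < capacity[route[i + 1]]:
                    occupancy[route[i]] -= 1                 # release only now
                    occupancy[route[i + 1]] += 1
                    position[tr] = i + 1
                    remaining[tr] = run_time[(tr, route[i + 1])]
                else:
                    remaining[tr] = 1                        # blocked: hold and retry
        return done

    routes = {"T1": ["A", "B", "C"], "T2": ["B", "C"]}
    run_time = {("T1", "A"): 2, ("T1", "B"): 3, ("T1", "C"): 2,
                ("T2", "B"): 1, ("T2", "C"): 4}
    capacity = {"A": 1, "B": 1, "C": 2}                      # C is double track
    print(simulate(routes, run_time, capacity))              # completion times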
Abstract:
The growth of the Penaeus monodon prawn aquaculture industry in Australia is hampered by a reliance on wild-caught broodstock, as this species has proven difficult to breed from if broodstock are reared in captivity. Studies were therefore carried out to investigate the factors controlling reproduction and influencing egg quality. The studies revealed that patterns of nutrient accumulation during early ovary development are altered by captive conditions, possibly contributing to reduced larval quality. The sinus gland hormones were shown, together with the environment, to regulate two stages of ovary development. A separate study further revealed that the hormone methyl farnesoate (MF) can negatively regulate the final stages of ovary development. Lastly, it was shown that broodstock reared in captivity are less likely to mate, and that this is due to inherent problems in both the male and the female prawns.
Abstract:
Using the generative processes developed over two stages of creative development and the performance of The Physics Project at the Loft at the Creative Industries Precinct at the Queensland University of Technology (QUT) from 5th – 8th April 2006 as a case study, this exegesis considers how the principles of contemporary physics can be reframed as aesthetic principles in the creation of contemporary performance. The Physics Project is an original performance work that melds live performance, video and web-casting and overlaps an exploration of personal identity with the physics of space, time, light and complementarity. It considers the acts of translation between the language of physics and the language of contemporary performance that occur via process and form. This exegesis also examines the devices in contemporary performance making and contemporary performance that extend the reach of the performance, including the integration of the live and the mediated and the use of metanarratives.
Abstract:
Network Jamming systems provide real-time collaborative performance experiences for novice or inexperienced users. In this paper we outline the interaction design considerations that have emerged through evolutionary development cycles of the jam2jam Network Jamming software, which employs generative techniques that require particular attention to the human-computer relationship. In particular, we describe the co-evolution of features and uses, explore the role of agile development methods in supporting this evolution, and show how the provision of a clear core capability can be matched with options for enhanced features to support a multi-levelled user experience and skill development.
Abstract:
Aims: This study investigated the effect of simulated visual impairment on the speed and accuracy of performance on a series of commonly used cognitive tests. Methods: Cognitive performance was assessed for 30 young, visually normal subjects (M = 22.0 ± 3.1 years) using the Digit Symbol Substitution Test (DSST), Trail Making Test (TMT) A and B, and the Stroop Colour Word Test under three visual conditions: normal vision and two levels of visually degrading filters (Vistech™) administered in a random order. Distance visual acuity and contrast sensitivity were also assessed for each filter condition. Results: The visual filters, which degraded contrast sensitivity to a greater extent than visual acuity, significantly increased the time to complete (p < 0.05), but not the number of errors made on, the DSST and the TMT A and B, and affected only some components of the Stroop test. Conclusions: Reduced contrast sensitivity had a marked effect on the speed but not the accuracy of performance on commonly used cognitive tests, even in young individuals; the implications of these findings are discussed.
Abstract:
In this paper, we discuss our participation in the INEX 2008 Link-the-Wiki track. We utilized a sliding window based algorithm to extract frequent terms and phrases. Using the extracted phrases and terms as descriptive vectors, the anchors and relevant links (both incoming and outgoing) are recognized efficiently.
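A generic sliding-window extraction of frequent terms and phrases might look like the sketch below: slide a window of one to max_len tokens over the text and keep the n-grams that occur at least min_freq times. The parameters and example text are invented, and this is not the submission's exact algorithm.

    # Sliding-window frequent term/phrase extraction (generic sketch).
    from collections import Counter

    def frequent_phrases(text, max_len=3, min_freq=2):
        tokens = text.lower().split()
        counts = Counter()
        for n in range(1, max_len + 1):
            for i in range(len(tokens) - n + 1):   # slide a window of n tokens
                counts[" ".join(tokens[i:i + n])] += 1
        return {phrase: c for phrase, c in counts.items() if c >= min_freq}

    sample = ("link the wiki track evaluates link discovery and "
              "link discovery finds anchors and link targets in the wiki")
    print(frequent_phrases(sample))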
Abstract:
One of the classic forms of intermediate representation used for communication between compiler front-ends and back-ends is that based on abstract stack machines. It is possible to compile the stack machine instructions into machine code by means of an interpretive code generator, or to simulate the stack machine at runtime using an interpreter. This paper describes an approach intermediate between these two extremes. The front-end of a commercial Modula-2 compiler was ported to the "industry standard PC", and a partially compiling back-end was written. The object code runs with the assistance of an interpreter, but may be linked with libraries which are fully compiled. The intent was to provide a programming environment on the PC which is identical to that of the same compilers on 32-bit UNIX machines. This objective has been met, and the compiler is available to educational institutions as freeware. The design basis of the new compiler is described, and its performance critically evaluated.
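The "simulate the stack machine at runtime" end of the spectrum can be illustrated with a toy interpreter like the one below. The opcodes and the example program are invented for illustration and do not correspond to the compiler's actual intermediate code.

    # Toy abstract stack machine interpreter.
    def run(program):
        stack, pc = [], 0
        while pc < len(program):
            op, *args = program[pc]
            pc += 1
            if op == "PUSH":
                stack.append(args[0])
            elif op == "ADD":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "MUL":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            elif op == "PRINT":
                print(stack.pop())
            elif op == "HALT":
                break
        return stack

    # Computes and prints (2 + 3) * 4.
    run([("PUSH", 2), ("PUSH", 3), ("ADD",),
         ("PUSH", 4), ("MUL",), ("PRINT",), ("HALT",)])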