979 results for Conveying machinery
Abstract:
Emergence has the potential to effect complex, creative or open-ended interactions and novel game-play. We report on research into an emergent interactive system, investigating emergent user behaviors and experience through the creation and evaluation of an interactive system: +-NOW, an augmented reality, tangible, interactive art system. The paper briefly describes the qualities of emergence and +-NOW before focusing on its evaluation, a qualitative study with 30 participants conducted in context. Data analysis followed Grounded Theory methods. Coding schemes, induced from the data and external literature, are presented. Findings show that emergence occurred for over half of the participants; the nature of these emergent behaviors is discussed along with examples from the data. Other findings indicate that participants found interaction with the work satisfactory. Design strategies for facilitating a satisfactory experience despite the often unpredictable character of emergence are briefly reviewed, and potential application areas for emergence are discussed.
Abstract:
This paper reports outcomes of a pilot study to develop a conceptual framework that allows people to retrofit a building layer and gain better control of their own built environments. The study was initiated by the realisation that discussions surrounding the improvement of building performance tend to focus on top-down technological solutions rather than on helping and encouraging bottom-up involvement of building users. While users are the ultimate beneficiaries and their feedback is always appreciated, their direct involvement in managing buildings is often regarded as obstruction or distraction, largely because casual interventions by uninformed building users tend to disrupt the system. Earlier research has shown, however, that direct and active participation of users can improve building performance if appropriate training and/or systems are introduced. We also speculate that in the long run this would make the built environment more sustainable. With this in mind, we looked for opportunities to retrofit our own office with an interactive layer to study how we could introduce ad-hoc systems for building users. The aim of this paper is to describe our vision and initial attempts, followed by discussion.
Abstract:
The act of computer programming is generally considered to be temporally removed from a computer program’s execution. In this paper we discuss the idea of programming as an activity that takes place within the temporal bounds of a real-time computational process and its interactions with the physical world. We ground these ideas within the context of livecoding – a live audiovisual performance practice. We then describe how the development of the programming environment “Impromptu” has addressed our ideas of programming with time and the notion of the programmer as an agent in a cyber-physical system.
Abstract:
The convergence of locative and social media with collaborative interfaces and data visualisation has expanded the potential of online information provision. Offering new ways for communities to share contextually specific information, it presents the opportunity to expand social media’s current focus on micro self-publishing with applications that support communities to actively address areas of local need. This paper details the design and development of a prototype application that illustrates this potential. Entitled PetSearch, it was designed in collaboration with the Animal Welfare League of Queensland to support communities to map and locate lost, found and injured pets, and to build community engagement in animal welfare issues. We argue that, while established approaches to social and locative media provide a useful foundation for designing applications to harness social capital, they must be re-envisaged if they are to effectively facilitate community collaboration. We conclude by arguing that the principles of user engagement and co-operation employed in this project can be extrapolated to other online approaches that aim to facilitate co-operative problem solving for social benefit.
Abstract:
Due to the explosive growth of the Web, the domain of Web personalization has gained great momentum in both research and commercial areas. Recommender systems are among the most popular Web personalization systems, and choosing the information used to profile users is crucial to their success. In Web 2.0, user tagging systems help users organize Web resources of interest to them. Exploring user tagging behavior is a promising way to understand users' information needs, since tags are given directly by users. However, the free and relatively uncontrolled vocabulary leaves user self-defined tags without standardization and semantically ambiguous. The rich relationships among tags also need to be explored, as they provide valuable information for better understanding users. In this paper, we propose a novel approach for learning a tag ontology based on the widely used lexical database WordNet, capturing the semantics and structural relationships of tags. We present personalization strategies that disambiguate the semantics of tags by combining the opinions of WordNet lexicographers with users' tagging behavior. To personalize further, users are clustered to generate a more accurate ontology for a particular group of users. To evaluate the usefulness of the tag ontology, we use it in a pilot tag recommendation experiment, exploiting its semantic information to improve recommendation performance. Initial results show that the personalized information improves the accuracy of tag recommendation.
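A minimal sketch of the kind of tag-sense disambiguation the abstract describes, assuming hypothetical glosses standing in for WordNet synset definitions and a pre-computed tag vocabulary for a user cluster (none of the names or data below come from the paper):

```python
# Illustrative sketch, not the paper's implementation: pick the sense of a tag
# whose gloss best overlaps the tag vocabulary of the user's cluster.
# The glosses are hypothetical stand-ins for real WordNet synset definitions.

def disambiguate(tag, candidate_senses, cluster_tags):
    """Return the candidate sense whose gloss shares the most words
    with the cluster's tag vocabulary."""
    def overlap(gloss):
        return len(set(gloss.lower().split()) & set(cluster_tags))
    return max(candidate_senses, key=lambda sense: overlap(sense["gloss"]))

senses = [
    {"synset": "apple.n.01", "gloss": "fruit with red or yellow skin"},
    {"synset": "apple.n.02", "gloss": "computer company technology hardware"},
]
cluster = {"technology", "mac", "computer", "hardware"}

best = disambiguate("apple", senses, cluster)
print(best["synset"])  # apple.n.02 -- the cluster's tags overlap the tech gloss
```

In practice the glosses and lexicographer files would come from WordNet itself (e.g. via NLTK), and the cluster vocabulary from the user-clustering step the abstract mentions.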
Abstract:
The ability to forecast machinery health is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models which attempt to forecast machinery health based on condition data such as vibration measurements. This paper demonstrates how the population characteristics and condition monitoring data (both complete and suspended) of historical items can be integrated for training an intelligent agent to predict asset health multiple steps ahead. The model consists of a feed-forward neural network whose training targets are asset survival probabilities estimated using a variation of the Kaplan–Meier estimator and a degradation-based failure probability density function estimator. The trained network is capable of estimating the future survival probabilities when a series of asset condition readings are inputted. The output survival probabilities collectively form an estimated survival curve. Pump data from a pulp and paper mill were used for model validation and comparison. The results indicate that the proposed model can predict more accurately as well as further ahead than similar models which neglect population characteristics and suspended data. This work presents a compelling concept for longer-range fault prognosis utilising available information more fully and accurately.
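The Kaplan–Meier estimator mentioned above can be sketched in a few lines. The paper uses a variation of it (combined with a degradation-based failure probability density estimator) to produce the network's training targets; that variation is not reproduced here. This is the standard estimator, handling both failures and suspended (right-censored) items:

```python
# Standard Kaplan-Meier survival estimator. Suspended items shrink the
# risk set without contributing a failure, which is how censored history
# is folded into the survival-probability targets.

def kaplan_meier(times, failed):
    """Return [(t, S(t))] at each failure time.
    times  -- observation time of each item
    failed -- True if the item failed, False if it was suspended
    """
    items = sorted(zip(times, failed))
    n_at_risk = len(items)
    survival, curve, i = 1.0, [], 0
    while i < len(items):
        t = items[i][0]
        deaths = sum(1 for tt, f in items if tt == t and f)
        leaving = sum(1 for tt, _ in items if tt == t)
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= leaving   # failures and suspensions both leave the risk set
        i += leaving
    return curve

# Four items: failures at t=2, 3, 5 and one suspension at t=3.
curve = kaplan_meier([2, 3, 3, 5], [True, True, False, True])
print(curve)  # survival steps down to 0.75, then ~0.5, then 0.0
```

Note the effect of the suspension: the item censored at t=3 reduces the risk set for the final failure, so the curve drops to 0.5 rather than 0.25 at t=3.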
Abstract:
The ability to accurately predict the remaining useful life of machine components is critical for continuous machine operation, and can also improve productivity and enhance system safety. In condition-based maintenance (CBM), maintenance is performed based on information collected through condition monitoring and an assessment of machine health. Effective diagnostics and prognostics are important aspects of CBM, allowing maintenance engineers to schedule repairs and to acquire replacement components before the components actually fail. All machine components are subject to degradation processes in real environments, and they have certain failure characteristics which can be related to the operating conditions. This paper describes a technique for accurate assessment of the remnant life of machines based on health state probability estimation, involving historical knowledge embedded in closed-loop diagnostic and prognostic systems. The technique uses a Support Vector Machine (SVM) classifier as a tool for estimating the health state probability of machine degradation, which directly affects the accuracy of prediction. To validate the feasibility of the proposed model, real-life historical data from bearings of High Pressure Liquefied Natural Gas (HP-LNG) pumps were analysed and used to obtain an optimal prediction of remaining useful life. The results obtained were very encouraging and showed that the proposed prognostic system based on health state probability estimation has the potential to be used as an estimation tool for remnant life prediction in industrial machinery.
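The abstract does not give the exact formula linking health-state probabilities to remaining useful life, but a common approach is a probability-weighted expectation over the discrete degradation states. A minimal sketch under that assumption (state names and RUL values are hypothetical, and the SVM itself is elided; any probabilistic classifier producing the state distribution would do):

```python
# Hedged sketch: turn a classifier's health-state probability distribution
# into an expected remaining-useful-life estimate via a weighted sum.
# Per-state RUL values below are hypothetical, in operating hours.

def expected_rul(state_probs, state_ruls):
    """Expected remaining useful life: sum over states of P(state) * RUL(state)."""
    return sum(p * r for p, r in zip(state_probs, state_ruls))

# Four degradation states: normal, early wear, severe wear, near-failure.
probs = [0.10, 0.60, 0.25, 0.05]   # classifier output, sums to 1.0
ruls  = [5000, 2000, 500, 50]      # hypothetical mean RUL per state (hours)

print(expected_rul(probs, ruls))   # 0.1*5000 + 0.6*2000 + 0.25*500 + 0.05*50
```

As the probability mass shifts toward the later states over time, the expected RUL falls, which is the behaviour a prognostic scheduler would act on.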
Abstract:
The use of material artefacts within the design process is a long-standing and continuing characteristic of interaction design. Established methods, such as prototyping, which have been widely adopted by educators and practitioners, are seeing renewed research interest and being reconsidered in light of the evolving needs of the field. Alongside this, the past decade has seen the introduction and adoption of a diverse range of novel design methods into interaction design, such as cultural probes, technology probes, context mapping, and provotypes.
Abstract:
We investigated the collaboration of ten doctor-nurse pairs with a prototype digital telehealth stethoscope. Doctors could see and hear the patient but could not touch them or the stethoscope; the nurse in each pair controlled the stethoscope. For ethical reasons, an experimenter stood in for a patient. Each of the ten interactions was video recorded and analysed to understand the interaction and collaboration between the doctor and nurse. The video recordings were coded and transformed into maps of interaction that were analysed for patterns of activity. The analysis showed that as doctors and nurses became more experienced with the telehealth stethoscope, their collaboration became more effective. The main measure of effectiveness was the number of corrections in stethoscope placement required by the doctor: in early collaborations, the doctors gave many corrections, but after several trials every pair had substantially reduced them. The significance of this research is the identification of the qualities of effective collaboration in the use of the telehealth stethoscope and telehealth systems more generally.
Abstract:
With increasing demands on our time, everyday behaviors such as food purchasing, preparation, and consumption have become habitual and unconscious. Indeed, modern food values are focused on convenience and effortlessness, overshadowing other values such as environmental sustainability, health, and pleasure. The rethinking of how we approach everyday food behaviors appears to be a particularly timely concern. In this special section, we explore work carried out and discussed during the recent workshop “Food for Thought: Designing for Critical Reflection on Food Practices,” at the 2012 Designing Interactive Systems Conference in Newcastle upon Tyne, U.K.
Abstract:
The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operation downtime, and safety hazards. Predicting the survival time and the probability of failure at future times is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset: condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative approach to traditional reliability analysis is to model condition indicators, operating environment indicators, and their failure-generating mechanisms using a covariate-based hazard model. The literature review indicates that a number of covariate-based hazard models have been developed, all of them based on the theory of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have to some extent been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models fail to fully utilise the three types of asset health information (failure event data, i.e. observed and/or suspended; condition data; and operating environment data) in one model for more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response variables (dependent variables), whereas operating environment indicators act as explanatory variables (independent variables). However, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related and yet more imperative question is how both of these indicators should be effectively modelled and integrated into a covariate-based hazard model.
This work presents a new approach to addressing the aforementioned challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of available asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and condition measurements as well as operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and condition indicators. Condition indicators provide information about the health condition of an asset; therefore they update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Examples of condition indicators include the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component.
Operating environment indicators in this model are failure accelerators and/or decelerators; they are included in the covariate function of EHM and may increase or decrease the hazard relative to the baseline. These indicators arise from the environment in which an asset operates and have not been explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators may be nought in EHM, those of condition indicators are always present, because condition indicators are observed and measured for as long as an asset operates and survives. EHM has several advantages over the existing covariate-based hazard models. One is that the model utilises three different sources of asset health data (i.e. population characteristics, condition indicators, and operating environment indicators) to predict hazard and reliability effectively. Another is that EHM explicitly investigates the relationship between the condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. Depending on the sample size of failure/suspension times, EHM is extended into two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) as the form of the baseline hazard. However, in many industrial applications, failure event data are sparse and their analysis often involves complex distributional shapes about which little is known. Therefore, to avoid the restrictive assumption of a specified lifetime distribution for failure event histories, the non-parametric EHM, a distribution-free model, has been developed. The development of EHM in these two forms is another merit of the model.
A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models was appraised by comparing their estimates with those of the existing covariate-based hazard models. The comparison demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, regarding the new parameter estimation method in the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
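The Proportional Hazard Model that the abstract reviews as the basis of the existing covariate-based models, with the Weibull baseline mentioned for the semi-parametric case, can be sketched as follows. This is the standard PHM only; EHM itself, where the baseline hazard additionally depends on condition indicators, is not reproduced, and the parameter values below are hypothetical:

```python
import math

# Sketch of the standard PHM: h(t | z) = h0(t) * exp(sum_i gamma_i * z_i),
# with a two-parameter Weibull baseline h0(t) = (beta/eta) * (t/eta)**(beta-1).
# beta, eta, gamma and the covariate value are illustrative, not fitted.

def weibull_baseline_hazard(t, beta, eta):
    """Weibull baseline hazard with shape beta and scale eta."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def phm_hazard(t, covariates, gammas, beta, eta):
    """Baseline hazard scaled by the exponential covariate function."""
    return weibull_baseline_hazard(t, beta, eta) * math.exp(
        sum(g * z for g, z in zip(gammas, covariates)))

# beta > 1 models wear-out: the hazard rises with operating time.
h_early = phm_hazard(t=100.0, covariates=[0.2], gammas=[1.5], beta=2.0, eta=1000.0)
h_late  = phm_hazard(t=900.0, covariates=[0.2], gammas=[1.5], beta=2.0, eta=1000.0)
print(h_early < h_late)  # True
```

The proportionality assumption criticised in the abstract is visible here: the covariate term rescales the baseline by a constant factor, so hazard curves for different covariate values never cross.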
Abstract:
Modelling video sequences by subspaces has recently shown promise for recognising human actions. Subspaces are able to accommodate the effects of various image variations and can capture the dynamic properties of actions. Subspaces form a non-Euclidean and curved Riemannian manifold known as a Grassmann manifold. Inference on manifold spaces usually is achieved by embedding the manifolds in higher dimensional Euclidean spaces. In this paper, we instead propose to embed the Grassmann manifolds into reproducing kernel Hilbert spaces and then tackle the problem of discriminant analysis on such manifolds. To achieve efficient machinery, we propose graph-based local discriminant analysis that utilises within-class and between-class similarity graphs to characterise intra-class compactness and inter-class separability, respectively. Experiments on KTH, UCF Sports, and Ballet datasets show that the proposed approach obtains marked improvements in discrimination accuracy in comparison to several state-of-the-art methods, such as the kernel version of affine hull image-set distance, tensor canonical correlation analysis, spatial-temporal words and hierarchy of discriminative space-time neighbourhood features.
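One common way to embed Grassmann manifold points into a reproducing kernel Hilbert space, as the abstract describes, is the projection kernel k(X, Y) = ||XᵀY||²_F, where X and Y have orthonormal columns spanning each subspace. The paper's graph-based local discriminant analysis is not reproduced here; this sketch only illustrates such a kernel, using plain lists in place of a matrix library:

```python
# Projection kernel between two subspaces represented by matrices with
# orthonormal columns: k(X, Y) = squared Frobenius norm of X^T Y.
# The kernel is maximal (equal to the subspace dimension) when the
# subspaces coincide, and smaller the further apart they are.

def projection_kernel(X, Y):
    """X, Y -- matrices as lists of rows, with orthonormal columns."""
    rows, px, py = len(X), len(X[0]), len(Y[0])
    k = 0.0
    for i in range(px):
        for j in range(py):
            dot = sum(X[r][i] * Y[r][j] for r in range(rows))
            k += dot * dot
    return k

# Two 2-D subspaces of R^3: span{e1, e2} and span{e1, e3}.
A = [[1, 0], [0, 1], [0, 0]]
B = [[1, 0], [0, 0], [0, 1]]
print(projection_kernel(A, B))  # 1.0 -- the subspaces share one direction
print(projection_kernel(A, A))  # 2.0 -- kernel equals the subspace dimension
```

With such a kernel in hand, discriminant analysis can be carried out in the Hilbert space using kernel evaluations alone, which is what makes the manifold inference tractable.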
Abstract:
We are aware of global concerns about sustainability and are encouraged on many fronts to modify our behaviour to save the planet, but sometimes this understanding is more intellectual than motivating. An opportunity was identified within the university environment to run a pilot study investigating the level of voluntary student engagement in saving energy when a plant/digital interface is introduced. We postulate that people may be more inclined to participate in a "green" activity if they are more directly aware of the benefits. This project also seeks to discover whether the introduction of nature (green plants) as the interface would encourage users to increase participation in socially responsive activities. Using plants as the interface offers an immediate sensory connection between the participants and the outcome of their chosen actions. This may generate a deeper awareness of the environment by enabling participants to realise that one small action in an ordinary day can contribute positively to larger global issues.
Abstract:
GPS is a commonly used and convenient technology for determining absolute position in outdoor environments, but its high power consumption leads to rapid battery depletion in mobile devices. An obvious solution is to duty cycle the GPS module, which prolongs the device lifetime at the cost of increased position uncertainty while the GPS is off. This article addresses the trade-off between energy consumption and localization performance in a mobile sensor network application. The focus is on augmenting GPS location with more energy-efficient location sensors to bound position estimate uncertainty while GPS is off. Empirical GPS and radio contact data from a large-scale animal tracking deployment is used to model node mobility, radio performance, and GPS. Because GPS takes a considerable, and variable, time after powering up before it delivers a good position measurement, we model the GPS behaviour through empirical measurements of two GPS modules. These models are then used to explore duty cycling strategies for maintaining position uncertainty within specified bounds. We then explore the benefits of using short-range radio contact logging alongside GPS as an energy-inexpensive means of lowering uncertainty while the GPS is off, and we propose strategies that use RSSI ranging and GPS back-offs to further reduce energy consumption. Results show that our combined strategies can cut node energy consumption by one third while still meeting application-specific positioning criteria.
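The core duty-cycling trade-off the abstract describes can be illustrated with a toy worst-case model: while the GPS is off, position uncertainty grows with the node's maximum speed, and a fix is scheduled whenever the uncertainty would exceed the application's bound. The numbers below (speed, bound, duration) are hypothetical, not taken from the deployment data, and the radio-contact and RSSI refinements are omitted:

```python
# Toy sketch of uncertainty-bounded GPS duty cycling. Between fixes the
# worst-case position uncertainty grows linearly with maximum node speed;
# a fix resets it to zero (GPS fix-acquisition delay is ignored here).

def gps_on_times(duration_s, max_speed_mps, uncertainty_bound_m):
    """Return the times (seconds) at which the GPS must be powered on."""
    fix_times, uncertainty = [0], 0.0
    for t in range(1, duration_s + 1):
        uncertainty += max_speed_mps          # worst-case drift per second
        if uncertainty >= uncertainty_bound_m:
            fix_times.append(t)               # take a fix, reset uncertainty
            uncertainty = 0.0
    return fix_times

# A node moving at up to 2 m/s with a 100 m uncertainty bound needs a fix
# every 50 s; cheaper sensors (radio contacts, RSSI ranging) would let the
# strategies in the article space these fixes further apart.
fixes = gps_on_times(duration_s=200, max_speed_mps=2.0, uncertainty_bound_m=100.0)
print(fixes)  # [0, 50, 100, 150, 200]
```

A real scheduler would also account for the variable fix-acquisition time the article models empirically, since the GPS must be powered on before the bound is actually reached.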
Abstract:
Programmed cell death is characterized by a cascade of tightly controlled events that culminate in the orchestrated death of the cell. In multicellular organisms, autophagy and apoptosis are recognized as the two principal means by which these genetically determined cell deaths occur. During plant-microbe interactions, cell death programs can mediate both resistant and susceptible events. Via oxalic acid (OA), the necrotrophic phytopathogen Sclerotinia sclerotiorum hijacks host pathways and induces cell death in host plant tissue, resulting in hallmark apoptotic features in a time- and dose-dependent manner. OA-deficient mutants are non-pathogenic and trigger a restricted cell death phenotype in the host that unexpectedly exhibits markers associated with the plant hypersensitive response, including callose deposition and a pronounced oxidative burst, suggesting the plant can recognize the pathogen and, in this case, respond defensively. The details of this plant-directed restrictive cell death associated with OA-deficient mutants are the focus of this work. Using a combination of electron and fluorescence microscopy, chemical effectors, and reverse genetics, we show that this restricted cell death is autophagic. Inhibition of autophagy rescued the non-pathogenic mutant phenotype. These findings indicate that autophagy is a defense response in this necrotrophic fungus/plant interaction and suggest a novel function associated with OA: namely, the suppression of autophagy. These data suggest that not all cell deaths are equivalent; though programmed cell death occurs in both situations, the outcome is predicated on who controls the cell death machinery. Based on our data, we suggest that it is not cell death per se that dictates the outcome of certain plant-microbe interactions, but the manner by which cell death occurs that is crucial.