Abstract:
Protease-activated receptor-2 (PAR2) is a G protein-coupled receptor (GPCR) that is activated by proteolytic cleavage of its amino-terminal domain by trypsin-like serine proteases. Cleavage of this receptor exposes a neoepitope, termed the tethered ligand (TL), which binds intramolecularly within the receptor to stimulate signal transduction via coupled G proteins. PAR2-mediated signal transduction can also be stimulated experimentally by hexapeptides (agonist peptides; APs) that are homologous to the TL sequence. Due to the irreversible nature of PAR2 proteolysis, downstream signal transduction is tightly regulated. Following activation, PAR2 is rapidly uncoupled from downstream signalling by the post-translational modifications phosphorylation and ubiquitination, which facilitate interactions with β-arrestin. This scaffolding protein couples PAR2 to the internalisation machinery, initiating its desensitisation and trafficking through the early and late endosomes followed by receptor degradation. PAR2 is widely expressed in mammalian tissues, with key roles for this receptor in the cardiovascular, respiratory, nervous and musculoskeletal systems. This receptor has also been linked to pathological states, with aberrant expression and signalling noted in several cancers. In prostate cancer, PAR2 signalling induces migration and proliferation of tumour-derived cell lines, while elevated receptor expression has been noted in malignant tissues. Importantly, a role for this receptor has also been suggested in prostate cancer bone metastasis, as coexpression of PAR2 and a proteolytic activator has been demonstrated by immunohistochemical analysis. Based on these data, the primary focus of this project has been on two aspects of PAR2 biology. The first is characterisation of cellular mechanisms that regulate PAR2 signalling and trafficking. The second is the role of this receptor in prostate cancer bone metastasis. To permit these studies, it was first necessary to evaluate the specificity of the commercially available anti-PAR2 antibodies SAM11, C17, N19 and H99. The specificity of these four antibodies was assessed using four techniques: immunoprecipitation; Western blot analysis; immunofluorescence; and flow cytometry. These approaches demonstrated that three of the antibodies efficiently detect ectopically expressed PAR2 by each of these techniques. A significant finding from this study was that N19 was the only antibody able to specifically detect N-glycosylated endogenous PAR2 by Western blot analysis. This analysis was performed on lysates from prostate cancer-derived cell lines and tissue derived from wildtype and PAR2 knockout mice. Importantly, further evaluation demonstrated that this antibody also efficiently detects endogenous PAR2 at the cell surface by flow cytometry. The anti-PAR2 antibody N19 was used to explore the in vitro role of palmitoylation, the post-translational addition of palmitate, in PAR2 signalling, trafficking, cell surface expression and desensitisation. Significantly, use of the palmitoylation inhibitor 2-bromopalmitate indicated that palmitate addition is important in trafficking of PAR2 endogenously expressed by prostate cancer cell lines. This was supported by palmitate labelling experiments using two approaches, which showed that PAR2 stably expressed by CHO cells is palmitoylated and that palmitoylation occurs on cysteine 361.
Another key finding from this study is that palmitoylation is required for optimal PAR2 signalling, as Ca2+ flux assays indicated that, in response to trypsin agonism, palmitoylation-deficient PAR2 is ~9-fold less potent than the wildtype receptor, with a reduction of about 33% in the maximum signal induced via the mutant receptor. Confocal microscopy, flow cytometry and cell surface biotinylation analyses demonstrated that palmitoylation is required for efficient cell surface expression of PAR2. Importantly, this study also identified that palmitoylation of this receptor within the Golgi apparatus is required for efficient agonist-induced rab11a-mediated trafficking of PAR2 to the cell surface. Interestingly, palmitoylation is also required for receptor desensitisation, as agonist-induced β-arrestin recruitment and receptor degradation were markedly reduced in CHO-PAR2-C361A cells compared with CHO-PAR2 cells. Collectively, these data provide new insights into the life cycle of PAR2 and demonstrate that palmitoylation is critical for efficient signalling, trafficking, cell surface localisation and degradation of this receptor. This project also evaluated PAR2 residues involved in ligand docking. Although extracellular loop (ECL) 2 of PAR2 is known to be required for agonist-induced signal transduction, the binding pocket for receptor agonists remains to be determined. In silico homology modelling, based on a crystal structure of the prototypical GPCR rhodopsin, and ligand docking were performed to identify PAR2 transmembrane (TM) amino acids potentially involved in agonist binding. These methods identified 12 candidate residues that were mutated to examine the binding site of the PAR2 TL, revealed by trypsin cleavage, as well as of the soluble ligands 2f-LIGRLO-NH2 and GB110, which are both structurally based on the AP SLIGRL-NH2. Ligand binding was evaluated from the impact of the mutated residues on PAR2-mediated calcium mobilisation. An important finding from these experiments was that mutation of residues Y156 and Y326 significantly reduced 2f-LIGRLO-NH2 and GB110 agonist activity. L307 was also important for GB110 activity. Intriguingly, mutation of PAR2 residues did not alter trypsin-induced signalling to the same extent as for the soluble agonists. The reason for this difference remains to be examined further by in silico and in vitro experimentation and, potentially, crystal structure studies. Nonetheless, these findings identify the importance of TM domains in PAR2 ligand docking and will enhance the design of both PAR2 agonists and, potentially, agents to inhibit signalling (antagonists). The potential importance of PAR2 in prostate cancer bone metastasis was examined using a mouse model. In patients, prostate cancer bone metastases cause bone growth by disrupting bone homeostasis. In an attempt to mimic prostate cancer growth in bone, PAR2-responsive 22Rv1 prostate cancer cells, which form mixed osteoblastic and osteolytic lesions, were injected into the proximal aspect of mouse tibiae. A role for PAR2 was assessed by treating these mice with the recently developed PAR2 antagonist GB88. As controls, animals bearing intra-tibial tumours were also treated with vehicle (olive oil) or the prostate cancer chemotherapeutic docetaxel. The effect of these treatments on bone was examined radiographically and by micro-CT. Consistent with previous studies, 22Rv1 tumours caused osteoblastic periosteal spicule formation and concurrent osteolytic bone loss.
Significantly, blockade of PAR2 signalling reduced the osteoblastic and osteolytic phenotype of 22Rv1 tumours in bone. No bone defects were detected in mice treated with docetaxel. These qualitative data will be followed up with quantitative micro-CT analysis as well as histology and histomorphometry analyses of tissues already collected. Nonetheless, these preliminary experiments highlight a potential role for PAR2 in prostate cancer growth in bone. In summary, the in vitro studies have defined mechanisms regulating PAR2 activation, downstream signalling and trafficking, and the in vivo studies point to a potential role for this receptor in prostate cancer bone metastasis. The outcome of this project is a greater understanding of the biology of PAR2, which may lead to the development of strategies to modulate the function of this receptor in disease.
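As a purely illustrative aside (assumed EC50 values and concentration range, not data from the study), the Ca2+ signalling result above can be pictured with a simple Hill-type concentration-response curve in which the palmitoylation-deficient receptor is right-shifted ~9-fold and capped at roughly two-thirds of the wild-type maximum:

```python
# Illustrative sketch only: Hill-type concentration-response curves with an
# assumed wild-type EC50 of 10 nM, a ~9-fold right-shifted mutant EC50, and a
# maximal response reduced by ~33%. None of these numbers are from the thesis.
import numpy as np

def hill(conc, ec50, emax, n=1.0):
    """Fractional response for a simple Hill concentration-response curve."""
    return emax * conc**n / (ec50**n + conc**n)

conc = np.logspace(-10, -6, 9)            # agonist concentration (M), assumed range
wt   = hill(conc, ec50=1e-8, emax=1.0)    # wildtype receptor (assumed EC50 = 10 nM)
mut  = hill(conc, ec50=9e-8, emax=0.67)   # palmitoylation-deficient: ~9-fold less potent, ~33% lower Emax

for c, w, m in zip(conc, wt, mut):
    print(f"{c:.1e} M  wildtype {w:.2f}  mutant {m:.2f}")
```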
Abstract:
This workshop is a continuation and extension of the successful past workshops [4, 5, 6]. The workshop addresses the opportunities and challenges for the design of digital interactive systems that engage individuals in critical reflection on their everyday food practices, including designing for engagement in more environmentally aware, socially inclusive, and healthier behaviour. These three themes represent the focus of much recent HCI work related to food. The workshop aims to further the conversation on these themes by understanding specifically how the process of critical reflection can be encouraged by interactive technology. While the focus will be on food as an application area, the intention is also to explore, more generally, how the process of critical reflection can be facilitated through interactive technology. The workshop provides a unique forum to discuss existing theoretical and pragmatic approaches, and to envision novel ways to design technology that encourages sustained critical reflection.
Abstract:
Circoviruses lack an autonomous DNA polymerase and are dependent on the replication machinery of the host cell for de novo DNA synthesis. Accordingly, the viral DNA needs to cross both the plasma membrane and the nuclear envelope before replication can occur. Here we report on the subcellular distribution of the beak and feather disease virus (BFDV) capsid protein (CP) and replication-associated protein (Rep) expressed via recombinant baculoviruses in an insect cell system and test the hypothesis that the CP is responsible for transporting the viral genome, as well as Rep, across the nuclear envelope. The intracellular localization of the BFDV CP was found to be directed by three partially overlapping bipartite nuclear localization signals (NLSs) situated between residues 16 and 56 at the N terminus of the protein. Moreover, a DNA binding region was also mapped to the N terminus of the protein and falls within the region containing the three putative NLSs. The ability of CP to bind DNA, coupled with the karyophilic nature of this protein, strongly suggests that it may be responsible for nuclear targeting of the viral genome. Interestingly, whereas Rep expressed on its own in insect cells is restricted to the cytoplasm, coexpression with CP alters the subcellular localization of Rep to the nucleus, strongly suggesting that an interaction with CP facilitates movement of Rep into the nucleus. Copyright © 2006, American Society for Microbiology. All Rights Reserved.
Abstract:
Emergence has the potential to effect complex, creative or open-ended interactions and novel game-play. We report on research into an emergent interactive system, investigating emergent user behaviors and experience through the creation and evaluation of +-NOW, an augmented reality, tangible, interactive art system. The paper briefly describes the qualities of emergence and +-NOW before focusing on its evaluation: a qualitative study with 30 participants conducted in context. Data analysis followed Grounded Theory methods. Coding schemes, induced from the data and external literature, are presented. Findings show that emergence occurred for over half of the participants. The nature of these emergent behaviors is discussed along with examples from the data. Other findings indicate that participants found interaction with the work satisfactory. Design strategies for facilitating satisfactory experience despite the often unpredictable character of emergence are briefly reviewed, and potential application areas for emergence are discussed.
Abstract:
This paper reports outcomes of a pilot study to develop a conceptual framework that allows people to retrofit a building-layer to gain better control of their own built environments. The study was initiated by the realisation that discussions surrounding the improvement of building performance tend to be about top-down technological solutions rather than about helping and encouraging bottom-up involvement of building-users. While users are the ultimate beneficiaries and their feedback is always appreciated, their direct involvement in managing buildings is often regarded as obstruction or distraction, largely because casual interventions by uninformed building-users tend to disrupt the system. Some earlier research showed, however, that direct and active participation of users could improve building performance if appropriate training and/or systems were introduced. We also speculate that, in the long run, this would make the built environment more sustainable. With this in mind, we looked for opportunities to retrofit our own office with an interactive layer to study how we could introduce ad-hoc systems for building-users. The aim of this paper is to describe our vision and initial attempts, followed by discussion.
Abstract:
This paper reports on findings from the first year of a three-year longitudinal study in which seventh- to ninth-graders were introduced to engineering education. Specifically, the paper addresses students’ responses to an initial design activity involving bridge construction, which was implemented at the end of seventh grade. The paper also addresses how students created their bridge designs and applied these in their bridge constructions; their reflections on their designs; their reflections on why the bridge failed to support increased weights during the testing process; and their suggestions on ways in which they would improve their bridge designs. The present findings include identification of six increasingly sophisticated levels of illustrated bridge designs, with designs improving between the classroom and homework activities of two focus groups of students. Students’ responses to the classroom activity revealed a number of iterative design processes, where the problem goals, including constraints, served as monitoring factors for students’ generation of ideas, design thinking and construction of an effective bridge.
Abstract:
The act of computer programming is generally considered to be temporally removed from a computer program’s execution. In this paper we discuss the idea of programming as an activity that takes place within the temporal bounds of a real-time computational process and its interactions with the physical world. We ground these ideas within the context of livecoding – a live audiovisual performance practice. We then describe how the development of the programming environment “Impromptu” has addressed our ideas of programming with time and the notion of the programmer as an agent in a cyber-physical system.
Abstract:
The convergence of locative and social media with collaborative interfaces and data visualisation has expanded the potential of online information provision. Offering new ways for communities to share contextually specific information, it presents the opportunity to expand social media’s current focus on micro self-publishing with applications that support communities to actively address areas of local need. This paper details the design and development of a prototype application that illustrates this potential. Entitled PetSearch, it was designed in collaboration with the Animal Welfare League of Queensland to support communities to map and locate lost, found and injured pets, and to build community engagement in animal welfare issues. We argue that, while established approaches to social and locative media provide a useful foundation for designing applications to harness social capital, they must be re-envisaged if they are to effectively facilitate community collaboration. We conclude by arguing that the principles of user engagement and co-operation employed in this project can be extrapolated to other online approaches that aim to facilitate co-operative problem solving for social benefit.
Abstract:
Due to the explosive growth of the Web, the domain of Web personalization has gained great momentum in both research and commercial areas. Recommender systems are among the most popular Web personalization systems, and choosing the user information used to build profiles is crucial for them. In Web 2.0, user tagging systems are one facility that helps users organize Web resources of interest. Exploring user tagging behavior provides a promising way of understanding users’ information needs, since tags are given directly by users. However, the free and relatively uncontrolled vocabulary means that user-defined tags lack standardization and are semantically ambiguous. In addition, the rich relationships among tags need to be explored, since they could provide valuable information for better understanding users. In this paper, we propose a novel approach for learning a tag ontology, based on the widely used lexical database WordNet, that captures the semantics and structural relationships of tags. We present personalization strategies that disambiguate the semantics of tags by combining the opinion of WordNet lexicographers with users’ tagging behavior. To personalize further, clustering of users is performed to generate a more accurate ontology for a particular group of users. To evaluate the usefulness of the tag ontology, we use it in a pilot tag recommendation experiment, exploiting its semantic information to improve recommendation performance. Initial results show that the personalized information improves the accuracy of tag recommendation.
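By way of illustration only (this is not the authors' implementation), the sketch below shows one simple way a tag could be mapped to a WordNet sense by combining the lexicographer categories of its candidate synsets with the senses of co-occurring tags, using NLTK's WordNet interface; the function name and scoring rule are assumptions made for the example.

```python
# Hedged sketch: disambiguate a user tag against WordNet synsets by comparing
# each sense's lexicographer category (lexname) with the senses of co-occurring
# tags, as a crude stand-in for combining lexicographer opinion with tagging
# behaviour. Requires `nltk` and the downloaded 'wordnet' corpus.
from collections import Counter
from nltk.corpus import wordnet as wn

def disambiguate_tag(tag, co_tags):
    """Pick the WordNet synset of `tag` whose lexicographer file is most
    common among the senses of the tags it co-occurs with."""
    context_lexnames = Counter(
        s.lexname() for co in co_tags for s in wn.synsets(co)
    )
    candidates = wn.synsets(tag)
    if not candidates:
        return None
    # Score each candidate sense by how often its lexicographer category
    # appears among the co-occurring tags' senses.
    return max(candidates, key=lambda s: context_lexnames[s.lexname()])

# Example: "apple" tagged alongside food-related tags should resolve to the
# fruit sense rather than the tree sense.
sense = disambiguate_tag("apple", ["fruit", "pie", "recipe"])
if sense is not None:
    print(sense.name(), "-", sense.definition())
```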
Abstract:
The ability to forecast machinery health is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models that attempt to forecast machinery health based on condition data such as vibration measurements. This paper demonstrates how the population characteristics and condition monitoring data (both complete and suspended) of historical items can be integrated to train an intelligent agent to predict asset health multiple steps ahead. The model consists of a feed-forward neural network whose training targets are asset survival probabilities estimated using a variation of the Kaplan–Meier estimator and a degradation-based failure probability density function estimator. The trained network is capable of estimating future survival probabilities when a series of asset condition readings is input. The output survival probabilities collectively form an estimated survival curve. Pump data from a pulp and paper mill were used for model validation and comparison. The results indicate that the proposed model can predict more accurately, as well as further ahead, than similar models which neglect population characteristics and suspended data. This work presents a compelling concept for longer-range fault prognosis utilising available information more fully and accurately.
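To make the training-target construction concrete, here is a minimal sketch, under assumed toy data (not the pump dataset) and covering only the Kaplan-Meier part of the target estimation, of how survival probabilities estimated from failure and suspension times can serve as regression targets for a small feed-forward network; the feature layout and network size are illustrative assumptions.

```python
# Hedged sketch of the general idea, not the authors' model: Kaplan-Meier
# survival probabilities computed from historical failure/suspension times are
# used as training targets for a feed-forward network mapping condition
# readings to survival probability.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def kaplan_meier(times, events):
    """Return (distinct event times, survival probabilities).
    `events[i]` is 1 for a failure, 0 for a suspension (censoring)."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    surv, s = [], 1.0
    for t in np.unique(times):
        at_risk = np.sum(times >= t)                  # items still operating at t
        deaths = np.sum((times == t) & (events == 1)) # failures observed at t
        s *= 1.0 - deaths / at_risk
        surv.append((t, s))
    return zip(*surv)

# Toy historical data: item lifetimes (hours) with some suspensions.
lifetimes = [120, 150, 150, 200, 240, 240, 300, 320]
failed    = [  1,   1,   0,   1,   0,   1,   1,   0]
t_grid, s_grid = map(np.array, kaplan_meier(lifetimes, failed))

# Toy condition feature (e.g. vibration RMS) paired with each grid time; in
# practice a window of condition readings would form the input vector.
X = np.column_stack([t_grid, np.linspace(1.0, 4.0, len(t_grid))])
net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
net.fit(X, s_grid)                 # targets are the KM survival estimates
print(net.predict(X))              # estimated survival curve
```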
Abstract:
The ability to accurately predict the remaining useful life of machine components is critical for continuous machine operation, and can also improve productivity and enhance system safety. In condition-based maintenance (CBM), maintenance is performed based on information collected through condition monitoring and an assessment of machine health. Effective diagnostics and prognostics are important aspects of CBM, allowing maintenance engineers to schedule a repair and to acquire replacement components before the components actually fail. All machine components are subject to degradation processes in real environments, and they have certain failure characteristics which can be related to the operating conditions. This paper describes a technique for accurate assessment of the remnant life of machines based on health state probability estimation and on historical knowledge embedded in closed-loop diagnostics and prognostics systems. The technique uses a Support Vector Machine (SVM) classifier as a tool for estimating the health state probability of machine degradation, which in turn affects the accuracy of prediction. To validate the feasibility of the proposed model, real-life historical data from bearings of High Pressure Liquefied Natural Gas (HP-LNG) pumps were analysed and used to obtain the optimal prediction of remaining useful life. The results obtained were very encouraging and showed that the proposed prognostic system based on health state probability estimation has the potential to be used as an estimation tool for remnant life prediction in industrial machinery.
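The following is a minimal sketch of the general pattern, not the paper's actual pipeline or data: an SVM classifier estimates the probability of each discrete health state from a condition-monitoring feature vector, and a remaining-useful-life figure is then formed as a probability-weighted combination; the features, state definitions, and life values are all assumptions for illustration.

```python
# Hedged illustration with assumed data: SVM-based health state probability
# estimation followed by a probability-weighted remaining-useful-life estimate.
import numpy as np
from sklearn.svm import SVC

# Toy training data: [RMS vibration, kurtosis] labelled with health states
# 0 = healthy, 1 = degraded, 2 = near failure.
X = np.array([[0.5, 3.0], [0.6, 3.1], [0.7, 3.2], [0.8, 3.3],
              [1.2, 4.0], [1.3, 4.2], [1.4, 4.1], [1.5, 4.3],
              [2.5, 6.0], [2.6, 6.5], [2.7, 6.2], [2.8, 6.8]])
y = np.array([0] * 4 + [1] * 4 + [2] * 4)

clf = SVC(probability=True, random_state=0).fit(X, y)

# Assumed expected remaining life (hours) associated with each health state.
state_rul = np.array([1000.0, 400.0, 50.0])

probs = clf.predict_proba([[1.4, 4.5]])[0]       # health state probabilities
print("state probabilities:", probs)
print("estimated RUL (h):", probs @ state_rul)   # probability-weighted RUL
```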
Abstract:
The use of material artefacts within the design process is a long-standing and continuing characteristic of interaction design. Established methods, such as prototyping, which have been widely adopted by educators and practitioners, are seeing renewed research interest and being reconsidered in light of the evolving needs of the field. Alongside this, the past decade has seen the introduction and adoption of a diverse range of novel design methods into interaction design, such as cultural probes, technology probes, context mapping, and provotypes.
Abstract:
We investigated the collaboration of ten doctor-nurse pairs with a prototype digital telehealth stethoscope. Doctors could see and hear the patient but could not touch them or the stethoscope; the nurse in each pair controlled the stethoscope. For ethical reasons, an experimenter stood in for a patient. Each of the ten interactions was video recorded and analysed to understand the interaction and collaboration between the doctor and nurse. The video recordings were coded and transformed into maps of interaction that were analysed for patterns of activity. The analysis showed that as doctors and nurses became more experienced at using the telehealth stethoscope, their collaboration became more effective. The main measure of effectiveness was the number of corrections in stethoscope placement required by the doctor: in early collaborations the doctors gave many corrections, but after several trials all pairs had reduced the number of corrections required. The significance of this research is the identification of the qualities of effective collaboration in the use of the telehealth stethoscope and telehealth systems more generally.
Abstract:
With increasing demands on our time, everyday behaviors such as food purchasing, preparation, and consumption have become habitual and unconscious. Indeed, modern food values are focused on convenience and effortlessness, overshadowing other values such as environmental sustainability, health, and pleasure. The rethinking of how we approach everyday food behaviors appears to be a particularly timely concern. In this special section, we explore work carried out and discussed during the recent workshop “Food for Thought: Designing for Critical Reflection on Food Practices,” at the 2012 Designing Interactive Systems Conference in Newcastle upon Tyne, U.K.
Abstract:
The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operation downtime, and safety hazards. Predicting the survival time and the probability of failure in future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative approach to traditional reliability analysis is the modelling of condition indicators and operating environment indicators and their failure-generating mechanisms using a covariate-based hazard model. The literature review indicates that a number of covariate-based hazard models have been developed, all of them based on the underlying theory of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have to some extent been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models do not fully utilise the three types of asset health information (failure event data, i.e. observed and/or suspended; condition data; and operating environment data) within a single model for more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data. Condition indicators act as response variables (or dependent variables) whereas operating environment indicators act as explanatory variables (or independent variables). However, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related and yet more imperative question is how both of these indicators should be effectively modelled and integrated into a covariate-based hazard model. This work presents a new approach for addressing these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of available asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and condition measurements as well as operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and condition indicators.
Condition indicators provide information about the health condition of an asset; they therefore update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Some examples of condition indicators are the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component, to name but a few. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators arise from the environment in which an asset operates and have not been explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators could be nil in EHM, condition indicators are always present, because they are observed and measured for as long as an asset is operational and has survived. EHM has several advantages over the existing covariate-based hazard models. One is that the model utilises three different sources of asset health data (i.e. population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Another is that EHM explicitly investigates the relationship between condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. Depending on the sample size of failure/suspension times, EHM is developed in two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) for the baseline hazard. However, in many industrial applications failure event data are sparse, and the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the semi-parametric EHM's restrictive assumption of a specified lifetime distribution for failure event histories, the non-parametric EHM, a distribution-free model, has been developed. The development of EHM in two forms is another merit of the model. A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimated results with those of the other existing covariate-based hazard models. The comparison results demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, regarding the new parameter estimation method in the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
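For illustration only, a hazard form consistent with the description above (the exact EHM equations are not reproduced here, and the symbols and functional choices are assumptions) could be written with condition indicators z(t) entering the baseline hazard and operating environment indicators w(t) acting through a multiplicative covariate function:

```latex
% Assumed illustrative form, not the exact EHM formulation: the baseline hazard
% depends on time and the condition indicators z(t), while operating environment
% indicators w(t) scale it multiplicatively.
\[
  h\bigl(t \mid z(t), w(t)\bigr) = h_{0}\bigl(t, z(t)\bigr)\,\exp\!\bigl(\gamma^{\top} w(t)\bigr),
  \qquad
  R(t) = \exp\!\Bigl(-\int_{0}^{t} h\bigl(u \mid z(u), w(u)\bigr)\,du\Bigr).
\]
% Semi-parametric case: a Weibull baseline in time, modulated by the condition
% indicators through an assumed positive function \psi.
\[
  h_{0}\bigl(t, z(t)\bigr) = \frac{\beta}{\eta}\Bigl(\frac{t}{\eta}\Bigr)^{\beta-1}\,\psi\bigl(z(t)\bigr).
\]
```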