531 results for Instant Messaging
Abstract:
This doctoral thesis in composition comprises two projects of different but complementary natures: (1) a theoretical research project on the communication of musical characters; (2) an artistic project built around the composition of three pieces: L'homme à deux têtes, a chamber opera; Un instant dans l'esprit de Lovecraft, for solo clarinet, string orchestra and percussion; and Balade ornithologique, for wind orchestra. The conception of music as a means of communication lies at the origin of this theoretical research, which is motivated by a desire to understand strategies for expressing emotions in music from the composer's point of view. The thesis addresses models of communication, the concept of the virtual character and the theory of mood contagion, and then details the acoustic cues that lead to the perception of musical characters. All of these notions are illustrated and explored through the composition of miniatures, each with a clearly targeted character. Finally, we propose a system for the musical analysis of characters and emotions, which is applied to the analysis of sections of the pieces composed during this doctoral project. This final chapter highlights the strategies used to create a dramatic discourse while laying out the evocation of different musical characters.
Abstract:
Based on close examinations of instant message (IM) interactions, this chapter argues that an interactional sociolinguistic approach to computer-mediated language use can explain phenomena that previously could not be accounted for in computer-mediated discourse analysis (CMDA). Drawing on the theoretical framework of relational work (Locher, 2006), the analysis focuses on non-task-oriented talk and its function in forming and establishing communication norms in the team, as well as on micro-level phenomena such as hesitation, backchannel signals and emoticons. The conclusions of this preliminary research suggest that the linguistic strategies that substitute for audio-visual signals are used strategically in discursive functions and play an important role in relational work.
Abstract:
Purpose: This paper extends the use of Radio Frequency Identification (RFID) data to the accounting of warehouse costs and services. The Time-Driven Activity-Based Costing (TDABC) methodology is enhanced with RFID data collected in real time about the duration of warehouse activities, allowing warehouse managers to obtain accurate and instant cost calculations. The RFID-enhanced TDABC (RFID-TDABC) is proposed as a novel application of RFID technology. Research Approach: RFID-TDABC is implemented on the warehouse processes of a case study company, covering receiving, put-away, order picking, and despatching. Findings and Originality: RFID technology is commonly used for the identification and tracking of items. The use of RFID-generated information with TDABC can be successfully extended to the area of costing. This RFID-TDABC costing model will benefit warehouse managers with accurate and instant cost calculations. Research Impact: There are still unexplored benefits of RFID technology in its applications in warehousing and the wider supply chain. A multi-disciplinary research approach led to combining RFID technology and the TDABC accounting method in order to propose RFID-TDABC. Combining methods and theories from different fields with RFID may lead researchers to develop new techniques such as the RFID-TDABC presented in this paper. Practical Impact: The RFID-TDABC concept will be of value to practitioners by showing how warehouse costs can be measured accurately using this approach. A better understanding of incurred costs may result in further optimisation of warehousing operations, lowering the costs of activities and thus providing competitive pricing to customers. RFID-TDABC can also be applied in the wider supply chain.
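To make the underlying arithmetic concrete, the following is a minimal sketch, assuming a simplified RFID read log and an illustrative capacity cost rate, of how activity durations captured by RFID could feed a time-driven cost calculation. It is not the model from the paper; the event format, activity names and cost rate are assumptions made for the example.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical RFID read log: (item_id, activity, start, end). In practice the
# start/end timestamps would come from RFID readers at process entry and exit points.
rfid_events = [
    ("PLT-001", "receiving", datetime(2024, 5, 1, 8, 0),  datetime(2024, 5, 1, 8, 12)),
    ("PLT-001", "put-away",  datetime(2024, 5, 1, 8, 15), datetime(2024, 5, 1, 8, 27)),
    ("PLT-001", "picking",   datetime(2024, 5, 2, 9, 0),  datetime(2024, 5, 2, 9, 8)),
    ("PLT-001", "despatch",  datetime(2024, 5, 2, 9, 30), datetime(2024, 5, 2, 9, 41)),
]

# Assumed capacity cost rate: cost of supplied capacity / practical capacity,
# e.g. 90,000 GBP per month over 150,000 available staff-minutes.
COST_PER_MINUTE = 90_000 / 150_000

def tdabc_costs(events, rate_per_minute):
    """Aggregate measured activity durations and convert them to costs (TDABC)."""
    minutes_by_activity = defaultdict(float)
    for _, activity, start, end in events:
        minutes_by_activity[activity] += (end - start).total_seconds() / 60.0
    return {a: m * rate_per_minute for a, m in minutes_by_activity.items()}

if __name__ == "__main__":
    for activity, cost in tdabc_costs(rfid_events, COST_PER_MINUTE).items():
        print(f"{activity:10s} {cost:8.2f} GBP")
```

The design point is simply that the time-equation inputs, normally estimated by managers, are replaced by durations measured automatically from RFID reads.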
Abstract:
Sudden changes in the stiffness of a structure are often indicators of structural damage. Detection of such sudden stiffness changes from the vibrations of structures is important for Structural Health Monitoring (SHM) and damage detection. Non-contact measurement of these vibrations is a quick and efficient route to successful detection of a sudden stiffness change in a structure. In this paper, we demonstrate the capability of Laser Doppler Vibrometry to detect a sudden stiffness change in a Single Degree of Freedom (SDOF) oscillator within a laboratory environment. The dynamic response of the SDOF system was measured using a Polytec RSV-150 Remote Sensing Vibrometer, which employs Laser Doppler Vibrometry to measure dynamic response. Additionally, the vibration response of the SDOF system was measured with a MicroStrain G-Link Wireless Accelerometer mounted on the system. The stiffness of the SDOF system was determined experimentally through calibrated linear springs. The sudden change of stiffness was simulated by introducing the failure of a spring at a certain instant during a period of forced vibration, with the forcing in the form of a white-noise input. The sudden change in stiffness was successfully detected from the Laser Doppler Vibrometry measurements, and this detection from optically obtained data was compared with detection using data obtained from the wireless accelerometer. The technique is deemed to have potential for a wide range of applications and is observed to be particularly suitable for rapid damage detection and health monitoring of structures under model-free conditions or where information about the structure is insufficient.
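As a numerical illustration of the scenario described (a white-noise-forced SDOF oscillator whose stiffness drops when a spring fails), here is a minimal simulation sketch with one simple detection heuristic, a sliding-window least-squares estimate of the stiffness. All parameter values are invented for the example, and the heuristic is not the detection approach used in the paper.

```python
import numpy as np

# Toy SDOF model: m*x'' + c*x' + k(t)*x = f(t), with k dropping at t_fail
# to mimic the failure of one calibrated spring. Values are illustrative only.
m, c = 1.0, 0.5
k_before, k_after, t_fail = 2000.0, 1200.0, 5.0
dt, T = 1e-3, 10.0
rng = np.random.default_rng(0)

n = int(T / dt)
t = np.arange(n) * dt
f = 50.0 * rng.standard_normal(n)            # white-noise forcing
k = np.where(t < t_fail, k_before, k_after)

x = np.zeros(n); v = np.zeros(n)
for i in range(n - 1):                        # semi-implicit Euler integration
    a = (f[i] - c * v[i] - k[i] * x[i]) / m
    v[i + 1] = v[i] + a * dt
    x[i + 1] = x[i] + v[i + 1] * dt

# Detection heuristic: estimate stiffness in short sliding windows by least
# squares from the (assumed measured) response and forcing, then flag the
# largest jump between consecutive window estimates.
vel = np.gradient(x, dt)
acc = np.gradient(vel, dt)
resid = f - m * acc - c * vel                 # should approximately equal k(t) * x
win = 500
k_hat = [np.dot(resid[i:i + win], x[i:i + win]) / np.dot(x[i:i + win], x[i:i + win])
         for i in range(0, n - win, win)]
jumps = np.abs(np.diff(k_hat))
print("estimated failure near t =", (np.argmax(jumps) + 1) * win * dt, "s")
```

In the laboratory setting the displacement or velocity history would come from the vibrometer rather than from the simulated response used here.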
Abstract:
Flame retardants (FRs) are added to materials to enhance the fire safety level of readily combustible polymers. Although they have been purported to aid in preventing fires in some cases, they have also become a significant cause for concern given the vast data on environmental persistence and adverse health effects in humans and animals. Evidence since the 1980s has shown that Canadians, Americans and Europeans have detectable levels of FRs in their bodies. North Americans in particular have high levels of these chemicals due to stringent flammability standards and the greater use of polybrominated diphenyl ethers (PBDEs) in North America compared with Europe. FRs have been detected in household dust, and some evidence suggests that TVs could be a significant source of exposure to FRs. It is imperative to revisit the flammability standard (UL94V) that allows FR use in TV plastic materials by providing a risk-versus-benefit analysis to determine whether this standard provides a fire safety benefit and whether it plays a major role in FR exposure. This report first examines the history of televisions and the progression to the UL94V flammability test standard in order to understand why FRs were first added to the polymers used in manufacturing TVs; this was shown to be due to fire hazards arising from the use of plastic materials in cathode-ray tube (CRT) TVs, which had an “instant-on” feature and high voltages and operating temperatures. In providing a risk-versus-benefit analysis, this paper argues that 1) a market survey shows that the current flammability test standard (UL94V) is outdated and lacks relevance to current technology, as flat, thin, energy-efficient Liquid Crystal Displays (LCDs) now dominate over the traditionally used heavy, bulky and energy-intensive CRTs; 2) FRs do not impart fire safety benefits, considering the lack of a valid fire safety concern (such as reduced internal and external ignition and fire hazard) and the lack of valid fire data and hazard for television fires in general; and finally 3) the standard is overly stringent, as it does not consider the risk of exposure to FRs in household dust arising from the proliferation and greater use of televisions in households. Therefore, this report argues that the UL94V standard has become trapped in history and needs to be updated, as it may play a major role in FR exposure.
Abstract:
PURPOSE: Radiation therapy is used to treat cancer using carefully designed plans that maximize the radiation dose delivered to the target and minimize damage to healthy tissue, with the dose administered over multiple occasions. Creating treatment plans is a laborious process and presents an obstacle to more frequent replanning, which remains an unsolved problem. However, between the creation of new plans the patient's anatomy can change due to multiple factors, including reduction in tumor size and loss of weight, which results in poorer patient outcomes. Cloud computing is a newer technology that is slowly being adopted for medical applications, with promising results. The objective of this work was to design and build a system that could analyze a database of previously created treatment plans, stored with their associated anatomical information in studies, to find the one with the anatomy most similar to a new patient. The analyses would be performed in parallel on the cloud to decrease the computation time of finding this plan. METHODS: The system used SlicerRT, a radiation therapy toolkit for the open-source platform 3D Slicer, for its tools to perform the similarity analysis algorithm. Amazon Web Services was used for the cloud instances on which the analyses were performed, as well as for storage of the radiation therapy studies and messaging between the instances and a master local computer. A module was built in SlicerRT to provide the user with an interface to direct the system on the cloud, as well as to perform other related tasks. RESULTS: The cloud-based system outperformed previous methods of conducting the similarity analyses in terms of time, analyzing 100 studies in approximately 13 minutes while producing the same similarity values as those methods. It also scaled up to larger numbers of studies in the database with a small increase in computation time of just over 2 minutes. CONCLUSION: This system successfully analyzes a large database of radiation therapy studies and finds the one that is most similar to a new patient, which represents a potential step forward in achieving feasible adaptive radiation therapy replanning.
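To illustrate the master-to-worker messaging pattern described (a local master directing cloud instances that each score archived studies against the new patient), here is a minimal sketch of the master side using Amazon SQS via boto3. The bucket, queue URLs and message fields are hypothetical, and the actual similarity computation performed on the workers with SlicerRT is not shown.

```python
import json
import boto3

# Hypothetical names; the real bucket, queues and message format are not
# specified in the abstract.
STUDY_BUCKET = "rt-study-archive"
TASK_QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/rt-similarity-tasks"
RESULT_QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/rt-similarity-results"

sqs = boto3.client("sqs")

def dispatch_studies(study_keys, new_patient_key):
    """Master side: enqueue one similarity task per archived study."""
    for key in study_keys:
        sqs.send_message(
            QueueUrl=TASK_QUEUE_URL,
            MessageBody=json.dumps({"study_key": key, "patient_key": new_patient_key}),
        )

def collect_best_match(expected):
    """Master side: gather similarity scores from workers and keep the best one."""
    best = (None, float("-inf"))
    received = 0
    while received < expected:
        resp = sqs.receive_message(QueueUrl=RESULT_QUEUE_URL,
                                   MaxNumberOfMessages=10, WaitTimeSeconds=20)
        for msg in resp.get("Messages", []):
            body = json.loads(msg["Body"])
            received += 1
            if body["score"] > best[1]:
                best = (body["study_key"], body["score"])
            sqs.delete_message(QueueUrl=RESULT_QUEUE_URL,
                               ReceiptHandle=msg["ReceiptHandle"])
    return best
```

A worker instance would pull a task message, fetch the study from storage, compute its similarity score, and post the result to the result queue; the parallel speed-up reported comes from running many such workers at once.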
Abstract:
This essay deals with the written conversation that young people carry on today in their chats. The different linguistic varieties involved are discussed. Chat writing is equivalent to a conversation: written texts become oral texts, conversations are transcribed, and linguistic norms are broken. This does not mean that young people do not know what those norms are; rather, in chats, they are simply not interested in them.
Abstract:
The Mariological thesis of the conceptio per aurem, according to which the Virgin Mary conceived Jesus Christ through her ear at the moment she heard from the angel the heavenly message announcing that, without losing her virginity, she would become the mother of the incarnate Son of God, has so far received very few academic studies rigorously grounded in primary sources. In fact, references to this theory are very scarce in the specialized literature and, when a scholar does evoke it, he or she almost always merely alludes to it without providing documentary evidence. However, as the nine Italian paintings analyzed here reveal, this theory was illustrated through subtle visual metaphors in many medieval pictorial works, which drew on a solid literary tradition. Moreover, a host of Church Fathers and medieval theologians testify, through explicit statements, that this theory enjoyed considerable acceptance among the masters of Christian thought. Drawing on numerous patristic and theological texts, this article pursues two essential objectives: first, to set out the different theoretical formulations proposed by those thinkers; and second, to bring to light the dogmatic meanings that underlie this surprising thesis.
Abstract:
Available under the GNU Lesser General Public License (LGPL3)
Abstract:
This work explores the development of MemTri, a memory forensics triage tool that can assess the likelihood of criminal activity in a memory image based on evidence data artefacts generated by several applications. Fictitious illegal-suspect activity scenarios were performed on virtual machines to generate 60 test memory images for input into MemTri. Four categories of applications (Internet Browsers, Instant Messengers, FTP Clients and Document Processors) are examined for data artefacts located through the use of regular expressions. The identified data artefacts are then analysed using a Bayesian Network to assess the likelihood that a seized memory image contains evidence of illegal activity. MemTri is currently under development, and this paper introduces only the basic concept and the components on which the application is built. A complete description of MemTri, together with extensive experimental results, is expected to be published in the first semester of 2017.
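As a rough sketch of the two-stage pipeline the abstract outlines (regular-expression artefact extraction followed by probabilistic evidence combination), the snippet below scans a raw memory image for a few artefact categories and combines the findings with a naive-Bayes calculation. The patterns, likelihoods and prior are placeholders; MemTri's actual regular expressions and Bayesian Network structure are not described in the abstract.

```python
import re
import math

# Illustrative artefact patterns only (bytes regexes over the raw image).
ARTEFACT_PATTERNS = {
    "browser_url": rb"https?://[^\s\"'<>]{4,}",
    "im_message":  rb"\"text\"\s*:\s*\"[^\"]{1,200}\"",
    "ftp_command": rb"(?:USER|PASS|RETR|STOR)\s+\S+",
}

# Assumed P(artefact | illegal activity) and P(artefact | benign activity),
# plus a prior -- placeholder values for the example.
LIKELIHOODS = {
    "browser_url": (0.9, 0.8),
    "im_message":  (0.7, 0.4),
    "ftp_command": (0.5, 0.1),
}
PRIOR_ILLEGAL = 0.5

def scan_image(path):
    """Return which artefact categories appear in the raw memory image."""
    with open(path, "rb") as fh:
        data = fh.read()
    return {name: bool(re.search(pat, data)) for name, pat in ARTEFACT_PATTERNS.items()}

def posterior_illegal(found):
    """Naive-Bayes combination of artefact evidence (a simplification of a full BN)."""
    log_odds = math.log(PRIOR_ILLEGAL / (1 - PRIOR_ILLEGAL))
    for name, present in found.items():
        p_ill, p_ben = LIKELIHOODS[name]
        if present:
            log_odds += math.log(p_ill / p_ben)
        else:
            log_odds += math.log((1 - p_ill) / (1 - p_ben))
    return 1 / (1 + math.exp(-log_odds))

if __name__ == "__main__":
    hits = scan_image("memdump.raw")   # hypothetical input file
    print(hits, f"P(illegal | evidence) = {posterior_illegal(hits):.2f}")
```

A full Bayesian Network would additionally model dependencies between artefact types rather than treating them as conditionally independent, which is the simplification made here.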
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
We report on the development of a Java-based application devised to support collaborative learning of Art concepts and ideas over the Internet. Starting from an examination of the pedagogy of both Art education and collaborative learning, we propose principles useful for the design and construction of a “lightweight” software application that supports interactive Art learning in groups. This application makes the “dynamics” of an art work explicit and supports group interaction with simple messaging and “chat” facilities. It may be used to facilitate the learning and teaching of Art, but also as a research tool to investigate the learning of Art and the development and dynamics of collaborating groups. An evaluation of a pilot study of the use of our system with a group of 20 school children is presented.
Abstract:
In a context where the use of an antibiotic increases bacterial resistance, this thesis evaluates, theoretically and through numerical simulation, how a monopolist maximizes its profit after obtaining a patent. From a theoretical standpoint, the monopolist seeks, on the one hand, to maximize its profit at every instant and, on the other, to maximize its profit over the entire lifetime of its patent. Through numerical simulation, the monopolist's valuation of its antibiotic turns out to be the central point of the analysis: if this valuation is high, the monopolist preserves the effectiveness of its antibiotic by reducing the quantity produced; otherwise, it produces a larger quantity and preserves less of the antibiotic's effectiveness.
Abstract:
Images about Africa in the northern hemisphere are generally negative and pessimistic. In spite of instant global communication, why have these images persisted to date? This contribution shall revisit these perceptions and the images embodying them to unearth the motivations and rationale behind them. The central argument, based on some narratives and experiences, is that ignorance feeds these images and stereotypes. Furthermore, the positionality of non-African experts and of some groups of African scholars and activists contributes to this culture of ignorance and paternalism. The contribution shall end with an ethical evaluation of the persistence of the images and the extent of the moral responsibility of the authors and carriers of the racist stereotypes embedded in them. (DIPF/Orig.)
Abstract:
This paper presents a discrete formalism for temporal reasoning about actions and change, which enjoys an explicit representation of time and of action/event occurrences. The formalism allows the expression of truth values for given fluents over various times, including non-decomposable points/moments and decomposable intervals. Two major problems that beset most existing interval-based theories of action and change, the so-called dividing instant problem and the intermingling problem, are absent from this new formalism. The dividing instant problem is overcome by excluding the concept of the ending points of intervals, and the intermingling problem is bypassed by characterising the fundamental time structure as a well-ordered discrete set of non-decomposable times (points and moments), from which decomposable intervals are constructed. A comprehensive characterisation of the relationship between the negation of fluents and the negation of the sentences involved is formally provided. The formalism provides a flexible expression of the temporal relationships between effects and their causal events, including delayed effects of events, which remain a problematic question in most existing theories of action and change.
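To give a flavour of the kind of time structure described, here is a toy encoding, not the paper's formalism, in which the only primitive times are non-decomposable points and moments, intervals are built as ordered runs of such times, and fluent truth values are assigned only at the primitives; all names and the API are assumptions made for the sketch.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Time:
    """A non-decomposable time: a point (duration 0) or a moment (duration 1)."""
    index: int
    duration: int  # 0 for a point, 1 for a moment

@dataclass(frozen=True)
class Interval:
    """A decomposable interval: a finite, ordered run of non-decomposable times."""
    times: Tuple[Time, ...]

    def duration(self) -> int:
        return sum(t.duration for t in self.times)

# Truth of fluents is assigned only at non-decomposable times, so there is no
# separate "ending point" of an interval at which a dividing-instant puzzle
# (e.g. a light both on and off at the switching instant) could arise.
truth = {("light_on", 0): False, ("light_on", 1): False,
         ("light_on", 2): True,  ("light_on", 3): True}

def holds_over(fluent: str, interval: Interval) -> bool:
    """A fluent holds over an interval iff it holds at every constituent time."""
    return all(truth[(fluent, t.index)] for t in interval.times)

if __name__ == "__main__":
    before = Interval((Time(0, 1), Time(1, 1)))
    after = Interval((Time(2, 1), Time(3, 1)))
    print(holds_over("light_on", before), holds_over("light_on", after))  # False True
```

Because intervals are just ordered collections of primitive times, negation and intermingling questions reduce to conditions over those primitives, which is loosely the strategy the abstract attributes to the formalism.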