621 results for Instant messaging
Abstract:
Although he did not write copious novels, endless essays, or long poems, Jorge Luis Borges is considered one of today's best modern writers. His works have never been more than ten pages long. The purpose of this dissertation is to demonstrate that the deliberate use of concise expression in Borges's writings is inscribed in a poetic worldview of great implications. This view is based on the synthesis of philosophical, literary, and cultural issues that Borges interprets, discusses, refutes, and re-elaborates with a new conjectural approach. This dissertation is based on a methodological review of current scholarship on his work and on a thorough examination of the four volumes of his Complete Works, published by Emece in 2002. His pantheistic vision, the epiphanic moments, and his love/hate relationship with language come together in an aesthetic of resounding silence that illuminates the hidden aspects of his brief masterpieces. Even though Borgesian studies flood the library he once imagined, they have been presented in an isolated manner. This dissertation establishes a link among the various aforementioned aspects as studied by Borges scholars, and demonstrates the powerful influence of Borges's illuminating and precise vision. Paradoxically, the poetry of brevity in Borges's works is filled with allusions to the things that Borges silences, because, from a panoramic pantheism, his words almost reach an epiphanic enlightenment that flashes between preterit and future nothingness. By replacing extension with intensity, and mastering the art of omission, Borges's laborious work reaches a power and concentration that only the very greatest talents can achieve. His delicate verbal conciseness provides his readers with a virtually infinite freedom of imagination because it exposes them to the chaotic world of mythical probabilities, where an instant encompasses eternity.
Abstract:
Traffic incidents are a major source of traffic congestion on freeways. Freeway traffic diversion using pre-planned alternate routes has been used as a strategy to reduce traffic delays due to major traffic incidents. However, it is not always beneficial to divert traffic when an incident occurs. Route diversion may adversely impact traffic on the alternate routes and may not result in an overall benefit. This dissertation research attempts to apply Artificial Neural Network (ANN) and Support Vector Regression (SVR) techniques to predict the percent of delay reduction from route diversion to help determine whether traffic should be diverted under given conditions. The DYNASMART-P mesoscopic traffic simulation model was applied to generate simulated data that were used to develop the ANN and SVR models. A sample network that comes with the DYNASMART-P package was used as the base simulation network. A combination of different levels of incident duration, capacity loss, percent of drivers diverted, VMS (variable message sign) messaging duration, and network congestion was simulated to represent different incident scenarios. The resulting percent of delay reduction, average speed, and queue length from each scenario were extracted from the simulation output. The ANN and SVR models were then calibrated for percent of delay reduction as a function of all of the simulated input and output variables. The results show that both the calibrated ANN and SVR models, when applied to the same location used to generate the calibration data, were able to predict delay reduction with a relatively high accuracy in terms of mean square error (MSE) and regression correlation. It was also found that the performance of the ANN model was superior to that of the SVR model. However, when the models were applied to a new location, only the ANN model could produce comparatively good delay reduction predictions under a high network congestion level.
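As a rough illustration of the modelling step described in this abstract, the sketch below fits an ANN (multilayer perceptron) and an SVR to synthetic incident scenarios and compares their test MSE. It assumes scikit-learn and stand-in data generated in place of the DYNASMART-P simulation output; the feature names and the synthetic target are illustrative only, not the study's calibration data.

```python
# Minimal sketch: predicting percent delay reduction from incident features
# with an ANN (MLP) and SVR, assuming scikit-learn and synthetic data in
# place of the DYNASMART-P simulation output described in the abstract.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 500
# Hypothetical inputs: incident duration (min), capacity loss (%), drivers
# diverted (%), VMS messaging duration (min), network congestion level (v/c).
X = np.column_stack([
    rng.uniform(10, 120, n),
    rng.uniform(10, 100, n),
    rng.uniform(0, 50, n),
    rng.uniform(5, 60, n),
    rng.uniform(0.3, 1.2, n),
])
# Synthetic target: percent delay reduction (stand-in for simulation output).
y = 0.3 * X[:, 2] - 0.1 * X[:, 4] * X[:, 1] / 10 + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
svr = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5))

for name, model in [("ANN", ann), ("SVR", svr)]:
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    print(f"{name}: test MSE = {mse:.2f}")
```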
Abstract:
This doctoral dissertation in composition comprises two projects of a different and complementary nature: (1) a theoretical research project on the communication of musical characters; (2) an artistic project built around the composition of three pieces: L'homme à deux têtes, a chamber opera; Un instant dans l'esprit de Lovecraft, for solo clarinet, string orchestra, and percussion; and Balade ornithologique, for wind orchestra. The conception of music as a means of communication is at the origin of the theoretical research, which is motivated by a desire to understand, from the composer's point of view, the strategies for expressing emotions in music. The dissertation addresses communication models, the concept of the virtual character, and the theory of mood contagion. It then details the acoustic cues that lead to the perception of musical characters. All of these notions are illustrated and explored through the composition of miniatures, each with a clearly targeted character. Finally, we propose a system for the musical analysis of characters and emotions, which is applied to the analysis of sections of the pieces composed during this doctoral project. This final chapter highlights the strategies used to create a dramatic discourse while laying out the evocation of different musical characters.
Abstract:
Based on close examinations of instant message (IM) interactions, this chapter argues that an interactional sociolinguistic approach to computer-mediated language use can provide explanations for phenomena that previously could not be accounted for in computer-mediated discourse analysis (CMDA). Drawing on the theoretical framework of relational work (Locher, 2006), the analysis focuses on non-task-oriented talk and its function in forming and establishing communication norms in the team, as well as on micro-level phenomena such as hesitation, backchannel signals, and emoticons. The conclusions of this preliminary research suggest that the linguistic strategies that substitute for audio-visual signals are deployed in discursive functions and play an important role in relational work.
Abstract:
Purpose: This paper extends the use of Radio Frequency Identification (RFID) data to the accounting of warehouse costs and services. The Time Driven Activity Based Costing (TDABC) methodology is enhanced with RFID data collected in real time about the duration of warehouse activities. This allows warehouse managers to obtain accurate and instant calculations of costs. The RFID-enhanced TDABC (RFID-TDABC) is proposed as a novel application of RFID technology. Research Approach: RFID-TDABC is implemented on the warehouse processes of a case study company. The implementation covers receiving, put-away, order picking, and despatching. Findings and Originality: RFID technology is commonly used for the identification and tracking of items. The use of RFID-generated information with TDABC can be successfully extended to the area of costing. This RFID-TDABC costing model will benefit warehouse managers with accurate and instant calculations of costs. Research Impact: There are still unexplored benefits to RFID technology in its applications in warehousing and the wider supply chain. A multi-disciplinary research approach led to combining RFID technology and the TDABC accounting method in order to propose RFID-TDABC. Combining methods and theories from different fields with RFID may lead researchers to develop new techniques such as the RFID-TDABC presented in this paper. Practical Impact: The RFID-TDABC concept will be of value to practitioners by showing how warehouse costs can be accurately measured using this approach. A better understanding of incurred costs may result in further optimisation of warehousing operations and lower activity costs, and thus allow competitive pricing for customers. RFID-TDABC can also be applied in the wider supply chain.
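To make the costing idea concrete, here is a minimal sketch of TDABC driven by RFID-style timestamps: each activity's cost is its measured duration multiplied by a capacity cost rate. All names, rates, and durations are hypothetical illustrations, not figures from the case study company.

```python
# Minimal sketch of Time Driven Activity Based Costing (TDABC) fed by
# RFID-style timestamps. All rates, durations, and activity names are
# hypothetical illustrations, not values from the case-study warehouse.
from dataclasses import dataclass

@dataclass
class ActivityEvent:
    activity: str       # e.g. "receiving", "put-away", "order picking", "despatching"
    start_ts: float     # RFID read at the start of the activity (seconds)
    end_ts: float       # RFID read at the end of the activity (seconds)

def tdabc_cost(events, capacity_cost_rate_per_min):
    """Cost each activity as (measured duration in minutes) x (capacity cost rate)."""
    costs = {}
    for e in events:
        minutes = (e.end_ts - e.start_ts) / 60.0
        costs[e.activity] = costs.get(e.activity, 0.0) + minutes * capacity_cost_rate_per_min
    return costs

# Example: three RFID-timed activities and an assumed rate of 0.80 per minute.
events = [
    ActivityEvent("receiving",     0,    600),   # 10 min
    ActivityEvent("put-away",      600,  1500),  # 15 min
    ActivityEvent("order picking", 1500, 1980),  # 8 min
]
print(tdabc_cost(events, capacity_cost_rate_per_min=0.80))
```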
Abstract:
Sudden changes in the stiffness of a structure are often indicators of structural damage. Detection of such sudden stiffness change from the vibrations of structures is important for Structural Health Monitoring (SHM) and damage detection. Non-contact measurement of these vibrations is a quick and efficient way for successful detection of sudden stiffness change of a structure. In this paper, we demonstrate the capability of Laser Doppler Vibrometry to detect sudden stiffness change in a Single Degree Of Freedom (SDOF) oscillator within a laboratory environment. The dynamic response of the SDOF system was measured using a Polytec RSV-150 Remote Sensing Vibrometer. This instrument employs Laser Doppler Vibrometry for measuring dynamic response. Additionally, the vibration response of the SDOF system was measured through a MicroStrain G-Link Wireless Accelerometer mounted on the SDOF system. The stiffness of the SDOF system was experimentally determined through calibrated linear springs. The sudden change of stiffness was simulated by introducing the failure of a spring at a certain instant in time during a given period of forced vibration. The forced vibration on the SDOF system was in the form of a white noise input. The sudden change in stiffness was successfully detected through the measurements using Laser Doppler Vibrometry. This detection from optically obtained data was compared with a detection using data obtained from the wireless accelerometer. The potential of this technique is deemed important for a wide range of applications. The method is observed to be particularly suitable for rapid damage detection and health monitoring of structures under a model-free condition or where information related to the structure is not sufficient.
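As a rough, hedged illustration of the underlying problem (not of the instrumentation or the detection method used in the paper), the sketch below simulates an SDOF oscillator under white-noise forcing whose stiffness halves at a known instant, then flags the change by scanning the response for a jump in variance. All parameters are illustrative.

```python
# Rough illustration: simulate a single degree of freedom (SDOF) oscillator
# under white-noise forcing whose stiffness halves at a known instant, then
# flag the change from the response alone. Parameters are illustrative, not
# those of the laboratory rig or the detection method used in the paper.
import numpy as np

def simulate_sdof(m=1.0, k1=1000.0, k2=500.0, c=10.0, t_change=5.0,
                  dt=0.001, duration=10.0, seed=0):
    rng = np.random.default_rng(seed)
    n = int(duration / dt)
    t = np.arange(n) * dt
    force = rng.normal(0.0, 5.0, n)            # white-noise input
    x, v = 0.0, 0.0
    disp = np.empty(n)
    for i in range(n):
        k = k1 if t[i] < t_change else k2      # sudden stiffness loss (spring failure)
        a = (force[i] - c * v - k * x) / m
        v += a * dt                            # semi-implicit Euler step
        x += v * dt
        disp[i] = x
    return t, disp

def detect_change(t, disp, window=1000, step=50):
    """Return the instant that maximizes the variance ratio between the
    window just after and the window just before each candidate time."""
    best_t, best_ratio = t[window], 0.0
    for i in range(window, len(disp) - window, step):
        ratio = np.var(disp[i:i + window]) / (np.var(disp[i - window:i]) + 1e-12)
        if ratio > best_ratio:
            best_ratio, best_t = ratio, t[i]
    return best_t

t, disp = simulate_sdof()
print(f"Estimated change instant: {detect_change(t, disp):.2f} s (true change at 5.00 s)")
```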
Abstract:
Flame retardants (FRs) are added to materials to enhance the fire safety level of readily combustible polymers. Although they have been purported to aid in preventing fires in some cases, they have also become a significant cause for concern given the vast body of data on environmental persistence and adverse health effects in humans and animals. Evidence since the 1980s has shown that Canadians, Americans, and Europeans have detectable levels of FRs in their bodies. North Americans in particular have high levels of these chemicals due to stringent flammability standards and the greater use of polybrominated diphenyl ethers (PBDEs) in North America compared with Europe. FRs have been detected in household dust, and some evidence suggests that TVs could be a significant source of exposure to FRs. It is imperative to revisit the flammability standard (UL94V) that allows for FR use in the plastic materials of TVs by providing a risk-versus-benefit analysis to determine whether this standard provides a fire safety benefit and whether it plays a major role in FR exposure. This report first examines the history of televisions and the progression to the UL94V flammability test standard to understand why FRs were first added to the polymers used in manufacturing TVs. This is shown to be due to the fire hazards resulting from the use of plastic materials in cathode-ray tube (CRT) TVs, which had an "instant-on" feature and high voltages and operating temperatures. In providing a risk-versus-benefit analysis, this paper argues that 1) based on a market survey, the current flammability test standard (UL94V) is outdated and lacks relevance to current technology, as flat, thin, energy-efficient Liquid Crystal Displays (LCDs) dominate over the traditionally used heavy, bulky, and energy-intensive CRTs; 2) FRs do not impart fire safety benefits, given the lack of a valid fire safety concern, such as reduced internal and external ignition and fire hazard, and the lack of valid fire data and hazard for television fires in general; and finally 3) the standard is overly stringent, as it does not consider the risk of exposure to FRs in household dust arising from the proliferation and greater use of televisions in households. Therefore, this report argues that the UL94V standard has become trapped in history and needs to be updated, as it may play a major role in FR exposure.
Abstract:
PURPOSE: Radiation therapy is used to treat cancer using carefully designed plans that maximize the radiation dose delivered to the target and minimize damage to healthy tissue, with the dose administered over multiple sessions. Creating treatment plans is a laborious process and presents an obstacle to more frequent replanning, which remains an unsolved problem. Between new plans being created, however, the patient's anatomy can change due to multiple factors, including reduction in tumor size and loss of weight, which results in poorer patient outcomes. Cloud computing is a newer technology that is slowly being adopted for medical applications with promising results. The objective of this work was to design and build a system that could analyze a database of previously created treatment plans, stored with their associated anatomical information in studies, to find the one with the anatomy most similar to that of a new patient. The analyses would be performed in parallel on the cloud to decrease the time needed to find this plan. METHODS: The system used SlicerRT, a radiation therapy toolkit for the open-source platform 3D Slicer, for its tools to perform the similarity analysis algorithm. Amazon Web Services was used for the cloud instances on which the analyses were performed, as well as for storage of the radiation therapy studies and messaging between the instances and a master local computer. A module was built in SlicerRT to provide the user with an interface to direct the system on the cloud, as well as to perform other related tasks. RESULTS: The cloud-based system outperformed previous methods of conducting the similarity analyses in terms of time, as it analyzed 100 studies in approximately 13 minutes, and it produced the same similarity values as those methods. It also scaled up to larger numbers of studies in the database with a small increase in computation time of just over 2 minutes. CONCLUSION: This system successfully analyzes a large database of radiation therapy studies and finds the one that is most similar to a new patient, which represents a potential step forward in achieving feasible adaptive radiation therapy replanning.
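The sketch below only illustrates the fan-out/fan-in pattern implied by this abstract: score every stored study against the new patient in parallel and return the most similar one. The real system runs SlicerRT analyses on AWS instances coordinated from a local master computer; here a stand-in similarity function and local worker processes are assumed.

```python
# Minimal sketch of the fan-out/fan-in pattern: score every stored study
# against the new patient in parallel and return the best match. The real
# system uses SlicerRT on AWS instances; a stand-in similarity function and
# local worker processes are used here purely for illustration.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def similarity(new_patient_features, study_features):
    """Stand-in similarity: negative Euclidean distance between feature vectors
    (the real analysis compares anatomical structures, not flat vectors)."""
    return -float(np.linalg.norm(new_patient_features - study_features))

def score_study(args):
    study_id, study_features, new_patient_features = args
    return study_id, similarity(new_patient_features, study_features)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    new_patient = rng.normal(size=16)                      # hypothetical anatomy descriptor
    database = {f"study_{i:03d}": rng.normal(size=16) for i in range(100)}

    tasks = [(sid, feats, new_patient) for sid, feats in database.items()]
    with ProcessPoolExecutor() as pool:                    # fan out across workers
        scores = dict(pool.map(score_study, tasks))

    best = max(scores, key=scores.get)                     # fan in: pick the most similar plan
    print(f"Most similar study: {best} (score {scores[best]:.3f})")
```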
Abstract:
This essay deals with the written conversation that young people carry on today in their chats. The different linguistic varieties involved are discussed. The writing of chats amounts to a conversation: written texts become oral texts, conversations are transcribed, and linguistic norms are broken. This does not mean that young people do not know what those norms are; in chats, they are simply not interested in them.
Abstract:
The Mariological thesis of the conceptio per aurem, according to which the Virgin Mary conceived Jesus Christ through the ear at the moment she heard from the angel the celestial message announcing that, without losing her virginity, she would become the mother of the incarnate Son of God, has so far received very few academic studies rigorously grounded in primary sources. In fact, references to this theory are very scarce in the specialized literature, and when a scholar does evoke it, he or she is almost always content merely to allude to it, without providing documentary evidence. However, as the nine Italian paintings analyzed here reveal, this theory was illustrated through subtle visual metaphors in many medieval pictorial works, which drew on a solid literary tradition. Moreover, a host of Church Fathers and medieval theologians testify, through explicit statements, that this theory enjoyed notable acceptance among the masters of Christian thought. Drawing on numerous patristic and theological texts, this article pursues two essential objectives: first, to set out the various theoretical formulations proposed by these thinkers; and second, to bring to light the dogmatic meanings underlying this surprising thesis.
Abstract:
Available under the GNU Lesser General Public License (LGPL3)
Abstract:
This work explores the development of MemTri, a memory forensics triage tool that can assess the likelihood of criminal activity in a memory image, based on evidence data artefacts generated by several applications. Fictitious illegal suspect activity scenarios were performed on virtual machines to generate 60 test memory images for input into MemTri. Four categories of applications (i.e. Internet Browsers, Instant Messengers, FTP Clients and Document Processors) are examined for data artefacts located through the use of regular expressions. These identified data artefacts are then analysed using a Bayesian Network to assess the likelihood that a seized memory image contained evidence of illegal activity. Currently, MemTri is under development, and this paper introduces only the basic concept as well as the components that the application is built on. A complete description of MemTri, coupled with extensive experimental results, is expected to be published in the first semester of 2017.
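A minimal sketch of the two-stage idea follows, assuming illustrative regular expressions and probabilities: artefacts are located with regexes, and the findings are combined into a likelihood score. A naive Bayes-style odds update stands in for MemTri's actual Bayesian Network.

```python
# Minimal sketch of the two-stage idea: scan a memory dump for application
# artefacts with regular expressions, then combine the findings into a
# likelihood score. The patterns and probabilities are illustrative only, and
# a naive Bayes-style update stands in for MemTri's actual Bayesian Network.
import re

# Hypothetical artefact patterns per application category.
ARTEFACT_PATTERNS = {
    "browser_search": re.compile(rb"q=([\w\+%]{4,})"),
    "im_message":     re.compile(rb"\"body\"\s*:\s*\"([^\"]{1,200})\""),
    "ftp_command":    re.compile(rb"(?:STOR|RETR)\s+\S+"),
}

# Illustrative likelihoods: P(artefact present | illegal), P(artefact present | legal).
LIKELIHOODS = {
    "browser_search": (0.80, 0.60),
    "im_message":     (0.70, 0.50),
    "ftp_command":    (0.60, 0.20),
}

def triage_score(image_bytes, prior_illegal=0.5):
    """Posterior probability of illegal activity given which artefacts appear."""
    odds = prior_illegal / (1.0 - prior_illegal)
    for name, pattern in ARTEFACT_PATTERNS.items():
        p_ill, p_leg = LIKELIHOODS[name]
        if pattern.search(image_bytes):
            odds *= p_ill / p_leg              # artefact found: update odds upward
        else:
            odds *= (1 - p_ill) / (1 - p_leg)  # artefact absent: update odds downward
    return odds / (1.0 + odds)

# Example on a tiny synthetic "memory image".
sample = b'...GET /search?q=fake+passport... "body":"meet at 9" ...STOR dump.zip...'
print(f"Likelihood of illegal activity: {triage_score(sample):.2f}")
```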
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
We report on the development of a Java-based application devised to support collaborative learning of Art concepts and ideas over the Internet. Starting from an examination of the pedagogy of both Art education and collaborative learning, we propose principles that are useful for the design and construction of a "lightweight" software application that supports interactive Art learning in groups. This application makes the "dynamics" of an art work explicit and supports group interaction with simple messaging and "chat" facilities. It may be used to facilitate the learning and teaching of Art, but also as a research tool to investigate the learning of Art and the development and dynamics of collaborating groups. An evaluation of a pilot study of the use of our system with a group of 20 school children is presented.
Abstract:
In a context where the use of an antibiotic drives up bacterial resistance, this thesis evaluates, theoretically and through numerical simulation, how a monopolist maximizes its profit after obtaining a patent. From a theoretical standpoint, the monopolist seeks both to maximize its profit at every instant and to maximize its profit over the entire lifetime of the patent. In the numerical simulation, the monopolist's valuation of its antibiotic proves to be the central point of the analysis: if that valuation is high, the monopolist preserves the effectiveness of its antibiotic by reducing the quantity produced; otherwise, the monopolist produces a larger quantity and preserves less of its antibiotic's effectiveness.
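A minimal numerical sketch of the trade-off described above, under assumed functional forms (linear demand scaled by remaining effectiveness, exponential erosion of effectiveness with use): comparing constant-output policies over the patent life stands in for the thesis's dynamic optimization, and all parameters are illustrative.

```python
# Minimal sketch of the trade-off described above: a monopolist chooses a
# production path over the patent's lifetime, and higher use erodes the
# antibiotic's effectiveness, which in turn lowers willingness to pay. The
# functional forms and parameters are illustrative, not the thesis's model.
import numpy as np

def simulate(quantity_path, effectiveness0=1.0, decay=0.08, unit_cost=0.2,
             discount=0.95, demand_intercept=2.0, demand_slope=0.5):
    """Discounted profit over the patent life for a given production path."""
    effectiveness, total = effectiveness0, 0.0
    for t, q in enumerate(quantity_path):
        price = max(demand_intercept * effectiveness - demand_slope * q, 0.0)
        total += (discount ** t) * (price - unit_cost) * q
        effectiveness *= np.exp(-decay * q)   # resistance rises with cumulative use
    return total

T = 20                                        # patent lifetime (periods)
grid = np.linspace(0.1, 3.0, 30)              # constant-output policies to compare
profits = {q: simulate([q] * T) for q in grid}
best_q = max(profits, key=profits.get)
print(f"Best constant output ~ {best_q:.2f}, discounted profit {profits[best_q]:.2f}")
```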