926 results for automatic affect analysis
Abstract:
Social media are increasingly being recruited into care practices in mental health. This paper analyses how a major new mental health social media site (www.elefriends.org.uk) is used when trying to manage the impact of psychiatric medication on the body. Drawing on Henri Bergson's concept of affection, analysis shows that Elefriends is used at particular moments of reconfiguration (e.g. change in dosage and/or medication), periods of self-experimentation (when people tailor their regimen by altering prescriptions or ceasing medication) and when dealing with a present bodily concern (showing how members have a direct, immediate relationship with the site). In addition, analysis illustrates how users face having to structure their communication to try to avoid 'triggering' distress in others. The paper concludes by pointing to the need to focus on the multiple emerging relationships between bodies and social media in mental health due to the ways the latter are becoming increasingly prominent technologies through which to experience the body when distressed.
Abstract:
Spam emails (unsolicited or junk mail) impose extremely heavy annual costs in terms of time, storage space and money on both private users and companies. To fight the spam problem effectively, it is not enough to stop spam messages from being delivered to the user's inbox. It is necessary either to try to find and prosecute the spammers, who usually hide behind complex networks of infected devices, or to analyse spammer behaviour in order to devise appropriate defence strategies. Such a task is difficult, however, because of camouflage techniques, and it requires a manual analysis of correlated spam messages to identify the spammers. To facilitate this analysis, which has to be carried out on large amounts of unclassified email, we propose a categorical clustering methodology, named CCTree, that divides a large volume of spam into campaigns on the basis of their structural similarity. We demonstrate the effectiveness and efficiency of the proposed clustering algorithm through several experiments. A self-learning approach is then proposed to label spam campaigns according to the spammers' goal, for example phishing. The labelled spam campaigns are used to train a classifier that can be applied to classify new spam emails. In addition, the labelled campaigns, together with a set of four other ranking criteria, are ordered according to investigators' priorities. Finally, a semiring-based structure is proposed as an abstract representation of CCTree. This abstract scheme of CCTree, named CCTree term, is applied to formalise the parallelisation of CCTree. Through a number of mathematical analyses and experimental results, we show the efficiency and effectiveness of the proposed framework.
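As a rough, hedged illustration of the clustering step described above, and not the actual CCTree specification, the following Python sketch recursively splits messages, represented as categorical structural features, on the most heterogeneous attribute until a purity threshold or minimum node size is reached. The feature names, thresholds and stopping rule are illustrative assumptions.

import math
from collections import Counter

def shannon_entropy(values):
    counts = Counter(values)
    total = len(values)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def cctree_split(records, attributes, purity=0.5, min_size=10):
    """Recursively split records (dicts of categorical features) into candidate campaigns."""
    if len(records) <= min_size or not attributes:
        return [records]                                  # leaf = one candidate campaign
    entropies = {a: shannon_entropy([r[a] for r in records]) for a in attributes}
    if sum(entropies.values()) / len(attributes) <= purity:
        return [records]                                  # node is already homogeneous enough
    split_attr = max(entropies, key=entropies.get)        # most heterogeneous attribute
    groups = {}
    for r in records:
        groups.setdefault(r[split_attr], []).append(r)
    remaining = [a for a in attributes if a != split_attr]
    clusters = []
    for g in groups.values():
        clusters.extend(cctree_split(g, remaining, purity, min_size))
    return clusters

# Illustrative structural features of spam messages (hypothetical)
spam = [{"lang": "en", "has_link": "yes", "attach": "none"},
        {"lang": "en", "has_link": "yes", "attach": "none"},
        {"lang": "fr", "has_link": "no",  "attach": "zip"}]
campaigns = cctree_split(spam, ["lang", "has_link", "attach"], min_size=1)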
Abstract:
Background and Problem: Sustainability reporting is of growing interest in today's organizations, and it is becoming essential to report on non-financial matters. Many of the existing frameworks have been criticized for being used only for symbolic reasons, which is why the concept of integrated reporting and the <IR> framework have been developed. One of the cornerstones of the <IR> framework is human capital, which is one of the most valuable assets in an organization. Traditionally, employee costs have been treated merely as an expense, and there have been limited disclosures in corporate reports. In the current business world they are instead seen as an investment in human resources. Since previous studies have shown an increase in human capital disclosures when corporate reports become integrated, integrated reporting might be the solution to this problem. Purpose: The purpose of this study is to examine whether there are differences in human capital disclosures between integrated reports and separate annual and sustainability reports in companies listed on OMXS30. Delimitations: This study's empirical examination is limited to the companies listed on Stockholm's OMXS30. Only corporate reports issued for the year 2014 are treated. Methodology: For this study, a self-constructed disclosure scoreboard with human capital-related items has been used to collect data from the companies' corporate reports. Additional information beyond the pre-determined items has also been collected to extend the data collection. Empirical Results and Conclusion: The results show that human capital appears to be a subject that is reported on relatively little. The integrated reporting companies do not disclose more information than non-integrated reporting companies. However, the results show that integrated reporting companies seem to have a more future-oriented focus and that their disclosures are more dispersed throughout the reports. It can be concluded that company sector and size affect neither the amount nor the type of information disclosed.
Abstract:
AZEVEDO, George Dantas de et al. Raloxifene therapy does not affect uterine blood flow in postmenopausal women: a transvaginal Doppler study. Maturitas, Amsterdam, v.47, n.3, p.195-200, 2004
Abstract:
With the world of professional sports shifting towards better sport analytics, the demand for vision-based performance analysis has grown rapidly in recent years. In addition, the nature of many sports does not allow the use of sensors or other wearable markers attached to players for monitoring their performance during competition. This creates a potential application for systematic observations, such as tracking information about the players, to help coaches develop the visual skills and perceptual awareness needed to make decisions about team strategy or training plans. My PhD project is part of a larger ongoing project between sport scientists and computer scientists, also involving industry partners and sports organisations. The overall idea is to investigate the contribution technology can make to the analysis of sports performance, using team sports such as rugby, football or hockey as examples. A particular focus is on vision-based tracking, so that information about the location and dynamics of the players can be gained without any additional sensors on the players. To start with, prior approaches to visual tracking are extensively reviewed and analysed. In this thesis, methods are proposed to deal with the difficulties of visual tracking, in particular target appearance changes caused by intrinsic factors (e.g. pose variation) and extrinsic factors such as occlusion. This analysis highlights the importance of the proposed visual tracking algorithms, which address these challenges and provide robust and accurate frameworks to estimate the target state in complex tracking scenarios such as a sports scene, thereby facilitating the tracking process. Next, a framework for continuously tracking multiple targets is proposed. Compared to single-target tracking, multi-target tracking, such as tracking the players on a sports field, poses an additional difficulty, namely data association, which needs to be addressed. Here, the aim is to locate all targets of interest, infer their trajectories and decide which observation corresponds to which target trajectory. In this thesis, an efficient framework is proposed to handle this particular problem, especially in sports scenes, where the players of the same team tend to look similar and exhibit complex interactions and unpredictable movements, resulting in matching ambiguity between the players. The presented approach is also evaluated on different sports datasets and shows promising results. Finally, information from the proposed tracking system is utilised as the basic input for further higher-level performance analysis, such as tactics and team formations, which can help coaches to design a better training plan. Due to the continuous nature of many team sports (e.g. soccer, hockey), it is not straightforward to infer high-level team behaviours such as players' interactions. The proposed framework relies on two distinct levels of performance analysis: low-level performance analysis, such as identifying players' positions on the playing field, and high-level analysis, where the aim is to estimate the density of player locations or to detect their possible interaction groups. The related experiments show that the proposed approach can effectively exploit this high-level information, which has many potential applications.
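As a hedged illustration of the data-association step discussed above, and not the framework proposed in the thesis, the sketch below matches existing track positions to new detections with the Hungarian algorithm over a Euclidean cost matrix; the coordinates and the distance gate are hypothetical.

import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(track_positions, detections, max_dist=50.0):
    """Match existing tracks to new detections by minimising total Euclidean distance.

    track_positions, detections: arrays of shape (T, 2) and (D, 2) with (x, y)
    pixel coordinates. Returns matched (track_idx, det_idx) pairs plus the
    indices of unmatched tracks and detections.
    """
    if len(track_positions) == 0 or len(detections) == 0:
        return [], list(range(len(track_positions))), list(range(len(detections)))
    cost = np.linalg.norm(track_positions[:, None, :] - detections[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)              # Hungarian algorithm
    matches = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]
    matched_t = {r for r, _ in matches}
    matched_d = {c for _, c in matches}
    unmatched_tracks = [t for t in range(len(track_positions)) if t not in matched_t]
    unmatched_dets = [d for d in range(len(detections)) if d not in matched_d]
    return matches, unmatched_tracks, unmatched_dets

# Hypothetical player positions (pixels) in consecutive frames
tracks = np.array([[100.0, 200.0], [340.0, 410.0]])
dets = np.array([[102.0, 198.0], [600.0, 50.0], [338.0, 415.0]])
print(associate(tracks, dets))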
Abstract:
The use of human brain electroencephalography (EEG) signals for automatic person identification has been investigated for a decade. It has been found that the performance of an EEG-based person identification system depends strongly on which features are extracted from multi-channel EEG signals. Linear methods such as Power Spectral Density and the Autoregressive Model have been used to extract EEG features. However, these methods assume that EEG signals are stationary. In fact, EEG signals are complex, non-linear, non-stationary, and random in nature. In addition, other factors such as brain condition or human characteristics may have an impact on performance; however, these factors have not been investigated and evaluated in previous studies. It has been found in the literature that entropy is used to measure the randomness of non-linear time series data. Entropy is also used to measure the level of chaos in brain-computer interface systems. Therefore, this thesis proposes to study the role of entropy in non-linear analysis of EEG signals to discover new features for EEG-based person identification. Five different entropy methods, including Shannon Entropy, Approximate Entropy, Sample Entropy, Spectral Entropy, and Conditional Entropy, have been proposed to extract entropy features that are used to evaluate the performance of EEG-based person identification systems and the impacts of epilepsy, alcohol, age and gender characteristics on these systems. Experiments were performed on the Australian EEG and Alcoholism datasets. Experimental results have shown that, in most cases, the proposed entropy features yield very fast person identification, yet with comparable accuracy, because the feature dimension is low. In real-life security operations, a timely response is critical. The experimental results have also shown that epilepsy, alcohol, age and gender characteristics have impacts on EEG-based person identification systems.
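For concreteness, a minimal sketch of two of the entropy measures named above, Shannon entropy over the amplitude histogram and a naive sample entropy, applied per channel is given below; the epoch shape, histogram bin count and tolerance r = 0.2 * std are illustrative defaults, not the settings used in the thesis.

import numpy as np

def shannon_entropy(signal, bins=32):
    """Shannon entropy of the amplitude distribution of one EEG channel."""
    hist, _ = np.histogram(signal, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))

def sample_entropy(signal, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D signal (naive O(N^2) version)."""
    x = np.asarray(signal, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def count_matches(dim):
        # use the same number of templates (len(x) - m) for both dimensions
        templates = np.array([x[i:i + dim] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += int(np.sum(dist <= r)) - 1          # exclude the self-match
        return count
    b, a = count_matches(m), count_matches(m + 1)
    return float(-np.log(a / b)) if a > 0 and b > 0 else float("inf")

# Illustrative: one entropy feature vector per multi-channel EEG epoch
rng = np.random.default_rng(0)
epoch = rng.standard_normal((8, 512))                    # 8 channels, 512 samples
features = [f(ch) for ch in epoch for f in (shannon_entropy, sample_entropy)]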
Abstract:
We propose a study of the mathematical properties of voice as an audio signal. This work includes signals in which the channel conditions are not ideal for emotion recognition. Multiresolution analysis (the discrete wavelet transform) was performed using the Daubechies wavelet family (Db1-Haar, Db6, Db8, Db10), allowing the decomposition of the initial audio signal into sets of coefficients from which a set of features was extracted and analyzed statistically in order to differentiate emotional states. ANNs proved to be a system that allows an appropriate classification of such states. This study shows that the features extracted using wavelet decomposition are sufficient to analyze and extract emotional content in audio signals, achieving a high accuracy rate in the classification of emotional states without the need for other kinds of classical frequency-time features. Accordingly, this paper seeks to characterize mathematically the six basic emotions in humans: boredom, disgust, happiness, anxiety, anger and sadness; neutrality is also included, for a total of seven states to identify.
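A minimal sketch of this kind of wavelet feature extraction is shown below, assuming the PyWavelets library; the particular statistics per coefficient band and the frame length are illustrative, not the exact feature set of the study.

import numpy as np
import pywt   # PyWavelets

def wavelet_features(audio, wavelet="db6", level=4):
    """Statistical features from a discrete wavelet decomposition of an audio frame."""
    coeffs = pywt.wavedec(audio, wavelet, level=level)   # [cA_n, cD_n, ..., cD_1]
    feats = []
    for band in coeffs:
        feats += [band.mean(), band.std(),
                  np.abs(band).max(), float(np.sum(band ** 2))]   # band energy
    return np.array(feats)

# Illustrative 16 kHz frame; in practice frames would come from labelled emotional speech
frame = np.random.default_rng(1).standard_normal(16000)
x = wavelet_features(frame, wavelet="db8", level=5)
# x would then feed an ANN classifier over the seven emotional states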
Abstract:
BACKGROUND: Invasive meningococcal disease is a significant cause of mortality and morbidity in the UK. Administration of chemoprophylaxis to close contacts reduces the risk of a secondary case. However, unnecessary chemoprophylaxis may be associated with adverse reactions, increased antibiotic resistance and removal of organisms, such as Neisseria lactamica, which help to protect against meningococcal disease. Limited evidence exists to suggest that overuse of chemoprophylaxis may occur. This study aimed to evaluate prescribing of chemoprophylaxis for contacts of meningococcal disease by general practitioners and hospital staff. METHODS: A retrospective case note review of cases of meningococcal disease was conducted in one health district from 1st September 1997 to 31st August 1999. Routine hospital and general practitioner prescribing data were searched for chemoprophylactic prescriptions of rifampicin and ciprofloxacin. A questionnaire survey of general practitioners was undertaken to obtain more detailed information. RESULTS: Prescribing by hospital doctors was in line with recommendations by the Consultant for Communicable Disease Control. General practitioners prescribed 118% more chemoprophylaxis than was recommended. Size of practice and training status did not affect the level of additional prescribing, but there were significant differences by geographical area. The highest levels of prescribing occurred in areas with high disease rates and associated publicity. However, some true close contacts did not appear to receive prophylaxis. CONCLUSIONS: Receipt of chemoprophylaxis is affected by a series of patient, doctor and community interactions. High publicity appears to increase demand for prophylaxis. Some true contacts do not receive appropriate chemoprophylaxis and are left at an unnecessarily increased risk.
Abstract:
Sharpening is a powerful image transformation because sharp edges can bring out image details. Sharpness is achieved by increasing local contrast and reducing edge widths. We present a method that enhances the sharpness of images and thereby their perceptual quality. Most existing enhancement techniques require user input to improve the perception of the scene in the manner most pleasing to the particular user. Our goal is to improve the perception of sharpness in digital images for human viewers. We consider two parameters in order to exaggerate the differences between local intensities: local contrast and edge width. We start from the assumption that color, texture, or objects of focus such as faces affect the human perception of photographs. When human raters were presented with a collection of images of differing sharpness and asked to rank them according to perceived sharpness, the results showed a statistical consensus among the raters. We introduce a ramp enhancement technique that modifies the optimal overshoot in the ramp for different region contrasts as well as the new ramp width. Optimal parameter values are searched for and applied to regions according to the criteria mentioned above. In this way, we aim to enhance digital images automatically to create pleasing output for common users.
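The ramp-enhancement technique itself is not reproduced here, but as a generic, hedged illustration of the underlying idea of boosting local contrast around edges, the sketch below applies simple unsharp masking, where the amount parameter loosely plays the role of overshoot and sigma the role of ramp width.

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=1.5, amount=0.8):
    """Generic sharpening: add back a scaled high-frequency residual."""
    img = image.astype(float)
    blurred = gaussian_filter(img, sigma=sigma)
    residual = img - blurred                 # local detail around edges
    sharpened = img + amount * residual      # overshoot-like boost of local contrast
    return np.clip(sharpened, 0, 255).astype(np.uint8)

# Hypothetical grayscale image
img = (np.random.default_rng(2).random((128, 128)) * 255).astype(np.uint8)
out = unsharp_mask(img, sigma=2.0, amount=1.0)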
Abstract:
A lean muscle line (L) and a fat muscle line (F) of rainbow trout were established (Quillet et al., 2005) by two-way selection for muscle lipid content performed on pan-size rainbow trout using a non-destructive measurement of muscle lipid content (Distell Fish Fat Meter®). The aim of the present study was to evaluate the consequences of this selective breeding on the flesh quality of pan-size (290 g) diploid and triploid trout after three generations of selection. Instrumental evaluation of fillet color and pH measurement were performed at slaughter. Flesh color, pH, dry matter content and mechanical resistance were measured at 48 h and 96 h postmortem on raw and cooked flesh, respectively. A sensorial profile analysis was performed on cooked fillets. Fillets from the selected fatty muscle line (F) had a higher dry matter content and were more colorful, for both raw and cooked fillets. Mechanical evaluation indicated a tendency for raw flesh from F fish to be less firm, but this was not confirmed after cooking, either instrumentally or by sensory analysis. The sensory analysis revealed higher fat loss, a higher intensity of cooked-potato flavor, higher exudation, higher moisture content and a fattier film left on the tongue for flesh from F fish. Triploid fish had mechanically softer raw and cooked fillets, but the difference was not perceived by the sensorial panel. The sensorial evaluation also revealed a lower global intensity of odor, more exudation and a higher moisture content in the fillets from triploid fish. These differences in quality parameters among groups of fish were associated with larger white muscle fibers in F fish and in triploid fish. The data provide additional information about the relationship between muscle fat content, muscle cellularity and flesh quality.
Abstract:
With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is not the case for software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select, among the set of all possible common subsequences, those with a high likelihood of containing the root cause. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the end to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines in order to find the root cause of a fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
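As a hedged sketch of one building block of such an approach, the code below computes the longest common subsequence of two execution traces by dynamic programming and folds it over a set of failing-test sequence covers; the trace contents are hypothetical, and the thesis's optimized algorithm and selection heuristics are not reproduced.

from functools import reduce

def lcs(a, b):
    """Longest common subsequence of two execution traces (lists of code-line ids)."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = dp[i][j] + 1 if a[i] == b[j] else max(dp[i][j + 1], dp[i + 1][j])
    # backtrack to recover one common subsequence
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

# Hypothetical sequence covers of three failing test cases (method/line ids)
covers = [["init", "parse", "lookup", "render", "close"],
          ["init", "lookup", "render", "log", "close"],
          ["init", "parse", "lookup", "render"]]
suspect = reduce(lcs, covers)     # candidate faulty execution path fragment
print(suspect)                    # ['init', 'lookup', 'render']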
Abstract:
We develop the a-posteriori error analysis of hp-version interior-penalty discontinuous Galerkin finite element methods for a class of second-order quasilinear elliptic partial differential equations. Computable upper and lower bounds on the error are derived in terms of a natural (mesh-dependent) energy norm. The bounds are explicit in the local mesh size and the local degree of the approximating polynomial. The performance of the proposed estimators within an automatic hp-adaptive refinement procedure is studied through numerical experiments.
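Although the precise estimator terms are defined in the paper, residual-based bounds of this kind typically take the following schematic form in the mesh-dependent energy norm, which is what "explicit in the local mesh size and polynomial degree" means; the residual terms and hidden constants below are generic placeholders, not the paper's estimator:

\[
\| u - u_h \|_{E}^{2} \;\lesssim\; \sum_{K \in \mathcal{T}_h} \eta_K^{2},
\qquad
\eta_K^{2} \;=\; \frac{h_K^{2}}{p_K^{2}} \, \| R_K \|_{L^2(K)}^{2}
\;+\; \frac{h_K}{p_K} \, \| R_{\partial K} \|_{L^2(\partial K)}^{2},
\]

where \(h_K\) is the local mesh size, \(p_K\) the local polynomial degree, \(R_K\) the element residual and \(R_{\partial K}\) the face (jump) residual; a companion local lower bound of the form \(\eta_K^{2} \lesssim \| u - u_h \|_{E(\omega_K)}^{2} + \text{data oscillation}\) makes such an estimator efficient as well as reliable.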
Abstract:
Many existing encrypted Internet protocols leak information through packet sizes and timing. Though seemingly innocuous, prior work has shown that such leakage can be used to recover part or all of the plaintext being encrypted. The prevalence of encrypted protocols as the underpinning of such critical services as e-commerce, remote login, and anonymity networks and the increasing feasibility of attacks on these services represent a considerable risk to communications security. Existing mechanisms for preventing traffic analysis focus on re-routing and padding. These prevention techniques have considerable resource and overhead requirements. Furthermore, padding is easily detectable and, in some cases, can introduce its own vulnerabilities. To address these shortcomings, we propose embedding real traffic in synthetically generated encrypted cover traffic. Novel to our approach is our use of realistic network protocol behavior models to generate cover traffic. The observable traffic we generate also has the benefit of being indistinguishable from other real encrypted traffic further thwarting an adversary's ability to target attacks. In this dissertation, we introduce the design of a proxy system called TrafficMimic that implements realistic cover traffic tunneling and can be used alone or integrated with the Tor anonymity system. We describe the cover traffic generation process including the subtleties of implementing a secure traffic generator. We show that TrafficMimic cover traffic can fool a complex protocol classification attack with 91% of the accuracy of real traffic. TrafficMimic cover traffic is also not detected by a binary classification attack specifically designed to detect TrafficMimic. We evaluate the performance of tunneling with independent cover traffic models and find that they are comparable, and, in some cases, more efficient than generic constant-rate defenses. We then use simulation and analytic modeling to understand the performance of cover traffic tunneling more deeply. We find that we can take measurements from real or simulated traffic with no tunneling and use them to estimate parameters for an accurate analytic model of the performance impact of cover traffic tunneling. Once validated, we use this model to better understand how delay, bandwidth, tunnel slowdown, and stability affect cover traffic tunneling. Finally, we take the insights from our simulation study and develop several biasing techniques that we can use to match the cover traffic to the real traffic while simultaneously bounding external information leakage. We study these bias methods using simulation and evaluate their security using a Bayesian inference attack. We find that we can safely improve performance with biasing while preventing both traffic analysis and defense detection attacks. We then apply these biasing methods to the real TrafficMimic implementation and evaluate it on the Internet. We find that biasing can provide 3-5x improvement in bandwidth for bulk transfers and 2.5-9.5x speedup for Web browsing over tunneling without biasing.
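As a loose, hypothetical sketch of model-driven cover traffic generation, and not TrafficMimic's actual generator or API, the code below samples packet sizes and inter-arrival times from an assumed protocol behaviour model and embeds real payloads opportunistically; all distributions, names and parameters are illustrative.

import random
import time
from collections import deque

# Hypothetical behaviour model for one protocol state: packet-size buckets and
# mean inter-arrival time (seconds). A real system would fit these from traces.
MODEL = {"sizes": [64, 256, 512, 1460], "size_weights": [0.4, 0.2, 0.1, 0.3],
         "mean_gap": 0.05}

def cover_stream(real_queue, send, model=MODEL, n_packets=100):
    """Emit a packet schedule drawn from the model, piggy-backing real data.

    real_queue: deque of outgoing real payload chunks; send: callable taking
    (payload_bytes, is_cover). Cover packets are sent whenever no real data fits.
    """
    for _ in range(n_packets):
        size = random.choices(model["sizes"], weights=model["size_weights"])[0]
        gap = random.expovariate(1.0 / model["mean_gap"])
        time.sleep(gap)                          # model-driven timing, not data-driven
        if real_queue and len(real_queue[0]) <= size:
            send(real_queue.popleft().ljust(size, b"\0"), False)   # embed real traffic
        else:
            send(random.randbytes(size), True)                     # pure cover packet

# Usage sketch
q = deque([b"GET /index.html", b"POST /login"])
cover_stream(q, lambda payload, is_cover: None, n_packets=10)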
Abstract:
Cultivation of chilling-tolerant ornamental crops at lower temperature could reduce the energy demands of heated greenhouses. To provide a better understanding of how sub-optimal temperatures (12 degrees C vs. 16 degrees C) affect growth of the sensitive Petunia hybrida cultivar 'SweetSunshine Williams', the transcriptome, carbohydrate metabolism, and phytohormone homeostasis were monitored in aerial plant parts over 4 weeks by use of a microarray, enzymatic assays and GC-MS/MS. The data revealed three consecutive phases of chilling response. The first days were marked by a strong accumulation of sugars, particularly in source leaves, preferential up-regulation of genes in the same tissue and down-regulation of several genes in the shoot apex, especially those involved in the abiotic stress response. The midterm phase featured a partial normalization of carbohydrate levels and gene expression. After 3 weeks of chilling exposure, a new stabilized balance was established. Reduced hexose levels in the shoot apex, reduced ratios of sugar levels between the apex and source leaves and a higher apical sucrose/hexose ratio, associated with decreased activity and expression of cell wall invertase, indicate that prolonged chilling induced sugar accumulation in source leaves at the expense of reduced sugar transport to and reduced sucrose utilization in the shoot. This was associated with reduced levels of indole-3-acetic acid and abscisic acid in the apex and high numbers of differentially, particularly up-regulated genes, especially in the source leaves, including those regulating histones, ethylene action, transcription factors, and a jasmonate-ZIM-domain protein. Transcripts of one Jumonji C domain containing protein and one expansin accumulated in source leaves throughout the chilling period. The results reveal a dynamic and complex disturbance of plant function in response to mild chilling, opening new perspectives for the comparative analysis of differently tolerant cultivars.
Abstract:
The present article reports the progress of an ongoing master's dissertation on language engineering. The main goal of the work described here is to infer a programmer's profile through the analysis of his source code. After such analysis, the programmer is placed on a scale that characterizes his language abilities. There are several potential applications for such profiling, namely the evaluation of a programmer's skills and proficiency in a given language, or the continuous evaluation of a student's progress in a programming course. Throughout the course of this project, and as a proof of concept, a tool that allows the automatic profiling of a Java programmer is under development. This tool is also introduced in the paper and its preliminary outcomes are discussed.
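As a hedged sketch of how such profiling could work in principle, the code below counts weighted occurrences of a few Java constructs in a source file and maps the score to a coarse proficiency band; the construct list, weights and bands are illustrative assumptions, not those of the tool under development.

import re

# Illustrative Java constructs, weighted by the sophistication they are assumed to signal
CONSTRUCTS = {
    r"\bfor\s*\(":            1,   # basic iteration
    r"\btry\s*\{":            2,   # error handling
    r"\bimplements\b":        3,   # interfaces
    r"<\w+(\s*,\s*\w+)*>":    4,   # generics
    r"->":                    5,   # lambdas / streams
}
BANDS = [(0, "novice"), (10, "intermediate"), (25, "advanced")]

def profile(java_source: str) -> tuple[int, str]:
    """Return a weighted construct score and a coarse proficiency band."""
    score = sum(w * len(re.findall(p, java_source)) for p, w in CONSTRUCTS.items())
    band = "novice"
    for threshold, label in BANDS:
        if score >= threshold:
            band = label
    return score, band

code = """
public class Demo implements Runnable {
    public void run() {
        java.util.List<String> xs = java.util.List.of("a", "b");
        xs.forEach(x -> System.out.println(x));
    }
}
"""
print(profile(code))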