965 results for End-user enrichments


Relevance: 80.00%

Abstract:

Humans are profoundly affected by the surroundings they inhabit. Environmental psychologists have produced numerous credible theories describing optimal human environments, based on the concept of congruence or "fit" (1, 2). Lack of person/environment fit can lead to stress-related illness and reduced psychosocial well-being (3). Conversely, appropriately designed environments can promote wellness (4) or "salutogenesis" (5). Increasingly, research in Evidence-Based Design, largely concentrated in healthcare architecture, has tended to bear out these theories (6). Patients and long-term care residents, because of injury, illness or physical/cognitive impairment, are less likely to be able to intervene to modify their immediate environment, unless it is designed specifically to facilitate their particular needs. In the context of care settings, the detailed design of personal space therefore takes on enormous significance. MyRoom conceptualises a personalisable room that uses sensing and networked computing to enable the environment to respond directly and continuously to the occupant. Bio-signals collected and relayed to the system actuate applications intended to positively influence user well-being. Drawing on the evidence base for therapeutic design interventions (7), real-time changes in ambient lighting, colour, image, etc. respond continuously to the user's physiological state, optimising congruence. Based on research evidence, consideration is also given to developing an application that uses natural images (8). It is envisaged that actuation will require machine learning based on interpretation of the sensor data; sensing arrangements may vary depending on context and end-user. Such interventions aim to reduce inappropriate stress and provide stimulation, supporting both instrumental and cognitive tasks.
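As a sketch of the kind of closed loop MyRoom implies (not the authors' implementation), the snippet below smooths a toy physiological stress estimate and maps it to ambient lighting; the sensor reader (read_biosignals) and lighting actuator (set_lighting) are hypothetical callbacks.

    # Minimal sketch of a bio-signal -> ambience loop (hypothetical APIs).
    import time

    def stress_index(hr, eda, alpha=0.5):
        """Toy stress estimate in [0, 1] from heart rate and skin conductance."""
        hr_norm = min(max((hr - 60) / 60.0, 0.0), 1.0)    # 60-120 bpm -> 0-1
        eda_norm = min(max(eda / 20.0, 0.0), 1.0)         # 0-20 uS -> 0-1
        return alpha * hr_norm + (1 - alpha) * eda_norm

    def run_loop(read_biosignals, set_lighting, period_s=5.0):
        smoothed = 0.0
        while True:
            hr, eda = read_biosignals()                   # hypothetical sensor read
            smoothed = 0.9 * smoothed + 0.1 * stress_index(hr, eda)
            # Higher stress -> dimmer, warmer light (a crude therapeutic heuristic).
            set_lighting(brightness=1.0 - 0.5 * smoothed,
                         colour_temp_k=6500 - int(3000 * smoothed))
            time.sleep(period_s)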

Relevance: 80.00%

Abstract:

Incumbent telecommunication lasers emitting at 1.5 µm are fabricated on InP substrates and consist of multiple strained quantum well layers of the ternary alloy InGaAs, with barriers of InGaAsP or InGaAlAs. These lasers exhibit very strong temperature dependence of the threshold current, so external cooling equipment is required to stabilise their optical output power. This significantly increases both the energy bill associated with telecommunications and equipment budgets. If the exponential growth of end-user bandwidth demand associated with the internet continues, these inefficient lasers could see the telecommunications industry become the dominant consumer of world energy. For this reason there is strong interest in developing new, much more efficient telecommunication lasers. One avenue being investigated is the development of quantum dot lasers on InP. The confinement experienced in these low-dimensional structures strongly perturbs the density of states at the band edge, and has been predicted to reduce the temperature dependence of the threshold current in these devices. The growth of these structures is difficult due to the large lattice mismatch between InP and InAs; however, quantum dots elongated in one dimension, known as quantum dashes, have recently been demonstrated. Chapter 4 of this thesis provides an experimental analysis of one of these quantum dash lasers emitting at 1.5 µm, along with a numerical investigation of the threshold dynamics present in this device. Another avenue being explored to increase the efficiency of telecommunication lasers is bandstructure engineering of GaAs-based materials to emit at 1.5 µm. The cause of the strong temperature sensitivity in InP-based quantum well structures has been shown to be CHSH Auger recombination. Calculations have shown, and experiments have verified, that adding bismuth to GaAs strongly reduces the bandgap and increases the spin-orbit splitting energy of the alloy GaAs1−xBix. This leads to a bandstructure condition at x = 10% where not only is 1.5 µm emission achieved on GaAs-based material, but the bandstructure can also naturally suppress the costly CHSH Auger recombination which plagues InP-based quantum well material. It has been predicted that telecommunication lasers based on this material system should operate without external cooling equipment and offer electrical and optical benefits over the incumbent lasers. Chapters 5, 6, and 7 provide a first analysis of several aspects of this material system relevant to the development of high-bismuth-content telecommunication lasers.
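The suppression mechanism referred to here can be stated compactly: the CHSH Auger process promotes a hole into the spin-orbit split-off band, and it becomes energetically forbidden once the spin-orbit splitting exceeds the bandgap,

    \Delta_{\mathrm{SO}}(x) > E_g(x)

For emission at 1.5 µm, E_g = hc/λ ≈ (1.24 eV·µm)/(1.5 µm) ≈ 0.83 eV, and in GaAs1−xBix the crossover Δ_SO = E_g occurs near the x = 10% composition cited above, which is why both conditions can be met simultaneously in this alloy.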

Relevance: 80.00%

Abstract:

Simulating the efficiency of business processes could reveal crucial bottlenecks for manufacturing companies and could lead to significant optimizations resulting in decreased time to market, more efficient resource utilization, and larger profit. While such business optimization software is widely utilized by larger companies, SMEs typically do not have the required expertise and resources to efficiently exploit these advantages. The aim of this work is to explore how simulation software vendors and consultancies can extend their portfolio to SMEs by providing business process optimization based on a cloud computing platform. By executing simulation runs on the cloud, software vendors and associated business consultancies can get access to large computing power and data storage capacity on demand, run large simulation scenarios on behalf of their clients, analyze simulation results, and advise their clients regarding process optimization. The solution is mutually beneficial for both vendor/consultant and the end-user SME. End-user companies will only pay for the service without requiring large upfront costs for software licenses and expensive hardware. Software vendors can extend their business towards the SME market with potentially huge benefits.
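Purely as an illustration of the delivery model described, the sketch below fans hypothetical simulation scenarios out to a pool of workers and collects the results; a local process pool stands in for whatever cloud batch service a vendor would actually use, and run_scenario is an invented placeholder for a real business-process simulation.

    # Sketch: fan simulation scenarios out to on-demand workers (a local
    # process pool stands in for a real cloud batch API).
    from concurrent.futures import ProcessPoolExecutor

    def run_scenario(scenario):
        # Hypothetical placeholder for one business-process simulation run.
        throughput = scenario["orders"] / max(scenario["machines"], 1)
        return {"name": scenario["name"], "throughput": throughput}

    def simulate_all(scenarios, workers=8):
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(run_scenario, scenarios))

    if __name__ == "__main__":
        scenarios = [{"name": f"plan-{i}", "orders": 1000 * i, "machines": i}
                     for i in range(1, 5)]
        for result in simulate_all(scenarios):
            print(result)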

Relevance: 80.00%

Abstract:

Ageing of the population is a worldwide phenomenon. Numerous ICT-based solutions have been developed for elderly care, but mainly in connection with the physiological and nursing aspects of services for the elderly. Social work is a profession that should pay attention to the comprehensive wellbeing and social needs of the elderly. Many people experience loneliness and depression in old age, either as a result of living alone or due to a lack of close family ties and reduced connections with their culture of origin, which results in an inability to participate actively in community activities (Singh & Misra, 2009). Participation in society would enhance their quality of life. With the development of information technology, the use of technology in social work practice has risen dramatically. The aim of this literature review is to map out the state of the art of knowledge about the use of ICT in elderly care and to identify research-based knowledge about the usability of ICT for preventing loneliness and social isolation among elderly people. The data come from the core collection of the Web of Science, searched using Boolean operators. The search returned 216 published English-language articles. After reviewing the topics and abstracts, 34 articles were selected for data analysis based on a multi-approach framework. The analysis is categorised according to aspects of ICT use by older adults, from the adoption of ICT to the impact of its usage, and the social services provided for them. This literature review focuses on the communication function, excluding applications that relate mainly to physical nursing. The results show that the so-called 'digital divide' still exists, but that older adults are willing to learn and use ICT in daily life, especially for communication. The data show that the use of ICT can prevent loneliness and social isolation among older adults, and that they are eager for technical support in using ICT. The analysis of theoretical frames and concepts shows that this research field applies theoretical frames from various scientific fields, while a social work approach is lacking; a synergistic frame of applied theories is therefore suggested from the perspective of social work.
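The review does not reproduce its exact search string; purely to illustrate the kind of Boolean query a Web of Science topic search involves (the terms here are invented, not the study's), it might look like:

    TS=(("ICT" OR "information and communication technology")
        AND ("older adults" OR elderly OR ageing)
        AND (loneliness OR "social isolation"))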

Relevance: 80.00%

Abstract:

Several studies have revealed that network end-user devices are left powered up 24/7, even when idle, just for the sake of maintaining Internet connectivity. Network devices normally support low-power states but are kept out of them by their inability to maintain network connectivity while asleep. The Network Connectivity Proxy (NCP) has recently been proposed as an effective mechanism to impersonate network connectivity on behalf of high-power devices and enable them to sleep when idle without losing network presence. The NCP can efficiently proxy basic networking protocols; however, proxying Internet-based applications has no complete solution, due to the dynamic and unpredictable nature of the packets these applications periodically send and receive. This paper proposes an approach for proxying Internet-based applications and presents the basic software architecture and capabilities. The paper also evaluates the proposed framework in practice and analyses the energy savings achievable under different realistic conditions.
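As an illustration of the proxying idea, the sketch below answers an application's periodic UDP keepalives on behalf of a sleeping host and wakes the host (via a standard Wake-on-LAN magic packet) when other traffic arrives. The port, the keepalive format and the wake policy are invented; this is not the paper's actual architecture.

    # Sketch: answer periodic UDP keepalives for a sleeping host and wake
    # it when non-keepalive traffic arrives. Protocol details are
    # illustrative assumptions.
    import socket

    SLEEPING_HOST_MAC = bytes.fromhex("aabbccddeeff")   # hypothetical

    def wake_on_lan(mac, bcast="255.255.255.255", port=9):
        magic = b"\xff" * 6 + mac * 16                  # standard WoL payload
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            s.sendto(magic, (bcast, port))

    def proxy_loop(listen_port=5000):
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.bind(("", listen_port))
            while True:
                data, peer = s.recvfrom(2048)
                if data == b"KEEPALIVE":        # assumed application heartbeat
                    s.sendto(b"ALIVE", peer)    # impersonate the sleeping host
                else:
                    wake_on_lan(SLEEPING_HOST_MAC)  # real traffic: wake the host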

Relevance: 80.00%

Abstract:

Spam emails (unsolicited or junk mail) impose extremely heavy annual costs in time, storage space and money on private users and companies. To fight the spam problem effectively, it is not enough to stop spam messages from being delivered to the user's inbox. It is necessary either to try to find and prosecute the spammers, who usually hide behind complex networks of infected devices, or to analyse spammer behaviour in order to devise appropriate defence strategies. Such a task is difficult, however, because of camouflage techniques, which make it necessary to analyse correlated spam emails manually to identify the spammers. To facilitate this analysis, which must be performed on large quantities of unclassified emails, we propose a categorical clustering methodology, named CCTree, that divides a large volume of spam into campaigns based on structural similarity. We show the effectiveness and efficiency of the proposed clustering algorithm through several experiments. A self-learning approach is then proposed to label spam campaigns according to the spammers' goal, for example phishing. The labelled spam campaigns are used to train a classifier, which can then be applied to classify new spam emails. In addition, the labelled campaigns are ranked according to investigators' priorities using a set of four additional ranking criteria. Finally, a semiring-based structure is proposed as an abstract representation of CCTree. This abstract scheme, named CCTree term, is applied to formalise the parallelisation of CCTree. Through a number of mathematical analyses and experimental results, we show the efficiency and effectiveness of the proposed framework.
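On one plausible reading of the abstract, the core of CCTree is divisive clustering guided by the entropy of categorical attributes: each node is split on the attribute whose value distribution is most heterogeneous, until nodes are small or nearly pure. The sketch below implements that reading; feature names, thresholds and stopping rules are assumptions, not the thesis' parameters.

    # Sketch of entropy-guided divisive clustering of categorical records,
    # in the spirit of CCTree (details are assumptions, not the thesis').
    import math
    from collections import Counter, defaultdict

    def entropy(values):
        counts = Counter(values)
        total = len(values)
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    def split(records, features, min_size=2, max_entropy=0.5):
        """Recursively split records on the highest-entropy feature."""
        if len(records) <= min_size or not features:
            return [records]                      # leaf = one spam campaign
        ent = {f: entropy([r[f] for r in records]) for f in features}
        best = max(ent, key=ent.get)
        if ent[best] <= max_entropy:
            return [records]                      # node is nearly pure
        groups = defaultdict(list)
        for r in records:
            groups[r[best]].append(r)
        rest = [f for f in features if f != best]
        return [leaf for g in groups.values()
                for leaf in split(g, rest, min_size, max_entropy)]

    spam = [{"lang": "en", "has_link": True,  "subject_len": "short"},
            {"lang": "en", "has_link": True,  "subject_len": "short"},
            {"lang": "fr", "has_link": False, "subject_len": "long"}]
    print(split(spam, ["lang", "has_link", "subject_len"]))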

Relevance: 80.00%

Abstract:

Background: Digital forensics is a rapidly expanding field, due to continuing advances in computer technology and increases in the data storage capabilities of devices. However, the tools supporting digital forensics investigations have not kept pace with this evolution, often leaving the investigator to analyse large volumes of textual data and rely heavily on their own intuition and experience. Aim: This research proposes that, given the ability of information visualisation to provide an end user with an intuitive way to rapidly analyse large volumes of complex data, such approaches could be applied to digital forensics datasets. These methods are investigated, supported by a review of the literature on the use of such techniques in other fields. The hypothesis of this research is that by utilising exploratory information visualisation techniques in the form of a tool to support digital forensic investigations, gains in investigative effectiveness can be realised. Method: To test the hypothesis, this research examines three case studies which look at different forms of information visualisation and their implementation with a digital forensic dataset. Two of these case studies take the form of prototype tools developed by the researcher, and one utilises a tool created by a third-party research group. A pilot study was conducted on these cases, with the strengths and weaknesses of each carried into the next case study. The culmination of these case studies is a prototype tool, named Insight, developed to present a timeline visualisation of user behaviour on a device. This tool was subjected to an experiment involving a class of university digital forensics students, who were given a number of questions about a synthetic digital forensic dataset. Approximately half were given the prototype tool to use, and the others a common open-source tool. The assessed metrics included how long the participants took to complete all tasks, how accurate their answers were, and how easy the participants found the tasks to complete. They were also asked for feedback at multiple points throughout the task. Results: There was a statistically significant increase in accuracy on one of the six tasks for participants using the Insight prototype. Participants also found two of the six tasks significantly easier to complete when using the prototype tool. There was no statistically significant difference between the completion times of the two groups, and no statistically significant differences in answer accuracy for five of the six tasks. Conclusions: The results suggest that there is potential for gains in investigative effectiveness when information visualisation techniques are applied to a digital forensic dataset. Specifically, in some scenarios, the investigator can draw conclusions which are more accurate than those drawn using primarily textual tools, and there is evidence to suggest that investigators reached these conclusions significantly more easily when using a tool with a visual format. None of the scenarios placed investigators at a significant disadvantage in terms of accuracy or usability when using the prototype visual tool rather than the textual tool. It is noted that this research did not show that the use of information visualisation techniques leads to any statistically significant difference in the time taken to complete a digital forensics investigation.
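For a flavour of the timeline visualisation described, the sketch below plots timestamped user events along a horizontal time axis with matplotlib; the events are fabricated and the layout is not Insight's actual design.

    # Sketch: a minimal event-timeline plot of user activity, in the spirit
    # of the visualisation described (illustrative data, not Insight itself).
    import matplotlib.pyplot as plt
    from datetime import datetime

    events = [  # (timestamp, event label) - fabricated examples
        (datetime(2016, 3, 1, 9, 2),  "USB device inserted"),
        (datetime(2016, 3, 1, 9, 5),  "File copied to USB"),
        (datetime(2016, 3, 1, 9, 30), "Browser search"),
        (datetime(2016, 3, 1, 10, 1), "File deleted"),
    ]

    times = [t for t, _ in events]
    fig, ax = plt.subplots(figsize=(8, 2))
    ax.scatter(times, [1] * len(times), marker="|", s=400)
    for t, label in events:
        ax.annotate(label, (t, 1), rotation=45, xytext=(0, 12),
                    textcoords="offset points", ha="left")
    ax.set_yticks([])
    ax.set_xlabel("Time")
    fig.tight_layout()
    plt.show()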

Relevance: 80.00%

Abstract:

With the advent of connected devices, the required bandwidth exceeds the capacity of the electrical interconnects and wireless interfaces in access networks, and in core networks as well. High-capacity photonic systems in the access network using radio-over-fibre technology have been proposed as a solution for fifth-generation wireless networks. To maximise the use of server and network resources, cloud computing and storage services are being deployed; in this way, centralised resources can be delivered dynamically as the end user wishes. Since every exchange requires synchronisation between the server and its infrastructure, an optical physical layer allows the cloud to support network virtualisation and software-defined networking. Reflective semiconductor optical amplifiers (RSOAs) are a key technology for the ONUs (optical network units) in passive optical access networks (PONs). Here we examine the possibility of using an RSOA and radio-over-fibre technology to transport wireless signals together with a digital signal over a PON. Radio-over-fibre can be implemented easily thanks to the wavelength insensitivity of the RSOA. The wavelength used at the physical layer is, however, selected at layers 2/3 of the OSI model. Interactions between the physical layer and network switching can be handled by adding an SDN controller that includes optical-layer managers. Network virtualisation could thus benefit from a flexible optical layer through dynamic, adaptive network resources. In this thesis, we study a system with an RSOA-based optical physical layer that simultaneously sends wireless signals and transports digital on-off-keying (OOK) signals in a WDM (wavelength-division multiplexing) PON. The RSOA was characterised to show its ability to handle the high dynamic range of the analogue wireless signal. The RF and IF over-fibre schemes are then compared, with their respective advantages and drawbacks. Finally, we experimentally demonstrate a WDM point-to-point link with full-duplex transmission of an analogue WiFi signal together with a downstream OOK signal. By introducing two RF mixers in the upstream link, we solved the incompatibility with the TDD (time-division duplexing) based wireless system.

Relevance: 80.00%

Abstract:

The purpose of this bachelor's thesis is to determine how strong a DRM system can be before consumers no longer accept it. DRM systems exist at many levels of strictness, but they are not suitable as such for every platform: digital rights management in the games industry follows its own rules, distinct from those of, for example, the music industry. In addition, there is a currently accepted level of DRM, and deviating from it can be risky. The study is qualitative in nature, applying both discourse analysis and content analysis. The research material consists of the texts of various discussion threads, from which an answer to the research question is sought. The threads are divided by strength according to how strong the DRM-related news item was that prompted each thread. Because the material is informal language that always carries its own meaning in context, the chosen methods are suitable for analysing it. Based on the analyses of the different threads, it can be said that DRM cannot be stricter than the currently prevailing level. Even a slight deviation from this level can cause great resentment among consumers, even to the point where the company loses revenue. The current level has been reached through various experiments at consumers' expense, so consumers will not willingly accept any level stricter than the one prevailing at the time. If a company finds it necessary to tighten the level, the tightening must be done gradually and disguised with additional features. Consumers are aware of their rights and will not give them up any more than is necessary.

Relevance: 80.00%

Abstract:

Problems in subject access to information organization systems have been under investigation for a long time. Focusing on item-level information discovery and access, researchers have identified a range of subject access problems, including the quality and application of metadata, as well as the complexity of user knowledge required for successful subject exploration. While aggregations of digital collections built in the United States and abroad generate collection-level metadata of varying granularity and richness, no research had yet focused on the role of collection-level metadata in user interaction with these aggregations. This dissertation research sought to bridge this gap by answering the question "How does collection-level metadata mediate scholarly subject access to aggregated digital collections?" This goal was achieved using three research methods:

• in-depth comparative content analysis of collection-level metadata in three large-scale aggregations of cultural heritage digital collections: Opening History, American Memory, and The European Library;
• transaction log analysis of user interactions with Opening History; and
• interview and observation data on academic historians interacting with two aggregations: Opening History and American Memory.

It was found that subject-based resource discovery is significantly influenced by collection-level metadata richness. This richness includes: 1) describing the collection's subject matter with mutually complementary values in different metadata fields, and 2) encoding a variety of collection properties/characteristics in the free-text Description field; types and genres of objects in a digital collection, together with topical, geographic and temporal coverage, are the most consistently represented collection characteristics in free-text Description fields. Analysis of user interactions with aggregations of digital collections yielded a number of interesting findings. Item-level user interactions were found to occur more often than collection-level interactions. Collection browse was initiated more often than search, with subject browse (topical and geographic) used most often. The majority of collection search queries fall within FRBR Group 3 categories: object, concept, and place. Significantly more object, concept, and corporate-body searches, and fewer individual-person, event, and class-of-persons searches, were observed in collection searches than in item searches. While collection search is most often satisfied by the Description and/or Subjects collection metadata fields, it would not retrieve a significant proportion of collection records without controlled-vocabulary subject metadata (Temporal Coverage, Geographic Coverage, Subjects, and Objects) and free-text metadata (the Description field). Observation data show that collection metadata records in the Opening History and American Memory aggregations are often viewed. Transaction log data show a high level of engagement with collection metadata records in Opening History, with total page views for collections more than four times greater than item page views. Scholars observed viewing collection records valued descriptive information on provenance, collection size, types of objects, subjects, geographic coverage, and temporal coverage. They also considered the structured display of collection metadata in Opening History more useful than the alternative approach taken by other aggregations, such as American Memory, which displays only the free-text Description field to the end-user. The results extend our understanding of the value of collection-level subject metadata, particularly free-text metadata, for scholarly users of aggregations of digital collections. The analysis of the collection metadata created by three large-scale aggregations provides a better understanding of collection-level metadata application patterns and suggests best practices. This dissertation is also the first empirical research contribution to test the FRBR model as a conceptual and analytic framework for studying collection-level subject access.
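To make the field structure concrete, the record below is an invented example carrying the collection-level fields the abstract names; the values are fabricated, and the field set follows the abstract rather than any aggregation's actual schema.

    # Invented example of a collection-level metadata record using the
    # fields named in the abstract; values are illustrative only.
    collection_record = {
        "Title": "Prairie Settlement Photographs",
        "Description": ("Roughly 2,400 photographs and diaries documenting "
                        "homesteading on the Great Plains, 1880-1920."),
        "Subjects": ["Homesteading", "Frontier and pioneer life"],  # controlled vocab
        "Objects": ["photographs", "diaries"],                      # types/genres
        "Geographic Coverage": ["Nebraska", "Kansas"],
        "Temporal Coverage": "1880-1920",
    }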

Relevance: 80.00%

Abstract:

Sound is potentially an effective way of analysing data, since it is possible to interpret layers of sound simultaneously and to identify changes. Multiple attempts to use sound with scientific data have been made, with varying levels of success; on many occasions this was done without including the end user during development. In this study a sonified model of the eight planets of our solar system was built and tested using an end-user approach. The sonification was created for the Esplora Planetarium, which is currently being constructed in Malta. The data requirements were gathered from a member of the planetarium staff, and 12 end users, as well as the planetarium representative, tested the sonification. The results suggest that listeners were able to discern various planetary characteristics without requiring any additional information. Three of the eight sound design parameters did not represent their characteristics successfully; these issues have been identified, and further development will be conducted to improve the model.
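The abstract does not list its actual parameter mappings, so the sketch below only illustrates the general technique: scaling planetary attributes onto audible parameters, here mapping planet size to pitch and orbital period to note spacing. The mapping choices and constants are assumptions, not the study's design.

    # Sketch: map planetary attributes onto sound parameters (mapping
    # choices are illustrative, not the study's actual design).
    import math

    PLANETS = {  # name: (radius in Earth radii, orbital period in years)
        "Mercury": (0.38, 0.24), "Venus":   (0.95, 0.62),
        "Earth":   (1.00, 1.00), "Mars":    (0.53, 1.88),
        "Jupiter": (11.2, 11.9), "Saturn":  (9.45, 29.5),
        "Uranus":  (4.01, 84.0), "Neptune": (3.88, 165.0),
    }

    def to_note(radius, period):
        # Bigger planet -> lower pitch; longer year -> longer gap between notes.
        pitch_hz = 880.0 / (1.0 + 2.0 * math.log10(1.0 + radius))
        gap_s = 0.2 + 0.3 * math.log10(1.0 + period)
        return pitch_hz, gap_s

    for name, (radius, period) in PLANETS.items():
        pitch, gap = to_note(radius, period)
        print(f"{name:8s} pitch={pitch:6.1f} Hz  spacing={gap:4.2f} s")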

Relevance: 80.00%

Abstract:

There are many ways in which research messages and findings can be extended to the expansive cotton community. As everyone learns differently, it is crucial that information is delivered in a variety of ways to meet the various learning needs of the CottonInfo team's broad audience. In addition, different cotton production areas often require targeted information to address specific challenges. Successful implementation of innovative research outcomes typically relies on a history of cultivated communication between the researcher and the end-user, the grower. The CottonInfo team, supported by a joint venture between Cotton Seed Distributors, Cotton Research Development Corporation, Cotton Australia and other collaborative partners, represents a unique model of extension in Australian agriculture. Industry research is extended via regionally based Regional Development Officers, backed by support from Technical Specialists. The 2015 Cotton Irrigation Technology Tour is one example of a successful CottonInfo capacity-building activity. This tour took seven CRDC-funded irrigation researchers to Emerald, Moree and Nevertire to showcase their research and technologies. These events provided irrigators and consultants with the opportunity to hear first-hand from researchers about their technologies and how they could be applied on-farm. The tour was an example of how the CottonInfo team can connect growers and researchers, not only providing an avenue for growers to learn about the latest irrigation research, but also enabling researchers to receive feedback about their current and future irrigation research.

Relevance: 80.00%

Abstract:

The value of integrating heat storage into a geothermal district heating system has been investigated. The behaviour of the system under a novel operational strategy has been simulated, focusing on the energetic, economic and environmental effects of incorporating the heat storage within the system. A typical geothermal district heating system consists of several production wells, a system of pipelines for the transportation of hot water to end-users, one or more re-injection wells, and peak-up devices (usually fossil-fuel boilers). Traditionally in these systems, the production wells change their production rate throughout the day according to heat demand, and if their maximum capacity is exceeded the peak-up devices are used to meet the balance of the heat demand. In this study, it is proposed to maintain constant geothermal production and add heat storage to the network: hot water is stored when heat demand is lower than production, and the stored hot water is released into the system to cover the peak demands (or part of them). The intention is not to phase out the peak-up devices entirely but to decrease their use, as they will often be installed anyway for back-up purposes. The integration of heat storage in such a system and the novel operational strategy are the main novelties of this thesis. A robust algorithm for the sizing of these systems has been developed. The main inputs are the geothermal production data, the heat demand data for one year or more, and the topology of the installation. The outputs are the sizing of the whole system, including the necessary number of production wells, the size of the heat storage, and the dimensions of the pipelines, among others. The results provide several useful insights into the initial design considerations for these systems, emphasising particularly the importance of heat losses. Simulations are carried out for three sizes of installation (small, medium and large) to examine the influence of system scale. In the second phase of the work, two algorithms are developed which study in detail the operation of the installation over a given day and over a whole year, respectively. The first can be a powerful tool for the operators of the installation, who can know a priori how to operate the installation on a given day from the heat demand. The second is used to obtain the amount of electricity used by the pumps and the amount of fuel used by the peak-up boilers over a whole year; these comprise the main operational costs of the installation and are among the main inputs to the third part of the study. In the third part, an integrated energetic, economic and environmental analysis of the installation is carried out, together with a comparison with the traditional case. The results show that by implementing heat storage under the novel operational strategy, heat is generated more cheaply: all the financial indices improve, more geothermal energy is utilised, and less fuel is used in the peak-up boilers, with consequent environmental benefits compared to the traditional case. Furthermore, the most attractive sizing case is the large one, although the addition of heat storage has the greatest impact in the medium case; in other words, the geothermal component of the installation should be sized as large as possible. This analysis indicates that the proposed solution is beneficial from energetic, economic and environmental perspectives, so the aim of the study has been achieved in full. The new models for the sizing, operation and economic/energetic/environmental analysis of these kinds of systems can be used with few adaptations for real cases, making the practical applicability of this study evident. Taking this study as a starting point, further work could include the integration of these systems with end-user demands, further analysis of component parts of the installation (such as the heat exchangers), and the integration of a heat pump to maximise the utilisation of geothermal energy.
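A minimal sketch of the hourly dispatch rule implied by the proposed strategy, assuming constant geothermal output, a finite store, and a peak-up boiler covering any residual demand (all parameter values are invented):

    # Sketch of hourly dispatch under the proposed strategy: constant
    # geothermal output, storage charge/discharge, boiler covers the rest.
    # All numbers are invented for illustration.

    def dispatch(demand_mw, geo_mw=10.0, store_cap_mwh=40.0):
        stored = 0.0
        for hour, demand in enumerate(demand_mw):
            surplus = geo_mw - demand           # MW over one hour -> MWh
            boiler = 0.0
            if surplus >= 0:
                stored = min(store_cap_mwh, stored + surplus)  # charge store
            else:
                discharge = min(stored, -surplus)              # cover peak
                stored -= discharge
                boiler = -surplus - discharge                  # residual demand
            yield hour, stored, boiler

    demand = [6, 6, 8, 14, 16, 12, 9, 7]  # MW, a toy daily profile
    for hour, stored, boiler in dispatch(demand):
        print(f"h{hour}: store={stored:5.1f} MWh  boiler={boiler:4.1f} MW")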

Relevance: 80.00%

Abstract:

This paper presents how paradigms and methodologies for software development have changed rapidly over the last two years. The current scenario is one of transition which, although slight, reflects the rapid manner in which software production paradigms are reinvented as display devices and modes of interaction with the end user change. Studies indicate that 2013 was the year in which internet access from mobile devices overtook access from the traditional desktop, with the split currently standing at around 60% mobile against 40% desktop (comScore). Mobile access is expected to keep growing in the coming years, while internet use from desktop terminals declines. In this context, the software industry has reinvented itself and updated the technologies behind software and mobile applications, building products capable of responding to the user market. Software products, such as applications, must be put into production for different user environments, such as Web, iOS and Android, in a way that enhances efficiency, optimization and productivity in the software development cycle (Langer, Arthur M.).

Relevance: 80.00%

Abstract:

Part 4: Transition Towards Product-Service Systems