868 results for end user modes of operation
Abstract:
Wireless sensor networks (WSNs) have shown wide applicability to many fields, including monitoring of environmental, civil, and industrial settings. WSNs, however, are resource constrained by many competing factors that span their hardware, software, and networking. One of the central resource constraints is the charge consumption of WSN nodes. With finite energy supplies, low charge consumption is needed to ensure long lifetimes and the success of WSNs. This thesis details the design of a power system to support long-term operation of WSNs. The power system’s development occurred in parallel with a custom WSN from the Queen’s MEMS Lab (QML-WSN), with the goal of supporting a 1+ year lifetime without sacrificing functionality. The final power system design utilizes a TPS62740 DC-DC converter with AA alkaline batteries to efficiently supply the nodes while providing battery monitoring functionality and an expansion slot for future development. Testing tools for measuring current draw and charge consumption were created, along with analysis and processing software. Through their use, the charge consumption of the power system was drastically lowered, and issues in the QML-WSN were identified and resolved, including improper shutdown of the accelerometers and an incorrect microcontroller unit (MCU) power pin connection. Controlled current profiling revealed unexpected node behaviour and detailed current-voltage relationships. These relationships were used with a lifetime projection model to estimate a lifetime between 521 and 551 days, depending on the mode of operation. The power system and QML-WSN were tested over a long-term trial lasting 272+ days in an industrial testbed to monitor an air compressor pump. Environmental factors were found to influence the behaviour of nodes, leading to increased charge consumption, while a node in an office setting was still operating at the conclusion of the trial.
This agrees with the lifetime projection and gives a strong indication that a 1+ year lifetime is achievable. Additionally, a lightweight charge consumption model was developed which allows the charge consumption of nodes in a distributed WSN to be monitored. This model was tested in a laboratory setting, demonstrating over 95% accuracy for high packet reception rate WSNs across varying data rates, battery supply capacities, and runtimes up to full battery depletion.
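The lifetime projection described above amounts to a charge-budget calculation: usable battery capacity divided by average current draw. The sketch below illustrates the idea only; the capacity and per-mode current figures are illustrative assumptions, not the thesis's measured values.

```python
# Hedged sketch of a battery-lifetime projection for a WSN node.
# All numeric values are illustrative assumptions, not measured data.

AA_CAPACITY_MAH = 2500.0  # assumed usable capacity of an AA alkaline pack

def projected_lifetime_days(avg_current_ma: float,
                            capacity_mah: float = AA_CAPACITY_MAH) -> float:
    """Days of operation from a simple charge budget: capacity / average draw."""
    return capacity_mah / avg_current_ma / 24.0

# Two hypothetical modes of operation with slightly different average draws:
for mode, avg_ma in [("low-rate sampling", 0.19), ("high-rate sampling", 0.20)]:
    print(f"{mode}: {projected_lifetime_days(avg_ma):.0f} days")
```

A real projection would also account for battery self-discharge and the current-voltage relationships the thesis measured, which a flat average-current model ignores.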
Abstract:
One challenge in transit planning is selecting the appropriate mode: bus, light rail transit (LRT), regional express rail (RER), or subway. This project uses data from life cycle assessment to develop a tool that measures the energy requirements of different modes of transit on a per passenger-kilometer basis. For each of the four transit modes listed, a range of energy requirements associated with different vehicle models and manufacturers was developed. The tool demonstrated that there are distinct ranges where specific transit modes are the best choice. Diesel buses are the clear best choice from 7-51 passengers, LRTs make the most sense from 201-427 passengers, and subways are the best choice above 918 passengers. There are a number of other passenger loading ranges where more than one transit mode makes sense; in particular, LRT and RER represent very energy-efficient options for ridership ranging from 200 to 900 passengers. The tool developed in the thesis was used to analyze the Bloor-Danforth subway line in Toronto using estimated ridership for weekday morning peak hours. It was found that ridership across the line is, for the most part, insufficient to justify subways over LRTs or RER. This suggests that extensions to the existing Bloor-Danforth line should consider LRT options, which could service the passenger loads at the ends of the line with far greater energy efficiency. It was also clear that additional destinations along the entire transit line are necessary to increase the per passenger-kilometer energy efficiency, as the current pattern of commuting to downtown leaves much of the system underutilized. It is hoped that the tool developed in this thesis can be used as an additional resource in the transit mode decision-making process for many developing transportation systems, including the transit systems across the GTHA.
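The decision rule the abstract reports can be sketched as a lookup over passenger-loading ranges. Only the ranges quoted above come from the study; how loads outside those ranges are handled (returning no recommendation) is an assumption of this sketch, not the tool's actual behaviour.

```python
# Hedged sketch of a transit-mode selection rule based on the passenger-loading
# ranges reported in the abstract; behaviour outside those ranges is assumed.

def best_modes(passengers: int) -> list[str]:
    """Return the transit mode(s) reported as energy-efficient choices
    for a given passenger load, per the abstract's ranges."""
    modes = []
    if 7 <= passengers <= 51:
        modes.append("diesel bus")          # clear best choice in this range
    if 200 <= passengers <= 900:
        # LRT and RER overlap across this range; 201-427 is where the
        # abstract singles out LRT as the single best choice.
        modes.extend(["LRT", "RER"])
    if passengers > 918:
        modes.append("subway")
    return modes

print(best_modes(30))    # low ridership
print(best_modes(300))   # mid-range ridership
print(best_modes(1200))  # high ridership
```

The actual tool compares per passenger-kilometer energy figures for specific vehicle models; this rule only encodes its headline conclusions.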
Abstract:
A turn towards documentary modes of practice amongst contemporary fine art video and filmmakers towards the end of the 20th century led to moving image works that represent current social realities. This drew comparisons between these forms of art and journalism and industrial documentary. The practical research is embodied in a single-screen film that responds to recent political and ecological realities in Spain. These include the mass demonstrations that led to the occupation of Madrid’s Plaza del Sol in 2011 and Spain’s largest recorded forest fires, which spread through Andalusia in August of the following year. The film, titled Spanish Labyrinth, South from Granada, is a response to these events and also relates to the political avant-garde film of the 1930s by re-tracing a journey undertaken by three revolutionary filmmakers, Yves Allegret, René Naville and Eli Lotar, in 1931. The theoretical research for this project establishes a historical root of artists’ film that responds to current social realities, in contrast to news media, in the Soviet and European avant-garde movements of the 1920s and 1930s. The main aim of this method is to argue for the status of the works I identify, both avant-garde and contemporary, as a form of art that preceded a Griersonian definition of documentary film.
Abstract:
Simulating the efficiency of business processes can reveal crucial bottlenecks for manufacturing companies and lead to significant optimizations, resulting in decreased time to market, more efficient resource utilization, and larger profits. While such business optimization software is widely used by larger companies, SMEs typically lack the expertise and resources to exploit these advantages efficiently. The aim of this work is to explore how simulation software vendors and consultancies can extend their portfolio to SMEs by providing business process optimization based on a cloud computing platform. By executing simulation runs in the cloud, software vendors and associated business consultancies can access large computing power and data storage capacity on demand, run large simulation scenarios on behalf of their clients, analyze simulation results, and advise their clients on process optimization. The solution is mutually beneficial for both the vendor/consultant and the end-user SME. End-user companies pay only for the service, without large upfront costs for software licenses and expensive hardware. Software vendors can extend their business towards the SME market with potentially large benefits.
Abstract:
Ageing of the population is a worldwide phenomenon. Numerous ICT-based solutions have been developed for elderly care, but mainly connected to the physiological and nursing aspects of services for the elderly. Social work is a profession that should pay attention to the comprehensive wellbeing and social needs of the elderly. Many people experience loneliness and depression in their old age, either as a result of living alone or due to a lack of close family ties and reduced connections with their culture of origin, which results in an inability to participate actively in community activities (Singh & Misra, 2009). Participation in society would enhance their quality of life. With the development of information technology, the use of technology in social work practice has risen dramatically. The aim of this literature review is to map out the state of the art of knowledge about the usage of ICT in elderly care and to identify research-based knowledge about the usability of ICT for the prevention of loneliness and social isolation of elderly people. The data for the current research comes from the core collection of the Web of Science, and the search was performed using Boolean operators. The search returned 216 published English-language articles. After going through the topics and abstracts, 34 articles were selected for data analysis based on a multi-approach framework. The analysis of the research approaches is categorized according to aspects of ICT use by older adults, from the adoption of ICT to the impact of its usage, as well as the social services offered to them. This literature review focused on the function of communication, excluding applications that mainly relate to physical nursing. The results show that the so-called ‘digital divide’ still exists, but older adults are willing to learn and use ICT in daily life, especially for communication.
The data shows that the usage of ICT can prevent loneliness and social isolation among older adults, and that they are eager for technical support in using ICT. The results of the data analysis on theoretical frames and concepts show that this research field applies different theoretical frames from various scientific fields, while a social work approach is lacking. A synergic frame of the applied theories is therefore suggested from the perspective of social work.
Abstract:
Several studies have revealed that network end-user devices are left powered up 24/7, even when idle, just for the sake of maintaining Internet connectivity. Network devices normally support low-power states but are kept out of them due to their inability to maintain network connectivity while asleep. The Network Connectivity Proxy (NCP) has recently been proposed as an effective mechanism to impersonate network connectivity on behalf of high-power devices and enable them to sleep when idle without losing network presence. The NCP can efficiently proxy basic networking protocols; however, proxying Internet-based applications has no complete solution due to the dynamic and unpredictable nature of the packets they periodically send and receive. This paper proposes an approach for proxying Internet-based applications and presents the basic software architecture and capabilities. Further, this paper practically evaluates the proposed framework and analyzes the energy savings achievable under different realistic conditions.
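The core NCP idea, answering periodic application traffic on behalf of a sleeping host, can be illustrated with a tiny UDP responder. This is a minimal sketch only: the heartbeat format (`PING`/`ALIVE`) and the single-datagram protocol are assumptions for illustration, not the paper's architecture.

```python
# Minimal sketch of the NCP idea: a proxy answers periodic application
# heartbeats on behalf of a sleeping device, preserving its network presence.
# The message format and protocol here are illustrative assumptions.
import socket
import threading

def run_proxy(port: int, reply: bytes = b"ALIVE") -> socket.socket:
    """Answer every heartbeat datagram as if the sleeping host were awake."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))

    def serve():
        while True:
            _, peer = sock.recvfrom(1024)
            sock.sendto(reply, peer)  # impersonate the sleeping device

    threading.Thread(target=serve, daemon=True).start()
    return sock

proxy = run_proxy(0)  # port 0: let the OS pick a free port
port = proxy.getsockname()[1]

# A heartbeat such as an application server might send to check liveness:
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2.0)
client.sendto(b"PING", ("127.0.0.1", port))
data, _ = client.recvfrom(1024)
print(data.decode())  # the proxy answered while the real device "sleeps"
```

The hard part the paper addresses is precisely what this sketch sidesteps: real application traffic is dynamic and unpredictable, so canned replies like this are insufficient in general.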
Abstract:
This study measured fuel consumption in transporting grain from Iowa origins to Japan and Amsterdam by alternative routes and modes of transport and applied these data to construct equations for fuel consumption from Iowa origins to alternative final destinations. Some of the results are as follows: (1) The metered tractor-trailer truck averaged 186.6 gross ton-miles per gallon and 90.5 net ton-miles per gallon when loaded 50% of total miles. (2) The 1983 fuel consumption of seven trucks taken from company records was 82.4 net ton-miles per gallon at 67.5% loaded miles and 68.6 net ton-miles per gallon at 50% loaded miles. (3) Unit grain trains from Iowa to West Coast ports averaged 437.0 net ton-miles per gallon, whereas unit grain trains from Iowa to New Orleans averaged 640.1 net ton-miles per gallon, a 46% advantage for the New Orleans trips. (4) Average barge fuel consumption on the Mississippi River from Iowa to New Orleans export grain elevators was 544.5 net ton-miles per gallon, with a 35% backhaul rate. (5) Ocean vessel net ton-miles per gallon varies widely by size of ship and backhaul percentage. With no backhaul, the average net ton-miles per gallon were as follows: for a 30,000 dwt ship, 574.8 net ton-miles per gallon; for a 50,000 dwt ship, 701.9; for a 70,000 dwt ship, 835.1; and for a 100,000 dwt ship, 1,043.4. (6) The most fuel-efficient route and modal combination to transport grain from Iowa to Japan depends on the size of ocean vessel, the percentage of backhaul, and the origin of the grain. Alternative routes and modal combinations in shipping grain to Japan are ranked in descending order of fuel efficiency.
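The net ton-mile-per-gallon figures above all follow the same metric: tons of cargo times loaded miles, divided by total fuel burned (including any empty or backhaul legs). A sketch of the calculation, with made-up inputs rather than the study's data:

```python
# Sketch of the net ton-mile-per-gallon metric used throughout the study.
# Net ton-miles count only cargo weight over loaded miles; gross ton-miles
# would include vehicle weight. The inputs below are illustrative only.

def net_ton_miles_per_gallon(cargo_tons: float,
                             loaded_miles: float,
                             gallons: float) -> float:
    """Cargo ton-miles delivered per gallon of total fuel consumed."""
    return cargo_tons * loaded_miles / gallons

# Example: a hypothetical truck hauling 25 tons of grain over 500 loaded
# miles, burning 150 gallons in total (loaded leg plus empty return):
print(f"{net_ton_miles_per_gallon(25, 500, 150):.1f} net ton-miles/gal")
```

This is why the backhaul percentage matters so much in the study's rankings: fuel burned on empty return miles goes into the denominator while contributing no cargo ton-miles.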
Abstract:
Thirty years of academic and critical scholarship on the subject of gay porn have borne witness to significant changes not only in the kinds of porn produced for, and watched by, gay men, but in the modes of production and distribution of that porn, and the legal, economic and social contexts in which it has been made, sold/shared, and watched. Those thirty years have also seen a huge shift in the cultural and political position of gay men, especially in the US and UK, and other apparently ‘advanced’ democracies. Those thirty years of scholarship on the topic of gay porn have produced one striking consensus, which is that gay cultures are especially ‘pornified’: porn has arguably offered gay men not only homoerotic visibility, but a heritage culture and a radical aesthetic. However, neoliberal cultures have transformed the operation and meaning of sexuality, installing new standards of performativity and display, and new responsibilities attached to a ‘democratisation’ that offers women and men apparently expanded terms for articulating both their gender and their sexuality. Does gay porn still have the same urgency in this context? At the level of politics and cultural dissent, what’s ‘gay’ about gay porn now? This essay questions the extent to which processes of legal and social liberalization, and the emergence of networked and digital cultures, have foreclosed or expanded the apparently liberationary opportunities of gay porn. The essay attempts to map some of the political implications of the ‘pornification’ of gay culture on to ongoing debates about materiality, labour and the entrepreneurial subject by analyzing gay porn blogs.
Abstract:
With the advent of connected devices, the required bandwidth exceeds the capacity of electrical interconnects and wireless interfaces in access networks as well as in core networks. High-capacity photonic systems in access networks using radio-over-fibre technology have been proposed as a solution for 5th-generation wireless networks. To maximize the use of server and network resources, cloud computing and storage services are being deployed. In this way, centralized resources can be delivered dynamically, as the end user wishes. Since every exchange requires synchronization between the server and its infrastructure, an optical physical layer allows the cloud to support network virtualization and software-defined networking. Reflective semiconductor optical amplifiers (RSOAs) are a key technology for the optical network units (ONUs) in passive optical networks (PONs). Here we examine the possibility of using an RSOA and radio-over-fibre technology to transport wireless signals together with a digital signal over a PON. Radio over fibre can be readily realized thanks to the wavelength insensitivity of the RSOA. The choice of wavelength for the physical layer is, however, made in layers 2/3 of the OSI model. Interactions between the physical layer and network switching can be handled by adding an SDN controller that includes optical-layer managers. Network virtualization could thus benefit from a flexible optical layer through dynamic, adapted network resources. In this thesis, we study a system with an optical physical layer based on an RSOA.
This layer allows us to simultaneously send wireless signals and transport digital signals in on-off keying (OOK) modulation format in a WDM (wavelength-division multiplexing) PON system. The RSOA was characterized to show its ability to handle the high dynamic range of the analogue wireless signal. The RF-over-fibre and IF-over-fibre schemes are then compared, with their respective advantages and disadvantages. Finally, we experimentally demonstrate a point-to-point WDM link with full-duplex transmission of an analogue WiFi signal together with a downstream OOK signal. By introducing two RF mixers into the uplink, we solved the incompatibility problem with the TDD (time-division duplexing)-based wireless system.
Abstract:
Designing for users rather than with users is still a common practice in technology design and innovation, as opposed to taking them on board in the process. Design for inclusion aims to define and understand end-users, their needs and their context of use, and, by doing so, ensure that end-users are catered for and included, while the results are geared towards universality of use. We describe the central role of end-user and designer participation, immersion and perspective in building user-driven solutions. These approaches provided a critical understanding of the counterpart's role. Designers could understand what the users' needs were, experience physical impairments, and see the interaction with the environment from the other's perspective. Users could understand the challenges of designing for physical impairments, build a sense of ownership of the technology, and explore it from a creative perspective. This understanding of the peer's role (user and designer), needs and perspective enhanced user participation and inclusion.
Collection-Level Subject Access in Aggregations of Digital Collections: Metadata Application and Use
Abstract:
Problems in subject access to information organization systems have been under investigation for a long time. Focusing on item-level information discovery and access, researchers have identified a range of subject access problems, including the quality and application of metadata, as well as the complexity of user knowledge required for successful subject exploration. While aggregations of digital collections built in the United States and abroad generate collection-level metadata of varying granularity and richness, no research has yet focused on the role of collection-level metadata in user interaction with these aggregations. This dissertation research sought to bridge this gap by answering the question “How does collection-level metadata mediate scholarly subject access to aggregated digital collections?” This goal was achieved using three research methods:
• in-depth comparative content analysis of collection-level metadata in three large-scale aggregations of cultural heritage digital collections: Opening History, American Memory, and The European Library;
• transaction log analysis of user interactions with Opening History; and
• interview and observation data on academic historians interacting with two aggregations: Opening History and American Memory.
It was found that subject-based resource discovery is significantly influenced by collection-level metadata richness. This richness includes such components as: 1) describing a collection's subject matter with mutually complementary values in different metadata fields, and 2) a variety of collection properties/characteristics encoded in the free-text Description field; types and genres of objects in a digital collection, as well as topical, geographic and temporal coverage, are the most consistently represented collection characteristics in free-text Description fields. Analysis of user interactions with aggregations of digital collections yielded a number of interesting findings.
Item-level user interactions were found to occur more often than collection-level interactions. Collection browse is initiated more often than search, with subject browse (topical and geographic) used most often. The majority of collection search queries fall within FRBR Group 3 categories: object, concept, and place. Significantly more object, concept, and corporate body searches, and fewer individual person, event and class-of-persons searches, were observed in collection searches than in item searches. While a collection search is most often satisfied by the Description and/or Subjects collection metadata fields, it would fail to retrieve a significant proportion of collection records without controlled-vocabulary subject metadata (Temporal Coverage, Geographic Coverage, Subjects, and Objects) and free-text metadata (the Description field). Observation data show that collection metadata records in the Opening History and American Memory aggregations are often viewed. Transaction log data show a high level of engagement with collection metadata records in Opening History, with total page views for collections more than four times greater than item page views. Scholars observed viewing collection records valued descriptive information on provenance, collection size, types of objects, subjects, geographic coverage, and temporal coverage. They also considered the structured display of collection metadata in Opening History more useful than the alternative approach taken by other aggregations, such as American Memory, which displays only the free-text Description field to the end-user. The results extend the understanding of the value of collection-level subject metadata, particularly free-text metadata, for scholarly users of aggregations of digital collections. The analysis of the collection metadata created by three large-scale aggregations provides a better understanding of collection-level metadata application patterns and suggests best practices.
This dissertation is also the first empirical research contribution to test the FRBR model as a conceptual and analytic framework for studying collection-level subject access.
Abstract:
Data leakage is a serious issue that can result in the loss of sensitive data, compromising user accounts and details and potentially affecting millions of internet users. This paper contributes to research in online security and reducing the personal footprint by evaluating the levels of privacy provided by the Firefox browser. The aim of identifying conditions that minimize data leakage and maximize data privacy is addressed by assessing and comparing data leakage in the four possible browsing modes: normal and private mode, each using either a browser installed on the host PC or a portable browser run from a connected USB device. To provide a firm foundation for analysis, a series of carefully designed, pre-planned browsing sessions was repeated in each of the various modes of Firefox. This included low-RAM environments, to determine any effects low RAM may have on browser data leakage. The results show that considerable data leakage may occur within Firefox. In normal mode, all of the browsing information is stored within the Mozilla profile folder in Firefox-specific SQLite databases and sessionstore.js. While passwords were not stored as plain text, other confidential information such as credit card numbers could be recovered from the form history under certain conditions. There is no difference when using a portable browser in normal mode, except that the Mozilla profile folder is located on the USB device rather than the host's hard disk. By comparison, private browsing reduces data leakage. Our findings confirm that no information is written to the Firefox-related locations on the hard disk or USB device during private browsing, implying that no deletion would be necessary and no remnants of data would be forensically recoverable from unallocated space. However, two aspects of data leakage occurred equally in all four browsing modes.
Firstly, all of the browsing history was stored in live RAM and was therefore accessible while the browser remained open. Secondly, in low-RAM situations, the operating system pages RAM out to pagefile.sys on the host's hard disk. Irrespective of the browsing mode used, this may include Firefox history elements, which can then remain forensically recoverable for a considerable time.
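The kind of inspection described above, reading history entries out of the profile's SQLite databases, can be sketched with Python's built-in `sqlite3` module. The schema below is a deliberately simplified stand-in for Firefox's real `moz_places` table in `places.sqlite`, built in memory so the sketch is self-contained.

```python
# Hedged sketch of inspecting Firefox-style history data. The table here is
# a simplified stand-in for moz_places in <profile>/places.sqlite; the real
# schema has many more columns and related tables.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the on-disk places.sqlite
conn.execute("CREATE TABLE moz_places (url TEXT, title TEXT, visit_count INTEGER)")
conn.executemany(
    "INSERT INTO moz_places VALUES (?, ?, ?)",
    [("https://example.com", "Example", 3),
     ("https://example.org/login", "Login", 1)],
)

# In normal mode the study found such records persist in the profile folder;
# in private mode no such records were written to disk.
for url, title, visits in conn.execute(
    "SELECT url, title, visit_count FROM moz_places ORDER BY visit_count DESC"
):
    print(f"{visits:2d} visits  {url}  ({title})")
```

Against a real profile one would open the copied `places.sqlite` file read-only rather than an in-memory database; the query pattern is the same.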
Abstract:
The emergence of multidrug-resistant bacterial infections in both the clinical setting and the community has created an environment in which the development of novel antibacterial compounds is necessary to keep dangerous infections at bay. While the derivatization of existing antibiotics by pharmaceutical companies has so far been successful at achieving this end, this strategy is short-term, and the discovery of antibacterials with novel scaffolds would be a greater contribution to the fight against multidrug-resistant infections. Described herein is the application of both target-based and whole cell screening strategies to identify novel antibacterial compounds. In a target-based approach, we sought small-molecule disruptors of the MazEF toxin-antitoxin protein complex. A lack of facile, continuous assays for this target required the development of a fluorometric assay for MazF ribonuclease activity. This assay was employed to further characterize the activity of the MazF enzyme and was used in a screening effort to identify disruptors of the MazEF complex. In addition, by employing a whole cell screening approach, we identified two compounds with potent antibacterial activity. Efforts to characterize the in vitro antibacterial activities displayed by these compounds and to identify their modes of action are described.
Abstract:
The value of integrating a heat storage into a geothermal district heating system has been investigated. The behaviour of the system under a novel operational strategy has been simulated, focusing on the energetic, economic and environmental effects of incorporating the heat storage within the system. A typical geothermal district heating system consists of several production wells, a system of pipelines for the transportation of the hot water to end-users, one or more re-injection wells and peak-up devices (usually fossil-fuel boilers). Traditionally in these systems, the production wells change their production rate throughout the day according to heat demand, and if their maximum capacity is exceeded the peak-up devices are used to meet the balance of the heat demand. In this study, it is proposed to maintain a constant geothermal production and add heat storage into the network. Hot water will then be stored when heat demand is lower than the production, and the stored hot water will be released into the system to cover the peak demands (or part of these). It is not intended to totally phase out the peak-up devices, but to decrease their use, as these will often be installed anyway for back-up purposes. Both the integration of a heat storage in such a system and the novel operational strategy are the main novelties of this thesis. A robust algorithm for the sizing of these systems has been developed. The main inputs are the geothermal production data, the heat demand data throughout one year or more, and the topology of the installation. The outputs are the sizing of the whole system, including the necessary number of production wells, the size of the heat storage and the dimensions of the pipelines, amongst others. The results provide several useful insights into the initial design considerations for these systems, emphasizing particularly the importance of heat losses.
Simulations are carried out for three different cases of sizing of the installation (small, medium and large) to examine the influence of system scale. In the second phase of work, two algorithms are developed which study in detail the operation of the installation throughout a random day and a whole year, respectively. The first algorithm can be a potentially powerful tool for the operators of the installation, who can know a priori how to operate the installation on a random day given the heat demand. The second algorithm is used to obtain the amount of electricity used by the pumps as well as the amount of fuel used by the peak-up boilers over a whole year. These comprise the main operational costs of the installation and are among the main inputs of the third part of the study. In the third part of the study, an integrated energetic, economic and environmental analysis of the studied installation is carried out, together with a comparison with the traditional case. The results show that by implementing heat storage under the novel operational strategy, heat is generated more cheaply as all the financial indices improve, more geothermal energy is utilised and less fuel is used in the peak-up boilers, with subsequent environmental benefits, when compared to the traditional case. Furthermore, it is shown that the most attractive case of sizing is the large one, although the addition of the heat storage most greatly impacts the medium case of sizing. In other words, the geothermal component of the installation should be sized as large as possible. This analysis indicates that the proposed solution is beneficial from energetic, economic, and environmental perspectives. Therefore, it can be stated that the aim of this study is achieved to its full potential.
Furthermore, the new models for the sizing, operation and economic/energetic/environmental analyses of these kinds of systems can be used with few adaptations for real cases, making the practical applicability of this study evident. Having this study as a starting point, further work could include the integration of these systems with end-user demands, further analysis of component parts of the installation (such as the heat exchangers) and the integration of a heat pump to maximise utilisation of geothermal energy.
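The novel operational strategy described above (constant geothermal production, with a heat store absorbing surpluses and covering peaks before the boilers step in) can be sketched as a simple hourly dispatch loop. All numbers (production, store capacity, the demand profile) are illustrative assumptions, not the thesis's sizing results.

```python
# Minimal sketch of the novel operational strategy: constant geothermal
# production plus a heat store, with peak-up boilers covering any shortfall
# the store cannot meet. All MW/MWh figures are illustrative assumptions.

def dispatch(demand_mw, production_mw=10.0, store_cap_mwh=30.0):
    """Hourly dispatch: surplus charges the store, deficits discharge it,
    and the boiler supplies whatever remains. Returns boiler energy (MWh)."""
    stored = 0.0
    boiler_mwh = 0.0
    for d in demand_mw:  # one demand value per hour
        surplus = production_mw - d
        if surplus >= 0:
            stored = min(store_cap_mwh, stored + surplus)  # charge (capped)
        else:
            deficit = -surplus
            from_store = min(stored, deficit)              # discharge first
            stored -= from_store
            boiler_mwh += deficit - from_store             # boiler covers rest
    return boiler_mwh

# A stylised day: low night-time demand, then an evening peak above production.
day = [6.0] * 8 + [10.0] * 8 + [16.0] * 8
print(f"Boiler energy needed: {dispatch(day):.1f} MWh")
```

With the same profile and no store, the boilers would supply the full 48 MWh evening deficit; the store cuts this to 18 MWh in the sketch, which mirrors the reduced peak-up fuel use reported in the thesis. A real model would also include storage heat losses and pumping electricity, which the thesis identifies as important.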
Abstract:
This is a project of the School of Library, Documentation and Information of the National University, performed to support UNESCO's initiative to build the Memory of the World (UNESCO, 1995) and to help provide universal access to documentation. To this end, the School of Library Science has encouraged students to undertake final graduation work on the documentary control of domestic production. This project has the following objectives:
1. Conduct a national documentary mapping through the identification, analysis, organization of, and access to the documentary heritage of Costa Rica, to contribute to the Memory of the World.
2. Perform bibliometric analysis of the documentary records contained in the integrated databases.
This project seeks to involve undergraduate students graduating from the school in final graduation work on document control. Students have the opportunity to produce final graduation work on the documentary production of Costa Rica, on a specific theme or on a geographical area of the country. Desk audits aim to identify each document using access points and to indicate its contents so as to allow retrieval by the user. The result is the production of a new document, distinct from the original, a secondary document: the bibliography. The records in the database of each completed documentary control work will be integrated into a single database to be placed on the EBDI website, for consultation by researchers and interested users.