916 results for usage-based
Abstract:
The current industry trend is towards using Commercial Off-The-Shelf (COTS) multicores for developing real-time embedded systems, rather than custom-made hardware. In typical implementations of such COTS-based multicores, multiple cores access the main memory via a shared bus. This often leads to contention on this shared channel, which increases the response times of tasks. Analyzing this increased response time, considering the contention on the shared bus, is challenging on COTS-based systems, mainly because bus arbitration protocols are often undocumented and the exact instants at which the shared bus is accessed by tasks are not explicitly controlled by the operating system scheduler; they are instead a result of cache misses. This paper makes three contributions towards analyzing tasks scheduled on COTS-based multicores. Firstly, we describe a method to model the memory access patterns of a task. Secondly, we apply this model to analyze the worst-case response time for a set of tasks. Although the parameters required to obtain the request profile can be derived by static analysis, we provide an alternative method to obtain them experimentally using performance monitoring counters (PMCs). We also compare our work against an existing approach and show that our approach outperforms it by providing a tighter upper bound on the number of bus requests generated by a task.
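The response-time analysis above can be illustrated with a minimal sketch. Assuming a fixed-priority task set, a constant worst-case per-request bus latency, and a simple per-job bus-request bound (all simplifying assumptions for illustration, not the paper's exact model), the classic response-time iteration can be extended with a bus-contention term:

```python
import math

def wcrt(task, hp_tasks, bus_latency, max_iter=1000):
    """Sketch of a worst-case response-time (WCRT) iteration that charges
    each bus request a fixed worst-case bus latency `bus_latency`.
    task / hp_tasks: dicts with 'wcet', 'period' and 'requests'
    (upper bound on bus requests per job); hp_tasks are higher priority."""
    r = task['wcet']
    for _ in range(max_iter):
        # Classic preemption interference from higher-priority jobs.
        interference = sum(math.ceil(r / t['period']) * t['wcet'] for t in hp_tasks)
        # Every bus request of this job and of each preempting job may be
        # delayed by up to `bus_latency` due to contention on the shared bus.
        requests = task['requests'] + sum(
            math.ceil(r / t['period']) * t['requests'] for t in hp_tasks)
        r_next = task['wcet'] + interference + requests * bus_latency
        if r_next == r:
            return r  # fixed point reached
        r = r_next
    return None  # did not converge: task may be unschedulable
```

The iteration terminates when the response time stops growing; a tighter bound on 'requests' (the paper's contribution) directly tightens the result.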
Abstract:
The current ubiquitous network access and increase in network bandwidth are driving the sales of mobile location-aware user devices and, consequently, the development of context-aware applications, namely location-based services. The goal of this project is to provide consumers of location-based services with a richer end-user experience by means of service composition, personalization, device adaptation and continuity of service. Our approach relies on a multi-agent system composed of proxy agents that act as mediators and providers of personalization meta-services, device adaptation and continuity of service for consumers of pre-existing location-based services. These proxy agents, which have Web service interfaces to ensure a high level of interoperability, perform service composition and take into consideration the preferences of the users and the limitations of their devices, making the use of different types of devices seamless for the end-user. To validate and evaluate the performance of this approach, use cases were defined, tests were conducted and results were gathered, demonstrating that the initial goals were successfully fulfilled.
Abstract:
It is unquestionable that an effective decision concerning the usage of a given environmental clean-up technology should be properly supported. A significant amount of scientific work focusing on the reduction of nitrate concentration in drinking water by both metallic iron and nanomaterials, and on their usage in permeable reactive barriers, has been published worldwide over the last two decades. This work presents a systematic review of the most relevant research on the removal of nitrate from groundwater using nanosized iron-based permeable reactive barriers. The review covers scientific papers published between 2004 and June 2014. It was performed using 16 combinations of keywords in 34 databases, according to the PRISMA statement guidelines. Independent reviewers validated the selection criteria. Of the 4161 records filtered, 45 met the selection criteria and were included in this review. This study's outcomes show that permeable reactive barriers are indeed a suitable technology for denitrification, with a good performance record, but the long-term impact of nanosized zero-valent iron in this remediation process, on both the environment and human health, is far from being adequately known. As a consequence, further work is required on this matter, so that nanosized iron-based permeable reactive barriers for the removal of nitrate from drinking water can be genuinely considered an eco-efficient technology.
Abstract:
The future of health care delivery is becoming more citizen-centred, as today's user is more active, better informed and more demanding. The European Commission is promoting online health services and, therefore, member states will need to boost the deployment and use of online services. This makes e-health adoption an important field to be studied and understood. This study applied the extended unified theory of acceptance and use of technology (UTAUT2) to explain patients' individual adoption of e-health. An online questionnaire was administered in Portugal using largely the same instrument as UTAUT2, adapted to the e-health context. We collected 386 valid answers. Performance expectancy, effort expectancy, social influence, and habit had the most significant explanatory power over behavioural intention, and habit and behavioural intention over technology use. The model explained 52% of the variance in behavioural intention and 32% of the variance in technology use. Our research helps to understand the desired technology characteristics of e-health. By testing an information technology acceptance model, we are able to determine what patients value most when deciding whether or not to adopt e-health systems.
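The "variance explained" figures quoted above correspond to the R² statistic. As a conceptual illustration only (UTAUT2 studies typically use PLS/SEM tooling rather than this plain computation), R² compares the residual error of a model's predictions against the variance of the outcome:

```python
def r_squared(y, y_hat):
    """Coefficient of determination: share of the variance in the observed
    outcomes y that is explained by the predictions y_hat."""
    mean = sum(y) / len(y)
    ss_res = sum((obs - pred) ** 2 for obs, pred in zip(y, y_hat))  # residual sum of squares
    ss_tot = sum((obs - mean) ** 2 for obs in y)                    # total sum of squares
    return 1 - ss_res / ss_tot
```

A perfect predictor yields R² = 1; predicting the mean for every respondent yields R² = 0.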
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics and Maastricht University School of Business and Economics.
Abstract:
Recent years have seen an increase in population size as well as fast economic growth, which has led to an increase in energy demand that is met mainly by fossil fuels. In order to reduce the ecological footprint, governments have implemented sustainable measures, and it is expected that by 2035 energy produced from renewable sources such as wind and solar will be responsible for one-third of the energy produced globally. However, since the energy produced from renewable sources is governed by the availability of the respective primary energy source, there is often a mismatch between production and demand, which could be solved by adding flexibility on the demand side through demand response (DR). DR programs influence end-user electricity usage by varying its cost over time. In this scenario, users need to estimate their energy demand and on-site production in advance in order to plan their consumption according to the energy price. This work focuses on the development of an agent-based electrical simulator capable of: (a) estimating the energy demand and on-site generation with a 1-min time resolution for a 24-h period, (b) calculating the energy price for a given scenario, and (c) making suggestions on how to maximize the usage of renewable energy produced on-site and lower electricity costs by rescheduling the use of certain appliances. The results show that this simulator allows the energy bill to be reduced by 11% while almost doubling the use of renewable energy produced on-site.
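As a minimal illustration of the rescheduling idea (the function name and the flat per-slot load model are assumptions for illustration, not the simulator's actual interface), a shiftable appliance's start time can be chosen to minimize its cost under a time-varying price signal:

```python
def cheapest_start(prices, duration, allowed_slots=None):
    """Pick the start slot that minimizes the cost of running an appliance
    for `duration` consecutive time slots under per-slot `prices`.
    allowed_slots optionally restricts candidate starts (e.g. user constraints).
    Returns (start_index, total_cost)."""
    n = len(prices)
    starts = allowed_slots if allowed_slots is not None else range(n - duration + 1)
    # Evaluate every feasible window and keep the cheapest one.
    best = min(starts, key=lambda s: sum(prices[s:s + duration]))
    return best, sum(prices[best:best + duration])
```

A real simulator would combine such per-appliance decisions with the forecast on-site generation, shifting loads into slots where renewable production is high.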
Abstract:
Affiliation: Département de biochimie, Faculté de médecine, Université de Montréal
Abstract:
Ayahuasca is a decoction of plants native to the Amazon rainforest. It contains the vine called ayahuasca (Banisteriopsis caapi) and a shrub (Psychotria viridis). These plants contain psychoactive substances: respectively, harmine and dimethyltryptamine (DMT). This mixture has been used by several indigenous peoples of the Amazon basin since before the Conquest, for shamanic purposes. At the beginning of the 20th century, the demand for rubber drove a migration of workers into these regions. One of these workers, of African origin, came into contact with this substance. Being of Christian faith, he interpreted his experience as an encounter with the divine, which led him to found, in the 1930s, a syncretic church called Santo Daime. Since then, the ritual use of ayahuasca has spread throughout the world. Today, groups making ritual use of ayahuasca can be found in Quebec. This thesis is an ethnography of one group active in Quebec, which was studied through participant observation during the summer of 2010. The study showed that participation in ayahuasca ceremonies brings about moments of insight, and that group work allows their integration into the participants' daily lives. Moreover, the ritual structure guards against abusive use.
Abstract:
Transcranial stimulation technologies, such as tDCS and TMS, currently present interesting therapeutic prospects, as well as various cognitive enhancements in "healthy" subjects, from which follow neuroenhancement applications, more or less imminent, outside the clinical or investigational setting. We propose here to analyze the risks associated with these applications, diverted from the primary goals of research, and the ethical concerns that accompany them (autonomy, justice, physical integrity), through a concept generally associated with research carrying national-security implications and a high level of risk. Moving beyond the triviality of a dichotomous definition of "good" and "bad" uses, we propose to extend the concept of "dual use" and apply it to neuroenhancement as a misuse of neuroscience research. Referring to the conflict between, on the one hand, respect for academic freedom and, on the other, the protection of security and public health, this concept proves to be a relevant diagnostic tool for assessing the risks associated with the enhancement use of these technologies, and of tDCS in particular, in order to inform reflection on the regulation of these devices ahead of their use, following a precautionary principle inherent to dual-use research. The concept thus allows us to consider establishing proactive and contextualized governance involving the shared responsibility of a wide range of actors, which is necessary given the rapid advances in neuroscience and the imminent arrival of these devices on the market.
Abstract:
The usage of a dielectric multilayer around a dielectric sample is studied as a means of improving efficiency in multimode microwave-heating cavities. The results show that, by using additional layers of suitable dielectric constant, undesired reflections at the sample-air interface are avoided, and higher power-absorption rates within the sample and high-efficiency designs are obtained.
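The effect described can be illustrated with the textbook normal-incidence case (a deliberate simplification of the multimode cavity setting): the reflectance at a dielectric interface follows from the Fresnel equations, and a quarter-wave layer with refractive index sqrt(n1*n2) cancels the reflection at the design frequency:

```python
import math

def fresnel_R(n1, n2):
    """Normal-incidence power reflectance at an interface between
    media of refractive indices n1 and n2 (Fresnel equations)."""
    return ((n1 - n2) / (n1 + n2)) ** 2

def matched_index(n1, n2):
    """Refractive index of a quarter-wave matching layer between n1 and n2:
    with n = sqrt(n1 * n2), the two partial reflections cancel at the
    design frequency, eliminating the interface reflection."""
    return math.sqrt(n1 * n2)
```

For example, a bare interface between air (n = 1) and a high-permittivity sample (n = 9) reflects a large fraction of the incident power, which an intermediate layer of index 3 removes at the design frequency. This is the same impedance-matching principle that motivates surrounding the sample with additional dielectric layers.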
Abstract:
Sharing of information with those in need of it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that it is imperative to have well-organized schemes for retrieval and also discovery. This thesis attempts to investigate the problems associated with such schemes and suggests a software architecture aimed towards achieving meaningful discovery. The usage of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron. The investigations are focused on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal and a proper mechanism for carrying out information discovery becomes feasible. An empirical study undertaken with the aid of an election database of geographically distributed constituencies provided the insights required. This is manifested in the Election Counting and Reporting Software (ECRS) System, a software system, essentially distributed in nature, designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports. Most distributed systems of the nature of ECRS normally possess a "fragile architecture" which makes them amenable to collapse with the occurrence of minor faults. This is resolved with the help of the proposed penta-tier architecture, which places five different technologies at the different tiers of the architecture. The results of the experiment conducted, and their analysis, show that such an architecture helps to keep the different components of the software intact and impermeable to internal or external faults.
The architecture thus evolved needed a mechanism to support information processing and discovery. This necessitated the introduction of the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary. The other empirical study was to find out which of the two prominent markup languages, HTML and XML, is best suited for the incorporation of infotrons. A comparative study of 200 documents in HTML and XML was undertaken; the result was in favour of XML. The concepts of the infotron and the infotron dictionary were then applied to implement an Information Discovery System (IDS). IDS is essentially a system that starts with the infotron(s) supplied as clue(s) and results in brewing the information required to satisfy the need of the information discoverer, by utilizing the documents available at its disposal (as information space). The various components of the system and their interaction follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly. Many subsystems interacted with multiple infotron dictionaries maintained in the system. In order to demonstrate the working of the IDS, and to discover information without modifying a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system could be enhanced with the augmentation of IDS, leading to an information discovery service. IDLIS demonstrates IDS in action.
IDLIS proves that any legacy system can be augmented with IDS effectively to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are also covered.
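The abstract does not define the infotron data structure precisely, so the following is a purely illustrative sketch in which an "infotron dictionary" maps clue terms to the documents that contain them, and discovery intersects the matches for the supplied clues:

```python
def discover(infotron_dict, clues):
    """Return the documents that match every supplied clue.
    infotron_dict: dict mapping a clue term to an iterable of document ids.
    An empty clue list yields no matches."""
    if not clues:
        return set()
    # Start from the documents matching the first clue, then narrow down.
    result = set(infotron_dict.get(clues[0], ()))
    for clue in clues[1:]:
        result &= set(infotron_dict.get(clue, ()))
    return result
```

In the IDS described above, the dictionary lookup would be only the first step, feeding the subsequent tiers of the penta-tier architecture.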
Abstract:
Cooperative caching in mobile ad hoc networks aims at improving the efficiency of information access by reducing access latency and bandwidth usage. The cache replacement policy plays a vital role in improving the performance of a cache in a mobile node, since the node has limited memory. In this paper, we propose a new key-based cache replacement policy, called E-LRU, for cooperative caching in ad hoc networks. The proposed replacement scheme considers the time interval between recent references, size and consistency as the key factors for replacement. A simulation study shows that the proposed replacement policy can significantly improve cache performance in terms of cache hit ratio and query delay.
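A sketch of a replacement decision in the spirit of the policy above: the abstract names the inter-reference interval, size and consistency as the key factors, but the scoring formula and weights below are illustrative assumptions, not the paper's actual policy.

```python
def choose_victim(cache):
    """cache: dict mapping key -> record with 'last2' (times of the two most
    recent accesses), 'size' (bytes) and 'ttl_left' (remaining consistency
    lifetime in seconds). Evict the entry with the largest score: long
    inter-reference gap, large size, and close to becoming stale."""
    def score(rec):
        t1, t2 = rec['last2']
        inter_ref = abs(t2 - t1)  # gap between the two most recent references
        # Illustrative weighting: penalize rarely re-referenced, large,
        # nearly-stale entries; guard against division by zero.
        return inter_ref * rec['size'] / max(rec['ttl_left'], 1e-9)
    return max(cache, key=lambda k: score(cache[k]))
```

Using the interval between the last two references (rather than only the last access time, as plain LRU does) favours keeping items that are re-referenced in bursts.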
Abstract:
The role urban and peri-urban agriculture (UPA) plays in reducing urban poverty and ensuring environmental sustainability was recognized by the Millennium Development Goals (MDGs). India is the world's largest democratic nation, with a population of 1.2 billion. Rapid urbanization and a high proportion of people below the poverty line, along with high migration to urban areas, make India vulnerable to food crises and the urbanization of poverty. Ensuring jobs and food security among the urban poor is a major challenge in India, and the role of UPA can be well explained and understood in this context. This paper focuses on the current situation of UPA production in India, with special attention to wastewater irrigation. Questions are raised about the various human health risks from wastewater irrigation, faced firstly by farmers and labourers and secondly by consumers; the possible health hazards involve microbial pathogens as well as helminths (intestinal parasites). Based on primary and secondary data, this paper attempts to confirm that UPA is one of the best options for addressing increasing urban food demand and can serve to complement rural supply chains and reduce ecological footprints in India. "Good practice urban and peri-urban agriculture" necessitates an integrated approach with suitable risk reduction mechanisms to improve the efficiency and safety of UPA production.
Abstract:
We present a type-based approach to statically derive symbolic closed-form formulae that characterize the bounds on heap memory usage of programs written in object-oriented languages. Given a program with size and alias annotations, our inference system computes the amount of memory required by the methods to execute successfully, as well as the amount of memory released when methods return. The analysis results obtained are useful for networked devices with limited computational resources as well as for embedded software.
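As a much-simplified illustration of the kind of bound involved (the pair representation and the composition rule below are assumptions for illustration, not the paper's type system): if each method call is summarized by a pair of (memory required to run, memory released on return), such pairs compose sequentially by tracking the peak demand and the net heap consumed:

```python
def seq_compose(effects):
    """effects: list of (require, release) pairs, one per call, in execution
    order. Returns a single (require, release) pair for the whole sequence:
    the peak amount of free heap needed before starting, and the amount
    released by the end."""
    require = 0  # worst-case free heap needed at any point
    net = 0      # net heap consumed so far by completed calls
    for req, rel in effects:
        # The i-th call needs `req` free heap on top of what is already held.
        require = max(require, net + req)
        net += req - rel
    # Whole-sequence release is chosen so that require - release = net consumed.
    return require, require - net
```

For instance, a call needing 3 units and freeing 1, followed by one needing 2 and freeing 2, has a peak demand of 4 units even though the net consumption is only 2; it is exactly this distinction between peak and net usage that symbolic memory bounds must capture.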