957 results for end user programming
Abstract:
Energy service companies (ESCOs) face a range of challenges and opportunities associated with the rapidly changing and flexible requirements of energy customers (end users) and with rapid improvements in energy and ICT technologies. These opportunities for innovation include better prediction of energy demand, transparency of data for the end user, flexible and time-dependent energy pricing, and a range of novel finance models. The liberalisation of energy markets across the world has led to a very small price differential between suppliers on the unit cost of energy. Energy companies are therefore looking to add additional layers of value using service models borrowed from the manufacturing industry. This opens a range of new product and service offerings to energy markets and consumers and has implications for the overall efficiency, utility and price of energy provision.
Conceptual Model and Security Requirements for DRM Techniques Used for e-Learning Objects Protection
Abstract:
This paper deals with the security problems of DRM-protected e-learning content. After a short review of the main DRM systems and methods used in e-learning, the participants in DRM schemes (e-learning object author, content creator, content publisher, license creator and end user) are examined. A conceptual model of the security-related processes of a DRM implementation is then proposed, and is subsequently refined to reflect some particularities of the DRM protection of e-learning objects. A methodical approach is used to describe the security-related motives, responsibilities and goals of the main participants in the DRM system. Taken together with the process model, these security properties are used to establish a list of requirements to fulfill and to enable formal verification of real DRM systems' compliance with these requirements.
Abstract:
A real-time adaptive resource allocation algorithm that considers the end user's Quality of Experience (QoE) in the context of video streaming services is presented in this work. An objective no-reference quality metric, namely Pause Intensity (PI), is used to control the priority of resource allocation to users during the scheduling process. An online adjustment is introduced to adaptively set the scheduler's parameter and maintain a desired trade-off between fairness and efficiency. The correlation between the data rates (i.e. video code rates) demanded by users and the data rates allocated by the scheduler is taken into account as well. The final allocated rates are determined based on the channel status, the distribution of PI values among users, and the scheduling policy adopted. Furthermore, since the user's capability varies as environmental conditions change, the rate adaptation mechanism for video streaming is considered and its interaction with the scheduling process under the same PI metric is studied. The feasibility of implementing this algorithm is examined and the result is compared with the most common existing scheduling methods.
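The abstract does not specify the scheduler itself; as a purely illustrative sketch (the linear weighting rule, the `alpha` trade-off knob, and all names are assumptions, not the authors' algorithm), a priority rule that biases allocation toward users with a high Pause Intensity might look like:

```python
# Hypothetical sketch: split a fixed pool of resource blocks so that users with
# worse QoE (higher Pause Intensity) receive proportionally more resources.
# The weighting scheme is invented for illustration only.

def allocate(pause_intensity, total_blocks, alpha=1.0):
    """Split `total_blocks` among users; `alpha` tunes fairness vs. efficiency.

    alpha = 0 gives an equal split; larger alpha skews the allocation
    toward users whose streams are pausing the most.
    """
    weights = [1.0 + alpha * pi for pi in pause_intensity]
    total_w = sum(weights)
    return [total_blocks * w / total_w for w in weights]

# Three users with PI values 0.0 (no pauses), 0.2 and 0.8 (frequent pauses):
shares = allocate([0.0, 0.2, 0.8], total_blocks=100, alpha=1.0)
print([round(s, 1) for s in shares])  # [25.0, 30.0, 45.0]
```

Setting `alpha` per scheduling round would correspond to the online adjustment the abstract describes, moving the operating point between a fair equal split and an efficiency-driven one.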
Abstract:
Interestingness in association rules has been a major topic of research in the past decade. The reason is that the strength of association rules, i.e. their ability to discover ALL patterns given some thresholds on support and confidence, is also their weakness. Indeed, a typical association rule analysis on real data often results in hundreds or thousands of patterns, creating a data mining problem of the second order. In other words, it is not straightforward to determine which of those rules are interesting for the end-user. This paper provides an overview of some existing measures of interestingness and comments on their properties. In general, interestingness measures can be divided into objective and subjective measures. Objective measures tend to express interestingness by means of statistical or mathematical criteria, whereas subjective measures of interestingness aim at capturing more practical criteria that should be taken into account, such as unexpectedness or actionability of rules. This paper focuses only on objective measures of interestingness.
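The thresholds the abstract mentions, and the simplest objective measures built on them, can be made concrete with a toy example. The transaction set and the rule below are invented for illustration; support, confidence, and lift are the standard definitions:

```python
# Objective interestingness measures for a rule X -> Y, on an invented
# five-transaction toy database.

transactions = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"bread", "milk"},
    {"milk"},
    {"bread", "butter", "jam"},
]

def support(itemset, db):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in db) / len(db)

def confidence(x, y, db):
    """Estimate of P(Y | X): support of X ∪ Y divided by support of X."""
    return support(x | y, db) / support(x, db)

def lift(x, y, db):
    """Confidence relative to the baseline support of Y; > 1 suggests
    a positive correlation rather than a rule that merely restates how
    frequent Y is on its own."""
    return confidence(x, y, db) / support(y, db)

x, y = {"bread"}, {"butter"}
print(round(support(x | y, transactions), 4))   # 0.6
print(round(confidence(x, y, transactions), 4))  # 0.75
print(round(lift(x, y, transactions), 4))        # 1.25
```

Measures such as lift illustrate the point about second-order mining: a rule can pass high support and confidence thresholds yet have lift near 1, meaning it tells the end-user nothing beyond the base frequency of Y.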
Abstract:
Clinical decision support systems (CDSSs) often base their knowledge and advice on human expertise. Knowledge representation needs to be in a format that can be easily understood by human users as well as supporting ongoing knowledge engineering, including evolution and consistency of knowledge. This paper reports on the development of an ontology specification for managing knowledge engineering in a CDSS for assessing and managing risks associated with mental-health problems. The Galatean Risk and Safety Tool, GRiST, represents mental-health expertise in the form of a psychological model of classification. The hierarchical structure was directly represented in the machine using an XML document. Functionality of the model and knowledge management were controlled using attributes in the XML nodes, with an accompanying paper manual for specifying how end-user tools should behave when interfacing with the XML. This paper explains the advantages of using the web-ontology language, OWL, as the specification, details some of the issues and problems encountered in translating the psychological model to OWL, and shows how OWL benefits knowledge engineering. The conclusions are that OWL can have an important role in managing complex knowledge domains for systems based on human expertise without impeding the end-users' understanding of the knowledge base. The generic classification model underpinning GRiST makes it applicable to many decision domains and the accompanying OWL specification facilitates its implementation.
Abstract:
PRELIDA (PREserving LInked DAta) is an FP7 Coordination Action funded by the European Commission under the Digital Preservation Theme. PRELIDA targets the particular stakeholders of the Linked Data community, including data providers, service providers, technology providers and end user communities. These stakeholders have not been traditionally targeted by the Digital Preservation community, and are typically not aware of the digital preservation solutions already available. An important task of PRELIDA is therefore to raise awareness of existing preservation solutions and to facilitate their uptake. At the same time, the Linked Data cloud has specific characteristics in terms of structuring, interlinkage, dynamicity and distribution that pose new challenges to the preservation community. PRELIDA organises in-depth discussions between the two communities to identify which of these characteristics require novel solutions, and to develop road maps for addressing the new challenges. PRELIDA will complete its lifecycle at the end of this year, and the talk will report on its major findings.
Abstract:
A significant body of research investigates the acceptance of computer-based support (including devices and applications ranging from e-mail to specialized clinical systems, like PACS) among clinicians. Much of this research has focused on measuring the usability of systems using characteristics related to the clarity of interactions and ease of use. We propose that an important attribute of any clinical computer-based support tool is the intrinsic motivation of the end-user (i.e. a clinician) to use the system in practice. In this paper we present the results of a study that investigated factors motivating medical doctors (MDs) to use computer-based support. Our results demonstrate that MDs value computer-based support and find it useful and easy to use; however, uptake is hindered by perceived incompetence and by the pressure and tension associated with using technology.
Abstract:
Mobile communication and networking infrastructures play an important role in the development of smart cities, supporting the real-time information exchange and management required in modern urbanization. Mobile WiFi devices that help offload data traffic from the macro-cell base station and serve end users within a closer range can significantly improve the connectivity of wireless communications between essential components of a city, including infrastructural and human devices. However, this offloading function, achieved through interworking between LTE and WiFi systems, changes the pattern of resource distribution operated by the base station. In this paper, a resource allocation scheme is proposed to ensure stable service coverage and end-user quality of experience (QoE) when offloading takes place in a macro-cell environment. In this scheme, a rate redistribution algorithm is derived to form a parametric scheduler that meets the required levels of efficiency and fairness, guided by a no-reference quality assessment metric. We show that the performance of resource allocation can be regulated by this scheduler without affecting the service coverage offered by the WLAN access point. The performances of different interworking scenarios and macro-cell scheduling policies are also compared.
Abstract:
eHabitat is a Web Processing Service (WPS) designed to compute the likelihood of finding ecosystems with equal properties. Inputs to the WPS, typically thematic geospatial "layers", can be discovered using standardised catalogues, and the outputs tailored to specific end user needs. Because these layers can range from geophysical data captured through remote sensing to socio-economic indicators, eHabitat is exposed to a broad range of different types and levels of uncertainties. Potentially chained to other services to perform ecological forecasting, for example, eHabitat would be an additional component further propagating uncertainties from a potentially long chain of model services. This integration of complex resources increases the challenges in dealing with uncertainty. For such a system, as envisaged by initiatives such as the "Model Web" from the Group on Earth Observations, to be used for policy or decision making, users must be provided with information on the quality of the outputs, since all system components will be subject to uncertainty. UncertWeb will create the Uncertainty-Enabled Model Web by promoting interoperability between data and models with quantified uncertainty, building on existing open, international standards. It is the objective of this paper to illustrate a few key ideas behind UncertWeb, using eHabitat to discuss the main types of uncertainties the WPS has to deal with and to present the benefits of the UncertWeb framework.
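The abstract does not describe UncertWeb's internal mechanisms; as a generic, illustrative sketch of the underlying idea (uncertainty in an input layer propagating through a chain of model services), a Monte Carlo approach with invented model functions might look like:

```python
import random

# Hypothetical two-stage model chain: an input "layer" value with Gaussian
# uncertainty is pushed through two invented model steps; the spread of the
# outputs quantifies the uncertainty propagated to the end of the chain.

def model_a(x):
    return 2.0 * x + 1.0     # invented first service in the chain

def model_b(y):
    return y ** 2 / 10.0     # invented downstream service

def propagate(mean, std, n=10_000, seed=42):
    """Sample the uncertain input, run the chain, summarise the outputs."""
    rng = random.Random(seed)
    outputs = [model_b(model_a(rng.gauss(mean, std))) for _ in range(n)]
    mu = sum(outputs) / n
    var = sum((o - mu) ** 2 for o in outputs) / (n - 1)
    return mu, var ** 0.5

mu, sigma = propagate(mean=5.0, std=0.5)
print(round(mu, 2), round(sigma, 2))
```

The point the abstract makes is visible even in this toy: the output standard deviation is several times the input one, so a user seeing only the chain's final value would have no idea how uncertain it is unless each component reports quantified uncertainty.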
Abstract:
The quest for renewable energy sources has led to growing attention to research on organic photovoltaics (OPVs) as a promising alternative to fossil fuels, since these devices have low manufacturing costs and attractive end-user qualities, such as ease of installation and maintenance. Wide application of OPVs is mainly limited by the devices' lifetime. With the development of new encapsulation materials, some degradation factors, such as water and oxygen ingress, can almost be excluded, whereas the thermal degradation of the devices remains a major issue. Two aspects have to be addressed to solve the problem of thermal instability: bulk effects in the photoactive layer and interfacial effects at the photoactive/charge-transporting layers. In this work, the interface between the photoactive layer and electron-transporting zinc oxide (ZnO) in devices with inverted architecture was engineered by introducing polymeric interlayers based on zinc-binding ligands, such as 3,4-dihydroxybenzene and 8-hydroxyquinoline. A cross-linkable layer of poly(3,4-dimethoxystyrene) and its fullerene derivative were also studied. First, controlled reversible addition-fragmentation chain transfer (RAFT) polymerisation was employed to achieve well-defined polymers in a range of molar masses, all bearing a chain-end functionality for further modification. The resulting polymers were fully characterised, including their thermal and optical properties, and introduced as interlayers to study their effect on initial device performance and thermal stability. Poly(3,4-dihydroxystyrene) and its fullerene derivative were found unsuitable for application in devices, as they increased the work function of ZnO and created a barrier for electron extraction. On the other hand, their parent polymer, poly(3,4-dimethoxystyrene), and its fullerene derivative, upon cross-linking, resulted in enhanced efficiency and stability of devices compared to the control.
Polymers based on the 8-hydroxyquinoline ligand had a negative effect on the initial stability of the devices, but increased the lifetime of the cells under accelerated thermal stress. Comprehensive studies of the key mechanisms determining efficiency, such as charge generation and extraction, were performed using time-resolved electrical and spectroscopic techniques, in order to understand in detail the effect of the interlayers on device performance. The obtained results allow deeper insight into the mechanisms of degradation that limit the lifetime of devices and inform the design of better materials for interface stabilisation.
Abstract:
There has been a strengthening debate within the European Union in recent years about the impact of affordable industrial and household electricity prices on the general competitiveness of European economies. While the European institutions argue for further liberalization of the energy retail sector, others believe in centralization and price control to achieve lower energy prices. This paper reviews the regulatory models of the European countries and examines the connection between the regulatory regime and consumer price trends. The analysis can help answer whether bureaucratic central regulation or the liberalized market model is more successful in supporting competitiveness goals. Although current regulatory practice is heterogeneous across the EU member states, there is a clear trend towards decreasing the role of regulated tariffs in end-user prices. Our study did not find a general causal relationship between the regulatory regime and the level of consumer electricity prices in a given country.
However, quantitative analysis of industrial and household energy prices by segment detected significant differences between the regulated and free-market countries. The first group of member states tends to decrease prices for low-consuming household segments through cross-financing techniques, including increased network tariffs and/or taxes for high-consuming segments and industrial consumers. One of the major challenges for regulatory authorities is to find a proper way of sharing these burdens proportionally while minimizing the market-distorting effects of cross-subsidization between the different stakeholder groups.
Abstract:
This research presents several components encompassing the scope of the objective of data partitioning and replication management in a distributed GIS database. Modern Geographic Information Systems (GIS) databases are often large and complicated. Therefore, data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research is to study the patterns of geographical raster data processing and to propose algorithms to improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases, to achieve high data availability and Quality of Service (QoS) considering distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach for mosaicking digital images of different temporal and spatial characteristics into tiles is proposed. This dynamic approach reuses digital images on demand and generates mosaicked tiles only for the required region, according to the user's requirements such as resolution, temporal range, and target bands, to reduce redundancy in storage and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for efficient acquisition of GIS data from external heterogeneous databases and Web services, as well as end-user GIS data delivery enhancements, automation, and 3D virtual reality presentation. There are vast numbers of computing, network, and storage resources on the Internet that are idle or not fully utilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources to be used in a GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database.
The approach developed in this dissertation has resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers, and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
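The dissertation's actual mosaicking algorithm is not given in the abstract; as an illustrative sketch of the demand-driven idea it describes (all records, field names, and the selection rule are invented), selecting source images per tile against a user's requirements might look like:

```python
from dataclasses import dataclass

# Hypothetical catalogue of source images: keep only those matching the
# requested tiles and time range, then prefer the finest resolution per tile,
# so mosaicked tiles are generated only for the region actually requested.

@dataclass
class Image:
    tile: str            # id of the tile this image covers
    year: int            # acquisition year
    resolution_m: float  # ground resolution in metres (smaller is finer)

catalogue = [
    Image("T1", 2005, 30.0),
    Image("T1", 2008, 15.0),
    Image("T2", 2007, 30.0),
    Image("T2", 2001, 15.0),  # falls outside the time range requested below
]

def select_sources(catalogue, tiles, year_range):
    """Pick, per requested tile, the finest-resolution image in the time range."""
    best = {}
    lo, hi = year_range
    for img in catalogue:
        if img.tile in tiles and lo <= img.year <= hi:
            if img.tile not in best or img.resolution_m < best[img.tile].resolution_m:
                best[img.tile] = img
    return best

chosen = select_sources(catalogue, tiles={"T1", "T2"}, year_range=(2004, 2010))
print({t: (i.year, i.resolution_m) for t, i in chosen.items()})
```

Reuse falls out naturally: the same catalogue entries can serve many requests, and only the tiles a request touches are ever mosaicked.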
Abstract:
The availability and pervasiveness of new communication services, such as mobile networks and multimedia communication over digital networks, has resulted in strong demand for approaches to modeling and realizing customized communication systems. The stovepipe approach used to develop today's communication applications is no longer effective, because it results in a lengthy and expensive development cycle. To address this need, the Communication Virtual Machine (CVM) technology has been developed by researchers at Florida International University. The CVM technology includes the Communication Modeling Language (CML) and the platform, CVM, to model and rapidly realize communication models. In this dissertation, we investigate the basic communication primitives needed to capture and specify an end-user's requirements for communication-intensive applications, and how these specifications can be automatically realized. To identify the basic communication primitives, we perform a feature analysis on a set of communication-intensive scenarios from the healthcare domain. Based on the feature analysis, we define a new version of CML that includes the meta-model definition (abstract syntax and static semantics) and a partial behavior model (operational semantics). To validate our CML definition, we present a case study that shows how one of the scenarios from the healthcare domain is modeled and automatically realized.
Abstract:
The 9/11 Act mandates the inspection of 100% of cargo shipments entering the U.S. by 2012 and 100% inspection of air cargo by March 2010. So far, only 5% of inbound shipping containers are inspected thoroughly while air cargo inspections have fared better at 50%. Government officials have admitted that these milestones cannot be met since the appropriate technology does not exist. This research presents a novel planar solid phase microextraction (PSPME) device with enhanced surface area and capacity for collection of the volatile chemical signatures in air that are emitted from illicit compounds for direct introduction into ion mobility spectrometers (IMS) for detection. These IMS detectors are widely used to detect particles of illicit substances and do not have to be adapted specifically to this technology. For static extractions, PDMS and sol-gel PDMS PSPME devices provide significant increases in sensitivity over conventional fiber SPME. Results show a 50–400 times increase in mass detected of piperonal and a 2–4 times increase for TNT. In a blind study of 6 cases suspected to contain varying amounts of MDMA, PSPME-IMS correctly detected 5 positive cases with no false positives or negatives. One of these cases had minimal amounts of MDMA resulting in a false negative response for fiber SPME-IMS. A La (dihed) phase chemistry has shown an increase in the extraction efficiency of TNT and 2,4-DNT and enhanced retention over time. An alternative PSPME device was also developed for the rapid (seconds) dynamic sampling and preconcentration of large volumes of air for direct thermal desorption into an IMS. This device affords high extraction efficiencies due to strong retention properties under ambient conditions resulting in ppt detection limits when 3.5 L of air are sampled over the course of 10 seconds. 
Dynamic PSPME was used to sample the headspace over the following: MDMA tablets (12–40 ng detected of piperonal), high explosives (Pentolite) (0.6 ng detected of TNT), and several smokeless powders (26–35 ng of 2,4-DNT and 11–74 ng DPA detected). PSPME-IMS technology is flexible to end-user needs, is low-cost, rapid, sensitive, easy to use, easy to implement, and effective.
Abstract:
In their article - Sales Promotion In Hotels: A British Perspective - by Francis Buttle, Lecturer, Department of Hotel, Restaurant, and Travel Administration, University of Massachusetts and Ini Akpabio, Property Manager, Trusthouse Forte, Britain, Buttle and Akpabio initially state: "Sales promotion in hotels is in its infancy. Other industries, particularly consumer goods manufacturing, have long recognized the contribution that sales promotion can make to the cost-effective achievement of marketing objectives. Sales promotion activities in hotels have remained largely uncharted. The authors define, identify and classify these hotel sales promotion activities to understand their function and form, and to highlight any scope for improvement." The authors begin their discussion by attempting to define what the phrase sales promotion [SP] actually means. "The Institute of Sales Promotion regards sales promotions as 'adding value, usually of a temporary nature, to a product or service in order to persuade the end user to purchase that particular brand as opposed to a competitive brand,'" the authors offer. Williams, however, describes sales promotions more broadly as "short term tactical marketing tools which are used to achieve specific marketing objectives during a defined time period," Buttle and Akpabio present with attribution. "The most significant difference between these two viewpoints is that Williams does not limit his definition to activities which are targeted at the consumer," is their educated view. A lot of the discussion is centered on the differences in the collective marketing-promotional mix. "…it is not always easy to definitively categorize promotional activity," Buttle and Akpabio say.
“For example, in personal selling, a sales promotion such as a special bonus offer may be used to close the sale; an advertisement may be sales promotional in character in that it offers discounts.” Are promotion and marketing distinguishable as two separate entities? “…not only may there be conceptual confusion between components of the promotional mix, but there is sometimes a blurring of the boundaries between the elements of the marketing mix,” the authors suggest. “There are several reasons why SP is particularly suitable for use in hotels: seasonality, increasing competitiveness, asset characteristics, cost characteristics, increased use of channel intermediaries, new product launches, and deal proneness.” Buttle and Akpabio offer their insight on each of these segments. The authors also want you to know that SP customer applications are not the only game in town, SP trade applications are just as essential. Bonuses, enhanced commission rates, and vouchers are but a few examples of trade SP. The research for the article was compiled from several sources including, mail surveys, telephone surveys, personal interviews, trade magazines and newspapers; essentially in the U.K.