923 results for "point of view of service users"
Abstract:
Different treatments that could be implemented in the home environment are evaluated with the objective of achieving a more rational and efficient use of energy. We consider a detailed knowledge of energy-consuming behaviour to be paramount for the development and implementation of new technologies, services and even policies that could result in more rational energy use. The proposed evaluation methodology is based on economic experiments implemented in an experimental economics laboratory, where the behaviour of individuals making decisions related to energy use in the domestic environment can be tested.
Abstract:
With the increasing importance of digital communication and its distinct characteristics, the marketing tools and strategies adopted by companies have changed dramatically. Among the many digital marketing tools and new media channels available to marketers, the phenomenon known as social media is one of the most complex and enigmatic. Its reach is still largely unexplored, and it deeply transforms the current view of the promotion mix (Mangold & Faulds, 2009). Conversations among users on social media directly affect their perceptions of products, services and brands. More than that, a wide range of other subjects can also become topics of conversation on social media. Hit songs, sporting events, celebrity news and even natural disasters and politics often go viral on the web. Companies must grasp this: to become more interesting and relevant, they must take part in these conversations, inserting their brands into these dynamic online dialogues. This paper focuses on how these social interactions are manifested on the web in two distinct cultures, Brazil and China. By understanding the similarities and differences between these cultures, this study helps firms better adjust their marketing efforts across regions, targeting and positioning themselves not only geographically and culturally but also across different web platforms (Facebook and RenRen). By examining how companies should focus their efforts according to each segment in social media, firms can also maximize their communication results and mitigate risks. The findings suggest that differences in cultural dimensions in these two countries directly affect their virtual social networking behavior along many dimensions (Identity, Presence, Relationships, Reputation, Groups, Conversations and Sharing). Accordingly, marketing efforts must be tailored to each culture's behaviors and expectations.
Abstract:
The wide diffusion of cheap, small, and portable sensors integrated in an unprecedented variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and promptly analyzed, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of cross-cutting domains such as entertainment, health-care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality-of-service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, by performing a large experimental study on the prototype of our novel LAAR dynamic replication technique.
Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
Abstract:
The full blood cell (FBC) count is the most common indicator of disease. At present, hematology analyzers are used for blood cell characterization, but there has recently been interest in techniques that take advantage of microscale devices and intrinsic properties of cells for increased automation and decreased cost. Microfluidic technologies offer solutions for handling and processing small volumes of blood (2–50 µL taken by finger prick) for point-of-care (PoC) applications. Several PoC blood analyzers are in use and may have applications in telemedicine, outpatient monitoring and medical care in resource-limited settings. They have the advantage of being easy to move and much cheaper than traditional analyzers, which require bulky instruments and consume large amounts of reagents. The development of miniaturized point-of-care diagnostic tests may be enabled by chip-based technologies for cell separation and sorting. Many current diagnostic tests depend on fractionated blood components: plasma, red blood cells (RBCs), white blood cells (WBCs), and platelets. Specifically, white blood cell differentiation and counting provide valuable information for diagnostic purposes. For example, a low number of WBCs, called leukopenia, may indicate bone marrow deficiency or failure, collagen-vascular diseases, or disease of the liver or spleen. Leukocytosis, a high number of WBCs, may be due to anemia, infectious diseases, leukemia or tissue damage. In the Laboratory of Hybrid Biodevices at the University of Southampton, a functioning micro-impedance cytometer technology for WBC differentiation and counting was developed. It is capable of classifying cells and particles on the basis of their dielectric properties, in addition to their size, without the need for labeling, in a flow format similar to that of a traditional flow cytometer.
It was demonstrated that the micro-impedance cytometer system can detect and differentiate monocytes, neutrophils and lymphocytes, the three major human leukocyte populations. The simplicity and portability of the microfluidic impedance chip offer a range of potential applications in cell analysis, including point-of-care diagnostic systems. The microfluidic device has been integrated into a sample-preparation cartridge that semi-automatically performs erythrocyte lysis before leukocyte analysis. Erythrocytes are generally lysed manually according to a specific chemical lysis protocol, but this process has been automated in the cartridge. In this research work, the chemical lysis protocol defined in patent US 5155044 A was optimized in order to improve the white blood cell differentiation and count performed by the integrated cartridge.
Abstract:
Resource management is of paramount importance in network scenarios and is a long-standing, still open issue. Unfortunately, while technology and innovation continue to evolve, our network infrastructure has remained almost unchanged for decades, a phenomenon known as "Internet ossification". Software-Defined Networking (SDN) is an emerging paradigm in computer networking that allows a logically centralized software program to control the behavior of an entire network. This is done by decoupling the network control logic from the underlying physical routers and switches that forward traffic to the selected destination. One mechanism that allows the control plane to communicate with the data plane is OpenFlow. Network operators can write high-level control programs that specify the behavior of an entire network. Moreover, centralized control makes it possible to bring more specific and complex tasks, involving many network functionalities such as security, resource management and control, into a single framework. Nowadays, the explosive growth of real-time applications that require stringent Quality of Service (QoS) guarantees leads network programmers to design protocols that deliver certain performance guarantees. This thesis exploits SDN in conjunction with OpenFlow to manage differentiated network services with high QoS. Initially, we define a QoS Management and Orchestration architecture that allows us to manage the network in a modular way. Then, we provide a seamless integration between the architecture and the standard SDN paradigm, following the separation between the control and data planes. This work is a first step towards the deployment of our proposal in the University of California, Los Angeles (UCLA) campus network with differentiated services and stringent QoS requirements.
We also plan to exploit our solution to manage handoff between different network technologies, e.g., Wi-Fi and WiMAX. Indeed, the model can be run with different parameters depending on the communication protocol, and can provide optimal results to be implemented on the campus network.
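The control/data-plane separation described in this abstract can be illustrated with a toy flow table: a centralized controller computes match-action rules, and forwarding elements only look packets up against the installed rules. This is a minimal sketch of the general SDN idea, not code from the thesis; all class names, ports, and actions are invented for illustration.

```python
# Toy illustration of SDN-style control/data-plane separation.
# The "controller" decides QoS treatment centrally; "switches" only
# match packets against installed rules. All names are hypothetical.

class Switch:
    def __init__(self):
        self.flow_table = []  # (match_fn, action) pairs, highest priority first

    def install_rule(self, match_fn, action):
        self.flow_table.append((match_fn, action))

    def forward(self, packet):
        # Data plane: first matching rule wins.
        for match_fn, action in self.flow_table:
            if match_fn(packet):
                return action
        return "drop"  # default action when nothing matches

class Controller:
    """Centralized control logic: assigns traffic classes to queues."""
    def configure(self, switch):
        # Real-time traffic (e.g. SIP/VoIP signaling on UDP port 5060)
        # is steered to a priority queue; the port choice is illustrative.
        switch.install_rule(lambda p: p.get("udp_dport") == 5060,
                            "enqueue:priority")
        # Everything else goes to the best-effort queue.
        switch.install_rule(lambda p: True, "enqueue:best_effort")

sw = Switch()
Controller().configure(sw)
print(sw.forward({"udp_dport": 5060}))  # enqueue:priority
print(sw.forward({"tcp_dport": 80}))    # enqueue:best_effort
```

In a real OpenFlow deployment the same role split holds, but rules carry explicit priorities and match on header fields rather than arbitrary predicates.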
Abstract:
Multicasting is an efficient mechanism for one-to-many data dissemination. Unfortunately, IP multicasting is not widely available to end users today, but Application Layer Multicast (ALM), such as the Content Addressable Network (CAN), helps to overcome this limitation. Our OM-QoS framework offers Quality of Service support for ALMs. We evaluated OM-QoS applied to CAN and show that we can guarantee that all multicast paths support certain QoS requirements.
Abstract:
BACKGROUND Implementation of user-friendly, real-time, electronic medical records for patient management may lead to improved adherence to clinical guidelines and improved quality of patient care. We detail the systematic, iterative process that implementation partners, Lighthouse clinic and Baobab Health Trust, employed to develop and implement a point-of-care electronic medical records system in an integrated, public clinic in Malawi that serves HIV-infected and tuberculosis (TB) patients. METHODS Baobab Health Trust, the system developers, conducted a series of technical and clinical meetings with Lighthouse and the Ministry of Health to determine specifications. Multiple pre-testing sessions assessed patient flow, question clarity and information sequencing, and verified compliance with national guidelines. The final components of the TB/HIV electronic medical records system include: patient demographics; anthropometric measurements; laboratory samples and results; HIV testing; WHO clinical staging; TB diagnosis; family planning; clinical review; and drug dispensing. RESULTS Our experience suggests that an electronic medical records system can improve patient management, enhance integration of TB/HIV services, and improve provider decision-making. However, despite sufficient funding and motivation, several challenges delayed the system launch, including: expansion of system components to include HIV testing and counseling services; changes in the national antiretroviral treatment guidelines that required system revision; and low confidence in using the system among new healthcare workers. To ensure a more robust and agile system that met all stakeholder and user needs, our electronic medical records launch was delayed by more than a year. Open communication with stakeholders, careful consideration of ongoing provider input, and a well-functioning backup paper-based TB registry helped ensure successful implementation and sustainability of the system.
Additional on-site technical support provided reassurance and swift problem-solving during the extended launch period. CONCLUSION Even when system users are closely involved in the design and development of an electronic medical record system, it is critical to allow sufficient time for software development, solicitation of detailed feedback from both users and stakeholders, and iterative system revisions to successfully transition from paper to point-of-care electronic medical records. For those in low-resource settings, electronic medical records for integrated care are a feasible and positive innovation.
Abstract:
The prevalence of obesity has reached epidemic proportions in the United States: twenty-five percent of school-aged students are overweight. Schools have the opportunity to help slow this epidemic. School cafeterias in the United States feed millions of students every day through the National School Lunch Program. Point-of-sale machines are used in most school cafeterias to help streamline the process of purchasing school lunches. The point-of-sale software allows school personnel to place special notes on students' accounts to provide alerts about parental requests. This study investigated what the alerts are used for, who uses them, and whether there are any patterns by demographic characteristics. Counts and percentages were used to determine what the alerts were used for and who used them. This study found that students who were white non-Hispanic, of paid status, or in elementary school were most likely to have alerts placed on their accounts. Also, the majority of point-of-sale alerts were used as allowances (i.e., allowed to purchase snacks from the balance on the school lunch account) rather than restrictions (i.e., restricted from purchasing high-calorie foods or specific food items). Using chi-square analysis, a total of 688 alerts were analyzed. There were significant differences in alert frequencies for intent category by grade level (p < 0.001), snack access (p < 0.001), and gender (p = 0.002). One can therefore conclude that there is a significant relationship between gender, grade level, snack access, and the presence of an alert on the school lunch account. School administrators may also want to consider changes to their program, such as allowing more time to run the software.
The results of this study can help school administrators better understand that a point-of-sale alert program may help their school lunch programs run more efficiently, while also providing parental influence on students' food choices at the point of sale. School food service authorities should consider implementing a structured point-of-sale alert policy to encourage parental input on their children's food choices. When implementing the point-of-sale policy, schools should publicize it online, through school lunch menus, and in parent communications to increase participation throughout the school district.
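The chi-square test used in this study compares observed alert counts against the counts expected if the two variables were independent. The sketch below illustrates the computation on an invented 2x2 table (alert presence by gender); the counts are hypothetical and are not the study's data.

```python
# Hypothetical 2x2 contingency table in the spirit of the study:
# presence of a point-of-sale alert (rows) by gender (columns).
# Counts are invented for illustration only.
observed = [[300, 200],   # alert present: [male, female]
            [150, 180]]   # alert absent:  [male, female]

row_totals = [sum(r) for r in observed]
col_totals = [sum(c) for c in zip(*observed)]
grand = sum(row_totals)

# Pearson chi-square statistic: sum over cells of (O - E)^2 / E,
# where E is the expected count under independence.
chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand
        chi2 += (obs - expected) ** 2 / expected

# For a 2x2 table, df = 1; the 0.05 critical value is 3.841.
print(f"chi2 = {chi2:.3f}, significant = {chi2 > 3.841}")
```

With these invented counts the statistic comes out around 16.9, well above the critical value, so independence would be rejected at the 0.05 level.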
Abstract:
Telecommunications networks have always been expanding and, thanks to this, new services have appeared. The old mechanisms for carrying packets have become obsolete due to the requirements of the new services, which have begun working in real time. Real-time traffic requires strict service guarantees. When this traffic is sent through the network, enough resources must be provided to avoid delays and information loss. When browsing the Internet and requesting web pages, data must be sent from a server to the user. If any packet is dropped during the transmission, it is sent again. For the end user, it does not matter if the web page takes one or two seconds longer to load. But if the user is holding a conversation with a VoIP program such as Skype, one or two seconds of delay in the conversation may be catastrophic, and neither party can understand the other. To support these new services, networks have to evolve, and for this purpose MPLS and QoS were developed. MPLS is a packet-carrying mechanism used in high-performance telecommunication networks that directs and carries data along pre-established paths. Packets are forwarded on the basis of labels, making this process faster than routing packets by their IP addresses. MPLS also supports Traffic Engineering (TE), the process of selecting the best paths for data traffic in order to balance the traffic load between the different links. In a network with multiple paths, routing algorithms calculate the shortest one, and most of the time all traffic is directed through it, causing overload and packet drops, while the other paths the network offers carry no traffic. But this is not enough to give real-time traffic the guarantees it needs. These mechanisms improve the network, but they do not change how the traffic is treated. That is why Quality of Service (QoS) was developed.
Quality of Service is the ability to provide different priorities to different applications, users, or data flows, or to guarantee a certain level of performance to a data flow. Traffic is distributed into different classes, and each of them is treated differently according to its Service Level Agreement (SLA). Traffic with the highest priority has preference over lower classes, but this does not mean it monopolizes all the resources. To achieve this goal, a set of policies is defined to control and alter how the traffic flows. The possibilities are endless and depend on how the network must be structured. By using these mechanisms it is possible to provide the necessary guarantees to real-time traffic, distributing it between categories inside the network and offering the best service for both real-time and non-real-time data.
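The label-based forwarding this abstract contrasts with IP routing can be sketched as an exact-match table lookup: each router keeps a label forwarding table and either swaps the incoming label or pops it at the penultimate hop. This is a toy model of the general MPLS idea, not the thesis's implementation; routers, labels, and interface names are invented.

```python
# Minimal sketch of MPLS label switching: each router forwards on a
# short label via exact-match lookup instead of longest-prefix IP
# routing. Labels and interfaces are invented for illustration.

# Per-router Label Forwarding Information Base: label -> (out_label, out_if)
lfib = {
    "R1": {100: (200, "if1")},   # swap label 100 -> 200, send out if1
    "R2": {200: (None, "if3")},  # penultimate hop: pop the label
}

def forward(router, label):
    out_label, out_if = lfib[router][label]
    op = "swap" if out_label is not None else "pop"
    return op, out_label, out_if

print(forward("R1", 100))  # ('swap', 200, 'if1')
print(forward("R2", 200))  # ('pop', None, 'if3')
```

Because the lookup is a single exact match on a fixed-size label, it is cheaper than matching a destination address against a prefix table, which is the speed advantage the abstract refers to.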
Abstract:
Since the beginning of the Internet, Internet Service Providers (ISPs) have seen the need to give users' traffic different treatments, defined by agreements between the ISP and its customers. This procedure, known as Quality of Service Management, has not changed much in recent years (DiffServ and Deep Packet Inspection have been the most frequently chosen mechanisms). However, the incremental growth of Internet users and services, jointly with the application of recent Machine Learning techniques, opens up the possibility of going one step forward in the smart management of network traffic. In this paper, we first survey current tools and techniques for QoS Management. Then we introduce clustering and classifying Machine Learning techniques for traffic characterization, and the concept of Quality of Experience. Finally, with all these components, we present a brand new framework that manages Quality of Service in a smart way in a telecom Big Data scenario, for both mobile and fixed communications.
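The clustering side of traffic characterization mentioned above can be illustrated with a tiny k-means run over per-flow features. The sketch below is a pure-Python toy, not the paper's framework; the flows, feature values, and deterministic initialization are all invented for illustration.

```python
# Toy k-means clustering of traffic flows described by two invented
# features: (mean packet size in bytes, mean inter-arrival time in s).
flows = [(1400, 0.5), (1350, 0.6), (1450, 0.4),   # bulk-transfer-like
         (160, 0.02), (200, 0.03), (180, 0.025)]  # VoIP-like

def kmeans(points, iters=10):
    # Deterministic init for reproducibility: one center from each end
    # of the list (a real run would use random or k-means++ seeding).
    centers = [points[0], points[-1]]
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # Update step: move each center to its cluster mean
        # (assumes no cluster goes empty, true for this toy data).
        centers = [tuple(sum(v) / len(cl) for v in zip(*cl))
                   for cl in clusters]
    return clusters

bulk, voip = kmeans(flows)
print(bulk)  # the three large-packet flows
print(voip)  # the three small-packet, low inter-arrival flows
```

Once flows are grouped like this, each cluster can be mapped to a QoS class, which is the kind of traffic characterization step the paper's framework builds on.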
Abstract:
Customer Satisfaction Surveys (CSS) have become an important tool for public transport planners, as improvements in the perceived quality of service lead to greater use of public transport and lower traffic pollution. Until now, Intelligent Transportation System (ITS) enhancements in public transport have traditionally included fleet management systems based on Automatic Vehicle Location (AVL) technologies, which can be used to optimize routing and scheduling and to feed real-time information into passenger information channels. However, surveys of public transport users could also benefit from the new information technologies. As most customers carry their smartphones when traveling, Quick Response (QR) codes open up the possibility of conducting these surveys at a lower cost. This paper contributes to the limited existing literature by developing the analysis of QR codes applied to CSS in public transport and highlighting their importance in reducing the cost of data collection and processing. The added value of this research is that it provides the first assessment of a real case study in Madrid (Spain) using QR codes for this purpose. This pilot experience was part of a research project analyzing bus service quality in the same case study, so the QR code survey (155 valid questionnaires) was validated against a conventional face-to-face survey (520 valid questionnaires). The results clearly show that, after overcoming a few teething troubles, this QR code application will ultimately provide transport management with a useful tool to reduce survey costs.
Abstract:
We present the results of a study that collected, compared and analyzed the terms and conditions of a number of cloud services with respect to privacy and data protection. First, we assembled a list of factors that comprehensively capture cloud companies' treatment of user data with regard to privacy and data protection; then, we assessed how various cloud services of different types protect their users in the collection, retention, and use of their data, as well as in disclosure to law enforcement authorities. This commentary provides a comparative and aggregate analysis of the results.
Abstract:
Cover title.
Abstract:
In double columns; English and Hebrew in opposite columns.