962 results for End-user Queries
Abstract:
With the rapid advance of web service technologies, end-users can conduct various online tasks, such as shopping online. Usually, end-users compose a set of services to accomplish a task and need to enter values into those services to invoke the composition. Quite often, users revisit websites and use services to perform recurring tasks, entering the same information into various web services each time. Repetitively typing the same information into services is tedious and negatively impacts the user experience. Recent studies have proposed several approaches to help users fill values into services automatically, but prior work mainly suffers from the following drawbacks: (1) limited support for collecting and analyzing user inputs; (2) poor accuracy when filling values into services; and (3) designs that do not account for service composition. To overcome these drawbacks, we need to maximize the reuse of previous user inputs across services and end-users. In this thesis, we introduce approaches that spare end-users from entering the same information in recurring online tasks. More specifically, we improve the process of filling out services in the following four aspects. First, we investigate the characteristics of input parameters and propose an ontology-based approach to automatically categorize parameters and fill values into the categorized input parameters. Second, we propose a comprehensive framework that incorporates user contexts and usage patterns into the process of filling values into services. Third, we propose an approach that maximizes value propagation among services and end-users by linking sets of semantically related parameters and similar end-users. Last, we propose a ranking-based framework that ranks previous user inputs for an input parameter to save the user from unnecessary data entry. The framework learns and analyzes interactions between user inputs and input parameters to rank user inputs for input parameters under different contexts.
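As a rough illustration of the ranking idea described above, the sketch below scores previously entered values for a parameter by how often they were reused and how closely their usage context matches the current one. All names, the scoring rule, and the data are hypothetical; the thesis's actual learning and ranking method is not reproduced here.

```python
from collections import defaultdict

def rank_candidate_values(history, parameter_category, current_context):
    """Rank previously entered values for a parameter category.

    history: list of dicts with keys 'category', 'value', 'context' (a set of tags).
    """
    scores = defaultdict(float)
    for entry in history:
        if entry["category"] != parameter_category:
            continue
        overlap = len(entry["context"] & current_context)
        # Each past reuse contributes 1; shared context tags add extra weight.
        scores[entry["value"]] += 1.0 + overlap
    return sorted(scores, key=scores.get, reverse=True)

history = [
    {"category": "shipping_address", "value": "12 Elm St", "context": {"shopping", "home"}},
    {"category": "shipping_address", "value": "Office Park 3", "context": {"work"}},
    {"category": "email", "value": "user@example.com", "context": {"shopping"}},
]
print(rank_candidate_values(history, "shipping_address", {"shopping"}))
# -> ['12 Elm St', 'Office Park 3'] under this toy scoring
```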
Abstract:
Designing for users rather than with users, instead of taking them on board in the process, is still a common practice in technology design and innovation. Design for inclusion aims to define and understand end-users, their needs, and their context of use and, by doing so, ensure that end-users are catered for and included, while the results are geared towards universality of use. We describe the central role of end-user and designer participation, immersion, and perspective in building user-driven solutions. These approaches provided a critical understanding of the counterpart's role. Designers could understand what the users' needs were, experience physical impairments, and see the interaction with the environment from the other's perspective. Users could understand the challenges of designing for physical impairments, build a sense of ownership of the technology, and explore it from a creative perspective. The understanding of the peer's role (user and designer), needs, and perspective enhanced user participation and inclusion.
Collection-Level Subject Access in Aggregations of Digital Collections: Metadata Application and Use
Abstract:
Problems in subject access to information organization systems have been under investigation for a long time. Focusing on item-level information discovery and access, researchers have identified a range of subject access problems, including the quality and application of metadata, as well as the complexity of user knowledge required for successful subject exploration. While aggregations of digital collections built in the United States and abroad generate collection-level metadata of various levels of granularity and richness, no research has yet focused on the role of collection-level metadata in user interaction with these aggregations. This dissertation research sought to bridge this gap by answering the question “How does collection-level metadata mediate scholarly subject access to aggregated digital collections?” This goal was achieved using three research methods: • in-depth comparative content analysis of collection-level metadata in three large-scale aggregations of cultural heritage digital collections: Opening History, American Memory, and The European Library; • transaction log analysis of user interactions with Opening History; and • interview and observation data on academic historians interacting with two aggregations: Opening History and American Memory. It was found that subject-based resource discovery is significantly influenced by collection-level metadata richness. This richness includes components such as: 1) describing a collection's subject matter with mutually complementary values in different metadata fields, and 2) a variety of collection properties and characteristics encoded in the free-text Description field; types and genres of objects in a digital collection, as well as topical, geographic, and temporal coverage, are the most consistently represented collection characteristics in free-text Description fields. Analysis of user interactions with aggregations of digital collections yielded a number of interesting findings. Item-level user interactions were found to occur more often than collection-level interactions. Collection browsing is initiated more often than search, and subject browsing (topical and geographic) is used most often. The majority of collection search queries fall within FRBR Group 3 categories: object, concept, and place. Significantly more object, concept, and corporate body searches, and fewer individual person, event, and class-of-persons searches, were observed in collection searches than in item searches. While a collection search is most often satisfied by the Description and/or Subjects collection metadata fields, it would not retrieve a significant proportion of collection records without controlled-vocabulary subject metadata (Temporal Coverage, Geographic Coverage, Subjects, and Objects) and free-text metadata (the Description field). Observation data show that collection metadata records in the Opening History and American Memory aggregations are often viewed. Transaction log data show a high level of engagement with collection metadata records in Opening History, with total page views for collections more than four times greater than item page views. Scholars who were observed viewing collection records valued descriptive information on provenance, collection size, types of objects, subjects, geographic coverage, and temporal coverage.
They also considered the structured display of collection metadata in Opening History more useful than the alternative approach taken by other aggregations, such as American Memory, which displays only the free-text Description field to the end-user. The results extend the understanding of the value of collection-level subject metadata, particularly free-text metadata, for the scholarly users of aggregations of digital collections. The analysis of the collection metadata created by three large-scale aggregations provides a better understanding of collection-level metadata application patterns and suggests best practices. This dissertation is also the first empirical research contribution to test the FRBR model as a conceptual and analytic framework for studying collection-level subject access.
Abstract:
The ever-increasing number and severity of cybersecurity breaches make it vital to understand the factors that render organizations vulnerable. Since humans are considered the weakest link in an organization's cybersecurity chain, this study evaluates users' individual differences (demographic factors, risk-taking preferences, decision-making styles, and personality traits) to understand online security behavior. This thesis studies four distinct yet tightly related online security behaviors that influence organizational cybersecurity: device securement, password generation, proactive awareness, and updating. A survey (N=369) of students, faculty, and staff at a large mid-Atlantic U.S. public university identifies individual characteristics that relate to online security behavior and characterizes the higher-risk individuals who pose threats to the university's cybersecurity. Based on these findings and insights from interviews with phishing victims, the study concludes with recommendations to help similar organizations increase end-user cybersecurity compliance and mitigate the risks caused by humans in the organizational cybersecurity chain.
Abstract:
A hybrid system to automatically detect, locate, and classify disturbances affecting power quality in an electrical power system is presented in this paper. The disturbances characterized are events from an actual power distribution system simulated with the ATP (Alternative Transients Program) software. The hybrid approach consists of two stages. In the first stage, the wavelet transform (WT) is used to detect disturbances in the system and to locate the time of their occurrence. When such an event is flagged, the second stage is triggered and various artificial neural networks (ANNs) are applied to classify the data measured during the disturbance(s). A computational logic using WTs and ANNs, together with a graphical user interface (GUI) between the algorithm and its end user, is then implemented. The results obtained so far are promising and suggest that this approach could lead to a useful application in an actual distribution system. (C) 2009 Elsevier Ltd. All rights reserved.
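The sketch below illustrates the two-stage idea in broad strokes, assuming PyWavelets and scikit-learn: wavelet detail coefficients flag and localize a disturbance, and per-level wavelet energies feed a small neural network classifier. The wavelet, threshold, feature set, and network topology are placeholders rather than the ones used in the paper, and the training data here are synthetic.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def detect_disturbance(signal, wavelet="db4", threshold=0.5):
    """Flag samples whose level-1 detail coefficients exceed a threshold."""
    _, detail = pywt.dwt(signal, wavelet)
    return np.abs(detail) > threshold  # boolean mask locating the event in time

def wavelet_features(signal, wavelet="db4", level=3):
    """Energy of each decomposition level, used as input features for the ANN."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return [float(np.sum(c ** 2)) for c in coeffs]

# Toy training step: X holds feature vectors for labelled disturbance windows.
X = np.array([wavelet_features(np.random.randn(256)) for _ in range(20)])
y = np.random.choice(["sag", "swell", "harmonic"], size=20)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
```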
Abstract:
Solid-liquid phase equilibrium modeling of triacylglycerol mixtures is essential for lipid design. Treating the alpha polymorph and the liquid phase as ideal, the Margules 2-suffix excess Gibbs energy model with predictive binary parameter correlations describes the non-ideal beta and beta' solid polymorphs. Solving by direct optimization of the Gibbs free energy makes it possible to predict, from a bulk mixture composition, the phase compositions at a given temperature and hence the SFC curve, the melting profile, and the Differential Scanning Calorimetry (DSC) curve, all of which relate to end-user lipid properties. Phase diagram, SFC, and DSC curve experimental data are qualitatively and quantitatively well predicted for the binary mixture 1,3-dipalmitoyl-2-oleoyl-sn-glycerol (POP) and 1,2,3-tripalmitoyl-sn-glycerol (PPP); for the ternary mixture 1,3-dimyristoyl-2-palmitoyl-sn-glycerol (MPM), 1,2-distearoyl-3-oleoyl-sn-glycerol (SSO), and 1,2,3-trioleoyl-sn-glycerol (OOO); and for palm oil and cocoa butter. The addition to palm oil of Medium-Long-Medium type structured lipids is then evaluated, using caprylic acid as the medium chain and long-chain fatty acids (EPA, eicosapentaenoic acid; DHA, docosahexaenoic acid; gamma-linolenic, octadecatrienoic acid; and AA, arachidonic acid) as sn-2 substitutes. EPA, DHA, and AA increase the melting range on both the fusion and crystallization sides. Gamma-linolenic acid shifts the melting range upwards. This predictive tool is useful for the pre-screening of lipids matching desired properties set a priori.
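For reference, the standard Margules 2-suffix (one-parameter) relations underlying the solid-phase description are written below for a binary system; the predictive correlation used for the binary parameter and the multicomponent extension applied in the paper are not reproduced here.

```latex
% Margules 2-suffix excess Gibbs energy for a binary (non-ideal) solid phase
% and the activity coefficients it implies; A_{12} is the single binary parameter.
\begin{align}
  G^{E} &= A_{12}\, x_1 x_2,\\
  RT\ln\gamma_1 &= A_{12}\, x_2^{2}, \qquad RT\ln\gamma_2 = A_{12}\, x_1^{2}.
\end{align}
% Phase equilibrium then follows from direct minimization of the total Gibbs energy
% of the liquid + alpha + beta' + beta assemblage at fixed temperature and bulk composition.
```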
Abstract:
The increasing adoption of information systems in healthcare has led to a scenario where patient information security is increasingly regarded as a critical issue. Allowing patient information to be put in jeopardy may lead to irreparable physical, moral, and social damage to the patient, potentially shaking the credibility of the healthcare institution. Medical images play a crucial role in this context, given their importance in diagnosis, treatment, and research. It is therefore vital to take measures to prevent tampering and to determine their provenance, which demands security mechanisms that assure information integrity and authenticity. A number of works in this field follow two major approaches: the use of metadata and the use of watermarking. However, both approaches still have limitations that must be properly addressed. This paper presents a new method that uses cryptographic means to improve the trustworthiness of medical images, providing a stronger link between the image and the information on its integrity and authenticity without compromising image quality for the end user. The use of Digital Imaging and Communications in Medicine (DICOM) structures is also an advantage for ease of development and deployment.
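As a minimal illustration of cryptographically binding integrity and authenticity information to an image, the sketch below hashes the pixel data and authenticates the digest with a keyed MAC using Python's standard library. It is not the scheme proposed in the paper, and it ignores DICOM structures and key management entirely.

```python
import hashlib, hmac, os

def protect_image(pixel_bytes: bytes, key: bytes) -> dict:
    digest = hashlib.sha256(pixel_bytes).hexdigest()                   # integrity
    tag = hmac.new(key, digest.encode(), hashlib.sha256).hexdigest()   # authenticity
    return {"sha256": digest, "hmac": tag}

def verify_image(pixel_bytes: bytes, key: bytes, record: dict) -> bool:
    digest = hashlib.sha256(pixel_bytes).hexdigest()
    expected = hmac.new(key, digest.encode(), hashlib.sha256).hexdigest()
    return digest == record["sha256"] and hmac.compare_digest(expected, record["hmac"])

key = os.urandom(32)
record = protect_image(b"...raw pixel data...", key)
assert verify_image(b"...raw pixel data...", key, record)
```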
Abstract:
This paper presents a study carried out to evaluate students' perception of the development and use of remote Control and Automation education kits developed by two universities. Three projects based on real-world environments were implemented and operated both locally and remotely. Students implemented the kits using their theoretical and practical knowledge, with the teachers acting as catalysts in the learning process. Once the kits were operational, end-user students became acquainted with them in the course curricular units. It is the authors' belief that successful results were achieved not only in learning progress in the Automation and Control fields (hard skills) but also in the development of the students' soft skills, leading to encouraging and rewarding goals, motivating their future decisions, and promoting synergies in their work. The design of experimental learning kits by students, under teacher supervision, for future use in course curricula by end-user students is an advantageous and rewarding experience.
Abstract:
Master's in Electrical and Computer Engineering
Abstract:
Thesis submitted in fulfilment of the requirements for the degree of Master in Electronic and Telecommunications Engineering
Abstract:
The main objectives of this study are the characterization of one of the extrusion lines at Cabelte, namely the extrusion line with reference EP5, composed of two extruders, together with the determination of energy and process indicators and the optimization of energy consumption, both in terms of the energy consumed and of the thermal losses of this line. To monitor the EP5 extrusion line, a central measuring device was installed in the line's main switchboard; for the auxiliary extruder, however, measurements were taken with a clamp ammeter and a phase meter. Tests were also carried out to evaluate the amount of material processed, using a weighing device (a gravimetric doser fitted to the extruders). Temperature measurements for the thermal-loss calculations of the main extruder and for the characterization of the plastic materials were made with a digital thermometer. Throughput tests were performed on the auxiliary and main extruders, and the variation of the power factor as a function of screw speed was studied. From the end user's perspective, optimization for the rational use of energy lies in reducing the electricity bill. That bill depends not only on the quantity of energy used but also on when it is used, since the cost of electricity depends strongly on the period in which it is consumed. A different production planning methodology, scheduling the cables with the highest specific cost for the hours with the lowest energy cost, would reduce specific costs by 18.7% under the summer tariff schedule and by 20.4% under the winter schedule. The sheathing materials used (PE and PVC) directly influence energy costs, since polyethylene (PE) always shows higher enthalpy values (0.317 kWh/kg and 0.281 kWh/kg) and requires higher working temperatures than polyvinyl chloride (PVC) (0.141 kWh/kg and 0.124 kWh/kg). Specific consumption tends to decrease as screw speed increases, up to the optimum speed, beyond which the trend reverses. For both extruders under study, cos φ always increases with screw speed. This study made it possible to identify the optimal conditions for the cable sheathing process so as to minimize energy consumption. Reducing waste of every kind (over-consumption, purge waste) is a management priority that combines effectiveness with efficiency and is a fundamental tool for securing the company's future. The average power factor measured for the EP5 line (0.38) is extremely low; besides the economic penalty associated with the reactive energy it implies, it constrains future expansion. The power factor can be corrected by installing a 500 kVAr capacitor bank. Under the new tariff scheme applied to reactive energy, this yields a saving of 36,167.4 Euro/year with a payback period of 0.37 year (4.5 months). This measure also implies a 6.5% annual reduction in the amount of CO2 emitted. Quantifying thermal losses is important, because only then can courses of action be defined to increase energy efficiency. Without in-depth knowledge of the processes and of the correct methodologies there can be no efficient solutions, so it is important to measure before moving forward with any management action.
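A quick arithmetic cross-check of the reported figures (the capacitor-bank investment itself is not given in the abstract, so it is back-calculated here from the stated saving and payback):

```latex
% Implied investment and payback for the 500 kVAr capacitor bank, from the
% stated annual saving of 36,167.4 EUR/year and payback of 0.37 year.
\[
  I \approx 0.37\ \text{yr} \times 36\,167.4\ \text{EUR/yr} \approx 1.34\times10^{4}\ \text{EUR},
  \qquad
  \text{payback} = \frac{I}{36\,167.4\ \text{EUR/yr}} \approx 0.37\ \text{yr} \approx 4.5\ \text{months}.
\]
```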
Abstract:
In a first phase, this Master's dissertation aims to identify the general conditions and assumptions for applying the Value Analysis (VA) tool and to integrate it into Quality Management Systems. The aim is to demonstrate the technique and broaden knowledge of it, as well as of the various approaches to the process and the advantages and constraints of its use, and to lead to the idea that it may be useful to carry out an organized, systematic analysis of the products/services existing in organizations, opening the way to new product/service solutions that are easier to produce/deliver and test at lower cost. The importance of the Value Analysis concept is highlighted, showing that it can become an effective tool for improving not only products but also manufacturing processes and even administrative processes. With Quality understood as the set of characteristics of a good, product, or service that make it fit to fully satisfy a given need of its user, this work also makes the connection to Quality Management Systems by comparing two reference standards, NP EN 12973 and ISO 9001:2008. In a second phase, the QFD (Quality Function Deployment) tool is examined in depth as a technique complementary to the practical application of VA, and a study of an after-sales service is carried out that draws on many of its concepts and principles. The work was carried out at the company where I have been employed for about 10 years as “Service Manager Press/Post Press” in the technical service and customer support department. The practical demonstration was very useful for understanding the difficulties experienced and the obstacles to be overcome. The work ends with the conclusions of the practical case and the general conclusions, mentioning the definitions of the accelerators and obstacles to applying VA.
Abstract:
The study of biosignals has had a transforming role in multiple aspects of our society, going well beyond the health sciences domains with which they were traditionally associated. While biomedical engineering is a classical discipline in which the topic is amply covered, today biosignals are a matter of interest for students, researchers, and hobbyists in areas including computer science, informatics, and electrical engineering, among others. Regardless of the context, the use of biosignals in experimental activities and practical projects is heavily constrained by cost and by limited access to adequate support materials. In this paper we present an accessible yet versatile toolkit, composed of low-cost hardware and software, created to reinforce the engagement of different people with the field of biosignals. The hardware consists of a modular wireless biosignal acquisition system that can be used to support classroom activities, interface with other devices, or perform rapid prototyping of end-user applications. The software comprises a set of programming APIs, a biosignal processing toolbox, and a framework for real-time data acquisition and postprocessing. (C) 2014 Elsevier Ireland Ltd. All rights reserved.
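As a generic taste of the kind of postprocessing such a toolkit supports, the sketch below band-pass filters a synthetic raw biosignal with SciPy. It does not use the toolkit's own APIs; the sampling rate, band limits, and signal are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                        # assumed sampling rate in Hz
t = np.arange(0, 5, 1 / fs)
raw = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)  # synthetic noisy trace

def bandpass(signal, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

filtered = bandpass(raw, 0.5, 40.0, fs)   # keep an assumed physiologically relevant band
```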
Abstract:
The overall goal of the REMPLI project is to design and implement a communication infrastructure for distributed data acquisition and remote control operations using the power grid as the communication medium. The primary target application is remote meter reading with high time resolution, where the meters can be energy, heat, gas, or water meters. The users of the system (e.g. utility companies) will benefit from the REMPLI system by gaining more detailed information about how energy is consumed by the end-users. In this context, power-line communication (PLC) is deployed to cover the distance between the utility company's private network and the end user. This document specifies a protocol for real-time PLC in the framework of the REMPLI project, mainly comprising the Network Layer and the Data Link Layer. The protocol was designed taking into consideration the specific aspects of the network: different network topologies (star, tree, ring, multiple paths), dynamic changes in network topology (due to network maintenance, hazards, etc.), and communication lines strongly affected by noise.
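Purely for illustration, the sketch below encodes a toy Data Link Layer frame with addressing, a sequence number, and a CRC, the kind of fields a protocol for noisy power-line channels typically carries. The field names, sizes, and layout are hypothetical and do not describe the actual REMPLI frame format.

```python
from dataclasses import dataclass
import struct, zlib

@dataclass
class PlcFrame:
    src: int          # source node address
    dst: int          # destination node address
    seq: int          # sequence number, supports retransmission over noisy lines
    payload: bytes    # Network Layer data unit

    def encode(self) -> bytes:
        header = struct.pack(">HHB", self.src, self.dst, self.seq)
        body = header + self.payload
        return body + struct.pack(">I", zlib.crc32(body))  # CRC to detect noise-induced errors

frame = PlcFrame(src=1, dst=42, seq=7, payload=b"meter-reading")
wire = frame.encode()
```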
Abstract:
Traditional Real-Time Operating Systems (RTOS) are not designed to accommodate application-specific requirements. They address a general case, and the application must co-exist with any limitations imposed by such a design. For modern real-time applications this limits the quality of service offered to the end-user. Research in this field has shown that it is possible to develop dynamic systems where adaptation is the key to success. However, adaptation requires full knowledge of the system state. To overcome this, we propose a framework to gather data and interact with the operating system, extending the traditional POSIX trace model with a partial reflective model. Such a combination preserves the trace mechanism semantics while creating a powerful platform for developing new dynamic systems, with little impact on the system and without complex changes to the kernel source code.
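As a conceptual illustration only (in Python rather than the POSIX trace API the abstract refers to), the sketch below gathers runtime events with a trace hook and exposes the reflected call statistics to a toy adaptation policy. It is not the proposed framework, which operates at the RTOS/kernel level.

```python
# Conceptual only: gather execution events via a trace hook (the "trace" side)
# and let an adaptation policy inspect the collected state (the "reflective" side).
import sys
from collections import Counter

call_counts = Counter()

def trace_hook(frame, event, arg):
    if event == "call":
        call_counts[frame.f_code.co_name] += 1   # record which functions run, and how often
    return trace_hook

def adapt():
    """Toy adaptation policy: react when one function dominates the observed workload."""
    if not call_counts:
        return
    name, hits = call_counts.most_common(1)[0]
    if hits > 100:
        print(f"adapting: '{name}' called {hits} times")

sys.settrace(trace_hook)
# ... application code would run here, with adapt() invoked periodically ...
sys.settrace(None)
```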