848 results for User friendly interface
Abstract:
In his study - Evaluating and Selecting a Property Management System - Galen Collins, Assistant Professor, School of Hotel and Restaurant Management, Northern Arizona University, states briefly at the outset: “Computerizing a property requires a game plan. Many have selected a Property Management System without much forethought and have been unhappy with the final results. The author discusses the major factors that must be taken into consideration in the selection of a PMS, based on his personal experience.” Although this article was written in 1988 and some of its information may be dated, there are many salient points to consider. “Technological advances have encouraged many hospitality operators to rethink how information should be processed, stored, retrieved, and analyzed,” offers Collins. “Research has led to the implementation of various cost-effective applications addressing almost every phase of operations,” he says in introducing the computer technology germane to many PMS functions. Professor Collins discusses the Request for Proposal, its conditions, and its relevance in negotiating a PMS purchase. The author also wants the system buyer to be aware [not necessarily beware] of vendor recommendations, and not to rely solely on them. Exercising forethought will help the buyer avoid purchasing an inadequate PMS. Remember, the vendor is there first and foremost to sell you a system. This doesn’t necessarily mean that the adjectives unreliable and unethical are on the table, but do be advised. Professor Collins presents a graphic outline of the Weighted Average Approach to Scoring Vendor Evaluations. Among the several elements analyzed in this essay, Professor Collins advises that a prospective buyer not overlook the service factor when choosing a PMS. “In a hotel environment, the special emphasis should be on service. System downtime can be costly and aggravating and will happen periodically,” Collins warns. Professor Collins also examines the PMS system environment, a factor whose importance should not be underestimated. “The design of the computer system should be based on the physical layout of the property and the projected workloads. The heart of the system, housed in a protected, isolated area, can support work stations strategically located throughout the property,” Professor Collins explains. A Property Profile Description is outlined in Table 1. The author also points out that ease of operation is another significant factor to think about. “A user-friendly software package allows the user to easily move through the program without encountering frustrating obstacles,” says Collins. “Programs that require users to memorize abstract abbreviations, codes, and information to carry out standard routines should be avoided,” he counsels.
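Collins’ Weighted Average Approach lends itself to a short worked sketch. The criteria, weights, and ratings below are invented for illustration and are not taken from his table:

```python
# Illustrative weighted-average vendor scoring in the spirit of Collins'
# approach; criteria, weights, and ratings are hypothetical examples.

criteria_weights = {
    "service": 0.30,        # Collins stresses service above all
    "ease_of_use": 0.25,
    "functionality": 0.25,
    "cost": 0.20,
}

vendor_ratings = {
    "Vendor A": {"service": 8, "ease_of_use": 7, "functionality": 9, "cost": 6},
    "Vendor B": {"service": 9, "ease_of_use": 8, "functionality": 7, "cost": 7},
}

def weighted_score(ratings, weights):
    """Weighted average of 0-10 ratings; the weights sum to 1."""
    return sum(ratings[c] * w for c, w in weights.items())

for vendor, ratings in vendor_ratings.items():
    print(f"{vendor}: {weighted_score(ratings, criteria_weights):.2f}")
```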
Abstract:
In his discussion - Database As A Tool For Hospitality Management - William O’Brien, Assistant Professor, School of Hospitality Management at Florida International University, offers at the outset, “Database systems offer sweeping possibilities for better management of information in the hospitality industry. The author discusses what such systems are capable of accomplishing.” The author opens with a bit of background on database system development, which also lends an impression as to the complexion of the rest of the article; it’s a shade technical. “In early 1981, Ashton-Tate introduced dBase II. It was the first microcomputer database management processor to offer relational capabilities and a user-friendly query system combined with a fast, convenient report writer,” O’Brien informs. “When 16-bit microcomputers such as the IBM PC series were introduced late the following year, more powerful database products followed: dBase III, Friday!, and Framework. The effect on the entire business community, and the hospitality industry in particular, has been remarkable,” he further offers with his informed outlook. Professor O’Brien offers a few anecdotal situations to illustrate how much a comprehensive database system means to a hospitality operation, especially when billing is involved. Although attitudes about computer systems, as well as the systems themselves, have changed since this article was written, there is pertinent, fundamental information to be gleaned. Regarding the loss of the personal touch when a customer is engaged with a computer system, O’Brien says, “A modern data processing system should not force an employee to treat valued customers as numbers…” He also cautions, “Any computer system that decreases the availability of the personal touch is simply unacceptable.” On a system’s ability to process information, O’Brien suggests that in the past businesses were so enamored with simply having an automated system that they failed to take full advantage of its capabilities. O’Brien says that a lot of savings, in time and money, went unnoticed and/or under-appreciated. Today, everyone has an integrated system, and the wise business manager is the one who takes full advantage of all his resources. O’Brien invokes the 80/20 rule, and offers, “…the last 20 percent of results costs 80 percent of the effort. But times have changed. Everyone is automating data management, so that last 20 percent that could be ignored a short time ago represents a significant competitive differential.” The evolution of data systems takes center stage for much of the article; pitfalls also emerge.
Abstract:
The increasing amount of available semistructured data demands efficient mechanisms to store, process, and search an enormous corpus of data to encourage its global adoption. Current techniques to store semistructured documents either map them to relational databases, or use a combination of flat files and indexes. These two approaches result in a mismatch between the tree structure of semistructured data and the access characteristics of the underlying storage devices. Furthermore, the inefficiency of XML parsing methods has slowed down the large-scale adoption of XML into actual system implementations. The recent development of lazy parsing techniques is a major step towards improving this situation, but lazy parsers still have significant drawbacks that undermine the massive adoption of XML. Once the processing (storage and parsing) issues for semistructured data have been addressed, another key challenge in leveraging semistructured data is to perform effective information discovery on such data. Previous works have addressed this problem in a generic (i.e., domain-independent) way, but the process can be improved if knowledge about the specific domain is taken into consideration. This dissertation had two general goals: The first goal was to devise novel techniques to efficiently store and process semistructured documents. This goal had two specific aims: We proposed a method for storing semistructured documents that maps the physical characteristics of the documents to the geometrical layout of hard drives. We developed a Double-Lazy Parser for semistructured documents which introduces lazy behavior in both the pre-parsing and progressive parsing phases of the standard Document Object Model's parsing mechanism. The second goal was to construct a user-friendly and efficient engine for performing Information Discovery over domain-specific semistructured documents. This goal also had two aims: We presented a framework that exploits domain-specific knowledge to improve the quality of the information discovery process by incorporating domain ontologies. We also proposed meaningful evaluation metrics to compare the results of search systems over semistructured documents.
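The Double-Lazy Parser itself is not reproduced here, but the underlying idea of lazy parsing, doing no work on a subtree until it is actually requested, can be sketched with the Python standard library; the file name and tag below are invented examples:

```python
# Generic sketch of lazy XML processing via the standard library's
# iterparse; this illustrates deferring and discarding work, not the
# dissertation's Double-Lazy Parser.
import xml.etree.ElementTree as ET

def lazy_iter(path, tag):
    """Stream the document and hand back one fully parsed subtree at a
    time, discarding each after use so a full DOM is never materialized."""
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == tag:
            yield elem    # the subtree is complete at its end event
            elem.clear()  # release it before parsing continues

for book in lazy_iter("library.xml", "book"):
    print(book.get("title"))
```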
Abstract:
Archaeologists are often considered frontrunners in employing spatial approaches within the social sciences and humanities, including geospatial technologies such as geographic information systems (GIS) that are now routinely used in archaeology. Since the late 1980s, GIS has mainly been used to support data collection and management as well as spatial analysis and modeling. While fruitful, these efforts have arguably neglected the potential contribution of advanced visualization methods to the generation of broader archaeological knowledge. This paper reviews the use of GIS in archaeology from a geographic visualization (geovisual) perspective and examines how these methods can broaden the scope of archaeological research in an era of more user-friendly cyber-infrastructures. Like most computational databases, GIS do not easily support temporal data. This limitation is particularly problematic in archaeology because processes and events are best understood in space and time. To deal with such shortcomings in existing tools, archaeologists often end up having to reduce the diversity and complexity of archaeological phenomena. Recent developments in geographic visualization begin to address some of these issues, and are pertinent in the globalized world as archaeologists amass vast new bodies of geo-referenced information and work towards integrating them with traditional archaeological data. Greater effort in developing geovisualization and geovisual analytics appropriate for archaeological data can create opportunities to visualize, navigate and assess different sources of information within the larger archaeological community, thus enhancing possibilities for collaborative research and new forms of critical inquiry.
Abstract:
Technological progress and the growing use of computer graphics in many fields are drawing more and more people to the world of 3D modelling. Modelling software, however, is often ill-suited to inexperienced users, mainly because of unintuitive navigation and modelling commands. From a human-computer interaction standpoint, such software must confront a major obstacle: the relationship between 2D input devices (such as the mouse) and the manipulation of a 3D scene. The project presented in this thesis is an addon for Blender that allows the Leap Motion device to be used as an aid to surface modelling in computer graphics. The goal of this thesis was to design and build a user-friendly interface between Leap and Blender, so that the sensors of the former can be used to facilitate and extend the navigation and modelling commands of the latter. The addon built for Blender implements the concept of LAM (Leap Aided Modelling), extending Blender's features for selecting, moving and editing objects in the scene, manipulating the user view, and modelling Non-Uniform Rational B-Spline (NURBS) curves and surfaces. These extensions were created to make operations otherwise driven exclusively by mouse and keyboard faster and simpler.
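Blender addons of this kind are written in Python against the bpy API. The following minimal sketch shows the general shape such an addon takes; the Leap-reading function is a hypothetical placeholder for calls into the Leap Motion SDK, not the thesis code:

```python
# Minimal sketch of a Blender addon in the spirit of Leap Aided Modelling.
# read_leap_translation() is a hypothetical placeholder; a real addon would
# poll the Leap Motion SDK for hand/finger positions.
import bpy

bl_info = {"name": "LAM sketch", "blender": (2, 80, 0), "category": "3D View"}

def read_leap_translation():
    """Placeholder: return a (dx, dy, dz) hand displacement from the Leap SDK."""
    return (0.0, 0.0, 0.0)

class VIEW3D_OT_lam_move(bpy.types.Operator):
    """Move the active object by the displacement of the user's hand."""
    bl_idname = "view3d.lam_move"
    bl_label = "LAM: Move Active Object"

    def execute(self, context):
        obj = context.active_object
        if obj is None:
            return {'CANCELLED'}
        dx, dy, dz = read_leap_translation()
        obj.location.x += dx
        obj.location.y += dy
        obj.location.z += dz
        return {'FINISHED'}

def register():
    bpy.utils.register_class(VIEW3D_OT_lam_move)

def unregister():
    bpy.utils.unregister_class(VIEW3D_OT_lam_move)
```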
Abstract:
This work presents the design of an interface for adding inertial sensors to a node of a WSN (Wireless Sensor Network) intended for landslide monitoring. After analysing the advantages that additional sensors would bring, we sought to provide a sound design approach; in particular, the idea is to integrate the node with a gyroscope and an accelerometer that also have applications in other fields. Used in this way, these sensors can improve monitoring by detecting movements in detail and by recognizing false alarms. The suggested approach exploits rapid-prototyping boards that are user-friendly and decidedly affordable, well suited to electronics experimentation and to the development of new devices. Using purpose-built development environments, the communications between the node and the sensor board were simulated, highlighting the advantages obtained. A large part of the project concerned programming in C/C++, with particular attention to energy saving.
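Although the thesis implementation is in C/C++, the core idea of using sustained inertial readings to reject false alarms can be illustrated with a brief Python sketch; the thresholds and data layout are invented assumptions:

```python
# Illustrative false-alarm filter for an inertial sensor node; the thesis
# implementation is in C/C++, and the threshold values here are invented.
import math

ACCEL_THRESHOLD = 0.15   # g, hypothetical trigger level
SUSTAINED_SAMPLES = 5    # consecutive samples required to confirm movement

def anomaly(sample):
    """Deviation of the acceleration magnitude from 1 g (gravity at rest)."""
    ax, ay, az = sample
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)

def confirmed_movement(samples):
    """Report movement only if the anomaly persists across several samples,
    so that an isolated spike is treated as a false alarm."""
    streak = 0
    for s in samples:
        streak = streak + 1 if anomaly(s) > ACCEL_THRESHOLD else 0
        if streak >= SUSTAINED_SAMPLES:
            return True
    return False
```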
Abstract:
INTRODUCTION: The ability to reproducibly identify clinically equivalent patient populations is critical to the vision of learning health care systems that implement and evaluate evidence-based treatments. The use of common or semantically equivalent phenotype definitions across research and health care use cases will support this aim. Currently, there is no single consolidated repository for computable phenotype definitions, making it difficult to find all definitions that already exist, and also hindering the sharing of definitions between user groups. METHOD: Drawing from our experience in an academic medical center that supports a number of multisite research projects and quality improvement studies, we articulate a framework that will support the sharing of phenotype definitions across research and health care use cases, and highlight gaps and areas that need attention and collaborative solutions. FRAMEWORK: An infrastructure for re-using computable phenotype definitions and sharing experience across health care delivery and clinical research applications includes: access to a collection of existing phenotype definitions, information to evaluate their appropriateness for particular applications, a knowledge base of implementation guidance, supporting tools that are user-friendly and intuitive, and a willingness to use them. NEXT STEPS: We encourage prospective researchers and health administrators to re-use existing EHR-based condition definitions where appropriate and share their results with others to support a national culture of learning health care. There are a number of federally funded resources to support these activities, and research sponsors should encourage their use.
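As a concrete illustration of what a computable phenotype definition can look like, here is a minimal sketch; the codes, medications, and rule logic are hypothetical and not drawn from any validated definition:

```python
# Hypothetical sketch of a computable phenotype definition for type 2
# diabetes; the ICD-10 prefixes, medication list, and rule are illustrative
# only, not a validated definition.
T2DM_ICD10 = {"E11"}                      # diagnosis code prefixes
T2DM_MEDS = {"metformin", "glipizide"}    # indicative medications

def has_t2dm_phenotype(patient):
    """Match if the patient has >= 2 qualifying diagnoses on distinct dates,
    or 1 diagnosis plus an indicative medication order."""
    dx_dates = {d["date"] for d in patient["diagnoses"]
                if any(d["code"].startswith(p) for p in T2DM_ICD10)}
    on_meds = any(m["name"].lower() in T2DM_MEDS
                  for m in patient["medications"])
    return len(dx_dates) >= 2 or (len(dx_dates) == 1 and on_meds)
```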
Abstract:
Corel Geological Drafting Kit (CGDK), a program written in VBA, has been designed to assist geologists and geochemists with their drafting work. It obtains geological data directly from a running Excel application, and uses the data to plot geochemical diagrams and to construct stratigraphic columns. The software also contains functions for creating stereographic projections and rose diagrams, which can be used for spatial analysis on a calibrated geological map. The user-friendly program has been tested to work with CorelDRAW 13, 14, and 15 and with Excel 2003 and 2007.
Abstract:
The goals of this program of research were to examine the link between self-reported vulvar pain and clinical diagnoses, and to create a user-friendly assessment tool to aid in that process. These goals were undertaken through a series of four empirical studies (Chapters 2-6): one archival study, two online studies, and one study conducted in a Women’s Health clinic. In Chapter 2, the link between self-report and clinical diagnosis was confirmed by extracting data from multiple studies conducted in the Sexual Health Research Laboratory over the course of several years. We demonstrated the accuracy of diagnosis based on multiple factors, and explored the varied gynecological presentation of different diagnostic groups. Chapter 3 was based on an online study designed to create the Vulvar Pain Assessment Questionnaire (VPAQ) inventory. Following the construct validation approach, a large pool of potential items was created to capture a broad selection of vulvar pain symptoms. Nearly 300 participants completed the entire item pool, and a series of factor analyses was used to narrow down the items and create the scales and subscales. Relationships were computed among subscales and validated scales to establish convergent and discriminant validity. The studies in Chapters 4 and 5 were conducted in the Department of Obstetrics & Gynecology at Oregon Health & Science University. The brief screening version of the VPAQ was employed with patients of the Program in Vulvar Health at the Center for Women’s Health. The accuracy and usefulness of the VPAQscreen were determined from the perspective of patients as well as their health care providers, and the treatment-seeking experiences of patients were explored. Finally, a second online study was conducted to confirm the factor structure, internal consistency, and test-retest reliability of the VPAQ inventory. The results presented in these chapters confirm the link between targeted questions and accurate diagnoses, and provide a guideline that is useful and accessible for providers and patients.
Abstract:
The inherent analogue nature of medical ultrasound signals in conjunction with the abundant merits provided by digital image acquisition, together with the increasing use of relatively simple front-end circuitries, have created considerable demand for single-bit beamformers in digital ultrasound imaging systems. Furthermore, the increasing need to design lightweight ultrasound systems with low power consumption and low noise provides ample justification for development and innovation in the use of single-bit beamformers in ultrasound imaging systems. The overall aim of this research program is to investigate, establish, develop and confirm, through a combination of theoretical analysis and detailed simulations that utilize raw phantom data sets, suitable techniques for the design of simple-to-implement, hardware-efficient digital ultrasound beamformers to address the requirements for 3D scanners with large channel counts, as well as portable and lightweight ultrasound scanners for point-of-care applications and intravascular imaging systems. In addition, the stability boundaries of higher-order High-Pass (HP) and Band-Pass (BP) Σ−Δ modulators for single- and dual-sinusoidal inputs are determined using quasi-linear modeling together with the describing-function method, to more accurately model the modulator quantizer. The theoretical results are shown to be in good agreement with the simulation results for a variety of input amplitudes, bandwidths, and modulator orders. The proposed mathematical models of the quantizer will immensely help speed up the design of higher-order HP and BP Σ−Δ modulators to be applicable for digital ultrasound beamformers. Finally, a user-friendly design and performance evaluation tool for Low-Pass (LP), BP and HP modulators is developed. This toolbox, which uses various design methodologies and covers an assortment of modulator topologies, is intended to accelerate the design process and evaluation of modulators. This design tool is further developed to enable the design, analysis and evaluation of beamformer structures, including the noise analyses of the final B-scan images. Thus, this tool will allow researchers and practitioners to design and verify different reconstruction filters and analyze the results directly on the B-scan ultrasound images, thereby saving considerable time and effort.
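As background for the single-bit approach (a textbook sketch, not the higher-order HP/BP modulators designed in this work), a first-order single-bit Σ−Δ modulator can be simulated in a few lines:

```python
# Textbook first-order single-bit sigma-delta modulator, for background only.
import math

def sigma_delta_1st_order(x):
    """Encode samples in [-1, 1] as a +/-1 bit stream: integrate the
    input-minus-feedback error and quantize the integrator state to one bit."""
    v = 0.0    # integrator state
    y = -1.0   # previous quantizer output (feedback)
    bits = []
    for sample in x:
        v += sample - y               # v[n] = v[n-1] + x[n] - y[n-1]
        y = 1.0 if v >= 0 else -1.0   # single-bit quantizer
        bits.append(y)
    return bits

# The local density of +1 bits tracks the input; a zero-mean sine yields a
# stream whose average is close to 0.
x = [0.5 * math.sin(2 * math.pi * n / 256) for n in range(1024)]
stream = sigma_delta_1st_order(x)
print(sum(stream) / len(stream))
```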
Abstract:
In this thesis, novel analog-to-digital and digital-to-analog generalized time-interleaved variable bandpass sigma-delta modulators are designed, analysed, evaluated and implemented that are suitable for high performance data conversion for a broad spectrum of applications. These generalized time-interleaved variable bandpass sigma-delta modulators can perform noise-shaping for any centre frequency from DC to Nyquist. The proposed topologies are well-suited for Butterworth, Chebyshev, inverse-Chebyshev and elliptical filters, where designers have the flexibility of specifying the centre frequency, bandwidth as well as the passband and stopband attenuation parameters. The application of the time-interleaving approach, in combination with these bandpass loop-filters, not only overcomes the limitations that are associated with conventional and mid-band resonator-based bandpass sigma-delta modulators, but also offers an elegant means to increase the conversion bandwidth, thereby relaxing the need to use faster or higher-order sigma-delta modulators. A step-by-step design technique has been developed for the design of time-interleaved variable bandpass sigma-delta modulators. Using this technique, an assortment of lower- and higher-order single- and multi-path generalized A/D variable bandpass sigma-delta modulators were designed, evaluated and compared in terms of their signal-to-noise ratios, hardware complexity, stability, tonality and sensitivity for ideal and non-ideal topologies. Extensive behavioural-level simulations verified that one of the proposed topologies not only used fewer coefficients but also exhibited greater robustness to non-idealities. Furthermore, second-, fourth- and sixth-order single- and multi-path digital variable bandpass sigma-delta modulators were designed using this technique. The mathematical modelling and evaluation of tones caused by the finite wordlengths of these digital multi-path sigma-delta modulators, when excited by sinusoidal input signals, are also derived from first principles and verified using simulation and experimental results. The fourth-order digital variable bandpass sigma-delta modulator topologies are implemented in VHDL and synthesized on a Xilinx® Spartan™-3 Development Kit using fixed-point arithmetic. Circuit outputs were taken via the RS232 connection provided on the FPGA board and evaluated using MATLAB routines developed by the author. These routines included the decimation process as well. The experiments undertaken by the author further validated the design methodology presented in the work. In addition, a novel tunable and reconfigurable second-order variable bandpass sigma-delta modulator has been designed and evaluated at the behavioural level. This topology offers a flexible set of choices for designers and can operate either in single- or dual-mode, enabling multi-band implementations on a single digital variable bandpass sigma-delta modulator. This work is also supported by a novel user-friendly design and evaluation tool that has been developed in MATLAB/Simulink that can speed up the design, evaluation and comparison of analog and digital single-stage and time-interleaved variable bandpass sigma-delta modulators. This tool enables the user to specify the conversion type, topology, loop-filter type, path number and oversampling ratio.
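As textbook background for the variable bandpass idea (not the specific loop filters designed in this thesis), a second-order bandpass modulator shapes the quantization noise $E(z)$ away from a chosen centre frequency $\omega_0$ by placing noise-transfer-function zeros on the unit circle:

$$Y(z) = \mathrm{STF}(z)\,X(z) + \mathrm{NTF}(z)\,E(z), \qquad \mathrm{NTF}(z) = 1 - 2\cos(\omega_0)\,z^{-1} + z^{-2}.$$

Setting $\omega_0 = 0$ recovers the lowpass shaper $(1 - z^{-1})^2$, $\omega_0 = \pi$ gives the highpass shaper $(1 + z^{-1})^2$, and intermediate values place the noise notch at any centre frequency from DC to Nyquist, which is exactly the tunability described above.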
Abstract:
Major food adulteration and contamination events occur with alarming regularity and are known to be episodic, with the question being not if but when another large-scale food safety/integrity incident will occur. Indeed, the challenges of maintaining food security are now internationally recognised. The ever-increasing scale and complexity of food supply networks can lead to them becoming significantly more vulnerable to fraud and contamination, and potentially dysfunctional. This can make the task of deciding which analytical methods are more suitable to collect and analyse (bio)chemical data within complex food supply chains, at targeted points of vulnerability, that much more challenging. It is evident that those working within and associated with the food industry are seeking rapid, user-friendly methods to detect food fraud and contamination, and rapid/high-throughput screening methods for the analysis of food in general. In addition to being robust and reproducible, these methods should be portable and ideally handheld and/or remote sensor devices, that can be taken to or be positioned on/at-line at points of vulnerability along complex food supply networks and require a minimum amount of background training to acquire information-rich data rapidly (ergo point-and-shoot). Here we briefly discuss a range of spectrometry and spectroscopy based approaches, many of which are commercially available, as well as other methods currently under development. We discuss a future perspective of how this range of detection methods in the growing sensor portfolio, along with developments in computational and information sciences such as predictive computing and the Internet of Things, will together form systems- and technology-based approaches that significantly reduce the areas of vulnerability to food crime within food supply chains. Since food fraud is a problem of systems, it requires systems-level solutions and thinking.
Abstract:
The purpose of this paper is to examine the promising contributions of the Concept Maps for Learning (CMfL) website to assessment for learning practices. The CMfL website generates concept maps from the relatedness degrees of concept pairs using the Pathfinder Scaling Algorithm. This website also conforms to the established principles of effective assessment for learning, for it is capable of automatically assessing students' higher-order knowledge, simultaneously identifying strengths and weaknesses, immediately providing useful feedback and being user-friendly. According to the default assessment plan, students first create concept maps on a particular subject and are then given individualized visual feedback followed by associated instructional material (e.g., videos, website links, examples, problems) based on a comparison of their concept map and a subject matter expert's map. After students study the feedback and instructional material, teachers can monitor their progress by having them create revised concept maps. Therefore, we claim that the CMfL website may reduce the workload of teachers as well as provide immediate and delayed feedback on the weaknesses of students in different forms such as graphical and multimedia. In the following study, we will examine whether these promising contributions to assessment for learning hold across a variety of subjects.
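The Pathfinder Scaling Algorithm prunes a weighted relatedness network so that a link survives only if no alternative path connects its endpoints more tightly. A compact sketch of the common PFNET(r = ∞, q = n − 1) special case follows; the distance matrix is invented:

```python
# Sketch of Pathfinder network scaling, special case r = infinity, q = n - 1:
# an edge survives iff its weight (distance) does not exceed the "minimax"
# path distance between its endpoints. The distances below are invented.

def pathfinder(dist):
    """dist: symmetric matrix of distances (lower = more related).
    Returns the set of retained edges (i, j)."""
    n = len(dist)
    minimax = [row[:] for row in dist]
    # Floyd-Warshall variant: a path's cost is its largest edge (r = infinity)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                via_k = max(minimax[i][k], minimax[k][j])
                if via_k < minimax[i][j]:
                    minimax[i][j] = via_k
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if dist[i][j] <= minimax[i][j]}

d = [[0, 1, 4],
     [1, 0, 2],
     [4, 2, 0]]
print(pathfinder(d))  # the weight-4 edge is pruned: max(1, 2) = 2 < 4
```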
Abstract:
Microneedles (MNs) are emerging devices that can be used for the delivery of drugs at specific locations1. Their performance is judged primarily by different features, and penetration through tissue is one of the most important aspects to evaluate. For detailed studies of MN performance, different kinds of in-vitro, ex-vivo and in-vivo tests should be performed. The main limitation of some of these tests is that biological tissue is too heterogeneous, unstable and difficult to obtain. In addition, the use of biological materials sometimes presents legal issues. There are many studies dealing with artificial membranes for drug diffusion2, but studies of artificial membranes for microneedle mechanical characterization are scarce3. In order to overcome these limitations, we have developed tests using synthetic polymeric membranes instead of biological tissue. The selected artificial membrane is homogeneous, stable, and readily available. This material is composed mainly of a roughly equal blend of a hydrocarbon wax and a polyolefin, and is commercially available under the brand name Parafilm®. Insertions of different kinds of MN arrays prepared from crosslinked polymers were performed using this membrane and correlated with insertions of the MN arrays into ex-vivo neonatal porcine skin. The insertion depth of the MNs was evaluated using optical coherence tomography (OCT). The adoption of MN transdermal patches in the market can be improved by making the product user-friendly and easy to use; therefore, manual insertion is preferred over other kinds of procedures. Consequently, the insertion studies were performed in neonatal porcine skin and in the artificial membrane using a manual insertion force applied by human volunteers. The insertion studies using manual forces correlated very well with the same studies performed with Texture Analyzer equipment. These synthetic membranes seem to mimic closely the mechanical properties of the skin for the insertion of MNs using different methods of insertion. In conclusion, this artificial membrane offers a valid alternative to biological tissue for the testing of MN insertion and is a good candidate for developing a reliable quality-control MN insertion test.