896 results for voip, mobile devices, portability, user-friendly
Abstract:
A research group at the Corvinus University of Budapest investigated how effectively the economic development grant schemes created for small and medium-sized enterprises (SMEs) under the National Development Plan (NFT, 2004-2006) and the New Hungarian Development Plan (ÚMFT, 2007-2013) have operated; how the system was able to build the experience of the NFT's Economic Competitiveness Operational Programme (GVOP) into the ÚMFT's Economic Development Operational Programme (GOP); and how the SMEs themselves evaluate these calls and what they would propose to make such grant schemes more "user-friendly". Particular emphasis was placed on the lessons of learning by doing during the first plan and on the opinions of the SMEs. The research ran from March 2009 to December 2009; the manuscript was closed in April 2010.
Abstract:
The raster-graphic ampelometric software GRA.LE.D. was developed not only for estimating leaf area but also for characterizing grapevine (Vitis vinifera L.) leaves. The software was written in the C++ programming language, using C++ Builder 2007, for Windows 95-XP and Linux operating systems. It handles desktop-scanned images. On the image analysed with the GRA.LE.D., the user has to mark 11 points. These points are then connected and the distances between them calculated. The GRA.LE.D. software supports standard ampelometric measurements such as leaf area, angles between the veins, and lengths of the veins. These measurements are recorded by the software and exported into plain ASCII text files for single or multiple samples. Twenty-two biometric data points are identified for each leaf by the GRA.LE.D. The software makes it possible to statistically analyse experimental data, compare cultivars, and graphically reconstruct leaves using the Microsoft Excel Chart Wizard. The GRA.LE.D. was thoroughly calibrated and compared to other widely used instruments and methods such as photo-gravimetry, LiCor L0100, WinDIAS2.0 and ImageTool. By comparison, the GRA.LE.D. produced the most accurate measurements of leaf area, but the LiCor L0100 and the WinDIAS2.0 were faster, while the photo-gravimetric method proved to be the most time-consuming. The WinDIAS2.0 instrument was the least reliable. The GRA.LE.D. is uncomplicated, user-friendly, accurate, consistent, reliable and has wide practical application.
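As an illustration of the kind of point-based ampelometric computation described above, the following minimal sketch (in Python, not the original C++ GRA.LE.D. code) derives a vein length, the angle between two veins, and a polygon-based leaf area from user-marked landmarks; all coordinates and the scale factor are hypothetical.

```python
# Illustrative sketch only: compute vein length, vein angle, and a
# shoelace-polygon leaf area from user-marked landmark pixel coordinates.
import math

def distance(p, q):
    """Euclidean distance between two (x, y) pixel coordinates."""
    return math.hypot(q[0] - p[0], q[1] - p[1])

def vein_angle(origin, tip_a, tip_b):
    """Angle (degrees) between two veins sharing a common origin point."""
    ax, ay = tip_a[0] - origin[0], tip_a[1] - origin[1]
    bx, by = tip_b[0] - origin[0], tip_b[1] - origin[1]
    cos_t = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

def polygon_area(points, pixels_per_cm):
    """Shoelace area of the leaf outline, converted from px^2 to cm^2."""
    area_px = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        area_px += x1 * y2 - x2 * y1
    return abs(area_px) / 2.0 / pixels_per_cm ** 2

# Hypothetical landmarks: petiolar sinus and two vein tips, plus an outline.
sinus, tip_main, tip_lateral = (120, 400), (120, 40), (320, 160)
outline = [(120, 400), (40, 250), (120, 40), (320, 160), (300, 340)]
print(distance(sinus, tip_main), vein_angle(sinus, tip_main, tip_lateral))
print(round(polygon_area(outline, pixels_per_cm=118.1), 2), "cm^2")
```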
Abstract:
This research investigated the effectiveness and efficiency of structured writing, as compared to traditional non-structured writing, as a teaching and learning strategy in a training session for teachers. Structured writing is a method of identifying, interrelating, sequencing, and graphically displaying information on fields of a page or computer. It is an alternative for improving training and educational outcomes by providing an effective and efficient documentation methodology. The problem focuses on the contradiction between (a) the research and theory supporting modification of traditional methods of written documents and information presentation and (b) the existing paradigm of continuing with traditional communication methods. A MANOVA was used to determine significant differences between a control and an experimental group in a posttest-only experimental design. The experimental group received the treatment of structured writing materials during a training session. Two variables were analyzed: (a) effectiveness, measured as correct items on a posttest, and (b) efficiency, measured as time spent on the test. The quantitative data showed a difference in favor of the experimental group on the two dependent variables: the experimental group completed the posttest in 2 minutes less time while scoring 1.5 more items correct. An interview with the training facilitators revealed that the structured writing materials were "user friendly."
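For readers unfamiliar with the analysis named above, the following minimal sketch shows a two-dependent-variable MANOVA of that general kind using the statsmodels library; the data frame, group labels, and values are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch only: MANOVA with two dependent variables
# (posttest score = effectiveness, minutes on test = efficiency) by group.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.DataFrame({
    "group": ["control"] * 4 + ["structured"] * 4,
    "score": [18, 20, 17, 19, 21, 22, 20, 21],    # posttest items correct
    "minutes": [14, 15, 16, 15, 13, 12, 14, 13],  # time spent on the test
})
fit = MANOVA.from_formula("score + minutes ~ group", data=df)
print(fit.mv_test())
```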
Abstract:
The objective of this study was to develop a model to predict the transport and fate of gasoline components of environmental concern in the Miami River by mathematically simulating the movement of dissolved benzene, toluene, xylene (BTX), and methyl tertiary-butyl ether (MTBE) originating from minor gasoline spills in the inter-tidal zone of the river. Computer codes were based on mathematical algorithms that account for advective and dispersive transport along the river and the prevailing phase transformations of BTX and MTBE; the phase transformations included volatilization and settling. The model used a finite-difference scheme under steady-state conditions, with a set of numerical equations solved by two numerical methods: Gauss-Seidel and Jacobi iterations. A numerical validation process was conducted by comparing the results from both methods with analytical and numerical reference solutions. Since similar trends were achieved, it was concluded that the computer codes were algorithmically correct. The Gauss-Seidel iteration converged faster than the Jacobi iteration, so its code was selected for further development of the computer program and software. A sensitivity analysis found that the model was very sensitive to wind speed but not to sediment settling velocity. Software was then developed with the model code embedded, providing two user-friendly visual forms: one to interface with the database files and the other to execute the model and present the graphical and tabulated results. For all predicted concentrations of BTX and MTBE, the maximum concentrations were more than an order of magnitude lower than current drinking water standards. It should be pointed out, however, that concentrations below these standards, although not harmful to humans, may still be very harmful to organisms at the trophic levels of the Miami River ecosystem and associated waters. This computer model can be used for the rapid assessment and management of the effects of minor gasoline spills on inter-tidal riverine water quality.
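The following minimal sketch (not the dissertation's code) illustrates the numerical approach the abstract describes: a one-dimensional steady-state advection-dispersion balance with a lumped first-order loss term standing in for volatilization and settling, assembled as a finite-difference system and solved with both Jacobi and Gauss-Seidel iterations. All grid and parameter values are hypothetical.

```python
# Illustrative sketch only: Jacobi vs Gauss-Seidel on a 1D steady
# advection-dispersion-decay finite-difference system.
import numpy as np

n, dx = 50, 10.0          # grid cells and cell length (m), hypothetical
D, u, k = 5.0, 0.1, 1e-4  # dispersion (m^2/s), velocity (m/s), loss rate (1/s)
c_in = 1.0                # upstream boundary concentration (mg/L)

# Central-difference coefficients of  D*C'' - u*C' - k*C = 0  on the interior.
lower = D / dx**2 + u / (2 * dx)
diag = -2 * D / dx**2 - k
upper = D / dx**2 - u / (2 * dx)

A, b = np.zeros((n, n)), np.zeros(n)
for i in range(n):
    A[i, i] = diag
    if i > 0:
        A[i, i - 1] = lower
    if i < n - 1:
        A[i, i + 1] = upper
b[0] = -lower * c_in      # fixed (Dirichlet) concentration at the inflow
A[-1, -1] += upper        # zero-gradient condition at the outflow

def jacobi(A, b, iters=5000):
    x = np.zeros_like(b)
    off = A - np.diag(np.diag(A))
    for _ in range(iters):
        x = (b - off @ x) / np.diag(A)
    return x

def gauss_seidel(A, b, iters=5000):
    x = np.zeros_like(b)
    for _ in range(iters):
        for i in range(len(b)):
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

c_jac, c_gs = jacobi(A, b), gauss_seidel(A, b)
print("max difference between the two solvers:", np.abs(c_jac - c_gs).max())
```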
Abstract:
Choosing between Light Rail Transit (LRT) and Bus Rapid Transit (BRT) systems is often controversial and is not an easy task for transportation planners who are contemplating an upgrade of their public transportation services. These two transit systems provide comparable services for medium-sized cities, from the suburban neighborhood to the Central Business District (CBD), and utilize similar right-of-way (ROW) categories. The research is aimed at developing a method to assist transportation planners and decision makers in determining the more feasible of the two systems. Cost estimation is a major factor when evaluating a transit system. Typically, LRT is more expensive to build and implement than BRT but has significantly lower operating and maintenance (O&M) costs. This dissertation examines the factors affecting capacity and costs and develops capacity-based cost models for the LRT and BRT systems. Various ROW categories and alignment configurations are considered in the models. Kikuchi's fleet-size model (1985) and a cost allocation method are used to estimate capacity and costs. The comparison between LRT and BRT is complicated by the many possible transportation planning and operation scenarios. Finally, a user-friendly computer interface integrating the capacity-based cost models, the LRT and BRT Cost Estimator (LBCostor), was developed in Microsoft Visual Basic to facilitate the process and guide users through the comparison. The cost models and the LBCostor can be used to analyze transit volumes, alignments, ROW configurations, number of stops and stations, headway, vehicle size, and traffic signal timing at intersections. Planners can make the necessary changes and adjustments depending on their operating practices.
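As a rough illustration of what a capacity-based fleet-size and cost estimate involves, here is a minimal sketch. It is not the LBCostor tool or Kikuchi's (1985) formulation, and every unit cost, speed, capacity, and demand figure below is a hypothetical placeholder.

```python
# Illustrative sketch only: fleet size from peak demand and cycle time,
# then capital and operating cost roll-ups for two vehicle types.
import math

def fleet_and_cost(peak_pax_per_hr, veh_capacity, route_km, avg_speed_kmh,
                   capital_cost_per_veh, om_cost_per_veh_km, annual_veh_km):
    # Vehicles per hour needed to carry the peak-hour demand.
    veh_per_hr = math.ceil(peak_pax_per_hr / veh_capacity)
    headway_min = 60 / veh_per_hr
    # Round-trip (cycle) time in minutes, plus a nominal 10% layover.
    cycle_min = 2 * route_km / avg_speed_kmh * 60 * 1.10
    fleet = math.ceil(cycle_min / headway_min)
    capital = fleet * capital_cost_per_veh
    annual_om = fleet * annual_veh_km * om_cost_per_veh_km
    return fleet, round(headway_min, 1), capital, annual_om

# Hypothetical comparison: higher-capacity LRT vs smaller BRT vehicles.
print("LRT:", fleet_and_cost(3000, 450, 15, 30, 4.0e6, 9.0, 60000))
print("BRT:", fleet_and_cost(3000, 120, 15, 25, 0.8e6, 6.0, 60000))
```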
Abstract:
The premise of this dissertation is to create a highly integrated platform that combines the most current recording technologies for brain research through the development of new algorithms for three-dimensional (3D) functional mapping and 3D source localization. The recording modalities that were integrated include Electroencephalography (EEG), Optical Topographic Maps (OTM), Magnetic Resonance Imaging (MRI), and Diffusion Tensor Imaging (DTI). The work can be divided into two parts. The first part involves the integration of OTM with MRI, in which the topographic maps are mapped to both the skull and the cortical surface of the brain. This integration is made possible by new algorithms that determine the probe locations on the MRI head model and warp the 2D topographic maps onto the 3D MRI head/brain model. Dynamic changes in brain activation can be visualized on the MRI head model through a graphical user interface. The second part of this research involves augmenting a fiber-tracking system with the ability to integrate the source localization results generated by the commercial software Curry. This task involved registering the EEG electrodes and the dipole results to the MRI data. Such integration allows the visualization of fiber tracts, along with the source of the EEG, in a 3D transparent brain structure. The research findings of this dissertation were tested and validated through the participation of patients from Miami Children's Hospital (MCH). Such an integrated platform, presented to medical professionals in the form of a user-friendly graphical interface, is viewed as a major contribution of this dissertation. Two main aspects of this research endeavor should be emphasized: (1) if a dipole can be situated in time at its different positions, its trajectory may reveal additional information on the extent and nature of the brain malfunction; (2) situating such a dipole trajectory with respect to the fiber tracts could ensure the preservation of these fiber tracts (axons) during surgical interventions, preserving as a consequence the parts of the brain that are responsible for information transmission.
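One concrete step the abstract mentions, registering EEG electrode and dipole coordinates to the MRI data, is commonly done by rigidly aligning matched fiducial points. The sketch below shows a standard Kabsch (SVD-based) rigid registration; it is purely illustrative with hypothetical coordinates, not the dissertation's or Curry's code.

```python
# Illustrative sketch only: rigid (Kabsch) alignment of EEG-space points
# onto MRI-space fiducials, then mapping an electrode into MRI space.
import numpy as np

def rigid_register(src, dst):
    """Return rotation R and translation t that map src points onto dst."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical fiducials (nasion, left/right pre-auricular), in mm.
eeg_fid = [[0, 95, 0], [-75, 0, 0], [75, 0, 0]]
mri_fid = [[2, 93, 5], [-73, -3, 4], [77, -1, 6]]
R, t = rigid_register(eeg_fid, mri_fid)
electrode = np.array([10.0, 50.0, 80.0])     # one EEG electrode position
print("electrode in MRI space:", R @ electrode + t)
```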
Abstract:
Graph-structured databases are widely prevalent, and the problem of effective search and retrieval from such graphs has been receiving much attention recently. For example, the Web can be naturally viewed as a graph. Likewise, a relational database can be viewed as a graph where tuples are modeled as vertices connected via foreign-key relationships. Keyword search querying has emerged as one of the most effective paradigms for information discovery, especially over HTML documents in the World Wide Web. One of the key advantages of keyword search querying is its simplicity: users do not have to learn a complex query language and can issue queries without any prior knowledge about the structure of the underlying data. The purpose of this dissertation was to develop techniques for user-friendly, high-quality, and efficient searching of graph-structured databases. Several ranked search methods on data graphs have been studied in recent years. Given a top-k keyword search query on a graph and some ranking criteria, a keyword proximity search finds the top-k answers, where each answer is a substructure of the graph that contains all query keywords and illustrates the relationships among them. We applied keyword proximity search to the Web and to the page graph of web documents to find top-k answers that satisfy the user's information need and increase user satisfaction. Another effective ranking mechanism for data graphs is authority-flow based ranking. Given a top-k keyword search query on a graph, an authority-flow based search finds the top-k answers, where each answer is a node in the graph ranked according to its relevance and importance to the query. We developed techniques that improve authority-flow based search on data graphs by creating a framework to explain and reformulate such queries, taking into consideration user preferences and feedback. We also applied the proposed graph search techniques to Information Discovery over biological databases. Our algorithms were experimentally evaluated for performance and quality, and the quality of our method was compared to current approaches through user surveys.
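Authority-flow ranking of the kind described above is often implemented as a query-biased (personalized) random walk over the data graph. The following minimal sketch, on a hypothetical toy graph, shows the idea; it is not the dissertation's system.

```python
# Illustrative sketch only: personalized-PageRank-style authority flow,
# where the walk restarts at nodes that contain the query keywords.
import numpy as np

edges = {                      # hypothetical data graph: node -> out-neighbors
    "paper1": ["author1", "conf1"],
    "paper2": ["author1", "conf2"],
    "author1": ["paper1", "paper2"],
    "conf1": ["paper1"],
    "conf2": ["paper2"],
}
nodes = list(edges)
idx = {n: i for i, n in enumerate(nodes)}

# Column-stochastic transition matrix: authority flows along out-edges.
M = np.zeros((len(nodes), len(nodes)))
for u, outs in edges.items():
    for v in outs:
        M[idx[v], idx[u]] = 1.0 / len(outs)

def authority_flow(keyword_nodes, damping=0.85, iters=100):
    base = np.zeros(len(nodes))
    base[[idx[n] for n in keyword_nodes]] = 1.0 / len(keyword_nodes)
    r = np.full(len(nodes), 1.0 / len(nodes))
    for _ in range(iters):
        r = damping * M @ r + (1 - damping) * base
    return sorted(zip(nodes, r), key=lambda pair: -pair[1])

# Nodes matching the query keyword seed the flow; highest-ranked come first.
print(authority_flow(["author1"]))
```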
Abstract:
This dissertation established a state-of-the-art programming tool for designing and training artificial neural networks (ANNs) and showed its applicability to brain research. The developed tool, called NeuralStudio, allows users without programming skills to conduct ANN-based studies through a powerful and very user-friendly interface. A series of unique features has been implemented in NeuralStudio, such as ROC analysis, cross-validation, network averaging, topology optimization, and optimization of the slopes of the activation functions. A Support Vector Machines module was also included for comparison purposes. Once the tool was fully developed, it was applied to two studies in brain research. In the first study, the goal was to create and train an ANN to detect epileptic seizures from subdural EEG; this analysis involved extracting features from the spectral power in the gamma frequencies. In the second application, a unique method was devised to link EEG recordings to epileptic and non-epileptic subjects. The contribution of this method was a descriptor matrix that can be used to represent any EEG file with respect to its duration and number of electrodes. The first study showed that the inter-electrode mean of the spectral power in the gamma frequencies, and its duration above a specific threshold, performs better than the other frequencies in seizure detection, exhibiting an accuracy of 95.90%, a sensitivity of 92.59%, and a specificity of 96.84%. The second study showed that Hjorth's activity parameter is sufficient to accurately relate EEG to epileptic and non-epileptic subjects: after testing, the accuracy, sensitivity, and specificity of the classifier were all above 0.9667, and statistical tests confirmed the superiority of activity with over 99.99% certainty. It was demonstrated that (1) the spectral power in the gamma frequencies is highly effective in locating seizures from EEG and (2) activity can be used to link EEG recordings to epileptic and non-epileptic subjects. Both studies carried a high computational load and could be addressed thanks to NeuralStudio. From a medical perspective, both methods proved the merits of NeuralStudio in brain research applications. For its outstanding features, NeuralStudio was recently awarded a patent (US patent No. 7502763).
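The two EEG features the abstract singles out, gamma-band spectral power and Hjorth's activity parameter (the signal variance), are straightforward to compute per channel. The sketch below is a minimal illustration on a hypothetical synthetic channel, not NeuralStudio code.

```python
# Illustrative sketch only: gamma-band power via an FFT periodogram and
# Hjorth activity (signal variance) for one EEG channel.
import numpy as np

def gamma_band_power(signal, fs, band=(30.0, 70.0)):
    """Mean periodogram power of `signal` within the gamma band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def hjorth_activity(signal):
    """Hjorth activity parameter: the variance of the signal."""
    return np.var(signal)

# Hypothetical 10-second, 256 Hz single-channel recording.
fs = 256
t = np.arange(0, 10, 1.0 / fs)
eeg = np.sin(2 * np.pi * 40 * t) + 0.5 * np.random.randn(t.size)
print(gamma_band_power(eeg, fs), hjorth_activity(eeg))
```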
Abstract:
The increasing amount of available semistructured data demands efficient mechanisms to store, process, and search an enormous corpus of data to encourage its global adoption. Current techniques to store semistructured documents either map them to relational databases, or use a combination of flat files and indexes. These two approaches result in a mismatch between the tree-structure of semistructured data and the access characteristics of the underlying storage devices. Furthermore, the inefficiency of XML parsing methods has slowed down the large-scale adoption of XML into actual system implementations. The recent development of lazy parsing techniques is a major step towards improving this situation, but lazy parsers still have significant drawbacks that undermine the massive adoption of XML. Once the processing (storage and parsing) issues for semistructured data have been addressed, another key challenge to leverage semistructured data is to perform effective information discovery on such data. Previous works have addressed this problem in a generic (i.e., domain-independent) way, but this process can be improved if knowledge about the specific domain is taken into consideration. This dissertation had two general goals. The first goal was to devise novel techniques to efficiently store and process semistructured documents. This goal had two specific aims: we proposed a method for storing semistructured documents that maps the physical characteristics of the documents to the geometrical layout of hard drives, and we developed a Double-Lazy Parser for semistructured documents which introduces lazy behavior in both the pre-parsing and progressive parsing phases of the standard Document Object Model's parsing mechanism. The second goal was to construct a user-friendly and efficient engine for performing Information Discovery over domain-specific semistructured documents. This goal also had two aims: we presented a framework that exploits domain-specific knowledge to improve the quality of the information discovery process by incorporating domain ontologies, and we also proposed meaningful evaluation metrics to compare the results of search systems over semistructured documents.
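To make the lazy-parsing idea concrete, here is a minimal sketch of on-demand access to one subtree of an XML document: parsing stops as soon as the requested element is complete, so later siblings are never materialized. It illustrates lazy behavior in general, not the Double-Lazy Parser described above, and the document and tag names are hypothetical.

```python
# Illustrative sketch only: stop parsing as soon as the wanted subtree is
# complete, instead of materializing the whole DOM tree.
import io
import xml.etree.ElementTree as ET

def lazy_find(xml_source, tag):
    """Return the first complete element with `tag`; parsing stops there."""
    for _event, elem in ET.iterparse(xml_source, events=("end",)):
        if elem.tag == tag:
            return elem
    return None

doc = io.StringIO(
    "<library>"
    "<book id='1'><title>XML storage</title></book>"
    "<book id='2'><title>Lazy parsing</title></book>"
    "</library>"
)
book = lazy_find(doc, "book")
print(book.get("id"), book.find("title").text)   # -> 1 XML storage
```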
Abstract:
This dissertation introduces a novel automated book reader as an assistive technology tool for persons with blindness. The literature shows extensive work in the area of optical character recognition, but the current methodologies available for the automated reading of books or bound volumes remain inadequate and are severely constrained during document scanning or image acquisition. The goal of the book reader design is to automate and simplify the task of reading a book while providing a user-friendly environment with a realistic but affordable system design. The design responds to the main concerns of (a) providing a method of image acquisition that maintains the integrity of the source, (b) overcoming optical character recognition errors created by inherent imaging issues such as curvature effects and barrel distortion, and (c) determining a suitable method for accurate character recognition that yields an interface able to read from any open book with a reading accuracy nearing 98%. The initial aim of this research endeavor is the development of an assistive technology tool to help persons with blindness read books and other bound volumes. Its secondary and broader aim is to provide, in this design, a platform for digitizing bound documents in line with the mission of the Open Content Alliance (OCA), a nonprofit alliance dedicated to making reading materials available in digital form. The theoretical perspective of this research relates to the mathematical developments made to resolve both the inherent distortions due to the properties of the camera lens and the anticipated distortions caused by the changing page curvature as one leafs through the book. This is evidenced by the significant increase in the character recognition rate and the highly accurate read-out through text-to-speech processing. This reasonably priced interface, with its high performance and its compatibility with any computer or laptop through universal serial bus connectors, greatly extends the prospects for universal access to documentation.
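Lens-induced barrel distortion of the kind mentioned above is commonly modeled with a radial polynomial in the distance from the optical centre, and correction amounts to remapping pixel coordinates through that model. The sketch below applies the polynomial to a few points and is purely illustrative: the centre and coefficients are hypothetical and would in practice come from camera calibration; it is not the dissertation's imaging pipeline.

```python
# Illustrative sketch only: remap pixel coordinates through the standard
# radial distortion model p' = c + (p - c) * (1 + k1*r^2 + k2*r^4).
import numpy as np

def radial_remap(points, center, k1, k2):
    """Remap (x, y) pixel coordinates through the radial polynomial model."""
    pts = np.asarray(points, dtype=float) - center
    r2 = (pts ** 2).sum(axis=1, keepdims=True)   # squared radius per point
    return center + pts * (1.0 + k1 * r2 + k2 * r2 ** 2)

# Hypothetical optical centre and distortion coefficients.
center = np.array([640.0, 480.0])
k1, k2 = 2.0e-7, 1.0e-13
corners = [[0.0, 0.0], [1280.0, 0.0], [640.0, 480.0], [1280.0, 960.0]]
print(radial_remap(corners, center, k1, k2))
```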
Abstract:
In his study - Evaluating and Selecting a Property Management System - Galen Collins, Assistant Professor, School of Hotel and Restaurant Management, Northern Arizona University, states briefly at the outset: “Computerizing a property requires a game plan. Many have selected a Property Management System without much forethought and have been unhappy with the final results. The author discusses the major factors that must be taken into consideration in the selection of a PMS, based on his personal experience.” Although this article was written in 1988 and some of its information may be dated, there are many salient points to consider. “Technological advances have encouraged many hospitality operators to rethink how information should be processed, stored, retrieved, and analyzed,” offers Collins. “Research has led to the implementation of various cost-effective applications addressing almost every phase of operations,” he says in introducing the computer technology germane to many PMS functions. Professor Collins discusses the Request for Proposal, its conditions, and its relevance in negotiating a PMS. The author also wants the system buyer to be aware [not necessarily beware] of vendor recommendations, and not to rely solely on them. Exercising forethought will help avoid the pitfall of purchasing an inadequate PMS. Remember, the vendor is there first and foremost to sell you a system. This doesn't necessarily mean that the adjectives unreliable and unethical are on the table, but do be advised. Professor Collins presents a graphic outline for the Weighted Average Approach to Scoring Vendor Evaluations. Among the elements to be considered in evaluating a PMS, and there are several analyzed in this essay, Professor Collins advises that a prospective buyer not overlook the service factor. Service is an important element to contemplate. “In a hotel environment, the special emphasis should be on service. System downtime can be costly and aggravating and will happen periodically,” Collins warns. Professor Collins also examines the topic of the PMS system environment, the importance of which should not be underestimated. “The design of the computer system should be based on the physical layout of the property and the projected workloads. The heart of the system, housed in a protected, isolated area, can support work stations strategically located throughout the property,” Professor Collins provides. A Property Profile Description is outlined in Table 1. The author also points out that ease of operation is another significant factor to think about. “A user-friendly software package allows the user to easily move through the program without encountering frustrating obstacles,” says Collins. “Programs that require users to memorize abstract abbreviations, codes, and information to carry out standard routines should be avoided,” he counsels.
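The weighted-average approach to scoring vendor evaluations that Collins outlines can be illustrated with a small calculation; the criteria, weights, and ratings below are hypothetical placeholders rather than figures from the article.

```python
# Illustrative sketch only: weighted-average scoring of PMS vendors.
criteria_weights = {"service": 0.30, "ease_of_use": 0.25,
                    "functionality": 0.25, "cost": 0.20}

vendor_ratings = {   # ratings on a 1-10 scale, one row per vendor
    "Vendor A": {"service": 8, "ease_of_use": 7, "functionality": 9, "cost": 6},
    "Vendor B": {"service": 6, "ease_of_use": 9, "functionality": 7, "cost": 8},
}

def weighted_score(ratings, weights):
    """Sum of each criterion rating multiplied by the criterion weight."""
    return sum(ratings[c] * w for c, w in weights.items())

for vendor, ratings in vendor_ratings.items():
    print(vendor, round(weighted_score(ratings, criteria_weights), 2))
```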
Abstract:
In his discussion - Database As A Tool For Hospitality Management - William O'Brien, Assistant Professor, School of Hospitality Management at Florida International University, offers at the outset, “Database systems offer sweeping possibilities for better management of information in the hospitality industry. The author discusses what such systems are capable of accomplishing.” The author opens with a bit of background on database system development, which also lends an impression as to the complexion of the rest of the article; uh, it's a shade technical. “In early 1981, Ashton-Tate introduced dBase II. It was the first microcomputer database management processor to offer relational capabilities and a user-friendly query system combined with a fast, convenient report writer,” O'Brien informs. “When 16-bit microcomputers such as the IBM PC series were introduced late the following year, more powerful database products followed: dBase III, Friday!, and Framework. The effect on the entire business community, and the hospitality industry in particular, has been remarkable,” he further offers with his informed outlook. Professor O'Brien offers a few anecdotal situations to illustrate how much a comprehensive database system means to a hospitality operation, especially when billing is involved. Although attitudes about computer systems, as well as the systems themselves, have changed since this article was written, there is pertinent, fundamental information to be gleaned. In regard to the loss of the personal touch when a customer is engaged with a computer system, O'Brien says, “A modern data processing system should not force an employee to treat valued customers as numbers…” He also cautions, “Any computer system that decreases the availability of the personal touch is simply unacceptable.” On a system's ability to process information, O'Brien suggests that in the past businesses were so enamored with simply having an automated system that they failed to take full advantage of its capabilities, and that a lot of savings, in time and money, went unnoticed and/or underappreciated. Today, everyone has an integrated system, and the wise business manager is the one who takes full advantage of all his resources. O'Brien invokes the 80/20 rule, and offers, “…the last 20 percent of results costs 80 percent of the effort. But times have changed. Everyone is automating data management, so that last 20 percent that could be ignored a short time ago represents a significant competitive differential.” The evolution of data systems takes center stage for much of the article; pitfalls also emerge.
Abstract:
A dynamic job shop with predetermined resource allocation for all jobs entering the system is a unique manufacturing environment found in industry, and the effective control of its production activities poses particular challenges. This thesis introduces a framework for an Internet-based, real-time shop floor control system for such a dynamic job shop environment. The system aims to maintain the schedule feasibility of all jobs entering the manufacturing system under any circumstances. The system is capable of deciding how often the manufacturing activities should be monitored to check for control decisions that need to be taken on the shop floor. It provides the decision maker with real-time notifications, enabling the generation of feasible alternative solutions when a disturbance occurs on the shop floor. The control system also gives the customer real-time access to the status of the jobs on the shop floor. Communication between the controller, the user, and the customer takes place through a user-friendly, web-based GUI. The proposed control system architecture and the interface for the communication system have been designed, developed, and implemented.
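The core check such a controller performs on each monitoring cycle, whether every job can still meet its due date, can be illustrated with a minimal sketch; the job names, hours, and notification mechanism below are hypothetical placeholders, not the thesis's controller.

```python
# Illustrative sketch only: flag jobs whose remaining work no longer fits
# before the due date, so the decision maker can be notified.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    remaining_hours: float   # processing time still required
    hours_to_due: float      # time left before the due date

def infeasible_jobs(jobs):
    """Names of jobs whose remaining work exceeds the time to the due date."""
    return [j.name for j in jobs if j.remaining_hours > j.hours_to_due]

# A real controller would run this check on every monitoring cycle and push
# a real-time notification to the decision maker when the list is non-empty.
print(infeasible_jobs([Job("J-101", 5.0, 4.0), Job("J-102", 2.0, 6.0)]))
```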