17 results for user interface development
Abstract:
The text editor WinEdt 5 may be easily configured to provide a user interface for TDA. The configuration described below allows you to launch TDA command files directly from within WinEdt (via menu or shortcut). TDA's standard output will be written to disk and displayed in WinEdt automatically. Furthermore, you may also just execute selected parts of a command file.
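The same launch-and-capture mechanism can be sketched outside WinEdt. The following Python sketch is a minimal illustration under stated assumptions: a `tda` executable on the PATH, a hypothetical `cf=` invocation (check your TDA version), and a placeholder command file `analysis.cf`:

```python
import subprocess
from pathlib import Path

def run_tda(command_file: str, log_file: str = "tda.log") -> str:
    """Launch TDA on a command file and capture its standard output.

    Mirrors the WinEdt setup described above: run TDA, write its output
    to disk, and return the log text for display in the editor.
    """
    result = subprocess.run(
        ["tda", "cf=" + command_file],  # hypothetical invocation; adjust to your TDA version
        capture_output=True,
        text=True,
    )
    Path(log_file).write_text(result.stdout)
    return result.stdout

if __name__ == "__main__":
    print(run_tda("analysis.cf"))  # analysis.cf is a placeholder command file
```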
Abstract:
Neurally adjusted ventilatory assist (NAVA) delivers airway pressure (Paw) in proportion to the electrical activity of the diaphragm (EAdi) using an adjustable proportionality constant (NAVA level, cmH2O/μV). During systematic increases in the NAVA level, feedback-controlled down-regulation of the EAdi results in a characteristic two-phased response in Paw and tidal volume (Vt). The transition from the 1st to the 2nd response phase allows identification of adequate unloading of the respiratory muscles with NAVA (NAVA_AL). We aimed to develop and validate a mathematical algorithm to identify NAVA_AL. Paw, Vt, and EAdi were recorded while systematically increasing the NAVA level in 19 adult patients. In a multistep approach, inspiratory Paw peaks were first identified by dividing the EAdi into inspiratory portions using Gaussian mixture modeling. Two polynomials were then fitted onto the curves of both the Paw peaks and Vt. The beginning of the Paw and Vt plateaus, and thus NAVA_AL, was identified at the minimum of the squared polynomial derivative and the polynomial fitting errors. A graphical user interface was developed in the Matlab computing environment. Median NAVA_AL as visually estimated by 18 independent physicians was 2.7 (range 0.4 to 5.8) cmH2O/μV; as identified by our model it was 2.6 (range 0.6 to 5.0) cmH2O/μV. NAVA_AL identified by our model was below the range of visually estimated NAVA_AL in two instances and above it in one instance. We conclude that our model identifies NAVA_AL in most instances with acceptable accuracy for application in clinical routine and research.
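A minimal sketch of the plateau-identification step, assuming Paw peaks and Vt have already been extracted for each NAVA-level step: the polynomial fit and the squared-derivative criterion follow the description above, while the function name, the polynomial degree, and the way the two error terms are normalized and combined are illustrative assumptions, not the published algorithm's exact choices:

```python
import numpy as np

def find_nava_al(nava_levels, paw_peaks, vt, degree=4):
    """Locate the beginning of the Paw/Vt plateau (NAVA_AL).

    Fits a polynomial to each response curve and scores every NAVA level
    by the squared polynomial derivative plus the squared fitting error;
    the minimum of the combined score marks the start of the plateau.
    """
    x = np.asarray(nava_levels, dtype=float)
    scores = np.zeros_like(x)
    for y in (np.asarray(paw_peaks, float), np.asarray(vt, float)):
        coeffs = np.polyfit(x, y, degree)
        fit = np.polyval(coeffs, x)
        deriv = np.polyval(np.polyder(coeffs), x)
        # Normalize so Paw and Vt contribute on comparable scales (assumption).
        scores += (deriv / np.abs(deriv).max()) ** 2 + ((y - fit) / np.abs(y).max()) ** 2
    return x[np.argmin(scores)]
```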
Abstract:
Currently, photon Monte Carlo treatment planning (MCTP) for a patient stored in the patient database of a treatment planning system (TPS) can usually only be performed using a cumbersome multi-step procedure requiring many user interactions. Automation is therefore needed for use in clinical routine. In addition, because of the long computing times in MCTP, optimization of the MC calculations is essential. For these purposes a new graphical user interface (GUI)-based photon MC environment has been developed, resulting in a very flexible framework in which appropriate MC transport methods are assigned to different geometric regions while still benefiting from the features included in the TPS. In order to provide a flexible MC environment, the MC particle transport has been divided into different parts: the source, the beam modifiers and the patient. The source part includes the phase-space source, source models and full MC transport through the treatment head. The beam modifier part consists of one module for each beam modifier. To simulate the radiation transport through each individual beam modifier, one of three full MC transport codes can be selected independently. Additionally, for each beam modifier a simple or an exact geometry can be chosen. In this way, different complexity levels of radiation transport are applied during the simulation. For the patient dose calculation, two different MC codes are available. A special plug-in in Eclipse, providing all necessary information by means of DICOM streams, was used to start the developed MC GUI. The implementation of this framework separates the MC transport from the geometry, and the modules pass the particles in memory; hence, no files are used as the interface. The implementation is realized for 6 and 15 MV beams of a Varian Clinac 2300 C/D. Several applications demonstrate the usefulness of the framework. Apart from applications dealing with the beam modifiers, two patient cases are shown, in which MC-calculated dose distributions are compared with those calculated by a pencil beam algorithm or the AAA algorithm. Interfacing this flexible and efficient MC environment with Eclipse allows widespread use for all kinds of investigations, from timing and benchmarking studies to clinical patient studies. Additionally, it is possible to add modules, keeping the system highly flexible and efficient.
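The described separation of source, beam modifiers and patient into modules that pass particles in memory can be sketched as a simple pipeline. Class and method names below are our illustrative assumptions, not identifiers from the actual framework:

```python
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Particle:
    energy: float      # MeV
    position: tuple    # (x, y, z) in cm
    direction: tuple   # unit vector

class TransportModule:
    """One stage of the MC pipeline: the source, one beam modifier, or the patient."""
    def transport(self, particles: Iterable[Particle]) -> List[Particle]:
        raise NotImplementedError

class Pipeline:
    """Chains modules and passes particles in memory; no files as the interface."""
    def __init__(self, modules: List[TransportModule]):
        self.modules = modules

    def run(self, particles: List[Particle]) -> List[Particle]:
        for module in self.modules:
            particles = module.transport(particles)  # e.g. source -> modifier -> patient
        return particles
```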
Abstract:
A previously presented algorithm for the reconstruction of bremsstrahlung spectra from transmission data has been implemented in MATHEMATICA. Vectorial algebra has been used to solve the matrix system A · F = T, where F is the photon fluence spectrum and T the measured transmission data. The new implementation has been tested by reconstructing photon spectra from transmission data acquired in narrow-beam conditions, for nominal energies of 6, 15, and 25 MV. The results were in excellent agreement with the original calculations. Our implementation has the advantage of being based on a well-tested mathematical kernel. Furthermore, it offers a comfortable user interface.
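The reconstruction amounts to solving the linear system A · F = T for the spectrum F given measured transmission values T. Below is a minimal Python sketch using a non-negative least-squares solve; the non-negativity constraint is one reasonable choice for keeping the fluence physical, not necessarily what the MATHEMATICA implementation does:

```python
import numpy as np
from scipy.optimize import nnls

def reconstruct_spectrum(A: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Solve A @ F = T for the photon fluence spectrum F.

    A[i, j]: transmission contribution of spectral bin j for absorber i;
    T[i]:    measured transmission for absorber i.
    """
    F, _residual = nnls(A, T)  # non-negative least squares keeps F physical
    return F / F.sum()         # normalize the reconstructed fluence spectrum
```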
Abstract:
We developed an object-oriented cross-platform program to perform three-dimensional (3D) analysis of hip joint morphology using two-dimensional (2D) anteroposterior (AP) pelvic radiographs. Landmarks extracted from 2D AP pelvic radiographs, and optionally an additional lateral pelvic X-ray, were combined with a cone beam projection model to reconstruct 3D hip joints. Since individual pelvic orientation can vary considerably, a method for standardizing pelvic orientation was implemented to determine the absolute tilt/rotation. The evaluation of anatomically morphologic differences was achieved by reconstructing the projected acetabular rim and the measured hip parameters as if obtained in a standardized neutral orientation. The program has been successfully used to interactively objectify acetabular version in hips with femoro-acetabular impingement or developmental dysplasia. Hip2Norm is written in the object-oriented programming language C++ using the cross-platform toolkit Qt (TrollTech, Oslo, Norway) for the graphical user interface (GUI) and is portable to any platform.
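At the heart of the reconstruction is a cone beam (central) projection that maps a 3D landmark through the X-ray focal point onto the film plane. The sketch below illustrates the geometry only; the focal distance and coordinate conventions are assumptions, not Hip2Norm's actual parameters:

```python
def project_cone_beam(point_3d, source_z=1000.0, film_z=0.0):
    """Project a 3D landmark onto the film plane of an AP radiograph.

    The X-ray source sits on the z-axis at height source_z; the film lies
    in the plane z = film_z. Similar triangles give the projected position.
    """
    x, y, z = point_3d
    scale = (source_z - film_z) / (source_z - z)  # magnification of this point
    return (x * scale, y * scale)
```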
Abstract:
Enterprise Applications are complex software systems that manipulate large amounts of persistent data and interact with the user through a vast and complex user interface. In particular, applications written for the Java 2 Platform, Enterprise Edition (J2EE) are composed using various technologies such as Enterprise Java Beans (EJB) or Java Server Pages (JSP), which in turn rely on languages other than Java, such as XML or SQL. In this heterogeneous context, applying existing reverse engineering and quality assurance techniques developed for object-oriented systems is not enough. Because those techniques were created to measure quality or provide information about one aspect of J2EE applications, they cannot properly measure the quality of the entire system. We intend to devise techniques and metrics to measure quality in J2EE applications considering all their aspects and to aid their evolution. Using software visualization, we also intend to inspect the structure of J2EE applications and all other aspects that can be investigated through this technique. In order to do that we also need to create a unified meta-model including all elements composing a J2EE application.
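A unified meta-model of the kind proposed would, at a minimum, represent every artifact of the heterogeneous system (EJBs, JSPs, XML descriptors, SQL statements) as nodes with cross-language references. The sketch below is our illustrative assumption of such a structure, not the meta-model the authors intend to build:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Element:
    """One artifact of a J2EE application, whatever language it is written in."""
    name: str
    kind: str                    # e.g. "EJB", "JSP", "XML descriptor", "SQL query"
    references: List["Element"] = field(default_factory=list)

# A JSP page calling an EJB that issues a SQL query, captured in one model:
query = Element("findCustomers", "SQL query")
bean = Element("CustomerBean", "EJB", [query])
page = Element("customers.jsp", "JSP", [bean])
```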
Abstract:
We present a program (Ragu; Randomization Graphical User interface) for statistical analyses of multichannel event-related EEG and MEG experiments. Based on measures of scalp field differences including all sensors, and using powerful, assumption-free randomization statistics, the program yields robust, physiologically meaningful conclusions based on the entire, untransformed, and unbiased set of measurements. Ragu accommodates up to two within-subject factors and one between-subject factor, with multiple levels each. Significance is computed as a function of time and can be controlled for type II errors with overall analyses. Results are displayed in an intuitive visual interface that allows further exploration of the findings. A sample analysis of an ERP experiment illustrates the different possibilities offered by Ragu. The aim of Ragu is to maximize statistical power while minimizing the need for a priori choices of models and parameters (like inverse models or sensors of interest) that interact with and bias statistics.
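The core of such randomization statistics is compact: compute a global measure of scalp field difference across all sensors, then compare it with the distribution obtained by randomly permuting condition assignments. Below is a minimal sketch for one time point of a paired (within-subject) design, using the global field power of the difference map as the statistic; this is a simplified stand-in, not Ragu's full procedure:

```python
import numpy as np

def randomization_test(cond_a, cond_b, n_perm=5000, seed=None):
    """Assumption-free randomization test on scalp field differences.

    cond_a, cond_b: arrays of shape (n_subjects, n_sensors) at one time point.
    Statistic: global field power (spatial std) of the mean difference map.
    Significance: sign-flipping the per-subject differences n_perm times.
    """
    rng = np.random.default_rng(seed)
    diff = np.asarray(cond_a, float) - np.asarray(cond_b, float)
    observed = np.std(diff.mean(axis=0))
    count = 0
    for _ in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(diff.shape[0], 1))
        count += np.std((signs * diff).mean(axis=0)) >= observed
    return count / n_perm  # p-value of the observed field difference
```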
Abstract:
Background: Dementia is a multifaceted disorder that impairs cognitive functions, such as memory, language, and the executive functions necessary to plan, organize, and prioritize tasks required for goal-directed behaviors. In most cases, individuals with dementia experience difficulties interacting with physical and social environments. The purpose of this study was to establish ecological validity and initial construct validity of a fire evacuation Virtual Reality Day-Out Task (VR-DOT) environment, based on performance profiles, as a screening tool for early dementia. Objective: The objectives were (1) to examine the relationships among the performances of 3 groups of participants in the VR-DOT and traditional neuropsychological tests employed to assess executive functions, and (2) to compare the performance of participants with mild Alzheimer's-type dementia (AD) to those with amnestic single-domain mild cognitive impairment (MCI) and healthy controls in the VR-DOT and traditional neuropsychological tests used to assess executive functions. We hypothesized that the 2 cognitively impaired groups would have distinct performance profiles and show significantly impaired independent functioning in activities of daily living (ADL) compared to the healthy controls. Methods: The study population included 3 groups: 72 healthy control elderly participants, 65 amnestic MCI participants, and 68 mild AD participants. A natural user interface framework based on a fire evacuation VR-DOT environment was used for assessing the physical and cognitive abilities of seniors over 3 years. VR-DOT focuses on the subtle errors and patterns in performing everyday activities and has the advantage of not depending on a subjective rating of an individual person. We further assessed functional capacity with neuropsychological tests (including measures of attention, memory, working memory, executive functions, language, and depression). We also evaluated performance in finger tapping, grip strength, stride length, gait speed, and chair stands, both separately and while performing VR-DOTs, in order to correlate these measures with VR-DOT performance, since performance while navigating a virtual environment is a valid and reliable indicator of cognitive decline in elderly persons. Results: The mild AD group was more impaired than the amnestic MCI group, and both were more impaired than healthy controls. The novel VR-DOT functional index correlated strongly with standard cognitive and functional measurements, such as mini-mental state examination (MMSE; rho=0.26, P=.01) and Bristol Activities of Daily Living (ADL) scale scores (rho=0.32, P=.001). Conclusions: Functional impairment is a defining characteristic of predementia and is partly dependent on the degree of cognitive impairment. The novel virtual reality measures of functional ability seem more sensitive to functional impairment than qualitative measures in predementia, thus accurately differentiating patients from healthy controls. We conclude that VR-DOT is an effective tool for discriminating predementia and mild AD from controls by detecting differences in terms of errors, omissions, and perseverations while measuring ADL functional ability.
Abstract:
Various applications for the purposes of event detection, localization, and monitoring can benefit from the use of wireless sensor networks (WSNs). Wireless sensor networks are generally easy to deploy, have a flexible topology, and can support a diversity of tasks thanks to the large variety of sensors that can be attached to the wireless sensor nodes. To guarantee the efficient operation of such a heterogeneous wireless sensor network during its lifetime, appropriate management is necessary. Typically, there are three management tasks, namely monitoring, (re)configuration, and code updating. On the one hand, status information, such as battery state and node connectivity, of both the wireless sensor network and the sensor nodes has to be monitored. On the other hand, sensor nodes have to be (re)configured, e.g., by setting the sensing interval. Most importantly, new applications have to be deployed and bug fixes have to be applied during the network lifetime. All management tasks have to be performed in a reliable, time- and energy-efficient manner. The ability to disseminate data from one sender to multiple receivers in a reliable, time- and energy-efficient manner is critical for the execution of the management tasks, especially for code updating. Using multicast communication in wireless sensor networks is an efficient way to handle such traffic patterns. Due to the nature of code updates, a multicast protocol has to support bulky traffic and end-to-end reliability. Further, the limited resources of wireless sensor nodes demand an energy-efficient operation of the multicast protocol. Current data dissemination schemes do not fulfil all of the above requirements. In order to close the gap, we designed the Sensor Node Overlay Multicast (SNOMC) protocol to support reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receivers. In contrast to other multicast transport protocols, which do not support reliability mechanisms, SNOMC supports end-to-end reliability using a NACK-based reliability mechanism. The mechanism is simple and easy to implement and can significantly reduce the number of transmissions. It is complemented by a data acknowledgement after successful reception of all data fragments by the receiver nodes. In SNOMC, three different caching strategies are integrated for efficient handling of the necessary retransmissions: caching on each intermediate node, caching on branching nodes, or caching only on the sender node. Moreover, an option was included to pro-actively request missing fragments. SNOMC was evaluated both in the OMNeT++ simulator and in our in-house real-world testbed, and compared to a number of common data dissemination protocols, such as Flooding, MPR, TinyCubus, PSFQ, and both UDP and TCP. The results showed that SNOMC outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption. Moreover, we showed that SNOMC performs well with different underlying MAC protocols, which support different levels of reliability and energy-efficiency. Thus, SNOMC can offer a robust, high-performing solution for the efficient distribution of code updates and management information in a wireless sensor network. To address the three management tasks, in this thesis we developed the Management Architecture for Wireless Sensor Networks (MARWIS). MARWIS is specifically designed for the management of heterogeneous wireless sensor networks.
A distinguishing feature of its design is the use of wireless mesh nodes as a backbone, which enables diverse communication platforms and the offloading of functionality from the sensor nodes to the mesh nodes. This hierarchical architecture allows for efficient operation of the management tasks, due to the organisation of the sensor nodes into small sub-networks, each managed by a mesh node. Furthermore, we developed an intuitive graphical user interface, which allows non-expert users to easily perform management tasks in the network. In contrast to other management frameworks, such as Mate, MANNA, and TinyCubus, or code dissemination protocols, such as Impala, Trickle, and Deluge, MARWIS offers an integrated solution for monitoring, configuration and code updating of sensor nodes. Integration of SNOMC into MARWIS further increases the performance efficiency of the management tasks. To our knowledge, our approach is the first to offer a combination of a management architecture with an efficient overlay multicast transport protocol. This combination of SNOMC and MARWIS supports reliable, time- and energy-efficient operation of a heterogeneous wireless sensor network.
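The NACK-based reliability mechanism is easiest to see from the receiver side: track which fragments of a code image have arrived, request only the missing ones, and acknowledge once everything is there. The sketch below is illustrative; fragment numbering and message formats are our assumptions, not SNOMC's wire format:

```python
class FragmentReceiver:
    """Receiver-side bookkeeping for NACK-based end-to-end reliability.

    Fragments may arrive out of order. When gaps are detected, a NACK
    listing the missing fragment numbers is sent (pro-active request);
    a data acknowledgement follows reception of all fragments.
    """
    def __init__(self, total_fragments: int):
        self.total = total_fragments
        self.received = set()

    def on_fragment(self, seq: int, payload: bytes) -> None:
        self.received.add(seq)  # storage/forwarding of payload omitted in this sketch

    def missing(self) -> list:
        return sorted(set(range(self.total)) - self.received)

    def poll(self, send) -> None:
        """Call periodically (e.g., on a timer): NACK gaps or ACK completion."""
        gaps = self.missing()
        send(("NACK", gaps) if gaps else ("ACK", self.total))
```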
Abstract:
In ERP experiments it is often not clear from the outset in which time windows effects are to be expected. Analyses therefore have to explore the data across several time windows. In addition, statistical analyses that take all electrodes into account are desirable but not trivial. To address these problems, we present here the program Ragu (Randomization Graphical User interface), which is designed specifically for the statistical analysis of multichannel EEG experiments. Ragu is intended to give researchers the ability to assess the significance of ERP effects globally, without the need for a priori assumptions. The program is based on measures of field-strength differences that take all electrodes into account. In the first part of this workshop we will work out the need for topographic ERP analyses in light of the volume conduction problem and draw comparisons with single-electrode approaches. Using our freely available in-house software Ragu, we will explain the principle of randomization statistics and its different possible applications to ERP analyses. In a second part, participants will have the opportunity to try Ragu on a sample data set and to discuss possible applications of Ragu in their own research.
Abstract:
This chapter presents an evaluation and initial testing of a meta-application (meta-app) for enhanced communication and improved interaction (e.g., appointment scheduling) between stakeholders (e.g., citizens) in cognitive cities. The underlying theoretical models as well as the paper prototype are presented to ensure the comprehensibility of the user interface. This paper prototype of the meta-app was evaluated through interviews with experts in different fields (e.g., a strategic consultant, the cofounder of a small and medium-sized enterprise in the field of online marketing, an IT project leader, and an innovation manager). The results and implications of the evaluation show that the idea behind the meta-app has the potential to improve the living standards of citizens and to lead to a next step in its realization and maturity. The meta-app helps citizens manage their time and organize their personal schedules more effectively, giving them more leisure time and the ability to take full advantage of it; this supports a good work-life balance, which in turn enables them to be efficient and productive during their working time.
Abstract:
XMapTools is a MATLAB-based graphical user interface program for electron microprobe X-ray image processing, which can be used to estimate the pressure–temperature conditions of crystallization of minerals in metamorphic rocks. This program (available online at http://www.xmaptools.com) provides a method to standardize raw electron microprobe data and includes functions to calculate the oxide weight percent compositions for various minerals. A set of external functions is provided to calculate structural formulae from the standardized analyses as well as to estimate pressure–temperature conditions of crystallization, using empirical and semi-empirical thermobarometers from the literature. Two graphical user interface modules, Chem2D and Triplot3D, are used to plot mineral compositions in binary and ternary diagrams. As an example, the software is used to study a high-pressure Himalayan eclogite sample from the Stak massif in Pakistan. The high-pressure paragenesis consisting of omphacite and garnet has been retrogressed to a symplectitic assemblage of amphibole, plagioclase and clinopyroxene. Mineral compositions corresponding to ~165,000 analyses yield estimates for the eclogitic pressure–temperature retrograde path from 25 kbar to 9 kbar. The corresponding pressure–temperature maps were plotted and used to interpret the link between the equilibrium conditions of crystallization and the symplectitic microstructures. This example illustrates the usefulness of XMapTools for studying variations in the chemical composition of minerals and for retrieving information on metamorphic conditions at the microscale, towards the computation of continuous pressure–temperature (and relative time) paths in zoned metamorphic minerals not affected by post-crystallization diffusion.
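The standardization step turns raw X-ray count maps into oxide weight percent by calibrating each element's counts against spot analyses of known composition. The least-squares proportionality below is a deliberately simplified sketch of that idea (no background or matrix corrections), not XMapTools' actual procedure:

```python
import numpy as np

def standardize_map(raw_counts, std_counts, std_wt_percent):
    """Convert a raw X-ray intensity map to oxide wt% for one element.

    raw_counts:     2D array of counts per pixel from the microprobe map
    std_counts:     counts measured on internal standard spots
    std_wt_percent: known oxide wt% of those same spots
    """
    c = np.asarray(std_counts, float)
    w = np.asarray(std_wt_percent, float)
    k = (c @ w) / (c @ c)  # least-squares slope through the origin
    return np.asarray(raw_counts, float) * k
```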
Abstract:
The first part summarises the origins, definitions and debates around the general notions of development and culture, and associated more specific concepts such as identity, tradition, exogenous and endogenous knowledge, institutions, governance or territoriality. The second part highlights how culture and development became related in the debates around the sustainable governance of natural resources and forests. The third part illustrates, on the basis of case studies from Kenya and Bolivia, how culture as a transversal element of forest governance is expressed in empirical terms. Moreover, it is shown how the cultural dimension positively or negatively affects the outcomes of culturally shaped forest governance, and the role these effects play in shaping the sustainability of the socio-ecological systems of forests in Africa and South America.