718 results for Intuitive
Abstract:
The anatomy of the human brain is organized as a complex arrangement of interrelated structures in three-dimensional space. To facilitate the understanding of both structure and function, we have created a volume-rendered brain atlas (VRBA) with an intuitive interface that allows real-time stereoscopic rendering of brain anatomy. The VRBA incorporates 2-dimensional and 3-dimensional texture mapping to display segmented brain anatomy co-registered with a T1 MRI. The interface allows the user to remove and add any of the 62 brain structures, as well as control the display of the MRI dataset. The atlas also contains brief verbal and written descriptions of the different anatomical regions to correlate structure with function. A variety of stereoscopic projection methods are supported by the VRBA and provide an abstract, yet simple, way of visualizing brain anatomy and the 3-dimensional relationships between different nuclei.
Abstract:
In recent years, interactive media and tools, such as scientific simulations, simulation environments, and dynamic data visualizations, have become established methods in the neural and cognitive sciences. Hence, university teachers of neural and cognitive sciences face the challenge of integrating these media into the neuroscientific curriculum. Simulations and dynamic visualizations in particular offer great opportunities for teachers and learners, since they are both illustrative and explorable. However, simulations pose instructional problems: they are abstract and demand both computer skills and conceptual knowledge about what the simulations are intended to explain. Guided by two central questions, this article provides an overview of possible approaches to be applied in neuroscience education and opens perspectives for their curricular integration: (i) How can complex scientific media be transformed for educational use in an efficient and (for students on all levels) comprehensible manner, and (ii) by what technical infrastructure can this transformation be supported? Exemplified by educational simulations for the neurosciences and their application in courses, answers to these questions are proposed (a) by introducing a specific educational simulation approach for the neurosciences, (b) by introducing an e-learning environment for simulations, and (c) by providing examples of curricular integration on different levels, which might help academic teachers to integrate newly created or existing interactive educational resources into their courses.
Abstract:
Complementary to automatic extraction processes, Virtual Reality technologies provide an adequate framework for integrating human perception into the exploration of large data sets. In such a multisensory system, intuitive interactions let users bring all of their perceptual abilities to the exploration task. In this context, haptic perception coupled with visual rendering has been investigated for the last two decades, with significant achievements. In this paper, we present a survey of how haptic feedback has been exploited in the exploration of large data sets. For each haptic technique introduced, we describe its principles and its effectiveness.
Abstract:
Opaque products enable service providers to hide specific characteristics of their service fulfillment from the customer until after purchase. Prominent examples include internet-based service providers selling airline tickets without defining details, such as departure time or operating airline, until the booking has been made. Owing to the resulting flexibility in resource utilization, the traditional revenue management process needs to be modified. In this paper, we extend dynamic programming decomposition techniques widely used for traditional revenue management to develop an intuitive capacity control approach that allows for the incorporation of opaque products. In a simulation study, we show that the developed approach significantly outperforms other well-known capacity control approaches adapted to the opaque product setting. Based on the approach, we also provide computational examples of how the share of opaque products as well as the degree of opacity can influence the results.
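To make the capacity-control idea concrete, the following toy sketch (not the paper's decomposition) shows a single-leg dynamic program per flight and a bid-price rule that routes an opaque request to the flight with the lower opportunity cost. The fares, probabilities, and the simplification that opaque demand is ignored when building the value functions are all illustrative assumptions.

```python
import numpy as np

# Toy illustration of bid-price capacity control with an opaque product
# (not the paper's model): two parallel flights, each with its own single-leg
# dynamic program built from flight-specific demand only. An opaque request
# may be fulfilled on either flight and is accepted if its fare covers the
# smaller of the two bid prices. Fares and probabilities are invented.

T, C = 30, 10                 # decision periods and seats per flight
fare_specific = 120.0         # fare of the regular, flight-specific product
fare_opaque = 80.0            # discounted fare of the opaque product
p1, p2 = 0.15, 0.20           # per-period request probabilities per flight

def value_function(fare, p, T, C):
    """Standard single-leg DP: V[t, c] = expected revenue-to-go with t periods left."""
    V = np.zeros((T + 1, C + 1))
    for t in range(1, T + 1):
        for c in range(C + 1):
            accept_gain = 0.0
            if c > 0:
                bid_price = V[t - 1, c] - V[t - 1, c - 1]
                accept_gain = max(fare - bid_price, 0.0)
            V[t, c] = V[t - 1, c] + p * accept_gain
    return V

V1 = value_function(fare_specific, p1, T, C)
V2 = value_function(fare_specific, p2, T, C)

def handle_opaque_request(t, c1, c2):
    """Accept an opaque request if its fare exceeds the cheaper bid price and
    assign it to the flight with the lower opportunity cost."""
    bids = []
    if c1 > 0:
        bids.append((V1[t - 1, c1] - V1[t - 1, c1 - 1], "flight 1"))
    if c2 > 0:
        bids.append((V2[t - 1, c2] - V2[t - 1, c2 - 1], "flight 2"))
    if not bids:
        return None
    bid, flight = min(bids)
    return flight if fare_opaque >= bid else None

print(handle_opaque_request(t=15, c1=3, c2=7))
```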
Abstract:
Performance availability, a measure of how well logistic processes meet their targets, is usually determined while a logistics facility is already in operation. We have developed a simulation system that determines the performance availability of a cellular intralogistics facility in advance. The detailed simulation captures not only the influence of static parameters, such as the dimensioning of the facility, but also dynamic parameters, such as vehicle behavior and order composition. Because the presented system is real-time and VR capable, it can be presented in a VR environment. Intuitive interaction mechanisms and visualization metaphors provide intuitive access to the system's performance availability and to the quantities that influence it.
Abstract:
We present a program (Ragu; Randomization Graphical User interface) for statistical analyses of multichannel event-related EEG and MEG experiments. Based on measures of scalp field differences including all sensors, and using powerful, assumption-free randomization statistics, the program yields robust, physiologically meaningful conclusions based on the entire, untransformed, and unbiased set of measurements. Ragu accommodates up to two within-subject factors and one between-subject factor with multiple levels each. Significance is computed as a function of time and can be controlled for type II errors with overall analyses. Results are displayed in an intuitive visual interface that allows further exploration of the findings. A sample analysis of an ERP experiment illustrates the different possibilities offered by Ragu. The aim of Ragu is to maximize statistical power while minimizing the need for a priori choices of models and parameters (like inverse models or sensors of interest) that interact with and bias statistics.
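As a rough illustration of this kind of assumption-free randomization statistic, the sketch below implements a simplified permutation test in Python: the effect at each time point is summarized by the global field power of the mean condition-difference map and compared against a null distribution built from random sign flips within subjects. This is not Ragu itself (a standalone graphical tool); the statistic, the sign-flipping scheme, and all array shapes are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not Ragu) of a randomization test on scalp field differences:
# for each time point, compare the global field power of the mean difference map
# against a null distribution obtained by randomly flipping condition labels
# within subjects.
rng = np.random.default_rng(0)

def gfp(v):
    """Global field power: spatial standard deviation across sensors."""
    return np.std(v, axis=-1)

def randomization_test(cond_a, cond_b, n_perm=1000):
    """cond_a, cond_b: arrays of shape (subjects, time, sensors)."""
    diff = cond_a - cond_b                      # within-subject difference maps
    observed = gfp(diff.mean(axis=0))           # effect size per time point
    count = np.zeros_like(observed)
    for _ in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(diff.shape[0], 1, 1))
        null = gfp((signs * diff).mean(axis=0))
        count += (null >= observed)
    return count / n_perm                       # p-value per time point

# Hypothetical data: 20 subjects, 100 time points, 64 sensors.
a = rng.normal(size=(20, 100, 64))
b = rng.normal(size=(20, 100, 64))
b[:, 40:60, :] += 0.3                           # injected effect
p = randomization_test(a, b, n_perm=500)
print(np.where(p < 0.05)[0])                    # time points with significant differences
```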
Abstract:
Responses of many real-world problems can only be evaluated perturbed by noise. In order to make an efficient optimization of these problems possible, intelligent optimization strategies that successfully cope with noisy evaluations are required. In this article, a comprehensive review of existing kriging-based methods for the optimization of noisy functions is provided. In summary, ten methods for choosing the sequential samples are described using a unified formalism. They are compared on analytical benchmark problems, whereby the usual assumption of homoscedastic Gaussian noise made in the underlying models is met. Different problem configurations (noise level, maximum number of observations, initial number of observations) and setups (covariance functions, budget, initial sample size) are considered. It is found that the choices of the initial sample size and the covariance function are not critical. The choice of the method, however, can result in significant differences in performance. In particular, the three most intuitive criteria are found to be poor alternatives. Although no criterion is found to be consistently more efficient than the others, two specialized methods appear more robust on average.
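The generic structure of such kriging-based sequential optimization under noise can be sketched as follows, here with scikit-learn's Gaussian process regressor, a WhiteKernel to model homoscedastic noise, and a simple lower-confidence-bound infill rule. The infill criterion and all parameters are placeholders chosen for brevity; they are not one of the ten methods benchmarked in the article.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

# Generic sequential loop for kriging-based optimization of a noisy function.
# The WhiteKernel term models homoscedastic Gaussian noise; the infill rule is
# a plain lower confidence bound, used only to illustrate the loop structure.
rng = np.random.default_rng(1)

def noisy_objective(x, noise_sd=0.1):
    return np.sin(3 * x) + 0.5 * x ** 2 + rng.normal(0.0, noise_sd, size=np.shape(x))

X = rng.uniform(-2, 2, size=(5, 1))              # initial design
y = noisy_objective(X[:, 0])

kernel = Matern(nu=2.5) + WhiteKernel()
candidates = np.linspace(-2, 2, 400)[:, None]

for _ in range(20):                              # sequential evaluation budget
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    mu, sd = gp.predict(candidates, return_std=True)
    x_next = candidates[np.argmin(mu - 2.0 * sd)]   # lower confidence bound
    X = np.vstack([X, [x_next]])
    y = np.append(y, noisy_objective(x_next[0]))

gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
mu, _ = gp.predict(candidates, return_std=True)
print("estimated minimizer:", float(candidates[np.argmin(mu)][0]))
```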
Abstract:
For smart city applications, a key requirement is to disseminate data collected from both scalar and multimedia wireless sensor networks to thousands of end-users. Furthermore, the information must be delivered to non-specialist users in a simple, intuitive and transparent manner. In this context, we present Sensor4Cities, a user-friendly tool that enables data dissemination to large audiences by using social networks and/or web pages. The user can request and receive monitored information through social networks, e.g., Twitter and Facebook, due to their popularity, user-friendly interfaces and easy dissemination. Additionally, the user can collect or share information from smart city services by using web pages, which also include a mobile version for smartphones. Finally, the tool can be configured to periodically monitor environmental conditions, specific behaviors or abnormal events, and notify users in an asynchronous manner. Sensor4Cities improves data delivery for individuals or groups of users of smart city applications and encourages the development of new user-friendly services.
Abstract:
Various applications for the purposes of event detection, localization, and monitoring can benefit from the use of wireless sensor networks (WSNs). Wireless sensor networks are generally easy to deploy, have a flexible topology, and can support a diversity of tasks thanks to the large variety of sensors that can be attached to the wireless sensor nodes. To guarantee the efficient operation of such a heterogeneous wireless sensor network during its lifetime, appropriate management is necessary. Typically, there are three management tasks, namely monitoring, (re)configuration, and code updating. On the one hand, status information of both the wireless sensor network and the sensor nodes, such as battery state and node connectivity, has to be monitored. On the other hand, sensor nodes have to be (re)configured, e.g., by setting the sensing interval. Most importantly, new applications have to be deployed and bug fixes applied during the network lifetime. All management tasks have to be performed in a reliable, time- and energy-efficient manner. The ability to disseminate data from one sender to multiple receivers in a reliable, time- and energy-efficient manner is critical for the execution of the management tasks, especially for code updating. Using multicast communication in wireless sensor networks is an efficient way to handle such a traffic pattern. Due to the nature of code updates, a multicast protocol has to support bulky traffic and end-to-end reliability. Further, the limited resources of wireless sensor nodes demand an energy-efficient operation of the multicast protocol. Current data dissemination schemes do not fulfil all of the above requirements. In order to close the gap, we designed the Sensor Node Overlay Multicast (SNOMC) protocol to support reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receiver nodes. In contrast to other multicast transport protocols, which do not support reliability mechanisms, SNOMC supports end-to-end reliability using a NACK-based reliability mechanism. The mechanism is simple and easy to implement and can significantly reduce the number of transmissions. It is complemented by a data acknowledgement after successful reception of all data fragments by the receiver nodes. SNOMC integrates three caching strategies for efficient handling of necessary retransmissions: caching on each intermediate node, caching on branching nodes, or caching only on the sender node. Moreover, an option was included to proactively request missing fragments. SNOMC was evaluated both in the OMNeT++ simulator and in our in-house real-world testbed and compared to a number of common data dissemination protocols, such as Flooding, MPR, TinyCubus, and PSFQ, as well as to both UDP and TCP. The results showed that SNOMC outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption. Moreover, we showed that SNOMC performs well with different underlying MAC protocols, which support different levels of reliability and energy efficiency. Thus, SNOMC can offer a robust, high-performing solution for the efficient distribution of code updates and management information in a wireless sensor network. To address the three management tasks, in this thesis we developed the Management Architecture for Wireless Sensor Networks (MARWIS). MARWIS is specifically designed for the management of heterogeneous wireless sensor networks.
A distinguishing feature of its design is the use of wireless mesh nodes as a backbone, which enables diverse communication platforms and offloads functionality from the sensor nodes to the mesh nodes. This hierarchical architecture allows for efficient operation of the management tasks, due to the organisation of the sensor nodes into small sub-networks, each managed by a mesh node. Furthermore, we developed an intuitive graphical user interface, which allows non-expert users to easily perform management tasks in the network. In contrast to other management frameworks, such as Mate, MANNA, and TinyCubus, or code dissemination protocols, such as Impala, Trickle, and Deluge, MARWIS offers an integrated solution for monitoring, configuration and code updating of sensor nodes. Integration of SNOMC into MARWIS further increases the efficiency of the management tasks. To our knowledge, our approach is the first to combine a management architecture with an efficient overlay multicast transport protocol. This combination of SNOMC and MARWIS supports reliable, time- and energy-efficient operation of a heterogeneous wireless sensor network.
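As an illustration of the NACK-based end-to-end reliability mechanism described in this abstract, the following minimal sketch shows a receiver that tracks arrived fragments, requests only the missing ones, and issues a data acknowledgement once the transfer is complete. It is a generic Python mock-up, not SNOMC's actual implementation, and it omits the caching strategies and proactive requests.

```python
# Generic sketch of NACK-based end-to-end reliability for fragmented bulk data
# (illustrative only, not the SNOMC implementation). The receiver tracks which
# fragments of a code image have arrived and requests only the missing ones.

class FragmentReceiver:
    def __init__(self, total_fragments):
        self.total = total_fragments
        self.received = {}                 # fragment index -> payload bytes

    def on_fragment(self, index, payload):
        self.received[index] = payload

    def missing(self):
        return [i for i in range(self.total) if i not in self.received]

    def build_control_message(self):
        """Return a NACK listing missing fragments, or a final ACK when complete."""
        gaps = self.missing()
        if gaps:
            return ("NACK", gaps)          # sender (or a caching node) retransmits these
        return ("ACK", None)               # data acknowledgement after full reception

# Example: 8 fragments sent, fragments 2 and 5 lost on the way.
rx = FragmentReceiver(total_fragments=8)
for i in [0, 1, 3, 4, 6, 7]:
    rx.on_fragment(i, payload=bytes([i]))
print(rx.build_control_message())          # ('NACK', [2, 5])
rx.on_fragment(2, b"\x02")
rx.on_fragment(5, b"\x05")
print(rx.build_control_message())          # ('ACK', None)
```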
Abstract:
Time is one of the scarcest resources in modern parliaments. In parliamentary systems of government, the control of time in the chamber is a significant power resource enjoyed – to varying degrees – by parliamentary majorities and the governments they support. Minorities may not be able to muster enough votes to stop bills, but they may have – varying degrees of – delaying powers enabling them to extract concessions from majorities attempting to get on with their overall legislative programme. This paper provides a comparative analysis of the dynamics of the legislative process in 17 West European parliaments, from the formal initiation of bills to their promulgation. The 'biographies' of a sample of bills are examined using techniques of event-history analysis, (a) charting the dynamics of the legislative process both across the lifetimes of individual bills and across different political systems, and (b) examining whether, and to what extent, parliamentary rules and some general regime attributes influence the dynamics of this process, speeding up or delaying the passage of legislation. Drawing on a veto-points framework and transaction cost politics as theoretical foundations, the quantitative analyses suggest a number of counter-intuitive findings (e.g., the efficiency of powerful committees) and cast doubt on some of the claims made by Tsebelis in his veto-player model.
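For readers unfamiliar with event-history analysis, the following minimal sketch shows the kind of model such bill 'biographies' lend themselves to: a Cox proportional hazards regression of time-to-promulgation on institutional covariates, here using the Python lifelines package. The data frame, variable names, and covariates are invented placeholders, not the paper's dataset or specification.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Minimal sketch of an event-history model of bill passage: the duration is the
# time from formal initiation to promulgation, and covariates stand in for
# parliamentary rules and regime attributes. All data are invented placeholders.
bills = pd.DataFrame({
    "days_to_promulgation": [120, 340, 95, 410, 60, 255, 180, 500],
    "promulgated":          [1,   1,   1,  0,   1,  1,   1,   0],   # 0 = censored
    "powerful_committees":  [1,   0,   1,  0,   1,  1,   0,   0],
    "government_bill":      [1,   1,   0,  0,   1,  0,   1,   0],
})

cph = CoxPHFitter()
cph.fit(bills, duration_col="days_to_promulgation", event_col="promulgated")
cph.print_summary()   # hazard ratios > 1 indicate faster passage, < 1 indicate delay
```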
Abstract:
Bonebridge™ (BB) implantation relies on optimal anchoring of the bone-conduction implant in the temporal bone. Preoperative position planning has to account for the available bone thickness while minimizing unwanted interference with underlying anatomical structures. This study describes the first clinical experience with a planning method based on topographic bone thickness maps (TBTM) for presigmoid BB implantations. The temporal bone was segmented, enabling three-dimensional surface generation. Distances between the external and internal surfaces were color-encoded and mapped to a TBTM. Suitable implant positions were planned with reference to the TBTM. Surgery was performed according to the standard procedure (n = 7). Computation of the TBTM and subsequent implant position planning took 70 min on average for a trained technician. Surgical time for implantations under passive TBTM image guidance was 60 min on average. The sigmoid sinus (n = 5) and dura mater (n = 1) were exposed, as predicted by the TBTM. Feasibility of the TBTM method was shown for standard presigmoid BB implantations. The projection of three-dimensional bone thickness information into a single topographic map provides the surgeon with an intuitive display of the anatomical situation prior to implantation. Nevertheless, TBTM generation time has to be reduced significantly to simplify integration into clinical routine.
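One way to approximate such a topographic bone thickness map is sketched below: for each vertex of the outer bone surface, the distance to the closest point on the inner surface is taken as the local thickness, which can then be color-encoded on the outer surface. This nearest-point variant (using SciPy's cKDTree) is only an illustration; the published method may instead measure thickness along surface normals, and the synthetic shell data are invented.

```python
import numpy as np
from scipy.spatial import cKDTree

# Rough sketch of a topographic bone thickness map: for every vertex of the
# outer (external) bone surface, take the distance to the closest point on the
# inner surface as a thickness estimate. A normal-directed ray cast would be a
# more faithful measure; this nearest-point version only illustrates the idea.

def thickness_map(outer_vertices, inner_vertices):
    """outer_vertices, inner_vertices: (N, 3) and (M, 3) arrays in mm."""
    tree = cKDTree(inner_vertices)
    distances, _ = tree.query(outer_vertices)   # nearest inner point per outer vertex
    return distances                            # one thickness value per outer vertex

# Hypothetical surfaces: a spherical shell roughly 4-7 mm thick.
rng = np.random.default_rng(0)
directions = rng.normal(size=(2000, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
outer = directions * 40.0
inner = directions * (40.0 - rng.uniform(4.0, 7.0, size=(2000, 1)))

t = thickness_map(outer, inner)
print(f"thickness range: {t.min():.1f}-{t.max():.1f} mm")
# The per-vertex values can then be color-encoded and projected onto the outer
# surface to obtain the topographic map used for implant position planning.
```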
Abstract:
Although several detailed models of molecular processes essential for circadian oscillations have been developed, their complexity makes intuitive understanding of the oscillation mechanism difficult. The goal of the present study was to reduce a previously developed, detailed model to a minimal representation of the transcriptional regulation essential for circadian rhythmicity in Drosophila. The reduced model contains only two differential equations, each with time delays. A negative feedback loop is included, in which PER protein represses per transcription by binding the dCLOCK transcription factor. A positive feedback loop is also included, in which dCLOCK indirectly enhances its own formation. The model simulated circadian oscillations, light entrainment, and a phase-response curve with qualitative similarities to experiment. Time delays were found to be essential for simulation of circadian oscillations with this model. To examine the robustness of the simplified model to fluctuations in molecule numbers, a stochastic variant was constructed. Robust circadian oscillations and entrainment to light pulses were simulated with fewer than 80 molecules of each gene product present on average. Circadian oscillations persisted when the positive feedback loop was removed. Moreover, elimination of positive feedback did not decrease the robustness of oscillations to stochastic fluctuations or to variations in parameter values. Such reduced models can aid understanding of the oscillation mechanisms in Drosophila and in other organisms in which feedback regulation of transcription may play an important role.
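To illustrate how a two-variable delay model with negative feedback can be simulated, the sketch below integrates a generic Goodwin-type system with an explicit transcriptional delay, using a fixed-step Euler scheme and a history buffer. The equations and parameter values are placeholders, not the reduced Drosophila model from the study; they merely show the mechanics of handling the time delays.

```python
import numpy as np

# Illustrative two-variable delay model with a negative feedback loop: a
# PER-like protein represses its own transcription after a delay. The equations
# and parameters are generic placeholders, not the published reduced model;
# they show how delayed feedback can be integrated with fixed-step Euler and a
# history buffer.

dt = 0.01                     # hours
t_end = 120.0
tau = 6.0                     # transcriptional feedback delay (hours)
delay_steps = int(tau / dt)
n_steps = int(t_end / dt)

M = np.zeros(n_steps)         # per-like mRNA (arbitrary units)
P = np.zeros(n_steps)         # PER-like protein
M[:delay_steps + 1] = 0.5     # constant history over the first delay interval
P[:delay_steps + 1] = 0.5

v_m, k_i, n_hill = 1.0, 1.0, 4.0   # max transcription rate, repression constant, Hill coefficient
d_m, k_p, d_p = 0.2, 0.5, 0.1      # mRNA degradation, translation, protein degradation rates

for i in range(delay_steps, n_steps - 1):
    P_delayed = P[i - delay_steps]                      # protein level tau hours ago
    dM = v_m / (1.0 + (P_delayed / k_i) ** n_hill) - d_m * M[i]
    dP = k_p * M[i] - d_p * P[i]
    M[i + 1] = M[i] + dt * dM
    P[i + 1] = P[i] + dt * dP

# Peak-to-peak interval of the protein trace estimates the free-running period.
peaks = [i for i in range(1, n_steps - 1) if P[i] > P[i - 1] and P[i] > P[i + 1]]
if len(peaks) > 1:
    print("approximate period (h):", dt * np.mean(np.diff(peaks[-5:])))
```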
Abstract:
The paper argues for a distinction between sensory- and conceptual-information storage in the human information-processing system. Conceptual information is characterized as meaningful and symbolic, while sensory information may exist in modality-bound form. Furthermore, it is assumed that sensory information does not contribute to conscious remembering and can be used only in data-driven process repetitions, which can be accompanied by a kind of vague or intuitive feeling. Accordingly, purely top-down and willingly controlled processing, such as free recall, should not have any access to sensory data. Empirical results from different research areas and from two experiments conducted by the authors are presented in this article to support these theoretical distinctions. The experiments were designed to separate a sensory-motor and a conceptual component in memory for two-digit numbers and two-letter items, when parts of the numbers or items were imaged or drawn on a tablet. The results of free recall and recognition are discussed in a theoretical framework which distinguishes sensory and conceptual information in memory.
Abstract:
In personal and in society-related contexts, people often evaluate the risk of environmental and technological hazards. Previous neuroscience research on risk evaluation has focused particularly on the direct personal risk of presented stimuli, which may have comprised, for instance, aspects of fear. Further, risk evaluation was primarily compared to tasks from other cognitive domains serving as control conditions, thus revealing general risk-related brain activity but not activity specifically associated with estimating a higher level of risk. Here we investigated the neural basis on which laypersons individually evaluated the risk of different potential hazards to society. Twenty healthy subjects underwent functional magnetic resonance imaging while evaluating the risk of fifty more or less risky conditions presented as written terms. Brain activations during the individual estimations of 'high' against 'low' risk, and of negative versus neutral and positive emotional valences, were analyzed. Estimating hazards to be of high risk was associated with activation in the medial thalamus, anterior insula, caudate nucleus, cingulate cortex and further prefrontal and temporo-occipital areas. These areas were not involved according to an analysis of the emotion ratings. In conclusion, we emphasize the contribution of these brain areas to signaling high risk, which here was not primarily associated with the emotional valence of the risk items. These areas have previously been reported to be associated with viscerosensitive and implicit processing, besides emotional processing. This suggests an intuitive contribution, or "gut feeling", not necessarily dependent on subjective emotional valence, when a high risk of environmental hazards is estimated.
Abstract:
PURPOSE This paper describes the development of a forward planning process for modulated electron radiotherapy (MERT). The approach is based on a previously developed electron beam model used to calculate dose distributions of electron beams shaped by a photon multi-leaf collimator (pMLC). METHODS As the electron beam model has already been implemented into the Swiss Monte Carlo Plan environment, the Eclipse treatment planning system (Varian Medical Systems, Palo Alto, CA) can be included in the planning process for MERT. In a first step, CT data are imported into Eclipse and a pMLC-shaped electron beam is set up. This initial electron beam is then divided into segments, with the electron energy in each segment chosen according to the distal depth of the planning target volume (PTV) in the beam direction. In order to improve the homogeneity of the dose distribution in the PTV, a feathering process (Gaussian edge feathering) is launched, which results in a number of feathered segments. For each of these segments a dose calculation is performed employing the in-house developed electron beam model along with the macro Monte Carlo dose calculation algorithm. Finally, an automated weight optimization of all segments is carried out and the total dose distribution is read back into Eclipse for display and evaluation. One academic and two clinical situations are investigated for possible benefits of MERT treatment compared to standard treatments performed in our clinics and to treatment with a bolus electron conformal (BolusECT) method. RESULTS The MERT treatment plan of the academic case was superior to the standard single-segment electron treatment plan in terms of organs at risk (OAR) sparing. Further, a comparison between an unfeathered and a feathered MERT plan showed better PTV coverage and homogeneity for the feathered plan, with V95% increased from 90% to 96% and V107% decreased from 8% to nearly 0%. For a clinical breast boost irradiation, the MERT plan led to a similar homogeneity in the PTV compared to the standard treatment plan, while the mean body dose was lower for the MERT plan. Regarding the second clinical case, a whole breast treatment, MERT resulted in a reduction of the lung volume receiving more than 45% of the prescribed dose when compared to the standard plan. On the other hand, the MERT plan led to a larger low-dose lung volume and a degraded dose homogeneity in the PTV. For the clinical cases evaluated in this work, treatment plans using the BolusECT technique resulted in more homogeneous PTV and CTV coverage but higher doses to the OARs than the MERT plans. CONCLUSIONS MERT treatments were successfully planned for phantom and clinical cases, applying a newly developed intuitive and efficient forward planning strategy that employs an MC-based electron beam model for pMLC-shaped electron beams. It is shown that MERT can lead to a dose reduction in OARs compared to other methods. The process of feathering MERT segments results in an improvement of the dose homogeneity in the PTV.
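The final weight-optimization step lends itself to a compact illustration: given a pre-calculated dose distribution per feathered segment, non-negative segment weights can be found by least squares against the prescribed PTV dose. The sketch below uses SciPy's nnls with random placeholder dose matrices; it is not the optimizer used in the paper.

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative sketch of the automated segment weight optimization: given one
# pre-calculated dose distribution per (feathered) segment, find non-negative
# segment weights so that the summed dose matches the prescription in the PTV.
# All names and numbers are hypothetical placeholders.

rng = np.random.default_rng(0)
n_voxels, n_segments = 500, 6
prescription = 1.0                                  # normalized prescribed PTV dose

# dose_per_segment[v, s]: dose to PTV voxel v from segment s at unit weight
# (in practice these would come from the Monte Carlo calculation of each segment).
dose_per_segment = rng.uniform(0.05, 0.3, size=(n_voxels, n_segments))

target = np.full(n_voxels, prescription)
weights, residual = nnls(dose_per_segment, target)  # non-negative least squares

total_dose = dose_per_segment @ weights
print("segment weights:", np.round(weights, 2))
print("PTV dose mean/min/max:",
      round(total_dose.mean(), 3), round(total_dose.min(), 3), round(total_dose.max(), 3))
```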