941 results for user-defined function (UDF)


Relevance: 100.00%

Abstract:

In [1], the authors proposed a framework for automated clustering and visualization of biological data sets named AUTO-HDS. This letter is intended to complement that framework by showing that it is possible to eliminate a user-defined parameter so that the clustering stage can be implemented more accurately and with reduced computational complexity.

Relevance: 100.00%

Abstract:

Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Traditional analysis and visualization techniques rely primarily on computing streamlines through numerical integration. The inherent numerical errors of such approaches are usually ignored, leading to inconsistencies that cause unreliable visualizations and can ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with maps from the triangle boundaries to themselves. This representation, called edge maps, permits a concise description of flow behaviors and is equivalent to computing all possible streamlines at a user-defined error threshold. Independent of this error, streamlines computed using edge maps are guaranteed to be consistent up to floating-point precision, enabling the stable extraction of features such as the topological skeleton. Furthermore, our representation explicitly stores spatial and temporal errors, which we use to produce more informative visualizations. This work describes the construction of edge maps, the error quantification, and a refinement procedure to adhere to a user-defined error bound. Finally, we introduce new visualizations using the additional information provided by edge maps to indicate the uncertainty involved in computing streamlines and topological structures.
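
A minimal sketch may make the edge-map idea concrete: each triangle stores a map from its parameterized boundary to itself, so tracing a streamline becomes a chain of table lookups rather than numerical integration. The names, the piecewise-constant lookup, and the neighbor hand-off below are illustrative assumptions, not the paper's implementation.

```python
from bisect import bisect_right
from dataclasses import dataclass

@dataclass
class EdgeMap:
    # Entry parameters on the triangle boundary (arc length in [0, 1)),
    # sorted, with the exit parameter each interval maps to.
    entries: list
    exits: list

    def flow_through(self, s):
        """Map an entry point s on the boundary to its exit point."""
        i = bisect_right(self.entries, s) - 1
        return self.exits[i]

def trace(edge_maps, neighbor_of, tri, s, max_steps=1000):
    """Follow a streamline by chaining boundary-to-boundary maps.

    neighbor_of(tri, s) is assumed to return (next_tri, s_on_next) for the
    adjacent triangle sharing the boundary point, or None at the domain
    boundary.
    """
    path = [(tri, s)]
    for _ in range(max_steps):
        s = edge_maps[tri].flow_through(s)
        step = neighbor_of(tri, s)
        if step is None:
            break
        tri, s = step
        path.append((tri, s))
    return path
```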

Relevance: 100.00%

Abstract:

3D video-fluoroscopy is an accurate but cumbersome technique for estimating natural or prosthetic human joint kinematics. This dissertation proposes innovative methodologies to improve the reliability and usability of 3D fluoroscopic analysis. Being based on direct radiographic imaging of the joint, and avoiding the soft tissue artefact that limits the accuracy of skin-marker-based techniques, fluoroscopic analysis has a potential accuracy of the order of mm/deg or better. It can provide fundamental information for clinical and methodological applications, but, notwithstanding the number of methodological protocols proposed in the literature, time-consuming user interaction is required to obtain consistent results. This user-dependency has prevented a reliable quantification of the actual accuracy and precision of the methods and, consequently, slowed down their translation to clinical practice. The objective of the present work was to speed up this process by introducing methodological improvements in the analysis. In the thesis, the fluoroscopic analysis was characterized in depth in order to evaluate its pros and cons and to provide reliable solutions to overcome its limitations. To this aim, an analytical approach was followed. The major sources of error were isolated with preliminary in-silico studies as: (a) geometric distortion and calibration errors, (b) 2D image and 3D model resolutions, (c) incorrect contour extraction, (d) bone model symmetries, (e) optimization algorithm limitations, and (f) user errors. The effect of each criticality was quantified and verified with a preliminary in-vivo study on the elbow joint. The dominant source of error was identified as the limited extent of the convergence domain of the local optimization algorithms, which forced the user to manually specify the starting pose for the estimation process. To solve this problem, two different approaches were followed: to enlarge the convergence basin around the optimal pose, the local approach used sequential alignments of the 6 degrees of freedom in order of sensitivity, or a geometrical feature-based estimation of the initial conditions for the optimization; the global approach used an unsupervised memetic algorithm to optimally explore the search domain. The performance of the technique was evaluated with a series of in-silico studies and validated in-vitro with a phantom-based comparison against a radiostereometric gold standard. The accuracy of the method is joint-dependent; for the intact knee joint, the new unsupervised algorithm guaranteed a maximum error lower than 0.5 mm for in-plane translations, 10 mm for out-of-plane translation, and 3 deg for rotations in a mono-planar setup, and lower than 0.5 mm for translations and 1 deg for rotations in a bi-planar setup. The bi-planar setup is best suited when accurate results are needed, such as for methodological research studies; the mono-planar analysis may suffice for clinical applications where analysis time and cost are an issue. A further reduction of user interaction was obtained for prosthetic joint kinematics: a mixed region-growing and level-set segmentation method was proposed that halved the analysis time, delegating the computational burden to the machine. In-silico and in-vivo studies demonstrated that the reliability of the new semi-automatic method was comparable to a user-defined manual gold standard. The improved fluoroscopic analysis was finally applied to a first in-vivo methodological study of foot kinematics. Preliminary evaluations showed that the presented methodology represents a feasible gold standard for the validation of skin-marker-based foot kinematics protocols.

Relevance: 100.00%

Abstract:

We propose a novel methodology to generate realistic network flow traces to enable systematic evaluation of network monitoring systems in various traffic conditions. Our technique uses a graph-based approach to model the communication structure observed in real-world traces and to extract traffic templates. By combining extracted and user-defined traffic templates, realistic network flow traces that comprise normal traffic and customized conditions are generated in a scalable manner. A proof-of-concept implementation demonstrates the utility and simplicity of our method to produce a variety of evaluation scenarios. We show that the extraction of templates from real-world traffic leads to a manageable number of templates that still enable accurate re-creation of the original communication properties on the network flow level.
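
To illustrate the template-driven generation step, here is a toy sketch under assumed template fields; the paper's graph-based model and template extraction are more elaborate than this.

```python
import random

# A hypothetical traffic template: distribution parameters for one
# communication pattern, e.g. "many clients talking to one HTTPS server".
template = {
    "proto": "tcp",
    "dst_port": 443,
    "clients": 50,                    # number of distinct source hosts
    "flows_per_client": (1, 20),      # uniform range, flows per host
    "bytes_per_flow": (500, 50_000),  # uniform range, bytes per flow
}

def generate_flows(tpl, seed=0):
    """Expand one template into a list of synthetic flow records."""
    rng = random.Random(seed)
    flows = []
    for c in range(tpl["clients"]):
        src = f"10.0.{c // 256}.{c % 256}"
        for _ in range(rng.randint(*tpl["flows_per_client"])):
            flows.append({
                "src": src,
                "dst": "192.0.2.1",
                "proto": tpl["proto"],
                "dst_port": tpl["dst_port"],
                "bytes": rng.randint(*tpl["bytes_per_flow"]),
            })
    return flows

print(len(generate_flows(template)), "synthetic flow records")
```

Mixing records generated from extracted templates with records from user-defined ones is then a matter of concatenating and interleaving such lists, which is what makes the approach scale.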

Relevance: 100.00%

Abstract:

Detecting small amounts of genetic subdivision across geographic space remains a persistent challenge. Often a failure to detect genetic structure is mistaken for evidence of panmixia, when more powerful statistical tests may uncover evidence for subtle geographic differentiation. Such slight subdivision can be demographically and evolutionarily important, as well as critical for management decisions. We introduce here a method, called spatial analysis of shared alleles (SAShA), that detects geographically restricted alleles by comparing the spatial arrangement of allelic co-occurrences with the expectation under panmixia. The approach is allele-based and spatially explicit, eliminating the loss of statistical power that can occur with user-defined populations and statistical averaging within populations. Using simulated data sets generated under a stepping-stone model of gene flow, we show that this method outperforms spatial autocorrelation (SA) and ΦST under common real-world conditions: at relatively high migration rates when diversity is moderate or high, especially when sampling is poor. We then use this method to show clear differences in the genetic patterns of 2 nearshore Pacific mollusks, Tegula funebralis (= Chlorostoma funebralis) and Katharina tunicata, whose overall patterns of within-species differentiation are similar according to traditional population genetics analyses. SAShA meaningfully complements ΦST/FST, SA, and other existing geographic genetic analyses and is especially appropriate for evaluating species with high gene flow and subtle genetic differentiation.
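
The core test can be paraphrased as a permutation procedure: if alleles are geographically restricted, pairs of individuals sharing an allele lie closer together than expected under panmixia. The sketch below is a loose reconstruction under that reading, with invented data structures; it is not the published SAShA code.

```python
import itertools
import random

def mean_shared_allele_distance(genotypes, coords, dist):
    """Average pairwise distance over all allele co-occurrences."""
    total, n = 0.0, 0
    for (i, gi), (j, gj) in itertools.combinations(enumerate(genotypes), 2):
        shared = len(set(gi) & set(gj))   # alleles individuals i and j share
        if shared:
            total += shared * dist(coords[i], coords[j])
            n += shared
    return total / n if n else 0.0

def sasha_pvalue(genotypes, coords, dist, n_perm=999, seed=1):
    """One-tailed permutation p-value against the panmictic null."""
    rng = random.Random(seed)
    observed = mean_shared_allele_distance(genotypes, coords, dist)
    # Under panmixia alleles are unrelated to location: shuffle coordinates.
    null = []
    for _ in range(n_perm):
        shuffled = coords[:]
        rng.shuffle(shuffled)
        null.append(mean_shared_allele_distance(genotypes, shuffled, dist))
    # Geographically restricted alleles give a *smaller* mean distance.
    return (1 + sum(d <= observed for d in null)) / (n_perm + 1)
```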

Relevance: 100.00%

Abstract:

Rationale: Focal onset epileptic seizures are due to abnormal interactions between distributed brain areas. By estimating the cross-correlation matrix of multi-site intra-cerebral EEG recordings (iEEG), one can quantify these interactions. To assess the topology of the underlying functional network, the binary connectivity matrix has to be derived from the cross-correlation matrix by use of a threshold. Classically, a unique threshold is used, which constrains the topology [1]. Our method sets the threshold in a data-driven way by separating genuine from random cross-correlations. We compare our approach to the fixed-threshold method and study the dynamics of the functional topology.

Methods: We investigate the iEEG of patients suffering from focal onset seizures who underwent evaluation for the possibility of surgery. The equal-time cross-correlation matrices are evaluated using a sliding time window. We then compare 3 approaches to deriving the corresponding binary networks. For each time window:

* Our parameter-free method derives from the cross-correlation strength matrix (CCS) [2]. It aims at disentangling genuine from random correlations (due to the finite length and varying frequency content of the signals). In practice, a threshold is evaluated for each pair of channels independently, in a data-driven way.
* The fixed mean degree (FMD) method uses a unique threshold on the whole connectivity matrix so as to ensure a user-defined mean degree.
* The varying mean degree (VMD) method uses the mean degree of the CCS network to set a unique threshold for the entire connectivity matrix.
* Finally, the connectivity (c), connectedness (given by k, the number of disconnected sub-networks), and mean global and local efficiencies (Eg, El, respectively) are computed from FMD, CCS, VMD, and their corresponding random and lattice networks.

Results: Compared to FMD and VMD, CCS networks present:

* topologies that are different in terms of c, k, Eg, and El;
* from the pre-ictal to the ictal and then post-ictal period, topological feature time courses that are more stable within a period and more contrasted from one period to the next.

For CCS, pre-ictal connectivity is low, increases to a high level during the seizure, then decreases at offset. k shows a "U-curve", underlining the synchronization of all electrodes during the seizure. The Eg and El time courses fluctuate between the corresponding random and lattice network values in a reproducible manner.

Conclusions: The definition of a data-driven threshold provides new insights into the topology of epileptic functional networks.
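
As a rough illustration of the pipeline (not the authors' code), the sketch below computes windowed equal-time correlation matrices from surrogate data and binarizes them with an FMD-style global threshold chosen to reach a user-defined mean degree:

```python
import numpy as np

def binary_networks(x, win, step, mean_degree=4):
    """x: channels x samples array; returns one adjacency matrix per window."""
    n_ch, n_s = x.shape
    nets = []
    for start in range(0, n_s - win + 1, step):
        w = x[:, start:start + win]
        c = np.abs(np.corrcoef(w))        # equal-time cross-correlation
        np.fill_diagonal(c, 0.0)
        # FMD-style global threshold: keep the strongest links so the
        # average node degree matches the user-defined mean degree.
        k = int(mean_degree * n_ch / 2)   # undirected edges to keep
        iu = np.triu_indices(n_ch, 1)
        thresh = np.sort(c[iu])[-k]
        nets.append((c >= thresh).astype(int))
    return nets

rng = np.random.default_rng(0)
eeg = rng.standard_normal((16, 5000))     # 16 channels of surrogate data
nets = binary_networks(eeg, win=500, step=250)
print(len(nets), "windowed networks; mean degree:", nets[0].sum() / 16)
```

The CCS approach replaces the single global threshold with a per-channel-pair threshold derived from the expected correlation of finite, band-limited random signals, which is what removes the user-defined parameter.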

Relevance: 100.00%

Abstract:

The platform-independent software package consisting of the oligonucleotide mass assembler (OMA) and the oligonucleotide peak analyzer (OPA) was created to support the analysis of oligonucleotide mass spectra. It calculates all theoretically possible fragments of a given input sequence and annotates them in an experimental spectrum, thus saving a large amount of manual processing time. The software performs analysis of precursor and product ion spectra of oligonucleotides and their analogues comprising user-defined modifications of the backbone, the nucleobases, or the sugar moiety, as well as adducts with metal ions or drugs. The ability to expand the library of building blocks and to implement individual structural variations makes it extremely useful for supporting the analysis of therapeutically active compounds. The functionality of the software tool is demonstrated on the examples of a platinated double-stranded oligonucleotide and a modified RNA sequence. Experiments also reveal the unique dissociation behavior of platinated higher-order DNA structures.
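
A toy version of the fragment-enumeration step is sketched below. The residue masses are approximate DNA monoisotopic values, and the cleavage model is a simplified hydrolytic one; OMA's real ion series (a-d and w-z ions, adducts, user-defined modifications) is far richer.

```python
# Approximate monoisotopic masses (Da) of DNA residues within a chain.
RESIDUE = {"A": 313.0576, "C": 289.0464, "G": 329.0525, "T": 304.0461}
H2O = 18.0106

def theoretical_fragments(seq):
    """Simplified 5'- and 3'-side fragment masses for one DNA strand.

    Assumes hydrolytic backbone cleavage (the two pieces sum to the full
    mass plus one water); real CID ion series differ at the cleavage site.
    """
    frags = {}
    running = 0.0
    for i, base in enumerate(seq[:-1], start=1):
        running += RESIDUE[base]
        frags[f"5'-frag{i}"] = running + H2O
    full = sum(RESIDUE[b] for b in seq) + H2O
    for i in range(1, len(seq)):
        frags[f"3'-frag{i}"] = full + H2O - frags[f"5'-frag{i}"]
    return frags

for name, mass in theoretical_fragments("GATC").items():
    print(f"{name}: {mass:.4f}")
```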

Relevance: 100.00%

Abstract:

An Internet portal accessible at www.gdb.unibe.ch has been set up to automatically generate color-coded similarity maps of the ChEMBL database in relation to up to two sets of active compounds taken from the enhanced Directory of Useful Decoys (eDUD), a random set of molecules, or up to two sets of user-defined reference molecules. These maps visualize the relationships between the selected compounds and ChEMBL in six different high-dimensional chemical spaces, namely MQN (42-D molecular quantum numbers), SMIfp (34-D SMILES fingerprint), APfp (20-D shape fingerprint), Xfp (55-D pharmacophore fingerprint), Sfp (1024-bit substructure fingerprint), and ECfp4 (1024-bit extended connectivity fingerprint). The maps are supplied in the form of Java-based desktop applications called "similarity mapplets" allowing interactive content browsing and linked to a "Multifingerprint Browser for ChEMBL" (also accessible directly at www.gdb.unibe.ch) to perform nearest-neighbor searches. One can obtain six similarity mapplets of ChEMBL relative to random reference compounds, 606 similarity mapplets relative to single eDUD active sets, 30,300 similarity mapplets relative to pairs of eDUD active sets, and any number of similarity mapplets relative to user-defined reference sets to help visualize the structural diversity of compound series in drug optimization projects and their relationship to other known bioactive compounds.
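
The nearest-neighbor searches behind such maps reduce to distance computations in fingerprint space; MQN-type count fingerprints are typically compared with the city-block distance. A toy sketch with made-up vectors:

```python
import numpy as np

# Toy stand-ins for 42-D MQN rows (real MQNs count atoms, bonds, polar
# groups, and topological features of each molecule).
library = np.array([[10, 2, 5, 1],
                    [12, 3, 4, 0],
                    [30, 9, 1, 7]])
query = np.array([11, 2, 5, 1])

# City-block (Manhattan) distance, the metric commonly used with MQN.
cbd = np.abs(library - query).sum(axis=1)
nearest = np.argsort(cbd)
print("nearest neighbors (indices):", nearest, "distances:", cbd[nearest])
```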

Relevance: 100.00%

Abstract:

Architectural decisions can be interpreted as structural and behavioral constraints that must be enforced in order to guarantee overarching qualities in a system. Enforcing those constraints in a fully automated way is often challenging and not well supported by current tools. Current approaches for checking architecture conformance either lack usability or offer poor options for adaptation. To overcome this problem we analyze the current state of practice and propose an approach based on an extensible, declarative, and empirically grounded specification language. This solution aims at reducing the overall cost of setting up and maintaining an architectural conformance monitoring environment by decoupling the conceptual representation of a user-defined rule from its technical specification prescribed by the underlying analysis tools. By using a declarative language, we are able to write tool-agnostic rules that are simple enough to be understood by untrained stakeholders and, at the same time, can be automatically processed by a conformance-checking validator. Besides addressing the issue of cost, we also investigate opportunities for increasing the value of conformance-checking results by assisting the user towards the full alignment of the implementation with respect to its architecture. In particular, we show the benefits of providing actionable results by introducing a technique which automatically selects optimal repair solutions by means of simulation and profit-based quantification. We perform various case studies to show how our approach can be successfully adopted to support truly diverse industrial projects. We also investigate the dynamics involved in choosing and adopting a new automated conformance-checking solution within an industrial context. Our approach reduces the cost of conformance checking by avoiding the need for explicit management of the involved validation tools. The user can define rules using a convenient high-level DSL which automatically adapts to emerging analysis requirements. Increased usability and modular customization ensure lower costs and a shorter feedback loop.
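
The decoupling of a conceptual rule from tool specifics can be pictured as rules-as-data interpreted by a small validator. The rule syntax below is invented for this sketch and is not the paper's DSL:

```python
# A tool-agnostic, declarative rule expressed as plain data.
rule = {
    "name": "ui-must-not-touch-persistence",
    "kind": "forbidden-dependency",
    "from": "com.example.ui",
    "to": "com.example.persistence",
}

# Dependency facts as they might be extracted by some underlying
# analysis tool (hypothetical identifiers).
dependencies = [
    ("com.example.ui.LoginView", "com.example.service.AuthService"),
    ("com.example.ui.ReportView", "com.example.persistence.ReportDao"),
]

def check(rule, deps):
    """Interpret one declarative rule against extracted dependency facts."""
    if rule["kind"] != "forbidden-dependency":
        raise ValueError("unsupported rule kind in this sketch")
    return [
        (src, dst) for src, dst in deps
        if src.startswith(rule["from"]) and dst.startswith(rule["to"])
    ]

for src, dst in check(rule, dependencies):
    print(f"violation of {rule['name']}: {src} -> {dst}")
```

Because the rule is data, swapping the underlying analysis tool only requires a new fact extractor, not a rewrite of the rules themselves.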

Relevance: 100.00%

Abstract:

CIPWFULL is a user-friendly, stand-alone FORTRAN software program that is designed to calculate the comprehensive CIPW normative mineral composition of igneous rocks and strictly adheres to the original formulation of the CIPW protocol. This faithful adherence alleviates inaccuracies in normative mineral calculations by programs commonly used by petrologists. Additionally, several of the most important petrological and mineralogical parameters of igneous rocks are calculated by the program. Along with all the regular major oxide elements, all the significant minor elements whose contents can potentially affect the CIPW normative mineral composition are included. CIPWFULL also calculates oxidation ratios for igneous rock samples that have only one oxidation state of iron reported in the specimen analysis. It also provides an option for normalization of analyses to unity on an anhydrous basis in order to facilitate comparison of norms among rock groups. Other capabilities of the program cater for rare situations, like the presence of cancrinite, or the exclusion from the norm calculation of rare rocks like carbonatite. Several mineralogical, petrological, and discriminatory parameters and indexes are additionally calculated by the CIPWFULL program. The program is very efficient and flexible: it allows a user-defined free-format input of all the chemical species and permits feeding of minor elements as parts per million or oxide percentages. Results of calculations are printed in a formatted ASCII text file and may optionally be cast into a space-delimited text file ready to be imported into general spreadsheet programs. CIPWFULL is DOS-based and is implemented on Windows and mainframe platforms.

Relevance: 100.00%

Abstract:

Calving is a major mechanism of ice discharge from the Antarctic and Greenland ice sheets, and a change in calving front position affects the entire stress regime of marine-terminating glaciers. The representation of calving front dynamics in a 2-D or 3-D ice sheet model remains non-trivial. Here, we present the theoretical and technical framework for a level-set method, an implicit boundary tracking scheme, which we implement into the Ice Sheet System Model (ISSM). This scheme allows us to study the dynamic response of a drainage basin to user-defined calving rates. We apply the method to Jakobshavn Isbræ, a major marine-terminating outlet glacier of the West Greenland Ice Sheet. The model robustly reproduces the high sensitivity of the glacier to calving, and we find that enhanced calving triggers significant acceleration of the ice stream. Upstream acceleration is sustained through a combination of mechanisms; however, both lateral stress and ice influx stabilize the ice stream. This study provides new insights into the ongoing changes occurring at Jakobshavn Isbræ and emphasizes that the incorporation of moving boundaries and dynamic lateral effects, not captured in flow-line models, is key for realistic model projections of sea level rise on centennial timescales.
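
In the usual level-set formulation of a moving calving front (a common convention; the details of ISSM's implementation may differ), the front is the zero level set of a function φ advected with the front velocity:

```latex
\[
\frac{\partial \varphi}{\partial t} + \mathbf{v}_f \cdot \nabla \varphi = 0,
\qquad
\mathbf{v}_f = \mathbf{v} - c\,\mathbf{n},
\]
```

where v is the ice velocity at the front, c the user-defined calving rate, and n the outward unit normal to the front; when c exceeds the ice speed the front retreats, which is how prescribed calving rates drive the simulated response of the drainage basin.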

Relevance: 100.00%

Abstract:

State-of-the-art process-based models have been shown to be applicable to the simulation and prediction of coastal morphodynamics. On annual to decadal temporal scales, however, these models may show limitations in reproducing complex natural morphological evolution patterns, such as the movement of bars and tidal channels, e.g. the observed decadal migration of the Medem Channel in the Elbe Estuary, German Bight. Here a morphodynamic model is shown to simulate the hydrodynamics and sediment budgets of the domain to some extent, but it fails to adequately reproduce the pronounced channel migration, owing to the insufficient implementation of bank erosion processes. In order to allow for long-term simulations of the domain, a nudging method has been introduced to update the model-predicted bathymetries with observations. The model-predicted bathymetry is nudged towards true states in annual time steps. Sensitivity analysis of a user-defined correlation length scale, used to define the background error covariance matrix during the nudging procedure, suggests that the optimal error correlation length is similar to the grid cell size, here 80-90 m. Additionally, spatially heterogeneous correlation lengths produce more realistic channel depths than spatially homogeneous correlation lengths. Consecutive application of the nudging method compensates for the (stand-alone) model prediction errors and corrects the channel migration pattern, with a Brier skill score of 0.78. The proposed nudging method serves as an analytical approach to update model predictions towards a predefined 'true' state for the spatiotemporal interpolation of incomplete morphological data in long-term simulations.
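
The nudging update can be written schematically in an optimal-interpolation-like form, with a Gaussian background error covariance of user-defined correlation length. The 1-D sketch below uses invented toy fields and is only meant to show the role of the correlation length L, not the study's actual code:

```python
import numpy as np

def nudge(model, obs, x, L=85.0, sigma_b=1.0, sigma_o=0.3):
    """One nudging update of the model state toward observations on grid x."""
    # Gaussian background-error covariance: errors correlated over ~L metres.
    d = x[:, None] - x[None, :]
    B = sigma_b**2 * np.exp(-0.5 * (d / L) ** 2)
    R = sigma_o**2 * np.eye(len(x))      # uncorrelated observation error
    K = B @ np.linalg.inv(B + R)         # gain spreading each innovation
    return model + K @ (obs - model)

x = np.arange(0.0, 2000.0, 80.0)         # 80 m grid spacing, as in the study
model = -5.0 + 0.002 * x                 # toy predicted depths
obs = model + np.sin(x / 300.0)          # toy "observed" bathymetry
print("max residual after nudging:", np.abs(obs - nudge(model, obs, x)).max())
```

With L comparable to the grid spacing, each observation mainly corrects its own cell; larger L smears corrections across neighboring cells, which is the trade-off the sensitivity analysis explores.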

Relevance: 100.00%

Abstract:

The objective of this final-year project (PFC) is the design and implementation of an application that works as an oscilloscope, spectrum analyzer, and virtual signal generator, all within the same application. Through a data acquisition card, the user can take samples of real-world signals (analog system) to generate data that can be manipulated by a computer (digital system). The same card can also generate basic signals, such as sine, square, and sawtooth waves, as well as frequency-modulated signals, chirp signals (commonly used in sonar and radar applications and in optical transmission), PRN signals (pseudo-random noise consisting of a deterministic pulse sequence that repeats every period, commonly used in GPS receivers), and the widely known Gaussian and uniform white noise. The application displays the acquired signals in detail and analyzes them in various ways selected by the user, including windowing with the most common window types, frequency response, and the Fourier transform. The configuration is chosen by the user through friendly and attractive displays and panels.
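
On the generation side, a few lines of Python reproduce the flavor of the signals mentioned above (a linear chirp plus Gaussian white noise) together with a windowed-FFT "spectrum analyzer" view; this is a software stand-in, not the project's DAQ-based application.

```python
import numpy as np
from scipy.signal import chirp

fs = 10_000                                   # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
sig = chirp(t, f0=100, t1=1.0, f1=2_000)      # 100 Hz -> 2 kHz linear sweep
noise = np.random.default_rng(0).normal(scale=0.1, size=t.size)
samples = sig + noise                         # what the DAQ card would deliver

# Spectrum-analyzer view: windowed FFT magnitude.
window = np.hanning(t.size)
spectrum = np.abs(np.fft.rfft(samples * window))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print("strongest bin near", freqs[spectrum.argmax()], "Hz")
```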

Relevance: 100.00%

Abstract:

We propose a general framework for assertion-based debugging of constraint logic programs. Assertions are linguistic constructions for expressing properties of programs. We define several assertion schemas for writing (partial) specifications for constraint logic programs using quite general properties, including user-defined programs. The framework is aimed at detecting deviations of the program behavior (symptoms) with respect to the given assertions, either at compile-time (i.e., statically) or at run-time (i.e., dynamically). We provide techniques for using information from global analysis both to detect at compile-time assertions which do not hold in at least one of the possible executions (i.e., static symptoms) and assertions which hold for all possible executions (i.e., statically proved assertions). We also provide program transformations which introduce tests in the program for checking at run-time those assertions whose status cannot be determined at compile-time. Both the static and the dynamic checking are provably safe in the sense that all errors flagged are definite violations of the specifications. Finally, we report briefly on the currently implemented instances of the generic framework.
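
As a loose analogue in Python (the paper targets constraint logic programs and its own assertion language), run-time checking of assertions that static analysis could not discharge can be pictured as wrapping a procedure with executable pre- and postconditions:

```python
from functools import wraps

def check(pre, post):
    """Turn a (partial) specification into a run-time test around f."""
    def deco(f):
        @wraps(f)
        def wrapped(*args):
            assert pre(*args), f"precondition of {f.__name__} violated"
            result = f(*args)
            assert post(*args, result), f"postcondition of {f.__name__} violated"
            return result
        return wrapped
    return deco

@check(pre=lambda xs: all(isinstance(x, int) for x in xs),
       post=lambda xs, r: r == sorted(xs))
def int_sort(xs):
    return sorted(xs)

int_sort([3, 1, 2])    # both checks pass; a violation would be a definite error
```

As in the framework, a flagged failure here is a definite violation of the stated specification, while checks provable at analysis time could be compiled away.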

Relevance: 100.00%

Abstract:

We propose a general framework for assertion-based debugging of constraint logic programs. Assertions are linguistic constructions which allow expressing properties of programs. We define assertion schemas which allow writing (partial) specifications for constraint logic programs using quite general properties, including user-defined programs. The framework is aimed at detecting deviations of the program behavior (symptoms) with respect to the given assertions, either at compile-time or run-time. We provide techniques for using information from global analysis both to detect at compile-time assertions which do not hold in at least one of the possible executions (i.e., static symptoms) and assertions which hold for all possible executions (i.e., statically proved assertions). We also provide program transformations which introduce tests in the program for checking at run-time those assertions whose status cannot be determined at compile-time. Both the static and the dynamic checking are provably safe in the sense that all errors flagged are definite violations of the specifications. Finally, we report on an implemented instance of the assertion language and framework.