962 results for Graph cuts
Abstract:
Motivated by the work of Mateu, Orobitg, Pérez and Verdera, who proved inequalities of the form $T_*f\lesssim M(Tf)$ or $T_*f\lesssim M^2(Tf)$ for certain singular integral operators $T$, such as the Hilbert or the Beurling transforms, we study the possibility of establishing this type of control for the Cauchy transform along a Lipschitz graph. We show that this is not possible in general, and we give a partial positive result when the graph is substituted by a Jordan curve.
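As a reading aid, here are the standard definitions of the operators appearing in these inequalities, the maximal singular integral $T_*$ and the Hardy-Littlewood maximal function $M$ (with $M^2 = M \circ M$); the paper may use equivalent variants, so this is only the usual convention:

$$T_*f(x) = \sup_{\varepsilon>0}\left|\int_{|x-y|>\varepsilon} K(x,y)\,f(y)\,dy\right|, \qquad Mf(x) = \sup_{r>0}\frac{1}{|B(x,r)|}\int_{B(x,r)}|f(y)|\,dy, \qquad M^2f = M(Mf).$$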
Abstract:
Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual nature of each gene interaction is believed to play a key role in the stability of the structure. With advances in biology, effort has been devoted to developing update functions in Boolean models that incorporate recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell-cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. The results of this validation hint at increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualize the phase transition between order and chaos through the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows regimes to be discriminated quantitatively. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful for guiding experimental research. The update function confers additional realism to the model, while reducing the complexity and solution space, thus making it easier to investigate.
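A minimal sketch of a threshold-based Boolean update of the kind described above, in which each regulatory edge contributes a promoting (+1) or repressing (-1) input; the matrix encoding, zero thresholds, and the tie-breaking rule are illustrative assumptions, not the exact function from the paper.

```python
# Threshold-based synchronous Boolean update: each regulator contributes +1
# (promotion) or -1 (repression); a gene turns on when its summed input
# exceeds its threshold. Toy network and tie rule are assumptions.
import numpy as np

def threshold_update(state, weights, thresholds):
    """One synchronous update of a Boolean network.

    state      : 0/1 vector of current gene activities (length n)
    weights    : n x n matrix, weights[i, j] = +1 if gene j promotes gene i,
                 -1 if it represses it, 0 if there is no edge
    thresholds : per-gene activation thresholds
    """
    total_input = weights @ state                    # summed promoting/repressing input
    new_state = (total_input > thresholds).astype(int)
    # Keep the previous value when the input exactly equals the threshold
    # (one common convention; the paper may use a different tie rule).
    tie = total_input == thresholds
    new_state[tie] = state[tie]
    return new_state

# Toy three-gene example
W = np.array([[0, 1, -1],
              [1, 0,  0],
              [0, 1,  0]])
x = np.array([1, 0, 1])
print(threshold_update(x, W, thresholds=np.zeros(3)))
```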
Abstract:
In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference measurements are required. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges in the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We showed how to handle excess zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with the generalized additive modeling (GAM) and empirical logit approaches, and how to select covariates in the calibration model. The performance of the two-part calibration model was compared with that of its one-part counterpart. We used vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study. In EPIC, reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in about a threefold increase in the strength of the association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting the two-part calibration model. Moreover, the extent of adjustment for error is influenced by the number and forms of covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
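A minimal sketch, under assumed variable names, of the two-part idea for an episodically consumed food: one part models the probability of any consumption on the reference instrument, the other models the (log) amount given consumption, and the calibrated intake is their product. This is not the EPIC analysis code; the covariate matrix X would typically hold the self-reported intake and other covariates.

```python
# Two-part calibration sketch: logistic part for the excess zeroes,
# linear part on the log amount among consumers.
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

def fit_two_part(X, recall_intake):
    consumed = (recall_intake > 0).astype(int)
    part1 = LogisticRegression().fit(X, consumed)            # P(intake > 0 | X)
    X_pos = X[consumed == 1]
    y_pos = recall_intake[consumed == 1]
    part2 = LinearRegression().fit(X_pos, np.log(y_pos))     # E[log amount | intake > 0, X]
    return part1, part2

def calibrated_intake(part1, part2, X, sigma2):
    """Predicted usual intake = P(consume) x expected amount given consumption."""
    p = part1.predict_proba(X)[:, 1]
    # Lognormal back-transformation; sigma2 is the residual variance of part 2.
    amount = np.exp(part2.predict(X) + 0.5 * sigma2)
    return p * amount
```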
Abstract:
During the Early Toarcian, major paleoenvironmental and paleoceanographical changes occurred, leading to an oceanic anoxic event (OAE) and to a perturbation of the carbon isotope cycle. Although the standard biochronology of the Lower Jurassic is essentially based upon ammonites, in recent years biostratigraphy based on calcareous nannofossils and dinoflagellate cysts has increasingly been used to date Jurassic rocks. However, the precise dating and correlation of the Early Toarcian OAE, and of the associated δ13C anomaly in different settings of the western Tethys, are still partly problematic, and it is still unclear whether these events are synchronous or not. In order to allow more accurate correlations of the organic-rich levels recorded in the Lower Toarcian OAE, this account proposes a new biozonation based on a quantitative biochronology approach, the Unitary Associations (UA), applied to calcareous nannofossils. This study represents the first attempt to apply the UA method to Jurassic nannofossils. The study incorporates eighteen sections distributed across the western Tethys and ranging from the Pliensbachian to the Aalenian, comprising 1220 samples and 72 calcareous nannofossil taxa. The BioGraph [Savary, J., Guex, J., 1999. Discrete biochronological scales and unitary associations: description of the BioGraph computer program. Memoires de Geologie de Lausanne 34, 282 pp.] and UA-Graph (Copyright Hammer O., Guex and Savary, 2002) software packages provide a discrete biochronological framework based upon multi-taxa concurrent range zones in the different sections. The optimized dataset generates nine UAs using the co-occurrences of 56 taxa. These UAs are grouped into six Unitary Association Zones (UA-Z), which constitute a robust biostratigraphic synthesis of all the observed or deduced biostratigraphic relationships between the analysed taxa. The UA zonation proposed here is compared to "classic" calcareous nannofossil biozonations, which are commonly used for the southern and northern sides of the Tethys. The biostratigraphic resolution of the UA-Zones varies from one nannofossil subzone, or part of it, to several subzones, and can be related to the pattern of calcareous nannoplankton originations and extinctions during the studied time interval. The Late Pliensbachian - Early Toarcian interval (corresponding to UA-Z II) represents a major step in the Jurassic nannoplankton radiation. The recognized UA-Zones are also compared to the negative carbon isotope excursion and TOC maximum in five sections of central Italy, Germany and England, with the aim of providing a more reliable correlation tool for the Early Toarcian OAE, and the associated isotopic anomaly, between the southern and northern parts of the western Tethys. The results of this work show that the TOC maximum and the negative δ13C excursion correspond to the upper part of UA-Z II (i.e., UA 3) in the sections analysed. This suggests that the Early Toarcian OAE was a synchronous event within the western Tethys. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
This research proposes a data model that can be used in various types of applications related to the structure of property ownership (cadastre, land registry, notaries, etc.). The model, defined in the Unified Modeling Language (UML) and implemented on a graph-oriented database management system (Neo4j), makes it possible to store and query that ownership history, which can later be exploited both by desktop applications and by web services.
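A minimal sketch of how such an ownership history might be stored and queried with the official Neo4j Python driver; the node labels, relationship type, properties, and connection details below are hypothetical, since the abstract does not give the UML model.

```python
# Illustrative only: the labels (Parcel, Owner), relationship type (OWNED),
# properties and connection details are assumptions, not the model from the paper.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def record_transfer(tx, parcel_id, owner_name, since):
    # Each ownership period is kept as a dated relationship, so the full
    # history of a parcel can be reconstructed later.
    tx.run(
        "MERGE (p:Parcel {id: $parcel_id}) "
        "MERGE (o:Owner {name: $owner_name}) "
        "CREATE (o)-[:OWNED {since: $since}]->(p)",
        parcel_id=parcel_id, owner_name=owner_name, since=since,
    )

def ownership_history(tx, parcel_id):
    result = tx.run(
        "MATCH (o:Owner)-[r:OWNED]->(p:Parcel {id: $parcel_id}) "
        "RETURN o.name AS owner, r.since AS since ORDER BY since",
        parcel_id=parcel_id,
    )
    return [record.data() for record in result]

with driver.session() as session:
    session.execute_write(record_transfer, "P-001", "Alice", "2010-05-01")
    print(session.execute_read(ownership_history, "P-001"))
driver.close()
```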
Abstract:
We consider electroencephalograms (EEGs) of healthy individuals and compare the properties of the brain functional networks found through two methods: unpartialized and partialized cross-correlations. The networks obtained by partial correlations are fundamentally different from those constructed through unpartialized correlations in terms of graph metrics. In particular, they have completely different connection efficiency, clustering coefficient, assortativity, degree variability, and synchronization properties. Unpartialized correlations are simple to compute and can easily be applied to large-scale systems, yet they cannot avoid predicting indirect edges. In contrast, partial correlations, which are often expensive to compute, reduce the prediction of such edges. We suggest combining these alternative methods in order to obtain complementary information on brain functional networks.
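A minimal illustration (not the paper's pipeline) of the two constructions: ordinary cross-correlations versus partial correlations obtained from the precision matrix, each of which can then be thresholded into an adjacency matrix. Channel count, data, and threshold are arbitrary toy choices.

```python
# Contrast unpartialized and partial correlations between channels.
# Partial correlations come from the inverse covariance (precision) matrix.
import numpy as np

def correlation_matrices(signals):
    """signals: array of shape (n_channels, n_samples)."""
    corr = np.corrcoef(signals)                    # unpartialized correlations
    precision = np.linalg.inv(np.cov(signals))     # precision matrix
    d = np.sqrt(np.diag(precision))
    partial = -precision / np.outer(d, d)          # standard partial-correlation formula
    np.fill_diagonal(partial, 1.0)
    return corr, partial

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 1000))                 # 5 toy channels
corr, partial = correlation_matrices(x)
adjacency = (np.abs(partial) > 0.1).astype(int)    # threshold to build a network
np.fill_diagonal(adjacency, 0)
```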
Abstract:
The scenario considered here is one where brain connectivity is represented as a network and an experimenter wishes to assess the evidence for an experimental effect at each of the typically thousands of connections comprising the network. To do this, a univariate model is independently fitted to each connection. It would be unwise to declare significance based on an uncorrected threshold of α=0.05, since the expected number of false positives for a network comprising N=90 nodes and N(N-1)/2=4005 connections would be 200. Control of Type I errors over all connections is therefore necessary. The network-based statistic (NBS) and spatial pairwise clustering (SPC) are two distinct methods that have been used to control family-wise errors when assessing the evidence for an experimental effect with mass univariate testing. The basic principle of the NBS and SPC is the same as supra-threshold voxel clustering. Unlike voxel clustering, where the definition of a voxel cluster is unambiguous, 'clusters' formed among supra-threshold connections can be defined in different ways. The NBS defines clusters using the graph theoretical concept of connected components. SPC on the other hand uses a more stringent pairwise clustering concept. The purpose of this article is to compare the pros and cons of the NBS and SPC, provide some guidelines on their practical use and demonstrate their utility using a case study involving neuroimaging data.
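A minimal sketch of the component-forming step of the NBS: threshold the edge-wise statistics, then take graph-theoretic connected components among the supra-threshold connections. The permutation procedure that assigns family-wise-error-corrected p-values to component sizes is omitted, and the threshold value is arbitrary.

```python
# Form candidate 'clusters' of connections as connected components of the
# graph whose edges are the supra-threshold connections.
import networkx as nx
import numpy as np

def supra_threshold_components(t_stats, threshold):
    """t_stats: symmetric (N, N) matrix of edge-wise test statistics."""
    n = t_stats.shape[0]
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if t_stats[i, j] > threshold:          # keep only supra-threshold connections
                g.add_edge(i, j)
    # Each connected component (of more than one node) is a candidate cluster.
    return [c for c in nx.connected_components(g) if len(c) > 1]

rng = np.random.default_rng(1)
t = rng.standard_normal((90, 90))
t = (t + t.T) / 2
print(supra_threshold_components(t, threshold=2.5))
```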
Abstract:
Bimodal dispersal probability distributions with characteristic distances differing by several orders of magnitude have been derived and favorably compared to observations by Nathan [Nature (London) 418, 409 (2002)]. For such bimodal kernels, we show that two-dimensional molecular dynamics computer simulations are unable to yield accurate front speeds. Analytically, the usual continuous-space random walks (CSRWs) are applied to two dimensions. We also introduce discrete-space random walks and use them to check the CSRW results (because of the inefficiency of the numerical simulations). The physical results reported are shown to predict front speeds high enough to possibly explain Reid's paradox of rapid tree migration. We also show that, for a time-ordered evolution equation, fronts are always slower in two dimensions than in one dimension and that this difference is important both for unimodal and for bimodal kernels.
Abstract:
A new graph-based construction of generalized low-density codes (GLD-Tanner) with binary BCH constituents is described. The proposed family of GLD codes is optimal on block erasure channels and quasi-optimal on block fading channels. Optimality is considered in the outage probability sense. A classical GLD code for ergodic channels (e.g., the AWGN channel, the i.i.d. Rayleigh fading channel, and the i.i.d. binary erasure channel) is built by connecting bit nodes and subcode nodes via a unique random edge permutation. In the proposed construction of full-diversity GLD codes (referred to as root GLD), bit nodes are divided into 4 classes, subcodes are divided into 2 classes, and finally both sides of the Tanner graph are linked via 4 random edge permutations. The study focuses on non-ergodic channels with two states and can be easily extended to channels with 3 states or more.
Abstract:
This paper presents our investigation of the iterative decoding performance of some sparse-graph codes on block-fading Rayleigh channels. The considered code ensembles are standard LDPC codes and Root-LDPC codes, first proposed in and shown to be able to attain full transmission diversity. We study the iterative threshold performance of those codes as a function of the fading gains of the transmission channel and propose a numerical approximation of the iterative threshold versus fading gains, for both LDPC and Root-LDPC codes. Also, we show analytically that, in the case of 2 fading blocks, the iterative threshold of Root-LDPC codes is proportional to $(\alpha_1 \alpha_2)^{-1}$, where $\alpha_1$ and $\alpha_2$ are the corresponding fading gains. From this result, the full diversity property of Root-LDPC codes immediately follows.
Abstract:
This article reports on the results of research done towards the fully automatic merging of lexical resources. Our main goal is to show the generality of the proposed approach, which has previously been applied to merge Spanish Subcategorization Frames lexica. In this work we extend and apply the same technique to perform the merging of morphosyntactic lexica encoded in LMF. The experiments showed that the technique is general enough to obtain good results in these two different tasks, which is an important step towards performing the merging of lexical resources fully automatically.
Abstract:
INTRODUCTION: Lumbar spinal stenosis (LSS) treatment is based primarily on clinical criteria, provided that imaging confirms radiological stenosis. The radiological measurement most commonly used is the dural sac cross-sectional area (DSCA). It has recently been shown that grading stenosis based on the morphology of the dural sac, as seen on axial T2 MRI images, better reflects the severity of stenosis than DSCA and is of prognostic value. This prospective radiological study investigates the variability of surface measurements and morphological grading of stenosis for varying degrees of angulation of the T2 axial images relative to the disc space, as observed in clinical practice. MATERIALS AND METHODS: Lumbar spine TSE T2 three-dimensional (3D) MRI sequences were obtained from 32 consecutive patients presenting with either suspected spinal stenosis or low back pain. Axial reconstructions using the OsiriX software at 0°, 10°, 20° and 30° relative to the disc space orientation were obtained for a total of 97 levels. For each level, the DSCA was digitally measured and stenosis was graded according to the 4-point (A-D) morphological grading by two observers. RESULTS: Good interobserver agreement was found in grade evaluation of stenosis (k = 0.71). DSCA varied significantly as the slice orientation increased from 0° to +10°, +20° and +30° at each level examined (P < 0.0001) (-15 to +32% at 10°, -24 to +143% at 20° and -29 to +231% at 30° of slice orientation). The stenosis definition based on surface measurements changed in 39 of the 97 levels studied, whereas the morphology grade was modified in only two levels (P < 0.01). DISCUSSION: The need to obtain continuous slices using the classical 2D MRI acquisition technique often entails at least a 10° slice inclination relative to one of the studied discs. Even at this low angulation, we found a statistically significant difference between changes in surface measurements and changes in morphological grade. In clinical practice, given the above findings, it might therefore not be necessary to align the axial cuts to each individual disc level, which could be more time-consuming than obtaining a single series of axial cuts perpendicular to the middle of the lumbar spine or to the most stenotic level. In conclusion, morphological grading seems to offer an alternative means of assessing the severity of spinal stenosis that is little affected by the image acquisition technique.
Abstract:
The paper presents a competence-based instructional design system and a way to personalize navigation through the course content. The navigation aid tool builds on the competence graph and the student model, which includes elements of uncertainty in the assessment of students. An individualized navigation graph is constructed for each student, suggesting the competences the student is best prepared to study. We use fuzzy set theory for dealing with uncertainty. The marks of the assessment tests are transformed into linguistic terms and used for assigning values to linguistic variables. For each competence, the level of difficulty and the level of knowledge of its prerequisites are calculated based on the assessment marks. Using these linguistic variables and approximate reasoning (fuzzy IF-THEN rules), a crisp category is assigned to each competence regarding its level of recommendation.
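A minimal sketch of the approximate-reasoning step described above: assessment marks are fuzzified into linguistic terms, and toy IF-THEN rules yield a crisp recommendation category. The membership functions, linguistic terms, and rules are invented for illustration and are not the system's actual rule base.

```python
# Fuzzify marks into linguistic terms, fire simple IF-THEN rules,
# and defuzzify to a crisp recommendation category.
def triangular(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify_mark(mark):
    """Map a 0-100 assessment mark to degrees of 'low', 'medium', 'high'."""
    return {
        "low": triangular(mark, -1, 0, 50),
        "medium": triangular(mark, 25, 50, 75),
        "high": triangular(mark, 50, 100, 101),
    }

def recommendation(prereq_mark, difficulty_mark):
    prereq = fuzzify_mark(prereq_mark)
    difficulty = fuzzify_mark(difficulty_mark)
    # Two toy rules, with min for AND and max-style comparison for aggregation:
    # IF prerequisites are high AND difficulty is low THEN recommend.
    strong = min(prereq["high"], difficulty["low"])
    # IF prerequisites are low THEN do not recommend.
    weak = prereq["low"]
    # Crisp category from the strongest firing rule.
    return "recommended" if strong >= weak else "not recommended"

print(recommendation(prereq_mark=85, difficulty_mark=30))
```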
Abstract:
Pavement settlement occurring in and around utility cuts is a common problem, resulting in uneven pavement surfaces, annoyance to drivers, and, ultimately, further maintenance. A survey of municipal authorities and field and laboratory investigations were conducted to identify the factors contributing to the settlement of utility cut restorations in pavement sections. Survey responses were received from seven cities across Iowa and indicate that utility cut restorations often last less than two years. Observations made during site inspections showed that backfill material varies from one city to another, backfill lift thickness often exceeds 12 inches, and the backfill material is often placed at bulking moisture contents with no quality control/quality assurance (QC/QA). Laboratory investigation of the backfill materials indicates that, at the field moisture contents encountered, the backfill materials have collapse potentials of up to 35%. Falling Weight Deflectometer (FWD) deflection data and elevation shots indicate that the maximum deflection in the pavement occurs in the area around the utility cut restoration. The FWD data indicate a zone of influence around the perimeter of the restoration extending two to three feet beyond the trench perimeter. The research team proposes moisture control, compacting granular backfill to 65% relative density, and removing and recompacting the native material near the ground surface around the trench. Test sections with geogrid reinforcement were also incorporated. The performance of the inspected and proposed utility cuts needs to be monitored for at least two more years.
Abstract:
Invokes further budget cuts as determined by the Governor.