966 results for Weighted graphs
Abstract:
Graph-structured databases are widely prevalent, and the problem of effective search and retrieval from such graphs has been receiving much attention recently. For example, the Web can be naturally viewed as a graph. Likewise, a relational database can be viewed as a graph where tuples are modeled as vertices connected via foreign-key relationships. Keyword search querying has emerged as one of the most effective paradigms for information discovery, especially over HTML documents on the World Wide Web. One of the key advantages of keyword search querying is its simplicity: users do not have to learn a complex query language and can issue queries without any prior knowledge about the structure of the underlying data. The purpose of this dissertation was to develop techniques for user-friendly, high-quality, and efficient searching of graph-structured databases. Several ranked search methods on data graphs have been studied in recent years. Given a top-k keyword search query on a graph and some ranking criteria, a keyword proximity search finds the top-k answers, where each answer is a substructure of the graph containing all query keywords and illustrating the relationships among the keywords present in the graph. We applied keyword proximity search to the Web and to the page graph of web documents to find top-k answers that satisfy the user's information need and increase user satisfaction. Another effective ranking mechanism applied to data graphs is authority-flow based ranking. Given a top-k keyword search query on a graph, an authority-flow based search finds the top-k answers, where each answer is a node in the graph ranked according to its relevance and importance to the query. We developed techniques that improve authority-flow based search on data graphs by creating a framework to explain and reformulate such searches, taking into consideration user preferences and feedback. We also applied the proposed graph search techniques to information discovery over biological databases. Our algorithms were experimentally evaluated for performance and quality, and the quality of our methods was compared to current approaches through user surveys.
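As a rough illustration of the authority-flow idea (not the dissertation's actual algorithm), the sketch below runs a personalized-PageRank-style power iteration on a toy data graph, seeding authority at nodes that contain the query keywords; the graph, node names, and damping factor are illustrative assumptions.

```python
# Minimal authority-flow ranking sketch: authority flows from keyword nodes
# along graph edges; higher scores mean higher relevance to the query.
# The graph, keyword nodes, and damping factor d are illustrative assumptions.

def authority_flow(graph, keyword_nodes, d=0.85, iters=50):
    """graph: dict node -> list of neighbour nodes (outgoing edges)."""
    nodes = list(graph)
    base = {v: (1.0 / len(keyword_nodes) if v in keyword_nodes else 0.0) for v in nodes}
    score = dict(base)
    for _ in range(iters):
        new = {}
        for v in nodes:
            # Authority received from nodes that link to v.
            inflow = sum(score[u] / len(graph[u]) for u in nodes if v in graph[u])
            new[v] = (1 - d) * base[v] + d * inflow
        score = new
    return sorted(score.items(), key=lambda kv: -kv[1])

# Toy data graph: documents linked by hyperlinks or foreign-key edges.
g = {"p1": ["p2", "p3"], "p2": ["p3"], "p3": ["p1"], "p4": ["p3"]}
print(authority_flow(g, keyword_nodes={"p1", "p4"}))
```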
Abstract:
Being at-risk is a growing problem in the U.S. because of disturbing societal trends such as unemployment, divorce, substance abuse, child abuse and neglect, and the new threat of terrorist violence. Resilience characterizes individuals who rebound from or adapt to adversities such as these, and academic resilience distinguishes at-risk students who succeed in school despite hardships. The purpose of this research was to perform a meta-analysis to examine the power of resilience and to suggest ways educators might improve academic resilience, which was operationalized by satisfactory test scores and grades. In order to find all studies relevant to academic resilience in at-risk kindergarten through 12th-grade students, extensive electronic and hardcopy searches were conducted, resulting in a database of 421 articles. Two hundred eighty-seven of these were rejected quickly because they were not empirical research. Upon further examination, another 106 were rejected for not meeting study protocol criteria. Ultimately, 28 studies were coded for study-level descriptors and effect size variables. Protective factors for resilience were found to originate in physical, psychological, and behavioral domains on proximal/intraindividual, transitional/intrafamilial, or distal/extrafamilial levels. Effect sizes (ESs) for these were weighted, and the means for each level or category were interpreted by commonly accepted benchmarks. Mean effect sizes for the proximal (M = .27) and transitional (M = .15) levels were small but significant. The mean effect size for the distal level was not significant. This supported the hypotheses that the proximal level was the source of most protective factors for academic resilience in at-risk students, followed by the transitional level. The distal effect size warranted further research, particularly in light of the small number of studies (n = 11) contributing effect sizes to that category. A homogeneity test indicated that a search for moderators, i.e., study variables affecting outcomes, was justified. "Category" was the largest moderator. Graphs of weighted mean effect sizes in the physical, psychological, and behavioral domains were plotted for each level to better illustrate the findings of the meta-analysis. Suggestions were made for combining resilience development with aspects of positive psychology to promote resilience in the schools.
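For readers unfamiliar with how effect sizes are weighted in a fixed-effect meta-analysis, the following sketch computes an inverse-variance weighted mean effect size and Cochran's Q homogeneity statistic; the numbers are invented and are not the study's data.

```python
# Fixed-effect meta-analysis sketch: inverse-variance weighting of effect sizes.
# The effect sizes (d) and variances (v) below are illustrative, not the study's data.
effect_sizes = [0.31, 0.22, 0.18, 0.40]   # per-study standardized mean differences
variances    = [0.02, 0.05, 0.03, 0.04]   # per-study sampling variances

weights = [1.0 / v for v in variances]    # w_i = 1 / v_i
mean_es = sum(w * d for w, d in zip(weights, effect_sizes)) / sum(weights)

# Cochran's Q: large values suggest heterogeneity and justify a search for moderators.
q_stat = sum(w * (d - mean_es) ** 2 for w, d in zip(weights, effect_sizes))

print(f"weighted mean ES = {mean_es:.3f}, Q = {q_stat:.2f}")
```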
Abstract:
Diffusion Weighted Imaging (DWI) is based on the study of the diffusive motion of water molecules in biological tissues and can provide information on tissue structure and on the presence of pathological alterations. The most recent development of DWI is Diffusion Tensor Imaging (DTI), a technique that makes it possible to determine not only the magnitude but also the principal directions of diffusion. In recent years, thanks to advances in magnetic resonance techniques, diffusion imaging has also been applied to other anatomical districts, including the kidney, to exploit its diagnostic potential. However, few studies have so far addressed the application of diffusion methods to the assessment of autosomal dominant polycystic kidney disease (ADPKD). ADPKD is one of the most common hereditary diseases and is the leading genetic cause of renal failure in adults. Its main feature is the formation of cysts in both kidneys, which progressively increase in number and size until they cause loss of renal function in about half of the patients. To date, no therapies are available that can stop or slow the progression of ADPKD; it is only possible to manage complications so that they do not worsen the course of the disease. This thesis arises from the aim of investigating whether diffusion imaging can provide useful information on the state of the disease and its degree of progression. The analysis focuses on the calculation of the apparent diffusion coefficient (ADC), derived from DWI images and evaluated in the medullary region. The objective of this work is to verify whether this ADC value can characterize polycystic kidney disease and can be used in the clinical-diagnostic setting as a prognostic indicator of the progression of this pathology.
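As context for the ADC computation mentioned above, the standard monoexponential model (a textbook relation, not this thesis's specific pipeline) gives ADC = ln(S0/Sb)/b from two diffusion weightings; the b-value and signal intensities below are assumed for illustration.

```python
import numpy as np

# Standard two-point ADC estimate from the monoexponential model
#   S(b) = S0 * exp(-b * ADC)  =>  ADC = ln(S0 / Sb) / b
# The b-value and signal arrays below are illustrative, not patient data.
b = 600.0                                # s/mm^2 (assumed diffusion weighting)
S0 = np.array([820.0, 790.0, 640.0])     # signal at b = 0 for three voxels
Sb = np.array([410.0, 450.0, 500.0])     # signal at b = 600 s/mm^2

adc = np.log(S0 / Sb) / b                # mm^2/s per voxel
print(adc)  # renal tissue values are typically on the order of 1e-3 mm^2/s
```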
Abstract:
The Baltic Sea is the largest brackish water area in the world. On the basis of data from 16 cruises, we show the seasonal and vertical distribution patterns of the appendicularians Fritillaria borealis and Oikopleura dioica and the cyclopoid copepod Oithona similis in the highly stratified Bornholm Basin. These species live at least temporarily below the permanent halocline and use different life strategies to cope with the brackish environment. The cold-water species F. borealis is abundant in the upper layers of the water column before the thermocline develops. With the formation of the thermocline, abundance decreases and the specimens outlast the higher temperatures below the halocline. Distribution and strategy suggest that F. borealis might be a glacial relict species in the Baltic Sea. Whereas Oikopleura dioica is only abundant during summer, O. similis is present all year round. Both species have in common that their vertical distribution is restricted to the waters below the halocline, most likely due to their requirements for higher salinities. We argue that the observed strategies are determined by ecophysiological constraints and life history traits. These species share an omnivorous feeding behaviour and the capability to utilise a spectrum of small particles as food. As phytoplankton concentration is negligible below the halocline, we suggest that these species feed on organic material and heterotrophic organisms that accumulate in the density gradient of the halocline. The deep haline waters of the Baltic Sea therefore represent a habitat that provides shelter from predation and a food supply for adapted species, allowing them to gather sufficient resources and to maintain populations.
Abstract:
We completely determine the spectra of composition operators induced by linear fractional self-maps of the unit disc acting on weighted Dirichlet spaces, extending earlier results by Higdon [8] and answering the open questions in this context.
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
We consider linearly weighted versions of the least core and the (pre)nucleolus and investigate the reduction possibilities in their computation. We slightly extend some well-known related results and establish their counterparts by using the dual game. Our main results imply, for example, that if the core of the game is not empty, all dually inessential coalitions (those that can be weakly minorized by a partition in the dual game) can be ignored when we compute the per-capita least core and the per-capita (pre)nucleolus from the dual game. This could lead to the design of polynomial-time algorithms for the per-capita (and other monotone nondecreasingly weighted versions of the) least core and the (pre)nucleolus in specific classes of balanced games with polynomially many dually essential coalitions.
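To make the per-capita weighting concrete, here is a small sketch (not from the paper) of the per-capita least core as a linear program in which each coalition's excess is divided by its size; the 3-player characteristic function v is an illustrative toy example, and scipy's generic LP solver stands in for any specialized algorithm.

```python
from itertools import combinations
import numpy as np
from scipy.optimize import linprog

# Per-capita least core LP sketch: minimize eps subject to
#   x(S) + |S| * eps >= v(S) for all proper nonempty coalitions S, and x(N) = v(N).
# The toy game below is an illustrative example, not taken from the paper.
players = [0, 1, 2]
v = {(0,): 0, (1,): 0, (2,): 0, (0, 1): 4, (0, 2): 4, (1, 2): 4, (0, 1, 2): 6}

n = len(players)
coalitions = [S for r in range(1, n) for S in combinations(players, r)]

c = np.zeros(n + 1)
c[-1] = 1.0                                   # objective: minimize eps
A_ub, b_ub = [], []
for S in coalitions:                          # -x(S) - |S| * eps <= -v(S)
    row = np.zeros(n + 1)
    row[list(S)] = -1.0
    row[-1] = -len(S)                         # per-capita weight w(S) = |S|
    A_ub.append(row)
    b_ub.append(-v[S])
A_eq = np.ones((1, n + 1))
A_eq[0, -1] = 0.0                             # efficiency: x(N) = v(N)
b_eq = [v[tuple(players)]]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] * (n + 1))
print("payoffs:", res.x[:n], "per-capita least core eps:", res.x[-1])
```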
Abstract:
Purpose: This study has two goals. The first is to investigate the feasibility of using classic textural feature extraction for radiotherapy response assessment in a unique cohort of early-stage breast cancer patients who received single-dose preoperative radiotherapy. The second is to investigate the clinical feasibility of using classic texture features as potential biomarkers, supplementary to the regional apparent diffusion coefficient, for radiotherapy response assessment in gynecological cancer.
Methods and Materials: For the breast cancer study, 15 patients with early-stage breast cancer were enrolled in this retrospective study. Each patient received a single-fraction radiation treatment, and DWI and DCE-MRI scans were conducted before and after the radiotherapy. DWI scans were acquired using a spin-echo EPI sequence with diffusion weighting factors of b = 0 and b = 500 s/mm2, and the apparent diffusion coefficient (ADC) maps were calculated. DCE-MRI scans were acquired using a T1-weighted 3D SPGR sequence with a temporal resolution of about 1 minute. The contrast agent (CA) was intravenously injected at a dose of 0.1 mmol/kg body weight at 2 ml/s. Two parameters, the volume transfer constant (Ktrans) and kep, were analyzed using the two-compartment Tofts pharmacokinetic model. For the pharmacokinetic parametric maps and ADC maps, 33 textural features were generated from the clinical target volume (CTV) in a 3D fashion using the classic gray level co-occurrence matrix (GLCOM) and gray level run length matrix (GLRLM). The Wilcoxon signed-rank test was used to determine the significance of each texture feature's change after the radiotherapy. The significance level was set to 0.05 with Bonferroni correction.
For the gynecological cancer study, 12 female patients with gynecologic cancer treated with fractionated external beam radiotherapy (EBRT) combined with high dose rate (HDR) intracavitary brachytherapy were studied. Each patient first received EBRT treatment followed by five fractions of HDR treatment. Before EBRT and before each fraction of brachytherapy, diffusion weighted MRI (DWI-MRI) and CT scans were acquired. DWI scans were acquired in the sagittal plane using a spin-echo echo-planar imaging sequence with weighting factors of b = 500 s/mm2 and b = 1000 s/mm2; one set of images at b = 0 s/mm2 was also acquired. ADC maps were calculated using a linear least-squares fitting method. Distributed diffusion coefficient (DDC) maps and the stretching parameter α were also calculated. For the ADC and DDC maps, 33 classic texture features were generated using the classic gray level run length matrix (GLRLM) and gray level co-occurrence matrix (GLCOM) from the high-risk clinical target volume (HR-CTV). The Wilcoxon signed-rank test was applied to determine the significance of each feature's numerical change after radiotherapy. The significance level was set to 0.05, with multiple-comparison correction where applicable.
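As a generic illustration of the linear least-squares ADC fit mentioned above (not the study's actual code), one can regress ln S(b) = ln S0 - b*ADC over the acquired b-values; the b-values match those described in the text, while the signal values are invented.

```python
import numpy as np

# Linear least-squares ADC fit sketch: ln S(b) = ln S0 - b * ADC.
# The b-values mirror those in the text; the signals are invented, not patient data.
b_values = np.array([0.0, 500.0, 1000.0])       # s/mm^2
signals  = np.array([900.0, 520.0, 310.0])      # signal in one voxel (illustrative)

slope, intercept = np.polyfit(b_values, np.log(signals), 1)
adc = -slope                                    # mm^2/s
print(f"ADC = {adc:.2e} mm^2/s, fitted S0 = {np.exp(intercept):.1f}")
```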
Results: For the breast cancer study, regarding ADC maps calculated from DWI-MRI, 24 out of 33 CTV features changed significantly after the radiotherapy. For the DCE-MRI pharmacokinetic parameters, all 33 CTV features of Ktrans and all 33 features of kep changed significantly.
For the gynecological cancer study, regarding ADC maps, 28 out of 33 HR-CTV texture features showed significant changes after the EBRT treatment, and 28 out of 33 showed significant changes after the HDR treatments; the features that changed significantly after HDR were the same as those after EBRT. Likewise, 28 out of 33 HR-CTV texture features showed significant changes over the whole radiotherapy treatment process, and these were the same features as those that changed after the HDR treatments.
Conclusion: Initial results indicate that certain classic texture features are sensitive to radiation-induced changes. Classic texture features with significant numerical changes can be used to monitor radiotherapy effects. This suggests that certain texture features might serve as biomarkers, supplementary to ADC and DDC, for the assessment of radiotherapy response in breast cancer and gynecological cancer.
Abstract:
Extensive investigation has been conducted on network data, especially weighted networks in the form of symmetric matrices with discrete count entries. Motivated by statistical inference on multi-view weighted network structure, this paper proposes a Poisson-Gamma latent factor model that not only separates view-shared and view-specific spaces but also achieves reduced dimensionality. A multiplicative gamma process shrinkage prior is employed to avoid overparameterization, and efficient full-conditional conjugate posteriors for Gibbs sampling are obtained. By accommodating view-shared and view-specific parameters, the model adapts flexibly to the extent of similarity across view-specific spaces. Accuracy and efficiency are tested in simulation experiments. An application to real soccer network data is also presented to illustrate the model.
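The following generative sketch shows the general shape of a Poisson latent factor model for a count-valued weighted network; it is a simplified illustration with assumed dimensions and hyperparameters, and it omits the paper's multi-view structure and multiplicative gamma process shrinkage prior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simplified generative sketch of a Poisson latent factor model for a symmetric,
# count-weighted network. Dimensions and hyperparameters are assumptions; the
# view-shared/view-specific decomposition of the paper is not reproduced here.
n_nodes, n_factors = 20, 3
lam = rng.gamma(shape=2.0, scale=0.5, size=n_factors)           # factor weights
U = rng.gamma(shape=1.0, scale=0.3, size=(n_nodes, n_factors))  # node loadings

rate = U @ np.diag(lam) @ U.T                                   # Poisson rates
A = rng.poisson(rate)                                           # sample edge counts
A = np.triu(A, 1)
A = A + A.T                                                     # symmetric, zero diagonal
print(A[:5, :5])
```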
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
Centrality is one of the fundamental notions in graph theory and has established close connections with various other areas such as social networks, flow networks, and facility location problems. Even though a plethora of centrality measures have been introduced over time, according to changing demands, the term is not well defined, and we can only give some common qualities that a centrality measure is expected to have. Nodes with high centrality scores are often more likely to be very powerful, indispensable, influential, easy propagators of information, significant in maintaining the cohesion of the group, and easily susceptible to anything that disseminates in the network.
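As a concrete example of two common centrality scores (degree and closeness, which are standard measures rather than anything specific to this thesis), the short sketch below computes them on a toy undirected graph given as adjacency lists.

```python
from collections import deque

# Degree and closeness centrality on a toy undirected graph (adjacency lists).
graph = {"a": ["b", "c"], "b": ["a", "c", "d"], "c": ["a", "b"],
         "d": ["b", "e"], "e": ["d"]}

def closeness(g, s):
    """Closeness = (n - 1) / sum of shortest-path distances from s (via BFS)."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for w in g[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return (len(g) - 1) / sum(dist[v] for v in dist if v != s)

for node in graph:
    print(node, "degree:", len(graph[node]), "closeness:", round(closeness(graph, node), 3))
```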
Abstract:
This work presents a computational code, called MOMENTS, developed to be used in process control to determine a characteristic transfer function of industrial units when radiotracer techniques are applied to study the unit's performance. The methodology is based on measuring the residence time distribution (RTD) function and calculating the first and second temporal moments of the tracer data obtained by two NaI scintillation detectors positioned to register the complete tracer movement inside the unit. A nonlinear regression technique is used to fit various mathematical models, and a statistical test is used to select the best result for the transfer function. Using the MOMENTS code, twelve different models can be used to fit a curve and calculate technical parameters of the unit.
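To illustrate the moment calculation described above (a generic RTD moment computation, not the MOMENTS code itself), the first temporal moment of the normalized tracer curve gives the mean residence time and the second central moment its variance; the detector curve below is synthetic.

```python
import numpy as np

# Generic RTD moment sketch: mean residence time (first temporal moment) and
# variance (second central moment) of a tracer response curve. Curve is synthetic.
t = np.linspace(0.0, 60.0, 601)                 # time, s
c = t * np.exp(-t / 8.0)                        # synthetic detector response
dt = t[1] - t[0]

E = c / (c.sum() * dt)                          # normalized RTD, E(t)
mean_rt = (t * E).sum() * dt                    # first moment: mean residence time
var_rt = ((t - mean_rt) ** 2 * E).sum() * dt    # second central moment
print(f"mean residence time = {mean_rt:.2f} s, variance = {var_rt:.2f} s^2")
```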
Abstract:
The aim of this thesis is to estimate the performance of the ALICE detector in the reconstruction of the Lambda_c baryon in PbPb collisions using an innovative approach to particle identification. The main idea of the new approach is to replace the usual particle selection, based on cuts applied to the detector signals, with a selection that uses probabilities derived from Bayes' theorem (hence it is called "Bayesian weighting"). To establish which method is the most efficient, a comparison with other standard approaches used in ALICE is presented. For this purpose, a "fast" Monte Carlo simulation was implemented, configured with the particle abundances expected in the new LHC energy regime and with the observed performance of the detector. A realistic estimate of Lambda_c production was then derived by combining known results from previous experiments, and this was used to estimate the significance attainable with the statistics of LHC RUN2 and RUN3. The physics programme of ALICE is described, including the Standard Model, quantum chromodynamics, and the quark-gluon plasma. Some recent experimental results (RHIC and LHC) are then analyzed. The operation of ALICE and its components is described, and finally the results obtained are analyzed. These show that the method has a higher efficiency than the usual approaches in ALICE and that, consequently, to quantify the performance of the new method even better, a "full" simulation should be performed in order to verify the results obtained in a fully realistic scenario.
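As a schematic of the Bayesian weighting idea described above (a generic illustration, not the ALICE implementation), particle-species probabilities combine prior abundances with per-detector likelihoods via Bayes' theorem; all numbers below are invented.

```python
# Schematic Bayesian particle-identification weighting: combine assumed prior
# abundances with likelihoods of the observed detector signals via Bayes' theorem.
# All numbers are illustrative and do not come from ALICE data.
priors = {"pion": 0.80, "kaon": 0.12, "proton": 0.08}

# Likelihood of the observed signals under each species hypothesis
# (e.g. from dE/dx and time-of-flight responses; values invented for the example).
likelihoods = {"pion": 0.05, "kaon": 0.60, "proton": 0.30}

unnormalized = {s: priors[s] * likelihoods[s] for s in priors}
total = sum(unnormalized.values())
posteriors = {s: p / total for s, p in unnormalized.items()}

print(posteriors)  # these posterior weights replace hard selection cuts
```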
Abstract:
Hebb proposed that synapses between neurons that fire synchronously are strengthened, forming cell assemblies and phase sequences. The former, on a shorter scale, are ensembles of synchronized cells that function transiently as a closed processing system; the latter, on a larger scale, correspond to the sequential activation of cell assemblies able to represent percepts and behaviors. Nowadays, the recording of large neuronal populations allows for the detection of multiple cell assemblies. Within Hebb's theory, the next logical step is the analysis of phase sequences. Here we detected phase sequences as consecutive assembly activation patterns, and then analyzed their graph attributes in relation to behavior. We investigated action potentials recorded from the adult rat hippocampus and neocortex before, during and after novel object exploration (experimental periods). Within assembly graphs, each assembly corresponded to a node, and each edge corresponded to the temporal sequence of consecutive node activations. The sum of all assembly activations was proportional to firing rates, but the activity of individual assemblies was not. Assembly repertoire was stable across experimental periods, suggesting that novel experience does not create new assemblies in the adult rat. Assembly graph attributes, on the other hand, varied significantly across behavioral states and experimental periods, and were separable enough to correctly classify experimental periods (Naïve Bayes classifier; maximum AUROCs ranging from 0.55 to 0.99) and behavioral states (waking, slow wave sleep, and rapid eye movement sleep; maximum AUROCs ranging from 0.64 to 0.98). Our findings agree with Hebb's view that assemblies correspond to primitive building blocks of representation, nearly unchanged in the adult, while phase sequences are labile across behavioral states and change after novel experience. The results are compatible with a role for phase sequences in behavior and cognition.
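The sketch below gives a toy version of the analysis pipeline just described: a sequence of assembly activations is turned into a directed graph (assemblies as nodes, consecutive activations as edges), a few simple graph attributes are extracted, and a Naive Bayes classifier separates periods. The sequences, attributes, and labels are invented and do not reproduce the study's actual features or data.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Toy pipeline: assembly-activation sequence -> graph attributes -> Naive Bayes.
# Sequences, attributes, and labels are invented for illustration only.
def graph_attributes(activations):
    """Nodes = assemblies, edges = consecutive activations; return simple attributes."""
    edges = set(zip(activations[:-1], activations[1:]))
    nodes = set(activations)
    density = len(edges) / (len(nodes) * (len(nodes) - 1)) if len(nodes) > 1 else 0.0
    return [len(nodes), len(edges), density]

sequences = [[0, 1, 2, 1, 0, 2], [3, 3, 1, 3, 1],
             [0, 2, 4, 2, 0, 4, 2, 1], [1, 1, 1, 2]]
labels = ["exploration", "rest", "exploration", "rest"]   # invented period labels

X = np.array([graph_attributes(s) for s in sequences])
clf = GaussianNB().fit(X, labels)
print(clf.predict([graph_attributes([0, 2, 1, 2, 0])]))
```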
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08