898 results for Exponential Random Graph Model
Abstract:
BACKGROUND AND PURPOSE: To assess whether the combined analysis of all phase III trials of non-vitamin-K-antagonist (non-VKA) oral anticoagulants in patients with atrial fibrillation and previous stroke or transient ischemic attack shows a significant difference in efficacy or safety compared with warfarin. METHODS: We searched PubMed until May 31, 2012, for randomized clinical trials using the following search terms: atrial fibrillation, anticoagulation, warfarin, and previous stroke or transient ischemic attack. Studies had to be phase III trials in atrial fibrillation patients comparing warfarin with a non-VKA currently on the market or with the intention to be brought to the market in North America or Europe. Analysis was performed on an intention-to-treat basis. A fixed-effects model was used as more appropriate than a random-effects model when combining a small number of studies. RESULTS: Among 47 potentially eligible articles, 3 were included in the meta-analysis. In 14,527 patients, non-VKAs were associated with a significant reduction of stroke/systemic embolism (odds ratio, 0.85 [95% CI, 0.74-0.99]; relative risk reduction, 14%; absolute risk reduction, 0.7%; number needed to treat, 134 over 1.8-2.0 years) compared with warfarin. Non-VKAs were also associated with a significant reduction of major bleeding compared with warfarin (odds ratio, 0.86 [95% CI, 0.75-0.99]; relative risk reduction, 13%; absolute risk reduction, 0.8%; number needed to treat, 125), mainly driven by the significant reduction of hemorrhagic stroke (odds ratio, 0.44 [95% CI, 0.32-0.62]; relative risk reduction, 57.9%; absolute risk reduction, 0.7%; number needed to treat, 139). 
CONCLUSIONS: In the context of the significant limitations of combining the results of disparate trials of different agents, non-VKAs seem to be associated with a significant reduction in rates of stroke or systemic embolism, hemorrhagic stroke, and major bleeding when compared with warfarin in patients with previous stroke or transient ischemic attack.
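The pooled odds ratios above come from inverse-variance fixed-effect weighting of per-trial log odds ratios, and each number needed to treat (NNT) is the reciprocal of the corresponding absolute risk reduction. A minimal sketch, using hypothetical per-trial log-ORs and standard errors (these illustrative inputs are not the actual trial data; the abstract's NNT of 134 reflects an unrounded risk difference, so 1/0.007 gives roughly 143 instead):

```python
import math

def fixed_effect_pool(log_ors, ses):
    """Inverse-variance fixed-effect pooling of log odds ratios."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, se_pooled

# Hypothetical per-trial log-ORs and standard errors (illustrative only)
log_ors = [math.log(0.88), math.log(0.79), math.log(0.89)]
ses = [0.10, 0.12, 0.11]

pooled, se = fixed_effect_pool(log_ors, ses)
or_pooled = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))

# NNT is the reciprocal of the absolute risk reduction (0.7% in the abstract)
arr = 0.007
nnt = 1.0 / arr
```

The same weighting scheme, with weights 1/SE², underlies any fixed-effect meta-analysis; a random-effects model would additionally inflate each SE by a between-study variance term.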
Abstract:
This article analyses stability and volatility of party preferences using data from the Swiss Household Panel (SHP), which, for the first time, allow studying transitions and stability of voters over several years in Switzerland. Analyses cover the years 1999-2007 and systematically distinguish changes between party blocks from changes within party blocks. The first part looks at different patterns of change, which show relatively high volatility. The second part tests several theories on the causes of such changes by applying a multinomial random-effects model. Results show that party preferences stabilise with their duration and with age, and that the electoral cycle, political sophistication, socio-structural predispositions, the household context, as well as party size and the number of parties, each explain part of electoral volatility. Different results for within- and between-party-block changes underline the importance of this differentiation.
Abstract:
Analyzing the relationship between the baseline value and subsequent change of a continuous variable is a frequent matter of inquiry in cohort studies. These analyses are surprisingly complex, particularly if only two waves of data are available. It is unclear for non-biostatisticians where the complexity of this analysis lies and which statistical method is adequate. With the help of simulated longitudinal data of body mass index in children, we review statistical methods for the analysis of the association between the baseline value and subsequent change, assuming linear growth with time. Key issues in such analyses are mathematical coupling, measurement error, variability of change between individuals, and regression to the mean. Ideally, it is better to rely on multiple repeated measurements at different times, and a linear random-effects model is a standard approach if more than two waves of data are available. If only two waves of data are available, our simulations show that Blomqvist's method, which consists in adjusting the estimated regression coefficient of observed change on baseline value for measurement error variance, provides accurate estimates. The adequacy of the methods to assess the relationship between the baseline value and subsequent change depends on the number of data waves, the availability of information on measurement error, and the variability of change between individuals.
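A correction of the kind the abstract attributes to Blomqvist can be sketched in a simulation. The assumption (mine, not taken from the abstract): observed values are true values plus measurement error of known variance, so the slope of observed change on observed baseline is biased, and a corrected slope can be recovered from the observed baseline variance and the known error variance. The true slope of -0.3 and all distribution parameters below are arbitrary illustration values:

```python
import random
import statistics

random.seed(42)

n = 20000
beta_true = -0.3   # true slope of change on true baseline (illustrative)
var_e = 0.5        # assumed-known measurement-error variance per occasion

# True baseline BMI-like values and true change depending on baseline
true_base = [random.gauss(25, 2) for _ in range(n)]
true_change = [beta_true * (t - 25) + random.gauss(0, 0.5) for t in true_base]

# Observed values: truth plus independent measurement error at each wave
obs_base = [t + random.gauss(0, var_e ** 0.5) for t in true_base]
obs_change = [(t + c + random.gauss(0, var_e ** 0.5)) - ob
              for t, c, ob in zip(true_base, true_change, obs_base)]

def ols_slope(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Naive slope is biased downwards by mathematical coupling with the error
b_obs = ols_slope(obs_base, obs_change)
var_x = statistics.variance(obs_base)

# Error-variance correction (derived from Cov(change, baseline) = beta*var_T - var_e)
b_corr = (b_obs * var_x + var_e) / (var_x - var_e)
```

With these parameters the naive slope concentrates near -0.38 while the corrected slope recovers approximately -0.3, illustrating why adjusting for the error variance matters when only two waves exist.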
Abstract:
In a distributed key distribution scheme, a set of servers helps a set of users in a group to securely obtain a common key. Security means that an adversary who corrupts some servers and some users has no information about the key of a non-corrupted group. In this work, we formalize the security analysis of one such scheme, which was not considered in the original proposal. We prove the scheme is secure in the random oracle model, assuming that the Decisional Diffie-Hellman (DDH) problem is hard to solve. We also detail possible modifications of that scheme and of the original one that allow us to prove the security of the schemes without assuming that a specific hash function behaves as a random oracle. As usual, this improvement in the security of the schemes comes at the cost of an efficiency loss.
Abstract:
PURPOSE: To determine and compare the diagnostic performance of magnetic resonance imaging (MRI) and computed tomography (CT) for the diagnosis of tumor extent in advanced retinoblastoma, using histopathologic analysis as the reference standard. DESIGN: Systematic review and meta-analysis. PARTICIPANTS: Patients with advanced retinoblastoma who underwent MRI, CT, or both for the detection of tumor extent from published diagnostic accuracy studies. METHODS: Medline and Embase were searched for literature published through April 2013 assessing the diagnostic performance of MRI, CT, or both in detecting intraorbital and extraorbital tumor extension of retinoblastoma. Diagnostic accuracy data were extracted from included studies. Summary estimates were based on a random effects model. Intrastudy and interstudy heterogeneity were analyzed. MAIN OUTCOME MEASURES: Sensitivity and specificity of MRI and CT in detecting tumor extent. RESULTS: Data of the following tumor-extent parameters were extracted: anterior eye segment involvement and ciliary body, optic nerve, choroidal, and (extra)scleral invasion. Articles on MRI reported results of 591 eyes from 14 studies, and articles on CT yielded 257 eyes from 4 studies. The summary estimates with their 95% confidence intervals (CIs) of the diagnostic accuracy of conventional MRI at detecting postlaminar optic nerve, choroidal, and scleral invasion showed sensitivities of 59% (95% CI, 37%-78%), 74% (95% CI, 52%-88%), and 88% (95% CI, 20%-100%), respectively, and specificities of 94% (95% CI, 84%-98%), 72% (95% CI, 31%-94%), and 99% (95% CI, 86%-100%), respectively. Magnetic resonance imaging with a high (versus a low) image quality showed higher diagnostic accuracies for detection of prelaminar optic nerve and choroidal invasion, but these differences were not statistically significant. Studies reporting the diagnostic accuracy of CT did not provide enough data to perform any meta-analyses. 
CONCLUSIONS: Magnetic resonance imaging is an important diagnostic tool for the detection of local tumor extent in advanced retinoblastoma, although its diagnostic accuracy shows room for improvement, especially with regard to sensitivity. With only a few, mostly old, studies, there is very little evidence on the diagnostic accuracy of CT, and generally these studies show low diagnostic accuracy. Future studies assessing the role of MRI in clinical decision making in terms of prognostic value for advanced retinoblastoma are needed.
Abstract:
Abstract The object of game theory lies in the analysis of situations where different social actors have conflicting requirements and where their individual decisions will all influence the global outcome. In this framework, several games have been invented to capture the essence of various dilemmas encountered in many common important socio-economic situations. Even though these games often succeed in helping us understand human or animal behavior in interactive settings, some experiments have shown that people tend to cooperate with each other in situations for which classical game theory strongly recommends them to do the exact opposite. Several mechanisms have been invoked to try to explain the emergence of this unexpected cooperative attitude. Among them, repeated interaction, reputation, and belonging to a recognizable group have often been mentioned. However, the work of Nowak and May (1992) showed that the simple fact of arranging the players according to a spatial structure and only allowing them to interact with their immediate neighbors is sufficient to sustain a certain amount of cooperation even when the game is played anonymously and without repetition. Nowak and May's study and much of the following work was based on regular structures such as two-dimensional grids. Axelrod et al. (2002) showed that by randomizing the choice of neighbors, i.e. by actually giving up a strictly local geographical structure, cooperation can still emerge, provided that the interaction patterns remain stable in time. This is a first step towards a social network structure. However, following pioneering work by sociologists in the sixties such as that of Milgram (1967), in the last few years it has become apparent that many social and biological interaction networks, and even some technological networks, have particular, and partly unexpected, properties that set them apart from regular or random graphs. 
Among other things, they usually display broad degree distributions and show a small-world topological structure. Roughly speaking, a small-world graph is a network where any individual is relatively close, in terms of social ties, to any other individual, a property also found in random graphs but not in regular lattices. However, in contrast with random graphs, small-world networks also have a certain amount of local structure, as measured, for instance, by a quantity called the clustering coefficient. In the same vein, many real conflicting situations in economics and sociology are well described neither by a fixed geographical position of the individuals in a regular lattice nor by a random graph. Furthermore, it is a known fact that network structure can strongly influence dynamical phenomena such as the way diseases spread across a population and ideas or information get transmitted. Therefore, in the last decade, research attention has naturally shifted from random and regular graphs towards better models of social interaction structures. The primary goal of this work is to discover whether or not the underlying graph structure of real social networks could explain why one finds higher levels of cooperation in populations of human beings or animals than what is prescribed by classical game theory. To meet this objective, I start by thoroughly studying a real scientific coauthorship network and showing how it differs from biological or technological networks using diverse statistical measures. Furthermore, I extract and describe its community structure, taking into account the intensity of a collaboration. Finally, I investigate the temporal evolution of the network, from its inception to its state at the time of the study in 2006, suggesting also an effective view of it as opposed to a historical one. 
Thereafter, I combine evolutionary game theory with several network models along with the studied coauthorship network in order to highlight which specific network properties foster cooperation and to shed some light on the various mechanisms responsible for the maintenance of this same cooperation. I point out the fact that, to resist defection, cooperators take advantage, whenever possible, of the degree heterogeneity of social networks and their underlying community structure. Finally, I show that cooperation level and stability depend not only on the game played, but also on the evolutionary dynamic rules used and the individual payoff calculations.
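The clustering coefficient mentioned above, which separates small-world networks from random graphs of the same density, is straightforward to compute. A minimal sketch on a toy undirected graph (the graph and its adjacency-set representation are illustrative, not data from the thesis):

```python
def clustering_coefficient(adj, v):
    """Local clustering of node v: the fraction of pairs of v's neighbours
    that are themselves connected. adj maps node -> set of neighbours."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    # Count each neighbour pair once (a < b) and check for an edge
    links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
    return 2.0 * links / (k * (k - 1))

# Toy graph: a triangle (0, 1, 2) plus a pendant node 3 attached to 0
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}

c0 = clustering_coefficient(adj, 0)   # neighbours {1,2,3}: 1 of 3 pairs linked
avg = sum(clustering_coefficient(adj, v) for v in adj) / len(adj)
```

A high average clustering combined with short typical path lengths is the small-world signature the abstract describes; pure random graphs keep the short paths but lose the clustering.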
Abstract:
To perform a meta-analysis of FDG-PET performance in the diagnosis of large-vessel vasculitis (Giant Cell Arteritis (GCA) associated or not with Polymyalgia Rheumatica (PMR), Takayasu). Materials and methods: MEDLINE, the Cochrane Library, and Embase were searched for relevant original articles describing FDG-PET for vasculitis assessment, using MeSH terms ("Giant Cell Arteritis or Vasculitis" AND "PET"). Criteria for inclusion were: (1) FDG-PET for diagnosis of vasculitis; (2) American College of Rheumatology criteria as reference standard; (3) control group. After data extraction, analyses were performed using a random-effects model. Results: Of 184 citations (database search and references screening), 70 articles were reviewed, of which 12 eligible studies were extracted (sensitivity range from 32% to 97%). 7 studies fulfilled all inclusion criteria. Owing to an overlapping population, 1 study was excluded. Statistical heterogeneity justified the random-effects model. Pooled analysis of 6 studies (116 vasculitis, 224 controls) showed a sensitivity of 81% (95% CI: 70-89%); a specificity of 89% (95% CI: 77-95%); a PPV of 85% (95% CI: 63-95%); an NPV of 90% (95% CI: 79-95%); a positive LR of 7.1 (95% CI: 3.4-14.9); a negative LR of 0.2 (95% CI: 0.14-0.35); and a DOR of 90.1 (95% CI: 18.6-437). Conclusion: FDG-PET has good diagnostic performance in the detection of large-vessel vasculitis. Its promising role could be extended to the follow-up of patients under treatment, but further studies are needed to confirm this possibility.
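The derived metrics in this abstract (PPV, NPV, likelihood ratios, DOR) all follow from sensitivity, specificity, and the group sizes via simple 2x2-table arithmetic. A sketch using the pooled point estimates above; note that this naive arithmetic will not exactly reproduce the abstract's random-effects pooled values (the DOR in particular), since those were pooled across studies rather than computed from a single collapsed table:

```python
def diagnostic_metrics(sens, spec, n_disease, n_control):
    """Derive predictive values, likelihood ratios, and the diagnostic
    odds ratio from sensitivity/specificity and group sizes (2x2 table)."""
    tp = sens * n_disease
    fn = n_disease - tp
    tn = spec * n_control
    fp = n_control - tn
    return {
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "LR+": sens / (1 - spec),
        "LR-": (1 - sens) / spec,
        "DOR": (tp * tn) / (fp * fn),
    }

# Pooled point estimates from the abstract: 116 vasculitis, 224 controls
m = diagnostic_metrics(0.81, 0.89, 116, 224)
```

For instance, LR+ = 0.81 / (1 - 0.89) is about 7.4 here, close to the pooled 7.1 reported above.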
Abstract:
BACKGROUND: Resection of lung metastases (LM) from colorectal cancer (CRC) is increasingly performed with a curative intent. It is currently not possible to identify those CRC patients who may benefit the most from this surgical strategy. The aim of this study was to perform a systematic review of risk factors for survival after lung metastasectomy for CRC. METHODS: We performed a meta-analysis of series published between 2000 and 2011, which focused on surgical management of LM from CRC and included more than 40 patients each. Pooled hazard ratios (HR) were calculated by using random effects model for parameters considered as potential prognostic factors. RESULTS: Twenty-five studies including a total of 2925 patients were considered in this analysis. Four parameters were associated with poor survival: (1) a short disease-free interval between primary tumor resection and development of LM (HR 1.59, 95 % confidence interval [CI] 1.27-1.98); (2) multiple LM (HR 2.04, 95 % CI 1.72-2.41); (3) positive hilar and/or mediastinal lymph nodes (HR 1.65, 95 % CI 1.35-2.02); and (4) elevated prethoracotomy carcinoembryonic antigen (HR 1.91, 95 % CI 1.57-2.32). By comparison, a history of resected liver metastases (HR 1.22, 95 % CI 0.91-1.64) did not achieve statistical significance. CONCLUSIONS: Clinical variables associated with prolonged survival after surgery for LM in CRC patients include prolonged disease-free interval between primary tumor and metastatic spread, normal prethoracotomy carcinoembryonic antigen, absence of thoracic node involvement, and a single pulmonary lesion.
Abstract:
Aim. To predict the fate of alpine interactions involving specialized species, using a monophagous beetle and its host-plant as a case study. Location. The Alps. Methods. We investigated genetic structuring of the herbivorous beetle Oreina gloriosa and its specific host-plant Peucedanum ostruthium. We used genome fingerprinting (in the insect and the plant) and sequence data (in the insect) to compare the distribution of the main gene pools in the two associated species and to estimate divergence time in the insect, a proxy for the temporal origin of the interaction. We quantified the similarity in spatial genetic structures by performing a Procrustes analysis, a tool from shape theory. Finally, we simulated recolonization of an empty space, analogous to the deglaciated Alps just after ice retreat, by two lineages from two species showing unbalanced dependence, to examine how the timing of the recolonization process, as well as the dispersal capacities of the associated species, could explain the observed pattern. Results. Contrasting with expectations based on their asymmetrical dependence, patterns in the beetle and plant were congruent at a large scale. Exceptions occurred at a regional scale in areas of admixture, matching known suture zones in Alpine plants. Simulations using a lattice-based model suggested these empirical patterns arose during or soon after recolonization, long after the estimated origin of the interaction c. 0.5 million years ago. Main conclusions. Species-specific interactions are scarce in alpine habitats because glacial cycles have limited opportunities for coevolution. Their fate, however, remains uncertain under climate change. Here we show that whereas most dispersal routes are paralleled at large scale, regional incongruence implies that the destinies of the species might differ under changing climate. This may be a consequence of the host-dependence of the beetle that locally limits the establishment of dispersing insects.
Abstract:
We present a continuous time random walk model for the scale-invariant transport found in a self-organized critical rice pile [K. Christensen et al., Phys. Rev. Lett. 77, 107 (1996)]. From our analytical results it is shown that the dynamics of the experiment can be explained in terms of Lévy flights for the grains and a long-tailed distribution of trapping times. Scaling relations for the exponents of these distributions are obtained. The predicted microscopic behavior is confirmed by means of a cellular automaton model.
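The two ingredients the abstract names (heavy-tailed trapping times and Lévy flights for the grains) can be illustrated with a minimal continuous-time random walk simulation. This is a generic CTRW sketch with arbitrary Pareto exponents, not the paper's analytical model or its cellular automaton:

```python
import random

random.seed(0)

def ctrw_trajectory(t_max, alpha_wait=0.7, alpha_jump=1.5):
    """One continuous-time random walk realization: heavy-tailed trapping
    times (Pareto with exponent alpha_wait) followed by symmetric jumps
    whose lengths are Pareto-distributed (exponent alpha_jump)."""
    t, x = 0.0, 0.0
    while t < t_max:
        t += random.paretovariate(alpha_wait)    # trapping (waiting) time
        step = random.paretovariate(alpha_jump)  # heavy-tailed jump length
        x += step if random.random() < 0.5 else -step
    return x

# Final positions of an ensemble of independent walkers
positions = [ctrw_trajectory(1000.0) for _ in range(200)]
```

With alpha_wait below 1 the mean trapping time diverges, so most walkers take only a handful of steps even over long times; this waiting-time/jump-length competition is what produces the anomalous, scale-invariant transport the abstract refers to.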
Abstract:
All derivations of the one-dimensional telegrapher's equation based on the persistent random walk model assume a constant speed of signal propagation. We generalize the model here to allow for a variable propagation speed and study several limiting cases in detail. We also show the connections of this model with anomalous diffusion behavior and with inertial dichotomous processes.
Abstract:
Aims: Recently, several clinical trials analyzed whether extending treatment with pegylated interferon-alfa and ribavirin beyond 48 weeks can improve sustained virologic response (SVR) rates in HCV genotype 1-infected patients with slow virologic response. Because the results of these clinical trials are conflicting, we performed a meta-analysis to determine the overall impact of extended treatment compared to standard treatment on virologic response rates in treatment-naive HCV genotype 1 slow responders. Methods: The literature search was performed independently by two observers using PubMed, EMBASE, CENTRAL and abstracts presented in English at international liver and gastroenterology meetings. Randomized controlled clinical trials (RCTs; studies that retrospectively re-analyzed RCT data were also allowed) were considered if they included monoinfected treatment-naive HCV genotype 1 patients and compared treatment with pegIFN-alfa 2a or 2b in combination with ribavirin for 48 weeks versus extended treatment (up to 72 weeks) in slow responders. Primary and secondary end points were SVR rates and end-of-treatment (EOT) and relapse rates, respectively. In the present meta-analysis, study endpoints were summarized with a DerSimonian-Laird estimate for binary outcomes based on a random-effects model. Results: The literature search yielded seven RCTs addressing the benefit of extended treatment with pegylated interferon-alfa and ribavirin in treatment-naive HCV genotype 1 slow responders. In total, 1330 slow responders were included in our meta-analysis. We show that extended treatment duration compared to the standard of care significantly improves SVR rates in HCV genotype 1 slow responders (12.4% improvement of overall SVR rate, 95% CI 0.055-0.193, P = 0.0005). 
In addition, we show that rates of viral relapse were significantly reduced by extended treatment (24.1% reduction of relapse, 95% CI −0.3332 to −0.1487, P < 0.0001), whereas no significant impact of extended treatment on EOT response rates was found. Though extended treatment was burdened with an increased rate of premature treatment discontinuation due to interferon-alfa- and ribavirin-related side effects, the frequency of serious adverse events was not increased. Conclusions: Treatment extension in HCV genotype 1 slow responders can improve SVR rates in difficult-to-treat patients and should be considered in patients who need to be treated before specific antivirals are approved.
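The DerSimonian-Laird estimate named in the methods is the standard moment-based random-effects pooling: a fixed-effect estimate is used to compute Cochran's Q, a between-study variance tau² is derived from it, and the studies are re-weighted by 1/(SE² + tau²). A minimal sketch with seven hypothetical per-trial SVR-rate differences (illustrative numbers, not the trials' data):

```python
import math

def dersimonian_laird(effects, ses):
    """DerSimonian-Laird random-effects pooling of per-study effect
    estimates (e.g. risk differences) with their standard errors."""
    w = [1.0 / s ** 2 for s in ses]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q measures excess between-study variability
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    # Re-weight each study with the between-study variance added in
    w_re = [1.0 / (s ** 2 + tau2) for s in ses]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se

# Hypothetical per-trial differences in SVR rate (extended minus standard)
effects = [0.25, 0.02, 0.20, 0.05, 0.12, 0.18, 0.03]
ses = [0.06, 0.05, 0.07, 0.06, 0.05, 0.08, 0.06]

pooled, se = dersimonian_laird(effects, ses)
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
```

When the studies are homogeneous (Q ≤ df), tau² collapses to zero and the estimate reduces to the fixed-effect result; with heterogeneity, as here, the confidence interval widens accordingly.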
Abstract:
The progression of liver fibrosis in chronic hepatitis C has long been considered to be independent of viral genotype. However, recent studies suggest an association between Hepatitis C virus (HCV) genotype 3 and accelerated liver disease progression. We completed a systematic review and meta-analysis of studies evaluating the association between HCV genotypes and fibrosis progression. PubMed, Embase and ISI Web of Knowledge databases were searched for cohort, cross-sectional and case-control studies on treatment-naïve HCV-infected adults in which liver fibrosis progression rate (FPR) was assessed by the ratio of fibrosis stage in one single biopsy to the duration of infection (single-biopsy studies) or from the change in fibrosis stage between two biopsies (paired-biopsy studies). A random-effects model was used to derive FPR among different HCV genotypes. Eight single-biopsy studies (3182 patients, mean/median duration of infection ranging from 9 to 21 years) and eight paired-biopsy studies (mean interval between biopsies 2-12 years) met the selection criteria. The odds ratio for the association of genotype 3 with accelerated fibrosis progression was 1.52 (95% CI 1.12-2.07, P = 0.007) in single-biopsy studies and 1.37 (95% CI 0.87-2.17, P = 0.17) in paired-biopsy studies. In conclusion, viral genotype 3 was associated with faster fibrosis progression in single-biopsy studies. This observation may have important consequences for the clinical management of genotype 3-infected patients. The association was not significant in paired-biopsy studies, although the latter may be limited by important indication bias, short observation time and small sample size.
Abstract:
We consider the effects of quantum fluctuations in mean-field quantum spin-glass models with pairwise interactions. We examine the nature of the quantum glass transition at zero temperature in a transverse field. In models (such as the random orthogonal model) where the classical phase transition is discontinuous an analysis using the static approximation reveals that the transition becomes continuous at zero temperature.
Abstract:
BACKGROUND: Results from cohort studies evaluating the severity of respiratory viral co-infections are conflicting. We conducted a systematic review and meta-analysis to assess the clinical severity of viral co-infections as compared to single viral respiratory infections. METHODS: We searched electronic databases and other sources for studies published up to January 28, 2013. We included observational studies on inpatients with respiratory illnesses comparing the clinical severity of viral co-infections to single viral infections as detected by molecular assays. The primary outcome reflecting clinical disease severity was length of hospital stay (LOS). A random-effects model was used to conduct the meta-analyses. RESULTS: Twenty-one studies involving 4,280 patients were included. The overall quality of evidence applying the GRADE approach ranged from moderate for oxygen requirements to low for all other outcomes. No significant differences in length of hospital stay (LOS) (mean difference (MD) -0.20 days, 95% CI -0.94, 0.53, p = 0.59), or mortality (RR 2.44, 95% CI 0.86, 6.91, p = 0.09) were documented in subjects with viral co-infections compared to those with a single viral infection. There was no evidence for differences in effects across age subgroups in post hoc analyses with the exception of the higher mortality in preschool children (RR 9.82, 95% CI 3.09, 31.20, p<0.001) with viral co-infection as compared to other age groups (I2 for subgroup analysis 64%, p = 0.04). CONCLUSIONS: No differences in clinical disease severity between viral co-infections and single respiratory infections were documented. The suggested increased risk of mortality observed amongst children with viral co-infections requires further investigation.