911 results for Minimal sets
Abstract:
A method for extracting effective atomic orbitals and effective minimal basis sets from the molecular wave function, characterizing the state of an atom in a molecule, is developed in the framework of "fuzzy" atoms. In all cases studied, there were as many effective orbitals with considerable occupation numbers as there are orbitals in the classical minimal basis, which is considered to be of high conceptual importance.
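A minimal numerical sketch of the counting step described above, under stated assumptions: it is not the authors' "fuzzy"-atoms scheme but a simpler stand-in that diagonalizes the Lowdin-orthogonalized density-matrix block of one atom's basis functions and keeps the eigenvalues (occupation numbers) above a threshold. D, S and atom_basis_idx are hypothetical inputs from a preceding SCF calculation.

# Illustrative sketch (not the fuzzy-atoms algorithm of the abstract): count the
# "effective" atomic orbitals with non-negligible occupation for one atom from a
# Lowdin-orthogonalized block of the molecular density matrix.
import numpy as np
from scipy.linalg import fractional_matrix_power

def effective_orbitals(D, S, atom_basis_idx, tol=0.01):
    """D: AO density matrix, S: AO overlap matrix, atom_basis_idx: indices of the
    basis functions centered on the atom. Returns occupation numbers above `tol`
    (largest first) and the corresponding orbital coefficient vectors."""
    S_half = fractional_matrix_power(S, 0.5)
    P = S_half @ D @ S_half                     # Lowdin-population matrix
    idx = np.asarray(atom_basis_idx)
    P_A = P[np.ix_(idx, idx)]                   # block for the chosen atom
    occ, vecs = np.linalg.eigh(P_A)
    order = np.argsort(occ)[::-1]               # sort occupations descending
    occ, vecs = occ[order], vecs[:, order]
    keep = occ > tol
    return occ[keep], vecs[:, keep]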
Abstract:
Following the approach developed by Luttens (2010), we consider a model where individuals with different levels of skill exert different levels of effort. Specifically, we propose a redistribution mechanism based on a lower bound on what every individual deserves: the so-called minimal rights (O'Neill (1982)). Our refinement of Luttens' mechanism ensures at the same time minimal-rights-based solidarity, participation (non-negativity) and claims feasibility. Keywords: Redistribution mechanism, Minimal rights, Solidarity, Participation, Claims feasibility. JEL classification: C71, D63, D71.
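The lower bound the mechanism builds on can be made concrete with a short sketch. The minimal right of O'Neill (1982) in a claims problem with estate E and claims c is m_i = max(E - sum of the other claims, 0); the code below computes only this bound, not Luttens' full redistribution mechanism, and the example figures are illustrative.

# Minimal sketch of the O'Neill (1982) "minimal rights" lower bound for a claims
# problem (estate E, claims c): m_i = max(E - sum of the other agents' claims, 0).
def minimal_rights(E, claims):
    total = sum(claims)
    return [max(E - (total - c_i), 0.0) for c_i in claims]

# Example: estate 100 with claims (90, 70); the bounds sum to 40 <= 100, so they
# are jointly feasible, as required by the mechanism described above.
print(minimal_rights(100.0, [90.0, 70.0]))  # [30.0, 10.0]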
Abstract:
Introduction: Responses to external stimuli are typically investigated by averaging peri-stimulus electroencephalography (EEG) epochs in order to derive event-related potentials (ERPs) across the electrode montage, under the assumption that signals related to the external stimulus are fixed in time across trials. We demonstrate the applicability of a single-trial model based on patterns of scalp topographies (De Lucia et al., 2007) that can be used for ERP analysis at the single-subject level. The model is able to classify new trials (or groups of trials) with minimal a priori hypotheses, using information derived from a training dataset. The features used for the classification (the topography of responses and their latency) can be neurophysiologically interpreted, because a difference in scalp topography indicates a different configuration of brain generators. Above-chance classification accuracy on test datasets implicitly demonstrates the suitability of this model for EEG data. Methods: The data analyzed in this study were acquired from two separate visual evoked potential (VEP) experiments. The first entailed passive presentation of checkerboard stimuli to each of the four visual quadrants (hereafter, "Checkerboard Experiment") (Plomp et al., submitted). The second entailed active discrimination of novel versus repeated line drawings of common objects (hereafter, "Priming Experiment") (Murray et al., 2004). Four subjects per experiment were analyzed, using approximately 200 trials per experimental condition. These trials were randomly separated into training (90%) and testing (10%) datasets in 10 independent shuffles. In order to perform the ERP analysis, we estimated the statistical distribution of voltage topographies with a Mixture of Gaussians (MofGs), which reduces our original dataset to a small number of representative voltage topographies. We then statistically evaluated the degree to which these template maps were present across trials, and whether and when this presence differed across experimental conditions. Based on these differences, single trials or sets of a few single trials were classified as belonging to one or the other experimental condition. Classification performance was assessed using the Receiver Operating Characteristic (ROC) curve. Results: For the Checkerboard Experiment, contrasts entailed left vs. right visual field presentations for upper and lower quadrants, separately. The average posterior probabilities, indicating the presence of the computed template maps in time and across trials, revealed significant differences starting at ~60-70 ms post-stimulus. The average ROC curve area across all four subjects was 0.80 and 0.85 for upper and lower quadrants, respectively, and was in all cases significantly higher than chance (unpaired t-test, p < 0.0001). In the Priming Experiment, we contrasted initial versus repeated presentations of visual object stimuli. Their posterior probabilities revealed significant differences starting at 250 ms post-stimulus onset. Classification accuracy with single-trial test data was at chance level; we therefore considered sub-averages based on five single trials. We found that for three out of four subjects, classification rates were significantly above chance level (unpaired t-test, p < 0.0001). Conclusions: The main advantage of the present approach is that it is based on topographic features that are readily interpretable along neurophysiological lines.
As these maps were previously normalized by the overall strength of the field potential on the scalp, a change in their presence across trials and between conditions necessarily reflects a change in the underlying generator configurations. The temporal periods of statistical difference between conditions were estimated for each training dataset in each of the ten shuffles of the data. Across the ten shuffles and in both experiments, we observed a high level of consistency in the temporal periods over which the two conditions differed. With this method we are able to analyze ERPs at the single-subject level, providing a novel tool to compare normal electrophysiological responses with single cases that cannot be considered part of any cohort of subjects. This aspect promises to have a strong impact on both basic and clinical research.
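A hedged sketch of the pipeline idea only, not De Lucia et al.'s exact single-trial model: single trials are summarized by the posterior probabilities of a few template topographies learned with a Mixture of Gaussians, and a two-condition split is scored by the ROC curve area. The array shapes, the diagonal-covariance choice, and the generic logistic classifier standing in for the paper's decision rule are all assumptions.

# Sketch: Mixture-of-Gaussians template maps -> per-trial posterior features -> ROC area.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def fit_templates(trials, n_templates=5):
    """trials: (n_trials, n_times, n_electrodes) array of field-strength-normalized maps."""
    n_el = trials.shape[-1]
    return GaussianMixture(n_components=n_templates, covariance_type="diag",
                           random_state=0).fit(trials.reshape(-1, n_el))

def posterior_features(gmm, trials):
    """Average, over time, the posterior probability of each template map per trial."""
    n_trials, n_times, n_el = trials.shape
    post = gmm.predict_proba(trials.reshape(-1, n_el))
    return post.reshape(n_trials, n_times, -1).mean(axis=1)   # (n_trials, n_templates)

def roc_area(train_X, train_y, test_X, test_y, n_templates=5):
    gmm = fit_templates(train_X, n_templates)                 # templates from training data only
    clf = LogisticRegression(max_iter=1000).fit(posterior_features(gmm, train_X), train_y)
    scores = clf.predict_proba(posterior_features(gmm, test_X))[:, 1]
    return roc_auc_score(test_y, scores)                      # area under the ROC curve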
Abstract:
This paper presents several algorithms for joint estimation of the target number and target states in a time-varying scenario. Building on the results presented in [1], which considers estimation of the target number only, we assume that not only the target number but also the state evolution of the targets must be estimated. In this context, we extend the Rao-Blackwellization procedure of [1] to this new scenario in order to compute the Bayes recursions, thus defining reduced-complexity solutions for the multi-target set estimator. A performance assessment is finally given both in terms of Circular Position Error Probability, aimed at evaluating the accuracy of the estimated tracks, and in terms of Cardinality Error Probability, aimed at evaluating the reliability of the target number estimates.
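The two figures of merit named above can be illustrated under assumed definitions: a cardinality error occurs when the estimated target count differs from the true count, and a circular position error when an estimated position falls farther than a radius R from its associated true position. These readings are assumptions for illustration, not the definitions used in [1].

# Sketch of the two performance metrics under the assumed definitions above.
import numpy as np

def cardinality_error_probability(true_counts, est_counts):
    """Fraction of Monte Carlo runs in which the estimated target number is wrong."""
    true_counts, est_counts = np.asarray(true_counts), np.asarray(est_counts)
    return float(np.mean(true_counts != est_counts))

def circular_position_error_probability(true_pos, est_pos, radius):
    """true_pos, est_pos: (n_runs, 2) arrays of already-associated x/y positions.
    Fraction of runs whose position error exceeds the circle of the given radius."""
    err = np.linalg.norm(np.asarray(est_pos) - np.asarray(true_pos), axis=1)
    return float(np.mean(err > radius))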
Abstract:
We consider the application of normal-theory methods to the estimation and testing of a general type of multivariate regression model with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables possibly deviate from normality. The various samples to be merged can differ in the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equation models, such as LISREL, EQS, and LISCOMP, among others. An illustration with Monte Carlo data is presented.
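A small simulation illustrates the errors-in-variables problem that the normal-theory parameterization is designed to handle: regressing on a noisily measured regressor attenuates the slope, while correcting the moment equations with the measurement-error variance recovers it. This is a plain numerical illustration, not the LISREL/EQS setup described above.

# Errors-in-variables sketch: naive OLS slope is attenuated; a moment correction
# using the (here known) measurement-error variance recovers the true slope.
import numpy as np

rng = np.random.default_rng(0)
n, beta, sigma_u = 5000, 2.0, 0.8
xi = rng.normal(0.0, 1.0, n)            # true (latent) regressor
x = xi + rng.normal(0.0, sigma_u, n)    # observed regressor with measurement error
y = beta * xi + rng.normal(0.0, 0.5, n)

beta_naive = np.cov(x, y)[0, 1] / np.var(x, ddof=1)                 # about 1.2
beta_corrected = np.cov(x, y)[0, 1] / (np.var(x, ddof=1) - sigma_u**2)  # about 2.0
print(beta_naive, beta_corrected)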
Abstract:
In the homogeneous case of one-dimensional objects, we show that any preference relation that is positive and homothetic can be represented by a quantitative utility function and a unique bias. This bias may favor or disfavor the preference for an object. In the first case, preferences are complete but not transitive, and an object may be preferred even when its utility is lower. In the second case, preferences are asymmetric and transitive but not negatively transitive, and a greater utility may not be sufficient for an object to be preferred. In this manner, the bias reflects the extent to which preferences depart from the maximization of a utility function.
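One plausible reading of such a representation, offered only as an assumption since the abstract does not spell out the formula: x is weakly preferred to y whenever u(x) >= sigma * u(y), with sigma the bias. The sketch below takes sigma < 1 (a bias that favors) and shows that the resulting preferences are complete but not transitive, and that a lower-utility object can be preferred.

# Hypothetical biased-preference rule: x weakly preferred to y iff u(x) >= sigma*u(y).
def prefers(ux, uy, sigma):
    return ux >= sigma * uy

sigma = 0.8                          # bias that favors the preference for an object
u = {"a": 1.0, "b": 1.2, "c": 1.5}
print(prefers(u["a"], u["b"], sigma))  # True  -- "a" preferred despite lower utility
print(prefers(u["b"], u["c"], sigma))  # True
print(prefers(u["a"], u["c"], sigma))  # False -- transitivity fails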
Abstract:
The DRG classification provides a useful tool for the evaluation of hospital care. Indicators such as readmission and mortality rates adjusted for the hospital case-mix could be adopted in Switzerland at the price of minor additions to the hospital discharge record. The additional information required to build patient histories and to identify deaths occurring after hospital discharge is detailed.
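A sketch of the linkage step this would enable, assuming the discharge record were extended with an anonymous patient identifier and admission/discharge dates (the field names below are hypothetical): stays are chained into patient histories and readmissions within 30 days of discharge are flagged.

# Chain hospital stays into patient histories and compute a crude 30-day readmission rate.
from collections import defaultdict
from datetime import date

def readmission_rate(stays, window_days=30):
    """stays: iterable of dicts with 'patient_id', 'admission', 'discharge' (datetime.date)."""
    by_patient = defaultdict(list)
    for s in stays:
        by_patient[s["patient_id"]].append(s)
    discharges, readmissions = 0, 0
    for history in by_patient.values():
        history.sort(key=lambda s: s["admission"])
        for i, stay in enumerate(history):
            discharges += 1
            nxt = history[i + 1] if i + 1 < len(history) else None
            if nxt and (nxt["admission"] - stay["discharge"]).days <= window_days:
                readmissions += 1
    return readmissions / discharges if discharges else 0.0

stays = [
    {"patient_id": "anon-1", "admission": date(2024, 1, 3), "discharge": date(2024, 1, 9)},
    {"patient_id": "anon-1", "admission": date(2024, 1, 20), "discharge": date(2024, 1, 25)},
    {"patient_id": "anon-2", "admission": date(2024, 2, 1), "discharge": date(2024, 2, 4)},
]
print(readmission_rate(stays))  # 1 readmission over 3 discharges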
Abstract:
Background and aim of the study: Genomic gains and losses play a crucial role in the development and progression of DLBCL and are closely related to gene expression profiles (GEP), including the germinal center B-cell-like (GCB) and activated B-cell-like (ABC) cell-of-origin (COO) molecular signatures. To identify new oncogenes or tumor suppressor genes (TSG) involved in DLBCL pathogenesis and to determine their prognostic value, an integrated analysis of high-resolution gene expression and copy number profiling was performed. Patients and methods: Two hundred and eight adult patients with de novo CD20+ DLBCL enrolled in the prospective multicentric randomized LNH-03 GELA trials (LNH03-1B, -2B, -3B, -39B, -5B, -6B, -7B) with available frozen tumour samples, centralized review and adequate DNA/RNA quality were selected. 116 patients were treated with Rituximab (R)-CHOP/R-miniCHOP and 92 patients were treated with the high-dose (R)-ACVBP regimen dedicated to frontline treatment of patients younger than 60 years (y). Tumour samples were simultaneously analysed by high-resolution comparative genomic hybridization (CGH, Agilent, 144K) and gene expression arrays (Affymetrix, U133+2). Minimal common regions (MCRs), defined as segments that affect the same chromosomal region in different cases, were delineated. Gene expression and MCR data sets were merged using the Gene Expression and Dosage Integrator algorithm (GEDI, Lenz et al., PNAS 2008) to identify new potential driver genes. Results: A total of 1363 recurrent MCRs (defined by a penetrance > 5%) within the DLBCL data set, ranging in size from 386 bp (affecting a single gene) to more than 24 Mb, were identified by CGH. Of these MCRs, 756 (55%) showed a significant association with gene expression: 396 (59%) gains, 354 (52%) single-copy deletions, and 6 (67%) homozygous deletions. By this integrated approach, in addition to previously reported genes (CDKN2A/2B, PTEN, DLEU2, TNFAIP3, B2M, CD58, TNFRSF14, FOXP1, REL...), several genes targeted by gene copy abnormalities with a dosage effect and potential physiopathological impact were identified, including genes with TSG activity involved in the cell cycle (HACE1, CDKN2C), immune response (CD68, CD177, CD70, TNFSF9, IRAK2), DNA integrity (XRCC2, BRCA1, NCOR1, NF1, FHIT) or oncogenic functions (CD79b, PTPRT, MALT1, AUTS2, MCL1, PTTG1...), with distinct distributions according to the COO signature. The CDKN2A/2B tumor suppressor locus (9p21) was deleted homozygously in 27% of cases and hemizygously in 9% of cases. Biallelic loss was observed in 49% of ABC DLBCL and in 10% of GCB DLBCL. This deletion was strongly correlated with age and associated with a limited number of additional genetic abnormalities, including trisomies 3 and 18 and short gains/losses of chromosome 1, 2 and 19 regions (FDR < 0.01), allowing the identification of genes that may have synergistic effects with CDKN2A/2B inactivation. With a median follow-up of 42.9 months, only CDKN2A/2B biallelic deletion was strongly correlated (FDR p-value < 0.01) with a poor outcome in the entire cohort (4y PFS = 44% [32-61] vs. 74% [66-82] for patients in the germline configuration; 4y OS = 53% [39-72] vs. 83% [76-90]). In a Cox proportional hazards model of PFS, CDKN2A/2B deletion remained predictive (HR = 1.9 [1.1-3.2], p = 0.02) when combined with the IPI (HR = 2.4 [1.4-4.1], p = 0.001) and GCB status (HR = 1.3 [0.8-2.3], p = 0.31). The deletion also remained predictive in the subgroup of patients treated with R-CHOP (4y PFS = 43% [29-63] vs. 66% [55-78], p = 0.02), in patients treated with R-ACVBP (4y PFS = 49% [28-84] vs. 83% [74-92], p = 0.003), and in the GCB (4y PFS = 50% [27-93] vs. 81% [73-90], p = 0.02) and ABC/unclassified (5y PFS = 42% [28-61] vs. 67% [55-82], p = 0.009) molecular subtypes (Figure 1). Conclusion: We report for the first time an integrated genetic analysis of a large cohort of DLBCL patients included in a prospective multicentric clinical trial program, allowing the identification of new potential driver genes with pathogenic impact. However, CDKN2A/2B deletion constitutes the strongest, and only, prognostic factor of chemoresistance to R-CHOP, regardless of the COO signature, a resistance that is not overcome by more intensified immunochemotherapy. Patients displaying this frequent genomic abnormality warrant new and dedicated therapeutic approaches.
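A sketch of the form of the multivariable analysis reported above: a Cox proportional hazards model of PFS with CDKN2A/2B biallelic deletion, IPI and GCB status entered together. It uses the lifelines package on synthetic data whose effect sizes are merely set near the quoted hazard ratios; the column names and the data are assumptions, not the GELA trial data.

# Cox proportional hazards sketch on synthetic data (lifelines), mirroring the
# three covariates reported above; coefficients are set near ln(1.9), ln(2.4), ln(1.3).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
deleted = rng.binomial(1, 0.3, n)      # CDKN2A/2B biallelic deletion (synthetic)
ipi_high = rng.binomial(1, 0.4, n)
gcb = rng.binomial(1, 0.5, n)
hazard = 0.02 * np.exp(0.64 * deleted + 0.88 * ipi_high + 0.26 * gcb)
time = rng.exponential(1.0 / hazard)   # exponential PFS times
event = (time < 48).astype(int)        # administrative censoring at 48 months

df = pd.DataFrame({"pfs_months": np.minimum(time, 48.0), "progressed": event,
                   "cdkn2ab_deleted": deleted, "ipi_high": ipi_high, "gcb": gcb})
CoxPHFitter().fit(df, duration_col="pfs_months", event_col="progressed").print_summary()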
Abstract:
This article examines the extent and limits of nonstate forms of authority in international relations. It analyzes how the information and communication technology (ICT) infrastructure for the tradability of services in a global knowledge-based economy relies on informal regulatory practices for the adjustment of ICT-related skills. By focusing on the challenge that highly volatile and short-lived cycles of demand for this type of knowledge pose for ensuring the right qualification of the labor force, the article explores how companies and associations provide training and certification programs as part of a growing market for educational services setting their own standards. The existing literature on non-conventional forms of authority in the global political economy has emphasized that the consent of actors, subject to informal rules and some form of state support, remains crucial for the effectiveness of those new forms of power. However, analyses based on a limited sample of actors tend toward a narrow understanding of the issues concerned and fail to fully explore the differentiated space in which nonstate authority is emerging. This article develops a three-dimensional analytical framework that brings together the scope of the issues involved, the range of nonstate actors concerned, and the spatial scope of their authority. The empirical findings highlight the limits of these new forms of nonstate authority and shed light on the role of the state and international governmental organizations in this new context.
Abstract:
Following infection with the protozoan parasite Leishmania major, C57BL/6 mice develop a small lesion that heals spontaneously. Resistance to infection is associated with the development of CD4(+) Th1 cells producing gamma interferon (IFN-gamma) and tumor necrosis factor (TNF), which synergize in activating macrophages to their microbicidal state. We show here that C57BL/6 mice lacking both TNF and Fas ligand (FasL) (gld TNF(-/-) mice) infected with L. major neither resolved their lesions nor controlled Leishmania replication despite the development of a strong Th1 response. Comparable inducible nitric oxide synthase (iNOS) activities were detected in lesions of TNF(-/-), gld TNF(-/-), and gld mice, but only gld and gld TNF(-/-) mice failed to control parasite replication. Parasite numbers were high in gld mice and even more elevated in gld TNF(-/-) mice, suggesting that, in addition to iNOS, the Fas/FasL pathway is required for successful control of parasite replication and that TNF contributes only a small part to this process. Furthermore, FasL was shown to synergize with IFN-gamma for the induction of leishmanicidal activity within macrophages infected with L. major in vitro. Interestingly, TNF(-/-) mice maintained large lesion size throughout infection, despite being able to largely control parasite numbers. Thus, IFN-gamma, FasL, and iNOS appear to be essential for the complete control of parasite replication, while the contribution of TNF is more important in controlling inflammation at the site of parasite inoculation.
Abstract:
In this article, a necessary condition is given for the bargaining sets defined by Shimomura (1997) and the core of a cooperative game with transferable utility to coincide. To this end, the concept of maximal payoff vectors is introduced. The necessary condition consists in verifying that these vectors belong to the core of the game.
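A sketch of the core-membership check that the stated condition rests on: a payoff vector lies in the core of a TU game (N, v) when it distributes v(N) exactly and no coalition S can improve on it (the payoff to S is at least v(S)). The construction of the maximal payoff vectors themselves is not reproduced; the three-player game below is a made-up example.

# Core-membership test for a TU game given as a dict of coalition worths.
from itertools import combinations

def in_core(payoff, v, players, tol=1e-9):
    """payoff: dict player -> amount; v: dict frozenset(coalition) -> worth."""
    grand = frozenset(players)
    if abs(sum(payoff[i] for i in players) - v[grand]) > tol:
        return False                                   # not efficient
    for r in range(1, len(players)):
        for S in combinations(players, r):
            if sum(payoff[i] for i in S) < v[frozenset(S)] - tol:
                return False                           # coalition S blocks the vector
    return True

# Example: three-player game where the payoff vector (2, 1, 1) lies in the core.
v = {frozenset(s): w for s, w in {
    ("1",): 0, ("2",): 0, ("3",): 0,
    ("1", "2"): 3, ("1", "3"): 3, ("2", "3"): 2, ("1", "2", "3"): 4}.items()}
print(in_core({"1": 2.0, "2": 1.0, "3": 1.0}, v, ["1", "2", "3"]))  # True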