6 results for Stolper-Samuelson
at Université de Lausanne, Switzerland
Abstract:
This article examines how Samuelson defined his role as an economist: a technical expert who walked what he called the "middle of the road" in order to stay, at least seemingly, out of the realm of politics. As a point of entry, I discuss the highly tempting offers Theodore W. Schultz made in the 1940s to bring Samuelson to Chicago, offers Schultz persistently repeated over three years despite strong resistance from the Chicago faculty. A contrast between Schultz's own experiences as an economic expert at Iowa State, Samuelson's work as an external consultant for the National Resources Planning Board during the Second World War, and the firm support of the MIT administration for Samuelson's research serves to pinpoint what being "technical" meant for Samuelson, and the relation of the technical economic expert to the realm of politics.
Abstract:
Introduction: As part of the MicroArray Quality Control (MAQC)-II project, this analysis examines how the choice of univariate feature-selection methods and classification algorithms may influence the performance of genomic predictors under varying degrees of prediction difficulty represented by three clinically relevant endpoints. Methods: We used gene-expression data from 230 breast cancers (grouped into training and independent validation sets), and we examined 40 predictors (five univariate feature-selection methods combined with eight different classifiers) for each of the three endpoints. Their classification performance was estimated on the training set by using two different resampling methods and compared with the accuracy observed in the independent validation set. Results: A ranking of the three classification problems was obtained, and the performance of 120 models was estimated and assessed on an independent validation set. The bootstrapping estimates were closer to the validation performance than were the cross-validation estimates. The required sample size for each endpoint was estimated, and both gene-level and pathway-level analyses were performed on the obtained models. Conclusions: We showed that genomic predictor accuracy is determined largely by an interplay between sample size and classification difficulty. Variations on univariate feature-selection methods and choice of classification algorithm have only a modest impact on predictor performance, and several statistically equally good predictors can be developed for any given classification problem.
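To make the kind of comparison described above concrete (univariate feature selection paired with different classifiers, with training-set resampling estimates checked against an independent validation set), the following is a minimal sketch using scikit-learn on synthetic data. It is not the MAQC-II code: the sample counts, the k=50 univariate filter, the two classifiers, and the 20 bootstrap replicates are illustrative placeholders rather than the study's 40-predictor design.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.utils import resample

# Synthetic stand-in for an expression matrix of 230 samples x 2000 genes
X, y = make_classification(n_samples=230, n_features=2000, n_informative=30, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.4, stratify=y, random_state=0)

classifiers = {"logreg": LogisticRegression(max_iter=1000),
               "knn": KNeighborsClassifier(n_neighbors=5)}

for name, clf in classifiers.items():
    # Univariate feature selection is kept inside the pipeline so it is refit in every resample
    model = Pipeline([("select", SelectKBest(f_classif, k=50)),
                      ("clf", clf)])

    # Estimate 1: 5-fold cross-validation on the training set
    cv_acc = cross_val_score(model, X_tr, y_tr, cv=5).mean()

    # Estimate 2: simple bootstrap - train on a resample, score on the out-of-bag samples
    boot_accs = []
    for b in range(20):
        idx = resample(np.arange(len(y_tr)), replace=True, random_state=b)
        oob = np.setdiff1d(np.arange(len(y_tr)), idx)
        model.fit(X_tr[idx], y_tr[idx])
        boot_accs.append(model.score(X_tr[oob], y_tr[oob]))
    boot_acc = np.mean(boot_accs)

    # Reference: accuracy on the held-out validation set
    model.fit(X_tr, y_tr)
    val_acc = model.score(X_val, y_val)
    print(f"{name}: CV={cv_acc:.2f}  bootstrap={boot_acc:.2f}  validation={val_acc:.2f}")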
Abstract:
One hypothesis for the origin of alkaline lavas erupted on oceanic islands and in intracontinental settings is that they represent the melts of amphibole-rich veins in the lithosphere (or melts of their dehydrated equivalents if metasomatized lithosphere is recycled into the convecting mantle). Amphibole-rich veins are interpreted as cumulates produced by crystallization of low-degree melts of the underlying asthenosphere as they ascend through the lithosphere. We present the results of trace-element modelling of the formation and melting of veins formed in this way, with the goal of testing this hypothesis and of predicting how variability in the formation and subsequent melting of such cumulates (and of adjacent cryptically and modally metasomatized lithospheric peridotite) would be manifested in magmas generated by such a process. Because the high-pressure phase equilibria of hydrous near-solidus melts of garnet lherzolite are poorly constrained, and given the likely high variability of the hypothesized accumulation and remelting processes, we used Monte Carlo techniques to estimate how uncertainties in the model parameters (e.g. the compositions of the asthenospheric sources, their trace-element contents, and their degree of melting; the modal proportions of crystallizing phases, including accessory phases, as the asthenospheric partial melts ascend and crystallize in the lithosphere; the amount of metasomatism of the peridotitic country rock; the degree of melting of the cumulates and the amount of melt derived from the metasomatized country rock) propagate through the process and manifest themselves as variability in the trace-element contents and radiogenic isotopic ratios of model vein compositions and erupted alkaline magma compositions. We then compare the results of the models with amphibole observed in lithospheric veins and with oceanic and continental alkaline magmas. While the trace-element patterns of the near-solidus peridotite melts, the initial anhydrous cumulate assemblage (clinopyroxene +/- garnet +/- olivine +/- orthopyroxene), and the modelled coexisting liquids do not match the patterns observed in alkaline lavas, our calculations show that with further crystallization and the appearance of amphibole (and accessory minerals such as rutile, ilmenite, and apatite) the calculated cumulate assemblages have trace-element patterns that closely match those observed in the veins and lavas. These calculated hydrous cumulate assemblages are highly enriched in incompatible trace elements and share many similarities with the trace-element patterns of alkaline basalts observed in oceanic or continental settings, such as positive Nb/La, negative Ce/Pb, and similar slopes of the rare earth elements. By varying the proportions of trapped liquid, and thus simulating the cryptic and modal metasomatism observed in peridotite that surrounds these veins, we can model the variations in Ba/Nb, Ce/Pb, and Nb/U ratios that are observed in alkaline basalts. If the isotopic compositions of the initial low-degree peridotite melts are similar to the range observed in mid-ocean ridge basalt, our model calculations produce cumulates that would have isotopic compositions similar to those observed in most alkaline ocean island basalt (OIB) and continental magmas after ~0.15 Gyr. However, producing alkaline basalts with HIMU isotopic compositions requires much longer residence times (i.e.
1-2 Gyr), consistent with subduction and recycling of metasomatized lithosphere through the mantle, or with an alternative source such as a heterogeneous asthenosphere. These modelling results support the interpretation proposed by various researchers that amphibole-bearing veins represent cumulates formed during the differentiation of a volatile-bearing low-degree peridotite melt, and that these cumulates are significant components of the sources of alkaline OIB and continental magmas. The results of the forward models provide the potential for detailed tests of this class of hypotheses for the origin of alkaline magmas worldwide and for interpreting major and minor aspects of the geochemical variability of these magmas.
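To make the Monte Carlo idea concrete, the sketch below propagates uncertainty in the degree of melting and in bulk partition coefficients through a standard batch-melting equation for only the first stage of the process (generation of low-degree asthenospheric melts) and examines the resulting spread in incompatible-element ratios. The element list, source concentrations, partition coefficients, and sampling ranges are illustrative placeholders, not the values or the full multi-stage model used in the study.

import numpy as np

rng = np.random.default_rng(42)
n_runs = 10_000

elements = ["Nb", "La", "Ce", "Pb"]
c0 = np.array([0.30, 0.30, 0.80, 0.03])      # source concentrations (ppm), placeholder values
d0 = np.array([0.004, 0.010, 0.015, 0.010])  # nominal bulk solid/melt partition coefficients

ratios = []
for _ in range(n_runs):
    F = rng.uniform(0.005, 0.02)                                # low degree of melting, 0.5-2%
    D = d0 * rng.lognormal(mean=0.0, sigma=0.3, size=d0.size)   # perturb partition coefficients
    c_liq = c0 / (D + F * (1.0 - D))                            # batch (equilibrium) melting
    conc = dict(zip(elements, c_liq))
    ratios.append((conc["Nb"] / conc["La"], conc["Ce"] / conc["Pb"]))

ratios = np.array(ratios)
nb_la, ce_pb = ratios[:, 0], ratios[:, 1]
print(f"Nb/La: median {np.median(nb_la):.2f}, "
      f"5-95% range {np.percentile(nb_la, 5):.2f}-{np.percentile(nb_la, 95):.2f}")
print(f"Ce/Pb: median {np.median(ce_pb):.1f}, "
      f"5-95% range {np.percentile(ce_pb, 5):.1f}-{np.percentile(ce_pb, 95):.1f}")

The same sampling logic extends, stage by stage, to the crystallization of the vein cumulates and to their later remelting, which is where the study's amphibole- and accessory-phase-bearing assemblages enter the calculation.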
Abstract:
This work discusses behavioural results obtained with rats in three different experimental paradigms: the Morris Water Maze (Morris, 1984), the Homing Board (Schenk, 1989), and the Radial Arm Maze (Olton and Samuelson, 1976). The first two are spatial tasks that allow place learning in controlled environments; the third is a behavioural task that contrasts two particular skills, elimination (based on working memory) and selection (based on reference memory). The discussion concerns the navigation strategies used by the animals to solve these tasks and, more precisely, the factors that can bias the choice of strategy; the environmental factor (controlled environment) and the cognitive factor (ageing) are the variables studied here. Several commonly accepted hypotheses were challenged by our results. First, although space is usually assumed to be homogeneous (all spatial positions present the same degree of difficulty during open-field learning), this work establishes that a position associated with, but not adjacent to, one of the three visual cues located at the periphery of the environment is more difficult to learn than a position situated between two of the three cues. Second, whereas it is generally accepted that learning a place in a rich environment requires the same type of information in the Morris water maze (a swimming task) as on the homing board (a walking task), we showed that spatial discrimination in the water maze cannot be supported by the three peripheral visual cues alone and requires at least one additional cue. Finally, ageing studies have often shown that age reduces the cognitive capacities required for spatial navigation, leading to a general performance deficit in senescent animals, whereas in our work the aged animals were more successful and more efficient than the adults in a particular food-collecting task. These experiments are part of a broader study testing the theoretical model proposed by Jacobs and Schenk (2003), according to which the cognitive map (Tolman, 1948; O'Keefe and Nadel, 1978) is encoded in the hippocampus through the activity of two complementary modules: the Dentate Gyrus-CA3, which processes a spatial framework based on directional cues and/or cues distributed in gradients (the bearing map), and the CA1-subiculum, which processes local representations based on the relative positions of fixed elements of the environment (the sketch map).
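The elimination/selection distinction in the Radial Arm Maze is conventionally quantified by counting two types of errors within a trial. The following is a minimal sketch of one common scoring convention; the arm numbering and visit sequence are hypothetical examples, not data from this study.

from typing import Iterable, Set, Tuple

def score_trial(visits: Iterable[int], baited_arms: Set[int]) -> Tuple[int, int]:
    """Return (working_memory_errors, reference_memory_errors) for one trial.

    working-memory error: re-entering any arm already visited in this trial
    reference-memory error: first entry into an arm that was never baited
    """
    visited = set()
    wm_errors = 0
    rm_errors = 0
    for arm in visits:
        if arm in visited:
            wm_errors += 1          # repeat entry within the trial
        else:
            visited.add(arm)
            if arm not in baited_arms:
                rm_errors += 1      # entry into a never-baited arm
    return wm_errors, rm_errors

# Hypothetical trial on an 8-arm maze with arms 1-4 baited:
print(score_trial([1, 3, 5, 3, 2, 4], baited_arms={1, 2, 3, 4}))  # -> (1, 1)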
Abstract:
Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis.
Abstract:
This paper provides an explanation of the emergence of the standard textbook definition of public goods in the middle of the 20th century. It focuses on Richard Musgrave's contribution, from 1939 to 1969, in defining public goods as non-rival and non-excludable. Although Samuelson's mathematical definition is generally used in models of public goods, the qualitative understanding of the specificity of pure public goods owes more to Musgrave's emphasis on the impossibility of exclusion. The paper also highlights the importance of the size of the group to which the benefits of a public good accrue. This analysis allows for a reassessment of the summary table of goods that first appeared in Musgrave and Musgrave's (1973) textbook.
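The abstract does not restate the mathematical definition it refers to; for orientation, the standard Samuelson (1954) efficiency condition for a pure public good G provided alongside a private good X (notation not taken from the paper) is:

\[
\sum_{i=1}^{n} MRS^{\,i}_{G,X}
\;=\;
\sum_{i=1}^{n} \frac{\partial U^{i}/\partial G}{\partial U^{i}/\partial X^{i}}
\;=\;
MRT_{G,X}.
\]

Non-rivalry is what puts the sum on the left-hand side: the same unit of G enters every consumer's utility at once, so every individual marginal valuation counts toward the efficient level of provision. Non-excludability, Musgrave's emphasis, is what makes financing that level through voluntary, decentralized payments problematic.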