985 results for static random access memory
Abstract:
In the context of an autologous cell transplantation study, a unilateral biopsy of cortical tissue was surgically taken from the right dorsolateral prefrontal cortex (dlPFC) in two intact adult macaque monkeys (dlPFC-lesioned group), together with the implantation of a chronic chamber providing access to the left motor cortex. Three other monkeys were subjected to the same chronic chamber implantation, but without dlPFC biopsy (control group). All monkeys were initially trained to perform sequential manual dexterity tasks requiring precision grip. The motor performance and the prehension sequence (the temporal order in which pellets were grasped from different spatial locations) were analysed for each hand. Following the surgery, transient and moderate deficits of manual dexterity per se occurred in both groups, indicating that they were not due to the dlPFC lesion (they were most likely related to the recording chamber implantation and/or general anaesthesia/medication). In contrast, changes in motor habit for the sequential order of grasping were observed only in the two monkeys with a dlPFC lesion. The changes were more prominent in the monkey with the largest lesion, supporting the notion of a specific effect of the dlPFC lesion on the monkeys' motor habits. These observations are reminiscent of previous studies using conditional tasks with delay, which proposed a specialization of the dlPFC for visuo-spatial working memory, except that the present findings arise in a different context: a "free-will", non-conditional manual dexterity task without a working memory component.
Abstract:
Non-urgent cases represent 30-40% of all emergency department (ED) consults; they contribute to ED overcrowding, which could be reduced if such patients were denied emergency care. However, no triage instrument has demonstrated a high enough degree of accuracy to safely rule out serious medical conditions: patients suffering from life-threatening emergencies have been inappropriately denied care. Insurance companies have instituted financial penalties to discourage the use of the ED as a source of non-urgent care, but this practice mainly restricts access for the underprivileged. More recent data suggest that in fact most patients consult for appropriate urgent reasons, or have no alternative access to urgent care. Safely reducing overcrowding requires a reform of the healthcare system based on patients' needs rather than on access barriers.
Abstract:
This paper reviews the evidence on the effects of recessions on potential output. In contrast to the assumption in mainstream macroeconomic models that economic fluctuations do not change potential output paths, the evidence is that they do in the case of recessions. A model is proposed to explain this phenomenon, based on an analogy with water flows in porous media. Because of the discrete adjustments made by heterogeneous economic agents in such a world, potential output displays hysteresis with regard to aggregate demand shocks, and thus retains a memory of the shocks associated with recessions.
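One simple reduced-form way to capture the hysteresis described above (an illustrative sketch only; the paper's own model is built on the porous-media analogy with heterogeneous agents) is to let potential output permanently absorb part of any negative output gap:

\[ y^{*}_{t} = y^{*}_{t-1} + g + \gamma \min\bigl(0,\; y_{t-1} - y^{*}_{t-1}\bigr), \qquad 0 < \gamma < 1, \]

where g is trend growth and γ governs how much of a recessionary shortfall is carried into the potential output path, so that y* retains a memory of past demand shocks even after the output gap has closed.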
Abstract:
This paper examines the interactions between multiple national fiscal policymakers and a single monetary policymaker in response to shocks to government debt in some or all of the countries of a monetary union. We assume that national governments respond to excess debt in an optimal manner, but that they do not have access to a commitment technology. This implies that national fiscal policy gradually reduces debt: the lack of a commitment technology precludes a random walk in steady state debt, but the need to maintain national competitiveness avoids excessively rapid debt reduction. If the central bank can commit, it adjusts its policies only slightly in response to higher debt, allowing national fiscal policy to undertake most of the adjustment. However, if it cannot commit, then optimal monetary policy involves using interest rates to rapidly reduce debt, with significant welfare costs. We show that in these circumstances the central bank would do better to ignore national fiscal policies in formulating its policy.
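To see in stylized form why the single monetary policy enters each member country's debt problem, consider an illustrative member-country budget constraint (a sketch for orientation, not the model used in the paper):

\[ b_{i,t} = \frac{1 + r_{t-1}}{\pi_{t}}\, b_{i,t-1} - s_{i,t}, \]

where b_{i,t} is the real debt of country i, s_{i,t} its primary surplus, and the union-wide nominal interest rate r and inflation π are set by the common central bank. A central bank unable to commit is tempted to move r and π so as to erode debt quickly, which is the channel behind the welfare costs mentioned above.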
Abstract:
Starting from the observation that ghosts are strikingly recurrent and prominent figures in late-twentieth-century African diasporic literature, this dissertation proposes to account for this presence by exploring its various functions. It argues that, beyond the poetic function the ghost performs as metaphor, it also does cultural, theoretical and political work that is significant to the African diaspora in its dealings with issues of history, memory and identity. Toni Morrison's Beloved (1987) serves as a guide for introducing the many forms, qualities and significations of the ghost, which are then explored and analyzed in four chapters that look at Fred D'Aguiar's Feeding the Ghosts (1998), Gloria Naylor's Mama Day (1988), Paule Marshall's Praisesong for the Widow (1983) and a selection of novels, short stories and poetry by Michelle Cliff. Moving thematically through these texts, the discussion shifts from history through memory to identity as it examines how the ghost trope allows the writers to revisit sites of trauma; revise historical narratives that are constituted and perpetuated by exclusions and invisibilities; creatively and critically repossess a past marked by violence, dislocation and alienation and reclaim the diasporic culture it contributed to shaping; and destabilize and deconstruct the hegemonic, normative categories and boundaries that delimit race or sexuality and envision other, less limited and limiting definitions of identity. These diverse and interrelated concerns are identified and theorized as participating in a project of "re-vision," a critical project that constitutes an epistemological as much as a political gesture. The author-based structure allows for a detailed analysis of the texts and highlights the distinctive shapes the ghost takes and the particular concerns it serves to address in each writer's literary and political project. However, using the ghost as a guide into these texts, taken collectively, also throws into relief new connections between them and sheds light on the complex ways in which the interplay of history, memory and identity positions them as products of and contributions to an African diasporic (literary) culture. While it insists on the cultural specificity of African diasporic ghosts, tracing their origins to African cultures and spiritualities, the argument also follows gothic studies' common view that ghosts in literary and cultural productions, like other related figures of the living dead, respond to particular conditions and anxieties. Considering the historical and political context in which the texts under study were produced, the dissertation makes connections between the ghosts in them and African diasporic people's disillusionment with the broken promises of the civil rights movement in the United States and of postcolonial independence in the Caribbean. It reads the texts' theoretical concerns and narrative qualities alongside the contestation of traditional historiography by black and postcolonial studies, as well as the broader challenge to conventional notions such as truth, reality, meaning, power or identity by poststructuralism, postcolonialism and queer theory. Drawing on these various theoretical approaches and critical tools to elucidate the ghost's deconstructive power for African diasporic writers' concerns, this work ultimately offers a contribution to "spectrality studies," which is currently emerging as a new field of scholarship in cultural theory.
Abstract:
BACKGROUND: Identification of a Primary Care Physician (PCP) by older patients is considered essential for the coordination of care, but the extent to which identified PCPs are general practitioners or specialists is unknown. This study described older patients' experiences with their PCP and tested the hypothesis of differences between patients who identify a specialist as their PCP (SP PCP) and those who turn to a general practitioner (GP PCP). METHODS: In 2012, a cross-sectional postal survey on care was conducted in the 68+ year old population of the canton of Vaud. Data were provided by 2,276 participants in the ongoing Lausanne cohort 65+ (Lc65+), a study of those born between 1934 and 1943, and by 998 persons from an additional sample drawn to include the population outside of Lausanne or born before 1934. RESULTS: Participants expressed favourable perceptions, at rates exceeding 75% for most items. However, only 38% to 51% responded positively regarding out-of-hours availability, easy access and home visits, the likelihood of prescribing expensive medication if needed, and doctors' awareness of over-the-counter drugs. 12.0% had an SP PCP; in 95.9% of these cases the specialist's discipline implied training in internal medicine. Bivariate and multivariate analyses did not reveal significant differences between GP and SP PCPs regarding perceptions of accessibility/availability, the doctor-patient relationship, information and continuity of care, prevention, spontaneous use of the emergency department, or ambulatory care utilisation. CONCLUSIONS: Older patients' experiences were mostly positive, despite some gaps in reported hearing and memory testing and colorectal cancer screening. We found no differences between the GP and SP PCP groups.
Abstract:
Induction of cytotoxic CD8 T-cell responses is enhanced by the exclusive presentation of antigen through dendritic cells and by innate stimuli, such as toll-like receptor ligands. On the basis of these two principles, we designed a vaccine against melanoma. Specifically, we linked the melanoma-specific Melan-A/Mart-1 peptide to virus-like nanoparticles loaded with A-type CpG, a ligand for toll-like receptor 9. The Melan-A/Mart-1 peptide was cross-presented, as shown in vitro with human dendritic cells and in HLA-A2 transgenic mice. A phase I/II study in stage II-IV melanoma patients showed that the vaccine was well tolerated and that 14/22 patients generated ex vivo detectable T-cell responses, in part with multifunctional T cells capable of degranulating and producing IFN-γ, TNF-α, and IL-2. No significant influence of the route of immunization (subcutaneous versus intradermal) or of the dosing regimen (weekly versus daily clusters) could be observed. Interestingly, relatively large fractions of the responding specific T cells exhibited a central memory phenotype, more than is achieved by other non-live vaccines. We conclude that vaccination with CpG-loaded virus-like nanoparticles is associated with a human CD8 T-cell response with properties of potential long-term immune protection from the disease.
Abstract:
This paper analyses the impact of policy initiatives co-ordinated by Asian national governments on firms' access to external finance, using a unique firm-level database of eight Asian economies (Hong Kong SAR, Indonesia, Korea, Malaysia, the Philippines, Singapore, Taiwan and Thailand) over the period 1996-2012. Using a difference-in-differences approach and controlling for firm-level and macroeconomic factors, the results show a significant impact of policy on firms' access to external finance. After splitting firms into constrained and unconstrained groups using several criteria, the results document that unconstrained firms benefited significantly in obtaining external finance compared to their constrained counterparts. Finally, we show that the increase in access to external finance after the policy initiative helped firms to raise their investment spending, especially unconstrained firms.
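The estimation strategy can be illustrated with a generic difference-in-differences specification (a sketch for orientation; the paper's exact variables and controls are not spelled out in the abstract):

\[ y_{it} = \alpha_{i} + \lambda_{t} + \delta\,(\mathrm{Treated}_{i} \times \mathrm{Post}_{t}) + \beta' X_{it} + \varepsilon_{it}, \]

where y_{it} measures firm i's access to external finance in year t, Treated_i marks firms affected by the policy initiative, Post_t marks the years after it takes effect, X_{it} collects firm-level and macroeconomic controls, and δ is the policy effect of interest.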
Abstract:
This paper develops a new test of true versus spurious long memory, based on log-periodogram estimation of the long memory parameter using skip-sampled data. A correction factor is derived to overcome the bias in this estimator due to aliasing. The procedure is designed to be used in the context of a conventional test of significance of the long memory parameter, and a composite test procedure is described that has the properties of known asymptotic size and consistency. The test is implemented using the bootstrap, with the distribution under the null hypothesis approximated by a dependent-sample bootstrap technique that captures the short-run dependence remaining after fractional differencing. The properties of the test are investigated in a set of Monte Carlo experiments. The procedure is illustrated by applications to exchange rate volatility and dividend growth series.
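For orientation, here is a minimal sketch of a standard GPH-style log-periodogram estimator of the long memory parameter d, the building block the test rests on. It is not the paper's procedure (no skip-sampling, aliasing correction or bootstrap), and the bandwidth choice m = n**0.5 is an assumption.

import numpy as np

def gph_estimate(x, power=0.5):
    """Estimate the long memory parameter d by log-periodogram (GPH) regression."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = int(n ** power)                              # number of low frequencies used
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    # Periodogram at the first m Fourier frequencies
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    periodogram = (np.abs(dft) ** 2) / (2 * np.pi * n)
    # Regress log periodogram on log(4 sin^2(freq/2)); the slope estimates -d
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
    return -slope

# Example: white noise should give an estimate of d close to 0
rng = np.random.default_rng(0)
print(gph_estimate(rng.standard_normal(2048)))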
Abstract:
The idea of Chineseness as a geographic, culture-specific and ethnically charged concept, and the pivotal role assumed by memory, linger throughout the writings of most authors hailing from the Chinese communities of Southeast Asia. Among these communities, the Malaysian Chinese are the most prolific in terms of the number of writers and works produced, and this paper deals specifically with them. Its focus is on the literature produced by Malaysian Chinese authors residing in Taiwan, a topic that constitutes an important part of the first chapter, and on one of its main representatives, Ng Kim Chew, to whom chapters two and three are fully dedicated. A literary analysis of one of his short stories, Huo yu tu, will allow the reader to have a first-hand experience, through excerpts from the original text, of the importance of Chineseness and memory in the literary production of Ng and of many authors sharing similar life and literary experiences. I started this research from the assumption that these authors make extensive use of their own memories, and of memories from their own community, in their writing as a way to re-tie themselves to the Chineseness they left behind in their places of origin. However, in the case of Ng Kim Chew, the analysis of his works led me to theorize that the identity he is imbued with, if there is one, is neither Chinese nor Malaysian, but purely and distinctively Malaysian-Chinese. This paper can also serve as an introduction for the general public to the field of Sinophone literature from Southeast Asia, and as a way to promote wider and innovative paths of research within the realm of Chinese studies that go beyond China proper.
Abstract:
In terms of execution time and data usage, parallel/distributed applications can show variable executions, even when the same input data set is used. Certain environment-related performance aspects can dynamically affect the application's behaviour, such as memory capacity, network latency, the number of nodes, and the heterogeneity of the nodes, among others. It is important to consider that the application may run on different hardware configurations, and the application developer cannot guarantee that performance tunings made for one particular system remain valid for other configurations. Dynamic analysis of applications has proved to be the best approach to performance analysis, for two main reasons. First, it offers a very convenient solution from the developers' point of view while they design and evaluate their parallel applications. Second, it adapts better to the application during execution. This approach does not require the intervention of developers or even access to the application's source code. The application is analysed at run time, and the search for possible bottlenecks and optimizations is carried out. To optimize the execution of the bioinformatics application mpiBLAST, we analysed its behaviour in order to identify the parameters that affect its performance, such as memory usage, network usage, I/O patterns, the file system used, the processor architecture, the size of the biological database, the size of the query sequence, the distribution of the sequences within the databases, the number of database fragments, and/or the granularity of the jobs assigned to each process. Our goal is to determine which of these parameters have the greatest impact on application performance and how to tune them dynamically to improve it. By analysing the performance of mpiBLAST, we found a data set that reveals a certain level of serialization within the execution. Recognizing the impact of the characterization of the sequences within the different databases, and a relationship between the capacity of the workers and the granularity of the current workload, these could be tuned dynamically. Further improvements also include optimizations related to the parallel file system and the possibility of execution on multiple multicore nodes. The work-grain size is influenced by factors such as the database type, the database size, and the ratio between the workload size and the workers' capacity.
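As a rough illustration of the kind of dynamic tuning described above (a hypothetical sketch, not mpiBLAST's actual scheduler; the throughput-based rule, limits and names are assumptions), the work-grain size could be adjusted per worker from the throughput observed for its previous batch:

# Hypothetical sketch: adapt the number of database fragments handed to each
# worker based on the throughput observed for its previous batch.

MIN_FRAGMENTS, MAX_FRAGMENTS = 1, 64

def next_grain_size(current, seconds_per_fragment, target_batch_seconds=30.0):
    """Choose how many fragments to assign next so a batch takes about target_batch_seconds."""
    if seconds_per_fragment <= 0:
        return current                      # no usable measurement yet; keep the current grain
    ideal = int(target_batch_seconds / seconds_per_fragment)
    return max(MIN_FRAGMENTS, min(MAX_FRAGMENTS, ideal))

# Example: a worker that processed its last batch at 2.5 s per fragment
print(next_grain_size(current=8, seconds_per_fragment=2.5))  # -> 12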
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. Indeed, a dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far. In the case of imperfect information, by contrast, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account of dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness.
Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
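Since backward induction recurs in the chapters above, a minimal sketch of the algorithm on a finite perfect-information game tree may help. This is an illustrative implementation under an assumed nested-dict tree representation, not the thesis's formal framework; ties are broken arbitrarily by iteration order.

# Minimal backward induction on a finite perfect-information game tree.
# A decision node is {"player": i, "moves": {label: subtree}}; a leaf is a
# payoff tuple containing one utility per player.

def backward_induction(node):
    """Return the payoff profile reached when every player plays backward induction."""
    if isinstance(node, tuple):            # leaf: payoffs are fixed
        return node
    player = node["player"]
    # The player to move picks the continuation maximizing her own induced payoff.
    return max((backward_induction(subtree) for subtree in node["moves"].values()),
               key=lambda payoffs: payoffs[player])

# Example: player 0 moves first, player 1 replies.
game = {"player": 0, "moves": {
    "L": {"player": 1, "moves": {"l": (2, 1), "r": (0, 0)}},
    "R": {"player": 1, "moves": {"l": (1, 2), "r": (3, 0)}},
}}
# After "R" player 1 would pick "l", leaving player 0 with 1, so player 0 plays "L".
print(backward_induction(game))  # -> (2, 1)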
Abstract:
Melan-A specific CD8+ T cells are thought to play an important role against the development of melanoma. Their in vivo expansion is often observed with advanced disease. In recent years, low levels of Melan-A reactive CD8+ T cells have also been found in healthy HLA-A2 donors, but these cells harbor naive characteristics and are thought to be mostly cross-reactive for the Melan-A antigen. Here, we report on a large population of CD8+ T cells reactive for the Melan-A antigen, identified in one donor with no evidence of melanoma. Interestingly, this population is oligoclonal and displays a clear memory phenotype. However, a detailed study of these cells indicated that they are unlikely to be directly specific for melanoma, so that their in vivo expansion may have been driven by an exogenous antigen. Screening of a Melan-A cross-reactive peptide library suggested that these cells may be specific for an epitope derived from a Mycobacterium protein, which would provide a further example of CD8+ T cell cross-reactivity between a pathogen antigen and a tumor antigen. Finally, we discuss how such cells may contribute to heterologous immunity by influencing the balance between protective immunity and pathology, e.g. in the case of melanoma development.