7 results for Doreen Massey

in Aston University Research Archive


Relevance: 10.00%

Abstract:

Chronic experimental lung infection in rats was induced by intratracheal inoculation of agar beads containing Pseudomonas aeruginosa. Bacteria were recovered directly without subculture from the lungs of rats at 14 days post-infection and the outer membrane (OM) antigens were studied. The results indicated that bacteria grew under iron-restricted conditions as revealed by the expression of several iron-regulated membrane proteins (IRMPs), which could also be observed when the isolate was grown under iron-depleted conditions in laboratory media. The antibody response to P. aeruginosa OM protein antigens was investigated by immunoblotting with serum and lung fluid from infected rats. These fluids contained antibodies to all the major OM proteins, including the IRMPs, and protein H1. Results obtained using immunoblotting and enzyme-linked immunosorbent assay indicated that lipopolysaccharide (LPS) was the major antigen recognised by antibodies in sera from infected rats. The animal model was used to follow the development of the immune response to P. aeruginosa protein and LPS antigens. Immunoblotting was used to investigate the antigens recognised by antibodies in sequential serum samples. An antibody response to the IRMPs and OM proteins D, E, G and H1, and also to rough LPS, was detected as early as 4 days post-infection. Results obtained using immunoblotting and crossed immunoelectrophoresis techniques indicated that there was a progressive increase in the number of P. aeruginosa antigens recognised by antibodies in these sera. Both iron and magnesium depletion influenced protein H1 production. Antibodies in sera from patients with infections due to P. aeruginosa reacted with this antigen. Results obtained using quantitative gas-liquid chromatographic analysis indicated that growth phase and magnesium and iron depletion also affected the amount of LPS fatty acids produced by P. aeruginosa.
The silver stained SDS-polyacrylamide gels of proteinase K digested whole cell lysates of P. aeruginosa indicated that the O-antigen and core LPS were both affected by growth phase and specific nutrient depletion.

Relevance: 10.00%

Abstract:

Ethosuximide is the drug of choice for treating generalized absence seizures, but its mechanism of action is still a matter of debate. It has long been thought to act by disrupting a thalamic focus via blockade of T-type channels and, thus, generation of spike-wave activity in thalamocortical pathways. However, there is now good evidence that generalized absence seizures may be initiated at a cortical focus and that ethosuximide may target this focus. In the present study we have looked at the effect of ethosuximide on glutamate and GABA release at synapses in the rat entorhinal cortex in vitro, using two experimental approaches. Whole-cell patch-clamp studies revealed an increase in spontaneous GABA release by ethosuximide concurrent with no change in glutamate release. This was reflected in studies that estimated global background inhibition and excitation from intracellularly recorded membrane potential fluctuations, where there was a substantial rise in the ratio of network inhibition to excitation, and a concurrent decrease in excitability of neurones embedded in this network. These studies suggest that, in addition to well-characterised effects on ion channels, ethosuximide may directly elevate synaptic inhibition in the cortex and that this could contribute to its anti-absence effects. This article is part of a Special Issue entitled 'Post-Traumatic Stress Disorder'.

Relevance: 10.00%

Abstract:

Mobile technology has been one of the major growth areas in computing over recent years (Urbaczewski, Valacich, & Jessup, 2003). Mobile devices are becoming increasingly diverse and are continuing to shrink in size and weight. Although this increases the portability of such devices, their usability tends to suffer. Fuelled almost entirely by lack of usability, users report high levels of frustration regarding interaction with mobile technologies (Venkatesh, Ramesh, & Massey, 2003). This will only worsen if interaction design for mobile technologies does not continue to receive increasing research attention. For the commercial benefit of mobility and mobile commerce (m-commerce) to be fully realized, users’ interaction experiences with mobile technology cannot be negative. To ensure this, it is imperative that we design the right types of mobile interaction (m-interaction); an important prerequisite for this is ensuring that users’ experience meets both their sensory and functional needs (Venkatesh, Ramesh, & Massey, 2003). Given the resource disparity between mobile and desktop technologies, successful electronic commerce (e-commerce) interface design and evaluation does not necessarily equate to successful m-commerce design and evaluation. It is, therefore, imperative that the specific needs of m-commerce are addressed, both in terms of design and evaluation. This chapter begins by exploring the complexities of designing interaction for mobile technology, highlighting the effect of context on the use of such technology. It then goes on to discuss how interaction design for mobile devices might evolve, introducing alternative interaction modalities that are likely to affect that future evolution.
It is impossible, within a single chapter, to consider each and every potential mechanism for interacting with mobile technologies; to provide a forward-looking flavor of what might be possible, this chapter focuses on some more novel methods of interaction and does not, therefore, look at the typical keyboard and visual display-based interaction which, in essence, stems from the desktop interaction design paradigm. Finally, this chapter touches on issues associated with effective evaluation of m-interaction and mobile application designs. By highlighting some of the issues and possibilities for novel m-interaction design and evaluation, we hope that future designers will be encouraged to “think out of the box” in terms of their designs and evaluation strategies.

Relevance: 10.00%

Abstract:

Presynaptic NMDA receptors facilitate the release of glutamate at excitatory cortical synapses and are involved in regulation of synaptic dynamics and plasticity. At synapses in the entorhinal cortex these receptors are tonically activated and provide a positive feedback modulation of the level of background excitation. NMDA receptor activation requires obligatory occupation of a co-agonist binding site, and in the present investigation we have examined whether this site on the presynaptic receptor is activated by endogenous glycine or d-serine. We used whole-cell patch clamp recordings of spontaneous AMPA receptor-mediated synaptic currents from rat entorhinal cortex neurones in vitro as a monitor of presynaptic glutamate release. Addition of exogenous glycine or d-serine had minimal effects on spontaneous release, suggesting that the co-agonist site was endogenously activated and likely to be saturated in our slices. This was supported by the observation that a co-agonist site antagonist reduced the frequency of spontaneous currents. Depletion of endogenous glycine by enzymatic breakdown with a bacterial glycine oxidase had little effect on glutamate release, whereas d-serine depletion with a yeast d-amino acid oxidase significantly reduced glutamate release, suggesting that d-serine is the endogenous agonist. Finally, the effects of d-serine depletion were mimicked by compromising astroglial cell function, and this was rescued by exogenous d-serine, indicating that astroglial cells are the provider of the d-serine that tonically activates the presynaptic NMDA receptor. We discuss the significance of these observations for the aetiology of epilepsy and possible targeting of the presynaptic NMDA receptor in anticonvulsant therapy. © 2014 Elsevier Ltd. All rights reserved.

Relevance: 10.00%

Abstract:

The entorhinal cortex (EC) controls hippocampal input and output, playing major roles in memory and spatial navigation. Different layers of the EC subserve different functions and a number of studies have compared properties of neurones across layers. We have studied synaptic inhibition and excitation in EC neurones, and we have previously compared spontaneous synaptic release of glutamate and GABA using patch clamp recordings of synaptic currents in principal neurones of layers II (L2) and V (L5). Here, we add comparative studies in layer III (L3). Such studies essentially look at neuronal activity from a presynaptic viewpoint. To correlate this with the postsynaptic consequences of spontaneous transmitter release, we have determined global postsynaptic conductances mediated by the two transmitters, using a method to estimate conductances from membrane potential fluctuations. We have previously presented some of this data for L3 and now extend to L2 and L5. Inhibition dominates excitation in all layers but the ratio follows a clear rank order (highest to lowest) of L2>L3>L5. The variance of the background conductances was markedly higher for excitation and inhibition in L2 compared to L3 or L5. We also show that induction of synchronized network epileptiform activity by blockade of GABA inhibition reveals a relative reluctance of L2 to participate in such activity. This was associated with maintenance of a dominant background inhibition in L2, whereas in L3 and L5 the absolute level of inhibition fell below that of excitation, coincident with the appearance of synchronized discharges. Further experiments identified potential roles for competition for bicuculline by ambient GABA at the GABAA receptor, and strychnine-sensitive glycine receptors in residual inhibition in L2. We discuss our results in terms of control of excitability in neuronal subpopulations of EC neurones and what these may suggest for their functional roles. © 2014 Greenhill et al.

Relevance: 10.00%

Abstract:

Networked Learning, e-Learning and Technology Enhanced Learning have each been defined in different ways, as people's understanding about technology in education has developed. Yet each could also be considered as a terminology competing for a contested conceptual space. Theoretically this can be a ‘fertile trans-disciplinary ground for represented disciplines to affect and potentially be re-orientated by others’ (Parchoma and Keefer, 2012), as differing perspectives on terminology and subject disciplines yield new understandings. Yet when used in government policy texts to describe connections between humans, learning and technology, terms tend to become fixed in less fertile positions linguistically. A deceptively spacious policy discourse that suggests people are free to make choices conceals an economically-based assumption that implementing new technologies, in themselves, determines learning. Yet it actually narrows choices open to people as one route is repeatedly in the foreground and humans are not visibly involved in it. An impression that the effective use of technology for endless improvement is inevitable cuts off critical social interactions and new knowledge for multiple understandings of technology in people's lives. This paper explores some findings from a corpus-based Critical Discourse Analysis of UK policy for educational technology during the last 15 years, to help to illuminate the choices made. This is important when through political economy, hierarchical or dominant neoliberal logic promotes a single ‘universal model’ of technology in education, without reference to a wider social context (Rustin, 2013). Discourse matters, because it can ‘mould identities’ (Massey, 2013) in narrow, objective economically-based terms which 'colonise discourses of democracy and student-centredness' (Greener and Perriton, 2005:67). 
This undermines subjective social, political, material and relational (Jones, 2012: 3) contexts for those learning when humans are omitted. Critically confronting these structures is not considered a negative activity. Whilst deterministic discourse for educational technology may leave people unconsciously restricted, I argue that, through a close analysis, it offers a deceptively spacious theoretical tool for debate about the wider social and economic context of educational technology. Methodologically it provides insights about ways technology, language and learning intersect across disciplinary borders (Giroux, 1992), as powerful, mutually constitutive elements, ever-present in networked learning situations. In sharing a replicable approach for linguistic analysis of policy discourse I hope to contribute to visions others have for a broader theoretical underpinning for educational technology, as a developing field of networked knowledge and research (Conole and Oliver, 2002; Andrews, 2011).

Relevance: 10.00%

Abstract:

Over the course of the last twenty years there has been a growing academic interest in performance management, particularly in respect of the evolution of new techniques and their resulting impact. One important theoretical development has been the emergence of multidimensional performance measurement models that are potentially applicable within the public sector. Empirically, academic researchers are increasingly supporting the use of such models as a way of improving public sector management and the effectiveness of service provision (Mayston, 1985; Pollitt, 1986; Bates and Brignall, 1993; and Massey, 1999). This paper seeks to add to the literature by using both theoretical and empirical evidence to argue that CPA, the external inspection tool used by the Audit Commission to evaluate local authority performance management, is a version of the Balanced Scorecard which, when adapted for internal use, may have beneficial effects. After demonstrating the parallels between the CPA framework and Kaplan and Norton's public sector Balanced Scorecard (BSC), we use a case study of the BSC based performance management system in Hertfordshire County Council to demonstrate the empirical linkages between a local scorecard and CPA. We conclude that CPA is based upon the BSC and has the potential to serve as a springboard for the evolution of local authority performance management systems.