11 results for Advent of christianity

in Aston University Research Archive


Relevance:

90.00%

Publisher:

Abstract:

Adrenomedullin (AM), a potent vasoactive peptide, is elevated in certain disease states such as sepsis. Its role as a physiologically relevant peptide has been confirmed with the advent of the homozygous lethal AM peptide knockout mouse. So far, there have been few and conflicting studies which examine the regulatory role of AM at the receptor level. In this article, we discuss the few studies that have been presented on the desensitisation of AM receptors and also present novel data on the desensitisation of endogenous AM receptors in Rat-2 fibroblasts. © 2003 Elsevier Science B.V. All rights reserved.

Relevance:

90.00%

Publisher:

Abstract:

With the advent of globalisation, companies all around the world must improve their performance in order to survive. Threats come from everywhere and in different forms, such as low-cost products, high-quality products, new technologies, and new products. Companies in different countries use various techniques and quality criteria to strive for excellence, and continuous improvement techniques enable them to improve their operations. Companies therefore use techniques such as TQM, Kaizen, Six Sigma, and Lean Manufacturing, and quality award criteria such as Customer Focus, Human Resources, Information & Analysis, and Process Management. The purpose of this paper is to compare the use of these techniques and criteria in two countries, Mexico and the United Kingdom, which differ in culture and industrial structure. In terms of continuous improvement tools and techniques, Mexico formally began by creating its National Quality Award soon after the Americans established the Malcolm Baldrige National Quality Award; the United Kingdom formally began with the European Quality Award (EQA), since modified and renamed the EFQM Excellence Model. The methodology of this study was to review the literature on the subject and some general applications around the world. A questionnaire survey was then designed and administered in the two countries using the same scale, about the same sample size, and about the same industrial sector. The survey presents a brief definition of each construct to facilitate understanding of the questions. The data were then analysed with the assistance of a statistical software package. The survey results indicate both similarities and differences in the strengths and weaknesses of the companies in the two countries. One outcome of the analysis is that it enables companies to benchmark themselves against the results and thus act to reinforce their strengths and reduce their weaknesses.

Relevance:

90.00%

Publisher:

Abstract:

The advent of Internet banking and phone banking is changing the role of bank branches from a predominantly transaction-based one to a sales-oriented role. This paper reports on an assessment of the branches of a Portuguese bank in terms of their performance in their new roles in three different areas: their efficiency in fostering the use of new transaction channels, their efficiency in increasing sales and their customer base, and their efficiency in generating profits. Service quality is also a major issue in service organisations like bank branches, and therefore we analyse the way this dimension of performance has been accounted for in the literature and take it into account in our empirical application. We have used data envelopment analysis (DEA) for the different performance assessments, but we depart from traditional DEA models in some cases. Performance comparisons on each dimension allowed us to identify benchmark bank branches and also problematic bank branches. In addition, we found positive links between operational and profit efficiency and also between transactional and operational efficiency. Service quality is positively related with operational and profit efficiency. © 2006 Elsevier B.V. All rights reserved.
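The abstract does not give the paper's extended DEA formulations, but the core of any DEA assessment is a linear program solved once per unit. As a rough, hedged sketch of the standard input-oriented CCR envelopment model (a baseline, not the authors' modified models), the efficiency score of each branch can be computed with a generic LP solver; the function name and the use of SciPy are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency score for each decision-making unit.

    X: (n_units, n_inputs) matrix of inputs (e.g. staff, operating costs).
    Y: (n_units, n_outputs) matrix of outputs (e.g. sales, transactions).
    Returns an array of scores in (0, 1]; 1.0 marks a benchmark unit.
    """
    n, m = X.shape
    _, r = Y.shape
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(1 + n)
        c[0] = 1.0  # minimise the input-contraction factor theta
        # Inputs: sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # Outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((r, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([b_in, b_out]),
                      bounds=[(0, None)] * (1 + n))
        scores[o] = res.fun
    return scores
```

A branch with score 0.5, for example, could in principle produce its current outputs with half its current inputs relative to the efficient frontier formed by the benchmark branches.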

Relevance:

90.00%

Publisher:

Abstract:

Synaptic plasticity is the dynamic regulation of the strength of synaptic communication between nerve cells. It is central to neuronal development as well as to experience-dependent remodeling of the adult nervous system, as occurs during memory formation. Aberrant forms of synaptic plasticity also accompany a variety of neurological and psychiatric diseases, and unraveling the biological basis of synaptic plasticity has been a major goal in neurobiology research. The biochemical and structural mechanisms underlying different forms of synaptic plasticity are complex, involving multiple signaling cascades, reconfigurations of structural proteins and the trafficking of synaptic proteins. As such, proteomics should be a valuable tool in dissecting the molecular events underlying normal and disease-related forms of plasticity. To date, however, progress in this area has been disappointingly slow. We discuss the particular challenges associated with proteomic interrogation of synaptic plasticity processes and outline ways in which we believe proteomics may advance the field over the next few years. We pay particular attention to technical advances being made in small sample proteomics and the advent of proteomic imaging in studying brain plasticity.

Relevance:

90.00%

Publisher:

Abstract:

The roots of the concept of cortical columns stretch far back into the history of neuroscience. The impulse to compartmentalise the cortex into functional units can be seen at work in the phrenology of the beginning of the nineteenth century. At the beginning of the next century Korbinian Brodmann and several others published treatises on cortical architectonics. Later, in the middle of that century, Lorente de Nó wrote of chains of 'reverberatory' neurons orthogonal to the pial surface of the cortex and called them 'elementary units of cortical activity'. This is the first hint that a columnar organisation might exist. With the advent of microelectrode recording, first Vernon Mountcastle (1957) and then David Hubel and Torsten Wiesel provided evidence consistent with the idea that columns might constitute units of physiological activity. This idea was backed up in the 1970s by clever histochemical techniques and culminated in Hubel and Wiesel's well-known 'ice-cube' model of the cortex and Szentágothai's brilliant iconography. The cortical column can thus be seen as the terminus ad quem of several great lines of neuroscientific research: currents originating in phrenology and passing through cytoarchitectonics; currents originating in neurocytology and passing through Lorente de Nó. Famously, Huxley noted the tragedy of a beautiful hypothesis destroyed by an ugly fact. Famously, too, human visual perception is orientated toward seeing edges and demarcations when, perhaps, they are not there. Recently the concept of cortical columns has come in for the same radical criticism that undermined the architectonics of the early part of the twentieth century. Does history repeat itself? This paper reviews that history and asks the question.

Relevance:

90.00%

Publisher:

Abstract:

In this contribution I look at three episodes in the history of neurophysiology that bring out the complex relationship between seeing and believing. I start with Vesalius in the mid-sixteenth century, who writes that he can in no way see any cavity in nerves, even in the optic nerves. He thus questions the age-old theory (dating back to the Alexandrians in the third century BC) but, because of the overarching psychophysiology of his time, does not press his case. This conflict between observation and theory persisted for a quarter of a millennium until finally resolved at the beginning of the nineteenth century by the discoveries of Galvani and Volta. The second case is provided by the early history of retinal synaptology. Schultze in 1866 had represented rod spherules and bipolar dendrites in the outer plexiform layer as being separated by a (synaptic) gap, yet in his written account, because of his theoretical commitments, held them to be continuous. Cajal later, in 1892, criticised Schultze for this pusillanimity, but his own figure in La Cellule is by no means clear. It was only with the advent of electron microscopy in the mid-twentieth century that the true complexity of the junction was revealed and it was shown that both investigators were partially right. My final example comes from the Hodgkin-Huxley biophysics of the 1950s. Their theory of the action potential depended on the existence of unseen ion pores with quite complex biophysical characteristics. These were not seen until the Nobel-Prize-winning X-ray diffraction analyses of the early twenty-first century. Seeing, even at several removes, then confirmed Hodgkin and Huxley's belief. The relation between seeing and believing is by no means straightforward.

Relevance:

90.00%

Publisher:

Abstract:

With the advent of distributed computer systems with a largely transparent user interface, new questions have arisen regarding the management of such an environment by an operating system. One fertile area of research is that of load balancing, which attempts to improve system performance by redistributing the workload submitted to the system by the users. Early work in this field concentrated on static placement of computational objects to improve performance, given prior knowledge of process behaviour. More recently this has evolved into studying dynamic load balancing with process migration, thus allowing the system to adapt to varying loads. In this thesis, we describe a simulated system which facilitates experimentation with various load balancing algorithms. The system runs under UNIX and provides functions for user processes to communicate through software ports; processes reside on simulated homogeneous processors, connected by a user-specified topology, and a mechanism is included to allow migration of a process from one processor to another. We present the results of a study of adaptive load balancing algorithms, conducted using the aforementioned simulated system, under varying conditions; these results show the relative merits of different approaches to the load balancing problem, and we analyse the trade-offs between them. Following from this study, we present further novel modifications to suggested algorithms, and show their effects on system performance.
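The thesis's simulated system is not specified in the abstract beyond UNIX, software ports, and process migration, but the class of adaptive algorithms it studies can be illustrated in miniature. The following is a hedged sketch of one threshold-based policy (migrate work from the most-loaded to the least-loaded processor when the imbalance is large enough); all names, the migration heuristic, and the threshold are illustrative assumptions, not details from the thesis:

```python
class Processor:
    """A simulated homogeneous processor holding a queue of processes,
    each represented by its remaining work units."""

    def __init__(self, pid):
        self.pid = pid
        self.processes = []

    @property
    def load(self):
        return sum(self.processes)


def balance(processors, threshold=2):
    """One round of a simple adaptive policy: migrate one process from the
    most-loaded to the least-loaded processor when the load gap exceeds
    `threshold` work units. Returns True if a migration occurred."""
    busiest = max(processors, key=lambda p: p.load)
    idlest = min(processors, key=lambda p: p.load)
    if busiest.load - idlest.load > threshold and busiest.processes:
        job = max(busiest.processes)  # migrate the largest job (one heuristic)
        busiest.processes.remove(job)
        idlest.processes.append(job)
        return True
    return False


def simulate(processors, steps):
    """Each step, every processor executes one work unit of its front
    process, then the balancer runs once. Returns the final loads."""
    for _ in range(steps):
        for p in processors:
            if p.processes:
                p.processes[0] -= 1
                if p.processes[0] == 0:
                    p.processes.pop(0)
        balance(processors)
    return [p.load for p in processors]
```

Varying the threshold, the migration heuristic, and how often the balancer runs is exactly the kind of trade-off space such a simulation lets one explore: aggressive migration evens out load quickly but pays a communication cost on every move.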

Relevance:

90.00%

Publisher:

Abstract:

The central argument of this thesis is that the nature and purpose of corporate reporting has changed over time to become a more outward-looking and forward-looking document, designed to promote the company and its performance to a wide range of shareholders rather than merely to report to its owners upon past performance. It is argued that the discourse of environmental accounting and reporting is one driver of this change, but that this discourse has been set up as conflicting with the discourse of traditional accounting and performance measurement. The effect of this opposition is that the two have been interpreted as different and incompatible dimensions of performance, with good performance along one dimension achievable only through a sacrifice of performance along the other. Thus a perceived dialectic in performance is believed to exist. One of the principal purposes of this thesis is to explore this perceived dialectic and, through analysis, to show that it does not exist and that there is no incompatibility. This exploration and analysis is based upon an investigation of the inherent inconsistencies in such corporate reports, and the analysis makes use of both a statistical analysis and a semiotic analysis of corporate reports and the reported performance of companies along these dimensions. Thus the development of a semiology of corporate reporting is one of the significant outcomes of this thesis. A further outcome is a consideration of the implications of the analysis for corporate performance and its measurement. The thesis concludes with a consideration of the way in which the advent of electronic reporting may affect the ability of organisations to maintain the dialectic, and the implications for corporate reporting.

Relevance:

90.00%

Publisher:

Abstract:

The advent of DNA vaccines has heralded a new technology allowing the design and elicitation of immune responses more adequate for a wider range of pathogens. The formulation of these vaccines into the desired dosage forms extends their capability in terms of stability, routes of administration and efficacy. This thesis describes an investigation into the fabrication of plasmid DNA, the active principle of DNA vaccines, into microspheres, based on the tenet of an increased cellular uptake of microparticulate matter by phagocytic cells. The formulation of plasmid DNA into microspheres using two methods is presented: the double emulsion solvent evaporation method and a spray-drying method. The former approach involves formation of a double emulsion by homogenisation; this method produced microspheres of uniform size and smooth morphology, but had a detrimental effect on the formulated DNA. The spray-drying method resulted in microspheres with improved preservation of DNA stability. The use of polyethylenimine (PEI) and stearylamine (SA) as agents in the microspheric formulation of plasmid DNA is a novel approach to DNA vaccine design. Using these molecules as model positively-charged agents, their influence on the characteristics of the microspheric formulations was investigated. PEI improved the entrapment efficiency of the plasmid DNA in microspheres, and had minimal effect on the surface charge, morphology and size distribution of the formulations. Stearylamine effected an increase in the entrapment efficiency and stability of the plasmid DNA, and its effect on the microsphere morphology was dependent on the method of preparation. The differences in the effects of the two molecules on microsphere formulations may be attributable to their dissimilar physico-chemical properties: PEI is water-soluble and highly branched, while SA is hydrophobic and amphipathic. The positive charge of both molecules is imparted by amine functional groups. Preliminary data on the in vivo application of a formulated DNA vaccine, using a hepatitis B plasmid, showed superior humoral responses to the formulated antigen compared with the free (unformulated) antigen.

Relevance:

90.00%

Publisher:

Abstract:

There is a great deal of literature about the initial stages of innovative design, the process whereby a completely new product is conceived, invented and developed. In industry, however, the continuing success of a company is more often achieved by improving or developing existing designs to maintain their marketability. Unfortunately, this process of design by evolution is less well documented. This thesis reports the way in which this process was improved for the sponsoring company. The improvements were achieved by implementing a new form of computer aided design (C.A.D.) system. The advent of this system enabled the company both to shorten the design and development time and to review the principles underlying the existing design procedures. C.A.D. was a new venture for the company, and care had to be taken to ensure that the new procedures were compatible with the existing design office environment; in particular, they had to be acceptable to the design office staff. The C.A.D. system that was produced guides the designer from the draft specification to the first prototype layout. The computer presents the consequences of the designer's decisions clearly and fully, often by producing charts and sketches. The C.A.D. system and the necessary peripheral facilities were implemented, monitored and maintained. The system structure was left sufficiently flexible for maintenance to be undertaken quickly and effectively. The problems encountered during implementation are well documented in this thesis.

Relevance:

90.00%

Publisher:

Abstract:

Over the years several articles have tracked the impact of technology on various aspects of the sales domain. However, the advent of social media and of technologies related to social media has gone largely unnoticed in the literature. This article first briefly reviews changing aspects of technology within the sales environment, leading to the identification of social media as a dominant new selling tool. A qualitative approach (focus groups) is employed to explore the breadth of current technology usage by sales managers and salespeople. Analysis of the data, collected in the United States and the United Kingdom, reveals six major themes: connectivity, relationships, selling tools, generational, global, and sales/marketing interface. Results provide evidence of a revolution in the buyer-seller relationship that includes some unanticipated consequences both for sales organization performance and for needed future research contributions.