26 results for Advent.

in the Aston University Research Archive


Relevance: 10.00%

Abstract:

Adrenomedullin (AM), a potent vasoactive peptide, is elevated in certain disease states such as sepsis. Its role as a physiologically relevant peptide has been confirmed with the advent of the homozygous lethal AM peptide knockout mouse. So far, there have been few and conflicting studies which examine the regulatory role of AM at the receptor level. In this article, we discuss the few studies that have been presented on the desensitisation of AM receptors and also present novel data on the desensitisation of endogenous AM receptors in Rat-2 fibroblasts. © 2003 Elsevier Science B.V. All rights reserved.

Relevance: 10.00%

Abstract:

With the advent of globalisation, companies all around the world must improve their performance in order to survive. The threats come from everywhere and in different forms, such as low-cost products, high-quality products, new technologies, and new products. Companies in different countries are applying various continuous improvement techniques and quality criteria items to strive for excellence. Continuous improvement techniques enable companies to improve their operations; companies are therefore using techniques such as TQM, Kaizen, Six Sigma and Lean Manufacturing, and quality award criteria items such as Customer Focus, Human Resources, Information & Analysis, and Process Management. The purpose of this paper is to compare the use of these techniques and criteria items in two countries, Mexico and the United Kingdom, which differ in culture and industrial structure. In terms of continuous improvement tools and techniques, Mexico formally started to address continuous improvement by creating its National Quality Award soon after the Americans launched the Malcolm Baldrige National Quality Award; the United Kingdom formally started by using the European Quality Award (EQA), later modified and renamed the EFQM Excellence Model. The methodology used in this study was to undertake a literature review of the subject matter and to examine some general applications around the world. A questionnaire survey was then designed and undertaken using the same scale, about the same sample size, and approximately the same industrial sector in the two countries. The survey presents a brief definition of each of the constructs to facilitate understanding of the questions. The analysis of the data was then conducted with the assistance of a statistical software package. The survey results indicate both similarities and differences in the strengths and weaknesses of the companies in the two countries. One outcome of the analysis is that it enables the companies to use the results to benchmark themselves and thus to act to reinforce their strengths and reduce their weaknesses.

Relevance: 10.00%

Abstract:

The advent of Internet banking and phone banking is changing the role of bank branches from a predominantly transaction-based one to a sales-oriented role. This paper reports on an assessment of the branches of a Portuguese bank in terms of their performance in their new roles in three different areas: their efficiency in fostering the use of new transaction channels, their efficiency in increasing sales and their customer base, and their efficiency in generating profits. Service quality is also a major issue in service organisations such as bank branches, and therefore we analyse the way this dimension of performance has been accounted for in the literature and take it into account in our empirical application. We have used data envelopment analysis (DEA) for the different performance assessments, but we depart from traditional DEA models in some cases. Performance comparisons on each dimension allowed us to identify benchmark bank branches as well as problematic bank branches. In addition, we found positive links between operational and profit efficiency and also between transactional and operational efficiency. Service quality is positively related to operational and profit efficiency. © 2006 Elsevier B.V. All rights reserved.
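
The paper's DEA models are not reproduced in this abstract. As a rough, hedged illustration of the kind of model involved, the sketch below solves a standard input-oriented, constant-returns (CCR) DEA programme with SciPy's linear-programming routine; the branch data, the input and output choices, and the function names are invented for the example and are not the paper's.

```python
# Minimal sketch of an input-oriented, constant-returns (CCR) DEA model,
# solved as a linear programme with SciPy. The branch data below are
# invented for illustration only; they are not the paper's data.
import numpy as np
from scipy.optimize import linprog

# rows = branches; inputs: staff, operating cost; outputs: new accounts, transactions
X = np.array([[5.0, 120.0], [8.0, 160.0], [4.0, 100.0], [9.0, 210.0]])   # inputs
Y = np.array([[300.0, 20.0], [420.0, 25.0], [280.0, 22.0], [500.0, 30.0]])  # outputs

def ccr_efficiency(o: int) -> float:
    """Efficiency of branch o: min theta s.t. some non-negative combination of
    all branches uses at most theta * o's inputs and produces at least o's outputs."""
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    c = np.concatenate(([1.0], np.zeros(n)))           # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):                                  # sum_j lam_j x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                                  # sum_j lam_j y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.x[0]

for o in range(X.shape[0]):
    print(f"branch {o}: efficiency = {ccr_efficiency(o):.3f}")
```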

Relevance: 10.00%

Abstract:

What is the nature of our current societies? Do we see a clash of civilizations, or the end of history? The advent of globalization, or the birth of the network society? Are we witnessing the emergence of a risk society, or the advent of the knowledge society? More fundamentally, is ‘society’ an ideological construct that should be abandoned? The word ‘society’ came into English from the Latin ‘societas’ via the Old French ‘société’; in the sense of a system adopted by a group of co-existing individuals for mutually beneficial purposes, it can be traced back at least to the mid-sixteenth century. By the Age of Enlightenment, ‘society’ was increasingly used in intellectual discourse to characterize human relations, often in contrast to notions of ‘the state’. During the nineteenth century, the concept was subject to highly elaborate treatment in various intellectual fields, such as political economy, philosophy, and legal thought; and ‘society’ continues to be a central conceptual tool, not only for sociology, but also for many other social-science disciplines, such as anthropology, economics, political science, and law. The notion resonates beyond the social sciences into the humanities; it is a fundamental concept, like nature, the universe, or the economy. Moreover, ‘society’ remains a highly contested concept, as was demonstrated, for example, by the controversy surrounding the former British prime minister Margaret Thatcher’s pithy assertion of the neoliberal economic wisdom that ‘there is no such thing as society’ (Woman’s Own, 31 October 1987); and by the term’s rehabilitation at the turn of the twenty-first century, not least with the ascendancy of the notion of ‘civil society’. This four-volume collection, a new title in the Routledge Critical Concepts in Sociology series, brings together both canonical works and the best cutting-edge research to document the intellectual origins and development of what remains a key framework within which contemporary work in the social sciences in general, and sociology in particular, proceeds. Edited by Reiner Grundmann and Nico Stehr, two leading scholars in the field, this Routledge Major Work makes available the most useful, important and representative treatments of the subject matter, and helps to make sense of the great variety of perspectives and approaches through which social scientists and other thinkers have understood, and continue to understand, society. Fully indexed and with a comprehensive introduction newly written by the editors, which places the collected material in its historical and intellectual context, Society is an essential reference work, destined to be valued by scholars and students as a vital research resource.

Relevance: 10.00%

Abstract:

Synaptic plasticity is the dynamic regulation of the strength of synaptic communication between nerve cells. It is central to neuronal development as well as to experience-dependent remodeling of the adult nervous system, as occurs during memory formation. Aberrant forms of synaptic plasticity also accompany a variety of neurological and psychiatric diseases, and unraveling the biological basis of synaptic plasticity has been a major goal in neurobiology research. The biochemical and structural mechanisms underlying different forms of synaptic plasticity are complex, involving multiple signaling cascades, reconfigurations of structural proteins and the trafficking of synaptic proteins. As such, proteomics should be a valuable tool in dissecting the molecular events underlying normal and disease-related forms of plasticity. In practice, however, progress in this area has been disappointingly slow. We discuss the particular challenges associated with proteomic interrogation of synaptic plasticity processes and outline ways in which we believe proteomics may advance the field over the next few years. We pay particular attention to technical advances being made in small-sample proteomics and the advent of proteomic imaging in studying brain plasticity.

Relevance: 10.00%

Abstract:

The advent of the Integrated Services Digital Network (ISDN) led to the standardisation of the first video codecs for interpersonal video communications, followed closely by the development of standards for the compression, storage and distribution of digital video in the PC environment, mainly targeted at CD-ROM storage. At the same time, the second-generation digital wireless networks, and the third-generation networks being developed, have enough bandwidth to support digital video services. The radio propagation medium is a difficult environment in which to deploy low bit error rate, real-time services such as video. The video coding standards designed for ISDN and storage applications were targeted at bit error rates orders of magnitude lower than those typically experienced on wireless networks. This thesis is concerned with the transmission of digital, compressed video over wireless networks. It investigates the behaviour of motion-compensated, hybrid interframe DPCM/DCT video coding algorithms, which form the basis of current coding standards, in the presence of the high bit error rates commonly found on digital wireless networks. A group of video codecs, based on the ITU-T H.261 standard, is developed which is robust to the burst errors experienced on radio channels. The radio link is simulated at a low level, to generate typical error files that closely model real-world situations, in a Rayleigh fading environment perturbed by co-channel interference, and on frequency-selective channels which introduce intersymbol interference. Typical anti-multipath techniques, such as antenna diversity, are deployed to mitigate the effects of the channel. Link-layer error control techniques are also investigated.
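
The thesis simulates the radio link at a low level to produce realistic error files; the sketch below is a much-simplified, hedged illustration of the same idea, generating a bursty error pattern by passing BPSK symbols through a block-wise flat Rayleigh fading channel. The block length, SNR and detection model are illustrative assumptions, not the thesis's channel models, which also cover co-channel interference, frequency selectivity and antenna diversity.

```python
# Much-simplified sketch of generating a bursty bit-error pattern from a
# flat Rayleigh-fading BPSK link. Block fading and the SNR value are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)
n_bits, block, snr_db = 100_000, 100, 15.0      # fading assumed constant over each block
snr = 10 ** (snr_db / 10)

bits = rng.integers(0, 2, n_bits)
symbols = 1.0 - 2.0 * bits                      # BPSK: 0 -> +1, 1 -> -1

# Rayleigh envelope: one complex Gaussian gain per block, repeated over the block
n_blocks = n_bits // block
h = (rng.normal(size=n_blocks) + 1j * rng.normal(size=n_blocks)) / np.sqrt(2)
gain = np.repeat(np.abs(h), block)

noise = rng.normal(scale=np.sqrt(1 / (2 * snr)), size=n_bits)
received = gain * symbols + noise               # coherent detection of the faded signal
decided = (received < 0).astype(int)

errors = decided != bits
print(f"BER = {errors.mean():.4f}")
# the error positions (np.flatnonzero(errors)) cluster in deeply faded blocks,
# giving the burst errors that the video codecs must tolerate
```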

Relevance: 10.00%

Abstract:

The advent of personal communication systems within the last decade has depended upon the utilization of advanced digital schemes for source and channel coding and for modulation. The inherent digital nature of the communications processing has allowed the convenient incorporation of cryptographic techniques to implement security in these communications systems. There are various security requirements, of both the service provider and the mobile subscriber, which may be provided for in a personal communications system. Such security provisions include the privacy of user data, the authentication of communicating parties, the provision for data integrity, and the provision for both location confidentiality and party anonymity. This thesis is concerned with an investigation of the private-key and public-key cryptographic techniques pertinent to the security requirements of personal communication systems; an analysis of the security provisions of Second-Generation personal communication systems is also presented. Particular attention has been paid to the properties of the cryptographic protocols which have been employed in current Second-Generation systems. It has been found that certain security-related protocols implemented in the Second-Generation systems have specific weaknesses. A theoretical evaluation of these protocols has been performed using formal analysis techniques, and certain assumptions made during the development of the systems are shown to contribute to the security weaknesses. Various attack scenarios which exploit these protocol weaknesses are presented. The Fiat-Shamir zero-knowledge cryptosystem is presented as an example of how asymmetric-algorithm cryptography may be employed as part of an improved security solution. Various modifications to this cryptosystem have been evaluated, and their critical parameters are shown to be capable of being optimized to suit particular applications. The implementation of such a system using current smart card technology has been evaluated.
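
The Fiat-Shamir scheme referred to here is a standard zero-knowledge identification protocol. The toy sketch below shows a single commit-challenge-response round of it with an insecurely small modulus, purely to illustrate the structure; the parameter choices and the smart-card-oriented optimisations evaluated in the thesis are not reproduced.

```python
# Toy, single-round Fiat-Shamir identification protocol. The modulus is far
# too small to be secure; this only illustrates the protocol structure.
import secrets

p, q = 1009, 1013                 # in practice: large secret primes
n = p * q                         # public modulus
s = secrets.randbelow(n - 2) + 2  # prover's secret
v = (s * s) % n                   # public key: v = s^2 mod n

def prove_one_round() -> bool:
    # Prover commits to a random r by sending x = r^2 mod n
    r = secrets.randbelow(n - 2) + 2
    x = (r * r) % n
    # Verifier issues a one-bit challenge
    e = secrets.randbelow(2)
    # Prover answers y = r * s^e mod n
    y = (r * pow(s, e, n)) % n
    # Verifier checks y^2 == x * v^e (mod n)
    return (y * y) % n == (x * pow(v, e, n)) % n

# Repeating the round k times reduces a cheating prover's success odds to 2^-k
print(all(prove_one_round() for _ in range(20)))
```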

Relevance: 10.00%

Abstract:

The roots of the concept of cortical columns stretch far back into the history of neuroscience. The impulse to compartmentalise the cortex into functional units can be seen at work in the phrenology of the beginning of the nineteenth century. At the beginning of the next century Korbinian Brodmann and several others published treatises on cortical architectonics. Later, in the middle of that century, Lorente de Nó wrote of chains of ‘reverberatory’ neurons orthogonal to the pial surface of the cortex and called them ‘elementary units of cortical activity’. This is the first hint that a columnar organisation might exist. With the advent of microelectrode recording, first Vernon Mountcastle (1957) and then David Hubel and Torsten Wiesel provided evidence consistent with the idea that columns might constitute units of physiological activity. This idea was backed up in the 1970s by clever histochemical techniques and culminated in Hubel and Wiesel’s well-known ‘ice-cube’ model of the cortex and Szentágothai’s brilliant iconography. The cortical column can thus be seen as the terminus ad quem of several great lines of neuroscientific research: currents originating in phrenology and passing through cytoarchitectonics; currents originating in neurocytology and passing through Lorente de Nó. Famously, Huxley noted the tragedy of a beautiful hypothesis destroyed by an ugly fact. Famously, too, human visual perception is orientated toward seeing edges and demarcations when, perhaps, they are not there. Recently the concept of cortical columns has come in for the same radical criticism that undermined the architectonics of the early part of the twentieth century. Does history repeat itself? This paper reviews this history and asks that question.

Relevance: 10.00%

Abstract:

In this contribution I look at three episodes in the history of neurophysiology that bring out the complex relationship between seeing and believing. I start with Vesalius in the mid-sixteenth century, who writes that he can in no way see any cavity in nerves, even in the optic nerves. He thus questions the age-old theory (dating back to the Alexandrians in the third century BC) but, because of the overarching psychophysiology of his time, does not press his case. This conflict between observation and theory persisted for a quarter of a millennium until it was finally resolved at the beginning of the nineteenth century by the discoveries of Galvani and Volta. The second case is provided by the early history of retinal synaptology. Schultze in 1866 had represented rod spherules and bipolar dendrites in the outer plexiform layer as being separated by a (synaptic) gap, yet in his written account, because of his theoretical commitments, held them to be continuous. Cajal, later in 1892, criticized Schultze for this pusillanimity, but his own figure in La Cellule is by no means clear. It was only with the advent of electron microscopy in the mid-twentieth century that the true complexity of the junction was revealed and it was shown that both investigators were partially right. My final example comes from the Hodgkin-Huxley biophysics of the 1950s. Their theory of the action potential depended on the existence of unseen ion pores with quite complex biophysical characteristics. These were not seen until the Nobel-Prize-winning X-ray diffraction analyses of the early twenty-first century. Seeing, even at several removes, then confirmed Hodgkin and Huxley’s belief. The relation between seeing and believing is by no means straightforward.

Relevance: 10.00%

Abstract:

Since the advent of High Level Programming Languages (HLPLs) in the early 1950s, researchers have sought ways to automate the construction of HLPL compilers. To this end a variety of Translator Writing Tools (TWTs) have been developed in the last three decades. However, only a very few of these tools have gained significant commercial acceptance. This thesis re-examines traditional compiler construction techniques, along with a number of previous TWTs, and proposes a new, improved tool for automated compiler construction called the Aston Compiler Constructor (ACC). This new tool allows the specification of complete compilation systems using a high-level, compiler-oriented specification notation called the Compiler Construction Language (CCL). This specification notation is based on a modern variant of Backus-Naur Form (BNF) and an extended variant of Attribute Grammars (AGs). The implementation and processing of the CCL is discussed, along with an extensive CCL example. The CCL is shown to have extensive expressive power, to be convenient to use and highly readable, and thus to be a superior alternative to earlier TWTs and to traditional compiler construction techniques. The execution performance of CCL specifications is evaluated and shown to be acceptable. A number of related areas are also addressed, including tools for the rapid construction of individual compiler components, and tools for the construction of compilation systems for multiprocessor operating systems and hardware. This latter area is expected to become of particular interest in future years due to the anticipated increased use of multiprocessor architectures.
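
CCL syntax is not shown in this abstract. As a hedged illustration of the BNF-plus-attribute-grammar idea it builds on, the sketch below pairs a tiny invented BNF-style grammar with synthesised 'value' attributes computed by a recursive-descent evaluator; the grammar, token set and function names are assumptions made for the example, not CCL.

```python
# Illustrative sketch of a BNF rule set with synthesised attributes, evaluated
# by a recursive-descent parser. The grammar below is invented for the example
# and is not CCL syntax:
#   expr ::= term { ("+" | "-") term }     expr.value synthesised from its terms
#   term ::= NUMBER                        term.value = the number's value
import re

TOKEN = re.compile(r"\s*(\d+|[+-])")

def tokenize(src: str) -> list[str]:
    pos, toks = 0, []
    while pos < len(src):
        m = TOKEN.match(src, pos)
        if not m:
            raise SyntaxError(f"bad input at {src[pos:]!r}")
        toks.append(m.group(1))
        pos = m.end()
    return toks

def parse_expr(toks: list[str]) -> int:
    value = parse_term(toks)                 # synthesised attribute flows upward
    while toks and toks[0] in "+-":
        op = toks.pop(0)
        rhs = parse_term(toks)
        value = value + rhs if op == "+" else value - rhs
    return value

def parse_term(toks: list[str]) -> int:
    return int(toks.pop(0))                  # term.value := NUMBER.value

print(parse_expr(tokenize("12 + 30 - 7")))   # -> 35
```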

Relevance: 10.00%

Abstract:

With the advent of distributed computer systems with a largely transparent user interface, new questions have arisen regarding the management of such an environment by an operating system. One fertile area of research is that of load balancing, which attempts to improve system performance by redistributing the workload submitted to the system by the users. Early work in this field concentrated on static placement of computational objects to improve performance, given prior knowledge of process behaviour. More recently this has evolved into studying dynamic load balancing with process migration, thus allowing the system to adapt to varying loads. In this thesis, we describe a simulated system which facilitates experimentation with various load balancing algorithms. The system runs under UNIX and provides functions for user processes to communicate through software ports; processes reside on simulated homogeneous processors, connected by a user-specified topology, and a mechanism is included to allow migration of a process from one processor to another. We present the results of a study of adaptive load balancing algorithms, conducted using the aforementioned simulated system, under varying conditions; these results show the relative merits of different approaches to the load balancing problem, and we analyse the trade-offs between them. Following from this study, we present further novel modifications to suggested algorithms, and show their effects on system performance.
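
The simulator and algorithms themselves are not detailed in the abstract. The sketch below is a hedged, stand-alone toy of one adaptive, threshold-based, sender-initiated load-balancing policy with delayed process migration; the thresholds, arrival rates and migration delay are invented parameters, intended only to show the kind of decision rule such studies compare.

```python
# Toy discrete-time simulation of threshold-based, sender-initiated load
# balancing with process migration. All parameters are invented; the thesis's
# simulator (UNIX processes, software ports, user-defined topologies) is far richer.
import random
from collections import deque

random.seed(7)
N_PROCS, STEPS = 8, 200
THRESHOLD = 6            # a processor tries to shed work above this queue length
MIGRATION_DELAY = 1      # migrated work arrives at its destination one step later

queues = [0] * N_PROCS
in_flight = deque()      # (deliver_at_step, destination processor)

for step in range(STEPS):
    # deliver migrated processes whose transfer delay has elapsed
    while in_flight and in_flight[0][0] <= step:
        _, dest = in_flight.popleft()
        queues[dest] += 1
    # new work arrives at random processors
    for _ in range(random.randint(4, 10)):
        queues[random.randrange(N_PROCS)] += 1
    # sender-initiated policy: an overloaded node migrates one process to the
    # least-loaded node (assumes global load information for simplicity)
    for src in range(N_PROCS):
        if queues[src] > THRESHOLD:
            dest = min(range(N_PROCS), key=lambda j: queues[j])
            if queues[dest] + 1 < queues[src]:
                queues[src] -= 1
                in_flight.append((step + MIGRATION_DELAY, dest))
    # each processor completes one unit of work per step
    queues = [max(0, q - 1) for q in queues]

print("final queue lengths:", queues)
print("max imbalance:", max(queues) - min(queues))
```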

Relevance: 10.00%

Abstract:

The central argument of this thesis is that the nature and purpose of corporate reporting has changed over time, the corporate report becoming a more outward-looking and forward-looking document designed to promote the company and its performance to a wide range of shareholders, rather than merely to report to its owners upon past performance. It is argued that the discourse of environmental accounting and reporting is one driver for this change, but that this discourse has been set up as conflicting with the discourse of traditional accounting and performance measurement. The effect of this opposition between the discourses is that the two have been interpreted as different and incompatible dimensions of performance, with good performance along one dimension only being achievable through a sacrifice of performance along the other. Thus a perceived dialectic in performance is believed to exist. One of the principal purposes of this thesis is to explore this perceived dialectic and, through analysis, to show that it does not exist and that there is no incompatibility. This exploration is based upon an investigation of the inherent inconsistencies in such corporate reports, and the analysis makes use of both a statistical analysis and a semiotic analysis of corporate reports and of the reported performance of companies along these dimensions. The development of a semiology of corporate reporting is thus one of the significant outcomes of this thesis. A further outcome is a consideration of the implications of the analysis for corporate performance and its measurement. The thesis concludes with a consideration of the way in which the advent of electronic reporting may affect the ability of organisations to maintain the dialectic, and of the implications for corporate reporting.

Relevance: 10.00%

Abstract:

This thesis presents a number of methodological developments that were raised by a real-life application to measuring the efficiency of bank branches. The advent of internet banking and phone banking is changing the role of bank branches from a predominantly transaction-based one to a sales-oriented role. This fact requires the development of new forms of assessing and comparing branches of a bank. In addition, performance assessment models must also take into account the fact that bank branches are service and for-profit organisations, for which providing adequate service quality as well as being profitable are crucial objectives. This study analyses bank branches' performance in their new roles in three different areas: their effectiveness in fostering the use of new transaction channels such as the internet and the telephone (transactional efficiency); their effectiveness in increasing sales and their customer base (operational efficiency); and their effectiveness in generating profits without compromising the quality of service (profit efficiency). The chosen methodology for the overall analysis is Data Envelopment Analysis (DEA). The application attempted here required some adaptations to existing DEA models, and indeed some new models, so that certain particularities of our data could be handled. These concern the development of models that can account for negative data, the development of models to measure profit efficiency, and the development of models that yield production units with targets nearer to their observed levels than the targets yielded by traditional DEA models. The application of the developed models to a sample of Portuguese bank branches allowed their classification according to the three performance dimensions (transactional, operational and profit efficiency). It also provided useful insights to bank managers regarding how bank branches compare with one another in terms of their performance, and how, in general, the three performance dimensions are related to one another.
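
One of the methodological issues mentioned above is handling negative data (for example, branches with negative profit). The sketch below shows a directional-distance DEA formulation under variable returns to scale, with the improvement direction set to each unit's range of possible improvement, which tolerates negative values and yields targets expressed relative to observed levels; it is offered in the spirit of, not as a reproduction of, the models developed in the thesis, and the data are invented.

```python
# Hedged sketch of a directional-distance DEA formulation (variable returns to
# scale) that tolerates negative data, with the improvement direction set to
# each unit's range of possible improvement. Illustrative only; the profit
# figures below are invented and the thesis's exact models are not reproduced.
import numpy as np
from scipy.optimize import linprog

X = np.array([[3.0], [5.0], [4.0], [6.0]])            # input: staff
Y = np.array([[120.0], [-40.0], [60.0], [200.0]])     # output: profit (can be negative)

def directional_inefficiency(o: int) -> float:
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    gx = X[o] - X.min(axis=0)                          # room to cut each input
    gy = Y.max(axis=0) - Y[o]                          # room to raise each output
    c = np.concatenate(([-1.0], np.zeros(n)))          # maximise beta
    A_ub, b_ub = [], []
    for i in range(m):                                 # sum lam_j x_ij + beta*gx_i <= x_io
        A_ub.append(np.concatenate(([gx[i]], X[:, i])))
        b_ub.append(X[o, i])
    for r in range(s):                                 # sum lam_j y_rj - beta*gy_r >= y_ro
        A_ub.append(np.concatenate(([gy[r]], -Y[:, r])))
        b_ub.append(-Y[o, r])
    A_eq = [np.concatenate(([0.0], np.ones(n)))]       # VRS: lambdas sum to one
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=np.array(A_eq), b_eq=[1.0],
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.x[0]                                    # 0 = efficient; targets are
                                                       # x_o - beta*gx and y_o + beta*gy

for o in range(X.shape[0]):
    print(f"branch {o}: directional inefficiency = {directional_inefficiency(o):.3f}")
```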

Relevance: 10.00%

Abstract:

The advent of DNA vaccines has heralded a new technology allowing the design and elicitation of immune responses more adequate for a wider range of pathogens. The formulation of these vaccines into the desired dosage forms extends their capability in terms of stability, routes of administration and efficacy. This thesis describes an investigation into the fabrication of plasmid DNA, the active principle of DNA vaccines, into microspheres, based on the tenet of an increased cellular uptake of microparticulate matter by phagocytic cells. The formulation of plasmid DNA into microspheres using two methods is presented: the double emulsion solvent evaporation method and a spray-drying method. The former approach involves formation of a double emulsion by homogenisation. This method produced microspheres of uniform size and smooth morphology, but had a detrimental effect on the formulated DNA. The spray-drying method resulted in microspheres with improved preservation of DNA stability. The use of polyethylenimine (PEI) and stearylamine (SA) as agents in the microspheric formulation of plasmid DNA is a novel approach to DNA vaccine design. Using these molecules as model positively charged agents, their influence on the characteristics of the microspheric formulations was investigated. PEI improved the entrapment efficiency of the plasmid DNA in microspheres, and had minimal effect on the surface charge, morphology or size distribution of the formulations. Stearylamine effected an increase in the entrapment efficiency and stability of the plasmid DNA, and its effect on the microsphere morphology was dependent on the method of preparation. The differences in the effects of the two molecules on microsphere formulations may be attributable to their dissimilar physico-chemical properties: PEI is water-soluble and highly branched, while SA is hydrophobic and amphipathic. The positive charge of both molecules is imparted by amine functional groups. Preliminary data on the in vivo application of a formulated DNA vaccine, using a hepatitis B plasmid, showed superior humoral responses to the formulated antigen compared with the free (unformulated) antigen.

Relevance: 10.00%

Abstract:

The introduction of micro-electronic based technology to the workplace has had a far-reaching and widespread effect on the numbers and content of jobs. The importance of the implications of new technology was recognised by the trade unions, leading to a plethora of advice and literature in the late 1970s and early 1980s, notably the TUC 'Technology and Employment' report. However, studies of the union response have consistently found an overall lack of influence by unions in the introduction of technology. Whilst the advent of new technology has coincided with an industrial relations climate of unprecedented post-war hostility to union activity, there are also structural weaknesses within unions in coming to terms with the process of technological change. In particular, a lack of suitable technological expertise was identified. Addressing this perceived weakness of the union response, this thesis is the outcome of a collaborative project between a national union and an academic institution. The thesis is based on detailed case studies concerning technology bargaining in the Civil Service and the response of the Civil and Public Services Association (CPSA), the union that represents lower-grade white-collar civil servants. It is demonstrated that the provision of expertise to union negotiators is insufficient on its own to extend union influence, and that for unions to come to terms with technology effectively and to influence its development requires a re-assessment across all spheres of union activity. It is suggested that this has repercussions not only for the internal organisation and quality of union policy formation and for the extent, form and nature of collective bargaining with employer representatives, but also for the relationship with consumer and interest groups outside the traditional collective bargaining forum. Three policy options are developed in the thesis, with the 'adversarial' and 'co-operative' options representing the more traditional reactive and passive forms of involvement. These are contrasted with an 'independent participative' form of involvement, which was a 'pro-active' policy option and utilised the expertise of the author in the CPSA's response to technological change.