16 results for Decade of the sixties

in Aston University Research Archive


Relevance: 100.00%

Abstract:

The Models@run.time (MRT) workshop series offers a discussion forum for the growing need to leverage modeling techniques at runtime in the software of the future. MRT has become a mature research topic, reflected, for example, in dedicated conference sessions covering MRT approaches only. The target venues of the workshop's audience have accordingly shifted from workshops to conferences, so new topics in the area of MRT that are not yet mature enough for conferences need to be identified. Consequently, the main goal of this edition was to reflect on the past decade of the workshop's history and to identify new future directions for the workshop.

Relevance: 100.00%

Abstract:

The article discusses various reports published within the issue, including one by Movando et al. concerning marketing measures and constructs, and an article by Grandcolas et al. concerning research methodologies and the future of web-based survey research.

Relevance: 100.00%

Abstract:

Editorial

Relevance: 100.00%

Abstract:

On July 17, 1990, President George Bush issued “Proclamation #6158”, which boldly declared that the following ten years would be called the “Decade of the Brain” (Bush, 1990). Accordingly, the research mandates of all US federal biomedical institutions were redirected towards the study of the brain in general and cognitive neuroscience specifically. In 2008, one of the greatest legacies of this “Decade of the Brain” is the impressive array of techniques that can be used to study cortical activity. We now stand at a juncture where cognitive function can be mapped in the time, space and frequency domains, as and when such activity occurs. These advanced techniques have led to discoveries in many fields of research and clinical science, including psychology and psychiatry. Unfortunately, neuroscientific techniques have yet to be enthusiastically adopted by the social sciences. Market researchers, as specialized social scientists, have an unparalleled opportunity to adopt cognitive neuroscientific techniques, significantly redefine the field and possibly even cause substantial dislocations in business models. Following from this is a significant opportunity for more commercially oriented researchers to employ such techniques in their own offerings. This report examines the feasibility of these techniques.

Relevance: 100.00%

Abstract:

Garner seeks to explain the absence of far-right political formations in the history of the Republic of Ireland, especially in relation to immigration. He argues that the ‘mainstream’ nationalist parties have implemented a racialized governance of Ireland via the issue of citizenship (in the referendum of 2004). While hegemonic ideas on the racial purity of indigenous populations and the highly ambivalent attitudes and policies on immigration pursued over the last decade are characteristic of a broader European trend, this has not, in the Republic, been accompanied by meaningful far-right political mobilization. Ireland has frequently been seen as sui generis in political terms, and indeed emerges in some ways as a counter-case: increasing hostility towards Others has been identified in the midst of rapid economic growth and political stability. A variety of issues related to the country’s political development have given rise to an especially small left-wing vote, a nationalist centre ground and long-lasting domination by a single populist party, Fianna Fáil. This party has been partnered in government since 1997 by a free-market party, the Progressive Democrats, who have contributed to Ireland’s movement towards neo-liberal policies and a highly functional approach to immigration. The transition from country of emigration to country of immigration has thus taken place against an ideological backdrop in which the imperatives of labour demand and consolidating domestic support for reform have made an uneasy match, resulting in the racialization of Irishness. The state has, however, amended the Constitution in order to qualify jus soli citizenship entitlement in the case of particular categories of people: those whose parents are not Irish nationals. The significant stakes of these changes are analysed in the context of state responses to Éire’s transition to a country of immigration, and the role of nationalist-populism in the country’s political culture.

Relevance: 100.00%

Abstract:

The decade since 1979 has seen the most rapid introduction of microelectronic technology in the workplace. In particular, the scope offered for the application of this new technology to the area of white collar work has meant that it is a sector where trade unions have been confronted with major challenges. However, the application of this technology has also provided trade unions with opportunities for exerting influence to reshape traditional attitudes to both industrial relations and the nature of work. Recent academic research on the trade union response to the introduction of new technology at the workplace suggests that, despite the resources and apparent sophistication of modern trade unions, they have not in general been able to take advantage of the opportunities offered during this period of radical technological change, the argument being that this is due both to structural weaknesses and the inappropriateness of the system of collective bargaining where new technology issues are concerned. Despite the significance of the public sector in employment terms, research into the response of public sector white collar trade unions to technological change has been fairly limited. This thesis sets out the approach of the National and Local Government Officers Association (NALGO), the largest solely white collar union in the world, with over three-quarters of a million members employed in a wide range of public service industries. The thesis examines NALGO's response at national level and, through detailed case studies, at local level in respect of Local Government and Water Industry NALGO members. The response is then evaluated and conclusions drawn in terms of a framework based upon an assessment of the key factors relevant in judging the ability of NALGO to respond effectively to the challenges brought about by the technological revolution of the last ten years.

Relevance: 100.00%

Abstract:

During the last decade the use of randomised gene libraries has had an enormous impact in the field of protein engineering. Such libraries comprise many variations of a single gene in which codon replacements are used to substitute key residues of the encoded protein. The expression of such libraries generates a library of randomised proteins which can subsequently be screened for desired or novel activities. Randomisation in this fashion has predominantly been achieved by the inclusion of the codons NNN or NNG/T, in which N represents any of the four bases A, C, G or T. The use of these codons, however, necessitates the cloning of redundant codons at each position of randomisation, in addition to those required to encode the twenty possible amino acid substitutions. As degenerate codons must be included at each position of randomisation, this results in a progressive loss of randomisation efficiency as the number of randomised positions is increased. The ratio of genes to proteins in these libraries rises exponentially with each position of randomisation, creating large gene libraries which generate protein libraries of limited diversity upon expression. In addition to these problems of library size, the cloning of redundant codons also results in the generation of protein libraries in which substituted amino acids are unevenly represented. As several of the randomised codons may encode the same amino acid (for example serine, which is encoded six times by the codon NNN), an inherent bias may be introduced into the resulting protein library during the randomisation procedure. The work outlined here describes the development of a novel randomisation technique aimed at eliminating codon redundancy from randomised gene libraries, thus addressing the problems of library size and bias associated with the cloning of redundant codons.
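To make the arithmetic behind this concrete, the following is a minimal Python sketch (an illustration, not part of the thesis) that counts codon degeneracy under NNN and NNG/T randomisation using the standard genetic code, and shows how the gene-to-protein ratio grows with each randomised position.

```python
# Illustrative sketch only (not from the thesis): the codon-redundancy
# arithmetic behind NNN and NNG/T randomisation.
from collections import Counter
from itertools import product

BASES = "TCAG"  # codon order used by the compact table below
# Standard genetic code: 64 codons in TCAG order, '*' marks a stop codon.
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODE = {"".join(c): a for c, a in zip(product(BASES, repeat=3), AA)}

def degeneracy(codons):
    """Count how many codons in the given set encode each amino acid."""
    return Counter(CODE[c] for c in codons)

nnn = ["".join(c) for c in product(BASES, repeat=3)]                      # 64 codons
nng_t = ["".join(c) + b for c in product(BASES, repeat=2) for b in "GT"]  # 32 codons

print(degeneracy(nnn)["S"])    # 6 -> serine is encoded six times by NNN
print(degeneracy(nng_t)["S"])  # 3 -> bias is reduced, but not eliminated

for n in (1, 2, 3, 5):
    # genes-to-proteins ratio: 64**n codon combinations vs 20**n protein variants
    print(n, 64 ** n / 20 ** n)
```

Under NNN, 64^n gene variants are needed to cover 20^n protein variants, so randomising five positions already produces a gene library more than 300 times larger than the protein diversity it encodes.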

Relevance: 100.00%

Abstract:

In the UK, Open Learning has been used in industrial training for at least the last decade. Trainers and Open Learning practitioners have been concerned about the quality of the products and services being delivered. The argument put forward in this thesis is that there is ambiguity amongst industrialists over the meanings of ‘Open Learning’ and ‘Quality in Open Learning’. For clarity, a new definition of Open Learning is proposed which challenges the traditional learner-centred approach favoured by educationalists. It introduces the concept that there are benefits afforded to the trainer/employer/teacher as well as to the learner. This enables a focussed view of what quality in Open Learning really means. Having discussed these issues, a new quantitative method of evaluating Open Learning is proposed, based upon an assessment of the degree to which products comply with Parts 1 & 2 of the Open Learning Code of Practice. The vehicle for these research studies has been a commercial contract commissioned by the Training Agency for the Engineering Industry Training Board (EITB) to examine the quality of Open Learning products supplied to the engineering industry. A major part of this research has been the application of the evaluation technique to a range of 67 Open Learning products (in eight subject areas). The findings were that good quality products can be found right across the price range - as can average and poor quality ones. The study also shows quite convincingly that there are good quality products to be found at less than £50. Finally, the majority (24 out of 34) of the good quality products were text based.
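As a hedged illustration of what such a compliance-based score might look like (the actual checklist items and scoring rules used in the thesis are not given in this abstract, so everything below is hypothetical), a product could be rated by the percentage of Code of Practice criteria it satisfies:

```python
# Hypothetical sketch only: a compliance-percentage score for an
# Open Learning product against a checklist of Code of Practice items.

def compliance_score(responses):
    """Return the percentage of checklist items the product complies with.

    `responses` maps checklist item -> True (complies) / False (does not).
    """
    if not responses:
        raise ValueError("empty checklist")
    return 100.0 * sum(responses.values()) / len(responses)

# Hypothetical usage: three of four items met -> 75% compliance.
example = {
    "states learning objectives": True,
    "includes self-assessment questions": True,
    "specifies prerequisite knowledge": False,
    "provides tutor support details": True,
}
print(compliance_score(example))  # 75.0
```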

Relevance: 100.00%

Abstract:

East-West trade has grown rapidly since the sixties, stimulating a parallel expansion in the literature on the subject. An extensive review of this literature shows how: (i) most of the issues involved have at their source the distinctions between East and West in political ideology and/or economic management, and (ii) there has been a tendency to keep theoretical and practical perspectives on the subject too separate. This thesis demonstrates the importance of understanding the fundamental principles implied in the first point, and represents an attempt to bridge the gap identified in the second. A detailed study of the market for fire fighting equipment in Eastern Europe is undertaken in collaboration with a medium-sized company, Angus Fire Armour Limited. Desk research methods are combined with visits to the market to assess the potential for the company's products, and recommendations for future strategy are made. The case demonstrates the scope and limitations of various research methods for the East European market, and a model for market research relevant to all companies is developed. The case study highlights three areas largely neglected in the literature: (i) the problems of internal company adaptation to East European conditions; (ii) the division of responsibility between foreign trade organisations; and (iii) bribery and corruption in East-West trade. Further research into the second topic - through a survey of 36 UK exporters - and the third - through analysis of publicised corruption cases - confirms the representativeness of the Angus experience, and reflects on the complexity of the East European import process, which does not always function as is commonly supposed. The very complexity of the problems confronting companies reaffirms the need to appreciate the principles underlying the subject, while the detailed analysis into questions of, originally, a marketing nature reveals wider implications for East-West trade and East-West relations.

Relevance: 100.00%

Abstract:

The diagnosis and monitoring of ocular disease presents considerable clinical difficulties for two main reasons: i) the substantial physiological variation in the anatomical structure of the visual pathway, and ii) constraints due to technical limitations of diagnostic hardware. These are further confounded by difficulties in detecting early loss or change in visual function due to the masking of disease effects, for example because of a high degree of redundancy in nerve fibre number along the visual pathway. This thesis addresses these issues across three areas of study.

1. Factors influencing retinal thickness measures and their clinical interpretation. As the retina is the principal anatomical site for damage associated with visual loss, objective measures of retinal thickness and retinal nerve fibre layer (RNFL) thickness are key to the detection of pathology. This thesis investigates the ability of optical coherence tomography (OCT) to provide repeatable and reproducible measures of retinal structure at the macula and optic nerve head, and explores the normal physiological variations in retinal thickness and RNFL thickness. Principal findings were:
• Macular retinal thickness and optic nerve head measurements are repeatable and reproducible for normal subjects and diseased eyes.
• Macular and RNFL thickness around the optic nerve correlate negatively with axial length, suggesting that larger eyes have thinner retinae, potentially making them more susceptible to damage or disease.
• Foveolar retinal thickness increases with age while RNFL thickness around the optic nerve head decreases with age. Such findings should be considered during examination of the eye with suspected pathology or in long-term disease monitoring.

2. Impact of glucose control on retinal anatomy and function in diabetes. Diabetes is a major health concern in the UK and worldwide, and diabetic retinopathy is a major cause of blindness in the working population. Objective, quantitative measurements of retinal thickness, particularly at the macula, provide essential information regarding disease progression and the efficacy of treatment. Functional vision loss in diabetic patients is commonly observed in clinical and experimental studies and is thought to be affected by blood glucose levels. In the first study of its kind, the short-term impact of fluctuations in blood glucose levels on retinal structure and function over a 12-hour period in patients with diabetes is investigated. Principal findings were:
• Acute fluctuations in blood glucose levels are greater in diabetic patients than in normal subjects.
• Fluctuations in blood glucose levels affect contrast sensitivity scores, SWAP visual fields, intraocular pressure and diastolic pressure. This effect is similar for type 1 and type 2 diabetic patients despite the differences in their physiological status.
• Long-term metabolic control in the diabetic patient is a useful predictor of the fluctuation in contrast sensitivity scores.
• Large fluctuations in blood glucose levels and/or visual function and structure may be indicative of an increased risk of development or progression of retinopathy.

3. Structural and functional damage of the visual pathway in glaucomatous optic neuropathy. The glaucomatous eye undergoes a number of well-documented pathological changes, including retinal nerve fibre loss and optic nerve head damage, which correlate with loss of functional vision. In experimental glaucoma there is evidence that glaucomatous damage extends from retinal ganglion cells in the eye, along the visual pathway, to vision centres in the brain. This thesis explores the effects of glaucoma on RNFL thickness, ocular anterior anatomy and cortical structure, and their correlates with visual function in humans. Principal findings were:
• In the retina, glaucomatous RNFL loss is less marked with increasing distance from the optic nerve head, suggesting that RNFL examination at a greater distance than traditionally employed may provide invaluable early indicators of glaucomatous damage.
• Neuroretinal rim area and retrobulbar optic nerve diameter are strong indicators of visual field loss.
• Grey matter density decreases at a rate of 3.85% per decade; there was no clear evidence of a disease effect.
• Cortical activation as measured by fMRI was a strong indicator of functional damage in patients with significant neuroretinal rim loss despite relatively modest visual field defects.

These investigations have shown that the effects of senescence are evident in both the anterior and posterior visual pathway. A variety of anatomical and functional diagnostic protocols for the investigation of damage to the visual pathway in ocular disease are required to maximise understanding of the disease processes and thereby optimise patient care.

Relevance: 100.00%

Abstract:

Since the initial launch of silicone hydrogel lenses, there has been a considerable broadening in the range of available commercial material properties. The very mobile silicon–oxygen bonds convey distinctive surface and mechanical properties on silicone hydrogels, in which the advantages of enhanced oxygen permeability, reduced protein deposition, and modest frictional interaction are balanced by increased lipid and elastic response. There are now some 15 silicone hydrogel material variants available to practitioners; arguably, the changes that have taken place have been strongly influenced by feedback based on clinical experience. Water content is one of the most influential properties, and the decade has seen a progressive rise from lotrafilcon-A (24%) to efrofilcon-A (74%). Moduli have decreased over the same period from 1.4 to 0.3 MPa, but not solely as a result of changes in water content. Surface properties do not correlate directly with water content, and ingenious approaches have been used to achieve desirable improvements (e.g., greater lubricity and lower contact angle hysteresis). This is demonstrated by comparing the hysteresis values of the earliest (lotrafilcon-A, >40°) and most recent (delefilcon-A, <10°) coated silicone hydrogels. Although wettability is important, it is not of itself a good predictor of ocular response, because this involves a much wider range of physicochemical and biochemical factors. The interference of the lens with ocular dynamics is complex, leading separately to tissue–material interactions involving the anterior and posterior lens surfaces. The biochemical consequences of these interactions may hold the key to a greater understanding of ocular incompatibility and end-of-day discomfort.

Relevance: 100.00%

Abstract:

Over the past decade or so a number of changes have been observed in traditional Japanese employment relations (ER) systems, such as an increase in non-regular workers, a move towards performance-based systems and a continuous decline in union membership. A large body of Anglo-Saxon and Japanese literature provides evidence that national factors such as national institutions, national culture, and the business and economic environment have significantly influenced what were hitherto three ‘sacred’ aspects of Japanese ER systems (ERSs). However, no research has been undertaken until now at the firm level regarding the extent to which changes in national factors influence ERSs across firms. This article develops a model to examine the impact of national factors on ER systems, and analyses that impact on ER systems at the firm level. Based on information collected from two different groups of companies, namely Mitsubishi Chemical Group (MCG) and the Federation of Shinkin Bank (FSB), the research finds that, except for a few similarities, the impact of national factors on Japanese ER systems differs at the firm level. This indicates that the impact of national factors varies in the implementation of employment relations factors. In the case of MCG, national culture has less to do with the seniority-based system; the study also reveals that national culture has less influence on the enterprise-based system in the case of FSB. This analysis is useful for domestic and international organizations as it helps to better understand the role of national factors in determining Japanese ERSs.

Relevance: 100.00%

Abstract:

Purpose – This paper seeks answers to four questions. Two of these questions have been borrowed (but adapted) from the work of Defee et al.: RQ1. To what extent is theory used in purchasing and supply chain management (P&SCM) research? RQ2. What are the prevalent theories to be found in P&SCM research? Following on from these questions, an additional question is posed: RQ3. Are theory-based papers more highly cited than papers with no theoretical foundation? Finally, drawing on the work of Harland et al., the authors have added a fourth question: RQ4. To what extent does P&SCM meet the tests of coherence, breadth and depth, and quality necessary to make it a scientific discipline? Design/methodology/approach – A systematic literature review was conducted in accordance with the model outlined by Tranfield et al. for three journals within the field of “purchasing and supply chain management”. In total 1,113 articles were reviewed. In addition, a citation analysis was completed covering 806 articles in total. Findings – The headline features from the results suggest that, nearly a decade and a half on from its development, the field still lacks coherence. Theory is absent from much of the work, and although theory-based articles achieved on average a higher number of citations than non-theoretical papers, there is no obvious contender as an emergent paradigm for the discipline. Furthermore, it is evident that P&SCM does not meet Fabian's test necessary to make it a scientific discipline and is still some way from being a normal science. Research limitations/implications – This study would have benefited from the analysis of further journals; however, the analysis of 1,113 articles from three leading journals in the field of P&SCM was deemed sufficient in scope. In addition, a further significant line of enquiry to follow is the rigour vs relevance debate. Practical implications – This article is of interest to both an academic and practitioner audience as it highlights the use of theories in P&SCM. Furthermore, this article raises a number of important questions: should research in this area draw more heavily on theory and, if so, which theories are appropriate? Social implications – The broader social implications relate to the discussion of how a scientific discipline develops, building on the work of Fabian and Amundson. Originality/value – The data set for this study is significant and builds on a number of previous literature reviews. This review is both greater in scope than previous reviews and broader in its subject focus. In addition, the citation analysis (not previously conducted in any of the reviews) and statistical test highlight that theory-based articles are more highly cited than non-theoretically based papers. This could indicate that researchers are attempting to build on one another's work.
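As an illustration of the kind of comparison reported here (the abstract does not name the specific statistical test used, so the choice of test and the numbers below are assumptions), citation counts for theory-based and non-theory-based papers could be compared with a non-parametric test, since citation counts are typically heavily skewed:

```python
# Hypothetical sketch: comparing citation counts of theory-based vs.
# non-theory-based articles with a Mann-Whitney U test (requires scipy).
from scipy.stats import mannwhitneyu

# Hypothetical citation counts for the two groups of articles.
theory_based = [12, 45, 8, 60, 23, 31, 5, 70]
non_theory = [3, 10, 7, 2, 15, 9, 4, 6]

stat, p_value = mannwhitneyu(theory_based, non_theory, alternative="greater")
print(f"U={stat}, p={p_value:.4f}")  # small p -> theory-based papers cited more
```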

Relevance: 100.00%

Abstract:

In the last decade, researchers in the social sciences have increasingly adopted neuroscientific techniques, with a consequent rise of neuroscience-inspired research in disciplines such as economics, marketing, decision sciences, and leadership. In 2007, we introduced the term organizational cognitive neuroscience (OCN) in an attempt to clearly demarcate research carried out in these many areas and to provide an overarching paradigm for research utilizing cognitive neuroscientific methods, theories, and concepts within the organizational and business research fields. Here we revisit and further refine the OCN paradigm, and define an approach where we feel the marriage of organizational theory and neuroscience will return even greater dividends in the future, namely within the field of clinical practice.

Relevance: 100.00%

Abstract:

In less than a decade, personal computers have become part of our daily lives. Many of us come into contact with computers every day, whether at work, school or home. As useful as the new technologies are, they also have a darker side. By making computers part of our daily lives, we run the risk of allowing thieves, swindlers, and all kinds of deviants directly into our homes. Armed with a personal computer, a modem and just a little knowledge, a thief can easily access confidential information, such as details of bank accounts and credit cards. This book is intended to help people avoid harm at the hands of Internet criminals. It offers a tour of the more dangerous parts of the Internet, as the author explains who the predators are, their motivations, how they operate and how to protect against them. Behind the doors of our own homes, we assume we are safe from predators, con artists, and other criminals wishing us harm. But the proliferation of personal computers and the growth of the Internet have invited these unsavory types right into our family rooms. With a little psychological knowledge, a con man can start to manipulate us in different ways. A terrorist can recruit new members and raise money over the Internet. Identity thieves can gather personal information and exploit it for criminal purposes. Spammers can wreak havoc on businesses and individuals. Here, an expert helps readers recognize the signs of a would-be criminal in their midst. Focusing on the perpetrators, the author provides information about how they operate, why they do it, what they hope to do, and how to protect yourself from becoming a victim.