39 results for Sectoral and territorial approaches
in Aston University Research Archive
Abstract:
This paper formulates a logistics distribution problem as the multi-depot travelling salesman problem (MDTSP). The decision makers must determine not only the travelling sequence of the salesman for delivering finished products from a warehouse or depot to a customer, but also which depot stores which type of product, so that the total travelling distance is minimised. The MDTSP resembles a combination of the travelling salesman and quadratic assignment problems. The two individual hard problems are formulated first and then integrated into the MDTSP, which is constructed as both an integer nonlinear and an integer linear programming model. After formulating the models, we verify the integrated models using commercial packages and, most importantly, investigate whether an iterative approach, that is, solving the individual models repeatedly, can generate an optimal solution to the MDTSP. Copyright © 2006 Inderscience Enterprises Ltd.
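To make the iterative idea concrete, the following is a minimal, hypothetical Python sketch that alternates between the depot-assignment and routing sub-problems on a toy instance. It is not the paper's integer programming formulation: the TSP sub-model is replaced by a nearest-neighbour heuristic, and all data, names and parameters are illustrative assumptions.

import itertools
import math
import random

# Toy instance (illustrative only): depot and customer coordinates, plus the
# product type each customer demands. Not taken from the paper.
random.seed(0)
depots = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)]
customers = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(8)]
demand = [c % 3 for c in range(8)]      # product type required by each customer
n_types = 3

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def total_distance(assign):
    """Route length when each depot serves the customers whose product type it
    stores, visiting them by a nearest-neighbour rule (a crude stand-in for
    the TSP sub-model)."""
    total = 0.0
    for d, depot in enumerate(depots):
        stops = [customers[c] for c in range(len(customers)) if assign[demand[c]] == d]
        pos = depot
        while stops:
            nxt = min(stops, key=lambda s: dist(pos, s))
            total += dist(pos, nxt)
            stops.remove(nxt)
            pos = nxt
        total += dist(pos, depot)       # return to the depot
    return total

# Iterative scheme: repeatedly re-optimise the type-to-depot assignment while
# the routing rule is held fixed, re-evaluating the routes after each move.
assign = {t: t % len(depots) for t in range(n_types)}
best = total_distance(assign)
improved = True
while improved:
    improved = False
    for t, d in itertools.product(range(n_types), range(len(depots))):
        trial = dict(assign)
        trial[t] = d
        cost = total_distance(trial)
        if cost < best - 1e-9:
            assign, best, improved = trial, cost, True

print("type -> depot:", assign, "distance:", round(best, 2))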
Abstract:
The use of oligonucleotides directed against the mRNA of HIV promises site-specific inhibition of viral replication. In this work, the effect of aralkyl substituents on oligonucleotide duplex stability was studied using model oligonucleotide sequences in an attempt to improve targeting of oligonucleotides to viral mRNA. Aralkyl-substituted oligonucleotides were made either by solid phase synthesis using the appropriate aralkyl-substituted phosphoramidite or by post-synthetic substitution of a pentafluorophenoxy substituent with N-methylphenethylamine. The presence of phenethyl or benzoyl substituents invariably resulted in thermodynamic destabilisation of all duplexes studied. The methods which were developed for the synthesis of nucleoside intermediates for oligonucleotide applications were also used to prepare a series of nucleoside analogues derived from uridine, 2'-deoxyuridine and AZT. Crystal structures of six compounds were successfully determined. Anti-HIV activity was observed for most compounds in the series, although none were without cytotoxicity. The most active compound of the series was the ribose nucleoside, 1-β-D-erythro-pentofuranosyl-4-pentafluorophenoxy-pyrimidine-2(1H)-one 95, derived directly from uridine. The same series of compounds also displayed very modest anti-cancer activity. To enable synthesis of prooligonucleotides and analogues for possible antisense applications, the properties of a new Silyl-Linked Controlled Pore Glass solid support were investigated. Synthesis of the sequences d(Tp)7T, d(Tps)7T and the base-sensitive d(Tp)3(CBzp)2(Tp)2T was achieved using the silyl-linked solid support in a fluoride-induced cleavage/deprotection strategy.
Abstract:
Starting from the exploration of the common features related to a postcolonial and feminist analysis, I will attempt to establish new relationships and to open up new perspectives within the cultural exchanges between the two nations, Galicia and Australia, in a global world. On the one hand, these will be new relationships in favour of a non-sexist language which contributes to overcoming gender discrimination; and on the other hand, new relationships which favour a re-evaluation of voices which have been silenced by hegemonic and centralised discourses.
Abstract:
In this letter, we directly compare digital back-propagation (DBP) with spectral inversion (SI), both with and without symmetry correction via dispersive chirping, and numerically demonstrate that predispersed SI outperforms traditional SI and approaches the performance of computationally exhaustive ideal DBP. Furthermore, we propose for the first time a practical scheme employing predispersed SI to compensate the bulk of channel nonlinearities and DBP to accommodate the residual penalties due to varying SI location, with predispersed SI employed ubiquitously along the transmission link with a penalty of <0.5 dB. Our results also show that predispersed SI enables partial compensation of cross-phase modulation effects, doubling the transmission reach.
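As a rough illustration of the concept (not the authors' scheme), the Python/NumPy sketch below applies a dispersive chirp as a quadratic spectral phase and then conjugates the field, which is the essence of predispersed spectral inversion. The dispersion operator, sign conventions and parameter values are assumptions chosen purely for illustration.

import numpy as np

def apply_dispersion(field, beta2_L, dt):
    # Accumulated dispersion beta2*L applied as a quadratic spectral phase.
    # The sign convention is an illustrative choice.
    w = 2 * np.pi * np.fft.fftfreq(field.size, d=dt)
    return np.fft.ifft(np.fft.fft(field) * np.exp(-0.5j * beta2_L * w**2))

def spectral_inversion(field, predispersion=0.0, dt=1e-12):
    # Plain SI is phase conjugation; predispersed SI chirps the field first.
    chirped = apply_dispersion(field, predispersion, dt) if predispersion else field
    return np.conj(chirped)

# Example: a Gaussian pulse propagated through half a (hypothetical) link,
# then conjugated with an extra predispersion chirp.
dt = 1e-12
t = np.arange(-512, 512) * dt
pulse = np.exp(-(t / 20e-12) ** 2).astype(complex)
half_link = apply_dispersion(pulse, beta2_L=-21e-27 * 40e3, dt=dt)   # ~40 km fibre
out = spectral_inversion(half_link, predispersion=-21e-27 * 10e3, dt=dt)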
Abstract:
In the present state of the art of authorship attribution there seems to be an opposition between two approaches: cognitive and stylistic methodologies. It is proposed in this article that these two approaches are complementary and that the apparent gap between them can be bridged using Systemic Functional Linguistics (SFL) and in particular some of its theoretical constructs, such as codal variation. This article provides a theoretical explanation of why such a theory would resolve the debate between the two approaches and shows how these two views of authorship attribution are indeed complementary. Although the article is fundamentally theoretical, two example experimental trials are reported to show how this theory can be developed into a workable methodology for authorship attribution. In Trial 1, an SFL analysis was carried out on a small dataset consisting of three 300-word texts collected from three different authors whose socio-demographic backgrounds matched across a number of parameters. This trial led to some conclusions about developing a methodology based on SFL and suggested the development of another trial, which might hint at a more accurate and useful methodology. In Trial 2, Biber's (1988) multidimensional framework is employed, and a methodology of authorship analysis based on this framework is proposed for future research. © 2013, EQUINOX PUBLISHING.
Abstract:
Large-scale introduction of Organic Solar Cells (OSCs) onto the market is currently limited by their poor stability in light and air, factors present in normal working conditions for these devices. Thus, great efforts have to be undertaken to understand the photodegradation mechanisms of their organic materials in order to find solutions that mitigate these effects. This study reports on the elucidation of the photodegradation mechanisms occurring in a low bandgap polymer, namely, Si-PCPDTBT (poly[(4,4′-bis(2-ethylhexyl)dithieno[3,2-b:2′,3′-d]silole)-2,6-diyl-alt-(4,7-bis(2-thienyl)-2,1,3-benzothiadiazole)-5,5′-diyl]). Complementary analytical techniques (AFM, HS-SPME-GC-MS, UV-vis and IR spectroscopy) have been employed to monitor the modification of the chemical structure of the polymer upon photooxidative aging and the subsequent consequences on its architecture and nanomechanical properties. Furthermore, these different characterization techniques have been combined with a theoretical approach based on quantum chemistry to elucidate the evolution of the polymer alkyl side chains and backbone throughout exposure. Si-PCPDTBT is shown to be more stable against photooxidation than the commonly studied p-type polymers P3HT and PCDTBT, while modeling demonstrated the benefits of using silicon as a bridging atom in terms of photostability.
Abstract:
This chapter explores the different ways in which discourse-analytic approaches reveal the ‘meaningfulness’ of text and talk. It reviews four diverse approaches to discourse analysis of particular value for current research in linguistics: Conversation Analysis (CA), Discourse Analysis (DA), Critical Discourse Analysis (CDA) and Feminist Post-structuralist Discourse Analysis (FPDA). Each approach is examined in terms of its background, motivation, key features, and possible strengths and limitations in relation to the field of linguistics. A key way to schematize discourse-analytic methodology is in terms of the relationship between microanalytical approaches, which examine the finer detail of linguistic interactions in transcripts, and macroanalytical approaches, which consider how broader social processes work through language (Heller, 2001). This chapter assesses whether there is a strength in a discourse-analytic approach that aligns itself exclusively with either a micro- or macrostrategy, or whether, as Heller suggests, the field needs to find a way of ‘undoing’ the micro–macro dichotomy in order to produce richer, more complex insights within linguistic research.
Abstract:
This study presents some quantitative evidence from a number of simulation experiments on the accuracy of the productivity-growth estimates derived from growth accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional form misspecification, but their accuracy diminishes greatly otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error; when measurement error becomes larger, the accuracy of all approaches (including stochastic approaches) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
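For readers unfamiliar with the growth-accounting (GA) benchmark, the short Python snippet below shows the basic Solow-residual calculation that GA-based productivity-growth estimates rest on. The figures are made up, and the paper's simulation design (inefficiency, measurement error, misspecification, volatility) is far richer than this sketch.

# Growth accounting in its simplest form: TFP growth is the Solow residual,
# output growth minus cost-share-weighted input growth. Figures are illustrative.
output_growth = 0.030                                   # 3.0% output growth
input_growth = {"labour": 0.010, "capital": 0.025}      # input growth rates
cost_shares = {"labour": 0.6, "capital": 0.4}           # factor cost shares

weighted_input_growth = sum(cost_shares[k] * g for k, g in input_growth.items())
tfp_growth = output_growth - weighted_input_growth      # 0.030 - 0.016 = 0.014
print(f"GA (Solow residual) TFP growth: {tfp_growth:.3f}")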
Abstract:
This is the second edition of our Aston Business School (ABS) Good Practice Guide and the enthusiasm of the contributors appears undiminished. I am again reminded that I work with a group of very committed, dedicated and professional colleagues. Once again this publication is produced to celebrate and promote good teaching across the School and to offer encouragement to those imaginative and innovative staff who continue to wish to challenge students to learn to maximum effect. It is hoped that others will pick up some good ideas from the articles contained in this volume. Contributors to this Guide were not chosen because they are the best teachers in the School, although they are undoubtedly all amongst my colleagues who are exponents of enthusiastic and inspiring approaches to learning. The Quality Unit approached these individuals because they declared on their Annual Module Reflection Forms that they were doing something interesting and worthwhile which they thought others might find useful. Amongst those reading the Guide I am sure that there are many other individuals who are trying to operate similar examples of good practice in their teaching, learning and assessment methods. I hope that this publication will provoke these people into providing comments and articles of their own and that these will form the basis of next year’s Guide. It may also provoke some people to try these methods in their own teaching.

The themes of the articles this year can be divided into two groups. The first theme is the quest to help students to help themselves to learn via student-run tutorials, surprise tests and mock examinations linked with individual tutorials. The second theme is making learning come to life in exciting practical ways by, for example, hands-on workshops and simulations, storytelling, rhetorical questioning and discussion groups. A common theme is one of enthusiasm, reflection and commitment by the lecturers concerned. None of the approaches discussed in this publication are low-effort activities on the part of the facilitator, but this effort is regarded as worthwhile as a means of creating greater student engagement. As Biggs (2003)[1] says, in his similarly inspiring way, students learn more the less passive they are in their learning. The articles in this publication bear witness to this and much more.

Since last year Aston Business School has launched its Research Centre in Higher Education Learning and Management (HELM), which is another initiative to promote excellent learning and teaching. Even before this centre has become fully operational, at least one of the articles in this publication has seen the light of day in the research arena and at least two others are ripe for dissemination to a wider audience via journal publication. More news of our successes in this activity will appear in next year’s edition.

May I thank the contributors for taking time out of their busy schedules to write the articles this summer, and Julie Green, who runs the ABS Quality Unit, for putting our diverse approaches into a coherent and publishable form and for chasing us when we have needed it! I would also like to thank Ann Morton and her colleagues in the Centre for Staff Development who have supported this publication. During the last year the Centre has further stimulated the learning and teaching life of the School (and the wider University) via their Learning and Teaching Week and sponsorship of Teaching Quality Enhancement Fund (TQEF) projects.
Pedagogic excellence is in better health at Aston than ever before; long may this be so, because this is what life in HE should be about.
Abstract:
Dementia is one of the greatest contemporary health and social care challenges, and novel approaches to the care of its sufferers are needed. New information and communication technologies (ICT) have the potential to assist those caring for people with dementia, through access to networked information and support, tracking and surveillance. This article reports the views about such new technologies of 34 carers of people with dementia. We also held a group discussion with nine carers for respondent validation. The carers' actual use of new ICT was limited, although they thought a gradual increase in the use of networked technology in dementia care was inevitable but would bypass some carers who saw themselves as too old. Carers expressed a general enthusiasm for the benefits of ICT, but usually not for themselves, and they identified several key challenges, including: establishing an appropriate balance between, on the one hand, privacy and autonomy and, on the other, maximising safety; establishing responsibility for and ownership of the equipment and who bears the costs; the possibility that technological help would mean a loss of valued personal contact; and the possibility that technology would substitute for existing services rather than be complementary. For carers and dementia sufferers to be supported, the expanding use of these technologies should be accompanied by intensive debate of the associated issues.
Abstract:
Recent developments in the new economic geography and the literature on regional innovation systems have emphasised the potentially important role of networking and the characteristics of firms' local operating environment in shaping their innovative activity. Modeling UK, German and Irish plants' investments in R&D, technology transfer and networking, and their effect on the extent and success of plants' innovation activities, casts some doubt on the importance of both of these relationships. In particular, our analysis provides no support for the contention that firms or plants in the UK, Ireland or Germany with more strongly developed external links (collaborative networks or technology transfer) develop greater innovation intensity. However, although inter-firm links also have no effect on the commercial success of plants' innovation activity, intra-group links are important in terms of achieving commercial success. We also find evidence that R&D, technology transfer and networking inputs are substitutes rather than complements in the innovation process, and that there are systematic sectoral and regional influences in the efficiency with which such inputs are translated into innovation outputs. © 2001 Elsevier Science B.V.
Abstract:
People and their performance are key to an organization's effectiveness. This review describes an evidence-based framework of the links between some key organizational influences and staff performance, health and well-being. This preliminary framework integrates management and psychological approaches, with the aim of assisting future explanation, prediction and organizational change. Health care is taken as the focus of this review, as there are concerns internationally about health care effectiveness. The framework considers empirical evidence for links between the following organizational levels: 1. Context (organizational culture and inter-group relations; resources, including staffing; physical environment) 2. People management (HRM practices and strategies; job design, workload and teamwork; employee involvement and control over work; leadership and support) 3. Psychological consequences for employees (health and stress; satisfaction and commitment; knowledge, skills and motivation) 4. Employee behaviour (absenteeism and turnover; task and contextual performance; errors and near misses) 5. Organizational performance; patient care. This review contributes to an evidence base for policies and practices of people management and performance management. Its usefulness will depend on future empirical research, using appropriate research designs, sufficient study power and measures that are reliable and valid.
Abstract:
To make vision possible, the visual nervous system must represent the most informative features in the light pattern captured by the eye. Here we use Gaussian scale-space theory to derive a multiscale model for edge analysis and we test it in perceptual experiments. At all scales there are two stages of spatial filtering. An odd-symmetric, Gaussian first derivative filter provides the input to a Gaussian second derivative filter. Crucially, the output at each stage is half-wave rectified before feeding forward to the next. This creates nonlinear channels selectively responsive to one edge polarity while suppressing spurious or "phantom" edges. The two stages have properties analogous to simple and complex cells in the visual cortex. Edges are found as peaks in a scale-space response map that is the output of the second stage. The position and scale of the peak response identify the location and blur of the edge. The model predicts remarkably accurately our results on human perception of edge location and blur for a wide range of luminance profiles, including the surprising finding that blurred edges look sharper when their length is made shorter. The model enhances our understanding of early vision by integrating computational, physiological, and psychophysical approaches. © ARVO.
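A minimal one-dimensional sketch of the two-stage scheme described above is given below, assuming SciPy's Gaussian derivative filters. The scales, the scale normalisation and the sign convention (chosen so that the rectified second-stage response peaks at the edge location) are illustrative assumptions, not the authors' exact model parameters.

import numpy as np
from scipy.ndimage import gaussian_filter1d

def two_stage_response(profile, scale):
    # Stage 1: odd-symmetric Gaussian first-derivative filter, half-wave rectified.
    stage1 = np.maximum(gaussian_filter1d(profile, sigma=scale, order=1), 0.0)
    # Stage 2: Gaussian second-derivative filter; sign and rectification chosen
    # so that the response peaks at the edge location (an assumption of this sketch).
    stage2 = gaussian_filter1d(stage1, sigma=scale, order=2)
    return np.maximum(-stage2, 0.0) * scale ** 1.5   # illustrative scale normalisation

# A blurred luminance edge as the test profile.
x = np.arange(512)
profile = 0.5 * (1.0 + np.tanh((x - 256) / 12.0))

# Scale-space response map: rows are scales, columns are positions. The peak's
# position and scale give the estimated edge location and blur.
scales = np.arange(1, 40)
response = np.array([two_stage_response(profile, s) for s in scales])
s_idx, x_idx = np.unravel_index(np.argmax(response), response.shape)
print(f"edge location ~ x={x_idx}, blur estimate ~ scale={scales[s_idx]}")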
Abstract:
This chapter serves three very important functions within this collection. First, it aims to make the existence of FPDA better known to both gender and language researchers and to the wider community of discourse analysts, by outlining FPDA’s own theoretical and methodological approaches. This involves locating and positioning FPDA in relation to, yet in contradistinction to, the fields of discourse analysis to which it is most often compared: Critical Discourse Analysis (CDA) and, to a lesser extent, Conversation Analysis (CA). Secondly, the chapter serves a vital symbolic function. It aims to contest the authority of the more established theoretical and methodological approaches represented in this collection, which currently dominate the field of discourse analysis. FPDA considers that an established field like gender and language study will only thrive and develop if it is receptive to new ways of thinking, divergent methods of study, and approaches that question and contest received wisdoms or established methods. Thirdly, the chapter aims to introduce some new, experimental and ground-breaking FPDA work, including that by Harold Castañeda-Peña and Laurel Kamada (same volume). I indicate the different ways in which a number of young scholars are imaginatively developing the possibilities of an FPDA approach to their specific gender and language projects.