53 results for Specialised lexicography

in the Aston University Research Archive


Relevance: 20.00%

Abstract:

This paper asserts the growing importance of academic English in an increasingly Anglophone world, and looks at the differences between academic English and general English, especially in terms of vocabulary. The creation of wordlists has played an important role in attempts to establish the academic English lexicon, but existing wordlists are either not based on appropriate data or are implemented inappropriately. There is as yet no adequate dictionary of academic English, and this paper reports on new efforts at Aston University to create a suitable corpus on which such a dictionary could be based.
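
As an illustration only, and not the Aston project's actual method, the kind of wordlist discussed above typically starts from simple frequency counts over the corpus texts. The directory name, file layout and cut-off below are hypothetical.

```python
import re
from collections import Counter
from pathlib import Path

def build_wordlist(corpus_dir: str, top_n: int = 500) -> list[tuple[str, int]]:
    """Count word-form frequencies across all .txt files in corpus_dir."""
    counts: Counter[str] = Counter()
    for path in Path(corpus_dir).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore").lower()
        counts.update(re.findall(r"[a-z]+(?:'[a-z]+)?", text))
    return counts.most_common(top_n)

if __name__ == "__main__":
    # "academic_corpus" is a placeholder directory of plain-text academic articles.
    for word, freq in build_wordlist("academic_corpus")[:20]:
        print(f"{word}\t{freq}")
```

Published academic wordlists go further than raw frequency (range across disciplines, coverage against a general-English baseline), which is precisely where the appropriateness of the underlying data becomes decisive.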

Relevance: 20.00%

Abstract:

Category-specific disorders are frequently explained by suggesting that living and non-living things are processed in separate subsystems (e.g. Caramazza & Shelton, 1998). If subsystems exist, there should be benefits for normal processing, beyond the influence of structural similarity. However, no previous study has separated the relative influences of similarity and semantic category. We created novel examples of living and non-living things so category and similarity could be manipulated independently. Pre-tests ensured that our images evoked appropriate semantic information and were matched for familiarity. Participants were trained to associate names with the images and then performed a name-verification task under two levels of time pressure. We found no significant advantage for living things alongside strong effects of similarity. Our results suggest that similarity rather than category is the key determinant of speed and accuracy in normal semantic processing. We discuss the implications of this finding for neuropsychological studies. © 2005 Psychology Press Ltd.

Relevance: 20.00%

Abstract:

This paper discusses three important aspects of John Sinclair’s legacy: the corpus, lexicography, and the notion of ‘corpus-driven’. The corpus represents his concern with the nature of linguistic evidence. Lexicography is for him the canonical mode of language description at the lexical level. And his belief that the corpus should ‘drive’ the description is reflected in his constant attempts to utilize the emergent computer technologies to automate the initial stages of analysis and defer the intuitive, interpretative contributions of linguists to increasingly later stages in the process. Sinclair’s model of corpus-driven lexicography has spread far beyond its initial implementation at Cobuild, to most EFL dictionaries, to native-speaker dictionaries (e.g. the New Oxford Dictionary of English, and many national language dictionaries in emerging or re-emerging speech communities) and bilingual dictionaries (e.g. Collins, Oxford-Hachette).
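
A toy, hypothetical sketch of what "corpus-driven" first-pass automation can look like: collocates of a node word are counted mechanically within a fixed window, and only the resulting profile is handed to the lexicographer for interpretation at a later stage.

```python
from collections import Counter

def collocates(tokens: list[str], node: str, window: int = 4) -> Counter:
    """Count words co-occurring with `node` within +/- `window` tokens."""
    counts: Counter[str] = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            counts.update(t for j, t in enumerate(tokens[lo:hi], start=lo) if j != i)
    return counts

tokens = "the annual budget was cut and the budget review was brought forward".split()
print(collocates(tokens, "budget").most_common(5))
```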

Relevance: 20.00%

Abstract:

Translators wishing to work on translating specialised texts are traditionally recommended to spend much time and effort acquiring specialist knowledge of the domain involved, and for some areas of specialised activity this is clearly essential. For other types of translation-based, domain-specific communication, however, it is possible to develop a systematic approach to the task which allows for the production of target texts that are adequate for purpose, in a range of specialised domains, without necessarily having formal qualifications in those areas. According to Esselink (2000), translation agencies and individual clients tend to prefer a subject expert who also happens to have competence in one or more languages over a trained translator with a high degree of translation competence, including the ability to deal with specialised translation tasks. The problem, for the would-be translator, is persuading prospective clients that he or she is capable of this. This paper offers an overview of the principles used to design training intended to teach trainee translators how to use a systematic approach to specialised translation, in order to extend the range of areas in which they can tackle translation without compromising quality or reliability. This approach is described within the context of the functionalist approach developed in particular by Reiss and Vermeer (1984) and Nord (1991, 1997), inter alia.

Relevance: 10.00%

Abstract:

Guest editorial: This special issue has been drawn from papers that were published as part of the Second European Conference on Management of Technology (EuroMOT), which was held at Aston Business School (Birmingham, UK), 10-12 September 2006. This was the official European conference for the International Association for Management of Technology (IAMOT); the overall theme of the conference was “Technology and global integration.” There were many high-calibre papers submitted to the conference and published in the associated proceedings (Bennett et al., 2006). The streams of interest that emerged from these submissions were the importance of: technology strategy, innovation, process technologies, managing change, national policies and systems, research and development, supply chain technology, service and operational technology, education and training, small company incubation, technology transfer, virtual operations, technology in developing countries, partnership and alliance, and financing and investment.

This special issue focuses upon the streams of interest that accentuate the importance of collaboration between different organisations. Such organisations vary greatly in character; for instance, they may be large or small, publicly or privately owned, and operate in manufacturing or service sectors. Despite these varying characteristics they all have something in common: they all stress the importance of inter-organisational collaboration as a critical success factor for their organisation.

In today's global economy it is essential that organisations decide what their core competencies are and what those of complementing organisations are. Core competences should be developed to become a basis of differentiation, leverage and competitive advantage, whilst those that are less mature should be outsourced to other organisations that can claim to have had more recognition and success in that particular core competence (Porter, 2001). This strategic trend can be observed throughout advanced economies and is growing strongly. If a posteriori reasoning is applied here, it follows that organisations could continue to become more specialised in fewer areas whilst simultaneously becoming more dependent upon other organisations for critical parts of their operations. Such actions seem to fly in the face of rational business strategy, and so the question must be asked: why are organisations developing this way? The answer could lie in the recent changes in endogenous and exogenous factors of the organisation; the former emphasising resource-based issues in the short term and strategic positioning in the long term, whilst the latter emphasises transaction costs in the short term and the acquisition of new skills and knowledge in the long term. For a harmonious balance of these forces to prevail, organisations must first declare a shared meta-strategy, then put cross-organisational processes into place which have their routine operations automated as far as possible. A rolling business plan would review, assess and reposition each organisation within this meta-strategy according to how well they have contributed (Binder and Clegg, 2006).

The important common issue here is that an increasing number of businesses today are gaining direct benefit from increasing their levels of inter-organisational collaboration. Such collaboration has largely been possible due to recent technological advances which can make organisational structures more agile (e.g. the extended or the virtual enterprise), organisational infrastructure more connected, and the sharing of real-time information an operational reality.

This special issue consists of research papers that have explored the above phenomenon in some way. For instance, the role of government intervention, the use of internet-based technologies, the role of research and development organisations, the changing relationships between start-ups and established firms, the importance of cross-company communities of practice, the practice of networking, the front-loading of large-scale projects, innovation, and the probabilistic uncertainties that organisations experience are all explored in these papers. The cases cited in these papers are limited in that they have a Eurocentric focus. However, it is hoped that readers of this special issue will gain a valuable insight into the increasing importance of collaborative practices via these studies.

Relevance: 10.00%

Abstract:

Due to environmental changes and business trends such as globalisation, outsourcing and virtualisation, more and more companies get involved in business activities that are outside their direct control. This typically occurs by entering into collaborative relationships and joint ventures with specialised companies in order to fulfil the demands of customers quickly (DiMaggio, 2001). Organisational structures that result from such collaborative relationships and joint ventures are referred to in this paper as enterprises, and their management is known as enterprise management. The authors use the definition of the European Commission (2003), which defines an enterprise as “… an entity, regardless of its legal form … including partnerships or associations regularly engaged in economic activities.” Therefore, in its most simple form an enterprise could be a single integrated company. However, findings from this research show that enterprises can also be made up of parts of different companies, and the structure of the enterprise is contingent upon a variety of different factors. The success of the enterprise as a collaborative venture depends on the ability of companies to intermediate their internal core competencies into other participating companies’ value streams and simultaneously outsource their own peripheral activities to companies that can perform them quicker, cheaper and more effectively (Lal et al., 1995). In other words, the peripheral activities of one member-company must be complemented by a core competence of another member-company within an overall enterprise.

Relevance: 10.00%

Abstract:

This paper investigates the determinants of technological diversification among the UK’s small serial innovators (SSIs). Using a longitudinal study of 339 UK-based small businesses accounting for almost 7,000 patents between 1990 and 2006, this study constitutes the first empirical examination of technological diversification among SMEs in the literature. The results demonstrate that technological diversification is not solely a large-firm activity, challenging the dominant view that innovative SMEs are extremely focused and specialised players with little technological diversification. Our findings suggest a nonlinear (i.e. inverse-U-shaped) relationship between the level of technological opportunities in the environment and the SSIs’ degree of technological diversification. This points to a trade-off between processes of exploration and exploitation across increasingly volatile technology regimes. The paper also demonstrates that small firms with impactful innovations focus their innovative activity around similar technological capabilities, while firms that have introduced platform technologies in the past are more likely to engage in technological diversification.
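
A hedged illustration, not the authors' actual econometric specification: an inverse-U-shaped relationship of the kind reported above is commonly tested by regressing the outcome on a predictor and its square and checking that the squared term is negative. The variable names and simulated data below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
opportunities = rng.uniform(0, 10, 300)      # hypothetical technological-opportunity index
# Simulated inverse U: diversification peaks at intermediate opportunity levels.
diversification = 2.0 * opportunities - 0.2 * opportunities**2 + rng.normal(0, 1, 300)

# Ordinary least squares with an intercept, a linear term and a squared term.
X = np.column_stack([np.ones_like(opportunities), opportunities, opportunities**2])
beta, *_ = np.linalg.lstsq(X, diversification, rcond=None)
print(f"linear term: {beta[1]:.3f}, squared term: {beta[2]:.3f}")  # negative squared term -> inverse U
```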

Relevance: 10.00%

Abstract:

Purpose - This paper provides a deeper examination of the fundamentals of commonly-used techniques - such as coefficient alpha and factor analysis - in order to more strongly link the techniques used by marketing and social researchers to their underlying psychometric and statistical rationale.

Design/methodology/approach - A wide-ranging review and synthesis of psychometric and other measurement literature, both within and outside the marketing field, is used to illuminate and reconsider a number of misconceptions which seem to have evolved in marketing research.

Findings - The research finds that marketing scholars have generally concentrated on reporting what are essentially arbitrary figures, such as coefficient alpha, without fully understanding what these figures imply. It is argued that, if the link between theory and technique is not clearly understood, the use of psychometric measure-development tools actually runs the risk of detracting from the validity of the measures rather than enhancing it.

Research limitations/implications - The focus on one stage of a particular form of measure development could be seen as rather specialised. The paper also runs the risk of increasing the amount of dogma surrounding measurement, which runs contrary to the spirit of this paper.

Practical implications - This paper shows that researchers may need to spend more time interpreting measurement results. Rather than simply referring to precedence, one needs to understand the link between measurement theory and actual technique.

Originality/value - This paper presents psychometric measurement and item analysis theory in an easily understandable format, and offers an important set of conceptual tools for researchers in many fields. © Emerald Group Publishing Limited.
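
For readers who want to see what the commonly reported figure actually is, the sketch below computes coefficient (Cronbach's) alpha directly from an item-response matrix: the sum of item variances relative to the variance of the summed scale, rescaled by the number of items. The simulated five-item scale is purely illustrative.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for a respondents-by-items score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: five noisy indicators of one latent trait, 200 respondents.
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))
responses = latent + rng.normal(scale=0.8, size=(200, 5))
print(round(cronbach_alpha(responses), 3))
```

Such a number is easy to produce; the paper's point is that it is only meaningful if the assumptions behind it, such as the unidimensionality of the item set, are understood rather than assumed.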

Relevance: 10.00%

Abstract:

In 2002, we published a paper [Brock, J., Brown, C., Boucher, J., Rippon, G., 2002. The temporal binding deficit hypothesis of autism. Development and Psychopathology 14 (2), 209-224] highlighting the parallels between the psychological model of 'central coherence' in information processing [Frith, U., 1989. Autism: Explaining the Enigma. Blackwell, Oxford] and the neuroscience model of neural integration or 'temporal binding'. We proposed that autism is associated with abnormalities of information integration that are caused by a reduction in the connectivity between specialised local neural networks in the brain and possible overconnectivity within the isolated individual neural assemblies. The current paper updates this model, providing a summary of theoretical and empirical advances in research implicating disordered connectivity in autism. This is in the context of changes in the approach to the core psychological deficits in autism, of greater emphasis on 'interactive specialisation' and the resultant stress on early and/or low-level deficits and their cascading effects on the developing brain [Johnson, M.H., Halit, H., Grice, S.J., Karmiloff-Smith, A., 2002. Neuroimaging of typical and atypical development: a perspective from multiple levels of analysis. Development and Psychopathology 14, 521-536]. We also highlight recent developments in the measurement and modelling of connectivity, particularly in the emerging ability to track the temporal dynamics of the brain using electroencephalography (EEG) and magnetoencephalography (MEG) and to investigate the signal characteristics of this activity. This advance could be particularly pertinent in testing an emerging model of effective connectivity based on the balance between excitatory and inhibitory cortical activity [Rubenstein, J.L., Merzenich, M.M., 2003. Model of autism: increased ratio of excitation/inhibition in key neural systems. Genes, Brain and Behavior 2, 255-267; Brown, C., Gruber, T., Rippon, G., Brock, J., Boucher, J., 2005. Gamma abnormalities during perception of illusory figures in autism. Cortex 41, 364-376]. Finally, we note that this convergence of research developments not only enables a greater understanding of autism but also has implications for prevention and remediation. © 2006.
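
As a purely illustrative sketch of one connectivity measure used in this literature, and not the analysis in any of the papers cited, the code below computes magnitude-squared coherence between two simulated EEG channels with scipy. The sampling rate, band limits and signals are made up.

```python
import numpy as np
from scipy.signal import coherence

fs = 250.0                                  # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
shared = np.sin(2 * np.pi * 40 * t)         # shared 40 Hz (gamma-band) component
rng = np.random.default_rng(2)
ch1 = shared + 0.5 * rng.standard_normal(t.size)
ch2 = shared + 0.5 * rng.standard_normal(t.size)

# Magnitude-squared coherence as a simple index of coupling between two channels.
freqs, coh = coherence(ch1, ch2, fs=fs, nperseg=512)
band = (freqs >= 30) & (freqs <= 50)
print(f"mean gamma-band coherence: {coh[band].mean():.2f}")
```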

Relevance: 10.00%

Abstract:

Since publication of the first edition, huge developments have taken place in sensory biology research and new insights have been provided in particular by molecular biology. These show the similarities in the molecular architecture and in the physiology of sensory cells across species and across sensory modality and often indicate a common ancestry dating back over half a billion years. Biology of Sensory Systems has thus been completely revised and takes a molecular, evolutionary and comparative approach, providing an overview of sensory systems in vertebrates, invertebrates and prokaryotes, with a strong focus on human senses. Written by a renowned author with extensive teaching experience, the book covers, in six parts, the general features of sensory systems, the mechanosenses, the chemosenses, the senses which detect electromagnetic radiation, other sensory systems including pain, thermosensitivity and some of the minority senses and, finally, provides an outline and discussion of philosophical implications.

New in this edition:
- Greater emphasis on molecular biology and intracellular mechanisms
- New chapter on genomics and sensory systems
- Sections on TRP channels, synaptic transmission, evolution of nervous systems, arachnid mechanosensitive sensilla and photoreceptors, electroreception in the Monotremata, language and the FOXP2 gene, mirror neurons and the molecular biology of pain
- Updated passages on human olfaction and gustation

Over four hundred illustrations, boxes containing supplementary material and self-assessment questions, and a full bibliography at the end of each part make Biology of Sensory Systems essential reading for undergraduate students of biology, zoology, animal physiology, neuroscience, anatomy and physiological psychology. The book is also suitable for postgraduate students in more specialised courses such as vision sciences, optometry, neurophysiology, neuropathology and developmental biology.

Relevance: 10.00%

Abstract:

The law of landlord and tenant has become an increasingly complex area for both professionals and students. Apart from the double hurdle of mastering both common law principles and statutory codes, various aspects of the subject have become increasingly specialised and challenging. This new edition of Question and Answer Landlord and Tenant demonstrates that even complex problems can be explained in straightforward and inspiring terms. The authors, both experienced academics and barristers, provide detailed answers to typical questions in this difficult field. The third edition has been updated in the new Question and Answer style, with questions followed by commentary, bullet points, diagrams and flowcharts. It offers new questions based on the latest recommendations of the Law Commission on renting homes and the abolition of the law of forfeiture. There are new questions on the human rights dimension, the recent changes to Part II of the Landlord and Tenant Act 1954, and the substantial amendments made to leasehold enfranchisement under the Commonhold and Leasehold Reform Act 2002.