44 results for Endogenous Information Structure

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

Using a data set for the 162 largest Hungarian firms during the period 1994-1999, this paper explores the determinants of equity shares held by both foreign investors and Hungarian corporations. Evidence is found for a post-privatisation evolution towards more homogeneous equity structures, where dominant categories of Hungarian and foreign owners aim at achieving controlling stakes. In addition, focusing on firm-level characteristics, we find that exporting firms attract foreign owners who acquire controlling equity stakes. Similarly, firm-size measures are positively associated with the presence of foreign investors. However, they are negatively associated with 100% foreign ownership, possibly because the marginal costs of acquiring additional equity grow with the size of the assets. The results are interpreted within the framework of the existing theory. In particular, following Demsetz and Lehn (1985) and Demsetz and Villalonga (2001), we argue that equity should not be treated as an exogenous variable. As for specific determinants of equity levels, we focus on informational asymmetries and (unobserved) ownership-specific characteristics of foreign investors and Hungarian investors.

Relevance:

80.00%

Publisher:

Abstract:

The present work is an empirical investigation into the `reference skills' of Pakistani learners and their language needs on semantic, phonetic, lexical and pragmatic levels in the dictionary. The introductory chapter discusses the relatively problematic nature of lexis in comparison with the other aspects of EFL learning and spells out the aim of this study. Chapter two provides an analytical survey of the various types of research undertaken in different contexts of the dictionary and explains the eclectic approach adopted in the present work. Chapter three studies the `reference skills' of this category of learners against the background of the highly sophisticated information structure of the learners' dictionaries under evaluation and suggests some measures for improvement in this context. Chapter four considers various criteria, e.g. pedagogic, linguistic and sociolinguistic, for determining the macro-structure of a learner's dictionary with a focus on specific L1 speakers. Chapter five is concerned with various aspects of the semantic information provided in the dictionaries, matched against the needs of Pakistani learners with regard to both comprehension and production. The type, scale and presentation of grammatical information in the dictionary are analysed in chapter six with the object of discovering their role and utility for the learner. Chapter seven explores the rationale for providing phonological information, the extent to which this guidance is vital, and the problems of the phonetic symbols employed in the dictionaries. Chapter eight brings into perspective the historical background of English-Urdu bilingual lexicography and evaluates the bilingual dictionaries currently popular among the student community, with the aim of discovering the extent to which they have taken account of the modern tenets of lexicography and investigating their validity as a useful reference tool in the learning of the English language.
The final chapter draws the findings on these individual aspects together in a coherent fashion to assess the viability of the original hypothesis that learners' dictionaries compiled with a specific set of users in mind would be more useful.

Relevance:

80.00%

Publisher:

Abstract:

The design and implementation of databases involve, firstly, the formulation of a conceptual data model by systematic analysis of the structure and information requirements of the organisation for which the system is being designed; secondly, the logical mapping of this conceptual model onto the data structure of the target database management system (DBMS); and thirdly, the physical mapping of this structured model onto the storage structures of the target DBMS. The accuracy of both the logical and the physical mapping determines the performance of the resulting system. This thesis describes research which develops software tools to facilitate the implementation of databases. A conceptual model describing the information structure of a hospital is derived using the Entity-Relationship (E-R) approach, and this model forms the basis for the mapping onto the logical model. Rules are derived for automatically mapping the conceptual model onto relational and CODASYL types of data structure. Further algorithms are developed for partly automating the implementation of these models on the INGRES, MIMER and VAX-11 DBMSs.
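The core of such a rule-based E-R-to-relational mapping can be sketched in a few lines. This is a minimal illustration, not the thesis's actual rule set: the entity names are hypothetical, and only the two textbook rules are shown (one table per entity; the key of the "one" side posted as a foreign key into the "many" side of each one-to-many relationship).

```python
# Sketch of a rule-based E-R -> relational mapping (hypothetical entities;
# only the textbook one-table-per-entity and foreign-key rules are shown).

def map_er_to_relational(entities, relationships):
    """entities: {name: [attributes]}, where the first attribute is the key.
    relationships: list of (one_side, many_side) one-to-many pairs."""
    tables = {name: list(attrs) for name, attrs in entities.items()}
    for one_side, many_side in relationships:
        # Rule: post the key of the "one" side into the "many" side's table.
        foreign_key = entities[one_side][0]
        tables[many_side].append(foreign_key)
    return tables

entities = {
    "Ward":    ["ward_id", "ward_name"],
    "Patient": ["patient_id", "name", "admitted"],
}
relationships = [("Ward", "Patient")]   # one ward houses many patients

print(map_er_to_relational(entities, relationships))
# The Patient table gains the foreign key ward_id.
```

A full mapping tool would also handle many-to-many relationships (via link tables), weak entities and multi-valued attributes, which are omitted here.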

Relevance:

80.00%

Publisher:

Abstract:

The present thesis focuses on the overall structure of the language of two types of Speech Exchange Systems (SES): Interview (INT) and Conversation (CON). The linguistic structure of INT and CON is quantitatively investigated on three different but interrelated levels of analysis: lexis, syntax and information structure. The corpus of data investigated for the project consists of eight sessions of pairs of conversants in carefully planned interviews, followed by unplanned, surreptitiously recorded conversational encounters of the same pairs of speakers. The data comprise a total of approximately 15,200 words of INT talk and about 19,200 words of CON talk. Taking account of the debatable assumption that the language of SES might be complex on certain linguistic levels (e.g. syntax) (Halliday 1979) and simple on others (e.g. lexis) in comparison with written discourse, the thesis sets out to investigate this complexity using a statistical approach to the computation of the structures recurrent in the language of INT and CON. The findings clearly indicate the presence of linguistic complexity in both types. They also show the language of INT to be slightly more syntactically and lexically complex than that of CON. Lexical density seems to be relatively high in both types of spoken discourse. The language of INT also seems to be more complex than that of CON on the level of information structure. This is manifested in the greater use of Inferable and other linguistically complex entities of discourse. Halliday's suggestion that the language of SES is syntactically complex is confirmed, but not the suggestion that the more casual the conversation, the more syntactically complex it becomes. The results of the analysis point to the general conclusion that the linguistic complexity of types of SES lies not only in the high recurrence of syntactic structures, but also in the combination of these features with each other and with other linguistic and extralinguistic features.
The linguistic analysis of the language of SES can be useful in understanding and pinpointing the intricacies of spoken discourse in general and will help discourse analysts and applied linguists in exploiting it both for theoretical and pedagogical purposes.
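The lexical density measure used in such analyses is simply the proportion of content words among all running words. A minimal sketch follows; the function-word list here is a tiny illustrative subset, whereas the thesis's actual computation would rest on a full grammatical classification of every word.

```python
# Minimal sketch: lexical density = content words / total running words.
# FUNCTION_WORDS is a toy subset for illustration only; a real analysis
# classifies every word grammatically (e.g. via part-of-speech tagging).

FUNCTION_WORDS = {"the", "a", "an", "of", "in", "on", "is", "are", "was",
                  "to", "and", "but", "it", "that", "this", "i", "you"}

def lexical_density(text):
    words = text.lower().split()
    content = [w for w in words if w not in FUNCTION_WORDS]
    return len(content) / len(words)

print(round(lexical_density("the interview was planned in a careful way"), 2))
# -> 0.5  (4 content words out of 8)
```

The same per-text figure, computed over the INT and CON sub-corpora, is what allows the two discourse types to be compared.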

Relevance:

40.00%

Publisher:

Abstract:

Three studies tested the impact of properties of behavioral intention on intention-behavior consistency, information processing, and resistance. Principal components analysis showed that properties of intention formed distinct factors. Study 1 demonstrated that temporal stability, but not the other intention attributes, moderated intention-behavior consistency. Study 2 found that greater stability of intention was associated with improved memory performance. In Study 3, participants were confronted with a rating scale manipulation designed to alter their intention scores. Findings showed that stable intentions were able to withstand attack. Overall, the present research findings suggest that different properties of intention are not simply manifestations of a single underlying construct ("intention strength"), and that temporal stability exhibits superior resistance and impact compared to other intention attributes. © 2013 Wiley Periodicals, Inc.

Relevance:

40.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance:

30.00%

Publisher:

Abstract:

Receptor activity modifying proteins (RAMPs) are a family of single-pass transmembrane proteins that dimerize with G-protein-coupled receptors. They may alter the ligand recognition properties of the receptors (particularly for the calcitonin receptor-like receptor, CLR). Very little structural information is available about RAMPs. Here, an ab initio model has been generated for the extracellular domain of RAMP1. The disulfide bond arrangement (Cys27-Cys82, Cys40-Cys72, and Cys57-Cys104) was determined by site-directed mutagenesis. The secondary structure (α-helices from residues 29-51, 60-80, and 87-100) was established from a consensus of predictive routines. Using these constraints, an assemblage of 25,000 structures was constructed and these were ranked using an all-atom statistical potential. The best 1000 conformations were energy minimized. The lowest-scoring model was refined by molecular dynamics simulation. To validate our strategy, the same methods were applied to three proteins of known structure: PDB:1HP8, PDB:1V54 chain H (residues 21-85), and PDB:1T0P. When compared to the crystal structures, the models had root mean-square deviations of 3.8 Å, 4.1 Å, and 4.0 Å, respectively. The model of RAMP1 suggested that Phe93, Tyr100, and Phe101 form a binding interface for CLR, whereas Trp74 and Phe92 may interact with ligands that bind to the CLR/RAMP1 heterodimer. © 2006 by the Biophysical Society.
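The root mean-square deviation figures quoted above follow the standard formula: the square root of the mean squared distance between corresponding atoms. A minimal numpy sketch is shown below; it assumes the two coordinate sets are already superposed (a real comparison would first align them, e.g. with the Kabsch algorithm), and the coordinates are purely illustrative.

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """Root mean-square deviation between two (N, 3) coordinate arrays
    assumed to be already superposed (no alignment step performed here)."""
    diff = np.asarray(coords_a) - np.asarray(coords_b)
    return np.sqrt((diff ** 2).sum(axis=1).mean())

a = np.zeros((4, 3))
b = np.full((4, 3), 1.0)   # every atom displaced by (1, 1, 1)
print(rmsd(a, b))          # sqrt(3) ≈ 1.732
```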

Relevance:

30.00%

Publisher:

Abstract:

We consider the problem of illusory or artefactual structure in the visualisation of high-dimensional structureless data. In particular we examine the role of the distance metric in the use of topographic mappings based on the statistical field of multidimensional scaling. We show that the use of a squared Euclidean metric (i.e. the SSTRESS measure) gives rise to an annular structure when the input data is drawn from a high-dimensional isotropic distribution, and we provide a theoretical justification for this observation.
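The mechanism behind the annular artefact can be seen numerically: pairwise squared Euclidean distances between points drawn from a high-dimensional isotropic distribution concentrate around a common value, so an embedding forced to preserve them pushes points towards a ring. The sketch below only demonstrates the concentration (it does not run a scaling algorithm); sample sizes and dimensions are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_spread(dim, n=100):
    """Coefficient of variation of the pairwise squared Euclidean distances
    between n points drawn from an isotropic Gaussian in `dim` dimensions."""
    x = rng.standard_normal((n, dim))
    norms = (x ** 2).sum(axis=1)
    sq = norms[:, None] + norms[None, :] - 2 * x @ x.T   # squared distances
    d = sq[np.triu_indices(n, k=1)]
    return d.std() / d.mean()

# The spread shrinks sharply as the dimension grows: squared distances
# become nearly equal, which is what drives the annular embedding.
print(relative_spread(2), relative_spread(500))
```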

Relevance:

30.00%

Publisher:

Abstract:

Visualising data for exploratory analysis is a major challenge in scientific and engineering domains where there is a need to gain insight into the structure and distribution of the data. Typically, visualisation methods like principal component analysis and multi-dimensional scaling are used, but it is difficult to incorporate prior knowledge about the structure of the data into the analysis. In this technical report we discuss a complementary approach based on an extension of a well-known non-linear probabilistic model, the Generative Topographic Mapping. We show that by including prior information about the covariance structure in the model, we are able to improve both the data visualisation and the model fit.

Relevance:

30.00%

Publisher:

Abstract:

Purpose – This study seeks to provide valuable new insight into the timeliness of corporate internet reporting (TCIR) by a sample of Irish-listed companies. Design/methodology/approach – The authors apply an updated version of Abdelsalam et al.'s TCIR index to assess the timeliness of corporate internet reporting. The index encompasses 13 criteria that are used to measure the TCIR for a sample of Irish-listed companies. In addition, the authors assess the timeliness of posting companies' annual and interim reports to their web sites. Furthermore, the study examines the influence of board independence and ownership structure on TCIR behaviour. Board composition is measured by the percentage of independent directors, the chairman's dual role and the average tenure of directors. Ownership structure is represented by managerial ownership and blockholder ownership. Findings – It is found that Irish-listed companies, on average, satisfy only 46 per cent of the timeliness criteria assessed by the timeliness index. After controlling for size, audit fees and firm performance, evidence is provided that TCIR is positively associated with board of directors' independence and chief executive officer (CEO) ownership. Furthermore, it is found that large companies are faster in posting their annual reports to their web sites. The findings suggest that board composition and ownership structure influence a firm's TCIR behaviour, presumably in response to the information asymmetry between management and investors and the resulting agency costs. Practical implications – The findings highlight the need for improvement in TCIR by Irish-listed companies in many areas, especially in regard to the regular updating of information provided on their web sites. Originality/value – This study represents one of the first comprehensive examinations of the important dimension of TCIR in Irish-listed companies.
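The headline "46 per cent" figure is a checklist score: the fraction of the 13 index criteria a company satisfies. A minimal sketch follows; the criteria names here are invented for illustration, not the actual items of the Abdelsalam et al. index.

```python
# Minimal sketch of a checklist-style index: the score is the fraction of
# the 13 TCIR criteria a company satisfies. Criteria names are hypothetical.

def tcir_score(met_criteria, total_criteria=13):
    return len(met_criteria) / total_criteria

met = {"annual_report_online", "interim_report_online", "press_releases",
       "date_of_last_update", "current_share_price", "financial_calendar"}
print(round(tcir_score(met), 2))   # 6 of 13 ≈ 0.46
```

Averaging this score across the sample yields the kind of aggregate figure reported in the findings.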

Relevance:

30.00%

Publisher:

Abstract:

Guest editorial: This special issue has been drawn from papers that were published as part of the Second European Conference on Management of Technology (EuroMOT), which was held at Aston Business School (Birmingham, UK) 10-12 September 2006. This was the official European conference for the International Association for Management of Technology (IAMOT); the overall theme of the conference was "Technology and global integration." There were many high-calibre papers submitted to the conference and published in the associated proceedings (Bennett et al., 2006). The streams of interest that emerged from these submissions were the importance of: technology strategy, innovation, process technologies, managing change, national policies and systems, research and development, supply chain technology, service and operational technology, education and training, small company incubation, technology transfer, virtual operations, technology in developing countries, partnership and alliance, and financing and investment. This special issue focuses upon the streams of interest that accentuate the importance of collaboration between different organisations. Such organisations vary greatly in character; for instance, they may be large or small, publicly or privately owned, and operate in manufacturing or service sectors. Despite these varying characteristics they all have something in common: they all stress the importance of inter-organisational collaboration as a critical success factor for their organisation. In today's global economy it is essential that organisations decide what their core competencies are and what those of complementary organisations are. Core competences should be developed to become a basis of differentiation, leverage and competitive advantage, whilst those that are less mature should be outsourced to other organisations that can claim to have had more recognition and success in that particular core competence (Porter, 2001).
This strategic trend can be observed throughout advanced economies and is growing strongly. If a posteriori reasoning is applied here, it follows that organisations could continue to become more specialised in fewer areas whilst simultaneously becoming more dependent upon other organisations for critical parts of their operations. Such actions seem to fly in the face of rational business strategy, and so the question must be asked: why are organisations developing this way? The answer could lie in the recent changes in endogenous and exogenous factors of the organisation; the former emphasising resource-based issues in the short term and strategic positioning in the long term, whilst the latter emphasises transaction costs in the short term and the acquisition of new skills and knowledge in the long term. For a harmonious balance of these forces to prevail requires organisations firstly to declare a shared meta-strategy, and then to put cross-organisational processes into place which have their routine operations automated as far as possible. A rolling business plan would review, assess and reposition each organisation within this meta-strategy according to how well it has contributed (Binder and Clegg, 2006). The important common issue here is that an increasing number of businesses today are gaining direct benefit from increasing their levels of inter-organisational collaboration. Such collaboration has largely been possible due to recent technological advances which can make organisational structures more agile (e.g. the extended or the virtual enterprise), organisational infrastructure more connected, and the sharing of real-time information an operational reality. This special issue consists of research papers that have explored the above phenomenon in some way.
For instance, the role of government intervention, the use of internet-based technologies, the role of research and development organisations, the changing relationships between start-ups and established firms, the importance of cross-company communities of practice, the practice of networking, the front-loading of large-scale projects, innovation and the probabilistic uncertainties that organisations experience are explored in these papers. The cases cited in these papers are limited as they have a Eurocentric focus. However, it is hoped that readers of this special issue will gain a valuable insight into the increasing importance of collaborative practices via these studies.

Relevance:

30.00%

Publisher:

Abstract:

This paper explores the importance of collaboration between different types of organizations within an enterprise. Achieving successful collaboration requires both endogenous and exogenous factors of each organization to be considered, together with a shared meta-strategy supported by shared cross-organizational processes and technology. A rolling business plan would periodically review, assess and reposition each organization within this meta-strategy according to how well it has contributed. We show that recent technological advances have made organizational structures more agile, organizational infrastructure more connected and the sharing of real-time information an operational reality; we also discuss the challenges and risks.

Relevance:

30.00%

Publisher:

Abstract:

We outline a scheme for the way in which early vision may handle information about shading (luminance modulation, LM) and texture (contrast modulation, CM). Previous work on the detection of gratings has found no sub-threshold summation, and no cross-adaptation, between LM and CM patterns. This strongly implied separate channels for the detection of LM and CM structure. However, we now report experiments in which adapting to LM (or CM) gratings creates tilt aftereffects of similar magnitude on both LM and CM test gratings, and reduces the perceived strength (modulation depth) of LM and CM gratings to a similar extent. This transfer of aftereffects between LM and CM might suggest a second stage of processing at which LM and CM information is integrated. The nature of this integration, however, is unclear and several simple predictions are not fulfilled. Firstly, one might expect the integration stage to lose identity information about whether the pattern was LM or CM. We show instead that the identity of barely detectable LM and CM patterns is not lost. Secondly, when LM and CM gratings are combined in-phase or out-of-phase we find no evidence for cancellation, nor for 'phase-blindness'. These results suggest that information about LM and CM is not pooled or merged - shading is not confused with texture variation. We suggest that LM and CM signals are carried by separate channels, but they share a common adaptation mechanism that accounts for the almost complete transfer of perceptual aftereffects.
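The two stimulus classes referred to above can be written down concretely: a luminance-modulated (LM) grating varies the local mean luminance, whereas a contrast-modulated (CM) grating keeps the mean luminance flat and varies the local contrast of a noise carrier. The numpy sketch below is illustrative only; the modulation depth, carrier and luminance values are arbitrary choices, not those used in the experiments.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 2 * np.pi, 256)
envelope = np.sin(x)          # the sinusoidal modulation signal
m = 0.5                       # modulation depth (illustrative value)
mean_lum = 0.5
carrier = rng.uniform(-1, 1, x.size)   # noise carrier for the CM stimulus

# Luminance modulation (LM): the local mean luminance follows the envelope.
lm = mean_lum * (1 + m * envelope)

# Contrast modulation (CM): mean luminance stays constant; the carrier's
# local contrast follows the envelope instead.
cm = mean_lum + 0.25 * carrier * (1 + m * envelope)
```

The key property the experiments exploit is visible in the two expressions: `lm` carries the envelope in its local mean, while `cm` carries it only in its local contrast, so first-order (luminance) mechanisms alone cannot detect the CM envelope.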

Relevance:

30.00%

Publisher:

Abstract:

The recent global 'credit crunch' has brought sharply into focus the need for better understanding of what it takes for organisations to survive. This research seeks to help organisations maintain their 'viability' – the ability to maintain a separate existence and survive on their own. Whilst there are a multitude of factors that contribute to organisational viability, information can be viewed as the lifeblood of organisations. This research increases our understanding of how organisations can manage information effectively to help maintain their viability. The viable systems model (VSM) is an established modelling technique that enables the detailed analysis of organisational activity to examine how the structure and functions performed in an organisation contribute to its 'viability'. The VSM has been widely applied, in small/large companies, industries and governments. However, whilst the VSM concentrates on the structure and functions necessary for an organisation to be viable, it pays much less attention to information deployment in organisations. Indeed, the VSM is criticised in the literature for being unable to provide much help with detailed information and communication structures, and new theories are called for to explore the way people interact and what information they need in the VSM. This research analyses qualitative data collected from four case studies to contribute to our understanding of the role that information plays in organisational viability, making three key contributions to the academic literature. In the information management literature, this research provides new insight into the roles that specific information plays in organisations. In the systems thinking literature, this research extends our understanding of the VSM and builds on its powerful diagnostic capability to provide further criteria to aid in the diagnosis of viable organisations.
In the information systems literature, this research develops a framework that can be used to help organisations design more effective information systems.

Relevance:

30.00%

Publisher:

Abstract:

Three British bituminous coals (Gedling, Cresswell, and Cortonwood Silkstone) were selected for study. Procedures were developed, using phase transfer catalysts (PTCs), to degrade the solvent-insoluble fractions of the coals. PTCs are of interest because they have the potential to bring about selective high-conversion reactions under mild conditions (in the past, severe reaction conditions have often had to be used to degrade the coals, which in turn resulted in the loss of much of the structural information). We have applied a variety of physical and chemical techniques to maximise the amount of structural information; these include elemental analysis, 1H-NMR, 13C-CPMAS-NMR, GPC, GC-MS, FTIR spectroscopy, DRIFT spectroscopy, and gas adsorption measurements. The main conclusions from the work are listed below:
(1) PTC O-methylation: this reaction removes hydrogen bonds within the coal matrix by 'capping' the phenolic groups. It was found that the polymer-like matrix could be made more flexible, but not significantly more soluble, by O-methylation; i.e. the trapped or 'mobile' phase of the coals could be removed at a faster rate after this reaction had been carried out.
(2) PTC reductive and acidic ether cleavage: the three coals were found to contain insignificant amounts of dialkyl and alkyl aryl ethers. The number of diaryl ethers could not be estimated by reductive ether cleavage (even though a high proportion of all three coals was solubilised). The majority of the ethers present in the coals were inert to both cleavage methods, and are therefore assumed to be heterocyclic ethers.
(3) Trifluoroperacetic acid oxidation: this oxidant was used to study the aliphatic portions of the polymer-like macromolecular matrix of the coals. Normally this reagent will only solubilise low-rank coals; we have, however, developed a method whereby trifluoroperacetic acid can be used to degrade high-rank bituminous coals.
(4) PTC/permanganate oxidation: this reagent has been found to be much more selective than the traditional alkaline permanganate oxidation, with much more structural information being retained within the various fractions. This degradative method therefore has the potential of yielding new information about the molecular structure of coals.