33 results for GST and incapacitated entities

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

According to the textbook approach, the developmental states of the Far East have been regarded as strong and autonomous entities. Although their bureaucratic elites have remained insulated from direct pressures stemming from society, state capacity has also been used to allocate resources in the interest of society as a whole. Yet society, by and large, has remained weak and subordinated to the state elite. The general perception of Sub-Saharan Africa (SSA), on the other hand, has been just the opposite. The violent and permanent conflict amongst rent-seeking groups for influence and authority over resources has culminated in a situation where states have become extremely weak and fragmented, while society – depending on the capacity of competing groups to mobilise resources and organise themselves, mostly on a regional or local level (resulting in local petty kingdoms) – has never had the chance to evolve as a strong player. In the context of SSA, therefore, state failure in the literature refers not just to a weak and captured state but also to a non-functioning, and sometimes even non-existent, society. Recently, however, the driving forces of globalisation may have triggered serious changes in the status quo described above. Accordingly, our hypothesis is the following: globalisation, especially the dynamic changes in technology, capital and communication, has made the simplistic “strong state–weak society” (in Asia) and “weak state–weak society” (in Africa) categorisation somewhat obsolete. While our comparative study places a strong emphasis on empirical scrutiny, seeking to uncover the dynamics of change in state–society relations in the two chosen regions both qualitatively and quantitatively, it also aims to complement the meaning and essence of the concepts and methodology of stateness, state capacity and state–society relations, the well-known building blocks of the seminal works of Evans (1995), Leftwich (1995), Migdal (1988) and Myrdal (1968).

Relevance:

100.00%

Publisher:

Abstract:

Lexicon-based approaches to Twitter sentiment analysis are gaining popularity due to their simplicity, domain independence, and relatively good performance. These approaches rely on sentiment lexicons, in which a collection of words is marked with fixed sentiment polarities. However, a word's sentiment orientation (positive, neutral, negative) and/or sentiment strength can change depending on context and the targeted entities. In this paper we present SentiCircle, a novel lexicon-based approach that takes into account the contextual and conceptual semantics of words when calculating their sentiment orientation and strength in Twitter. We evaluate our approach on three Twitter datasets using three different sentiment lexicons. Results show that our approach significantly outperforms two lexicon baselines. Results are competitive but inconclusive when compared with the state-of-the-art SentiStrength, and vary from one dataset to another. SentiCircle outperforms SentiStrength in accuracy on average, but falls marginally behind in F-measure. © 2014 Springer International Publishing.
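The abstract gives no implementation detail, but the plain lexicon-based scoring that approaches such as SentiCircle refine can be pictured with a minimal sketch. The lexicon entries, tokeniser, and neutral band below are illustrative assumptions, not the SentiCircle algorithm itself.

```python
# Minimal sketch of plain lexicon-based tweet scoring: the baseline that
# contextual approaches such as SentiCircle improve upon.
# Lexicon values and the neutral band are illustrative assumptions.

SENTIMENT_LEXICON = {
    "great": 0.8, "love": 0.9, "good": 0.5,
    "bad": -0.6, "hate": -0.9, "awful": -0.8,
}

def tokenize(tweet: str) -> list[str]:
    # Naive whitespace tokenisation; real systems also handle hashtags,
    # mentions, emoticons and negation.
    return [t.strip(".,!?").lower() for t in tweet.split()]

def score_tweet(tweet: str, neutral_band: float = 0.1) -> tuple[float, str]:
    """Sum fixed lexicon polarities and map the total to a sentiment label."""
    score = sum(SENTIMENT_LEXICON.get(t, 0.0) for t in tokenize(tweet))
    if score > neutral_band:
        label = "positive"
    elif score < -neutral_band:
        label = "negative"
    else:
        label = "neutral"
    return score, label

if __name__ == "__main__":
    print(score_tweet("I love this phone, the camera is great!"))
    print(score_tweet("Awful battery life, I hate it."))
```

Because the polarities are fixed, a word such as "great" scores the same in every tweet; context-aware methods like the one described above instead adjust a word's orientation and strength from the terms it co-occurs with.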

Relevance:

100.00%

Publisher:

Abstract:

Currently, there are many instances where public sector organizations and government entities collapse and are unable to provide the required services to the public. Such organizations do not have effective mechanisms of control or any specific department that manages the projects occurring in the organization. This study therefore suggests incorporating a Project Management Office (PMO) in public sector organizations to manage and oversee project management practice; other relevant roles of the PMO are also discussed. The study is contextualized with respect to Corporate Governance, Risk Management, and Compliance (GRC) and shows how a PMO can benefit or complement GRC and provide overall better standards of practice for public sector organizations. The study uses a mixed methodology for data collection and the findings contribute to the body of knowledge regarding PMOs and GRC.

Relevance:

100.00%

Publisher:

Abstract:

The controlled co-delivery of multiple agents to the lung offers potential benefits to patients. This study investigated the preparation and characterisation of highly respirable spray-dried powders displaying sustained release of two chemically distinct therapeutic agents. Spray-dried powders were produced from 30% (v/v) aqueous ethanol formulations that contained hydrophilic (terbutaline sulphate) and hydrophobic (beclometasone dipropionate) model drugs, chitosan (as a drug release modifier) and leucine (as an aerosolisation enhancer). The influence of chitosan molecular weight on spray-drying thermal efficiency, aerosol performance and drug release profile was investigated. Resultant powders were physically characterised, with in vitro aerosolisation performance and drug release profiles investigated using the Multi-Stage Liquid Impinger and modified USP II dissolution apparatus, respectively. Increased chitosan molecular weight gave increased spray-drying thermal efficiency. The powders generated were of a suitable size for inhalation, with emitted doses over 90% and fine particle fractions up to 72% of the loaded dose. Sustained drug release profiles were observed in dissolution tests for both agents, with increased chitosan molecular weight associated with an increased duration of drug release. The controlled co-delivery of hydrophilic and hydrophobic entities underlines the capability of spray drying to produce respirable particles with sustained release for delivery to the lung. (c) 2009 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Academic researchers have followed closely the interest of companies in establishing industrial networks, studying aspects such as social interaction and contractual relationships. But what patterns underlie the emergence of industrial networks, and what support should research provide for practitioners? Firstly, manufacturing appears to be becoming a commodity rather than a unique capability, especially for low-technology operations in downstream parts of the network, for example in assembly. Secondly, the increased tendency towards specialization has forced other, upstream, parts of industrial networks to introduce advanced manufacturing technologies to supply niche markets. Thirdly, the capital market for investments in capacity, and the trade in manufacturing as a commodity, dominate resource allocation to a larger extent than was previously the case. Fourthly, there is a continuous move towards more loosely connected entities that comprise manufacturing networks. More traditional concepts, such as the “keiretsu” and “chaebol” networks of some Asian economies, do not sufficiently support the demands now being placed on networks. Research should address these four fundamental challenges to prepare for the industrial networks of 2020 and beyond.

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses the use of comparative performance measurement by means of Data Envelopment Analysis in the context of the regulation of English and Welsh water companies. Specifically, the use of Data Envelopment Analysis to estimate potential cost savings in sewerage is discussed as it fed into the price review of water companies carried out by the regulator in 1994. The application is used as a vehicle for highlighting generic issues: assessing the impact of factors on the ranking of units by performance, the insights gained from using alternative methods to assess comparative performance, and assessing comparative performance when the entities involved are few in number but highly complex. The paper should prove of interest to those concerned with regulation and, more generally, with methods of comparative performance measurement.
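The abstract does not spell out the model specification, but the basic envelopment form of Data Envelopment Analysis it builds on can be sketched as a small linear programme. The sketch below is a minimal input-oriented, constant-returns-to-scale (CCR) formulation; the example data, input/output choices and function names are illustrative assumptions, not the regulator's 1994 model.

```python
# Minimal sketch of an input-oriented, constant-returns-to-scale DEA
# (CCR envelopment) model solved with linear programming.
# Data, inputs and outputs are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs: np.ndarray, outputs: np.ndarray, unit: int) -> float:
    """Efficiency score in (0, 1] of one unit relative to all observed units.

    inputs:  shape (n_units, n_inputs)
    outputs: shape (n_units, n_outputs)
    """
    n_units = inputs.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.zeros(n_units + 1)
    c[0] = 1.0

    # Input constraints: sum_j lambda_j * x_ij - theta * x_i,unit <= 0
    a_in = np.hstack([-inputs[[unit]].T, inputs.T])
    b_in = np.zeros(inputs.shape[1])
    # Output constraints: -sum_j lambda_j * y_rj <= -y_r,unit
    a_out = np.hstack([np.zeros((outputs.shape[1], 1)), -outputs.T])
    b_out = -outputs[unit]

    res = linprog(c,
                  A_ub=np.vstack([a_in, a_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n_units + 1))
    return res.fun  # optimal theta

# Illustrative data: 4 sewerage units, 2 inputs (opex, capex), 1 output.
x = np.array([[10.0, 5.0], [8.0, 7.0], [12.0, 4.0], [9.0, 9.0]])
y = np.array([[100.0], [90.0], [105.0], [80.0]])
for u in range(len(x)):
    print(f"unit {u}: efficiency = {dea_efficiency(x, y, u):.3f}")
```

A score of 1.0 marks a unit on the efficient frontier; a score of, say, 0.8 suggests its inputs could in principle be scaled down to 80% while still producing its observed outputs, which is the sense in which DEA estimates potential cost savings.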

Relevance:

30.00%

Publisher:

Abstract:

Often observations are nested within other units. This is particularly the case in the educational sector, where school performance in terms of value added is the result of the school's contribution as well as pupils' academic ability and other pupil-related features. Traditionally, the literature uses parametric Multi-Level Models (i.e. models that assume a priori a particular functional form for the production process) to estimate the performance of nested entities. This paper discusses the use of the non-parametric Free Disposal Hull model (i.e. one making no a priori assumptions about the production process) as an alternative approach. While taking into account contextual characteristics as well as atypical observations, we show how to decompose non-parametrically the overall inefficiency of a pupil into a unit-specific and a higher-level (i.e. school) component. Using a sample of entry and exit attainments of 3017 girls in British ordinary single-sex schools, we test the robustness of the non-parametric and parametric estimates. We find that the two methods agree on the relative measures of the scope for potential attainment improvement. Further, the two methods agree on the variation in pupil attainment and on the proportion attributable to the pupil and school levels.
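As a rough illustration of the non-parametric idea behind the Free Disposal Hull, the sketch below scores a pupil against only those observed pupils who use no more of any input, with no functional form assumed. The data, variable names and output orientation are illustrative assumptions, not the decomposition used in the paper.

```python
# Minimal sketch of an output-oriented Free Disposal Hull (FDH) score:
# a pupil is compared only with observed pupils whose inputs (e.g. entry
# attainment) do not exceed hers. Data and names are illustrative.
import numpy as np

def fdh_output_score(inputs: np.ndarray, outputs: np.ndarray, unit: int) -> float:
    """Feasible expansion factor of the unit's output.

    inputs:  shape (n_units, n_inputs), e.g. entry attainment
    outputs: shape (n_units,),          e.g. exit attainment
    A score of 1.0 means no observed peer with no-greater inputs
    achieves a higher output, i.e. the unit is FDH-efficient.
    """
    # Peers that use no more of every input than the evaluated unit.
    dominating = np.all(inputs <= inputs[unit], axis=1)
    best_peer_output = outputs[dominating].max()  # includes the unit itself
    return best_peer_output / outputs[unit]

# Illustrative data: entry score (input) and exit score (output) per pupil.
entry = np.array([[50.0], [55.0], [48.0], [60.0]])
exit_ = np.array([62.0, 58.0, 70.0, 65.0])
for p in range(len(entry)):
    print(f"pupil {p}: output expansion factor = {fdh_output_score(entry, exit_, p):.2f}")
```

The paper's decomposition goes further by splitting such scores into a pupil-specific and a school-level component, but the dominance comparison above is the non-parametric core.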

Relevance:

30.00%

Publisher:

Abstract:

PCA/FA (principal components analysis/factor analysis) is a method of analyzing complex data sets in which there are no clearly defined X or Y variables. It has multiple uses, including the study of the pattern of variation between individual entities, such as patients with particular disorders, and the detailed study of descriptive variables. In most applications, variables are related to a smaller number of 'factors' or principal components (PCs) that account for the maximum variance in the data and hence may explain important trends among the variables. An increasingly important application of the method is in the 'validation' of questionnaires that attempt to relate subjective aspects of a patient's experience to more objective measures of vision.
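To make the idea concrete, here is a minimal sketch of extracting principal components from a small patient-by-variable matrix with scikit-learn. The random data, number of components and variable names are illustrative assumptions, not the questionnaire data referred to in the abstract.

```python
# Minimal sketch of PCA on a questionnaire-style data set using scikit-learn.
# Rows = patients, columns = questionnaire items / descriptive variables.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
scores = rng.normal(size=(30, 6))  # illustrative data only

# Standardise so every variable contributes on the same scale,
# then extract the components accounting for the most variance.
standardised = StandardScaler().fit_transform(scores)
pca = PCA(n_components=2)
patient_factor_scores = pca.fit_transform(standardised)

print("variance explained by each PC:", pca.explained_variance_ratio_)
print("loadings (variables on each PC):\n", pca.components_)
```

The loadings indicate which original variables move together on each component, which is what questionnaire 'validation' typically inspects.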

Relevance:

30.00%

Publisher:

Abstract:

Attitudes towards the environment fall into two broad categories, namely anthropocentric and ecocentric. The former regards nature as of value only insofar as it is useful to humanity, whereas the latter assigns intrinsic value to natural entities. Industrial society can be characterised as being dominated by anthropocentrism, which leads to the assumption that a majority of people hold anthropocentric values. However, research shows that the most widely held values are ecocentric, which implies that many people's actions are at variance with their values. Furthermore, policy relating to environmental issues is predominantly anthropocentric, which implies that it fails to take account of the values of the majority. Research among experts involved in policy formulation has shown that their values, often ecocentric, are excluded from the policy process. The genetic modification of food can be categorised as anthropocentric, which implies that the technique is in conflict with widely held ecocentric values. This thesis examines data collected from interviews with individuals who have an influence on the debate surrounding the introduction of genetically modified foods and can be considered 'experts'. Each interviewee is categorised according to whether their values and actions are ecocentric or anthropocentric, and the linkages between the two, as well as the arguments used to justify their positions, are explored. Particular emphasis is placed on interviewees who hold ecocentric values but act professionally in an anthropocentric way. Finally, common themes are drawn out, and the features that the arguments used by the interviewees have in common are outlined.

Relevance:

30.00%

Publisher:

Abstract:

It has been recognised for some time that a full code of amino acid-based recognition of DNA sequences would be useful. Several approaches, which utilise small DNA-binding motifs called zinc fingers, are presently employed. None of the current approaches successfully combines a combinatorial approach to the elucidation of a code with a single-stage, high-throughput screening assay. The work outlined here describes the development of a model system for the study of DNA-protein interactions and the development of a high-throughput assay for the detection of such interactions. A zinc finger protein was designed to bind with high affinity and specificity to a known DNA sequence. In future work it will be possible to mutate the region of the zinc finger responsible for the specificity of binding, in order to observe the effect on the DNA-protein interactions. The zinc finger protein was initially synthesised as a His-tagged product. It was not possible, however, to develop a high-throughput assay using the His-tagged zinc finger protein. The gene encoding the zinc finger protein was therefore altered and the protein synthesised as a Glutathione S-Transferase (GST) fusion product. A successful assay was developed using the GST fusion protein and Scintillation Proximity Assay technology (Amersham Pharmacia Biotech). The scintillation proximity assay is a dynamic assay that allows DNA-protein interactions to be studied in "real time". This assay not only provides a high-throughput method of screening zinc finger proteins for potential ligands but also allows the effect of the addition of reagents or competitor ligands to be monitored.

Relevance:

30.00%

Publisher:

Abstract:

A cell culture model of the gastric epithelial cell surface would prove useful for biopharmaceutical screening of new chemical entities and dosage forms. A successful model should exhibit tight junction formation and maintenance of differentiation and polarity. Conditions for primary culture of guinea-pig gastric mucous epithelial cell monolayers on Tissue Culture Plastic (TCP) and membrane inserts (Transwells) were established. Tight junction formation for cells grown on Transwells for three days was assessed by measurement of transepithelial electrical resistance (TEER) and the permeability of mannitol and fluorescein. Coating the polycarbonate filter with collagen IV, rather than with collagen I, enhanced tight junction formation. TEER for cells grown on Transwells coated with collagen IV was close to that obtained with intact guinea-pig gastric epithelium in vitro. Differentiation was assessed by incorporation of [3H] glucosamine into glycoprotein and by the activity of NADPH oxidase, which produces superoxide. Both of these measures were greater for cells grown on filters coated with collagen I than for cells grown on TCP, but no major difference was found between cells grown on collagens I and IV. However, monolayers grown on membranes coated with collagen IV exhibited apically polarized secretion of mucin and superoxide. The proportion of cells that stained positively for mucin with periodic acid-Schiff reagent was greater than 95% for all culture conditions. Gastric epithelial monolayers grown on Transwells coated with collagen IV were able to withstand transient (30 min) apical acidification to pH 3, which was associated with a decrease in [3H] mannitol flux and an increase in TEER relative to pH 7.4. The model was used to provide the first direct demonstration that an NSAID (indomethacin) accumulates in gastric epithelial cells exposed to low apical pH. In conclusion, guinea-pig gastric epithelial cells cultured on collagen IV represent a promising model of the gastric surface epithelium suitable for screening procedures.

Relevance:

30.00%

Publisher:

Abstract:

The present thesis focuses on the overall structure of the language of two types of Speech Exchange Systems (SES): Interview (INT) and Conversation (CON). The linguistic structure of INT and CON is quantitatively investigated on three different but interrelated levels of analysis: lexis, syntax and information structure. The corpus of data investigated for the project consists of eight sessions of pairs of conversants in carefully planned interviews followed by unplanned, surreptitiously recorded conversational encounters of the same pairs of speakers. The data comprise a total of approximately 15,200 words of INT talk and about 19,200 words of CON talk. Taking account of the debatable assumption that the language of SES might be complex on certain linguistic levels (e.g. syntax) (Halliday 1979) and simple on others (e.g. lexis) in comparison to written discourse, the thesis sets out to investigate this complexity using a statistical approach to the computation of the structures recurrent in the language of INT and CON. The findings clearly indicate the presence of linguistic complexity in both types. They also show the language of INT to be slightly more syntactically and lexically complex than that of CON. Lexical density seems to be relatively high in both types of spoken discourse. The language of INT also seems to be more complex than that of CON on the level of information structure. This is manifested in the greater use of Inferable and other linguistically complex entities of discourse. Halliday's suggestion that the language of SES is syntactically complex is confirmed, but not the suggestion that the more casual the conversation, the more syntactically complex it becomes. The results of the analysis point to the general conclusion that the linguistic complexity of types of SES lies not only in the high recurrence of syntactic structures, but also in the combination of these features with each other and with other linguistic and extralinguistic features. The linguistic analysis of the language of SES can be useful in understanding and pinpointing the intricacies of spoken discourse in general and will help discourse analysts and applied linguists in exploiting it for both theoretical and pedagogical purposes.
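One of the measures cited, lexical density, is conventionally computed as the proportion of content words among all tokens. The sketch below illustrates that calculation only; the stop-word list, tokenisation and example sentence are illustrative assumptions and will differ from the thesis's actual tagging scheme.

```python
# Minimal sketch of a lexical density calculation: content words as a
# proportion of all tokens. Stop-word list and example are illustrative.
FUNCTION_WORDS = {
    "the", "a", "an", "and", "or", "but", "of", "to", "in", "on", "at",
    "is", "are", "was", "were", "it", "that", "this", "i", "you", "he", "she",
}

def lexical_density(text: str) -> float:
    tokens = [t.strip(".,!?").lower() for t in text.split() if t.strip(".,!?")]
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    return len(content) / len(tokens) if tokens else 0.0

print(lexical_density("The interview language is dense and carefully planned."))
```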

Relevance:

30.00%

Publisher:

Abstract:

The traditional method of classifying neurodegenerative diseases is based on the original clinico-pathological concept supported by 'consensus' criteria and data from molecular pathological studies. This review discusses, first, current problems in classification resulting from the coexistence of different classificatory schemes, the presence of disease heterogeneity and multiple pathologies, the use of 'signature' brain lesions in diagnosis, and the existence of pathological processes common to different diseases. Second, three models of neurodegenerative disease are proposed: (1) that distinct diseases exist ('discrete' model), (2) that relatively distinct diseases exist but exhibit overlapping features ('overlap' model), and (3) that distinct diseases do not exist and neurodegenerative disease is a 'continuum' in which there is continuous variation in clinical/pathological features from one case to another ('continuum' model). Third, to distinguish between the models, the distribution of the most important molecular 'signature' lesions across the different diseases is reviewed. Such lesions often have poor 'fidelity', i.e., they are not unique to individual disorders but are distributed across many diseases, consistent with the overlap or continuum models. Fourth, the question of whether the current classificatory system should be rejected is considered and three alternatives are proposed, viz., objective classification, classification for convenience (a 'dissection'), or analysis as a continuum.

Relevance:

30.00%

Publisher:

Abstract:

One of the issues in the innovation systems literature is the examination of the technological learning strategies of laggard nations. Two distinct bodies of literature have contributed to our insight into the forces driving learning and innovation: the National Systems of Innovation (NSI) literature and the technological learning literature. Although both literatures yield insights on the catch-up strategies of 'latecomer' nations, the explanatory power of each literature by itself is limited. In this paper, a possible way of linking the macro- and micro-level approaches by incorporating enterprises as active learning entities into the learning and innovation system is proposed. The proposed model has been used to develop research hypotheses and indicate research directions, and is relevant for investigating the learning strategies of firms in less technologically intensive industries outside East Asia.

Relevance:

30.00%

Publisher:

Abstract:

Computational reflection is a well-established technique that gives a program the ability to observe and possibly modify its behaviour dynamically. To date, however, reflection has mainly been applied either to the software architecture or to its implementation. We know of no approach that fully supports requirements reflection, that is, making requirements available as runtime objects. Although there is a body of literature on requirements monitoring, such work typically generates runtime artefacts from requirements, so the requirements themselves are not directly accessible at runtime. In this paper, we define requirements reflection and a set of research challenges. Requirements reflection is important because software systems of the future will be self-managing and will need to adapt continuously to changing environmental conditions. We argue that requirements reflection can support such self-adaptive systems by making requirements first-class runtime entities, thus endowing software systems with the ability to reason about, understand, explain and modify requirements at runtime. © 2010 ACM.
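As one way to picture what "requirements as first-class runtime entities" could look like, the sketch below models a requirement as an object that the running system can evaluate, explain, and relax at runtime. The class, threshold and monitoring hook are illustrative assumptions, not an API proposed in the paper.

```python
# Minimal sketch of a requirement held as a first-class runtime object that a
# self-adaptive system can inspect and modify. Names and thresholds are
# illustrative assumptions, not an API defined in the paper.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class RuntimeRequirement:
    identifier: str
    description: str
    # Predicate over the latest monitored value, e.g. response time in ms.
    satisfied_by: Callable[[float], bool]
    history: list = field(default_factory=list)

    def evaluate(self, observation: float) -> bool:
        ok = self.satisfied_by(observation)
        self.history.append((observation, ok))
        return ok

    def explain(self) -> str:
        met = sum(1 for _, ok in self.history if ok)
        return f"{self.identifier}: '{self.description}' met {met}/{len(self.history)} times"

# The system can reason about the requirement and adapt it at runtime,
# e.g. relaxing a latency target when the environment degrades.
latency_req = RuntimeRequirement(
    identifier="R1",
    description="respond within 200 ms",
    satisfied_by=lambda ms: ms <= 200,
)
for sample in (150.0, 240.0, 190.0):
    latency_req.evaluate(sample)
print(latency_req.explain())

latency_req.description = "respond within 300 ms (relaxed)"
latency_req.satisfied_by = lambda ms: ms <= 300
```

The point of the sketch is simply that the requirement itself, not just a monitor derived from it, is the object being queried and changed while the system runs.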