22 results for Critical Evaluation
in Aston University Research Archive
Abstract:
This thesis considers the main theoretical positions within the contemporary sociology of nationalism. These can be grouped into two basic types: primordialist theories, which assert that nationalism is an inevitable aspect of all human societies, and modernist theories, which assert that nationalism and the nation-state first developed within western Europe in recent centuries. With respect to primordialist approaches to nationalism, it is argued that the main common explanation offered is human biological propensity. Consideration is concentrated on the most recent and plausible of such theories, sociobiology. Sociobiological accounts root nationalism and racism in genetic programming which favours close kin, or rather in the redirection of this programming in complex societies, where the social group is not a kin group. It is argued that the stated assumptions of the sociobiologists do not entail the conclusions they draw as to the roots of nationalism, and that in order to arrive at such conclusions further and implausible assumptions have to be made. With respect to modernists, the first group of writers who are considered are those, represented by Carlton Hayes, Hans Kohn and Elie Kedourie, whose main thesis is that the nation-state and nationalism are recent phenomena. Next, the two major attempts to relate nationalism and the nation-state to imperatives specific either to capitalist societies (in the `orthodox' marxist theory elaborated about the turn of the twentieth century) or to the processes of modernisation and industrialisation (the `Weberian' account of Ernest Gellner) are discussed. It is argued that modernist accounts can only be sustained by starting from a definition of nationalism and the nation-state which conflates such phenomena with others which are specific to the modern world.
The marxist and Gellner accounts form the necessary starting point for any explanation as to why the nation-state is apparently the sole viable form of polity in the modern world, but their assumption that no pre-modern society was national leaves them without an adequate account of the earliest origins of the nation-state and of nationalism. Finally, a case study from the history of England argues that both the achievement of a national state form and the elucidation of crucial components of a nationalist ideology were attained at a period not consistent with any of the versions of the modernist thesis.
Abstract:
Contrast susceptibility is defined as the difference in visual acuity recorded for high and low contrast optotypes. Other researchers refer to this parameter as "normalised low contrast acuity". Pilot surveys have revealed that contrast susceptibility deficits are more strongly related to driving accident involvement than are deficits in high contrast visual acuity. It has been hypothesised that driving situation avoidance is based purely upon high contrast visual acuity. Hence, the relationship between high contrast visual acuity and accidents is masked by situation avoidance, whilst drivers with contrast susceptibility deficits remain prone to accidents in poor visibility conditions. A national survey carried out to test this hypothesis provided no support either for the link between contrast susceptibility deficits and accident involvement or for the proposed hypothesis. Further, systematically worse contrast susceptibility scores emerged from vision screeners compared to wall-mounted test charts. This discrepancy was not due to variations in test luminance or instrument myopia. Instead, optical imperfections inherent in vision screeners were considered to be responsible. Although contrast susceptibility is unlikely to provide a useful means of screening drivers' vision, previous research does provide support for its ability to detect visual deficits that may influence everyday tasks. In this respect, individual contrast susceptibility variations were found to reflect variations in the contrast sensitivity function - a parameter that provides a global estimate of human contrast sensitivity.
Abstract:
OBJECTIVES: To examine the volume, relevance and quality of transnational tobacco corporations' (TTCs) evidence that standardised packaging of tobacco products 'won't work', following the UK government's decision to 'wait and see' until further evidence is available. DESIGN: Content analysis. SETTING: We analysed the evidence cited in submissions by the UK's four largest TTCs to the UK Department of Health consultation on standardised packaging in 2012. OUTCOME MEASURES: The volume, relevance (subject matter) and quality (as measured by independence from industry and peer-review) of evidence cited by TTCs were compared with evidence from a systematic review of standardised packaging. Fisher's exact test was used to assess differences in the quality of TTC and systematic review evidence. 100% of the data were second-coded to validate the findings: 94.7% intercoder reliability; all differences were resolved. RESULTS: 77/143 pieces of TTC-cited evidence were used to promote their claim that standardised packaging 'won't work'. Of these, just 17/77 addressed standardised packaging: 14 were industry connected and none were published in peer-reviewed journals. Comparison of TTC and systematic review evidence on standardised packaging showed that the industry evidence was of significantly lower quality in terms of tobacco industry connections and peer-review (p<0.0001). The most relevant TTC evidence (on standardised packaging or packaging generally, n=26) was of significantly lower quality (p<0.0001) than the least relevant (on other topics, n=51). Across the dataset, TTC-connected evidence was significantly less likely to be published in a peer-reviewed journal (p=0.0045).
CONCLUSIONS: With few exceptions, evidence cited by TTCs to promote their claim that standardised packaging 'won't work' lacks either policy relevance or key indicators of quality. Policymakers could use these three criteria (subject matter, independence and peer-review status) to critically assess evidence submitted to them by corporate interests via Better Regulation processes.
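The quality comparison described above rests on Fisher's exact test applied to 2x2 count tables. A minimal self-contained sketch of the two-sided test is below; the counts in the example are purely illustrative, not the study's data.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables sharing the
    observed margins that are no more likely than the observed table.
    """
    r1, r2 = a + b, c + d      # row totals
    c1 = a + c                 # first column total
    n = r1 + r2                # grand total

    def p_table(k):
        # probability that the top-left cell equals k, margins fixed
        return comb(r1, k) * comb(r2, c1 - k) / comb(n, c1)

    p_obs = p_table(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)   # feasible top-left values
    return sum(p for k in range(lo, hi + 1)
               if (p := p_table(k)) <= p_obs * (1 + 1e-9))

# Hypothetical counts (industry-connected vs independent pieces,
# peer-reviewed vs not) -- NOT the figures from the consultation analysis.
p = fisher_exact_two_sided(14, 3, 1, 19)
```

The small tolerance on the comparison guards against floating-point ties when summing "as or more extreme" tables.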
Abstract:
Abstract: This paper offers a critical evaluation of recent Irish industrial policy (IP) experience. It argues that whilst Ireland managed to get some things “right” through its IP, substantial tensions arose through making foreign direct investment (FDI) attraction the centrepiece of policy, without at the same time adopting a more holistic approach in IP which inter alia also placed an emphasis on indigenous firms and entrepreneurship more generally. In particular, greater efforts should have been made much earlier in attempting to embed transnational corporation (TNC)-led activity better into the wider economy, in fostering domestic small firms and entrepreneurship, in promoting clusters, and more generally in evaluating IP more fully – notwithstanding the context which militated against such actions. As a result, Ireland as an economy remained vulnerable to strategic decisions made elsewhere by TNC decision makers, with IP effectively contributing to a situation that can be characterised as institutional and strategic failure. Overall, the paper suggests that wholesale emulation of the Irish IP approach is problematic.
Abstract:
Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required. WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations, but cannot capture exceptions correctly; neither provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern associations between and attachment of suffixes, and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical validity requirement, and overgeneration, minimised by rule reformulation and restricting monosyllabic output. Rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Multiple rules applicable to an input suffix need their precedence established.
The resistance of prefixations to segmentation has been addressed by identifying linking vowel exceptions and irregular prefixes. The automatic affix discovery algorithm applies heuristics to identify meaningful affixes and is combined with morphological rules into a hybrid model, fed only with empirical data, collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and reformulation of rules. 96% of words and phrases are analysed. 218,802 directed derivational links have been encoded in the lexicon rather than the wordnet component of the model because the lexicon provides the optimal clustering of word senses. Both links and analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. Failure of all experiments to outperform disambiguation by frequency reflects on WordNet sense distinctions.
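The core idea above - suffix rules as character substitutions, filtered by a lexical validity requirement - can be sketched in a few lines. The rules and lexicon here are hypothetical toy examples, not the thesis's actual rule set.

```python
# Toy character-substitution rules: (suffix, replacement). A rule rewrites
# the suffix rather than merely stripping it, which avoids the fallacy
# that analysis can be performed by segmentation alone.
RULES = [
    ("ation", "ate"),   # creation  -> create
    ("sion", "de"),     # decision  -> decide
    ("iness", "y"),     # happiness -> happy
]

# Tiny illustrative lexicon standing in for the full lexical database.
LEXICON = {"create", "decide", "happy", "nation"}

def derive_base(word, rules=RULES, lexicon=LEXICON):
    """Return the first lexically valid base form of `word`, or None.

    A candidate is accepted only if it is a real lexical entry: this is
    the lexical validity check that curbs overgeneration.
    """
    for suffix, replacement in rules:
        if word.endswith(suffix) and len(word) > len(suffix):
            candidate = word[: -len(suffix)] + replacement
            if candidate in lexicon:
                return candidate
    return None

print(derive_base("decision"))   # substitution recovers "decide";
                                 # bare segmentation would yield "deci"
print(derive_base("nation"))     # None: "nate" fails lexical validity
```

Note how "nation" is correctly rejected: the rule fires, but the lexical validity requirement blocks the spurious base.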
Abstract:
Following centuries of feminist struggle, centuries which have borne witness to the close relationship between linguistic discrimination and social reality, there is a growing tendency in modern society to acknowledge the vital role played by language in overcoming gender discrimination. Political institutions are currently compensating by instituting the use of non-sexist language through legislative guidelines, and this makes an important contribution to social reform for equality between the sexes. Seeing that translation is so important for the creation of the collective identities on which modern global society depends, it is clear that non-sexist translation is crucial if there is to be non-sexist language. In this article I examine the potential of non-sexist translation in the struggle for gender equality from both a theoretical and a practical viewpoint, and I end with a critical evaluation of non-sexist translation methods.
Abstract:
The thesis investigates the properties of two trends or time series which formed a part of the Co-Citation bibliometric model "X-Ray Crystallography and Protein Determination in 1978, 1980 and 1982". This model was one of several created for the 1983 ABRC Science Policy Study, which aimed to test the utility of bibliometric models in a national science policy context. The outcome of the validation part of that study proved to be especially favourable concerning the utility of trend data, which purport to model the development of speciality areas in science over time. This assessment could have important implications for the use of such data in policy formulation. However, one possible problem with the Science Policy Study's conclusions was that insufficient time was available in the study for an in-depth analysis of the data. The thesis aims to continue the validation begun in the ABRC study by providing a detailed examination of the characteristics of the data contained in the trends numbered 11 and 44 in the model. A novel methodology for the analysis of the properties of the trends with respect to their literature content is presented. This is followed by an assessment, based on questionnaire and interview data, of the ability of Trend 44 to realistically model the historical development of the field of mobile genetic elements research over time, with respect to its scientific content and the activities of its community of researchers. The results of these various analyses are then used to evaluate the strengths and weaknesses of a trend or time series approach to the modelling of the activities of scientific fields. A critical evaluation of the origins of the discovered strengths and weaknesses in the assumptions underlying the techniques used to generate trends from co-citation data is provided. Possible improvements to the modelling techniques are discussed.
Abstract:
Despite improvements in interventional and pharmacological therapy of atherosclerotic disease, it is still the leading cause of death in the developed world. Hence, there is a need for further development of effective therapeutic approaches. This requires better understanding of the molecular mechanisms and pathophysiology of the disease. Atherosclerosis has long been identified as having an inflammatory component contributing to its pathogenesis, whereas the available therapy primarily targets hyperlipidemia and prevention of thrombosis. Notwithstanding the pleiotropic anti-inflammatory effects of some therapies, such as acetyl salicylic acid and the statins, none of the currently approved medicines for management of either stable or complicated atherosclerosis has inflammation as a primary target. Monocytes, as representatives of the innate immune system, play a major role in the initiation, propagation, and progression of atherosclerosis from a stable to an unstable state. Experimental data support a role of monocytes in acute coronary syndromes and in outcome post-infarction; however, limited research has been done in humans. Analysis of expression of various cell surface receptors allows characterization of the different monocyte subsets phenotypically, whereas downstream assessment of inflammatory pathways provides an insight into their activity. In this review we discuss the functional role of monocytes and their different subpopulations in atherosclerosis, acute coronary syndromes, cardiac healing, and recovery, with the aim of critically evaluating potential future therapeutic targets in atherosclerosis and its complications. We also discuss the technical difficulties of delineating different monocyte subpopulations and of understanding their differentiation potential and function.
Abstract:
Strategic sourcing has increased in importance in recent years, and now plays an important role in companies’ planning. The current volatility in supply markets means companies face multiple challenges involving lock-in situations, supplier bankruptcies or supply security issues. In addition, their exposure can increase due to natural disasters, as witnessed recently in the form of bird flu, volcanic ash and tsunamis. Therefore, the primary focus of this study is risk management in the context of strategic sourcing. The study presents a literature review on sourcing covering the 15 years from 1998 to 2012, and considers 131 academic articles. The literature describes strategic sourcing as a strategic, holistic process in managing supplier relationships, with a long-term focus on adding value to the company and realising competitive advantage. Few studies have examined the real risk impact and the status of risk management in strategic sourcing, and evaluation across countries and industries was limited, with the construction sector particularly under-researched. The methodology is founded on a qualitative study of twenty cases across Germany and the United Kingdom from the construction sector and electronics manufacturing industries. While considering risk management in the context of strategic sourcing, the thesis takes into account six dimensions that cover trends in strategic sourcing, theoretical and practical sourcing models, risk management, supply and demand management, critical success factors and strategic supplier evaluation. The study contributes in several ways. First, recent trends are traced and future needs identified across the research dimensions of countries, industries and companies. Second, it evaluates critical success factors in contemporary strategic sourcing. Third, it explores the application of theoretical and practical sourcing models in terms of effectiveness and sustainability.
Fourth, based on the case study findings, a risk-oriented strategic sourcing framework and a model for strategic sourcing are developed. These are based on the validation of contemporary requirements and a critical evaluation of the existing situation. The framework draws on the empirical findings and leads to a structured process for managing risk in strategic sourcing. It considers areas such as trends, corporate and sourcing strategy, critical success factors, strategic supplier selection criteria, risk assessment, reporting and strategy alignment. The proposed model highlights the essential dimensions in strategic sourcing and leads to a new definition of strategic sourcing supported by this empirical study.
Abstract:
Nitric oxide (NO) and hydrogen sulfide (H2S) are two major gaseous signaling molecules that regulate diverse physiological functions. Recent publications indicate the regulatory role of H2S on NO metabolism. In this chapter, we discuss the latest findings on H2S-NO interactions through formation of novel chemical derivatives and experimental approaches to study these adducts. This chapter also addresses potential H2S interference on various NO detection techniques, along with precautions for analyzing biological samples from various sources. This information will facilitate critical evaluation and clearer insight into H2S regulation of NO signaling and its influence on various physiological functions.
Abstract:
This paper draws industrial policy lessons for small Central and Eastern European states through a critical evaluation of recent Irish and Hungarian experiences. The paper outlines a ‘holistic view’ of industrial policy before exploring the experiences of the two economies. Whilst both have managed to ‘do’ policy well in some regards, substantial challenges remain in making FDI attraction the centrepiece of industrial policy, as has been highlighted recently. Overall, the paper suggests that wholesale emulation of the Irish and Hungarian approach is problematic for small open CEE states, and that more balanced approaches to development, and hence to industrial policy, are warranted.
Abstract:
Book review: Evaluation in Translation: Critical Points of Translator Decision-Making, by Jeremy Munday, London, Routledge, 2012, 194 pp., £95 (hardback), ISBN 978-0-415-57769-4, £26.99 (paperback), ISBN 978-0-415-57770-0.
Abstract:
Whole life costing (WLC) has become best practice in construction procurement, and accurately predicting the whole life costs of a construction project is likely to be a major issue. However, different expectations from different organizations throughout a project's life, and the lack of data, monitoring targets, and long-term interest for many key players, are obstacles to be overcome if WLC is to be implemented. A questionnaire survey was undertaken to investigate a set of ten common factors and 188 individual factors. These were grouped into eight critical categories (project scope, time, cost, quality, contract/administration, human resource, risk, and health and safety) by project phase, as perceived by the clients, contractors and subcontractors, in order to identify critical success factors for whole life performance assessment (WLPA). Using a relative importance index, the top ten critical factors for each category, from the perspective of project participants, were analyzed and ranked. Their agreement on those categories and factors was analyzed using Spearman's rank correlation. All participants identify “Type of Project” as the most common critical factor in the eight categories for WLPA. Using the relative index ranking technique and weighted average methods, it was found that the most critical individual factors in each category were: “clarity of contract” (scope); “fixed construction period” (time); “precise project budget estimate” (cost); “material quality” (quality); “mutual/trusting relationships” (contract/administration); “leadership/team management” (human resource); and “management of work safety on site” (health and safety). There was a relatively high agreement on these categories among all participants. With 80 critical factors of WLPA, there is a stronger positive relationship between client and contractor than between contractor and subcontractor, or between client and subcontractor.
Putting these critical factors into a criteria matrix can facilitate an initial framework of WLPA in order to aid decision making in the public sector in South Korea for evaluation/selection process of a construction project at the bid stage.
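The relative importance index used above is conventionally computed as RII = ΣW / (A × N), where W are the Likert-scale ratings, A is the highest possible rating and N the number of respondents. A minimal sketch follows; the factor names echo the abstract, but the ratings are hypothetical, not the survey's data.

```python
def relative_importance_index(ratings, top=5):
    """RII = sum(W) / (A * N) for ratings W on a 1..top Likert scale.

    A is the highest possible rating (`top`) and N the number of
    respondents (len(ratings)); the result lies in (0, 1].
    """
    return sum(ratings) / (top * len(ratings))

# Hypothetical ratings from ten respondents per factor (illustration only).
factors = {
    "clarity of contract":       [5, 5, 4, 5, 4, 5, 4, 5, 5, 4],
    "fixed construction period": [4, 4, 3, 5, 4, 4, 3, 4, 4, 4],
    "material quality":          [3, 4, 4, 3, 3, 4, 3, 4, 3, 3],
}

# Rank factors from most to least important by their RII.
ranked = sorted(factors,
                key=lambda f: relative_importance_index(factors[f]),
                reverse=True)
```

Because every factor is divided by the same maximum achievable score, RII values are directly comparable across factors with equal respondent counts, which is what makes the ranking meaningful.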
Abstract:
Background: Screening for congenital heart defects (CHDs) relies on antenatal ultrasound and postnatal clinical examination; however, life-threatening defects often go undetected. Objective: To determine the accuracy, acceptability and cost-effectiveness of pulse oximetry as a screening test for CHDs in newborn infants. Design: A test accuracy study determined the accuracy of pulse oximetry. Acceptability of testing to parents was evaluated through a questionnaire, and to staff through focus groups. A decision-analytic model was constructed to assess cost-effectiveness. Setting: Six UK maternity units. Participants: These were 20,055 asymptomatic newborns at ≥35 weeks’ gestation, their mothers and health-care staff. Interventions: Pulse oximetry was performed prior to discharge from hospital and the results of this index test were compared with a composite reference standard (echocardiography, clinical follow-up and follow-up through interrogation of clinical databases). Main outcome measures: Detection of major CHDs – defined as causing death or requiring invasive intervention up to 12 months of age (subdivided into critical CHDs causing death or intervention before 28 days, and serious CHDs causing death or intervention between 1 and 12 months of age); acceptability of testing to parents and staff; and the cost-effectiveness in terms of cost per timely diagnosis. Results: Fifty-three of the 20,055 babies screened had a major CHD (24 critical and 29 serious), a prevalence of 2.6 per 1000 live births. Pulse oximetry had a sensitivity of 75.0% [95% confidence interval (CI) 53.3% to 90.2%] for critical cases and 49.1% (95% CI 35.1% to 63.2%) for all major CHDs. When 23 cases were excluded, in which a CHD was already suspected following antenatal ultrasound, pulse oximetry had a sensitivity of 58.3% (95% CI 27.7% to 84.8%) for critical cases (12 babies) and 28.6% (95% CI 14.6% to 46.3%) for all major CHDs (35 babies).
False-positive (FP) results occurred in 1 in 119 babies (0.84%) without major CHDs (specificity 99.2%, 95% CI 99.0% to 99.3%). However, of the 169 FPs, there were six cases of significant but not major CHDs and 40 cases of respiratory or infective illness requiring medical intervention. The prevalence of major CHDs in babies with normal pulse oximetry was 1.4 (95% CI 0.9 to 2.0) per 1000 live births, as 27 babies with major CHDs (6 critical and 21 serious) were missed. Parent and staff participants were predominantly satisfied with screening, perceiving it as an important test to detect ill babies. There was no evidence that mothers given FP results were more anxious after participating than those given true-negative results, although they were less satisfied with the test. White British/Irish mothers were more likely to participate in the study, and were less anxious and more satisfied than those of other ethnicities. The incremental cost-effectiveness ratio of pulse oximetry plus clinical examination compared with examination alone is approximately £24,900 per timely diagnosis in a population in which antenatal screening for CHDs already exists. Conclusions: Pulse oximetry is a simple, safe, feasible test that is acceptable to parents and staff and adds value to existing screening. It is likely to identify cases of critical CHDs that would otherwise go undetected. It is also likely to be cost-effective given current acceptable thresholds. The detection of other pathologies, such as significant CHDs and respiratory and infective illnesses, is an additional advantage. Other pulse oximetry techniques, such as perfusion index, may enhance detection of aortic obstructive lesions.
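The headline figures above follow directly from the reported counts: 18 of the 24 critical CHDs detected gives the 75.0% sensitivity, and 169 false positives among the 20,002 babies without major CHDs gives the ~99.2% specificity. The sketch below recomputes them and attaches a Wilson score interval; note this is an approximation, not the exact (Clopper-Pearson) intervals the study reports, so the bounds differ slightly.

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion.

    An approximate interval; the study's reported CIs appear to be
    exact, so the numbers will not match precisely.
    """
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# Counts recoverable from the abstract.
detected_critical = 18          # 75.0% of 24 critical CHDs
total_critical = 24
false_positives = 169
without_major_chd = 20055 - 53  # 20,002 babies without a major CHD

sensitivity = detected_critical / total_critical            # 0.75
specificity = (without_major_chd - false_positives) / without_major_chd
lo, hi = wilson_ci(detected_critical, total_critical)
```

With these counts, specificity evaluates to about 0.9916, matching the abstract's 99.2%, and 20,002/169 is roughly 118, matching the "1 in 119" false-positive rate.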
Abstract:
The purpose of this concise paper is to propose, with evidence gathered through a systematic evaluation of an academic development programme in the UK, that training in the use of new and emerging learning technologies should be holistically embedded in every learning and training opportunity in learning, teaching and assessment in higher education, and not offered only as stand-alone modules or one-off opportunities. The future of learning in higher education cannot afford to let universities disregard the fact that digital literacy is an expected professional skill for all of their staff.