22 results for Evidence Containers, Representation, Provenance, Tool Interoperability
in Aston University Research Archive
Abstract:
In recent years, it has become increasingly common for companies to improve their competitiveness and find new markets by extending their operations through international new product development collaborations involving technology transfer. Technology development, cost reduction and market penetration are seen as the foci in such collaborative operations with the aim being to improve the competitive position of both partners. In this paper, the case of technology transfer through collaborative new product development in the machine tool sector is used to provide a typical example of such partnerships. The paper outlines the links between the operational aspects of collaborations and their strategic objectives. It is based on empirical data collected from the machine tool industries in the UK and China. The evidence includes longitudinal case studies and questionnaire surveys of machine tool manufacturers in both countries. The specific case of BSA Tools Ltd and its Chinese partner the Changcheng Machine Tool Works is used to provide an in-depth example of the operational development of a successful collaboration. The paper concludes that a phased coordination of commercial, technical and strategic interactions between the two partners is essential for such collaborations to work.
Abstract:
Purpose. We investigated structural differences in the fatty acid profiles of lipids extracted from ex vivo contact lenses by using gas chromatography mass spectrometry (GCMS). Two lens materials (balafilcon A or lotrafilcon A) were worn on a daily or continuous wear schedule for 30 and 7 days. Methods. Lipids from subject-worn lenses were extracted using 1:1 chloroform:methanol and transmethylated using 5% sulfuric acid in methanol. Fatty acid methyl esters (FAMEs) were collected using hexane and water, and analyzed by GCMS (Varian 3800 GC, Saturn 2000 MS). Results. The gas chromatograms of extracts from lenses worn on a continuous wear schedule showed two predominant peaks, C16:0 and C18:0, both of which are saturated fatty acids. This was the case for balafilcon A and lotrafilcon A lenses. However, the gas chromatograms of extracts from lenses worn on a daily wear schedule showed saturated (C16:0, C18:0) and unsaturated (C16:1 and C18:1) fatty acids. Conclusions. Unsaturated fatty acids are degraded during sleep in contact lenses. Degradation occurred independently of lens material or subject-to-subject variability in lipid deposition. The consequences of lipid degradation are the production of oxidative products, which may be linked to contact lens discomfort. © 2014 The Association for Research in Vision and Ophthalmology, Inc.
Abstract:
BACKGROUND: Standardised packaging (SP) of tobacco products is an innovative tobacco control measure opposed by transnational tobacco companies (TTCs) whose responses to the UK government's public consultation on SP argued that evidence was inadequate to support implementing the measure. The government's initial decision, announced 11 months after the consultation closed, was to wait for 'more evidence', but four months later a second 'independent review' was launched. In view of the centrality of evidence to debates over SP and TTCs' history of denying harms and manufacturing uncertainty about scientific evidence, we analysed their submissions to examine how they used evidence to oppose SP. METHODS AND FINDINGS: We purposively selected and analysed two TTC submissions using a verification-oriented cross-documentary method to ascertain how published studies were used and interpretive analysis with a constructivist grounded theory approach to examine the conceptual significance of TTC critiques. The companies' overall argument was that the SP evidence base was seriously flawed and did not warrant the introduction of SP. However, this argument was underpinned by three complementary techniques that misrepresented the evidence base. First, published studies were repeatedly misquoted, distorting the main messages. Second, 'mimicked scientific critique' was used to undermine evidence; this form of critique insisted on methodological perfection, rejected methodological pluralism, adopted a litigation (not scientific) model, and was not rigorous. Third, TTCs engaged in 'evidential landscaping', promoting a parallel evidence base to deflect attention from SP and excluding company-held evidence relevant to SP. The study's sample was limited to sub-sections of two out of four submissions, but leaked industry documents suggest at least one other company used a similar approach. CONCLUSIONS: The TTCs' claim that SP will not lead to public health benefits is largely without foundation. The tools of Better Regulation, particularly stakeholder consultation, provide an opportunity for highly resourced corporations to slow, weaken, or prevent public health policies.
Abstract:
This paper uses evidence gathered in two perception studies of Australasian and British accounting academics to reflect on aspects of the knowledge production system within accounting academe. We provide evidence of the representation of multiple paradigms in many journals that are scored by participants as being of high quality. Indeed, most of the journals we surveyed are perceived by accounting academics as incorporating research from more than one paradigm. It is argued that this 'catholic' approach by journal editors and the willingness of many respondents in our surveys to score journals highly on material they publish from both paradigm categories reflects a balanced acceptance of the multi-paradigmatic state of accounting research. Our analysis is set within an understanding of systems of accounting knowledge production as socially constructed and as playing an important role in the distribution of power and reward in the academy. We explore the impact of our results on concerns emerging from the work of a number of authors who carefully expose localised 'elites'. The possibilities for a closer relationship between research emerging from a multi-paradigm discipline and policy setting and practice are also discussed. The analysis provides a sense of optimism that the broad constituency of accounting academics operates within an environment conducive to the exchange of ideas. That optimism is dampened by concerns about the impact of local 'elites' and the need for more research on their impact on accounting academe.
Abstract:
A multi-chromosome GA (Multi-GA) was developed, based upon concepts from the natural world, allowing improved flexibility in a number of areas including representation, genetic operators, their parameter rates and real-world multi-dimensional applications. A series of experiments was conducted, comparing the performance of the Multi-GA to a traditional GA on a number of recognised and increasingly complex test optimisation surfaces, with promising results. Further experiments demonstrated the Multi-GA's flexibility through the use of non-binary chromosome representations and its applicability to dynamic parameterisation. A number of alternative and new methods of dynamic parameterisation were investigated, in addition to a new non-binary 'Quotient crossover' mechanism. Finally, the Multi-GA was applied to two real-world problems, demonstrating its ability to handle mixed-type chromosomes within an individual, the limited use of a chromosome-level fitness function, the introduction of new genetic operators for structural self-adaptation and its viability as a serious real-world analysis tool. The first problem involved optimum placement of computers within a building, allowing the Multi-GA to use multiple chromosomes with different representation types and different operators in a single individual. The second problem, commonly associated with Geographical Information Systems (GIS), required a spatial analysis to locate the optimum number and distribution of retail sites over two different population grids. In applying the Multi-GA, two new genetic operators (addition and deletion) were developed and explored, resulting in the definition of a mechanism for self-modification of genetic material within the Multi-GA structure and a study of this behaviour.
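To make the multi-chromosome idea described above concrete, the following Python sketch shows one possible way to let each chromosome in an individual carry its own representation and its own crossover and mutation operators. It is a minimal illustration only; all class and function names are hypothetical and are not taken from the thesis.

```python
import random

class Chromosome:
    """One chromosome with its own representation and operators."""
    def __init__(self, genes, crossover, mutate):
        self.genes = genes          # e.g. a list of bits, reals or symbols
        self.crossover = crossover  # operator chosen per chromosome
        self.mutate = mutate

class MultiChromosomeIndividual:
    """An individual built from several chromosomes of possibly different types."""
    def __init__(self, chromosomes):
        self.chromosomes = chromosomes

    def recombine(self, other):
        """Apply each chromosome's own crossover operator pairwise."""
        children = [c.crossover(c, o) for c, o in zip(self.chromosomes, other.chromosomes)]
        return MultiChromosomeIndividual(children)

def one_point_crossover(a, b):
    """Classic one-point crossover for list-of-bits chromosomes."""
    cut = random.randrange(1, len(a.genes))
    return Chromosome(a.genes[:cut] + b.genes[cut:], a.crossover, a.mutate)

def blend_crossover(a, b):
    """Simple averaging crossover for real-valued chromosomes."""
    genes = [(x + y) / 2.0 for x, y in zip(a.genes, b.genes)]
    return Chromosome(genes, a.crossover, a.mutate)

def flip_mutation(c, rate=0.01):
    c.genes = [g ^ 1 if random.random() < rate else g for g in c.genes]

def gaussian_mutation(c, rate=0.1, sigma=0.1):
    c.genes = [g + random.gauss(0, sigma) if random.random() < rate else g for g in c.genes]

def random_individual():
    """A mixed-type individual: one binary chromosome plus one real-valued chromosome."""
    return MultiChromosomeIndividual([
        Chromosome([random.randint(0, 1) for _ in range(16)], one_point_crossover, flip_mutation),
        Chromosome([random.uniform(-5, 5) for _ in range(4)], blend_crossover, gaussian_mutation),
    ])

parent1, parent2 = random_individual(), random_individual()
child = parent1.recombine(parent2)
for chromosome in child.chromosomes:
    chromosome.mutate(chromosome)
```

Extending the individual with addition and deletion operators for whole chromosomes, as the thesis describes, would then amount to appending to or removing entries from the `chromosomes` list.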
Abstract:
This thesis describes a novel connectionist machine utilizing induction by a Hilbert hypercube representation. This representation offers a number of distinct advantages which are described. We construct a theoretical and practical learning machine which lies in an area of overlap between three disciplines (neural nets, machine learning and knowledge acquisition); hence it is referred to as a "coalesced" machine. To this unifying aspect are added the various advantages of its orthogonal lattice structure as against less structured nets. We discuss the case for such a fundamental and low-level empirical learning tool, and the assumptions behind the machine are clearly outlined. Our theory of an orthogonal lattice structure, the Hilbert hypercube of an n-dimensional space using a complemented distributive lattice, as a basis for supervised learning is derived from first principles on a clearly laid out scientific basis. The resulting "subhypercube theory" was implemented in a development machine which was then used to test the theoretical predictions, again under strict scientific guidelines. The scope, advantages and limitations of this machine were tested in a series of experiments. Novel and seminal properties of the machine include: the "metrical", deterministic and global nature of its search; complete convergence invariably producing minimum polynomial solutions for both disjuncts and conjuncts even with moderate levels of noise present; a learning engine which is mathematically analysable in depth based upon the "complexity range" of the function concerned; a strong bias towards the simplest possible globally (rather than locally) derived "balanced" explanation of the data; the ability to cope with variables in the network; and new ways of reducing the exponential explosion. Performance issues were addressed, and comparative studies with other learning machines indicate that our novel approach has definite value and should be further researched.
Abstract:
There is some evidence to suggest that nitriding of alloy steels, in particular high speed tool steels, under carefully controlled conditions might sharply increase rolling contact fatigue resistance. However, the subsurface shear stresses developed in aerospace bearing applications tend to occur at depths greater than the usual case depths currently produced by nitriding. Additionally, case development must be limited with certain materials due to case spalling and may not always be sufficient to achieve the current theoretical depths necessary to ensure that peak stresses occur within the case. It was the aim of this work to establish suitable plasma nitriding treatments to overcome this problem. To assist this development a study has been made of prior hardening treatment, case development, residual stress and case cracking tendency. M2 in the underhardened, undertempered and fully hardened and tempered conditions all responded similarly to plasma nitriding, maximum surface hardening being achieved by plasma nitriding at 450°C. Case development varied linearly with increasing treatment temperature and also with the square root of the treatment time. Maximum surface hardness of M50 and T1 steels was achieved by plasma nitriding in 15% nitrogen/85% hydrogen and varied logarithmically with atmosphere nitrogen content. The case-cracking contact stress varied linearly with nitriding temperature for M2. T1 and M50 supported higher stresses after nitriding in low nitrogen plasma atmospheres. Unidirectional bending fatigue of M2 has been improved up to three times the strength of the fully hardened and tempered condition by plasma nitriding for 16 hrs at 400°C. Fatigue strengths of T1 and M50 have been improved by up to 30% by plasma nitriding for 16 hrs at 450°C in a 75% hydrogen/25% nitrogen atmosphere.
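The empirical trends reported above can be summarised by simple functional forms. The expressions below are purely illustrative restatements of the stated relationships, with the square-root time dependence being consistent with diffusion-controlled case growth; the constants a, b, k, c and m are placeholders and are not values from the study.

```latex
% Illustrative functional forms only; a, b, k, c and m are placeholder constants.
\begin{align*}
  d_{\text{case}} &\approx a + b\,T                     && \text{case depth linear in treatment temperature } T,\\
  d_{\text{case}} &\approx k\,\sqrt{t}                  && \text{case depth proportional to the square root of treatment time } t,\\
  H_{\text{max}}  &\approx c + m\,\ln x_{\mathrm{N_2}}  && \text{surface hardness logarithmic in plasma nitrogen fraction } x_{\mathrm{N_2}}.
\end{align*}
```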
Abstract:
Classification of metamorphic rocks is normally carried out using a poorly defined, subjective classification scheme making this an area in which many undergraduate geologists experience difficulties. An expert system to assist in such classification is presented which is capable of classifying rocks and also giving further details about a particular rock type. A mixed knowledge representation is used with frame, semantic and production rule systems available. Classification in the domain requires that different facets of a rock be classified. To implement this, rocks are represented by 'context' frames with slots representing each facet. Slots are satisfied by calling a pre-defined ruleset to carry out the necessary inference. The inference is handled by an interpreter which uses a dependency graph representation for the propagation of evidence. Uncertainty is handled by the system using a combination of the MYCIN certainty factor system and the Dempster-Shafer range mechanism. This allows for positive and negative reasoning, with rules capable of representing necessity and sufficiency of evidence, whilst also allowing the implementation of an alpha-beta pruning algorithm to guide question selection during inference. The system also utilizes a semantic net type structure to allow the expert to encode simple relationships between terms enabling rules to be written with a sensible level of abstraction. Using frames to represent rock types where subclassification is possible allows the knowledge base to be built in a modular fashion with subclassification frames only defined once the higher level of classification is functioning. Rulesets can similarly be added in modular fashion with the individual rules being essentially declarative allowing for simple updating and maintenance. The knowledge base so far developed for metamorphic classification serves to demonstrate the performance of the interpreter design whilst also moving some way towards providing a useful assistant to the non-expert metamorphic petrologist. The system demonstrates the possibilities for a fully developed knowledge base to handle the classification of igneous, sedimentary and metamorphic rocks. The current knowledge base and interpreter have been evaluated by potential users and experts. The results of the evaluation show that the system performs to an acceptable level and should be of use as a tool for both undergraduates and researchers from outside the metamorphic petrography field.
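As a concrete illustration of the MYCIN-style certainty factor handling mentioned above, the minimal Python sketch below combines the certainty factors contributed by several rules using the standard MYCIN combination rules. The slot and the rule values are hypothetical, and the Dempster-Shafer range mechanism described in the abstract is not shown.

```python
def combine_cf(cf1: float, cf2: float) -> float:
    """Combine two certainty factors (each in [-1, 1]) with the classic MYCIN rules."""
    if cf1 >= 0 and cf2 >= 0:          # two pieces of confirming evidence
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:            # two pieces of disconfirming evidence
        return cf1 + cf2 * (1 + cf1)
    # conflicting evidence
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Hypothetical evidence for a single slot, e.g. "texture = schistose":
# two rules support the conclusion and one argues against it.
net_cf = 0.0
for rule_cf in (0.6, 0.4, -0.3):
    net_cf = combine_cf(net_cf, rule_cf)
print(round(net_cf, 3))   # net belief after propagating all three rules
```

In the interpreter described above these values would be propagated through the dependency graph rather than accumulated in a simple loop; the sketch shows only the certainty factor arithmetic.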
Abstract:
Discrete event simulation of manufacturing systems has become widely accepted as an important tool to aid the design of such systems. Often, however, it is applied by practitioners in a manner which largely ignores an important element of industry; namely, the workforce. Workers are usually represented as simple resources, often with deterministic performance values. This approach ignores the potentially large effect that human performance variation can have on a system. A long-term data collection exercise is described with the aim of quantifying the performance variation of workers in a typical automotive assembly plant. The data are presented in a histogram form which is immediately usable in simulations to improve the accuracy of design assessment. The results show levels of skewness and range which are far larger than anticipated by current researchers and practitioners in the field.
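To illustrate the point about replacing deterministic worker performance values with an empirical distribution, the short Python sketch below samples task times from a histogram of observed performance and compares the total against a fixed 'standard' time. The histogram bins and values are invented for illustration and are not the plant data reported in the study.

```python
import random

# Hypothetical histogram of observed assembly-task times in seconds:
# (bin lower edge, bin upper edge, relative frequency).
histogram = [
    (40, 50, 0.05),
    (50, 60, 0.35),
    (60, 70, 0.30),
    (70, 90, 0.20),
    (90, 130, 0.10),   # long right tail: more skew and range than a fixed value implies
]

def sample_task_time(hist):
    """Pick a bin in proportion to its frequency, then a uniform value within it."""
    lows, highs, weights = zip(*hist)
    i = random.choices(range(len(hist)), weights=weights)[0]
    return random.uniform(lows[i], highs[i])

def simulate_shift(n_tasks, deterministic=False):
    """Total time to complete n_tasks, with or without worker performance variation."""
    if deterministic:
        return n_tasks * 62.0          # a single 'standard' task time
    return sum(sample_task_time(histogram) for _ in range(n_tasks))

random.seed(1)
print(simulate_shift(400, deterministic=True))   # fixed-time estimate
print(simulate_shift(400))                       # estimate with empirical variation
```

In a full discrete event simulation the sampled times would feed individual workstation events rather than a simple sum, but the histogram sampling step is the element the study argues is usually missing.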
Abstract:
The classic hypothesis of Livingstone and Hubel (1984, 1987) proposed two types of color pathways in primate visual cortex based on recordings from single cells: a segregated, modular pathway that signals color but provides little information about shape or form and a second pathway that signals color differences and so defines forms without the need to specify their colors. A major problem has been to reconcile this neurophysiological hypothesis with the behavioral data. A wealth of psychophysical studies has demonstrated that color vision has orientation-tuned responses and little impairment on form-related tasks, but these have not revealed any direct evidence for nonoriented mechanisms. Here we use a psychophysical method of subthreshold summation across orthogonal orientations for isoluminant red-green gratings in monocular and dichoptic viewing conditions to differentiate between nonoriented and orientation-tuned responses to color contrast. We reveal nonoriented color responses at low spatial frequencies (0.25-0.375 c/deg) under monocular conditions changing to orientation-tuned responses at higher spatial frequencies (1.5 c/deg) and under binocular conditions. We suggest that two distinct pathways coexist in color vision at the behavioral level, revealed at different spatial scales: one is isotropic, monocular, and best equipped for the representation of surface color, and the other is orientation-tuned, binocular, and selective for shape and form. This advances our understanding of the organization of the neural pathways involved in human color vision and provides a strong link between neurophysiological and behavioral data. © 2013 ARVO.
Abstract:
Innovation is central to the survival and growth of firms, and ultimately to the health of the economies of which they are part. A clear understanding both of the processes by which firms perform innovation and the benefits which flow from innovation in terms of productivity and growth is therefore essential. This paper demonstrates the use of a conceptual framework and modeling tool, the innovation value chain (IVC), and shows how the IVC approach helps to highlight strengths and weaknesses in the innovation performance of a key group of firms: new technology-based firms. The value of the IVC is demonstrated in showing the key interrelationships in the whole process of innovation from sourcing knowledge through product and process innovation to performance in terms of the growth and productivity outcomes of different types of innovation. The use of the IVC highlights key complementarities, such as that between internal R&D, external R&D, and other external sources of knowledge. Other important relationships are also highlighted. Skill resources matter throughout the IVC, being positively associated with external knowledge linkages and innovation success, and also having a direct influence on growth independent of the effect on innovation. A key benefit of the IVC approach is therefore its ability to highlight the roles of different factors at various stages of the knowledge-innovation-performance nexus, and to show their indirect as well as direct impact. This in turn permits both managerial and policy implications to be drawn. © 2012 Product Development & Management Association.
Abstract:
A psychologically validated tool is used to assess the recall of personal experiences relevant to a claim of food allergy or food intolerance, looking for evidence that the suspected food could have caused the adverse symptom suffered. The tool distinguishes recall from memory of a particular episode or episodes in which eating the food was followed by symptoms, resulting in self-diagnosis of food allergy or intolerance, from merely theoretical knowledge that such symptoms could arise after eating the food. If there is detailed recall of events that point to the food as a potential cause of the symptom and the symptom is sufficiently serious, the tool user is recommended to seek testing at an allergy clinic or by the appropriate specialist for a non-allergic sensitivity. If what is recalled does not support the logical possibility of a causal connection between eating that food and occurrence of the symptom, then the user of the tool is pointed to other potential sources of the problem. The user is also recommended to investigate remedies other than avoidance of the food that had been blamed.
Abstract:
Although the importance of dataset fitness-for-use evaluation and intercomparison is widely recognised within the GIS community, no practical tools have yet been developed to support such interrogation. GeoViQua aims to develop a GEO label which will visually summarise and allow interrogation of key informational aspects of geospatial datasets upon which users rely when selecting datasets for use. The proposed GEO label will be integrated in the Global Earth Observation System of Systems (GEOSS) and will be used as a value and trust indicator for datasets accessible through the GEO Portal. As envisioned, the GEO label will act as a decision support mechanism for dataset selection and thereby hopefully improve user recognition of the quality of datasets. To date we have conducted 3 user studies to (1) identify the informational aspects of geospatial datasets upon which users rely when assessing dataset quality and trustworthiness, (2) elicit initial user views on a GEO label and its potential role and (3) evaluate prototype label visualisations. Our first study revealed that, when evaluating quality of data, users consider 8 facets: dataset producer information; producer comments on dataset quality; dataset compliance with international standards; community advice; dataset ratings; links to dataset citations; expert value judgements; and quantitative quality information. Our second study confirmed the relevance of these facets in terms of the community-perceived function that a GEO label should fulfil: users and producers of geospatial data supported the concept of a GEO label that provides a drill-down interrogation facility covering all 8 informational aspects. Consequently, we developed three prototype label visualisations and evaluated their comparative effectiveness and user preference via a third user study to arrive at a final graphical GEO label representation. When integrated in the GEOSS, an individual GEO label will be provided for each dataset in the GEOSS clearinghouse (or other data portals and clearinghouses) based on its available quality information. Producer and feedback metadata documents are being used to dynamically assess information availability and generate the GEO labels. The producer metadata document can either be a standard ISO compliant metadata record supplied with the dataset, or an extended version of a GeoViQua-derived metadata record, and is used to assess the availability of a producer profile, producer comments, compliance with standards, citations and quantitative quality information. GeoViQua is also currently developing a feedback server to collect and encode (as metadata records) user and producer feedback on datasets; these metadata records will be used to assess the availability of user comments, ratings, expert reviews and user-supplied citations for a dataset. The GEO label will provide drill-down functionality which will allow a user to navigate to a GEO label page offering detailed quality information for its associated dataset. At this stage, we are developing the GEO label service that will be used to provide GEO labels on demand based on supplied metadata records. In this presentation, we will provide a comprehensive overview of the GEO label development process, with specific emphasis on the GEO label implementation and integration into the GEOSS.
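As a highly simplified illustration of how availability of the eight informational facets might be assessed from producer and feedback metadata, the Python sketch below checks a combined record for each facet. The field names and example records are hypothetical and do not correspond to the actual GeoViQua metadata schemas, the feedback server, or the GEO label service API.

```python
# The eight informational facets identified in the user studies.
FACETS = [
    "producer_profile",
    "producer_comments",
    "standards_compliance",
    "community_advice",
    "ratings",
    "citations",
    "expert_reviews",
    "quantitative_quality",
]

def assess_facets(producer_metadata: dict, feedback_metadata: dict) -> dict:
    """Mark each facet as 'available' or 'not available' from hypothetical metadata fields."""
    combined = {**producer_metadata, **feedback_metadata}
    return {facet: ("available" if combined.get(facet) else "not available")
            for facet in FACETS}

# Hypothetical records for one dataset.
producer_record = {
    "producer_profile": {"organisation": "Example Producer"},
    "standards_compliance": ["ISO 19115"],
    "quantitative_quality": {"rmse_m": 2.3},
}
feedback_record = {
    "ratings": [4, 5, 3],
    "citations": ["example dataset citation"],
}

for facet, status in assess_facets(producer_record, feedback_record).items():
    print(f"{facet}: {status}")
```

A label service of the kind described above would render this availability summary graphically and link each facet to its drill-down page rather than printing it.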
Abstract:
Social Media is becoming an increasingly important part of people's lives and is being used increasingly in the food and agriculture sector. This paper considers the extent to which each section of the food supply chain is represented on Twitter through use of the hashtag #food. We looked at the 20 most popular words for each part of the supply chain by categorising 5000 randomly selected tweets into different sections of the food chain and then analysing each category. We sorted the users by those who tweeted most frequently and categorised their position in the food supply chain. Finally, to consider the in-degree of influence, we took the top 100 tweeters from the previous list and considered what following these users have. From this we found that consumers are the most represented area of the food chain, and logistics is the least represented. Consumers accounted for 51.50% of the users and 87.42% of the top words tweeted from that part of the food chain. We found little evidence of logistics representation for either tweets or users (0.84% and 0.35% respectively). The top users were found to follow a high percentage of their own followers, with most having an overlap of over 70%. This research will bring greater understanding of how people perceive the food sector and how Twitter can be used within this sector.
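The categorisation-and-counting step described above can be sketched in Python as follows. The supply-chain categories follow the stages discussed in the paper, but the keyword lists and example tweets are invented for illustration and are not the coding scheme used in the study.

```python
from collections import Counter

# Illustrative keyword lists per supply-chain stage (not the study's coding scheme).
CATEGORIES = {
    "production": ["farm", "harvest", "grower"],
    "processing": ["factory", "processing", "packaged"],
    "logistics": ["shipping", "delivery", "warehouse"],
    "retail": ["supermarket", "shop", "store"],
    "consumer": ["dinner", "recipe", "yummy", "eating"],
}

def categorise(tweet: str) -> str:
    """Assign a tweet to the first supply-chain stage whose keywords it mentions."""
    text = tweet.lower()
    for category, keywords in CATEGORIES.items():
        if any(word in text for word in keywords):
            return category
    return "uncategorised"

tweets = [
    "Great harvest on the farm today #food",
    "Trying a new recipe for dinner tonight #food",
    "Stuck waiting for a delivery again #food",
]

counts = Counter(categorise(t) for t in tweets)
total = sum(counts.values())
for category, n in counts.most_common():
    print(f"{category}: {n} tweets ({100 * n / total:.2f}%)")
```

Manual categorisation of the 5000 sampled tweets, as used in the study, replaces the keyword rule here, but the counting and percentage step is the same.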
Abstract:
Over the course of the last twenty years there has been a growing academic interest in performance management, particularly in respect of the evolution of new techniques and their resulting impact. One important theoretical development has been the emergence of multidimensional performance measurement models that are potentially applicable within the public sector. Empirically, academic researchers are increasingly supporting the use of such models as a way of improving public sector management and the effectiveness of service provision (Mayston, 1985; Pollitt, 1986; Bates and Brignall, 1993; and Massey, 1999). This paper seeks to add to the literature by using both theoretical and empirical evidence to argue that CPA, the external inspection tool used by the Audit Commission to evaluate local authority performance management, is a version of the Balanced Scorecard which, when adapted for internal use, may have beneficial effects. After demonstrating the parallels between the CPA framework and Kaplan and Norton's public sector Balanced Scorecard (BSC), we use a case study of the BSC based performance management system in Hertfordshire County Council to demonstrate the empirical linkages between a local scorecard and CPA. We conclude that CPA is based upon the BSC and has the potential to serve as a springboard for the evolution of local authority performance management systems.