29 results for Use-wear analysis

in Aston University Research Archive


Relevance: 100.00%

Abstract:

Experiments that combine different groups or factors and are analysed by ANOVA are a powerful method of investigation in applied microbiology. ANOVA enables not only the effect of individual factors to be estimated but also their interactions; information which cannot be obtained readily when factors are investigated separately. In addition, combining different treatments or factors in a single experiment is more efficient and often reduces the number of replications required to estimate treatment effects adequately. Because of the treatment combinations used in a factorial experiment, the degrees of freedom (DF) of the error term in the ANOVA are a more important indicator of the 'power' of the experiment than the number of replicates. A good method is to ensure, where possible, that sufficient replication is present to achieve 15 DF for each error term of the ANOVA. Finally, it is important to consider the design of the experiment, because this determines the appropriate ANOVA to use. Some of the most common experimental designs used in the biosciences, and their relevant ANOVAs, are also discussed. If there is doubt about which ANOVA to use, the researcher should seek advice from a statistician with experience of research in applied microbiology.
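Where the abstract recommends targeting 15 error DF, the arithmetic can be made explicit. The following is a minimal sketch (not part of the original work) for a fixed-effects two-factor factorial design with interaction, where the error DF equal ab(r − 1) for a × b factor levels and r replicates per cell; the function names are illustrative.

```python
# Minimal sketch: error degrees of freedom for a fixed-effects two-factor
# factorial ANOVA with interaction, and the replication needed to reach the
# suggested 15 DF for the error term.

def error_df(a_levels: int, b_levels: int, replicates: int) -> int:
    """Error DF = a * b * (r - 1) for an a x b factorial with r replicates per cell."""
    return a_levels * b_levels * (replicates - 1)

def replicates_for_target_df(a_levels: int, b_levels: int, target_df: int = 15) -> int:
    """Smallest number of replicates per cell giving at least `target_df` error DF."""
    r = 2
    while error_df(a_levels, b_levels, r) < target_df:
        r += 1
    return r

if __name__ == "__main__":
    # Example: a 3 x 2 factorial has error DF = 6(r - 1), so r = 2 gives 6,
    # r = 3 gives 12, and r = 4 gives 18, the first value at or above 15.
    print(error_df(3, 2, 4))                  # 18
    print(replicates_for_target_df(3, 2))     # 4
```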

Relevance: 80.00%

Abstract:

A plethora of techniques for the imaging of liposomes and other bilayer vesicles are available. However, sample preparation and the technique chosen should be carefully considered in conjunction with the information required. For example, larger vesicles such as multilamellar and giant unilamellar vesicles can be viewed using light microscopy and, whilst vesicle formation and size can be confirmed prior to additional physical characterisation or more detailed microscopy, the technique is limited in terms of resolution. To consider the options available for visualising liposome-based systems, a wide range of microscopy techniques are described and discussed here: these include light, fluorescence and confocal microscopy, and various electron microscopy techniques such as transmission, cryo-, freeze-fracture and environmental scanning electron microscopy. Their application, advantages and disadvantages are reviewed with regard to their use in the analysis of lipid vesicles.

Relevance: 80.00%

Abstract:

FDI plays a key role in development, particularly in the resource-constrained transition economies of Central and Eastern Europe with relatively low savings rates. Gains from technology transfer play a critical role in motivating FDI, yet the potential for such gains may be hampered by a large technology gap between the source and host country. While the extent of this gap has traditionally been attributed to education, skills and capital intensity, recent literature has also emphasized the possible role of the institutional environment. Despite tremendous interest among policy-makers and academics in understanding the factors attracting FDI (Bevan and Estrin, 2000; Globerman and Shapiro, 2003), our knowledge about the effects of institutions on the location choice and ownership structure of foreign firms remains limited. This paper attempts to fill this gap in the literature by examining the link between institutions and foreign ownership structures. To the best of our knowledge, Javorcik (2004) is the only paper that uses firm-level data to analyse the effect of institutional quality on an outward investor's entry mode in transition countries. Our paper extends Javorcik (2004) in a number of ways: (a) rather than a cross-section, we use panel data for the period 1997-2006; (b) rather than a binary variable, we use percentage foreign ownership as a continuous variable; (c) we consider multi-dimensional institutional variables, such as corruption, intellectual property rights protection and government stability, and we also use factor analysis to generate a composite index of institutional quality and examine how a stronger institutional environment could affect foreign ownership; (d) we explore how the distance between the institutional environments of source and host countries affects foreign ownership in the host country. The firm-level data used include both domestic and foreign firms for the period 1997-2006 and are drawn from ORBIS, a commercially available dataset provided by Bureau van Dijk. In order to examine the link between institutions and foreign ownership structures, we estimate four log-linear ownership equations/specifications augmented by institutional and other control variables. We find evidence that the decision of a foreign firm to either locate a subsidiary or acquire an existing domestic firm depends not only on factor cost differences but also on differences in the institutional environment between the host and source countries.
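As a rough illustration of what one log-linear ownership specification could look like, the sketch below regresses the log of percentage foreign ownership on institutional variables plus year dummies over a synthetic firm-level panel. The column names, the synthetic data and the clustering choice are assumptions for illustration only, not the paper's actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-in for the ORBIS-derived firm-level panel, 1997-2006.
df = pd.DataFrame({
    "firm": rng.integers(0, 100, n),
    "year": rng.integers(1997, 2007, n),
    "foreign_share": rng.uniform(1, 100, n),   # percentage foreign ownership
    "corruption": rng.normal(size=n),          # institutional quality measures
    "ipr_protection": rng.normal(size=n),
    "gov_stability": rng.normal(size=n),
})

# One log-linear specification: log ownership share on institutional variables
# and year dummies, with standard errors clustered by firm.
model = smf.ols(
    "np.log(foreign_share) ~ corruption + ipr_protection + gov_stability + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["firm"]})
print(model.summary())
```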

Relevance: 80.00%

Abstract:

Purpose - This Editorial Viewpoint explores the changes taking place in manufacturing technology management, with the aim of identifying future challenges that should be represented in the scope of JMTM. Design/methodology/approach - The viewpoint is based on an analysis of relevant articles published in JMTM since 2007 that have focused on future challenges for manufacturing, and also draws on two recent reports. Findings - While the analysis confirms that some subjects are already within the journal's scope, there are gaps elsewhere that merit the scope being expanded. Research limitations/implications - Evidence for the findings comes only from a limited number of articles identified in this journal, supplemented by other secondary sources. Practical implications - The analysis and the sources used, especially the reports, carry implications for practice concerning the future of manufacturing. Originality/value - Although it has limitations, the article is based on original bibliographic research. Copyright © 2014 Emerald Group Publishing Limited. All rights reserved.

Relevance: 80.00%

Abstract:

The Semantic Web has come a long way since its inception in 2001, especially in terms of technical development and research progress. However, adoption by non-technical practitioners is still an ongoing process, and in some areas this process is only now starting. Emergency response is an area where the reliability and timeliness of information and technologies are of the essence. It is therefore quite natural that more widespread adoption in this area has not been seen until now, when Semantic Web technologies are mature enough to support the high requirements of the application area. Nevertheless, to leverage the full potential of Semantic Web research results for this application area, there is a need for an arena where practitioners and researchers can meet and exchange ideas and results. Our intention is for this workshop, and hopefully coming workshops in the same series, to be such an arena for discussion. The Extended Semantic Web Conference (ESWC, formerly the European Semantic Web Conference) is one of the major research conferences in the Semantic Web field and is therefore a suitable venue for a workshop discussing the application of Semantic Web technology to our specific area. Hence, we chose to arrange the first SMILE workshop at ESWC 2013. This workshop does not, however, focus solely on semantic technologies for emergency response, but rather on Semantic Web technologies in combination with the technologies and principles of what is sometimes called the "social web". Social media has already been used successfully in many cases as a tool for supporting emergency response. The aim of this workshop is therefore to take this to the next level and answer questions such as: how can we make sense of, and furthermore make use of, all the data that is produced by different kinds of social media platforms in an emergency situation? For the first edition of this workshop the chairs collected the following main topics of interest:
• Semantic Annotation for understanding the content and context of social media streams.
• Integration of Social Media with Linked Data.
• Interactive Interfaces and visual analytics methodologies for managing multiple large-scale, dynamic, evolving datasets.
• Stream reasoning and event detection.
• Social Data Mining.
• Collaborative tools and services for Citizens, Organisations, Communities.
• Privacy, ethics, trustworthiness and legal issues in the Social Semantic Web.
• Use case analysis, with specific interest in use cases that involve the application of Social Media and Linked Data methodologies in real-life scenarios.
All of these, applied in the context of:
• Crisis and Disaster Management
• Emergency Response
• Security and Citizen Journalism
The workshop received six high-quality paper submissions and, based on a thorough review process carried out by our program committee, four of these papers were accepted for the workshop (67% acceptance rate). These four papers can be found later in this proceedings volume. Three of the four papers discuss the integration and analysis of social media data using Semantic Web technologies, e.g. for detecting complex events in social media streams, for visualising and analysing sentiments with respect to certain topics in social media, or for detecting small-scale incidents entirely through the use of social media information. The fourth paper presents an architecture for using Semantic Web technologies in resource management during a disaster.
Additionally, the workshop featured an invited keynote speech by Dr. Tomi Kauppinen from Aalto University. Dr. Kauppinen shared experiences from his work on applying Semantic Web technologies to application fields such as geoinformatics and scientific research, i.e. so-called Linked Science, as well as recent ideas and applications in the emergency response field. His input was also highly valuable for the roadmapping discussion, which was held at the end of the workshop. A separate summary of the roadmapping session can be found at the end of these proceedings. Finally, we would like to thank our invited speaker Dr. Tomi Kauppinen, all our program committee members, and the workshop chair of ESWC 2013, Johanna Völker (University of Mannheim), for helping us to make this first SMILE workshop a highly interesting and successful event!

Relevance: 80.00%

Abstract:

We use a panel data set of UK-listed companies over the period 2005–2009 to analyse the actuarial assumptions used to value pension plan liabilities under IAS 19. The valuation process requires companies to make assumptions about financial and demographic variables, notably discount rate, price inflation, salary inflation and mortality/life expectancy of plan members/beneficiaries. We use regression analysis to analyse the relationships between these key assumptions (except mortality, where disclosures are limited) and company-specific factors such as the pension plan funding position and duration of pension liabilities. We find evidence of selective ‘management’ of the three assumptions investigated, although the nature of this appears to differ from the findings of US authors. We conclude that IAS 19 does not prevent the use of managerial discretion, particularly by companies whose pension plan funding positions are weak, thereby reducing the representational faithfulness of the reported pension figures. We also highlight that the degree of discretion used reflects the extent to which IAS 19 defines how the assumptions are to be determined. We therefore suggest that companies should be encouraged to justify more explicitly their choice of assumptions.
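To see why the discount rate assumption invites selective 'management', the sketch below (illustrative figures only, not drawn from the paper or from IAS 19 guidance) shows how modestly raising the assumed discount rate shrinks the present value of a deferred stream of pension payments.

```python
# Minimal sketch: sensitivity of a pension liability's present value to the
# assumed discount rate, using a level deferred annuity as a stand-in for the
# obligation to a plan member.

def pv_deferred_annuity(annual_payment, deferral_years, payment_years, discount_rate):
    """Present value of a level annual payment starting after a deferral period."""
    return sum(
        annual_payment / (1 + discount_rate) ** t
        for t in range(deferral_years + 1, deferral_years + payment_years + 1)
    )

if __name__ == "__main__":
    # A pension of 10,000 a year for 20 years, starting in 15 years' time.
    low_rate = pv_deferred_annuity(10_000, 15, 20, 0.045)
    high_rate = pv_deferred_annuity(10_000, 15, 20, 0.055)
    # Raising the assumed discount rate by one percentage point cuts the
    # reported liability by roughly a fifth.
    print(low_rate, high_rate, (low_rate - high_rate) / low_rate)
```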

Relevance: 40.00%

Abstract:

Queueing theory is an effective tool in the analysis of computer communication systems. Many results in queueing analysis have been derived in the form of Laplace and z-transform expressions. Accurate inversion of these transforms is very important in the study of computer systems, but the inversion is very often difficult. In this thesis, methods for solving some of these queueing problems by use of digital signal processing techniques are presented. The z-transform of the queue length distribution for the M/G^Y/1 system is derived. Two numerical methods for the inversion of the transform, together with the standard numerical technique for solving transforms with multiple queue-state dependence, are presented. Bilinear and Poisson transform sequences are presented as useful ways of representing continuous-time functions in numerical computations.
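As a small, self-contained illustration of numerically inverting a queue-length z-transform (a standard FFT-based approach, not one of the thesis's two methods), the sketch below samples a probability generating function on the unit circle and recovers the queue-length probabilities. The M/M/1 PGF is used only because its inverse is known in closed form, which makes the result easy to check.

```python
import numpy as np

def invert_pgf(pgf, n_terms=64):
    """Recover p[0], ..., p[n_terms-1] from a PGF G(z) = sum_n p[n] z^n.

    Samples G on the unit circle and applies a DFT; the result is exact up to
    aliasing of any probability mass at indices >= n_terms.
    """
    k = np.arange(n_terms)
    z = np.exp(2j * np.pi * k / n_terms)        # evaluation points on the unit circle
    return (np.fft.fft(pgf(z)) / n_terms).real

if __name__ == "__main__":
    rho = 0.7                                   # server utilisation

    def mm1_pgf(z):
        return (1 - rho) / (1 - rho * z)        # M/M/1 queue-length PGF

    numeric = invert_pgf(mm1_pgf, 64)[:5]
    exact = (1 - rho) * rho ** np.arange(5)     # known geometric distribution
    print(numeric)
    print(exact)                                # agrees to ~10 decimal places
```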

Relevance: 40.00%

Abstract:

Two contrasting multivariate statistical methods, viz. principal components analysis (PCA) and cluster analysis, were applied to the study of neuropathological variations between cases of Alzheimer's disease (AD). To compare the two methods, 78 cases of AD were analyzed, each characterised by measurements of 47 neuropathological variables. Both methods of analysis revealed significant variations between AD cases. These variations were related primarily to differences in the distribution and abundance of senile plaques (SP) and neurofibrillary tangles (NFT) in the brain. Cluster analysis classified the majority of AD cases into five groups which could represent subtypes of AD. However, PCA suggested that variation between cases was more continuous, with no distinct subtypes. Hence, PCA may be a more appropriate method than cluster analysis in the study of neuropathological variations between AD cases.
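A minimal sketch of how such a comparison might be set up, using synthetic data in place of the study's 78 × 47 matrix and scikit-learn rather than whatever software was actually used: PCA places cases on continuous axes, whereas cluster analysis forces them into discrete groups.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
X = rng.normal(size=(78, 47))              # stand-in for 78 cases x 47 neuropathological variables

X_std = StandardScaler().fit_transform(X)  # variables on different scales must be standardised

scores = PCA(n_components=2).fit_transform(X_std)                   # continuous variation along the first two PCs
labels = AgglomerativeClustering(n_clusters=5).fit_predict(X_std)   # five discrete groups, mirroring the study

# Plotting PC1 vs PC2 coloured by cluster label shows whether the "subtypes"
# form separated clouds or merely partition a continuum.
print(scores[:3])
print(labels[:10])
```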

Relevance: 40.00%

Abstract:

A dry matrix application for matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI MSI) was used to profile the distribution of 4-bromophenyl-1,4-diazabicyclo(3.2.2)nonane-4-carboxylate, monohydrochloride (BDNC, SSR180711) in rat brain tissue sections. Matrix application involved applying layers of finely ground dry alpha-cyano-4-hydroxycinnamic acid (CHCA) to the surface of tissue sections thaw mounted onto MALDI targets. It was not possible to detect the drug when applying matrix in a standard aqueous-organic solvent solution. The drug was detected at higher concentrations in specific regions of the brain, particularly the white matter of the cerebellum. Pseudomultiple reaction monitoring imaging was used to validate that the observed distribution was the target compound. The semiquantitative data obtained from signal intensities in the imaging was confirmed by laser microdissection of specific regions of the brain directed by the imaging, followed by hydrophilic interaction chromatography in combination with a quantitative high-resolution mass spectrometry method. This study illustrates that a dry matrix coating is a valuable and complementary matrix application method for analysis of small polar drugs and metabolites that can be used for semiquantitative analysis.

Relevance: 40.00%

Abstract:

This article explains, first, why a knowledge of statistics is necessary and describes the role that statistics plays in an experimental investigation. Second, the normal distribution is introduced, which describes the natural variability shown by many measurements in optometry and the vision sciences. Third, the application of the normal distribution to some common statistical problems is described, including how to determine whether an individual observation is a typical member of a population and how to determine the confidence interval for a sample mean.
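The two problems named at the end can be made concrete with a short sketch. The numbers below are illustrative and are not taken from the article: a z-score is used to judge whether an observation is typical of a population, and a t-based interval gives the confidence interval for a sample mean.

```python
import numpy as np
from scipy import stats

# Is an observation of 24 typical of a population with mean 16 and SD 3?
z = (24 - 16) / 3
p_more_extreme = 2 * (1 - stats.norm.cdf(abs(z)))   # two-tailed tail probability
print(z, p_more_extreme)                            # z ~ 2.67, p ~ 0.008: an unusual observation

# 95% confidence interval for a sample mean (t distribution, n - 1 DF).
sample = np.array([15.2, 16.1, 14.8, 17.0, 15.5, 16.3, 15.9, 14.9])
mean, sem = sample.mean(), stats.sem(sample)
ci = stats.t.interval(0.95, df=len(sample) - 1, loc=mean, scale=sem)
print(mean, ci)
```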

Relevance: 40.00%

Abstract:

This study is concerned with the analysis of tear proteins, paying particular attention to the state of the tears (e.g. non-stimulated, reflex, closed) created during sampling, and to assessing their interactions with hydrogel contact lenses. The work has involved the use of a variety of biochemical and immunological analytical techniques for the measurement of proteins (a) in tears, (b) on the contact lens, and (c) in the eluate of extracted lenses. Although a diverse range of tear components may contribute to contact lens spoilation, proteins were of particular interest in this study because of their theoretical potential for producing immunological reactions. Although normal host proteins in their natural state are generally not treated as dangerous or non-self, those which undergo denaturation or suffer a conformational change may provoke an excessive and unnecessary immune response. A novel on-lens cell-based assay has been developed and exploited in order to study the role of the ubiquitous cell adhesion glycoprotein vitronectin in tears and contact lens wear under various parameters. Vitronectin, whose levels are known to increase in the closed-eye environment and are shown here to increase during contact lens wear, is an important immunoregulatory protein and may be a prominent marker of inflammatory activity. Immunodiffusion assays were developed and optimised for use in tear analysis, and in a series of subsequent studies were used, for example, in the measurement of albumin, lactoferrin, IgA and IgG. The immunodiffusion assays were then applied to the study of the closed-eye environment; an environment which has been described as sustaining a state of sub-clinical inflammation. The role and presence of a less well understood and investigated protein, kininogen, was also estimated, in particular in relation to contact lens wear. Difficulties arise when attempting to extract proteins from the contact lens in order to examine the individual nature of the proteins involved. These problems were partly alleviated by the use of the on-lens cell assay and a UV spectrophotometry assay, which can analyse the lens surface and bulk respectively, the latter yielding only total protein values. Various lens extraction methods were investigated to remove protein from the lens, and the most efficient was employed in the analysis of lens extracts. Counter immunoelectrophoresis, an immunodiffusion assay, was then applied to the analysis of albumin, lactoferrin, IgA and IgG in the resultant eluates.

Relevance: 40.00%

Abstract:

This work is undertaken in an attempt to understand the processes at work at the cutting edge of the twist drill. Extensive drill life testing performed by the University has reinforced a survey of previously published information. This work demonstrated that there are two specific aspects of drilling which have not previously been explained comprehensively. The first concerns the interrelating of process data between differing drilling situations. There is no method currently available which allows the cutting geometry of drilling to be defined numerically, so such comparisons, where made, are purely subjective. Section one examines this problem by taking as an example a 4.5 mm drill suitable for use with aluminium. This drill is examined using a prototype solid modelling program to explore how the required numerical information may be generated. The second aspect is the analysis of drill stiffness. Which aspects of drill stiffness produce the very great difference in performance between short, medium and long flute length drills? These differences exist between drills of identical point geometry, and the practical superiority of short drills has been known to shop-floor drilling operatives since drilling was first introduced. This problem has been dismissed repeatedly as over-complicated, but section two provides a first approximation and shows that, at least for smaller drills of 4.5 mm, the effects are highly significant. Once the cutting action of the twist drill is defined geometrically, there is a huge body of machinability data that becomes applicable to the drilling process. Work remains to interpret the very high inclination angles of the drill cutting process in terms of cutting forces and tool wear, but aspects of drill design may already be looked at in new ways, with the prospect of a more analytical approach rather than the present mix of experience and trial and error. Other problems are specific to the twist drill, such as the behaviour of the chips in the flute. It is now possible to predict the initial direction of chip flow leaving the drill cutting edge. For the future, the parameters of further chip behaviour may also be explored within this geometric model.

Relevance: 40.00%

Abstract:

In this study some common types of rolling bearing vibration are analysed in depth, both theoretically and experimentally. The study is restricted to vibrations in the radial direction of bearings having a pure radial load and a positive radial clearance. The general vibrational behaviour of such bearings has been investigated with respect to the effects of varying compliance, manufacturing tolerances and the interaction between the bearing and the machine structure into which it is fitted. The equations of motion for a rotor supported by a bearing in which the stiffness varies with cage position have been set up, and examples of solutions obtained by digital simulation are given. A method to calculate the amplitudes and frequencies of vibration components due to out-of-roundness of the inner ring and varying roller diameters has been developed. The results from these investigations have been combined with a theory for bearing/machine frame interaction using mechanical impedance techniques, thereby facilitating prediction of the vibrational behaviour of the whole set-up. Finally, the effects of bearing fatigue and wear have been studied, with particular emphasis on the use of vibration analysis for condition monitoring purposes. A number of monitoring methods have been tried and their effectiveness discussed. The experimental investigation was carried out using two purpose-built rigs. For the purpose of analysing the experimental measurements, a digital minicomputer was adapted for signal processing and a suite of programs was written. The program package performs several of the commonly used signal analysis processes and includes all necessary input and output functions.
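For the condition-monitoring side of the work, the characteristic frequencies looked for in vibration spectra follow from the bearing geometry. The sketch below uses the standard textbook formulas rather than anything taken from the thesis, and the example numbers are invented; it assumes a stationary outer ring with the inner ring turning at shaft speed.

```python
import math

def bearing_frequencies(shaft_hz, n_rollers, roller_d, pitch_d, contact_angle_deg=0.0):
    """Characteristic defect frequencies for a rolling bearing (outer ring stationary)."""
    ratio = (roller_d / pitch_d) * math.cos(math.radians(contact_angle_deg))
    ftf = 0.5 * shaft_hz * (1 - ratio)                              # cage (fundamental train) frequency
    bpfo = n_rollers * ftf                                          # ball pass frequency, outer race
    bpfi = 0.5 * n_rollers * shaft_hz * (1 + ratio)                 # ball pass frequency, inner race
    bsf = 0.5 * shaft_hz * (pitch_d / roller_d) * (1 - ratio ** 2)  # roller spin frequency
    return {"FTF": ftf, "BPFO": bpfo, "BPFI": bpfi, "BSF": bsf}

if __name__ == "__main__":
    # For a radially loaded bearing with a stationary outer ring, the
    # varying-compliance frequency coincides with BPFO.
    print(bearing_frequencies(shaft_hz=25.0, n_rollers=9, roller_d=7.5, pitch_d=34.0))
```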