19 results for Lagrangien augmenté
in Aston University Research Archive
Abstract:
Objective. Infective endocarditis (IE) is diagnosed by the Duke criteria, which can be inconclusive, particularly when blood cultures are negative. This study investigated the application of polymerase chain reaction (PCR) to identify bacterial DNA in excised valvular tissue, and its role in establishing the diagnosis of IE. Methods. Ninety-eight patients undergoing valve replacement surgery were studied. Twenty-eight patients were confirmed as definite for endocarditis by the Duke criteria; nine were considered possible and 61 had no known or previous microbial infection of the endocardium. A broad-range PCR technique was used to amplify prokaryotic 16S rRNA genes present within homogenised heart valve tissue. Subsequent DNA sequencing of the PCR amplicon allowed identification of the infecting microorganism. Results. PCR demonstrated the presence of bacterial DNA in the heart valves obtained from 14 out of 20 (70%) definite IE patients with positive blood cultures preoperatively. The causative microorganism for one patient with definite culture-negative endocarditis was identified by PCR. Two out of nine (22%) of the valves from possible endocarditis patients also contained bacterial DNA, reclassifying these patients as definite, whereas no bacterial DNA was detected in the valves of the remaining seven out of nine (78%). Conclusion. The application of PCR to explanted valves in patients with a possible or confirmed diagnosis can augment the Duke criteria, thereby improving post-surgical antimicrobial therapeutic options. © 2003 The British Infection Society. Published by Elsevier Ltd. All rights reserved.
Abstract:
The work described in this thesis deals with the development and application of a finite element program for the analysis of several cracked structures. In order to simplify the organisation of the material presented herein, the thesis has been subdivided into two Sections. In the first Section the development of a finite element program for the analysis of two-dimensional problems of plane stress or plane strain is described. The element used in this program is the six-node isoparametric triangular element, which permits the accurate modelling of curved boundary surfaces. Various cases of material anisotropy are included in the derivation of the element stiffness properties. A digital computer program is described and examples of its application are presented. In the second Section, on fracture problems, several cracked configurations are analysed by embedding into the finite element mesh a sub-region containing the singularities, over which an analytic solution is used. The modifications necessary to augment a standard finite element program, such as that developed in Section I, are discussed and complete programs for each cracked configuration are presented. Several examples are included to demonstrate the accuracy and flexibility of the technique.
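For readers unfamiliar with the element named above, the following is a minimal sketch of the standard formulation of the six-node isoparametric triangle (textbook material, not quoted from the thesis): quadratic shape functions in area coordinates interpolate both geometry and displacements, which is what allows curved boundaries, and the element stiffness follows from the usual integral.

```latex
% Quadratic shape functions in area coordinates L1 + L2 + L3 = 1
% (corner nodes 1-3, mid-side nodes 4-6):
\begin{align*}
  N_1 &= L_1(2L_1 - 1), & N_2 &= L_2(2L_2 - 1), & N_3 &= L_3(2L_3 - 1),\\
  N_4 &= 4L_1 L_2,      & N_5 &= 4L_2 L_3,      & N_6 &= 4L_3 L_1.
\end{align*}
% Isoparametric mapping and element stiffness (plane stress/strain, thickness t):
\[
  x = \sum_{i=1}^{6} N_i x_i, \qquad
  u = \sum_{i=1}^{6} N_i u_i, \qquad
  \mathbf{K}^e = \int_A \mathbf{B}^{\mathsf{T}} \mathbf{D}\, \mathbf{B}\, t\, \mathrm{d}A .
\]
```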
Abstract:
Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short-term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and Experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, it was found that both long-term factors increased the amount of order information retained. Experiment 5 induced an order-encoding effect over a filled delay, leaving a picture of short-term processes which are closely associated with long-term processes, and which fit conceptions of short-term memory being part of language processes rather better than either the order-encoding or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed. Articulatory suppression affected the encoding of order information, whereas speech rate had no direct influence, suggesting that it is ease of lexical access which is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order-encoding account of long-term influence. Instead, the evidence sits best with models that are based upon language processing. The path urged for future research is to find ways in which this diffuse model can be better specified, and which can take account of the versatility of the human brain.
Abstract:
In recent years there has been a great effort to combine the technologies and techniques of GIS and process models. This project examines the issues of linking a standard current-generation 2½D GIS with several existing model codes. The focus for the project has been the Shropshire Groundwater Scheme, which is being developed to augment flow in the River Severn during drought periods by pumping water from the Shropshire Aquifer. Previous authors have demonstrated that under certain circumstances pumping could reduce the soil moisture available for crops. This project follows earlier work at Aston in which the effects of drawdown were delineated and quantified through the development of a software package that implemented a technique which brought together the significant spatially varying parameters. This technique is repeated here, but using a standard GIS called GRASS. The GIS proved adequate for the task and the added functionality provided by the general-purpose GIS - the data capture, manipulation and visualisation facilities - was of great benefit. The bulk of the project is concerned with examining the issues of the linkage of GIS and environmental process models. To this end a groundwater model (Modflow) and a soil moisture model (SWMS2D) were linked to the GIS and a crop model was implemented within the GIS. A loose-linked approach was adopted and secondary and surrogate data were used wherever possible. The issues examined relate to: justification of a loose-linked versus a closely integrated approach; how, technically, to achieve the linkage; how to reconcile the different data models used by the GIS and the process models; control of the movement of data between models of environmental subsystems, to model the total system; the advantages and disadvantages of using a current-generation GIS as a medium for linking environmental process models; generation of input data, including the use of geostatistics, stochastic simulation, remote sensing, regression equations and mapped data; issues of accuracy, uncertainty and simply providing adequate data for the complex models; and how such a modelling system fits into an organisational framework.
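To make the loose-linked approach concrete, here is a minimal illustrative sketch, not the project's actual code; the file names, grid layout and model executable are assumptions. The point is only that data pass between the GIS and an external process model via ordinary files and a separate process, rather than a shared in-memory structure.

```python
"""Illustrative loose coupling between a GIS export and an external model."""
import subprocess
import numpy as np


def read_ascii_grid(path: str) -> np.ndarray:
    """Read a GIS-exported ASCII grid (6 header lines, then values) into an array."""
    with open(path) as f:
        header = [next(f) for _ in range(6)]  # ncols, nrows, xll, yll, cellsize, nodata
        return np.loadtxt(f)


def write_model_input(values: np.ndarray, path: str) -> None:
    """Write the array in the plain-text layout the external model expects."""
    np.savetxt(path, values, fmt="%.4f")


def run_model(executable: str, namefile: str) -> None:
    """Launch the external model as a separate process (the 'loose' link)."""
    subprocess.run([executable, namefile], check=True)


if __name__ == "__main__":
    heads = read_ascii_grid("aquifer_heads.asc")       # hypothetical GIS export
    write_model_input(heads, "model_input.dat")        # hypothetical model input
    run_model("modflow", "shropshire.nam")             # hypothetical name file
    drawdown = np.loadtxt("model_output.dat")          # hypothetical model output
    np.savetxt("drawdown_for_gis.asc", drawdown, fmt="%.4f")  # re-import into the GIS
```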
Abstract:
This paper examines the determinants of short-term wage dynamics, using a sample of large Hungarian companies for the period 1996-1999. We test the basic implications of an efficient contract model of bargaining between the incumbent employees and the managers, which we are unable to reject. In particular, there are structural differences between the ownership sectors consistent with our prior knowledge of relative bargaining strength and unionisation measures. A stronger bargaining position of workers leads to a higher ability-to-pay elasticity of wages and a lower outside-option elasticity. Our results indicate that while the bargaining position of workers in domestic privatised firms may be weaker than in the state sector, the more robust difference relates to state-sector workers versus privatised firms with majority foreign ownership. We examine several extensions. We augment the bargaining specification with controls related to workers' skills and find that the basic findings are robust to this. We take a closer look at the outside options of the workers and find some interactive effects, where unemployment modifies the impact of the availability of rents on wages. We interpret our results as an indication that the bargaining power of workers may be affected by changes in their outside options. We also experiment with one concise indicator of the reservation wage, which is closest to the theoretical model specification and combines sectoral wages, unemployment benefits and regional unemployment levels; we find that this measure performs well. Finally, we find that while the responsiveness of wages to ability to pay is higher in the state sector, variation in wage dynamics is lower. This may indicate some wage smoothing in the state sector, consistent with the preferences of employees.
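As a point of reference, empirical bargaining studies of this kind typically estimate a wage equation of roughly the following form; the notation here is generic and illustrative, not taken from the paper itself:

```latex
% Generic firm-level wage-dynamics specification from the bargaining literature:
\[
  \Delta \ln w_{it} \;=\; \alpha
  \;+\; \beta\, \Delta \ln \pi_{it}
  \;+\; \gamma\, \Delta \ln w^{o}_{it}
  \;+\; \delta' X_{it} \;+\; \varepsilon_{it},
\]
% where w_{it} is the firm-level wage, \pi_{it} proxies the firm's ability to pay
% (rents per employee), w^{o}_{it} is the workers' outside option (combining
% sectoral wages, unemployment benefits and regional unemployment), and X_{it}
% holds controls such as workforce skills. Stronger worker bargaining power is
% reflected in a larger \beta and a smaller \gamma.
```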
Abstract:
Assessing factors that predict new product success (NPS) holds critical importance for companies, as research shows that despite considerable new product investment, success rates are generally below 25%. Over the decades, meta-analytical attempts have been made to summarize empirical findings on NPS factors. However, market environment changes such as increased global competition, as well as methodological advancements in meta-analytical research, present a timely opportunity to augment their results. Hence, a key objective of this research is to provide an updated and extended meta-analytic investigation of the factors affecting NPS. Using Henard and Szymanski's meta-analysis as the most comprehensive recent summary of empirical findings, this study updates their findings by analyzing articles published from 1999 through 2011, the period following the original meta-analysis. Based on 233 empirical studies (from 204 manuscripts) on NPS, with a total of 2,618 effect sizes, this study also takes advantage of more recent methodological developments by recalculating the effects of the meta-analysis using a random-effects model. The study broadens its scope by including overlooked but important additional variables, notably “country culture,” and discusses substantive differences between the updated meta-analysis and its predecessor. Results reveal generally weaker effect sizes than those reported by Henard and Szymanski in 2001, and provide evolutionary evidence of decreased effects of common success factors over time. Moreover, culture emerges as an important moderating factor, weakening effect sizes for individualistic countries and strengthening effects for risk-averse countries, highlighting the importance of further investigating culture's role in product innovation studies, and of tracking changes in the success factors of product innovations. Finally, a sharp increase since 1999 in studies investigating product and process characteristics identifies a significant shift in research interest in new product development success factors. The finding that the importance of success factors generally declines over time calls for new theoretical approaches to better capture the nature of new product development (NPD) success factors. One might speculate that the potential to create competitive advantages through an understanding of NPD success factors is reduced as knowledge of these factors becomes more widespread among managers. Results also imply that managers attempting to improve success rates of NPDs need to consider national culture as this factor exhibits a strong moderating effect: working in varied cultural contexts will result in differing antecedents of successful new product ventures.
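For context, the random-effects model referred to above has the standard form below (generic notation, not reproduced from the study); it pools effect sizes while allowing for true between-study heterogeneity τ²:

```latex
% Random-effects pooling of effect sizes r_i with sampling variances v_i:
\[
  r_i = \mu + u_i + e_i, \qquad u_i \sim N(0, \tau^2), \qquad e_i \sim N(0, v_i),
\]
\[
  \hat{\mu} = \frac{\sum_i w_i\, r_i}{\sum_i w_i}, \qquad w_i = \frac{1}{v_i + \hat{\tau}^2}.
\]
% Unlike the fixed-effects weighting w_i = 1/v_i, large studies do not dominate
% the pooled estimate when genuine between-study heterogeneity is present.
```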
Abstract:
Desalination of brackish groundwater (BW) is an effective approach to augment water supply, especially for inland regions that are far from seawater resources. Brackish water reverse osmosis (BWRO) desalination is still energy-intensive compared to the theoretical minimum energy demand. Here, we review some of the BWRO plants with various system arrangements. We look at how to minimize energy demands, as these contribute considerably to the cost of desalinated water. Different configurations of BWRO systems have been compared from the viewpoint of normalized specific energy consumption (SEC). The analysis is made at the theoretical limits. The SEC of BWRO can be reduced by (i) increasing the number of stages, (ii) using an energy recovery device (ERD), or (iii) operating the BWRO in batch mode or closed circuit mode. Applying more stages not only reduces SEC but also improves water recovery. However, this improvement is less pronounced when the number of stages exceeds four. Alternatively and more favourably, the BWRO system can be operated in Closed Circuit Desalination (CCD) mode, giving an SEC comparable to that of the 3-stage system at a recovery ratio of 80%. A further reduction of about 30% in SEC can be achieved through batch-RO operation. Moreover, the costly ERDs and booster pumps are avoided with both CCD and batch-RO, further lowering the cost of these innovative approaches. © 2012 by the authors.
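The energy comparisons above can be grounded in the usual idealised expressions for RO specific energy consumption at water recovery r, normalised by the feed osmotic pressure. These are standard thermodynamic limits (ideal pumps and ERD, no concentration polarisation), not figures quoted from the review:

```latex
\begin{align*}
  \text{single stage, no ERD:} \quad
    & \frac{SEC}{\pi_0} = \frac{1}{r\,(1-r)}, \\
  \text{single stage, ideal ERD:} \quad
    & \frac{SEC}{\pi_0} = \frac{1}{1-r}, \\
  \text{reversible limit (batch / many stages):} \quad
    & \frac{SEC}{\pi_0} = \frac{1}{r}\,\ln\!\frac{1}{1-r}.
\end{align*}
% At r = 0.8 these give 6.25, 5 and about 2.0 respectively, which illustrates
% why staging, CCD and batch operation close much of the gap to the minimum.
```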
Abstract:
Removal of dead or diseased cells is a crucial feature of apoptosis for managing many biological processes such as tissue remodelling, tissue homeostasis, and the resolution and control of immune responses throughout life. Tissue transglutaminase (TG2) is a protein crosslinking enzyme that has been implicated in apoptotic cell clearance but also mediates many important cell functions including cell adhesion, migration and monocyte-macrophage differentiation. Cell surface-associated TG2 regulates cell adhesion and migration via its association with receptors such as syndecan-4 and β1 and β3 integrins. Whilst defective apoptotic cell clearance has been described in TG2-deficient mice, the precise extracellular role of TG2 in apoptotic cell clearance remains ill-defined. This thesis addresses macrophage TG2 in cell corpse clearance. TG2 expression (cytosolic and cell surface) in human macrophages was revealed, and the data demonstrate that loss of TG2 activity through the use of functional inhibitors, including cell-impermeable inhibitors, significantly inhibits the ability of macrophages to clear apoptotic cells (AC). This includes reduced macrophage recruitment to, and binding of, apoptotic cells. Association studies reveal TG2-syndecan-4 interaction through heparan sulphate side chains, and knockdown of syndecan-4 reduces cell surface TG2 activity and apoptotic cell clearance. Furthermore, inhibition of TG2 activity reduces crosslinking of CD44, reported to augment AC clearance. Thus this work defines for the first time a role for TG2 activity at the cell surface of human macrophages in multiple stages of AC clearance and proposes that TG2, in association with heparan sulphates, may exert its effect on AC clearance via crosslinking of CD44.
Abstract:
Uncertainty can be defined as the difference between information that is represented in an executing system and the information that is both measurable and available about the system at a certain point in its life-time. A software system can be exposed to multiple sources of uncertainty produced by, for example, ambiguous requirements and unpredictable execution environments. A runtime model is a dynamic knowledge base that abstracts useful information about the system, its operational context and the extent to which the system meets its stakeholders' needs. A software system can successfully operate in multiple dynamic contexts by using runtime models that augment information available at design-time with information monitored at runtime. This chapter explores the role of runtime models as a means to cope with uncertainty. To this end, we introduce a well-suited terminology about models, runtime models and uncertainty and present a state-of-the-art summary of model-based techniques for addressing uncertainty both at development time and at runtime. Using a case study about robot systems, we discuss how current techniques and the MAPE-K loop can be used together to tackle uncertainty. Furthermore, we propose possible extensions of the MAPE-K loop architecture with runtime models to further handle uncertainty at runtime. The chapter concludes by identifying key challenges and enabling technologies for using runtime models to address uncertainty, and also identifies closely related research communities that can foster ideas for resolving the challenges raised. © 2014 Springer International Publishing.
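To illustrate the idea in code (class names, numbers and the robot scenario are illustrative assumptions, not the chapter's own material), a runtime model can serve as the shared knowledge of a MAPE-K loop, blending design-time estimates with monitored values so that the analysis step works on less uncertain data:

```python
"""Minimal MAPE-K skeleton with a runtime model as the shared knowledge base."""
from dataclasses import dataclass


@dataclass
class RuntimeModel:
    """Knowledge: design-time assumptions augmented with monitored values."""
    observed_battery_drain: float = 1.0   # starts at the design-time estimate (%/min)
    battery_level: float = 100.0

    def update(self, battery_level: float, drain: float) -> None:
        self.battery_level = battery_level
        # Blend the prior estimate with runtime evidence to reduce uncertainty.
        self.observed_battery_drain = 0.8 * self.observed_battery_drain + 0.2 * drain


def monitor(reading: dict, model: RuntimeModel) -> None:
    model.update(reading["battery"], reading["drain"])


def analyse(model: RuntimeModel) -> bool:
    # Uncertainty-aware check: is there enough battery left for a 10-minute return trip?
    minutes_needed = 10
    return model.battery_level < minutes_needed * model.observed_battery_drain


def plan(needs_adaptation: bool) -> str:
    return "return_to_dock" if needs_adaptation else "continue_mission"


def execute(action: str) -> None:
    print(f"executing: {action}")


if __name__ == "__main__":
    model = RuntimeModel()
    for reading in ({"battery": 40.0, "drain": 1.5}, {"battery": 12.0, "drain": 2.0}):
        monitor(reading, model)
        execute(plan(analyse(model)))
```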
Abstract:
Biochemical changes brought about by the influence of the contact lens on the tear film are conveniently split into two categories. Firstly, the lens can remove or reduce the levels of specific components in the tear film, and secondly, the lens can augment the tear film by stimulating the influx of new components or increasing the level of existing components. The most obvious tear film components for study in this context are lipids, proteins, mucins and electrolytes. The interactions are affected by the properties of the lens, the characteristics of the individual wearer and the wear schedule. An additional complication is that the lens is many times thicker than the tear film, and any immobilised tear components will be more extensively exposed to oxygen and UV radiation than is the case in the absence of a lens. It is arguably the lipoidal components that are most markedly affected by lens wear, since their immobilisation on the lens surface markedly increases their susceptibility to autoxidative degradation. The limited information that is available highlights the importance of subject specificity and suggests that lipid oxidation phenomena are potentially important in contributing to the 'end of day' discomfort of symptomatic contact lens patients. It is clear that tear lipids, although regarded as relatively inert for many years, are now seen as a reactive and potentially important family of compounds in the search for understanding of contact lens-induced discomfort. The influence of the lens on tear proteins shows the greatest range of complexity. Deposition and denaturation can stimulate immune response, lower molecular weight proteins can be extensively absorbed into the lens matrix, and the lens can stimulate cascade or upregulation processes leading either to the generation of additional proteins and peptides or to an increase in concentration of existing components. Added to this is the stimulating influence of the lens on vascular leakage, leading to the influx of plasma proteins such as albumin. The evidence from studies of mucin expression in tears is not consistent and conclusive. This is in part because sample sources, lens materials and methods of analysis vary considerably, and in some cases the study population numbers are low. Expression levels show mucin and material specificity but clear patterns of behaviour are elusive. The electrolyte composition of tears is significantly different from that of other body fluids. Sodium and potassium dominate, but potassium ion concentrations in tears are much higher than serum levels. Calcium and magnesium concentrations in tears are lower than in serum but closer to interstitial fluids. The contact lens provides the potential for increased osmolarity through enhanced evaporation and differential electrolyte concentrations between the anterior and posterior tear films. Since the changes in ocular biochemistry consequent upon contact lens wear are known to be subject-dependent - as indeed is wearer response to the lens - pre-characterisation of individual participant tear chemistry in clinical studies would enhance understanding of these complex effects. © 2013 Elsevier Ltd.
Abstract:
Purpose: Ind suggests front line employees can be segmented according to their level of brand-supporting performance. His employee typology has not been empirically tested. The paper aims to explore front line employee performance in retail banking and to profile employee types. Design/methodology/approach: Attitudinal and demographic data from a sample of 404 front line service employees in a leading Irish bank inform a typology of service employees. Findings: Champions, Outsiders and Disruptors exist within retail banking. The authors provide an employee profile for each employee type. They found Champions amongst males and older employees. The highest proportion of female employees surveyed were Outsiders. Disruptors were more likely to complain, and rated their performance lower than any other employee type. Contrary to extant literature, Disruptors were more likely to hold a permanent contract than other employee types. Originality/value: The authors augment the literature by providing insights about the profile of three employee types: Brand Champions, Outsiders and Disruptors. Moreover, the authors postulate the influence of leadership and commitment on each employee type. The cluster profiles raise important questions for hiring, training and rewarding front line banking employees. The authors also provide guidelines for managers to encourage Champions and curtail Disruptors. © Emerald Group Publishing Limited.
High stress monitoring of prestressing tendons in nuclear concrete vessels using fibre-optic sensors
Abstract:
Maintaining the structural health of prestressed concrete nuclear containments is a key element in ensuring nuclear reactors are capable of meeting their safety requirements. This paper discusses the attachment, fabrication and characterisation of optical fibre strain sensors suitable for the prestress monitoring of irradiated steel prestressing tendons. The all-metal fabrication and welding process allowed the instrumented strand to simultaneously monitor and apply stresses up to 1300 MPa (80% of the steel's ultimate tensile strength). There were no adverse effects on the strand's mechanical properties or integrity. After sensor relaxation through cyclic stress treatment, strain transfer between the optical fibre sensors and the strand remained at 69%. The fibre strain sensors could also withstand the non-axial forces induced as the strand was deflected around a 4.5 m bend radius. Further development of this technology has the potential to augment current prestress monitoring practices, allowing distributed measurements of short- and long-term prestress losses in nuclear prestressed-concrete vessels. © 2014 Elsevier B.V.
Abstract:
Background: A new commercially available device (IOLMaster, Zeiss Instruments) provides high-resolution non-contact measurements of axial length (using partial coherence interferometry), anterior chamber depth, and corneal radius (using image analysis). The study evaluates the validity and repeatability of these measurements and compares the findings with those obtained from instrumentation currently used in clinical practice. Method: Measurements were taken on 52 subjects (104 eyes) aged 18-40 years with a range of mean spherical refractive error from +7.0 D to -9.50 D. IOLMaster measurements of anterior chamber depth and axial length were compared with A-scan applanation ultrasonography (Storz Omega) and those for corneal radius with a Javal-Schiötz keratometer (Topcon) and an EyeSys corneal videokeratoscope. Results: Axial length: the difference between IOLMaster and ultrasound measures was not significant (0.02 (SD 0.32) mm, p = 0.47), with no bias across the range sampled (22.40-27.99 mm). Anterior chamber depth: significantly shorter depths than ultrasound were found with the IOLMaster (-0.06 (0.25) mm, p < 0.02), with no bias across the range sampled (2.85-4.40 mm). Corneal radius: IOLMaster measurements matched more closely those of the keratometer than those of the videokeratoscope (mean difference -0.03 v -0.06 mm, respectively), but were more variable (95% confidence 0.13 v 0.07 mm). The repeatability of all the above IOLMaster biometric measures was found to be of a high order, with no significant bias across the measurement ranges sampled. Conclusions: The validity and repeatability of measurements provided by the IOLMaster will augment future studies in ocular biometry.
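Agreement results of this kind are commonly summarised as a bias with limits of agreement. The short sketch below shows that calculation on placeholder numbers; the arrays are illustrative, not the study's data:

```python
"""Limits-of-agreement (Bland-Altman style) summary for paired measurements."""
import numpy as np

# Hypothetical paired axial-length readings (mm) from two instruments.
iolmaster = np.array([23.10, 24.05, 25.40, 22.80, 26.10])
ultrasound = np.array([23.05, 24.10, 25.35, 22.85, 26.05])

diff = iolmaster - ultrasound
mean_diff = diff.mean()                  # bias between the methods
sd_diff = diff.std(ddof=1)               # spread of the differences
loa = (mean_diff - 1.96 * sd_diff,       # 95% limits of agreement
       mean_diff + 1.96 * sd_diff)

print(f"bias = {mean_diff:.3f} mm (SD {sd_diff:.3f}), "
      f"95% LoA = {loa[0]:.3f} to {loa[1]:.3f} mm")
```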
Abstract:
Tissue transglutaminase (TG2) is a multifunctional protein cross-linking enzyme that has been implicated in apoptotic cell clearance but is also important in many other cell functions including cell adhesion, migration and monocyte-to-macrophage differentiation. Cell surface-associated TG2 regulates cell adhesion and migration via its association with receptors such as syndecan-4 and β1 and β3 integrins. Whilst defective apoptotic cell clearance has been described in TG2-deficient mice, the precise role of TG2 in apoptotic cell clearance remains ill-defined. Our work addresses the role of macrophage extracellular TG2 in apoptotic cell corpse clearance. Here we reveal TG2 expression and activity (cytosolic and cell surface) in human macrophages and demonstrate that inhibitors of protein crosslinking activity reduce macrophage clearance of dying cells. We also show that cell-impermeable TG2 inhibitors significantly inhibit the ability of macrophages to migrate and clear apoptotic cells (AC) through reduced macrophage recruitment to, and binding of, apoptotic cells. Association studies reveal TG2-syndecan-4 interaction through heparan sulphate side chains, and knockdown of syndecan-4 reduces cell surface TG2 activity and apoptotic cell clearance. Furthermore, inhibition of TG2 activity reduces crosslinking of CD44, reported to augment AC clearance. Thus our data define a role for TG2 activity at the surface of human macrophages in multiple stages of AC clearance, and we propose that TG2, in association with heparan sulphates, may exert its effect on AC clearance via a mechanism involving the crosslinking of CD44.