107 results for LCA methodology
Abstract:
Artifact removal from physiological signals is an essential component of the biosignal processing pipeline. The need for powerful and robust methods for this process has become particularly acute as healthcare technology deployment transitions from the current hospital-centric setting toward wearable and ubiquitous monitoring environments. Currently, determining the relative efficacy and performance of the multiple artifact removal techniques available on real-world data can be problematic, due to incomplete information on the uncorrupted desired signal. The majority of techniques are presently evaluated using simulated data, and therefore the quality of the conclusions is contingent on the fidelity of the model used. Consequently, in the biomedical signal processing community, there is considerable focus on the generation and validation of appropriate signal models for use in artifact suppression. Most approaches rely on mathematical models which capture suitable approximations to the signal dynamics or underlying physiology and, therefore, introduce some uncertainty into subsequent predictions of algorithm performance. This paper describes a more empirical approach to modeling the desired signal, demonstrated for functional brain monitoring tasks, which allows a ground-truth signal to be obtained that is highly correlated with the true desired signal contaminated by artifacts. The availability of this ground truth, together with the corrupted signal, can then aid in determining the efficacy of selected artifact removal techniques. A number of commonly implemented artifact removal techniques were evaluated using the described methodology to validate the proposed novel test platform. © 2012 IEEE.
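As a rough illustration of how such a ground truth can be used, the following Python sketch scores an artifact removal step by its correlation with the ground truth and by the change in signal-to-noise ratio. The synthetic signals, the moving-average "cleaner", and the `evaluate_artifact_removal` helper are illustrative assumptions, not the authors' test platform.

```python
import numpy as np

def evaluate_artifact_removal(ground_truth, corrupted, cleaned):
    """Score an artifact removal technique against a known ground-truth signal.

    Returns the correlation with the ground truth and the signal-to-noise
    ratio (in dB) before and after cleaning.
    """
    def snr_db(reference, estimate):
        noise = reference - estimate
        return 10.0 * np.log10(np.sum(reference ** 2) / np.sum(noise ** 2))

    correlation = np.corrcoef(ground_truth, cleaned)[0, 1]
    return {
        "correlation": correlation,
        "snr_before_db": snr_db(ground_truth, corrupted),
        "snr_after_db": snr_db(ground_truth, cleaned),
    }

# Illustrative use with synthetic data: a sinusoidal "desired" signal,
# an additive artifact, and a simple moving-average "removal" step.
t = np.linspace(0, 1, 1000)
truth = np.sin(2 * np.pi * 10 * t)
corrupted = truth + 0.5 * np.sign(np.sin(2 * np.pi * 1 * t))   # square-wave artifact
cleaned = np.convolve(corrupted, np.ones(5) / 5, mode="same")  # stand-in cleaner
print(evaluate_artifact_removal(truth, corrupted, cleaned))
```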
Abstract:
Q methodology was used to identify discourses among stakeholders on the environmental and resource dimensions of sustainability policies, and to gain an understanding of the usefulness of Q methodology in informing sustainability policy development. The application of Q methodology proved useful in identifying shared discourses between different stakeholder groups and in providing insights into how stakeholders frame or understand policy issues; recommendations are made for ongoing research priorities. These insights, in turn, informed the choice of scenarios for a parallel process of policy evaluation using Ecological and Carbon Footprinting.
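For readers unfamiliar with the quantitative core of Q methodology, the sketch below shows the usual by-person factor analysis step on hypothetical Q-sort data: participants' rankings are correlated with one another and the leading factors indicate shared discourses. The rankings and the two-factor extraction are assumptions for illustration, not the study's data.

```python
import numpy as np

# Rows are participants, columns are statements; each row is one Q-sort
# (a forced ranking of statements). Values here are illustrative only.
q_sorts = np.array([
    [ 2,  1,  0, -1, -2,  1,  0, -1,  2, -2],
    [ 2,  0,  1, -2, -1,  1, -1,  0,  2, -2],
    [-2, -1,  0,  2,  1, -1,  2,  1, -2,  0],
    [-1, -2,  1,  2,  0, -2,  1,  2, -1,  0],
], dtype=float)

# By-person correlation matrix: how similarly participants sorted the statements.
corr = np.corrcoef(q_sorts)

# Principal components of the person-by-person correlation matrix; participants
# loading strongly on the same component share a discourse.
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[:2]] * np.sqrt(eigvals[order[:2]])

print("Participant loadings on the first two factors:")
print(np.round(loadings, 2))
```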
Abstract:
In order to achieve progress towards sustainable resource management, it is essential to evaluate options for the reuse and recycling of secondary raw materials, so as to provide a robust evidence base for decision makers. This paper presents the research undertaken in the development of a web-based decision-support tool (the used tyres resource efficiency tool) to compare three processing routes for used tyres against their existing primary alternatives. Primary data on the energy and material flows for the three routes and their alternatives were collected and analysed. The methodology used was a streamlined life-cycle assessment (sLCA) approach. The processes included were: car tyre baling against aggregate gabions; car tyre retreading against new car tyres; and car tyre shred used in landfill engineering against primary aggregates. The outputs of the assessment and web-based tool were estimates of raw materials used, carbon dioxide emissions and costs. The paper discusses the benefits of carrying out a streamlined LCA and of using the outputs of this analysis to develop a decision-support tool. The strengths and weaknesses of this approach are discussed, and future research priorities are identified that could facilitate the use of life cycle approaches by designers and practitioners.
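A minimal sketch of the kind of route comparison such a tool performs, assuming hypothetical per-functional-unit factors and a made-up `RouteImpacts` structure rather than the collected primary data:

```python
from dataclasses import dataclass

@dataclass
class RouteImpacts:
    """Per-functional-unit impacts for one processing route (illustrative fields)."""
    raw_material_kg: float
    co2_kg: float
    cost_gbp: float

def compare_routes(secondary: RouteImpacts, primary: RouteImpacts) -> RouteImpacts:
    """Savings from choosing the secondary (used-tyre) route over its primary alternative."""
    return RouteImpacts(
        raw_material_kg=primary.raw_material_kg - secondary.raw_material_kg,
        co2_kg=primary.co2_kg - secondary.co2_kg,
        cost_gbp=primary.cost_gbp - secondary.cost_gbp,
    )

# Hypothetical numbers for one route pair (tyre bales vs. aggregate gabions);
# real factors would come from the collected primary data.
tyre_bales = RouteImpacts(raw_material_kg=50.0, co2_kg=12.0, cost_gbp=180.0)
gabions = RouteImpacts(raw_material_kg=900.0, co2_kg=35.0, cost_gbp=260.0)
print(compare_routes(tyre_bales, gabions))
```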
Abstract:
Protein interactions play key roles throughout all subcellular compartments. In the present paper, we report the visualization of protein interactions throughout living mammalian cells using two oligomerizing MV (measles virus) transmembrane glycoproteins, the H (haemagglutinin) and the F (fusion) glycoproteins, which mediate MV entry into permissive cells. BiFC (bimolecular fluorescence complementation) has been used to examine the dimerization of these viral glycoproteins. The H glycoprotein is a type II membrane-receptor-binding homodimeric glycoprotein, and the F glycoprotein is a type I disulfide-linked membrane glycoprotein which homotrimerizes. Together they co-operate to allow the enveloped virus to enter a cell by fusing the viral and cellular membranes. We generated a pair of chimaeric H glycoproteins linked to complementary fragments of EGFP (enhanced green fluorescent protein), termed haptoEGFPs, which, on association, generate fluorescence. Homodimerization of H glycoproteins specifically drives this association, leading to the generation of a fluorescent signal in the ER (endoplasmic reticulum), the Golgi and at the plasma membrane. Similarly, a pair of corresponding F glycoprotein-haptoEGFP chimaeras produced a comparable fluorescent signal. Co-expression of H and F glycoprotein chimaeras linked to complementary haptoEGFPs led to the formation of fluorescent fusion complexes at the cell surface, which retained their biological activity as evidenced by cell-to-cell fusion.
Abstract:
A systematic design methodology is described for the rapid derivation of VLSI architectures for implementing high-performance recursive digital filters, particularly ones based on most-significant-digit (msd) first arithmetic. The method has been derived by undertaking theoretical investigations of msd-first multiply-accumulate algorithms and by deriving important relationships governing the dependencies between circuit latency, levels of pipelining, and the range and number representations of filter operands. The techniques described are general and can be applied to both bit-parallel and bit-serial circuits, including those based on on-line arithmetic. The method is illustrated by applying it to the design of a number of highly pipelined bit-parallel IIR and wave digital filter circuits. It is shown that established architectures, which were previously designed using heuristic techniques, can be derived directly from the equations described.
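The recurrence these architectures implement in hardware is the multiply-accumulate chain of a recursive (IIR) filter. The software sketch below shows that recurrence only; it does not model msd-first arithmetic, operand representations, or pipelining, and the coefficients are illustrative.

```python
def iir_direct_form_1(x, b, a):
    """Direct-form I IIR filter: y[n] = sum(b[k]*x[n-k]) - sum(a[k]*y[n-k]).

    `a` holds the feedback coefficients a[1..M] (a[0] is taken as 1).
    Each output sample is a chain of multiply-accumulate operations, which
    is the recurrence the hardware architectures pipeline.
    """
    y = []
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k - 1] * y[n - k] for k in range(1, len(a) + 1) if n - k >= 0)
        y.append(acc)
    return y

# Example: impulse response of y[n] = 0.5*x[n] + 0.5*y[n-1]
print(iir_direct_form_1([1.0, 0.0, 0.0, 0.0], b=[0.5], a=[-0.5]))
```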
Abstract:
The need to account for the effect of design decisions on manufacture, and for the impact of manufacturing cost on the life-cycle cost of any product, is well established. In this context, digital design and manufacturing solutions have to be developed further to facilitate and automate the integration of cost as one of the major drivers in product life-cycle management. This article presents an integration methodology for implementing cost estimation capability within a digital manufacturing environment. A digital manufacturing structure of knowledge databases is set out, and an ontology of assembly and part costing that is consistent with this structure is provided. Although the methodology is currently used for recurring cost prediction, it can equally be applied to other functional developments, such as process planning. A prototype tool is developed to integrate both assembly time costs and part manufacturing costs within the same digital environment. An industrial example is used to validate the approach.
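A minimal sketch of the recurring cost roll-up described, assuming hypothetical `Part` and `AssemblyStep` records and a flat labour rate; the paper's costing ontology and knowledge databases are not modelled here.

```python
from dataclasses import dataclass

@dataclass
class Part:
    name: str
    manufacturing_cost: float  # cost to make the part, from the knowledge database

@dataclass
class AssemblyStep:
    description: str
    time_minutes: float        # estimated assembly time for this step

def estimate_recurring_cost(parts, steps, labour_rate_per_minute):
    """Roll up part manufacturing costs and assembly time cost for one product."""
    part_cost = sum(p.manufacturing_cost for p in parts)
    assembly_cost = sum(s.time_minutes for s in steps) * labour_rate_per_minute
    return {"part_cost": part_cost, "assembly_cost": assembly_cost,
            "total_recurring_cost": part_cost + assembly_cost}

# Hypothetical bracket assembly
parts = [Part("bracket", 12.40), Part("fastener set", 1.10)]
steps = [AssemblyStep("position bracket", 1.5), AssemblyStep("torque fasteners", 2.0)]
print(estimate_recurring_cost(parts, steps, labour_rate_per_minute=0.9))
```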
Abstract:
A simple non-linear global-local finite element methodology is presented. A global coarse model, using 2-D shell elements, is solved non-linearly and the displacements and rotations around a region of interest are applied, as displacement boundary conditions, to a refined local 3-D model using Kirchhoff plate assumptions. The global elements' shape functions are used to interpolate between nodes. The local model is then solved non-linearly with an incremental scheme independent of that used for the global model.
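The interpolation step can be illustrated with standard bilinear shape functions for a 4-node element: global nodal displacements are interpolated to the locations of the local-model boundary nodes. The nodal values and the `interpolate_bc` helper below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def bilinear_shape_functions(xi, eta):
    """Standard 4-node quadrilateral shape functions in natural coordinates (-1..1)."""
    return 0.25 * np.array([
        (1 - xi) * (1 - eta),
        (1 + xi) * (1 - eta),
        (1 + xi) * (1 + eta),
        (1 - xi) * (1 + eta),
    ])

def interpolate_bc(global_nodal_values, xi, eta):
    """Interpolate a global element's nodal result (e.g. a displacement or rotation
    component) to a local-model boundary node located at (xi, eta) inside it."""
    return bilinear_shape_functions(xi, eta) @ global_nodal_values

# Hypothetical displacement component (mm) at the four corner nodes of one global
# shell element, interpolated to a local boundary node at the element centre.
u_global = np.array([0.10, 0.12, 0.15, 0.11])
print(interpolate_bc(u_global, xi=0.0, eta=0.0))  # -> 0.12
```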
Abstract:
Elevated intraocular pressure (IOP) is a major risk factor for the deterioration of open-angle glaucoma (OAG); medical IOP reduction is the standard treatment, yet no randomized placebo-controlled study of medical IOP reduction has been undertaken previously. The United Kingdom Glaucoma Treatment Study (UKGTS) tests the hypothesis that treatment with a topical prostaglandin analog, compared with placebo, reduces the frequency of visual field (VF) deterioration events in OAG patients by 50% over a 2-year period.
Abstract:
Well planned natural ventilation strategies and systems in the built environments may provide healthy and comfortable indoor conditions, while contributing to a significant reduction in the energy consumed by buildings. Computational Fluid Dynamics (CFD) is particularly suited for modelling indoor conditions in naturally ventilated spaces, which are difficult to predict using other types of building simulation tools. Hence, accurate and reliable CFD models of naturally ventilated indoor spaces are necessary to support the effective design and operation of indoor environments in buildings. This paper presents a formal calibration methodology for the development of CFD models of naturally ventilated indoor environments. The methodology explains how to qualitatively and quantitatively verify and validate CFD models, including parametric analysis utilising the response surface technique to support a robust calibration process. The proposed methodology is demonstrated on a naturally ventilated study zone in the library building at the National University of Ireland in Galway. The calibration process is supported by the on-site measurements performed in a normally operating building. The measurement of outdoor weather data provided boundary conditions for the CFD model, while a network of wireless sensors supplied air speeds and air temperatures inside the room for the model calibration. The concepts and techniques developed here will enhance the process of achieving reliable CFD models that represent indoor spaces and provide new and valuable information for estimating the effect of the boundary conditions on the CFD model results in indoor environments. © 2012 Elsevier Ltd.
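As a rough illustration of the response surface step in such a calibration, the sketch below fits a second-order polynomial surface to hypothetical (boundary-condition parameters, error metric) pairs and selects the parameter set with the smallest predicted error. All parameter names and numbers are assumptions for illustration, not the study's data.

```python
import numpy as np

# Hypothetical calibration runs: each row is (opening discharge coefficient,
# inlet air temperature in C) used as CFD boundary conditions; the response is
# a scalar error metric versus the wireless sensor measurements.
params = np.array([
    [0.50, 18.0], [0.50, 22.0], [0.65, 18.0],
    [0.65, 22.0], [0.80, 18.0], [0.80, 22.0],
    [0.65, 20.0], [0.50, 20.0], [0.80, 20.0],
])
error = np.array([0.42, 0.38, 0.25, 0.27, 0.33, 0.36, 0.22, 0.35, 0.30])

# Second-order polynomial response surface: 1, x1, x2, x1*x2, x1^2, x2^2
def design_matrix(p):
    x1, x2 = p[:, 0], p[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

coeffs, *_ = np.linalg.lstsq(design_matrix(params), error, rcond=None)

# Evaluate the fitted surface on a grid and pick the parameter set with the
# smallest predicted error as the calibrated boundary condition.
grid = np.array([[c, t] for c in np.linspace(0.5, 0.8, 31)
                        for t in np.linspace(18.0, 22.0, 41)])
predicted = design_matrix(grid) @ coeffs
best = grid[np.argmin(predicted)]
print("Calibrated parameters (discharge coefficient, inlet temperature):", best)
```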