989 results for Historical Methodology
Abstract:
Artifact removal from physiological signals is an essential component of the biosignal processing pipeline. The need for powerful and robust methods for this process has become particularly acute as healthcare technology deployment transitions from the current hospital-centric setting toward a wearable and ubiquitous monitoring environment. Currently, determining the relative efficacy and performance of the multiple artifact removal techniques available on real-world data can be problematic, due to incomplete information on the uncorrupted desired signal. The majority of techniques are presently evaluated using simulated data, and therefore the quality of the conclusions is contingent on the fidelity of the model used. Consequently, in the biomedical signal processing community, there is considerable focus on the generation and validation of appropriate signal models for use in artifact suppression. Most approaches rely on mathematical models which capture suitable approximations to the signal dynamics or underlying physiology and, therefore, introduce some uncertainty to subsequent predictions of algorithm performance. This paper describes a more empirical approach to modeling the desired signal, demonstrated for functional brain monitoring tasks, which allows the procurement of a ground-truth signal that is highly correlated with the true desired signal contaminated by artifacts. The availability of this ground truth, together with the corrupted signal, can then aid in determining the efficacy of selected artifact removal techniques. A number of commonly implemented artifact removal techniques were evaluated using the described methodology to validate the proposed novel test platform. © 2012 IEEE.
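The evaluation logic the abstract describes can be illustrated in miniature: given a ground-truth signal and a corrupted version of it, the output of any artifact removal technique can be scored against the ground truth. The sketch below is a hypothetical assumption of such a protocol (the toy artifact, the sliding-mean removal step, and the SNR/correlation metrics are all illustrative, not the paper's actual method or data):

```python
import numpy as np

# Toy ground truth: a 10 Hz sinusoid sampled at 500 Hz for 2 s.
t = np.linspace(0, 2, 1000)
ground_truth = np.sin(2 * np.pi * 10 * t)

# Toy artifact: a slow, large-amplitude square-wave disturbance.
artifact = 0.8 * np.sign(np.sin(2 * np.pi * 1 * t))
corrupted = ground_truth + artifact

# Stand-in "artifact removal" technique: subtract a sliding-window mean
# (a real candidate would be adaptive filtering, ICA, wavelets, etc.).
window = 101
baseline = np.convolve(corrupted, np.ones(window) / window, mode="same")
cleaned = corrupted - baseline

def snr_db(desired, estimate):
    """SNR of an estimate relative to the known desired signal, in dB."""
    noise = estimate - desired
    return 10 * np.log10(np.sum(desired**2) / np.sum(noise**2))

# With the ground truth available, efficacy can be quantified directly.
corr = np.corrcoef(ground_truth, cleaned)[0, 1]
print(f"SNR before removal: {snr_db(ground_truth, corrupted):.1f} dB")
print(f"SNR after removal:  {snr_db(ground_truth, cleaned):.1f} dB")
print(f"Correlation with ground truth: {corr:.3f}")
```

The point of the sketch is the scoring step, not the removal step: any technique can be slotted in where `cleaned` is computed and compared on identical terms.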
Abstract:
Q methodology was used to identify discourses among stakeholders on the environmental and resource dimensions of sustainability policies, and to gain an understanding of the usefulness of Q methodology in informing sustainability policy development. The application of Q methodology proved useful in identifying shared discourses between different stakeholder groups and in providing insights into how stakeholders frame or understand policy issues; recommendations are made for ongoing research priorities. These insights, in turn, informed the choice of scenarios for a parallel process of policy evaluation using Ecological and Carbon Footprinting.
Abstract:
Protein interactions play key roles throughout all subcellular compartments. In the present paper, we report the visualization of protein interactions throughout living mammalian cells using two oligomerizing MV (measles virus) transmembrane glycoproteins, the H (haemagglutinin) and the F (fusion) glycoproteins, which mediate MV entry into permissive cells. BiFC (bimolecular fluorescence complementation) has been used to examine the dimerization of these viral glycoproteins. The H glycoprotein is a type II membrane-receptor-binding homodimeric glycoprotein and the F glycoprotein is a type I disulfide-linked membrane glycoprotein which homotrimerizes. Together they co-operate to allow the enveloped virus to enter a cell by fusing the viral and cellular membranes. We generated a pair of chimaeric H glycoproteins linked to complementary fragments of EGFP (enhanced green fluorescent protein)--haptoEGFPs--which, on association, generate fluorescence. Homodimerization of H glycoproteins specifically drives this association, leading to the generation of a fluorescent signal in the ER (endoplasmic reticulum), the Golgi and at the plasma membrane. Similarly, the generation of a pair of corresponding F glycoprotein-haptoEGFP chimaeras also produced a comparable fluorescent signal. Co-expression of H and F glycoprotein chimaeras linked to complementary haptoEGFPs led to the formation of fluorescent fusion complexes at the cell surface which retained their biological activity as evidenced by cell-to-cell fusion.
Abstract:
A systematic design methodology is described for the rapid derivation of VLSI architectures for implementing high-performance recursive digital filters, particularly ones based on most significant digit (msd) first arithmetic. The method has been derived by undertaking theoretical investigations of msd-first multiply-accumulate algorithms and by deriving important relationships governing the dependencies between circuit latency, levels of pipelining, and the range and number representations of filter operands. The techniques described are general and can be applied to both bit-parallel and bit-serial circuits, including those based on on-line arithmetic. The method is illustrated by applying it to the design of a number of highly pipelined bit-parallel IIR and wave digital filter circuits. It is shown that established architectures, which were previously designed using heuristic techniques, can be derived directly from the equations described.
Abstract:
The need to account for the effect of design decisions on manufacture, and for the impact of manufacturing cost on the life-cycle cost of any product, is well established. In this context, digital design and manufacturing solutions have to be further developed to facilitate and automate the integration of cost as one of the major drivers in product life-cycle management. This article presents an integration methodology for implementing cost-estimation capability within a digital manufacturing environment. A digital manufacturing structure of knowledge databases is set out, and an ontology of assembly and part costing consistent with that structure is provided. Although the methodology is currently used for recurring-cost prediction, it can equally be applied to other functional developments, such as process planning. A prototype tool was developed to integrate both assembly time costs and part manufacturing costs within the same digital environment. An industrial example is used to validate the approach.
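The core rollup the abstract alludes to, combining part manufacturing costs with an assembly time cost in one product structure, can be sketched as follows. The class names, labour rate, and costing rule are assumptions for illustration, not the article's actual ontology or tool:

```python
from dataclasses import dataclass, field

@dataclass
class Part:
    name: str
    manufacturing_cost: float   # per-unit cost from a part-costing model

@dataclass
class Assembly:
    name: str
    parts: list = field(default_factory=list)
    assembly_time_s: float = 0.0      # estimated assembly time, seconds
    labour_rate_per_h: float = 60.0   # assumed labour rate

    def recurring_cost(self) -> float:
        """Recurring cost = sum of part costs + assembly time at labour rate."""
        parts_cost = sum(p.manufacturing_cost for p in self.parts)
        assembly_cost = (self.assembly_time_s / 3600.0) * self.labour_rate_per_h
        return parts_cost + assembly_cost

bracket = Assembly(
    name="bracket",
    parts=[Part("plate", 12.50), Part("stiffener", 4.75)],
    assembly_time_s=540.0,   # 9 minutes of assembly work -> 9.00 at 60/h
)
print(f"{bracket.recurring_cost():.2f}")  # 12.50 + 4.75 + 9.00 = 26.25
```

Holding both cost categories in one structure is what lets a design change (a cheaper part, a longer assembly sequence) be re-costed immediately, which is the integration benefit the article argues for.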
Abstract:
A simple non-linear global-local finite element methodology is presented. A global coarse model, using 2-D shell elements, is solved non-linearly and the displacements and rotations around a region of interest are applied, as displacement boundary conditions, to a refined local 3-D model using Kirchhoff plate assumptions. The global elements' shape functions are used to interpolate between nodes. The local model is then solved non-linearly with an incremental scheme independent of that used for the global model.
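The boundary-condition transfer step described above, interpolating the global solution to the local model's boundary nodes using the global elements' shape functions, can be sketched for a 4-node quadrilateral element with bilinear shape functions. The element type and the sample displacement values are illustrative assumptions, not the paper's specific global shell element:

```python
import numpy as np

def bilinear_shape(xi, eta):
    """Shape functions of a 4-node quad in natural coordinates (-1..+1)."""
    return 0.25 * np.array([
        (1 - xi) * (1 - eta),   # N1, global node at (-1, -1)
        (1 + xi) * (1 - eta),   # N2, global node at (+1, -1)
        (1 + xi) * (1 + eta),   # N3, global node at (+1, +1)
        (1 - xi) * (1 + eta),   # N4, global node at (-1, +1)
    ])

# One displacement component (e.g. u_x) at the coarse element's corner
# nodes, taken from the non-linear global shell solution.
u_global = np.array([0.0, 1.0, 2.0, 1.0])

# A local-model boundary node at natural coordinates (0.5, -1): on the
# bottom edge, three quarters of the way from global node 1 to node 2.
u_local = bilinear_shape(0.5, -1.0) @ u_global
print(u_local)  # on that edge this reduces to 0.25*u1 + 0.75*u2
```

Evaluating the same shape functions at every local boundary node yields the full set of interpolated displacement boundary conditions for the refined local model; rotations are transferred the same way.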
Abstract:
In this article we review recent work on the history of French negation in relation to three key issues in socio-historical linguistics: identifying appropriate sources, interpreting scant or anomalous data, and interpreting generational differences in historical data. We then turn to a new case study, that of verbal agreement with la plupart, to see whether this can shed fresh light on these issues. We argue that organising data according to the author’s date of birth is methodologically sounder than according to date of publication. We explore the extent to which different genres and text types reflect changing patterns of usage, and suggest that additional, different case studies are required in order to make more secure generalisations about the reliability of different sources.