940 results for attribute-based signature


Relevance:

30.00%

Publisher:

Abstract:

Low-frequency electromagnetic compatibility (EMC) is an increasingly important aspect in the design of practical systems, ensuring the functional safety and reliability of complex products. The opportunities for using numerical techniques to predict and analyze a system's EMC are therefore of considerable interest in many industries. As the first phase of the study, a proper model, including all details of the components, was required; therefore, advances in EMC modeling were surveyed and classified into analytical and numerical approaches. Finite element (FE) modeling, coupled with the distributed network method, was selected to model the converter's components and to obtain the converter's frequency behavioral model. The method is able to reveal the behavior of parasitic elements and higher resonances, which are critical in studying EMI problems. For the EMC and signature studies of machine drives, equivalent source modeling was investigated. Considering the details of the multi-machine environment, including actual machine models, innovations in equivalent source modeling were introduced that decrease the simulation time dramatically. Several models were designed in this study; the voltage-current cube model and the wire model gave the best results. A GA-based particle swarm optimization (PSO) method was used for the optimization process. Superposition and suppression of the fields when coupling the components were also studied and verified. The simulation time of the equivalent model is 80-100 times lower than that of the detailed model. All tests were verified experimentally. As an application of the EMC and signature study, fault diagnosis and condition monitoring of an induction motor drive were developed using radiated fields. In addition to experimental tests, 3D FE analysis was coupled with circuit-based software to implement the incipient fault cases. Identification was implemented using an artificial neural network (ANN) for seventy different fault cases. The simulation results were verified experimentally. Finally, identification of the types of power components was implemented. The results show that it is possible to identify the type of components, as well as the faulty components, by comparing the amplitudes of their stray-field harmonics. Identification using stray fields is nondestructive and can be used for setups that cannot go offline and be dismantled.
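The GA-based PSO optimisation mentioned above can be illustrated with a plain particle swarm. The sketch below is a minimal stand-in: a toy sphere objective replaces the actual field-matching cost function, and the GA hybridisation and all parameter values here are illustrative, not the authors'.

```python
# Minimal particle swarm optimisation (PSO) sketch. The objective here is a
# toy sphere function; in the abstract's setting it would be the mismatch
# between the equivalent-source model and the detailed FE field solution.
import random

random.seed(0)  # reproducible demo

def pso(objective, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # per-particle best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```

A GA-hybridised variant would additionally apply crossover/mutation to stagnant particles; the plain swarm above shows only the velocity/position update at the core of the method.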

Relevance:

30.00%

Publisher:

Abstract:

This thesis investigated the risk of accidental release of hydrocarbons during transportation and storage. Transportation of hydrocarbons from an offshore platform to processing units through subsea pipelines involves a risk of release due to pipeline leakage resulting from corrosion, plastic deformation caused by seabed shakedown, or damage by contact with a drifting iceberg. The environmental impacts of hydrocarbon dispersion can be severe, and the overall safety and economic concerns of pipeline leakage in the subsea environment are immense. A large leak can be detected with conventional technology such as radar, intelligent pigging, or chemical tracers, but in a remote location such as the subsea or the Arctic, a small chronic leak may go undetected for a period of time. In the case of storage, an accidental release of hydrocarbons from a storage tank could lead to a pool fire, which could further escalate into domino effects; this chain of accidents may lead to extremely severe consequences. Analysis of past accident scenarios shows that more than half of industrial domino accidents involved fire as the primary event, with other contributing factors including wind speed and direction, fuel type, and engulfment of the compound. In this thesis, a computational fluid dynamics (CFD) approach is taken to model the subsea pipeline leak and the pool fire from a storage tank. The commercial software package ANSYS FLUENT Workbench 15 is used to model the subsea pipeline leakage. The CFD simulation results for four different fluids showed that the static pressure and the pressure gradient along the axial length of the pipeline exhibit a sharp signature variation near the leak orifice under steady-state conditions. A transient simulation is performed to obtain the acoustic signature of the pipe near the leak orifice. The power spectral density (PSD) of the acoustic signal is strong near the leak orifice and dissipates as the distance and orientation from the leak orifice increase. The high-pressure fluid flow generates more noise than the low-pressure fluid flow. To model the pool fire from the storage tank, ANSYS CFX Workbench 14 is used. The CFD results show that wind speed contributes significantly to the behavior of the pool fire and its domino effects. Radiation contours are also obtained from CFD post-processing, which can be applied in risk analysis. The outcome of this study will be helpful for a better understanding of the domino effects of pool fires in complex geometrical settings of process industries. Approaches to reducing and preventing these risks are discussed based on the results of the numerical simulations.
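The PSD analysis described above can be sketched numerically. Below is a minimal single-sided periodogram in plain Python, with a synthetic tone standing in for the simulated leak noise; the signal parameters are invented, and the actual spectra in the thesis come from the transient CFD solution.

```python
# Single-sided periodogram as a crude power-spectral-density (PSD) estimate.
# A direct DFT (O(n^2)) is used to stay dependency-free; for real work an FFT
# (e.g. numpy.fft.rfft or scipy.signal.welch) would be the obvious choice.
import cmath
import math

def periodogram(x, fs):
    """Return (freqs, psd) for a real signal x sampled at fs Hz."""
    n = len(x)
    freqs, psd = [], []
    for k in range(n // 2 + 1):                       # non-negative frequencies
        X = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        p = abs(X) ** 2 / (fs * n)
        if 0 < k < n // 2:                            # fold negative freqs in
            p *= 2.0
        freqs.append(k * fs / n)
        psd.append(p)
    return freqs, psd

# Synthetic "leak noise": a pure 50 Hz tone, so the PSD peak is predictable.
fs = 1000.0
x = [math.sin(2 * math.pi * 50.0 * t / fs) for t in range(256)]
freqs, psd = periodogram(x, fs)
peak_freq = freqs[max(range(len(psd)), key=lambda k: psd[k])]
```

With a 256-sample window the bins are fs/256 ≈ 3.9 Hz apart, so the detected peak lands within one bin of the true 50 Hz tone.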

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Professionalism is a key attribute for health professionals. Yet, it is unknown how much faculty development is directed toward skills and behaviours of faculty professionalism. Faculty professionalism includes boundaries in teacher-student relationships, self-reflection, assuring one's own fitness for duty, and maintaining confidentiality when appropriate. METHODS: For five years, we have incorporated faculty professionalism as a routine agenda item for the monthly Physician Assistant Programme faculty meetings, allowing faculty members to introduce issues they are comfortable sharing or have questions about. We also have case discussions of faculty professionalism within faculty meetings every three months. RESULTS: Faculty professionalism is important in the daily work lives of faculty members and including this as part of routine agendas verifies its importance. A faculty survey showed that a majority look forward to the quarterly faculty professionalism case discussions. These have included attempted influence in the admissions process, student/faculty social boundaries, civic professionalism, students requesting medical advice, and self-disclosure. CONCLUSION: A preventive approach works better than a reactionary approach to faculty missteps in professionalism. Routine discussion of faculty professionalism normalizes the topic and is helpful to both new and experienced faculty members. We recommend incorporation of faculty professionalism as a regular agenda item in faculty meetings.

Relevance:

30.00%

Publisher:

Abstract:

The unprecedented and relentless growth in the electronics industry is feeding the demand for integrated circuits (ICs) with increasing functionality and performance at minimum cost and power consumption. As predicted by Moore's law, ICs are being aggressively scaled to meet this demand. While the continuous scaling of process technology is reducing gate delays, the performance of ICs is being increasingly dominated by interconnect delays. In an effort to improve submicrometer interconnect performance, to increase packing density, and to reduce chip area and power consumption, the semiconductor industry is focusing on three-dimensional (3D) integration. However, volume production and commercial exploitation of 3D integration are not feasible yet due to significant technical hurdles.

At the present time, interposer-based 2.5D integration is emerging as a precursor to stacked 3D integration. All the dies and the interposer in a 2.5D IC must be adequately tested for product qualification. However, since the structure of 2.5D ICs differs from that of traditional 2D ICs, new challenges have emerged: (1) pre-bond interposer testing, (2) lack of test access, (3) limited ability for at-speed testing, (4) high-density I/O ports and interconnects, (5) a reduced number of test pins, and (6) high power consumption. This research targets the above challenges, and effective solutions have been developed to test both the dies and the interposer.

The dissertation first introduces the basic concepts of 3D ICs and 2.5D ICs. Prior work on testing of 2.5D ICs is studied. An efficient method is presented to locate defects in a passive interposer before stacking. The proposed test architecture uses e-fuses that can be programmed to connect or disconnect functional paths inside the interposer. The concept of a die footprint is utilized for interconnect testing, and the overall assembly and test flow is described. Moreover, the concept of weighted critical area is defined and utilized to reduce test time. In order to fully determine the location of each e-fuse and the order of functional interconnects in a test path, we also present a test-path design algorithm. The proposed algorithm can generate all test paths for interconnect testing.

In order to test for opens, shorts, and interconnect delay defects in the interposer, a test architecture is proposed that is fully compatible with the IEEE 1149.1 standard and relies on an enhancement of the standard test access port (TAP) controller. To reduce test cost, a test-path design and scheduling technique is also presented that minimizes a composite cost function based on test time and the design-for-test (DfT) overhead in terms of additional through silicon vias (TSVs) and micro-bumps needed for test access. The locations of the dies on the interposer are taken into consideration in order to determine the order of dies in a test path.

To address the scenario of high-density I/O ports and interconnects, an efficient built-in self-test (BIST) technique is presented that targets the dies and the interposer interconnects. The proposed BIST architecture can be enabled by the standard TAP controller in the IEEE 1149.1 standard. The area overhead introduced by this BIST architecture is negligible; it includes two simple BIST controllers, a linear-feedback shift register (LFSR), a multiple-input signature register (MISR), and some extensions to the boundary-scan cells in the dies on the interposer. With these extensions, all boundary-scan cells can be used for self-configuration and self-diagnosis during interconnect testing. To reduce the overall test cost, a test scheduling and optimization technique under power constraints is described.
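The LFSR/MISR pair at the heart of such a BIST scheme can be sketched in software. The width, feedback taps, and injected error below are illustrative, not the dissertation's actual DfT hardware: the LFSR generates pseudo-random test patterns, and the MISR folds the interconnect responses into one signature compared against a fault-free reference.

```python
# Software sketch of a BIST pattern generator (LFSR) and response compactor
# (MISR). A 4-bit Fibonacci LFSR with a primitive feedback polynomial cycles
# through all 15 non-zero states before repeating.

def lfsr_states(seed, taps, width, steps):
    """Fibonacci LFSR: feedback is the XOR of the tapped bit positions."""
    mask = (1 << width) - 1
    state, out = seed, []
    for _ in range(steps):
        out.append(state)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = ((state << 1) | fb) & mask
    return out

def misr_signature(responses, taps, width):
    """Fold a stream of response words into one compact signature."""
    mask = (1 << width) - 1
    sig = 0
    for r in responses:
        fb = 0
        for t in taps:
            fb ^= (sig >> t) & 1
        sig = (((sig << 1) | fb) ^ r) & mask
    return sig

patterns = lfsr_states(seed=0b1001, taps=(3, 0), width=4, steps=15)
good_sig = misr_signature(patterns, taps=(3, 0), width=4)
# A single-bit error in the last captured response is guaranteed to change
# the signature; longer error streams can alias with probability ~2**-width.
faulty = patterns[:-1] + [patterns[-1] ^ 0b0001]
bad_sig = misr_signature(faulty, taps=(3, 0), width=4)
```

The aliasing comment is why practical MISRs are much wider than 4 bits; the toy width just keeps the example readable.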

In order to accomplish testing with a small number of test pins, the dissertation presents two efficient ExTest scheduling strategies that implement interconnect testing between tiles inside a system-on-chip (SoC) die on the interposer while satisfying the practical constraint that the number of required test pins cannot exceed the number of available pins at the chip level. The tiles in the SoC are divided into groups based on the manner in which they are interconnected. In order to minimize the test time, two optimization solutions are introduced: the first minimizes the number of input test pins, and the second minimizes the number of output test pins. In addition, two subgroup configuration methods are proposed to generate subgroups inside each test group.

Finally, the dissertation presents a programmable method for shift-clock stagger assignment to reduce power supply noise during SoC die testing in 2.5D ICs. An SoC die in a 2.5D IC is typically composed of several blocks, and two neighboring blocks that share the same power rails should not be toggled at the same time during shift. Therefore, the proposed programmable method does not assign the same stagger value to neighboring blocks. The positions of all blocks are first analyzed, and the shared boundary length between blocks is then calculated. Based on the positional relationships between the blocks, a mathematical model is presented to derive optimal results for small-to-medium-sized problems. For larger designs, a heuristic algorithm is proposed and evaluated.

In summary, the dissertation targets important design and optimization problems related to testing of interposer-based 2.5D ICs. The proposed research has led to theoretical insights, experimental results, and a set of test and design-for-test methods that make testing effective and feasible from a cost perspective.

Relevance:

30.00%

Publisher:

Abstract:

This thesis involved the development of two biosensors and their associated assays for the detection of disease: IBR and BVD for veterinary use, and the C1q protein as a biomarker for pancreatic cancer for medical application, using surface plasmon resonance (SPR) and nanoplasmonics. SPR techniques have been used by a number of groups, both in research [1-3] and commercially [4, 5], as a diagnostic tool for the detection of various biomolecules, especially antibodies [6-8]. The biosensor market is an ever-expanding field, with new technology and new companies rapidly emerging for both human [8] and veterinary applications [9, 10]. In Chapter 2, we discuss the development of a simultaneous IBR and BVD virus assay for the detection of antibodies in bovine serum on an SPR-2 platform. Pancreatic cancer is the most lethal cancer by organ site, partially due to the lack of a reliable molecular signature for diagnostic testing; C1q protein has recently been proposed as a biomarker within a panel for the detection of pancreatic cancer. The third chapter discusses the fabrication, assays, and characterisation of nanoplasmonic arrays. We describe the development of C1q scFv antibody assays, clone screening of the antibodies, and the subsequent transfer of the assays onto the nanoplasmonic array platform for static assays, as well as onto a custom hybrid benchtop system, as a diagnostic method for the detection of pancreatic cancer. Finally, in Chapter 4, we move on to guided mode resonance (GMR) sensors as a low-cost option for potential use in point-of-care diagnostics. The C1q and BVD assays used in the prior formats are transferred to this platform to ascertain its usability as a cost-effective, reliable sensor for diagnostic testing. We discuss the fabrication, characterisation, and assay development, as well as their use in the benchtop hybrid system.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Conventional staging methods are inadequate to identify patients with stage II colon cancer (CC) who are at high risk of recurrence after surgery with curative intent. ColDx is a gene expression, microarray-based assay shown to be independently prognostic for recurrence-free interval (RFI) and overall survival in CC. The objective of this study was to further validate ColDx using formalin-fixed, paraffin-embedded specimens collected as part of the Alliance phase III trial, C9581.

PATIENTS AND METHODS: C9581 evaluated edrecolomab versus observation in patients with stage II CC and reported no survival benefit. Under an initial case-cohort sampling design, a randomly selected subcohort (RS) comprised 514 patients from 901 eligible patients with available tissue. Forty-nine additional patients with recurrence events were included in the analysis. Final analysis comprised 393 patients: 360 RS (58 events) and 33 non-RS events. Risk status was determined for each patient by ColDx. The Self-Prentice method was used to test the association between the resulting ColDx risk score and RFI adjusting for standard prognostic variables.

RESULTS: Fifty-five percent of patients (216 of 393) were classified as high risk. After adjustment for prognostic variables that included mismatch repair (MMR) deficiency, ColDx high-risk patients exhibited significantly worse RFI (multivariable hazard ratio, 2.13; 95% CI, 1.3 to 3.5; P < .01). Age and MMR status were marginally significant. RFI at 5 years for patients classified as high risk was 82% (95% CI, 79% to 85%), compared with 91% (95% CI, 89% to 93%) for patients classified as low risk.

CONCLUSION: ColDx is associated with RFI in the C9581 subsample in the presence of other prognostic factors, including MMR deficiency. ColDx could be incorporated with the traditional clinical markers of risk to refine patient prognosis.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Our purpose in this report was to define genes and pathways dysregulated as a consequence of the t(4;14) in myeloma, and to gain insight into the downstream functional effects that may explain the different prognosis of this subgroup. Experimental Design: Fibroblast growth factor receptor 3 (FGFR3) overexpression, the presence of immunoglobulin heavy chain-multiple myeloma SET domain (IgH-MMSET) fusion products, and the identification of t(4;14) breakpoints were determined in a series of myeloma cases. Differentially expressed genes were identified between cases with (n = 55) and without (n = 24) a t(4;14) by using global gene expression analysis. Results: Cases with a t(4;14) have a distinct expression pattern compared with other cases of myeloma. A total of 127 genes were identified as being differentially expressed, including MMSET and cyclin D2, which have been previously reported as being associated with this translocation. Other important functional classes of genes include cell signaling, apoptosis and related genes, oncogenes, chromatin structure, and DNA repair genes. Interestingly, 25% of myeloma cases lacking evidence of this translocation had up-regulation of the MMSET transcript to the same level as cases with a translocation. Conclusions: t(4;14) cases form a distinct subgroup of myeloma cases with a unique gene signature that may account for their poor prognosis. A number of non-t(4;14) cases also express MMSET, consistent with this gene playing a role in myeloma pathogenesis.

Relevance:

30.00%

Publisher:

Abstract:

A novel retrodirective array (RDA) architecture is proposed which utilises a special case spectral signature embedded within the data payload as pilot signals. With the help of a pair of phase-locked-loop (PLL) based phase conjugators (PCs) the RDA’s response to other unwanted and/or unfriendly interrogating signals can be disabled, leading to enhanced secrecy performance directly in the wireless physical layer. The effectiveness of the proposed RDA system is experimentally demonstrated.
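The phase-conjugation principle behind any retrodirective array can be sketched numerically: each element retransmits the conjugate of the phase it received, so the beam automatically steers back toward the interrogator without knowing its direction. The sketch below is a narrowband far-field model with illustrative geometry; the PLL-based conjugators and the pilot-signal gating of the actual system are not modelled.

```python
# Retrodirective array sketch: phase conjugation returns the beam toward the
# source. An 8-element uniform linear array with half-wavelength spacing.
import cmath
import math

def received_phases(n_elems, spacing_wl, theta):
    """Element phases for a plane wave arriving from angle theta (radians)."""
    return [2 * math.pi * spacing_wl * n * math.sin(theta) for n in range(n_elems)]

def array_factor(weights, spacing_wl, theta):
    """Far-field sum when each element transmits complex weight w_n."""
    return sum(w * cmath.exp(1j * 2 * math.pi * spacing_wl * n * math.sin(theta))
               for n, w in enumerate(weights))

N = 8
theta_src = math.radians(30)                      # interrogator direction
rx = received_phases(N, 0.5, theta_src)
weights = [cmath.exp(-1j * p) for p in rx]        # phase conjugation
back = abs(array_factor(weights, 0.5, theta_src))         # toward interrogator
away = abs(array_factor(weights, 0.5, math.radians(-10)))  # some other angle
```

Toward the interrogator the conjugated phases cancel exactly, so the response magnitude equals the element count N; at other angles the contributions add incoherently and the magnitude is smaller.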

Relevance:

30.00%

Publisher:

Abstract:

Hyperspectral instruments have been incorporated in satellite missions, providing high-spectral-resolution data of the Earth. These data can be used in remote sensing applications such as target detection, hazard prevention, and monitoring of oil spills, among others. In most of these applications, a requirement of paramount importance is the ability to give a real-time or near-real-time response. Recently, onboard processing systems have emerged to cope with the huge amount of data to be transferred from the satellite to the ground station, thereby avoiding delays between hyperspectral image acquisition and its interpretation. For this purpose, compact reconfigurable hardware modules such as field-programmable gate arrays (FPGAs) are widely used. This paper proposes a parallel FPGA-based architecture for endmember signature extraction. The method, based on the Vertex Component Analysis (VCA), has several advantages: it is unsupervised, fully automatic, and works without a dimensionality reduction (DR) pre-processing step. The architecture has been designed for a low-cost Xilinx Zynq board with a Zynq-7020 SoC FPGA, based on the Artix-7 FPGA programmable logic, and tested using real hyperspectral data sets collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the Cuprite mining district in Nevada. Experimental results indicate that the proposed implementation can achieve real-time processing while maintaining the method's accuracy, which indicates the potential of the proposed platform for implementing high-performance, low-cost embedded systems, opening new perspectives for onboard hyperspectral image processing.
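The orthogonal-projection idea underlying VCA-style endmember extraction can be sketched in a few lines: repeatedly pick the pixel with the largest component orthogonal to the endmembers found so far. This is a deliberately simplified stand-in (no random projections, no SNR-dependent subspace estimation, and none of the paper's FPGA parallelisation), run on an invented three-band toy scene.

```python
# Simplified endmember extraction in the spirit of VCA: at each step, the
# pixel with the largest residual orthogonal to the current endmember span
# is taken as the next endmember (a vertex of the data simplex).
import random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def orth_residual(v, basis):
    """Component of v orthogonal to an orthonormal basis (Gram-Schmidt step)."""
    r = list(v)
    for b in basis:
        c = dot(r, b)
        r = [ri - c * bi for ri, bi in zip(r, b)]
    return r

def residual_norm2(px, basis):
    r = orth_residual(px, basis)
    return dot(r, r)

def extract_endmembers(pixels, p):
    basis, endmembers = [], []
    for _ in range(p):
        best = max(pixels, key=lambda px: residual_norm2(px, basis))
        endmembers.append(best)
        r = orth_residual(best, basis)
        norm = dot(r, r) ** 0.5
        if norm > 1e-12:
            basis.append([x / norm for x in r])   # extend the orthonormal basis
    return endmembers

# Toy scene: convex mixtures of three "pure" spectra, plus the pure pixels.
pure = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
random.seed(1)
mixed = []
for _ in range(50):
    a = [random.random() for _ in range(3)]
    s = sum(a)
    a = [x / s for x in a]                        # abundances sum to one
    mixed.append([sum(a[i] * pure[i][band] for i in range(3)) for band in range(3)])
ems = extract_endmembers(mixed + pure, 3)
```

Because strict mixtures lie inside the simplex, the pure pixels always have the largest orthogonal residual, so the three vertices are recovered exactly in this toy case.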

Relevance:

30.00%

Publisher:

Abstract:

We give a relativistic spin network model for quantum gravity based on the Lorentz group and its q-deformation, the Quantum Lorentz Algebra. We propose a combinatorial model for the path integral given by an integral over suitable representations of this algebra. This generalises the state sum models for the case of the four-dimensional rotation group previously studied in gr-qc/9709028. As a technical tool, formulae for the evaluation of relativistic spin networks for the Lorentz group are developed, with some simple examples which show that the evaluation is finite in interesting cases. We conjecture that the `10J' symbol needed in our model has a finite value.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present a case-based reasoning (CBR) approach to solving educational timetabling problems. Following the basic idea behind CBR, the solutions of previously solved problems are employed to aid in finding solutions for new problems. A list of feature-value pairs is insufficient to represent all the necessary information. We show that attribute graphs can represent more information and thus can help to retrieve re-usable cases that have structures similar to the new problems. The case base is organised as a decision tree to store the attribute graphs of solved problems hierarchically. An example is given to illustrate the retrieval, re-use, and adaptation of structured cases. The results from our experiments show the effectiveness of the retrieval and adaptation in the proposed method.
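To illustrate why a graph representation retrieves more than a flat feature list, the sketch below matches cases by overlap of labelled edges. This is a crude stand-in for the decision-tree-organised, isomorphism-based retrieval in the paper, and all case data are invented.

```python
# Toy attribute-graph retrieval: a case is (node-labels, labelled edges).
# Structure (which events clash and why) survives node renaming, which a
# flat feature-value list would lose.

def edge_set(case):
    """Set of (label, relation, label) triples for the case's edges."""
    nodes, edges = case
    return {(nodes[a], rel, nodes[b]) for a, rel, b in edges}

def similarity(case_a, case_b):
    ea, eb = edge_set(case_a), edge_set(case_b)
    return len(ea & eb) / max(1, len(ea | eb))    # Jaccard over labelled edges

def retrieve(case_base, query):
    """Return the name of the most structurally similar stored case."""
    return max(case_base.items(), key=lambda kv: similarity(kv[1], query))[0]

# Events labelled by type; edges record why two events must not clash.
old1 = ({"e1": "lecture", "e2": "seminar", "e3": "lab"},
        [("e1", "same-students", "e2"), ("e2", "same-room", "e3")])
old2 = ({"e1": "lecture", "e2": "lecture"},
        [("e1", "same-lecturer", "e2")])
new = ({"a": "lecture", "b": "seminar", "c": "lab"},
       [("a", "same-students", "b"), ("b", "same-room", "c")])
best = retrieve({"case1": old1, "case2": old2}, new)
```

Although the new case uses entirely different event names, its clash structure matches `case1` exactly, so that case (and its solution) is the one retrieved for re-use.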

Relevance:

30.00%

Publisher:

Abstract:

The structured representation of cases by attribute graphs in a case-based reasoning (CBR) system for course timetabling has been the subject of previous research by the authors. In that system, the case base is organised as a decision tree, and the retrieval process chooses those cases which are sub-attribute-graph isomorphic to the new case. The drawback of that approach is that it is not suitable for solving large problems. This paper presents a multiple-retrieval approach that partitions a large problem into small solvable sub-problems by recursively inputting the unsolved part of the graph into the decision tree for retrieval. The adaptation combines the retrieved partial solutions of all the partitioned sub-problems and employs a graph heuristic method to construct the whole solution for the new case. We present a methodology which is not dependent upon problem-specific information and which, as such, underpins the goal of building more general timetabling systems. We also explore the question of whether this multiple-retrieval CBR could be an effective initialisation method for local search methods such as Hill Climbing, Tabu Search, and Simulated Annealing. Significant results are obtained from a wide range of experiments. An evaluation of the CBR system is presented, and the impact of the approach on timetabling research is discussed. We see that the approach does indeed represent an effective initialisation method for these local search methods.
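The role of CBR retrieval as an initialisation for local search can be sketched with a deterministic hill climber on a toy clash model. The conflict definition, data, and warm-start assignment below are invented for illustration; the point is only that local search starts from a retrieved assignment rather than a random one.

```python
# Hill climbing from a warm-start assignment: events sharing students (edges
# in `clashes`) must not share a timeslot. A CBR-retrieved partial solution
# plays the role of the initial assignment.

def conflicts(assign, clashes):
    """Number of clashing event pairs placed in the same timeslot."""
    return sum(1 for a, b in clashes if assign[a] == assign[b])

def hill_climb(assign, clashes, slots):
    """First-improvement hill climbing over single-event slot moves."""
    assign = dict(assign)
    improved = True
    while improved:
        improved = False
        for e in assign:
            base = conflicts(assign, clashes)
            old = assign[e]
            for s in range(slots):
                assign[e] = s
                if conflicts(assign, clashes) < base:
                    improved = True
                    break
            else:
                assign[e] = old       # no improving slot found; restore
    return assign, conflicts(assign, clashes)

clashes = [("e1", "e2"), ("e2", "e3"), ("e1", "e3")]
# Warm start standing in for a CBR-retrieved case: one clash remains.
warm = {"e1": 0, "e2": 1, "e3": 0}
final, cost = hill_climb(warm, clashes, slots=3)
```

Starting one move from a conflict-free timetable, the climber repairs the remaining clash immediately; the same skeleton accepts a tabu list or an annealing temperature in place of the strict-improvement rule.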

Relevance:

30.00%

Publisher:

Abstract:

Ecological risk assessment (ERA) is a framework for monitoring the risks of exposure and adverse effects of environmental stressors on populations or communities of interest. One tool of ERA is the biomarker: a characteristic of an organism that reliably indicates exposure to, or effects of, a stressor such as chemical pollution. Traditional biomarkers, which rely on characteristics at the tissue level and higher, often detect only acute exposures to stressors. Sensitive molecular biomarkers may detect lower stressor levels than traditional biomarkers, which helps inform risk mitigation and restoration efforts before populations and communities are irreversibly affected. In this study I developed gene-expression-based molecular biomarkers of exposure to metals and insecticides in the model toxicological freshwater amphipod Hyalella azteca. My goals were not only to create sensitive molecular biomarkers for these chemicals, but also to show the utility and versatility of H. azteca in molecular studies for toxicology and risk assessment. I sequenced and assembled the H. azteca transcriptome to identify reference and stress-response gene transcripts suitable for expression monitoring. I exposed H. azteca to sub-lethal concentrations of metals (cadmium and copper) and insecticides (DDT, permethrin, and imidacloprid). Reference genes used to create normalization factors were determined for each exposure using the programs BestKeeper, GeNorm, and NormFinder. Both metals increased expression of a nuclear transcription factor (Cnc), an ABC transporter (Mrp4), and a heat shock protein (Hsp90), giving evidence of a general metal-exposure signature. Cadmium uniquely increased expression of a DNA repair protein (Rad51) and increased Mrp4 expression more than copper did (a 7-fold increase compared with a 2-fold increase); together these may serve as biomarkers distinguishing cadmium from copper exposure. DDT increased expression of Hsp90, Mrp4, and the immune response gene Lgbp. Permethrin increased expression of a cytochrome P450 (Cyp2j2) and decreased expression of the immune response gene Lectin-1. Imidacloprid did not affect gene expression. Unique biomarkers were seen for DDT and permethrin, but the genes studied were not sensitive enough to detect imidacloprid at the levels used here. I demonstrated that gene expression in H. azteca detects specific chemical exposures at sub-lethal concentrations, making expression monitoring with this amphipod a useful and sensitive biomarker for risk assessment of chemical exposure.
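The reference-gene normalisation behind fold-change figures like these can be sketched as follows: a GeNorm-style normalisation factor is the geometric mean of the reference-gene expression values, and fold change is the normalised target expression in exposed animals relative to controls. All numbers below are invented for illustration; the study's actual reference genes and factors came from BestKeeper, GeNorm, and NormFinder.

```python
# Reference-gene normalisation sketch: normalise a target gene's expression
# by the geometric mean of reference genes, then take the exposed/control
# ratio as the fold change.
import math

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

def normalised_fold_change(target_exposed, refs_exposed,
                           target_control, refs_control):
    """Fold change of a target gene after reference-gene normalisation."""
    norm_exposed = target_exposed / geometric_mean(refs_exposed)
    norm_control = target_control / geometric_mean(refs_control)
    return norm_exposed / norm_control

# Invented raw values giving a 7-fold increase (cf. Mrp4 under cadmium).
fc = normalised_fold_change(target_exposed=140.0, refs_exposed=[10.0, 40.0],
                            target_control=20.0, refs_control=[10.0, 40.0])
```

Here the reference genes are stable between conditions (geometric mean 20 in both), so the normalisation leaves the raw 7-fold increase of the target intact; unstable references would distort it, which is why tools like GeNorm rank candidates by stability first.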
