998 results for frequency metrics
Abstract:
A protein-truncating variant of CHEK2, 1100delC, is associated with a moderate increase in breast cancer risk. We have determined the prevalence of this allele in index cases from 300 Australian multiple-case breast cancer families, 95% of which had been found to be negative for mutations in BRCA1 and BRCA2. Only two (0.6%) index cases heterozygous for the CHEK2 mutation were identified. All available relatives in these two families were genotyped, but there was no evidence of co-segregation between the CHEK2 variant and breast cancer. Lymphoblastoid cell lines established from a heterozygous carrier contained approximately 20% of the CHEK2 1100delC mRNA relative to wild-type CHEK2 transcript. However, no truncated CHK2 protein was detectable. Analyses of expression and phosphorylation of wild-type CHK2 suggest that the variant is likely to act by haploinsufficiency. Analysis of CDC25A degradation, a downstream target of CHK2, suggests that some compensation occurs to allow normal degradation of CDC25A. Such compensation of the 1100delC defect in CHEK2 might explain the rather low breast cancer risk associated with the CHEK2 variant, compared to that associated with truncating mutations in BRCA1 or BRCA2.
Abstract:
Eigen-based techniques and other monolithic approaches to face recognition have long been a cornerstone of the face recognition community due to the high dimensionality of face images. Eigen-face techniques provide minimal reconstruction error and limit high-frequency content, while linear discriminant-based techniques (fisher-faces) allow the construction of subspaces which preserve discriminatory information. This paper presents a frequency decomposition approach for improved face recognition performance utilising three well-known techniques: Wavelets; Gabor / Log-Gabor; and the Discrete Cosine Transform. Experiments show that frequency-domain partitioning prior to dimensionality reduction increases the information available for classification and substantially improves face recognition performance for both eigen-face and fisher-face approaches.
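As a rough illustration of the kind of pipeline this abstract describes, the sketch below applies a 2-D DCT frequency partition before PCA (eigenfaces). The band cut-off, component counts and function names are assumptions made for illustration, not the paper's configuration.

```python
# Illustrative sketch: 2-D DCT frequency partitioning before PCA (eigenfaces).
# Band cut-off, component counts and names are assumptions, not the paper's setup.
import numpy as np
from scipy.fft import dctn
from sklearn.decomposition import PCA

def dct_bands(image, cutoff=16):
    """Split an image's 2-D DCT coefficients into low- and high-frequency bands."""
    coeffs = dctn(image, norm="ortho")
    mask = np.zeros_like(coeffs, dtype=bool)
    mask[:cutoff, :cutoff] = True          # top-left block holds the low frequencies
    return coeffs[mask], coeffs[~mask]

def fit_band_subspaces(train_images, n_components=40, cutoff=16):
    """Fit a separate PCA subspace per band (training set larger than n_components)."""
    lows, highs = zip(*(dct_bands(img, cutoff) for img in train_images))
    return (PCA(n_components=n_components).fit(np.vstack(lows)),
            PCA(n_components=n_components).fit(np.vstack(highs)))

def embed(image, pca_low, pca_high, cutoff=16):
    """Concatenate per-band projections into one feature vector for classification."""
    low, high = dct_bands(image, cutoff)
    return np.hstack([pca_low.transform(low[None])[0],
                      pca_high.transform(high[None])[0]])
```

A nearest-neighbour rule on the concatenated projections would then serve as a simple classifier; a Fisher discriminant could be fitted to the same per-band features in place of PCA.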
Abstract:
Dynamic load sharing is a measure of the ability of a heavy-vehicle multi-axle group to equalise load across its wheels in the dynamic sense, i.e. at typical travel speeds and under typical operating conditions of that vehicle. Various attempts have been made to quantify the ability of heavy vehicles to equalise the load across their wheels during travel. One of these was the concept of the load sharing coefficient (LSC). Other metrics, such as the dynamic load coefficient (DLC), peak dynamic wheel force (PDWF) and dynamic impact force (DIF), have been used to compare one heavy-vehicle suspension with another in terms of potential road damage. This paper compares these metrics and determines a relationship between DLC and LSC, with a sensitivity analysis of that relationship. The shortcomings of the presently available metrics are discussed and a new metric is proposed: the dynamic load equalisation (DLE) measure.
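Two of the metrics mentioned above have widely used definitions, sketched below: the DLC as the coefficient of variation of a wheel's dynamic force, and the LSC as a wheel's mean force relative to the equal-share mean for the axle group. The synthetic data and variable names are illustrative only.

```python
# Illustrative calculation of DLC and LSC for a multi-axle group.
# wheel_forces is assumed to be an (n_wheels, n_samples) array of forces in newtons.
import numpy as np

def dynamic_load_coefficient(force):
    """DLC: standard deviation of the dynamic wheel force divided by its mean."""
    return np.std(force) / np.mean(force)

def load_sharing_coefficients(wheel_forces):
    """LSC per wheel: mean wheel force relative to the equal-share mean of the group."""
    means = wheel_forces.mean(axis=1)
    return means / means.mean()

# Example with synthetic data for a four-wheel axle group.
rng = np.random.default_rng(0)
wheel_forces = 25e3 + 3e3 * rng.standard_normal((4, 1000))
print([round(dynamic_load_coefficient(f), 3) for f in wheel_forces])
print(np.round(load_sharing_coefficients(wheel_forces), 3))
```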
Abstract:
In vector space based approaches to natural language processing, similarity is commonly measured by taking the angle between two vectors representing words or documents in a semantic space. This is natural from a mathematical point of view, as the angle between unit vectors is, up to constant scaling, the only unitarily invariant metric on the unit sphere. However, similarity judgement tasks reveal that human subjects fail to produce data which satisfy the symmetry and triangle inequality requirements for a metric space. A possible conclusion, reached in particular by Tversky et al., is that some of the most basic assumptions of geometric models are unwarranted in the case of psychological similarity, a result which would impose strong limits on the validity and applicability of vector space based (and hence also quantum inspired) approaches to the modelling of cognitive processes. This paper proposes a resolution to this fundamental criticism of the applicability of vector space models of cognition. We argue that a pair of words implies a context which in turn induces a point of view, allowing a subject to estimate semantic similarity. Context is here introduced as a point of view vector (POVV), and the expected similarity is derived as a measure over the POVVs. Different pairs of words will invoke different contexts and different POVVs; hence the triangle inequality ceases to be a valid constraint on the angles. We test the proposal on a few triples of words and outline further research.
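A toy sketch of the central idea follows, assuming pre-computed word vectors and a context ("point of view") vector; the re-weighting scheme is an illustrative stand-in, not the paper's derivation of the expected similarity measure.

```python
# Toy sketch: similarity that depends on a context ("point of view") vector.
# The dimension re-weighting used here is illustrative only.
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def pov_similarity(u, v, pov):
    """Cosine similarity after re-weighting dimensions by a point-of-view vector."""
    w = np.abs(pov) / np.abs(pov).sum()      # normalised context weights
    return cosine(u * np.sqrt(w), v * np.sqrt(w))

# Different contexts yield different similarities for the same word pair, so the
# triangle inequality need not hold across judgements made in different contexts.
u, v = np.array([1.0, 0.2, 0.0]), np.array([0.1, 1.0, 0.3])
print(pov_similarity(u, v, pov=np.array([1.0, 0.1, 0.1])),
      pov_similarity(u, v, pov=np.array([0.1, 1.0, 1.0])))
```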
Abstract:
Activated protein C resistance (APCR), the most common risk factor for venous thrombosis, is the result of a G to A base substitution at nucleotide 1691 (R506Q) in the factor V gene. Current techniques to detect the factor V Leiden mutation, such as determination of restriction fragment length polymorphisms, do not have the capacity to screen large numbers of samples in a rapid, cost-effective test. The aim of this study was to apply first nucleotide change (FNC) technology to the detection of the factor V Leiden mutation. After preliminary amplification of genomic DNA by polymerase chain reaction (PCR), an allele-specific primer was hybridised to the PCR product and extended using fluorescent terminating dideoxynucleotides, which were detected by colorimetric assay. Using this ELISA-based assay, the prevalence of the factor V Leiden mutation was determined in an Australian blood donor population (n = 500). A total of 18 heterozygotes were identified (3.6%), and all of these were confirmed with a conventional MnlI restriction digest. No homozygotes for the variant allele were detected. We conclude from this study that the frequency of 3.6% is consistent with those published for Caucasian populations. In addition, FNC technology shows promise as the basis for a rapid, automated DNA-based test for factor V Leiden.
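As a purely illustrative check of the reported numbers, the snippet below reproduces the carrier-frequency arithmetic and the allele frequency it implies, together with the number of homozygotes expected under Hardy-Weinberg assumptions.

```python
# Illustrative check of the reported figures: carrier frequency, implied allele
# frequency, and expected homozygotes under Hardy-Weinberg assumptions.
n_donors = 500
heterozygotes = 18

carrier_freq = heterozygotes / n_donors             # 0.036 -> 3.6%
allele_freq = heterozygotes / (2 * n_donors)        # ~0.018 (no homozygotes observed)
expected_homozygotes = allele_freq ** 2 * n_donors  # ~0.16, so observing none is unsurprising
print(carrier_freq, allele_freq, round(expected_homozygotes, 2))
```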
Abstract:
Decoupling networks can alleviate the effects of mutual coupling in antenna arrays. Conventional decoupling networks can provide decoupled and matched ports at a single frequency. This paper describes dual-frequency decoupling which is achieved by using a network of series or parallel resonant circuits instead of single reactive elements.
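The sketch below illustrates the underlying mechanism in its simplest form: choosing the inductance and capacitance of a series LC branch so that it presents two prescribed reactances at two frequencies, which is what allows a resonant circuit to stand in for a single reactive element at both bands. The target reactances and frequencies are placeholders, not values from the paper.

```python
# Illustrative sketch: pick L and C of a series LC branch so it presents reactance
# X1 at f1 and X2 at f2. Targets below are placeholders, not from the paper.
import numpy as np

def series_lc_for_two_reactances(f1, X1, f2, X2):
    """Solve X(w) = w*L - 1/(w*C) for L and C given targets at two frequencies."""
    w1, w2 = 2 * np.pi * f1, 2 * np.pi * f2
    A = np.array([[w1, -1.0 / w1],
                  [w2, -1.0 / w2]])
    L, D = np.linalg.solve(A, np.array([X1, X2]))   # linear system in (L, D), D = 1/C
    return L, 1.0 / D

# Placeholder targets: capacitive at the lower band, inductive at the upper band.
# A physical series branch requires L > 0 and C > 0; not every pair is realisable.
L, C = series_lc_for_two_reactances(f1=0.9e9, X1=-20.0, f2=1.8e9, X2=35.0)
print(f"L = {L * 1e9:.2f} nH, C = {C * 1e12:.2f} pF")
```

A parallel resonant branch follows the same reasoning with the susceptance B(w) = w*C - 1/(w*L) in place of the reactance.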
Abstract:
Car-following models have a critical role in all microscopic traffic simulation models. Current microscopic simulation models are unable to mimic the unsafe behaviour of drivers, as most are based on presumptions about the safe behaviour of drivers. The Gipps model is a widely used car-following model embedded in different micro-simulation packages. This paper examines the Gipps car-following model to investigate ways of improving it for safety studies, and puts forward suggestions to modify the model so that it can better simulate unsafe vehicle movements (vehicles with safety indicators below critical thresholds), a step towards assessing and predicting safety on motorways using microscopic simulation. NGSIM, a rich source of vehicle trajectory data for a motorway, is used to extract relatively risky events; short following headways and time to collision are used to identify safety-critical events within the traffic flow. The results show that the proposed modified car-following model predicts unsafe trajectories with smaller errors than the generic Gipps model.
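For reference, the sketch below implements the standard Gipps (1981) speed update together with a simple time-to-collision calculation of the kind used to flag risky events. Parameter values are illustrative defaults, and the paper's proposed modifications are not reproduced.

```python
# Minimal sketch of the standard Gipps speed update plus a time-to-collision check.
# Parameter values are illustrative defaults only.
import math

def gipps_speed(v, v_lead, gap, tau=1.0, a=1.7, b=-3.4, b_hat=-3.0, V=30.0, s=6.5):
    """Follower speed after one reaction time tau (m/s).

    v, v_lead : current follower / leader speeds (m/s)
    gap       : spacing between leader and follower positions (m)
    a, b      : maximum acceleration and (negative) braking rate of the follower
    b_hat     : follower's estimate of the leader's braking rate (negative)
    V, s      : desired speed and effective leader length plus margin
    """
    v_acc = v + 2.5 * a * tau * (1 - v / V) * math.sqrt(0.025 + v / V)
    term = b * b * tau * tau - b * (2 * (gap - s) - v * tau - v_lead * v_lead / b_hat)
    v_brk = b * tau + math.sqrt(max(term, 0.0))   # guard against a negative discriminant
    return max(0.0, min(v_acc, v_brk))

def time_to_collision(v, v_lead, gap, length=4.5):
    """TTC in seconds; infinite if the follower is not closing on the leader."""
    closing = v - v_lead
    return (gap - length) / closing if closing > 0 else math.inf
```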
Abstract:
To evaluate the timing of mutations in BRAF (v-raf murine sarcoma viral oncogene homolog B1) during melanocytic neoplasia, we carried out mutation analysis on microdissected melanoma and nevi samples. We observed mutations resulting in the V599E amino-acid substitution in 41 of 60 (68%) melanoma metastases, 4 of 5 (80%) primary melanomas and, unexpectedly, in 63 of 77 (82%) nevi. These data suggest that mutational activation of the RAS/RAF/MAPK pathway in nevi is a critical step in the initiation of melanocytic neoplasia but alone is insufficient for melanoma tumorigenesis.
Abstract:
Almost all metapopulation modelling assumes that connectivity between patches is only a function of distance, and is therefore symmetric. However, connectivity will not depend only on the distance between the patches, as some paths are easy to traverse, while others are difficult. When colonising organisms interact with the heterogeneous landscape between patches, connectivity patterns will invariably be asymmetric. There have been few attempts to theoretically assess the effects of asymmetric connectivity patterns on the dynamics of metapopulations. In this paper, we use the framework of complex networks to investigate whether metapopulation dynamics can be determined by directly analysing the asymmetric connectivity patterns that link the patches. Our analyses focus on “patch occupancy” metapopulation models, which only consider whether a patch is occupied or not. We propose three easily calculated network metrics: the “asymmetry” and “average path strength” of the connectivity pattern, and the “centrality” of each patch. Together, these metrics can be used to predict the length of time a metapopulation is expected to persist, and the relative contribution of each patch to a metapopulation’s viability. Our results clearly demonstrate the negative effect that asymmetry has on metapopulation persistence. Complex network analyses represent a useful new tool for understanding the dynamics of species existing in fragmented landscapes, particularly those existing in large metapopulations.
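The three metrics are defined in the paper from the asymmetric connectivity matrix; as a rough illustration of working with such a matrix (not the paper's exact definitions), the sketch below builds a directed, weighted patch network and computes a simple per-link asymmetry score, the strength of the best path between two patches, and a standard centrality measure.

```python
# Rough illustration of analysing an asymmetric patch-connectivity matrix with
# networkx. These quantities are simple stand-ins for the paper's metrics.
import itertools
import numpy as np
import networkx as nx

# Hypothetical colonisation probabilities between four patches (rows: source patch).
C = np.array([[0.0, 0.6, 0.1, 0.0],
              [0.2, 0.0, 0.5, 0.1],
              [0.0, 0.3, 0.0, 0.4],
              [0.0, 0.0, 0.1, 0.0]])

G = nx.DiGraph()
for i, j in itertools.product(range(len(C)), repeat=2):
    if C[i, j] > 0:
        G.add_edge(i, j, weight=C[i, j], cost=-np.log(C[i, j]))

# Simple asymmetry score: how unequal the two directions of each link are.
pairs = [(i, j) for i in range(len(C)) for j in range(i + 1, len(C)) if C[i, j] + C[j, i] > 0]
asymmetry = np.mean([abs(C[i, j] - C[j, i]) / (C[i, j] + C[j, i]) for i, j in pairs])

# Strength of the best path between two patches: product of link weights,
# recovered from an additive shortest path on -log(weight).
path_strength = np.exp(-nx.shortest_path_length(G, source=0, target=3, weight="cost"))

centrality = nx.eigenvector_centrality(G, weight="weight", max_iter=1000)
print(round(asymmetry, 3), round(path_strength, 3), centrality)
```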
Abstract:
Background: Bioimpedance techniques provide a reliable method of assessing unilateral lymphedema in a clinical setting. Bioimpedance devices are traditionally used to assess body composition at a current frequency of 50 kHz. However, such devices are not directly transferable to the assessment of lymphedema, as the sensitivity of measuring the impedance of extracellular fluid is frequency dependent. It has previously been shown that the best frequency to detect extracellular fluid is 0 kHz (DC); however, measurement at this frequency is not possible in practice due to the high skin impedance at DC, and an estimate is usually extrapolated from low-frequency measurements. This study investigated the efficacy of various low-frequency ranges for the detection of lymphedema. Methods and Results: Limb impedance was measured at 256 frequencies between 3 kHz and 1000 kHz for a control population, an arm lymphedema population, and a leg lymphedema population. Limb impedance was measured using the ImpediMed SFB7 and ImpediMed L-Dex® U400 with equipotential electrode placement on the wrists and ankles. The contralateral limb impedance ratio for arms and legs was used to calculate a lymphedema index (L-Dex) at each measurement frequency. The standard deviation of the limb impedance ratio in a healthy control population has been shown to increase with frequency for both the arm and the leg. Box-and-whisker plots of the control and lymphedema populations show good differentiation between the arm and leg L-Dex of lymphedema subjects and that of control subjects up to a frequency of about 30 kHz. Conclusions: Impedance measurements above about 30 kHz have decreased sensitivity to extracellular fluid and are not reliable for early detection of lymphedema.
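For illustration only, the snippet below computes a contralateral impedance-ratio index of the general kind described: the at-risk limb's impedance relative to the unaffected limb, standardised against control-population statistics. The reference mean and standard deviation are placeholders and the z-score scaling is an assumption, not the published L-Dex formula.

```python
# Illustration only: contralateral impedance-ratio index standardised against
# control statistics. The reference mean/SD are placeholders and the simple
# z-score scaling is not the published L-Dex formula.
def impedance_ratio_index(z_unaffected, z_at_risk, control_mean=1.0, control_sd=0.05):
    """More extracellular fluid lowers the at-risk limb's impedance, raising the ratio."""
    ratio = z_unaffected / z_at_risk
    return (ratio - control_mean) / control_sd

# Example: low-frequency impedances (ohms) for the two arms of one subject.
print(round(impedance_ratio_index(z_unaffected=320.0, z_at_risk=270.0), 2))
```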
Abstract:
Damage detection in structures has become increasingly important in recent years. While a number of damage detection and localization methods have been proposed, few attempts have been made to explore structural damage with frequency response functions (FRFs). This paper illustrates the damage identification and condition assessment of a beam structure using a new FRF-based damage index and artificial neural networks (ANNs). In practice, using all available FRF data as input to artificial neural networks makes training and convergence infeasible. Therefore, a data reduction technique, Principal Component Analysis (PCA), is introduced into the algorithm. In the proposed procedure, a large set of FRFs is divided into sub-sets in order to find the damage indices for different frequency points of different damage scenarios. The basic idea of the method is to establish features of the damaged structure using FRFs from different measurement points of different sub-sets of the intact structure. Using these features, damage indices for different damage cases of the structure are identified after reconstructing the available FRF data with PCA. The damage indices corresponding to different damage locations and severities are then introduced as input variables to the developed artificial neural networks. Finally, the effectiveness of the proposed method is illustrated and validated using the finite element model of a beam structure. The results show that the PCA-based damage index is suitable and effective for structural damage detection and condition assessment of building structures.
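A compact sketch of the general FRF-PCA-ANN pipeline described above follows, assuming a matrix of FRF magnitudes per measurement case. The array shapes, layer sizes and the residual-based damage feature are illustrative assumptions, not the paper's exact damage-index formulation.

```python
# Compact sketch of an FRF -> PCA -> ANN pipeline. Shapes, layer sizes and the
# residual-based damage feature are illustrative assumptions only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

# frf: (n_cases, n_frequency_points) FRF magnitudes; targets: (n_cases, 2) giving,
# for example, damage location and severity for each simulated case.
rng = np.random.default_rng(1)
frf = rng.random((200, 512))
targets = rng.random((200, 2))

pca = PCA(n_components=20).fit(frf)
reconstructed = pca.inverse_transform(pca.transform(frf))

# Crude damage-sensitive feature per case: residual between the measured FRFs
# and their reconstruction from the retained principal components.
damage_index = np.linalg.norm(frf - reconstructed, axis=1)
features = np.column_stack([pca.transform(frf), damage_index])

ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
ann.fit(features, targets)
print(ann.predict(features[:3]))
```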