75 results for Gene Doping, Performance-Enhancement, Pragmatic Ethics
Abstract:
Temporal dynamics and speaker characteristics are two important features of speech that distinguish speech from noise. In this paper, we propose a method to maximally extract these two features of speech for speech enhancement. We demonstrate that this can reduce the requirement for prior information about the noise, which can be difficult to estimate for fast-varying noise. Given noisy speech, the new approach estimates clean speech by recognizing long segments of the clean speech as whole units. In the recognition, clean speech sentences, taken from a speech corpus, are used as examples. Matching segments are identified between the noisy sentence and the corpus sentences. The estimate is formed by using the longest matching segments found in the corpus sentences. Longer speech segments as whole units contain more distinct dynamics and richer speaker characteristics, and can be identified more accurately from noise than shorter speech segments. Therefore, estimation based on the longest recognized segments increases the noise immunity and hence the estimation accuracy. The new approach consists of a statistical model to represent up to sentence-long temporal dynamics in the corpus speech, and an algorithm to identify the longest matching segments between the noisy sentence and the corpus sentences. The algorithm is made more robust to noise uncertainty by introducing missing-feature based noise compensation into the corpus sentences. Experiments have been conducted on the TIMIT database for speech enhancement from various types of nonstationary noise including song, music, and crosstalk speech. The new approach has shown improved performance over conventional enhancement algorithms in both objective and subjective evaluations.
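The segment-matching step described above can be sketched in miniature: given two sequences of discretized (e.g., vector-quantized) feature symbols, a classic dynamic program finds the longest common contiguous segment. This is only a toy illustration of the matching idea, not the paper's statistical model or its missing-feature noise compensation:

```python
def longest_matching_segment(noisy, corpus):
    """Longest common contiguous segment between two sequences of
    (discretized) feature symbols, via dynamic programming.
    dp value = length of the common segment ending at the current pair."""
    best_len, best_end = 0, 0
    prev = [0] * (len(corpus) + 1)
    for i in range(1, len(noisy) + 1):
        cur = [0] * (len(corpus) + 1)
        for j in range(1, len(corpus) + 1):
            if noisy[i - 1] == corpus[j - 1]:
                cur[j] = prev[j - 1] + 1
                if cur[j] > best_len:
                    best_len, best_end = cur[j], i
        prev = cur
    return noisy[best_end - best_len:best_end]
```

In the paper's setting the "symbols" would be noisy spectral features matched against every corpus sentence, and longer recovered segments carry more temporal dynamics and speaker identity, which is what makes them more noise-robust.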
Abstract:
High speed downlink packet access (HSDPA) was introduced in the UMTS radio access segment to provide higher capacity for new packet switched services. As a result, packet switched sessions with multiple diverse traffic flows, such as concurrent voice and data or video and data being transmitted to the same user, are likely to become a commonplace cellular packet data scenario. In HSDPA, radio access network (RAN) buffer management schemes are essential to support the end-to-end QoS of such sessions. Hence, in this paper we present an end-to-end performance study of a proposed RAN buffer management scheme for multi-flow sessions via dynamic system-level HSDPA simulations. The scheme is an enhancement of a time-space priority (TSP) queuing strategy applied to the node B MAC-hs buffer allocated to an end user with concurrent real-time (RT) and non-real-time (NRT) flows during a multi-flow session. The experimental multi-flow scenario is a packet voice call with a concurrent TCP-based file download to the same user. Results show that with the proposed enhancements to the TSP-based RAN buffer management, end-to-end QoS performance gains accrue to the NRT flow without compromising the RT flow QoS of the same end user session.
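The time-space priority idea can be caricatured as a two-class buffer: RT packets are always transmitted first (time priority) but receive only a small buffer share, while NRT packets get the larger share (space priority). The capacities below are illustrative assumptions, not values from the paper:

```python
from collections import deque

class TSPQueue:
    """Toy time-space priority buffer for one user's multi-flow session.
    RT packets are served ahead of NRT (time priority); NRT packets get
    the larger buffer share (space priority). Capacities are illustrative."""

    def __init__(self, rt_cap=4, nrt_cap=16):
        self.rt, self.nrt = deque(), deque()
        self.rt_cap, self.nrt_cap = rt_cap, nrt_cap

    def enqueue(self, pkt, is_rt):
        q, cap = (self.rt, self.rt_cap) if is_rt else (self.nrt, self.nrt_cap)
        if len(q) >= cap:
            return False          # buffer share full: packet dropped
        q.append(pkt)
        return True

    def dequeue(self):
        # RT flow always transmitted ahead of NRT (time priority)
        if self.rt:
            return self.rt.popleft()
        if self.nrt:
            return self.nrt.popleft()
        return None
```

The dynamic enhancements studied in the paper would then amount to adjusting these shares and priorities at run time rather than keeping them fixed.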
Abstract:
End-user multi-flow services support is a crucial aspect of current and next generation mobile networks. This paper presents a dynamic buffer management strategy for HSDPA end-user multi-flow traffic with aggregated real-time and non-real-time flows. The scheme incorporates dynamic priority switching between the flows for transmission on the HSDPA radio channel. The end-to-end performance of the proposed strategy is investigated with an end-user multi-flow session of simultaneous VoIP and TCP-based downlink traffic using detailed HSDPA system-level simulations. Compared to an equivalent static buffer management scheme, the results show that end-to-end throughput performance gains in the non-real-time flow and better HSDPA channel utilization are attainable without compromising the real-time VoIP flow QoS constraints.
Abstract:
The inference of gene regulatory networks has gained considerable interest in the biology and biomedical community in recent years. The purpose of this paper is to investigate the influence that environmental conditions can exert on the performance of network inference algorithms. Specifically, we study five network inference methods, Aracne, BC3NET, CLR, C3NET and MRNET, and compare the results for three different conditions: (I) observational gene expression data: normal environmental condition, (II) interventional gene expression data: growth in rich media, (III) interventional gene expression data: normal environmental condition interrupted by a positive spike-in stimulation. Overall, we find that the different statistical inference methods lead to comparable, but condition-specific results. Further, our results suggest that non-steady-state data enhance the inferability of regulatory networks.
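As a rough illustration of what a network inference method does with expression data, here is a minimal correlation-threshold sketch. The actual methods studied (Aracne, BC3NET, CLR, C3NET, MRNET) are mutual-information based with additional pruning steps, so this is only a simplified stand-in:

```python
from itertools import combinations
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / sqrt(vx * vy)

def infer_network(expr, threshold=0.8):
    """expr: dict gene -> expression values across samples (conditions).
    Returns gene pairs whose |correlation| exceeds the threshold --
    a toy stand-in for mutual-information methods such as CLR or Aracne."""
    edges = set()
    for g1, g2 in combinations(sorted(expr), 2):
        if abs(pearson(expr[g1], expr[g2])) > threshold:
            edges.add((g1, g2))
    return edges
```

Running such an inference separately on observational and interventional datasets, and comparing the resulting edge sets, is the shape of the comparison the paper performs at much larger scale.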
Abstract:
Reactive power has become a vital resource in modern electricity networks due to the increased penetration of distributed generation. This paper examines the extended reactive power capability of DFIGs to improve network stability and to manage the network voltage profile during transient faults and dynamic operating conditions. A coordinated reactive power controller is designed by considering the reactive power capabilities of the rotor-side converter (RSC) and the grid-side converter (GSC) of the DFIG in order to maximise the reactive power support from DFIGs. The study has illustrated that a significant reactive power contribution can be obtained from partially loaded DFIG wind farms for stability enhancement by using the proposed capability-curve-based reactive power controller; hence DFIG wind farms can function as vital dynamic reactive power resources for power utilities without commissioning additional dynamic reactive power devices. Several network-adaptive droop control schemes are also proposed for network voltage management, and their performance has been investigated during variable wind conditions. Furthermore, the influence of reactive power capability on network-adaptive droop control strategies has been investigated, and it has been shown that the enhanced reactive power capability of DFIGs can substantially improve voltage control performance.
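The basic shape of a voltage-droop reactive power set-point, clipped to a converter's capability limits, can be sketched as below. The gain, reference, and limits are illustrative per-unit assumptions, not values from the study, and a real capability curve would make the limits depend on the machine's loading:

```python
def droop_reactive_power(v_meas, v_ref=1.0, k_droop=20.0,
                         q_min=-0.4, q_max=0.4):
    """Toy voltage-droop reactive power set-point (per unit):
    Q = k * (v_ref - v_meas), clipped to the reactive power
    capability limits. All constants are illustrative."""
    q = k_droop * (v_ref - v_meas)
    return max(q_min, min(q_max, q))
```

A voltage sag below the reference thus commands reactive power injection (positive Q) up to the capability limit, and an overvoltage commands absorption; extending `q_min`/`q_max` is where the paper's enhanced DFIG capability would enter.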
Abstract:
Background: Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique for therapeutics discovery and drug re-purposing based on differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take >2 h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping.
Results: cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating high-throughput candidate therapeutics discovery. We demonstrate dramatic speed differentials between GPU-assisted and CPU-only execution as the computational load increases for high-accuracy evaluation of statistical significance.
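The core operation that gets parallelized in connectivity mapping, scoring one signed query signature against many reference expression profiles, can be caricatured as below. The scoring rule here is a deliberately simplified stand-in, not the exact sscMap connection statistic:

```python
def connection_score(signature, reference):
    """Toy connection score: sum, over the signed query signature,
    of the reference profile's differential-expression values, with
    the sign flipped for down-regulated query genes. A simplified
    stand-in for the sscMap statistic, not its actual formula."""
    # signature: list of (gene, sign) with sign = +1 (up) or -1 (down)
    # reference: dict gene -> differential expression value
    return sum(sign * reference.get(gene, 0.0) for gene, sign in signature)

def rank_references(signature, references):
    """Score every reference profile against one signature -- the
    embarrassingly parallel loop that a GPU implementation distributes
    across threads."""
    return sorted(references, key=lambda name: connection_score(
        signature, references[name]), reverse=True)
```

Because each (signature, reference) score is independent, mapping one score to one GPU thread is what yields the days-to-minutes speedups reported above; the permutation tests used for significance multiply the workload further, which is why the gap widens with accuracy.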
Conclusion: Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution to the discovery of candidate therapeutics by enabling speedy execution of heavy-duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
Abstract:
Model selection between competing models is a key consideration in the discovery of prognostic multigene signatures. The use of appropriate statistical performance measures as well as verification of the biological significance of the signatures is imperative to maximise the chance of external validation of the generated signatures. Current approaches in time-to-event studies often use only a single measure of performance in model selection, such as logrank test p-values, or dichotomise the follow-up times at some phase of the study to facilitate signature discovery. In this study we improve the prognostic signature discovery process through the application of the multivariate partial Cox model combined with the concordance index, hazard ratio of predictions, independence from available clinical covariates and biological enrichment as measures of signature performance. The proposed framework was applied to discover prognostic multigene signatures from early breast cancer data. The partial Cox model combined with the multiple performance measures was used both in guiding the selection of the optimal panel of prognostic genes and in predicting risk within cross validation, without dichotomising the follow-up times at any stage. The signatures were successfully externally cross validated in independent breast cancer datasets, yielding a hazard ratio of 2.55 [1.44, 4.51] for the top ranking signature.
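The concordance index used as one of the performance measures can be computed as in this minimal sketch of Harrell's C for right-censored data (tied risk scores and tied event times are simply skipped here for brevity):

```python
from itertools import combinations

def concordance_index(times, events, risks):
    """Harrell's concordance index: among comparable pairs, the
    fraction where the subject with the higher predicted risk has
    the shorter observed survival time. A pair is comparable only
    if the earlier time corresponds to an actual event (event=1),
    not a censored observation (event=0)."""
    concordant = comparable = 0
    for i, j in combinations(range(len(times)), 2):
        if times[j] < times[i]:
            i, j = j, i            # ensure subject i has the earlier time
        if not events[i] or times[i] == times[j]:
            continue               # censored earlier time: not comparable
        comparable += 1
        if risks[i] > risks[j]:
            concordant += 1
    return concordant / comparable
```

A value of 1.0 means the model's risk ordering perfectly matches the observed survival ordering, 0.5 is chance level; using it alongside hazard ratios and enrichment is the multi-measure selection the abstract describes.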
Abstract:
Electrokinetics is a promising in situ soil remediation process that transports contaminants via electromigration and electroosmosis. For soils contaminated with organic compounds, Fenton's reagent is utilized as a flushing agent in the electrokinetic process (Electrokinetic-Fenton) so that removal of organic contaminants can be achieved by in situ oxidation/destruction. However, this process is not applied widely in industry, the main drawback being the instability of Fenton's reagent. The aim of this mini review is to summarize the developments of the Electrokinetic-Fenton process over the past decades in enhancing the stability of Fenton's reagent and the process efficiency. Generally, the enhancements follow four paths: (1) chemical stabilization to delay H2O2 decomposition, (2) increased oxidant availability through the choice of injection method for Fenton's reagent, (3) electrode operation and iron catalysts, and (4) operating conditions such as voltage gradient, electrolytes and H2O2 concentration. In addition, the types of soils and contaminants also show a significant effect: soils with low acid-buffering capacity, adequate iron concentration and low organic matter content, and organic contaminants with few aromatic rings, generally give better efficiency.
Abstract:
In this research, we have investigated the effects of adding different percentages of nanoclay to ethylene propylene diene monomer (EPDM) and nitrile butadiene rubber (NBR) on the characteristics of these rubbers as seal materials. Properties such as tensile strength, modulus at different extensions, elongation at break, compression set, hardness, permeability, and abrasion resistance were tested to assess the effect of the nanoclay addition. Results indicate that addition of nanoclay at certain compositions could slightly reduce the strength of the rubber. However, a more stable modulus at different strains is obtained, the hardness of the rubber is preserved and slightly enhanced, and the permeability is reduced in both rubbers, with a particularly considerable decrease in EPDM, which is desirable for diminishing the effect of explosive decompression. At the same time, the compression tests show that the nanoclay improves the performance of the rubbers under compression, which is essential in seal applications. X-ray diffraction tests clarify that the dispersion of the nanoclay in the NBR samples is of high quality; in the EPDM samples, the dispersion is in need of improvement. POLYM. COMPOS., 30:1657-1667, 2009. © 2008 Society of Plastics Engineers.
Abstract:
In this paper, the impact of multiple active eavesdroppers on cooperative single-carrier systems with multiple relays and multiple destinations is examined. To achieve secrecy diversity gains in the form of opportunistic selection, a two-stage scheme is proposed for joint relay and destination selection, in which, after the selection of the relay with the minimum effective maximum signal-to-noise ratio (SNR) to a cluster of eavesdroppers, the destination that has the maximum SNR from the chosen relay is selected. In order to accurately assess the secrecy performance, exact and asymptotic expressions are obtained in closed form for several security metrics, including the secrecy outage probability, the probability of non-zero secrecy rate, and the ergodic secrecy rate in frequency selective fading. Based on the asymptotic analysis, key design parameters such as secrecy diversity gain, secrecy array gain, secrecy multiplexing gain, and power cost are characterized, from which new insights are drawn. Moreover, it is concluded that secrecy performance limits occur when the average received power at the eavesdropper is proportional to the counterpart at the destination. Specifically, for the secrecy outage probability, it is confirmed that the secrecy diversity gain collapses to zero with an outage floor, whereas for the ergodic secrecy rate, it is confirmed that its slope collapses to zero with a capacity ceiling.
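The two-stage selection can be illustrated with a small Monte Carlo sketch under Rayleigh fading, where SNRs are exponentially distributed. All parameter values, and the simplification that the destination links are independent of the relay choice, are illustrative assumptions rather than the paper's analytical setup:

```python
import random
from math import log2

def secrecy_outage_mc(n_relays=4, n_dest=3, n_eves=2,
                      dest_snr_avg=10.0, eve_snr_avg=2.0,
                      target_rate=1.0, trials=20000, seed=1):
    """Monte Carlo sketch of the two-stage selection. Stage 1: pick the
    relay whose maximum SNR to the eavesdropper cluster is smallest.
    Stage 2: pick the destination with the best SNR from that relay.
    Rayleigh fading -> exponential SNRs; parameters are illustrative."""
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        # max eavesdropper SNR seen from each candidate relay
        eve = [max(rng.expovariate(1 / eve_snr_avg) for _ in range(n_eves))
               for _ in range(n_relays)]
        relay_eve_snr = min(eve)                      # stage 1
        dest_snr = max(rng.expovariate(1 / dest_snr_avg)
                       for _ in range(n_dest))        # stage 2
        secrecy_rate = max(0.0, log2(1 + dest_snr) - log2(1 + relay_eve_snr))
        if secrecy_rate < target_rate:
            outages += 1
    return outages / trials
```

Comparing the result against a single-relay, single-destination baseline shows the opportunistic-selection gain numerically, which is the behaviour the closed-form analysis in the paper captures exactly.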
Abstract:
The authors present a VLSI circuit for implementing wave digital filter (WDF) two-port adaptors. Considerable speedups over conventional designs have been obtained using fine-grained pipelining. This has been achieved through the use of most significant bit (MSB) first carry-save arithmetic, which allows systems to be designed in which the latency L is small and independent of either the coefficient or input data wordlength. L is determined by the online delay associated with the computation required at each node in the circuit (in this case a multiply/add plus two separate additions). This in turn means that pipelining can be used to considerably enhance the sampling rate of a recursive digital filter. The level of pipelining which will offer enhancement is determined by L and is fine-grained rather than bit level. In the case of the circuit considered, L = 3. For this reason pipeline delays (half latches) have been introduced between every two rows of cells to produce a system that accepts one new sample every cycle.
Abstract:
This letter proposes several relay selection policies for secure communication in cognitive decode-and-forward (DF) relay networks, where a pair of cognitive relays is opportunistically selected for security protection against eavesdropping. The first relay transmits the secrecy information to the destination, and the second relay, as a friendly jammer, transmits a jamming signal to confound the eavesdropper. We present new exact closed-form expressions for the secrecy outage probability. Our analysis and simulation results strongly support our conclusion that the proposed relay selection policies can enhance the performance of secure cognitive radio. We also confirm that an error floor phenomenon arises in the absence of jamming.
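One plausible shape of such a policy pair can be sketched as follows. The specific selection metrics here (best relay-to-destination SNR for the transmitter, best relay-to-eavesdropper channel for the jammer) are assumptions for illustration, not necessarily the letter's exact policies:

```python
def select_relay_and_jammer(dest_snrs, eve_snrs):
    """Toy joint selection among cognitive relays: the transmitting
    relay is the one with the best SNR to the destination; the friendly
    jammer is then chosen from the remaining relays as the one with the
    best channel to the eavesdropper, so its jamming degrades the
    eavesdropper's reception the most. Metrics are illustrative."""
    n = len(dest_snrs)
    relay = max(range(n), key=lambda r: dest_snrs[r])
    jammer = max((r for r in range(n) if r != relay),
                 key=lambda r: eve_snrs[r])
    return relay, jammer
```

The error-floor observation in the abstract corresponds to dropping the jammer from this pair: without jamming, a strong eavesdropper channel cannot be counteracted, so the secrecy outage probability stops improving with transmit power.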
Abstract:
Issues surrounding the misuse of prohibited and licensed substances in animals destined for food production and performance sport competition continue to pose an enormous challenge to the regulatory authorities charged with enforcing their control. Efficient analytical strategies are implemented to screen for and confirm the presence of a wide range of exogenous substances in various biological matrices. However, such methods rely on the direct measurement of drugs and/or their metabolites in a targeted mode, allowing the detection of only a restricted number of compounds. As a consequence, emerging practices, in particular the use of natural hormones, designer drugs and low-dose cocktails, remain difficult to handle from a control point of view. A new SME-led, FP7-funded project, DeTECH21, aims to overcome current limitations by applying an untargeted metabolomics approach based on liquid chromatography coupled to high resolution mass spectrometry and bioinformatic data analysis to identify bovine and equine animals which have been exposed to exogenous substances, and to assist in the identification of administered compounds. Marker-based strategies, dealing with the comprehensive analysis of metabolites present in a biological sample (urine/plasma/tissue), offer a reliable solution in the areas of food safety and animal sport doping control through effective, high-throughput and sensitive detection of exogenously administered agents. Therefore, the development of the first commercially available forensic test service based on metabolomics profiling will meet 21st century demands in animal forensics.
Abstract:
BACKGROUND: Urothelial pathogenesis is a complex process driven by an underlying network of interconnected genes. The identification of novel genomic target regions and gene targets that drive urothelial carcinogenesis is crucial in order to improve our currently limited understanding of urothelial cancer (UC) at the molecular level. The inference of genome-wide gene regulatory networks (GRN) from large-scale gene expression data provides a promising approach for a detailed investigation of the underlying network structure associated with urothelial carcinogenesis.
METHODS: In our study we inferred and compared three GRNs by applying the BC3Net inference algorithm to large-scale transitional cell carcinoma gene expression data sets from Illumina RNAseq (179 samples), Illumina Bead arrays (165 samples) and Affymetrix Oligo microarrays (188 samples). We investigated the structural and functional properties of the GRNs for the identification of molecular targets associated with urothelial cancer.
RESULTS: We found that the urothelial cancer (UC) GRNs show a significant enrichment of subnetworks that are associated with known cancer hallmarks including cell cycle, immune response, signaling, differentiation and translation. Interestingly, the most prominent subnetworks of co-located genes were found on chromosome regions 5q31.3 (RNAseq), 8q24.3 (Oligo) and 1q23.3 (Bead), which all represent known genomic regions frequently deregulated or aberrant in urothelial cancer and other cancer types. Furthermore, the identified hub genes of the individual GRNs, e.g., HID1/DMC1 (tumor development), RNF17/TDRD4 (cancer antigen) and CYP4A11 (angiogenesis/metastasis), are known cancer-associated markers. The GRNs were highly dataset-specific at the level of interactions between individual genes, but showed large similarities at the level of biological function represented by subnetworks. Remarkably, the RNAseq UC GRN showed twice the proportion of significant functional subnetworks. Based on our analysis of inferential and experimental networks, the Bead UC GRN showed the lowest performance compared to the RNAseq and Oligo UC GRNs.
CONCLUSION: To our knowledge, this is the first study investigating genome-scale UC GRNs. RNAseq-based gene expression data is the platform of choice for GRN inference. Our study offers new avenues for the identification of novel putative diagnostic targets for subsequent studies of bladder tumors.
Abstract:
Good performance characterizes project success and value for money. However, performance problems are not uncommon in project management. Incentivization is generally recognized as a strategy for addressing performance problems. This chapter aims to explore incentive mechanisms and their impact on project performance. It is mainly based on the use of incentives in construction and engineering projects; the same principles apply to project management in other industry sectors. Incentivization can be used in such performance areas as time, cost, quality, safety and environment. A client has different ways of incentivizing a contractor's performance, e.g. (1) a single incentive or multiple incentives; and (2) incentives, disincentives, or a combination of both. The establishment of incentive mechanisms proves to have significant potential for relationship development, process enhancement and performance improvement. To ensure the success of incentive mechanisms, both contractors and clients need to make extra efforts. As a result, a link is developed among incentive mechanisms, the project management system and project performance.