938 results for source analysis
Abstract:
Of the ~1.7 million SINE elements in the human genome, only a tiny number are estimated to be active in transcription by RNA polymerase (Pol) III. Tracing the individual loci from which SINE transcripts originate is complicated by their highly repetitive nature. By exploiting RNA-Seq datasets and unique SINE DNA sequences, we devised a bioinformatic pipeline allowing us to identify Pol III-dependent transcripts of individual SINE elements. When applied to ENCODE transcriptomes of seven human cell lines, this search strategy identified ~1300 Alu loci and ~1100 MIR loci corresponding to detectable transcripts, with ~120 Alu and ~60 MIR loci expressed in at least three cell lines. In vitro transcription of selected SINEs did not reflect their in vivo expression properties, and required the native 5’-flanking region in addition to the internal promoter. We also identified a cluster of expressed AluYa5-derived transcription units, juxtaposed to snaR genes on chromosome 19, formed by a promoter-containing left monomer fused to an Alu-unrelated downstream moiety. Autonomous Pol III transcription was also revealed for SINEs nested within Pol II-transcribed genes, raising the possibility of an underlying mechanism for Pol II gene regulation by SINE transcriptional units. Moreover, the application of our bioinformatic pipeline to RNA-Seq data from both cells subjected to an in vitro pro-oncogenic stimulus and matched in vivo tumor and non-tumor samples allowed us to detect increased Alu RNA expression as well as the source loci of this deregulation. The ability to investigate SINE transcriptomes at single-locus resolution will facilitate both the identification of novel biologically relevant SINE RNAs and the assessment of SINE expression alteration under pathological conditions.
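As a rough illustration of the locus-level filtering idea described above, the sketch below counts uniquely mapped RNA-Seq reads (high mapping quality as a proxy for alignment to unique SINE sequence) over a list of annotated SINE loci. It is not the authors' pipeline; the BAM file name, locus coordinates and MAPQ cutoff are placeholders.

```python
# Minimal sketch (not the authors' pipeline): count uniquely mapped RNA-Seq
# reads over annotated SINE loci. File name, coordinates and cutoff are assumed.
import pysam

MIN_MAPQ = 30  # assumed threshold for treating an alignment as locus-specific

def count_locus_reads(bam_path, loci):
    """Return read counts per SINE locus, keeping only uniquely mapped reads."""
    counts = {}
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for name, chrom, start, end in loci:
            n = 0
            for read in bam.fetch(chrom, start, end):
                if read.is_unmapped or read.is_secondary or read.is_duplicate:
                    continue
                if read.mapping_quality >= MIN_MAPQ:
                    n += 1
            counts[name] = n
    return counts

# Example usage with hypothetical locus coordinates:
loci = [("AluYa5_chr19_example", "chr19", 1_000_000, 1_000_300)]
# counts = count_locus_reads("ENCODE_cell_line.bam", loci)
```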
Abstract:
With the globalisation of world business, cross-border activities between organisations have increased substantially. Organisations not only need to handle supply functions but also play a tremendous role in demand stimulation through integration both inside the firm and outside with business partners and customers. Logistics has become increasingly mature and sophisticated by taking on an external focus, incorporating suppliers and customers into business processes, with all the supply chain functions integrated into a whole. By minimising costs in the value chain or providing customised services, logistics acts as a major source of competitive advantage and profitability. Meeting this goal requires the integration of activities around customer-oriented measures. Customer service and logistics activities form a chain of interdependent activities that supplement each other to facilitate the flow of information, goods and cash within the value chain. The absence of one activity may require specific channels to compensate for another unit. This paper studies the impact of corporate strategy, technology and customer satisfaction on firm performance, addressing the gap concerning the effect of good customer service on long-term profits. Two international delivery providers, UPS and FedEx, are studied to identify the critical success factors of express logistics.
Abstract:
The objective of this work was to explore the performance of a recently introduced source extraction method, FSS (Functional Source Separation), in recovering induced oscillatory change responses from extra-cephalic magnetoencephalographic (MEG) signals. Unlike algorithms used to solve the inverse problem, FSS does not make any assumption about the underlying biophysical source model; instead, it makes use of task-related features (functional constraints) to estimate the source(s) of interest. FSS was compared with blind source separation (BSS) approaches such as Principal Component Analysis (PCA) and Independent Component Analysis (ICA), which are not subject to any explicit forward solution or functional constraint, but require source uncorrelatedness (PCA) or independence (ICA). A visual MEG experiment with signals recorded from six subjects viewing a set of static horizontal black/white square-wave grating patterns at different spatial frequencies was analyzed. The beamforming technique Synthetic Aperture Magnetometry (SAM) was applied to localize task-related sources; the obtained spatial filters were used to automatically select BSS and FSS components in the spatial area of interest. Source spectral properties were investigated by using Morlet-wavelet time-frequency representations, and significant task-induced changes were evaluated by means of a resampling technique; the resulting spectral behaviours in the gamma frequency band of interest (20-70 Hz), as well as the spatial frequency-dependent gamma reactivity, were quantified and compared among methods. Among the tested approaches, only FSS was able to estimate the expected sustained gamma activity enhancement in primary visual cortex throughout the whole duration of the stimulus presentation for all subjects, and to obtain sources comparable to invasively recorded data.
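For context, a complex Morlet wavelet time-frequency decomposition of the kind used here to assess induced gamma-band changes can be sketched in a few lines; the sampling rate, number of wavelet cycles and test signal below are assumptions, not the study's settings.

```python
# Minimal numpy sketch of a complex Morlet wavelet time-frequency transform,
# of the kind used to quantify induced gamma-band (20-70 Hz) changes.
import numpy as np

def morlet_tfr(signal, sfreq, freqs, n_cycles=7.0):
    """Return time-frequency power, shape (len(freqs), len(signal))."""
    power = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2.0 * np.pi * f)            # temporal width of the wavelet
        t = np.arange(-5 * sigma_t, 5 * sigma_t, 1.0 / sfreq)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))   # unit-energy normalisation
        analytic = np.convolve(signal, wavelet, mode="same")
        power[i] = np.abs(analytic) ** 2
    return power

# Example: one second of a synthetic 40 Hz burst sampled at 600 Hz (assumed values)
sfreq = 600.0
t = np.arange(0, 1, 1 / sfreq)
sig = np.sin(2 * np.pi * 40 * t) * (t > 0.5)
tfr = morlet_tfr(sig, sfreq, freqs=np.arange(20, 71, 5))
```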
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage arising from the absence of charge exchange complications is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction of low energy ions and atoms with solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer (ToFFASS), has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument; it allowed samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and provided the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed through a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
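Two calculations underlying this kind of ToF scattering analysis can be sketched as follows: converting a measured flight time to particle energy, and the elastic binary-collision kinematic factor relating scattered to incident energy. The flight path, masses and scattering angle below are illustrative values, not parameters from the work described.

```python
# Minimal sketch (assumed values, not from the thesis) of two ToF-scattering
# calculations: flight time to energy, and the elastic binary-collision
# kinematic factor E1/E0 for projectile mass m1 off target mass m2 (m2 > m1).
import math

AMU = 1.66053906660e-27   # kg
EV = 1.602176634e-19      # J

def tof_to_energy_eV(flight_time_s, path_m, mass_amu):
    """Kinetic energy of a scattered particle from its time of flight."""
    v = path_m / flight_time_s
    return 0.5 * mass_amu * AMU * v**2 / EV

def kinematic_factor(m1_amu, m2_amu, theta_deg):
    """Fraction of the incident energy retained after elastic scattering."""
    th = math.radians(theta_deg)
    root = math.sqrt(m2_amu**2 - (m1_amu * math.sin(th))**2)
    return ((m1_amu * math.cos(th) + root) / (m1_amu + m2_amu))**2

# Example: He scattered through 135 degrees off Cu, 1 m flight path, 2.7 us flight
print(kinematic_factor(4.0026, 63.546, 135.0))
print(tof_to_energy_eV(2.7e-6, 1.0, 4.0026))
```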
Abstract:
The two-way design has been variously described as a matched-sample F-test, a simple within-subjects ANOVA, a one-way within-groups ANOVA, a simple correlated-groups ANOVA, and a one-factor repeated measures design! This confusion of terminology is likely to lead to problems in correctly identifying this analysis within commercially available software. The essential feature of the design is that each treatment is allocated by randomization to one experimental unit within each group or block. The block may be a plot of land, a single occasion on which the experiment was performed, or a human subject. The ‘blocking’ is designed to remove an aspect of the error variation and increase the ‘power’ of the experiment. If there is no significant source of variation associated with the ‘blocking’, then there is a disadvantage to the two-way design because there is a reduction in the DF of the error term compared with a fully randomised design, thus reducing the ‘power’ of the analysis.
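A minimal sketch of the two-way (randomised block) analysis described above, using hypothetical data with one observation per treatment within each block, might look like this; the treatment names, block labels and values are placeholders.

```python
# Minimal sketch of a randomised block (two-way) ANOVA on hypothetical data:
# one observation per treatment within each block.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "block":     ["B1"] * 3 + ["B2"] * 3 + ["B3"] * 3 + ["B4"] * 3,
    "treatment": ["T1", "T2", "T3"] * 4,
    "y":         [12.1, 14.3, 13.0, 11.8, 15.0, 13.4,
                  12.6, 14.8, 13.9, 11.2, 14.1, 12.7],
})

# The block term absorbs between-block variation, at the cost of error DF.
model = ols("y ~ C(treatment) + C(block)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```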
Abstract:
Studies suggest that frontotemporal lobar degeneration with transactive response (TAR) DNA-binding protein of 43 kDa (TDP-43) proteinopathy (FTLD-TDP) is heterogeneous with division into four or five subtypes. To determine the degree of heterogeneity and the validity of the subtypes, we studied neuropathological variation within the frontal and temporal lobes of 94 cases of FTLD-TDP using quantitative estimates of density and principal components analysis (PCA). A PCA based on the density of TDP-43 immunoreactive neuronal cytoplasmic inclusions (NCI), oligodendroglial inclusions (GI), neuronal intranuclear inclusions (NII), and dystrophic neurites (DN), surviving neurons, enlarged neurons (EN), and vacuolation suggested that cases were not segregated into distinct subtypes. Variation in the density of the vacuoles was the greatest source of variation between cases. A PCA based on TDP-43 pathology alone suggested that cases of FTLD-TDP with progranulin (GRN) mutation segregated to some degree. The pathological phenotype of all four subtypes overlapped but subtypes 1 and 4 were the most distinctive. Cases with coexisting motor neuron disease (MND) or hippocampal sclerosis (HS) also appeared to segregate to some extent. We suggest: 1) pathological variation in FTLD-TDP is best described as a ‘continuum’ without clearly distinct subtypes, 2) vacuolation was the single greatest source of variation and reflects the ‘stage’ of the disease, and 3) within the FTLD-TDP ‘continuum’ cases with GRN mutation and with coexisting MND or HS may have a more distinctive pathology.
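A minimal sketch of this type of PCA on per-case density estimates is shown below, using randomly generated placeholder data rather than the study's measurements; the explained variance and loadings indicate whether cases separate into clusters or form a continuum.

```python
# Minimal sketch: PCA on per-case pathology densities (e.g. NCI, GI, NII, DN,
# vacuolation). The data here are random placeholders, not the study's values.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
densities = rng.lognormal(mean=1.0, sigma=0.5, size=(94, 7))  # rows = cases

X = StandardScaler().fit_transform(densities)
pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)               # per-case coordinates on PC1/PC2

print(pca.explained_variance_ratio_)    # variation captured by each PC
print(pca.components_)                  # loadings: which densities drive each PC
```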
Abstract:
It has been widely recognised that an in-depth textual analysis of a source text is relevant for translation. This book discusses the role of discourse analysis for translation and translator training. One particular model of discourse analysis is presented in detail, and its application in the context of translator training is critically examined.
Abstract:
Light occlusions are one of the most significant difficulties of photometric stereo methods. When three or more images are available without occlusion, the local surface orientation is overdetermined so that shape can be computed and the shadowed pixels can be discarded. In this paper, we look at the challenging case when only two images are available without occlusion, leading to a one degree of freedom ambiguity per pixel in the local orientation. We show that, in the presence of noise, integrability alone cannot resolve this ambiguity and reconstruct the geometry in the shadowed regions. As the problem is ill-posed in the presence of noise, we describe two regularization schemes that improve the numerical performance of the algorithm while preserving the data. Finally, the paper describes how this theory applies in the framework of color photometric stereo where one is restricted to only three images and light occlusions are common. Experiments on synthetic and real image sequences are presented.
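For reference, the unambiguous baseline case that the paper departs from, Lambertian photometric stereo with three or more unshadowed images, reduces to a per-pixel least-squares solve. The sketch below uses made-up light directions and does not implement the two-image regularisation schemes described above.

```python
# Minimal sketch of standard Lambertian photometric stereo with >= 3 lights:
# per pixel, solve L @ (albedo * n) = I in the least-squares sense.
import numpy as np

def lambertian_normals(I, L):
    """I: (k, n_pixels) intensities under k lights; L: (k, 3) light directions.
    Returns unit normals (n_pixels, 3) and albedo (n_pixels,)."""
    G, *_ = np.linalg.lstsq(L, I, rcond=None)        # G = albedo * normal, (3, n)
    albedo = np.linalg.norm(G, axis=0)
    normals = (G / np.maximum(albedo, 1e-12)).T
    return normals, albedo

# Example: one pixel with normal (0, 0, 1), albedo 0.8, three assumed lights
L = np.array([[0.0, 0.0, 1.0], [0.6, 0.0, 0.8], [0.0, 0.6, 0.8]])
n_true = np.array([0.0, 0.0, 1.0])
I = 0.8 * (L @ n_true)[:, None]
print(lambertian_normals(I, L))
```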
Abstract:
We propose a novel electroencephalographic application of a recently developed cerebral source extraction method (Functional Source Separation, FSS), which starts from extracranial signals and adds a functional constraint to the cost function of a basic independent component analysis model without requiring solutions to be independent. Five ad-hoc functional constraints were used to extract the activity reflecting the temporal sequence of sensory information processing along the somatosensory pathway in response to separate left and right median nerve galvanic stimulation. Constraints required only the maximization of the responsiveness at specific latencies following sensory stimulation, without taking into account any frequency or spatial information. After source extraction, the reliability of each identified functional source (FS) was assessed based on the position of single dipoles fitted to its retroprojected signals and on a discrepancy measure. The FS positions were consistent with previously reported data (two early subcortical sources localized in the brain stem and thalamus, and three later sources in cortical areas), leaving negligible residual activity at the corresponding latencies. The high-frequency oscillatory activity (HFO) of each extracted component was analyzed. The integrity of the low-amplitude HFOs was preserved for each FS. On the basis of our data, we suggest that FSS can be an effective tool to investigate the HFO behavior of the different neuronal pools recruited at successive times after median nerve galvanic stimulation. As FSs are reconstructed along the entire experimental session, directional and dynamic HFO synchronization phenomena can be studied.
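A simplified illustration of the "responsiveness at a specific latency" constraint described above is sketched below: it scores a candidate source by its evoked amplitude in a narrow window around a chosen post-stimulus latency. The sampling rate, window width and synthetic data are assumptions, not the study's recordings.

```python
# Minimal sketch of a latency-specific responsiveness index for a candidate
# source, computed from trial-averaged (evoked) retroprojected signals.
import numpy as np

def responsiveness(epochs, sfreq, stim_sample, latency_ms, half_win_ms=2.0):
    """epochs: (n_trials, n_samples) source signal per trial."""
    centre = stim_sample + int(round(latency_ms * sfreq / 1000.0))
    half = int(round(half_win_ms * sfreq / 1000.0))
    evoked = epochs.mean(axis=0)                        # trial average
    return np.abs(evoked[centre - half:centre + half + 1]).max()

# Example: 200 synthetic trials with a deflection ~20 ms after the stimulus
sfreq, stim_sample = 5000.0, 500
rng = np.random.default_rng(1)
epochs = rng.normal(0, 1, (200, 1500))
epochs[:, stim_sample + 100] += 5.0                     # 100 samples = 20 ms
print(responsiveness(epochs, sfreq, stim_sample, latency_ms=20.0))
```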
Abstract:
In Statnote 9, we described a one-way analysis of variance (ANOVA) ‘random effects’ model in which the objective was to estimate the degree of variation of a particular measurement and to compare different sources of variation in space and time. The illustrative scenario involved the role of computer keyboards in a University communal computer laboratory as a possible source of microbial contamination of the hands. The study estimated the aerobic colony count of ten selected keyboards with samples taken from two keys per keyboard determined at 9am and 5pm. This type of design is often referred to as a ‘nested’ or ‘hierarchical’ design and the ANOVA estimated the degree of variation: (1) between keyboards, (2) between keys within a keyboard, and (3) between sample times within a key. An alternative to this design is a 'fixed effects' model in which the objective is not to measure sources of variation per se but to estimate differences between specific groups or treatments, which are regarded as 'fixed' or discrete effects. This Statnote describes two scenarios utilizing this type of analysis: (1) measuring the degree of bacterial contamination on 2p coins collected from three types of business property, viz., a butcher’s shop, a sandwich shop, and a newsagent, and (2) the effectiveness of drugs in the treatment of a fungal eye infection.
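A minimal sketch of the fixed-effects one-way ANOVA in scenario (1), with hypothetical colony counts for coins from the three property types, might look like this.

```python
# Minimal sketch: fixed-effects one-way ANOVA comparing bacterial contamination
# of 2p coins from three property types. The counts are hypothetical.
from scipy import stats

butcher   = [210, 185, 240, 198, 222]
sandwich  = [150, 172, 140, 165, 158]
newsagent = [ 95, 110, 102,  88, 120]

f_stat, p_value = stats.f_oneway(butcher, sandwich, newsagent)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```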
Abstract:
The multiterminal dc wind farm is a promising topology with a voltage-source inverter (VSI) connection at the onshore grid. Voltage-source converters (VSCs) are robust to ac-side fault conditions. However, they are vulnerable to dc faults on the dc side of the converter. This paper analyzes dc faults, their transients, and the resulting protection issues. Overcurrent faults are analyzed in detail to provide insight into protection system design. The radial wind farm topology with star or string connection is considered. The outcomes may be applicable to VSCs in the multi-VSC dc wind farm collection grid and to VSC-based high-voltage direct current (HVDC) offshore transmission systems.
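As a rough illustration of the dominant first stage of a VSC dc-side fault, the sketch below simulates the dc-link capacitor discharging through a series R-L fault loop; the capacitance, inductance, resistance and pre-fault voltage are illustrative values, not the paper's system parameters.

```python
# Minimal sketch: dc-link capacitor discharge into a series R-L fault loop,
# the transient that typically dominates the initial VSC dc-fault current.
# All parameter values are assumed for illustration.
import numpy as np
from scipy.integrate import solve_ivp

C, L, R = 10e-3, 50e-6, 10e-3        # assumed capacitance, cable inductance, resistance
V0 = 5e3                             # assumed pre-fault dc-link voltage

def rlc(t, y):
    v_c, i = y                       # capacitor voltage, fault-loop current
    return [-i / C, (v_c - R * i) / L]

sol = solve_ivp(rlc, (0.0, 20e-3), [V0, 0.0], max_step=1e-6)
print(f"peak fault current ~ {sol.y[1].max() / 1e3:.1f} kA")
```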
Abstract:
The diagnosis of ocular disease is increasingly important in optometric practice, and there is a need for cost-effective point-of-care assays to assist in this. Although tears are a potentially valuable source of diagnostic information, difficulties associated with sample collection, limited sample size, and sample storage and transport have proved major limitations. Progressive developments in electronics and fibre optics, together with innovation in sensing technology, mean that the construction of inexpensive point-of-care fibre-optic sensing devices is now possible. Tear electrolytes are an obvious family of target analytes, not least to complement the availability of devices that make the routine measurement of tear osmolarity possible in the clinic. In this paper we describe the design, fabrication and calibration of a fibre-optic-based electrolyte sensor for the quantification of potassium in tears, using the ex vivo contact lens as the sample source. The technology is generic and the same principles can be used in the development of calcium and magnesium sensors. An important objective of this sensor technology development is to provide information at the point of routine optometric examination, which would provide supportive evidence of tear abnormality.
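A generic calibration step of the kind described above can be sketched as fitting a response curve to potassium standards and inverting it for an unknown sample; the data and the linear model below are hypothetical placeholders, not the device's actual response.

```python
# Minimal sketch of a sensor calibration: fit response vs. K+ concentration
# for standards, then invert the fit for an unknown. Data and model are assumed.
import numpy as np

conc_mM  = np.array([5.0, 10.0, 20.0, 30.0, 40.0])     # K+ standards (mmol/L)
response = np.array([0.11, 0.21, 0.43, 0.62, 0.83])    # assumed sensor output (a.u.)

slope, intercept = np.polyfit(conc_mM, response, 1)    # linear calibration fit

def to_concentration(signal):
    return (signal - intercept) / slope

print(f"unknown sample: {to_concentration(0.50):.1f} mM K+")
```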
Abstract:
The inverse problem of determining a spacewise dependent heat source, together with the initial temperature for the parabolic heat equation, using the usual conditions of the direct problem and information from two supplementary temperature measurements at different instants of time is studied. These spacewise dependent temperature measurements ensure that this inverse problem has a unique solution; however, the solution is unstable and hence the problem is ill-posed. We propose an iterative algorithm for the stable reconstruction of both the initial data and the source based on a sequence of well-posed direct problems for the parabolic heat equation, which are solved at each iteration step using the boundary element method. The instability is overcome by stopping the iterations at the first iteration for which the discrepancy principle is satisfied. Numerical results are presented for a typical benchmark test example, which has the input measured data perturbed by increasing amounts of random noise. The numerical results show that the proposed procedure gives accurate numerical approximations in relatively few iterations.
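The discrepancy-principle stopping rule can be illustrated on a generic linear ill-posed problem using Landweber iteration (rather than the paper's BEM-based scheme): iterate until the residual falls to roughly the noise level. The operator, noise level and parameter tau below are assumptions for the illustration.

```python
# Minimal sketch of discrepancy-principle stopping on a generic ill-posed
# linear problem A x = y_delta, using Landweber iteration (not the paper's
# BEM-based scheme). Stop when ||A x_k - y_delta|| <= tau * ||noise||.
import numpy as np

rng = np.random.default_rng(0)
n = 60
A = np.array([[np.exp(-0.5 * ((i - j) / 3.0) ** 2) for j in range(n)] for i in range(n)])
A /= A.sum(axis=1, keepdims=True)            # mildly ill-conditioned "blurring" operator
x_true = np.sin(np.linspace(0, np.pi, n))    # smooth true source

delta = 1e-2                                 # assumed per-point noise level
y_delta = A @ x_true + delta * rng.standard_normal(n)

tau = 1.1
omega = 1.0 / np.linalg.norm(A, 2) ** 2      # Landweber step size < 2 / ||A||^2
x = np.zeros(n)
for k in range(10000):
    residual = A @ x - y_delta
    if np.linalg.norm(residual) <= tau * delta * np.sqrt(n):
        break                                # discrepancy principle satisfied
    x -= omega * (A.T @ residual)
print(f"stopped at iteration {k}")
```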
Abstract:
A simple and cost-effective technique for generating a flat, square-shaped multi-wavelength optical comb with 42.6 GHz line spacing and over 0.5 THz of total bandwidth is presented. A detailed theoretical analysis is presented, showing that two concatenated modulators driven with voltages of 3.5 Vp are necessary to generate 11 comb lines with a flatness below 2 dB. This performance is experimentally demonstrated using two cascaded Versawave 40 Gbit/s low-drive-voltage electro-optic polarisation modulators, where an 11-channel optical comb with a flatness of 1.9 dB and a side-mode-suppression ratio (SMSR) of 12.6 dB was obtained.
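For background, the comb lines produced by a single sinusoidally driven phase modulator follow Bessel functions of the modulation index, which is why a single modulator is generally far from flat and a cascaded arrangement is attractive. The sketch below uses an assumed modulation index and a plain phase modulator, not the paper's cascaded polarisation-modulator configuration, and simply quantifies the flatness of the central 11 lines.

```python
# Minimal sketch: comb-line powers of a single sinusoidally driven phase
# modulator follow J_n(m)^2; flatness of the central 11 lines is computed.
# The modulation index m is an assumed illustrative value.
import numpy as np
from scipy.special import jv

m = 11.0                                   # assumed peak phase modulation index
orders = np.arange(-8, 9)
powers_dB = 10 * np.log10(jv(orders, m) ** 2 + 1e-30)

central = powers_dB[np.abs(orders) <= 5]   # the 11 central lines
print(f"flatness of central 11 lines: {central.max() - central.min():.1f} dB")
```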
Abstract:
Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. Crisp input and output data are fundamentally indispensable in conventional DEA. However, the observed values of the input and output data in real-world problems are sometimes imprecise or vague. Many researchers have proposed various fuzzy methods for dealing with the imprecise and ambiguous data in DEA. In this study, we provide a taxonomy and review of the fuzzy DEA methods. We present a classification scheme with four primary categories, namely, the tolerance approach, the α-level based approach, the fuzzy ranking approach and the possibility approach. We discuss each classification scheme and group the fuzzy DEA papers published in the literature over the past 20 years. To the best of our knowledge, this paper appears to be the only review and complete source of references on fuzzy DEA.
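For reference, the crisp CCR model that the fuzzy DEA approaches reviewed here extend can be written as a small linear programme per DMU (multiplier form): maximise the weighted output of the DMU under evaluation, with its weighted input normalised to one and every DMU's efficiency bounded by one. The three-DMU data set below is hypothetical.

```python
# Minimal sketch of the crisp CCR (multiplier-form) DEA model, solved as one
# LP per DMU. The input/output data are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs, rows = DMUs
Y = np.array([[1.0], [1.0], [1.5]])                   # outputs, rows = DMUs

def ccr_efficiency(o):
    n_dmu, n_in = X.shape
    n_out = Y.shape[1]
    # decision variables: output weights u (n_out), then input weights v (n_in)
    c = np.concatenate([-Y[o], np.zeros(n_in)])              # maximise u.y_o
    A_eq = np.concatenate([np.zeros(n_out), X[o]])[None, :]  # v.x_o = 1
    b_eq = [1.0]
    A_ub = np.hstack([Y, -X])                                # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n_dmu)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n_out + n_in), method="highs")
    return -res.fun

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```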