933 results for Low Autocorrelation Binary Sequence Problem


Relevance: 30.00%

Abstract:

Low back pain (LBP) is currently the most prevalent and costly musculoskeletal problem in modern societies. Screening instruments for the identification of prognostic factors in LBP may help to identify patients with an unfavourable outcome. In this systematic review, screening instruments published between 1970 and 2007 were identified by a literature search. Nine different instruments were analysed and their items grouped into ten structures. Finally, the predictive effectiveness of these structures was examined for dependent variables including "work status", "functional limitation", and "pain". The strongest predictors of "work status" were psychosocial and occupational structures, whereas psychological structures were dominant for "functional limitation" and "pain". Psychological and occupational factors are highly reliable predictors of prognosis in patients with LBP. Screening instruments for the identification of prognostic factors in patients with LBP should include these factors as a minimum core set.

Relevance: 30.00%

Abstract:

Whereas a non-operative approach for hemodynamically stable patients with free intraabdominal fluid in the presence of solid organ injury is generally accepted, the presence of free fluid in the abdomen without evidence of solid organ injury presents a challenge not only for the treating emergency physician but also for the surgeon in charge. Despite recent advances in imaging modalities, with multi-detector computed tomography (CT), with or without contrast agent, usually being the imaging method of choice, diagnosis and interpretation of the results remain difficult. While some studies conclude that CT is highly accurate and relatively specific at diagnosing mesenteric and hollow viscus injury, other studies deem CT to be unreliable. These differences may in part be due to the experience and the interpretation of the radiologist and/or the treating physician or surgeon. A search of the literature has made it apparent that there is no straightforward answer to the question of what to do with patients with free intraabdominal fluid on CT scanning but without signs of solid organ injury. In hemodynamically unstable patients, free intraabdominal fluid in the absence of solid organ injury usually mandates immediate surgical intervention. For patients with blunt abdominal trauma and more than just a trace of free intraabdominal fluid, or for patients with signs of peritonitis, the threshold for a surgical exploration - preferably by a laparoscopic approach - should be low. Based on the available information, we aim to provide the reader with an overview of the current literature with specific emphasis on diagnostic and therapeutic approaches to this problem, and we suggest a possible algorithm that might help with the adequate treatment of such patients.

Relevance: 30.00%

Abstract:

The car sequencing problem determines sequences of different car models launched down a mixed-model assembly line. To avoid overloading the workforce, car sequencing restricts the maximum occurrence of labor-intensive options, e.g., a sunroof, by applying sequencing rules. We consider this problem in a resequencing context, where a given number of buffers (denoted as pull-off tables) is available for rearranging a stirred sequence. The problem is formalized and suitable solution procedures are developed. A lower bound and a dominance rule are introduced, both of which reduce the running time of our graph approach. Finally, a real-world resequencing setting is investigated.
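
The abstract does not spell out the rule format, but a common convention in the car sequencing literature is an "H:N" rule per option: at most H cars requiring that option within any N consecutive positions. The minimal Python sketch below, with hypothetical data, only illustrates how such rule violations can be counted for a candidate sequence; it is not the solution procedure developed in the paper.

```python
# Minimal sketch (not the paper's procedure): count sequencing-rule violations
# for a candidate sequence, assuming the common "H:N" rule convention.

def count_violations(sequence, rules, requires):
    """sequence: list of model names in production order
    rules: dict mapping option -> (H, N)
    requires: dict mapping model -> set of options the model needs
    Returns the number of violated (option, window) pairs."""
    violations = 0
    for option, (h, n) in rules.items():
        flags = [1 if option in requires[model] else 0 for model in sequence]
        for start in range(len(flags) - n + 1):
            if sum(flags[start:start + n]) > h:
                violations += 1
    return violations

# Hypothetical example: at most 1 sunroof car in any 3 consecutive positions.
rules = {"sunroof": (1, 3)}
requires = {"A": {"sunroof"}, "B": set()}
print(count_violations(["A", "A", "B", "A", "B"], rules, requires))  # 2
```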

Relevance: 30.00%

Abstract:

We investigate the problem of distributed sensor failure detection in networks with a small number of defective sensors, whose measurements differ significantly from the neighboring measurements. We build on the sparse nature of the binary sensor failure signals to propose a novel distributed detection algorithm based on gossip mechanisms and on Group Testing (GT), where the latter has so far been used in centralized detection problems. The new distributed GT algorithm estimates the set of scattered defective sensors with a low-complexity distance decoder from a small number of linearly independent binary messages exchanged by the sensors. We first consider networks with one defective sensor and determine the minimal number of linearly independent messages needed for its detection with high probability. We then extend our study to the detection of multiple defective sensors by appropriately modifying the message exchange protocol and the decoding procedure. We show that, for small and medium-sized networks, the number of messages required for successful detection is actually smaller than the minimal number computed theoretically. Finally, simulations demonstrate that the proposed method outperforms methods based on random walks in terms of both detection performance and convergence rate.
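
The distributed gossip protocol itself is not reproduced here, but the group-testing idea behind the decoder can be shown in a few lines: random pooling tests produce Boolean OR outcomes, and a distance decoder picks the defective-sensor hypothesis whose predicted outcomes are closest to the observed ones. The sketch below is a centralized toy with made-up sizes, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_tests = 20, 10
defective = 5                                   # ground-truth defective sensor

# Random pooling matrix: test t involves sensor s with probability 1/2
W = rng.integers(0, 2, size=(n_tests, n_sensors))

# A test outcome is positive iff it includes the defective sensor (Boolean OR)
f = np.zeros(n_sensors, dtype=int)
f[defective] = 1
y = (W @ f > 0).astype(int)

# Distance decoder: choose the single-defective hypothesis whose predicted
# outcomes are closest in Hamming distance to the observed outcomes. With
# enough tests this recovers the true defective sensor with high probability.
dists = [int(np.sum(W[:, s] != y)) for s in range(n_sensors)]
print("estimated defective sensor:", int(np.argmin(dists)))
```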

Relevance: 30.00%

Abstract:

The purpose of this study was to investigate the role of the fronto–striatal system in implicit task sequence learning. We tested the performance of patients with compromised functioning of the fronto–striatal loops, that is, patients with Parkinson's disease and patients with lesions in the ventromedial or dorsolateral prefrontal cortex. We also tested amnesic patients with lesions either to the basal forebrain/orbitofrontal cortex or to thalamic/medio-temporal regions. We used a task sequence learning paradigm involving the presentation of a sequence of categorical binary-choice decision tasks. After several blocks of training, the sequence, hidden in the order of tasks, was replaced by a pseudo-random sequence. Learning (i.e., sensitivity to the ordering) was assessed by measuring whether this change disrupted performance. Although all the patients were able to perform the decision tasks quite easily, those with lesions to the fronto–striatal loops (i.e., patients with Parkinson's disease, patients with lesions in the ventromedial or dorsolateral prefrontal cortex, and amnesic patients with lesions to the basal forebrain/orbitofrontal cortex) did not show any evidence of implicit task sequence learning. In contrast, the amnesic patients with lesions to thalamic/medio-temporal regions showed intact sequence learning. Together, these results indicate that the integrity of the fronto–striatal system is a prerequisite for implicit task sequence learning.

Relevance: 30.00%

Abstract:

Storing and recalling spiking sequences is a general problem the brain needs to solve. It is, however, unclear what type of biologically plausible learning rule is suited to learn a wide class of spatiotemporal activity patterns in a robust way. Here we consider a recurrent network of stochastic spiking neurons composed of both visible and hidden neurons. We derive a generic learning rule that is matched to the neural dynamics by minimizing an upper bound on the Kullback–Leibler divergence from the target distribution to the model distribution. The derived learning rule is consistent with spike-timing dependent plasticity in that a presynaptic spike preceding a postsynaptic spike elicits potentiation while otherwise depression emerges. Furthermore, the learning rule for synapses that target visible neurons can be matched to the recently proposed voltage-triplet rule. The learning rule for synapses that target hidden neurons is modulated by a global factor, which shares properties with astrocytes and gives rise to testable predictions.
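
The derived learning rule itself is not given in the abstract. As a point of reference, the classic pairwise STDP window that the rule is said to be consistent with (a presynaptic spike preceding a postsynaptic spike potentiates, otherwise depression) can be sketched as follows, with illustrative parameter values.

```python
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Classic pairwise STDP window (illustrative only, not the paper's derived
    rule). Returns the weight change for one pre/post spike pair; times in ms."""
    dt = t_post - t_pre
    if dt > 0:      # pre precedes post -> potentiation
        return a_plus * math.exp(-dt / tau)
    else:           # post precedes (or coincides with) pre -> depression
        return -a_minus * math.exp(dt / tau)

print(stdp_delta_w(10.0, 15.0))   # small positive weight change
print(stdp_delta_w(15.0, 10.0))   # small negative weight change
```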

Relevance: 30.00%

Abstract:

We consider the problem of fitting a union of subspaces to a collection of data points drawn from one or more subspaces and corrupted by noise and/or gross errors. We pose this problem as a non-convex optimization problem, where the goal is to decompose the corrupted data matrix as the sum of a clean and self-expressive dictionary plus a matrix of noise and/or gross errors. By self-expressive we mean a dictionary whose atoms can be expressed as linear combinations of themselves with low-rank coefficients. In the case of noisy data, our key contribution is to show that this non-convex matrix decomposition problem can be solved in closed form from the SVD of the noisy data matrix. The solution involves a novel polynomial thresholding operator on the singular values of the data matrix, which requires minimal shrinkage. For one subspace, a particular case of our framework leads to classical PCA, which requires no shrinkage. For multiple subspaces, the low-rank coefficients obtained by our framework can be used to construct a data affinity matrix from which the clustering of the data according to the subspaces can be obtained by spectral clustering. In the case of data corrupted by gross errors, we solve the problem using an alternating minimization approach, which combines our polynomial thresholding operator with the more traditional shrinkage-thresholding operator. Experiments on motion segmentation and face clustering show that our framework performs on par with state-of-the-art techniques at a reduced computational cost.
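
The exact polynomial thresholding operator is not stated in the abstract, so the sketch below substitutes a plain hard threshold on the singular values as a placeholder; the rest of the pipeline (SVD, low-rank self-expressive coefficients, affinity matrix, spectral clustering) follows the description above. scikit-learn is assumed for the spectral clustering step.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def subspace_clustering_sketch(X, n_clusters, sv_threshold):
    """X: d x N data matrix (columns are points). Illustrative sketch only:
    hard thresholding stands in for the paper's polynomial thresholding."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    keep = s > sv_threshold                  # placeholder shrinkage rule
    V_r = Vt[keep].T                         # N x r right singular vectors
    C = V_r @ V_r.T                          # low-rank self-expressive coefficients
    A = np.abs(C) + np.abs(C.T)              # symmetric, non-negative affinity
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="precomputed").fit_predict(A)
    return labels
```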

Relevance: 30.00%

Abstract:

Musculoskeletal infections are infections of the bone and surrounding tissues. They are currently diagnosed based on culture analysis, which is the gold standard for pathogen identification. However, these clinical laboratory methods are frequently inadequate for identifying the causative agents, because a large percentage (25-50%) of confirmed musculoskeletal infections are false negatives in which no pathogen is identified in culture. My data support these results. The goal of this project was to use PCR amplification of a portion of the 16S rRNA gene to test an alternative approach for the identification of these pathogens and to assess the diversity of the bacteria involved. The advantages of this alternative method are that it should increase sample sensitivity and the speed of detection. In addition, bacteria that are non-culturable or in low abundance can be detected using this molecular technique. However, a complication of this approach is that the majority of musculoskeletal infections are polymicrobial, which prohibits direct identification from the infected tissue by DNA sequencing of the initial 16S rDNA amplification products. One way to solve this problem is to use denaturing gradient gel electrophoresis (DGGE) to separate the PCR products before DNA sequencing. DGGE separates DNA molecules based on their melting point, which is determined by their DNA sequence. This analytical technique allows a mixture of PCR products of the same length, which electrophoreses through agarose gels as a single band, to be separated into distinct bands that can then be used for DNA sequence analysis. In this way, DGGE allows the identification of individual bacterial species in polymicrobial-infected tissue, which is critical for improving clinical outcomes. By combining 16S rDNA amplification and DGGE, an alternative identification approach was applied. The 16S rRNA gene PCR-DGGE method includes several critical steps: DNA extraction from tissue biopsies, amplification of the bacterial DNA, PCR product separation by DGGE, amplification of the gel-extracted DNA, and DNA sequencing and analysis. Each step of the method was optimized to increase its sensitivity and to allow rapid detection of the bacteria present in human tissue samples. The limit of detection was at least 20 Staphylococcus aureus cells for the DNA extraction from tissue and at least 0.05 pg of template DNA for PCR. The conditions for DGGE electrophoresis were optimized by using a double gradient of acrylamide (6-10%) and denaturant (30-70%), which increased the separation between distinct PCR products. The use of GelRed (Biotium) improved DNA visualization in the DGGE gel. To recover the DNA from the DGGE gels, the gel slices were excised, shredded in a bead beater, and the DNA was allowed to diffuse into sterile water overnight. The use of primers containing specific linkers allowed the entire amplified PCR product to be sequenced and analyzed. The optimized 16S rRNA gene PCR-DGGE method was used to analyze 50 tissue biopsy samples chosen randomly from our collection. The results were compared to those of the Memorial Hermann Hospital Clinical Microbiology Laboratory for the same samples. The molecular method agreed with the culture result for 10 of the 17 (59%) culture-negative tissue samples; in the remaining 7 of the 17 (41%) culture-negative samples, the molecular method identified a bacterium.
The molecular method was congruent with the culture identification for 7 of the 33 (21%) culture-positive tissue samples, and in 8 of the 33 (24%) the molecular method identified additional organisms. In 13 of the 15 (87%) polymicrobial cultured tissue samples, the molecular method identified at least one organism that was also identified by culture techniques. Overall, DGGE analysis of 16S rDNA is an effective method for identifying bacteria not identified by culture analysis.

Relevance: 30.00%

Abstract:

Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
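
The elastream interface is not described in the abstract; the following hypothetical sketch only illustrates the underlying idea of processing NGS reads chunk by chunk as they stream in, instead of waiting for the full transfer to complete. All names are made up and are not the elastream API.

```python
# Hypothetical illustration of overlapping transfer with computation: process
# FASTQ records chunk by chunk as they arrive on a stream, rather than waiting
# for the whole dataset. Not the elastream API.
from itertools import islice

def read_fastq_records(handle):
    """Yield (header, seq, plus, qual) tuples from a FASTQ stream."""
    while True:
        record = [handle.readline().rstrip() for _ in range(4)]
        if not record[0]:
            return
        yield tuple(record)

def process_in_chunks(handle, chunk_size=10000, analyze=len):
    """Apply `analyze` to each chunk of records as soon as it is available."""
    records = read_fastq_records(handle)
    while True:
        chunk = list(islice(records, chunk_size))
        if not chunk:
            break
        yield analyze(chunk)   # e.g. alignment or counting on this chunk only

# Usage sketch:
# with open("reads.fastq") as fh:
#     for partial_result in process_in_chunks(fh):
#         merge(partial_result)   # combine per-chunk results incrementally
```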

Relevance: 30.00%

Abstract:

POLN is a nuclear A-family DNA polymerase encoded in vertebrate genomes. POLN has unusual fidelity and DNA lesion bypass properties, including strong strand displacement activity, low fidelity favoring incorporation of T for template G and accurate translesion synthesis past a 5S-thymine glycol (5S-Tg). We searched for conserved features of the polymerase domain that distinguish it from prokaryotic pol I-type DNA polymerases. A Lys residue (679 in human POLN) of particular interest was identified in the conserved 'O-helix' of motif 4 in the fingers sub-domain. The corresponding residue is one of the most important for controlling fidelity of prokaryotic pol I and is a nonpolar Ala or Thr in those enzymes. Kinetic measurements show that K679A or K679T POLN mutant DNA polymerases have full activity on nondamaged templates, but poorly incorporate T opposite template G and do not bypass 5S-Tg efficiently. We also found that a conserved Tyr residue in the same motif not only affects sensitivity to dideoxynucleotides, but also greatly influences enzyme activity, fidelity and bypass. Protein sequence alignment reveals that POLN has three specific insertions in the DNA polymerase domain. The results demonstrate that residues have been strictly retained during evolution that confer unique bypass and fidelity properties on POLN.

Relevance: 30.00%

Abstract:

Monocyte developmental heterogeneity is reflected at the cellular level by differential activation competence and at the molecular level by differential regulation of gene expression. LPS activates monocytes to produce tumor necrosis factor-α (TNF). The molecular events necessary for TNF regulation have not been elucidated, but they depend both on activation signals and on the maturation state of the cell: peripheral blood monocytes produce TNF upon LPS stimulation, but only within the first 72 hours of culture. Expression of c-fos is associated with monocytic differentiation and activation; the fos-associated protein, c-jun, is also expressed during monocyte activation. Increased cAMP levels are associated with downregulation of macrophage function, including LPS-induced TNF transcription. Given these associations, we studied a region of the TNF promoter that resembles the binding sites for both AP-1 (fos/jun) and CRE-binding protein (or ATF) in order to identify potential molecular markers defining activation-competent populations of monocytic cells. Nuclear protein binding studies using extracts from THP-1 monocytic cells treated with LPS (which stimulates TNF production) or with dexamethasone (Dex) or pentoxifylline (PTX) (which inhibit it) suggest that a low-mobility doublet complex may be involved in regulation through this promoter region. PTX or Dex increase binding of these complexes equivalently over untreated cells; approximately two hours after LPS induction, the upper complex is undetectable. The upper complex is composed of ATF2 (CRE-BP1); the lower is a heterodimer of jun/ATF2. LPS induces c-jun and thus may enhance formation of jun-ATF2 complexes. The simultaneous presence of both complexes may reduce TNF transcription through competitive binding, while loss of the upper (ATF2) complex and/or gain of the lower (jun-ATF2) complex allows increased transcription. AP-1 elements generally transduce signals involving PKC; the CRE mediates a cAMP response involving PKA. Thus, this element can receive signals through divergent signalling pathways. Our findings also suggest that cAMP-induced inhibition of macrophage functions may occur via downregulation of activation-associated genes through competitive binding of particular cAMP-responsive nuclear protein complexes.

Relevance: 30.00%

Abstract:

Information about fluid evolution and solute transport in a low-permeability metamorphic rock sequence has been obtained by comparing chloride concentrations and chlorine isotope ratios of pore water, groundwater, and fluid inclusions. The similarity of δ37Cl values in fluid inclusions and groundwater suggests a closed-system evolution during the metamorphic overprint, and signatures established at this time appear to form the initial conditions for chloride transport after exhumation of the rock sequence.

Relevance: 30.00%

Abstract:

The proliferation of multimedia content and the demand for new audio and video services have fostered the development of a new era based on multimedia information, enabling the evolution of Wireless Multimedia Sensor Networks (WMSNs) and Flying Ad-Hoc Networks (FANETs). Live multimedia services require real-time video transmission with a low frame loss rate, tolerable end-to-end delay, and low jitter in order to support video dissemination with Quality of Experience (QoE) support. Hence, a key principle in a QoE-aware approach is to transmit and protect high-priority frames with a minimal packet loss ratio and low network overhead. Moreover, multimedia content must be transmitted from a given source to the destination via intermediate nodes with high reliability in large-scale scenarios. The routing service must cope with dynamic topologies caused by node failure or mobility, as well as with wireless channel changes, so that it continues to operate during multimedia transmission despite these dynamics. Finally, understanding user satisfaction when watching a video sequence is becoming a key requirement for the delivery of multimedia content with QoE support. With this goal in mind, solutions for multimedia transmission must take the video characteristics into account to improve the quality of the delivered video. The main research contributions of this thesis are driven by the research question of how to provide multimedia distribution with high energy efficiency, reliability, robustness, scalability, and QoE support over wireless ad hoc networks. The thesis addresses several problem domains, with contributions on different layers of the communication stack. At the application layer, we introduce a QoE-aware packet redundancy mechanism to reduce the impact of the unreliable and lossy nature of wireless environments on the dissemination of live multimedia content. At the network layer, we introduce two routing protocols, namely a video-aware Multi-hop and multi-path hierarchical routing protocol for Efficient VIdeo transmission in static WMSN scenarios (MEVI), and a cross-layer link-quality- and geographical-aware beaconless opportunistic routing (OR) protocol for multimedia FANET scenarios (XLinGO). Both protocols enable multimedia dissemination with energy efficiency, reliability, and QoE support, which is achieved by combining multiple cross-layer metrics in the routing decision in order to establish reliable routes.
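
Neither MEVI's nor XLinGO's actual metric combination is given in the abstract; the toy sketch below, with made-up weights, only illustrates what combining several normalized cross-layer metrics into a single link score for next-hop selection can look like.

```python
def link_score(link_quality, geo_progress, residual_energy, queue_free,
               weights=(0.4, 0.3, 0.2, 0.1)):
    """Toy composite routing metric (not MEVI's or XLinGO's actual formula).
    All inputs are assumed normalized to [0, 1]; higher score = better next hop."""
    metrics = (link_quality, geo_progress, residual_energy, queue_free)
    return sum(w * m for w, m in zip(weights, metrics))

# A forwarder would prefer the neighbour with the highest score:
neighbours = {"n1": (0.9, 0.5, 0.8, 0.7), "n2": (0.6, 0.9, 0.4, 0.9)}
best = max(neighbours, key=lambda n: link_score(*neighbours[n]))
print(best)   # -> n1
```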

Relevance: 30.00%

Abstract:

Sequence analysis and optimal matching are useful heuristic tools for the descriptive analysis of heterogeneous individual pathways such as educational careers, job sequences, or patterns of family formation. However, to date it remains unclear how to handle the problems that missing values inevitably cause in such analyses. Multiple Imputation (MI) offers a possible solution to this problem, but it has not been tested in the context of sequence analysis. Against this background, we contribute to the literature by assessing the potential of MI in the context of sequence analyses using an empirical example. Methodologically, we draw upon the work of Brendan Halpin and extend it to additional types of missing-value patterns. Our empirical case is a sequence analysis of panel data with substantial attrition that examines the typical patterns and the persistence of sex segregation in school-to-work transitions in Switzerland. The preliminary results indicate that MI is a valuable methodology for handling missing values due to panel mortality in the context of sequence analysis. MI is especially useful in facilitating a sound interpretation of the resulting sequence types.
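
For readers unfamiliar with optimal matching: it is essentially an edit distance between state sequences with substitution and indel costs. The sketch below is a standard dynamic-programming implementation with constant, illustrative costs and a made-up example; it is not the cost setup used in the study.

```python
def optimal_matching_distance(seq_a, seq_b, indel=1.0, sub=2.0):
    """Edit distance between two state sequences with constant indel and
    substitution costs (real analyses often use data-driven costs)."""
    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + indel,      # deletion
                          d[i][j - 1] + indel,      # insertion
                          d[i - 1][j - 1] + cost)   # substitution / match
    return d[n][m]

# Two hypothetical school-to-work trajectories coded period by period:
a = ["EDU", "EDU", "VET", "VET", "WORK"]
b = ["EDU", "VET", "VET", "WORK", "WORK"]
print(optimal_matching_distance(a, b))  # 2.0
```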

Relevance: 30.00%

Abstract:

Stray light contamination considerably reduces the photometric precision for faint stars observed by low-altitude spaceborne observatories. When measuring faint objects, stray light contamination must be dealt with in order to avoid systematic impacts on low signal-to-noise images. Stray light contamination can be represented by a flat offset in CCD data. Mitigation techniques begin with a comprehensive study during the design phase, followed by the use of target pointing optimisation and post-processing methods. We present a code that aims at simulating the stray light contamination in low-Earth orbit coming from reflection of solar light by the Earth. StrAy Light SimulAtor (SALSA) is a tool intended to be used at an early stage to evaluate the effective visible region of the sky and, therefore, to optimise the observation sequence. SALSA can compute Earth stray light contamination for significant periods of time, allowing mission-wide parameters to be optimised (e.g. imposing constraints on the point source transmission function (PST) and/or on the altitude of the satellite). It can also be used to study the behaviour of the stray light at different seasons or latitudes. Given the position of the satellite with respect to the Earth and the Sun, SALSA computes the stray light at the entrance of the telescope following a geometrical technique. After characterising the illuminated region of the Earth, the portion of illuminated Earth that affects the satellite is calculated. Then, the flux of reflected solar photons is evaluated at the entrance of the telescope. Using the PST of the instrument, the final stray light contamination at the detector is calculated. The analysis tools include time series analysis of the contamination, evaluation of the sky coverage and an object visibility predictor. Effects of the South Atlantic Anomaly and of any shutdown periods of the instrument can be added. Several designs or mission concepts can easily be tested and compared. The code is not intended as a stand-alone mission designer; its mandatory inputs are a time series describing the trajectory of the satellite and the characteristics of the instrument. This software suite has been applied to the design and analysis of CHEOPS (CHaracterizing ExOPlanet Satellite), a mission that requires very high precision photometry to detect very shallow transits of exoplanets. Different altitudes and characteristics of the detector have been studied in order to find the parameters that best reduce the effect of contamination.
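
SALSA's geometric technique is only summarized above. As a very rough illustration of the kind of geometry involved, the toy sketch below estimates the Earth-reflected flux at the telescope aperture (before any PST attenuation) from the satellite and Sun positions, assuming a uniform Lambertian Earth. All constants and simplifications are assumptions and are not taken from the paper.

```python
import numpy as np

R_EARTH = 6371e3          # Earth radius [m]
SOLAR_CONST = 1361.0      # solar constant at 1 AU [W/m^2]
ALBEDO = 0.3              # assumed mean Bond albedo

def earth_stray_flux_toy(r_sat, sun_dir):
    """Very rough toy estimate of Earth-reflected solar flux at the aperture,
    before any PST attenuation. r_sat: satellite position relative to the
    Earth's centre [m]; sun_dir: unit vector from the Earth towards the Sun."""
    d = np.linalg.norm(r_sat)
    earth_to_sat = r_sat / d
    # phase angle at the Earth between the Sun and satellite directions
    phase = np.arccos(np.clip(np.dot(earth_to_sat, sun_dir), -1.0, 1.0))
    sunlit_fraction = 0.5 * (1.0 + np.cos(phase))   # lit part of the visible disc
    # solid angle subtended by the Earth at the satellite
    omega = 2.0 * np.pi * (1.0 - np.sqrt(1.0 - (R_EARTH / d) ** 2))
    # crude Lambertian estimate of the reflected flux reaching the aperture
    return ALBEDO * SOLAR_CONST * sunlit_fraction * omega / np.pi

# Example: ~700 km altitude, satellite roughly above the sub-solar point
print(earth_stray_flux_toy(np.array([7071e3, 0.0, 0.0]),
                           np.array([1.0, 0.0, 0.0])))   # order of a few hundred W/m^2
```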