39 results for Low Autocorrelation Binary Sequence Problem


Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Low back pain (LBP) is by far the most prevalent and costly musculoskeletal problem in our society today. Following the recommendations of the Multinational Musculoskeletal Inception Cohort Study (MMICS) Statement, our study aims to define outcome assessment tools for patients with acute LBP and the time point at which chronic LBP becomes manifest, and to identify patient characteristics which increase the risk of chronicity. METHODS: Patients with acute LBP will be recruited from clinics of general practitioners (GPs) in New Zealand (NZ) and Switzerland (CH). They will be assessed by postal survey at baseline and at 3, 6, and 12 weeks and at 6 months of follow-up. The primary outcome will be disability as measured by the Oswestry Disability Index (ODI); key secondary endpoints will be general health as measured by the acute SF-12 and pain as measured on the Visual Analogue Scale (VAS). A subgroup analysis of different assessment instruments and baseline characteristics will be performed using multiple linear regression models. This study aims to examine: 1. Which biomedical, psychological, social, and occupational outcome assessment tools are identifiers for the transition from acute to chronic LBP, and at which time point this transition becomes manifest. 2. Which psychosocial and occupational baseline characteristics, such as work status and period of work absenteeism, influence the course from acute to chronic LBP. 3. Differences in outcome assessment tools and baseline characteristics of patients in NZ compared with CH. DISCUSSION: This study will develop a screening tool for patients with acute LBP to be used in GP clinics to assess the risk of developing chronic LBP. In addition, biomedical, psychological, social, and occupational patient characteristics which influence the course from acute to chronic LBP will be identified. Furthermore, an appropriate time point for follow-up assessments to detect this transition will be determined. The generalizability of our findings will be enhanced by the international perspective of this study. TRIAL REGISTRATION: [Clinical Trial Registration Number, ACTRN12608000520336].

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Because of the growing life expectancy in developed countries and the exponential increase in vision loss with increasing age, a growing number of elderly persons will eventually suffer from visual impairment and blindness. This paper describes the association between self-reported vision and well-being in individuals aged 50 years and older and their families. METHODS: Using binary logistic regressions on data from the 2004 Survey of Health, Ageing and Retirement in Europe (SHARE), we analysed the association of self-reported corrected vision in general, corrected distance vision, and corrected reading vision with 11 variables capturing emotional well-being, future hopes and perspectives, and concentration on daily activities. RESULTS: For 22,486 individuals from 10 European countries, aged 64.23 +/- 10.52 years, lower vision was associated with a highly significant negative impact on all measured aspects of well-being. CONCLUSIONS: These data from a large population base in Europe provide evidence that persons with low vision have a higher probability of concentration problems during reading and entertainment; losing interest and enjoyment in their activities; feeling fatigued, irritable, sad, and tearful; having less hope for the future; and wishing for death. Effective measures of early detection, prevention, rehabilitation, education and research, as well as a holistic view of the patient, could help counter these problems, thereby improving mental and physical health and reducing the economic impact of low vision.
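
As a toy illustration of the kind of model used in the METHODS, the following Python sketch fits a binary logistic regression of a well-being indicator on a self-reported vision score and age. The data are simulated, and all variable names and effect sizes are assumptions, not SHARE results.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# Simulated stand-in for the SHARE analysis: regress a binary well-being
# indicator on a self-reported vision score plus age. Purely illustrative.
n = 1000
vision = rng.integers(1, 6, n)            # 1 = very poor ... 5 = excellent
age = rng.normal(64.2, 10.5, n)
logit = -2.0 + 0.5 * (5 - vision) + 0.02 * (age - 64)
depressed = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([vision, age]))
fit = sm.Logit(depressed.astype(int), X).fit(disp=0)
print(fit.params)   # negative vision coefficient: lower vision, higher odds
```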

Relevance:

30.00%

Publisher:

Abstract:

Low back pain (LBP) is currently the most prevalent and costly musculoskeletal problem in modern societies. Screening instruments for the identification of prognostic factors in LBP may help to identify patients with an unfavourable outcome. In this systematic review, screening instruments published between 1970 and 2007 were identified by a literature search. Nine different instruments were analysed and their different items grouped into ten structures. Finally, the predictive effectiveness of these structures was examined for the dependent variables "work status", "functional limitation", and "pain". The strongest predictors of "work status" were psychosocial and occupational structures, whereas psychological structures were dominant for "functional limitation" and "pain". Psychological and occupational factors show high reliability for the prognosis of patients with LBP; screening instruments for the identification of prognostic factors in patients with LBP should therefore include these factors as a minimum core set.

Relevance:

30.00%

Publisher:

Abstract:

Whereas a non-operative approach for hemodynamically stable patients with free intraabdominal fluid in the presence of solid organ injury is generally accepted, the presence of free fluid in the abdomen without evidence of solid organ injury presents a challenge not only for the treating emergency physician but also for the surgeon in charge. Despite recent advances in imaging modalities, with multi-detector computed tomography (CT), with or without contrast agent, usually being the imaging method of choice, diagnosis and interpretation of the results remain difficult. While some studies conclude that CT is highly accurate and relatively specific at diagnosing mesenteric and hollow viscus injury, other studies deem CT to be unreliable. These differences may in part be due to the experience and the interpretation of the radiologist and/or the treating physician or surgeon. A search of the literature has made it apparent that there is no straightforward answer to the question of what to do with patients with free intraabdominal fluid on CT scanning but without signs of solid organ injury. In hemodynamically unstable patients, free intraabdominal fluid in the absence of solid organ injury usually mandates immediate surgical intervention. For patients with blunt abdominal trauma and more than just a trace of free intraabdominal fluid, or for patients with signs of peritonitis, the threshold for a surgical exploration - preferably by a laparoscopic approach - should be low. Based on the available information, we aim to provide the reader with an overview of the current literature, with specific emphasis on diagnostic and therapeutic approaches to this problem, and suggest a possible algorithm which might help with the adequate treatment of such patients.

Relevance:

30.00%

Publisher:

Abstract:

We investigate the problem of distributed sensor failure detection in networks with a small number of defective sensors, whose measurements differ significantly from those of neighboring sensors. We build on the sparse nature of the binary sensor failure signals to propose a novel distributed detection algorithm based on gossip mechanisms and on Group Testing (GT), where the latter has so far been used in centralized detection problems. The new distributed GT algorithm estimates the set of scattered defective sensors with a low-complexity distance decoder from a small number of linearly independent binary messages exchanged by the sensors. We first consider networks with one defective sensor and determine the minimal number of linearly independent messages needed for its detection with high probability. We then extend our study to the detection of multiple defective sensors by appropriately modifying the message exchange protocol and the decoding procedure. We show that, for small and medium-sized networks, the number of messages required for successful detection is actually smaller than the minimal number computed theoretically. Finally, simulations demonstrate that the proposed method outperforms methods based on random walks in terms of both detection performance and convergence rate.
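
As a concrete illustration of the group-testing idea behind such a decoder, the minimal, centralized Python sketch below recovers a small defective set from binary pooled outcomes with a Hamming-distance decoder. The pooling matrix, all parameters, and the brute-force search are illustrative assumptions; the paper's distributed gossip protocol and message construction are not reproduced here.

```python
from itertools import combinations

import numpy as np

rng = np.random.default_rng(0)

# Toy parameters (assumptions): n sensors, k defective, m pooled outcomes.
n, k, m = 100, 2, 30

defective = np.zeros(n, dtype=int)
defective[rng.choice(n, size=k, replace=False)] = 1

# Random binary pooling matrix: row i marks the sensors whose values are
# aggregated into outcome i (1 = "some pooled sensor is defective").
W = (rng.random((m, n)) < 0.1).astype(int)
y = (W @ defective > 0).astype(int)      # noiseless OR-channel outcomes

def predicted(support):
    """Outcomes that would be observed if `support` were the defective set."""
    return (W[:, list(support)].sum(axis=1) > 0).astype(int)

# Distance decoder: pick the size-k support whose predicted outcomes lie
# closest in Hamming distance to the observed ones (brute force here; the
# paper's decoder is low-complexity and distributed).
best = min(combinations(range(n), k),
           key=lambda S: int(np.sum(predicted(S) != y)))
print("true:", sorted(np.flatnonzero(defective)), "decoded:", sorted(best))
```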

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to investigate the role of the fronto–striatal system in implicit task sequence learning. We tested the performance of patients with compromised functioning of the fronto–striatal loops, that is, patients with Parkinson's disease and patients with lesions in the ventromedial or dorsolateral prefrontal cortex. We also tested amnesic patients with lesions either to the basal forebrain/orbitofrontal cortex or to thalamic/medio-temporal regions. We used a task sequence learning paradigm involving the presentation of a sequence of categorical binary-choice decision tasks. After several blocks of training, the sequence hidden in the order of tasks was replaced by a pseudo-random sequence. Learning (i.e., sensitivity to the ordering) was assessed by measuring whether this change disrupted performance. Although all the patients were able to perform the decision tasks quite easily, those with lesions to the fronto–striatal loops (i.e., patients with Parkinson's disease, patients with lesions in the ventromedial or dorsolateral prefrontal cortex, and amnesic patients with lesions to the basal forebrain/orbitofrontal cortex) did not show any evidence of implicit task sequence learning. In contrast, amnesic patients with lesions to thalamic/medio-temporal regions showed intact sequence learning. Together, these results indicate that the integrity of the fronto–striatal system is a prerequisite for implicit task sequence learning.

Relevance:

30.00%

Publisher:

Abstract:

Storing and recalling spiking sequences is a general problem the brain needs to solve. It is, however, unclear what type of biologically plausible learning rule is suited to learning a wide class of spatiotemporal activity patterns in a robust way. Here we consider a recurrent network of stochastic spiking neurons composed of both visible and hidden neurons. We derive a generic learning rule that is matched to the neural dynamics by minimizing an upper bound on the Kullback–Leibler divergence from the target distribution to the model distribution. The derived learning rule is consistent with spike-timing-dependent plasticity in that a presynaptic spike preceding a postsynaptic spike elicits potentiation, while depression emerges otherwise. Furthermore, the learning rule for synapses that target visible neurons can be matched to the recently proposed voltage-triplet rule. The learning rule for synapses that target hidden neurons is modulated by a global factor, which shares properties with astrocytes and gives rise to testable predictions.
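
The kind of bound referred to here can be illustrated with the standard variational argument; the notation below (target p*, model p_theta with hidden units h, auxiliary posterior q) is generic and not taken from the paper.

```latex
% Generic variational upper bound (illustrative notation, not the paper's):
% p^*(v) is the target distribution over visible spike patterns,
% p_\theta(v,h) the model, with hidden activity h marginalized out.
\mathrm{KL}\!\left(p^{*}\,\middle\|\,p_{\theta}\right)
  = \sum_{v} p^{*}(v)\,\log\frac{p^{*}(v)}{\sum_{h} p_{\theta}(v,h)}
  \;\le\;
  \sum_{v,h} p^{*}(v)\,q(h\mid v)\,
  \log\frac{p^{*}(v)\,q(h\mid v)}{p_{\theta}(v,h)}
```

Gradient descent on such a tractable right-hand side with respect to the synaptic weights is what yields a weight update of the STDP-like form described above; the inequality follows from Jensen's inequality and is tight when q equals the model posterior.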

Relevance:

30.00%

Publisher:

Abstract:

We consider the problem of fitting a union of subspaces to a collection of data points drawn from one or more subspaces and corrupted by noise and/or gross errors. We pose this problem as a non-convex optimization problem, where the goal is to decompose the corrupted data matrix as the sum of a clean and self-expressive dictionary plus a matrix of noise and/or gross errors. By self-expressive we mean a dictionary whose atoms can be expressed as linear combinations of themselves with low-rank coefficients. In the case of noisy data, our key contribution is to show that this non-convex matrix decomposition problem can be solved in closed form from the SVD of the noisy data matrix. The solution involves a novel polynomial thresholding operator on the singular values of the data matrix, which requires minimal shrinkage. For one subspace, a particular case of our framework leads to classical PCA, which requires no shrinkage. For multiple subspaces, the low-rank coefficients obtained by our framework can be used to construct a data affinity matrix from which the clustering of the data according to the subspaces can be obtained by spectral clustering. In the case of data corrupted by gross errors, we solve the problem using an alternating minimization approach, which combines our polynomial thresholding operator with the more traditional shrinkage-thresholding operator. Experiments on motion segmentation and face clustering show that our framework performs on par with state-of-the-art techniques at a reduced computational cost.
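
To make the pipeline concrete, here is a minimal Python sketch of the SVD-based route from noisy union-of-subspaces data to a spectral clustering. Plain rank truncation stands in for the paper's polynomial thresholding operator, whose exact form is not reproduced here, and all data and parameters are toy assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(1)

# Toy data: two one-dimensional subspaces in R^5, 40 noisy points each.
def subspace_points(dim=5, n=40, noise=0.05):
    basis = rng.standard_normal((dim, 1))
    return basis @ rng.standard_normal((1, n)) + noise * rng.standard_normal((dim, n))

X = np.hstack([subspace_points(), subspace_points()])   # 5 x 80 data matrix

# Rank truncation of the SVD stands in for polynomial thresholding.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = 2                                  # assumed total subspace dimension
C = Vt[:r].T @ Vt[:r]                  # low-rank self-expressive coefficients

A = np.abs(C)                          # data affinity matrix from coefficients
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(A)
print(labels)                          # points 0-39 vs. 40-79 separate
```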

Relevance:

30.00%

Publisher:

Abstract:

Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Under the “resources-on-demand” and “pay-as-you-go” models, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, in which the NGS data are processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks in which the NGS sequences can be processed independently of one another. We also provide the elastream package, which supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
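
The core of any such streaming scheme, overlapping transfer with computation, can be sketched with a producer–consumer queue. fetch_chunks and align below are hypothetical placeholders, not elastream's API.

```python
import queue
import threading

# Minimal sketch of the streaming idea: process NGS reads while they are
# still being transferred, instead of waiting for the full upload.

def fetch_chunks(chunk_queue):
    for i in range(5):                     # stands in for a network transfer
        chunk_queue.put(f"@read{i}\nACGT\n+\nIIII\n")
    chunk_queue.put(None)                  # sentinel: transfer finished

def align(chunk):
    return chunk.count("ACGT")             # trivial stand-in for real analysis

def worker(chunk_queue, results):
    while (chunk := chunk_queue.get()) is not None:
        results.append(align(chunk))       # runs concurrently with the fetch

chunks, results = queue.Queue(maxsize=2), []
t = threading.Thread(target=worker, args=(chunks, results))
t.start()
fetch_chunks(chunks)                       # producer: the ongoing transfer
t.join()
print(sum(results), "reads processed while streaming")
```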

Relevance:

30.00%

Publisher:

Abstract:

Information about fluid evolution and solute transport in a low-permeability metamorphic rock sequence has been obtained by comparing chloride concentrations and chlorine isotope ratios of pore water, groundwater, and fluid inclusions. The similarity of δ37Cl values in fluid inclusions and groundwater suggests a closed-system evolution during the metamorphic overprint, and signatures established at this time appear to form the initial conditions for chloride transport after exhumation of the rock sequence.

Relevance:

30.00%

Publisher:

Abstract:

The proliferation of multimedia content and the demand for new audio and video services have fostered the development of a new era based on multimedia information, which allowed the evolution of Wireless Multimedia Sensor Networks (WMSNs) and also Flying Ad-Hoc Networks (FANETs). Live multimedia services require real-time video transmission with a low frame loss rate, tolerable end-to-end delay, and tolerable jitter to support video dissemination with Quality of Experience (QoE) support. Hence, a key principle in a QoE-aware approach is to protect high-priority frames, transmitting them with a minimal packet loss ratio and minimal network overhead. Moreover, multimedia content must be transmitted from a given source to the destination via intermediate nodes with high reliability in a large-scale scenario. The routing service must cope with dynamic topologies caused by node failure or mobility, as well as with wireless channel changes, in order to continue to operate during multimedia transmission despite these dynamics. Finally, understanding user satisfaction when watching a video sequence is becoming a key requirement for the delivery of multimedia content with QoE support. With this goal in mind, solutions involving multimedia transmission must take the video characteristics into account to improve the quality of the delivered video. The main research contributions of this thesis are driven by the research question of how to provide multimedia distribution with high energy-efficiency, reliability, robustness, scalability, and QoE support over wireless ad hoc networks. The thesis addresses several problem domains with contributions on different layers of the communication stack. At the application layer, we introduce a QoE-aware packet redundancy mechanism, sketched below, to reduce the impact of the unreliable and lossy nature of wireless environments on the dissemination of live multimedia content. At the network layer, we introduce two routing protocols, namely the video-aware Multi-hop and multi-path hierarchical routing protocol for Efficient VIdeo transmission in static WMSN scenarios (MEVI), and the cross-layer link-quality- and geographical-aware beaconless OR protocol for multimedia FANET scenarios (XLinGO). Both protocols enable multimedia dissemination with energy-efficiency, reliability, and QoE support, achieved by combining multiple cross-layer metrics in the routing decision in order to establish reliable routes.
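
A minimal Python sketch of a priority-based redundancy policy of this flavor follows; the rules and numbers are illustrative assumptions, not the thesis' actual mechanism.

```python
# Frame types that hurt perceived quality most when lost (I-frames) get
# extra copies; numbers are illustrative only.
REDUNDANCY = {"I": 2, "P": 1, "B": 0}    # extra copies per frame type

def packets_for(frame_type, payload):
    """Emit one packet per copy of the frame, priority-dependent."""
    copies = 1 + REDUNDANCY.get(frame_type, 0)
    return [(frame_type, payload)] * copies

gop = ["I", "P", "B", "B", "P", "B"]     # one toy group of pictures
stream = [pkt for i, f in enumerate(gop) for pkt in packets_for(f, f"frame{i}")]
print(len(stream), "packets sent for", len(gop), "frames")
```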

Relevance:

30.00%

Publisher:

Abstract:

Sequence analysis and optimal matching are useful heuristic tools for the descriptive analysis of heterogeneous individual pathways such as educational careers, job sequences, or patterns of family formation. However, to date it remains unclear how to handle the inevitable problems caused by missing values in such analyses. Multiple Imputation (MI) offers a possible solution for this problem, but it has not been tested in the context of sequence analysis. Against this background, we contribute to the literature by assessing the potential of MI in the context of sequence analysis using an empirical example. Methodologically, we draw upon the work of Brendan Halpin and extend it to additional types of missing value patterns. Our empirical case is a sequence analysis of panel data with substantial attrition, examining the typical patterns and the persistence of sex segregation in school-to-work transitions in Switzerland. The preliminary results indicate that MI is a valuable methodology for handling missing values due to panel mortality in the context of sequence analysis. MI is especially useful in facilitating a sound interpretation of the resulting sequence types.
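
The shape of an MI-based sequence analysis can be sketched as follows; the imputation model, the Hamming stand-in for an optimal-matching distance, and all data are toy assumptions, not the study's procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy monthly state sequences ("E" = employed, "U" = unemployed), with
# panel dropout coded as None from some month onward.
seqs = [list("EEEUUEEE"), list("EEEE") + [None] * 4, list("UUEE") + [None] * 4]

def impute_once(seq, p_stay=0.8):
    """Fill the missing tail by a simple first-order Markov draw (assumed model)."""
    out = seq[:]
    for t, s in enumerate(out):
        if s is None:
            prev = out[t - 1]
            out[t] = prev if rng.random() < p_stay else ("U" if prev == "E" else "E")
    return out

def hamming(a, b):
    """Stand-in for an optimal-matching distance between two sequences."""
    return sum(x != y for x, y in zip(a, b))

M = 5                                   # number of imputations
for m in range(M):
    completed = [impute_once(s) for s in seqs]
    D = np.array([[hamming(a, b) for b in completed] for a in completed])
    # ...cluster D and pool the resulting typologies across the M imputations...
    print(f"imputation {m}: mean pairwise distance {D.mean():.2f}")
```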

Relevance:

30.00%

Publisher:

Abstract:

Stray light contamination considerably reduces the precision of photometry of faint stars for low-altitude spaceborne observatories. When measuring faint objects, the necessity of coping with stray light contamination arises in order to avoid systematic impacts on low signal-to-noise images. Stray light contamination can be represented by a flat offset in CCD data. Mitigation techniques begin with a comprehensive study during the design phase, followed by the use of target pointing optimisation and post-processing methods. We present a code that aims at simulating the stray light contamination in low-Earth orbit coming from the reflection of solar light by the Earth. StrAy Light SimulAtor (SALSA) is a tool intended to be used at an early stage to evaluate the effectively visible region of the sky and, therefore, to optimise the observation sequence. SALSA can compute Earth stray light contamination for significant periods of time, allowing mission-wide parameters to be optimised (e.g., imposing constraints on the point source transmission function (PST) and/or on the altitude of the satellite). It can also be used to study the behaviour of the stray light at different seasons or latitudes. Given the position of the satellite with respect to the Earth and the Sun, SALSA computes the stray light at the entrance of the telescope following a geometrical technique. After characterising the illuminated region of the Earth, the portion of the illuminated Earth that affects the satellite is calculated. Then, the flux of reflected solar photons is evaluated at the entrance of the telescope. Using the PST of the instrument, the final stray light contamination at the detector is calculated. The analysis tools include time series analysis of the contamination, evaluation of the sky coverage, and an object visibility predictor. Effects of the South Atlantic Anomaly and of any shutdown periods of the instrument can be added. Several designs or mission concepts can easily be tested and compared. The code is not intended as a stand-alone mission designer. Its mandatory inputs are a time series describing the trajectory of the satellite and the characteristics of the instrument. This software suite has been applied to the design and analysis of CHEOPS (CHaracterizing ExOPlanet Satellite). This mission requires very high precision photometry to detect very shallow transits of exoplanets. Different altitudes and characteristics of the detector have been studied in order to find the best parameters that reduce the effect of contamination.
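
For orientation, the geometric step that such a simulator automates can be caricatured in a few lines of Python; the flat-albedo Lambertian patch model and every number below are rough assumptions, not SALSA's algorithm.

```python
import numpy as np

# Back-of-the-envelope sketch: is the sub-satellite point sunlit, and what
# order of magnitude of reflected flux reaches the telescope aperture?
R_EARTH = 6371e3                    # m
ALBEDO = 0.3                        # mean Bond albedo (assumed constant)
SOLAR_CONST = 1361.0                # W/m^2 at 1 AU

def sub_point_sunlit(sat_ecef, sun_dir):
    """Very coarse test: is the sub-satellite point on the day side?"""
    ground = sat_ecef / np.linalg.norm(sat_ecef) * R_EARTH
    return ground @ sun_dir > 0     # positive: sunlit side of the terminator

def reflected_flux(sat_alt_m):
    """Crude flux from a sunlit Lambertian patch directly below the satellite."""
    patch_area = np.pi * (500e3) ** 2        # assumed effective patch, m^2
    radiance = ALBEDO * SOLAR_CONST / np.pi  # Lambertian radiance, W/m^2/sr
    solid_angle = patch_area / sat_alt_m ** 2
    return radiance * solid_angle            # W/m^2 at the aperture

sat = np.array([R_EARTH + 700e3, 0.0, 0.0]) # 700 km altitude, toy geometry
sun = np.array([1.0, 0.0, 0.0])
print(sub_point_sunlit(sat, sun), f"{reflected_flux(700e3):.1f} W/m^2")
```

The point of the sketch is only the structure: geometry test first, then a radiometric estimate that a real tool would attenuate by the instrument's point source transmission function.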

Relevance:

30.00%

Publisher:

Abstract:

The fragmentation of electrospray-generated multiply deprotonated RNA and mixed-sequence RNA/DNA pentanucleotides upon low-energy collision-induced dissociation (CID) in a hybrid quadrupole time-of-flight mass spectrometer was investigated. The goal of unambiguous sequence identification of mixed-sequence RNA/DNA oligonucleotides requires a detailed understanding of the gas-phase dissociation of this class of compounds. The two major dissociation events, base loss and backbone fragmentation, are discussed, and the unique fragmentation behavior of oligoribonucleotides is demonstrated. Backbone fragmentation of the all-RNA pentanucleotides is characterized by abundant c-ions and their complementary y-ions as the major sequence-defining fragment ion series. In contrast to the dissociation of oligodeoxyribonucleotides, where backbone fragmentation is initiated by the loss of a nucleobase which subsequently leads to the formation of the w- and [a-base]-ions, backbone dissociation of oligoribonucleotides is essentially decoupled from base loss. The different behavior of RNA and DNA oligonucleotides is related to the presence of the 2'-hydroxyl substituent, which is the only structural difference between the DNA and RNA pentanucleotides studied. CID of mixed-sequence RNA/DNA pentanucleotides results in a combination of the nucleotide-typical backbone fragmentation products, with abundant w-fragment ions generated by cleavage of the phosphodiester backbone adjacent to the deoxy building blocks, whereas backbone cleavage adjacent to ribonucleotides induces the formation of c- and y-ions.

Relevance:

30.00%

Publisher:

Abstract:

The intensity of long-range correlations observed with the classical HMBC pulse sequence, which uses a static optimization of the long-range coupling delay, is directly related to the size of the coupling constant; the delay is therefore often set as a compromise. As such, some long-range correlations might appear with reduced intensity or might even be completely absent from the spectra. After a short introduction, this third manuscript will give a detailed review of some selected HMBC variants dedicated to improving the detection of long-range correlations, such as the ACCORD-HMBC, CIGAR-HMBC, and broadband HMBC experiments. Practical details about the accordion optimization, which affords a substantial improvement in both the number and the intensity of the observed long-range correlations but introduces a modulation in F1, will be discussed. The incorporation of the so-called constant-time variable delay in the CIGAR-HMBC experiment, which can trigger or even completely suppress the 1H–1H coupling modulation inherent to the utilization of the accordion principle, will also be discussed. The broadband HMBC scheme, which consists of recording a series of HMBC spectra with different delays set as a function of the long-range heteronuclear coupling constant ranges and the transverse relaxation times T2, is also examined.
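
For reference, the static delay optimization mentioned above follows the standard textbook matching of the defocusing delay to an assumed coupling size; this relation is general knowledge, not specific to the variants reviewed here.

```latex
% Standard long-range delay setting for HMBC (textbook relation):
\Delta = \frac{1}{2\,{}^{n}\!J_{\mathrm{CH}}},
\qquad\text{e.g.}\quad
{}^{n}\!J_{\mathrm{CH}} = 8\ \mathrm{Hz}
\;\Rightarrow\;
\Delta = \frac{1}{2 \times 8\ \mathrm{Hz}} = 62.5\ \mathrm{ms}
```

Because long-range couplings typically span roughly 2 to 15 Hz, any single choice of Delta over- or under-emphasizes part of that range, which is exactly the compromise the accordion-type variants are designed to relax.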