842 results for "data movement problem"


Relevance: 30.00%

Abstract:

BACKGROUND Knee pain is associated with radiographic knee osteoarthritis, but the relationships between physical examination, pain and radiographic features are unclear. OBJECTIVE To examine whether deficits in knee extension or flexion were associated with radiographic severity and pain during clinical examination in persons with knee pain or radiographic features of osteoarthritis. DESIGN Cross-sectional data of the Somerset and Avon Survey of Health (SASH) cohort study. METHODS Participants with knee pain or radiographic features of osteoarthritis were included. We assessed the range of passive knee flexion and extension, pain on movement, and Kellgren and Lawrence (K/L) grades. Odds ratios were calculated for the association of range of motion with pain and with radiographic severity. RESULTS/FINDINGS Of 1117 participants with a clinical assessment, 805 participants and 1530 knees had complete data and were used for this analysis. Pain and radiographic changes were associated with limited range of motion. In knees with pain on passive movement, extension and flexion changed by -1.4° (95% CI -2.2 to -0.5) and -1.6° (95% CI -2.8 to -0.4) per K/L grade, whereas in knees without pain the corresponding changes were -0.3° (95% CI -0.6 to -0.1) for extension and -1.1° (95% CI -1.8 to -0.3) for flexion. The interaction of pain with K/L was significant for extension (p = 0.021) but not for flexion (p = 0.333). CONCLUSIONS Pain during passive movement, which may be an indicator of reversible soft-tissue changes (e.g. changes reversible through physical therapy), is independently associated with reduced flexion and extension of the knee.
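As a rough illustration of the kind of model behind these per-grade estimates, the sketch below fits a linear model of passive extension on K/L grade, pain on movement, and their interaction, the term whose p-value (0.021 for extension) is reported above. The file and column names (`extension_deg`, `kl_grade`, `pain_on_movement`) are hypothetical placeholders, and this is not the authors' exact specification (which would also have to handle knees clustered within participants).

```python
# Sketch: linear model of knee extension on K/L grade, pain, and their
# interaction. File and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sash_knees.csv")  # hypothetical: one row per knee

# kl_grade * pain_on_movement expands to both main effects plus the
# kl_grade:pain_on_movement interaction discussed in the abstract.
model = smf.ols("extension_deg ~ kl_grade * pain_on_movement", data=df).fit()
print(model.summary())

# Change in extension per K/L grade in pain-free vs. painful knees:
print(model.params["kl_grade"],
      model.params["kl_grade"] + model.params["kl_grade:pain_on_movement"])
```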

Relevance: 30.00%

Abstract:

Background: Disturbed interpersonal communication is a core problem in schizophrenia. Patients with schizophrenia often appear disconnected and "out of sync" when interacting with others. This may involve perception, cognition, motor behavior, and nonverbal expressiveness. Although well known from clinical observation, mainstream research has neglected this area, and corresponding theoretical concepts, statistical methods, and assessment tools have been missing. Recent research, however, has shown that objective, video-based measures can be used to reliably quantify nonverbal behavior in schizophrenia, and newly developed algorithms allow movement synchrony to be calculated. We previously found that the objective amount of movement of patients with schizophrenia during social interactions was closely related to their symptom profiles (Kupper et al., 2010). Over and above the mere amount of movement, the degree of synchrony between patients and healthy interactants may be indicative of various problems in the domain of interpersonal communication and social cognition. Methods: Building on our earlier study, head movement synchrony was assessed objectively (using Motion Energy Analysis, MEA) in 378 brief, videotaped role-play scenes involving 27 stabilized outpatients diagnosed with paranoid-type schizophrenia. Results: Lower head movement synchrony was indicative of symptoms (negative symptoms, but also conceptual disorganization and lack of insight), verbal memory, patients' self-evaluation of competence, and social functioning. Many of these relationships remained significant even when corrected for the amount of movement of the patients. Conclusion: The results suggest that nonverbal synchrony may be an objective and sensitive indicator of symptom severity, cognition, and social functioning.
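Motion Energy Analysis reduces each person's movement to a time series of frame-to-frame pixel change inside a region of interest; synchrony is then commonly quantified from windowed cross-correlations of the two series. The sketch below shows that general idea with OpenCV and NumPy; the region coordinates, window length, and lag range are arbitrary placeholders, not the parameters used in the study.

```python
# Sketch of motion-energy extraction and a simple synchrony score.
# Regions, window length and lag range are illustrative placeholders.
import cv2
import numpy as np

def motion_energy(video_path, roi):
    """Sum of absolute frame differences inside a region of interest."""
    x, y, w, h = roi
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    prev = cv2.cvtColor(prev[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    energy = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
        energy.append(np.abs(gray.astype(int) - prev.astype(int)).sum())
        prev = gray
    cap.release()
    return np.array(energy, dtype=float)

def synchrony(a, b, win=150, max_lag=30):
    """Mean of peak windowed cross-correlations between two series."""
    peaks = []
    for start in range(0, min(len(a), len(b)) - win, win):
        wa, wb = a[start:start+win], b[start:start+win]
        wa = (wa - wa.mean()) / (wa.std() + 1e-9)
        wb = (wb - wb.mean()) / (wb.std() + 1e-9)
        corrs = [np.corrcoef(wa[max_lag:-max_lag],
                             np.roll(wb, lag)[max_lag:-max_lag])[0, 1]
                 for lag in range(-max_lag, max_lag + 1)]
        peaks.append(max(corrs))
    return float(np.mean(peaks))

patient = motion_energy("roleplay.avi", roi=(50, 30, 200, 200))   # hypothetical head regions
partner = motion_energy("roleplay.avi", roi=(400, 30, 200, 200))
print("head movement synchrony:", synchrony(patient, partner))
```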

Relevance: 30.00%

Abstract:

In studies of environmental awareness and environmental behavior, the central variables are usually measured via self-reports and item batteries. It is questionable whether the items used to measure ecological behavior capture anything more than "symbolic behavior". Bodenstein, Spiller and Elbers (1997) recently proposed an index based on a household's actual energy and material consumption. We compare this concept with other scales and examine empirically, using data from the survey "Umweltbewusstsein in Deutschland 1998" (Environmental Awareness in Germany 1998), to what extent the results of regression models of environmental behavior depend on the scales they are based on. It turns out that there are considerable differences, particularly with respect to the central determinants "environmental awareness" and "income".

Relevance: 30.00%

Abstract:

In this paper we propose a new fully-automatic method for localizing and segmenting 3D intervertebral discs from MR images, where the two problems are solved in a unified data-driven regression and classification framework. We estimate the output (image displacements for localization, or fg/bg labels for segmentation) of image points by exploiting both training data and geometric constraints simultaneously. The problem is formulated in a unified objective function which is then solved globally and efficiently. We validate our method on MR images of 25 patients. Taking manually labeled data as the ground truth, our method achieves a mean localization error of 1.3 mm, a mean Dice metric of 87%, and a mean surface distance of 1.3 mm. Our method can be applied to other localization and segmentation tasks.
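The abstract describes estimating, for each image point, either a displacement to the disc centre (localization) or a foreground/background label (segmentation) from training data together with geometric constraints. The sketch below shows only the simplest data-driven ingredient of such a pipeline: regressing the displacement of a test patch as a similarity-weighted average of the displacements stored with its nearest training patches. It is a toy illustration under that assumption, not the unified objective function solved in the paper, and the `.npy` file names are hypothetical.

```python
# Toy sketch of data-driven displacement regression: a test patch's
# displacement to the disc centre is a weighted average of the
# displacements of its nearest training patches.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical training data: patch feature vectors and the 3D displacement
# from each patch centre to the annotated disc centre.
train_features = np.load("train_patch_features.npy")      # (N, d)
train_displacements = np.load("train_displacements.npy")  # (N, 3)

knn = NearestNeighbors(n_neighbors=10).fit(train_features)

def predict_displacement(test_feature):
    dist, idx = knn.kneighbors(test_feature[None, :])
    w = np.exp(-dist[0] / (dist[0].mean() + 1e-9))  # similarity weights
    w /= w.sum()
    return (w[:, None] * train_displacements[idx[0]]).sum(axis=0)

# Each sampled test patch then votes for a centre at
# patch_position + predict_displacement(feature); the mode of the votes
# gives the disc centre estimate.
```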

Relevance: 30.00%

Abstract:

In this paper, we propose a new method for fully-automatic landmark detection and shape segmentation in X-ray images. To detect landmarks, we estimate the displacements from some randomly sampled image patches to the (unknown) landmark positions, and then we integrate these predictions via a voting scheme. Our key contribution is a new algorithm for estimating these displacements. Different from other methods where each image patch independently predicts its displacement, we jointly estimate the displacements from all patches together in a data-driven way, by considering not only the training data but also geometric constraints on the test image. The displacement estimation is formulated as a convex optimization problem that can be solved efficiently. Finally, we use the sparse shape composition model as a priori information to regularize the landmark positions and thus generate the segmented shape contour. We validate our method on X-ray image datasets of three different anatomical structures: complete femur, proximal femur and pelvis. Experiments show that our method is accurate and robust in landmark detection, and, combined with the shape model, gives a better or comparable performance in shape segmentation compared to state-of-the-art methods. Finally, a preliminary study using CT data shows the extensibility of our method to 3D data.
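The voting step described above can be illustrated independently of how the displacements are estimated: each sampled patch predicts a displacement to the unknown landmark, casts a vote at the implied position, and the landmark is read off where the vote map peaks. A minimal 2D sketch with synthetic displacement predictions (in the paper these come from the joint convex estimation step):

```python
# Minimal sketch of the voting scheme: accumulate patch votes in an
# image-sized map and take the peak as the landmark position.
import numpy as np

def vote_for_landmark(patch_positions, predicted_displacements, image_shape, sigma=3.0):
    """patch_positions, predicted_displacements: (N, 2) arrays of (row, col)."""
    votes = np.zeros(image_shape, dtype=float)
    targets = patch_positions + predicted_displacements
    rows = np.arange(image_shape[0])[:, None]
    cols = np.arange(image_shape[1])[None, :]
    for r, c in targets:
        votes += np.exp(-((rows - r) ** 2 + (cols - c) ** 2) / (2 * sigma ** 2))
    return np.unravel_index(np.argmax(votes), image_shape)  # (row, col) of peak

# Synthetic example: noisy votes scattered around a true landmark at (120, 80).
rng = np.random.default_rng(0)
pos = rng.uniform(0, 200, size=(50, 2))
disp = np.array([120.0, 80.0]) - pos + rng.normal(0, 2, size=(50, 2))
print(vote_for_landmark(pos, disp, image_shape=(200, 200)))
```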

Relevance: 30.00%

Abstract:

This paper addresses the problem of fully-automatic localization and segmentation of 3D intervertebral discs (IVDs) from MR images. Our method contains two steps: we first localize the center of each IVD, and then segment the IVDs by classifying image pixels around each disc center as foreground (disc) or background. The disc localization is done by estimating the image displacements from a set of randomly sampled 3D image patches to the disc center. The image displacements are estimated by jointly optimizing the training and test displacement values in a data-driven way, taking into consideration both the training data and the geometric constraint on the test image. After the disc centers are localized, we segment the discs by classifying image pixels around the disc centers as background or foreground. The classification is done with a data-driven approach similar to the one used for localization, but in this segmentation step we aim to estimate the foreground/background probability of each pixel instead of the image displacements. In addition, an extra neighborhood smoothness constraint is introduced to enforce the local smoothness of the label field. Our method is validated on 3D T2-weighted turbo spin echo MR images of 35 patients from two different studies. Experiments show that, compared to the state of the art, our method achieves better or comparable results. Specifically, we achieve for localization a mean error of 1.6-2.0 mm, and for segmentation a mean Dice metric of 85%-88% and a mean surface distance of 1.3-1.4 mm.
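The last step, turning per-pixel foreground probabilities into a smooth label field, can be illustrated with a simple iterated conditional modes (ICM) pass that trades the per-pixel probability term against a Potts-style neighborhood agreement term. This is a generic stand-in for the smoothness constraint mentioned above, not the paper's exact formulation; the probability volume and its file name are assumed.

```python
# Sketch: smooth a foreground/background label field with a few ICM sweeps,
# balancing per-pixel log-probabilities against neighbour agreement.
import numpy as np

def smooth_labels(fg_prob, beta=0.8, n_iter=5):
    """fg_prob: 3D array of foreground probabilities in [0, 1]."""
    eps = 1e-6
    unary_fg = np.log(fg_prob + eps)        # reward for labelling foreground
    unary_bg = np.log(1.0 - fg_prob + eps)  # reward for labelling background
    labels = (fg_prob > 0.5).astype(np.int8)
    for _ in range(n_iter):
        # number of foreground neighbours along the 6-connected axes
        nb_fg = sum(np.roll(labels, shift, axis)
                    for axis in range(3) for shift in (-1, 1)).astype(float)
        score_fg = unary_fg + beta * nb_fg
        score_bg = unary_bg + beta * (6.0 - nb_fg)
        labels = (score_fg > score_bg).astype(np.int8)
    return labels

fg_prob = np.load("disc_fg_prob.npy")   # hypothetical probability volume
mask = smooth_labels(fg_prob)
```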

Relevance: 30.00%

Abstract:

The interaction between sibling species that share a zone of contact is a multifaceted relationship affected by climate change [1, 2]. Between sibling species, interactions may occur at whole-organism (direct or indirect competition) or genomic (hybridization and introgression) levels [3-5]. Tracking hybrid zone movements can provide insights about influences of environmental change on species interactions [1]. Here, we explore the extent and mechanism of movement of the contact zone between black-capped chickadees (Poecile atricapillus) and Carolina chickadees (Poecile carolinensis) at whole-organism and genomic levels. We find strong evidence that winter temperatures limit the northern extent of P. carolinensis by demonstrating a current-day association between the range limit of this species and minimum winter temperatures. We further show that this temperature limitation has been consistent over time, because we are able to accurately hindcast the previous northern range limit under earlier climate conditions. Using genomic data, we confirm northward movement of this contact zone over the past decade and highlight temporally consistent, differential but limited, geographic introgression of alleles. Our results provide an informative example of the influence of climate change on a contact zone between sibling species.

Relevance: 30.00%

Abstract:

In this work, we propose a novel network coding enabled NDN architecture for the delivery of scalable video. Our scheme utilizes network coding in order to address the problem that arises in the original NDN protocol, where optimal use of the bandwidth and caching resources necessitates the coordination of the forwarding decisions. To optimize the performance of the proposed network coding based NDN protocol and render it appropriate for the transmission of scalable video, we devise a novel rate allocation algorithm that decides on the optimal rates of Interest messages sent by clients and intermediate nodes. This algorithm guarantees that the achieved flow of Data objects will maximize the average quality of the video delivered to the client population. To support the handling of Interest messages and Data objects when intermediate nodes perform network coding, we modify the standard NDN protocol and introduce the use of Bloom filters, which efficiently store additional information about the Interest messages and Data objects. The proposed architecture is evaluated for the transmission of scalable video over PlanetLab topologies. The evaluation shows that the proposed scheme performs very close to the optimal performance.
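The Bloom filter mentioned above is a standard probabilistic set-membership structure: each element is hashed to several bit positions, and a query reports "possibly present" only if all of its positions are set, so false positives are possible but false negatives are not. A minimal, generic sketch; the hashing scheme, sizes, and the example names are arbitrary and unrelated to the paper's protocol fields.

```python
# Generic Bloom filter sketch: k hash positions per element.
# Sizes and hash construction are illustrative only.
import hashlib

class BloomFilter:
    def __init__(self, num_bits=1024, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits)

    def _positions(self, item):
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.num_bits

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        return all(self.bits[p] for p in self._positions(item))

# e.g. summarising which Data objects a node has already combined:
bf = BloomFilter()
bf.add("/video/layer1/segment42")       # hypothetical content name
print("/video/layer1/segment42" in bf)  # True
print("/video/layer2/segment42" in bf)  # almost certainly False
```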

Relevance: 30.00%

Abstract:

We consider the problem of nonparametric estimation of a concave regression function F. We show that the supremum distance between the least squares estimator and F on a compact interval is typically of order (log(n)/n)^(2/5). This entails rates of convergence for the estimator's derivative. Moreover, we discuss the impact of additional constraints on F such as monotonicity and pointwise bounds. Then we apply these results to the analysis of current status data, where the distribution function of the event times is assumed to be concave.
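The least squares estimator over concave functions can be computed as a finite-dimensional convex program: fit values at the design points subject to the slopes between consecutive points being non-increasing. A minimal sketch using cvxpy on synthetic data; the additional constraints discussed above (monotonicity, pointwise bounds) would simply be extra linear constraints.

```python
# Sketch: concave least squares regression as a convex program.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sqrt(x) + rng.normal(0, 0.1, 200)   # concave truth plus noise

f = cp.Variable(len(x))                    # fitted values at the design points
slopes = cp.multiply(1.0 / np.diff(x), cp.diff(f))
constraints = [cp.diff(slopes) <= 0]       # non-increasing slopes = concavity
# e.g. add f[1:] >= f[:-1] here to impose monotonicity as well.
problem = cp.Problem(cp.Minimize(cp.sum_squares(y - f)), constraints)
problem.solve()
fitted = f.value
```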

Relevance: 30.00%

Abstract:

Investigations have shown that the analysis results of ground level enhancements (GLEs) based on neutron monitor (NM) data for a selected event can differ considerably depending on the procedure used. This may have significant consequences, e.g. for the assessment of radiation doses at flight altitudes. The reasons for the spread of the GLE parameters deduced from NM data can be manifold and are at present unclear. They include differences in specific properties of the various analysis procedures (e.g. NM response functions, different ways of taking into account the dynamics of the Earth's magnetospheric field), different characterisations of the solar particle flux near Earth, as well as the specific selection of NM stations used for the analysis. In the present paper we quantitatively investigate this problem for a time interval during the maximum phase of the GLE on 13 December 2006. We present and discuss the changes in the resulting GLE parameters when using different NM response functions, different model representations of the Earth's magnetospheric field, and different assumptions for the solar particle spectrum and pitch angle distribution near Earth. The results of the study are expected to yield a basis for reducing the spread of the GLE parameters deduced from NM data.

Relevance: 30.00%

Abstract:

Index tracking has become one of the most common strategies in asset management. The index-tracking problem consists of constructing a portfolio that replicates the future performance of an index by including only a subset of the index constituents in the portfolio. Finding the most representative subset is challenging when the number of stocks in the index is large. We introduce a new three-stage approach that at first identifies promising subsets by employing data-mining techniques, then determines the stock weights in the subsets using mixed-binary linear programming, and finally evaluates the subsets based on cross validation. The best subset is returned as the tracking portfolio. Our approach outperforms state-of-the-art methods in terms of out-of-sample performance and running times.
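For one fixed candidate subset, the weight-determination step reduces to a small linear program: choose non-negative weights that sum to one and minimise the absolute deviation between the portfolio return and the index return over the in-sample period. The sketch below shows only that step in a generic form; the data-mining subset search and the cross-validation stage of the three-stage approach are omitted, the binary subset-selection part of the paper's mixed-binary formulation is assumed already decided, and the input file names are hypothetical.

```python
# Sketch of the weight-determination step for one candidate subset:
# an LP minimising in-sample absolute tracking deviations.
import cvxpy as cp
import numpy as np

subset_returns = np.load("subset_returns.npy")  # (T, k) returns of chosen stocks (hypothetical)
index_returns = np.load("index_returns.npy")    # (T,) index returns (hypothetical)

w = cp.Variable(subset_returns.shape[1], nonneg=True)
tracking_error = cp.norm1(subset_returns @ w - index_returns)
problem = cp.Problem(cp.Minimize(tracking_error), [cp.sum(w) == 1])
problem.solve()
weights = w.value   # candidate tracking portfolio, evaluated out-of-sample later
```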

Relevance: 30.00%

Abstract:

BACKGROUND We investigated the rate of severe hypoglycemic events and confounding factors in patients with type 2 diabetes treated with sulfonylurea (SU) at specialized diabetes centers, documented in the German/Austrian DPV-Wiss database. METHODS Data from 29,485 SU-treated patients were analyzed (median [IQR] age 70.8 [62.2-77.8] years, diabetes duration 8.2 [4.3-12.8] years). The primary objective was to estimate the event rate of severe hypoglycemia (requiring external help, causing unconsciousness/coma/convulsion, and/or requiring emergency hospitalization). Secondary objectives included exploration of confounding risk factors through group comparison and Poisson regression. RESULTS Severe hypoglycemic events were reported in 826 (2.8%) of all patients during their most recent year of SU treatment. Of these, n = 531 (1.8%) had coma and n = 501 (1.7%) were hospitalized at least once. The adjusted event rate of severe hypoglycemia [95% CI] was 3.9 [3.7-4.2] events/100 patient-years (coma: 1.9 [1.8-2.1]; hospitalization: 1.6 [1.5-1.8]). Adjusted event rates by diabetes treatment were 6.7 (SU + insulin), 4.9 (SU + insulin + other OAD), 3.1 (SU + other OAD), and 3.8 (SU only). Patients with ≥1 severe event were older (p < 0.001) and had a longer diabetes duration (p = 0.020) than patients without severe events. Participation in educational diabetes programs and indirect measures of insulin resistance (increased BMI, plasma triglycerides) were associated with fewer events (all p < 0.001). Impaired renal function was common (N = 3,113 with eGFR ≤30 mL/min) and associated with an increased rate of severe events (≤30 mL/min: 7.7; 30-60 mL/min: 4.8; >60 mL/min: 3.9). CONCLUSIONS These real-life data showed a rate of severe hypoglycemia of 3.9/100 patient-years in SU-treated patients from specialized diabetes centers. Higher risk was associated with known risk factors including lack of diabetes education, older age, and decreased eGFR, but also with lower BMI and lower triglyceride levels, suggesting that SU treatment in those patients should be considered with caution.
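The rates above are expressed as events per 100 patient-years, and the confounder exploration uses Poisson regression, which handles such exposure times via a log-exposure offset. A minimal sketch of both, with hypothetical file and column names and without the adjustment set actually used in the study:

```python
# Sketch: crude event rate per 100 patient-years and a Poisson regression
# with log(exposure) offset. File and column names are placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("dpv_su_patients.csv")   # hypothetical: one row per patient

crude_rate = 100 * df["severe_hypo_events"].sum() / df["patient_years"].sum()
print(f"crude rate: {crude_rate:.1f} events / 100 patient-years")

X = sm.add_constant(df[["age", "diabetes_duration", "bmi", "egfr"]])
model = sm.GLM(df["severe_hypo_events"], X,
               family=sm.families.Poisson(),
               offset=np.log(df["patient_years"])).fit()
print(np.exp(model.params))   # rate ratios per unit of each covariate
```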

Relevance: 30.00%

Abstract:

Lovell and Rouse (LR) have recently proposed a modification of the standard DEA model that overcomes the infeasibility problem often encountered in computing super-efficiency. In the LR procedure one appropriately scales up the observed input vector (or scales down the output vector) of the relevant super-efficient firm, thereby usually creating its inefficient surrogate. An alternative procedure proposed in this paper uses the directional distance function introduced by Chambers, Chung, and Färe and the resulting Nerlove-Luenberger (NL) measure of super-efficiency. The fact that the directional distance function combines features of both an input-oriented and an output-oriented model generally leads to a more complete ranking of the observations than either of the oriented models. An added advantage of this approach is that the NL super-efficiency measure is unique and does not depend on any arbitrary choice of a scaling parameter. A data set on international airlines from Coelli, Perelman, and Griffel-Tatje (2002) is utilized in an illustrative empirical application.
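A directional distance based super-efficiency score can be computed by solving, for each evaluated firm, one linear program in which that firm is removed from the reference set and the direction vector is its own observed input-output bundle. The sketch below is a minimal illustration of that idea under constant returns to scale, with hypothetical data files; it is not claimed to reproduce the paper's exact NL formulation or returns-to-scale assumption.

```python
# Sketch: directional distance super-efficiency for firm o, with direction
# g = (x_o, y_o) and firm o excluded from the reference set (CRS assumed).
import cvxpy as cp
import numpy as np

X = np.load("inputs.npy")    # (n_firms, n_inputs), hypothetical data
Y = np.load("outputs.npy")   # (n_firms, n_outputs), hypothetical data

def directional_super_efficiency(o):
    others = [j for j in range(X.shape[0]) if j != o]
    lam = cp.Variable(len(others), nonneg=True)
    beta = cp.Variable()
    constraints = [
        X[others].T @ lam <= (1 - beta) * X[o],   # inputs contracted by beta
        Y[others].T @ lam >= (1 + beta) * Y[o],   # outputs expanded by beta
    ]
    cp.Problem(cp.Maximize(beta), constraints).solve()
    return beta.value   # negative values indicate super-efficient firms

scores = [directional_super_efficiency(o) for o in range(X.shape[0])]
```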

Relevance: 30.00%

Abstract:

A problem frequently encountered in Data Envelopment Analysis (DEA) is that the total number of inputs and outputs included tends to be too large relative to the sample size. One way to counter this problem is to combine several inputs (or outputs) into (meaningful) aggregate variables, thereby reducing the dimension of the input (or output) vector. A direct effect of input aggregation is to reduce the number of constraints. This, in turn, alters the optimal value of the objective function. In this paper, we show how a statistical test proposed by Banker (1993) may be applied to test the validity of a specific way of aggregating several inputs. An empirical application using data from Indian manufacturing for the year 2002-03 is included as an example of the proposed test.

Relevance: 30.00%

Abstract:

Microarray technology is a high-throughput method for genotyping and gene expression profiling. Limited sensitivity and specificity are among the essential problems of this technology. Most existing methods of microarray data analysis have an apparent limitation in that they deal only with the numerical part of microarray data and make little use of gene sequence information. Because it is the gene sequences that precisely define the physical objects being measured by a microarray, it is natural to make the gene sequences an essential part of the data analysis. This dissertation focused on the development of free energy models to integrate sequence information in microarray data analysis. The models were used to characterize the mechanism of hybridization on microarrays and to enhance the sensitivity and specificity of microarray measurements. Cross-hybridization is a major obstacle to the sensitivity and specificity of microarray measurements. In this dissertation, we evaluated the scope of the cross-hybridization problem on short-oligo microarrays. The results showed that cross-hybridization on arrays is mostly caused by oligo fragments with a run of 10 to 16 nucleotides complementary to the probes. Furthermore, a free-energy based model was proposed to quantify the amount of cross-hybridization signal on each probe. This model treats cross-hybridization as an integral effect of the interactions between a probe and various off-target oligo fragments. Using public spike-in datasets, the model showed high accuracy in predicting the cross-hybridization signals on those probes whose intended targets are absent in the sample. Several prospective models were proposed to improve the Positional-Dependent Nearest-Neighbor (PDNN) model for better quantification of gene expression and cross-hybridization. The problem addressed in this dissertation is fundamental to microarray technology. We expect that this study will help us to understand the detailed mechanism that determines sensitivity and specificity on microarrays. Consequently, this research will have a wide impact on how microarrays are designed and how the data are interpreted.
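The free-energy idea behind such models can be illustrated with a plain nearest-neighbor calculation: the probe-target binding free energy is approximated as a sum of stacked-dinucleotide contributions, and the expected signal then follows a Langmuir-type saturation curve in the target concentration. The parameter values and probe sequence below are arbitrary placeholders, not the fitted weights of the PDNN model, and the position-dependent weighting that gives PDNN its name is omitted.

```python
# Sketch: nearest-neighbor free energy of a perfectly matched probe-target
# duplex and a Langmuir-type signal model. All parameters are placeholders.
import math

# Illustrative stacking free energies (kcal/mol); real models fit or look up
# a full nearest-neighbor parameter table.
NN_DG = {"AA": -1.0, "AT": -0.9, "AG": -1.3, "AC": -1.4,
         "TA": -0.6, "TT": -1.0, "TG": -1.4, "TC": -1.3,
         "GA": -1.3, "GT": -1.4, "GG": -1.8, "GC": -2.2,
         "CA": -1.4, "CT": -1.3, "CG": -2.1, "CC": -1.8}

def duplex_free_energy(probe_seq):
    """Sum of nearest-neighbor (dinucleotide) contributions."""
    return sum(NN_DG[probe_seq[i:i+2]] for i in range(len(probe_seq) - 1))

def expected_signal(probe_seq, target_conc, temperature=318.15):
    """Langmuir-type binding: signal saturates as concentration grows."""
    RT = 0.001987 * temperature                  # kcal/(mol*K) times K
    K = math.exp(-duplex_free_energy(probe_seq) / RT)
    return K * target_conc / (1.0 + K * target_conc)

probe = "GTACCTAGGATTACAGGCATGAGCC"              # hypothetical 25-mer probe
print(duplex_free_energy(probe), expected_signal(probe, target_conc=1e-12))
```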