986 results for adomians decomposition method


Relevance: 20.00%

Abstract:

In this paper a method for extracting semantic information from online music discussion forums is proposed. The semantic relations are inferred from the co-occurrence of musical concepts in forum posts, using network analysis. The method starts by defining a dictionary of common music terms in an art music tradition. Then, it creates a complex network representation of the online forum by matching this dictionary against the forum posts. Once the complex network is built we can study different network measures, including node relevance, node co-occurrence and term relations via semantically connecting words. Moreover, we can detect communities of concepts inside the forum posts. The rationale is that some music terms are more related to each other than to other terms. All in all, this methodology allows us to obtain meaningful and relevant information from forum discussions.
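The pipeline this abstract describes can be illustrated compactly. Below is a minimal sketch, assuming a toy term dictionary and toy forum posts (both invented here) and using networkx; greedy modularity maximization stands in for whichever community-detection algorithm the authors actually used.

```python
# Minimal sketch: build a co-occurrence network of music terms from forum
# posts, then compute node relevance and concept communities.
import itertools
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

music_terms = {"raga", "tala", "alap", "gharana", "bandish"}  # toy dictionary
posts = [
    "The alap of this raga felt unusually long.",
    "Her gharana teaches a very strict tala cycle.",
    "That bandish sits beautifully in the raga.",
]

G = nx.Graph()
for post in posts:
    tokens = {w.strip(".,").lower() for w in post.split()}
    found = music_terms & tokens
    # Terms co-occurring in the same post get connected; repeated
    # co-occurrence increases the edge weight.
    for a, b in itertools.combinations(sorted(found), 2):
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

relevance = nx.degree_centrality(G)                   # node relevance
communities = list(greedy_modularity_communities(G))  # concept communities
print(relevance)
print(communities)
```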

Relevance: 20.00%

Abstract:

Lexical Resources are a critical component for Natural Language Processing applications. However, the high cost of comparing and merging different resources has been a bottleneck to obtaining richer resources and a broader range of potential uses for a significant number of languages. With the objective of reducing cost by eliminating human intervention, we present a new method for the automatic merging of resources. This method includes both the automatic mapping of the resources involved to a common format and their merging once in this format. This paper presents how we have addressed the merging of two verb subcategorization frame lexica for Spanish, but our method will be extended to cover other types of Lexical Resources. The results achieved, which almost replicate human work, demonstrate the feasibility of the approach.
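To make the two-step scheme concrete (map each resource to a common format, then merge), here is a toy sketch; the actual lexicon formats and frame inventories are not given in the abstract, so the verbs, frame labels and mapping table below are invented for illustration.

```python
# Toy sketch: two subcategorization lexica are mapped to a shared frame
# notation and then merged by taking the union of frame sets per verb.
from collections import defaultdict

def to_common_format(entries, frame_map):
    """Map resource-specific frame labels onto a shared tag set."""
    lexicon = defaultdict(set)
    for verb, frame in entries:
        lexicon[verb].add(frame_map.get(frame, frame))
    return lexicon

def merge(lex_a, lex_b):
    """Union the frame sets of both resources, verb by verb."""
    merged = defaultdict(set)
    for lex in (lex_a, lex_b):
        for verb, frames in lex.items():
            merged[verb] |= frames
    return merged

a = to_common_format([("dar", "NP-NP"), ("ver", "NP")], {"NP-NP": "ditrans"})
b = to_common_format([("dar", "ditrans"), ("ver", "that-cl")], {})
print(dict(merge(a, b)))  # {'dar': {'ditrans'}, 'ver': {'NP', 'that-cl'}}
```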

Relevance: 20.00%

Abstract:

Biplots are graphical displays of data matrices based on the decomposition of a matrix as the product of two matrices. Elements of these two matrices are used as coordinates for the rows and columns of the data matrix, with an interpretation of the joint presentation that relies on the properties of the scalar product. Because the decomposition is not unique, there are several alternative ways to scale the row and column points of the biplot, which can cause confusion amongst users, especially when software packages are not united in their approach to this issue. We propose a new scaling of the solution, called the standard biplot, which applies equally well to a wide variety of analyses such as correspondence analysis, principal component analysis, log-ratio analysis and the graphical results of a discriminant analysis/MANOVA, in fact to any method based on the singular-value decomposition. The standard biplot also handles data matrices with widely different levels of inherent variance. Two concepts taken from correspondence analysis are important to this idea: the weighting of row and column points, and the contributions made by the points to the solution. In the standard biplot one set of points, usually the rows of the data matrix, optimally represent the positions of the cases or sample units, which are weighted and usually standardized in some way unless the matrix contains values that are comparable in their raw form. The other set of points, usually the columns, is represented in accordance with their contributions to the low-dimensional solution. As for any biplot, the projections of the row points onto vectors defined by the column points approximate the centred and (optionally) standardized data. The method is illustrated with several examples to demonstrate how the standard biplot copes in different situations to give a joint map which needs only one common scale on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot readable. The proposal also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important.
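Since every biplot rests on writing the data as the product of two matrices via the SVD, a generic numpy sketch may help; note that this shows one common scaling (principal row coordinates, standard column coordinates), not the authors' specific standard biplot with its contribution-based representation of the column points.

```python
# Generic SVD biplot coordinates: X is approximated by rows @ cols.T.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))
X = X - X.mean(axis=0)                 # center columns before decomposing

U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 2
rows = U[:, :k] * s[:k]                # principal coordinates for the cases
cols = Vt[:k].T                        # standard coordinates for the variables

# Projections of row points onto column vectors approximate the centered data:
print(np.linalg.norm(X - rows @ cols.T))   # rank-2 approximation error
```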

Relevance: 20.00%

Abstract:

Red blood cells (RBCs) present a unique reversible shape deformability, essential for both function and survival, resulting notably in cell membrane fluctuations (CMF). These CMF have been the subject of many studies aimed at better understanding these remarkable biomechanical membrane properties, which are altered in some pathological states including blood diseases. In particular, the debate over the thermal or metabolic origin of the CMF has led in the past to a large number of investigations and models. However, the origin of the CMF is still debated. In this article, we present an analysis of the CMF of RBCs by combining digital holographic microscopy (DHM) with an orthogonal subspace decomposition of the imaging data. These subspace components can be reliably identified and quantified as the eigenmode basis of CMF that minimizes the deformation energy of the RBC structure. By fitting the observed fluctuation modes with a theoretical dynamic model, we find that the CMF are mainly governed by the bending elasticity of the membrane and that shear and tension elasticities have only a marginal influence on the membrane fluctuations of the discocyte RBC. Further, our experiments show that the role of ATP as a driving force of CMF is questionable. ATP, however, seems to be required to maintain the unique biomechanical properties of the RBC membrane that lead to thermally excited CMF.
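For readers unfamiliar with this kind of decomposition, here is a minimal numpy sketch of the general idea, on synthetic data, with plain PCA eigenmodes standing in for the deformation-energy-minimizing eigenmode basis the authors construct from the RBC structure.

```python
# Sketch: decompose a stack of membrane-height maps into orthogonal
# fluctuation modes and read off how strongly each mode is excited.
import numpy as np

rng = np.random.default_rng(1)
frames = rng.normal(size=(500, 256))   # 500 time points, 256 membrane pixels

fluct = frames - frames.mean(axis=0)   # cell membrane fluctuations (CMF)

# Each row of Vt is a spatial fluctuation mode; the squared singular values
# give the variance (excitation) carried by each mode.
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
mode_amplitudes = (s ** 2) / fluct.shape[0]

# Fitting mode_amplitudes against a dynamic model (e.g. thermal excitation
# limited by bending elasticity) is the step the abstract describes.
print(mode_amplitudes[:5])
```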

Relevance: 20.00%

Abstract:

AIM: To prospectively study the intraocular pressure (IOP) lowering effect and safety of the new method of very deep sclerectomy with collagen implant (VDSCI) compared with standard deep sclerectomy with collagen implant (DSCI). METHODS: The trial involved 50 eyes of 48 patients with medically uncontrolled primary and secondary open-angle glaucoma, randomized to undergo either the VDSCI procedure (25 eyes) or the DSCI procedure (25 eyes). Follow-up examinations were performed before surgery and after surgery at day 1, week 1, and months 1, 2, 3, 6, 9, 12, 18, and 24. Ultrasound biomicroscopy was performed at 3 and 12 months. RESULTS: Mean follow-up was 18.6+/-5.9 (VDSCI) and 18.9+/-3.6 (DSCI) months (P=NS). Mean preoperative IOP was 22.4+/-7.4 mm Hg for VDSCI and 20.4+/-4.4 mm Hg for DSCI eyes (P=NS). Mean postoperative IOP was 3.9+/-2.3 (VDSCI) and 6.3+/-4.3 (DSCI) mm Hg (P<0.05) at day 1, and 12.2+/-3.9 (VDSCI) and 13.3+/-3.4 (DSCI) mm Hg (P=NS) at month 24. At the last visit, the complete success rate (defined as an IOP of <=18 mm Hg and a drop of at least 20%, achieved without medication) was 57% in VDSCI and 62% in DSCI eyes (P=NS). Ultrasound biomicroscopy at 12 months showed a mean subconjunctival filtering bleb volume of 3.9+/-4.2 mm3 (VDSCI) and 6.8+/-7.5 mm3 (DSCI) (P=0.426), and a mean intrascleral space volume of 5.2+/-3.6 mm3 (VDSCI) and 5.4+/-2.9 mm3 (DSCI) (P=0.902). CONCLUSIONS: Very deep sclerectomy seems to provide good, stable control of IOP at 2 years of follow-up, with few postoperative complications, similar to standard deep sclerectomy with collagen implant.

Relevance: 20.00%

Abstract:

BACKGROUND: Radiation dose exposure is of particular concern in children due to the possible harmful effects of ionizing radiation. The adaptive statistical iterative reconstruction (ASIR) method is a promising new technique that reduces image noise and produces better overall image quality compared with routine-dose contrast-enhanced methods. OBJECTIVE: To assess the benefits of ASIR on the diagnostic image quality in paediatric cardiac CT examinations. MATERIALS AND METHODS: Four paediatric radiologists based at two major hospitals evaluated ten low-dose paediatric cardiac examinations (80 kVp, CTDI(vol) 4.8-7.9 mGy, DLP 37.1-178.9 mGy·cm). The average age of the cohort studied was 2.6 years (range 1 day to 7 years). Acquisitions were performed on a 64-MDCT scanner. All images were reconstructed at various ASIR percentages (0-100%). For each examination, radiologists scored 19 anatomical structures using the relative visual grading analysis method. To estimate the potential for dose reduction, acquisitions were also performed on a Catphan phantom and a paediatric phantom. RESULTS: The best image quality for all clinical images was obtained with 20% and 40% ASIR (p < 0.001) whereas with ASIR above 50%, image quality significantly decreased (p < 0.001). With 100% ASIR, a strong noise-free appearance of the structures reduced image conspicuity. A potential for dose reduction of about 36% is predicted for a 2- to 3-year-old child when using 40% ASIR rather than the standard filtered back-projection method. CONCLUSION: Reconstruction including 20% to 40% ASIR slightly improved the conspicuity of various paediatric cardiac structures in newborns and children with respect to conventional reconstruction (filtered back-projection) alone.

Relevance: 20.00%

Abstract:

The trabecular bone score (TBS) is a gray-level textural metric that can be extracted from the two-dimensional lumbar spine dual-energy X-ray absorptiometry (DXA) image. TBS is related to bone microarchitecture and provides skeletal information that is not captured by the standard bone mineral density (BMD) measurement. Based on experimental variograms of the projected DXA image, TBS has the potential to discern differences between DXA scans that show similar BMD measurements. An elevated TBS value correlates with better skeletal microstructure; a low TBS value correlates with weaker skeletal microstructure. Lumbar spine TBS has been evaluated in cross-sectional and longitudinal studies. The following conclusions are based upon the publications reviewed in this article: 1) TBS gives lower values in postmenopausal women and in men with previous fragility fractures than in their nonfractured counterparts; 2) TBS is complementary to data available from lumbar spine DXA measurements; 3) TBS results are lower in women who have sustained a fragility fracture but in whom DXA does not indicate osteoporosis or even osteopenia; 4) TBS predicts fracture risk as well as lumbar spine BMD measurements do in postmenopausal women; 5) efficacious therapies for osteoporosis differ in the extent to which they influence the TBS; 6) TBS is associated with fracture risk in individuals with conditions related to reduced bone mass or bone quality. Based on these data, lumbar spine TBS holds promise as an emerging technology that could well become a valuable clinical tool in the diagnosis of osteoporosis and in fracture risk assessment.

Relevance: 20.00%

Abstract:

We present a new method for constructing exact distribution-free tests (and confidence intervals) for variables that can generate more than two possible outcomes. This method separates the search for an exact test from the goal of creating a non-randomized test. Randomization is used to extend any exact test relating to means of variables with finitely many outcomes to variables with outcomes belonging to a given bounded set. Tests in terms of variance and covariance are reduced to tests relating to means. Randomness is then eliminated in a separate step. This method is used to create confidence intervals for the difference between two means (or variances) and tests of stochastic inequality and correlation.
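A small sketch of the randomization device described above, under stated assumptions: outcomes are already rescaled into [0, 1], and scipy's exact binomial test serves as the base exact test for Bernoulli means. Each bounded observation x is replaced by a Bernoulli(x) draw, whose mean is exactly x, so the exact test transfers to bounded outcomes; the separate derandomization step the abstract mentions is not shown.

```python
# Randomized extension of an exact Bernoulli-mean test to outcomes in [0, 1].
import numpy as np
from scipy.stats import binomtest

def randomized_mean_test(x, mu0, rng):
    """One-sample test of H0: E[X] <= mu0 for outcomes x in [0, 1]."""
    coins = rng.random(len(x)) < np.asarray(x)   # Bernoulli(x_i) draws
    return binomtest(int(coins.sum()), n=len(x), p=mu0,
                     alternative="greater").pvalue

rng = np.random.default_rng(42)
x = rng.beta(5, 2, size=60)                      # bounded outcomes in [0, 1]
print(randomized_mean_test(x, mu0=0.5, rng=rng))
```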

Relevance: 20.00%

Abstract:

We compare two methods for visualising contingency tables and develop a method called the ratio map which combines the good properties of both. The first is a biplot based on the logratio approach to compositional data analysis. This approach is founded on the principle of subcompositional coherence, which assures that results are invariant to considering subsets of the composition. The second approach, correspondence analysis, is based on the chi-square approach to contingency table analysis. A cornerstone of correspondence analysis is the principle of distributional equivalence, which assures invariance in the results when rows or columns with identical conditional proportions are merged. Both methods may be described as singular value decompositions of appropriately transformed matrices. Correspondence analysis includes a weighting of the rows and columns proportional to the margins of the table. If this idea of row and column weights is introduced into the logratio biplot, we obtain a method which obeys both principles of subcompositional coherence and distributional equivalence.
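A numpy sketch of this weighted logratio construction, assuming a small invented contingency table; the weights are the table margins, which is exactly the ingredient the abstract says is borrowed from correspondence analysis.

```python
# Weighted logratio biplot ("ratio map" style): log-transform, double-center
# with marginal weights, then decompose by SVD.
import numpy as np

N = np.array([[25., 10.,  5.],
              [10., 40., 15.],
              [ 5., 15., 30.]])
P = N / N.sum()
r, c = P.sum(axis=1), P.sum(axis=0)        # row and column weights (margins)

L = np.log(P)
# Weighted double-centering: remove weighted row means, weighted column
# means, and add back the weighted grand mean.
L = L - (L @ c)[:, None] - (r @ L)[None, :] + r @ L @ c

S = np.sqrt(r)[:, None] * L * np.sqrt(c)[None, :]
U, s, Vt = np.linalg.svd(S, full_matrices=False)

rows = (U[:, :2] * s[:2]) / np.sqrt(r)[:, None]   # row coordinates
cols = Vt[:2].T / np.sqrt(c)[:, None]             # column coordinates
print(rows)
print(cols)
```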

Relevance: 20.00%

Abstract:

Models incorporating more realistic representations of customer behavior, such as customers choosing from an offer set, have recently become popular in assortment optimization and revenue management. The dynamic program for these models is intractable and is approximated by a deterministic linear program, called the CDLP, which has an exponential number of columns. However, when the segment consideration sets overlap, the CDLP is difficult to solve. Column generation has been proposed, but finding an entering column has been shown to be NP-hard. In this paper we propose a new approach to solving the CDLP, called SDCP, based on segments and their consideration sets. SDCP is a relaxation of CDLP and hence forms a looser upper bound on the dynamic program, but it coincides with CDLP in the case of non-overlapping segments. If the number of elements in a segment's consideration set is not very large, SDCP can be applied to any discrete-choice model of consumer behavior. We tighten the SDCP bound (i) by simulation, in what we call the randomized concave programming (RCP) method, and (ii) by adding cuts to a recent compact formulation of the problem for a latent multinomial-choice model of demand (SBLP+). This latter approach turns out to be very effective, essentially attaining the CDLP value and excellent revenue performance in simulations, even for overlapping segments. By formulating the problem as a separation problem, we give insight into why CDLP is easy for the MNL with non-overlapping consideration sets and why generalizations of the MNL pose difficulties. We perform numerical simulations to determine the revenue performance of all the methods on reference data sets from the literature.
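For background, the building block of CDLP-type formulations is the segment-level choice probability; under MNL with consideration sets it takes the simple form sketched below (product names, weights, and sets are illustrative only, not from the paper).

```python
# MNL choice probability for one segment: products outside the segment's
# consideration set are invisible to it; v0 is the no-purchase weight.
def mnl_choice_probs(offer_set, consideration, v, v0):
    avail = offer_set & consideration
    denom = v0 + sum(v[j] for j in avail)
    return {j: v[j] / denom for j in avail}

v = {"A": 2.0, "B": 1.0, "C": 0.5}
print(mnl_choice_probs({"A", "B"}, {"A", "B", "C"}, v, v0=1.0))
# {'A': 0.5, 'B': 0.25}; the remaining 0.25 is the no-purchase probability
```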

Relevance: 20.00%

Abstract:

The network revenue management (RM) problem arises in airline, hotel, media, and other industries where the products sold use multiple resources. It can be formulated as a stochastic dynamic program, but the dynamic program is computationally intractable because of an exponentially large state space, and a number of heuristics have been proposed to approximate it. Notable amongst these, both for their revenue performance and for their theoretically sound basis, are approximate dynamic programming methods that approximate the value function by basis functions (both affine functions and piecewise-linear functions have been proposed for network RM) and decomposition methods that relax the constraints of the dynamic program to solve simpler dynamic programs (such as the Lagrangian relaxation methods). In this paper we show that these two seemingly distinct approaches coincide for the network RM dynamic program, i.e., the piecewise-linear approximation method and the Lagrangian relaxation method are one and the same.
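For context, a standard statement of the network RM dynamic program referred to here, in our own notation (not necessarily the paper's): x is the remaining-capacity vector, r_j the fare of product j, A_j its resource-usage column, and P_j(S) its purchase probability under offer set S.

```latex
% Network RM dynamic program (standard form; notation ours).
V_t(x) = \max_{S} \Bigl[ \sum_{j \in S} P_j(S)\,\bigl(r_j + V_{t+1}(x - A_j)\bigr)
         + P_0(S)\, V_{t+1}(x) \Bigr], \qquad V_T(\cdot) = 0.
% The piecewise-linear method approximates V_t(x) by a separable sum
% \sum_i v_{t,i}(x_i) of one-dimensional functions; the paper's result is
% that this bound coincides with the Lagrangian relaxation bound.
```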

Relevance: 20.00%

Abstract:

Objectives: The aim of this study was to compare the specificity and sensitivity of different biological markers that can be used in the forensic field to identify drivers who are potentially dangerous because of their alcohol habits. Methods: We studied 280 Swiss drivers apprehended for driving under the influence of alcohol. 33 were excluded for not having CDT N results; 247 were included (218 men (88%) and 29 women (12%)). Mean age was 42.4 years (SD: 12, min: 20, max: 76). The evaluation of alcohol consumption concerned the month before the CDT test and was established after the interview as follows: heavy drinkers (>3 drinks per day): 60 (32.7%); moderate drinkers (<3 drinks per day): 127 (51.4%) 114 (46.5%); abstinent: 60 (24.3%) 51 (21%). Alcohol intake was monitored by structured interviews, self-reported drinking habits and the C-Audit questionnaire, as well as by information provided by family members and general practitioners. Consumption was quantified in standard drinks, each containing approximately 10 grams of pure alcohol (ref. WHO). Results (comparison between moderate drinkers, up to 3 drinks per day, and excessive drinkers, more than 3 drinks per day):

Marker              ROC area  95% CI       Cut-off  Sensitivity  LR+   Specificity  LR-
CDT TIA             0.852     0.786-0.917  2.6*     0.93         1.43  0.35         0.192
CDT N latex         0.875     0.821-0.930  2.5*     0.66         6.93  0.90         0.369
Asialo+disialo-tf   0.881     0.826-0.936  1.2*     0.78         4.07  0.80         0.268
                                           1.7°     0.66         8.9   0.93         0.360
GGT                 0.659     0.580-0.737  85*      0.37         2.14  0.83         0.764

* cut-off point suggested by the manufacturer
° cut-off point suggested by our laboratory

Conclusion: With the cut-off point established by the manufacturer, CDT TIA performed poorly in terms of specificity. N latex CDT and CZE CDT performed better, especially if a 1.7 cut-off is used with CZE.
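The likelihood ratios in the table follow from the standard definitions, LR+ = sensitivity/(1 - specificity) and LR- = (1 - sensitivity)/specificity; the small check below reproduces them from the published sensitivities and specificities (agreement is up to rounding of the underlying estimates).

```python
# Recompute LR+ and LR- from the sensitivity/specificity pairs in the table.
def likelihood_ratios(sens, spec):
    return sens / (1 - spec), (1 - sens) / spec

for marker, sens, spec in [("CDT TIA (2.6)", 0.93, 0.35),
                           ("Asialo+disialo-tf (1.2)", 0.78, 0.80),
                           ("GGT (85)", 0.37, 0.83)]:
    lr_pos, lr_neg = likelihood_ratios(sens, spec)
    print(f"{marker}: LR+ = {lr_pos:.2f}, LR- = {lr_neg:.3f}")
```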