990 resultados para Log Replayer


Relevância:

10.00%

Publicador:

Resumo:

Purpose/Objective(s): Primary bone lymphoma (PBL) represents less than 1% of all malignant lymphomas and 4-5% of all extranodal lymphomas. In this study, we assessed the disease profile, outcome, and prognostic factors in patients with stage I and II PBL.

Materials/Methods: Between 1987 and 2008, 116 consecutive patients with PBL treated in 13 RCN institutions were included in this study. Inclusion criteria were: age >17 yrs, PBL in stage I and II, and a minimum of 6 months of follow-up. The median age was 51 yrs (range: 17-93). Diagnostic work-up included plain bone X-ray (74% of patients), scintigraphy (62%), CT scan (65%), MRI (58%), PET (18%), and bone-marrow biopsy (84%). All patients had biopsy-proven confirmation of non-Hodgkin's lymphoma (NHL). The histopathological type was predominantly diffuse large B-cell lymphoma (78%) and follicular lymphoma (6%), according to the WHO classification. One hundred patients had a high-grade, 7 an intermediate-grade, and 9 a low-grade NHL. Ninety-three patients had Ann Arbor stage I, and 23 had stage II. Seventy-seven patients underwent chemoradiotherapy (CXRT), 12 radiotherapy (RT) alone, 10 chemotherapy (CXT) alone, 9 surgery followed by CXRT, 5 surgery followed by CXT, and 2 surgery followed by RT. One patient died before treatment. Median RT dose was 40 Gy (range: 4-60). The median number of CXT cycles was 6 (range: 2-8). Median follow-up was 41 months (range: 6-242).

Results: Following treatment, the overall response rate was 91% (CR 74%, PR 17%). Local recurrence was observed in 12 (10%) patients, and systemic recurrence in 17 (15%) patients. Causes of death included disease progression in 16, unrelated disease in 6, CXT-related toxicity in 1, and secondary cancer in 2 patients. The 5-yr overall survival (OS), disease-free survival (DFS), lymphoma-specific survival (LSS), and local control (LC) were 76%, 69%, 78%, and 92%, respectively. In univariate analyses (log-rank test), favorable prognostic factors for survival were: age <50 years (p = 0.008), IPI score ≤1 (p = 0.009), complete response (p < 0.001), CXT (p = 0.008), number of CXT cycles ≥6 (p = 0.007), and RT dose ≥40 Gy (p = 0.005). In multivariate analysis, age, RT dose, complete response, and absence of B symptoms were independent factors influencing the outcome. Three patients developed grade 3 or higher (CTCAE v3.0) toxicities.

Conclusions: This large multicenter study confirms the relatively good prognosis of early-stage PBL treated with combined CXRT. Local control was excellent, and systemic failure occurred infrequently. A sufficient RT dose (≥40 Gy) and a complete CXT regimen (≥6 cycles) were associated with a better outcome. Combined modality appears to be the treatment of choice.

Author Disclosure: L. Cai, None; M.C. Stauder, None; Y.J. Zhang, None; P. Poortmans, None; Y.X. Li, None; N. Constantinou, None; J. Thariat, None; S. Kadish, None; M. Ozsahin, None; R.O. Mirimanoff, None.

Relevância:

10.00%

Publicador:

Resumo:

In order to interpret a biplot, it is necessary to know which points (usually the variables) are the important contributors to the solution, and this information is available separately as part of the biplot's numerical results. We propose a new scaling of the display, called the contribution biplot, which incorporates this diagnostic directly into the graphical display, showing visually the important contributors and thus facilitating the biplot's interpretation and often simplifying the graphical representation considerably. The contribution biplot can be applied to a wide variety of analyses, such as correspondence analysis, principal component analysis, log-ratio analysis, and the graphical results of a discriminant analysis/MANOVA; in fact, to any method based on the singular-value decomposition. In the contribution biplot, one set of points, usually the rows of the data matrix, optimally represents the spatial positions of the cases or sample units, according to some distance measure that usually incorporates some form of standardization unless all data are comparable in scale. The other set of points, usually the columns, is represented by vectors that are related to their contributions to the low-dimensional solution. A fringe benefit is that usually only one common scale for row and column points is needed on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot legible. Furthermore, this version of the biplot also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important when they are in fact contributing minimally to the solution.
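As an illustrative sketch (not the paper's own code), the contributions that such a display encodes can be read directly off the singular-value decomposition; the data matrix below is invented.

```python
import numpy as np

# Toy data matrix: rows = cases, columns = variables (invented values).
X = np.array([[2.0, 1.0, 0.5],
              [1.5, 2.5, 0.2],
              [0.5, 1.0, 3.0],
              [2.2, 0.8, 0.4]])

# Column-center, then take the SVD underlying PCA-style biplots.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Contribution of each column (variable) to each principal axis:
# the squared entries of the right singular vectors. Each axis's
# contributions sum to 1, so large values flag the variables that
# drive that axis.
col_contrib = Vt.T ** 2
print(col_contrib)
```

Plotting the columns with lengths proportional to these contributions, rather than rescaling two incompatible sets of coordinates, is the essence of the proposal.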

Relevância:

10.00%

Publicador:

Resumo:

We use aggregate GDP data and within-country income shares for the period 1970-1998 to assign a level of income to each person in the world. We then estimate the Gaussian kernel density function for the worldwide distribution of income. We compute world poverty rates by integrating the density function below the poverty lines. The $1/day poverty rate has fallen from 20% to 5% over the last twenty-five years. The $2/day rate has fallen from 44% to 18%. There are between 300 and 500 million fewer poor people in 1998 than there were in the 70s. We estimate global income inequality using seven different popular indexes: the Gini coefficient, the variance of log-income, two of Atkinson's indexes, the Mean Logarithmic Deviation, the Theil index, and the coefficient of variation. All indexes show a reduction in global income inequality between 1980 and 1998. We also find that most global disparities can be accounted for by across-country, not within-country, inequalities. Within-country disparities have increased slightly during the sample period, but not nearly enough to offset the substantial reduction in across-country disparities. The across-country reductions in inequality are driven mainly, but not fully, by the large growth rate of the incomes of the 1.2 billion Chinese citizens. Unless Africa starts growing in the near future, we project that income inequalities will start rising again: if Africa does not start growing, then China, India, the OECD, and the rest of the middle-income and rich countries will diverge away from it, and global inequality will rise. Thus, the aggregate GDP growth of the African continent should be the priority of anyone concerned with increasing global income inequality.
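A minimal sketch of the estimation step described above (Gaussian kernel density, then integration of the density below a poverty line); the income sample is invented, not the paper's data.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Hypothetical income sample in dollars/day (roughly log-normal).
incomes = rng.lognormal(mean=1.5, sigma=1.0, size=10_000)

# Estimate the density on the log-income scale, as is common practice.
kde = gaussian_kde(np.log(incomes))

def poverty_rate(line):
    """Share of the population below `line` = density mass below log(line)."""
    return kde.integrate_box_1d(-np.inf, np.log(line))

print(poverty_rate(1.0), poverty_rate(2.0))  # $1/day and $2/day rates
```

With real data, each person's assigned income would come from the GDP-plus-income-shares imputation before this step.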

Relevância:

10.00%

Publicador:

Resumo:

Valganciclovir (VGC) has proved efficacious and safe for prophylaxis against cytomegalovirus (CMV) in high-risk transplant recipients and for the treatment of CMV retinitis in AIDS patients. We used VGC for the treatment of CMV infection (viremia without symptoms) or disease (CMV syndrome or tissue-invasive disease) in kidney, heart, and lung transplant recipients. Fourteen transplant recipients were treated: five for asymptomatic CMV infection and nine for CMV disease. VGC was administered in doses adjusted to renal function for 4 to 12 weeks (induction and maintenance therapy). Clinically, all nine patients with CMV disease responded to treatment. Microbiologically, treatment with VGC turned blood culture negative for CMV within 2 weeks in all patients and was associated with a ≥2 log decrease in blood CMV DNA within 3 weeks in 8 of 8 tested patients. With a follow-up of 6 months (n = 12 patients), asymptomatic recurrent CMV viremia was noted in five cases and CMV syndrome in one case (all in the first 2 months after the end of treatment). VGC was clinically well tolerated in all patients; however, laboratory abnormalities occurred in three cases (mild increase in transaminases, thrombocytopenia, and pancytopenia). This preliminary experience strongly suggests that therapy with VGC is effective against CMV in organ transplant recipients; however, the exact duration of therapy remains to be determined: a longer course may be necessary to prevent early recurrence.

Relevância:

10.00%

Publicador:

Resumo:

We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to "spectral mapping", a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
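As an unweighted, illustrative sketch of log-ratio analysis (the paper's weighted version additionally introduces row and column masses, omitted here), the computation reduces to an SVD of the double-centered log-transformed matrix; the compositions are invented.

```python
import numpy as np

# Invented compositional data: each row is a 3-part composition summing to 1.
X = np.array([[0.2, 0.5, 0.3],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

L = np.log(X)
L = L - L.mean(axis=1, keepdims=True)  # row-center: only log-ratios remain
L = L - L.mean(axis=0, keepdims=True)  # column-center before the SVD

U, s, Vt = np.linalg.svd(L, full_matrices=False)
rows = U * s   # principal coordinates of the rows (cases)
cols = Vt.T    # standard coordinates of the columns (parts)
print(s)       # last singular value is ~0 after double centering
```

Because only ratios of parts enter the analysis, dropping a part and renormalizing leaves the remaining log-ratios unchanged, which is the subcompositional-coherence property at stake.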

Relevância:

10.00%

Publicador:

Resumo:

This work is based on an analysis of unemployment data for Cape Verde in the years 2006 and 2008, using information from the INE and IEFP databases. Starting from the analysis of the data under study, it seeks to describe and put into perspective methodologies that cover the qualitative and quantitative variables with positive social significance for this country's society. After the introduction in Chapter 1, Chapter 2 presents an exploratory analysis of the Cape Verde unemployment data for 2006 and 2008. Chapter 3 studies associations between variables using contingency-table methodology, through tests of independence and tests of homogeneity, and through the analysis of measures of association. The variables used are essentially age group, gender, and year. Chapter 4 is devoted to the study of log-linear models for contingency tables, and the work closes with a presentation of the main conclusions.
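The contingency-table independence test at the heart of that analysis can be sketched as follows; the counts are invented, not the INE/IEFP data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical unemployment counts: rows = gender, columns = age group.
table = np.array([[120,  90,  60],
                  [100, 110,  80]])

# Chi-square test of independence between the row and column variables.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")
```

A small p-value would lead one to reject independence and then, as in the later chapters, to model the association structure with log-linear models.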

Relevância:

10.00%

Publicador:

Resumo:

Accurate characterization of the spatial distribution of hydrological properties in heterogeneous aquifers at a range of scales is a key prerequisite for reliable modeling of subsurface contaminant transport, and is essential for designing effective and cost-efficient groundwater management and remediation strategies. To this end, high-resolution geophysical methods have shown significant potential to bridge a critical gap in subsurface resolution and coverage between traditional hydrological measurement techniques such as borehole log/core analyses and tracer or pumping tests. An important and still largely unresolved issue, however, is how to best quantitatively integrate geophysical data into a characterization study in order to estimate the spatial distribution of one or more pertinent hydrological parameters, thus improving hydrological predictions. Recognizing the importance of this issue, the aim of the research presented in this thesis was to first develop a strategy for the assimilation of several types of hydrogeophysical data having varying degrees of resolution, subsurface coverage, and sensitivity to the hydrologic parameter of interest. In this regard a novel simulated annealing (SA)-based conditional simulation approach was developed and then tested in its ability to generate realizations of porosity given crosshole ground-penetrating radar (GPR) and neutron porosity log data. This was done successfully for both synthetic and field data sets. A subsequent issue that needed to be addressed involved assessing the potential benefits and implications of the resulting porosity realizations in terms of groundwater flow and contaminant transport. This was investigated synthetically assuming first that the relationship between porosity and hydraulic conductivity was well-defined. Then, the relationship was itself investigated in the context of a calibration procedure using hypothetical tracer test data.
Essentially, the relationship best predicting the observed tracer test measurements was determined given the geophysically derived porosity structure. Both of these investigations showed that the SA-based approach, in general, allows much more reliable hydrological predictions than other more elementary techniques considered. Further, the developed calibration procedure was seen to be very effective, even at the scale of tomographic resolution, for predictions of transport. This also held true at locations within the aquifer where only geophysical data were available. This is significant because the acquisition of hydrological tracer test measurements is clearly more complicated and expensive than the acquisition of geophysical measurements. Although the above methodologies were tested using porosity logs and GPR data, the findings are expected to remain valid for a large number of pertinent combinations of geophysical and borehole log data of comparable resolution and sensitivity to the hydrological target parameter. Moreover, the obtained results allow us to have confidence for future developments in integration methodologies for geophysical and hydrological data to improve the 3-D estimation of hydrological properties.
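The SA-based conditional simulation itself is specific to the geophysical data, but the engine underneath (Metropolis acceptance with a cooling schedule) can be sketched generically; the toy objective and all quantities below are illustrative stand-ins, not the thesis's implementation.

```python
import math
import random

random.seed(0)

# Toy objective: make the running sum of a binary field match a target
# profile (a stand-in for conditioning a realization on measured data).
target = [i * 0.5 for i in range(20)]

def cost(seq):
    total, c = 0.0, 0.0
    for x, t in zip(seq, target):
        total += x
        c += (total - t) ** 2
    return c

state = [random.choice([0.0, 1.0]) for _ in range(20)]
temp = 1.0
for step in range(5000):
    cand = state[:]
    cand[random.randrange(20)] = 1.0 - cand[random.randrange(20)]
    delta = cost(cand) - cost(state)
    # Metropolis rule: always accept improvements; accept a worse state
    # with probability exp(-delta/temp), which shrinks as temp cools.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        state = cand
    temp *= 0.999  # geometric cooling schedule
print(cost(state))
```

In the thesis's setting, the "state" would be a porosity field and the cost would penalize mismatch with the GPR tomograms and neutron porosity logs.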

Relevância:

10.00%

Publicador:

Resumo:

Diversity of epigaeic ants (Hymenoptera, Formicidae) in environments of Central-West Brazil. Using diversity indices and species-abundance models, we compared the diversity of the epigaeic ant communities occurring in two different vegetation structures: native forest and a eucalyptus plantation. The ants were captured with 800 pitfall traps, in eight distinct samples. A total of 85 species, distributed among 36 genera of seven subfamilies, were collected in the two environments; of these, 83 occurred in the native forest and 60 in the eucalyptus plantation. Ant species diversity calculated with Simpson's index did not differ significantly between the environments, in contrast with the result obtained from Shannon's index, which indicated greater species diversity in the native forest. The log-series model did not fit the ant-community data from the eucalyptus plantation or the native forest satisfactorily, but the log-normal model proved adequate to describe the community structure of both environments. The broken-stick model, which represents a well-structured community, fitted only the native-forest data.
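The two indices compared above are standard; a sketch on invented abundance vectors (not the study's data) shows why they can disagree: Shannon weights rare species more heavily than Simpson does.

```python
import numpy as np

def shannon(counts):
    """Shannon index H' = -sum(p_i * ln p_i)."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

def simpson(counts):
    """Gini-Simpson index 1 - sum(p_i^2)."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    return float(1.0 - (p ** 2).sum())

# Invented abundances for two habitats.
native = [30, 25, 20, 10, 5, 5, 3, 2]
eucalypt = [60, 20, 10, 5, 3, 2]
print(shannon(native), shannon(eucalypt))
print(simpson(native), simpson(eucalypt))
```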

Relevância:

10.00%

Publicador:

Resumo:

Subcompositional coherence is a fundamental property of Aitchison's approach to compositional data analysis, and is the principal justification for using ratios of components. We maintain, however, that lack of subcompositional coherence, that is, incoherence, can be measured in an attempt to evaluate whether any given technique is close enough, for all practical purposes, to being subcompositionally coherent. This opens up the field to alternative methods, which might be better suited to cope with problems such as data zeros and outliers, while being only slightly incoherent. The measure that we propose is based on the distance measure between components. We show that the two-part subcompositions, which appear to be the most sensitive to subcompositional incoherence, can be used to establish a distance matrix which can be directly compared with the pairwise distances in the full composition. The closeness of these two matrices can be quantified using a stress measure that is common in multidimensional scaling, providing a measure of subcompositional incoherence. The approach is illustrated using power-transformed correspondence analysis, which has already been shown to converge to log-ratio analysis as the power transform tends to zero.

Relevância:

10.00%

Publicador:

Resumo:

To obtain a first approximation of the number of samples required for a given level of precision in the estimate of anchoveta egg production per unit of sea surface area, existing egg-sample data from surveys carried out off Peru over the last 20 years were used. The anchoveta appears to spawn in schools, producing a large quantity of eggs, of which the samples capture a small proportion. Approximately 8% of the positive samples exceeded 4096 eggs per m². A major change in the size of the Peruvian anchoveta population was noted in 1972. The range of egg values was similar before and after the change, but the mean number of eggs per positive sample was approximately twice as high before the 1971 decline. The desired precision of the estimate and the costs and availability of ship time during the spawning season were taken into account to design a cruise plan at a reasonable cost. The surveyed area covers 57,600 square miles: 640 miles alongshore and 90 miles out from the coast. Under this projection, a precision of 30% in the estimate requires 924 samples, and 20% precision requires 2078 samples. Biases in the estimation of precision are discussed. The log-normal probability distribution is used to describe the positive samples.
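The sample counts quoted above are consistent with the standard relation n = (CV / precision)², linking the coefficient of variation of the egg counts, the desired relative precision, and the sample size; the CV used below is back-solved from the reported figures for illustration and is not stated in the abstract.

```python
import math

def samples_needed(cv, precision):
    """n such that the relative standard error CV/sqrt(n) equals `precision`."""
    return math.ceil((cv / precision) ** 2)

# A coefficient of variation near 9.1 roughly reproduces the reported
# figures (illustrative back-solve, not a value from the survey data).
cv = 9.12
print(samples_needed(cv, 0.30))  # near the reported 924
print(samples_needed(cv, 0.20))  # near the reported 2078
```

Such a large CV reflects the highly skewed, log-normal-like distribution of positive egg samples described in the abstract.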

Relevância:

10.00%

Publicador:

Resumo:

BACKGROUND: HIV-1 RNA viral load is a key parameter for reliable treatment monitoring of HIV-1 infection. Accurate HIV-1 RNA quantitation can be impaired by primer and probe sequence polymorphisms that result from the tremendous genetic diversity and ongoing evolution of HIV-1. A novel dual HIV-1 target amplification approach was realized in the quantitative COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0 (HIV-1 TaqMan test v2.0) to cope with the high genetic diversity of the virus. OBJECTIVES AND STUDY DESIGN: The performance of the new assay was evaluated for sensitivity, dynamic range, precision, subtype inclusivity, diagnostic and analytical specificity, and interfering substances, and for correlation with its predecessor, the COBAS AmpliPrep/COBAS TaqMan HIV-1 test (HIV-1 TaqMan test v1.0), in patients' specimens. RESULTS: The new assay demonstrated a sensitivity of 20 copies/mL and a linear measuring range of 20-10,000,000 copies/mL, with a lower limit of quantitation of 20 copies/mL. HIV-1 Group M subtypes and HIV-1 Group O were quantified within ±0.3 log10 of the assigned titers. Specificity was 100% in 660 tested specimens; no cross-reactivity was found for 15 pathogens, and no interference from endogenous substances or 29 drugs. Good comparability with the predecessor assay was demonstrated in 82 positive patient samples. In selected clinical samples, 35/66 specimens were found to be underquantitated by the predecessor assay; all were quantitated correctly by the new assay. CONCLUSIONS: The dual-target approach of the HIV-1 TaqMan test v2.0 enables superior HIV-1 Group M subtype coverage, including HIV-1 Group O detection. Correct quantitation of specimens underquantitated by the HIV-1 TaqMan test v1.0 was demonstrated.

Relevância:

10.00%

Publicador:

Resumo:

The n-octanol/water partition coefficient (log Po/w) is a key physicochemical parameter for drug discovery, design, and development. Here, we present a physics-based approach that shows a strong linear correlation between the computed solvation free energy in implicit solvents and the experimental log Po/w on a cleansed data set of more than 17,500 molecules. After internal validation by five-fold cross-validation and data randomization, the predictive power of the most interesting multiple linear model, based solely on two GB/SA parameters, was tested on two different external sets of molecules. On the Martel druglike test set, the predictive power of the best model (N = 706, r = 0.64, MAE = 1.18, and RMSE = 1.40) is similar to that of six well-established empirical methods. On the 17-drug test set, our model outperformed all of the compared empirical methodologies (N = 17, r = 0.94, MAE = 0.38, and RMSE = 0.52). The physical basis of our original GB/SA approach, together with its predictive capacity, computational efficiency (1 to 2 s per molecule), and three-dimensional molecular graphics capability, lays the foundations for a promising predictor, the implicit log P method (iLOGP), to complement the portfolio of drug design tools developed and provided by the SIB Swiss Institute of Bioinformatics.
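The validation metrics quoted above (r, MAE, RMSE) are standard; a sketch on invented experimental versus predicted log Po/w values (not the paper's data) shows how they are computed.

```python
import numpy as np

# Invented experimental vs. predicted log Po/w values.
y_true = np.array([1.2, 2.5, 0.3, 3.1, 1.8, 2.0])
y_pred = np.array([1.0, 2.8, 0.5, 2.7, 1.9, 2.4])

r = np.corrcoef(y_true, y_pred)[0, 1]            # Pearson correlation
mae = np.abs(y_true - y_pred).mean()             # mean absolute error
rmse = np.sqrt(((y_true - y_pred) ** 2).mean())  # root-mean-square error
print(f"r={r:.2f}, MAE={mae:.2f}, RMSE={rmse:.2f}")
```

Note that RMSE is never smaller than MAE, and the gap between them grows with the presence of a few large errors.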

Relevância:

10.00%

Publicador:

Resumo:

Let a class $\F$ of densities be given. We draw an i.i.d.\ sample from a density $f$ which may or may not be in $\F$. After every $n$, one must guess whether $f \in \F$ or not. A class is almost surely testable if there exists a testing sequence such that, for any $f$, we make only finitely many errors almost surely. In this paper, several results are given that allow one to decide whether a class is almost surely testable. For example, continuity and square integrability are not testable, but unimodality, log-concavity, and boundedness by a given constant are.

Relevância:

10.00%

Publicador:

Resumo:

We consider adaptive sequential lossy coding of bounded individual sequences when the performance is measured by the sequentially accumulated mean squared distortion. The encoder and the decoder are connected via a noiseless channel of capacity $R$, and both are assumed to have zero delay. No probabilistic assumptions are made on how the sequence to be encoded is generated. For any bounded sequence of length $n$, the distortion redundancy is defined as the normalized cumulative distortion of the sequential scheme minus the normalized cumulative distortion of the best scalar quantizer of rate $R$ which is matched to this particular sequence. We demonstrate the existence of a zero-delay sequential scheme which uses common randomization in the encoder and the decoder such that the normalized maximum distortion redundancy converges to zero at a rate $n^{-1/5}\log n$ as the length of the encoded sequence $n$ increases without bound.

Relevância:

10.00%

Publicador:

Resumo:

We continue the development of a method for the selection of a bandwidth or a number of design parameters in density estimation. We provide explicit non-asymptotic density-free inequalities that relate the $L_1$ error of the selected estimate with that of the best possible estimate, and study in particular the connection between the richness of the class of density estimates and the performance bound. For example, our method allows one to pick the bandwidth and kernel order in the kernel estimate simultaneously and still assure that for {\it all densities}, the $L_1$ error of the corresponding kernel estimate is not larger than about three times the error of the estimate with the optimal smoothing factor and kernel, plus a constant times $\sqrt{\log n/n}$, where $n$ is the sample size, and the constant only depends on the complexity of the family of kernels used in the estimate. Further applications include multivariate kernel estimates, transformed kernel estimates, and variable kernel estimates.