916 results for Approximate Sum Rule


Relevance: 20.00%

Abstract:

The family Phycodnaviridae encompasses a diverse and rapidly expanding collection of large icosahedral dsDNA viruses that infect algae. These lytic and lysogenic viruses have genomes ranging from 160 to 560 kb. The family consists of six genera, established initially on host range and supported by sequence comparisons. The family is monophyletic, with branches for each genus, but the phycodnaviruses have evolutionary roots that connect them with several other families of large DNA viruses, referred to as the nucleocytoplasmic large DNA viruses (NCLDV). The phycodnaviruses have diverse genome structures, some with large regions of noncoding sequence and others with regions of ssDNA. The genomes of members of three genera in the Phycodnaviridae have been sequenced. These analyses have revealed more than 1000 unique genes, with only 14 homologous genes in common among the three genera sequenced to date. Thus, their gene diversity far exceeds the number of so-called core genes. Little is known about the replication of these viruses, but the consequences of their infections on phytoplankton have global effects, including influence on geochemical cycling and weather patterns.

Relevance: 20.00%

Abstract:

Maximum-likelihood decoding is often the optimal decoding rule one can use, but it is very costly to implement in a general setting. Much effort has therefore been dedicated to finding efficient decoding algorithms that either achieve or approximate the error-correcting performance of the maximum-likelihood decoder. This dissertation examines two approaches to this problem. In 2003, Feldman and his collaborators defined the linear programming decoder, which operates by solving a linear programming relaxation of the maximum-likelihood decoding problem. As with many modern decoding algorithms, it is possible for the linear programming decoder to output vectors that do not correspond to codewords; such vectors are known as pseudocodewords. In this work, we completely classify the set of linear programming pseudocodewords for the family of cycle codes. For the case of the binary symmetric channel, another approximation of maximum-likelihood decoding was introduced by Omura in 1972. This decoder employs an iterative algorithm whose behavior closely mimics that of the simplex algorithm. We generalize Omura's decoder to operate on any binary-input memoryless channel, thus obtaining a soft-decision decoding algorithm. Further, we prove that the probability of the generalized algorithm returning the maximum-likelihood codeword approaches 1 as the number of iterations goes to infinity.
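To make the relaxation concrete, here is a minimal sketch of Feldman-style LP decoding on a toy code. The parity-check matrix H, the LLR vector, and the use of scipy.optimize.linprog are illustrative assumptions, not details taken from the dissertation; the polytope is described by the standard odd-subset inequalities for each check.

```python
# Minimal sketch of LP decoding (Feldman-style relaxation) on a toy code.
from itertools import combinations

import numpy as np
from scipy.optimize import linprog

# Toy parity-check matrix; each row is one parity check (illustrative).
H = np.array([[1, 1, 0, 1, 0],
              [0, 1, 1, 0, 1]])

def lp_decode(llr):
    """Minimize llr . x over the relaxed codeword polytope of H."""
    n = H.shape[1]
    A_ub, b_ub = [], []
    for row in H:
        nbrs = np.flatnonzero(row)                # variables in this check
        for size in range(1, len(nbrs) + 1, 2):   # every odd-sized subset S
            for S in combinations(nbrs, size):
                a = np.zeros(n)
                a[nbrs] = -1.0
                a[list(S)] = 1.0    # sum_S x_i - sum_{N(c)\S} x_i <= |S| - 1
                A_ub.append(a)
                b_ub.append(size - 1)
    res = linprog(c=llr, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0.0, 1.0)] * n, method="highs")
    return res.x

# Integral output = the ML codeword; fractional output = a pseudocodeword.
print(lp_decode(np.array([-1.0, 2.0, -0.5, 1.0, 0.3])))
```

A fractional optimum of this program is exactly the kind of pseudocodeword that the dissertation classifies for cycle codes.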

Relevance: 20.00%

Abstract:

In this paper, we consider the problem of topology design for optical networks. We investigate the problem of selecting switching sites so as to minimize the total cost of the optical network. The cost of an optical network can be expressed as the sum of three main factors: the site cost, the link cost, and the switch cost. To the best of our knowledge, this problem has not been studied in the general form investigated in this paper. We present a mixed integer quadratic programming (MIQP) formulation of the problem to find the optimal value of the total network cost. We also present an efficient heuristic that approximates the solution in polynomial time. The experimental results show good performance of the heuristic: in experiments with 10-node networks, the total network cost computed by the heuristic stays within 2% to 21% of the optimal value, and in 51% of those experiments it is within 8% of the optimal value. We also discuss the insight gained from our experiments.
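As a toy illustration of the three-term cost decomposition (the cost data and the function below are hypothetical; the paper's actual MIQP formulation is not reproduced here):

```python
# Illustrative sketch of the cost model: total = site + link + switch cost.
def network_cost(sites, links, site_cost, link_cost, switch_cost):
    total = sum(site_cost[s] for s in sites)             # cost of opening each site
    total += sum(link_cost[(u, v)] for (u, v) in links)  # cost of each selected link
    total += sum(switch_cost[s] for s in sites)          # cost of the switch per site
    return total

# Example with made-up data: two sites connected by one link.
sites = {"A", "B"}
links = {("A", "B")}
print(network_cost(sites, links,
                   site_cost={"A": 10, "B": 12},
                   link_cost={("A", "B"): 5},
                   switch_cost={"A": 7, "B": 7}))
```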

Relevance: 20.00%

Abstract:

We use the QCD sum rules to study possible B_c-like molecular states. We consider isoscalar J^P = 0^+ and J^P = 1^+ D^(*)B^(*) molecular currents. We include the contributions of condensates up to dimension eight and work at leading order in α_s. For these states we obtain masses around 7 GeV.
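As background, a QCD sum rule calculation of this kind starts from the two-point correlation function of the chosen molecular current j(x); the generic form below is standard to the method (the paper's specific currents are not reproduced here):

```latex
% Generic starting point of a QCD sum rule calculation: the two-point
% function of the interpolating current j(x). The OPE side, truncated
% at dimension-eight condensates as stated in the abstract, is matched
% to the phenomenological side to extract the mass.
\Pi(q^2) = i \int d^4x \, e^{i q \cdot x}\,
  \langle 0 \,|\, T\{\, j(x)\, j^{\dagger}(0) \,\}\, |\, 0 \rangle
```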

Relevance: 20.00%

Abstract:

Objective: In chronic renal failure patients under hemodialysis (HD) treatment, simple, safe, and effective tools make it possible to assess body composition accurately despite the changes in body fluids that occur in dialysis therapy, thus contributing to the planning and monitoring of nutritional treatment. We evaluated the performance of bioelectrical impedance analysis (BIA) and the skinfold thickness sum (SKF) for assessing fat mass (FM) in chronic renal failure patients before (BHD) and after (AHD) HD, using air displacement plethysmography (ADP) as the reference method. Design: This single-center cross-sectional trial compared the FM of 60 HD patients (29 women, 31 men), estimated BHD and AHD by multifrequency BIA and by SKF, with that estimated by the reference method, ADP. Fat-free mass (FFM) was also obtained by subtracting total body fat from the individual's total weight. Results: Mean FM estimated by ADP BHD was 17.95 +/- 0.99 kg (30.11% +/- 1.30%), with a 95% confidence interval (CI) of 16.00 to 19.90 (27.56 to 32.66); mean FM estimated AHD was 17.92 +/- 1.11 kg (30.04% +/- 1.40%), with a 95% CI of 15.74 to 20.10 (27.28 to 32.79). In neither study period did the SKF estimates of FM and FFM (in both kg and %) differ from ADP; however, BIA underestimated FM and overestimated FFM (in both kg and %) compared with ADP. Conclusion: The SKF method, but not BIA, showed results similar to ADP and can be considered adequate for FM evaluation in HD patients.

Relevance: 20.00%

Abstract:

Aim: The study aimed to determine the value of postchemoradiation biopsies, performed after significant tumour downsizing following neoadjuvant therapy, in predicting complete tumour regression in patients with distal rectal cancer. Method: A retrospective comparative study was performed in patients with rectal cancer who achieved an incomplete clinical response after neoadjuvant chemoradiotherapy. Patients with significant tumour downsizing (> 30% reduction of the initial tumour size) were compared with controls (< 30% reduction of the initial tumour size). During flexible proctoscopy carried out postchemoradiation, biopsies were performed using 3-mm biopsy forceps. The biopsy results were compared with the histopathological findings of the resected specimen. UICC (Union for International Cancer Control) ypTNM classification, tumour differentiation, and regression grade were evaluated. The main outcome measures were the sensitivity, specificity, negative and positive predictive values, and accuracy of a simple forceps biopsy for predicting pathological response after neoadjuvant chemoradiotherapy. Results: Of the 172 patients, 112 were considered to have had an incomplete clinical response and were included in the study. Thirty-nine patients achieved significant tumour downsizing and underwent postchemoradiation biopsies; overall, 53 biopsies were carried out. Of these 39 patients, the biopsy result was positive in 25 and negative in 14. Only three of the patients with a negative biopsy result were found to have had a complete pathological response (giving a negative predictive value of 21%). Considering all biopsies performed, only three of 28 negative biopsies were true negatives, giving a negative predictive value of 11%. Conclusion: In patients with distal rectal cancer undergoing neoadjuvant chemoradiation, post-treatment biopsies are of limited clinical value in ruling out persisting cancer. A negative biopsy result after a near-complete clinical response should not be considered sufficient grounds for avoiding a radical resection.

Relevance: 20.00%

Abstract:

BACKGROUND AND OBJECTIVES: Medical ecology is a conceptual framework, introduced in 1961, that describes the relationship of a given population to health care services and its utilization of them. We applied this framework to individuals enrolled in a private health maintenance organization (HMO) in São Paulo, Brazil, with the aim of describing the utilization of primary health care, verifying the frequency of various symptoms, and identifying the roles of different sources of care. METHODS: This was a cross-sectional telephone survey of a random sample of people enrolled in a private HMO. We interviewed a random sample of non-pregnant adults over age 18, using 10 questions about symptoms and health care use during the month prior to the interview. RESULTS: The final sample consisted of 1,065 participants (mean age 68 years, 68% female). From this sample, 424 (39.8%) reported the presence of symptoms, 311 (29.2%) had a medical office consultation, 104 (9.8%) went directly to an emergency department, 63 (5.9%) were hospitalized, 22 (2.1%) used complementary medicine resources, seven (0.7%) were referred to home care, and one (0.1%) was admitted to an academic hospital. CONCLUSIONS: The proportion of study participants referred to an academic care center was similar to that observed in previous "medical ecology" studies in different populations.

Relevance: 20.00%

Abstract:

Studies of electoral fraud tend to focus their analyses only on the pre-electoral or electoral phases. By examining the Brazilian First Republic (1889-1930), this article shifts the focus to a later phase, discussing a particular type of electoral fraud that has been little explored by the literature, namely, that perpetrated by the legislatures themselves during the process of giving final approval to election results. The Brazilian case is interesting because of a practice known as degola ('beheading') whereby electoral results were altered when Congress decided on which deputies to certify as duly elected. This has come to be seen as a widespread and standard practice in this period. However, this article shows that this final phase of rubber-stamping or overturning election results was important not because of the number of degolas, which was actually much lower than the literature would have us believe, but chiefly because of their strategic use during moments of political uncertainty. It argues that the congressional certification of electoral results was deployed as a key tool in ensuring the political stability of the Republican regime in the absence of an electoral court.

Relevance: 20.00%

Abstract:

Abstract Background: Decreased heart rate variability (HRV) is related to higher morbidity and mortality. In this study we evaluated linear and nonlinear indices of HRV in stable angina patients submitted to coronary angiography. Methods: We studied 77 unselected patients referred for elective coronary angiography, who were divided into two groups: a coronary artery disease (CAD) group and a non-CAD group. For the analysis of HRV indices, HRV was recorded beat by beat with the volunteers in the supine position for 40 minutes. We analyzed the linear indices in the time domain (SDNN [standard deviation of normal-to-normal intervals], NN50 [total number of pairs of adjacent RR intervals differing by more than 50 ms], and RMSSD [root-mean square of successive differences]) and in the frequency domain: ultra-low frequency (ULF) ≤ 0.003 Hz, very low frequency (VLF) 0.003–0.04 Hz, low frequency (LF) 0.04–0.15 Hz, and high frequency (HF) 0.15–0.40 Hz, as well as the ratio between the LF and HF components (LF/HF). For the nonlinear indices, we evaluated SD1, SD2, SD1/SD2, approximate entropy (−ApEn), α1, α2, the Lyapunov exponent, the Hurst exponent, autocorrelation, and the correlation dimension. Cutoff points of the variables for the predictive tests were obtained from the receiver operating characteristic (ROC) curve. The area under the ROC curve was calculated by the extended trapezoidal rule, taking areas under the curve ≥ 0.650 as relevant. Results: Coronary artery disease patients presented reduced values of SDNN, RMSSD, NN50, HF, SD1, SD2, and −ApEn. HF ≤ 66 ms², RMSSD ≤ 23.9 ms, −ApEn ≤ −0.296, and NN50 ≤ 16 presented the best discriminatory power for the presence of significant coronary obstruction. Conclusion: We suggest the use of heart rate variability analysis, in both linear and nonlinear domains, for prognostic purposes in patients with stable angina pectoris, in view of their overall impairment.
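For reference, the time-domain indices named above can be computed directly from an RR-interval series using their standard definitions; the sketch below uses a made-up RR series, not the authors' recording pipeline.

```python
# Minimal sketch of the time-domain HRV indices (SDNN, RMSSD, NN50),
# computed from RR intervals given in milliseconds.
import numpy as np

def time_domain_hrv(rr_ms):
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)                     # successive RR differences
    sdnn = rr.std(ddof=1)                   # SD of all normal-to-normal intervals
    rmssd = np.sqrt(np.mean(diffs ** 2))    # root-mean-square of successive diffs
    nn50 = int(np.sum(np.abs(diffs) > 50))  # adjacent pairs differing by > 50 ms
    return {"SDNN": sdnn, "RMSSD": rmssd, "NN50": nn50}

rr_series = [812, 790, 845, 803, 798, 860, 795, 805]  # hypothetical data
print(time_domain_hrv(rr_series))
```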

Relevance: 20.00%

Abstract:

We use the QCD sum rules to study the recently observed charmonium-like structure Z_c^+(3900) as a tetraquark state. We evaluate the three-point function and extract the coupling constants of the Z_c^+ J/ψ π^+, Z_c^+ η_c ρ^+, and Z_c^+ D^+ D̄^{*0} vertices, and the corresponding decay widths in these channels. The results obtained are in good agreement with the experimental data and support the tetraquark picture of this state.
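For context, coupling constants in this kind of analysis are extracted from a three-point function of the interpolating currents; a generic form, standard to the method and not reproduced from the paper's specific currents, is:

```latex
% Generic three-point function for a vertex Z_c^+ h_1 h_2, with
% interpolating currents j_{h_1}, j_{h_2}, j_{Z_c} and momentum q = p - p'.
\Pi(p, p', q) = \int d^4x \, d^4y \;
  e^{i p' \cdot x}\, e^{i q \cdot y}\,
  \langle 0 \,|\, T\{\, j_{h_1}(x)\, j_{h_2}(y)\, j_{Z_c}^{\dagger}(0) \,\}\, |\, 0 \rangle
```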

Relevance: 20.00%

Abstract:

Approximate inverses of real nonsingular matrices, based on Frobenius norm minimization, are analyzed from a purely theoretical point of view. In this context, this paper provides several sufficient conditions that guarantee the possibility of improving (in the sense of the Frobenius norm) some given approximate inverses. Moreover, the optimal approximate inverses of a matrix A ∈ R^{n×n}, among all matrices belonging to certain subspaces of R^{n×n}, are obtained. In particular, a natural generalization of the classical normal equations of the system Ax = b is given when searching for approximate inverses N ≠ A^T such that AN is symmetric and ‖AN − I‖_F < ‖AA^T − I‖_F …
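Because ‖AN − I‖_F² decouples into a sum over the columns of N, such minimizations reduce to independent least-squares problems. The sketch below computes the Frobenius-optimal N over a prescribed sparsity pattern; the matrix and pattern are illustrative assumptions, and the paper's specific subspaces are not reproduced.

```python
# Minimal sketch: Frobenius-optimal approximate inverse over a sparsity
# pattern. Each column n_k minimizes ||A n_k - e_k||_2 on its allowed
# positions, so N minimizes ||AN - I||_F over the corresponding subspace.
import numpy as np

def frobenius_optimal_inverse(A, pattern):
    """pattern[k] = indices allowed to be nonzero in column k of N."""
    n = A.shape[0]
    N = np.zeros((n, n))
    for k in range(n):
        J = pattern[k]
        e_k = np.zeros(n)
        e_k[k] = 1.0
        x, *_ = np.linalg.lstsq(A[:, J], e_k, rcond=None)  # reduced LS problem
        N[J, k] = x
    return N

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
pattern = [[0, 1], [0, 1, 2], [1, 2]]   # hypothetical tridiagonal-like pattern
N = frobenius_optimal_inverse(A, pattern)
print(np.linalg.norm(A @ N - np.eye(3), "fro"))
```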

Relevance: 20.00%

Abstract:

The classical optimal (in the Frobenius sense) diagonal preconditioner for large sparse linear systems Ax = b is generalized and improved. The new proposed approximate inverse preconditioner N is based on the minimization of the Frobenius norm of the residual matrix AM − I, where M runs over a certain linear subspace of n × n real matrices defined by a prescribed sparsity pattern. The number of nonzero entries of the n × n preconditioning matrix N is at most 2n, and n of them are selected as the optimal positions, one in each of the n columns of N. All theoretical results are justified in detail …
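As a point of reference for what is being generalized, the classical Frobenius-optimal diagonal preconditioner has a closed form, since ‖AD − I‖_F² decouples column by column into d_k = A_kk / ‖A_{:,k}‖_2². A minimal sketch with an illustrative matrix:

```python
# Minimal sketch of the classical diagonal preconditioner: D = diag(d)
# minimizing ||AD - I||_F, with d_k = A[k, k] / ||A[:, k]||_2^2.
import numpy as np

def optimal_diagonal_preconditioner(A):
    col_norms_sq = np.sum(A * A, axis=0)   # ||A[:, k]||_2^2 for each column
    d = np.diag(A) / col_norms_sq          # column-wise minimizer
    return np.diag(d)

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
D = optimal_diagonal_preconditioner(A)
print(np.linalg.norm(A @ D - np.eye(3), "fro"),
      np.linalg.norm(A - np.eye(3), "fro"))  # preconditioned vs. unpreconditioned
```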

Relevance: 20.00%

Abstract:

Over the last 60 years, computers and software have enabled incredible advancements in every field. Nowadays, however, these systems are so complex that it is difficult to establish whether they meet some requirement or are able to exhibit some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to performing the conformance check, so as to identify any deviation from the desired behaviour as soon as possible and, where possible, apply corrections. The declarative framework that implements our approach, developed entirely on the promising open-source forward-chaining Production Rule System (PRS) named Drools, consists of three components: (1) a monitoring module based on a novel, efficient implementation of the Event Calculus (EC); (2) a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy, and rule-based reasoning; and (3) a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology makes it possible to reconcile any deviation from the desired behaviour as soon as it is detected. In conclusion, the proposed methodology advances the state of conformance checking, helping to close the gap between humans and increasingly complex technology.
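To give a flavour of the Event Calculus underlying the monitoring module, here is a generic toy in Python; the fluents, events, and trace are invented, and the dissertation's actual implementation is in Drools, not Python.

```python
# Minimal Event Calculus sketch: fluents are initiated/terminated by
# timestamped events, and holds_at asks whether a fluent holds at time t.
INITIATES = {"login": ["session_open"]}    # event -> fluents it initiates
TERMINATES = {"logout": ["session_open"]}  # event -> fluents it terminates

def holds_at(fluent, t, trace):
    """trace: list of (time, event) pairs, assumed sorted by time."""
    state = False
    for time, event in trace:
        if time >= t:
            break                          # only events strictly before t count
        if fluent in INITIATES.get(event, []):
            state = True
        if fluent in TERMINATES.get(event, []):
            state = False
    return state

trace = [(1, "login"), (5, "logout")]      # hypothetical event trace
print(holds_at("session_open", 3, trace))  # True: initiated, not yet terminated
print(holds_at("session_open", 6, trace))  # False: terminated at t = 5
```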