8 results for q-Analysis

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 70.00%

Abstract:

This paper establishes the spawning habitat of the Brazilian sardine Sardinella brasiliensis and investigates the spatial variability of egg density and its relation with oceanographic conditions on the shelf of the south-east Brazil Bight (SBB). The spawning habitats of S. brasiliensis were defined in terms of spatial models of egg density, temperature-salinity plots, quotient (Q) analysis and remote sensing data. Quotient curves (QC) were constructed using the geographic distribution of egg density, temperature and salinity from samples collected during nine survey cruises between 1976 and 1993. The interannual sea surface temperature (SST) variability was determined using principal component analysis of the SST anomalies (SSTA) estimated from remote sensing data over the period between 1985 and 2007. The spatial pattern of egg occurrence in the SBB indicated that the largest concentration occurred between Paranagua and Sao Sebastiao. The spawning habitat expanded and contracted over the years, fluctuating around Paranagua. In January 1978 and January 1993, eggs were found nearly everywhere along the inner shelf of the SBB, whereas in January 1988 and 1991 spawning had contracted to its southernmost position. The SSTA maps for the spawning periods showed that in the case of habitat expansion (1993 only) anomalies over the SBB were zero or slightly negative, whereas for the contraction periods anomalies were all positive. Sardinella brasiliensis is capable of exploiting suitable spawning sites provided by the entrainment of the colder, less-saline South Atlantic Central Water onto the shelf by means of both coastal wind-driven (to the north-east of the SBB) and meander-induced (to the south-west of the SBB) upwelling.
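A quotient curve of the kind described above can be reproduced with a few lines of code: the frequency distribution of egg abundance across bins of an environmental variable is divided by the frequency distribution of the sampled stations, and quotient values above 1 indicate conditions under which spawning occurs more often than expected from sampling effort alone. The sketch below is a minimal illustration under that reading; the variable names, bin edges, and synthetic data are hypothetical and not taken from the study.

```python
import numpy as np

def quotient_curve(env_values, egg_density, bins):
    """Quotient (Q) analysis: ratio of the egg-abundance distribution to the
    sampling-effort distribution across bins of an environmental variable.
    Q > 1 suggests preferred spawning conditions; Q < 1 suggests avoidance."""
    env_values = np.asarray(env_values, dtype=float)
    egg_density = np.asarray(egg_density, dtype=float)

    # Fraction of total egg abundance falling in each environmental bin.
    egg_sum, edges = np.histogram(env_values, bins=bins, weights=egg_density)
    egg_frac = egg_sum / egg_sum.sum()

    # Fraction of sampled stations falling in each bin (sampling effort).
    station_count, _ = np.histogram(env_values, bins=edges)
    station_frac = station_count / station_count.sum()

    with np.errstate(divide="ignore", invalid="ignore"):
        q = np.where(station_frac > 0, egg_frac / station_frac, np.nan)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, q

# Hypothetical example: SST (degrees C) and egg density at sampled stations.
rng = np.random.default_rng(0)
sst = rng.uniform(20.0, 28.0, size=200)
eggs = np.exp(-0.5 * ((sst - 23.0) / 1.0) ** 2) * rng.lognormal(0.0, 0.5, 200)
centers, q = quotient_curve(sst, eggs, bins=np.arange(20.0, 28.5, 0.5))
for c, v in zip(centers, q):
    print(f"SST {c:4.1f} C  Q = {v:.2f}")
```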

Relevance: 40.00%

Abstract:

This paper compares the effectiveness of the Tsallis entropy with that of the classic Boltzmann-Gibbs-Shannon entropy for general pattern recognition, and proposes a multi-q approach to improve pattern analysis using entropy. A series of experiments was carried out for the problem of classifying image patterns. Given a dataset of 40 pattern classes, the goal of our image case study is to assess how well the different entropies can be used to determine the class of a newly given image sample. Our experiments show that the Tsallis entropy, using the proposed multi-q approach, has great advantages over the Boltzmann-Gibbs-Shannon entropy for pattern classification, boosting image recognition rates by a factor of 3. We discuss the reasons behind this success, shedding light on the usefulness of the Tsallis entropy and the multi-q approach.
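The Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) reduces to the Boltzmann-Gibbs-Shannon entropy in the limit q -> 1, and a "multi-q" descriptor can be built simply by evaluating it over a range of q values and using the resulting vector as a pattern feature. The sketch below illustrates that idea on a grey-level histogram; the q range and the use of a normalized histogram are assumptions for illustration, not the exact protocol of the paper.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1); tends to the
    Shannon entropy -sum_i p_i ln p_i as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def multi_q_features(image, q_values, n_bins=256):
    """Feature vector of Tsallis entropies of the grey-level histogram,
    one entry per q (a 'multi-q' descriptor)."""
    hist, _ = np.histogram(image.ravel(), bins=n_bins, range=(0, 256))
    p = hist / hist.sum()
    return np.array([tsallis_entropy(p, q) for q in q_values])

# Hypothetical usage on a synthetic 8-bit image.
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64))
qs = np.linspace(0.1, 3.0, 30)           # assumed range of q values
print(multi_q_features(img, qs))
```

In a classification setting, such a vector would feed an ordinary classifier, with the choice of q values acting as an additional tuning dimension.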

Relevance: 30.00%

Abstract:

Pulse repetition rate and the number of laser pulses are among the most important parameters affecting the analysis of solid materials by laser-induced breakdown spectroscopy, and knowledge of their effects is of fundamental importance for devising analytical strategies for the laser ablation of polymers. In this contribution, the influence of these parameters on the ablated mass and on the crater features was evaluated in polypropylene and high-density polyethylene plates containing a PbCrO4-based pigment. Surface characterization and crater profiling were carried out by profilometry and scanning electron microscopy. Crater area, volume and profile were obtained using the Taylor Map software. The laser-induced breakdown spectroscopy system consisted of a Q-switched Nd:YAG laser (1064 nm, 5 ns) and an Echelle spectrometer equipped with an ICCD detector. The evaluated operating conditions were 10, 25 and 50 laser pulses at 1, 5 and 10 Hz, 250 mJ per pulse (85 J cm⁻²), 2 µs delay time and 6 µs integration time gate. Differences in the topographical features of the craters were observed between the two polymers. Decreasing the repetition rate resulted in irregular craters and the formation of edges, especially in the polypropylene sample. The differences in topographical features and ablated masses were attributed to the influence of the degree of crystallinity, the crystalline melting temperature and the glass transition temperature on the ablation process of high-density polyethylene and polypropylene. It was also observed that the intensities of the chromium and lead emission signals obtained at 10 Hz were two times higher than those at 5 Hz when the number of laser pulses was kept constant.
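As a quick consistency check on the reported conditions, the 85 J cm⁻² fluence together with the 250 mJ pulse energy fixes the irradiated spot size through fluence = pulse energy / spot area. The short calculation below makes that relation explicit; the circular, flat-top beam profile is our assumption and is not stated in the abstract.

```python
import math

pulse_energy_j = 0.250      # 250 mJ per pulse (from the abstract)
fluence_j_cm2 = 85.0        # 85 J cm^-2 (from the abstract)

# Fluence = energy / area, so the implied spot area and diameter are:
area_cm2 = pulse_energy_j / fluence_j_cm2
diameter_um = 2.0 * math.sqrt(area_cm2 / math.pi) * 1.0e4

print(f"Implied spot area:     {area_cm2:.2e} cm^2")
print(f"Implied spot diameter: {diameter_um:.0f} um (flat-top beam assumed)")
# Roughly 0.6 mm, a plausible crater scale for the profilometry described.
```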

Relevance: 30.00%

Abstract:

Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on the generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the parameter q (qSampEn). qSDiff curves, consisting of the differences between the qSampEn of the original and surrogate series, were then calculated. We evaluated qSDiff for 125 real heart rate variability (HRV) recordings, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series from stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiff_max) at q ≠ 1. The values of q at which the maximum occurs and at which qSDiff is zero were also evaluated. Only the qSDiff_max values were capable of distinguishing the HRV groups (p-values of 5.10 × 10⁻³, 1.11 × 10⁻⁷, and 5.50 × 10⁻⁷ for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity, which suggests a potential use for chaotic system analysis. [http://dx.doi.org/10.1063/1.4758815]
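One plausible reading of the q-generalized sample entropy described above is to keep the standard SampEn construction, counting template matches of length m and m+1 within tolerance r, but to replace the natural logarithm by the Tsallis q-logarithm ln_q(x) = (x^(1-q) - 1)/(1 - q), which recovers ordinary SampEn at q = 1. The sketch below follows that reading; it illustrates the idea rather than the authors' exact implementation, and the surrogate step needed for qSDiff is reduced to the simplest possible choice (shuffling), with the sign convention of the difference assumed.

```python
import numpy as np

def q_log(x, q):
    """Tsallis q-logarithm; reduces to the natural log as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return float(np.log(x))
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_sample_entropy(x, q, m=2, r_factor=0.2):
    """qSampEn = -ln_q(A / B), where B counts pairs of length-m templates
    within tolerance r (Chebyshev distance) and A counts pairs of length m+1."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    if a == 0 or b == 0:
        return np.inf
    return float(-q_log(a / b, q))

# Hypothetical qSDiff value at one q: shuffled surrogate minus original
# (sign convention assumed for illustration).
rng = np.random.default_rng(2)
rr = np.cumsum(rng.normal(0.0, 1.0, 500))      # toy "RR interval" series
surrogate = rng.permutation(rr)                # simplest surrogate: shuffling
q = 1.5
print(q_sample_entropy(surrogate, q) - q_sample_entropy(rr, q))
```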

Relevance: 30.00%

Abstract:

A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC_max. The output of GC_max coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC_max is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst-case scenario, the GC_max algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC_max runs in linear time with respect to the image size |C|. We show that the output of GC_max constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F_P‖∞ of the map F_P that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings the IRFC algorithms into the realm of graph cut energy minimizers, with energy functions ‖F_P‖_q for q ∈ [1,∞]. Of these, the best-known minimization problem is for the energy ‖F_P‖_1, which is solved by the classic min-cut/max-flow algorithm, often referred to as the Graph Cut algorithm. We notice that the minimization problem for ‖F_P‖_q, q ∈ [1,∞), is identical to that for ‖F_P‖_1 when the original weight function w is replaced by w^q. Thus, any algorithm GC_sum solving the ‖F_P‖_1 minimization problem also solves the one for ‖F_P‖_q with q ∈ [1,∞), so just two algorithms, GC_sum and GC_max, are enough to solve all the ‖F_P‖_q minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F_P‖_q minimization problems converge to a solution of the ‖F_P‖∞ minimization problem (the fact that ‖F_P‖∞ = lim_{q→∞} ‖F_P‖_q is not enough to deduce this). An experimental comparison of the performance of the GC_max and GC_sum algorithms is included. It concentrates on comparing the actual (as opposed to provable worst-case) running times of the algorithms, as well as the influence of the choice of seeds on the output.
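The observation that the ‖F_P‖_q problem reduces to the ‖F_P‖_1 problem with weights w^q can be illustrated with any off-the-shelf min-cut solver: raising every edge weight to the power q and running the standard max-flow/min-cut routine yields a minimizer of the ℓ_q energy. The toy sketch below does exactly that using networkx; the graph, seeds, and weights are made up for illustration and have nothing to do with the paper's experiments.

```python
import networkx as nx

def gc_sum_lq(edge_list, source, sink, q):
    """Solve the l_q graph cut energy minimization (q in [1, inf)) by running
    the standard min-cut/max-flow (GC_sum) on weights raised to the power q.
    The returned cut value is sum of w(e)^q over the cut, i.e. ||F_P||_q^q."""
    g = nx.DiGraph()
    for u, v, w in edge_list:
        cap = float(w) ** q
        g.add_edge(u, v, capacity=cap)   # undirected weight, so add
        g.add_edge(v, u, capacity=cap)   # both directions
    cut_value, (side_s, side_t) = nx.minimum_cut(g, source, sink)
    return cut_value, side_s, side_t

# Tiny illustrative graph: (node, node, weight) triples; 's' and 't' act as
# the object/background seeds.
edges = [
    ("s", "a", 3.0), ("s", "b", 1.0),
    ("a", "b", 2.0), ("a", "t", 1.0), ("b", "t", 3.0),
]

for q in (1, 2, 4):
    value, obj, bkg = gc_sum_lq(edges, "s", "t", q)
    print(f"q = {q}: cut value = {value:.1f}, object side = {sorted(obj)}")
# For larger q the optimal cut avoids the heaviest edges, mirroring the
# l_inf behaviour that GC_max optimizes directly.
```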

Relevance: 30.00%

Abstract:

Introduction. Endomyocardial biopsy (EMB) plays an important role in allograft surveillance to screen for acute rejection episodes after heart transplantation (HT), to diagnose cardiomyopathies (CMP) of unknown cause, or to reveal a cardiac tumor. However, the procedure is not risk free. Objective. The main objective of this research was to describe our experience with EMB over the last 33 years, comparing procedural risk between HT and non-HT patients. Method. We retrospectively analyzed the data of 5347 EMBs performed from 1978 to 2011 (33 years). Of these, 3564 (66.7%) were performed for surveillance of acute rejection episodes after HT, 1777 (33.2%) for CMP diagnosis, and 6 (0.1%) for cardiac tumor identification. Results. The main complications of EMB were divided into 2 groups to facilitate analysis: major complications associated with potential death risk, and minor complications. The variables that showed a significant difference in the HT group were tricuspid injury (P = .0490) and coronary fistula (P < .0001). Among the non-HT cohort they were insufficient fragment, major complications, and total complications (all P < .0001). Conclusions. EMB can be accomplished with a low risk of complications and high effectiveness in diagnosing CMP and rejection after HT. However, the risk is greater among patients with CMP because of their anatomic characteristics. Children also constitute a risk group for EMB due to their small size in addition to the underlying heart disease. The risk of injury to the tricuspid valve was higher in the HT group.

Relevance: 30.00%

Abstract:

Background: Acute respiratory distress syndrome (ARDS) is associated with high in-hospital mortality. Alveolar recruitment followed by ventilation at optimal titrated PEEP may reduce ventilator-induced lung injury and improve oxygenation in patients with ARDS, but the effects on mortality and other clinical outcomes remain unknown. This article reports the rationale, study design, and analysis plan of the Alveolar Recruitment for ARDS Trial (ART). Methods/Design: ART is a pragmatic, multicenter, randomized (concealed), controlled trial that aims to determine whether maximum stepwise alveolar recruitment associated with PEEP titration is able to increase 28-day survival in patients with ARDS compared to conventional treatment (the ARDSNet strategy). We will enroll adult patients with ARDS of less than 72 h duration. The intervention group will receive an alveolar recruitment maneuver, with stepwise increases of PEEP up to 45 cmH2O and a peak pressure of 60 cmH2O, followed by ventilation with optimal PEEP titrated according to the static compliance of the respiratory system. In the control group, mechanical ventilation will follow the conventional protocol (ARDSNet). In both groups, we will use volume-controlled mode with low tidal volumes (4 to 6 mL/kg of predicted body weight) and a target plateau pressure of <= 30 cmH2O. The primary outcome is 28-day survival, and the secondary outcomes are: length of ICU stay; length of hospital stay; pneumothorax requiring chest tube during the first 7 days; barotrauma during the first 7 days; mechanical ventilation-free days from day 1 to day 28; and ICU, in-hospital, and 6-month survival. ART is an event-guided trial planned to last until 520 events (deaths within 28 days) are observed. This number of events allows detection of a hazard ratio of 0.75, with 90% power and a two-tailed type I error of 5%. All analyses will follow the intention-to-treat principle. Discussion: If the ART strategy with maximum recruitment and PEEP titration improves 28-day survival, this will represent a notable advance in the care of ARDS patients. Conversely, if the ART strategy is similar or inferior to the current evidence-based strategy (ARDSNet), this should also change current practice, as many institutions routinely employ recruitment maneuvers and set PEEP levels according to some titration method.
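The stated event target can be roughly reproduced with Schoenfeld's approximation for the number of events needed by a two-group log-rank comparison, D = (z_{1-alpha/2} + z_{1-beta})^2 / (p1 p2 (ln HR)^2), assuming 1:1 allocation. The check below is ours, not part of the trial protocol; it gives about 508 events, in the same range as the 520 planned, with the difference presumably reflecting additional design adjustments.

```python
from math import log
from scipy.stats import norm

hr = 0.75          # target hazard ratio (from the abstract)
alpha = 0.05       # two-tailed type I error
power = 0.90       # 90% power
p1 = p2 = 0.5      # 1:1 allocation between ART and ARDSNet arms (assumed)

# Schoenfeld approximation for the required number of events (deaths).
z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
events = z ** 2 / (p1 * p2 * log(hr) ** 2)
print(f"Approximate events required: {events:.0f}")   # ~508
```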

Relevance: 30.00%

Abstract:

The effects of laser focusing and fluence on the LIBS analysis of pellets of plant leaves were evaluated. A Q-switched Nd:YAG laser (5 ns, 10 Hz, 1064 nm) was used, and the emission signals were collected by lenses into an optical fiber coupled to a spectrometer with Echelle optics and an ICCD. Data were acquired from the accumulation of 20 laser pulses at 2.0 µs delay and 5.0 µs integration time gate. The emission signal intensities increased with both laser fluence and spot size. Higher sensitivities for Ca, K, Mg, P, Al, B, Cu, Fe, Mn, and Zn determinations were observed for fluences in the range from 25 to 60 J cm⁻². Coefficients of variation of site-to-site measurements were generally lower than 10% (n = 30 sites, 20 laser pulses per site) for a fluence of 50 J cm⁻² and a 750 µm spot size. For most elements, there is an indication that accuracy improves at higher fluences.
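The interplay of focusing and fluence reported above follows from fluence = pulse energy / spot area, so defocusing to a larger spot lowers the fluence as 1/d². The short sweep below makes that scaling explicit; the circular, flat-top beam and the sweep of spot diameters are our assumptions, and the pulse energy is only the value implied by the reported 50 J cm⁻² at a 750 µm spot, not a figure from the abstract.

```python
import math

def fluence_j_cm2(pulse_energy_j, spot_diameter_um):
    """Fluence for a circular, flat-top spot: energy divided by spot area."""
    radius_cm = 0.5 * spot_diameter_um * 1.0e-4
    return pulse_energy_j / (math.pi * radius_cm ** 2)

# Pulse energy implied by the reported 50 J cm^-2 at a 750 um spot
# (flat-top beam assumed; the abstract does not state the pulse energy).
area_cm2 = math.pi * (0.5 * 750e-4) ** 2
energy_j = 50.0 * area_cm2
print(f"Implied pulse energy: {energy_j * 1e3:.0f} mJ")

# Fluence scales as 1/d^2, which is how changing the focus (spot size) can
# span roughly the 25-60 J cm^-2 range studied.
for d_um in (600, 750, 900, 1050):
    print(f"spot {d_um:4d} um -> fluence {fluence_j_cm2(energy_j, d_um):5.1f} J cm^-2")
```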