80 results for Optimal time delay
Abstract:
Background: Although clinical diabetes mellitus is clearly a high risk factor for myocardial infarction (MI), experimental studies disagree about the sensitivity of the infarcted myocardium to ischemic injury. Recently, our group demonstrated that diabetic animals presented better cardiac function recovery and cellular resistance to ischemic injury than nondiabetics. In the present study, we evaluated the chronic effects of MI on left ventricular (LV) and autonomic functions in streptozotocin (STZ) diabetic rats. Methods: Male Wistar rats were divided into 4 groups: control (C, n = 15), diabetes (D, n = 16), MI (I, n = 21), and diabetes + MI (DI, n = 30). MI was induced 15 days after diabetes (STZ) induction. Ninety days after MI, LV and autonomic functions were evaluated (8 animals per group). Left ventricular homogenates were analyzed by Western blotting to evaluate the expression of calcium handling proteins. Results: MI area was similar in the infarcted groups (approximately 43%). Ejection fraction and +dP/dt were reduced in I compared with DI. End-diastolic pressure was further increased in I compared with DI. Compared with DI, I had increased Na(+)-Ca(2+) exchanger and phospholamban expression (164%) and decreased expression of phospholamban phosphorylated at serine(16) (65%) and threonine(17) (70%). Nevertheless, the diabetic groups had greater autonomic dysfunction, as shown by reductions in baroreflex sensitivity and pulse interval variability. Consequently, the mortality rate was increased in DI compared with the I, D, and C groups. Conclusions: LV dysfunction in diabetic animals was attenuated 90 days after myocardial infarction and was associated with a better profile of calcium handling proteins. However, this positive adaptation was not able to reduce the mortality rate of DI animals, suggesting that autonomic dysfunction is associated with increased mortality in this group. Therefore, it is possible that the improved cardiac function was transitory, and the autonomic dysfunction, more prominent in the diabetic groups, may eventually lead to cardiovascular damage.
Abstract:
The structural engineering community in Brazil faces new challenges with the recent occurrence of high intensity tornados. Satellite surveillance data show that the area covering the south-east of Brazil, Uruguay and part of Argentina is one of the world's most tornado-prone areas, second only to the infamous tornado alley in the central United States. The design of structures subject to tornado winds is a typical example of decision making in the presence of uncertainty. Structural design involves finding a good balance between the competing goals of safety and economy. This paper presents a methodology to find the optimum balance between these goals in the presence of uncertainty. In this paper, reliability-based risk optimization is used to find the optimal safety coefficient that minimizes the total expected cost of a steel frame communications tower subject to extreme storm and tornado wind loads. The technique is not new, but it is applied to a practical problem of increasing interest to Brazilian structural engineers. The problem is formulated in the partial safety factor format used in current design codes, with an additional partial factor introduced to serve as the optimization variable. The expected cost of failure (or risk) is defined as the product of a limit state exceedance probability and the corresponding limit state exceedance cost. These costs include the costs of repair, rebuilding, and compensation for injury and loss of life. The total expected failure cost is the sum of the individual expected costs over all failure modes. The steel frame communications tower studied here has become very common in Brazil due to increasing mobile phone coverage. The study shows that the optimum reliability is strongly dependent on the cost (or consequences) of failure. Since failure consequences depend on the actual tower location, it turns out that different optimum designs should be used in different locations. Failure consequences are also different for the different parties involved in the design, construction and operation of the tower. Hence, it is important that risk is well understood by the parties involved, so that proper contracts can be made. The investigation shows that when non-structural terms dominate design costs (e.g., in residential or office buildings) it is not too costly to over-design; this observation is in agreement with the observed practice for non-optimized structural systems. In this situation, it is much easier to lose money by under-design. When structural material cost is a significant part of the design cost (e.g., a concrete dam or bridge), one is likely to lose significant money by over-design. In this situation, a cost-risk-benefit optimization analysis is highly recommended. Finally, the study also shows that under time-varying loads such as tornados, the optimum reliability is strongly dependent on the selected design life.
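As an illustration of the cost model summarized above, the total expected cost being minimized can be written schematically as follows; the symbols (an additional partial safety factor lambda, per-mode failure probabilities and failure costs) are assumed notation consistent with the abstract, not necessarily the paper's own.

```latex
% Schematic objective for the reliability-based risk optimization;
% symbol names are illustrative assumptions, not the paper's notation.
\begin{aligned}
  C_{\mathrm{total}}(\lambda) &= C_{\mathrm{construction}}(\lambda)
      + \sum_{i=1}^{n_{\mathrm{modes}}} P_{f,i}(\lambda)\, C_{f,i}, \\
  \lambda^{*} &= \operatorname*{arg\,min}_{\lambda}\; C_{\mathrm{total}}(\lambda).
\end{aligned}
```

Here P_{f,i} is the exceedance probability of the i-th limit state, C_{f,i} its exceedance cost (repair, rebuilding, compensation), and lambda the additional partial safety factor used as the optimization variable; the dependence of the optimum on C_{f,i} is what makes the optimal design location-dependent.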
Abstract:
An extension of the uniform invariance principle for ordinary differential equations with finite delay is developed. The uniform invariance principle allows the derivative of the auxiliary scalar function V to be positive in some bounded sets of the state space, while the classical invariance principle assumes that V̇ ≤ 0. As a consequence, the uniform invariance principle can deal with a larger class of problems. The main difficulty in proving an invariance principle for functional differential equations is the fact that flows are defined on an infinite dimensional space and, in such spaces, bounded solutions may not be precompact. This difficulty is overcome by requiring that the vector field maps bounded sets into bounded sets.
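A schematic contrast of the two assumptions mentioned above (notation assumed; the precise hypotheses and the resulting attraction estimates are in the paper):

```latex
% Classical invariance principle: the auxiliary function is nonincreasing
% along solutions,
\dot{V}(x(t)) \le 0 ;
% Uniform invariance principle: \dot{V} may be positive, but only inside a
% bounded set C of the state space,
\dot{V}(x(t)) \le 0 \quad \text{outside a bounded set } C,
% with the invariance and attraction estimates then expressed through level
% sets of V that contain C.
```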
Abstract:
Carrying information about the microstructure and stress behaviour of ferromagnetic steels, magnetic Barkhausen noise (MBN) has been used as a basis for effective non-destructive testing methods, opening new areas in industrial applications. One of the factors that determines the quality and reliability of the MBN analysis is the way information is extracted from the signal. Commonly, simple scalar parameters are used to characterize the information content, such as amplitude maxima and the signal root mean square. This paper presents a new approach based on time-frequency analysis. The experimental test case concerns the use of MBN signals to characterize hardness gradients in an AISI 4140 steel. To that purpose, different time-frequency representations (TFRs) and time-scale representations (TSRs) are assessed, such as the spectrogram, the Wigner-Ville distribution, the Capongram, the ARgram obtained from an autoregressive model, the scalogram, and the Mellingram obtained from a Mellin transform. It is shown that, due to the nonstationary characteristics of the MBN, TFRs can provide a rich and new panorama of these signals. Techniques for extracting time-frequency parameters are then used to support a diagnostic process. Comparison with results obtained by the classical method highlights the improvement in diagnosis provided by the proposed method.
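As a hedged illustration of the kind of time-frequency processing listed above, the sketch below computes just one of the representations (the spectrogram, via scipy) for a synthetic burst-like signal rather than real MBN data, and extracts a simple per-slice descriptor; the sampling rate, burst model and parameters are arbitrary assumptions.

```python
# Minimal sketch (not the paper's implementation): spectrogram of a synthetic,
# nonstationary Barkhausen-like burst signal, plus a simple scalar descriptor
# per time slice (the spectral centroid) that could feed a diagnosis step.
import numpy as np
from scipy.signal import spectrogram

fs = 200_000                          # assumed sampling rate [Hz]
t = np.arange(0, 0.05, 1 / fs)

# Synthetic MBN-like signal: random bursts with exponentially decaying envelopes.
rng = np.random.default_rng(0)
signal = np.zeros_like(t)
for start in rng.uniform(0, t[-1], size=40):
    idx = t >= start
    signal[idx] += rng.normal(size=idx.sum()) * np.exp(-(t[idx] - start) * 2_000)

f, tau, Sxx = spectrogram(signal, fs=fs, nperseg=256, noverlap=192)

# Spectral centroid per time slice: one possible time-frequency parameter.
centroid = (f[:, None] * Sxx).sum(axis=0) / Sxx.sum(axis=0)
print(centroid[:5])
```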
Abstract:
A susceptible-infective-recovered (SIR) epidemiological model based on a probabilistic cellular automaton (PCA) is employed for simulating the temporal evolution of the registered cases of chickenpox in Arizona, USA, between 1994 and 2004. At each time step, every individual is in one of the states S, I, or R. The parameters of this model are the probabilities of each individual (each cell forming the PCA lattice) passing from one state to another. Here, the values of these probabilities are identified by using a genetic algorithm. If nonrealistic values are allowed for the parameters, the predictions present better agreement with the historical series than if they are forced to take realistic values. A discussion of how the size of the PCA lattice affects the quality of the model predictions is presented. Copyright (C) 2009 L. H. A. Monteiro et al.
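A minimal sketch of one synchronous update of an SIR probabilistic cellular automaton of the kind described above; the lattice size, neighborhood, and probability names are illustrative assumptions, and the genetic-algorithm identification step is not shown.

```python
# Sketch of one PCA update: S->I depending on infected neighbors, I->R, R->S.
# Probabilities (p_inf, p_rec, p_ren) and the neighborhood are assumptions,
# not the parameters identified in the paper.
import numpy as np

S, I, R = 0, 1, 2
rng = np.random.default_rng(1)
lattice = rng.choice([S, I], size=(100, 100), p=[0.99, 0.01])

def step(lat, p_inf=0.2, p_rec=0.3, p_ren=0.01):
    """One synchronous PCA update with periodic boundaries."""
    infected = (lat == I).astype(int)
    # Count infected neighbors (von Neumann neighborhood).
    neigh = sum(np.roll(infected, shift, axis)
                for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)])
    u = rng.random(lat.shape)
    new = lat.copy()
    # Probability of escaping infection is (1 - p_inf) per infected neighbor.
    new[(lat == S) & (u < 1 - (1 - p_inf) ** neigh)] = I
    new[(lat == I) & (u < p_rec)] = R
    new[(lat == R) & (u < p_ren)] = S   # renewal (births / loss of immunity)
    return new

for _ in range(52):                      # e.g. one year of weekly steps
    lattice = step(lattice)
print("infected fraction:", (lattice == I).mean())
```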
Abstract:
This paper deals with the long run average continuous control problem of piecewise deterministic Markov processes (PDMPs) taking values in a general Borel space and with compact action space depending on the state variable. The control variable acts on the jump rate and transition measure of the PDMP, and the running and boundary costs are assumed to be positive but not necessarily bounded. Our first main result is to obtain an optimality equation for the long run average cost in terms of a discrete-time optimality equation related to the embedded Markov chain given by the postjump location of the PDMP. Our second main result guarantees the existence of a feedback measurable selector for the discrete-time optimality equation by establishing a connection between this equation and an integro-differential equation. Our final main result is to obtain some sufficient conditions for the existence of a solution for a discrete-time optimality inequality and an ordinary optimal feedback control for the long run average cost using the so-called vanishing discount approach. Two examples are presented illustrating the possible applications of the results developed in the paper.
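For orientation, the long run average cost referred to above can be written schematically as follows; the notation (f for the running cost, r for the boundary cost, T_k for the boundary jump times) is assumed and not necessarily the paper's.

```latex
J(u) \;=\; \limsup_{T \to \infty} \frac{1}{T}\,
  \mathbb{E}^{u}\!\left[ \int_{0}^{T} f\bigl(X_{s}, u_{s}\bigr)\, ds
  \;+\; \sum_{k:\, T_{k} \le T} r\bigl(X_{T_{k}^{-}}, u_{T_{k}}\bigr) \right].
```

The discrete-time optimality equation mentioned in the abstract is then posed for the embedded Markov chain of post-jump locations.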
Abstract:
Over the last couple of decades, many methods for synchronizing chaotic systems have been proposed with communications applications in view. Yet their performance has proved disappointing in the face of the nonideal character of usual channels linking transmitter and receiver, that is, due to both noise and signal propagation distortion. Here we consider a discrete-time master-slave system that synchronizes despite channel bandwidth limitations, and an associated communication system. Synchronization is achieved by introducing a digital filter that limits the spectral content of the feedback loop responsible for producing the transmitted signal. Copyright (C) 2009 Marcio Eisencraft et al.
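A structural sketch, under stated assumptions, of the idea above: an FIR low-pass filter sits inside the feedback loop that produces the transmitted signal, so the transmitted signal is band-limited, and a receiver applying the same map to the received signal reproduces the loop signal. The map, filter coefficients and initial condition are arbitrary illustrative choices; this is not the system analyzed in the paper.

```python
# Structural sketch only: band-limited feedback loop plus a slave driven by
# the transmitted signal.  Because the slave applies the same map to the same
# received samples, it reproduces the master loop signal exactly here; the
# system in the paper is more elaborate.
import numpy as np

def skew_tent(x, a=0.7):
    """Simple piecewise-linear map used as the loop nonlinearity (assumed)."""
    return x / a if x < a else (1 - x) / (1 - a)

h = np.array([0.25, 0.5, 0.25])        # FIR filter limiting the loop spectrum
N = 2000
x = np.zeros(N); x[:3] = 0.41          # master loop signal
s = np.zeros(N)                        # transmitted (band-limited) signal
x_hat = np.zeros(N)                    # slave estimate of the loop signal

for n in range(3, N):
    s[n - 1] = h @ x[n - 3:n][::-1]    # filter output = transmitted sample
    x[n] = skew_tent(s[n - 1])         # master feedback through the map
    x_hat[n] = skew_tent(s[n - 1])     # slave driven by the received signal

print("sync error:", np.max(np.abs(x[3:] - x_hat[3:])))
```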
Abstract:
To obtain accurate and reliable gene expression results, it is essential that quantitative real-time RT-PCR (qRT-PCR) data are normalized with appropriate reference genes. The current exponential increase in postgenomic studies on the honey bee, Apis mellifera, makes the standardization of qRT-PCR results an important task for ongoing community efforts. To this end we selected four candidate reference genes (actin, ribosomal protein 49, elongation factor 1-alpha, tbp-association factor) and used three software-based approaches (geNorm, BestKeeper and NormFinder) to evaluate the suitability of these genes as endogenous controls. Their expression was examined during honey bee development, in different tissues, and after juvenile hormone exposure. Furthermore, the importance of choosing an appropriate reference gene was investigated for two developmentally regulated target genes. The results led us to consider all four candidate genes as suitable for normalization in A. mellifera. However, each condition evaluated in this study revealed a specific set of genes as the most appropriate ones.
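As a hedged illustration of one of the three evaluation approaches mentioned above, the sketch below computes a geNorm-style stability measure M (the average standard deviation of pairwise log2 expression ratios) on fabricated toy data, not the honey bee measurements.

```python
# geNorm-style stability measure M; expression matrix is fabricated toy data.
import numpy as np

genes = ["actin", "rp49", "ef1-alpha", "tbp-af"]
# rows = samples, columns = candidate reference genes (relative quantities)
expr = np.array([
    [1.00, 0.95, 1.10, 1.30],
    [0.90, 1.00, 1.05, 1.60],
    [1.10, 1.05, 0.95, 0.80],
    [0.95, 0.90, 1.00, 1.20],
])

log_expr = np.log2(expr)
M = []
for j in range(len(genes)):
    # Standard deviation of log2 ratios of gene j against every other candidate.
    sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
           for k in range(len(genes)) if k != j]
    M.append(np.mean(sds))

for g, m in sorted(zip(genes, M), key=lambda t: t[1]):
    print(f"{g}: M = {m:.3f}")   # lower M = more stable candidate
```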
Abstract:
The aim of the present study was to investigate whether the perception of the presentation durations of pictures of different body postures is distorted as a function of the embodied movement that originally produced these postures. Participants were presented with two pictures, one with a low-arousal body posture judged to require no movement and the other with a high-arousal body posture judged to require considerable movement. In a temporal bisection task with two ranges of standard durations (0.4/1.6 s and 2/8 s), the participants had to judge whether the presentation duration of each of the pictures was more similar to the short or to the long standard duration. The results showed that the duration was judged longer for the posture requiring more movement than for the posture requiring less movement. However, the magnitude of this overestimation was relatively greater for the range of short durations than for that of longer durations. Further analyses suggest that this lengthening effect was mediated by an arousal effect of limited duration on the speed of the internal clock system.
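A purely illustrative pacemaker-accumulator calculation (all numbers assumed) of why an arousal effect of limited duration would produce a proportionally larger lengthening for short durations than for long ones, as reported above:

```python
# Illustrative arithmetic only, not from the paper: the clock rate is raised
# for a limited time tau, so the relative lengthening shrinks as the timed
# duration grows.  All numbers are arbitrary assumptions.
base_rate = 100.0      # ticks per second, baseline clock speed
boost = 20.0           # extra ticks per second while arousal lasts
tau = 0.5              # assumed duration of the arousal effect, in seconds

def perceived(d):
    """Accumulated ticks for a physical duration d, expressed in seconds."""
    boosted = min(d, tau)
    return (base_rate * d + boost * boosted) / base_rate

for d in (0.4, 1.6, 2.0, 8.0):
    print(f"{d:>4} s -> relative lengthening {(perceived(d) - d) / d:6.1%}")
# Short durations (0.4-1.6 s) show a larger proportional overestimation than
# long ones (2-8 s), consistent with an arousal effect of limited duration.
```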
Abstract:
Several models of time estimation have been developed in psychology; a few have been applied to music. In the present study, we assess the influence of the distances travelled through pitch space on retrospective time estimation. Participants listened to an isochronous chord sequence of 20-s duration. They were unexpectedly asked to reproduce the time interval of the sequence. The harmonic structure of the stimulus was manipulated so that the sequence either remained in the same key (CC) or travelled through a closely related key (CFC) or a distant key (CGbC). Estimated times were shortened when the sequence modulated to a very distant key. This finding is discussed in light of Lerdahl's Tonal Pitch Space Theory (2001), Firmino and Bueno's Expected Development Fraction Model (in press), and models of time estimation.
Abstract:
In this paper we discuss the existence of mild, strict and classical solutions for a class of abstract integro-differential equations in Banach spaces. Some applications to ordinary and partial integro-differential equations are considered.
Abstract:
In this paper we study the existence and regularity of mild solutions for a class of abstract partial neutral integro-differential equations with unbounded delay.
Abstract:
The objective is to differentiate noncavitated carious enamel from sound enamel by time-resolved fluorescence and to find excitation and emission parameters that can be applied in future clinical practice for the detection of caries lesions that are not clearly visible to the professional. Sixteen human teeth with noncavitated white-spot caries lesions were selected for this work. Fluorescence intensity decay was measured using an apparatus based on the time-correlated single-photon counting method. An optical fiber bundle was employed for sample excitation (440 nm), and the fluorescence collected by the same bundle (500 nm) was recorded. The average lifetimes for sound enamel were 7.93 +/- 0.09, 2.46 +/- 0.04, and 0.51 +/- 0.02 ns, whereas for the carious enamel the lifetimes were 4.84 +/- 0.06, 1.35 +/- 0.02, and 0.16 +/- 0.01 ns. It was concluded that it is possible to differentiate between carious and sound regions by time-resolved fluorescence and that, although the origin of enamel fluorescence is still uncertain, the lifetime values seem to be typical of fluorophores such as collagen I. (C) 2010 Optical Society of America
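A minimal sketch, assuming a three-exponential decay model as the three reported lifetime components suggest, of how such lifetimes could be extracted from a time-resolved decay curve; the data below are synthetic, and the amplitudes and noise level are arbitrary assumptions.

```python
# Fit a three-exponential model to a synthetic decay built from the reported
# sound-enamel lifetimes; amplitudes and noise are assumptions, not data.
import numpy as np
from scipy.optimize import curve_fit

def tri_exp(t, a1, tau1, a2, tau2, a3, tau3):
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2) + a3 * np.exp(-t / tau3)

t = np.linspace(0, 40, 2000)                      # time axis in ns
true = (0.3, 7.93, 0.5, 2.46, 0.2, 0.51)          # amplitudes are assumed
rng = np.random.default_rng(0)
decay = tri_exp(t, *true) + rng.normal(0, 0.002, t.size)

p0 = (0.3, 8.0, 0.3, 2.0, 0.3, 0.5)               # initial guess
popt, _ = curve_fit(tri_exp, t, decay, p0=p0, maxfev=20000)
print("fitted lifetimes (ns):", popt[1], popt[3], popt[5])
```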
Abstract:
Background: Microarray techniques have become an important tool for the investigation of genetic relationships and the assignment of different phenotypes. Since microarrays are still very expensive, most experiments are performed with small samples. This paper introduces a method to quantify dependency between data series composed of few sample points. The method is used to construct gene co-expression subnetworks of highly significant edges. Results: The results shown here are for an adapted subset of a Saccharomyces cerevisiae gene expression data set with low temporal resolution and poor statistics. The method reveals common transcription factors with a high confidence level and allows the construction of subnetworks with high biological relevance that reveal characteristic features of the processes driving the organism's adaptations to specific environmental conditions. Conclusion: Our method allows a reliable and sophisticated analysis of microarray data even under severe constraints. The use of systems biology improves biologists' ability to elucidate the mechanisms underlying cellular processes and to formulate new hypotheses.
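An illustrative sketch of the subnetwork-construction step on fabricated toy data; the paper's own dependency measure for series with few sample points is not reproduced here, so an ordinary Spearman correlation with a stringent p-value threshold stands in for it.

```python
# Build a co-expression subnetwork keeping only edges that pass a stringent
# significance threshold; Spearman correlation is a stand-in for the paper's
# dependency measure, and the expression matrix is fabricated toy data.
import numpy as np
import networkx as nx
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n_genes, n_points = 30, 6                   # few sample points, as in the abstract
expr = rng.normal(size=(n_genes, n_points))
expr[1] = expr[0] + 0.1 * rng.normal(size=n_points)   # plant one dependent pair

alpha = 0.01                                # significance level for keeping an edge
G = nx.Graph()
G.add_nodes_from(range(n_genes))
for i in range(n_genes):
    for j in range(i + 1, n_genes):
        rho, p = spearmanr(expr[i], expr[j])
        if p < alpha:
            G.add_edge(i, j, weight=rho)

print(G.number_of_edges(), "high-confidence edges")
```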
Abstract:
Background: Hepatitis C virus (HCV) genotyping is the most significant predictor of the response to antiviral therapy. The aim of this study was to develop and evaluate a novel real-time PCR method for HCV genotyping based on the NS5B region. Methodology/Principal Findings: Two triplex reaction sets were designed, one to detect genotypes 1a, 1b and 3a, and another to detect genotypes 2a, 2b, and 2c. This approach had an overall sensitivity of 97.0%, detecting 295 of the 304 tested samples. All samples genotyped by real-time PCR had the same type that was assigned using LiPA version 1 (Line Probe Assay). Although LiPA v. 1 was not able to subtype 68 of the 295 samples (23.0%) and yielded subtype results different from those assigned by real-time PCR for 12/295 samples (4.0%), NS5B sequencing and real-time PCR results agreed in all 146 tested cases. The analytical sensitivity of the real-time PCR assay was determined by end-point dilution of the 5000 IU/ml member of the OptiQuant HCV RNA panel. The lower limit of detection was estimated to be 125 IU/ml for genotype 3a, 250 IU/ml for genotypes 1b and 2b, and 500 IU/ml for genotype 1a. Conclusions/Significance: The total time required to perform this assay was two hours, compared with the four hours required for LiPA v. 1 after PCR amplification. Furthermore, the estimated reaction cost was nine times lower than that of the commercial methods available in Brazil. Thus, we have developed an efficient, feasible, and affordable method for HCV genotype identification.