969 results for Linear Duration Invariant


Relevance: 30.00%

Abstract:

Engine manufacturers need computationally efficient and accurate predictive combustion modeling tools that can be integrated into engine simulation software for the assessment of combustion system hardware designs and the early development of engine calibrations. This thesis discusses the process of developing and validating, from experimental data, a combustion modeling tool for a gasoline direct-injection spark-ignition engine with variable valve timing, lift, and duration valvetrain hardware. Data were correlated and regressed using accepted methods for calculating the turbulent flow and flame propagation characteristics of an internal combustion engine. A non-linear regression method was used to develop a combustion model that determines the fuel mass burn rate at multiple points during the combustion process. The computational fluid dynamics software Converge© was used to simulate the 3-D combustion system, port, and piston geometry and to correlate the turbulent flow development within the cylinder, so that the experimentally measured turbulent flow parameters through the intake, compression, and expansion processes were properly predicted. The engine simulation software GT-Power© was then used to determine the 1-D flow characteristics of the engine hardware under test and to assess the accuracy of the regressed combustion modeling tool against experimental data. The results show that the tool accurately captures the trends in combustion sensitivity to turbulent flow, thermodynamic, and internal residual effects as intake and exhaust valve timing, lift, and duration are varied.
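The abstract does not reproduce the regression equations, but a common functional form for the mass fraction burned in spark-ignition engine models is the Wiebe function. The sketch below fits one to synthetic burn data with non-linear least squares; the function choice and all parameter values are illustrative assumptions, not the thesis's actual regressors.

```python
import numpy as np
from scipy.optimize import curve_fit

def wiebe_mfb(theta, theta0, dtheta, a, m):
    """Wiebe-function mass fraction burned vs. crank angle theta (deg).

    theta0: start of combustion, dtheta: burn duration,
    a, m: efficiency and shape parameters.
    """
    x = np.clip((theta - theta0) / dtheta, 0.0, None)
    return 1.0 - np.exp(-a * x ** (m + 1.0))

# Synthetic "experimental" burn profile (hypothetical values, for illustration)
theta = np.linspace(-20.0, 60.0, 81)
rng = np.random.default_rng(0)
mfb_meas = wiebe_mfb(theta, -5.0, 45.0, 5.0, 2.0) + rng.normal(0.0, 0.01, theta.size)

# Non-linear regression of the burn-rate parameters
popt, _ = curve_fit(wiebe_mfb, theta, mfb_meas, p0=[0.0, 40.0, 5.0, 2.0])
theta0, dtheta, a, m = popt
```

In the thesis the burn-rate parameters are themselves regressed against turbulent-flow and thermodynamic quantities rather than being fixed constants as here.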

Relevance: 30.00%

Abstract:

OBJECTIVES The aim of this study was to analyze trigger activity in the long-term follow-up after left atrial (LA) linear ablation. BACKGROUND Interventional strategies for curative treatment of atrial fibrillation (AF) target the triggers and/or the maintaining substrate. After substrate modification using nonisolating linear lesions, the activity of triggers is unknown. METHODS With the LA linear lesion concept, 129 patients were treated using intraoperative ablation with minimally invasive surgical techniques. Contiguous radiofrequency-induced lesion lines involving the mitral annulus and the orifices of the pulmonary veins, without isolation, were placed under direct vision. RESULTS After a mean follow-up of 3.6 +/- 0.4 years, atrial ectopy, atrial runs, and recurrence of AF episodes were analyzed by digital 7-day electrocardiograms in 30 patients. Atrial ectopy was present in all patients. Atrial runs were present in 25 of 30 patients (83%), with a median of 9 runs per patient per week (range 1 to 321) and a median duration of 1.2 s per run (range 0.7 to 25), with no significant difference in atrial ectopy or atrial runs between patients with former paroxysmal (n = 17) or persistent AF (n = 13). Overall, 87% of all patients were completely free from AF without antiarrhythmic drugs. CONCLUSIONS A detailed rhythm analysis late after specific LA linear lesion ablation shows that trigger activity remains relatively frequent but short and does not induce AF episodes in most patients. The long-term success rate of this concept is high in patients with paroxysmal or persistent AF.

Relevance: 30.00%

Abstract:

Objective: There is an ongoing debate concerning how outcome variables change during the course of psychotherapy. We compared the dose–effect model, which posits diminishing effects of additional sessions in later treatment phases, against a model that assumes a linear and steady treatment progress through termination. Method: Session-by-session outcome data of 6,375 outpatients were analyzed, and participants were categorized according to treatment length. Linear and log-linear (i.e., negatively accelerating) latent growth curve models (LGCMs) were estimated and compared for different treatment length categories. Results: When comparing the fit of the various models, the log-linear LGCMs assuming negatively accelerating treatment progress consistently outperformed the linear models irrespective of treatment duration. The rate of change was found to be inversely related to the length of treatment. Conclusion: As proposed by the dose–effect model, the expected course of improvement in psychotherapy appears to follow a negatively accelerated pattern of change, irrespective of the duration of the treatment. However, our results also suggest that the rate of change is not constant across various treatment lengths. As proposed by the “good enough level” model, longer treatments are associated with less rapid rates of change.
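The contrast between the two growth shapes can be illustrated outside the LGCM framework with a simple least-squares comparison. This is only a sketch on synthetic scores with hypothetical numbers, not the study's data or its latent growth curve machinery.

```python
import numpy as np

rng = np.random.default_rng(1)
sessions = np.arange(1, 21)

# Synthetic symptom scores that improve with diminishing returns over sessions
scores = 60.0 - 12.0 * np.log(sessions) + rng.normal(0.0, 2.0, sessions.size)

def sse(pred):
    """Sum of squared errors of a model's predictions."""
    return float(np.sum((scores - pred) ** 2))

# Linear model: score = b0 + b1 * session
lin_pred = np.polyval(np.polyfit(sessions, scores, 1), sessions)

# Log-linear (negatively accelerating) model: score = b0 + b1 * log(session)
log_s = np.log(sessions)
log_pred = np.polyval(np.polyfit(log_s, scores, 1), log_s)

better = "log-linear" if sse(log_pred) < sse(lin_pred) else "linear"
```

When the underlying improvement decelerates, the log-linear fit wins on squared error, mirroring the comparison the study ran at the latent-model level.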

Relevance: 30.00%

Abstract:

Background and Aims Ongoing global warming has been implicated in shifting phenological patterns such as the timing and duration of the growing season across a wide variety of ecosystems. Linear models are routinely used to extrapolate these observed shifts in phenology into the future and to estimate changes in associated ecosystem properties such as net primary productivity. Yet, in nature, linear relationships may be special cases. Biological processes frequently follow more complex, non-linear patterns according to limiting factors that generate shifts and discontinuities, or contain thresholds beyond which responses change abruptly. This study investigates to what extent cambium phenology is associated with xylem growth and differentiation across conifer species of the northern hemisphere. Methods Xylem cell production is compared with the periods of cambial activity and cell differentiation assessed on a weekly time scale on histological sections of cambium and wood tissue collected from the stems of nine species in Canada and Europe over 1–9 years per site from 1998 to 2011. Key Results The dynamics of xylogenesis were surprisingly homogeneous among conifer species, although deviations from the average were observed. Within the range analysed, the relationships between the phenological timings were linear, with several slopes close to, or not statistically different from, 1. The relationships between the phenological timings and cell production were distinctly non-linear, following an exponential pattern. Conclusions Trees adjust their phenological timings according to linear patterns: shifts of one phenological phase are associated with synchronous and comparable shifts of the successive phases. However, small increases in the duration of xylogenesis can correspond to a substantial increase in cell production. The findings suggest that the length of the growing season and the resulting amount of growth could respond differently to changes in environmental conditions.

Relevance: 30.00%

Abstract:

Background Extensive evidence is now available showing the efficacy of cognitive remediation (CR). To date, only limited evidence is available on the impact of the duration of illness on CR effects. Integrated Neurocognitive Therapy (INT) is a newly developed CR approach: a manualized group therapy targeting all 11 NIMH-MATRICS domains. Methods In an international multicenter study, 166 schizophrenia outpatients (DSM-IV-TR) were randomly assigned either to INT or to Treatment-As-Usual (TAU). 60 patients were defined as the Early Course group (EC), characterized by less than 5 years of illness; 40 patients were in the Long-Term group (LT), characterized by more than 15 years of illness; and 76 patients were in the Medium-Long-Term group (MLT), characterized by an illness of 5-15 years. Treatment comprised 15 biweekly sessions. Assessments were conducted before and after treatment and at follow-up (1 year). Multivariate General Linear Models (GLM) examined our hypothesis of whether the EC, LT, and MLT groups differ in outcome under INT and TAU. Results First, the attendance rate of 65% was significantly lower, and the dropout rate of 18.5% during therapy was higher, in the EC group compared with the other groups. Interaction effects regarding proximal outcome showed that the duration of illness has a strong impact on neurocognitive functioning in speed of processing (F>2.4) and attention (F>2.8). However, INT compared with TAU had a significant effect only in the more chronically ill patients of the MLT and LT groups, not in the younger patients of the EC group. In the social cognitive domains, only the EC group showed a significant change in attribution (hostility; F>2.5); the LT and MLT groups did not. No differences between the 3 groups were evident in memory, problem solving, and emotion perception. Regarding more distal outcome, LT patients had more symptoms compared with EC patients (F>4.4). Finally, EC patients showed greater improvements in psychosocial functioning compared with LT and MLT (F=1.8). Conclusions Contrary to common expectations, long-term, more chronically ill patients showed larger effects in basal cognitive functions than younger patients and patients without any active therapy (TAU). On the other hand, early-course patients had a greater potential for change in attribution, symptoms, and psychosocial functioning. Consequently, integrated therapy programs are also recommended for patients with a long-term course of schizophrenia.

Relevance: 30.00%

Abstract:

A 560-meter-thick sequence of Cenomanian through Pleistocene sediments cored at DSDP Site 462 in the Nauru Basin overlies a 500-meter-thick complex unit of altered basalt flows, diabase sills, and thin intercalated volcaniclastic sediments. The Upper Cretaceous and Cenozoic sediments contain a high proportion of calcareous fossils, although the site has apparently been below the calcite compensation depth (CCD) from the late Mesozoic to the Pleistocene. This fact and the contemporaneous fluctuations of the calcite and opal accumulation rates suggest an irregular influx of displaced pelagic sediments from the shallow margins of the basin to its center, resulting in unusually high overall sedimentation rates for such a deep (5190 m) site. Shallow-water benthic fossils and planktonic foraminifers both occur as reworked materials, but usually are not found in the same intervals of the sediment section. We interpret this as recording separate erosional interludes in the shallow-water and intermediate-water regimes. Lower and upper Cenozoic hiatuses also are believed to have resulted from mid-water events. High accumulation rates of volcanogenic material during Santonian time suggest a corresponding significant volcanic episode. The coincidence of increased carbonate accumulation rates during the Campanian and displacement of shallow-water fossils during the late Campanian-early Maestrichtian with the volcanic event implies that this early event resulted in formation of the island chains around the Nauru Basin, which then served as platforms for initial carbonate deposition.

Relevance: 30.00%

Abstract:

The object of this work is the analysis of the multiple-use operation of the Barra Bonita reservoir, located at the confluence of the Piracicaba and Tietê rivers in the state of São Paulo and belonging to the so-called Tietê-Paraná system. The operation of the reservoir is optimized by linear programming, with the objective of increasing electricity generation by maximizing the turbined flow. Then, starting from the results of the energy-generation optimization, computational simulation techniques are used to obtain performance indices known as reliability, resilience, and vulnerability, in addition to others provided by the simulation model itself. These indices help assess the frequency, magnitude, and duration of the possible conflicts. The possible conflicts analyzed are those between navigation, reservoir storage, energy generation, and the occurrence of floods in the city of Barra Bonita, located downstream of the dam.
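As a toy illustration of the linear-programming step, the sketch below maximizes total turbined volume over a three-period reservoir mass balance. The inflows, storage limits, and turbine capacity are hypothetical round numbers, not Barra Bonita data.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical 3-period reservoir data (hm3 per period)
inflow = np.array([120.0, 80.0, 100.0])
s0, s_min, s_max = 200.0, 150.0, 300.0   # initial storage and storage bounds
q_max = 110.0                            # turbine capacity per period

# Decision variables x = [q1 q2 q3 s1 s2 s3]; maximize total turbined volume,
# i.e. minimize its negative.
c = np.array([-1.0, -1.0, -1.0, 0.0, 0.0, 0.0])

# Mass balance each period: s_t = s_{t-1} + inflow_t - q_t
#   =>  q_t - s_{t-1} + s_t = inflow_t   (s0 is a known constant)
A_eq = np.array([
    [1.0, 0.0, 0.0,  1.0,  0.0, 0.0],
    [0.0, 1.0, 0.0, -1.0,  1.0, 0.0],
    [0.0, 0.0, 1.0,  0.0, -1.0, 1.0],
])
b_eq = np.array([inflow[0] + s0, inflow[1], inflow[2]])

bounds = [(0.0, q_max)] * 3 + [(s_min, s_max)] * 3
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
total_turbined = -res.fun
```

Here the turbine capacity binds in every period, so the optimum is 330 hm3; in the actual study the LP sits inside a larger model that also feeds the simulation-based reliability, resilience, and vulnerability indices.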

Relevance: 30.00%

Abstract:

Background and Aims Plants regulate their architecture strongly in response to density, and there is evidence that this involves changes in the duration of leaf extension. This calls into question the approximation, central in crop models, that development follows a fixed thermal time schedule. The aim of this research is to investigate, using maize as a model, how the kinetics of extension of grass leaves change with density, and to propose directions for inclusion of this regulation in plant models. Methods Periodic dissection of plants allowed the establishment of the kinetics of lamina and sheath extension for two contrasting sowing densities. The temperature of the growing zone was measured with thermocouples. Two-phase (exponential plus linear) models were fitted to the data, allowing analysis of the timing of the phase changes of extension, and of the extension rate of sheaths and blades during both phases. Key Results The duration of lamina extension dictated the variation in lamina length between treatments. The lower phytomers were longer at high density, with delayed onset of sheath extension allowing more time for the lamina to extend. In the upper phytomers, which were shorter at high density, the laminae had a lower relative extension rate (RER) in the exponential phase and delayed onset of linear extension, and less time available for extension since early sheath extension was not delayed. Conclusions The relative timing of the onset of fast extension of the lamina with that of sheath development is the main determinant of the response of lamina length to density. Evidence is presented that the contrasting behaviour of lower and upper phytomers is related to differing regulation of sheath ontogeny before and after panicle initiation. A conceptual model is proposed to explain how the observed asynchrony between lamina and sheath development is regulated.
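A two-phase (exponential plus linear) extension model of the kind fitted here can be sketched with non-linear least squares. The functional form below, which joins the two phases with matching value and slope at the breakpoint, and all numbers are illustrative assumptions, not the paper's parameterization or data.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_phase(t, L0, r, tb):
    """Leaf length vs. thermal time t: exponential growth up to the
    breakpoint tb, then linear extension at the rate reached at tb
    (value and slope are continuous at the breakpoint)."""
    Lb = L0 * np.exp(r * tb)            # length at the breakpoint
    linear = Lb * (1.0 + r * (t - tb))  # tangent line from tb onward
    return np.where(t < tb, L0 * np.exp(r * t), linear)

t = np.linspace(0.0, 20.0, 41)          # thermal time, arbitrary units
rng = np.random.default_rng(2)
obs = two_phase(t, 2.0, 0.25, 10.0) + rng.normal(0.0, 0.5, t.size)

# Fit the two-phase model; tb estimates when linear extension begins
popt, _ = curve_fit(two_phase, t, obs, p0=[1.5, 0.2, 8.0])
L0_hat, r_hat, tb_hat = popt
```

The fitted breakpoint is the kind of quantity the study compares across densities: a shift in the onset of linear extension changes the time available for the lamina to extend.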

Relevance: 30.00%

Abstract:

We studied the visual mechanisms that encode edge blur in images. Our previous work suggested that the visual system spatially differentiates the luminance profile twice to create the 'signature' of the edge, and then evaluates the spatial scale of this signature profile by applying Gaussian derivative templates of different sizes. The scale of the best-fitting template indicates the blur of the edge. In blur-matching experiments, a staircase procedure was used to adjust the blur of a comparison edge (40% contrast, 0.3 s duration) until it appeared to match the blur of test edges at different contrasts (5% - 40%) and blurs (6 - 32 min of arc). Results showed that lower-contrast edges looked progressively sharper. We also added a linear luminance gradient to blurred test edges. When the added gradient was of opposite polarity to the edge gradient, it made the edge look progressively sharper. Both effects can be explained quantitatively by the action of a half-wave rectifying nonlinearity that sits between the first and second (linear) differentiating stages. This rectifier was introduced to account for a range of other effects on perceived blur (Barbieri-Hesse and Georgeson, 2002 Perception 31 Supplement, 54), but it readily predicts the influence of the negative ramp. The effect of contrast arises because the rectifier has a threshold: it suppresses not only negative values but also small positive values. At low contrasts, more of the gradient profile falls below threshold and its effective spatial scale shrinks, leading to perceived sharpening.
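The template-matching idea can be sketched numerically: differentiate a blurred edge twice, then find the Gaussian-derivative template scale that best fits the resulting signature. The profile, scale grid, and scoring below are illustrative, and this sketch omits the rectifying nonlinearity that the authors place between the two differentiating stages.

```python
import numpy as np
from scipy.special import erf

x = np.arange(-64.0, 65.0, 1.0)
true_blur = 6.0
# Luminance profile of an edge blurred by a Gaussian of scale true_blur
edge = 0.5 * (1.0 + erf(x / (true_blur * np.sqrt(2.0))))

# Signature: the luminance profile spatially differentiated twice
signature = np.gradient(np.gradient(edge, x), x)

def template(x, s):
    """First derivative of a Gaussian of scale s: the shape the edge's
    second derivative takes for an edge of blur s."""
    g = np.exp(-x**2 / (2.0 * s**2)) / (s * np.sqrt(2.0 * np.pi))
    return -x / s**2 * g

def misfit(s):
    """Residual after fitting the template of scale s in amplitude."""
    t = template(x, s)
    amp = np.dot(signature, t) / np.dot(t, t)
    return float(np.sum((signature - amp * t) ** 2))

# The best-fitting template scale is the estimated blur
scales = np.arange(2.0, 12.5, 0.5)
estimated_blur = scales[np.argmin([misfit(s) for s in scales])]
```

With the rectifier added, small positive values of the gradient profile would be zeroed at low contrast, shrinking the signature's effective scale and biasing the estimate toward sharper, which is the paper's explanation of the contrast effect.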

Relevance: 30.00%

Abstract:

Objective: This study aimed to explore methods of assessing interactions between neuronal sources using MEG beamformers. However, beamformer methodology is based on the assumption of no linear long-term source interdependencies [VanVeen BD, vanDrongelen W, Yuchtman M, Suzuki A. Localization of brain electrical activity via linearly constrained minimum variance spatial filtering. IEEE Trans Biomed Eng 1997;44:867-80; Robinson SE, Vrba J. Functional neuroimaging by synthetic aperture magnetometry (SAM). In: Recent advances in Biomagnetism. Sendai: Tohoku University Press; 1999. p. 302-5]. Although such long-term correlations are not efficient and should not be anticipated in a healthy brain [Friston KJ. The labile brain. I. Neuronal transients and nonlinear coupling. Philos Trans R Soc Lond B Biol Sci 2000;355:215-36], transient correlations seem to underlie functional cortical coordination [Singer W. Neuronal synchrony: a versatile code for the definition of relations? Neuron 1999;49-65; Rodriguez E, George N, Lachaux J, Martinerie J, Renault B, Varela F. Perception's shadow: long-distance synchronization of human brain activity. Nature 1999;397:430-3; Bressler SL, Kelso J. Cortical coordination dynamics and cognition. Trends Cogn Sci 2001;5:26-36]. Methods: Two periodic sources were simulated and the effects of transient source correlation on the spatial and temporal performance of the MEG beamformer were examined. Subsequently, the interdependencies of the reconstructed sources were investigated using coherence and phase synchronization analysis based on Mutual Information. Finally, two interacting nonlinear systems served as neuronal sources and their phase interdependencies were studied under realistic measurement conditions. Results: Both the spatial and the temporal beamformer source reconstructions were accurate as long as the transient source correlation did not exceed 30-40 percent of the duration of beamformer analysis. 
In addition, the interdependencies of periodic sources were preserved by the beamformer and phase synchronization of interacting nonlinear sources could be detected. Conclusions: MEG beamformer methods in conjunction with analysis of source interdependencies could provide accurate spatial and temporal descriptions of interactions between linear and nonlinear neuronal sources. Significance: The proposed methods can be used for the study of interactions between neuronal sources. © 2005 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
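As a minimal illustration of quantifying interdependence between two reconstructed sources, magnitude-squared coherence between two noisy, phase-locked oscillations peaks at their shared frequency. The sampling rate, frequencies, and noise levels below are hypothetical, and Welch coherence is a simpler measure than the paper's Mutual-Information-based phase synchronization analysis.

```python
import numpy as np
from scipy.signal import coherence

fs = 600.0                                # sampling rate, Hz (hypothetical)
t = np.arange(0.0, 10.0, 1.0 / fs)
rng = np.random.default_rng(4)

# Two simulated periodic "sources": 10 Hz oscillations with a fixed
# phase lag, each embedded in independent noise
s1 = np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 1.0, t.size)
s2 = np.sin(2 * np.pi * 10.0 * t + 0.5) + rng.normal(0.0, 1.0, t.size)

# Welch-averaged magnitude-squared coherence
f, Cxy = coherence(s1, s2, fs=fs, nperseg=1024)
peak_freq = f[np.argmax(Cxy)]
```

In the beamformer setting the inputs would be the reconstructed source time courses, and, as the simulations show, such measures remain reliable only while the transient source correlation stays well below the duration of the beamformer analysis window.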

Relevance: 30.00%

Abstract:

2010 Mathematics Subject Classification: 53A07, 53A35, 53A10.

Relevance: 30.00%

Abstract:

2000 Mathematics Subject Classification: 13N15, 13A50, 16W25.

Relevance: 30.00%

Abstract:

Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution.

There are two complementary approaches to the study of yield curve evolution here: principal components analysis and wavelet analysis. In both approaches both the time and maturity variables are discretized. In principal components analysis the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized, and the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution.

In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market.

Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary.

Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and the specially adapted wavelet construction. The result is more robust statistics, which balance the more fragile principal components analysis.
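The principal-components step amounts to diagonalizing the covariance matrix of yield-curve shift vectors. The sketch below does this on synthetic level-plus-slope shifts; the maturities, factor sizes, and sample length are hypothetical, not the dissertation's bond data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_days, n_maturities = 500, 8

# Synthetic daily yield-curve shifts (basis points): a parallel "level"
# factor, a "slope" factor tilting short against long maturities,
# and idiosyncratic noise at each maturity
level = rng.normal(0.0, 5.0, (n_days, 1)) * np.ones((1, n_maturities))
slope = rng.normal(0.0, 2.0, (n_days, 1)) * np.linspace(-1.0, 1.0, n_maturities)
shifts = level + slope + rng.normal(0.0, 0.5, (n_days, n_maturities))

# Diagonalize the covariance matrix of the shift vectors; the
# eigenvectors are the principal components, and the eigenvalues
# give the share of variation each component explains
cov = np.cov(shifts, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()
```

With two true factors, the first two components capture nearly all the variation, the discrete analogue of a few components picking up most of the interest rate variation in the dissertation's data.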

Relevance: 30.00%

Abstract:

This study theoretically investigates shockwave and microbubble formation due to laser absorption by microparticles and nanoparticles. The initial motivation for this research was to understand the underlying physical mechanisms responsible for laser damage to the retina, as well as to predict threshold levels for damage from laser pulses of progressively shorter durations. The strongest absorbers in the retina are micron-sized melanosomes, whose absorption of laser light causes them to accrue very high energy density. I theoretically investigate how this absorbed energy is transferred to the surrounding medium. For a wide range of conditions I calculate shockwave generation and bubble growth as a function of three parameters: fluence, pulse duration, and pulse shape. In order to develop a rigorous physical treatment, the governing equations for the behavior of an absorber and of the surrounding medium are derived. Shockwave theory is investigated, and the conclusion is that shock pressure is likely the underlying physical cause of retinal damage at threshold fluences for sub-nanosecond pulses. The same effects are also expected for non-biological micro- and nano-scale absorbers.