8 results for Zero-coupon yield curve estimation
at University of Queensland eSpace - Australia
Abstract:
Surge flow phenomena, e.g. as a consequence of a dam failure or a flash flood, represent free boundary problems. The extending computational domain, together with the discontinuities involved, renders their numerical solution a cumbersome procedure. This contribution proposes an analytical solution to the problem. It is based on the slightly modified zero-inertia (ZI) differential equations for nonprismatic channels and uses exclusively physical parameters. Employing the concept of a momentum-representative cross section of the moving water body, together with a specific relationship for describing the cross-sectional geometry, leads, after considerable mathematical calculus, to the analytical solution. The hydrodynamic analytical model is free of numerical troubles, easy to run, computationally efficient, and fully satisfies the law of volume conservation. In a first test series, the hydrodynamic analytical ZI model compares very favorably with a full hydrodynamic numerical model with respect to published results of surge flow simulations in different types of prismatic channels. In order to extend these considerations to natural rivers, the accuracy of the analytical model in describing an irregular cross section is investigated and tested successfully. A sensitivity and error analysis reveals the important impact of the hydraulic radius on the velocity of the surge, and this underlines the importance of an adequate description of the topography. The new approach is finally applied to simulate a surge propagating down the irregularly shaped Isar Valley in the Bavarian Alps after a hypothetical dam failure. The straightforward and fully stable computation of the flood hydrograph along the Isar Valley clearly reflects the impact of the strongly varying topographic characteristics on the flow phenomenon. Apart from treating surge flow phenomena as a whole, the analytical solution also offers a rigorous alternative to both (a) the approximate Whitham solution for generating initial values and (b) the rough volume-balance techniques used to model the wave tip in numerical surge flow computations.
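For orientation, the standard zero-inertia (diffusive-wave) approximation of the Saint-Venant equations takes the form below; this is a generic sketch, not the slightly modified nonprismatic formulation derived in the paper:

\[
\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = 0,
\qquad
\frac{\partial h}{\partial x} = S_0 - S_f,
\]

where A is the wetted cross-sectional area, Q the discharge, h the flow depth, S_0 the bed slope, and S_f the friction slope (e.g. from Manning's formula). Dropping the inertia terms of the full momentum equation is what makes a closed-form treatment of the advancing wave front tractable.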
Abstract:
Objectives: Resternotomy is a common part of cardiac surgical practice. Associated with resternotomy are the risks of cardiac injury and catastrophic hemorrhage, and the subsequent elevated morbidity and mortality in the operating room or during the postoperative period. The technique of direct vision resternotomy is safe and is associated with few, if any, serious cardiac injuries. The technique, the reduced need for groin cannulation, and the overall low operative mortality and morbidity are the focus of this retrospective analysis. Methods: The records of 495 patients undergoing 546 resternotomies over a 21-year period to January 2000 were reviewed. All consecutive reoperations by the one surgeon comprised patients over the age of 20 at first resternotomy: M:F 343:203, mean age 57 years (range 20 to 85, median age 60). The mean NYHA grade was 2.3 [with 67 patients in class I, 273 in class II, 159 in class III, 43 in class IV, and 4 in class V], with elective reoperation in 94.6%. Cardiac injury was graded into five groups, and the incidence of and reasons for groin cannulation were estimated. The morbidity and mortality as a result of the reoperation and resternotomy were assessed. Results: The hospital/30-day mortality was 2.9% (95% CI: 1.6%-4.4%) (16 deaths) over the 21 years. First (481), second (53), and third (12) resternotomies produced 307 uncomplicated technical reopenings, 203 slower but uncomplicated procedures, 9 minor superficial cardiac lacerations, and no moderate or severe cardiac injuries. Direct vision resternotomy is crystallized into the principle that only adhesions that are visualized from below are divided and only sternal bone that is freed of adhesions is sawn. Groin exposure was never performed prophylactically for resternotomy. Fourteen patients (2.6%) had such cannulation for aortic dissection/aneurysm (9 patients), excessive sternal adherence of cardiac structures (3 patients), presurgery cardiac arrest (1 patient), and high aortic cannulation desired and not possible (1 patient). The average postoperative blood loss was 594 mL (95% CI: 558-631) in the first 12 hours. The need to return to the operating room for control of excessive bleeding was 2% (11 patients). Blood transfusion was given in 65% of the resternotomy procedures over the 21 years (mean 854 mL, 95% CI 765-945 mL) and 41% over the last 5 years. Conclusions: The technique of direct vision resternotomy has been associated with zero moderate or major cardiac injuries/catastrophic hemorrhage at reoperation. Few patients have required groin cannulation. In the postoperative period, there were acceptable blood loss and transfusion rates, reduced morbidity, and moderately low mortality for this potentially high-risk group.
Abstract:
The technique of frequency-resolved optical gating is used to characterize the intensity and the phase of picosecond pulses after propagation through 700 m of fiber at close to the zero-dispersion wavelength. Using the frequency-resolved optical gating technique, we directly measure the severe temporal distortion resulting from the interplay between self-phase modulation and higher-order dispersion in this regime. The measured intensity and phase of the pulses after propagation are found to be in good agreement with the predictions of numerical simulations with the nonlinear Schrödinger equation. (C) 1997 Optical Society of America.
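For context, a commonly used form of the nonlinear Schrödinger equation for propagation near the zero-dispersion wavelength retains the third-order dispersion term; this is a textbook sketch, and the exact form and parameter values used in the paper may differ:

\[
i\,\frac{\partial A}{\partial z}
= \frac{\beta_2}{2}\,\frac{\partial^2 A}{\partial T^2}
+ \frac{i\beta_3}{6}\,\frac{\partial^3 A}{\partial T^3}
- \gamma\,|A|^2 A,
\]

where A(z,T) is the pulse envelope, \beta_2 and \beta_3 are the second- and third-order dispersion coefficients, and \gamma is the Kerr nonlinearity coefficient. Near the zero-dispersion wavelength \beta_2 is close to zero, so the temporal distortion is governed by the interplay between the self-phase-modulation term \gamma|A|^2 A and the \beta_3 term.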
Abstract:
We examined the burst swimming performance of two Antarctic fishes, Trematomus bernacchii and T. centronotus, at five temperatures between -1 °C and 10 °C. As Antarctic fishes are considered among the most cold-specialised and stenothermal of all ectotherms, we predicted they would possess a narrow thermal performance breadth for burst swimming and a correlative decrease in performance at high temperatures. Burst swimming was assessed by videotaping swimming sequences with a 50-Hz video camera and analysing the sequences frame by frame to determine maximum velocity, the distance moved during the initial 200 ms, and the time taken to reach maximum velocity. In contrast to our prediction, we found that both species possessed a wide thermal performance breadth for burst swimming. Although maximum swimming velocity for both T. bernacchii and T. centronotus was significantly highest at 6 °C, maximum velocity at all other test temperatures was less than 20% lower. Thus, it appears that specialisation to a highly stable and cold environment is not necessarily associated with a narrow thermal performance breadth for burst swimming in Antarctic fish. We also examined the ability of the Antarctic fish Pagothenia borchgrevinki to acclimate its burst-swimming performance to different temperatures. We exposed P. borchgrevinki to either -1 °C or 4 °C for 4 weeks and tested their burst-swimming performance at four temperatures between -1 °C and 10 °C. Burst-swimming performance of P. borchgrevinki was unaffected by exposure to either -1 °C or 4 °C for 4 weeks. Maximum swimming velocity of both acclimation groups was thermally independent over the total temperature range of -1 °C to 10 °C. Therefore, the loss of any capacity to restructure the phenotype and an inability to thermally acclimate swimming performance appear to be associated with inhabiting a highly stable thermal environment.
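The three kinematic measures reduce to simple arithmetic on the digitised frame-by-frame positions. The sketch below (Python) is illustrative only; the input array and function name are assumptions, not the authors' analysis code:

    import numpy as np

    def burst_metrics(positions_m, frame_rate_hz=50.0):
        # positions_m: hypothetical per-frame fish positions (metres) from 50-Hz video
        dt = 1.0 / frame_rate_hz                            # 20 ms between frames
        velocities = np.diff(positions_m) / dt              # finite-difference velocities
        v_max = velocities.max()                            # maximum burst velocity
        t_to_vmax = (np.argmax(velocities) + 1) * dt        # time to reach maximum velocity
        n_frames_200ms = int(round(0.2 * frame_rate_hz))    # 10 frames span 200 ms
        d_200ms = positions_m[n_frames_200ms] - positions_m[0]  # distance in first 200 ms
        return v_max, t_to_vmax, d_200ms

At 50 Hz the velocity estimate is resolved only to 20-ms steps, which is why the camera frame rate matters for burst events lasting a few hundred milliseconds.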
Abstract:
In many occupational safety interventions, the objective is to reduce the injury incidence as well as the mean claims cost once injury has occurred. The claims cost data within a period typically contain a large proportion of zero observations (no claim). The distribution thus comprises a point mass at 0 mixed with a non-degenerate parametric component. Essentially, the likelihood function can be factorized into two orthogonal components. These two components relate respectively to the effect of covariates on the incidence of claims and the magnitude of claims, given that claims are made. Furthermore, the longitudinal nature of the intervention inherently imposes some correlation among the observations. This paper introduces a zero-augmented gamma random effects model for analysing longitudinal data with many zeros. Adopting the generalized linear mixed model (GLMM) approach reduces the original problem to the fitting of two independent GLMMs. The method is applied to evaluate the effectiveness of a workplace risk assessment teams program, trialled within the cleaning services of a Western Australian public hospital.
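As an illustrative sketch of the factorisation described above (generic notation, not necessarily that of the paper): let \pi_{ij} denote the probability that subject i incurs a claim in period j and g(\,\cdot\,;\mu_{ij},\nu) a gamma density for positive claim costs. The mixed density is

\[
f(y_{ij}) = (1-\pi_{ij})\,\mathbf{1}\{y_{ij}=0\}
+ \pi_{ij}\, g(y_{ij};\mu_{ij},\nu)\,\mathbf{1}\{y_{ij}>0\},
\]

so the log-likelihood separates into a Bernoulli (logistic) component for claim incidence and a gamma component for claim size given that a claim occurs. Attaching random effects to the linear predictor of each component gives the two GLMMs that, as noted above, can be fitted independently.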
Abstract:
By examining Japanese fictional novels, this article discusses how anaphoric devices (noun phrases (NPs), third-person pronouns (TPPs), and zero anaphors) are selected and arranged in a given discourse. The traditional view of anaphora considers the co-referential relationship between anaphoric devices to be syntagmatic; that is, a pronoun, for example, refers back to its antecedent. It also posits a hierarchical order of information value among anaphoric devices: NPs are semantically the most informative, indicating an episode boundary, and pronouns are less informative. Furthermore, zero anaphora is the most referentially transparent, signalling the greatest accessibility of a topic. However, real texts show the contrary: NPs occur frequently even where there is no apparent discourse boundary and the same episode continues. This is because zero anaphors and TPPs (if they occur) break down readily owing to the nature of a forthcoming sentence, and the NP is reinstated in order to continue the same topic in the discourse. Therefore, the article opposes the traditional view of anaphora. Based on the concept of text processing using ‘mental representations’, this article determines certain occurrence patterns of the three anaphoric devices.