863 results for critical path methods
Abstract:
In this paper we have discussed the limits of validity of Whitham's characteristic rule for finding successive positions of a shock in one space dimension. We start with an example for which the exact solution is known and show that the characteristic rule gives the correct result only if the state behind the shock is uniform. We then take the gas dynamic equations in two cases, one of a shock propagating through a stratified layer and the other of a shock propagating down a nonuniform tube, and derive exact equations for the evolution of the shock amplitude along a shock path. These exact results are then compared with the results obtained by the characteristic rule. The characteristic rule not only incorrectly accounts for the deviation of the state behind the shock from a uniform state but also gives coefficients in the equation which differ significantly from the exact coefficients over a wide range of values of the shock strength.
Abstract:
Two methods of pre-harvest inventory were designed and tested on three cutting sites containing a total of 197 500 m3 of wood. These sites were located in flat-ground boreal forests in northwestern Quebec. Both methods studied involved scaling of trees harvested to clear the road path one year (or more) prior to harvest of adjacent cut-blocks. The first method (ROAD) considers the total road right-of-way volume divided by the total road area cleared. The resulting volume per hectare is then multiplied by the total cut-block area scheduled for harvest during the following year to obtain the total estimated cutting volume. The second method (STRATIFIED) also involves scaling of trees cleared from the road. However, in STRATIFIED, log scaling data are stratified by forest stand location. A volume per hectare is calculated for each stretch of road that crosses a single forest stand. This volume per hectare is then multiplied by the remaining area of the same forest stand scheduled for harvest one year later. The sum of all resulting estimated volumes per stand gives the total estimated cutting volume for all cut-blocks adjacent to the studied road. A third method (MNR) was also used to estimate cut volumes of the sites studied. This method represents the existing technique for estimating cutting volume in the province of Quebec. It involves summing the cut volume for all forest stands. The cut volume is estimated by multiplying the area of each stand by its estimated volume per hectare obtained from standard stock tables provided by the government. The resulting total estimated volume per cut-block for all three methods was then compared with the actual measured cut-block volume (MEASURED). This analysis revealed a significant difference between the MEASURED and MNR methods, with the MNR volume estimate being 30 % higher than MEASURED.
However, no significant difference from MEASURED was observed for the volume estimates of the ROAD and STRATIFIED methods, whose estimated cutting volumes were respectively 19 % and 5 % lower than MEASURED. Thus, the ROAD and STRATIFIED methods are good ways to estimate cut-block volumes after road right-of-way harvest under conditions similar to those examined in this study.
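The arithmetic behind the ROAD and STRATIFIED estimators lends itself to a short sketch. The numbers below are invented for illustration and are not the study's data:

```python
# Hypothetical illustration of the ROAD and STRATIFIED estimators described
# above; road areas, remaining stand areas and scaled volumes are invented.

def road_estimate(total_row_volume_m3, total_row_area_ha, cutblock_area_ha):
    """ROAD: one overall volume/ha from the whole right-of-way,
    applied to the total adjacent cut-block area."""
    return (total_row_volume_m3 / total_row_area_ha) * cutblock_area_ha

def stratified_estimate(stands):
    """STRATIFIED: a volume/ha per forest stand crossed by the road,
    each applied to the remaining area of that same stand."""
    return sum((v / a_road) * a_remaining for v, a_road, a_remaining in stands)

# Three stands crossed by the road:
# (scaled volume m3, road area cleared ha, remaining stand area ha)
stands = [(120.0, 1.5, 40.0), (90.0, 1.0, 25.0), (60.0, 0.5, 10.0)]

row_volume = sum(s[0] for s in stands)   # 270 m3 scaled on the right-of-way
row_area = sum(s[1] for s in stands)     # 3.0 ha cleared
block_area = sum(s[2] for s in stands)   # 75 ha scheduled for harvest

print(round(road_estimate(row_volume, row_area, block_area)))   # 6750
print(round(stratified_estimate(stands)))                       # 6650
```

ROAD applies one overall volume per hectare to the whole block, while STRATIFIED keeps per-stand resolution, which is why the two estimates diverge when stand densities vary along the road.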
Abstract:
Fujikawa's method of evaluating the supercurrent and the superconformal current anomalies, using the heat-kernel regularization scheme, is extended to theories with gauge invariance, in particular, to the off-shell N=1 supersymmetric Yang-Mills (SSYM) theory. The Jacobians of supersymmetry and superconformal transformations are finite. Although the gauge-fixing term is not supersymmetric and the regularization scheme is not manifestly supersymmetric, we find that the regularized Jacobians are gauge invariant and finite and they can be expressed in such a way that there is no one-loop supercurrent anomaly for the N=1 SSYM theory. The superconformal anomaly is nonzero and the anomaly agrees with a similar result obtained using other methods.
Abstract:
Fujikawa's method of evaluating the anomalies is extended to the on-shell supersymmetric (SUSY) theories. The supercurrent and the superconformal current anomalies are evaluated for the Wess-Zumino model using the background-field formulation and heat-kernel regularization. We find that the regularized Jacobians for SUSY and superconformal transformations are finite. The results can be expressed in a form such that there is no supercurrent anomaly but a finite nonzero superconformal anomaly, in agreement with similar results obtained using other methods.
Abstract:
Critical chronic lower limb ischaemia (CLI) is the most severe form of peripheral arterial disease. Even though the treatment of CLI has evolved during the last decade, CLI is still associated with considerable morbidity, mortality and a decreased quality of life, in addition to a large financial impact on society. Bypass surgery has traditionally been considered the approach of choice to treat CLI patients in order to avoid amputation. However, there are increasing data on the efficacy of endovascular revascularization procedures, such as percutaneous transluminal angioplasty (PTA), in achieving good leg salvage rates as well. Data gathered on all 2,054 CLI patients revascularized at the Helsinki University Central Hospital between 2000 and 2007 were retrospectively analyzed. This patient cohort was used to compare the results of infrainguinal PTA and bypass surgery, as well as to investigate predictors of failure after PTA. This study showed that infrainguinal PTA and bypass surgery yielded rather similar results in terms of survival, amputation-free survival and freedom from any re-intervention. When the femoropopliteal segment was treated, leg salvage was significantly better in the bypass surgery group, whereas no significant difference was observed between the two treatment methods when the revascularization extended to the infrapopliteal segment. PTA resulted in significantly lower freedom from surgical re-interventions when compared to surgical revascularization. In this study the most important predictors of poor outcome after PTA for CLI were cardiac morbidity, nonambulatory status upon hospital arrival, and gangrene as a manifestation of CLI. Thus, when feasible, PTA seems to be a valid alternative to bypass surgery in the treatment of CLI, provided that active redo-surgery is utilized.
The optimal revascularization strategy should always be sought for each CLI patient individually, considering the clinical state of the leg, the occlusive lesions to be treated, co-morbidities, life expectancy, and the availability of a suitable vein for bypass.
Abstract:
The main purpose of revascularization procedures for critical limb ischaemia (CLI) is to preserve the leg and sustain the patient's ambulatory status. Other goals are ischaemic pain relief and healing of ischaemic ulcers. Patients with CLI are usually old and have several comorbidities affecting the outcome. Revascularization for CLI is meaningless unless both life and limb are preserved. Therefore, knowledge of both patient- and bypass-related risk factors is of paramount importance in clinical decision-making, patient selection and resource allocation. The aim of this study was to identify patient- and graft-related predictors of impaired outcome after infrainguinal bypass for CLI. The purpose was to assess the outcome of high-risk patients undergoing infrainguinal bypass and to evaluate the usefulness of specific risk scoring methods. The results of bypasses in the absence of optimal vein graft material were also evaluated, and the feasibility of the new method of scaffolding suboptimal vein grafts was assessed. The results of this study showed that renal insufficiency - not only renal failure but also moderate impairment in renal function - seems to be a significant risk factor for both limb loss and death after infrainguinal bypass in patients with CLI. Low estimated GFR (less than 30 ml/min/1.73 m2) is a strong independent marker of poor prognosis. Furthermore, estimated GFR is a more accurate predictor of survival and leg salvage after infrainguinal bypass in CLI patients than serum creatinine level alone. We also found that the life expectancy of octogenarians with CLI is short. In this patient group, endovascular revascularization is associated with a better outcome than bypass in terms of survival, leg salvage and amputation-free survival, especially in the presence of coronary artery disease.
This study was the first to demonstrate that the Finnvasc and modified Prevent III risk scoring methods both predict the long-term outcome of patients undergoing either surgical or endovascular infrainguinal revascularization for CLI. Both risk scoring methods are easy to use and might be helpful in clinical practice as an aid in preoperative patient selection and decision-making. As in previous studies, we found that a single-segment great saphenous vein graft is superior to any other autologous vein graft in terms of mid-term patency and leg salvage. However, if an optimal vein graft is lacking, arm vein conduits are superior to prosthetic grafts, especially in infrapopliteal bypasses for CLI. We also studied the new method of scaffolding suboptimal-quality vein grafts and found that this method may enable the use of vein grafts of compromised quality otherwise unsuitable for bypass grafting. A remarkable finding was that patients combining high operative risk due to severe comorbidities with a risk graft have extremely poor survival, suggesting that only relatively fit patients should undergo complex bypasses with risk grafts. The results of this study can be used in clinical practice as an aid in preoperative patient selection and decision-making. In the future, the need for vascular surgery will increase significantly as the elderly and diabetic population grows, which emphasises the importance of focusing on those patients who will benefit from infrainguinal bypass. Therefore, the individual risk of the patient, ambulatory status, outcome expectations, the risk of the bypass procedure, as well as technical factors such as the suitability of outflow anatomy and the available vein material, should all be assessed and taken into consideration when deciding on the best revascularization strategy.
Abstract:
The study of soil microbiota and their activities is central to the understanding of many ecosystem processes such as decomposition and nutrient cycling. The collection of microbiological data from soils generally involves several sequential steps of sampling, pretreatment and laboratory measurements. The reliability of results depends on reliable methods in every step. The aim of this thesis was to critically evaluate some central methods and procedures used in soil microbiological studies in order to increase our understanding of the factors that affect the measurement results and to provide guidance and new approaches for the design of experiments. The thesis focuses on four major themes: 1) soil microbiological heterogeneity and sampling, 2) storage of soil samples, 3) DNA extraction from soil, and 4) quantification of specific microbial groups by the most-probable-number (MPN) procedure. Soil heterogeneity and sampling are discussed as a single theme because knowledge of spatial (horizontal and vertical) and temporal variation is crucial when designing sampling procedures. Comparison of adjacent forest, meadow and cropped field plots showed that land use has a strong impact on the degree of horizontal variation of soil enzyme activities and bacterial community structure. However, regardless of the land use, the variation of microbiological characteristics appeared not to have a predictable spatial structure at 0.5-10 m. Temporal and soil depth-related patterns were studied in relation to plant growth in cropped soil. The results showed that most enzyme activities and microbial biomass have a clear decreasing trend in the top 40 cm of the soil profile and a temporal pattern during the growing season. A new procedure for sampling of soil microbiological characteristics, based on stratified sampling and pre-characterisation of samples, was developed.
A practical example demonstrated the potential of the new procedure to reduce the analysis effort involved in laborious microbiological measurements without loss of precision. The investigation of storage of soil samples revealed that freezing (-20 °C) of small sample aliquots retains the activity of hydrolytic enzymes and the structure of the bacterial community in different soil matrices relatively well, whereas air-drying cannot be recommended as a storage method for soil microbiological properties due to large reductions in activity. Freezing below -70 °C was the preferred method of storage for samples with high organic matter content. Comparison of different direct DNA extraction methods showed that the cell lysis treatment has a strong impact on the molecular size of the DNA obtained and on the bacterial community structure detected. An improved MPN method for the enumeration of soil naphthalene degraders was introduced as an alternative to more complex MPN protocols or the DNA-based quantification approach. The main advantages of the new method are the simple protocol and the possibility of analysing a large number of samples and replicates simultaneously.
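The thesis's improved MPN protocol is not detailed in this abstract; as a hedged illustration of the principle the MPN procedure builds on, the standard single-dilution maximum-likelihood estimate can be sketched as follows (the tube counts and volumes are invented):

```python
import math

def mpn_single_dilution(n_tubes, n_positive, volume_per_tube):
    """Standard single-dilution most-probable-number estimate.
    Assuming Poisson-distributed cells, a tube is negative with
    probability exp(-MPN * v); solving the likelihood gives
    MPN = -ln((n - x) / n) / v, in organisms per unit volume."""
    if n_positive >= n_tubes:
        raise ValueError("all tubes positive: MPN unbounded at this dilution")
    frac_negative = (n_tubes - n_positive) / n_tubes
    return -math.log(frac_negative) / volume_per_tube

# Hypothetical plate: 24 replicate wells, 0.1 g soil equivalent each,
# 15 wells showing growth
print(round(mpn_single_dilution(24, 15, 0.1), 1))  # 9.8 degraders per g
```

Running many replicates at one dilution, as above, is what makes a plate-based protocol easy to scale to large numbers of samples.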
Abstract:
At the heart of this study is a dual concern: how the nation is represented as a categorical entity, and how this representation is put to use in everyday social interactions. This can be seen as a reaction to the general approach to categorisation and identity functions, which tend to be reified and essentialized within the social sciences. The empirical focus of this study is the Isle of Man, a crown dependency situated geographically central within the British Isles while remaining politically outside the United Kingdom. This site was chosen explicitly because 'notions of nation' expressed on the island can be seen as contested and ephemerally unstable. To get at these 'notions of nation' it was necessary to choose specific theoretical tools able to capture the wider cultural and representational domain while addressing the nuanced and functional aspects of interaction. As such, the main theoretical perspective used within this study was that of critical discursive psychology, which incorporates the specific theoretical tools of interpretative repertoires, ideological dilemmas and subject positions. To supplement these tools, a discursive approach to place was taken in tandem to address the form and function of place attached to nationhood. Two methods of data collection were utilized: computer-mediated communication and acquaintance interviews. From the data a number of interpretative repertoires were proposed, namely essential rights, economic worth, heritage claims, conflict orientation, people-as-nation and place-as-nation. Attached to these interpretative repertoires were the ideological dilemmas region vs. country, people vs. place and individualism vs. collectivism. The subject positions found are much more difficult to condense, but the most significant were gender, age and parentage.
The final focus of the study, that of place, was shown to be more than just an unreflected-upon 'container' of people: place was significant in terms of its rhetorical construction, both for how people saw themselves and for the discursive function of the particular interaction. Forms of place construction included size, community, temporal, economic, safety, political and recognition constructions. A number of conclusions were drawn from the above, including that when looking at nation categories we should take into account the specific meanings that people attach to such concepts and be aware of the particular uses to which they are put in interaction. Also, while it is impossible to separate concepts neatly, it is necessary to be aware of the intersections where concepts cross, and clash, when looking at nationhood.
Abstract:
The statistical properties of fractional Brownian walks are used to construct a path integral representation of the conformations of polymers with different degrees of bond correlation. We specifically derive an expression for the distribution function of the chains’ end‐to‐end distance, and evaluate it by several independent methods, including direct evaluation of the discrete limit of the path integral, decomposition into normal modes, and solution of a partial differential equation. The distribution function is found to be Gaussian in the spatial coordinates of the monomer positions, as in the random walk description of the chain, but the contour variables, which specify the location of the monomer along the chain backbone, now depend on an index h, the degree of correlation of the fractional Brownian walk. The special case of h=1/2 corresponds to the random walk. In constructing the normal mode picture of the chain, we conjecture the existence of a theorem regarding the zeros of the Bessel function.
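As a point of orientation, and not taken from the paper itself, the standard fractional-Brownian-walk scaling behind such a Gaussian end-to-end distribution can be sketched as

```latex
P(\mathbf{R};N)=\left(\frac{3}{2\pi\langle R^{2}\rangle}\right)^{3/2}
\exp\!\left(-\frac{3R^{2}}{2\langle R^{2}\rangle}\right),
\qquad \langle R^{2}\rangle = b^{2}N^{2h},
```

where b is a bond length, N the number of monomers and h the correlation index; h = 1/2 recovers the random-walk result with mean-square end-to-end distance b^2 N.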
Abstract:
This paper addresses the problem of determining an optimal (shortest) path in three-dimensional space for a constant-speed, turn-rate-constrained aerial vehicle, one that would enable the vehicle to converge to a rectilinear path starting from any arbitrary initial position and orientation. Based on 3D geometry, we propose an optimal and also a suboptimal path planning approach. Unlike the existing numerical methods, which are computationally intensive, this optimal geometrical method generates an optimal solution in less time. The suboptimal approach is comparatively more efficient and gives a solution that is very close to the optimal one. Due to its simplicity and low computational requirements, this approach can be implemented on an aerial vehicle with constrained turn radius to reach a straight line with a prescribed orientation, as required in several applications. However, if the vertical distance between the initial point and the straight line to be followed is large, the generated path may not be flyable for an aerial vehicle with a limited range of flight path angle, and we resort to a numerical method for obtaining the optimal solution. The numerical method used here for simulation is based on multiple shooting and is found to be comparatively more efficient than other methods for solving such two-point boundary value problems.
Abstract:
The problem of determining optimal power spectral density models for earthquake excitation which satisfy constraints on total average power and zero crossing rate, and which produce the highest response variance in a given linear system, is considered. The solution to this problem is obtained using linear programming methods. The resulting solutions are shown to display a highly deterministic structure and, therefore, fail to capture the stochastic nature of the input. A modification to the definition of critical excitation is proposed which takes into account the entropy rate as a measure of uncertainty in the earthquake loads. The resulting problem is solved using calculus of variations and also within a linear programming framework. Illustrative examples on specifying seismic inputs for a nuclear power plant and a tall earth dam are considered, and the resulting solutions are shown to be realistic.
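A minimal numerical sketch (an illustrative single-degree-of-freedom system with invented values, not the paper's examples) shows why the linear program produces a highly deterministic solution: with a linear objective and a single linear power constraint, the optimum concentrates all power at the frequency where the squared transfer function peaks.

```python
# Sketch of the linear program behind critical excitation: maximise the
# response variance  var = sum(|H(w_i)|^2 * S_i * dw)  over PSD ordinates
# S_i >= 0, subject to fixed total average power  sum(S_i * dw) = P.
# Objective and constraint are both linear in S_i, so the optimum puts all
# power at the frequency where |H|^2 is largest: a near-harmonic input.

wn, zeta = 10.0, 0.05          # SDOF natural frequency (rad/s), damping ratio

def H2(w):
    # squared modulus of the SDOF displacement transfer function
    return 1.0 / ((wn**2 - w**2)**2 + (2 * zeta * wn * w)**2)

ws = [0.5 * k for k in range(1, 81)]   # frequency grid, 0.5..40 rad/s
dw, P = 0.5, 1.0

i_star = max(range(len(ws)), key=lambda i: H2(ws[i]))
var_critical = H2(ws[i_star]) * P                              # all power at w*
var_flat = sum(H2(w) for w in ws) * (P / (len(ws) * dw)) * dw  # white input

print(ws[i_star])               # grid point nearest the resonance, 10.0 rad/s
print(var_critical > var_flat)  # True: concentrated PSD maximises variance
```

The entropy-rate modification described in the abstract exists precisely to spread such a spike back into a broadband, stochastic-looking spectrum.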
Abstract:
We describe here two non-interferometric methods for the estimation of the phase of transmitted wavefronts through refracting objects. The phase of the wavefronts obtained is used to reconstruct either the refractive index distribution of the objects or their contours. Refraction corrected reconstructions are obtained by the application of an iterative loop incorporating digital ray tracing for forward propagation and a modified filtered back projection (FBP) for reconstruction. The FBP is modified to take into account non-straight path propagation of light through the object. When the iteration stagnates, the difference between the projection data and an estimate of it obtained by ray tracing through the final reconstruction is reconstructed using a diffraction tomography algorithm. The reconstruction so obtained, viewed as a correction term, is added to the estimate of the object from the loop to obtain an improved final refractive index reconstruction.
Abstract:
Filtering methods are explored for removing noise from data while preserving sharp edges that may indicate a trend shift in gas turbine measurements. Linear filters are found to have difficulty removing noise while preserving features in the signal. The nonlinear hybrid median filter is found to accurately reproduce the root signal from noisy data. Simulated faulty data and fault-free gas path measurement data are passed through median filters, and health residuals for the data set are created. The health residual is a scalar norm of the gas path measurement deltas and is used to partition the faulty engine from the healthy engine using fuzzy sets. The fuzzy detection system is developed and tested with noisy data and with filtered data. Tests with simulated fault-free and faulty data show that fuzzy trend-shift detection based on filtered data is very accurate, with no false alarms and negligible missed alarms.
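As a simplified illustration of the filtering idea (a plain moving median rather than the hybrid median filter the study uses, on an invented signal), the following shows a noise spike being removed while a step-like trend shift survives:

```python
import statistics

# A moving-median filter removes impulsive noise while preserving a sharp
# step (trend shift), which a moving-average would smear. This plain median
# is a simplification of the hybrid median filter mentioned above.

def median_filter(x, k=5):
    """Odd window length k; edges handled by clamping the window."""
    half = k // 2
    return [statistics.median(x[max(0, i - half):i + half + 1])
            for i in range(len(x))]

# Step signal (a "trend shift") with one impulsive outlier
signal = [0.0] * 10 + [1.0] * 10
signal[4] = 5.0

filtered = median_filter(signal)
print(filtered[4])                 # 0.0 -- spike removed
print(filtered[9], filtered[10])   # 0.0 1.0 -- step edge preserved
```

A scalar norm of such filtered measurement deltas is the kind of quantity the health residual described above is built from.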
Abstract:
Just-in-Time (JIT) compilers for Java can be augmented by making use of runtime profile information to produce better quality code and hence achieve higher performance. In a JIT compilation environment, the profile information obtained can be readily exploited in the same run to aid recompilation and optimization of frequently executed (hot) methods. This paper discusses a low-overhead path profiling scheme for dynamically profiling JIT-produced native code. The profile information is used in recompilation during a subsequent invocation of the hot method. During recompilation, tree regions along the hot paths are enlarged and instruction scheduling at the superblock level is performed. We have used the open source LaTTe JIT compiler framework for our implementation. Our results on a SPARC platform for SPEC JVM98 benchmarks indicate that (i) there is a significant reduction in the number of tree regions along the hot paths, and (ii) profile-aided recompilation in LaTTe achieves performance comparable to that of adaptive LaTTe in spite of retranslation and profiling overheads.
Abstract:
SARAS is a correlation spectrometer purpose-designed for precision measurements of the cosmic radio background and faint features in the sky spectrum at long wavelengths that arise from redshifted 21-cm emission from gas in the reionization epoch. SARAS operates in the octave band 87.5-175 MHz. We present herein the system design, arguing for a complex correlation spectrometer concept. The SARAS design concept provides a differential measurement between the antenna temperature and that of an internal reference termination, with measurements in switched system states allowing for cancellation of additive contaminants from a large part of the signal flow path, including the digital spectrometer. A switched noise injection scheme provides absolute spectral calibration. Additionally, we argue for an electrically small frequency-independent antenna over an absorber ground. Various critical design features that aid in the avoidance of systematics and in providing calibration products for the parametrization of other unavoidable systematics are described and the rationale discussed. The signal flow and processing is analyzed, and the response to noise temperatures of the antenna, reference termination and amplifiers is computed. Multi-path propagation arising from internal reflections is considered in the analysis, which includes a harmonic series of internal reflections. We opine that the SARAS design concept is advantageous for precision measurement of the absolute cosmic radio background spectrum; therefore, the design features and analysis methods presented here are expected to serve as a basis for implementations tailored to measurements of a multiplicity of features in the background sky at long wavelengths, which may arise from events in the dark ages and subsequent reionization era.
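The switched differential measurement can be illustrated with a toy numerical sketch (all values invented, not SARAS system parameters): subtracting the readings recorded in the two switch states cancels any additive contaminant common to the signal path.

```python
# Toy model of a switched differential measurement: in each switch state the
# system records gain times (input temperature + additive contaminant).
# Differencing the states cancels the additive term, leaving
# g * (T_antenna - T_reference). All numbers here are invented.

g = 250.0          # arbitrary system gain
T_add = 42.0       # additive contaminant picked up along the signal path

def measure(T_input):
    return g * (T_input + T_add)

T_antenna, T_reference = 300.0, 296.0

state_a = measure(T_antenna)     # antenna routed to the receiver
state_b = measure(T_reference)   # internal reference termination routed in
diff = state_a - state_b         # additive contaminant cancels exactly

print(diff / g)    # 4.0 = T_antenna - T_reference, independent of T_add
```

The residual calibration task is then determining the gain g, which is what the switched noise injection scheme described above addresses.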