976 results for new procedure


Relevance: 60.00%

Abstract:

When comparing a new treatment with a control in a randomized clinical study, the treatment effect is generally assessed by evaluating a summary measure over a specific study population. The success of the trial heavily depends on the choice of such a population. In this paper, we show a systematic, effective way to identify a promising population, for which the new treatment is expected to have a desired benefit, using the data from a current study involving similar comparator treatments. Specifically, with the existing data we first create a parametric scoring system using multiple covariates to estimate subject-specific treatment differences. Using this system, we specify a desired level of treatment difference and create a subgroup of patients, defined as those whose estimated scores exceed this threshold. An empirically calibrated group-specific treatment difference curve across a range of threshold values is constructed. The population of patients with any desired level of treatment benefit can then be identified accordingly. To avoid any "self-serving" bias, we utilize a cross-training-evaluation method for implementing the above two-step procedure. Lastly, we show how to select the best scoring system among all competing models. The proposals are illustrated with the data from two clinical trials in treating AIDS and cardiovascular diseases. Note that if we are not interested in designing a new study for comparing similar treatments, the new procedure can also be quite useful for the management of future patients who would receive nontrivial benefits to compensate for the risk or cost of the new treatment.
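As a rough illustration of the two-step procedure sketched above, the following Python fragment fits a parametric scoring system with treatment-by-covariate interactions on a training sample and then traces an empirical treatment-difference curve over score thresholds on an evaluation sample. The ordinary-least-squares scoring model, the variable names and the single train/evaluate split are illustrative assumptions of this sketch, not the authors' exact specification (which uses a cross-training-evaluation scheme and empirical calibration).

import numpy as np
from sklearn.linear_model import LinearRegression

def fit_score_system(X_train, trt_train, y_train):
    """Fit y ~ trt + X + trt:X and return a function estimating the
    subject-specific treatment difference D(x) = E[y | trt=1, x] - E[y | trt=0, x]."""
    design = np.column_stack([trt_train, X_train, trt_train[:, None] * X_train])
    model = LinearRegression().fit(design, y_train)

    def score(X):
        d1 = np.column_stack([np.ones(len(X)), X, X])      # treated: trt = 1, trt*X = X
        d0 = np.column_stack([np.zeros(len(X)), X, 0 * X])  # control: trt = 0, trt*X = 0
        return model.predict(d1) - model.predict(d0)

    return score

def subgroup_effect_curve(score, X_eval, trt_eval, y_eval, thresholds):
    """Empirical treatment difference among evaluation patients whose
    estimated score exceeds each threshold value."""
    s = score(X_eval)
    curve = []
    for c in thresholds:
        keep = s >= c
        treated = keep & (trt_eval == 1)
        control = keep & (trt_eval == 0)
        if treated.sum() == 0 or control.sum() == 0:
            curve.append(np.nan)
        else:
            curve.append(y_eval[treated].mean() - y_eval[control].mean())
    return np.array(curve)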

Relevance: 60.00%

Abstract:

The successful management of cancer with radiation relies on the accurate deposition of a prescribed dose to a prescribed anatomical volume within the patient. Treatment set-up errors are inevitable because the alignment of field-shaping devices with the patient must be repeated daily, up to eighty times during the course of a fractionated radiotherapy treatment. With the invention of electronic portal imaging devices (EPIDs), a patient's portal images can be visualized daily in real time after only a small fraction of the radiation dose has been delivered to each treatment field. However, the accuracy of human visual evaluation of low-contrast portal images has been found to be inadequate. The goal of this research is to develop automated image analysis tools to detect both treatment field shape errors and patient anatomy placement errors with an EPID. A moments method has been developed to align treatment field images to compensate for the lack of repositioning precision of the image detector. A figure of merit has also been established to verify the shape and rotation of the treatment fields. Following proper alignment of treatment field boundaries, a cross-correlation method has been developed to detect shifts of the patient's anatomy relative to the treatment field boundary. Phantom studies showed that the moments method aligned the radiation fields to within 0.5 mm of translation and 0.5° of rotation and that the cross-correlation method aligned anatomical structures inside the radiation field to within 1 mm of translation and 1° of rotation. A new procedure of generating and using digitally reconstructed radiographs (DRRs) at megavoltage energies as reference images was also investigated. The procedure allowed a direct comparison between a designed treatment portal and the actual patient setup positions detected by an EPID. Phantom studies confirmed the feasibility of the methodology. Both the moments method and the cross-correlation technique were implemented within an experimental radiotherapy picture archival and communication system (RT-PACS) and were used clinically to evaluate the setup variability of two groups of cancer patients treated with and without an alpha-cradle immobilization aid. The tools developed in this project have proven to be very effective and have played an important role in detecting patient alignment errors and field-shape errors in treatment fields formed by a multileaf collimator (MLC).
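The cross-correlation step lends itself to a compact sketch. The fragment below estimates the translational shift between a reference portal image and a daily EPID image from the peak of their FFT-based circular cross-correlation, and also shows the image moments that underlie a centroid-type (moments) alignment. It is a generic illustration with assumed array inputs, not the clinical implementation described above.

import numpy as np

def field_centroid(image):
    """Zeroth- and first-order image moments give the field centroid, the
    starting point of a moments-type alignment of treatment-field images."""
    rows, cols = np.indices(image.shape)
    total = image.sum()
    return (rows * image).sum() / total, (cols * image).sum() / total

def estimate_shift(reference, daily):
    """Estimate the (row, column) translation to apply to `daily` so that it
    aligns onto `reference`, from the peak of their circular cross-correlation."""
    ref = reference - reference.mean()
    day = daily - daily.mean()
    # Cross-correlation via the FFT: corr = IFFT( FFT(ref) * conj(FFT(day)) )
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(day))).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    shape = np.array(corr.shape)
    peak[peak > shape / 2] -= shape[peak > shape / 2]   # wrap to signed shifts
    return tuple(peak)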

Relevance: 60.00%

Abstract:

Occasional strong droughts are an important feature of the climatic environment of tropical rain forest in much of Borneo. This paper compares the response of a lowland dipterocarp forest at Danum, Sabah, in a period of low (LDI) and a period of high (HDI) drought intensity (1986–96, 9.98 y; 1996–99, 2.62 y). Mean annual drought intensity was two-fold higher in the HDI than in the LDI period (1997 vs. 976 mm), and each period had one moderately strong main drought (viz. 1992, 1998). Mortality of 'all' trees ≥ 10 cm gbh (girth at breast height) and stem growth rates of 'small' trees 10–50 cm gbh were measured in sixteen 0.16-ha subplots (half on ridge, half on lower-slope sites) within two 4-ha plots. These 10–50-cm trees were composed largely of true understorey species. A new procedure was developed to correct for the effect of differences in length of census interval when comparing tree mortality rates. Mortality rates of small trees declined slightly but not significantly between the LDI and HDI periods (1.53 to 1.48% y⁻¹); mortality of all trees showed a similar pattern. Relative growth rates declined significantly by 23% from the LDI to the HDI period (11.1 to 8.6 mm m⁻¹ y⁻¹); for absolute growth rates the decrease was 28% (2.45 to 1.77 mm y⁻¹). Neither mortality nor growth rates were significantly influenced by topography. For small trees, across subplots, absolute growth rate was positively correlated with mortality rate in the LDI period, but negatively correlated in the HDI period. There was no consistent pattern in the responses among the 19 most abundant species (n ≥ 50 trees), which included a proposed drought-tolerant guild. In terms of tree survival, the forest at Danum was resistant to increasing drought intensity, but showed decreased stem growth attributable to increasing water stress.
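The census-interval correction itself is specific to the paper and is not reproduced here; the short fragment below only shows the standard annualized (exponential) mortality rate that such a correction adjusts, computed from stem counts at two censuses. The counts in the example are invented for illustration.

import math

def annual_mortality_percent(n_initial, n_survivors, interval_years):
    """Exponential (instantaneous) mortality rate, % per year:
    m = 100 * (ln N0 - ln Nt) / t."""
    return 100.0 * (math.log(n_initial) - math.log(n_survivors)) / interval_years

# Example: 1000 stems at the first census and 856 survivors 9.98 y later
# gives roughly 1.6 % y^-1.
print(round(annual_mortality_percent(1000, 856, 9.98), 2))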

Relevance: 60.00%

Abstract:

This paper proposes a new estimator for the fixed effects ordered logit model. In contrast to existing methods, the new procedure allows estimating the thresholds. The empirical relevance and simplicity of implementation is illustrated in an application on the effect of unemployment on life satisfaction.

Relevance: 60.00%

Abstract:

Because natural selection is likely to act on multiple genes underlying a given phenotypic trait, we study here the potential effect of ongoing and past selection on the genetic diversity of human biological pathways. We first show that genes included in gene sets are generally under stronger selective constraints than other genes and that their evolutionary response is correlated. We then introduce a new procedure to detect selection at the pathway level based on a decomposition of the classical McDonald–Kreitman test extended to multiple genes. This new test, called 2DNS, detects outlier gene sets and takes into account past demographic effects and evolutionary constraints specific to gene sets. Selective forces acting on gene sets can be easily identified by a mere visual inspection of the position of the gene sets relative to their two-dimensional null distribution. We thus find several outlier gene sets that show signals of positive, balancing, or purifying selection but also others showing an ancient relaxation of selective constraints. The principle of the 2DNS test can also be applied to other genomic contrasts. For instance, the comparison of patterns of polymorphisms private to African and non-African populations reveals that most pathways show a higher proportion of nonsynonymous mutations in non-Africans than in Africans, potentially due to different demographic histories and selective pressures.
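As a hedged sketch of the ingredient the 2DNS test builds on, the fragment below pools McDonald-Kreitman counts of nonsynonymous and synonymous divergence and polymorphism over the genes of a pathway and returns the two classical ratios. The actual 2DNS decomposition, its demographic calibration and its two-dimensional null distribution are not reproduced, and the input layout is an assumption of the sketch.

def pooled_mk_counts(genes, gene_set):
    """Sum Dn, Ds, Pn, Ps over the genes belonging to `gene_set`.

    `genes` maps gene name -> dict with keys 'Dn', 'Ds', 'Pn', 'Ps'
    (nonsynonymous/synonymous divergence and polymorphism counts)."""
    totals = {'Dn': 0, 'Ds': 0, 'Pn': 0, 'Ps': 0}
    for name in gene_set:
        for key in totals:
            totals[key] += genes[name][key]
    return totals

def mk_ratios(totals):
    """Return (Dn/Ds, Pn/Ps): under neutrality the two ratios are similar,
    an excess of Dn/Ds suggests positive selection, and an excess of Pn/Ps
    suggests segregating weakly deleterious variation."""
    return totals['Dn'] / totals['Ds'], totals['Pn'] / totals['Ps']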

Relevance: 60.00%

Abstract:

In an effort to investigate and document relevant sloshing-type phenomena, a series of experiments has been conducted. The aim of this paper is to describe the setup and data processing of such experiments. A sloshing tank is subjected to angular motion. As a result, pressure records are obtained at several locations, together with the motion data, torque and a collection of image and video information. The experimental rig and the data acquisition systems are described. Useful information for experimental sloshing research practitioners is provided, covering the liquids used in the experiments, the dyeing techniques, the tank building process, the synchronization of the acquisition systems, etc. A new procedure for reconstructing experimental data that takes into account experimental uncertainties is presented. This procedure is based on a least squares spline approximation of the data. Based on a deterministic approach to the first sloshing wave impact event in a sloshing experiment, an uncertainty analysis procedure for the associated first pressure peak value is described.
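As a minimal illustration of a least squares spline reconstruction of a noisy pressure record, consistent in spirit with the procedure described above, the fragment below fits a cubic least-squares spline on fixed interior knots with SciPy. The mock signal, the knot spacing and the omission of the uncertainty treatment are simplifications of this sketch.

import numpy as np
from scipy.interpolate import LSQUnivariateSpline

t = np.linspace(0.0, 2.0, 2001)                                      # time [s]
pressure = np.sin(8 * np.pi * t) + 0.05 * np.random.randn(t.size)    # mock pressure record

# Interior knots every 0.05 s; the cubic spline on these knots minimizes the
# squared residuals with respect to the sampled signal.
knots = np.arange(0.05, 2.0, 0.05)
spline = LSQUnivariateSpline(t, pressure, knots, k=3)

reconstructed = spline(t)                                  # smoothed signal on the original time base
residual_rms = np.sqrt(np.mean((pressure - reconstructed) ** 2))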

Relevance: 60.00%

Abstract:

Erosion potential and the effects of tillage can be evaluated from quantitative descriptions of soil surface roughness. The present study therefore aimed to fill the need for a reliable, low-cost and convenient method to measure that parameter. Based on the interpretation of micro-topographic shadows, this new procedure is primarily designed for use in the field after tillage. The principle underlying shadow analysis is the direct relationship between soil surface roughness and the shadows cast by soil structures under fixed sunlight conditions. The results obtained with this method were compared to the statistical indexes used to interpret field readings recorded by a pin meter. The tests were conducted on 4-m² sandy loam and sandy clay loam plots divided into 1-m² subplots tilled with three different tools: chisel, tiller and roller. The highly significant correlation between the statistical indexes and the shadow analysis results obtained in the laboratory as well as in the field for all the soil–tool combinations proved that both variability (CV) and dispersion (SD) are accommodated by the new method. This procedure simplifies the interpretation of soil surface roughness and shortens the time involved in field operations by a factor ranging from 12 to 20.
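A hedged sketch of the core idea of shadow analysis is given below: under fixed, low-angle illumination, rougher surfaces cast proportionally more shadow, so the shadowed-pixel fraction of a plot photograph can serve as a roughness proxy. The thresholding rule is an illustrative assumption of this sketch and is not the index used in the study.

import numpy as np

def shadow_fraction(gray_image, threshold=None):
    """Fraction of pixels darker than `threshold` in a 2-D grayscale array.
    If no threshold is given, use the midpoint between the minimum and
    maximum intensities (a crude, illustrative split)."""
    img = np.asarray(gray_image, dtype=float)
    if threshold is None:
        threshold = 0.5 * (img.min() + img.max())
    return float((img < threshold).mean())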

Relevance: 60.00%

Abstract:

The numerical strategies employed in the evaluation of singular integrals existing in the Cauchy principal value (CPV) sense are, undoubtedly, one of the key aspects that remarkably affect the performance and accuracy of the boundary element method (BEM). Thus, a new procedure, based upon a bi-cubic co-ordinate transformation and oriented towards the numerical evaluation of both the CPV integrals and some others containing different types of singularity, is developed. Both the ideas and some details involved in the proposed formulae are presented, obtaining rather simple and attractive expressions for the numerical quadrature which are also easily embodied into existing BEM codes. Some illustrative examples which assess the stability and accuracy of the new formulae are included.
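The bi-cubic transformation itself is not reproduced here. For orientation, the fragment below evaluates a one-dimensional CPV integral by the standard singularity-subtraction device, which illustrates the kind of quadrature problem the proposed co-ordinate transformation addresses inside BEM kernels.

import numpy as np
from scipy.integrate import quad

def cpv_integral(f, a, b, s):
    """Cauchy principal value of  integral_a^b f(x)/(x - s) dx  with a < s < b,
    computed as  integral (f(x) - f(s))/(x - s) dx  +  f(s) * ln((b - s)/(s - a))."""
    regular = lambda x: (f(x) - f(s)) / (x - s) if x != s else 0.0
    value, _ = quad(regular, a, b, points=[s])
    return value + f(s) * np.log((b - s) / (s - a))

# Example: CPV of exp(x)/(x - 0.3) on [0, 1]
print(cpv_integral(np.exp, 0.0, 1.0, 0.3))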

Relevance: 60.00%

Abstract:

In this chapter, we are going to describe the main features as well as the basic steps of the Boundary Element Method (BEM) as applied to elastostatic problems and to compare them with other numerical procedures. As we shall show, it is easy to appreciate the advantages of the BEM, but it is also advisable to refrain from unrestrained enthusiasm, as there are also limitations to its usefulness in certain types of problems. The number of these problems, nevertheless, is sufficient to justify the interest and activity that the new procedure has aroused among researchers all over the world. Briefly speaking, the most frequently used version of the BEM as applied to elastostatics works with the fundamental solution, i.e. the singular solution of the governing equations, as an influence function and tries to satisfy the boundary conditions of the problem with the aid of a discretization scheme which consists exclusively of boundary elements. As in other numerical methods, the BEM was developed thanks to the computational possibilities offered by modern computers, on a totally "classical" basis. That is, the theoretical grounds are based on linear elasticity theory, incorporated long ago into the curricula of most engineering schools. Its delay in gaining popularity is probably due to the enormous momentum with which the Finite Element Method (FEM) penetrated the professional and academic media. Nevertheless, the fact that these methods were developed before the BEM has been beneficial, because the BEM successfully uses the results and techniques studied in past decades. Some authors even consider the BEM a particular case of the FEM, while others view both methods as special cases of the general weighted residual technique. The first paper usually cited in connection with the BEM as applied to elastostatics is that of Rizzo, even though the works of Jaswon et al., Massonet and Oliveira were published at about the same time, the reason probably being the attractiveness of the "direct" approach over the "indirect" one. The work of Rizzo and the subsequent work of Cruse initiated a fruitful period with applications of the direct BEM to problems of elastostatics, elastodynamics, fracture, etc. The next key contribution was that of Lachat and Watson, incorporating all the FEM discretization philosophy in what is sometimes called the "second BEM generation". This has, no doubt, led directly to the current developments. Among the various researchers who worked on elastostatics by employing the direct BEM, one can additionally mention Rizzo and Shippy, Cruse et al., Lachat and Watson, Alarcón et al., Brebbia et al., Howell and Doyle, Kuhn and Möhrmann, and Patterson and Sheikh; among those who used the indirect BEM, one can additionally mention Benjumea and Sikarskie, Butterfield, Banerjee et al., Niwa et al., and Altiero and Gavazza. An interesting version of the indirect method, called the Displacement Discontinuity Method (DDM), has been developed by Crouch. A comprehensive study of various special aspects of the elastostatic BEM has been carried out by Heise, while review-type articles on the subject have been reported by Watson and Hartmann. At the present time, the method is well established and is being used for the solution of a variety of problems in engineering mechanics. Numerous introductory and advanced books have been published, as well as research-oriented ones.
In this sense, it is worth noting the series of conferences promoted by Brebbia since 1978, which have stimulated a continuous research effort all over the world in relation to the BEM. In the following sections, we shall concentrate on developing the direct BEM as applied to elastostatics.

Relevance: 60.00%

Abstract:

True stress–true strain curves of naturally spun viscid line fibers retrieved directly from the spiral of orb webs built by Argiope trifasciata spiders were measured using a novel methodology. This new procedure combines a method for removing the aqueous coating of the fibers with a technique that allows the accurate measurement of their cross-sectional area. Comparison of the tensile behaviour of different samples indicates that naturally spun viscid lines show a large variability, comparable to that of other silks, such as major ampullate gland silk and silkworm silk. Nevertheless, a statistical analysis allowed two independent parameters to be identified that underlie the variability and characterize the observed range of true stress–true strain curves. Combining this result with previous mechanical and microstructural data suggested assigning these two independent effects to the degree of alignment of the protein chains and to the local relative humidity which, in turn, depends on the composition of the viscous coating and on the external environmental conditions.
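For reference, the conversion from the measured quantities to true stress and true strain is summarized in the short fragment below, assuming that the force, the gauge length and the instantaneous cross-sectional area are available for each point of the test; the coating-removal and area-measurement steps of the actual methodology are not reproduced, and the array names are illustrative.

import numpy as np

def true_stress_strain(force_N, length_m, initial_length_m, area_m2):
    """Return (true_strain, true_stress_Pa) arrays from the applied force,
    the instantaneous gauge length and the instantaneous cross-sectional area:
    epsilon_true = ln(L / L0),  sigma_true = F / A."""
    true_strain = np.log(np.asarray(length_m) / initial_length_m)
    true_stress = np.asarray(force_N) / np.asarray(area_m2)
    return true_strain, true_stress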

Relevance: 60.00%

Abstract:

Thermal loads caused by environmental actions generate appreciable stresses in massive, statically indeterminate structures such as arch dams. Research indicates that variation of the ambient temperature is the second most frequent cause of repairs in concrete dams in service, and it is also a cause of cracking in an appreciable percentage of cases. Dams are singular infrastructures because of their dimensions, their service life, their impact on the territory and the risk implied by their presence. Evaluating that risk requires, among other tools, mathematical models that predict structural behaviour, and such models must reproduce reality as faithfully as possible. Moreover, in a possible climate change scenario in which mean temperatures are expected to rise, society needs to know how sensitive infrastructures will behave under future climatic conditions. Nevertheless, few studies have focused on determining the temperature field of concrete dams. In this research, existing thermal calculation models have been improved by incorporating new physical mechanisms of heat transfer between the structure and its surrounding environment, and new, more efficient methodologies have been proposed to quantify other heat transfer mechanisms. The new methodology has been applied to a case study for which an extensive record of concrete temperatures was available; the quality of the predictions made by the various thermal models was verified for this pilot case, the results of the models were compared with one another, and the consequences of the temperature predictions of some of the thermal models on the structural response of the case study were determined. The thermal models have also been used to characterize arch dams thermally, studying the effect of certain atmospheric variables and geometric features of the dams on their thermal response. In addition, a methodology is proposed to evaluate the thermal and structural response of infrastructures to the meteorological changes that climate change may induce; it has been applied to a case study, an arch dam, obtaining its future thermal and structural response under several climatic scenarios. In view of this possible change in the meteorological variables, various adaptation measures are detailed and a modification of the Spanish dam design regulations is proposed concerning the calculation of the design temperature distribution. Finally, conclusions are drawn and possible future lines of research are suggested to extend knowledge of the temperature distribution inside dams and its consequences for the structural response, as well as to develop new procedures for defining design thermal loads and possible adaptation measures against climate change.
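The abstract above stays at a descriptive level; purely as a hedged illustration of the kind of model involved, the fragment below advances a one-dimensional transient conduction problem through a concrete section with a convective-plus-solar boundary at the air face and a fixed reservoir temperature at the water face. The material properties, heat-transfer coefficient, boundary data and discretization are invented for the sketch and are not values from the case study.

import numpy as np

k, rho, c = 2.0, 2400.0, 900.0      # conductivity [W/m K], density [kg/m3], heat capacity [J/kg K]
alpha = k / (rho * c)               # thermal diffusivity [m^2/s]
h, absorptivity = 15.0, 0.65        # convection coefficient [W/m^2 K], solar absorptivity

dx, dt = 0.05, 60.0                 # grid spacing [m], time step [s]
assert alpha * dt / dx**2 < 0.5     # explicit-scheme stability

T = np.full(41, 12.0)               # 2 m thick section, uniform initial temperature [C]

def step(T, T_air, solar_W_m2, T_water):
    Tn = T.copy()
    # Interior nodes: explicit finite-difference conduction update
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # Air face: convection plus absorbed solar radiation (energy balance on half cell)
    q_air = h * (T_air - T[0]) + absorptivity * solar_W_m2
    Tn[0] = T[0] + dt / (rho * c * dx / 2) * (q_air + k * (T[1] - T[0]) / dx)
    # Water face: kept at the reservoir temperature for simplicity
    Tn[-1] = T_water
    return Tn

for hour in range(24):
    T_air = 20.0 + 8.0 * np.sin(2 * np.pi * hour / 24)   # illustrative daily air cycle
    for _ in range(60):                                    # 60 one-minute steps per hour
        T = step(T, T_air=T_air, solar_W_m2=400.0, T_water=10.0)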

Relevance: 60.00%

Abstract:

Recent improvements of a hierarchical ab initio or de novo approach for predicting both α and β structures of proteins are described. The united-residue energy function used in this procedure includes multibody interactions from a cumulant expansion of the free energy of polypeptide chains, with their relative weights determined by Z-score optimization. The critical initial stage of the hierarchical procedure involves a search of conformational space by the conformational space annealing (CSA) method, followed by optimization of an all-atom model. The procedure was assessed in a recent blind test of protein structure prediction (CASP4). The resulting lowest-energy structures of the target proteins (ranging in size from 70 to 244 residues) agreed with the experimental structures in many respects. The entire experimental structure of a cyclic α-helical protein of 70 residues was predicted to within 4.3 Å α-carbon (Cα) rms deviation (rmsd) whereas, for other α-helical proteins, fragments of roughly 60 residues were predicted to within 6.0 Å Cα rmsd. Whereas β structures can now be predicted with the new procedure, the success rate for α/β- and β-proteins is lower than that for α-proteins at present. For the β portions of α/β structures, the Cα rmsd's are less than 6.0 Å for contiguous fragments of 30–40 residues; for one target, three fragments (of length 10, 23, and 28 residues, respectively) formed a compact part of the tertiary structure with a Cα rmsd less than 6.0 Å. Overall, these results constitute an important step toward the ab initio prediction of protein structure solely from the amino acid sequence.
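The Cα rms deviation quoted above can be computed, for two residue-matched coordinate sets, by superposing them with the Kabsch algorithm and taking the root-mean-square distance over Cα atoms; a minimal sketch follows. The (N, 3) coordinate arrays are assumed inputs, and this is not the CASP evaluation code.

import numpy as np

def ca_rmsd(pred, ref):
    """Cα RMSD between predicted and reference coordinates after optimal
    rigid superposition (Kabsch algorithm)."""
    P = pred - pred.mean(axis=0)
    Q = ref - ref.mean(axis=0)
    # SVD of the covariance matrix, with a reflection correction so that
    # the result is a proper rotation.
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    P_rot = P @ R.T
    return float(np.sqrt(np.mean(np.sum((P_rot - Q) ** 2, axis=1))))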

Relevance: 60.00%

Abstract:

This thesis describes an investigation in which we compare Ni(0), Ni(I) and Ni(II) complexes containing 1,3-bis(diphenylphosphino)propane (dppp) as a phosphine ligand for their abilities to effect three types of cross-coupling reactions: Buchwald-Hartwig amination, Heck-Mizoroki, and Suzuki-Miyaura cross-coupling reactions with different types of substrates. The Ni(0) complex Ni(dppp)2 is known, and we have synthesized it via a new procedure involving zinc reduction of the known NiCl2(dppp) in the presence of an excess of dppp. The Ni(0) complex was characterized by NMR spectroscopy and X-ray crystallography. Since Ni(I) complexes of dppp seem to be unknown, we have synthesized what at this stage appear to be NiX(dppp)n/[NiX(dppp)n]x (X = Cl, Br, I; n = 1, 2; x = 1, 2) by comproportionation of molar equivalents of Ni(dppp)2 and NiX2(dppp), X = Cl, Br, I.

Relevance: 60.00%

Abstract:

In this paper we propose a new identification method, based on the residual white noise autoregressive criterion (Pukkila et al., 1990), to select the order of VARMA structures. Results from extensive simulation experiments based on different model structures with varying numbers of observations and numbers of component series are used to demonstrate the performance of this new procedure. We also use economic and business data to compare the model structures selected by this order selection method with those identified in other published studies.
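A rough, simplified stand-in for the residual-whiteness idea is sketched below: fit candidate VARMA(p, q) models and accept the smallest structure whose residual series show no remaining autoregressive structure according to an information-criterion-based AR order search. This is not the Pukkila et al. (1990) criterion itself; the statsmodels-based implementation, the candidate grid and the whiteness rule are assumptions of the sketch.

import numpy as np
from statsmodels.tsa.statespace.varmax import VARMAX
from statsmodels.tsa.ar_model import ar_select_order

def residuals_look_white(resid, maxlag=10, ic="bic"):
    """True if no autoregressive lags are selected for any residual series."""
    resid = np.asarray(resid)
    for j in range(resid.shape[1]):
        selected = ar_select_order(resid[:, j], maxlag=maxlag, ic=ic).ar_lags
        if selected:                      # any selected lag means remaining structure
            return False
    return True

def pick_varma_order(data, candidates=((1, 0), (0, 1), (1, 1), (2, 0), (2, 1))):
    """Return the first (smallest) candidate order whose residuals look white."""
    for p, q in candidates:               # candidates ordered from small to large
        res = VARMAX(data, order=(p, q)).fit(disp=False)
        if residuals_look_white(res.resid):
            return (p, q), res
    return None, None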

Relevance: 60.00%

Abstract:

Objective: To introduce a new technique for co-registration of Magnetoencephalography (MEG) with magnetic resonance imaging (MRI). We compare the accuracy of a new bite-bar with fixed fiducials to a previous technique whereby fiducial coils were attached proximal to landmarks on the skull. Methods: A bite-bar with fixed fiducial coils is used to determine the position of the head in the MEG co-ordinate system. Co-registration is performed by a surface-matching technique. The advantage of fixing the coils is that the co-ordinate system is not based upon arbitrary and operator dependent fiducial points that are attached to landmarks (e.g. nasion and the preauricular points), but rather on those that are permanently fixed in relation to the skull. Results: As a consequence of minimizing coil movement during digitization, errors in localization of the coils are significantly reduced, as shown by a randomization test. Displacement of the bite-bar caused by removal and repositioning between MEG recordings is minimal (∼0.5 mm), and dipole localization accuracy of a somatosensory mapping paradigm shows a repeatability of ∼5 mm. The overall accuracy of the new procedure is greatly improved compared to the previous technique. Conclusions: The test-retest reliability and accuracy of target localization with the new design is superior to techniques that incorporate anatomical-based fiducial points or coils placed on the circumference of the head. © 2003 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
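The surface-matching step can be illustrated with a small iterative-closest-point sketch that rigidly aligns digitized head-shape points to MRI scalp-surface points. The SciPy-based nearest-neighbour search and rotation fit are choices of this sketch rather than the implementation used in the study, and the point arrays are assumed (N, 3) inputs.

import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.transform import Rotation

def icp_rigid(head_points, scalp_points, n_iter=30):
    """Return (R, t) such that R @ p + t maps the digitized head points onto
    the MRI scalp surface in a least-squares sense."""
    tree = cKDTree(scalp_points)
    moved = head_points.copy()
    for _ in range(n_iter):
        _, idx = tree.query(moved)                  # closest scalp point per head point
        target = scalp_points[idx]
        mu_p, mu_q = moved.mean(axis=0), target.mean(axis=0)
        rot, _ = Rotation.align_vectors(target - mu_q, moved - mu_p)
        moved = rot.apply(moved - mu_p) + mu_q      # apply the incremental rigid update
    # Recover the total rigid transform mapping the original points onto `moved`
    rot_total, _ = Rotation.align_vectors(moved - moved.mean(axis=0),
                                          head_points - head_points.mean(axis=0))
    t_total = moved.mean(axis=0) - rot_total.apply(head_points.mean(axis=0))
    return rot_total.as_matrix(), t_total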