954 results for Local linearization methods
Abstract:
Final project of the Integrated Master's degree in Medicine, Faculdade de Medicina, Universidade de Lisboa, 2014
Abstract:
Mode of access: Internet.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
Mode of access: Internet.
Abstract:
S/N 017-024-02146-7
Abstract:
Documentation of the activities and accomplishments of the Service and Methods Demonstration Program for fiscal years 1979, 1980, and 1981.
Abstract:
Mode of access: Internet.
Abstract:
Authorized by the Board of Supervisors.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Most of the modern developments with classification trees are aimed at improving their predictive capacity. This article considers a curiously neglected aspect of classification trees, namely the reliability of predictions that come from a given classification tree. In the sense that a node of a tree represents a point in the predictor space in the limit, the aim of this article is the development of localized assessments of the reliability of prediction rules. A classification tree may be used either to provide a probability forecast, where for each node the membership probabilities for each class constitute the prediction, or a true classification, where each new observation is predictively assigned to a unique class. Correspondingly, two types of reliability measure will be derived, namely prediction reliability and classification reliability. We use bootstrapping methods as the main tool to construct these measures. We also provide a suite of graphical displays by which they may be easily appreciated. In addition to providing some estimate of the reliability of specific forecasts of each type, these measures can also be used to guide future data collection to improve the effectiveness of the tree model. The motivating example we give has a binary response, namely the presence or absence of a species of Eucalypt, Eucalyptus cloeziana, at a given sampling location in response to a suite of environmental covariates (although the methods are not restricted to binary response data).
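The bootstrap idea behind the prediction-reliability measure can be sketched in a few lines. The following is a minimal illustration (function and variable names are hypothetical, not from the paper): resample the training cases that fall in a node, recompute the node's class-membership probability on each resample, and report the spread of those probabilities as a reliability indicator.

```python
import random
from statistics import mean, stdev

def node_prediction_reliability(node_labels, n_boot=1000, seed=0):
    """Bootstrap the class-1 membership probability at a tree node.

    node_labels: 0/1 outcomes of the training cases falling in the node.
    Returns (point estimate, bootstrap standard error) -- a simple
    stand-in for a node-level prediction-reliability measure.
    """
    rng = random.Random(seed)
    n = len(node_labels)
    p_hat = mean(node_labels)  # observed membership probability
    boot = []
    for _ in range(n_boot):
        sample = [node_labels[rng.randrange(n)] for _ in range(n)]
        boot.append(mean(sample))  # node probability on this resample
    return p_hat, stdev(boot)

# Example: a node containing 12 presences and 8 absences of the species
p, se = node_prediction_reliability([1] * 12 + [0] * 8)
```

A small node gives a wide bootstrap spread, flagging the forecast as unreliable; this is also the sense in which the measure can direct future data collection toward poorly supported regions of predictor space.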
Abstract:
Background: Sentinel node biopsy (SNB) is being increasingly used but its place outside randomized trials has not yet been established. Methods: The first 114 sentinel node (SN) biopsies performed for breast cancer at the Princess Alexandra Hospital from March 1999 to June 2001 are presented. In 111 cases axillary dissection was also performed, allowing the accuracy of the technique to be assessed. A standard combination of preoperative lymphoscintigraphy, intraoperative gamma probe and injection of blue dye was used in most cases. Results are discussed in relation to the risk and potential consequences of understaging. Results: Where both probe and dye were used, the SN was identified in 90% of patients. A significant number of patients were treated in two stages and the technique was no less effective in patients who had SNB performed at a second operation after the primary tumour had already been removed. The interval from radioisotope injection to operation was very wide (between 2 and 22 h) and did not affect the outcome. Nodal metastases were present in 42 patients in whom an SN was found, and in 40 of these the SN was positive, giving a false negative rate of 4.8% (2/42), with the overall percentage of patients understaged being 2%. For this particular group as a whole, the increased risk of death due to systemic therapy being withheld as a consequence of understaging (if SNB alone had been employed) is estimated at less than 1/500. The risk for individuals will vary depending on other features of the particular primary tumour. Conclusion: For patients who elect to have the axilla staged using SNB alone, the risk and consequences of understaging need to be discussed. These risks can be estimated by allowing for the specific surgeon's false negative rate for the technique, and considering the likelihood of nodal metastases for a given tumour. 
There appears to be no disadvantage with performing SNB at a second operation after the primary tumour has already been removed. Clearly, for a large number of patients, SNB alone will be safe, but ideally participation in randomized trials should continue to be encouraged.
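The two headline rates in this series follow from simple counts. A hedged illustration (helper name and the round patient total are illustrative, not the paper's):

```python
def staging_error_rates(node_positive, sn_positive, total_patients):
    """False-negative rate and overall understaging for an SNB series.

    node_positive:  patients with nodal metastases among those with an SN found.
    sn_positive:    of those, patients whose sentinel node was itself positive.
    total_patients: all patients in whom an SN was identified.
    """
    false_negatives = node_positive - sn_positive
    fn_rate = false_negatives / node_positive       # missed / truly node-positive
    understaged = false_negatives / total_patients  # missed / whole series
    return fn_rate, understaged

# The series reported 42 node-positive patients, 40 of whom had a positive SN;
# 100 is used here as a round stand-in for the number of patients with an SN found.
fn, us = staging_error_rates(42, 40, 100)
```

With these inputs the function reproduces the abstract's figures: a false-negative rate of 2/42 (about 4.8%) and roughly 2% of the series understaged.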
Abstract:
This paper gives a review of recent progress in the design of numerical methods for computing the trajectories (sample paths) of solutions to stochastic differential equations. We give a brief survey of the area focusing on a number of application areas where approximations to strong solutions are important, with a particular focus on computational biology applications, and give the necessary analytical tools for understanding some of the important concepts associated with stochastic processes. We present the stochastic Taylor series expansion as the fundamental mechanism for constructing effective numerical methods, give general results that relate local and global order of convergence and mention the Magnus expansion as a mechanism for designing methods that preserve the underlying structure of the problem. We also present various classes of explicit and implicit methods for strong solutions, based on the underlying structure of the problem. Finally, we discuss implementation issues relating to maintaining the Brownian path, efficient simulation of stochastic integrals and variable-step-size implementations based on various types of control.
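The simplest member of the family of methods surveyed here is the Euler-Maruyama scheme, which truncates the stochastic Taylor expansion after the first terms. A minimal sketch for a scalar SDE dX = a(X) dt + b(X) dW (function names are illustrative):

```python
import math
import random

def euler_maruyama(a, b, x0, t_end, n_steps, seed=0):
    """Euler-Maruyama approximation of dX = a(X) dt + b(X) dW.

    Strong order 1/2 in general. Each Brownian increment is drawn as
    an N(0, dt) sample; refining the path later requires storing and
    reusing these increments (the 'maintaining the Brownian path'
    implementation issue discussed above).
    """
    rng = random.Random(seed)
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        dW = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment over dt
        x = x + a(x) * dt + b(x) * dW
    return x

# Geometric Brownian motion: dX = mu*X dt + sigma*X dW, X(0) = 1
x_T = euler_maruyama(lambda x: 0.05 * x, lambda x: 0.2 * x,
                     x0=1.0, t_end=1.0, n_steps=1000)
```

Higher-order strong schemes such as Milstein add further terms of the stochastic Taylor expansion (here, a b·b′ correction involving the iterated integral), at the cost of simulating additional stochastic integrals.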
Abstract:
An X-ray visualization technique has been used for the quantitative determination of the local liquid holdup distribution and liquid holdup hysteresis in a nonwetting two-dimensional (2-D) packed bed. A medical diagnostic X-ray unit has been used to image the local holdups in a 2-D cold model having a random packing of expanded polystyrene beads. An aqueous barium chloride solution was used as the fluid to achieve good contrast on X-ray images. To quantify the local liquid holdup, a simple calibration technique has been developed that can be used for most radiological methods, such as gamma-ray and neutron radiography. The global value of total liquid holdup, obtained by the X-ray method, has been compared with two conventional methods: drainage and tracer response. The X-ray technique, after validation, has been used to visualize and quantify the liquid hysteresis phenomena in a packed bed. The liquid flows in preferred paths or channels that carry droplets/rivulets of increasing size and number as the liquid flow rate is increased. When the flow is reduced, these paths are retained, and the higher liquid holdup that persists in these regions leads to the holdup hysteresis effect. Holdup in some regions of the packed bed may be an order of magnitude higher than the average at a particular flow rate. (c) 2005 American Institute of Chemical Engineers
Abstract:
The Gauss-Marquardt-Levenberg (GML) method of computer-based parameter estimation, in common with other gradient-based approaches, suffers from the drawback that it may become trapped in local objective function minima, and thus report optimized parameter values that are not, in fact, optimized at all. This can seriously degrade its utility in the calibration of watershed models where local optima abound. Nevertheless, the method also has advantages, chief among these being its model-run efficiency, and its ability to report useful information on parameter sensitivities and covariances as a by-product of its use. It is also easily adapted to maintain this efficiency in the face of potential numerical problems (that adversely affect all parameter estimation methodologies) caused by parameter insensitivity and/or parameter correlation. The present paper presents two algorithmic enhancements to the GML method that retain its strengths, but which overcome its weaknesses in the face of local optima. Using the first of these methods an intelligent search for better parameter sets is conducted in parameter subspaces of decreasing dimensionality when progress of the parameter estimation process is slowed either by numerical instability incurred through problem ill-posedness, or when a local objective function minimum is encountered. The second methodology minimizes the chance of successive GML parameter estimation runs finding the same objective function minimum by starting successive runs at points that are maximally removed from previous parameter trajectories. As well as enhancing the ability of a GML-based method to find the global objective function minimum, the latter technique can also be used to find the locations of many non-global optima (should they exist) in parameter space. 
This can provide a useful means of inquiring into the well-posedness of a parameter estimation problem, and for detecting the presence of bimodal parameter and predictive probability distributions. The new methodologies are demonstrated by calibrating a Hydrological Simulation Program-FORTRAN (HSPF) model against a time series of daily flows. Comparison with the SCE-UA method in this calibration context demonstrates a high level of comparative model run efficiency for the new method. (c) 2006 Elsevier B.V. All rights reserved.
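The second enhancement, restarting successive runs from points maximally removed from previous parameter trajectories, can be sketched with a simple farthest-point heuristic. This is an illustrative stand-in (names and the candidate-sampling strategy are assumptions, not the paper's algorithm):

```python
import random

def farthest_start(previous_points, bounds, n_candidates=500, seed=0):
    """Pick a restart point maximally removed from earlier trajectories.

    Samples candidates uniformly in the parameter box and keeps the one
    whose minimum Euclidean distance to all previously visited points is
    largest -- a simplified version of the restart heuristic described above.
    """
    rng = random.Random(seed)
    best, best_d = None, -1.0
    for _ in range(n_candidates):
        cand = [rng.uniform(lo, hi) for lo, hi in bounds]
        d = min(sum((c - p) ** 2 for c, p in zip(cand, prev)) ** 0.5
                for prev in previous_points)
        if d > best_d:
            best, best_d = cand, d
    return best

# Two earlier GML runs ended near opposite corners of a 2-D parameter box;
# the restart point lands well away from both.
start = farthest_start([[0.1, 0.1], [0.9, 0.9]], [(0.0, 1.0), (0.0, 1.0)])
```

Each local GML run launched from such a point either converges to a new optimum, mapping out the non-global minima, or reconfirms the best minimum found so far, strengthening the case that it is global.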
Abstract:
Tourism planning has been advocated by many as a possible means of alleviating some of the negative impacts of tourism. While a number of approaches have evolved, tourism planning based on the philosophies of sustainability has emerged as the most comprehensive approach. To investigate the tourism planning approaches of local tourism destinations in Queensland, particularly the extent to which tourism plans exhibit the sustainable approach to tourism planning, 30 local tourism planning documents were reviewed. Despite claims that sustainable tourism planning is one of the most accepted approaches, the study has shown that this is not necessarily the case in practice. It was found that although a number of plans addressed the issue of sustainability, the subsequent strategies and actions suggest that the sustainable approach is not the dominant planning approach; in fact, the economic and infrastructure approaches are the primary planning methods.