971 results for Pitt, Christopher, 1699-1748


Relevance: 10.00%

Abstract:

We describe a scaling method for templating digital radiographs using conventional acetate templates, independent of template magnification and without the need for a calibration marker. The mean magnification factor for the radiology department was determined (119.8%, range 117%-123.4%). This fixed magnification factor was used to scale the radiographs by the method described. Thirty-two femoral heads on postoperative THR radiographs were then measured and compared to their actual size. The mean absolute accuracy was within 0.5% of actual head size (range 0 to 3%), with a mean absolute difference of 0.16 mm (range 0-1 mm, SD 0.26 mm). Intraclass Correlation Coefficient (ICC) analysis showed excellent reliability for both inter- and intraobserver measurements, with an ICC of 0.993 (95% CI 0.988-0.996) for interobserver measurements and intraobserver ICCs ranging from 0.990 to 0.993 (95% CI 0.980-0.997).
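
As a rough illustration of the arithmetic involved (a minimal Python sketch, not the authors' full templating protocol; the 119.8% figure is the departmental mean reported above and the function name is ours):

    # Convert a distance measured on the scaled radiograph back to true size,
    # assuming a fixed departmental magnification factor (mean 1.198, as above).
    MAGNIFICATION = 1.198

    def true_size_mm(measured_mm: float, magnification: float = MAGNIFICATION) -> float:
        """Remove the fixed magnification from a measurement taken on the image."""
        return measured_mm / magnification

    # Example: a femoral head measuring 33.5 mm on the radiograph
    print(round(true_size_mm(33.5), 1))  # roughly 28.0 mm true diameter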

Relevance: 10.00%

Abstract:

In this paper I discuss David Shaw’s claim that the body of a terminally ill person can be conceived as a kind of life-support, akin to an artificial ventilator. I claim that this position rests upon an untenable dualism between the mind and the body. Given that dualism continues to be attractive to some thinkers, I attempt to diagnose the reasons why it continues to be attractive, as well as to demonstrate its incoherence, drawing on some recent work in the philosophy of psychology. I conclude that, if my criticisms are sound, Shaw’s attempt to deny the distinction between withdrawal and euthanasia fails.

Relevance: 10.00%

Abstract:

Value Management (VM), which originated in the US manufacturing industry in the early 1940s, has become increasingly popular within the international construction industry. It is widely accepted as an important tool in the management of projects. Its structured, systematic and multi-disciplinary approach to decision making gives VM a niche in delivering better value for money on the client's investment. VM appears to be gaining momentum as an essential management tool in the Malaysian construction sector, especially in quantity surveying practice. Quantity surveyors' increasing involvement in VM provides an opportunity for the profession to re-model some of its traditional services in a more positive light, develop leading-edge skills and promote the profession. International practice has treated VM as part of the services offered by quantity surveyors; in the UK in particular it has proven to be a natural progression of the QS profession. The introduction of VM in Malaysia as early as the 1980s, combined with the increasing demand for construction projects to facilitate national progress, presents an opportunity for the quantity surveying profession to take the lead in developing VM as one of its niche areas, reflecting its traditional attribute of providing the best value-for-money advice to the client. This paper discusses the development of VM in Malaysia and the challenges VM services in QS firms face in remaining ahead of their competitors.

Relevance: 10.00%

Abstract:

In this paper we explore the ability of a recent model-based learning technique, Receding Horizon Locally Weighted Regression (RH-LWR), to learn temporally dependent systems. In particular, we investigate the application of RH-LWR to learning control of multiple-input multiple-output robot systems. RH-LWR is demonstrated by learning joint velocity and position control of a three degree-of-freedom (DoF) rigid-body robot.
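
The receding-horizon formulation itself is not reproduced here, but the locally weighted regression building block it relies on can be sketched as follows (a generic illustration on toy data of our own, not the paper's implementation):

    import numpy as np

    def lwr_predict(X, Y, x_query, bandwidth=0.5):
        """Locally weighted linear regression prediction at x_query."""
        # Gaussian kernel weights centred on the query point
        d2 = np.sum((X - x_query) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))
        # Weighted least squares with a bias term
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])
        sw = np.sqrt(w)[:, None]
        beta, *_ = np.linalg.lstsq(sw * Xb, sw * Y, rcond=None)
        return np.append(x_query, 1.0) @ beta

    # Toy example: learn y = sin(x) from samples and predict at x = 1.0
    X = np.linspace(0.0, 2.0 * np.pi, 50).reshape(-1, 1)
    Y = np.sin(X)
    print(lwr_predict(X, Y, np.array([1.0])))  # close to sin(1.0) ~ 0.84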

Relevance: 10.00%

Abstract:

Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach we have implemented and tested it in an existing data flow analysis toolkit.
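
As a toy illustration of why bit-level modelling removes false positives (our own sketch, not the paper's model library): in the C expression x & 0x0F only the low four bits of x can reach the result, and x & 0x00 lets nothing through at all, so a classified x is partially or fully downgraded.

    def taint_after_and_with_constant(taint_in: int, const_mask: int) -> int:
        """Per-bit taint of (x & C) for a public constant C: only the bits
        that C preserves can still carry classified data."""
        return taint_in & const_mask

    secret_taint = 0xFF  # all 8 bits of `secret` are classified
    print(hex(taint_after_and_with_constant(secret_taint, 0x0F)))  # 0xf: low nibble only
    print(hex(taint_after_and_with_constant(secret_taint, 0x00)))  # 0x0: fully downgraded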

Relevance: 10.00%

Abstract:

Even though the driving ability of older adults may decline with age, there is evidence that some individuals attempt to compensate for these declines using strategies such as restricting their driving exposure. Such compensatory mechanisms rely on drivers’ ability to evaluate their own driving performance. This paper focuses on one key aspect of driver ability that is associated with crash risk and has been found to decline with age: hazard perception. Three hundred and seven drivers, aged 65 to 96, completed a validated video-based hazard perception test. There was no significant relationship between hazard perception test response latencies and drivers’ ratings of their hazard perception test performance, suggesting that their ability to assess their own test performance was poor. Also, age related declines in hazard perception latency were not reflected in drivers’ self-ratings. Nonetheless, ratings of test performance were associated with self-reported regulation of driving, as was self-rated driving ability. These findings are consistent with the proposal that, while self-assessments of driving ability may be used by drivers to determine the degree to which they restrict their driving, the problem is that drivers have little insight into their own driving ability. This may impact on the potential road safety benefits of self-restriction of driving because drivers may not have the information needed to optimally self-restrict. Strategies for addressing this problem are discussed.

Relevance: 10.00%

Abstract:

Objective: The current study evaluated part of the Multifactorial Model of Driving Safety to elucidate the relative importance of cognitive function and a limited range of standard measures of visual function in relation to the Capacity to Drive Safely. Capacity to Drive Safely was operationalized using three validated screening measures for older drivers: an adaptation of the well-validated Useful Field of View (UFOV) and two newer measures, namely a Hazard Perception Test (HPT) and a Hazard Change Detection Task (HCDT). Method: Community-dwelling drivers (n = 297) aged 65–96 were assessed using a battery of measures of cognitive and visual function. Results: Factor analysis of these predictor variables yielded factors including Executive/Speed, Vision (measured by visual acuity and contrast sensitivity), Spatial, Visual Closure, and Working Memory. Cognitive and Vision factors explained 83–95% of age-related variance in the Capacity to Drive Safely. Spatial and Working Memory were associated with UFOV, HPT and HCDT; Executive/Speed was associated with UFOV and HCDT; and Vision was associated with HPT. Conclusion: The Capacity to Drive Safely declines with chronological age, and this decline is associated with age-related declines in several higher-order cognitive abilities involving manipulation and storage of visuospatial information under speeded conditions. There are also age-independent effects of cognitive function and vision that determine driving safety.
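
The factor-extraction step can be sketched generically as follows (placeholder data and our own parameter choices, not the study's analysis; the five components echo the factors named above):

    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler

    # Hypothetical predictor matrix: rows = drivers, columns = cognitive/visual tests
    rng = np.random.default_rng(0)
    X = rng.normal(size=(297, 10))

    scores = FactorAnalysis(n_components=5, random_state=0).fit_transform(
        StandardScaler().fit_transform(X))
    print(scores.shape)  # (297, 5): five factor scores per driver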

Relevance: 10.00%

Abstract:

The problem of steady subcritical free surface flow past a submerged inclined step is considered. The asymptotic limit of small Froude number is treated, with particular emphasis on the effect that changing the angle of the step face has on the surface waves. As demonstrated by Chapman & Vanden-Broeck (2006), the divergence of a power series expansion in powers of the square of the Froude number is caused by singularities in the analytic continuation of the free surface; for an inclined step, these singularities may correspond to either the corners or stagnation points of the step, or both, depending on the angle of incline. Stokes lines emanate from these singularities, and exponentially small waves are switched on at the point the Stokes lines intersect with the free surface. Our results suggest that for a certain range of step angles, two wavetrains are switched on, but the exponentially subdominant one is switched on first, leading to an intermediate wavetrain not previously noted. We extend these ideas to the problem of flow over a submerged bump or trench, again with inclined sides. This time there may be two, three or four active Stokes lines, depending on the inclination angles. We demonstrate how to construct a base topography such that wave contributions from separate Stokes lines are of equal magnitude but opposite phase, thus cancelling out. Our asymptotic results are complemented by numerical solutions to the fully nonlinear equations.
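
Schematically, in the standard exponential-asymptotics template (not the specific formulas of this paper), a surface quantity q is expanded in powers of the square of the Froude number, and the waves appear only in the exponentially small remainder switched on across each Stokes line:

    q \sim \sum_{n=0}^{\infty} F^{2n} q_n + \text{exponentially small terms},
    \qquad \text{waves} \sim \mathcal{A}\, \mathrm{e}^{-\chi/F^{2}},

where the singulant \chi vanishes at the corresponding corner or stagnation-point singularity of the step, and each wave contribution is switched on where \operatorname{Im}\chi = 0 and \operatorname{Re}\chi > 0 on the free surface.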

Relevance: 10.00%

Abstract:

In this paper we present a sequential Monte Carlo algorithm for Bayesian sequential experimental design applied to generalised non-linear models for discrete data. The approach is computationally convenient in that the information from newly observed data can be incorporated through a simple re-weighting step. We also consider a flexible parametric model for the stimulus-response relationship, together with a newly developed hybrid design utility that can produce more robust estimates of the target stimulus in the presence of substantial model and parameter uncertainty. The algorithm is applied to hypothetical clinical trial or bioassay scenarios. In the discussion, potential generalisations of the algorithm are suggested that could extend its applicability to a wide variety of scenarios.
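
The re-weighting step referred to above can be sketched generically (our own notation; the design utilities and model specifics are not shown):

    import numpy as np

    def smc_reweight(log_weights, particles, log_lik_new):
        """Fold a newly observed datum into an SMC particle set by re-weighting.
        log_lik_new(theta) returns the log-likelihood of the new observation."""
        log_weights = log_weights + np.array([log_lik_new(th) for th in particles])
        log_weights = log_weights - np.max(log_weights)  # numerical stability
        w = np.exp(log_weights)
        return np.log(w / w.sum())  # normalised log-weights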

Relevance: 10.00%

Abstract:

In this paper, we describe an analysis for data collected on a three-dimensional spatial lattice with treatments applied at the horizontal lattice points. Spatial correlation is accounted for using a conditional autoregressive model. Observations are defined as neighbours only if they are at the same depth. This allows the corresponding variance components to vary by depth. We use the Markov chain Monte Carlo method with block updating, together with Krylov subspace methods, for efficient estimation of the model. The method is applicable to both regular and irregular horizontal lattices and hence to data collected at any set of horizontal sites for a set of depths or heights, for example, water column or soil profile data. The model for the three-dimensional data is applied to agricultural trial data for five separate days taken roughly six months apart in order to determine possible relationships over time. The purpose of the trial is to determine a form of cropping that leads to less moist soils in the root zone and beyond. We estimate moisture for each date, depth and treatment, accounting for spatial correlation, and determine relationships of these and other parameters over time.
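
One common parameterisation of such a depth-restricted conditional autoregressive prior (a generic form, not necessarily the exact specification used here) is, for site i at depth d with n_i neighbours at that depth,

    \phi_{i,d} \mid \phi_{-i,d} \sim \mathrm{N}\!\left( \rho \,\frac{1}{n_i}\sum_{j \sim i} \phi_{j,d}, \; \frac{\sigma_d^{2}}{n_i} \right),

so that the spatial variance parameter \sigma_d^{2} is free to differ between depths, as described above.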

Relevance: 10.00%

Abstract:

Core(polyvinyl neodecanoate-ethylene glycol dimethacrylate)-shell(polyvinyl alcohol) (core(P(VND-EGDMA))-shell(PVA)) microspheres were developed by seeded polymerization using conventional free radical and RAFT/MADIX-mediated polymerization. Poly(vinyl pivalate) (PVPi) was grafted onto microspheres prepared via suspension polymerization of vinyl neodecanoate and ethylene glycol dimethacrylate. The amount of grafted polymer was found to be independent of the technique used, with conventional free radical polymerization and MADIX polymerization resulting in similar shell thicknesses. The two systems, grafting via free radical polymerization and grafting via the MADIX process, were nevertheless found to follow slightly different kinetics: while free radical polymerization resulted in a weight gain that was linear with monomer consumption in solution, growth in the MADIX-controlled system experienced a delay. The core-shell microspheres were obtained by hydrolysis of the poly(vinyl pivalate) surface-grafted brushes to form poly(vinyl alcohol). During hydrolysis the microspheres lost a significant amount of weight, consistent with the hydrolysis of 40–70% of all VPi units. Drug loading was found to be independent of the shell layer thickness, suggesting that drug loading is governed by the amount of bulk material; the shell layer does not appear to represent an obstacle to drug ingress. Cell testing using the colorectal cancer cell line HT-29 confirmed the biocompatibility of the empty microspheres, whereas the clofazimine-loaded particles led to 50% cell death, confirming release of the drug.

Relevance: 10.00%

Abstract:

PySSM is a Python package that has been developed for the analysis of time series using linear Gaussian state space models (SSMs). PySSM is easy to use; models can be set up quickly and efficiently, and a variety of different settings are available to the user. It also takes advantage of the scientific libraries NumPy and SciPy and other high-level features of the Python language. PySSM also serves as a platform for interfacing with optimised and parallelised Fortran routines. These Fortran routines make heavy use of Basic Linear Algebra Subprograms (BLAS) and Linear Algebra Package (LAPACK) functions for maximum performance. PySSM contains classes for filtering, classical smoothing and simulation smoothing.
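
For reference, the linear Gaussian state space form that such packages target is, in one common notation (PySSM's own interface is not reproduced here),

    y_t = Z_t \alpha_t + \varepsilon_t, \qquad \varepsilon_t \sim \mathrm{N}(0, H_t),
    \alpha_{t+1} = T_t \alpha_t + R_t \eta_t, \qquad \eta_t \sim \mathrm{N}(0, Q_t),

with filtering, smoothing and simulation smoothing all operating on this structure.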

Relevance: 10.00%

Abstract:

Here we present a sequential Monte Carlo (SMC) algorithm that can be used for any one-at-a-time Bayesian sequential design problem in the presence of model uncertainty where discrete data are encountered. Our focus is on adaptive design for model discrimination, but the methodology is applicable if one has a different design objective such as parameter estimation or prediction. An SMC algorithm is run in parallel for each model, and the algorithm relies on a convenient estimator of the evidence of each model which is essentially a function of importance sampling weights. Other methods for this task, such as quadrature (often used in design), suffer from the curse of dimensionality. Approximating posterior model probabilities in this way allows us to use model discrimination utility functions derived from information theory that were previously difficult to compute except for conjugate models. A major benefit of the algorithm is that it requires very little problem-specific tuning. We demonstrate the methodology on three applications, including discriminating between models for decline in motor neuron numbers in patients suffering from neurological diseases such as Motor Neuron disease.
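
Turning per-model evidence estimates into posterior model probabilities is then a simple normalisation, sketched below (hypothetical numbers and our own helper, not the paper's code):

    import numpy as np

    def posterior_model_probs(log_evidences, log_priors=None):
        """Normalise per-model log-evidence estimates into posterior model probabilities."""
        log_evidences = np.asarray(log_evidences, dtype=float)
        if log_priors is None:
            log_priors = np.zeros_like(log_evidences)  # equal prior model probabilities
        log_post = log_evidences + log_priors
        log_post = log_post - log_post.max()  # numerical stability
        p = np.exp(log_post)
        return p / p.sum()

    print(posterior_model_probs([-102.3, -100.1, -104.8]))  # hypothetical log-evidences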

Relevance: 10.00%

Abstract:

Concerns regarding groundwater contamination with nitrate and the long-term sustainability of groundwater resources have prompted the development of a multi-layered three-dimensional (3D) geological model to characterise the aquifer geometry of the Wairau Plain, Marlborough District, New Zealand. The 3D geological model, which consists of eight litho-stratigraphic units, has been subsequently used to synthesise hydrogeological and hydrogeochemical data for different aquifers in an approach that aims to demonstrate how integration of water chemistry data within the physical framework of a 3D geological model can help to better understand and conceptualise groundwater systems in complex geological settings. Multivariate statistical techniques (e.g. Principal Component Analysis and Hierarchical Cluster Analysis) were applied to groundwater chemistry data to identify hydrochemical facies which are characteristic of distinct evolutionary pathways and a common hydrologic history of groundwaters. Principal Component Analysis on hydrochemical data demonstrated that natural water-rock interactions, redox potential and human agricultural impact are the key controls on groundwater quality in the Wairau Plain. Hierarchical Cluster Analysis revealed distinct hydrochemical water quality groups in the Wairau Plain groundwater system. Visualisation of the results of the multivariate statistical analyses and of the distribution of groundwater nitrate concentrations in the context of aquifer lithology highlighted the link between groundwater chemistry and the lithology of host aquifers. The methodology followed in this study can be applied in a variety of hydrogeological settings to synthesise geological, hydrogeological and hydrochemical data and present them in a format readily understood by a wide range of stakeholders. This enables a more efficient communication of the results of scientific studies to the wider community.
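
The multivariate workflow described above can be sketched generically as follows (placeholder data and our own parameter choices, not the study's dataset or settings):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical hydrochemistry table: rows = groundwater samples, columns = analytes
    rng = np.random.default_rng(1)
    X = rng.normal(size=(120, 8))

    Z = StandardScaler().fit_transform(X)           # standardise analytes
    scores = PCA(n_components=3).fit_transform(Z)   # principal component scores
    clusters = fcluster(linkage(Z, method="ward"), t=4, criterion="maxclust")
    print(scores.shape, np.unique(clusters))        # (120, 3) and cluster labels 1..4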