965 results for Malthusian parameter


Relevance: 10.00%

Abstract:

Adequate blood supply and sufficient mechanical stability are necessary for timely fracture healing. Damage to vessels impairs blood supply, hindering the transport of oxygen, an essential metabolite for the cells involved in repair. The degree of mechanical stability determines the mechanical conditions in the healing tissues, which can influence tissue differentiation and may also inhibit revascularization. Knowledge of the actual conditions in a healing fracture in vivo is extremely limited. This study aimed to quantify the pressure, oxygen tension and temperature in the external callus during the early phase of bone healing. Six Merino-mix sheep underwent a tibial osteotomy. The tibia was stabilized with a standard mono-lateral external fixator. A multi-parameter catheter was placed adjacent to the osteotomy gap on the medial aspect of the tibia. Oxygen tension and temperature were measured for ten days post-operatively. Pressure was measured during gait on days three and seven. The ground reaction force and the interfragmentary movements (IFM) were measured simultaneously. The maximum pressure during gait increased (p=0.028) from day three (41.3 [29.2-44.1] mm Hg) to day seven (71.8 [61.8-84.8] mm Hg). During the same interval, there was no change (p=0.92) in the peak ground reaction force or in the interfragmentary movement (compression: p=0.59; axial rotation: p=0.11). Oxygen tension in the haematoma was initially high post-operatively (74.1 mm Hg [68.6-78.5]) and decreased steadily over the first five days. The temperature increased over the first four days before reaching a plateau at approximately 38.5 °C. This study is the first to report pressure, oxygen tension and temperature in the early callus tissues. The magnitude of pressure increased even though weight bearing and IFM remained unchanged. Oxygen tension was initially high in the haematoma and fell gradually, with a low-oxygen environment first established after four to five days. This study illustrates that, in bone healing, the local environment for cells cannot be considered constant with regard to oxygen tension, pressure and temperature.

Relevance: 10.00%

Abstract:

The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome, one of the most commonly reported eye health problems, is caused by abnormalities in the properties of the tear film. Current clinical tools to assess these properties have shown certain limitations. The traditional invasive methods for assessing tear film quality, which are used by most clinicians, have been criticized for a lack of reliability and/or repeatability, and the range of non-invasive methods investigated to date also presents limitations. Hence, no “gold standard” test is currently available to assess tear film integrity. Improving techniques for assessing tear film quality is therefore of clinical significance and is the main motivation for the work described in this thesis. In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The light is reflected at the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well-structured pattern; when the tear film surface presents irregularities, the pattern also becomes irregular due to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for evaluating all the dynamic phases of the tear film.
However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics. A set of novel routines was purposely developed to quantify changes in the reflected pattern and to extract a time-series estimate of TFSQ from the video recording. The routines extract a maximized area of analysis from each frame of the video recording, and a TFSQ metric is calculated within this area. Initially, two metrics based on Gabor filtering and Gaussian gradient techniques were used to quantify the consistency of the pattern's local orientation as a measure of TFSQ. These metrics helped demonstrate the applicability of HSV for assessing the tear film, and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear. It was also able to clearly show a difference between bare-eye and contact-lens-wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques: lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing the tear build-up time (TBUT). The capability of each non-invasive method to discriminate dry eye from normal subjects was also investigated, and receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome.
The LSI technique gave the best results under both natural and suppressed blinking conditions, closely followed by HSV; DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique identified during this clinical study was a lack of sensitivity for quantifying the build-up (formation) phase of the tear film cycle. For that reason, an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-circle pattern into an image of quasi-straight lines from which a block statistic is extracted. This metric showed better sensitivity under low pattern disturbance and also improved the performance of the ROC curves. Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully understand the HSV measurement and the instrument's potential limitations. Of special interest was the assessment of the instrument's sensitivity to subtle topographic changes. The theoretical simulations helped provide some understanding of tear film dynamics; for instance, the model extracted for the build-up phase gave insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model such time series and to extract the key clinical parameters (i.e., timing). Unfortunately, those techniques do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods; a set of guidelines is proposed to meet both criteria.
Special attention was given to a commonly used fit, the polynomial function, and to selecting an appropriate model order so that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of high-speed videokeratoscopy for assessing tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV method has shown good performance over a broad range of conditions (i.e., contact lens wearers, normal and dry eye subjects). As a result, this technique could become a useful clinical tool for assessing tear film surface quality.
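The polar-transform metric described above can be illustrated with a short sketch. The code below is not the thesis's implementation: it builds a synthetic concentric-ring image (a stand-in for a Placido disk reflection), unwraps it into (radius, angle) coordinates so that rings become near-horizontal lines, and computes a simple block statistic in which a disturbed pattern scores higher than a regular one. The function names `polar_unwrap` and `block_metric` are hypothetical.

```python
import numpy as np

def polar_unwrap(img, n_r=64, n_theta=128):
    """Resample an image around its centre onto a (radius, angle) grid,
    so concentric rings map to near-horizontal lines."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.linspace(5.0, min(cy, cx) - 1.0, n_r)
    t = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, t, indexing="ij")
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return img[ys, xs]

def block_metric(unwrapped, block=8):
    """Average within-block standard deviation along the angular direction;
    a regular ring pattern gives near-constant rows, so larger values
    indicate a more disturbed pattern."""
    n_r, n_t = unwrapped.shape
    vals = []
    for i in range(0, n_r - block + 1, block):
        for j in range(0, n_t - block + 1, block):
            vals.append(unwrapped[i:i + block, j:j + block].std(axis=1).mean())
    return float(np.mean(vals))

# Synthetic Placido-style pattern: intensity varying sinusoidally with radius.
n = 201
y, x = np.mgrid[0:n, 0:n]
rad = np.hypot(x - n // 2, y - n // 2)
clean = np.sin(rad / 3.0)
rng = np.random.default_rng(0)
noisy = clean + 0.3 * rng.standard_normal(clean.shape)  # disturbed pattern

m_clean = block_metric(polar_unwrap(clean))
m_noisy = block_metric(polar_unwrap(noisy))
print(m_clean, m_noisy)  # the disturbed pattern yields the larger metric
```

The nearest-neighbour resampling keeps the sketch short; any real instrument pipeline would use sub-pixel interpolation and a detected ring centre.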

Relevance: 10.00%

Abstract:

Urban water quality can be significantly impaired by the build-up of pollutants such as heavy metals and volatile organics on urban road surfaces due to vehicular traffic. Any control strategy for mitigating traffic-related build-up of heavy metals and volatile organic pollutants should be based on knowledge of their build-up processes. This paper presents the outcomes of a detailed experimental investigation into the build-up processes of heavy metals and volatile organics. It was found that traffic parameters such as average daily traffic, volume-over-capacity ratio and surface texture depth had similarly strong correlations with the build-up of heavy metals and volatile organics. Multicriteria decision analyses revealed that the 1–74 µm particulate fraction of total suspended solids (TSS) could be regarded as a surrogate indicator for particulate heavy metals in build-up, and the same fraction of total organic carbon could be regarded as a surrogate indicator for particulate volatile organics build-up. In terms of pollutant affinity, TSS was found to be the predominant parameter for particulate heavy metals build-up, and total dissolved solids was found to be the predominant parameter for the potential dissolved fraction of heavy metals build-up. It was also found that land use did not play a significant role in the build-up of traffic-generated heavy metals and volatile organics.

Relevance: 10.00%

Abstract:

We examine the impact of individual-specific information processing strategies (IPSs), in the form of attribute inclusion/exclusion, on the parameter estimates and behavioural outputs of models of discrete choice. Current practice assumes that individuals employ a homogeneous IPS when processing the attributes of stated choice (SC) experiments. We show how information collected exogenously to the SC experiment, on whether respondents ignored or considered each attribute, may be used in the estimation process, and how such information provides outputs that are specific to each IPS segment. We contend that accounting for the inclusion/exclusion of attributes will result in behaviourally richer population parameter estimates.
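As an illustration of how an exogenously collected attended/ignored indicator might enter the utility computation, the sketch below evaluates multinomial logit choice probabilities with a respondent-specific attendance mask that zeroes the coefficients of ignored attributes. All names and numbers are hypothetical; the paper's actual estimation procedure is not reproduced here.

```python
import numpy as np

def choice_probs(X, beta, attend):
    """Multinomial logit choice probabilities with a respondent-specific
    attribute-attendance mask.

    X      : (alternatives, attributes) attribute levels
    beta   : population coefficients
    attend : 0/1 mask from the exogenous question; 0 means the respondent
             reported ignoring that attribute, so its coefficient is
             effectively zeroed for that respondent
    """
    v = X @ (beta * attend)        # utilities with masked attributes
    e = np.exp(v - v.max())        # numerically stable softmax
    return e / e.sum()

# Hypothetical SC task: 3 alternatives described by [price, travel time]
X = np.array([[2.0, 30.0],
              [3.5, 20.0],
              [5.0, 10.0]])
beta = np.array([-0.8, -0.05])

p_full = choice_probs(X, beta, np.array([1, 1]))   # attends to both attributes
p_nopr = choice_probs(X, beta, np.array([0, 1]))   # reported ignoring price
print(p_full, p_nopr)  # ignoring price shifts choice toward the fast option
```

In estimation, each respondent's likelihood contribution would use their own mask, yielding the segment-specific outputs the abstract describes.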

Relevance: 10.00%

Abstract:

This paper maps gendered trouble spots in contemporary works of female travel. Since travel itself is a metaphor for the slippage or displacement of cultural knowledge, there is a need for closer and more complex readings of the female practice. This paper locates the contemporary female traveller on ground and on page, and flags some of the cultural myths and misconceptions that affect how she moves through the world. When women travel, they inscribe themselves across landscapes that have been previously overlooked, openly discarded and largely unexamined. In doing so, they travel intricate courses, owing to historical connections between wandering and promiscuity and continuing confusions between mobility and morality in the modern world. Taking gender as its interpretative parameter, this paper explores the troublesome nature of women's travel, citing various texts as examples.

Relevance: 10.00%

Abstract:

Maximum-likelihood estimates of the parameters of stochastic differential equations are consistent and asymptotically efficient, but unfortunately difficult to obtain if a closed-form expression for the transitional probability density function of the process is not available. As a result, a large number of competing estimation procedures have been proposed. This article provides a critical evaluation of the various estimation techniques. Special attention is given to the ease of implementation and the comparative performance of the procedures when estimating the parameters of the Cox–Ingersoll–Ross and Ornstein–Uhlenbeck equations.
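For the Ornstein–Uhlenbeck equation the transition density is Gaussian, so exact maximum-likelihood estimation reduces to a linear regression of each observation on its predecessor. A minimal sketch, assuming the parameterisation dX = θ(μ − X)dt + σ dW with made-up true values:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, mu, sigma, dt, n = 2.0, 1.0, 0.5, 0.01, 200_000

# Simulate with the exact Gaussian transition X_{t+dt} | X_t
a = np.exp(-theta * dt)
sd = sigma * np.sqrt((1.0 - a**2) / (2.0 * theta))
x = np.empty(n)
x[0] = mu
for i in range(n - 1):
    x[i + 1] = mu + a * (x[i] - mu) + sd * rng.standard_normal()

# Exact MLE: the conditional mean is linear in the previous value, so
# regress X_{t+dt} on X_t, then invert the discretisation.
x0, x1 = x[:-1], x[1:]
b, c = np.polyfit(x0, x1, 1)                 # x1 ≈ b*x0 + c
theta_hat = -np.log(b) / dt
mu_hat = c / (1.0 - b)
resid = x1 - (b * x0 + c)
sigma_hat = resid.std() * np.sqrt(2.0 * theta_hat / (1.0 - b**2))
print(theta_hat, mu_hat, sigma_hat)          # close to (2.0, 1.0, 0.5)
```

For processes like Cox–Ingersoll–Ross, whose transition density is non-central chi-squared, this shortcut no longer applies, which is precisely why the approximate procedures the article surveys exist.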

Relevance: 10.00%

Abstract:

Optimal design for generalized linear models has primarily focused on univariate data. Often experiments are performed that have multiple dependent responses described by regression type models, and it is of interest and of value to design the experiment for all these responses. This requires a multivariate distribution underlying a pre-chosen model for the data. Here, we consider the design of experiments for bivariate binary data which are dependent. We explore Copula functions which provide a rich and flexible class of structures to derive joint distributions for bivariate binary data. We present methods for deriving optimal experimental designs for dependent bivariate binary data using Copulas, and demonstrate that, by including the dependence between responses in the design process, more efficient parameter estimates are obtained than by the usual practice of simply designing for a single variable only. Further, we investigate the robustness of designs with respect to initial parameter estimates and Copula function, and also show the performance of compound criteria within this bivariate binary setting.
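As a sketch of the copula construction (not the paper's design algorithm), the snippet below derives the joint cell probabilities of two dependent binary responses from their marginals using a Frank copula; the marginal probabilities and dependence parameter are illustrative.

```python
import numpy as np

def frank_copula(u, v, theta):
    """Frank copula C(u, v); theta != 0, with theta > 0 giving
    positive dependence."""
    num = np.expm1(-theta * u) * np.expm1(-theta * v)
    return -np.log1p(num / np.expm1(-theta)) / theta

def bivariate_binary_cells(p1, p2, theta):
    """Joint probabilities for (Y1, Y2) with marginals P(Yi = 1) = pi.
    With Yi = 1 corresponding to Ui <= pi, P(Y1=1, Y2=1) = C(p1, p2)."""
    p11 = frank_copula(p1, p2, theta)
    p10 = p1 - p11
    p01 = p2 - p11
    p00 = 1.0 - p11 - p10 - p01
    return np.array([[p00, p01],
                     [p10, p11]])

cells = bivariate_binary_cells(0.6, 0.3, theta=4.0)  # positive dependence
print(cells, cells.sum())  # cells sum to 1; p11 exceeds the product 0.18
```

In an optimal-design setting, these cell probabilities would feed the likelihood and hence the information matrix that the design criterion maximises.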

Relevance: 10.00%

Abstract:

Small element spacing in compact arrays results in strong mutual coupling between array elements. Performance degradation associated with the strong coupling can be avoided through the introduction of a decoupling network consisting of interconnected reactive elements. We present a systematic design procedure for decoupling networks of symmetrical arrays with more than three elements and characterized by circulant scattering parameter matrices. The elements of the decoupling network are obtained through repeated decoupling of the characteristic eigenmodes of the array, which allows the calculation of element values using closed-form expressions.
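The key property exploited here is that a circulant scattering matrix is diagonalised by the DFT basis, so the modal (eigenmode) reflection coefficients are simply the DFT of its first row. A small numerical sketch with a made-up circulant S-matrix for a symmetric 4-element array:

```python
import numpy as np

# Hypothetical circulant scattering matrix of a symmetric 4-element array:
# the first row holds the self term and the couplings to the other elements
# (equal couplings to the two neighbours, as symmetry requires).
row = np.array([0.1 + 0.2j, 0.4 - 0.1j, 0.05 + 0.0j, 0.4 - 0.1j])
n = len(row)
S = np.array([[row[(j - i) % n] for j in range(n)] for i in range(n)])

# Circulant matrices are diagonalised by the DFT matrix: the eigenvalues
# (modal reflection coefficients) are the DFT of the first row, and the
# eigenvectors are the DFT basis vectors (the characteristic eigenmodes).
eigvals = np.fft.fft(row)
F = np.fft.fft(np.eye(n), axis=0) / np.sqrt(n)   # unitary DFT basis

D = F.conj().T @ S @ F                            # should be diagonal
print(np.round(np.diag(D), 6))
```

Decoupling the array then amounts to matching each of these modal reflection coefficients individually, which is what allows closed-form element values in the repeated-decoupling procedure.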

Relevance: 10.00%

Abstract:

In many bridges, vertical displacement is the most relevant parameter to monitor in both the short and long term. However, vertical displacements of bridges are difficult to measure, yet they are among the most important indicators of structural behaviour. This prompts a need to develop a simple, inexpensive and more practical method for measuring the vertical displacements of bridges. With the development of fibre-optic technologies, fibre Bragg grating (FBG) sensors have been widely used in structural health monitoring. The advantages of these sensors over conventional sensors include multiplexing capability, high sample rate, small size and immunity to electromagnetic interference (EMI). In this paper, methods of measuring vertical displacements of bridges are first reviewed. FBG technology is then briefly introduced, including its principle, sensing system, characteristics and the different types of FBG sensors. Finally, a methodology for vertical displacement measurement using FBG sensors is presented and a trial test is described. It is concluded that FBG sensors are a feasible means of measuring the vertical displacements of bridges. The method can be used to understand the global behaviour of a bridge's span and can be further developed for structural health monitoring techniques such as damage detection.
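For context, FBG strain sensing rests on the standard relation Δλ/λ_B = (1 − p_e)ε, ignoring temperature effects. A minimal sketch with illustrative values (not data from the paper):

```python
# Convert an FBG Bragg-wavelength shift to mechanical strain using the
# standard relation  d_lambda / lambda_B = (1 - p_e) * strain,
# with temperature effects ignored. All values below are illustrative.

LAMBDA_B = 1550.00e-9   # nominal Bragg wavelength [m]
P_E = 0.22              # effective photo-elastic coefficient of silica fibre

def strain_from_shift(d_lambda):
    """Strain implied by a measured Bragg wavelength shift d_lambda [m]."""
    return d_lambda / (LAMBDA_B * (1.0 - P_E))

shift = 0.12e-9                      # hypothetical 0.12 nm measured shift
eps = strain_from_shift(shift)
print(f"{eps * 1e6:.1f} microstrain")
```

Going from such strain readings to span deflection requires integrating curvature inferred from strains at known positions, which is the methodological step the paper develops.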

Relevance: 10.00%

Abstract:

The size of rat-race and branch-line couplers can be reduced by using periodic loading or artificial transmission lines. The objective of this work is to extend the idea of size reduction through periodic loading to coupled-line 90° hybrids. A procedure for extracting the characteristic parameters of a coupled-line four-port from a single set of S-parameters is described. This method can be employed in the design of coupled artificial transmission line couplers of arbitrary geometry. The procedure is illustrated through the design of a broadside-coupled stripline hybrid, periodically loaded with stubs. Measured results for a prototype coupler confirm the validity of the theory.
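The extraction idea can be sketched in simplified form for an ordinary (uncoupled) two-port: compute the S-parameters of an ideal line, convert back to ABCD parameters, and recover the characteristic impedance as sqrt(B/C) and the electrical length from A. This is a textbook reduction, not the paper's four-port procedure:

```python
import numpy as np

ZR = 50.0  # reference impedance of the S-parameter system

def line_sparams(z0, theta):
    """S-parameters of an ideal lossless line of characteristic impedance
    z0 and electrical length theta (radians), referenced to ZR."""
    A, B = np.cos(theta), 1j * z0 * np.sin(theta)
    C, D = 1j * np.sin(theta) / z0, np.cos(theta)
    den = A + B / ZR + C * ZR + D
    s11 = (A + B / ZR - C * ZR - D) / den
    s21 = 2.0 / den                      # reciprocal network: AD - BC = 1
    return s11, s21

def extract(s11, s21):
    """Recover (z0, theta) from the S-parameters of a reciprocal,
    symmetric two-port by converting back to ABCD parameters."""
    s22, s12 = s11, s21                  # symmetry and reciprocity assumed
    A = ((1 + s11) * (1 - s22) + s12 * s21) / (2.0 * s21)
    B = ZR * ((1 + s11) * (1 + s22) - s12 * s21) / (2.0 * s21)
    C = ((1 - s11) * (1 - s22) - s12 * s21) / (2.0 * s21) / ZR
    z0 = np.sqrt(B / C)                  # for a line, B/C = z0**2
    theta = np.arccos(A.real)
    return z0.real, theta

s11, s21 = line_sparams(75.0, 1.1)
z0_hat, theta_hat = extract(s11, s21)
print(z0_hat, theta_hat)  # recovers 75 ohms and 1.1 rad
```

The paper's method generalises this round trip to the even/odd (coupled) modes of a four-port of arbitrary geometry.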

Relevance: 10.00%

Abstract:

This project was a step forward in the examination and identification of key variables on the perception, decision making and action of team sport athletes through theoretical insights provided by the ecological dynamics perspective. The methodology drew on experiential knowledge of elite coaches to drive further empirical investigation into the specific task, environmental and personal constraints that shape the behaviour of athletes in specific performance contexts. The thesis has provided an effective rationale for further investigation into the emergent perception, decision making and action demanded of athletes in these unpredictable, fluent, fast-paced environments.

Relevance: 10.00%

Abstract:

Columns are among the key load-bearing elements that are highly susceptible to vehicle impacts. The resulting severe damage to columns may lead to failures of the supporting structure that are catastrophic in nature. However, the columns in existing structures are seldom designed for impact, owing to inadequacies of the design guidelines. The impact behaviour of columns designed for gravity loads and for actions other than impact is therefore of interest. A comprehensive investigation is conducted on reinforced concrete columns, with a particular focus on the vulnerability of exposed columns and on mitigation techniques under low- to medium-velocity car and truck impacts. The investigation is based on non-linear explicit computer simulations of impacted columns, followed by a comprehensive validation process. The impact is simulated using force pulses generated from full-scale vehicle impact tests. A material model capable of simulating triaxial loading conditions is used in the analyses. Circular columns adequate in capacity for five- to twenty-storey buildings, designed according to Australian standards, are considered. The crucial parameters associated with routine column design, and the different load combinations applied at the serviceability stage on typical columns, are considered in detail. Axially loaded columns are examined at the initial stage, and the investigation is extended to analyse impact behaviour under single-axis bending and biaxial bending. The reduction of impact capacity under varying axial loads is also investigated. The effects of the various load combinations are quantified, and the residual capacity of the impacted columns, based on the status of the damage, and mitigation techniques are also presented.
In addition, the contribution of each individual parameter to the failure load is scrutinized, and analytical equations are developed to identify the critical impulses in terms of the geometrical and material properties of the impacted column. In particular, an innovative technique was developed to improve the accuracy of the equations where other techniques fail due to the shape of the error distribution. Importantly, the equations can be used to quantify the critical impulse for three consecutive points (load combinations) located on the interaction diagram of a particular column; linear interpolation can then be used to quantify the critical impulse for loading points located in between on the interaction diagram. Given a known force and impulse pair for an average impact duration, the method can be extended to assess the vulnerability of columns for a general vehicle population, based on an analytical method that quantifies the critical peak forces under different impact durations. The contribution of this research is therefore not limited to producing simplified yet rational design guidelines and equations; it also provides a comprehensive solution for quantifying impact capacity while delivering new insight to the scientific community on dealing with impacts.
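The interpolation step described above can be sketched in a few lines; the load ratios and impulse values below are made up purely for illustration.

```python
import numpy as np

# Hypothetical critical impulses (kN·s) computed at three load-combination
# points on a column's axial load-moment interaction diagram, indexed here
# by the axial load ratio at each point.
axial_ratio = np.array([0.2, 0.5, 0.8])
critical_impulse = np.array([14.0, 11.5, 8.0])

# Linear interpolation for an intermediate serviceability load combination
i_crit = np.interp(0.65, axial_ratio, critical_impulse)
print(i_crit)  # 9.75 kN·s for this made-up data
```

Paired with an average impact duration, such an interpolated impulse gives the critical peak force used in the vulnerability assessment.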

Relevance: 10.00%

Abstract:

A model for drug diffusion from a spherical polymeric drug delivery device is considered. The model contains two key features. The first is that solvent diffuses into the polymer, which then transitions from a glassy to a rubbery state. The interface between the two states of polymer is modelled as a moving boundary, whose speed is governed by a kinetic law; the same moving boundary problem arises in the one-phase limit of a Stefan problem with kinetic undercooling. The second feature is that drug diffuses only through the rubbery region, with a nonlinear diffusion coefficient that depends on the concentration of solvent. We analyse the model using both formal asymptotics and numerical computation, the latter by applying a front-fixing scheme with a finite volume method. Previous results are extended and comparisons are made with linear models that work well under certain parameter regimes. Finally, a model for a multi-layered drug delivery device is suggested, which allows for more flexible control of drug release.
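A heavily simplified sketch of the moving-boundary idea is given below: a planar (not spherical) one-phase problem with a kinetic-undercooling condition, solved by front-fixing and explicit finite differences. It omits the drug species and the finite volume discretisation used in the paper, and every parameter value is illustrative.

```python
import numpy as np

# Planar one-phase toy problem: solvent concentration u(x, t) on 0 < x < s(t),
# with u(0, t) = 1 at the outer surface, Stefan law s'(t) = -u_x(s, t), and
# kinetic undercooling u(s, t) = -eps * s'(t). Front-fixing maps x to
# xi = x / s(t), turning the PDE into u_t = u_xixi / s^2 + xi (s'/s) u_xi
# on the fixed domain 0 < xi < 1.

N, eps = 50, 5e-4
dxi = 1.0 / N
xi = np.linspace(0.0, 1.0, N + 1)
s = 0.1                        # initial rubbery-layer thickness
u = 1.0 - xi                   # initial linear solvent profile
dt = 1.5e-6                    # within the explicit limit 0.5 * (s * dxi)**2
for step in range(20000):
    # Solve front speed and undercooled boundary value together: a one-sided
    # difference for u_x(s) with u(s) = -eps*v gives v = u[-2]/(s*dxi - eps),
    # which requires s*dxi > eps (true here).
    v = u[-2] / (s * dxi - eps)
    s += dt * v
    u_new = u.copy()
    u_new[-1] = -eps * v       # kinetic undercooling at the moving front
    u_new[1:-1] = u[1:-1] + dt * (
        (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dxi**2 / s**2
        + xi[1:-1] * (v / s) * (u[2:] - u[:-2]) / (2.0 * dxi)
    )
    u = u_new
print(s)  # the rubbery layer thickens, roughly like sqrt(t)
```

In the full model the drug concentration would be advanced on the same fixed grid, with its nonlinear diffusivity evaluated from the local solvent concentration.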

Relevance: 10.00%

Abstract:

Gradient-based approaches to direct policy search in reinforcement learning have received much recent attention as a means to solve problems of partial observability and to avoid some of the problems associated with policy degradation in value-function methods. In this paper we introduce GPOMDP, a simulation-based algorithm for generating a biased estimate of the gradient of the average reward in Partially Observable Markov Decision Processes (POMDPs) controlled by parameterized stochastic policies. A similar algorithm was proposed by Kimura, Yamamura, and Kobayashi (1995). The algorithm's chief advantages are that it requires storage of only twice the number of policy parameters, uses one free parameter β ∈ [0,1) (which has a natural interpretation in terms of bias-variance trade-off), and requires no knowledge of the underlying state. We prove convergence of GPOMDP, and show how the correct choice of the parameter β is related to the mixing time of the controlled POMDP. We briefly describe extensions of GPOMDP to controlled Markov chains, continuous state, observation and control spaces, multiple agents, higher-order derivatives, and a version for training stochastic policies with internal states. In a companion paper (Baxter, Bartlett, & Weaver, 2001) we show how the gradient estimates generated by GPOMDP can be used in both a traditional stochastic gradient algorithm and a conjugate-gradient procedure to find local optima of the average reward. ©2001 AI Access Foundation and Morgan Kaufmann Publishers. All rights reserved.
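A minimal sketch of the GPOMDP estimator, run here on a toy two-armed bandit (a degenerate POMDP, chosen only to keep the example short): the discounted eligibility trace z and the running average Δ follow the update rules the paper describes, and no state information is used.

```python
import numpy as np

def gpomdp_estimate(theta, beta=0.5, T=50_000, seed=0):
    """GPOMDP-style estimate of the gradient of the average reward for a
    softmax policy on a toy two-armed bandit (arm reward probabilities
    0.2 and 0.8). Illustrative only; the paper treats general POMDPs."""
    rng = np.random.default_rng(seed)
    p_reward = np.array([0.2, 0.8])
    z = np.zeros_like(theta)                # discounted eligibility trace
    delta = np.zeros_like(theta)            # running gradient estimate
    for t in range(T):
        pi = np.exp(theta - theta.max())
        pi /= pi.sum()
        a = rng.choice(2, p=pi)
        r = float(rng.random() < p_reward[a])
        grad_log = -pi.copy()
        grad_log[a] += 1.0                  # grad of log pi(a) for softmax
        z = beta * z + grad_log             # z_{t+1} = beta*z_t + grad_log
        delta += (r * z - delta) / (t + 1)  # running average of r_t * z_t
    return delta

g = gpomdp_estimate(np.zeros(2))
print(g)  # should point toward the better arm (index 1)
```

Larger β trades bias for variance, exactly the trade-off the paper relates to the mixing time of the controlled chain; in this memoryless toy, β mainly inflates variance.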

Relevance: 10.00%

Abstract:

Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive semidefinite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space, which are classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via semidefinite programming (SDP) techniques. When applied to a kernel matrix associated with both training and test data, this gives a powerful transductive algorithm: using the labeled part of the data one can learn an embedding also for the unlabeled part. The similarity between test points is inferred from training points and their labels. Importantly, these learning problems are convex, so we obtain a method for learning both the model class and the function without local minima. Furthermore, this approach leads directly to a convex method for learning the 2-norm soft margin parameter in support vector machines, solving an important open problem.
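The paper's formulation is a semidefinite program; as a simplified, assumption-laden stand-in, the sketch below searches over convex combinations of candidate kernel matrices for the combination whose alignment with the label kernel yy^T is highest. A grid search replaces the SDP solver purely to keep the example dependency-free; it is not the paper's algorithm.

```python
import numpy as np

def alignment(K, target):
    """Empirical kernel-target alignment <K, T>_F / (||K||_F ||T||_F)."""
    return np.sum(K * target) / (np.linalg.norm(K) * np.linalg.norm(target))

rng = np.random.default_rng(0)
# Toy data: two Gaussian blobs labelled +1 / -1
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([1.0] * 20 + [-1.0] * 20)
yyT = np.outer(y, y)                      # the label (target) kernel

# Candidate kernel matrices: linear and two RBF widths
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
kernels = [X @ X.T, np.exp(-d2 / 2.0), np.exp(-d2 / 20.0)]

# Grid-search convex combinations (a crude stand-in for the SDP):
best_w, best_a = None, -np.inf
for w1 in np.linspace(0.0, 1.0, 11):
    for w2 in np.linspace(0.0, 1.0 - w1, int(round((1.0 - w1) * 10)) + 1):
        w = np.array([w1, w2, 1.0 - w1 - w2])
        K = sum(wi * Ki for wi, Ki in zip(w, kernels))
        a = alignment(K, yyT)
        if a > best_a:
            best_w, best_a = w, a
print(best_w, best_a)  # combined kernel does at least as well as any single one
```

Because the grid contains the simplex vertices, the learned combination's alignment is never worse than that of any individual candidate kernel, which is the basic promise of learning the kernel matrix rather than fixing it in advance.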