624 results for wheel load distribution


Relevance: 20.00%

Publisher:

Abstract:

The primary objective of this paper is to study the use of medical image-based finite element (FE) modelling in subject-specific midsole design and optimisation for heel pressure reduction using a midsole plug under the calcaneus area (UCA). Plugs with different dimensions relative to the size of the subject's calcaneus were incorporated in the heel region of the midsole. The FE foot model was validated by comparing the numerically predicted plantar pressure with biomechanical tests conducted on the same subject. For each UCA midsole plug design, the effect of material properties and plug thickness on the plantar pressure distribution and peak pressure level during the heel strike phase of normal walking was systematically studied. The results showed that the UCA midsole insert can effectively modify the pressure distribution, and its effect is directly associated with the ratio of the plug dimension to the size of the subject's calcaneus. A medium-hardness plug sized at 95% of the calcaneus achieved the best peak-pressure relief relative to a solid midsole without a plug, whereas a smaller, very soft plug sized at 65% of the calcaneus showed the least beneficial effect.
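A minimal sketch of the comparison logic described above: tabulate peak heel pressure by plug size ratio and material hardness, then rank configurations against the solid-midsole baseline. The pressure values and configuration labels below are invented placeholders, not the paper's FE results; only the ranking logic is illustrated.

```python
# Hypothetical peak heel pressures (kPa) keyed by (plug size as a
# fraction of calcaneus size, material hardness). Values are
# illustrative placeholders only.
peak_pressure = {
    ("solid", None): 310.0,       # baseline: solid midsole, no plug
    (0.65, "soft"): 305.0,        # small, very soft plug: little relief
    (0.80, "medium"): 270.0,
    (0.95, "medium"): 240.0,
}

baseline = peak_pressure[("solid", None)]
plugs = {k: v for k, v in peak_pressure.items() if k[0] != "solid"}
best = min(plugs, key=plugs.get)
relief = 100.0 * (baseline - plugs[best]) / baseline
print(f"best plug (size ratio, hardness) = {best}: "
      f"{relief:.0f}% peak-pressure relief vs solid midsole")
```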

Relevance: 20.00%

Publisher:

Abstract:

High-resolution, USPIO-enhanced MR imaging can be used to identify inflamed atherosclerotic plaque. We report a case of a 79-year-old man with a symptomatic carotid stenosis of 82%. The plaque was retrieved for histology, and finite element analysis (FEA) based on the preoperative MR imaging was used to predict the maximal von Mises stress on the plaque. Macrophage location correlated with the maximal predicted stresses on the plaque. This supports the hypothesis that macrophages thin the fibrous cap at the points of highest stress, leading to an increased risk of plaque rupture and subsequent stroke.

Relevance: 20.00%

Publisher:

Abstract:

We consider the development of statistical models for predicting constituent concentrations of riverine pollutants, which is a key step in load estimation from frequent flow rate data and less frequently collected concentration data. We consider how to capture the impacts of past flow patterns via the average discounted flow (ADF), which discounts past flux according to the time elapsed: more recent fluxes are given more weight. However, the effectiveness of ADF depends critically on the choice of the discount factor, which reflects the unknown environmental cumulation process of the concentration compounds. We propose to choose the discount factor by maximizing the adjusted R2 value or the Nash-Sutcliffe model efficiency coefficient; the R2 values are adjusted to take account of the number of parameters in the model fit. The resulting optimal discount factor can be interpreted as a measure of the constituent exhaustion rate during flood events. To evaluate the performance of the proposed regression estimators, we examine two different sampling scenarios by resampling fortnightly and opportunistically from two real daily datasets, which come from two United States Geological Survey (USGS) gaging stations located in the Des Plaines River and Illinois River basins. The generalized rating-curve approach produces biased estimates of the total sediment loads, from -30% to 83%, whereas the new approaches produce much lower biases, ranging from -24% to 35%. This substantial improvement in the estimates of the total load is due to the fact that the predictability of concentration is greatly improved by the additional predictors.
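The ADF construction and the discount-factor selection can be sketched as follows. This assumes an exponentially discounted running average and a simple concentration-on-ADF regression; the authors' exact formulation may differ, and the flow and concentration series here are synthetic.

```python
import numpy as np

def average_discounted_flow(flow, delta):
    """Exponentially discounted average of past flows: more recent
    fluxes get more weight (a sketch of the ADF idea, not necessarily
    the authors' exact formula)."""
    adf = np.empty(len(flow))
    adf[0] = flow[0]
    for t in range(1, len(flow)):
        adf[t] = delta * adf[t - 1] + (1 - delta) * flow[t]
    return adf

def adjusted_r2(y, y_hat, n_params):
    """R2 penalized for the number of fitted parameters."""
    n = len(y)
    r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    return 1 - (1 - r2) * (n - 1) / (n - n_params - 1)

# Synthetic daily flow and concentration, purely for illustration.
rng = np.random.default_rng(0)
flow = rng.gamma(2.0, 50.0, size=365)
conc = 0.5 * flow + rng.normal(0, 10, size=365)

# Choose the discount factor by maximizing adjusted R2 of conc ~ ADF.
best_delta, best_score = None, -np.inf
for delta in np.arange(0.50, 1.00, 0.05):
    adf = average_discounted_flow(flow, delta)
    coef = np.polyfit(adf, conc, 1)
    score = adjusted_r2(conc, np.polyval(coef, adf), n_params=2)
    if score > best_score:
        best_delta, best_score = delta, score
print(f"optimal discount factor: {best_delta:.2f}")
```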

Relevance: 20.00%

Publisher:

Abstract:

We consider estimating the total load from frequent flow data but less frequent concentration data. There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are subject to large statistical biases, and their associated uncertainties are often not reported. This makes interpretation difficult and the estimation of trends or determination of optimal sampling regimes impossible to assess. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates that minimize the biases and make use of informative predictive variables. The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized rating-curve approach with additional predictors that capture unique features in the flow data, such as the concept of the first flush, the location of the event on the hydrograph (e.g. rise or fall) and the discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. Adding this information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate which incorporates model, spatial and/or temporal errors. The method also has the capacity to incorporate measurement error incurred through the sampling of flow. We illustrate this approach for two rivers delivering to the Great Barrier Reef, Queensland, Australia. One dataset, from the Burdekin River, consists of total suspended sediment (TSS), nitrogen oxides (NOx) and gauged flow for 1997. The other dataset is from the Tully River, for the period July 2000 to June 2008. For NOx in the Burdekin, the new estimates are very similar to the ratio estimates even when there is no relationship between concentration and flow. However, for the Tully dataset, incorporating the additional predictive variables, namely the discounted flow and the flow phases (rising or recessing), substantially improved the model fit, and thus the certainty with which the load is estimated.
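A hedged sketch of the rating-curve step described above: regress log-concentration on log-flow plus the extra predictors (discounted flow and a rising/falling limb indicator), then integrate predicted concentration times flow into a total load. All data below are synthetic, the discount factor is fixed arbitrarily, and the retransformation bias correction the authors would apply is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 365
flow = rng.gamma(2.0, 40.0, size=n)                 # synthetic daily flow
rising = np.r_[False, np.diff(flow) > 0].astype(float)  # hydrograph phase
disc = np.empty(n)                                  # discounted flow
disc[0] = flow[0]
for t in range(1, n):
    disc[t] = 0.9 * disc[t - 1] + 0.1 * flow[t]

# Concentration observed only on a sparse subset of days.
sampled = rng.choice(n, size=26, replace=False)
log_conc = (0.8 * np.log(flow[sampled]) - 0.3 * np.log(disc[sampled])
            + 0.2 * rising[sampled] + rng.normal(0, 0.2, size=26))

# Fit log(conc) ~ log(flow) + log(discounted flow) + rising indicator.
X = np.column_stack([np.ones(26), np.log(flow[sampled]),
                     np.log(disc[sampled]), rising[sampled]])
beta, *_ = np.linalg.lstsq(X, log_conc, rcond=None)

# Predict concentration on every day and integrate to a total load.
X_all = np.column_stack([np.ones(n), np.log(flow), np.log(disc), rising])
conc_hat = np.exp(X_all @ beta)        # note: ignores retransformation bias
total_load = np.sum(conc_hat * flow)   # daily load = concentration x flow
print(f"estimated total load: {total_load:.0f} (arbitrary units)")
```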

Relevance: 20.00%

Publisher:

Abstract:

The effects of fish density distribution and effort distribution on the overall catchability coefficient are examined. Emphasis is also placed on how aggregation and effort distribution interact to affect the overall catch rate [catch per unit effort (cpue)]. In particular, three indices are proposed, the catchability index, the knowledge parameter and the aggregation index, to describe the effectiveness of targeting and its effects on overall catchability in the stock area. Analytical expressions are provided so that these indices can easily be calculated. The average of the cpue values calculated from small units where fishing is random is a better index for measuring stock abundance. The overall cpue, the ratio of lumped catch to lumped effort, together with the average cpue, can be used to assess the effectiveness of targeting. The proposed methods are applied to commercial catch and effort data from the Australian northern prawn fishery. The indices are obtained assuming a power law for the effort distribution as an approximation of targeting during the fishing operation. Targeting increased catchability in some areas by 10%, which may have important implications for management advice.
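The overall versus averaged cpue comparison can be illustrated numerically. In the sketch below, effort is allocated across small units by a power law of local density (effort proportional to density**alpha stands in for the targeting approximation); all values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
density = rng.lognormal(0.0, 1.0, size=100)   # fish density per small unit
alpha = 1.5                                    # targeting strength (power law)
effort = density ** alpha
effort *= 1000.0 / effort.sum()                # fix the total effort
q = 0.01                                       # assumed unit catchability
catch = q * density * effort

overall_cpue = catch.sum() / effort.sum()      # ratio of lumped catch/effort
average_cpue = np.mean(catch / effort)         # mean of per-unit cpue
print(f"overall cpue:  {overall_cpue:.4f}")
print(f"average cpue:  {average_cpue:.4f}")
# A ratio above 1 indicates effort concentrated on high-density units,
# i.e. effective targeting inflating the lumped catch rate.
print(f"targeting effectiveness: {overall_cpue / average_cpue:.2f}")
```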

Relevance: 20.00%

Publisher:

Abstract:

The biomass and species composition of tropical phytoplankton in Albatross Bay, Gulf of Carpentaria, northern Australia, were examined monthly for 6 yr (1986 to 1992). Chlorophyll a (chl a) concentrations were highest (2 to 5.7 µg l⁻¹) in the wet season at inshore sites, usually coinciding with low salinities (30 to 33 ppt) and high temperatures (29 to 32°C). At the offshore sites chl a concentrations were lower (0.2 to 2 µg l⁻¹) and did not vary seasonally. Nitrate and phosphate concentrations were generally low (0 to 3.68 µM and 0.09 to 3 µM for nitrate and phosphate respectively), whereas silicate was present in concentrations in the range 0.19 to 13 µM. The phytoplankton community was dominated by diatoms, particularly at the inshore sites, as determined by a combination of microscopic and high-performance liquid chromatography (HPLC) pigment analyses. At the offshore sites the proportion of green flagellates increased. The cyanobacterial genus Trichodesmium and the diatom genera Chaetoceros, Rhizosolenia, Bacteriastrum and Thalassionema dominated the phytoplankton caught in 37 µm mesh nets; however, in contrast to many other coastal areas studied worldwide, there was no distinct species succession of the diatoms, and only Trichodesmium showed seasonal changes in abundance. This reflects a stable phytoplankton community in waters without pulses of physical and chemical disturbance. These results are discussed in the context of the commercial prawn fishery in the Gulf of Carpentaria and the possible effect of phytoplankton on prawn larval growth and survival.

Relevance: 20.00%

Publisher:

Abstract:

This paper investigates the quality of service (QoS) and resource productivity implications of transit route passenger loading and travel time. It highlights the value of the occupancy load factor as a direct passenger comfort QoS measure. Automatic Fare Collection (AFC) data for a premium radial bus route in Brisbane, Australia, are used to investigate the time-series correlation between occupancy load factor and passenger average travel time. Correlation is strong across the entire span of service in both directions. Passengers tend to be making longer, peak-direction commuter trips under significantly less comfortable conditions than off-peak. The Transit Capacity and Quality of Service Manual uses segment-based load factor as a measure of onboard loading comfort QoS. This paper provides additional insight into QoS by relating the two route-based dimensions of occupancy load factor and passenger average travel time in a two-dimensional format, from both the passenger's and the operator's perspectives. Future research will apply Value of Time to QoS measurement, reflecting perceived passenger comfort through crowding and average time spent onboard; this would also assist in transit service quality econometric modeling. The methodology can be readily applied in any practical setting where AFC data for fixed scheduled routes are available. The study outcomes also provide valuable research and development directions.
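A minimal illustration of the core measurement: the time-series correlation between occupancy load factor and passenger average travel time across the span of service. The hourly series below are synthetic stand-ins for the AFC-derived observations, with an assumed morning peak.

```python
import numpy as np

rng = np.random.default_rng(3)
hours = np.arange(6, 22)                      # span of service, 06:00-21:00

# Synthetic morning-peak load factor and a travel time rising with it.
load_factor = (0.4 + 0.5 * np.exp(-0.5 * ((hours - 8) / 1.5) ** 2)
               + rng.normal(0, 0.03, hours.size))
travel_time = 15 + 10 * load_factor + rng.normal(0, 1, hours.size)

# Time-series correlation between the two route-based dimensions.
r = np.corrcoef(load_factor, travel_time)[0, 1]
print(f"correlation(load factor, average travel time) = {r:.2f}")
```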

Relevance: 20.00%

Publisher:

Abstract:

This project was a step forward in improving the voltage profile of traditional low voltage distribution networks with high photovoltaic generation or high peak demand. As a practical and economical solution, the developed methods use a Dynamic Voltage Restorer (DVR), a series voltage compensator, for continuous and communication-less power quality enhancement. The placement of the DVR in the network is optimised in order to minimise its power rating and cost. In addition, new approaches were developed for grid synchronisation and control of the DVR, integrated with the voltage quality improvement algorithm for stable operation.
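A toy sketch of the placement trade-off on an assumed radial feeder (not the project's actual network or algorithm): a series compensator at bus k lifts every bus from k onward, so candidate locations balance the upstream voltage constraint against the section current that sets the device's rating.

```python
import numpy as np

# Assumed radial feeder with a sagging voltage profile; all numbers
# are illustrative per-unit values.
n_bus = 10
v = np.linspace(1.00, 0.90, n_bus)        # per-unit bus voltages
i_section = np.linspace(1.0, 0.1, n_bus)  # current entering each bus
v_min = 0.95                              # lower voltage limit

best_bus, best_rating = None, np.inf
for k in range(n_bus):
    if k > 0 and v[:k].min() < v_min:
        continue                          # an upstream bus is out of band
    boost = max(0.0, v_min - v[k:].min())  # series volts the DVR must inject
    rating = boost * i_section[k]          # ~ injected volts x through current
    if rating < best_rating:
        best_bus, best_rating = k, rating
print(f"place the DVR at bus {best_bus}; rating ~ {best_rating:.3f} pu")
```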

Relevance: 20.00%

Publisher:

Abstract:

Fatigue of the steel in rails continues to be of major concern to heavy haul track owners despite careful selection and maintenance of rails. The persistence of fatigue is due in part to the erroneous assumption that the maximum loads on, and stresses in, the rails are predictable. Recent analysis of extensive wheel impact detector data from a number of heavy haul tracks has shown that the most damaging forces are in fact randomly distributed in time and location and can be much greater than generally expected. Large-scale Monte Carlo simulations have been used to identify rail stresses caused by actual, measured distributions of wheel-rail forces on heavy haul tracks. The simulations show that fatigue failure of the rail foot can occur in situations which would be overlooked by traditional analyses. The most serious of these situations are those where track is accessed by multiple operators and those where there is a mix of heavy haul, general freight and/or passenger traffic. The least serious are those where the track carries single-operator-owned heavy haul unit trains. The paper shows that the nominal maximum axle load of passing traffic, the key input in traditional analyses, is insufficient on its own and must be augmented with consideration of important operational factors. Ignoring such factors can be costly.
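A hedged Monte Carlo sketch of the approach: sample wheel-rail forces from a heavy-tailed distribution standing in for measured wheel impact detector data, convert them to rail-foot stress, and estimate the probability of exceeding a fatigue limit. The distribution parameters, stress factor and limit are assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000                                # Monte Carlo sample size

# Lognormal impact factors stand in for the measured force distribution.
static_load_kn = 150.0                       # nominal wheel load basis, kN
impact_factor = rng.lognormal(mean=0.1, sigma=0.35, size=n)
wheel_force = static_load_kn * impact_factor # randomly distributed forces

stress = 1.8 * wheel_force                   # MPa per kN, assumed factor
fatigue_limit = 450.0                        # MPa, assumed

p_exceed = np.mean(stress > fatigue_limit)
print(f"P(rail-foot stress > fatigue limit) ~ {p_exceed:.2e}")
# A deterministic check using only the nominal maximum axle load would
# report a single stress value and miss this tail probability entirely.
```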

Relevance: 20.00%

Publisher:

Abstract:

To improve the rehabilitation program of individuals with transfemoral amputation fitted with a bone-anchored prosthesis based on data from direct measurements of the load applied on the residuum, we first of all need to understand the load applied on the fixation. The load applied on the residuum was therefore first measured directly during standardized activities of daily living such as straight-line level walking, ascending and descending stairs and a ramp, and walking around a circle. Building on these measurements, the load was also measured during different phases of the rehabilitation program, such as walking with walking aids and load bearing exercises.[1-15] The rehabilitation program for individuals with a transfemoral amputation fitted with an OPRA implant relies on a combination of dynamic and static load bearing exercises (LBE).[16-20] This presentation focuses on the study of a set of experimental static load bearing exercises.[1] A group of eleven individuals with unilateral transfemoral amputation fitted with an OPRA implant participated in this study. The load on the implant during the static load bearing exercises was measured using a portable system including a commercial transducer embedded in a short pylon, a laptop and a customized software package. This apparatus was previously shown to be effective in a proof-of-concept study published by Prof. Frossard.[1-9] The analysis of the static load bearing exercises covered both loading reliability and loading compliance. The analysis of loading reliability showed high reliability between the loading sessions, indicating correct repetition of the LBE by the participants.[1, 5] The analysis of loading compliance showed a significant lack of axial compliance, leading to a systematic underloading of the long axis of the implant during the proposed experimental static LBE.
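A minimal sketch of the two analyses named above, between-session reliability and axial loading compliance, using synthetic stand-ins for the transducer recordings; the prescribed target load and the session layout are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
target_n = 400.0                                       # prescribed axial load, N (assumed)
# Synthetic peak axial loads: 11 participants x 3 loading sessions,
# systematically below target to mimic the reported underloading.
sessions = target_n * rng.normal(0.85, 0.05, size=(11, 3))

# Reliability: small within-participant variation across sessions
# indicates the exercise was repeated consistently.
within_cv = sessions.std(axis=1, ddof=1) / sessions.mean(axis=1)
print(f"mean within-participant CV: {within_cv.mean():.2%}")

# Compliance: measured/prescribed ratio below 1 means underloading
# of the long axis of the implant.
compliance = sessions.mean(axis=1) / target_n
print(f"mean axial compliance: {compliance.mean():.2%}")
```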

Relevance: 20.00%

Publisher:

Abstract:

Online dynamic load modeling has become possible with the availability of Static Var Compensator (SVC) and Phasor Measurement Unit (PMU) devices. The power of the load response to the small, random, bounded voltage fluctuations caused by the SVC can be measured by the PMU for modelling purposes. The aim of this paper is to illustrate the capability of identifying an aggregated load model at the high voltage substation level in an online environment. The induction motor is used as the main test subject since it contributes the majority of the dynamic load. A test system representing a simple electromechanical generator model serving dynamic loads through the transmission network is used to verify the proposed method. In addition, dynamic loads with multiple induction motors are modeled to achieve a more realistic load representation.
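A simplified sketch of the identification step: fit a discrete-time ARX load model to PMU-style samples of bus voltage and load power by least squares. The induction-motor models in the paper are richer than this; the sketch only illustrates the online fitting idea, and all signals are simulated.

```python
import numpy as np

rng = np.random.default_rng(6)
T = 500
# Small random bounded voltage fluctuations, as an SVC would induce.
v = 1.0 + 0.01 * rng.standard_normal(T)

# Simulated "true" load response: P[t] = 0.7*P[t-1] + 0.6*V[t] - 0.3.
p = np.empty(T)
p[0] = 1.0
for t in range(1, T):
    p[t] = 0.7 * p[t - 1] + 0.6 * v[t] - 0.3 + 0.001 * rng.standard_normal()

# Identify the ARX model P[t] = a*P[t-1] + b*V[t] + c by least squares.
X = np.column_stack([p[:-1], v[1:], np.ones(T - 1)])
(a, b, c), *_ = np.linalg.lstsq(X, p[1:], rcond=None)
print(f"identified a={a:.3f}, b={b:.3f}, c={c:.3f}")  # expect ~0.7, 0.6, -0.3
```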

Relevance: 20.00%

Publisher:

Abstract:

Objective: To apply genetic analysis of genome-wide association data to study the extent and nature of a shared biological basis between migraine and coronary artery disease (CAD). Methods: Four separate methods for cross-phenotype genetic analysis were applied on data from 2 large-scale genome-wide association studies of migraine (19,981 cases, 56,667 controls) and CAD (21,076 cases, 63,014 controls). The first 2 methods quantified the extent of overlapping risk variants and assessed the load of CAD risk loci in migraineurs. Genomic regions of shared risk were then identified by analysis of covariance patterns between the 2 phenotypes and by querying known genome-wide significant loci. Results: We found a significant overlap of genetic risk loci for migraine and CAD. When stratified by migraine subtype, this was limited to migraine without aura, and the overlap was protective in that patients with migraine had a lower load of CAD risk alleles than controls. Genes indicated by 16 shared risk loci point to mechanisms with potential roles in migraine pathogenesis and CAD, including endothelial dysfunction (PHACTR1) and insulin homeostasis (GIP). Conclusions: The results suggest that shared biological processes contribute to risk of migraine and CAD, but surprisingly this commonality is restricted to migraine without aura and the impact is in opposite directions. Understanding the mechanisms underlying these processes and their opposite relationship to migraine and CAD may improve our understanding of both disorders.
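The "load of risk alleles" comparison can be sketched as a weighted allele count. The toy example below simulates genotypes and effect sizes and compares the mean CAD risk burden between two groups; the study's actual cross-phenotype methods are considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(7)
n_loci, n_cases, n_controls = 50, 1000, 1000
beta = rng.normal(0.05, 0.02, n_loci)     # simulated CAD log-odds ratios
freq = rng.uniform(0.1, 0.9, n_loci)      # simulated risk allele frequencies

def risk_load(n_people):
    """Per-person CAD risk burden: effect-size-weighted allele dosage."""
    geno = rng.binomial(2, freq, size=(n_people, n_loci))  # dosages 0/1/2
    return geno @ beta

cases, controls = risk_load(n_cases), risk_load(n_controls)
diff = cases.mean() - controls.mean()
print(f"mean CAD risk load, cases - controls: {diff:+.4f}")
# In the study, migraine-without-aura cases carried a *lower* CAD risk
# load than controls, i.e. the genetic overlap was protective.
```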

Relevance: 20.00%

Publisher:

Abstract:

The quality of species distribution models (SDMs) relies to a large degree on the quality of the input data, from bioclimatic indices to environmental and habitat descriptors (Austin, 2002). Recent reviews of SDM techniques have sought to optimize predictive performance (e.g. Elith et al., 2006). In general, SDMs employ one of three approaches to variable selection. The simplest approach relies on the expert to select the variables, as in environmental niche models (Nix, 1986) or a generalized linear model without variable selection (Miller and Franklin, 2002). A second approach explicitly incorporates variable selection into model fitting, which allows examination of particular combinations of variables; examples include generalized linear or additive models with variable selection (Hastie et al., 2002), and classification trees with complexity-based or model-based pruning (Breiman et al., 1984; Zeileis, 2008). A third approach uses model averaging to summarize the overall contribution of a variable, without considering particular combinations; examples include neural networks, boosted or bagged regression trees, and Maximum Entropy, as compared in Elith et al. (2006). Typically, users of SDMs will either consider a small number of variable sets, via the first approach, or else supply all of the candidate variables (often numbering more than a hundred) to the second or third approaches. Bayesian SDMs exist, with several methods for eliciting and encoding priors on model parameters (see the review in Low Choy et al., 2010). However, few methods have been published for informative variable selection; one example is Bayesian trees (O'Leary, 2008). Here we report an elicitation protocol that helps make explicit a priori expert judgements on the quality of candidate variables. This protocol can be flexibly applied to any of the three approaches to variable selection described above, Bayesian or otherwise. We demonstrate how this information can be obtained and then used to guide variable selection in classical or machine learning SDMs, or to define priors within Bayesian SDMs.
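One hedged way to use elicited quality scores in variable selection is sketched below: translate each expert score into a per-coefficient shrinkage penalty, so poorly rated candidate variables are penalized harder. This particular encoding (a generalized ridge) is an illustration of the idea, not the elicitation protocol the abstract reports; scores and data are invented.

```python
import numpy as np

rng = np.random.default_rng(8)
n, p = 200, 5
X = rng.standard_normal((n, p))
# Only the first two candidate variables truly matter in this toy data.
y = X @ np.array([1.0, 0.8, 0.0, 0.0, 0.0]) + rng.normal(0, 0.5, n)

# Elicited a-priori quality scores in [0, 1]; low score => heavy penalty.
expert_score = np.array([0.9, 0.8, 0.3, 0.2, 0.1])
penalty = 10.0 * (1.0 - expert_score)

# Generalized ridge: beta = (X'X + diag(penalty))^{-1} X'y, shrinking
# each coefficient in proportion to the expert's doubt about it.
beta = np.linalg.solve(X.T @ X + np.diag(penalty), X.T @ y)
for j, b in enumerate(beta):
    print(f"x{j}: score={expert_score[j]:.1f}  coef={b:+.3f}")
```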

Relevance: 20.00%

Publisher:

Abstract:

Body fat distribution is a heritable trait and a well-established predictor of adverse metabolic outcomes, independent of overall adiposity. To increase our understanding of the genetic basis of body fat distribution and its molecular links to cardiometabolic traits, here we conduct genome-wide association meta-analyses of traits related to waist and hip circumferences in up to 224,459 individuals. We identify 49 loci (33 new) associated with waist-to-hip ratio adjusted for body mass index (BMI), and an additional 19 loci newly associated with related waist and hip circumference measures (P < 5 × 10⁻⁸). In total, 20 of the 49 loci for BMI-adjusted waist-to-hip ratio show significant sexual dimorphism, 19 of which display a stronger effect in women. The identified loci were enriched for genes expressed in adipose tissue and for putative regulatory elements in adipocytes. Pathway analyses implicated adipogenesis, angiogenesis, transcriptional regulation and insulin resistance as processes affecting fat distribution, providing insight into potential pathophysiological mechanisms.

Relevance: 20.00%

Publisher:

Abstract:

The number of bidders, N, involved in a construction procurement auction is known to have an important effect on the value of the lowest bid and the mark-up applied by bidders. In practice, for example, it is important for a bidder to have a good estimate of N when bidding for a current contract. One approach, instigated by Friedman in 1956, is to make such an estimate by statistical analysis and modelling. Since then, however, finding a suitable model for N has been an enduring problem for researchers and, despite intensive research activity in the subsequent thirty years, little progress has been made, due principally to the absence of new ideas and perspectives. This paper resumes the debate by checking old assumptions, providing new evidence relating to concomitant variables and proposing a new model. To do this, and in order to assure universality, a novel approach is developed and tested using a unique set of twelve construction tender databases from four continents. This shows that the new model provides a significant advance on previous versions. Several new research questions are also posed, and other approaches are identified for future study.
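As a baseline illustration of the kind of statistical modelling at issue, the sketch below fits a Poisson model for N by maximum likelihood and checks the fit with a chi-square statistic. The counts are synthetic, and the paper's new model is not reproduced here; the Poisson is used only as the classical point of comparison.

```python
import numpy as np
from math import factorial

# Synthetic auction data standing in for a construction tender database.
rng = np.random.default_rng(9)
n_obs = rng.poisson(6.0, size=200)   # observed number of bidders per auction

lam = n_obs.mean()                   # Poisson maximum likelihood estimate

# Chi-square comparison of observed and Poisson-expected counts of N.
observed = np.bincount(n_obs)
expected = len(n_obs) * np.array(
    [lam**i * np.exp(-lam) / factorial(i) for i in range(observed.size)])
chi2 = np.sum((observed - expected) ** 2 / np.maximum(expected, 1e-9))
print(f"Poisson MLE lambda = {lam:.2f}, chi-square = {chi2:.1f}")
```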