954 results for "Roadway live load model"
Abstract:
Study Design. An experimental animal study. Objective. To investigate histomorphometric and radiographical changes in the BB.4S rat model after PEEK (polyetheretherketone) nonfusion interspinous device implantation. Summary of Background Data. Clinical effectiveness of the PEEK nonfusion spine implant Wallis (Abbott, Bordeaux, France; now Zimmer, Warsaw, IN) is well documented. However, there is a lack of evidence on the long-term effects of this implant on bone, in particular its influence on structural changes of bone elements of the lumbar spine. Methods. Twenty-four male BB.4S rats aged 11 weeks underwent surgery for implantation of a PEEK nonfusion interspinous device or for a sham procedure in 3 groups of 8 animals each: 1) implantation at level L4–L5; 2) implantation at level L5–L6; and 3) sham surgery. Eleven weeks postoperatively, osteolyses at the implant-bone interface were measured via radiograph, bone mineral density of vertebral bodies was analyzed using osteodensitometry, and bone mineral content as well as resorption of the spinous processes were examined by histomorphometry. Results. Resorption of the spinous processes at the site of the interspinous implant was found in all treated segments. There was no significant difference in either bone density of vertebral bodies or histomorphometric structure of the spinous processes between adjacent vertebral bodies, between treated and untreated segments, or between groups. Conclusion. These findings indicate that resorption of the spinous processes and, as a result, implant loosening inhibit the targeted load redistribution through the PEEK nonfusion interspinous device in the lumbar spinal segment of the rat. This leads to reduced long-term stability of the implant in the animal model. These results suggest that PEEK nonfusion interspinous devices such as the Wallis implant may have time-limited effects and should only be used for specified indications.
Abstract:
The purpose of this prospective observational field study was to present a model for measuring energy expenditure among nurses and to determine whether there was a difference between the energy expenditure of nurses providing direct care to adult patients on general medical-surgical units in two major metropolitan hospitals and a recommended energy expenditure of 3.0 kcal/minute over 8 hours. One-third of the predicted cycle ergometer VO2max for the study population was used to calculate the recommended energy expenditure. Two methods were used to measure energy expenditure among participants during an 8-hour day shift. First, the Energy Expenditure Prediction Program (EEPP) developed by the University of Michigan Center for Ergonomics was used to calculate energy expenditure from activity recordings made during observation (OEE; n = 39). The second method used ambulatory electrocardiography and the heart rate-oxygen consumption relationship (HREE; n = 20) to measure energy expenditure. It was concluded that energy expenditure among nurses can be estimated using the EEPP. Using classification systems from previous research, work load among the study population was categorized as "moderate" but was significantly lower (p = 0.021) than 3.0 kcal/minute over 8 hours, or 1/3 of the predicted VO2max. In addition, the relationships between OEE, body-part discomfort (BPCDS) and mental work load (MWI) were evaluated. The relationships between OEE/BPCDS and OEE/MWI were not significant (p = 0.062 and 0.091, respectively). Among the study population, body-part discomfort increased significantly for the upper arms, mid-back, lower back, legs and feet by mid-shift, and by the end of the shift the increase was also significant for the neck and thighs. The study also provided documentation of a comprehensive list of nursing activities. Among the most important findings were that the study population spent 23% of the workday in a bent posture, walked an average of 3.14 miles, and spent two-thirds of the shift on activities other than direct patient care, such as paperwork and communicating with other departments. A discussion is provided regarding the ergonomic implications of these findings.
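For orientation, the recommended 3.0 kcal/minute threshold was derived as one-third of the predicted cycle-ergometer VO2max. A minimal sketch of that arithmetic follows, assuming a body-mass-relative VO2max and the common approximation of roughly 5 kcal per litre of oxygen consumed; the inputs are invented for illustration and are not the study's measured values.

```python
# Hypothetical illustration of deriving a recommended energy-expenditure
# threshold as one-third of predicted VO2max. Inputs are assumed values,
# not the study population's data.

KCAL_PER_LITRE_O2 = 5.0  # common approximation for mixed-substrate metabolism

def recommended_kcal_per_min(vo2max_ml_kg_min: float, body_mass_kg: float) -> float:
    """One-third of predicted VO2max, expressed as kcal/minute."""
    vo2_third_l_per_min = (vo2max_ml_kg_min / 3.0) * body_mass_kg / 1000.0  # L O2/min
    return vo2_third_l_per_min * KCAL_PER_LITRE_O2

# Example with assumed inputs: 65 kg and a predicted VO2max of 28 mL/kg/min
print(round(recommended_kcal_per_min(28.0, 65.0), 2))  # ~3.03 kcal/min
```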
Abstract:
INTRODUCTION: The objective of this study was to evaluate the effects of two different mean arterial blood pressure (MAP) targets on needs for resuscitation, organ dysfunction, mitochondrial respiration and inflammatory response in a long-term model of fecal peritonitis. METHODS: Twenty-four anesthetized and mechanically ventilated pigs were randomly assigned (n = 8/group) to a septic control group (septic-CG) without resuscitation until death, or to one of two groups in which resuscitation was performed for 48 hours after 12 hours of untreated sepsis, targeting MAP 50-60 mmHg (low-MAP) or 75-85 mmHg (high-MAP). RESULTS: MAP at the end of resuscitation was 56 ± 13 mmHg (mean ± SD) and 76 ± 17 mmHg, respectively, for the low-MAP and high-MAP groups. One animal each in the high- and low-MAP groups, and all animals in septic-CG, died (median survival time: 21.8 hours, inter-quartile range: 16.3-27.5 hours). Norepinephrine was administered to all animals of the high-MAP group (0.38 (0.21-0.56) mcg/kg/min) and to three animals of the low-MAP group (0.00 (0.00-0.25) mcg/kg/min; P = 0.009). The high-MAP group had a more positive fluid balance (3.3 ± 1.0 mL/kg/h vs. 2.3 ± 0.7 mL/kg/h; P = 0.001). Inflammatory markers, skeletal muscle ATP content and hemodynamics other than MAP did not differ between the low- and high-MAP groups. The incidence of acute kidney injury (AKI) after 12 hours of untreated sepsis was, respectively for the low- and high-MAP groups, 50% (4/8) and 38% (3/8), and at the end of the study 57% (4/7) and 0% (P = 0.026). In septic-CG, maximal isolated skeletal muscle mitochondrial Complex I, State 3 respiration increased from 1357 ± 149 pmol/s/mg to 1822 ± 385 pmol/s/mg (P = 0.020). In the high- and low-MAP groups, Complex IV, State 3 respiration of permeabilized skeletal muscle fibers increased during resuscitation (P = 0.003). CONCLUSIONS: The MAP targets during resuscitation did not alter the inflammatory response, nor did they affect skeletal muscle ATP content or mitochondrial respiration. While targeting a lower MAP was associated with an increased incidence of AKI, targeting a higher MAP resulted in an increased net positive fluid balance and vasopressor load during resuscitation. The long-term effects of different MAP targets need to be evaluated in further studies.
Abstract:
Tick-borne encephalitis virus (TBEV) is the causative agent of human TBE, a severe infection that can cause long-lasting neurologic sequelae. Langat virus (LGTV), which is closely related to TBEV, has low virulence for human hosts and has been used as a live vaccine against TBEV. Tick-borne encephalitis caused by natural LGTV infection in humans has not been described, but one of 18,500 LGTV vaccinees developed encephalitis. The pathogenetic mechanisms of TBEV are poorly understood and, currently, no effective therapy is available. We developed an infant rat model of TBE using LGTV as the infective agent. Infant Wistar rats were inoculated intracisternally with 10 focus-forming units of LGTV and assessed for clinical disease and neuropathologic findings at Days 2, 4, 7, and 9 after infection. Infection with LGTV led to gait disturbance, hypokinesia, and reduced weight gain or weight loss. Cerebrospinal fluid concentrations of RANTES, interferon-γ, interferon-β, interleukin-6, and monocyte chemotactic protein-1 were increased in infected animals. The brains of animals with LGTV encephalitis exhibited characteristic perivascular inflammatory cuffs and glial nodules; immunohistochemistry documented the presence of LGTV in the thalamus, hippocampus, midbrain, frontal pole, and cerebellum. Thus, LGTV meningoencephalitis in infant rats mimics important clinical and histopathologic features of human TBE. This new model provides a tool to investigate disease mechanisms and to evaluate new therapeutic strategies against encephalitogenic flaviviruses.
Abstract:
OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ counts to monitor ART. We assessed the benefit of replacing CD4⁺ with viral load monitoring. DESIGN A mathematical modelling study. METHODS A simulation model of HIV progression over 5 years in children on ART, parameterized by data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ or 6- or 12-monthly viral load monitoring and compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure and assumed that the rate is higher with CD4⁺ than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6% to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ monitoring compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but it reduced time on failing ART, improved immunological response and increased switching to second-line ART.
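The abstract reports outputs of the cohort-parameterized simulation model without describing its internals. Purely as a hypothetical sketch of the mechanism being compared (failure accrues until a monitoring visit detects it and triggers a switch), the Monte Carlo fragment below uses invented failure and detection rates, not the study's parameters.

```python
import random

# Hypothetical sketch: months spent on failing ART under different monitoring
# schedules. Rates below are illustrative assumptions, not cohort estimates.

def months_on_failing_art(visit_interval_m: int, detect_prob: float,
                          fail_rate_per_m: float = 0.01, horizon_m: int = 60) -> int:
    """Simulate one child over a 5-year horizon; return months on failing ART."""
    failed = False
    months_failing = 0
    for month in range(1, horizon_m + 1):
        if not failed and random.random() < fail_rate_per_m:
            failed = True                                  # virological failure occurs
        if failed:
            months_failing += 1
            at_visit = month % visit_interval_m == 0
            if at_visit and random.random() < detect_prob:
                break                                      # detected -> switch to second line
    return months_failing

def mean_months(visit_interval_m, detect_prob, n=20_000):
    return sum(months_on_failing_art(visit_interval_m, detect_prob) for _ in range(n)) / n

# Viral load monitoring detects failure reliably; CD4-based monitoring often misses it.
print("12-monthly viral load:", round(mean_months(12, 0.95), 1))
print("6-monthly CD4:        ", round(mean_months(6, 0.30), 1))
```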
Abstract:
The BepiColombo Laser Altimeter (BELA) has been selected to fly on ESA's BepiColombo mission to Mercury. The instrument will be the first European laser altimeter designed for interplanetary flight. This paper describes the setup used to characterize the angular movements of BELA under the simulated environmental conditions that the instrument will encounter when orbiting Mercury. The system comprises a laser transmitter and a receiving telescope, which can move with respect to each other under thermal load. Tests performed using the Engineering Qualification Model show that the setup is accurate enough to characterize angular movements of the instrument components to an accuracy of ≈10 μrad. The qualification model instrument is thermally stable enough to operate during all mission phases around Mercury, demonstrating that the transmitter and receiver sections will remain within the alignment requirements throughout the mission.
Abstract:
We developed a model to calculate a quantitative risk score for individual aquaculture sites. The score indicates the risk of the site being infected with a specific fish pathogen (viral haemorrhagic septicaemia virus (VHSV), infectious haematopoietic necrosis virus, or koi herpes virus) and is intended to be used for risk-ranking sites to support surveillance for demonstration of zone or member state freedom from these pathogens. The inputs to the model include a range of quantitative and qualitative estimates of risk factors organised into five risk themes: (1) Live fish and egg movements; (2) Exposure via water; (3) On-site processing; (4) Short-distance mechanical transmission; and (5) Distance-independent mechanical transmission. The calculated risk score for an individual aquaculture site is a value between zero and one and is intended to indicate the risk of a site relative to the risk of other sites (thereby allowing ranking). The model was applied to evaluate 76 rainbow trout farms in 3 countries (42 from England, 32 from Italy and 2 from Switzerland) with the aim of establishing their risk of being infected with VHSV. Risk scores for farms in England and Italy showed great variation, clearly enabling ranking. Scores ranged from 0.002 to 0.254 (mean score 0.080) in England and from 0.011 to 0.778 (mean score 0.130) in Italy, reflecting the diversity of infection status of farms in these countries. Requirements for broader application of the model are discussed. Cost-efficient farm data collection is important to realise the benefits of a risk-based approach.
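The aggregation rule behind the score is not given in the abstract. As a hypothetical sketch only, one plausible scheme is a weighted average of the five theme scores, each scaled to [0, 1]; the weights and farm data below are invented for illustration and do not come from the published model.

```python
# Hypothetical sketch of a site-level risk score built from five theme scores.
# Weights and farm data are invented; they are not the published model's values.

THEMES = ["live_fish_and_egg_movements", "exposure_via_water", "on_site_processing",
          "short_distance_mechanical", "distance_independent_mechanical"]
WEIGHTS = dict(zip(THEMES, [0.35, 0.30, 0.10, 0.15, 0.10]))  # assumed, sum to 1

def site_risk_score(theme_scores: dict) -> float:
    """Weighted average of theme scores already scaled to [0, 1]; result is in [0, 1]."""
    return sum(WEIGHTS[t] * theme_scores[t] for t in THEMES)

farms = {
    "farm_A": dict(zip(THEMES, [0.40, 0.10, 0.00, 0.20, 0.05])),
    "farm_B": dict(zip(THEMES, [0.05, 0.30, 0.10, 0.05, 0.10])),
}
for farm in sorted(farms, key=lambda f: site_risk_score(farms[f]), reverse=True):
    print(farm, round(site_risk_score(farms[farm]), 3))   # ranked, highest risk first
```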
Abstract:
Trabecular bone is a porous mineralized tissue playing a major load bearing role in the human body. Prediction of age-related and disease-related fractures and the behavior of bone implant systems needs a thorough understanding of its structure-mechanical property relationships, which can be obtained using microcomputed tomography-based finite element modeling. In this study, a nonlinear model for trabecular bone as a cohesive-frictional material was implemented in a large-scale computational framework and validated by comparison of μFE simulations with experimental tests in uniaxial tension and compression. A good correspondence of stiffness and yield points between simulations and experiments was found for a wide range of bone volume fraction and degree of anisotropy in both tension and compression using a non-calibrated, average set of material parameters. These results demonstrate the ability of the model to capture the effects leading to failure of bone for three anatomical sites and several donors, which may be used to determine the apparent behavior of trabecular bone and its evolution with age, disease, and treatment in the future.
Abstract:
The regenerative pathways during periosteal distraction osteogenesis may be influenced by the local environment composed of cells, growth factors, nutrition and mechanical load. The aim of the present study was to evaluate the influence of two protocols of periosteal distraction on bone formation. Custom-made distraction devices were surgically fixed onto the calvariae of 60 rabbits. After an initial healing period of 7 days, two groups of animals were subjected to distraction rates of 0.25 and 0.5 mm/24 h for 10 days, respectively. Six animals per group were sacrificed 10 (mid-distraction), 17 (end-distraction), 24 (1-week consolidation), 31 (2-week consolidation) and 77 days (2-month consolidation) after surgery. Newly formed bone was assessed by means of micro-CT and histology. Expression of transcripts encoding tissue-specific genes (BMP-2, RUNX2, ACP5, SPARC, collagen I α1, collagen II α1 and SOX9) was analyzed by quantitative PCR. Two patterns of bone formation were observed, originating from the old bone surface in Group I and from the periosteum in Group II. Bone volume (BV) and bone mineral density (BMD) increased significantly up to the 2-month consolidation period within the groups (p < 0.05). Significantly more bone was observed in Group II than in Group I at the 2-month consolidation period (p < 0.001). Expression of transcripts encoding osteogenic genes in bone depended on the time point of observation (p < 0.05). The low level of transcripts suggests an indirect role of the periosteum in the osteogenic process. The two protocols of periosteal distraction in the present model resulted in moderate differences in terms of bone formation.
Abstract:
The Out-of-Africa (OOA) dispersal ∼50,000 y ago is characterized by a series of founder events as modern humans expanded into multiple continents. Population genetics theory predicts an increase of mutational load in populations undergoing serial founder effects during range expansions. To test this hypothesis, we have sequenced full genomes and high-coverage exomes from seven geographically divergent human populations from Namibia, Congo, Algeria, Pakistan, Cambodia, Siberia, and Mexico. We find that individual genomes vary modestly in the overall number of predicted deleterious alleles. We show via spatially explicit simulations that the observed distribution of deleterious allele frequencies is consistent with the OOA dispersal, particularly under a model where deleterious mutations are recessive. We conclude that there is a strong signal of purifying selection at conserved genomic positions within Africa, but that many predicted deleterious mutations have evolved as if they were neutral during the expansion out of Africa. Under a model where selection is inversely related to dominance, we show that OOA populations are likely to have a higher mutation load due to increased allele frequencies of nearly neutral variants that are recessive or partially recessive.
Abstract:
Expanding populations incur a mutation burden – the so-called expansion load. Previous studies of expansion load have focused on codominant mutations. An important consequence of this assumption is that expansion load stems exclusively from the accumulation of new mutations occurring in individuals living at the wave front. Using individual-based simulations, we study here the dynamics of standing genetic variation at the front of expansions, and its consequences on mean fitness if mutations are recessive. We find that deleterious genetic diversity is quickly lost at the front of the expansion, but the loss of deleterious mutations at some loci is compensated by an increase of their frequencies at other loci. The frequency of deleterious homozygotes therefore increases along the expansion axis, whereas the average number of deleterious mutations per individual remains nearly constant across the species range. This reveals two important differences to codominant models: (i) mean fitness at the front of the expansion drops much faster if mutations are recessive, and (ii) mutation load can increase during the expansion even if the total number of deleterious mutations per individual remains constant. We use our model to make predictions about the shape of the site frequency spectrum at the front of range expansion, and about correlations between heterozygosity and fitness in different parts of the species range. Importantly, these predictions provide opportunities to empirically validate our theoretical results. We discuss our findings in the light of recent results on the distribution of deleterious genetic variation across human populations and link them to empirical results on the correlation of heterozygosity and fitness found in many natural range expansions.
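For readers unfamiliar with the codominant/recessive distinction, a standard multiplicative fitness model (an illustrative convention, not a formula quoted from the abstract) makes it explicit. With selection coefficient s and dominance coefficient h, an individual carrying n_het heterozygous and n_hom homozygous deleterious mutations has fitness

$$ w = (1 - hs)^{n_{\mathrm{het}}} \, (1 - s)^{n_{\mathrm{hom}}}. $$

Under codominance (h = 1/2) every mutant copy lowers fitness, whereas for fully recessive mutations (h = 0) only homozygotes do; this is why rising homozygote frequencies along the expansion axis can depress fitness at the front even when the per-individual number of deleterious mutations stays roughly constant.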
Abstract:
Domestic dog rabies is an endemic disease in large parts of the developing world and is also epidemic in previously free regions. For example, it continues to spread in eastern Indonesia and currently threatens adjacent rabies-free regions with high densities of free-roaming dogs, including remote northern Australia. Mathematical and simulation disease models are useful tools to provide insights into the most effective control strategies and to inform policy decisions. Existing rabies models typically focus on long-term control programs in endemic countries. However, simulation models describing the dog rabies incursion scenario in regions where rabies is still exotic are lacking. We here describe such a stochastic, spatially explicit rabies simulation model that is based on individual dog information collected in two remote regions in northern Australia. Illustrative simulations produced plausible results with epidemic characteristics expected for rabies outbreaks in disease-free regions (mean R0 1.7, epidemic peak 97 days post-incursion, vaccination as the most effective response strategy). Systematic sensitivity analysis identified that model outcomes were most sensitive to seven of the 30 model parameters tested. This model is suitable for exploring rabies spread and control before an incursion in populations of largely free-roaming dogs that live closely alongside their owners. It can be used for ad hoc contingency or response planning prior to and shortly after an incursion of dog rabies in previously free regions. One challenge that remains is model parameterisation, particularly how dogs' roaming, contact and biting behaviours change following a rabies incursion in a previously rabies-free population.
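The abstract gives only headline outputs (mean R0 of 1.7, peak timing, ranking of response strategies), not the model structure. As a purely hypothetical sketch of how a stochastic incursion simulation can be framed, the chain-binomial (Reed-Frost-type) fragment below uses placeholder values for population size, R0 and generation time rather than the paper's region-specific estimates, and it omits the spatial and ownership structure of the actual model.

```python
import random

# Hypothetical Reed-Frost-style sketch of a dog-rabies incursion in a closed
# population. All parameters are illustrative placeholders.

def simulate_incursion(n_dogs=2000, r0=1.7, generation_days=24,
                       vaccination_coverage=0.0, horizon_days=365):
    """Return (day, new cases) per generation after a single infected dog arrives."""
    susceptible = int(n_dogs * (1.0 - vaccination_coverage)) - 1
    infectious = 1
    cases, day = [(0, 1)], 0
    while infectious > 0 and day < horizon_days:
        day += generation_days
        p_infection = 1.0 - (1.0 - r0 / n_dogs) ** infectious  # per-susceptible risk
        new = sum(random.random() < p_infection for _ in range(susceptible))
        susceptible -= new
        infectious = new                    # rabid dogs die within one generation
        cases.append((day, new))
    return cases

cases = simulate_incursion(vaccination_coverage=0.0)
peak_day, peak_size = max(cases, key=lambda dc: dc[1])
print("epidemic peak around day", peak_day, "with", peak_size, "new cases")
print("total cases in first year:", sum(c for _, c in cases))
```

Raising vaccination_coverage above 1 − 1/R0 (about 0.41 for R0 = 1.7) pushes the effective reproduction number below one, the standard reason vaccination is effective in such models and consistent with vaccination emerging as the most effective response strategy in the published simulations.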
Abstract:
Cytochromes P450 4Fs (CYP4F) are a subfamily of enzymes involved in arachidonic acid metabolism with highest catalytic activity towards leukotriene B4 (LTB4), a potent chemoattractant involved in prompting inflammation. CYP4F-mediated metabolism of LTB4 leads to inactive ω-hydroxy products incapable of initiating chemotaxis and the inflammatory stimuli that result in the influx of inflammatory cells. Our hypothesis is based on the catalytic ability of CYP4Fs to inactivate pro-inflammatory LTB4, which assures these enzymes a pivotal role in the process of inflammation resolution. To test this hypothesis and evaluate the changes in CYP4F expression under complex inflammatory conditions, we designed two mouse models, one challenged with lipopolysaccharide (LPS) as a sterile model of sepsis and the other challenged with a systemic live bacterial infection of Citrobacter rodentium, an equivalent of invasion by the human enterobacterial pathogen E. coli. Based on the evidence that Peroxisome Proliferator Activated Receptors (PPARs) play an active role in inflammation regulation, we also examined PPARs as a regulatory mechanism in CYP4F expression during inflammation using PPARα knockout mice under LPS challenge. Using the Citrobacter rodentium model of inflammation, we studied CYP4F levels to compare them with those in LPS-challenged animals. The LPS-triggered inflammation signal is mediated by Toll-like receptor 4 (TLR4), which specifically responds to LPS in association with several other proteins. Using TLR4 knockout mice challenged with Citrobacter rodentium, we addressed the possible mediation of CYP4F expression regulation via this receptor. Our results show isoform- and tissue-specific CYP4F expression in all the tissues examined. The Citrobacter rodentium inflammation model revealed a significant reduction in liver expression of CYP4F14 and CYP4F15 and an up-regulation of gene expression of CYP4F16 and CYP4F18. TLR4 knockout studies showed that the decrease in hepatic CYP4F15 expression is TLR4-dependent. CYP4F expression in kidney shows down-regulation of CYP4F14 and CYP4F15 and up-regulation of CYP4F18 expression. In the LPS inflammation model, we showed similar patterns of CYP4F changes as in Citrobacter rodentium-infected mice. The renal profile of CYP4Fs in PPARα knockout mice under LPS challenge showed CYP4F15 down-regulation to be PPARα-dependent. Our study confirmed tissue- and isoform-specific regulation of CYP4F isoforms in the course of inflammation.
Abstract:
Background. Past and recent evidence shows that radionuclides in drinking water may be a public health concern. Developmental thresholds for birth defects with respect to chronic low-level domestic radiation exposures, such as through drinking water, have not been definitively established, and there is a strong need to address this deficiency in information. In this study we examined the geographic distribution of orofacial cleft birth defects in and around uranium mining district counties in South Texas (Atascosa, Bee, Brooks, Calhoun, Duval, Goliad, Hidalgo, Jim Hogg, Jim Wells, Karnes, Kleberg, Live Oak, McMullen, Nueces, San Patricio, Refugio, Starr, Victoria, Webb, and Zavala) from 1999 to 2007. The probable association of cleft birth defect rates by ZIP code, classified according to uranium and radium concentrations in drinking water supplies, was evaluated. Similar associations between orofacial cleft birth defects and radium/radon in drinking water were reported earlier by Cech and co-investigators in another part of the Gulf Coast region (Harris County, Texas) (refs. 50, 55). Since substantial uranium mining activity existed and still exists in South Texas, contamination of drinking water sources with radiation and its relation to birth defects is grounds for concern. Methods. Residential addresses of orofacial cleft birth defect cases, as well as live births within the twenty counties during 1999-2007, were geocoded and mapped. Prevalence rates were calculated by ZIP code and mapped accordingly. Locations of drinking water supplies were also geocoded and mapped. ZIP codes were stratified as having high combined uranium (≥30 μg/L) vs. low combined uranium (<30 μg/L). Likewise, ZIP codes were stratified by Ra-226, a radium isotope in the uranium decay series, as having elevated radium (≥3 pCi/L) vs. low radium (<3 pCi/L) in drinking water. A linear regression was performed using the STATA® generalized linear model (GLM) program to evaluate the probable association between cleft birth defect rates by ZIP code and concentrations of uranium and radium in the domestic water supply. These rates were further adjusted for potentially confounding variables such as maternal age, education, occupation, and ethnicity. Results. This study showed higher rates of cleft births in ZIP codes classified as having high combined uranium than in ZIP codes having low combined uranium. The model was further improved by adding radium, stratified as explained above. Adjustment for maternal age and ethnicity did not substantially affect the statistical significance of uranium or radium concentrations in household water supplies. Conclusion. Although this study lacks individual exposure levels, the findings suggest a significant association between elevated uranium and radium concentrations in tap water and high orofacial birth defect rates by ZIP code. Future case-control studies that can measure individual exposure levels and adjust for competing risk factors could result in a better understanding of the exposure-disease association.
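The analysis described above is a ZIP-code-level GLM fitted in STATA. As a hypothetical sketch only, an equivalent specification in Python with statsmodels is shown below; the variable names and the toy data frame are invented for illustration and are not the study's data.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical sketch of the ZIP-code-level regression: cleft birth defect rate
# regressed on stratified uranium/radium indicators plus maternal covariates.
# All values below are invented toy data.
df = pd.DataFrame({
    "cleft_rate":   [1.2, 0.8, 2.1, 1.9, 0.6, 1.4, 0.9, 1.7],  # per 1,000 live births
    "high_uranium": [0, 0, 1, 1, 0, 1, 0, 1],                  # combined uranium >= 30 ug/L
    "high_radium":  [0, 0, 1, 0, 0, 1, 0, 1],                  # Ra-226 >= 3 pCi/L
    "maternal_age": [26.1, 27.3, 24.8, 25.0, 28.2, 24.5, 27.0, 25.3],
    "pct_hispanic": [55, 40, 80, 75, 35, 82, 45, 78],           # crude ethnicity proxy
})

# Gaussian GLM with identity link, i.e. an ordinary linear regression of rates.
model = smf.glm(
    "cleft_rate ~ high_uranium + high_radium + maternal_age + pct_hispanic",
    data=df, family=sm.families.Gaussian(),
).fit()
print(model.summary())
```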
Abstract:
Background. The United Nations' Millennium Development Goal (MDG) 4 aims for a two-thirds reduction in death rates for children under the age of five by 2015. The greatest risk of death is in the first week of life, yet most of these deaths can be prevented by such simple interventions as improved hygiene, exclusive breastfeeding, and thermal care. Deaths in the first month of life make up 28% of all deaths under five years in Nigeria, a statistic that has remained unchanged despite various child health policies. This paper addresses the challenges of reducing the neonatal mortality rate in Nigeria by examining the literature regarding the efficacy of home-based newborn care interventions and policies that have been implemented successfully in India. Methods. I compared similarities and differences between India and Nigeria using qualitative descriptions and available quantitative data on various health indicators. The analysis included identifying policy-related factors and community approaches contributing to India's newborn survival rates. Databases and reference lists of articles were searched for randomized controlled trials of community health worker interventions shown to reduce neonatal mortality rates. Results. While Nigeria spends more money than India on health per capita ($136 vs. $132, respectively) and as a percentage of GDP (5.8% vs. 4.2%, respectively), it still lags behind India in its neonatal, infant, and under-five mortality rates (40 vs. 32 deaths/1000 live births, 88 vs. 48 deaths/1000 live births, and 143 vs. 63 deaths/1000 live births, respectively). Both countries have comparably low numbers of healthcare providers. Unlike their counterparts in Nigeria, Indian community health workers receive training on how to deliver postnatal care in the home setting and are monetarily compensated. Gender-related power differences still play a role in the societal structure of both countries. A search for randomized controlled trials of home-based newborn care strategies yielded three relevant articles. Community health workers trained to educate mothers and provide a preventive package of interventions involving clean cord care, thermal care, breastfeeding promotion, and danger-sign recognition during multiple postnatal visits in rural India, Bangladesh, and Pakistan reduced neonatal mortality rates by 54%, 34%, and 15–20%, respectively. Conclusion. Access to advanced technology is not necessary to reduce neonatal mortality rates in resource-limited countries. To address the urgency of neonatal mortality, countries with weak health systems need to start at the community level and invest in cost-effective, evidence-based newborn care interventions that utilize available human resources. While more randomized controlled studies are urgently needed, the current available evidence on models of postnatal care provision demonstrates that home-based care and health education provided by community health workers can reduce neonatal mortality rates in the immediate future.