897 results for cache-based mechanism
Abstract:
In this paper we present a new population-based implant design methodology, which advances state-of-the-art approaches by combining shape and bone quality information into the design strategy. The method may enhance the mechanical stability of the fixation and reduce the intra-operative in-plane bending that might impede the functionality of the locking mechanism. The computational method is presented for the case of mandibular locking fixation plates, where the mandibular angle and the bone quality at screw locations are taken into account. The method automatically derives the mandibular angle and the bone thickness and intensity values along the path of every screw from a set of computed tomography images. An optimization strategy is then used to optimize the two parameters of plate angle and screw position. The method was applied to two populations of different genders. Results for the new design are presented along with a comparison with a commercially available mandibular locking fixation plate (MODUS® TriLock® 2.0/2.3/2.5, Medartis AG, Basel, Switzerland). The proposed designs resulted in a statistically significant improvement in the available bone thickness when compared to the standard plate. There is a higher probability that the proposed implants cover areas of thicker cortical bone without compromising the bone mineral density around the screws. The obtained results allowed us to conclude that an angle and screw separation of 129° and 9 mm for females and 121° and 10 mm for males are more suitable designs than the commercially available 120° and 9 mm.
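The two-parameter optimization described above can be pictured as a simple search over candidate designs. The sketch below is purely illustrative and is not the authors' method: the `mean_thickness` surrogate is an invented stand-in for the CT-derived bone-thickness evaluation, chosen so that its optimum coincides with the reported female design (129°, 9 mm).

```python
# Hypothetical sketch: exhaustive search over the two design parameters
# (plate angle in degrees, screw separation in mm) to maximize the mean
# bone thickness along the screw paths. The quadratic surrogate below is
# invented for illustration; the real objective comes from CT data.

def mean_thickness(angle_deg, separation_mm):
    # Toy surrogate with its peak at 129 deg / 9 mm.
    return 3.0 - 0.001 * (angle_deg - 129) ** 2 - 0.05 * (separation_mm - 9) ** 2

def optimize_plate(angles, separations):
    # Pick the (angle, separation) pair with the largest surrogate thickness.
    return max(
        ((a, s) for a in angles for s in separations),
        key=lambda p: mean_thickness(*p),
    )

best = optimize_plate(range(115, 136), range(7, 12))
```

In practice the search would score each candidate against the whole population of CT scans rather than a closed-form surrogate.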
Abstract:
The spatio-temporal control of gene expression is fundamental to elucidate cell proliferation and deregulation phenomena in living systems. Novel approaches based on light-sensitive multiprotein complexes have recently been devised, showing promising perspectives for the noninvasive and reversible modulation of the DNA-transcriptional activity in vivo. This has lately been demonstrated in a striking way through the generation of the artificial protein construct light-oxygen-voltage (LOV)-tryptophan-activated protein (TAP), in which the LOV-2-Jα photoswitch of phototropin1 from Avena sativa (AsLOV2-Jα) has been ligated to the tryptophan-repressor (TrpR) protein from Escherichia coli. Although tremendous progress has been achieved on the generation of such protein constructs, a detailed understanding of their functioning as optogenetic tools is still in its infancy. Here, we elucidate the early stages of the light-induced regulatory mechanism of LOV-TAP at the molecular level, using the noninvasive molecular dynamics simulation technique. More specifically, we find that Cys450-FMN-adduct formation in the AsLOV2-Jα-binding pocket after photoexcitation induces the cleavage of the peripheral Jα-helix from the LOV core, causing a change of its polarity and electrostatic attraction of the photoswitch onto the DNA surface. This goes along with the flexibilization through unfolding of a hairpin-like helix-loop-helix region interlinking the AsLOV2-Jα- and TrpR-domains, ultimately enabling the condensation of LOV-TAP onto the DNA surface. By contrast, in the dark state the AsLOV2-Jα photoswitch remains inactive and exerts a repulsive electrostatic force on the DNA surface. This leads to a distortion of the hairpin region, which finally relieves its tension by causing the disruption of LOV-TAP from the DNA.
Abstract:
When reengineering legacy systems, it is crucial to assess whether the legacy behavior has been preserved or how it changed due to the reengineering effort. Ideally, if a legacy system is covered by tests, running the tests on the new version can identify potential differences or discrepancies. However, writing tests for an unknown and large system is difficult due to the lack of internal knowledge. It is especially difficult to bring the system to an appropriate state. Our solution is based on the acknowledgment that one of the few trustworthy pieces of information available when approaching a legacy system is the running system itself. Our approach reifies the execution traces and uses logic programming to express tests on them. Thereby it eliminates the need to programmatically bring the system into a particular state, and hands the test-writer a high-level abstraction mechanism to query the trace. The resulting system, called TESTLOG, was used on several real-world case studies to validate our claims.
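The idea of reifying a trace and querying it declaratively can be sketched as follows. This is not TESTLOG's actual logic-programming syntax; the event records and the `query` helper are hypothetical illustrations of the approach.

```python
# Illustrative sketch (not TESTLOG's real notation): execution-trace events
# are reified as plain records, and a "test" is a declarative query over them
# instead of code that drives the live system into a particular state.

trace = [
    {"event": "call", "receiver": "Account", "method": "deposit", "arg": 100},
    {"event": "call", "receiver": "Account", "method": "withdraw", "arg": 30},
    {"event": "return", "receiver": "Account", "method": "balance", "value": 70},
]

def query(trace, **pattern):
    """Return all trace events matching every key/value pair in the pattern."""
    return [e for e in trace if all(e.get(k) == v for k, v in pattern.items())]

# A test is then an assertion over the queried trace:
deposits = query(trace, event="call", method="deposit")
```

The point is that the state needed by the test (an account that has received a deposit) is found in the recorded execution rather than reconstructed programmatically.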
Abstract:
The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of the victims of traffic accidents, for example in collisions between motor vehicles and pedestrians or cyclists, is the situation of the impact. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides the post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of the external body findings and of injury-inflicting instruments. The correlation of injuries of the body to the injury-inflicting object and the accident mechanism are of great importance. The applied methods include documentation of the external and internal body and the involved vehicles and inflicting tools as well as the analysis of the acquired data. The body surface and the accident vehicles with their damages were digitized by 3D surface scanning. For the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included the processing of the obtained data to 3D models, determination of the driving direction of the vehicle, correlation of injuries to the vehicle damages, geometric determination of the impact situation and evaluation of further findings of the accident. In the following article, the benefits of the 3D documentation and computer-assisted, drawn-to-scale 3D comparisons of the relevant injuries with the damages to the vehicle in the analysis of the course of accidents, especially with regard to the impact situation, are shown on two examined cases.
Abstract:
INTRODUCTION: Cognitive complaints, such as poor concentration and memory deficits, are frequent after whiplash injury and play an important role in disability. The origin of these complaints is discussed controversially. Some authors postulate brain lesions as a consequence of whiplash injuries. Potential diffuse axonal injury (DAI) with subsequent atrophy of the brain and ventricular expansion is of particular interest as focal brain lesions have not been documented so far in whiplash injury. OBJECTIVE: To investigate whether traumatic brain injury can be identified using a magnetic resonance (MR)-based quantitative analysis of normalized ventricle-brain ratios (VBR) in chronic whiplash patients with subjective cognitive impairment that cannot be objectively confirmed by neuropsychological testing. MATERIALS AND METHODS: MR examination was performed in 21 patients with whiplash injury and symptom persistence for 9 months on average and in 18 matched healthy controls. Conventional MR imaging (MRI) was used to assess the volumes of grey and white matter and of ventricles. The normalized VBR was calculated. RESULTS: The values of normalized VBR did not differ in whiplash patients when compared with that in healthy controls (F = 0.216, P = 0.645). CONCLUSIONS: This study does not support loss of brain tissue following whiplash injury as measured by VBR. On this basis, traumatic brain injury with subsequent DAI does not seem to be the underlying mechanism for persistent concentration and memory deficits that are subjectively reported but not objectively verifiable as neuropsychological deficits.
Abstract:
BACKGROUND: The activity of dihydropyrimidine dehydrogenase (DPD), the key enzyme of pyrimidine catabolism, is thought to be an important determinant for the occurrence of severe toxic reactions to 5-fluorouracil (5-FU), which is one of the most commonly prescribed chemotherapeutic agents for the treatment of solid cancers. Genetic variation in the DPD gene (DPYD) has been proposed as a main factor for variation in DPD activity in the population. However, only a small proportion of severe toxicities in 5-FU based chemotherapy can be explained by such rare deleterious DPYD mutations resulting in severe enzyme deficiencies. Recently, hypermethylation of the DPYD promoter region has been proposed as an alternative mechanism for DPD deficiency and thus as a major cause of severe 5-FU toxicity. METHODS: Here, the prognostic significance of this epigenetic marker with respect to severe 5-FU toxicity was assessed in 27 cancer patients receiving 5-FU based chemotherapy, including 17 patients experiencing severe toxic side effects following drug administration, none of whom were carriers of a known deleterious DPYD mutation, and ten control patients. The methylation status of the DPYD promoter region in peripheral blood mononuclear cells was evaluated by analysing, for each patient, between 19 and 30 different clones of a PCR-amplified 209 base pair fragment of the bisulfite-modified DPYD promoter region. The fragments were sequenced to detect bisulfite-induced, methylation-dependent sequence differences. RESULTS: No evidence of DPYD promoter methylation was observed in any of the investigated patient samples, whereas in a control experiment, as little as 10% methylated genomic DNA could be detected. CONCLUSION: Our results indicate that DPYD promoter hypermethylation is not of major importance as a prognostic factor for severe toxicity in 5-FU based chemotherapy.
Abstract:
Bulk metallic glasses (BMGs) exhibit superior mechanical properties compared with other conventional materials and have been proposed for numerous engineering and technological applications. Zr/Hf-based BMGs or tungsten-reinforced BMG composites are considered a potential replacement for depleted uranium armor-piercing projectiles because of their ability to form localized shear bands during impact, which is known to be the dominant plastic deformation mechanism in BMGs. However, in conventional tensile, compressive and bending tests, limited ductility has been observed because of fracture initiation immediately following shear band formation. To fully investigate shear band characteristics, indentation tests that can confine the deformation to a limited region have been pursued. In this thesis, a detailed investigation of the thermal stability and mechanical deformation behavior of Zr/Hf-based BMGs is conducted. First, systematic studies were carried out to understand the influence of the relative compositions of Zr and Hf on thermal stability and the evolution of mechanical properties. Second, shear band evolution under indentation was investigated experimentally and theoretically. Three kinds of indentation studies were conducted on BMGs in the current study: (a) nano-indentation to determine the mechanical properties as a function of Hf/Zr content; (b) static Vickers indentation on bonded split specimens to investigate the shear band evolution characteristics beneath the indentation; and (c) dynamic Vickers indentation on bonded split specimens to investigate the influence of strain rate. It was found in the present work that gradually replacing Zr by Hf remarkably increases the density and improves the mechanical properties. However, a slight decrease in glass forming ability with increasing Hf content has also been identified through thermodynamic analysis, although all the materials in the current study were still found to be amorphous.
Many indentation studies have revealed only a few shear bands surrounding the indent on the top surface of the specimen. This small number of shear bands cannot account for the large plastic deformation beneath the indentations. Therefore, a bonded interface technique has been used to observe the slip-steps due to shear band evolution. Vickers indentations were performed along the interface of the bonded split specimen at increasing loads. At small indentation loads, the plastic deformation was primarily accommodated by semi-circular primary shear bands surrounding the indentation. At higher loads, secondary and tertiary shear bands were formed inside this plastic zone. A modified expanding cavity model was then used to predict the plastic zone size characterized by the shear bands and to identify the stress components responsible for the evolution of the various types of shear bands. The applicability of various hardness–yield-strength (H–σy) relationships currently available in the literature for bulk metallic glasses (BMGs) is also investigated. Experimental data generated on Zr/Hf-based BMGs in the current study and those available elsewhere on other BMG compositions were used to validate the models. A modified expanding-cavity model, employed in earlier work, was extended to propose a new H–σy relationship. Unlike previous models, the proposed model takes into account not only the indenter geometry and the material properties, but also the pressure sensitivity index of the BMGs. The influence of various model parameters is systematically analyzed. It is shown that there is a good correlation between the model predictions and the experimental data for a wide range of BMG compositions. Under dynamic Vickers indentation, a decrease in indentation hardness at high loading rate was observed compared to static indentation hardness.
It was observed that at equivalent loads, dynamic indentations produced more severe deformation features on the loading surface than static indentations. In contrast to static indentation, two sets of widely spaced semi-circular shear bands with two different curvatures were observed. The observed shear band pattern and the strain-rate softening in indentation hardness were rationalized based on the variations in the normal stress on the slip plane, the shear strain rate and the temperature rise associated with the indentation deformation. Finally, a coupled thermo-mechanical model is proposed that utilizes a momentum diffusion mechanism for the growth and evolution of the final spacing of shear bands. The influence of strain rate, confinement pressure and critical shear displacement on the shear band spacing, the temperature rise within the shear band, and the associated variation in flow stress have been captured and analyzed. Consistent with the known pressure-sensitive behavior of BMGs, the current model clearly captures the influence of the normal stress in the formation of shear bands. The normal stress not only reduces the time to reach the critical shear displacement but also causes a significant temperature rise during shear band formation. Based on this observation, the variation of shear band spacing in a typical dynamic indentation test has been rationalized. The temperature rise within a shear band can be in excess of 2000 K under high strain rate and high confinement pressure conditions. The associated drop in viscosity and flow stress may explain the observed decrease in fracture strength and indentation hardness. The above investigations provide valuable insight into the deformation behavior of BMGs under static and dynamic loading conditions. The shear band patterns observed in the above indentation studies can help in understanding and modeling the deformation features under complex loading scenarios such as the interaction of a penetrator with armor.
Future work encompasses (1) extending and modifying the coupled thermo-mechanical model to account for the temperature rise in quasi-static deformation; and (2) expanding the model to account for the microstructural variations (crystallization and free-volume migration) associated with the deformation.
Abstract:
As the performance gap between microprocessors and memory continues to increase, main memory accesses incur long latencies that become a factor limiting system performance. Previous studies show that main memory access streams contain significant locality and that SDRAM devices provide parallelism through multiple banks and channels. This locality and parallelism have not been exploited thoroughly by conventional memory controllers. In this thesis, SDRAM address mapping techniques and memory access reordering mechanisms are studied and applied to memory controller design with the goal of reducing observed main memory access latency. The proposed bit-reversal address mapping attempts to distribute main memory accesses evenly in the SDRAM address space to enable bank parallelism. As memory accesses to distinct banks are interleaved, the access latencies are partially hidden and therefore reduced. By taking cache conflict misses into consideration, bit-reversal address mapping is able to direct potential row conflicts to different banks, further improving performance. The proposed burst scheduling is a novel access reordering mechanism that creates bursts by clustering accesses directed to the same rows of the same banks. Subject to a threshold, reads are allowed to preempt writes, and qualified writes are piggybacked at the end of the bursts. A sophisticated access scheduler selects accesses based on priorities and interleaves accesses to maximize SDRAM data bus utilization. Consequently, burst scheduling reduces the row conflict rate, increasing and exploiting the available row locality. Using revised SimpleScalar and M5 simulators, both techniques are evaluated and compared with existing academic and industrial solutions. With SPEC CPU2000 benchmarks, bit-reversal reduces the execution time by 14% on average over traditional page interleaving address mapping.
Burst scheduling also achieves a 15% reduction in execution time over conventional bank-in-order scheduling. Working constructively together, bit-reversal and burst scheduling achieve a 19% speedup across the simulated benchmarks.
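As a rough illustration of the bit-reversal idea (field widths and layout are invented here, not the thesis's actual mapping): two addresses that differ only in their high-order bits collide in the same bank under conventional page interleaving, whereas reversing the upper address field sends them to different banks, turning a potential row conflict into bank parallelism.

```python
# Hedged sketch of bit-reversal address mapping. Assumed toy layout:
# [ upper (row/bank) bits | column bits ], with 2 bank bits. Under page
# interleaving the bank index is the low part of the upper field; under
# bit-reversal the upper field is reversed first, so addresses differing
# only in high-order bits (typical cache-conflict pairs) spread across banks.

def bit_reverse(value, width):
    """Reverse the low `width` bits of `value`."""
    out = 0
    for _ in range(width):
        out = (out << 1) | (value & 1)
        value >>= 1
    return out

def page_interleave_bank(addr, col_bits=10, bank_bits=2):
    # Bank index taken directly from the bits just above the column offset.
    return (addr >> col_bits) & ((1 << bank_bits) - 1)

def bit_reversal_bank(addr, col_bits=10, bank_bits=2, upper_bits=16):
    # Reverse the whole upper field, then take the low bits as the bank index.
    upper = (addr >> col_bits) & ((1 << upper_bits) - 1)
    return bit_reverse(upper, upper_bits) & ((1 << bank_bits) - 1)

# Two addresses that differ only in a high-order bit:
a, b = 0, 1 << 25
```

With this toy layout, `a` and `b` map to the same bank under page interleaving (a serialized row conflict) but to different banks under bit-reversal.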
Abstract:
Research on rehabilitation has shown that appropriate and repetitive mechanical movements can help spinal cord injured individuals restore functional standing and walking. The objective of this work was to achieve appropriate and repetitive joint movements and an approximately normal gait through the PGO by replicating normal walking, and to minimize the energy consumption for both the patient and the device. A model-based experimental investigative approach is presented in this dissertation. First, a human model was created in Ideas and human walking was simulated in Adams. The main feature of this model was the foot-ground contact model, which had distributed contact points along the foot and varied viscoelasticity. The model was validated by comparing simulated results of normal walking with measured ones from the literature. It was used to simulate current PGO walking to investigate the real causes of the poor function of the current PGO, even though it had joint movements close to normal walking. The direct cause was one leg moving at a time, which resulted in short step length and no clearance after toe-off. This cannot be solved by simply adding power at both hip joints. To find a better answer, a PGO mechanism model was used to investigate different walking mechanisms by locking or releasing some joints. A trade-off between energy consumption, control complexity and standing position was found. Finally, a foot-release PGO virtual model was created and simulated, and only the foot-release mechanism was developed into a prototype. Both the release mechanism and the design of the foot release were validated experimentally by adding the foot release to the current PGO. This demonstrated an advancement in improving functional aspects of the current PGO even without a whole physical model of the foot-release PGO for comparison.
Abstract:
Mt Etna's activity has increased during the last decade with a tendency towards more explosive eruptions that produce paroxysmal lava fountains. From January 2011 to April 2012, 25 lava fountaining episodes took place at Etna's New South-East Crater (NSEC). Improved understanding of the mechanism driving these explosive basaltic eruptions is needed to reduce volcanic hazards. This type of activity produces high sulfur dioxide (SO2) emissions, associated with lava flows and ash fall-out, but to date the SO2 emissions associated with Etna's lava fountains have been poorly constrained. The Ultraviolet (UV) Ozone Monitoring Instrument (OMI) on NASA's Aura satellite and the Atmospheric Infrared Sounder (AIRS) on Aqua were used to measure the SO2 loadings. Ground-based data from the Observatoire de Physique du Globe de Clermont-Ferrand (OPGC) L-band Doppler radar, VOLDORAD 2B, used in collaboration with the Italian National Institute of Geophysics and Volcanology in Catania (INGV-CT), also detected the associated ash plumes, giving precise timing and duration for the lava fountains. This study resulted in the first detailed analysis of the OMI and AIRS SO2 data for Etna's lava fountains during the 2011-2012 eruptive cycle. The HYSPLIT trajectory model is used to constrain the altitude of the observed SO2 clouds, and results show that the SO2 emission usually coincided with the lava fountain peak intensity as detected by VOLDORAD. The UV OMI and IR AIRS SO2 retrievals permit quantification of the SO2 loss rate in the volcanic SO2 clouds, many of which were tracked for several days after emission. A first attempt to quantitatively validate AIRS SO2 retrievals with OMI data revealed a good correlation for high altitude SO2 clouds. Using estimates of the emitted SO2 at the time of each paroxysm, we observe a correlation with the inter-paroxysm repose time.
We therefore suggest that our data set supports the collapsing foam (CF) model [1] as the driving mechanism for the paroxysmal events at the NSEC. Using VOLDORAD-based estimates of the erupted magma mass, we observe a large excess of SO2 in the eruption clouds. Satellite measurements indicate that SO2 emissions from Etnean lava fountains can reach the lower stratosphere and hence could pose a hazard to aviation. [1] Parfitt, E.A. (2004). A discussion of the mechanisms of explosive basaltic eruptions. J. Volcanol. Geotherm. Res. 134, 77-107.
Abstract:
Contention-based MAC protocols follow periodic listen/sleep cycles. These protocols face the problem of virtual clustering if different unsynchronized listen/sleep schedules occur in the network, which has been shown to happen in wireless sensor networks. To interconnect these virtual clusters, border nodes maintaining all respective listen/sleep schedules are required. However, this is a waste of energy if a common schedule can be determined locally. We propose achieving local synchronization with a mechanism similar to gravitation. Clusters represent the mass, whereas the synchronization messages sent by each cluster represent the gravitational force of the corresponding cluster. Due to the mutual attraction caused by the clusters, all clusters eventually merge. The exchange of synchronization messages itself is not altered by LACAS. Accordingly, LACAS introduces no overhead; it merely exploits a previously unused property of existing synchronization mechanisms.
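The gravitation analogy can be sketched numerically. This toy model is our own illustration, not the LACAS protocol: each cluster's "mass" is taken to be its node count, and one exchange of synchronization messages moves both schedule offsets toward each other in proportion to the other cluster's relative mass.

```python
# Toy "gravitation" merge of two virtual clusters' listen/sleep schedule
# offsets. Masses and offsets are invented; in the analogy, a heavier
# cluster (more nodes) pulls harder, so lighter clusters drift toward it.

def attract(offset_a, mass_a, offset_b, mass_b):
    """Return the new schedule offsets after one exchange of sync messages."""
    total = mass_a + mass_b
    new_a = offset_a + (offset_b - offset_a) * mass_b / total
    new_b = offset_b + (offset_a - offset_b) * mass_a / total
    return new_a, new_b

# Cluster B is three times heavier; offsets are in milliseconds.
a, b = attract(0.0, 5, 100.0, 15)
```

In this simplified form both offsets meet at the mass-weighted mean (75 ms here, three quarters of the way toward the heavier cluster), mirroring how lighter virtual clusters are absorbed by heavier ones.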
Abstract:
Current advanced cloud infrastructure management solutions allow scheduling actions for dynamically changing the number of running virtual machines (VMs). This approach, however, does not guarantee that the scheduled number of VMs will properly handle the actual user-generated workload, especially if user utilization patterns change. We propose using a dynamically generated scaling model for the VMs containing the services of distributed applications, which is able to react to variations in the number of application users. We answer the following question: how can we dynamically decide how many services of each type are needed to handle a larger workload within the same time constraints? We describe a mechanism for dynamically composing the SLAs for controlling the scaling of distributed services by combining data analysis mechanisms with application benchmarking using multiple VM configurations. Based on the processing of data sets generated by multiple application benchmarks, we discover a set of service monitoring metrics able to predict critical Service Level Agreement (SLA) parameters. By combining this set of predictor metrics with a heuristic for selecting the appropriate scaling-out paths for the services of distributed applications, we show how SLA scaling rules can be inferred and then used to control the runtime scale-in and scale-out of distributed services. We validate our architecture and models by performing scaling experiments with a distributed application representative of the enterprise class of information systems. We show how dynamically generated SLAs can be successfully used to control the management of distributed service scaling.
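An inferred SLA scaling rule of the kind described might look like the following sketch. The linear response-time predictor and all numbers are invented for illustration; the paper derives its predictor metrics from benchmark data rather than from a fixed formula.

```python
# Hypothetical SLA scaling rule: a predictor metric (users per service
# instance) feeds a fitted model estimating an SLA parameter (response
# time), and instances are added until the estimate satisfies the SLA.
# base_ms and per_user_ms stand in for coefficients fitted from benchmarks.

def predicted_response_ms(users, instances, base_ms=80, per_user_ms=2.5):
    # Invented linear model: latency grows with load per instance.
    return base_ms + per_user_ms * users / instances

def instances_needed(users, sla_ms=200, max_instances=64):
    """Smallest instance count whose predicted response time meets the SLA."""
    for n in range(1, max_instances + 1):
        if predicted_response_ms(users, n) <= sla_ms:
            return n
    return max_instances
```

With these toy coefficients, 100 concurrent users would trigger a scale-out to 3 instances, since 1 or 2 instances are predicted to breach the 200 ms SLA.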
Abstract:
Loss to follow-up (LTFU) is a common problem in many epidemiological studies. In antiretroviral treatment (ART) programs for patients with human immunodeficiency virus (HIV), mortality estimates can be biased if the LTFU mechanism is non-ignorable, that is, mortality differs between lost and retained patients. In this setting, routine procedures for handling missing data may lead to biased estimates. To appropriately deal with non-ignorable LTFU, explicit modeling of the missing data mechanism is needed. This can be based on additional outcome ascertainment for a sample of patients LTFU, for example, through linkage to national registries or through survey-based methods. In this paper, we demonstrate how this additional information can be used to construct estimators based on inverse probability weights (IPW) or multiple imputation. We use simulations to contrast the performance of the proposed estimators with methods widely used in HIV cohort research for dealing with missing data. The practical implications of our approach are illustrated using South African ART data, which are partially linkable to South African national vital registration data. Our results demonstrate that while IPWs and proper imputation procedures can be easily constructed from additional outcome ascertainment to obtain valid overall estimates, neglecting non-ignorable LTFU can result in substantial bias. We believe the proposed estimators are readily applicable to a growing number of studies where LTFU is appreciable, but additional outcome data are available through linkage or surveys of patients LTFU.
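The IPW construction can be illustrated with a minimal numerical sketch (all counts invented): outcomes ascertained for a traced subsample of LTFU patients are up-weighted by the inverse of the tracing probability before being combined with the fully observed retained group.

```python
# Schematic inverse-probability-weighted mortality estimate. Deaths among
# retained patients are fully observed; outcomes among LTFU patients are
# known only for a traced subsample, so each traced outcome is weighted by
# the inverse of its sampling probability (ltfu_n / traced_n).

def ipw_mortality(retained_deaths, retained_n, traced_deaths, traced_n, ltfu_n):
    """Overall mortality proportion with traced LTFU outcomes up-weighted."""
    weight = ltfu_n / traced_n                    # inverse sampling probability
    total_deaths = retained_deaths + traced_deaths * weight
    return total_deaths / (retained_n + ltfu_n)

# Invented cohort: 800 retained (40 deaths), 200 LTFU of whom 50 were
# traced (20 deaths among the traced).
estimate = ipw_mortality(40, 800, 20, 50, 200)
```

With these numbers the IPW estimate is 12%, whereas a naive complete-case estimate (60/850, about 7.1%) understates mortality because the traced LTFU patients died at a much higher rate, i.e. the LTFU mechanism is non-ignorable.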
Abstract:
Block bootstrap has been introduced in the literature for resampling dependent data, i.e. stationary processes. One of the main assumptions in block bootstrapping is that the blocks of observations are exchangeable, i.e. their joint distribution is invariant under permutations. In this paper we propose a new Bayesian approach to block bootstrapping, starting from the construction of exchangeable blocks. Our sampling mechanism is based on a particular class of reinforced urn processes.
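For context, the classical (non-Bayesian) moving-block bootstrap that this line of work builds on can be sketched in a few lines; the uniform block resampling below is roughly the step that the proposed reinforced-urn sampling mechanism replaces.

```python
# Minimal moving-block bootstrap sketch: cut the series into overlapping
# blocks, then resample blocks with replacement and concatenate them,
# preserving the dependence structure within each block.
import random

def moving_block_bootstrap(series, block_len, seed=0):
    rng = random.Random(seed)
    blocks = [series[i:i + block_len]
              for i in range(len(series) - block_len + 1)]
    out = []
    while len(out) < len(series):
        out.extend(rng.choice(blocks))    # uniform draw with replacement
    return out[:len(series)]

resampled = moving_block_bootstrap(list(range(12)), block_len=3)
```

Each aligned run of three values in the output is a contiguous stretch of the original series, which is what lets the bootstrap mimic short-range dependence.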
Abstract:
A number of controlled trials have demonstrated the efficacy of Internet-based cognitive-behaviour therapy for treating social anxiety disorder (SAD). However, little is known about what makes those interventions work. The current trial focuses on patient expectations as one common mechanism of change. The study examines whether patients' expectancy predicts outcome, adherence, and dropout in an unguided Internet-based self-help programme for SAD. Data from 109 participants in a 10-week self-help programme for SAD were analysed. Social anxiety measures were administered prior to the intervention, at week 2, and after the intervention. Expectancy was assessed at week 2. Patient expectations were a significant predictor of change in social anxiety (β = -.35 to -.40, all p < .003). Patient expectations also predicted treatment adherence (β = .27, p = .02). Patients with higher expectations showed more adherence and better outcomes. Dropout was not predicted by expectations. The effect of positive expectations on outcome was mediated by early symptom change (from week 0 to week 2). The results suggest that positive outcome expectations have a beneficial effect on outcome in Internet-based self-help for SAD. Furthermore, patient expectations as early process predictors could be used to inform therapeutic decisions such as stepping patients up to guided or face-to-face treatment options.