13 results for Experimental work

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

70.00%

Publisher:

Abstract:

Aging societies suffer from an increasing incidence of bone fractures. Bone strength depends on the amount of mineral measured by clinical densitometry, but also on the micromechanical properties of the bone hierarchical organization. A good understanding has been reached for elastic properties on several length scales, but up to now there is a lack of reliable postyield data on the lower length scales. In order to describe the behavior of bone at the microscale, an anisotropic elastic-viscoplastic damage model was developed using an eccentric generalized Hill criterion and nonlinear isotropic hardening. The model was implemented as a user subroutine in Abaqus and verified using single-element tests. An FE simulation of microindentation in lamellar bone was finally performed, showing that the new constitutive model can capture the main characteristics of the indentation response of bone.

As the generalized Hill criterion is limited to elliptical and cylindrical yield surfaces and the correct shape for bone is not known, a new yield surface was developed that can take any convex quadratic shape. The main advantage is that, in the case of material identification, the shape of the yield surface does not have to be anticipated; instead, a minimization yields the optimal shape among all convex quadrics. The generality of the formulation was demonstrated by showing its degeneration to classical yield surfaces, and existing yield criteria for bone at multiple length scales were converted to the quadric formulation. A computational study was then performed to determine the influence of yield surface shape and damage on the indentation response of bone using spherical and conical tips. The constitutive model was adapted to the quadric criterion, and yield surface shape and critical damage were varied; both were shown to have a major impact on the indentation curves. Their influence on indentation modulus, hardness, their ratio, as well as the elastic-to-total work ratio was found to be very well described by multilinear regressions for both tip shapes. For conical tips, indentation depth was not a significant factor, while for spherical tips damage was insignificant.

All inverse methods based on microindentation suffer from a lack of uniqueness of the identified material properties in the case of nonlinear material behavior. Therefore, monotonic and cyclic micropillar compression tests in a scanning electron microscope, which allow a straightforward interpretation, were performed on dry ovine bone, complemented by microindentation and macroscopic uniaxial compression tests, to identify modulus, yield stress, plastic deformation, damage accumulation and failure mechanisms. While the elastic properties were highly consistent, the postyield deformation and failure mechanisms differed between the two length scales. A majority of the micropillars showed ductile behavior with strain hardening until failure by localization in a slip plane, while the macroscopic samples failed in a quasi-brittle fashion with microcracks coalescing into macroscopic failure surfaces. In agreement with a proposed rheological model, these experiments illustrate a transition from a ductile mechanical behavior of bone at the microscale to a quasi-brittle response driven by the growth of preexisting cracks along interfaces or in the vicinity of pores at the macroscale.

Subsequently, a study was undertaken to quantify the topological variability of indentations in bone and examine its relationship with mechanical properties. Indentations were performed in dry human and ovine bone in axial and transverse directions and their topography was measured by AFM. Statistical shape modeling of the residual imprint made it possible to define a mean shape and describe the variability with 21 principal components related to imprint depth, surface curvature and roughness. The indentation profile of bone was highly consistent and free of any pile-up. A few of the topological parameters, in particular depth, showed significant correlations with variations in mechanical properties, but the correlations were not very strong or consistent. We could thus verify that bone is rather homogeneous in its micromechanical properties and that indentation results are not strongly influenced by small deviations from the ideal case.

As the uniaxial properties measured by micropillar compression are in conflict with the current literature on bone indentation, another dissipative mechanism has to be present. The elastic-viscoplastic damage model was therefore extended to viscoelasticity. The viscoelastic properties were identified from macroscopic experiments, while the quasistatic postelastic properties were extracted from micropillar data. It was found that viscoelasticity governed by macroscale properties has very little influence on the indentation curve and results in a clear underestimation of the creep deformation. Adding viscoplasticity leads to increased creep, but hardness is still highly overestimated. It was possible to obtain a reasonable fit with experimental indentation curves for both Berkovich and spherical indentation when abandoning the assumption that shear strength is governed by an isotropy condition. These results remain to be verified by independent tests probing the micromechanical strength properties in tension and shear.

In conclusion, in this thesis several tools were developed to describe the complex behavior of bone on the microscale, and experiments were performed to identify its material properties. Micropillar compression highlighted a size effect in bone due to the presence of preexisting cracks and pores or interfaces like cement lines. It was possible to obtain a reasonable fit between experimental indentation curves using different tips and simulations using the constitutive model and the uniaxial properties measured by micropillar compression. Additional experimental work is necessary to identify the exact nature of the size effect and the mechanical role of interfaces in bone. Deciphering the micromechanical behavior of lamellar bone, its evolution with age, disease and treatment, and its failure mechanisms on several length scales will help prevent fractures in the elderly in the future.
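To make the idea of a general convex quadric yield surface more tangible, here is a minimal Python sketch of a yield function of the form f(σ) = sqrt(σᵀ F σ) + fᵀ σ − 1 evaluated for a stress state in Voigt notation. The matrix F, the vector f and the numerical values are illustrative assumptions only; they are not the parameters or the exact formulation identified in the thesis.

```python
import numpy as np

# Illustrative quadric yield criterion in Voigt notation:
#   f(sigma) = sqrt(sigma^T F sigma) + f^T sigma - 1,  yield when f(sigma) >= 0.
# F (6x6, symmetric) and f (length 6) are placeholders chosen only to
# demonstrate the evaluation; they are NOT the parameters identified in the thesis.

def quadric_yield(sigma, F, f):
    """Value of the quadric yield function for a stress vector in Voigt notation."""
    quad = float(sigma @ F @ sigma)
    return np.sqrt(max(quad, 0.0)) + float(f @ sigma) - 1.0

# Special case: with F the von Mises (deviatoric) projector and f = 0, the
# expression reduces to sigma_vm / sigma_y - 1, illustrating how classical
# criteria appear as degenerate cases of the general quadric form.
sigma_y = 100.0                               # assumed uniaxial yield stress [MPa]
F_vm = np.array([[ 2, -1, -1, 0, 0, 0],
                 [-1,  2, -1, 0, 0, 0],
                 [-1, -1,  2, 0, 0, 0],
                 [ 0,  0,  0, 6, 0, 0],
                 [ 0,  0,  0, 0, 6, 0],
                 [ 0,  0,  0, 0, 0, 6]]) / (2.0 * sigma_y**2)
f_lin = np.zeros(6)                           # no tension/compression asymmetry here

sigma = np.array([80.0, 0.0, 0.0, 30.0, 0.0, 0.0])   # trial stress state [MPa]
print("yield function value:", quadric_yield(sigma, F_vm, f_lin))   # < 0: elastic
```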

Relevance:

60.00%

Publisher:

Abstract:

The objective of this study was to develop a criteria catalogue serving as a guideline for authors to improve the quality of reporting of experiments in basic research in homeopathy. A Delphi process was initiated, comprising three rounds of adjusting and phrasing plus two consensus conferences. European researchers who had published experimental work within the last 5 years were involved. The resulting checklist for authors provides a catalogue of 23 criteria. The “Introduction” should focus on the underlying hypotheses and the homeopathic principle investigated, and state whether the experiments are exploratory or confirmatory. “Materials and methods” should comprise information on the object of investigation, experimental setup, parameters, intervention and statistical methods. A more detailed description of the homeopathic substances, for example manufacture, dilution method and starting point of dilution, is required. A further result of the Delphi process is to raise scientists' awareness of reporting blinding, allocation, replication, quality control and system performance controls. The “Results” section should provide the exact number of treated units per setting that were included in each analysis and state missing samples and dropouts. Results presented in tables and figures are as important as appropriate measures of effect size, uncertainty and probability. The “Discussion” should provide more than a general interpretation of the results in the context of current evidence; it should also address limitations and include an appraisal of the aptness of the chosen experimental model. Authors of homeopathic basic research publications are encouraged to apply our checklist when preparing their manuscripts. Feedback is encouraged on the applicability, strengths and limitations of the list to enable future revisions.

Relevance:

60.00%

Publisher:

Abstract:

A number of advances in our understanding of the pathophysiology of bacterial meningitis have been made in recent years. In vivo studies have shown that bacterial cell wall fragments and endotoxins are highly active components, independent of the presence of viable bacteria in the subarachnoid space. Their presence in the cerebrospinal fluid is associated with the induction of inflammation and with the development of brain edema and increased intracranial pressure. Antimicrobial therapy may cause an additional increase of harmful bacterial products in the cerebrospinal fluid and thereby potentiate these pathophysiological alterations. These changes may contribute to the development of brain damage during meningitis. Some promising experimental work has been directed toward counteracting the above phenomena with non-steroidal or steroidal anti-inflammatory agents as well as with monoclonal antibodies. Although considerable advances have been made, further research needs to be done in these areas to improve the prognosis of bacterial meningitis.

Relevance:

60.00%

Publisher:

Abstract:

The study is based on experimental work conducted in alpine snow. We made microwave radiometric and near-infrared reflectance measurements of snow slabs under different experimental conditions. We used an empirical relation to link the near-infrared reflectance of snow to the specific surface area (SSA), and converted the SSA into the correlation length. From the measurements of snow radiances at 21 and 35 GHz, we derived the microwave scattering coefficient by inverting two coupled radiative transfer models (the sandwich and six-flux models). The correlation lengths found are in the same range as those determined in the literature using cold laboratory work. The technique shows great potential for determining the snow correlation length under field conditions.
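For orientation, the conversion from SSA to a correlation length is often written via the Debye relation l_ex = 4(1 − φ)/(ρ_ice · SSA), with φ the ice volume fraction. The sketch below assumes this commonly used relation and illustrative density values; it is not necessarily the exact empirical relation or the measurements reported in the abstract.

```python
# Minimal sketch: convert snow specific surface area (SSA) to an exponential
# correlation length using the Debye relation
#   l_ex = 4 * (1 - phi) / (rho_ice * SSA),  phi = rho_snow / rho_ice.
# The relation and the numerical values below are illustrative assumptions,
# not the empirical relation or measurements reported in the study.

RHO_ICE = 917.0  # density of ice [kg/m^3]

def correlation_length(ssa_m2_per_kg: float, rho_snow: float) -> float:
    """Correlation length in metres for a given SSA [m^2/kg] and snow density [kg/m^3]."""
    phi = rho_snow / RHO_ICE                      # ice volume fraction
    return 4.0 * (1.0 - phi) / (RHO_ICE * ssa_m2_per_kg)

# Example: a typical alpine snow slab (assumed values)
ssa = 25.0          # [m^2/kg]
rho_snow = 300.0    # [kg/m^3]
print(f"correlation length ~ {correlation_length(ssa, rho_snow) * 1e3:.3f} mm")
```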

Relevance:

60.00%

Publisher:

Abstract:

Synaesthesia is a condition in which a stimulus elicits an additional subjective experience. For example, the letter E printed in black (the inducer) may trigger an additional colour experience as a concurrent (e.g., blue). Synaesthesia tends to run in families and thus a genetic component is likely. However, given that the stimuli that typically induce synaesthesia are cultural artefacts, a learning component must also be involved. Moreover, there is evidence that synaesthetic experiences not only activate brain areas typically involved in processing sensory input of the concurrent modality; synaesthesia seems to cause a structural reorganisation of the brain. Attempts to train non-synaesthetes with synaesthetic associations have been successful in mimicking certain behavioural aspects, and posthypnotic induction of synaesthetic experiences in non-synaesthetes has even led to corresponding phenomenological reports. These latter findings suggest that structural brain reorganisation may not be a critical precondition, but rather a consequence of the sustained coupling of inducers and concurrents. Interestingly, synaesthetes seem to be able to easily transfer synaesthetic experiences to novel stimuli. Beyond this, certain drugs (e.g., LSD) can lead to synaesthesia-like experiences and may provide additional insights into the neurobiological basis of the condition. Furthermore, brain damage can lead both to a sudden presence of synaesthetic experiences in previously non-synaesthetic individuals and to a sudden absence of synaesthesia in previously synaesthetic individuals. Moreover, enduring sensory substitution has been effective in inducing a kind of acquired synaesthesia. Besides informing us about the cognitive mechanisms of synaesthesia, synaesthesia research is relevant for more general questions, for example questions about consciousness (such as the binding problem), about crossmodal correspondences, and about how individual differences in perceiving and experiencing the world develop. Hence the aim of the current Research Topic is to provide novel insights into the development of synaesthesia in both its genuine and acquired forms. We welcome novel experimental work and theoretical contributions (e.g., review and opinion articles) focusing on factors such as brain maturation, learning, training, hypnosis, drugs, sensory substitution and brain damage and their relation to the development of any form of synaesthesia.

Relevance:

60.00%

Publisher:

Abstract:

It is not sufficiently understood why some lineages of cichlid fishes have proliferated in the Great Lakes of East Africa much more than anywhere else in the world, and much faster than other cichlid lineages or any other group of freshwater fish. Recent field and experimental work on Lake Victoria haplochromines suggests that mate choice-mediated disruptive sexual selection on coloration, which can cause speciation even in the absence of geographical isolation, may explain it. We summarize the evidence and propose a hypothesis for the genetics of coloration that may help understand the phenomenon. By defining colour patterns by hue and the arrangement of hues on the body, we could assign almost all observed phenotypes of Lake Victoria cichlids to one of three female («plain», «orange blotched», «black and white») and three male («blue», «red-ventrum», «red-dorsum») colour patterns. These patterns diagnose species but frequently co-occur also as morphs within the same population, where they are associated with variation in mate preferences and appear to be transient stages in speciation. The male patterns in particular occur in almost every genus of the species flock. We propose that the patterns and their association into polymorphisms express an ancestral trait that is retained across speciation. Our model for male colour pattern assumes two structural loci. When both are switched off, the body is blue. When switched on by a cascade of polymorphic regulatory genes, one expresses a yellow to red ventrum, the other a yellow to red dorsum. The expression of colour variation initiates speciation. The blue daughter species will inherit the variation at the regulatory genes, which can, without new mutational events and purely by recombination, again expose the colour polymorphism, starting the process anew. Very similar colour patterns also dominate among the Mbuna of Lake Malawi. In contrast, similar colour polymorphisms do not exist in the lineages that have not proliferated in the Great Lakes. The colour pattern polymorphism may be an ancient trait in the lineage (or lineages) that gave rise to the two large haplochromine radiations. We propose two tests of our hypothesis.
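A minimal sketch of the two-structural-locus logic proposed above is given below; the boolean on/off representation, the function name and the listing of the case with both loci switched on are illustrative assumptions rather than part of the published genetic model.

```python
# Minimal sketch of the proposed two-structural-locus logic for male colour
# patterns: the body is blue when both loci are switched off; switching one on
# (via polymorphic regulatory genes) expresses a yellow-to-red ventrum,
# switching the other on expresses a yellow-to-red dorsum.
# The on/off booleans and names are illustrative assumptions.

def male_colour_pattern(ventrum_locus_on: bool, dorsum_locus_on: bool) -> str:
    parts = []
    if ventrum_locus_on:
        parts.append("red-ventrum")
    if dorsum_locus_on:
        parts.append("red-dorsum")
    return " + ".join(parts) if parts else "blue"

# Enumerate the phenotypes implied by the two switches
for ventrum in (False, True):
    for dorsum in (False, True):
        print(f"ventrum on={ventrum}, dorsum on={dorsum} -> {male_colour_pattern(ventrum, dorsum)}")
```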

Relevance:

60.00%

Publisher:

Abstract:

Each year about 650,000 Europeans die from stroke and a similar number live with the sequelae of multiple sclerosis (MS). Stroke and MS differ in their etiology. Although cause and likewise clinical presentation set the two diseases apart, they share common downstream mechanisms that lead to damage and recovery. Demyelination and axonal injury are characteristics of MS but are also observed in stroke. Conversely, hallmarks of stroke, such as vascular impairment and neurodegeneration, are found in MS. However, the most conspicuous common feature is the marked neuroinflammatory response, characterized by glial cell activation and immune cell influx. In MS and stroke the blood-brain barrier (BBB) is disrupted, allowing bone marrow-derived macrophages to invade the brain in support of the resident microglia. In addition, there is a massive invasion of auto-reactive T-cells into the brain of patients with MS. Though less pronounced, a similar phenomenon is also found in ischemic lesions. Not surprisingly, the two diseases also resemble each other at the level of gene expression and the biosynthesis of other proinflammatory mediators. While MS has traditionally been considered to be an autoimmune neuroinflammatory disorder, the role of inflammation in cerebral ischemia has only been recognized later. In the case of MS, the long track record as a neuroinflammatory disease has paid off with respect to treatment options. There are now about a dozen approved drugs for the treatment of MS that specifically target neuroinflammation by modulating the immune system. Interestingly, experimental work demonstrated that drugs that are in routine use to mitigate neuroinflammation in MS may also work in stroke models. Examples include fingolimod, glatiramer acetate, and antibodies blocking the leukocyte integrin VLA-4. Moreover, therapeutic strategies that were discovered in experimental autoimmune encephalomyelitis (EAE), the animal model of MS, turned out to be effective in experimental stroke models as well. This suggests that previous achievements in MS research may be relevant for stroke. Interestingly, the converse is equally true. Concepts on the neurovascular unit that were developed in a stroke context turned out to be applicable to neuroinflammatory research in MS. Examples include work on the important role of the vascular basement membrane and the BBB for the invasion of immune cells into the brain. Furthermore, tissue plasminogen activator (tPA), the only established drug treatment in acute stroke, modulates the pathogenesis of MS. Endogenous tPA is released from endothelium and astroglia and acts on the BBB, microglia and other neuroinflammatory cells. Thus, the vascular perspective of stroke research provides important input into the mechanisms of how endothelial cells and the BBB regulate inflammation in MS, particularly the invasion of immune cells into the CNS. In the current review we will first discuss the pathogenesis of both diseases and current treatment regimens, and will then provide a detailed overview of pathways of immune cell migration across the barriers of the CNS and the role of activated astrocytes in this process. This article is part of a Special Issue entitled: Neuro inflammation: A common denominator for stroke, multiple sclerosis and Alzheimer's disease, guest edited by Helga de Vries and Markus Swaninger.

Relevance:

60.00%

Publisher:

Abstract:

I argue that scientific realism, insofar as it is only committed to those scientific posits of which we have causal knowledge, is immune to Kyle Stanford’s argument from unconceived alternatives. This causal strategy (previously introduced, but not worked out in detail, by Anjan Chakravartty) is shown not to repeat the shortcomings of previous realist responses to Stanford’s argument. Furthermore, I show that the notion of causal knowledge underlying it can be made sufficiently precise by means of conceptual tools recently introduced into the debate on scientific realism. Finally, I apply this strategy to the case of Jean Perrin’s experimental work on the atomic hypothesis, disputing Stanford’s claim that the problem of unconceived alternatives invalidates a realist interpretation of this historical episode.

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: V2-receptor (V2R) stimulation potentially aggravates sepsis-induced vasodilation, fluid accumulation and microvascular thrombosis. Therefore, the present study was performed to determine the effects of a first-line therapy with the selective V2R antagonist (Propionyl1-D-Tyr(Et)2-Val4-Abu6-Arg8,9)-vasopressin on cardiopulmonary hemodynamics and organ function versus the mixed V1aR/V2R agonist arginine vasopressin (AVP) or placebo in an established ovine model of septic shock.

METHODS: After the onset of septic shock, chronically instrumented sheep were randomly assigned to receive first-line treatment with the selective V2R antagonist (1 µg/kg per hour), AVP (0.05 µg/kg per hour), or normal saline (placebo; n = 7 each). In all groups, open-label norepinephrine was additionally titrated up to 1 µg/kg per minute to maintain mean arterial pressure at 70 ± 5 mmHg, if necessary.

RESULTS: Compared with AVP- and placebo-treated animals, the selective V2R antagonist stabilized cardiopulmonary hemodynamics (mean arterial and pulmonary artery pressure, cardiac index) as effectively and increased intravascular volume, as suggested by higher cardiac filling pressures. Furthermore, left ventricular stroke work index was higher in the V2R-antagonist group than in the AVP group. Notably, metabolic (pH, base excess, lactate concentrations), liver (transaminases, bilirubin) and renal (creatinine and blood urea nitrogen plasma levels, urinary output, creatinine clearance) dysfunctions were attenuated by the V2R antagonist when compared with AVP and placebo. The onset of septic shock was associated with an increase in AVP plasma levels compared to baseline in all groups. Whereas AVP plasma levels remained constant in the placebo group, infusion of AVP increased AVP plasma levels up to 149 ± 21 pg/mL. Notably, treatment with the selective V2R antagonist led to a significant decrease of AVP plasma levels as compared to shock time (P < 0.001) and to both other groups (P < 0.05 vs. placebo; P < 0.001 vs. AVP). Immunohistochemical analyses of lung tissue revealed higher heme oxygenase-1 (vs. placebo) and lower 3-nitrotyrosine concentrations (vs. AVP) in the V2R-antagonist group. In addition, the selective V2R antagonist slightly prolonged survival (14 ± 1 hours) compared to AVP (11 ± 1 hours, P = 0.007) and placebo (11 ± 1 hours, P = 0.025).

CONCLUSIONS: Selective V2R antagonism may represent an innovative therapeutic approach to attenuate multiple organ dysfunction in early septic shock.

Relevance:

30.00%

Publisher:

Abstract:

Despite the availability of effective antibiotic therapies, pneumococcal meningitis (PM) has a case fatality rate of up to 30% and causes neurological sequelae in up to half of the surviving patients. The underlying brain damage includes apoptosis of neurons in the hippocampus and necrosis in the cortex. Therapeutic options to reduce acute injury and to improve outcome from PM are severely limited. With the aim of developing new therapies, a number of pharmacologic interventions have been evaluated. However, the often unpredictable outcome of interventional studies suggests that the current concept of the pathophysiologic events during bacterial meningitis is fragmentary. The aim of this work is to describe the transcriptomic changes underlying the complex mechanisms of the host response to pneumococcal meningitis in a temporal and spatial context, using a well-characterized infant rat model.

Relevance:

30.00%

Publisher:

Abstract:

Long-standing rotator cuff tendon tearing is associated with retraction, loss of work capacity, irreversible fatty infiltration, and atrophy of the rotator cuff muscles. Although continuous musculotendinous relengthening can experimentally restore the muscular architecture, reversal of the atrophy and fatty infiltration has hitherto been impossible.

Relevance:

30.00%

Publisher:

Abstract:

Osteoporosis-related vertebral fractures represent a major health problem in elderly populations. Such fractures can often only be diagnosed after a substantial deformation history of the vertebral body. It therefore remains a challenge for clinicians to distinguish between stable and progressive, potentially harmful fractures. Accordingly, novel criteria for the selection of the appropriate conservative or surgical treatment are urgently needed. Computed tomography-based finite element analysis is an increasingly accepted method to predict quasi-static vertebral strength and to follow up this small-strain property longitudinally in time. A recent development in constitutive modeling allows us to simulate strain localization and densification in trabecular bone under large compressive strains without mesh dependence. The aim of this work was to validate this recently developed constitutive model of trabecular bone for the prediction of strain localization and densification in the human vertebral body subjected to large compressive deformation. A custom-made stepwise loading device mounted in a high-resolution peripheral computed tomography system was used to describe the progressive collapse of 13 human vertebrae under axial compression. Continuum finite element analyses of the 13 compression tests were performed and the zones of high volumetric strain were compared with the experiments. A fair qualitative correspondence of the strain localization zone between experiment and finite element analysis was achieved in 9 out of 13 tests, and significant correlations of the volumetric strains were obtained throughout the range of applied axial compression. Interestingly, the stepwise propagating localization zones in trabecular bone converged to the buckling locations in the cortical shell. While the adopted continuum finite element approach still suffers from several limitations, these encouraging preliminary results towards the prediction of extended vertebral collapse may help in assessing fracture stability in future work.
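As a minimal illustration of the volumetric-strain quantity compared between simulation and experiment, the sketch below computes a logarithmic volumetric strain ln(det F) from an element's deformation gradient F; the choice of strain measure and the example numbers are assumptions for illustration, not the definitions used in the study.

```python
import numpy as np

# Minimal sketch: volumetric strain of a finite element from its deformation
# gradient F, using the logarithmic measure eps_vol = ln(det F).
# Negative values indicate compaction (densification), as in the collapsing
# trabecular bone described above. The strain measure and the example
# deformation gradient are illustrative assumptions.

def volumetric_strain(F: np.ndarray) -> float:
    """Logarithmic volumetric strain from a 3x3 deformation gradient."""
    J = np.linalg.det(F)          # local volume change ratio
    return float(np.log(J))

# Example: 30% axial compression with slight lateral bulging (assumed values)
F = np.diag([1.05, 1.05, 0.70])
print(f"J = {np.linalg.det(F):.3f}, volumetric strain = {volumetric_strain(F):.3f}")
```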

Relevance:

30.00%

Publisher:

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare.

Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it is shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker states to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It is shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions.

I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller. Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, much in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions.

Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.

In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
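To make the partition-function representation used in Chapter 3 concrete, the sketch below encodes a small three-player partition function as a mapping from (coalition, coalition structure) to worth and reads off the externality that a merger imposes on an outside player; the game and the numbers are invented purely for illustration and are not taken from the thesis.

```python
# Minimal sketch of a partition function for a three-player coalition formation
# game: the worth of a coalition depends on the whole coalition structure,
# which is what allows externalities (e.g. free-riding gains) to be expressed.
# All numerical worths below are invented purely for illustration.

def cs(*coalitions):
    """A coalition structure: a frozenset of coalitions (frozensets of players)."""
    return frozenset(frozenset(c) for c in coalitions)

partition_function = {
    # grand coalition
    (frozenset({1, 2, 3}), cs({1, 2, 3})): 12.0,
    # players 1 and 2 together, player 3 alone
    (frozenset({1, 2}), cs({1, 2}, {3})): 7.0,
    (frozenset({3}),    cs({1, 2}, {3})): 4.0,
    # everyone alone
    (frozenset({1}), cs({1}, {2}, {3})): 3.0,
    (frozenset({2}), cs({1}, {2}, {3})): 3.0,
    (frozenset({3}), cs({1}, {2}, {3})): 3.0,
}

# Externality on player 3 when players 1 and 2 form a coalition: 3's stand-alone
# worth rises from 3 to 4, so 3 has an incentive to free ride even though the
# grand coalition is efficient in this example (12 > 7 + 4).
before = partition_function[(frozenset({3}), cs({1}, {2}, {3}))]
after = partition_function[(frozenset({3}), cs({1, 2}, {3}))]
print("externality on player 3:", after - before)   # 1.0 > 0
```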