931 results for Recent Structural Models


Relevance: 30.00%

Publisher:

Abstract:

In evaluating the accuracy of diagnostic tests, it is common to apply two imperfect tests jointly or sequentially to a study population. In a recent meta-analysis of the accuracy of microsatellite instability testing (MSI) and traditional mutation analysis (MUT) in predicting germline mutations of the mismatch repair (MMR) genes, a Bayesian approach (Chen, Watson, and Parmigiani 2005) was proposed to handle missing data resulting from partial testing and the lack of a gold standard. In this paper, we demonstrate improved estimation of the sensitivities and specificities of MSI and MUT by using a nonlinear mixed model and a Bayesian hierarchical model, both of which account for heterogeneity across studies through study-specific random effects. The methods can be used to estimate the accuracy of two imperfect diagnostic tests in other meta-analyses when the prevalence of disease, the sensitivities, and/or the specificities of the diagnostic tests are heterogeneous among studies. Furthermore, simulation studies demonstrate the importance of carefully selecting appropriate random effects for the estimation of diagnostic accuracy measures in this setting.
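
The study-specific random effects described above can be illustrated with a minimal sketch, assuming hypothetical data and a single test: a logit-normal random-effects model for sensitivity fitted by maximum likelihood with Gauss-Hermite quadrature (the paper itself models two tests jointly and uses Bayesian estimation):

# Illustrative sketch (hypothetical data): a logit-normal random-effects model
# for the sensitivity of a single test across heterogeneous studies, fitted by
# maximum likelihood with Gauss-Hermite quadrature.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(1)

# Simulate K studies: true logit-sensitivity varies around a population mean.
K, mu_true, tau_true = 12, 1.4, 0.6            # mean logit-sensitivity ~ 0.80
n_diseased = rng.integers(30, 200, size=K)     # diseased subjects per study
theta_k = rng.normal(mu_true, tau_true, K)     # study-specific random effects
tp = rng.binomial(n_diseased, expit(theta_k))  # true positives per study

nodes, weights = np.polynomial.hermite.hermgauss(30)

def neg_log_lik(params):
    mu, log_tau = params
    tau = np.exp(log_tau)
    # Integrate the binomial kernel over the normal random effect.
    t = mu + np.sqrt(2.0) * tau * nodes        # quadrature points on the logit scale
    p = expit(t)                               # sensitivity at each node
    ll = 0.0
    for k in range(K):
        lik_k = p**tp[k] * (1.0 - p)**(n_diseased[k] - tp[k])
        ll += np.log(np.sum(weights * lik_k) / np.sqrt(np.pi))
    return -ll

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, tau_hat = fit.x[0], np.exp(fit.x[1])
print(f"pooled sensitivity ~ {expit(mu_hat):.3f}, between-study SD (logit) ~ {tau_hat:.3f}")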

Relevance: 30.00%

Publisher:

Abstract:

An appropriate model of recent human evolution is not only important for understanding our own history; it is also necessary for disentangling the effects of demography and selection on genome diversity. Although most genetic data support the view that our species originated recently in Africa, it is still unclear whether it completely replaced former members of the Homo genus or whether some interbreeding occurred during its range expansion. Several scenarios of modern human evolution have been proposed on the basis of molecular and paleontological data, but their likelihood has never been statistically assessed. Using DNA data from 50 nuclear loci sequenced in African, Asian, and Native American samples, we show here by extensive simulations that a simple African replacement model with exponential growth has a higher probability (78%) than alternative multiregional evolution or assimilation scenarios. A Bayesian analysis of the data under this best-supported model points to an origin of our species approximately 141 thousand years ago (Kya), an exit out of Africa approximately 51 Kya, and a recent colonization of the Americas approximately 10.5 Kya. We also find that the African replacement model explains not only the shallow ancestry of mtDNA or Y chromosomes but also the occurrence of deep lineages at some autosomal loci, which had formerly been interpreted as a sign of interbreeding with Homo erectus.
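
A toy illustration of this kind of simulation-based model comparison is sketched below, assuming deliberately simplistic stand-in models (a constant-size versus an expanding population generating pairwise coalescence times) and a rejection-sampling scheme; the study itself simulates 50 sequenced loci under far richer demographic scenarios:

# Toy illustration of simulation-based (ABC-style) model comparison between
# two competing scenarios; the "demographic models" are crude stand-ins.
import numpy as np

rng = np.random.default_rng(7)

def simulate_diversity(model, n_loci=50):
    """Return mean pairwise coalescence time across loci (arbitrary units)."""
    if model == "constant":
        t = rng.exponential(1.0, n_loci)   # standard coalescent, constant size
    else:
        # crude proxy for growth: recent expansion yields shorter,
        # more star-like coalescence times
        t = rng.exponential(0.4, n_loci)
    return t.mean()

observed = simulate_diversity("growth")    # pretend this is the observed data

n_sims, tol = 20_000, 0.05
accepted = {"constant": 0, "growth": 0}
for _ in range(n_sims):
    model = rng.choice(["constant", "growth"])   # equal prior on the two models
    if abs(simulate_diversity(model) - observed) < tol:
        accepted[model] += 1

total = sum(accepted.values())
for m, k in accepted.items():
    print(f"relative support for {m}: {k / total:.2f}")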

Relevance: 30.00%

Publisher:

Abstract:

Rabbit models of bacterial meningitis have contributed substantially to our understanding of the disease, although the technical characteristics of these models only allow the study of specific aspects of the disease. Bacterial multiplication in the subarachnoid space is not substantially influenced by host defense mechanisms, mainly because of the lack of sufficient amounts of specific antibodies and functional complement in infected CSF. The multiplying bacteria induce profound changes in the blood-brain barrier, an influx of serum proteins into the CSF, and the invasion of polymorphonuclear leukocytes at the site of the infection. The presence of polymorphonuclear leukocytes in CSF not only appears to be of limited value in combating the infection but also seems to produce deleterious effects on the central nervous system. Components of the leukocytes, such as unsaturated fatty acids, arachidonic acid metabolites, and free oxygen radicals, may contribute to the profound hydrodynamic, structural, and metabolic changes that are currently under study in experimental models of the disease. A better understanding of the pathophysiology of bacterial meningitis may allow us to design more effective therapeutic strategies and improve the outcome of this disease.

Relevance: 30.00%

Publisher:

Abstract:

High flexural strength and stiffness can be achieved by forming a thin panel into a wave shape perpendicular to the bending direction. The use of corrugated shapes to gain flexural strength and stiffness is common in metal and reinforced plastic products. However, there is no commercial production of corrugated wood composite panels. This research focuses on the application of corrugated shapes to wood strand composite panels. Beam theory, classical plate theory, and finite element models were used to analyze the bending behavior of corrugated panels. The most promising shallow corrugated panel configuration was identified based on structural performance and compatibility with construction practices. The corrugation profile selected has a wavelength of 8”, a channel depth of ¾”, a sidewall angle of 45 degrees, and a panel thickness of 3/8”. Panels measuring 16”x16” were produced using random mats and 3-layer aligned mats with surface flakes parallel to the channels. Strong-axis and weak-axis bending tests were conducted. The test results indicate that flake orientation has little effect on the strong-axis bending stiffness. The 3/8” thick random-mat corrugated panels exhibit bending stiffness (400,000 lbs-in2/ft) and bending strength (3,000 in-lbs/ft) higher than 23/32” or 3/4” thick APA Rated Sturd-I-Floor with a 24” o.c. span rating. Shear and bearing test results show that the corrugated panel can withstand more than 50 psf of uniform load at 48” joist spacing. Molding trials on 16”x16” panels provided data for full-size panel production. Full-size 4’x8’ shallow corrugated panels were produced with only minor changes to the current oriented strandboard manufacturing process. Panel testing was done to simulate floor loading during construction, without a top underlayment layer, and during occupancy, with an underlayment over the panel to form a composite deck. Flexural tests were performed in single-span and two-span bending with line loads applied at mid-span. The average strong-axis bending stiffness and bending strength of the full-size corrugated panels (without the underlayment) were over 400,000 lbs-in2/ft and 3,000 in-lbs/ft, respectively. The composite deck system, which consisted of OSB sheathing (15/32” thick) nail-glued (using 3d ringshank nails and AFG-01 subfloor adhesive) to the corrugated subfloor, achieved about 60% of the full composite stiffness, resulting in about 3 times the bending stiffness of the corrugated subfloor (1,250,000 lbs-in2/ft). Based on the LRFD design criteria, the corrugated composite floor system can carry 40 psf of unfactored uniform load, limited by the L/480 deflection limit state, at 48” joist spacing. Four 10-ft-long composite T-beam specimens were built and tested to assess the composite action and load sharing between a 24” wide corrugated deck system and the supporting I-joist. The average bending stiffness of the composite T-beam was 1.6 times higher than the bending stiffness of the I-joist. An 8-ft x 12-ft mock-up floor was built to evaluate construction procedures. The assembly of the composite floor system is relatively simple. The corrugated composite floor system might be able to offset the lower labor costs of the single-layer Sturd-I-Floor through material savings. However, no conclusive result can be drawn, in terms of construction costs, at this point without an in-depth cost analysis of the two systems. The shallow corrugated composite floor system might be a potential alternative to the Sturd-I-Floor in the near future because of the excellent flexural stiffness it provides.
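
As a rough plausibility check on the reported strong-axis stiffness, the sketch below computes the thin-walled moment of inertia of the stated corrugation geometry; the effective panel MOE of about 700,000 psi is an assumed value, not one taken from the study, and the channel depth is assumed to refer to the mid-surface rise:

# Back-of-envelope check of the corrugated panel's strong-axis bending
# stiffness using the thin-walled approximation I = integral of t*z^2 along
# the mid-surface. Geometry follows the abstract (8" wavelength, 3/4" channel
# depth, 45-degree sidewalls, 3/8" thickness); E ~ 700,000 psi is assumed.
import numpy as np

wavelength, depth, t, E = 8.0, 0.75, 0.375, 700_000.0    # inches, psi

# Mid-surface breakpoints (x, z) over one wavelength: flat, sidewall, flat, sidewall.
flat = (wavelength - 2.0 * depth) / 2.0                  # 45 deg: horizontal run equals depth
x = np.array([0.0, flat, flat + depth, 2 * flat + depth, wavelength])
z = np.array([+depth / 2, +depth / 2, -depth / 2, -depth / 2, +depth / 2])

# Discretize the profile and accumulate t * z^2 * ds (neglecting the t^3/12 term).
I_wave = 0.0
for i in range(len(x) - 1):
    xs = np.linspace(x[i], x[i + 1], 200)
    zs = np.linspace(z[i], z[i + 1], 200)
    ds = np.hypot(np.diff(xs), np.diff(zs))
    zm = 0.5 * (zs[1:] + zs[:-1])
    I_wave += np.sum(t * zm**2 * ds)

I_per_ft = I_wave / wavelength * 12.0                    # in^4 per foot of width
print(f"I  ~ {I_per_ft:.3f} in^4/ft")
print(f"EI ~ {E * I_per_ft:,.0f} lb-in^2/ft   (abstract reports ~400,000)")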

Relevance: 30.00%

Publisher:

Abstract:

Reducing the uncertainties related to blade dynamics by improving the quality of numerical simulations of the fluid-structure interaction process is key to a breakthrough in wind-turbine technology. A fundamental step in that direction is the implementation of aeroelastic models capable of capturing the complex features of innovative prototype blades, so they can be tested at realistic full-scale conditions with a reasonable computational cost. We make use of a code based on a combination of two advanced numerical models implemented in a parallel HPC supercomputer platform. First, a model of the structural response of heterogeneous composite blades, based on a variation of the dimensional reduction technique proposed by Hodges and Yu. This technique reduces the geometrical complexity of the blade section to a stiffness matrix for an equivalent beam. The reduced 1-D strain energy is equivalent to the actual 3-D strain energy in an asymptotic sense, allowing accurate modeling of the blade structure as a 1-D finite-element problem. This substantially reduces the computational effort required to model the structural dynamics at each time step. Second, a novel aerodynamic model based on an advanced implementation of BEM (Blade Element Momentum) theory, in which all velocities and forces are re-projected through orthogonal matrices into the instantaneous deformed configuration to fully include the effects of large displacements and rotation of the airfoil sections in the computation of aerodynamic forces. This allows the aerodynamic model to take into account the effects of the complex flexo-torsional deformation that can be captured by the more sophisticated structural model mentioned above. In this thesis we have successfully developed a powerful computational tool for the aeroelastic analysis of wind-turbine blades. Owing to the features mentioned above, in particular the full representation of the combined modes of deformation of the blade as a complex structural part and their effects on the aerodynamic loads, it constitutes a substantial advance over the state-of-the-art aeroelastic models currently available, such as the FAST-Aerodyn suite. In this thesis, we also include the results of several experiments on the NREL-5MW blade, which is widely accepted today as a benchmark blade, together with some modifications intended to explore the capacities of the new code in capturing features of blade-dynamic behavior that are normally overlooked by existing aeroelastic models.
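
A minimal sketch of the re-projection idea described above is given below; the 2-D geometry, the numerical values, and the thin-airfoil lift slope are illustrative assumptions rather than the thesis implementation:

# Sketch: project the relative velocity into the instantaneous deformed
# configuration of a blade section before evaluating aerodynamic loads.
import numpy as np

rho, chord = 1.225, 3.0                         # air density (kg/m^3), chord (m)
v_rel = np.array([10.0, 60.0])                  # relative velocity: axial, tangential (m/s)
twist = np.radians(5.0)                         # built-in twist + pitch
torsion = np.radians(2.5)                       # instantaneous elastic torsion

theta = twist + torsion                         # orientation of the deformed section
chord_dir = np.array([np.sin(theta), np.cos(theta)])    # unit vector along the chord
normal_dir = np.array([np.cos(theta), -np.sin(theta)])  # unit vector normal to the chord
R = np.vstack([chord_dir, normal_dir])          # orthogonal matrix: global -> section frame

v_sec = R @ v_rel                               # velocity seen by the deformed section
alpha = np.arctan2(v_sec[1], v_sec[0])          # angle of attack in the deformed frame
cl = 2.0 * np.pi * alpha                        # thin-airfoil lift slope (placeholder polar)
lift = 0.5 * rho * (v_rel @ v_rel) * chord * cl # lift per unit span (N/m)

print(f"alpha = {np.degrees(alpha):.2f} deg, lift = {lift:.0f} N/m")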

Relevance: 30.00%

Publisher:

Abstract:

Transmission electron microscopy has provided most of what is known about the ultrastructural organization of tissues, cells, and organelles. Thanks to tremendous advances in crystallography and magnetic resonance imaging, almost any protein can now be modeled at atomic resolution. To fully understand the workings of biological "nanomachines" it is necessary to obtain images of intact macromolecular assemblies in situ. Although the resolving power of electron microscopes is on the atomic scale, in biological samples artifacts introduced by aldehyde fixation, dehydration, and staining, as well as section thickness, reduce it to a few nanometers. Cryofixation by high-pressure freezing circumvents many of these artifacts, since it allows the vitrification of biological samples of about 200 μm in thickness and immobilizes complex macromolecular assemblies in their native state in situ. To exploit the perfect structural preservation of frozen hydrated sections, sophisticated instruments are needed, e.g., high-voltage electron microscopes equipped with precise goniometers that work at low temperature and digital cameras of high sensitivity and pixel count. With them, it is possible to generate high-resolution tomograms, i.e., 3D views of subcellular structures. This review describes the theory and applications of the high-pressure cryofixation methodology and compares its results with those of conventional procedures. Moreover, recent findings are discussed showing that molecular models of proteins can be fitted into the organellar ultrastructure depicted in images of frozen hydrated sections. High-pressure freezing of tissue is the basis that may lead to precise models of macromolecular assemblies in situ, and thus to a better understanding of the function of complex cellular structures.
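
To give a flavor of the tomogram-generation step mentioned above, the sketch below simulates a tilt series from a generic 2-D phantom and reconstructs it by filtered back-projection over a limited tilt range; it uses scikit-image and synthetic data, not cryo-electron tomography software or real sections:

# Illustrative tomographic reconstruction: simulated tilt series of a phantom,
# reconstructed by filtered back-projection with a "missing wedge".
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

section = rescale(shepp_logan_phantom(), 0.5)          # stand-in for a 2-D slice
tilt_angles = np.arange(-60.0, 61.0, 2.0)              # limited tilt range, degrees

sinogram = radon(section, theta=tilt_angles)           # simulated tilt series
tomogram = iradon(sinogram, theta=tilt_angles)         # filtered back-projection

err = np.sqrt(np.mean((tomogram - section) ** 2))
print(f"reconstruction RMS error with a 120-degree tilt range: {err:.3f}")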

Relevance: 30.00%

Publisher:

Abstract:

The objective of this doctoral research is to investigate internal frost damage due to crystallization pore pressure in porous cement-based materials by developing computational and experimental characterization tools. As an essential component of the U.S. infrastructure system, the durability of concrete has a significant impact on maintenance costs. In cold climates, freeze-thaw damage is a major issue affecting the durability of concrete. The deleterious effects of the freeze-thaw cycle depend on the microscale characteristics of concrete, such as the pore sizes and the pore distribution, as well as on the environmental conditions. Recent theories attribute internal frost damage of concrete to crystallization pore pressure in cold environments. Pore structure has a significant impact on the freeze-thaw durability of cement/concrete samples. Scanning electron microscopy (SEM) and transmission X-ray microscopy (TXM) were applied to characterize freeze-thaw damage within the pore structure. For the microscale pore system, the crystallization pressures at sub-cooling temperatures were calculated from an interface energy balance with thermodynamic analysis. Multi-phase Extended Finite Element Modeling (XFEM) and bilinear Cohesive Zone Modeling (CZM) were developed to simulate the internal frost damage of heterogeneous cement-based material samples. The fracture simulations with these two techniques were validated by comparing the predicted fracture behavior with the damage captured in compact tension (CT) and single-edge notched beam (SEB) bending tests. The study applied the developed computational tools to simulate the internal frost damage caused by ice crystallization using two-dimensional (2-D) SEM and three-dimensional (3-D) reconstructed SEM and TXM digital samples. The pore pressure calculated from the thermodynamic analysis was used as input for the model simulations. The 2-D and 3-D bilinear CZM predicted crack initiation and propagation within the cement paste microstructure. The favorable prediction of crack paths in concrete/cement samples indicates that the developed bilinear CZM techniques can capture crack nucleation and propagation in cement-based material samples with multiple phases and associated interfaces. Comparison of the computational predictions with the actual damaged samples also indicates that ice crystallization pressure is the main mechanism of internal frost damage in cementitious materials.
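
Two of the ingredients named above can be sketched briefly; the property values used here are nominal assumptions, not the study's inputs: (i) a rough estimate of the crystallization pressure available at a given undercooling, and (ii) a bilinear cohesive traction-separation law of the kind used in the CZM fracture simulations:

# (i) Crystallization pressure at undercooling dT (very rough):
#     p ~ dS_fv * dT, with dS_fv the volumetric entropy of fusion of ice.
import numpy as np

dH_f, rho_ice, T_m = 334e3, 917.0, 273.15      # J/kg, kg/m^3, K (nominal values)
dS_fv = dH_f * rho_ice / T_m                   # ~1.1 MPa per kelvin of undercooling
for dT in (2.0, 5.0, 10.0):
    print(f"undercooling {dT:4.1f} K -> crystallization pressure ~ {dS_fv * dT / 1e6:5.1f} MPa")

# (ii) Bilinear traction-separation law: linear rise to the cohesive strength
#      at delta_0, then linear softening to zero traction at delta_f.
t_max   = 3.0e6        # cohesive strength, Pa (assumed)
delta_0 = 2.0e-6       # separation at peak traction, m (assumed)
delta_f = 40.0e-6      # separation at full decohesion, m (assumed)

def traction(delta):
    delta = np.asarray(delta, dtype=float)
    rise   = t_max * delta / delta_0
    soften = t_max * (delta_f - delta) / (delta_f - delta_0)
    return np.where(delta <= delta_0, rise, np.clip(soften, 0.0, None))

G_f = 0.5 * t_max * delta_f                    # area under the curve = fracture energy
print(f"fracture energy of this law: {G_f:.1f} J/m^2")
print("traction at 1, 10, 50 um:", traction([1e-6, 10e-6, 50e-6]) / 1e6, "MPa")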

Relevance: 30.00%

Publisher:

Abstract:

The Pacaya volcanic complex is part of the Central American volcanic arc, which is associated with the subduction of the Cocos tectonic plate beneath the Caribbean plate. Located 30 km south of Guatemala City, Pacaya is situated on the southern rim of the Amatitlan Caldera. It is the largest post-caldera volcano and has been one of Central America’s most active volcanoes over the last 500 years. Between 400 and 2000 years B.P., the Pacaya volcano experienced a major collapse, which resulted in the formation of a horseshoe-shaped scarp that is still visible. In recent years, several smaller collapses (in 1961 and 2010) have been associated with the activity of the volcano, affecting its northwestern flanks; these are likely to have been induced by local and regional stress changes. The similar orientation of dry and volcanic fissures and the distribution of new vents would likely explain the reactivation of the pre-existing stress configuration responsible for the old collapse. This paper presents the first stability analysis of the Pacaya volcanic flank. The inputs for the geological and geotechnical models were defined based on stratigraphic, lithological, and structural data and on material properties obtained from field surveys and laboratory tests. According to their mechanical characteristics, three lithotechnical units were defined: Lava, Lava-Breccia, and Breccia-Lava. The Hoek-Brown failure criterion was applied to each lithotechnical unit, and the rock-mass friction angle, apparent cohesion, and strength and deformation characteristics were computed over a specified stress range. Further, the stability of the volcano was evaluated by two-dimensional analyses performed with the Limit Equilibrium Method (LEM, ROCSCIENCE) and the Finite Element Method (FEM, PHASE 2 7.0). The stability analysis mainly focused on the modern Pacaya volcano built inside the collapse amphitheatre of “Old Pacaya”. The volcanic instability was assessed based on the variability of the safety factor using deterministic, sensitivity, and probabilistic analyses, considering gravitational instability and the effects of external forces such as magma pressure and seismicity as potential triggering mechanisms of lateral collapse. The preliminary results of the analysis provide two insights: first, the least stable sector is the south-western flank of the volcano; second, the lowest safety factor value suggests that the edifice is stable under gravity alone and that external triggering mechanisms represent likely destabilizing factors.
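
For reference, the generalized Hoek-Brown criterion used to characterize the lithotechnical units can be sketched as follows; the input values (sigma_ci, GSI, mi, D) are hypothetical placeholders rather than the properties determined for Pacaya:

# Generalized Hoek-Brown failure criterion (2002 edition), illustrative inputs.
import numpy as np

sigma_ci = 50.0    # uniaxial compressive strength of intact rock, MPa (assumed)
GSI      = 55.0    # Geological Strength Index (assumed)
mi       = 25.0    # intact-rock material constant (assumed, basaltic lava)
D        = 0.0     # disturbance factor (assumed)

# Rock-mass constants.
mb = mi * np.exp((GSI - 100.0) / (28.0 - 14.0 * D))
s  = np.exp((GSI - 100.0) / (9.0 - 3.0 * D))
a  = 0.5 + (np.exp(-GSI / 15.0) - np.exp(-20.0 / 3.0)) / 6.0

def sigma_1(sigma_3):
    """Major principal stress at failure for a given confinement (MPa)."""
    return sigma_3 + sigma_ci * (mb * sigma_3 / sigma_ci + s) ** a

print(f"mb = {mb:.2f}, s = {s:.4f}, a = {a:.3f}")
for s3 in (0.0, 0.5, 1.0, 2.0, 5.0):
    print(f"sigma_3 = {s3:4.1f} MPa -> sigma_1 at failure = {sigma_1(s3):6.2f} MPa")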

Relevance: 30.00%

Publisher:

Abstract:

Inhibition of ErbB2 (HER2) with monoclonal antibodies, an effective therapy in some forms of breast cancer, is associated with cardiotoxicity, the pathophysiology of which is poorly understood. Recent data suggest that dual inhibition of ErbB1 (EGFR) and ErbB2 signaling is more efficient in cancer therapy; however, the cardiac safety of this therapeutic approach is unknown. We therefore tested an ErbB1 (CGP059326) and an ErbB1/ErbB2 (PKI166) tyrosine kinase inhibitor in an in-vitro system of adult rat ventricular cardiomyocytes and assessed their effects on (1) cell viability, (2) myofibrillar structure, (3) contractile function, and (4) MAPK and Akt signaling, alone or in combination with Doxorubicin. Neither CGP nor PKI induced cardiomyocyte necrosis or apoptosis. PKI but not CGP caused myofibrillar structural damage that was additive to that induced by Doxorubicin at clinically relevant doses. These changes were associated with an inhibition of excitation-contraction coupling. PKI but not CGP decreased p-Erk1/2, suggesting a role for this MAP-kinase signaling pathway in the maintenance of myofibrils. These data indicate that the ErbB2 signaling pathway is critical for the maintenance of myofibrillar structure and function. Clinical studies using ErbB2-targeted inhibitors for the treatment of cancer should be designed to include careful monitoring for cardiac dysfunction.

Relevance: 30.00%

Publisher:

Abstract:

Introduction: Several recent studies have shown that a positive fluid balance in critical illness is associated with worse outcome. We tested the effects of moderate- vs. high-volume resuscitation strategies on mortality, systemic and regional blood flows, mitochondrial respiration, and organ function in two experimental sepsis models. Methods: 48 pigs were randomized to continuous endotoxin infusion, fecal peritonitis, or a control group (n = 16 each), and each group was further randomized to two different basal rates of volume supply for 24 hours [moderate-volume (10 ml/kg/h, Ringer's lactate, n = 8); high-volume (15 + 5 ml/kg/h, Ringer's lactate and hydroxyethyl starch (HES), n = 8)], both supplemented by additional volume boluses guided by urinary output, filling pressures, and responses in stroke volume. Systemic and regional hemodynamics were measured, and tissue specimens were taken for mitochondrial function assessment and histological analysis. Results: Mortality in the high-volume groups was 87% (peritonitis), 75% (endotoxemia), and 13% (controls). In the moderate-volume groups, mortality was 50% (peritonitis), 13% (endotoxemia), and 0% (controls). Both septic groups became hyperdynamic. While neither sepsis nor volume resuscitation strategy was associated with altered hepatic or muscle mitochondrial complex I- and II-dependent respiration, non-survivors had lower hepatic complex II-dependent respiratory control ratios (2.6 +/- 0.7 vs. 3.3 +/- 0.9 in survivors; P = 0.01). Histology revealed moderate damage in all organs, colloid plaques in lung tissue of the high-volume groups, and severe kidney damage in endotoxin high-volume animals. Conclusions: High-volume resuscitation including HES in experimental peritonitis and endotoxemia increased mortality despite better initial hemodynamic stability. This suggests that the strategy of early fluid management influences outcome in sepsis. The high mortality was not associated with reduced mitochondrial complex I- or II-dependent muscle and hepatic respiration.

Relevance: 30.00%

Publisher:

Abstract:

We describe four recent additions to NEURON's suite of graphical tools that make it easier for users to create and manage models: an enhancement to the Channel Builder that facilitates the specification and efficient simulation of stochastic channel models
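
As a generic illustration of what a stochastic channel model involves (this is not NEURON or Channel Builder code; the rates, channel count, and conductance are arbitrary assumptions), a two-state channel population can be simulated as follows:

# Population of two-state (closed <-> open) channels, simulated with
# per-time-step binomial transitions; purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 200           # channels in the membrane patch (assumed)
alpha, beta = 50.0, 200.0  # opening / closing rates, 1/s (assumed, voltage-independent)
g_single = 10e-12          # single-channel conductance, S (assumed)
driving_force = 0.06       # V
dt, t_stop = 1e-4, 0.2     # s

p_open  = 1.0 - np.exp(-alpha * dt)   # P(closed -> open) within one step
p_close = 1.0 - np.exp(-beta * dt)    # P(open -> closed) within one step

n_open = 0
open_counts = []
for _ in range(int(t_stop / dt)):
    opening = rng.binomial(n_channels - n_open, p_open)
    closing = rng.binomial(n_open, p_close)
    n_open += opening - closing
    open_counts.append(n_open)

mean_open = np.mean(open_counts)
print(f"mean open fraction ~ {mean_open / n_channels:.3f} "
      f"(steady state alpha/(alpha+beta) = {alpha / (alpha + beta):.3f})")
print(f"mean patch current ~ {mean_open * g_single * driving_force * 1e12:.2f} pA")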

Relevance: 30.00%

Publisher:

Abstract:

Integrated choice and latent variable (ICLV) models represent a promising new class of models that merge classic choice models with the structural equation modeling (SEM) approach for latent variables. Despite their conceptual appeal, applications of ICLV models in marketing remain rare. We extend previous ICLV applications, first, by estimating a multinomial choice model and, second, by estimating hierarchical relations between latent variables. An empirical study on travel mode choice clearly demonstrates the value of ICLV models for enhancing the understanding of choice processes. In addition to the directly observable variables usually studied, such as travel time, we show how abstract motivations such as power and hedonism, as well as attitudes such as a desire for flexibility, affect travel mode choice. Furthermore, we show that it is possible to estimate such a complex ICLV model with the widely available structural equation modeling package Mplus. This finding is likely to encourage more widespread application of this appealing model class in the marketing field.
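
The three building blocks of an ICLV model can be illustrated as a simple data-generating process; the sketch below is a made-up specification (covariates, loadings, and coefficients are assumptions) and does not reproduce the empirical study or its Mplus estimation:

# Structural equation, measurement model, and multinomial choice with a latent
# variable entering the utilities; simulation only, illustrative values.
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Structural model: latent "desire for flexibility" driven by an observed covariate.
age = rng.normal(40.0, 12.0, n)
flexibility = 0.8 - 0.02 * (age - 40.0) + rng.normal(0.0, 1.0, n)

# Measurement model: three Likert-type indicators loading on the latent variable.
def likert(latent, loading):
    star = loading * latent + rng.normal(0.0, 1.0, n)        # continuous response
    return np.digitize(star, [-1.5, -0.5, 0.5, 1.5]) + 1     # 5-point scale
indicators = np.column_stack([likert(flexibility, lam) for lam in (1.0, 0.8, 0.6)])

# Choice model: utilities of car / public transport / bike include the latent variable.
travel_time = rng.uniform(10.0, 60.0, (n, 3))                # minutes per mode
asc = np.array([0.0, -0.3, -0.8])                            # alternative-specific constants
lv_effect = np.array([0.9, -0.2, 0.3])                       # how flexibility enters each mode
v = asc + (-0.05) * travel_time + lv_effect * flexibility[:, None]
u = v + rng.gumbel(size=(n, 3))                              # add i.i.d. EV1 errors
choice = u.argmax(axis=1)

shares = np.bincount(choice, minlength=3) / n
print("simulated mode shares (car, PT, bike):", np.round(shares, 3))
print("first respondent's indicators:", indicators[0])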

Relevance: 30.00%

Publisher:

Abstract:

The report examines the relationship between day care institutions, schools and so-called “parents unfamiliar with education”, as well as the relationship between the institutions. Within Danish public and professional discourse, concepts like “parents unfamiliar with education” usually refer to environments, parents or families with either no or only very limited experience of education beyond basic schooling (folkeskole). The “grand old man” of Danish educational research, Prof. Em. Erik Jørgen Hansen, defines the concept as follows: parents who are distant from or not familiar with education are parents without a tradition of education, and by that fact they are not able to contribute constructively to backing up their own children during their education. Many teachers and pedagogues are not used to that term; they rather prefer concepts like “socially exposed” or “socially disadvantaged” parents, social classes or strata. The report does not only focus on parents who are not capable of supporting the school achievements of their children, since a low level of education is usually connected with social disadvantage. Such parents are often not capable of understanding and meeting the demands made by the school when sending their children to school. They lack the necessary competencies or capacity for action. At present, much attention is being paid by the Ministries of Education and Social Affairs (the latter recently renamed the Ministry of Welfare) to creating equal opportunities for all children. Many kinds of expertise (directorates, councils, researchers, etc.) have been more than eager to promote recommendations aimed at achieving the ambitious goal that by 2015, 95% of all young people should complete a full education (classes 10-12). Research results point to the importance of increased participation of parents. In other words, the agenda is set for ‘parents’ education’. It seems necessary to underline that Danish welfare policy has been changing rather radically. The classic model was an understanding of welfare as social insurance and/or as social distribution – based on social solidarity. The modern model treats welfare as social service and/or social investment. This means that citizens are changing roles – from user and/or citizen to consumer and/or investor. The Danish state, in accordance with decisions taken by the government, is investing in a national future shaped by global competition. The new models of welfare – “service” and “investment” – imply severe changes in hitherto familiar concepts of family life, the relationship between parents and children, etc. As an example, the investment model points to a new implementation of the relationship between social rights and the rights of freedom. The service model has demonstrated the weakness that access to qualified services in the fields of health or education is becoming more and more dependent on private purchasing power. The weakness of the investment model is that it represents a sort of “the winner takes it all”, since a political majority is enabled to set agendas in societal fields formerly protected by the tripartite power and the rights of freedom of the citizens. The outcome of the Danish development seems to be the establishment of a politically governed public service industry which, on the one hand, is capable of competing on market conditions and, on the other, can be governed by contracts. This represents a new form of close linking of politics, economy and professional work.
Attempts to control education and pedagogy, and thereby the population, are not a recent invention; in European history we could easily point to several such experiments. What is genuinely new is the linking of political priorities to the exercise of public activities through economic incentives. By defining visible goals for public servants, by introducing the measurement of achievements and effects, and by implementing a new wage policy dependent on achievements and/or effects, a new system of accountability is manufactured. The consequences are already perceptible. The government decides to make special interventions concerning parents, children or youngsters; the public servants at municipality level are instructed to carry out their services by following a manual; and the parents are no longer protected by privacy. Protection of privacy and minority is no longer a valid argument for preventing further interventions in people’s lives (health, food, school, etc.). Citizens are becoming objects of investment, which also implies that people are investing in their own health, education and family. This means that investments in changes of lifestyle and the development of competences go hand in hand. The programmes mentioned below are conditioned by this shift.

Relevance: 30.00%

Publisher:

Abstract:

The economic and social changes taking place in Russia in recent decades have implied a restructuring of Russian society. Among other things, Russian leaders have expressed a need for a reorientation of social development. In the 1990s, cooperation was initiated on a number of social work and social welfare projects with international support, a process further accelerated by President Jeltsin’s state visit to Sweden in 1997. Discussions between the Swedish International Development Cooperation Agency (Sida) and the Russian authorities dealing with welfare issues started from the assumption that Russian professional social work was weak and needed to be strengthened. In the 1990s, Sida was also given a stronger general mandate to work with other former Soviet countries in Eastern Europe, for example the Baltic states. The Russian-Swedish discussions resulted in projects aiming to raise social work competencies in public authorities and management and among social workers in Russia. One of the areas chosen for these projects was Saint Petersburg, where several projects aiming to develop new models of social work were launched. The point of departure has been to transfer and adapt Swedish models of social work to the Russian context. The Stockholm University Department of Social Work became responsible for a number of such projects and, besides using academic teachers, also involved a number of practitioners, such as social workers in disablement services and reformatory staff, who could meet and match the Russian authorities and partners.

Relevance: 30.00%

Publisher:

Abstract:

The development of the Internet has made it possible to transfer data ‘around the globe at the click of a mouse’. In particular, new business models such as cloud computing, the newest driver illustrating the speed and breadth of the online environment, allow this data to be processed across national borders on a routine basis. A number of factors cause the Internet to blur the lines between public and private space. Firstly, globalization and the outsourcing of economic actors entail an ever-growing exchange of personal data. Secondly, security pressure in the name of the legitimate fight against terrorism opens access to a significant amount of data for an increasing number of public authorities. And finally, the tools of the digital society accompany everyone at each stage of life, leaving permanent individual and borderless traces in both space and time. Therefore, calls from both the public and private sectors for an international legal framework for privacy and data protection have become louder. Companies such as Google and Facebook have also come under continuous pressure from governments and citizens to reform their use of data. Thus, Google was not alone in calling for the creation of ‘global privacy standards’. Efforts are underway to review established privacy foundation documents, and there are similar efforts to look at standards in global approaches to privacy and data protection. Among the most remarkable recent steps was the Montreux Declaration, in which the privacy commissioners appealed to the United Nations ‘to prepare a binding legal instrument which clearly sets out in detail the rights to data protection and privacy as enforceable human rights’. This appeal was repeated in 2008 at the 30th international conference held in Strasbourg, at the 31st conference in Madrid in 2009, and at the 32nd conference in Jerusalem in 2010. In a globalized world, free data flow has become an everyday need. Thus, the aim of global harmonization should be that it makes no difference for data users or data subjects whether data processing takes place in one or in several countries. Concern has been expressed that data users might seek to avoid privacy controls by moving their operations to countries which have lower standards in their privacy laws or no such laws at all. To control that risk, some countries have implemented special controls in their domestic law. Again, such controls may interfere with the need for free international data flow. A formula has to be found to ensure that privacy protection at the international level does not prejudice this principle.