899 results for MODEL-DRIVEN DEVELOPMENT
Abstract:
Total ankle replacement remains a less satisfactory solution compared to other joint replacements. The goal of this study was to develop and validate a finite element model of total ankle replacement, for future testing of hypotheses related to clinical issues. To validate the finite element model, an experimental setup was specifically developed and applied to 8 cadaveric tibias. A non-cemented press-fit tibial component of a mobile bearing prosthesis was inserted into the tibias. Two extreme anterior and posterior positions of the mobile bearing insert were considered, as well as a centered one. An axial force of 2 kN was applied for each insert position. Strains were measured on the bone surface using digital image correlation. Tibias were CT scanned before implantation, after implantation, and after mechanical tests and removal of the prosthesis. The finite element model replicated the experimental setup. The first CT was used to build the geometry and evaluate the mechanical properties of the tibias. The second CT was used to set the implant position. The third CT was used to assess the bone-implant interface conditions. The coefficient of determination (R-squared) between the measured and predicted strains was 0.91. Predicted bone strains were maximal around the implant keel, especially at the anterior and posterior ends. The finite element model presented here is validated for future tests using more physiological loading conditions.
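For reference, the coefficient of determination used to compare measured and predicted strains can be computed as follows. This is a minimal sketch with made-up strain values, not the study's data:

```python
import numpy as np

def r_squared(measured, predicted):
    """Coefficient of determination (R^2) between measured and predicted values."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((measured - predicted) ** 2)        # residual sum of squares
    ss_tot = np.sum((measured - measured.mean()) ** 2)  # total sum of squares
    return 1.0 - ss_res / ss_tot

# Hypothetical surface-strain values (microstrain) at four gauge locations;
# illustrative numbers only.
measured = [120.0, 240.0, 310.0, 450.0]
predicted = [130.0, 230.0, 300.0, 470.0]
print(round(r_squared(measured, predicted), 3))  # → 0.988
```

An R² of 0.91, as reported, means 91% of the variance in the measured strains is explained by the model's predictions.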
Abstract:
Gut microbiota has recently been proposed as a crucial environmental factor in the development of metabolic diseases such as obesity and type 2 diabetes, mainly due to its contribution in the modulation of several processes including host energy metabolism, gut epithelial permeability, gut peptide hormone secretion, and host inflammatory state. Since the symbiotic interaction between the gut microbiota and the host is essentially reflected in specific metabolic signatures, much expectation is placed on the application of metabolomic approaches to unveil the key mechanisms linking the gut microbiota composition and activity with disease development. The present review aims to summarize the gut microbial-host co-metabolites identified so far by targeted and untargeted metabolomic studies in humans, in association with impaired glucose homeostasis and/or obesity. An alteration of the co-metabolism of bile acids, branched fatty acids, choline, vitamins (i.e., niacin), purines, and phenolic compounds has been associated so far with the obese or diabese phenotype, with respect to healthy controls. Furthermore, anti-diabetic treatments such as metformin and sulfonylurea have been observed to modulate the gut microbiota, or at least its metabolic profiles, thereby potentially affecting insulin resistance through indirect mechanisms that are still unknown. Despite the scarcity of the metabolomic studies currently available on the microbial-host crosstalk, the data-driven results largely confirmed findings independently obtained from in vitro and animal model studies, putting forward the mechanisms underlying the implication of a dysfunctional gut microbiota in the development of metabolic disorders.
Abstract:
The development of model observers for mimicking human detection strategies has followed from symmetric signals in simple noise to increasingly complex backgrounds. In this study we implement different model observers for the complex task of detecting a signal in a 3D image stack. The backgrounds come from real breast tomosynthesis acquisitions and the signals were simulated and reconstructed within the volume. Two different tasks relevant to the early detection of breast cancer were considered: detecting an 8 mm mass and detecting a cluster of microcalcifications. The model observers were calculated using a channelized Hotelling observer (CHO) with dense difference-of-Gaussian channels, and a modified (partial prewhitening [PPW]) observer which was adapted to realistic signals which are not circularly symmetric. The sustained temporal sensitivity function was used to filter the images before applying the spatial templates. For a frame rate of five frames per second, the only CHO that we calculated performed worse than the humans in a 4-AFC experiment. The other observers were variations of PPW and outperformed human observers in every single case. This initial frame rate was a rather low speed, and the temporal filtering did not affect the results compared to a data set with no human temporal effects taken into account. We subsequently investigated two higher speeds, 15 and 30 frames per second. We observed that for large masses, the two types of model observers investigated outperformed the human observers and would be suitable with the appropriate addition of internal noise. However, for microcalcifications only the PPW observer consistently outperformed the humans. The study demonstrated the possibility of using a model observer which takes into account the temporal effects of scrolling through an image stack while being able to effectively detect a range of mass sizes and distributions.
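The core of a channelized Hotelling observer of the kind described above can be sketched in a few lines. The Gaussian channel outputs below are illustrative stand-ins for the dense difference-of-Gaussian channel responses of the study, not its actual data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical channel outputs: rows are images, columns are channel responses.
# These Gaussian draws stand in for DOG-channel responses to tomosynthesis
# volumes; they are illustrative only.
n, n_channels = 200, 10
absent = rng.normal(0.0, 1.0, (n, n_channels))   # signal-absent class
present = rng.normal(0.3, 1.0, (n, n_channels))  # signal adds a small mean shift

# Channelized Hotelling observer: template w = S^-1 (mu_present - mu_absent),
# where S is the average within-class covariance of the channel outputs.
d_mean = present.mean(axis=0) - absent.mean(axis=0)
S = 0.5 * (np.cov(absent, rowvar=False) + np.cov(present, rowvar=False))
w = np.linalg.solve(S, d_mean)

# Observer ratings and a detectability index d' from their distributions
t_abs, t_pre = absent @ w, present @ w
d_prime = (t_pre.mean() - t_abs.mean()) / np.sqrt(0.5 * (t_pre.var() + t_abs.var()))
```

In the study itself, the channel outputs would additionally be filtered through the sustained temporal sensitivity function before the spatial template is applied, and internal noise would be added to match human performance.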
Abstract:
Coherent regulation of landscape as a resource is a major challenge. How can the development interests of some actors (e.g., cable car operators and property developers) be reconciled with those of others (agriculture, forestry) and with conservation of biodiversity and scenic value? To help understand how the newly introduced Regional Nature Parks (RNPs) can improve the coherence of the regulation regime in Switzerland, we highlight current direct mechanisms for regulation of landscape as a resource (bans, inventories, subsidies) as well as indirect mechanisms (taking place through the regulation of the physical basis of landscapes, e.g., forest, land, and water planning policies). We show that RNPs are fundamentally innovative because they make it possible to manage and coordinate indirect strategies for appropriate regulation of resources at a landscape scale. In other words, RNPs enable organization of governance of landscape as a resource in a perimeter that is not necessarily restricted to administrative boundaries.
Abstract:
DNA vaccination is a promising approach for inducing both humoral and cellular immune responses. The mode of plasmid DNA delivery is critical to making progress in DNA vaccination. Using human papillomavirus type 16 E7 as a model antigen, this study evaluated the effect of a peptide-polymer hybrid including the PEI600-Tat conjugate as a novel gene delivery system on the potency of antigen-specific immunity in a mouse model. At a ratio of 10:50 PEI-Tat/E7DNA (w/w), both humoral and cellular immune responses were significantly enhanced compared with the E7DNA construct alone, and a Th1 response was induced. Therefore, this new delivery system could have promising applications in gene therapy.
Abstract:
A quantitative model of water movement within the immediate vicinity of an individual root is developed and results of an experiment to validate the model are presented. The model is based on the assumption that the amount of water transpired by a plant in a certain period is replaced by an equal volume entering its root system during the same time. It uses the Darcy-Buckingham equation to calculate the soil water matric potential at any distance from a plant root as a function of parameters related to crop, soil and atmospheric conditions. The model output is compared against measurements of soil water depletion by rice roots monitored using γ-beam attenuation in a greenhouse of the Escola Superior de Agricultura "Luiz de Queiroz"/Universidade de São Paulo (ESALQ/USP) in Piracicaba, State of São Paulo, Brazil, in 1993. The experimental results are in agreement with the output from the model. Model simulations show that a single plant root is able to withdraw water from more than 0.1 m away within a few days. It can therefore be assumed that root distribution is a less important factor in soil water extraction efficiency.
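The flux law underlying such a model can be written, for radial flow toward a single root, roughly as follows (generic notation for illustration, not necessarily the symbols used by the authors):

```latex
% Darcy-Buckingham flux law, written radially for flow toward a single root
% (illustrative notation):
q(r) = -K(\theta)\,\frac{\partial \psi_m}{\partial r}
```

where \(q(r)\) is the radial soil-water flux density at distance \(r\) from the root axis, \(K(\theta)\) the unsaturated hydraulic conductivity at volumetric water content \(\theta\), and \(\psi_m\) the soil water matric potential.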
Abstract:
Abnormal development can lead to deficits in adult brain function, a trajectory likely underlying adolescent-onset psychiatric conditions such as schizophrenia. Developmental manipulations yielding adult deficits in rodents provide an opportunity to explore mechanisms involved in a delayed emergence of anomalies driven by developmental alterations. Here we assessed whether oxidative stress during presymptomatic stages causes adult anomalies in rats with a neonatal ventral hippocampal lesion, a developmental rodent model useful for schizophrenia research. Juvenile and adolescent treatment with the antioxidant N-acetyl cysteine prevented the reduction of prefrontal parvalbumin interneuron activity observed in this model, as well as electrophysiological and behavioral deficits relevant to schizophrenia. Adolescent treatment with the glutathione peroxidase mimic ebselen also reversed behavioral deficits in this animal model. These findings suggest that presymptomatic oxidative stress yields abnormal adult brain function in a developmentally compromised brain, and highlight redox modulation as a potential target for early intervention.
Abstract:
A laboratory study has been conducted with two aims in mind. The first goal was to develop a description of how a cutting edge scrapes ice from the road surface. The second goal was to investigate the extent, if any, to which serrated blades were better than un-serrated or "classical" blades at ice removal. The tests were conducted in the Ice Research Laboratory at the Iowa Institute of Hydraulic Research of the University of Iowa. A specialized testing machine, with a hydraulic ram capable of attaining scraping velocities of up to 30 m.p.h., was used in the testing. In order to determine the ice scraping process, the effects of scraping velocity, ice thickness, and blade geometry on the ice scraping forces were determined. Higher ice thickness led to greater ice chipping (as opposed to pulverization at lower thicknesses) and thus lower loads. Similar behavior was observed at higher velocities. The study of blade geometry included the effect of rake angle, clearance angle, and flat width. The latter were found to be particularly important in developing a clear picture of the scraping process. As clearance angle decreases and flat width increases, the scraping loads show a marked increase, due to the need to re-compress pulverized ice fragments. The effect of serrations was to decrease the scraping forces. However, for the coarsest serrated blades (with the widest teeth and gaps) the quantity of ice removed was significantly less than for a classical blade. Finer serrations appear to be able to match the ice removal of classical blades at lower scraping loads. Thus, one of the recommendations of this study is to examine the use of serrated blades in the field. Preliminary work (by Nixon and Potter, 1996) suggests such work will be fruitful. A second and perhaps more challenging result of the study is that chipping of ice is preferable to pulverization of the ice. How such chipping can be forced to occur is at present an open question.
Abstract:
A phase-field model for dealing with dynamic instabilities in membranes is presented. We use it to study curvature-driven pearling instability in vesicles induced by the anchorage of amphiphilic polymers on the membrane. Within this model, we obtain the morphological changes reported in recent experiments. The formation of a homogeneous pearled structure is achieved by consequent pearling of an initial cylindrical tube from the tip. For high enough concentration of anchors, we show theoretically that the homogeneous pearled shape is energetically less favorable than an inhomogeneous one, with a large sphere connected to an array of smaller spheres.
Abstract:
Abstract: This work is concerned with the development and application of novel unsupervised learning methods, having in mind two target applications: the analysis of forensic case data and the classification of remote sensing images. First, a method based on a symbolic optimization of the inter-sample distance measure is proposed to improve the flexibility of spectral clustering algorithms, and applied to the problem of forensic case data. This distance is optimized using a loss function related to the preservation of neighborhood structure between the input space and the space of principal components, and solutions are found using genetic programming. Results are compared to a variety of state-of-the-art clustering algorithms. Subsequently, a new large-scale clustering method based on a joint optimization of feature extraction and classification is proposed and applied to various databases, including two hyperspectral remote sensing images. The algorithm makes use of a functional model (e.g., a neural network) for clustering which is trained by stochastic gradient descent. Results indicate that such a technique can easily scale to huge databases, can avoid the so-called out-of-sample problem, and can compete with or even outperform existing clustering algorithms on both artificial data and real remote sensing images. This is verified on small databases as well as very large problems.
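The idea of clustering with a functional model whose parameters are trained by stochastic gradient descent can be illustrated in miniature with online k-means, where the cluster centers play the role of the trainable parameters. The toy 2-D data below are illustrative, not the forensic or hyperspectral data of the thesis:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy two-cluster data standing in for extracted features (illustrative only)
X = np.vstack([rng.normal(-2.0, 0.5, (100, 2)),
               rng.normal(2.0, 0.5, (100, 2))])

# Online k-means: stochastic gradient descent on the quantization error,
# processing one sample at a time; the centers are the model parameters.
centers = np.array([[-0.5, -0.5], [0.5, 0.5]])
lr = 0.1
for _ in range(10):  # epochs
    for x in X:
        j = np.argmin(((centers - x) ** 2).sum(axis=1))  # nearest center
        centers[j] += lr * (x - centers[j])              # SGD step

# Final cluster assignments
labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
```

Because each update touches only one sample, the memory footprint is independent of the dataset size, which is why this family of methods scales to very large databases and sidesteps the out-of-sample problem: new samples are assigned by evaluating the trained model, not by re-clustering.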
Abstract:
For well over 100 years, the Working Stress Design (WSD) approach has been the traditional basis for geotechnical design with regard to settlements or failure conditions. However, considerable effort has been put forth over the past couple of decades in relation to the adoption of the Load and Resistance Factor Design (LRFD) approach into geotechnical design. With the goal of producing engineered designs with consistent levels of reliability, the Federal Highway Administration (FHWA) issued a policy memorandum on June 28, 2000, requiring all new bridges initiated after October 1, 2007, to be designed according to the LRFD approach. Likewise, regionally calibrated LRFD resistance factors were permitted by the American Association of State Highway and Transportation Officials (AASHTO) to improve the economy of bridge foundation elements. Thus, projects TR-573, TR-583 and TR-584 were undertaken by a research team at Iowa State University’s Bridge Engineering Center with the goal of developing resistance factors for pile design using available pile static load test data. To accomplish this goal, the available data were first analyzed for reliability and then placed in a newly designed relational database management system termed PIle LOad Tests (PILOT), to which this first volume of the final report for project TR-573 is dedicated. PILOT is an amalgamated, electronic source of information consisting of both static and dynamic data for pile load tests conducted in the State of Iowa. The database, which includes historical data on pile load tests dating back to 1966, is intended for use in the establishment of LRFD resistance factors for design and construction control of driven pile foundations in Iowa. 
Although a considerable amount of geotechnical and pile load test data is available in literature as well as in various State Department of Transportation files, PILOT is one of the first regional databases to be exclusively used in the development of LRFD resistance factors for the design and construction control of driven pile foundations. Currently providing an electronically organized assimilation of geotechnical and pile load test data for 274 piles of various types (e.g., steel H-shaped, timber, pipe, Monotube, and concrete), PILOT (http://srg.cce.iastate.edu/lrfd/) is on par with such familiar national databases used in the calibration of LRFD resistance factors for pile foundations as the FHWA’s Deep Foundation Load Test Database. By narrowing geographical boundaries while maintaining a high number of pile load tests, PILOT exemplifies a model for effective regional LRFD calibration procedures.
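The design check these calibrated resistance factors enter is, in its generic LRFD form (standard textbook notation, not specific to this report):

```latex
% Generic LRFD design criterion:
\varphi\, R_n \;\ge\; \sum_i \gamma_i\, Q_i
% \varphi  : resistance factor (the quantity calibrated from PILOT data)
% R_n      : nominal resistance of the pile
% \gamma_i : load factor applied to load effect Q_i
```

Regional calibration adjusts \(\varphi\) so that designs reach a target reliability given locally observed scatter between predicted and measured pile resistances, which is why a high-quality regional load-test database such as PILOT matters.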
Abstract:
Drilled shafts have been used in the US for more than 100 years in bridges and buildings as a deep foundation alternative. For many of these applications, the drilled shafts were designed using the Working Stress Design (WSD) approach. Even though WSD has been used successfully in the past, a move toward Load and Resistance Factor Design (LRFD) for foundation applications began when the Federal Highway Administration (FHWA) issued a policy memorandum on June 28, 2000. The policy memorandum requires all new bridges initiated after October 1, 2007, to be designed according to the LRFD approach. This ensures compatibility between the superstructure and substructure designs, and provides a means of consistently incorporating sources of uncertainty into each load and resistance component. Regionally-calibrated LRFD resistance factors are permitted by the American Association of State Highway and Transportation Officials (AASHTO) to improve the economy and competitiveness of drilled shafts. To achieve this goal, a database for Drilled SHAft Foundation Testing (DSHAFT) has been developed. DSHAFT is aimed at assimilating high quality drilled shaft test data from Iowa and the surrounding regions, and identifying the need for further tests in suitable soil profiles. This report introduces DSHAFT and demonstrates its features and capabilities, such as an easy-to-use storage and sharing tool for providing access to key information (e.g., soil classification details and cross-hole sonic logging reports). DSHAFT embodies a model for effective, regional LRFD calibration procedures consistent with the PIle LOad Test (PILOT) database, which contains driven pile load tests accumulated from the state of Iowa. PILOT is now available for broader use at the project website: http://srg.cce.iastate.edu/lrfd/. 
DSHAFT, available in electronic form at http://srg.cce.iastate.edu/dshaft/, is currently comprised of 32 separate load tests provided by the Illinois, Iowa, Minnesota, Missouri and Nebraska state departments of transportation and/or departments of roads. In addition to serving as a manual for DSHAFT and providing a summary of the available data, this report provides a preliminary analysis of the load test data from Iowa, and will open up opportunities for others to share their data through this quality-assured process, thereby providing a platform to improve the LRFD approach to drilled shafts, especially in the Midwest region.
Abstract:
New and alternative scientific publishing business models are a reality, driven mostly by information and communication technologies, by movements toward the recovery of control of scientific communication activities by the academic community, and by open access approaches. The hybrid business model, mixing open and toll access, is already a reality, and the two will probably co-exist, each with its trade-offs. This essay discusses the changes driven by e-publishing and their impacts on the interrelationships among scholarly communication stakeholders (publishers and researchers, publishers and libraries, publishers and users), and the changes in scientific publishing business models, followed by a discussion of possible evolving business models. Whatever model evolves and dominates, a huge cultural change in the publishing practices of authors and institutions will be necessary for open access to happen and for traditional publishers to consolidate the right business models. External changes, such as policies, reward systems and institutional mandates, should also happen in order to sustain the whole changing scenario.