203 results for Optimal Component Proportions


Relevance: 20.00%

Abstract:

An energy method for a linear-elastic, perfectly plastic model utilising the von Mises yield criterion with associated flow, developed in 2013 by McMahon and co-workers, is used to compare the ellipsoidal cavity-expansion mechanism from the same work with the displacement fields of Levin, from 1995, and of Osman and Bolton, from 2005, which utilise the Hill and Prandtl mechanisms respectively. The energy method was also applied to a mechanism obtained from a linear-elastic finite-element analysis in Abaqus. At small values of settlement and soil rigidity the elastic mechanism provides the lowest upper-bound solution and matches well with finite-element analysis results published in the literature. At typical footing working loads and settlements the cavity-expansion mechanism produces a lower upper-bound solution than the displacement fields of the Hill and Prandtl mechanisms, and also matches well with the published finite-element analysis results in this range. Beyond these loads, at greater footing settlements or soil rigidity, the Prandtl mechanism is shown to be the most appropriate.
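As a rough illustration of the upper-bound energy balance behind this comparison, the sketch below equates the external work rate of the footing load with the total internal work rate implied by a postulated displacement mechanism; the mechanism whose dissipation gives the lowest upper-bound pressure is the best estimate at that settlement. The dissipation values, footing geometry, and mechanism names here are placeholders, not figures or formulations from McMahon and co-workers.

```python
# Illustrative sketch of an upper-bound energy balance for a shallow footing.
# Dissipation rates are placeholders; real values come from integrating the
# strain-rate field of each mechanism over the soil volume.
import numpy as np

def upper_bound_pressure(dissipation_rate, footing_velocity, footing_area):
    """Upper-bound bearing pressure from the work balance
    q * v * A <= total internal work rate (plastic dissipation + elastic strain energy rate)."""
    return dissipation_rate / (footing_velocity * footing_area)

# Hypothetical mechanisms, each characterised only by its total internal work
# rate for a unit footing velocity (kN.m/s, placeholder numbers).
mechanisms = {
    "elastic (FE-derived)": 1.95e3,
    "cavity expansion":     2.10e3,
    "Prandtl":              2.40e3,
}

v = 1.0   # unit footing velocity, m/s
A = 4.0   # footing area, m^2 (placeholder)

for name, D in mechanisms.items():
    q = upper_bound_pressure(D, v, A)
    print(f"{name:22s}  upper-bound pressure ~ {q:.0f} kPa")

# The mechanism giving the lowest upper bound is the least unsafe estimate.
best = min(mechanisms, key=lambda k: upper_bound_pressure(mechanisms[k], v, A))
print("lowest upper bound:", best)
```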

Relevance: 20.00%

Abstract:

Approximately 40% of annual demand for steel worldwide is used to replace products that have failed. With this percentage set to rise, extending the lifespan of steel in products presents a significant opportunity to reduce demand and thus decrease carbon dioxide emissions from steel production. This article presents a new, simplified framework with which to analyse product failure. When applied to the products that dominate steel use, this framework reveals that they are often replaced because a component or sub-assembly becomes degraded, inferior, unsuitable or worthless. In light of this, four products, representative of high-steel-content products in general, are analysed at the component level, determining steel mass and cost profiles over the lifespan of each product. The results show that the majority of the steel components are underexploited, still functioning when the product is discarded; in particular, the potential lifespan of the steel-rich structure is typically much greater than its actual lifespan. Twelve case studies, in which product or component life has been increased, are then presented. The resulting evidence is used to tailor life-extension strategies to each reason for product failure and to identify the economic motivations for implementing these strategies. The results suggest that a product template in which the long-lived structure accounts for a relatively high share of costs, while short-lived components can be easily replaced (offering profit to the producer and enhanced utility to owners), encourages product life extension. © 2013 The Author.

Relevance: 20.00%

Abstract:

Copyright © (2014) by the International Machine Learning Society (IMLS). All rights reserved. Classical methods such as Principal Component Analysis (PCA) and Canonical Correlation Analysis (CCA) are ubiquitous in statistics. However, these techniques are only able to reveal linear relationships in data. Although nonlinear variants of PCA and CCA have been proposed, these are computationally prohibitive at large scale. In a separate strand of recent research, randomized methods have been proposed to construct features that help reveal nonlinear patterns in data. For basic tasks such as regression or classification, random features exhibit little or no loss in performance, while achieving drastic savings in computational requirements. In this paper we leverage randomness to design scalable new variants of nonlinear PCA and CCA; our ideas extend to key multivariate analysis tools such as spectral clustering or LDA. We demonstrate our algorithms through experiments on real-world data, on which we compare against the state of the art. A simple R implementation of the presented algorithms is provided.
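The basic recipe can be sketched as follows: map the data through random Fourier features that approximate an RBF kernel, then run ordinary linear PCA in the feature space. The paper's reference implementation is in R; the Python sketch below is only illustrative, with feature count, kernel bandwidth, and the toy data chosen arbitrarily.

```python
# Minimal sketch of randomized nonlinear PCA: random Fourier features for an
# RBF kernel, followed by ordinary PCA in the feature space.
import numpy as np

def random_fourier_features(X, n_features=500, gamma=1.0, rng=None):
    """Rahimi-Recht features z(x) = sqrt(2/D) * cos(W x + b) for k(x,y) = exp(-gamma ||x-y||^2)."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def randomized_nonlinear_pca(X, n_components=2, **rff_kwargs):
    Z = random_fourier_features(X, **rff_kwargs)
    Z -= Z.mean(axis=0)                      # centre in feature space
    cov = (Z.T @ Z) / (len(Z) - 1)           # D x D covariance, cheap for moderate D
    vals, vecs = np.linalg.eigh(cov)
    top = vecs[:, ::-1][:, :n_components]    # leading eigenvectors
    return Z @ top                           # nonlinear principal scores

# Toy usage: data on a noisy circle, which linear PCA cannot "unfold".
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 400)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((400, 2))
scores = randomized_nonlinear_pca(X, n_components=2, n_features=300, gamma=2.0, rng=0)
print(scores.shape)   # (400, 2)
```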

Relevance: 20.00%

Abstract:

A venerable history of classical work on autoassociative memory has significantly shaped our understanding of several features of the hippocampus, most prominently its CA3 area, in relation to memory storage and retrieval. However, existing theories of hippocampal memory processing ignore a key biological constraint affecting memory storage in neural circuits: the bounded dynamical range of synapses. Recent treatments based on the notion of metaplasticity provide a powerful model for individual bounded synapses; however, their implications for how well the hippocampus can retrieve memories, and for the neural dynamics associated with that retrieval, remain unknown. Here, we develop a theoretical framework for memory storage and recall with bounded synapses. We formulate the recall of a previously stored pattern from a noisy recall cue and limited-capacity (and therefore lossy) synapses as a probabilistic inference problem, and derive neural dynamics that implement approximate inference algorithms to solve this problem efficiently. In particular, for binary synapses with metaplastic states, we demonstrate for the first time that memories can be efficiently read out with biologically plausible network dynamics that are completely constrained by the synaptic plasticity rule and by the statistics of the stored patterns and of the recall cue. Our theory organises into a coherent framework a wide range of existing data about the regulation of excitability, feedback inhibition, and network oscillations in area CA3, and makes novel and directly testable predictions that can guide future experiments.
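A toy caricature of the storage-and-recall setting helps make the constraint concrete: patterns stored through a clipped (hence bounded and lossy) Hebbian rule, and recall dynamics that weigh the synaptic evidence against a noisy cue. The sketch below is only that caricature; it is not the approximate-inference dynamics derived in the paper, and all parameter values are arbitrary.

```python
# Toy recall with bounded (binary) synapses: store by clipping a Hebbian sum
# to {-1, +1}, then recall by combining synaptic input with a noisy cue.
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 10                                    # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Bounded storage: keep only the sign of the Hebbian weight.
W = np.sign(patterns.T @ patterns).astype(float)
np.fill_diagonal(W, 0.0)

def recall(cue, beta_syn=1.0, beta_cue=2.0, steps=20):
    """Iterative dynamics weighing synaptic evidence against the noisy cue."""
    x = cue.copy().astype(float)
    for _ in range(steps):
        field = beta_syn * (W @ x) / N + beta_cue * cue
        x = np.sign(field + 1e-12)
    return x

target = patterns[0]
cue = target.copy()
flip = rng.choice(N, size=N // 5, replace=False)  # corrupt 20% of the cue
cue[flip] *= -1

out = recall(cue)
print("cue overlap:   ", (cue @ target) / N)
print("recall overlap:", (out @ target) / N)
```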

Relevance: 20.00%

Abstract:

Hydrogels, three-dimensional hydrophilic polymer networks, are appealing candidate materials for studying the cellular microenvironment, as their substantial water content helps to better mimic soft tissue. However, hydrogels can lack mechanical stiffness, strength, and toughness. Composite hydrogel systems have been shown to improve mechanical properties relative to their single-component counterparts. Poly(ethylene glycol) dimethacrylate (PEGDMA) and alginate are polymers that have been used to form hydrogels for biological applications. Single-component and composite PEGDMA and alginate systems were fabricated with a range of total polymer concentrations. Bulk gels were mechanically characterized using spherical indentation testing and a viscoelastic analysis framework. An increase in shear modulus with increasing polymer concentration was demonstrated for all systems. Alginate hydrogels were shown to have a smaller viscoelastic ratio than the PEGDMA gels, indicating more extensive relaxation over time. Composite alginate and PEGDMA hydrogels exhibited a combination of the mechanical properties of the constituents, as well as a qualitative increase in toughness. Additionally, multiple hydrogel systems were produced that had similar shear moduli but different viscoelastic behaviors. Accurate measurement of the mechanical properties of hydrogels is necessary in order to determine which parameters are key in modeling the cellular microenvironment. © 2014 The Chinese Society of Theoretical and Applied Mechanics; Institute of Mechanics, Chinese Academy of Sciences and Springer-Verlag Berlin Heidelberg.
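To show what this kind of characterization involves, here is a rough sketch of extracting a shear modulus from a spherical indentation load-displacement curve under simple Hertzian assumptions (rigid spherical tip, nearly incompressible gel with Poisson's ratio 0.5), plus a crude "viscoelastic ratio" taken as relaxed over instantaneous load at fixed depth. These assumptions, the synthetic data, and the indenter radius are illustrative choices, not the authors' exact analysis framework.

```python
# Sketch: shear modulus from spherical indentation via a Hertz fit, and a
# simple viscoelastic ratio from load relaxation at fixed depth.
import numpy as np
from scipy.optimize import curve_fit

R = 1.0e-3   # indenter radius, m (assumed)
NU = 0.5     # Poisson's ratio for a nearly incompressible hydrogel

def hertz_load(delta, G):
    """Hertz force for a rigid sphere on an elastic half-space:
    F = (4/3) * E/(1-nu^2) * sqrt(R) * delta^(3/2), with E = 2G(1+nu)."""
    E_eff = 2.0 * G * (1.0 + NU) / (1.0 - NU ** 2)
    return (4.0 / 3.0) * E_eff * np.sqrt(R) * delta ** 1.5

# Synthetic loading data for a gel with G = 10 kPa plus measurement noise.
rng = np.random.default_rng(0)
depth = np.linspace(1e-6, 50e-6, 40)
force = hertz_load(depth, 10e3) * (1 + 0.02 * rng.standard_normal(depth.size))

G_fit, _ = curve_fit(hertz_load, depth, force, p0=[1e3])
print(f"fitted shear modulus ~ {G_fit[0] / 1e3:.1f} kPa")

# Viscoelastic ratio: long-time relaxed load over instantaneous load at a
# fixed indentation depth (closer to 1 means less relaxation over time).
F_instant, F_relaxed = force[-1], 0.7 * force[-1]   # placeholder relaxed value
print("viscoelastic ratio ~", F_relaxed / F_instant)
```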

Relevance: 20.00%

Abstract:

A new method is presented for the extraction of single-chain form factors and interchain interference functions from a range of small-angle neutron scattering (SANS) experiments on bimodal homopolymer blends. The method requires a minimum of three blends, made up of hydrogenated and deuterated components with matched degree of polymerization at two different chain lengths, but with carefully varied deuteration levels. The method is validated through an experimental study on polystyrene homopolymer bimodal blends with M_A ≈ M_B/2. By fitting Debye functions to the structure factors, it is shown that there is good agreement between the molar mass of the components obtained from SANS and from chromatography. The extraction method also enables, for the first time, interchain scattering functions to be produced for scattering between chains of different lengths. © 2014 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
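The Debye-function fitting step mentioned above can be sketched directly: the standard Debye form for a Gaussian coil is fitted to a single-chain structure factor to recover the radius of gyration and the forward scattering, the latter being proportional to molar mass up to contrast and concentration factors. The data, prefactor, and parameter values below are synthetic placeholders, not the paper's measurements.

```python
# Sketch: fit the Debye function for a Gaussian coil to a SANS structure factor.
import numpy as np
from scipy.optimize import curve_fit

def debye(q, I0, Rg):
    """Debye function: P(x) = 2(exp(-x) - 1 + x)/x^2, with x = (q*Rg)^2."""
    x = (q * Rg) ** 2
    return I0 * 2.0 * (np.exp(-x) - 1.0 + x) / x ** 2

# Synthetic "measured" structure factor for a chain with Rg = 8 nm.
rng = np.random.default_rng(0)
q = np.linspace(0.05, 2.0, 60)   # 1/nm
I = debye(q, 1.0, 8.0) * (1 + 0.03 * rng.standard_normal(q.size))

(I0_fit, Rg_fit), _ = curve_fit(debye, q, I, p0=[1.0, 5.0])
print(f"fitted Rg ~ {Rg_fit:.1f} nm; forward scattering I(0) ~ {I0_fit:.2f}")
# I(0) scales with chain molar mass (times contrast/concentration factors),
# which is how the SANS-derived molar mass is compared against chromatography.
```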

Relevance: 20.00%

Abstract:

© 2015 John P. Cunningham and Zoubin Ghahramani. Linear dimensionality reduction methods are a cornerstone of analyzing high-dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a black-box, objective-agnostic numerical technology.
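The framing can be illustrated with PCA written as an optimization over the Stiefel manifold of matrices with orthonormal columns: maximize trace(MᵀCM) subject to MᵀM = I. The sketch below solves this once in closed form (eigendecomposition) and once with a generic gradient-ascent-plus-retraction loop, which conveys the "objective-agnostic solver" idea; step size, iteration count, and the synthetic data are arbitrary choices, and this is not the authors' solver.

```python
# Sketch: PCA as trace maximization over the Stiefel manifold {M : M^T M = I},
# solved by eigendecomposition and by generic manifold gradient ascent.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10)) @ rng.standard_normal((10, 10))
X -= X.mean(axis=0)
C = (X.T @ X) / (len(X) - 1)           # sample covariance
d, r = C.shape[0], 2                   # ambient and target dimensions

# Closed-form PCA: top-r eigenvectors of C.
vals, vecs = np.linalg.eigh(C)
M_pca = vecs[:, ::-1][:, :r]

# Generic solver: ascend f(M) = trace(M^T C M), retracting onto the manifold
# with a QR factorization after each step.
M = np.linalg.qr(rng.standard_normal((d, r)))[0]
for _ in range(500):
    grad = 2.0 * C @ M                 # Euclidean gradient of the trace objective
    M, _ = np.linalg.qr(M + 0.01 * grad)

print("objective (eigendecomposition):", np.trace(M_pca.T @ C @ M_pca))
print("objective (manifold ascent):   ", np.trace(M.T @ C @ M))
```

Swapping in a different objective (for example a CCA-style correlation criterion) while keeping the same ascent-and-retract loop is exactly the sense in which such a solver is objective-agnostic.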