896 results for segmental compression forces
Abstract:
Low-velocity impact damage can drastically reduce the residual strength of a composite structure even when the damage is barely visible. The ability to computationally predict the extent of damage and the compression-after-impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant time and cost penalties. A high-fidelity three-dimensional composite damage model, predicting both low-velocity impact damage and the CAI strength of composite laminates, has been developed and implemented as a user material subroutine in the commercial finite element package ABAQUS/Explicit. The intralaminar damage model component accounts for physically based tensile and compressive failure mechanisms of the fibres and matrix when subjected to a three-dimensional stress state. Cohesive behaviour was employed to model interlaminar failure between plies, with a bilinear traction-separation law capturing damage onset and subsequent damage evolution. The virtual tests, set up in ABAQUS/Explicit, were executed in three steps: the first to capture the impact damage, the second to stabilize the specimen by imposing the new boundary conditions required for compression testing, and the third to predict the CAI strength. The observed intralaminar damage features, the delamination damage area and the residual strength are discussed. It is shown that the predicted results for impact damage and CAI strength correlate well with experimental testing, without the need for the model calibration that is often required with other damage models.
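The bilinear traction-separation law mentioned in the abstract can be sketched in a few lines. This is a minimal illustrative implementation of the generic mode-I bilinear cohesive law, not the authors' subroutine; the parameter values (stiffness and onset/failure separations) are assumptions chosen only for illustration.

```python
# Minimal sketch of a bilinear traction-separation law (mode I opening),
# of the kind used for interlaminar cohesive behaviour. Parameter
# values are illustrative, not taken from the paper.

def bilinear_traction(delta, K=1e5, delta0=1e-4, deltaf=1e-3):
    """Return the traction for a given opening separation `delta`.

    K      -- initial (penalty) stiffness, N/mm^3
    delta0 -- separation at damage onset, mm
    deltaf -- separation at complete failure, mm
    """
    if delta <= 0.0:
        return 0.0
    if delta <= delta0:
        # Linear elastic branch before damage onset
        return K * delta
    if delta >= deltaf:
        # Fully damaged interface: no load transfer
        return 0.0
    # Linear softening branch: damage variable d grows from 0 to 1
    d = deltaf * (delta - delta0) / (delta * (deltaf - delta0))
    return (1.0 - d) * K * delta
```

The traction rises linearly to a peak of K*delta0 at damage onset, then softens linearly to zero at the failure separation; the area under the curve is the fracture toughness that governs delamination growth.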
Abstract:
Low-velocity impact damage can drastically reduce the residual mechanical properties of a composite structure even when the impact damage is barely visible. The ability to computationally predict the extent of damage and the compression-after-impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant development time and cost penalties. A three-dimensional damage model, predicting both low-velocity impact damage and the CAI strength of composite laminates, has been developed and implemented as a user material subroutine in the commercial finite element package ABAQUS/Explicit. The virtual tests were executed in two steps, one to capture the impact damage and the other to predict the CAI strength. The observed intralaminar damage features, the delamination damage area and the residual strength are discussed. It is shown that the predicted results for impact damage and CAI strength correlated well with experimental testing.
Abstract:
The structure and properties of melt-mixed high-density polyethylene/multi-walled carbon nanotube (HDPE/MWCNT) composites processed by compression molding and blown film extrusion were investigated to assess the influence of the processing route on properties. The addition of MWCNTs leads to a more elastic response during deformation, which results in a more uniform thickness distribution in the blown films. The blown film composites exhibit better mechanical properties due to the enhanced orientation and disentanglement of the MWCNTs. At a blow-up ratio (BUR) of 3, the breaking strength and elongation in the machine direction of the film with 4 wt% MWCNTs are 239% and 1054% higher, respectively, than those of the compression molded (CM) samples. The resistivity of the composite films increases significantly with increasing BUR due to the destruction of conductive pathways. These pathways can be partially recovered using an appropriate annealing process. At 8 wt% MWCNTs, there is a sufficient density of nanotubes to maintain a robust network even at high BURs.
Abstract:
Land wars in India: Contestations, social forces and evolving neoliberal urban transformation
The recent incidents of ‘land wars’ in India have highlighted the contradictions and challenges of neoliberal urban transformation through a range of issues across governance, equity and empowerment in the development agenda. Simply put, a strong top-down approach and a corporate-political nexus have determined the modality of land acquisition, compensation and, ultimately, the nature of its consumption, leaving the majority of the urban poor out of its benefits. The paper focuses on the concept of neoliberalism as a modality of urban governance and the emergence of grassroots activism as a counterweight to neoliberal hegemony, by examining the inequity and marginalization that embody these ‘land wars’ in India and the forms of resistance from the grassroots: their capacity, relationships and modus operandi. Emerging lessons suggest the potential for advancing governance from the bottom up, leading to a more equitable distribution of resources. It is argued, however, that a stronger conception of the ‘grassroots’ is needed in both epistemological and empirical contexts. In particular, the preconditions for ‘grassroots organisations’ to foster and play a more effective role require a more inclusive notion of ‘institutionality and plurality’ within the current political-economic context. The empirical focus of the paper is the ‘land wars’ observed in Kolkata, West Bengal; however, reference is also made to other examples across the country.
Abstract:
This paper contributes to the understanding of lime-mortar masonry strength and deformation (which determine durability and allowable stresses/stiffness in design codes) by measuring the mechanical properties of brick bound with lime and lime-cement mortars. Based on the regression analysis of experimental results, models to estimate lime-mortar masonry compressive strength are proposed (less accurate for hydrated lime (CL90s) masonry due to the disparity between mortar and brick strengths). Also, three relationships between masonry elastic modulus and its compressive strength are proposed for cement-lime; hydraulic lime (NHL3.5 and 5); and hydrated/feebly hydraulic lime masonries respectively.
Disagreement between the experimental results and earlier mathematical prediction models (proposed primarily for cement masonry) is caused by a lack of provision for the significant deformation of lime masonry and for the relative changes in strength and stiffness between mortar and brick over time (at 6 months and 1 year, the NHL 3.5 and 5 mortars are often stronger than the brick). Eurocode 6 provided the best predictions of the compressive strength of lime and cement-lime masonry based on the strength of their components. All models vastly overestimated the strength of CL90s masonry at 28 days; however, Eurocode 6 became an accurate predictor after 6 months, when the mortar had acquired most of its final strength and stiffness.
The experimental results agreed with earlier stress-strain curves. It was evidenced that mortar strongly affects masonry deformation, and that the masonry stress-strain relationship becomes increasingly non-linear as mortar strength decreases. It was also noted that the influence of masonry stiffness on its compressive strength becomes smaller as the mortar hydraulicity increases.
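The Eurocode 6 prediction referred to above is based on a power-law relation between the strengths of the components. A minimal sketch of that relation follows; the exponents 0.7 and 0.3 are those of EN 1996-1-1 for general-purpose mortar, while the constant K shown here is only an illustrative value (in the code K depends on the unit group and mortar type, and it is not a value taken from this study).

```python
# Sketch of the Eurocode 6 (EN 1996-1-1) relation for the characteristic
# compressive strength of masonry with general-purpose mortar:
#     f_k = K * f_b**0.7 * f_m**0.3
# K = 0.55 below is an illustrative assumption, not a value from the paper.

def masonry_strength_ec6(f_b, f_m, K=0.55, alpha=0.7, beta=0.3):
    """Characteristic masonry compressive strength, N/mm^2.

    f_b -- normalised compressive strength of the unit (brick)
    f_m -- compressive strength of the mortar
    """
    return K * f_b ** alpha * f_m ** beta
```

The small mortar exponent (0.3) means the predicted masonry strength is relatively insensitive to mortar strength, which helps explain why the formula struggles most for weak hydrated-lime (CL90s) mortars at early ages, when mortar and brick strengths are most disparate.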
Abstract:
The effects of a 100 mm diameter integrally-flanged hole in a hat-stiffened carbon-fibre composite panel, loaded in uniaxial compression, were investigated and compared with a similar panel containing an unflanged hole. Details of the manufacturing techniques used in the production of the integral flange are presented. The stiffening effects of the flange reduced the bending strains around the cutout, which may lead to high interlaminar shear strains, while increasing the membrane strains. These membrane strains were well below the limit strains for this composite material. The skin in the panel with the unflanged hole also underwent a change in buckling mode shape from three longitudinal half-wavelengths to five half-wavelengths. No such change was observed in the flanged panel, which buckled in four longitudinal half-wavelengths. The ultimate strength of both panels was determined by the load-carrying capability of the stiffeners.
Abstract:
The European Cystic Fibrosis Society Clinical Trial Network (ECFS-CTN) has established a Standardization Committee to undertake a rigorous evaluation of promising outcome measures with regard to use in multicentre clinical trials in cystic fibrosis (CF). The aim of this article is to present a review of literature on clinimetric properties of the infant raised-volume rapid thoracic compression (RVRTC) technique in the context of CF, to summarise the consensus amongst the group on feasibility and answer key questions regarding the promotion of this technique to surrogate endpoint status.
METHODS: A literature search (from 1985 onwards) identified 20 papers that met inclusion criteria of RVRTC use in infants with CF. Data were extracted and tabulated regarding repeatability, validity, correlation with other outcome measures, responsiveness and reference values. A working group discussed the tables and answered 4 key questions.
RESULTS: Overall, the RVRTC technique, in particular forced expiratory volume in 0.5 s, showed good clinimetric properties despite the presence of individual variability. Few studies showed a relationship between RVRTC and inflammation and infection, and to date data remain limited regarding the responsiveness of RVRTC after an intervention. Concerns were raised regarding feasibility in multi-centre studies and the availability of reference values.
CONCLUSION: The ECFS-CTN Working Group considers that RVRTC cannot be used as a primary outcome in clinical trials in infants with CF before universal standardization of this measurement is achieved and implementation of inter-institutional networking is in place. We advise its use currently in phase I/II trials and as a secondary endpoint in phase III studies. We emphasise the need for (1) more short-term variability and longitudinal 'natural history' studies, and (2) robust reference values for commercially available devices.
Abstract:
Purpose: To assess the bacterial contamination risk in cataract surgery associated with mechanical compression of the lid margin immediately after sterilization of the ocular surface.
Setting: Department of Cataract, Zhongshan Ophthalmic Center, Sun Yat-sen University, Guangzhou, China.
Design: Prospective randomized controlled double-masked trial.
Methods: Patients with age-related cataract were randomly assigned to 1 of 2 groups. In Group A (153 eyes), the lid margin was compressed and scrubbed through 360 degrees 5 times with a dry sterile cotton-tipped applicator immediately after ocular sterilization and before povidone-iodine irrigation of the conjunctival sac. Group B (153 eyes) had identical sterilization but no lid scrubbing. Samples from the lid margin, the liquid in the collecting bag, and the aqueous humor were collected for bacterial culture. The primary outcome measures included the rate of positive bacterial cultures for the above samples. The species of bacteria isolated were recorded.
Results: Group A and Group B each comprised 153 eyes. The positive rate of lid margin cultures was 54.24%. The positive rate of cultures of the liquid in the collecting bag was significantly higher in Group A (23.53%) than in Group B (9.80%) (P=.001). The bacterial species cultured from the collecting bag in Group B were the same as those from the lid margin in Group A. The positive culture rate of the aqueous humor in both groups was 0%.
Conclusion: Mechanical compression of the lid margin immediately before and during cataract surgery increased the risk for bacterial contamination of the surgical field, perhaps due to secretions from the lid margin glands.
Financial Disclosure: No author has a financial or proprietary interest in any material or method mentioned.
Abstract:
The duration compression effect is a phenomenon in which prior adaptation to a spatially circumscribed dynamic stimulus results in the duration of subsequent subsecond stimuli presented in the adapted region being underestimated. There is disagreement over the frame of reference within which the duration compression phenomenon occurs. One view holds that the effect is driven by retinotopically tuned mechanisms located at early stages of visual processing; an alternative position is that the mechanisms are spatiotopic and occur at later stages of visual processing (MT+). We addressed the retinotopic-spatiotopic question by using adapting stimuli (drifting plaids) that are known to activate global-motion mechanisms in area MT. If spatiotopic mechanisms contribute to the duration compression effect, drifting plaid adaptors should be well suited to revealing them. Following adaptation, participants were tasked with estimating the duration of a 600 ms random-dot stimulus, whose direction was identical to the pattern direction of the adapting plaid, presented at either the same retinotopic or the same spatiotopic location as the adaptor. Our results reveal significant duration compression in both conditions, pointing to the involvement of both retinotopically tuned and spatiotopically tuned mechanisms in the duration compression effect.
Abstract:
This paper builds on previous work to show how holistic and iterative design optimisation tools can be used to produce a commercially viable product that reduces a costly assembly to a single moulded structure. An assembly consisting of a structural metallic support and a compression-moulded outer shell undergoes design optimisation and analysis to remove the support from the assembly process in favour of a structural moulding. The support is analysed and a sheet moulding compound (SMC) alternative is presented; this is then combined into a manufacturable shell design, which is assessed for viability as an alternative to the original.
Alongside this, a robust material selection system is implemented that removes user bias towards particular materials. This system builds on work set out by the Cambridge Material Selector and by Boothroyd and Dewhurst, using a selection of applicable materials currently available for the compression moulding process. The material selection process has been linked into the design and analysis stage via scripts for use in the finite element environment. This builds towards an analysis toolkit intended to develop and enhance the manufacturability of design studies.
Abstract:
This paper describes the first use of inter-particle force measurement in reworked aerosols to better understand the mechanics of dust deflation and its consequent ecological ramifications. Dust is likely to carry hydrocarbons and micro-organisms, including human pathogens and cultured microbes, and is thereby a threat to plants, animals and humans. Present-day global aerosol emissions are substantially greater than in 1850; however, the projected influx rates are highly disputed. This uncertainty, in part, has roots in the lack of understanding of deflation mechanisms. A growing body of literature shows that, where carbon emissions continue to increase, plant transpiration drops and soil water retention is enhanced, allowing more greenery to grow and less dust to flux. On the other hand, a small but important body of geochemistry literature shows that increasing emissions and global temperature lead to extreme climates, decalcification of surface soils containing soluble carbonate polymorphs, and hence a greater chance of deflation. The consistency of loosely packed reworked silt provides background data against which the resistance of dust's bonding components (carbonates and water) can be compared. The use of macro-scale phenomenological approaches to measure dust consistency is trivial. Instead, consistency can be measured in terms of the inter-particle stress state. This paper describes a semi-empirical parametrisation of the inter-particle cohesion forces in terms of the balance of contact-level forces at the instant of particle motion. We put forward the hypothesis that the loss of Ca2+-based pedogenic salts is responsible for much of the dust influx and that surficial drying plays a less significant role.
Abstract:
Data compression is the computing technique that aims to reduce the size of information in order to minimize the storage space required and to speed up data transmission over bandwidth-limited networks. Several compression techniques, such as LZ77 and its variants, suffer from a problem we call redundancy caused by the multiplicity of encodings. Multiplicity of encodings (ME) means that the source data can be encoded in different ways. In its simplest case, ME occurs when a compression technique has the possibility, during the encoding process, of coding a symbol in different ways. The bit-recycling compression technique was introduced by D. Dubé and V. Beaudoin to minimize the redundancy caused by ME. Variants of bit recycling have been applied to LZ77, and the experimental results obtained lead to better compression (a reduction of about 9% in the size of files compressed by Gzip, achieved by exploiting ME). Dubé and Beaudoin pointed out that their technique might not perfectly minimize the redundancy caused by ME, because it is built on Huffman coding, which cannot handle codewords of fractional lengths; that is, it can only generate codewords of integral lengths. Moreover, Huffman-based bit recycling (HuBR) imposes additional constraints to avoid certain situations that degrade its performance. Unlike Huffman codes, arithmetic coding (AC) can handle codewords of fractional lengths. In addition, over recent decades arithmetic codes have attracted many researchers because they are more powerful and more flexible than Huffman codes.
Consequently, this work aims to adapt bit recycling to arithmetic codes in order to improve coding efficiency and flexibility. We addressed this problem through our four (published) contributions, which are presented in this thesis and can be summarized as follows. First, we propose a new technique for adapting Huffman-based bit recycling (HuBR) to arithmetic coding, named arithmetic-coding-based bit recycling (ACBR); it describes the framework and the principles of adapting HuBR to ACBR. We also present the theoretical analysis needed to estimate the redundancy that can be removed by HuBR and ACBR in applications that suffer from ME. This analysis shows that ACBR achieves perfect recycling in all cases, whereas HuBR does so only in very specific cases. Second, the problem with the aforementioned ACBR technique is that it requires arbitrary-precision arithmetic, which demands unlimited (or infinite) resources. To make it practical, we propose a new finite-precision version; the technique thus becomes efficient and applicable on computers with conventional fixed-size registers, and can easily be interfaced with applications that suffer from ME. Third, we propose the use of HuBR and ACBR as a means of reducing redundancy in order to obtain a variable-to-fixed binary code. We proved theoretically and experimentally that both techniques yield a significant improvement (less redundancy). In this respect, ACBR outperforms HuBR and covers a wider class of binary sources that can benefit from a plurally parsable dictionary. We also show that ACBR is more flexible than HuBR in practice.
Fourth, we use HuBR to reduce the redundancy of the balanced codes generated by Knuth's algorithm. To compare the performance of HuBR and ACBR, the corresponding theoretical results for both techniques are presented. The results show that the two techniques achieve almost the same redundancy reduction on the balanced codes generated by Knuth's algorithm.
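The limitation of Huffman coding that motivates ACBR, namely its restriction to codewords of integral bit lengths, can be illustrated with a small numeric sketch. The source probabilities below are illustrative assumptions, not figures from the thesis: for a skewed binary source, Huffman coding cannot spend fewer than 1 bit per symbol, while arithmetic coding can approach the (fractional) entropy.

```python
# Why fractional codeword lengths matter: for a skewed binary source,
# Huffman coding must assign each symbol an integral number of bits
# (here, 1 bit each), while arithmetic coding can approach the entropy.
# The probabilities are illustrative, not taken from the thesis.
import math

def entropy(probs):
    """Shannon entropy in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.9, 0.1]             # skewed binary source
h = entropy(probs)             # roughly 0.47 bits/symbol
huffman_rate = 1.0             # best a symbol-wise Huffman code can do here
redundancy = huffman_rate - h  # roughly 0.53 bits/symbol wasted
```

This per-symbol gap is a different (and larger) effect than the ME redundancy targeted by bit recycling, but it shows why building recycling on arithmetic coding, which is not bound to integral codeword lengths, removes a structural constraint that HuBR inherits from Huffman codes.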