900 results for Forces de compression
Abstract:
Purpose: To assess the bacterial contamination risk in cataract surgery associated with mechanical compression of the lid margin immediately after sterilization of the ocular surface.
Setting: Department of Cataract, Zhongshan Ophthalmic Center, Sun Yat-sen University, Guangzhou, China.
Design: Prospective randomized controlled double-masked trial.
Methods: Patients with age-related cataract were randomly assigned to 1 of 2 groups. In Group A (153 eyes), the lid margin was compressed and scrubbed through 360 degrees 5 times with a dry sterile cotton-tipped applicator immediately after ocular sterilization and before povidone-iodine irrigation of the conjunctival sac. Group B (153 eyes) had identical sterilization but no lid scrubbing. Samples from the lid margin, the liquid in the collecting bag, and the aqueous humor were collected for bacterial culture. Primary outcome measures included the rate of positive bacterial cultures for the above samples. The species of bacteria isolated were recorded.
Results: Group A and Group B each comprised 153 eyes. The positive rate of lid margin cultures was 54.24%. The positive rate of cultures of the liquid in the collecting bag was significantly higher in Group A (23.53%) than in Group B (9.80%) (P = .001). The bacterial species cultured from the collecting bag in Group B were the same as those from the lid margin in Group A. The positive culture rate of the aqueous humor was 0% in both groups.
Conclusion: Mechanical compression of the lid margin immediately before and during cataract surgery increased the risk for bacterial contamination of the surgical field, perhaps due to secretions from the lid margin glands.
Financial Disclosure: No author has a financial or proprietary interest in any material or method mentioned.
Abstract:
The duration compression effect is a phenomenon in which prior adaptation to a spatially circumscribed dynamic stimulus results in the duration of subsequent subsecond stimuli presented in the adapted region being underestimated. There is disagreement over the frame of reference within which the duration compression phenomenon occurs. One view holds that the effect is driven by retinotopically tuned mechanisms located at early stages of visual processing; an alternative position is that the mechanisms are spatiotopic and occur at later stages of visual processing (MT+). We addressed the retinotopic-spatiotopic question by using adapting stimuli (drifting plaids) that are known to activate global-motion mechanisms in area MT. If spatiotopic mechanisms contribute to the duration compression effect, drifting plaid adaptors should be well suited to revealing them. Following adaptation, participants were tasked with estimating the duration of a 600 ms random-dot stimulus, whose direction was identical to the pattern direction of the adapting plaid, presented at either the same retinotopic or the same spatiotopic location as the adaptor. Our results reveal significant duration compression in both conditions, pointing to the involvement of both retinotopically tuned and spatiotopically tuned mechanisms in the duration compression effect.
Abstract:
This paper builds on previous work to show how holistic and iterative design optimisation tools can be used to produce a commercially viable product that reduces a costly assembly to a single moulded structure. An assembly consisting of a structural metallic support and a compression-moulded outer shell undergoes design optimisation and analysis to remove the support from the assembly process in favour of a structural moulding. The support is analysed and a sheet moulding compound (SMC) alternative is presented; this is then combined into a manufacturable shell design, which is assessed for viability as an alternative to the original.
Alongside this, a robust material selection system is implemented that removes user bias towards particular materials. This system builds on the work set out by the Cambridge Material Selector and by Boothroyd and Dewhurst, while using a selection of applicable materials currently available for the compression moulding process. The material selection process has been linked into the design and analysis stage via scripts for use in the finite element environment. This builds towards an analysis toolkit intended to develop and enhance the manufacturability of design studies.
Abstract:
This paper describes the first use of inter-particle force measurement in reworked aerosols to better understand the mechanics of dust deflation and its consequent ecological ramifications. Dust is likely to carry hydrocarbons and micro-organisms, including human pathogens and cultured microbes, and is thereby a threat to plants, animals and humans. Present-day global aerosol emissions are substantially greater than in 1850; however, the projected influx rates are highly disputable. This uncertainty, in part, has roots in the lack of understanding of deflation mechanisms. A growing body of literature shows that if carbon emissions continue to increase, plant transpiration drops and soil water retention is enhanced, allowing more greenery to grow and less dust to flux. On the other hand, a small but important body of geochemistry literature shows that increasing emissions and global temperature lead to extreme climates, decalcification of surface soils containing soluble carbonate polymorphs and hence a greater chance of deflation. The consistency of loosely packed reworked silt provides background data against which the resistance of dust's bonding components (carbonates and water) can be compared. The use of macro-scale phenomenological approaches to measure dust consistency is trivial. Instead, consistency can be measured in terms of the inter-particle stress state. This paper describes a semi-empirical parametrisation of the inter-particle cohesion forces in terms of the balance of contact-level forces at the instant of particle motion. We put forward the hypothesis that the loss of Ca2+-based pedogenic salts is responsible for much of the dust influx and that surficial drying plays a less significant role.
Abstract:
Data compression is the computing technique that aims to reduce the size of information in order to minimize the storage space required and to speed up data transmission over bandwidth-limited networks. Several compression techniques, such as LZ77 and its variants, suffer from a problem we call redundancy caused by the multiplicity of encodings. Multiplicity of encodings (ME) means that the source data can be encoded in different ways. In its simplest form, ME occurs when a compression technique has the option, during the encoding process, of coding a symbol in several different ways. The bit-recycling compression technique was introduced by D. Dubé and V. Beaudoin to minimize the redundancy caused by ME. Variants of bit recycling have been applied to LZ77, and the experimental results obtained lead to better compression (a reduction of about 9% in the size of files compressed by Gzip, by exploiting ME). Dubé and Beaudoin pointed out that their technique may not perfectly minimize the redundancy caused by ME, because it is built on Huffman coding, which cannot handle codewords of fractional length; that is, it can only generate codewords of integral length. Moreover, Huffman-based bit recycling (HuBR) imposes additional constraints to avoid certain situations that degrade its performance. Unlike Huffman codes, arithmetic coding (AC) can handle codewords of fractional length. Furthermore, over the last few decades arithmetic codes have attracted many researchers because they are more powerful and more flexible than Huffman codes. Consequently, this work aims to adapt bit recycling to arithmetic codes in order to improve coding efficiency and flexibility. We addressed this problem through our four (published) contributions, which are presented in this thesis and can be summarized as follows. First, we propose a new technique for adapting Huffman-based bit recycling (HuBR) to arithmetic coding, called arithmetic-coding-based bit recycling (ACBR); it describes the framework and the principles of adapting HuBR to ACBR. We also present the theoretical analysis needed to estimate the redundancy that can be removed by HuBR and ACBR in applications that suffer from ME. This analysis shows that ACBR achieves perfect recycling in all cases, whereas HuBR does so only in very specific cases. Second, the problem with the aforementioned ACBR technique is that it requires arbitrary-precision computations, which demand unlimited (or infinite) resources. To make it usable, we propose a new finite-precision version; the technique thus becomes efficient and applicable on computers with conventional fixed-size registers and can easily be interfaced with applications that suffer from ME. Third, we propose the use of HuBR and ACBR as a means of reducing redundancy in order to obtain a variable-to-fixed-length binary code.
We have proved theoretically and experimentally that both techniques yield a significant improvement (less redundancy). In this respect, ACBR outperforms HuBR and covers a broader class of binary sources that can benefit from a plurally parsable dictionary. In addition, we show that ACBR is more flexible than HuBR in practice. Fourth, we use HuBR to reduce the redundancy of the balanced codes generated by Knuth's algorithm. To compare the performance of HuBR and ACBR, the corresponding theoretical results for both are presented. The results show that the two techniques achieve almost the same redundancy reduction on the balanced codes generated by Knuth's algorithm.
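A minimal, hypothetical sketch (not taken from the thesis; the decoder, token streams and probabilities are invented) of the two ideas this abstract turns on: an LZ77-style stream can encode the same data with different token sequences, which is the multiplicity of encodings that bit recycling exploits, and Huffman codewords have whole-bit lengths whereas arithmetic coding can approach a fractional average cost per symbol.

from math import log2

def lz77_decode(tokens):
    """Decode ('lit', char) and ('match', offset, length) tokens."""
    out = []
    for t in tokens:
        if t[0] == 'lit':
            out.append(t[1])
        else:
            _, offset, length = t
            for _ in range(length):
                out.append(out[-offset])
    return ''.join(out)

# Two different token streams decode to the same string: the encoder had a
# free choice, and that unused choice is the redundancy bit recycling targets.
stream_a = [('lit', 'a'), ('lit', 'b'), ('match', 2, 4)]
stream_b = [('lit', 'a'), ('lit', 'b'), ('lit', 'a'), ('lit', 'b'), ('match', 2, 2)]
assert lz77_decode(stream_a) == lz77_decode(stream_b) == 'ababab'

# Huffman codewords have integral bit lengths, so a binary source with
# P(0) = 0.9 still costs 1 bit per symbol; ideal arithmetic coding approaches
# the entropy, i.e. a fractional average cost per symbol.
p = 0.9
huffman_cost = 1.0                                  # bits/symbol, integral lengths
entropy = -(p * log2(p) + (1 - p) * log2(1 - p))    # about 0.469 bits/symbol
print(f"Huffman: {huffman_cost:.3f} b/sym, ideal arithmetic: {entropy:.3f} b/sym")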
Abstract:
Senior thesis written for Oceanography 445
Abstract:
The Wyner-Ziv video coding (WZVC) rate-distortion performance is highly dependent on the quality of the side information, an estimate of the original frame created at the decoder. This paper characterizes WZVC efficiency when motion-compensated frame interpolation (MCFI) techniques are used to generate the side information, a difficult problem in WZVC especially because the decoder only has some decoded reference frames available. The proposed WZVC compression efficiency rate model relates the power spectral density of the estimation error to the accuracy of the MCFI motion field. Some interesting conclusions may then be derived regarding the impact of the motion field's smoothness, and of its correlation with the true motion trajectories, on compression performance.
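A minimal, hypothetical sketch (not the paper's model; the frames, block size and motion field below are invented) of how a decoder can build side information by motion-compensated frame interpolation: each block of the missing frame is predicted halfway along the motion vector estimated between the two decoded reference frames, and the energy of the resulting estimation error is what a rate model of this kind depends on.

import numpy as np

def mcfi_side_info(ref_prev, ref_next, motion, block=8):
    """Interpolate the middle frame from two references and a per-block
    motion field motion[by, bx] = (dy, dx) measured from ref_prev to ref_next."""
    h, w = ref_prev.shape
    side = np.zeros_like(ref_prev, dtype=np.float64)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            dy, dx = motion[by // block, bx // block]
            # Halfway positions along the (assumed linear) motion trajectory.
            py, px = by - dy // 2, bx - dx // 2      # position in ref_prev
            ny, nx = by + dy // 2, bx + dx // 2      # position in ref_next
            py, px = np.clip(py, 0, h - block), np.clip(px, 0, w - block)
            ny, nx = np.clip(ny, 0, h - block), np.clip(nx, 0, w - block)
            side[by:by+block, bx:bx+block] = 0.5 * (
                ref_prev[py:py+block, px:px+block] +
                ref_next[ny:ny+block, nx:nx+block])
    return side

# Toy usage: a bright square translating 2 px per frame, with the true motion.
h = w = 32
f0, f1, f2 = (np.zeros((h, w)) for _ in range(3))
f0[8:16, 8:16] = f1[10:18, 10:18] = f2[12:20, 12:20] = 255.0
motion = np.tile(np.array([4, 4]), (h // 8, w // 8, 1))   # (dy, dx) per block
si = mcfi_side_info(f0, f2, motion)
mse = np.mean((si - f1) ** 2)   # estimation-error energy; zero here because the toy motion is exact
print(f"side-information MSE: {mse:.2f}")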
Abstract:
This study compared the ground reaction forces (GRF) and plantar pressures between unloaded and occasional loaded gait. The GRF and plantar pressures of 60 participants were recorded during unloaded gait and occasional loaded gait (wearing a backpack that raised their body mass index to 30); this load criterion was adopted because it is considered potentially harmful in permanent loaded gait (obese people). The results indicate an overall increase (absolute values) of GRF and plantar pressures during occasional loaded gait (p < 0.05); higher normalized (by total weight) values in the medial midfoot and toes, and lower values in the lateral rearfoot region, were also observed. During loaded gait, the magnitude of the vertical GRF (impact and thrust maxima) decreased and the shear forces increased more than did the proportion of the load (normalized values). These data suggest a different pattern of GRF and plantar pressure distribution during occasional loaded gait compared with unloaded gait.
Abstract:
Si3N4 tools were coated with a thin diamond film using a Hot-Filament Chemical Vapour Deposition (HFCVD) reactor in order to machine grey cast iron. The wear behaviour of these tools in high-speed machining was the main subject of this work. Turning tests were performed with combinations of cutting speeds of 500, 700 and 900 m min−1 and feed rates of 0.1, 0.25 and 0.4 mm rot−1, with the depth of cut kept constant at 1 mm. To evaluate tool behaviour during the turning tests, cutting forces were analysed, and a significant increase with feed rate was verified. Diamond film removal occurred for the most severe set of cutting parameters. Adhesion of iron and manganese from the workpiece to the tool was also observed. Tests were performed on a CNC lathe fitted with a 3-axis dynamometer, and results were collected and recorded by in-house software. Tool wear analysis was carried out with a Scanning Electron Microscope (SEM) equipped with an X-ray Energy Dispersive Spectroscopy (EDS) system, and surface analysis was performed with a profilometer.
Abstract:
Purpose: Because walking is highly recommended for prevention and treatment of obesity and some of its biomechanical aspects are not clearly understood for overweight people, we compared the absolute and normalized ground reaction forces (GRF), plantar pressures, and temporal parameters of normal-weight and overweight participants during overground walking. Method: A force plate and an in-shoe pressure system were used to record GRF, plantar pressures (foot divided into 10 regions), and temporal parameters of 17 overweight adults and 17 gender-matched normal-weight adults while walking. Results: With high effect sizes, the overweight participants showed higher absolute medial-lateral and vertical GRF and pressure peaks in the central rearfoot, lateral midfoot, and lateral and central forefoot. However, analyzing normalized (scaled to body weight) data, the overweight participants showed lower vertical and anterior-posterior GRF and lower pressure peaks in the medial rearfoot and hallux, but the lateral forefoot peaks continued to be greater compared with normal-weight participants. The medial-lateral GRF and midfoot pressure peaks occurred later in overweight individuals. Conclusions: The overweight participants adapted their gait pattern to minimize the consequences of the higher vertical and propulsive GRF on their musculoskeletal system. However, they were not able to improve their balance, as indicated by the medial-lateral GRF. The overweight participants showed higher absolute pressure peaks in 4 out of 10 foot regions. Furthermore, the normalized data suggest that the lateral forefoot in overweight adults was loaded more than the proportion of their extra weight, while the hallux and medial rearfoot were seemingly protected.
Abstract:
Biomechanical gait parameters, namely ground reaction forces (GRFs) and plantar pressures, during load carriage by young adults were compared at a low gait cadence and a high gait cadence. Differences between load carriage and normal walking at both gait cadences were also assessed. A force plate and an in-shoe plantar pressure system were used to assess 60 adults while they were walking either normally (unloaded condition) or wearing a backpack (loaded condition) at low (70 steps per minute) and high (120 steps per minute) gait cadences. GRF and plantar pressure peaks were scaled to body weight (or body weight plus backpack weight). With medium to high effect sizes, we found greater anterior-posterior and vertical GRFs and greater plantar pressure peaks in the rearfoot, forefoot and hallux when the participants walked carrying a backpack at the high gait cadence compared to the low gait cadence. Differences between the loaded and unloaded conditions at both gait cadences were also observed.
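A small hypothetical sketch (all masses and force values invented) of the normalisation these gait studies rely on: absolute GRF and pressure peaks are divided by body weight in the unloaded condition and by body weight plus backpack weight in the loaded condition, which is why an absolute increase under load can still appear as a relative decrease.

BODY_MASS_KG = 70.0          # assumed participant mass
BACKPACK_MASS_KG = 15.0      # assumed backpack mass
G = 9.81                     # gravitational acceleration, m/s^2

def normalize_peak(peak_newtons, loaded):
    """Express a GRF peak as a multiple of the weight actually carried."""
    mass = BODY_MASS_KG + (BACKPACK_MASS_KG if loaded else 0.0)
    return peak_newtons / (mass * G)

# Example: the loaded peak is larger in newtons but smaller once scaled
# to the larger supported weight.
unloaded_peak = 770.0    # N, invented
loaded_peak = 890.0      # N, invented
print(normalize_peak(unloaded_peak, loaded=False))  # ~1.12 x body weight
print(normalize_peak(loaded_peak, loaded=True))     # ~1.07 x total weight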
Abstract:
1929/01/01.