67 results for Deep eutectic solvent
Abstract:
The deep transverse metatarsal ligaments (DTML) play an important role in stabilizing the metatarsal bones and controlling deformation of the foot's transverse arch. However, biomechanical research on the DTML during foot maneuvers is scarce. Because these ligaments are difficult to monitor experimentally and suitable measurement technology is lacking, their load transfer mechanism and internal stress state have not been well characterized. The purpose of this study was to develop a detailed finite element model of the foot that includes the DTML, in order to investigate the mechanical response of the DTML during landing. The DTML were represented with a hyperelastic material model to capture the nonlinear and nearly incompressible behavior of ligament tissue. The simulation results show that the peak maximal principal stress in the DTML occurred between the third and fourth metatarsals, and that the DTML in the middle positions experienced higher tension than those at the sides.
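The abstract does not specify which hyperelastic strain-energy function was used; as a minimal illustrative sketch, a common choice for nearly incompressible ligament tissue is a neo-Hookean form with a volumetric penalty term:

W = C_{10}\,(\bar{I}_1 - 3) + \frac{1}{D_1}\,(J - 1)^2, \qquad \bar{I}_1 = J^{-2/3}\,\operatorname{tr}(\mathbf{C}), \quad J = \det(\mathbf{F}),

where F is the deformation gradient, C = F^T F is the right Cauchy-Green tensor, and C_{10} and D_1 are material constants; a small D_1 strongly penalises volume change and so enforces near-incompressibility.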
Abstract:
Groundwater tables are rising beneath irrigated fields in some areas of the Lower Burdekin in North Queensland, Australia. The soils where this occurs are predominantly sodic clay soils with low hydraulic conductivities. Many of these soils have been treated by applying gypsum or by increasing the salinity of irrigation water through mixing saline groundwater with fresh river water. While the purpose of these treatments is to increase infiltration into the surface soils and improve the productivity of the root zone, it is thought that the treatments may have altered the soil hydraulic properties well below the root zone, leading to increased groundwater recharge and rising water tables. In this paper we discuss the use of column experiments and HYDRUS modelling, with major ion reaction and transport and soil-water-chemistry-dependent hydraulic conductivity, to assess the likely depth, magnitude and timing of the impacts of surface soil amelioration on soil hydraulic properties below the root zone and hence on groundwater recharge. In the experiments, columns of sodic clays from the Lower Burdekin were leached for extended periods with either gypsum solutions or mixed cation salt solutions, and changes in hydraulic conductivity were measured. Leaching with a gypsum solution for an extended period, until the flow rate stabilised, resulted in an approximately twenty-fold increase in hydraulic conductivity compared with a low-salinity, mixed cation solution. HYDRUS modelling was used to highlight the factors that might influence the impacts of soil treatment, particularly at depth, including the large amounts of rain during the relatively short wet season and the presence of thick, low-permeability clay layers.
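As an illustration of how such column experiments yield hydraulic conductivity, the sketch below applies Darcy's law for a constant-head column test; all dimensions and flow rates are hypothetical placeholders, not data from this study.

# Illustrative Darcy's-law calculation for a constant-head leaching column.
# All numbers are hypothetical placeholders, not measurements from the study.

def hydraulic_conductivity(flow_rate_cm3_h, length_cm, area_cm2, head_cm):
    """Saturated hydraulic conductivity K = Q * L / (A * dH), in cm/h."""
    return (flow_rate_cm3_h * length_cm) / (area_cm2 * head_cm)

# Hypothetical steady-state flow rates after extended leaching:
k_gypsum = hydraulic_conductivity(flow_rate_cm3_h=2.0, length_cm=10.0,
                                  area_cm2=20.0, head_cm=15.0)
k_mixed = hydraulic_conductivity(flow_rate_cm3_h=0.1, length_cm=10.0,
                                 area_cm2=20.0, head_cm=15.0)

print(f"K (gypsum solution): {k_gypsum:.4f} cm/h")
print(f"K (low-salinity mixed cation solution): {k_mixed:.4f} cm/h")
print(f"Ratio: {k_gypsum / k_mixed:.0f}x")  # ~20x, matching the reported increase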
Abstract:
This paper introduces a machine-learning-based system for controlling a robotic manipulator using visual perception alone. The capability to autonomously learn robot controllers solely from raw-pixel images, without any prior knowledge of the robot's configuration, is shown for the first time. We build on the recent success of deep reinforcement learning and develop a system that learns target reaching with a three-joint robot manipulator using external visual observation. A Deep Q Network (DQN) was demonstrated to perform target reaching after training in simulation. A naive transfer of the network to real hardware and real observations failed, but experiments show that the network works when camera images are replaced with synthetic images.
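The abstract does not detail the network architecture; the following is a minimal PyTorch sketch of the general technique, a DQN mapping raw-pixel camera frames to Q-values over discrete joint actions, with epsilon-greedy action selection. The layer sizes, input resolution, and action discretisation are illustrative assumptions, not the paper's implementation.

# Minimal DQN sketch (PyTorch) for pixel-based target reaching.
# Architecture and hyperparameters are illustrative assumptions only.
import random
import torch
import torch.nn as nn

class DQN(nn.Module):
    def __init__(self, n_actions: int):
        super().__init__()
        # Convolutional encoder for raw 84x84 grayscale camera frames.
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 9 * 9, 256), nn.ReLU(),
            nn.Linear(256, n_actions),  # one Q-value per discrete joint action
        )

    def forward(self, x):
        return self.net(x)

def select_action(model: DQN, frame: torch.Tensor, epsilon: float) -> int:
    """Epsilon-greedy selection over Q-values; frame has shape (1, 84, 84)."""
    if random.random() < epsilon:
        return random.randrange(model.net[-1].out_features)
    with torch.no_grad():
        return int(model(frame.unsqueeze(0)).argmax(dim=1).item())

# E.g., 7 actions: +/- velocity for each of 3 joints, plus "hold".
policy = DQN(n_actions=7)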
Abstract:
Organochlorine pesticides (OCPs) are ubiquitous environmental contaminants with adverse impacts on aquatic biota, wildlife and human health, even at low concentrations. However, conventional methods for their determination in river sediments are resource intensive. This paper presents a rapid and reliable approach for the detection of OCPs. Accelerated Solvent Extraction (ASE) with in-cell silica gel clean-up, followed by gas chromatography–triple quadrupole mass spectrometry (GC-MS/MS), was used to recover OCPs from sediment samples. Variables such as temperature, solvent ratio, adsorbent mass and the number of extraction cycles were evaluated and optimised for the extraction. With the exception of Aldrin, which was unaffected by any of the variables evaluated, the recovery of OCPs from sediment samples was largely influenced by solvent ratio and adsorbent mass and, to some extent, by the number of cycles and the temperature. The optimised conditions for OCP extraction from sediment with good recoveries were determined to be 4 cycles, 4.5 g of silica gel, 105 °C, and a 4:3 v/v DCM:hexane mixture. With the exception of two compounds (α-BHC and Aldrin), whose recoveries were low (59.73% and 47.66%, respectively), the recoveries of the other pesticides were in the range 85.35–117.97%, with precision < 10% RSD. The method developed significantly reduces sample preparation time, the amount of solvent used, and matrix interference, and is highly sensitive and selective.
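As a minimal sketch of how the recovery and precision figures quoted above are typically computed from spiked replicate measurements (the replicate values and spike level below are hypothetical, not the study's data):

# Recovery (%) and precision (% RSD) from spiked-sample replicates.
# Replicate values are hypothetical placeholders, not the study's data.
import statistics

def recovery_and_rsd(measured, spiked_conc):
    """Mean recovery (%) and relative standard deviation (% RSD)."""
    recoveries = [100.0 * m / spiked_conc for m in measured]
    mean_rec = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean_rec
    return mean_rec, rsd

# Hypothetical triplicate results for a sediment sample spiked at 50 ng/g:
measured_ng_g = [46.1, 44.8, 47.0]
rec, rsd = recovery_and_rsd(measured_ng_g, spiked_conc=50.0)
print(f"Recovery: {rec:.2f} %   RSD: {rsd:.2f} %")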
Abstract:
This chapter provides a critical legal geography of outer Space, charting the topography of the debates and struggles around its definition, management, and possession. As the emerging field of critical legal geography demonstrates, law is not a neutral organiser of space, but is instead a powerful cultural technology of spatial production. Drawing on legal documents such as the Outer Space Treaty and the Moon Treaty, as well as on the analogous and precedent-setting legal geographies of Antarctica and the deep seabed, the chapter addresses key questions about the legal geography of outer Space, questions which are of growing importance as available satellite slots in the geostationary orbit diminish, Space weapons and mining become increasingly viable, Space colonisation and tourism emerge, and questions about Space’s legal status grow in intensity. Who owns outer Space? Who, and whose rules, govern what may or may not (literally) take place there? Is the geostationary orbit the sovereign property of the equatorial states it overlies, as these states argued in the 1970s? Or is it part of the res communis, the common property of humanity, which currently legally characterises outer Space? Does Space belong to no one, or to everyone? As challenges to the existing legal spatiality of outer Space emerge from spacefaring states, companies, and non-spacefaring states, it is particularly critical that the current spatiality of Space is understood and considered.
Abstract:
Deep packet inspection is a technology that enables the examination of the content of information packets being sent over the Internet. The Internet was originally designed around “end-to-end connectivity”, allowing nodes of the network to send packets to all other nodes without requiring intermediate network elements to maintain status information about the transmission. In this way, the Internet was created as a “dumb” network, with “intelligent” devices (such as personal computers) at the end or “last mile” of the network. The dumb network does not interfere with an application's operation, nor is it sensitive to the needs of an application, and as such it treats all information sent over it as (more or less) equal. Yet deep packet inspection allows the examination of packets at places on the network which are not endpoints. In practice, this permits entities such as Internet service providers (ISPs) or governments to observe the content of the information being sent, and perhaps even manipulate it. Indeed, the existence and implementation of deep packet inspection may profoundly challenge the egalitarian and open character of the Internet. This paper will first elaborate on what deep packet inspection is and how it works from a technological perspective, before examining how it is being used in practice by governments and corporations. The use of deep packet inspection has already created legal problems involving fundamental rights (especially of Internet users), such as freedom of expression and privacy, as well as more economic concerns, such as competition and copyright. These issues will be considered, and the conformity of the use of deep packet inspection with the law will be assessed. The paper will concentrate on the use of deep packet inspection in European and North American jurisdictions, where it has already provoked debate, particularly in the context of discussions on net neutrality. It will also incorporate a more fundamental assessment of the values that the Internet should respect and exhibit (such as openness, equality and neutrality), before concluding with the formulation of a legal and regulatory response to the use of this technology, in accordance with these values.
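To make the header-versus-payload distinction concrete, the following is a minimal Python sketch using scapy, contrasting the “shallow” header fields routing needs with the “deep” reading of application payload. The keyword list and matching logic are illustrative assumptions; sniffing requires root privileges and should only be run on traffic one is authorised to observe.

# Sketch of header-only vs payload (deep) inspection with scapy.
# Install with: pip install scapy. Keywords are hypothetical examples.
from scapy.all import sniff, IP, TCP, Raw

KEYWORDS = [b"bittorrent", b"password"]  # hypothetical patterns of interest

def inspect(pkt):
    if not (pkt.haslayer(IP) and pkt.haslayer(TCP)):
        return
    # "Shallow" inspection: routing only ever needs these header fields.
    src, dst, dport = pkt[IP].src, pkt[IP].dst, pkt[TCP].dport
    # "Deep" inspection: reading the application payload itself.
    if pkt.haslayer(Raw):
        payload = pkt[Raw].load
        for kw in KEYWORDS:
            if kw in payload.lower():
                print(f"{src} -> {dst}:{dport} payload matched {kw!r}")

sniff(filter="tcp", prn=inspect, count=100)  # inspect 100 TCP packets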