904 results for New cutting tool
Abstract:
Two-dimensional flood inundation modelling is a widely used tool to aid flood risk management. In urban areas, where asset value and population density are greatest, the model spatial resolution required to represent flows through a typical street network (i.e. < 10 m) often results in impractical computational cost at the whole-city scale. Explicit diffusive storage cell models become very inefficient at such high resolutions, relative to shallow water models, because the stable time step in such schemes scales quadratically with resolution. This paper presents the calibration and evaluation of a recently developed formulation of the LISFLOOD-FP model, where stability is controlled by the Courant–Friedrichs–Lewy condition for the shallow water equations, such that the stable time step instead scales linearly with resolution. The case study is based on observations during the summer 2007 floods in Tewkesbury, UK. Aerial photography is available for model evaluation on three separate days from the 24th to the 31st of July. The model covered a 3.6 km by 2 km domain and was calibrated using gauge data from high flows during the previous month. The new formulation was benchmarked against the original version of the model at 20 m and 40 m resolutions, demonstrating equally accurate performance given the available validation data but at a 67 times faster computation time. The July event was then simulated at the 2 m resolution of the available airborne LiDAR DEM. This resulted in a significantly more accurate simulation of the drying dynamics compared with that simulated by the coarse resolution models, although estimates of peak inundation depth were similar.
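The resolution scaling argument above can be illustrated with a toy calculation. All coefficients below (diffusivity, Courant number, water depth) are illustrative assumptions, not LISFLOOD-FP's actual solver parameters; only the quadratic-versus-linear dependence on grid spacing reflects the abstract:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def dt_diffusive(dx, diffusivity):
    """Stable time step for an explicit diffusive storage-cell scheme.

    Illustrative only: the key point is the quadratic dependence on the
    grid spacing dx (the coefficient depends on the scheme details).
    """
    return dx ** 2 / (4.0 * diffusivity)

def dt_shallow_water(dx, h_max, courant=0.7):
    """CFL-limited time step for a shallow-water formulation.

    Scales linearly with dx; h_max is the deepest water in the domain.
    """
    return courant * dx / math.sqrt(G * h_max)

# Halving the grid spacing quarters the diffusive time step but only
# halves the CFL time step -- hence the efficiency gain at fine grids.
for dx in (40.0, 20.0, 10.0, 2.0):
    print(dx, dt_diffusive(dx, diffusivity=50.0), dt_shallow_water(dx, h_max=2.0))
```

The ratio of the two time steps grows as the grid is refined, which is why the diffusive scheme becomes impractical at street-scale (< 10 m) resolutions while the CFL-limited scheme remains usable.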
Abstract:
Resistant strains of Plasmodium falciparum and the unavailability of useful antimalarial vaccines reinforce the need to develop new efficacious antimalarials. This study details a pharmacophore model that has been used to identify a potent, soluble, orally bioavailable antimalarial bisquinoline, metaquine (N,N'-bis(7-chloroquinolin-4-yl)benzene-1,3-diamine) (dihydrochloride), which is active against Plasmodium berghei in vivo (oral ID50 of 25 μmol/kg) and multidrug-resistant Plasmodium falciparum K1 in vitro (0.17 μM). Metaquine shows strong affinity for the putative antimalarial receptor, heme, at pH 7.4 in aqueous DMSO. Both crystallographic analyses and quantum mechanical calculations (HF/6-31+G*) reveal important regions of protonation and bonding thought to persist at parasitic vacuolar pH, concordant with our receptor model. Formation of the drug–heme adduct in solution was confirmed using high-resolution positive ion electrospray mass spectrometry. Metaquine showed strong binding with the receptor in a 1:1 ratio (log K = 5.7 ± 0.1), as predicted by molecular mechanics calculations. This study illustrates a rational multidisciplinary approach for the development of new 4-aminoquinoline antimalarials, with efficacy superior to chloroquine, based on the use of a pharmacophore model.
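For a 1:1 complex with log K = 5.7, the bound fraction follows from the equilibrium expression K = [DH]/([D][H]) together with mass balance on drug and heme. A minimal sketch (the 10 μM concentrations are chosen for illustration, not taken from the study):

```python
import math

def bound_fraction(d_total, h_total, log_k):
    """Fraction of drug bound in a 1:1 drug-heme complex.

    Solves K = [DH]/([D][H]) with mass balance; concentrations in mol/L.
    The quadratic root below is the physically meaningful (smaller) one.
    """
    k = 10.0 ** log_k
    b = d_total + h_total + 1.0 / k
    dh = (b - math.sqrt(b * b - 4.0 * d_total * h_total)) / 2.0
    return dh / d_total

# With log K = 5.7 and 10 uM each of drug and heme (assumed concentrations),
# roughly two-thirds of the drug is complexed at equilibrium.
print(bound_fraction(1e-5, 1e-5, 5.7))
```

A larger K drives the fraction toward 1, which is why a binding constant of this magnitude is consistent with the strong drug–heme association the abstract reports.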
Abstract:
The assumption that negligible work is involved in the formation of new surfaces in the machining of ductile metals is re-examined in the light of both current Finite Element Method (FEM) simulations of cutting and modern ductile fracture mechanics. The work associated with separation criteria in FEM models is shown to be in the kJ/m2 range rather than the few J/m2 of the surface energy (surface tension) employed by Shaw in his pioneering study of 1954, following which consideration of surface work has been omitted from analyses of metal cutting. The much greater values of specific surface work are not surprising in terms of ductile fracture mechanics, where kJ/m2 values of fracture toughness are typical of the ductile metals involved in machining studies. This paper shows that when even the simple Ernst–Merchant analysis is generalised to include significant surface work, many of the experimental observations for which traditional 'plasticity and friction only' analyses seem to have no quantitative explanation are given meaning. In particular, the primary shear plane angle φ becomes material-dependent. The experimental increase of φ up to a saturated level, as the uncut chip thickness is increased, is predicted. The positive intercepts found in plots of cutting force vs. depth of cut, and in plots of force resolved along the primary shear plane vs. area of shear plane, are shown to be measures of the specific surface work. It is demonstrated that neglect of these intercepts in cutting analyses is the reason why anomalously high values of shear yield stress are derived at those very small uncut chip thicknesses at which the so-called size effect becomes evident. The material toughness/strength ratio, combined with the depth of cut to form a non-dimensional parameter, is shown to control ductile cutting mechanics.
The toughness/strength ratio of a given material will change with rate, temperature, and thermomechanical treatment and the influence of such changes, together with changes in depth of cut, on the character of machining is discussed. Strength or hardness alone is insufficient to describe machining. The failure of the Ernst–Merchant theory seems less to do with problems of uniqueness and the validity of minimum work, and more to do with the problem not being properly posed. The new analysis compares favourably and consistently with the wide body of experimental results available in the literature. Why considerable progress in the understanding of metal cutting has been achieved without reference to significant surface work is also discussed.
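The role of the positive force intercept can be sketched numerically: fitting cutting force per unit width against uncut chip thickness recovers the specific surface work from the intercept, while dividing the total force by thickness (i.e. neglecting the intercept) inflates the apparent stress at small thicknesses. All material values below are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Synthetic cutting-force data per unit width: F/w = k * t + R, where t is
# the uncut chip thickness, k lumps the shear-plane plasticity and friction
# terms, and the intercept R is the specific work of surface separation.
k = 800e6          # effective cutting pressure slope, Pa (assumed)
R = 20e3           # fracture toughness, J/m^2 (kJ/m^2 range, as in the text)
t = np.array([10e-6, 20e-6, 50e-6, 100e-6, 200e-6])  # chip thickness, m
F_per_width = k * t + R

# A linear fit recovers the toughness from the positive intercept.
slope, intercept = np.polyfit(t, F_per_width, 1)

# Neglecting the intercept makes the apparent specific cutting energy blow
# up at small t -- the so-called size effect described above.
apparent = F_per_width / t
```

The fitted intercept equals R by construction here; the point is that real force-versus-thickness data with a positive intercept admit the same decomposition, so the intercept need not be dismissed as experimental error.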
Abstract:
The implications of whether new surfaces in cutting are formed just by plastic flow past the tool or by some fracture-like separation process involving significant surface work are discussed. Oblique metal cutting is investigated using the ideas contained in a new algebraic model for the orthogonal machining of metals (Atkins, A. G., 2003, "Modeling Metalcutting Using Modern Ductile Fracture Mechanics: Quantitative Explanations for Some Longstanding Problems," Int. J. Mech. Sci., 45, pp. 373–396) in which significant surface work (ductile fracture toughness) is incorporated. The model is able to predict explicit material-dependent primary shear plane angles and provides explanations for a variety of well-known effects in cutting, such as the reduction of the primary shear plane angle φ at small uncut chip thicknesses; the quasilinear plots of cutting force versus depth of cut; the existence of a positive force intercept in such plots; why, in the size-effect regime of machining, anomalously high values of yield stress are determined; and why finite element method simulations of cutting have to employ a "separation criterion" at the tool tip. Predictions from the new analysis for oblique cutting (including an investigation of Stabler's rule for the relation between the chip flow velocity angle ηC and the angle of blade inclination i) compare consistently and favorably with experimental results.
Abstract:
Cutting force data for Nylon 66 have been examined in terms of various models of cutting. A theory that includes significant work of separation at the tool tip was found to give the best correlation with experimental data over a wide range of rake angles for the derived primary shear plane angle. A fracture toughness parameter was used as the measure of the specific work of separation. The variation in toughness with rake angle determined from cutting is postulated to be caused by mixed-mode separation at the tool tip. A rule of mixtures using independently determined values of toughness in tension (mode I) and shear (mode II) is found to describe the variation with rake angle well. The ratio of modes varies with rake angle and, in turn, with the primary shear plane angle. Previous suggestions that cutting is a means of experimentally determining fracture toughness are now extended to identify the mode of fracture toughness as well.
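The rule of mixtures mentioned above amounts to a linear interpolation between the two pure-mode toughnesses, weighted by the mode I share of the separation. A minimal sketch, with toughness values that are assumptions for illustration rather than measured Nylon 66 data:

```python
def mixed_mode_toughness(mode1_fraction, r_tension, r_shear):
    """Rule-of-mixtures estimate of the specific work of separation.

    mode1_fraction is the tensile (mode I) share of the separation at the
    tool tip, which shifts with rake angle; r_tension and r_shear are
    independently measured mode I and mode II toughnesses.
    """
    if not 0.0 <= mode1_fraction <= 1.0:
        raise ValueError("mode1_fraction must lie in [0, 1]")
    return mode1_fraction * r_tension + (1.0 - mode1_fraction) * r_shear

# Illustrative (assumed) toughnesses in kJ/m^2; sweeping the mode mix shows
# how the effective toughness inferred from cutting could vary with rake angle.
R1, R2 = 3.0, 8.0
for f in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f, mixed_mode_toughness(f, R1, R2))
```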
Abstract:
The tides of globalization and the unsteady surges and distortions in the evolution of the European Union are causing identities and cultures to be in a state of flux. Education is used by politicians as a major lever for political and social change through micro-management, but it is a crude tool. There can, however, be opportunities within educational experience for individual learners to gain strong, reflexive, multiple identities and multiple citizenship through the engagement of their creative energies. It has been argued that the twenty-first century needs a new kind of creativity characterized by unselfishness, caring and compassion—still involving monetary wealth, but resulting in a healthy planet and healthy people. Creativity and its economically derived relation, innovation, have become 'buzz words' of our times. They are often misconstrued, misunderstood and plainly misused within educational conversations. The small-scale pan-European research study upon which this article is founded discovered that more emphasis needs to be placed on creative leadership, empowering teachers and learners, reducing pupils' fear of school, balancing teaching approaches, and ensuring that the curriculum and assessment are responsive to the needs of individual learners. These factors are key to building strong educational provision that harnesses the creative potential of learners, teachers and other stakeholders, values what it is to be human and creates a foundation upon which to build strong, morally based, consistent, participative democracies.
Abstract:
In this paper we present the novel concepts incorporated in a planetary surface exploration rover design that is currently under development. The Multitasking Rover (MTR) aims to demonstrate functionality that covers many current and future needs, such as rough-terrain mobility, modularity and upgradeability. The rover system has enhanced mobility characteristics. It operates in conjunction with Science Packs (SPs) and Tool Packs (TPs): modules attached to the main frame of the rover, which are either special tools or science instruments and alter the operational capabilities of the system.
Abstract:
Two-dimensional flood inundation modelling is a widely used tool to aid flood risk management. In urban areas, the model spatial resolution required to represent flows through a typical street network often results in an impractical computational cost at the city scale. This paper presents the calibration and evaluation of a recently developed formulation of the LISFLOOD-FP model, which is more computationally efficient at these resolutions. Aerial photography was available for model evaluation on three days from the 24th to the 31st of July. The new formulation was benchmarked against the original version of the model at 20 and 40 m resolutions, demonstrating equally accurate simulation given the evaluation data but with a 67 times faster computation time. The July event was then simulated at the 2 m resolution of the available airborne LiDAR DEM. This resulted in a more accurate simulation of the floodplain drying dynamics compared with the coarse resolution models, although maximum inundation levels were simulated equally well at all resolutions tested.
Abstract:
The Earth's climate is undoubtedly changing; however, the time scale, consequences and causal attribution remain the subject of significant debate and uncertainty. Detection of subtle indicators from a background of natural variability requires measurements over a time base of decades. This places severe demands on the instrumentation used, requiring measurements of sufficient accuracy and sensitivity that can allow reliable judgements to be made decades apart. The International System of Units (SI) and the network of National Metrology Institutes were developed to address such requirements. However, ensuring and maintaining SI traceability of sufficient accuracy in instruments orbiting the Earth presents a significant new challenge to the metrology community. This paper highlights some key measurands and applications driving the uncertainty demand of the climate community in the solar reflective domain, e.g. solar irradiances and reflectances/radiances of the Earth. It discusses how meeting these uncertainties facilitates significant improvement in the forecasting abilities of climate models. After discussing the current state of the art, it describes a new satellite mission, called TRUTHS, which enables, for the first time, high-accuracy SI traceability to be established in orbit. The direct use of a 'primary standard' and replication of the terrestrial traceability chain extends the SI into space, in effect realizing a 'metrology laboratory in space'.
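The link between instrument uncertainty and decadal trend detection can be made concrete with a deliberately crude back-of-envelope model: a trend only becomes detectable once the accumulated change exceeds a few standard uncertainties of the measurement. The numbers below are illustrative assumptions, not TRUTHS mission specifications:

```python
def years_to_detect(trend_per_decade, uncertainty, k=2.0):
    """Years before a linear trend emerges from measurement uncertainty.

    Crude illustration: detection is declared once the accumulated change
    exceeds k standard uncertainties, assuming the calibration stays
    stable (i.e. SI-traceable) over the whole record.
    """
    return k * uncertainty / (trend_per_decade / 10.0)

# A 0.3 %/decade reflectance trend (assumed) against two calibration
# uncertainty levels: a tighter, traceable budget versus a looser one.
print(years_to_detect(0.3, 0.3))  # tighter uncertainty: shorter record needed
print(years_to_detect(0.3, 1.0))  # looser uncertainty: much longer record
```

This is why reducing on-orbit uncertainty, rather than simply flying longer, is presented as the route to earlier, more reliable climate judgements.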
Abstract:
The Virtual Lightbox for Museums and Archives (VLMA) is a tool for collecting and reusing, in a structured fashion, the online contents of museums and archive datasets. It is not restricted to datasets with visual components, although VLMA includes a lightbox service that enables comparison and manipulation of visual information. With VLMA, one can browse and search collections, construct personal collections, annotate them, export these collections to XML or Impress (Open Office) presentation format, and share collections with other VLMA users. VLMA was piloted as an e-Learning tool as part of JISC's e-Learning focus in its first phase (2004-2005), and in its second phase (2005-2006) it incorporated new partner collections while improving and expanding interfaces and services. This paper concerns its development as a research and teaching tool, especially for teachers using museum collections, and discusses the recent development of VLMA.
Abstract:
Purpose – This paper aims to provide a brief résumé of previous research which has analysed the impact of e-commerce on retail real estate in the UK, and to examine the important marketing role of the internet for shopping centre managers and retail landlords. Design/methodology/approach – Based on the results from a wider study carried out in 2003, the paper uses case studies from two different shopping centres in the UK, and documents the innovative uses of both web-based marketing and online retailing by organisations that historically have not directly been involved in the retailing process. Findings – The paper highlights the importance of considering online sales within a multi-channel approach to retailing. The two types of emerging shopping centre model which are identified are characterised by their ultimate relationship with the physical shopping centre on whose web site they reside. These can be summarised as: the "centre-led" approach, and the "brand-led" or "marketing-led" approach. Research limitations/implications – The research is based on a limited number of in-depth case studies and secondary data. Further research is needed to monitor the continuing impact of e-commerce on retail property and the marketing strategies of shopping centre managers and owners. Practical implications – Internet-based sales provide an important adjunct to conventional retail sales and an important source of potential risk for landlords and tenants in the real estate investment market. Regardless of whether retailers use the internet as a sales channel, as a product-sourcing tool, or merely to provide information to the consumer, the internet has become a keystone within the greater retail marketing mix. The findings have ramifications for understanding the way in which landlords are structuring their retail property to defray potential risks.
Originality/value – The paper examines shopping centre online marketing models for the first time in detail, and will be of value to retail occupiers, owners and other stakeholders of shopping centres.
Abstract:
This Editorial presents the focus, scope and policies of the inaugural issue of Nature Conservation, a new open access, peer-reviewed journal bridging natural sciences, social sciences and hands-on applications in conservation management. The journal covers all aspects of nature conservation and aims particularly at facilitating better interaction between scientists and practitioners. The journal will impose no restrictions on manuscript size or the use of colour. We will use an XML-based editorial workflow and several cutting-edge innovations in publishing and information dissemination. These include semantic mark-up of, and enhancements to, published text and data, and extensive cross-linking within the journal and to external sources. We believe the journal will make an important contribution to better linking science and practice, offering rapid, peer-reviewed and flexible publication for authors and unrestricted access to content.
Abstract:
How effective are multi-stakeholder scenario-building processes at bringing diverse actors together and creating a policy-making tool to support sustainable development and promote food security in the developing world under climate change? The effectiveness of a participatory scenario development process highlights the importance of 'boundary work' that links actors and organizations involved in generating knowledge on the one hand, and practitioners and policymakers who take actions based on that knowledge on the other. This study reports on the application of criteria for effective boundary work to a multi-stakeholder scenarios process in East Africa that brought together a range of regional agriculture and food systems actors. This analysis has enabled us to evaluate the extent to which these scenarios were seen by the different actors as credible, legitimate and salient, and thus more likely to be useful. The analysis has shown gaps and opportunities for improvement against these criteria, such as the quantification of scenarios, attention to translating and communicating the results through various channels, and new approaches to enable a more inclusive and diverse group of participants. We conclude that applying boundary-work criteria to multi-stakeholder scenarios processes can do much to increase the likelihood of developing more appropriate sustainable development and food security policies.
Abstract:
This paper introduces a new agent-based model, which incorporates the actions of individual homeowners in a long-term domestic stock model, and details how it was applied in energy policy analysis. The results indicate that current policies are likely to fall significantly short of the 80% target and suggest that current subsidy levels need re-examining. In the model, current subsidy levels appear to offer too much support to some technologies, which in turn leads to the suppression of other technologies that have a greater energy saving potential. The model can be used by policy makers to develop further scenarios to find alternative, more effective, sets of policy measures. The model is currently limited to the owner-occupied stock in England, although it can be expanded, subject to the availability of data.
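The subsidy-suppression mechanism described above can be sketched as a minimal agent-based model: homeowners with heterogeneous budgets each adopt the affordable technology with the best subsidised payback, and a generous subsidy on a lower-saving option crowds out a higher-saving one. All technology names, costs, savings and subsidy levels below are illustrative assumptions, not calibrated to the English housing stock or to the paper's model:

```python
import random

TECH = {
    # name: (annual energy saving in MWh, upfront cost, subsidy) -- assumed
    "solar_pv":   (2.0, 6000.0, 3000.0),  # heavily subsidised
    "insulation": (4.0, 7000.0,    0.0),  # saves more energy, no subsidy
}

def choose_technology(budget, payback_horizon=15.0, energy_price=150.0):
    """Each homeowner adopts the affordable option with the best payback."""
    best, best_payback = None, float("inf")
    for name, (saving, cost, subsidy) in TECH.items():
        net_cost = cost - subsidy
        if net_cost > budget:
            continue  # cannot afford the upfront net cost
        payback = net_cost / (saving * energy_price)
        if payback <= payback_horizon and payback < best_payback:
            best, best_payback = name, payback
    return best

random.seed(1)
adoptions = {}
for _ in range(1000):
    budget = random.uniform(1000.0, 8000.0)
    tech = choose_technology(budget)
    if tech is not None:
        adoptions[tech] = adoptions.get(tech, 0) + 1
print(adoptions)  # subsidy steers adoption to the lower-saving technology
```

In this toy run the subsidised technology dominates even though the unsubsidised one saves twice the energy, which is the qualitative effect the abstract attributes to current subsidy levels.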