19 results for shape vs use
in CentAUR: Central Archive University of Reading - UK
Abstract:
This paper reviews late Roman 'nail-cleaner strap-ends', a group of objects first discussed by Hawkes and Dunning (1961). The precise function of these objects is unclear: their shape suggests use as toilet instruments, but the split socket suggests that they were part of belt-fittings. We suggest a detailed typology and discuss the dating evidence and the spatial distribution of the type. Regardless of their precise function, it is argued in this paper that nail-cleaner strap-ends of this type are unique to late Roman Britain and thus represent a distinct regional type. The use of nail-cleaner strap-ends can be viewed in the context of gender associations, military status and religious beliefs.
Abstract:
Information technology in construction (ITC) has been gaining wide acceptance and is being implemented in the construction research domains as a tool to assist decision makers. Most of the research into visualization technologies (VT) has been on the wide range of 3D and simulation applications suitable for construction processes. Despite developments in interoperability and the standardization of products, VT usage has remained very low when it comes to communicating and addressing the needs of building end-users (BEU). This paper argues that building end-users are a source of experience and expertise that can be brought into the briefing stage for the evaluation of design proposals. It also suggests that the end-user is a source of new ideas promoting innovation. In this research a positivistic methodology that includes the comparison of 3D models and traditional 2D methods is proposed. It will help to identify "how much", if anything, a non-spatial specialist can gain in terms of "understanding" of a particular design proposal presented using both methods.
Abstract:
University students suffer from variable sleep patterns, including insomnia;[1] furthermore, the highest incidence of herbal use appears to be among college graduates.[2] Our objective was to test the perception of safety and value of herbal versus conventional medicine for the treatment of insomnia in a non-pharmacy student population. We used an experimental design and bespoke vignettes that relayed the same effectiveness information to test our hypothesis that students would give higher ratings of safety and value to the herbal product compared with the conventional medicine. We also tested the hypothesis that the addition of side-effect information would lower people's perception of the safety and value of the herbal product to a greater extent than that of the conventional medicine.
Abstract:
Lava domes comprise core, carapace, and clastic talus components. They can grow endogenously by inflation of a core and/or exogenously by the extrusion of shear-bounded lobes and whaleback lobes at the surface. Internal structure is paramount in determining the extent to which lava dome growth evolves stably, or conversely the propensity for collapse. The more core lava that exists within a dome, in both relative and absolute terms, the more explosive energy is available, both for large pyroclastic flows following collapse and in particular for lateral blast events following very rapid removal of lateral support to the dome. Knowledge of the location of the core lava within the dome is also relevant for hazard assessment purposes. A spreading toe, or lobe of core lava, over a talus substrate may be both relatively unstable and likely to accelerate to more violent activity during the early phases of a retrogressive collapse. Soufrière Hills Volcano, Montserrat has been erupting since 1995 and has produced numerous lava domes that have undergone repeated collapse events. We consider one continuous dome growth period, from August 2005 to May 2006, that resulted in a dome collapse event on 20 May 2006. The collapse lasted 3 h, removing the whole dome plus remnants from a previous growth period in an unusually violent and rapid event. We use an axisymmetric computational Finite Element Method model for the growth and evolution of a lava dome. Our model comprises evolving core, carapace and talus components based on axisymmetric endogenous dome growth, which permits us to model the interface between talus and core. Despite explicitly modelling only axisymmetric endogenous dome growth, our core–talus model simulates many of the observed growth characteristics of the 2005–2006 SHV lava dome well. Further, our simulations can replicate large-scale exogenous characteristics when a considerable volume of talus has accumulated around the lower flanks of the dome. Model results suggest that dome core can override talus within a growing dome, potentially generating a region of significant weakness and a potential locus for collapse initiation.
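As a rough illustration of the core–talus partitioning described in this abstract (not the authors' Finite Element Method model), the sketch below tracks how extruded lava might split between hot core and clastic talus over a growth period; the extrusion rate, shedding fraction and duration are hypothetical values chosen only to show the bookkeeping.

```python
# Toy volume-balance sketch of endogenous dome growth (illustrative only;
# the paper uses an axisymmetric Finite Element Method model).
# All parameter values below are assumptions, not data from the paper.

DAYS = 270                 # roughly the August 2005 - May 2006 growth period
EXTRUSION_RATE = 2.0e5     # m^3/day of fresh lava reaching the dome (assumed)
SHED_FRACTION = 0.4        # fraction of new lava degrading to talus (assumed)

core_volume = 0.0
talus_volume = 0.0

for day in range(DAYS):
    fresh = EXTRUSION_RATE
    # Fresh lava either remains as hot core or is shed as clastic talus.
    core_volume += (1.0 - SHED_FRACTION) * fresh
    talus_volume += SHED_FRACTION * fresh

total = core_volume + talus_volume
print(f"core fraction after {DAYS} days: {core_volume / total:.2f}")
print(f"total dome volume: {total:.3e} m^3")
```

The larger the core fraction that such a balance produces, the more explosive energy is notionally available on collapse, which is the hazard argument the abstract makes.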
Abstract:
The tridentate Schiff base ligand, 7-amino-4-methyl-5-aza-3-hepten-2-one (HAMAH), prepared by the mono-condensation of 1,2-diaminoethane and acetylacetone, reacts with Cu(BF4)2·6H2O to produce initially a dinuclear Cu(II) complex, [{Cu(AMAH)}2(μ-4,4'-bipy)](BF4)2 (1), which undergoes hydrolysis in the reaction mixture and finally produces a linear polymeric chain compound, [Cu(acac)2(μ-4,4'-bipy)]n (2). The geometry around the copper atom in compound 1 is distorted square planar, while that in compound 2 is essentially an elongated octahedron. On the other hand, the ligand HAMAH reacts with Cu(ClO4)2·6H2O to yield a polymeric zigzag chain, [{Cu(acac)(CH3OH)(μ-4,4'-bipy)}(ClO4)]n (3). The geometry of the copper atom in 3 is square pyramidal with the two bipyridine molecules in the cis equatorial positions. All three complexes have been characterized by elemental analysis, IR and UV-Vis spectroscopy and single-crystal X-ray diffraction studies. A probable explanation for the different size and shape of the reported polynuclear complexes formed by copper(II) and 4,4'-bipyridine has been put forward by taking into account the denticity and crystal field strength of the blocking ligand as well as the Jahn-Teller effect in copper(II).
Abstract:
Aims and objectives. To examine the impact of written and verbal education on bed-making practices, in an attempt to reduce the prevalence of pressure ulcers. Background. The Department of Health has set targets for a 5% reduction per annum in the incidence of pressure ulcers. Electric profiling beds with a visco-elastic polymer mattress are a recent innovation in pressure ulcer prevention; however, mattress efficacy is reduced by tightly tucking sheets around the mattress. Design. A prospective randomized pre/post-test experimental design. Methods. Ward managers at a teaching hospital were approached to participate in the study. Two researchers independently examined the tightness of the sheets around the mattresses. Wards were randomized to one of two groups. Groups A and B both received written education; in addition, group B received verbal education on alternate days for one week. Beds were re-examined one month later. One researcher was blinded to the educational delivery received by the wards. Results. Twelve wards agreed to participate in the study and 245 beds were examined. Before education, 113 beds (46%) had sheets tucked correctly around the mattresses. Following education, this increased to 215 beds (87.8%) (χ² = 68.03, P < 0.001). There was no significant difference in the number of correctly made beds between the two education groups: 100 (87.72%) beds correctly made in group A vs. 115 (87.79%) beds in group B (χ² = 0, P = 0.987). Conclusions. Clear, concise written instruction improved practice, but verbal education was not additionally beneficial. Relevance to clinical practice. Nurses are receptive to clear, concise written evidence regarding pressure ulcer prevention and incorporate this into clinical practice.
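The before/after comparison above can be illustrated with a standard chi-square test on a 2×2 contingency table. The sketch below assumes a simple before-versus-after layout using the counts quoted in the abstract (113/245 vs 215/245 correctly made beds); the paper's own statistic may have been computed with a different stratification or continuity correction, so the value need not match 68.03 exactly.

```python
from scipy.stats import chi2_contingency

# Rows: before education, after education; columns: correct, incorrect.
# Counts are taken from the abstract (113/245 correct before, 215/245 after).
table = [[113, 245 - 113],
         [215, 245 - 215]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3g}, dof = {dof}")
```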
Abstract:
In the emerging digital economy, the management of information in aerospace and construction organisations faces a particular challenge due to the ever-increasing volume of information and the extensive use of information and communication technologies (ICTs). This paper addresses the problems of information overload and the value of information in both industries by providing some cross-disciplinary insights. In particular, it identifies major issues and challenges in current information evaluation practice in these two industries. Interviews were conducted to obtain a spectrum of industrial perspectives (director/strategic, project management and ICT/document management) on these issues, in particular on information storage and retrieval strategies and the contrasting knowledge and information management approaches of personalisation and codification. Industry feedback was collected through a follow-up workshop to strengthen the findings of the research. An information-handling agenda is outlined for the development of a future Information Evaluation Methodology (IEM) which could facilitate the codification of high-value information in order to support through-life knowledge and information management (K&IM) practice.
Abstract:
Multiparous rumen-fistulated Holstein cows were fed, from d 1 to 28 post-calving, an ad libitum TMR containing (g/kg DM) grass silage (196), corn silage (196), wheat (277), soybean meal (100), and other feeds (231) with CP, NDF, starch and water soluble carbohydrate concentrations of 176, 260, 299 and 39 g/kg DM respectively and ME of 12.2 MJ/kg DM. Treatments consisting of a minimum of 10^10 cfu Megasphaera elsdenii NCIMB 41125 in 250 ml solution (MEGA) or 250 ml of autoclaved M. elsdenii (CONT) were administered via the rumen cannula on d 3 and 12 of lactation (n=7 per treatment). Mid-rumen pH was measured every 15 minutes and eating and ruminating behavior was recorded for 24 h on d 2, 4, 6, 8, 11, 13, 15, 17, 22 and 28. Rumen fluid for VFA and lactic acid (LA) analysis was collected at 11 timepoints on each of d 2, 4, 6, 13 and 15. Data were analysed as repeated measures using the Glimmix (LA data) or Mixed (all other data) procedures of SAS, with previous 305 d milk yield and d 2 measurements as covariates where appropriate. Milk yield was higher (CONT 43.0 vs MEGA 45.4 ±0.75 kg/d, P=0.051) and fat concentration was lower (CONT 45.6 vs MEGA 40.4 ±1.05 g/kg, P=0.005) in cows that received MEGA. Time spent eating (263 ±15 min/d) and ruminating (571 ±13 min/d), DM intake (18.4 ±0.74 kg/d), time per 24 h period with rumen pH below 5.6 (3.69 ±0.94 h) and LA concentrations (2.00 mM) were similar (P>0.327) across treatments. Ruminal total VFA concentration (104 ±3 mM) was similar (P=0.404) across treatments, but a shift from acetate (CONT 551 vs MEGA 524 ±14 mmol/mol VFA, P=0.161) to propionate production (CONT 249 vs MEGA 275 ±11 mmol/mol VFA, P=0.099) meant that the acetate:propionate ratio (CONT 2.33 vs MEGA 1.94 ±0.15) was reduced (P=0.072) in cows that received MEGA. This study provides evidence that supplementation of early lactation dairy cows with MEGA alters rumen fermentation patterns in favour of propionate, with potential benefits for animal health and productivity.
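The repeated-measures analysis described above was run in SAS (Glimmix/Mixed procedures). For orientation, a minimal Python analogue using a mixed-effects model is sketched below; the data file, column names (cow, day, treatment, milk_yield, prev_305d_yield) and the random-intercept structure are assumptions and are simpler than the models reported in the abstract.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per cow per measurement day, with columns
#   cow, day, treatment ("CONT"/"MEGA"), milk_yield, prev_305d_yield
df = pd.read_csv("rumen_trial.csv")  # hypothetical file name

# Repeated measures approximated by a random intercept per cow;
# previous 305-d milk yield enters as a covariate, as in the abstract.
model = smf.mixedlm(
    "milk_yield ~ treatment + day + prev_305d_yield",
    data=df,
    groups=df["cow"],
)
result = model.fit()
print(result.summary())
```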
Abstract:
This paper will present a conceptual framework for the examination of land redevelopment based on a complex systems/networks approach. As Alvin Toffler insightfully noted, modern scientific enquiry has become exceptionally good at splitting problems into pieces but has forgotten how to put the pieces back together. Twenty-five years after his remarks, governments and corporations faced with the requirements of sustainability are struggling to promote an ‘integrated’ or ‘holistic’ approach to tackling problems. Despite the talk, both practice and research provide few platforms that allow for ‘joined up’ thinking and action. With socio-economic phenomena, such as land redevelopment, promising prospects open up when we assume that their constituents can make up complex systems whose emergent properties are more than the sum of the parts and whose behaviour is inherently difficult to predict. A review of previous research shows that it has mainly focused on idealised, ‘mechanical’ views of property development processes that fail to recognise in full the relationships between actors, the structures created and their emergent qualities. When reality failed to live up to the expectations of these theoretical constructs, somebody had to be blamed for it: planners, developers, politicians. However, from a ‘synthetic’ point of view the agents and networks involved in property development can be seen as constituents of structures that perform complex processes. These structures interact, forming new, more complex structures and networks. Redevelopment can then be conceptualised as a process of transformation: a complex system, a ‘dissipative’ structure involving developers, planners, landowners, state agencies etc., unlocks the potential of previously used sites, transforms space towards a higher order of complexity and ‘consumes’ but also ‘creates’ different forms of capital in the process. Analysis of network relations points toward the ‘dualism’ of structure and agency in these processes of system transformation and change. Insights from actor network theory can be conjoined with notions of complexity and chaos to build an understanding of the ways in which actors actively seek to shape these structures and systems, whilst at the same time being recursively shaped by them in their strategies and actions. This approach transcends the blame game and allows for inter-disciplinary inputs to be placed within a broader explanatory framework that does away with many past dichotomies. Better understanding of the interactions between actors and the emergent qualities of the networks they form can improve our comprehension of the complex socio-spatial phenomena that redevelopment comprises. The insights that this framework provides when applied to UK institutional investment in redevelopment are considered to be significant.
Abstract:
In an essay from 1910, the architect and critic Adolf Loos distinguishes between buildings that are for everyday practical use and buildings made for contemplation. The latter type, he asserts, may be considered as both architecture and works of art. He refers to only two types of contemplative architecture, namely the tomb and the monument. Certain paintings made in the early part of the twentieth century do not observe this separation, such as particular works by Hopper and de Chirico. Here the commonplace is simultaneously experienced in the way a tomb might be. This mortifying gaze condemns building by inducing a sense that space has become inhospitable and alienating. It could be argued that these and other paintings made around this time, such as Carlo Carrà's The Abandoned House (1916), are like premonitions of what will occur when building observes the prescription laid down by Loos and omits an aesthetic dimension. However, it might also suggest that buildings need their tombs, or at least some space that is not completely assimilable by the daily, practical and functional needs of an inhabitant.
Abstract:
There are approximately 7000 languages spoken in the world today. This diversity reflects the legacy of thousands of years of cultural evolution. How far back we can trace this history depends largely on the rate at which the different components of language evolve. Rates of lexical evolution are widely thought to impose an upper limit of 6000-10,000 years on reliably identifying language relationships. In contrast, it has been argued that certain structural elements of language are much more stable. Just as biologists use highly conserved genes to uncover the deepest branches in the tree of life, highly stable linguistic features hold the promise of identifying deep relationships between the world's languages. Here, we present the first global network of languages based on this typological information. We evaluate the relative evolutionary rates of both typological and lexical features in the Austronesian and Indo-European language families. The first indications are that typological features evolve at similar rates to basic vocabulary but their evolution is substantially less tree-like. Our results suggest that, while rates of vocabulary change are correlated between the two language families, the rates of evolution of typological features and structural subtypes show no consistent relationship across families.
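To illustrate how a network of languages (rather than a strict tree) can be built from typological information of the kind described above, the sketch below computes pairwise Hamming distances between binary feature vectors and links languages whose distance falls below a threshold. The feature matrix, language names and threshold are invented for illustration and are not the paper's data or method.

```python
import numpy as np
import networkx as nx
from scipy.spatial.distance import pdist, squareform

# Hypothetical binary typological features (rows = languages, cols = features).
languages = ["LangA", "LangB", "LangC", "LangD"]
features = np.array([
    [1, 0, 1, 1, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 0, 1],
    [0, 1, 1, 0, 1],
])

# Pairwise Hamming distance = fraction of typological features that differ.
dist = squareform(pdist(features, metric="hamming"))

# Link languages whose typological distance is below an (assumed) threshold.
G = nx.Graph()
G.add_nodes_from(languages)
THRESHOLD = 0.5
for i in range(len(languages)):
    for j in range(i + 1, len(languages)):
        if dist[i, j] < THRESHOLD:
            G.add_edge(languages[i], languages[j], weight=dist[i, j])

print(sorted(G.edges(data=True)))
```

Because several languages can sit at similar distances from one another, the resulting graph can contain cycles, which is one way the "less tree-like" evolution of typological features shows up.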
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers, author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method's authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters News Corpus. The findings show that authors of Reuters news articles provide good keyphrases, but that more often than not they do not provide any keyphrases.
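As a pointer to what the two best-performing scores look like in practice, the sketch below ranks candidate keyphrases by raw term frequency and by TF-IDF over a toy corpus. The documents and candidate phrases are invented, and the paper's exact implementations (including the C-Value, NC-Value and Synonym methods) are not reproduced here.

```python
import math
from collections import Counter

# Toy corpus: each document is a list of already-extracted candidate phrases.
docs = [
    ["keyphrase extraction", "term frequency", "news corpus"],
    ["term frequency", "inverse document frequency", "term frequency"],
    ["synonym method", "news corpus", "keyphrase extraction"],
]

def tf_scores(doc):
    """Raw term frequency of each candidate phrase within one document."""
    return Counter(doc)

def tfidf_scores(doc, corpus):
    """TF-IDF: rewards phrases frequent in this document but rare elsewhere."""
    tf = Counter(doc)
    n_docs = len(corpus)
    scores = {}
    for phrase, freq in tf.items():
        doc_freq = sum(1 for d in corpus if phrase in d)
        scores[phrase] = freq * math.log(n_docs / doc_freq)
    return scores

target = docs[1]
print("TF     :", tf_scores(target).most_common())
print("TF-IDF :", sorted(tfidf_scores(target, docs).items(),
                         key=lambda kv: kv[1], reverse=True))
```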
Abstract:
The 3D shape of an object and its 3D location have traditionally been thought of as very separate entities, although both can be described within a single 3D coordinate frame. Here, 3D shape and location are considered as two aspects of a view-based approach to representing depth, avoiding the use of 3D coordinate frames.
Abstract:
Background. Current models of concomitant, intermittent strabismus, heterophoria, convergence and accommodation anomalies are either theoretically complex or incomplete. We propose an alternative and more practical way to conceptualize clinical patterns. Methods. In each of three hypothetical scenarios (normal; high AC/A and low CA/C ratios; low AC/A and high CA/C ratios) there can be a disparity-biased or blur-biased "style", despite identical ratios. We calculated a disparity bias index (DBI) to reflect these biases. We suggest how clinical patterns fit these scenarios and provide early objective data from small illustrative clinical groups. Results. Normal adults and children showed disparity bias (adult DBI 0.43 (95% CI 0.50-0.36), child DBI 0.20 (95% CI 0.31-0.07)) (p=0.001). Accommodative esotropes showed less disparity bias (DBI 0.03). In the high AC/A and low CA/C scenario, early presbyopes had a mean DBI of 0.17 (95% CI 0.28-0.06), compared with a DBI of -0.31 in convergence excess esotropes. In the low AC/A and high CA/C scenario, near exotropes had a mean DBI of 0.27, while we predict that non-strabismic, non-amblyopic hyperopes with good vision without spectacles will show lower DBIs. Disparity bias ranged between 1.25 and -1.67. Conclusions. Establishing disparity or blur bias, together with knowing whether convergence to target demand exceeds accommodation or vice versa, explains clinical patterns more effectively than AC/A and CA/C ratios alone. Excessive bias, or inflexibility in near-cue use, increases the risk of clinical problems. We suggest clinicians look carefully at details of the accommodation and convergence changes induced by lenses, dissociation and prisms, and use these to plan treatment in relation to the model.
Abstract:
Global controls on month-by-month fractional burnt area (2000–2005) were investigated by fitting a generalised linear model (GLM) to Global Fire Emissions Database (GFED) data, with 11 predictor variables representing vegetation, climate, land use and potential ignition sources. Burnt area is shown to increase with annual net primary production (NPP), number of dry days, maximum temperature, grazing-land area, grass/shrub cover and diurnal temperature range, and to decrease with soil moisture, cropland area and population density. Lightning showed an apparent (weak) negative influence, but this disappeared when pure seasonal-cycle effects were taken into account. The model predicts observed geographic and seasonal patterns, as well as the emergent relationships seen when burnt area is plotted against each variable separately. Unimodal relationships with mean annual temperature and precipitation, population density and gross domestic product (GDP) are reproduced too, and are thus shown to be secondary consequences of correlations between different controls (e.g. high NPP with high precipitation; low NPP with low population density and GDP). These findings have major implications for the design of global fire models, as several assumptions in current models – most notably, the widely assumed dependence of fire frequency on ignition rates – are evidently incorrect.
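A minimal sketch of the kind of GLM fit described above is given below using statsmodels. The column names, the data file, and the family choice (binomial with a logit link applied to fractional burnt area) are assumptions for illustration, since the abstract does not specify the exact model family or data layout used with the GFED data.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical monthly grid-cell table with fractional burnt area and predictors.
df = pd.read_csv("gfed_monthly_cells.csv")  # assumed file and columns

# Binomial family with a logit link on fractional burnt area; the paper's
# exact family/link is not stated in the abstract, so this is an assumption.
model = smf.glm(
    "burnt_frac ~ npp + dry_days + max_temp + grazing_area + grass_shrub_cover"
    " + diurnal_temp_range + soil_moisture + cropland_area + pop_density",
    data=df,
    family=sm.families.Binomial(),
)
result = model.fit()
print(result.summary())
```

Inspecting the fitted coefficients (sign and magnitude) is what supports statements such as burnt area increasing with NPP and dry days while decreasing with soil moisture, cropland area and population density.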