Abstract:
Bodies of Ding kiln white porcelains, together with their imitations from the Guantai and Jiexiu kilns of the Chinese Song dynasty (960-1279 AD), were analysed for 40 trace elements by inductively coupled plasma mass spectrometry (ICP-MS). Numerous trace element compositions and ratios allow these visually similar products to be distinguished, and a Ding-style shard of uncertain origin is identified as a likely genuine Ding product. At the Jiexiu kiln, Ding-style products have trace element features that distinguish them from the inferior-quality blackwares intended for the lower-end market. Based on the geochemical behaviour of these trace elements, we propose that geochemically distinctive raw materials were used for the higher-quality Ding-style products, and that these materials possibly also underwent purification by levigation prior to use. Capable of analysing over 40 elements with a typical long-term precision of < 2%, this high-precision ICP-MS method proves very powerful for grouping and characterising archaeological ceramics. Combined with geochemical interpretation, it can provide insights into the raw materials and techniques used by ancient potters.
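As an illustration of the grouping step this abstract describes, the sketch below clusters shards by their trace element compositions. The element names, concentration values and two-group structure are hypothetical placeholders, not data from the study; standardising each element first is one common choice so that no single element dominates the distance metric.

```python
# A minimal sketch of grouping ceramics by trace element composition.
# All values and labels are illustrative assumptions, not study data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Rows: shards; columns: trace element concentrations (ppm), e.g. Rb, Sr, Zr, Th.
shards = np.array([
    [120.0,  85.0, 210.0, 14.0],   # Ding-like profile (hypothetical)
    [118.0,  88.0, 205.0, 13.5],
    [ 60.0, 150.0, 140.0,  6.0],   # blackware-like profile (hypothetical)
    [ 62.0, 148.0, 143.0,  6.2],
])

# Standardise each element, then cluster with Ward linkage and cut into two groups.
z = (shards - shards.mean(axis=0)) / shards.std(axis=0)
groups = fcluster(linkage(z, method="ward"), t=2, criterion="maxclust")
print(groups)  # shards with similar trace element profiles share a label
```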
Abstract:
Adaptive management is the pathway to effective conservation, use and management of Australia's coastal catchments and waterways. While the concepts of adaptive management are not new, applications involving both assessment and management responses remain limited at the national and regional scales. This paper outlines the components of a systematic framework for linking scientific knowledge, existing tools, planning approaches and participatory processes to achieve healthy regional partnerships between community, industry, government agencies and science providers, and thereby overcome institutional barriers and uncoordinated monitoring. The framework developed by the Coastal CRC (www.coastal.crc.org.au/amf/amf_index.htm) is hierarchical in the way it displays information, allowing associated frameworks to be integrated, and it provides a construct in which processes, information, decision tools and outcomes are brought together in a structured and transparent way for adaptive catchment and coastal management. This paper proposes how an adaptive management approach could be used to benefit the implementation of the Reef Water Quality Protection Plan (RWQPP).
Abstract:
The outreach social work service is one of the dominant youth work approaches to dealing with delinquents and 'at-risk' youths in Hong Kong. Yet this approach presents particular challenges. Outreach social workers usually play an active role in initiating and establishing contact with young people, yet young people are often reluctant to engage with them and resistant to therapeutic change. To date, little is known about which strategies and techniques are most effective in dealing with client resistance in this context. The aims of this paper are to gain a better understanding of the common resistant behaviours that outreach social workers encounter in their day-to-day practice, and to investigate how they respond to clients' resistance, with reference to case examples given in in-depth interviews. The findings of this study provide evidence that, whilst client resistance is common in the outreach social work setting, social workers' patience and sensitivity are essential in resolving resistance and building rapport with clients.
Abstract:
Machine learning techniques have been recognized as powerful tools for learning from data. One of the most popular learning techniques, the Back-Propagation (BP) Artificial Neural Network, can be used as a computational model to predict peptides binding to the Human Leukocyte Antigens (HLA). The major advantage of computational screening is that it reduces the number of wet-lab experiments that need to be performed, significantly reducing cost and time. A recently developed method, the Extreme Learning Machine (ELM), which has properties superior to those of BP, has been investigated for such tasks. In our work, we found that the ELM is as good as, if not better than, BP in terms of time complexity, accuracy deviations across experiments and, most importantly, prevention of over-fitting when predicting peptide binding to HLA.
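To make the comparison concrete, here is a minimal sketch of the ELM idea, not the authors' implementation: hidden-layer weights are drawn at random and never trained, and only the output weights are fitted, in a single least-squares solve. The toy data stands in for encoded peptide features and is not from the study.

```python
import numpy as np

def elm_train(X, y, n_hidden=100, seed=0):
    """Fit a basic Extreme Learning Machine: random fixed hidden layer,
    output weights solved by least squares (pseudoinverse)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (untrained)
    b = rng.normal(size=n_hidden)                  # random biases (untrained)
    H = np.tanh(X @ W + b)                         # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # one-shot output-weight solve
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage; real inputs would be numerically encoded peptides (an assumption here).
X = np.random.default_rng(1).normal(size=(200, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
W, b, beta = elm_train(X, y)
print("training accuracy:", ((elm_predict(X, W, b, beta) > 0.5) == y).mean())
```

Because training reduces to one linear solve instead of iterative gradient descent, this construction is where ELM's favourable time complexity relative to BP comes from.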
Abstract:
Processor emulators are software tools that allow legacy computer programs to be executed on a modern processor. In the past, emulators have been used in relatively trivial applications such as the maintenance of video games. Now, however, processor emulation is being applied to safety-critical control systems, including military avionics. These applications demand the utmost guarantees of correctness, but no verification techniques exist for proving that an emulated system preserves the original system's functional and timing properties. Here we show how this can be done by combining concepts previously used for reasoning about real-time program compilation with an understanding of the new and old software architectures. In particular, we show how both the old and new systems can be given a common semantics, allowing their behaviours to be compared directly.
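A toy illustration of the 'common semantics' idea, under the assumption that both systems can be modelled as step functions over a shared state; the step functions below are hypothetical stand-ins, not the paper's formal semantics, and real verification would compare timing behaviour as well.

```python
# Model both systems in one semantic frame: a step function mapping
# (state, input) -> (new state, observable output). Equal observable traces
# on the same inputs is the comparison the common semantics enables.
def trace(step, state, inputs):
    out = []
    for x in inputs:
        state, observable = step(state, x)
        out.append(observable)
    return out

def legacy_step(acc, x):      # behaviour of the original program (stand-in)
    return acc + x, acc + x

def emulated_step(acc, x):    # behaviour of the emulated program (stand-in)
    return acc + x, acc + x

inputs = [1, 2, 3]
assert trace(legacy_step, 0, inputs) == trace(emulated_step, 0, inputs)
```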
Abstract:
Research in verification and validation (V&V) for concurrent programs can be guided by practitioner information. The survey presented in this paper was therefore run to collect state-of-practice information on V&V technology for concurrency, gathering responses from 35 practitioners. The results can help refine existing V&V technology by providing a better understanding of the context in which it is used, and responses to questions about the motivation for selecting V&V technologies can help refine a systematic approach to V&V technology selection.
Abstract:
The results of empirical studies are limited to particular contexts and difficult to generalise, and the studies themselves are expensive to perform. Despite these problems, empirical studies in software engineering can be made effective, and they are important to both researchers and practitioners. The key to their effectiveness lies in maximising the information gained: examining existing studies, conducting power analyses to establish an accurate minimum sample size, and benefiting from previous studies through replication. This approach was applied in a controlled experiment examining the combination of automated static analysis tools and code inspection in the context of verification and validation (V&V) of concurrent Java components. The combination of these V&V technologies was shown to be cost-effective despite the modest size of the study, which thus contributes to research in V&V technology evaluation.
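For the power-analysis step mentioned above, a hedged sketch using statsmodels is shown below; the effect size, significance level and target power are illustrative assumptions, not figures from the experiment.

```python
# Solve for the minimum sample size per group in a two-sample t-test design.
# All parameter values are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.8,   # assumed standardised effect (Cohen's d)
    alpha=0.05,        # significance level
    power=0.8,         # desired probability of detecting the effect
)
print(f"Minimum sample size per group: {n_per_group:.1f}")
```

Running such an analysis before the experiment tells the researcher how many subjects (or components) are needed for the study to have a realistic chance of detecting the expected effect, which is precisely why small studies benefit from it.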