926 results for ASSESSMENT MODELS


Relevance: 30.00%

Abstract:

The Wet Tropics World Heritage Area in Far North Queensland, Australia consists predominantly of tropical rainforest and wet sclerophyll forest in areas of variable relief. Previous maps of vegetation communities in the area were produced by a labor-intensive combination of field survey and air-photo interpretation. Thus, the aim of this work was to develop a new vegetation mapping method based on imaging radar that incorporates topographical corrections, which could be repeated frequently, and which would reduce the need for detailed field assessments and associated costs. The method employed a topographic correction and mapping procedure that was developed to enable vegetation structural classes to be mapped from satellite imaging radar. Eight JERS-1 scenes covering the Wet Tropics area for 1996 were acquired from NASDA under the auspices of the Global Rainforest Mapping Project. JERS scenes were geometrically corrected for topographic distortion using an 80 m DEM and a combination of polynomial warping and radar viewing geometry modeling. An image mosaic was created to cover the Wet Tropics region, and a new technique for image smoothing was applied to the JERS texture bands and DEM before a Maximum Likelihood classification was applied to identify major land-cover and vegetation communities. Despite these efforts, dominant vegetation community classes could only be classified to low levels of accuracy (57.5 percent), which was partly explained by the significantly larger pixel size of the DEM in comparison to the JERS image (12.5 m). In addition, the spatial and floristic detail contained in the classes of the original validation maps was much finer than the JERS classification product was able to distinguish. In comparison to field and aerial photo-based approaches for mapping the vegetation of the Wet Tropics, appropriately corrected SAR data provide a regional-scale, all-weather mapping technique for broader vegetation classes.
Further work is required to establish an appropriate combination of imaging radar with elevation data and other environmental surrogates to accurately map vegetation communities across the entire Wet Tropics.
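
The Maximum Likelihood step described above can be illustrated with a minimal sketch: each class is modelled as a multivariate Gaussian over pixel features, and a pixel is assigned to the class with the highest log-likelihood. The class names, features (backscatter and elevation) and training statistics below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical training statistics for two structural classes:
# (mean vector, covariance matrix) over [backscatter dB, elevation m].
classes = {
    "rainforest":  (np.array([-8.0, 400.0]), np.array([[1.0, 0.0], [0.0, 2500.0]])),
    "sclerophyll": (np.array([-11.0, 700.0]), np.array([[1.5, 0.0], [0.0, 3600.0]])),
}

def log_likelihood(x, mean, cov):
    """Log of the multivariate Gaussian density (constants kept for correctness)."""
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (len(x) * np.log(2 * np.pi) + logdet + d @ np.linalg.inv(cov) @ d)

def classify(pixel):
    """Assign the pixel to the class with the highest log-likelihood."""
    return max(classes, key=lambda c: log_likelihood(pixel, *classes[c]))

print(classify(np.array([-8.2, 420.0])))  # → rainforest
```

In practice each pixel vector would also carry the texture bands and topographically corrected backscatter described above, but the decision rule is the same.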

Relevance: 30.00%

Abstract:

The GH receptor (GHR) is essential for normal postnatal growth and development, and the molecular basis of GHR action has been studied intensively. Clinical case studies and, more recently, mouse models have revealed the extensive phenotype of impaired GH action. We recently reported two new mouse models, possessing cytoplasmic truncations at position 569 (plus Y539/545-F) and 391, which were created to identify functional subdomains within the cytoplasmic signaling domain. In the homozygous state, these animals show progressively impaired postnatal growth coupled with complex changes in gene expression. We describe here an extended phenotype analysis encompassing the heterozygote state to identify whether single copies of these mutant receptors bring about partial or dominant-negative phenotypes. It appears that retention of the ubiquitin-dependent endocytosis motif in the N-terminal cytoplasmic domain permits turnover of these mutant receptors, because no dominant-negative phenotype is seen. Nonetheless, we do observe partial impairment of postnatal growth in heterozygotes, supporting limited haploinsufficiency. Reproductive function is impaired in these models in a progressive manner, in parallel with loss of signal transducer and activator of transcription-5 activation ability. In summary, we describe a more comprehensive phenotypic analysis of these mouse models, encompassing overall and longitudinal body growth, reproductive function, and hormonal status in both the heterozygote and homozygote states. Our results suggest that patients expressing single copies of similarly mutated GHRs would not display an obvious clinical phenotype.

Relevance: 30.00%

Abstract:

Elevated ocean temperatures can cause coral bleaching, the loss of colour from reef-building corals because of a breakdown of the symbiosis with the dinoflagellate Symbiodinium. Recent studies have warned that global climate change could increase the frequency of coral bleaching and threaten the long-term viability of coral reefs. These assertions are based on projecting the coarse output from atmosphere-ocean general circulation models (GCMs) to the local conditions around representative coral reefs. Here, we conduct the first comprehensive global assessment of coral bleaching under climate change by adapting the NOAA Coral Reef Watch bleaching prediction method to the output of a low- and high-climate sensitivity GCM. First, we develop and test algorithms for predicting mass coral bleaching with GCM-resolution sea surface temperatures for thousands of coral reefs, using a global coral reef map and 1985-2002 bleaching prediction data. We then use the algorithms to determine the frequency of coral bleaching and required thermal adaptation by corals and their endosymbionts under two different emissions scenarios. The results indicate that bleaching could become an annual or biannual event for the vast majority of the world's coral reefs in the next 30-50 years without an increase in thermal tolerance of 0.2-1.0 degrees C per decade. The geographic variability in required thermal adaptation found in each model and emissions scenario suggests that coral reefs in some regions, like Micronesia and western Polynesia, may be particularly vulnerable to climate change. Advances in modelling and monitoring will refine the forecast for individual reefs, but this assessment concludes that the global prognosis is unlikely to change without an accelerated effort to stabilize atmospheric greenhouse gas concentrations.
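
The NOAA Coral Reef Watch style of bleaching prediction adapted here accumulates thermal stress as degree heating weeks (DHW): HotSpots of at least 1 °C above the maximum monthly mean (MMM) climatology, summed over a rolling 12-week window, with mass bleaching commonly expected from around 4 °C-weeks. A minimal sketch, using hypothetical GCM-resolution weekly SSTs:

```python
def degree_heating_weeks(weekly_sst, mmm):
    """Sum HotSpots (SST excess over the maximum monthly mean, MMM) of at
    least 1 degree C over the most recent 12-week window, in C-weeks."""
    hotspots = [max(0.0, sst - mmm) for sst in weekly_sst]
    return sum(h for h in hotspots[-12:] if h >= 1.0)

# Hypothetical 12 weeks of sea surface temperatures around a 28.0 C MMM.
ssts = [28.0, 28.5, 29.2, 29.5, 29.4, 29.6, 29.1, 28.8, 29.3, 29.5, 29.2, 28.9]
dhw = degree_heating_weeks(ssts, mmm=28.0)
print(round(dhw, 1))  # → 10.8 C-weeks of accumulated thermal stress
print("bleaching likely" if dhw >= 4.0 else "low stress")
```

The operational product works from twice-weekly satellite SSTs rather than weekly values, but the accumulation logic is the same; the assessment's contribution is driving it with GCM output under emissions scenarios.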

Relevance: 30.00%

Abstract:

Document classification is a supervised machine learning process, where predefined category labels are assigned to documents based on a hypothesis derived from a training set of labelled documents. Documents cannot be directly interpreted by a computer system unless they have been modelled as a collection of computable features. Rogati and Yang [M. Rogati and Y. Yang, Resource selection for domain-specific cross-lingual IR, in SIGIR 2004: Proceedings of the 27th Annual International Conference on Research and Development in Information Retrieval, ACM Press, Sheffield, United Kingdom, pp. 154-161.] pointed out that the effectiveness of a document classification system may vary across domains. This implies that the quality of the document model contributes to the effectiveness of document classification. Conventionally, model evaluation is accomplished by comparing the effectiveness scores of classifiers on model candidates. However, this kind of evaluation method may encounter either under-fitting or over-fitting problems, because the effectiveness scores are restricted by the learning capacities of the classifiers. We propose a model fitness evaluation method to determine whether a model is sufficient to distinguish positive and negative instances while still competent to provide satisfactory effectiveness with a small feature subset. Our experiments demonstrated how the fitness of models is assessed. The results of our work contribute to research on feature selection, dimensionality reduction and document classification.
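
The idea of testing whether a model can still distinguish positive from negative instances using only a small feature subset can be sketched as follows. The toy corpus, the scoring function and the separation test are all hypothetical stand-ins for illustration, not the paper's actual procedure.

```python
from collections import Counter

# Tiny invented corpus: two "finance" documents vs two "sport" documents.
pos_docs = [["market", "stock", "trade"], ["stock", "price", "trade"]]
neg_docs = [["match", "goal", "team"], ["team", "coach", "goal"]]

def feature_scores(pos, neg):
    """Score each term by how unevenly it occurs across the two classes."""
    p = Counter(t for d in pos for t in d)
    n = Counter(t for d in neg for t in d)
    return {t: abs(p[t] - n[t]) for t in set(p) | set(n)}

def separable(pos, neg, k):
    """True if every document leans towards its own class using only the
    top-k features -- a crude stand-in for classifier effectiveness."""
    scores = feature_scores(pos, neg)
    top = set(sorted(scores, key=scores.get, reverse=True)[:k])
    pos_vocab = {t for d in pos for t in d} & top
    neg_vocab = {t for d in neg for t in d} & top
    def lean(doc):
        feats = set(doc) & top
        return len(feats & pos_vocab) - len(feats & neg_vocab)
    return all(lean(d) > 0 for d in pos) and all(lean(d) < 0 for d in neg)

print(separable(pos_docs, neg_docs, k=4))  # → True: four features suffice
print(separable(pos_docs, neg_docs, k=1))  # → False: one feature under-fits
```

The point mirrors the abstract's argument: a model (feature subset) can be judged on whether it separates the instances at all, independently of any particular classifier's learning capacity.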

Relevance: 30.00%

Abstract:

Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation, and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns, with transitional gradients from one vegetation community to another. Arbitrary and often unrealistic sharp boundaries can otherwise be imposed on the model by the application of statistical methods. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of Northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including site sampling, variable selection, model selection, model implementation, internal model assessment, model prediction assessment, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, model validation against an independent data set, and scale assessment of model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r² = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for the production of adequate vegetation maps for conservation and forestry planning of poorly-studied areas. (c) 2006 Elsevier B.V. All rights reserved.
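
The integration of separate community models into a composite map can be pictured as follows, assuming simple logistic suitability models and an argmax rule per grid cell. The community names, environmental covariates and coefficients are invented for illustration; the paper's 28 statistical models and GIS rule base are far richer.

```python
import math

# Hypothetical per-community logistic models over two environmental
# covariates (elevation in m, annual rainfall in mm): (b0, b1, b2).
models = {
    "mesophyll vine forest": (-2.0, -0.004, 0.0012),
    "sclerophyll woodland":  (-1.0,  0.003, -0.0004),
}

def suitability(name, elev, rain):
    """Logistic suitability: P = 1 / (1 + exp(-(b0 + b1*elev + b2*rain)))."""
    b0, b1, b2 = models[name]
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * elev + b2 * rain)))

def composite(elev, rain):
    """Integrate the separate community models by assigning each grid cell
    to the community with the highest predicted probability."""
    return max(models, key=lambda name: suitability(name, elev, rain))

print(composite(elev=40.0, rain=3500.0))   # wet lowland cell
print(composite(elev=900.0, rain=1200.0))  # drier upland cell
```

Running this rule over every cell of an environmental raster yields a composite community map; smoothing the probabilities rather than thresholding them is what preserves the transitional gradients the abstract emphasises.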

Relevance: 30.00%

Abstract:

Numerical modelling is a valuable tool for simulating the fundamental processes that take place during a heating. The models presented in this paper have enabled a quantitative assessment of the effects of initial pile temperature, pile size and mass, and coal particle size on the development of a heating. Each of these parameters plays a critical role in the coal self-heating process.
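
One way to picture the criticality of these parameters is a lumped (zero-dimensional) energy balance in which Arrhenius-type oxidation heating competes with Newtonian cooling: above a critical initial temperature the balance tips into thermal runaway. This is a generic self-heating sketch with invented coefficients, not one of the paper's models; pile size and particle size would enter through the heating and cooling coefficients in a fuller treatment.

```python
import math

R = 8.314          # gas constant, J/(mol K)
E = 70_000.0       # activation energy, J/mol (assumed)
A = 2.0e8          # heating pre-exponential factor, K/s (assumed)
H = 1.0e-3         # cooling coefficient, 1/s (assumed)
T_AMBIENT = 298.0  # K

def peak_temperature(t0, dt=10.0, steps=100_000):
    """Explicit Euler integration of dT/dt = heating - cooling; returns the
    peak temperature, stopping once the pile runs away past 700 K (ignition)."""
    T = peak = t0
    for _ in range(steps):
        dT = A * math.exp(-E / (R * T)) - H * (T - T_AMBIENT)
        T += dT * dt
        peak = max(peak, T)
        if T > 700.0:
            break
    return peak

print(peak_temperature(300.0))          # cool start: heating stays bounded
print(peak_temperature(450.0) > 700.0)  # hot start: thermal runaway
```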

Relevance: 30.00%

Abstract:

The process of astrogliosis, or reactive gliosis, is a typical response of astrocytes to a wide range of physical and chemical injuries. The up-regulation of the astrocyte-specific glial fibrillary acidic protein (GFAP) is a hallmark of reactive gliosis and is widely used as a marker to identify the response. In order to develop a reliable, sensitive and high-throughput astrocyte toxicity assay that is more relevant to the human response than existing animal cell based models, the U251-MG, U373-MG and CCF-STTG1 human astrocytoma cell lines were investigated for their ability to exhibit reactive-like changes following exposure to ethanol, chloroquine diphosphate, trimethyltin chloride and acrylamide. Cytotoxicity analysis showed that the astrocytic cells were generally more resistant to the cytotoxic effects of the agents than the SH-SY5Y neuroblastoma cells. Retinoic acid-induced differentiation of the SH-SY5Y line was also seen to confer some degree of resistance to toxicant exposure, particularly in the case of ethanol. Using a cell-based ELISA for GFAP together with concurrent assays for metabolic activity and cell number, each of the three cell lines responded to toxicant exposure with an increase in GFAP immunoreactivity (GFAP-IR) or with increased metabolic activity. Ethanol, chloroquine diphosphate, trimethyltin chloride and bacterial lipopolysaccharide all induced either GFAP or MTT increases, depending upon the cell line, dose and exposure time. Preliminary investigations of additional aspects of astrocytic injury indicated that IL-6, but not TNF-α or nitric oxide, is released following exposure to each of the compounds, with the exception of acrylamide. It is clear that these human astrocytoma cell lines are capable of responding to toxicant exposure in a manner typical of reactive gliosis and are therefore a valuable cellular model for the assessment of in vitro neurotoxicity.

Relevance: 30.00%

Abstract:

This research investigated expertise in hazardous substance risk assessment (HSRA). Competent, pro-active risk assessment is needed to prevent occupational ill-health caused by future hazardous substance exposure. In recent years there has been strong demand for HSRA expertise and a shortage of expert practitioners. The discipline of Occupational Hygiene was identified as the key repository of knowledge and skills for HSRA, and one objective of this research was to develop a method to elicit this expertise from experienced occupational hygienists. In the study of generic expertise, many methods of knowledge elicitation (KE) have been investigated, since this has been relevant to the development of 'expert systems' (thinking computers). Here, knowledge needed to be elicited from human experts, and this stage was often a bottleneck in system development, since experts could not explain the basis of their expertise. At an intermediate stage, the information collected was used to structure a basic model of hazardous substance risk assessment activity (HSRA Model B), and this formed the basis of tape transcript analysis in the main study, with derivation of a 'classification' and a 'performance matrix'. The study aimed to elicit the expertise of occupational hygienists and compare their performance with that of other health and safety professionals (occupational health physicians, occupational health nurses, health and safety practitioners and trainee health and safety inspectors), as evaluated using the matrix. As a group, the hygienists performed best in the exercise, being particularly good at process elicitation and at recommending specific control measures, although the other groups also performed well in selected aspects of the matrix, and the work provided useful findings and insights. From the research, two models of HSRA and an HSRA aid have been derived, together with a novel videotape KE technique and a set of research findings.
The implications of this are discussed with respect to future training of HS professionals and wider application of the videotape KE method.

Relevance: 30.00%

Abstract:

Investment in capacity expansion remains one of the most critical decisions for a manufacturing organisation with global production facilities. Multiple factors need to be considered, making the decision process very complex. The purpose of this paper is to establish the state of the art in multi-factor models for capacity expansion of manufacturing plants within a corporation. The research programme, consisting of an extensive literature review and a structured assessment of the strengths and weaknesses of the current research, is presented. The study found that there is a wealth of mathematical multi-factor models for evaluating capacity expansion decisions; however, no single contribution captures all the different facets of the problem.

Relevance: 30.00%

Abstract:

This work presents a two-dimensional risk assessment method based on the quantification of the probability of occurrence of contaminant source terms, as well as the assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation, whereby synthetic contaminant source terms were generated from the same distribution as historically occurring pollution events, or from an a priori probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer were then simulated from repeated realisations using integrated mathematical models. The number of times user-defined ranges of concentration magnitudes were exceeded is quantified as the risk. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring together by chance was evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. The capability to generate synthetic pollution events from numerous potential sources of pollution, based on the historical frequency of their occurrence, proved to be a great asset of the method and a clear advantage over contemporary methods.
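
The exceedance-counting idea can be sketched in a few lines: draw synthetic source terms from an assumed distribution, propagate them through a (here drastically simplified) transport step, and report the fraction of realisations that exceed a threshold at a monitoring point. The distribution parameters, attenuation factor and threshold below are all hypothetical placeholders for the paper's fitted distributions and integrated transport models.

```python
import random

random.seed(1)  # reproducible realisations

N_REALISATIONS = 10_000
THRESHOLD = 50.0   # mg/L at the observation borehole (illustrative)
ATTENUATION = 0.1  # stand-in for the integrated transport model

exceedances = 0
for _ in range(N_REALISATIONS):
    # Synthetic source term, assumed lognormal as if fitted to historical events.
    source = random.lognormvariate(mu=5.5, sigma=1.0)
    at_borehole = source * ATTENUATION  # simplified transport to the borehole
    if at_borehole > THRESHOLD:
        exceedances += 1

risk = exceedances / N_REALISATIONS  # fraction of realisations exceeding the threshold
print(f"risk of exceeding {THRESHOLD} mg/L: {risk:.3f}")
```

Repeating this per monitoring point, and per concentration range, yields the risk maps and trend charts the abstract describes.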

Relevance: 30.00%

Abstract:

Purpose – The purpose of the paper was to conduct an empirical investigation to explore the impact of project management maturity models (PMMMs) on improving project performance. Design/methodology/approach – The investigation used a cross-case analysis involving over 90 individuals in seven organisations. Findings – The findings of the empirical investigation indicate that PMMMs demonstrate very high levels of variability in individuals' assessments of project management maturity. Furthermore, at higher levels of maturity, the type of performance improvement adopted following their application is related to the type of PMMM used in the assessment. The paradox of the unreliability of PMMMs and their widespread acceptance is resolved by calling upon the "wisdom of crowds" phenomenon, which has implications for the use of maturity model assessments in other arenas. Research limitations/implications – The investigation does have the usual issues associated with case research, but the steps taken in the cross-case construction and analysis have improved the overall robustness and extendibility of the findings. Practical implications – The tendency for PMMMs to shape improvements based on their own inherent structure needs further understanding. Originality/value – The use of empirical methods to investigate the link between project maturity models and extant changes in project management performance is highly novel, and the findings that result have added resonance.

Relevance: 30.00%

Abstract:

Over the course of the last twenty years there has been a growing academic interest in performance management, particularly in respect of the evolution of new techniques and their resulting impact. One important theoretical development has been the emergence of multidimensional performance measurement models that are potentially applicable within the public sector. Empirically, academic researchers are increasingly supporting the use of such models as a way of improving public sector management and the effectiveness of service provision (Mayston, 1985; Pollitt, 1986; Bates and Brignall, 1993; and Massey, 1999). This paper seeks to add to the literature by using both theoretical and empirical evidence to argue that CPA, the external inspection tool used by the Audit Commission to evaluate local authority performance management, is a version of the Balanced Scorecard which, when adapted for internal use, may have beneficial effects. After demonstrating the parallels between the CPA framework and Kaplan and Norton's public sector Balanced Scorecard (BSC), we use a case study of the BSC based performance management system in Hertfordshire County Council to demonstrate the empirical linkages between a local scorecard and CPA. We conclude that CPA is based upon the BSC and has the potential to serve as a springboard for the evolution of local authority performance management systems.