908 results for CLASSIFICATION AND REGRESSION TREE


Relevance: 100.00%

Publisher:

Abstract:

Areas of the landscape that are priorities for conservation should be those that are both vulnerable to threatening processes and that, if lost or degraded, will result in conservation targets being compromised. While much attention is directed towards understanding the patterns of biodiversity, much less is given to determining the areas of the landscape most vulnerable to threats. We assessed the relative vulnerability of remaining areas of native forest to conversion to plantations in the ecologically significant temperate rainforest region of south-central Chile. The area of the study region is 4.2 million ha and the extent of plantations is approximately 200,000 ha. First, the spatial distribution of native forest conversion to plantations was determined. The variables related to the spatial distribution of this threatening process were identified through the development of a classification tree and the generation of a multivariate, spatially explicit, statistical model. The model of native forest conversion explained 43% of the deviance and the discrimination ability of the model was high. Predictions were made of where native forest conversion is likely to occur in the future. Due to patterns of climate, topography, soils and proximity to infrastructure and towns, remaining forest areas differ in their relative risk of being converted to plantations. Another factor that may increase the vulnerability of remaining native forest in a subset of the study region is the proposed construction of a highway. We found that 90% of the area of existing plantations within this region is within 2.5 km of roads. When the predictions of native forest conversion were recalculated accounting for the construction of this highway, it was found that approximately 27,000 ha of native forest had an increased probability of conversion. The areas of native forest identified to be vulnerable to conversion are outside of the existing reserve network. (C) 2004 Elsevier Ltd. All rights reserved.
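As a hedged illustration of the abstract's core method, the sketch below fits a classification tree to synthetic covariates standing in for the study's climate, topography and road-proximity variables; all data, variable choices and parameters here are invented, not the authors'.

```python
# Hypothetical sketch: a classification tree predicting forest conversion
# from environmental covariates. Data and coefficients are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.exponential(5.0, n),   # distance to nearest road (km) - invented
    rng.uniform(0, 45, n),     # slope (degrees) - invented
    rng.uniform(0, 2000, n),   # elevation (m) - invented
])
# Make conversion more likely near roads and on gentle slopes
p = 1 / (1 + np.exp(0.5 * X[:, 0] + 0.05 * X[:, 1] - 2.5))
y = rng.random(n) < p

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(f"training accuracy: {tree.score(X, y):.2f}")
```

A real analysis would of course use gridded spatial layers, held-out validation, and deviance-based diagnostics rather than training accuracy.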

Relevance: 100.00%

Publisher:

Abstract:

Chemical engineers are turning to multiscale modelling to extend traditional modelling approaches into new application areas and to achieve higher levels of detail and accuracy. There is, however, little advice available on the best strategy to use in constructing a multiscale model. This paper presents a starting point for the systematic analysis of multiscale models by defining several integrating frameworks for linking models at different scales. It briefly explores how the nature of the information flow between the models at the different scales is influenced by the choice of framework, and presents some restrictions on model-framework compatibility. The concepts are illustrated with reference to the modelling of a catalytic packed bed reactor. (C) 2004 Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Publisher:

Abstract:

Male Nezara viridula produce sex pheromones from many independent single cells, each with a duct that opens onto the ventral abdominal surface. Despite the presence of a long duct and an associated end complex (in the form of a cupule and microvillus saccule), the structural organization of the cells that comprise the gland conforms to the Class 1 epidermal gland cell classification: a single cell surrounds the entire secretory complex. Each cuticular cupule contains a central bed of filaments and opens into a narrow tubular ductule that leads from the base of the cupule through the epidermis to the cuticle to open externally as a pore. The cuticle of the cupule is continuous with that of the ductule and has the appearance of three layers, although the inner (middle) layer may be a gap formed during construction of the complex. In young adult males, just molted, the ultrastructure of the cells and their inclusions indicates that they are not active. The region of the cell that is distal to the abdominal cuticle is reduced and the proximal region, surrounding the duct, is enlarged when compared with sexually mature (3-4 weeks old) adult males. At maturity the pheromone cells are enlarged distally around the cupule, but are reduced to a narrow sleeve proximally, around the ductule. Two characteristic cell profiles are evident, based on the shape of the cupule and the organelle content. Type A shows a broad opening to the cupule, an abundance of mitochondria, and few vesicular bodies. Type B has an elongated, narrow, vase-like opening to the cupule, few mitochondria, and numerous vesicular bodies. Type B cells are smaller and more abundant than Type A. Distribution within the epidermal layer also differs. It is likely that the different types represent cells producing different secretion profiles. However, the secretions retained by the standard fixation protocol within mature cells of both types look similar and appear to collect as crystalline bodies within the lumen. This may represent a common storage mechanism.

Relevance: 100.00%

Publisher:

Abstract:

Correlation and regression are two of the statistical procedures most widely used by optometrists. However, these tests are often misused or interpreted incorrectly, leading to erroneous conclusions from clinical experiments. This review examines the major statistical tests concerned with correlation and regression that are most likely to arise in clinical investigations in optometry. First, the use, interpretation and limitations of Pearson's product moment correlation coefficient are described. Second, the least squares method of fitting a linear regression to data, and methods for testing how well a regression line fits the data, are described. Third, the problems of using linear regression methods in observational studies, when there are errors associated with measuring the independent variable and when predicting a new value of Y for a given X, are discussed. Finally, methods for testing whether a non-linear relationship provides a better fit to the data and for comparing two or more regression lines are considered.
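The first two procedures the review covers, Pearson's r and least-squares linear regression, can be illustrated in a few lines; the data below are made up for the example.

```python
# Pearson correlation and least-squares regression on toy data.
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.1])  # roughly y = 2x

r, p_value = stats.pearsonr(x, y)   # correlation coefficient and p-value
fit = stats.linregress(x, y)        # slope, intercept, rvalue, pvalue, stderr
print(f"r = {r:.3f}, slope = {fit.slope:.3f}, intercept = {fit.intercept:.3f}")
```

Note that `linregress` treats x as fixed; as the review stresses, ordinary least squares is not appropriate when the independent variable itself carries substantial measurement error.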

Relevance: 100.00%

Publisher:

Abstract:

Time, cost and quality achievements on large-scale construction projects are uncertain because of technological constraints, the involvement of many stakeholders, long durations, large capital requirements and improper scope definitions. Projects exposed to such an uncertain environment can be managed effectively by applying risk management throughout the project life cycle. Risk is by nature subjective. However, managing risk subjectively poses the danger of non-achievement of project goals. Moreover, risk analysis of the overall project also poses the danger of developing inappropriate responses. This article demonstrates a quantitative approach to construction risk management through the analytic hierarchy process (AHP) and decision tree analysis (DTA). The entire project is divided into a few work packages. With the involvement of project stakeholders, risky work packages are identified. Once all the risk factors are identified, their effects are quantified by determining probability (using AHP) and severity (guesstimate). Various alternative responses are generated, listing the cost implications of mitigating the quantified risks. The expected monetary values are derived for each alternative in a decision tree framework, and subsequent probability analysis helps to make the right decision in managing risks. The entire methodology is explained through a case application of a cross-country petroleum pipeline project in India. The case study demonstrates the effectiveness of using AHP and DTA for project risk management.
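The decision-tree step of such a methodology reduces to comparing expected monetary values (EMVs) across response options; the probabilities, losses and mitigation costs below are invented for illustration, not taken from the article's case study.

```python
# Toy EMV comparison for risk-response selection.
# Each option: (probability risk occurs, loss if it occurs, upfront cost).
options = {
    "do nothing":       (0.40, 500_000,       0),
    "partial mitigate": (0.15, 500_000,  40_000),
    "full mitigate":    (0.05, 500_000, 100_000),
}

def emv(p, loss, upfront):
    """Expected monetary value of an option (lower is better here)."""
    return upfront + p * loss

best = min(options, key=lambda k: emv(*options[k]))
for name, args in options.items():
    print(f"{name:>16}: EMV = {emv(*args):,.0f}")
print("chosen response:", best)
```

Here partial mitigation wins (EMV 115,000 versus 200,000 and 125,000); in practice the probabilities would come from the AHP step rather than being assumed.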

Relevance: 100.00%

Publisher:

Abstract:

Conventional feed-forward neural networks are usually trained with the sum-of-squares cost function. A new cost function is presented here with a description-length interpretation based on Rissanen's Minimum Description Length principle. It is a heuristic with a rough interpretation as the number of data points fit by the model. Rather than seeking optimal descriptions, the cost function forms minimum descriptions in a naive way for computational convenience, and is therefore called the Naive Description Length cost function. Finding minimum description models is shown to be closely related to the identification of clusters in the data. As a consequence, the minimum of this cost function approximates the most probable mode of the data, whereas the sum-of-squares cost function approximates the mean. The new cost function is shown to provide information about the structure of the data, by inspecting the dependence of the error on the amount of regularisation. This structure provides a method of selecting regularisation parameters as an alternative or supplement to Bayesian methods. The new cost function is tested on a number of multi-valued problems, such as a simple inverse kinematics problem, and on a number of classification and regression problems. The mode-seeking property of this cost function is shown to improve prediction in time-series problems. Description-length principles are also used in a similar fashion to derive a regulariser to control network complexity.

Relevance: 100.00%

Publisher:

Abstract:

Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations, there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature.
It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting (1) performance trends over time, and (2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to sensitivity decrement. Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
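The sensitivity/criterion distinction central to this abstract comes from signal detection theory. A minimal sketch, with hypothetical hit and false-alarm rates, of the reported pattern — a stable d' alongside a stricter criterion over the work period:

```python
# Signal detection theory: sensitivity (d') and criterion (c) from
# hit and false-alarm rates. The rates below are hypothetical.
from scipy.stats import norm

def sdt_measures(hit_rate, fa_rate):
    """Return (d_prime, criterion) via the inverse normal transform."""
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa
    criterion = -0.5 * (z_hit + z_fa)
    return d_prime, criterion

# Early vs late in the work period: fewer hits AND fewer false alarms
# late on, i.e. the criterion tightens while sensitivity holds steady.
early = sdt_measures(0.84, 0.16)
late = sdt_measures(0.69, 0.07)
print("early (d', c):", early)
print("late  (d', c):", late)
```

With these rates, d' stays near 2.0 in both periods while c rises from about 0 to about 0.5, which is the "criterion shift without sensitivity decrement" profile the thesis reports for most monitoring tasks.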

Relevance: 100.00%

Publisher:

Abstract:

This study proposes an integrated analytical framework for effective management of project risks, combining a multiple-criteria decision-making technique with decision tree analysis. First, a conceptual risk management model was developed through a thorough literature review. The model was then applied through action research on a petroleum oil refinery construction project in central India in order to demonstrate its effectiveness. Oil refinery construction projects are risky because of technical complexity, resource unavailability, the involvement of many stakeholders and strict environmental requirements. Although project risk management has been researched extensively, a practical and easily adoptable framework is missing. In the proposed framework, risks are identified using a cause-and-effect diagram, analysed using the analytic hierarchy process, and responses are developed using a risk map. Additionally, decision tree analysis allows various options for risk response development to be modelled and optimises the selection of the risk-mitigating strategy. The proposed risk management framework can be easily adopted, applied in any project and integrated with other project management knowledge areas.
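The analytic hierarchy process step of such a framework derives risk-factor weights from a pairwise comparison matrix. A minimal sketch with an invented 3x3 matrix on Saaty's 1-9 scale, not the study's actual judgements:

```python
# AHP priority weights via the principal eigenvector of a pairwise
# comparison matrix, plus Saaty's consistency ratio. Matrix is invented.
import numpy as np

A = np.array([
    [1.0,   3.0, 5.0],   # factor 1 judged 3x factor 2, 5x factor 3
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue (lambda_max)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalised priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)   # consistency index
cr = ci / 0.58                         # random index RI = 0.58 for n = 3
print("weights:", np.round(w, 3), " CR:", round(cr, 3))
```

Saaty's rule of thumb is that judgements with CR below 0.1 are acceptably consistent; this example matrix passes easily.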

Relevance: 100.00%

Publisher:

Abstract:

This paper presents an approach to the development of an intelligent search system and automatic document classification and cataloguing tools for a CASE system, based on metadata. The described method combines the advantages of an ontology-based approach with those of a traditional keyword-based approach. The method provides powerful intelligent search capabilities and can be integrated with existing document search systems.

Relevance: 100.00%

Publisher:

Abstract:

This research establishes new optimization methods for pattern recognition and classification of different white blood cells in actual patient data, to enhance the process of diagnosis. Beckman-Coulter Corporation supplied flow cytometry data from numerous patients, used as training sets to exploit the different physiological characteristics of the samples provided. Support Vector Machines (SVM) and Artificial Neural Networks (ANN) were used as promising pattern classification techniques to identify different white blood cell samples and to provide information to medical doctors in the form of diagnostic references for specific disease states such as leukemia. The results show that when a neural network classifier is well configured and trained with cross-validation, it can outperform support vector classifiers alone on this type of data. Furthermore, a new unsupervised learning algorithm, the Density-based Adaptive Window Clustering (DAWC) algorithm, was designed to process large volumes of data and find the locations of high-density data clusters in real time. It reduces the computational load to approximately O(N) computations, making the algorithm faster and more attractive than current hierarchical algorithms.
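A hedged sketch of the classifier comparison described above, using synthetic stand-in data since the original flow-cytometry data are proprietary; the model sizes, parameters and dataset shape are illustrative only.

```python
# Cross-validated comparison of an SVM and a small neural network
# on synthetic multi-class data standing in for cytometry features.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                           n_classes=3, random_state=0)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
ann = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32,),
                                  max_iter=2000, random_state=0))

svm_acc = cross_val_score(svm, X, y, cv=5).mean()
ann_acc = cross_val_score(ann, X, y, cv=5).mean()
print(f"SVM accuracy: {svm_acc:.3f}  ANN accuracy: {ann_acc:.3f}")
```

Which classifier wins depends on the data and tuning; the study's finding was that a well-configured, cross-validated neural network edged out the SVM on their cytometry data, not that it does so in general.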

Relevance: 100.00%

Publisher:

Abstract:

Annual precipitation for the last 2,500 years was reconstructed for northeastern Qinghai from living and archaeological juniper trees. A dominant feature of the precipitation of this area is a high degree of variability in mean rainfall at annual, decadal, and centennial scales, with many wet and dry periods that are corroborated by other paleoclimatic indicators. Reconstructed values of annual precipitation vary mostly from 100 to 300 mm and thus are no different from the modern instrumental record in Dulan. However, relatively dry years with below-average precipitation occurred more frequently in the past than in the present. Periods of relatively dry years occurred during 74-25 BC, AD 51-375, 426-500, 526-575, 626-700, 1100-1225, 1251-1325, 1451-1525, 1651-1750 and 1801-1825. Periods with a relatively wet climate occurred during AD 376-425, 576-625, 951-1050, 1351-1375, 1551-1600 and the present. This variability is probably related to latitudinal positions of winter frontal storms. Another key feature of precipitation in this area is an apparently direct relationship between interannual variability in rainfall and temperature, whereby increased warming in the future might lead to increased flooding and droughts. Such increased climatic variability might then impact human societies of the area, much as the climate has done for the past 2,500 years.