989 results for TREE METHOD


Relevance:

70.00%

Publisher:

Abstract:

Background: The vast sequence divergence among different virus groups has presented a great challenge to alignment-based analysis of virus phylogeny. Owing to the problems caused by uncertainty in alignment, existing tools for phylogenetic analysis based on multiple alignment cannot be directly applied to whole-genome comparison and phylogenomic studies of viruses. There has been growing interest in alignment-free methods for phylogenetic analysis using complete genome data. Among the alignment-free methods, a dynamical language (DL) method proposed by our group has been successfully applied to the phylogenetic analysis of bacteria and chloroplast genomes. Results: In this paper, the DL method is used to analyze the whole-proteome phylogeny of 124 large dsDNA viruses and 30 parvoviruses, two data sets with a large difference in genome size. The trees from our analyses are in good agreement with the latest classification of large dsDNA viruses and parvoviruses by the International Committee on Taxonomy of Viruses (ICTV). Conclusions: The present method provides a new way of recovering the phylogeny of large dsDNA viruses and parvoviruses, as well as some insight into the affiliation of a number of unclassified viruses. In comparison, some alignment-free methods such as the CV Tree method can be used to recover the phylogeny of large dsDNA viruses, but they are not suitable for resolving the phylogeny of parvoviruses, which have a much smaller genome size.
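For intuition, alignment-free distances of this general family can be computed from word (k-mer) frequency vectors of whole genomes or proteomes. The sketch below uses a plain cosine distance between k-mer frequency vectors as an illustration; it is not the DL method's own correlation measure.

```python
from collections import Counter
import math

def kmer_freqs(seq, k=3):
    """Normalized k-mer frequency vector of a sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def cosine_distance(f1, f2):
    """1 minus the cosine similarity of two sparse frequency vectors."""
    keys = set(f1) | set(f2)
    dot = sum(f1.get(w, 0.0) * f2.get(w, 0.0) for w in keys)
    n1 = math.sqrt(sum(v * v for v in f1.values()))
    n2 = math.sqrt(sum(v * v for v in f2.values()))
    return 1.0 - dot / (n1 * n2)

# Identical proteomes give distance ~0; disjoint k-mer sets give 1.
d_same = cosine_distance(kmer_freqs("MKTAYIAKQR"), kmer_freqs("MKTAYIAKQR"))
d_diff = cosine_distance(kmer_freqs("AAAAAAA"), kmer_freqs("CCCCCCC"))
```

A distance matrix built this way over all genomes can then be fed to a standard distance-based tree builder such as neighbor-joining.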

Relevance:

70.00%

Publisher:

Abstract:

BACKGROUND: In order to optimise the cost-effectiveness of active surveillance to substantiate freedom from disease, a new approach using targeted sampling of farms was developed and applied to the example of infectious bovine rhinotracheitis (IBR) and enzootic bovine leucosis (EBL) in Switzerland. Relevant risk factors (RF) for the introduction of IBR and EBL into Swiss cattle farms were identified, and their relative risks defined, based on literature review and expert opinion. A quantitative model based on the scenario tree method was then used to calculate the required sample size of a targeted sampling approach (TS) for a given sensitivity, and this sample size was compared with that of a stratified random sample (sRS) with regard to efficiency. RESULTS: The required sample sizes to substantiate disease freedom, detecting a 0.2% herd prevalence with 99% sensitivity, were 1,241 farms for IBR and 1,750 farms for EBL. Using a conventional sRS, the required sample sizes were 2,259 farms for IBR and 2,243 for EBL. Even allowing for the additional administrative expense of planning the TS, the risk-based approach was more cost-effective than an sRS (a 40% reduction in full survey costs for IBR and 8% for EBL) owing to the considerable reduction in sample size. CONCLUSIONS: As the model depends on RF selected through literature review and was parameterised with values estimated by experts, it is subject to some degree of uncertainty. Nevertheless, the approach provides veterinary authorities with a promising tool for future cost-effective sampling designs.
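For intuition, the sample size for substantiating freedom from disease follows from requiring that the probability of missing every infected herd stays below 1 minus the target sensitivity. A minimal sketch using the standard infinite-population approximation; the paper's scenario-tree model additionally weights farms by risk and accounts for the finite population, which is why its numbers differ somewhat.

```python
import math

def freedom_sample_size(design_prevalence, sensitivity, test_se=1.0):
    """Farms to sample so that, if prevalence were at the design level,
    at least one positive would be detected with the given survey
    sensitivity (infinite-population approximation)."""
    p = design_prevalence * test_se   # per-farm detection probability
    return math.ceil(math.log(1.0 - sensitivity) / math.log(1.0 - p))

# 0.2% design prevalence, 99% sensitivity, perfect test assumed.
n = freedom_sample_size(0.002, 0.99)
```

The risk-based scenario-tree approach lowers this number by sampling preferentially from high-risk strata, so each sampled farm contributes more surveillance sensitivity.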

Relevance:

60.00%

Publisher:

Abstract:

In this paper we propose a new algorithm for learning polyhedral classifiers. In contrast to existing methods for learning polyhedral classifiers, which solve a constrained optimization problem, our method solves an unconstrained optimization problem. The method is based on a logistic-function model for the posterior probability. We propose an alternating optimization algorithm, SPLA1 (Single Polyhedral Learning Algorithm 1), which learns the parameters by maximizing the log-likelihood of the training data. We also extend the method, in SPLA2, to make it independent of any user-specified parameter (e.g., the number of hyperplanes required to form the polyhedral set). We show the effectiveness of our approach with experiments on various synthetic and real-world datasets, comparing it with a standard decision tree method (OC1) and a constrained-optimization-based method for learning polyhedral sets.
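The posterior model can be sketched as follows: a point belongs to the positive (polyhedral) class only if it lies on the positive side of every hyperplane, so the posterior is naturally modeled as a product of logistic factors, one per hyperplane. This is a minimal sketch of that model with hand-picked illustrative hyperplanes, not of the SPLA1 optimization itself.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def polyhedral_posterior(x, hyperplanes):
    """Posterior that x lies inside the polyhedral (positive) region,
    modeled as a product of logistic factors, one per hyperplane."""
    p = 1.0
    for w, b in hyperplanes:
        p *= sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    return p

# Illustrative region x1 > 0 and x2 > 0, with sharp logistic factors.
planes = [((10.0, 0.0), 0.0), ((0.0, 10.0), 0.0)]
inside = polyhedral_posterior((1.0, 1.0), planes)    # near 1
outside = polyhedral_posterior((-1.0, 1.0), planes)  # near 0
```

Because the product of sigmoids is smooth in the hyperplane parameters, the log-likelihood can be maximized without the explicit polyhedral constraints that earlier methods required.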

Relevance:

60.00%

Publisher:

Abstract:

Based on an extensive review of the domestic and international literature on methods for studying tree transpiration, we conclude that these methods fall into two broad classes: measurements on tissues and organs, and measurements on whole individual trees. Representative methods in each class (rapid weighing, porometry, whole-plant container weighing, isotope tracing, heat pulse, trunk heat balance, and thermal dissipation probe) are reviewed, and their respective advantages, disadvantages and ranges of applicability are compared. The application prospects of methods for studying transpiration water consumption by trees are then discussed; thermal techniques are expected to be the principal measurement methods in the coming years.
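The thermal dissipation probe method mentioned in the review is commonly implemented with Granier's empirical calibration, which converts the temperature difference between a heated and an unheated probe into sap flux density. The coefficients below are the standard published calibration, given only as an illustration of how this class of thermal methods works.

```python
def granier_sap_flux(dT, dT_max):
    """Granier thermal-dissipation estimate of sap flux density
    (m^3 m^-2 s^-1) from probe temperature differences (deg C).
    dT_max is the zero-flow temperature difference."""
    K = (dT_max - dT) / dT          # dimensionless flow index
    return 119e-6 * K ** 1.231      # standard empirical calibration

u = granier_sap_flux(dT=8.0, dT_max=10.0)       # moderate daytime flow
zero = granier_sap_flux(dT=10.0, dT_max=10.0)   # no flow at night
```

Integrating the flux density over sapwood area and over the day gives whole-tree transpiration, which is what makes these thermal techniques attractive for long-term, non-destructive monitoring.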

Relevance:

60.00%

Publisher:

Abstract:

Based on fractal theory, contractive mapping principles and fixed point theory, and by means of affine transforms, this dissertation develops a novel Explicit Fractal Interpolation Function (EFIF) that can be used to reconstruct seismic data with high fidelity and precision. Spatial trace interpolation is one of the important issues in seismic data processing. Under ideal circumstances, seismic data should be sampled with uniform spatial coverage. In practice, however, constraints such as complex surface conditions mean that the sampling density may be sparse, or some traces may be lost for other reasons. Wide spacing between receivers can result in sparse sampling along traverse lines and hence in spatial aliasing of short-wavelength features. An interpolation method is therefore of great importance: it must preserve not only the amplitude information but also the phase information, especially at points where the phase changes sharply. Many interpolation methods have been proposed; this dissertation focuses on a special class of fractal interpolation functions, referred to as explicit fractal interpolation functions, to improve the accuracy of interpolation reconstruction and to make local information apparent. Traditional fractal interpolation is based mainly on the random Fractional Brownian Motion (FBM) model; furthermore, the vertical scaling factor, which plays a critical role in the implementation of fractal interpolation, is assigned the same value throughout the interpolation process, so local information cannot be resolved. In addition, the main defect of the traditional fractal interpolation method is that it cannot obtain the function values at the interpolating nodes, so the node error cannot be analyzed quantitatively and the feasibility of the method cannot be evaluated.
Detailed discussions of the applications of fractal interpolation in seismology have not been given previously, let alone interpolation of single-trace seismograms. Building on previous work and fractal theory, this dissertation discusses fractal interpolation thoroughly, examines the stability of this special kind of interpolating function, and proposes an explicit expression for the vertical scaling factor, which controls the precision of the interpolation. This novel method extends traditional fractal interpolation, converting a fractal interpolation with random algorithms into one with deterministic algorithms. A binary tree data structure is used during interpolation, avoiding the iteration that is inevitable in traditional fractal interpolation and improving computational efficiency. To illustrate the validity of the method, several theoretical models are developed; common shot gathers and seismograms are synthesized, and traces erased from the initial section are reconstructed using the explicit fractal interpolation method. To compare quantitatively the waveforms and amplitudes of the theoretical traces erased from the initial section with the traces obtained after reconstruction, each missing trace is reconstructed and the residuals are analyzed. The numerical experiments demonstrate that the novel fractal interpolation method is applicable not only to seismograms with small offset but also to those with large offset. Seismograms reconstructed by the explicit fractal interpolation method closely resemble the originals: the waveforms of the missing traces are estimated very well, and the amplitudes of the interpolated traces are a good approximation of the original ones.
The high precision and computational efficiency of explicit fractal interpolation make it a useful tool for reconstructing seismic data; it not only makes local information apparent but also preserves the overall characteristics of the object investigated. To illustrate the influence of the explicit fractal interpolation method on the accuracy of imaging structure in the Earth's interior, the method is applied to reverse-time migration. Imaging sections obtained from the fractally interpolated reflection data closely resemble the originals. The numerical experiments demonstrate that, even with sparse sampling, highly accurate images of the structure of the Earth's interior can be obtained by means of explicit fractal interpolation, so that high-quality imaging results can be achieved with a relatively small number of seismic stations. With the fractal interpolation method, the efficiency and accuracy of reverse-time migration can be improved under economic constraints. To verify the method on real data, it was tested on data provided by the Broadband Seismic Array Laboratory, IGGCAS. The results demonstrate that the accuracy of explicit fractal interpolation remains very high even for real data with large epicentral distances and large offsets: the amplitudes and phases of the reconstructed station data closely resemble the originals erased from the initial section. Altogether, the novel fractal interpolation function provides a new and useful tool for reconstructing seismic data with high precision and efficiency, and presents an alternative means of accurately imaging the deep structure of the Earth.
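For reference, the standard affine fractal interpolation construction underlying this family of methods can be sketched compactly: each subinterval gets an affine map whose vertical scaling factor d_i controls local roughness, and deterministically applying all maps to the node polyline converges to the interpolating graph. This is the textbook construction with prescribed d_i, not the dissertation's explicit per-node formula for choosing them.

```python
def fif_maps(nodes, d):
    """Affine IFS maps w_i(x, y) = (a_i x + e_i, c_i x + d_i y + f_i)
    that pin the FIF graph to the given nodes, with vertical scaling
    factors d (|d_i| < 1 for contractivity)."""
    (x0, y0), (xN, yN) = nodes[0], nodes[-1]
    span = xN - x0
    maps = []
    for i in range(1, len(nodes)):
        (xa, ya), (xb, yb) = nodes[i - 1], nodes[i]
        di = d[i - 1]
        a = (xb - xa) / span
        e = (xN * xa - x0 * xb) / span
        c = (yb - ya - di * (yN - y0)) / span
        f = (xN * ya - x0 * yb - di * (xN * y0 - x0 * yN)) / span
        maps.append((a, e, c, di, f))
    return maps

def fif_attractor(nodes, d, levels=5):
    """Deterministic refinement: apply every map to the current point
    set, so the polyline converges to the FIF graph (no random
    chaos-game iteration)."""
    maps = fif_maps(nodes, d)   # maps depend only on the original nodes
    pts = list(nodes)
    for _ in range(levels):
        pts = [(a * x + e, c * x + di * y + f)
               for (a, e, c, di, f) in maps
               for (x, y) in pts]
    return sorted(pts)

nodes = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]
pts = fif_attractor(nodes, d=[0.3, 0.3], levels=5)
```

The refined graph passes exactly through the interpolation nodes at every level, which is the property the dissertation exploits to analyze node errors quantitatively.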

Relevance:

60.00%

Publisher:

Abstract:

Artificial neural network (ANN) methods are used to predict forest characteristics. The data source is the Southeast Alaska (SEAK) Grid Inventory, a ground survey compiled by the USDA Forest Service at several thousand sites. The main objective of this article is to predict characteristics at unsurveyed locations between grid sites. A secondary objective is to evaluate the relative performance of different ANNs. Data from the grid sites are used to train six ANNs: multilayer perceptron, fuzzy ARTMAP, probabilistic, generalized regression, radial basis function, and learning vector quantization. A classification and regression tree method is used for comparison. Topographic variables are used to construct models: latitude and longitude coordinates, elevation, slope, and aspect. The models classify three forest characteristics: crown closure, species land cover, and tree size/structure. Models are constructed using n-fold cross-validation. Predictive accuracy is calculated using a method that accounts for the influence of misclassification as well as measuring correct classifications. The probabilistic and generalized regression networks are found to be the most accurate. The predictions of the ANN models are compared with a classification of the Tongass national forest in southeast Alaska based on the interpretation of satellite imagery and are found to be of similar accuracy.
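Of the networks compared, the generalized regression network admits a particularly compact sketch: it is essentially a kernel-weighted average of training targets (a Nadaraya-Watson estimator). The code below is an illustration of that idea, not the article's implementation, and the toy data are invented.

```python
import math

def grnn_predict(x, train_x, train_y, sigma=1.0):
    """Generalized regression network prediction: Gaussian-kernel
    weighted average of the training targets."""
    weights = [
        math.exp(-sum((a - b) ** 2 for a, b in zip(x, xi))
                 / (2.0 * sigma ** 2))
        for xi in train_x
    ]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, train_y)) / total

# Toy 1-feature example: a query equidistant from two training points
# with targets 0 and 1 is predicted halfway between them.
y_hat = grnn_predict((0.0,), [(-1.0,), (1.0,)], [0.0, 1.0], sigma=0.5)
```

In the article's setting, the inputs would be the topographic variables (latitude, longitude, elevation, slope, aspect) and the outputs the forest characteristic classes, evaluated under n-fold cross-validation.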

Relevance:

60.00%

Publisher:

Abstract:

The Magellanic Clouds are uniquely placed to study the stellar contribution to dust emission. Individual stars can be resolved in these systems even in the mid-infrared, and they are close enough to allow detection of infrared excess caused by dust. We have searched the Spitzer Space Telescope data archive for all Infrared Spectrograph (IRS) staring-mode observations of the Small Magellanic Cloud (SMC) and found that 209 Infrared Array Camera (IRAC) point sources within the footprint of the Surveying the Agents of Galaxy Evolution in the Small Magellanic Cloud (SAGE-SMC) Spitzer Legacy programme were targeted, within a total of 311 staring-mode observations. We classify these point sources using a decision tree method of object classification, based on infrared spectral features, continuum and spectral energy distribution shape, bolometric luminosity, cluster membership and variability information. We find 58 asymptotic giant branch (AGB) stars, 51 young stellar objects, 4 post-AGB objects, 22 red supergiants, 27 stars (of which 23 are dusty OB stars), 24 planetary nebulae (PNe), 10 Wolf-Rayet stars, 3 H II regions, 3 R Coronae Borealis stars, 1 Blue Supergiant and 6 other objects, including 2 foreground AGB stars. We use these classifications to evaluate the success of photometric classification methods reported in the literature.
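A decision-tree classification of this kind reduces to a cascade of feature tests on the spectra. The sketch below is purely illustrative: the features, thresholds and branch order are stand-ins to show the shape of such a classifier, not the paper's actual decision tree.

```python
def classify_point_source(has_dust_features, luminosity_lsun, is_variable):
    """Toy decision cascade in the spirit of feature-based spectral
    classification. All thresholds and branches are hypothetical."""
    if has_dust_features and luminosity_lsun > 1e5:
        return "red supergiant"
    if has_dust_features and is_variable:
        return "AGB star"
    if luminosity_lsun < 1e3:
        return "young stellar object"
    return "other"

label = classify_point_source(True, 2e5, False)
```

The real classification additionally uses spectral features, SED shape, cluster membership and variability, but the control flow is the same: each object falls down one branch to a single class.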

Relevance:

60.00%

Publisher:

Abstract:

AEA Technology has provided an assessment of the probability of α-mode containment failure for the Sizewell B PWR. After a preliminary review of the methodologies available it was decided to use the probabilistic approach described in the paper, based on an extension of the methodology developed by Theofanous et al. (Nucl. Sci. Eng. 97 (1987) 259–325). The input to the assessment is 12 probability distributions; the bases for the quantification of these distributions are discussed. The α-mode assessment performed for the Sizewell B PWR has demonstrated the practicality of the event-tree method with input data represented by probability distributions. The assessment itself has drawn attention to a number of topics, which may be plant and sequence dependent, and has indicated the importance of melt relocation scenarios. The α-mode failure probability following an accident that leads to core melt relocation to the lower head for the Sizewell B PWR has been assessed as a few parts in 10 000, on the basis of current information. This assessment has been the first to consider elevated pressures (6 MPa and 15 MPa) besides atmospheric pressure, but the results suggest only a modest sensitivity to system pressure.
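An event-tree assessment with distribution-valued inputs is typically evaluated by Monte Carlo: sample each branch probability from its input distribution and average the product along the failure path. A minimal sketch, assuming a single three-branch path with uniform stand-in distributions; the real assessment used 12 distributions and a full tree.

```python
import random

def path_failure_probability(branch_samplers, trials=100_000, seed=1):
    """Monte Carlo propagation along one event-tree path: each sampler
    draws a branch probability from its input distribution, and the
    path failure probability is the product of the branches."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        p = 1.0
        for sampler in branch_samplers:
            p *= sampler(rng)
        total += p
    return total / trials

# Three illustrative branch distributions, each uniform around 0.1,
# so the expected path probability is about 0.1 ** 3.
samplers = [lambda r: r.uniform(0.05, 0.15)] * 3
p_fail = path_failure_probability(samplers)
```

Propagating distributions rather than point values is what lets the assessment report an alpha-mode failure probability of "a few parts in 10,000" with its input uncertainty carried through.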

Relevance:

60.00%

Publisher:

Abstract:

This paper introduces a new technique in the investigation of object classification and illustrates the potential use of this technique for the analysis of a range of biological data, using avian morphometric data as an example. The nascent variable precision rough sets (VPRS) model is introduced and compared with the decision tree method ID3 (through a ‘leave n out’ approach), using the same dataset of morphometric measures of European barn swallows (Hirundo rustica) and assessing the accuracy of gender classification based on these measures. The results demonstrate that the VPRS model, allied with the use of a modern method of discretization of data, is comparable with the more traditional non-parametric ID3 decision tree method. We show that, particularly in small samples, the VPRS model can improve classification and to a lesser extent prediction aspects over ID3. Furthermore, through the ‘leave n out’ approach, some indication can be produced of the relative importance of the different morphometric measures used in this problem. In this case we suggest that VPRS has advantages over ID3, as it intelligently uses more of the morphometric data available for the data classification, whilst placing less emphasis on variables with low reliability. In biological terms, the results suggest that the gender of swallows can be determined with reasonable accuracy from morphometric data and highlight the most important variables in this process. We suggest that both analysis techniques are potentially useful for the analysis of a range of different types of biological datasets, and that VPRS in particular has potential for application to a range of biological circumstances.
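The core of the VPRS model is the beta-lower approximation: an equivalence class of indiscernible objects counts toward a concept when at least a fraction beta of its members belong to it, relaxing the classical rough-set requirement of full inclusion. A minimal sketch; the data and the choice beta = 0.8 are illustrative.

```python
def vprs_positive_region(classes, target, beta=0.8):
    """Variable precision rough sets: include an equivalence class in
    the beta-lower approximation of `target` when at least a fraction
    `beta` of its members belong to the target set."""
    region = []
    for eq_class in classes:
        inside = sum(1 for obj in eq_class if obj in target)
        if inside / len(eq_class) >= beta:
            region.extend(eq_class)
    return region

# Three equivalence classes of objects; the target concept (e.g.
# "male swallow") covers 4/5, 1/2 and 1/1 of them respectively.
classes = [{1, 2, 3, 4, 5}, {6, 7}, {8}]
target = {1, 2, 3, 4, 6, 8}
pos = vprs_positive_region(classes, target, beta=0.8)
```

With beta = 1.0 this reduces to the classical lower approximation; lowering beta is what lets VPRS tolerate the measurement noise in small morphometric samples.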

Relevance:

60.00%

Publisher:

Abstract:

This thesis provides a unified and comprehensive treatment of fuzzy neural networks as intelligent controllers. The work was motivated by the need for solid control methodologies capable of coping with the complexity, nonlinearity, interactions and time variance of the processes under control. In addition, the dynamic behavior of such processes is strongly influenced by disturbances and noise, and the processes are characterized by a large degree of uncertainty. It is therefore important to integrate an intelligent component that increases the control system's ability to extract functional relationships from the process and to adapt those relationships to improve control precision, that is, to display learning and reasoning abilities. The objective of this thesis was to develop a self-organizing learning controller for such processes using a combination of fuzzy logic and neural networks. To this end, an on-line, direct fuzzy neural controller with both structural and parameter tuning was developed, using process input-output measurement data and a reference model. A number of practical issues were considered, including dynamic construction of the controller to alleviate the bias/variance dilemma, the universal approximation property, and the requirements of locality and linearity in the parameters. Several important issues in intelligent control were also addressed, such as the overall control scheme, the persistency-of-excitation requirement, and bounded learning rates for overall closed-loop stability. Other issues considered include the dependence of generalization ability and of the optimization methods on the data distribution, and the requirements for on-line learning and the feedback structure of the controller.
Fuzzy-inference-specific issues, such as the influence of the choice of defuzzification method, T-norm operator and membership function on the overall performance of the controller, were also discussed, and the ε-completeness requirement and the use of a fuzzy similarity measure were investigated. The main emphasis of the thesis is on applications to real-world problems such as industrial process control. The applicability of the proposed method is demonstrated through empirical studies on several real-world control problems of industrial complexity, including temperature and number-average molecular weight control in a continuous stirred-tank polymerization reactor, and torsional vibration, eccentricity, hardness and thickness control in cold rolling mills. Compared with traditional linear controllers and a dynamically constructed neural network, the proposed fuzzy neural controller shows the highest promise as an effective approach to such nonlinear multivariable control problems under the strong influence of disturbances and noise on the dynamic process behavior. The applicability of the proposed method beyond the strictly control area was also investigated, in particular for data mining and knowledge elicitation. When compared with a decision tree method and a pruned neural network for data mining, the proposed fuzzy neural network achieves comparable accuracy with a more compact set of rules. Moreover, its performance for classes with low occurrence in the data set is much better than that of the decision tree method, so the proposed fuzzy neural network may be very useful in situations where the important information is contained in a small fraction of the available data.
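The static inference step of such a fuzzy neural controller can be sketched as zero-order Takagi-Sugeno inference: Gaussian membership functions, a product T-norm over the antecedents, and weighted-average defuzzification. The rules below are invented for illustration; the thesis's on-line structural and parameter tuning is not shown.

```python
import math

def gaussian_mf(x, center, width):
    """Gaussian membership function."""
    return math.exp(-((x - center) / width) ** 2)

def fuzzy_output(x, rules):
    """Zero-order Takagi-Sugeno inference: product T-norm over the
    antecedent memberships, weighted-average defuzzification.
    Each rule is (centers, widths, consequent)."""
    num = den = 0.0
    for centers, widths, consequent in rules:
        w = 1.0
        for xi, c, s in zip(x, centers, widths):
            w *= gaussian_mf(xi, c, s)
        num += w * consequent
        den += w
    return num / den

# Two illustrative rules on one input: "low -> 0" and "high -> 1".
rules = [((0.0,), (1.0,), 0.0), ((2.0,), (1.0,), 1.0)]
u = fuzzy_output((1.0,), rules)   # midway between the rule centers
```

Because the output is linear in the consequents and smooth in the membership parameters, both can be tuned by gradient-based learning, which is what makes the fuzzy system trainable as a neural network.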

Relevance:

60.00%

Publisher:

Abstract:

Two typical occupational accidents that occurred in a large company are presented, investigated with the Causal Tree Method (ADC), a method that makes it possible to identify the role played by managerial and work-organization factors in triggering these events. The cases presented reveal the participation, in the genesis of the accidents, of factors such as the temporary and improvised assignment of workers to functions and workstations, the execution of tasks left to the workers' own initiative and judgment, the lack of tools and materials appropriate to the tasks, and failures in the circulation of information, among others. Also analyzed are the indications for use of the method, its potential in terms of prevention, and the implications of the difficulties of application, the need for training and refresher courses, and the considerable time required to investigate each accident.

Relevance:

60.00%

Publisher:

Abstract:

The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make use of the causal tree approach to accident analysis compulsory, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse the experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires the adoption of conventional industrial safety measures (the firm representative of local enterprises); the method is worthwhile if the company's specific technical risks have already largely been eliminated (the firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (the multinational enterprise).

Relevance:

60.00%

Publisher:

Abstract:

We present here the results of a study of 21 work-related accidents that occurred in a Brazilian manufacturing company. The aim was to assess the safety level of the company in order to improve its work accident prevention policy. In the last 6 months of 1992 and 1993, all accidents resulting in 15 days' absence from work, reported for social security purposes, were analyzed using the INRS causal tree method (ADC) and a questionnaire completed on site. Potential risk factors for accidents were identified based on the specific factors highlighted by the ADC. More universal trees were also compiled for the safety assessment. Three hundred and thirty specific accident factors were recorded (mean of 15.71 per accident). This is consistent with there being multiple causes of accidents, rather than the assertion of Brazilian business safety departments that accidents are due to 'dangerous' or 'unsafe' behavior. Introducing the idea of culpability into accidents prevents the implementation of an appropriate information feedback process, essential for effective prevention. However, the large number of accidents related to 'material' (78%) and 'environment' (70%) indicates that working conditions are poor. This shows that the technical risks, mostly due to unsafe machinery and equipment, are not being dealt with. Seventy-five potential accident factors were identified. Of these, 35% were 'organizational', a high proportion for the company studied. Improvisation occurs at all levels, particularly at the organizational level, and is thus a major determinant of entire series of, if not most, accident situations. The poor condition of equipment also plays a major role in accidents, and its effects on safety exacerbate the organizational shortcomings.
The company's safety intervention policy should improve the management of human resources (rules designating particular workers for particular workstations; instructions for the safe operation of machines and equipment; training of operators, etc.) and introduce programs to detect risks and to improve the safety of machines and equipment. We also recommend the establishment of a program to follow the results of any preventive measures adopted.
