864 results for Multicriteria Climatic Classification
Abstract:
Combined micropaleontological and geochemical analyses of the high-sedimentation gravity core M-4G provided new centennial-scale paleoceanographic data for sapropel S1 deposition in the NE Aegean Sea during the Holocene Climatic Optimum. Sapropel layer S1a (10.2–8.0 ka) was deposited in dysoxic to oxic bottom waters characterized by a high abundance of benthic foraminiferal species tolerating surface sediment and/or pore water oxygen depletion (e.g., Chilostomella mediterranensis, Globobulimina affinis), and the presence of Uvigerina mediterranea, which thrives in oxic mesotrophic-eutrophic environments. Preservation of organic matter (OM) is inferred from high organic carbon as well as loliolide and isololiolide contents, while the biomarker record and the abundances of eutrophic planktonic foraminifera document enhanced productivity. High inputs of terrigenous OM are attributed to riverine inputs from the north Aegean borderland. Both the alkenone-based sea surface temperature (SST) and δ18O (G. bulloides) records indicate cooling at 8.2 ka (S1a) and ~7.8 ka (S1 interruption). Sapropel layer S1b (7.7–6.4 ka) is characterized by rather oxic conditions; abundances of foraminiferal species tolerant to oxygen depletion are very low compared with the rise of U. mediterranea. Strongly fluctuating SSTs demonstrate repeated cooling and associated dense water formation, with a major event at 7.4 ka followed by cold spells at 7.0, 6.8, and 6.5 ka. The prominent rise of the carbon preference index within the S1b layer indicates the delivery of less degraded terrestrial OM. The increase of algal biomarkers, labile OM-feeding foraminifera, and eutrophic planktonic species pinpoints enhanced in situ marine productivity, promoted by more efficient vertical convection due to repeated cold events.
The associated contributions of labile marine OM along with fresher terrestrial OM inputs after ~7.7 ka imply sources alternative/additional to the north Aegean riverine borderland sources for the influx of organic matter in the south Limnos Basin, plausibly related to the inflow of highly productive Marmara/Black Sea waters.
Abstract:
In the mid-1990s the North Atlantic subpolar gyre warmed rapidly, which had important climate impacts such as increased hurricane numbers and changes to rainfall over Africa, Europe and North America. Evidence suggests that the warming was largely due to a strengthening of the ocean circulation, particularly the Atlantic Meridional Overturning Circulation (AMOC). Since the mid-1990s, direct and indirect measurements have suggested a decline in the strength of the ocean circulation, which is expected to lead to a reduction in northward heat transport. Here we show that since 2005 a large volume of the upper North Atlantic Ocean has cooled significantly, by approximately 0.45 °C (a heat loss of about 1.5 x 10^22 J), reversing the previous warming trend. By analysing observations and a state-of-the-art climate model, we show that this cooling is consistent with a reduction in the strength of the ocean circulation and heat transport, linked to record low densities in the deep Labrador Sea. The low density in the deep Labrador Sea is primarily due to deep ocean warming since 1995, but a long-term freshening also played a role. The observed upper ocean cooling since 2005 is not consistent with the hypothesis that anthropogenic aerosols directly drive Atlantic temperatures.
Abstract:
The South American low-level jet (SALLJ) east of the Andes is investigated with Regional Climate Model version 3 (RegCM3) simulations of the 2002-2003 austral summer using two convective parameterizations (Grell and Emanuel). The simulated SALLJ is compared with the special observations of the SALLJ Experiment (SALLJEX). Both the Grell and Emanuel schemes adequately simulate the low-level flow over South America, although with some intensity differences: owing to its larger (smaller) convective activity, the Emanuel (Grell) scheme simulates stronger (weaker) low-level winds than the analysis in the tropics and subtropics. The objective criteria of Sugahara (SJ) and Bonner (BJ) were used for LLJ identification. When applied to the observations, both criteria indicate the highest frequency of the SALLJ at Santa Cruz, followed by Mariscal, Trinidad and Asunción. At Mariscal and Asunción, the diurnal cycle indicates that the SJ jet occurs mainly at 12 UTC (morning), while the BJ criterion shows the SALLJ as more homogeneously distributed. Because the observations are concentrated in two of the four daily observing times, no conclusions can be drawn about the diurnal cycle at Santa Cruz and Trinidad. The simulated wind profiles yield a lower-than-observed frequency of the SALLJ under both the SJ and BJ criteria, with fewer events obtained with the BJ. Owing to its stronger simulated winds, the Emanuel scheme produces an equal or greater relative frequency of the SALLJ than the Grell scheme. However, the Grell scheme with the SJ criterion simulates a SALLJ diurnal cycle closer to the observed one. Although some discrepancies between observed and simulated mean vertical profiles of the horizontal wind are noted, there is broad agreement between the composites of the vertical structure of the SALLJ, especially when the SJ criterion is used with the Grell scheme. On the intraseasonal scale, a larger southward displacement of the SALLJ is noted in February and December compared with January.
The Grell and Emanuel schemes simulated this observed oscillation in the low-level flow. However, the spatial pattern and intensity of rainfall and circulation anomalies simulated by the Grell scheme are closer to the analyses than those obtained with the Emanuel scheme.
Abstract:
This paper presents a GIS-based multicriteria flood risk assessment and mapping approach applied to coastal drainage basins where hydrological data are not available. It addresses the risk of several types of processes: coastal inundation (storm surge), river, estuarine and flash floods, in both urban and natural areas, and fords. Based on the causes of these processes, several environmental indicators were selected to build up the risk assessment. Geoindicators include geological-geomorphological properties of Quaternary sedimentary units, water table, drainage basin morphometry, coastal dynamics, beach morphodynamics and microclimatic characteristics. Bioindicators involve coastal plain and low-slope native vegetation categories and two alteration states. Anthropogenic indicators encompass land use properties such as type, occupation density, urban structure type and degree of occupation consolidation. The selected indicators were stored within an expert Geoenvironmental Information System developed for the State of Sao Paulo Coastal Zone (SIIGAL), whose attributes were classified mathematically through deterministic approaches in order to estimate natural susceptibilities (Sn), human-induced susceptibilities (Sa), the return period of rain events (Ri), potential damages (Dp) and the risk classification (R), according to the equation R = (Sn · Sa · Ri) · Dp. Thematic maps were processed automatically within the SIIGAL, in which automata cells ("geoenvironmental management units") aggregating geological-geomorphological and land use/native vegetation categories were the units of classification. The method has been applied to 32 small drainage basins on the northern littoral of the State of Sao Paulo (Brazil), proving very useful for coastal zone public policies, civil defense programs and flood management.
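The risk equation is simple enough to sketch directly. A minimal, hypothetical Python sketch, assuming all four indicators have already been normalized to comparable scores (the paper derives them from the SIIGAL attribute classifications; the values below are invented):

```python
# Hypothetical sketch of the risk equation R = (Sn * Sa * Ri) * Dp, assuming
# each indicator has been normalized to a common [0, 1] score scale.
def flood_risk(sn: float, sa: float, ri: float, dp: float) -> float:
    """Combine natural susceptibility (Sn), human-induced susceptibility (Sa),
    a rain-event return-period score (Ri) and potential damage (Dp)."""
    return (sn * sa * ri) * dp

# One management-unit cell: moderate susceptibilities, frequent rain, high damage.
r = flood_risk(sn=0.6, sa=0.5, ri=0.8, dp=0.9)
```

Because the combination is multiplicative, a near-zero value for any single factor (e.g. negligible potential damage) drives the whole risk score toward zero, which matches the intent of classifying risk rather than susceptibility alone.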
Abstract:
Epidendrum L. is the largest genus of Orchidaceae in the Neotropical region; its impressive morphological diversification complicates the delimitation of both infrageneric and interspecific boundaries. In this study, we review infrageneric boundaries within the subgenus Amphiglottium and try to contribute to the understanding of morphological diversification and taxa delimitation within this group. We tested the monophyly of subgenus Amphiglottium sect. Amphiglottium, expanding previous phylogenetic investigations, and reevaluated previously proposed infrageneric classifications. Sequence data from the trnL-trnF region were analyzed under both parsimony and maximum likelihood criteria. AFLP markers were also obtained and analyzed with phylogenetic and principal coordinate analyses. Additionally, we obtained chromosome numbers for representative species within the group. The results strengthen the monophyly of the subgenus Amphiglottium but do not support the current classification system proposed by previous authors. Only section Tuberculata comprises a well-supported monophyletic group, with sections Carinata and Integra not supported. Rather than morphology, biogeographical and ecological patterns are reflected in the phylogenetic signal of this group. This study also confirms the large variability of chromosome numbers within the subgenus Amphiglottium (from 2n = 24 to 2n = 240), suggesting that polyploidy and hybridization are probably important mechanisms of speciation within the group.
Abstract:
The increase in biodiversity from high to low latitudes is a widely recognized biogeographical pattern. According to the latitudinal gradient hypothesis (LGH), this pattern was shaped by differential effects of Late Quaternary climatic changes across a latitudinal gradient. Here, we evaluate the effects of climatic changes across a tropical latitudinal gradient and its implications to diversification of an Atlantic Forest (AF) endemic passerine. We studied the intraspecific diversification and historical demography of Sclerurus scansor, based on mitochondrial (ND2, ND3 and cytb) and nuclear (FIB7) gene sequences. Phylogenetic analyses recovered three well-supported clades associated with distinct latitudinal zones. Coalescent-based methods were applied to estimate divergence times and changes in effective population sizes. Estimates of divergence times indicate that intraspecific diversification took place during Middle-Late Pleistocene. Distinct demographic scenarios were identified, with the southern lineage exhibiting a clear signature of demographic expansion, while the central one remained more stable. The northern lineage, contrasting with LGH predictions, exhibited a clear sign of a recent bottleneck. Our results suggest that different AF regions reacted distinctly, even in opposite ways, under the same climatic period, producing simultaneously favourable scenarios for isolation and contact among populations.
Abstract:
Predictive performance evaluation is a fundamental issue in the design, development, and deployment of classification systems. Because predictive performance evaluation is a multidimensional problem, single scalar summaries such as error rate, although quite convenient due to their simplicity, can seldom cover all the aspects that a complete and reliable evaluation must consider. For this reason, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of such methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing these aspects to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to select a suitable graphical method for a given task, it is crucial to identify its strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods within the same framework, we hope this paper may shed some light on deciding which methods are more suitable to use in different situations.
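As an illustration of the kind of graphical method such surveys cover, here is a minimal, library-free sketch of an ROC curve, probably the best-known of these multidimensional depictions. The labels and scores are invented toy data; a real evaluation would plot the returned points:

```python
def roc_points(labels, scores):
    """Return (FPR, TPR) points obtained by sweeping the decision threshold
    from the highest score down. Assumes both classes are present."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)            # number of positives (label == 1)
    neg = len(labels) - pos      # number of negatives
    tp = fp = 0
    points = [(0.0, 0.0)]
    for _, label in pairs:
        if label == 1:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))  # (false-positive rate, true-positive rate)
    return points

curve = roc_points([1, 0, 1, 0], [0.9, 0.8, 0.7, 0.3])
```

Each point is one threshold's trade-off between the two error aspects, which is exactly the multidimensional information a single error rate collapses away.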
Abstract:
This work proposes and discusses an approach for inducing Bayesian classifiers aimed at balancing the trade-off between the precise probability estimates produced by time-consuming unrestricted Bayesian networks and the computational efficiency of Naive Bayes (NB) classifiers. The proposed approach is based on the fundamental principles of heuristic-search Bayesian network learning. The Markov blanket concept, as well as a proposed "approximate Markov blanket", is used to reduce the number of nodes that form the Bayesian network to be induced from data. Consequently, the usually high computational cost of heuristic-search learning algorithms can be lessened, while Bayesian network structures better than NB can be achieved. The resulting algorithms, called DMBC (Dynamic Markov Blanket Classifier) and A-DMBC (Approximate DMBC), are empirically assessed in twelve domains that illustrate scenarios of particular interest. The obtained results are compared with Naive Bayes and Tree Augmented Network (TAN) classifiers, and confirm that both proposed algorithms can provide good classification accuracies and better probability estimates than NB and TAN, while being more computationally efficient than the widely used K2 algorithm.
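The Markov blanket concept the DMBC algorithms build on can be shown compactly. A sketch (our illustration, not the authors' implementation): in a Bayesian network a node's blanket is its parents, its children, and its children's other parents, and conditioning on the blanket renders the node independent of everything else. The network is given here as a map from each node to its parent set:

```python
def markov_blanket(node, parents):
    """Parents, children, and co-parents of `node` in a Bayesian network
    represented as {node: set_of_parent_nodes}."""
    children = {n for n, ps in parents.items() if node in ps}
    coparents = {p for c in children for p in parents[c]} - {node}
    return parents[node] | children | coparents

# Toy network: A -> C <- B, C -> D.
net = {"A": set(), "B": set(), "C": {"A", "B"}, "D": {"C"}}
```

Restricting structure search to (approximate) blankets is what trims the candidate node set and keeps the heuristic search tractable.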
Abstract:
The substitution of missing values, also called imputation, is an important data preparation task for many domains. Ideally, the substitution of missing values should not insert biases into the dataset. This aspect has usually been assessed by measures of the prediction capability of imputation methods. Such measures assume the simulation of missing entries for some attributes whose values are actually known. These artificially missing values are imputed and then compared with the original values. Although this evaluation is useful, it does not allow the influence of imputed values on the ultimate modelling task (e.g. classification) to be inferred. We argue that imputation cannot be properly evaluated apart from the modelling task. Thus, alternative approaches are needed. This article elaborates on the influence of imputed values in classification. In particular, a practical procedure for estimating the inserted bias is described. As an additional contribution, we have used this procedure to empirically illustrate the performance of three imputation methods (majority, naive Bayes and Bayesian networks) on three datasets. Three classifiers (decision tree, naive Bayes and nearest neighbours) have been used as modelling tools in our experiments. The achieved results illustrate a variety of situations that can take place in data preparation practice.
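The argument that imputation should be judged by its downstream effect, not only by value-reconstruction error, can be sketched with stand-in components. Majority imputation and a 1-nearest-neighbour classifier below are illustrative choices, not the article's exact setup:

```python
from collections import Counter

def majority_impute(column):
    """Stand-in imputer: fill missing entries (None) with the majority value."""
    known = [v for v in column if v is not None]
    fill = Counter(known).most_common(1)[0][0]
    return [fill if v is None else v for v in column]

def nn_predict(train_x, train_y, x):
    """Stand-in classifier: 1-nearest-neighbour by Hamming distance."""
    best = min(range(len(train_x)),
               key=lambda i: sum(a != b for a, b in zip(train_x[i], x)))
    return train_y[best]

def inserted_bias(complete_x, imputed_x, train_y, test_x):
    """Sketch of the bias estimate: train once on the complete data and once on
    the artificially-missing-then-imputed data, and measure how often the two
    resulting models disagree on test predictions."""
    disagree = sum(nn_predict(complete_x, train_y, x) != nn_predict(imputed_x, train_y, x)
                   for x in test_x)
    return disagree / len(test_x)
```

A disagreement rate near zero suggests the imputation left the classification task essentially undisturbed, even if individual imputed values differ from the true ones.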
Abstract:
Credit scoring modelling comprises one of the leading formal tools for supporting the granting of credit. Its core objective is the generation of a score by means of which potential clients can be ranked by their probability of default. A critical factor is whether a credit scoring model is accurate enough to classify clients correctly as good or bad payers. In this context the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by obtaining the predicted values from models fitted to several replicated datasets and then combining them into a single predictive classification in order to improve accuracy. In this paper we propose a new bagging-type variant, which we call poly-bagging, consisting of combining predictors over a succession of resamplings. The study is motivated by credit scoring modelling. The proposed poly-bagging procedure was applied to several artificial datasets and to a real credit-granting dataset, up to three successions of resamplings. We observed better classification accuracy for the two-bagged and three-bagged models in all considered setups. These results strongly indicate that the poly-bagging approach may improve modelling performance measures while keeping a flexible and straightforward bagging-type structure that is easy to implement. (C) 2011 Elsevier Ltd. All rights reserved.
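The bagging building block can be sketched in a few lines; on our reading of the abstract (not the authors' code), poly-bagging would feed each round's combined predictions into a further round of resampling. A trivial majority-class learner stands in for a real credit scoring classifier:

```python
import random

def bootstrap(data, rng):
    # One bootstrap replicate: sample len(data) items with replacement.
    return [rng.choice(data) for _ in data]

def majority(labels):
    # Majority vote; ties broken deterministically by sorted order.
    return max(sorted(set(labels)), key=labels.count)

def bagged_prediction(labels, n_replicates=25, seed=0):
    """One bagging round: fit the stand-in classifier (majority class) on each
    bootstrap replicate, then combine the predictions by voting."""
    rng = random.Random(seed)
    votes = [majority(bootstrap(labels, rng)) for _ in range(n_replicates)]
    return majority(votes)
```

Poly-bagging's "succession of resamplings" would wrap another such round around the voted outputs; the appeal the paper reports is that this stays as easy to implement as plain bagging.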
Abstract:
Extending our previous work `Fields on the Poincaré group and quantum description of orientable objects` (Gitman and Shelepin 2009 Eur. Phys. J. C 61 111-39), we consider here a classification of orientable relativistic quantum objects in 3 + 1 dimensions. In such a classification, one uses a maximal set of ten commuting operators (generators of left and right transformations) in the space of functions on the Poincaré group. In addition to the usual six quantum numbers related to external symmetries (given by left generators), there appear additional quantum numbers related to internal symmetries (given by right generators). Spectra of internal and external symmetry operators are interrelated, which, however, does not contradict the Coleman-Mandula no-go theorem. We believe that the proposed approach can be useful for the description of elementary spinning particles considered as orientable objects. In particular, it gives a group-theoretical interpretation of some facts of the existing phenomenological classification of spinning particles.
Abstract:
In this paper, we present a study of a deterministic partially self-avoiding walk (tourist walk), which provides a novel method for texture feature extraction. The method is able to explore an image on all scales simultaneously. Experiments were conducted using different dynamics of the tourist walk. A new strategy, based on histograms, to extract information from its joint probability distribution is presented. The promising results are discussed and compared with the best-known methods for texture description reported in the literature. (C) 2009 Elsevier Ltd. All rights reserved.
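A minimal sketch of the underlying walk on a 1-D sequence of values (the paper works on image pixels; the names and the memory convention here are our illustrative choices): at each step the walker moves to the site whose value is nearest the current one, excluding the last mu visited sites.

```python
def tourist_walk(values, start, mu, steps):
    """Deterministic partially self-avoiding walk with memory mu on a 1-D signal."""
    path = [start]
    for _ in range(steps):
        forbidden = set(path[-mu:])  # the mu most recently visited sites
        candidates = [i for i in range(len(values)) if i not in forbidden]
        if not candidates:
            break
        # Move to the site whose value is closest to the current one
        # (ties broken by lowest index), making the dynamics deterministic.
        path.append(min(candidates,
                        key=lambda i: (abs(values[i] - values[path[-1]]), i)))
    return path

# With memory 1 the walk quickly falls into a short periodic attractor.
walk = tourist_walk([0, 5, 1, 9], start=0, mu=1, steps=3)
```

The texture descriptors in such methods are then built from statistics of these trajectories (e.g. histograms of transient length and attractor period) for varying memory mu.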
Abstract:
Shape provides some of the most relevant information about an object, making it one of the most important visual attributes used to characterize objects. This paper introduces a novel approach for shape characterization, which combines modeling the shape as a complex network and analyzing its complexity in a dynamic evolution context. Descriptors computed through this approach prove efficient in shape characterization, incorporating many desirable characteristics such as scale and rotation invariance. Experiments using two different shape databases (an artificial shape database and a leaf shape database) are presented in order to evaluate the method, and its results are compared with traditional shape analysis methods found in the literature. (C) 2009 Published by Elsevier B.V.
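The modeling step can be sketched as follows (our illustrative reading, not the authors' code): contour points become network nodes, an edge joins two points whose normalized distance falls below a threshold, and sweeping the threshold provides the dynamic evolution from which degree-based descriptors are read off.

```python
import math

def mean_degree(points, threshold):
    """Mean node degree of the network built by linking point pairs whose
    distance, normalized by the largest pairwise distance, is <= threshold."""
    dists = [[math.dist(p, q) for q in points] for p in points]
    dmax = max(max(row) for row in dists)
    degrees = [sum(1 for j, d in enumerate(row) if j != i and d / dmax <= threshold)
               for i, row in enumerate(dists)]
    return sum(degrees) / len(degrees)

# A descriptor vector samples the network evolution along the threshold sweep.
square = [(0, 0), (0, 1), (1, 0), (1, 1)]
descriptor = [mean_degree(square, t) for t in (0.5, 0.8, 1.0)]
```

Normalizing by the largest pairwise distance is what gives such descriptors their scale invariance, and using only pairwise distances makes them rotation invariant.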
Abstract:
Unlike theoretical scale-free networks, most real networks exhibit multi-scale behavior, with nodes structured into different types of functional groups and communities. While the majority of approaches for classifying nodes in a complex network have relied on local measurements of the topology/connectivity around each node, valuable information about node functionality can be obtained by concentric (or hierarchical) measurements. This paper extends previous methodologies based on concentric measurements by studying the possibility of using agglomerative clustering methods to obtain a set of functional groups of nodes, considering the nodes of a particular institutional collaboration network that includes several known communities (departments of the University of Sao Paulo). Among the interesting findings, we emphasize the scale-free nature of the obtained network, as well as the identification of different patterns of authorship emerging from different areas (e.g. human and exact sciences). Another interesting result concerns the relatively uniform distribution of hubs along concentric levels, in contrast to the non-uniform pattern found in theoretical scale-free networks such as the BA model. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
The latest version of CATH (class, architecture, topology, homology) (version 3.2), released in July 2008 (http://www.cathdb.info), contains 114,215 domains, 2,178 homologous superfamilies and 1,110 fold groups. We have assigned 20,330 new domains, 87 new homologous superfamilies and 26 new folds since CATH release version 3.1. A total of 28,064 new domains have been assigned since our NAR 2007 database publication (CATH version 3.0). The CATH website has been completely redesigned and includes more comprehensive documentation. We have revisited the CATH architecture level as part of the development of a `Protein Chart` and present information on the population of each architecture. The CATHEDRAL structure comparison algorithm has been improved and used to characterize structural diversity in CATH superfamilies and structural overlaps between superfamilies. Although the majority of superfamilies in CATH are not structurally diverse and do not overlap significantly with other superfamilies, ~4% of superfamilies are very diverse, and these are the superfamilies that are most highly populated in both the PDB and the genomes. Information on the degree of structural diversity in each superfamily and structural overlaps between superfamilies can now be downloaded from the CATH website.