77 results for thematic map making
Abstract:
Land cover classification is a key research field in remote sensing and land change science, as thematic maps derived from remotely sensed data have become the basis for analyzing many socio-ecological issues. However, land cover classification remains a difficult task, and it is especially challenging in heterogeneous tropical landscapes, where such maps are nonetheless of great importance. The present study aims to establish an efficient classification approach to accurately map all broad land cover classes in a large, heterogeneous tropical area of Bolivia, as a basis for further studies (e.g., land cover-land use change). Specifically, we compare the performance of parametric (maximum likelihood), non-parametric (k-nearest neighbour and four different support vector machines - SVM), and hybrid classifiers, using both hard and soft (fuzzy) accuracy assessments. In addition, we test whether the inclusion of a textural index (homogeneity) in the classifications improves their performance. We classified Landsat imagery for two dates corresponding to dry and wet seasons and found that non-parametric classifiers, and particularly SVM classifiers, outperformed both parametric and hybrid classifiers. We also found that the use of the homogeneity index along with reflectance bands significantly increased the overall accuracy of all the classifications, but particularly of the SVM algorithms. We observed that the improvements in producer's and user's accuracies from including the homogeneity index differed across land cover classes. Early-growth/degraded forests, pastures, grasslands and savanna were the classes most improved, especially with the SVM radial basis function and SVM sigmoid classifiers, though with both classifiers all land cover classes were mapped with producer's and user's accuracies of around 90%.
Our approach seems very well suited to accurately map land cover in tropical regions, thus having the potential to contribute to conservation initiatives, climate change mitigation schemes such as REDD+, and rural development policies.
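The homogeneity index used here is the standard grey-level co-occurrence matrix (GLCM) texture measure, sum of p(i, j) / (1 + |i - j|). The abstract does not give the authors' implementation details (window size, offsets, quantization), so the following is only a minimal sketch of the index on a quantized image patch, with made-up data and a single horizontal offset:

```python
import numpy as np

def glcm_homogeneity(img, dx=1, dy=0, levels=8):
    """GLCM homogeneity: sum of p(i, j) / (1 + |i - j|) over co-occurring
    grey-level pairs at offset (dx, dy). High for smooth patches."""
    img = np.asarray(img)
    glcm = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[img[y, x], img[y + dy, x + dx]] += 1
    p = glcm / glcm.sum()                      # normalize to joint probabilities
    i, j = np.indices((levels, levels))
    return float((p / (1.0 + np.abs(i - j))).sum())

uniform = np.zeros((8, 8), dtype=int)          # perfectly homogeneous patch
varying = np.arange(64).reshape(8, 8) % 8      # grey level changes at every pixel
print(glcm_homogeneity(uniform))               # 1.0: maximal homogeneity
print(glcm_homogeneity(varying))               # 0.5: every pair differs by one level
```

In a classification like the one above, this value (computed per pixel over a moving window) would simply be stacked as an extra band alongside the reflectance bands.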
Abstract:
Background: The analysis of the promoter sequence of genes with similar expression patterns is a basic tool to annotate common regulatory elements. Multiple sequence alignments are the basis of most comparative approaches. The characterization of regulatory regions from co-expressed genes at the sequence level, however, does not yield satisfactory results on many occasions, as promoter regions of genes sharing similar expression programs often do not show nucleotide sequence conservation.
Results: In a recent approach to circumvent this limitation, we proposed to align the maps of predicted transcription factors (referred to as TF-maps) instead of the nucleotide sequence of two related promoters, taking into account the label of the corresponding factor and the position in the primary sequence. We have now extended the basic algorithm to permit multiple promoter comparisons using the progressive alignment paradigm. In addition, non-collinear conservation blocks can now be identified in the resulting alignments. We have optimized the parameters of the algorithm on a small, but well-characterized, collection of human-mouse-chicken-zebrafish orthologous gene promoters.
Conclusion: Results on this dataset indicate that TF-map alignments are able to detect high-level regulatory conservation in the promoter and 3'UTR gene regions, which cannot be detected by typical sequence alignments. Three particular examples are introduced here to illustrate the power of multiple TF-map alignments to characterize conserved regulatory elements in the absence of sequence similarity. We consider that this kind of approach can be extremely useful in the future to annotate potential transcription factor binding sites in sets of co-regulated genes from high-throughput expression experiments.
Abstract:
We address the problem of comparing and characterizing the promoter regions of genes with similar expression patterns. This remains a challenging problem in sequence analysis, because often the promoter regions of co-expressed genes do not show discernible sequence conservation. In our approach, thus, we have not directly compared the nucleotide sequence of promoters. Instead, we have obtained predictions of transcription factor binding sites, annotated the predicted sites with the labels of the corresponding binding factors, and aligned the resulting sequences of labels—to which we refer here as transcription factor maps (TF-maps). To obtain the global pairwise alignment of two TF-maps, we have adapted an algorithm initially developed to align restriction enzyme maps. We have optimized the parameters of the algorithm in a small, but well-curated, collection of human–mouse orthologous gene pairs. Results in this dataset, as well as in an independent much larger dataset from the CISRED database, indicate that TF-map alignments are able to uncover conserved regulatory elements, which cannot be detected by the typical sequence alignments.
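The authors adapt a restriction-map alignment algorithm with position-aware scoring; the sketch below is a much simpler stand-in, a plain Needleman-Wunsch global alignment over factor labels that ignores map positions and uses invented scores. It only illustrates the core idea of aligning sequences of TF labels instead of nucleotides:

```python
def align_labels(a, b, match=1, gap=-1, mismatch=-1):
    """Global alignment score of two TF label sequences (Needleman-Wunsch).
    Labels are compared for exact equality; positions are ignored here."""
    n, m = len(a), len(b)
    S = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        S[i][0] = i * gap
    for j in range(1, m + 1):
        S[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            S[i][j] = max(S[i - 1][j - 1] + s,   # align label i with label j
                          S[i - 1][j] + gap,     # gap in b
                          S[i][j - 1] + gap)     # gap in a
    return S[n][m]

# Two hypothetical promoter TF-maps sharing SP1 and TBP sites
print(align_labels(["SP1", "NFKB", "TBP"], ["SP1", "TBP"]))  # 1: two matches, one gap
```

A faithful TF-map alignment would additionally penalize discrepancies between the mapped positions of matched sites, which is what the restriction-enzyme-map algorithm contributes.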
Abstract:
Either calorie restriction, loss of function of the nutrient-dependent PKA or TOR/SCH9 pathways, or activation of stress defences improves longevity in different eukaryotes. However, the molecular links between glucose depletion, nutrient-dependent pathways and stress responses are unknown. Here we show that either calorie restriction or inactivation of nutrient-dependent pathways induces life-span extension in fission yeast, and that such effect is dependent on the activation of the stress-dependent Sty1 MAP kinase. During transition to stationary phase in glucose-limiting conditions, Sty1 becomes activated and triggers a transcriptional stress program, whereas such activation does not occur under glucose-rich conditions. Deletion of the genes coding for the SCH9-homologue Sck2 or the Pka1 kinases, or mutations leading to constitutive activation of the Sty1 stress pathway increase life span under glucose-rich conditions, and importantly such beneficial effects depend ultimately on Sty1. Furthermore, cells lacking Pka1 display enhanced oxygen consumption and Sty1 activation under glucose-rich conditions. We conclude that calorie restriction favours oxidative metabolism, reactive oxygen species production and Sty1 MAP kinase activation, and this stress pathway favours life-span extension.
Abstract:
Recently, there has been an increased interest in the neural mechanisms underlying perceptual decision making. However, the effect of neuronal adaptation in this context has not yet been studied. We begin our study by investigating how adaptation can bias perceptual decisions. We considered behavioral data from an experiment on high-level adaptation-related aftereffects in a perceptual decision task with ambiguous stimuli in humans. To understand the driving force behind the perceptual decision process, a biologically inspired cortical network model was used. Two theoretical scenarios arose for explaining the perceptual switch from the category of the adaptor stimulus to the opposite, non-adapted one: a noise-driven transition due to the probabilistic spike times of neurons, and an adaptation-driven transition due to afterhyperpolarization currents. With increasing levels of neural adaptation, the system shifts from a noise-driven to an adaptation-driven mode. The behavioral results show that the underlying model is not just a bistable model, as is usual in the decision-making modeling literature, but that neuronal adaptation is high and therefore the working point of the model is in the oscillatory regime. Using the same model parameters, we studied the effect of neural adaptation in a perceptual decision-making task where the same ambiguous stimulus was presented with and without a preceding adaptor stimulus. We find that, for different levels of sensory evidence favoring one of the two interpretations of the ambiguous stimulus, higher levels of neural adaptation lead to quicker decisions, contributing to a speed-accuracy trade-off.
Abstract:
Purpose: The objective of this study is to investigate the feasibility of detecting and quantifying 3D cerebrovascular wall motion from a single 3D rotational x-ray angiography (3DRA) acquisition within a clinically acceptable time, and of computing, from the estimated motion field, quantities for further biomechanical modeling of the cerebrovascular wall. Methods: The whole motion cycle of the cerebral vasculature is modeled using a 4D B-spline transformation, which is estimated from a 4D to 2D + t image registration framework. The registration is performed by optimizing a single similarity metric between the entire 2D + t measured projection sequence and the corresponding forward projections of the deformed volume at their exact time instants. The joint use of two acceleration strategies, together with their implementation on graphics processing units, is also proposed so as to reach computation times close to clinical requirements. For further characterizing vessel wall properties, an approximation of the wall thickness changes is obtained through a strain calculation. Results: Evaluation on in silico and in vitro pulsating phantom aneurysms demonstrated an accurate estimation of wall motion curves. In general, the error was below 10% of the maximum pulsation, even when a substantially inhomogeneous intensity pattern was present. Experiments on in vivo data provided realistic aneurysm and vessel wall motion estimates, whereas in regions where motion was neither visible nor anatomically possible, no motion was detected. The use of the acceleration strategies enabled completing the estimation process for one entire cycle in 5-10 min without degrading the overall performance. The strain map extracted from our motion estimation provided a realistic deformation measure of the vessel wall.
Conclusions: The authors' technique has demonstrated that it can provide accurate and robust 4D estimates of cerebrovascular wall motion within a clinically acceptable time, although it has to be applied to a larger patient population before possible wide application to routine endovascular procedures. In particular, for the first time, this feasibility study has shown that in vivo cerebrovascular motion can be obtained intraprocedurally from a 3DRA acquisition. Results have also shown the potential of performing strain analysis using this imaging modality, thus making possible future modeling of biomechanical properties of the vascular wall.
Abstract:
This article presents a formal model of policy decision-making in an institutional framework of separation of powers in which the main actors are pivotal political parties with voting discipline. The basic model previously developed from pivotal politics theory for the analysis of the United States lawmaking is here modified to account for policy outcomes and institutional performances in other presidential regimes, especially in Latin America. Legislators' party indiscipline at voting and multi-partism appear as favorable conditions to reduce the size of the equilibrium set containing collectively inefficient outcomes, while a two-party system with strong party discipline is most prone to produce 'gridlock', that is, stability of socially inefficient policies. The article provides a framework for analysis which can induce significant revisions of empirical data, especially regarding the effects of situations of (newly defined) unified and divided government, different decision rules, the number of parties and their discipline. These implications should be testable and may inspire future analytical and empirical work.
Abstract:
We present a new method for constructing exact distribution-free tests (and confidence intervals) for variables that can generate more than two possible outcomes. This method separates the search for an exact test from the goal to create a non-randomized test. Randomization is used to extend any exact test relating to means of variables with finitely many outcomes to variables with outcomes belonging to a given bounded set. Tests in terms of variance and covariance are reduced to tests relating to means. Randomness is then eliminated in a separate step. This method is used to create confidence intervals for the difference between two means (or variances) and tests of stochastic inequality and correlation.
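The randomization step can be made concrete: any outcome x in a bounded set, rescaled to [0, 1], can be replaced by a Bernoulli draw with success probability x. This preserves the mean, so an exact test for binary data extends to bounded variables. A minimal sketch of that reduction (the helper name and the data are our own, not the paper's):

```python
import random

def to_bernoulli(xs, rng):
    """Replace each bounded outcome x in [0, 1] by a Bernoulli(x) draw.
    Since E[B | X = x] = x, the draws have the same mean as the X's,
    so an exact test for Bernoulli means applies to the original data."""
    return [1 if rng.random() < x else 0 for x in xs]

rng = random.Random(0)
xs = [0.2, 0.9, 0.5, 0.7] * 2500      # 10,000 bounded outcomes, mean 0.575
bs = to_bernoulli(xs, rng)
print(sum(bs) / len(bs))              # sample mean of draws, close to 0.575
```

The paper's further contribution, which this sketch omits, is eliminating the introduced randomness in a separate step so that the final test is non-randomized.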
Abstract:
An important problem in descriptive and prescriptive research in decision making is to identify regions of rationality, i.e., the areas for which heuristics are and are not effective. To map the contours of such regions, we derive probabilities that heuristics identify the best of m alternatives (m > 2) characterized by k attributes or cues (k > 1). The heuristics include a single variable (lexicographic), variations of elimination-by-aspects, equal weighting, hybrids of the preceding, and models exploiting dominance. We use twenty simulated and four empirical datasets for illustration. We further provide an overview by regressing heuristic performance on factors characterizing environments. Overall, sensible heuristics generally yield similar choices in many environments. However, selection of the appropriate heuristic can be important in some regions (e.g., if there is low inter-correlation among attributes/cues). Since our work assumes a hit or miss decision criterion, we conclude by outlining extensions for exploring the effects of different loss functions.
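Two of the heuristics compared above can each be stated in a few lines. The cue values and importance ordering below are invented for illustration; they are chosen so that the two rules disagree, which is exactly the kind of region-dependence the paper maps:

```python
def lexicographic(alts, cue_order):
    """Choose by the most important cue; break ties with the next cue."""
    best = list(alts)
    for c in cue_order:
        top = max(a[c] for a in best)
        best = [a for a in best if a[c] == top]
        if len(best) == 1:
            break
    return best[0]

def equal_weighting(alts):
    """Choose the alternative with the highest unweighted cue sum."""
    return max(alts, key=sum)

alts = [(3, 1), (2, 5), (3, 2)]       # three alternatives, two cues each
print(lexicographic(alts, [0, 1]))    # (3, 2): ties on cue 0, wins on cue 1
print(equal_weighting(alts))          # (2, 5): highest total score
```

In environments with high inter-correlation among cues, such rules tend to pick the same alternative; the divergence above is what makes heuristic selection matter in some regions.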
Abstract:
In spite of increasing representation of women in politics, little is known about their impact on policies. Comparing outcomes of parliaments with different shares of female members does not identify their causal impact because of possible differences in the underlying electorate. This paper uses a unique data set on voting decisions to shed new light on gender gaps in policy making. Our analysis focuses on Switzerland, where all citizens can directly decide on a broad range of policies in referendums and initiatives. We show that there are large gender gaps in the areas of health, environmental protection, defense spending and welfare policy, which typically persist even conditional on socio-economic characteristics. We also find that female policy makers have a substantial effect on the composition of public spending, but a small effect on the overall size of government.
Abstract:
Much of empirical economics involves regression analysis. However, does the presentation of results affect economists' ability to make inferences for decision-making purposes? In a survey, 257 academic economists were asked to make probabilistic inferences on the basis of the outputs of a regression analysis presented in a standard format. Questions concerned the distribution of the dependent variable conditional on known values of the independent variable. However, many respondents underestimated uncertainty by failing to take into account the standard deviation of the estimated residuals. The addition of graphs did not substantially improve inferences. On the other hand, when only graphs were provided (i.e., with no statistics), respondents were substantially more accurate. We discuss implications for improving practice in reporting results of regression analyses.
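The mistake described above, ignoring the residual standard deviation when predicting the dependent variable, is easy to make concrete. With synthetic data (the coefficients and noise level below are invented), the width of a rough 95% interval for a new observation is driven by the residual SD s, not only by the coefficient standard errors:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(scale=1.5, size=200)   # true residual SD = 1.5

slope, intercept = np.polyfit(x, y, 1)                # OLS fit (degree-1 polynomial)
resid = y - (intercept + slope * x)
s = resid.std(ddof=2)                                 # residual standard deviation

x_new = 5.0
y_hat = intercept + slope * x_new
lo, hi = y_hat - 2 * s, y_hat + 2 * s   # rough 95% interval for a NEW y at x_new
print(round(s, 2), round(lo, 2), round(hi, 2))
```

An answer based only on the standard errors of the coefficients would yield a far narrower interval here, which is precisely the underestimation of uncertainty the survey documents. (This sketch itself ignores the smaller parameter-uncertainty term for brevity.)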
Abstract:
We present an exact test for whether two random variables that have known bounds on their support are negatively correlated. The alternative hypothesis is that they are not negatively correlated. No assumptions are made on the underlying distributions. We show by example that the Spearman rank correlation test, as the competing exact test of correlation in nonparametric settings, rests on an additional assumption on the data-generating process without which it is not valid as a test for correlation. We then show how to test for the significance of the slope in a linear regression analysis that involves a single independent variable and where outcomes of the dependent variable belong to a known bounded set.
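For reference, the competing Spearman rank statistic mentioned above is straightforward to compute. This sketch (with invented data, and assuming no ties) uses the classic closed form rather than any particular library:

```python
def spearman_rho(xs, ys):
    """Spearman rank correlation for tie-free data:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), d_i = rank difference."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

print(spearman_rho([1, 2, 3, 4], [10, 20, 30, 40]))   # 1.0: perfectly concordant
print(spearman_rho([1, 2, 3, 4], [40, 30, 20, 10]))   # -1.0: perfectly discordant
```

The abstract's point is that using this statistic as an exact test of correlation requires an extra assumption on the data-generating process; the rank computation itself, as shown, is assumption-free.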
Abstract:
We investigate whether the gender composition of teams affects their economic performance. We study a large business game, played in groups of three, where each group takes the role of a general manager. There are two parallel competitions, one involving undergraduates and the other involving MBAs. Our analysis shows that teams formed by three women are significantly outperformed by any other gender combination, both at the undergraduate and MBA levels. Looking across the performance distribution, we find that for undergraduates, three-women teams are outperformed throughout, but by as much as 10pp at the bottom and by only 1pp at the top. For MBAs, at the top, the best-performing group is two men and one woman. The differences in performance are explained by differences in decision-making. We observe that three-women teams are less aggressive in their pricing strategies, invest less in R&D, and invest more in social sustainability initiatives than any other gender combination. Finally, we find support for the hypothesis that it is poor work dynamics among the three-women teams that drives the results.
Abstract:
This paper shows that information effects per se are not responsible for the Giffen goods anomaly affecting competitive traders' demands in multi-asset, noisy rational expectations equilibrium models. The role that information plays in traders' strategies also matters. In a market with risk-averse, uninformed traders, informed agents have a dual motive for trading: speculation and market making. While speculation entails using prices to assess the effect of private signal error terms, market making requires employing them to disentangle noise traders' effects in traders' aggregate orders. In a correlated environment, this complicates a trader's signal-extraction problem and may generate upward-sloping demand curves. Assuming either (i) that competitive, risk-neutral market makers price the assets, or (ii) that the risk tolerance coefficient of uninformed traders grows without bound, removes the market-making component from informed traders' demands, rendering them well behaved in prices.