932 results for "spatial information processing theories"


Relevance: 100.00%

Abstract:

Soil surveys are the main source of spatial information on soils and have a range of applications, mainly in agriculture. The continuity of this activity has, however, been severely compromised, mainly by a lack of governmental funding. The purpose of this study was to evaluate the feasibility of two different classifiers (artificial neural networks and a maximum likelihood algorithm) for predicting soil classes in the northwest of the state of Rio de Janeiro. Terrain attributes such as elevation, slope, aspect, plan curvature and compound topographic index (CTI), together with indices of clay minerals, iron oxide and the Normalized Difference Vegetation Index (NDVI) derived from Landsat 7 ETM+ imagery, were used as discriminating variables. The two classifiers were trained and validated for each soil class using 300 and 150 samples, respectively, representing the characteristics of these classes in terms of the discriminating variables. According to the statistical tests, the accuracy of the classifier based on artificial neural networks (ANNs) was greater than that of the classic Maximum Likelihood Classifier (MLC). Comparison with 126 reference points showed that the ANN map (73.81% correct) was superior to the MLC map (57.94%). The main errors of the two classifiers were caused by: a) the geological heterogeneity of the area, coupled with problems in the geological map; b) the depth of lithic contact and/or rock exposure; and c) problems with the environmental correlation model used, owing to the polygenetic nature of the soils. This study confirms that using terrain attributes together with remote sensing data in an ANN approach can facilitate soil mapping in Brazil, primarily because of the availability of low-cost remote sensing data and the ease with which terrain attributes can be obtained.
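As a rough illustration of this comparison (a sketch only, not the authors' pipeline: the data, feature layout, and the scikit-learn models are our assumptions), a per-class Gaussian maximum likelihood classifier corresponds to quadratic discriminant analysis, which can be benchmarked against a small feed-forward network:

```python
# Sketch: ANN vs. Gaussian maximum likelihood classifier on terrain/spectral
# predictors. Hypothetical layout: one row per sample, 8 columns (elevation,
# slope, aspect, plan curvature, CTI, clay index, iron-oxide index, NDVI);
# y = soil class labels. Random data stands in for the real survey samples.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(300, 8)), rng.integers(0, 4, 300)
X_val, y_val = rng.normal(size=(150, 8)), rng.integers(0, 4, 150)

# MLC with one Gaussian per class is equivalent to quadratic discriminant analysis.
mlc = QuadraticDiscriminantAnalysis().fit(X_train, y_train)

# A small feed-forward network as a stand-in for the paper's ANN.
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X_train, y_train)

for name, model in [("MLC", mlc), ("ANN", ann)]:
    print(name, accuracy_score(y_val, model.predict(X_val)))
```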

Relevance: 100.00%

Abstract:

Three-dimensional imaging and quantification of myocardial function are essential steps in the evaluation of cardiac disease. We propose a tagged magnetic resonance imaging methodology called zHARP that encodes and automatically tracks myocardial displacement in three dimensions. Unlike other motion encoding techniques, zHARP encodes both in-plane and through-plane motion in a single image plane without affecting the acquisition speed. Postprocessing unravels this encoding in order to directly track the 3-D displacement of every point within the image plane throughout an entire image sequence. Experimental results include a phantom validation experiment, which compares zHARP to phase contrast imaging, and an in vivo study of a normal human volunteer. Results demonstrate that the simultaneous extraction of in-plane and through-plane displacements from tagged images is feasible.
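In rough schematic form (our notation and symbols, not the paper's), the idea is that each tag direction is acquired with both polarities of a small through-plane phase encode, so sums and differences of the measured phases separate the in-plane harmonic phase from the through-plane position:

```latex
% Schematic zHARP phase bookkeeping (notation assumed, not from the paper).
% \Phi_{x,\pm}: measured phase of the x-tagged acquisitions with positive /
% negative z-encoding; \phi_x: in-plane HARP phase; \kappa z: through-plane
% encoding phase.
\begin{align}
\Phi_{x,\pm} &= \phi_x \pm \kappa z, &
\phi_x &= \tfrac{1}{2}\bigl(\Phi_{x,+} + \Phi_{x,-}\bigr), &
\kappa z &= \tfrac{1}{2}\bigl(\Phi_{x,+} - \Phi_{x,-}\bigr).
\end{align}
```

The y-tagged pair is treated the same way, yielding two in-plane phases plus the z-position term, from which the 3-D displacement of every pixel in the plane can be tracked over the sequence.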

Relevance: 100.00%

Abstract:

This study proposes a new concept for upscaling local information on failure surfaces derived from geophysical data, in order to build up spatial information and quickly estimate the magnitude and intensity of a landslide. A new approach to seismic interpretation on landslides is also demonstrated that takes basic geomorphic information into account through a numerical method based on the Sloping Local Base Level (SLBL). The SLBL is a generalization of the geomorphological base level applied to landslides, and allows the calculation of the potential geometry of the landslide failure surface. The approach was applied to a large-scale landslide, formed mainly in gypsum, situated in a former glacial valley along the Rhone in the western European Alps. Previous studies identified two sliding surfaces that may continue below the level of the valley. In this study, seismic refraction-reflection surveys were carried out to verify the existence of these failure surfaces. Analysis of the seismic data yields a four-layer model in which the three lower-velocity layers (<1000 m/s, 1500 m/s and 3000 m/s) are interpreted as the mobilized mass at different levels of weathering and compaction. The highest-velocity layer (>4000 m/s), with a maximum depth of ~58 m, is interpreted as the stable anhydrite bedrock. Two failure surfaces were interpreted from the seismic surveys: an upper one and a much deeper one (25 and 50 m deep, respectively). The upper failure surface depth deduced from geophysics differs slightly from the SLBL result, and the deeper failure surface depth calculated with the SLBL method is underestimated in comparison with the geophysical interpretations. Optimal results were therefore obtained by including the seismic data in the SLBL calculations according to the geomorphic limits of the landslide (maximum volume of mobilized mass = 7.5 × 10^6 m^3).
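The SLBL itself reduces to a simple iterative relaxation. A minimal 1-D sketch (grid, tolerance value, and topographic profile are invented for illustration): every interior node sitting above the mean of its neighbours minus a tolerance is lowered to that value, and the converged surface approximates the potential failure geometry.

```python
# Minimal 1-D SLBL sketch: iteratively lower each interior node toward the
# mean of its neighbours minus a tolerance c. Endpoints (the geomorphic
# limits of the landslide) stay fixed; nodes only ever move downwards.
import numpy as np

def slbl_1d(z, c=0.1, max_iter=10000, eps=1e-6):
    s = z.astype(float).copy()
    for _ in range(max_iter):
        target = 0.5 * (s[:-2] + s[2:]) - c    # neighbour mean minus tolerance
        lowered = np.minimum(s[1:-1], target)  # downward moves only
        if np.max(s[1:-1] - lowered) < eps:    # converged
            break
        s[1:-1] = lowered
    return s

topo = np.array([10., 12., 15., 14., 16., 13., 11.])  # toy elevation profile (m)
surface = slbl_1d(topo)
print(topo - surface)  # thickness of the potentially mobilized mass
```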

Relevance: 100.00%

Abstract:

A. Costanza, K. Weber, S. Gandy, C. Bouras, P. R. Hof, P. Giannakopoulos and A. Canuto (2011) Neuropathology and Applied Neurobiology 37, 570-584. Contact sport-related chronic traumatic encephalopathy in the elderly: clinical expression and structural substrates. Professional boxers and other contact sport athletes are exposed to repetitive brain trauma that may affect motor function, cognitive performance, emotional regulation and social awareness. The term chronic traumatic encephalopathy (CTE) was recently introduced to group together a wide spectrum of symptoms such as cerebellar, pyramidal and extrapyramidal syndromes; impairments in orientation, memory, language, attention, information processing and frontal executive functions; and personality changes and behavioural and psychiatric symptoms. Magnetic resonance imaging usually reveals hippocampal and vermis atrophy, a cavum septum pellucidum, signs of diffuse axonal injury, pituitary gland atrophy, dilated perivascular spaces and periventricular white matter disease. Given the partial overlap in the clinical expression, epidemiology and pathogenesis of CTE and Alzheimer's disease (AD), as well as the close association between traumatic brain injuries (TBIs) and neurofibrillary tangle formation, a mixed pathology promoted by pathogenetic cascades resulting in either CTE or AD has been postulated. Molecular studies suggest that TBIs increase the neurotoxicity of the TAR DNA-binding protein 43 (TDP-43), a key pathological marker of ubiquitin-positive forms of frontotemporal dementia (FTLD-TDP) with or without associated motor neurone disease/amyotrophic lateral sclerosis (ALS). Similar patterns of TDP-43 immunoreactivity in CTE, FTLD-TDP and ALS, as well as epidemiological correlations, support the presence of common pathogenetic mechanisms. The present review provides a critical update on the evolution of the concept of CTE with reference to its neuropathological definition, together with an in-depth discussion of the differential diagnosis between this entity, AD and frontotemporal dementia.

Relevance: 100.00%

Abstract:

It is estimated that around 230 people die each year in Switzerland due to radon (222Rn) exposure. 222Rn occurs mainly in closed environments such as buildings and originates primarily from the subjacent ground. Indoor concentrations therefore depend strongly on geology and show substantial regional variation. Correct identification of these regional variations would allow a substantial reduction of the population's 222Rn exposure, through appropriate construction of new buildings and mitigation of existing ones. Predicting indoor 222Rn concentrations (IRC) and identifying 222Rn-prone areas is difficult, however, since IRC depend on a variety of variables such as building characteristics, meteorology, geology and anthropogenic factors. The present work aims at developing predictive models and understanding IRC in Switzerland, taking into account a maximum of information in order to minimize the prediction uncertainty. The predictive maps will be used as a decision-support tool for 222Rn risk management. The models are built with different data-driven statistical methods in combination with geographic information systems (GIS). In a first phase we performed univariate analyses of IRC for different variables, namely detector type, building category, foundation, year of construction, average outdoor temperature during measurement, altitude and lithology. All variables showed significant associations with IRC. Buildings constructed after 1900 showed significantly lower IRC than earlier constructions, and we observed a further drop in IRC after 1970. We also found an association of IRC with altitude. With regard to lithology, the lowest IRC were observed in sedimentary rocks (excluding carbonates) and sediments, and the highest in the Jura carbonates and igneous rock. The IRC data were systematically analyzed for potential bias due to spatially unbalanced sampling. To facilitate the modeling and the interpretation of the influence of geology on IRC, we developed an algorithm based on k-medoids clustering that defines geological classes that are coherent in terms of IRC. We also carried out a soil-gas 222Rn concentration (SRC) measurement campaign to determine the predictive power of SRC with respect to IRC, and found that the usefulness of SRC for IRC prediction is limited. The second part of the project was dedicated to predictive mapping of IRC using models that take into account the multidimensionality of the process of 222Rn entry into buildings. For this purpose we used kernel regression and ensemble regression trees. We could explain up to 33% of the variance of the log-transformed IRC across Switzerland, a good performance compared with earlier attempts at IRC modeling in Switzerland. As predictor variables we considered geographical coordinates, altitude, outdoor temperature, building type, foundation, year of construction and detector type. Ensemble regression trees such as random forests make it possible to determine the role of each IRC predictor in a multidimensional setting; we found spatial information such as geology, altitude and coordinates to have a stronger influence on IRC than building-related variables such as foundation type, building type and year of construction. Based on kernel estimation we developed an approach to determine the local probability of IRC exceeding 300 Bq/m3, and we additionally developed a confidence index that provides an estimate of the uncertainty of the map.
All methods allow easy creation of tailor-made maps for different building characteristics. Our work is an essential step towards a 222Rn risk assessment that accounts simultaneously for different architectural situations and for geological and geographical conditions. For communicating the 222Rn hazard to the population we recommend using the probability map based on kernel estimation. This could, for example, be implemented via a web interface in which users specify the characteristics and coordinates of their home in order to obtain the probability of exceeding a given IRC, together with a corresponding index of confidence. Considering the health effects of 222Rn, our results have the potential to substantially improve the estimation of the effective dose from 222Rn delivered to the Swiss population.
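A sketch of the ensemble-tree part of such a workflow (variable names, data, and the exceedance heuristic are our assumptions, not the thesis code): fit a random forest to log-IRC, read off variable importances, and use the spread across trees as a crude local probability of exceeding 300 Bq/m3.

```python
# Hypothetical predictors: coordinates, altitude, outdoor temperature,
# year of construction, geology class; target: log indoor radon (IRC).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 2000
X = pd.DataFrame({
    "x": rng.uniform(0, 350, n), "y": rng.uniform(0, 220, n),
    "altitude": rng.uniform(200, 2000, n),
    "outdoor_temp": rng.normal(5, 4, n),
    "year_built": rng.integers(1850, 2010, n),
    "geology_class": rng.integers(0, 5, n),
})
log_irc = 3.5 + 0.001 * X["altitude"] + rng.normal(0, 0.8, n)  # toy signal

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, log_irc)
print(dict(zip(X.columns, rf.feature_importances_.round(3))))

# Crude exceedance estimate: fraction of trees predicting above ln(300).
per_tree = np.stack([t.predict(X.values[:5]) for t in rf.estimators_])
print((per_tree > np.log(300)).mean(axis=0))
```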

Relevance: 100.00%

Abstract:

The good news with regard to this (or any) chapter on the future of leadership is that there is one. There was a time when researchers called for a moratorium on new leadership theory and research (e.g., Miner, 1975), citing the uncertain future of the field. Then, for a time, a popular academic perspective held that leadership did not really matter when it came to shaping organizational outcomes (Meindl & Ehrlich, 1987; Meindl, Ehrlich, & Dukerich, 1985; Pfeffer, 1977). That perspective was laid to rest by "realists" in the field (Day & Antonakis, 2012a) through empirical re-interpretation of the results used to support the position that leadership does not matter (Lieberson & O'Connor, 1972; Salancik & Pfeffer, 1977). Specifically, Day and Lord (1988) showed that when proper methodological concerns were addressed (e.g., controlling for industry and company size effects; incorporating appropriate time lags), the impact of top-level leadership was considerable, explaining as much as 45% of the variance in measures of organizational performance. Despite some recent pessimistic sentiments about the "curiously unformed" state of leadership research and theory (Hackman & Wageman, 2007), others have argued that the field has continued to evolve and is potentially on the threshold of some significant breakthroughs (Day & Antonakis, 2012a). Leadership scholars have been re-energized by new directions in the field, and research efforts have revitalized areas previously abandoned for apparent lack of consistency in findings (e.g., leadership trait theory). Our accumulated knowledge now allows us to explain, with some degree of confidence, the nature of leadership, including its biological bases and other antecedents and consequences. Other comprehensive sources review the extensive theoretical and empirical foundation of leadership (Bass, 2008; Day & Antonakis, 2012b), so that will not be the focus of the present chapter. Instead, we take a future-oriented perspective in identifying particular areas within the leadership field that we believe offer promising perspectives on the future of leadership. Nonetheless, it is worthwhile as background to first provide an overview of how the leadership field has changed over the past decade or so; this short chronicle will set the stage for a keener understanding of where future contributions are likely to emerge. Overall, across nine major schools of leadership (trait, behavioural, contingency, contextual, relational, sceptics, information processing, New Leadership, and biological/evolutionary), researchers have seen a resurgence of interest in one area, a high level of activity in at least four others, inactivity in three, and one area that was modestly active in the previous decade but that we think holds strong promise for the future (Gardner, Lowe, Moss, Mahoney, & Cogliser, 2010). We next provide brief overviews of these nine schools and their respective levels of research activity (see Figure 1).

Relevance: 100.00%

Abstract:

The combination of advanced neuroimaging techniques and major developments in complex network science has given birth to a new framework for studying the brain: "connectomics." This framework provides the ability to describe and study the brain as a dynamic network and to explore how the coordination and integration of information processing may occur. In recent years this framework has been used to investigate the developing brain and has shed light on many dynamic changes occurring from infancy through adulthood. The aim of this article is to review this work and to discuss what we have learned from it. We also use this body of work to highlight key technical aspects that are necessary, in general, for successful connectome analysis with today's advanced neuroimaging techniques. We seek to identify the current limitations of such approaches, what can be improved, and how these points generalize to other topics in connectome research.

Relevance: 100.00%

Abstract:

Helping behavior is any intentional behavior that benefits another living being or group (Hogg & Vaughan, 2010). People tend to underestimate the probability that others will comply with their direct requests for help (Flynn & Lake, 2008). This implies that when people need help, they assess the probability of getting it (De Paulo, 1982, cited in Flynn & Lake, 2008), tend to arrive at an estimate that is lower than the real chance, and may therefore not even consider it worth asking. Existing explanations attribute this phenomenon to a mistaken cost computation by the help seeker, who emphasizes the instrumental cost of "saying yes" while ignoring that the potential helper also has to take into account the social cost of saying "no". Especially in face-to-face interactions, the discomfort caused by refusing to help can be very high. In short, help seekers tend to fail to realize that it might be more costly to refuse a help request than to accept it. A similar effect has been observed in estimates of people's trustworthiness: Fetchenhauer and Dunning (2010) showed that people tend to underestimate it as well. This bias is reduced when, instead of asymmetric feedback (received only when one decides to trust the other person), symmetric feedback (always given) is provided. The same cause could apply to help seeking, since people only receive feedback when they actually make a request, and not otherwise. Fazio, Shook, and Eiser (2004) studied something that could reinforce these outcomes: learning asymmetries. By means of a computer game called BeanFest, they showed that people learn better about negatively valenced objects (beans, in this case) than about positively valenced ones. This learning asymmetry stemmed from "information gain being contingent on approach behavior" (p. 293), which can be identified with what Fetchenhauer and Dunning call 'asymmetric feedback', and hence also with help requests. Fazio et al. also found a generalization asymmetry in favor of negative attitudes over positive ones, which they attributed to a negativity bias that "weights resemblance to a known negative more heavily than resemblance to a positive" (p. 300). Applied to help-seeking scenarios, this means that when facing an unknown situation, people would tend to generalize and infer that a negative outcome is more likely than a positive one; together with the mechanisms described above, this makes people more inclined to expect a "no" when requesting help. Denrell and Le Mens (2011) offer a different perspective on judgment biases in general. They depart from the classical account of inappropriate information processing (described, among others, by Fiske & Taylor, 2007, and Tversky & Kahneman, 1974) and explain such biases in terms of 'adaptive sampling'. Adaptive sampling is a sampling mechanism in which the selection of sample items is conditioned by the previously observed values of the variable of interest (Thompson, 2011). Sampling adaptively allows individuals to protect themselves from experiences that once yielded negative outcomes; however, it also prevents them from giving those experiences a second chance and obtaining an updated outcome that might turn out positive, more positive, or simply regress to the mean.
As Denrell and Le Mens (2011) explained, this makes sense: if you go to a restaurant and do not like the food, you do not choose that restaurant again. We think this could be what happens when asking for help: when we get a "no", we stop asking. Here we provide a complementary explanation, based on adaptive sampling, for the underestimation of the probability that others will comply with our direct help requests. First, we develop and explain a model that represents the theory; we then test it empirically by means of experiments and elaborate on the analysis of the results.
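A toy simulation of the adaptive-sampling account (all parameters are ours, purely for illustration): an agent asks for help only while its estimated compliance rate stays above a threshold, and updates that estimate only when it asks; after a run of refusals the pessimistic estimate freezes and is never corrected, even though the true compliance rate is higher.

```python
# Minimal adaptive-sampling sketch: belief updates are contingent on
# approach behavior (asking), so negative experiences can trap the
# estimate below the true compliance probability.
import random

def simulate(true_p=0.7, threshold=0.5, episodes=10000, lr=0.2):
    estimate, asked = 0.5, 0
    for _ in range(episodes):
        if estimate >= threshold:                  # ask only if it seems worth it
            outcome = 1.0 if random.random() < true_p else 0.0
            estimate += lr * (outcome - estimate)  # update only when asking
            asked += 1
        # once a "no" pushes the estimate below the threshold, the agent
        # stops sampling and the pessimistic belief is never corrected
    return estimate, asked

random.seed(0)
print(simulate())  # final estimate typically well below true_p = 0.7
```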

Relevance: 100.00%

Abstract:

Formative, also called asymmetric, cell divisions produce daughter cells with different identities. Like other divisions, formative divisions rely first of all on the cell cycle machinery, with centrally acting cyclin-dependent kinases (CDKs) and their cyclin partners controlling progression through the cell cycle. However, it remains largely obscure how developmental cues are translated at the cellular level to promote asymmetric divisions. Here, we show that formative divisions in the shoot and root of the flowering plant Arabidopsis thaliana are controlled by a common mechanism that relies on the activity level of the Cdk1 homolog CDKA;1, with medium levels being sufficient for symmetric divisions but high levels being required for formative divisions. We reveal that the function of CDKA;1 in asymmetric cell divisions operates through a transcriptional regulation system mediated by the Arabidopsis Retinoblastoma homolog RBR1. RBR1 regulates not only cell cycle genes but also, independently of the cell cycle transcription factor E2F, genes required for formative divisions and cell fate acquisition, thus directly linking cell proliferation with differentiation. This mechanism allows spatial information, in the form of high kinase activity, to be implemented through intracellular gating of developmental decisions.

Relevance: 100.00%

Abstract:

This letter presents advanced classification methods for very high resolution images. Efficient multisource information, both spectral and spatial, is exploited through the use of composite kernels in support vector machines. Weighted summations of kernels accounting for separate sources of spectral and spatial information are analyzed and compared to classical approaches such as pure spectral classification or stacked approaches using all the features in a single vector. Model selection problems are addressed, as well as the importance of the different kernels in the weighted summation.
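A minimal sketch of the weighted composite kernel (toy data and parameter values are assumptions; the formulation K = mu * K_spectral + (1 - mu) * K_spatial follows the letter's description):

```python
# Composite-kernel SVM sketch: one RBF kernel on per-pixel spectra, one on
# spatial/contextual features, combined by a weighted summation.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_spec = rng.normal(size=(200, 64))   # hypothetical per-pixel spectra
X_spat = rng.normal(size=(200, 9))    # hypothetical spatial features
y = rng.integers(0, 3, 200)

mu = 0.6  # weight balancing spectral vs. spatial information
K = mu * rbf_kernel(X_spec, gamma=0.1) + (1 - mu) * rbf_kernel(X_spat, gamma=0.5)

clf = SVC(kernel="precomputed").fit(K, y)
print(clf.score(K, y))  # training accuracy on the toy data
```

Since a weighted sum of positive semi-definite kernels is itself a valid kernel, the mixture can be fed directly to a standard SVM; in practice mu would be chosen by cross-validation, which is part of the model selection problem the letter addresses.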

Relevance: 100.00%

Abstract:

This paper presents a validation study of unsupervised statistical brain tissue classification techniques in magnetic resonance (MR) images. Several image models assuming different hypotheses regarding the intensity distribution model, the spatial model and the number of classes are assessed. The methods are tested on simulated data for which the classification ground truth is known, with different levels of noise and intensity nonuniformity added to simulate real imaging conditions. No enhancement of image quality is performed before or during classification, so that the accuracy of the methods and their robustness against image artifacts can be tested. Classification is also performed on real data, where a quantitative validation compares the methods' results with a ground truth estimated from manual segmentations by experts. The validity of the various classification methods, both in labeling the image and in estimating tissue volume, is assessed with different local and global measures. Results demonstrate that methods relying on both intensity and spatial information are more robust to noise and field inhomogeneities. We also demonstrate that partial volume is not perfectly modeled, even though methods that account for mixture classes outperform methods that consider only pure Gaussian classes. Finally, we show that the results on simulated data extend to real data.
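As a baseline illustration of the intensity-only end of this model family (simulated 1-D intensities with made-up means and noise), a Gaussian mixture with pure-tissue classes can be fitted as follows; the paper's point is that adding a spatial model and partial-volume mixture classes improves on exactly this kind of classifier:

```python
# Intensity-only Gaussian mixture tissue classification (no spatial prior).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical intensities for three tissue classes (e.g., CSF / grey
# matter / white matter), noisy and without bias-field correction.
intensities = np.concatenate([
    rng.normal(30, 5, 1000), rng.normal(70, 6, 1500), rng.normal(110, 5, 1200),
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, random_state=0).fit(intensities)
labels = gmm.predict(intensities)   # voxel-wise tissue labels
print(gmm.means_.ravel().round(1))  # recovered class means
```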

Relevance: 100.00%

Abstract:

Land use/cover classification is one of the most important applications of remote sensing. Mapping accurate land use/cover spatial distributions is a challenge, however, particularly in moist tropical regions, owing to the complex biophysical environment and the limitations of remote sensing data per se. This paper reviews a decade of experiments related to land use/cover classification in the Brazilian Amazon. Comprehensive analysis of the classification results leads to the conclusion that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporating suitable textural images into the multispectral bands and using segmentation-based methods are valuable ways to improve land use/cover classification, especially for high-spatial-resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation but may not improve classification performance; in contrast, integration of optical and radar data did improve classification performance when a proper data fusion method was used. Among the available classification algorithms, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results, although they often require more time for parameter optimization. Proper use of hierarchical methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data.
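For instance, a textural band can be derived from the grey-level co-occurrence matrix (GLCM) and stacked with the multispectral bands before classification; the window size, GLCM settings, and toy image below are illustrative assumptions:

```python
# Sketch: per-window GLCM contrast as a textural feature.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
band = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)  # toy image band

def glcm_contrast(window):
    g = graycomatrix(window, distances=[1], angles=[0],
                     levels=64, symmetric=True, normed=True)
    return graycoprops(g, "contrast")[0, 0]

# Contrast for one 9x9 window; in practice this is computed per pixel in a
# moving window and stacked with the multispectral bands.
print(glcm_contrast(band[:9, :9]))
```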

Relevance: 100.00%

Abstract:

Acquisition of a mature dendritic morphology is critical for neural information processing. In particular, hepatocyte growth factor (HGF) controls dendritic arborization during brain development. However, the cellular mechanisms underlying the effects of HGF on dendritic growth remain elusive. Here, we show that HGF increases dendritic length and branching of rat cortical neurons through activation of the mitogen-activated protein kinase (MAPK) signaling pathway. Activation of MAPK by HGF leads to the rapid and transient phosphorylation of cAMP response element-binding protein (CREB), a key step necessary for the control of dendritic development by HGF. In addition to CREB phosphorylation, regulation of dendritic growth by HGF requires the interaction between CREB and CREB-regulated transcription coactivator 1 (CRTC1), as expression of a mutated form of CREB unable to bind CRTC1 completely abolished the effects of HGF on dendritic morphology. Treatment of cortical neurons with HGF in combination with brain-derived neurotrophic factor (BDNF), a member of the neurotrophin family that regulates dendritic development via similar mechanisms, showed additive effects on MAPK activation, CREB phosphorylation and dendritic growth. Collectively, these results support the conclusion that regulation of cortical dendritic morphology by HGF is mediated by activation of the MAPK pathway, phosphorylation of CREB and interaction of CREB with CRTC1.

Relevance: 100.00%

Abstract:

In this article we propose a novel method for calculating cardiac 3-D strain. The method requires the acquisition of myocardial short-axis (SA) slices only, and produces the 3-D strain tensor at every point within every pair of slices. Three-dimensional displacement is calculated from the SA slices using zHARP, and is then used to compute the local displacement gradient and thus the local strain tensor. The method has three main advantages. First, the 3-D strain tensor is calculated for every pixel without interpolation, which is unprecedented in cardiac MR imaging. Second, the method is fast, in part because there is no need to acquire long-axis (LA) slices. Third, the method is accurate, because the 3-D displacement components are acquired simultaneously, which reduces motion artifacts without the need for registration. This article presents the theory of computing 3-D strain from two slices using zHARP, the imaging protocol, and both phantom and in-vivo validation.
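The strain computation itself is standard continuum mechanics; a numpy sketch (our notation, toy displacement field): form the displacement gradient by finite differences, build the deformation gradient F = I + du/dX, and take the Green-Lagrange strain E = (F^T F - I)/2 at every voxel.

```python
# Strain tensor from a dense displacement field sampled on a grid.
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(scale=0.01, size=(3, 16, 16, 2))  # (component, x, y, slice pair)

# grad[i, j] = d u_i / d X_j, approximated by central finite differences.
grad = np.stack([np.stack(np.gradient(u[i], axis=(0, 1, 2)), axis=0)
                 for i in range(3)], axis=0)          # shape (3, 3, 16, 16, 2)

F = np.eye(3)[:, :, None, None, None] + grad          # deformation gradient
E = 0.5 * (np.einsum("ki...,kj...->ij...", F, F)      # Green-Lagrange strain
           - np.eye(3)[:, :, None, None, None])
print(E[:, :, 8, 8, 0].round(5))  # 3x3 strain tensor at one sample point
```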

Relevance: 100.00%

Abstract:

Year by year, growing computer processing power has made it possible to process spectral images, which are more detailed than grayscale and RGB color images, in reasonable time and without great cost. The problem, however, is that storage and data transfer media have not kept pace with processing power. The solution is to compress spectral images for storage and transmission. This work presents a method in which a spectral image is compressed in two stages: first by clustering with a self-organizing map (SOM), and then by continuing the compression with conventional methods. The compression ratios obtained are significant, while the distortion remains tolerable. The work was carried out in the Research Laboratory of Information Processing at the Department of Information Technology of Lappeenranta University of Technology, as part of a larger image compression research project.
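The two-stage idea can be sketched as follows (toy data and a deliberately tiny SOM; the thesis' actual codebook sizes and the conventional coder are not specified here): stage 1 replaces each pixel's spectrum by the index of its best-matching SOM codebook vector; stage 2 would then compress the index map and codebook with conventional methods.

```python
# Stage 1 of the two-stage compression: SOM vector quantization of spectra.
import numpy as np

rng = np.random.default_rng(0)
spectra = rng.random((4096, 31))            # 4096 pixels x 31 spectral bands

# Tiny 1-D SOM: 16 codebook vectors trained with neighbourhood smoothing.
codebook = rng.random((16, 31))
for t in range(2000):
    x = spectra[rng.integers(len(spectra))]
    win = np.argmin(((codebook - x) ** 2).sum(axis=1))       # best match
    lr = 0.5 * (1 - t / 2000)                                # decaying rate
    sigma = 3.0 * (1 - t / 2000) + 0.5                       # shrinking radius
    h = np.exp(-((np.arange(16) - win) ** 2) / (2 * sigma ** 2))
    codebook += lr * h[:, None] * (x - codebook)             # pull neighbours

indices = np.argmin(((spectra[:, None] - codebook[None]) ** 2).sum(-1), axis=1)
# The index map (4 bits/pixel here) plus the small codebook is what stage 2
# would hand to a conventional compression method.
print(indices.shape, codebook.shape)
```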