929 results for Google Maps


Relevance: 20.00%

Publisher:

Abstract:

The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem, but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, which is an improved method based on a recent publication. This new method assumes that a low similarity should be found in the regions of lesions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. The use of a normalized similarity measurement enables the current method to fine-tune the threshold for lesion detection, thus maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included in the current approach, which removes enlarged ventricles from detected lesions. The performance investigation using simulated lesions demonstrated not only that the majority of lesions were well detected but also that normal tissues were identified effectively. Tests on images acquired from stroke patients further confirmed the strength of the method in lesion detection. When compared with the previous version, the current approach showed a higher sensitivity in detecting small lesions and produced fewer false positives around the ventricles and the edge of the brain.
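The core idea above — flagging voxels where an intensity-based fuzzy segmentation disagrees with location-based tissue probabilities — can be sketched as follows. This is a minimal illustration using cosine similarity as the normalized measure; the paper's exact similarity measurement and cleaning step are not reproduced here, and all names are hypothetical.

```python
import numpy as np

def lesion_candidates(fuzzy_seg, tissue_prior, threshold=0.5):
    """Flag voxels where an intensity-based fuzzy segmentation disagrees
    with location-based tissue probabilities (low similarity -> lesion).

    fuzzy_seg, tissue_prior: arrays of shape (..., K) holding per-voxel
    membership/probability values over K tissue classes.
    """
    # Cosine similarity between the two per-voxel class vectors; for
    # non-negative inputs it already lies on a fixed [0, 1] scale, which
    # is what makes the detection threshold easy to fine-tune.
    num = np.sum(fuzzy_seg * tissue_prior, axis=-1)
    den = (np.linalg.norm(fuzzy_seg, axis=-1)
           * np.linalg.norm(tissue_prior, axis=-1) + 1e-12)
    similarity = num / den
    # Low similarity marks candidate lesion voxels.
    return similarity < threshold

# Toy example: 2 voxels, 3 tissue classes.
fuzzy = np.array([[0.9, 0.1, 0.0],   # agrees with the prior
                  [0.0, 0.1, 0.9]])  # disagrees -> candidate lesion
prior = np.array([[0.8, 0.2, 0.0],
                  [0.9, 0.1, 0.0]])
mask = lesion_candidates(fuzzy, prior)
```

In practice the threshold would be tuned on the normalized similarity map, as the abstract describes, rather than fixed at 0.5.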

Relevance: 20.00%

Publisher:

Abstract:

With the advent of mass digitization projects, such as the Google Book Search, a peculiar shift has occurred in the way that copyright works are dealt with. Contrary to what has so far been the case, works are turned into machine-readable data to be automatically processed for various purposes without the expression of the works being displayed to the public. In the Google Book Settlement Agreement, this new kind of usage is referred to as ‘non-display uses’ of digital works. The legitimacy of these uses has not yet been tested by courts and does not comfortably fit in the current copyright doctrine, plainly because the works are not used as works but as something else, namely as data. Since non-display uses may prove to be a very lucrative market in the near future, with the potential to affect the way people use copyright works, we examine non-display uses through the prism of copyright principles to determine the boundaries of their legitimacy. Through this examination, we provide a categorization of the activities carried out under the heading of ‘non-display uses’, examine their lawfulness under the current copyright doctrine, and approach the phenomenon from the spectrum of data protection law that could apply, by analogy, to the use of copyright works as processable data.

Relevance: 20.00%

Publisher:

Abstract:

Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large-scale flood preparedness. Such global hazard maps can be generated using large-scale physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium-Range Weather Forecasts (ECMWF) land surface model is driven by ERA-Interim reanalysis meteorological forcing data, and the resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30 yr (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maxima flows to derive a number of flood return periods. The return periods are calculated initially for a 25×25 km grid, which is then reprojected onto a 1×1 km grid to derive maps of higher resolution and estimate flooded fractional area for the individual 25×25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably to a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.
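The Gumbel step described above — fit annual maximum flows, then read off the flow magnitude for a given return period T (annual exceedance probability 1/T) — can be sketched with SciPy. The synthetic annual maxima below stand in for the 30 yr of simulated flows; the numbers are illustrative only.

```python
import numpy as np
from scipy.stats import gumbel_r

# Synthetic stand-in for ~30 years of simulated annual maximum flows (m^3/s).
rng = np.random.default_rng(0)
annual_maxima = gumbel_r.rvs(loc=500.0, scale=120.0, size=32, random_state=rng)

# Fit a Gumbel (EV1) distribution to the annual maxima.
loc, scale = gumbel_r.fit(annual_maxima)

def return_level(T, loc, scale):
    # Flow with annual exceedance probability 1/T is the (1 - 1/T) quantile.
    return gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)

# Return levels for the range of return periods used in the study.
levels = {T: return_level(T, loc, scale) for T in (2, 10, 100, 500)}
```

The same quantile function then yields the flow threshold for any return period on each 25×25 km cell before the 1×1 km reprojection.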

Relevance: 20.00%

Publisher:

Abstract:

The chapter starts from the premise that a historically and institutionally formed orientation to music education at primary level in European countries privileges a nineteenth-century Western European music aesthetic, with its focus on formal characteristics such as melody and rhythm. While there is a move towards a multi-faceted understanding of musical ability, a discrete intelligence and a willingness to accept musical styles or 'open-earedness', there remains a paucity of documented evidence of this in research at primary school level. To date there has been no study undertaken which has the potential to provide policy makers and practitioners with insights into the degree of homogeneity or universality in conceptions of musical ability within this educational sector. Against this background, a study was set up to explore the following research questions: 1. What conceptions of musical ability do primary teachers hold a) of themselves and b) of their pupils? 2. To what extent are these conceptions informed by Western classical practices? A mixed-methods approach was used which included a survey questionnaire and semi-structured interviews. Questionnaires were sent to all classroom teachers in a random sample of primary schools in the South East of England. This was followed up with a series of semi-structured interviews with a sub-sample of respondents. The main ideas are concerned with the attitudes, beliefs and working theories held by teachers in contemporary primary school settings. By mapping the extent to which a knowledge base for teaching can be resistant to change in schools, we can problematise primary schools as sites for diversity and migration of cultural ideas. Alongside this, we can use the findings from the study undertaken in an English context as a starting point for further investigation into conceptions of music, musical ability and assessment held by practitioners in a variety of primary school contexts elsewhere in Europe; our emphasis here will be on the development of shared understanding in terms of policies and practices in music education. Within this broader framework, our study can have a significant impact internationally, with the potential to inform future policy making, curriculum planning and practice.

Relevance: 20.00%

Publisher:

Abstract:

Remotely sensed land cover maps are increasingly used as inputs into environmental simulation models whose outputs inform decisions and policy-making. Risks associated with these decisions are dependent on model output uncertainty, which is in turn affected by the uncertainty of land cover inputs. This article presents a method of quantifying the uncertainty that results from potential mis-classification in remotely sensed land cover maps. In addition to quantifying uncertainty in the classification of individual pixels in the map, we also address the important case where land cover maps have been upscaled to a coarser grid to suit the users’ needs and are reported as proportions of land cover type. The approach is Bayesian and incorporates several layers of modelling but is straightforward to implement. First, we incorporate data in the confusion matrix derived from an independent field survey, and discuss the appropriate way to model such data. Second, we account for spatial correlation in the true land cover map, using the remotely sensed map as a prior. Third, spatial correlation in the mis-classification characteristics is induced by modelling their variance. The result is that we are able to simulate posterior means and variances for individual sites and the entire map using a simple Monte Carlo algorithm. The method is applied to the Land Cover Map 2000 for the region of England and Wales, a map used as an input into a current dynamic carbon flux model.
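The first layer of the approach above — modelling the field-survey confusion matrix and Monte Carlo-simulating posterior land-cover proportions for an upscaled cell — can be sketched as follows. This is a deliberately simplified, non-spatial version (independent Dirichlet posteriors per mapped class, uniform prior); the paper's full model additionally handles spatial correlation in both the true map and the mis-classification characteristics. All counts are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Field-survey confusion matrix: rows = mapped class, cols = true class.
confusion = np.array([[90,  8,  2],
                      [10, 75, 15],
                      [ 5, 10, 85]], dtype=float)

# Pixel counts per mapped class inside one coarse (upscaled) grid cell.
mapped_counts = np.array([400, 150, 75])

n_draws = 2000
samples = np.empty((n_draws, 3))
for i in range(n_draws):
    # Dirichlet posterior for P(true class | mapped class), row by row
    # (a uniform prior adds 1 to each survey count).
    p_true_given_mapped = np.vstack([
        rng.dirichlet(row + 1.0) for row in confusion])
    # Expected true-cover proportions of the cell under this draw.
    samples[i] = mapped_counts @ p_true_given_mapped / mapped_counts.sum()

# Posterior mean and variance of the cell's land-cover proportions.
post_mean = samples.mean(axis=0)
post_sd = samples.std(axis=0)
```

Repeating this per cell yields the per-site posterior means and variances the abstract refers to; the spatial layers of the model would correlate the draws across neighbouring cells.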

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a new method to calculate sky view factors (SVFs) from high-resolution urban digital elevation models using a shadow-casting algorithm. By utilizing weighted annuli to derive the SVF from hemispherical images, the distant light-source positions can be predefined and uniformly spread over the whole hemisphere, whereas another method applies a random set of light-source positions with a cosine-weighted distribution of sun altitude angles. The two methods give similar results over a large number of SVF images. However, when comparing variations at pixel level between an image generated using the new method presented in this paper and the image from the random method, anisotropic patterns occur. The absolute mean difference between the two methods is 0.002, ranging up to 0.040. The maximum difference can be as much as 0.122. Since the SVF is a geometrically derived parameter, the anisotropic errors created by the random method must be considered significant.
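The weighted-annuli idea can be illustrated with a short sketch: split the hemisphere into zenith-angle bands, weight each band by its cosine-weighted solid-angle share, and sum the visible-sky fractions. This is a generic annulus weighting under those assumptions, not necessarily the exact weighting used in the paper.

```python
import numpy as np

def sky_view_factor(visible_fraction):
    """SVF from weighted annuli of a hemispherical (shadow) image.

    visible_fraction[i]: fraction of annulus i (a zenith-angle band,
    annulus 0 nearest the zenith) that is unobstructed sky.
    """
    f = np.asarray(visible_fraction, dtype=float)
    n = len(f)
    edges = np.linspace(0.0, np.pi / 2.0, n + 1)  # zenith-angle band edges
    # Integrating cos(t)*sin(t) over each band gives that annulus's
    # cosine-weighted share of the hemisphere; the weights sum to 1.
    weights = np.sin(edges[1:]) ** 2 - np.sin(edges[:-1]) ** 2
    return float(np.sum(weights * f))

open_sky = sky_view_factor(np.ones(36))        # fully open sky
blocked = sky_view_factor(np.zeros(36))        # fully obstructed
half = sky_view_factor([1] * 18 + [0] * 18)    # horizon blocked below 45 deg
```

Because the annulus weights are fixed in advance, the light-source positions are effectively predefined and uniform, which is what removes the anisotropic noise of the random-sampling method.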

Relevance: 20.00%

Publisher:

Abstract:

Let H ∈ C²(ℝ^(N×n)), H ≥ 0. The PDE system (1) arises as the Euler-Lagrange PDE of vectorial variational problems for the functional E∞(u, Ω) = ‖H(Du)‖_L∞(Ω) defined on maps u : Ω ⊆ ℝⁿ → ℝᴺ, and first appeared in the author's recent work. The scalar case, though, has a long history initiated by Aronsson. Herein we study the solutions of (1) with emphasis on the case n = 2 ≤ N with H the Euclidean norm on ℝ^(N×n), which we call the "∞-Laplacian". By establishing a rigidity theorem for rank-one maps of independent interest, we analyse a phenomenon of separation of the solutions into phases with qualitatively different behaviour. As a corollary, we extend to N ≥ 2 the Aronsson-Evans-Yu theorem on the non-existence of zeros of |Du| and prove a maximum principle. We further characterise all H for which (1) is elliptic and also study the initial-value problem for the ODE system arising for n = 1, but with H(·, u, u′) depending on all the arguments.
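For context, the long-studied scalar case mentioned above (N = 1, initiated by Aronsson) with H the Euclidean norm corresponds to L∞-minimization of |Du|, whose Euler-Lagrange equation is the classical ∞-Laplace equation; the system (1) referred to in the abstract is its vectorial generalization.

```latex
\[
  \Delta_\infty u \;:=\; \sum_{i,j=1}^{n} D_i u \, D_j u \, D^2_{ij} u \;=\; 0 .
\]
```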

Relevance: 20.00%

Publisher:

Abstract:

Accurate knowledge of ice-production rates within the marginal ice zones of the Arctic Ocean requires monitoring of the thin-ice distribution within polynyas. The thickness of the ice layer controls the heat loss and hence the new-ice formation. An established thin-ice algorithm using high-resolution MODIS data makes it possible to derive the ice-thickness distribution within polynyas. The average uncertainty is ±4.7 cm for ice thicknesses below 0.2 m. In this study, the ice-thickness distributions within the Laptev Sea polynya for the two winter seasons 2007/08 and 2008/09 are calculated. A new method is then applied to determine a daily MODIS thin-ice product.
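The physical idea behind such thin-ice retrievals can be sketched as a thermal inversion: assuming steady state and a linear temperature profile through the ice, the conductive flux balances the atmospheric heat loss, so thickness follows from the surface temperature. This is a simplified illustration, not the MODIS algorithm itself; the constants and inputs are nominal.

```python
# Hedged sketch of thin-ice thermal inversion (nominal constants):
# steady state + linear profile  =>  Q = k_ice * (T_freeze - T_surface) / h.
K_ICE = 2.03          # thermal conductivity of sea ice, W m^-1 K^-1
T_FREEZE = -1.8       # freezing temperature of seawater, deg C

def ice_thickness(t_surface, q_atm):
    """Ice thickness (m) from surface temperature (deg C, e.g. a MODIS
    ice-surface temperature) and net atmospheric heat loss q_atm (W m^-2,
    e.g. from reanalysis)."""
    return K_ICE * (T_FREEZE - t_surface) / q_atm

# A moderately cold surface over strong winter heat loss implies thin ice.
h = ice_thickness(t_surface=-12.0, q_atm=300.0)
```

Thicker ice insulates more, so for a fixed heat loss a colder surface maps to a larger retrieved thickness; the ±4.7 cm uncertainty quoted above reflects, among other things, errors in the surface temperature and flux inputs.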

Relevance: 20.00%

Publisher:

Abstract:

We have analyzed XMM-Newton archive data for five clusters of galaxies (redshifts 0.223-0.313) covering a wide range of dynamical states, from relaxed objects to clusters undergoing several mergers. We present here temperature maps of the X-ray gas together with a preliminary interpretation of the formation history of these clusters. (c) 2007 COSPAR. Published by Elsevier Ltd. All rights reserved.