892 results for Topographic categorization
Abstract:
A new database of weather and circulation type catalogs is presented, comprising 17 automated classification methods and five subjective classifications. It was compiled within COST Action 733 "Harmonisation and Applications of Weather Type Classifications for European regions" in order to evaluate different methods for weather and circulation type classification. This paper gives a technical description of the included methods using a new conceptual categorization for classification methods that reflects the strategy for the definition of types. Methods using predefined types include manual and threshold-based classifications, while methods producing types derived from the input data include those based on eigenvector techniques, leader algorithms and optimization algorithms. In order to allow direct comparisons between the methods, the circulation input data and the methods' configuration were harmonized to produce a subset of standard catalogs for the automated methods. The harmonization covers the data source, the climatic parameters used, the classification period, as well as the spatial domain and the number of types. Frequency-based characteristics of the resulting catalogs are presented, including variation of class sizes, persistence, seasonal and inter-annual variability, as well as trends of the annual frequency time series. The methodological concept of the classifications is partly reflected in these properties of the resulting catalogs. It is shown that the types of subjective classifications exhibit higher persistence, inter-annual variation and long-term trends than those of automated methods. Among the automated classifications, optimization methods show a tendency towards longer persistence and higher seasonal variation. However, it is also concluded that the distance metric used and the data preprocessing play at least as important a role in the properties of the resulting classification as the algorithm used for type definition and assignment.
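One of the data-derived, optimization-type approaches mentioned above can be illustrated with k-means clustering; the following is a minimal sketch, assuming k-means as a stand-in for such methods and random placeholder fields instead of real circulation data, of how types and the frequency/persistence diagnostics discussed above could be computed.

```python
# Minimal sketch (not any specific COST 733 method): an optimization-type
# circulation classification via k-means on daily sea-level-pressure fields,
# followed by simple frequency and persistence diagnostics.
# Data are random placeholders; grid size and number of types are assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_days, ny, nx = 3650, 20, 30             # ten years of daily SLP on a small grid
slp = rng.normal(1013.0, 8.0, size=(n_days, ny, nx))

X = slp.reshape(n_days, -1)               # one row per day (flattened field)
X = (X - X.mean(axis=0)) / X.std(axis=0)  # simple preprocessing; such choices matter

k = 9                                     # harmonized number of types (assumed)
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

# Frequency of each type and mean persistence (mean run length in days)
freq = np.bincount(labels, minlength=k) / n_days
change_points = np.flatnonzero(np.diff(labels) != 0) + 1
run_lengths = np.diff(np.concatenate(([0], change_points, [n_days])))
print("type frequencies:", np.round(freq, 3))
print("mean persistence (days):", run_lengths.mean())
```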
Abstract:
A precipitation downscaling method is presented using precipitation from a general circulation model (GCM) as predictor. The method extends a previous method from monthly to daily temporal resolution. The simplest form of the method corrects for biases in wet-day frequency and intensity. A more sophisticated variant also takes account of flow-dependent biases in the GCM. The method is flexible and simple to implement. It is proposed here as a correction of GCM output for applications where sophisticated methods are not available, or as a benchmark for the evaluation of other downscaling methods. Applied to output from reanalyses (ECMWF, NCEP) in the region of the European Alps, the method is capable of reducing large biases in the precipitation frequency distribution, even for high quantiles. The two variants exhibit similar performance, but the ideal choice of method can depend on the GCM/reanalysis, and it is recommended to test the methods in each case. Limitations of the method are found in small areas with unresolved topographic detail that influence higher-order statistics (e.g. high quantiles). When used as a benchmark for three regional climate models (RCMs), the corrected reanalysis and the RCMs perform similarly in many regions, but the added value of the latter is evident for high quantiles in some small regions.
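A minimal sketch of the simplest variant described above (correcting wet-day frequency and intensity), assuming an illustrative wet-day threshold and synthetic daily series rather than the authors' data or exact implementation:

```python
# Wet-day frequency and intensity correction of "GCM" precipitation against
# "observations". Thresholds and series are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
obs = rng.gamma(0.6, 6.0, 3650) * (rng.random(3650) < 0.45)   # "observed" daily precip (mm/day)
gcm = rng.gamma(0.9, 2.0, 3650) * (rng.random(3650) < 0.70)   # drizzle-prone "GCM" precip

wet_obs = 1.0                                   # assumed observed wet-day threshold (mm/day)
f_wet = np.mean(obs >= wet_obs)                 # observed wet-day frequency

# 1) Frequency correction: choose a GCM threshold reproducing the observed frequency
thr_gcm = np.quantile(gcm, 1.0 - f_wet)

# 2) Intensity correction: scale GCM wet-day amounts to match the observed mean intensity
s = obs[obs >= wet_obs].mean() / gcm[gcm >= thr_gcm].mean()

gcm_corr = np.where(gcm >= thr_gcm, s * gcm, 0.0)
print(f"wet-day freq   obs={f_wet:.2f}   corrected GCM={np.mean(gcm_corr > 0):.2f}")
print(f"mean intensity obs={obs[obs >= wet_obs].mean():.2f}  "
      f"corrected GCM={gcm_corr[gcm_corr > 0].mean():.2f}")
```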
Abstract:
With the advent of mass digitization projects, such as the Google Book Search, a peculiar shift has occurred in the way that copyright works are dealt with. Contrary to what has so far been the case, works are turned into machine-readable data to be automatically processed for various purposes without the expression of the works being displayed to the public. In the Google Book Settlement Agreement, this new kind of usage is referred to as ‘non-display uses’ of digital works. The legitimacy of these uses has not yet been tested by the courts and does not fit comfortably within current copyright doctrine, simply because the works are not used as works but as something else, namely as data. Since non-display uses may prove to be a very lucrative market in the near future, with the potential to affect the way people use copyright works, we examine non-display uses through the prism of copyright principles to determine the boundaries of their legitimacy. Through this examination, we provide a categorization of the activities carried out under the heading of ‘non-display uses’, examine their lawfulness under current copyright doctrine, and approach the phenomenon from the perspective of data protection law, which could apply, by analogy, to the use of copyright works as processable data.
Abstract:
This paper explores the relationship between national institutional archetypes and investments in training and development. A recent trend within the literature on comparative capitalism has been to explore the nature and extent of heterogeneity within the coordinated market economies (CMEs) of Europe. Based on a review of the existing comparative literature on training and development, and comparative firm-level survey evidence of differences in training and development practices, we both support and critique existing country clusters and argue for a more nuanced and flexible categorization.
Abstract:
The recession of mountain glaciers around the world has been linked to anthropogenic climate change, and small glaciers (e.g. < 2 km²) are thought to be particularly vulnerable, with reports of their disappearance from several regions. However, the response of small glaciers to climate change can be modulated by non-climatic factors such as topography and debris cover, and there remain a number of regions where their recent change has evaded scrutiny. This paper presents results of the first multi-year remote sensing survey of glaciers in the Kodar Mountains, the only glaciers in SE Siberia, which we compare to previous glacier inventories from this continental setting that reported total glacier areas of 18.8 km² in ca. 1963 (12.6 km² of exposed ice) and 15.5 km² in 1974 (12 km² of exposed ice). Mapping their debris-covered termini is difficult, but delineation of debris-free ice on Landsat imagery reveals 34 glaciers with a total area of 11.72 ± 0.72 km² in 1995, followed by a reduction to 9.53 ± 0.29 km² in 2001 and 7.01 ± 0.23 km² in 2010. This represents a ~44% decrease in exposed glacier ice between ca. 1963 and 2010, but with 40% lost since 1995 and with individual glaciers losing as much as 93% of their exposed ice. Thus, although continental glaciers are generally thought to be less sensitive than their maritime counterparts, a recent acceleration in the shrinkage of exposed ice has taken place, and we note its coincidence with a strong summer warming trend in the region initiated at the start of the 1980s. Whilst smaller and shorter glaciers have, proportionally, tended to shrink more rapidly, we find no statistically significant relationship between shrinkage and elevation characteristics, aspect or solar radiation. This is probably due to the small sample size, limited elevation range, and topographic setting of the glaciers in deep valley heads. Furthermore, many of the glaciers possess debris-covered termini, and it is likely that the ablation of buried ice is lagging behind the shrinkage of exposed ice, such that a growth in the proportion of debris cover is occurring, as observed elsewhere. If recent trends continue, we hypothesise that these glaciers could evolve into a type of rock glacier within the next few decades, introducing additional complexity in their response and delaying their potential demise.
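A quick arithmetic check of the quoted shrinkage percentages, using only the exposed-ice areas given above:

```python
# Exposed-ice areas from the abstract (km^2) and the implied percentage losses.
exposed = {1963: 12.6, 1995: 11.72, 2001: 9.53, 2010: 7.01}
print(f"1963-2010: {100 * (1 - exposed[2010] / exposed[1963]):.0f}% loss")  # ~44%
print(f"1995-2010: {100 * (1 - exposed[2010] / exposed[1995]):.0f}% loss")  # ~40%
```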
Abstract:
Background: Since their inception, Twitter and related microblogging systems have provided a rich source of information for researchers and have attracted interest in their affordances and use. Since 2009 PubMed has included 123 journal articles on medicine and Twitter, but no overview exists of how the field uses Twitter in research. // Objective: This paper aims to identify published work relating to Twitter indexed by PubMed, and then to classify it. This classification will provide a framework in which future researchers will be able to position their work, and provide an understanding of the current reach of research using Twitter in medical disciplines. Limiting the study to papers indexed by PubMed ensures the work provides a reproducible benchmark. // Methods: Papers on Twitter and related topics, indexed by PubMed, were identified and reviewed. The papers were then qualitatively classified based on each paper's title and abstract to determine their focus. The work that was Twitter-focused was studied in detail to determine what data, if any, it was based on, and from this a categorization of the data set size used in the studies was developed. Using open-coded content analysis, additional important categories were also identified, relating to the primary methodology, domain and aspect. // Results: As of 2012, PubMed comprises more than 21 million citations from the biomedical literature, and from these a corpus of 134 potentially Twitter-related papers was identified, eleven of which were subsequently found not to be relevant. There were no papers prior to 2009 relating to microblogging, a term first used in 2006. Of the remaining 123 papers which mentioned Twitter, thirty were focussed on Twitter (the others referring to it tangentially). The early Twitter-focussed papers introduced the topic and highlighted its potential without carrying out any form of data analysis. The majority of published papers used analytic techniques to sort through thousands, if not millions, of individual tweets, often depending on automated tools to do so. Our analysis demonstrates that researchers are starting to use knowledge discovery methods and data mining techniques to understand vast quantities of tweets: the study of Twitter is becoming quantitative research. // Conclusions: This work is, to the best of our knowledge, the first overview study of medical-related research based on Twitter and related microblogging. We have used five dimensions to categorise published medical-related research on Twitter. This classification provides a framework within which researchers studying the development and use of Twitter within medical-related research, and those undertaking comparative studies of research relating to Twitter in the area of medicine and beyond, can position and ground their work.
Abstract:
The effect of stratospheric radiative damping time scales on stratospheric variability and on stratosphere–troposphere coupling is investigated in a simplified global circulation model by modifying the vertical profile of radiative damping in the stratosphere while holding it fixed in the troposphere. Perpetual-January conditions are imposed, with sinusoidal topography of zonal wavenumber 1 or 2. The depth and duration of the simulated sudden stratospheric warmings closely track the lower-stratospheric radiative time scales. Simulations with the most realistic profiles of radiative damping exhibit extended time-scale recoveries analogous to polar-night jet oscillation (PJO) events, which are observed to follow sufficiently deep stratospheric warmings. These events are characterized by weak lower-stratospheric winds and enhanced stability near the tropopause, which persist for up to 3 months following the initial warming. They are obtained with both wave-1 and wave-2 topography. Planetary-scale Eliassen–Palm (EP) fluxes entering the vortex are also suppressed, which is in agreement with observed PJO events. Consistent with previous studies, the tropospheric jets shift equatorward in response to the warmings. The duration of the shift is closely correlated with the period of enhanced stability. The magnitude of the shift in these runs, however, is sensitive only to the zonal wavenumber of the topography. Although the shift is sustained primarily by synoptic-scale eddies, the net effect of the topographic form drag and the planetary-scale fluxes is not negligible; they damp the surface wind response but enhance the vertical shear. The tropospheric response may also reduce the generation of planetary waves, further extending the stratospheric dynamical time scales.
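For illustration, a minimal sketch of the kind of Newtonian relaxation such simplified models use, with the radiative damping timescale varied above an assumed transition pressure and held fixed below it; the timescales and transition pressure are placeholder values, not those of the study:

```python
# Newtonian relaxation of temperature toward equilibrium with a pressure-dependent
# damping timescale: dT/dt = -(T - T_eq) / tau(p). Numbers are illustrative only.
import numpy as np

def damping_timescale(p_hPa, tau_trop=40.0, tau_strat=10.0, p_trans=100.0):
    """Damping timescale in days: tau_trop below p_trans (troposphere, held fixed),
    tau_strat above it (stratosphere, the quantity varied between experiments)."""
    return np.where(p_hPa > p_trans, tau_trop, tau_strat)

def newtonian_tendency(T, T_eq, p_hPa):
    """Temperature tendency in K per day."""
    return -(T - T_eq) / damping_timescale(p_hPa)

p = np.array([1000.0, 500.0, 100.0, 10.0, 1.0])   # pressure levels (hPa)
print(newtonian_tendency(T=np.full(5, 220.0), T_eq=np.full(5, 210.0), p_hPa=p))
```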
Abstract:
Recent radiocarbon dates obtained from two soil cores taken through the Marlborough Castle mound, Wiltshire, show the main body of it to be a monument contemporaneous with Silbury Hill, dating to the second half of the 3rd millennium cal BC. In light of these dates, this paper considers the sequence identified within the cores, which includes two possible flood events early in the construction of the mound. It also describes four cores taken through the surrounding ditch, as well as small-scale work to the north-east of the mound. The topographic location of the mound in a low-lying area and close to rivers and springs is discussed, and the potential for Late Neolithic sites nearby is set out, with the land to the south of the mound identified as an area for future research. The paper ends with the prospect that other apparent mottes in Wiltshire and beyond may well also have prehistoric origins.
Abstract:
Tests of the new Rossby wave theories that have been developed over the past decade to account for discrepancies between theoretical wave speeds and those observed by satellite altimeters have focused primarily on the surface signature of such waves. It appears, however, that the surface signature of the waves acts only as a rather weak constraint, and that information on the vertical structure of the waves is required to better discriminate between competing theories. Due to the lack of 3-D observations, this paper uses high-resolution model data to construct realistic vertical structures of Rossby waves and compares these to structures predicted by theory. The meridional velocity of a section at 24° S in the Atlantic Ocean is pre-processed using the Radon transform to select the dominant westward signal. Normalized profiles are then constructed using three complementary methods based respectively on: (1) averaging vertical profiles of velocity, (2) diagnosing the amplitude of the Radon transform of the westward propagating signal at different depths, and (3) EOF analysis. These profiles are compared to profiles calculated using four different Rossby wave theories: standard linear theory (SLT), SLT plus mean flow, SLT plus topographic effects, and theory including both mean flow and topographic effects. Our results support the classical theoretical assumption that westward propagating signals have a well-defined vertical modal structure associated with a phase speed independent of depth, in contrast with the conclusions of a recent study using the same model but for different locations in the North Atlantic. The model structures are in general surface-intensified, with a sign reversal at depth in some regions, notably occurring at shallower depths in the East Atlantic. SLT provides a good fit to the model structures in the top 300 m, but grossly overestimates the sign reversal at depth. The addition of mean flow slightly improves the latter issue, but the resulting structure is too surface-intensified. SLT plus topography rectifies the overestimation of the sign reversal, but overestimates the amplitude of the structure for much of the layer above the sign reversal. Combining the effects of mean flow and topography provides the best fit to the mean model profiles, although small errors at the surface and mid-depths are carried over from the individual effects of mean flow and topography respectively. Across the section, the best-fitting theory varies between SLT plus topography and topography with mean flow, with, in general, SLT plus topography performing better in the east, where the sign reversal is less pronounced. None of the theories could accurately reproduce the deeper sign reversals in the west. All theories performed badly at the boundaries. The generalization of this method to other latitudes, oceans, models and baroclinic modes would provide greater insight into the variability in the ocean, while better observational data would allow verification of the model findings.
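A minimal sketch, in the spirit of the Radon-transform preprocessing described above, of scanning a synthetic longitude-time (Hovmöller) section for the dominant westward propagation speed; the grid spacing, wave parameters and speed range are illustrative assumptions, not the study's configuration:

```python
# For each trial speed c, shift every time slice so lines x - c*t = const become
# vertical, average over time, and take the variance of that projection.
# The variance peaks when c matches the true westward propagation speed.
import numpy as np

nt, nx = 365, 100                         # days, longitude points (assumed)
dx_km, dt_days = 50.0, 1.0                # grid spacing (assumed)
x = np.arange(nx) * dx_km
t = np.arange(nt) * dt_days
c_true = -3.0                             # km/day, westward (synthetic truth)

T, X = np.meshgrid(t, x, indexing="ij")
v = np.cos(2 * np.pi * (X - c_true * T) / 2500.0)      # synthetic westward wave
v += 0.3 * np.random.default_rng(2).normal(size=v.shape)

def projection_energy(field, c):
    """Variance of the time-mean field after aligning along speed c."""
    shifted = np.empty_like(field)
    for i, ti in enumerate(t):
        shifted[i] = np.interp(x, x - c * ti, field[i], period=x[-1] + dx_km)
    return shifted.mean(axis=0).var()

speeds = np.linspace(-6.0, 0.0, 61)       # scan westward speeds (km/day)
energy = [projection_energy(v, c) for c in speeds]
print("estimated westward speed (km/day):", speeds[int(np.argmax(energy))])
```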
Abstract:
As the calibration and evaluation of flood inundation models are a prerequisite for their successful application, there is a clear need to ensure that the performance measures that quantify how well models match the available observations are fit for purpose. This paper evaluates the binary pattern performance measures that are frequently used to compare flood inundation models with observations of flood extent. This evaluation considers whether these measures are able to calibrate and evaluate model predictions in a credible and consistent way, i.e. identifying the underlying model behaviour for a number of different purposes such as comparing models of floods of different magnitudes or on different catchments. Through theoretical examples, it is shown that the binary pattern measures are not consistent for floods of different sizes, such that for the same vertical error in water level, a model of a flood of large magnitude appears to perform better than a model of a smaller magnitude flood. Further, the commonly used Critical Success Index (usually referred to as F<2>) is biased in favour of overprediction of the flood extent, and is also biased towards correctly predicting areas of the domain with smaller topographic gradients. Consequently, it is recommended that future studies consider carefully the implications of reporting conclusions using these performance measures. Additionally, future research should consider whether a more robust and consistent analysis could be achieved by using elevation comparison methods instead.
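For reference, the Critical Success Index compares binary wet/dry maps as hits / (hits + misses + false alarms); a minimal sketch on toy arrays (the array names and example maps are illustrative, not from the paper):

```python
# Critical Success Index for binary flood-extent maps (1 = wet, 0 = dry).
import numpy as np

def critical_success_index(observed, modelled):
    obs = np.asarray(observed, dtype=bool)
    mod = np.asarray(modelled, dtype=bool)
    hits = np.sum(obs & mod)           # wet in both observation and model
    misses = np.sum(obs & ~mod)        # observed wet, modelled dry
    false_alarms = np.sum(~obs & mod)  # observed dry, modelled wet
    return hits / (hits + misses + false_alarms)

obs = np.array([[1, 1, 0, 0],
                [1, 1, 0, 0]])
mod = np.array([[1, 1, 1, 0],          # overpredicts one cell
                [1, 0, 0, 0]])         # misses one cell
print(critical_success_index(obs, mod))   # 3 / (3 + 1 + 1) = 0.6
```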
Abstract:
Social tagging has become very popular across the Internet as well as in research. The main idea behind tagging is to allow users to provide metadata about web content from their own perspective, to facilitate categorization and retrieval. Many factors influence users' tag choice, and many studies have been conducted to reveal these factors by analysing tagging data. This paper uses two theories to identify these factors, namely semiotics theory and activity theory. The former treats tags as signs and the latter treats tagging as an activity. The paper uses both theories to analyse tagging behaviour by explaining all aspects of a tagging system, including tags, tagging system components and the tagging activity. The theoretical analysis produced a framework that was used to identify a number of factors. These factors can be considered as categories that can be consulted to redirect users' tag choices in order to support particular tagging behaviour, such as cross-lingual tagging.
Abstract:
Tagging provides support for the retrieval and categorization of online content, depending on users' tag choice. A number of models of tagging behaviour have been proposed to identify factors that are considered to affect taggers, such as users' tagging history. In this paper, we use semiotic analysis and activity theory to study the effect the system designer has on tagging behaviour. The framework we use shows the components that comprise a tagging system and how they interact to direct tagging behaviour. We analysed two collaborative tagging systems, CiteULike and Delicious, by applying our framework to their components. Using datasets from both systems, we found that 35% of CiteULike users did not provide tags, compared to only 0.1% of Delicious users. This was directly linked to the type of tools used by the system designer to support tagging.
Abstract:
The central sector of the last British–Irish Ice Sheet (BIIS) was characterised by considerable complexity, both in terms of its glacial stratigraphy and geomorphological signature. This complexity is reflected by the large number and long history of papers that have attempted to decipher the glaciodynamic history of the region. Despite significant advances in our understanding, reconstructions remain hotly debated and relatively local, thereby hindering attempts to piece together BIIS dynamics. This paper seeks to address these issues by reviewing geomorphological mapping evidence of palimpsest flow signatures and providing an up-to-date stratigraphy of the region. Reconciling geomorphological and sedimentological evidence with relative and absolute dating constraints has allowed us to develop a new six-stage glacial model of ice-flow history and behaviour in the central sector of the last BIIS, with three major phases of glacial advance. This includes: I. Eastwards ice flow through prominent topographic corridors of the north Pennines; II. Cessation of the Stainmore ice flow pathway and northwards migration of the North Irish Sea Basin ice divide; III. Stagnation and retreat of the Tyne Gap Ice Stream; IV. Blackhall Wood–Gosforth Oscillation; V. Deglaciation of the Solway Lowlands; and VI. Scottish Re-advance and subsequent final retreat of ice out of the central sector of the last BIIS. The ice sheet was characterised by considerable dynamism, with flow switches, initiation (and termination) of ice streams, draw-down of ice into marine ice streams, repeated ice-marginal fluctuations and the production of large volumes of meltwater, locally impounded to form ice-dammed glacial lakes. Significantly, we tie this reconstruction to work carried out and models developed for the entire ice sheet. This therefore situates research in the central sector within contemporary understanding of how the last BIIS evolved over time.
Abstract:
The aim of the current study is to investigate motion event cognition in second language learners in a higher education learning context. Based on recent findings showing that speakers of grammatical aspect languages like English attend less to the endpoint (goal) of events than speakers of non-aspect languages like Swedish in a nonverbal categorization task involving working memory (Athanasopoulos & Bylund, 2013; Bylund & Athanasopoulos, this issue), the current study asks whether native speakers of an aspect language start paying more attention to event endpoints when learning a non-aspect language. Native English and German (a non-aspect language) speakers, and English learners of L2 German, who were pursuing studies in German language and literature at an English university, were asked to match a target scene with an intermediate degree of endpoint orientation with two alternate scenes with low and high degrees of endpoint orientation, respectively. Results showed that, when compared to the native English speakers, the learners of German were more prone to base their similarity judgements on endpoint saliency, rather than ongoingness, primarily as a function of increasing L2 proficiency and year of university study. Further analyses revealed a non-linear relationship between length of L2 exposure and categorization patterns, subserved by a progressive strengthening of the relationship between L2 proficiency and categorization as length of exposure increased. These findings present evidence that cognitive restructuring may occur through increasing experience with an L2, but also suggest that this relationship may be complex, and unfold over a long period of time.
Abstract:
The convectively active part of the Madden-Julian Oscillation (MJO) propagates eastward through the warm pool, from the Indian Ocean through the Maritime Continent (the Indonesian archipelago) to the western Pacific. The Maritime Continent's complex topography means the exact nature of the MJO propagation through this region is unclear. Model simulations of the MJO are often poor over the region, leading to local errors in latent heat release and global errors in medium-range weather prediction and climate simulation. Using 14 northern winters of TRMM satellite data, it is shown that, where the mean diurnal cycle of precipitation is strong, 80% of the MJO precipitation signal in the Maritime Continent is accounted for by changes in the amplitude of the diurnal cycle. Additionally, the relationship between outgoing long-wave radiation (OLR) and precipitation is weakened here, such that OLR is no longer a reliable proxy for precipitation. The canonical view of the MJO as the smooth eastward propagation of a large-scale precipitation envelope also breaks down over the islands of the Maritime Continent. Instead, a vanguard of precipitation (anomalies of 2.5 mm day⁻¹ over 10⁶ km²) jumps ahead of the main body by approximately 6 days or 2000 km. Hence, there can be enhanced precipitation over Sumatra, Borneo or New Guinea when the large-scale MJO envelope over the surrounding ocean is one of suppressed precipitation. This behaviour can be accommodated into existing MJO theories. Frictional and topographic moisture convergence and relatively clear skies ahead of the main convective envelope combine with the low thermal inertia of the islands to allow a rapid response in the diurnal cycle, which rectifies onto the lower-frequency MJO. Hence, accurate representations of the diurnal cycle and its scale interaction appear to be necessary for models to simulate the MJO successfully.
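A minimal sketch, not the paper's diagnostic, of how the co-variation between intraseasonal precipitation and diurnal-cycle amplitude might be estimated from 3-hourly data at a single grid point; the synthetic series and the crude running-mean filter are assumptions for illustration only:

```python
# Synthetic 3-hourly precipitation whose diurnal amplitude grows and shrinks with a
# slowly varying envelope; estimate how much of the intraseasonal daily-mean signal
# is linearly explained by the diurnal-cycle amplitude.
import numpy as np

rng = np.random.default_rng(3)
n_days, per_day = 720, 8                                # two seasons of 3-hourly data
t = np.arange(n_days)

mjo = 4.0 * (1 + np.sin(2 * np.pi * t / 45.0))          # slow envelope (mm/day)
hours = np.arange(per_day) * 3.0
diurnal_shape = np.clip(np.cos(2 * np.pi * (hours - 15.0) / 24.0), 0, None)  # afternoon peak

precip = mjo[:, None] * (0.5 + diurnal_shape[None, :]) \
         + rng.gamma(1.0, 1.0, size=(n_days, per_day))  # plus noise

daily_mean = precip.mean(axis=1)
diurnal_amp = precip.max(axis=1) - precip.min(axis=1)

def lowpass(x, window=30):
    """Crude 30-day running mean as a stand-in for intraseasonal filtering."""
    return np.convolve(x, np.ones(window) / window, mode="same")

dm_anom = lowpass(daily_mean - daily_mean.mean())
amp_anom = lowpass(diurnal_amp - diurnal_amp.mean())

# Fraction of intraseasonal daily-mean variance linearly explained by the
# diurnal-cycle amplitude (squared correlation); high here by construction.
r = np.corrcoef(dm_anom, amp_anom)[0, 1]
print(f"variance explained by diurnal amplitude: {100 * r**2:.0f}%")
```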