910 results for Long-Polling, GCM, Google Cloud Messaging, RESTful Web services, Push, Notifiche
Abstract:
Summary based on that of the publication
Abstract:
In the second decade of the 21st century we are witnessing a complete transformation of the teaching and learning of foreign languages in general, and of Spanish in particular. In this new context, new technologies applied to education play a very important role. This universe is known as Web 2.0. The advantages and drawbacks of using these technologies in the language classroom are presented, focusing on four digital tools that offer the most possibilities for energizing Spanish classes: blogs, wikis, podcasting, and Google Docs or cloud computing.
Abstract:
When publishing information on the web, one expects it to reach all the people who could be interested in it. This is mainly achieved with general-purpose indexing and search engines such as Google, the most widely used today. In the particular case of the geographic information (GI) domain, exposing content to mainstream search engines is a complex task that requires specific actions. On many occasions it is convenient to provide a web site with a specially tailored search engine. Such is the case for online dictionaries (Wikipedia, WordReference), stores (Amazon, eBay), and generally all sites holding thematic databases. Due to the proliferation of these engines, A9.com proposed a standard interface called OpenSearch, used by modern web browsers to manage custom search engines. Geographic information can also benefit from the use of specific search engines. We can distinguish two main approaches in GI information retrieval efforts: on one hand, classical OGC standards (CSW, WFS filters), which are very complex for the mainstream user; on the other, the neogeographer's approach, usually in the form of specific APIs lacking a common query interface and standard geographic formats. A draft 'geo' extension for OpenSearch has been proposed. It adds geographic filtering to queries and recommends a set of simple standard geographic response formats, such as KML, Atom and GeoRSS. This proposal enables standardization while keeping simplicity, thus covering a wide range of use cases in both the OGC and neogeography paradigms. In this article we analyze the OpenSearch geo extension and its use cases in detail, demonstrating its applicability to both the SDI and the geoweb. Open source implementations are presented as well.
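To make the proposed interface concrete: an OpenSearch description document publishes a URL template whose placeholders a client fills in, and the draft 'geo' extension adds parameters such as a bounding box. The Python sketch below fills a template of this kind. The endpoint and template are hypothetical, invented for illustration; a real service advertises its own template in its description document.

from urllib.parse import quote

# Hypothetical OpenSearch URL template with the standard {searchTerms}
# placeholder and a bounding-box parameter from the draft 'geo' extension.
TEMPLATE = ("https://example.org/search?q={searchTerms}"
            "&bbox={geo:box}&format=atom")

def build_query(terms: str, west: float, south: float,
                east: float, north: float) -> str:
    # Substitute the search terms and the geo:box value
    # (west,south,east,north) into the template.
    url = TEMPLATE.replace("{searchTerms}", quote(terms))
    return url.replace("{geo:box}", f"{west},{south},{east},{north}")

print(build_query("rivers", -9.5, 35.9, 4.4, 43.8))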
Abstract:
As a result of the course 'Applications of Web 2.0 in Research and Teaching', organized by the Fundación General Universidad de Granada-Empresa in the 2009-2010 academic year, the authors of this paper present the Teaching Innovation Project awarded by the Vice-Rectorate for Quality Assurance of the Universidad de Granada, entitled 'Use of the Google Docs tool in university teaching within the framework of the EHEA'. We believe the project's design is applicable to any subject in the current Licentiate and/or Bachelor's degree in Pharmacy. This paper describes the pilot experience carried out in the Pharmaceutical Chemistry course, including the development, objectives, methodology, results and the conclusions being obtained as it proceeds.
Abstract:
The amateur birding community has a long and proud tradition of contributing to bird surveys and bird atlases. Coordinated activities such as Breeding Bird Atlases and the Christmas Bird Count are examples of "citizen science" projects. With the advent of technology, Web 2.0 sites such as eBird have been developed to facilitate online sharing of data and thus increase the potential for real-time monitoring. However, as recently articulated in an editorial in this journal and elsewhere, monitoring is best served when based on a priori hypotheses. Harnessing citizen scientists to collect data following a hypothetico-deductive approach carries challenges. Moreover, the use of citizen science in scientific and monitoring studies has raised issues of data accuracy and quality. These issues are compounded when data collection moves into the Web 2.0 world. An examination of the literature from social geography on the concept of "citizen sensors" and volunteered geographic information (VGI) yields thoughtful reflections on the challenges of data quality/data accuracy when applying information from citizen sensors to research and management questions. VGI has been harnessed in a number of contexts, including for environmental and ecological monitoring activities. Here, I argue that conceptualizing a monitoring project as an experiment following the scientific method can further contribute to the use of VGI. I show how principles of experimental design can be applied to monitoring projects to better control for data quality of VGI. This includes suggestions for how citizen sensors can be harnessed to address issues of experimental controls and how to design monitoring projects to increase randomization and replication of sampled data, hence increasing scientific reliability and statistical power.
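As a hedged sketch of the randomization and replication principles argued for above, the following Python assigns volunteer sampling sites at random to survey windows so that each condition receives the same number of replicates; the site and window names are invented for illustration.

import random

random.seed(42)  # reproducible assignment
sites = [f"site_{i:02d}" for i in range(12)]   # hypothetical sampling sites
windows = ["dawn", "midday", "dusk"]           # hypothetical survey windows

random.shuffle(sites)                          # randomization
replicates = 4                                 # replication: sites per window
assignment = {w: sites[i * replicates:(i + 1) * replicates]
              for i, w in enumerate(windows)}
for window, group in assignment.items():
    print(window, group)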
Abstract:
Results are presented from a new web application called OceanDIVA - Ocean Data Intercomparison and Visualization Application. This tool reads hydrographic profiles and ocean model output and presents the data on either depth levels or isotherms for viewing in Google Earth, or as probability density functions (PDFs) of regional model-data misfits. As part of the CLIVAR Global Synthesis and Observations Panel, an intercomparison of water mass properties of various ocean syntheses has been undertaken using OceanDIVA. Analysis of model-data misfits reveals significant differences between the water mass properties of the syntheses, such as the ability to capture mode water properties.
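The misfit-PDF idea is simple to state in code. The following Python sketch (not OceanDIVA itself) forms model-minus-observation temperature differences and estimates their probability density with a histogram; synthetic numbers stand in for real hydrographic profiles and model output.

import numpy as np

rng = np.random.default_rng(0)
obs_temp = rng.normal(15.0, 2.0, size=1000)              # observed profiles (degC)
model_temp = obs_temp + rng.normal(0.3, 0.8, size=1000)  # model with bias and noise

misfit = model_temp - obs_temp                           # model-data misfit
density, edges = np.histogram(misfit, bins=40, density=True)
centres = 0.5 * (edges[:-1] + edges[1:])                 # bin centres of the PDF
print(f"mean misfit = {misfit.mean():+.2f} degC, "
      f"PDF peak near {centres[density.argmax()]:.2f} degC")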
Abstract:
Understanding links between the El Niño-Southern Oscillation (ENSO) and snow would be useful for seasonal forecasting, but also for understanding natural variability and interpreting climate change predictions. Here, a 545-year run of the general circulation model HadCM3, with prescribed external forcings and fixed greenhouse gas concentrations, is used to explore the impact of ENSO on snow water equivalent (SWE) anomalies. In North America, positive ENSO events reduce the mean SWE and skew the distribution towards lower values, and vice versa during negative ENSO events. This is associated with a dipole SWE anomaly structure, with anomalies of opposite sign centered in western Canada and the central United States. In Eurasia, warm episodes lead to a more positively skewed distribution and the mean SWE is raised. Again, the opposite effect is seen during cold episodes. In Eurasia the largest anomalies are concentrated in the Himalayas. These correlations with the February SWE distribution are seen to exist from the previous June-July-August (JJA) ENSO index onwards, and are weakly detected in 50-year subsections of the control run, but only a shifted North American response can be detected in the analysis of 40 years of ERA40 reanalysis data. The ENSO signal in SWE from the long run could still contribute to regional predictions, although it would be only a weak indicator.
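A minimal sketch of the kind of conditional statistics described, assuming synthetic stand-ins for the model output: it compares the mean and skewness of February SWE anomalies between warm and cold ENSO phases.

import numpy as np

rng = np.random.default_rng(3)
nino34 = rng.normal(size=545)               # JJA ENSO index, one value per year
swe = -0.4 * nino34 + rng.normal(size=545)  # February SWE anomaly (toy relation)

def skew(x):
    # Third standardized moment.
    return np.mean((x - x.mean()) ** 3) / np.var(x) ** 1.5

for name, mask in (("warm", nino34 > 1.0), ("cold", nino34 < -1.0)):
    subset = swe[mask]
    print(f"{name}: mean={subset.mean():+.2f}, skew={skew(subset):+.2f}")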
Abstract:
Anomalous heavy snow during winter or spring has long been regarded as a possible precursor of deficient Indian monsoon rainfall during the subsequent summer. However, previous work in this field is inconclusive, in terms of the mechanism that communicates snow anomalies to the summer monsoon, and even the region from which snow has the most impact. In this study we explore these issues in coupled and atmosphere-only versions of the Hadley Centre model. A 1050-year control integration of the HadCM3 coupled model, which represents the seasonal cycle of snow cover over the Eurasian continent well, is analysed and shows evidence for weakened monsoons being preceded by strong snow forcing (in the absence of ENSO) over either the Himalaya/Tibetan Plateau or north/west Eurasia regions. However, empirical orthogonal function (EOF) analysis of springtime interannual variability in snow depth shows the leading mode to have opposite signs between these two regions, suggesting that competing mechanisms may be possible. To determine the dominant region, ensemble integrations are carried out using HadAM3, the atmospheric component of HadCM3, and a variety of anomalous snow forcing initial conditions obtained from the control integration of the coupled model. Forcings are applied during spring in separate experiments over the Himalaya/Tibetan Plateau and north/west Eurasia regions, in conjunction with climatological SSTs in order to avoid the direct effects of ENSO. With the aid of idealized forcing conditions in sensitivity tests, we demonstrate that forcing from the Himalaya region is dominant in this model via a Blanford-type mechanism involving reduced surface sensible heat and longwave fluxes, reduced heating of the troposphere over the Tibetan Plateau and, consequently, a reduced meridional tropospheric temperature gradient which weakens the monsoon during early summer. Snow albedo is shown to be key to the mechanism, explaining around 50% of the perturbation in sensible heating over the Tibetan Plateau and accounting for the majority of cooling through the troposphere.
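As a hedged illustration of the EOF analysis mentioned above, the Python below extracts the leading EOF of a (time x gridpoint) snow-depth anomaly matrix via singular value decomposition; synthetic anomalies stand in for the HadCM3 snow fields.

import numpy as np

rng = np.random.default_rng(1)
snow = rng.normal(size=(100, 500))   # 100 springs x 500 gridpoints (synthetic)
anom = snow - snow.mean(axis=0)      # remove the time mean at each gridpoint

u, s, vt = np.linalg.svd(anom, full_matrices=False)
eof1 = vt[0]                         # leading spatial pattern (EOF1)
pc1 = u[:, 0] * s[0]                 # its principal-component time series
var_frac = s[0] ** 2 / np.sum(s ** 2)
print(f"EOF1 explains {100 * var_frac:.1f}% of the variance")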
Abstract:
Although the potential importance of scattering of long-wave radiation by clouds has been recognised, most studies have concentrated on the impact of high clouds and few estimates of the global impact of scattering have been presented. This study shows that scattering in low clouds has a significant impact on outgoing long-wave radiation (OLR) in regions of marine stratocumulus (−3.5 W m−2 for overcast conditions) where the column water vapour is relatively low. This corresponds to an enhancement of the greenhouse effect of such clouds by 10%. The near-global impact of scattering on OLR is estimated to be −3.0 W m−2, with low clouds contributing −0.9 W m−2, mid-level cloud −0.7 W m−2 and high clouds −1.4 W m−2. Although this effect appears small compared to the global mean OLR of 240 W m−2, it indicates that neglect of scattering will lead to an error in cloud long-wave forcing of about 10% and an error in net cloud forcing of about 20%.
Abstract:
Although we have recognized many short-term benefits of agile methods, we still know very little about their long-term effects. In this panel, we discuss the long-term perspective of agile methods. The panelists are either industrial or academic representatives. They will discuss problems and benefits related to long-term system lifecycle management in agile projects. Ideally, the panel's outcome will provide ideas for future research.
Abstract:
The long-term effects of elevated atmospheric CO2 on the biosphere have been a focus of research for the last few decades. In this experiment, undisturbed soil monoliths of loess grassland were exposed to an elevated CO2 environment (twice the ambient CO2 level) for a period of six years using the open top chamber method. A control without a chamber or CO2 elevation was included as well. The elevated CO2 level had very little impact on the soil food web. It did not influence either root or microbial biomass, or microbial and nematode community structure. The only significant response was that the density of the bacterial-feeding genus Heterocephalobus increased in the chamber with elevated CO2 concentration. Application of the open top chambers initiated more changes in nematodes than the elevated CO2 level. The open top chamber (OTC) method decreased nematode density (total as well as plant feeders) to less than half of the original level. A negative effect was found at the genus level for the fungal feeder Aphelenchoides and the plant feeders Helicotylenchus and Paratylenchus. It is very likely that the significantly lower belowground root biomass, and partly its decreased quality as reflected by the increased C/N ratio, are the main factors responsible for the lower density of plant-feeding nematodes in the chamber plots. According to diversity profiles and the MI and MI(2-15) parameters, nematode communities in the open top chambers (at both ambient and elevated CO2 levels) appear more structured than those under normal circumstances six years after the start of the experiment.
Abstract:
Purpose - The purpose of this paper is to identify the most popular techniques used to rank a web page highly in Google. Design/methodology/approach - The paper presents the results of a study of 50 highly optimized web pages that were created as part of a Search Engine Optimization competition. The study focuses on the most popular techniques used to rank highest in this competition, and includes an analysis of the use of PageRank, number of pages, number of in-links, domain age and the use of third-party sites such as directories and social bookmarking sites. A separate study was made of 50 non-optimized web pages for comparison. Findings - The paper provides insight into the techniques that successful Search Engine Optimizers use to ensure a page ranks highly in Google, and recognizes the importance of PageRank and links, as well as directories and social bookmarking sites. Research limitations/implications - Only the top 50 web sites for a specific query were analyzed. Analyzing more web sites and comparing with similar studies in different competitions would provide more concrete results. Practical implications - The paper offers a revealing insight into the techniques used by industry experts to rank highly in Google, and the success or otherwise of those techniques. Originality/value - This paper fulfils an identified need for web sites and e-commerce sites keen to attract a wider web audience.
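Since the analysis leans on PageRank, a short power-iteration sketch may help make the metric concrete. The four-page link graph is invented for illustration; real PageRank runs on the full web graph, with extra handling for pages that have no outlinks.

import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}  # page -> pages it links to
n, d = 4, 0.85                                  # page count, damping factor

rank = np.full(n, 1.0 / n)                      # start from a uniform rank
for _ in range(100):                            # power iteration
    new = np.full(n, (1 - d) / n)               # teleportation term
    for page, outs in links.items():
        for target in outs:
            new[target] += d * rank[page] / len(outs)
    rank = new
print(rank.round(3))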
Abstract:
A poor representation of cloud structure in a general circulation model (GCM) is widely recognised as a potential source of error in the radiation budget. Here, we develop a new way of representing both horizontal and vertical cloud structure in a radiation scheme. This combines the ‘Tripleclouds’ parametrization, which introduces inhomogeneity by using two cloudy regions in each layer as opposed to one, each with different water content values, with ‘exponential-random’ overlap, in which clouds in adjacent layers are not overlapped maximally, but according to a vertical decorrelation scale. This paper, Part I of two, aims to parametrize the two effects such that they can be used in a GCM. To achieve this, we first review a number of studies for a globally applicable value of fractional standard deviation of water content for use in Tripleclouds. We obtain a value of 0.75 ± 0.18 from a variety of different types of observations, with no apparent dependence on cloud type or gridbox size. Then, through a second short review, we create a parametrization of decorrelation scale for use in exponential-random overlap, which varies the scale linearly with latitude from 2.9 km at the Equator to 0.4 km at the poles. When applied to radar data, both components are found to have radiative impacts capable of offsetting biases caused by cloud misrepresentation. Part II of this paper implements Tripleclouds and exponential-random overlap into a radiation code and examines both their individual and combined impacts on the global radiation budget using re-analysis data.
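The latitude dependence of the decorrelation scale quoted above is simple enough to state as code. The sketch below reads the parametrization as a linear interpolation in absolute latitude, from 2.9 km at the Equator to 0.4 km at the poles; that reading, and the function name, are ours rather than the paper's.

def decorrelation_scale_km(latitude_deg: float) -> float:
    # Linear in |latitude|: 2.9 km at 0 degrees, 0.4 km at 90 degrees.
    frac = abs(latitude_deg) / 90.0
    return 2.9 + (0.4 - 2.9) * frac

for lat in (0, 30, 60, 90):
    print(f"{lat:2d} deg -> {decorrelation_scale_km(lat):.2f} km")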
Abstract:
Reliably representing both horizontal cloud inhomogeneity and vertical cloud overlap is fundamentally important for the radiation budget of a general circulation model. Here, we build on the work of Part I of this two-part paper by applying a pair of parameterisations that account for horizontal inhomogeneity and vertical overlap to global re-analysis data. These are applied both together and separately in an attempt to quantify the effects of poor representation of the two components on the radiation budget. Horizontal inhomogeneity is accounted for using the "Tripleclouds" scheme, which uses two regions of cloud in each layer of a gridbox as opposed to one; vertical overlap is accounted for using "exponential-random" overlap, which aligns vertically continuous cloud according to a decorrelation height. These are applied to a sample of scenes from a year of ERA-40 data. The largest radiative effect of horizontal inhomogeneity is found in areas of marine stratocumulus; the effect of vertical overlap is fairly uniform, but with larger individual short-wave and long-wave effects in areas of deep tropical convection. The combined effect of the two parameterisations reduces the magnitude of the net top-of-atmosphere cloud radiative forcing (CRF) by 2.25 W m−2, with shifts of up to 10 W m−2 in areas of marine stratocumulus. The effects of the uncertainty in our parameterisations on the radiation budget are also investigated. It is found that the uncertainty in the impact of horizontal inhomogeneity is of order ±60%, while the uncertainty in the impact of vertical overlap is much smaller. This suggests an insensitivity of the radiation budget to the exact nature of the global decorrelation height distribution derived in Part I.
Abstract:
In this paper, observations by a ground-based vertically pointing Doppler lidar and sonic anemometer are used to investigate the diurnal evolution of boundary-layer turbulence in cloudless, cumulus and stratocumulus conditions. When turbulence is driven primarily by surface heating, such as in cloudless and cumulus-topped boundary layers, both the vertical velocity variance and skewness follow similar profiles, on average, to previous observational studies of turbulence in convective conditions, with a peak skewness of around 0.8 in the upper third of the mixed layer. When the turbulence is driven primarily by cloud-top radiative cooling, such as in the presence of nocturnal stratocumulus, it is found that the skewness is inverted in both sign and height: its minimum value of around −0.9 occurs in the lower third of the mixed layer. The profile of variance is consistent with a cloud-top cooling rate of around 30 W m−2. This is also consistent with the evolution of the thermodynamic profile and the rate of growth of the mixed layer into the stable nocturnal boundary layer from above. In conditions where surface heating occurs simultaneously with cloud-top cooling, the skewness is found to be useful for diagnosing the source of the turbulence, suggesting that long-term Doppler lidar observations would be valuable for evaluating boundary-layer parametrization schemes.
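The variance and skewness diagnostics above amount to moment calculations on a time series of vertical velocity at each height. A minimal sketch, assuming a synthetic positively skewed sample in place of real lidar data:

import numpy as np

rng = np.random.default_rng(2)
w = rng.gamma(2.0, 0.5, size=5000) - 1.0   # synthetic w' sample, skewed (m/s)

variance = np.var(w)                       # second moment about the mean
skewness = np.mean((w - w.mean()) ** 3) / variance ** 1.5
print(f"variance = {variance:.2f} m2/s2, skewness = {skewness:.2f}")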