110 results for Ubiquitous Computing, Pervasive Computing, Internet of Things, Cloud Computing


Relevance:

100.00%

Publisher:

Abstract:

During the VOCALS campaign, spaceborne satellite observations showed that travelling gravity wave packets, generated by geostrophic adjustment, resulted in perturbations to marine boundary layer (MBL) clouds over the south-east Pacific Ocean (SEP). Often, these perturbations were reversible in that passage of the wave resulted in the clouds becoming brighter (in the wave crest), then darker (in the wave trough) and subsequently recovering their properties after the passage of the wave. However, occasionally the wave packets triggered irreversible changes to the clouds, which transformed from closed mesoscale cellular convection to open form. In this paper we use large eddy simulation (LES) to examine the physical mechanisms that cause this transition. Specifically, we examine whether the clearing of the cloud is due to (i) the wave causing additional cloud-top entrainment of warm, dry air, or (ii) the additional condensation of liquid water onto the existing drops and the subsequent formation of drizzle. We find that, although the wave does cause additional drizzle formation, this is not the reason for the persistent clearing of the cloud; rather, it is the additional entrainment of warm, dry air into the cloud followed by a reduction in longwave cooling, although this only has a significant effect when the cloud is starting to decouple from the boundary layer. The result in this case is a change from a stratocumulus to a more patchy cloud regime. For the simulations presented here, cloud condensation nuclei (CCN) scavenging did not play an important role in the clearing of the cloud. The results have implications for understanding transitions between the different cellular regimes in MBL clouds.

Relevance:

100.00%

Publisher:

Abstract:

What is it that gives celebrities the voice and authority to do and say the things they do in the realm of development politics? Asked another way, how is celebrity practised and, simultaneously, how does this praxis make celebrity, personas, politics and, indeed, celebrities themselves? In this article, we explore this ‘celebrity praxis’ through the lens of the creation of the contemporary ‘development celebrity’ in those stars working for development writ large in the so-called Third World. Drawing on work in science studies, material cultures and the growing geo-socio-anthropologies of things, the key to understanding the material practices embedded in and creating development celebrity networks is the multiple and complex circulations of the everyday and bespectacled artefacts of celebrity. Conceptualised as the ‘celebrity–consumption–compassion complex’, the performances of development celebrities are as much about everyday events, materials, technologies, emotions and consumer acts as they are about the mediated and liquidised constructions of the stars who now ‘market’ development. Moreover, this complex is constructed by and constructs what we are calling ‘star/poverty space’ that works to facilitate the ‘expertise’ and ‘authenticity’ and, thus, elevated voice and authority, of development celebrities through poverty tours, photoshoots, textual and visual diaries, websites and tweets. In short, the creation of star/poverty space is performed through a kind of ‘materiality of authenticity’ that is at the centre of the networks of development celebrity. The article concludes with several brief observations about the politics, possibilities and problematics of development celebrities and the star/poverty spaces that they create.

Relevance:

100.00%

Publisher:

Abstract:

We present five new cloud detection algorithms over land based on dynamic threshold or Bayesian techniques, applicable to the Advanced Along Track Scanning Radiometer (AATSR) instrument, and compare these with the standard threshold-based SADIST cloud detection scheme. We use a manually classified dataset as a reference to assess algorithm performance and quantify the impact of each cloud detection scheme on land surface temperature (LST) retrieval. The use of probabilistic Bayesian cloud detection methods improves algorithm true skill scores by 8-9 % over SADIST (maximum score of 77.93 % compared to 69.27 %). We present an assessment of the impact of imperfect cloud masking, in relation to the reference cloud mask, on the retrieved AATSR LST, imposing a 2 K tolerance over a 3x3 pixel domain. We find an increase of 5-7 % in the observations falling within this tolerance when using Bayesian methods (maximum of 92.02 % compared to 85.69 %). We also demonstrate that the use of dynamic thresholds in the tests employed by SADIST can significantly improve performance, applicable to the cloud-test data to be provided by the Sea and Land Surface Temperature Radiometer (SLSTR) due to be launched on the Sentinel-3 mission (estimated 2014).
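
As an illustration of the skill metric referred to above, a true skill score (hit rate minus false-alarm rate) can be computed from a contingency table of a cloud mask against the manually classified reference. The sketch below is a minimal, illustrative version; the function name and the example counts are hypothetical and not taken from the paper.

```python
def true_skill_score(hits, misses, false_alarms, correct_negatives):
    """True skill score (Hanssen-Kuipers): hit rate minus false-alarm rate."""
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate

# Example with made-up pixel counts of a cloud mask versus the reference mask.
print(true_skill_score(hits=900, misses=100, false_alarms=150, correct_negatives=850))
```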

Relevance:

100.00%

Publisher:

Abstract:

Tagging provides support for retrieval and categorization of online content depending on users' tag choice. A number of models of tagging behaviour have been proposed to identify factors that are considered to affect taggers, such as users' tagging history. In this paper, we use semiotic analysis and activity theory to study the effect the system designer has on tagging behaviour. The framework we use shows the components that comprise the tagging system and how they interact to direct tagging behaviour. We analysed two collaborative tagging systems, CiteULike and Delicious, by applying our framework to their components. Using datasets from both systems, we found that 35% of CiteULike users did not provide tags, compared to only 0.1% of Delicious users. This was directly linked to the type of tools used by the system designer to support tagging.

Relevance:

100.00%

Publisher:

Abstract:

This chapter looks at three films whose Portuguese urban settings offer a privileged ground for the re-evaluation of the classical-modern-postmodern categorisation with regard to cinema. They are The State of Things (Wim Wenders, 1982), Foreign Land (Walter Salles and Daniela Thomas, 1995) and Mysteries of Lisbon (Raúl Ruiz, 2010). In them, the city is the place where characters lose their bearings, names, identities, and where vicious circles, mirrors, replicas and mise-en-abyme bring the vertiginous movement that had characterised the modernist city of 1920s cinema to a halt. Curiously, too, it is the place where so-called postmodern aesthetics finally finds an ideal home in self-ironical tales that expose the film medium’s narrative shortcomings. Intermedial devices, whether Polaroid stills or a cardboard cut-out theatre, are then resorted to in order to turn a larger-than-life reality into framed, manageable narrative miniatures. The scaled-down real, however, turns out to be a disappointing simulacrum, a memory ersatz that unveils the illusory character of cosmopolitan teleology. In my approach, I start by examining the intertwined and transnational genesis of these films that resulted in three correlated visions of the end of history and of storytelling, typical of postmodern aesthetics. I move on to consider intermedia miniaturism as an attempt to stop time within movement, an equation that inevitably brings to mind the Deleuzian movement-time binary, which I revisit in an attempt to disentangle it from the classical-modern opposition. I conclude by proposing reflexive stasis and scale reversal as the common denominator across all modern projects, hence, perhaps, a more advantageous model than modernity to signify artistic and political values.

Relevance:

100.00%

Publisher:

Abstract:

This is the second half of a two-part paper dealing with the social theoretic assumptions underlying system dynamics. In the first half it was concluded that analysing system dynamics using traditional, paradigm-based social theories is highly problematic. An innovative and potentially fruitful resolution is now proposed to these problems. In the first section it is argued that in order to find an appropriate social theoretic home for system dynamics it is necessary to look to a key exchange in contemporary social science: the agency/structure debate. This debate aims to move beyond both the theories based only on the actions of individual human agents, and those theories that emphasise only structural influences. Emerging from this debate are various theories that instead aim to unite the human agent view of the social realm with views that concentrate solely on system structure. It is argued that system dynamics is best viewed as being implicitly grounded in such theories. The main conclusion is therefore that system dynamics can contribute to an important part of social thinking by providing a formal approach for explicating social mechanisms. This conclusion is of general significance for system dynamics. However, the over-arching aim of the two-part paper is to increase the understanding of system dynamics in related disciplines. Four suggestions are therefore offered for how the system dynamics method might be extended further into the social sciences. It is argued that, presented in the right way, the formal yet contingent feedback causality thinking of system dynamics should diffuse widely in the social sciences and make a distinctive and important contribution to them. Felix qui potuit rerum cognoscere causas ('Happy is he who comes to know the causes of things'). Virgil, Georgics, Book II, line 490 (29 BCE).

Relevance:

100.00%

Publisher:

Abstract:

In order to calculate unbiased microphysical and radiative quantities in the presence of a cloud, it is necessary to know not only the mean water content but also the distribution of this water content. This article describes a study of the in-cloud horizontal inhomogeneity of ice water content, based on CloudSat data. In particular, by focusing on the relations with variables that are already available in general circulation models (GCMs), a parametrization of inhomogeneity that is suitable for inclusion in GCM simulations is developed. Inhomogeneity is defined in terms of the fractional standard deviation (FSD), which is given by the standard deviation divided by the mean. The FSD of ice water content is found to increase with the horizontal scale over which it is calculated and also with the thickness of the layer. The connection to cloud fraction is more complicated; for small cloud fractions FSD increases as cloud fraction increases while FSD decreases sharply for overcast scenes. The relations to horizontal scale, layer thickness and cloud fraction are parametrized in a relatively simple equation. The performance of this parametrization is tested on an independent set of CloudSat data. The parametrization is shown to be a significant improvement on the assumption of a single-valued global FSD.
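
Since the abstract defines the FSD as the standard deviation divided by the mean, a minimal sketch of computing it for in-cloud ice water content over a horizontal domain is shown below; the function and array names are illustrative only.

```python
import numpy as np

def fractional_std_dev(iwc):
    """Fractional standard deviation (FSD) of in-cloud ice water content:
    the standard deviation divided by the mean, over cloudy pixels only."""
    iwc = np.asarray(iwc, dtype=float)
    cloudy = iwc[iwc > 0.0]  # restrict to in-cloud values
    if cloudy.size == 0:
        return np.nan
    return cloudy.std() / cloudy.mean()

# Example: ice water content (kg m-3) along a slice of a model layer.
print(fractional_std_dev([0.0, 1.2e-4, 3.0e-4, 2.1e-4, 0.0, 4.5e-4]))
```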

Relevance:

100.00%

Publisher:

Abstract:

The high computational cost of calculating the radiative heating rates in numerical weather prediction (NWP) and climate models requires that calculations are made infrequently, leading to poor sampling of the fast-changing cloud field and a poor representation of the feedback that would occur. This paper presents two related schemes for improving the temporal sampling of the cloud field. Firstly, the ‘split time-stepping’ scheme takes advantage of the independent nature of the monochromatic calculations of the ‘correlated-k’ method to split the calculation into gaseous absorption terms that are highly dependent on changes in cloud (the optically thin terms) and those that are not (optically thick). The small number of optically thin terms can then be calculated more often to capture changes in the grey absorption and scattering associated with cloud droplets and ice crystals. Secondly, the ‘incremental time-stepping’ scheme uses a simple radiative transfer calculation using only one or two monochromatic calculations representing the optically thin part of the atmospheric spectrum. These are found to be sufficient to represent the heating rate increments caused by changes in the cloud field, which can then be added to the last full calculation of the radiation code. We test these schemes in an operational forecast model configuration and find a significant improvement is achieved, for a small computational cost, over the current scheme employed at the Met Office. The ‘incremental time-stepping’ scheme is recommended for operational use, along with a new scheme to correct the surface fluxes for the change in solar zenith angle between radiation calculations.
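
The 'incremental time-stepping' idea can be sketched as follows: a cheap calculation using only the optically thin monochromatic terms is repeated frequently, and its change since the last full call of the radiation code is added as an increment to the stored full heating rates. This is only a schematic sketch; the function and variable names are hypothetical and do not represent the Met Office implementation.

```python
def radiative_heating(step, full_interval, state,
                      full_radiation, thin_radiation, cache):
    """Schematic incremental time-stepping for radiative heating rates.

    full_radiation(state) -> heating rates from the complete correlated-k code
    thin_radiation(state) -> heating rates from one or two optically thin terms
    cache                 -> dict storing the results of the last full call
    """
    if step % full_interval == 0:
        cache["full"] = full_radiation(state)      # expensive, infrequent call
        cache["thin_ref"] = thin_radiation(state)  # cheap reference at that time
        return cache["full"]
    # Cheap update: the optically thin increment tracks changes in the cloud field.
    increment = thin_radiation(state) - cache["thin_ref"]
    return cache["full"] + increment
```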

Relevance:

100.00%

Publisher:

Abstract:

Increasing optical depth poleward of 45° is a robust response to warming in global climate models. Much of this cloud optical depth increase has been hypothesized to be due to transitions from ice-dominated to liquid-dominated mixed-phase cloud. In this study, the importance of liquid-ice partitioning for the optical depth feedback is quantified for 19 Coupled Model Intercomparison Project Phase 5 models. All models show a monotonic partitioning of ice and liquid as a function of temperature, but the temperature at which ice and liquid are equally mixed (the glaciation temperature) varies by as much as 40 K across models. Models that have a higher glaciation temperature are found to have a smaller climatological liquid water path (LWP) and condensed water path and experience a larger increase in LWP as the climate warms. The ice-liquid partitioning curve of each model may be used to calculate the response of LWP to warming. It is found that the repartitioning between ice and liquid in a warming climate contributes at least 20% to 80% of the increase in LWP as the climate warms, depending on model. Intermodel differences in the climatological partitioning between ice and liquid are estimated to contribute at least 20% to the intermodel spread in the high-latitude LWP response in the mixed-phase region poleward of 45°S. It is hypothesized that a more thorough evaluation and constraint of global climate model mixed-phase cloud parameterizations and validation of the total condensate and ice-liquid apportionment against observations will yield a substantial reduction in model uncertainty in the high-latitude cloud response to warming.
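
The monotonic ice-liquid partitioning described above is often summarised as a smooth curve of liquid fraction against temperature centred on the glaciation temperature. The sketch below uses a hypothetical tanh form purely for illustration; the CMIP5 models analysed in the paper each have their own curves.

```python
import numpy as np

def liquid_fraction(temperature_k, glaciation_temp_k, width_k=10.0):
    """Illustrative liquid fraction curve: 0.5 at the glaciation temperature,
    tending to 0 (all ice) when colder and 1 (all liquid) when warmer."""
    return 0.5 * (1.0 + np.tanh((temperature_k - glaciation_temp_k) / width_k))

# A model with a higher glaciation temperature holds less liquid at a given
# temperature, so warming repartitions more of its condensate from ice to liquid.
for t_glac in (250.0, 270.0):
    print(t_glac, liquid_fraction(np.array([255.0, 265.0, 275.0]), t_glac))
```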

Relevance:

100.00%

Publisher:

Abstract:

A recent field campaign in southwest England used numerical modeling integrated with aircraft and radar observations to investigate the dynamic and microphysical interactions that can result in heavy convective precipitation. The COnvective Precipitation Experiment (COPE) was a joint UK-US field campaign held during the summer of 2013 in the southwest peninsula of England, designed to study convective clouds that produce heavy rain leading to flash floods. The clouds form along convergence lines that develop regularly due to the topography. Major flash floods have occurred in the past, most famously at Boscastle in 2004. It has been suggested that much of the rain was produced by warm rain processes, similar to some flash floods that have occurred in the US. The overarching goal of COPE is to improve quantitative convective precipitation forecasting by understanding the interactions of the cloud microphysics and dynamics and thereby to improve NWP model skill for forecasts of flash floods. Two research aircraft, the University of Wyoming King Air and the UK BAe 146, obtained detailed in situ and remote sensing measurements in, around, and below storms on several days. A new fast-scanning X-band dual-polarization Doppler radar made 360-deg volume scans over 10 elevation angles approximately every 5 minutes, and was augmented by two UK Met Office C-band radars and the Chilbolton S-band radar. Detailed aerosol measurements were made on the aircraft and on the ground. This paper: (i) provides an overview of the COPE field campaign and the resulting dataset; (ii) presents examples of heavy convective rainfall in clouds containing ice and also in relatively shallow clouds through the warm rain process alone; and (iii) explains how COPE data will be used to improve high-resolution NWP models for operational use.

Relevance:

100.00%

Publisher:

Abstract:

The collective representation within global models of aerosol, cloud, precipitation, and their radiative properties remains unsatisfactory. They constitute the largest source of uncertainty in predictions of climatic change and hamper the ability of numerical weather prediction models to forecast high-impact weather events. The joint European Space Agency (ESA)–Japan Aerospace Exploration Agency (JAXA) Earth Clouds, Aerosol and Radiation Explorer (EarthCARE) satellite mission, scheduled for launch in 2018, will help to resolve these weaknesses by providing global profiles of cloud, aerosol, precipitation, and associated radiative properties inferred from a combination of measurements made by its collocated active and passive sensors. EarthCARE will improve our understanding of cloud and aerosol processes by extending the invaluable dataset acquired by the A-Train satellites CloudSat, Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), and Aqua. Specifically, EarthCARE’s cloud profiling radar, with 7 dB more sensitivity than CloudSat, will detect more thin clouds and its Doppler capability will provide novel information on convection, precipitating ice particle, and raindrop fall speeds. EarthCARE’s 355-nm high-spectral-resolution lidar will measure directly and accurately cloud and aerosol extinction and optical depth. Combining this with backscatter and polarization information should lead to an unprecedented ability to identify aerosol type. The multispectral imager will provide a context for, and the ability to construct, the cloud and aerosol distribution in 3D domains around the narrow 2D retrieved cross section. The consistency of the retrievals will be assessed to within a target of ±10 W m–2 on the (10 km)2 scale by comparing the multiview broadband radiometer observations to the top-of-atmosphere fluxes estimated by 3D radiative transfer models acting on retrieved 3D domains.

Relevance:

100.00%

Publisher:

Abstract:

As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state. We performed a systematic comparison of the WTG and DGW methods in different models, and a systematic comparison of the behavior of those models using the WTG method and the DGW method. The sensitivity to the SST depends on both the large-scale parameterization method and the choice of the cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities which are not always monotonic. CRMs using either the WTG or DGW method show a similar relationship between mean precipitation rate and column-relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles which are smoother and less top-heavy compared to those produced by the WTG simulations. These large-scale parameterization methods provide a useful tool to identify the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.
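
For context, the WTG method is commonly implemented by diagnosing the large-scale vertical velocity that would relax free-tropospheric temperature anomalies (relative to the reference profile) over a fixed time-scale. The sketch below is a generic illustration of that idea under stated assumptions, not the exact formulation used in the intercomparison.

```python
import numpy as np

def wtg_vertical_velocity(theta, theta_ref, dtheta_dz, tau_s=3.0 * 3600.0):
    """Weak temperature gradient sketch: the large-scale w (m s-1) that removes
    potential-temperature anomalies over a relaxation time tau_s, assuming
    w * dtheta/dz = (theta - theta_ref) / tau_s."""
    theta = np.asarray(theta, dtype=float)
    theta_ref = np.asarray(theta_ref, dtype=float)
    stability = np.maximum(np.asarray(dtheta_dz, dtype=float), 1e-4)  # avoid division by ~0
    return (theta - theta_ref) / (tau_s * stability)

# Example: a 1 K warm anomaly with stability 4 K km-1 gives w of roughly 2 cm s-1.
print(wtg_vertical_velocity(301.0, 300.0, 4.0e-3))
```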

Relevance:

100.00%

Publisher:

Abstract:

This paper describes new advances in the exploitation of oxygen A-band measurements from the POLDER3 sensor onboard PARASOL, a satellite platform within the A-Train. These developments result not only from an account of the dependence of the POLDER oxygen parameters on cloud optical thickness τ and on the scene's geometrical conditions but also, and more importantly, from a finer understanding of the sensitivity of these parameters to cloud vertical extent. This sensitivity is made possible by the multidirectional character of POLDER measurements. In the case of monolayer clouds, which represent most cloudy conditions, new oxygen parameters are obtained and calibrated from POLDER3 data collocated with the measurements of the two active sensors of the A-Train: CALIOP/CALIPSO and CPR/CloudSat. From a parameterization that is (μs, τ) dependent, with μs the cosine of the solar zenith angle, a cloud top oxygen pressure (CTOP) and a cloud middle oxygen pressure (CMOP) are obtained, which are estimates of the actual cloud top and middle pressures (CTP and CMP). Performances of CTOP and CMOP are presented by cloud class following the ISCCP classification. In 2008, the coefficient of correlation between CMOP and CMP is 0.81 for cirrostratus, 0.79 for stratocumulus and 0.75 for deep convective clouds. The coefficient of correlation between CTOP and CTP is 0.75, 0.73 and 0.79 for the same cloud types. The score obtained by CTOP, defined as the confidence in the retrieval for a particular range of inferred values and for a given error, is higher than that of the MODIS CTP estimate. CTOP scores are highest for the most populated CTP bins. For liquid (ice) clouds and an error of 30 hPa (50 hPa), the score of CTOP reaches 50% (70%). From the difference between CTOP and CMOP, a first estimate of the cloud vertical extent h is possible. A second estimate of h comes from the correlation between the angular standard deviation of the POLDER oxygen pressure σPO2 and the cloud vertical extent. This correlation is studied in detail for liquid clouds. It is shown to be spatially and temporally robust, except for clouds above land during winter months. The analysis of the correlation's dependence on the scene's characteristics leads to a parameterization providing h from σPO2. For liquid water clouds above ocean in 2008, the mean difference between the actual cloud vertical extent and the one retrieved from σPO2 (from the pressure difference) is 5 m (−12 m). The standard deviation of the mean difference is close to 1000 m for the two methods. POLDER estimates of the cloud geometrical thickness obtain a global score of 50% confidence for a relative error of 20% (40%) of the estimate for ice (liquid) clouds over ocean. These results need to be validated outside of the CALIPSO/CloudSat track.
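
To make the first estimate of cloud vertical extent mentioned above concrete: if CTOP and CMOP approximate the cloud-top and cloud-middle pressures, the geometric extent is roughly twice the top-to-middle pressure difference converted to a thickness. The sketch below uses a simple isothermal scale-height conversion; the numbers and names are illustrative assumptions, not the paper's calibrated parameterization.

```python
import numpy as np

def cloud_extent_from_oxygen_pressures(ctop_hpa, cmop_hpa, scale_height_m=8000.0):
    """Rough cloud geometric extent (m): twice the top-to-middle thickness,
    converting the pressure difference to height with an isothermal scale height."""
    # dz ~ H * ln(p_middle / p_top); the full extent is about twice that.
    return 2.0 * scale_height_m * np.log(cmop_hpa / ctop_hpa)

# Example: CTOP = 800 hPa, CMOP = 850 hPa -> an extent of roughly 1 km.
print(cloud_extent_from_oxygen_pressures(800.0, 850.0))
```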

Relevance:

100.00%

Publisher:

Abstract:

The identification, tracking, and statistical analysis of tropical convective complexes using satellite imagery are explored in the context of identifying feature points suitable for tracking. The feature points are determined based on the shape of the complexes using the distance transform technique. This approach has been applied to the determination of feature points for tropical convective complexes identified in a time series of global cloud imagery. The feature points are used to track the complexes, and from the tracks, statistical diagnostic fields are computed. This approach allows the nature and distribution of organized deep convection in the Tropics to be explored.
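
A minimal sketch of the distance-transform feature-point idea described above is given below, using SciPy; the brightness-temperature threshold and the labelling choices are illustrative, not those used in the paper.

```python
import numpy as np
from scipy import ndimage

def feature_points(brightness_temp, threshold_k=220.0):
    """Return one feature point per convective complex: the pixel with the
    largest distance-transform value, i.e. the point deepest inside the
    complex's shape."""
    mask = brightness_temp < threshold_k         # cold cloud tops
    labels, n_complexes = ndimage.label(mask)    # connected complexes
    dist = ndimage.distance_transform_edt(mask)  # distance to the complex edge
    points = []
    for i in range(1, n_complexes + 1):
        in_region = labels == i
        flat_index = np.argmax(np.where(in_region, dist, -1.0))
        points.append(np.unravel_index(flat_index, dist.shape))
    return points

# Example on a small synthetic image of brightness temperatures (K).
image = np.full((6, 8), 290.0)
image[1:5, 2:6] = 200.0  # one cold convective complex
print(feature_points(image))
```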

Relevance:

100.00%

Publisher:

Abstract:

A stochastic parameterization scheme for deep convection is described, suitable for use in both climate and NWP models. Theoretical arguments and the results of cloud-resolving models are discussed in order to motivate the form of the scheme. In the deterministic limit, it tends to a spectrum of entraining/detraining plumes and is similar to other current parameterizations. The stochastic variability describes the local fluctuations about a large-scale equilibrium state. Plumes are drawn at random from a probability distribution function (pdf) that defines the chance of finding a plume of given cloud-base mass flux within each model grid box. The normalization of the pdf is given by the ensemble-mean mass flux, and this is computed with a CAPE closure method. The characteristics of each plume produced are determined using an adaptation of the plume model from the Kain-Fritsch parameterization. Initial tests in the single-column version of the Unified Model verify that the scheme is effective in producing the desired distributions of convective variability without adversely affecting the mean state.
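
As a toy illustration of the random draw described above, the sketch below samples a plume ensemble whose expected total cloud-base mass flux matches a closure value; the exponential pdf and all parameter values are assumptions for illustration, not necessarily the scheme's exact choices.

```python
import numpy as np

def draw_plumes(mean_mass_flux, mean_plume_flux, grid_area_m2, rng=None):
    """Draw a random plume ensemble for one grid box.

    The number of plumes is Poisson-distributed, with its mean chosen so that
    the ensemble-average total cloud-base mass flux equals the closure value
    mean_mass_flux (kg m-2 s-1); individual plume fluxes (kg s-1) follow an
    exponential pdf with mean mean_plume_flux."""
    rng = rng or np.random.default_rng()
    expected_n = mean_mass_flux * grid_area_m2 / mean_plume_flux
    n_plumes = rng.poisson(expected_n)
    return rng.exponential(mean_plume_flux, size=n_plumes)

# Example: a closure mean flux of 0.02 kg m-2 s-1 over a (50 km)^2 grid box,
# with a mean per-plume mass flux of 1e7 kg s-1 (about 5 plumes on average).
fluxes = draw_plumes(0.02, 1.0e7, (5.0e4) ** 2)
print(len(fluxes), fluxes.sum())
```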