884 results for Ubiquitous Computing, Pervasive Computing, Internet of Things, Cloud Computing
Abstract:
In order to calculate unbiased microphysical and radiative quantities in the presence of a cloud, it is necessary to know not only the mean water content but also the distribution of this water content. This article describes a study of the in-cloud horizontal inhomogeneity of ice water content, based on CloudSat data. In particular, by focusing on the relations with variables that are already available in general circulation models (GCMs), a parametrization of inhomogeneity that is suitable for inclusion in GCM simulations is developed. Inhomogeneity is defined in terms of the fractional standard deviation (FSD), which is given by the standard deviation divided by the mean. The FSD of ice water content is found to increase with the horizontal scale over which it is calculated and also with the thickness of the layer. The connection to cloud fraction is more complicated; for small cloud fractions FSD increases as cloud fraction increases, while FSD decreases sharply for overcast scenes. The relations to horizontal scale, layer thickness and cloud fraction are parametrized in a relatively simple equation. The performance of this parametrization is tested on an independent set of CloudSat data. The parametrization is shown to be a significant improvement on the assumption of a single-valued global FSD.
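As a minimal illustration (not the article's own code), the FSD defined above is simply the population standard deviation divided by the mean; the function name and sample values are hypothetical:

```python
from statistics import fmean, pstdev

def fractional_std(values):
    """Fractional standard deviation (FSD): standard deviation divided by the mean."""
    return pstdev(values) / fmean(values)

# Hypothetical ice-water-content samples along a horizontal transect.
fsd = fractional_std([0.1, 0.3, 0.2, 0.4])
```

A perfectly homogeneous field gives FSD = 0; larger FSD means stronger in-cloud inhomogeneity.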
Abstract:
The high computational cost of calculating the radiative heating rates in numerical weather prediction (NWP) and climate models requires that calculations are made infrequently, leading to poor sampling of the fast-changing cloud field and a poor representation of the feedback that would occur. This paper presents two related schemes for improving the temporal sampling of the cloud field. Firstly, the ‘split time-stepping’ scheme takes advantage of the independent nature of the monochromatic calculations of the ‘correlated-k’ method to split the calculation into gaseous absorption terms that are highly dependent on changes in cloud (the optically thin terms) and those that are not (optically thick). The small number of optically thin terms can then be calculated more often to capture changes in the grey absorption and scattering associated with cloud droplets and ice crystals. Secondly, the ‘incremental time-stepping’ scheme uses a simple radiative transfer calculation using only one or two monochromatic calculations representing the optically thin part of the atmospheric spectrum. These are found to be sufficient to represent the heating rate increments caused by changes in the cloud field, which can then be added to the last full calculation of the radiation code. We test these schemes in an operational forecast model configuration and find a significant improvement is achieved, for a small computational cost, over the current scheme employed at the Met Office. The ‘incremental time-stepping’ scheme is recommended for operational use, along with a new scheme to correct the surface fluxes for the change in solar zenith angle between radiation calculations.
Abstract:
Increasing optical depth poleward of 45° is a robust response to warming in global climate models. Much of this cloud optical depth increase has been hypothesized to be due to transitions from ice-dominated to liquid-dominated mixed-phase cloud. In this study, the importance of liquid-ice partitioning for the optical depth feedback is quantified for 19 Coupled Model Intercomparison Project Phase 5 models. All models show a monotonic partitioning of ice and liquid as a function of temperature, but the temperature at which ice and liquid are equally mixed (the glaciation temperature) varies by as much as 40 K across models. Models that have a higher glaciation temperature are found to have a smaller climatological liquid water path (LWP) and condensed water path and experience a larger increase in LWP as the climate warms. The ice-liquid partitioning curve of each model may be used to calculate the response of LWP to warming. It is found that the repartitioning between ice and liquid in a warming climate contributes at least 20% to 80% of the increase in LWP as the climate warms, depending on model. Intermodel differences in the climatological partitioning between ice and liquid are estimated to contribute at least 20% to the intermodel spread in the high-latitude LWP response in the mixed-phase region poleward of 45°S. It is hypothesized that a more thorough evaluation and constraint of global climate model mixed-phase cloud parameterizations and validation of the total condensate and ice-liquid apportionment against observations will yield a substantial reduction in model uncertainty in the high-latitude cloud response to warming.
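As an illustrative sketch only (not any model's actual scheme), a monotonic ice-liquid partitioning of the kind described above can be represented by a logistic curve of temperature that equals 0.5 at the glaciation temperature; `T_glac` and the width parameter are hypothetical:

```python
import math

def liquid_fraction(T, T_glac=250.0, width=10.0):
    """Fraction of cloud condensate that is liquid at temperature T (K).

    Monotonic in T; equals 0.5 at the glaciation temperature T_glac,
    where ice and liquid are equally mixed.
    """
    return 1.0 / (1.0 + math.exp(-(T - T_glac) / width))
```

Shifting `T_glac` between models by tens of kelvin, as the intercomparison found, changes how much condensate is liquid at any given temperature, and hence the climatological LWP.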
Abstract:
A recent field campaign in southwest England used numerical modeling integrated with aircraft and radar observations to investigate the dynamic and microphysical interactions that can result in heavy convective precipitation. The COnvective Precipitation Experiment (COPE) was a joint UK-US field campaign held during the summer of 2013 in the southwest peninsula of England, designed to study convective clouds that produce heavy rain leading to flash floods. The clouds form along convergence lines that develop regularly due to the topography. Major flash floods have occurred in the past, most famously at Boscastle in 2004. It has been suggested that much of the rain was produced by warm rain processes, similar to some flash floods that have occurred in the US. The overarching goal of COPE is to improve quantitative convective precipitation forecasting by understanding the interactions of the cloud microphysics and dynamics and thereby to improve NWP model skill for forecasts of flash floods. Two research aircraft, the University of Wyoming King Air and the UK BAe 146, obtained detailed in situ and remote sensing measurements in, around, and below storms on several days. A new fast-scanning X-band dual-polarization Doppler radar made 360-deg volume scans over 10 elevation angles approximately every 5 minutes, and was augmented by two UK Met Office C-band radars and the Chilbolton S-band radar. Detailed aerosol measurements were made on the aircraft and on the ground. This paper: (i) provides an overview of the COPE field campaign and the resulting dataset; (ii) presents examples of heavy convective rainfall in clouds containing ice and also in relatively shallow clouds through the warm rain process alone; and (iii) explains how COPE data will be used to improve high-resolution NWP models for operational use.
Abstract:
The collective representation within global models of aerosol, cloud, precipitation, and their radiative properties remains unsatisfactory. These constitute the largest source of uncertainty in predictions of climatic change and hamper the ability of numerical weather prediction models to forecast high-impact weather events. The joint European Space Agency (ESA)–Japan Aerospace Exploration Agency (JAXA) Earth Clouds, Aerosol and Radiation Explorer (EarthCARE) satellite mission, scheduled for launch in 2018, will help to resolve these weaknesses by providing global profiles of cloud, aerosol, precipitation, and associated radiative properties inferred from a combination of measurements made by its collocated active and passive sensors. EarthCARE will improve our understanding of cloud and aerosol processes by extending the invaluable dataset acquired by the A-Train satellites CloudSat, Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), and Aqua. Specifically, EarthCARE's cloud profiling radar, with 7 dB more sensitivity than CloudSat, will detect more thin clouds, and its Doppler capability will provide novel information on convection and on precipitating ice-particle and raindrop fall speeds. EarthCARE's 355-nm high-spectral-resolution lidar will measure cloud and aerosol extinction and optical depth directly and accurately. Combining this with backscatter and polarization information should lead to an unprecedented ability to identify aerosol type. The multispectral imager will provide a context for, and the ability to construct, the cloud and aerosol distribution in 3D domains around the narrow 2D retrieved cross section. The consistency of the retrievals will be assessed to within a target of ±10 W m⁻² on the (10 km)² scale by comparing the multiview broadband radiometer observations to the top-of-atmosphere fluxes estimated by 3D radiative transfer models acting on the retrieved 3D domains.
Abstract:
As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single-column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated each model's sensitivity to changes in SST, given a fixed reference state, and systematically compared the behavior of the models under the WTG and DGW methods. The sensitivity to SST depends on both the large-scale parameterization method and the choice of cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities which are not always monotonic. CRMs using either method show a similar relationship between mean precipitation rate and column relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles which are smoother and less top-heavy than those produced by the WTG simulations. These large-scale parameterization methods provide a useful tool to identify the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.
Abstract:
This paper describes new advances in the exploitation of oxygen A-band measurements from the POLDER3 sensor onboard PARASOL, a satellite platform within the A-Train. These developments result not only from accounting for the dependence of the POLDER oxygen parameters on cloud optical thickness τ and on the scene's geometrical conditions but also, and more importantly, from a finer understanding of the sensitivity of these parameters to cloud vertical extent. This sensitivity is made accessible by the multidirectional character of the POLDER measurements. For monolayer clouds, which represent most cloudy conditions, new oxygen parameters are obtained and calibrated from POLDER3 data collocated with measurements from the two active sensors of the A-Train: CALIOP/CALIPSO and CPR/CloudSat. From a parameterization that depends on (μs, τ), with μs the cosine of the solar zenith angle, a cloud top oxygen pressure (CTOP) and a cloud middle oxygen pressure (CMOP) are obtained, which are estimates of the actual cloud top and middle pressures (CTP and CMP). The performance of CTOP and CMOP is presented by cloud class following the ISCCP classification. In 2008, the correlation coefficient between CMOP and CMP is 0.81 for cirrostratus, 0.79 for stratocumulus, and 0.75 for deep convective clouds; between CTOP and CTP it is 0.75, 0.73, and 0.79 for the same cloud types. The score obtained by CTOP, defined as the confidence in the retrieval for a particular range of inferred values and a given error, is higher than that of the MODIS CTP estimate, and is highest for the most populated CTP bins. For liquid (ice) clouds and an error of 30 hPa (50 hPa), the score of CTOP reaches 50% (70%). From the difference between CTOP and CMOP, a first estimate of the cloud vertical extent h is possible.
A second estimate of h comes from the correlation between the angular standard deviation of the POLDER oxygen pressure, σPO2, and the cloud vertical extent. This correlation is studied in detail for liquid clouds and is shown to be spatially and temporally robust, except for clouds above land during winter months. Analysis of the correlation's dependence on the scene's characteristics leads to a parameterization providing h from σPO2. For liquid water clouds above ocean in 2008, the mean difference between the actual cloud vertical extent and that retrieved from σPO2 (from the pressure difference) is 5 m (−12 m). The standard deviation of the mean difference is close to 1000 m for both methods. POLDER estimates of the cloud geometrical thickness achieve a global score of 50% confidence for a relative error of 20% (40%) of the estimate for ice (liquid) clouds over ocean. These results remain to be validated outside of the CALIPSO/CloudSat track.
Abstract:
This study investigated the physical processes involved in the development of thunderstorms over the southwestern Amazon, hypothesizing causalities for the observed cloud-to-ground lightning variability and the local environmental characteristics. Every year the southwestern Amazon experiences a large variety of environmental factors, such as a gradual increase in atmospheric moisture, extremely high pollution due to biomass burning, and intense deforestation, which directly affect cloud development through differential surface energy partitioning. At the end of the dry period, higher percentages of positive cloud-to-ground (+CG) lightning were observed, owing to a relative increase in +CG-dominated thunderstorms (positive thunderstorms). Positive (negative) thunderstorms initiated preferentially over deforested (forest) areas with higher (lower) cloud base heights, shallower (deeper) warm cloud depths, and higher (lower) convective available potential energy. These features characterized the positive (negative) thunderstorms as deeper (relatively shallower) clouds, with stronger (relatively weaker) updrafts and enhanced (decreased) mixed-phase and cold-phase vertically integrated liquid. No significant difference between thunderstorms (negative and positive) and nonthunderstorms was observed in terms of atmospheric pollution, since the atmosphere was overwhelmed by pollution, leading to an updraft-limited regime. However, in the wet season both negative and positive thunderstorms occurred during periods of relatively higher aerosol concentration and differentiated size distributions, suggesting an aerosol-limited regime in which cloud electrification could depend on the aerosol concentration to suppress the warm phase and enhance the ice phase. The suggested causalities are consistent with the invoked hypotheses, but they are not observed facts; they are hypotheses based on plausible physical mechanisms.
Abstract:
The Amazon is one of the few continental regions where atmospheric aerosol particles and their effects on climate are not dominated by anthropogenic sources. During the wet season, the ambient conditions approach those of the pristine pre-industrial era. We show that the fine submicrometer particles accounting for most cloud condensation nuclei are predominantly composed of secondary organic material formed by oxidation of gaseous biogenic precursors. Supermicrometer particles, which are relevant as ice nuclei, consist mostly of primary biological material directly released from rainforest biota. The Amazon Basin appears to be a biogeochemical reactor, in which the biosphere and atmospheric photochemistry produce nuclei for clouds and precipitation sustaining the hydrological cycle. The prevailing regime of aerosol-cloud interactions in this natural environment is distinctly different from that in polluted regions.
Abstract:
Interest in attractive Bose-Einstein condensates arises from the instabilities generated when the number of trapped atoms exceeds a critical number. In this case, the recombination process promotes the collapse of the cloud; this behavior is normally geometry dependent. Within the mean-field approximation, the system is described by the Gross-Pitaevskii equation. We have considered an attractive Bose-Einstein condensate confined in a nonspherical trap, investigating the solutions numerically and analytically using controlled perturbation and self-similar approximation methods. This approximation is valid over the whole interval of the negative coupling parameter, allowing interpolation between the weak-coupling and strong-coupling limits. Using the self-similar approximation methods, accurate analytical formulas were derived. The expressions obtained are discussed for several different traps and may contribute to the understanding of experimental observations.
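For reference, the standard textbook form of the mean-field equation mentioned above (not reproduced from the thesis itself) is the Gross-Pitaevskii equation for the condensate wave function ψ, where an attractive interaction corresponds to a negative s-wave scattering length a:

```latex
i\hbar\,\frac{\partial \psi}{\partial t}
  = \left[ -\frac{\hbar^{2}}{2m}\,\nabla^{2}
           + V_{\mathrm{trap}}(\mathbf{r})
           + g\,\lvert\psi\rvert^{2} \right]\psi,
\qquad g = \frac{4\pi\hbar^{2} a}{m}, \quad a < 0 .
```

For a < 0 the cubic term is focusing, which is what drives the collapse above the critical atom number.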
Abstract:
This degree project includes both a theoretical component and a practical component within the graphic-profiling domain. Literature on graphic design was studied for the theoretical part, and the knowledge gained was used for the practical part. The project was to produce graphic material for “Rookiefestivalen” in Hultsfred. In modern society it is important for a corporation to distinguish itself from others by using a graphic profile. Through a graphic profile the company can have an effect on how it is perceived by others. This is true not only for corporations: organisations, societies and other events will also benefit from the use of a graphic profile. The material that has been produced is not within the traditional boundaries of graphic profiling; it is rather marketing material for the festival. The graphic profile was applied to a variety of items such as posters, flyers, ads and the festival's website. The result can be seen in the appendix of this degree project and on the Internet: http://www.rookierockparty.se
Abstract:
Information and communication technology (ICT) is a subject that is being discussed as a tool used within education around the world. Furthermore, it can be seen as a tool for teachers to individualize students' education. Students with literacy difficulties, such as dyslexia, are in constant need of new ways to learn, and new ways to be motivated to learn. The aim of this study is to see what research says about how ICT can be used as a tool to help students with literacy difficulties. Literacy difficulties can be due to a number of things, such as a student not having been taught how to read, trouble within the family causing distress, or a neurological disorder such as dyslexia. Furthermore, the main research questions will focus on how ICT compares with traditional forms of education, such as books and a more teacher-centered education within the classroom, and whether ICT is preferable. The results of this literature review indicate that ICT can be seen as a way for teachers to help students with literacy difficulties gain more self-esteem, something the literature tells us students with learning difficulties lack. The results also show how ICT can lead to a more individualized education. This is due to tools that increase reading comprehension and tools that give direct response when working with ICT, which help students work more independently.
Abstract:
By the time you read this column this story may have lost all its relevance, but it has caused a bit of a dust-up lately, so I think it deserves some further treatment. About two weeks ago, the cyberverse was all a-twitter about naked selfies, mainly of celebrities, that had been hacked right out of the cloud. Imagine that. What goes online isn't exactly private. Doh!
Abstract:
In journalism, "suítes" are follow-up stories that continue a fact already reported. As the press grows on the Internet, we can frequently see the same fact being repeated on news portals day after day. This work aims to measure the number of articles about a single subject that has started a suíte, with this measurement taken over the days in which it was covered. The results revealed patterns that identify the days on which the most relevant facts were reported, as well as the length of time over which the subject was developed. For this analysis, some of the most important facts that became suítes in Brazil over recent years were chosen. The article counts come from the country's largest news portal, G1, and from the Media Cloud Brasil database.
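The daily measurement described above can be sketched as grouping article publication dates by day and counting them; the function name and sample dates are hypothetical, not taken from the study:

```python
from collections import Counter
from datetime import date

def articles_per_day(publication_dates):
    """Map each day to the number of articles on the running story published that day."""
    return dict(sorted(Counter(publication_dates).items()))

# Hypothetical publication dates for one suíte.
timeline = articles_per_day([
    date(2016, 3, 4), date(2016, 3, 4), date(2016, 3, 5),
])
# The most populated day suggests when the most relevant development was reported.
peak_day = max(timeline, key=timeline.get)
```

Plotting such a timeline over the life of a suíte exposes the coverage peaks and the total span of days the subject was explored.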