937 results for ubiquitous and transparent clouds


Relevance: 30.00%

Abstract:

Three-dimensional (3D) hierarchical nanoscale architectures composed of building blocks with specifically engineered morphologies are expected to play important roles in the fabrication of 'next generation' microelectronic and optoelectronic devices due to their high surface-to-volume ratio as well as their opto-electronic properties. Herein, a series of well-defined 3D hierarchical rutile TiO2 architectures (HRT) were successfully prepared using a facile hydrothermal method without any surfactant or template, simply by changing the concentration of hydrochloric acid used in the synthesis. The production of these materials provides, to the best of our knowledge, the first identified example of a ledgewise growth mechanism in a rutile TiO2 structure. Also for the first time, a dye-sensitized solar cell (DSC) incorporating an HRT is reported in conjunction with a high-extinction-coefficient metal-free organic sensitizer (D149), achieving a conversion efficiency of 5.5%, which is superior to cells employing P25 (4.5%) and comparable to the state-of-the-art commercial transparent titania anatase paste (5.8%). Further to this, an overall conversion efficiency of 8.6% was achieved when HRT was used as the light-scattering layer, a considerable improvement over the commercial transparent/reflector titania anatase paste (7.6%) and a significantly smaller performance gap than has been seen previously.

Relevance: 30.00%

Abstract:

Conjugated polymers are promising materials for electrochromic device technology. Aqueous dispersions of poly(3,4-ethylenedioxythiophene) (PEDOT) were spin-coated onto transparent conducting oxide (TCO) coated glass substrates. A seven-layer electrochromic device was fabricated with the following configuration: glass/TCO/PEDOT (main electrochromic layer)/gel electrolyte/Prussian blue (counter electrode)/TCO/glass. The device fabricated with the Prussian blue counter electrode showed a contrast of 18%, while the device without a counter electrode showed a visible contrast of 5%, both at 632 nm and a voltage of 1.9 V. The two devices are compared in terms of their colouration efficiency.
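The colouration efficiency used for the comparison is conventionally defined as the change in optical density per unit of injected charge density. A minimal sketch of that calculation, using hypothetical values (the abstract reports only the contrast figures, not the injected charge):

```python
import math

def colouration_efficiency(t_bleached, t_coloured, charge_density):
    """Colouration efficiency (cm^2/C): change in optical density
    per unit injected charge density (C/cm^2)."""
    delta_od = math.log10(t_bleached / t_coloured)
    return delta_od / charge_density

# Hypothetical illustration (not values from the abstract): a device
# switching from 60% to 42% transmittance at 632 nm (an 18% contrast)
# with 2 mC/cm^2 of injected charge.
eta = colouration_efficiency(0.60, 0.42, 2e-3)
print(f"colouration efficiency ~ {eta:.0f} cm^2/C")
```

Under these assumed numbers the efficiency comes out near 77 cm^2/C; the figure of merit rewards devices that achieve a large transmittance swing with little charge.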

Relevance: 30.00%

Abstract:

Transparent glasses of BaNaB9O15 (BNBO) were fabricated via the conventional melt-quenching technique. The amorphous and glassy nature of the as-quenched samples were confirmed by x-ray powder diffraction and differential scanning calorimetry (DSC), respectively. The glass transition and crystallization parameters were evaluated under non-isothermal conditions using DSC. The correlation between the heating-rate-dependent glass transition and crystallization temperatures was studied, and the Kauzmann temperature was deduced for BNBO glass plates and powdered samples; the values were 776 K and 768 K, respectively. An approximation-free method was used to evaluate the crystallization kinetic parameters of the BNBO glass samples. The effect of sample thickness on the crystallization kinetics of BNBO glasses was also investigated.
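The abstract does not spell out its approximation-free method. For orientation only, a widely used way to extract crystallization kinetics from heating-rate-dependent DSC peak temperatures is the Kissinger relation, ln(β/Tp²) = −Ea/(R·Tp) + const; a sketch with entirely hypothetical data:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def kissinger_activation_energy(rates, peak_temps):
    """Least-squares slope of ln(beta/Tp^2) vs 1/Tp equals -Ea/R;
    returns the activation energy Ea in J/mol."""
    xs = [1.0 / t for t in peak_temps]
    ys = [math.log(b / t ** 2) for b, t in zip(rates, peak_temps)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope * R

# Hypothetical DSC crystallization peaks (K) at four heating rates (K/min)
rates = [5, 10, 15, 20]
peaks = [860.0, 868.0, 873.0, 877.0]
print(f"Ea ~ {kissinger_activation_energy(rates, peaks) / 1e3:.0f} kJ/mol")
```

This is a standard textbook analysis, not the paper's method; the reported approximation-free approach avoids the assumptions baked into relations of this kind.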

Relevance: 30.00%

Abstract:

In the world today there are many ways in which we measure, count and determine whether something is worth the effort. In Australia and many other countries, new government legislation requires government-funded entities to become more transparent in their practice and to develop a more cohesive narrative about the worth, or impact, of their work for the betterment of society. This places the executives of such entities in a position of needing evaluative thinking and practice to guide how they build the narrative that documents and demonstrates this type of impact. In thinking about where to start, executives, project and program managers may consider this workshop as a professional development opportunity to explore both the intended and unintended consequences of performance models as tools of evaluation. This workshop will offer participants an opportunity to unpack the place of performance models as an evaluative tool through the following questions:
· What shape does an ethical, sound and valid performance measure for an organization or its personnel take?
· What role does cultural specificity play in the design and development of a performance model for an organization or for personnel?
· How are stakeholders able to identify risk during the design and development of such models?
· When and where will dissemination strategies be required?
· And so what? How can you determine that your performance model implementation has made a difference, now or in the future?

Relevance: 30.00%

Abstract:

In the first half of the twentieth century the dematerializing of boundaries between enclosure and exposure problematized traditional expectations of the domestic environment. At the same time, as a space of escalating technological control, the modern domestic interior also offered new potential to redefine the meaning and means of habitation. The inherent tension between these opposing forces is particularly evident in the introduction of new electric lighting technology and applications into the modern domestic interior in the mid-twentieth century. Addressing this nexus of technology and domestic psychology, this article examines the critical role of electric lighting in regulating and framing both the public and private occupation of Philip Johnson's New Canaan estate. Exploring the dialectically paired transparent Glass House and opaque Guest House, this study illustrates how Johnson employed electric light to negotiate the visual environment of the estate as well as to help sustain a highly aestheticized domestic lifestyle. Contextualized within the existing literature, this analysis provides a more nuanced understanding of the New Canaan estate as an expression of Johnson's interests as a designer as well as a subversion of traditional suburban conventions.

Relevance: 30.00%

Abstract:

A Delay Tolerant Network (DTN) is a dynamic, fragmented, and ephemeral network formed by a large number of highly mobile autonomous nodes, which requires distributed and self-organised approaches to trust management. Revocation and replacement of security credentials under adversarial influence, while preserving trust in the entity, is still an open problem; existing methods are mostly limited to the detection and removal of malicious nodes. This paper makes use of the mobility property to provide a distributed, self-organising, and scalable revocation and replacement scheme. The proposed scheme uses the Leverage of Common Friends (LCF) trust system concepts to revoke compromised security credentials and replace them with new ones while preserving trust in them, so that the level of entity confidence already achieved is retained. The security and performance of the proposed scheme are evaluated using an experimental data set, in comparison with other schemes based on the LCF concept. Our extensive experimental results show that the proposed scheme distributes replacement credentials up to 35% faster and spreads spoofed credentials of strongly collaborating adversaries up to 50% slower, without any significant increase in communication and storage overheads, when compared to other LCF-based schemes.
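The abstract gives no mechanics for the LCF-based scheme. Purely as a toy illustration of the common-friends idea — a node adopts a replacement credential only once enough of its trusted friends vouch for it — one might sketch:

```python
def spread_replacement(friends, seeds, threshold=2, rounds=10):
    """Toy model: in each round every node checks its friends; a node
    adopts the replacement credential once at least `threshold` of its
    friends already hold it (the common-friends endorsement idea)."""
    adopted = set(seeds)
    for _ in range(rounds):
        for node in friends:
            if node in adopted:
                continue
            vouchers = sum(1 for f in friends[node] if f in adopted)
            if vouchers >= threshold:
                adopted.add(node)
    return adopted

# Hypothetical trust graph: node -> set of trusted friends
friends = {
    0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 4},
    3: {1, 4}, 4: {2, 3},
}
# Nodes 0 and 1 start with the new credential; node 2 has two adopted
# friends and joins, while nodes 3 and 4 never reach the threshold.
print(sorted(spread_replacement(friends, {0, 1})))  # [0, 1, 2]
```

The threshold is what slows an adversary down: a spoofed credential spreads only as fast as the attacker can accumulate vouching friends, which is the intuition behind the 50% slower spread reported above.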

Relevance: 30.00%

Abstract:

In the first half of the twentieth century the dematerializing of boundaries between enclosure and exposure problematized traditional acts of “occupation” and understandings of the domestic environment. As a space of escalating technological control, the modern domestic interior offered new potential to re-define the meaning and means of habitation. This shift is clearly expressed in the transformation of electric lighting technology and applications for the modern interior in the mid-twentieth century. Addressing these issues, this paper examines the critical role of electric lighting in regulating and framing both the public and private occupation of Philip Johnson’s New Canaan estate. Exploring the dialectically paired transparent Glass House and opaque Guest House (both 1949), this study illustrates how Johnson employed artificial light to control the visual environment of the estate as well as to aestheticize the performance of domestic space. Looking closely at the use of artificial light to create emotive effects as well as to intensify the experience of occupation, this revisiting of the iconic Glass House and lesser-known Guest House provides a more complex understanding of Johnson’s work and the means with which he inhabited his own architecture. Calling attention to the importance of Johnson serving as both architect and client, and his particular interest in exploring the new potential of architectural lighting in this period, this paper investigates Johnson’s use of electric light to support architectural narratives, maintain visual order and control, and to suit the nuanced desires of domestic occupation.

Relevance: 30.00%

Abstract:

Interstellar clouds are not featureless, but show quite complex internal structures of filaments and clumps when observed with high enough resolution. These structures have been generated by 1) turbulent motions driven mainly by supernovae, 2) magnetic fields working on the ions and, through neutral-ion collisions, on neutral gas as well, and 3) self-gravity pulling a dense clump together to form a new star. The study of the cloud structure gives us information on the relative importance of each of these mechanisms, and helps us to gain a better understanding of the details of the star formation process. Interstellar dust is often used as a tracer for the interstellar gas which forms the bulk of the interstellar matter. Some of the methods that are used to derive the column density are summarized in this thesis. A new method, which uses the scattered light to map the column density in large fields with high spatial resolution, is introduced. This thesis also takes a look at the grain alignment with respect to the magnetic fields. The aligned grains give rise to the polarization of starlight and dust emission, thus revealing the magnetic field. The alignment mechanisms have been debated for the last half century. The strongest candidate at present is the radiative torques mechanism. In the first four papers included in this thesis, the scattered light method of column density estimation is formulated, tested in simulations, and finally used to obtain a column density map from observations. They demonstrate that the scattered light method is a very useful and reliable tool in column density estimation, and is able to provide higher resolution than the near-infrared color excess method. These two methods are complementary. The derived column density maps are also used to gain information on the dust emissivity within the observed cloud. 
The two final papers present simulations of polarized thermal dust emission assuming that the alignment happens by the radiative torques mechanism. We show that the radiative torques can explain the observed decline of the polarization degree towards dense cores. Furthermore, the results indicate that the dense cores themselves might not contribute significantly to the polarized signal, and hence one needs to be careful when interpreting the observations and deriving the magnetic field.

Relevance: 30.00%

Abstract:

Architecture focuses on designing built environments in response to society’s needs, reflecting culture through materials and forms. The physical boundaries of the city have become blurred through the integration of digital media, connecting the physical environment with the digital. In the recent past the future was imagined as highly technological: Ridley Scott’s 1982 film Blade Runner is set in 2019 and introduces a world where supersized screens inject advertisements into a cluttered urban space. Now, in 2015, screens are central to everyday life, but in a completely different way from what had been imagined. Through ubiquitous computing and social media, information is abundant. Digital technologies have changed the way people relate to cities, supporting discussion on multiple levels and allowing citizens to be more vocal than ever before. We question how architects can use the affordances of urban informatics to obtain and navigate useful social information to inform design. This chapter investigates different approaches to engaging communities in the debate on cities; in particular, it aims to capture citizens’ opinions on the use and design of public places through both physical and digital discussions. In addition to traditional consultation methods, Web 2.0 platforms, urban screens, and mobile apps are used in the context of Brisbane, Australia to explore contemporary strategies of engagement (Gray 2014).

Relevance: 30.00%

Abstract:

The problem of reconstruction of a refractive-index distribution (RID) in optical refraction tomography (ORT) with optical path-length difference (OPD) data is solved using two adaptive-estimation-based extended-Kalman-filter (EKF) approaches. First, a basic single-resolution EKF (SR-EKF) is applied to a state variable model describing the tomographic process, to estimate the RID of an optically transparent refracting object from noisy OPD data. The initialization of the biases and covariances corresponding to the state and measurement noise is discussed. The state and measurement noise biases and covariances are adaptively estimated. An EKF is then applied to the wavelet-transformed state variable model to yield a wavelet-based multiresolution EKF (MR-EKF) solution approach. To numerically validate the adaptive EKF approaches, we evaluate them with benchmark studies of standard stationary cases, where comparative results with commonly used efficient deterministic approaches can be obtained. Detailed reconstruction studies for the SR-EKF and two versions of the MR-EKF (with Haar and Daubechies-4 wavelets) compare well with those obtained from a typically used variant of the (deterministic) algebraic reconstruction technique, the average correction per projection method, thus establishing the capability of the EKF for ORT. To the best of our knowledge, the present work contains unique reconstruction studies encompassing the use of EKF for ORT in single-resolution and multiresolution formulations, and also in the adaptive estimation of the EKF's noise covariances. (C) 2010 Optical Society of America
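The paper's state variable model and adaptive noise estimation are not reproduced here, but the core EKF predict/update step they build on is standard. A generic sketch (the scalar measurement model at the end is purely illustrative, not the ORT forward model):

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One extended-Kalman-filter predict/update step. f and h are the
    nonlinear transition and measurement maps; F and H their Jacobians."""
    x_pred = f(x)
    P_pred = F(x) @ P @ F(x).T + Q                 # predicted covariance
    y = z - h(x_pred)                              # innovation
    S = H(x_pred) @ P_pred @ H(x_pred).T + R       # innovation covariance
    K = P_pred @ H(x_pred).T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H(x_pred)) @ P_pred
    return x_new, P_new

# Purely illustrative scalar model: estimate a constant x from noisy
# quadratic measurements z ~ x**2 (not the actual ORT forward model).
x, P = np.array([1.5]), np.eye(1)
f = lambda x: x
F = lambda x: np.eye(1)
h = lambda x: x ** 2
H = lambda x: np.array([[2.0 * x[0]]])
for z in [4.1, 3.9, 4.05]:
    x, P = ekf_step(x, P, np.array([z]), f, F, h, H,
                    Q=1e-6 * np.eye(1), R=0.1 * np.eye(1))
print(x)  # estimate approaches sqrt(4) = 2
```

In the adaptive variants described above, Q and R would not be fixed constants as in this sketch but estimated online from the innovation statistics.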

Relevance: 30.00%

Abstract:

The Earth's ecosystems are protected from the dangerous part of the solar ultraviolet (UV) radiation by stratospheric ozone, which absorbs most of the harmful UV wavelengths. Severe depletion of stratospheric ozone has been observed in the Antarctic region, and to a lesser extent in the Arctic and midlatitudes. Concern about the effects of increasing UV radiation on human beings and the natural environment has led to ground-based monitoring of UV radiation. In order to achieve high-quality UV time series for scientific analyses, proper quality control (QC) and quality assurance (QA) procedures have to be followed. In this work, QC and QA practices are developed for Brewer spectroradiometers and NILU-UV multifilter radiometers, which measure in the Arctic and Antarctic regions, respectively; these practices are applicable to other UV instruments as well. The spectral features of UV radiation and the effects of the different factors affecting it were studied for the spectral UV time series at Sodankylä. The QA of the Finnish Meteorological Institute's (FMI) two Brewer spectroradiometers included daily maintenance, laboratory characterizations, the calculation of long-term spectral responsivity, data processing and quality assessment. New methods for the cosine correction, the temperature correction and the calculation of long-term changes in spectral responsivity were developed. Reconstructed UV irradiances were used as a QA tool for spectroradiometer data. The actual cosine correction factor was found to vary within 1.08-1.12 and 1.08-1.13 for the two Brewers. The temperature characterization showed a linear dependence between the instrument's internal temperature and the photon counts per cycle. Both Brewers have participated in international spectroradiometer comparisons and have shown good stability; the differences between the Brewers and the portable reference spectroradiometer QASUME have been within 5% during 2002-2010.
The features of the spectral UV radiation time series at Sodankylä were analysed for the period 1990-2001. No statistically significant long-term changes in UV irradiances were found, and the results were strongly dependent on the time period studied. Ozone was the dominant factor affecting UV radiation during springtime, whereas clouds played a more important role during summertime. During this work, the Antarctic NILU-UV multifilter radiometer network was established by the Instituto Nacional de Meteorología (INM) as a joint Spanish-Argentinian-Finnish cooperation project. As part of this work, the QC/QA practices of the network were developed. They included training of the operators, daily maintenance, regular lamp tests and solar comparisons with the travelling reference instrument. Drifts of up to 35% in the sensitivity of the channels of the NILU-UV multifilter radiometers were found during the first four years of operation, which emphasized the importance of proper QC/QA, including regular lamp tests, for multifilter radiometers as well. The effects of the drifts were corrected by scaling the site NILU-UV channels to those of the travelling reference NILU-UV. After correction, the mean ratios of erythemally-weighted UV dose rates measured during solar comparisons between the reference NILU-UV and the site NILU-UVs were 1.007±0.011 and 1.012±0.012 for Ushuaia and Marambio, respectively, for solar zenith angles up to 80°. Solar comparisons between the NILU-UVs and spectroradiometers showed a ±5% difference near local noon, which can be seen as proof of successful QC/QA procedures and transfer of irradiance scales. This work also showed that UV measurements made in the Arctic and Antarctic can be comparable with each other.
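The drift correction described above — scaling each site channel onto the travelling reference during a solar comparison — can be sketched as a per-channel scale factor. The channel names and readings below are hypothetical, not values from the work:

```python
def channel_scale_factors(site_counts, ref_counts):
    """Per-channel scale factors mapping a drifted site instrument's
    readings onto the travelling reference during a solar comparison."""
    return {ch: ref_counts[ch] / site_counts[ch] for ch in site_counts}

def corrected(site_counts, scales):
    """Apply the scale factors to subsequent site readings."""
    return {ch: v * scales[ch] for ch, v in site_counts.items()}

# Hypothetical simultaneous readings: the site instrument has drifted
# low in its 305 nm channel by roughly 26%.
ref  = {"305": 100.0, "320": 200.0, "340": 150.0}
site = {"305": 74.0,  "320": 196.0, "340": 149.0}
scales = channel_scale_factors(site, ref)
print(corrected(site, scales))  # matches the reference by construction
```

In practice the factors would be derived from many co-located measurements over a range of solar zenith angles, then applied to the site record between comparisons.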

Relevance: 30.00%

Abstract:

This thesis consists of four research papers and an introduction providing some background. The structure in the universe is generally considered to originate from quantum fluctuations in the very early universe. The standard lore of cosmology states that the primordial perturbations are almost scale-invariant, adiabatic, and Gaussian. A snapshot of the structure from the time when the universe became transparent can be seen in the cosmic microwave background (CMB). For a long time mainly the power spectrum of the CMB temperature fluctuations has been used to obtain observational constraints, especially on deviations from scale-invariance and pure adiabaticity. Non-Gaussian perturbations provide a novel and very promising way to test theoretical predictions. They probe beyond the power spectrum, or two-point correlator, since non-Gaussianity involves higher-order statistics. The thesis concentrates on the non-Gaussian perturbations arising in several situations involving two scalar fields, namely, hybrid inflation and various forms of preheating. First we go through some basic concepts -- such as cosmological inflation, reheating and preheating, and the role of scalar fields during inflation -- which are necessary for the understanding of the research papers. We also review the standard linear cosmological perturbation theory. The second order perturbation theory formalism for two scalar fields is developed. We explain what is meant by non-Gaussian perturbations, and discuss some difficulties in parametrisation and observation. In particular, we concentrate on the nonlinearity parameter. The prospects of observing non-Gaussianity are briefly discussed. We apply the formalism and calculate the evolution of the second order curvature perturbation during hybrid inflation. We estimate the amount of non-Gaussianity in the model and find that there is a possibility for an observational effect. The non-Gaussianity arising in preheating is also studied. 
We find that the level produced by the simplest model of instant preheating is insignificant, whereas standard preheating with parametric resonance as well as tachyonic preheating are prone to easily saturate and even exceed the observational limits. We also mention other approaches to the study of primordial non-Gaussianities, which differ from the perturbation theory method chosen in the thesis work.
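The nonlinearity parameter referred to above is conventionally defined, in the local model, by expanding the curvature perturbation to second order in its Gaussian part:

```latex
\zeta(\mathbf{x}) \;=\; \zeta_g(\mathbf{x})
  \;+\; \tfrac{3}{5}\, f_{\mathrm{NL}}
  \left[ \zeta_g^{2}(\mathbf{x}) - \langle \zeta_g^{2} \rangle \right]
```

so that $f_{\mathrm{NL}}$ measures the quadratic, non-Gaussian correction probed by higher-order statistics such as the bispectrum; this is the standard local-model convention, which the thesis may refine for its two-field scenarios.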

Relevance: 30.00%

Abstract:

New stars form in dense interstellar clouds of gas and dust called molecular clouds. The actual sites where the process of star formation takes place are the dense clumps and cores deeply embedded in molecular clouds. The details of the star formation process are complex and not completely understood. Thus, determining the physical and chemical properties of molecular cloud cores is necessary for a better understanding of how stars are formed. Some of the main features of the origin of low-mass stars, like the Sun, are already relatively well-known, though many details of the process are still under debate. The mechanism through which high-mass stars form, on the other hand, is poorly understood. Although it is likely that the formation of high-mass stars shares many properties similar to those of low-mass stars, the very first steps of the evolutionary sequence are unclear. Observational studies of star formation are carried out particularly at infrared, submillimetre, millimetre, and radio wavelengths. Much of our knowledge about the early stages of star formation in our Milky Way galaxy is obtained through molecular spectral line and dust continuum observations. The continuum emission of cold dust is one of the best tracers of the column density of molecular hydrogen, the main constituent of molecular clouds. Consequently, dust continuum observations provide a powerful tool to map large portions across molecular clouds, and to identify the dense star-forming sites within them. Molecular line observations, on the other hand, provide information on the gas kinematics and temperature. Together, these two observational tools provide an efficient way to study the dense interstellar gas and the associated dust that form new stars. The properties of highly obscured young stars can be further examined through radio continuum observations at centimetre wavelengths. 
For example, radio continuum emission carries useful information on conditions in the protostar+disk interaction region where protostellar jets are launched. In this PhD thesis, we study the physical and chemical properties of dense clumps and cores in both low- and high-mass star-forming regions. The sources are mainly studied in a statistical sense, but also in more detail. In this way, we are able to examine the general characteristics of the early stages of star formation, cloud properties on large scales (such as fragmentation), and some of the initial conditions of the collapse process that leads to the formation of a star. The studies presented in this thesis are mainly based on molecular line and dust continuum observations. These are combined with archival observations at infrared wavelengths in order to study the protostellar content of the cloud cores. In addition, centimetre radio continuum emission from young stellar objects (YSOs; i.e., protostars and pre-main sequence stars) is studied in this thesis to determine their evolutionary stages. 
The main results of this thesis are as follows: i) filamentary and sheet-like molecular cloud structures, such as infrared dark clouds (IRDCs), are likely to be caused by supersonic turbulence but their fragmentation at the scale of cores could be due to gravo-thermal instability; ii) the core evolution in the Orion B9 star-forming region appears to be dynamic and the role played by slow ambipolar diffusion in the formation and collapse of the cores may not be significant; iii) the study of the R CrA star-forming region suggests that the centimetre radio emission properties of a YSO are likely to change with its evolutionary stage; iv) the IRDC G304.74+01.32 contains candidate high-mass starless cores which may represent the very first steps of high-mass star and star cluster formation; v) SiO outflow signatures are seen in several high-mass star-forming regions which suggest that high-mass stars form in a similar way as their low-mass counterparts, i.e., via disk accretion. The results presented in this thesis provide constraints on the initial conditions and early stages of both low- and high-mass star formation. In particular, this thesis presents several observational results on the early stages of clustered star formation, which is the dominant mode of star formation in our Galaxy.

Relevance: 30.00%

Abstract:

Transport plays an important role in the distribution of long-lived gases such as ozone and water vapour in the atmosphere. Understanding the observed variability in these gases, as well as predicting future changes, therefore depends on our knowledge of the relevant atmospheric dynamics. This dissertation studies certain dynamical processes in the stratosphere and upper troposphere which influence the distribution of ozone and water vapour in the atmosphere. The planetary waves that originate in the troposphere drive the stratospheric circulation. They influence both the meridional transport of substances and the parameters of the polar vortices. In turn, temperatures inside the polar vortices influence the abundance of Polar Stratospheric Clouds (PSCs) and therefore the chemical ozone destruction. Wave forcing of the stratospheric circulation is not uniform during winter. The November-December averaged stratospheric eddy heat flux shows a significant anticorrelation with the January-February averaged eddy heat flux in the midlatitude stratosphere and troposphere. These intraseasonal variations are attributable to internal stratospheric vacillations. In the period 1979-2002, the wave forcing exhibited a negative trend which was confined to the second half of winter only. In the period 1958-2002, the area, strength and longevity of the Arctic polar vortices do not exhibit significant long-term changes, while the area with temperatures lower than the threshold temperature for PSC formation shows a statistically significant increase. However, the Arctic vortex parameters show significant decadal changes which are mirrored in the ozone variability. Monthly ozone tendencies in the Northern Hemisphere show significant correlations (|r|=0.7) with proxies of the stratospheric circulation. 
In the Antarctic, the springtime vortex in the lower stratosphere shows statistically significant trends in temperature, longevity and strength (but not in area) in the period 1979-2001. Analysis of the ozone and water vapour vertical distributions in the Arctic UTLS shows that layering below and above the tropopause is often associated with poleward Rossby wave-breaking. These observations together with calculations of cross-tropopause fluxes emphasize the importance of poleward Rossby wave breaking for the stratosphere-troposphere exchange in the Arctic.

Relevance: 30.00%

Abstract:

The Body Area Network (BAN) is an emerging technology that focuses on monitoring physiological data in, on and around the human body. BAN technology permits wearable and implanted sensors to collect vital data about the human body and transmit it to other nodes via low-energy communication. In this paper, we investigate interactions in terms of data flows between parties involved in BANs under four different scenarios targeting outdoor and indoor medical environments: hospital, home, emergency and open areas. Based on these scenarios, we identify data flow requirements between BAN elements such as sensors and control units (CUs) and parties involved in BANs such as the patient, doctors, nurses and relatives. Identified requirements are used to generate BAN data flow models. Petri Nets (PNs) are used as the formal modelling language. We check the validity of the models and compare them with the existing related work. Finally, using the models, we identify communication and security requirements based on the most common active and passive attack scenarios.
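A place/transition Petri net of the kind used for such data flow models can be captured in a few lines. The BAN flow below (sensor → control unit → doctor) is a hypothetical fragment, not one of the paper's models:

```python
class PetriNet:
    """Minimal place/transition net: a transition fires when every
    input place holds at least one token, consuming one token from
    each input and producing one in each output."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        assert self.enabled(name), f"{name} not enabled"
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical BAN data flow: a sensor reading is transmitted to the
# control unit (CU), then forwarded to the doctor's view.
net = PetriNet({"sensor_data": 1, "cu_buffer": 0, "doctor_view": 0})
net.add_transition("transmit", ["sensor_data"], ["cu_buffer"])
net.add_transition("forward", ["cu_buffer"], ["doctor_view"])
net.fire("transmit")
net.fire("forward")
print(net.marking)  # {'sensor_data': 0, 'cu_buffer': 0, 'doctor_view': 1}
```

Model checking in the paper's sense would then amount to exploring which markings are reachable and whether undesired flows (e.g. data reaching an unauthorised party) are ever enabled.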