984 results for potential fields
Abstract:
[Book] The potential of electric light as a new building “material” was recognized in the 1920s and became a useful design tool by the mid-century. Skillful lighting allowed for theatricality, narrative, and a new emphasis on structure and space. The Structure of Light tells the story of the career of Richard Kelly, the field’s most influential figure. Six historians, architects, and practitioners explore Kelly’s unparalleled influence on modern architecture and his lighting designs for some of the 20th century’s most iconic buildings: Philip Johnson’s Glass House; Louis Kahn’s Kimbell Art Museum; Eero Saarinen’s GM Technical Center; and Mies van der Rohe’s Seagram Building, among many others. This beautifully illustrated history demonstrates the range of applications, building types, and artistic solutions he employed to achieve a “nocturnal modernity” that would render buildings evocatively different at night. The survival of Kelly’s rich correspondence and extensive diaries allows an in-depth look at the triumphs and uncertainties of a young profession in the making. The first book to focus on the contributions of a master in the field of architectural lighting, this fascinating volume celebrates the practice’s significance in modern design.
Abstract:
This study investigates the implications of the introduction of electric lighting systems, building technologies, and theories of worker efficiency on the deep spatial and environmental transformations that occurred within the corporate workplace during the twentieth century. Examining the shift from daylighting strategies to largely artificially lit workplace environments, this paper argues that electric lighting significantly contributed to the architectural rationalization of both office work and the modern office environment. Contesting the historical and critical marginalization of lighting within the discourse of the modern built environment, this study calls for a reassessment of the role of artificial lighting in the development of the modern corporate workplace. Keywords: daylighting, fluorescent lighting, rationalization, workplace design
Abstract:
Context In-training assessment (ITA) has established its place alongside formative and summative assessment at both the undergraduate and postgraduate levels. In this paper the authors aimed to identify those characteristics of ITA that could enhance clinical teaching. Methods A literature review and discussions by an expert working group at the Ninth Cambridge Conference identified the aspects of ITA that could enhance clinical teaching. Results The features of ITA identified included defining the specific benefits to the learner, teacher and institution, and highlighting the patient as the context for ITA and clinical teaching. The 'mapping' of a learner's progress towards the clinical teaching objectives, by using multiple assessments over time and multiple observers in both a systematic and an opportunistic way, corresponds to the incremental nature of reaching clinical competence. Conclusions The importance of ITA based on both direct and indirect evidence of what the learner actually does in the real clinical setting is emphasized. Particular attention is given to addressing concerns in the more controversial areas of assessor training, ratings and documentation for ITA. Areas for future research are also identified.
Abstract:
Derailments due to lateral collisions between heavy road vehicles and passenger trains at level crossings (LCs) are a serious safety issue. A variety of countermeasures in terms of traffic laws, communication technology and warning devices are used to minimise LC accidents; however, innovative civil infrastructure solutions are rare. This paper presents a study of the efficacy of a guard rail system (GRS) in minimising the derailment potential of trains laterally struck by heavy road vehicles at LCs. For this purpose, a three-dimensional dynamic model of a passenger train running on a ballasted track fitted with guard rails and subject to lateral impact from a road truck is formulated. This model is capable of predicting lateral collision-induced derailments with and without a GRS. Based on dynamic simulations, the derailment prevention mechanism of the GRS is illustrated. The sensitivity of the efficacy of the GRS to its key parameters, such as the flangeway width, the installation height and the contact friction, is reported. It is shown that guard rails can enhance derailment safety against lateral impacts at LCs.
Abstract:
A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields ranging from atmospheric sciences to nanotechnology and even cosmology. This is because in most phase transitions the new phase is separated from the mother phase by a free energy barrier, which is crossed in a process called nucleation. It is now considered that a significant fraction of all atmospheric particles is produced by vapor-to-liquid nucleation. In atmospheric sciences, as in other scientific fields, the theoretical treatment of nucleation is mostly based on a theory known as the Classical Nucleation Theory. However, the Classical Nucleation Theory is known to have only limited success in predicting the rate at which vapor-to-liquid nucleation takes place under given conditions. This thesis studies unary homogeneous vapor-to-liquid nucleation from a statistical mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free energy barrier separating the vapor and liquid phases and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapor is accurately described by the liquid drop model applied by the Classical Nucleation Theory once the clusters are larger than some threshold size. The threshold cluster sizes contain only a few to some tens of molecules, depending on the interaction potential and temperature. However, the error made in modeling the smallest clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law.
By calculating correction factors to Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates that are in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid drop values is accurately modelled by the low-order virial coefficients at modest temperatures and vapor densities, or in other words, in the validity range of the non-interacting cluster theory of Frenkel, Band and Bijl. Our results do not indicate a need for a size-dependent replacement free energy correction. The results also indicate that Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for calculating the equilibrium vapor density, the size dependence of the surface tension and the planar surface tension directly from cluster simulations. Finally, we show that the size dependence of the cluster surface tension at the equimolar surface is a function of the virial coefficients, a result confirmed by our cluster simulations.
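For orientation, the liquid-drop work of formation used by the Classical Nucleation Theory discussed in this abstract has a standard two-term form: a bulk term proportional to cluster size and a surface term proportional to size to the power 2/3. The sketch below is purely illustrative (the coefficients are made-up numbers, not values from the thesis) and locates the critical cluster size at the top of the barrier:

```python
import math

def cnt_barrier(n, ln_S, theta):
    """Liquid-drop work of formation in units of kT:
    Delta G / kT = -n * ln(S) + theta * n**(2/3),
    where S is the supersaturation ratio and theta a
    dimensionless surface-energy coefficient."""
    return -n * ln_S + theta * n ** (2.0 / 3.0)

def critical_size(ln_S, theta):
    """Maximiser of the barrier: setting d(Delta G)/dn = 0
    gives n* = (2*theta / (3*ln S))**3."""
    return (2.0 * theta / (3.0 * ln_S)) ** 3

# Illustrative (hypothetical) numbers:
ln_S, theta = 1.0, 10.0
n_star = critical_size(ln_S, theta)          # critical cluster size
barrier = cnt_barrier(n_star, ln_S, theta)   # barrier height in kT
```

At the critical size the barrier height reduces analytically to 4*theta**3 / (27 * ln_S**2), which is the quantity whose over- or under-estimation for small clusters drives the correction factors described above.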
Abstract:
Einstein's general relativity is a classical theory of gravitation: it is a postulate on the coupling between the four-dimensional, continuous spacetime and the matter fields in the universe, and it yields their dynamical evolution. It is believed that general relativity must be replaced by a quantum theory of gravity, at least at the extremely high energies of the early universe and in regions of strong spacetime curvature, such as black holes. Various attempts to quantize gravity, including conceptually new models such as string theory, have suggested that modifications to general relativity might show up even at lower energy scales. On the other hand, the late-time acceleration of the expansion of the universe, known as the dark energy problem, might also originate from new gravitational physics. Thus, although there has so far been no direct experimental evidence contradicting general relativity - on the contrary, it has passed a variety of observational tests - it is worth asking why the effective theory of gravity should be of the exact form of general relativity. If general relativity is modified, how do the predictions of the theory change? Furthermore, how far can we go with the changes before we are faced with contradictions with experiment? Along with the changes, could there be new phenomena that we could measure to find hints of the form of the quantum theory of gravity? This thesis is about a class of modified gravity theories called f(R) models, and in particular about the effects of changing the theory of gravity on stellar solutions. It is discussed how experimental constraints from measurements in the Solar System restrict the form of f(R) theories. Moreover, it is shown that models which do not differ from general relativity at the weak-field scale of the Solar System can produce very different predictions for dense stars such as neutron stars.
Due to the nature of f(R) models, the role of an independent spacetime connection is emphasized throughout the thesis.
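For context, the f(R) class generalises the Einstein-Hilbert action by replacing the Ricci scalar R with a function of it; this is the standard starting point of the f(R) literature, included here for orientation rather than reproduced from the thesis:

```latex
S \;=\; \frac{1}{2\kappa} \int \mathrm{d}^4 x \, \sqrt{-g}\, f(R)
\;+\; S_\mathrm{m}\!\left[g_{\mu\nu}, \psi\right],
\qquad \kappa = 8\pi G .
```

Setting f(R) = R recovers general relativity. In the metric formulation the connection is taken to be the Levi-Civita connection of the metric, while in the Palatini formulation the connection is varied independently; the latter is the "independent connection" whose role the thesis emphasises.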
Abstract:
Volatile organic compounds (VOCs) in the headspace of bubble chambers containing branches of live coral in filtered reef seawater were analysed using gas chromatography with mass spectrometry (GC-MS). When the coral released mucus it was a source of dimethyl sulfide (DMS) and isoprene; however, these VOCs were not emitted to the chamber headspace from mucus-free coral. This finding, which suggests that coral is an intermittent source of DMS and isoprene, was supported by the observation of occasional large pulses of atmospheric DMS (DMSa) over Heron Island reef on the southern Great Barrier Reef (GBR), Australia, in the austral winter. The highest DMSa pulse (320 ppt) was three orders of magnitude less than the DMS mixing ratio (460 ppb) measured in the headspace of a dynamically purged bubble chamber containing a mucus-coated branch of Acropora aspera, indicating that coral reefs can be strong point sources of DMSa. Static headspace GC-MS analysis of coral fragments identified mainly DMS and seven other minor reduced sulfur compounds, including dimethyl disulfide, methyl mercaptan, and carbon disulfide, while coral reef seawater was indicated as a source of methylene chloride, acetone, and methyl ethyl ketone. The VOCs emitted by coral and reef seawater are capable of producing new atmospheric particles < 15 nm in diameter, as observed at Heron Island reef. DMS and isoprene are known to play a role in low-level cloud formation, so aerosol precursors such as these could influence regional climate through a sea surface temperature regulation mechanism hypothesized to operate over the GBR.
Abstract:
In the wake of an almost decade-long economic downturn and increasing competition from developing economies, a new agenda in the Australian Government for science, technology, engineering, and mathematics (STEM) education and research has emerged as a national priority. However, to art and design educators, the pervasiveness and apparent exclusivity of STEM can be viewed as another instance of art and design education being relegated to the margins of curriculum (Greene, 1995). In the spirit of interdisciplinarity, there have been some recent calls to expand STEM education to include the arts and design, transforming STEM into STEAM in education (Maeda, 2013). As with STEM, STEAM education emphasises the connections between previously disparate disciplines, and has accordingly been conceptualised in different ways, such as a focus on the creative design thinking process that is fundamental to both engineering and art (Bequette & Bequette, 2012). In this article, we discuss divergent creative design thinking processes and metacognitive skills, and how and why they may enhance learning in STEM and STEAM.
Abstract:
The Maitra group has explored a wide variety of chemistry with bile acids during the past 15 years, covering asymmetric synthesis, molecular recognition, ion receptors/sensors, dendrimers, low-molecular-mass organo- and hydrogelators, gel-nanoparticle composites, etc. Some of what excites us in this field is highlighted in this perspective article.
Abstract:
Malignant pleural mesothelioma (MPM) is a rare aggressive cancer of the pleura. Asbestos exposure (through inhalation) is the most well-established risk factor for mesothelioma. The current standard of care for patients suffering from MPM is a combination of cisplatin and pemetrexed (or alternatively cisplatin and raltitrexed). Most patients, however, die within 24 months of diagnosis. New therapies are therefore urgently required for this disease. Lysine acetyltransferases (KATs), including KAT5, have been linked with the development of cisplatin resistance. This gene may therefore be altered in MPM and could represent a novel candidate target for intervention. Using RT-PCR screening, the expression of all known KAT5 variants was found to be markedly increased in malignant tumors compared to benign pleura. When separated according to histological subtype, KAT5 was significantly overexpressed in both the sarcomatoid and biphasic subgroups for all transcript variants. A panel of MPM cell lines, including the normal pleural cell lines LP9 and Met5A, was screened for expression of KAT5 variants. Treatment of cells with a small-molecule inhibitor of KAT5 (MG-149) caused significant inhibition of cellular proliferation (p<0.0001) and induction of apoptosis, and was accompanied by significant induction of pro-inflammatory cytokines/chemokines.
Abstract:
Cosmological inflation is the dominant paradigm in explaining the origin of structure in the universe. According to the inflationary scenario, there was a period of nearly exponential expansion in the very early universe, long before nucleosynthesis. Inflation is commonly considered a consequence of some scalar field or fields whose energy density starts to dominate the universe. The inflationary expansion converts the quantum fluctuations of the fields into classical perturbations on superhorizon scales, and these primordial perturbations are the seeds of the structure in the universe. Moreover, inflation also naturally explains the high degree of homogeneity and spatial flatness of the early universe. The real challenge of inflationary cosmology lies in trying to establish a connection between the fields driving inflation and theories of particle physics. In this thesis we concentrate on inflationary models at scales well below the Planck scale. The low scale allows us to seek candidates for the inflationary matter within extensions of the Standard Model, but typically also implies fine-tuning problems. We discuss a low-scale model where inflation is driven by a flat direction of the Minimally Supersymmetric Standard Model. The relation between the potential along the flat direction and the underlying supergravity model is studied. The low inflationary scale requires an extremely flat potential, but we find that in this particular model the associated fine-tuning problems can be solved in a rather natural fashion in a class of supergravity models. For this class of models, the flatness is a consequence of the structure of the supergravity model and is insensitive to the vacuum expectation values of the fields that break supersymmetry.
Another low-scale model considered in the thesis is the curvaton scenario, where the primordial perturbations originate from quantum fluctuations of a curvaton field, which is different from the fields driving inflation. The curvaton gives a negligible contribution to the total energy density during inflation, but its perturbations become significant in the post-inflationary epoch. The separation between the fields driving inflation and the fields giving rise to primordial perturbations opens up new possibilities to lower the inflationary scale without introducing fine-tuning problems. The curvaton model typically gives rise to a relatively large level of non-Gaussian features in the statistics of the primordial perturbations. We find that the level of non-Gaussian effects depends heavily on the form of the curvaton potential. Future observations that provide more accurate information on the non-Gaussian statistics can therefore place constraining bounds on the curvaton interactions.
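As background, in the simplest curvaton scenario (a quadratic curvaton potential in the sudden-decay approximation) the standard literature expression for the non-linearity parameter reads as follows; this is a textbook result quoted for context, not a number taken from this thesis:

```latex
f_\mathrm{NL} \;=\; \frac{5}{4r} \;-\; \frac{5}{3} \;-\; \frac{5r}{6},
\qquad
r \;=\; \left.\frac{3\rho_\sigma}{3\rho_\sigma + 4\rho_\gamma}\right|_{\text{decay}},
```

where rho_sigma and rho_gamma are the curvaton and radiation energy densities at curvaton decay. A curvaton that decays while still subdominant (r << 1) thus produces large non-Gaussianity, which is why measurements of the non-Gaussian statistics constrain the curvaton potential and interactions.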
Abstract:
Deep convolutional neural networks (DCNNs) have been employed in many computer vision tasks with great success due to their robustness in feature learning. One of the advantages of DCNNs is the robustness of their representations to object location, which is useful for object recognition tasks. However, this also discards spatial information, which is useful when dealing with the topological information of an image (e.g. scene labeling, face recognition). In this paper, we propose a deeper and wider network architecture to tackle the scene labeling task. The depth is achieved by incorporating predictions from multiple early layers of the DCNN. The width is achieved by combining multiple outputs of the network. We then further refine the parsing task by adopting graphical models (GMs) as a post-processing step to incorporate spatial and contextual information into the network. The new strategy of a deeper, wider convolutional network coupled with graphical models has shown promising results on the PASCAL-Context dataset.
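The idea of incorporating predictions from multiple early layers can be illustrated with a toy NumPy sketch: per-pixel class-score maps produced at different depths (and therefore different resolutions) are brought to a common resolution and fused by averaging. This is a minimal, hypothetical illustration of the fusion principle, not the paper's actual architecture:

```python
import numpy as np

def upsample_nn(scores, out_h, out_w):
    """Nearest-neighbour upsampling of a (C, h, w) class-score map."""
    c, h, w = scores.shape
    rows = np.arange(out_h) * h // out_h   # source row for each output row
    cols = np.arange(out_w) * w // out_w   # source col for each output col
    return scores[:, rows][:, :, cols]

def fuse_layer_predictions(score_maps, out_h, out_w):
    """Average per-pixel class scores predicted from several layers
    after bringing them to a common output resolution."""
    up = [upsample_nn(s, out_h, out_w) for s in score_maps]
    return np.mean(up, axis=0)

# Toy example: 3 classes, score maps from two depths of a network.
coarse = np.zeros((3, 2, 2)); coarse[1] = 1.0   # deep layer votes class 1
fine = np.zeros((3, 4, 4)); fine[1] = 0.5       # early layer half-votes class 1
fused = fuse_layer_predictions([coarse, fine], 4, 4)
labels = fused.argmax(axis=0)                   # per-pixel label map
```

In a real system the fused scores would then be passed to the graphical-model post-processing step to enforce spatial and contextual consistency.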