71 results for Observational techniques and algorithms


Relevance:

100.00%

Publisher:

Abstract:

Bertolt Brecht's dramaturgy was as influential upon the development of British drama on television between the 1950s and the 1970s as it was in the theatre. His influence was made manifest through the work of writers, directors and producers such as Tony Garnett, Ken Loach, John McGrath and Dennis Potter, whose attempts to create original Brechtian forms of television drama were reflected in the frequent reference to Brecht in contemporary debate concerning the political and aesthetic direction and value of television drama. While this discussion has so far been framed around how Brechtian techniques and theory were applied to the newer medium of television, this article examines these arguments from another perspective. Through detailed analysis of a 1964 BBC production of The Life of Galileo, I assess how the primary, canonical sources of Brecht's stage plays were realised on television during this period, locating Brecht's drama in the wider context of British television drama in general during the 1960s and 1970s. I pay particular attention to the use of the television studio as a site that could replicate or reinvent the theatrical space of the stage, and to the responsiveness of the television audience towards Brechtian dramaturgy.

Relevance:

100.00%

Publisher:

Abstract:

The effects and influence of the Building Research Establishment’s Environmental Assessment Methods (BREEAM) on construction professionals are examined. Most discussions of building assessment methods focus on either the formal tool or the finished product. In contrast, BREEAM is analysed here as a social technology using Michel Foucault’s theory of governmentality. Interview data are used to explore the effect of BREEAM on visibilities, knowledge, techniques and professional identities. The analysis highlights a number of features of the BREEAM assessment process which generally go unremarked: professional and public understandings of the method, the deployment of different types of knowledge and their implication for the authority and legitimacy of the tool, and the effect of BREEAM on standard practice. The analysis finds that BREEAM’s primary effect is through its impact on standard practices. Other effects include the use of assessment methods to defend design decisions, its role in both operationalizing and obscuring the concept of green buildings, and the effect of tensions between project and method requirements for the authority of the tool. A reflection on assessment methods as neo-liberal tools and their adequacy for the promotion of sustainable construction suggests several limitations of lock-in that hinder variation and wider systemic change.

Relevance:

100.00%

Publisher:

Abstract:

This work presents two schemes for measuring the linear and angular kinematics of a rigid body using a kinematically redundant array of triple-axis accelerometers, with potential applications in biomechanics. A novel angular velocity estimation algorithm is proposed and evaluated that can compensate for angular velocity errors using measurements of the direction of gravity. Analysis and discussion of optimal sensor array characteristics are provided. A damped two-axis pendulum was used to excite all six degrees of freedom of a suspended accelerometer array through a determined complex motion, and is the basis of both the simulation and the experimental studies. The relationship between accuracy and sensor redundancy is investigated for arrays of up to 100 triple-axis accelerometers (300 accelerometer axes) in simulation and 10 equivalent sensors (30 accelerometer axes) in the laboratory test rig. The paper also reports on the sensor calibration techniques and hardware implementation.
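
As a rough illustration of how such a redundant array can be used, the sketch below (an assumption for illustration, not the paper's algorithm) recovers the common linear acceleration and a combined angular-motion matrix by least squares from tri-axis accelerometers at known body-frame positions.

```python
import numpy as np

# Minimal sketch, assuming an ideal noise-free rigid-body model and ignoring gravity:
# each tri-axis accelerometer at body-frame position r_i measures f_i = a0 + W @ r_i,
# where W combines the angular-acceleration and centripetal terms.  With a redundant
# array (at least four non-coplanar sensors) a0 and W follow from linear least squares.

def estimate_rigid_body_motion(positions, readings):
    """positions: (N, 3) sensor locations; readings: (N, 3) accelerometer outputs.
    Returns (a0, W): common linear acceleration and the 3x3 motion matrix."""
    n = positions.shape[0]
    A = np.zeros((3 * n, 12))
    b = readings.reshape(-1)
    for i, r in enumerate(positions):
        A[3 * i:3 * i + 3, 0:3] = np.eye(3)               # columns multiplying a0
        A[3 * i:3 * i + 3, 3:12] = np.kron(np.eye(3), r)  # columns multiplying vec(W)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:].reshape(3, 3)

# The antisymmetric part of W gives the angular acceleration directly; the symmetric
# part holds the centripetal (omega-squared) terms, whose sign ambiguity and drift are
# what gravity-direction measurements, as proposed in the paper, help to correct.
```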

Relevance:

100.00%

Publisher:

Abstract:

Rising sea level is perhaps the most severe consequence of climate warming, as much of the world’s population and infrastructure is located near current sea level (Lemke et al. 2007). A major rise of a metre or more would cause serious problems. Such possibilities have been suggested by Hansen and Sato (2011), who pointed out that sea level was several metres higher than now during the Holsteinian and Eemian interglacials (about 250,000 and 120,000 years ago, respectively), even though the global temperature was then only slightly higher than it is today. It is consequently of the utmost importance to determine whether such a sea level rise could occur and, if so, how fast it might happen. Sea level undergoes considerable changes due to natural processes such as wind, ocean currents and tidal motions. On longer time scales, the sea level is influenced by steric effects (sea water expansion caused by temperature and salinity changes of the ocean) and by eustatic effects caused by changes in ocean mass. Changes in the Earth’s cryosphere, such as the retreat or expansion of glaciers and land ice areas, have been the dominant cause of sea level change during the Earth’s recent history. During the glacial cycles of the last million years, the sea level varied by a large amount, of the order of 100 m. If the Earth’s cryosphere were to disappear completely, the sea level would rise by some 65 m. The scientific papers in the present volume address the different aspects of the Earth’s cryosphere and how the different changes in the cryosphere affect sea level change. The volume represents the outcome of the first workshop held within the new ISSI Earth Science Programme. The workshop took place from 22 to 26 March 2010 in Bern, Switzerland, with the objective of providing an in-depth insight into the future of mountain glaciers and the large land ice areas of Antarctica and Greenland, which are exposed to natural and anthropogenic climate influences, and their effects on sea level change. The participants of the workshop are experts in different fields, including meteorology, climatology, oceanography, glaciology and geodesy; they use advanced space-based observational studies and state-of-the-art numerical modelling.

Relevance:

100.00%

Publisher:

Abstract:

This research examines the influence of environmental institutional distance between home and host countries on the standardization of environmental performance among multinational enterprises using ordinary least-squares (OLS) regression techniques and a sample of 128 multinationals from high-polluting industries. The paper examines the environmental institutional distance of countries using the concepts of formal and informal institutional distances. The results show that whereas a high formal environmental distance between home and host countries leads multinational enterprises to achieve a different level of environmental performance according to each country's legal requirements, a high informal environmental distance encourages these firms to unify their environmental performance independently of the countries in which their units are based. The study also discusses the implications for academia, managers, and policy makers.

Relevance:

100.00%

Publisher:

Abstract:

The chemical specificity of terahertz spectroscopy, when combined with techniques for sub-wavelength sensing, is giving new understanding of processes occurring at the nanometre scale in biological systems and offers the potential for single-molecule detection of chemical and biological agents and explosives. In addition, terahertz techniques are enabling the exploration of the fundamental behaviour of light when it interacts with nanoscale optical structures, and are being used to measure ultrafast carrier dynamics, transport and localisation in nanostructures. This chapter will explain how terahertz-scale modelling can be used to explore the fundamental physics of nano-optics, will discuss the terahertz spectroscopy of nanomaterials, terahertz near-field microscopy and other sub-wavelength techniques, and will summarise recent developments in the terahertz spectroscopy and imaging of biological systems at the nanoscale. The potential of using these techniques for security applications will be considered.

Relevance:

100.00%

Publisher:

Abstract:

Ensemble learning can be used to increase the overall classification accuracy of a classifier by generating multiple base classifiers and combining their classification results. A frequently used family of base classifiers for ensemble learning is decision trees. However, alternative approaches can potentially be used, such as the Prism family of algorithms, which also induces classification rules. Compared with decision trees, Prism algorithms generate modular classification rules that cannot necessarily be represented in the form of a decision tree. Prism algorithms produce classification accuracy similar to that of decision trees; in some cases, however, for example when there is noise in the training and test data, Prism algorithms can outperform decision trees by achieving higher classification accuracy. Nevertheless, Prism still tends to overfit on noisy data; hence, ensemble learners have been adopted in this work to reduce the overfitting. This paper describes the development of an ensemble learner using a member of the Prism family as the base classifier in order to reduce the overfitting of Prism algorithms on noisy datasets. The developed ensemble classifier is compared with a stand-alone Prism classifier in terms of classification accuracy and resistance to noise.
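
As an illustration of the ensemble approach described above, the sketch below implements bagging with majority voting; Prism has no standard library implementation, so a decision tree stands in as the base learner, and the class and parameter names are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier

# Minimal sketch of a bagged ensemble with majority voting.  X and y are assumed to be
# NumPy arrays with non-negative integer class labels.

class BaggedEnsemble:
    def __init__(self, base_estimator, n_estimators=25, random_state=0):
        self.base_estimator = base_estimator
        self.n_estimators = n_estimators
        self.rng = np.random.default_rng(random_state)
        self.estimators_ = []

    def fit(self, X, y):
        n = len(X)
        for _ in range(self.n_estimators):
            idx = self.rng.integers(0, n, size=n)                 # bootstrap sample
            self.estimators_.append(clone(self.base_estimator).fit(X[idx], y[idx]))
        return self

    def predict(self, X):
        votes = np.stack([est.predict(X) for est in self.estimators_])
        # majority vote over base classifiers for each sample
        return np.apply_along_axis(lambda c: np.bincount(c.astype(int)).argmax(), 0, votes)

# Usage: BaggedEnsemble(DecisionTreeClassifier(max_depth=5)).fit(X_train, y_train)
```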

Relevance:

100.00%

Publisher:

Abstract:

Background: Parental overprotection has commonly been implicated in the development and maintenance of childhood anxiety disorders. Overprotection has been assessed using questionnaire and observational methods interchangeably; however, the extent to which these methods access the same construct has received little attention. Edwards (2008) and Edwards et al. (2010) developed a promising parent-report measure of overprotection (OP) and reported that, with parents of pre-school children, the measure correlated with observational assessments and predicted changes in child anxiety symptoms. We aimed to validate the use of the OP measure with mothers of children in middle childhood, and to examine its association with child and parental anxiety. Methods: Mothers of 90 children (60 clinically anxious, 30 non-anxious) aged 7–12 years completed the measure and engaged in a series of mildly stressful tasks with their child. Results: The internal reliability of the measure was good, and scores correlated significantly with observations of maternal overprotection in a challenging puzzle task. Contrary to expectations, OP was not significantly associated with child anxiety status or symptoms, but was significantly associated with maternal anxiety symptoms. Limitations: Participants were predominantly from affluent social groups and of non-minority status. Overprotection is a broad construct, and the use of specific sub-dimensions of behavioural constructs may be preferable. Conclusions: The findings support the use of the OP measure to assess parental overprotection among 7–12-year-old children; however, they suggest that parental responses may be more closely related to the degree of parental rather than child anxiety.

Relevance:

100.00%

Publisher:

Abstract:

Aerosols affect the Earth's energy budget directly by scattering and absorbing radiation and indirectly by acting as cloud condensation nuclei and, thereby, affecting cloud properties. However, large uncertainties exist in current estimates of aerosol forcing because of incomplete knowledge concerning the distribution and the physical and chemical properties of aerosols as well as aerosol-cloud interactions. In recent years, a great deal of effort has gone into improving measurements and datasets. It is thus feasible to shift the estimates of aerosol forcing from largely model-based to increasingly measurement-based. Our goal is to assess current observational capabilities and identify uncertainties in the aerosol direct forcing through comparisons of different methods with independent sources of uncertainty. Here we assess the aerosol optical depth (τ), the direct radiative effect (DRE) by natural and anthropogenic aerosols, and the direct climate forcing (DCF) by anthropogenic aerosols, focusing on satellite and ground-based measurements supplemented by global chemical transport model (CTM) simulations. The multi-spectral MODIS measures global distributions of aerosol optical depth (τ) on a daily scale, with a high accuracy of ±(0.03 + 0.05τ) over ocean. The annual average τ is about 0.14 over the global ocean, of which about 21%±7% is contributed by human activities, as estimated from the MODIS fine-mode fraction. The multi-angle MISR derives an annual average AOD of 0.23 over global land with an uncertainty of ~20% or ±0.05. These high-accuracy aerosol products and broadband flux measurements from CERES make it feasible to obtain observational constraints for the aerosol direct effect, especially over the global ocean. A number of measurement-based approaches estimate the clear-sky DRE (on solar radiation) at the top of the atmosphere (TOA) to be about -5.5±0.2 W m-2 (median ± standard error from various methods) over the global ocean. Accounting for thin cirrus contamination of the satellite-derived aerosol field reduces the TOA DRE to -5.0 W m-2. Because of a lack of measurements of aerosol absorption and the difficulty of characterizing land surface reflection, estimates of the DRE over land and at the ocean surface are currently realized through a combination of satellite retrievals, surface measurements and model simulations, and are less well constrained. Over the oceans, the surface DRE is estimated to be -8.8±0.7 W m-2. Over land, an integration of satellite retrievals and model simulations derives a DRE of -4.9±0.7 W m-2 and -11.8±1.9 W m-2 at the TOA and surface, respectively. CTM simulations derive a wide range of DRE estimates that on average are smaller than the measurement-based DRE by about 30-40%, even after accounting for thin cirrus and cloud contamination. A number of issues remain. Current estimates of the aerosol direct effect over land are poorly constrained. Uncertainties in DRE estimates are also larger on regional scales than on the global scale, and large discrepancies exist between different approaches. The characterization of aerosol absorption and vertical distribution remains challenging. The aerosol direct effect in the thermal infrared range and in cloudy conditions remains relatively unexplored and quite uncertain, because of a lack of global, systematic aerosol vertical profile measurements. A coordinated research strategy needs to be developed for the integration and assimilation of satellite measurements into models to constrain model simulations.
Enhanced measurement capabilities in the next few years and high-level scientific cooperation will further advance our knowledge.
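
Purely as an arithmetic illustration using figures quoted in the abstract (not a result from the paper), the MODIS over-ocean accuracy envelope evaluated at the stated global-ocean annual mean optical depth gives:

```latex
\[
\Delta\tau = \pm\,(0.03 + 0.05\,\tau)
\quad\Longrightarrow\quad
\Delta\tau\big|_{\tau = 0.14} = \pm\,(0.03 + 0.05 \times 0.14) = \pm\,0.037 .
\]
```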

Relevance:

100.00%

Publisher:

Abstract:

Traditional resource management has had as its main objective the optimisation of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid Markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The SORMA project aims to allow resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA’s motivation is to achieve efficient resource utilisation by maximising revenue for resource providers, and minimising the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that desired Quality of Service levels meet the expectations of market participants. This paper explains the proposed use of an Economically Enhanced Resource Manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximisation across multiple Service Level Agreements.
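
As a toy illustration of revenue-driven provisioning across multiple SLAs, the sketch below applies a greedy knapsack-style admission policy; the data fields and the heuristic are assumptions for demonstration and are not taken from the EERM's design.

```python
from dataclasses import dataclass

# Illustrative sketch only: admit SLA offers so as to maximise revenue within a fixed
# resource capacity, using a greedy revenue-per-unit-demand heuristic.

@dataclass
class SlaOffer:
    name: str
    revenue: float      # payment if the SLA is accepted and met
    cpu_demand: float   # fraction of the provider's capacity the SLA requires

def select_offers(offers, capacity=1.0):
    """Greedily admit SLAs by revenue per unit of CPU demand until capacity is used."""
    accepted, used = [], 0.0
    for offer in sorted(offers, key=lambda o: o.revenue / o.cpu_demand, reverse=True):
        if used + offer.cpu_demand <= capacity:
            accepted.append(offer)
            used += offer.cpu_demand
    return accepted

offers = [SlaOffer("gold", 10.0, 0.5), SlaOffer("silver", 6.0, 0.4), SlaOffer("bronze", 2.0, 0.3)]
print([o.name for o in select_offers(offers)])   # -> ['gold', 'silver']
```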

Relevance:

100.00%

Publisher:

Abstract:

There is a range of studies in the low carbon arena which use various ‘futures’-based techniques as ways of exploring uncertainties. These techniques range from ‘scenarios’ and ‘roadmaps’ through to ‘transitions’ and ‘pathways’, as well as ‘vision’-based techniques. The overall aim of the paper is therefore to compare and contrast these techniques in order to develop a simple working typology, with the further objective of identifying the implications of this analysis for RETROFIT 2050. Using recent examples of city-based and energy-based studies throughout, the paper compares and contrasts these techniques and finds that the distinctions between them have often been blurred in the field of low carbon. Visions, for example, have been used in both transition theory and futures/Foresight methods, and scenarios have also been used in transition-based studies as well as futures/Foresight studies. Moreover, Foresight techniques, which capture expert knowledge and map existing knowledge to develop a set of scenarios and roadmaps that can inform the development of transitions and pathways, can not only help to overcome any ‘disconnections’ that may exist between the social and the technical lenses through which such future trajectories are mapped, but also promote a strong ‘co-evolutionary’ content.

Relevance:

100.00%

Publisher:

Abstract:

As part of the Atmospheric Model Intercomparison Project (AMIP), the behaviour of 15 general circulation models has been analysed in order to diagnose and compare the ability of the different models to simulate Northern Hemisphere midlatitude atmospheric blocking. In accordance with the established AMIP procedure, the 10-year model integrations were performed using prescribed, time-evolving monthly mean observed SSTs spanning the period January 1979–December 1988. Atmospheric observational data (ECMWF analyses) over the same period have also been used to verify the model results. The models involved in this comparison represent a wide spectrum of model complexity, with different horizontal and vertical resolutions, numerical techniques and physical parametrizations, and exhibit large differences in blocking behaviour. Nevertheless, a few common features can be found, such as the general tendency to underestimate both blocking frequency and the average duration of blocks. The possible relationship between model blocking and model systematic errors has also been assessed, although without resorting to ad hoc numerical experimentation it is impossible to relate with certainty particular model deficiencies in representing blocking to precise parts of the model formulation.
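
For illustration, the sketch below computes a commonly used Tibaldi–Molteni style instantaneous blocking index from a 500 hPa geopotential height field (an assumed diagnostic, not necessarily the one used in the study); blocking frequency and block duration follow from the time series of this mask.

```python
import numpy as np

# Minimal sketch: a longitude is flagged as instantaneously blocked when the meridional
# 500 hPa height gradient reverses south of a central blocking latitude while a strong
# westerly-flow gradient remains to the north.

def blocked_longitudes(z500, lats, phi0=60.0, dphi=20.0):
    """z500: 2-D array (lat, lon) for one time step; lats: 1-D latitude array (deg N).
    Returns a boolean mask over longitude marking instantaneously blocked longitudes."""
    i_n = np.abs(lats - (phi0 + dphi)).argmin()    # northern latitude index
    i_0 = np.abs(lats - phi0).argmin()             # central latitude index
    i_s = np.abs(lats - (phi0 - dphi)).argmin()    # southern latitude index
    ghgs = (z500[i_0, :] - z500[i_s, :]) / dphi    # southern gradient (m per deg lat)
    ghgn = (z500[i_n, :] - z500[i_0, :]) / dphi    # northern gradient
    return (ghgs > 0.0) & (ghgn < -10.0)

# Blocking frequency at each longitude follows as the fraction of time steps flagged,
# usually with an additional persistence requirement of several consecutive days.
```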

Relevance:

100.00%

Publisher:

Abstract:

So-called ‘radical’ and ‘critical’ pedagogy seems to be everywhere these days on the landscapes of geographical teaching praxis and theory. Part of the remit of radical/critical pedagogy involves a de-centring of the traditional ‘banking’ method of pedagogical praxis. Yet, how do we challenge this ‘banking’ model of knowledge transmission in both a large-class setting and around the topic of commodity geographies where the banking model of information transfer still holds sway? This paper presents a theoretically and pedagogically driven argument, as well as a series of practical teaching ‘techniquesand tools—mind-mapping and group work—designed to promote ‘deep learning’ and a progressive political potential in a first-year large-scale geography course centred around lectures on the Geographies of Consumption and Material Culture. Here students are not only asked to place themselves within and without the academic materials and other media but are urged to make intimate connections between themselves and their own consumptive acts and the commodity networks in which they are enmeshed. Thus, perhaps pedagogy needs to be emplaced firmly within the realms of research practice rather than as simply the transference of research findings.