100 results for Q-Methodology
in CentAUR: Central Archive University of Reading - UK
Abstract:
In the UK, architectural design is regulated through a system of design control for the public interest, which aims to secure and promote ‘quality’ in the built environment. Design control is primarily implemented by locally employed planning professionals with political oversight, and by independent design review panels staffed predominantly by design professionals. Design control has a lengthy and complex history, with the concept of ‘design’ offering a range of challenges for a regulatory system of governance. A simultaneously creative and emotive discipline, architectural design is difficult to regulate objectively or consistently, often leading to policy that is regarded as highly discretionary and flexible. This makes regulatory outcomes difficult to predict, as the approaches undertaken by the ‘agents of control’ can vary according to the individual. The role of the design controller is therefore central, tasked with the responsibility of interpreting design policy and guidance, appraising design quality and passing professional judgment. However, little is known about what influences the way design controllers approach their task, casting a ‘veil’ over design control and shrouding the basis of their decisions. This research engaged directly with the attitudes and perceptions of design controllers in the UK, lifting this ‘veil’. Using in-depth interviews and Q-Methodology, the thesis explores this hidden element of control, revealing a number of key differences in how controllers approach and implement policy and guidance, conceptualise design quality, and rationalise their evaluations and judgments.
The research develops a conceptual framework for agency in design control, consisting of six variables (Regulation, Discretion, Skills, Design Quality, Aesthetics, and Evaluation). It is suggested that this framework could act as a ‘heuristic’ instrument for UK controllers, prompting greater reflexivity in evaluating their own position, approaches, and attitudes, and leading to better practice and increased transparency in control decisions.
Abstract:
The sustainable intelligent building is a building that achieves the best combination of environmental, social, economic and technical values. Its sustainability assessment draws on systems engineering methods and multi-criteria decision-making. Accordingly, first, a wireless monitoring system for the sustainability parameters of intelligent buildings is developed; second, the indicators and key issues for the sustainability of intelligent buildings, based on the whole life cycle, are investigated; third, a sustainability assessment model based on structure entropy and the fuzzy analytic hierarchy process is proposed.
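The weighting step at the heart of an analytic-hierarchy-process assessment can be illustrated with a short sketch. This is a crisp simplification of the fuzzy AHP named above, using the geometric-mean method; the four criteria and the pairwise comparison values are hypothetical, not taken from the paper.

```python
import math

def ahp_weights(matrix):
    """Geometric-mean method: weight_i is proportional to (prod_j a_ij)^(1/n)."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise comparisons among four sustainability criteria:
# environmental, social, economic, technical (a_ij = importance of i over j).
pairwise = [
    [1.0, 3.0, 2.0, 4.0],
    [1/3, 1.0, 1/2, 2.0],
    [1/2, 2.0, 1.0, 3.0],
    [1/4, 1/2, 1/3, 1.0],
]
weights = ahp_weights(pairwise)
```

A full fuzzy AHP would replace each crisp entry with a triangular fuzzy number and defuzzify before normalisation; the normalisation step itself is unchanged.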
Abstract:
Convectively coupled equatorial waves are fundamental components of the interaction between the physics and dynamics of the tropical atmosphere. A new methodology, which isolates individual equatorial wave modes, has been developed and applied to observational data. The methodology assumes that the horizontal structures given by equatorial wave theory can be used to project upper- and lower-tropospheric data onto equatorial wave modes. The dynamical fields are first separated into eastward- and westward-moving components with a specified domain of frequency–zonal wavenumber. Each of the components for each field is then projected onto the different equatorial modes using the y structures of these modes given by the theory. The latitudinal scale y0 of the modes is predetermined by data to fit the equatorial trapping in a suitable latitude belt y = ±Y. The extent to which the different dynamical fields are consistent with one another in their depiction of each equatorial wave structure determines the confidence in the reality of that structure. Comparison of the analyzed modes with the eastward- and westward-moving components in the convection field enables the identification of the dynamical structure and nature of convectively coupled equatorial waves. In a case study, the methodology is applied to two independent data sources, ECMWF Reanalysis and satellite-observed window brightness temperature (Tb) data for the summer of 1992. Various convectively coupled equatorial Kelvin, mixed Rossby–gravity, and Rossby waves have been detected. The results indicate a robust consistency between the two independent data sources. Different vertical structures for different wave modes and a significant Doppler shifting effect of the background zonal winds on wave structures are found and discussed.
It is found that in addition to low-level convergence, anomalous fluxes induced by strong equatorial zonal winds associated with equatorial waves are important for inducing equatorial convection. There is evidence that equatorial convection associated with Rossby waves leads to a structural change, with a horizontal structure similar to that of a Kelvin wave moving westward with the Rossby wave. The vertical structure may also be radically changed. The analysis method should provide a very powerful diagnostic tool for investigating convectively coupled equatorial waves and the interaction of equatorial dynamics and physics in the real atmosphere. The results from application of the analysis method to a reanalysis dataset should provide a benchmark against which model studies can be compared.
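The projection step described above can be sketched numerically. The sketch below builds normalized Hermite functions, which give the meridional (y) structures of equatorial wave modes, and projects a synthetic field onto them by trapezoidal quadrature; the grid, belt width, and test field are illustrative choices, not the paper's actual configuration.

```python
import math

def hermite_func(n, y):
    """Normalized Hermite function psi_n(y): the orthonormal meridional basis
    implied by equatorial wave theory (y in trapping-scale units)."""
    # physicists' Hermite polynomial H_n via the three-term recurrence
    h0, h1 = 1.0, 2.0 * y
    if n == 0:
        h = h0
    elif n == 1:
        h = h1
    else:
        for k in range(2, n + 1):
            h0, h1 = h1, 2.0 * y * h1 - 2.0 * (k - 1) * h0
        h = h1
    norm = math.sqrt(2.0**n * math.factorial(n) * math.sqrt(math.pi))
    return h * math.exp(-y * y / 2.0) / norm

def project(field, ys, n):
    """Trapezoidal-rule projection coefficient <field, psi_n> over the belt."""
    dy = ys[1] - ys[0]
    vals = [f * hermite_func(n, y) for f, y in zip(field, ys)]
    return sum(vals) * dy - 0.5 * (vals[0] + vals[-1]) * dy

# Synthetic field with known mode content: 2*psi_0 + 0.5*psi_2
ys = [-6.0 + 0.01 * k for k in range(1201)]
field = [2.0 * hermite_func(0, y) + 0.5 * hermite_func(2, y) for y in ys]
c0, c1, c2 = (project(field, ys, n) for n in range(3))
```

Because the Hermite functions are orthonormal, the projection recovers the amplitude of each mode; in the paper's method this is applied separately to the eastward- and westward-filtered components of each dynamical field.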
Abstract:
We describe a new methodology for comparing satellite radiation budget data with a numerical weather prediction (NWP) model. This is applied to data from the Geostationary Earth Radiation Budget (GERB) instrument on Meteosat-8. The methodology brings together, in near-real time, GERB broadband shortwave and longwave fluxes with simulations based on analyses produced by the Met Office global NWP model. Results for the period May 2003 to February 2005 illustrate the progressive improvements in the data products as various initial problems were resolved. In most areas the comparisons reveal systematic errors in the model's representation of surface properties and clouds, which are discussed elsewhere. However, for clear-sky regions over the oceans the model simulations are believed to be sufficiently accurate to allow the quality of the GERB fluxes themselves to be assessed and any changes over time in the performance of the instrument to be identified. Using model and radiosonde profiles of temperature and humidity as input to a single-column version of the model's radiation code, we conduct sensitivity experiments which provide estimates of the expected model errors over the ocean of about ±5–10 W m⁻² in clear-sky outgoing longwave radiation (OLR) and ±0.01 in clear-sky albedo. For the more recent data the differences between the observed and modeled OLR and albedo are well within these error estimates. The close agreement between the observed and modeled values, particularly for the most recent period, illustrates the value of the methodology. It also contributes to the validation of the GERB products and increases confidence in the quality of the data, prior to their release.
Abstract:
The uptake of metals by earthworms occurs predominantly via the soil pore water, or via an uptake route related to the soil pore water metal concentration. However, it has been suggested that the speciation of the metal is also important. A novel technique is described which exposes Eisenia andrei Bouché to contaminant-bearing solutions in which the chemical factors affecting metal speciation may be individually and systematically manipulated. In a preliminary experiment, the LC50 for copper nitrate was 0.046 mg l⁻¹ (95% confidence interval: 0.03–0.07 mg l⁻¹). There was a significant positive correlation between earthworm mortality and bulk copper concentration in solution (R² = 0.88, P ≤ 0.001), and a significant positive increase in earthworm tissue copper concentration with increasing copper concentration in solution (R² = 0.97, P ≤ 0.001). It is anticipated that quantifying the effect of soil solution chemical speciation on copper bioavailability will provide an excellent aid to understanding the importance of chemical composition and metal speciation in the calculation of toxicological parameters.
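The two quantities reported above, an LC50 and an R² for the mortality-concentration relationship, can be sketched in a few lines. The dose-response values below are hypothetical, not the study's data, and the LC50 here is a simple bracketing interpolation rather than a fitted probit model.

```python
def pearson_r2(xs, ys):
    """Coefficient of determination R^2 for a simple linear correlation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return (sxy * sxy) / (sxx * syy)

def lc50_interp(concs, mortality):
    """Concentration giving 50% mortality, by linear interpolation between
    the two exposure levels that bracket 50%."""
    pairs = list(zip(concs, mortality))
    for (c0, m0), (c1, m1) in zip(pairs, pairs[1:]):
        if m0 <= 50.0 <= m1:
            return c0 + (50.0 - m0) * (c1 - c0) / (m1 - m0)
    raise ValueError("50% mortality not bracketed by the data")

# Hypothetical copper exposure series: concentration (mg/l) vs % mortality
concs = [0.01, 0.02, 0.04, 0.08, 0.16]
mortality = [5.0, 20.0, 45.0, 80.0, 95.0]
lc50 = lc50_interp(concs, mortality)
r2 = pearson_r2(concs, mortality)
```

In practice LC50s and their confidence intervals are estimated with probit or logistic regression; the interpolation above only illustrates what the number means.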
Abstract:
A new dynamic model of water quality, Q2, has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q2 for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify the key parameters controlling model behavior and to provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, it is more important to obtain high-quality forcing functions than to obtain improved parameter estimates once approximate values have been estimated. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus on DO alone increases predictive uncertainty in the DO simulations. The Q2 model has been applied here to the River Thames, but it has broad utility for evaluating other systems in Europe and around the world.
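The behavioral/non-behavioral classification underlying a generalized sensitivity analysis can be sketched as follows. A toy one-parameter DO response stands in for Q2, and the Kolmogorov-Smirnov distance between the behavioral and non-behavioral parameter distributions flags the parameter that controls model behavior; the surrogate model, parameter ranges, and acceptance window are illustrative assumptions.

```python
import random

def ecdf_diff(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: max gap between empirical CDFs."""
    a, b = sorted(a), sorted(b)
    d = 0.0
    for x in a + b:
        fa = sum(1 for v in a if v <= x) / len(a)
        fb = sum(1 for v in b if v <= x) / len(b)
        d = max(d, abs(fa - fb))
    return d

def toy_do_model(k_rear, _theta):
    # Saturating DO response driven only by the reaeration rate;
    # the second parameter is deliberately inert (insensitive).
    return 12.0 * k_rear / (k_rear + 1.0)

random.seed(1)
behav, non_behav = [], []
for _ in range(2000):
    p = (random.uniform(0.1, 5.0), random.uniform(0.0, 1.0))
    do = toy_do_model(*p)
    (behav if 7.5 < do < 9.5 else non_behav).append(p)  # behavioral window

d_sensitive = ecdf_diff([p[0] for p in behav], [p[0] for p in non_behav])
d_insensitive = ecdf_diff([p[1] for p in behav], [p[1] for p in non_behav])
```

A large KS distance for a parameter means the behavioral runs occupy a distinct part of its range, i.e. it controls whether the model reproduces acceptable DO; a small distance means the parameter is poorly identified by the DO criterion.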
Abstract:
Models developed to identify the rates and origins of nutrient export from land to stream require an accurate assessment of the nutrient load present in the water body in order to calibrate model parameters and structure. These data are rarely available at a representative scale and in an appropriate chemical form except in research catchments. Observational errors associated with nutrient load estimates based on these data lead to a high degree of uncertainty in modelling and nutrient budgeting studies. Here, daily paired instantaneous P and flow data for 17 UK research catchments covering a total of 39 water years (WY) have been used to explore the nature and extent of the observational error associated with nutrient flux estimates based on partial fractions and infrequent sampling. The daily records were artificially decimated to create 7 stratified sampling records, 7 weekly records, and 30 monthly records from each WY and catchment. These were used to evaluate the impact of sampling frequency on load estimate uncertainty. The analysis underlines the high uncertainty of load estimates based on monthly data and individual P fractions rather than total P. Catchments with a high baseflow index and/or low population density were found to return a lower RMSE on load estimates when sampled infrequently than those with a low baseflow index and high population density. Catchment size was not shown to be important, though a limitation of this study is that daily records may fail to capture the full range of P export behaviour in smaller catchments with flashy hydrographs, leading to an underestimate of uncertainty in load estimates for such catchments. Further analysis of sub-daily records is needed to investigate this fully.
Here, recommendations are given on load estimation methodologies for different catchment types sampled at different frequencies, and the ways in which this analysis can be used to identify observational error and uncertainty for model calibration and nutrient budgeting studies. (c) 2006 Elsevier B.V. All rights reserved.
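The decimation idea can be sketched with a synthetic daily record. The estimator below is a simple flow-weighted-concentration method, an assumption for illustration rather than necessarily the paper's choice; sampling every day recovers the true load exactly, while monthly sampling introduces an error of the kind the study quantifies.

```python
import math

# Synthetic daily record: seasonal baseflow plus periodic storm spikes,
# with P concentration rising with flow (all values illustrative).
flows = [5.0 + 4.0 * math.sin(2.0 * math.pi * d / 365.0)
         + (20.0 if d % 61 == 13 else 0.0) for d in range(365)]
concs = [0.02 + 0.005 * f for f in flows]

def flow_weighted_load(sample_days):
    """Load estimate from sampled days: flow-weighted mean concentration
    scaled by the (assumed known) total annual flow."""
    sf = [flows[d] for d in sample_days]
    sc = [concs[d] for d in sample_days]
    fw_conc = sum(c * f for c, f in zip(sc, sf)) / sum(sf)
    return fw_conc * sum(flows)

true_load = sum(f * c for f, c in zip(flows, concs))   # full daily paired data
est_daily = flow_weighted_load(range(365))
est_monthly = flow_weighted_load(range(0, 365, 30))
rel_err_monthly = abs(est_monthly - true_load) / true_load
```

Repeating the monthly estimate for many random sampling offsets and catchment types is what yields the RMSE comparisons described above.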
Abstract:
Laboratory measurements of the attenuation and velocity dispersion of compressional and shear waves at appropriate frequencies, pressures, and temperatures can aid interpretation of seismic and well-log surveys, as well as indicate absorption mechanisms in rocks. We constructed and calibrated resonant-bar equipment to measure the velocities and attenuations of standing shear and extensional waves in copper-jacketed right cylinders of rock (30 cm in length, 2.54 cm in diameter) in the sonic frequency range and at differential pressures up to 65 MPa. We also measured ultrasonic velocities and attenuations of compressional and shear waves in 50-mm-diameter samples of the rocks at identical pressures. Extensional-mode velocities determined from the resonant bar are systematically too low, yielding unreliable Poisson's ratios. Poisson's ratios determined from the ultrasonic data are therefore frequency corrected and used to calculate the sonic-frequency compressional-wave velocities and attenuations from the shear- and extensional-mode data. We also calculate the bulk-modulus loss. The accuracies of the attenuation data (expressed as 1000/Q, where Q is the quality factor) are ±1 for compressional and shear waves at ultrasonic frequency, ±1 for shear waves at sonic frequency, and ±3 for compressional waves at sonic frequency. Example sonic-frequency data show that the energy absorption in a limestone is small (Q_P greater than 200 and stress independent) and is primarily due to poroelasticity, whereas that in the two sandstones is variable in magnitude (Q_P ranges from less than 50 to greater than 300 at reservoir pressures) and arises from a combination of poroelasticity and viscoelasticity. A graph of compressional-wave attenuation versus compressional-wave velocity at reservoir pressures differentiates high-permeability (>100 mD, 9.87 × 10⁻¹⁴ m²) brine-saturated sandstones from low-permeability (<100 mD, 9.87 × 10⁻¹⁴ m²) sandstones and shales.
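The attenuation measurement itself reduces to extracting Q from a resonance curve. Below is a minimal sketch of the half-power-bandwidth estimate Q = f0/Δf, applied to a synthetic driven-oscillator resonance with known Q; the resonant frequency, grid, and Q value are illustrative, not instrument data.

```python
import math

def half_power_q(freqs, amps):
    """Q from a resonance peak via the half-power (-3 dB) bandwidth: Q = f0/df."""
    i0 = max(range(len(amps)), key=lambda i: amps[i])
    f0, target = freqs[i0], amps[i0] / math.sqrt(2.0)

    def crossing(step):
        # walk outwards from the peak to the half-power crossing, then
        # linearly interpolate between the two bracketing grid points
        i = i0
        while 0 < i < len(amps) - 1 and amps[i] > target:
            i += step
        f1, a1 = freqs[i - step], amps[i - step]
        f2, a2 = freqs[i], amps[i]
        return f1 + (target - a1) * (f2 - f1) / (a2 - a1)

    return f0 / (crossing(+1) - crossing(-1))

# Synthetic resonance of a damped driven oscillator with known Q = 100
f0_true, q_true = 1000.0, 100.0
freqs = [900.0 + 0.05 * k for k in range(4001)]
amps = [1.0 / math.sqrt((f0_true**2 - f**2)**2 + (f0_true * f / q_true)**2)
        for f in freqs]
q_est = half_power_q(freqs, amps)
attenuation = 1000.0 / q_est   # the 1000/Q convention used in the abstract
```

For high-Q limestones (Q_P > 200) the half-power bandwidth becomes narrow and the frequency grid must be correspondingly fine, which is one reason the quoted accuracies differ between wave types and frequency bands.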
Abstract:
Crop irrigation has long been recognized as having been important for the evolution of social complexity in several parts of the world. Structural evidence for water management, such as wells, ditches and dams, is often difficult to interpret and may be a poor indicator of past irrigation that may have had no need for such constructions. It would be of considerable value, therefore, to be able to infer past irrigation directly from archaeo-botanical remains, and especially the type of archaeo-botanical remains that are relatively abundant in the archaeological record, such as phytoliths. Building on the pioneering work of Rosen and Wiener (1994), this paper describes a crop-growing experiment designed to explore the impact of irrigation on the formation of phytoliths within cereals. If it can be shown that a systematic and consistent relationship exists between phytolith size, structure and the intensity of irrigation, and if various taphonomic and palaeoenvironmental processes can be controlled for, then the presence of past irrigation can feasibly be inferred from the phytoliths recovered from the archaeological record.
Abstract:
The conceptual and parameter uncertainty of the semi-distributed INCA-N (Integrated Nutrients in Catchments-Nitrogen) model was studied using the GLUE (Generalized Likelihood Uncertainty Estimation) methodology combined with quantitative experimental knowledge, the concept known as 'soft data'. Cumulative inorganic N leaching, annual plant N uptake and annual mineralization proved to be useful soft data to constrain the parameter space. The INCA-N model was able to simulate the seasonal and inter-annual variations in the stream-water nitrate concentrations, although the lowest concentrations during the growing season were not reproduced. This suggested that there were some retention processes or losses, either in peatland/wetland areas or in the river, which were not included in the INCA-N model. The results of the study suggested that soft data offer a way to reduce parameter equifinality, and that the calibration and testing of distributed hydrological and nutrient leaching models should be based both on runoff and/or nutrient concentration data and on the qualitative knowledge of experimentalists. (c) 2006 Elsevier B.V. All rights reserved.
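The GLUE procedure can be sketched compactly: sample parameters from a prior range, score each simulation with an informal likelihood (here the Nash-Sutcliffe efficiency), and retain the behavioural runs. The surrogate leaching model, threshold, and data below are illustrative stand-ins, not INCA-N.

```python
import random

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, used here as an informal GLUE likelihood."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def toy_leaching(rate, inputs):
    # First-order N leaching from monthly inputs (illustrative surrogate model)
    store, out = 0.0, []
    for x in inputs:
        store += x
        flux = rate * store
        store -= flux
        out.append(flux)
    return out

random.seed(7)
inputs = [10, 12, 8, 15, 5, 9, 11, 7, 13, 6, 10, 8]
obs = toy_leaching(0.3, inputs)            # synthetic "truth" with known rate 0.3

behavioural = []
for _ in range(5000):
    rate = random.uniform(0.05, 0.95)
    like = nse(obs, toy_leaching(rate, inputs))
    if like > 0.7:                          # behavioural acceptance threshold
        behavioural.append((like, rate))

rates = sorted(r for _, r in behavioural)
lo, hi = rates[int(0.05 * len(rates))], rates[int(0.95 * len(rates))]
```

The role of soft data in the study is to add further acceptance criteria (e.g. plausible annual N uptake), shrinking the behavioural set and hence the equifinality among parameter values.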
Abstract:
From April 2010, the General Pharmaceutical Council (GPhC) will be responsible for the statutory regulation of pharmacists and pharmacy technicians in Great Britain (GB).[1] All statutorily regulated health professionals will need to periodically demonstrate their fitness-to-practise through a process of revalidation.[2] One option being considered in GB is that continuing professional development (CPD) records will form a part of the evidence submitted for revalidation, similar to the system in New Zealand.[3] At present, pharmacy professionals must make a minimum of nine CPD entries per annum from 1 March 2009 using the Royal Pharmaceutical Society of Great Britain (RPSGB) CPD framework. Our aim was to explore the applicability of new revalidation standards within the current CPD framework. We also wanted to review the content of CPD portfolios to assess strengths and qualities and identify any information gaps for the purpose of revalidation.
Abstract:
We advocate the use of systolic design techniques to create custom hardware for Custom Computing Machines. We have developed a hardware genetic algorithm based on systolic arrays to illustrate the feasibility of the approach. The architecture is independent of the lengths of chromosomes used and can be scaled in size to accommodate different population sizes. An FPGA prototype design can process 16 million genes per second.
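The pipeline that the systolic array implements can be illustrated in software. The sketch below runs the selection, crossover, and mutation stages as a conventional generational loop on the toy OneMax problem (maximize the number of 1-bits); all parameters are illustrative, and the code mirrors only the algorithmic stages, not the hardware architecture.

```python
import random

CHROM_LEN, POP, GENS = 32, 40, 60   # illustrative sizes; the hardware design
random.seed(3)                      # is independent of chromosome length

def fitness(chrom):
    return sum(chrom)                        # OneMax: count of 1-bits

def select(pop):
    a, b = random.sample(pop, 2)             # binary tournament selection
    return a if fitness(a) >= fitness(b) else b

def crossover(a, b):
    cut = random.randrange(1, CHROM_LEN)     # single-point crossover
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.01):
    return [bit ^ (random.random() < rate) for bit in chrom]

pop = [[random.randint(0, 1) for _ in range(CHROM_LEN)] for _ in range(POP)]
best_ever = max(map(fitness, pop))
for _ in range(GENS):
    # each stage here corresponds to a cell in the systolic pipeline,
    # through which genes stream one per clock cycle in hardware
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP)]
    best_ever = max(best_ever, max(map(fitness, pop)))
```

In the systolic realization these stages operate concurrently on a stream of genes rather than sequentially on a whole population, which is what yields the quoted throughput of millions of genes per second.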