983 results for Structure Project
Abstract:
This article results from three conferences organized by the research project titled “Architectural research framework”, developed by the research centre Architectural Lab (LabART) of the Lusófona University, and also from my personal experiences and dialogues with other members of the EAAE research committee. Architectural research has always existed, but some major questions emerged only recently, when Europe began its latest university reform in the 1980s. Two aspects are crucial to understanding the problem we are referring to. On the one hand, architectural teaching should maintain the articulation and close relationship between theoretical and practical aspects. On the other hand, there is a need to confer academic degrees, such as the MSc and PhD, in Faculties of Architecture. Inevitably, discussions began about the scientificity of architecture (its grounding), the types of research, methodological models, the evaluation criteria and quality of research, and the relevance of results. We will approach some of these discussions and, at the end, establish a basic structure that allows us to obtain an open model for research in architecture.
Abstract:
The atmospheric circulation changes predicted by climate models are often described using sea level pressure, which generally shows a strengthening of the mid-latitude westerlies. Recent observed variability is dominated by the Northern Annular Mode (NAM), which is equivalent barotropic, so that wind variations of the same sign are seen at all levels. However, in model predictions of the response to anthropogenic forcing, there is a well-known enhanced warming at low levels over the northern polar cap in winter. This means that there is a strong baroclinic component to the response. The projection of the response onto a NAM-like zonal index varies with height. While at the surface most models project positively onto the zonal index, throughout most of the depth of the troposphere many of the models give negative projections. The response to anthropogenic forcing therefore has a distinctive baroclinic signature which is very different from that of the NAM.
Abstract:
The technique of linear responsibility analysis is used for a retrospective case study of a private industrial development consisting of an extension to existing buildings to provide a warehouse, services block and packing line. The organizational structure adopted on the project is analysed using concepts from systems theory which are included in Walker's theoretical model of the structure of building project organizations (Walker, 1981). This model proposes that the process of building provision can be viewed as systems and subsystems which are differentiated from each other at decision points. Further to this, the subsystems can be viewed as the interaction of managing system and operating system. Using Walker's model, a systematic analysis of the relationships between the contributors gives a quantitative assessment of the efficacy of the organizational structure used. The causes of the client's dissatisfaction with the outcome of the project were lack of integration and complexity of the managing system. However, there was a high level of satisfaction with the completed project and this is reflected by the way in which the organization structure corresponded to the model's propositions.
Abstract:
The technique of linear responsibility analysis is used for a retrospective case study of a private development consisting of an extension to an existing building to provide a wholesale butchery facility. The project used a conventionally organized management process. The organization structure adopted on the project is analysed using concepts from systems theory, which are included in Walker's theoretical model of the structure of building project organizations. This model proposes that the process of building provision can be viewed as systems and sub-systems that are differentiated from each other at decision points. Further to this, the sub-systems can be viewed as the interaction of managing system and operating system. Using Walker's model, a systematic analysis of the relationships between the contributors gives a quantitative assessment of the efficiency of the organizational structure used. The project's organization structure diverged from the model's propositions, resulting in delay to the project's completion and cost overrun, but the client was satisfied with the project functionally.
Abstract:
The technique of linear responsibility analysis is used for a retrospective case study of a private industrial development consisting of an engineering factory and offices. A multi-disciplinary professional practice was used to manage and design the project. The organizational structure adopted on the project is analysed using concepts from systems theory which are included in Walker's theoretical model of the structure of building project organizations (Walker, 1981). This model proposes that the process of building provision can be viewed as systems and sub-systems which are differentiated from each other at decision points. Further to this, the sub-systems can be viewed as the interaction of managing system and operating system. Using Walker's model, a systematic analysis of the relationships between the contributors gives a quantitative assessment of the efficiency of the organizational structure used. There was a high level of satisfaction with the completed project, and this is reflected by the way in which the organization structure corresponded to the model's propositions. However, the project was subject to strong environmental forces which the project organization was not capable of entirely overcoming.
Abstract:
The management of a public sector project is analysed using a model developed from systems theory. Linear responsibility analysis is used to identify the primary and key decision structure of the project and to generate quantitative data regarding differentiation and integration of the operating system, the managing system and the client/project team. The environmental context of the project is identified. Conclusions are drawn regarding the project organization structure's ability to cope with the prevailing environmental conditions. It is found that the complex managing system imposed on the project was unable to cope with these conditions and created serious deficiencies in the outcome of the project.
Abstract:
Many weeds occur in patches but farmers frequently spray whole fields to control the weeds in these patches. Given a geo-referenced weed map, technology exists to confine spraying to these patches. Adoption of patch spraying by arable farmers has, however, been negligible partly due to the difficulty of constructing weed maps. Building on previous DEFRA and HGCA projects, this proposal aims to develop and evaluate a machine vision system to automate the weed mapping process. The project thereby addresses the principal technical stumbling block to widespread adoption of site specific weed management (SSWM). The accuracy of weed identification by machine vision based on a single field survey may be inadequate to create herbicide application maps. We therefore propose to test the hypothesis that sufficiently accurate weed maps can be constructed by integrating information from geo-referenced images captured automatically at different times of the year during normal field activities. Accuracy of identification will also be increased by utilising a priori knowledge of weeds present in fields. To prove this concept, images will be captured from arable fields on two farms and processed offline to identify and map the weeds, focussing especially on black-grass, wild oats, barren brome, couch grass and cleavers. As advocated by Lutman et al. (2002), the approach uncouples the weed mapping and treatment processes and builds on the observation that patches of these weeds are quite stable in arable fields. There are three main aspects to the project. 1) Machine vision hardware. Hardware component parts of the system are one or more cameras connected to a single board computer (Concurrent Solutions LLC) and interfaced with an accurate Global Positioning System (GPS) supplied by Patchwork Technology. The camera(s) will take separate measurements for each of the three primary colours of visible light (red, green and blue) in each pixel. 
The basic proof of concept can be achieved in principle using a single camera system, but in practice systems with more than one camera may need to be installed so that larger fractions of each field can be photographed. Hardware will be reviewed regularly during the project in response to feedback from other work packages and updated as required. 2) Image capture and weed identification software. The machine vision system will be attached to toolbars of farm machinery so that images can be collected during different field operations. Images will be captured at different ground speeds, in different directions and at different crop growth stages, as well as in different crop backgrounds. Having captured geo-referenced images in the field, image analysis software will be developed by Murray State and Reading Universities, with advice from The Arable Group, to identify weed species. A wide range of pattern recognition techniques, and in particular Bayesian Networks, will be used to advance the state of the art in machine vision-based weed identification and mapping. Weed identification algorithms used by others are inadequate for this project, as we intend to collect and correlate images captured at different growth stages. Plants grown for this purpose by Herbiseed will be used in the first instance. In addition, our image capture and analysis system will include plant characteristics such as leaf shape, size, vein structure, colour and textural pattern, some of which are not detectable by other machine vision systems or are omitted by their algorithms. Using such a list of features observable with our machine vision system, we will determine those that can be used to distinguish the weed species of interest. 3) Weed mapping. Geo-referenced maps of weeds in arable fields (Reading University and Syngenta) will be produced with advice from The Arable Group and Patchwork Technology.
Natural infestations will be mapped in the fields, but we will also introduce specimen plants in pots to facilitate more rigorous system evaluation and testing. Manual weed maps of the same fields will be generated by Reading University, Syngenta and Peter Lutman so that the accuracy of automated mapping can be assessed. The principal hypothesis and concept to be tested is that, by combining maps from several surveys, a weed map with acceptable accuracy for end-users can be produced. If the concept is proved and can be commercialised, systems could be retrofitted at low cost onto existing farm machinery. The outputs of the weed mapping software would then link with the precision farming options already built into many commercial sprayers, allowing their use for targeted, site-specific herbicide applications. Immediate economic benefits would therefore arise directly from reducing herbicide costs. SSWM will also reduce the overall pesticide load on the crop and so may reduce pesticide residues in food and drinking water, and reduce adverse impacts of pesticides on non-target species and beneficials. Farmers may even choose to leave unsprayed some non-injurious, environmentally beneficial, low-density weed infestations. These benefits fit very well with the anticipated legislation emerging in the new EU Thematic Strategy for Pesticides, which will encourage more targeted use of pesticides and greater uptake of Integrated Crop (Pest) Management approaches, and also with the requirements of the Water Framework Directive to reduce levels of pesticides in water bodies. The greater precision of weed management offered by SSWM is therefore a key element in preparing arable farming systems for a future where policy makers and consumers want to minimise pesticide use and the carbon footprint of farming while maintaining food production and security.
The mapping technology could also be used on organic farms to identify areas of fields needing mechanical weed control, thereby reducing both carbon footprints and damage to crops by, for example, spring tines. Objectives: (i) to develop a prototype machine vision system for automated image capture during agricultural field operations; (ii) to prove the concept that images captured by the machine vision system over a series of field operations can be processed to identify and geo-reference specific weeds in the field; (iii) to generate weed maps from the geo-referenced weed plants/patches identified in objective (ii).
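As a rough illustration of the per-pixel processing such a system might start from: the abstract specifies only that red, green and blue are measured for each pixel, so the excess-green index, threshold value and function below are illustrative assumptions, not the project's actual algorithms. A common first step in machine-vision weed work is separating vegetation pixels from soil before any species-level classification:

```python
import numpy as np

def excess_green_mask(image, threshold=20):
    """Flag likely-vegetation pixels via the excess-green index
    ExG = 2G - R - B (illustrative; the project's own algorithms
    are not described in the abstract).

    image: H x W x 3 uint8 array, channels ordered R, G, B.
    Returns a boolean mask, True where vegetation is likely.
    """
    rgb = image.astype(np.int16)  # widen dtype to avoid uint8 overflow
    exg = 2 * rgb[..., 1] - rgb[..., 0] - rgb[..., 2]
    return exg > threshold

# Tiny synthetic example: one green "plant" pixel, one grey "soil" pixel.
img = np.array([[[40, 180, 50],        # green pixel -> vegetation
                 [120, 120, 120]]],    # grey pixel  -> soil
               dtype=np.uint8)
mask = excess_green_mask(img)          # [[True, False]]
```

In the project itself, species identification would rest on much richer features (leaf shape, vein structure, texture) and Bayesian networks rather than a fixed threshold; this sketch only shows the kind of RGB arithmetic the single-board computer would perform on each geo-referenced image.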
Abstract:
The Solar TErrestrial RElations Observatory (STEREO) provides high cadence and high resolution images of the structure and morphology of coronal mass ejections (CMEs) in the inner heliosphere. CME directions and propagation speeds have often been estimated through the use of time-elongation maps obtained from the STEREO Heliospheric Imager (HI) data. Many of these CMEs have been identified by citizen scientists working within the SolarStormWatch project ( www.solarstormwatch.com ) as they work towards providing robust real-time identification of Earth-directed CMEs. The wide field of view of HI allows scientists to directly observe the two-dimensional (2D) structures, while the relative simplicity of time-elongation analysis means that it can be easily applied to many such events, thereby enabling a much deeper understanding of how CMEs evolve between the Sun and the Earth. For events with certain orientations, both the rear and front edges of the CME can be monitored at varying heliocentric distances (R) between the Sun and 1 AU. Here we take four example events with measurable position angle widths and identified by the citizen scientists. These events were chosen for the clarity of their structure within the HI cameras and their long track lengths in the time-elongation maps. We show a linear dependency with R for the growth of the radial width (W) and the 2D aspect ratio (χ) of these CMEs, which are measured out to ≈ 0.7 AU. We estimated the radial width from a linear best fit for the average of the four CMEs. We obtained the relationships W=0.14R+0.04 for the width and χ=2.5R+0.86 for the aspect ratio (W and R in units of AU).
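The two fitted relations reported above can be applied directly. A minimal sketch (function names are illustrative; the fits are averages over four CMEs and were measured only out to roughly 0.7 AU, so values beyond that are extrapolation):

```python
def cme_radial_width(r_au):
    """Radial width W in AU from the reported average fit W = 0.14 R + 0.04."""
    return 0.14 * r_au + 0.04

def cme_aspect_ratio(r_au):
    """2D aspect ratio chi from the reported fit chi = 2.5 R + 0.86."""
    return 2.5 * r_au + 0.86

# Example at a heliocentric distance of 0.5 AU (inside the measured range):
w = cme_radial_width(0.5)     # 0.11 AU
chi = cme_aspect_ratio(0.5)   # 2.11
```

Note the nonzero intercepts: both fits imply a finite width and aspect ratio at R = 0, which simply reflects the linear approximation over the observed range rather than conditions at the Sun itself.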
Abstract:
Climate modeling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of ever more complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce this data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid digital preservation of climate models as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team’s development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change’s (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
Abstract:
Sea surface temperature (SST) measurements are required by operational ocean and atmospheric forecasting systems to constrain modeled upper ocean circulation and thermal structure. The Global Ocean Data Assimilation Experiment (GODAE) High Resolution SST Pilot Project (GHRSST-PP) was initiated to address these needs by coordinating the provision of accurate, high-resolution, SST products for the global domain. The pilot project is now complete, but activities continue within the Group for High Resolution SST (GHRSST). The pilot project focused on harmonizing diverse satellite and in situ data streams that were indexed, processed, quality controlled, analyzed, and documented within a Regional/Global Task Sharing (R/GTS) framework implemented in an internationally distributed manner. Data with meaningful error estimates developed within GHRSST are provided by services within R/GTS. Currently, several terabytes of data are processed at international centers daily, creating more than 25 gigabytes of product. Ensemble SST analyses together with anomaly SST outputs are generated each day, providing confidence in SST analyses via diagnostic outputs. Diagnostic data sets are generated and Web interfaces are provided to monitor the quality of observation and analysis products. GHRSST research and development projects continue to tackle problems of instrument calibration, algorithm development, diurnal variability, skin temperature deviation, and validation/verification of GHRSST products. GHRSST also works closely with applications and users, providing a forum for discussion and feedback between SST users and producers on a regular basis. All data within the GHRSST R/GTS framework are freely available. This paper reviews the progress of GHRSST-PP, highlighting achievements that have been fundamental to the success of the pilot project.
Abstract:
The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.
Abstract:
The ClearfLo project provides integrated measurements of the meteorology, composition and particulate loading of London's urban atmosphere to improve predictive capability for air quality. Air quality and heat are strong health drivers, and their accurate assessment and forecast are important in densely populated urban areas. However, the sources and processes leading to high concentrations of main pollutants such as ozone, nitrogen dioxide, and fine and coarse particulate matter in complex urban areas are not fully understood, limiting our ability to forecast air quality accurately. This paper introduces the ClearfLo project's interdisciplinary approach to investigating the processes leading to poor air quality and elevated temperatures. Within ClearfLo (www.clearflo.ac.uk), a large multi-institutional project funded by the UK Natural Environment Research Council (NERC), integrated measurements of meteorology and of gaseous and particulate composition/loading within London's atmosphere were undertaken to understand the processes underlying poor air quality. Long-term measurement infrastructure installed at multiple levels (street and elevated), and at urban background, kerbside and rural locations, was complemented with high-resolution numerical atmospheric simulations. Combining these measurement and modelling approaches enhances understanding of seasonal variations in meteorology and composition, together with the controlling processes. Two intensive observation periods (winter 2012 and summer Olympics 2012) focus upon the vertical structure and evolution of the urban boundary layer; chemical controls on nitrogen dioxide and ozone production, in particular the role of volatile organic compounds; and processes controlling the evolution, size distribution and composition of particulate matter. The paper shows that mixing heights are deeper over London than in the rural surroundings and that the seasonality of the urban boundary layer evolution controls when concentrations peak.
The composition also reflects the seasonality of sources such as domestic burning and biogenic emissions.
Abstract:
Many theories for the Madden-Julian oscillation (MJO) focus on diabatic processes, particularly the evolution of vertical heating and moistening. Poor MJO performance in weather and climate models is often blamed on biases in these processes and their interactions with the large-scale circulation. We introduce one of three components of a model-evaluation project, which aims to connect MJO fidelity in models to their representations of several physical processes, focusing on diabatic heating and moistening. This component consists of 20-day hindcasts, initialised daily during two MJO events in winter 2009-10. The 13 models exhibit a range of skill: several have accurate forecasts to 20 days' lead, while others perform similarly to statistical models (8-11 days). Models that maintain the observed MJO amplitude accurately predict propagation, but not vice versa. We find no link between hindcast fidelity and the precipitation-moisture relationship, in contrast to other recent studies. There is also no relationship between models' performance and the evolution of their diabatic-heating profiles with rain rate. A more robust association emerges between models' fidelity and net moistening: the highest-skill models show a clear transition from low-level moistening for light rainfall to mid-level moistening at moderate rainfall and upper-level moistening for heavy rainfall. The mid-level moistening, arising from both dynamics and physics, may be most important. Accurately representing many processes may be necessary, but not sufficient for capturing the MJO, which suggests that models fail to predict the MJO for a broad range of reasons and limits the possibility of finding a panacea.
Abstract:
The "Vertical structure and physical processes of the Madden-Julian oscillation (MJO)" project comprises three experiments, designed to evaluate comprehensively the heating, moistening and momentum associated with tropical convection in general circulation models (GCMs). We consider here only those GCMs that performed all experiments. Some models display relatively higher or lower MJO fidelity in both initialized hindcasts and climate simulations, while others show considerable variations in fidelity between experiments. Fidelity in hindcasts and climate simulations are not meaningfully correlated. The analysis of each experiment led to the development of process-oriented diagnostics, some of which distinguished between GCMs with higher or lower fidelity in that experiment. We select the most discriminating diagnostics and apply them to data from all experiments, where possible, to determine if correlations with MJO fidelity hold across scales and GCM states. While normalized gross moist stability had a small but statistically significant correlation with MJO fidelity in climate simulations, we find no link with fidelity in medium-range hindcasts. Similarly, there is no association between timestep-to-timestep rainfall variability, identified from short hindcasts, and fidelity in medium-range hindcasts or climate simulations. Two metrics that relate precipitation to free-tropospheric moisture (the relative humidity for extreme daily precipitation, and variations in the height and amplitude of moistening with rain rate) successfully distinguish between higher- and lower-fidelity GCMs in hindcasts and climate simulations. To improve the MJO, developers should focus on relationships between convection and both total moisture and its rate of change. We conclude by offering recommendations for further experiments.
Abstract:
Aimed at reducing deficiencies in representing the Madden-Julian oscillation (MJO) in general circulation models (GCMs), a global model evaluation project on vertical structure and physical processes of the MJO was coordinated. In this paper, results from the climate simulation component of this project are reported. It is shown that the MJO remains a great challenge in these latest generation GCMs. The systematic eastward propagation of the MJO is only well simulated in about one-fourth of the total participating models. The observed vertical westward tilt with altitude of the MJO is well simulated in good MJO models, but not in the poor ones. Damped Kelvin wave responses to the east of convection in the lower troposphere could be responsible for the missing MJO preconditioning process in these poor MJO models. Several process-oriented diagnostics were conducted to discriminate key processes for realistic MJO simulations. While large-scale rainfall partition and low-level mean zonal winds over the Indo-Pacific in a model are not found to be closely associated with its MJO skill, two metrics, including the low-level relative humidity difference between high and low rain events and seasonal mean gross moist stability, exhibit statistically significant correlations with the MJO performance. It is further indicated that increased cloud-radiative feedback tends to be associated with reduced amplitude of intraseasonal variability, which is incompatible with the radiative instability theory previously proposed for the MJO. Results in this study confirm that inclusion of air-sea interaction can lead to significant improvement in simulating the MJO.