59 results for modeling and visualization
Abstract:
As digital technologies become widely used in designing buildings and infrastructure, questions arise about their impact on construction safety. This review explores the relationship between construction safety and digital design practices with the aim of fostering and directing further research. It surveys state-of-the-art research on databases, virtual reality, geographic information systems, 4D CAD, building information modeling and sensing technologies, finding various digital tools for addressing safety issues in the construction phase, but few tools to support design for construction safety. It also considers the literature on safety-critical, digital and design practices, which raises a general concern about ‘mindlessness’ in the use of technologies and has implications for the emerging research agenda around construction safety and digital design. Bringing these strands of literature together suggests new kinds of interventions, such as the development of tools and processes for using digital models to promote mindfulness through multi-party collaboration on safety.
Abstract:
Time-resolved studies of chlorosilylene, ClSiH, generated by the 193 nm laser flash photolysis of 1-chloro-1-silacyclopent-3-ene, have been carried out to obtain rate constants for its bimolecular reaction with trimethylsilane-1-d, Me3SiD, in the gas phase. The reaction was studied at total pressures up to 100 Torr (with and without added SF6) over the temperature range of 295−407 K. The rate constants were found to be pressure independent and gave the following Arrhenius equation: log[k/(cm³ molecule⁻¹ s⁻¹)] = (−13.22 ± 0.15) + [(13.20 ± 1.00) kJ mol⁻¹]/(RT ln 10). When compared with previously published kinetic data for the reaction of ClSiH with Me3SiH, kinetic isotope effects, kD/kH, in the range from 7.4 (297 K) to 6.4 (407 K) were obtained. These far exceed values of 0.4−0.5 estimated for a single-step insertion process. Quantum chemical calculations (G3MP2B3 level) confirm not only the involvement of an intermediate complex, but also the existence of a low-energy internal isomerization pathway which can scramble the D and H atom labels. By means of Rice−Ramsperger−Kassel−Marcus modeling and a necessary (but small) refinement of the energy surface, we have shown that this mechanism can reproduce closely the experimental isotope effects. These findings provide the first experimental evidence for the isomerization pathway and thereby offer the most concrete evidence to date for the existence of intermediate complexes in the insertion reactions of silylenes.
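As a quick worked illustration (ours, not the paper's), the Arrhenius fit above can be evaluated directly; this Python sketch computes k at the two ends of the studied temperature range. The function name is invented for the example.

```python
import math

R = 8.314e-3  # gas constant, kJ mol^-1 K^-1

def k_ClSiH_Me3SiD(T):
    """Rate constant (cm^3 molecule^-1 s^-1) from the reported Arrhenius fit:
    log10 k = -13.22 + 13.20 kJ/mol / (R T ln 10)."""
    log10_k = -13.22 + 13.20 / (R * T * math.log(10))
    return 10.0 ** log10_k

for T in (295.0, 407.0):
    print(f"T = {T:.0f} K: k = {k_ClSiH_Me3SiD(T):.2e} cm^3 molecule^-1 s^-1")
```

The positive term in the exponent means k falls as temperature rises, a negative activation energy consistent with the intermediate-complex mechanism the abstract argues for.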
Abstract:
The temporal relationship between changes in cerebral blood flow (CBF) and cerebral blood volume (CBV) is important in the biophysical modeling and interpretation of the hemodynamic response to activation, particularly in the context of magnetic resonance imaging and the blood oxygen level-dependent signal. Grubb et al. (1974) measured the steady-state relationship between changes in CBV and CBF after hypercapnic challenge. The relationship CBV ∝ CBF^Φ has been used extensively in the literature. Two similar models, the Balloon (Buxton et al., 1998) and the Windkessel (Mandeville et al., 1999), have been proposed to describe the temporal dynamics of changes in CBV with respect to changes in CBF. In this study, a dynamic model extending the Windkessel model by incorporating delayed compliance is presented. The extended model is better able to capture the dynamics of CBV changes after changes in CBF, particularly in the return-to-baseline stages of the response.
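To make the steady-state relation concrete, here is a minimal Python sketch (ours, not the paper's model): CBV follows Grubb's power law at steady state, with a simple first-order lag standing in for delayed compliance. The exponent 0.38 is the commonly quoted Grubb value, and the time constant is purely illustrative; the paper's extended Windkessel model is more elaborate than this.

```python
import numpy as np

PHI = 0.38   # Grubb exponent (commonly quoted value; an assumption here)
TAU = 5.0    # delayed-compliance time constant in s; purely illustrative

def cbv_response(cbf, dt=0.1):
    """Normalized CBV driven by normalized CBF.
    Steady state follows Grubb: v_ss = f**PHI; v relaxes toward v_ss
    with time constant TAU, mimicking delayed compliance."""
    v = np.empty_like(cbf)
    v[0] = cbf[0] ** PHI
    for i in range(1, len(cbf)):
        v_ss = cbf[i] ** PHI
        v[i] = v[i - 1] + dt * (v_ss - v[i - 1]) / TAU
    return v

# 60 s stimulus: CBF steps from baseline (1.0) to 1.5 and back
t = np.arange(0, 120, 0.1)
cbf = np.where((t > 20) & (t < 80), 1.5, 1.0)
cbv = cbv_response(cbf)
print(f"peak CBV change: {cbv.max() - 1:.3f} (steady-state: {1.5**PHI - 1:.3f})")
```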
Observations of the eruption of the Sarychev volcano and simulations using the HadGEM2 climate model
Abstract:
In June 2009 the Sarychev volcano, located in the Kuril Islands to the northeast of Japan, erupted explosively, injecting ash and an estimated 1.2 ± 0.2 Tg of sulfur dioxide (SO2) into the upper troposphere and lower stratosphere, making it arguably one of the 10 largest stratospheric injections in the last 50 years. During the period immediately after the eruption, we show that the SO2 cloud was clearly detected by retrievals developed for the Infrared Atmospheric Sounding Interferometer (IASI) satellite instrument and that the resultant stratospheric sulfate aerosol was detected by the Optical Spectrograph and Infrared Imaging System (OSIRIS) limb sounder and the CALIPSO lidar. Additional surface-based instrumentation allows assessment of the impact of the eruption on the stratospheric aerosol optical depth. We use a nudged version of the HadGEM2 climate model to investigate how well this state-of-the-science climate model can replicate the distributions of SO2 and sulfate aerosol. The model simulations and OSIRIS measurements suggest that in the Northern Hemisphere the stratospheric aerosol optical depth was enhanced by around a factor of 3 (0.01 at 550 nm), with resultant impacts upon the radiation budget. The simulations indicate that, in the Northern Hemisphere for July 2009, the magnitude of the mean radiative impact from the volcanic aerosols is more than 60% of the direct radiative forcing of all anthropogenic aerosols combined. While the cooling induced by the eruption will likely not be detectable in the observational record, the combination of modeling and measurements would provide an ideal framework for simulating future larger volcanic eruptions.
Abstract:
The DIAMET (DIAbatic influences on Mesoscale structures in ExTratropical storms) project aims to improve forecasts of high-impact weather in extratropical cyclones through field measurements, high-resolution numerical modeling, and improved design of ensemble forecasting and data assimilation systems. This article introduces DIAMET and presents some of the first results. Four field campaigns were conducted by the project, one of which, in late 2011, coincided with an exceptionally stormy period marked by an unusually strong, zonal North Atlantic jet stream and a succession of severe windstorms in northwest Europe. As a result, December 2011 had the highest monthly North Atlantic Oscillation index (2.52) of any December in the last 60 years. Detailed observations of several of these storms were gathered using the UK’s BAe146 research aircraft and extensive ground-based measurements. As an example of the results obtained during the campaign, observations are presented of cyclone Friedhelm on 8 December 2011, when surface winds with gusts exceeding 30 m s⁻¹ crossed central Scotland, leading to widespread disruption to transportation and electricity supply. Friedhelm deepened 44 hPa in 24 hours and developed a pronounced bent-back front wrapping around the storm center. The strongest winds at 850 hPa and the surface occurred in the southern quadrant of the storm, and detailed measurements showed these to be most intense in clear air between bands of showers. High-resolution ensemble forecasts from the Met Office showed similar features, with the strongest winds aligned in linear swaths between the bands, suggesting that there is potential for improved skill in forecasts of damaging winds.
Abstract:
Understanding how and why the capability of one set of business resources, with its structural arrangements and mechanisms, works better than another can provide competitive advantage in terms of new business processes and product and service development. However, most business models of capability are descriptive and lack a formal modelling language with which to compare capabilities qualitatively and quantitatively. Gibson’s theory of affordance, the potential for action, provides a formal basis for a more robust and quantitative model, but most formal affordance models are complex and abstract and lack support for real-world applications. We aim to understand the ‘how’ and ‘why’ of business capability by developing a quantitative and qualitative model that underpins earlier work on Capability-Affordance Modelling (CAM). This paper integrates an affordance-based capability model with the formalism of Coloured Petri Nets to develop a simulation model. Using the model, we show how capability depends on the space-time path of interacting resources, the mechanism of transition and specific critical affordance factors relating to the values of the variables for resources, people and physical objects. We show how the model can identify the resource capabilities required to inject a drug and anaesthetise a patient.
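To make the Petri-net framing concrete, below is a toy place/transition net in Python for the drug-injection example; it illustrates the formalism only and is not the paper's coloured Petri net (which additionally carries typed tokens). All names are invented for the example.

```python
# Toy place/transition net: resources are tokens, capabilities are transitions.
marking = {"syringe": 1, "drug_vial": 1, "clinician": 1,
           "drug_drawn": 0, "patient_anaesthetised": 0}

# transition name -> (places consumed, places produced)
TRANSITIONS = {
    "draw_drug": ({"syringe": 1, "drug_vial": 1}, {"drug_drawn": 1}),
    "inject":    ({"drug_drawn": 1, "clinician": 1}, {"patient_anaesthetised": 1}),
}

def enabled(name):
    pre, _ = TRANSITIONS[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = TRANSITIONS[name]
    assert enabled(name), f"{name} is not enabled"
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

for t in ("draw_drug", "inject"):
    fire(t)
print(marking)  # capability realised when patient_anaesthetised == 1
```

Each transition fires only when the required resource tokens are jointly available, which is the affordance idea in miniature.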
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported, and replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs and operational, and ultimately reputational, risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence, and little money to apply to the solution. This paper presents a low-cost, structured method to identify obsolete software and the risk of its obsolescence: the structure of a business and its supporting IT resources are captured, modelled and analysed, and the risk to the business of technology obsolescence is identified, enabling remedial action based on qualified obsolescence information. The technique rests on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in three consulting studies carried out by Capgemini involving four UK police forces. The generic technique could, however, be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of the enterprise architecture meta-models and related modelling.
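The abstract does not give the heatmap algorithm itself, so the following Python sketch is only one plausible scoring scheme: each component's risk combines business criticality with proximity to its end-of-support date and is then bucketed into heatmap bands. All components, dates and thresholds are invented.

```python
from datetime import date

# Illustrative obsolescence heatmap scoring; the paper's actual algorithm
# operates over enterprise architecture models and is not reproduced here.
COMPONENTS = [
    # (name, end-of-support date, business criticality 1-5)
    ("case-management app", date(2026, 6, 30), 5),
    ("records database",    date(2031, 1, 1),  4),
    ("legacy reporting",    date(2024, 12, 31), 2),
]

def risk(end_of_support, criticality, today=date(2025, 1, 1)):
    """Higher score = nearer obsolescence x more critical to the business."""
    years_left = max((end_of_support - today).days / 365.25, 0.0)
    urgency = 1.0 / (1.0 + years_left)   # 1.0 if already unsupported
    return criticality * urgency

for name, eos, crit in COMPONENTS:
    score = risk(eos, crit)
    band = "RED" if score >= 2.0 else "AMBER" if score >= 1.0 else "GREEN"
    print(f"{name:22s} score={score:.2f} {band}")
```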
Abstract:
The idea of buildings in harmony with nature can be traced back to ancient times. Increasing concern with sustainability-oriented buildings has added new challenges to architectural design and called for new design responses. Sustainable design integrates and balances human geometries and natural ones. Since fractal geometry has been called the language of nature, it is natural to assume that it could play a role in developing new forms of aesthetics and sustainable architectural design. This paper gives a brief description of fractal geometry theory and presents its current status and recent developments through an illustrative review of fractal case studies in architectural design, providing a bridge between fractal geometry and architectural design.
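As a concrete taste of the geometry involved (our example, not from the paper), the Python sketch below estimates the box-counting dimension of the Sierpinski triangle, a standard fractal, by generating points with the chaos game and counting occupied boxes at several scales.

```python
import numpy as np

rng = np.random.default_rng(0)
verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])

# chaos game: iterating p -> (p + random vertex)/2 fills in the Sierpinski triangle
pts = np.zeros((200_000, 2))
p = rng.random(2)
for i, k in enumerate(rng.integers(0, 3, size=len(pts))):
    p = (p + verts[k]) / 2
    pts[i] = p

# box counting: N(eps) ~ eps^(-D), so D is the slope of log N against log(1/eps)
log_inv_eps, log_counts = [], []
for k in range(2, 8):
    eps = 2.0 ** -k
    boxes = {(int(x / eps), int(y / eps)) for x, y in pts}
    log_inv_eps.append(np.log(1 / eps))
    log_counts.append(np.log(len(boxes)))

slope = np.polyfit(log_inv_eps, log_counts, 1)[0]
print(f"estimated dimension ~ {slope:.2f} (exact: log 3 / log 2 = 1.585)")
```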
Abstract:
Land cover maps at different resolutions and mapping extents contribute to modeling and support decision-making processes. Because land cover affects, and is affected by, climate change, it is listed among the 13 terrestrial essential climate variables. This paper describes the generation of a land cover map for Latin America and the Caribbean (LAC) for the year 2008. It was developed in the framework of the project Latin American Network for Monitoring and Studying of Natural Resources (SERENA), which has been developed within the GOFC-GOLD Latin American network of remote sensing and forest fires (RedLaTIF). The SERENA land cover map for LAC integrates: 1) the local expertise of SERENA network members to generate the training and validation data, 2) a methodology for land cover mapping based on decision trees using MODIS time series, and 3) class membership estimates to account for pixel heterogeneity issues. The discrete SERENA land cover product, derived from class memberships, yields an overall accuracy of 84% and includes an additional layer representing the estimated per-pixel confidence. The study demonstrates in detail the use of class memberships to better estimate the area of scarce classes with a scattered spatial distribution. The land cover map is already available as a printed wall map and will be released in digital format in the near future. The SERENA land cover map was produced with a legend and classification strategy similar to those used by the North American Land Change Monitoring System (NALCMS) to generate a land cover map of the North American continent, which will make it possible to combine both maps into consistent data across the Americas, facilitating continental monitoring and modeling.
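The class-membership idea can be sketched as follows (a toy Python example assuming scikit-learn is available; the data are synthetic stand-ins for MODIS time-series features, not the SERENA data): a decision tree yields per-pixel class probabilities, the argmax gives the discrete map, and the maximum probability doubles as the per-pixel confidence layer.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n = 1000
X = rng.normal(size=(n, 12))                 # e.g. 12 monthly NDVI-like values
y = (X[:, :6].mean(axis=1) > 0).astype(int)  # two fake land-cover classes

clf = DecisionTreeClassifier(max_depth=5).fit(X, y)
memberships = clf.predict_proba(X)           # per-pixel class memberships
label = memberships.argmax(axis=1)           # discrete land cover map
confidence = memberships.max(axis=1)         # per-pixel confidence layer
print(f"mean per-pixel confidence: {confidence.mean():.2f}")
```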
Abstract:
Key Performance Indicators (KPIs) are the main instruments of Business Performance Management. KPIs are measures that are tied to both the strategy and the business processes. These measures are often designed for an industry sector under assumptions about the business processes in organizations. However, those assumptions can be too incomplete to guarantee the required properties of KPIs. This raises the need to validate the properties of KPIs prior to their application to performance measurement. This paper applies the method called EXecutable Requirements Engineering Management and Evolution (EXTREME) to the validation of KPI definitions. EXTREME semantically relates the goal modeling, conceptual modeling and protocol modeling techniques into one methodology. The synchronous composition built into protocol modeling enables traceability of goals in protocol models and constructive definitions of a KPI. The application of the method clarifies the meaning of KPI properties and the procedures for their assessment and validation.
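As a loose illustration of validating a KPI against a process model (a toy Python sketch, not EXTREME or protocol modelling proper): a protocol is encoded as allowed event transitions, traces are checked for conformance, and a KPI is then evaluated only over conforming traces, so the measure is well defined with respect to the model. All names are invented.

```python
# Allowed event-to-event transitions of a toy approval process
ALLOWED = {("start", "submit"), ("submit", "approve"), ("submit", "reject"),
           ("approve", "done"), ("reject", "done")}

def valid_trace(trace):
    """A trace conforms if every consecutive event pair is allowed."""
    return all((a, b) in ALLOWED for a, b in zip(trace, trace[1:]))

def kpi_approval_rate(traces):
    """KPI: fraction of completed cases that went through approval."""
    approved = sum(t[-1] == "done" and "approve" in t for t in traces)
    return approved / len(traces)

traces = [["start", "submit", "approve", "done"],
          ["start", "submit", "reject", "done"]]
assert all(valid_trace(t) for t in traces)   # traces conform to the protocol
print(f"approval rate: {kpi_approval_rate(traces):.0%}")
```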
Abstract:
Horticultural science, linked with basic studies in biology, chemistry, physics and engineering, has laid the foundation for advances in applied knowledge which are at the heart of commercial, environmental and social horticulture. In few disciplines is science more rapidly translated into applicable technologies than in the huge range of human activities embraced within horticulture, which are discussed in this Trilogy. This chapter surveys the origins of horticultural science as it developed as an integral part of the 16th-century “Scientific Revolution”. It identifies early discoveries of the late 19th and early 20th centuries which rationalized the control of plant growth, flowering and fruiting and the media in which crops could be cultivated. The products of these discoveries formed the basis on which huge current industries of worldwide significance in fruit, vegetable and ornamental production are founded. More recent examples of the application of horticultural science are used to explain how the integration of plant breeding, crop selection and astute marketing, highlighted by the New Zealand industry, has retained and expanded the viability of production which supplies huge volumes of fruit into the world’s markets. This is followed by an examination of science applied to tissue and cell culture as an example of technologies which have already produced massive industrial applications but hold the prospect of generating even greater advances in the future. Finally, examples are given of nascent scientific discoveries which could generate horticultural industries with considerable future impact. These include systems modeling and biology, nanotechnology, robotics, automation and electronics, genetics and plant breeding, and the more efficient and effective use of resources and the employment of benign microbes. The chapter concludes with an estimate of the value of horticultural science to society.
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre’s HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world’s coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid’s Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with “mpirun”) are simply replaced with calls to “GRexRun”; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front end to larger-scale grid resources such as the UK National Grid Service.
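The submit-then-stream interaction described above can be sketched as a small REST client in Python. Everything here is hypothetical: the endpoint paths and JSON fields are invented for illustration and are not G-Rex's actual API; only the overall pattern (submit a job, poll it, pull output back incrementally during the run) comes from the text.

```python
import time
import requests  # third-party HTTP client

BASE = "http://cluster.example.org/grex"   # hypothetical service URL

def run_remote(model, input_files):
    """Sketch of the submit-then-stream pattern described above: launch a job,
    then pull output back incrementally while the run is still in progress."""
    job = requests.post(f"{BASE}/jobs",
                        json={"model": model, "inputs": input_files}).json()
    while True:
        status = requests.get(f"{BASE}/jobs/{job['id']}").json()
        # transfer new output as it appears, so it never accumulates remotely
        for name in status.get("new_output", []):
            data = requests.get(f"{BASE}/jobs/{job['id']}/output/{name}")
            with open(name, "wb") as f:
                f.write(data.content)
        if status["state"] in ("finished", "failed"):
            return status["state"]
        time.sleep(30)  # poll interval; arbitrary choice
```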
Abstract:
Results are presented from a new web application called OceanDIVA - Ocean Data Intercomparison and Visualization Application. This tool reads hydrographic profiles and ocean model output and presents the data either on depth levels or on isotherms, for viewing in Google Earth, or as probability density functions (PDFs) of regional model-data misfits. As part of the CLIVAR Global Synthesis and Observations Panel, an intercomparison of the water mass properties of various ocean syntheses has been undertaken using OceanDIVA. Analysis of model-data misfits reveals significant differences between the water mass properties of the syntheses, such as in their ability to capture mode water properties.
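The PDF-of-misfits idea reduces to a histogram of model-minus-observation differences; the Python sketch below shows the computation on synthetic numbers (invented stand-ins, not real hydrographic profiles or syntheses).

```python
import numpy as np

rng = np.random.default_rng(2)
obs_temp = rng.normal(12.0, 2.0, size=5000)            # observed temperatures, degC
model_temp = obs_temp + rng.normal(0.3, 0.8, 5000)     # model with bias and noise

misfit = model_temp - obs_temp                         # model-data misfit per profile
pdf, edges = np.histogram(misfit, bins=40, density=True)  # the regional misfit PDF
print(f"mean misfit {misfit.mean():+.2f} degC, spread {misfit.std():.2f} degC")
```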