125 results for Isotropic and Anisotropic models
Abstract:
We analyse by simulation the impact of model-selection strategies (sometimes called pre-testing) on forecast performance in both constant- and non-constant-parameter processes. Restricted, unrestricted and selected models are compared when either of the first two might generate the data. We find little evidence that strategies such as general-to-specific induce significant over-fitting, or thereby cause forecast-failure rejection rates to greatly exceed nominal sizes. Parameter non-constancies put a premium on correct specification, but in general, model-selection effects appear to be relatively small, and progressive research is able to detect the mis-specifications.
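As an illustration of the kind of selection strategy discussed above, the sketch below runs a simple backward-elimination variant of general-to-specific selection on simulated data. The regressor names, the 5% significance threshold and the data-generating process are illustrative assumptions, not the authors' experimental design.

```python
# Illustrative backward-elimination (general-to-specific) selection loop.
# All names, thresholds and data are placeholders, not the paper's simulation design.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
X = pd.DataFrame(rng.normal(size=(n, 5)), columns=[f"x{i}" for i in range(5)])
y = 1.0 + 2.0 * X["x0"] - 1.5 * X["x1"] + rng.normal(size=n)  # only x0, x1 matter

kept = list(X.columns)
while kept:
    fit = sm.OLS(y, sm.add_constant(X[kept])).fit()
    pvals = fit.pvalues.drop("const")      # p-values of the candidate regressors
    worst = pvals.idxmax()
    if pvals[worst] < 0.05:                # every retained term is significant: stop
        break
    kept.remove(worst)                     # drop the least significant regressor

print("selected regressors:", kept)
```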
Abstract:
We develop the essential ingredients of a new, continuum and anisotropic model of sea-ice dynamics designed for eventual use in climate simulation. These ingredients are a constitutive law for sea-ice stress, relating stress to the material properties of sea ice and to internal variables describing the sea-ice state, and equations describing the evolution of these variables. The sea-ice cover is treated as a densely flawed two-dimensional continuum consisting of a uniform field of thick ice that is uniformly permeated with narrow linear regions of thinner ice called leads. Lead orientation, thickness and width distributions are described by second-rank tensor internal variables: the structure, thickness and width tensors, whose dynamics are governed by corresponding evolution equations accounting for processes such as new lead generation and rotation as the ice cover deforms. These evolution equations contain contractions of higher-order tensor expressions that require closures. We develop a sea-ice stress constitutive law that relates sea-ice stress to the structure tensor, thickness tensor and strain rate. For the special case of empty leads (containing no ice), linear closures are adopted and we present calculations for simple shear, convergence and divergence.
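To make the tensor machinery concrete, the schematic relations below show the general shape of a second-rank structure tensor built from a lead-orientation distribution, its evolution under deformation, and a stress law depending on the internal variables. The symbols are assumptions for illustration only (ϑ for the orientation distribution, W and D for the spin and strain-rate tensors, H for the thickness tensor), and the generation term and closures are placeholders, not those derived in the paper.

```latex
% Schematic forms only: F_gen and the functional form of the stress law are placeholders.
\mathbf{A} = \int_{0}^{\pi} \hat{\mathbf{r}}(\theta)\otimes\hat{\mathbf{r}}(\theta)\,
             \vartheta(\theta)\,\mathrm{d}\theta , \qquad \operatorname{tr}\mathbf{A}=1 ,
\qquad
\dot{\mathbf{A}} = \mathbf{W}\mathbf{A}-\mathbf{A}\mathbf{W}
                   + \mathbf{F}_{\mathrm{gen}}(\mathbf{A},\mathbf{D}) ,
\qquad
\boldsymbol{\sigma} = \boldsymbol{\sigma}(\mathbf{A},\mathbf{H},\mathbf{D}) .
```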
Abstract:
Seventeen simulations of the Last Glacial Maximum (LGM) climate have been performed using atmospheric general circulation models (AGCMs) in the framework of the Paleoclimate Modeling Intercomparison Project (PMIP). These simulations use prescribed boundary conditions for CO2, insolation and ice sheets; sea surface temperatures (SSTs) are either (a) prescribed using the CLIMAP data set (eight models) or (b) computed by coupling the AGCM with a slab ocean (nine models). The present-day (PD) tropical climate is correctly depicted by all the models except the coarser-resolution ones, and the simulated geographical distribution of annual mean temperature is in good agreement with climatology. Tropical cooling at the LGM is less than at middle and high latitudes, but greatly exceeds the PD temperature variability. The LGM simulations with prescribed SSTs underestimate the observed temperature changes except over equatorial Africa, where the models produce a temperature decrease consistent with the data. Our results confirm previous analyses showing that CLIMAP (1981) SSTs produce only a weak terrestrial cooling. When SSTs are computed, the models depict a cooling over the Pacific and Indian oceans, in contrast with CLIMAP, and most models produce cooler temperatures over land. Moreover, four of the nine simulations produce a cooling in good agreement with terrestrial data. Two of these model results over the ocean are consistent with new SST reconstructions, whereas two models simulate a homogeneous cooling. Finally, the LGM aridity inferred for most of the tropics from the data is broadly reproduced by the models, with a strong underestimation for models using computed SSTs.
Abstract:
As weather and climate models move toward higher resolution, there is growing excitement about potential future improvements in the understanding and prediction of atmospheric convection and its interaction with larger-scale phenomena. A meeting in January 2013 in Dartington, Devon was convened to address the best way to maximise these improvements, specifically in a UK context but with international relevance. Specific recommendations included increased convective-scale observations, high-resolution virtual laboratories, and a system of parameterization test beds with a range of complexities. The main recommendation was to facilitate the development of physically based convective parameterizations that are scale-aware, non-local, non-equilibrium, and stochastic.
Abstract:
Purpose – Progress in retrofitting the UK's commercial properties continues to be slow and fragmented. New research from the UK and USA suggests that radical changes are needed to drive large-scale retrofitting, and that new and innovative models of financing can create new opportunities. The purpose of this paper is to offer insights into the terminology of retrofit and the changes in UK policy and practice that are needed to scale up activity in the sector. Design/methodology/approach – The paper reviews and synthesises key published research into commercial property retrofitting in the UK and USA and also draws on policy and practice from the EU and Australia. Findings – The paper provides a definition of “retrofit”, and compares and contrasts this with “refurbishment” and “renovation” in an international context. The paper summarises key findings from recent research and suggests that there are a number of policy and practice measures which need to be implemented in the UK for commercial retrofitting to succeed at scale. These include improved funding vehicles for retrofit; better transparency in actual energy performance; and consistency in measurement, verification and assessment standards. Practical implications – Policy and practice in the UK need to change if large-scale commercial property retrofit is to be rolled out successfully. This requires mandatory legislation underpinned by incentives and penalties for non-compliance. Originality/value – This paper synthesises recent research to provide a set of policy and practice recommendations which draw on international experience and can assist with implementation in the UK.
Abstract:
A series of inquiries and reports suggest considerable failings in the care provided to some patients in the NHS. Although the Bristol Inquiry report of 2001 led to the creation of many new regulatory bodies to supervise the NHS, they have never enjoyed consistent support from government, and the Mid Staffordshire Inquiry in 2013 suggests they made little difference. Why do some parts of the NHS disregard patients’ interests, and how should we respond to the challenge? The following discusses the evolution of approaches to NHS governance through the Hippocratic, Managerial and Commercial models, and assesses their risks and benefits. Apart from the ethical imperative, the need for effective governance is driven both by the growth in information available to the public and by the resources wasted by ineffective systems of care. Appropriate solutions depend on an understanding of the perverse incentives inherent in each model and the need for greater sensitivity to the voices of patients and the public.
Abstract:
Observations of atmospheric conditions and processes in cities are fundamental to understanding the interactions between the urban surface and weather/climate, improving the performance of urban weather, air quality and climate models, and providing key information for city end-users (e.g. decision-makers, stakeholders, the public). In this paper, Shanghai's urban integrated meteorological observation network (SUIMON) and some examples of intended applications are introduced. Its characteristics include being: multi-purpose (e.g. forecast, research, service), multi-function (high-impact weather, city climate, special end-users), multi-scale (e.g. macro/meso-, urban-, neighborhood-, street-canyon scale), multi-variable (e.g. thermal, dynamic, chemical, bio-meteorological, ecological), and multi-platform (e.g. radar, wind profiler, ground-based, satellite-based, in-situ observation/sampling). Underlying SUIMON is a data management system to facilitate the exchange of data and information. The overall aim of the network is to improve coordination strategies and instruments; to identify data gaps based on science- and user-driven requirements; and to intelligently combine observations from a variety of platforms by using a data assimilation system that is tuned to produce the best estimate of the current state of the urban atmosphere.
Abstract:
Anthropogenic pressure influences the two-way interactions between shallow aquifers and coastal lagoons. Aquifer overexploitation may lead to seawater intrusion, and aquifer recharge from rainfall plus irrigation may, in turn, increase the groundwater discharge into the lagoon. We analyse the evolution, from the 1950s up to the present, of the interactions between the Campo de Cartagena Quaternary aquifer and the Mar Menor coastal lagoon (SE Spain). This is a very heterogeneous and anisotropic detrital aquifer, where the aquifer–lagoon interface has a very irregular geometry. Using electrical resistivity tomography, we clearly identified the freshwater–saltwater transition zone and detected areas affected by seawater intrusion. The severity of the intrusion was spatially variable and significantly related to the density of irrigation wells in the 1950s–1960s, suggesting a role of groundwater overexploitation. We distinguish two different mechanisms by which water from the sea invades the land: (a) horizontal advance of the interface due to a wide exploitation area and (b) vertical rise (upconing) caused by local intensive pumping. In general, the shallow parts of the geophysical profiles show higher electrical resistivity associated with freshwater coming mainly from irrigation return flows, with water resources drawn mostly from deep confined aquifers and imported from the Tagus River, 400 km to the north. This indicates a likely reversal of the former seawater intrusion process.
Abstract:
New in-situ aircraft measurements of Saharan dust originating from Mali, Mauritania and Algeria, taken during the Fennec 2011 aircraft campaign over a remote part of the Sahara Desert, are presented. Size distributions extending to 300 μm are shown, representing measurements extending further into the coarse mode than previously published for airborne Saharan dust. A significant coarse mode was present in the size distribution measurements, with effective diameter (deff) from 2.3 to 19.4 μm and coarse-mode volume median diameter (dvc) from 5.8 to 45.3 μm. The mean size distribution had a larger relative proportion of coarse-mode particles than previous aircraft measurements. The largest particles (with deff > 12 μm, or dvc > 25 μm) were only encountered within 1 km of the ground. Number concentration, mass loading and extinction coefficient showed inverse relationships to dust age since uplift. Dust particle size showed a weak exponential relationship to dust age. Two cases of freshly uplifted dust showed quite different characteristics of size distribution and number concentration. Single Scattering Albedo (SSA) values at 550 nm calculated from the measured size distributions revealed high absorption, ranging from 0.70 to 0.97 depending on the refractive index. SSA was found to be strongly related to deff. New instrumentation revealed that direct measurements behind Rosemount inlets overestimate SSA by up to 0.11 when deff is greater than 2 μm. This is caused by aircraft inlet inefficiencies and sampling losses. Previous measurements of SSA from aircraft may also have been overestimates for this reason. Radiative transfer calculations indicate that the range of SSAs during Fennec 2011 can lead to underestimates in shortwave atmospheric heating rates by a factor of 2.0 to 3.0 if the coarse mode is neglected. This will have an impact on Saharan atmospheric dynamics and circulation, which should be taken into account by numerical weather prediction and climate models.
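For reference, the two quantities at the centre of these results have standard definitions, reproduced below in conventional notation (σ_sca and σ_abs are the scattering and absorption coefficients, n(d) the number size distribution); nothing here is specific to the Fennec data.

```latex
\mathrm{SSA} = \frac{\sigma_{\mathrm{sca}}}{\sigma_{\mathrm{sca}}+\sigma_{\mathrm{abs}}} ,
\qquad
d_{\mathrm{eff}} = \frac{\int d^{3}\,n(d)\,\mathrm{d}d}{\int d^{2}\,n(d)\,\mathrm{d}d} .
```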
Abstract:
There is a growing consensus that the eleven-year modulation of galactic cosmic rays (GCRs) resulting from solar activity is related to interplanetary propagating diffusive barriers (PDBs). The source of these PDBs is not well understood, and numerical models describing GCR modulation simulate their effect by scaling the diffusion tensor with the strength of the interplanetary magnetic field (IMF). The implications of a century-scale change in solar wind speed and open solar flux for numerical modelling of GCR modulation, and for the reconstruction of GCR variations over the last hundred years, are discussed. The dominant role of the solar non-axisymmetric magnetic field, both in forcing longitudinal solar wind speed fluctuations at solar maximum and in increasing the IMF, is discussed in the context of a long-term rise in the open solar magnetic flux.
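For context, numerical GCR modulation models of the kind referred to above are usually built on Parker's transport equation, in which the diffusion tensor K is the quantity scaled against the IMF strength. The standard form is reproduced below (f is the omnidirectional distribution function, V_sw the solar wind velocity, ⟨v_d⟩ the drift velocity, p momentum, Q any local source); this is textbook background, not the paper's specific parameterisation.

```latex
\frac{\partial f}{\partial t}
  = \nabla\cdot\left(\mathbf{K}\,\nabla f\right)
  - \left(\mathbf{V}_{\mathrm{sw}}+\langle\mathbf{v}_{\mathrm{d}}\rangle\right)\cdot\nabla f
  + \frac{1}{3}\left(\nabla\cdot\mathbf{V}_{\mathrm{sw}}\right)\frac{\partial f}{\partial \ln p}
  + Q .
```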
Abstract:
This study evaluates model-simulated dust aerosols over North Africa and the North Atlantic from five global models that participated in the Aerosol Comparison between Observations and Models phase II model experiments. The model results are compared with satellite aerosol optical depth (AOD) data from Moderate Resolution Imaging Spectroradiometer (MODIS), Multiangle Imaging Spectroradiometer (MISR), and Sea-viewing Wide Field-of-view Sensor, dust optical depth (DOD) derived from MODIS and MISR, AOD and coarse-mode AOD (as a proxy of DOD) from ground-based Aerosol Robotic Network Sun photometer measurements, and dust vertical distributions/centroid height from Cloud Aerosol Lidar with Orthogonal Polarization and Atmospheric Infrared Sounder satellite AOD retrievals. We examine the following quantities of AOD and DOD: (1) the magnitudes over land and over ocean in our study domain, (2) the longitudinal gradient from the dust source region over North Africa to the western North Atlantic, (3) seasonal variations at different locations, and (4) the dust vertical profile shape and the AOD centroid height (altitude above or below which half of the AOD is located). The different satellite data show consistent features in most of these aspects; however, the models display large diversity in all of them, with significant differences among the models and between models and observations. By examining dust emission, removal, and mass extinction efficiency in the five models, we also find remarkable differences among the models that all contribute to the discrepancies of model-simulated dust amount and distribution. This study highlights the challenges in simulating the dust physical and optical processes, even in the best known dust environment, and stresses the need for observable quantities to constrain the model processes.
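As a concrete reading of the "AOD centroid height" defined parenthetically above, the sketch below computes, from a synthetic extinction profile, the altitude below which half of the column AOD lies; the profile and vertical grid are placeholders, not data from any of the five models or satellite retrievals.

```python
# Minimal sketch of the AOD centroid height: the altitude below which half of the
# column AOD is located. The extinction profile here is a synthetic placeholder.
import numpy as np

z = np.linspace(0.0, 10.0, 101)                  # height levels [km]
ext = np.exp(-((z - 3.0) / 1.5) ** 2)            # synthetic extinction profile [1/km]

# cumulative AOD from the surface upward (trapezoidal integration)
cum_aod = np.concatenate(([0.0], np.cumsum(0.5 * (ext[1:] + ext[:-1]) * np.diff(z))))
half = 0.5 * cum_aod[-1]
centroid_height = np.interp(half, cum_aod, z)    # altitude splitting the column AOD in half

print(f"total AOD = {cum_aod[-1]:.3f}, centroid height = {centroid_height:.2f} km")
```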
Abstract:
Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations (SSPs) as well as for model error representation, uncertainty quantification, data assimilation, and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large-scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochastic components and non-Markovian (memory) terms. Stochastic approaches in numerical weather and climate prediction models also lead to the reduction of model biases. Hence, there is a clear need for systematic stochastic approaches in weather and climate modeling. In this review, we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspective. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models.
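A minimal sketch of the central idea, that unresolved degrees of freedom can be represented by a deterministic damping plus a stochastic forcing (memory terms omitted here), is given below as an Euler-Maruyama integration of a single reduced-order state; the coefficients are illustrative and not taken from any particular weather or climate model.

```python
# Euler-Maruyama sketch: unresolved scales represented by deterministic damping
# plus stochastic subgrid forcing. Coefficients are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
dt, nsteps = 0.01, 10_000
lam, sigma = 0.5, 0.3            # effective damping and noise amplitude

x = np.empty(nsteps)
x[0] = 0.0
for k in range(nsteps - 1):
    drift = -lam * x[k]                                   # deterministic tendency
    noise = sigma * np.sqrt(dt) * rng.standard_normal()   # stochastic subgrid forcing
    x[k + 1] = x[k] + drift * dt + noise

print("sample mean/std of the reduced-order state:", x.mean(), x.std())
```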
Abstract:
Income growth in highly industrialised countries has resulted in consumer choice of foodstuffs no longer being primarily influenced by basic factors such as price and organoleptic features. From this perspective, the present study sets out to evaluate how and to what extent consumer choice is influenced by the possible negative effects on health and environment caused by the consumption of fruit containing deposits of pesticides and chemical products. The study describes the results of a survey which explores and estimates consumer willingness to pay in two forms: a yearly contribution for the abolition of the use of pesticides on fruit, and a premium price for organically grown apples guaranteed by a certified label. The same questionnaire was administered to two samples. The first was a conventional face-to-face survey of customers of large retail outlets located around Bologna (Italy); the second was an Internet sample. The discrete choice data were analysed by means of probit and tobit models to estimate the utility consumers attribute to organically grown fruit and to a pesticide ban. The research also addresses questions of validity and representativeness as a fundamental problem in web-based surveys.
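As a hedged illustration of the discrete-choice analysis described, the sketch below fits a probit model to simulated yes/no willingness-to-pay responses and backs out an implied mean WTP; the variable names, simulated data and linear latent-utility assumption are placeholders, not the survey's actual design, and the tobit step is omitted.

```python
# Probit analysis of simulated yes/no willingness-to-pay responses.
# Variable names (bid, income, accept) and the data are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "bid": rng.uniform(5, 50, n),        # proposed premium price / yearly contribution
    "income": rng.normal(30, 8, n),      # respondent income (thousand EUR)
})
latent = 1.0 - 0.06 * df["bid"] + 0.03 * df["income"] + rng.normal(size=n)
df["accept"] = (latent > 0).astype(int)  # 1 = willing to pay the stated amount

X = sm.add_constant(df[["bid", "income"]])
probit = sm.Probit(df["accept"], X).fit(disp=False)
print(probit.params)

# Mean WTP under the linear latent-utility assumption:
b = probit.params
print("implied mean WTP:", -(b["const"] + b["income"] * df["income"].mean()) / b["bid"])
```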
Abstract:
Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) and Diffusion Tensor Imaging (DTI), have been established as scientific and diagnostic tools, and their adoption continues to grow. Statistical methods, machine learning and data mining algorithms have successfully been adopted to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically also requires the adoption of pre-processing, post-processing and visualisation techniques in complex data workflows. Currently, a main problem for the integrated pre-processing and mining of MRI data is the lack of comprehensive platforms able to avoid the manual invocation of pre-processing and mining tools, which leads to an error-prone and inefficient process. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench that automates the pre-processing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. K-Surfer automates the steps for importing FreeSurfer data, reducing time costs, eliminating human errors and enabling the design of complex analytics workflows for neuroimage data by leveraging the rich functionality available in the KNIME workbench.
Abstract:
Many theories for the Madden-Julian oscillation (MJO) focus on diabatic processes, particularly the evolution of vertical heating and moistening. Poor MJO performance in weather and climate models is often blamed on biases in these processes and their interactions with the large-scale circulation. We introduce one of three components of a model-evaluation project, which aims to connect MJO fidelity in models to their representations of several physical processes, focusing on diabatic heating and moistening. This component consists of 20-day hindcasts, initialised daily during two MJO events in winter 2009-10. The 13 models exhibit a range of skill: several have accurate forecasts to 20 days' lead, while others perform similarly to statistical models (8-11 days). Models that maintain the observed MJO amplitude accurately predict propagation, but not vice versa. We find no link between hindcast fidelity and the precipitation-moisture relationship, in contrast to other recent studies. There is also no relationship between models' performance and the evolution of their diabatic-heating profiles with rain rate. A more robust association emerges between models' fidelity and net moistening: the highest-skill models show a clear transition from low-level moistening for light rainfall to mid-level moistening at moderate rainfall and upper-level moistening for heavy rainfall. The mid-level moistening, arising from both dynamics and physics, may be most important. Accurately representing many processes may be necessary, but not sufficient for capturing the MJO, which suggests that models fail to predict the MJO for a broad range of reasons and limits the possibility of finding a panacea.