The Asian summer monsoon: an intercomparison of CMIP5 vs. CMIP3 simulations of the late 20th century
Abstract:
The boreal summer Asian monsoon has been evaluated in 25 Coupled Model Intercomparison Project phase 5 (CMIP5) and 22 CMIP3 GCM simulations of the late 20th century. Diagnostics and skill metrics have been calculated to assess the time mean, the climatological annual cycle, interannual variability, and intraseasonal variability. Progress has been made in modeling these aspects of the monsoon, though no single model best represents all of them. The CMIP5 multi-model mean (MMM) is more skillful than the CMIP3 MMM for all diagnostics in terms of pattern correlations with respect to observations. Additionally, for rainfall/convection the MMM outperforms the individual models for the time mean, the interannual variability of the East Asian monsoon, and intraseasonal variability. The pattern correlation of the time (pentad) of monsoon peak and withdrawal is better simulated than that of monsoon onset. The onset of the monsoon over India is typically too late in the models. The extension of the monsoon over eastern China, Korea, and Japan is underestimated, while it is overestimated over the subtropical western/central Pacific Ocean. The anti-correlation between anomalies of all-India rainfall and Niño-3.4 sea surface temperature is overly strong in CMIP3 and typically too weak in CMIP5. For both the ENSO-monsoon teleconnection and the East Asian zonal wind-rainfall teleconnection, the MMM interannual rainfall anomalies are weak compared to observations. Though simulation of intraseasonal variability remains problematic, several models show improved skill at representing the northward propagation of convection and the development of the tilted band of convection that extends from India to the equatorial west Pacific. The MMM also represents the space-time evolution of intraseasonal outgoing longwave radiation anomalies well. Caution is necessary when using GPCP and CMAP rainfall to validate (1) the time-mean rainfall, as there are systematic differences over ocean and land between these two data sets, and (2) the timing of monsoon withdrawal over India, where the smooth southward progression seen in India Meteorological Department data is better captured by CMAP than by GPCP.
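As a rough illustration of the pattern-correlation skill metric described above, the sketch below computes an area-weighted centred pattern correlation between a model field and an observed field, and shows why a multi-model mean can outscore individual models. The data, grid size, and weighting scheme are illustrative assumptions, not details taken from the study.

```python
import numpy as np

def pattern_correlation(model, obs, weights=None):
    """Centred spatial (pattern) correlation between a model field and an
    observed field, optionally area-weighted (e.g. by cos(latitude))."""
    model = np.asarray(model, dtype=float).ravel()
    obs = np.asarray(obs, dtype=float).ravel()
    w = np.ones_like(model) if weights is None else np.asarray(weights, dtype=float).ravel()
    w = w / w.sum()
    m_anom = model - np.sum(w * model)   # remove the weighted spatial mean
    o_anom = obs - np.sum(w * obs)
    cov = np.sum(w * m_anom * o_anom)
    return cov / np.sqrt(np.sum(w * m_anom**2) * np.sum(w * o_anom**2))

# Illustrative use: score each hypothetical model against "observations",
# then score the multi-model mean (MMM) of the same fields.
rng = np.random.default_rng(0)
obs = rng.random((72, 144))                               # synthetic 2.5-degree grid
models = [obs + rng.normal(0, 0.3, obs.shape) for _ in range(5)]
scores = [pattern_correlation(m, obs) for m in models]
mmm_score = pattern_correlation(np.mean(models, axis=0), obs)
print(f"individual: {np.round(scores, 2)}, MMM: {mmm_score:.2f}")
```

In this synthetic setup the MMM score exceeds the individual scores because averaging cancels part of each model's independent error, which is one common explanation for the MMM skill reported above.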
Abstract:
Drought characterisation is an intrinsically spatio-temporal problem. A limitation of previous approaches is that they discard much of the spatio-temporal information by reducing events to a lower-order subspace. To address this, an explicit 3-dimensional (longitude, latitude, time) structure-based method is described in which drought events are defined by a spatially and temporally coherent set of points displaying standardised precipitation below a given threshold. Geometric methods can then be used to measure similarity between individual drought structures. Groupings of these similarities provide an alternative to traditional methods for extracting recurrent space-time signals from geophysical data. The explicit consideration of structure encourages the construction of summary statistics that relate to the event geometry; example measures considered are the event volume, centroid, and aspect ratio. The utility of a 3-dimensional approach is demonstrated by application to the analysis of European droughts (15°W to 35°E, 35°N to 70°N) for the period 1901–2006. Large-scale structure is found to be abundant, with 75 identified events lasting more than 3 months and spanning at least 0.5 × 10⁶ km². Near-complete dissimilarity is seen between the individual drought structures, and little or no regularity is found in the time evolution of even the most spatially similar drought events. The spatial distribution of the event centroids and the time evolution of the geographic cross-sectional areas strongly suggest that large-area, sustained droughts result from the combination of multiple small-area (∼10⁶ km²) short-duration (∼3 months) events. The small events are not found to occur independently in space. This leads to the hypothesis that local water feedbacks play an important role in the aggregation process.
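The 3-dimensional event definition lends itself to a simple connected-component implementation. The sketch below is a minimal version assuming a standardised-precipitation-index cube of shape (time, lat, lon) and a threshold of −1; it labels spatio-temporally coherent below-threshold regions and derives event volume, duration, and centroid. The data and threshold are hypothetical, not the study's.

```python
import numpy as np
from scipy import ndimage

def drought_structures(spi, threshold=-1.0):
    """Label spatio-temporally coherent drought events in an SPI cube of
    shape (time, lat, lon) and return per-event summary statistics."""
    dry = spi < threshold                  # points currently in drought
    # 3-D connectivity links neighbouring grid cells in space and in time
    labels, n = ndimage.label(dry)
    events = []
    for k in range(1, n + 1):
        t, y, x = np.nonzero(labels == k)
        events.append({
            "volume": t.size,                          # cells = area x duration
            "duration_months": int(t.max() - t.min() + 1),
            "centroid": (t.mean(), y.mean(), x.mean()),
        })
    return events

# Hypothetical example: 12 months on a coarse 10 x 10 grid
rng = np.random.default_rng(1)
spi = rng.normal(size=(12, 10, 10))
for ev in sorted(drought_structures(spi), key=lambda e: -e["volume"])[:3]:
    print(ev)
```

Geometric similarity between two labelled structures can then be measured on the resulting 3-D point sets, which is the step the abstract builds its event groupings on.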
Abstract:
A fingerprint method for detecting anthropogenic climate change is applied to new simulations with a coupled ocean-atmosphere general circulation model (CGCM) forced by increasing concentrations of greenhouse gases and aerosols covering the years 1880 to 2050. In addition to the anthropogenic climate change signal, the space-time structure of natural climate variability in near-surface temperatures is estimated from instrumental data over the last 134 years and from two 1000-year simulations with CGCMs. These estimates are compared with paleoclimate data covering 570 years. The space-time information on both the signal and the noise is used to maximize the signal-to-noise ratio of a detection variable obtained by applying an optimal filter (fingerprint) to the observed data. The inclusion of aerosols slows the predicted future warming. The probability that the observed increase in near-surface temperatures in recent decades is of natural origin is estimated to be less than 5%. However, this number depends on the estimated level of natural variability, which is still subject to some uncertainty.
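The optimal filter mentioned above can be sketched in a few lines: weighting the signal pattern by the inverse of the noise covariance suppresses the directions in which natural variability is strong, and the detection variable is the projection of the observations onto that filter. Everything below (signal shape, noise model, sample sizes) is synthetic and purely illustrative, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)
p = 20                                   # length of the (space-time) state vector
signal = np.linspace(0.0, 1.0, p)        # hypothetical warming signal pattern

# Noise covariance estimated from control-run variability (synthetic here)
noise = rng.normal(size=(500, p)) @ np.diag(np.linspace(0.5, 2.0, p))
C = np.cov(noise, rowvar=False)

# Optimal fingerprint f = C^{-1} s, which maximises the signal-to-noise ratio
f = np.linalg.solve(C, signal)

# "Observations": a scaled signal plus one realisation of natural variability
obs = 0.8 * signal + rng.multivariate_normal(np.zeros(p), C)
d = f @ obs                              # detection variable

# Null distribution of d under natural variability alone
d_null = noise @ f
print(f"detection variable: {d:.1f}, "
      f"natural-variability 95th percentile: {np.percentile(d_null, 95):.1f}")
```

Comparing the detection variable with the null distribution is what yields a statement like "less than 5% probability of natural origin", and it makes clear why the result hinges on the estimated natural-variability level.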
Abstract:
In the 1960s and early 1970s sea surface temperatures in the North Atlantic Ocean cooled rapidly. There is still considerable uncertainty about the causes of this event, although various mechanisms have been proposed. In this observational study it is demonstrated that the cooling proceeded in several distinct stages. Cool anomalies initially appeared in the mid-1960s in the Nordic Seas and Gulf Stream Extension, before spreading to cover most of the Subpolar Gyre. Subsequently, cool anomalies spread into the tropical North Atlantic before retreating, in the late 1970s, back to the Subpolar Gyre. There is strong evidence that changes in atmospheric circulation, linked to a southward shift of the Atlantic ITCZ, played an important role in the event, particularly in the period 1972-76. Theories for the cooling event must account for its distinctive space-time evolution. Our analysis suggests that the most likely drivers were: 1) the “Great Salinity Anomaly” of the late 1960s; 2) an earlier warming of the subpolar North Atlantic, which may have led to a slowdown in the Atlantic Meridional Overturning Circulation; and 3) an increase in anthropogenic sulphur dioxide emissions. Determining the relative importance of these factors is a key area for future work.
Abstract:
Understanding how and why one set of business resources, with its structural arrangements and mechanisms, provides capability that another does not can offer competitive advantage in terms of new business processes and product and service development. However, most business models of capability are descriptive and lack a formal modelling language with which to compare capabilities qualitatively and quantitatively. Gibson’s theory of affordance, the potential for action, provides a formal basis for a more robust and quantitative model, but most formal affordance models are complex and abstract and lack support for real-world applications. We aim to understand the ‘how’ and ‘why’ of business capability by developing a quantitative and qualitative model that underpins earlier work on Capability-Affordance Modelling (CAM). This paper integrates an affordance-based capability model with the formalism of Coloured Petri Nets to develop a simulation model. Using the model, we show how capability depends on the space-time path of interacting resources, the mechanism of transition, and specific critical affordance factors relating to the values of the variables for resources, people, and physical objects. We show how the model can identify the resource capabilities required to inject a drug and anaesthetise a patient.
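To make the Petri-net idea concrete, the toy sketch below shows a stripped-down, uncoloured transition that fires only when the required resource tokens are present and a guard on their attribute values holds, the guard standing in for a "critical affordance factor". The drug-injection names and the guard condition are invented for illustration and are not the paper's CPN model.

```python
# Minimal Petri-net-style sketch: a transition is enabled only when the
# required resource tokens exist AND their attribute values satisfy a
# guard (a stand-in for a critical affordance factor). Illustrative only.

places = {
    "syringe_ready": [{"volume_ml": 5}],
    "patient_prepared": [{"conscious": True}],
    "patient_anaesthetised": [],
}

def fire_inject(places):
    """Consume one syringe token and one patient token; produce an
    anaesthetised-patient token if the guard on token values holds."""
    if not (places["syringe_ready"] and places["patient_prepared"]):
        return False                            # transition not enabled
    syringe = places["syringe_ready"][0]
    if syringe["volume_ml"] < 2:                # guard: enough anaesthetic
        return False
    places["syringe_ready"].pop(0)
    patient = places["patient_prepared"].pop(0)
    patient["conscious"] = False
    places["patient_anaesthetised"].append(patient)
    return True

print(fire_inject(places), places["patient_anaesthetised"])
```

A coloured Petri net generalises this by typing the tokens and letting guards and arc expressions depend on token values, which is what allows the capability to be traced along the space-time path of the interacting resources.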
Abstract:
Research on invention has focused on business invention, and little work has been conducted on the process and capabilities required of the individual inventor, or on the capabilities required for a device to be considered an invention. This paper synthesises the results of an empirical survey of ten inventor case studies with current research on invention and recent capability-affordance research to develop an integrated capability process model of the human capabilities for invention and the specific capabilities of an invented device. We identify eight necessary human effectivities required for individual invention capability and six key functional activities that use these effectivities to deliver the functional capability of invention. We also identify key differences between invention and general problem-solving processes. Results suggest that inventive-step capability relies on a unique application of principles relating to a new combination of an affordance chain with a new mechanism and/or space-time (affordance) path representing the novel way the device works, in conjunction with defined critical affordance operating factors that are the subject of the patent claims.
Abstract:
In numerical weather prediction, parameterisations are used to simulate missing physics in the model. Physics can be missing because of a lack of scientific understanding or a lack of the computing power needed to resolve all known physical processes. Parameterisations are a source of large uncertainty in a model: the parameter values they use cannot be measured directly and hence are often not well known, and the parameterisations themselves are only approximations of the processes present in the true atmosphere. Whilst there are many efficient and effective methods for combined state/parameter estimation in data assimilation (DA), such as state augmentation, these are not effective at estimating the structure of parameterisations. A new method of parameterisation estimation is proposed that uses sequential DA methods to estimate the error in the numerical model at each space-time point for each model equation. These errors are then fitted to pre-determined functional forms of the missing physics or parameterisations, chosen on the basis of prior information. We apply the method to a one-dimensional advection model with additive model error and show that it can accurately estimate parameterisations, with consistent error estimates. Furthermore, we show how the method's performance depends on the quality of the DA results. The results indicate that this new method is a powerful tool for systematic model improvement.
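The two-step idea, estimating pointwise model errors with DA and then fitting them to a pre-determined functional form, reduces to a regression problem. In the toy sketch below the "DA-estimated" errors are simulated directly rather than produced by an assimilation cycle, and the quadratic functional form is an assumed prior choice; neither is taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "truth": at each space-time point the missing physics is a
# quadratic function of the state, eta(u) = a*u + b*u**2 (a, b unknown).
a_true, b_true = 0.4, -0.1
u = rng.uniform(-1, 1, size=2000)          # model states at analysis times

# Pointwise model-error estimates, e.g. from sequential-DA analysis
# increments; here simulated as the true error plus estimation noise.
eta_hat = a_true * u + b_true * u**2 + rng.normal(0, 0.05, u.size)

# Fit the pre-determined functional form by least squares.
basis = np.column_stack([u, u**2])         # candidate terms from prior knowledge
coef, *_ = np.linalg.lstsq(basis, eta_hat, rcond=None)
resid = eta_hat - basis @ coef
print(f"estimated (a, b) = {coef.round(3)}, true = ({a_true}, {b_true})")
print(f"residual std (consistency check on error estimates): {resid.std():.3f}")
```

The residual standard deviation plays the role of the consistency check mentioned in the abstract: if the DA-derived error estimates are poor, the fitted coefficients and the residuals degrade together, which is how the method's dependence on DA quality shows up.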