17 results for Distributed Network Protocol version 3 (DNP3)

in CentAUR: Central Archive at the University of Reading - UK


Relevance: 100.00%

Abstract:

Recent studies have identified a distributed network of brain regions thought to support cognitive reappraisal processes underlying emotion regulation in response to affective images, including parieto-temporal regions and lateral/medial regions of prefrontal cortex (PFC). A number of these commonly activated regions are also known to underlie visuospatial attention and oculomotor control, which raises the possibility that people use attentional redeployment rather than, or in addition to, reappraisal as a strategy to regulate emotion. We predicted that a significant portion of the observed variance in brain activation during emotion regulation tasks would be associated with differences in how participants visually scan the images while regulating their emotions. We recorded brain activation using fMRI and quantified patterns of gaze fixation while participants increased or decreased their affective response to a set of affective images. fMRI results replicated previous findings on emotion regulation with regulation differences reflected in regions of PFC and the amygdala. In addition, our gaze fixation data revealed that when regulating, individuals changed their gaze patterns relative to a control condition. Furthermore, this variation in gaze fixation accounted for substantial amounts of variance in brain activation. These data point to the importance of controlling for gaze fixation in studies of emotion regulation that use visual stimuli.
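
A note on the final analysis step: relating gaze measures to brain activation and asking how much variance they account for is, in essence, a regression problem. The sketch below is a hypothetical illustration on synthetic data (the gaze covariates and activation estimates are invented stand-ins, not the study's measures):

```python
import numpy as np

# Hypothetical illustration: regress a region's activation estimates on
# gaze covariates and report the variance explained (R^2). Synthetic data.
rng = np.random.default_rng(2)
n = 60                                    # e.g. trials or participants
gaze = rng.normal(size=(n, 2))            # e.g. fixation count, dwell time
activation = gaze @ np.array([0.8, -0.3]) + rng.normal(0.0, 0.5, n)

X = np.column_stack([np.ones(n), gaze])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, activation, rcond=None)
resid = activation - X @ beta
r2 = 1.0 - resid.var() / activation.var()
print(f"variance in activation accounted for by gaze: R^2 = {r2:.2f}")
```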

Relevance: 100.00%

Abstract:

Decoding emotional prosody is crucial for successful social interactions, and continuous monitoring of emotional intent via prosody requires working memory. It has been proposed by Ross and others that emotional prosody cognitions in the right hemisphere are organized in an analogous fashion to propositional language functions in the left hemisphere. This study aimed to test the applicability of this model in the context of prefrontal cortex working memory functions. BOLD response data were therefore collected during performance of two emotional working memory tasks by participants undergoing fMRI. In the prosody task, participants identified the emotion conveyed in pre-recorded sentences, and working memory load was manipulated in the style of an N-back task. In the matched lexico-semantic task, participants identified the emotion conveyed by sentence content. Block-design neuroimaging data were analyzed parametrically with SPM5. At first, working memory for emotional prosody appeared to be right-lateralized in the PFC; however, further analyses revealed that it shared much bilateral prefrontal functional neuroanatomy with working memory for lexico-semantic emotion. Supplementary separate analyses of males and females suggested that these language functions were less bilateral in females, but their inclusion did not alter the direction of laterality. It is concluded that Ross et al.'s model is not applicable to prefrontal cortex working memory functions, that the evidence that working memory in prefrontal cortex cannot be subdivided according to material type is strengthened, and that incidental working memory demands may explain the frontal lobe involvement in emotional prosody comprehension revealed by neuroimaging studies.
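
For readers unfamiliar with the load manipulation: in an N-back design, each trial is a target when it matches the item presented N trials earlier, and load grows with N. A minimal scoring sketch, with hypothetical emotion labels standing in for the pre-recorded sentences:

```python
# Minimal N-back target scoring; the labels are hypothetical stand-ins.
def nback_targets(emotions, n):
    """True where the current emotion matches the one n trials back."""
    return [i >= n and emotions[i] == emotions[i - n]
            for i in range(len(emotions))]

trials = ["happy", "sad", "happy", "happy", "angry", "happy"]
print(nback_targets(trials, 2))
# -> [False, False, True, False, False, True]; a 2-back condition loads
#    working memory more heavily than 1-back or 0-back.
```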

Relevance: 100.00%

Abstract:

In this paper, we present a distributed computing framework for problems characterized by a highly irregular search tree, whereby no reliable workload prediction is available. The framework is based on a peer-to-peer computing environment and dynamic load balancing. The system allows for dynamic resource aggregation, does not depend on any specific meta-computing middleware, and is suitable for large-scale, multi-domain, heterogeneous environments such as computational Grids. Dynamic load balancing policies based on global statistics are known to provide optimal load balancing performance, while randomized techniques provide high scalability. The proposed method combines both advantages and adopts distributed job-pools and a randomized polling technique. The framework has been successfully adopted in a parallel search algorithm for subgraph mining and evaluated on a molecular compounds dataset. The parallel application has shown good scalability and close-to-linear speedup in a distributed network of workstations.
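
The combination of distributed job-pools with randomized polling can be sketched compactly: each peer works from its own pool and, when that pool empties, polls a randomly chosen peer for work rather than consulting global load statistics. Below is a minimal sketch using threads as stand-in peers; the worker count, tree-expansion function, and idle-termination rule are invented for the example, not taken from the paper:

```python
import queue
import random
import threading

NUM_WORKERS = 4
job_pools = [queue.Queue() for _ in range(NUM_WORKERS)]  # one job-pool per peer

def expand(job):
    """Hypothetical stand-in for expanding one node of an irregular search tree."""
    depth, fanout = job
    return [(depth + 1, fanout)] * fanout if depth < 3 else []

def worker(my_id, results):
    idle_polls = 0
    while idle_polls < 50:  # crude stop rule; real systems detect termination
        try:
            job = job_pools[my_id].get_nowait()
        except queue.Empty:
            # Randomized polling: ask one randomly chosen peer for work.
            victim = random.choice([i for i in range(NUM_WORKERS) if i != my_id])
            try:
                job = job_pools[victim].get_nowait()
            except queue.Empty:
                idle_polls += 1
                continue
        idle_polls = 0
        for child in expand(job):
            job_pools[my_id].put(child)  # new work stays in the local pool
        results.append(job)             # list.append is thread-safe in CPython

results = []
job_pools[0].put((0, 3))  # seed the whole search tree at a single peer
threads = [threading.Thread(target=worker, args=(i, results))
           for i in range(NUM_WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"processed {len(results)} tree nodes")  # 1 + 3 + 9 + 27 = 40
```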

Relevance: 100.00%

Abstract:

The commonly held view of conditions in the North Atlantic at the last glacial maximum, based on the interpretation of proxy records, is of large-scale cooling compared to today, limited deep convection, and extensive sea ice, all associated with a southward displaced and weakened overturning thermohaline circulation (THC) in the North Atlantic. Not all studies support that view; in particular, the "strength of the overturning circulation" is contentious and is a quantity that is difficult to determine even for the present day. Quasi-equilibrium simulations with coupled climate models forced by glacial boundary conditions have produced differing results, as have inferences made from proxy records. Most studies suggest a weaker circulation, some suggest little or no change, and a few suggest a stronger circulation. Here results are presented from a three-dimensional climate model of the coupled atmosphere-ocean-sea ice system, the Hadley Centre Coupled Model version 3 (HadCM3), suggesting, in a qualitative sense, that these diverging views could all have been realized, with different modes of the circulation existing at different times during the last glacial period. One mode might have been characterized by an active THC associated with moderate temperatures in the North Atlantic and a modest expanse of sea ice. The other mode, perhaps forced by large inputs of meltwater from the continental ice sheets into the northern North Atlantic, might have been characterized by a sluggish THC associated with very cold conditions around the North Atlantic and a large areal cover of sea ice. The authors' model simulation of such a mode, forced by a large input of freshwater, bears several of the characteristics of the Climate: Long-range Investigation, Mapping, and Prediction (CLIMAP) Project's reconstruction of glacial sea surface temperature and sea ice extent.

Relevance: 100.00%

Abstract:

Observations show the oceans have warmed over the past 40 yr, with appreciable regional variation and more warming at the surface than at depth. Comparing the observations with results from two coupled ocean-atmosphere climate models [the Parallel Climate Model version 1 (PCM) and the Hadley Centre Coupled Climate Model version 3 (HadCM3)] that include anthropogenic forcing shows remarkable agreement between the observed and model-estimated warming. In this comparison the models were sampled at the same locations as gridded yearly observed data. In the top 100 m of the water column the warming is well separated from natural variability, including both variability arising from internal instabilities of the coupled ocean-atmosphere climate system and that arising from volcanism and solar fluctuations. Between 125 and 200 m the agreement is not significant, but it increases again below this level and remains significant down to 600 m. Analysis of PCM's heat budget indicates that the warming is driven by an increase in net surface heat flux that reaches 0.7 W m⁻² by the 1990s; the downward longwave flux increases by 3.7 W m⁻², which is not fully compensated by an increase in the upward longwave flux of 2.2 W m⁻². Latent and net solar heat fluxes each decrease by about 0.6 W m⁻². The changes in the individual longwave components are distinguishable from the preindustrial mean by the 1920s, but due to cancellation of components, changes in the net surface heat flux do not become well separated from zero until the 1960s. Changes in advection can also play an important role in local ocean warming due to anthropogenic forcing, depending on the location. The observed sampling of ocean temperature is highly variable in space and time, but sufficient to detect the anthropogenic warming signal in all basins, at least in the surface layers, by the 1980s.

Relevance: 100.00%

Abstract:

There is much evidence that El Niño and La Niña lead to significant atmospheric seasonal predictability across much of the globe. However, despite successful predictions of tropical Pacific SSTs, atmospheric seasonal forecasts have had limited success. This study investigates model errors in the Hadley Centre Atmospheric Model version 3 (HadAM3) by analyzing composites of similar El Niño and La Niña events at their peak in December–January–February (DJF) and through their decay in March–April–May (MAM). The large-scale tropical ENSO teleconnections are modeled accurately by HadAM3 during DJF, but the strongest extratropical teleconnection, that in the North Pacific during winter, is modeled inaccurately. The Aleutian low is frequently observed to shift eastward during El Niño, but the modeled response always consists of a deepening of the low without a shift. This is traced to small errors in the sensitivity of precipitation to SST in the tropical Pacific: the modeled sensitivity does not display enough variability, so the precipitation is always too high over the warmest SSTs. This error is reduced when vertical resolution is increased from 19 to 30 levels, but enhanced horizontal resolution does not improve the simulation further. In MAM, following the peak of an El Niño or La Niña, atmospheric anomalies are observed to decay rapidly. The modeled ENSO response in DJF persists into MAM, making the extratropical anomalies in MAM too strong. This inaccuracy is again likely to be due to the high modeled sensitivity of tropical Pacific precipitation to SST, which is not significantly improved with enhanced vertical or horizontal resolution in MAM.

Relevance: 100.00%

Abstract:

The formation and composition of secondary organic aerosol (SOA) from the photooxidation of benzene, p-xylene, and 1,3,5-trimethylbenzene have been simulated using the Master Chemical Mechanism version 3.1 (MCM v3.1) coupled to a representation of the transfer of organic material from the gas to the particle phase. The combined mechanism was tested against data obtained from a series of experiments conducted at the European Photoreactor (EUPHORE) outdoor smog chamber in Valencia, Spain. Simulated aerosol mass concentrations compared reasonably well with the measured SOA data only after absorptive partitioning coefficients were increased by a factor of between 5 and 30. The requirement of such scaling was interpreted in terms of the occurrence of unaccounted-for association reactions in the condensed organic phase leading to the production of relatively more nonvolatile species. Comparisons were made between the relative aerosol forming efficiencies of benzene, toluene, p-xylene, and 1,3,5-trimethylbenzene, and differences in the OH-initiated degradation mechanisms of these aromatic hydrocarbons. A strong, nonlinear relationship was observed between measured (reference) yields of SOA and (proportional) yields of unsaturated dicarbonyl aldehyde species resulting from ring-fragmenting pathways. This observation, and the results of the simulations, is strongly suggestive of the involvement of reactive aldehyde species in association reactions occurring in the aerosol phase, thus promoting SOA formation and growth. The effect of NOx concentrations on SOA formation efficiencies (and formation mechanisms) is discussed.
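
For context, absorptive partitioning of the kind scaled here is usually written in the Pankow framework: a species with partitioning coefficient Kp (m³ µg⁻¹) in the presence of absorbing organic mass M0 (µg m⁻³) has condensed fraction Kp·M0/(1 + Kp·M0), so multiplying Kp by 5-30 pushes material into the particle phase. A minimal sketch with purely illustrative numbers (a factor of 10 stands in for the reported scaling range):

```python
def condensed_fraction(kp, m0):
    """Particle-phase fraction of a semi-volatile product (Pankow framework)."""
    return kp * m0 / (1.0 + kp * m0)

kp_base = 0.01  # hypothetical partitioning coefficient, m3/ug
m0 = 10.0       # hypothetical absorbing organic mass, ug/m3
for scale in (1, 10):  # unscaled vs. scaled coefficient
    print(f"Kp x{scale}: condensed fraction = "
          f"{condensed_fraction(scale * kp_base, m0):.2f}")
```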

Relevance: 100.00%

Abstract:

Following on from the companion study (Johnson et al., 2006), a photochemical trajectory model (PTM) has been used to simulate the chemical composition of organic aerosol for selected events during the 2003 TORCH (Tropospheric Organic Chemistry Experiment) field campaign. The PTM incorporates the speciated emissions of 124 nonmethane anthropogenic volatile organic compounds (VOC) and three representative biogenic VOC, a highly detailed representation of the atmospheric degradation of these VOC, the emission of primary organic aerosol (POA) material, and the formation of secondary organic aerosol (SOA) material. SOA formation was represented by the transfer of semi- and non-volatile oxidation products from the gas phase to a condensed organic aerosol phase, according to estimated thermodynamic equilibrium phase-partitioning characteristics for around 2000 reaction products. After significantly scaling all phase-partitioning coefficients, and assuming a persistent background organic aerosol (both required in order to match the observed organic aerosol loadings), the detailed chemical composition of the simulated SOA has been investigated in terms of intermediate oxygenated species in the Master Chemical Mechanism, version 3.1 (MCM v3.1). For the various case studies considered, 90% of the simulated SOA mass comprises between ca. 70 and 100 multifunctional oxygenated species derived, in varying amounts, from the photooxidation of VOC of anthropogenic and biogenic origin. The anthropogenic contribution is dominated by aromatic hydrocarbons and the biogenic contribution by alpha- and beta-pinene (which also constitute surrogates for other emitted monoterpene species). The sensitivity of the simulated SOA mass to changes in the emission rates of anthropogenic and biogenic VOC has also been investigated for 11 case study events, and the results have been compared to the detailed chemical composition data. The role of accretion chemistry in SOA formation, and its implications for the results of the present investigation, is discussed.
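
A detail implicit in this approach: the absorbing aerosol mass itself depends on how much of each of the ~2000 products condenses, so the equilibrium must be found self-consistently. A minimal fixed-point sketch; the loadings, coefficients, and background mass below are hypothetical, not values from the study:

```python
def equilibrium_oa(totals, kps, background=1.0, tol=1e-6, max_iter=200):
    """Iterate the absorbing mass M0 until partitioning is self-consistent."""
    m0 = background
    for _ in range(max_iter):
        condensed = sum(c * k * m0 / (1.0 + k * m0)
                        for c, k in zip(totals, kps))
        m0_new = background + condensed  # persistent background + condensed SOA
        if abs(m0_new - m0) < tol:
            break
        m0 = m0_new
    return m0

totals = [5.0, 2.0, 8.0]  # total (gas + particle) product loadings, ug/m3
kps = [0.05, 0.5, 0.002]  # hypothetical partitioning coefficients, m3/ug
print(f"equilibrium organic aerosol mass: {equilibrium_oa(totals, kps):.2f} ug/m3")
```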

Relevance: 100.00%

Abstract:

The Along-Track Scanning Radiometers (ATSRs) provide a long time-series of measurements suitable for the retrieval of cloud properties. This work evaluates the freely-available Global Retrieval of ATSR Cloud Parameters and Evaluation (GRAPE) dataset (version 3) created from the ATSR-2 (1995–2003) and Advanced ATSR (AATSR; 2002 onwards) records. Users are recommended to consider only retrievals flagged as high-quality, where there is a good consistency between the measurements and the retrieved state (corresponding to about 60% of converged retrievals over sea, and more than 80% over land). Cloud properties are found to be generally free of any significant spurious trends relating to satellite zenith angle. Estimates of the random error on retrieved cloud properties are suggested to be generally appropriate for optically-thick clouds, and up to a factor of two too small for optically-thin cases. The correspondence between ATSR-2 and AATSR cloud properties is high, but a relative calibration difference between the sensors of order 5–10% at 660 nm and 870 nm limits the potential of the current version of the dataset for trend analysis. As ATSR-2 is thought to have the better absolute calibration, the discussion focusses on this portion of the record. Cloud-top heights from GRAPE compare well to ground-based data at four sites, particularly for shallow clouds. Clouds forming in boundary-layer inversions are typically around 1 km too high in GRAPE due to poorly-resolved inversions in the modelled temperature profiles used. Global cloud fields are compared to satellite products derived from the Moderate Resolution Imaging Spectroradiometer (MODIS), Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) measurements, and a climatology of liquid water content derived from satellite microwave radiometers. In all cases the main reasons for differences are linked to differing sensitivity to, and treatment of, multi-layer cloud systems. The correlation coefficient between GRAPE and the two MODIS products considered is generally high (greater than 0.7 for most cloud properties), except for liquid and ice cloud effective radius, which also show biases between the datasets. For liquid clouds, part of the difference is linked to the choice of wavelengths used in the retrieval. Total cloud cover is slightly lower in GRAPE (0.64) than in the CALIOP dataset (0.66). GRAPE underestimates liquid cloud water path relative to microwave radiometers by up to 100 g m⁻² near the Equator and overestimates by around 50 g m⁻² in the storm tracks. Finally, potential future improvements to the algorithm are outlined.

Relevance: 100.00%

Abstract:

Hourly winter weather of the Last Glacial Maximum (LGM) is simulated using the Community Climate Model version 3 (CCM3) on a globally resolved T170 (75 km) grid. Results are compared to a longer LGM climatological run with the same boundary conditions and monthly saves. Hourly-scale animations are used to enhance interpretations. The purpose of the study is to explore whether additional insights into ice age conditions can be gleaned by going beyond the standard employment of monthly average model statistics to infer ice age weather and climate. Results for both LGM runs indicate a decrease in North Atlantic and an increase in North Pacific cyclogenesis. Storm trajectories react to the mechanical forcing of the Laurentide Ice Sheet, with Pacific storms tracking over middle Alaska and northern Canada, terminating in the Labrador Sea. This result is coincident with other model results in also showing a significant reduction in Greenland wintertime precipitation, a response supported by ice core evidence. Higher temporal resolution puts in sharper focus the close tracking of Pacific storms along the west coast of North America. This response is consistent with increased poleward heat transport in the LGM climatological run and could help explain "early" glacial warming inferred in this region from proxy climate records. Additional analyses show a large increase in central Asian surface gustiness that supports observational inferences that upper-level winds associated with Asian-Pacific storms transported Asian dust to Greenland during the LGM.

Relevance: 100.00%

Abstract:

Model differences in projections of extratropical regional climate change due to increasing greenhouse gases are investigated using two atmospheric general circulation models (AGCMs): ECHAM4 (Max Planck Institute, version 4) and CCM3 (National Center for Atmospheric Research Community Climate Model version 3). Sea-surface temperature (SST) fields calculated from observations and coupled versions of the two models are used to force each AGCM in experiments based on time-slice methodology. Results from the forced AGCMs are then compared to coupled model results from the Coupled Model Intercomparison Project 2 (CMIP2) database. The time-slice methodology is verified by showing that the response of each model to doubled CO2 and SST forcing from the CMIP2 experiments is consistent with the results of the coupled GCMs. The differences in the responses of the models are attributed to (1) the different tropical SST warmings in the coupled simulations and (2) the different atmospheric model responses to the same tropical SST warmings. Both are found to make important contributions to differences in implied Northern Hemisphere (NH) winter extratropical regional 500 mb height and tropical precipitation climate changes. Forced teleconnection patterns from tropical SST differences are primarily responsible for sensitivity differences in the extratropical North Pacific, but have relatively little impact on the North Atlantic. There are also significant differences in the extratropical response of the models to the same tropical SST anomalies due to differences in numerical and physical parameterizations. Differences due to parameterizations dominate in the North Atlantic. Differences between the control climates of the two coupled models and the current climate, in particular for the coupled model containing CCM3, are also demonstrated to be important in producing differences in extratropical regional sensitivity.

Relevance: 100.00%

Abstract:

A new record of sea surface temperature (SST) for climate applications is described. This record provides independent corroboration of global variations estimated from SST measurements made in situ. Infrared imagery from Along-Track Scanning Radiometers (ATSRs) is used to create a 20 year time series of SST at 0.1° latitude-longitude resolution, in the ATSR Reprocessing for Climate (ARC) project. A very high degree of independence from in situ measurements is achieved via physics-based techniques. Skin SST and SST estimated for 20 cm depth are provided, with grid cell uncertainty estimates. Comparison with in situ data sets establishes that ARC SSTs generally have a bias of order 0.1 K or smaller. The precision of the ARC SSTs is 0.14 K during 2003 to 2009, from three-way error analysis. Over the period 1994 to 2010, ARC SSTs are stable, with better than 95% confidence, to within 0.005 K yr⁻¹ (demonstrated for tropical regions). The data set appears useful for cleanly quantifying interannual variability in SST and major SST anomalies. The ARC SST global anomaly time series is compared to the in situ-based Hadley Centre SST data set version 3 (HadSST3). Within known uncertainties in bias adjustments applied to in situ measurements, the independent ARC record and HadSST3 present the same variations in global marine temperature since 1996. Since the in situ observing system evolved significantly in its mix of measurement platforms and techniques over this period, ARC SSTs provide an important corroboration that HadSST3 accurately represents recent variability and change in this essential climate variable.
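
The quoted precision comes from a three-way error analysis; a common formulation of such an analysis is triple collocation, in which three independent, collocated measurement systems yield each system's error variance from the covariance of paired differences. The sketch below runs on synthetic data and illustrates the technique under that assumption; it is not the ARC processing itself:

```python
import numpy as np

rng = np.random.default_rng(0)
truth = 290.0 + rng.normal(0.0, 1.0, 10000)    # shared SST signal, K
x = truth + rng.normal(0.0, 0.14, truth.size)  # e.g. satellite retrievals
y = truth + rng.normal(0.0, 0.20, truth.size)  # e.g. drifting buoys
z = truth + rng.normal(0.0, 0.30, truth.size)  # e.g. a third system

def tc_error_std(a, b, c):
    """Error std of system a, assuming mutually independent errors."""
    return np.sqrt(np.cov(a - b, a - c)[0, 1])  # Cov(a-b, a-c) = Var(err_a)

for name, args in (("x", (x, y, z)), ("y", (y, x, z)), ("z", (z, x, y))):
    print(name, round(float(tc_error_std(*args)), 3))  # ~0.14, 0.20, 0.30
```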

Relevance: 100.00%

Abstract:

Decadal climate predictions exhibit large biases, which are often subtracted and forgotten. However, understanding the causes of bias is essential to guide efforts to improve prediction systems, and may offer additional benefits. Here the origins of biases in decadal predictions are investigated, including whether analysis of these biases might provide useful information. The focus is especially on the lead-time-dependent bias tendency. A “toy” model of a prediction system is initially developed and used to show that there are several distinct contributions to bias tendency. Contributions from sampling of internal variability and a start-time-dependent forcing bias can be estimated and removed to obtain a much improved estimate of the true bias tendency, which can provide information about errors in the underlying model and/or errors in the specification of forcings. It is argued that the true bias tendency, not the total bias tendency, should be used to adjust decadal forecasts. The methods developed are applied to decadal hindcasts of global mean temperature made using the Hadley Centre Coupled Model, version 3 (HadCM3), climate model, and it is found that this model exhibits a small positive bias tendency in the ensemble mean. When considering different model versions, it is shown that the true bias tendency is very highly correlated with both the transient climate response (TCR) and non–greenhouse gas forcing trends, and can therefore be used to obtain observationally constrained estimates of these relevant physical quantities.
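
A highly simplified sketch of such a "toy" prediction system, under invented numbers: hindcasts combine a lead-time-dependent drift (the true bias tendency), a start-time-dependent forcing error, and sampled internal variability; averaging the bias over many start dates then recovers an estimate of the drift:

```python
import numpy as np

rng = np.random.default_rng(1)
n_starts, n_leads = 40, 10
true_drift = 0.05 * np.arange(n_leads)             # true bias tendency, K per lead
forcing_err = 0.02 * rng.normal(size=n_starts)     # start-time-dependent forcing bias
noise = rng.normal(0.0, 0.1, (n_starts, n_leads))  # sampled internal variability

# Hindcast bias = drift + forcing error growing with lead time + variability.
bias = (true_drift[None, :]
        + forcing_err[:, None] * np.arange(n_leads)[None, :]
        + noise)

total_tendency = bias.mean(axis=0)  # raw lead-time-dependent bias estimate
print(np.round(total_tendency, 3))  # noisy estimate of...
print(np.round(true_drift, 3))      # ...the true bias tendency
```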

Relevance: 100.00%

Abstract:

Much research has been undertaken into the value of mentoring for beginning teachers. Less research has been done into the mentoring of Overseas Trained Teachers (OTTs). The studies that have been done suggest that mentors' lack of cultural knowledge, and of awareness of the pedagogical challenges faced by OTT mentees, may inhibit the integration of such teachers into school life. It may be that effective tailor-made training cannot be provided for OTTs by mentors whose skills or knowledge are insufficient. Lack of understanding of the cultural diversity of mentees may result, as studies show, in mentors failing to address relevant issues during the mentoring process. This study investigates OTTs' experiences of mentorship in England, and suggests the importance of mentors having an understanding of their culturally diverse OTT mentees. The implications of these findings in the context of recent teacher-training policy developments in England are discussed.

Relevance: 100.00%

Abstract:

Morocco constitutes an important centre of plant diversity and speciation in the Mediterranean Basin. However, numerous species are threatened by issues ranging from human activities to global climatic change. In this study, we present the conservation assessments and Red Listing of the endemic Moroccan monocotyledons according to International Union for Conservation of Nature (IUCN) criteria and categories. For each species, we include basic taxonomic information, local names and synonyms, uses, a distribution map, extent of occurrence, area of occupancy, population size and trend, a description of habitats and ecological requirements, and a discussion of the threats affecting the species and habitats. We assessed the threatened status of the endemic Moroccan monocotyledons at the species level (59 species) using the IUCN Red List criteria and categories (Version 3.1). This study shows the high extinction risk to the Moroccan monocotyledon flora, with 95% of species threatened (20% Critically Endangered, 50% Endangered, 25% Vulnerable) and only 5% not threatened (2% Near Threatened and 3% Least Concern). The flora is thus of conservation concern, which is poorly recognized both nationally and internationally. The study presents the first part of, and so far the only, national IUCN Red Data List for a large group of Moroccan plants, and thus provides an overview of the threatened Moroccan flora. This IUCN Red List is an important first step towards the recognition of the danger to Moroccan biodiversity hotspots, the conservation of threatened species, and the raising of public awareness at national and international levels.