867 results for Model of the semantic fields
Abstract:
Vector space models (VSMs) represent word meanings as points in a high-dimensional space. VSMs are typically created from large text corpora, and so represent word semantics as observed in text. We present a new algorithm (JNNSE) that can incorporate a measure of semantics not previously used to create VSMs: brain activation data recorded while people read words. The resulting model takes advantage of the complementary strengths and weaknesses of corpus and brain activation data to give a more complete representation of semantics. Evaluations show that the model 1) matches a behavioral measure of semantics more closely, 2) can be used to predict corpus data for unseen words, and 3) has predictive power that generalizes across brain imaging technologies and across subjects. We believe that the model is thus a more faithful representation of mental vocabularies.
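For illustration only, the joint-factorization idea described above can be written schematically as a shared latent matrix A that must reconstruct both a corpus matrix X_c and a brain-activation matrix X_b; the symbols, the sparsity term, and the non-negativity constraint below are assumptions sketched from the description, not the published JNNSE objective.

```latex
% Schematic joint objective: one latent representation A explains both data sources.
\begin{equation}
\min_{A \ge 0,\, D_c,\, D_b}\;
\lVert X_c - A D_c \rVert_F^2
\;+\; \lVert X_b - A D_b \rVert_F^2
\;+\; \lambda \lVert A \rVert_1
\end{equation}
```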
Abstract:
The advent of high-power laser facilities has, in the past two decades, opened a new field of research where astrophysical environments can be scaled down to laboratory dimensions, while preserving the essential physics. This is due to the invariance of the equations of magneto-hydrodynamics to a class of similarity transformations. Here we review the relevant scaling relations and their application in laboratory astrophysics experiments, with a focus on the generation and amplification of magnetic fields in cosmic environments. The standard model for the origin of magnetic fields is a multi-stage process whereby a vanishingly small magnetic seed is first generated by a rotational electric field and is then amplified by turbulent dynamo action to the characteristic values observed in astronomical bodies. We thus discuss the relevant seed-generation mechanisms in cosmic environments, including resistive mechanisms and collisionless and fluid instabilities, as well as novel laboratory experiments using high-power laser systems aimed at investigating the amplification of magnetic energy by magneto-hydrodynamic (MHD) turbulence. Future directions, including efforts to model in the laboratory the process of diffusive shock acceleration, are also discussed, with an emphasis on the potential of laboratory experiments to further our understanding of plasma physics on cosmic scales.
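As a point of reference, the similarity invariance invoked here is often quoted in the following form for ideal MHD; this is a standard textbook statement given as an assumed paraphrase, not an equation taken from the review: if lengths, densities, and pressures are rescaled by factors a, b, and c, the equations are unchanged provided the remaining variables are rescaled as

```latex
% Ideal-MHD similarity: with r -> a r, \rho -> b \rho, p -> c p,
% the equations are invariant when
\begin{equation}
v \rightarrow \sqrt{\tfrac{c}{b}}\, v, \qquad
t \rightarrow a \sqrt{\tfrac{b}{c}}\, t, \qquad
B \rightarrow \sqrt{c}\, B .
\end{equation}
```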
Abstract:
The representation of the diurnal cycle in the Hadley Centre climate model is evaluated using simulations of the infrared radiances observed by Meteosat 7. In both the window and water vapour channels, the standard version of the model with 19 levels produces a good simulation of the geographical distributions of the mean radiances and of the amplitude of the diurnal cycle. Increasing the vertical resolution to 30 levels leads to further improvements in the mean fields. The timing of the maximum and minimum radiances reveals significant model errors, however, which are sensitive to the frequency with which the radiation scheme is called. In most regions, these errors are consistent with well documented errors in the timing of convective precipitation, which peaks before noon in the model, in contrast to the observed peak in the late afternoon or evening. When the radiation scheme is called every model time step (half an hour), as opposed to every three hours in the standard version, the timing of the minimum radiance is improved for convective regions over central Africa, due to the creation of upper-level layer-cloud by detrainment from the convection scheme, which persists well after the convection itself has dissipated. However, this produces a decoupling between the timing of the diurnal cycles of precipitation and window channel radiance. The possibility is raised that a similar decoupling may occur in reality and the implications of this for the retrieval of the diurnal cycle of precipitation from infrared radiances are discussed.
Abstract:
Currently many ontologies are available for addressing different domains. However, it is not always possible to deploy such ontologies to support collaborative working, so their full potential for implementing intelligent cooperative applications capable of reasoning over a network of context-specific ontologies cannot be exploited. The main problem arises from the fact that ontologies are presently created in an isolated way to address specific needs. However, we foresee the need for a network of ontologies which will support the next generation of intelligent applications/devices and the vision of Ambient Intelligence. The main objective of this paper is to motivate the design of a networked ontology (Meta) model which formalises ways of connecting available ontologies so that they are easy to search, to characterise and to maintain. The aim is to make explicit the virtual and implicit network of ontologies serving the Semantic Web.
Abstract:
We introduce a technique for assessing the diurnal development of convective storm systems based on outgoing longwave radiation fields. Using the size distribution of the storms measured from a series of images, we generate an array in the lengthscale-time domain based on the standard score statistic. It demonstrates succinctly the size evolution of storms as well as the dissipation kinematics. It also provides evidence related to the temperature evolution of the cloud tops. We apply this approach to a test case comparing observations made by the Geostationary Earth Radiation Budget instrument to output from the Met Office Unified Model run at two resolutions. The 12-km resolution model produces peak convective activity on all lengthscales significantly earlier in the day than shown by the observations and no evidence for storms growing in size. The 4-km resolution model shows realistic timing and growth evolution although the dissipation mechanism still differs from the observed data.
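For concreteness, the standard-score (z-score) array described above can be sketched as follows; the array shape, the variable names, and the choice of normalising each lengthscale bin over the diurnal cycle are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def standard_score_array(storm_counts):
    """Turn a (lengthscale x time) array of storm counts into standard scores,
    normalising each lengthscale bin over the sequence of image times."""
    counts = np.asarray(storm_counts, dtype=float)
    mean = counts.mean(axis=1, keepdims=True)   # mean over the diurnal cycle
    std = counts.std(axis=1, keepdims=True)
    std[std == 0] = 1.0                         # guard against empty bins
    return (counts - mean) / std

# Example: 10 lengthscale bins observed in 8 successive images through the day.
rng = np.random.default_rng(0)
z = standard_score_array(rng.poisson(5, size=(10, 8)))
print(z.shape)  # (10, 8); positive values mark times when storms of that size are unusually frequent
```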
Abstract:
Simulations of the stratosphere from thirteen coupled chemistry-climate models (CCMs) are evaluated to provide guidance for the interpretation of ozone predictions made by the same CCMs. The focus of the evaluation is on how well the fields and processes that are important for determining the ozone distribution are represented in the simulations of the recent past. The core period of the evaluation is from 1980 to 1999, but long-term trends are compared for an extended period (1960–2004). Comparisons of polar high-latitude temperatures show that most CCMs have only small biases in the Northern Hemisphere in winter and spring, but still have cold biases in the Southern Hemisphere spring below 10 hPa. Most CCMs display the correct stratospheric response of polar temperatures to wave forcing in the Northern, but not in the Southern Hemisphere. Global long-term stratospheric temperature trends are in reasonable agreement with satellite and radiosonde observations. Comparisons of simulations of methane, mean age of air, and propagation of the annual cycle in water vapor show a wide spread in the results, indicating differences in transport. However, for around half the models there is reasonable agreement with observations. In these models the mean age of air and the water vapor tape recorder signal are generally better than reported in previous model intercomparisons. Comparisons of the water vapor and inorganic chlorine (Cly) fields also show a large intermodel spread. Differences in tropical water vapor mixing ratios in the lower stratosphere are primarily related to biases in the simulated tropical tropopause temperatures and not transport. The spread in Cly, which is largest in the polar lower stratosphere, appears to be primarily related to transport differences. In general, the amplitude and phase of the annual cycle in total ozone are well simulated apart from the southern high latitudes. Most CCMs show reasonable agreement with observed total ozone trends and variability on a global scale, but show a greater spread in the ozone trends in polar regions in spring, especially in the Arctic. In conclusion, despite the wide range of skills in representing different processes assessed here, there is sufficient agreement between the majority of the CCMs and the observations that some confidence can be placed in their predictions.
Abstract:
The dependence of the annual mean tropical precipitation on horizontal resolution is investigated in the atmospheric version of the Hadley Centre General Environment Model (HadGEM1). Reducing the grid spacing from about 350 km to 110 km improves the precipitation distribution in most of the tropics. In particular, characteristic dry biases over South and Southeast Asia including the Maritime Continent as well as wet biases over the western tropical oceans are reduced. The annual-mean precipitation bias is reduced by about one third over the Maritime Continent and the neighbouring ocean basins associated with it via the Walker circulation. Sensitivity experiments show that much of the improvement with resolution in the Maritime Continent region is due to the specification of better resolved surface boundary conditions (land fraction, soil and vegetation parameters) at the higher resolution. It is shown that in particular the formulation of the coastal tiling scheme may cause resolution sensitivity of the mean simulated climate. The improvement in the tropical mean precipitation in this region is not primarily associated with the better representation of orography at the higher resolution, nor with changes in the eddy transport of moisture. Sizeable sensitivity to changes in the surface fields may be one of the reasons for the large variation of the mean tropical precipitation distribution seen across climate models.
Abstract:
The Plaut, McClelland, Seidenberg and Patterson (1996) connectionist model of reading was evaluated at two points early in its training against reading data collected from British children on two occasions during their first year of literacy instruction. First, the network’s non-word reading was poor relative to word reading when compared with the children. Second, the network made more non-lexical than lexical errors, the opposite pattern to the children. Three adaptations were made to the training of the network to bring it closer to the learning environment of a child: an incremental training regime was adopted; the network was trained on grapheme-phoneme correspondences; and a training corpus based on words found in children’s early reading materials was used. The modifications caused a sharp improvement in non-word reading, relative to word reading, resulting in a near perfect match to the children’s data on this measure. The modified network, however, continued to make predominantly non-lexical errors, although evidence from a small-scale implementation of the full triangle framework suggests that this limitation stems from the lack of a semantic pathway. Taken together, these results suggest that, when properly trained, connectionist models of word reading can offer insights into key aspects of reading development in children.
Abstract:
The potential risk of agricultural pesticides to mammals typically depends on internal concentrations within individuals, and these are determined by the amount ingested and by absorption, distribution, metabolism, and excretion (ADME). Pesticide residues ingested depend, amongst other things, on individual spatial choices which determine how much and when feeding sites and areas of pesticide application overlap, and can be calculated using individual-based models (IBMs). Internal concentrations can be calculated using toxicokinetic (TK) models, which are quantitative representations of ADME processes. Here we provide a population model for the wood mouse (Apodemus sylvaticus) in which TK submodels were incorporated into an IBM representation of individuals making choices about where to feed. This allows us to estimate the contribution of individual spatial choice and TK processes to risk. We compared the risk predicted by four IBMs: (i) “AllExposed-NonTK”: no spatial choice, so all mice have 100% exposure, and no TK; (ii) “AllExposed-TK”: identical to (i) except that TK processes are included, where individuals vary because they have different temporal patterns of ingestion in the IBM; (iii) “Spatial-NonTK”: individual spatial choice and no TK; and (iv) “Spatial-TK”: individual spatial choice with TK. The TK parameters for the hypothetical pesticides used in this study were selected such that a conventional risk assessment would fail. Exposures were standardised using risk quotients (RQ; exposure divided by LD50 or LC50). We found that, for the exposed sub-population, including either spatial choice or TK reduced the RQ by 37–85%, and for the total population the reduction was 37–94%. However, spatial choice and TK together had little further effect in reducing RQ. The reasons for this are that when the proportion of time spent in treated crop (PT) approaches 1, TK processes dominate and spatial choice has very little effect, and conversely, when PT is small, spatial choice dominates and TK makes little contribution to exposure reduction. The latter situation means that a short time spent in the pesticide-treated field mimics exposure from a small gavage dose, whereas TK only makes a substantial difference when the dose is consumed over a longer period. We concluded that a combined TK-IBM is most likely to bring added value to the risk assessment process when the temporal pattern of feeding, time spent in the exposed area, and TK parameters are at intermediate levels; for instance, wood mice in foliar spray scenarios spend more time in crop fields because of better plant cover.
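To make the effect of the TK step on the risk quotient concrete, the sketch below contrasts the same total dose ingested as a single bolus versus spread over ten hours, using a one-compartment body-burden update; the elimination rate, the hourly intake series, and the LD50 value are illustrative assumptions, and this is not the authors' TK submodel.

```python
def peak_body_burden(hourly_intake_mg, k_elim_per_h):
    """One-compartment TK sketch: each hour the burden gains the intake and
    loses a fixed fraction of the previous burden to elimination; the peak burden is returned."""
    burden = 0.0
    peak = 0.0
    for intake in hourly_intake_mg:
        burden = burden + intake - k_elim_per_h * burden
        peak = max(peak, burden)
    return peak

ld50_mg = 20.0      # hypothetical threshold used to form the risk quotient
total_dose = 10.0   # mg, same total in both scenarios

bolus = [total_dose] + [0.0] * 9          # gavage-like single intake
spread = [total_dose / 10.0] * 10         # same dose spread over 10 h of feeding

rq_bolus = peak_body_burden(bolus, k_elim_per_h=0.3) / ld50_mg
rq_spread = peak_body_burden(spread, k_elim_per_h=0.3) / ld50_mg
print(rq_bolus, rq_spread)  # the spread intake gives the lower risk quotient
```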
Abstract:
Recent studies of the variation of geomagnetic activity over the past 140 years have quantified the "coronal source" magnetic flux F-s that leaves the solar atmosphere and enters the heliosphere and have shown that it has risen, on average, by an estimated 34% since 1963 and by 140% since 1900. This variation of open solar flux has been reproduced by Solanki et al. [2000] using a model which demonstrates how the open flux accumulates and decays, depending on the rate of flux emergence in active regions and on the length of the solar cycle. We here use a new technique to evaluate solar cycle length and find that it does vary in association with the rate of change of F-s in the way predicted. The long-term variation of the rate of flux emergence is found to be very similar in form to that in F-s, which may offer a potential explanation of why F-s appears to be a useful proxy for extrapolating solar total irradiance back in time. We also find that most of the variation of cosmic ray fluxes incident on Earth is explained by the strength of the heliospheric field (quantified by F-s) and use observations of the abundance of the isotope Be-10 (produced by cosmic rays and deposited in ice sheets) to study the decrease in F-s during the Maunder minimum. The interior motions at the base of the convection zone, where the solar dynamo is probably located, have recently been revealed using the helioseismology technique and found to exhibit a 1.3-year oscillation. This periodicity is here reported in observations of the interplanetary magnetic field and geomagnetic activity but is only present after 1940. When present, it shows a strong 22-year variation, peaking near the maximum of even-numbered sunspot cycles and showing minima at the peaks of odd-numbered cycles. We discuss the implications of these long-term solar and heliospheric variations for Earth's environment.
Abstract:
Weather and climate model simulations of the West African Monsoon (WAM) generally represent the rainfall distribution and monsoon circulation poorly because key processes, such as clouds and convection, are poorly characterized. The vertical distribution of cloud and precipitation during the WAM is evaluated in Met Office Unified Model simulations against CloudSat observations. Simulations were run at 40-km and 12-km horizontal grid length using a convection parameterization scheme, and at 12-km, 4-km, and 1.5-km grid length with the convection scheme effectively switched off, to study the impact of model resolution and the convection parameterization scheme on the organisation of tropical convection. Radar reflectivity is forward-modelled from the model cloud fields using the CloudSat simulator to present a like-with-like comparison with the CloudSat radar observations. The representation of cloud and precipitation at 12-km horizontal grid length improves dramatically when the convection parameterization is switched off, primarily because of a reduction in daytime (moist) convection. Further improvement is obtained when reducing the model grid length to 4 km or 1.5 km, especially in the representation of thin anvil and mid-level cloud, but three issues remain in all model configurations. Firstly, all simulations underestimate the fraction of anvils with cloud top height above 12 km, which can be attributed to ice water contents in the model that are too low compared to satellite retrievals. Secondly, the model consistently detrains mid-level cloud too close to the freezing level, compared to higher altitudes in CloudSat observations. Finally, there is too much low-level cloud cover in all simulations, and this bias was not improved when adjusting the rainfall parameters in the microphysics scheme. To improve model simulations of the WAM, more detailed and in-situ observations of the dynamics and microphysics targeting these non-precipitating cloud types are required.
Abstract:
This study explores the decadal potential predictability of the Atlantic Meridional Overturning Circulation (AMOC) as represented in the IPSL-CM5A-LR model, along with the predictability of associated oceanic and atmospheric fields. Using a 1000-year control run, we analyze the prognostic potential predictability (PPP) of the AMOC through ensembles of simulations with perturbed initial conditions. Based on a measure of the ensemble spread, the modelled AMOC has an average predictive skill of 8 years, with some degree of dependence on the AMOC initial state. Diagnostic potential predictability of surface temperature and precipitation is also identified in the control run and compared to the PPP. Both approaches clearly bring out the same regions exhibiting the highest predictive skill. Generally, surface temperature has the highest skill, up to two decades, in the far North Atlantic Ocean. There are also weak signals over a few oceanic areas in the tropics and subtropics. Predictability over land is restricted to the coastal areas bordering oceanic predictable regions. Potential predictability at interannual and longer timescales is largely absent for precipitation, in spite of weak signals identified mainly in the Nordic Seas. Regions of weak signals show some dependence on the AMOC initial state. All the identified regions are closely linked to decadal AMOC fluctuations, suggesting that the potential predictability of climate arises from the mechanisms controlling these fluctuations. Evidence for dependence on the AMOC initial state also suggests that studying skill from case studies may prove more useful for understanding predictability mechanisms than computing average skill from numerous start dates.
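As an illustration of the ensemble-spread measure behind prognostic potential predictability, the sketch below compares the variance across ensemble members with the climatological variance of a control run; the ratio-based definition, the toy ensemble, and the variable names are assumptions for illustration, not the diagnostic used in the study.

```python
import numpy as np

def prognostic_potential_predictability(ensemble, control):
    """PPP-like score per lead time: 1 - (variance across members) / (control-run variance).
    ensemble: array of shape (members, lead_times); control: 1-D control-run index.
    Values near 1 mean members stay close together (predictable);
    values near 0 mean the spread has reached climatological variance."""
    spread = ensemble.var(axis=0, ddof=1)
    clim_var = control.var(ddof=1)
    return 1.0 - spread / clim_var

rng = np.random.default_rng(1)
control = rng.normal(0.0, 1.0, size=1000)                          # stand-in for a control-run AMOC index
ensemble = np.cumsum(rng.normal(0.0, 0.3, size=(8, 20)), axis=1)   # toy 8-member ensemble; spread grows with lead time
ppp = prognostic_potential_predictability(ensemble, control)
print(np.round(ppp, 2))  # skill decays toward (and below) zero as the spread grows
```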
Abstract:
Intercomparison and evaluation of the global ocean surface mixed layer depth (MLD) fields estimated from a suite of major ocean syntheses are conducted. Compared with the reference MLDs calculated from individual profiles, MLDs calculated from monthly mean and gridded profiles show negative biases of 10–20 m in early spring related to the re-stratification process of relatively deep mixed layers. Vertical resolution of profiles also influences the MLD estimation. MLDs are underestimated by approximately 5–7 (14–16) m with the vertical resolution of 25 (50) m when the criterion of potential density exceeding the 10-m value by 0.03 kg m⁻³ is used for the MLD estimation. Using the larger criterion (0.125 kg m⁻³) generally reduces the underestimations. In addition, positive biases greater than 100 m are found in wintertime subpolar regions when MLD criteria based on temperature are used. Biases of the reanalyses are due to both model errors and errors related to differences between the assimilation methods. The result shows that these errors are partially cancelled out through the ensemble averaging. Moreover, the bias in the ensemble mean field of the reanalyses is smaller than in the observation-only analyses. This is largely attributed to comparably higher resolutions of the reanalyses. The robust reproduction of both the seasonal cycle and interannual variability by the ensemble mean of the reanalyses indicates a great potential of the ensemble mean MLD field for investigating and monitoring upper ocean processes.
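A minimal sketch of the density-threshold MLD calculation mentioned above (potential density exceeding its 10-m value by 0.03 kg m⁻³); the linear interpolation between levels and the variable names are illustrative assumptions.

```python
import numpy as np

def mixed_layer_depth(depth_m, sigma_kg_m3, threshold=0.03, ref_depth=10.0):
    """Depth at which potential density first exceeds its 10-m value by `threshold`.
    depth_m, sigma_kg_m3: 1-D arrays of increasing depth and potential density.
    Returns the MLD in metres, linearly interpolated between levels."""
    sigma_ref = np.interp(ref_depth, depth_m, sigma_kg_m3)   # density at the 10-m reference level
    target = sigma_ref + threshold
    exceed = np.where((sigma_kg_m3 >= target) & (depth_m > ref_depth))[0]
    if exceed.size == 0:
        return float(depth_m[-1])                            # mixed layer reaches the bottom of the profile
    i = exceed[0]
    # Interpolate between the last level below the target density and the first level above it.
    return float(np.interp(target, sigma_kg_m3[i - 1:i + 1], depth_m[i - 1:i + 1]))

# Example profile: uniform density down to about 60 m, then a weak pycnocline.
z = np.arange(0.0, 200.0, 5.0)
sigma = 25.0 + 0.002 * np.maximum(z - 60.0, 0.0)
print(mixed_layer_depth(z, sigma))  # roughly 75 m for this profile
```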
Abstract:
In this paper we present a finite difference method for solving two-dimensional viscoelastic unsteady free surface flows governed by the single equation version of the eXtended Pom-Pom (XPP) model. The momentum equations are solved by a projection method which uncouples the velocity and pressure fields. We are interested in low Reynolds number flows and, to enhance the stability of the numerical method, an implicit technique for computing the pressure condition on the free surface is employed. This strategy is invoked to solve the governing equations within a Marker-and-Cell type approach while simultaneously calculating the correct normal stress condition on the free surface. The numerical code is validated by performing mesh refinement on a two-dimensional channel flow. Numerical results include an investigation of the influence of the parameters of the XPP equation on the extrudate swelling ratio and the simulation of the Barus effect for XPP fluids.
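The projection step that uncouples velocity and pressure can be summarised schematically as follows; this is the generic Chorin-type splitting written with a viscoelastic extra-stress term, given as an assumed outline rather than the specific discretisation used in the paper.

```latex
% Generic projection splitting: (1) intermediate velocity without the new pressure,
% (2) pressure Poisson equation, (3) projection onto the divergence-free space.
\begin{align}
\frac{\tilde{\mathbf{u}} - \mathbf{u}^{n}}{\Delta t} &=
  -(\mathbf{u}^{n}\!\cdot\!\nabla)\mathbf{u}^{n}
  + \frac{1}{Re}\,\nabla^{2}\mathbf{u}^{n}
  + \nabla\!\cdot\!\boldsymbol{\tau}^{n}, \\
\nabla^{2} p^{\,n+1} &= \frac{\nabla\!\cdot\!\tilde{\mathbf{u}}}{\Delta t}, \\
\mathbf{u}^{n+1} &= \tilde{\mathbf{u}} - \Delta t\,\nabla p^{\,n+1},
\end{align}
% where \boldsymbol{\tau} denotes the XPP extra-stress tensor.
```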
Abstract:
We consider the three-particle scattering S-matrix for the Landau-Lifshitz model by directly computing the set of Feynman diagrams up to second order. We show, following analogous computations for the non-linear Schrödinger model [1, 2], that the three-particle S-matrix is factorizable in the first non-trivial order.
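Factorizability here means that the three-particle amplitude reduces to a product of two-particle S-matrices, as written schematically below (the momentum labels are illustrative):

```latex
% Factorization of the three-particle S-matrix into pairwise two-particle scatterings
\begin{equation}
S_{3}(p_{1},p_{2},p_{3}) \;=\; S_{2}(p_{1},p_{2})\, S_{2}(p_{1},p_{3})\, S_{2}(p_{2},p_{3})
\end{equation}
```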