963 results for tense and aspect


Relevance: 30.00%

Abstract:

Differential geometry is used to investigate the structure of neural-network-based control systems. The key aspect is relative order—an invariant property of dynamic systems. Finite relative order allows the specification of a minimal architecture for a recurrent network. Any system with finite relative order has a left inverse. It is shown that a recurrent network with finite relative order has a local inverse that is also a recurrent network with the same weights. The results have implications for the use of recurrent networks in the inverse-model-based control of nonlinear systems.
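
For reference, the relative order (relative degree) invoked here has a standard definition in the differential-geometric control literature: it is the number of times the output must be differentiated before the input appears explicitly. The sketch below states that textbook definition for a single-input, single-output system; the notation is illustrative and not taken from the paper itself.

```latex
% Relative degree r of a SISO system \dot{x} = f(x) + g(x)u, y = h(x),
% where L_f h denotes the Lie derivative of h along f
% (standard definition; the paper's notation may differ):
\[
L_g L_f^{k} h(x) = 0 \quad \text{for } k = 0, \dots, r-2,
\qquad
L_g L_f^{r-1} h(x) \neq 0,
\]
\[
\text{so that } y^{(r)} = L_f^{r} h(x) + L_g L_f^{r-1} h(x)\, u .
\]
```

Finite relative degree is what makes the output equation solvable for u, which is why it underpins the existence of the left inverse discussed above.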

Relevance: 30.00%

Abstract:

The mean state, variability and extreme variability of the stratospheric polar vortices, with an emphasis on the Northern Hemisphere vortex, are examined using 2-dimensional moment analysis and Extreme Value Theory (EVT). The use of moments as an analysis tool gives rise to information about the vortex area, centroid latitude, aspect ratio and kurtosis. The application of EVT to these moment-derived quantities allows the extreme variability of the vortex to be assessed. The data used for this study are ECMWF ERA-40 potential vorticity fields on interpolated isentropic surfaces that range from 450 K to 1450 K. Analyses show that the most extreme vortex variability occurs most commonly in late January and early February, consistent with when most planetary wave driving from the troposphere is observed. Composites around sudden stratospheric warming (SSW) events reveal that the moment diagnostics evolve in statistically different ways between vortex splitting events and vortex displacement events, in contrast to the traditional diagnostics. Histograms of the vortex diagnostics on the 850 K (∼10 hPa) surface over the 1958-2001 period are fitted with parametric distributions, and show that SSW events comprise the majority of data in the tails of the distributions. The distribution of each diagnostic is computed on various surfaces throughout the depth of the stratosphere, and shows that in general the vortex becomes more circular, with higher filamentation, at the upper levels. The Northern Hemisphere (NH) and Southern Hemisphere (SH) vortices are also compared through the analysis of their respective vortex diagnostics, confirming that the SH vortex is less variable and lacks extreme events compared to the NH vortex. Finally, extreme value theory is used to statistically model the vortex diagnostics and make inferences about the underlying dynamics of the polar vortices.
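
As a sketch of the 2-dimensional moment method described above: treating the PV field on an isentropic surface as a density, the zeroth moment gives an area-like measure, the first moments the centroid, and the equivalent-ellipse eigenvalues of the second-moment tensor the aspect ratio. These are standard image-moment formulas, not reproduced from the thesis itself; variable names are illustrative, and a real application would weight by grid-cell area on the sphere.

```python
# Sketch of 2-D moment diagnostics for a vortex, treating the PV field
# q(x, y) as a density (standard image-moment formulas; illustrative only).
import numpy as np

def vortex_moments(q, x, y):
    """Zeroth moment, centroid, and equivalent-ellipse aspect ratio of q."""
    X, Y = np.meshgrid(x, y)
    m00 = q.sum()                                         # zeroth moment
    xc, yc = (q * X).sum() / m00, (q * Y).sum() / m00     # centroid
    # Central second moments define the equivalent ellipse
    mu20 = (q * (X - xc) ** 2).sum() / m00
    mu02 = (q * (Y - yc) ** 2).sum() / m00
    mu11 = (q * (X - xc) * (Y - yc)).sum() / m00
    # Eigenvalues of the moment tensor give the squared semi-axes (up to scale)
    common = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
    lam1 = 0.5 * (mu20 + mu02 + common)
    lam2 = 0.5 * (mu20 + mu02 - common)
    return m00, (xc, yc), np.sqrt(lam1 / lam2)            # aspect ratio

# Example: a synthetic elliptical "vortex" on a toy grid
x = np.linspace(-1, 1, 200); y = np.linspace(-1, 1, 200)
X, Y = np.meshgrid(x, y)
q = ((X / 0.6) ** 2 + (Y / 0.3) ** 2 < 1).astype(float)
print(vortex_moments(q, x, y))   # aspect ratio close to 2
```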

Relevance: 30.00%

Abstract:

The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of Television Studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines, yet argued for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example that the issue has crystallised around most is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context. 
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.

Relevance: 30.00%

Abstract:

OBJECTIVE: The present study was carried out to determine the effects of test meals of different fatty acid compositions on postprandial lipoprotein and apolipoprotein metabolism. DESIGN: The study was a randomized, single-blind design. SETTING: The study was carried out in the Clinical Investigation Unit of the Royal Surrey County Hospital. SUBJECTS: Twelve normal male subjects with an average age of 22.4 +/- 1.4 years (mean +/- SD) were selected from the student population of the University of Surrey; one subject dropped out of the study because he found the test meal unpalatable. INTERVENTIONS: The subjects were given three evening test meals on three separate occasions, in which the oils used were either a mixed oil (rich in saturated fatty acids, approximating the fatty acid intake of the current UK diet), corn oil (rich in n-6 fatty acids), or fish oil (rich in n-3 fatty acids). 40 g of the oil under investigation were incorporated into a rice-based test meal. Triacylglycerol-rich lipoprotein triacylglycerol (TRL-TAG), TRL-cholesterol, plasma TAG, plasma total cholesterol (T-C), and serum apolipoprotein A-I and B (apo A-I and B) responses were measured. Postprandial responses were followed for 11 h. RESULTS: Postprandial plasma TAG responses, calculated as incremental areas under the response curves (IAUC), were significantly reduced following the fish oil meal (365.5 +/- 145.4 mmol/l x min, mean +/- SD) compared with the mixed oil meal (552.0 +/- 141.7 mmol/l x min) (P < 0.05), and there was a strong trend in the same direction in the TRL-TAG responses. In all instances, plasma and TRL-TAG showed a biphasic response, with increased concentrations occurring at 1 h and between 3 and 7 h postprandially. TRL-cholesterol, T-C, and serum apo A-I and B responses to the three meals were similar. CONCLUSIONS: The findings support the view that fish oils decrease postprandial lipaemia, and this may be an important aspect of their beneficial effects in reducing risk of coronary heart disease (CHD). Further work is required to determine the mechanisms responsible for this effect.
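
The incremental area under the curve (IAUC) used above is a standard trapezoidal calculation of the response above its fasting baseline. A minimal sketch follows, with illustrative values rather than study data; note that conventions differ on whether dips below baseline are clipped or integrated with sign.

```python
# IAUC: trapezoidal area of the postprandial response above the fasting
# (t = 0) baseline. One common convention clips dips below baseline;
# some studies instead integrate signed increments.
import numpy as np

def iauc(times_min, conc_mmol_l):
    """Trapezoidal area of the response above its baseline value."""
    increments = np.asarray(conc_mmol_l) - conc_mmol_l[0]  # subtract baseline
    increments = np.clip(increments, 0.0, None)            # clip sub-baseline dips
    return np.trapz(increments, times_min)                 # mmol/l x min

# Example: hourly TAG samples over an 11 h postprandial period (made-up values)
t = np.arange(0, 12) * 60.0
c = [1.0, 1.6, 1.4, 1.7, 1.9, 1.8, 1.6, 1.7, 1.4, 1.2, 1.1, 1.0]
print(f"IAUC = {iauc(t, c):.1f} mmol/l x min")
```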

Relevance: 30.00%

Abstract:

Information modelling is a topic that has been researched a great deal, but many questions around it remain unsolved. An information model is essential to the design of a database, which is the core of an information system. Currently, most databases deal only with information that represents facts, or asserted information. The ability to capture the semantic aspect of information has to be improved, and other types, such as temporal and intentional information, should also be considered. Semantic Analysis, a method of information modelling, offers a way to handle these various aspects of information. It employs domain knowledge and communication acts as sources for information modelling, and it lends itself to a uniform structure in which semantic, temporal and intentional information can all be captured, laying a sound foundation for building a semantic temporal database.

Relevance: 30.00%

Abstract:

Sea-level rise is an important aspect of climate change because of its impact on society and ecosystems. Here we present an intercomparison of results from ten coupled atmosphere-ocean general circulation models (AOGCMs) for sea-level changes simulated for the twentieth century and projected to occur during the twenty-first century in experiments following scenario IS92a for greenhouse gases and sulphate aerosols. The model results suggest that the rate of sea-level rise due to thermal expansion of sea water has increased during the twentieth century, but the small set of tide gauges with long records might not be adequate to detect this acceleration. The rate of sea-level rise due to thermal expansion continues to increase throughout the twenty-first century, and the projected total is consequently larger than in the twentieth century; for 1990-2090 it amounts to 0.20-0.37 m. This wide range results from systematic uncertainty in the modelling of climate change and of heat uptake by the ocean. The AOGCMs agree that sea-level rise is expected to be geographically non-uniform, with some regions experiencing as much as twice the global average and others practically zero, but they do not agree about the geographical pattern. The lack of agreement indicates that we cannot currently have confidence in projections of local sea-level changes, and reveals a need for detailed analysis and intercomparison in order to understand and reduce the disagreements.

Relevance: 30.00%

Abstract:

The benefits and applications of virtual reality (VR) in the construction industry have been investigated for almost a decade. However, the practical implementation of VR in the construction industry has yet to reach maturity owing to technical constraints. The need for effective information management presents challenges: both the transfer of building data to, and the organisation of building information within, the virtual environment require consideration. This paper reviews the applications and benefits of VR in the built environment field and reports on a collaboration between Loughborough University and South Bank University to overcome constraints on the use of the overall VR model for whole-lifecycle visualisation. The work at each research centre is concerned with an aspect of information management within VR applications for the built environment, and both data transfer and internal data organisation have been investigated. In this paper, similarities and differences between computer-aided design (CAD) and VR packages are first discussed. Three different approaches to the creation of VR models during the design stage are identified and described, with a view to providing shared understanding across the interdisciplinary groups involved. The suitable organisation of building information within the virtual environment is then further investigated. This work focused on the visualisation of the degradation of a building through its lifespan, with a view to providing a visual aid for developing an effective and economic project maintenance programme. Finally, consideration is given to the potential of emerging standards to facilitate an integrated use of VR. The convergence towards similar data structures in VR and other construction packages may enable visualisation to be better utilised in the overall lifecycle model.

Relevance: 30.00%

Abstract:

A stylised fact in the real estate portfolio diversification literature is that sector (property-type) effects are relatively more important than regional (geographical) factors in determining property returns. Thus, portfolio managers who follow a top-down approach to portfolio management should first choose in which sectors to invest and then select the best properties in each market. However, the question arises as to whether the dominance of the sector effects relative to regional effects is constant. If not, property fund managers will need to take account of regional effects in developing their portfolio strategy. Using monthly data over the period 1987:1 to 2002:12 for a sample of over 1000 properties, the results show that the sector-specific factors dominate the regional-specific factors for the vast majority of the time. Nonetheless, there are periods when the regional factors are of equal or greater importance than the sector effects. In particular, the sector effects tend to dominate during volatile periods of the real estate cycle, while during calmer periods the sector and regional effects are of equal importance. These findings suggest that the sector effects are still the most important aspect in the development of an active portfolio strategy.
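
Studies of this kind commonly separate sector and regional effects with a dummy-variable decomposition of the Heston-Rouwenhorst type, estimated cross-sectionally each month. The sketch below states that standard specification as a point of reference; it is an assumption about the general methodology, and the paper's exact model may differ.

```latex
% Cross-sectional decomposition of the return on property i in month t into a
% common term, a sector effect, a regional effect, and a residual
% (Heston-Rouwenhorst-style specification; notation is illustrative):
\[
R_{i,t} \;=\; \alpha_t \;+\; \sum_{j} \beta_{j,t}\, S_{i,j}
\;+\; \sum_{k} \gamma_{k,t}\, G_{i,k} \;+\; \varepsilon_{i,t},
\]
% where S_{i,j} = 1 if property i belongs to sector j (0 otherwise) and
% G_{i,k} = 1 if it lies in region k. The dispersion of the estimated
% beta_{j,t} relative to that of gamma_{k,t} measures the importance of
% sector versus regional factors in month t.
```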

Relevance: 30.00%

Abstract:

Interchange reconnection at the Sun, that is, reconnection between a doubly-connected field loop and a singly-connected or open field line that extends to infinity, has important implications for the heliospheric magnetic flux budget. Recent work on the topic is reviewed, with emphasis on two aspects. The first is a possible heliospheric signature of interchange reconnection at the coronal hole boundary, where open fields meet closed loops. The second aspect concerns the means by which the heliospheric magnetic field strength reached record lows during the recent solar minimum period. A new implication of this work is that interchange reconnection may be responsible for the puzzling, occasional coincidence of the heliospheric current sheet and the interface between fast and slow flow in the solar wind.

Relevance: 30.00%

Abstract:

This chapter considers Multiband Orthogonal Frequency Division Multiplexing (MB-OFDM) modulation and demodulation with the intention of optimizing Ultra-Wideband (UWB) system performance. OFDM is a type of multicarrier modulation and is the most important aspect of MB-OFDM system performance. It is also a low-cost digital signal component, efficiently using the Fast Fourier Transform (FFT) algorithm to implement multicarrier orthogonality. Within the MB-OFDM approach, OFDM modulation is employed in each 528 MHz-wide band to transmit the data, while the frequency hopping technique is used across the different bands. Each parallel bit stream can be mapped onto one of the OFDM subcarriers. Quadrature Phase Shift Keying (QPSK) and Dual Carrier Modulation (DCM) are currently used as the modulation schemes for MB-OFDM in the ECMA-368 defined UWB radio platform. A dual QPSK soft-demapper is suitable for ECMA-368: it exploits the inherent Time-Domain Spreading (TDS) and guard symbol subcarrier diversity to improve receiver performance, yet merges decoding operations together to minimize hardware and power requirements. There are several methods to demap the DCM: soft bit demapping, Maximum Likelihood (ML) soft bit demapping, and Log Likelihood Ratio (LLR) demapping. The Channel State Information (CSI) aided scheme, coupled with the band hopping information, is used as a further technique to improve the DCM demapping performance. ECMA-368 offers up to 480 Mb/s instantaneous bit rate to the Medium Access Control (MAC) layer, but depending on radio channel conditions, dropped packets unfortunately result in a lower throughput. An alternative high data rate modulation scheme, termed Dual Circular 32-QAM, fits within the configuration of the current standard and increases system throughput, maintaining a high throughput even with a moderate level of dropped packets.
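
A minimal sketch of soft-bit (LLR) demapping for Gray-mapped QPSK, the building block behind the dual QPSK soft-demapper mentioned above: these are textbook formulas under an AWGN assumption, not the standard's exact demapper, and the names are illustrative.

```python
# Soft-bit demapping of Gray-mapped QPSK under AWGN. With unit-energy symbols
# (+-1 +-j)/sqrt(2), the per-bit LLRs separate into the I and Q components:
# LLR = 2*a*y/sigma^2 per dimension, with amplitude a = 1/sqrt(2).
# Textbook result, assumed here; not ECMA-368's exact implementation.
import numpy as np

def qpsk_llrs(received, noise_var):
    """Return per-symbol LLRs (bit 0 from I, bit 1 from Q)."""
    scale = np.sqrt(2.0) / noise_var   # 2*a/sigma^2 with a = 1/sqrt(2)
    llr_i = scale * received.real      # LLR of the in-phase bit
    llr_q = scale * received.imag      # LLR of the quadrature bit
    return np.stack([llr_i, llr_q], axis=-1)

# Example: one noisy symbol. With LLR = log(P(b=0)/P(b=1)) and bit 0 mapped
# to +1/sqrt(2), a positive LLR favours bit = 0 in this convention.
r = np.array([0.62 + 0.71j])
print(qpsk_llrs(r, noise_var=0.5))
```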

Relevance: 30.00%

Abstract:

Magmas in volcanic conduits commonly contain microlites in association with preexisting phenocrysts, as often indicated by volcanic rock textures. In this study, we present two different experiments that investigate the flow behavior of these bidisperse systems. In the first experiments, rotational rheometric methods are used to determine the rheology of monodisperse and polydisperse suspensions consisting of smaller, prolate particles (microlites) and larger, equant particles (phenocrysts) in a bubble-free Newtonian liquid (silicate melt). Our data show that increasing the relative proportion of prolate microlites to equant phenocrysts in a magma at constant total particle content can increase the relative viscosity by up to three orders of magnitude. Consequently, the rheological effect of particles in magmas cannot be modeled by assuming a monodisperse population of particles. We propose a new model that uses interpolated parameters based on the relative proportions of small and large particles and produces a considerably improved fit to the data compared with earlier models. In a second series of experiments, we investigate the textures produced by shearing bimodal suspensions in gradually solidifying epoxy resin in a concentric cylinder setup. The resulting textures show that the prolate particles are aligned with the flow lines and the spherical particles are found in well-organized strings, with sphere-depleted shear bands in high-shear regions. These observations may explain the measured variation in shear thinning and yield stress behavior with increasing solid fraction and particle aspect ratio. The implications for magma flow are discussed, and the rheological results and textural observations are compared with observations on natural samples.
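
One way to read the interpolated-parameter idea is against a standard maximum-packing viscosity law. The sketch below uses the Maron-Pierce form with a packing fraction interpolated between small- and large-particle end members; this is an illustration of the concept under stated assumptions, not the paper's fitted model.

```latex
% Maron-Pierce relative viscosity with a maximum packing fraction phi_m
% interpolated by the small-particle fraction xi = phi_small / phi_total
% (illustrative form; the paper's fitted parameters are not reproduced here):
\[
\eta_r \;=\; \left(1 - \frac{\phi}{\phi_m(\xi)}\right)^{-2},
\qquad
\phi_m(\xi) \;=\; \xi\, \phi_m^{\text{small}} \;+\; (1 - \xi)\, \phi_m^{\text{large}} ,
\]
% where phi is the total solid fraction. Because eta_r diverges as
% phi -> phi_m, even a modest shift in phi_m with particle proportions can
% change the relative viscosity by orders of magnitude at fixed phi.
```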

Relevance: 30.00%

Abstract:

The Crusades in the Near East, eastern Baltic and Iberian Peninsula (in the context of the Reconquest/reconquista) were accompanied by processes of colonisation, characterising the expansion of medieval Europe and resulting in the creation of frontier societies at the fringes of Christendom. Colonisation was closely associated with — indeed, depended on — the exploitation of local environments, but this dimension is largely missing from studies of the crusading frontiers. This paper, the product of a European Science Foundation Exploratory Workshop on 'The Ecology of Crusading' in 2009, surveys the potential for investigating the environmental impact of the crusading movement in all three frontier regions. It considers a diverse range of archaeological, palaeoenvironmental and written sources, with the aim of situating the societies created by the Crusades within the context of medieval colonisation and human ecological niche construction. It demonstrates that an abundant range of data exists for developing this largely neglected and disparately studied aspect of medieval frontier societies into a significant research programme.

Relevance: 30.00%

Abstract:

Using a literature review, we argue that new models of peatland development are needed. Many existing models do not account for potentially important ecohydrological feedbacks, and/or ignore spatial structure and heterogeneity. Existing models, including those that simulate a near total loss of the northern peatland carbon store under a warming climate, may produce misleading results because they rely upon oversimplified representations of ecological and hydrological processes. In this, the first of a pair of papers, we present the conceptual framework for a model of peatland development, DigiBog, which considers peatlands as complex adaptive systems. DigiBog accounts for the interactions between the processes which govern litter production and peat decay, peat soil hydraulic properties, and peatland water-table behaviour, in a novel and genuinely ecohydrological manner. DigiBog consists of a number of interacting submodels, each representing a different aspect of peatland ecohydrology. Here we present in detail the mathematical and computational basis, as well as the implementation and testing, of the hydrological submodel. Remaining submodels are described and analysed in the accompanying paper. Tests of the hydrological submodel against analytical solutions for simple aquifers were highly successful: the greatest deviation between DigiBog and the analytical solutions was 2·83%. We also applied the hydrological submodel to irregularly shaped aquifers with heterogeneous hydraulic properties—situations for which no analytical solutions exist—and found the model's outputs to be plausible.
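
The standard governing equation for water-table movement in an unconfined aquifer of this kind is the Boussinesq equation; a sketch of its common form follows, as an assumption about the general class of model rather than the submodel's exact formulation, which (with its boundary conditions) is given in the paper.

```latex
% Boussinesq equation for the water-table height h(x, y, t) above an
% impermeable base, with hydraulic conductivity K, drainable porosity s,
% and net recharge U (a standard form; DigiBog's exact statement may differ):
\[
s \,\frac{\partial h}{\partial t}
\;=\;
\frac{\partial}{\partial x}\!\left( K h \,\frac{\partial h}{\partial x} \right)
\;+\;
\frac{\partial}{\partial y}\!\left( K h \,\frac{\partial h}{\partial y} \right)
\;+\; U .
\]
```

Analytical solutions exist for simple geometries with constant K, which is what makes the validation against simple aquifers reported above possible; irregular shapes and heterogeneous hydraulic properties require the numerical treatment.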

Relevance: 30.00%

Abstract:

An analysis of how illustrations functioned as a distinctive and important aspect of the translation of Latin versions of the story of the rape and suicide of Lucretia into Middle French texts, especially the 'Faits et dits memorables' (a translation-adaptation of Valerius Maximus's 'Facta et dicta memorabilia'). The study focuses on a selection of 14th- and 15th-century illuminations, and also proposes that the early modern 'Lucretia' portrait tradition should be viewed in the context of these images.