960 results for Space-time block code
Abstract:
Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
This paper describes the basic tools for a real-time decision support system of the semiotic type, illustrated by a prototype for management and monitoring of a nuclear power unit, implemented on the basis of the G2+GDA tool complex using cognitive graphics and parallel processing. This work was supported by RFBR (project 02-07-90042).
Abstract:
A novel framework for modelling biomolecular systems at multiple scales in space and time simultaneously is described. The atomistic molecular dynamics representation is smoothly connected with a statistical continuum hydrodynamics description. The system behaves correctly in the limits of pure molecular dynamics and pure hydrodynamics, and in the intermediate regimes, when the atoms move partly as atomistic particles and at the same time follow the hydrodynamic flows. The corresponding contributions are controlled by a parameter defined as an arbitrary function of space and time, thus allowing an effective separation of the atomistic 'core' and continuum 'environment'. To fill the scale gap between the atomistic and the continuum representations, our special-purpose computer for molecular dynamics, MDGRAPE-4, as well as GPU-based computing, were used in developing the framework. These hardware developments also include interactive molecular dynamics simulations that allow intervention in the modelling through force-feedback devices.
Abstract:
Mathematics Subject Classification 2010: 35R11, 42A38, 26A33, 33E12.
Abstract:
A new 3D implementation of a hybrid model based on the analogy with two-phase hydrodynamics has been developed for the simulation of liquids at microscale. The idea of the method is to smoothly combine the atomistic description in the molecular dynamics zone with the Landau-Lifshitz fluctuating hydrodynamics representation in the rest of the system in the framework of macroscopic conservation laws through the use of a single "zoom-in" user-defined function s that has the meaning of a partial concentration in the two-phase analogy model. In comparison with our previous works, the implementation has been extended to full 3D simulations for a range of atomistic models in GROMACS from argon to water in equilibrium conditions with a constant or a spatially variable function s. Preliminary results of simulating the diffusion of a small peptide in water are also reported.
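The central idea above is a user-defined "zoom-in" function s, with the meaning of a partial concentration, that interpolates between the atomistic and hydrodynamic descriptions. The sketch below is a conceptual illustration only, not the GROMACS implementation: the tanh profile for s and the linear blending rule are assumptions chosen for clarity.

```python
# Conceptual sketch of the hybrid scheme: a user-defined "zoom-in"
# function s(x) in [0, 1] acts as a partial concentration that blends
# the atomistic (molecular dynamics) and continuum (hydrodynamic)
# descriptions. The profile shape and linear blending are illustrative
# assumptions, not taken from the paper's implementation.

import math

def s_zoom(x, core_radius=1.0, width=0.5):
    """Partial concentration: ~1 inside the atomistic core, ~0 far away,
    with a smooth transition of characteristic width `width`."""
    return 0.5 * (1.0 - math.tanh((abs(x) - core_radius) / width))

def blended_velocity(x, v_md, v_hydro):
    """Blend atomistic and continuum velocity values according to s(x)."""
    s = s_zoom(x)
    return s * v_md + (1.0 - s) * v_hydro
```

Inside the core the atomistic value dominates; far outside, the hydrodynamic value dominates, and a spatially variable s realizes the "zoom-in" region described above.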
Abstract:
The development of 3G (third-generation telecommunication) value-added services brings higher requirements for Quality of Service (QoS). Wideband Code Division Multiple Access (WCDMA) is one of the three 3G standards, and enhancement of QoS for the WCDMA Core Network (CN) is becoming more and more important for users and carriers. The dissertation focuses on enhancement of QoS for the WCDMA CN; the purpose is to realize the DiffServ (Differentiated Services) model of QoS for the WCDMA CN. Based on the parallelism characteristic of Network Processors (NPs), NP programming models are classified as Pool of Threads (POTs) and Hyper Task Chaining (HTC). In this study, an integrated programming model that combines the two was designed. This model is highly efficient and flexible, and also solves the problems of sharing conflicts and packet ordering. We used it as the programming model to realize DiffServ QoS for the WCDMA CN.
The realization mechanism of the DiffServ model mainly consists of buffer management, packet scheduling and packet classification algorithms based on NPs. First, we proposed an adaptive buffer management algorithm called Packet Adaptive Fair Dropping (PAFD), which takes into consideration both fairness and throughput, and has smooth service curves. Then, an improved packet scheduling algorithm called Priority-based Weighted Fair Queuing (PWFQ) was introduced to ensure the fairness of packet scheduling and reduce the queueing time of data packets, while keeping delay and jitter within a small range. Thirdly, a multi-dimensional packet classification algorithm called Classification Based on Network Processors (CBNPs) was designed; it effectively reduces memory accesses and storage space, and provides lower time and space complexity. Lastly, an integrated hardware and software system of the DiffServ model of QoS for the WCDMA CN was proposed and implemented on the NP IXP2400.
According to the experimental results, the proposed system significantly enhances QoS for the WCDMA CN. It improves response-time consistency, display distortion and sound-image synchronization, and thus increases network efficiency and saves network resources.
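The details of PAFD and PWFQ are not given in this abstract. As a point of reference, the following sketch shows textbook weighted fair queuing with virtual finish times plus a strict-priority tier, which is the general family of schedulers PWFQ belongs to; it is not the dissertation's actual algorithm, and the class name and simplified virtual clock are assumptions.

```python
# Illustrative sketch of weighted fair queuing with a strict-priority
# tier, in the general spirit of a priority-based WFQ scheduler. The
# finish-time formula is the standard WFQ one; the class name and the
# simplified virtual clock are assumptions, not the dissertation's code.

import heapq
import itertools

class PriorityWFQ:
    def __init__(self):
        self._heap = []            # (priority, virtual_finish, seq, packet)
        self._last_finish = {}     # flow_id -> last assigned finish time
        self._virtual_time = 0.0   # simplified virtual clock
        self._seq = itertools.count()  # tie-breaker for heap ordering

    def enqueue(self, flow_id, size, weight, priority=0):
        """Lower `priority` is served first; within a priority level,
        packets are ordered by WFQ virtual finish time."""
        start = max(self._virtual_time, self._last_finish.get(flow_id, 0.0))
        finish = start + size / weight
        self._last_finish[flow_id] = finish
        heapq.heappush(self._heap,
                       (priority, finish, next(self._seq), (flow_id, size)))

    def dequeue(self):
        priority, finish, _, packet = heapq.heappop(self._heap)
        self._virtual_time = max(self._virtual_time, finish)  # simplified
        return packet
```

With two equally backlogged flows of equal-size packets and weights 2:1, the higher-weight flow is dequeued roughly twice as often, which is the fairness property the scheduler is meant to enforce.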
Abstract:
Swamp-breeding treefrogs form conspicuous components of many tropical forest sites, yet remain largely understudied. The La Selva Biological Station, a rainforest reserve in Costa Rica, harbors a rich swamp-breeding treefrog fauna that has been studied in only one of the many swamps found at the site. To understand if the species composition of treefrogs at La Selva varies over space or time, frogs were censused in 1982-83, 1994-95, 2005 and 2011 at two ponds located in the reserve. Data on treefrog habitat utilization were also collected. Species composition varied spatially only in 2011. Temporal variation was observed at both ponds for all groups tested. Habitat use varied among species and between swamps. The pattern of variation suggests that temporally dynamic systems such as temporary Neotropical forest swamps will converge and diverge in species composition over time.
Abstract:
The study examines the thought of Yanagita Kunio (1875–1962), an influential Japanese nationalist thinker and a founder of an academic discipline named minzokugaku. The purpose of the study is to bring into light an unredeemed potential of his intellectual and political project as a critique of the way in which modern politics and knowledge systematically suppresses global diversity. The study reads his texts against the backdrop of the modern understanding of space and time and its political and moral implications and traces the historical evolution of his thought that culminates in the establishment of minzokugaku. My reading of Yanagita’s texts draws on three interpretive hypotheses. First, his thought can be interpreted as a critical engagement with John Stuart Mill’s philosophy of history, as he turns Mill’s defense of diversity against Mill’s justification of enlightened despotism in non-Western societies. Second, to counter Mill’s individualistic notion of progressive agency, he turns to a Marxian notion of anthropological space, in which a laboring class makes history by continuously transforming nature, and rehabilitates the common people (jomin) as progressive agents. Third, in addition to the common people, Yanagita integrates wandering people as a countervailing force to the innate parochialism and conservatism of agrarian civilization. To excavate the unrecorded history of ordinary farmers and wandering people and promote the formation of national consciousness, his minzokugaku adopts travel as an alternative method for knowledge production and political education. In light of this interpretation, the aim of Yanagita’s intellectual and political project can be understood as defense and critique of the Enlightenment tradition. Intellectually, he attempts to navigate between spurious universalism and reactionary particularism by revaluing diversity as a necessary condition for universal knowledge and human progress. 
Politically, his minzokugaku aims at nation-building/globalization from below by tracing back the history of a migratory process cutting across the existing boundaries. His project is opposed to nation-building from above that aims to integrate the world population into international society at the expense of global diversity.
Abstract:
Space-for-time substitution is often used in predictive models because long-term time-series data are not available. Critics of this method suggest factors other than the target driver may affect ecosystem response and could vary spatially, producing misleading results. Monitoring data from the Florida Everglades were used to test whether spatial data can be substituted for temporal data in forecasting models. Spatial models that predicted bluefin killifish (Lucania goodei) population response to a drying event performed comparably and sometimes better than temporal models. Models worked best when results were not extrapolated beyond the range of variation encompassed by the original dataset. These results were compared to other studies to determine whether ecosystem features influence whether space-for-time substitution is feasible. Taken in the context of other studies, these results suggest space-for-time substitution may work best in ecosystems with low beta-diversity, high connectivity between sites, and small lag in organismal response to the driver variable.
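The substitution logic described above can be made concrete with a minimal sketch: fit a response model on a spatial snapshot across sites spanning a gradient of the driver, then apply it to a driver time series at one site. All numbers below are synthetic illustrations, not the Everglades monitoring data.

```python
# Minimal sketch of space-for-time substitution: fit a response model on
# a spatial snapshot across sites, then predict a temporal response from
# the driver's time series at one site. Synthetic numbers, for
# illustration only.

def fit_ols(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Spatial snapshot: driver (e.g. water depth) and response across sites.
depth_by_site   = [10.0, 20.0, 30.0, 40.0]
density_by_site = [ 5.0,  9.0, 13.0, 17.0]

a, b = fit_ols(depth_by_site, density_by_site)

# Substitute space for time: predict the temporal response at one site,
# staying within the driver range spanned by the spatial snapshot
# (extrapolating beyond it is exactly where such models break down).
depth_over_time = [15.0, 25.0, 35.0]
predicted = [a + b * d for d in depth_over_time]
```

Keeping the prediction inputs inside the fitted driver range mirrors the finding above that the models worked best when not extrapolated beyond the original dataset's range of variation.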
Abstract:
The Hf isotope composition of seawater does not match that expected from dissolution of bulk continental crust. This mismatch is generally considered to be due to retention of unradiogenic Hf in resistant zircons during incomplete weathering of continental crust. During periods of intense glacial weathering, zircons should break down more efficiently, resulting in the release of highly unradiogenic Hf to the oceans. We test this hypothesis by comparing Nd and Hf isotope time series obtained from NW Atlantic ferromanganese crusts. Both isotope systems show a decrease associated with the onset of northern hemisphere glaciation. The observed changes display distinct trajectories in εNd-εHf space, which differ from previously reported arrays of bulk terrestrial material and seawater. Such patterns are consistent with the release of highly unradiogenic Hf from very old zircons, facilitated by enhanced mechanical weathering.
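The epsilon notation used above expresses the deviation of a measured isotope ratio from the chondritic (CHUR) reference in parts per 10^4. The sketch below shows the standard conversion; the CHUR reference ratios are commonly cited present-day values, but treat them as illustrative, since different studies adopt slightly different ones.

```python
# Standard epsilon notation: eps = (R_sample / R_CHUR - 1) * 1e4.
# CHUR reference ratios below are commonly cited present-day values
# (illustrative; individual studies may use slightly different ones).

CHUR_143ND_144ND = 0.512638   # present-day 143Nd/144Nd of CHUR
CHUR_176HF_177HF = 0.282785   # present-day 176Hf/177Hf of CHUR

def epsilon(ratio_sample, ratio_chur):
    """Deviation of a sample isotope ratio from CHUR in epsilon units."""
    return (ratio_sample / ratio_chur - 1.0) * 1.0e4
```

A sample exactly at the CHUR ratio has epsilon = 0; unradiogenic Hf released from old zircons, as invoked above, drives εHf toward more negative values.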
Abstract:
This study examines the performance of series of two geomagnetic indices, and series synthesized from a semi-empirical model of magnetospheric currents, in explaining the geomagnetic activity observed at Northern Hemisphere mid-latitude ground-based stations. We analyse data, for the 2007 to 2014 period, from four magnetic observatories (Coimbra, Portugal; Panagyurishte, Bulgaria; Novosibirsk, Russia; and Boulder, USA), at geomagnetic latitudes between 40° and 50° N. The quiet daily (QD) variation is first removed from the time series of the geomagnetic horizontal component (H) using natural orthogonal components (NOC) tools. We compare the resulting series with series of storm-time disturbance (Dst) and ring current (RC) indices, and with H series synthesized from the Tsyganenko and Sitnov (2005, doi:10.1029/2004JA010798) (TS05) semi-empirical model of the storm-time geomagnetic field. In the analysis, we separate days with low and high local K-index values. Our results show that NOC models are as efficient as standard models of QD variation in preparing raw data to be compared with proxies, but with much less complexity. For the two stations in Europe, we obtain indications that NOC models could be able to separate ionospheric and magnetospheric contributions. Dst and RC series explain the four observatory H-series successfully, with means of the significant correlation coefficients from 0.5 to 0.6 during low geomagnetic activity (K less than 4) and from 0.6 to 0.7 for geomagnetically active days (K greater than or equal to 4).
With regard to the performance of TS05, our results show that the four observatories separate into two groups: Coimbra and Panagyurishte, in one group, for which the magnetospheric/ionospheric ratio in QD variation is smaller, a dominantly ionospheric QD contribution can be removed, and TS05 simulations are the best proxy; Boulder and Novosibirsk, in the other group, for which the ionospheric and magnetospheric contributions in QD variation cannot be differentiated and correlations with TS05 series cannot be improved. The main contributors to the magnetospheric QD signal are the Birkeland currents. The relatively good success of the TS05 model in explaining ground-based irregular geomagnetic activity at mid-latitudes makes it an effective tool to classify storms according to their main sources. For Coimbra and Panagyurishte in particular, where ionospheric and magnetospheric daily contributions seem easier to separate, we can aspire to use the TS05 model for ensemble generation in space weather (SW) forecasting and interpretation of past SW events.
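The comparison step described above (correlating a station's H residual series with a proxy series, separately for quiet and active days split at a local K threshold) can be sketched as follows. The data and function names are illustrative assumptions, not the study's observatory series or code.

```python
# Sketch of the comparison step: Pearson correlation between a station's
# H residual series and a proxy series (e.g. Dst), computed separately
# for quiet (K < threshold) and active (K >= threshold) days.
# Synthetic inputs; function names are illustrative.

import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlate_by_activity(h, proxy, k_index, k_threshold=4):
    """Return (r_quiet, r_active), splitting days at the K threshold."""
    quiet = [(a, b) for a, b, k in zip(h, proxy, k_index) if k < k_threshold]
    active = [(a, b) for a, b, k in zip(h, proxy, k_index) if k >= k_threshold]
    corr = lambda pairs: pearson([p[0] for p in pairs],
                                 [p[1] for p in pairs])
    return corr(quiet), corr(active)
```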
Abstract:
Peer reviewed