956 results for embedded, system, entropy, pool, TRNG, random, ADC
Abstract:
A recent area for investigation into the development of adaptable robot control is the use of living neuronal networks to control a mobile robot. The so-called Animat paradigm comprises a neuronal network (the ‘brain’) connected to an external embodiment (in this case a mobile robot), facilitating potentially robust, adaptable robot control and increased understanding of neural processes. Sensory input from the robot is provided to the neuronal network via stimulation on a number of electrodes embedded in a specialist Petri dish (Multi Electrode Array (MEA)); accurate control of this stimulation is vital. We present software tools allowing precise, near real-time control of electrical stimulation on MEAs, with fast switching between electrodes and the application of custom stimulus waveforms. These Linux-based tools are compatible with the widely used MEABench data acquisition system. Benefits include rapid stimulus modulation in response to neuronal activity (closed loop) and batch processing of stimulation protocols.
Abstract:
The entropy budget of the coupled atmosphere–ocean general circulation model HadCM3 is calculated. Estimates of the different entropy sources and sinks of the climate system are obtained directly from the diabatic heating terms, and an approximate estimate of the planetary entropy production is also provided. The rate of material entropy production of the climate system is found to be ∼50 mW m−2 K−1, a value intermediate in the range 30–70 mW m−2 K−1 previously reported from different models. The largest part of this is due to sensible and latent heat transport (∼38 mW m−2 K−1). Another 13 mW m−2 K−1 is due to dissipation of kinetic energy in the atmosphere by friction and Reynolds stresses. Numerical entropy production in the atmosphere dynamical core is found to be about 0.7 mW m−2 K−1. The material entropy production within the ocean due to turbulent mixing is ∼1 mW m−2 K−1, a very small contribution to the material entropy production of the climate system. The rate of change of entropy of the model climate system is about 1 mW m−2 K−1 or less, which is comparable with the typical size of the fluctuations of the entropy sources due to interannual variability, and represents a more accurate closure of the budget than achieved by previous analyses. Results are similar for FAMOUS, which has a lower spatial resolution but a similar formulation to HadCM3, while more substantial differences are found with respect to other models, suggesting that the formulation of the model has an important influence on the climate entropy budget. Since this is the first diagnosis of the entropy budget in a climate model of the type and complexity used for projection of twenty-first century climate change, it would be valuable if similar analyses were carried out for other such models.
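As a quick plausibility check (not part of the abstract), the quoted components of the material entropy production sum approximately to the reported total; the grouping below is indicative only:

```latex
% Approximate closure of the material entropy production terms quoted above
\dot{S}_{\mathrm{mat}} \approx
\underbrace{38}_{\substack{\text{sensible and latent}\\ \text{heat transport}}}
+ \underbrace{13}_{\substack{\text{frictional}\\ \text{dissipation}}}
+ \underbrace{1}_{\substack{\text{ocean turbulent}\\ \text{mixing}}}
\approx 52 \;\mathrm{mW\,m^{-2}\,K^{-1}} \;\sim\; 50 \;\mathrm{mW\,m^{-2}\,K^{-1}}
```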
Abstract:
This paper describes the development and first results of the “Community Integrated Assessment System” (CIAS), a unique multi-institutional, modular and flexible integrated assessment system for modelling climate change. Key to this development is the supporting software infrastructure, SoftIAM. Through it, CIAS is distributed across the community of institutions, each of which has contributed modules to the CIAS system. At the heart of SoftIAM is the Bespoke Framework Generator (BFG), which enables flexibility in the assembly and composition of individual modules from a pool to form coupled models within CIAS, and flexibility in their deployment onto the available software and hardware resources. Such flexibility greatly enhances modellers’ ability to re-configure the CIAS coupled models to answer different questions, thus tracking evolving policy needs. It also allows rigorous testing of the robustness of IA modelling results to the use of different component modules representing the same processes (for example, the economy). Such processes are often modelled in very different ways, using different paradigms, at the participating institutions. An illustrative application to the study of the relationship between the economy and the earth’s climate system is provided.
Abstract:
A whole life-cycle information management vision is proposed, and the organizational requirements for realizing this scenario are investigated. Preliminary interviews with construction professionals are reported. Discontinuities in information transfer throughout the life-cycle of built environments result from a lack of coordination and from multiple data collection and storage practices. A more coherent history of these activities can improve the work practices of various teams by augmenting decision-making processes and creating organizational learning opportunities. There is therefore a need to unify these fragmented bits of data into a meaningful, semantically rich and standardized information repository for the built environment. The proposed vision utilizes embedded technologies and distributed building information models. Two diverse construction project types (large one-off design, small repetitive design) are investigated for the applicability of the vision. A functional prototype software/hardware system demonstrating the practical use of this vision is developed and discussed. Plans for case studies to validate the proposed model at a large PFI hospital and at housing association projects are also discussed.
Abstract:
In real-world environments it is usually difficult to specify the quality of a preventive maintenance (PM) action precisely. This uncertainty makes it problematic to optimise maintenance policy. This problem is tackled in this paper by assuming that the quality of a PM action is a random variable following a probability distribution. Two frequently studied PM models, a failure rate PM model and an age reduction PM model, are investigated. The optimal PM policies are presented and optimised. Numerical examples are also given.
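For illustration only, the sketch below shows how a periodic PM policy with random PM quality could be evaluated by simulation under an age-reduction model with minimal repair between PMs. The Weibull hazard, the Beta-distributed quality factor and the cost figures are all assumptions for the example, not values or models taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weibull cumulative hazard H(t) = (t / eta)**beta; parameters and costs are
# illustrative assumptions only.
BETA, ETA = 2.5, 10.0
C_PM, C_FAIL = 1.0, 5.0

def cum_hazard(t):
    return (t / ETA) ** BETA

def expected_cost_rate(T, n_intervals=40, n_runs=500):
    """Monte Carlo estimate of the long-run cost rate for PM every T time
    units under an age-reduction model with minimal repair between PMs."""
    total = 0.0
    for _ in range(n_runs):
        age, cost = 0.0, 0.0
        for _ in range(n_intervals):
            # expected minimal repairs while ageing from `age` to `age + T`
            cost += C_FAIL * (cum_hazard(age + T) - cum_hazard(age))
            cost += C_PM
            quality = rng.beta(2, 5)       # random residual-age factor in (0, 1)
            age = quality * (age + T)      # imperfect (random-quality) age reduction
        total += cost / (n_intervals * T)
    return total / n_runs

# Scan candidate PM intervals and pick the cheapest one.
candidates = np.linspace(1.0, 12.0, 23)
best_T = min(candidates, key=expected_cost_rate)
```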
Abstract:
Epidemiological studies have shown that ingestion of isoflavone-rich soy products is associated with a reduced risk for the development of breast cancer. In the present study, we investigated the hypothesis that genistein modulates the expression of glutathione S-transferases (GSTs) in human breast cells, thus conferring protection towards genotoxic carcinogens which are GST substrates. Our approach was to use the human mammary cell lines MCF-10A and MCF-7 as models for non-neoplastic and neoplastic epithelial breast cells, respectively. MCF-10A cells expressed hGSTA1/2, hGSTA4-4, hGSTM1-1 and hGSTP1-1 proteins, but not hGSTM2-2. In contrast, MCF-7 cells only marginally expressed hGSTA1/2, hGSTA4-4 and hGSTM1-1. Concordant with the protein expression, hGSTA4 and hGSTP1 mRNA expression was higher in the non-neoplastic cell line. Exposure to genistein significantly increased hGSTP1 mRNA (2.3-fold), hGSTP1-1 protein levels (3.1-fold), GST catalytic activity (4.7-fold) and intracellular glutathione concentrations (1.4-fold) in MCF-10A cells, whereas no effects were observed on GST expression or glutathione concentrations in MCF-7 cells. Preincubation of MCF-10A cells with genistein decreased the extent of DNA damage caused by 4-hydroxy-2-nonenal (150 μM) and benzo(a)pyrene-7,8-dihydrodiol-9,10-epoxide (50 μM), compounds readily detoxified by hGSTA4-4 and hGSTP1-1. In conclusion, genistein pretreatment protects non-neoplastic mammary cells from certain carcinogens that are detoxified by GSTs, suggesting that dietary-mediated induction of GSTs may be a mechanism contributing to the prevention of genotoxic injury in the aetiology of breast cancer.
Abstract:
The major technical objectives of the RC-NSPES are to provide a framework for the concurrent operation of reactive and pro-active security functions, delivering efficient and optimised intrusion detection schemes as well as enhanced and highly correlated rule sets for more effective alert management and root-cause analysis. The design and implementation of the RC-NSPES solution includes a number of innovative features, both in the real-time programmable embedded hardware (FPGA) deployment and in the integrated management station. These have been devised so as to deliver enhanced detection of attacks and contextualised alerts against threats that can arise from both the network layer and the application layer protocols. The resulting architecture represents an efficient and effective framework for the future deployment of network security systems.
Abstract:
As consumers demand more functionality from their electronic devices and manufacturers meet that demand, electrical power and clock requirements tend to increase; fortunately, reassessing the system architecture can counter this with suitable reductions. To maintain low clock rates and therefore reduce electrical power, this paper presents a parallel convolutional coder for the transmit side of many wireless consumer devices. The coder accepts a parallel data input and directly computes punctured convolutional codes without the need for a separate puncturing operation, and the coded bits are available at the output of the coder in a parallel fashion. Because the computation is performed in parallel, the coder can be clocked seven times slower than the conventional shift-register based convolutional coder (using the DVB 7/8 rate). The presented coder is directly relevant to the design of modern low-power consumer devices.
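For reference, a plain serial model of what such a coder computes is sketched below: a rate-1/2 convolutional encoder followed by puncturing. The generator polynomials (171, 133 octal, constraint length 7) and the 7/8 puncturing pattern are the ones commonly associated with DVB and are assumptions here; the paper's actual contribution, computing the punctured output in parallel in a single step, is not reproduced.

```python
import numpy as np

G1, G2, K = 0o171, 0o133, 7   # assumed mother-code generators, constraint length 7

def conv_encode(bits):
    """Serial rate-1/2 convolutional encoder (reference model only)."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | int(b)) & ((1 << K) - 1)
        out.append(bin(state & G1).count("1") & 1)   # X output bit (parity)
        out.append(bin(state & G2).count("1") & 1)   # Y output bit (parity)
    return out

def puncture(coded, pattern_x, pattern_y):
    """Keep only the coded bits whose position in the repeating
    puncturing pattern is marked with a 1."""
    kept, period = [], len(pattern_x)
    for i in range(0, len(coded), 2):
        j = (i // 2) % period
        if pattern_x[j]:
            kept.append(coded[i])
        if pattern_y[j]:
            kept.append(coded[i + 1])
    return kept

data = list(np.random.randint(0, 2, 14))
punctured = puncture(conv_encode(data),
                     pattern_x=[1, 0, 0, 0, 1, 0, 1],   # assumed 7/8 pattern
                     pattern_y=[1, 1, 1, 1, 0, 1, 0])
# 14 input bits -> 28 mother-code bits -> 16 bits after 7/8 puncturing
```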
Abstract:
We present an extensive thermodynamic analysis of a hysteresis experiment performed on a simplified yet Earth-like climate model. We slowly vary the solar constant by 20% around the present value and detect that for a large range of values of the solar constant the realization of snowball or of regular climate conditions depends on the history of the system. Using recent results on the global climate thermodynamics, we show that the two regimes feature radically different properties. The efficiency of the climate machine monotonically increases with decreasing solar constant in present climate conditions, whereas the opposite takes place in snowball conditions. Instead, entropy production is monotonically increasing with the solar constant in both branches of climate conditions, and its value is about four times larger in the warm branch than in the corresponding cold state. Finally, the degree of irreversibility of the system, measured as the fraction of excess entropy production due to irreversible heat transport processes, is much higher in the warm climate conditions, with an explosive growth in the upper range of the considered values of solar constants. Whereas in the cold climate regime a dominating role is played by changes in the meridional albedo contrast, in the warm climate regime changes in the intensity of latent heat fluxes are crucial for determining the observed properties. This substantiates the importance of addressing correctly the variations of the hydrological cycle in a changing climate. An interpretation of the climate transitions at the tipping points based upon macro-scale thermodynamic properties is also proposed. Our results support the adoption of a new generation of diagnostic tools based on the second law of thermodynamics for auditing climate models and outline a set of parametrizations to be used in conceptual and intermediate-complexity models or for the reconstruction of the past climate conditions. Copyright © 2010 Royal Meteorological Society
Abstract:
The problem of identification of a nonlinear dynamic system is considered. A two-layer neural network is used to solve the problem. Systems disturbed by unmeasurable noise are considered, although the disturbance is known to be a random piecewise-polynomial process. Absorption polynomials and nonquadratic loss functions are used to reduce the effect of this disturbance on the estimates of the optimal memory of the neural-network model.
Abstract:
In financial decision-making processes, the adopted weights of the objective functions have significant impacts on the final decision outcome. However, conventional rating and weighting methods exhibit difficulty in deriving appropriate weights for complex decision-making problems with imprecise information. Entropy is a quantitative measure of uncertainty and has been useful in exploring weights of attributes in decision making. A fuzzy and entropy-based mathematical approach is employed to solve the weighting problem of the objective functions in an overall cash-flow model. A multiproject undertaken by a medium-size construction firm in Hong Kong was used as a real case study to demonstrate the application of entropy in multiproject cash-flow situations. The results indicate that the overall before-tax profit was HK$0.11 million lower after the introduction of appropriate weights. In addition, the best time to invest in new projects arising from positive cash flow was identified to be two working months earlier than under the non-weighted system.
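The entropy-weighting step itself is a standard calculation; the sketch below (plain NumPy, not the fuzzy formulation used in the paper) shows the usual sequence: column-normalise the decision matrix, compute the Shannon entropy of each criterion, and convert the entropy divergences into weights. The example scores are arbitrary.

```python
import numpy as np

def entropy_weights(decision_matrix):
    """Shannon-entropy weighting of criteria (generic sketch only)."""
    X = np.asarray(decision_matrix, dtype=float)
    m, _ = X.shape
    P = X / X.sum(axis=0)                     # column-normalised proportions
    k = 1.0 / np.log(m)
    with np.errstate(divide="ignore", invalid="ignore"):
        e = -k * np.nansum(np.where(P > 0, P * np.log(P), 0.0), axis=0)
    d = 1.0 - e                               # degree of divergence per criterion
    return d / d.sum()                        # weights sum to 1

# Example: 4 alternatives scored against 3 objective functions (arbitrary data).
scores = [[0.7, 120, 3],
          [0.4, 150, 5],
          [0.9, 110, 2],
          [0.6, 140, 4]]
print(entropy_weights(scores))
```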
Abstract:
Emergency vehicles use high-amplitude sirens to warn pedestrians and other road users of their presence. Unfortunately, the siren noise enters the vehicle and corrupts the intelligibility of two-way radio voice communications from the emergency vehicle to a control room. Often the siren has to be turned off to enable the control room to hear what is being said, which subsequently endangers people's lives. A digital signal processing (DSP) based system for the cancellation of siren noise embedded within speech is presented. The system has been tested with the least mean square (LMS), normalised least mean square (NLMS) and affine projection algorithm (APA) using recordings from three common types of sirens (two-tone, wail and yelp) from actual test vehicles. It was found that the APA with a projection order of 2 gives comparably improved cancellation over the LMS and NLMS with only a moderate increase in algorithm complexity and code size. Therefore, this siren noise cancellation system using the APA offers an improvement over the cancellation achieved by previous systems. The removal of the siren noise improves the response time for the emergency vehicle and thus the system can contribute to saving lives. The system also allows voice communication to take place even when the siren is on, and as such the vehicle poses less risk of danger when moving at high speeds in heavy traffic.
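As a minimal sketch of the adaptive-filtering idea (not the paper's DSP implementation), the following NumPy code runs an NLMS noise canceller on a synthetic two-tone "siren"; the filter length, step size and signals are illustrative assumptions.

```python
import numpy as np

def nlms_cancel(reference, primary, n_taps=32, mu=0.5, eps=1e-6):
    """Normalised LMS noise canceller (illustrative sketch only).

    reference : noise-only input (e.g. a microphone near the siren)
    primary   : speech corrupted by the siren at the radio microphone
    Returns the error signal, i.e. the estimate of the clean speech.
    """
    w = np.zeros(n_taps)                      # adaptive filter weights
    out = np.zeros_like(primary)
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]     # most recent reference samples
        y = w @ x                             # estimate of the siren component
        e = primary[n] - y                    # residual = cleaned speech sample
        w += (mu / (eps + x @ x)) * e * x     # normalised weight update
        out[n] = e
    return out

# Toy usage with a synthetic two-tone "siren" plus a noise stand-in for speech.
fs = 8000
t = np.arange(fs) / fs
siren = np.sin(2 * np.pi * 650 * t) + np.sin(2 * np.pi * 900 * t)
speech = 0.1 * np.random.randn(fs)            # placeholder for real speech
cleaned = nlms_cancel(reference=siren, primary=speech + 0.8 * siren)
```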
Abstract:
The LiHoxY1-xF4 magnetic material in a transverse magnetic field Bxx̂ perpendicular to the Ising spin direction has long been used to study tunable quantum phase transitions in a random disordered system. We show that the Bx-induced magnetization along the x̂ direction, combined with the local random dilution-induced destruction of crystalline symmetries, generates, via the predominant dipolar interactions between Ho3+ ions, random fields along the Ising ẑ direction. This identifies LiHoxY1-xF4 in Bx as a new random field Ising system. The random fields explain the rapid decrease of the critical temperature in the diluted ferromagnetic regime and the smearing of the nonlinear susceptibility at the spin-glass transition with increasing Bx and render the Bx-induced quantum criticality in LiHoxY1-xF4 likely inaccessible.
Abstract:
The objective of this paper is to reconsider the Maximum Entropy Production conjecture (MEP) in the context of a very simple two-dimensional zonal-vertical climate model able to represent the total material entropy production due at the same time to both horizontal and vertical heat fluxes. MEP is applied first to a simple four-box model of climate which accounts for both horizontal and vertical material heat fluxes. It is shown that, under conditions of fixed insolation, a MEP solution is found with reasonably realistic temperature and heat fluxes, thus generalising results from independent two-box horizontal or vertical models. It is also shown that the meridional and the vertical entropy production terms are independently involved in the maximisation and thus MEP can be applied to each subsystem with fixed boundary conditions. We then extend the four-box model by increasing its resolution, and compare it with GCM output. A MEP solution is found which is fairly realistic as far as the horizontal large-scale organisation of the climate is concerned, whereas the vertical structure looks unrealistic and presents seriously unstable features. This study suggests that the thermal meridional structure of the atmosphere is predicted fairly well by MEP once the insolation is given, but the vertical structure of the atmosphere cannot be predicted satisfactorily by MEP unless constraints are imposed to represent the determination of longwave absorption by water vapour and clouds as a function of the state of the climate. Furthermore, an order-of-magnitude estimate of the contributions to the material entropy production due to horizontal and vertical processes within the climate system is provided by using two different methods. In both cases we found that approximately 40 mW m−2 K−1 of material entropy production is due to vertical heat transport and 5–7 mW m−2 K−1 to horizontal heat transport.
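A drastically reduced stand-in for the MEP calculation, assuming a two-box (warm/cold) atmosphere with linearised longwave emission and purely illustrative parameter values, is sketched below; it maximises the horizontal entropy production term over the meridional heat flux. Neither the numbers nor the two-box layout are taken from the paper's four-box model.

```python
import numpy as np

# Two-box MEP toy (warm box / cold box).  All numbers are assumptions chosen
# only to give plausible magnitudes: absorbed shortwave I1, I2 (W m^-2) and a
# linearised OLR of the form A + B*T (T in deg C).
I1, I2 = 300.0, 170.0
A, B = 205.0, 2.0

def entropy_production(F):
    """Entropy production (W m^-2 K^-1) for a meridional heat flux F."""
    T1 = (I1 - F - A) / B + 273.15    # energy balance of the warm box (K)
    T2 = (I2 + F - A) / B + 273.15    # energy balance of the cold box (K)
    return F * (1.0 / T2 - 1.0 / T1)

fluxes = np.linspace(0.0, 60.0, 601)
sigma = np.array([entropy_production(F) for F in fluxes])
F_mep = fluxes[np.argmax(sigma)]
print(f"MEP flux ~ {F_mep:.1f} W m^-2, "
      f"max entropy production ~ {sigma.max() * 1000:.1f} mW m^-2 K^-1")
```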
Abstract:
Starting from the classical Saltzman two-dimensional convection equations, we derive, via a severe spectral truncation, a minimal 10-ODE system which includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system which includes a decoupled generalized Lorenz system. The consideration of this process breaks an important symmetry and couples the dynamics of fast and slow variables, with ensuing modifications to the structural properties of the attractor and of the spectral features. When the relevant nondimensional number (Eckert number Ec) is different from zero, an additional time scale of O(Ec^-1) is introduced in the system, as shown with standard multiscale analysis and made clear by extensive numerical evidence. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f^(3/2) power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and might have prototypical value for relevant processes in complex-systems dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme value statistics. This analysis shows how neglecting the coupling of slow and fast variables only on the basis of scale analysis can be catastrophic. In fact, this leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and that cause the model to lose the ability to describe intrinsically multiscale processes.