931 results for Computation by Abstract Devices
Abstract:
Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data; and we then use these estimates of our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
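The semi-automatic construction described above can be illustrated on a toy Gaussian model (the model, priors and all names below are our illustration, not the paper's examples): a pilot simulation stage regresses the parameter on the data to estimate the posterior mean, and that fitted regression is then used as the summary statistic in rejection ABC.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: theta ~ N(0, 1) prior; data = 5 draws from N(theta, 1).
def simulate(theta, n=5):
    return rng.normal(theta, 1.0, size=n)

# Pilot stage: regress theta on a data feature (here the sample mean) to
# estimate E[theta | data], which theory says is the optimal summary statistic.
pilot_theta = rng.normal(0.0, 1.0, size=2000)
pilot_data = np.array([simulate(t) for t in pilot_theta])
X = np.column_stack([np.ones(len(pilot_theta)), pilot_data.mean(axis=1)])
beta, *_ = np.linalg.lstsq(X, pilot_theta, rcond=None)

def summary(data):
    # Fitted estimate of the posterior mean, used as the ABC summary.
    return beta[0] + beta[1] * data.mean()

# Rejection ABC: keep parameter draws whose simulated summary is close
# to the observed summary.
observed = simulate(1.5)
s_obs = summary(observed)
draws = rng.normal(0.0, 1.0, size=20000)
sims = np.array([summary(simulate(t)) for t in draws])
accepted = draws[np.abs(sims - s_obs) < 0.05]
posterior_mean = accepted.mean()
```

For this conjugate toy model the fitted slope should land near n/(n+1) = 5/6, the exact posterior-mean coefficient, which is what makes the regression a sensible stand-in for the intractable posterior mean.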
Abstract:
In this paper we propose and analyze a hybrid $hp$ boundary element method for the solution of problems of high frequency acoustic scattering by sound-soft convex polygons, in which the approximation space is enriched with oscillatory basis functions which efficiently capture the high frequency asymptotics of the solution. We demonstrate, both theoretically and via numerical examples, exponential convergence with respect to the order of the polynomials, moreover providing rigorous error estimates for our approximations to the solution and to the far field pattern, in which the dependence on the frequency of all constants is explicit. Importantly, these estimates prove that, to achieve any desired accuracy in the computation of these quantities, it is sufficient to increase the number of degrees of freedom in proportion to the logarithm of the frequency as the frequency increases, in contrast to the at least linear growth required by conventional methods.
Abstract:
A parallel pipelined array of cells suitable for real-time computation of histograms is proposed. The cell architecture builds on previous work obtained via C-slow retiming techniques and can be clocked at a frequency 65 percent higher than that of previous arrays. The new arrays can be exploited for higher throughput, particularly when dual data rate sampling techniques are used to operate on single streams of data from image sensors. In this way, the new cell operates on a p-bit data bus, which is more convenient for interfacing to camera sensors or to microprocessors in consumer digital cameras.
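As a software analogue of the dual-data-rate idea (two samples consumed per clock tick), a minimal histogram sketch might look like the following; the cell array itself is hardware, so this is only a behavioural model, with an assumed 8-bit bus width.

```python
P = 8  # assumed p-bit bus width

def histogram_ddr(stream):
    """Histogram of a p-bit sample stream, consuming two samples per
    'tick' to mimic dual-data-rate sampling. Behavioural sketch only."""
    hist = [0] * (1 << P)
    it = iter(stream)
    for a in it:
        hist[a] += 1            # sample latched on the rising edge
        b = next(it, None)      # second sample on the falling edge
        if b is not None:
            hist[b] += 1
    return hist
```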
Abstract:
Mathematics in Defence 2011 Abstract. We review transreal arithmetic and present transcomplex arithmetic. These arithmetics have no exceptions. This leads to incremental improvements in computer hardware and software. For example, the range of real numbers encoded by floating-point bits is doubled when all of the Not-a-Number (NaN) states in IEEE 754 arithmetic are replaced with real numbers. The task of programming such systems is simplified and made safer by discarding the unordered relational operator, leaving only the operators less-than, equal-to and greater-than. The advantages of using a transarithmetic in a computation, or a transcomputation as we prefer to call it, may be had by making small changes to compilers and processor designs. However, radical change is possible by exploiting the reliability of transcomputations to make pipelined dataflow machines with a large number of cores. Our initial designs are for a machine with on the order of one million cores. Such a machine can complete the execution of multiple in-line programs each clock tick.
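A minimal sketch of what total (exception-free) division looks like under transreal rules, using IEEE infinities and NaN as stand-ins for transreal ±∞ and nullity Φ; this mapping is our illustration, not the authors' hardware design.

```python
import math

INF = math.inf
NULLITY = float('nan')  # stand-in for transreal nullity, Phi

def trans_div(a, b):
    """Total division: defined for every input, never raises.
    Transreal rules: x/0 = +inf for x > 0, -inf for x < 0, and 0/0 = Phi."""
    if math.isnan(a) or math.isnan(b):
        return NULLITY              # nullity absorbs all operations
    if b == 0.0:
        if a > 0:
            return INF
        if a < 0:
            return -INF
        return NULLITY              # 0/0 = Phi
    if math.isinf(a) and math.isinf(b):
        return NULLITY              # inf/inf = inf * 0 = Phi
    return a / b                    # IEEE semantics already agree here
```

Note that ordinary cases fall through to IEEE division, which already matches transreal results such as ∞/2 = ∞ and 2/∞ = 0.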
Abstract:
Owing to continuous advances in the computational power of handheld devices such as smartphones and tablet computers, it has become possible to perform Big Data operations, including modern data mining processes, onboard these small devices. A decade of research has proved the feasibility of what has been termed Mobile Data Mining, with a focus on a single mobile device running data mining processes. However, it was not until 2010 that the authors of this book initiated the Pocket Data Mining (PDM) project, exploiting the seamless communication among handheld devices to perform data analysis tasks that were infeasible until recently. PDM is the process of collaboratively extracting knowledge from distributed data streams in a mobile computing environment. This book provides the reader with an in-depth treatment of this emerging area of research. Details of the techniques used and thorough experimental studies are given. More importantly, and exclusive to this book, the authors provide a detailed practical guide on the deployment of PDM in the mobile environment. An important extension to the basic implementation of PDM, dealing with concept drift, is also reported. In the era of Big Data, potential applications of paramount importance offered by PDM in a variety of domains, including security, business and telemedicine, are discussed.
Abstract:
Energy storage is a potential alternative to conventional network reinforcement of the low voltage (LV) distribution network to ensure the grid’s infrastructure remains within its operating constraints. This paper presents a study on the control of such storage devices, owned by distribution network operators. A deterministic model predictive control (MPC) controller and a stochastic receding horizon controller (SRHC) are presented, where the objective is to achieve the greatest peak reduction in demand, for a given storage device specification, taking into account the high level of uncertainty in the prediction of LV demand. The algorithms presented in this paper are compared to a standard set-point controller and benchmarked against a control algorithm with a perfect forecast. A specific case study, using storage on the LV network, is presented, and the results of each algorithm are compared. A comprehensive analysis is then carried out, simulating a large number of LV networks of varying numbers of households. The results show that the performance of each algorithm is dependent on the number of aggregated households. However, on a typical aggregation, the novel SRHC algorithm presented in this paper is shown to outperform each of the comparable storage control techniques.
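One step of a deterministic receding-horizon peak-shaving controller can be sketched as follows; the bisection on a discharge threshold, the variable names, and the toy battery limits are all illustrative assumptions, not the paper's MPC or SRHC formulations.

```python
import numpy as np

def peak_shave_step(forecast, soc, capacity, p_max, dt=0.5):
    """One receding-horizon step: pick the demand threshold such that the
    energy the battery must supply above it fits within the current state
    of charge (soc, kWh), then discharge toward that threshold (or charge
    when demand is below it). Illustrative sketch, not the paper's controller."""
    lo, hi = forecast.min(), forecast.max()
    for _ in range(50):                       # bisect on the threshold (kW)
        mid = 0.5 * (lo + hi)
        need = np.clip(forecast - mid, 0, p_max).sum() * dt   # kWh above mid
        lo, hi = (mid, hi) if need > soc else (lo, mid)
    threshold = hi
    # Battery power for the current step: positive = discharge.
    power = np.clip(forecast[0] - threshold, -p_max, p_max)
    # Respect energy limits: cannot discharge below empty or charge past full.
    power = min(power, soc / dt) if power > 0 else max(power, (soc - capacity) / dt)
    return power, threshold
```

In a receding-horizon loop this function would be re-run each step with an updated forecast and state of charge, which is what distinguishes MPC from a fixed set-point controller.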
Abstract:
There has been a recent surge in the use of silver as an antimicrobial agent in a wide range of domestic and clinical products, intended to prevent or treat bacterial infections and reduce bacterial colonization of surfaces. It has been reported that the antibacterial and cytotoxic properties of silver are affected by the assay conditions, particularly the type of growth media used in vitro. The toxicity of Ag+ to bacterial cells is comparable to its toxicity to human cells. We demonstrate that biologically relevant compounds such as glutathione, cysteine and human blood components significantly reduce the toxicity of silver ions to clinically relevant pathogenic bacteria and primary human dermal fibroblasts (skin cells). Bacteria are able to grow normally in the presence of silver nitrate at >20-fold the minimum inhibitory concentration (MIC) if Ag+ and thiols are added in a 1:1 ratio, because the reaction of Ag+ with extracellular thiols prevents silver ions from interacting with cells. Extracellular thiols and human serum also significantly reduce the antimicrobial activity of the silver wound dressings Aquacel-Ag (Convatec) and Acticoat (Smith & Nephew) against Staphylococcus aureus, Pseudomonas aeruginosa and Escherichia coli in vitro. These results have important implications for the deployment of silver as an antimicrobial agent in environments exposed to biological tissue or secretions. Significant amounts of money and effort have been directed at the development of silver-coated medical devices (e.g. dressings, catheters, implants). We believe our findings are essential for the effective design and testing of antimicrobial silver coatings.
Abstract:
Organo-copper(I) halide complexes with a Cu4I4 cubane core and cyclic amines as ligands have been synthesized and their crystal structures have been determined. Their solid-state photophysical properties have been measured and correlated with the crystal structure and packing. A unique and remarkably high luminescence quantum yield (76%) has been measured for one of the complexes, which has the cubane clusters arranged in a columnar structure held together by N–H⋯I hydrogen bonds. This high luminescence quantum yield is correlated with a slow radiationless deactivation rate of the excited state and suggests a rather strong enhancement of the cubane core rigidity bestowed by the hydrogen bond pattern. Some preliminary thin-film deposition experiments show that these compounds could be considered good candidates for applications in electroluminescent devices because of their bright luminescence, low cost and relatively easy synthesis.
Abstract:
In recent years, ZigBee has been proven to be an excellent solution for creating scalable and flexible home automation networks. In a home automation network, consumer devices typically collect data from a home monitoring environment and then transmit the data to an end user through multi-hop communication without the need for any human intervention. However, due to the presence of typical obstacles in a home environment, error-free reception may not be possible, particularly for power-constrained devices. A mobile-sink-based data transmission scheme can be one solution, but obstacles create significant complexities for the sink movement path determination process. Therefore, an obstacle avoidance data routing scheme is of vital importance to the design of an efficient home automation system. This paper presents a mobile-sink-based obstacle avoidance routing scheme for a home monitoring system. The mobile sink collects data by traversing the obstacle avoidance path. Through ZigBee-based hardware implementation and verification, the proposed scheme successfully transmits data through the obstacle avoidance path to improve network performance in terms of lifespan, energy consumption and reliability. This work is applicable to a wide range of intelligent pervasive consumer products and services, including robotic vacuum cleaners and personal security robots.
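Path determination for the mobile sink can be sketched generically as a shortest path on an occupancy grid that treats obstacles as blocked cells; the grid model and the choice of breadth-first search are our illustrative assumptions, not the paper's ZigBee scheme.

```python
from collections import deque

def sink_path(grid, start, goal):
    """Shortest obstacle-avoiding path on a grid via BFS.
    grid[r][c] == 1 marks an obstacle; start/goal are (row, col) tuples.
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}            # visited set + parent pointers
    q = deque([start])
    while q:
        r, c = q.popleft()
        if (r, c) == goal:          # reconstruct path by walking parents back
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                q.append((nr, nc))
    return None
```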
Abstract:
Most current state-of-the-art haptic devices render only a single force; however, almost all human grasps are characterised by multiple forces and torques applied by the fingers and palms of the hand to the object. In this chapter we begin by considering the different types of grasp and then consider the physics of rigid objects that is needed for correct haptic rendering. We then describe an algorithm to represent the forces associated with grasp in a natural manner. The power of the algorithm is that it considers only the capabilities of the haptic device and requires no model of the hand, and thus applies to most practical grasp types. The technique is sufficiently general that it would also apply to multi-hand interactions, and hence to collaborative interactions where several people interact with the same rigid object. Key concepts in friction and rigid body dynamics are discussed and applied to the problem of rendering multiple forces to allow the person to choose their grasp on a virtual object and perceive the resulting movement via the forces in a natural way. The algorithm also generalises well to support computation of multi-body physics.
Abstract:
In this paper the authors consider natural emotions, feigned emotions, or the absence of emotion in text-based dialogues. The dialogues occurred during interactions between human Judges/Interrogators and hidden entities in practical Turing tests implemented at Bletchley Park in June 2012. The authors focus on the interactions that left the Interrogator unable to say whether they were talking to a human or a machine after five minutes of questioning; the hidden interlocutor received an ‘unsure’ classification. In cases where the Judge provided post-event feedback, the authors present their rationale from three viva voce one-to-one Turing tests. The authors find that emoticons and other visual devices used to express feelings in text-based interaction were missing from the conversations between the Interrogators and hidden interlocutors.
Abstract:
The incorporation of numerical weather predictions (NWPs) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient, as it involves considerable non-predictable uncertainties and can lead to a high number of false or missed warnings. Weather forecasts using multiple NWPs from various weather centres, applied to catchment hydrology, can provide significantly improved early flood warning. The availability of global ensemble weather prediction systems through the ‘THORPEX Interactive Grand Global Ensemble’ (TIGGE) offers a new opportunity for the development of state-of-the-art early flood forecasting systems. This paper presents a case study using the TIGGE database for flood warning on a meso-scale catchment (4062 km²) located in the Midlands region of England. For the first time, a research attempt is made to set up a coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE database. The study shows that precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial precipitation variability on such a comparatively small catchment, which indicates a need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. The spread of discharge forecasts varies from centre to centre, but it is generally large and implies a significant level of uncertainty. Nevertheless, the results show that the TIGGE database is a promising tool for forecasting flood inundation, comparable with forecasts driven by rain-gauge observations.
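The cascade idea, driving a hydrologic model with each ensemble member to obtain a probabilistic discharge forecast, can be sketched with a toy linear-reservoir model; the model, parameters and exceedance threshold below are all illustrative, not the paper's coupled atmospheric-hydrologic-hydraulic system.

```python
import numpy as np

def linear_reservoir(rain, k=0.8, s0=0.0):
    """Toy rainfall-runoff model: storage S carries over with factor k each
    step and releases discharge q = (1 - k) * S. Illustrative stand-in for
    a real hydrologic model."""
    s, q = s0, []
    for r in rain:
        s = k * s + r
        q.append((1 - k) * s)
    return np.array(q)

rng = np.random.default_rng(1)
# Hypothetical ensemble: 20 members, 48 forecast steps of precipitation.
ensemble_rain = rng.gamma(2.0, 1.5, size=(20, 48))
ensemble_q = np.array([linear_reservoir(m) for m in ensemble_rain])
# Probabilistic product: fraction of members whose peak discharge
# exceeds an (illustrative) flood threshold.
p_exceed = (ensemble_q.max(axis=1) > 4.0).mean()
```

The same structure applies whichever hydrologic and hydraulic models sit in the chain: each ensemble member is propagated independently, and probabilities are read off the resulting member spread.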
Abstract:
Dual-polarisation radar measurements provide valuable information about the shapes and orientations of atmospheric ice particles. For quantitative interpretation of these data in the Rayleigh regime, common practice is to approximate the true ice crystal shape with that of a spheroid. Calculations using the discrete dipole approximation for a wide range of crystal aspect ratios demonstrate that approximating hexagonal plates as spheroids leads to significant errors in the predicted differential reflectivity, by as much as 1.5 dB. An empirical modification of the shape factors in Gans's spheroid theory was made using the numerical data. The resulting simple expressions, like Gans's theory, can be applied to crystals in any desired orientation, illuminated by an arbitrarily polarised wave, but are much more accurate for hexagonal particles. Calculations of the scattering from more complex branched and dendritic crystals indicate that these may be accurately modelled using the new expression, but with a reduced permittivity dependent on the volume of ice relative to an enclosing hexagonal prism.
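The spheroid baseline that the paper modifies can be sketched from Gans's theory in the Rayleigh regime: shape (depolarization) factors set the polarizabilities along the spheroid axes, and their ratio gives the differential reflectivity. The permittivity value and the horizontally aligned, side-on viewing geometry assumed below are illustrative.

```python
import math

def depolarization_oblate(axis_ratio):
    """Gans shape factor along the symmetry axis of an oblate spheroid,
    axis_ratio = c/a < 1 (c = symmetry semi-axis). Tends to 1/3 for a
    sphere and to 1 for a thin disc."""
    f = math.sqrt(1.0 / axis_ratio**2 - 1.0)
    return (1 + f**2) / f**2 * (1 - math.atan(f) / f)

def zdr_spheroid(axis_ratio, eps=3.17):
    """Differential reflectivity (dB) of a horizontally aligned oblate
    spheroid in the Rayleigh regime; eps = 3.17 is a typical solid-ice
    permittivity (illustrative choice)."""
    Lz = depolarization_oblate(axis_ratio)       # along symmetry axis
    Lx = (1 - Lz) / 2                            # the two equal factors
    alpha_h = (eps - 1) / (1 + Lx * (eps - 1))   # horizontal polarizability
    alpha_v = (eps - 1) / (1 + Lz * (eps - 1))   # vertical polarizability
    return 20 * math.log10(abs(alpha_h / alpha_v))
```

The common volume factor in the polarizabilities cancels in the ratio, so only the shape factors and permittivity matter; the paper's contribution is an empirical correction to these shape factors so that the same expressions fit hexagonal plates.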
Abstract:
The present article examines production and on-line processing of definite articles in Turkish-speaking sequential bilingual children acquiring English and Dutch as second languages (L2) in the UK and in the Netherlands, respectively. Thirty-nine 6–8-year-old L2 children and 48 monolingual (L1) age-matched children participated in two separate studies examining the production of definite articles in English and Dutch in conditions manipulating semantic context, that is, the anaphoric and the bridging contexts. Sensitivity to article omission was examined in the same groups of children using an on-line processing task involving article use in the same semantic contexts as in the production task. The results indicate that both L2 children and L1 controls are less accurate when definiteness is established by keeping track of the discourse referents (anaphoric) than when it is established via world knowledge (bridging). Moreover, despite variable production, all groups of children were sensitive to the omission of definite articles in the on-line comprehension task. This suggests that the errors of omission are not due to the lack of abstract syntactic representations, but could result from processes implicated in the spell-out of definite articles. The findings are in line with the idea that variable production in child L2 learners does not necessarily indicate lack of abstract representations (Haznedar and Schwartz, 1997).
Abstract:
Lake surface water temperatures (LSWTs) of 246 globally distributed large lakes were derived from Along-Track Scanning Radiometers (ATSR) for the period 1991–2011. The climatological cycles of mean LSWT derived from these data quantify on a global scale the responses of large lakes' surface temperatures to the annual cycle of forcing by solar radiation and the ambient meteorological conditions. LSWT cycles reflect the twice-annual peak in net solar radiation for lakes between 1°S and 12°N. For lakes without a lake-mean seasonal ice cover, LSWT extremes exceed air temperatures by 0.5–1.7 °C for maximum and 0.7–1.9 °C for minimum temperature. The summer maximum LSWTs of lakes from 25°S to 35°N show a linear decrease with increasing altitude, −3.76 ± 0.17 °C km⁻¹ (R² = 0.95), marginally lower than the corresponding air temperature decrease with altitude, −4.15 ± 0.24 °C km⁻¹ (R² = 0.95). Lake altitude accounts for 0.78–0.83 (R²) of the variation in the March to June LSWT–air temperature differences of tropical lakes, with differences decreasing by 1.9 °C as the altitude increases from 500 to 1800 m above sea level (a.s.l.). We define an ‘open water phase’ as the length of time the lake-mean LSWT remains above 4 °C. There is a strong global correlation between the start and end of the lake-mean open water phase and the spring and fall 0 °C air temperature transition days (R² = 0.74 and 0.80, respectively), allowing for a good estimation of the timing and length of the open water phase of lakes without LSWT observations. Lake depth, lake altitude and distance from the coast further explain some of the inter-lake variation in the start and end of the open water phase.