186 results for Pseudo-Dionysius, the Areopagite
Abstract:
The emergence of pseudo-marginal algorithms has led to improved computational efficiency for dealing with complex Bayesian models with latent variables. Here an unbiased estimator of the likelihood replaces the true likelihood in order to produce a Bayesian algorithm that remains on the marginal space of the model parameter (with latent variables integrated out), with a target distribution that is still the correct posterior distribution. Very efficient proposal distributions can be developed on the marginal space relative to the joint space of model parameter and latent variables. Thus pseudo-marginal algorithms tend to have substantially better mixing properties. However, for pseudo-marginal approaches to perform well, the likelihood has to be estimated rather precisely. This can be difficult to achieve in complex applications. In this paper we propose to take advantage of the multiple central processing units (CPUs) that are readily available on most standard desktop computers. Here the likelihood is estimated independently on the multiple CPUs, with the ultimate estimate of the likelihood being the average of the estimates obtained from the multiple CPUs. The estimate remains unbiased, but the variability is reduced. We compare and contrast two different technologies that allow the implementation of this idea, both of which require a negligible amount of extra programming effort. The superior performance of this idea over the standard approach is demonstrated on simulated data from a stochastic volatility model.
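A minimal sketch of the averaging idea, assuming a hypothetical user-supplied unbiased likelihood estimator (here a toy stand-in; in practice it would be, e.g., a particle filter for the stochastic volatility model) evaluated independently on several CPUs with Python's multiprocessing; the averaged estimate remains unbiased while its variance shrinks roughly in proportion to the number of CPUs:

```python
# Sketch: average independent unbiased likelihood estimates across CPUs.
# `estimate_likelihood` is a hypothetical stand-in for an unbiased estimator
# (e.g. a particle filter); it is NOT the paper's implementation.
from multiprocessing import Pool

import numpy as np


def estimate_likelihood(args):
    theta, n_particles, seed = args
    rng = np.random.default_rng(seed)
    # Toy stand-in: a noisy but (approximately) unbiased "likelihood" value.
    return np.exp(-0.5 * np.sum(theta**2)) * (1.0 + 0.1 * rng.standard_normal())


def averaged_likelihood(theta, n_particles=1000, n_cpus=4, base_seed=0):
    """Average n_cpus independent unbiased estimates; the mean is still unbiased,
    with variance reduced by roughly a factor of n_cpus."""
    jobs = [(theta, n_particles, base_seed + i) for i in range(n_cpus)]
    with Pool(processes=n_cpus) as pool:
        estimates = pool.map(estimate_likelihood, jobs)
    return float(np.mean(estimates))


if __name__ == "__main__":
    print(averaged_likelihood(np.array([0.2, -0.1])))
```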
Abstract:
Background Aphasia is an acquired language disorder that can present a significant barrier to patient involvement in healthcare decisions. Speech-language pathologists (SLPs) are viewed as experts in the field of communication. However, many SLP students do not receive practical training in techniques to communicate with people with aphasia (PWA) until they encounter PWA during clinical education placements. Methods This study investigated the confidence and knowledge of SLP students in communicating with PWA prior to clinical placements using a customised questionnaire. Confidence in communicating with people with aphasia was assessed using a 100-point visual analogue scale. Linear and logistic regressions were used to examine the association between confidence and age, and between confidence and course type (graduate-entry masters or undergraduate), respectively. Knowledge of strategies to assist communication with PWA was examined by asking respondents to list specific strategies that could assist communication with PWA. Results SLP students were not confident with the prospect of communicating with PWA, reporting a median of 29 points (interquartile range 17–47) on the visual analogue confidence scale. Only four (8.2%) of respondents rated their confidence greater than 55 (out of 100). Regression analyses indicated no relationship existed between confidence and students' age (p = 0.31, r-squared = 0.02), or confidence and course type (p = 0.22, pseudo r-squared = 0.03). Students displayed limited knowledge about communication strategies. Thematic analysis of strategies revealed four overarching themes: Physical, Verbal Communication, Visual Information and Environmental Changes. While most students identified potential use of resources (such as images and written information), fewer students identified strategies to alter their verbal communication (such as reduced speech rate). Conclusions SLP students who had received aphasia-related theoretical coursework, but had not commenced clinical placements with PWA, were not confident in their ability to communicate with PWA. Students may benefit from an educational intervention or curriculum modification to incorporate practical training in effective strategies to communicate with PWA, before they encounter PWA in clinical settings. Ensuring students have confidence and knowledge of potential communication strategies to assist communication with PWA may allow them to focus their learning experiences on more specific clinical domains, such as clinical reasoning, rather than on building foundational interpersonal communication skills.
Abstract:
Application of 'advanced analysis' methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A concentrated plasticity method suitable for practical advanced analysis of steel frame structures comprising non-compact sections is presented in this paper. The pseudo plastic zone method implicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. The accuracy and precision of the method for the analysis of steel frames comprising non-compact sections are established by comparison with a comprehensive range of analytical benchmark frame solutions. The pseudo plastic zone method is shown to be more accurate and precise than the conventional individual member design methods based on elastic analysis and specification equations.
Abstract:
Like many cautionary tales, The Hunger Games takes as its major premise an observation about contemporary society, measuring its ballistic arc in order to present graphically its logical conclusions. The Hunger Games gazes back to the panem et circenses of Ancient Rome while staring equally cynically forward, following the trajectory of reality television to its unbearably barbaric end point – a sadistic voyeurism for an effete elite of consumers. At each end of the historical spectrum (and in the present), the prevailing social form is Arendt's animal laborans. Consumer or consumed, Panem's population is (with the exception of the inner circle) either deprived of the possibility of, or distracted from, political action. Within the confines of the Games themselves, Law is abandoned or de‐realised: Law is an elided Other in the pseudo‐Hobbesian nightmare that is the Arena. The Games are played out, as were gladiatorial combats and other diversions of the Roman Empire, against a background resonant of Juvenal's concern for his contemporaries' attachment to short-term gratification at the expense of the civic virtues of justice and caring which are (or would be) constitutive of a contemporary form of Arendt's homo politicus. While the Games are, on their face, 'reality', they are (like the realities presented in contemporary reality television) a simulated reality, de‐realised in a Foucauldian set design constructed as a distraction for Capitol and, for the residents of the Districts, a constant reminder of their subservience to Capitol. Yet contemporary Western culture, for which manipulative reality TV is but a symptom of an underlying malaise, is inscribed at least as an incipient Panem. Its public/political space is diminished by the effective slavery of the poor, the pre‐occupation with and distractions of materiality and modern media, and the increasing concentration of power/wealth in a smaller proportion of the population.
Abstract:
A microgrid contains both distributed generators (DGs) and loads and can be viewed as a controllable load by utilities. The DGs can be either inertial synchronous generators or non-inertial, converter-interfaced units. Moreover, some of them can come online or go offline in a plug-and-play fashion. The combination of these various types of operation makes microgrid control a challenging task, especially when the microgrid operates in an autonomous mode. In this paper, a new phase locked loop (PLL) algorithm is proposed for smooth synchronization of plug-and-play DGs. A frequency droop is used for power sharing, and a pseudo-inertia is introduced to non-inertial DGs in order to match their response with that of inertial DGs. The proposed strategy is validated through PSCAD simulation studies.
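A rough sketch (not the paper's controller) of how a frequency droop combined with a first-order "pseudo-inertia" lag can make a converter-interfaced DG's frequency response resemble that of an inertial unit; all parameter values below are assumed purely for illustration:

```python
# Sketch: frequency droop plus a first-order pseudo-inertia lag for a
# converter-interfaced DG. Parameter values are illustrative assumptions.
import numpy as np

F_NOM = 50.0   # nominal frequency (Hz)
DROOP = 0.5    # droop gain: Hz per per-unit power deviation (assumed)
TAU = 1.0      # pseudo-inertia time constant (s); TAU -> 0 gives a non-inertial DG
DT = 0.01      # simulation step (s)


def simulate_step_load(p_step=0.2, t_end=10.0):
    """Frequency response of one droop-controlled DG to a step in its power share."""
    n = int(t_end / DT)
    f = np.full(n, F_NOM)
    for k in range(1, n):
        p = p_step if k * DT >= 1.0 else 0.0   # load step applied at t = 1 s
        f_droop = F_NOM - DROOP * p            # instantaneous droop set-point
        # first-order lag emulates inertia: df/dt = (f_droop - f) / TAU
        f[k] = f[k - 1] + DT * (f_droop - f[k - 1]) / TAU
    return f


if __name__ == "__main__":
    f = simulate_step_load()
    print(f"frequency settles near {f[-1]:.3f} Hz")
```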
Abstract:
In this article, we study the security of the IDEA block cipher when it is used in various single-length or double-length hashing modes. Even though this cipher is still considered secure, we show that one should avoid its use as an internal primitive for block cipher based hashing. In particular, we are able to generate instantaneously free-start collisions for most modes, and even semi-free-start collisions, pseudo-preimages or hash collisions in practical complexity. This work gives a practical example of the gap that exists between secret-key and known- or chosen-key security for block ciphers. Moreover, we also settle the 20-year-old open question concerning the security of the Abreast-DM and Tandem-DM double-length compression functions, originally invented to be instantiated with IDEA. Our attacks have been verified experimentally and work even for strengthened versions of IDEA with any number of rounds.
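For background on what a block-cipher-based hashing mode looks like, here is a minimal sketch of the classical single-length Davies–Meyer construction, H_i = E_{M_i}(H_{i-1}) ⊕ H_{i-1}, with a toy keyed function standing in for the block cipher (it is not IDEA, and not one of the double-length modes attacked in the paper):

```python
# Background sketch of a classical single-length hashing mode (Davies-Meyer):
#   H_i = E(key=M_i, plaintext=H_{i-1}) XOR H_{i-1}
# `toy_block_cipher` is only a stand-in for a real block cipher such as IDEA.
import hashlib


def toy_block_cipher(key: bytes, block: bytes) -> bytes:
    # Toy 8-byte keyed permutation substitute, purely for illustrating the mode.
    return hashlib.blake2b(block, key=key, digest_size=8).digest()


def davies_meyer(message: bytes, iv: bytes = b"\x00" * 8) -> bytes:
    h = iv
    # Zero-pad the message and use each 8-byte block as the cipher key.
    padded = message + b"\x00" * (-len(message) % 8)
    for i in range(0, len(padded), 8):
        m = padded[i:i + 8]
        e = toy_block_cipher(m, h)
        h = bytes(a ^ b for a, b in zip(e, h))   # feed-forward XOR
    return h


if __name__ == "__main__":
    print(davies_meyer(b"example message").hex())
```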
Abstract:
In Crypto'95, Micali and Sidney proposed a method for shared generation of a pseudo-random function f(·) among n players in such a way that for all inputs x, any u players can compute f(x) while t or fewer players fail to do so, where 0 ≤ t < u ≤ n. The idea behind the Micali–Sidney scheme is to generate and distribute secret seeds S = {s1, …, sd} of a poly-random collection of functions among the n players, each player receiving a subset of S, in such a way that any u players together hold all the secret seeds in S while any t or fewer players lack at least one element of S. The pseudo-random function is then computed as f(x) = f_s1(x) ⊕ ⋯ ⊕ f_sd(x), where the f_si(·)'s are poly-random functions. One question raised by Micali and Sidney is how to distribute the secret seeds satisfying the above condition such that the number of seeds, d, is as small as possible. In this paper, we continue the work of Micali and Sidney. We first provide a general framework for shared generation of a pseudo-random function using cumulative maps. We demonstrate that the Micali–Sidney scheme is a special case of this general construction. We then derive an upper and a lower bound for d. Finally, we give a simple, yet efficient, greedy approximation algorithm for generating the secret seeds S in which d is within a factor of at most u ln 2 of the optimum.
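A toy sketch of the shared-PRF idea for the simple case u = t + 1, using the naive "one seed per t-subset" distribution (which gives d = C(n, t) seeds, not the optimised d studied in the paper) and HMAC-SHA256 as a stand-in for the poly-random functions:

```python
# Sketch of shared PRF generation: one seed per t-subset T of players, held by
# every player NOT in T. Any t players miss at least one seed; any t+1 hold all.
# HMAC-SHA256 stands in for the poly-random functions f_{s_i}.
import hashlib
import hmac
import os
from itertools import combinations


def distribute_seeds(n: int, t: int):
    """Return per-player seed shares for the naive t-subset construction."""
    shares = {p: {} for p in range(n)}
    for idx, T in enumerate(combinations(range(n), t)):
        seed = os.urandom(16)
        for p in range(n):
            if p not in T:
                shares[p][idx] = seed
    return shares


def shared_prf(players_shares, x: bytes) -> bytes:
    """XOR of f_{s_i}(x) over all distinct seeds held by the cooperating players."""
    seeds = {}
    for share in players_shares:
        seeds.update(share)
    out = bytes(32)
    for seed in seeds.values():
        tag = hmac.new(seed, x, hashlib.sha256).digest()
        out = bytes(a ^ b for a, b in zip(out, tag))
    return out


if __name__ == "__main__":
    shares = distribute_seeds(n=5, t=2)   # here u = 3 players suffice
    print(shared_prf([shares[0], shares[1], shares[2]], b"input").hex())
```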
Abstract:
The Common Scrambling Algorithm Stream Cipher (CSA-SC) is a shift register based stream cipher designed to encrypt digital video broadcasts. CSA-SC produces a pseudo-random binary sequence that is used to mask the contents of the transmission. In this paper, we analyse the initialisation process of the CSA-SC keystream generator and demonstrate weaknesses which lead to state convergence, slid pairs and shifted keystreams. As a result, the cipher may be vulnerable to distinguishing attacks, time-memory-data trade-off attacks or slide attacks.
Abstract:
1,4-Diazabicyclo[2.2.2]octane (DABCO) forms well-defined co-crystals with 1,2-diiodotetrafluorobenzene (1,2-DITFB), [(1,2-DITFB)2DABCO], and 1,3,5-triiodotrifluorobenzene, [(1,3,5-TITFB)2DABCO]. Both systems exhibited lower-than-expected supramolecular connectivity, which inspired a search for polymorphs in alternative crystallization solvents. In dichloromethane solution, the Menshutkin reaction was found to occur, generating chloride anions and quaternary ammonium cations through the reaction between the solvent and DABCO. The controlled in situ production of chloride ions facilitated the crystallization of new halogen bonded networks, DABCO–CH2Cl[(1,2-DITFB)Cl] (zigzag X-bonded chains) and (DABCO–CH2Cl)3[(1,3,5-TITFB)2Cl3]·CHCl3 (2D pseudo-trigonal X-bonded nets displaying Borromean entanglement), propagating with charge-assisted C–I···Cl⁻ halogen bonds. The method was found to be versatile, and substitution of DABCO with triethylamine (TEA) gave (TEA-CH2Cl)3[(1,2-DITFB)Cl3]·4(H2O) (a mixed halogen-bond/hydrogen-bond network with 2D supramolecular connectivity) and TEA-CH2Cl[(1,3,5-TITFB)Cl] (tightly packed planar trigonal nets). The co-crystals were typically produced in high yield and purity with relatively predictable supramolecular topology, particularly with respect to the connectivity of the iodobenzene molecules. The potential to use this synthetic methodology for crystal engineering of halogen bonded architectures is demonstrated and discussed.
Abstract:
A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to unbiasedly estimate the likelihood, and to perform inference and make decisions based on an exact-approximate algorithm. Two estimators are proposed: one based on quasi-Monte Carlo methods and one based on the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
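A small sketch of the second estimator (a Laplace approximation used as an importance-sampling proposal) for a single block of a toy random-intercept model; the model, parameter values and function names are illustrative assumptions, not the paper's implementation:

```python
# Sketch: estimate the observed-data likelihood for one block of a toy
# random-intercept model, y_j ~ N(b, sigma^2) with b ~ N(0, tau^2), using a
# Laplace (normal) approximation as the importance-sampling proposal.
import numpy as np
from scipy import optimize, stats


def log_joint(b, y, sigma, tau):
    # log p(y | b) + log p(b)
    return stats.norm.logpdf(y, loc=b, scale=sigma).sum() + stats.norm.logpdf(b, scale=tau)


def likelihood_estimate(y, sigma=1.0, tau=1.0, n_samples=500, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    # Laplace approximation: mode and curvature of the log joint in b
    res = optimize.minimize_scalar(lambda b: -log_joint(b, y, sigma, tau))
    b_hat = res.x
    h = 1e-4
    curv = (log_joint(b_hat - h, y, sigma, tau) - 2 * log_joint(b_hat, y, sigma, tau)
            + log_joint(b_hat + h, y, sigma, tau)) / h**2
    prop_sd = np.sqrt(-1.0 / curv)
    # Importance sampling with the Laplace proposal yields an unbiased estimate
    b = rng.normal(b_hat, prop_sd, size=n_samples)
    log_w = np.array([log_joint(bi, y, sigma, tau) for bi in b])
    log_w -= stats.norm.logpdf(b, loc=b_hat, scale=prop_sd)
    m = log_w.max()
    return np.exp(m) * np.mean(np.exp(log_w - m))


if __name__ == "__main__":
    y = np.array([0.3, -0.1, 0.5])
    print(likelihood_estimate(y))
```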
Abstract:
This paper offers an uncertainty quantification (UQ) study applied to the performance analysis of the ERCOFTAC conical diffuser. A deterministic CFD solver is coupled with a non-statistical generalised Polynomial Chaos (gPC) representation based on a pseudo-spectral projection method. Such an approach has the advantage of not requiring any modification of the CFD code for the propagation of random disturbances in the aerodynamic field. The stochastic results highlight the importance of the inlet velocity uncertainties on the pressure recovery, both alone and when coupled with a second uncertain variable. From a theoretical point of view, we investigate the possibility of building our gPC representation on arbitrary grids, thus increasing the flexibility of the stochastic framework.
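A minimal sketch of non-intrusive pseudo-spectral projection for a single uniformly distributed uncertain input, with a toy function standing in for the deterministic CFD solver; the gPC coefficients are obtained by Gauss–Legendre quadrature of the solver output against Legendre polynomials (the solver, order and node count are assumptions for illustration):

```python
# Sketch of non-intrusive pseudo-spectral gPC projection for ONE uniform input.
# The deterministic solver is only called at quadrature nodes; `solver` below is
# a toy stand-in for the CFD code.
import numpy as np
from numpy.polynomial import legendre


def solver(xi):
    # Placeholder for a deterministic CFD evaluation at input value xi in [-1, 1].
    return np.exp(0.3 * xi) + 0.1 * xi**2


def gpc_coefficients(order=4, n_quad=8):
    """Project the solver output onto Legendre polynomials (orthogonal for U(-1,1))."""
    nodes, weights = legendre.leggauss(n_quad)
    f = np.array([solver(x) for x in nodes])
    coeffs = np.empty(order + 1)
    for k in range(order + 1):
        pk = legendre.Legendre.basis(k)(nodes)
        # 1/2 is the U(-1,1) density; the norm of P_k under that weight is 1/(2k+1)
        coeffs[k] = 0.5 * np.sum(weights * f * pk) * (2 * k + 1)
    return coeffs


if __name__ == "__main__":
    c = gpc_coefficients()
    var = sum(c[k]**2 / (2 * k + 1) for k in range(1, len(c)))
    print("mean ~", c[0], " variance ~", var)
```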
Abstract:
Interactions between the anti-carcinogens, bendamustine (BDM) and dexamethasone (DXM), with bovine serum albumin (BSA) were investigated with the use of fluorescence and UV–vis spectroscopies under pseudo-physiological conditions (Tris–HCl buffer, pH 7.4). The static mechanism was responsible for the fluorescence quenching during the interactions; the binding formation constant of the BSA–BDM complex and the binding number were 5.14 × 10⁵ L mol⁻¹ and 1.0, respectively. Spectroscopic studies for the formation of the BDM–BSA complex were interpreted with the use of multivariate curve resolution – alternating least squares (MCR–ALS), which supported the complex formation. The BSA samples treated with site markers (warfarin – site I and ibuprofen – site II) were reacted separately with BDM and DXM; while both anti-carcinogens bound to site I, the binding constants suggested that DXM formed a more stable complex. Relative concentration profiles and the fluorescence spectra associated with BDM, DXM and BSA were recovered simultaneously from the full fluorescence excitation–emission data with the use of the parallel factor analysis (PARAFAC) method. The results confirmed that on addition of DXM to the BDM–BSA complex, the BDM was replaced and the DXM–BSA complex formed; free BDM was released. This finding may have consequences for the transport of these drugs during any anti-cancer treatment.
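As an illustration of how a binding constant and binding number are commonly extracted from quenching data of this kind, here is a short sketch of the standard double-logarithm analysis, log[(F0 − F)/F] = log Kb + n log[Q], applied to synthetic values (not the study's measurements):

```python
# Sketch of the double-logarithm analysis for static quenching:
#   log10[(F0 - F)/F] = log10(K_b) + n * log10([Q])
# Concentrations and intensities below are synthetic, purely for illustration.
import numpy as np

Q = np.array([2e-6, 4e-6, 6e-6, 8e-6, 1e-5])   # quencher concentration (mol/L)
F0 = 1000.0                                    # fluorescence without quencher
F = np.array([720.0, 560.0, 455.0, 385.0, 330.0])

x = np.log10(Q)
y = np.log10((F0 - F) / F)
n_bind, log_kb = np.polyfit(x, y, 1)           # slope = n, intercept = log10(K_b)

print(f"binding number n ~ {n_bind:.2f}, K_b ~ {10**log_kb:.2e} L/mol")
```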
Abstract:
The surfaces of natural beidellite were modified with the cationic surfactant octadecyl trimethylammonium bromide at different concentrations. The organo-beidellite adsorbent materials were then used for the removal of atrazine, with the goal of investigating the mechanism for the adsorption of the organic triazine herbicide from contaminated water. Changes on the surfaces and structure of beidellite were characterised by X-ray diffraction (XRD), thermogravimetric analysis (TGA), Fourier transform infrared (FTIR) spectroscopy, scanning electron microscopy (SEM) and BET surface analysis. Adsorption kinetics studies were also carried out, which show that the adsorption capacity of the organoclays increases with increasing surfactant concentration up to a 1.0 CEC surfactant loading, after which the adsorption capacity greatly decreases. TG analysis reveals that although the 2.0 CEC sample has the greatest percentage of surfactant by mass, most of it is present on external sites. The 0.5 CEC sample has the highest proportion of surfactant exchanged into the internal active sites, and the 1.0 CEC sample exhibits the highest adsorption capacity. The goodness of fit of the pseudo-second-order kinetic model confirms that chemical adsorption, rather than physical adsorption, controls the adsorption rate of atrazine.
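A brief sketch of fitting the pseudo-second-order kinetic model, q_t = k·q_e²·t / (1 + k·q_e·t), to adsorption data by nonlinear least squares; the data values and fitted constants below are synthetic, not the study's measurements:

```python
# Sketch: fit the pseudo-second-order kinetic model
#   q_t = k * q_e**2 * t / (1 + k * q_e * t)
# to adsorption data with nonlinear least squares. Data values are synthetic.
import numpy as np
from scipy.optimize import curve_fit


def pseudo_second_order(t, qe, k):
    return k * qe**2 * t / (1.0 + k * qe * t)


# contact time (min) and adsorbed amount q_t (mg/g) -- illustrative only
t = np.array([5, 10, 20, 40, 60, 120, 240], dtype=float)
q = np.array([3.1, 5.2, 7.6, 9.4, 10.2, 11.0, 11.3])

(qe_fit, k_fit), _ = curve_fit(pseudo_second_order, t, q, p0=[q.max(), 0.01])
print(f"q_e ~ {qe_fit:.2f} mg/g, k ~ {k_fit:.4f} g/(mg*min)")
```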
Abstract:
The development of microfinance in Vietnam since the 1990s has coincided with remarkable progress in poverty reduction. Numerous descriptive studies have illustrated that microfinance is an effective tool to eradicate poverty in Vietnam, but evidence from quantitative studies is mixed. This study contributes to the literature by providing new evidence on the impact of microfinance on poverty reduction in Vietnam, using repeated cross-sectional data from the Vietnam Living Standards Survey (VLSS) over the period 1992–2010. Our results show that micro-loans contribute significantly to household consumption.