524 results for Stochastic Context-Free Grammars
Abstract:
An analytic closed form for the second-order or fourth-order Markovian stochastic correlation of attosecond sum-frequency polarization beat (ASPB) can be obtained in the extremely Doppler-broadened limit. The homodyne-detected ASPB signal is shown to be particularly sensitive to the statistical properties of Markovian stochastic light fields with arbitrary bandwidth. The physical explanation for this is that the Gaussian-amplitude field undergoes stronger intensity fluctuations than a chaotic field. On the other hand, the intensity (amplitude) fluctuations of the Gaussian-amplitude field or the chaotic field are always much larger than the pure phase fluctuations of the phase-diffusion field. The field correlation has a weak influence on the ASPB signal when the laser has a narrow bandwidth. In contrast, when the laser has a broadband linewidth, the ASPB signal shows resonant-nonresonant cross correlation, and the sensitivities of the ASPB signal to the three Markovian stochastic models increase as the time delay is increased. A Doppler-free precision in the measurement of the energy-level sum can be achieved with an arbitrary bandwidth. The advantage of ASPB is that the ultrafast modulation period of 900 as can still be improved, because the energy-level interval between the ground state and the excited state can be widely separated.
Abstract:
A new approach is proposed to simulate splash erosion on local soil surfaces. Without the effect of wind and other raindrops, the impact of free-falling raindrops was considered as an independent event from the stochastic viewpoint. The erosivity of a single raindrop, depending on its kinetic energy, was computed by an empirical relationship in which the kinetic energy was expressed as a power function of the equivalent diameter of the raindrop. An empirical linear function combining the kinetic energy and soil shear strength was used to estimate the amount of soil particles detached by the impact of a single raindrop. Considering an ideal local soil surface with a size of 1 m x 1 m, the expected number of received free-falling raindrops with different diameters per unit time was described by the combination of the raindrop size distribution function and the terminal velocity of raindrops. The total splash amount was seen as the sum of the impact amounts of all raindrops in the rainfall event. The total splash amount per unit time was subdivided into three components: net splash amount, single impact amount and re-detachment amount. The re-detachment amount was obtained from a spatial geometric probability derived using the Poisson function, in which overlapped impacted areas were considered. The net splash amount was defined as the mass of soil particles collected outside the splash dish. It was estimated by another spatial geometric probability in which the average splash distance, related to the median grain size of the soil, and the effects of other impacted soil particles and other free-falling raindrops were considered. Splash experiments under artificial rainfall were carried out to validate the applicability and accuracy of the model. Our simulated results suggested that the net splash amount and re-detachment amount were small parts of the total splash amount; their proportions were 0.15% and 2.6%, respectively.
The comparison of simulated data with measured data showed that this model could be applied to simulate the soil-splash process successfully, requiring only information on the rainfall intensity and the original soil properties, including initial bulk density, water content, median grain size and some empirical constants related to the soil surface shear strength, the raindrop size distribution function and the average splash distance. Copyright (c) 2007 John Wiley & Sons, Ltd.
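The two stochastic ingredients described above, kinetic energy as a power function of raindrop diameter and a Poisson-based overlap probability for re-detachment, can be sketched as follows. This is a minimal illustration: the constants `a` and `b` and the function names are hypothetical placeholders, not the paper's fitted empirical values.

```python
import math

def raindrop_kinetic_energy(diameter_mm, a=1.0e-6, b=4.5):
    """Kinetic energy as a power function of the equivalent raindrop
    diameter: KE = a * d**b. The constants a and b stand in for the
    paper's empirical values."""
    return a * diameter_mm ** b

def redetachment_probability(single_impact_area, n_impacts, plot_area=1.0):
    """Poisson-derived spatial geometric probability that a new impact
    overlaps an already-impacted area, assuming impacts land uniformly
    and independently on the plot."""
    expected_coverage = single_impact_area * n_impacts / plot_area
    return 1.0 - math.exp(-expected_coverage)
```

The exponential term is the Poisson probability that a given point has received no prior impact; its complement gives the chance a fresh impact re-detaches already-splashed soil.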
Abstract:
The vehicle navigation problem studied in Bell (2009) is revisited and a time-dependent reverse Hyperstar algorithm is presented. This minimises the expected time of arrival at the destination, and at all intermediate nodes, where expectation is based on a pessimistic (or risk-averse) view of unknown link delays. This may also be regarded as a hyperpath version of the Chabini and Lan (2002) algorithm, which itself is a time-dependent A* algorithm. Links are assigned undelayed travel times and maximum delays, both of which are potentially functions of the time of arrival at the respective link. The driver seeks probabilities for link use that minimise his/her maximum exposure to delay on the approach to each node, leading to the determination of the pessimistic expected time of arrival. Since the context considered is vehicle navigation where the driver is not making repeated trips, the probability of link use may be interpreted as a measure of link attractiveness, so a link with a zero probability of use is unattractive while a link with a probability of use equal to one will have no attractive alternatives. A solution algorithm is presented and proven to solve the problem provided the node potentials are feasible and a FIFO condition applies for undelayed link travel times. The paper concludes with a numerical example.
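The "undelayed travel time plus maximum delay" view of a link can be illustrated with a deliberately simplified worst-case search. This sketch is a single-path stand-in, not the Hyperstar algorithm itself, which instead spreads link-use probabilities over a hyperpath to minimise the pessimistic expected arrival time; the graph encoding is an assumption for illustration.

```python
import heapq

def pessimistic_arrival_time(graph, source, target):
    """Worst-case earliest arrival when every link is charged its
    undelayed travel time plus its maximum delay.

    graph: dict mapping node -> list of (successor, undelayed, max_delay).
    Returns the pessimistic arrival time at target, or infinity if
    target is unreachable.
    """
    best = {source: 0.0}
    frontier = [(0.0, source)]
    while frontier:
        t, node = heapq.heappop(frontier)
        if node == target:
            return t
        if t > best.get(node, float("inf")):
            continue  # stale priority-queue entry
        for nxt, undelayed, max_delay in graph.get(node, []):
            arrival = t + undelayed + max_delay
            if arrival < best.get(nxt, float("inf")):
                best[nxt] = arrival
                heapq.heappush(frontier, (arrival, nxt))
    return float("inf")
```

With delays charged in full, the risk-averse driver here prefers a longer undelayed route whose maximum delay exposure is smaller.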
Abstract:
Similarly to protein folding, the association of two proteins is driven by a free energy funnel, determined by favorable interactions in some neighborhood of the native state. We describe a docking method based on stochastic global minimization of funnel-shaped energy functions in the space of rigid body motions (SE(3)) while accounting for flexibility of the interface side chains. The method, called semi-definite programming-based underestimation (SDU), employs a general quadratic function to underestimate a set of local energy minima and uses the resulting underestimator to bias further sampling. While SDU effectively minimizes functions with funnel-shaped basins, its application to docking in the rotational and translational space SE(3) is not straightforward due to the geometry of that space. We introduce a strategy that uses separate independent variables for side-chain optimization, center-to-center distance of the two proteins, and five angular descriptors of the relative orientations of the molecules. The removal of the center-to-center distance turns out to vastly improve the efficiency of the search, because the five-dimensional space now exhibits a well-behaved energy surface suitable for underestimation. This algorithm explores the free energy surface spanned by encounter complexes that correspond to local free energy minima and shows similarity to the model of macromolecular association that proceeds through a series of collisions. Results for standard protein docking benchmarks establish that in this space the free energy landscape is a funnel in a reasonably broad neighborhood of the native state and that the SDU strategy can generate docking predictions with less than 5 Å ligand interface Cα root-mean-square deviation while achieving an approximately 20-fold efficiency gain compared to Monte Carlo methods.
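The underestimation step can be illustrated in one dimension: fit a quadratic to sampled local minima by least squares, then shift it down until it lies below every sample. This is only a sketch of the idea; SDU itself fits a general quadratic in the five-dimensional angular space under a semidefinite (convexity) constraint, which this toy version does not enforce.

```python
def quadratic_underestimator(xs, fs):
    """Fit q(x) = a*x**2 + b*x + c to sampled local minima (xs, fs)
    by least squares, then shift q down so q(x_i) <= f_i everywhere.
    One-dimensional illustration of the underestimation idea only."""
    powers = (2, 1, 0)
    # Normal equations for the monomial basis x**2, x, 1.
    A = [[sum(x ** (i + j) for x in xs) for j in powers] for i in powers]
    rhs = [sum(f * x ** i for x, f in zip(xs, fs)) for i in powers]
    # Solve the 3x3 system by Gaussian elimination with partial pivoting.
    n = 3
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        rhs[col], rhs[pivot] = rhs[pivot], rhs[col]
        for row in range(col + 1, n):
            factor = A[row][col] / A[col][col]
            for k in range(col, n):
                A[row][k] -= factor * A[col][k]
            rhs[row] -= factor * rhs[col]
    coeffs = [0.0] * n
    for row in range(n - 1, -1, -1):
        coeffs[row] = (rhs[row] - sum(A[row][k] * coeffs[k]
                                      for k in range(row + 1, n))) / A[row][row]
    a, b, c = coeffs
    # Shift down by the largest overshoot so q never exceeds a sample.
    shift = max(a * x * x + b * x + c - f for x, f in zip(xs, fs))
    return a, b, c - shift
```

The minimizer of the shifted quadratic then suggests where the funnel bottom lies, and further sampling is biased toward it.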
Abstract:
A method for reconstruction of 3D polygonal models from multiple views is presented. The method uses sampling techniques to construct a texture-mapped semi-regular polygonal mesh of the object in question. Given a set of views and segmentation of the object in each view, constructive solid geometry is used to build a visual hull from silhouette prisms. The resulting polygonal mesh is simplified and subdivided to produce a semi-regular mesh. Regions of model fit inaccuracy are found by projecting the reference images onto the mesh from different views. The resulting error images for each view are used to compute a probability density function, and several points are sampled from it. Along the epipolar lines corresponding to these sampled points, photometric consistency is evaluated. The mesh surface is then pulled towards the regions of higher photometric consistency using free-form deformations. This sampling-based approach produces a photometrically consistent solution in much less time than possible with previous multi-view algorithms given arbitrary camera placement.
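The step of sampling points from the per-view error images can be sketched as inverse-CDF sampling over an unnormalised discrete density. The function name and interface below are illustrative assumptions; the method described above then evaluates photometric consistency along the epipolar lines through the sampled points.

```python
import bisect
import random

def sample_error_pixels(error_image, k, seed=0):
    """Draw k (row, col) pixel coordinates with probability
    proportional to the per-pixel error, treating the error image
    (a list of rows of non-negative floats) as an unnormalised
    probability density function."""
    rng = random.Random(seed)
    width = len(error_image[0])
    flat = [max(e, 0.0) for row in error_image for e in row]
    # Cumulative distribution over the flattened image.
    cdf, total = [], 0.0
    for e in flat:
        total += e
        cdf.append(total)
    samples = []
    for _ in range(k):
        idx = bisect.bisect_right(cdf, rng.random() * total)
        samples.append((idx // width, idx % width))
    return samples
```

High-error regions of the mesh are thus sampled more often, concentrating the photometric-consistency search where the model fit is worst.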
Abstract:
Organizations that leverage lessons learned from their experience in the practice of complex real-world activities are faced with five difficult problems. First, how to represent the learning situation in a recognizable way. Second, how to represent what was actually done in terms of repeatable actions. Third, how to assess performance taking account of the particular circumstances. Fourth, how to abstract lessons learned that are re-usable on future occasions. Fifth, how to determine whether to pursue practice maturity or strategic relevance of activities. Here, organizational learning and performance improvement are investigated in a field study using the Context-based Intelligent Assistant Support (CIAS) approach. A new conceptual framework for practice-based organizational learning and performance improvement is presented that helps researchers and practitioners to address the problems evoked, and contributes to a practice-based approach to activity management. The novelty of the research lies in the simultaneous study of the different levels involved in the activity. Route selection in light rail infrastructure projects involves practices at both the strategic and operational levels; it is part managerial/political and part engineering. Aspectual comparison of practices represented in Contextual Graphs constitutes a new approach to the selection of Key Performance Indicators (KPIs). This approach is free from causality assumptions and forms the basis of a new approach to practice-based organizational learning and performance improvement. The evolution of practices in contextual graphs is shown to be an objective and measurable expression of organizational learning. This diachronic representation is interpreted using a practice-based organizational learning novelty typology. This dissertation shows how lessons learned, when effectively leveraged by an organization, lead to practice maturity.
The practice maturity level of an activity in combination with an assessment of an activity’s strategic relevance can be used by management to prioritize improvement effort.
Abstract:
We present a theory of hypoellipticity and unique ergodicity for semilinear parabolic stochastic PDEs with "polynomial" nonlinearities and additive noise, considered as abstract evolution equations in some Hilbert space. It is shown that if Hörmander's bracket condition holds at every point of this Hilbert space, then a lower bound on the Malliavin covariance operator μt can be obtained. Informally, this bound can be read as "Fix any finite-dimensional projection Π on a subspace of sufficiently regular functions. Then the eigenfunctions of μt with small eigenvalues have only a very small component in the image of Π." We also show how to use a priori bounds on the solutions to the equation to obtain good control on the dependency of the bounds on the Malliavin matrix on the initial condition. These bounds are sufficient in many cases to obtain the asymptotic strong Feller property introduced in [HM06]. One of the main novel technical tools is an almost sure bound from below on the size of "Wiener polynomials," where the coefficients are possibly non-adapted stochastic processes satisfying a Lipschitz condition. By exploiting the polynomial structure of the equations, this result can be used to replace Norris' lemma, which is unavailable in the present context. We conclude by showing that the two-dimensional stochastic Navier-Stokes equations and a large class of reaction-diffusion equations fit the framework of our theory.
Abstract:
The space–time dynamics of rigid inhomogeneities (inclusions) free to move in a randomly fluctuating fluid bio-membrane is derived and numerically simulated as a function of the membrane shape changes. Both vertically placed (embedded) inclusions and horizontally placed (surface) inclusions are considered. The energetics of the membrane, as a two-dimensional (2D) meso-scale continuum sheet, is described by the Canham–Helfrich Hamiltonian, with the membrane height function treated as a stochastic process. The diffusion parameter of this process acts as the link coupling the membrane shape fluctuations to the kinematics of the inclusions. The latter is described via an Itô stochastic differential equation. In addition to stochastic forces, the inclusions also experience membrane-induced deterministic forces. Our aim is to simulate the diffusion-driven aggregation of inclusions and show how the external inclusions arrive at the sites of the embedded inclusions. The model has potential use in such emerging fields as designing a targeted drug delivery system.
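An Itô equation of the kind driving the inclusion kinematics is typically integrated with the Euler-Maruyama scheme. The sketch below is generic: the drift and diffusion callables are placeholders and make no attempt to reproduce the Canham–Helfrich membrane-induced forces or the coupling to the height fluctuations.

```python
import math
import random

def euler_maruyama(drift, diffusion, x0, dt, n_steps, seed=0):
    """Integrate dX = drift(X) dt + diffusion(X) dW with the
    Euler-Maruyama scheme, drawing Wiener increments dW ~ N(0, dt).
    Returns the full sample path as a list of positions."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))
        x = x + drift(x) * dt + diffusion(x) * dw
        path.append(x)
    return path
```

With a membrane-induced attractive drift and a diffusion coefficient tied to the shape fluctuations, repeated runs of such an integrator yield the aggregation statistics described above.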
Abstract:
We examined whether individual differences in shyness and context influenced the amount of computer-mediated self-disclosure and use of affective language during an unfamiliar dyadic social interaction. Unfamiliar young adults were selected for high and low self-reported shyness and paired in mixed dyads (one shy and one nonshy). Each dyad was randomly assigned to either a live webcam or no webcam condition. Participants then engaged in a 20-minute online free chat over the Internet in the laboratory. Free chat conversations were archived, and the transcripts were objectively coded for traditional communication variables, conversational style, and the use of affective language. As predicted, shy adults engaged in significantly fewer spontaneous self-disclosures than did their nonshy counterparts only in the webcam condition. Shy versus nonshy adults did not differ on spontaneous self-disclosures in the no webcam condition. However, context did not influence the use of computer-mediated affective language. Although shy adults used significantly fewer active and pleasant words than their nonshy counterparts, these differences were not related to webcam condition. The present findings replicate and extend earlier work on shyness, context, and computer-mediated communication to a selected sample of shy adults. Findings suggest that context may influence some, but not all, aspects of social communication in shy adults.
Abstract:
The Internet provides a new tool to investigate old questions in experimental social psychology regarding Person x Context interaction. We examined the interaction of self-reported shyness and context on computer-mediated communication measures. Sixty unfamiliar female undergraduates were paired in dyads and engaged in a 10 min free chat conversation on the Internet with and without a live webcam. Free chat conversations were archived, transcripts were objectively coded for communication variables, and a linear mixed model for the analysis of dyadic interaction was fitted to each communication measure. As predicted, increases in self-reported shyness were significantly related to decreases in the number of prompted self-disclosures (after controlling for the number of opportunities to self-disclose) only in the webcam condition. Self-reported shyness was not related to the number of prompted self-disclosures in the no webcam condition, suggesting that shyness was context dependent. The present study appears to be the first to objectively code measures of Internet behaviour in relation to the study of personality in general and shyness in particular. Theoretical and clinical implications for understanding the contextual nature of shyness are discussed. (C) 2006 Elsevier Inc. All rights reserved.
Abstract:
Ethel Smyth’s opera, Der Wald, met with mixed reactions at its premiere in Berlin in 1902. Many factors contributed to this, not least, as Smyth herself observed, anti-British sentiment in Germany following the second Boer War. One might have expected that the reception of the opera at its British premiere on 18 July at Covent Garden might have been more positive, but even here critical opinion was divided. Even positive reviews were not free from gender discrimination, and other reviews condemned the opera for being too German or Wagnerian. What was meant by ‘Wagnerian’? This article answers the question in three ways. Firstly, I argue that ‘Wagnerian’ meant not a leitmotif-filled, through-composed work (as distinct from a number opera), but simply a lyrical drama; for British audiences the model for this was Tannhäuser or Lohengrin, not the Ring or Tristan. Secondly, taking this definition on board, I analyse the musical language of the opera, in particular the key structure. The central duet sung by the doomed lovers, Heinrich and Röschen, is in F major, almost the furthest possible distance from the home key of the opera (E major), which characterizes the forest and ‘nature’ in general; by contrast, the next scene, where the Kundry-like Iolanthe attempts to seduce Heinrich (a crucial reversal of the more conventional power relations of the love duet), sees a return to the home key. Thirdly, I set the hermeneutical implications of this reversal in the context of the decadent movement, with which late nineteenth-century Wagnerism was associated, and which, following the conviction of Oscar Wilde in 1895, was discredited. Der Wald thus failed because of its ‘guilt by association’ with an aesthetic that had fallen into disrepute.
Abstract:
Throughout the design development of a satellite structure, the stress engineer is usually challenged with randomness in applied loads and material properties. To overcome this problem, a risk-based design is applied, which estimates the satellite structure's probability of failure under static and thermal loads. Determining the probability of failure can help to update the initially applied factors of safety that were used during the structure's preliminary design phase. These factors of safety are related to the satellite mission objective. Sensitivity-based analysis is to be implemented in the context of finite element analysis (the probabilistic finite element method, or stochastic finite element method (SFEM)) to determine the probability of failure for the satellite structure or one of its components.
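The quantity at the heart of such a risk-based design, the probability that a random load exceeds a random strength, can be sketched with a crude Monte Carlo estimate. The sampler arguments are illustrative stand-ins: a full SFEM analysis would instead propagate the random loads and material properties through a finite element model.

```python
import random

def probability_of_failure(load_sampler, strength_sampler,
                           n_trials=100_000, seed=1):
    """Monte Carlo estimate of P(load > strength). Each sampler is a
    callable taking a random.Random instance and returning one draw
    of the corresponding random quantity."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n_trials)
        if load_sampler(rng) > strength_sampler(rng)
    )
    return failures / n_trials
```

The estimated failure probability can then be compared against the target reliability implied by the mission's factors of safety.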
Abstract:
In this article I investigate the practice of free music improvisation in Brazil. The reflections and findings presented here are derived from research conducted as part of a four-month Higher Education Academy (HEA, UK) Fellowship, carried out between February and June 2014. The aim was to enquire whether or how the practice of free improvisation is taught in the Brazilian higher education system.
As part of this ethnographic study visits to the following universities were scheduled:
The Federal University of Rio de Janeiro - UFRJ
The Universidade Federal do Estado do Rio de Janeiro (UNIRIO)
The University of São Paulo - USP
The Federal University of Minas Gerais – UFMG
The Federal University of Bahia – UFBA
The Federal University of Rio Grande do Norte in Natal (UFRN) and
The ELM, the Escola Livre de Música in Unicamp.
I discuss here some general background thinking to the research process, specifically recalling the work of French composer and educator Alain Savouret. I proceed to examine the improvisational spirit, the improvisatory worldmaking approach (the ‘jeitinho brasileiro’) that is often considered to be integral to the Brazilian way of life. In the final part of the article I discuss applied ethnographic methodologies, including the design of questions that were used for over 50 video interviews with Brazilian musicians during the research. I conclude with a final reflection on the video interviews with a specific focus on whether free improvisation can be taught, and the importance of listening in the context of free improvisation practices.
Abstract:
It is often assumed that in order to avoid the most severe consequences of global anthropogenic climate change we have to preserve our existing carbon sinks, such as for instance tropical forests. Global carbon sink conservation raises a host of normative issues, though, since it is debatable who should pay the costs of carbon sink conservation, who has the duty to protect which sinks, and how far the duty to conserve one’s carbon sinks actually extends, especially if it conflicts with other duties one might have. According to some, forested states like Ecuador have a duty to preserve their tropical forests while the rich states of the global North have a duty of fairness to compensate states like Ecuador for the costs they incur. My aim in this paper is to critically analyse this standard line of argument and to criticise its validity both internally (i.e. with regard to its normative conclusion based on its premises) and externally (i.e. with regard to the argument’s underlying assumptions and its lack of contextualisation). As I will argue, the duty to conserve one’s forests is only a particular instantiation of a wider, more general duty to contribute towards global climate justice for which the context in which one operates (e.g. whether other agents are complying with their duties of global climate justice or not) matters significantly.
Abstract:
A core activity in information systems development involves building a conceptual model of the domain that an information system is intended to support. Such models are created using a conceptual-modeling (CM) grammar. Just as high-quality conceptual models facilitate high-quality systems development, high-quality CM grammars facilitate high-quality conceptual modeling. This paper provides a new perspective on ways to improve the quality of the semantics of CM grammars. For many years, the leading approach to this topic has relied on ontological theory. We show, however, that the ontological approach captures only half the story. It needs to be coupled with a logical approach. We explain how the ontological quality and logical quality of CM grammars interrelate. Furthermore, we outline three contributions that a logical approach can make to evaluating the quality of CM grammars: a means of seeing some familiar conceptual-modeling problems in simpler ways; the illumination of new problems; and the ability to prove the benefit of modifying existing CM grammars in particular ways. We demonstrate these benefits in the context of the Entity-Relationship grammar. More generally, our paper opens up a new area of research with many opportunities for future research and practice.