290 results for Classical Peronism


Relevance:

10.00%

Publisher:

Abstract:

This report documents the key findings of a year-long collaborative research project on the London Symphony Orchestra’s (LSO) development, implementation and testing of a mobile ticketing and information system. The system was developed in association with the LSO’s technical partner, Kodime Limited, and in collaboration with the Aurora Orchestra.

Relevance:

10.00%

Publisher:

Abstract:

Indirect inference (II) is a methodology for estimating the parameters of an intractable (generative) model on the basis of an alternative parametric (auxiliary) model that is both analytically and computationally easier to deal with. Such an approach has been well explored in the classical literature but has received substantially less attention in the Bayesian paradigm. The purpose of this paper is to compare and contrast a collection of what we call parametric Bayesian indirect inference (pBII) methods. One class of pBII methods uses approximate Bayesian computation (referred to here as ABC II), where the summary statistic is formed on the basis of the auxiliary model, using ideas from II. We show that another approach proposed in the literature, referred to here as parametric Bayesian indirect likelihood (pBIL), is fundamentally different from ABC II. We devise new theoretical results for pBIL to give extra insight into its behaviour and into its differences from ABC II. Furthermore, we examine in more detail the assumptions required to use each pBII method. The results, insights and comparisons developed in this paper are illustrated on simple examples and two substantive applications. The first involves performing inference for complex quantile distributions based on simulated data, while the second estimates the parameters of a trivariate stochastic process describing the evolution of macroparasites within a host based on real data. We create a novel framework called Bayesian indirect likelihood (BIL), which encompasses pBII as well as general ABC methods, so that the connections between the methods can be established.
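
A minimal sketch of the ABC II idea described above: the auxiliary model's parameter estimates serve as the summary statistic, and prior draws are kept when the simulated and observed summaries are close. Everything here (the gamma generative model, the Gaussian auxiliary model, the uniform prior, the tolerance) is an invented toy assumption, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n):
    # Stand-in "intractable" generative model: we pretend its likelihood
    # is unavailable and only ever simulate from it.
    return rng.gamma(shape=theta, scale=1.0, size=n)

def auxiliary_summary(x):
    # Auxiliary model: a Gaussian; its MLEs (mean, std) act as the
    # summary statistic, following the indirect-inference idea.
    return np.array([x.mean(), x.std()])

x_obs = simulate(2.0, 500)            # "observed" data, true theta = 2.0
s_obs = auxiliary_summary(x_obs)

# ABC rejection: keep prior draws whose simulated auxiliary summaries
# land within tolerance eps of the observed summaries.
eps, kept = 0.1, []
for _ in range(20_000):
    theta = rng.uniform(0.5, 5.0)     # draw from a uniform prior
    if np.linalg.norm(auxiliary_summary(simulate(theta, 500)) - s_obs) < eps:
        kept.append(theta)

print(f"{len(kept)} draws kept, posterior mean ~ {np.mean(kept):.2f}")
```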

Relevance:

10.00%

Publisher:

Abstract:

Every year a number of pedestrians are struck by trains, resulting in death and serious injury. While much research has been conducted on train-vehicle collisions, very little is currently known about the aetiology of train-pedestrian collisions. To date, scant research has investigated the demographics of rule breakers, the frequency of deliberate violations versus errors, and the influence of the classic deterrence approach on subsequent behaviours. Aim: This study aimed to identify pedestrians’ self-reported reasons for engaging in violations at crossings, the frequency and nature of rule breaking, and whether the threat of sanctions influences such events. Method: A questionnaire was administered to 511 participants of all ages. Results: Analysis revealed that pedestrians (particularly younger groups) were more likely to commit deliberate violations than to make crossing errors (e.g., mistakes). The most frequent reasons given for deliberate violations were that participants were running late and did not want to miss their train, or that they believed the gate was taking too long to open and so might be malfunctioning. In regard to classical deterrence, an examination of the perceived threat of being apprehended and fined for a crossing violation revealed that participants reported the highest mean scores for swiftness of punishment, suggesting they were generally aware that they would receive an “on the spot” fine. However, the mean scores for certainty and severity of sanctions indicate that participants did not perceive either as very high. The paper further discusses these findings in relation to the development of interventions designed to improve pedestrian crossing safety.

Relevance:

10.00%

Publisher:

Abstract:

This article examines manual textual categorisation by human coders, with the hypothesis that the law of total probability may be violated for difficult categories. An empirical evaluation was conducted to compare a one-step categorisation task with a two-step categorisation task using crowdsourcing. It was found that the law of total probability was violated. Both quantum and classical probabilistic interpretations of this violation are presented. Further studies are required to resolve whether quantum models are more appropriate for this task.
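
For concreteness, the law of total probability requires P(A) = P(A|B)P(B) + P(A|¬B)P(¬B), so the directly elicited one-step judgement should match the two-step composition. The small sketch below, with invented numbers, shows the kind of gap such a study can report:

```python
# Hypothetical illustration of a law-of-total-probability check.
# One-step task: coders judge P(A) directly.
# Two-step task: coders first assign an intermediate category B,
# then judge A conditional on that assignment.
p_B = 0.6                      # P(B) from the two-step task
p_A_given_B = 0.7              # P(A | B)
p_A_given_notB = 0.2           # P(A | not B)

p_A_two_step = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)  # = 0.50
p_A_one_step = 0.38            # directly elicited in the one-step task

# Classically these must agree; a persistent gap like the one below is
# the kind of violation the article reports (all numbers are invented).
print(p_A_two_step - p_A_one_step)   # 0.12
```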

Relevance:

10.00%

Publisher:

Abstract:

Although the endocannabinoid system (ECS) has been implicated in brain development and various psychiatric disorders, the precise mechanisms by which the ECS affects mood and anxiety disorders remain unclear. Here, we have investigated the developmental and disease-related expression patterns of the cannabinoid receptor 1 (CB1) and cannabinoid receptor 2 (CB2) genes in the dorsolateral prefrontal cortex (PFC) of humans. Using mice selectively bred for high and low fear, we further investigated the potential association between fear memory and cannabinoid receptor expression in the brain. CB1, but not CB2, mRNA levels in the PFC gradually decrease during postnatal development across subjects ranging in age from birth to 50 years (r² > 0.6, adj. p < 0.05). CB1 levels in the PFC of major depression patients were higher than in age-matched controls (adj. p < 0.05). In mice, CB1, but not CB2, levels in the PFC were positively correlated with freezing behavior in classical fear conditioning (p < 0.05). These results suggest that CB1 in the PFC may play a significant role in regulating mood and anxiety symptoms. Our study demonstrates the advantage of combining data from postmortem brain tissue with a mouse model of fear to enhance our understanding of the role of the cannabinoid receptors in mood and anxiety disorders.

Relevance:

10.00%

Publisher:

Abstract:

Pavlovian fear conditioning, also known as classical fear conditioning, is an important model in the study of the neurobiology of normal and pathological fear. Progress in the neurobiology of Pavlovian fear also enhances our understanding of disorders such as posttraumatic stress disorder (PTSD) and assists with developing effective treatment strategies. Here we describe how Pavlovian fear conditioning is a key tool for understanding both the neurobiology of fear and the mechanisms underlying variations in fear memory strength observed across different phenotypes. First, we discuss how Pavlovian fear models aspects of PTSD. Second, we describe the neural circuits of Pavlovian fear and the molecular mechanisms within these circuits that regulate fear memory. Finally, we show how fear memory strength is heritable and describe genes that are specifically linked both to changes in Pavlovian fear behavior and to its underlying neural circuitry. These emerging data begin to define the essential genes, cells and circuits that contribute to normal and pathological fear.

Relevance:

10.00%

Publisher:

Abstract:

By presenting an overview of institutional theory, specifically the concepts of organizational fields, institutional pressures, and legitimacy, alongside classical rhetoric, we have sought to highlight that the literature links institutional theory with legitimacy, and legitimacy with classical rhetoric. To date, however, the three concepts – institutional pressures, legitimacy, and rhetoric – have not been explicitly linked. Building on the current literature, and using the notion of legitimacy as the axis connecting institutional pressures with rhetoric, we argue that certain rhetorical devices may be used to build and construct legitimacy in response to the different institutional pressures an organization may face within a field. We believe that this preliminary framework may be useful to the field of corporate social responsibility (CSR) communication, where it may assist in constructing legitimate CSR communication in response to the various pressures an organization faces in relation to CSR.

Relevance:

10.00%

Publisher:

Abstract:

In recent years, there has been a significant increase in the popularity of ontological analysis of conceptual modelling techniques. To date, related research has explored the ontological deficiencies of classical techniques such as ER or UML modelling, of business process modelling techniques such as ARIS, and even of Web Services standards such as BPEL4WS, BPML, ebXML, BPSS and WSCI. While the ontologies that form the basis of these analyses are reasonably mature, the actual process of an ontological analysis still lacks rigour. The current procedure is prone to individual interpretation, which is one reason the entire approach attracts criticism. This paper presents a procedural model for ontological analysis based on the use of meta models, multiple coders and metrics. The model is supported by examples from various ontological analyses.
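
As a sketch of the multiple-coder step, inter-coder agreement can be quantified with a metric such as Cohen's kappa; the paper does not prescribe this particular metric, so the following, including the category names, is only an illustrative assumption.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' mappings.

    Cohen's kappa is one plausible metric for the multiple-coder step
    described above; the paper's own metrics may differ.
    """
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    p_obs = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_exp = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (p_obs - p_exp) / (1 - p_exp)

# Two coders mapping ER constructs onto (hypothetical) ontological categories
a = ["thing", "property", "thing", "event", "property"]
b = ["thing", "property", "event", "event", "property"]
print(f"kappa = {cohens_kappa(a, b):.2f}")   # 0.71
```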

Relevance:

10.00%

Publisher:

Abstract:

Determining what consequences are likely to serve as effective punishment for any given behaviour is a complex task. This chapter focuses specifically on illegal road user behaviours and the mechanisms used to punish and deter them. Traffic law enforcement has traditionally used the threat and/or receipt of legal sanctions and penalties to deter illegal and risky behaviours. This process represents the use of positive punishment, one of the key behaviour modification mechanisms. Behaviour modification principles describe four types of reinforcers: positive and negative punishments and positive and negative reinforcements. The terms ‘positive’ and ‘negative’ are not used in an evaluative sense here. Rather, they represent the presence (positive) or absence (negative) of stimuli to promote behaviour change. Punishments aim to inhibit behaviour and reinforcements aim to encourage it. This chapter describes a variety of punishments and reinforcements that have been and could be used to modify illegal road user behaviours. In doing so, it draws on several theoretical perspectives that have defined behavioural reinforcement and punishment in different ways. Historically, the main theoretical approach used to deter risky road use has been classical deterrence theory, which focuses on the perceived certainty, severity and swiftness of penalties. Stafford and Warr (1993) extended the traditional deterrence principles to include the positive reinforcement concept of punishment avoidance. Evidence of the association between punishment avoidance experiences and behaviour has been established for a number of risky road user behaviours, including drink driving, unlicensed driving and speeding. We chose a novel way of assessing punishment avoidance by specifying two sub-constructs (detection evasion and punishment evasion). Another theorist, Akers (1977), described the idea of competing reinforcers, termed differential reinforcement, within social learning theory. Differential reinforcement describes a balance of reinforcements and punishments as influential on behaviour. This chapter describes a comprehensive way of conceptualising a broad range of reinforcement and punishment concepts, consistent with Akers’ differential reinforcement concept, within a behaviour modification framework that incorporates deterrence principles. The efficacy of three theoretical perspectives in explaining self-reported speeding was examined among a sample of 833 Australian car drivers. Results demonstrated that a broad range of variables predicted speeding, including personal experiences of evading detection and punishment for speeding, intrinsic sensations, practical benefits expected from speeding, and an absence of punishing effects from being caught. Not surprisingly, being younger was also significantly related to more frequent speeding, although in a regression analysis gender did not retain a significant influence once all punishment and reinforcement variables were entered. The implications for speed management, as well as road user behaviour modification more generally, are discussed in light of these findings. Overall, the findings reported in this chapter suggest that a more comprehensive approach is required to manage the behaviour of road users, one that does not rely solely on traditional legal penalties and sanctions.

Relevance:

10.00%

Publisher:

Abstract:

Transport through crowded environments is often classified as anomalous, rather than classical, Fickian diffusion. Several studies have sought to describe such transport processes using either a continuous time random walk or fractional order differential equation. For both these models the transport is characterized by a parameter α, where α = 1 is associated with Fickian diffusion and α < 1 is associated with anomalous subdiffusion. Here, we simulate a single agent migrating through a crowded environment populated by impenetrable, immobile obstacles and estimate α from mean squared displacement data. We also simulate the transport of a population of such agents through a similar crowded environment and match averaged agent density profiles to the solution of a related fractional order differential equation to obtain an alternative estimate of α. We examine the relationship between our estimate of α and the properties of the obstacle field for both a single agent and a population of agents; we show that in both cases, α decreases as the obstacle density increases, and that the rate of decrease is greater for smaller obstacles. Our work suggests that it may be inappropriate to model transport through a crowded environment using widely reported approaches including power laws to describe the mean squared displacement and fractional order differential equations to represent the averaged agent density profiles.
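
A minimal sketch of the single-agent estimate described above: agents walk on a lattice populated by impenetrable, immobile obstacles, moves into obstacles are aborted, and α is recovered as the slope of log MSD against log t (α < 1 indicating subdiffusion). The lattice size, obstacle density and number of agents below are invented for illustration and do not reproduce the paper's simulations.

```python
import numpy as np

rng = np.random.default_rng(1)

L, density, steps, walkers = 200, 0.3, 1000, 300
obstacles = rng.random((L, L)) < density   # impenetrable, immobile obstacles

moves = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])
disp = np.zeros((walkers, steps, 2))       # unwrapped displacement per agent

for w in range(walkers):
    while True:                            # start on an obstacle-free site
        pos = rng.integers(0, L, size=2)
        if not obstacles[tuple(pos)]:
            break
    d = np.zeros(2)
    for t in range(steps):
        step = moves[rng.integers(4)]
        nxt = (pos + step) % L             # periodic boundaries
        if not obstacles[tuple(nxt)]:      # moves into obstacles are aborted
            pos, d = nxt, d + step
        disp[w, t] = d

msd = (disp ** 2).sum(axis=2).mean(axis=0)  # average over agents
t = np.arange(1, steps + 1)
# alpha is the slope of log(MSD) vs log(t); alpha < 1 => subdiffusion
alpha = np.polyfit(np.log(t[10:]), np.log(msd[10:]), 1)[0]
print(f"estimated alpha ~ {alpha:.2f}")
```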

Relevance:

10.00%

Publisher:

Abstract:

The representation of vampires in horror movies and television programmes has changed considerably over the last two decades. No longer is the vampire portrayed simply as a monster or a representation of death. Now the vampire on our screens, such as True Blood’s Bill Compton or Twilight’s Edward Cullen, passes as human, chooses to make morally sound decisions, becomes an upstanding assimilated citizen, works in the community, and aspires to be a husband to mortal women. The success of recent series such as The Twilight Saga (2009, 2010, 2011, 2012), The Vampire Diaries (2009 - ) and True Blood (2008 - ) has popularised the idea of vampires who cling to remnants of their humanity (or memories of what it means to be human) and attempt to live as humans, building upon similar – albeit embryonic – themes which emerged from the vampire sub-genre in the 1990s. Within these narratives, representations of the other have shifted from the traditional idea of the monster to alternative and surprising loci. As this chapter argues, humans themselves, and the concept of the human body, now represent, in many instances, both the abject and the other. The chapter begins by considering the nature of the abject and otherness in relation to representations of classical vampires and how they have traditionally embodied the other. This provides a backdrop against which to examine the characteristics of the contemporary mainstreaming vampire ‘monster’. An examination of the broad thematic and representational shifts from other to mainstream vampire demonstrates how mainstream monsters are increasingly assimilating into mortal lifestyles with trappings that many viewers may find appealing. The same shifts in theme and representation also reveal that humans are frequently cast as mundane and unappealing in contemporary vampire narratives.

Relevance:

10.00%

Publisher:

Abstract:

Basing signature schemes on strong lattice problems has been a long-standing open issue. Today, two families of lattice-based signature schemes are known: those based on the hash-and-sign construction of Gentry et al., and Lyubashevsky’s schemes, which are based on the Fiat-Shamir framework. In this paper we show for the first time how to adapt the schemes of Lyubashevsky to the ring signature setting. In particular, we transform the scheme of ASIACRYPT 2009 into a ring signature scheme that provides strong security properties in the random oracle model. Anonymity is ensured in the sense that signatures of different users are within negligible statistical distance even under full key exposure. In fact, the scheme satisfies a notion stronger than the classical full key exposure setting: even if the key pair of the signing user is adversarially chosen, the statistical distance between signatures of different users remains negligible. Considering unforgeability, the best lattice-based ring signature schemes provide unforgeability either against arbitrary chosen-subring attacks or against insider corruption in log-sized rings. In this paper we present two variants of our scheme. In the basic one, unforgeability is ensured in both of those settings. By increasing signature and key sizes by a factor k (typically 80–100), we obtain a variant in which unforgeability is ensured against insider corruption attacks for arbitrary rings. The technique used is quite general and can be adapted to other existing schemes.
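
At the core of Lyubashevsky's Fiat-Shamir approach is rejection sampling: the response z = y + c·s is resampled until it lands in a range independent of the secret s, which is also what makes signatures of different signers statistically close. Below is a toy sketch of that single step, with invented parameters and none of the real scheme's rings, lattices or hash challenge.

```python
import numpy as np

rng = np.random.default_rng(2)

n, B, k = 64, 1000, 1          # dimension, masking bound, bound on |c*s|
s = rng.integers(-1, 2, n)     # toy secret with coefficients in {-1, 0, 1}

def sign_response(c):
    """Rejection-sampling step at the heart of Lyubashevsky-style
    (Fiat-Shamir) lattice signatures: retry until z = y + c*s lies in
    a box that does not depend on s, so z leaks nothing about s."""
    while True:
        y = rng.integers(-(B + k), B + k + 1, n)   # uniform masking vector
        z = y + c * s
        if np.max(np.abs(z)) <= B:                 # accept only "safe" z
            return z

z = sign_response(c=1)
print(np.max(np.abs(z)) <= B)  # True: accepted z is uniform on [-B, B]^n
```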

Relevance:

10.00%

Publisher:

Abstract:

As the all-atom molecular dynamics method is limited by its enormous computational cost, various coarse-grained strategies have been developed to extend the accessible length scales of soft matter in the modeling of mechanical behaviors. However, classical thermostat algorithms in highly coarse-grained molecular dynamics underestimate the thermodynamic behaviors of soft matter (e.g. microfilaments in cells), which can weaken the ability of the modeled material to overcome local energy traps in granular modeling. Based on all-atom molecular dynamics modeling of microfilament fragments (G-actin clusters), a new stochastic thermostat algorithm is developed that retains the representation of the thermodynamic properties of microfilaments at an extra coarse-grained level. The accuracy of this stochastic thermostat algorithm is validated against all-atom MD simulation. The new algorithm provides an efficient way to investigate the thermomechanical properties of large-scale soft matter.
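
The paper's specific algorithm is not reproduced here; for orientation only, the sketch below is a minimal generic stochastic (Langevin) thermostat for a single coarse-grained bead in reduced units, the standard baseline that such algorithms refine. The harmonic force and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Generic Langevin thermostat for one coarse-grained bead (reduced units).
# NOT the authors' algorithm, which is tailored to extra coarse-grained
# filaments; this only illustrates what a stochastic thermostat does.
kT, gamma, m, dt, steps = 1.0, 1.0, 1.0, 0.01, 200_000

def force(x):
    return -x            # harmonic trap standing in for bonded forces

sigma = np.sqrt(2.0 * gamma * kT / m * dt)   # fluctuation-dissipation
x, v, v2 = 0.0, 0.0, []
for _ in range(steps):
    # Euler-Maruyama step: deterministic drag plus random kicks keep
    # the bead's kinetic energy fluctuating around the target kT.
    v += (force(x) / m - gamma * v) * dt + sigma * rng.standard_normal()
    x += v * dt
    v2.append(v * v)

# Equipartition check: <m v^2> should approach kT
print(f"<m v^2> = {m * np.mean(v2[steps // 10:]):.3f} (target {kT})")
```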

Relevance:

10.00%

Publisher:

Abstract:

Commercially available generic Superglue (cyanoacrylate glue) can be used as an alternative mounting medium for stained resin-embedded semithin sections. It is colourless and contains a volatile, quick-setting solvent that produces permanent mounts of semithin sections for immediate inspection under the light microscope. Here, we compare the use of cyanoacrylate glue for mounting semithin sections with classical dibutyl phthalate xylene (DPX) in terms of practical usefulness, effectiveness and the quality of the final microscopic image.

Relevance:

10.00%

Publisher:

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines, in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From the computational point of view, the mapper/reducer placement problem is a generalisation of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mapper/reducer placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement that puts a fixed number of mappers/reducers on each machine. The comparison shows that the computation using our mapper/reducer placement is much cheaper than the computation using the conventional placement while still satisfying the computation deadline.
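
To make the bin-packing view concrete, here is the classical first-fit-decreasing heuristic under the simplest reading: tasks are items with a resource demand and machines are bins with a fixed capacity. This is not the paper's algorithm (which also accounts for cost and deadlines); the demands and capacity below are invented.

```python
def first_fit_decreasing(task_sizes, capacity):
    """First-fit-decreasing heuristic for the bin-packing view of
    mapper/reducer placement: tasks are items (resource demand),
    machines are bins (capacity). Shown for illustration only; the
    paper proposes its own, more refined heuristic."""
    machines = []                     # remaining capacity per machine
    assignment = []                   # (task_size, machine_index)
    for size in sorted(task_sizes, reverse=True):
        for i, free in enumerate(machines):
            if size <= free:          # first machine that still fits
                machines[i] -= size
                assignment.append((size, i))
                break
        else:                         # no fit: open a new machine
            machines.append(capacity - size)
            assignment.append((size, len(machines) - 1))
    return len(machines), assignment

# Hypothetical mapper/reducer resource demands, machine capacity 1.0
demands = [0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5, 0.1]
n_machines, placement = first_fit_decreasing(demands, capacity=1.0)
print(n_machines, placement)
```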