936 results for Best evidence rule


Relevance:

30.00%

Publisher:

Abstract:

The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of the election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes. 
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets), which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election, yielding detailed new insights into social media use in contemporary elections. This analysis can then also serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.
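As a minimal illustration of the tweeting-style breakdown described above, the sketch below classifies tweets as retweets, @replies, or original messages and computes each account's style mix. The dict fields ('account', 'text') and the prefix heuristics are assumptions for illustration; the chapter's custom-designed analytics tools are not reproduced here.

```python
# A minimal sketch, assuming tweets arrive as dicts with hypothetical
# 'account' and 'text' fields; simple prefix rules stand in for the
# chapter's actual classification pipeline.
from collections import Counter, defaultdict

def classify(text):
    """Bucket a tweet as a retweet, @reply, or original message."""
    if text.startswith("RT @"):
        return "retweet"
    if text.startswith("@"):
        return "@reply"
    return "original"

def style_mix(tweets):
    """Return, per account, the share of retweets, @replies and originals."""
    counts = defaultdict(Counter)
    for t in tweets:
        counts[t["account"]][classify(t["text"])] += 1
    return {
        acct: {kind: n / sum(c.values()) for kind, n in c.items()}
        for acct, c in counts.items()
    }

sample = [
    {"account": "@BarackObama", "text": "Four more years."},
    {"account": "@MittRomney", "text": "RT @TeamRomney: ..."},
]
print(style_mix(sample))
```

A mix dominated by originals would indicate broadcast-style use; a large @reply share would indicate engagement with everyday followers.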

Relevance:

30.00%

Publisher:

Abstract:

Rule 478 of the Uniform Civil Procedure Rules 1999 (Qld), which provides for a view by the court, is silent as to the manner in which a court might be expected to exercise the discretion to order an inspection or demonstration under the rule, and also as to the use which may be made of any inspection or demonstration ordered. The decision in Matton Developments Pty Ltd v CGU Insurance Limited [2014] QSC 256 provides guidance on both matters, including the circumstances in which a court may exercise its discretion to order a view or demonstration.

Relevance:

30.00%

Publisher:

Abstract:

This cross-sectional study analyzed psychological well-being at school using Self-Determination Theory as a theoretical framework. The study explored basic psychological needs fulfillment (BPNS), academic (SRQ-A) and prosocial (SRQ-P) self-regulation, and motivation, and their relationship with achievement in general, special and selective education (N=786, 444 boys, 345 girls, mean age 12 yrs 8 mths). Motivation starts behavior, which then becomes guided by self-regulation. The perceived locus of causality (PLOC) affects how self-determined this behavior will be; in other words, to what extent it is autonomously regulated. In order to learn, and thus to be able to accept external goals, a student has to feel emotionally safe and have sufficient ego-flexibility, all of which builds on satisfied psychological needs. In this study those conditions were explored. In addition to traditional methods, self-organizing maps (SOM) were used to cluster the students according to their well-being, self-regulation, motivation and achievement scores. The main contributions of this research were the presentation of a theory-based alternative for studying psychological well-being at school and the use of both variable- and person-oriented approaches. In this Finnish sample the results showed that the majority of students felt well, but well-being varied by group. Overall, the basic needs of about 11–15% of students were deprived, depending on the educational group. Age and educational group were the most influential factors; gender was important in relation to prosocial identified behavior. Although the person-oriented SOM approach largely confirmed what was noticed by comparing the variables (the SEN groups had lower levels of basic needs fulfillment and less autonomous self-regulation), interesting deviations from that rule appeared. Some of the SEL- and GEN-group members ended up in the more unfavorable SOM clusters, and not all SEN-group members belonged to the poorest clusters (although not to the best either). This evidence refines the well-being and self-regulation picture, may redirect intervention plans, and turns our focus also to students who might otherwise remain unnoticed. These results also imply that in special education groups the average is not the whole truth. On the basis of theoretical and empirical considerations, an intervention model was suggested. The aim of the model was to shift amotivation or external motivation in a more intrinsic direction. According to the theoretical and empirical evidence this can be achieved first by studying the student's self-concept, and then trying to affect both inner and environmental factors, including consideration of the basic psychological needs. Keywords: academic self-regulation, prosocial self-regulation, basic psychological needs, motivation, achievement
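As a toy sketch of the person-oriented SOM step, the snippet below clusters standardized student profiles with a small self-organizing map using the third-party minisom package; the input data, map size, and training settings are illustrative assumptions, not the study's actual configuration.

```python
# A minimal sketch, assuming z-scored profiles on four dimensions
# (needs fulfillment, self-regulation, motivation, achievement).
import numpy as np
from minisom import MiniSom  # pip install minisom

rng = np.random.default_rng(0)
scores = rng.normal(size=(786, 4))  # hypothetical standardized profiles

som = MiniSom(4, 4, input_len=4, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(scores, num_iteration=5000)

# Each student is assigned to their best-matching map unit (cluster),
# which can then be inspected for favorable vs. unfavorable profiles.
clusters = [som.winner(row) for row in scores]
print(clusters[:5])
```

Cross-tabulating cluster membership against educational group (GEN, SEL, SEN) is what reveals the deviations from the group-average picture described above.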

Relevance:

30.00%

Publisher:

Abstract:

Regional and remote Indigenous students are underrepresented in both higher education and vocational education and training. Enabling education courses are important in lifting participation rates and potentially in encouraging mobility between the sectors, yet there is a clear lack of evidence underpinning their development. This report provides an overview of the data collection and analysis activities undertaken via a research project funded by the National Centre for Student Equity in Higher Education. The project's purpose was to explore current practices in Indigenous enabling courses, particularly in the context of regional, dual-sector universities. In particular, the project examined how these programs vary by institution (and region) in terms of the structure, mode and ethos of their offering, and the direct and indirect impacts of these initiatives on Indigenous student participation and attainment, with a view to designing a best practice framework and implementation statement. Through its attention to students accessing Indigenous and mainstream enabling education, the project covered a range of equity groups, including those of low socio-economic status (both school leaver and mature-age categories), regional and/or remote students, Indigenous students, and students with disability.

Relevance:

30.00%

Publisher:

Abstract:

This thesis examines whether the rules of evidence, which developed around paper documents over centuries, are adequate for the authentication of electronic evidence. The history of documentary evidence is examined, and the nature of electronic evidence is explored, particularly recent types of electronic evidence such as social media and 'the Cloud'. The old rules are then critically applied to the varied types of electronic evidence to determine whether or not they are indeed adequate.

Relevance:

30.00%

Publisher:

Abstract:

The nanoindentation technique was employed to measure the changes in mechanical properties of a glass preform subjected to different levels of UV exposure. The results reveal that short-term exposure leads to an appreciable increase in the Young's modulus (E), suggesting densification of the glass and confirming the compaction-densification model. However, on prolonged exposure, E decreases, which provides what we believe to be the first direct evidence of dilation in the glass leading into the Type IIA regime. The present results rule out the hypothesis that continued exposure leads to an irreversible compaction and prove that index modulation regimes are intrinsic to the glass matrix.
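For context, nanoindentation studies commonly extract E from the unloading stiffness via the Oliver-Pharr analysis; the relations below are the standard textbook form, offered as background rather than as the paper's stated procedure.

```latex
% Standard Oliver-Pharr relations for extracting Young's modulus E from
% the unloading segment of a nanoindentation curve (background sketch;
% the paper's exact analysis is not given in the abstract):
E_r = \frac{\sqrt{\pi}}{2\beta} \, \frac{S}{\sqrt{A_c}},
\qquad
\frac{1}{E_r} = \frac{1 - \nu^2}{E} + \frac{1 - \nu_i^2}{E_i}
```

Here S is the measured unloading stiffness, A_c the projected contact area, β a geometry factor near unity, and (ν, E) and (ν_i, E_i) the Poisson's ratio and modulus of the sample and indenter, respectively. A rise or fall in the extracted E then signals densification or dilation of the glass.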

Relevance:

30.00%

Publisher:

Abstract:

Motivated by a problem from fluid mechanics, we consider a generalization of the standard curve shortening flow problem for a closed embedded plane curve such that the area enclosed by the curve is forced to decrease at a prescribed rate. Using formal asymptotic and numerical techniques, we derive possible extinction shapes as the curve contracts to a point, dependent on the rate of decreasing area; we find there is a wider class of extinction shapes than for standard curve shortening, for which initially simple closed curves are always asymptotically circular. We also provide numerical evidence that self-intersection is possible for non-convex initial conditions, distinguishing between pinch-off and coalescence of the curve interior.
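To make the generalization concrete: under standard curve shortening, the area enclosed by a simple closed curve decreases at the universal rate 2π, so prescribing a different rate requires modifying the flow. The display below sketches one natural way to do this, by adding a spatially uniform normal speed; it is a plausible formulation consistent with the abstract, not necessarily the paper's exact equations.

```latex
% Standard curve shortening: inward normal velocity equal to curvature,
% which forces a fixed rate of area loss for a simple closed curve:
v_n = \kappa
\quad\Longrightarrow\quad
\frac{dA}{dt} = -\oint_\gamma \kappa \, ds = -2\pi .

% One way to prescribe the rate of area loss R(t) (a sketch): add a
% uniform normal speed \lambda(t) along the curve,
v_n = \kappa - \lambda(t),
\qquad
\lambda(t) = \frac{2\pi - R(t)}{L(t)},
\quad\text{giving}\quad
\frac{dA}{dt} = -R(t)
```

Here L(t) is the length of the curve; λ(t) vanishes when R = 2π, recovering the standard flow, while other choices of R(t) alter the balance between curvature-driven motion and uniform contraction and hence the possible extinction shapes.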

Relevance:

30.00%

Publisher:

Abstract:

Every year, approximately 62 000 people with stroke and transient ischemic attack are treated in Canadian hospitals, and the evidence suggests one-third or more will experience vascular cognitive impairment and/or intractable fatigue, either alone or in combination. The 2015 update of the Canadian Stroke Best Practice Recommendations: Mood, Cognition and Fatigue Module guideline is a comprehensive summary of current evidence-based recommendations for clinicians in a range of settings who provide care to patients following stroke. The three consequences of stroke that are the focus of this guideline (poststroke depression, vascular cognitive impairment, and fatigue) have high incidence rates and a significant impact on the lives of people who have had a stroke; they impede recovery and result in worse long-term outcomes. Significant practice variations and gaps in the research evidence have been reported for initial screening and in-depth assessment of stroke patients for these conditions. Also of concern, an increased number of family members and informal caregivers may also experience depressive symptoms in the poststroke recovery phase, which further impacts patient recovery. These factors emphasize the need for a system of care that ensures screening occurs as a standard and consistent component of clinical practice across settings as stroke patients transition from acute care to active rehabilitation and reintegration into their community. Additionally, building system capacity to ensure access to appropriate specialists for treatment and ongoing management of stroke survivors with these conditions remains a great challenge.

Relevance:

30.00%

Publisher:

Abstract:

This chapter traces the history of evidence-based practice (EBP) from its roots in evidence-based medicine to contemporary thinking about its usefulness to public health practice. It defines EBP and differentiates it from 'evidence-based medicine', 'evidence-based policy' and 'evidence-based healthcare'. As it is important to understand the subjective nature of knowledge and the research process, this chapter describes the nature and production of knowledge. The chapter considers the skills necessary for EBP, the processes of obtaining the relevant evidence, the barriers and facilitators to identifying and implementing 'best practice', and when EBP is appropriate to use. It discusses the limitations of EBP and the use of other information sources to guide practice, and concludes with the application of evidence to guide policy and practice.

Relevance:

30.00%

Publisher:

Abstract:

Ever since its introduction some fifty years ago, the rational expectations paradigm has dominated the way economic theory handles uncertainty. The main assertion made by John F. Muth (1961), seen by many as the father of the paradigm, is that the expectations of rational economic agents should essentially be equal to the predictions of the relevant economic theory, since rational agents should use the information available to them in an optimal way. This assumption often has important consequences for the results and interpretations of the models where it is applied. Although the rational expectations assumption can be applied to virtually any economic theory, the focus in this thesis is on macroeconomic theories of consumption, especially the Rational Expectations–Permanent Income Hypothesis proposed by Robert E. Hall in 1978. The much-debated theory suggests that, assuming agents have rational expectations about their future income, consumption decisions should follow a random walk, and the best forecast of the future consumption level is the current consumption level; changes in consumption are then unforecastable. This thesis constructs an empirical test of the Rational Expectations–Permanent Income Hypothesis using Finnish Consumer Survey data as well as various Finnish macroeconomic data. The data sample covers the years 1995–2010. Consumer survey data may be interpreted as directly representing household expectations, which makes it an interesting tool for this particular test. The variable to be predicted is the growth of total household consumption expenditure. The main empirical result is that the Consumer Confidence Index (CCI), a balance figure computed from the most important consumer survey responses, does have statistically significant predictive power over the change in total consumption expenditure. The history of consumption expenditure growth itself, however, fails to predict its own future values. This indicates that the CCI contains some information that the history of consumption decisions does not, and that consumption decisions are not optimal in the theoretical sense. However, when conditioned on various macroeconomic variables, the CCI loses its predictive ability. This finding suggests that the index is merely a (partial) summary of macroeconomic information, and does not contain any significant private information on households' consumption intentions that is not directly deducible from the objective economic variables. In conclusion, the Rational Expectations–Permanent Income Hypothesis is strongly rejected by the empirical results in this thesis. This result is in accordance with most earlier studies conducted on the topic.
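A minimal sketch of the orthogonality test implied by the hypothesis is given below: under REPIH, no lagged information should predict consumption growth. The column names ('dc' for consumption expenditure growth, 'cci' for the Consumer Confidence Index) and the bare OLS specification are assumptions for illustration; the thesis's exact specification and macroeconomic controls are not reproduced.

```python
# A minimal sketch, assuming a DataFrame with hypothetical columns
# 'dc' (consumption growth) and 'cci' (Consumer Confidence Index).
import pandas as pd
import statsmodels.api as sm

def repih_test(df: pd.DataFrame):
    """Regress consumption growth on lagged CCI and lagged growth."""
    y = df["dc"].iloc[1:]
    X = df[["cci", "dc"]].shift(1).iloc[1:]
    X.columns = ["cci_lag", "dc_lag"]
    X = sm.add_constant(X)
    # Under REPIH, neither lagged regressor should be significant.
    return sm.OLS(y, X).fit()

# result = repih_test(quarterly_df)  # hypothetical quarterly DataFrame
# print(result.summary())
```

A significant coefficient on the lagged CCI rejects the random-walk implication of Hall (1978), which is the pattern the thesis reports for the Finnish data before conditioning on macroeconomic variables.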

Relevance:

30.00%

Publisher:

Abstract:

Variable-temperature X-ray diffraction studies of C70 suggest the occurrence of two phase transitions around 350 and 280 K, where the high-temperature phase is fcc and the low-temperature phase is monoclinic, best described as a distorted hcp structure with a doubled unit cell; two similar phases (possibly hcp) seem to coexist in the 280-350 K range. Application of pressure gives rise to three distinct transitions associated with characteristic pressure coefficients, the extrapolated values of the transition temperatures at ambient pressure being around 340, 325 and 270 K. Pressure delineates closely related phases of C70, just as in the case of C60, which exhibits two orientational phase transitions at high pressures.

Relevance:

30.00%

Publisher:

Abstract:

Many studies of reaching and pointing have shown significant spatial and temporal correlations between eye and hand movements. Nevertheless, it remains unclear whether these correlations are incidental, arising from common inputs (independent model); whether they represent an interaction between otherwise independent eye and hand systems (interactive model); or whether they arise from a single dedicated eye-hand system (common command model). Subjects were instructed to redirect gaze and pointing movements in a double-step task in an attempt to decouple eye-hand movements and causally distinguish between the three architectures. We used a drift-diffusion framework in the context of a race model, which has previously been used to explain redirect behavior for eye and hand movements separately, to predict the pattern of eye-hand decoupling. We found that the common command architecture could best explain the observed frequency of different eye and hand response patterns to the target step. A common stochastic accumulator for eye-hand coordination also predicts comparable variances despite a significant difference in the means of the eye and hand reaction time (RT) distributions, a prediction which we tested. Consistent with this prediction, we observed that the variances of the eye and hand RTs were similar, despite much larger hand RTs (by ~90 ms). Moreover, changes in mean eye RTs, which also increased eye RT variance, produced a similar increase in the mean and variance of the associated hand RT. Taken together, these data suggest that a dedicated circuit underlies coordinated eye-hand planning.
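A toy simulation of the common command idea helps show why the variance prediction follows: if a single stochastic accumulator triggers both effectors and the hand merely adds a longer, nearly fixed efferent delay, the two RT distributions share their variance while their means differ by the delay gap. All parameter values below are illustrative assumptions, not values from the paper.

```python
# A toy sketch: one shared drift-diffusion accumulator drives both the eye
# and the hand, with different fixed efferent delays. Parameters are
# illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(1)

def shared_accumulator_rt(n, drift=0.2, threshold=30.0, dt=1.0, noise=1.0):
    """First-passage times (ms) of a drift-diffusion process to threshold."""
    rts = np.empty(n)
    for i in range(n):
        x, t = 0.0, 0.0
        while x < threshold:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts[i] = t
    return rts

decision = shared_accumulator_rt(2000)
eye_rt = decision + 60.0    # illustrative efferent delays (ms)
hand_rt = decision + 150.0  # hand lags the eye by ~90 ms

# Means differ by the delay gap, but variances are identical by
# construction, mirroring the comparable eye/hand RT variances above.
print(eye_rt.mean(), hand_rt.mean(), eye_rt.var(), hand_rt.var())
```

Independent or interactive architectures, by contrast, would let the hand accumulate its own noise and thus show a larger RT variance than the eye.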

Relevance:

30.00%

Publisher:

Abstract:

Organised multilayers were formed from the controlled self-assembly of ferrocene alkyl thiols on Au(111) surfaces. The control was accomplished by increasing the concentration of the thiol solutions used for the assembly. Cyclic voltammetry, ellipsometry, scanning probe microscopy (STM and AFM) and in situ FTIR spectroscopy were used to probe the differences between mono- and multilayers of the same compounds. Electrochemical desorption studies confirmed that the multilayer structure is attached to the surface via one monolayer. The electrochemical behaviour of the multilayers indicated the presence of more than one controlling factor during the oxidation step, whereas the reduction was kinetically controlled; this contrasts with the behaviour of monolayers, which exhibit kinetic control for both the oxidation and reduction steps. Conventional and imaging ellipsometry confirmed that multilayers with well-defined increments in thickness could be produced. However, STM indicated that at the monolayer stage, the thiols used promote the mobility of Au atoms on the surface. It is very likely that the multilayer structure is held together through hydrogen bonding. To the best of our knowledge, this is the first example of a controlled one-step growth of multilayers of ferrocenyl alkyl thiols using self-assembly techniques.

Relevance:

30.00%

Publisher:

Abstract:

Using US data for the period 1967:5-2002:4, this paper empirically investigates the performance of an augmented version of the Taylor rule (ATR) that (i) allows for the presence of switching regimes, (ii) considers the long-short term spread in addition to the typical variables, (iii) uses an alternative monthly indicator of general economic activity suggested by Stock and Watson (1999), and (iv) considers interest rate smoothing. The estimation results show the existence of switching regimes, one characterized by low volatility and the other by high volatility. Moreover, the scale of the responses of the Federal funds rate to movements in the term spread, inflation and the economic activity index depends on the regime. The estimation results also show robust empirical evidence that the ATR was more stable during the term of office of Chairman Greenspan than in the pre-Greenspan period. However, a closer look at the Greenspan period shows the existence of two alternative regimes, and that the response of the Fed funds rate to inflation was not significant during this period once the term spread is considered.
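A schematic form of such a regime-switching ATR with interest rate smoothing is sketched below; the notation is generic and the paper's exact specification may differ.

```latex
% Schematic augmented Taylor rule with smoothing and a Markov-switching
% regime s_t (a sketch; not necessarily the paper's exact specification):
i_t = \rho_{s_t}\, i_{t-1} + (1 - \rho_{s_t})
      \left[ \alpha_{s_t} + \beta_{s_t}\,\pi_t + \gamma_{s_t}\, y_t
      + \delta_{s_t}\,\bigl(i^{L}_t - i^{S}_t\bigr) \right]
      + \varepsilon_t,
\qquad
\varepsilon_t \sim N\!\bigl(0, \sigma^2_{s_t}\bigr)
```

Here i_t is the Federal funds rate, π_t inflation, y_t the monthly economic activity indicator, and i^L - i^S the long-short term spread; the regime-dependent σ²_{s_t} captures the low- and high-volatility regimes the paper identifies.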

Relevance:

30.00%

Publisher:

Abstract:

There is a growing amount of experimental evidence suggesting that people often deviate from the predictions of game theory. Some scholars attempt to explain the observations by introducing errors into behavioral models. However, most of these modifications are situation dependent and do not generalize. A new theory, called the rational novice model, is introduced as an attempt to provide a general theory that takes account of erroneous behavior. The rational novice model is based on two central principles. The first is that people systematically make inaccurate guesses when they are evaluating their options in a game-like situation. The second is that people treat their decisions as a portfolio problem. As a result, actions that are non-optimal in a game-theoretic sense may be included in the rational novice strategy profile with positive weights.
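As a loose illustration of these two principles (not the thesis's formal model), the toy function below forms noisy guesses of the true payoffs and then spreads weight across actions portfolio-style, so even dominated actions receive positive probability.

```python
# A toy rendering of the two principles above: inaccurate payoff guesses
# plus a portfolio-style mix over actions. Illustrative only; the thesis's
# behavioral model and equilibrium concept are not reproduced here.
import numpy as np

rng = np.random.default_rng(2)

def rational_novice_mix(true_payoffs, guess_noise=1.0, temperature=1.0):
    """Mixed strategy built from noisy estimates of the action payoffs."""
    guesses = true_payoffs + guess_noise * rng.standard_normal(len(true_payoffs))
    w = np.exp(guesses / temperature)
    return w / w.sum()

# Unlike a best response, every action gets positive weight.
print(rational_novice_mix(np.array([1.0, 0.5, 0.0])))
```

As noise vanishes and weight concentrates on the best estimate, the mix collapses toward a best response, consistent with the Nash equilibrium arising as a special case of the rational novice equilibrium.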

The rational novice model can be divided into two parts: the behavioral model and the equilibrium concept. In a theoretical chapter, the mathematics of the behavioral model and the equilibrium concept are introduced. The existence of the equilibrium is established. In addition, the Nash equilibrium is shown to be a special case of the rational novice equilibrium. In another chapter, the rational novice model is applied to a voluntary contribution game. Numerical methods were used to obtain the solution. The model is estimated with data obtained from the Palfrey and Prisbrey experimental study of the voluntary contribution game. It is found that the rational novice model explains the data better than the Nash model. Although a formal statistical test was not used, pseudo R^2 analysis indicates that the rational novice model is better than a Probit model similar to the one used in the Palfrey and Prisbrey study.

The rational novice model is also applied to a first price sealed bid auction. Again, computing techniques were used to obtain a numerical solution. The data obtained from the Chen and Plott study were used to estimate the model. The rational novice model outperforms the CRRAM, the primary Nash model studied in the Chen and Plott study. However, the rational novice model is not the best amongst all models. A sophisticated rule-of-thumb, called the SOPAM, offers the best explanation of the data.