Abstract:
Mathematics is perceived as the core area of learning in most educational systems around the world, including Sri Lanka. Unfortunately, it is clear that a majority of Sri Lankan students fail in basic mathematics when marks from the recent grade five scholarship examination and the ordinary level examination are analysed. According to the Department of Examinations, Sri Lanka, on average over 88 percent of students fail the grade 5 scholarship examination, in which mathematics plays a major role, while about 50 percent of students fail their ordinary level mathematics examination. Poor or missing basic mathematics skills have been identified as the root cause.
Abstract:
Background: Biochemical systems with relatively low numbers of components must be simulated stochastically in order to capture their inherent noise. Although there has recently been considerable work on discrete stochastic solvers, there is still a need for numerical methods that are both fast and accurate. The Bulirsch-Stoer method is an established method for solving ordinary differential equations that possesses both of these qualities. Results: In this paper, we present the Stochastic Bulirsch-Stoer method, a new numerical method for simulating discrete chemical reaction systems, inspired by its deterministic counterpart. It achieves excellent efficiency because it is based on an approach with high deterministic order, allowing for larger stepsizes and leading to fast simulations. We compare it to the Euler τ-leap, as well as two more recent τ-leap methods, on a number of example problems, and find that as well as being very accurate, our method is the most robust, in terms of efficiency, of all the methods considered in this paper. The problems it is most suited for are those with increased populations that would be too slow to simulate using Gillespie’s stochastic simulation algorithm. For such problems, it is likely to achieve higher weak order in the moments. Conclusions: The Stochastic Bulirsch-Stoer method is a novel stochastic solver that can be used for fast and accurate simulations. Crucially, compared to other similar methods, it better retains its high accuracy when the timesteps are increased. Thus the Stochastic Bulirsch-Stoer method is both computationally efficient and robust. These are key properties for any stochastic numerical method, as many thousands of simulation runs are typically required.
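The Euler τ-leap that the abstract benchmarks against can be sketched in a few lines: over a step of length τ, each reaction channel j fires a Poisson-distributed number of times with mean a_j(x)·τ, where a_j is the channel's propensity. The birth-death system, rate constants, and step size below are illustrative assumptions for the sketch, not an example taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical birth-death system: 0 -> X at rate k1, X -> 0 at rate k2*X.
# These rate constants are made up for illustration.
k1, k2 = 10.0, 0.1

def propensities(x):
    return np.array([k1, k2 * x])

stoich = np.array([+1, -1])  # state change caused by each reaction channel

def euler_tau_leap(x0, tau, n_steps):
    x = x0
    for _ in range(n_steps):
        a = propensities(x)
        # Each channel fires a Poisson(a_j * tau) number of times in one leap
        firings = rng.poisson(a * tau)
        # Clamp at zero so the approximate leap cannot produce negative counts
        x = max(0, x + int(stoich @ firings))
    return x

x_end = euler_tau_leap(x0=50, tau=0.01, n_steps=1000)
```

The clamp at zero is a crude guard against negative populations; the higher-order τ-leap variants the paper compares against handle stepsize selection and accuracy more carefully.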
Abstract:
Ordinal qualitative data are often collected for phenotypical measurements in plant pathology and other biological sciences. Statistical methods such as t tests or analysis of variance are usually used to analyze ordinal data when comparing two or more groups. However, the underlying assumptions, such as normality and homogeneous variances, are often violated for qualitative data. To this end, we investigated an alternative methodology, rank regression, for analyzing ordinal data. The rank-based methods are essentially based on pairwise comparisons and can therefore deal with qualitative data naturally. They require neither a normality assumption nor data transformation. Apart from robustness against outliers and high efficiency, rank regression can also incorporate covariate effects in the same way as ordinary regression. By reanalyzing a data set from a wheat Fusarium crown rot study, we illustrated the use of the rank regression methodology and demonstrated that rank regression models appear to be more appropriate and sensible for analyzing nonnormal data and data with outliers.
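A common rank regression estimator of the kind the abstract describes is Jaeckel's dispersion fit with Wilcoxon scores, which minimizes a rank-weighted sum of residuals. The sketch below uses simulated heavy-tailed data, not the wheat Fusarium crown rot data, and is a minimal illustration rather than the authors' exact procedure.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import rankdata

def jaeckel_dispersion(beta, X, y):
    # Jaeckel's dispersion with Wilcoxon scores:
    # D(beta) = sum_i a(R(e_i)) * e_i, where a(i) = sqrt(12) * (i/(n+1) - 1/2)
    e = y - X @ beta
    n = len(e)
    scores = np.sqrt(12.0) * (rankdata(e) / (n + 1) - 0.5)
    return np.sum(scores * e)

def rank_fit(X, y):
    # The dispersion is invariant to intercept shifts, so estimate the
    # slopes first, then take the median residual as the intercept.
    res = minimize(jaeckel_dispersion, x0=np.zeros(X.shape[1]),
                   args=(X, y), method="Nelder-Mead")
    slopes = res.x
    intercept = np.median(y - X @ slopes)
    return intercept, slopes

# Simulated data with heavy-tailed (outlier-prone) noise; true model y = 2 + 3x
rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 1))
y = 2.0 + 3.0 * X[:, 0] + rng.standard_t(df=3, size=n)

intercept, slopes = rank_fit(X, y)
```

Because the fit depends on the data only through residual ranks, a handful of extreme outliers moves the estimate far less than it would move an ordinary least squares fit.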
Abstract:
The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. 
Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of the election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes. 
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets) which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election which will provide detailed new insights into social media use in contemporary elections. This analysis will then also be able to serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.
Abstract:
Quasi-likelihood (QL) methods are often used to account for overdispersion in categorical data. This paper proposes a new way of constructing a QL function that stems from the conditional mean-variance relationship. Unlike traditional QL approaches to categorical data, this QL function is, in general, not a scaled version of the ordinary log-likelihood function. A simulation study is carried out to examine the performance of the proposed QL method. Fish mortality data from quantal response experiments are used for illustration.
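A traditional quasi-likelihood treatment of overdispersed grouped binary data, the baseline the proposed method departs from, introduces a dispersion factor φ under Var(y) = φ·m·p(1-p) and estimates it from Pearson residuals. The sketch below fits a binomial GLM by iteratively reweighted least squares on simulated data (not the fish mortality data) and estimates φ; the paper's own QL construction differs, so this only illustrates the conventional approach it is contrasted with.

```python
import numpy as np

# Simulated grouped binomial data: y successes out of m trials per group,
# with a logistic dose-response curve. All values are made up.
rng = np.random.default_rng(2)
m = np.full(20, 50)                          # trials per group
x = np.linspace(-2.0, 2.0, 20)
p_true = 1.0 / (1.0 + np.exp(-(0.5 + 1.2 * x)))
y = rng.binomial(m, p_true)

X = np.column_stack([np.ones_like(x), x])
beta = np.zeros(2)
for _ in range(25):                          # IRLS for the binomial GLM
    eta = X @ beta
    p = 1.0 / (1.0 + np.exp(-eta))
    W = m * p * (1.0 - p)                    # working weights
    z = eta + (y - m * p) / W                # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

p_hat = 1.0 / (1.0 + np.exp(-(X @ beta)))
pearson = np.sum((y - m * p_hat) ** 2 / (m * p_hat * (1.0 - p_hat)))
phi_hat = pearson / (len(y) - X.shape[1])    # ~1 here, since data are binomial
```

With genuinely overdispersed data, phi_hat would exceed 1, and standard errors from the GLM fit would be inflated by sqrt(phi_hat).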
Abstract:
Over the last two decades, there has been an increasing awareness of, and interest in, the use of spatial moment techniques to provide insight into a range of biological and ecological processes. Models that incorporate spatial moments can be viewed as extensions of mean-field models. These mean-field models often consist of systems of classical ordinary differential equations and partial differential equations, whose derivation, at some point, hinges on the simplifying assumption that individuals in the underlying stochastic process encounter each other at a rate that is proportional to the average abundance of individuals. This assumption has several implications, the most striking of which is that mean-field models essentially neglect any impact of the spatial structure of individuals in the system. Moment dynamics models extend traditional mean-field descriptions by accounting for the dynamics of pairs, triples and higher n-tuples of individuals. This means that moment dynamics models can, to some extent, account for how the spatial structure affects the dynamics of the system in question.
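A minimal example of the mean-field style of model that the abstract contrasts with moment dynamics is the logistic ordinary differential equation, in which individuals encounter each other at a rate proportional to average abundance. The growth rate and carrying capacity below are illustrative assumptions, not parameters from any particular study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Mean-field logistic model: dN/dt = r * N * (1 - N / K). Spatial structure
# is ignored; only the average abundance N enters the encounter rate.
r, K = 1.0, 100.0

def mean_field(t, N):
    return r * N * (1.0 - N / K)

sol = solve_ivp(mean_field, t_span=(0.0, 20.0), y0=[5.0])
N_final = sol.y[0, -1]  # approaches the carrying capacity K
```

A moment dynamics extension would track, in addition to this mean, the dynamics of pair densities, so that local crowding or segregation can feed back into the encounter rate.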
Abstract:
Previous research identifies various reasons companies invest in information technology (IT), often as a means to generate value. To add to the discussion of IT value generation, this study investigates investments in enterprise software systems that support business processes. Managers of more than 500 Swiss small and medium-sized enterprises (SMEs) responded to a survey regarding the levels of their IT investment in enterprise software systems and the perceived utility of those investments. The authors use logistic and ordinary least squares regression to examine whether IT investments in two business processes affect SMEs' performance and competitive advantage. Using cluster analysis, they also develop a firm typology with four distinct groups that differ in their investments in enterprise software systems. These findings offer key implications for both research and managerial practice.
Abstract:
This paper argues that the Panopticon is an accurate model for and illustration of policing and security methods in the modern society. Initially, I overview the theoretical concept of the Panopticon as a structure of perceived universal surveillance which facilitates automatic obedience in its subjects as identified by the theorists Jeremy Bentham and Michel Foucault. The paper subsequently moves to identify how the Panopticon, despite being a theoretical construct, is nevertheless instantiated to an extent through the prevalence of security cameras as a means of sovereignly regulating human conduct; speeding is an ordinary example. It could even be contended that increasing surveillance according to the model of the Panopticon would reduce the frequency of offences. However, in the final analysis the paper considers that even if adopting an approach based on the Panopticon is a more effective method of policing, it is not necessarily a more desirable one.
Scopophobia/Scopophilia: electric light and the anxiety of the gaze in postwar American architecture
Abstract:
In the years of reconstruction and economic boom that followed the Second World War, the domestic sphere encountered new expectations regarding social behaviour, modes of living, and forms of dwelling. This book brings together an international group of scholars from architecture, design, urban planning, and interior design to reappraise mid-twentieth century modern life, offering a timely reassessment of culture and the economic and political effects on civilian life. This collection contains essays that examine the material of art, objects, and spaces in the context of practices of dwelling over the long span of the postwar period. It asks what role material objects, interior spaces, and architecture played in quelling or fanning the anxieties of modernism’s ordinary denizens, and how this role informs their legacy today.
Table of Contents:
Introduction (Robin Schuldenfrei)
Part 1: Psychological Constructions: Anxiety of Isolation and Exposure
1. Taking Comfort in the Age of Anxiety: Eero Saarinen’s Womb Chair (Cammie McAtee)
2. The Future is Possibly Past: The Anxious Spaces of Gaetano Pesce (Jane Pavitt)
3. Scopophobia/Scopophilia: Electric Light and the Anxiety of the Gaze in American Postwar Domestic Architecture (Margaret Petty)
Part 2: Ideological Objects: Design and Representation
4. The Allegory of the Socialist Lifestyle: The Czechoslovak Pavilion at the Brussels Expo, its Gold Medal and the Politburo (Ana Miljacki)
5. Assimilating Unease: Moholy-Nagy and the Wartime-Postwar Bauhaus in Chicago (Robin Schuldenfrei)
6. The Anxieties of Autonomy: Peter Eisenman from Cambridge to House VI (Sean Keller)
Part 3: Societies of Consumers: Materialist Ideologies and Postwar Goods
7. "But a home is not a laboratory": The Anxieties of Designing for the Socialist Home in the German Democratic Republic 1950–1965 (Katharina Pfützner)
8. Architect-designed Interiors for a Culturally Progressive Upper-Middle Class: The Implicit Political Presence of Knoll International in Belgium (Fredie Floré)
9. Domestic Environment: Italian Neo-Avant-Garde Design and the Politics of Post-Materialism (Mary Louise Lobsinger)
Part 4: Class Concerns and Conflict: Dwelling and Politics
10. Dirt and Disorder: Taste and Anxiety in the Working Class Home (Christine Atha)
11. Upper West Side Stories: Race, Liberalism, and Narratives of Urban Renewal in Postwar New York (Jennifer Hock)
12. Pawns or Prophets? Postwar Architects and Utopian Designs for Southern Italy (Anne Parmly Toxey)
Coda: From Homelessness to Homelessness (David Crowley)
Abstract:
ABORIGINAL people in urbanised Australia experience violence on a daily basis. This violence ranges from the psychological (the covert hostility of the corner shop, the denial of the Aboriginality of fair-skinned or urban Blacks) through to the physical brutality of the criminal justice system. For Aboriginal women and children this daily violence is not only public but also has a private, Black-on-Black dimension. The Aboriginal home may be some refuge from the slights of white Australia but this is cold comfort to women for whom being 'flogged up' by their partners is so ordinary an event as to be unremarkable.
Abstract:
The quality of short-term electricity load forecasting is crucial to the operation and trading activities of market participants in an electricity market. In this paper, it is shown that a multiple equation time-series model, which is estimated by repeated application of ordinary least squares, has the potential to match or even outperform more complex nonlinear and nonparametric forecasting models. The key ingredient of the success of this simple model is the effective use of lagged information by allowing for interaction between seasonal patterns and intra-day dependencies. Although the model is built using data for the Queensland region of Australia, the method is completely generic and applicable to any load forecasting problem. The model’s forecasting ability is assessed by means of the mean absolute percentage error (MAPE). For day-ahead forecast, the MAPE returned by the model over a period of 11 years is an impressive 1.36%. The forecast accuracy of the model is compared with a number of benchmarks including three popular alternatives and one industrial standard reported by the Australia Energy Market Operator (AEMO). The performance of the model developed in this paper is superior to all benchmarks and outperforms the AEMO forecasts by about a third in terms of the MAPE criterion.
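The mean absolute percentage error used as the accuracy criterion above is straightforward to compute. The load values below are made-up round numbers for illustration, not AEMO or Queensland data.

```python
import numpy as np

def mape(actual, forecast):
    # Mean absolute percentage error, expressed in percent
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Illustrative half-hourly load values (MW); each forecast is off by 1 percent
actual = np.array([5000.0, 5200.0, 5100.0])
forecast = np.array([4950.0, 5252.0, 5049.0])
err = mape(actual, forecast)  # ≈ 1.0 percent
```

A day-ahead MAPE of 1.36% over 11 years, as the abstract reports, means the forecast misses the actual load by well under 100 MW on average at typical Queensland demand levels.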