847 results for EPISODES
Abstract:
An episodic recreation of Hibberd's Stretch of the Imagination, presented as a performance extract as part of the Enter the New Wave Symposium at Melbourne University, September 2007.
Abstract:
Statement: Jams, Jelly Beans and the Fruits of Passion

Let us search, instead, for an epistemology of practice implicit in the artistic, intuitive processes which some practitioners do bring to situations of uncertainty, instability, uniqueness, and value conflict. (Schön 1983, p. 40)

Game On was born out of the idea of creative community: finding, networking, supporting and inspiring the people behind the face of an industry, those in the midst of the machine and those intending to join. We understood this moment to be a pivotal opportunity to nurture a new emerging form of game making, in an era of change where the old industry models were proving to be unsustainable. As soon as we started putting people into a room under pressure, to make something in 48hrs, a whole pile of evolutionary creative responses emerged. People refashioned their craft in a moment of intense creativity that demanded different ways of working, an adaptive approach to the craft of making games: small, fast, indie.

An event like the 48hrs forces participants' attention on the process as much as the outcome. As one game industry professional taking part in a challenge for the first time observed, there are three paths in the genesis from idea to finished work: the path that focuses on mechanics; the path that focuses on team structure and roles; and the path that focuses on the idea, the spirit. The more successful teams need to put the spirit of the work first and foremost. The spirit drives the adaptation; it becomes improvisation. As Schön says: "Improvisation consists in varying, combining and recombining a set of figures within the schema which bounds and gives coherence to the performance." (1983, p. 55)

This improvisational approach is all about those making the games: the people and the principles of their creative process. This documentation evidences the intensity of their passion, their determination and the shit that they are prepared to put themselves through to achieve their goal: to win a cup full of jellybeans and make a working game in 48hrs.

48hr is a project where, on all levels, analogue meets digital. This concept was further explored through the documentation process. This set of four videos was created by Cameron Owen on the fly during the challenge, using both the iPhone video camera and editing software, in order to be available with immediacy and allow the event audience to share the experience, and perhaps to give some insights into the creative process exposed by the 48 hour challenge.

____________________________
Schön, D. A. 1983, The Reflective Practitioner: How Professionals Think in Action, Basic Books, New York
Abstract:
Land-change science emphasizes the intimate linkages between the human and environmental components of land management systems. Recent theoretical developments in drylands identify a small set of key principles that can guide the understanding of these linkages. Using these principles, a detailed study of seven major degradation episodes over the past century in Australian grazed rangelands was reanalyzed to show a common set of events: (i) good climatic and economic conditions for a period, leading to local and regional social responses of increasing stocking rates, setting the preconditions for rapid environmental collapse, followed by (ii) a major drought coupled with a fall in the market making destocking financially unattractive, further exacerbating the pressure on the environment; then (iii) permanent or temporary declines in grazing productivity, depending on follow-up seasons coupled again with market and social conditions. The analysis supports recent theoretical developments but shows that the establishment of environmental knowledge that is strictly local may be insufficient on its own for sustainable management. Learning systems based in a wider community are needed that combine local knowledge, formal research, and institutional support. It also illustrates how natural variability in the state of both ecological and social systems can interact to precipitate nonequilibrial change in each other, so that planning cannot be based only on average conditions. Indeed, it is this variability in both environment and social subsystems that hinders the local learning required to prevent collapse.
Abstract:
Reading pedagogy is constantly an object of discussion and debate in contemporary policy and practice but is rarely a matter for historical inquiry. This paper reports from a recent study of the history of reading pedagogy in Australia and beyond. It focuses on a recurring figure in the historical record—the ‘reading lesson’. Presented as a distinctive trope, the reading lesson is traced in its regularity in and through the discourse of reading pedagogy, starting in 1930s Australia and moving back into 19th-century Europe, and with specific reference to the UK and the USA. Teaching reading is expressly identified as a moral project—something that, it can be argued, clearly continues into the present.
Abstract:
This paper reviews a wide range of literature on environmental management in the field in Queensland, and analyzes it by period and by author. An episodic pattern of activities since European settlement is evident. Periods of exploration (pre-1950) and inventory compilation (ca. 1950-1970) were followed by two decades of media and non-government organization campaigning (ca. 1970-1990), then an era dominated by government regulatory action (ca. 1990-2010). These eras dominated public perception of what was happening in environmental practice. They were delineated by historic 'interventions': summarily, the end of World War II, the 1971 inflationary crisis, and computerization, respectively.
Abstract:
Aerosol particles can cause detrimental environmental and health effects. The particles and their precursor gases are emitted from various anthropogenic and natural sources. It is important to know the origin and properties of aerosols to efficiently reduce their harmful effects. The diameter of aerosol particles (Dp) varies between ~0.001 and ~100 μm. Fine particles (PM2.5: Dp < 2.5 μm) are especially interesting because they are the most harmful and can be transported over long distances. The aim of this thesis is to study the impact on air quality of pollution episodes of long-range-transported aerosols, which affect the composition of the boundary-layer atmosphere in remote and relatively unpolluted regions of the world. The sources and physicochemical properties of aerosols were investigated in detail, based on various measurements (1) in southern Finland during selected long-range transport (LRT) pollution episodes and unpolluted periods and (2) over the Atlantic Ocean between Europe and Antarctica during a voyage. Furthermore, the frequency of LRT pollution episodes of fine particles in southern Finland was investigated over a period of 8 years, using long-term air quality monitoring data. In southern Finland, the annual mean PM2.5 mass concentrations were low, but LRT caused high peaks in daily mean concentrations every year. At an urban background site in Helsinki, the updated WHO guideline value (24-h PM2.5 mean of 25 μg/m3) was exceeded during 1-7 LRT episodes each year during 1999-2006. The daily mean concentrations varied between 25 and 49 μg/m3 during the episodes, 3-6 times the long-term mean concentration. The in-depth studies of selected LRT episodes in southern Finland revealed that biomass burning in agricultural fields and wildfires, occurring mainly in Eastern Europe, deteriorated air quality on a continental scale. The strongest LRT episodes of fine particles resulted from open biomass-burning fires, but emissions from other anthropogenic sources in Eastern Europe also caused significant LRT episodes. Particle mass and number concentrations increased strongly in the accumulation mode (Dp ~ 0.09-1 μm) during the LRT episodes. However, the concentrations of smaller particles (Dp < 0.09 μm) remained low or even decreased, due to the uptake of vapours and molecular clusters by LRT particles. The chemical analysis of individual particles showed that the proportions of several anthropogenic particle types (e.g. tar balls, metal oxides/hydroxides, spherical silicate fly ash particles and various calcium-rich particles) increased in southern Finland during an LRT episode, when aerosols originated from the polluted regions of Eastern Europe and some open biomass-burning smoke was also brought in by LRT. During unpolluted periods, when air masses arrived from the north, the proportions of marine aerosols increased. In unpolluted rural regions of southern Finland, both accumulation mode particles and small-sized (Dp ~ 1-3 μm) coarse mode particles originated mostly from LRT; however, the composition of particles was totally different in these two size fractions. In both size fractions, strong internal mixing of chemical components was typical for LRT particles. Thus, the aging of particles has significant impacts on their chemical, hygroscopic and optical properties, which can largely alter the environmental and health effects of LRT aerosols.
Over the Atlantic Ocean, the individual particle composition of small-sized (Dp ~ 1-3 μm) coarse mode particles was affected by continental aerosol plumes to distances of at least 100-1000 km from the coast (e.g. pollutants from industrialized Europe, desert dust from the Sahara and biomass-burning aerosols near the Gulf of Guinea). The rate of chloride depletion from sea-salt particles was high near the coasts of Europe and Africa when air masses arrived from polluted continental regions. Thus, the LRT of continental aerosols had significant impacts on the composition of the marine boundary-layer atmosphere and seawater. In conclusion, integration of the results obtained using different measurement techniques captured the large spatial and temporal variability of aerosols as observed at terrestrial and marine sites, and assisted in establishing the causal link between land-bound emissions, LRT and air quality.
Abstract:
Stroke is a major cause of death and disability, incurs significant costs to healthcare systems, and inflicts a severe burden on society as a whole. Stroke care in Finland has been described in several population-based studies between 1967 and 1998, but not since. In the PERFECT Stroke study presented here, a system for monitoring the Performance, Effectiveness, and Costs of Treatment episodes in Stroke was developed in Finland. Existing nationwide administrative registries were linked at the individual patient level with personal identification numbers to depict whole episodes of care, from acute stroke, through rehabilitation, until the patients went home, were admitted to permanent institutional care, or died. For comparisons over time and between providers, patient case-mix was adjusted for. The PERFECT Stroke database includes 104 899 first-ever stroke patients over the years 1999 to 2008, of whom 79% had ischemic stroke (IS), 14% intracerebral hemorrhage (ICH), and 7% subarachnoid hemorrhage (SAH). An 18% decrease in the age- and sex-adjusted incidence of stroke was observed over the study period, a 1.8% improvement annually. The all-cause 1-year case-fatality rate improved from 28.6% to 24.6%, or 0.5% annually. The expected median lifetime after stroke increased by 2 years for IS patients, to 7 years and 7 months, and by 1 year for ICH patients, to 4 years 5 months. No change could be seen in median SAH patient survival, >10 years. Stroke prevalence was 82 000, 1.5% of the total population of Finland, in 2008. Modern stroke center care was shown to be associated with a decrease in both death and the risk of institutional care of stroke patients. The number needed to treat to prevent these poor outcomes at one year from stroke was 32 (95% confidence interval 26 to 42). Despite improvements over the study period, more than a third of Finnish stroke patients did not have access to stroke center care. The mean first-year healthcare cost of a stroke patient was ~€20 000, and among survivors ~€10 000 annually thereafter. Only part of this cost was incurred by stroke, as the same patients cost ~€5000 over the year prior to stroke. Total lifetime costs after first-ever stroke were ~€85 000. A total of €1.1 billion, 7% of all healthcare expenditure, is used in the treatment of stroke patients annually. Despite a rapidly aging population, the number of new stroke patients is decreasing, and the patients are more likely to survive. This is explained in part by stroke center care, which is effective and should be made available to all stroke patients. It is possible, in a suitable setting with high-quality administrative registries and a common identifier, to avoid the huge workload and associated costs of setting up a conventional stroke registry, and still acquire a fairly comprehensive dataset on stroke care and outcomes.
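The registry-linkage design lends itself to a compact illustration. Below is a minimal sketch in Python, not the PERFECT Stroke implementation, of how per-patient hospital stays linked by a personal identifier can be chained into whole episodes of care; the column names (pid, admit, discharge) and the one-day merge rule are hypothetical stand-ins.

```python
import pandas as pd

def episodes_of_care(admissions: pd.DataFrame, max_gap_days: int = 1) -> pd.DataFrame:
    """Chain per-patient hospital stays into whole episodes of care.

    `admissions` has columns pid (personal identifier), admit and
    discharge (datetimes). Stays that begin within `max_gap_days` of
    the previous discharge (e.g. a transfer to a rehabilitation ward)
    are merged into the same episode.
    """
    adm = admissions.sort_values(["pid", "admit"]).copy()
    prev_discharge = adm.groupby("pid")["discharge"].shift()
    gap = (adm["admit"] - prev_discharge).dt.days
    # a stay opens a new episode if it is the patient's first stay or
    # it begins more than max_gap_days after the previous discharge
    new_episode = prev_discharge.isna() | (gap > max_gap_days)
    adm["episode"] = new_episode.cumsum()
    return (adm.groupby(["pid", "episode"], as_index=False)
               .agg(start=("admit", "min"), end=("discharge", "max")))
```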
Abstract:
Frequent episode discovery is a popular framework for mining data available as a long sequence of events. An episode is essentially a short ordered sequence of event types, and the frequency of an episode is some suitable measure of how often the episode occurs in the data sequence. Recently, we proposed a new frequency measure for episodes based on the notion of non-overlapped occurrences of episodes in the event sequence, and showed that such a definition, in addition to yielding computationally efficient algorithms, has some important theoretical properties in connecting frequent episode discovery with HMM learning. This paper presents some new algorithms for frequent episode discovery under this non-overlapped occurrences-based frequency definition. The algorithms presented here are better (by a factor of N, where N denotes the size of episodes being discovered) in terms of both time and space complexities when compared to existing methods for frequent episode discovery. We show through simulation experiments that our algorithms are very efficient. The new algorithms presented here have arguably the least possible orders of space and time complexities for the task of frequent episode discovery.
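For serial episodes, the non-overlapped frequency described here can be computed with a single left-to-right scan. The sketch below is a minimal Python illustration for one episode, not the paper's full algorithm (which tracks many candidate episodes at once):

```python
def non_overlapped_frequency(sequence, episode):
    """Count non-overlapped occurrences of a serial episode.

    `episode` is an ordered tuple of event types, e.g. ('A', 'B', 'C')
    meaning A before B before C. A single automaton tracks how far the
    current candidate occurrence has progressed; when it completes,
    the count is incremented and tracking restarts, so the counted
    occurrences never interleave.
    """
    count, pos = 0, 0  # pos: next episode slot waiting to be matched
    for event in sequence:
        if event == episode[pos]:
            pos += 1
            if pos == len(episode):  # one full occurrence completed
                count += 1
                pos = 0
    return count

# e.g. non_overlapped_frequency("ABCABABCC", ("A", "B", "C")) -> 2
```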
Abstract:
Frequent episode discovery is a popular framework for temporal pattern discovery in event streams. An episode is a partially ordered set of nodes, with each node associated with an event type. Currently, algorithms exist for episode discovery only when the associated partial order is a total order (serial episode) or trivial (parallel episode). In this paper, we propose efficient algorithms for discovering frequent episodes with unrestricted partial orders when the associated event types are unique. These algorithms can easily be specialized to discover only serial or parallel episodes. The algorithms are also flexible enough to be specialized for mining in the space of certain interesting subclasses of partial orders. We point out that frequency alone is not a sufficient measure of interestingness in the context of partial-order mining. We propose a new interestingness measure for episodes with unrestricted partial orders which, when used along with frequency, results in an efficient scheme of data mining. Simulations are presented to demonstrate the effectiveness of our algorithms.
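To make the notion of an episode with a general partial order concrete, here is a minimal occurrence test for injective episodes (an illustrative sketch, not the discovery algorithm from the paper): each node, taken in a topological order of the episode's DAG, is greedily assigned the earliest event of its type that is later than all of its predecessors' assigned events.

```python
from bisect import bisect_right

def occurs(window, types, precedes):
    """Test whether an injective partial-order episode occurs in a
    window of (time, event_type) pairs.

    `types` lists the episode's distinct event types; `precedes` is a
    set of (a, b) pairs meaning type a must occur before type b.
    """
    times = {t: sorted(tm for tm, ev in window if ev == t) for t in types}
    if any(not ts for ts in times.values()):
        return False  # some required event type never occurs
    # topological order of the episode DAG (Kahn's algorithm)
    indeg = {t: 0 for t in types}
    for a, b in precedes:
        indeg[b] += 1
    order = [t for t in types if indeg[t] == 0]
    for t in order:
        for a, b in precedes:
            if a == t:
                indeg[b] -= 1
                if indeg[b] == 0:
                    order.append(b)
    if len(order) != len(types):
        raise ValueError("precedes is cyclic, not a partial order")
    # greedily pick, for each node, the earliest feasible event time
    chosen = {}
    for t in order:
        lower = max((chosen[a] for a, b in precedes if b == t),
                    default=float("-inf"))
        i = bisect_right(times[t], lower)  # first time strictly after lower
        if i == len(times[t]):
            return False
        chosen[t] = times[t][i]
    return True

# e.g. occurs([(1, "A"), (2, "B"), (3, "C")], ["A", "B", "C"],
#             {("A", "B"), ("A", "C")}) -> True (B and C both after A)
```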
Abstract:
Discovering patterns in temporal data is an important task in data mining. A successful method for this was proposed by Mannila et al. [1] in 1997. In their framework, mining for temporal patterns in a database of sequences of events is done by discovering so-called frequent episodes. These episodes characterize interesting collections of events occurring relatively close to each other in some partial order. However, in this framework (and in many others for finding patterns in event sequences), the ordering of events in an event sequence is the only temporal information allowed. But there are many applications where the events are not instantaneous; they have time durations. Interesting episodes that we want to discover may need to contain information regarding event durations. In this paper we extend Mannila et al.'s framework to tackle such issues. In our generalized formulation, episodes are defined so that much more temporal information about events can be incorporated into the structure of an episode. This significantly enhances the expressive capability of the rules that can be discovered in the frequent episode framework. We also present algorithms for discovering such generalized frequent episodes.
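One way to picture the generalization: each episode node now constrains both the event type and the event's duration. The sketch below is an illustrative rendering of this idea, not the paper's exact formulation; it counts non-overlapped occurrences of a generalized serial episode whose nodes carry allowed duration ranges.

```python
from dataclasses import dataclass

@dataclass
class Event:
    etype: str
    start: float
    end: float

def generalized_frequency(events, episode):
    """Greedy non-overlapped count of a generalized serial episode.

    `episode` is a list of (event_type, (d_min, d_max)) nodes: a node
    matches an event when the type agrees and the event's duration
    end - start lies in [d_min, d_max]. Within one occurrence, each
    event must start after the previous matched event has ended.
    """
    count, pos, last_end = 0, 0, float("-inf")
    for ev in sorted(events, key=lambda e: e.start):
        etype, (dmin, dmax) = episode[pos]
        if (ev.etype == etype and dmin <= ev.end - ev.start <= dmax
                and ev.start > last_end):
            pos, last_end = pos + 1, ev.end
            if pos == len(episode):  # completed one occurrence
                count += 1
                pos, last_end = 0, float("-inf")
    return count

# e.g. a long A (duration 2-5) followed later by any B:
# generalized_frequency([Event("A", 0, 3), Event("B", 4, 5)],
#                       [("A", (2, 5)), ("B", (0, 10))]) -> 1
```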
Abstract:
Frequent episode discovery is one of the methods used for temporal pattern discovery in sequential data. An episode is a partially ordered set of nodes, with each node associated with an event type. For more than a decade, algorithms existed for episode discovery only when the associated partial order is total (serial episodes) or trivial (parallel episodes). Recently, the literature has seen algorithms for discovering episodes with general partial orders. In frequent pattern mining, the threshold beyond which a pattern is inferred to be interesting is typically user-defined and arbitrary. One way of addressing this issue in the pattern-mining literature has been based on the framework of statistical hypothesis testing. This paper presents a method of assessing the statistical significance of episode patterns with general partial orders. A method is proposed to calculate thresholds, on the non-overlapped frequency, beyond which an episode pattern would be inferred to be statistically significant. The method is first explained for the case of injective episodes with general partial orders; an injective episode is one in which event types are not allowed to repeat. It is then pointed out how the method can be extended to the class of all episodes. The significance threshold calculations for general partial-order episodes proposed here also generalize the existing significance results for serial episodes. Through simulation studies, the usefulness of these statistical thresholds in pruning uninteresting patterns is illustrated.
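The flavor of such significance thresholds can be conveyed empirically. The sketch below is not the paper's analytical derivation; it simply estimates, by Monte Carlo simulation under an i.i.d. uniform null model, the frequency above which an observed episode frequency would be deemed significant at level alpha, reusing a frequency counter such as the non_overlapped_frequency sketch above.

```python
import random

def frequency_threshold(freq_fn, alphabet, seq_len, episode,
                        alpha=0.05, trials=500, seed=0):
    """Empirical (1 - alpha) quantile of an episode's frequency over
    random i.i.d. uniform sequences: a simulated stand-in for the
    analytical significance thresholds derived in the paper.
    """
    rng = random.Random(seed)
    freqs = sorted(
        freq_fn([rng.choice(alphabet) for _ in range(seq_len)], episode)
        for _ in range(trials)
    )
    return freqs[min(int((1 - alpha) * trials), trials - 1)]

# e.g. frequency_threshold(non_overlapped_frequency, "ABCDE",
#                          10000, ("A", "B", "C"))
```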
Abstract:
Most pattern mining methods yield a large number of frequent patterns, and isolating a small, relevant subset of patterns is a challenging problem of current interest. In this paper, we address this problem in the context of discovering frequent episodes from symbolic time-series data. Motivated by the Minimum Description Length principle, we formulate the problem of selecting a relevant subset of patterns as one of searching for the subset of patterns that achieves the best data compression. We present algorithms for discovering small sets of relevant, non-redundant episodes that achieve good data compression. The algorithms employ a novel encoding scheme and use serial episodes with inter-event constraints as the patterns. We present extensive simulation studies with both synthetic and real data, comparing our method with existing schemes such as GoKrimp and SQS. We also demonstrate the effectiveness of these algorithms on event sequences from a composable conveyor system; this system represents a new application area where the use of frequent patterns for compressing the event sequence is likely to be important for decision support and control.
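The compression intuition can be shown with a toy example. The sketch below deliberately simplifies the paper's method (patterns are treated as contiguous substrings, costs are raw symbol counts rather than real code lengths, and there are no inter-event constraints): it greedily keeps whichever candidate episode most reduces a naive description length.

```python
def substitute(seq, pattern, symbol):
    """Replace non-overlapping contiguous occurrences of `pattern`
    in `seq` with the single placeholder `symbol`."""
    out, i, k = [], 0, len(pattern)
    while i < len(seq):
        if seq[i:i + k] == list(pattern):
            out.append(symbol)
            i += k
        else:
            out.append(seq[i])
            i += 1
    return out

def greedy_mdl_selection(sequence, candidates):
    """Greedily select patterns that shrink a naive description
    length: symbols remaining in the data plus the symbols needed to
    store each selected pattern in the dictionary."""
    s, chosen = list(sequence), []
    while candidates:
        scored = []
        for pat in candidates:
            reduced = substitute(s, pat, object())  # fresh placeholder
            gain = (len(s) - len(reduced)) - (len(pat) + 1)
            scored.append((gain, pat, reduced))
        gain, pat, reduced = max(scored, key=lambda x: x[0])
        if gain <= 0:
            break  # no candidate pays for its dictionary entry
        s, chosen = reduced, chosen + [pat]
    return chosen

# e.g. greedy_mdl_selection("ABCABCABCXY", ["ABC", "XY"]) -> ["ABC"]
```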
Abstract:
Background: Few studies have analyzed predictors of length of stay (LOS) in patients admitted due to acute bipolar manic episodes. The purpose of the present study was to estimate LOS and to determine the potential sociodemographic and clinical risk factors associated with longer hospitalization. Such information could be useful for identifying patients at high risk of a long LOS and allocating them to special treatments, with the aim of optimizing their hospital management. Methods: This was a cross-sectional study recruiting adult patients with a diagnosis of bipolar disorder (Diagnostic and Statistical Manual of Mental Disorders, 4th edition, text revision (DSM-IV-TR) criteria) who had been hospitalized due to an acute manic episode with a Young Mania Rating Scale total score greater than 20. Bivariate correlational and multiple linear regression analyses were performed to identify independent predictors of LOS. Results: A total of 235 patients from 44 centers were included in the study. The only factors significantly associated with LOS in the regression model were the number of previous episodes and the Montgomery-Åsberg Depression Rating Scale (MADRS) total score at admission (P < 0.05). Conclusions: Patients with a high number of previous episodes and those with depressive symptoms during mania are more likely to stay longer in hospital. Patients with severe depressive symptoms may have a more severe or treatment-resistant course of the acute bipolar manic episode.
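For readers unfamiliar with this kind of analysis, the regression step might look like the following sketch using statsmodels' formula interface; the file name and column names (los, n_prev_episodes, madrs, ymrs, age) are hypothetical, and the study's actual covariate set is larger.

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical patient-level data; one row per admitted patient
df = pd.read_csv("mania_admissions.csv")

# multiple linear regression of length of stay on admission measures
model = smf.ols("los ~ n_prev_episodes + madrs + ymrs + age", data=df).fit()
print(model.summary())  # coefficients, p-values and fit statistics
```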
Abstract:
This case study, utilizing surface and upper-air data, has attempted to shed light on the mechanisms that exerted control on two contrasting rainfall episodes in Hawaii (the dry winter of 1981 and the wet winter of 1982).
Abstract:
Wintertime precipitation in the mountains of the western United States during a warm or cool period has a pronounced influence on streamflow. During a warm year, streamflow at intermediate elevations responds more immediately to precipitation events; during a cold year, much of the discharge is delayed until the snow melts in spring and summer. Previous efforts at studying these extremes have been hampered by a limited number and length of observational analyses. In this study, we augment this limited observational record by analyzing a simplified general circulation model.