292 results for becoming (Deleuze and Guattari)
Abstract:
As is widely known, the mass media in China are state-owned, and television stations are no exception within this enormous state-owned system. However, with economic reform in the broadcasting system and China's entry into the WTO, the television industry has grown considerably and the television market has matured amid increasing competition. The players in China's television industry have expanded from TV stations alone to multiple roles played by TV stations, production companies and overseas television companies, although TV stations still make up the majority of China's TV market. Private television production companies, in particular, are becoming increasingly active in this market. In this paper, I describe the development process and challenges of this group in China and ask what its emergence means for China's TV industry as a whole.
Abstract:
There are at least four key challenges in the online news environment that computational journalism may address. First, news providers operate in a rapidly evolving environment, and larger businesses are typically slower to adapt to market innovations. Second, news consumption patterns have changed, and news providers need to find new ways to capture and retain digital users. Meanwhile, declining financial performance has led to cost cuts in mass-market newspapers. Finally, investigative reporting is typically slow, high-cost and sometimes tedious, yet is valuable to the reputation of a news provider. Computational journalism involves the application of software and technologies to the activities of journalism, and it draws from the fields of computer science, social science and communications. New technologies may enhance the traditional aims of journalism, or may require "a new breed of people who are midway between technologists and journalists" (Irfan Essa in Mecklin 2009: 3). Historically referred to as 'computer-assisted reporting', the use of software in online reportage is increasingly valuable due to three factors: larger datasets are becoming publicly available; software is becoming more sophisticated and ubiquitous; and the Australian digital economy is developing. This paper introduces key elements of computational journalism: why it is needed, what it involves, its benefits and challenges, and a case study with examples. When correctly used, computational techniques can quickly provide a solid factual basis for original investigative journalism and may increase interaction with readers. This is a major opportunity to enhance the delivery of original investigative journalism, which ultimately may attract and retain readers online.
Abstract:
Almost every nation on the planet is experiencing increases in both the number and proportion of older adults. Research has shown that older adults use technology less intuitively than younger adults and have more difficulty using products effectively. With an ever-increasing population of older adults, it is necessary to understand why they often struggle to use technology, which is becoming more and more important in day-to-day living. Intuitive use of products is grounded in familiarity and prior experience. The aims of this research were twofold: (i) to examine the differences in familiarity between younger and older adults, to see if this could explain the difficulties faced by some older adults; and (ii) to develop investigational methods to assist designers in identifying familiarity in prospective users. Two empirical studies were conducted. The first experiment was conducted in the field with 32 participants, divided across four age groups (18–44, 45–59, 60–74, and 75+). This experiment was conducted in the participants' homes, with a product they were familiar with. Familiarity was measured through the analysis of data collected through interviews, observation and retrospective protocol. The results of this study show that the youngest group demonstrated significantly higher levels of familiarity with products they own than the 60–74 and 75+ age groups. There were no significant differences between the 18–44 and 45–59 age groups, and no significant differences between the three oldest age groups. The second experiment was conducted with 32 participants, across the same four age groups. Four everyday products were used in this experiment. The results of Experiment 2 show that, with previously unused products, younger adults demonstrate significantly higher levels of familiarity than the three older age groups. The three oldest age groups showed no significant differences between them.
The results of these two studies show that younger adults are more familiar with contemporary products than older adults. They also demonstrate that in terms of familiarity, older adults do not differ significantly as they get older. The results also show that the 45 – 59 age group demonstrate higher levels of familiarity with products they have owned, in comparison with those they have not. The two older age groups did not demonstrate such differences. This suggests that interacting with products over time increases familiarity more for middle-aged adults than for older adults. As a result of this research, a method that can be used by designers to identify potential users’ product familiarity has been identified. This method is easy to use, quick, low cost, highly mobile, flexible, and allows for easy data collection and analysis. A tool has been designed that assists designers and researchers to use the method. Designers can use the knowledge gained from this tool, and integrate it into the design process, resulting in more intuitive products. Such products may lead to improvements in the quality of life of older adults, as a result of improved societal integration, better health management, and more widespread use of communications technology.
Abstract:
Deploying networked control systems (NCSs) over wireless networks is becoming more and more popular. However, the widely used transport-layer protocols, Transmission Control Protocol (TCP) and User Datagram Protocol (UDP), are not designed for real-time applications. They may therefore be unsuitable for many NCS application scenarios because of their limitations in reliability and/or delay performance, both of which are critical concerns for real-time control systems. Considering a typical type of NCS with periodic and sporadic real-time traffic, this paper proposes a highly reliable transport-layer protocol featuring a packet loss-sensitive retransmission mechanism and a prioritized transmission mechanism. The packet loss-sensitive retransmission mechanism is designed to improve the reliability of all traffic flows, while the prioritized transmission mechanism offers differentiated services for periodic and sporadic flows. Simulation results show that the proposed protocol achieves higher reliability than UDP and better delay performance than TCP over wireless networks, particularly when channel errors and congestion occur.
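The abstract names two mechanisms without detailing their design. The following is a minimal sketch of how prioritized queueing and loss-sensitive retransmission could combine at a sender; the class names, priority values and retry limit are illustrative assumptions, not the paper's actual protocol.

```python
import heapq

# Hypothetical traffic classes: periodic control samples outrank sporadic events.
PERIODIC, SPORADIC = 0, 1

class ReliableSender:
    """Illustrative sketch only: combines a priority queue (differentiated
    service for periodic vs. sporadic flows) with bounded retransmission on
    detected packet loss. Not the paper's actual design."""

    def __init__(self, max_retries=3):
        self.queue = []              # min-heap ordered by (priority, seq)
        self.max_retries = max_retries
        self.seq = 0                 # tie-breaker preserving FIFO within a class

    def enqueue(self, payload, priority):
        heapq.heappush(self.queue, (priority, self.seq, payload))
        self.seq += 1

    def send(self, transmit):
        """transmit(payload) -> True if the packet was acknowledged.
        Retransmits only when a loss is detected, up to max_retries times."""
        delivered = []
        while self.queue:
            _, _, payload = heapq.heappop(self.queue)
            for _ in range(1 + self.max_retries):
                if transmit(payload):
                    delivered.append(payload)
                    break
        return delivered
```

Under this sketch, periodic samples leave the queue before sporadic packets queued earlier, and a lost packet consumes bounded retry attempts instead of being silently dropped (as with UDP) or blocking all later traffic (as TCP's in-order delivery can).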
Abstract:
As e-commerce becomes more and more popular, the number of customer reviews that a product receives grows rapidly. To enhance customer satisfaction and shopping experiences, it has become important to analyse customer reviews to extract opinions on the products they buy. Opinion Mining is thus becoming increasingly important, especially for analysing and forecasting customer behaviour for business purposes. The right decision in producing new products or services, based on data about customers' characteristics, translates into profit for an organisation. This paper proposes a new architecture for Opinion Mining, which uses a multidimensional model to integrate customers' characteristics and their comments about products (or services). The key step in achieving this objective is to transfer comments (opinions) into a fact table that includes several dimensions, such as customers, products, time and locations. This research presents a comprehensive way to calculate customers' orientation for all possible product attributes.
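The paper's multidimensional model is not specified in the abstract; the following is a minimal sketch of the key step it describes, recording opinions as fact-table rows keyed by the customer, product, time and location dimensions and rolling orientation up along one dimension. All field names and sentiment scores are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical fact table: one row per extracted opinion, linked to the four
# dimensions named in the abstract, with a sentiment score as the measure.
fact_rows = [
    {"customer": "c1", "product": "phone",  "time": "2015-Q1", "location": "AU", "score": 1},
    {"customer": "c2", "product": "phone",  "time": "2015-Q1", "location": "AU", "score": -1},
    {"customer": "c1", "product": "tablet", "time": "2015-Q2", "location": "CN", "score": 1},
]

def orientation_by(dimension, rows):
    """Mean opinion orientation aggregated along a single dimension,
    the way an OLAP roll-up would summarise the fact table."""
    totals, counts = defaultdict(float), defaultdict(int)
    for row in rows:
        totals[row[dimension]] += row["score"]
        counts[row[dimension]] += 1
    return {key: totals[key] / counts[key] for key in totals}

# Roll opinions up to the product dimension; any other dimension works the same way.
product_orientation = orientation_by("product", fact_rows)
```

Because every opinion is stored against all four dimension keys, the same fact table answers questions by customer segment, time period or location without re-parsing the review text.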
Abstract:
Diesel particulate matter (DPM), in particular, has been likened, somewhat inflammatorily, to the 'next asbestos'. From the business-change perspective, three areas hold the industry back from fully engaging with the issue: 1. There is no real feedback loop, in any operational sense, to assess the impact of investment in or application of controls to manage diesel emissions. 2. Diesel particulates are becoming ever smaller and more numerous, but there is no practical way of measuring them to regulate them in the field; mass, the current basis of regulation, is becoming less and less relevant. 3. Diesel emissions management is generally viewed wholly as a cost, yet good management offers significant areas of benefit. This paper discusses a feedback approach to address these three areas and move the industry forward. Six main areas of benefit from providing a feedback loop by continuously monitoring diesel emissions have been identified: 1. Condition-based maintenance: emissions change instantaneously if engine condition changes. 2. Operator performance: an operator can use far more fuel for little incremental work output through poor technique or discipline. 3. Vehicle utilisation: operating hours achieved and the ratio of idling to under-power operation affect the proportion of emissions produced with no economic value. 4. Fuel efficiency: this allows visibility into other contributing configuration and environmental factors for the vehicle. 5. Emission rates: these allow scope to directly address the required ratio of ventilation to diesel emissions. 6. Total carbon emissions: for NGER-type reporting requirements, calculating emissions individually from each vehicle rather than just reporting on fuel delivered to a site.
Abstract:
Stigmergy is a biological term for a subset of insect swarm behaviour, describing the apparent organisation seen in their activities. Stigmergy describes a communication mechanism based on environment-mediated signals that trigger responses among the insects. The phenomenon is demonstrated in the behaviour of ants and their food-gathering process when following pheromone trails, where the pheromones are a form of environment-mediated communication. What is interesting about this phenomenon is that highly organised societies are achieved without an apparent management structure. Stigmergy is also observed in human environments, both natural and engineered. It is implicit in the Web, where sites provide a virtual environment supporting coordinative contributions. Researchers in varying disciplines appreciate the power of this phenomenon and have studied how to exploit it. As stigmergy becomes more widely researched, we see its definition mutate as papers citing the original work become referenced themselves; each paper interprets these works in ways specific to the research being conducted. Our own research aims to better understand what improves the collaborative function of a Web site that exploits the phenomenon. However, in researching stigmergy we discovered the lack of a standardised, abstract model of the phenomenon. Papers frequently cite the same generic descriptions before becoming intimately focused on formal specifications of an algorithm, or on esoteric discussions of sub-facets of the topic. None provides a holistic, macro-level view to model the phenomenon and standardise its nomenclature. This paper provides a content analysis of influential literature, documenting the numerous theoretical and experimental papers that have focused on stigmergy. We establish that stigmergy is a phenomenon that transcends the insect world and is more than just a metaphor when applied to the human world.
We present from our own research our general theory and abstract model of the semantics of stigma in stigmergy. We hope our model will clarify the nuances of the phenomenon into a useful road-map, and standardise vocabulary that we see becoming confused and divergent. Furthermore, this paper documents the analysis on which we base our next paper: Special Theory of Stigmergy: A Design Pattern for Web 2.0 Collaboration.
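The ant example above can be reduced to a toy model that exhibits the phenomenon's defining feature: coordination through the environment alone. The sketch below is a deterministic mean-field caricature, not any model from the literature surveyed; all parameters are illustrative assumptions. Traffic splits in proportion to trail strength, deposits are inversely proportional to path length, and trails evaporate each step.

```python
def stigmergic_dynamics(path_lengths, trips=500, evaporation=0.02):
    """Toy stigmergy model. No agent communicates with any other; each only
    reads and reinforces the shared environment (the pheromone trails), yet
    traffic concentrates on the shortest path."""
    pheromone = [1.0] * len(path_lengths)
    for _ in range(trips):
        total = sum(pheromone)
        pheromone = [
            # share of traffic on a path is p / total; its deposit is 1 / length
            (p + (p / total) / length) * (1 - evaporation)
            for p, length in zip(pheromone, path_lengths)
        ]
    return pheromone

# Three candidate paths of increasing length; the shortest ends up with the
# strongest trail purely through environment-mediated reinforcement.
trails = stigmergic_dynamics([1.0, 2.0, 4.0])
```

The point of the toy is that the ordering of trail strengths emerges from the environment's state, with no management structure or direct agent-to-agent signalling, which is exactly the property the abstract identifies as interesting.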
Abstract:
In October 2012, Simone presented her book Architecture for a Free Subjectivity at the University of Michigan, Taubman College of Architecture and Urban Planning. The book explores the architectural significance of Deleuze's philosophy of subjectivization and Guattari's overlooked dialogue on architecture and subjectivity. In doing so, it proposes that subjectivity is no longer the exclusive province of human beings, but extends to the architectural, the cinematic, the erotic, and the political. It defines a new position within the literature on Deleuze and architecture, while highlighting the neglected issue of subjectivity in contemporary discussion.
Abstract:
Executive Summary: Emergency Departments (EDs) locally, nationally and internationally are becoming increasingly busy. Within this context, it can be challenging to deliver a health service that is safe, of high quality and cost-effective. While various models described in the literature aim to measure ED 'work' or 'activity', they are often not linked to a measure of the costs of providing that activity. It is important for hospital and ED managers to understand and apply this link so that optimal staffing and financial resourcing can be justifiably sought. This research is timely given that Australia has moved towards a national Activity Based Funding (ABF) model for ED activity. ABF is believed to increase transparency of care and fairness (i.e. equal work receives equal pay). ABF involves a person-, performance- or activity-based payment system, and thus a move away from historical "block payment" models that do not incentivise efficiency and quality. The aim of the Statewide Workforce and Activity-Based Funding Modelling Project in Queensland Emergency Departments (SWAMPED) is to identify and describe best-practice Emergency Department (ED) workforce models within the current context of ED funding under an ABF model. The study comprises five distinct phases. This monograph (Phase 1) comprises a systematic review of the literature that was completed in June 2013. The remaining phases include a detailed survey of Queensland hospital EDs' resource levels, activity and operational models of care; development of new resource models; development of a user-friendly modelling interface for ED managers; and production of a final report that identifies policy implications. The anticipated deliverable outcome of this research is an ABF-based Emergency Workforce Modelling Tool that will enable ED managers to profile both their workforce and their operational models of care.
Additionally, the tool will assist in more accurately informing the staffing numbers required in the future, inform planning of expected expenditure, and be used for standardisation and benchmarking across similar EDs. Summary of the Findings: Within the remit of this review of the literature, the main findings are: 1. EDs are becoming busier and more congested. Rising demand, barriers to ED throughput and transitions of care all contribute to ED congestion. In addition, requests from organisational managers and the community continue to broaden the scope of services required of the ED, further increasing demand. As the population lives longer with more lifestyle diseases, its propensity to require ED care continues to grow. 2. Various models of care within EDs exist. Models often vary to account for site-specific characteristics such as staffing profile, ED geographical location (e.g. metropolitan or rural site), and patient demographic profile (e.g. paediatrics, older persons, ethnicity). Existing and new models implemented within EDs often depend on the target outcome requiring change; generally this focuses on addressing issues in the input, throughput or output areas of the ED. Even for models targeting similar demographics or illnesses, the structure and process elements underpinning the model can vary, which can affect outcomes and introduce variance in the patient and carer experience between and within EDs. Major models of care to manage throughput inefficiencies include: A. Workforce Models of Care, which focus on the appropriate level of staffing for a given workload to provide prompt, timely and clinically effective patient care within an emergency care setting.
The studies reviewed suggest that the early involvement of a senior medical decision-maker and/or specialised nursing roles such as Emergency Nurse Practitioners and Clinical Initiatives Nurses, or primary-contact or extended-scope Allied Health Practitioners, can facilitate patient flow and improve key indicators such as length of stay and the number of patients who did not wait to be seen, amongst others. B. Operational Models of Care within EDs focus on mechanisms for streaming (e.g. fast-tracking) or otherwise grouping patient care based on acuity and complexity, to assist with minimising any throughput inefficiencies. While studies generally support the positive impact of these models, they appear most effective when adequately resourced. 3. Various methods of measuring ED activity exist. Measuring ED activity requires careful consideration of models of care and staffing profile, and the ability to account for factors including patient census, acuity, length of stay (LOS), intensity of intervention and department skill-mix, plus an adjustment for non-patient-care time. 4. Gaps in the literature. Continued ED growth calls for new and innovative care delivery models that are safe, clinically effective and cost-effective. New roles and stand-alone service delivery models are often evaluated in isolation, without considering the global and economic impact on staffing profiles. While various models for accounting for and measuring healthcare activity exist, costing and cost-effectiveness studies are lacking for EDs, making accurate and reliable assessment of care models difficult. There is a need to further understand, refine and account for measures of ED complexity that define a workload upon which resources and appropriate staffing determinations can be made into the future. There is also a need for continued monitoring and comprehensive evaluation of newly implemented workforce modelling tools.
This research acknowledges those gaps and aims to: • Undertake a comprehensive and integrated whole of department workforce profiling exercise relative to resources in the context of ABF. • Inform workforce requirements based on traditional quantitative markers (e.g. volume and acuity) combined with qualitative elements of ED models of care; • Develop a comprehensive and validated workforce calculation tool that can be used to better inform or at least guide workforce requirements in a more transparent manner.
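As a concrete illustration of the activity measure described in finding 3, a weighted workload estimate might combine those factors as below. The triage weights, the intervention-intensity factor and the non-patient-care adjustment are hypothetical placeholders, not values from SWAMPED or the literature reviewed.

```python
# Assumed relative workload weights per triage category (1 = most urgent).
ACUITY_WEIGHT = {1: 3.0, 2: 2.2, 3: 1.5, 4: 1.0, 5: 0.7}

def required_clinical_hours(presentations, non_patient_factor=1.15):
    """presentations: iterable of (triage_category, length_of_stay_hours,
    intervention_intensity) tuples. Returns acuity-weighted clinical hours,
    inflated by an assumed adjustment for non-patient-care time."""
    weighted = sum(
        ACUITY_WEIGHT[category] * los_hours * intensity
        for category, los_hours, intensity in presentations
    )
    return weighted * non_patient_factor

# Three illustrative presentations: one complex, two simple.
demand = [(2, 4.0, 1.2), (4, 1.5, 0.8), (5, 0.5, 0.5)]
hours = required_clinical_hours(demand)
```

Dividing such a figure by rostered hours per clinician, and splitting it across the department's skill mix, would yield a staffing profile, which is the kind of calculation the proposed modelling tool would need to make transparent.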
Abstract:
The palette of fluorescent proteins (FPs) has grown exponentially over the past decade, and as a result, live imaging of cells expressing fluorescently tagged proteins is becoming more and more mainstream. Spinning disk confocal (SDC) microscopy is a high-speed optical sectioning technique and a method of choice for observing and analyzing intracellular FP dynamics at high spatial and temporal resolution. In an SDC system, a rapidly rotating pinhole disk generates thousands of points of light that scan the specimen simultaneously, which allows direct capture of the confocal image with low-noise, scientific-grade cooled charge-coupled device cameras and can achieve frame rates of up to 1000 frames per second. In this chapter, we describe important components of a state-of-the-art spinning disk system optimized for live-cell microscopy and provide a rationale for specific design choices. We also give guidelines on how other imaging techniques, such as total internal reflection microscopy or spatially controlled photoactivation, can be coupled with SDC imaging, and provide a short protocol on how to generate cell lines stably expressing fluorescently tagged proteins by lentivirus-mediated transduction.
Abstract:
Schooling is one of the core experiences of most young people in the Western world. This study examines the ways that students inhabit subjectivities defined in relation to some normalised good student. The idea that schools exist to produce students who become good citizens is one of the basic tenets of the modernist educational philosophies that dominate the contemporary education world. The school has become a political site where policy, curriculum orientations, expectations and philosophies of education contest for the 'right' way to school and be schooled. For many people, schools and schooling only make sense if they resonate with past experiences. The good student is framed within these aspects of cultural understanding. However, this commonsense attitude is based on a hegemonic understanding of the good, rather than on the good student as a contingent multiplicity produced by an infinite set of discourses and experiences. In this book, author Greg Thompson argues that this understanding of subjectivities and power is crucial if schools are to meet the needs of a rapidly changing and challenging world. As a high school teacher for many years, Thompson often wondered how students responded to complex articulations of how to be a good student. How a student can be considered good is itself an articulation of powerful discourses that compete within the school. Rather than assuming a moral or ethical citizen, this study turns that logic on its head to ask students in what ways they can be good within the school. Visions of the good student deployed in various ways in schools act to produce various ways of knowing the self as certain types of subjects. Developing the postmodern theories of Foucault and Deleuze, this study argues that schools act to teach students to know themselves in certain idealised ways, through which they are located, and locate themselves, in hierarchical rationales of the good student.
Problematising the good student in high schools engages those institutional discourses with the philosophy, history and sociology of education. Asking students how they negotiate or perform their selves within schools challenges the narrow and limiting ways that the good is often understood. By pushing the ontological understandings of the self beyond the modernist philosophies that currently dominate schools and schooling, this study problematises the tendency to see students as fixed, measurable identities (beings) rather than dynamic, evolving performances (becomings). Who is the Good High School Student? is an important book for scholars conducting research on high school education, as well as student-teachers, teacher educators and practicing teachers alike.
Abstract:
ARTIST STATEMENT VIBRANTe 2.0 was inspired by a research project for Parkinson’s disease patients aimed at developing a wearable device to collect relevant data for patients and medical health professionals. Vibrante is a Spanish word that translates to vibrant; literally meaning shaking or vibrations. Vibrante also has a dual meaning including vibrancy, energy, activity, and liveliness. Parkinson’s can be a debilitating disease, but it does not mean the person has to lose energy, activeness or vibrancy. As technology moves from being worn to becoming implantable and completely hidden within the body, the very notion of its physicality becomes difficult to grasp. While the human body hides implantable technology, VIBRANTe 2.0 intentionally hides the human body by making it invisible to reveal the technology stitched within. Wires become veins, delivering lifeblood to the technology inside, allowing it to pulsate and exist, while motherboards become networked hubs by which information is transferred through and within the body, performing functions that mirror and often surpass human performance capabilities. Ultimately, VIBRANTe 2.0 seeks to prompt the viewer to reflect on the potential ramifications of the complete immersion of technology into the human body. CONTEXT Technology is increasingly penetrating all aspects of our environment, and the rapid uptake of devices that live near, on or in our bodies is facilitating radical new ways of working, relating and socialising. Such technology, with its capacity to generate previously unimaginable levels of data, offers the potential to provide life-augmenting levels of interactivity. However, the absorption of technology into the very fabric of clothes, accessories and even bodies begins to dilute boundaries between physical, technological and social spheres, generating genuine ethical and privacy concerns and potentially having implications for human evolution. 
Embedding technology into the fabric of our clothes, accessories, and even our bodies enables the acquisition of, and connection to, vast amounts of data about people and environments in order to provide life-augmenting levels of interactivity. Wearable sensors, for example, offer the potential for significant benefits in the future management of our wellbeing. Fitness trackers such as 'Fitbit' and 'Garmin' give wearers the ability to monitor their personal fitness indicators, while other wearables provide healthcare professionals with information that improves the diagnosis and observation of medical conditions. This exhibition aimed to illustrate this shifting landscape through a selection of experimental wearable and interactive works by local, national and international artists and designers. The exhibition also provided a platform for broader debate around wearable technology, our mediated future selves and human interactions in this future landscape. EXHIBITION: As part of Artisan's Wearnext exhibition, the work was on public display from 25 July to 7 November 2015 and received the following media coverage: [Please refer to Additional URLs]
Abstract:
Becoming a Teacher is structured in five very readable sections. The introductory section addresses the nature of teaching and the importance of developing a sense of purpose for teaching in a 21st-century classroom. It also introduces some key concepts that are explored throughout the volume according to the particular focus of each part. For example, the chapters in Part 2 explore aspects of student learning and the learning environment, focusing on how students develop and learn, learner motivation, developing self-esteem, and learning environments. The concepts developed in this section, such as human development, stages of learning, motivation and self-concept, are contextualised in terms of theories of cognitive development and theories of social, emotional and moral development. The author, Colin Marsh, draws on his extensive experience as an educator to structure the narrative of the chapters in this part via checklists for observation, summary tables, sample strategies for teaching at specific stages of student development, and questions under the heading 'your turn'. Case studies such as 'How I use Piaget in my teaching' make that essential link between theory and practice, something with which pre-service teachers struggle in the early phases of their university course. I was pleased to see that Marsh also explores the contentious and debated aspects of these theoretical frameworks, demonstrating that pre-service teachers must engage with and critique the ways in which theories about teaching and learning are applied. Marsh weaves key quotations and important references into each chapter's narrative and concludes every chapter with summary comments, reflection activities, lists of important references and useful web sources. As one would expect of a book published in 2008, Becoming a Teacher is informed by the most recent reports of classroom practice, current policy initiatives and research.