833 results for Humanitarianism and complex emergencies
Abstract:
A recently proposed mean-field theory of mammalian cortex rhythmogenesis describes the salient features of electrical activity in the cerebral macrocolumn, with the use of inhibitory and excitatory neuronal populations (Liley et al 2002). This model is capable of producing a range of important human EEG (electroencephalogram) features such as the alpha rhythm, the 40 Hz activity thought to be associated with conscious awareness (Bojak & Liley 2007) and the changes in EEG spectral power associated with general anesthetic effect (Bojak & Liley 2005). From the point of view of nonlinear dynamics, the model entails a vast parameter space within which multistability, pseudoperiodic regimes, various routes to chaos, fat fractals and rich bifurcation scenarios occur for physiologically relevant parameter values (van Veen & Liley 2006). The origin and the character of this complex behaviour, and its relevance for EEG activity will be illustrated. The existence of short-lived unstable brain states will also be discussed in terms of the available theoretical and experimental results. A perspective on future analysis will conclude the presentation.
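The coupled excitatory–inhibitory population structure described above can be sketched in a few lines. This is a generic Wilson–Cowan-style caricature, not the full Liley et al equations; the weights, time constants and input `p` are illustrative values, and sweeping them is one way to explore the multistability and bifurcation scenarios mentioned in the abstract.

```python
import numpy as np

def simulate(T=1.0, dt=1e-4, tau_e=0.01, tau_i=0.02,
             w_ee=16.0, w_ei=12.0, w_ie=15.0, w_ii=3.0, p=1.25):
    """Euler integration of coupled excitatory (E) and inhibitory (I)
    population activities, each passed through a sigmoid firing-rate
    function -- a caricature of the mean-field structure, not the
    physiologically parameterized Liley model."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    n = int(T / dt)
    E = np.zeros(n)
    I = np.zeros(n)
    for k in range(n - 1):
        dE = (-E[k] + sigmoid(w_ee * E[k] - w_ei * I[k] + p)) / tau_e
        dI = (-I[k] + sigmoid(w_ie * E[k] - w_ii * I[k])) / tau_i
        E[k + 1] = E[k] + dt * dE
        I[k + 1] = I[k] + dt * dI
    return E, I
```

Because each update is a convex combination of the current state and a sigmoid output, both activities remain bounded in [0, 1], which makes parameter sweeps numerically safe.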
Abstract:
In the past decade, the analysis of data has faced the challenge of dealing with very large and complex datasets and the real-time generation of data. Technologies to store and access these complex and large datasets are in place. However, robust and scalable analysis technologies are needed to extract meaningful information from them. The research field of Information Visualization and Visual Data Analytics addresses this need. Information visualization and data mining are often used to complement each other. Their common goal is the extraction of meaningful information from complex and possibly large data. However, whereas data mining relies on silicon hardware, visualization techniques also aim to harness the powerful image-processing capabilities of the human brain. This article highlights research on data visualization and visual analytics techniques. Furthermore, we highlight existing visual analytics techniques, systems, and applications, including a perspective on the field from the chemical process industry.
Abstract:
Cell patterning commonly employs photolithographic methods for the microfabrication of structures on silicon chips. These require expensive photo-mask development and complex photolithographic processing. Laser-based patterning of cells has been studied in vitro, and laser ablation of polymers is an active area of research promising high aspect ratios. This paper demonstrates how 800 nm femtosecond infrared (IR) laser radiation can be successfully used to perform laser ablative micromachining of parylene-C on SiO2 substrates for the patterning of human hNT astrocytes (derived from the human teratocarcinoma cell line (hNT)), whereas 248 nm nanosecond ultraviolet laser radiation produces photo-oxidization of the parylene-C and destroys cell patterning. In this work, we report the laser ablation methods used and the ablation characteristics of parylene-C for IR pulse fluences. Results support the validity of using IR laser ablative micromachining for patterning human hNT astrocytes. We report the variation in yield of patterned hNT astrocytes on parylene-C with laser pulse spacing, pulse number, pulse fluence and parylene-C strip width. The findings demonstrate how laser ablative micromachining of parylene-C on SiO2 substrates can offer an accessible alternative for rapid-prototyping, high-yield cell patterning with broad application to multi-electrode arrays, cellular micro-arrays and microfluidics.
Abstract:
In traditional and geophysical fluid dynamics, it is common to describe stratified turbulent fluid flows with low Mach number and small relative density variations by means of the incompressible Boussinesq approximation. Although such an approximation is often interpreted as decoupling the thermodynamics from the dynamics, this paper reviews recent results and derives new ones showing that the reality is actually more subtle and complex when diabatic effects and a nonlinear equation of state are retained. Indeed, such an analysis reveals: (1) that the compressible work of expansion/contraction remains of comparable importance to the mechanical energy conversions, in contrast to what is usually assumed; (2) that in a Boussinesq fluid, compressible effects occur in the guise of changes in gravitational potential energy due to density changes, which makes it possible to construct a fully consistent description of the thermodynamics of incompressible fluids for an arbitrary nonlinear equation of state; (3) that rigorous methods based on the available potential energy and potential enthalpy budgets can be used to quantify the work of expansion/contraction B in steady and transient flows, which reveals that B is predominantly controlled by molecular diffusive effects and acts as a significant sink of kinetic energy.
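For orientation, the standard Boussinesq energy exchange that this analysis refines can be written as follows (generic textbook notation, not necessarily the paper's own symbols): with buoyancy $b=-g\rho'/\rho_0$, the volume-integrated kinetic and potential energy budgets are coupled through the buoyancy flux,

```latex
\frac{\mathrm{d}}{\mathrm{d}t}\int_V \tfrac{1}{2}\,|\mathbf{u}|^2 \,\mathrm{d}V
  = \int_V w\,b \,\mathrm{d}V - \varepsilon ,
\qquad
\frac{\mathrm{d}}{\mathrm{d}t}\int_V \left(-\,b\,z\right) \mathrm{d}V
  = -\int_V w\,b \,\mathrm{d}V + \Phi_d ,
```

where $\varepsilon$ is the viscous dissipation and $\Phi_d$ a diffusive source term. The abstract's point is that the compressible work of expansion/contraction B hides inside such gravitational potential energy changes and is predominantly set by the diffusive contribution.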
Abstract:
What is it that gives celebrities the voice and authority to do and say the things they do in the realm of development politics? Asked another way, how is celebrity practised and, simultaneously, how does this praxis make celebrity, personas, politics and, indeed, celebrities themselves? In this article, we explore this ‘celebrity praxis’ through the lens of the creation of the contemporary ‘development celebrity’ in those stars working for development writ large in the so-called Third World. Drawing on work in science studies, material cultures and the growing geo-socio-anthropologies of things, the key to understanding the material practices embedded in and creating development celebrity networks is the multiple and complex circulations of the everyday and bespectacled artefacts of celebrity. Conceptualised as the ‘celebrity–consumption–compassion complex’, the performances of development celebrities are as much about everyday events, materials, technologies, emotions and consumer acts as they are about the mediated and liquidised constructions of the stars who now ‘market’ development. Moreover, this complex is constructed by and constructs what we are calling ‘star/poverty space’ that works to facilitate the ‘expertise’ and ‘authenticity’ and, thus, elevated voice and authority, of development celebrities through poverty tours, photoshoots, textual and visual diaries, websites and tweets. In short, the creation of star/poverty space is performed through a kind of ‘materiality of authenticity’ that is at the centre of the networks of development celebrity. The article concludes with several brief observations about the politics, possibilities and problematics of development celebrities and the star/poverty spaces that they create.
Abstract:
The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different social, economic and technical actors, which may be defined at various levels of abstraction. It is applied to understanding their interactions and can be adapted to include learning processes and emergent patterns. CASCADE models ‘prosumer’ agents (i.e., producers and/or consumers of energy) and ‘aggregator’ agents (e.g., traders of energy in both wholesale and retail markets) at various scales, from large generators and Energy Service Companies down to individual people and devices. The CASCADE Framework is formed of three main subdivisions that link models of electricity supply and demand, the electricity market and power flow. It can also model the variability of renewable energy generation caused by the weather, which is an important issue for grid balancing and the profitability of energy suppliers. The development of CASCADE has already yielded some interesting early findings, demonstrating that it is possible for a mediating agent (aggregator) to achieve stable demand-flattening across groups of domestic households fitted with smart energy control and communication devices, where direct wholesale price signals had previously been found to produce characteristic complex system instability. In another example, it has demonstrated how large changes in supply mix can be caused even by small changes in demand profile.
Ongoing and planned refinements to the Framework will support investigation of demand response at various scales, the integration of the power sector with transport and heat sectors, novel technology adoption and diffusion work, evolution of new smart grid business models, and complex power grid engineering and market interactions.
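The aggregator-mediated demand-flattening result can be illustrated with a toy agent-based sketch. This is my own invention, not CASCADE code; the agent behaviours, household count and load figures are made up for illustration.

```python
import random

class Prosumer:
    """Toy household agent: a fixed hourly baseline demand plus a
    shiftable flexible load (e.g. a smart appliance)."""
    def __init__(self, rng):
        self.baseline = [1.0 + 0.5 * rng.random() for _ in range(24)]
        self.flexible = 2.0  # kWh that can be scheduled into any one hour

class Aggregator:
    """Mediating agent: places each prosumer's flexible load in the hour
    where the running aggregate demand is currently lowest."""
    def schedule(self, prosumers):
        total = [0.0] * 24
        for p in prosumers:
            for h in range(24):
                total[h] += p.baseline[h]
        for p in prosumers:
            h_min = min(range(24), key=lambda h: total[h])
            total[h_min] += p.flexible
        return total

rng = random.Random(42)
prosumers = [Prosumer(rng) for _ in range(50)]
flattened = Aggregator().schedule(prosumers)
```

By contrast, a direct price signal under which every agent independently chases the same cheapest hour tends to dump all flexible load into one slot and re-create a peak, which is the kind of instability the abstract contrasts with mediated coordination.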
Abstract:
This editorial article introduces this special issue of the International Journal of Human Resource Management devoted to the outcomes of expatriate assignments. We set the topic in context. We start by summarizing the traditional view on expatriate outcomes. We then argue that recent developments in the field suggest the need to build a more sophisticated and complex analysis of the topic that incorporates different perspectives (e.g. the organization, the expatriate, their co-workers and their families) and additional types of international experiences and organizations. We then present some difficulties in developing such an analysis. Specifically, using a new typology of complementary relationships among outcomes (i.e. temporal, among-group and among-outcome consistencies), we point out some complications in achieving those relationships. We conclude by introducing the papers in the special issue, all of which aim in some way to contribute to our understanding of expatriate outcomes.
Abstract:
Burst suppression in the electroencephalogram (EEG) is a well-described phenomenon that occurs during deep anesthesia, as well as in a variety of congenital and acquired brain insults. Classically it is thought of as spatially synchronous, quasi-periodic bursts of high amplitude EEG separated by low amplitude activity. However, its characterization as a “global brain state” has been challenged by recent results obtained with intracranial electrocorticography. Not only does it appear that burst suppression activity is highly asynchronous across cortex, but also that it may occur in isolated regions of circumscribed spatial extent. Here we outline a realistic neural field model for burst suppression by adding a slow process of synaptic resource depletion and recovery, which is able to reproduce qualitatively the empirically observed features during general anesthesia at the whole cortex level. Simulations reveal heterogeneous bursting over the model cortex and complex spatiotemporal dynamics during simulated anesthetic action, and provide forward predictions of neuroimaging signals for subsequent empirical comparisons and more detailed characterization. Because burst suppression corresponds to a dynamical end-point of brain activity, theoretically accounting for its spatiotemporal emergence will vitally contribute to efforts aimed at clarifying whether a common physiological trajectory is induced by the actions of general anesthetic agents. We have taken a first step in this direction by showing that a neural field model can qualitatively match recent experimental data that indicate spatial differentiation of burst suppression activity across cortex.
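The core mechanism described above, fast population activity coupled to a slow synaptic-resource variable, can already produce alternating bursts and suppressions in a single-compartment caricature. All parameter values here are illustrative, and the spatial neural field structure of the actual model is deliberately omitted.

```python
import math

def burst_suppression(T=60.0, dt=1e-3, tau_h=0.05, tau_s=5.0,
                      gain=10.0, depletion=15.0, drive=2.5):
    """Fast mean potential h driven through a sigmoid firing rate that is
    scaled by a slow synaptic resource s: activity depletes s (leading to
    suppression), and quiescence lets s recover (enabling the next burst)."""
    rate = lambda h: 1.0 / (1.0 + math.exp(-gain * (h - 0.5)))
    h, s = 0.0, 1.0
    trace = []
    for _ in range(int(T / dt)):
        r = rate(h)
        dh = (-h + s * drive * r + 0.3) / tau_h       # fast dynamics
        ds = ((1.0 - s) - depletion * s * r) / tau_s  # slow resource
        h += dt * dh
        s += dt * ds
        trace.append(h)
    return trace
```

The separation of time scales (tau_s much larger than tau_h) is what yields the relaxation-oscillation character: the fast subsystem switches between high- and low-activity branches as the slow resource drifts.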
Abstract:
The nature and extent of pre-Columbian (pre-1492 AD) human impact in Amazonia is a contentious issue. The Bolivian Amazon has yielded some of the most impressive evidence for large and complex pre-Columbian societies in the Amazon basin, yet there remains relatively little data concerning the land use of these societies over time. Palaeoecology, when integrated with archaeological data, has the potential to fill these gaps in our knowledge. We present a 6,000-year record of anthropogenic burning, agriculture and vegetation change, from an oxbow lake located adjacent to a pre-Columbian ring-ditch in north-east Bolivia (13°15’44” S, 63°42’37” W). Human occupation around the lake site is inferred from pollen and phytoliths of maize (Zea mays L.) and macroscopic charcoal evidence of anthropogenic burning. First occupation around the lake was radiocarbon dated to ~2500 years BP. The persistence of maize in the record from ~1850 BP suggests that it was an important crop grown in the ring-ditch region in pre-Columbian times, and abundant macroscopic charcoal suggests that pre-Columbian land management entailed more extensive burning of the landscape than the slash-and-burn agriculture practised around the site today. The site was occupied continuously until near-modern times, although there is evidence for a decline in agricultural intensity or change in land use strategy, and possible population decline, from ~600-500 BP. The long and continuous occupation, which predates the establishment of rainforest in the region, suggests that pre-Columbian land use may have had a significant influence on ecosystem development at this site over the last ~2000 years.
Abstract:
The reputation of The Phantom Carriage (Körkarlen) as one of the major films of Swedish silent cinema is in some respects securely established. Yet the film has attracted surprisingly little detailed discussion. It may be that its most striking stylistic features have deflected or discouraged closer scrutiny. Tom Gunning, for instance, in making the case for Sjöström’s Mästerman, argues that ‘Körkarlen wears its technique on its sleeve, overtly displays its unquestionable mastery of superimposition and complex narrative structure. Mästerman tucks its mastery of editing and composition up its sleeve, so to speak’. This article makes an argument for a different evaluation of The Phantom Carriage, bringing a critical and interpretative understanding of the film’s style into conversation with the historical accounts of film form which predominate in the scholarship around silent cinema. It suggests that the film achieves ‘mastery of editing and composition’ with a flexibility and fluidity in the construction of dramatic space that is in itself remarkable for its period, but that Sjöström’s achievements extend well beyond his handling of film space. Specifically, it discusses a segment which is in several respects at the heart of the film: the first meeting between the two central characters, David Holm (Victor Sjöström) and Sister Edit (Astrid Holm); it spans the film’s exact mid-point; and at almost twelve and a half minutes it is the longest uninterrupted passage to take place in a single setting. The article argues that the dramatic and structural centrality of the hostel segment is paralleled by its remarkably rich articulation of the relationships between action, character and space. We show how Sjöström’s creation of a three-dimensional filmic space - with no hint of frontality - becomes the basis for a reciprocal relationship between spatial naturalism and performance style, and for a mise-en-scène that can take on discrete interpretive force.
The argument also places the hostel sequences within the film as a whole in order to show how relationships articulated through the detailed decisions in this section take on their full resonance within patterns and motifs that develop across the film.
Abstract:
Texture is an important visual attribute used to describe the pixel organization in an image. Although texture is easily identified by humans, its analysis demands a high level of sophistication and computational complexity. This paper presents a novel approach for texture analysis, based on analyzing the complexity of the surface generated from a texture in order to describe and characterize it. The proposed method produces a texture signature which is able to efficiently characterize different texture classes. The paper also illustrates the method's performance in an experiment using texture images of leaves. Leaf identification is a difficult and complex task due to the nature of plants, which present huge pattern variation. The high classification rate yielded shows the potential of the method, improving on traditional texture techniques such as Gabor filters and Fourier analysis.
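The general idea, treating the gray-level image as a surface and summarizing its complexity across scales, can be sketched as follows. This simple multiscale roughness measure is a stand-in of my own, not the signature actually proposed in the paper.

```python
import numpy as np

def surface_signature(img, scales=(2, 4, 8, 16)):
    """Multiscale roughness of the gray-level 'surface' of a texture:
    for each box size r, the mean (max - min) height variation inside
    non-overlapping r x r boxes -- one signature value per scale."""
    img = np.asarray(img, dtype=float)
    sig = []
    for r in scales:
        nh, nw = img.shape[0] // r, img.shape[1] // r
        blocks = img[:nh * r, :nw * r].reshape(nh, r, nw, r)
        roughness = (blocks.max(axis=(1, 3)) - blocks.min(axis=(1, 3))).mean()
        sig.append(roughness)
    return np.array(sig)
```

The resulting vector can be fed to any standard classifier; how the signature grows with box size is what distinguishes a rough (complex) surface from a smooth one.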
Abstract:
Harold Pinter’s A Night Out is a significant but rarely produced piece of drama; consequently, there is very little criticism to support or contradict my argument. I chose this particular play in order to open doors for academic research and to establish it as an equal to its sister plays. I will raise questions and topics to show that the play is worth the reader’s time and effort and that A Night Out is a sharp piece of political theatre. Although at first glance it is a simple enough story, a straightforward tale of the nasty consequences of motherly love pushed to the limit, on deeper inspection a more far-reaching and complex analysis of the abuse of power can be observed. The play offers a variety of themes, including interpersonal power struggles, failed attempts at communication, antagonistic relationships, the threat of impending or past violence, the struggle for survival or identity, domination and submission, politics, lies, and verbal, physical, psychological and sexual abuse. The prevailing theme in the play is the abuse of power: powerful parties oppressing weaker ones, and the oppressed party seeking a vent in someone even weaker than themselves.
Clean Code vs Dirty Code: A field experiment to explain how Clean Code affects code comprehension
Abstract:
Large and complex codebases with poor code comprehension are an increasingly common problem among companies today. Poor code comprehension results in more time spent maintaining and modifying code, which for a company leads to increased costs. Clean Code is considered by some to be the solution to this problem. Clean Code is a collection of guidelines and principles for writing code that is easy to understand and maintain. A knowledge gap was identified concerning empirical data on how Clean Code affects code comprehension. The study's research question was: How is comprehension affected when modifying code that has been refactored according to the Clean Code principles for naming and writing functions? To investigate this, a field experiment was conducted with the company CGM Lab Scandinavia in Borlänge, in which data on time spent and perceived comprehension among test participants was collected and analysed. The results show no clear improvement or deterioration in code comprehension, as only perceived comprehension appears to be affected. All participants preferred Clean Code over Dirty Code even though the time spent was unaffected. This leads to the conclusion that the effects of Clean Code may not be immediate, since developers have not yet adapted to it and therefore cannot exploit it fully. The study gives an indication of Clean Code's potential to improve code comprehension.
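To make the experimental manipulation concrete, here is an invented miniature example (not taken from the study's materials) of the same computation written as "Dirty" code and then refactored according to the naming and small-function principles the study examines:

```python
# "Dirty": cryptic names, one function doing everything.
def calc(d, t):
    r = 0
    for x in d:
        if x[1] > t:
            r += x[0] * x[1]
    return r

# "Clean": intention-revealing names, one job per function.
def is_above_threshold(item, threshold):
    _, quantity = item
    return quantity > threshold

def line_total(item):
    price, quantity = item
    return price * quantity

def total_of_large_orders(orders, threshold):
    """Sum price * quantity over all orders whose quantity exceeds threshold."""
    return sum(line_total(o) for o in orders
               if is_above_threshold(o, threshold))
```

Both versions are functionally identical; the study's question is whether the second version measurably speeds up comprehension during modification, or only feels clearer.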
Abstract:
The outcome of an audience study supports theories stating that stories are a primary means by which we make sense of our experiences over time. Empirical examples of narrative impact are presented in which specific fiction film scenes condense spectators' lives, identities and beliefs. One conclusion is that spectators test the emotional realism of the narrative for greater significance, connecting diegetic fiction experiences with their extra-diegetic world in their quest for meaning, self and identity. The 'banal' notion of the mediatization of religion theory is questioned as unsatisfactory in the theoretical context of individualized meaning-making processes. As a semantically negatively charged concept, it is problematic when analyzing empirical examples of spectators' use of fictional narratives, especially when trying to characterize the idiosyncratic and complex interplay between spectators' fiction emotions and their testing of mediated narratives in an exercise to find moral significance in extra-filmic life. Instead vernacular meaning-making is proposed.
Abstract:
We propose a preliminary methodology for agent-oriented software engineering based on the idea of agent interaction analysis. This approach uses interactions between undetermined agents as the primary component of analysis and design. Agents as a basis for software engineering are useful because they provide a powerful and intuitive abstraction which can increase the comprehensibility of a complex design. The paper describes a process by which the designer can derive the interactions that can occur in a system satisfying the given requirements and use them to design the structure of an agent-based system, including the identification of the agents themselves. We suggest that this approach has the flexibility necessary to provide agent-oriented designs for open and complex applications, and has value for future maintenance and extension of these systems.