89 results for Reflective abstraction
Abstract:
This article applies FIMIX-PLS segmentation methodology to detect and explore unanticipated reactions to organisational strategy among stakeholder segments. For many large organisations today, the tendency to apply a “one-size-fits-all” strategy to members of a stakeholder population, commonly driven by a desire for simplicity, efficiency and fairness, may actually result in unanticipated consequences amongst specific subgroups within the target population. This study argues that it is critical for organisations to understand the varying and potentially harmful effects of strategic actions across differing, and previously unidentified, segments within a stakeholder population. The case of a European revenue service that currently focuses its strategic actions on building trust and compliant behaviour amongst taxpayers is used as the context for this study. FIMIX-PLS analysis is applied to a sample of 501 individual taxpayers, while a novel PLS-based approach for assessing measurement model invariance that can be applied to both reflective and formative measures is also introduced for the purpose of multi-group comparisons. The findings suggest that individual taxpayers can be split into two equal-sized segments with highly differentiated characteristics and reactions to organisational strategy and communications. Compliant behaviour in the first segment (n = 223), labelled “relationships centred on trust,” is mainly driven through positive service experiences and judgements of competence, while judgements of benevolence lead to the unanticipated reaction of increasing distrust among this group. Conversely, compliant behaviour in the second segment (n = 278), labelled “relationships centred on distrust,” is driven by the reduction of fear and scepticism towards the revenue service, which is achieved through signalling benevolence, reduced enforcement and the lower incidence of negative stories. 
In this segment, the use of enforcement has the unanticipated and counterproductive effect of ultimately reducing compliant behaviour.
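FIMIX-PLS recovers latent segments by fitting a finite mixture of PLS path models, which cannot be reproduced in a few lines. The finite-mixture idea underneath it can, however, be sketched with a stdlib EM fit of a two-component one-dimensional Gaussian mixture. Everything below — the data, the function name, the two "segments" — is invented for illustration and is not the study's actual estimation procedure.

```python
import math
import random

def em_two_gaussians(xs, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM."""
    mu1, mu2 = min(xs), max(xs)                      # spread the initial means apart
    s1 = s2 = (max(xs) - min(xs)) / 4 or 1.0
    w = 0.5                                          # mixing weight of component 1
    for _ in range(iters):
        # E-step: responsibility of component 1 for each observation
        # (the shared 1/sqrt(2*pi) constant cancels in the ratio)
        r = []
        for x in xs:
            p1 = w * math.exp(-((x - mu1) ** 2) / (2 * s1 ** 2)) / s1
            p2 = (1 - w) * math.exp(-((x - mu2) ** 2) / (2 * s2 ** 2)) / s2
            r.append(p1 / (p1 + p2))
        # M-step: re-estimate weight, means and standard deviations
        n1 = sum(r)
        n2 = len(xs) - n1
        w = n1 / len(xs)
        mu1 = sum(ri * x for ri, x in zip(r, xs)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n2
        s1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, xs)) / n1) or 1e-6
        s2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, xs)) / n2) or 1e-6
    return mu1, mu2, w

random.seed(1)
# Two latent "segments" with different response levels (synthetic data)
data = [random.gauss(2.0, 0.5) for _ in range(250)] + \
       [random.gauss(5.0, 0.5) for _ in range(250)]
mu1, mu2, w = em_two_gaussians(data)
```

In FIMIX-PLS the mixture is over regression relationships rather than raw scores, but the same E/M alternation assigns each respondent a probabilistic segment membership, which is then hardened for multi-group comparison.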
Abstract:
Until recently, pollution control in rural drainage basins of the UK consisted solely of water treatment at the point of abstraction. However, prevention of agricultural pollution at source is now a realistic option given the possibility of financing the necessary changes in land use through modification of the Common Agricultural Policy. This paper uses a nutrient export coefficient model to examine the cost of land-use change in relation to improvement of water quality. Catchment-wide schemes and local protection measures are considered. Modelling results underline the need for integrated management of entire drainage basins. A wide range of benefits may accrue from land-use change, including enhanced habitats for wildlife as well as better drinking water.
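An export coefficient model of the kind described estimates the catchment nutrient load as the sum, over land-use categories, of a per-hectare export coefficient times the area under that use, so land-use-change scenarios can be costed against the load reduction they buy. The categories and coefficient values below are invented for illustration; the paper's calibrated coefficients are not reproduced here.

```python
# Hypothetical phosphorus export coefficients (kg P per hectare per year)
EXPORT_KG_PER_HA = {
    "arable": 0.65,
    "improved_pasture": 0.30,
    "rough_grazing": 0.05,
    "woodland": 0.02,
}

def total_load(areas_ha):
    """Catchment nutrient load (kg/yr) from a land-use area breakdown."""
    return sum(EXPORT_KG_PER_HA[use] * a for use, a in areas_ha.items())

# Scenario comparison: convert 1500 ha of arable land to woodland
current = {"arable": 4000, "improved_pasture": 2500,
           "rough_grazing": 1500, "woodland": 500}
converted = {"arable": 2500, "improved_pasture": 2500,
             "rough_grazing": 1500, "woodland": 2000}

print(total_load(current))    # load before land-use change
print(total_load(converted))  # load after the conversion
```

Dividing the cost of each conversion by the load reduction it achieves is what allows catchment-wide schemes and local protection measures to be ranked on a common basis.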
Abstract:
Objective: This work investigates the nature of the comprehension impairment in Wernicke’s aphasia, by examining the relationship between deficits in auditory processing of fundamental, non-verbal acoustic stimuli and auditory comprehension. Wernicke’s aphasia, a condition resulting in severely disrupted auditory comprehension, primarily occurs following a cerebrovascular accident (CVA) to the left temporo-parietal cortex. Whilst damage to posterior superior temporal areas is associated with auditory linguistic comprehension impairments, functional imaging indicates that these areas may not be specific to speech processing but part of a network for generic auditory analysis. Methods: We examined analysis of basic acoustic stimuli in Wernicke’s aphasia participants (n = 10) using auditory stimuli reflective of theories of cortical auditory processing and of speech cues. Auditory spectral, temporal and spectro-temporal analysis was assessed using pure tone frequency discrimination, frequency modulation (FM) detection and the detection of dynamic modulation (DM) in “moving ripple” stimuli. All tasks used criterion-free, adaptive measures of threshold to ensure reliable results at the individual level. Results: Participants with Wernicke’s aphasia showed normal frequency discrimination but significant impairments in FM and DM detection, relative to age- and hearing-matched controls at the group level (n = 10). At the individual level, there was considerable variation in performance, and thresholds for both frequency and dynamic modulation detection correlated significantly with auditory comprehension abilities in the Wernicke’s aphasia participants. 
Conclusion: These results demonstrate the co-occurrence of a deficit in fundamental auditory processing of temporal and spectro-temporal non-verbal stimuli in Wernicke's aphasia, which may contribute causally to the auditory language comprehension impairment. Results are discussed in the context of traditional neuropsychology and current models of cortical auditory processing.
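The abstract does not specify which adaptive procedure was used beyond it being criterion-free, but a common choice is a two-down/one-up staircase, which converges on the stimulus level detected about 71% of the time. The sketch below runs such a staircase against a crude simulated observer; the observer model, start level, step schedule and reversal count are all invented for illustration.

```python
import random

def staircase_threshold(true_threshold, start=40.0, step=4.0, reversals_needed=10):
    """Two-down/one-up adaptive staircase with a simulated observer.

    The observer detects the stimulus whenever its level exceeds a fixed
    internal threshold plus Gaussian noise (a deliberately crude model).
    The threshold estimate is the mean level at the later reversals.
    """
    level, direction = start, -1
    correct_run, reversals, levels_at_reversal = 0, 0, []
    while reversals < reversals_needed:
        detected = level + random.gauss(0, 1.0) > true_threshold
        if detected:
            correct_run += 1
            if correct_run == 2:                 # two correct -> make task harder
                correct_run = 0
                if direction == +1:              # track direction changes
                    reversals += 1
                    levels_at_reversal.append(level)
                direction = -1
                level -= step
        else:                                    # one wrong -> make task easier
            correct_run = 0
            if direction == -1:
                reversals += 1
                levels_at_reversal.append(level)
            direction = +1
            level += step
        step = max(step * 0.9, 1.0)              # shrink step size over time
    return sum(levels_at_reversal[2:]) / len(levels_at_reversal[2:])

random.seed(7)
est = staircase_threshold(true_threshold=20.0)
```

Because the level homes in on the observer's own performance, the estimate is independent of any response criterion the participant adopts, which is what makes such measures reliable at the individual level.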
Abstract:
The discourse surrounding the virtual has moved away from the utopian thinking accompanying the rise of the Internet in the 1990s. The cyber-gurus of recent decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has more recently been replaced with a more complex model in which the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or a semi-private one (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web's strongest entities (Amazon, eBay, Gumtree etc.) sit exactly at this juncture, applying tools taken from the knowledge management industry to organize the chaos of the material world along (post-)Fordist rationality. Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but many of these were reluctant to let go of the fantasy of digital freedom. Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production.
The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, a conceptual text piece, etc. About the works and artists: Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org The work of Bristol-based artist Jem Noble opens up a dialogue between the contemporary and the legacy of 20th-century conceptual art around questions of collectivism and participation, authorship and individualism.
His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Together, Chaney and Hernly developed a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops found in the south west of England, the user can, for example, create a balanced but protein-poor diet, or simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists who run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies. But gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners.
www.kollectiv.co.uk
Abstract:
The fungal family Clavicipitaceae includes plant symbionts and parasites that produce several psychoactive and bioprotective alkaloids. The family includes grass symbionts in the epichloae clade (Epichloë and Neotyphodium species), which are extraordinarily diverse both in their host interactions and in their alkaloid profiles. Epichloae produce alkaloids of four distinct classes, all of which deter insects, and some—including the infamous ergot alkaloids—have potent effects on mammals. The exceptional chemotypic diversity of the epichloae may relate to their broad range of host interactions, whereby some are pathogenic and contagious, others are mutualistic and vertically transmitted (seed-borne), and still others vary in pathogenic or mutualistic behavior. We profiled the alkaloids and sequenced the genomes of 10 epichloae, three ergot fungi (Claviceps species), a morning-glory symbiont (Periglandula ipomoeae), and a bamboo pathogen (Aciculosporium take), and compared the gene clusters for four classes of alkaloids. Results indicated a strong tendency for alkaloid loci to have conserved cores that specify the skeleton structures and peripheral genes that determine chemical variations that are known to affect their pharmacological specificities. Generally, gene locations in cluster peripheries positioned them near to transposon-derived, AT-rich repeat blocks, which were probably involved in gene losses, duplications, and neofunctionalizations. The alkaloid loci in the epichloae had unusual structures riddled with large, complex, and dynamic repeat blocks. This feature was not reflective of overall differences in repeat contents in the genomes, nor was it characteristic of most other specialized metabolism loci. The organization and dynamics of alkaloid loci and abundant repeat blocks in the epichloae suggested that these fungi are under selection for alkaloid diversification. 
We suggest that such selection is related to the variable life histories of the epichloae, their protective roles as symbionts, and their associations with the highly speciose and ecologically diverse cool-season grasses.
Abstract:
Full-waveform laser scanning data acquired with a Riegl LMS-Q560 instrument were used to classify an orange orchard into orange trees, grass and ground using waveform parameters alone. Gaussian decomposition was performed on data captured during the National Airborne Field Experiment in November 2006, using a custom peak-detection procedure and a trust-region-reflective algorithm for fitting Gaussian functions. Calibration was carried out using waveforms returned from a road surface, and the backscattering coefficient c was derived for every waveform peak. The processed data were then analysed according to the number of returns detected within each waveform and classified into three classes based on pulse width and c. For single-peak waveforms the scatterplot of c versus pulse width was used to distinguish between ground, grass and orange trees. In the case of multiple returns, the relationship between first (or first plus middle) and last return c values was used to separate ground from other targets. Refinement of this classification, and further sub-classification into grass and orange trees, was performed using the c versus pulse width scatterplots of last returns. In all cases the separation was carried out using a decision tree with empirical relationships between the waveform parameters. Ground points were successfully separated from orange tree points. The most difficult class to separate and verify was grass, but those points in general corresponded well with the grass areas identified in the aerial photography. The overall accuracy reached 91%, using photography and relative elevation as ground truth. The overall accuracy for two classes, orange tree and a combined class of grass and ground, reached 95%. Finally, the backscattering coefficient c of single-peak waveforms was also used to derive reflectance values of the three classes.
The reflectance of the orange tree class (0.31) and ground class (0.60) are consistent with published values at the wavelength of the Riegl scanner (1550 nm). The grass class reflectance (0.46) falls in between the other two classes as might be expected, as this class has a mixture of the contributions of both vegetation and ground reflectance properties.
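Gaussian decomposition of a full-waveform return has two stages: detect candidate peaks, then fit Gaussian amplitude, position and width to each echo (the paper uses a trust-region-reflective fitter for the second stage, as offered by, e.g., SciPy's `least_squares(method='trf')`). The stdlib sketch below shows only the first stage on a synthetic two-return waveform, with width estimated crudely from the half-maximum crossings; all numbers are invented.

```python
import math

def gaussian(t, amp, mu, sigma):
    return amp * math.exp(-((t - mu) ** 2) / (2 * sigma ** 2))

def detect_peaks(samples, noise_floor):
    """Indices of local maxima above the noise floor (crude peak detection)."""
    return [i for i in range(1, len(samples) - 1)
            if samples[i] > noise_floor
            and samples[i] >= samples[i - 1] and samples[i] > samples[i + 1]]

def fwhm(samples, peak_idx, dt=1.0):
    """Full width at half maximum around a detected peak (sample resolution)."""
    half = samples[peak_idx] / 2
    left = peak_idx
    while left > 0 and samples[left] > half:
        left -= 1
    right = peak_idx
    while right < len(samples) - 1 and samples[right] > half:
        right += 1
    return (right - left) * dt

# Synthetic two-return waveform: canopy echo + ground echo, 1 ns sampling
t_axis = [float(i) for i in range(100)]
wave = [gaussian(t, 120, 30.0, 3.0) + gaussian(t, 80, 62.0, 2.5) for t in t_axis]

peaks = detect_peaks(wave, noise_floor=10.0)
widths = [fwhm(wave, p) for p in peaks]
```

The fitted amplitudes and widths from the real decomposition are what feed the pulse-width-versus-c decision tree described above; these moment-style estimates would only serve as starting values for the trust-region fit.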
Abstract:
We present a well-dated, high-resolution, ~ 45 kyr lake sediment record reflecting regional temperature and precipitation change in the continental interior of the Southern Hemisphere (SH) tropics of South America. The study site is Laguna La Gaiba (LLG), a large lake (95 km2) hydrologically-linked to the Pantanal, an immense, seasonally-flooded basin and the world's largest tropical wetland (135,000 km2). Lake-level changes at LLG are therefore reflective of regional precipitation. We infer past fluctuations in precipitation at this site through changes in: i) pollen-inferred extent of flood-tolerant forest; ii) relative abundance of terra firme humid tropical forest versus seasonally-dry tropical forest pollen types; and iii) proportions of deep- versus shallow-water diatoms. A probabilistic model, based on plant family and genus climatic optima, was used to generate quantitative estimates of past temperature from the fossil pollen data. Our temperature reconstruction demonstrates rising temperature (by 4 °C) at 19.5 kyr BP, synchronous with the onset of deglacial warming in the central Andes, strengthening the evidence that climatic warming in the SH tropics preceded deglacial warming in the Northern Hemisphere (NH) by at least 5 kyr. We provide unequivocal evidence that the climate at LLG was markedly drier during the last glacial period (45.0–12.2 kyr BP) than during the Holocene, contrasting with SH tropical Andean and Atlantic records that demonstrate a strengthening of the South American summer monsoon during the global Last Glacial Maximum (~ 21 kyr BP), in tune with the ~ 20 kyr precession orbital cycle. Holocene climate conditions occurred as early as 12.8–12.2 kyr BP, when increased precipitation in the Pantanal catchment caused heightened flooding and rising lake levels in LLG. 
In contrast to this strong geographic variation in LGM precipitation across the continent, expansion of tropical dry forest between 10 and 3 kyr BP at LLG strengthens the body of evidence for widespread early–mid Holocene drought across tropical South America.
Abstract:
The study of intuition is an emerging area of research in psychology, social sciences, and business studies. It is increasingly of interest to the study of management, for example in decision-making as a counterpoint to structured approaches. Recently, work has been undertaken to conceptualize a construct for the intuitive nature of technology. However, to date there is no common understanding of the term intuition in information systems (IS) research. This paper extends the study of intuition in IS research by using exploratory research to categorize the use of the word "intuition" and related terms in papers published in two prominent IS journals over a ten-year period. The entire text of MIS Quarterly and Information Systems Research was reviewed for the years 1999 through 2008 using searchable PDF versions of these publications. As far as could be determined, this is the first application of this approach in the analysis of the text of IS academic journals. The use of the word "intuition" and related terms was categorized using coding consistent with Grounded Theory. The focus of this research was on the first two stages of Grounded Theory analysis: the development of codes and constructs. Saturation of coding was not reached; an extended review of these publications would be required to enable theory development. Over 400 incidents of the use of "intuition" and related terms were found in the articles reviewed. The most prominent use of the term "intuition" was coded as "Intuition as Authority", in which intuition was used to validate a research objective or finding, representing approximately 37 per cent of codes assigned. The second most common coding occurred in research articles with mathematical analysis, representing about 19 per cent of the codes assigned, for example where a mathematical formulation or result was "intuitive".
The possibly most impactful use of the term "intuition" was "Intuition as Outcome", representing approximately 7 per cent of all coding, which characterized research results as adding to the intuitive understanding of a research topic or phenomenon. This research contributes to a greater theoretical understanding of intuition, enabling insight into the use of intuition and the eventual development of a theory on the use of intuition in academic IS research publications. It also provides potential benefits to practitioners by providing insight into and validation of the use of intuition in IS management. Research directions include the creation of reflective and/or formative constructs for intuition in information systems research.
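The percentages reported above are simple shares of incident-level code assignments. A minimal sketch of that tally, using invented counts chosen only to reproduce the reported proportions (the study's actual code list is longer):

```python
from collections import Counter

# Hypothetical incident-level codes from open coding (invented counts)
codes = (["Intuition as Authority"] * 150
         + ["Intuition in Mathematical Analysis"] * 76
         + ["Intuition as Outcome"] * 28
         + ["Other"] * 146)

tally = Counter(codes)
total = sum(tally.values())
# Percentage share of each code, largest first
shares = {code: round(100 * n / total, 1) for code, n in tally.most_common()}
print(shares)
```

With 400 incidents, 150 "Intuition as Authority" assignments give the reported ~37 per cent; the same division yields the mathematical-analysis and outcome shares.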
Abstract:
Problem-Based Learning, despite recent controversies about its effectiveness, is used extensively as a teaching method throughout higher education. In meteorology, there has been little attempt to incorporate Problem-Based Learning techniques into the curriculum. Motivated by a desire to enhance the reflective engagement of students within a current field course module, this project describes the implementation of two test Problem-Based Learning activities, and their subsequent testing and improvement using several different and complementary means of evaluation. By the end of a 2-year program of design, implementation, testing, reflection and re-evaluation, two robust, engaging activities have been developed that provide an enhanced and diverse learning environment in the field course. The results suggest that Problem-Based Learning techniques would be a useful addition to the meteorology curriculum, and suggestions for courses and activities that may benefit from this approach are included in the conclusions.
Abstract:
Geoengineering by injection of reflective aerosols into the stratosphere has been proposed as a way to counteract the warming effect of greenhouse gases by reducing the intensity of solar radiation reaching the surface. Here, climate model simulations are used to examine the effect of geoengineering on the tropical overturning circulation. The strength of the circulation is related to the atmospheric static stability and has implications for tropical rainfall. The tropical circulation is projected to weaken under anthropogenic global warming. Geoengineering with stratospheric sulfate aerosol does not mitigate this weakening of the circulation. This response is due to a fast adjustment of the troposphere to radiative heating from the aerosol layer. This effect is not captured when geoengineering is modelled as a reduction in total solar irradiance, suggesting caution is required when interpreting model results from solar dimming experiments as analogues for stratospheric aerosol geoengineering.
Abstract:
The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different social, economic and technical actors, which may be defined at various levels of abstraction. It is applied to understanding their interactions and can be adapted to include learning processes and emergent patterns. CASCADE models 'prosumer' agents (i.e., producers and/or consumers of energy) and 'aggregator' agents (e.g., traders of energy in both wholesale and retail markets) at various scales, from large generators and Energy Service Companies down to individual people and devices. The CASCADE Framework is formed of three main subdivisions that link models of electricity supply and demand, the electricity market and power flow. It can also model the variability of renewable energy generation caused by the weather, which is an important issue for grid balancing and the profitability of energy suppliers. The development of CASCADE has already yielded some interesting early findings, demonstrating that it is possible for a mediating agent (aggregator) to achieve stable demand-flattening across groups of domestic households fitted with smart energy control and communication devices, where direct wholesale price signals had previously been found to produce characteristic complex system instability. In another example, it has demonstrated how large changes in supply mix can be caused even by small changes in demand profile.
Ongoing and planned refinements to the Framework will support investigation of demand response at various scales, the integration of the power sector with transport and heat sectors, novel technology adoption and diffusion work, evolution of new smart grid business models, and complex power grid engineering and market interactions.
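The instability reported under direct price signals can be illustrated with a deliberately naive stdlib toy (not the CASCADE framework itself): if every household shifts its flexible load to the cheapest hour of an identical broadcast price, demand herds into a single hour instead of flattening. All classes, parameters and the response rule below are invented; the real framework's aggregator is designed precisely to avoid this effect.

```python
import random

random.seed(3)

class Prosumer:
    """Toy household agent: a fixed base load plus a share it can shift in time."""
    def __init__(self):
        self.base = [random.uniform(0.5, 1.5) for _ in range(24)]  # kW per hour
        self.flexible_share = 0.3

    def respond(self, price):
        """Naive response: dump all flexible load into the single cheapest hour."""
        cheapest = min(range(24), key=lambda h: price[h])
        load = [b * (1 - self.flexible_share) for b in self.base]
        load[cheapest] += self.flexible_share * sum(self.base)
        return load

agents = [Prosumer() for _ in range(100)]
aggregate = [sum(a.base[h] for a in agents) for h in range(24)]
initial_peak = max(aggregate)

for day in range(5):
    price = aggregate[:]                       # price broadcast proportional to load
    loads = [a.respond(price) for a in agents]
    aggregate = [sum(l[h] for l in loads) for h in range(24)]

herding_peak = max(aggregate)                  # far exceeds the original peak
```

Because all agents see the same signal and react identically, the "cheap" hour becomes tomorrow's most expensive one and the spike migrates rather than dissipating — the characteristic oscillation a mediating aggregator must damp.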
Abstract:
This paper draws upon fieldwork undertaken across Kenya, Zambia, Mozambique and South Africa to present a reflective overview of the use of financial services amongst the poorest members of society. It considers the role that access to a portfolio of financial products and services may have as a contributory factor in poverty alleviation, but also how inappropriate use of these mechanisms may exacerbate a descent into poverty. This work draws upon the notions of poverty pools and the rise and fall of low-income households in and out of poverty, alongside the contributory nature of vicious cycles of economic and political poverty. Drawing on fieldwork experiences, it presents a synopsis of the types of financial mechanisms commonly in use on the African continent, as well as examples of public, private and civil society partnerships that are producing services specifically tailored for those in extreme and absolute poverty.
Abstract:
Samples containing red pigment have been collected from two different archaeological sites dating to the Neolithic (Çatalhöyük in Turkey and Sheikh-e Abad in Iran) and have been analysed by a range of techniques. Sub-samples were examined by IR spectroscopy and X-ray diffraction, whilst thin sections were studied using optical polarising microscopy, synchrotron based IR microscopy and environmental scanning electron microscopy with energy dispersive X-ray analysis. Thin layers of red paint in a wall painting from Çatalhöyük were found to contain ochre (hematite and clay) as well as an unexpected component, grains of red and colourless obsidian, which have not been identified in any previous studies of the wall paintings at Çatalhöyük. These small grains of obsidian may have improved the reflective properties of the paint and made the artwork more vivid in the darkness of the buildings. Analysis of a roughly shaped ball of red sediment found on a possible working surface at Sheikh-e Abad revealed that the cause of the red colouring was the mineral hematite, which was probably from a source of terra rossa sediment in the local area. The results of this work suggest it is unlikely that this had been altered by the Neolithic people through mixing with other minerals.
Abstract:
This paper concerns the philosophical significance of a choice about how to design the context-shifting experiments used by contextualists and anti-intellectualists: Should contexts be judged jointly, with contrast, or separately, without contrast? Findings in experimental psychology suggest (1) that certain contextual features are difficult to evaluate when considered separately, and there are reasons to think that one feature that interests contextualists and anti-intellectualists—stakes or importance—is such a difficult to evaluate attribute, and (2) that joint evaluation of contexts can yield judgments that are more reflective and rational in certain respects. With those two points in mind, a question is raised about what source of evidence provides better support for philosophical theories of how contextual features affect knowledge ascriptions and evidence: Should we prefer evidence consisting of "ordinary" judgments, or more reflective, perhaps more rational judgments? That question is answered in relation to different accounts of what such theories aim to explain, and it is concluded that evidence from contexts evaluated jointly should be an important source of evidence for contextualist and anti-intellectualist theories, a conclusion that is at odds with the methodology of some recent studies in experimental epistemology.
Abstract:
Intuition is an important and under-researched concept in information systems. Prior exploratory research has shown that there is potential to characterize the use of intuition in academic information systems research. This paper extends this research to all of the available issues of two leading IS journals with the aim of reaching an approximation of theoretical saturation. Specifically, the entire text of MISQ and ISR was reviewed for the years 1990 through 2009 using searchable PDF versions of these publications. All references to intuition were coded on a basis consistent with Grounded Theory, interpreted as a gestalt and represented as a mind-map. In the period 1990-2009, 681 incidents of the use of "intuition" and related terms were found in the articles reviewed, representing a greater range of codes than prior research. In addition, codes were assigned to all issues of MIS Quarterly from commencement of publication to the end of the 2012 publication year to support the conjecture that coding saturation has been approximated. The most prominent use of the term "intuition" was coded as "Intuition as Authority", in which intuition was used to validate a statement, research objective or finding, representing approximately 34 per cent of codes assigned. In research articles where mathematical analysis was presented, researchers not infrequently commented on the degree to which a mathematical formulation was "intuitive"; this was the second most common coding, representing approximately 16 per cent of the codes. The possibly most impactful use of the term "intuition" was "Intuition as Outcome", representing approximately 7 per cent of all coding, which characterized research results as adding to the intuitive understanding of a research topic or phenomenon. This research aims to contribute to a greater theoretical understanding of the use of intuition in academic IS research publications.
It also offers potential benefits to practitioners by providing insight into the use of intuition in IS management, for example by emphasizing the emerging importance of "intuitive technology". Research directions include the creation of reflective and/or formative constructs for intuition in information systems research and the expansion of this novel research method to additional IS academic publications and topics.