829 results for Frames and Locales
Abstract:
One can do research in pointfree topology in two ways. The first is the contravariant way, where research is done in the category Frm but the ultimate objective is to obtain results in Loc. The other is the covariant way, in which research is carried out in the category Loc itself directly. According to Johnstone [23], "frame theory is lattice theory applied to topology whereas locale theory is topology itself". Most of this thesis is written according to the first view. In this thesis, we attempt to study (1) the frame counterparts of maximal compactness, minimal Hausdorffness and reversibility, and (2) the automorphism group of a finite frame and its relation to the subgroups of the permutation group on the generator set of the frame.
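As a toy illustration of the second point, the sketch below enumerates the automorphisms of a very small finite frame represented as a lattice by its order relation; the four-element Boolean frame and all names used here are illustrative assumptions, not the thesis's construction.

```python
# A minimal sketch (illustrative only): automorphisms of a small finite frame,
# i.e. bijections of its elements that preserve the order in both directions.
from itertools import permutations

# Elements of the frame 0 < a, b < 1 (the open-set lattice of a two-point discrete space).
elements = ["0", "a", "b", "1"]
leq = {("0", x) for x in elements} | {(x, x) for x in elements} | \
      {("a", "1"), ("b", "1")}

def is_automorphism(perm):
    f = dict(zip(elements, perm))
    return all(((f[x], f[y]) in leq) == ((x, y) in leq)
               for x in elements for y in elements)

automorphisms = [perm for perm in permutations(elements) if is_automorphism(perm)]
print(len(automorphisms), "automorphisms")   # expect 2: the identity and the swap of a and b
```

For this frame the automorphism group is the two-element subgroup of the permutation group on {a, b} generated by the swap, which is the kind of relationship between automorphisms and permutations of generators that the thesis studies in general.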
Abstract:
How do humans respond to their social context? This question is becoming increasingly urgent in a society where democracy requires that the citizens of a country help to decide upon its policy directions, and yet those citizens frequently have very little knowledge of the complex issues that these policies seek to address. Frequently, we find that humans make their decisions more with reference to their social setting than to the arguments of scientists, academics, and policy makers. It is broadly anticipated that agent-based modelling (ABM) of human behaviour will make it possible to treat such social effects, but we take the position here that a more sophisticated treatment of context will be required in many such models. While notions such as historical context (where the past history of an agent might affect its later actions) and situational context (where the agent will choose a different action in a different situation) abound in ABM scenarios, we will discuss a case of a potentially changing context, where social effects can have a strong influence upon the perceptions of a group of subjects. In particular, we shall discuss a recently reported case where a biased worm (an on-screen audience-response graph) in an election debate led to significant distortions in the reports given by participants as to who won the debate (Davis et al. 2011). Thus, participants in a different social context drew different conclusions about the perceived winner of the same debate, with associated significant differences between the two groups as to who they would vote for in the coming election. We extend this example to the problem of modelling the likely electoral responses of agents in the context of the climate change debate, and discuss the notion of interference between related questions that might be asked of an agent in a social simulation intended to simulate their likely responses. A modelling technology which could account for such strong social contextual effects would benefit regulatory bodies that need to navigate between multiple interests and concerns, and we shall present one viable avenue for constructing such a technology. A geometric approach will be presented, where the internal state of an agent is represented in a vector space, and their social context is naturally modelled as a set of basis states that are chosen with reference to the problem space.
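A minimal sketch of the geometric representation described in the closing sentences: the agent's internal state is a unit vector and each social context is an orthonormal basis, with response probabilities given by squared projections. All numbers and basis choices below are illustrative assumptions rather than the authors' model.

```python
# A toy sketch: the same agent state yields different response probabilities
# when "measured" against the basis states of two different social contexts.
import numpy as np

def response_probabilities(state, context_basis):
    """Squared overlaps of the (unit) state with each basis vector of a context."""
    return np.array([float(np.dot(b, state)) ** 2 for b in context_basis])

state = np.array([0.8, 0.6])                                     # agent's internal state (unit vector)
context_a = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]         # one social context
theta = np.pi / 6                                                # a rotated basis: same question, different context
context_b = [np.array([np.cos(theta), np.sin(theta)]),
             np.array([-np.sin(theta), np.cos(theta)])]

print("context A:", response_probabilities(state, context_a))
print("context B:", response_probabilities(state, context_b))
```

The point of the sketch is simply that changing the context basis changes the probabilities assigned to the same two answers, which is the kind of context dependence the abstract argues conventional ABM treatments miss.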
Abstract:
Introduction- This study investigates the prevailing status of Nepalese media portrayal of natural disasters. It contributes to the development of a disaster management model to improve the effectiveness and efficiency of news production throughout the continuum of the prevention, preparedness, response and recovery (PPRR) phases of disaster management. Theoretical framework- Studies of media content often rely on framing as the theoretical underpinning of the study, as it describes how the press crafts the message. However, there are additional theoretical perspectives that underpin an understanding of the role of the media. This article outlines a conceptual understanding of the role of the media in modern society, the way this conceptual understanding is used in the crafting of media messages, and how those theoretical considerations are applied to the concepts that underpin effective disaster management (R.M. Entman, 2003; Liu, 2007; Meng & Berger, 2008). Methodology- A qualitative descriptive design is used to analyse the disaster news of Nepal Television (NTV). This paper presents the preliminary findings for Nepal Television (a government-owned television station) from a qualitative content analysis of 105 natural-disaster-related news scripts (June 2012-March 2013), based on framing theory and the PPRR cycle. Results- The preliminary results indicate that the media focus while framing natural disasters is dominated by the human interest frame, followed by the responsibility frame. News about the response phase was found to be the most prominent in terms of the PPRR cycle. Limited disaster reporting by NTV has impacted the national disaster management programs and strategies. The findings show that natural disasters are reported with a limited understanding of the important principles of disaster management and the PPRR cycle. Conclusion- This paper describes the current status of the coverage of natural disasters by Nepal Television and identifies the frames used in the news content. It contributes to determining the characteristics of effective media reporting of natural disasters in government-owned media outlets, and to engaging the communities and agencies involved in disasters. It suggests which frames are best suited for news making and how the media respond to the different phases of the disaster cycle.
Abstract:
This study seeks to understand the prevailing status of Nepalese media portrayal of natural disasters and to develop a disaster management framework to improve the effectiveness and efficiency of news production through the continuum of the prevention, preparedness, response and recovery (PPRR) phases of disaster management. The study is currently in progress and is being undertaken in three phases. In Phase 1, a qualitative content analysis is conducted. The news content is categorized into the frames proposed by framing theory and into pre-defined frames. The researcher has also looked at theories of the press, linking to social responsibility theory, as social responsibility is regarded as the major obligation of the media towards society. Thereafter, the content is categorized according to the PPRR cycle. In Phase 2, based on the findings of the content analysis, 12 in-depth interviews with journalists, disaster managers and community leaders are conducted. In Phase 3, based on the findings of the content analysis and the in-depth interviews, a framework for effective media management of disasters is developed using thematic analysis. As the study is still in progress, findings from the pilot study are presented here. The response phase of disasters is most commonly reported in Nepal. There is relatively low coverage of preparedness and prevention. Furthermore, the responsibility frame is the next most prevalent in the news after human interest. The economic consequences and conflict frames are also used in reporting, and vulnerability assessment has been used as an additional frame. The outcomes of this study are multifaceted. At the micro level, people will benefit, as effective dissemination of information in news and other media will enable a reduction in the loss of human lives and property. They will be 'well prepared for', 'able to prevent', 'respond to' and 'recover from' natural disasters. At the meso level, the media industry will benefit from its own 'disaster management model of news production' as an effective disaster-reporting tool that will improve the media's editorial judgement and priorities. At the macro level, the study will assist government and other agencies in developing appropriate policies and strategies for better management of natural disasters.
Abstract:
Tight fusion frames which form optimal packings in Grassmannian manifolds are of interest in signal processing and communication applications. In this paper, we study optimal packings and fusion frames having a specific structure for use in block sparse recovery problems. The paper starts with a sufficient condition for a set of subspaces to be an optimal packing. Further, a method of using optimal Grassmannian frames to construct tight fusion frames which form optimal packings is given. Then, we derive a lower bound on the block coherence of dictionaries used in block sparse recovery. From this result, we conclude that the Grassmannian fusion frames considered in this paper are optimal from the block coherence point of view.
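Block coherence, the quantity bounded in the abstract above, can be evaluated directly from a block-partitioned dictionary. The sketch below uses the standard definition (largest scaled spectral norm of the cross-Gram blocks); the random blocks are an illustrative assumption, not the paper's Grassmannian construction.

```python
# A minimal sketch: block coherence of a dictionary whose columns are grouped
# into equal-size blocks with orthonormal columns. Sizes and the random
# construction are illustrative assumptions.
import numpy as np

def block_coherence(blocks):
    """mu_B = max over distinct blocks of (1/d) * spectral norm of D_i^T D_j,
    where d is the number of columns per block."""
    d = blocks[0].shape[1]
    mu = 0.0
    for i in range(len(blocks)):
        for j in range(len(blocks)):
            if i == j:
                continue
            spec = np.linalg.norm(blocks[i].T @ blocks[j], 2)  # largest singular value
            mu = max(mu, spec / d)
    return mu

rng = np.random.default_rng(0)
n, d, k = 8, 2, 4                          # ambient dimension, block size, number of blocks
blocks = []
for _ in range(k):
    q, _ = np.linalg.qr(rng.standard_normal((n, d)))  # orthonormal columns per block
    blocks.append(q)
print("block coherence:", block_coherence(blocks))
```

Lower block coherence of the dictionary translates into better recovery guarantees for block-sparse signals, which is why a lower bound on this quantity identifies the constructions in the paper as optimal.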
Abstract:
Introduction: The National Oceanic and Atmospheric Administration’s Biogeography Branch has conducted surveys of reef fish in the Caribbean since 1999. Surveys were initially undertaken to identify essential fish habitat, but later were used to characterize and monitor reef fish populations and benthic communities over time. The Branch’s goals are to develop knowledge and products on the distribution and ecology of living marine resources and to provide resource managers, scientists and the public with an improved ecosystem basis for making decisions. The Biogeography Branch monitors reef fishes and benthic communities in three study areas: (1) St. John, USVI, (2) Buck Island, St. Croix, USVI, and (3) La Parguera, Puerto Rico. In addition, the Branch has characterized the reef fish and benthic communities in the Flower Garden Banks National Marine Sanctuary, Gray’s Reef National Marine Sanctuary and around the island of Vieques, Puerto Rico. Reef fish data are collected using a stratified random sampling design and stringent measurement protocols. Over time, the sampling design has changed in order to meet different management objectives (i.e. identification of essential fish habitat vs. monitoring), but the designs have always remained:
• Probabilistic – to allow inferences to a larger targeted population,
• Objective – to satisfy management objectives, and
• Stratified – to reduce sampling costs and obtain population estimates for strata.
There are two aspects of the sampling design which are now under consideration and are the focus of this report: first, the application of a sample frame, identified as a set of points or grid elements from which a sample is selected; and second, the application of subsampling in a two-stage sampling design. To evaluate these considerations, the pros and cons of implementing a sampling frame and subsampling are discussed. Particular attention is paid to the impacts of each design on accuracy (bias), feasibility and sampling cost (precision). Further, this report presents an analysis of data to determine the optimal number of subsamples to collect if subsampling were used.
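The two design choices under consideration, drawing primary units from a gridded sample frame by stratum and then subsampling within each selected unit, can be sketched as follows; the strata, grid size and sample sizes are hypothetical placeholders, not the Branch's actual survey parameters.

```python
# A minimal sketch of a two-stage stratified design: stage 1 draws grid cells
# from a sample frame within each stratum; stage 2 subsamples survey points
# inside each selected cell. All labels and counts are hypothetical.
import random

random.seed(1)

# Sample frame: grid cells labelled with a (hypothetical) habitat stratum.
frame = [{"cell": i, "stratum": random.choice(["hard_bottom", "soft_bottom"])}
         for i in range(500)]

def stratified_sample(frame, n_per_stratum):
    """Stage 1: simple random sample of grid cells within each stratum."""
    by_stratum = {}
    for cell in frame:
        by_stratum.setdefault(cell["stratum"], []).append(cell)
    return {s: random.sample(cells, n_per_stratum[s]) for s, cells in by_stratum.items()}

def subsample(n_points, points_per_cell=4):
    """Stage 2: pick survey points (secondary units) inside a selected cell."""
    return random.sample(range(points_per_cell), n_points)

stage1 = stratified_sample(frame, {"hard_bottom": 10, "soft_bottom": 5})
stage2 = {c["cell"]: subsample(2) for cells in stage1.values() for c in cells}
print({s: len(cells) for s, cells in stage1.items()}, "cells selected;",
      "first subsamples:", list(stage2.items())[:3])
```

In this framing, the report's question about the "optimal number of subsamples" amounts to choosing the stage-2 sample size that balances within-cell precision against the cost of visiting additional primary units.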
Abstract:
The study of codes, classically motivated by the need to communicate information reliably in the presence of error, has found new life in fields as diverse as network communication and distributed storage of data, and even has connections to the design of linear measurements used in compressive sensing. But in all contexts, a code typically involves exploiting the algebraic or geometric structure underlying an application. In this thesis, we examine several problems in coding theory and try to gain some insight into the algebraic structure behind them.
The first is the study of the entropy region - the space of all possible vectors of joint entropies which can arise from a set of discrete random variables. Understanding this region is essentially the key to optimizing network codes for a given network. To this end, we employ a group-theoretic method of constructing random variables producing so-called "group-characterizable" entropy vectors, which are capable of approximating any point in the entropy region. We show how small groups can be used to produce entropy vectors which violate the Ingleton inequality, a fundamental bound on entropy vectors arising from the random variables involved in linear network codes. We discuss the suitability of these groups to design codes for networks which could potentially outperform linear coding.
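For concreteness, the Ingleton inequality mentioned above can be checked numerically for any four discrete random variables once their joint distribution is known. The sketch below uses the entropy form of the inequality; the joint distribution is an arbitrary illustrative example, not a group-characterizable construction from the thesis.

```python
# A minimal sketch: evaluate the Ingleton inequality for four discrete random
# variables given their joint probability mass function.
from math import log2

def joint_entropy(pmf, idx):
    """Entropy of the marginal of the variables whose positions are listed in idx."""
    marginal = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in idx)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

def ingleton_gap(pmf):
    """Ingleton: H1 + H2 + H34 + H123 + H124 <= H12 + H13 + H14 + H23 + H24.
    Returns RHS - LHS; a negative value means the entropy vector violates Ingleton."""
    H = lambda *idx: joint_entropy(pmf, idx)
    lhs = H(0) + H(1) + H(2, 3) + H(0, 1, 2) + H(0, 1, 3)
    rhs = H(0, 1) + H(0, 2) + H(0, 3) + H(1, 2) + H(1, 3)
    return rhs - lhs

# Illustrative joint pmf over four binary variables (uniform on a small support).
support = [(0, 0, 0, 0), (0, 1, 1, 0), (1, 0, 1, 1), (1, 1, 0, 1)]
pmf = {outcome: 1 / len(support) for outcome in support}
print("Ingleton gap (>= 0 means satisfied):", ingleton_gap(pmf))
```

Entropy vectors arising from linear network codes always make this gap non-negative, so distributions (or groups) that drive it negative are exactly the candidates for outperforming linear coding.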
The second topic we discuss is the design of frames with low coherence, closely related to finding spherical codes in which the codewords are unit vectors spaced out around the unit sphere so as to minimize the magnitudes of their mutual inner products. We show how to build frames by selecting a cleverly chosen set of representations of a finite group to produce a "group code" as described by Slepian decades ago. We go on to reinterpret our method as selecting a subset of rows of a group Fourier matrix, allowing us to study and bound our frames' coherences using character theory. We discuss the usefulness of our frames in sparse signal recovery using linear measurements.
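The row-selection idea can be illustrated concretely: take a subset of rows of the discrete Fourier matrix, treat the resulting columns as a unit-norm frame, and measure its coherence. The particular row subset below is an arbitrary illustrative choice, not the character-theoretic selection developed in the thesis.

```python
# A minimal sketch: a frame built from selected rows of the N x N DFT matrix,
# with its coherence (largest off-diagonal Gram entry in magnitude).
import numpy as np

def coherence(F):
    """Largest |<f_i, f_j>| over distinct unit-norm columns of F."""
    G = np.abs(F.conj().T @ F)
    np.fill_diagonal(G, 0.0)
    return G.max()

N = 7
rows = [1, 2, 4]                               # illustrative subset of rows
dft = np.fft.fft(np.eye(N)) / np.sqrt(N)       # N x N unitary DFT matrix
F = dft[rows, :]                               # m x N partial Fourier matrix
F = F / np.linalg.norm(F, axis=0)              # normalize columns to unit norm
print("frame shape:", F.shape, " coherence:", round(float(coherence(F)), 4))
```

Low coherence of such a frame is exactly what makes it useful as a measurement matrix in the sparse recovery setting mentioned at the end of the paragraph.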
The final problem we investigate is that of coding with constraints, most recently motivated by the demand for ways to encode large amounts of data using error-correcting codes so that any small loss can be recovered from a small set of surviving data. Most often, this involves using a systematic linear error-correcting code in which each parity symbol is constrained to be a function of some subset of the message symbols. We derive bounds on the minimum distance of such a code based on its constraints, and characterize when these bounds can be achieved using subcodes of Reed-Solomon codes.
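A small worked instance of the constrained setting, assumed purely for illustration: a systematic binary code whose parity bits each depend only on a stated subset of message bits, with the minimum distance found by brute force over all codewords.

```python
# A minimal sketch: a systematic code with constrained parity symbols and its
# minimum distance. The constraint sets below are illustrative assumptions.
from itertools import product

k = 3
parity_supports = [{0, 1}, {1, 2}, {0, 2}]     # parity i may only use these message bits

def encode(msg):
    parities = [sum(msg[j] for j in S) % 2 for S in parity_supports]
    return tuple(msg) + tuple(parities)

codewords = [encode(msg) for msg in product([0, 1], repeat=k)]
d_min = min(sum(c) for c in codewords if any(c))   # min weight = min distance for a linear code
print("codewords:", len(codewords), " minimum distance:", d_min)
```

Bounds of the kind derived in the thesis relate the achievable minimum distance of such a code to the structure of the constraint sets, and characterize when Reed-Solomon subcodes meet them.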
Abstract:
Using Azoulay's frame of the civil gaze, this chapter examines selected Second World War images, catalogued under 'interpreter' in the IWM's photographic archive, looking at representative situations in which interpreters typically operate in wartime - communicating with clandestine forces, liaising between the army and civilians, and dealing with the aftermath of war.
Abstract:
In late 1994 and early 1995, Ebola (EBO) virus dramatically reemerged in Africa, causing human disease in the Ivory Coast and Zaire. Analysis of the entire glycoprotein genes of these viruses and those of other EBO virus subtypes has shown that the virion glycoprotein (130 kDa) is encoded in two reading frames, which are linked by transcriptional editing. This editing results in the addition of an extra nontemplated adenosine within a run of seven adenosines near the middle of the coding region. The primary gene product is a smaller (50-70 kDa), nonstructural, secreted glycoprotein, which is produced in large amounts and has an unknown function. Phylogenetic analysis indicates that EBO virus subtypes are genetically diverse and that the recent Ivory Coast isolate represents a new (fourth) subtype of EBO virus. In contrast, the EBO virus isolate from the 1995 outbreak in Kikwit, Zaire, is virtually identical to the virus that caused a similar epidemic in Yambuku, Zaire, almost 20 years earlier. This genetic stability may indicate that EBO viruses have coevolved with their natural reservoirs and do not change appreciably in the wild.
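The editing mechanism can be illustrated with a toy sequence (not the real EBO virus gene): inserting one extra adenosine within a run of adenosines shifts the downstream reading frame, so translation continues in a different frame after the editing site.

```python
# A toy sketch of transcriptional editing: one extra adenosine added within a
# run of adenosines shifts the downstream codons by one position. The sequence
# is illustrative only, not the actual viral glycoprotein gene.
def codons(seq):
    return [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]

gene   = "ATGGCTAAAAAAACTGGTTGA"          # toy gene containing a run of seven adenosines
edited = gene[:6] + "A" + gene[6:]        # polymerase "stutter" adds one extra A in the run

print("unedited frame:", codons(gene))
print("edited frame:  ", codons(edited))  # codons after the A-run are shifted by one
```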
Abstract:
In the context of Aboriginal-Anglo Australian relations, we tested the effect of framing (multiculturalism versus separatism) and majority group members' social values (universalism) on the persuasiveness of Aboriginal group rhetoric, majority collective guilt, attitudes toward compensation, and reparations for Aboriginals. As predicted, Anglo Australians who are low on universalism report more collective guilt when presented with a multiculturalist than a separatist Aboriginal frame, whereas those high on universalism report high levels of guilt independent of frame. The same pattern was predicted and found for the persuasiveness of the rhetoric and attitudes toward compensation. Our data suggest that (a) for individuals low in universalism, framing produces attitudes consonant with compensation because it produces collective guilt and (b) the reason that universalists are more in favor of compensation and reparation is because of high collective guilt. We discuss the strategic use of language to create power through the manipulation of collective guilt in political contexts.
Abstract:
A finite-strain solid–shell element is proposed. It is based on least-squares in-plane assumed strains and assumed natural transverse shear and normal strains. The singular value decomposition (SVD) is used to define local (integration-point) orthogonal frames-of-reference solely from the Jacobian matrix. The complete finite-strain formulation is derived and tested. Assumed strains obtained from least-squares fitting are an alternative to enhanced-assumed-strain (EAS) formulations and, in contrast with these, result in an element that satisfies the patch test. There are no additional degrees-of-freedom, as is the case with enhanced-assumed-strain formulations, not even by means of static condensation. Least-squares fitting produces invariant finite-strain elements which are shear-locking free and amenable to incorporation in large-scale codes. With that goal, we use automatically generated code produced by AceGen and Mathematica. All benchmarks show excellent results, similar to the best available shell and hybrid solid elements, with significantly lower computational cost.
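A minimal sketch of the frame construction described above, assuming the rotation factor of the SVD is taken as the integration-point frame (the paper's exact convention may differ): an orthogonal local frame-of-reference is obtained purely from a 3x3 Jacobian matrix.

```python
# A minimal sketch: build a local orthogonal frame at an integration point from
# the 3x3 Jacobian via its SVD. The sample Jacobian is an arbitrary value.
import numpy as np

def local_frame_from_jacobian(J):
    """With J = U * S * V^T, use the rotation U (sign-fixed so det = +1) as the
    integration-point frame; its columns are the local orthonormal axes."""
    U, S, Vt = np.linalg.svd(J)
    if np.linalg.det(U) < 0:            # enforce a right-handed frame
        U[:, -1] *= -1.0
    return U

J = np.array([[1.2, 0.1, 0.0],
              [0.0, 0.9, 0.2],
              [0.1, 0.0, 1.1]])
R = local_frame_from_jacobian(J)
print(np.round(R.T @ R, 10))            # identity: the frame axes are orthonormal
```

Because the frame is built only from the Jacobian, it is available at every integration point without extra element data, which is what makes the approach convenient for automatically generated code.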