917 results for Digital information environment
Abstract:
This study investigated two relationships of interest: the relationship between corporate power and financial analyst coverage in the Brazilian stock market, and the relationship between power and information asymmetry in this market, over the period from 2000 to 2010. The objective of this research was to verify whether corporate power increases information asymmetry, arising from the agency costs involved and the possibility of value expropriation (Jensen & Meckling, 1976), or decreases it, since management that does not feel vulnerable to dismissal or to possible constraints on its actions may choose not to withhold information from stakeholders (Bertrand & Mullainathan, 2003). Still with regard to the information environment affected by corporate power, we sought to verify whether financial analysts follow companies with greater information asymmetry, thereby fulfilling their role of monitoring corporate management (Healy & Palepu, 2001), or with lower asymmetry, given the costs involved in obtaining private information (Frankel, Kothari & Weber, 2006). Using proxies constructed through factor analysis to capture the specific features of corporate power and information asymmetry in the Brazilian corporate environment, we found a negative relationship between financial analyst coverage and corporate power and a positive relationship between asymmetry and corporate power. Under the hypotheses outlined by Jiraporn, Liu and Kim (2012), which cover all possible relationships among asymmetry, corporate power and analyst coverage, the results fit the Opacity Hypothesis.
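A minimal sketch of the two-step procedure the abstract describes, assuming a hypothetical firm-year panel with invented column names for the power and asymmetry indicators; it illustrates proxy construction via factor analysis followed by simple regressions, not the study's actual specification or data:

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

# Hypothetical firm-year panel; file name and column names are illustrative only.
df = pd.read_csv("firms_2000_2010.csv")
power_vars = df[["ceo_duality", "board_independence", "ownership_concentration"]]
asym_vars = df[["bid_ask_spread", "price_volatility", "trading_volume"]]

# Step 1: collapse each set of indicators into a single factor-score proxy.
df["power_proxy"] = FactorAnalysis(n_components=1, random_state=0).fit_transform(power_vars)[:, 0]
df["asym_proxy"] = FactorAnalysis(n_components=1, random_state=0).fit_transform(asym_vars)[:, 0]

# Step 2: relate analyst coverage and asymmetry to the power proxy.
coverage_model = sm.OLS(df["analyst_coverage"], sm.add_constant(df[["power_proxy"]])).fit()
asymmetry_model = sm.OLS(df["asym_proxy"], sm.add_constant(df[["power_proxy"]])).fit()
print(coverage_model.params)   # Opacity Hypothesis: negative coefficient expected
print(asymmetry_model.params)  # positive coefficient expected
```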
Abstract:
The Spanish media ecosystem is being transformed at a dizzying pace, much like national politics and with many shared features such as citizen participation. After several years of experimentation with content, organisation and, above all, the business models of new media outlets, projects of great interest are consolidating in Spain. One of them is eldiario.es, a digital-native outlet led by journalist Ignacio Escolar that has placed itself among the most influential media in the country and among the most visited on the Internet, thanks to quality journalism specialising in national political content with a social slant. This article analyses the evolution of digital-native media in Spain over the last five years, focusing on this case because of the originality of its business model and the impact of its political news coverage.
Abstract:
Background. Websites have the potential to deliver enhanced versions of targeted and tailored physical activity programs to large numbers of participants. We describe participant engagement and retention with a stage-based physical activity website in a workplace setting. Methods. We analyzed data from participants in the website condition of a randomized trial designed to test the efficacy of a print- vs. website-delivered intervention. They received four stage-targeted e-mails over 8 weeks, with hyperlinks to the website. Both objective and self-reported website use data were collected and analyzed. Results. Overall, 327 were randomized to the website condition and 250 (76%) completed the follow-up survey. Forty-six percent (n = 152) visited the website over the trial period. A total of 4,114 hits to the website were recorded. Participants who entered the site spent on average 9 min per visit and viewed 18 pages. Website use declined over time; 77% of all visits followed the first e-mail. Conclusions. Limited website engagement, despite the perceived usefulness of the materials, demonstrates possible constraints on the use of e-mails and websites in delivering health behavior change programs. In the often-cluttered information environment of workplaces, issues of engagement and retention in website-delivered programs require attention. (C) 2004 The Institute For Cancer Prevention and Elsevier Inc. All rights reserved.
Abstract:
Australia's media policy agenda has recently been dominated by debate over two key issues: media ownership reform, and the local content provisions of the Australia–United States Free Trade Agreement. Challenging the tendency to analyse these issues separately, the article considers them as interlinked indicators of fundamental shifts occurring in the digital media environment. Converged media corporations increasingly seek to achieve economies of scale through ‘content streaming’: multi-purposing proprietary content across numerous digitally enabled platforms. This has resulted in rivalries for control of delivery technologies (as witnessed in media ownership debates) as well as over market access for corporate content (in the case of local content debates). The article contextualises Australia’s contemporary media policy flashpoints within international developments and longer-term industry strategising. It further questions the power of media policy as it is currently conceived to deal adequately with the challenges raised by a converging digital media marketplace.
Abstract:
A vision of the future of intraoperative monitoring for anesthesia is presented: a multimodal world based on advanced sensing capabilities. I explore progress towards this vision, outlining the general nature of the anesthetist's monitoring task and the dangers of attentional capture. Research in attention indicates different kinds of attentional control, such as endogenous and exogenous orienting, which are critical to how awareness of patient state is maintained, but which may work differently across different modalities. Four kinds of medical monitoring displays are surveyed: (1) integrated visual displays, (2) head-mounted displays, (3) advanced auditory displays and (4) auditory alarms. Achievements and challenges in each area are outlined. In future research, we should focus more clearly on identifying anesthetists' information needs and we should develop models of attention in different modalities and across different modalities that are more capable of guiding design. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
The Securities and Exchange Commission (SEC) in the United States mandated a new digital reporting system for US companies in late 2008. The new generation of information provision has been dubbed by Chairman Cox, ‘interactive data’ (SEC, 2006a). Despite the promise of its name, we find that in the development of the project retail investors are invoked as calculative actors rather than engaged in dialogue. Similarly, the potential for the underlying technology to be applied in ways to encourage new forms of accountability appears to be forfeited in the interests of enrolling company filers. We theorise the activities of the SEC and in particular its chairman at the time, Christopher Cox, over a three-year period, both prior to and following the ‘credit crisis’. We argue that individuals and institutions play a central role in advancing the socio-technical project that is constituted by interactive data. We adopt insights from ANT (Callon, 1986; Latour, 1987, 2005b) and governmentality (Miller, 2008; Miller and Rose, 2008) to show how regulators and the proponents of the technology have acted as spokespersons for the interactive data technology and the retail investor. We examine the way in which calculative accountability has been privileged in the SEC’s construction of the retail investor as concerned with atomised, quantitative data (Kamuf, 2007; Roberts, 2009; Tsoukas, 1997). We find that the possibilities for the democratising effects of digital information on the Internet have not been realised in the interactive data project and that it contains risks for the very investors the SEC claims to seek to protect.
Abstract:
In the contemporary business environment, the need to adhere to customer requirements has caused a shift from mass production to mass customization, which requires the supply chain (SC) to be both effective and flexible. The purpose of this paper is to seek flexibility through the adoption of family-based dispatching rules under the influence of the inventory system implemented at downstream echelons of an industrial supply chain network. We compared the family-based dispatching rules in the existing literature with respect to the inventory system and information sharing within a supply chain network. The dispatching rules are compared on Average Flow Time performance, averaged over three product families, and performance is measured through an extensive discrete event simulation process. Given the various inventory-related operational factors at downstream echelons, the present paper highlights the importance of strategically adopting an appropriate family-based dispatching rule at the manufacturing end. In a mass customization environment, it becomes imperative to adopt the family-based dispatching rule from a system-wide SC perspective, which warrants intra- as well as inter-echelon information coordination. The holonic paradigm emerges in this research stream, combining the holistic and the systemic approaches. The novelty of the present research is threefold. First, it gives managers leverage to strategically adopt a dispatching rule from the inventory system perspective. Second, the findings provide direction for attenuating the adverse impact of demand amplification (the bullwhip effect) on inventory levels by appropriately adopting a family-based dispatching rule. Third, the information environment is conceptualized under the paradigm of Koestler's holonic theory.
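As a rough illustration of the kind of comparison the abstract describes, the sketch below simulates a single machine serving jobs from several product families, where switching families incurs a setup time, and contrasts average flow time under plain FIFO dispatching and under a family-based rule that keeps serving the family currently set up. All parameters and the job stream are invented for illustration; the paper's own simulation model of a multi-echelon supply chain is far more detailed.

```python
import random

random.seed(1)

# Hypothetical job stream: (arrival_time, family, processing_time)
jobs = [(t, random.choice("ABC"), random.uniform(1.0, 3.0)) for t in range(0, 200, 2)]
SETUP = 4.0  # setup time incurred whenever the machine switches family


def average_flow_time(jobs, family_based):
    queue, clock, current_family, total_flow = [], 0.0, None, 0.0
    pending = sorted(jobs)
    while pending or queue:
        # Move jobs that have arrived by now into the queue.
        while pending and pending[0][0] <= clock:
            queue.append(pending.pop(0))
        if not queue:
            clock = pending[0][0]  # idle until the next arrival
            continue
        if family_based:
            # Prefer a waiting job of the family already set up, if any.
            same = [j for j in queue if j[1] == current_family]
            job = same[0] if same else queue[0]
        else:
            job = queue[0]  # plain FIFO
        queue.remove(job)
        arrival, family, proc = job
        if family != current_family:
            clock += SETUP
            current_family = family
        clock += proc
        total_flow += clock - arrival
    return total_flow / len(jobs)


print("FIFO         :", round(average_flow_time(jobs, family_based=False), 2))
print("Family-based :", round(average_flow_time(jobs, family_based=True), 2))
```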
Abstract:
Indicators are widely used by organizations as a way of evaluating, measuring and classifying organizational performance. As part of performance evaluation systems, indicators are often shared or compared across internal sectors or with other organizations. However, indicators can be vague and imprecise, and can also lack semantics, making comparisons with other indicators difficult. Thus, this paper presents a knowledge model based on an ontology that may be used to represent indicators semantically and generically, dealing with imprecision and vagueness and thereby facilitating better comparison. Semantic technologies are shown to be suitable for this solution, as they can represent the complex data involved in comparing indicators.
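A minimal sketch of how an indicator might be described semantically with RDF, using rdflib and an invented example.org vocabulary; it only hints at the paper's ontology, which additionally has to handle imprecision and vagueness.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS, XSD

EX = Namespace("http://example.org/indicators#")  # hypothetical vocabulary
g = Graph()
g.bind("ex", EX)

# Describe one performance indicator with explicit semantics:
# what it measures, its unit, its scale and the organizational unit it applies to.
ind = EX.AverageResolutionTime
g.add((ind, RDF.type, EX.PerformanceIndicator))
g.add((ind, RDFS.label, Literal("Average ticket resolution time", lang="en")))
g.add((ind, EX.measures, EX.CustomerSupportProcess))
g.add((ind, EX.hasUnit, EX.Hour))
g.add((ind, EX.hasScale, EX.RatioScale))
g.add((ind, EX.appliesTo, EX.ServiceDeskSector))
g.add((ind, EX.targetValue, Literal(24.0, datatype=XSD.decimal)))

# Two indicators published this way can be compared on shared properties
# (unit, scale, measured process) rather than on their labels alone.
print(g.serialize(format="turtle"))
```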
Abstract:
This thesis addressed the problem of risk analysis in mental healthcare, with respect to the GRiST project at Aston University. That project provides a risk-screening tool based on the knowledge of 46 experts, captured as mind maps that describe relationships between risks and patterns of behavioural cues. Mind mapping, though, fails to impose control over content, and is not considered to formally represent knowledge. In contrast, this thesis treated GRiST's mind maps as a rich knowledge base in need of refinement; that process drew on existing techniques for designing databases and knowledge bases. Identifying well-defined mind map concepts, though, was hindered by spelling mistakes, and by ambiguity and lack of coverage in the tools used for researching words. A novel use of the Edit Distance overcame those problems, by assessing similarities between mind map texts, and between spelling mistakes and suggested corrections. That algorithm further identified stems, the shortest text string found in related word-forms. As opposed to existing approaches' reliance on built-in linguistic knowledge, this thesis devised a novel, more flexible text-based technique. An additional tool, Correspondence Analysis, found patterns in word usage that allowed machines to determine likely intended meanings for ambiguous words. Correspondence Analysis further produced clusters of related concepts, which in turn drove the automatic generation of novel mind maps. Such maps underpinned adjuncts to the mind mapping software used by GRiST; one such new facility generated novel mind maps to reflect the collected expert knowledge on any specified concept. Mind maps from GRiST are stored as XML, which suggested storing them in an XML database. In fact, the entire approach here is "XML-centric", in that all stages rely on XML as far as possible. An XML-based query language allows users to retrieve information from the mind map knowledge base. The approach, it was concluded, will prove valuable to mind mapping in general, and to detecting patterns in any type of digital information.
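The Edit Distance step can be illustrated with a short sketch: a standard Levenshtein implementation used to match a misspelled mind map term against a small vocabulary. This is a generic illustration with an invented vocabulary, not the thesis's actual algorithm, which also covers stem identification.

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]


def suggest(word: str, vocabulary: list[str]) -> str:
    """Return the vocabulary entry closest to a possibly misspelled word."""
    return min(vocabulary, key=lambda v: edit_distance(word.lower(), v))


vocab = ["suicide", "self-harm", "anxiety", "depression", "hopelessness"]
print(suggest("depresion", vocab))          # -> depression
print(edit_distance("anxiety", "anxeity"))  # -> 2
```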
Abstract:
Laura Kurgan’s Monochrome Landscapes (2004), first exhibited in the Whitney Museum of American Art in New York City, consists of four oblong Cibachrome prints derived from digital files sourced from the commercial Ikonos and QuickBird satellites. The prints are ostensibly flat, depthless fields of white, green, blue, and yellow, yet the captions provided explain that the sites represented are related to contested military, industrial, and cartographic practices. In Kurgan’s account of Monochrome Landscapes she explains that it is in dialogue with another work from the Whitney by abstract artist Ellsworth Kelly. This article pursues the relationship between formalist abstraction and satellite imaging in order to demonstrate how formalist strategies aimed at producing an immediate retinal response are bound up with contemporary uses of digital information and the truth claims such information can be made to substantiate.
Abstract:
Scholars around the world are focusing on the study of the smart city phenomenon. Spanish scholarly output on this topic has grown exponentially in recent years. The new smart cities are founded on new visions of urban development that integrate multiple technological solutions tied to the world of information and communication, all of them up to date and at the service of the city's needs. The Spanish-language literature on this topic comes from fields as diverse as Architecture, Engineering, Political Science, Law and Business Studies. The purpose of smart cities is to improve the lives of their citizens through the implementation of information and communication technologies that meet the needs of their inhabitants, so researchers in the field of Communication and Information Sciences have much to contribute. This paper analyses a total of 120 texts and concludes that the smart city phenomenon will be one of the central axes of multidisciplinary research in Spain in the coming years.
Abstract:
The Mobile Information Literacy curriculum is a growing collection of training materials designed to build literacies for the millions of people worldwide coming online every month via a mobile phone. Most information and digital literacy curricula were designed for a PC age, and public and private organizations around the world have used these curricula to help newcomers use computers and the internet effectively and safely. The better curricula address not only skills, but also concepts and attitudes. The central question for this project is: what are the relevant skills, concepts, and attitudes for people using mobiles, not PCs, to access the internet? As part of the Information Strategies for Societies in Transition project, we developed a six-module curriculum for mobile-first users. The project is situated in Myanmar, a country undergoing massive political, economic, and social changes, and where mobile penetration is expected to reach 80% by the end of 2015 from just 4% in 2014. Combined with the country’s history of media censorship, Myanmar presents unique challenges for addressing the needs of people who need the ability to find and evaluate the quality and credibility of information obtained online, understand how to create and share online information effectively, and participate safely and securely.
Abstract:
Electric power systems are becoming more complex and covering larger areas every day. This fact has contributed to the development of monitoring techniques that aim to support the analysis, control and planning of power systems, such as Supervisory Control and Data Acquisition (SCADA) systems, Wide Area Measurement Systems (WAMS) and disturbance recording systems. Unlike SCADA and WAMS, disturbance recording systems are mainly used for offline analysis of occurrences in which a fault resulted in the tripping of an apparatus such as a transmission line, transformer or generator. The device responsible for recording the disturbances is called a Digital Fault Recorder (DFR); it records electrical quantities such as voltages and currents, as well as digital information from protection system devices. Generally, in power plants, all the DFR data are centralized in the utility's data centre, which results in an excess of data that makes the analysis task difficult for the specialist engineers. This dissertation presents a new methodology for the automated analysis of disturbances in power plants. A fuzzy reasoning system is proposed to deal with the data from the DFRs. The objective of the system is to help the engineer responsible for analysing the DFRs' information by means of a pre-classification of the data. To this end, the fuzzy system is responsible for generating unit operational state diagnoses and fault classifications.
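A toy sketch of the kind of fuzzy pre-classification the dissertation proposes, using triangular membership functions over two hypothetical DFR-derived features (voltage sag depth and event duration) and a few hand-written rules; the real system's rules and features come from the DFR records and expert knowledge.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)


def classify_event(sag_depth_pu, duration_s):
    """Fuzzy pre-classification of a recorded disturbance (illustrative only)."""
    # Fuzzify the inputs (membership grades in [0, 1]).
    shallow = tri(sag_depth_pu, -0.1, 0.0, 0.3)
    deep = tri(sag_depth_pu, 0.2, 0.6, 1.1)
    short = tri(duration_s, -0.05, 0.0, 0.3)
    long_ = tri(duration_s, 0.2, 1.0, 5.0)

    # Max-min rule evaluation: each rule's strength is the min of its antecedents.
    rules = {
        "normal operation":       min(shallow, short),
        "transient disturbance":  min(shallow, long_),
        "fault cleared by relay": min(deep, short),
        "severe fault / trip":    min(deep, long_),
    }
    return max(rules, key=rules.get), rules


label, grades = classify_event(sag_depth_pu=0.55, duration_s=0.15)
print(label, grades)
```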
Abstract:
Malware is a foundational component of cyber crime that enables an attacker to modify the normal operation of a computer or access sensitive digital information. Despite the extensive research performed to identify such programs, existing schemes fail to detect evasive malware, an increasingly popular class of malware that can alter its behavior at run-time, making it difficult to detect using today’s state of the art malware analysis systems. In this thesis, we present DVasion, a comprehensive strategy that exposes such evasive behavior through a multi-execution technique. DVasion successfully detects behavior that would have been missed by traditional, single-execution approaches, while addressing the limitations of previously proposed multi-execution systems. We demonstrate the accuracy of our system through strong parallels with existing work on evasive malware, as well as uncover the hidden behavior within 167 of 1,000 samples.
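The core idea behind a multi-execution approach can be sketched in a few lines: run the same sample under two differently configured analysis environments, collect behaviour traces, and flag the sample as potentially evasive when the traces diverge. The traces and the divergence threshold below are invented placeholders; DVasion itself instruments and compares real executions.

```python
def is_potentially_evasive(trace_env_a, trace_env_b, threshold=0.2):
    """Flag a sample whose behaviour differs markedly between two executions.

    trace_env_a / trace_env_b: sequences of observed behaviour events
    (e.g. API or system calls) recorded in two analysis environments.
    """
    a, b = set(trace_env_a), set(trace_env_b)
    if not a and not b:
        return False
    divergence = len(a ^ b) / len(a | b)  # Jaccard distance between the traces
    return divergence > threshold


# Hypothetical traces: the sample hides its payload when it detects environment A.
trace_a = ["CreateFile", "RegQueryValue", "Sleep", "ExitProcess"]
trace_b = ["CreateFile", "RegQueryValue", "Sleep", "InternetOpen", "WriteFile", "CreateProcess"]

print(is_potentially_evasive(trace_a, trace_b))  # True -> inspect the divergent behaviour
```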