821 results for Truth and value
Abstract:
The questions of whether science pursues truth as correspondence to reality, and whether science in fact progresses towards attaining a truthful understanding of physical reality, are fundamental and contested in the philosophy of science. On one side of the debate stands Popper, who argues that science is objective, necessarily assumes a correspondence theory of truth, and inevitably progresses toward truth as physical theories develop, gaining a more truthful understanding of reality through progressively more sophisticated empirical analysis. Conversely, Kuhn, influenced by postmodern philosophy, argues that ultimate truth cannot be attained, since no objective metaphysical reality exists that could be known, and that consequently the notion of scientific objectivity and "progress" is a myth, marred by philosophical and ideological value judgments. Ultimately, Kuhn reduces so-called scientific progress through the adoption of successive paradigms to leaps of "faith". This paper seeks a reconciliation of the two extremes, arguing that Popper is correct in the sense that science assumes a correspondence theory of truth and may progress toward truth as physical theories develop, while simultaneously acknowledging with Kuhn that science is not purely objective and free of value judgments. The notion of faith is also critical, for it was the acknowledgement of God's existence as the creator and instituter of observable natural laws that allowed the development of science and the scientific method in the first place. Therefore, accepting and synthesising the contentions that science is to some extent founded on faith, assumes and progresses toward truth, and is subject to value judgments is necessary for the progress of science.
Abstract:
This exegesis examines how a writer can effectively negotiate the relationship between author, character, fact and truth, in a work of Creative Nonfiction. It was found that individual truths, in a work of Creative Nonfiction, are not necessarily universal truths due to individual, cultural, historical and religious circumstances. What was also identified, through the examination of published Creative Nonfiction, is a necessity to ensure there are clear demarcation lines between authorial truth and fiction. The Creative Nonfiction works examined, which established this framework for the reader, ensured an ethical relationship between author and audience. These strategies and frameworks were then applied to my own Creative Nonfiction.
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users broader rights, such as free use, copying, modification and distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the systems of governance that underlie such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007); collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008); issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006); public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006); technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007); and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users. On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications differ for each of these categories: the activities of providers of packaged solutions and of IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. To date, the literature identifies and depicts only two generic types of business models for open source software publishers: the business model of "bundling" (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore the contexts in which the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context.
To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager. It aims to depict the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of "mutualisation", which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the business model of bundling, the dual licensing business model and the business model of mutualisation) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes beyond the traditional concept of business model used by scholars in the open source-related literature. In this article, a business model is not considered merely as a way of generating income (a "revenue model" (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses business models from the point of view of these two components.
Abstract:
A pervasive and puzzling feature of banks' Value-at-Risk (VaR) is its abnormally high level, which leads to excessive regulatory capital. A possible explanation for the tendency of commercial banks to overstate their VaR is that they incompletely account for the diversification effect among broad risk categories (e.g., equity, interest rate, commodity, credit spread, and foreign exchange). By underestimating the diversification effect, banks' proprietary VaR models produce overly prudent market risk assessments. In this paper, we examine empirically the validity of this hypothesis using actual VaR data from major US commercial banks. In contrast to the VaR diversification hypothesis, we find that US banks show no sign of systematic underestimation of the diversification effect. In particular, the diversification effects used by banks are very close to (and quite often larger than) our empirical diversification estimates. A direct implication of this finding is that individual VaRs for each broad risk category, just like aggregate VaRs, are biased risk assessments.
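To make the quantity at issue concrete, the sketch below computes a diversification effect in the way it is commonly defined, as the gap between the sum of standalone VaRs across broad risk categories and the aggregate VaR. The category names mirror those listed above, but all figures are hypothetical and are not taken from the banks studied in the paper.

```python
# Minimal sketch (not the paper's methodology): the diversification effect is
# commonly reported as the gap between the sum of standalone VaRs for each
# broad risk category and the aggregate (firm-wide) VaR.

# Hypothetical standalone VaRs by risk category, in millions of USD.
standalone_var = {
    "equity": 40.0,
    "interest_rate": 55.0,
    "commodity": 10.0,
    "credit_spread": 25.0,
    "foreign_exchange": 15.0,
}

aggregate_var = 95.0  # hypothetical firm-wide VaR reported by the bank

undiversified = sum(standalone_var.values())
diversification_effect = undiversified - aggregate_var          # absolute reduction
diversification_ratio = diversification_effect / undiversified  # relative reduction

print(f"Sum of standalone VaRs: {undiversified:.1f}")
print(f"Aggregate VaR:          {aggregate_var:.1f}")
print(f"Diversification effect: {diversification_effect:.1f} "
      f"({diversification_ratio:.0%} of the undiversified total)")
```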
Abstract:
This article sets out to interpret the construction of truth discourse in the War of Canudos through the classic 'Rebellion in the Backlands' (Os Sertões) by Euclides da Cunha. To enrich the research, the articles written by da Cunha while he was a war correspondent for the Estado de São Paulo newspaper are also analyzed. Throughout the text, the expression "truth effects", coined by the French philosopher Michel Foucault, is used; it refers to the idea that discourses are neither true nor false. In Os Sertões, the effects of truth emerge from strategic power disputes amongst the Church, landowners, politicians and a seaside ruling elite that ignores the reality of the poor and forsaken hinterlands. Keywords: discourse, power, truth.
Abstract:
The subject of this paper is the changes in the taxation of non-profit organisations that seem to be more or less inherent in value added taxes. The Australian federal Coalition's proposed goods and services tax will be part of the discussion.
Abstract:
How can we reach out to institutions, artists and audiences with sometimes radically different agendas to encourage them to see, participate in and support the development of new practices and programs in the performing arts? In this paper, based on a plenary panel at PSi#18 Performance Culture Industry at the University of Leeds, Clarissa Ruiz (Colombia), Anuradha Kapur (India) and Sheena Wrigley (England), together with interlocutor Bree Hadley (Australia), speak about their work as policy-makers, managers and producers in the performing arts in Europe, Asia and America over the past several decades. Acknowledged trailblazers in their fields, Ruiz, Kapur and Wrigley all have a commitment to creating vital, viable and sustainable performing arts ecologies. Each has extensive experience in performance, politics, and the challenging process of managing histories, visions, stakeholders and sometimes scarce resources to generate lasting benefits for the various communities they have worked for, with and within. Their work cultivating new initiatives, programs and policy has made them expert at brokering relationships in and between private, public and political spheres to elevate the status of, and support for, the performing arts as a socially and economically beneficial activity everyone can participate in. Each gives examples from their own practice to provide insight into how to negotiate the interests of artistic, government, corporate, community and education partners, and the interests of audiences, to create aesthetic, cultural and/or economic value. Together, their views offer a compelling set of perspectives on the changing meanings of the 'value of the arts' and the effects this has had for the artists that make, and the arts organisations that produce and present, work in a range of different regional, national and cross-national contexts.
Abstract:
In recent years, the imperative to communicate organisational impacts to a variety of stakeholders has gained increasing importance within all sectors. Despite growing external demands for evaluation and social impact measurement, there has been limited critically informed analysis about the presumed importance of these activities to organisational success and the practical challenges faced by organisations in undertaking such assessment. In this paper, we present the findings from an action research study of five Australian small to medium social enterprises’ practices and use of evaluation and social impact analysis. Our findings have implications for social enterprise operators, policy makers and social investors regarding when, why and at what level these activities contribute to organisational performance and the fulfilment of mission.
Abstract:
Membrane filtration technology has been proven to be a technically sound process to improve the quality of clarified cane juice and subsequently to increase the productivity of crystallisation and the quality of sugar production. However, commercial applications have been hindered because the benefits to crystallisation and sugar quality have not outweighed the increased processing costs associated with membrane applications. An 'Integrated Sugar Production Process (ISPP) Concept Model' is proposed to recover more value from the non-sucrose streams generated by membrane processing. Pilot-scale membrane fractionation trials confirmed the technical feasibility of separating high-molecular-weight, antioxidant and reducing-sugar fractions from cane juice in forms suitable for value recovery. It was also found that up to 40% of the potassium salts in the juice can be removed by membrane application while removing a similar amount of water, with potential energy savings in subsequent evaporation. Application of the ISPP would allow the sugar industry to co-produce multiple products and high-quality mill sugar while eliminating energy-intensive refining processes.
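As a rough, back-of-the-envelope illustration of why removing water at the membrane stage can save evaporator energy (not a calculation from the paper), the sketch below applies the latent heat of vaporisation of water to a hypothetical juice flow and water-removal fraction; in practice, multiple-effect evaporation reuses vapour and reduces the saving considerably.

```python
# Indicative sketch only: all process numbers below are hypothetical except the
# approximate latent heat of vaporisation of water (~2,260 kJ/kg at atmospheric pressure).

LATENT_HEAT_KJ_PER_KG = 2260.0   # approximate latent heat of vaporisation of water

juice_flow_t_per_h = 100.0       # hypothetical clarified-juice flow, tonnes/h
water_removed_fraction = 0.10    # hypothetical share of water taken out at the membrane stage

water_removed_kg_per_h = juice_flow_t_per_h * 1000.0 * water_removed_fraction

# Energy the evaporator station no longer has to supply for that water
# (single-effect basis; vapour reuse in multiple effects lowers the real saving).
energy_saved_gj_per_h = water_removed_kg_per_h * LATENT_HEAT_KJ_PER_KG / 1e6

print(f"Water removed by membrane: {water_removed_kg_per_h:,.0f} kg/h")
print(f"Indicative evaporation energy avoided: {energy_saved_gj_per_h:.1f} GJ/h")
```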
Abstract:
Objectives: 1. Measure the spatial and temporal trawl frequency of scallop grounds using VMS data; this will provide a relative measure of how often individual undersized scallops are caught and put through a tumbler. 2. Estimate discard mortality and growth rates for saucer scallops using cage experiments. 3. Evaluate the current management measures, in particular the seasonal closure, rotational closures and seasonally varying minimum legal sizes, using stock assessment and management models. Recommend an optimal range of management measures to ensure the long-term viability and value of the scallop fishery, based on a formal management strategy evaluation. Outcomes achieved to date: 1. Improved understanding of the survival rates of discarded sub-legal scallops; 2. Preliminary von Bertalanffy growth parameters estimated using data from tagged-and-released scallops; 3. Changing trends in vessels and fishing gear used in the Queensland scallop fishery, and their effect on scallop catch rates over time, quantified using standardised catch rates; 4. Increases in the fishing power of vessels operating in the Queensland scallop fishery quantified; 5. Trawl intensity mapped and quantified for all Scallop Replenishment Areas; 6. Harvest Strategy Evaluations completed.
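For reference, the von Bertalanffy growth model mentioned in outcome 2 is conventionally written in its standard form below; the project's preliminary parameter estimates themselves are not reproduced here.

\[
L(t) = L_{\infty}\left(1 - e^{-K\,(t - t_{0})}\right),
\]

where $L(t)$ is shell length at age $t$, $L_{\infty}$ the asymptotic length, $K$ the growth coefficient, and $t_{0}$ the theoretical age at zero length.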
Composition operators, Aleksandrov measures and value distribution of analytic maps in the unit disc
Abstract:
A composition operator is a linear operator that precomposes any given function with another function, which is held fixed and called the symbol of the composition operator. This dissertation studies such operators and questions related to their theory in the case when the functions to be composed are analytic in the unit disc of the complex plane. Thus the subject of the dissertation lies at the intersection of analytic function theory and operator theory. The work contains three research articles. The first article is concerned with the value distribution of analytic functions. In the literature there are two different conditions which characterize when a composition operator is compact on the Hardy spaces of the unit disc. One condition is in terms of the classical Nevanlinna counting function, defined inside the disc, and the other condition involves a family of certain measures called the Aleksandrov (or Clark) measures and supported on the boundary of the disc. The article explains the connection between these two approaches from a function-theoretic point of view. It is shown that the Aleksandrov measures can be interpreted as kinds of boundary limits of the Nevanlinna counting function as one approaches the boundary from within the disc. The other two articles investigate the compactness properties of the difference of two composition operators, which is beneficial for understanding the structure of the set of all composition operators. The second article considers this question on the Hardy and related spaces of the disc, and employs Aleksandrov measures as its main tool. The results obtained generalize those existing for the case of a single composition operator. However, there are some peculiarities which do not occur in the theory of a single operator. The third article studies the compactness of the difference operator on the Bloch and Lipschitz spaces, improving and extending results given in the previous literature. Moreover, in this connection one obtains a general result which characterizes the compactness and weak compactness of the difference of two weighted composition operators on certain weighted Hardy-type spaces.
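For context, the two characterizations referred to in the first article can be recalled in their standard form from the literature (these are known results, not findings specific to the dissertation). The Nevanlinna counting function of the symbol $\varphi$ is

\[
N_{\varphi}(w) = \sum_{z \in \varphi^{-1}(\{w\})} \log\frac{1}{|z|}, \qquad w \in \mathbb{D}\setminus\{\varphi(0)\},
\]

and Shapiro's theorem states that the composition operator $C_{\varphi}$ is compact on the Hardy space $H^{2}$ if and only if

\[
\lim_{|w|\to 1^{-}} \frac{N_{\varphi}(w)}{\log(1/|w|)} = 0.
\]

The equivalent boundary condition from the Aleksandrov-measure literature is that every Aleksandrov measure of $\varphi$ is absolutely continuous with respect to Lebesgue measure on the unit circle.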