31 results for Industrial Discourses
Abstract:
There is an urgent interest in marketing in moving away from neo-classical value definitions, which suggest that value creation is a process of exchanging goods for money. In the present paper, value creation is conceptualized as an integration of two distinct yet closely coupled processes. First, actors co-create what this paper calls an underlying basis of value. This is done by interactively re-configuring resources. By relating and combining resources, activity sets, and risks across actor boundaries in novel ways, actors create joint productivity gains – a concept very similar to density (Normann, 2001). Second, actors engage in a process of signification and evaluation. Signification implies co-constructing the meaning and worth of the joint productivity gains co-created through interactive resource re-configuration, as well as sharing those gains, through a pricing mechanism, as value to the involved actors. The conceptual framework highlights an all-important dynamic associated with 'value creation' and 'value' – a dynamic the paper claims has eluded past marketing research. The paper argues that the framework presented here is appropriate for the interactive service perspective, where value and value creation are not objectively given but depend on the power of involved actors' socially constructed frames to mobilize resources across actor boundaries in ways that 'enhance system well-being' (Vargo et al., 2008). The paper contributes to research on Service Logic, Service-Dominant Logic, and Service Science.
Abstract:
Summary: Model-based assessment of the risks that the industrial handling of chemicals poses to aquatic organisms.
Abstract:
This master's thesis studies how trade liberalization affects firm-level productivity and industrial evolution. To do so, I build a dynamic model that treats firm-level productivity as endogenous in order to investigate the influence of trade on firms' productivity and on market structure. In this framework, heterogeneous firms in the same industry operate differently in equilibrium. Specifically, firms are ex ante identical, but heterogeneity arises as an equilibrium outcome. Under monopolistic competition, this type of model yields an industry that is represented not by a steady-state outcome but by an evolution that relies on the decisions made by individual firms. I show that trade liberalization has a generally positive impact on technology adoption rates and hence increases firm-level productivity. Besides, this endogenous technology adoption model also captures a stylized fact: exporting firms are larger and more productive than their non-exporting counterparts in the same sector. I assume that the number of firms is endogenous since, according to the empirical literature, industrial evolution shows considerably different patterns across countries: some industries experience large-scale exit of firms in periods of contracting market share, while other industries display a relatively stable or gradually increasing number of firms. The term "shakeout" is used to describe a dramatic decrease in the number of firms. In order to explain the causes of shakeouts, I construct a model in which forward-looking firms decide to enter and exit the market on the basis of their state of technology. In equilibrium, firms choose different dates to adopt an innovation, which generates a gradual diffusion process. It is exactly this gradual diffusion process that generates the rapid, large-scale exit phenomenon.
Specifically, the model demonstrates a positive feedback between firms' exit and adoption: the reduction in the number of firms increases the incentive for the remaining firms to adopt the innovation. Therefore, in a setting of complete information, the model not only generates a shakeout but also captures the stability of an industry. However, a solely national view of industrial evolution neglects the importance of international trade in determining the shape of market structure. In particular, I show that higher trade barriers lead to more fragile markets, encouraging over-entry in the initial stage of the industry life cycle and raising the probability of a shakeout. Therefore, more liberalized trade generates a more stable market structure from both national and international viewpoints. The main references are Ederington and McCalman (2008, 2009).
Abstract:
This thesis identifies, examines and problematizes some of the discourses that have so far come to light on the issue of protection for environmental refugees. By analyzing the discourses produced by the United Nations Office of the High Commissioner for Refugees (UNHCR) and two non-governmental organizations - the Environmental Justice Foundation (EJF) and the Equity and Justice Working Group Bangladesh (EquityBD) - I examine the struggling discourses that have emerged about how protection for environmental refugees has been interpreted. To do this, I rely on Ernesto Laclau and Chantal Mouffe's theory and method of discourse analysis. The results show that responsibilization is the main point of struggle in the discussions on the protection of environmental refugees. As a floating signifier, it was utilized by the discourses produced by the UNHCR and the selected NGOs in contingent ways and with different political objectives. The UNHCR discourse responsibilized both the environmental refugees themselves for their own protection and the individual states. The EJF and EquityBD, by contrast, allocated responsibility for the protection of environmental refugees to the international community. These contingent understandings of responsibilization necessitated different justifications. While the EJF discourse relied on humanitarianism for the assistance of environmental refugees, the EquityBD discourse constructed a rights-based, more permanent solution. The humanitarian-based discourse of the EJF was found to be inextricably linked with the neoliberal discourse produced by the UNHCR. Both these discourses encouraged environmental refugees to stay in their homelands, undermining the politics of protection. Another way in which protection was undermined was through the UNHCR's discourse on securitization. In this context, climate-change-induced displacement became a threat to developed countries, the global economy and transnational classes.
The struggling discourses about who or what has been allocated responsibility for the protection of environmental refugees also meant that the identities of the displaced were constructed in specific ways. While the UNHCR discourse constructed them as voluntary migrants and predators, the EJF and EquityBD discourses portrayed them as victims. However, even though the EJF discourse constructed them as victims, its reliance on humanitarianism could also be interpreted as a way of giving the environmental refugee a predator-like identity. These discourses on responsibilization and identity formation clashed with each other, each aiming to achieve a hegemonic position in discussions and debates about the protection of environmental refugees.
Abstract:
In the study, the potential allowable cut in the district of Pohjois-Savo was determined on the basis of the non-industrial private forest (NIPF) landowners' choices of timber management strategies. Alternative timber management strategies were generated, and the landowners' choices among them, as well as the factors affecting those choices, were studied. The choices of timber management strategies were solved by maximizing the utility functions of the NIPF landowners. The parameters of the utility functions were estimated using the Analytic Hierarchy Process (AHP). The level of the potential allowable cut was compared to the cutting budgets based on the 7th and 8th National Forest Inventories (NFI7 and NFI8), to the combining of private forestry plans, and to the realized drain from non-industrial private forests. The potential allowable cut was calculated using the same MELA system as has been used in calculating the national cutting budget. The data consisted of the NIPF holdings (from the TASO planning system) that had been inventoried compartmentwise and had forestry plans made during the years 1984-1992. The NIPF landowners' choices of timber management strategies were elicited by a two-phase mail inquiry. The most preferred strategy was "sustainability" (chosen by 62 % of the landowners). The second in order of preference was "finance" (17 %) and the third was "saving" (11 %). "No cuttings" and "maximum cuttings" were the least preferred (9 % and 1 %, respectively). The factors promoting the choice of strategies with intensive cuttings were a) being a farmer as forest owner and owning fields, b) an increase in the size of the forest holding, c) an agriculture and forestry orientation in production, d) decreasing short-term stumpage earning expectations, e) increasing intensity of future cuttings, and f) choice of the forest taxation system based on site productivity.
The potential allowable cut defined in the study was 20 % higher than the average realized drain during the years 1988-1993, which, in turn, was at the same level as the cutting budget based on the combining of forestry plans in eastern Finland. Respectively, the potential allowable cut defined in the study was 12 % lower than the NFI8-based greatest sustained allowable cut for the 1990s. Using the method presented in this study, timber management strategies can be elicited for non-industrial private forest landowners in different parts of Finland. Based on the choices of timber management strategies, regular cutting budgets can be calculated more realistically than before.
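The AHP step mentioned above derives priority weights from pairwise comparisons of alternatives. A minimal sketch of that computation, with a hypothetical 3x3 comparison matrix whose entries and implied criteria are illustrative only, not values from the study:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three strategy criteria;
# the entries are illustrative, not data from the study.
A = np.array([
    [1.0, 3.0, 5.0],
    [1.0 / 3.0, 1.0, 2.0],
    [1.0 / 5.0, 1.0 / 2.0, 1.0],
])

# AHP priority weights: the normalized principal right eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency index CI = (lambda_max - n) / (n - 1);
# small values indicate nearly consistent judgements.
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
```

In AHP the reciprocal structure of the matrix (A[j][i] = 1/A[i][j]) encodes the landowner's relative preferences, and the eigenvector converts them into the weights that enter the utility function.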
Abstract:
The factors affecting non-industrial private forest (NIPF) landowners' strategic decisions in management planning are studied. A genetic algorithm is used to induce a set of rules predicting the potential cut implied by the landowners' choices of preferred timber management strategies. The rules are based on variables describing the characteristics of the landowners and their forest holdings. The predictive ability of the genetic algorithm is compared to that of linear regression analysis using identical data sets. The data are cross-validated seven times, applying both genetic-algorithm and regression analyses, in order to examine the data sensitivity and robustness of the generated models. The optimal rule set derived from the genetic-algorithm analyses included the following variables: mean initial volume, the landowner's positive price expectations for the next eight years, the landowner being classified as a farmer, and a preference for the recreational use of the forest property. When tested with previously unseen test data, the optimal rule set resulted in a relative root mean square error of 0.40. In the regression analyses, the optimal regression equation consisted of the following variables: mean initial volume, proportion of forestry income, intention to cut extensively in the future, and positive price expectations for the next two years. The R2 of the optimal regression equation was 0.34, and the relative root mean square error obtained from the test data was 0.38. In both models, mean initial volume and positive stumpage price expectations entered as significant predictors of the potential cut of the preferred timber management strategy. When tested with the complete data set of 201 observations, both the optimal rule set and the optimal regression model achieved the same level of accuracy.
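The comparison above hinges on the relative root mean square error, i.e. the RMSE scaled by the mean of the observed values, which makes the two models' errors comparable across data sets. A minimal sketch of that metric, with made-up cut volumes (not data from the study):

```python
import numpy as np

def relative_rmse(y_true, y_pred):
    """Root mean square error divided by the mean of the observed values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / y_true.mean()

# Toy example with hypothetical observed and predicted cut volumes.
observed = [100.0, 100.0]
predicted = [90.0, 110.0]
err = relative_rmse(observed, predicted)  # RMSE 10 over a mean of 100
```

A relative error of 0.40 for the rule set and 0.38 for the regression equation, as reported above, would thus mean prediction errors of roughly 40 % of the mean observed cut in both cases.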
Abstract:
The methods of secondary wood processing are assumed to evolve over time and to affect the requirements set for the wood material and its suppliers. The study aimed at analysing the industrial operating modes applied by joinery and furniture manufacturers as sawnwood users. Industrial operating mode was defined as a pattern of important decisions and actions taken by a company, describing the company's level of adjustment in the late-industrial transition. A non-probabilistic sample of 127 companies was interviewed, including companies from Denmark, Germany, the Netherlands, and Finland. Fifty-two of the firms were furniture manufacturers, and the other 75 produced windows and doors. Variables related to business philosophy, production operations, and supplier-choice criteria were measured and used as the basis for a customer typology; variables related to wood usage and perceived sawmill performance were measured in order to profile the customer types. Factor analysis was used to determine the latent dimensions of industrial operating mode. Canonical correlation analysis was applied in developing the final basis for classifying the observations. Non-hierarchical cluster analysis was employed to build a five-group typology of secondary wood processing firms; these ranged from traditional mass producers to late-industrial flexible manufacturers. There is a clear connection between the number of late-industrial elements in a company and the share of special and customised sawnwood it uses. Those joinery or furniture manufacturers that are more late-industrial are also likely to use more component-type wood material and to appreciate customer-oriented technical precision. The results show that the change is towards the use of late-industrial sawnwood materials and late-industrial supplier relationships.
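Non-hierarchical cluster analysis of the kind used for the typology partitions observations into a fixed number of groups by iteratively reassigning them to the nearest cluster centre. A generic k-means sketch on hypothetical two-dimensional scores (think factor scores on two operating-mode dimensions; the numbers and the two-cluster setup are illustrative, not the study's five-group solution):

```python
import numpy as np

def kmeans(X, init_centers, n_iter=20):
    """Plain non-hierarchical (k-means) clustering sketch."""
    centers = np.array(init_centers, dtype=float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assign each observation to its nearest cluster centre.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centre to the mean of its current members.
        for j in range(len(centers)):
            members = X[labels == j]
            if len(members) > 0:
                centers[j] = members.mean(axis=0)
    return labels, centers

# Hypothetical scores for six firms forming two clear groups.
X = np.array([[0.1, 0.0], [0.0, 0.2], [0.2, 0.1],
              [2.0, 2.1], [2.2, 1.9], [1.9, 2.0]])
labels, centers = kmeans(X, init_centers=X[[0, 3]])
```

In the study the inputs to the clustering were the canonical-variate scores, and five clusters were retained rather than two.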
Abstract:
Human activities extract and displace different substances and materials from the earth's crust, thus causing various environmental problems, such as climate change, acidification and eutrophication. As problems have become more complicated, more holistic measures that consider the origins and sources of pollutants have been called for. Industrial ecology is a field of science that forms a comprehensive framework for studying the interactions between the modern technological society and the environment. Industrial ecology considers humans and their technologies to be part of the natural environment, not separate from it. Industrial operations form natural systems that must also function as such within the constraints set by the biosphere. Industrial symbiosis (IS) is a central concept of industrial ecology. Industrial symbiosis studies look at the physical flows of materials and energy in local industrial systems. In an ideal IS, waste material and energy are exchanged by the actors of the system, thereby reducing the consumption of virgin material and energy inputs and the generation of waste and emissions. Companies are seen as part of the chains of suppliers and consumers that resemble those of natural ecosystems. The aim of this study was to analyse the environmental performance of an industrial symbiosis based on pulp and paper production, taking into account life cycle impacts as well. Life Cycle Assessment (LCA) is a tool for quantitatively and systematically evaluating the environmental aspects of a product, technology or service throughout its whole life cycle. Moreover, the Natural Step Sustainability Principles formed a conceptual framework for assessing the environmental performance of the case study symbiosis (Paper I). The environmental performance of the case study symbiosis was compared to four counterfactual reference scenarios in which the actors of the symbiosis operated on their own.
The research methods used were process-based life cycle assessment (LCA) (Papers II and III) and hybrid LCA, which combines both process and input-output LCA (Paper IV). The results showed that the environmental impacts caused by the extraction and processing of the materials and the energy used by the symbiosis were considerable. If only the direct emissions and resource use of the symbiosis had been considered, less than half of the total environmental impacts of the system would have been taken into account. When the results were compared with the counterfactual reference scenarios, the net environmental impacts of the symbiosis were smaller than those of the reference scenarios. The reduction in environmental impacts was mainly due to changes in the way energy was produced. However, the results are sensitive to the way the reference scenarios are defined. LCA is a useful tool for assessing the overall environmental performance of industrial symbioses. It is recommended that in addition to the direct effects, the upstream impacts should be taken into account as well when assessing the environmental performance of industrial symbioses. Industrial symbiosis should be seen as part of the process of improving the environmental performance of a system. In some cases, it may be more efficient, from an environmental point of view, to focus on supply chain management instead.
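The comparison logic described above, splitting each scenario's impact into direct and upstream contributions and netting the symbiosis against its counterfactual, can be sketched with purely hypothetical impact scores (the numbers below are illustrative and are not results from the papers):

```python
# Hypothetical impact scores (e.g. kg CO2-eq), split into direct emissions
# and upstream (extraction and processing) impacts; illustrative only.
symbiosis = {"direct": 40.0, "upstream": 60.0}
reference = {"direct": 70.0, "upstream": 80.0}  # actors operating on their own

total_symbiosis = sum(symbiosis.values())
total_reference = sum(reference.values())

# Share of the total impact that a direct-emissions-only assessment would see.
direct_share = symbiosis["direct"] / total_symbiosis

# Net benefit of the symbiosis relative to the stand-alone scenario.
net_reduction = total_reference - total_symbiosis
```

With numbers of this shape, a direct-emissions-only assessment would capture less than half of the total impact, mirroring the study's finding that upstream impacts must be included; the sign and size of `net_reduction` depend entirely on how the reference scenario is defined, which is why the results are sensitive to that choice.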
Abstract:
XVIII IUFRO World Congress, Ljubljana 1986.
Abstract:
XVIII IUFRO World Congress, Ljubljana 1986.
Abstract:
The subject and methodology of biblical scholarship has expanded immensely during the last few decades. The traditional text-, literary-, source- and form-critical approaches, labeled "historical-critical scholarship", have faced the challenge of the social sciences. Various new literary, synchronic readings, sometimes characterized with the vague term postmodernism, have in turn challenged historical-critical and social-scientific approaches. Widened limits and diverging methodologies have caused a sense of crisis in biblical criticism. This metatheoretical thesis attempts to bridge the gap between philosophical discussion about the basis of biblical criticism and practical academic biblical scholarship. The study attempts to trace those epistemological changes that have produced the wealth of methods and results within biblical criticism. The account of the cult reform of King Josiah of Judah as reported in 2 Kings 22:1-23:30 serves as the case study because of its importance for critical study of the Hebrew Bible. Various scholarly approaches embracing 2 Kings 22:1-23:30 are experimentally arranged around four methodological positions: text, author, reader, and context. The heuristic model is a tentative application of Oliver Jahraus's model of four paradigms in literary theory. The study argues for six theses: 1) Our knowledge of the world is constructed, fallible and theory-laden. 2) Methodological plurality is the necessary result of changes in epistemology and culture in general. 3) Oliver Jahraus's four methodological positions in regard to literature are also an applicable model within biblical criticism to comprehend the methodological plurality embracing the study of the Hebrew Bible. 4) Underlying the methodological discourse embracing biblical criticism is the epistemological tension between the natural sciences and the humanities.
5) Biblical scholars should reconsider and analyze in detail concepts such as author and editor to overcome the dichotomy between the Göttingen and Cross schools. 6) To say something about the historicity of 2 Kings 22:1-23:30 one must bring together disparate elements from various disciplines and, finally, admit that though it may be possible to draw some permanent results, our conclusions often remain provisional.
Abstract:
Light scattering, or the scattering and absorption of electromagnetic waves, is an important tool in all remote-sensing observations. In astronomy, the light scattered or absorbed by a distant object can be the only source of information. In Solar-system studies, light-scattering methods are employed when interpreting observations of atmosphereless bodies such as asteroids, of planetary atmospheres, and of cometary or interplanetary dust. Our Earth is constantly monitored from artificial satellites at different wavelengths. In remote sensing of the Earth, light-scattering methods are not the only source of information: there is always the possibility of making in situ measurements. Satellite-based remote sensing is, however, superior in terms of speed and coverage, provided the scattered signal can be reliably interpreted. The optical properties of many industrial products play a key role in their quality. Especially for products such as paint and paper, the ability to obscure the background and to reflect light is of utmost importance. High-grade papers are evaluated based on their brightness, opacity, color, and gloss. In product development, there is a need for computer-based simulation methods that could predict the optical properties and could therefore be used to optimize quality while reducing material costs. With paper, for instance, pilot experiments with an actual paper machine can be very time- and resource-consuming. The light-scattering methods presented in this thesis solve rigorously the interaction of light with materials that have wavelength-scale structures. These methods are computationally demanding; thus, the speed and accuracy of the methods play a key role. Different implementations of the discrete-dipole approximation are compared in the thesis, and the results provide practical guidelines for choosing a suitable code.
In addition, a novel method is presented for the numerical computations of orientation-averaged light-scattering properties of a particle, and the method is compared against existing techniques. Simulation of light scattering for various targets and the possible problems arising from the finite size of the model target are discussed in the thesis. Scattering by single particles and small clusters is considered, as well as scattering in particulate media, and scattering in continuous media with porosity or surface roughness. Various techniques for modeling the scattering media are presented and the results are applied to optimizing the structure of paper. However, the same methods can be applied in light-scattering studies of Solar-system regoliths or cometary dust, or in any remote-sensing problem involving light scattering in random media with wavelength-scale structures.
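Orientation averaging of the kind mentioned above amounts to integrating a scattering quantity over all particle orientations. A generic quadrature sketch, not the novel method of the thesis: Gauss-Legendre nodes in cos(theta) combined with a uniform grid in the azimuth phi, normalized so the result is the spherical mean.

```python
import numpy as np

def orientation_average(f, n_theta=16, n_phi=32):
    """Spherical mean of f(theta, phi) via Gauss-Legendre quadrature
    in cos(theta) and a uniform grid in phi. Generic sketch only."""
    x, w = np.polynomial.legendre.leggauss(n_theta)  # nodes in cos(theta)
    theta = np.arccos(x)
    phi = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
    total = 0.0
    for t, wt in zip(theta, w):
        for p in phi:
            total += wt * f(t, p)
    # Gauss-Legendre weights sum to 2 and each phi sample carries weight
    # 1/n_phi, so dividing by 2*n_phi normalizes to the spherical mean.
    return total / (2.0 * n_phi)

# Sanity check: the spherical mean of cos^2(theta) is 1/3.
mean_cos2 = orientation_average(lambda t, p: np.cos(t) ** 2)
```

In a real light-scattering computation, f would be a cross section or a scattering-matrix element evaluated at a given particle orientation, which makes each quadrature node a full scattering solution and explains why efficient orientation-averaging schemes matter.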