853 results for Interactive infographics. Theory of journalism. Production routines. Diário do Nordeste. Internet
Abstract:
This paper presents an approach, based on the Lean production philosophy, for rationalising the processes involved in the production of specification documents for construction projects. Current construction literature erroneously depicts the creation of construction specifications as a linear process, and this traditional understanding often culminates in process wastes. On the contrary, the evidence suggests that, although they can be generalised, the activities involved in producing specification documents are nonlinear. Drawing on the outcome of participant observation, this paper presents an optimised approach for representing construction specifications. The actors typically involved in producing specification documents are identified, the processes suitable for automation are highlighted, and the central role of tacit knowledge is integrated into a conceptual template of construction specifications. By applying the transformation, flow, value (TFV) theory of Lean production, the paper argues that value creation can be realised by eliminating the wastes associated with the traditional preparation of specification documents, with a view to integrating specifications into digital models such as Building Information Models (BIM). The paper therefore presents the TFV theory as a method for optimising current approaches to generating construction specifications, based on a revised specification-writing model.
Abstract:
The importance of journalism to civil society is constantly proclaimed, but empirical evidence on journalism's impact, and how this operates, is surprisingly thin. Indeed, there is confusion even about what is meant by the term “impact”. Meanwhile, the issue of the role of journalism is becoming increasingly urgent as a consequence of the rapid changes engulfing the news media, brought about by technological change and the flow-on effect to the traditional advertising-supported business model. Assessing the impact of journalism has recently been the topic of debate among practitioners and scholars particularly in the United States, where philanthropists have responded to the perceived crisis in investigative journalism by funding not-for-profit newsrooms, with resulting new pressures being placed on journalists and editors to quantify their impact on society. These recent attempts have so far failed to achieve clarity or a satisfactory conclusion, which is not surprising given the complex web of causation within which journalism operates. In this paper, the authors propose a stratified definition of journalistic impact and function. They propose a methodology for studying impact drawing on realistic evaluation—a theory-based approach developed primarily to assess large social programmes occurring in open systems. The authors argue this could allow a conceptual and methodological advance on the question of media impacts, leading to research capable of usefully informing responses at a time of worrying change.
Abstract:
Closed-form solutions for a simultaneously AM and high-harmonic FM mode-locked laser system are presented. Analytical expressions for the pulsewidth and pulsewidth-bandwidth products are derived in terms of the system parameters. The analysis predicts the production of 17 ps duration pulses in a Nd:YAG laser mode locked with AM and FM modulators driven at 80 MHz and 1.76 GHz for 1 W modulator input power. The predicted values of the pulsewidth-bandwidth product lie between the values corresponding to pure AM and pure FM mode locking.
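For orientation, active mode locking of this kind is usually analysed in the Kuizenga–Siegman framework. The block below sketches the commonly cited textbook form of the AM pulsewidth scaling; the symbols (g for saturated round-trip gain, δ_AM for modulation depth, f_m for modulation frequency, Δν for gain bandwidth) and the prefactor are generic textbook conventions, not necessarily this paper's notation or exact closed-form result.

```latex
% Kuizenga--Siegman scaling for AM mode locking (Gaussian pulse); the
% ~0.45 prefactor is the commonly quoted textbook value, an assumption here:
\tau_p \approx 0.45 \left(\frac{g}{\delta_{\mathrm{AM}}}\right)^{1/4}
       \frac{1}{\sqrt{f_m \, \Delta\nu}}
% Time-bandwidth products: roughly 0.44 for a transform-limited Gaussian
% (pure AM) and roughly 0.63 for a chirped Gaussian (pure FM) -- the two
% limits between which the combined AM-FM values above are said to lie.
```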
Abstract:
We consider the speech production mechanism and the associated linear source-filter model. For voiced speech sounds in particular, the source/glottal excitation is modeled as a stream of impulses and the filter as a cascade of second-order resonators. We show that the process of sampling speech signals can be modeled as filtering a stream of Dirac impulses (a model for the excitation) with a kernel function (the vocal tract response), and then sampling uniformly. We show that the problem of estimating the excitation is equivalent to the problem of recovering a stream of Dirac impulses from samples of a filtered version. We present associated algorithms based on the annihilating filter and also make a comparison with the classical linear prediction technique, which is well known in speech analysis. Results on synthesized as well as natural speech data are presented.
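As an illustration of the recovery principle the abstract refers to (a stream of Diracs recovered from filtered samples via the annihilating filter), here is a minimal numerical sketch on synthetic Fourier-series coefficients. The use of 2K+1 coefficients and all data values are standard finite-rate-of-innovation textbook choices, not this paper's exact algorithm or data.

```python
import numpy as np

# Hypothetical ground truth: K Diracs on [0, 1) with locations tau_k, weights a_k.
K = 3
tau = np.array([0.12, 0.47, 0.80])
a = np.array([1.0, 0.6, 1.4])

# Fourier-series coefficients of the Dirac stream stand in for the "samples of
# a filtered version" (e.g. obtained through a sinc/Dirichlet kernel).
M = 2 * K + 1                         # minimum number of coefficients: 2K+1
m = np.arange(M)
s = (a * np.exp(-2j * np.pi * np.outer(m, tau))).sum(axis=1)

# Annihilating filter: find h of length K+1 with (h * s)[n] = 0 for n = K..M-1.
# Build the Toeplitz convolution matrix and take its null vector via SVD.
T = np.array([s[K + i - np.arange(K + 1)] for i in range(M - K)])
_, _, Vh = np.linalg.svd(T)
h = Vh[-1].conj()                     # null-space vector of T

# Roots of the filter polynomial encode the Dirac locations.
u = np.roots(h)
tau_hat = np.sort(np.mod(-np.angle(u) / (2 * np.pi), 1.0))

# Weights by least squares on the Vandermonde system.
V = np.exp(-2j * np.pi * np.outer(m, tau_hat))
a_hat = np.real(np.linalg.lstsq(V, s, rcond=None)[0])

print(tau_hat)   # ~ [0.12, 0.47, 0.80]
print(a_hat)     # ~ [1.0, 0.6, 1.4]
```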
Abstract:
An attempt is made to provide a theoretical explanation of the effect of the positive column on the voltage-current characteristic of a glow or an arc discharge. Such theories have been developed before, and all are based on balancing the production and loss of charged particles and accounting for the energy supplied to the plasma by the applied electric field. Differences among the theories arise from the approximations and omissions made in selecting processes that affect the particle and energy balances. This work is primarily concerned with the deviation from the ambipolar description of the positive column caused by space charge, electron-ion volume recombination, and temperature inhomogeneities.
The presentation is divided into three parts, the first of which involves the derivation of the final macroscopic equations from kinetic theory. The final equations are obtained by taking the first three moments of the Boltzmann equation for each of the three species in the plasma. Although the method used and the equations obtained are not novel, the derivation is carried out in detail in order to appraise the validity of numerous approximations and to justify the use of data from other sources. The equations are applied to a molecular hydrogen discharge contained between parallel walls. The applied electric field is parallel to the walls, and the dependent variables—electron and ion flux to the walls, electron and ion densities, transverse electric field, and gas temperature—vary only in the direction perpendicular to the walls. The mathematical description is given by a sixth-order nonlinear two-point boundary value problem which contains the applied field as a parameter. The amount of neutral gas and its temperature at the walls are held fixed, and the relation between the applied field and the electron density at the center of the discharge is obtained in the process of solving the problem. This relation corresponds to that between current and voltage and is used to interpret the effect of space charge, recombination, and temperature inhomogeneities on the voltage-current characteristic of the discharge.
The complete solution of the equations is impractical both numerically and analytically, and in Part II the gas temperature is assumed uniform so as to focus on the combined effects of space charge and recombination. The terms representing these effects are treated as perturbations to equations that would otherwise describe the ambipolar situation. However, the term representing space charge is not negligible in a thin boundary layer or sheath near the walls, and consequently the perturbation problem is singular. Separate solutions must be obtained in the sheath and in the main region of the discharge, and the relation between the electron density and the applied field is not determined until these solutions are matched.
In Part III the electron and ion densities are assumed equal, and the complicated space-charge calculation is thereby replaced by the ambipolar description. Recombination and temperature inhomogeneities are both important at high values of the electron density. However, the formulation of the problem permits a comparison of the relative effects, and temperature inhomogeneities are shown to be important at lower values of the electron density than recombination. The equations are solved by a direct numerical integration and by treating the term representing temperature inhomogeneities as a perturbation.
The conclusions reached in the study are primarily concerned with the association of the relation between electron density and axial field with the voltage-current characteristic. It is known that the effect of space charge can account for the subnormal glow discharge and that the normal glow corresponds to a close approach to an ambipolar situation. The effect of temperature inhomogeneities helps explain the decreasing characteristic of the arc, and the effect of recombination is not expected to appear except at very high electron densities.
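To make the zeroth-order problem concrete, the sketch below solves the classical ambipolar (Schottky-type) balance for the density profile between parallel walls, with the ionization frequency appearing as an eigenvalue determined together with the profile, much as the applied field and central density are related in the abstract above. It is an illustrative reduction under assumed nondimensionalization, not the paper's sixth-order nonlinear boundary value problem.

```python
import numpy as np
from scipy.integrate import solve_bvp

# Zeroth-order ambipolar balance (dimensionless): n'' + nu_i * n = 0 between
# walls at x = -1 and x = 1, with n(+/-1) = 0. The ionization frequency nu_i
# is an unknown parameter (eigenvalue) fixed by the boundary conditions.

def rhs(x, y, p):
    n, dn = y
    return np.vstack([dn, -p[0] * n])          # n'' = -nu_i * n

def bc(ya, yb, p):
    # n(-1) = 0, n(1) = 0, and pin the (otherwise arbitrary) amplitude
    # by requiring n'(-1) = 1.
    return np.array([ya[0], yb[0], ya[1] - 1.0])

x = np.linspace(-1.0, 1.0, 51)
y0 = np.vstack([np.cos(np.pi * x / 2), -np.sin(np.pi * x / 2)])  # rough guess
sol = solve_bvp(rhs, bc, x, y0, p=[2.0])

print(sol.p[0])   # -> pi^2/4 ~ 2.467, the Schottky eigenvalue (cosine profile)
```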
Abstract:
Handwriting production is viewed as a constrained modulation of an underlying oscillatory process. Coupled oscillations in horizontal and vertical directions produce letter forms, and when superimposed on a rightward constant-velocity horizontal sweep result in spatially separated letters. Modulation of the vertical oscillation is responsible for control of letter height, either through altering the frequency or altering the acceleration amplitude. Modulation of the horizontal oscillation is responsible for control of corner shape through altering phase or amplitude. The vertical velocity zero crossing in the velocity space diagram is important from the standpoint of control. Changing the horizontal velocity value at this zero crossing controls corner shape, and such changes can be effected through modifying the horizontal oscillation amplitude and phase. Changing the slope at this zero crossing controls writing slant; this slope depends on the horizontal and vertical velocity zero amplitudes and on the relative phase difference. Letter height modulation is also best applied at the vertical velocity zero crossing to preserve an even baseline. The corner shape and slant constraints completely determine the amplitude and phase relations between the two oscillations. Under these constraints, interletter separation is not an independent parameter. This theory applies generally to a number of acceleration oscillation patterns, such as sinusoidal, rectangular and trapezoidal oscillations. The oscillation theory also provides an explanation for how handwriting might degenerate with speed. An implementation of the theory in the context of the spring muscle model is developed. Here, sinusoidal oscillations arise from purely mechanical sources; orthogonal antagonistic spring pairs generate particular cycloids depending on the initial conditions. Modulating between cycloids can be achieved by changing the spring zero settings at the appropriate times. Frequency can be modulated either by shifting between coactivation and alternating activation of the antagonistic springs or by presuming variable-spring-constant springs. An acceleration and position measuring apparatus was developed for measurements of human handwriting. Measurements of human writing are consistent with the oscillation theory. It is shown that the minimum energy movement for the spring muscle is bang-coast-bang. For certain parameter values a singular arc solution can be shown to be minimizing. Experimental measurements, however, indicate that handwriting is not a minimum energy movement.
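A minimal sketch of the core generative idea follows: coupled oscillations superimposed on a constant rightward sweep, with letter-height modulation applied at a vertical-velocity zero crossing so the baseline stays even. All parameter values are illustrative assumptions, not fitted to the measurements described above.

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0.0, 6.0, 3000)
f = 5.0                                # oscillation frequency, Hz (assumed)
sweep = 1.0                            # constant horizontal sweep velocity
A_x, A_y = 0.12, 0.5                   # oscillation amplitudes (assumed)
phase = np.pi / 2                      # relative phase -> controls corner shape

# Letter-height modulation: switch at t = 3.15, where sin(2*pi*f*t) = -1, i.e.
# the bottom of a stroke and a vertical-velocity zero crossing, so y and its
# velocity are continuous and the baseline (y = 0) is preserved.
height = np.where(t < 3.15, 1.0, 0.5)

x = sweep * t + A_x * np.sin(2 * np.pi * f * t + phase)
y = height * A_y * (1.0 + np.sin(2 * np.pi * f * t)) / 2.0

plt.plot(x, y)
plt.gca().set_aspect("equal")
plt.title("Cursive-like trace from modulated coupled oscillations")
plt.show()
```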
Abstract:
Statistical techniques are fundamental in science, and linear regression analysis is perhaps one of the most widely used methodologies. It is well known from the literature that, under certain conditions, linear regression is an extremely powerful statistical tool. Unfortunately, in practice, some of these conditions are rarely satisfied, and the regression models become ill-posed, making the application of traditional estimation methods unfeasible. This work presents some contributions to maximum entropy theory in the estimation of ill-posed models, in particular the estimation of linear regression models with small samples affected by collinearity and outliers. The research is developed along three lines: the estimation of technical efficiency with state-contingent production frontiers, the estimation of the ridge parameter in ridge regression and, finally, new developments in maximum entropy estimation. In the estimation of technical efficiency with state-contingent production frontiers, the work shows that maximum entropy estimators outperform the maximum likelihood estimator. This good performance is notable in models with few observations per state and in models with a large number of states, which are commonly affected by collinearity. It is hoped that the use of maximum entropy estimators will contribute to the much-desired increase in empirical work with these production frontiers. In ridge regression, the greatest challenge is the estimation of the ridge parameter. Although numerous procedures are available in the literature, none outperforms all the others. This work proposes a new estimator of the ridge parameter that combines ridge trace analysis with maximum entropy estimation. The results of the simulation studies suggest that this new estimator is among the best procedures in the literature for estimating the ridge parameter. The Leuven maximum entropy estimator is based on the least squares method, Shannon entropy and concepts from quantum electrodynamics. It overcomes the main criticism levelled at the generalized maximum entropy estimator, since it dispenses with supports for the parameters and errors of the regression model. This work presents new contributions to maximum entropy theory in the estimation of ill-posed models, based on the Leuven maximum entropy estimator, information theory and robust regression. The estimators developed perform well in linear regression models with small samples affected by collinearity and outliers. Finally, some computational codes for maximum entropy estimation are presented, thus adding to the scarce computational resources currently available.
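For readers unfamiliar with the ridge-trace ingredient of the proposed ridge-parameter estimator, the sketch below computes a classical ridge trace on synthetic, deliberately collinear, small-sample data; the entropy-based selection rule itself is not reproduced here, and all data values are invented for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n, p = 30, 4                             # small sample, as in the thesis setting
z = rng.normal(size=(n, 1))
X = z + 0.05 * rng.normal(size=(n, p))   # near-collinear columns by construction
beta = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ beta + rng.normal(scale=0.5, size=n)

# Ridge estimates beta_hat(k) = (X'X + kI)^{-1} X'y over a grid of k values.
ks = np.logspace(-4, 2, 100)
trace = np.array([np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y) for k in ks])

plt.semilogx(ks, trace)
plt.xlabel("ridge parameter k")
plt.ylabel("coefficient estimates")
plt.title("Ridge trace: estimates stabilize as k grows")
plt.show()
```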
Abstract:
This paper proposes a way of addressing unresolved issues in international business theory by modelling the multinational enterprise as a coordinator of supply chains. It identifies a new market seeking strategy that is an alternative to conventional strategies such as exporting, licensing and FDI, and analyses the conditions under which it will be adopted by firms. The new strategy involves the off-shoring of production and the out-sourcing of R&D, and is implemented through co-operation between a source country firm and a host country firm.
Abstract:
The economic theory of the firm is central to the theory of the multinational enterprise. Recent literature on multinationals, however, makes only limited reference to the economic theory of the firm. Multinationals play an important role in coordinating the international division of labour through internal markets. The paper reviews the economic principles that underlie this view. Optimal internalisation equates marginal benefits and costs. The benefits of internalisation stem mainly from the difficulties of licensing proprietary knowledge, reflecting the view that MNEs possess an ‘ownership’ or ‘firm-specific’ advantage. The costs of internalisation, it is argued, reflect managerial capability, and in particular the capability to manage a large firm. The paper argues that management capability is a complement to ownership advantage. Ownership advantage determines the potential of the firm, and management capability governs the fulfilment of this potential through overcoming barriers to growth. The analysis is applied to a variety of issues, including out-sourcing, geographical dispersion of production, and regional specialisation in marketing.
Abstract:
After the “European” experience of BSE and further food safety crises, consumer trust is playing an increasingly important role in political and marketing decision making. This also applies to consumer acceptance of GM food. This paper integrates consumer trust with the theory of planned behavior and a stated choice model to gain a more complete picture of consumer decision making. Preliminary results indicate that when GM products offer practical benefits to consumers, acceptance may increase considerably. Furthermore, both trust and perceived benefits contribute significantly to explaining the level of acceptance.
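As a hedged illustration of the stated-choice ingredient, the sketch below evaluates a simple multinomial logit in which trust and perceived benefit enter each alternative's utility; the coefficients, attributes and alternatives are invented for illustration and are not the paper's estimates.

```python
import numpy as np

def choice_probs(utilities):
    # Multinomial logit choice probabilities (numerically stabilized softmax).
    e = np.exp(utilities - utilities.max())
    return e / e.sum()

# Assumed utility specification: V = b_price*price + b_benefit*benefit + b_trust*trust
b = {"price": -1.2, "benefit": 0.9, "trust": 0.7}

# Alternative 1: GM product, cheaper and with a practical benefit but lower trust;
# alternative 2: conventional product. All attribute values are hypothetical.
V_gm   = b["price"] * 0.8 + b["benefit"] * 1.0 + b["trust"] * 0.6
V_conv = b["price"] * 1.0 + b["benefit"] * 0.0 + b["trust"] * 1.0

print(choice_probs(np.array([V_gm, V_conv])))   # choice shares for [GM, conventional]
```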
Abstract:
This is a study conducted at, and for, the National Museum of History in Stockholm. The aim of the study was to confirm or disconfirm the hypothesis that visitors in a traditional museum environment might not take part in the interactivity of an interactive exhibition and that, if they do, they might skip the texts and objects on display. To answer this and other questions, a multiple-method design was used: both non-participant observations and exit interviews were conducted. After a description of the interactive exhibits, theory of knowledge and learning is presented, followed by the gathered data. Altogether, 443 visitors were observed. In the observations, visitors were timed on how long they spent in the room and how much time they spent on the interactivity, the texts and the objects. In the 40 interviews, information was gathered about visitors' participation in the interactivity and about which interactive exhibits they found easiest, hardest, most fun and most boring. The results did not confirm the hypothesis: all kinds of visitors, children and adults alike, participated in the interactivities, and visitors took part in the texts and objects as well as the interactive exhibits.
Abstract:
Perhaps because of its origins in a production scheduling software package called Optimised Production Technology (OPT), together with its idea of focusing on system constraints, many believe that the Theory of Constraints (TOC) has a vocation for optimal solutions. Those who assess TOC from this perspective point out that it guarantees an optimal solution only in certain circumstances. In opposition to this view, and building on a numerical example of a product mix problem, this paper shows, by means of TOC assumptions, why TOC should not be compared with methods that seek optimal or best solutions, but rather with those seeking solutions that are good enough, as is feasible in non-deterministic environments. Moreover, we extend the literature on the product mix decision by introducing a heuristic based on the only work identified that aims at achieving feasible solutions from the TOC point of view. The proposed heuristic is tested on 100 product mix problems, and the results are compared with those obtained using Integer Linear Programming. The results show that the heuristic performs well on average, but its performance falls sharply in some situations.
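For concreteness, the sketch below implements the classical TOC product-mix ranking heuristic this debate revolves around: rank products by throughput (price minus raw-material cost) per minute on the bottleneck, then load the bottleneck in that order. The P/Q data are the well-known illustrative values from the TOC literature, not the authors' proposed heuristic or their 100 test problems.

```python
products = [
    # (name, throughput per unit, bottleneck minutes per unit, weekly demand)
    ("P", 45.0, 15, 100),
    ("Q", 60.0, 30, 50),
]
capacity = 2400          # bottleneck minutes available per week (assumed)

# Rank by throughput per bottleneck minute, descending.
ranked = sorted(products, key=lambda prod: prod[1] / prod[2], reverse=True)

mix, remaining = {}, capacity
for name, tp, minutes, demand in ranked:
    qty = min(demand, remaining // minutes)   # fill up to demand or capacity
    mix[name] = qty
    remaining -= qty * minutes

total = sum(mix[name] * tp for name, tp, _, _ in products)
print(mix, total)        # -> {'P': 100, 'Q': 30} 6300.0
```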
Abstract:
In this paper, we address the problem of defining the product mix in order to maximise a system's throughput. This problem is well known to be NP-complete and, therefore, most contributions on the topic focus on developing heuristics able to obtain good solutions in a short CPU time. In particular, constructive heuristics are available for the problem, such as those by Fredendall and Lea and by Aryanezhad and Komijan. We propose a new constructive heuristic based on the Theory of Constraints and the Knapsack Problem. The computational results indicate that the proposed heuristic yields better results than the existing heuristics.
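Since the proposed heuristic combines the Theory of Constraints with the Knapsack Problem, the sketch below shows one natural way to cast the product-mix decision as a bounded knapsack over bottleneck minutes, solved here by dynamic programming. This is an illustration of the ingredients under assumed data, not the authors' algorithm.

```python
def product_mix_dp(products, capacity):
    """Bounded-knapsack DP over integer bottleneck minutes.

    products: list of (throughput per unit, bottleneck minutes per unit, demand);
    minutes and capacity must be integers for the DP index.
    """
    best = [0.0] * (capacity + 1)                       # best throughput per capacity
    choice = [[0] * len(products) for _ in range(capacity + 1)]
    for i, (tp, minutes, demand) in enumerate(products):
        new_best = best[:]
        new_choice = [c[:] for c in choice]
        for cap in range(capacity + 1):
            # Try producing q units of product i within remaining capacity/demand.
            for q in range(1, min(demand, cap // minutes) + 1):
                val = best[cap - q * minutes] + q * tp
                if val > new_best[cap]:
                    new_best[cap] = val
                    new_choice[cap] = choice[cap - q * minutes][:]
                    new_choice[cap][i] += q
        best, choice = new_best, new_choice
    return best[capacity], choice[capacity]

# Same illustrative P/Q data as above; the optimum matches the TOC ranking here.
print(product_mix_dp([(45.0, 15, 100), (60.0, 30, 50)], 2400))  # -> (6300.0, [100, 30])
```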