36 results for Technological choices
in University of Queensland eSpace - Australia
Abstract:
Calculating the potentials on the heart’s epicardial surface from the body surface potentials constitutes one form of the inverse problem of electrocardiography (ECG). Since this problem is ill-posed, one approach is to use zero-order Tikhonov regularization, in which the squared norms of both the residual and the solution are minimized, with a relative weight determined by the regularization parameter. In this paper, we used three different methods to choose the regularization parameter in the inverse solutions of ECG: the L-curve, generalized cross validation (GCV) and the discrepancy principle (DP). Among them, the GCV method has received less attention in solutions to ECG inverse problems than the other two. Since the DP approach requires knowledge of the norm of the noise, we used a model function to estimate it. The performance of the various methods was compared using a concentric sphere model and a real-geometry heart-torso model, with a distribution of current dipoles placed inside the heart model as the source. Gaussian measurement noise was added to the body surface potentials. The results show that all three methods produce good inverse solutions when the noise is small; however, as the noise increases, the DP approach produces better results than the L-curve and GCV methods, particularly in the real-geometry model. Both the GCV and L-curve methods perform well in low-to-medium noise situations.
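The zero-order Tikhonov scheme with a GCV-selected regularization parameter described in the abstract can be sketched via the singular value decomposition. This is a minimal illustration, not the paper's implementation: the transfer matrix `A`, measurement vector `b`, function name `tikhonov_gcv`, and the logarithmic search grid are all assumptions for the sake of the example.

```python
import numpy as np

def tikhonov_gcv(A, b, lambdas):
    """Zero-order Tikhonov regularization, with the regularization
    parameter chosen by minimizing the GCV function over `lambdas`.

    Hypothetical setup: A maps epicardial potentials to body-surface
    potentials; b holds the (noisy) body-surface measurements.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    beta = U.T @ b                        # data projected onto left singular vectors
    m = A.shape[0]
    extra = b @ b - beta @ beta           # part of b outside the range of A
    best_gcv, best_lam, best_x = np.inf, None, None
    for lam in lambdas:
        f = s**2 / (s**2 + lam**2)        # Tikhonov filter factors
        resid2 = np.sum(((1 - f) * beta)**2) + extra   # squared residual norm
        trace = m - np.sum(f)             # trace(I - A @ A_lambda_pinv)
        gcv = resid2 / trace**2
        if gcv < best_gcv:
            best_gcv = gcv
            best_lam = lam
            best_x = Vt.T @ (f * beta / s)  # regularized solution
    return best_lam, best_x
```

A typical use would sweep `lambdas = np.logspace(-6, 0, 50)` and take the solution at the GCV minimum; the same SVD quantities also yield the residual and solution norms needed to plot the L-curve.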
Abstract:
The possibility of controlling vector-borne disease through the development and release of transgenic insect vectors has recently gained popular support and is being actively pursued by a number of research laboratories around the world. Several technical problems must be solved before such a strategy can be implemented: genes encoding refractory traits (traits that render the insect unable to transmit the pathogen) must be identified, a transformation system for important vector species has to be developed, and a strategy to spread the refractory trait into natural vector populations must be designed. Recent advances in this field of research make it likely that the technology will be available in the near future. In this paper we review recent progress in this area and argue that care should be taken in selecting the most appropriate disease system with which to first attempt this form of intervention. Much attention is currently being given to applying this technology to the control of malaria, transmitted by Anopheles gambiae in Africa. While malaria is undoubtedly the most important vector-borne disease in the world and its control should remain an important goal, we maintain that the complex epidemiology of malaria, together with the intense transmission rates in Africa, may make it unsuitable for the first application of this technology. Diseases such as African trypanosomiasis, transmitted by the tsetse fly, or unstable malaria in India may provide more appropriate initial targets for evaluating the potential of this form of intervention.
Abstract:
This paper combines insights from the literature on the economics of organisation with traditional models of market structure to construct a theory of equilibrium firm size heterogeneity under the assumption of a homogeneous product industry. Configurations consisting entirely of small firms (run by entrepreneurs with limited attention) or entirely of larger firms (using managerial techniques to substitute away these limits, allowing increasing-returns technologies to become profitable) can arise in equilibrium. However, there also exist equilibrium configurations in which large and small firms co-exist. The efficiency properties of these respective equilibria are discussed. Finally, the implications of an expanding market size are considered.
Abstract:
From the mid-1970s through the 1980s and into the 1990s, wage inequality and skill differentials in earnings and employment increased sharply in OECD countries. After 1973, and especially in the 1980s, the US experienced dismal real wage performance for the less skilled. Among the factors singled out by economists as possible major contributors to this development are economic globalisation processes and skill-biased technological change. Although these are most commonly treated as independent influences, this article critically outlines views about each factor and then argues that strong interdependence exists between them. The article concludes by examining potential policy responses to this growing inequality.