12 results for Technology and civilization
at Université de Lausanne, Switzerland
Abstract:
Technology (e.g. tools, methods of cultivation and domestication, systems of construction and appropriation, machines) has increased the vital rates of humans, and is one of the defining features of the transition from Malthusian ecological stagnation to potentially perpetual population growth. Maladaptations, on the other hand, encompass behaviours, customs and practices that decrease the vital rates of individuals. Technology and maladaptations are part of the total stock of culture carried by the individuals in a population. Here, we develop a quantitative model for the coevolution of cumulative adaptive technology and maladaptive culture in a 'producer-scrounger' game, which can also usefully be interpreted as an 'individual-social' learner interaction. Producers (individual learners) are assumed to invent new adaptations and maladaptations by trial-and-error learning, insight or deduction, and they pay the cost of innovation. Scroungers (social learners) are assumed to copy or imitate (cultural transmission) both the adaptations and maladaptations generated by producers. We show that the coevolutionary dynamics of producers and scroungers in the presence of cultural transmission can have a variety of effects on population carrying capacity, ranging from stable polymorphism, where scroungers benefit the population (an increase in carrying capacity), to periodic cycling, where scroungers decrease carrying capacity. Overall, we find that selection-driven cultural innovation and transmission may send a population on the path of indefinite growth or to extinction.
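The model itself is not reproduced in the abstract, but a minimal toy simulation in the spirit of the producer-scrounger framing can make the moving parts concrete. Everything below (parameter names, payoff forms, the logistic coupling of culture to carrying capacity) is an illustrative assumption, not the authors' actual model.

```python
import numpy as np

def simulate(T=500, n0=100.0, p0=0.5, b=0.10, d=0.10, c=0.03, seed=1):
    """Toy producer-scrounger dynamics with a culture-dependent carrying capacity.
    All functional forms and parameter values are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    n, p = n0, p0          # population size and frequency of producers
    A, M = 0.0, 0.0        # stocks of adaptive technology and maladaptive culture
    history = []
    for _ in range(T):
        A += 0.05 * p                          # producers innovate adaptations...
        M += 0.02 * p * rng.random()           # ...and occasionally maladaptations
        K = max(1.0, 1000.0 * (1.0 + A - M))   # culture shifts the carrying capacity
        w_p = 1.0 + b * A - d * M - c          # producers pay the innovation cost c
        w_s = 1.0 + b * A * p - d * M          # scroungers need producers to copy from
        w_bar = p * w_p + (1.0 - p) * w_s
        p = p * w_p / w_bar                    # replicator update of producer frequency
        n = max(0.0, n + n * (w_bar - 1.0) * (1.0 - n / K))  # logistic growth
        history.append((n, p, K))
    return history
```

A run of `simulate()` returns trajectories of population size, producer frequency and carrying capacity, which is enough to see how copying by scroungers can either raise or erode K depending on the innovation cost and the maladaptation rate.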
Abstract:
In this thesis, I examine the diffusion process for a complex medical technology, the PET scanner, in two different health care systems, one of which is more market-oriented (Switzerland) and the other more centrally managed by a public agency (Quebec). The research draws on institutional and socio-political theories of the diffusion of innovations to examine how institutional contexts affect processes of diffusion. I find that diffusion proceeds more rapidly in Switzerland than in Quebec, but that processes in both jurisdictions are characterized by intense struggles among providers and between providers and public agencies. I show that the institutional environment influences these processes by determining the patterns of material resources and authority available to actors in their struggles to strategically control the technology, and by constituting the discursive resources or institutional logics on which actors may legitimately draw in their struggles to give meaning to the technology in line with their interests and values. This thesis illustrates how institutional structures and meanings manifest themselves in the context of specific decisions within an organizational field, and reveals the ways in which governance structures may be contested and realigned when they conflict with interests that are legitimized by dominant institutional logics. It is argued that this form of contestation and readjustment at the margins constitutes one mechanism by which institutional frameworks are tested, stretched and reproduced or redefined.
Abstract:
The determination of characteristic cardiac parameters, such as displacement, stress and strain distributions, is essential for an understanding of the mechanics of the heart. The calculation of these parameters has until recently been limited by the use of idealised mathematical representations of biventricular geometries and by the application of simple material laws. On the basis of 20 short-axis heart slices, and considering both linear and nonlinear material behaviour, we have developed an FE model with about 100,000 degrees of freedom. Marching Cubes and Phong's incremental shading technique were used to visualise the three-dimensional geometry. In a quasistatic FE analysis, the continuous distributions of regional stress and strain corresponding to the end-systolic state were calculated. Substantial regional variation of the Von Mises stress and the total strain energy was observed at all levels of the heart model. The results of the linear elastic model and of the model with a nonlinear material description (Mooney-Rivlin) were compared. While the stress distributions and peak stress values were found to be comparable, the displacement vectors obtained with the nonlinear model were generally larger than in the linear elastic case, indicating the need to include nonlinear effects.
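As a point of reference for the quantities reported above, the two ingredients can be written down compactly: the Von Mises equivalent stress extracted from a Cauchy stress tensor, and the incompressible Mooney-Rivlin strain-energy density. The sketch below is a generic NumPy illustration, not code from the study; the material constants c1 and c2 are placeholders.

```python
import numpy as np

def von_mises(sigma):
    """Von Mises equivalent stress from a 3x3 Cauchy stress tensor."""
    s = np.asarray(sigma, dtype=float)
    dev = s - np.trace(s) / 3.0 * np.eye(3)      # deviatoric part of the stress
    return np.sqrt(1.5 * np.sum(dev * dev))

def mooney_rivlin_energy(F, c1, c2):
    """Incompressible Mooney-Rivlin strain energy W = c1*(I1-3) + c2*(I2-3),
    evaluated from the deformation gradient F (c1, c2 are placeholder constants)."""
    C = F.T @ F                                   # right Cauchy-Green tensor
    I1 = np.trace(C)
    I2 = 0.5 * (I1 ** 2 - np.trace(C @ C))
    return c1 * (I1 - 3.0) + c2 * (I2 - 3.0)

# Sanity check: uniaxial stress of 10 kPa gives a Von Mises stress of 10 kPa.
print(von_mises(np.diag([10.0, 0.0, 0.0])))
```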
Abstract:
Point-of-care (POC) tests offer potentially substantial benefits for the management of infectious diseases, mainly by shortening the time to result and by making the test available at the bedside or at remote care centres. Commercial POC tests are already widely available for the diagnosis of bacterial and viral infections and for parasitic diseases, including malaria. Infectious diseases specialists and clinical microbiologists should be aware of the indications and limitations of each rapid test, so that they can use them appropriately and correctly interpret their results. The clinical applications and performance of the most relevant and commonly used POC tests are reviewed. Some of these tests exhibit insufficient sensitivity, and should therefore be coupled to confirmatory tests when the results are negative (e.g. Streptococcus pyogenes rapid antigen detection test), whereas the results of others need to be confirmed when positive (e.g. malaria). New molecular-based tests exhibit better sensitivity and specificity than former immunochromatographic assays (e.g. Streptococcus agalactiae detection). In the coming years, further evolution of POC tests may lead to new diagnostic approaches, such as panel testing, targeting not just a single pathogen, but all possible agents suspected in a specific clinical setting. To reach this goal, the development of serology-based and/or molecular-based microarrays/multiplexed tests will be needed. The availability of modern technology and new microfluidic devices will provide clinical microbiologists with the opportunity to be back at the bedside, proposing a large variety of POC tests that will allow quicker diagnosis and improved patient care.
Abstract:
This dissertation focuses on new technology commercialization, innovation and new business development. Industry-based novel technology may achieve commercialization through its transfer to a large research laboratory acting as a lead user and technical partner, providing the new technology with complementary assets and meaningful initial use in social practice. The research lab benefits from the new technology and innovation through major performance improvements and cost savings. Such mutually beneficial collaboration between the lab and the firm does not require any additional administrative effort or funds from the lab, yet it requires openness to technologies and partner companies that may not previously be known to the lab. Labs achieve the benefits by applying a proactive procurement model that promotes active pre-tender search for new technologies and pre-tender testing and piloting of these technological options. The collaboration works best when it is based on the development needs of both parties. This means, first, that the lab has significant engineering activity with well-defined technological needs and, second, that the firm has advanced prototype technology but needs further testing, piloting, and an initial market and references to achieve a market breakthrough. The empirical evidence of the dissertation is based on a longitudinal multiple-case study with the European Laboratory for Particle Physics. The key theoretical contribution of this study is that large research labs, including basic research labs, play an important role in product and business development toward the end, rather than the front end, of the innovation process. This also implies that product orientation and business orientation can contribute to basic research. The study provides practical managerial and policy guidelines on how to initiate and manage mutually beneficial lab-industry collaboration and proactive procurement.
Abstract:
BACKGROUND: Invasive fungal diseases are important causes of morbidity and mortality. Clarity and uniformity in defining these infections are important factors in improving the quality of clinical studies. A standard set of definitions strengthens the consistency and reproducibility of such studies. METHODS: After the introduction of the original European Organization for Research and Treatment of Cancer/Invasive Fungal Infections Cooperative Group and the National Institute of Allergy and Infectious Diseases Mycoses Study Group (EORTC/MSG) Consensus Group definitions, advances in diagnostic technology and the recognition of areas in need of improvement led to a revision of this document. The revision process started with a meeting of participants in 2003, to decide on the process and to draft the proposal. This was followed by several rounds of consultation until a final draft was approved in 2005. This was made available for 6 months to allow public comment, and then the manuscript was prepared and approved. RESULTS: The revised definitions retain the original classifications of "proven," "probable," and "possible" invasive fungal disease, but the definition of "probable" has been expanded, whereas the scope of the category "possible" has been diminished. The category of proven invasive fungal disease can apply to any patient, regardless of whether the patient is immunocompromised, whereas the probable and possible categories are proposed for immunocompromised patients only. CONCLUSIONS: These revised definitions of invasive fungal disease are intended to advance clinical and epidemiological research and may serve as a useful model for defining other infections in high-risk patients.
Abstract:
Computed Tomography Angiography (CTA) images are the standard for assessing peripheral artery disease (PAD). This paper presents a Computer-Aided Detection (CAD) and Computer-Aided Measurement (CAM) system for PAD. The CAD stage detects the arterial network using a 3D region-growing method and a fast 3D morphology operation. The CAM stage aims to accurately measure artery diameters from the detected vessel centerline, compensating for the partial volume effect using Expectation Maximization (EM) and a Markov Random Field (MRF). The system has been evaluated on phantom data and applied to fifteen CTA datasets, where the detection accuracy for stenoses was 88% and the measurement error was 8%.
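The paper's own implementation is not shown in the abstract; a minimal sketch of the kind of intensity-based 3D region growing used in a CAD stage like this might look as follows (6-connectivity and the [lo, hi] intensity window are assumptions for illustration).

```python
import numpy as np
from collections import deque

def region_grow_3d(volume, seed, lo, hi):
    """Grow a 6-connected region from `seed`, keeping voxels whose intensity
    lies in [lo, hi] (e.g. a contrast-enhanced lumen window in a CTA volume)."""
    mask = np.zeros(volume.shape, dtype=bool)
    queue = deque([seed])
    neighbours = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        if mask[z, y, x] or not (lo <= volume[z, y, x] <= hi):
            continue
        mask[z, y, x] = True
        for dz, dy, dx in neighbours:
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2] and not mask[nz, ny, nx]):
                queue.append((nz, ny, nx))
    return mask
```

In a full pipeline of the kind described above, such a mask would then be cleaned up by the 3D morphology operation and skeletonised to obtain the centerline from which diameters are measured.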
Abstract:
This work consists of three essays investigating the ability of structural macroeconomic models to price zero-coupon U.S. government bonds. 1. A small-scale three-factor DSGE model implying a constant term premium is able to provide a reasonable fit for the term structure only at the expense of the persistence parameters of the structural shocks. A test of the structural model against one with constant but unrestricted prices-of-risk parameters shows that the exogenous-prices-of-risk model is only weakly preferred. We provide an MLE-based variance-covariance matrix for the Metropolis proposal density that improves convergence speed in MCMC chains. 2. A prices-of-risk specification that is affine in observable macro variables is excessively flexible and provides term-structure fit without significantly altering the structural parameters. The exogenous component of the SDF separates the macro part of the model from the term structure, and the good term-structure fit is driven by an extremely volatile SDF and an implied average short rate that is inexplicable. We conclude that the no-arbitrage restrictions do not suffice to temper the SDF, so more restrictions are needed. We introduce a penalty-function methodology that proves useful in showing that affine prices-of-risk specifications are able to reconcile stable macro-dynamics with a good term-structure fit and a plausible SDF. 3. The level factor is reproduced most importantly by the preference shock, to which it is strongly and positively related, but technology and monetary shocks, with negative loadings, also contribute to its replication. The slope factor is related only to the monetary policy shocks and is poorly explained. We find that there are gains in in-sample and out-of-sample forecasts of consumption and inflation if term-structure information is used in a time-varying hybrid prices-of-risk setting. In-sample yield forecasts are better in models with non-stationary shocks for the period 1982-1988. After this period, time-varying market-price-of-risk models provide better in-sample forecasts. For the period 2005-2008, out-of-sample forecasts of consumption and inflation are better if term-structure information is incorporated in the DSGE model, but yields are better forecast by a pure macro DSGE model.
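For readers less familiar with the terminology, a generic log-normal SDF with affine market prices of risk (standard notation, not necessarily the thesis's own) takes the form below; the no-arbitrage recursion then determines zero-coupon bond prices.

```latex
% Generic affine-prices-of-risk setup (illustrative notation, not the thesis's):
% x_t: state vector, r_t: short rate, P_t^{(n)}: price of an n-period zero-coupon bond.
M_{t+1} = \exp\!\Big(-r_t - \tfrac{1}{2}\lambda_t^{\top}\lambda_t
          - \lambda_t^{\top}\varepsilon_{t+1}\Big), \qquad
\varepsilon_{t+1} \sim \mathcal{N}(0, I), \qquad
\lambda_t = \lambda_0 + \lambda_1 x_t, \qquad
P_t^{(n)} = \mathbb{E}_t\!\left[M_{t+1}\, P_{t+1}^{(n-1)}\right], \quad P_t^{(0)} = 1.
```

Roughly speaking, an "exogenous" prices-of-risk specification of the kind discussed in the second essay treats \(\lambda_0\) and \(\lambda_1\) as free parameters rather than objects implied by the structural model.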
Abstract:
Output, inflation and interest rates are key macroeconomic variables, in particular for monetary policy. In modern macroeconomic models they are driven by random shocks which feed through the economy in various ways. Models differ in the nature of these shocks and in their transmission mechanisms. This is the common theme underlying the three essays of this thesis, each of which takes a different perspective on the subject. First, the thesis shows empirically how different shocks lead to different behavior of interest rates over the business cycle. For commonly analyzed shocks (technology and monetary policy errors), the patterns square with standard models; the big unknown is the source of inflation persistence. The thesis then presents a theory of monetary policy for the case in which the central bank can observe structural shocks better than the public. The public will then seek to infer the bank's extra knowledge from its policy actions, and expectation management becomes a key factor of optimal policy. In a simple New Keynesian model, monetary policy becomes more concerned with inflation persistence than it would be otherwise. Finally, the thesis points to the huge uncertainties involved in estimating responses to structural shocks with permanent effects.