969 results for Technology software
Abstract:
The paper uses a range of primary-source empirical evidence to address the question: ‘why is it so hard to value intangible assets?’ The setting is venture capital investment in high-technology companies. While the investors are risk specialists and financial experts, the entrepreneurs are more knowledgeable about product innovation. Thus the context lends itself to analysis within a principal-agent framework, in which information asymmetry may give rise to adverse selection, pre-contract, and moral hazard, post-contract. We examine how the investor might attenuate such problems and attach a value to such high-tech investments in what are often merely intangible assets, through expert due diligence, monitoring and control. Qualitative evidence is used to qualify the rather clear-cut picture provided by a principal-agent approach, yielding a more mixed picture in which the ‘art and science’ of investment appraisal are utilised by both parties alike.
Abstract:
This work consists of two parts. The first is a compilation of topics related to e-mail and its use: its history; the elements that make it up; the services and programs it offers; the use of this tool; its importance within e-marketing; its effectiveness as a marketing tool; the attributes assigned to it; its main applications; the legislation that regulates it; and other information that can be very useful when preparing an e-mail campaign. The second part contains a quantitative study of some of the elements or variables that can influence the final effectiveness of mass e-mailings carried out by a company for commercial purposes.
Abstract:
This paper contributes to the ongoing empirical debate regarding the role of the RBC model, and in particular of technology shocks, in explaining aggregate fluctuations. To this end we estimate the model’s posterior density using Markov chain Monte Carlo (MCMC) methods. Within this framework we extend Ireland’s (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model’s errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model’s fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
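For readers unfamiliar with the estimation machinery, here is a minimal sketch of the general MCMC idea, not the paper's code: a random-walk Metropolis sampler over a stand-in log-posterior (the toy target and all settings below are assumptions chosen purely for illustration).

```python
import numpy as np

def log_posterior(theta):
    # Hypothetical stand-in for a model's log prior + log likelihood;
    # here a standard normal target, purely for illustration.
    return -0.5 * np.sum(theta ** 2)

def metropolis(log_post, theta0, n_draws=10000, step=0.5, seed=0):
    """Random-walk Metropolis: the basic scheme behind MCMC posterior simulation."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    draws = np.empty((n_draws, theta.size))
    lp = log_post(theta)
    for i in range(n_draws):
        proposal = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(proposal)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob min(1, ratio)
            theta, lp = proposal, lp_prop
        draws[i] = theta
    return draws

draws = metropolis(log_posterior, theta0=np.zeros(3))
print(draws[2000:].mean(axis=0))  # posterior means after discarding burn-in
```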
Abstract:
In this report the author, using a modern approach, redesigns and implements the platform that a 21st-century telecommunications company needs in order to provide telephony and communications services to its users and customers. The reader is led from an initial design phase through to the implementation and production deployment of the final system, with a focus on meeting the current needs this entails. The report covers the software, hardware and business processes involved in making this goal a reality, and introduces the reader to the many technologies employed to achieve it, emphasising the current convergence of networks towards the concept of IP networks; building on this trend, voice-over-IP technology is used to shape the platform that is finally, in practical form, put into production.
Abstract:
Untreated wastewater discharged directly into rivers is a very harmful environmental hazard that needs to be tackled urgently in many countries. In order to safeguard the river ecosystem and reduce water pollution, it is important to have an effluent charge policy that promotes investment in wastewater treatment technology by domestic firms. This paper considers the strategic interaction between the government and domestic firms regarding investment in wastewater treatment technology and the design of the optimal effluent charge policy that should be implemented. In this model, the higher the proportion of non-investing firms, the higher the probability of having to incur an effluent charge and the higher that charge. On the one hand, the government needs to impose a sufficiently strict policy to ensure that firms have a strong incentive to invest. On the other hand, the policy cannot be so strict that it drives out firms which cannot afford to invest in such expensive technology. The paper analyses the factors that affect the probability of investment in this technology. It also explains the difficulty of imposing a strict environmental policy in countries that have too many small firms which cannot afford to invest unless subsidised.
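The incentive at the heart of this model can be sketched numerically. Assuming, purely for illustration, that both the probability and the size of the charge rise linearly with the share q of non-investing firms (the abstract does not specify functional forms, so every form and number below is invented), a firm invests whenever the expected charge exceeds the cost of investing:

```python
def expected_effluent_cost(q, p_max=0.9, charge_max=100.0):
    """Expected charge faced by a non-investing firm. Illustrative only:
    probability and charge both rise linearly with q, the proportion of
    firms that have not invested in treatment technology."""
    probability = p_max * q
    charge = charge_max * q
    return probability * charge

investment_cost = 30.0  # hypothetical cost of the treatment technology
for q in (0.2, 0.5, 0.8):
    ec = expected_effluent_cost(q)
    decision = "invest" if ec > investment_cost else "do not invest"
    print(f"q={q:.1f}: expected charge={ec:5.1f} -> {decision}")
```

With these made-up numbers, investing only becomes worthwhile once enough other firms are non-compliant, mirroring the interaction the abstract describes: each firm's incentive to invest grows with the share of non-investing firms.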
Abstract:
This paper is inspired by articles in the last decade or so that have argued for more attention to theory, and to empirical analysis, within the well-known, and long-lasting, contingency framework for explaining the organisational form of the firm. Its contribution is to extend contingency analysis in three ways: (a) by empirically testing it, using explicit econometric modelling (rather than case study evidence) involving estimation by ordered probit analysis; (b) by extending its scope from large firms to SMEs; (c) by extending its applications from Western economic contexts to an emerging economy context, using fieldwork evidence from China. It calibrates organisational form in a new way, as an ordinal dependent variable, and also utilises new measures of familiar contingency factors from the literature (i.e. Environment, Strategy, Size and Technology) as the independent variables. An ordered probit model of contingency was constructed and estimated by maximum likelihood, using a cross-section of 83 private Chinese firms. The probit was found to be a good fit to the data, and displayed significant coefficients with plausible interpretations for key variables under all four categories of contingency analysis, namely Environment, Strategy, Size and Technology. Thus we have generalised the contingency model in terms of specification, interpretation and application area.
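A hedged sketch of this kind of estimation, on synthetic data rather than the authors' sample of Chinese firms: statsmodels' OrderedModel is one standard ordered probit implementation; the variable names echo the four contingency factors, but every coefficient, cut-point and observation below is invented.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 83  # same sample size as the paper; the data here are entirely synthetic
X = pd.DataFrame({
    "Environment": rng.standard_normal(n),
    "Strategy":    rng.standard_normal(n),
    "Size":        rng.standard_normal(n),
    "Technology":  rng.standard_normal(n),
})
# Latent index plus noise, cut into three ordered organisational forms
latent = X @ [0.8, 0.5, 0.6, 0.4] + rng.standard_normal(n)
y = pd.cut(latent, bins=[-np.inf, -0.5, 0.5, np.inf],
           labels=["simple", "functional", "divisional"])

model = OrderedModel(y, X, distr="probit")     # ordinal dependent variable
result = model.fit(method="bfgs", disp=False)  # maximum likelihood
print(result.summary())
```

The cut-points estimated alongside the slope coefficients are what let an ordinal outcome stand in for an unobserved continuous measure of organisational form.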
Abstract:
Report for the scientific sojourn carried out at the International Center for Numerical Methods in Engineering (CIMNE), a state agency, from February until November 2007. The work within the project Technology innovation in underground construction can be grouped into the following tasks: development of software for modelling underground excavation based on the discrete element method (the numerical algorithms have been implemented in computer programs and applied to the simulation of excavation using roadheaders and TBMs); coupling of the discrete element method with the finite element method; and development of a numerical model of rock cutting that takes into account the wear of rock-cutting tools, a very important factor influencing the effectiveness of underground works.
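To indicate what the discrete element method computes at its core, here is a generic textbook-style sketch, not the CIMNE software: a one-dimensional collision of two particles under a linear spring-dashpot contact law with explicit time integration (all parameter values are arbitrary).

```python
import numpy as np

k_n, c_n = 1e5, 5.0           # contact stiffness [N/m] and damping [N*s/m]
radius, mass = 0.01, 0.05     # particle radius [m] and mass [kg]
x = np.array([0.0, 0.05])     # centre positions [m]
v = np.array([1.0, -1.0])     # velocities [m/s]: the particles approach
dt = 1e-5                     # explicit time step [s]

for _ in range(5000):
    overlap = 2 * radius - (x[1] - x[0])         # > 0 while in contact
    f = 0.0
    if overlap > 0:
        f = k_n * overlap - c_n * (v[1] - v[0])  # repulsive spring + dashpot
    a = np.array([-f, f]) / mass                 # equal and opposite forces
    v += a * dt                                  # symplectic Euler update
    x += v * dt

print("post-impact velocities:", v)              # rebound, slightly damped
```

A production DEM code repeats exactly this contact-force-then-integrate loop over very many particle pairs; the coupling with FEM and the tool-wear modelling mentioned above layer onto that core loop.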
Abstract:
The II Workshop on the use of Computed Tomography (CT) in pig carcass classification. Other CT applications: live animals and meat technology was held in Monells. The first day was devoted entirely to the use of CT in the classification of pig carcasses, and the second day was opened to other CT applications, whether in live animals or in various aspects of the quality of meat and meat products. The workshop was attended by 45 people from 12 EU countries.
Abstract:
Performance analysis is the task of monitoring the behaviour of a program execution. The main goal is to find out what adjustments might be made in order to improve performance. To obtain that improvement it is necessary to find the different causes of overhead. We are now well into the multicore era, yet there is a gap between the levels of development of the two main branches of multicore technology (hardware and software). When we talk about multicore we are also talking about shared memory systems; this master's thesis addresses the issues involved in the performance analysis and tuning of applications running specifically on shared memory systems. We take performance analysis one step further by analysing application structure and patterns. We also present some tools specifically addressed to the performance analysis of OpenMP multithreaded applications. Finally, we present the results of some experiments performed with a set of OpenMP scientific applications.
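The starting point for this kind of tuning is measuring how far actual speedup falls short of the ideal, since the shortfall is precisely the overhead to be explained. A minimal sketch of that bookkeeping, using Python's multiprocessing purely as a stand-in for the OpenMP setting the thesis targets (the kernel and problem sizes are arbitrary):

```python
import time
from concurrent.futures import ProcessPoolExecutor

def work(n):
    # CPU-bound toy kernel standing in for a scientific computation
    return sum(i * i for i in range(n))

def timed(fn):
    t0 = time.perf_counter()
    fn()
    return time.perf_counter() - t0

if __name__ == "__main__":
    chunks = [2_000_000] * 8
    t_serial = timed(lambda: [work(n) for n in chunks])
    for workers in (2, 4):
        with ProcessPoolExecutor(max_workers=workers) as pool:
            t_par = timed(lambda: list(pool.map(work, chunks)))
        speedup = t_serial / t_par
        efficiency = speedup / workers  # 1.0 is ideal; the gap is overhead
        print(f"{workers} workers: speedup={speedup:.2f}, "
              f"efficiency={efficiency:.2f}")
```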
Abstract:
Type 2 diabetes mellitus (T2DM) is a major disease affecting nearly 280 million people worldwide. Whilst the pathophysiological mechanisms leading to disease are poorly understood, dysfunction of the insulin-producing pancreatic beta-cells is a key event in disease development. Monitoring the gene expression profiles of pancreatic beta-cells under several genetic or chemical perturbations has shed light on genes and pathways involved in T2DM. The EuroDia database has been established to build a unique collection of gene expression measurements performed on beta-cells of three organisms, namely human, mouse and rat. The Gene Expression Data Analysis Interface (GEDAI) has been developed to support this database. The quality of each dataset is assessed by a series of quality control procedures to detect putative hybridization outliers. The system integrates a web interface to several standard analysis functions from R/Bioconductor to identify differentially expressed genes and pathways. It also allows the combination of multiple experiments performed on different array platforms of the same technology. The design of this system enables each user to rapidly design a custom analysis pipeline and thus produce their own list of genes and pathways. Raw and normalized data can be downloaded for each experiment. The flexible engine of this database (GEDAI) is currently used to handle gene expression data from several laboratory-run projects dealing with different organisms and platforms. Database URL: http://eurodia.vital-it.ch.
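One simple form a hybridization-outlier check can take (an illustrative stand-in, not GEDAI's actual procedure) is to flag arrays whose overall intensity distribution sits far from the rest of the cohort on a robust median/MAD scale:

```python
import numpy as np

def flag_hybridization_outliers(expr, k=3.0):
    """Flag arrays whose per-array median log-intensity deviates from the
    cohort by more than k robust standard deviations.
    expr: genes x samples matrix of log-intensities."""
    medians = np.median(expr, axis=0)            # one summary per array
    centre = np.median(medians)
    mad = np.median(np.abs(medians - centre))
    scale = 1.4826 * mad if mad > 0 else 1e-12   # MAD -> sigma equivalent
    return np.abs(medians - centre) / scale > k

rng = np.random.default_rng(0)
expr = rng.normal(8.0, 1.0, size=(1000, 12))     # synthetic 12-array study
expr[:, 5] += 2.5                                # simulate one failed array
print(np.where(flag_hybridization_outliers(expr))[0])  # -> [5]
```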
Abstract:
There are two main ways in which the knowledge created in universities is transferred to firms: licensing agreements and the creation of spin-offs. In this paper, we describe the main steps in the transfer of university innovations, the main incentive issues that arise in this process, and the contractual solutions proposed to address them.
Abstract:
Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:
- includes self-contained introductions to probability and decision theory;
- develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models;
- features implementation of the methodology with reference to commercial and academically available software;
- presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases;
- provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning;
- contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them;
- is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background;
- includes a foreword by Ian Evett.
The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
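As a toy illustration of the simplest kind of network this subject builds on (a sketch using the pgmpy library; the two-node structure and every probability below are invented for illustration): a source-level proposition H ('the suspect is the source of the trace') linked to the evidence E ('a match is reported'), queried for the posterior on H once the match is observed.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# State 0 = "yes", state 1 = "no" for both nodes; all numbers are invented.
model = BayesianNetwork([("H", "E")])
cpd_h = TabularCPD("H", 2, [[0.01],    # P(H=yes): prior that suspect is source
                            [0.99]])   # P(H=no)
cpd_e = TabularCPD("E", 2,
                   [[0.99, 0.001],     # P(E=yes | H=yes), P(E=yes | H=no)
                    [0.01, 0.999]],    # P(E=no  | H=yes), P(E=no  | H=no)
                   evidence=["H"], evidence_card=[2])
model.add_cpds(cpd_h, cpd_e)

# Posterior P(H | E=yes): how the reported match updates the prior on H.
posterior = VariableElimination(model).query(["H"], evidence={"E": 0})
print(posterior)
```

The ratio P(E=yes | H=yes) / P(E=yes | H=no) is the likelihood ratio on which forensic evaluation centres; the network form pays off once many such dependencies must be combined coherently.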