107 results for Input-output Tables
Abstract:
We develop a mediation model in which firm size is proposed to affect the scale and quality of innovative output through the adoption of different decision styles during the R&D process. The aim of this study is to understand how the internal changes that firms undergo as they evolve from small to larger organizations affect R&D productivity. In so doing, we illuminate the underlying theoretical mechanism affecting two dimensions of R&D productivity, namely the scale and quality of innovative output, which have not received much attention in the previous literature. Using longitudinal data on Spanish manufacturing firms, we explore the validity of this mediation model. Our results show that as firms grow in size they increasingly emphasize analytical decision making; consequently, large firms aim for higher-quality innovations while small firms aim for a larger scale of innovative output.
Abstract:
The project concerns the design and fabrication of a phase-shifter circuit, with one input and one output, that switches between two states: one with a 180° phase shift and the other with no phase shift. The circuit is designed in microstrip for fc = 5 GHz. The aim is to obtain the best possible characteristics for this design, that is, good bandwidth and low insertion loss, together with a good response in the magnitude and phase relations. The design proceeds in stages, starting from a simple model and growing in complexity as the components that will make up the final circuit are added. After the design, the circuit is fabricated in order to observe its real behaviour. This report collects and organizes the information obtained throughout this process, presenting it as clearly as possible so that the design process can be followed and the results interpreted. The final objective is to see how the designed circuit behaves and to define guidelines for improving it in the future.
Abstract:
The objective of this final-year project (TFC) is to develop and implement the OpenGL molecular visualization tool HVM. This application, which allows molecules to be visualized and inspected, is highly useful in areas such as chemistry, pharmacy and teaching. It accepts molecule definitions through an input file (a simplified variant of the XMOL XYZ format) and builds the corresponding model, enabling navigation through it, the selection and identification of its elements, and the computation of distances and torsion angles between them. In addition, it allows the definition of an axis about which the model can be rotated and an output sequence recorded.
Abstract:
Frameworks constitute the new paradigm in software development; among their main features is the ease with which code can be reused. Within this specific setting we use Java technology and its extensions for data persistence. A persistence framework is responsible for managing the data-access logic against a DBMS (Database Management System), for both input and output, hiding the most cumbersome details of the underlying database structure completely and transparently. In short, this project is an analysis of existing frameworks, examining their features and going into the concrete details of how they operate and are used with regard to persistence.
Abstract:
The Final Year Project consists of two essentially different parts which share a common theme: HTML code validation. The first part focuses on the study of the validation process. It supplies a brief introduction to the evolution of HTML and XHTML, the new tags introduced in HTML5, and the most common errors found in today's websites. Existing HTML validation tools are analyzed and examined in detail in order to compare their features and evaluate their performance. Lastly, a comparison of the parsing process in the most common browsers in use today is provided. The second part shifts the focus towards the development of an XHTML5 validation tool. The input is an XHTML5 file whose content may or may not comply with the W3C specification and which, therefore, may or may not be a valid XHTML5 document. The tool outputs a fixed XHTML5 document together with an error log returned in the form of an XML file, including the location of each error and the course of action pursued to fix it.
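Since XHTML5 must be well-formed XML, a first validation pass can simply attempt an XML parse and report any error. The following is a minimal sketch of that first step only, not the tool developed in the project; the function name `check_well_formed` is invented for illustration.

```python
import xml.etree.ElementTree as ET

def check_well_formed(xhtml_text):
    """Return (True, None) if the text parses as XML, else (False, message)."""
    try:
        ET.fromstring(xhtml_text)
        return True, None
    except ET.ParseError as e:
        return False, str(e)

ok, err = check_well_formed("<html><body><p>hello</p></body></html>")
print(ok)        # True

ok, err = check_well_formed("<html><body><p>unclosed</body></html>")
print(ok, err)   # False, with a mismatched-tag message and its location
```

A full validator would go beyond well-formedness and check the document against the W3C content model, as the project's tool does.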
Abstract:
Planners in public and private institutions would like coherent forecasts of the components of age-specific mortality, such as causes of death. This has been difficult to achieve because the relative values of the forecast components often fail to behave in a way that is coherent with historical experience. In addition, when the group forecasts are combined the result is often incompatible with an all-groups forecast. It has been shown that cause-specific mortality forecasts are pessimistic when compared with all-cause forecasts (Wilmoth, 1995). This paper abandons the conventional approach of using log mortality rates and forecasts the density of deaths in the life table. Since these values obey a unit sum constraint for both conventional single-decrement life tables (only one absorbing state) and multiple-decrement tables (more than one absorbing state), they are intrinsically relative rather than absolute values across decrements as well as ages. Using the methods of Compositional Data Analysis pioneered by Aitchison (1986), death densities are transformed into the real space so that the full range of multivariate statistics can be applied, then back-transformed to positive values so that the unit sum constraint is honoured. The structure of the best-known single-decrement mortality-rate forecasting model, devised by Lee and Carter (1992), is expressed in compositional form and the results from the two models are compared. The compositional model is extended to a multiple-decrement form and used to forecast mortality by cause of death for Japan.
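The transform-then-back-transform idea described above can be sketched with the centred log-ratio (clr) transform, one of Aitchison's log-ratio maps; the four-part "death density" below is a toy composition, not the paper's data.

```python
import numpy as np

def clr(x):
    """Centred log-ratio: map a composition (positive, unit-sum) to real space."""
    g = np.exp(np.mean(np.log(x)))       # geometric mean of the parts
    return np.log(x / g)

def clr_inverse(y):
    """Back-transform to positive values honouring the unit-sum constraint."""
    x = np.exp(y)
    return x / x.sum()

d = np.array([0.01, 0.04, 0.15, 0.80])   # toy death density over 4 age groups
y = clr(d)                               # ordinary multivariate statistics apply here
d_back = clr_inverse(y)

print(np.allclose(d_back, d))            # True: the transform is invertible
print(abs(d_back.sum() - 1.0) < 1e-12)   # True: unit-sum constraint holds
```

In the paper's setting, forecasting (e.g. a Lee-Carter-style model) would be carried out on the transformed values `y` before back-transforming.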
Abstract:
From an internal view based on the Competency-Based Perspective (CBP), this research studies the key organizational competencies developed by small new businesses. The CBP is chosen in an attempt to explain the differences between companies that have closed and those that have consolidated. The main contribution of this paper is the definition of a set of key organizational competencies for new ventures in services and low-technology sectors. Using the classification proposed by [1] and a review of the entrepreneurship literature, the main competencies were defined and classified as managerial, input-based, transformation-based, and output-based competencies. The proposed model for evaluating the organizational competence of new ventures is tested by means of Structural Equation Modeling.
Abstract:
The statistical analysis of compositional data should be carried out using log-ratios of parts, which are difficult to use correctly in standard statistical packages. For this reason a freeware package, named CoDaPack, was created. This software implements most of the basic statistical methods suitable for compositional data. In this paper we describe the new version of the package, now called CoDaPack3D. It is developed in Visual Basic for Applications (associated with Excel©), Visual Basic and OpenGL, and it is oriented towards users with a minimum knowledge of computers, with the aim of being simple and easy to use. This new version includes new graphical output in 2D and 3D. These outputs can be zoomed and, in 3D, rotated. A customization menu is also included, and outputs can be saved in JPEG format. This version also includes an interactive help facility, and all dialog windows have been improved in order to facilitate their use. To use CoDaPack one opens Excel© and introduces the data in a standard spreadsheet. These should be organized as a matrix where Excel© rows correspond to the observations and columns to the parts. The user executes macros that return numerical or graphical results. There are two kinds of numerical results, new variables and descriptive statistics, and both appear on the same sheet. Graphical output appears in independent windows. In the present version there are 8 menus, with a total of 38 submenus which, after some dialogue, directly call the corresponding macro. The dialogues ask the user to input the variables and any further parameters needed, as well as where to put the results. The web site http://ima.udg.es/CoDaPack contains this freeware package; only Microsoft Excel© under Microsoft Windows© is required to run the software.
Key words: Compositional Data Analysis, Software
Abstract:
By using suitable parameters, we present a unified approach for describing four methods for representing categorical data in a contingency table. These methods include: correspondence analysis (CA), the alternative approach using the Hellinger distance (HD), the log-ratio (LR) alternative, which is appropriate for compositional data, and the so-called non-symmetrical correspondence analysis (NSCA). We then make an appropriate comparison among these four methods, and some illustrative examples are given. Some approaches based on cumulative frequencies are also linked and studied using matrices.
Key words: Correspondence analysis, Hellinger distance, Non-symmetrical correspondence analysis, Log-ratio analysis, Taguchi inertia
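Two of the row-profile metrics being compared can be sketched concretely: the chi-square distance underlying CA and the Hellinger distance of the HD approach. The 2×3 table below is invented for illustration; both distance formulas are the standard ones.

```python
import numpy as np

N = np.array([[20., 30., 50.],
              [40., 40., 20.]])       # toy contingency table
P = N / N.sum()                       # correspondence matrix
r = P.sum(axis=1)                     # row masses
c = P.sum(axis=0)                     # column masses
profiles = P / r[:, None]             # row profiles (each row sums to 1)

# Chi-square distance between the two row profiles (the CA metric,
# weighting each column by the inverse of its mass)
chi2 = np.sqrt(np.sum((profiles[0] - profiles[1])**2 / c))

# Hellinger distance between the same two profiles (the HD metric,
# an unweighted Euclidean distance between square-rooted profiles)
hell = np.sqrt(np.sum((np.sqrt(profiles[0]) - np.sqrt(profiles[1]))**2))

print(chi2, hell)
```

The two metrics generally rank departures between profiles differently, which is one motivation for the unified comparison in the paper.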
Abstract:
A condition needed for testing nested hypotheses from a Bayesian viewpoint is that the prior for the alternative model concentrates mass around the small, or null, model. For testing independence in contingency tables, the intrinsic priors satisfy this requirement. Further, the degree of concentration of the priors is controlled by a discrete parameter m, the training sample size, which plays an important role in the resulting answer regardless of the sample size. In this paper we study the robustness of tests of independence in contingency tables with respect to intrinsic priors with different degrees of concentration around the null, and compare with other “robust” results by Good and Crook. Consistency of the intrinsic Bayesian tests is established. We also discuss conditioning issues and sampling schemes, and argue that conditioning should be on either one margin or the table total, but not on both margins. Examples using real and simulated data are given.
Abstract:
This study investigates teacher-learner verbal exchanges in two different instructional contexts: CLIL (Content and Language Integrated Learning) classes, where non-linguistic content is taught through English, on the one hand, and 'traditional' EFL (English as a Foreign Language) classes, where English is both the object of study and the vehicle of communication, on the other. More specifically, teacher questions, learner oral production and teacher feedback in focus-on-form episodes are examined in the light of the main theories from the field of Second Language Acquisition (SLA) in order to establish their role in the learning of English. The data corpus comes from 7 CLIL sessions and 11 EFL sessions recorded in audio and video format in two state primary schools in Catalonia. In each school, the same teacher delivers both types of instruction to the same group of learners (aged 10-11), which makes it possible to rule out individual variables such as learner aptitude or teaching style. The results show a number of discourse similarities between CLIL and EFL, given that both approaches take place in a classroom context with well-defined characteristics. As research in this field suggests, CLIL instruction brings together a set of ideal conditions for developing levels of English beyond what 'traditional' English classes achieve. Nevertheless, this study seems to indicate that the potential of CLIL to provide rich exposure to English and meaningful oral production is not being properly exploited.
In this respect, the results of this study may contribute to the training of future CLIL teachers, seeking complementarity between the two contexts with the ultimate aim of improving levels of English proficiency.
Abstract:
Since the Single European Act of 1987, successive reforms of the European Union have been carried out under the twin banner of improving both the effectiveness of the Union's policies and the transparency and possibilities for democratic control of the European decision-making process. The consolidation of this effectiveness-democracy pairing has made clear the EU's need to satisfy the two classic dimensions of legitimacy in liberal democracies: on the one hand, legitimacy based on results (output legitimacy), that is, institutions that function efficiently and are capable of delivering policies that are effective and in line with a society's prevailing values; and on the other, legitimacy of process (input legitimacy), that is, policies developed through the established institutional procedures, allowing the degree of public participation and control considered appropriate in a political community. Foreign policy has not been exempt from this debate on the need to combine effectiveness and democracy, especially as this field has expanded functionally with the development of the Common Foreign and Security Policy (CFSP), its defence dimension (ESDP), the external aspects of Justice and Home Affairs (JHA) policies, and the European Neighbourhood Policy, which cuts across all the EU's "pillars". The chapters of this book examine how European foreign policy has progressed in recent years along the two dimensions of legitimacy identified (effectiveness and democratic control), in four thematic areas of European foreign policy: the promotion of democracy and human rights, the external dimension of Justice and Home Affairs policies, the European Neighbourhood Policy and, finally, the EU's role in global governance.
Abstract:
It is well known that multiple-input multiple-output (MIMO) techniques can bring numerous benefits, such as higher spectral efficiency, to point-to-point wireless links. More recently, there has been interest in extending MIMO concepts to multiuser wireless systems. Our focus in this paper is on network MIMO, a family of techniques whereby each end user in a wireless access network is served through several access points within its range of influence. By tightly coordinating the transmission and reception of signals at multiple access points, network MIMO can transcend the limits on spectral efficiency imposed by cochannel interference. Taking prior information-theoretic analyses of network MIMO to the next level, we quantify the spectral efficiency gains obtainable under realistic propagation and operational conditions in a typical indoor deployment. Our study relies on detailed simulations and, for specificity, is conducted largely within the physical-layer framework of the IEEE 802.16e Mobile WiMAX system. Furthermore, to facilitate the coordination between access points, we assume that a high-capacity local area network, such as Gigabit Ethernet, connects all the access points. Our results confirm that network MIMO stands to provide a multiple-fold increase in spectral efficiency under these conditions.
Abstract:
The mutual information of independent parallel Gaussian-noise channels is maximized, under an average power constraint, by independent Gaussian inputs whose power is allocated according to the waterfilling policy. In practice, discrete signalling constellations with limited peak-to-average ratios (m-PSK, m-QAM, etc) are used in lieu of the ideal Gaussian signals. This paper gives the power allocation policy that maximizes the mutual information over parallel channels with arbitrary input distributions. Such policy admits a graphical interpretation, referred to as mercury/waterfilling, which generalizes the waterfilling solution and allows retaining some of its intuition. The relationship between mutual information of Gaussian channels and nonlinear minimum mean-square error proves key to solving the power allocation problem.
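The classical waterfilling policy that mercury/waterfilling generalizes can be sketched in a few lines: each channel receives power p_i = max(0, μ − n_i), with the water level μ chosen so that the powers meet the total budget. The noise levels below are illustrative, and this sketch covers only the Gaussian-input baseline, not the paper's policy for arbitrary input distributions.

```python
import numpy as np

def waterfill(noise, total_power, iters=100):
    """Bisection on the water level mu so that sum(max(0, mu - n_i)) == total_power."""
    noise = np.asarray(noise, dtype=float)
    lo, hi = noise.min(), noise.max() + total_power   # mu is bracketed in this range
    for _ in range(iters):
        mu = 0.5 * (lo + hi)
        if np.maximum(0.0, mu - noise).sum() > total_power:
            hi = mu                                   # too much power: lower the level
        else:
            lo = mu                                   # budget not met: raise the level
    return np.maximum(0.0, mu - noise)

p = waterfill([1.0, 2.0, 4.0], total_power=3.0)
print(p)              # ≈ [2. 1. 0.]: the least noisy channel gets the most power
print(p.sum())        # ≈ 3.0: the average power constraint is met
```

Note how the noisiest channel is switched off entirely; with discrete constellations, mercury/waterfilling adjusts this allocation via the nonlinear MMSE relationship mentioned above.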
Abstract:
We characterize the capacity-achieving input covariance for multi-antenna channels known instantaneously at the receiver and in distribution at the transmitter. Our characterization, valid for arbitrary numbers of antennas, encompasses both the eigenvectors and the eigenvalues. The eigenvectors are found for zero-mean channels with arbitrary fading profiles and a wide range of correlation and keyhole structures. For the eigenvalues, in turn, we present necessary and sufficient conditions as well as an iterative algorithm that exhibits remarkable properties: universal applicability, robustness and rapid convergence. In addition, we identify channel structures for which an isotropic input achieves capacity.