Abstract:
Biplots are graphical displays of data matrices based on the decomposition of a matrix as the product of two matrices. Elements of these two matrices are used as coordinates for the rows and columns of the data matrix, with an interpretation of the joint presentation that relies on the properties of the scalar product. Because the decomposition is not unique, there are several alternative ways to scale the row and column points of the biplot, which can cause confusion amongst users, especially when software packages are not united in their approach to this issue. We propose a new scaling of the solution, called the standard biplot, which applies equally well to a wide variety of analyses such as correspondence analysis, principal component analysis, log-ratio analysis and the graphical results of a discriminant analysis/MANOVA, in fact to any method based on the singular-value decomposition. The standard biplot also handles data matrices with widely different levels of inherent variance. Two concepts taken from correspondence analysis are important to this idea: the weighting of row and column points, and the contributions made by the points to the solution. In the standard biplot one set of points, usually the rows of the data matrix, optimally represents the positions of the cases or sample units, which are weighted and usually standardized in some way unless the matrix contains values that are comparable in their raw form. The other set of points, usually the columns, is represented in accordance with their contributions to the low-dimensional solution. As for any biplot, the projections of the row points onto vectors defined by the column points approximate the centred and (optionally) standardized data. The method is illustrated with several examples to demonstrate how the standard biplot copes in different situations to give a joint map which needs only one common scale on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot readable. The proposal also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important.
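As a hedged illustration of the machinery shared by all these methods (not the paper's exact standard-biplot scaling), the sketch below builds biplot coordinates from the SVD of a centred data matrix and checks that projections of row points onto column vectors give the rank-k approximation of the data.

```python
# A minimal biplot sketch: rows in principal coordinates, columns in
# standard coordinates; the paper's "standard biplot" scaling differs.
import numpy as np

X = np.random.default_rng(0).normal(size=(20, 5))   # toy data matrix
Xc = X - X.mean(axis=0)                             # column-centre

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                                   # retained dimensions
rows = U[:, :k] * s[:k]                 # row points (principal coordinates)
cols = Vt[:k].T                         # column points (standard coordinates)

# Row-onto-column projections approximate the centred data:
err = np.linalg.norm(Xc - rows @ cols.T) / np.linalg.norm(Xc)
print(f"relative error of the rank-{k} biplot approximation: {err:.3f}")
```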
Abstract:
We model a Systemically Important Financial Institution (SIFI) that is too big (or too interconnected) to fail. Without credible regulation and strong supervision, the shareholders of this institution might deliberately let its managers take excessive risk. We propose a solution to this problem, showing how insurance against systemic shocks can be provided without generating moral hazard. The solution involves levying a systemic tax needed to cover the costs of future crises and, more importantly, establishing a Systemic Risk Authority endowed with special resolution powers, including the control of bankers' compensation packages during crisis periods.
Abstract:
We investigate the theoretical conditions for effectiveness of government consumption expenditure expansions using US, Euro area and UK data. Fiscal expansions taking place when monetary policy is accommodative lead to large output multipliers in normal times. The 2009-2010 packages need not produce significant output multipliers, may have moderate debt effects, and only generate temporary inflation. Expenditure expansions accompanied by deficit/debt consolidation schemes may lead to short-run output gains but their success depends on how monetary policy and expectations behave. Trade openness and the cyclicality of the labor wedge explain cross-country differences in the magnitude of the multipliers.
Abstract:
Does the labor market place wage premia on jobs that involve physical strain, job insecurity or bad regulation of hours? This paper derives bounds on the monetary returns to these job disamenities in the West German labor market. We show that in a market with dispersion in both job characteristics and wages, the average wage change of workers who switch jobs voluntarily and opt for consuming more (less) disamenities provides an upper (lower) bound on the market return to the disamenity. Using longitudinal information from workers in the German Socio-Economic Panel, we estimate an upper bound of 5% and a lower bound of 3.5% for the market return to work strain in a job.
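A minimal sketch of the bounding idea, under assumed sign conventions; the data frame, its column names and all numbers are invented for illustration.

```python
# Hedged sketch: among voluntary job switchers, the mean log-wage change of
# those who take on more of a disamenity bounds its market return from above;
# minus the mean change of those who shed it bounds the return from below
# (these sign conventions are an assumption, not the paper's derivation).
import pandas as pd

df = pd.DataFrame({
    "voluntary": [1, 1, 1, 1, 0],
    "dlog_wage": [0.06, 0.04, -0.02, -0.05, 0.10],
    "d_strain":  [1, 1, -1, -1, 0],   # +1: more work strain, -1: less
})

movers = df[df["voluntary"] == 1]
upper = movers.loc[movers["d_strain"] > 0, "dlog_wage"].mean()
lower = -movers.loc[movers["d_strain"] < 0, "dlog_wage"].mean()
print(f"upper bound: {upper:.3f}, lower bound: {lower:.3f}")
```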
Abstract:
Executive compensation packages are often valued in an inconsistent manner: while employee stock options (ESOs) are typically valued ex-ante, cash bonuses are valued ex-post. This renders the existing valuation models of employee compensation packages theoretically unsatisfactory and, potentially, empirically distortive. In this paper, we propose an option-based framework for ex-ante valuation of cash bonus contracts. After obtaining closed-form expressions for ex-ante values of several frequently used types of bonus contracts, we utilize them to explore the effects that the shape of a bonus contract has on the executive's attitude toward risk-taking. We also study the pay-performance sensitivity of such contracts. We show that the terms of a bonus contract can dramatically impact both risk-taking behavior and pay-performance incentives. Several testable predictions are made, and avenues of future research are outlined.
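The paper's closed-form expressions are not reproduced here. As a hedged illustration of ex-ante bonus valuation: a bonus that starts accruing at performance level K1 and is capped at K2 has the payoff of a bull call spread, priced below under assumed Black-Scholes dynamics (all parameter values are hypothetical).

```python
# Hedged sketch, not the paper's model: ex-ante value of a capped bonus
# paying min(max(S_T - K1, 0), K2 - K1) as a difference of two calls.
from math import log, sqrt, exp
from statistics import NormalDist

N = NormalDist().cdf

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

def capped_bonus(S, K1, K2, r, sigma, T):
    """Bonus accruing above K1 and capped at K2: a bull call spread."""
    return bs_call(S, K1, r, sigma, T) - bs_call(S, K2, r, sigma, T)

print(capped_bonus(S=100, K1=100, K2=120, r=0.03, sigma=0.25, T=1.0))
```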
Abstract:
Although correspondence analysis is now widely available in statistical software packages and applied in a variety of contexts, notably the social and environmental sciences, there are still some misconceptions about this method as well as unresolved issues which remain controversial to this day. In this paper we hope to settle these matters, namely (i) the way CA measures variance in a two-way table and how to compare variances between tables of different sizes, (ii) the influence, or rather lack of influence, of outliers in the usual CA maps, (iii) the scaling issue and the biplot interpretation of maps, (iv) whether or not to rotate a solution, and (v) statistical significance of results.
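As a hedged sketch of point (i): in CA, the total variance (inertia) of a two-way table is the chi-squared statistic divided by the grand total, which puts tables of different sizes on a comparable scale.

```python
# Total inertia of a toy contingency table: chi2 / n.
import numpy as np

table = np.array([[20, 10, 5],
                  [ 8, 12, 15]], dtype=float)

n = table.sum()
P = table / n                                      # correspondence matrix
r = P.sum(axis=1, keepdims=True)                   # row masses
c = P.sum(axis=0, keepdims=True)                   # column masses
expected = r @ c
inertia = ((P - expected) ** 2 / expected).sum()   # = chi-squared / n
print(f"total inertia: {inertia:.4f}")
```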
Abstract:
We present an environment for analysing signals of all kinds with LDB (Local Discriminant Bases) and MLDB (Modified Local Discriminant Bases). This environment uses functions developed within the framework of a thesis still in progress. Understanding some of these functions requires an advanced level of knowledge of signal processing. They have been extracted from the works of Naoki Saito [3], which were taken as the starting point for the algorithm of the unfinished doctoral thesis of Jose Antonio Soria. The interface developed accepts the incorporation of new packages and functions. We have left a menu prepared to integrate the Sinus IV packet transform and the Cosine IV packet transform, although others can also be incorporated. The application consists of two interfaces, a Wizard and a main interface. We have also created a window for importing and exporting the desired variables to different environments. To build this application, all the window elements were programmed by hand, instead of using MATLAB's GUIDE (Graphical User Interface Development Environment), so that it remains compatible across the different versions of this program. In total we wrote 73 functions for the main interface (10 of which belong to the import/export window) and 23 for the Wizard. In this work we explain only 6 functions, plus the 3 that create these interfaces, so as not to make it excessively long. The functions explained are the most important ones, either because they are used often, because they are the most complicated according to their McCabe complexity, or because they are necessary for the signal processing. Every piece of data entered by the user is passed through functions that detect errors in that input, such as removing zeros or characters that are not numbers, and checking that values are integers and lie within the maximum and minimum limits that apply to them.
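As a hedged illustration of the input-validation step described above (the original functions are MATLAB; this Python function and its name are hypothetical):

```python
# Hypothetical validator in the spirit of the checks described: strip
# non-digit characters, then verify the value is an integer within bounds.
def parse_bounded_int(text, lo, hi):
    digits = "".join(ch for ch in text if ch.isdigit())
    if not digits:
        return None                  # nothing numeric could be recovered
    value = int(digits)              # leading zeros are dropped implicitly
    return value if lo <= value <= hi else None

print(parse_bounded_int(" 0042x", 1, 100))   # -> 42
print(parse_bounded_int("abc", 1, 100))      # -> None
```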
Abstract:
Companies have always sought to optimize their resources as much as possible and to be more efficient when carrying out the tasks entrusted to them. For this reason, companies constantly conduct studies and assessments of how to improve day by day. This is no different at Serralleria i Alumini Vilaró (S.A.V.), which studies daily how to optimize its processes and, at times, how to introduce new ones in order to expand its range of services. On the production side, the company manufactures metal parts, whether the process involves only cutting and machining, bending, welding, stainless-steel finishing, painting or even packaging; through its technical office it also offers product development services according to the customer's specifications, as well as the re-engineering of any product, analysing the part to be improved. The company has now identified a shortcoming it believes can be solved: it has several cutting machines, among them a laser cutting machine, and the main problem is that loading the metal sheets from the storage crate onto the machine bed is done either manually or with a gripper suspended from the overhead crane, depending on the weight of the sheet to be transported. The main objective of this work is to design a machine that automates the process of transporting the metal sheet from the storage crate, placed on a mobile table, to the bed of the cutting machine. The design we intend to produce is complete, starting with the structural design of the machine together with the corresponding calculations, the movements we want to achieve, the choice of components (motors, sensors, etc.), the preparation of a budget to provide a cost estimate and, finally, the development of the control program for the whole machine together with the interaction with the machine through a touchscreen. In other words, we intend to produce a project that could actually be built using all the information it contains.
Abstract:
Introduction. This paper studies the situation of research on Catalan literature between 1976 and 2003 by carrying out a bibliometric and social network analysis of PhD theses defended in Spain. It has a dual aim: to present interesting results for the discipline and to demonstrate the methodological efficacy of scientometric tools in the humanities, a field in which they are often neglected due to the difficulty of gathering data. Method. The analysis was performed on 151 records obtained from the TESEO database of PhD theses. The quantitative estimates include the use of the UCINET and Pajek software packages. Authority control was performed on the records. Analysis. Descriptive statistics were used to describe the sample and the distribution of responses to each question. Sex differences on key questions were analysed using the Chi-squared test. Results. The value of the figures obtained is demonstrated. The information obtained on the topics and periods studied in the theses, and on the actors involved (doctoral students, thesis supervisors and members of defence committees), provides important insights into the mechanisms of humanities disciplines. The main research tendencies in Catalan literature are identified. It is observed that the composition of the thesis defence committees follows Lotka's Law. Conclusions. Bibliometric analysis and social network analysis may be especially useful in the humanities and in other fields which are lacking in scientometric data in comparison with the experimental sciences.
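As a hedged illustration of the Lotka's-Law observation: the number of people who sit on n committees should fall off roughly as C / n**a with a near 2. The counts below are invented for illustration.

```python
# Fit the Lotka exponent by least squares on the log-log frequency curve.
from collections import Counter
import numpy as np

memberships = [1]*60 + [2]*15 + [3]*7 + [4]*4 + [5]*2   # toy data
freq = Counter(memberships)

n = np.array(sorted(freq))                     # committee counts per person
y = np.array([freq[k] for k in n], dtype=float)

slope, intercept = np.polyfit(np.log(n), np.log(y), 1)
print(f"estimated Lotka exponent a = {-slope:.2f}")     # ~2 if the law holds
```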
Abstract:
The study of the thermal behavior of complex packages such as multichip modules (MCMs) is usually carried out by measuring the so-called thermal impedance response, that is, the transient temperature after a power step. From the analysis of this signal, the thermal frequency response can be estimated and, consequently, compact thermal models may be extracted. We present a method to obtain an estimate of the time constant distribution underlying the observed transient. The method is based on an iterative deconvolution that produces an approximation to the time constant spectrum while preserving a convenient convolution form. This method is applied to the thermal response of a microstructure analyzed by the finite element method, as well as to the measured thermal response of a transistor array integrated circuit (IC) in an SMD package.
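The authors' exact iterative scheme is not reproduced here. As a hedged sketch of the setting: in logarithmic time z = ln t, the derivative of the thermal step response is the time-constant spectrum convolved with the fixed kernel w(z) = exp(z - exp(z)), and a simple Van Cittert iteration (one standard iterative deconvolution, assumed here as a stand-in) can recover an approximation to the spectrum.

```python
# Hedged sketch: recover a toy time-constant spectrum by Van Cittert
# iterative deconvolution; this illustrates, and is not, the paper's method.
import numpy as np

z = np.linspace(-6, 6, 600)                     # z = ln(t / t0)
dz = z[1] - z[0]
w = np.exp(z - np.exp(z))                       # fixed convolution kernel

R_true = np.exp(-0.5 * ((z + 1) / 0.3) ** 2)    # toy spectrum, peak at z = -1
da = np.convolve(R_true, w, mode="same") * dz   # "observed" derivative

R = da.copy()
for _ in range(200):                            # Van Cittert updates
    R += da - np.convolve(R, w, mode="same") * dz
    R = np.clip(R, 0.0, None)                   # a spectrum is non-negative

print(f"recovered peak at z = {z[np.argmax(R)]:.2f} (true: -1.00)")
```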
Abstract:
The performance of a device based on modified injection-locking techniques is studied by means of numerical simulations. The device incorporates master and slave configurations, each one with a DFB laser and an electroabsorption modulator (EAM). This arrangement allows the generation of high peak power, narrow optical pulses according to a periodic or pseudorandom bit stream provided by a current signal generator. The device is able to considerably increase the modulation bandwidth of free-running gain-switched semiconductor lasers using multiplexing in the time domain. Opportunities for integration in small packages or single chips are discussed.
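As a hedged, far simpler stand-in for the device simulated in the paper, the single-mode rate equations below illustrate numerical simulation of a gain-switched laser driven by a square-wave current; every parameter value is a generic textbook-scale assumption, not taken from the paper.

```python
# Explicit Euler integration of single-mode laser rate equations
# (carrier density N and photon density S) under a 1 GHz current drive.
q = 1.602e-19        # electron charge [C]
V = 1e-16            # active volume [m^3] (assumed)
tau_n, tau_p = 2e-9, 2e-12          # carrier and photon lifetimes [s]
g0, N0, beta = 1.5e-12, 1e24, 1e-4  # gain slope, transparency, spont. factor

dt, steps = 2e-14, 200_000          # 4 ns of simulated time
N, S = N0, 1e10                     # initial carrier and photon densities
for i in range(steps):
    t = i * dt
    I = 30e-3 if t % 1e-9 < 0.5e-9 else 5e-3    # square-wave current [A]
    G = g0 * (N - N0)                           # linear gain model
    N += dt * (I / (q * V) - N / tau_n - G * S)
    S += dt * (G * S - S / tau_p + beta * N / tau_n)
print(f"final photon density: {S:.3e} m^-3")
```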
Abstract:
This paper presents the preliminary findings of pH and colour measurements carried out on artworks on paper and on wood that had been treated with a poly(vinyl acetate) (PVAC) based adhesive in the 1980s. In both cases, areas treated with PVAC proved to be less acidic than untreated areas. Contrary to expectations, the conservation treatments have not, as yet, increased acidity levels in the objects under study. Colour measurements of the works on paper showed that those that had been backed with a cotton fabric using a mixture of methylcellulose and PVAC were less yellow than those from the same print run that had not been backed. This finding suggests that the backing somehow prevented the natural degradation of the support. In view of these preliminary results, further research is clearly needed. This study forms part of a broader ongoing project to assess the role of PVAC in the conservation of a range of cultural assets.
Abstract:
This paper highlights the role of non-functional information when reusing from a component library. We describe a method for selecting appropriate implementations of Ada packages taking non-functional constraints into account; these constraints model the context of reuse. Constraints take the form of queries using an interface description language called NoFun, which is also used to state non-functional information in Ada packages; query results are trees of implementations, following the import relationships between components. We define two different situations when reusing components, depending on whether we take the library being searched as closed or extendible. The resulting tree of implementations can be manipulated by the user to solve ambiguities, to state default behaviours, and the like. As part of the proposal, we face the problem of computing from code the non-functional information that determines the selection process.
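NoFun itself is not modelled here. As a hedged sketch of the selection step: each candidate implementation advertises non-functional attributes, and a query keeps those that satisfy the constraints modelling the context of reuse (all names and attribute values below are hypothetical).

```python
# Hypothetical component catalogue and constraint-based selection.
from dataclasses import dataclass

@dataclass
class Implementation:
    name: str
    time_complexity: str       # e.g. "O(log n)"
    space_kb: int
    reentrant: bool

CANDIDATES = [
    Implementation("OrderedTable.AVL",  "O(log n)", 64, True),
    Implementation("OrderedTable.List", "O(n)",     16, True),
    Implementation("OrderedTable.Hash", "O(1)",    128, False),
]

def select(candidates, **constraints):
    """Keep implementations whose attributes satisfy every constraint;
    each constraint is a predicate over one named attribute."""
    return [c for c in candidates
            if all(pred(getattr(c, attr)) for attr, pred in constraints.items())]

chosen = select(CANDIDATES,
                space_kb=lambda v: v <= 100,
                reentrant=lambda v: v)
print([c.name for c in chosen])   # the implementations fitting this context
```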
Abstract:
Why do public-sector workers receive so much of their compensation in the form of pensions and other benefits? This paper presents a political economy model in which politicians compete for taxpayers' and government employees' votes by promising compensation packages, but some voters cannot evaluate every aspect of promised compensation. If pension packages are "shrouded", so that public-sector workers better understand their value than ordinary taxpayers, then compensation will be highly back-loaded. In equilibrium, the welfare of public-sector workers could be improved, holding total public-sector costs constant, if they received higher wages and lower pensions. Centralizing pension determination has two offsetting effects on generosity: more state-level media attention helps taxpayers better understand pension costs, and that reduces pension generosity; but a larger share of public-sector workers will vote within the jurisdiction, which increases pension generosity. A short discussion of pensions in two decentralized states (California and Pennsylvania) and two centralized states (Massachusetts and Ohio) suggests that centralization appears to have modestly reduced pensions, but, as the model suggests, this is unlikely to be universal.