914 results for DYNAMIC-ANALYSIS
Abstract:
In order to sustain their competitive advantage in the current increasingly globalized and turbulent context, more and more firms are competing globally in alliances and networks that oblige them to adopt new managerial paradigms and tools. However, their strategic analyses rarely take into account the strategic implications of these alliances and networks, considering their global relational characteristics, admittedly because of a lack of adequate tools to do so. This paper contributes to research that seeks to fill this gap by proposing the Global Strategic Network Analysis - SNA - framework. Its purpose is to help firms that compete globally in alliances and networks to carry out their strategic assessments and decision-making with a view to ensuring dynamic strategic fit from both a global and relational perspective.
Abstract:
This paper addresses the impact of the CO2 opportunity cost on the wholesale electricity price in the context of the Iberian electricity market (MIBEL), namely the Portuguese system, for the period corresponding to Phase II of the European Union Emission Trading Scheme (EU ETS). In the econometric analysis, a vector error correction model (VECM) is specified to estimate both the long-run equilibrium relations and the short-run interactions between the electricity price and the fuel (natural gas and coal) and carbon prices. The model is estimated using daily spot market prices, and the four commodity prices are jointly modelled as endogenous variables. Moreover, a set of exogenous variables is incorporated to account for electricity demand conditions (temperature) and the electricity generation mix (quantity of electricity traded according to the technology used). The outcomes for the Portuguese electricity system suggest that the dynamic pass-through of carbon prices into electricity prices is strongly significant, and the estimated long-run elasticity (equilibrium relation) is aligned with studies conducted for other markets.
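The long-run/short-run decomposition described above can be sketched with a two-step Engle-Granger error-correction regression, a simpler single-equation cousin of the paper's VECM. Everything here is invented for illustration (synthetic log prices, a made-up long-run elasticity of 0.5); it is not MIBEL data or the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# synthetic log carbon price: a random walk (illustrative only)
carbon = np.cumsum(rng.normal(0, 0.02, n))
# electricity cointegrated with carbon: assumed long-run elasticity 0.5
elec = 0.5 * carbon + rng.normal(0, 0.05, n)

# Step 1 (Engle-Granger): long-run regression elec_t = a + b*carbon_t + u_t
X = np.column_stack([np.ones(n), carbon])
(a, b), *_ = np.linalg.lstsq(X, elec, rcond=None)
u = elec - X @ np.array([a, b])              # equilibrium error

# Step 2: error-correction regression on first differences,
# with the lagged equilibrium error as the adjustment term
d_elec, d_carbon = np.diff(elec), np.diff(carbon)
Z = np.column_stack([np.ones(n - 1), d_carbon, u[:-1]])
(c0, short_run, ec_coef), *_ = np.linalg.lstsq(Z, d_elec, rcond=None)

print(round(b, 2))       # estimated long-run pass-through elasticity, ~0.5
print(ec_coef < 0)       # negative: price is pulled back toward equilibrium
```

A negative coefficient on the lagged equilibrium error is what "error correction" means: deviations from the long-run relation are gradually worked off.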
Abstract:
ABSTRACT OBJECTIVE To develop an assessment tool to evaluate the efficiency of federal university general hospitals. METHODS Data envelopment analysis, a linear programming technique, builds a best-practice frontier by comparing the observed production of each unit given the amount of resources used. The model is output-oriented and assumes variable returns to scale. Network data envelopment analysis considers link variables belonging to more than one dimension (in the model: medical residents, adjusted admissions, and research projects). Dynamic network data envelopment analysis uses carry-over variables (in the model: financing budget) to analyze frontier shifts in subsequent years. Data were gathered from the information system of the Brazilian Ministry of Education (MEC), 2010-2013. RESULTS The mean scores for health care, teaching, and research over the period were 58.0%, 86.0%, and 61.0%, respectively. In 2012, the best-performing year, for all units to reach the frontier it would be necessary to have a mean increase of 65.0% in outpatient visits, 34.0% in admissions, 12.0% in undergraduate students, 13.0% in multi-professional residents, 48.0% in graduate students, and 7.0% in research projects, along with a decrease of 9.0% in medical residents. In the same year, an increase of 0.9% in financing budget would be necessary to improve the care output frontier. In the dynamic evaluation, there was progress in teaching efficiency, oscillation in medical care, and no variation in research. CONCLUSIONS The proposed model generates public health planning and programming parameters by estimating efficiency scores and making projections to reach the best-practice frontier.
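The output-oriented, variable-returns-to-scale DEA model named in METHODS can be sketched as a small linear program. The data here are a toy single-input/single-output example (not the MEC hospital figures), and `scipy` is assumed to be available:

```python
import numpy as np
from scipy.optimize import linprog

# toy data (hypothetical): 1 input, 1 output, 3 decision-making units
x = np.array([2.0, 4.0, 6.0])    # e.g. budget
y = np.array([2.0, 5.0, 4.0])    # e.g. adjusted admissions
n = len(x)

def output_efficiency(o):
    """Output-oriented, variable-returns-to-scale DEA score for unit o."""
    # decision variables: [phi, lam_1..lam_n]; maximize phi -> minimize -phi
    c = np.r_[-1.0, np.zeros(n)]
    A_ub = np.vstack([
        np.r_[0.0, x],           # sum_j lam_j x_j <= x_o   (no extra input)
        np.r_[y[o], -y],         # phi*y_o <= sum_j lam_j y_j (scaled output)
    ])
    b_ub = np.array([x[o], 0.0])
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)   # VRS: sum_j lam_j = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return 1.0 / res.x[0]        # score in (0, 1]; 1 = on the frontier

scores = [round(output_efficiency(o), 2) for o in range(n)]
print(scores)    # unit 3 needs a 25% output increase -> score 0.80
```

The optimal phi is the proportional output expansion needed to reach the frontier, which is exactly the kind of projection the RESULTS section reports (e.g. the 65.0% increase in outpatient visits).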
Abstract:
Direct methanol fuel cell, DMFC, model, mass transport, Maxwell-Stefan, Flory-Huggins, crossover, polymer electrolyte membrane, Nafion
Abstract:
Magdeburg, Univ., Faculty of Process and Systems Engineering, Diss., 2011
Abstract:
DMFC, dynamic behaviour, current steps, system analysis, methanol oxidation, flow field design
Abstract:
Magdeburg, Univ., Faculty of Computer Science, Diss., 2009
Abstract:
Magdeburg, Univ., Faculty of Mathematics, Diss., 2013
Abstract:
This study examines the impact of globalization on cross-country inequality and poverty using a panel data set for 65 developing countries over the period 1970-2008. With separate modelling for poverty and inequality, explicit control for financial intermediation, and comparative analysis for developing countries, the study attempts to provide a deeper understanding of cross-country variations in income inequality and poverty. The major findings of the study are fivefold. First, a non-monotonic relationship between income distribution and the level of economic development holds in all samples of countries. Second, neither openness to trade nor FDI has a favourable effect on income distribution in developing countries. Third, high financial liberalization exerts a negative and significant influence on income distribution in developing countries. Fourth, inflation seems to distort income distribution in all sets of countries. Finally, the government emerges as a major player in shaping income distribution in developing countries.
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
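A minimal sketch of one such "movie", assuming the simplest linking: a Box-Cox power transformation whose parameter α slides from 1 (PCA of the raw data) toward 0 (PCA of log-transformed data), with the map recomputed frame by frame. The data are hypothetical positive values, not any dataset from the presentation:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.lognormal(0.0, 0.5, size=(50, 4))   # strictly positive, illustrative

def power_transform(x, alpha):
    # Box-Cox family: tends to log(x) as alpha -> 0
    return np.log(x) if alpha == 0 else (x**alpha - 1.0) / alpha

def pca_map(x):
    xc = x - x.mean(axis=0)                    # column-centre
    _, _, vt = np.linalg.svd(xc, full_matrices=False)
    return xc @ vt[:2].T                       # 2-D coordinates of the rows

# "movie": recompute the map frame by frame as alpha slides from 1 to 0
frames = [pca_map(power_transform(data, a)) for a in np.linspace(1, 0, 11)]
print(len(frames), frames[0].shape)            # 11 frames of 50x2 coordinates
```

Animating `frames` (e.g. with matplotlib) shows the smooth deformation of one map into the other, which is the effect described in the abstract.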
Abstract:
We study the minimum mean square error (MMSE) and the multiuser efficiency η of large dynamic multiple-access communication systems in which optimal multiuser detection is performed at the receiver while the number and the identities of active users are allowed to change at each transmission time. The system dynamics are ruled by a Markov model describing the evolution of the channel occupancy, and a large-system analysis is performed as the number of observations grows large. Starting from the equivalent scalar channel and the fixed-point equation tying the multiuser efficiency to the MMSE, we extend the analysis to the case of a dynamic channel and derive lower and upper bounds for the MMSE (and, thus, for η as well) that hold in the limit of large signal-to-noise ratios and increasingly large observation time T.
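For the static equal-power case with Gaussian inputs, the fixed-point equation tying η to the MMSE (the classical Tse-Hanly equation for randomly spread systems) can be solved by simple iteration. This is only the static starting point the abstract refers to, not the paper's dynamic-channel extension:

```python
def multiuser_efficiency(beta, snr, tol=1e-12):
    """Solve the fixed point eta = 1 - beta*(1 - mmse(eta*snr)),
    where mmse(x) = 1/(1+x) for Gaussian inputs (static, equal power)."""
    eta = 1.0
    for _ in range(10_000):
        mmse = 1.0 / (1.0 + eta * snr)   # per-user MMSE on the scalar channel
        new = 1.0 - beta * (1.0 - mmse)
        if abs(new - eta) < tol:
            break
        eta = new
    return eta

# high-SNR limit: eta -> 1 - beta for load beta < 1 (decorrelator bound)
print(round(multiuser_efficiency(0.5, 1e6), 3))   # ~0.5
```

The high-SNR behaviour of this static fixed point is exactly the regime in which the paper's lower and upper bounds on the dynamic-channel MMSE are stated.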
Abstract:
The aim of this paper is to analyse the impact of university knowledge and technology transfer activities on academic research output. Specifically, we study whether researchers with collaborative links with the private sector publish less than their peers without such links, after controlling for other sources of heterogeneity. We report findings from a longitudinal dataset on researchers from two engineering departments in the UK between 1985 and 2006. Our results indicate that researchers with industrial links publish significantly more than their peers. Academic productivity, though, is higher for low levels of industry involvement than for high levels.
Abstract:
Leakage detection is an important issue in many chemical sensing applications. Leakage detection by thresholds suffers from important drawbacks when sensors exhibit serious drifts or are affected by cross-sensitivities. Here we present an adaptive method based on Dynamic Principal Component Analysis that models the relationships between the sensors in the array. In normal conditions, a certain variance distribution characterizes the sensor signals. However, in the presence of a new source of variance, the PCA decomposition changes drastically. To prevent the influence of sensor drifts, the model is adaptive and is calculated recursively with minimum computational effort. The behavior of this technique is studied with synthetic signals and with real signals arising from oil vapor leakages in an air compressor. Results clearly demonstrate the efficiency of the proposed method.
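A rough sketch of the idea, assuming an exponentially weighted (forgetting-factor) covariance update and a squared-prediction-error monitor; the two-sensor "leak" scenario and all constants are invented for illustration, and this is not the authors' exact recursive algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

# two correlated sensor channels; a "leak" breaks the correlation at t = 300
t = np.arange(600)
base = np.sin(0.05 * t) + 0.1 * rng.normal(size=600)
s1 = base + 0.05 * rng.normal(size=600)
s2 = 2.0 * base + 0.05 * rng.normal(size=600)
s2[300:] += 1.5                                # leak: new variance source on s2

X = np.column_stack([s1, s2])
lam, R = 0.99, np.eye(2)                       # forgetting factor, running covariance
spe = []
for x in X:
    # recursive (exponentially weighted) covariance update -> adapts to slow drift
    R = lam * R + (1 - lam) * np.outer(x, x)
    w, V = np.linalg.eigh(R)
    p = V[:, -1:]                              # retained principal direction
    resid = x - p @ (p.T @ x)                  # part not explained by the PCA model
    spe.append(float(resid @ resid))           # squared prediction error (Q statistic)

spe = np.array(spe)
print(spe[100:300].mean() < spe[301:330].mean())   # leak inflates the residual
```

Because the covariance is updated with a forgetting factor, slow drifts are absorbed into the model, while a sudden new variance source (the leak) shows up immediately as a jump in the Q statistic before the model re-adapts.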