79 results for Case analysis
Abstract:
Within only two decades, olive oil developed from a niche product that could hardly be found in food stores outside the producing regions into an integral component of the diets of industrial countries. This paper discusses the impacts of the promotion of the “healthy Mediterranean diet” on land use and agro-ecosystems in the producing countries. It examines the dynamics of olive oil production, trade and consumption in the EU15 in the period 1972 to 2003 and the links between dietary patterns, trade and land use. It analyses the underlying socio-economic driving forces behind the increasing spatial disconnect between production and consumption of olive oil in the EU15, and in particular in Spain, the world's largest producer during the last three decades. In the observed period, olive oil consumption increased 16-fold in the non-producing EU15 countries. In geographically limited producing regions like Spain, the 5-fold increase in export production was associated with the rapid industrialization of olive production, the conversion of vast Mediterranean landscapes to olive monocultures and a range of environmental pressures. Substantial subsidies under the European Common Agricultural Policy and feedback loops within production and consumption systems drove the transformation of the olive oil system. Our analysis indicates that the process of change was not immediately driven by increases in demand for olive oil in non-producing countries, but rather by the institutional setting of the European Union and by concerted political interventions.
Abstract:
The Treaty Establishing the European Community (TEC), in force until 1 December 2009, had already established in its Article 2 that the mission of the then European Community, and of the present European Union, is to promote a harmonious, balanced and sustainable development of economic activities throughout the Community. This mission is to be achieved by establishing a Common Market and an Economic and Monetary Union and by implementing Common Policies. One of the instruments for attaining these objectives is the free movement of persons, services and capital within the Common and Internal Market of the European Union. The European Union is characterized by the affirmation of the full freedom of movement of capital, services and natural and legal persons, a freedom already proclaimed by the Maastricht Treaty, through the suppression of whatever obstacles stand in the way of the objectives set out above. The old TEC in its Title III, now Title IV of the Treaty on the Functioning of the European Union, covered the free movement of persons, services and capital. The inclusion of this mechanism in one of the founding texts of the European Union thus indicates the importance of this freedom for the development of the European Union's objectives. Having established the relevance of the free movement of persons, services and capital, we should mention that this paper centres its study on one of these freedoms: the free movement of capital. In order to analyse the free movement of capital within the European framework in detail, we start from an analysis of the existing case law of the Court of Justice of the European Union. The use of this jurisprudence is essential to understanding how Community legislation is interpreted. For this reason, we develop this work through judgments handed down by the Court of Justice of the European Union.
In this way we can observe how Member States' national legislation and European Community law affect the free movement of capital. The starting point of this paper is the Judgment of the European Court of Justice of 12 February 2009 in Case C-67/08, known as the Block case. Following the reasoning of the Luxembourg Court in that case, we examine how the free movement of capital can be affected by the current disparity of Member States' legislation. This disparity can produce cases of double taxation, owing to the lack of harmonized tax legislation within the internal market and the lack of treaties to avoid double taxation within the European Union. Developing this idea, we show how double taxation can, at least indirectly, infringe the free movement of capital.
Abstract:
The field of laser application to the restoration and cleaning of cultural assets is among the most thriving developments of recent times. Ablative laser systems are able to clean and protect inestimable works of art subject to atmospheric agents and degradation over time. This technology, which has been developing for the last forty years, is now available to restorers and has met with significant success all over Europe. An important contribution to the process of laser innovation has been made in Florence by local actors belonging to a creative cluster. The objects of the analysis are the genesis of this innovation in the local Florentine context and the relationships among the main actors who have contributed to it. The study investigates how culture can play a part in the generation of ideas and innovations, and which creative environments can favour it. In this context, laser technologies for the restoration of cultural heritage are analysed as a case study of the various paths taken by the Creative Capacity of the Culture (CCC).
Abstract:
The link between energy consumption and economic growth has been widely studied in the economic literature. Understanding this relationship is important from both an environmental and a socio-economic point of view, as energy consumption is crucial to economic activity and to human environmental impact. This relevance is even higher for developing countries, since energy consumption per unit of output varies through the phases of development, increasing from an agricultural stage to an industrial one and then decreasing for certain service-based economies. In the Argentinean case, the relevance of energy consumption to economic development seems particularly important. While energy intensity appears to exhibit a U-shaped curve from 1990 to 2003, decreasing slightly after that year, total energy consumption increases over the whole period of analysis. Why does this happen? How can we relate this result to the sustainability debate? These questions are very important given Argentina's hydrocarbon dependence and the recent reduction in oil and natural gas reserves, which can lead to a lack of security of supply. In this paper we study the Argentinean energy consumption pattern for the period 1990-2007 in order to discuss current and future energy and economic sustainability. To this purpose, we develop both a conventional analysis, studying energy intensity, and a non-conventional analysis, using the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) accounting methodology. Both methodologies show that the development process followed by Argentina has not been good enough to ensure sustainability in the long term. Instead of improving energy use, energy intensity has increased. The current composition of its energy mix, the recent economic crisis in Argentina and its development path are some of the possible explanations.
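The conventional indicator discussed in the abstract, energy intensity, is simply energy use per unit of output. A minimal sketch, with placeholder figures that are not Argentina's actual series:

```python
# Energy intensity = energy consumed per unit of GDP. All numbers below
# are illustrative placeholders, not the Argentinean data of the paper.
def energy_intensity(energy_ktoe, gdp_million_usd):
    """Energy consumed per unit of economic output (ktoe per M USD)."""
    return energy_ktoe / gdp_million_usd

years  = [1990, 1995, 2000, 2003, 2007]                 # hypothetical
energy = [46000.0, 53000.0, 57000.0, 60000.0, 72000.0]  # hypothetical ktoe
gdp    = [141000.0, 258000.0, 284000.0, 130000.0, 288000.0]  # hypothetical

intensity = {y: round(energy_intensity(e, g), 3)
             for y, e, g in zip(years, energy, gdp)}
for y in years:
    print(y, intensity[y])
```

A falling intensity with rising total consumption, as the abstract describes, simply means GDP grew faster than energy use over part of the period.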
Abstract:
Nonlinear Noisy Leaky Integrate and Fire (NNLIF) models for neuronal networks can be written as Fokker-Planck-Kolmogorov equations for the probability density of neurons, the main parameters in the model being the connectivity of the network and the noise. We analyse several aspects of the NNLIF model: the number of steady states, a priori estimates, blow-up issues and convergence toward equilibrium in the linear case. In particular, for excitatory networks, blow-up always occurs for initial data concentrated close to the firing potential. These results show how critical the balance between noise and excitatory/inhibitory interactions is relative to the connectivity parameter.
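For reference, the Fokker-Planck-Kolmogorov system the abstract refers to is usually written as follows in the NNLIF literature; the notation here (firing potential V_F, reset potential V_R, noise coefficient a, connectivity b) is the common convention, assumed rather than taken from the paper itself:

```latex
% Evolution of the membrane-potential density p(v,t), for v <= V_F:
\partial_t p(v,t)
  + \partial_v\!\big[(-v + b\,N(t))\,p(v,t)\big]
  - a\,\partial_{vv} p(v,t)
  = \delta(v - V_R)\,N(t),
% with the firing rate N(t) given by the flux of neurons through V_F:
N(t) = -\,a\,\partial_v p(V_F, t) \;\ge\; 0.
```

The sign of b encodes the network type (b > 0 excitatory, b < 0 inhibitory), which is why the balance between the noise a and the connectivity b governs blow-up versus convergence.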
Abstract:
This project analyses and optimizes the satellite-to-aircraft link of a global aeronautical system. This new system, called ANTARES, is designed to connect aircraft with ground stations through a satellite. It is an initiative involving official aviation institutions such as ECAC and is being developed through a European collaboration of universities and companies. The work carried out in the project basically comprises three aspects: the design and analysis of the resource management; the suitability of using error correction in the link layer and, where necessary, the design of a preliminary coding option; and, finally, the study and analysis of the effect of co-channel interference in multibeam systems. All these topics are considered for the forward link only. The structure of the project is to first present the overall characteristics of the system and then to focus on and analyse the aforementioned topics in order to give results and draw conclusions.
Abstract:
We study the properties of the well known Replicator Dynamics when applied to a finitely repeated version of the Prisoners' Dilemma game. We characterize the behavior of such dynamics under strongly simplifying assumptions (i.e. only 3 strategies are available) and show that the basin of attraction of defection shrinks as the number of repetitions increases. After discussing the difficulties involved in trying to relax the 'strongly simplifying assumptions' above, we approach the same model by means of simulations based on genetic algorithms. The resulting simulations describe a behavior of the system very close to the one predicted by the replicator dynamics without imposing any of the assumptions of the mathematical model. Our main conclusion is that mathematical and computational models are good complements for research in social sciences. Indeed, while computational models are extremely useful to extend the scope of the analysis to complex scenarios hard to analyze mathematically, formal models can be useful to verify and to explain the outcomes of computational models.
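The replicator dynamics the abstract characterizes can be sketched in a few lines. The strategy set (AllD, AllC, Tit-for-Tat) and the stage payoffs T=5, R=3, P=1, S=0 below are standard textbook assumptions, not necessarily the paper's exact parameterization:

```python
import numpy as np

def replicator_step(x, A, dt=0.01):
    """One Euler step of dx_i/dt = x_i * ((A x)_i - x . A x)."""
    fitness = A @ x
    return x + dt * x * (fitness - x @ fitness)

def simulate(x0, A, steps=50_000):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = np.clip(replicator_step(x, A), 0.0, None)
        x /= x.sum()                       # stay on the simplex
    return x

n = 10   # number of repetitions of the stage game
# Row player's total payoff over n rounds against (AllD, AllC, TFT),
# with stage payoffs T=5, R=3, P=1, S=0 (assumed):
A = np.array([[1*n,     5*n, 5 + 1*(n-1)],   # AllD: P*n, T*n, T + P*(n-1)
              [0*n,     3*n, 3*n        ],   # AllC: S*n, R*n, R*n
              [1*(n-1), 3*n, 3*n        ]])  # TFT:  S + P*(n-1), R*n, R*n

x_final = simulate([0.2, 0.4, 0.4], A)
print(x_final.round(3))
```

Varying n and the initial point makes it possible to trace numerically how the basin of attraction of defection shrinks as the number of repetitions grows, which is the paper's analytical result.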
Abstract:
This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented and the SCR is calculated according to the Standard Model approach. Alternatively, the requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas being used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR we conduct a sensitivity analysis. We examine changes in the correlation matrix between lines of business and address the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
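The internal-model step described above can be sketched as a Monte Carlo aggregation with a Gaussian copula. The correlation matrix, the lognormal margins and their parameters below are illustrative assumptions, not the Spanish-market calibration used in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Dependence between 3 lines of business (assumed correlation matrix).
corr = np.array([[1.00, 0.50, 0.25],
                 [0.50, 1.00, 0.25],
                 [0.25, 0.25, 1.00]])
L = np.linalg.cholesky(corr)
z = rng.standard_normal((200_000, 3)) @ L.T   # Gaussian-copula scores

# Lognormal marginal losses per line (parameters are assumptions).
mu    = np.array([6.0, 5.5, 5.0])
sigma = np.array([0.3, 0.4, 0.5])
total = np.exp(mu + sigma * z).sum(axis=1)    # aggregate one-year loss

# SCR-style capital: 99.5% Value-at-Risk of the loss over its mean.
scr = np.quantile(total, 0.995) - total.mean()
print(round(float(scr), 1))
```

Swapping the Gaussian copula for a t or an Archimedean copula changes the tail dependence between lines, and re-running the simulation under each choice is exactly the kind of sensitivity analysis the paper conducts.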
Abstract:
The genetic diversity of three temperate fruit tree phytoplasmas, ‘Candidatus Phytoplasma prunorum’, ‘Ca. P. mali’ and ‘Ca. P. pyri’, has been established by multilocus sequence analysis. Among the four genetic loci used, the genes imp and aceF distinguished 30 and 24 genotypes, respectively, and showed the highest variability. The percentage of substitution for imp ranged from 50 to 68% according to species. The percentage of substitution varied between 9 and 12% for aceF, whereas it was between 5 and 6% for pnp and secY. In the case of ‘Ca. P. prunorum’, the three most prevalent aceF genotypes were detected in both plants and insect vectors, confirming that the prevalent isolates are propagated by insects. The four isolates known to be hypo-virulent had the same aceF sequence, indicating a possible monophyletic origin. A haplotype network reconstructed by eBURST revealed that, among the 34 haplotypes of ‘Ca. P. prunorum’, the four hypo-virulent isolates also grouped together in the same clade. Genotyping of some Spanish and Azerbaijani ‘Ca. P. pyri’ isolates showed that they shared some alleles with ‘Ca. P. prunorum’, supporting, for the first time to our knowledge, the existence of inter-species recombination between these two species.
Abstract:
This paper aims to analyse the impact of human capital on business productivity, focusing the analysis on the possible effect of the complementarity that exists between human capital and new production technologies, particularly advanced manufacturing technologies (AMTs) for the specific case of small and medium enterprises (SMEs) in Catalonia. Additionally, following the theory of skill-biased technological change, the paper analyses whether technological change produces bias exclusively in the skills required for managers, or whether the bias extends to the skills required of production staff. With this objective, we have compared the possible existence of complementarity between AMTs and the level of human capital for different occupational groups. The results confirm the complementary relationship between human capital and new production technologies. The results by occupational group confirm that to maximise the productivity of new technologies, skilled staff are needed both in management and production, with managers and professionals as well as skilled operatives playing a vital role. Keywords: human capital, process technologies, complementarity, business productivity. (JEL D24, J24, O30).
Abstract:
The main aim of this work is to define an environmental tax on products and services based on their carbon footprint. We examine the relevance of conventional life cycle analysis (LCA) and environmentally extended input-output analysis (EIO) as methodological tools to identify the emission intensities of products and services on which the tax is based. The short-term price effects of the tax and the policy implications of considering non-GHG emissions are also analyzed. The results from the specific case study on pulp production show that the environmental tax rate based on the LCA approach (1.8%) is higher than both EIO approaches (0.8% for the product approach and 1.4% for the industry approach), but they are comparable. Even though LCA is more product-specific and provides a more detailed analysis, EIO is the more relevant approach for applying an economy-wide environmental tax. When the environmental tax considers non-GHG emissions instead of only CO2, sectors such as agriculture, mining of coal and extraction of peat, and food exhibit higher environmental taxes and price effects. It is therefore worthwhile for policy makers to pay attention to the implications of considering a CO2-only tax versus a GHG emissions tax in order for such a policy measure to be effective and meaningful. Keywords: Environmental tax; Life cycle analysis; Environmental input-output analysis.
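The EIO step behind such a tax can be sketched with the Leontief inverse, which turns direct sectoral emissions into total (direct plus indirect) emission intensities. The 3-sector table and the carbon price below are illustrative assumptions, not the data behind the pulp case study:

```python
import numpy as np

Z = np.array([[10.0,  5.0, 15.0],        # inter-industry flows (assumed)
              [ 8.0, 12.0,  6.0],
              [ 4.0,  9.0, 11.0]])
x = np.array([100.0, 80.0, 90.0])        # total sector output (assumed)
e = np.array([50.0, 120.0, 30.0])        # direct GHG emissions (assumed)

A = Z / x                                 # coefficients a_ij = z_ij / x_j
leontief = np.linalg.inv(np.eye(3) - A)   # Leontief inverse (I - A)^-1
direct = e / x                            # direct emission intensity
total_intensity = direct @ leontief       # total intensity per unit output

carbon_price = 0.02                       # tax per emission unit (assumed)
ad_valorem_rate = total_intensity * carbon_price
print(total_intensity.round(3), ad_valorem_rate.round(4))
```

Because the Leontief inverse never reduces intensities below their direct values, the tax rate per sector reflects the whole upstream supply chain, which is what makes EIO suitable for an economy-wide tax.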
Abstract:
This is a survey of CASE tools that currently support OCL in the automatic generation of Java code. The tools are studied and analysed against a test model consisting of a class diagram of the UML static model and a varied sample of OCL expressions, with the aim of detecting their shortcomings: the generated code is analysed to determine whether or not each type of constraint is handled and whether the constraints are correctly implemented in the code.
Abstract:
In this paper we look at how web-based social software can be used to carry out qualitative data analysis of online peer-to-peer learning experiences. Specifically, we propose to use Cohere, a web-based social sense-making tool, to observe, track, annotate and visualize discussion group activities in online courses. We define a specific methodology for data observation and structuring, and present the results of an analysis of peer interactions conducted in the discussion forum of a real case study, a P2PU course. Finally, we discuss how network visualization and analysis can be used to gain a better understanding of the peer-to-peer learning experience. To do so, we provide preliminary insights into the social, dialogical and conceptual connections that have been generated within one online discussion group.
Abstract:
In an earlier investigation (Burger et al., 2000) five sediment cores near the Rodrigues Triple Junction in the Indian Ocean were studied applying classical statistical methods (fuzzy c-means clustering, linear mixing model, principal component analysis) for the extraction of endmembers and the evaluation of the spatial and temporal variation of geochemical signals. Three main factors of sedimentation were expected by the marine geologists: a volcano-genetic, a hydro-hydrothermal and an ultra-basic factor. The display of fuzzy membership values and/or factor scores versus depth provided consistent results for two factors only; the ultra-basic component could not be identified. The reason for this may be that only traditional statistical methods were applied, i.e. the untransformed components were used, with the cosine-theta coefficient as similarity measure. During the last decade considerable progress in compositional data analysis was made and many case studies were published using new tools for exploratory analysis of these data. Therefore it makes sense to check whether the application of suitable data transformations, reduction of the D-part simplex to two or three factors and visual interpretation of the factor scores would lead to a revision of the earlier results and to answers to open questions. In this paper we follow the lines of the paper of R. Tolosana-Delgado et al. (2005), starting with a problem-oriented interpretation of the biplot scattergram, extracting compositional factors, ilr-transforming the components and visualizing the factor scores in a spatial context: the compositional factors are plotted versus depth (time) of the core samples in order to facilitate the identification of the expected sources of the sedimentary process. Key words: compositional data analysis, biplot, deep sea sediments
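Of the classical methods named above, fuzzy c-means is the one that produces the membership values plotted versus depth. A minimal sketch on toy two-cluster data (the data are illustrative; a real analysis would run this on the geochemical core compositions):

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=200, seed=0):
    """Plain fuzzy c-means: cluster centres and membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))        # random start
    for _ in range(iters):
        W = U ** m                                    # fuzzified weights
        centres = (W.T @ X) / W.T.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        inv = (d + 1e-12) ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)      # standard FCM update
    return centres, U

# Two well-separated toy blobs around (0,0) and (1,1).
X = np.vstack([np.random.default_rng(1).normal(0.0, 0.1, (20, 2)),
               np.random.default_rng(2).normal(1.0, 0.1, (20, 2))])
centres, U = fuzzy_cmeans(X, c=2)
print(centres.round(2))
```

Each row of U sums to one, so a sample's memberships can be read as mixing proportions between endmembers, which is what was displayed against core depth in the earlier study.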
Abstract:
Factor analysis, as a frequent technique for multivariate data inspection, is also widely used for compositional data analysis. The usual way is to apply a centred logratio (clr) transformation to obtain the random vector y of dimension D. The factor model is then

y = Λf + e (1)

with the factors f of dimension k < D, the error term e, and the loadings matrix Λ. Using the usual model assumptions (see, e.g., Basilevsky, 1994), the factor analysis model (1) can be written as

Cov(y) = ΛΛᵀ + ψ (2)

where ψ = Cov(e) has a diagonal form. The diagonal elements of ψ, as well as the loadings matrix Λ, are estimated from an estimate of Cov(y), given observed clr-transformed data Y as realizations of the random vector y. Outliers or deviations from the idealized model assumptions of factor analysis can severely affect the parameter estimation. As a way out, robust estimation of the covariance matrix of Y leads to robust estimates of Λ and ψ in (2), see Pison et al. (2003). Well-known robust covariance estimators with good statistical properties, like the MCD or the S-estimators (see, e.g., Maronna et al., 2006), rely on a full-rank data matrix Y, which is not the case for clr-transformed data (see, e.g., Aitchison, 1986). The isometric logratio (ilr) transformation (Egozcue et al., 2003) solves this singularity problem. The data matrix Y is transformed to a matrix Z by using an orthonormal basis of lower dimension. Using the ilr-transformed data, a robust covariance matrix C(Z) can be estimated. The result can be back-transformed to the clr space by

C(Y) = V C(Z) Vᵀ

where the matrix V with orthonormal columns comes from the relation between the clr and the ilr transformation. Now the parameters in model (2) can be estimated (Basilevsky, 1994) and the results have a direct interpretation, since the links to the original variables are still preserved. The above procedure will be applied to data from geochemistry. Our special interest is in comparing the results with those of Reimann et al. (2002) for the Kola project data.
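The clr/ilr machinery above can be sketched in a few lines. The Helmert-type basis V and the toy compositions are standard illustrative choices made here, not the ones used for the Kola data:

```python
import numpy as np

def clr(X):
    """Centred logratio: log parts minus the row-wise log geometric mean."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

def helmert_basis(D):
    """D x (D-1) matrix V with orthonormal columns spanning the clr plane."""
    V = np.zeros((D, D - 1))
    for j in range(1, D):
        V[:j, j - 1] = 1.0 / np.sqrt(j * (j + 1))
        V[j, j - 1] = -j / np.sqrt(j * (j + 1))
    return V

def ilr(X):
    """Isometric logratio coordinates Z = clr(X) V, full rank."""
    return clr(X) @ helmert_basis(X.shape[1])

X = np.array([[0.2, 0.3, 0.5],        # toy 3-part compositions
              [0.1, 0.6, 0.3]])
V = helmert_basis(3)
Z = ilr(X)                            # full rank, suitable for MCD etc.
C_Z = np.cov(Z, rowvar=False)         # covariance in ilr coordinates
C_Y = V @ C_Z @ V.T                   # back-transformed: C(Y) = V C(Z) V'
print(np.allclose(Z @ V.T, clr(X)))
```

In a robust analysis, np.cov would be replaced by a robust estimator such as the MCD; the back-transformation C(Y) = V C(Z) Vᵀ is unchanged, which is the key point of the abstract.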