764 results for Idiosyncratic skewness


Relevance: 10.00%

Abstract:

Purpose of the paper – This research analyzes the impact of three types of embedded ties, namely specialized complementary resources, idiosyncratic investments, and knowledge sharing, on the innovation capacity of firms. We also study the particularities of the machine-tool industry. Theoretical background – Our evaluation of embedded buyer-supplier ties is based on the potential sources of relational rents proposed by Dyer and Singh (1998). We also draw on Uzzi and Lancaster (2003) and Noordhoff et al. (2011), among others, to discuss the positive and negative aspects of embedded ties. Design/Methodology/Approach – Using data from a survey of 202 European machine-tool firms acting as buyers and sellers, we propose and evaluate a structural equation model. Findings – Only knowledge-sharing routines exert a significant positive effect on product innovation performance. Neither an increase in idiosyncratic investments nor an increase in complementary resources and capabilities enhances innovation performance. Also, knowledge-sharing routines mediate the effect of idiosyncratic investments on innovation performance. Research limitations – The machine-tool industry has unique characteristics that make generalization difficult. It is also difficult to test more deeply how these embedded ties interrelate in the long run; plausibly, these interrelations operate as a gradual process. Originality/Value/Contribution of the paper – This research contributes to a better understanding of the role of embedded ties in innovativeness. To the best of our knowledge, there is no previous international empirical research analyzing the mediation effects among specialized complementary resources, idiosyncratic investments and knowledge sharing, and their effects on the innovation capacity of firms.

Relevance: 10.00%

Abstract:

Doctoral studies carried out jointly with the doctoral research work of Nicolas Leduc, a PhD student in computer engineering at the École Polytechnique de Montréal.

Relevance: 10.00%

Abstract:

POSTDATA is a 5-year European Research Council (ERC) Starting Grant project that started in May 2016 and is hosted by the Universidad Nacional de Educación a Distancia (UNED), Madrid, Spain. The context of the project is the corpora of European Poetry (EP), with a special focus on poetic materials from different languages and literary traditions. POSTDATA aims to offer a standardized model in the philological field and a metadata application profile (MAP) for EP in order to build a common classification of all these poetic materials. The information of the Spanish, Italian and French repertoires will be published in the Linked Open Data (LOD) ecosystem. Later we expect to extend the model to include additional corpora. There are a number of Web-based Information Systems (WIS) in Europe with repertoires of poems available for human consumption but not in an appropriate condition to be accessible and reusable by the Semantic Web. These systems are not interoperable; they are in fact locked in their databases and proprietary software, not suitable to be linked in the Semantic Web. A way to make these data interoperable is to develop a MAP that makes it possible to publish the data in the LOD ecosystem, and also to publish new data that will be created and modeled based on this MAP. Creating a common data model for EP is not simple, since the existing data models are based on conceptualizations and terminology belonging to their own poetical traditions, and each tradition has developed its own idiosyncratic analytical terminology independently over many years. The result of this uncoordinated evolution is a set of varied terminologies to explain analogous metrical phenomena across the different poetic systems, whose correspondences have hardly been studied – see examples in González-Blanco & Rodríguez (2014a and b). This work has to be done by domain experts before the modeling actually starts. On the other hand, the development of a MAP is a complex task, so it is imperative to follow a method for this development. In recent years Curado Malta & Baptista (2012, 2013a, 2013b) have been studying the development of MAPs within a Design Science Research (DSR) methodological process in order to define a method for the development of MAPs (see Curado Malta (2014)). The output of this DSR process was a first version of a method for the development of Metadata Application Profiles (Me4MAP) (paper to be published). The DSR process is now in the validation phase of the Relevance Cycle to validate Me4MAP. The development of this MAP for poetry will follow the guidelines of Me4MAP, and this development will be used to validate Me4MAP. The final goal of the POSTDATA project is: i) to publish all the data locked in the WIS as LOD, where any interested agent will be able to build applications over the data in order to serve end users; ii) to build a web platform where: a) researchers, students and other end users interested in EP will be able to access poems (and their analyses) from all the databases; b) researchers, students and other end users will be able to upload poems and the digitized images of manuscripts, and fill in the information concerning the analysis of the poem, collaboratively contributing to a LOD dataset of poetry.
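As a purely illustrative aside, a minimal sketch of what publishing a poem's metadata as Linked Open Data can look like follows; the namespace, class and properties are hypothetical placeholders, not the POSTDATA MAP, which is still under development.

# Illustrative only: describe one poem as RDF and serialize it as Turtle.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

EX = Namespace("http://example.org/poetry/")      # placeholder namespace, not a real MAP

g = Graph()
g.bind("dcterms", DCTERMS)
g.bind("ex", EX)

poem = URIRef("http://example.org/poetry/poem/1")
g.add((poem, RDF.type, EX.Poem))                                      # hypothetical class
g.add((poem, DCTERMS.title, Literal("Soneto XXIII")))
g.add((poem, DCTERMS.language, Literal("es")))
g.add((poem, EX.metricalScheme, Literal("hendecasyllabic sonnet")))   # hypothetical property

print(g.serialize(format="turtle"))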

Relevance: 10.00%

Abstract:

Understanding how imperfect information affects firms' investment decisions helps answer important questions in economics, such as how we may better measure economic uncertainty; how firms' forecasts affect their decision-making when their beliefs are not backed by economic fundamentals; and how important the business-cycle impacts of changes in firms' productivity uncertainty are in an environment of incomplete information. This dissertation provides a synthetic answer to all these questions, both empirically and theoretically. The first chapter provides empirical evidence that survey-based forecast dispersion identifies a distinctive type of second-moment shock, different from the canonical volatility shocks to productivity, i.e. uncertainty shocks. Such forecast-disagreement disturbances can affect the distribution of firm-level beliefs regardless of whether or not belief changes are backed by changes in economic fundamentals. At the aggregate level, innovations that increase the dispersion of firms' forecasts lead to persistent declines in aggregate investment and output, followed by a slow recovery. By contrast, a larger dispersion of future firm-specific productivity innovations, the standard way of measuring economic uncertainty, delivers the "wait and see" effect: aggregate investment experiences a sharp decline, followed by a quick rebound, and then overshoots. At the firm level, the data show that more productive firms increase investment when the dispersion of future productivity rises, whereas investment drops when firms disagree more about their future business conditions. These findings challenge the view that the dispersion of firms' heterogeneous beliefs captures the concept of economic uncertainty as defined by a model of uncertainty shocks. The second chapter presents a general equilibrium model of heterogeneous firms subject to real productivity uncertainty shocks and informational disagreement shocks. Because firms cannot perfectly disentangle aggregate from idiosyncratic productivity under imperfect information, information quality drives the wedge between the unobserved productivity fundamentals and the firms' beliefs about how productive they are. The distribution of firms' beliefs is no longer perfectly aligned with the distribution of firm-level productivity across firms. This model not only explains why, at the macro and micro level, disagreement shocks differ from uncertainty shocks, as documented in Chapter 1, but also helps resolve a key challenge faced by the standard framework for studying economic uncertainty: a trade-off between sizable business-cycle effects due to changes in uncertainty and the right amount of pro-cyclicality of firm-level investment-rate dispersion, as measured by its correlation with the output cycle.

Relevance: 10.00%

Abstract:

The pharmaceutical industry is knowledge- and research-intensive. Due to technological, socio-political and organisational changes, there has been a continuous evolution in the knowledge base utilized to achieve and maintain competitive advantages in this global industry. There is a gap in analysing the linkages and effects of those changes on the knowledge creation processes associated with pharmaceutical R&D activities. Our paper looks to fill this gap. We built on an idiosyncratic research approach – the systematic literature review – and sought to unearth current trends affecting knowledge creation in international/global pharmaceutical R&D. We reviewed scientific papers published between 1980 and 2005. Key findings include promising trends in pharmaceutical innovation and human resource management, and their potential implications for current R&D practices within the pharmaceutical industry, from managerial and policy-making perspectives.

Relevance: 10.00%

Abstract:

Black women cultural entrepreneurs are a group of entrepreneurs that merits further inquiry. Using qualitative interview and participant observation data, this dissertation investigates the ways in which black women cultural entrepreneurs define success. My findings reveal that black women cultural entrepreneurs are a particular interpretive community whose values, perspectives and experiences are not wholly idiosyncratic, but shaped by collective experiences and larger social forces. Black women are not a monolith, but neither are they disconnected individuals completely devoid of group identity. The meaning they give to their businesses, professional experiences and understandings of success is influenced by their shared social position and identity as black women. For black women cultural entrepreneurs, the New Bottom Line goes beyond financial gain. This group, while not uniform in its understandings of success, largely understands the most meaningful accomplishments it can realize as social impact in the form of cultural intervention, black community uplift and professional/creative agency. These particular considerations represent a new paramount concern, and an alternative understanding of what is typically understood as the bottom line. The structural, social and personal challenges that black women cultural entrepreneurs encounter have shaped their particular perspectives on success. I also explore the ways in which research participants articulated an oppositional consciousness to create an alternative means of defining and achieving success. I argue that this consciousness empowers them with resources, connections and meaning not readily conferred in traditional entrepreneurial settings. In this sense, the personal, social and structural challenges have been foundational to the formation of an alternative economy, which I refer to as The Connected Economy. In leading and participating in The Connected Economy, black women cultural entrepreneurs represent a black feminist and womanist critique of dominant understandings of success.

Relevance: 10.00%

Abstract:

This exhibition expands upon the history and approaches of experimental film and image making. Through speculative and abstract approaches, the artists appropriate images to deal with trauma, confusion and nostalgia. The artists in this exhibition use personal imagery to demonstrate abstract ideals and idiosyncratic perspectives. Work in the show will be made up of photographic prints, collage, 16mm film and video work. Through physical manipulations of the image surface, retrenching of forgotten archives and poetic layerings of time and place, this exhibition aims to examine the non-linear and personal ways artists can experiment with the image. By incorporating the work of long-standing artists Dirk De Bruyn and Luigi Fusinato in contrast with the work of young artists Anna Higgins and Beth Caird, the exhibition will examine the relationship between experimental film from a pre-digital context and how it influences, echoes and evolves in a post-digital environment.

Relevance: 10.00%

Abstract:

Despite the volume of work that has been conducted on the topic, the role of surface topography in mediating bacterial cell adhesion is not well understood. The primary reason for this lack of understanding is the relatively limited extent of topographical characterisation employed in many studies. In the present study, the topographies of three sub-nanometrically smooth titanium (Ti) surfaces were comprehensively characterised, using nine individual parameters that together describe the height, shape and distribution of their surface features. This topographical analysis was then correlated with the adhesion behaviour of the pathogenic bacteria Staphylococcus aureus and Pseudomonas aeruginosa, in an effort to understand the role played by each aspect of surface architecture in influencing bacterial attachment. While P. aeruginosa was largely unable to adhere to any of the three sub-nanometrically smooth Ti surfaces, the extent of S. aureus cell attachment was found to be greater on surfaces with higher average, RMS and maximum roughness and higher surface areas. The cells also attached in greater numbers to surfaces that had shorter autocorrelation lengths and skewness values that approached zero, indicating a preference for less ordered surfaces with peak heights and valley depths evenly distributed around the mean plane. Across the sub-nanometrically smooth range of surfaces tested, it was shown that S. aureus more easily attached to surfaces with larger features that were evenly distributed between peaks and valleys, with higher levels of randomness. This study demonstrated that the traditionally employed amplitudinal roughness parameters are not the only determinants of bacterial adhesion, and that spatial parameters can also be used to predict the extent of attachment.
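For illustration only (this is not the study's analysis code), a minimal sketch of how the amplitude parameters correlated with S. aureus attachment can be computed from a hypothetical height map follows; the synthetic surface and its units are assumptions.

# Illustrative computation of Ra, Rq, Rmax and Rsk from a surface height array.
import numpy as np

def roughness_parameters(z):
    """Basic roughness statistics for a surface height array z (heights in nm)."""
    z = np.asarray(z, dtype=float)
    dev = z - z.mean()                 # deviations from the mean plane
    ra = np.abs(dev).mean()            # average roughness Ra
    rq = np.sqrt((dev ** 2).mean())    # RMS roughness Rq
    rmax = z.max() - z.min()           # maximum peak-to-valley height
    rsk = (dev ** 3).mean() / rq ** 3  # skewness Rsk: near zero means peaks and
                                       # valleys are evenly spread about the mean plane
    return {"Ra": ra, "Rq": rq, "Rmax": rmax, "Rsk": rsk}

# Example on a synthetic sub-nanometre-rough surface (values illustrative only)
rng = np.random.default_rng(0)
surface = 0.3 * rng.standard_normal((256, 256))
print(roughness_parameters(surface))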

Relevance: 10.00%

Abstract:

Since asset returns have been recognized as not being normally distributed, a line of research on portfolio higher moments soon emerged. To account for the uncertainty and vagueness of portfolio returns as well as of higher-moment risks, we propose in this paper a new portfolio selection model employing fuzzy sets. A fuzzy multi-objective linear programming (MOLP) problem for portfolio optimization is formulated using the marginal impacts of assets on portfolio higher moments, which are modelled by trapezoidal fuzzy numbers. Through a consistent centroid-based ranking of fuzzy numbers, the fuzzy MOLP is transformed into an MOLP that is then solved by the maximin method. By taking portfolio higher moments into account, the approach enables investors to optimize not only the normal risk (variance) but also the asymmetric risk (skewness) and the risk of fat tails (kurtosis). An illustrative example demonstrates the efficiency of the proposed methodology compared to previous portfolio optimization models.
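As a hedged illustration of the quantities such a model trades off (not the paper's fuzzy formulation), the sketch below computes the mean, variance, skewness and kurtosis of a portfolio return series for a given weight vector; the simulated returns and equal weights are assumptions.

# Illustrative portfolio higher moments for a T x N matrix of asset returns.
import numpy as np

def portfolio_moments(returns, weights):
    """returns: T x N matrix of asset returns; weights: length-N vector summing to 1."""
    w = np.asarray(weights, dtype=float)
    rp = np.asarray(returns) @ w               # portfolio return series
    mu = rp.mean()
    dev = rp - mu
    var = (dev ** 2).mean()                    # normal risk (variance)
    skew = (dev ** 3).mean() / var ** 1.5      # asymmetric risk (skewness)
    kurt = (dev ** 4).mean() / var ** 2        # fat-tail risk (kurtosis)
    return mu, var, skew, kurt

# Illustrative use with simulated fat-tailed returns for four assets
rng = np.random.default_rng(1)
R = rng.standard_t(df=5, size=(500, 4)) * 0.01
print(portfolio_moments(R, [0.25, 0.25, 0.25, 0.25]))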

Relevance: 10.00%

Abstract:

This essay examines Neil Taylor's (1945-) animations, situated between the moving image, performance and sculpture, and in the shadows of his recognised wire-based sculptural practice. Taylor's animations automatically inscribe the surfaces of flipbooks and notepads (Short Lives, 1980-90) and cash register rolls (Roll Film, 1990), and are enhanced by hand-made 'machines' (Copy Copy, 1998) designed to shape this idiosyncratic activity. These short films are part of an avant-garde project 'that continues to explore the physical properties of film and the nature of perceptual transactions which take place between viewer and film' (John Hanhardt, 1976: 44). Taylor's practice is additionally placed, through Vilém Flusser's 'technical image', in relation to the ascendancy of digital culture, and Pierre Bourdieu's 'habitus' is used to frame both Taylor's art and teaching practice, in order to examine the discounting of the technical classes that these fields of production consistently perform.

Relevance: 10.00%

Abstract:

This dissertation is a collection of three economics essays on different aspects of carbon emission trading markets. The first essay analyzes the dynamic optimal emission control strategies of two nations. With the potential to become the largest buyer under the Kyoto Protocol, the US is assumed to be a monopsony, whereas with a large number of tradable permits on hand, Russia is assumed to be a monopoly. Optimal costs of emission control programs are estimated for both countries under four different market scenarios: non-cooperative no trade, US monopsony, Russia monopoly, and cooperative trading. The US monopsony scenario is found to be the most Pareto cost-efficient. The Pareto-efficient outcome, however, would require the US to make side payments to Russia, which would even out the differences in the cost savings from cooperative behavior. The second essay analyzes the price dynamics of the Chicago Climate Exchange (CCX), a voluntary emissions trading market. By examining the volatility in market returns using AR-GARCH and Markov switching models, the study associates the market price fluctuations with two different political regimes of the US government. Further, the study identifies high volatility in returns a few months before the market collapse. Three possible regulatory and market-based forces are identified as probable causes of market volatility and its ultimate collapse. Organizers of other voluntary markets in the US and worldwide may closely watch for these regime-switching forces in order to avert emission market crashes. The third essay compares excess skewness and kurtosis in carbon prices between the CCX and the EU ETS (European Union Emission Trading Scheme) Phase I and II markets, by examining the tail behavior when market expectations exceed the threshold level. Dynamic extreme value theory is used to find the mean price exceedance of the threshold levels and to estimate the risk of loss. The calculated risk measures suggest that the CCX and EU ETS Phase I are extremely immature markets for a risk investor, whereas EU ETS Phase II is a more stable market that could develop into a mature carbon market in future years.
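A minimal, hypothetical sketch of the peaks-over-threshold idea behind the third essay's extreme value analysis is given below: exceedances over a high threshold are extracted, their mean exceedance is computed, and a generalized Pareto distribution is fitted to the tail; the simulated returns and the 95% threshold are assumptions, not the dissertation's data.

# Illustrative peaks-over-threshold tail analysis with a generalized Pareto fit.
import numpy as np
from scipy.stats import genpareto

def tail_risk(returns, quantile=0.95):
    r = np.asarray(returns, dtype=float)
    u = np.quantile(r, quantile)                        # threshold level
    excess = r[r > u] - u                               # exceedances over the threshold
    mean_excess = excess.mean()                         # mean exceedance of the threshold
    shape, loc, scale = genpareto.fit(excess, floc=0)   # GPD fit to the tail
    return u, mean_excess, shape, scale

# Illustrative use on simulated fat-tailed "carbon returns"
rng = np.random.default_rng(2)
sim_returns = rng.standard_t(df=4, size=2000) * 0.02
print(tail_risk(sim_returns))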

Relevance: 10.00%

Abstract:

This paper empirically investigates volatility transmission among stock and foreign exchange markets in seven major world economies over the period July 1988 to January 2015. To this end, we first perform a static and dynamic analysis to measure total volatility connectedness over the entire period (the system-wide approach) using a framework recently proposed by Diebold and Yilmaz (2014). Second, we use a dynamic analysis to evaluate the net directional connectedness for each market. To gain further insights, we examine the time-varying behaviour of net pairwise directional connectedness during the financial turmoil episodes experienced in the sample period. Our results suggest that slightly more than half of the total variance of the forecast errors is explained by shocks across markets rather than by idiosyncratic shocks. Furthermore, we find that volatility connectedness varies over time, with a surge during periods of increasing economic and financial instability.
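To illustrate the system-wide measure (under assumed inputs, not the paper's estimated model), the sketch below computes the Diebold-Yilmaz total connectedness index from a forecast-error variance decomposition matrix: the share of variance explained by shocks from other markets rather than by own, idiosyncratic shocks.

# Illustrative total connectedness index from a variance decomposition matrix.
import numpy as np

def total_connectedness(theta):
    """theta: N x N variance decomposition matrix; theta[i, j] is the share of market i's
    forecast-error variance attributed to shocks in market j."""
    theta = np.asarray(theta, dtype=float)
    theta = theta / theta.sum(axis=1, keepdims=True)   # row-normalize the decomposition
    n = theta.shape[0]
    cross = theta.sum() - np.trace(theta)              # variance due to other markets' shocks
    return 100.0 * cross / n                           # total connectedness, in percent

# Illustrative three-market example: roughly half of the variance is cross-market
example = np.array([[0.55, 0.25, 0.20],
                    [0.30, 0.50, 0.20],
                    [0.20, 0.25, 0.55]])
print(total_connectedness(example))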

Relevance: 10.00%

Abstract:

What have we learnt from the 2006-2012 crisis, including events such as the subprime crisis, the bankruptcy of Lehman Brothers and the European sovereign debt crisis, among others? It is usually assumed that for firms with a quoted CDS, this CDS is the key factor in establishing the credit risk premium for a new financial asset. Thus, the CDS is a key element for any investor seeking relative-value opportunities across a firm's capital structure. In the first chapter we study the most relevant aspects of the microstructure of the CDS market in terms of pricing, to obtain a clear idea of how this market works. We consider such an analysis a necessary step in establishing a solid base for the empirical studies carried out in the remaining chapters. In its document "Basel III: A global regulatory framework for more resilient banks and banking systems", the Basel Committee sets the requirement of a capital charge for credit valuation adjustment (CVA) risk in the trading book, together with the methodology for computing the capital requirement. This regulatory requirement has added extra pressure for in-depth knowledge of the CDS market, and this motivates the analysis performed in this thesis. The problem arises when estimating the credit risk premium for counterparties without a directly quoted CDS in the market. How can we estimate the credit spread for an issuer without a CDS? In addition, given the period of high volatility in the credit market in recent years and, in particular, after the default of Lehman Brothers on 15 September 2008, we observe the presence of large outliers in the distribution of credit spreads across the different combinations of rating, industry and region. After an exhaustive analysis of the results from the different models studied, we have reached the following conclusions. It is clear that hierarchical regression models fit the data much better than non-hierarchical regressions. Furthermore, we generally prefer the median model (50%-quantile regression) to the mean model (standard OLS regression) due to its robustness when assigning a price to a new credit asset without a quoted spread, minimizing the "inversion problem". Finally, an additional fundamental reason to prefer the median model is the typical right-skewed distribution of CDS spreads...
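As a hedged sketch of the mean-versus-median comparison (with illustrative variable names and simulated data, not the thesis dataset), the code below fits an OLS model and a 50%-quantile regression to right-skewed synthetic spreads; the median fit is less sensitive to the large outliers in the right tail.

# Illustrative comparison of a mean (OLS) model and a median (quantile) model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 400
rating_score = rng.integers(1, 10, size=n).astype(float)          # hypothetical rating proxy
log_spread = 3.0 + 0.25 * rating_score + rng.lognormal(0.0, 0.8, size=n)  # right-skewed noise

X = sm.add_constant(rating_score)
mean_model = sm.OLS(log_spread, X).fit()             # mean model (standard OLS)
median_model = sm.QuantReg(log_spread, X).fit(q=0.5) # median model (50%-quantile regression)

print("OLS coefficients:   ", mean_model.params)
print("Median coefficients:", median_model.params)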

Relevance: 10.00%

Abstract:

Estimating the process capability index (PCI) for non-normal processes has been discussed by many researchers. There are two basic approaches to estimating the PCI for non-normal processes. The first, commonly used approach is to transform the non-normal data into normal data using transformation techniques and then use a conventional normal method to estimate the PCI for the transformed data. This is a straightforward approach and is easy to deploy. The alternative approach is to use non-normal percentiles to calculate the PCI. The latter approach is not easy to implement, and a deviation in estimating the distribution of the process may affect the efficacy of the estimated PCI. The aim of this paper is to estimate the PCI for non-normal processes using a transformation technique called the root transformation. The efficacy of the proposed technique is assessed by conducting a simulation study using gamma, Weibull, and beta distributions. The root transformation technique is used to estimate the PCI for each set of simulated data. These results are then compared with the PCI obtained using exact percentiles and the Box-Cox method. Finally, a case study based on real-world data is presented.
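A minimal sketch of the root-transformation idea (not the paper's exact procedure) is shown below: the skewed data and specification limits are mapped by x -> x**(1/r), and the conventional normal Cpk formula is applied to the transformed values; the choice of r and the gamma example are assumptions.

# Illustrative Cpk estimation for non-normal data via a root transformation.
import numpy as np

def cpk(data, lsl, usl):
    """Conventional normal process capability index."""
    mu, sigma = data.mean(), data.std(ddof=1)
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

def root_transform_cpk(data, lsl, usl, r=2.0):
    """Cpk after transforming data and spec limits with x -> x**(1/r)."""
    t = np.power(data, 1.0 / r)
    return cpk(t, lsl ** (1.0 / r), usl ** (1.0 / r))

# Illustrative use on right-skewed gamma data, as in the simulation study
rng = np.random.default_rng(4)
x = rng.gamma(shape=2.0, scale=1.0, size=1000)
print(root_transform_cpk(x, lsl=0.1, usl=9.0, r=2.0))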