922 results for "tegumento seminal"


Relevance:

10.00%

Publisher:

Abstract:

The integrability of the nonlinear Schrödinger equation (NLSE) by the inverse scattering transform, shown in a seminal work [1], opened an interesting opportunity to treat the corresponding nonlinear channel similarly to a linear one by using the nonlinear Fourier transform. The integrability of the NLSE underlies the old idea of eigenvalue communications [2], which was resurrected in recent works [3–7]. In [6, 7] a new method for coherent optical transmission employing the continuous nonlinear spectral data (nonlinear inverse synthesis) was introduced. It assumes the modulation and detection of data directly using the continuous part of the nonlinear spectrum associated with an integrable transmission channel (the NLSE in the case considered). Although such a transmission method is inherently free from nonlinear impairments, noisy signal corruptions, arising due to amplifier spontaneous emission, inevitably degrade optical system performance. We study the properties of the noise-corrupted channel model in the nonlinear spectral domain attributed to the NLSE. We derive the general stochastic equations governing the signal evolution inside the nonlinear spectral domain and elucidate the properties of the emerging nonlinear spectral noise using well-established methods of perturbation theory based on the inverse scattering transform [8]. It is shown that in the presence of small noise the communication channel in the nonlinear domain is an additive Gaussian channel with memory and a signal-dependent correlation matrix. We demonstrate that the effective spectral noise acquires "colouring": its autocorrelation function becomes slowly decaying and non-diagonal as a function of "frequencies", and the noise loses its circular symmetry, becoming elliptically polarized. We then derive a lower bound for the spectral efficiency of such a channel. Our main result is that by using nonlinear spectral techniques one can significantly increase the achievable spectral efficiency compared to the currently available methods [9].

REFERENCES
1. Zakharov, V. E. and A. B. Shabat, Sov. Phys. JETP, Vol. 34, 62–69, 1972.
2. Hasegawa, A. and T. Nyu, J. Lightwave Technol., Vol. 11, 395–399, 1993.
3. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4312–4328, 2014.
4. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4329–4345, 2014.
5. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4346–4369, 2014.
6. Prilepsky, J. E., S. A. Derevyanko, K. J. Blow, I. Gabitov, and S. K. Turitsyn, Phys. Rev. Lett., Vol. 113, 013901, 2014.
7. Le, S. T., J. E. Prilepsky, and S. K. Turitsyn, Opt. Express, Vol. 22, 26720–26741, 2014.
8. Kaup, D. J. and A. C. Newell, Proc. R. Soc. Lond. A, Vol. 361, 413–446, 1978.
9. Essiambre, R.-J., G. Kramer, P. J. Winzer, G. J. Foschini, and B. Goebel, J. Lightwave Technol., Vol. 28, 662–701, 2010.
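For reference, the channel model behind this analysis is the noise-perturbed NLSE. The normalized focusing form below is the one commonly used in this literature (normalization conventions differ between papers, so this is a sketch rather than the paper's exact equation), with q(t, z) the complex field envelope, z the propagation distance, t the retarded time, and η(t, z) the additive white Gaussian noise produced by amplifier spontaneous emission:

\[
  i\,\frac{\partial q}{\partial z} + \frac{1}{2}\,\frac{\partial^2 q}{\partial t^2} + |q|^2 q = \eta(t, z),
  \qquad
  \langle \eta(t, z)\,\eta^*(t', z') \rangle = 2D\,\delta(t - t')\,\delta(z - z'),
\]

where D is the noise intensity per unit length. The perturbative analysis of the nonlinear spectral domain described in the abstract starts from a model of this kind.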

Relevance:

10.00%

Publisher:

Abstract:

Ten years ago, Bowen and Ostroff (2004) criticized the one-sided focus on the content-based approach, in which researchers take into account the inherent virtues (or vices) associated with the content of HR practices to explain performance. They explicitly highlighted the role of the psychological processes through which employees attach meaning to HRM. In this first article of the special section entitled "Is the HRM Process Important?", we present an overview of past, current, and future challenges. For past challenges, we attempt to categorize the various research streams that originated from the seminal piece. To outline current challenges, we present the results of a content analysis of the original 15 articles put forward for the special section. In addition, we provide an overview of a caucus focused on this theme that was held at the Academy of Management annual meeting in Boston in 2012. In conclusion, we discuss future challenges relating to the HRM process approach and review the contributions that have been selected, from a competitive field, for this special issue.

Relevance:

10.00%

Publisher:

Abstract:

The seminal multiple-view stereo benchmark evaluations from Middlebury and by Strecha et al. have played a major role in propelling the development of multi-view stereopsis (MVS) methodology. The somewhat small size and variability of these data sets, however, limit their scope and the conclusions that can be derived from them. To facilitate further development within MVS, we here present a new and varied data set consisting of 80 scenes, seen from 49 or 64 accurate camera positions. This is accompanied by accurate structured light scans for reference and evaluation. In addition, all images are taken under seven different lighting conditions. As a benchmark, and to validate the use of our data set for obtaining reasonable and statistically significant findings about MVS, we have applied the three state-of-the-art MVS algorithms by Campbell et al., Furukawa et al., and Tola et al. to the data set. To do this we have extended the evaluation protocol from the Middlebury evaluation, necessitated by the more complex geometry of some of our scenes. The data set and accompanying evaluation framework are made freely available online. Based on this evaluation, we are able to observe several characteristics of state-of-the-art MVS, e.g. that there is a tradeoff between the quality of the reconstructed 3D points (accuracy) and how much of an object's surface is captured (completeness). Also, several issues that we hypothesized would challenge MVS, such as specularities and changing lighting conditions, did not pose serious problems. Our study finds that the two most pressing issues for MVS are lack of texture and meshing (forming 3D points into closed triangulated surfaces).
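To make the accuracy/completeness trade-off discussed above concrete, here is a minimal Python sketch of the usual point-cloud evaluation logic, assuming numpy/scipy and two (N, 3) point arrays. The actual benchmark protocol (observability masks, outlier thresholds, per-scene statistics) is more involved, so this is illustrative only.

import numpy as np
from scipy.spatial import cKDTree

def accuracy_completeness(reconstruction, reference):
    # accuracy: distance from each reconstructed point to the reference scan
    # completeness: distance from each reference point to the reconstruction
    acc_dist, _ = cKDTree(reference).query(reconstruction)
    comp_dist, _ = cKDTree(reconstruction).query(reference)
    # Median distances as robust summaries; the published protocol may use
    # different statistics and truncation thresholds.
    return np.median(acc_dist), np.median(comp_dist)

Low accuracy distances indicate precise 3D points, while low completeness distances indicate good surface coverage; improving one typically comes at the expense of the other.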

Relevance:

10.00%

Publisher:

Abstract:

Convergence has been a popular theme in applied economics since the seminal papers of Barro (1991) and Barro and Sala-i-Martin (1992). The very notion of convergence quickly becomes problematic from an academic viewpoint, however, when we try to formalise a framework for thinking about these issues. In light of the abundance of available convergence concepts, it would be useful to have a more universal framework that encompasses existing concepts as special cases. Moreover, much of the convergence literature has treated the issue as a zero-one outcome. We argue that it is more sensible and useful for policy makers and academic researchers to also consider ongoing convergence over time. Assessing the progress of ongoing convergence is one interesting and important means of evaluating whether the Eastern European New Member Countries (NMC) of the European Union (EU) are getting closer to being deemed "ready" to join the European Monetary Union (EMU), that is, to fulfilling the Maastricht convergence criteria.
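As a point of reference for the convergence concepts discussed above, the canonical cross-section β-convergence regression associated with Barro (1991) and Barro and Sala-i-Martin (1992) can be sketched as

\[
  \frac{1}{T}\,\ln\!\left(\frac{y_{i,T}}{y_{i,0}}\right) \;=\; \alpha + \beta \ln y_{i,0} + \varepsilon_i ,
\]

where y_{i,0} and y_{i,T} are the initial and final per-capita income of economy i over a span of T years, and a significantly negative β is read as evidence of (absolute) convergence. This is a stylised textbook version, not the framework proposed in the paper, which is concerned precisely with moving beyond such zero-one verdicts.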

Relevance:

10.00%

Publisher:

Abstract:

According to the textbook approach, the developmental states of the Far East have been considered strong and autonomous entities. Although their bureaucratic elites have remained isolated from direct pressures stemming from society, state capacity has also been used to allocate resources in the interest of society as a whole. Yet society, by and large, has remained weak and subordinated to the state elite. The general perception of Sub-Saharan Africa (SSA), on the other hand, has been just the opposite. The violent and permanent conflict among rent-seeking groups for influence and authority over resources has culminated in a situation where states have become extremely weak and fragmented, while society, depending on the capacity of competing groups to mobilise resources and organise themselves, mostly at a regional or local level (resulting in local petty kingdoms), has never had the chance to evolve as a strong player. In the literature on SSA, therefore, state failure refers not just to a weak and captured state but also to a non-functioning, and sometimes even non-existent, society. Recently, however, the driving forces of globalisation may have triggered serious changes in the status quo described above. Accordingly, our hypothesis is the following: globalisation, especially the dynamic changes in technology, capital and communication, has made the simplistic "strong state–weak society" (in Asia) and "weak state–weak society" (in Africa) categorisation somewhat obsolete. While our comparative study places a strong emphasis on empirical scrutiny, trying to uncover the dynamics of change in state–society relations in the two chosen regions both qualitatively and quantitatively, it also aims at complementing the meaning and essence of the concepts and methodology of stateness, state capacity and state–society relations, the well-known building blocks of the seminal works of Evans (1995), Leftwich (1995), Migdal (1988) and Myrdal (1968).

Relevance:

10.00%

Publisher:

Abstract:

This paper investigates the impact of state subsidy on the behavior of the entrepreneur under asymmetric information. Several authors have formulated concerns about state intervention, as it can aggravate moral hazard in corporate financing. In the seminal paper of Holmström and Tirole (1997), a two-player moral hazard model is presented with an entrepreneur initiating a risky scalable project and a private investor (e.g., a bank or venture capitalist) providing outside financing. The novelty of our research is that this basic moral hazard model is extended to the case of positive externalities and to three players, by introducing a state that subsidizes the project. It is shown that, in the optimum, state subsidy does not harm but rather improves the incentives of the entrepreneur to make an effort for the success of the project; hence, in effect, state intervention reduces moral hazard. Consequently, state subsidy increases social welfare, which is defined as the sum of private and public net benefits. Also, the exact form of the state subsidy (ex-ante/ex-post, conditional/unconditional, refundable/nonrefundable) is irrelevant with respect to the optimal size and the total welfare effect of the project. Moreover, in the case of nonrefundable subsidies the state does not crowd out private investors; on the contrary, by providing additional capital it boosts private financing. In the case of refundable subsidies some crowding-out effects may occur, depending on the subsidy form and the parameters.
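For readers unfamiliar with the baseline setup, the moral-hazard friction in Holmström and Tirole (1997) can be sketched as follows; the notation here is generic and not necessarily the paper's. The project succeeds with probability p_H if the entrepreneur exerts effort and p_L < p_H if she shirks, shirking yields a private benefit B, and incentive compatibility requires the entrepreneur's payoff in case of success, R_E, to satisfy

\[
  p_H R_E \;\ge\; p_L R_E + B
  \quad\Longleftrightarrow\quad
  R_E \;\ge\; \frac{B}{\Delta p}, \qquad \Delta p = p_H - p_L .
\]

The extension studied in this paper adds a subsidizing state as a third player and examines how the subsidy affects this incentive constraint and the resulting financing capacity.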

Relevance:

10.00%

Publisher:

Abstract:

Since the seminal works of Markowitz (1952), Sharpe (1964), and Lintner (1965), numerous studies on portfolio selection and performance measurement have been based upon the mean-variance framework. However, several researchers (e.g., Arditti (1967, 1971), Samuelson (1970), and Rubinstein (1973)) argue that the higher moments cannot be neglected unless there is reason to believe that (i) the asset returns are normally distributed and the investor's utility function is quadratic, or (ii) the empirical evidence demonstrates that higher moments are irrelevant to the investor's decision. Based on the same argument, this dissertation investigates the impact of higher moments of return distributions on three issues concerning 14 international stock markets. First, portfolio selection with skewness is determined using Polynomial Goal Programming, in which investor preferences for skewness can be incorporated. The empirical findings suggest that the return distributions of international stock markets are not normally distributed, and that the incorporation of skewness into an investor's portfolio decision causes a major change in the construction of the optimal portfolio. The evidence also indicates that an investor will trade expected return of the portfolio for skewness. Moreover, when short sales are allowed, investors are better off as they attain higher expected return and skewness simultaneously. Second, the performance of international stock markets is evaluated using two types of performance measures: (i) the two-moment performance measures of Sharpe (1966) and Treynor (1965), and (ii) the higher-moment performance measures of Prakash and Bear (1986) and Stephens and Proffitt (1991). The empirical evidence indicates that higher moments of return distributions are significant and relevant to the investor's decision. Thus, the higher-moment performance measures should be more appropriate for evaluating the performance of international stock markets. The evidence also indicates that the various measures provide vastly different performance rankings of the markets, albeit in the same direction. Finally, the inter-temporal stability of the international stock markets is investigated using the Parhizgari and Prakash (1989) algorithm for the Sen and Puri (1968) test, which accounts for non-normality of return distributions. The empirical findings indicate that there is strong evidence to support stability in international stock market movements. However, when the Anderson test, which assumes normality of return distributions, is employed, stability in the correlation structure is rejected. This suggests that non-normality of the return distribution is an important factor that cannot be ignored in the investigation of the inter-temporal stability of international stock markets.
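As background for the first essay, the mean-variance-skewness Polynomial Goal Programming (PGP) problem can be sketched in the spirit of Lai (1991); the exact objective and constraints used in the dissertation (e.g., short-sales restrictions) may differ. With portfolio weights x, mean return vector \bar{R}, covariance matrix \Sigma, and Z_1^* and Z_3^* the separately maximised expected return and skewness at unit variance, the programme minimises the goal deviations d_1 and d_3:

\[
\begin{aligned}
  \min_{x,\,d_1,\,d_3}\;\; & d_1^{\,p} + d_3^{\,q} \\
  \text{s.t.}\;\; & x'\bar{R} + d_1 = Z_1^{*}, \qquad
  \mathbb{E}\!\left[\big(x'(R-\bar{R})\big)^{3}\right] + d_3 = Z_3^{*}, \\
  & x'\Sigma x = 1, \qquad d_1 \ge 0,\; d_3 \ge 0 ,
\end{aligned}
\]

where the exponents p and q encode the investor's relative preference for expected return versus skewness.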

Relevance:

10.00%

Publisher:

Abstract:

This dissertation examines the monetary models of exchange rate determination for Brazil, Canada, and two countries in the Caribbean, namely the Dominican Republic and Jamaica. With the exception of Canada, these countries adopted floating regimes during the past ten years. The empirical validity of four seminal models in exchange rate economics was determined. Three of these models were entirely classical (Bilson and Frenkel) or Keynesian (Dornbusch) in nature. The fourth model (the Real Interest Differential model) was a mixture of the two schools of economic theory. There is no clear empirical evidence of the validity of the monetary models. However, the signs of the coefficients of the nominal interest differential variable were as predicted by the Keynesian hypothesis in the case of Canada and as predicted by the Chicago theorists in the remaining countries. Moreover, in the case of Brazil, due to hyperinflation, the exchange rate is heavily influenced by the domestic money supply. I also tested purchasing power parity (PPP) for this same set of countries. For both the monetary and the PPP hypotheses, I tested for co-integration and applied the ordinary least squares estimation procedure. The error correction model was also used for the PPP model to determine convergence to equilibrium. The validity of PPP is also questionable for my set of countries. Endogeneity among the regressors as well as the lack of proper price indices are the contributing factors. More importantly, Central Bank intervention negates rapid adjustment of prices and exchange rates to their equilibrium values. However, the PPP model's forecasting capability for the period 1993-1994 is superior to that of the monetary models in two of the four cases. I conclude that, in spite of the questionable validity of these models, the monetary models give better results in the case of "smaller" economies like the Dominican Republic and Jamaica, where monetary influences swamp the other determinants of the exchange rate.
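As background, the reduced forms being tested are variants of the monetary equation of exchange rate determination. The flexible-price (Frenkel-Bilson) version can be sketched as

\[
  s_t \;=\; (m_t - m_t^{*}) \;-\; \phi\,(y_t - y_t^{*}) \;+\; \lambda\,(i_t - i_t^{*}) \;+\; u_t ,
\]

where s_t is the log exchange rate (domestic price of foreign currency), m, y and i are the log money supply, log real income and the nominal interest rate, and asterisks denote the foreign country. The Dornbusch sticky-price and real-interest-differential variants alter the sign or composition of the interest-rate term, which is why the sign of that coefficient discriminates between the Keynesian and Chicago predictions mentioned above. This is a textbook sketch, not the dissertation's exact specification.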

Relevance:

10.00%

Publisher:

Abstract:

This paper explores the development of employee engagement through a historical lens. Using a structured literature review, seminal works are identified and reviewed, and a working definition of employee engagement is proposed.

Relevance:

10.00%

Publisher:

Abstract:

In his discourse - The Chef In Society: Origins And Development - Marcel R. Escoffier, Graduate Student, School of Hospitality Management at Florida International University, initially offers: "The role of the modern professional chef has its origins in ancient Greece. The author traces that history and looks at the evolution of the executive chef as a manager and administrator." "Chefs, as tradespersons, can trace their origins to ancient Greece," the author offers with citation. "Most were slaves…" he also informs you. Even at that low estate in life, the chef was master of the slaves and servants at close hand in the environment in which they worked. "In Athens, a cook was the master of all the household slaves…" says Escoffier. As Athenian influence wanes and Roman civilization picks up the torch, chefs maintain and increase their status as important tradesmen in society. "Here the first professional societies of cooks were formed, almost a hierarchy," Escoffier again cites the information. "It was in Rome that cooks established their first academy: Collegium Coquorum," he further reports. Chefs again increase their significance during the Italian Renaissance as the scope of their influence widens. "…it is an historical fact that the marriage of Henry IV and Catherine de Medici introduced France to the culinary wonders of the Italian Renaissance," Escoffier enlightens you. "Certainly the professional chef in France became more sophisticated and more highly regarded by society after the introduction of the Italian cooking concepts." The author wants you to know that by this time cookbooks were already making important inroads and contributing to the history of cooking above and beyond their obvious informational status. Outside of the apparent European influences in cooking, Escoffier also briefly mentions the development of Chinese and Indian chefs. "It is interesting to note that the Chinese, held by at least one theory as the progenitors of most of the culinary heritage, never developed a high esteem for the position of chef," Escoffier maintains the historical tack. "It was not until the middle 18th Century that the first professional chef went public. Until that time, only the great houses of the nobility could afford to maintain a chef," Escoffier notes. This private-to-public transition, together with the rise of culinary writing, is a benchmark for the profession: chefs now establish authority and eminence. The remainder of the article is devoted to the development of the professional chef, especially the melding of two seminal figures in the culinary arts, César Ritz and Auguste Escoffier. The works of Frederick Taylor are also highlighted.

Relevance:

10.00%

Publisher:

Abstract:

With the beginning of airline deregulation in 1978, U.S. domestic operations entered a period of turmoil, adjustment, vibrancy, entrepreneurship, and change. A great deal has been written about the effects of deregulation on airlines and their personnel, and on the public at large. Less attention has been paid to the effects on travel agents and to the seminal role of computerized reservations systems (CRSs) in the flowering of travel agencies. This article examines both of these phenomena.

Relevance:

10.00%

Publisher:

Abstract:

Several tests that evaluate seed quality are destructive and time-consuming, which makes the processes involved in seed production and marketing long and expensive. Thus, techniques that reduce the time needed to assess the quality of seed lots are very favorable from a technical, economic and scientific point of view. Image-based techniques, using either X-ray or digital images, represent an alternative for this sector; they are considered reproducible and fast, giving greater flexibility and autonomy to the activities of production systems. In summary, the objective was to analyze the internal morphology of seeds of this species through X-ray images, and the increase in seed area during soaking through image analysis, and to compare these results with those of germination and vigor tests for the evaluation of physiological seed quality. For the X-ray tests, the seeds were exposed for 0.14 seconds to radiation at 40 kV and 2.0 mAs. The images were analyzed using the ImageJ program, and the seeds were subsequently germinated in a B.O.D. chamber at 27 °C, where the germination results were compared. To determine the area increase (%IA), seeds with and without the seed coat were kept in a B.O.D. chamber at 15 to 20 °C and photographed before and after the soaking period, and the results were compared to the germination rates. For the X-ray test, it was observed that seeds with an empty area greater than 20% showed a higher percentage of abnormal seedlings. The area-increase analysis showed that it is possible to rank the lots after 8 hours of imbibition at 15 °C, in agreement with the germination and vigor tests.
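As an illustration of the area-increase (%IA) measurement described above, here is a minimal Python sketch. The file names, the single-seed-per-image assumption, and the use of Otsu thresholding are illustrative; the study itself used the ImageJ program rather than a script like this.

from skimage import io, filters

def seed_area_px(path):
    # Segment the seed from the background by Otsu thresholding and count
    # foreground pixels (assumes one dark seed on a light background).
    img = io.imread(path, as_gray=True)
    return int((img < filters.threshold_otsu(img)).sum())

area_before = seed_area_px("seed_before_soaking.png")  # hypothetical file names
area_after = seed_area_px("seed_after_soaking.png")
ia_percent = 100.0 * (area_after - area_before) / area_before
print(f"Area increase (%IA): {ia_percent:.1f}%")

The same per-seed percentage, computed after a fixed imbibition time, is the quantity on which the ranking of lots described above is based.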

Relevance:

10.00%

Publisher:

Abstract:

The work described in this thesis revolves around the 1,1,n,n-tetramethyl[n](2,11)teropyrenophanes, a series of [n]cyclophanes containing a severely bent, board-shaped polynuclear aromatic hydrocarbon (PAH). The thesis is divided into seven chapters. The first chapter contains an overview of the seminal work on [n]cyclophanes of the first two members of the "capped rylene" series of PAHs: benzene and pyrene. Three different general strategies for the synthesis of [n]cyclophanes are discussed, and this leads into a discussion of some selected syntheses of [n]paracyclophanes and [n](2,7)pyrenophanes. The chemical, structural, spectroscopic and photophysical properties of these benzene- and pyrene-derived cyclophanes are discussed with emphasis on the changes that occur with changes in the structure of the aromatic system. Chapter 1 concludes with a brief introduction to [n]cyclophanes of the fourth member of the capped rylene series of PAHs: teropyrene.

The focus of the work described in Chapter 2 is the synthesis of 1,1,n,n-tetramethyl[n](2,11)teropyrenophane (n = 6 and 7) using a double-McMurry strategy. While the synthesis of 1,1,7,7-tetramethyl[7](2,11)teropyrenophane was successful, the synthesis of the lower homologue 1,1,6,6-tetramethyl[6](2,11)teropyrenophane was not. The conformational behaviour of [n.2]pyrenophanes was also studied by 1H NMR spectroscopy, and this provided a conformation-based rationale for the failure of the synthesis of 1,1,6,6-tetramethyl[6](2,11)teropyrenophane.

Chapter 3 contains details of the synthesis of 1,1,n,n-tetramethyl[n](2,11)teropyrenophanes (n = 7–9) using a Wurtz / McMurry strategy, which proved to be more general than the double-McMurry strategy. The three teropyrenophanes were obtained in ca. 10 milligram quantities. Trends in the spectroscopic properties that accompany changes in the structure of the teropyrene system are discussed. A violation of Kasha's rule was observed when the teropyrenophanes were irradiated at 260 nm.

The work described in the fourth chapter concentrates on the development of gram-scale syntheses of 1,1,n,n-tetramethyl[n](2,11)teropyrenophanes (n = 7–10) using the Wurtz / McMurry strategy. Several major modifications to the original synthetic pathway had to be made to enable the first several steps to be performed comfortably on tens of grams of material. Solubility problems severely limited the amount of material that could be produced at a late stage of the synthetic pathways leading to the even-numbered members of the series (n = 8, 10). Ultimately, only 1,1,9,9-tetramethyl[9](2,11)teropyrenophane was synthesized on a multi-gram scale. In the final step of the synthesis, a valence isomerization / dehydrogenation (VID) reaction, the teropyrenophane was observed to become unstable under the conditions of its formation at n = 8. The synthesis of 1,1,10,10-tetramethyl[10](2,11)teropyrenophane was achieved for the first time, but only on a few-hundred-milligram scale.

In Chapter 5, the results of an investigation of the electrophilic aromatic bromination of the 1,1,n,n-tetramethyl[n](2,11)teropyrenophanes (n = 7–10) are presented. Since it is the most abundant cyclophane, most of the work was performed on 1,1,9,9-tetramethyl[9](2,11)teropyrenophane. Reaction of this compound with varying amounts of bromine revealed that bromination occurs most rapidly at the symmetry-related 4, 9, 13 and 18 positions (teropyrene numbering) and that the 4,9,13,18-tetrabromide could be formed exclusively. Subsequent bromination occurs selectively at the symmetry-related 6, 7, 15 and 16 positions (teropyrene numbering), but considerably more slowly. Only mixtures of penta-, hexa-, hepta- and octabromides could be formed. Bromination reactions of the higher and lower homologues (n = 7, 8 and 10) revealed that the reactivity of the teropyrene system increases with the degree of bend. Crystal structures of some tetra-, hexa-, hepta- and octa-brominated products were obtained.

The goal of the work described in Chapter 6 is to use 1,1,9,9-tetramethyl[9](2,11)teropyrenophane as a starting material for the synthesis of warped nanographenophanes. A bromination / Suzuki-Miyaura / cyclodehydrogenation sequence was unsuccessful, as was a C–H arylation / cyclodehydrogenation approach. Itami's recently developed K-region-selective annulative π-extension (APEX) reaction proved to be successful, affording a giant [n]cyclophane with a C84 PAH. Attempted bay-region Diels-Alder reactions and some cursory host-guest chemistry of the teropyrenophanes are also discussed.

In Chapter 7, a synthetic approach toward a planar model compound, 2,11-di-t-butylteropyrene, is described. The synthesis could not be completed owing to solubility problems at the end of the synthetic pathway.

Relevance:

10.00%

Publisher:

Abstract:

Globalization and the technological changes that have taken place since the 1980s have brought remarkable changes in the industrial and commercial paradigm, expressed mainly in the international fragmentation of production and in the formation of Global Value Chains (GVCs). This thesis sought to understand these phenomena and to discuss new variables that are relevant in this context for a more accurate analysis of current trade patterns, variables not addressed by the seminal economic theories that relate trade and economic growth. It sought to evaluate how the trade specialization pattern of Brazil evolved, compared to other economies (China, India, Russia, the United States, Japan and selected Latin American economies), in the light of these phenomena from 1995 to 2011. To this end, we used the methodology of decomposing gross exports into value-added measures, developed by Koopman et al. (2014), and indicators estimated from data from two global input-output matrices: the WIOT (2013) and the TiVA (2015). Two hypotheses regarding the role of these phenomena as determinants of economic growth in recent years were also tested: 1) fragmentation and participation in GVCs ensure higher growth rates for countries; 2) the place (stage) a country occupies in GVCs, together with sectoral technological aspects, is also important for economic growth. For this, we used dynamic panel models (Difference GMM and System GMM) for a sample of 40 countries from 2003 to 2011. The studies carried out on Brazil show that the country is no longer on the margins of these phenomena, since it shows increasing rates of participation in GVCs, including in sectors considered most strategic for fragmentation. However, the country's trade specialization does not converge to the patterns presented by the developed countries, nor to the positions and profiles of GVC participation attained by China and Mexico. Another important result of the thesis is the identification of these phenomena as new variables that are indeed relevant for economic growth, since it finds empirical evidence supporting hypothesis 1 and, partially, hypothesis 2. A joint analysis of the econometric estimates and the descriptive analysis of the Brazilian economy leads us to conclude that the country's trade specialization pattern, in the context of the new trade configurations, is unfavorable to its growth strategy.
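The growth regressions referred to above are dynamic panel specifications of the general form sketched below; the exact variables, instruments and lag structure used in the thesis may differ. Difference GMM (Arellano-Bond) and System GMM (Blundell-Bond) are used because the lagged dependent variable makes standard fixed-effects estimation inconsistent in short panels:

\[
  g_{it} \;=\; \alpha\, g_{i,t-1} \;+\; \beta\, \mathit{GVC}_{it} \;+\; \gamma' X_{it} \;+\; \mu_i \;+\; \lambda_t \;+\; \varepsilon_{it},
\]

where g_{it} is the growth rate of country i in year t, GVC_{it} measures participation in (or position within) global value chains, X_{it} collects control variables, and μ_i and λ_t are country and time effects. In this sketch, hypothesis 1 corresponds to a positive and significant β, while hypothesis 2 concerns the additional terms capturing a country's stage in the chain and sectoral technology, however the thesis operationalises them.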

Relevance:

10.00%

Publisher:

Abstract:

The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix-exponential transformation guarantees the positive-definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann's seminal work on the estimation of highly non-linear model specifications ("Causality tests and observationally equivalent representations of econometric models", Journal of Econometrics, 1988, 39(1-2), 69–104), especially in developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of the RMESV-ALM model, and the finite-sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high-frequency data for three US financial assets, the new model is estimated and evaluated. The forecasting performance of the new model is compared with that of a novel dynamic realized matrix-exponential conditional covariance model. The volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
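The positive-definiteness claim rests on a standard property of the matrix exponential: if the latent process is a real symmetric matrix A_t, its exponential is automatically symmetric and positive definite, so no further constraints on the covariance matrix are needed. In generic notation (not necessarily the paper's),

\[
  \Sigma_t = \exp(A_t), \qquad A_t = Q_t \Lambda_t Q_t' \;\;\Longrightarrow\;\;
  \Sigma_t = Q_t \exp(\Lambda_t)\, Q_t' \succ 0 ,
\]

since the eigenvalues exp(λ_{i,t}) are strictly positive for any real λ_{i,t}. The dynamics for asymmetry and long memory can therefore be specified freely on the elements of A_t while Σ_t remains a valid covariance matrix by construction.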