842 results for Real assets and portfolio diversification


Relevance:

100.00%

Publisher:

Abstract:

Postgraduate programme in Production Engineering - FEG

Abstract:

The rock-wallaby genus Petrogale comprises a group of habitat-specialist macropodids endemic to Australia. Their restriction to rocky outcrops, with infrequent interpopulation dispersal, has been suggested as the cause of their recent and rapid diversification. Molecular phylogenetic relationships within and among species of Petrogale were analysed using mitochondrial (cytochrome c oxidase subunit 1, cytochrome b, NADH dehydrogenase subunit 2) and nuclear (omega-globin intron, breast and ovarian cancer susceptibility gene) sequence data, with representatives that encompassed the morphological and chromosomal variation within the genus, including for the first time both Petrogale concinna and Petrogale purpureicollis. Four distinct lineages were identified: (1) the brachyotis group, (2) Petrogale persephone, (3) Petrogale xanthopus and (4) the lateralis-penicillata group. Three of these lineages include taxa with the ancestral karyotype (2n = 22). Paraphyletic relationships within the brachyotis group indicate the need for a focused phylogeographic study. There was support for P. purpureicollis being reinstated as a full species and for P. concinna being placed within Petrogale rather than in the monotypic genus Peradorcas. Bayesian analyses of divergence times suggest that episodes of diversification commenced in the late Miocene-Pliocene and continued throughout the Pleistocene. Ancestral state reconstructions suggest that Petrogale originated in a mesic environment and dispersed into more arid environments, events that correlate with the timing of radiations in other arid-zone vertebrate taxa across Australia. Crown Copyright © 2011 Published by Elsevier Inc. All rights reserved.

Abstract:

We deal with the optimization of the production of branched sheet metal products. New forming techniques for sheet metal give rise to a wide variety of possible profiles and possible ways of production. In particular, we show how the problem of producing a given profile geometry can be modeled as a discrete optimization problem. We provide a theoretical analysis of the model in order to improve its solution time. In this context we give the complete convex hull description of some substructures of the underlying polyhedron. Moreover, we introduce a new class of facet-defining inequalities that represent connectivity constraints for the profile and show how these inequalities can be separated in polynomial time. Finally, we present numerical results for various test instances, both real-world and academic examples.
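The connectivity separation described above can be pictured with a small sketch. Assuming a generic support-graph view of a fractional solution (the graph, the root node and the inequality form below are illustrative simplifications, not the paper's exact polyhedral machinery), a breadth-first search over the edges with positive value reveals any part of the profile that is disconnected from a chosen root; the corresponding cut then gives a violated connectivity inequality:

```python
from collections import deque

def separate_connectivity(n, edges, x, root=0, eps=1e-6):
    """Illustrative separation of connectivity cuts: given fractional
    values x on the edges of a graph with nodes 0..n-1, return the set
    of nodes unreachable from `root` in the support graph
    {e : x[e] > eps}. A non-empty result S corresponds to a violated
    connectivity inequality  sum_{e in delta(S)} x_e >= 1."""
    adj = {v: [] for v in range(n)}
    for (u, v), val in zip(edges, x):
        if val > eps:
            adj[u].append(v)
            adj[v].append(u)
    seen = {root}
    q = deque([root])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                q.append(w)
    return [v for v in range(n) if v not in seen]  # [] -> no cut found

# toy instance: edge (2, 3) is active but detached from root 0
edges = [(0, 1), (1, 2), (2, 3)]
x = [1.0, 0.0, 1.0]
print(separate_connectivity(4, edges, x))  # [2, 3]
```

In a branch-and-cut loop, a non-empty return value would be turned into the corresponding cut inequality and added to the LP relaxation before resolving.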

Abstract:

This paper presents the new active absorption wave basin, named Hydrodynamic Calibrator (HC), constructed at the University of São Paulo (USP), in the laboratory facilities of the Numerical Offshore Tank (TPN). The square (14 m × 14 m) tank is able to generate and absorb waves from 0.5 Hz to 2.0 Hz by means of 148 active hinged-flap wave makers. An independent mechanical system drives each flap by means of a 1 HP servo-motor and a ball-screw transmission; a customized ultrasonic wave probe installed in each flap measures the wave elevation at the flap. A complex automation architecture was implemented, with three Programmable Logic Controllers (PLCs); low-level software is responsible for all the interlocks and maintenance functions of the tank, while all the control algorithms for generation and absorption are implemented in higher-level software (MATLAB/Simulink block diagrams). These algorithms calculate the motions of the wave makers required both to generate and to absorb the desired wave field, taking into account the layout of the flaps and the limits of wave generation. The experimental transfer function that relates the flap amplitude to the wave elevation amplitude is used to calculate the motion of each flap. This paper describes the main features of the tank, followed by a detailed presentation of the whole automation system, including the measuring devices, signal conditioning, PLC and network architecture, real-time and synchronizing software, and the motor control loop. Finally, the whole automation system is validated by means of an experimental analysis of the transfer function of the generated waves and a calculation of all the delays introduced by the automation system.
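The last step, inverting the experimental transfer function to obtain flap motions, can be sketched as follows. The frequencies and gains below are made-up placeholders (the real basin uses a measured transfer function), but the interpolate-then-divide logic is the generic idea:

```python
# Hypothetical samples of an experimental transfer function |H(f)|:
# wave elevation amplitude per unit flap amplitude, at a few
# frequencies in the basin's 0.5-2.0 Hz working range. The numbers
# are placeholders, not measured HC data.
FREQS = [0.5, 1.0, 1.5, 2.0]   # Hz
GAINS = [0.6, 1.1, 1.4, 1.2]   # wave amplitude / flap amplitude

def flap_amplitude(target_wave_amp, freq):
    """Required flap amplitude for a target wave elevation amplitude,
    inverting a linearly interpolated transfer function."""
    if not FREQS[0] <= freq <= FREQS[-1]:
        raise ValueError("frequency outside the 0.5-2.0 Hz range")
    for i in range(len(FREQS) - 1):
        if FREQS[i] <= freq <= FREQS[i + 1]:
            t = (freq - FREQS[i]) / (FREQS[i + 1] - FREQS[i])
            gain = GAINS[i] + t * (GAINS[i + 1] - GAINS[i])
            return target_wave_amp / gain
    raise AssertionError("unreachable")

# 0.10 m target wave at 1.0 Hz -> 0.10 / 1.1 m of flap motion
print(round(flap_amplitude(0.10, 1.0), 4))  # 0.0909
```

A per-flap table of this kind, combined with the flap layout, is what turns a required wave field into individual wave-maker commands.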

Abstract:

Study I: Real Wage Determination in the Swedish Engineering Industry

This study uses the monopoly union model to examine the determination of real wages, and in particular the effects of active labour market programmes (ALMPs) on real wages, in the engineering industry. Quarterly data for the period 1970:1 to 1996:4 are used in a cointegration framework, utilising Johansen's maximum likelihood procedure. On the basis of the Johansen (trace) test results, vector error correction (VEC) models are created in order to model the determination of real wages in the engineering industry. The estimation results support the presence of a long-run wage-raising effect of rises in labour productivity, in the tax wedge, in the alternative real consumer wage and in real UI benefits. The estimation results also support the presence of a long-run wage-raising effect of positive changes in the participation rates in ALMPs, relief jobs and labour market training. This can be interpreted as meaning that the possibility of participating in an ALMP increases the utility for workers of not being employed in the industry, which in turn can increase real wages in the industry in the long run. Finally, the estimation results show evidence of a long-run wage-reducing effect of positive changes in the unemployment rate.

Study II: Intersectoral Wage Linkages in Sweden

The purpose of this study is to investigate whether wage-setting in certain sectors of the Swedish economy affects wage-setting in other sectors. The theoretical background is the Scandinavian model of inflation, which states that wage-setting in the sectors exposed to international competition affects wage-setting in the sheltered sectors of the economy. The Johansen maximum likelihood cointegration approach is applied to quarterly data on Swedish sector wages for the period 1980:1–2002:2. Different vector error correction (VEC) models are created, based on assumptions as to which sectors are exposed to international competition and which are not. The adaptability of wages between sectors is then tested by imposing restrictions on the estimated VEC models. Finally, Granger causality tests are performed in the different restricted/unrestricted VEC models to test for sector wage leadership. The empirical results indicate considerable adaptability of wages between manufacturing, construction, the wholesale and retail trade, the central government sector and the municipalities and county councils sector. This is consistent with the assumptions of the Scandinavian model. Further, the empirical results indicate a low level of adaptability of wages between the financial sector and manufacturing, and between the financial sector and the two public sectors. The Granger causality tests provide strong evidence for the presence of intersectoral wage causality, but no evidence that any sector plays the wage-leading role assumed by the Scandinavian model.

Study III: Wage and Price Determination in the Private Sector in Sweden

The purpose of this study is to analyse wage and price determination in the private sector in Sweden during the period 1980–2003. The theoretical background is a variant of the "imperfect competition model of inflation", which assumes imperfect competition in the labour and product markets. According to the model, wages and prices are determined as the result of a "battle of mark-ups" between trade unions and firms. The Johansen maximum likelihood cointegration approach is applied to quarterly Swedish data on consumer prices, import prices, private-sector nominal wages, private-sector labour productivity and the total unemployment rate for the period 1980:1–2003:3. The chosen cointegration rank of the estimated vector error correction (VEC) model is two. Thus, two cointegration relations are assumed: one for private-sector nominal wage determination and one for consumer price determination. The estimation results indicate that an increase in consumer prices of one per cent lifts private-sector nominal wages by 0.8 per cent, while an increase in private-sector nominal wages of one per cent increases consumer prices by one per cent. An increase of one percentage point in the total unemployment rate reduces private-sector nominal wages by about 4.5 per cent. The long-run effects of private-sector labour productivity and import prices on consumer prices are about –1.2 and 0.3 per cent, respectively. The Rehnberg agreement of 1991–92 and the monetary policy shift in 1993 affected the determination of private-sector nominal wages, private-sector labour productivity, import prices and the total unemployment rate. The "offensive" devaluation of the Swedish krona by 16 per cent in 1982:4, as well as the move to a floating krona and its substantial depreciation at that time, affected the determination of import prices.
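The error-correction logic underlying all three studies can be sketched on simulated data. The studies use Johansen's maximum likelihood procedure on multivariate systems; the snippet below instead uses the simpler two-step (Engle-Granger-style) approach on two artificial series, purely to illustrate what a long-run relation and an error-correction coefficient are. The variable names and the data-generating process are invented for the example:

```python
import random

def ols(y, x):
    """Slope and intercept of y regressed on x (ordinary least squares)."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    beta = sxy / sxx
    return beta, my - beta * mx

random.seed(0)
# Simulated series (NOT the Swedish data): one stochastic trend drives
# both "wages" w and "prices" p, so the two are cointegrated.
trend, w, p = 0.0, [], []
for _ in range(500):
    trend += random.gauss(0, 1)
    w.append(trend + random.gauss(0, 0.3))
    p.append(0.8 * trend + random.gauss(0, 0.3))

# Step 1: long-run relation w_t = a + b*p_t; residual = disequilibrium
b, a = ols(w, p)
ect = [wi - (a + b * pi) for wi, pi in zip(w, p)]

# Step 2: short-run dynamics  dw_t = c + gamma * ect_{t-1}
dw = [w[t] - w[t - 1] for t in range(1, len(w))]
gamma, c = ols(dw, ect[:-1])
print(gamma < 0)  # adjustment back toward the long-run relation
```

A significantly negative gamma means deviations from the long-run relation are corrected over time, which is the mechanism the VEC models in the studies formalise for several variables at once.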

Abstract:

This work studies the impact of two traditional Romanian treatments, Red Petroleum and Propolis, in terms of their real efficiency and their consequences for wooden artifacts. The application of these solutions is still a widely adopted and popular technique in preventive conservation, but their impact is not well known. It is important to know the effect of the treatments on the chemical-physical and structural characteristics of the artifacts, not only to understand their influence on present conditions but also to foresee future behaviour. The treatments with the traditional Romanian products are compared with a commercial antifungal product, Biotin R, which is used as a reference to assess the effectiveness of Red Petroleum and Propolis. Red Petroleum and Propolis are not active against mould, while Biotin R is very active. Mould attack is mostly concentrated in the painted layer, where the tempera, containing glue and egg, enhances nutrient availability for moulds. Biotin R, even though it is a fungicide rather than a true insecticide, was the most active of the three products against insect attack, followed by Red Petroleum, Propolis and the untreated reference. As for colour, it changed little after the application of Red Petroleum and Biotin R, and the colour difference was almost imperceptible; Propolis, on the contrary, affected the colour markedly. During exposure at different relative humidities (RH), the colour changed significantly at 100% RH at equilibrium, mainly because of mould attack. Red Petroleum penetrates deeply into wood, while Propolis does not penetrate and remains only on the surface. However, Red Petroleum does not interact chemically with the wood substance and volatilizes easily under oven-dry conditions; Propolis, on the contrary, interacts chemically with the wood substance and hardly volatilizes even under oven-dry conditions, so it remains where it penetrated, mostly on the surface.
Treatment by immersion has an impact on the physical parameters of wood, while treatment by brushing does not. Red Petroleum in particular has an apparent impact on moisture content (MC) owing to the penetration of the solution, whereas Propolis penetrates little and remains on the surface, so its impact is much smaller. However, if the weight of the solution that penetrated the wood is subtracted, there is no significant difference in MC between treated and untreated samples. Among the physical parameters, dimensional stability is particularly important. Variation in wood moisture content causes shrinkage and swelling of the wood that the polychrome layer can only partially follow. The dimensions of the wooden supports varied under different moisture conditioning; the painted layer cannot completely follow this deformation, and consequently degradation and deterioration caused by detachment occur. Such detachment affects the polychrome stratification of the panel painting and, eventually, the connections between the different layers composing the panel painting.
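The moisture-content comparison described above rests on the standard oven-dry-basis formula, with the weight of the penetrated solution optionally removed before comparing treated and untreated samples. The masses below are invented for illustration:

```python
def moisture_content(wet_mass, oven_dry_mass, solution_mass=0.0):
    """Wood moisture content (%) on an oven-dry basis:
    MC = (m_wet - m_dry) / m_dry * 100.
    `solution_mass` optionally subtracts the weight of treatment
    solution still held in the sample before the comparison, as the
    study does when comparing treated and untreated samples."""
    corrected_wet = wet_mass - solution_mass
    return (corrected_wet - oven_dry_mass) * 100.0 / oven_dry_mass

# Invented figures: a 112 g sample of which 4 g is Red Petroleum
print(moisture_content(112.0, 100.0))       # 12.0  (apparent MC)
print(moisture_content(112.0, 100.0, 4.0))  # 8.0   (solution removed)
```

The gap between the two numbers is the "apparent impact" the chapter attributes to the penetrated solution rather than to actual bound water.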

Abstract:

The thesis studies the economic and financial conditions of Italian households, using microeconomic data from the Survey on Household Income and Wealth (SHIW) over the period 1998-2006. It develops along two lines of enquiry. First, it studies the determinants of households' holdings of assets and liabilities and estimates their degree of correlation. After a review of the literature, it estimates two non-linear multivariate models of the interactions between assets and liabilities on repeated cross-sections. Second, it analyses households' financial difficulties: it defines a quantitative measure of financial distress and tests, by means of non-linear dynamic probit models, whether the probability of experiencing financial difficulties is persistent over time. Chapter 1 provides a critical review of the theoretical and empirical literature on the estimation of asset and liability holdings, on their interactions and on households' net wealth. The review stresses that a large part of the literature explains households' debt holdings as a function of, among other things, net wealth, an assumption that runs into possible endogeneity problems. Chapter 2 defines two non-linear multivariate models to study the interactions between assets and liabilities held by Italian households. Estimation refers to a pooling of SHIW cross-sections. The first model is a bivariate tobit that estimates the factors affecting assets and liabilities and their degree of correlation, with results coherent with theoretical expectations. To tackle the presence of non-normality and heteroskedasticity in the error term, which make the tobit estimators inconsistent, semi-parametric estimates are provided that confirm the results of the tobit model. The second model is a quadrivariate probit on three different assets (safe, risky and real) and total liabilities; the results show the patterns of interdependence suggested by theoretical considerations.
Chapter 3 reviews the methodologies for estimating non-linear dynamic panel data models, drawing attention to the problems that must be dealt with to obtain consistent estimators. Specific attention is given to the initial conditions problem raised by the inclusion of the lagged dependent variable in the set of explanatory variables. The advantage of dynamic panel data models is that they account simultaneously for true state dependence, via the lagged variable, and for unobserved heterogeneity, via the specification of individual effects. Chapter 4 applies the models reviewed in Chapter 3 to analyse the financial difficulties of Italian households, using information on net wealth as provided in the panel component of the SHIW. The aim is to test whether households persistently experience financial difficulties over time. A thorough discussion is provided of the alternative approaches proposed in the literature (subjective/qualitative indicators versus quantitative indexes) to identify households in financial distress. Households in financial difficulty are identified as those holding amounts of net wealth below the first quartile of the net wealth distribution. Estimation is conducted with four different methods: the pooled probit model, the random-effects probit model with exogenous initial conditions, the Heckman model and the recently developed Wooldridge model. All estimators support the hypothesis of true state dependence and show that, in line with the literature, the less sophisticated models, namely the pooled and exogenous-initial-conditions models, over-estimate this persistence.
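The quantitative distress measure can be sketched directly. The quartile convention and the wealth figures below are illustrative assumptions (the abstract does not specify which quantile estimator the thesis uses):

```python
def first_quartile(values):
    """Lower quartile by linear interpolation between order statistics
    (one common convention among several)."""
    s = sorted(values)
    pos = 0.25 * (len(s) - 1)
    lo = int(pos)
    frac = pos - lo
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + frac * (s[hi] - s[lo])

def distress_indicator(net_wealth):
    """1 for households whose net wealth lies below the first quartile
    of the distribution (the thesis's quantitative definition of
    financial distress), 0 otherwise."""
    q1 = first_quartile(net_wealth)
    return [1 if w < q1 else 0 for w in net_wealth]

# Invented net wealth figures (thousands of euros) for 8 households
wealth = [5, 12, 40, 75, 120, 200, 310, 480]
print(distress_indicator(wealth))  # [1, 1, 0, 0, 0, 0, 0, 0]
```

This binary indicator, observed per household per wave, is the dependent variable the dynamic probit models then track over time to test for state dependence.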

Abstract:

The Italian territory is extremely rich in Cultural Heritage, both movable and immovable; this is a patrimony of great importance that must be managed and protected in the best possible way and with adequate tools, also in relation to the problems connected with its maintenance and with safeguarding it from the risk factors to which it may be exposed. For a good knowledge of the Cultural Heritage, a preliminary acquisition of information is fundamental, conducted in a systematic and unified manner, widespread and organic, but also useful for a preventive assessment and for the subsequent planning of conservative restoration interventions. In this context, the use of geomatic techniques and technologies in the field of Cultural Heritage can make a valuable contribution, ranging from the cataloguing and documentation of a cultural asset to its control and monitoring. Nowadays, the growing development of new digital technologies, together with the considerable advances made in the disciplines of geomatics (above all surveying and photogrammetry), makes an effective integration of the various techniques possible, favoured also by the spread of solutions for data interchange and for communication between different devices. The study presented in this thesis aims to investigate the aspects connected with the use of the techniques and technologies of Geomatics in order to highlight the condition of an asset and its state of degradation. For the management and safeguarding of a cultural asset, the Carta del Rischio (Risk Map) territorial information system (SIT) is presented, which maps the hazards affecting the heritage and shows how these, combined with the vulnerability of an individual asset, contribute to determining its degree of risk.

Abstract:

It has been suggested that there are several distinct phenotypes of childhood asthma or childhood wheezing. Here, we review the research relating to these phenotypes, with a focus on the methods used to define and validate them. Childhood wheezing disorders manifest themselves in a range of observable (phenotypic) features such as lung function, bronchial responsiveness, atopy and a highly variable time course (prognosis). The underlying causes are not sufficiently understood to define disease entities based on aetiology. Nevertheless, there is a need for a classification that would (i) facilitate research into aetiology and pathophysiology, (ii) allow targeted treatment and preventive measures and (iii) improve the prediction of long-term outcome. Classical attempts to define phenotypes have been one-dimensional, relying on few or single features such as triggers (exclusive viral wheeze vs. multiple trigger wheeze) or time course (early transient wheeze, persistent and late onset wheeze). These definitions are simple but essentially subjective. Recently, a multi-dimensional approach has been adopted. This approach is based on a wide range of features and relies on multivariate methods such as cluster or latent class analysis. Phenotypes identified in this manner are more complex but arguably more objective. Although phenotypes have an undisputed standing in current research on childhood asthma and wheezing, there is confusion about the meaning of the term 'phenotype' causing much circular debate. If phenotypes are meant to represent 'real' underlying disease entities rather than superficial features, there is a need for validation and harmonization of definitions. The multi-dimensional approach allows validation by replication across different populations and may contribute to a more reliable classification of childhood wheezing disorders and to improved precision of research relying on phenotype recognition, particularly in genetics. 
Ultimately, the underlying pathophysiology and aetiology will need to be understood to properly characterize the diseases causing recurrent wheeze in children.
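The multi-dimensional approach can be illustrated with a toy clustering sketch. The features and values below are invented, and plain k-means stands in for the cluster or latent class analyses the review discusses (latent class analysis is model-based rather than distance-based):

```python
import random

def kmeans(points, k, iters=50, seed=1):
    """Plain k-means on feature vectors (tuples of floats)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        centers = [tuple(sum(dim) / len(cl) for dim in zip(*cl))
                   if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Hypothetical (lung-function z-score, atopy score) pairs per child,
# loosely forming a "low lung function, atopic" group and a
# "normal lung function, non-atopic" group:
children = [(-1.2, 2.1), (-0.9, 1.8), (-1.1, 2.3),
            (1.0, 0.1), (1.2, -0.2), (0.8, 0.3)]
centers, clusters = kmeans(children, 2)
print(sorted(len(c) for c in clusters))  # [3, 3]
```

Real studies would use many more features and children, validate the chosen number of clusters, and check that the same phenotypes replicate across cohorts, which is the validation step the review calls for.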

Abstract:

The concept of chronic critical limb ischaemia (CLI) emerged late in the history of peripheral arterial occlusive disease (PAOD). The historical background and changing definitions of CLI over the last decades are important to know in order to understand why epidemiologic data are so difficult to compare between articles and over time. The prevalence of CLI is probably very high and largely underestimated, and significant differences exist between population studies and clinical series. The extremely high costs associated with the management of these patients make CLI a real public health issue for the future. In the era of emerging vascular surgery in the 1950s, the initial classification of PAOD by Fontaine, with stages III and IV corresponding to CLI, was based only on clinical symptoms. Later, with increasing access to non-invasive haemodynamic measurements (ankle pressure, toe pressure), the need to prove a causal relationship between PAOD and clinical findings suggestive of CLI became a real concern, and the Rutherford classification published in 1986 included objective haemodynamic criteria. The first consensus document on CLI was published in 1991 and included clinical criteria associated with ankle pressure, toe pressure and transcutaneous oxygen pressure (TcPO2) cut-off levels (<50 mmHg, <30 mmHg and <10 mmHg, respectively). This rigorous definition reflects an arterial insufficiency so severe as to cause microcirculatory changes and compromise tissue integrity, with a high rate of major amputation and mortality. The TASC I consensus document published in 2000 used less severe pressure cut-offs (≤50-70 mmHg, ≤30-50 mmHg and ≤30-50 mmHg, respectively). The thresholds for toe pressure and especially TcPO2 (which would also be included in the TASC II consensus document) are however just below the lower limit of normality.
It is therefore easy to infer that patients qualifying as having CLI under the TASC criteria may suffer from far less severe disease than those qualifying under the initial 1991 consensus document. Furthermore, the inclusion criteria of many recent interventional studies have shifted even further away from these efforts at standardising the definition with objective criteria, including patients as CLI cases merely on the basis of the Fontaine classification (stages III and IV), without haemodynamic criteria. The differences in the natural history of patients with CLI, including the prognosis of the limb and of the patient, are thus difficult to compare between studies in this context. Overall, CLI as defined by clinical and haemodynamic criteria remains a severe condition with poor prognosis, high medical costs and a major impact in terms of public health and patients' loss of functional capacity. Major progress in best medical therapy for arterial disease and in revascularisation procedures will certainly improve the outcome of CLI patients. In the future, an effort to apply a standardised definition with clinical and objective haemodynamic criteria will be needed to better demonstrate and compare advances in the management of these patients.
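The haemodynamic cut-offs quoted above from the 1991 consensus document can be expressed as a small check. How the three measurements combine (here: any one below its threshold) is an illustrative simplification, and the clinical criteria (rest pain, tissue loss) are deliberately left out:

```python
# 1991 consensus cut-offs quoted in the text (mmHg)
CUTOFFS_1991 = {"ankle": 50, "toe": 30, "tcpo2": 10}

def below_1991_cutoff(ankle=None, toe=None, tcpo2=None):
    """True if any available measurement falls below its 1991
    consensus cut-off (ankle pressure < 50 mmHg, toe pressure
    < 30 mmHg, TcPO2 < 10 mmHg). Treating the three as "any one
    below threshold" is a simplification for illustration; the
    clinical criteria are not modelled here."""
    readings = {"ankle": ankle, "toe": toe, "tcpo2": tcpo2}
    return any(v is not None and v < CUTOFFS_1991[name]
               for name, v in readings.items())

print(below_1991_cutoff(ankle=45))           # True  (45 < 50)
print(below_1991_cutoff(ankle=60, toe=35))   # False (both above)
```

Swapping in the laxer TASC I thresholds quoted in the text would classify considerably more patients as CLI, which is exactly the comparability problem the article describes.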

Abstract:

Tajikistan, with 93% of its surface area taken up by mountains and 65% of its labor force employed in agriculture, is judged to be highly vulnerable to risks, including climate change and food insecurity. The article examines a set of land use policies and practices that can be used to mitigate the vulnerability of Tajikistan's large rural population, primarily by increasing family incomes. Empirical evidence from Tajikistan and other CIS countries suggests that families with more land and higher commercialization earn higher incomes and achieve higher well-being. The recommended policy measures that are likely to increase rural family incomes accordingly advocate expansion of smallholder farms, improvement of livestock productivity, increase of farm commercialization through improvement of farm services, and greater diversification of both income sources and the product mix. The analysis relies on official statistics and recent farm surveys for supporting evidence. Examples from local initiatives promoting sustainable land management practices and demonstrating the implementation of the proposed policy measures are presented.

Abstract:

This paper explores the religious implications of eroticism in Western culture since the Sexual Revolution, a period at once applauded for its open and immanent view of sexuality and denounced for its shamelessness and promiscuity. After discussing the work and effects of Alfred C. Kinsey, the father of the Sexual Revolution, I focus on a critical appraisal of Kinsey written by French theorist Georges Bataille (“Kinsey, the Underworld and Work,” in L’Erotisme, 1957). Bataille situates contemporary Western sexuality within a larger historical movement towards the “desacralization” of all aspects of human life: sex, under the scientific gaze of the Kinsey team, became simply another “object” to be analyzed and classified, and “good” sex defined solely in terms of frequency and explosiveness of orgasm. For many, including Hugh Hefner, this approach to sex occasioned a refreshing awakening from the long dark night of Victorian sexual repression. However, as Bataille’s protégé Foucault has shown, the scientific approach to sexuality often masks a desire to control and delimit sexual behaviour, not “liberate” it. Moreover, Bataille makes the point that the desacralization of sexuality denudes sex of a vital component—eroticism—which is necessary for real pleasure and ecstasy. Beyond the “moral” critiques one often hears leveled against Kinsey and his work, Bataille provides a “religious” critique, one that stands, perhaps surprisingly, on the “near side” of sexuality.

Abstract:

The brain is in many ways an immunologically and pharmacologically privileged site. The blood-brain barrier (BBB) of the cerebrovascular endothelium and its participation in the complex structure of the neurovascular unit (NVU) restrict access of immune cells and immune mediators to the central nervous system (CNS). In pathologic conditions, very well-organized immunologic responses can develop within the CNS, raising important questions about the real nature and the intrinsic and extrinsic regulation of this immune privilege. We assess the interactions of immune cells and immune mediators with the BBB and NVU in neurologic disease, cerebrovascular disease, and intracerebral tumors. The goals of this review are to outline key scientific advances and the status of the science central to both the neuroinflammation and CNS barriers fields, and highlight the opportunities and priorities in advancing brain barriers research in the context of the larger immunology and neuroscience disciplines. This review article was developed from reports presented at the 2011 Annual Blood-Brain Barrier Consortium Meeting.

Abstract:

This project was stimulated by the unprecedented speed and scope of changes in Bulgarian higher education since 1989. The rapid growth of the student population and the emergence of a new private sector in higher education led to tightening governmental control and growing criticism of autonomy and academic freedom. This raised questions about the need for diversification in the field, about the importance of recent innovations in terms of strategic choices for future development, and about how higher education governance could maintain diversity without the system deteriorating. The group first traced the extent of spontaneous processes of innovation at the levels of content, of institutions, and of the organisation of teaching and learning. They then identified the different parties in the struggle for institutionalisation and against diversification, and promising mechanisms for maintaining diversity in higher education. On this basis they outlined a framework for a wide-ranging public discussion of the issue, which may serve as a corrective to the mechanisms of state control. Their work included analysis of the legislative framework laid down in the Higher Education Act, which effectively dispenses with the autonomy of universities. They then surveyed the views of high-level executives in the field and of the academics actually involved in the process, as well as of the "consumers" of the educational product, i.e. the students. In considering diversification, they focused on four different types of programmes: those where diversification is largely limited to the level of content (e.g. Law), those where it operates mainly at structural levels (e.g. Industrial Management), those where it is often feigned (e.g. Social Work), and those where it is at best formal and sporadic (e.g. Mechanical Engineering). They conclude that the educational system in Bulgaria has considerable internal resources for development.
The greatest need is for adequate statutory regulation of academic life which will provide incentives for responsible academic development of higher education institutions and create conditions for the institutionalisation of academic self-organisation and self-control, which will in turn limit the pathological trends in the diversification processes.