968 results for Inherent Audiences


Relevance: 10.00%

Publisher:

Abstract:

Oscillatory contractile activity is an inherent property of blood vessels. Various cellular mechanisms have been proposed to contribute to oscillatory activity. Mouse small mesenteric arteries display a unique low frequency contractile oscillatory activity (1 cycle every 10-12 min) upon phenylephrine stimulation. Our objective was to identify mechanisms involved in this peculiar oscillatory activity. First-order mesenteric arteries were mounted in tissue baths for isometric force measurement. The oscillatory activity was observed only in vessels with endothelium, but it was not blocked by L-NAME (100 µM) or indomethacin (10 µM), ruling out the participation of nitric oxide and prostacyclin, respectively, in this phenomenon. Oscillatory activity was not observed in vessels contracted with K+ (90 mM) or after stimulation with phenylephrine plus 10 mM K+. Ouabain (1 to 10 µM, an Na+/K+-ATPase inhibitor), but not K+ channel antagonists [tetraethylammonium (100 µM, a nonselective K+ channel blocker), Tram-34 (10 µM, blocker of intermediate conductance K+ channels) or UCL-1684 (0.1 µM, a small conductance K+ channel blocker)], inhibited the oscillatory activity. The contractile activity was also abolished when experiments were performed at 20°C or in K+-free medium. Taken together, these results demonstrate that Na+/K+-ATPase is a potential source of these oscillations. The presence of α-1 and α-2 Na+/K+-ATPase isoforms was confirmed in murine mesenteric arteries by Western blot. Chronic infusion of mice with ouabain did not abolish oscillatory contraction, but up-regulated vascular Na+/K+-ATPase expression and increased blood pressure. Together, these observations suggest that the Na+/K+ pump plays a major role in the oscillatory activity of murine small mesenteric arteries.


The emerging technologies have recently challenged libraries to reconsider their role as mere mediators between collections, researchers, and wider audiences (Sula, 2013), and libraries, especially nationwide institutions like national libraries, have not always managed to meet the challenge (Nygren et al., 2014). In the Digitization Project of Kindred Languages, the National Library of Finland has become a node that connects the partners to interact and work towards shared goals and objectives. In this paper, I describe the crowdsourcing methods that have been established during the project to support both linguistic research and linguistic diversity. The National Library of Finland has been executing the Digitization Project of Kindred Languages since 2012. The project seeks to digitize and publish approximately 1,200 monograph titles and more than 100 newspaper titles in various, in some cases endangered, Uralic languages. Once the digitization is completed in 2015, the Fenno-Ugrica online collection will consist of 110,000 monograph pages and around 90,000 newspaper pages, to which all users will have open access regardless of their place of residence. The majority of the digitized literature was originally published in the 1920s and 1930s in the Soviet Union, the genesis and consolidation period of these literary languages. This was the era when many Uralic languages were converted into media of popular education, enlightenment, and the dissemination of information pertinent to the developing political agenda of the Soviet state. The 'deluge' of popular literature in the 1920s and 1930s suddenly challenged the lexical and orthographic norms of the limited ecclesiastical publications from the 1880s onward. Newspapers were now written in orthographies and in word forms that the locals would understand. Textbooks were written to address the separate needs of both adults and children. New concepts were introduced into the language.
This was the beginning of a renaissance and period of enlightenment (Rueter, 2013). The linguistically oriented population can also find writings to their delight, especially lexical items specific to a given publication and orthographically documented specifics of phonetics. The project is financially supported by the Kone Foundation in Helsinki and is part of the Foundation's Language Programme. One of the key objectives of the Kone Foundation Language Programme is to support a culture of openness and interaction in linguistic research, but also to promote citizen science as a tool for the participation of the language community in research. In addition to sharing this aspiration, our objective within the Language Programme is to ensure that old and new corpora in Uralic languages are made available for the open and interactive use of the academic community as well as the language societies. Wordlists are available in 17 languages, but without tokenization, lemmatization, and so on. This approach was verified with scholars, and we consider the wordlists raw data for linguists. Our data are used, for instance, for creating morphological analyzers and online dictionaries at the Universities of Helsinki and Tromsø. To reach these targets, we will produce not only the digitized materials but also development tools to support linguistic research and citizen science. The Digitization Project of Kindred Languages is thus linked with research on language technology; the mission is to improve the usage and usability of the digitized content. During the project, we have advanced methods that refine the raw data for further use, especially in linguistic research. How does the library meet these objectives, which appear to lie beyond its traditional playground?
The written materials from this period are a gold mine, so how could we retrieve these hidden treasures of languages from a stack that contains more than 200,000 pages of literature in various Uralic languages? The problem is that the machine-encoded (OCR) text often contains too many mistakes to be used as such in research, so the mistakes in OCRed texts must be corrected. To enhance the OCRed texts, the National Library of Finland developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. This tool was necessary because these rare and peripheral prints often include characters that have since fallen out of use and are neglected by modern OCR software developers, yet belong to the historical context of the kindred languages and are thus an essential part of their linguistic heritage (van Hemel, 2014). Our crowdsourcing application is essentially an editor for the ALTO XML format. It consists of a back-end for managing users, permissions, and files, communicating through a REST API with a front-end interface, the actual editor for correcting the OCRed text. The enhanced XML files can be retrieved from the Fenno-Ugrica collection for further purposes. Could the crowd do this work to support academic research? The challenge in crowdsourcing lies in its nature. The targets in traditional crowdsourcing have often been split into several microtasks that do not require any special skills from the anonymous people, a faceless crowd. This way of crowdsourcing may produce quantitative results, but from the researchers' point of view there is a danger that the needs of linguists are not met. A notable downside is also the lack of a shared goal or social affinity; there is no reward in the traditional methods of crowdsourcing (de Boer et al., 2012).
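The application described above is an editor for the ALTO XML format, in which OCR output records each recognized word as a String element carrying the text in a CONTENT attribute. As an illustration only (not the project's actual code), a minimal Python sketch of reading the word content out of an ALTO document with the standard library:

```python
# Illustrative only: a minimal reader for ALTO XML, the OCR format the
# editor described above works on. Word tokens are <String> elements
# whose recognized text sits in the CONTENT attribute.
import xml.etree.ElementTree as ET

def alto_words(alto_xml: str) -> list[str]:
    """Return the recognized words from an ALTO XML document string."""
    root = ET.fromstring(alto_xml)
    # ALTO documents are namespaced; match on the local tag name so the
    # sketch works regardless of the schema version in use.
    return [el.attrib["CONTENT"]
            for el in root.iter()
            if el.tag.rsplit("}", 1)[-1] == "String"]

sample = """<alto xmlns="http://www.loc.gov/standards/alto/ns-v2#">
  <Layout><Page><PrintSpace><TextBlock><TextLine>
    <String CONTENT="kindred"/> <String CONTENT="languages"/>
  </TextLine></TextBlock></PrintSpace></Page></Layout>
</alto>"""

print(alto_words(sample))
```

A corrected word list could be written back into the same attributes, which is essentially what a crowdsourced editing pass produces.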
There has also been criticism that the digital humanities make the humanities too data-driven and oriented towards quantitative methods, losing the values of critical qualitative methods (Fish, 2012). On top of that, the downsides of traditional crowdsourcing become more pronounced when you leave the Anglophone world. Our potential crowd is geographically scattered across Russia. This crowd is linguistically heterogeneous, speaking 17 different languages. In many cases the languages are close to extinction or in need of revitalization, and the native speakers do not always have Internet access, so an open call for crowdsourcing would not have produced satisfactory results for linguists. Thus, one has to identify carefully the potential niches for completing the needed tasks. When using the help of a crowd in a project that aims to support both linguistic research and the survival of endangered languages, the approach has to be different. In nichesourcing, the tasks are distributed amongst a small crowd of citizen scientists (communities). Although communities provide smaller pools to draw resources from, their specific richness in skill suits the complex tasks with high-quality product expectations found in nichesourcing. Communities have a purpose and identity, and their regular interaction engenders social trust and reputation. These communities can correspond more precisely to the needs of research (de Boer et al., 2012). Instead of repetitive and rather trivial tasks, we try to utilize the knowledge and skills of citizen scientists to provide qualitative results. In nichesourcing, we hand out assignments that precisely fill the gaps in linguistic research. A typical task would be editing and collecting words in vocabulary fields where researchers require more information; for instance, there is a lack of Hill Mari anatomical terminology.
We have digitized books on medicine, and we could try to track down the words related to human organs by assigning citizen scientists to edit and collect words with the OCR editor. From the nichesourcing perspective, it is essential that altruism play a central role when the language communities are involved. In nichesourcing, our goal is to reach a certain level of interplay in which the language communities benefit from the results. For instance, the corrected words in Ingrian will be added to an online dictionary that is made freely available to the public, so society can benefit too. This objective of interplay can be understood as an aspiration to support the endangered languages and the maintenance of linguistic diversity, but also as a servant of 'two masters': research and society.


Smart home implementation in residential buildings promises to optimize energy usage and save a significant amount of energy simply through a better understanding of users' energy usage profiles. Apart from its energy optimisation prospects, the technology also aims to give occupants a significant degree of comfort and remote control over home appliances, both at home and from remote locations. However, smart home investment, like any other kind of investment, requires an adequate measurement and justification of its economic gains before realization. These gains can differ between occupants owing to their inherent behaviours and tendencies. It is therefore pertinent to investigate occupants' behaviours and tendencies in different domains of interest and to measure the value of the energy savings accrued by smart home implementations in those domains. This thesis investigates two domains of interest (rented and owned apartments) for two sets of behavioural tendencies (Finnish and German), obtained from observation and corroborated by interviews, to measure the payback time and return on investment (ROI) of their smart home implementations. Similar measures are obtained for an identified Australian use case. The research findings reveal that building automation under the Finnish behavioural tendencies seems to proffer a better ROI and payback time for smart home implementations.
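The payback-time and ROI measures used in the thesis follow from simple definitions. As a hedged sketch (the euro figures below are invented placeholders, not data from the thesis):

```python
# Illustrative only: the two investment metrics computed for smart home
# implementations. All euro figures are made-up placeholders.
def payback_time_years(investment: float, annual_savings: float) -> float:
    """Years until cumulative energy savings repay the initial investment."""
    return investment / annual_savings

def roi_percent(investment: float, total_savings: float) -> float:
    """Return on investment: net gain relative to the initial outlay."""
    return (total_savings - investment) / investment * 100

# e.g. a hypothetical 4,000 EUR building automation package saving
# 500 EUR per year, evaluated over a 15-year horizon:
print(payback_time_years(4000, 500))   # 8.0 years
print(roi_percent(4000, 500 * 15))     # 87.5 %
```

Comparing these two numbers across occupant profiles (rented vs. owned, Finnish vs. German tendencies) is exactly the kind of justification the abstract calls for.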


Nowadays, as most businesses move towards sustainability by providing or consuming services from different vendors, the Service Level Agreement (SLA) has become very important both for providers/vendors and for users/customers. There are many ways to inform users about a service's functional and non-functional Quality of Service (QoS) aspects through negotiating, evaluating, or monitoring SLAs. However, these traditional SLAs do not cover eco-efficiency ('green') issues or IT ethics issues for sustainability. That is why the green SLA (GSLA) should come into play. A GSLA is a formal agreement incorporating all the traditional commitments as well as green and ethics issues in the IT business sector. This research surveys traditional SLA parameters for various services, such as network, compute, storage, and multimedia, in IT business areas. At the same time, it identifies the gaps in these traditional SLA parameters and integrates them with green issues for all the mentioned services. The research mainly focuses on integrating green parameters into existing SLAs and on defining a GSLA with new green performance indicators and their measurable units. Finally, a GSLA template is defined, compiling green indicators such as recycling, radio-wave emission, toxic material usage, obsolescence indication, ICT product life cycles, and energy cost for sustainable development. Moreover, human-interaction and IT ethics issues such as security and privacy, user satisfaction, intellectual property rights, user reliability, and confidentiality would also need to be added to the proposed GSLA. However, integrating new and existing performance indicators in the proposed GSLA for sustainable development could be difficult for ICT engineers.
Therefore, this research also examines the management complexity of the proposed green SLA by designing a general informational model and analysing the relationships, dependencies, and effects between the newly identified services under the sustainability pillars. Sustainability, however, can only be achieved through proper implementation of the proposed GSLA, which largely depends on monitoring the performance of the green indicators. This research therefore also focuses on the monitoring and evaluation phase of the GSLA indicators through their interactions with the traditional basic SLA indicators, which would help achieve proper implementation of a future GSLA. Finally, the proposed GSLA informational model and its monitoring aspects could help service providers/vendors design their future business strategy in this transitional sustainable society.
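The general informational model mentioned above, relating green indicators, their measurable units, and the traditional SLA indicators they interact with, could be sketched as a small data structure. This is a hedged illustration only: the indicator names, units, and dependencies below are assumptions drawn loosely from the text, not a schema from the research.

```python
# Illustrative only: one possible shape for a GSLA informational model.
# Indicator names, units, and dependencies are assumptions, not a real schema.
from dataclasses import dataclass, field

@dataclass
class GreenIndicator:
    name: str            # e.g. "energy cost", "recycling"
    unit: str            # the measurable unit the text calls for
    pillar: str          # sustainability pillar it falls under
    depends_on: list[str] = field(default_factory=list)  # traditional SLA indicators it interacts with

gsla_template = [
    GreenIndicator("energy cost", "kWh per service unit", "environmental",
                   depends_on=["availability", "throughput"]),
    GreenIndicator("ICT product life cycle", "years", "environmental"),
    GreenIndicator("user satisfaction", "survey score 1-5", "social",
                   depends_on=["response time"]),
]

# Monitoring then reduces to evaluating each green indicator in its unit
# alongside the traditional SLA indicators it depends on.
for ind in gsla_template:
    print(ind.name, "->", ind.depends_on)
```

Capturing the dependencies explicitly is what makes the management complexity analysable: each green indicator can be traced back to the basic SLA indicators whose monitoring it relies on.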


This dissertation centres on the themes of knowledge creation, interdisciplinarity, and knowledge work. My research approaches interdisciplinary knowledge creation (IKC) as practical, situated activity. I argue that approaching IKC from a practice-based perspective makes it possible to "deconstruct" how knowledge creation actually happens and to demystify its strong intellectual, mentalistic, and expertise-based connotations. I have rendered the work of the observed knowledge workers into something ordinary, accessible, and routinized. This, in turn, has made it possible to grasp both the pragmatic challenges and the concrete drivers of such activity; organizing such activities effectively thus becomes a question of organizing and leading effective everyday practices. To that end, I conducted ethnographic research in one explicitly interdisciplinary space within higher education, Aalto Design Factory in Helsinki, Finland, where I observed how students from different disciplines collaborated on new product development projects. I argue that IKC is a multi-dimensional construct that intertwines a particular way of doing, a way of experiencing, a way of embodied being, and a way of reflecting on the very doing itself. This places emphasis not only on the practices themselves but also on the way the individual experiences them, as this directly affects how the individual practices. My findings suggest that in order to effectively organize and execute knowledge creation activities, organizations need to better accept and manage the emergent diversity and complexity inherent in such activities. To accomplish this, I highlight the importance of understanding and using a variety of (material) objects, the centrality of mundane everyday practices, the acceptance of contradictions and negotiations, as well as a style of management that is involved and engaged.
To succeed in interdisciplinary knowledge creation is to lead not only by example, but also by being very much present in the very everyday practices that make it happen.


University of Turku, Faculty of Medicine, Department of Cardiology and Cardiovascular Medicine, Doctoral Programme of Clinical Investigation; Heart Center, Turku University Hospital, Turku, Finland; Division of Internal Medicine, Department of Cardiology, Seinäjoki Central Hospital, Seinäjoki, Finland; Heart Center, Satakunta Central Hospital, Pori, Finland. Annales Universitatis Turkuensis, Painosalama Oy, Turku, Finland, 2015.

Antithrombotic therapy during and after coronary procedures always entails the challenge of establishing a balance between bleeding and thrombotic complications. It has generally been recommended that patients on long-term warfarin therapy discontinue warfarin a few days prior to elective coronary angiography or intervention to prevent bleeding complications. Bridging therapy with heparin is recommended for patients at an increased risk of thromboembolism who require the interruption of anticoagulation for elective surgery or an invasive procedure. In study I, consecutive patients on warfarin therapy referred for diagnostic coronary angiography were compared to control patients with a similar disease presentation without warfarin. The strategy of performing coronary angiography during uninterrupted therapeutic warfarin anticoagulation appeared to be a relatively safe alternative to bridging therapy, provided that the international normalized ratio was not at a supratherapeutic level. In-stent restenosis remains an important cause of failure of long-term success after a percutaneous coronary intervention (PCI). Drug-eluting stents (DES) reduce the problem of restenosis inherent to bare metal stents (BMS). However, a longer delay in arterial healing may extend the risk of stent thrombosis (ST) far beyond 30 days after DES implantation. Early discontinuation of antiplatelet therapy has been the most important factor predisposing to ST.
In study II, patients on long-term oral anticoagulation (OAC) underwent DES or BMS stenting, with a median of 3.5 years' follow-up. The selective use of DESs with short triple therapy seemed to be safe in OAC patients, since late STs were rare even without long clopidogrel treatment. Major bleeding and cardiac events were common in this patient group irrespective of stent type. To help predict the bleeding risk in patients on OAC, several different bleeding risk scores have been developed. Risk scoring systems have also been used in the setting of patients undergoing a PCI. In study III, the predictive value of an outpatient bleeding risk index (OBRI) to identify patients at high risk of bleeding was analysed. The bleeding risk seemed not to modify periprocedural or long-term treatment choices in patients on OAC after a PCI. Patients with a high OBRI often had major bleeding episodes, and the OBRI may be suitable for risk evaluation in this patient group. Optical coherence tomography (OCT) is a novel technology for intravascular imaging of coronary arteries. OCT is a light-based imaging modality with a 12–18 µm axial tissue resolution that enables visualization of plaques in the vessel, possible dissections and thrombi, as well as stent strut apposition and coverage, and allows measurement of the vessel lumen and lesions. In study IV, 30 days after titanium-nitride-oxide (TITANOX)-coated stent implantation, the binary stent strut coverage was satisfactory and the prevalence of malapposed struts was low, as evaluated by OCT. Long-term clinical events in patients treated with TITANOX-coated bio-active stents (BAS) and paclitaxel-eluting stents (PES) in routine clinical practice were examined in study V. At the 3-year follow-up, BAS resulted in a better long-term outcome than PES, with an infrequent need for target vessel revascularization.
Keywords: anticoagulation, restenosis, thrombosis, bleeding, optical coherence tomography, titanium


At present, celiac disease has no known cure, and its only treatment is strict lifelong adherence to a gluten-free diet. Cheese bread is a traditional Brazilian product and a safe option for celiacs. However, like other gluten-free breads, it has inherently low levels of fiber and minerals. The objective of this study was to evaluate the effect of incorporating whole amaranth flour on the physical properties and nutritional value of cheese bread. Amaranth flour was incorporated at 10, 15, and 20% in different formulations. Increasing amaranth levels darkened the product, reduced specific volume, and increased compression force. Cheese breads with 10% amaranth exhibited only slight differences in physical properties compared with the controls. These results demonstrate the possibility of incorporating 10% whole amaranth flour into the formulation of cheese bread, resulting in a product with higher dietary fiber and iron contents and the same level of acceptance as the conventional formulation. The aim of this approach is to increase the availability of gluten-free bakery products with added nutritional value, contributing to a more varied diet for celiac patients.


We investigated Russian children's reading acquisition during an intermediate period in their development: after literacy onset, but before they have acquired well-developed decoding skills. The results of our study suggest that Russian first graders rely primarily on phonemes and syllables as reading grain-size units. Phonemic awareness seems to reach the metalinguistic level more rapidly than syllabic awareness after the onset of reading instruction, a reversal typical of the initial stages of formal reading instruction, which creates an external demand for phonemic awareness. Another reason might be the inherent instability of syllable boundaries in Russian. We showed that body-coda is a more natural representation of subsyllabic structure in Russian than onset-rime. We also found that Russian children displayed variability in syllable onset and offset decisions, which can be attributed to the lack of congruence between syllabic and morphemic word division in Russian. We suggest that the fuzziness of syllable boundary decisions is a sign of the transitional nature of this stage of reading development and indicates progress towards an awareness of morphologically determined closed syllables. Our study also showed that orthographic complexity influences reading in Russian from the very start of reading acquisition. In addition, we found that Russian first graders experience fluency difficulties in reading orthographically simple words and nonwords of two or more syllables. The transition from monosyllabic to bisyllabic lexical items constitutes a threshold, for which the syllabic structure seemed to make no difference.
When we compared the outcomes of the Russian children with the ones produced by speakers of other languages, we discovered that in the tasks which could be performed with the help of alphabetic recoding Russian children’s accuracy was comparable to that of children learning to read in relatively shallow orthographies. In tasks where this approach works only partially, Russian children demonstrated accuracy results similar to those in deeper orthographies. This pattern of moderate results in accuracy and excellent performance in terms of reaction times is an indication that children apply phonological recoding as their dominant strategy to various reading tasks and are only beginning to develop suitable multiple strategies in dealing with orthographically complex material. The development of these strategies is not completed during Grade 1 and the shift towards diversification of strategies apparently continues in Grade 2.


Global warming is one of the most alarming problems of this century. Initial scepticism concerning its validity is now dwarfed by the intensification of extreme weather events, and the gradually rising level of anthropogenic CO2 is identified as its main driver. Most greenhouse gas (GHG) emissions come from large point sources (heat and power production and industrial processes), and the continued use of fossil fuels requires quick and effective measures to meet the world's energy demand while (at least) stabilizing atmospheric CO2 levels. The framework known as Carbon Capture and Storage (CCS), or Carbon Capture, Utilization and Storage (CCUS), comprises a portfolio of technologies applicable to large-scale GHG sources to prevent CO2 from entering the atmosphere. Among them, CO2 capture and mineralisation (CCM) presents the highest potential for CO2 sequestration, as the predicted carbon storage capacity (as mineral carbonates) far exceeds the estimated worldwide identified fossil fuel reserves. The work presented in this thesis aims to take a step towards the deployment of an energy- and cost-effective process for the simultaneous capture and storage of CO2 in the form of thermodynamically stable and environmentally friendly solid carbonates. R&D work on the process considered here began in 2007 at Åbo Akademi University in Finland. It involves the processing of magnesium silicate minerals with recyclable ammonium salts for extraction of magnesium at ambient pressure and 400-440 °C, followed by aqueous precipitation of the magnesium as hydroxide, Mg(OH)2, and finally carbonation of the Mg(OH)2 in a pressurised fluidized bed reactor at ~510 °C and ~20 bar CO2 partial pressure to produce high-purity MgCO3.
Rock material taken from the Hitura nickel mine, Finland, and serpentinite collected from Bragança, Portugal, were tested for magnesium extraction with both ammonium sulphate and bisulphate (AS and ABS) to determine the optimal operating parameters, primarily reaction time, reactor type, and presence of moisture. Typical magnesium extraction efficiencies range from 50 to 80% at 350-450 °C. In general, ABS performs better than AS, showing comparable efficiencies at lower temperatures and shorter reaction times. The best experimental results obtained so far include 80% magnesium extraction with ABS at 450 °C in a laboratory-scale rotary kiln and 70% Mg(OH)2 carbonation in the PFB at 500 °C and 20 bar CO2 pressure for 15 minutes. The extraction reaction with ammonium salts is not at all selective towards magnesium: other elements such as iron, nickel, chromium, and copper are also co-extracted. Their separation, recovery, and valorisation are addressed as well and found to be of great importance. The exergetic performance of the process was assessed using Aspen Plus® software and pinch analysis. The choice of fluxing agent and its recovery method have a decisive effect on the performance of the process: AS is recovered by crystallisation, and in general the whole process then requires more exergy (2.48-5.09 GJ/t CO2 sequestered) than with ABS (2.48-4.47 GJ/t CO2 sequestered) when ABS is recovered by thermal decomposition. However, the corrosive nature of molten ABS and the operational problems inherent in its thermal regeneration rule out this route. Regeneration of ABS by adding H2SO4 to AS (followed by crystallisation) results in an overall negative exergy balance (mainly at the expense of low-grade heat) but floods the system with sulphates. Although the ÅA route is still energy-intensive, its performance is comparable to conventional CO2 capture methods using alkanolamine solvents.
An energy-neutral process depends on the availability and quality of nearby waste heat, and economic viability might be achieved with magnesium extraction and carbonation levels ≥ 90%, the processing of CO2-containing flue gases (eliminating the expensive capture step), and the production of marketable products.
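As a back-of-the-envelope check on the carbonation step described above (Mg(OH)2 + CO2 -> MgCO3 + H2O), standard molar masses fix how much CO2 a given amount of Mg(OH)2 can bind. This is textbook stoichiometry, not a figure from the thesis:

```python
# Illustrative stoichiometry for the carbonation step:
#   Mg(OH)2 + CO2 -> MgCO3 + H2O
# Standard atomic masses (g/mol); not data from the thesis itself.
M = {"Mg": 24.305, "O": 15.999, "H": 1.008, "C": 12.011}

m_mgoh2 = M["Mg"] + 2 * (M["O"] + M["H"])   # ~58.32 g/mol
m_co2 = M["C"] + 2 * M["O"]                 # ~44.01 g/mol

# Each mole of Mg(OH)2 binds one mole of CO2:
co2_per_kg_mgoh2 = m_co2 / m_mgoh2
print(round(co2_per_kg_mgoh2, 3))  # ~0.755 kg CO2 per kg Mg(OH)2
```

The 1:1 molar ratio is why the magnesium extraction and carbonation efficiencies quoted in the abstract translate directly into the sequestration capacity of the route.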


The purpose of this Master's thesis was to study customer knowledge transfer processes in multinational corporations (MNCs). The main objective was to examine how customer knowledge is transferred in MNCs and what kinds of factors enhance or inhibit the transfer process, and to create a framework on the basis of the existing literature and the empirical findings. In this thesis the factors were organized according to whether they are properties of the unit involved in knowledge management, properties of relationships between the units, or properties of the knowledge itself. Various properties influence knowledge transfer, but the focus here was on the relevant findings from the customer knowledge viewpoint. The empirical results show that internal fragmentation seems inherent in this type of organization and may cause many problems in customer knowledge transfer and utilization. These knowledge transfer inhibitors arise from the organization's properties: its absorptive capacity, motivation, organizational culture, and the two dimensions of knowledge. However, in spite of the inherent forces causing internal fragmentation and inhibiting knowledge transfer, moderate codification of customer knowledge and expertise, cooperative working practices among the experts, and socialization mechanisms imposed by headquarters seem to help maintain customer knowledge transfer and value creation in the long-term relationship. This value creation can be seen as based on accessing and integrating a wide variety of knowledge resources to create a coherent product and service offering.


The objective and originality of this paper lie in identifying Stiglitz's main theoretical contributions to financial economics, in briefly portraying the contemporary economic thought out of which these contributions emerged, and in suggesting their connections with subsequent economic thought. Grounded in a detailed analysis of Stiglitz's works on finance, his most important theoretical findings are singled out and gathered into four issues: (1) the conditions under which the Modigliani-Miller theorem is valid; (2) the inconsistency inherent in the efficient market hypothesis; (3) the microeconomic effects of asymmetric information in financial markets; and (4) its real macroeconomic effects. In all of these topics, the focal point of Stiglitz's theoretical research is the unrealistic underpinnings on which the Arrow-Debreu competitive equilibrium model relies. It is also emphasised that the same perspective he coherently followed to construct a fully-fledged theoretical framework is preserved in his empirical investigations, notably on developing countries, on which he has concentrated his efforts since the beginning of the 1990s.


More than half a decade after the adoption of inflation targeting in Brazil, it can be seen that maintaining a high interest rate is inherent to the strategy for conducting monetary policy. The objective of this paper is to show that the present policy for setting the economy's basic interest rate, based on responding to inflation in both market and administered prices, is onerous for Brazilian society. Based on empirical evidence for the period 1999-2004, the adoption of a core inflation measure, a change in the time horizon for meeting the targets, and the definition of the inflation targets by common agreement between the Banco Central do Brasil and the National Treasury, as a framework to increase the efficiency of the monetary regime, would create the conditions for proposing a reduction of the Selic rate.


The central hypothesis of this article is that, in the context of globalization, monetary inconvertibility is a crucial problem for peripheral countries. It begins with a brief review of the debate from a historical point of view and then stresses the opposing contemporary views on the fragility of the financial systems of emerging countries: the original sin and the debt intolerance hypotheses. While supporting the first, the article goes further and explores the domestic implications of inconvertibility. It criticizes the jurisdictional uncertainty proposition, showing that an inherent flaw in the store-of-value function of emerging market currencies, derived from original sin, is the main reason for de facto inconvertibility and the underdevelopment of these countries' domestic financial systems.


The State and the economy in South Korea: from the developmentalist state to the Asian crisis and subsequent recovery. The article focuses on the institutions of South Korean capitalism and on the interactions between the state and the economy. The South Korean economic model was characterized by a highly interventionist state, which played a very active role in the process of industrialization. However, South Korea suffered a severe crisis in 1997, attributed by many authors to the distortions inherent in strong state intervention. The article shows that the crisis was the result of a combination of internal economic fragilities and a rapid process of financial deregulation, which undermined the state's capacity for control. The crisis, nevertheless, does not disqualify the role of national institutions in the very successful process of industrialization. Despite the reforms, Korean capitalism conserves much of the previous model of business organization and industrial relations. The state remains strong and has played an active role in the process of economic reform. There are, nevertheless, doubts about the impacts of the reforms and the new configuration of Korean capitalism, which will depend on current transformations in the world economy and in the East Asian countries.


Common determining factors of currency and financial crises. This paper identifies and evaluates the common determining factors of currency and financial crises across 86 crisis episodes between 1970 and 2004, based on factor, cluster, and discriminant analysis. The evidence shows that a rise in the ratios of domestic credit, fiscal deficit, and residents' bank deposits to GDP is inherent to the different types of crises classified in the economic literature. Indicators that capture excessive monetary expansion and reflect the fall in international reserves, represented by the M2/reserves and imports/reserves ratios, as well as the total volume of international reserves, were also identified as common factors in these episodes.