867 results for Time-Consistent Policy


Relevance: 30.00%

Abstract:

Purpose - The purpose of this paper is to develop an efficient numerical algorithm for the self-consistent solution of the Schrödinger and Poisson equations in one-dimensional systems. The goal is to compute the charge-control and capacitance-voltage characteristics of quantum wire transistors. Design/methodology/approach - The paper presents a numerical formulation employing a non-uniform finite difference discretization scheme, in which the wavefunctions and electronic energy levels are obtained by solving the Schrödinger equation through the split-operator method, while a relaxation method in the FTCS ("forward time centred space") scheme is used to solve the two-dimensional Poisson equation. Findings - The numerical model is validated by taking previously published results as a benchmark and is then applied to yield the charge-control characteristics and the capacitance-voltage relationship for a split-gate quantum wire device. Originality/value - The paper helps to fulfil the need for C-V models of quantum wire devices. To do so, the authors implemented a straightforward calculation method for the two-dimensional electronic carrier density n(x,y). The formulation reduces the computational procedure to a much simpler problem, similar to the one-dimensional quantization case, significantly diminishing running time.
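The FTCS relaxation for the two-dimensional Poisson equation described above can be condensed into a few lines. Below is a minimal sketch, not the authors' code: the grid, units and Dirichlet boundary handling are illustrative assumptions.

```python
# Minimal sketch of one FTCS ("forward time centred space") relaxation pass
# for the 2D Poisson equation  d2V/dx2 + d2V/dy2 = -rho/eps; iterating it to
# steady state yields the electrostatic potential V. Not the authors' code.
import numpy as np

def ftcs_poisson_step(V, rho, h, eps, dt):
    """One pseudo-time step; stable for dt <= h**2 / 4."""
    lap = (np.roll(V, 1, 0) + np.roll(V, -1, 0) +
           np.roll(V, 1, 1) + np.roll(V, -1, 1) - 4.0 * V) / h**2
    V_new = V + dt * (lap + rho / eps)
    # Assumed Dirichlet boundaries: keep gate/contact potentials fixed.
    V_new[0, :], V_new[-1, :] = V[0, :], V[-1, :]
    V_new[:, 0], V_new[:, -1] = V[:, 0], V[:, -1]
    return V_new
```

In the self-consistent loop, the converged potential feeds the Schrödinger solve, the resulting carrier density n(x,y) updates rho, and the relaxation is repeated until both equations agree.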

Relevance: 30.00%

Abstract:

As part of the AMAZE-08 campaign during the wet season in the rainforest of central Amazonia, an ultraviolet aerodynamic particle sizer (UV-APS) was operated for continuous measurements of fluorescent biological aerosol particles (FBAP). In the coarse particle size range (>1 μm), the campaign median and quartiles of FBAP number and mass concentration were 7.3×10⁴ m⁻³ (4.0–13.2×10⁴ m⁻³) and 0.72 μg m⁻³ (0.42–1.19 μg m⁻³), respectively, accounting for 24% (11–41%) of total particle number and 47% (25–65%) of total particle mass. During the five-week campaign in February–March 2008 the concentration of coarse-mode Saharan dust particles was highly variable. In contrast, FBAP concentrations remained fairly constant over the course of weeks and had a consistent daily pattern, peaking several hours before sunrise, suggesting that the observed FBAP was dominated by nocturnal spore emission. This conclusion was supported by the consistent FBAP number size distribution peaking at 2.3 μm, also attributed to fungal spores and mixed biological particles by scanning electron microscopy (SEM), light microscopy and biochemical staining. A second primary biological aerosol particle (PBAP) mode between 0.5 and 1.0 μm was also observed by SEM, but exhibited little fluorescence and no true fungal staining. This mode may have consisted of single bacterial cells, brochosomes, various fragments of biological material, and small Chromalveolata (Chromista) spores. Particles liquid-coated with mixed organic-inorganic material constituted a large fraction of observations, and these coatings contained salts likely of primary biological origin. We provide key support for the suggestion that real-time laser-induced fluorescence (LIF) techniques using 355 nm excitation provide size-resolved concentrations of FBAP as a lower limit for the atmospheric abundance of biological particles in a pristine environment. We also show some limitations of using the instrument for ambient monitoring of weakly fluorescent particles <2 μm. Our measurements confirm that primary biological particles, fungal spores in particular, are an important fraction of supermicron aerosol in the Amazon and may contribute significantly to hydrological cycling, especially when coated by mixed inorganic material.

Relevance: 30.00%

Abstract:

Cancer cachexia induces loss of fat mass that accounts for a large part of the dramatic weight loss observed both in humans and in animal models; however, the literature does not provide consistent information regarding the set point of weight loss and how the different visceral adipose tissue depots contribute to this symptom. To evaluate this, 8-week-old male Wistar rats were subcutaneously inoculated with 1 ml (2×10⁷ cells) of tumour cells (Walker 256). Samples of different visceral white adipose tissue (WAT) depots were collected at days 0, 4, 7 and 14 and stored at −80 °C (seven to ten animals per day per group). Mesenteric and retroperitoneal depot mass was decreased to the greatest extent on day 14 compared with day 0. Gene and protein expression of PPARγ2 (PPARG) fell significantly following tumour implantation in all three adipose tissue depots, while C/EBPα (CEBPA) and SREBP-1c (SREBF1) expression decreased over time only in the epididymal and retroperitoneal depots. Decreased adipogenic gene expression and morphological disruption of visceral WAT are further supported by the dramatic reduction in mRNA and protein levels of perilipin. Classical markers of inflammation and macrophage infiltration (F4/80, CD68 and MIF-1α) in WAT were significantly increased in the later stage of cachexia (although showing an incremental pattern along the course of cachexia) and presented a depot-specific regulation. These results indicate that impairment in the lipid-storing function of adipose tissue occurs at different times and that the mesenteric adipose tissue is more resistant to the 'fat-reducing effect' than the other visceral depots during cancer cachexia progression. Journal of Endocrinology (2012) 215, 363-373.

Relevance: 30.00%

Abstract:

LA-MC-ICP-MS U-Pb zircon dating was performed on syntectonic, early post-collisional granitic and associated mafic rocks that are intrusive into the Brusque Metamorphic Complex and the Florianópolis Batholith, major tectonic domains separated by the Neoproterozoic Major Gercino Shear Zone (MGSZ) in south Brazil. The inferred ages of magmatic crystallization are consistent with field relationships, and show that the syntectonic granites from both domains are similar, with ages around 630-620 Ma for high-K calc-alkaline metaluminous granites and ca. 610 Ma for slightly peraluminous granites. Although ca. 650 Ma inherited zircon components are identified in granites from both domains, important contrasts in the crustal architecture of each domain are revealed by the patterns of zircon inheritance, indicating different crustal sources for the granites in each domain. The granites from the southern domain (Florianópolis Batholith) have essentially Neoproterozoic (650-700 Ma and 900-950 Ma) inheritance, with a single 2.0-2.2 Ga inherited age obtained in the peraluminous Mariscal Granite. In the northern Brusque Metamorphic Complex, the metaluminous Rio Pequeno Granite and associated mafic rocks have scarce inherited cores with ages around 1.65 Ga, whereas the slightly peraluminous Serra dos Macacos Granite has abundant Paleoproterozoic (1.8-2.2 Ga) and Archean (2.9-3.4 Ga) inherited zircons. Our results are consistent with the hypothesis that the MGSZ separates domains with distinct geologic evolution; however, the contemporaneity of 630-610 Ma granitic magmatism with similar structural and geochemical patterns on both sides of this major shear zone indicates that these domains were already part of a single continental mass at 630 Ma, reinforcing the post-collisional character of these granites.

Relevance: 30.00%

Abstract:

Abstract Background Cardiovascular disease is the leading cause of death in Brazil, and hypertension is its major risk factor. The benefit of drug treatment of hypertension in preventing major cardiovascular events has been consistently demonstrated. Angiotensin-receptor blockers (ARB) have been the preferential drugs in the management of hypertension worldwide, despite the absence of any consistent evidence of advantage over older agents and the concern that they may be associated with lower renal protection and a risk for cancer. Diuretics are as efficacious as other agents, are well tolerated, and have a longer duration of action and low cost, but they have been scarcely compared with ARBs. A study comparing a diuretic with an ARB is therefore warranted. Methods/design This is a randomized, double-blind clinical trial comparing the association of chlorthalidone and amiloride with losartan as the first drug option in patients aged 30 to 70 years with stage I hypertension. The primary outcomes will be variation of blood pressure over time, adverse events, and development or worsening of microalbuminuria and of left ventricular hypertrophy on the EKG. The secondary outcomes will be fatal or non-fatal cardiovascular events: myocardial infarction, stroke, heart failure, evidence of new subclinical atherosclerosis, and sudden death. The study will last 18 months. The sample size will be 1200 participants per group in order to confer enough power to test all primary outcomes. The project was approved by the ethics committee of each participating institution. Discussion The putative pleiotropic effects of ARB agents, particularly renal protection, have been disputed, and ARBs have been scarcely compared with diuretics in large clinical trials, even though diuretics have been at least as efficacious as newer agents in managing hypertension. Even if the null hypothesis is not rejected, the information will be useful for health care policy to treat hypertension in Brazil. Clinical trials registration number ClinicalTrials.gov: NCT00971165
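The abstract does not state the effect-size assumptions behind the 1200-per-group figure, so the event rates below are purely illustrative. A hedged sketch of a two-proportion power calculation with statsmodels:

```python
# Illustrative only: the trial's actual power assumptions are not given in
# the abstract; the 10% vs 7% event rates below are hypothetical.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

effect = proportion_effectsize(0.10, 0.07)   # assumed control vs. treatment rates
n_per_group = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.80,
    ratio=1.0, alternative='two-sided')
print(round(n_per_group))                    # required participants per group
```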

Relevance: 30.00%

Abstract:

We studied the energy and frequency dependence of the Fourier time lags and intrinsic coherence of the kilohertz quasi-periodic oscillations (kHz QPOs) in the neutron-star low-mass X-ray binaries 4U 1608−52 and 4U 1636−53, using a large data set obtained with the Rossi X-ray Timing Explorer. We confirmed that, in both sources, the time lags of the lower kHz QPO are soft and their magnitude increases with energy. We also found that: (i) In 4U 1636−53, the soft lags of the lower kHz QPO remain constant at ∼30 μs in the QPO frequency range 500–850 Hz, and decrease to ∼10 μs when the QPO frequency increases further. In 4U 1608−52, the soft lags of the lower kHz QPO remain constant at 40 μs up to 800 Hz, the highest frequency reached by this QPO in our data. (ii) In both sources, the time lags of the upper kHz QPO are hard, independent of energy or frequency and inconsistent with the soft lags of the lower kHz QPO. (iii) In both sources, the intrinsic coherence of the lower kHz QPO remains constant at ∼0.6 between 5 and 12 keV, and drops to zero above that energy. The intrinsic coherence of the upper kHz QPO is consistent with being zero across the full energy range. (iv) In 4U 1636−53, the intrinsic coherence of the lower kHz QPO increases from ∼0 at ∼600 Hz to ∼1, and it decreases to ∼0.5 at 920 Hz; in 4U 1608−52, the intrinsic coherence is consistent with the same trend. (v) In both sources, the intrinsic coherence of the upper kHz QPO is consistent with zero over the full frequency range of the QPO, except in 4U 1636−53 between 700 and 900 Hz, where the intrinsic coherence marginally increases. We discuss our results in the context of scenarios in which the soft lags are either due to reflection off the accretion disc or up-/down-scattering in a hot medium close to the neutron star. We finally explore the connection between, on one hand, the time lags and the intrinsic coherence of the kHz QPOs, and on the other, the QPOs' amplitude and quality factor in these two sources.
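The time lags discussed here are standard cross-spectral phase lags converted to seconds at the QPO frequency. A minimal sketch under assumptions (a single light-curve segment, soft and hard bands already extracted; the published analysis averages cross spectra over many RXTE segments):

```python
# Sketch of a cross-spectral time-lag estimate between two energy bands.
import numpy as np

def time_lag(soft, hard, dt, f_lo, f_hi):
    """Average phase lag over [f_lo, f_hi] (Hz), returned as a time lag in s.
    Sign convention: positive means the hard band lags the soft band."""
    S, H = np.fft.rfft(soft), np.fft.rfft(hard)
    f = np.fft.rfftfreq(len(soft), dt)
    band = (f >= f_lo) & (f <= f_hi)
    cross = np.mean(np.conj(S[band]) * H[band])   # averaged cross spectrum
    nu = 0.5 * (f_lo + f_hi)                      # representative QPO frequency
    return np.angle(cross) / (2.0 * np.pi * nu)
```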

Relevance: 30.00%

Abstract:

Study I: Real Wage Determination in the Swedish Engineering Industry. This study uses the monopoly union model to examine the determination of real wages, and in particular the effects of active labour market programmes (ALMPs) on real wages, in the engineering industry. Quarterly data for the period 1970:1 to 1996:4 are used in a cointegration framework, utilising Johansen's maximum likelihood procedure. On the basis of the Johansen (trace) test results, vector error correction (VEC) models are created in order to model the determination of real wages in the engineering industry. The estimation results support the presence of a long-run wage-raising effect from rises in labour productivity, the tax wedge, the alternative real consumer wage and real UI benefits. The estimation results also support the presence of a long-run wage-raising effect due to positive changes in the participation rates in ALMPs, relief jobs and labour market training. This could be interpreted as meaning that the possibility of being a participant in an ALMP increases the utility for workers of not being employed in the industry, which in turn could increase real wages in the industry in the long run. Finally, the estimation results show evidence of a long-run wage-reducing effect due to positive changes in the unemployment rate. Study II: Intersectoral Wage Linkages in Sweden. The purpose of this study is to investigate whether the wage-setting in certain sectors of the Swedish economy affects the wage-setting in other sectors. The theoretical background is the Scandinavian model of inflation, which states that the wage-setting in the sectors exposed to international competition affects the wage-setting in the sheltered sectors of the economy. The Johansen maximum likelihood cointegration approach is applied to quarterly data on Swedish sector wages for the period 1980:1–2002:2. Different vector error correction (VEC) models are created, based on assumptions as to which sectors are exposed to international competition and which are not. The adaptability of wages between sectors is then tested by imposing restrictions on the estimated VEC models. Finally, Granger causality tests are performed in the different restricted/unrestricted VEC models to test for sector wage leadership. The empirical results indicate considerable adaptability in wages between manufacturing, construction, the wholesale and retail trade, the central government sector and the municipalities and county councils sector. This is consistent with the assumptions of the Scandinavian model. Further, the empirical results indicate a low level of adaptability in wages between the financial sector and manufacturing, and between the financial sector and the two public sectors. The Granger causality tests provide strong evidence for the presence of intersectoral wage causality, but no evidence of a wage-leading role in line with the assumptions of the Scandinavian model for any of the sectors. Study III: Wage and Price Determination in the Private Sector in Sweden. The purpose of this study is to analyse wage and price determination in the private sector in Sweden during the period 1980–2003. The theoretical background is a variant of the "imperfect competition model of inflation", which assumes imperfect competition in the labour and product markets. According to the model, wages and prices are determined as a result of a "battle of mark-ups" between trade unions and firms.
The Johansen maximum likelihood cointegration approach is applied to quarterly Swedish data on consumer prices, import prices, private-sector nominal wages, private-sector labour productivity and the total unemployment rate for the period 1980:1–2003:3. The chosen cointegration rank of the estimated vector error correction (VEC) model is two. Thus, two cointegration relations are assumed: one for private-sector nominal wage determination and one for consumer price determination. The estimation results indicate that an increase in consumer prices of one per cent lifts private-sector nominal wages by 0.8 per cent. Furthermore, an increase in private-sector nominal wages of one per cent increases consumer prices by one per cent. An increase of one percentage point in the total unemployment rate reduces private-sector nominal wages by about 4.5 per cent. The long-run effects of private-sector labour productivity and import prices on consumer prices are about –1.2 and 0.3 per cent, respectively. The Rehnberg agreement during 1991–92 and the monetary policy shift in 1993 affected the determination of private-sector nominal wages, private-sector labour productivity, import prices and the total unemployment rate. The "offensive" devaluation of the Swedish krona by 16 per cent in 1982:4, and the move to a floating Swedish krona with the substantial depreciation of the krona at that time, affected the determination of import prices.
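All three studies rest on the same pipeline: a Johansen trace test to fix the cointegration rank, then a vector error correction model. A minimal sketch with statsmodels, assuming `df` holds the quarterly series (e.g. in logs):

```python
# Hedged sketch of the Johansen-then-VECM pipeline used in the studies above.
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

def fit_vecm(df: pd.DataFrame, lags: int = 4):
    trace = coint_johansen(df, det_order=0, k_ar_diff=lags)
    rank = 0                                 # sequential trace test at 5%
    for stat, cv95 in zip(trace.lr1, trace.cvt[:, 1]):
        if stat <= cv95:
            break
        rank += 1
    model = VECM(df, k_ar_diff=lags, coint_rank=rank, deterministic="co")
    return model.fit()   # res.beta holds the long-run cointegration relations
```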

Relevance: 30.00%

Abstract:

In this work we propose a new variational model for the consistent estimation of motion fields. The aim of this work is to develop appropriate spatio-temporal coherence models. In this sense, we propose two main contributions: a nonlinear flow constancy assumption, similar in spirit to the nonlinear brightness constancy assumption, which conveniently relates flow fields at different time instants; and a nonlinear temporal regularization scheme, which complements the spatial regularization and can cope with piecewise continuous motion fields. These contributions yield a congruent variational model, since all the energy terms except the spatial regularization are based on nonlinear warpings of the flow field. This model is more general than its spatial counterpart, provides more accurate solutions and preserves the continuity of optical flows in time. In the experimental results, we show that the method attains better results and, in particular, considerably improves accuracy in the presence of large displacements.
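A generic energy of this family, written only to fix ideas (the paper's exact terms and penalisers differ), couples a data term, spatial regularization, and a nonlinear temporal term based on warping the flow:

```latex
% Illustrative form: w_t is the flow between frames I_t and I_{t+1},
% \Psi a robust penaliser, \alpha and \beta regularization weights.
\begin{equation*}
E(w_t) = \int_\Omega \Psi\big(|I_{t+1}(x + w_t(x)) - I_t(x)|^2\big)\,dx
       + \alpha \int_\Omega \Psi\big(|\nabla w_t(x)|^2\big)\,dx
       + \beta \int_\Omega \Psi\big(|w_{t+1}(x + w_t(x)) - w_t(x)|^2\big)\,dx .
\end{equation*}
```

The last term captures the nonlinear flow constancy idea: the flow at time t+1, warped by w_t, should agree with w_t wherever motion is coherent.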

Relevance: 30.00%

Abstract:

Recently, a rising interest in political and economic integration/disintegration issues has developed in the political economy field. This growing strand of literature partly draws on traditional issues of fiscal federalism and optimum public good provision, and focuses on a trade-off between the benefits of centralization, arising from economies of scale or externalities, and the costs of harmonizing policies as a consequence of the increased heterogeneity of individual preferences in an international union or in a country composed of at least two regions. This thesis stems from this strand of literature and aims to shed some light on two highly relevant aspects of the political economy of European integration. The first concerns the role of public opinion in the integration process; more precisely, how the economic benefits and costs of integration shape citizens' support for European Union (EU) membership. The second is the allocation of policy competences among different levels of government: European, national and regional. Chapter 1 introduces the topics developed in this thesis by reviewing the main recent theoretical developments in the political economy analysis of integration processes. It is structured as follows. First, it briefly surveys a few relevant articles on economic theories of integration and disintegration processes (Alesina and Spolaore 1997, Bolton and Roland 1997, Alesina et al. 2000, Casella and Feinstein 2002) and discusses their relevance for the study of the impact of economic benefits and costs on public opinion's attitude towards the EU. Subsequently, it explores the links between this political economy literature and theories of fiscal federalism, especially with regard to normative considerations concerning the optimal allocation of competences in a union. Chapter 2 first proposes a model of citizens' support for membership of international unions, with explicit reference to the EU; subsequently it tests the model on a panel of EU countries. What are the factors that influence public opinion support for the European Union (EU)? In international relations theory, the idea that citizens' support for the EU depends on material benefits deriving from integration, i.e. whether European integration makes individuals economically better off (utilitarian support), has been common since the 1970s, but has never been the subject of a formal treatment (Hix 2005). A small number of studies in the 1990s investigated econometrically the link between national economic performance and mass support for European integration (Eichenberg and Dalton 1993; Anderson and Kalthenthaler 1996), but only on the basis of informal assumptions. The main aim of Chapter 2 is thus to propose and test our model with a view to providing a more complete and theoretically grounded picture of public support for the EU. Following theories of utilitarian support, we assume that citizens are in favour of membership if they receive economic benefits from it. To develop this idea, we propose a simple political economic model drawing on the recent economic literature on integration and disintegration processes. The basic element is the existence of a trade-off between the benefits of centralisation and the costs of harmonising policies in the presence of heterogeneous preferences among countries.
The approach we follow is that of the recent literature on the political economy of international unions and the unification or break-up of nations (Bolton and Roland 1997, Alesina and Wacziarg 1999, Alesina et al. 2001, 2005a, to mention only the most relevant). The general perspective is that unification provides returns to scale in the provision of public goods, but reduces each member state's ability to determine its most favoured bundle of public goods. In the simple model presented in Chapter 2, support for membership of the union is increasing in the union's average income and in the loss of efficiency stemming from being outside the union, and decreasing in a country's average income, while increasing heterogeneity of preferences among countries points to a reduced scope of the union. Afterwards we empirically test the model with data on the EU; more precisely, we perform an econometric analysis employing a panel of member countries over time. The second part of Chapter 2 thus tries to answer the following question: does public opinion support for the EU really depend on economic factors? The findings are broadly consistent with our theoretical expectations: the conditions of the national economy, differences in income among member states and heterogeneity of preferences shape citizens' attitude towards their country's membership of the EU. Consequently, this analysis offers some interesting policy implications for the present debate about ratification of the European Constitution and, more generally, about how the EU could act in order to gain more support from the European public. Citizens in many member states are called to express their opinion in national referenda, which may well end up in rejection of the Constitution, as recently happened in France and the Netherlands, triggering a Europe-wide political crisis. These events show that nowadays understanding public attitudes towards the EU is not only of academic interest, but also has a strong relevance for policy-making. Chapter 3 empirically investigates the link between European integration and regional autonomy in Italy. Over the last few decades, the double tendency towards supranationalism and regional autonomy which has characterised some European states has taken a very interesting form in this country, because Italy, besides being one of the founding members of the EU, also implemented a process of decentralisation during the 1970s, further strengthened by a constitutional reform in 2001. Moreover, the issue of the allocation of competences among the EU, the member states and the regions is now especially topical. The process leading to the drafting of the European Constitution (even if it has not come into force) has attracted much attention from a constitutional political economy perspective, from both a normative and a positive point of view (Breuss and Eller 2004, Mueller 2005). The Italian parliament has recently passed a new thorough constitutional reform, still to be approved by citizens in a referendum, which includes, among other things, the so-called "devolution", i.e. granting the regions exclusive competence in public health care, education and local police. Following and extending the methodology proposed in a recent influential article by Alesina et al.
(2005b), which concentrated only on EU activity (treaties, legislation, and European Court of Justice rulings), we develop a set of quantitative indicators measuring the intensity of the legislative activity of the Italian State, the EU and the Italian regions from 1973 to 2005 in a large number of policy categories. By doing so, we seek to answer the following broad questions. Are European and regional legislation substitutes for state laws? To what extent are the competences attributed by the European treaties or the Italian Constitution actually exerted in the various policy areas? Is their exertion consistent with the normative recommendations of the economic literature about their optimal allocation among different levels of government? The main results show that, first, there seems to be a certain substitutability between EU and national legislation (even if not a very strong one), but not between regional and national legislation. Second, the EU concentrates its legislative activity mainly on international trade and agriculture, whilst social policy is where the regions and the State (which is also the main actor in foreign policy) are more active. Third, at least two levels of government (in some cases all of them) are significantly involved in legislative activity in many sectors, even where the rationale for this is, at best, very questionable, indicating that they actually share a larger number of policy tasks than economic theory suggests. It appears, therefore, that an excessive number of competences are shared among different levels of government. From an economic perspective, it may well be recommended that some competences be shared, but only when the balance between scale or spillover effects and heterogeneity of preferences suggests so. When, on the contrary, too many levels of government are involved in a certain policy area, the distinction between their different responsibilities easily becomes unnecessarily blurred. This may not only lead to a slower and less efficient policy-making process, but also risks making policy too complicated for citizens to understand, when they should instead be able to know who is really responsible for a certain policy when voting in national, local or European elections or in referenda on national or European constitutional issues.
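The comparative statics of the Chapter 2 model can be condensed into a toy function; the linear form and coefficients below are illustrative, not the chapter's actual specification:

```python
# Toy sketch of the stylised support relation: support rises with the
# union's average income and the efficiency loss of staying outside, and
# falls with the country's own income and with preference heterogeneity.
def support_for_membership(union_avg_income, own_avg_income,
                           efficiency_loss_outside, heterogeneity,
                           a=1.0, b=1.0, c=1.0, d=1.0):
    return (a * union_avg_income + b * efficiency_loss_outside
            - c * own_avg_income - d * heterogeneity)
```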

Relevance: 30.00%

Abstract:

Although recovery is often described as the least studied and documented phase of the emergency management cycle, a wide literature is available describing the characteristics and sub-phases of this process. Previous works do not allow one to gain an overall perspective because of a lack of systematic, consistent monitoring of recovery using advanced technologies such as remote sensing and GIS. Taking into consideration the key role of remote sensing in response and damage assessment, this thesis aims to verify the appropriateness of such advanced monitoring techniques for detecting recovery advancements over time, with close attention to the main characteristics of the study event: the Hurricane Katrina storm surge. Based on multi-source, multi-sensor and multi-temporal data, the post-Katrina recovery was analysed using both a qualitative and a quantitative approach. The first phase was dedicated to investigating the relation between urban types, damage and recovery state, referring to geographical and technological parameters. Damage and recovery scales were proposed to review critical observations on remarkable surge-induced effects on various typologies of structures, analysed at a per-building level. This wide-ranging investigation allowed a new understanding of the distinctive features of the recovery process. A quantitative analysis was employed to develop methodological procedures suited to recognizing and monitoring the distribution, timing and characteristics of recovery activities in the study area. Promising results, gained by applying supervised classification algorithms to detect the localization and distribution of blue tarps, have proved that this methodology may help the analyst in the detection and monitoring of recovery activities in areas affected by medium damage. The study found that Mahalanobis Distance was the classifier which provided the most accurate results in localizing blue roofs, with 93.7% of blue roofs classified correctly and a producer accuracy of 70%. It was also the classifier least sensitive to spectral signature alteration. The application of dissimilarity textural classification to satellite imagery demonstrated the suitability of this technique for detecting debris distribution and for monitoring demolition and reconstruction activities in the study area. Linking these geographically extensive techniques with expert per-building interpretation of advanced-technology ground surveys provides a multi-faceted view of the physical recovery process. Remote sensing and GIS technologies combined with an advanced ground survey approach provide extremely valuable capability in monitoring recovery activities and may constitute a technical basis for guiding aid organizations and local governments in recovery management.
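Mahalanobis-distance classification, the best performer here for blue-roof detection, assigns each pixel to the class whose training statistics it is closest to in covariance-scaled distance. A minimal sketch (band counts and the handling of analyst-drawn training regions are assumptions):

```python
# Sketch of per-pixel Mahalanobis-distance classification.
import numpy as np

def mahalanobis_classify(pixels, class_samples):
    """pixels: (n, bands); class_samples: {name: (m, bands) training pixels}.
    Returns, for each pixel, the class with the smallest Mahalanobis distance."""
    names, dists = list(class_samples), []
    for name in names:
        X = class_samples[name]
        mu = X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
        diff = pixels - mu
        dists.append(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))
    return np.array(names)[np.argmin(np.array(dists), axis=0)]
```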

Relevance: 30.00%

Abstract:

Clusters have increasingly become an essential part of policy discourses at all levels (EU, national, regional) dealing with regional development, competitiveness, innovation, entrepreneurship and SMEs. These impressive efforts to promote the concept of clusters in the policy-making arena have been accompanied by much less academic and scientific research investigating the actual economic performance of firms in clusters and the design and execution of cluster policies, and going beyond singular case studies to a more methodologically integrated and comparative approach to the study of clusters and their real-world impact. The theoretical background is far from consolidated, and there is a variety of methodologies and approaches for studying and interpreting this phenomenon, while at the same time little comparability among studies of actual cluster performance. The conceptual framework of clustering suggests that clusters affect performance, but theory makes little prediction as to the ultimate distribution of the value being created by clusters. This thesis takes the case of Eastern European countries for two reasons. One is that clusters, as coopetitive environments, are a new phenomenon there, as the previous centrally planned system did not allow for such types of firm organization. The other is that, as new EU member states, they have been subject to the increased popularization of the cluster policy approach by the European Commission, especially in the framework of the National Reform Programmes related to the Lisbon objectives. The originality of the work lies in the fact that, starting from an overview of theoretical contributions on clustering, it offers a comparative empirical study of clusters in transition countries. There have been very few examples in the literature that attempt to examine cluster performance in a comparative cross-country perspective. It adds to this an analysis of cluster policies and their implementation, or lack thereof, as a way to analyse how the cluster concept has been introduced to transition economies. Our findings show that the implementation of cluster policies does vary across countries, with some countries having embraced it more than others. The specific modes of implementation, however, are very similar, based mostly on soft measures such as funding for cluster initiatives, usually directed towards the creation of cluster management structures or cluster facilitators. They are essentially founded on a common assumption that the added value of clusters lies in the creation of linkages among firms, human capital, skills and knowledge at the local level, most often perceived as the regional level. Oftentimes geographical proximity is not a necessary element in the application process, and cluster applications are very similar to network membership. Cluster mapping is rarely a factor in the selection of cluster initiatives for funding, and the related question about critical mass and expected outcomes is not considered. In fact, monitoring and evaluation are not elements of the cluster policy cycle which have received a lot of attention. Bulgaria and the Czech Republic are the countries which have implemented cluster policies most decisively, Hungary and Poland have made significant efforts, while Slovakia and Romania have used cluster initiatives only sporadically and unsystematically.
When examining whether firms located within regional clusters in fact perform better and are more efficient than similar firms outside clusters, we do find positive results across countries and across sectors. The only country where location in a cluster has a negative impact is the Czech Republic.
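A comparison of this kind is typically run as a performance regression with a cluster-membership dummy. A hedged sketch; the variable names are illustrative, not the thesis's actual specification:

```python
# Sketch: regress a firm performance measure on a 0/1 cluster dummy
# plus controls; the coefficient on `in_cluster` is the cluster premium.
import statsmodels.formula.api as smf

def cluster_premium(firms):  # firms: pandas DataFrame with the columns below
    res = smf.ols('log_productivity ~ in_cluster + log_employment '
                  '+ C(sector) + C(country)', data=firms).fit(cov_type='HC1')
    return res.params['in_cluster']
```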

Relevance: 30.00%

Abstract:

Several countries have acquired, over the past decades, large amounts of area-covering Airborne Electromagnetic (AEM) data. The contribution of airborne geophysics has dramatically increased for both groundwater resource mapping and management, proving how appropriate those systems are for large-scale and efficient groundwater surveying. We start with the processing and inversion of two AEM datasets from two different systems collected over the Spiritwood Valley Aquifer area, Manitoba, Canada: the AeroTEM III dataset (commissioned by the Geological Survey of Canada in 2010) and the "Full waveform VTEM" dataset, collected and tested over the same survey area during the fall of 2011. We demonstrate that, in the presence of multiple datasets, both AEM and ground data, due processing, inversion, post-processing, data integration and data calibration constitute the proper approach, capable of providing reliable and consistent resistivity models. Our approach can be of interest to many end users, ranging from geological surveys and universities to private companies, which often own large geophysical databases to be interpreted for geological and/or hydrogeological purposes. In this study we investigate in depth the role of integrating several complementary types of geophysical data collected over the same survey area. We show that data integration can improve inversions, reduce ambiguity and deliver high-resolution results. We further attempt to use the final, most reliable output resistivity models as a solid basis for building a knowledge-driven 3D geological voxel-based model. A voxel approach allows a quantitative understanding of the hydrogeological setting of the area, and it can be further used to estimate aquifer volumes (i.e. the potential amount of groundwater resources) as well as for hydrogeological flow model prediction. In addition, we investigated the impact of an AEM dataset on hydrogeological mapping and 3D hydrogeological modeling, comparing it to having only a ground-based TEM dataset and/or only borehole data.
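The inversions referred to above are, at their core, regularised least-squares updates of a layered-resistivity model. A schematic Gauss-Newton step with a smoothness (Occam-style) constraint, assuming a forward TEM modeller supplies the predicted data and the Jacobian:

```python
# Schematic smoothness-regularised model update; the AEM forward modelling
# that produces d_pred and the Jacobian J is omitted and assumed given.
import numpy as np

def regularized_step(J, d, d_pred, m, lam):
    """One Gauss-Newton update minimising ||d - F(m)||^2 + lam*||R m||^2."""
    R = np.diff(np.eye(len(m)), axis=0)           # first-difference roughness
    lhs = J.T @ J + lam * R.T @ R
    rhs = J.T @ (d - d_pred) - lam * R.T @ (R @ m)
    return m + np.linalg.solve(lhs, rhs)
```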

Relevance: 30.00%

Abstract:

Magnetic Resonance Spectroscopy (MRS) is an advanced clinical and research application which guarantees a specific biochemical and metabolic characterization of tissues through the detection and quantification of key metabolites for diagnosis and disease staging. The "Associazione Italiana di Fisica Medica (AIFM)" has promoted the activity of the "Interconfronto di spettroscopia in RM" working group. The purpose of the study is to compare and analyze results obtained by performing MRS on scanners from different manufacturers in order to compile a robust protocol for spectroscopic examinations in clinical routine. This thesis takes part in this project using the GE Signa HDxt 1.5 T scanner at Pavilion no. 11 of the S. Orsola-Malpighi hospital in Bologna. The spectral analyses have been performed with the jMRUI package, which includes a wide range of preprocessing and quantification algorithms for signal analysis in the time domain. After quality assurance on the scanner with standard and innovative methods, spectra both with and without suppression of the water peak were acquired on the GE test phantom. The comparison of the ratios of the metabolite amplitudes over creatine computed by the workstation software, which works in the frequency domain, and by jMRUI shows good agreement, suggesting that quantifications in both domains may lead to consistent results. The characterization of an in-house phantom provided by the working group achieved its goal of assessing the solution content and the metabolite concentrations with good accuracy. The soundness of the experimental procedure and data analysis has been demonstrated by the correct estimation of the T2 of water, the observed biexponential relaxation curve of creatine, and the correct TE value at which the modulation by J-coupling causes the lactate doublet to be inverted in the spectrum. The work of this thesis has demonstrated that it is possible to perform measurements and establish protocols for data analysis, based on the physical principles of NMR, which are able to provide robust values for the spectral parameters of clinical use.
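The relaxation analyses mentioned above reduce to fitting exponential decays of peak amplitude versus echo time. A minimal sketch with scipy on synthetic data (the TE grid and the assumed T2 are illustrative):

```python
# Sketch of the T2 fits: mono-exponential for water, biexponential for
# creatine (fit analogously with `biexp`). Data here are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def mono(te, a, t2):
    return a * np.exp(-te / t2)

def biexp(te, a1, t2a, a2, t2b):
    return a1 * np.exp(-te / t2a) + a2 * np.exp(-te / t2b)

te = np.linspace(30, 1500, 20)          # echo times in ms (illustrative)
water = np.exp(-te / 700.0)             # synthetic decay with assumed T2
(a, t2_water), _ = curve_fit(mono, te, water, p0=(1.0, 500.0))
```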

Relevance: 30.00%

Abstract:

In the domain of safety-critical embedded systems, the design process for applications is highly complex. For a given hardware architecture, electronic control units can be upgraded so that all existing processes and signals execute on time. The timing requirements are strict and must be met in every periodic recurrence of the processes, since guaranteeing parallel execution is of utmost importance. Existing approaches can quickly compute design alternatives, but they do not guarantee that the cost of the required hardware changes is minimal. We present an approach that computes cost-minimal solutions to the problem which satisfy all timing constraints. Our algorithm uses linear programming with column generation, embedded in a tree structure, to provide lower and upper bounds during the optimization process. The complex side constraints guaranteeing periodic execution are shifted, through a decomposition of the master problem, into independent subproblems formulated as integer linear programs. Both the analyses of process execution and the methods of signal transmission are examined, and linearized representations are given. Furthermore, we present a new formulation for fixed-priority execution that additionally computes worst-case process response times, which are needed for scenarios in which timing constraints are imposed on subsets of processes and signals. We demonstrate the applicability of our methods by analyzing instances containing process structures from real applications. Our results show that lower bounds can be computed quickly to prove the optimality of heuristic solutions. When we deliver optimal solutions with response times, our new formulation compares favourably with other approaches in terms of running time. The best results are obtained with a hybrid approach that combines heuristic initial solutions, preprocessing, and a heuristic phase followed by a short exact computation phase.
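The core of the method is a restricted master LP whose optimal value is a lower bound, with new columns priced in by ILP subproblems. A schematic sketch of that loop's master step, not the thesis's actual formulation (a set-covering master is assumed for illustration):

```python
# Restricted master problem of a column-generation loop: columns are
# candidate configurations with costs; every process must be covered.
import numpy as np
from scipy.optimize import linprog

def solve_rmp(costs, cover):
    """cover: (n_processes, n_columns) 0/1 matrix. The LP relaxation's
    value is a valid lower bound; its duals drive the pricing ILPs."""
    res = linprog(c=costs, A_ub=-cover, b_ub=-np.ones(cover.shape[0]),
                  bounds=[(0, None)] * len(costs), method='highs')
    return res.fun, res.ineqlin.marginals

# Columns with negative reduced cost (found by the ILP pricing subproblems)
# are appended to `cover`/`costs` and the master is re-solved.
```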

Relevance: 30.00%

Abstract:

Software visualizations can provide a concise overview of a complex software system. Unfortunately, as software has no physical shape, there is no "natural" mapping of software to a two-dimensional space. As a consequence, most visualizations tend to use a layout in which position and distance have no meaning, and the layout typically diverges from one visualization to another. We propose an approach to consistent layout for software visualization, called Software Cartography, in which the position of a software artifact reflects its vocabulary, and distance corresponds to similarity of vocabulary. We use Latent Semantic Indexing (LSI) to map software artifacts to a vector space, and then use Multidimensional Scaling (MDS) to map this vector space down to two dimensions. The resulting consistent layout allows us to develop a variety of thematic software maps that express very different aspects of software while making it easy to compare them. The approach is especially suitable for comparing views of evolving software, as the vocabulary of software artifacts tends to be stable over time. We present a prototype implementation of Software Cartography, and illustrate its use with practical examples from numerous open-source case studies.
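The pipeline as described maps cleanly onto standard tooling; a minimal sketch (not the authors' prototype) using scikit-learn, with one vocabulary document per software artifact:

```python
# Sketch of the Software Cartography pipeline: tf-idf -> LSI (truncated
# SVD) -> metric MDS down to 2-D map positions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.manifold import MDS

def software_map(documents, n_topics=50):
    """documents: one text (identifiers, comments) per software artifact."""
    tfidf = TfidfVectorizer(sublinear_tf=True).fit_transform(documents)
    lsi = TruncatedSVD(n_components=n_topics).fit_transform(tfidf)
    # A fixed random_state keeps the layout comparable across versions.
    return MDS(n_components=2, random_state=0).fit_transform(lsi)
```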