901 results for: Formal Methods; Component-Based Development; Competition; Model Checking


Relevance: 100.00%

Abstract:

Despite the great importance of soybeans in Brazil, there have been few applications of soybean crop modelling under Brazilian conditions. The objective of this study was therefore to use modified crop models to estimate the depleted and potential soybean crop yield in Brazil. The climatic variables used in the modified crop-model simulations were temperature, insolation and rainfall, with data taken from 33 counties (28 São Paulo state counties and 5 counties from other states that neighbour São Paulo). Modifications were proposed to the estimation of soybean leaf area, including corrections for temperature, shading, senescence, CO2 and biomass partitioning; the way the climatic variables are supplied to the model's simulation was also reconsidered. The depleted yields were estimated through a water balance, from which the depletion coefficient was obtained. It can be concluded that the adapted soybean crop growth model can be used to predict depleted and potential soybean yields, and also to indicate better locations and periods for tillage.
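The abstract does not spell out the water-balance step; as a minimal sketch, the widely used FAO-33 style relation links the depleted (water-limited) yield to the potential yield through a yield-response (depletion) coefficient. The coefficient value and evapotranspiration numbers below are hypothetical, not taken from the study:

```python
def depleted_yield(potential_yield, eta, etp, ky=1.0):
    """Water-limited yield from the FAO-33 style relation
    Ya = Yp * (1 - ky * (1 - ETa/ETp)), where ETa/ETp is the ratio of
    actual to potential evapotranspiration from a water balance and
    ky is the crop yield-response (depletion) coefficient."""
    deficit = 1.0 - eta / etp
    return potential_yield * (1.0 - ky * deficit)

# Hypothetical season: 3.5 t/ha potential yield, ETa/ETp = 0.8
print(depleted_yield(3.5, eta=400.0, etp=500.0))  # ≈ 2.8 t/ha
```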

Relevance: 100.00%

Abstract:

Study I: Real Wage Determination in the Swedish Engineering Industry

This study uses the monopoly union model to examine the determination of real wages, and in particular the effects of active labour market programmes (ALMPs) on real wages, in the engineering industry. Quarterly data for the period 1970:1 to 1996:4 are used in a cointegration framework, utilising Johansen's maximum likelihood procedure. On the basis of the Johansen (trace) test results, vector error correction (VEC) models are created to model the determination of real wages in the engineering industry. The estimation results support the presence of a long-run wage-raising effect of rises in labour productivity, in the tax wedge, in the alternative real consumer wage and in real UI benefits. They also support a long-run wage-raising effect of increases in the participation rates in ALMPs, relief jobs and labour market training. This can be interpreted as meaning that the possibility of participating in an ALMP increases the utility for workers of not being employed in the industry, which in turn can raise real wages in the industry in the long run. Finally, the estimation results show evidence of a long-run wage-reducing effect of increases in the unemployment rate.

Study II: Intersectoral Wage Linkages in Sweden

The purpose of this study is to investigate whether wage-setting in certain sectors of the Swedish economy affects wage-setting in other sectors. The theoretical background is the Scandinavian model of inflation, which states that wage-setting in the sectors exposed to international competition affects wage-setting in the sheltered sectors of the economy. The Johansen maximum likelihood cointegration approach is applied to quarterly data on Swedish sector wages for the period 1980:1–2002:2. Different vector error correction (VEC) models are created, based on assumptions as to which sectors are exposed to international competition and which are not. The adaptability of wages between sectors is then tested by imposing restrictions on the estimated VEC models. Finally, Granger causality tests are performed on the restricted and unrestricted VEC models to test for sector wage leadership. The empirical results indicate considerable wage adaptability between manufacturing, construction, the wholesale and retail trade, the central government sector, and the municipalities and county councils sector, which is consistent with the assumptions of the Scandinavian model. They further indicate low wage adaptability between the financial sector and manufacturing, and between the financial sector and the two public sectors. The Granger causality tests provide strong evidence of intersectoral wage causality, but no evidence of a wage-leading role, in line with the assumptions of the Scandinavian model, for any of the sectors.

Study III: Wage and Price Determination in the Private Sector in Sweden

The purpose of this study is to analyse wage and price determination in the Swedish private sector during the period 1980–2003. The theoretical background is a variant of the "imperfect competition model of inflation", which assumes imperfect competition in the labour and product markets; according to the model, wages and prices are determined as the outcome of a "battle of mark-ups" between trade unions and firms. The Johansen maximum likelihood cointegration approach is applied to quarterly Swedish data on consumer prices, import prices, private-sector nominal wages, private-sector labour productivity and the total unemployment rate for the period 1980:1–2003:3. The chosen cointegration rank of the estimated VEC model is two, so two cointegration relations are assumed: one for private-sector nominal wage determination and one for consumer price determination. The estimation results indicate that an increase in consumer prices of one per cent lifts private-sector nominal wages by 0.8 per cent, while an increase in private-sector nominal wages of one per cent raises consumer prices by one per cent. An increase of one percentage point in the total unemployment rate reduces private-sector nominal wages by about 4.5 per cent. The long-run effects of private-sector labour productivity and import prices on consumer prices are about –1.2 and 0.3 per cent, respectively. The Rehnberg agreement of 1991–92 and the monetary policy shift in 1993 affected the determination of private-sector nominal wages, private-sector labour productivity, import prices and the total unemployment rate. The "offensive" devaluation of the Swedish krona by 16 per cent in 1982:4, as well as the subsequent move to a floating krona and its substantial depreciation at that time, affected the determination of import prices.
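The reported long-run elasticities can be combined arithmetically; a small numeric sketch using only the coefficients stated in the abstract (the thesis's actual cointegration relations are in logs and include further terms):

```python
# Long-run responses implied by the elasticities reported in Study III:
# +1% consumer prices -> +0.8% nominal wages; +1 percentage point of
# unemployment -> about -4.5% wages; +1% wages -> +1% consumer prices;
# productivity and import-price effects on prices: -1.2 and +0.3.
def wage_response(dprice_pct, dunemp_pp):
    return 0.8 * dprice_pct - 4.5 * dunemp_pp

def price_response(dwage_pct, dprod_pct, dimport_pct):
    return 1.0 * dwage_pct - 1.2 * dprod_pct + 0.3 * dimport_pct

# E.g. 2% price inflation with unemployment up half a point:
print(wage_response(2.0, 0.5))        # ≈ -0.65 (per cent)
print(price_response(1.0, 1.0, 1.0))  # ≈ 0.1 (per cent)
```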

Relevance: 100.00%

Abstract:

[ES] This work describes the development of an application for recording the teaching hours delivered by lecturers at the university. The aim is to have this information in digital form in order to speed up any administrative procedures that involve it. On the lecturers' side, notifications are sent by e-mail to confirm the signed-off teaching, as a personal record so that lecturers know what teaching they have delivered and, in the case of a substitution, so that the substituted lecturer also has a record of the substitution. The development follows agile methods, using test-driven development for the model and persistence modules.

Relevance: 100.00%

Abstract:

The need for a convergence between semi-structured data management and Information Retrieval techniques is manifest to the scientific community. To meet this growing demand, the W3C has recently proposed XQuery Full Text, an IR-oriented extension of XQuery. However, query optimization requires the study of important properties such as query equivalence and containment; to this end, a formal representation of documents and queries is needed. The goal of this thesis is to establish such a formal background. We define a data model for XML documents and propose an algebra able to represent most XQuery Full Text expressions. We show how an XQuery Full Text expression can be translated into an algebraic expression and how an algebraic expression can be optimized.

Relevance: 100.00%

Abstract:

[EN] This paper presents a location–price equilibrium problem on a tree. A sufficient condition is given for the existence of a Nash equilibrium in a spatial competition model that incorporates price, transport and externality costs. This condition implies that both competitors locate at the same point: a vertex that is the unique median of the tree. The condition is, however, not necessary for equilibrium, and examples show that not all medians are equilibria. Finally, an application to the Tenerife tram is presented.
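The equilibrium condition above singles out the median of the tree; a small sketch computing the distance-minimizing median of an unweighted tree by brute force. This illustrates only the location part of the model; prices, transport and externality costs are omitted:

```python
from collections import deque

def tree_median(adj):
    """Vertex of a tree minimizing the total distance to all vertices.

    adj: {vertex: [neighbours]} for an unweighted tree. (Illustrative
    only: the paper's equilibrium also involves price, transport and
    externality costs, which are not modelled here.)"""
    def total_distance(src):
        dist = {src: 0}
        q = deque([src])
        while q:  # BFS computes hop distances from src to every vertex
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return sum(dist.values())

    return min(adj, key=total_distance)

# A star: centre 0 with leaves 1..4 -> the median is the centre.
star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
print(tree_median(star))  # 0
```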

Relevance: 100.00%

Abstract:

Stock overexploitation and socio-economic sustainability are two major issues currently at stake in European fisheries. The European Commission is therefore considering the implementation of management plans as a means to move towards a longer-term perspective on fisheries management, to take account of regional differences and to increase stakeholder involvement. The Adriatic small pelagic species (anchovy and sardine) are among the most studied species in the world from a biological perspective, and several economic analyses have also been carried out on the Italian pelagic fishery; despite this, no complete bioeconomic model covering all biological, technical and economic aspects has yet been developed. Bioeconomic models are not foolproof tools, but they are important instruments for supporting decision makers and can supply a fundamental scientific basis for management plans. This research gathers all available information (biological, technological and economic) in order to build a bioeconomic model of the Adriatic pelagic fishery. Different approaches are analysed, and some of them developed, to highlight divergences in results, characteristics and implications. Growth, production and demand functions are estimated. A formal analysis of the interaction and competition between the Italian and Croatian fleets is carried out, proposing different equilibria for open access, duopoly and a form of cooperative solution. Normative judgements remain limited, however, because of the poor knowledge of population dynamics and of data on the Croatian fleet.
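The open-access and duopoly comparison follows standard surplus-production logic; a toy Gordon-Schaefer sketch with hypothetical parameters (not the thesis's estimated Adriatic model), showing the two bounding equilibrium stock levels between which a duopoly outcome would typically fall:

```python
# Gordon-Schaefer surplus-production toy model (all parameters hypothetical).
r, K = 0.8, 1000.0         # intrinsic growth rate, carrying capacity
q, p, c = 0.01, 5.0, 20.0  # catchability, fish price, unit cost of effort

# Open access: entry continues until rent is dissipated -> p*q*X = c
X_open_access = c / (p * q)

# Sole owner: effort chosen to maximize sustainable rent
# -> X = K/2 + c/(2*p*q); a duopoly equilibrium lies between the two.
X_sole_owner = K / 2 + c / (2 * p * q)

print(X_open_access, X_sole_owner)  # ≈ 400.0 and 700.0
```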

Relevance: 100.00%

Abstract:

Environmental computer models are deterministic models devoted to predicting environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, with unknown calibration, and carry no information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations; statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast in real time the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and the predictions for the next three hours. We propose a Bayesian downscaler model based on first differences, with a flexible coefficient structure and an efficient computational strategy for fitting the model parameters. Model validation for the eastern United States shows a considerable improvement of our fully inferential approach over the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain spatially varying uncertainty associated with numerical model output. We show how such uncertainty can be learned through suitable stochastic data fusion modelling using external validation data, and we illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
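The 8-hour average forecast target has a simple arithmetic definition that the text states directly: the mean of four past hourly values, the current hour, and three forecast hours. A minimal sketch (hourly ozone values are hypothetical):

```python
def current_8hr_average(past4, current, forecast3):
    """Current 8-hour average ozone as defined in the text: the mean of
    the previous four hourly values, the current hour, and the model's
    predictions for the next three hours (8 values in total)."""
    values = list(past4) + [current] + list(forecast3)
    assert len(values) == 8
    return sum(values) / 8.0

# Hypothetical hourly ozone levels (ppb):
print(current_8hr_average([40, 42, 44, 46], 48, [50, 52, 54]))  # 47.0
```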

Relevance: 100.00%

Abstract:

Modern software systems, in particular distributed ones, are everywhere around us and underpin our everyday activities. Guaranteeing their correctness, consistency and safety is therefore of paramount importance, and their complexity makes the verification of such properties a very challenging task. It is natural to expect these systems to be reliable and, above all, useful. (i) In order to be reliable, compositional models of software systems need to account for consistent dynamic reconfiguration, i.e., changing the communication patterns of a program at runtime. (ii) In order to be useful, compositional models of software systems need to account for interaction, which can be seen as communication patterns among components that collaborate to achieve a common task. The aim of this Ph.D. was to develop powerful techniques based on formal methods for the verification of correctness, consistency and safety properties related to dynamic reconfiguration and communication in complex distributed systems. In particular, static analysis techniques based on types and type systems appeared to be an adequate methodology, given their success in guaranteeing not only basic safety properties but also more sophisticated ones, such as deadlock or livelock freedom in a concurrent setting. The main contributions of this dissertation are twofold. (i) On the component side, we design types and a type system for a concurrent object-oriented calculus, to statically ensure the consistency of dynamic reconfigurations that modify the communication patterns of a program at execution time. (ii) On the communication side, we study advanced safety properties related to communication in complex distributed systems, namely deadlock freedom, livelock freedom and progress. Most importantly, we exploit an encoding of the types and terms of a typical distributed language, the session π-calculus, into the standard typed π-calculus, in order to understand their expressive power.

Relevance: 100.00%

Abstract:

As part of interdisciplinary work towards realising the »human right to health«, a geomedical information system was created that is applicable to the north-facing mountain villages between 350 m and 450 m above sea level of Kabupaten Sikka on the island of Flores, Indonesia. An analysis of the time-space dimension of the health situation was carried out in Wololuma and Napun Lawan, as examples of the north-facing mountain villages. Health hazards and health risks in the study area were analysed, hazard zones identified and risk areas assessed. Despite an El Niño year, the diseases showed clear links to the seasonal rhythm of the wet-and-dry tropics. Starting from the assumption that diseases correlate with specific climatic elements, such relationships were sought. For each disease, macro-, meso- and micro-level risk areas were determined, thereby localising disease foci. At the macro level, the generalised geomedical information system can be transferred to the north-facing mountain villages between 350 m and 450 m above sea level of Kabupaten Sikka. From the many diseases encountered, six were selected. Based on their frequencies, the following priority list for the population's health risk emerges:

- dermatomycoses (year-round)
- typhoid fever (year-round)
- lower respiratory tract infections (transitional season)
- upper respiratory tract infections (transitional season)
- malaria (rainy season)
- goitre (year-round)

The main risk group at the macro level is the female population, in particular girls from birth to six years of age and women from 41 years onward. The maps produced of the temporal and spatial distribution patterns of the diseases and of access to health services serve decision-makers as a basis for allocating resources to primary prevention.

Geography as a discipline, with its methods and its time-space model, has shown that it forms the basis for this interdisciplinary research. The interdisciplinary collaboration on health research during the 2009 study period has proved its worth and must be expanded further. The proposed solutions serve to minimise health risks and to improve preventive health care. Since the systemic relationships underlying the aetiology of the individual diseases are highly complex, a great need for research remains.

The result of the present study shows that water, in every form, is the primary cause of the health risk of the mountain villages in Kabupaten Sikka on the island of Flores, Indonesia. Access to water is indispensable for realising the »human right to health«. The right to water means that every person should have access to safe, sufficient and affordable water. All states of the world should feel bound by this demand.

Relevance: 100.00%

Abstract:

This thesis investigates methods and software architectures for discovering the typical and frequently occurring structures used for organizing knowledge in the Web. We identify these structures as Knowledge Patterns (KPs). KP discovery needs to address two main research problems: the heterogeneity of sources, formats and semantics in the Web (the knowledge soup problem) and the difficulty of drawing a relevant boundary around data so as to capture the meaningful knowledge with respect to a certain context (the knowledge boundary problem). Hence, we introduce two methods that provide different solutions to these two problems by tackling KP discovery from two different perspectives: (i) the transformation of KP-like artifacts into KPs formalized as OWL2 ontologies; (ii) the bottom-up extraction of KPs by analyzing how data are organized in Linked Data. The first method is based on a purely syntactic transformation of the original source to RDF, followed by a refactoring step whose aim is to add semantics to the RDF by selecting meaningful RDF triples. The second method draws boundaries around RDF data in Linked Data by analyzing type paths: a type path is a possible route through an RDF graph that takes into account the types associated with the nodes along the path. We then present K~ore, a software architecture conceived as the basis for developing KP discovery systems and designed according to two software architectural styles, i.e., Component-based and REST. Finally, we provide an example of KP reuse in Aemoo, an exploratory search tool which exploits KPs for entity summarization.
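The notion of a type path can be made concrete with a tiny sketch: a path through an RDF-like graph is reported as the sequence of the types of the nodes it traverses. The toy graph, the type map, and the omission of predicates are all illustrative assumptions, not the thesis's data model:

```python
# Minimal sketch of "type paths": routes through an RDF-like graph
# described by the sequence of types of the nodes they traverse.
graph = {  # subject -> list of objects (predicates omitted for brevity)
    "bob": ["rome"],
    "rome": ["italy"],
}
types = {"bob": "Person", "rome": "City", "italy": "Country"}

def type_paths(graph, types, start, max_len=3):
    """Enumerate paths from `start`, reported as tuples of node types."""
    paths = []
    def walk(node, path):
        path = path + [types[node]]
        paths.append(tuple(path))
        if len(path) < max_len:
            for nxt in graph.get(node, []):
                walk(nxt, path)
    walk(start, [])
    return paths

print(type_paths(graph, types, "bob"))
# [('Person',), ('Person', 'City'), ('Person', 'City', 'Country')]
```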

Relevance: 100.00%

Abstract:

This thesis aims to develop original analytical methods for the determination of drugs with a potential for abuse, for the analysis of substances used in the pharmacological treatment of drug addiction in biological samples, and for the monitoring of potentially toxic compounds added to street drugs. Reliable analytical techniques can play an important role in this setting: they can be employed to reveal drug intake, allowing the identification of drug users, and to assess drug blood levels, assisting physicians in the management of treatment. Pharmacological therapy indeed needs careful monitoring in order to optimize the dose scheduling according to the specific needs of the patient and to discourage improper use of the medication. In particular, different methods have been developed for the detection of gamma-hydroxybutyric acid (GHB), prescribed for the treatment of alcohol addiction; of glucocorticoids, one of the pharmaceutical classes most abused to enhance sport performance; and of adulterants, pharmacologically active compounds added to illicit drugs for recreational purposes. All the presented methods are based on capillary electrophoresis (CE) and high performance liquid chromatography (HPLC) coupled to various detectors (diode array detector, mass spectrometer). Biological sample pre-treatment was carried out using different extraction techniques: liquid-liquid extraction (LLE) and solid phase extraction (SPE). Different matrices were considered: human plasma, dried blood spots, human urine, and simulated street drugs. The developed analytical methods are individually described and discussed in this thesis.

Relevance: 100.00%

Abstract:

The enterobacterium Escherichia coli and the soil bacterium Bacillus subtilis can use C4-dicarboxylic acids as an aerobic carbon source for energy conservation. In E. coli and B. subtilis, C4-dicarboxylate metabolism is regulated by the two-component systems DcuSREc and DctSRBs, respectively, each consisting of a sensor kinase and a response regulator. These control the expression of the C4-dicarboxylate transporter DctA. For its function in aerobic metabolism, the sensor DcuSEc requires the transporter DctA as a co-sensor. For the DctSRBs system, genetic studies had suggested that DctSBs requires the binding protein DctBBs, and possibly also DctABs, as co-sensors for its function. This work set out to demonstrate directly whether DctBBs and DctABs act together, or only one of the two components, as co-sensors for DctSBs. A direct protein-protein interaction with DctSBs was demonstrated for both DctBBs and DctABs using two in vivo interaction methods. Both methods rely on co-purification of the interaction partners by affinity chromatography and, depending on the affinity column, are termed mSPINE or mHPINE (Membrane Strep/His-Protein INteraction Experiment). The interaction of DctSBs with DctBBs was additionally confirmed with a bacterial two-hybrid system. After co-expression with DctSBs, DctABs and DctBBs interact simultaneously with the sensor kinase in mSPINE assays. DctSBs thus forms a sensory DctS/DctA/DctB unit in B. subtilis, and the binding protein DctBBs acts only as a co-sensor, not as a transport binding protein. There is no direct interaction between the transporter DctABs and the binding protein DctBBs. Transport measurements show that DctA-mediated transport of [14C]-succinate is independent of DctBBs. In addition, it was investigated whether two-component systems from other bacterial groups are organised along similar lines to DcuSREc and DctSRBs. The thermophilic bacterium Geobacillus kaustophilus possesses a DctSR system that is genetically clustered with a DctA-type transporter and a DctB binding protein. The sensor kinase DctSGk was heterologously expressed in E. coli and purified. In an E. coli DcuS insertion mutant it complements the DcuS function, showing specificity for the C4-dicarboxylic acids malate, fumarate, L-tartrate and succinate as well as for the C6-tricarboxylic acid citrate. DctSGk reconstituted in liposomes shows autokinase activity upon addition of [γ-33P]-ATP. The KD value of the DctSGk kinase domain for [γ-33P]-ATP is 43 μM; its affinity for ATP is thus about 10-fold higher than that of DcuSEc.

Relevance: 100.00%

Abstract:

Graphene, the thinnest two-dimensional material possible, is considered a realistic candidate for numerous applications in electronics and in energy storage and conversion devices, owing to its unique properties such as high optical transmittance, high conductivity, and excellent chemical and thermal stability. However, the electronic and chemical properties of graphene depend strongly on the preparation method. The development of novel chemical exfoliation processes aiming at high-yield synthesis of high-quality graphene, while maintaining good solution processability, is therefore of great concern. This thesis focuses on the solution production of high-quality graphene by wet-chemical exfoliation methods and addresses the applications of chemically exfoliated graphene in organic electronics and energy storage devices.

Platinum is the most commonly used catalyst for fuel cells, but it suffers from sluggish electron transfer kinetics. Heteroatom-doped graphene, on the other hand, is known to enhance not only electrical conductivity but also long-term operational stability. In this regard, a simple synthetic method is developed for the preparation of nitrogen-doped graphene (NG); moreover, iron (Fe) can be incorporated into the synthetic process. The as-prepared NG, with and without Fe, shows excellent catalytic activity and stability compared with Pt-based catalysts.

High electrical conductivity is one of the most important requirements for the application of graphene in electronic devices. For the fabrication of electrically conductive graphene films, a novel methane-plasma-assisted reduction of GO is therefore developed. The high electrical conductivity of the plasma-reduced GO films yields excellent electrochemical performance, in terms of high power and energy densities, when the films are used as electrodes in micro-supercapacitors.

Although GO can be prepared at bulk scale, its high defect density and low electrical conductivity are major drawbacks. To overcome these intrinsic quality limitations of GO and reduced GO, a novel protocol is established for the mass production of high-quality graphene by electrochemical exfoliation of graphite. The prepared graphene shows high electrical conductivity, low defect density and good solution processability. Furthermore, when used as electrodes in organic field-effect transistors and in supercapacitors, the electrochemically exfoliated graphene shows excellent device performance. The low-cost and environmentally friendly production of such high-quality graphene is of great importance for future generations of electronics and energy storage devices.

Relevance: 100.00%

Abstract:

We propose a novel methodology to generate realistic network flow traces to enable systematic evaluation of network monitoring systems in various traffic conditions. Our technique uses a graph-based approach to model the communication structure observed in real-world traces and to extract traffic templates. By combining extracted and user-defined traffic templates, realistic network flow traces that comprise normal traffic and customized conditions are generated in a scalable manner. A proof-of-concept implementation demonstrates the utility and simplicity of our method to produce a variety of evaluation scenarios. We show that the extraction of templates from real-world traffic leads to a manageable number of templates that still enable accurate re-creation of the original communication properties on the network flow level.
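As an illustration of the template idea, flow records can be grouped by a communication-pattern key, with each group's observed frequency becoming a template that a generator can later sample from. The flow fields and the choice of key below are assumptions for illustration, not the paper's actual template definition:

```python
# Sketch of template extraction from network flow records: flows are
# collapsed into (destination port, protocol) communication patterns,
# each counted so a generator could reproduce their relative frequency.
from collections import Counter

flows = [  # hypothetical flow records
    {"src": "10.0.0.1", "dst": "10.0.0.9", "dport": 443, "proto": "tcp"},
    {"src": "10.0.0.2", "dst": "10.0.0.9", "dport": 443, "proto": "tcp"},
    {"src": "10.0.0.1", "dst": "10.0.0.5", "dport": 53, "proto": "udp"},
]

def extract_templates(flows):
    """Collapse flows into (dport, proto) templates with frequencies."""
    return Counter((f["dport"], f["proto"]) for f in flows)

templates = extract_templates(flows)
print(templates)  # Counter({(443, 'tcp'): 2, (53, 'udp'): 1})
```

Grouping by a small key is what keeps the number of templates manageable while preserving the aggregate communication structure.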

Relevance: 100.00%

Abstract:

The major objectives of this paper are: (1) to review the pros and cons of the scenarios of past anthropogenic land cover change (ALCC) developed during the last ten years; (2) to discuss issues related to pollen-based reconstruction of past land-cover and introduce a new method, REVEALS (Regional Estimates of VEgetation Abundance from Large Sites), to infer long-term records of past land-cover from pollen data; (3) to present a new project (LANDCLIM: LAND cover – CLIMate interactions in NW Europe during the Holocene) currently underway, and to show preliminary REVEALS reconstructions of the regional land-cover in the Czech Republic for five selected time windows of the Holocene; and (4) to discuss the implications and future directions in climate and vegetation/land-cover modelling, and in assessing the effects of human-induced changes in land-cover on regional climate through altered feedbacks. The existing ALCC scenarios show large discrepancies between them, and few cover time periods older than AD 800; when these scenarios are used to assess the impact of human land-use on climate, contrasting results are obtained. This emphasizes the need for methods such as REVEALS model-based land-cover reconstructions, which may help to fine-tune descriptions of past land-cover and lead to a better understanding of how long-term changes in ALCC might have influenced climate. The REVEALS model is shown to provide better estimates of regional vegetation/land-cover changes than the traditional use of pollen percentages, allowing a robust assessment of land cover at regional to continental spatial scales throughout the Holocene. We present maps of REVEALS estimates of the percentage cover of 10 plant functional types (PFTs) at 200 BP and 6000 BP, and of the two open-land PFTs "grassland" and "agricultural land" at five time windows from 6000 BP to recent time. The LANDCLIM results are expected to provide crucial data to reassess ALCC estimates for a better understanding of land surface-atmosphere interactions.
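A crude illustration of why model-based correction beats raw pollen percentages: high pollen producers are over-represented in raw counts, so counts can be divided by relative pollen productivity estimates (PPEs) before normalising. The full REVEALS model additionally accounts for pollen dispersal, fall speeds and basin size, all omitted here; the counts and PPE values are hypothetical:

```python
# Hypothetical pollen counts and relative pollen productivity estimates
# (PPEs, here expressed relative to Poaceae = 1.0).
pollen_counts = {"Pinus": 600, "Poaceae": 200, "Cerealia": 50}
ppe = {"Pinus": 6.0, "Poaceae": 1.0, "Cerealia": 0.5}

# Raw percentages would put Pinus at ~70% cover; dividing each count by
# its taxon's PPE deflates the over-producer before normalising to 100%.
corrected = {t: pollen_counts[t] / ppe[t] for t in pollen_counts}
total = sum(corrected.values())
cover = {t: round(100 * v / total, 1) for t, v in corrected.items()}
print(cover)  # {'Pinus': 25.0, 'Poaceae': 50.0, 'Cerealia': 25.0}
```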